# site-packages/validate.py
# validate.py
# A Validator object
# Copyright (C) 2005-2014:
# (name) : (email)
# Michael Foord: fuzzyman AT voidspace DOT org DOT uk
# Mark Andrews: mark AT la-la DOT com
# Nicola Larosa: nico AT tekNico DOT net
# Rob Dennis: rdennis AT gmail DOT com
# Eli Courtwright: eli AT courtwright DOT org

# This software is licensed under the terms of the BSD license.
# http://opensource.org/licenses/BSD-3-Clause

# ConfigObj 5 - main repository for documentation and issue tracking:
# https://github.com/DiffSK/configobj

"""
    The Validator object is used to check that supplied values 
    conform to a specification.
    
    The value can be supplied as a string - e.g. from a config file.
    In this case the check will also *convert* the value to
    the required type. This allows you to add validation
    as a transparent layer to access data stored as strings.
    The validation checks that the data is correct *and*
    converts it to the expected type.
    
    Some standard checks are provided for basic data types.
    Additional checks are easy to write. They can be
    provided when the ``Validator`` is instantiated or
    added afterwards.
    
    The standard functions work with the following basic data types :
    
    * integers
    * floats
    * booleans
    * strings
    * ip_addr
    
    plus lists of these datatypes
    
    Adding additional checks is done through coding simple functions.
    
    The full set of standard checks are : 
    
    * 'integer': matches integer values (including negative)
                 Takes optional 'min' and 'max' arguments : ::
    
                   integer()
                   integer(3, 9)  # any value from 3 to 9
                   integer(min=0) # any positive value
                   integer(max=9)
    
    * 'float': matches float values
               Has the same parameters as the integer check.
    
    * 'boolean': matches boolean values - ``True`` or ``False``
                 Acceptable string values for True are :
                   true, on, yes, 1
                 Acceptable string values for False are :
                   false, off, no, 0
    
                 Any other value raises an error.
    
    * 'ip_addr': matches an Internet Protocol address, v.4, represented
                 by a dotted-quad string, i.e. '1.2.3.4'.
    
    * 'string': matches any string.
                Takes optional keyword args 'min' and 'max'
                to specify min and max lengths of the string.
    
    * 'list': matches any list.
              Takes optional keyword args 'min', and 'max' to specify min and
              max sizes of the list. (Always returns a list.)
    
    * 'tuple': matches any tuple.
              Takes optional keyword args 'min', and 'max' to specify min and
              max sizes of the tuple. (Always returns a tuple.)
    
    * 'int_list': Matches a list of integers.
                  Takes the same arguments as list.
    
    * 'float_list': Matches a list of floats.
                    Takes the same arguments as list.
    
    * 'bool_list': Matches a list of boolean values.
                   Takes the same arguments as list.
    
    * 'ip_addr_list': Matches a list of IP addresses.
                     Takes the same arguments as list.
    
    * 'string_list': Matches a list of strings.
                     Takes the same arguments as list.
    
    * 'mixed_list': Matches a list with different types in 
                    specific positions. List size must match
                    the number of arguments.
    
                    Each position can be one of :
                    'integer', 'float', 'ip_addr', 'string', 'boolean'
    
                    So to specify a list with two strings followed
                    by two integers, you write the check as : ::
    
                      mixed_list('string', 'string', 'integer', 'integer')
    
    * 'pass': This check matches everything ! It never fails
              and the value is unchanged.
    
              It is also the default if no check is specified.
    
    * 'option': This check matches any from a list of options.
                You specify this check with : ::
    
                  option('option 1', 'option 2', 'option 3')
    
    You can supply a default value (returned if no value is supplied)
    using the default keyword argument.
    
    You specify a list argument for default using a list constructor syntax in
    the check : ::
    
        checkname(arg1, arg2, default=list('val 1', 'val 2', 'val 3'))
    
    A badly formatted set of arguments will raise a ``VdtParamError``.
"""

__version__ = '1.0.1'


__all__ = (
    '__version__',
    'dottedQuadToNum',
    'numToDottedQuad',
    'ValidateError',
    'VdtUnknownCheckError',
    'VdtParamError',
    'VdtTypeError',
    'VdtValueError',
    'VdtValueTooSmallError',
    'VdtValueTooBigError',
    'VdtValueTooShortError',
    'VdtValueTooLongError',
    'VdtMissingValue',
    'Validator',
    'is_integer',
    'is_float',
    'is_boolean',
    'is_list',
    'is_tuple',
    'is_ip_addr',
    'is_string',
    'is_int_list',
    'is_bool_list',
    'is_float_list',
    'is_string_list',
    'is_ip_addr_list',
    'is_mixed_list',
    'is_option',
    '__docformat__',
)


import re
import sys
from pprint import pprint

#TODO - #21 - six is part of the repo now, but we didn't switch over to it here
# this could be replaced if six is used for compatibility, or there are no
# more assertions about items being a string
if sys.version_info < (3,):
    string_type = basestring
else:
    string_type = str
    # so tests that care about unicode on 2.x can specify unicode, and the same
    # tests when run on 3.x won't complain about an undefined name "unicode"
    # since all strings are unicode on 3.x we just want to pass it through
    # unchanged
    unicode = lambda x: x
    # in python 3, all ints are equivalent to python 2 longs, and they'll
    # never show "L" in the repr
    long = int

_list_arg = re.compile(r'''
    (?:
        ([a-zA-Z_][a-zA-Z0-9_]*)\s*=\s*list\(
            (
                (?:
                    \s*
                    (?:
                        (?:".*?")|              # double quotes
                        (?:'.*?')|              # single quotes
                        (?:[^'",\s\)][^,\)]*?)  # unquoted
                    )
                    \s*,\s*
                )*
                (?:
                    (?:".*?")|              # double quotes
                    (?:'.*?')|              # single quotes
                    (?:[^'",\s\)][^,\)]*?)  # unquoted
                )?                          # last one
            )
        \)
    )
''', re.VERBOSE | re.DOTALL)    # two groups

_list_members = re.compile(r'''
    (
        (?:".*?")|              # double quotes
        (?:'.*?')|              # single quotes
        (?:[^'",\s=][^,=]*?)       # unquoted
    )
    (?:
    (?:\s*,\s*)|(?:\s*$)            # comma
    )
''', re.VERBOSE | re.DOTALL)    # one group

_paramstring = r'''
    (?:
        (
            (?:
                [a-zA-Z_][a-zA-Z0-9_]*\s*=\s*list\(
                    (?:
                        \s*
                        (?:
                            (?:".*?")|              # double quotes
                            (?:'.*?')|              # single quotes
                            (?:[^'",\s\)][^,\)]*?)       # unquoted
                        )
                        \s*,\s*
                    )*
                    (?:
                        (?:".*?")|              # double quotes
                        (?:'.*?')|              # single quotes
                        (?:[^'",\s\)][^,\)]*?)       # unquoted
                    )?                              # last one
                \)
            )|
            (?:
                (?:".*?")|              # double quotes
                (?:'.*?')|              # single quotes
                (?:[^'",\s=][^,=]*?)|       # unquoted
                (?:                         # keyword argument
                    [a-zA-Z_][a-zA-Z0-9_]*\s*=\s*
                    (?:
                        (?:".*?")|              # double quotes
                        (?:'.*?')|              # single quotes
                        (?:[^'",\s=][^,=]*?)       # unquoted
                    )
                )
            )
        )
        (?:
            (?:\s*,\s*)|(?:\s*$)            # comma
        )
    )
    '''

_matchstring = '^%s*' % _paramstring
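
# A small sketch of what the two list regexes above capture for a
# ``keyword=list(...)`` argument. This function is an illustrative addition,
# not part of the original module; note that the captured members still carry
# their quotes - Validator strips them later with _unquote().
def _list_regex_example():
    listmatch = _list_arg.match("default=list('a', 'b', 'c')")
    name = listmatch.group(1)                             # -> 'default'
    members = _list_members.findall(listmatch.group(2))   # -> ["'a'", "'b'", "'c'"]
    return name, members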

# Python pre 2.2.1 doesn't have bool
try:
    bool
except NameError:
    def bool(val):
        """Simple boolean equivalent function. """
        if val:
            return 1
        else:
            return 0


def dottedQuadToNum(ip):
    """
    Convert decimal dotted quad string to long integer
    
    >>> int(dottedQuadToNum('1 '))
    1
    >>> int(dottedQuadToNum(' 1.2'))
    16777218
    >>> int(dottedQuadToNum(' 1.2.3 '))
    16908291
    >>> int(dottedQuadToNum('1.2.3.4'))
    16909060
    >>> dottedQuadToNum('255.255.255.255')
    4294967295
    >>> dottedQuadToNum('255.255.255.256')
    Traceback (most recent call last):
    ValueError: Not a good dotted-quad IP: 255.255.255.256
    """
    
    # import here to avoid it when ip_addr values are not used
    import socket, struct
    
    try:
        return struct.unpack('!L',
            socket.inet_aton(ip.strip()))[0]
    except socket.error:
        raise ValueError('Not a good dotted-quad IP: %s' % ip)


def numToDottedQuad(num):
    """
    Convert int or long int to dotted quad string
    
    >>> numToDottedQuad(long(-1))
    Traceback (most recent call last):
    ValueError: Not a good numeric IP: -1
    >>> numToDottedQuad(long(1))
    '0.0.0.1'
    >>> numToDottedQuad(long(16777218))
    '1.0.0.2'
    >>> numToDottedQuad(long(16908291))
    '1.2.0.3'
    >>> numToDottedQuad(long(16909060))
    '1.2.3.4'
    >>> numToDottedQuad(long(4294967295))
    '255.255.255.255'
    >>> numToDottedQuad(long(4294967296))
    Traceback (most recent call last):
    ValueError: Not a good numeric IP: 4294967296
    >>> numToDottedQuad(-1)
    Traceback (most recent call last):
    ValueError: Not a good numeric IP: -1
    >>> numToDottedQuad(1)
    '0.0.0.1'
    >>> numToDottedQuad(16777218)
    '1.0.0.2'
    >>> numToDottedQuad(16908291)
    '1.2.0.3'
    >>> numToDottedQuad(16909060)
    '1.2.3.4'
    >>> numToDottedQuad(4294967295)
    '255.255.255.255'
    >>> numToDottedQuad(4294967296)
    Traceback (most recent call last):
    ValueError: Not a good numeric IP: 4294967296

    """
    
    # import here to avoid it when ip_addr values are not used
    import socket, struct
    
    # no need to intercept here, 4294967295L is fine
    if num > long(4294967295) or num < 0:
        raise ValueError('Not a good numeric IP: %s' % num)
    try:
        return socket.inet_ntoa(
            struct.pack('!L', long(num)))
    except (socket.error, struct.error, OverflowError):
        raise ValueError('Not a good numeric IP: %s' % num)


class ValidateError(Exception):
    """
    This error indicates that the check failed.
    It can be the base class for more specific errors.
    
    Any check function that fails ought to raise this error.
    (or a subclass)
    
    >>> raise ValidateError
    Traceback (most recent call last):
    ValidateError
    """


class VdtMissingValue(ValidateError):
    """No value was supplied to a check that needed one."""


class VdtUnknownCheckError(ValidateError):
    """An unknown check function was requested"""

    def __init__(self, value):
        """
        >>> raise VdtUnknownCheckError('yoda')
        Traceback (most recent call last):
        VdtUnknownCheckError: the check "yoda" is unknown.
        """
        ValidateError.__init__(self, 'the check "%s" is unknown.' % (value,))


class VdtParamError(SyntaxError):
    """An incorrect parameter was passed"""

    def __init__(self, name, value):
        """
        >>> raise VdtParamError('yoda', 'jedi')
        Traceback (most recent call last):
        VdtParamError: passed an incorrect value "jedi" for parameter "yoda".
        """
        SyntaxError.__init__(self, 'passed an incorrect value "%s" for parameter "%s".' % (value, name))


class VdtTypeError(ValidateError):
    """The value supplied was of the wrong type"""

    def __init__(self, value):
        """
        >>> raise VdtTypeError('jedi')
        Traceback (most recent call last):
        VdtTypeError: the value "jedi" is of the wrong type.
        """
        ValidateError.__init__(self, 'the value "%s" is of the wrong type.' % (value,))


class VdtValueError(ValidateError):
    """The value supplied was of the correct type, but was not an allowed value."""
    
    def __init__(self, value):
        """
        >>> raise VdtValueError('jedi')
        Traceback (most recent call last):
        VdtValueError: the value "jedi" is unacceptable.
        """
        ValidateError.__init__(self, 'the value "%s" is unacceptable.' % (value,))


class VdtValueTooSmallError(VdtValueError):
    """The value supplied was of the correct type, but was too small."""

    def __init__(self, value):
        """
        >>> raise VdtValueTooSmallError('0')
        Traceback (most recent call last):
        VdtValueTooSmallError: the value "0" is too small.
        """
        ValidateError.__init__(self, 'the value "%s" is too small.' % (value,))


class VdtValueTooBigError(VdtValueError):
    """The value supplied was of the correct type, but was too big."""

    def __init__(self, value):
        """
        >>> raise VdtValueTooBigError('1')
        Traceback (most recent call last):
        VdtValueTooBigError: the value "1" is too big.
        """
        ValidateError.__init__(self, 'the value "%s" is too big.' % (value,))


class VdtValueTooShortError(VdtValueError):
    """The value supplied was of the correct type, but was too short."""

    def __init__(self, value):
        """
        >>> raise VdtValueTooShortError('jed')
        Traceback (most recent call last):
        VdtValueTooShortError: the value "jed" is too short.
        """
        ValidateError.__init__(
            self,
            'the value "%s" is too short.' % (value,))


class VdtValueTooLongError(VdtValueError):
    """The value supplied was of the correct type, but was too long."""

    def __init__(self, value):
        """
        >>> raise VdtValueTooLongError('jedie')
        Traceback (most recent call last):
        VdtValueTooLongError: the value "jedie" is too long.
        """
        ValidateError.__init__(self, 'the value "%s" is too long.' % (value,))


class Validator(object):
    """
    Validator is an object that allows you to register a set of 'checks'.
    These checks take input and test that it conforms to the check.
    
    This can also involve converting the value from a string into
    the correct datatype.
    
    The ``check`` method takes an input string which configures which
    check is to be used and applies that check to a supplied value.
    
    An example input string would be:
    'int_range(param1, param2)'
    
    You would then provide something like:
    
    >>> def int_range_check(value, min, max):
    ...     # turn min and max from strings to integers
    ...     min = int(min)
    ...     max = int(max)
    ...     # check that value is of the correct type.
    ...     # possible valid inputs are integers or strings
    ...     # that represent integers
    ...     if not isinstance(value, (int, long, string_type)):
    ...         raise VdtTypeError(value)
    ...     elif isinstance(value, string_type):
    ...         # if we are given a string
    ...         # attempt to convert to an integer
    ...         try:
    ...             value = int(value)
    ...         except ValueError:
    ...             raise VdtValueError(value)
    ...     # check the value is between our constraints
    ...     if not min <= value:
    ...          raise VdtValueTooSmallError(value)
    ...     if not value <= max:
    ...          raise VdtValueTooBigError(value)
    ...     return value
    
    >>> fdict = {'int_range': int_range_check}
    >>> vtr1 = Validator(fdict)
    >>> vtr1.check('int_range(20, 40)', '30')
    30
    >>> vtr1.check('int_range(20, 40)', '60')
    Traceback (most recent call last):
    VdtValueTooBigError: the value "60" is too big.
    
    New functions can be added with : ::
    
    >>> vtr2 = Validator()       
    >>> vtr2.functions['int_range'] = int_range_check
    
    Or by passing in a dictionary of functions when Validator 
    is instantiated.
    
    Your functions *can* use keyword arguments,
    but the first argument should always be 'value'.
    
    If the function doesn't take additional arguments,
    the parentheses are optional in the check.
    It can be written with either of : ::
    
        keyword = function_name
        keyword = function_name()
    
    The first program to utilise Validator() was Michael Foord's
    ConfigObj, an alternative to ConfigParser which supports lists and
    can validate a config file using a config schema.
    For more details on using Validator with ConfigObj see:
    https://configobj.readthedocs.org/en/latest/configobj.html
    """

    # this regex does the initial parsing of the checks
    _func_re = re.compile(r'(.+?)\((.*)\)', re.DOTALL)

    # this regex takes apart keyword arguments
    _key_arg = re.compile(r'^([a-zA-Z_][a-zA-Z0-9_]*)\s*=\s*(.*)$',  re.DOTALL)


    # this regex finds keyword=list(....) type values
    _list_arg = _list_arg

    # this regex takes individual values out of lists - in one pass
    _list_members = _list_members

    # These regexes check a set of arguments for validity
    # and then pull the members out
    _paramfinder = re.compile(_paramstring, re.VERBOSE | re.DOTALL)
    _matchfinder = re.compile(_matchstring, re.VERBOSE | re.DOTALL)


    def __init__(self, functions=None):
        """
        >>> vtri = Validator()
        """
        self.functions = {
            '': self._pass,
            'integer': is_integer,
            'float': is_float,
            'boolean': is_boolean,
            'ip_addr': is_ip_addr,
            'string': is_string,
            'list': is_list,
            'tuple': is_tuple,
            'int_list': is_int_list,
            'float_list': is_float_list,
            'bool_list': is_bool_list,
            'ip_addr_list': is_ip_addr_list,
            'string_list': is_string_list,
            'mixed_list': is_mixed_list,
            'pass': self._pass,
            'option': is_option,
            'force_list': force_list,
        }
        if functions is not None:
            self.functions.update(functions)
        # tekNico: for use by ConfigObj
        self.baseErrorClass = ValidateError
        self._cache = {}


    def check(self, check, value, missing=False):
        """
        Usage: check(check, value)
        
        Arguments:
            check: string representing check to apply (including arguments)
            value: object to be checked
        Returns value, converted to correct type if necessary
        
        If the check fails, raises a ``ValidateError`` subclass.
        
        >>> vtor.check('yoda', '')
        Traceback (most recent call last):
        VdtUnknownCheckError: the check "yoda" is unknown.
        >>> vtor.check('yoda()', '')
        Traceback (most recent call last):
        VdtUnknownCheckError: the check "yoda" is unknown.
        
        >>> vtor.check('string(default="")', '', missing=True)
        ''
        """
        fun_name, fun_args, fun_kwargs, default = self._parse_with_caching(check)
            
        if missing:
            if default is None:
                # no information needed here - to be handled by caller
                raise VdtMissingValue()
            value = self._handle_none(default)
        
        if value is None:
            return None
        
        return self._check_value(value, fun_name, fun_args, fun_kwargs)


    def _handle_none(self, value):
        if value == 'None':
            return None
        elif value in ("'None'", '"None"'):
            # Special case a quoted None
            value = self._unquote(value)
        return value


    def _parse_with_caching(self, check):
        if check in self._cache:
            fun_name, fun_args, fun_kwargs, default = self._cache[check]
            # We call list and dict below to work with *copies* of the data
            # rather than the original (which are mutable of course)
            fun_args = list(fun_args)
            fun_kwargs = dict(fun_kwargs)
        else:
            fun_name, fun_args, fun_kwargs, default = self._parse_check(check)
            fun_kwargs = dict([(str(key), value) for (key, value) in list(fun_kwargs.items())])
            self._cache[check] = fun_name, list(fun_args), dict(fun_kwargs), default
        return fun_name, fun_args, fun_kwargs, default
        
        
    def _check_value(self, value, fun_name, fun_args, fun_kwargs):
        try:
            fun = self.functions[fun_name]
        except KeyError:
            raise VdtUnknownCheckError(fun_name)
        else:
            return fun(value, *fun_args, **fun_kwargs)


    def _parse_check(self, check):
        fun_match = self._func_re.match(check)
        if fun_match:
            fun_name = fun_match.group(1)
            arg_string = fun_match.group(2)
            arg_match = self._matchfinder.match(arg_string)
            if arg_match is None:
                # Bad syntax
                raise VdtParamError('Bad syntax in check "%s".' % check)
            fun_args = []
            fun_kwargs = {}
            # pull out args of group 2
            for arg in self._paramfinder.findall(arg_string):
                # args may need whitespace removing (before removing quotes)
                arg = arg.strip()
                listmatch = self._list_arg.match(arg)
                if listmatch:
                    key, val = self._list_handle(listmatch)
                    fun_kwargs[key] = val
                    continue
                keymatch = self._key_arg.match(arg)
                if keymatch:
                    val = keymatch.group(2)
                    if not val in ("'None'", '"None"'):
                        # Special case a quoted None
                        val = self._unquote(val)
                    fun_kwargs[keymatch.group(1)] = val
                    continue
                
                fun_args.append(self._unquote(arg))
        else:
            # allows for function names without (args)
            return check, (), {}, None

        # Default must be deleted if the value is specified too,
        # otherwise the check function will get a spurious "default" keyword arg
        default = fun_kwargs.pop('default', None)
        return fun_name, fun_args, fun_kwargs, default


    def _unquote(self, val):
        """Unquote a value if necessary."""
        if (len(val) >= 2) and (val[0] in ("'", '"')) and (val[0] == val[-1]):
            val = val[1:-1]
        return val


    def _list_handle(self, listmatch):
        """Take apart a ``keyword=list('val, 'val')`` type string."""
        out = []
        name = listmatch.group(1)
        args = listmatch.group(2)
        for arg in self._list_members.findall(args):
            out.append(self._unquote(arg))
        return name, out


    def _pass(self, value):
        """
        Dummy check that always passes
        
        >>> vtor.check('', 0)
        0
        >>> vtor.check('', '0')
        '0'
        """
        return value
    
    
    def get_default_value(self, check):
        """
        Given a check, return the default value for the check
        (converted to the right type).
        
        If the check doesn't specify a default value then a
        ``KeyError`` will be raised.
        """
        fun_name, fun_args, fun_kwargs, default = self._parse_with_caching(check)
        if default is None:
            raise KeyError('Check "%s" has no default value.' % check)
        value = self._handle_none(default)
        if value is None:
            return value
        return self._check_value(value, fun_name, fun_args, fun_kwargs)


def _is_num_param(names, values, to_float=False):
    """
    Return numbers from inputs or raise VdtParamError.
    
    Lets ``None`` pass through.
    Pass in keyword argument ``to_float=True`` to
    use float for the conversion rather than int.
    
    >>> _is_num_param(('', ''), (0, 1.0))
    [0, 1]
    >>> _is_num_param(('', ''), (0, 1.0), to_float=True)
    [0.0, 1.0]
    >>> _is_num_param(('a'), ('a'))
    Traceback (most recent call last):
    VdtParamError: passed an incorrect value "a" for parameter "a".
    """
    fun = to_float and float or int
    out_params = []
    for (name, val) in zip(names, values):
        if val is None:
            out_params.append(val)
        elif isinstance(val, (int, long, float, string_type)):
            try:
                out_params.append(fun(val))
            except ValueError as e:
                raise VdtParamError(name, val)
        else:
            raise VdtParamError(name, val)
    return out_params


# built in checks
# you can override these by setting the appropriate name
# in Validator.functions
# note: if the params are specified wrongly in your input string,
#       you will also raise errors.

def is_integer(value, min=None, max=None):
    """
    A check that tests that a given value is an integer (int, or long)
    and optionally, between bounds. A negative value is accepted, while
    a float will fail.
    
    If the value is a string, then the conversion is done - if possible.
    Otherwise a VdtTypeError is raised.
    
    >>> vtor.check('integer', '-1')
    -1
    >>> vtor.check('integer', '0')
    0
    >>> vtor.check('integer', 9)
    9
    >>> vtor.check('integer', 'a')
    Traceback (most recent call last):
    VdtTypeError: the value "a" is of the wrong type.
    >>> vtor.check('integer', '2.2')
    Traceback (most recent call last):
    VdtTypeError: the value "2.2" is of the wrong type.
    >>> vtor.check('integer(10)', '20')
    20
    >>> vtor.check('integer(max=20)', '15')
    15
    >>> vtor.check('integer(10)', '9')
    Traceback (most recent call last):
    VdtValueTooSmallError: the value "9" is too small.
    >>> vtor.check('integer(10)', 9)
    Traceback (most recent call last):
    VdtValueTooSmallError: the value "9" is too small.
    >>> vtor.check('integer(max=20)', '35')
    Traceback (most recent call last):
    VdtValueTooBigError: the value "35" is too big.
    >>> vtor.check('integer(max=20)', 35)
    Traceback (most recent call last):
    VdtValueTooBigError: the value "35" is too big.
    >>> vtor.check('integer(0, 9)', False)
    0
    """
    (min_val, max_val) = _is_num_param(('min', 'max'), (min, max))
    if not isinstance(value, (int, long, string_type)):
        raise VdtTypeError(value)
    if isinstance(value, string_type):
        # if it's a string - does it represent an integer ?
        try:
            value = int(value)
        except ValueError:
            raise VdtTypeError(value)
    if (min_val is not None) and (value < min_val):
        raise VdtValueTooSmallError(value)
    if (max_val is not None) and (value > max_val):
        raise VdtValueTooBigError(value)
    return value


def is_float(value, min=None, max=None):
    """
    A check that tests that a given value is a float
    (an integer will be accepted), and optionally - that it is between bounds.
    
    If the value is a string, then the conversion is done - if possible.
    Otherwise a VdtTypeError is raised.
    
    This can accept negative values.
    
    >>> vtor.check('float', '2')
    2.0
    
    From now on we multiply the value to avoid comparing decimals
    
    >>> vtor.check('float', '-6.8') * 10
    -68.0
    >>> vtor.check('float', '12.2') * 10
    122.0
    >>> vtor.check('float', 8.4) * 10
    84.0
    >>> vtor.check('float', 'a')
    Traceback (most recent call last):
    VdtTypeError: the value "a" is of the wrong type.
    >>> vtor.check('float(10.1)', '10.2') * 10
    102.0
    >>> vtor.check('float(max=20.2)', '15.1') * 10
    151.0
    >>> vtor.check('float(10.0)', '9.0')
    Traceback (most recent call last):
    VdtValueTooSmallError: the value "9.0" is too small.
    >>> vtor.check('float(max=20.0)', '35.0')
    Traceback (most recent call last):
    VdtValueTooBigError: the value "35.0" is too big.
    """
    (min_val, max_val) = _is_num_param(
        ('min', 'max'), (min, max), to_float=True)
    if not isinstance(value, (int, long, float, string_type)):
        raise VdtTypeError(value)
    if not isinstance(value, float):
        # if it's a string - does it represent a float ?
        try:
            value = float(value)
        except ValueError:
            raise VdtTypeError(value)
    if (min_val is not None) and (value < min_val):
        raise VdtValueTooSmallError(value)
    if (max_val is not None) and (value > max_val):
        raise VdtValueTooBigError(value)
    return value


bool_dict = {
    True: True, 'on': True, '1': True, 'true': True, 'yes': True, 
    False: False, 'off': False, '0': False, 'false': False, 'no': False,
}


def is_boolean(value):
    """
    Check if the value represents a boolean.
    
    >>> vtor.check('boolean', 0)
    0
    >>> vtor.check('boolean', False)
    0
    >>> vtor.check('boolean', '0')
    0
    >>> vtor.check('boolean', 'off')
    0
    >>> vtor.check('boolean', 'false')
    0
    >>> vtor.check('boolean', 'no')
    0
    >>> vtor.check('boolean', 'nO')
    0
    >>> vtor.check('boolean', 'NO')
    0
    >>> vtor.check('boolean', 1)
    1
    >>> vtor.check('boolean', True)
    1
    >>> vtor.check('boolean', '1')
    1
    >>> vtor.check('boolean', 'on')
    1
    >>> vtor.check('boolean', 'true')
    1
    >>> vtor.check('boolean', 'yes')
    1
    >>> vtor.check('boolean', 'Yes')
    1
    >>> vtor.check('boolean', 'YES')
    1
    >>> vtor.check('boolean', '')
    Traceback (most recent call last):
    VdtTypeError: the value "" is of the wrong type.
    >>> vtor.check('boolean', 'up')
    Traceback (most recent call last):
    VdtTypeError: the value "up" is of the wrong type.
    
    """
    if isinstance(value, string_type):
        try:
            return bool_dict[value.lower()]
        except KeyError:
            raise VdtTypeError(value)
    # we do an equality test rather than an identity test
    # this ensures Python 2.2 compatibility
    # and allows 0 and 1 to represent True and False
    if value == False:
        return False
    elif value == True:
        return True
    else:
        raise VdtTypeError(value)


def is_ip_addr(value):
    """
    Check that the supplied value is an Internet Protocol address, v.4,
    represented by a dotted-quad string, i.e. '1.2.3.4'.
    
    >>> vtor.check('ip_addr', '1 ')
    '1'
    >>> vtor.check('ip_addr', ' 1.2')
    '1.2'
    >>> vtor.check('ip_addr', ' 1.2.3 ')
    '1.2.3'
    >>> vtor.check('ip_addr', '1.2.3.4')
    '1.2.3.4'
    >>> vtor.check('ip_addr', '0.0.0.0')
    '0.0.0.0'
    >>> vtor.check('ip_addr', '255.255.255.255')
    '255.255.255.255'
    >>> vtor.check('ip_addr', '255.255.255.256')
    Traceback (most recent call last):
    VdtValueError: the value "255.255.255.256" is unacceptable.
    >>> vtor.check('ip_addr', '1.2.3.4.5')
    Traceback (most recent call last):
    VdtValueError: the value "1.2.3.4.5" is unacceptable.
    >>> vtor.check('ip_addr', 0)
    Traceback (most recent call last):
    VdtTypeError: the value "0" is of the wrong type.
    """
    if not isinstance(value, string_type):
        raise VdtTypeError(value)
    value = value.strip()
    try:
        dottedQuadToNum(value)
    except ValueError:
        raise VdtValueError(value)
    return value


def is_list(value, min=None, max=None):
    """
    Check that the value is a list of values.
    
    You can optionally specify the minimum and maximum number of members.
    
    It does no check on list members.
    
    >>> vtor.check('list', ())
    []
    >>> vtor.check('list', [])
    []
    >>> vtor.check('list', (1, 2))
    [1, 2]
    >>> vtor.check('list', [1, 2])
    [1, 2]
    >>> vtor.check('list(3)', (1, 2))
    Traceback (most recent call last):
    VdtValueTooShortError: the value "(1, 2)" is too short.
    >>> vtor.check('list(max=5)', (1, 2, 3, 4, 5, 6))
    Traceback (most recent call last):
    VdtValueTooLongError: the value "(1, 2, 3, 4, 5, 6)" is too long.
    >>> vtor.check('list(min=3, max=5)', (1, 2, 3, 4))
    [1, 2, 3, 4]
    >>> vtor.check('list', 0)
    Traceback (most recent call last):
    VdtTypeError: the value "0" is of the wrong type.
    >>> vtor.check('list', '12')
    Traceback (most recent call last):
    VdtTypeError: the value "12" is of the wrong type.
    """
    (min_len, max_len) = _is_num_param(('min', 'max'), (min, max))
    if isinstance(value, string_type):
        raise VdtTypeError(value)
    try:
        num_members = len(value)
    except TypeError:
        raise VdtTypeError(value)
    if min_len is not None and num_members < min_len:
        raise VdtValueTooShortError(value)
    if max_len is not None and num_members > max_len:
        raise VdtValueTooLongError(value)
    return list(value)


def is_tuple(value, min=None, max=None):
    """
    Check that the value is a tuple of values.
    
    You can optionally specify the minimum and maximum number of members.
    
    It does no check on members.
    
    >>> vtor.check('tuple', ())
    ()
    >>> vtor.check('tuple', [])
    ()
    >>> vtor.check('tuple', (1, 2))
    (1, 2)
    >>> vtor.check('tuple', [1, 2])
    (1, 2)
    >>> vtor.check('tuple(3)', (1, 2))
    Traceback (most recent call last):
    VdtValueTooShortError: the value "(1, 2)" is too short.
    >>> vtor.check('tuple(max=5)', (1, 2, 3, 4, 5, 6))
    Traceback (most recent call last):
    VdtValueTooLongError: the value "(1, 2, 3, 4, 5, 6)" is too long.
    >>> vtor.check('tuple(min=3, max=5)', (1, 2, 3, 4))
    (1, 2, 3, 4)
    >>> vtor.check('tuple', 0)
    Traceback (most recent call last):
    VdtTypeError: the value "0" is of the wrong type.
    >>> vtor.check('tuple', '12')
    Traceback (most recent call last):
    VdtTypeError: the value "12" is of the wrong type.
    """
    return tuple(is_list(value, min, max))


def is_string(value, min=None, max=None):
    """
    Check that the supplied value is a string.
    
    You can optionally specify the minimum and maximum length of the string.
    
    >>> vtor.check('string', '0')
    '0'
    >>> vtor.check('string', 0)
    Traceback (most recent call last):
    VdtTypeError: the value "0" is of the wrong type.
    >>> vtor.check('string(2)', '12')
    '12'
    >>> vtor.check('string(2)', '1')
    Traceback (most recent call last):
    VdtValueTooShortError: the value "1" is too short.
    >>> vtor.check('string(min=2, max=3)', '123')
    '123'
    >>> vtor.check('string(min=2, max=3)', '1234')
    Traceback (most recent call last):
    VdtValueTooLongError: the value "1234" is too long.
    """
    if not isinstance(value, string_type):
        raise VdtTypeError(value)
    (min_len, max_len) = _is_num_param(('min', 'max'), (min, max))
    try:
        num_members = len(value)
    except TypeError:
        raise VdtTypeError(value)
    if min_len is not None and num_members < min_len:
        raise VdtValueTooShortError(value)
    if max_len is not None and num_members > max_len:
        raise VdtValueTooLongError(value)
    return value


def is_int_list(value, min=None, max=None):
    """
    Check that the value is a list of integers.
    
    You can optionally specify the minimum and maximum number of members.
    
    Each list member is checked that it is an integer.
    
    >>> vtor.check('int_list', ())
    []
    >>> vtor.check('int_list', [])
    []
    >>> vtor.check('int_list', (1, 2))
    [1, 2]
    >>> vtor.check('int_list', [1, 2])
    [1, 2]
    >>> vtor.check('int_list', [1, 'a'])
    Traceback (most recent call last):
    VdtTypeError: the value "a" is of the wrong type.
    """
    return [is_integer(mem) for mem in is_list(value, min, max)]


def is_bool_list(value, min=None, max=None):
    """
    Check that the value is a list of booleans.
    
    You can optionally specify the minimum and maximum number of members.
    
    Each list member is checked that it is a boolean.
    
    >>> vtor.check('bool_list', ())
    []
    >>> vtor.check('bool_list', [])
    []
    >>> check_res = vtor.check('bool_list', (True, False))
    >>> check_res == [True, False]
    1
    >>> check_res = vtor.check('bool_list', [True, False])
    >>> check_res == [True, False]
    1
    >>> vtor.check('bool_list', [True, 'a'])
    Traceback (most recent call last):
    VdtTypeError: the value "a" is of the wrong type.
    """
    return [is_boolean(mem) for mem in is_list(value, min, max)]


def is_float_list(value, min=None, max=None):
    """
    Check that the value is a list of floats.
    
    You can optionally specify the minimum and maximum number of members.
    
    Each list member is checked that it is a float.
    
    >>> vtor.check('float_list', ())
    []
    >>> vtor.check('float_list', [])
    []
    >>> vtor.check('float_list', (1, 2.0))
    [1.0, 2.0]
    >>> vtor.check('float_list', [1, 2.0])
    [1.0, 2.0]
    >>> vtor.check('float_list', [1, 'a'])
    Traceback (most recent call last):
    VdtTypeError: the value "a" is of the wrong type.
    """
    return [is_float(mem) for mem in is_list(value, min, max)]


def is_string_list(value, min=None, max=None):
    """
    Check that the value is a list of strings.
    
    You can optionally specify the minimum and maximum number of members.
    
    Each list member is checked that it is a string.
    
    >>> vtor.check('string_list', ())
    []
    >>> vtor.check('string_list', [])
    []
    >>> vtor.check('string_list', ('a', 'b'))
    ['a', 'b']
    >>> vtor.check('string_list', ['a', 1])
    Traceback (most recent call last):
    VdtTypeError: the value "1" is of the wrong type.
    >>> vtor.check('string_list', 'hello')
    Traceback (most recent call last):
    VdtTypeError: the value "hello" is of the wrong type.
    """
    if isinstance(value, string_type):
        raise VdtTypeError(value)
    return [is_string(mem) for mem in is_list(value, min, max)]


def is_ip_addr_list(value, min=None, max=None):
    """
    Check that the value is a list of IP addresses.
    
    You can optionally specify the minimum and maximum number of members.
    
    Each list member is checked that it is an IP address.
    
    >>> vtor.check('ip_addr_list', ())
    []
    >>> vtor.check('ip_addr_list', [])
    []
    >>> vtor.check('ip_addr_list', ('1.2.3.4', '5.6.7.8'))
    ['1.2.3.4', '5.6.7.8']
    >>> vtor.check('ip_addr_list', ['a'])
    Traceback (most recent call last):
    VdtValueError: the value "a" is unacceptable.
    """
    return [is_ip_addr(mem) for mem in is_list(value, min, max)]


def force_list(value, min=None, max=None):
    """
    Check that a value is a list, coercing strings into
    a list with one member. Useful where users forget the
    trailing comma that turns a single value into a list.
    
    You can optionally specify the minimum and maximum number of members.
    A minimum greater than one will fail if the user only supplies a
    string.
    
    >>> vtor.check('force_list', ())
    []
    >>> vtor.check('force_list', [])
    []
    >>> vtor.check('force_list', 'hello')
    ['hello']
    """
    if not isinstance(value, (list, tuple)):
        value = [value]
    return is_list(value, min, max)
    
    

fun_dict = {
    'integer': is_integer,
    'float': is_float,
    'ip_addr': is_ip_addr,
    'string': is_string,
    'boolean': is_boolean,
}


def is_mixed_list(value, *args):
    """
    Check that the value is a list.
    Allow specifying the type of each member.
    Work on lists of specific lengths.
    
    You specify each member as a positional argument specifying type
    
    Each type should be one of the following strings :
      'integer', 'float', 'ip_addr', 'string', 'boolean'
    
    So you can specify a list of two strings, followed by
    two integers as :
    
      mixed_list('string', 'string', 'integer', 'integer')
    
    The length of the list must match the number of positional
    arguments you supply.
    
    >>> mix_str = "mixed_list('integer', 'float', 'ip_addr', 'string', 'boolean')"
    >>> check_res = vtor.check(mix_str, (1, 2.0, '1.2.3.4', 'a', True))
    >>> check_res == [1, 2.0, '1.2.3.4', 'a', True]
    1
    >>> check_res = vtor.check(mix_str, ('1', '2.0', '1.2.3.4', 'a', 'True'))
    >>> check_res == [1, 2.0, '1.2.3.4', 'a', True]
    1
    >>> vtor.check(mix_str, ('b', 2.0, '1.2.3.4', 'a', True))
    Traceback (most recent call last):
    VdtTypeError: the value "b" is of the wrong type.
    >>> vtor.check(mix_str, (1, 2.0, '1.2.3.4', 'a'))
    Traceback (most recent call last):
    VdtValueTooShortError: the value "(1, 2.0, '1.2.3.4', 'a')" is too short.
    >>> vtor.check(mix_str, (1, 2.0, '1.2.3.4', 'a', 1, 'b'))
    Traceback (most recent call last):
    VdtValueTooLongError: the value "(1, 2.0, '1.2.3.4', 'a', 1, 'b')" is too long.
    >>> vtor.check(mix_str, 0)
    Traceback (most recent call last):
    VdtTypeError: the value "0" is of the wrong type.

    >>> vtor.check('mixed_list("yoda")', ('a'))
    Traceback (most recent call last):
    VdtParamError: passed an incorrect value "KeyError('yoda',)" for parameter "'mixed_list'"
    """
    try:
        length = len(value)
    except TypeError:
        raise VdtTypeError(value)
    if length < len(args):
        raise VdtValueTooShortError(value)
    elif length > len(args):
        raise VdtValueTooLongError(value)
    try:
        return [fun_dict[arg](val) for arg, val in zip(args, value)]
    except KeyError as e:
        raise VdtParamError('mixed_list', e)


def is_option(value, *options):
    """
    This check matches the value to any of a set of options.
    
    >>> vtor.check('option("yoda", "jedi")', 'yoda')
    'yoda'
    >>> vtor.check('option("yoda", "jedi")', 'jed')
    Traceback (most recent call last):
    VdtValueError: the value "jed" is unacceptable.
    >>> vtor.check('option("yoda", "jedi")', 0)
    Traceback (most recent call last):
    VdtTypeError: the value "0" is of the wrong type.
    """
    if not isinstance(value, string_type):
        raise VdtTypeError(value)
    if not value in options:
        raise VdtValueError(value)
    return value


def _test(value, *args, **keywargs):
    """
    A function that exists for test purposes.
    
    >>> checks = [
    ...     '3, 6, min=1, max=3, test=list(a, b, c)',
    ...     '3',
    ...     '3, 6',
    ...     '3,',
    ...     'min=1, test="a b c"',
    ...     'min=5, test="a, b, c"',
    ...     'min=1, max=3, test="a, b, c"',
    ...     'min=-100, test=-99',
    ...     'min=1, max=3',
    ...     '3, 6, test="36"',
    ...     '3, 6, test="a, b, c"',
    ...     '3, max=3, test=list("a", "b", "c")',
    ...     '''3, max=3, test=list("'a'", 'b', "x=(c)")''',
    ...     "test='x=fish(3)'",
    ...    ]
    >>> v = Validator({'test': _test})
    >>> for entry in checks:
    ...     pprint(v.check(('test(%s)' % entry), 3))
    (3, ('3', '6'), {'max': '3', 'min': '1', 'test': ['a', 'b', 'c']})
    (3, ('3',), {})
    (3, ('3', '6'), {})
    (3, ('3',), {})
    (3, (), {'min': '1', 'test': 'a b c'})
    (3, (), {'min': '5', 'test': 'a, b, c'})
    (3, (), {'max': '3', 'min': '1', 'test': 'a, b, c'})
    (3, (), {'min': '-100', 'test': '-99'})
    (3, (), {'max': '3', 'min': '1'})
    (3, ('3', '6'), {'test': '36'})
    (3, ('3', '6'), {'test': 'a, b, c'})
    (3, ('3',), {'max': '3', 'test': ['a', 'b', 'c']})
    (3, ('3',), {'max': '3', 'test': ["'a'", 'b', 'x=(c)']})
    (3, (), {'test': 'x=fish(3)'})
    
    >>> v = Validator()
    >>> v.check('integer(default=6)', '3')
    3
    >>> v.check('integer(default=6)', None, True)
    6
    >>> v.get_default_value('integer(default=6)')
    6
    >>> v.get_default_value('float(default=6)')
    6.0
    >>> v.get_default_value('pass(default=None)')
    >>> v.get_default_value("string(default='None')")
    'None'
    >>> v.get_default_value('pass')
    Traceback (most recent call last):
    KeyError: 'Check "pass" has no default value.'
    >>> v.get_default_value('pass(default=list(1, 2, 3, 4))')
    ['1', '2', '3', '4']
    
    >>> v = Validator()
    >>> v.check("pass(default=None)", None, True)
    >>> v.check("pass(default='None')", None, True)
    'None'
    >>> v.check('pass(default="None")', None, True)
    'None'
    >>> v.check('pass(default=list(1, 2, 3, 4))', None, True)
    ['1', '2', '3', '4']
    
    Bug test for unicode arguments
    >>> v = Validator()
    >>> v.check(unicode('string(min=4)'), unicode('test')) == unicode('test')
    True
    
    >>> v = Validator()
    >>> v.get_default_value(unicode('string(min=4, default="1234")')) == unicode('1234')
    True
    >>> v.check(unicode('string(min=4, default="1234")'), unicode('test')) == unicode('test')
    True
    
    >>> v = Validator()
    >>> default = v.get_default_value('string(default=None)')
    >>> default == None
    1
    """
    return (value, args, keywargs)


def _test2():
    """
    >>> 
    >>> v = Validator()
    >>> v.get_default_value('string(default="#ff00dd")')
    '#ff00dd'
    >>> v.get_default_value('integer(default=3) # comment')
    3
    """

def _test3():
    r"""
    >>> vtor.check('string(default="")', '', missing=True)
    ''
    >>> vtor.check('string(default="\n")', '', missing=True)
    '\n'
    >>> print(vtor.check('string(default="\n")', '', missing=True))
    <BLANKLINE>
    <BLANKLINE>
    >>> vtor.check('string()', '\n')
    '\n'
    >>> vtor.check('string(default="\n\n\n")', '', missing=True)
    '\n\n\n'
    >>> vtor.check('string()', 'random \n text goes here\n\n')
    'random \n text goes here\n\n'
    >>> vtor.check('string(default=" \nrandom text\ngoes \n here\n\n ")',
    ... '', missing=True)
    ' \nrandom text\ngoes \n here\n\n '
    >>> vtor.check("string(default='\n\n\n')", '', missing=True)
    '\n\n\n'
    >>> vtor.check("option('\n','a','b',default='\n')", '', missing=True)
    '\n'
    >>> vtor.check("string_list()", ['foo', '\n', 'bar'])
    ['foo', '\n', 'bar']
    >>> vtor.check("string_list(default=list('\n'))", '', missing=True)
    ['\n']
    """
    
    
if __name__ == '__main__':
    # run the code tests in doctest format
    import sys
    import doctest
    m = sys.modules.get('__main__')
    globs = m.__dict__.copy()
    globs.update({
        'vtor': Validator(),
    })

    failures, tests = doctest.testmod(
        m, globs=globs,
        optionflags=doctest.IGNORE_EXCEPTION_DETAIL | doctest.ELLIPSIS)
    assert not failures, '{} failures out of {} tests'.format(failures, tests)
# site-packages/dateutil/__init__.py
# -*- coding: utf-8 -*-
from ._version import VERSION as __version__
# site-packages/dateutil/easter.py
# -*- coding: utf-8 -*-
"""
This module offers a generic easter computing method for any given year, using
Western, Orthodox or Julian algorithms.
"""

import datetime

__all__ = ["easter", "EASTER_JULIAN", "EASTER_ORTHODOX", "EASTER_WESTERN"]

EASTER_JULIAN = 1
EASTER_ORTHODOX = 2
EASTER_WESTERN = 3


def easter(year, method=EASTER_WESTERN):
    """
    This method was ported from the work done by GM Arts,
    on top of the algorithm by Claus Tondering, which was
    based in part on the algorithm of Oudin (1940), as
    quoted in "Explanatory Supplement to the Astronomical
    Almanac", P.  Kenneth Seidelmann, editor.

    This algorithm implements three different easter
    calculation methods:

    1 - Original calculation in Julian calendar, valid in
        dates after 326 AD
    2 - Original method, with date converted to Gregorian
        calendar, valid in years 1583 to 4099
    3 - Revised method, in Gregorian calendar, valid in
        years 1583 to 4099 as well

    These methods are represented by the constants:

    * ``EASTER_JULIAN   = 1``
    * ``EASTER_ORTHODOX = 2``
    * ``EASTER_WESTERN  = 3``

    The default method is method 3.

    More about the algorithm may be found at:

    http://users.chariot.net.au/~gmarts/eastalg.htm

    and

    http://www.tondering.dk/claus/calendar.html

    """

    if not (1 <= method <= 3):
        raise ValueError("invalid method")

    # g - Golden year - 1
    # c - Century
    # h - (23 - Epact) mod 30
    # i - Number of days from March 21 to Paschal Full Moon
    # j - Weekday for PFM (0=Sunday, etc)
    # p - Number of days from March 21 to Sunday on or before PFM
    #     (-6 to 28 methods 1 & 3, to 56 for method 2)
    # e - Extra days to add for method 2 (converting Julian
    #     date to Gregorian date)

    y = year
    g = y % 19
    e = 0
    if method < 3:
        # Old method
        i = (19*g + 15) % 30
        j = (y + y//4 + i) % 7
        if method == 2:
            # Extra dates to convert Julian to Gregorian date
            e = 10
            if y > 1600:
                e = e + y//100 - 16 - (y//100 - 16)//4
    else:
        # New method
        c = y//100
        h = (c - c//4 - (8*c + 13)//25 + 19*g + 15) % 30
        i = h - (h//28)*(1 - (h//28)*(29//(h + 1))*((21 - g)//11))
        j = (y + y//4 + i + 2 - c + c//4) % 7

    # p can be from -6 to 56 corresponding to dates 22 March to 23 May
    # (later dates apply to method 2, although 23 May never actually occurs)
    p = i - j + e
    d = 1 + (p + 27 + (p + 6)//40) % 31
    m = 3 + (p + 26)//30
    return datetime.date(int(y), int(m), int(d))
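
# A brief usage sketch, added for illustration (not part of the original
# module): compute Easter Sunday for one year with each of the three methods.
# The year chosen here is arbitrary.
if __name__ == "__main__":
    year = 2024
    print("Western :", easter(year, EASTER_WESTERN))    # Gregorian calendar date
    print("Orthodox:", easter(year, EASTER_ORTHODOX))   # Gregorian calendar date
    print("Julian  :", easter(year, EASTER_JULIAN))     # date in the Julian calendar
    # Methods that yield Gregorian dates always land on a Sunday (weekday() == 6).
    assert easter(year, EASTER_WESTERN).weekday() == 6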
# site-packages/dateutil/tzwin.py
# tzwin has moved to dateutil.tz.win
from .tz.win import *
# site-packages/dateutil/tz/_common.py
from six import PY3

from functools import wraps

from datetime import datetime, timedelta, tzinfo


ZERO = timedelta(0)

__all__ = ['tzname_in_python2', 'enfold']


def tzname_in_python2(namefunc):
    """Change unicode output into bytestrings in Python 2

    tzname() API changed in Python 3. It used to return bytes, but was changed
    to unicode strings
    """
    def adjust_encoding(*args, **kwargs):
        name = namefunc(*args, **kwargs)
        if name is not None and not PY3:
            name = name.encode()

        return name

    return adjust_encoding


# The following is adapted from Alexander Belopolsky's tz library
# https://github.com/abalkin/tz
if hasattr(datetime, 'fold'):
    # This is the Python 3.6+ situation: datetime already has a fold attribute
    def enfold(dt, fold=1):
        """
        Provides a unified interface for assigning the ``fold`` attribute to
        datetimes both before and after the implementation of PEP-495.

        :param fold:
            The value for the ``fold`` attribute in the returned datetime. This
            should be either 0 or 1.

        :return:
            Returns an object for which ``getattr(dt, 'fold', 0)`` returns
            ``fold`` for all versions of Python. In versions prior to
            Python 3.6, this is a ``_DatetimeWithFold`` object, which is a
            subclass of :py:class:`datetime.datetime` with the ``fold``
            attribute added, if ``fold`` is 1.

        .. versionadded:: 2.6.0
        """
        return dt.replace(fold=fold)

else:
    class _DatetimeWithFold(datetime):
        """
        This is a class designed to provide a PEP 495-compliant interface for
        Python versions before 3.6. It is used only for dates in a fold, so
        the ``fold`` attribute is fixed at ``1``.

        .. versionadded:: 2.6.0
        """
        __slots__ = ()

        @property
        def fold(self):
            return 1

    def enfold(dt, fold=1):
        """
        Provides a unified interface for assigning the ``fold`` attribute to
        datetimes both before and after the implementation of PEP-495.

        :param fold:
            The value for the ``fold`` attribute in the returned datetime. This
            should be either 0 or 1.

        :return:
            Returns an object for which ``getattr(dt, 'fold', 0)`` returns
            ``fold`` for all versions of Python. In versions prior to
            Python 3.6, this is a ``_DatetimeWithFold`` object, which is a
            subclass of :py:class:`datetime.datetime` with the ``fold``
            attribute added, if ``fold`` is 1.

        .. versionadded:: 2.6.0
        """
        if getattr(dt, 'fold', 0) == fold:
            return dt

        args = dt.timetuple()[:6]
        args += (dt.microsecond, dt.tzinfo)

        if fold:
            return _DatetimeWithFold(*args)
        else:
            return datetime(*args)


def _validate_fromutc_inputs(f):
    """
    The CPython version of ``fromutc`` checks that the input is a ``datetime``
    object and that ``self`` is attached as its ``tzinfo``.
    """
    @wraps(f)
    def fromutc(self, dt):
        if not isinstance(dt, datetime):
            raise TypeError("fromutc() requires a datetime argument")
        if dt.tzinfo is not self:
            raise ValueError("dt.tzinfo is not self")

        return f(self, dt)

    return fromutc


class _tzinfo(tzinfo):
    """
    Base class for all ``dateutil`` ``tzinfo`` objects.
    """

    def is_ambiguous(self, dt):
        """
        Whether or not the "wall time" of a given datetime is ambiguous in this
        zone.

        :param dt:
            A :py:class:`datetime.datetime`, naive or time zone aware.


        :return:
            Returns ``True`` if ambiguous, ``False`` otherwise.

        .. versionadded:: 2.6.0
        """

        dt = dt.replace(tzinfo=self)

        wall_0 = enfold(dt, fold=0)
        wall_1 = enfold(dt, fold=1)

        same_offset = wall_0.utcoffset() == wall_1.utcoffset()
        same_dt = wall_0.replace(tzinfo=None) == wall_1.replace(tzinfo=None)

        return same_dt and not same_offset

    def _fold_status(self, dt_utc, dt_wall):
        """
        Determine the fold status of a "wall" datetime, given a representation
        of the same datetime as a (naive) UTC datetime. This is calculated based
        on the assumption that ``dt.utcoffset() - dt.dst()`` is constant for all
        datetimes, and that this offset is the actual number of hours separating
        ``dt_utc`` and ``dt_wall``.

        :param dt_utc:
            Representation of the datetime as UTC

        :param dt_wall:
            Representation of the datetime as "wall time". This parameter must
            either have a `fold` attribute or have a fold-naive
            :class:`datetime.tzinfo` attached, otherwise the calculation may
            fail.
        """
        if self.is_ambiguous(dt_wall):
            delta_wall = dt_wall - dt_utc
            _fold = int(delta_wall == (dt_utc.utcoffset() - dt_utc.dst()))
        else:
            _fold = 0

        return _fold

    def _fold(self, dt):
        return getattr(dt, 'fold', 0)

    def _fromutc(self, dt):
        """
        Given a timezone-aware datetime in a given timezone, calculates a
        timezone-aware datetime in a new timezone.

        Since this is the one time that we *know* we have an unambiguous
        datetime object, we take this opportunity to determine whether the
        datetime is ambiguous and in a "fold" state (e.g. if it's the first
        occurrence, chronologically, of the ambiguous datetime).

        :param dt:
            A timezone-aware :class:`datetime.datetime` object.
        """

        # Re-implement the algorithm from Python's datetime.py
        dtoff = dt.utcoffset()
        if dtoff is None:
            raise ValueError("fromutc() requires a non-None utcoffset() "
                             "result")

        # The original datetime.py code assumes that `dst()` defaults to
        # zero during ambiguous times. PEP 495 inverts this presumption, so
        # for pre-PEP 495 versions of python, we need to tweak the algorithm.
        dtdst = dt.dst()
        if dtdst is None:
            raise ValueError("fromutc() requires a non-None dst() result")
        delta = dtoff - dtdst

        dt += delta
        # Set fold=1 so we can default to being in the fold for
        # ambiguous dates.
        dtdst = enfold(dt, fold=1).dst()
        if dtdst is None:
            raise ValueError("fromutc(): dt.dst gave inconsistent "
                             "results; cannot convert")
        return dt + dtdst

    @_validate_fromutc_inputs
    def fromutc(self, dt):
        """
        Given a timezone-aware datetime in a given timezone, calculates a
        timezone-aware datetime in a new timezone.

        Since this is the one time that we *know* we have an unambiguous
        datetime object, we take this opportunity to determine whether the
        datetime is ambiguous and in a "fold" state (e.g. if it's the first
        occurrence, chronologically, of the ambiguous datetime).

        :param dt:
            A timezone-aware :class:`datetime.datetime` object.
        """
        dt_wall = self._fromutc(dt)

        # Calculate the fold status given the two datetimes.
        _fold = self._fold_status(dt, dt_wall)

        # Set the default fold value for ambiguous dates
        return enfold(dt_wall, fold=_fold)


class tzrangebase(_tzinfo):
    """
    This is an abstract base class for time zones represented by an annual
    transition into and out of DST. Child classes should implement the following
    methods:

        * ``__init__(self, *args, **kwargs)``
        * ``transitions(self, year)`` - this is expected to return a tuple of
          datetimes representing the DST on and off transitions in standard
          time.

    A fully initialized ``tzrangebase`` subclass should also provide the
    following attributes:
        * ``hasdst``: Boolean whether or not the zone uses DST.
        * ``_dst_offset`` / ``_std_offset``: :class:`datetime.timedelta` objects
          representing the respective UTC offsets.
        * ``_dst_abbr`` / ``_std_abbr``: Strings representing the timezone short
          abbreviations in DST and STD, respectively.

    .. versionadded:: 2.6.0
    """
    def __init__(self):
        raise NotImplementedError('tzrangebase is an abstract base class')

    def utcoffset(self, dt):
        isdst = self._isdst(dt)

        if isdst is None:
            return None
        elif isdst:
            return self._dst_offset
        else:
            return self._std_offset

    def dst(self, dt):
        isdst = self._isdst(dt)

        if isdst is None:
            return None
        elif isdst:
            return self._dst_base_offset
        else:
            return ZERO

    @tzname_in_python2
    def tzname(self, dt):
        if self._isdst(dt):
            return self._dst_abbr
        else:
            return self._std_abbr

    def fromutc(self, dt):
        """ Given a datetime in UTC, return local time """
        if not isinstance(dt, datetime):
            raise TypeError("fromutc() requires a datetime argument")

        if dt.tzinfo is not self:
            raise ValueError("dt.tzinfo is not self")

        # Get transitions - if there are none, fixed offset
        transitions = self.transitions(dt.year)
        if transitions is None:
            return dt + self.utcoffset(dt)

        # Get the transition times in UTC
        dston, dstoff = transitions

        dston -= self._std_offset
        dstoff -= self._std_offset

        utc_transitions = (dston, dstoff)
        dt_utc = dt.replace(tzinfo=None)

        isdst = self._naive_isdst(dt_utc, utc_transitions)

        if isdst:
            dt_wall = dt + self._dst_offset
        else:
            dt_wall = dt + self._std_offset

        _fold = int(not isdst and self.is_ambiguous(dt_wall))

        return enfold(dt_wall, fold=_fold)

    def is_ambiguous(self, dt):
        """
        Whether or not the "wall time" of a given datetime is ambiguous in this
        zone.

        :param dt:
            A :py:class:`datetime.datetime`, naive or time zone aware.


        :return:
            Returns ``True`` if ambiguous, ``False`` otherwise.

        .. versionadded:: 2.6.0
        """
        if not self.hasdst:
            return False

        start, end = self.transitions(dt.year)

        dt = dt.replace(tzinfo=None)
        return (end <= dt < end + self._dst_base_offset)

    def _isdst(self, dt):
        if not self.hasdst:
            return False
        elif dt is None:
            return None

        transitions = self.transitions(dt.year)

        if transitions is None:
            return False

        dt = dt.replace(tzinfo=None)

        isdst = self._naive_isdst(dt, transitions)

        # Handle ambiguous dates
        if not isdst and self.is_ambiguous(dt):
            return not self._fold(dt)
        else:
            return isdst

    def _naive_isdst(self, dt, transitions):
        dston, dstoff = transitions

        dt = dt.replace(tzinfo=None)

        if dston < dstoff:
            isdst = dston <= dt < dstoff
        else:
            isdst = not dstoff <= dt < dston

        return isdst

    @property
    def _dst_base_offset(self):
        return self._dst_offset - self._std_offset

    __hash__ = None

    def __ne__(self, other):
        return not (self == other)

    def __repr__(self):
        return "%s(...)" % self.__class__.__name__

    __reduce__ = object.__reduce__
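

# A minimal illustrative subclass (a sketch, not part of dateutil), showing the
# attributes and the ``transitions()`` method that concrete ``tzrangebase``
# subclasses are expected to provide. The transition dates are hard-coded here
# purely for the example.
class _ExampleRangeZone(tzrangebase):
    def __init__(self):
        self._std_offset = timedelta(hours=-5)
        self._dst_offset = timedelta(hours=-4)
        self._std_abbr = 'EST'
        self._dst_abbr = 'EDT'
        self.hasdst = True

    def transitions(self, year):
        # DST switches on at 2 AM on March 8 and off at 2 AM on November 1,
        # both expressed in standard (non-DST) time.
        return (datetime(year, 3, 8, 2), datetime(year, 11, 1, 2))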


def _total_seconds(td):
    # Python 2.6 doesn't have a total_seconds() method on timedelta objects
    return ((td.seconds + td.days * 86400) * 1000000 +
            td.microseconds) // 1000000


_total_seconds = getattr(timedelta, 'total_seconds', _total_seconds)
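
# For example, _total_seconds(timedelta(days=1, seconds=30)) == 86430, whether
# the shim above or the native timedelta.total_seconds() is used.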
site-packages/dateutil/tz/tz.py
# -*- coding: utf-8 -*-
"""
This module offers timezone implementations subclassing the abstract
:py:class:`datetime.tzinfo` type. There are classes to handle tzfile format files
(usually found in :file:`/etc/localtime`, :file:`/usr/share/zoneinfo`, etc.), TZ
environment string (in all known formats), given ranges (with help from
relative deltas), local machine timezone, fixed offset timezone, and UTC
timezone.
"""
import datetime
import struct
import time
import sys
import os
import bisect

from six import string_types
from ._common import tzname_in_python2, _tzinfo, _total_seconds
from ._common import tzrangebase, enfold
from ._common import _validate_fromutc_inputs

try:
    from .win import tzwin, tzwinlocal
except ImportError:
    tzwin = tzwinlocal = None

ZERO = datetime.timedelta(0)
EPOCH = datetime.datetime.utcfromtimestamp(0)
EPOCHORDINAL = EPOCH.toordinal()


class tzutc(datetime.tzinfo):
    """
    This is a tzinfo object that represents the UTC time zone.
    """
    def utcoffset(self, dt):
        return ZERO

    def dst(self, dt):
        return ZERO

    @tzname_in_python2
    def tzname(self, dt):
        return "UTC"

    def is_ambiguous(self, dt):
        """
        Whether or not the "wall time" of a given datetime is ambiguous in this
        zone.

        :param dt:
            A :py:class:`datetime.datetime`, naive or time zone aware.


        :return:
            Returns ``True`` if ambiguous, ``False`` otherwise.

        .. versionadded:: 2.6.0
        """
        return False

    @_validate_fromutc_inputs
    def fromutc(self, dt):
        """
        Fast-track version of ``fromutc()`` that returns the original ``dt``
        object for any valid :py:class:`datetime.datetime` object.
        """
        return dt

    def __eq__(self, other):
        if not isinstance(other, (tzutc, tzoffset)):
            return NotImplemented

        return (isinstance(other, tzutc) or
                (isinstance(other, tzoffset) and other._offset == ZERO))

    __hash__ = None

    def __ne__(self, other):
        return not (self == other)

    def __repr__(self):
        return "%s()" % self.__class__.__name__

    __reduce__ = object.__reduce__
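
# Illustrative usage (not part of the original source):
#
#     >>> datetime.datetime(2017, 1, 1, tzinfo=tzutc()).tzname()
#     'UTC'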


class tzoffset(datetime.tzinfo):
    """
    A simple class for representing a fixed offset from UTC.

    :param name:
        The timezone name, to be returned when ``tzname()`` is called.

    :param offset:
        The time zone offset in seconds, or (since version 2.6.0) a
        :py:class:`datetime.timedelta` object representing the offset.
    """
    def __init__(self, name, offset):
        self._name = name

        try:
            # Allow a timedelta
            offset = _total_seconds(offset)
        except (TypeError, AttributeError):
            pass
        self._offset = datetime.timedelta(seconds=offset)

    def utcoffset(self, dt):
        return self._offset

    def dst(self, dt):
        return ZERO

    @tzname_in_python2
    def tzname(self, dt):
        return self._name

    @_validate_fromutc_inputs
    def fromutc(self, dt):
        return dt + self._offset

    def is_ambiguous(self, dt):
        """
        Whether or not the "wall time" of a given datetime is ambiguous in this
        zone.

        :param dt:
            A :py:class:`datetime.datetime`, naive or time zone aware.


        :return:
            Returns ``True`` if ambiguous, ``False`` otherwise.

        .. versionadded:: 2.6.0
        """
        return False

    def __eq__(self, other):
        if not isinstance(other, tzoffset):
            return NotImplemented

        return self._offset == other._offset

    __hash__ = None

    def __ne__(self, other):
        return not (self == other)

    def __repr__(self):
        return "%s(%s, %s)" % (self.__class__.__name__,
                               repr(self._name),
                               int(_total_seconds(self._offset)))

    __reduce__ = object.__reduce__
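
# Illustrative usage (not part of the original source):
#
#     >>> bst = tzoffset('BST', 3600)                  # +01:00, given in seconds
#     >>> datetime.datetime(2017, 7, 1, tzinfo=bst).tzname()
#     'BST'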


class tzlocal(_tzinfo):
    """
    A :class:`tzinfo` subclass built around the ``time`` timezone functions.
    """
    def __init__(self):
        super(tzlocal, self).__init__()

        self._std_offset = datetime.timedelta(seconds=-time.timezone)
        if time.daylight:
            self._dst_offset = datetime.timedelta(seconds=-time.altzone)
        else:
            self._dst_offset = self._std_offset

        self._dst_saved = self._dst_offset - self._std_offset
        self._hasdst = bool(self._dst_saved)

    def utcoffset(self, dt):
        if dt is None and self._hasdst:
            return None

        if self._isdst(dt):
            return self._dst_offset
        else:
            return self._std_offset

    def dst(self, dt):
        if dt is None and self._hasdst:
            return None

        if self._isdst(dt):
            return self._dst_offset - self._std_offset
        else:
            return ZERO

    @tzname_in_python2
    def tzname(self, dt):
        return time.tzname[self._isdst(dt)]

    def is_ambiguous(self, dt):
        """
        Whether or not the "wall time" of a given datetime is ambiguous in this
        zone.

        :param dt:
            A :py:class:`datetime.datetime`, naive or time zone aware.


        :return:
            Returns ``True`` if ambiguous, ``False`` otherwise.

        .. versionadded:: 2.6.0
        """
        naive_dst = self._naive_is_dst(dt)
        return (not naive_dst and
                (naive_dst != self._naive_is_dst(dt - self._dst_saved)))

    def _naive_is_dst(self, dt):
        timestamp = _datetime_to_timestamp(dt)
        return time.localtime(timestamp + time.timezone).tm_isdst

    def _isdst(self, dt, fold_naive=True):
        # We can't use mktime here. It is unstable when deciding if
        # the hour near to a change is DST or not.
        #
        # timestamp = time.mktime((dt.year, dt.month, dt.day, dt.hour,
        #                         dt.minute, dt.second, dt.weekday(), 0, -1))
        # return time.localtime(timestamp).tm_isdst
        #
        # The code above yields the following result:
        #
        # >>> import tz, datetime
        # >>> t = tz.tzlocal()
        # >>> datetime.datetime(2003,2,15,23,tzinfo=t).tzname()
        # 'BRDT'
        # >>> datetime.datetime(2003,2,16,0,tzinfo=t).tzname()
        # 'BRST'
        # >>> datetime.datetime(2003,2,15,23,tzinfo=t).tzname()
        # 'BRST'
        # >>> datetime.datetime(2003,2,15,22,tzinfo=t).tzname()
        # 'BRDT'
        # >>> datetime.datetime(2003,2,15,23,tzinfo=t).tzname()
        # 'BRDT'
        #
        # Here is a more stable implementation:
        #
        if not self._hasdst:
            return False

        # Check for ambiguous times:
        dstval = self._naive_is_dst(dt)
        fold = getattr(dt, 'fold', None)

        if self.is_ambiguous(dt):
            if fold is not None:
                return not self._fold(dt)
            else:
                return True

        return dstval

    def __eq__(self, other):
        if not isinstance(other, tzlocal):
            return NotImplemented

        return (self._std_offset == other._std_offset and
                self._dst_offset == other._dst_offset)

    __hash__ = None

    def __ne__(self, other):
        return not (self == other)

    def __repr__(self):
        return "%s()" % self.__class__.__name__

    __reduce__ = object.__reduce__
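
# Illustrative usage (not part of the original source): tzlocal() follows the
# current system zone rules, so the result below depends on machine settings.
#
#     >>> datetime.datetime.now(tzlocal()).tzname()    # e.g. 'CET' or 'EST'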


class _ttinfo(object):
    __slots__ = ["offset", "delta", "isdst", "abbr",
                 "isstd", "isgmt", "dstoffset"]

    def __init__(self):
        for attr in self.__slots__:
            setattr(self, attr, None)

    def __repr__(self):
        l = []
        for attr in self.__slots__:
            value = getattr(self, attr)
            if value is not None:
                l.append("%s=%s" % (attr, repr(value)))
        return "%s(%s)" % (self.__class__.__name__, ", ".join(l))

    def __eq__(self, other):
        if not isinstance(other, _ttinfo):
            return NotImplemented

        return (self.offset == other.offset and
                self.delta == other.delta and
                self.isdst == other.isdst and
                self.abbr == other.abbr and
                self.isstd == other.isstd and
                self.isgmt == other.isgmt and
                self.dstoffset == other.dstoffset)

    __hash__ = None

    def __ne__(self, other):
        return not (self == other)

    def __getstate__(self):
        state = {}
        for name in self.__slots__:
            state[name] = getattr(self, name, None)
        return state

    def __setstate__(self, state):
        for name in self.__slots__:
            if name in state:
                setattr(self, name, state[name])


class _tzfile(object):
    """
    Lightweight class for holding the relevant transition and time zone
    information read from binary tzfiles.
    """
    attrs = ['trans_list', 'trans_list_utc', 'trans_idx', 'ttinfo_list',
             'ttinfo_std', 'ttinfo_dst', 'ttinfo_before', 'ttinfo_first']

    def __init__(self, **kwargs):
        for attr in self.attrs:
            setattr(self, attr, kwargs.get(attr, None))


class tzfile(_tzinfo):
    """
    This is a ``tzinfo`` subclass that allows one to use the ``tzfile(5)``
    format timezone files to extract current and historical zone information.

    :param fileobj:
        This can be an opened file stream or a file name that the time zone
        information can be read from.

    :param filename:
        This is an optional parameter specifying the source of the time zone
        information in the event that ``fileobj`` is a file object. If omitted
        and ``fileobj`` is a file stream, this parameter will be set either to
        ``fileobj``'s ``name`` attribute or to ``repr(fileobj)``.

    See `Sources for Time Zone and Daylight Saving Time Data
    <http://www.twinsun.com/tz/tz-link.htm>`_ for more information. Time zone
    files can be compiled from the `IANA Time Zone database files
    <https://www.iana.org/time-zones>`_ with the `zic time zone compiler
    <https://www.freebsd.org/cgi/man.cgi?query=zic&sektion=8>`_
    """

    def __init__(self, fileobj, filename=None):
        super(tzfile, self).__init__()

        file_opened_here = False
        if isinstance(fileobj, string_types):
            self._filename = fileobj
            fileobj = open(fileobj, 'rb')
            file_opened_here = True
        elif filename is not None:
            self._filename = filename
        elif hasattr(fileobj, "name"):
            self._filename = fileobj.name
        else:
            self._filename = repr(fileobj)

        if fileobj is not None:
            if not file_opened_here:
                fileobj = _ContextWrapper(fileobj)

            with fileobj as file_stream:
                tzobj = self._read_tzfile(file_stream)

            self._set_tzdata(tzobj)

    def _set_tzdata(self, tzobj):
        """ Set the time zone data of this object from a _tzfile object """
        # Copy the relevant attributes over as private attributes
        for attr in _tzfile.attrs:
            setattr(self, '_' + attr, getattr(tzobj, attr))

    def _read_tzfile(self, fileobj):
        out = _tzfile()

        # From tzfile(5):
        #
        # The time zone information files used by tzset(3)
        # begin with the magic characters "TZif" to identify
        # them as time zone information files, followed by
        # sixteen bytes reserved for future use, followed by
        # six four-byte values of type long, written in a
        # ``standard'' byte order (the high-order  byte
        # of the value is written first).
        if fileobj.read(4).decode() != "TZif":
            raise ValueError("magic not found")

        fileobj.read(16)

        (
            # The number of UTC/local indicators stored in the file.
            ttisgmtcnt,

            # The number of standard/wall indicators stored in the file.
            ttisstdcnt,

            # The number of leap seconds for which data is
            # stored in the file.
            leapcnt,

            # The number of "transition times" for which data
            # is stored in the file.
            timecnt,

            # The number of "local time types" for which data
            # is stored in the file (must not be zero).
            typecnt,

            # The  number  of  characters  of "time zone
            # abbreviation strings" stored in the file.
            charcnt,

        ) = struct.unpack(">6l", fileobj.read(24))

        # The above header is followed by tzh_timecnt four-byte
        # values  of  type long,  sorted  in ascending order.
        # These values are written in ``standard'' byte order.
        # Each is used as a transition time (as  returned  by
        # time(2)) at which the rules for computing local time
        # change.

        if timecnt:
            out.trans_list_utc = list(struct.unpack(">%dl" % timecnt,
                                                    fileobj.read(timecnt*4)))
        else:
            out.trans_list_utc = []

        # Next come tzh_timecnt one-byte values of type unsigned
        # char; each one tells which of the different types of
        # ``local time'' types described in the file is associated
        # with the same-indexed transition time. These values
        # serve as indices into an array of ttinfo structures that
        # appears next in the file.

        if timecnt:
            out.trans_idx = struct.unpack(">%dB" % timecnt,
                                            fileobj.read(timecnt))
        else:
            out.trans_idx = []

        # Each ttinfo structure is written as a four-byte value
        # for tt_gmtoff  of  type long,  in  a  standard  byte
        # order, followed  by a one-byte value for tt_isdst
        # and a one-byte  value  for  tt_abbrind.   In  each
        # structure, tt_gmtoff  gives  the  number  of
        # seconds to be added to UTC, tt_isdst tells whether
        # tm_isdst should be set by  localtime(3),  and
        # tt_abbrind serves  as an index into the array of
        # time zone abbreviation characters that follow the
        # ttinfo structure(s) in the file.

        ttinfo = []

        for i in range(typecnt):
            ttinfo.append(struct.unpack(">lbb", fileobj.read(6)))

        abbr = fileobj.read(charcnt).decode()

        # Then there are tzh_leapcnt pairs of four-byte
        # values, written in  standard byte  order;  the
        # first  value  of  each pair gives the time (as
        # returned by time(2)) at which a leap second
        # occurs;  the  second  gives the  total  number of
        # leap seconds to be applied after the given time.
        # The pairs of values are sorted in ascending order
        # by time.

        # Not used for now (but seek past it to keep the file position correct)
        if leapcnt:
            fileobj.seek(leapcnt * 8, os.SEEK_CUR)

        # Then there are tzh_ttisstdcnt standard/wall
        # indicators, each stored as a one-byte value;
        # they tell whether the transition times associated
        # with local time types were specified as standard
        # time or wall clock time, and are used when
        # a time zone file is used in handling POSIX-style
        # time zone environment variables.

        if ttisstdcnt:
            isstd = struct.unpack(">%db" % ttisstdcnt,
                                  fileobj.read(ttisstdcnt))

        # Finally, there are tzh_ttisgmtcnt UTC/local
        # indicators, each stored as a one-byte value;
        # they tell whether the transition times associated
        # with local time types were specified as UTC or
        # local time, and are used when a time zone file
        # is used in handling POSIX-style time zone envi-
        # ronment variables.

        if ttisgmtcnt:
            isgmt = struct.unpack(">%db" % ttisgmtcnt,
                                  fileobj.read(ttisgmtcnt))

        # Build ttinfo list
        out.ttinfo_list = []
        for i in range(typecnt):
            gmtoff, isdst, abbrind = ttinfo[i]
            # Round to full-minutes if that's not the case. Python's
            # datetime doesn't accept sub-minute timezones. Check
            # http://python.org/sf/1447945 for some information.
            gmtoff = 60 * ((gmtoff + 30) // 60)
            tti = _ttinfo()
            tti.offset = gmtoff
            tti.dstoffset = datetime.timedelta(0)
            tti.delta = datetime.timedelta(seconds=gmtoff)
            tti.isdst = isdst
            tti.abbr = abbr[abbrind:abbr.find('\x00', abbrind)]
            tti.isstd = (ttisstdcnt > i and isstd[i] != 0)
            tti.isgmt = (ttisgmtcnt > i and isgmt[i] != 0)
            out.ttinfo_list.append(tti)

        # Replace ttinfo indexes for ttinfo objects.
        out.trans_idx = [out.ttinfo_list[idx] for idx in out.trans_idx]

        # Set standard, dst, and before ttinfos. before will be
        # used when a given time is before any transitions,
        # and will be set to the first non-dst ttinfo, or to
        # the first dst, if all of them are dst.
        out.ttinfo_std = None
        out.ttinfo_dst = None
        out.ttinfo_before = None
        if out.ttinfo_list:
            if not out.trans_list_utc:
                out.ttinfo_std = out.ttinfo_first = out.ttinfo_list[0]
            else:
                for i in range(timecnt-1, -1, -1):
                    tti = out.trans_idx[i]
                    if not out.ttinfo_std and not tti.isdst:
                        out.ttinfo_std = tti
                    elif not out.ttinfo_dst and tti.isdst:
                        out.ttinfo_dst = tti

                    if out.ttinfo_std and out.ttinfo_dst:
                        break
                else:
                    if out.ttinfo_dst and not out.ttinfo_std:
                        out.ttinfo_std = out.ttinfo_dst

                for tti in out.ttinfo_list:
                    if not tti.isdst:
                        out.ttinfo_before = tti
                        break
                else:
                    out.ttinfo_before = out.ttinfo_list[0]

        # Now fix transition times to become relative to wall time.
        #
        # I'm not sure about this. In my tests, the tz source file
        # is setup to wall time, and in the binary file isstd and
        # isgmt are off, so it should be in wall time. OTOH, it's
        # always in gmt time. Let me know if you have comments
        # about this.
        laststdoffset = None
        out.trans_list = []
        for i, tti in enumerate(out.trans_idx):
            if not tti.isdst:
                offset = tti.offset
                laststdoffset = offset
            else:
                if laststdoffset is not None:
                    # Store the DST offset as well and update it in the list
                    tti.dstoffset = tti.offset - laststdoffset
                    out.trans_idx[i] = tti

                offset = laststdoffset or 0

            out.trans_list.append(out.trans_list_utc[i] + offset)

        # In case we missed any DST offsets on the way in for some reason, make
        # a second pass over the list, looking for the /next/ DST offset.
        laststdoffset = None
        for i in reversed(range(len(out.trans_idx))):
            tti = out.trans_idx[i]
            if tti.isdst:
                if not (tti.dstoffset or laststdoffset is None):
                    tti.dstoffset = tti.offset - laststdoffset
            else:
                laststdoffset = tti.offset

            if not isinstance(tti.dstoffset, datetime.timedelta):
                tti.dstoffset = datetime.timedelta(seconds=tti.dstoffset)

            out.trans_idx[i] = tti

        out.trans_idx = tuple(out.trans_idx)
        out.trans_list = tuple(out.trans_list)
        out.trans_list_utc = tuple(out.trans_list_utc)

        return out

    def _find_last_transition(self, dt, in_utc=False):
        # If there's no list, there are no transitions to find
        if not self._trans_list:
            return None

        timestamp = _datetime_to_timestamp(dt)

        # Find where the timestamp fits in the transition list - if the
        # timestamp is a transition time, it's part of the "after" period.
        trans_list = self._trans_list_utc if in_utc else self._trans_list
        idx = bisect.bisect_right(trans_list, timestamp)

        # We want to know when the previous transition was, so subtract off 1
        return idx - 1        

    def _get_ttinfo(self, idx):
        # For no list or after the last transition, default to _ttinfo_std
        if idx is None or (idx + 1) >= len(self._trans_list):
            return self._ttinfo_std

        # If there is a list and the time is before it, return _ttinfo_before
        if idx < 0:
            return self._ttinfo_before

        return self._trans_idx[idx]

    def _find_ttinfo(self, dt):
        idx = self._resolve_ambiguous_time(dt)

        return self._get_ttinfo(idx)

    def fromutc(self, dt):
        """
        The ``tzfile`` implementation of :py:func:`datetime.tzinfo.fromutc`.

        :param dt:
            A :py:class:`datetime.datetime` object.

        :raises TypeError:
            Raised if ``dt`` is not a :py:class:`datetime.datetime` object.

        :raises ValueError:
            Raised if this is called with a ``dt`` which does not have this
            ``tzinfo`` attached.

        :return:
            Returns a :py:class:`datetime.datetime` object representing the
            wall time in ``self``'s time zone.
        """
        # These isinstance checks are in datetime.tzinfo, so we'll preserve
        # them, even if we don't care about duck typing.
        if not isinstance(dt, datetime.datetime):
            raise TypeError("fromutc() requires a datetime argument")

        if dt.tzinfo is not self:
            raise ValueError("dt.tzinfo is not self")

        # First treat UTC as wall time and get the transition we're in.
        idx = self._find_last_transition(dt, in_utc=True)
        tti = self._get_ttinfo(idx)

        dt_out = dt + datetime.timedelta(seconds=tti.offset)

        fold = self.is_ambiguous(dt_out, idx=idx)

        return enfold(dt_out, fold=int(fold))

    def is_ambiguous(self, dt, idx=None):
        """
        Whether or not the "wall time" of a given datetime is ambiguous in this
        zone.

        :param dt:
            A :py:class:`datetime.datetime`, naive or time zone aware.


        :return:
            Returns ``True`` if ambiguous, ``False`` otherwise.

        .. versionadded:: 2.6.0
        """
        if idx is None:
            idx = self._find_last_transition(dt)

        # Calculate the difference in offsets from current to previous
        timestamp = _datetime_to_timestamp(dt)
        tti = self._get_ttinfo(idx)

        if idx is None or idx <= 0:
            return False

        od = self._get_ttinfo(idx - 1).offset - tti.offset
        tt = self._trans_list[idx]          # Transition time

        return timestamp < tt + od

    def _resolve_ambiguous_time(self, dt):
        idx = self._find_last_transition(dt)

        # If we have no transitions, return the index
        _fold = self._fold(dt)
        if idx is None or idx == 0:
            return idx

        # If it's ambiguous and we're in a fold, shift to a different index.
        idx_offset = int(not _fold and self.is_ambiguous(dt, idx))

        return idx - idx_offset

    def utcoffset(self, dt):
        if dt is None:
            return None

        if not self._ttinfo_std:
            return ZERO

        return self._find_ttinfo(dt).delta

    def dst(self, dt):
        if dt is None:
            return None

        if not self._ttinfo_dst:
            return ZERO

        tti = self._find_ttinfo(dt)

        if not tti.isdst:
            return ZERO

        # The documentation says that utcoffset()-dst() must
        # be constant for every dt.
        return tti.dstoffset

    @tzname_in_python2
    def tzname(self, dt):
        if not self._ttinfo_std or dt is None:
            return None
        return self._find_ttinfo(dt).abbr

    def __eq__(self, other):
        if not isinstance(other, tzfile):
            return NotImplemented
        return (self._trans_list == other._trans_list and
                self._trans_idx == other._trans_idx and
                self._ttinfo_list == other._ttinfo_list)

    __hash__ = None

    def __ne__(self, other):
        return not (self == other)

    def __repr__(self):
        return "%s(%s)" % (self.__class__.__name__, repr(self._filename))

    def __reduce__(self):
        return self.__reduce_ex__(None)

    def __reduce_ex__(self, protocol):
        return (self.__class__, (None, self._filename), self.__dict__)


class tzrange(tzrangebase):
    """
    The ``tzrange`` object is a time zone specified by a set of offsets and
    abbreviations, equivalent to the way the ``TZ`` variable can be specified
    in POSIX-like systems, but using Python delta objects to specify DST
    start, end and offsets.

    :param stdabbr:
        The abbreviation for standard time (e.g. ``'EST'``).

    :param stdoffset:
        An integer or :class:`datetime.timedelta` object or equivalent
        specifying the base offset from UTC.

        If unspecified, +00:00 is used.

    :param dstabbr:
        The abbreviation for DST / "Summer" time (e.g. ``'EDT'``).

        If specified, with no other DST information, DST is assumed to occur
        and the default behavior of ``dstoffset``, ``start`` and ``end`` is
        used. If unspecified and no other DST information is specified, it
        is assumed that this zone has no DST.

        If this is unspecified and other DST information *is* specified,
        DST occurs in the zone but the time zone abbreviation is left
        unchanged.

    :param dstoffset:
        An integer or :class:`datetime.timedelta` object or equivalent
        specifying the UTC offset during DST. If unspecified and any other DST
        information is specified, it is assumed to be the STD offset +1 hour.

    :param start:
        A :class:`relativedelta.relativedelta` object or equivalent specifying
        the time and time of year that daylight savings time starts. To specify,
        for example, that DST starts at 2AM on the 2nd Sunday in March, pass:

            ``relativedelta(hours=2, month=3, day=1, weekday=SU(+2))``

        If unspecified and any other DST information is specified, the default
        value is 2 AM on the first Sunday in April.

    :param end:
        A :class:`relativedelta.relativedelta` object or equivalent representing
        the time and time of year that daylight savings time ends, with the
        same specification method as in ``start``. One note is that this should
        point to the first time in the *standard* zone, so if a transition
        occurs at 2AM in the DST zone and the clocks are set back 1 hour to 1AM,
        set the `hours` parameter to +1.


    **Examples:**

    .. testsetup:: tzrange

        from dateutil.tz import tzrange, tzstr

    .. doctest:: tzrange

        >>> tzstr('EST5EDT') == tzrange("EST", -18000, "EDT")
        True

        >>> from dateutil.relativedelta import *
        >>> range1 = tzrange("EST", -18000, "EDT")
        >>> range2 = tzrange("EST", -18000, "EDT", -14400,
        ...                  relativedelta(hours=+2, month=4, day=1,
        ...                                weekday=SU(+1)),
        ...                  relativedelta(hours=+1, month=10, day=31,
        ...                                weekday=SU(-1)))
        >>> tzstr('EST5EDT') == range1 == range2
        True

    """
    def __init__(self, stdabbr, stdoffset=None,
                 dstabbr=None, dstoffset=None,
                 start=None, end=None):

        global relativedelta
        from dateutil import relativedelta

        self._std_abbr = stdabbr
        self._dst_abbr = dstabbr

        try:
            stdoffset = _total_seconds(stdoffset)
        except (TypeError, AttributeError):
            pass

        try:
            dstoffset = _total_seconds(dstoffset)
        except (TypeError, AttributeError):
            pass

        if stdoffset is not None:
            self._std_offset = datetime.timedelta(seconds=stdoffset)
        else:
            self._std_offset = ZERO

        if dstoffset is not None:
            self._dst_offset = datetime.timedelta(seconds=dstoffset)
        elif dstabbr and stdoffset is not None:
            self._dst_offset = self._std_offset + datetime.timedelta(hours=+1)
        else:
            self._dst_offset = ZERO

        if dstabbr and start is None:
            self._start_delta = relativedelta.relativedelta(
                hours=+2, month=4, day=1, weekday=relativedelta.SU(+1))
        else:
            self._start_delta = start

        if dstabbr and end is None:
            self._end_delta = relativedelta.relativedelta(
                hours=+1, month=10, day=31, weekday=relativedelta.SU(-1))
        else:
            self._end_delta = end

        self._dst_base_offset_ = self._dst_offset - self._std_offset
        self.hasdst = bool(self._start_delta)

    def transitions(self, year):
        """
        For a given year, get the DST on and off transition times, expressed
        always on the standard time side. For zones with no transitions, this
        function returns ``None``.

        :param year:
            The year whose transitions you would like to query.

        :return:
            Returns a :class:`tuple` of :class:`datetime.datetime` objects,
            ``(dston, dstoff)`` for zones with an annual DST transition, or
            ``None`` for fixed offset zones.
        """
        if not self.hasdst:
            return None

        base_year = datetime.datetime(year, 1, 1)

        start = base_year + self._start_delta
        end = base_year + self._end_delta

        return (start, end)

    def __eq__(self, other):
        if not isinstance(other, tzrange):
            return NotImplemented

        return (self._std_abbr == other._std_abbr and
                self._dst_abbr == other._dst_abbr and
                self._std_offset == other._std_offset and
                self._dst_offset == other._dst_offset and
                self._start_delta == other._start_delta and
                self._end_delta == other._end_delta)

    @property
    def _dst_base_offset(self):
        return self._dst_base_offset_


class tzstr(tzrange):
    """
    ``tzstr`` objects are time zone objects specified by a time-zone string as
    it would be passed to a ``TZ`` variable on POSIX-style systems (see
    the `GNU C Library: TZ Variable`_ for more details).

    There is one notable exception, which is that POSIX-style time zones use an
    inverted offset format, so normally ``GMT+3`` would be parsed as an offset
    3 hours *behind* GMT. The ``tzstr`` time zone object will parse this as an
    offset 3 hours *ahead* of GMT. If you would like to maintain the POSIX
    behavior, pass a ``True`` value to ``posix_offset``.

    The :class:`tzrange` object provides the same functionality, but is
    specified using :class:`relativedelta.relativedelta` objects rather than
    strings.

    :param s:
        A time zone string in ``TZ`` variable format. This can be a
        :class:`bytes` (2.x: :class:`str`), :class:`str` (2.x: :class:`unicode`)
        or a stream emitting unicode characters (e.g. :class:`StringIO`).

    :param posix_offset:
        Optional. If set to ``True``, interpret strings such as ``GMT+3`` or
        ``UTC+3`` as being 3 hours *behind* UTC rather than ahead, per the
        POSIX standard.

    .. _`GNU C Library: TZ Variable`:
        https://www.gnu.org/software/libc/manual/html_node/TZ-Variable.html
    """
    def __init__(self, s, posix_offset=False):
        global parser
        from dateutil import parser

        self._s = s

        res = parser._parsetz(s)
        if res is None:
            raise ValueError("unknown string format")

        # Here we break the compatibility with the TZ variable handling.
        # GMT-3 actually *means* the timezone -3.
        if res.stdabbr in ("GMT", "UTC") and not posix_offset:
            res.stdoffset *= -1

        # We must initialize it first, since _delta() needs
        # _std_offset and _dst_offset set. Use False in start/end
        # to avoid building it two times.
        tzrange.__init__(self, res.stdabbr, res.stdoffset,
                         res.dstabbr, res.dstoffset,
                         start=False, end=False)

        if not res.dstabbr:
            self._start_delta = None
            self._end_delta = None
        else:
            self._start_delta = self._delta(res.start)
            if self._start_delta:
                self._end_delta = self._delta(res.end, isend=1)

        self.hasdst = bool(self._start_delta)

    def _delta(self, x, isend=0):
        from dateutil import relativedelta
        kwargs = {}
        if x.month is not None:
            kwargs["month"] = x.month
            if x.weekday is not None:
                kwargs["weekday"] = relativedelta.weekday(x.weekday, x.week)
                if x.week > 0:
                    kwargs["day"] = 1
                else:
                    kwargs["day"] = 31
            elif x.day:
                kwargs["day"] = x.day
        elif x.yday is not None:
            kwargs["yearday"] = x.yday
        elif x.jyday is not None:
            kwargs["nlyearday"] = x.jyday
        if not kwargs:
            # Default is to start on first sunday of april, and end
            # on last sunday of october.
            if not isend:
                kwargs["month"] = 4
                kwargs["day"] = 1
                kwargs["weekday"] = relativedelta.SU(+1)
            else:
                kwargs["month"] = 10
                kwargs["day"] = 31
                kwargs["weekday"] = relativedelta.SU(-1)
        if x.time is not None:
            kwargs["seconds"] = x.time
        else:
            # Default is 2AM.
            kwargs["seconds"] = 7200
        if isend:
            # Convert to standard time, to follow the documented way
            # of working with the extra hour. See the documentation
            # of the tzinfo class.
            delta = self._dst_offset - self._std_offset
            kwargs["seconds"] -= delta.seconds + delta.days * 86400
        return relativedelta.relativedelta(**kwargs)

    def __repr__(self):
        return "%s(%s)" % (self.__class__.__name__, repr(self._s))


class _tzicalvtzcomp(object):
    def __init__(self, tzoffsetfrom, tzoffsetto, isdst,
                 tzname=None, rrule=None):
        self.tzoffsetfrom = datetime.timedelta(seconds=tzoffsetfrom)
        self.tzoffsetto = datetime.timedelta(seconds=tzoffsetto)
        self.tzoffsetdiff = self.tzoffsetto - self.tzoffsetfrom
        self.isdst = isdst
        self.tzname = tzname
        self.rrule = rrule


class _tzicalvtz(_tzinfo):
    def __init__(self, tzid, comps=[]):
        super(_tzicalvtz, self).__init__()

        self._tzid = tzid
        self._comps = comps
        self._cachedate = []
        self._cachecomp = []

    def _find_comp(self, dt):
        if len(self._comps) == 1:
            return self._comps[0]

        dt = dt.replace(tzinfo=None)

        try:
            return self._cachecomp[self._cachedate.index((dt, self._fold(dt)))]
        except ValueError:
            pass

        lastcompdt = None
        lastcomp = None

        for comp in self._comps:
            compdt = self._find_compdt(comp, dt)

            if compdt and (not lastcompdt or lastcompdt < compdt):
                lastcompdt = compdt
                lastcomp = comp

        if not lastcomp:
            # RFC says nothing about what to do when a given
            # time is before the first onset date. We'll look for the
            # first standard component, or the first component, if
            # none is found.
            for comp in self._comps:
                if not comp.isdst:
                    lastcomp = comp
                    break
            else:
                lastcomp = self._comps[0]

        self._cachedate.insert(0, (dt, self._fold(dt)))
        self._cachecomp.insert(0, lastcomp)

        if len(self._cachedate) > 10:
            self._cachedate.pop()
            self._cachecomp.pop()

        return lastcomp

    def _find_compdt(self, comp, dt):
        if comp.tzoffsetdiff < ZERO and self._fold(dt):
            dt -= comp.tzoffsetdiff

        compdt = comp.rrule.before(dt, inc=True)

        return compdt

    def utcoffset(self, dt):
        if dt is None:
            return None

        return self._find_comp(dt).tzoffsetto

    def dst(self, dt):
        comp = self._find_comp(dt)
        if comp.isdst:
            return comp.tzoffsetdiff
        else:
            return ZERO

    @tzname_in_python2
    def tzname(self, dt):
        return self._find_comp(dt).tzname

    def __repr__(self):
        return "<tzicalvtz %s>" % repr(self._tzid)

    __reduce__ = object.__reduce__


class tzical(object):
    """
    This object is designed to parse an iCalendar-style ``VTIMEZONE`` structure
    as set out in `RFC 2445`_ Section 4.6.5 into one or more `tzinfo` objects.

    :param `fileobj`:
        A file or stream in iCalendar format, which should be UTF-8 encoded
        with CRLF endings.

    .. _`RFC 2445`: https://www.ietf.org/rfc/rfc2445.txt
    """
    def __init__(self, fileobj):
        global rrule
        from dateutil import rrule

        if isinstance(fileobj, string_types):
            self._s = fileobj
            # ical should be encoded in UTF-8 with CRLF
            fileobj = open(fileobj, 'r')
        else:
            self._s = getattr(fileobj, 'name', repr(fileobj))
            fileobj = _ContextWrapper(fileobj)

        self._vtz = {}

        with fileobj as fobj:
            self._parse_rfc(fobj.read())

    def keys(self):
        """
        Retrieves the available time zones as a list.
        """
        return list(self._vtz.keys())

    def get(self, tzid=None):
        """
        Retrieve a :py:class:`datetime.tzinfo` object by its ``tzid``.

        :param tzid:
            If there is exactly one time zone available, omitting ``tzid``
            or passing :py:const:`None` value returns it. Otherwise a valid
            key (which can be retrieved from :func:`keys`) is required.

        :raises ValueError:
            Raised if ``tzid`` is not specified and there is either more than
            one zone defined or no zones defined at all.

        :returns:
            Returns either a :py:class:`datetime.tzinfo` object representing
            the relevant time zone or :py:const:`None` if the ``tzid`` was
            not found.
        """
        if tzid is None:
            if len(self._vtz) == 0:
                raise ValueError("no timezones defined")
            elif len(self._vtz) > 1:
                raise ValueError("more than one timezone available")
            tzid = next(iter(self._vtz))

        return self._vtz.get(tzid)

    def _parse_offset(self, s):
        s = s.strip()
        if not s:
            raise ValueError("empty offset")
        if s[0] in ('+', '-'):
            signal = (-1, +1)[s[0] == '+']
            s = s[1:]
        else:
            signal = +1
        if len(s) == 4:
            return (int(s[:2]) * 3600 + int(s[2:]) * 60) * signal
        elif len(s) == 6:
            return (int(s[:2]) * 3600 + int(s[2:4]) * 60 + int(s[4:])) * signal
        else:
            raise ValueError("invalid offset: " + s)

    def _parse_rfc(self, s):
        lines = s.splitlines()
        if not lines:
            raise ValueError("empty string")

        # Unfold
        i = 0
        while i < len(lines):
            line = lines[i].rstrip()
            if not line:
                del lines[i]
            elif i > 0 and line[0] == " ":
                lines[i-1] += line[1:]
                del lines[i]
            else:
                i += 1

        tzid = None
        comps = []
        invtz = False
        comptype = None
        for line in lines:
            if not line:
                continue
            name, value = line.split(':', 1)
            parms = name.split(';')
            if not parms:
                raise ValueError("empty property name")
            name = parms[0].upper()
            parms = parms[1:]
            if invtz:
                if name == "BEGIN":
                    if value in ("STANDARD", "DAYLIGHT"):
                        # Process component
                        pass
                    else:
                        raise ValueError("unknown component: "+value)
                    comptype = value
                    founddtstart = False
                    tzoffsetfrom = None
                    tzoffsetto = None
                    rrulelines = []
                    tzname = None
                elif name == "END":
                    if value == "VTIMEZONE":
                        if comptype:
                            raise ValueError("component not closed: "+comptype)
                        if not tzid:
                            raise ValueError("mandatory TZID not found")
                        if not comps:
                            raise ValueError(
                                "at least one component is needed")
                        # Process vtimezone
                        self._vtz[tzid] = _tzicalvtz(tzid, comps)
                        invtz = False
                    elif value == comptype:
                        if not founddtstart:
                            raise ValueError("mandatory DTSTART not found")
                        if tzoffsetfrom is None:
                            raise ValueError(
                                "mandatory TZOFFSETFROM not found")
                        if tzoffsetto is None:
                            raise ValueError(
                                "mandatory TZOFFSETFROM not found")
                        # Process component
                        rr = None
                        if rrulelines:
                            rr = rrule.rrulestr("\n".join(rrulelines),
                                                compatible=True,
                                                ignoretz=True,
                                                cache=True)
                        comp = _tzicalvtzcomp(tzoffsetfrom, tzoffsetto,
                                              (comptype == "DAYLIGHT"),
                                              tzname, rr)
                        comps.append(comp)
                        comptype = None
                    else:
                        raise ValueError("invalid component end: "+value)
                elif comptype:
                    if name == "DTSTART":
                        rrulelines.append(line)
                        founddtstart = True
                    elif name in ("RRULE", "RDATE", "EXRULE", "EXDATE"):
                        rrulelines.append(line)
                    elif name == "TZOFFSETFROM":
                        if parms:
                            raise ValueError(
                                "unsupported %s parm: %s " % (name, parms[0]))
                        tzoffsetfrom = self._parse_offset(value)
                    elif name == "TZOFFSETTO":
                        if parms:
                            raise ValueError(
                                "unsupported TZOFFSETTO parm: "+parms[0])
                        tzoffsetto = self._parse_offset(value)
                    elif name == "TZNAME":
                        if parms:
                            raise ValueError(
                                "unsupported TZNAME parm: "+parms[0])
                        tzname = value
                    elif name == "COMMENT":
                        pass
                    else:
                        raise ValueError("unsupported property: "+name)
                else:
                    if name == "TZID":
                        if parms:
                            raise ValueError(
                                "unsupported TZID parm: "+parms[0])
                        tzid = value
                    elif name in ("TZURL", "LAST-MODIFIED", "COMMENT"):
                        pass
                    else:
                        raise ValueError("unsupported property: "+name)
            elif name == "BEGIN" and value == "VTIMEZONE":
                tzid = None
                comps = []
                invtz = True

    def __repr__(self):
        return "%s(%s)" % (self.__class__.__name__, repr(self._s))


if sys.platform != "win32":
    TZFILES = ["/etc/localtime", "localtime"]
    TZPATHS = ["/usr/share/zoneinfo",
               "/usr/lib/zoneinfo",
               "/usr/share/lib/zoneinfo",
               "/etc/zoneinfo"]
else:
    TZFILES = []
    TZPATHS = []


def gettz(name=None):
    """
    Retrieve a time zone object from a string name.

    ``name`` may be an absolute path to a tzfile, a zone name found under one
    of the ``TZPATHS`` directories (e.g. ``"America/New_York"``), a ``TZ``
    environment-variable style string, or ``"GMT"``/``"UTC"``. If ``name`` is
    empty or omitted, the ``TZ`` environment variable and then the local
    system settings are consulted. Returns ``None`` if no matching zone is
    found.
    """
    tz = None
    if not name:
        try:
            name = os.environ["TZ"]
        except KeyError:
            pass
    if name is None or name == ":":
        for filepath in TZFILES:
            if not os.path.isabs(filepath):
                filename = filepath
                for path in TZPATHS:
                    filepath = os.path.join(path, filename)
                    if os.path.isfile(filepath):
                        break
                else:
                    continue
            if os.path.isfile(filepath):
                try:
                    tz = tzfile(filepath)
                    break
                except (IOError, OSError, ValueError):
                    pass
        else:
            tz = tzlocal()
    else:
        if name.startswith(":"):
            # Strip the leading ":" used in the POSIX ":<zone>" convention.
            name = name[1:]
        if os.path.isabs(name):
            if os.path.isfile(name):
                tz = tzfile(name)
            else:
                tz = None
        else:
            for path in TZPATHS:
                filepath = os.path.join(path, name)
                if not os.path.isfile(filepath):
                    filepath = filepath.replace(' ', '_')
                    if not os.path.isfile(filepath):
                        continue
                try:
                    tz = tzfile(filepath)
                    break
                except (IOError, OSError, ValueError):
                    pass
            else:
                tz = None
                if tzwin is not None:
                    try:
                        tz = tzwin(name)
                    except WindowsError:
                        tz = None

                if not tz:
                    from dateutil.zoneinfo import get_zonefile_instance
                    tz = get_zonefile_instance().get(name)

                if not tz:
                    for c in name:
                        # name must have at least one offset to be a tzstr
                        if c in "0123456789":
                            try:
                                tz = tzstr(name)
                            except ValueError:
                                pass
                            break
                    else:
                        if name in ("GMT", "UTC"):
                            tz = tzutc()
                        elif name in time.tzname:
                            tz = tzlocal()
    return tz
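
# Illustrative usage (not part of the original source; results depend on the
# zone data installed on the system):
#
#     >>> gettz('America/New_York')
#     tzfile('/usr/share/zoneinfo/America/New_York')
#     >>> gettz()                          # the local system zone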


def datetime_exists(dt, tz=None):
    """
    Given a datetime and a time zone, determine whether or not the datetime
    would fall in a gap (a wall time skipped over by a DST transition).

    :param dt:
        A :class:`datetime.datetime` (whose time zone will be ignored if ``tz``
        is provided.)

    :param tz:
        A :class:`datetime.tzinfo` with support for the ``fold`` attribute. If
        ``None`` or not provided, the datetime's own time zone will be used.

    :return:
        Returns a boolean indicating whether or not the "wall time" exists in
        ``tz``.
    """
    if tz is None:
        if dt.tzinfo is None:
            raise ValueError('Datetime is naive and no time zone provided.')
        tz = dt.tzinfo

    dt = dt.replace(tzinfo=None)

    # This is essentially a test of whether or not the datetime can survive
    # a round trip to UTC.
    dt_rt = dt.replace(tzinfo=tz).astimezone(tzutc()).astimezone(tz)
    dt_rt = dt_rt.replace(tzinfo=None)

    return dt == dt_rt
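
# Illustrative usage (not part of the original source; assumes New York zone
# data is available): 2:30 AM on 2017-03-12 was skipped by the spring-forward
# transition, so it does not exist as a wall time.
#
#     >>> nyc = gettz('America/New_York')
#     >>> datetime_exists(datetime.datetime(2017, 3, 12, 2, 30), tz=nyc)
#     False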


def datetime_ambiguous(dt, tz=None):
    """
    Given a datetime and a time zone, determine whether or not the datetime is
    ambiguous (i.e. if there are two wall times differentiated only by their
    DST status).

    :param dt:
        A :class:`datetime.datetime` (whose time zone will be ignored if ``tz``
        is provided.)

    :param tz:
        A :class:`datetime.tzinfo` with support for the ``fold`` attribute. If
        ``None`` or not provided, the datetime's own time zone will be used.

    :return:
        Returns a boolean value whether or not the "wall time" is ambiguous in
        ``tz``.

    .. versionadded:: 2.6.0
    """
    if tz is None:
        if dt.tzinfo is None:
            raise ValueError('Datetime is naive and no time zone provided.')

        tz = dt.tzinfo

    # If a time zone defines its own "is_ambiguous" function, we'll use that.
    is_ambiguous_fn = getattr(tz, 'is_ambiguous', None)
    if is_ambiguous_fn is not None:
        try:
            return tz.is_ambiguous(dt)
        except:
            pass

    # If it doesn't come out and tell us it's ambiguous, we'll just check if
    # the fold attribute has any effect on this particular date and time.
    dt = dt.replace(tzinfo=tz)
    wall_0 = enfold(dt, fold=0)
    wall_1 = enfold(dt, fold=1)

    same_offset = wall_0.utcoffset() == wall_1.utcoffset()
    same_dst = wall_0.dst() == wall_1.dst()

    return not (same_offset and same_dst)
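
# Illustrative usage (not part of the original source; assumes New York zone
# data is available): 1:30 AM on 2017-11-05 occurred twice when the clocks
# fell back from 2:00 to 1:00.
#
#     >>> nyc = gettz('America/New_York')
#     >>> datetime_ambiguous(datetime.datetime(2017, 11, 5, 1, 30), tz=nyc)
#     True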


def _datetime_to_timestamp(dt):
    """
    Convert a :class:`datetime.datetime` object to an epoch timestamp in seconds
    since January 1, 1970, ignoring the time zone.
    """
    return _total_seconds((dt.replace(tzinfo=None) - EPOCH))
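
# For example, _datetime_to_timestamp(datetime.datetime(1970, 1, 2)) == 86400
# (one day after the epoch, in seconds).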


class _ContextWrapper(object):
    """
    Class for wrapping contexts so that they are passed through in a
    with statement.
    """
    def __init__(self, context):
        self.context = context

    def __enter__(self):
        return self.context

    def __exit__(*args, **kwargs):
        pass

# vim:ts=4:sw=4:et
site-packages/dateutil/tz/__pycache__/__init__.cpython-36.pyc
site-packages/dateutil/tz/__pycache__/win.cpython-36.pyc
*ztzwinbase.listcCs|jS)N)�_display)rrrr�display�sztzwinbase.displaycCsT|js
dSt||j|j|j|j|j�}t||j|j|j	|j
|j�}||j8}||fS)a�
        For a given year, get the DST on and off transition times, expressed
        always on the standard time side. For zones with no transitions, this
        function returns ``None``.

        :param year:
            The year whose transitions you would like to query.

        :return:
            Returns a :class:`tuple` of :class:`datetime.datetime` objects,
            ``(dston, dstoff)`` for zones with an annual DST transition, or
            ``None`` for fixed offset zones.
        N)
�hasdst�picknthweekday�	_dstmonthr3r7r9r5�	_stdmonthr2r6r8r4�_dst_base_offset)r�yearZdstonZdstoffrrr�transitions�s
ztzwinbase.transitionscCs
|jdkS)Nr)rJ)rrrr�_get_hasdst�sztzwinbase._get_hasdstcCs|jS)N)�_dst_base_offset_)rrrrrL�sztzwinbase._dst_base_offsetN)
r(r)r*r+rr=�staticmethodrErGrNrO�propertyrLrrrrr,ts	r,c@s$eZdZdd�Zdd�Zdd�ZdS)rc	Cs||_tjdtj��8}td�jt|d�}tj||��}t|�}WdQRXWdQRX|d|_	|d|_
|d|_tj
d|d�}|d|d	}||d
}tj|d�|_tj|d�|_|dd
�\|_|_|_|_|_|dd�\|_|_|_|_|_|j|j|_|j�|_dS)Nz{kn}\{name})�kn�nameZStdZDlt�Displayz=3l16hZTZIrr�)�minutes��	��)�_namerrrr�formatrr
�valuestodictr:r;rF�struct�unpack�datetime�	timedeltar0r1rKr2r4r6r8rJr3r5r7r9rPrOrH)	rrTr�	tzkeynamer@�keydict�tup�	stdoffset�	dstoffsetrrrr�s"


  ztzwin.__init__cCsdt|j�S)Nz	tzwin(%s))�reprr\)rrrr�__repr__�sztzwin.__repr__cCs|j|jffS)N)�	__class__r\)rrrr�
__reduce__�sztzwin.__reduce__N)r(r)r*rrirkrrrrr�s&c@s,eZdZdd�Zdd�Zdd�Zdd�Zd	S)
rc
 Csntjdtj���}tj|t��}t|�}WdQRX|d|_|d|_yBtd�j	t
|jd�}tj||��}t|�}|d|_WdQRXWntk
r�d|_YnXWdQRX|d|d}||d}t
j|d	�|_t
j|d	�|_tjd
|d�}	|	dd
�\|_|_|_|_|	d|_tjd
|d�}	|	dd
�\|_|_|_|_|	d|_|j|j|_|j�|_dS)NZStandardNameZDaylightNamez	{kn}\{sn})rSZsnrUZBiasZStandardBiasZDaylightBias)rWz=8hZ
StandardStartr�r
Z
DaylightStart) rrrr
�TZLOCALKEYNAMEr^r:r;rr]rrF�OSErrorrarbr0r1r_r`rKr4r6r8r2rJr5r7r9r3rPrOrH)
rrZ
tzlocalkeyrdrcr@Z_keydictrfrgrerrrr�s2





ztzwinlocal.__init__cCsdS)Nztzwinlocal()r)rrrrrisztzwinlocal.__repr__cCsdt|j�S)Nztzwinlocal(%s))rhr:)rrrr�__str__sztzwinlocal.__str__cCs
|jffS)N)rj)rrrrrk#sztzwinlocal.__reduce__N)r(r)r*rrirorkrrrrr�s.c	CsTtj||d||�}|j||j�ddd�}||dt}|j|krP|t8}|S)z> dayofweek == 0 means Sunday, whichweek 5 means last instance rr
)Zday)ra�replaceZ
isoweekday�ONEWEEK�month)	rMrrZ	dayofweekZhourZminuteZ	whichweek�firstZ
weekdayoneZwdrrrrI's
rIcCs�i}tj|�d}d}x�t|�D]v}tj||�\}}}|tjksJ|tjkr\|d@r�|d}n2|tjkr�|jd�r�|pxt�}|j	|�}|j
d�}|||<q W|S)	z0Convert a registry key's values to a dictionary.rN�� z@tzres�ll)rrCrBZ	EnumValueZ	REG_DWORDZREG_DWORD_LITTLE_ENDIANZREG_SZr#r	r'�rstrip)�keyZdout�sizeZtz_resr?Zkey_name�valueZdtyperrrr^5s





r^)rar_Z	six.movesrZsixrrrr&�ImportErrorZ_commonr�__all__rbrqrrrmrr�objectr	r,rrrIr^rrrr�<module>s,

LJ/:site-packages/dateutil/tz/__pycache__/__init__.cpython-36.opt-1.pyc000064400000000462147511334560021103 0ustar003

6�cY��
@s*ddlTdddddddd	d
ddd
dg
ZdS)�)�*ZtzutcZtzoffsetZtzlocalZtzfileZtzrangeZtzstrZtzicalZtzwinZ
tzwinlocalZgettzZenfoldZdatetime_ambiguousZdatetime_existsN)Ztz�__all__�rr�/usr/lib/python3.6/__init__.py�<module>s

site-packages/dateutil/tz/__pycache__/tz.cpython-36.pyc000064400000105541147511334560017046 0ustar003

6�cY���@s�dZddlZddlZddlZddlZddlZddlZddlmZddl	m
Z
mZmZddl	m
Z
mZddl	mZyddlmZmZWnek
r�dZZYnXejd�Zejjd�Zej�ZGd	d
�d
ej�ZGdd�dej�ZGd
d�de�ZGdd�de�ZGdd�de�Z Gdd�de�Z!Gdd�de
�Z"Gdd�de"�Z#Gdd�de�Z$Gdd�de�Z%Gdd�de�Z&ej'dk�r�d d!gZ(d"d#d$d%gZ)ngZ(gZ)d0d&d'�Z*d1d(d)�Z+d2d*d+�Z,d,d-�Z-Gd.d/�d/e�Z.dS)3a{
This module offers timezone implementations subclassing the abstract
:py:`datetime.tzinfo` type. There are classes to handle tzfile format files
(usually are in :file:`/etc/localtime`, :file:`/usr/share/zoneinfo`, etc), TZ
environment string (in all known formats), given ranges (with help from
relative deltas), local machine timezone, fixed offset timezone, and UTC
timezone.
�N)�string_types�)�tzname_in_python2�_tzinfo�_total_seconds)�tzrangebase�enfold)�_validate_fromutc_inputs)�tzwin�
tzwinlocalc@sbeZdZdZdd�Zdd�Zedd��Zdd	�Ze	d
d��Z
dd
�ZdZdd�Z
dd�ZejZdS)�tzutczD
    This is a tzinfo object that represents the UTC time zone.
    cCstS)N)�ZERO)�self�dt�r�/usr/lib/python3.6/tz.py�	utcoffset$sztzutc.utcoffsetcCstS)N)r
)rrrrr�dst'sz	tzutc.dstcCsdS)N�UTCr)rrrrr�tzname*sztzutc.tznamecCsdS)a6
        Whether or not the "wall time" of a given datetime is ambiguous in this
        zone.

        :param dt:
            A :py:class:`datetime.datetime`, naive or time zone aware.


        :return:
            Returns ``True`` if ambiguous, ``False`` otherwise.

        .. versionadded:: 2.6.0
        Fr)rrrrr�is_ambiguous.sztzutc.is_ambiguouscCs|S)z�
        Fast track version of fromutc() returns the original ``dt`` object for
        any valid :py:class:`datetime.datetime` object.
        r)rrrrr�fromutc>sz
tzutc.fromutccCs0t|ttf�stSt|t�p.t|t�o.|jtkS)N)�
isinstancer�tzoffset�NotImplemented�_offsetr
)r�otherrrr�__eq__Fs
ztzutc.__eq__NcCs
||kS)Nr)rrrrr�__ne__Osztzutc.__ne__cCsd|jjS)Nz%s())�	__class__�__name__)rrrr�__repr__Rsztzutc.__repr__)r �
__module__�__qualname__�__doc__rrrrrr	rr�__hash__rr!�object�
__reduce__rrrrr src@sjeZdZdZdd�Zdd�Zdd�Zedd	��Ze	d
d��Z
dd
�Zdd�ZdZ
dd�Zdd�ZejZdS)ra1
    A simple class for representing a fixed offset from UTC.

    :param name:
        The timezone name, to be returned when ``tzname()`` is called.

    :param offset:
        The time zone offset in seconds, or (since version 2.6.0, represented
        as a :py:class:`datetime.timedelta` object.
    cCs>||_yt|�}Wnttfk
r*YnXtj|d�|_dS)N)�seconds)�_namer�	TypeError�AttributeError�datetime�	timedeltar)r�name�offsetrrr�__init__csztzoffset.__init__cCs|jS)N)r)rrrrrrmsztzoffset.utcoffsetcCstS)N)r
)rrrrrrpsztzoffset.dstcCs|jS)N)r))rrrrrrssztzoffset.tznamecCs
||jS)N)r)rrrrrrwsztzoffset.fromutccCsdS)a6
        Whether or not the "wall time" of a given datetime is ambiguous in this
        zone.

        :param dt:
            A :py:class:`datetime.datetime`, naive or time zone aware.


        :return:
            Returns ``True`` if ambiguous, ``False`` otherwise.

        .. versionadded:: 2.6.0
        Fr)rrrrrr{sztzoffset.is_ambiguouscCst|t�stS|j|jkS)N)rrrr)rrrrrr�s
ztzoffset.__eq__NcCs
||kS)Nr)rrrrrr�sztzoffset.__ne__cCs"d|jjt|j�tt|j��fS)Nz
%s(%s, %s))rr �reprr)�intrr)rrrrr!�sztzoffset.__repr__)r r"r#r$r0rrrrr	rrrr%rr!r&r'rrrrrXs

rcsxeZdZdZ�fdd�Zdd�Zdd�Zedd	��Zd
d�Z	dd
�Z
ddd�Zdd�ZdZ
dd�Zdd�ZejZ�ZS)�tzlocalzR
    A :class:`tzinfo` subclass built around the ``time`` timezone functions.
    cs`tt|�j�tjtjd�|_tjr:tjtj	d�|_
n|j|_
|j
|j|_t|j�|_
dS)N)r()�superr3r0r,r-�time�timezone�_std_offsetZdaylightZaltzone�_dst_offset�
_dst_saved�bool�_hasdst)r)rrrr0�sztzlocal.__init__cCs,|dkr|jrdS|j|�r"|jS|jSdS)N)r;�_isdstr8r7)rrrrrr�s

ztzlocal.utcoffsetcCs0|dkr|jrdS|j|�r(|j|jStSdS)N)r;r<r8r7r
)rrrrrr�s

ztzlocal.dstcCstj|j|�S)N)r5rr<)rrrrrr�sztzlocal.tznamecCs$|j|�}|o"||j||j�kS)a6
        Whether or not the "wall time" of a given datetime is ambiguous in this
        zone.

        :param dt:
            A :py:class:`datetime.datetime`, naive or time zone aware.


        :return:
            Returns ``True`` if ambiguous, ``False`` otherwise.

        .. versionadded:: 2.6.0
        )�
_naive_is_dstr9)rrZ	naive_dstrrrr�s
ztzlocal.is_ambiguouscCst|�}tj|tj�jS)N)�_datetime_to_timestampr5�	localtimer6Ztm_isdst)rr�	timestamprrrr=�sztzlocal._naive_is_dstTcCsF|js
dS|j|�}t|dd�}|j|�rB|dk	r>|j|�SdS|S)NF�foldT)r;r=�getattrr�_fold)rrZ
fold_naiveZdstvalrArrrr<�s

ztzlocal._isdstcCs&t|t�stS|j|jko$|j|jkS)N)rr3rr7r8)rrrrrrs
ztzlocal.__eq__NcCs
||kS)Nr)rrrrrrsztzlocal.__ne__cCsd|jjS)Nz%s())rr )rrrrr!sztzlocal.__repr__)T)r r"r#r$r0rrrrrr=r<rr%rr!r&r'�
__classcell__rr)rrr3�s		
(r3c@sReZdZdddddddgZdd	�Zd
d�Zdd
�ZdZdd�Zdd�Z	dd�Z
dS)�_ttinfor/�delta�isdst�abbr�isstd�isgmt�	dstoffsetcCs x|jD]}t||d�qWdS)N)�	__slots__�setattr)r�attrrrrr0sz_ttinfo.__init__cCsRg}x6|jD],}t||�}|dk	r|jd|t|�f�qWd|jjdj|�fS)Nz%s=%sz%s(%s)z, )rLrB�appendr1rr �join)r�lrN�valuerrrr!s
z_ttinfo.__repr__cCsbt|t�stS|j|jko`|j|jko`|j|jko`|j|jko`|j|jko`|j|jko`|j	|j	kS)N)
rrErr/rFrGrHrIrJrK)rrrrrr$s
z_ttinfo.__eq__NcCs
||kS)Nr)rrrrrr2sz_ttinfo.__ne__cCs(i}x|jD]}t||d�||<qW|S)N)rLrB)r�stater.rrr�__getstate__5sz_ttinfo.__getstate__cCs,x&|jD]}||krt||||�qWdS)N)rLrM)rrSr.rrr�__setstate__;sz_ttinfo.__setstate__)r r"r#rLr0r!rr%rrTrUrrrrrEs
rEc@s,eZdZdZdddddddd	gZd
d�ZdS)
�_tzfilezw
    Lightweight class for holding the relevant transition and time zone
    information read from binary tzfiles.
    �
trans_list�trans_list_utc�	trans_idx�ttinfo_list�
ttinfo_std�
ttinfo_dst�
ttinfo_before�ttinfo_firstcKs(x"|jD]}t|||j|d��qWdS)N)�attrsrM�get)r�kwargsrNrrrr0Isz_tzfile.__init__N)r r"r#r$r_r0rrrrrVAsrVcs�eZdZdZd&�fdd�	Zdd�Zdd�Zd'd
d�Zdd
�Zdd�Z	dd�Z
d(dd�Zdd�Zdd�Z
dd�Zedd��Zdd�ZdZdd�Zd d!�Zd"d#�Zd$d%�Z�ZS))�tzfilea�
    This is a ``tzinfo`` subclass thant allows one to use the ``tzfile(5)``
    format timezone files to extract current and historical zone information.

    :param fileobj:
        This can be an opened file stream or a file name that the time zone
        information can be read from.

    :param filename:
        This is an optional parameter specifying the source of the time zone
        information in the event that ``fileobj`` is a file object. If omitted
        and ``fileobj`` is a file stream, this parameter will be set either to
        ``fileobj``'s ``name`` attribute or to ``repr(fileobj)``.

    See `Sources for Time Zone and Daylight Saving Time Data
    <http://www.twinsun.com/tz/tz-link.htm>`_ for more information. Time zone
    files can be compiled from the `IANA Time Zone database files
    <https://www.iana.org/time-zones>`_ with the `zic time zone compiler
    <https://www.freebsd.org/cgi/man.cgi?query=zic&sektion=8>`_
    Nc	s�tt|�j�d}t|t�r2||_t|d�}d}n.|dk	rB||_nt|d�rV|j|_n
t	|�|_|dk	r�|stt
|�}|�}|j|�}WdQRX|j|�dS)NF�rbTr.)
r4rbr0rr�	_filename�open�hasattrr.r1�_ContextWrapper�_read_tzfile�_set_tzdata)r�fileobj�filenameZfile_opened_hereZfile_stream�tzobj)rrrr0ds"




ztzfile.__init__cCs*x$tjD]}t|d|t||��qWdS)z= Set the time zone data of this object from a _tzfile object �_N)rVr_rMrB)rrlrNrrrri|sztzfile._set_tzdatacs�t��|jd�j�dkr td��|jd�tjd|jd��\}}}}}}|rnttjd||j|d����_ng�_|r�tjd||j|���_ng�_g}x(t	|�D]}	|j
tjd	|jd
���q�W|j|�j�}
|r�|j|dtj
�|�rtjd||j|��}|�r"tjd||j|��}g�_x�t	|�D]�}	||	\}
}}d
|
dd
}
t�}|
|_tjd�|_tj|
d�|_||_|
||
jd|��|_||	k�o�||	dk|_||	k�o�||	dk|_�jj
|��q2W�fdd��jD��_d�_d�_d�_�j�r؈j�s$�jd�_�_n�x�t	|ddd�D]V}	�j|	}�j�r`|j�r`|�_n�j�rx|j�rx|�_�j�r6�j�r6P�q6W�j�r��j�r��j�_x,�jD]}|j�s�|�_P�q�W�jd�_d}g�_xlt�j�D]^\}	}|j�s
|j}|}n*|dk	�r*|j||_|�j|	<|�p2d}�jj
�j|	|��q�Wd}x~t t	t!�j���D]h}	�j|	}|j�r�|j�p�|dk�s�|j||_n|j}t"|jtj��s�tj|jd�|_|�j|	<�qhWt#�j��_t#�j��_t#�j��_�S)N�ZTZifzmagic not found�z>6l�z>%dlz>%dBz>lbb��z>%db�<�r)r(�csg|]}�j|�qSr)rZ)�.0�idx)�outrr�
<listcomp>sz'tzfile._read_tzfile.<locals>.<listcomp>r���rz)$rV�read�decode�
ValueError�struct�unpack�listrXrY�rangerO�seek�os�SEEK_CURrZrEr/r,r-rKrFrG�findrHrIrJr[r\r]r^rW�	enumerate�reversed�lenr�tuple)rrjZ
ttisgmtcntZ
ttisstdcntZleapcntZtimecntZtypecntZcharcntZttinfo�irHrIrJZgmtoffrGZabbrind�ttiZ
laststdoffsetr/r)rxrrh�s�
		






	



ztzfile._read_tzfileFcCs6|js
dSt|�}|r|jn|j}tj||�}|dS)Nr)�_trans_listr>Z_trans_list_utc�bisectZbisect_right)rr�in_utcr@rWrwrrr�_find_last_transition^sztzfile._find_last_transitioncCs8|dks|dt|j�kr |jS|dkr.|jS|j|S)Nrr)r�r��_ttinfo_stdZ_ttinfo_before�
_trans_idx)rrwrrr�_get_ttinfoms
ztzfile._get_ttinfocCs|j|�}|j|�S)N)�_resolve_ambiguous_timer�)rrrwrrr�_find_ttinfoxs
ztzfile._find_ttinfocCsnt|tj�std��|j|k	r&td��|j|dd�}|j|�}|tj|jd�}|j	||d�}t
|t|�d�S)a
        The ``tzfile`` implementation of :py:func:`datetime.tzinfo.fromutc`.

        :param dt:
            A :py:class:`datetime.datetime` object.

        :raises TypeError:
            Raised if ``dt`` is not a :py:class:`datetime.datetime` object.

        :raises ValueError:
            Raised if this is called with a ``dt`` which does not have this
            ``tzinfo`` attached.

        :return:
            Returns a :py:class:`datetime.datetime` object representing the
            wall time in ``self``'s time zone.
        z&fromutc() requires a datetime argumentzdt.tzinfo is not selfT)r�)r()rw)rA)rr,r*�tzinfor}r�r�r-r/rrr2)rrrwr�Zdt_outrArrrr}s

ztzfile.fromutccCsd|dkr|j|�}t|�}|j|�}|dks4|dkr8dS|j|d�j|j}|j|}|||kS)a6
        Whether or not the "wall time" of a given datetime is ambiguous in this
        zone.

        :param dt:
            A :py:class:`datetime.datetime`, naive or time zone aware.


        :return:
            Returns ``True`` if ambiguous, ``False`` otherwise.

        .. versionadded:: 2.6.0
        NrFr)r�r>r�r/r�)rrrwr@r�ZodZttrrrr�s


ztzfile.is_ambiguouscCsF|j|�}|j|�}|dks$|dkr(|St|o:|j||��}||S)Nr)r�rCr2r)rrrwrCZ
idx_offsetrrrr��s

ztzfile._resolve_ambiguous_timecCs"|dkrdS|jstS|j|�jS)N)r�r
r�rF)rrrrrr�s
ztzfile.utcoffsetcCs0|dkrdS|jstS|j|�}|js*tS|jS)N)Z_ttinfo_dstr
r�rGrK)rrr�rrrr�s
z
tzfile.dstcCs |js|dkrdS|j|�jS)N)r�r�rH)rrrrrr�sz
tzfile.tznamecCs2t|t�stS|j|jko0|j|jko0|j|jkS)N)rrbrr�r�Z_ttinfo_list)rrrrrr�s

z
tzfile.__eq__cCs
||kS)Nr)rrrrrr�sz
tzfile.__ne__cCsd|jjt|j�fS)Nz%s(%s))rr r1rd)rrrrr!�sztzfile.__repr__cCs
|jd�S)N)�
__reduce_ex__)rrrrr'�sztzfile.__reduce__cCs|jd|jf|jfS)N)rrd�__dict__)rZprotocolrrrr��sztzfile.__reduce_ex__)N)F)N)r r"r#r$r0rirhr�r�r�rrr�rrrrrr%rr!r'r�rDrr)rrrbNs(]
$

	rbc@s6eZdZdZddd�Zdd�Zdd�Zed	d
��ZdS)�tzrangeaQ
    The ``tzrange`` object is a time zone specified by a set of offsets and
    abbreviations, equivalent to the way the ``TZ`` variable can be specified
    in POSIX-like systems, but using Python delta objects to specify DST
    start, end and offsets.

    :param stdabbr:
        The abbreviation for standard time (e.g. ``'EST'``).

    :param stdoffset:
        An integer or :class:`datetime.timedelta` object or equivalent
        specifying the base offset from UTC.

        If unspecified, +00:00 is used.

    :param dstabbr:
        The abbreviation for DST / "Summer" time (e.g. ``'EDT'``).

        If specified, with no other DST information, DST is assumed to occur
        and the default behavior or ``dstoffset``, ``start`` and ``end`` is
        used. If unspecified and no other DST information is specified, it
        is assumed that this zone has no DST.

        If this is unspecified and other DST information is *is* specified,
        DST occurs in the zone but the time zone abbreviation is left
        unchanged.

    :param dstoffset:
        A an integer or :class:`datetime.timedelta` object or equivalent
        specifying the UTC offset during DST. If unspecified and any other DST
        information is specified, it is assumed to be the STD offset +1 hour.

    :param start:
        A :class:`relativedelta.relativedelta` object or equivalent specifying
        the time and time of year that daylight savings time starts. To specify,
        for example, that DST starts at 2AM on the 2nd Sunday in March, pass:

            ``relativedelta(hours=2, month=3, day=1, weekday=SU(+2))``

        If unspecified and any other DST information is specified, the default
        value is 2 AM on the first Sunday in April.

    :param end:
        A :class:`relativedelta.relativedelta` object or equivalent representing
        the time and time of year that daylight savings time ends, with the
        same specification method as in ``start``. One note is that this should
        point to the first time in the *standard* zone, so if a transition
        occurs at 2AM in the DST zone and the clocks are set back 1 hour to 1AM,
        set the `hours` parameter to +1.


    **Examples:**

    .. testsetup:: tzrange

        from dateutil.tz import tzrange, tzstr

    .. doctest:: tzrange

        >>> tzstr('EST5EDT') == tzrange("EST", -18000, "EDT")
        True

        >>> from dateutil.relativedelta import *
        >>> range1 = tzrange("EST", -18000, "EDT")
        >>> range2 = tzrange("EST", -18000, "EDT", -14400,
        ...                  relativedelta(hours=+2, month=4, day=1,
        ...                                weekday=SU(+1)),
        ...                  relativedelta(hours=+1, month=10, day=31,
        ...                                weekday=SU(-1)))
        >>> tzstr('EST5EDT') == range1 == range2
        True

    NcCs@ddlma||_||_yt|�}Wnttfk
r<YnXyt|�}Wnttfk
rbYnX|dk	r|tj|d�|_	nt
|_	|dk	r�tj|d�|_n(|r�|dk	r�|j	tjdd�|_nt
|_|r�|dkr�tjdddtjd
�d�|_
n||_
|�r|dk�rtjdd	d
tjd�d�|_n||_|j|j	|_t|j
�|_dS)Nr)�
relativedelta)r(r)�hours�rn)r��month�day�weekday�
�rr�rrrz)�dateutilr��	_std_abbr�	_dst_abbrrr*r+r,r-r7r
r8�SU�_start_delta�
_end_delta�_dst_base_offset_r:�hasdst)r�stdabbr�	stdoffset�dstabbrrK�start�endrrrr0Js:ztzrange.__init__cCs4|js
dStj|dd�}||j}||j}||fS)a�
        For a given year, get the DST on and off transition times, expressed
        always on the standard time side. For zones with no transitions, this
        function returns ``None``.

        :param year:
            The year whose transitions you would like to query.

        :return:
            Returns a :class:`tuple` of :class:`datetime.datetime` objects,
            ``(dston, dstoff)`` for zones with an annual DST transition, or
            ``None`` for fixed offset zones.
        Nr)r�r,r�r�)rZyearZ	base_yearr�r�rrr�transitionsys

ztzrange.transitionscCsVt|t�stS|j|jkoT|j|jkoT|j|jkoT|j|jkoT|j|jkoT|j|jkS)N)	rr�rr�r�r7r8r�r�)rrrrrr�s
ztzrange.__eq__cCs|jS)N)r�)rrrr�_dst_base_offset�sztzrange._dst_base_offset)NNNNN)	r r"r#r$r0r�r�propertyr�rrrrr�sI
-r�c@s,eZdZdZddd�Zddd�Zdd	�Zd
S)
�tzstra
    ``tzstr`` objects are time zone objects specified by a time-zone string as
    it would be passed to a ``TZ`` variable on POSIX-style systems (see
    the `GNU C Library: TZ Variable`_ for more details).

    There is one notable exception, which is that POSIX-style time zones use an
    inverted offset format, so normally ``GMT+3`` would be parsed as an offset
    3 hours *behind* GMT. The ``tzstr`` time zone object will parse this as an
    offset 3 hours *ahead* of GMT. If you would like to maintain the POSIX
    behavior, pass a ``True`` value to ``posix_offset``.

    The :class:`tzrange` object provides the same functionality, but is
    specified using :class:`relativedelta.relativedelta` objects. rather than
    strings.

    :param s:
        A time zone string in ``TZ`` variable format. This can be a
        :class:`bytes` (2.x: :class:`str`), :class:`str` (2.x: :class:`unicode`)
        or a stream emitting unicode characters (e.g. :class:`StringIO`).

    :param posix_offset:
        Optional. If set to ``True``, interpret strings such as ``GMT+3`` or
        ``UTC+3`` as being 3 hours *behind* UTC rather than ahead, per the
        POSIX standard.

    .. _`GNU C Library: TZ Variable`:
        https://www.gnu.org/software/libc/manual/html_node/TZ-Variable.html
    Fc	Cs�ddlma||_tj|�}|dkr,td��|jd
krJ|rJ|jd9_tj||j|j|j	|j
ddd�|j	s~d|_d|_n&|j
|j�|_|jr�|j
|jdd	�|_t|j�|_dS)Nr)�parserzunknown string format�GMTrrF)r�r�)�isend)r�rrz)r�r��_sZ_parsetzr}r�r�r�r0r�rKr�r��_deltar�r�r:r�)r�sZposix_offset�resrrrr0�s"

ztzstr.__init__rcCs<ddlm}i}|jdk	rr|j|d<|jdk	r`|j|j|j�|d<|jdkrVd|d<qpd|d<q�|jr�|j|d<n*|jdk	r�|j|d<n|jdk	r�|j|d	<|s�|s�d
|d<d|d<|jd�|d<nd|d<d|d<|jd�|d<|j	dk	�r�|j	|d<nd
|d<|�r0|j
|j}|d|j|j
d8<|jf|�S)Nr)r�r�r�rr�r�ZyeardayZ	nlyeardayrnr�r(i i�Qrrz)r�r�r�r�Zweekr�ZydayZjydayr�r5r8r7r(Zdays)r�xr�r�rarFrrrr��s<








ztzstr._deltacCsd|jjt|j�fS)Nz%s(%s))rr r1r�)rrrrr!sztzstr.__repr__N)F)r)r r"r#r$r0r�r!rrrrr��s
 
)r�c@seZdZddd�ZdS)�_tzicalvtzcompNcCs@tj|d�|_tj|d�|_|j|j|_||_||_||_dS)N)r()r,r-�tzoffsetfrom�
tzoffsetto�tzoffsetdiffrGr�rrule)rr�r�rGrr�rrrr0sz_tzicalvtzcomp.__init__)NN)r r"r#r0rrrrr�sr�csZeZdZgf�fdd�	Zdd�Zdd�Zdd�Zd	d
�Zedd��Z	d
d�Z
ejZ�Z
S)�
_tzicalvtzcs*tt|�j�||_||_g|_g|_dS)N)r4r�r0�_tzid�_comps�
_cachedate�
_cachecomp)r�tzid�comps)rrrr0s
z_tzicalvtz.__init__c
Cs
t|j�dkr|jdS|jdd�}y|j|jj||j|�f�Stk
rTYnXd}d}x4|jD]*}|j||�}|rf|s�||krf|}|}qfW|s�x"|jD]}|j	s�|}Pq�W|d}|jj
d||j|�f�|jj
d|�t|j�dk�r|jj�|jj�|S)Nrr)r�r�)r�r��replacer�r��indexrCr}�_find_compdtrG�insert�pop)rrZ
lastcompdtZlastcomp�comp�compdtrrr�
_find_comps4


z_tzicalvtz._find_compcCs2|jtkr|j|�r||j8}|jj|dd�}|S)NT)Zinc)r�r
rCr�Zbefore)rr�rr�rrrr�Is
z_tzicalvtz._find_compdtcCs|dkrdS|j|�jS)N)r�r�)rrrrrrQsz_tzicalvtz.utcoffsetcCs|j|�}|jr|jStSdS)N)r�rGr�r
)rrr�rrrrWs
z_tzicalvtz.dstcCs|j|�jS)N)r�r)rrrrrr^sz_tzicalvtz.tznamecCsdt|j�S)Nz<tzicalvtz %s>)r1r�)rrrrr!bsz_tzicalvtz.__repr__)r r"r#r0r�r�rrrrr!r&r'rDrr)rrr�s*r�c@sBeZdZdZdd�Zdd�Zddd�Zd	d
�Zdd�Zd
d�Z	dS)�tzicala\
    This object is designed to parse an iCalendar-style ``VTIMEZONE`` structure
    as set out in `RFC 2445`_ Section 4.6.5 into one or more `tzinfo` objects.

    :param `fileobj`:
        A file or stream in iCalendar format, which should be UTF-8 encoded
        with CRLF endings.

    .. _`RFC 2445`: https://www.ietf.org/rfc/rfc2445.txt
    c	Csjddlmat|t�r(||_t|d�}nt|dt|��|_t|�}i|_	|�}|j
|j��WdQRXdS)Nr)r��rr.)r�r�rrr�rerBr1rg�_vtz�
_parse_rfcr{)rrjZfobjrrrr0ss
ztzical.__init__cCst|jj��S)z?
        Retrieves the available time zones as a list.
        )r�r��keys)rrrrr��sztzical.keysNcCsP|dkrDt|j�dkr td��nt|j�dkr6td��tt|j��}|jj|�S)a�
        Retrieve a :py:class:`datetime.tzinfo` object by its ``tzid``.

        :param tzid:
            If there is exactly one time zone available, omitting ``tzid``
            or passing :py:const:`None` value returns it. Otherwise a valid
            key (which can be retrieved from :func:`keys`) is required.

        :raises ValueError:
            Raised if ``tzid`` is not specified but there are either more
            or fewer than 1 zone defined.

        :returns:
            Returns either a :py:class:`datetime.tzinfo` object representing
            the relevant time zone or :py:const:`None` if the ``tzid`` was
            not found.
        Nrzno timezones definedrz more than one timezone available)r�r�r}�next�iterr`)rr�rrrr`�s
z
tzical.getcCs�|j�}|std��|ddkr>d|ddk}|dd�}nd}t|�dkrzt|dd��dt|dd��d	|St|�d
kr�t|dd��dt|dd��d	t|dd��|Std|��dS)Nzempty offsetr�+�-rrnr�irsrqzinvalid offset: )r�r�rzr)rzrr)�stripr}r�r2)rr��signalrrr�
_parse_offset�s,<ztzical._parse_offsetcCsH|j�}|std��d}xh|t|�kr�||j�}|s>||=q|dkrv|ddkrv||d|dd�7<||=q|d7}qWd}g}d}d}�x�|D�]�}|s�q�|jdd�\}	}
|	jd�}|s�td��|dj�}	|dd�}|�r$|	d	k�r(|
d)k�rntd|
��|
}d}d}
d}g}d}�q@|	d
k�r|
dk�r�|�rNtd|��|�s\td��|�sjtd��t||�|j|<d}n�|
|k�r|�s�td��|
dk�r�td��|dk�r�td��d}|�r�tj	dj
|�dddd�}t|
||dk||�}|j|�d}ntd|
���q@|�r�|	dk�r2|j|�d}n�|	d*k�rH|j|�n�|	dk�rx|�rltd|	|df��|j
|
�}
nj|	dk�r�|�r�td |d��|j
|
�}n>|	d!k�r�|�r�td"|d��|
}n|	d#k�r�ntd$|	��n>|	d%k�r
|�rtd&|d��|
}n|	d+k�rntd$|	��q�|	d	kr�|
dkr�d}g}d}q�WdS),Nzempty stringr� rF�:�;zempty property nameZBEGIN�STANDARD�DAYLIGHTzunknown component: ZENDZ	VTIMEZONEzcomponent not closed: zmandatory TZID not foundz at least one component is neededzmandatory DTSTART not foundz mandatory TZOFFSETFROM not found�
T)Z
compatibleZignoretz�cachezinvalid component end: ZDTSTART�RRULE�RDATE�EXRULE�EXDATEZTZOFFSETFROMzunsupported %s parm: %s Z
TZOFFSETTOzunsupported TZOFFSETTO parm: ZTZNAMEzunsupported TZNAME parm: �COMMENTzunsupported property: ZTZIDzunsupported TZID parm: �TZURL�
LAST-MODIFIED)r�r�)r�r�r�r�)r�r�r�)�
splitlinesr}r��rstrip�split�upperr�r�r�ZrrulestrrPr�rOr�)rr��linesr��liner�r�ZinvtzZcomptyper.rRZparmsZfounddtstartr�r�Z
rrulelinesrZrrr�rrrr��s�

















ztzical._parse_rfccCsd|jjt|j�fS)Nz%s(%s))rr r1r�)rrrrr!+sztzical.__repr__)N)
r r"r#r$r0r�r`r�r�r!rrrrr�hs

vr�Zwin32z/etc/localtimer?z/usr/share/zoneinfoz/usr/lib/zoneinfoz/usr/share/lib/zoneinfoz
/etc/zoneinfocCsDd}|s,ytjd}Wntk
r*YnX|dks<|dkr�x�tD]v}tjj|�s�|}x*tD] }tjj||�}tjj|�r\Pq\WqBtjj|�rByt	|�}PWqBt
ttfk
r�YqBXqBWt
�}�nz|jd�r�|dd�}tjj|��r
tjj|��rt	|�}nd}�n6�x2tD]l}tjj||�}tjj|��sP|jdd�}tjj|��sP�qyt	|�}PWnt
ttfk
�rzYnX�qWd}tdk	�r�yt|�}Wntk
�r�d}YnX|�s�ddlm}|�j|�}|�s@xb|D]6}|dk�r�yt|�}Wntk
�rYnXP�q�W|dk�r.t�}n|tjk�r@t
�}|S)
NZTZr�rr�rmr)�get_zonefile_instance�
0123456789r�rrz)r�r)r��environ�KeyError�TZFILES�path�isabs�TZPATHSrP�isfilerb�IOError�OSErrorr}r3�
startswithr�r
ZWindowsErrorZdateutil.zoneinfor�r`r�rr5r)r.�tz�filepathrkr�r��crrr�gettz:sz










r�cCsZ|dkr |jdkrtd��|j}|jdd�}|j|d�jt��j|�}|jdd�}||kS)a�
    Given a datetime and a time zone, determine whether or not a given datetime
    would fall in a gap.

    :param dt:
        A :class:`datetime.datetime` (whose time zone will be ignored if ``tz``
        is provided.)

    :param tz:
        A :class:`datetime.tzinfo` with support for the ``fold`` attribute. If
        ``None`` or not provided, the datetime's own time zone will be used.

    :return:
        Returns a boolean value whether or not the "wall time" exists in ``tz``.
    Nz,Datetime is naive and no time zone provided.)r�)r�r}r�Z
astimezoner)rr�Zdt_rtrrr�datetime_exists�s
r�c
Cs�|dkr |jdkrtd��|j}t|dd�}|dk	rLy
|j|�SYnX|j|d�}t|dd�}t|dd�}|j�|j�k}|j�|j�k}|o�|S)a\
    Given a datetime and a time zone, determine whether or not a given datetime
    is ambiguous (i.e if there are two times differentiated only by their DST
    status).

    :param dt:
        A :class:`datetime.datetime` (whose time zone will be ignored if ``tz``
        is provided.)

    :param tz:
        A :class:`datetime.tzinfo` with support for the ``fold`` attribute. If
        ``None`` or not provided, the datetime's own time zone will be used.

    :return:
        Returns a boolean value whether or not the "wall time" is ambiguous in
        ``tz``.

    .. versionadded:: 2.6.0
    Nz,Datetime is naive and no time zone provided.r)r�r)rAr)r�r}rBrr�rrr)rr�Zis_ambiguous_fnZwall_0Zwall_1Zsame_offsetZsame_dstrrr�datetime_ambiguous�s 

r�cCst|jdd�t�S)z�
    Convert a :class:`datetime.datetime` object to an epoch timestamp in seconds
    since January 1, 1970, ignoring the time zone.
    N)r�)rr��EPOCH)rrrrr>�sr>c@s(eZdZdZdd�Zdd�Zdd�ZdS)	rgz^
    Class for wrapping contexts so that they are passed through in a
    with statement.
    cCs
||_dS)N)�context)rrrrrr0�sz_ContextWrapper.__init__cCs|jS)N)r)rrrr�	__enter__�sz_ContextWrapper.__enter__cOsdS)Nr)�argsrarrr�__exit__�sz_ContextWrapper.__exit__N)r r"r#r$r0rrrrrrrg�srg)N)N)N)/r$r,r~r5�sysr�r�ZsixrZ_commonrrrrrr	�winr
r�ImportErrorr-r
ZutcfromtimestamprZ	toordinalZEPOCHORDINALr�rrr3r&rErVrbr�r�r�r�r��platformr�r�r�r�r�r>rgrrrr�<module>	s\
8Fv-
5"jRH
J

.site-packages/dateutil/tz/__pycache__/win.cpython-36.opt-1.pyc000064400000022344147511334560020144 0ustar003

6�cY�,�@s�ddlZddlZddlmZddlmZyddlZddlmZWnek
r\e	d��YnXddl
mZdd	d
gZej
d�ZdZd
ZdZdd�Ze�ZGdd
�d
e�ZGdd�de�ZGdd�de�ZGdd	�d	e�Zdd�Zdd�ZdS)�N)�winreg)�	text_type)�wintypesz#Running tzwin on non-Windows system�)�tzrangebase�tzwin�
tzwinlocal�tzres�z7SOFTWARE\Microsoft\Windows NT\CurrentVersion\Time Zonesz4SOFTWARE\Microsoft\Windows\CurrentVersion\Time Zonesz4SYSTEM\CurrentControlSet\Control\TimeZoneInformationcCsLtjdtj�}ytj|t�j�t}Wntk
r>t}YnX|j�|S)N)r�ConnectRegistry�HKEY_LOCAL_MACHINE�OpenKey�TZKEYNAMENTZCloseZWindowsError�TZKEYNAME9X)�handle�	TZKEYNAME�r�/usr/lib/python3.6/win.py�
_settzkeynames
rc@s6eZdZdZejej�Zd
dd�Z	dd�Z
dd�Zd	S)r	z{
    Class for accessing `tzres.dll`, which contains timezone name related
    resources.

    .. versionadded:: 2.5.0
    �	tzres.dllcCs@tjd�}tjtjtjtjf|j_|j|_tj|�|_	||_
dS)N�user32)�ctypesZWinDLLrZ	HINSTANCEZUINT�LPWSTRZc_int�LoadStringWZargtypes�_tzres�	tzres_loc)�selfrrrrr�__init__1s
ztzres.__init__cCs<|j�}tjtj|�tj�}|j|jj||d�}|d|�S)a�
        Load a timezone name from a DLL offset (integer).

        >>> from dateutil.tzwin import tzres
        >>> tzr = tzres()
        >>> print(tzr.load_name(112))
        'Eastern Standard Time'

        :param offset:
            A positive integer value referring to a string from the tzres dll.

        ..note:
            Offsets found in the registry are generally of the form
            `@tzres.dll,-114`. The offset in this case if 114, not -114.

        rN)	�p_wcharr�castZbyrefrrrrZ_handle)r�offsetZresourceZlpBufferZncharrrr�	load_name?sztzres.load_namec	CsH|jd�s|S|jd�}yt|d�}Wntd��YnX|j|�S)a�
        Parse strings as returned from the Windows registry into the time zone
        name as defined in the registry.

        >>> from dateutil.tzwin import tzres
        >>> tzr = tzres()
        >>> print(tzr.name_from_string('@tzres.dll,-251'))
        'Dateline Daylight Time'
        >>> print(tzr.name_from_string('Eastern Standard Time'))
        'Eastern Standard Time'

        :param tzname_str:
            A timezone name string as returned from a Windows registry key.

        :return:
            Returns the localized timezone string from tzres.dll if the string
            is of the form `@tzres.dll,-offset`, else returns the input string.
        �@z,-rzMalformed timezone string.)�
startswith�split�int�
ValueErrorr!)rZ
tzname_strZ	name_spltr rrr�name_from_stringUs

ztzres.name_from_stringN)r)�__name__�
__module__�__qualname__�__doc__rZPOINTERrZWCHARrrr!r'rrrrr	(s

c@sPeZdZdZdd�Zdd�Zedd��Zdd	�Zd
d�Z	dd
�Z
edd��ZdS)�	tzwinbasezBtzinfo class based on win32's timezones available in the registry.cCstd��dS)Nz#tzwinbase is an abstract base class)�NotImplementedError)rrrrrvsztzwinbase.__init__cCs�t|t�stS|j|jko�|j|jko�|j|jko�|j|jko�|j|jko�|j|jko�|j	|j	ko�|j
|j
ko�|j|jko�|j|jko�|j
|j
ko�|j|jkS)N)�
isinstancer,�NotImplemented�_std_offset�_dst_offset�
_stddayofweek�
_dstdayofweek�_stdweeknumber�_dstweeknumber�_stdhour�_dsthour�
_stdminute�
_dstminute�	_std_abbr�	_dst_abbr)r�otherrrr�__eq__ys
ztzwinbase.__eq__csVtjdtj��>}tj|t��&��fdd�ttj��d�D�}WdQRXWdQRX|S)z4Return a list of all time zones known to the system.Ncsg|]}tj�|��qSr)rZEnumKey)�.0�i)�tzkeyrr�
<listcomp>�sz"tzwinbase.list.<locals>.<listcomp>r)rrrr
r�range�QueryInfoKey)r�resultr)r@r�list�s

*ztzwinbase.listcCs|jS)N)�_display)rrrr�display�sztzwinbase.displaycCsT|js
dSt||j|j|j|j|j�}t||j|j|j	|j
|j�}||j8}||fS)a�
        For a given year, get the DST on and off transition times, expressed
        always on the standard time side. For zones with no transitions, this
        function returns ``None``.

        :param year:
            The year whose transitions you would like to query.

        :return:
            Returns a :class:`tuple` of :class:`datetime.datetime` objects,
            ``(dston, dstoff)`` for zones with an annual DST transition, or
            ``None`` for fixed offset zones.
        N)
�hasdst�picknthweekday�	_dstmonthr3r7r9r5�	_stdmonthr2r6r8r4�_dst_base_offset)r�yearZdstonZdstoffrrr�transitions�s
ztzwinbase.transitionscCs
|jdkS)Nr)rJ)rrrr�_get_hasdst�sztzwinbase._get_hasdstcCs|jS)N)�_dst_base_offset_)rrrrrL�sztzwinbase._dst_base_offsetN)
r(r)r*r+rr=�staticmethodrErGrNrO�propertyrLrrrrr,ts	r,c@s$eZdZdd�Zdd�Zdd�ZdS)rc	Cs||_tjdtj��8}td�jt|d�}tj||��}t|�}WdQRXWdQRX|d|_	|d|_
|d|_tj
d|d�}|d|d	}||d
}tj|d�|_tj|d�|_|dd
�\|_|_|_|_|_|dd�\|_|_|_|_|_|j|j|_|j�|_dS)Nz{kn}\{name})�kn�nameZStdZDlt�Displayz=3l16hZTZIrr�)�minutes��	��)�_namerrrr�formatrr
�valuestodictr:r;rF�struct�unpack�datetime�	timedeltar0r1rKr2r4r6r8rJr3r5r7r9rPrOrH)	rrTr�	tzkeynamer@�keydict�tup�	stdoffset�	dstoffsetrrrr�s"


  ztzwin.__init__cCsdt|j�S)Nz	tzwin(%s))�reprr\)rrrr�__repr__�sztzwin.__repr__cCs|j|jffS)N)�	__class__r\)rrrr�
__reduce__�sztzwin.__reduce__N)r(r)r*rrirkrrrrr�s&c@s,eZdZdd�Zdd�Zdd�Zdd�Zd	S)
rc
 Csntjdtj���}tj|t��}t|�}WdQRX|d|_|d|_yBtd�j	t
|jd�}tj||��}t|�}|d|_WdQRXWntk
r�d|_YnXWdQRX|d|d}||d}t
j|d	�|_t
j|d	�|_tjd
|d�}	|	dd
�\|_|_|_|_|	d|_tjd
|d�}	|	dd
�\|_|_|_|_|	d|_|j|j|_|j�|_dS)NZStandardNameZDaylightNamez	{kn}\{sn})rSZsnrUZBiasZStandardBiasZDaylightBias)rWz=8hZ
StandardStartr�r
Z
DaylightStart) rrrr
�TZLOCALKEYNAMEr^r:r;rr]rrF�OSErrorrarbr0r1r_r`rKr4r6r8r2rJr5r7r9r3rPrOrH)
rrZ
tzlocalkeyrdrcr@Z_keydictrfrgrerrrr�s2





ztzwinlocal.__init__cCsdS)Nztzwinlocal()r)rrrrrisztzwinlocal.__repr__cCsdt|j�S)Nztzwinlocal(%s))rhr:)rrrr�__str__sztzwinlocal.__str__cCs
|jffS)N)rj)rrrrrk#sztzwinlocal.__reduce__N)r(r)r*rrirorkrrrrr�s.c	CsTtj||d||�}|j||j�ddd�}||dt}|j|krP|t8}|S)z> dayofweek == 0 means Sunday, whichweek 5 means last instance rr
)Zday)ra�replaceZ
isoweekday�ONEWEEK�month)	rMrrZ	dayofweekZhourZminuteZ	whichweek�firstZ
weekdayoneZwdrrrrI's
rIcCs�i}tj|�d}d}x�t|�D]v}tj||�\}}}|tjksJ|tjkr\|d@r�|d}n2|tjkr�|jd�r�|pxt�}|j	|�}|j
d�}|||<q W|S)	z0Convert a registry key's values to a dictionary.rN�� z@tzres�ll)rrCrBZ	EnumValueZ	REG_DWORDZREG_DWORD_LITTLE_ENDIANZREG_SZr#r	r'�rstrip)�keyZdout�sizeZtz_resr?Zkey_name�valueZdtyperrrr^5s





r^)rar_Z	six.movesrZsixrrrr&�ImportErrorZ_commonr�__all__rbrqrrrmrr�objectr	r,rrrIr^rrrr�<module>s,

LJ/:site-packages/dateutil/tz/__pycache__/tz.cpython-36.opt-1.pyc000064400000105541147511334560020005 0ustar003

6�cY���@s�dZddlZddlZddlZddlZddlZddlZddlmZddl	m
Z
mZmZddl	m
Z
mZddl	mZyddlmZmZWnek
r�dZZYnXejd�Zejjd�Zej�ZGd	d
�d
ej�ZGdd�dej�ZGd
d�de�ZGdd�de�ZGdd�de�Z Gdd�de�Z!Gdd�de
�Z"Gdd�de"�Z#Gdd�de�Z$Gdd�de�Z%Gdd�de�Z&ej'dk�r�d d!gZ(d"d#d$d%gZ)ngZ(gZ)d0d&d'�Z*d1d(d)�Z+d2d*d+�Z,d,d-�Z-Gd.d/�d/e�Z.dS)3a{
This module offers timezone implementations subclassing the abstract
:py:`datetime.tzinfo` type. There are classes to handle tzfile format files
(usually are in :file:`/etc/localtime`, :file:`/usr/share/zoneinfo`, etc), TZ
environment string (in all known formats), given ranges (with help from
relative deltas), local machine timezone, fixed offset timezone, and UTC
timezone.
�N)�string_types�)�tzname_in_python2�_tzinfo�_total_seconds)�tzrangebase�enfold)�_validate_fromutc_inputs)�tzwin�
tzwinlocalc@sbeZdZdZdd�Zdd�Zedd��Zdd	�Ze	d
d��Z
dd
�ZdZdd�Z
dd�ZejZdS)�tzutczD
    This is a tzinfo object that represents the UTC time zone.
    cCstS)N)�ZERO)�self�dt�r�/usr/lib/python3.6/tz.py�	utcoffset$sztzutc.utcoffsetcCstS)N)r
)rrrrr�dst'sz	tzutc.dstcCsdS)N�UTCr)rrrrr�tzname*sztzutc.tznamecCsdS)a6
        Whether or not the "wall time" of a given datetime is ambiguous in this
        zone.

        :param dt:
            A :py:class:`datetime.datetime`, naive or time zone aware.


        :return:
            Returns ``True`` if ambiguous, ``False`` otherwise.

        .. versionadded:: 2.6.0
        Fr)rrrrr�is_ambiguous.sztzutc.is_ambiguouscCs|S)z�
        Fast track version of fromutc() returns the original ``dt`` object for
        any valid :py:class:`datetime.datetime` object.
        r)rrrrr�fromutc>sz
tzutc.fromutccCs0t|ttf�stSt|t�p.t|t�o.|jtkS)N)�
isinstancer�tzoffset�NotImplemented�_offsetr
)r�otherrrr�__eq__Fs
ztzutc.__eq__NcCs
||kS)Nr)rrrrr�__ne__Osztzutc.__ne__cCsd|jjS)Nz%s())�	__class__�__name__)rrrr�__repr__Rsztzutc.__repr__)r �
__module__�__qualname__�__doc__rrrrrr	rr�__hash__rr!�object�
__reduce__rrrrr src@sjeZdZdZdd�Zdd�Zdd�Zedd	��Ze	d
d��Z
dd
�Zdd�ZdZ
dd�Zdd�ZejZdS)ra1
    A simple class for representing a fixed offset from UTC.

    :param name:
        The timezone name, to be returned when ``tzname()`` is called.

    :param offset:
        The time zone offset in seconds, or (since version 2.6.0, represented
        as a :py:class:`datetime.timedelta` object.
    cCs>||_yt|�}Wnttfk
r*YnXtj|d�|_dS)N)�seconds)�_namer�	TypeError�AttributeError�datetime�	timedeltar)r�name�offsetrrr�__init__csztzoffset.__init__cCs|jS)N)r)rrrrrrmsztzoffset.utcoffsetcCstS)N)r
)rrrrrrpsztzoffset.dstcCs|jS)N)r))rrrrrrssztzoffset.tznamecCs
||jS)N)r)rrrrrrwsztzoffset.fromutccCsdS)a6
        Whether or not the "wall time" of a given datetime is ambiguous in this
        zone.

        :param dt:
            A :py:class:`datetime.datetime`, naive or time zone aware.


        :return:
            Returns ``True`` if ambiguous, ``False`` otherwise.

        .. versionadded:: 2.6.0
        Fr)rrrrrr{sztzoffset.is_ambiguouscCst|t�stS|j|jkS)N)rrrr)rrrrrr�s
ztzoffset.__eq__NcCs
||kS)Nr)rrrrrr�sztzoffset.__ne__cCs"d|jjt|j�tt|j��fS)Nz
%s(%s, %s))rr �reprr)�intrr)rrrrr!�sztzoffset.__repr__)r r"r#r$r0rrrrr	rrrr%rr!r&r'rrrrrXs

rcsxeZdZdZ�fdd�Zdd�Zdd�Zedd	��Zd
d�Z	dd
�Z
ddd�Zdd�ZdZ
dd�Zdd�ZejZ�ZS)�tzlocalzR
    A :class:`tzinfo` subclass built around the ``time`` timezone functions.
    cs`tt|�j�tjtjd�|_tjr:tjtj	d�|_
n|j|_
|j
|j|_t|j�|_
dS)N)r()�superr3r0r,r-�time�timezone�_std_offsetZdaylightZaltzone�_dst_offset�
_dst_saved�bool�_hasdst)r)rrrr0�sztzlocal.__init__cCs,|dkr|jrdS|j|�r"|jS|jSdS)N)r;�_isdstr8r7)rrrrrr�s

ztzlocal.utcoffsetcCs0|dkr|jrdS|j|�r(|j|jStSdS)N)r;r<r8r7r
)rrrrrr�s

ztzlocal.dstcCstj|j|�S)N)r5rr<)rrrrrr�sztzlocal.tznamecCs$|j|�}|o"||j||j�kS)a6
        Whether or not the "wall time" of a given datetime is ambiguous in this
        zone.

        :param dt:
            A :py:class:`datetime.datetime`, naive or time zone aware.


        :return:
            Returns ``True`` if ambiguous, ``False`` otherwise.

        .. versionadded:: 2.6.0
        )�
_naive_is_dstr9)rrZ	naive_dstrrrr�s
ztzlocal.is_ambiguouscCst|�}tj|tj�jS)N)�_datetime_to_timestampr5�	localtimer6Ztm_isdst)rr�	timestamprrrr=�sztzlocal._naive_is_dstTcCsF|js
dS|j|�}t|dd�}|j|�rB|dk	r>|j|�SdS|S)NF�foldT)r;r=�getattrr�_fold)rrZ
fold_naiveZdstvalrArrrr<�s

ztzlocal._isdstcCs&t|t�stS|j|jko$|j|jkS)N)rr3rr7r8)rrrrrrs
ztzlocal.__eq__NcCs
||kS)Nr)rrrrrrsztzlocal.__ne__cCsd|jjS)Nz%s())rr )rrrrr!sztzlocal.__repr__)T)r r"r#r$r0rrrrrr=r<rr%rr!r&r'�
__classcell__rr)rrr3�s		
(r3c@sReZdZdddddddgZdd	�Zd
d�Zdd
�ZdZdd�Zdd�Z	dd�Z
dS)�_ttinfor/�delta�isdst�abbr�isstd�isgmt�	dstoffsetcCs x|jD]}t||d�qWdS)N)�	__slots__�setattr)r�attrrrrr0sz_ttinfo.__init__cCsRg}x6|jD],}t||�}|dk	r|jd|t|�f�qWd|jjdj|�fS)Nz%s=%sz%s(%s)z, )rLrB�appendr1rr �join)r�lrN�valuerrrr!s
z_ttinfo.__repr__cCsbt|t�stS|j|jko`|j|jko`|j|jko`|j|jko`|j|jko`|j|jko`|j	|j	kS)N)
rrErr/rFrGrHrIrJrK)rrrrrr$s
z_ttinfo.__eq__NcCs
||kS)Nr)rrrrrr2sz_ttinfo.__ne__cCs(i}x|jD]}t||d�||<qW|S)N)rLrB)r�stater.rrr�__getstate__5sz_ttinfo.__getstate__cCs,x&|jD]}||krt||||�qWdS)N)rLrM)rrSr.rrr�__setstate__;sz_ttinfo.__setstate__)r r"r#rLr0r!rr%rrTrUrrrrrEs
rEc@s,eZdZdZdddddddd	gZd
d�ZdS)
�_tzfilezw
    Lightweight class for holding the relevant transition and time zone
    information read from binary tzfiles.
    �
trans_list�trans_list_utc�	trans_idx�ttinfo_list�
ttinfo_std�
ttinfo_dst�
ttinfo_before�ttinfo_firstcKs(x"|jD]}t|||j|d��qWdS)N)�attrsrM�get)r�kwargsrNrrrr0Isz_tzfile.__init__N)r r"r#r$r_r0rrrrrVAsrVcs�eZdZdZd&�fdd�	Zdd�Zdd�Zd'd
d�Zdd
�Zdd�Z	dd�Z
d(dd�Zdd�Zdd�Z
dd�Zedd��Zdd�ZdZdd�Zd d!�Zd"d#�Zd$d%�Z�ZS))�tzfilea�
    This is a ``tzinfo`` subclass thant allows one to use the ``tzfile(5)``
    format timezone files to extract current and historical zone information.

    :param fileobj:
        This can be an opened file stream or a file name that the time zone
        information can be read from.

    :param filename:
        This is an optional parameter specifying the source of the time zone
        information in the event that ``fileobj`` is a file object. If omitted
        and ``fileobj`` is a file stream, this parameter will be set either to
        ``fileobj``'s ``name`` attribute or to ``repr(fileobj)``.

    See `Sources for Time Zone and Daylight Saving Time Data
    <http://www.twinsun.com/tz/tz-link.htm>`_ for more information. Time zone
    files can be compiled from the `IANA Time Zone database files
    <https://www.iana.org/time-zones>`_ with the `zic time zone compiler
    <https://www.freebsd.org/cgi/man.cgi?query=zic&sektion=8>`_
    Nc	s�tt|�j�d}t|t�r2||_t|d�}d}n.|dk	rB||_nt|d�rV|j|_n
t	|�|_|dk	r�|stt
|�}|�}|j|�}WdQRX|j|�dS)NF�rbTr.)
r4rbr0rr�	_filename�open�hasattrr.r1�_ContextWrapper�_read_tzfile�_set_tzdata)r�fileobj�filenameZfile_opened_hereZfile_stream�tzobj)rrrr0ds"




ztzfile.__init__cCs*x$tjD]}t|d|t||��qWdS)z= Set the time zone data of this object from a _tzfile object �_N)rVr_rMrB)rrlrNrrrri|sztzfile._set_tzdatacs�t��|jd�j�dkr td��|jd�tjd|jd��\}}}}}}|rnttjd||j|d����_ng�_|r�tjd||j|���_ng�_g}x(t	|�D]}	|j
tjd	|jd
���q�W|j|�j�}
|r�|j|dtj
�|�rtjd||j|��}|�r"tjd||j|��}g�_x�t	|�D]�}	||	\}
}}d
|
dd
}
t�}|
|_tjd�|_tj|
d�|_||_|
||
jd|��|_||	k�o�||	dk|_||	k�o�||	dk|_�jj
|��q2W�fdd��jD��_d�_d�_d�_�j�r؈j�s$�jd�_�_n�x�t	|ddd�D]V}	�j|	}�j�r`|j�r`|�_n�j�rx|j�rx|�_�j�r6�j�r6P�q6W�j�r��j�r��j�_x,�jD]}|j�s�|�_P�q�W�jd�_d}g�_xlt�j�D]^\}	}|j�s
|j}|}n*|dk	�r*|j||_|�j|	<|�p2d}�jj
�j|	|��q�Wd}x~t t	t!�j���D]h}	�j|	}|j�r�|j�p�|dk�s�|j||_n|j}t"|jtj��s�tj|jd�|_|�j|	<�qhWt#�j��_t#�j��_t#�j��_�S)N�ZTZifzmagic not found�z>6l�z>%dlz>%dBz>lbb��z>%db�<�r)r(�csg|]}�j|�qSr)rZ)�.0�idx)�outrr�
<listcomp>sz'tzfile._read_tzfile.<locals>.<listcomp>r���rz)$rV�read�decode�
ValueError�struct�unpack�listrXrY�rangerO�seek�os�SEEK_CURrZrEr/r,r-rKrFrG�findrHrIrJr[r\r]r^rW�	enumerate�reversed�lenr�tuple)rrjZ
ttisgmtcntZ
ttisstdcntZleapcntZtimecntZtypecntZcharcntZttinfo�irHrIrJZgmtoffrGZabbrind�ttiZ
laststdoffsetr/r)rxrrh�s�
		






	



ztzfile._read_tzfileFcCs6|js
dSt|�}|r|jn|j}tj||�}|dS)Nr)�_trans_listr>Z_trans_list_utc�bisectZbisect_right)rr�in_utcr@rWrwrrr�_find_last_transition^sztzfile._find_last_transitioncCs8|dks|dt|j�kr |jS|dkr.|jS|j|S)Nrr)r�r��_ttinfo_stdZ_ttinfo_before�
_trans_idx)rrwrrr�_get_ttinfoms
ztzfile._get_ttinfocCs|j|�}|j|�S)N)�_resolve_ambiguous_timer�)rrrwrrr�_find_ttinfoxs
ztzfile._find_ttinfocCsnt|tj�std��|j|k	r&td��|j|dd�}|j|�}|tj|jd�}|j	||d�}t
|t|�d�S)a
        The ``tzfile`` implementation of :py:func:`datetime.tzinfo.fromutc`.

        :param dt:
            A :py:class:`datetime.datetime` object.

        :raises TypeError:
            Raised if ``dt`` is not a :py:class:`datetime.datetime` object.

        :raises ValueError:
            Raised if this is called with a ``dt`` which does not have this
            ``tzinfo`` attached.

        :return:
            Returns a :py:class:`datetime.datetime` object representing the
            wall time in ``self``'s time zone.
        z&fromutc() requires a datetime argumentzdt.tzinfo is not selfT)r�)r()rw)rA)rr,r*�tzinfor}r�r�r-r/rrr2)rrrwr�Zdt_outrArrrr}s

ztzfile.fromutccCsd|dkr|j|�}t|�}|j|�}|dks4|dkr8dS|j|d�j|j}|j|}|||kS)a6
        Whether or not the "wall time" of a given datetime is ambiguous in this
        zone.

        :param dt:
            A :py:class:`datetime.datetime`, naive or time zone aware.


        :return:
            Returns ``True`` if ambiguous, ``False`` otherwise.

        .. versionadded:: 2.6.0
        NrFr)r�r>r�r/r�)rrrwr@r�ZodZttrrrr�s


ztzfile.is_ambiguouscCsF|j|�}|j|�}|dks$|dkr(|St|o:|j||��}||S)Nr)r�rCr2r)rrrwrCZ
idx_offsetrrrr��s

ztzfile._resolve_ambiguous_timecCs"|dkrdS|jstS|j|�jS)N)r�r
r�rF)rrrrrr�s
ztzfile.utcoffsetcCs0|dkrdS|jstS|j|�}|js*tS|jS)N)Z_ttinfo_dstr
r�rGrK)rrr�rrrr�s
z
tzfile.dstcCs |js|dkrdS|j|�jS)N)r�r�rH)rrrrrr�sz
tzfile.tznamecCs2t|t�stS|j|jko0|j|jko0|j|jkS)N)rrbrr�r�Z_ttinfo_list)rrrrrr�s

z
tzfile.__eq__cCs
||kS)Nr)rrrrrr�sz
tzfile.__ne__cCsd|jjt|j�fS)Nz%s(%s))rr r1rd)rrrrr!�sztzfile.__repr__cCs
|jd�S)N)�
__reduce_ex__)rrrrr'�sztzfile.__reduce__cCs|jd|jf|jfS)N)rrd�__dict__)rZprotocolrrrr��sztzfile.__reduce_ex__)N)F)N)r r"r#r$r0rirhr�r�r�rrr�rrrrrr%rr!r'r�rDrr)rrrbNs(]
$

	rbc@s6eZdZdZddd�Zdd�Zdd�Zed	d
��ZdS)�tzrangeaQ
    The ``tzrange`` object is a time zone specified by a set of offsets and
    abbreviations, equivalent to the way the ``TZ`` variable can be specified
    in POSIX-like systems, but using Python delta objects to specify DST
    start, end and offsets.

    :param stdabbr:
        The abbreviation for standard time (e.g. ``'EST'``).

    :param stdoffset:
        An integer or :class:`datetime.timedelta` object or equivalent
        specifying the base offset from UTC.

        If unspecified, +00:00 is used.

    :param dstabbr:
        The abbreviation for DST / "Summer" time (e.g. ``'EDT'``).

        If specified, with no other DST information, DST is assumed to occur
        and the default behavior or ``dstoffset``, ``start`` and ``end`` is
        used. If unspecified and no other DST information is specified, it
        is assumed that this zone has no DST.

        If this is unspecified and other DST information is *is* specified,
        DST occurs in the zone but the time zone abbreviation is left
        unchanged.

    :param dstoffset:
        A an integer or :class:`datetime.timedelta` object or equivalent
        specifying the UTC offset during DST. If unspecified and any other DST
        information is specified, it is assumed to be the STD offset +1 hour.

    :param start:
        A :class:`relativedelta.relativedelta` object or equivalent specifying
        the time and time of year that daylight savings time starts. To specify,
        for example, that DST starts at 2AM on the 2nd Sunday in March, pass:

            ``relativedelta(hours=2, month=3, day=1, weekday=SU(+2))``

        If unspecified and any other DST information is specified, the default
        value is 2 AM on the first Sunday in April.

    :param end:
        A :class:`relativedelta.relativedelta` object or equivalent representing
        the time and time of year that daylight savings time ends, with the
        same specification method as in ``start``. One note is that this should
        point to the first time in the *standard* zone, so if a transition
        occurs at 2AM in the DST zone and the clocks are set back 1 hour to 1AM,
        set the `hours` parameter to +1.


    **Examples:**

    .. testsetup:: tzrange

        from dateutil.tz import tzrange, tzstr

    .. doctest:: tzrange

        >>> tzstr('EST5EDT') == tzrange("EST", -18000, "EDT")
        True

        >>> from dateutil.relativedelta import *
        >>> range1 = tzrange("EST", -18000, "EDT")
        >>> range2 = tzrange("EST", -18000, "EDT", -14400,
        ...                  relativedelta(hours=+2, month=4, day=1,
        ...                                weekday=SU(+1)),
        ...                  relativedelta(hours=+1, month=10, day=31,
        ...                                weekday=SU(-1)))
        >>> tzstr('EST5EDT') == range1 == range2
        True

    NcCs@ddlma||_||_yt|�}Wnttfk
r<YnXyt|�}Wnttfk
rbYnX|dk	r|tj|d�|_	nt
|_	|dk	r�tj|d�|_n(|r�|dk	r�|j	tjdd�|_nt
|_|r�|dkr�tjdddtjd
�d�|_
n||_
|�r|dk�rtjdd	d
tjd�d�|_n||_|j|j	|_t|j
�|_dS)Nr)�
relativedelta)r(r)�hours�rn)r��month�day�weekday�
�rr�rrrz)�dateutilr��	_std_abbr�	_dst_abbrrr*r+r,r-r7r
r8�SU�_start_delta�
_end_delta�_dst_base_offset_r:�hasdst)r�stdabbr�	stdoffset�dstabbrrK�start�endrrrr0Js:ztzrange.__init__cCs4|js
dStj|dd�}||j}||j}||fS)a�
        For a given year, get the DST on and off transition times, expressed
        always on the standard time side. For zones with no transitions, this
        function returns ``None``.

        :param year:
            The year whose transitions you would like to query.

        :return:
            Returns a :class:`tuple` of :class:`datetime.datetime` objects,
            ``(dston, dstoff)`` for zones with an annual DST transition, or
            ``None`` for fixed offset zones.
        Nr)r�r,r�r�)rZyearZ	base_yearr�r�rrr�transitionsys

ztzrange.transitionscCsVt|t�stS|j|jkoT|j|jkoT|j|jkoT|j|jkoT|j|jkoT|j|jkS)N)	rr�rr�r�r7r8r�r�)rrrrrr�s
ztzrange.__eq__cCs|jS)N)r�)rrrr�_dst_base_offset�sztzrange._dst_base_offset)NNNNN)	r r"r#r$r0r�r�propertyr�rrrrr�sI
-r�c@s,eZdZdZddd�Zddd�Zdd	�Zd
S)
�tzstra
    ``tzstr`` objects are time zone objects specified by a time-zone string as
    it would be passed to a ``TZ`` variable on POSIX-style systems (see
    the `GNU C Library: TZ Variable`_ for more details).

    There is one notable exception, which is that POSIX-style time zones use an
    inverted offset format, so normally ``GMT+3`` would be parsed as an offset
    3 hours *behind* GMT. The ``tzstr`` time zone object will parse this as an
    offset 3 hours *ahead* of GMT. If you would like to maintain the POSIX
    behavior, pass a ``True`` value to ``posix_offset``.

    The :class:`tzrange` object provides the same functionality, but is
    specified using :class:`relativedelta.relativedelta` objects. rather than
    strings.

    :param s:
        A time zone string in ``TZ`` variable format. This can be a
        :class:`bytes` (2.x: :class:`str`), :class:`str` (2.x: :class:`unicode`)
        or a stream emitting unicode characters (e.g. :class:`StringIO`).

    :param posix_offset:
        Optional. If set to ``True``, interpret strings such as ``GMT+3`` or
        ``UTC+3`` as being 3 hours *behind* UTC rather than ahead, per the
        POSIX standard.

    .. _`GNU C Library: TZ Variable`:
        https://www.gnu.org/software/libc/manual/html_node/TZ-Variable.html
    Fc	Cs�ddlma||_tj|�}|dkr,td��|jd
krJ|rJ|jd9_tj||j|j|j	|j
ddd�|j	s~d|_d|_n&|j
|j�|_|jr�|j
|jdd	�|_t|j�|_dS)Nr)�parserzunknown string format�GMTrrF)r�r�)�isend)r�rrz)r�r��_sZ_parsetzr}r�r�r�r0r�rKr�r��_deltar�r�r:r�)r�sZposix_offset�resrrrr0�s"

ztzstr.__init__rcCs<ddlm}i}|jdk	rr|j|d<|jdk	r`|j|j|j�|d<|jdkrVd|d<qpd|d<q�|jr�|j|d<n*|jdk	r�|j|d<n|jdk	r�|j|d	<|s�|s�d
|d<d|d<|jd�|d<nd|d<d|d<|jd�|d<|j	dk	�r�|j	|d<nd
|d<|�r0|j
|j}|d|j|j
d8<|jf|�S)Nr)r�r�r�rr�r�ZyeardayZ	nlyeardayrnr�r(i i�Qrrz)r�r�r�r�Zweekr�ZydayZjydayr�r5r8r7r(Zdays)r�xr�r�rarFrrrr��s<








ztzstr._deltacCsd|jjt|j�fS)Nz%s(%s))rr r1r�)rrrrr!sztzstr.__repr__N)F)r)r r"r#r$r0r�r!rrrrr��s
 
)r�c@seZdZddd�ZdS)�_tzicalvtzcompNcCs@tj|d�|_tj|d�|_|j|j|_||_||_||_dS)N)r()r,r-�tzoffsetfrom�
tzoffsetto�tzoffsetdiffrGr�rrule)rr�r�rGrr�rrrr0sz_tzicalvtzcomp.__init__)NN)r r"r#r0rrrrr�sr�csZeZdZgf�fdd�	Zdd�Zdd�Zdd�Zd	d
�Zedd��Z	d
d�Z
ejZ�Z
S)�
_tzicalvtzcs*tt|�j�||_||_g|_g|_dS)N)r4r�r0�_tzid�_comps�
_cachedate�
_cachecomp)r�tzid�comps)rrrr0s
z_tzicalvtz.__init__c
Cs
t|j�dkr|jdS|jdd�}y|j|jj||j|�f�Stk
rTYnXd}d}x4|jD]*}|j||�}|rf|s�||krf|}|}qfW|s�x"|jD]}|j	s�|}Pq�W|d}|jj
d||j|�f�|jj
d|�t|j�dk�r|jj�|jj�|S)Nrr)r�r�)r�r��replacer�r��indexrCr}�_find_compdtrG�insert�pop)rrZ
lastcompdtZlastcomp�comp�compdtrrr�
_find_comps4


[binary .pyc payload omitted: _tzicalvtz, tzical (RFC 2445 VTIMEZONE parser), gettz, datetime_exists, datetime_ambiguous, _ContextWrapper]
site-packages/dateutil/tz/__pycache__/_common.cpython-36.opt-1.pyc000064400000026262147511334560021001 0ustar00
[binary .pyc payload omitted: tzname_in_python2, enfold, _DatetimeWithFold, _validate_fromutc_inputs, _tzinfo, tzrangebase, _total_seconds]
site-packages/dateutil/tz/__pycache__/_common.cpython-36.pyc000064400000026262147511334560020042 0ustar00
[binary .pyc payload omitted: module content identical to _common.cpython-36.opt-1.pyc above]
site-packages/dateutil/tz/win.py000064400000026205147511334560012721 0ustar00# This code was originally contributed by Jeffrey Harris.
import datetime
import struct

from six.moves import winreg
from six import text_type

try:
    import ctypes
    from ctypes import wintypes
except ValueError:
    # ValueError is raised on non-Windows systems for some horrible reason.
    raise ImportError("Running tzwin on non-Windows system")

from ._common import tzrangebase

__all__ = ["tzwin", "tzwinlocal", "tzres"]

ONEWEEK = datetime.timedelta(7)

TZKEYNAMENT = r"SOFTWARE\Microsoft\Windows NT\CurrentVersion\Time Zones"
TZKEYNAME9X = r"SOFTWARE\Microsoft\Windows\CurrentVersion\Time Zones"
TZLOCALKEYNAME = r"SYSTEM\CurrentControlSet\Control\TimeZoneInformation"


def _settzkeyname():
    handle = winreg.ConnectRegistry(None, winreg.HKEY_LOCAL_MACHINE)
    try:
        winreg.OpenKey(handle, TZKEYNAMENT).Close()
        TZKEYNAME = TZKEYNAMENT
    except WindowsError:
        TZKEYNAME = TZKEYNAME9X
    handle.Close()
    return TZKEYNAME


TZKEYNAME = _settzkeyname()


class tzres(object):
    """
    Class for accessing `tzres.dll`, which contains timezone name related
    resources.

    .. versionadded:: 2.5.0
    """
    p_wchar = ctypes.POINTER(wintypes.WCHAR)        # Pointer to a wide char

    def __init__(self, tzres_loc='tzres.dll'):
        # Load the user32 DLL so we can load strings from tzres
        user32 = ctypes.WinDLL('user32')

        # Specify the LoadStringW function
        user32.LoadStringW.argtypes = (wintypes.HINSTANCE,
                                       wintypes.UINT,
                                       wintypes.LPWSTR,
                                       ctypes.c_int)

        self.LoadStringW = user32.LoadStringW
        self._tzres = ctypes.WinDLL(tzres_loc)
        self.tzres_loc = tzres_loc

    def load_name(self, offset):
        """
        Load a timezone name from a DLL offset (integer).

        >>> from dateutil.tzwin import tzres
        >>> tzr = tzres()
        >>> print(tzr.load_name(112))
        'Eastern Standard Time'

        :param offset:
            A positive integer value referring to a string from the tzres dll.

        .. note::
            Offsets found in the registry are generally of the form
            `@tzres.dll,-114`. The offset in this case is 114, not -114.

        """
        resource = self.p_wchar()
        lpBuffer = ctypes.cast(ctypes.byref(resource), wintypes.LPWSTR)
        nchar = self.LoadStringW(self._tzres._handle, offset, lpBuffer, 0)
        return resource[:nchar]

    def name_from_string(self, tzname_str):
        """
        Parse strings as returned from the Windows registry into the time zone
        name as defined in the registry.

        >>> from dateutil.tzwin import tzres
        >>> tzr = tzres()
        >>> print(tzr.name_from_string('@tzres.dll,-251'))
        'Dateline Daylight Time'
        >>> print(tzr.name_from_string('Eastern Standard Time'))
        'Eastern Standard Time'

        :param tzname_str:
            A timezone name string as returned from a Windows registry key.

        :return:
            Returns the localized timezone string from tzres.dll if the string
            is of the form `@tzres.dll,-offset`, else returns the input string.
        """
        if not tzname_str.startswith('@'):
            return tzname_str

        name_splt = tzname_str.split(',-')
        try:
            offset = int(name_splt[1])
        except (IndexError, ValueError):
            raise ValueError("Malformed timezone string.")

        return self.load_name(offset)


class tzwinbase(tzrangebase):
    """tzinfo class based on win32's timezones available in the registry."""
    def __init__(self):
        raise NotImplementedError('tzwinbase is an abstract base class')

    def __eq__(self, other):
        # Compare on all relevant dimensions, including name.
        if not isinstance(other, tzwinbase):
            return NotImplemented

        return  (self._std_offset == other._std_offset and
                 self._dst_offset == other._dst_offset and
                 self._stddayofweek == other._stddayofweek and
                 self._dstdayofweek == other._dstdayofweek and
                 self._stdweeknumber == other._stdweeknumber and
                 self._dstweeknumber == other._dstweeknumber and
                 self._stdhour == other._stdhour and
                 self._dsthour == other._dsthour and
                 self._stdminute == other._stdminute and
                 self._dstminute == other._dstminute and
                 self._std_abbr == other._std_abbr and
                 self._dst_abbr == other._dst_abbr)

    @staticmethod
    def list():
        """Return a list of all time zones known to the system."""
        with winreg.ConnectRegistry(None, winreg.HKEY_LOCAL_MACHINE) as handle:
            with winreg.OpenKey(handle, TZKEYNAME) as tzkey:
                result = [winreg.EnumKey(tzkey, i)
                          for i in range(winreg.QueryInfoKey(tzkey)[0])]
        return result

    def display(self):
        return self._display

    def transitions(self, year):
        """
        For a given year, get the DST on and off transition times, expressed
        always on the standard time side. For zones with no transitions, this
        function returns ``None``.

        :param year:
            The year whose transitions you would like to query.

        :return:
            Returns a :class:`tuple` of :class:`datetime.datetime` objects,
            ``(dston, dstoff)`` for zones with an annual DST transition, or
            ``None`` for fixed offset zones.
        """

        if not self.hasdst:
            return None

        dston = picknthweekday(year, self._dstmonth, self._dstdayofweek,
                               self._dsthour, self._dstminute,
                               self._dstweeknumber)

        dstoff = picknthweekday(year, self._stdmonth, self._stddayofweek,
                                self._stdhour, self._stdminute,
                                self._stdweeknumber)

        # Ambiguous dates default to the STD side
        dstoff -= self._dst_base_offset

        return dston, dstoff

    def _get_hasdst(self):
        return self._dstmonth != 0

    @property
    def _dst_base_offset(self):
        return self._dst_base_offset_


class tzwin(tzwinbase):

    def __init__(self, name):
        self._name = name

        # multiple contexts only possible in 2.7 and 3.1, we still support 2.6
        with winreg.ConnectRegistry(None, winreg.HKEY_LOCAL_MACHINE) as handle:
            tzkeyname = text_type("{kn}\\{name}").format(kn=TZKEYNAME, name=name)
            with winreg.OpenKey(handle, tzkeyname) as tzkey:
                keydict = valuestodict(tzkey)

        self._std_abbr = keydict["Std"]
        self._dst_abbr = keydict["Dlt"]

        self._display = keydict["Display"]

        # See http://www.jsiinc.com/SUBA/tip0300/rh0398.htm
        tup = struct.unpack("=3l16h", keydict["TZI"])
        stdoffset = -tup[0]-tup[1]          # Bias + StandardBias * -1
        dstoffset = stdoffset-tup[2]        # + DaylightBias * -1
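        # e.g. US Eastern stores Bias=300, StandardBias=0, DaylightBias=-60,
        # giving stdoffset=-300 (UTC-5) and dstoffset=-240 (UTC-4) minutes.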
        self._std_offset = datetime.timedelta(minutes=stdoffset)
        self._dst_offset = datetime.timedelta(minutes=dstoffset)

        # for the meaning see the win32 TIME_ZONE_INFORMATION structure docs
        # http://msdn.microsoft.com/en-us/library/windows/desktop/ms725481(v=vs.85).aspx
        (self._stdmonth,
         self._stddayofweek,   # Sunday = 0
         self._stdweeknumber,  # Last = 5
         self._stdhour,
         self._stdminute) = tup[4:9]

        (self._dstmonth,
         self._dstdayofweek,   # Sunday = 0
         self._dstweeknumber,  # Last = 5
         self._dsthour,
         self._dstminute) = tup[12:17]

        self._dst_base_offset_ = self._dst_offset - self._std_offset
        self.hasdst = self._get_hasdst()

    def __repr__(self):
        return "tzwin(%s)" % repr(self._name)

    def __reduce__(self):
        return (self.__class__, (self._name,))


class tzwinlocal(tzwinbase):
    def __init__(self):
        with winreg.ConnectRegistry(None, winreg.HKEY_LOCAL_MACHINE) as handle:
            with winreg.OpenKey(handle, TZLOCALKEYNAME) as tzlocalkey:
                keydict = valuestodict(tzlocalkey)

            self._std_abbr = keydict["StandardName"]
            self._dst_abbr = keydict["DaylightName"]

            try:
                tzkeyname = text_type('{kn}\\{sn}').format(kn=TZKEYNAME,
                                                          sn=self._std_abbr)
                with winreg.OpenKey(handle, tzkeyname) as tzkey:
                    _keydict = valuestodict(tzkey)
                    self._display = _keydict["Display"]
            except OSError:
                self._display = None

        stdoffset = -keydict["Bias"]-keydict["StandardBias"]
        dstoffset = stdoffset-keydict["DaylightBias"]

        self._std_offset = datetime.timedelta(minutes=stdoffset)
        self._dst_offset = datetime.timedelta(minutes=dstoffset)

        # For reasons unclear, in this particular key, the day of week has been
        # moved to the END of the SYSTEMTIME structure.
        tup = struct.unpack("=8h", keydict["StandardStart"])

        (self._stdmonth,
         self._stdweeknumber,  # Last = 5
         self._stdhour,
         self._stdminute) = tup[1:5]

        self._stddayofweek = tup[7]

        tup = struct.unpack("=8h", keydict["DaylightStart"])

        (self._dstmonth,
         self._dstweeknumber,  # Last = 5
         self._dsthour,
         self._dstminute) = tup[1:5]

        self._dstdayofweek = tup[7]

        self._dst_base_offset_ = self._dst_offset - self._std_offset
        self.hasdst = self._get_hasdst()

    def __repr__(self):
        return "tzwinlocal()"

    def __str__(self):
        # str will return the standard name, not the daylight name.
        return "tzwinlocal(%s)" % repr(self._std_abbr)

    def __reduce__(self):
        return (self.__class__, ())


def picknthweekday(year, month, dayofweek, hour, minute, whichweek):
    """ dayofweek == 0 means Sunday, whichweek 5 means last instance """
    first = datetime.datetime(year, month, 1, hour, minute)

    # This will work if dayofweek is ISO weekday (1-7) or Microsoft-style (0-6),
    # Because 7 % 7 = 0
    weekdayone = first.replace(day=((dayofweek - first.isoweekday()) % 7) + 1)
    wd = weekdayone + ((whichweek - 1) * ONEWEEK)
    if (wd.month != month):
        wd -= ONEWEEK

    return wd
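

# Added illustrative sketch (not part of the original module): picknthweekday
# with Microsoft's Sunday == 0 convention, using the 2021 US DST-start rule.
def _example_picknthweekday():
    # Second Sunday of March 2021 at 02:00 -> datetime.datetime(2021, 3, 14, 2, 0)
    dst_on = picknthweekday(2021, 3, 0, 2, 0, 2)
    # whichweek == 5 means "last occurrence"; the fifth Sunday would spill
    # into April, so it falls back one week -> datetime.datetime(2021, 3, 28, 2, 0)
    last_sunday = picknthweekday(2021, 3, 0, 2, 0, 5)
    return dst_on, last_sunday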


def valuestodict(key):
    """Convert a registry key's values to a dictionary."""
    dout = {}
    size = winreg.QueryInfoKey(key)[1]
    tz_res = None

    for i in range(size):
        key_name, value, dtype = winreg.EnumValue(key, i)
        if dtype == winreg.REG_DWORD or dtype == winreg.REG_DWORD_LITTLE_ENDIAN:
            # If it's a DWORD (32-bit integer), it's stored as unsigned - convert
            # that to a proper signed integer
            if value & (1 << 31):
                value = value - (1 << 32)
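                # e.g. a DaylightBias stored as 0xFFFFFFC4 (4294967236)
                # becomes 4294967236 - 2**32 == -60 here.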
        elif dtype == winreg.REG_SZ:
            # If it's a reference to the tzres DLL, load the actual string
            if value.startswith('@tzres'):
                tz_res = tz_res or tzres()
                value = tz_res.name_from_string(value)

            value = value.rstrip('\x00')    # Remove trailing nulls

        dout[key_name] = value

    return dout
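

# A minimal, hedged usage sketch (added; not part of the original module).
# Everything below reads the Windows registry, so it only runs on Windows.
if __name__ == "__main__":
    print(tzwinbase.list()[:5])          # first few registry time zone names
    eastern = tzwin('Eastern Standard Time')
    print(eastern.display())             # localized display string
    print(eastern.transitions(2021))     # (DST on, DST off) in standard local time
    print(tzwinlocal())                  # the zone the machine is configured for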
site-packages/dateutil/tz/__init__.py000064400000000317147511334560013657 0ustar00from .tz import *

__all__ = ["tzutc", "tzoffset", "tzlocal", "tzfile", "tzrange",
           "tzstr", "tzical", "tzwin", "tzwinlocal", "gettz",
           "enfold", "datetime_ambiguous", "datetime_exists"]
site-packages/dateutil/_version.py000064400000000333147511334560013305 0ustar00"""
Contains information about the dateutil version.
"""

VERSION_MAJOR = 2
VERSION_MINOR = 6
VERSION_PATCH = 1

VERSION_TUPLE = (VERSION_MAJOR, VERSION_MINOR, VERSION_PATCH)
VERSION = '.'.join(map(str, VERSION_TUPLE))
site-packages/dateutil/rrule.py000064400000170613147511334560012623 0ustar00# -*- coding: utf-8 -*-
"""
The rrule module offers a small, complete, and very fast implementation of
the recurrence rules documented in the
`iCalendar RFC <http://www.ietf.org/rfc/rfc2445.txt>`_,
including support for caching of results.
"""
import itertools
import datetime
import calendar
import sys

try:
    from math import gcd
except ImportError:
    from fractions import gcd

from six import advance_iterator, integer_types
from six.moves import _thread, range
import heapq

from ._common import weekday as weekdaybase

# For warning about deprecation of until and count
from warnings import warn

__all__ = ["rrule", "rruleset", "rrulestr",
           "YEARLY", "MONTHLY", "WEEKLY", "DAILY",
           "HOURLY", "MINUTELY", "SECONDLY",
           "MO", "TU", "WE", "TH", "FR", "SA", "SU"]

# Every mask is 7 days longer to handle cross-year weekly periods.
M366MASK = tuple([1]*31+[2]*29+[3]*31+[4]*30+[5]*31+[6]*30 +
                 [7]*31+[8]*31+[9]*30+[10]*31+[11]*30+[12]*31+[1]*7)
M365MASK = list(M366MASK)
M29, M30, M31 = list(range(1, 30)), list(range(1, 31)), list(range(1, 32))
MDAY366MASK = tuple(M31+M29+M31+M30+M31+M30+M31+M31+M30+M31+M30+M31+M31[:7])
MDAY365MASK = list(MDAY366MASK)
M29, M30, M31 = list(range(-29, 0)), list(range(-30, 0)), list(range(-31, 0))
NMDAY366MASK = tuple(M31+M29+M31+M30+M31+M30+M31+M31+M30+M31+M30+M31+M31[:7])
NMDAY365MASK = list(NMDAY366MASK)
M366RANGE = (0, 31, 60, 91, 121, 152, 182, 213, 244, 274, 305, 335, 366)
M365RANGE = (0, 31, 59, 90, 120, 151, 181, 212, 243, 273, 304, 334, 365)
WDAYMASK = [0, 1, 2, 3, 4, 5, 6]*55
del M29, M30, M31, M365MASK[59], MDAY365MASK[59], NMDAY365MASK[31]
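# (the deletions above drop the 29 February slot from the 365-day masks, so
#  e.g. M365MASK[59] is March rather than a leap day)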
MDAY365MASK = tuple(MDAY365MASK)
M365MASK = tuple(M365MASK)

FREQNAMES = ['YEARLY', 'MONTHLY', 'WEEKLY', 'DAILY', 'HOURLY', 'MINUTELY', 'SECONDLY']

(YEARLY,
 MONTHLY,
 WEEKLY,
 DAILY,
 HOURLY,
 MINUTELY,
 SECONDLY) = list(range(7))

# Imported on demand.
easter = None
parser = None


class weekday(weekdaybase):
    """
    This version of weekday does not allow n = 0.
    """
    def __init__(self, wkday, n=None):
        if n == 0:
            raise ValueError("Can't create weekday with n==0")

        super(weekday, self).__init__(wkday, n)


MO, TU, WE, TH, FR, SA, SU = weekdays = tuple(weekday(x) for x in range(7))
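

# Added illustrative sketch: the weekday singletons above are callable, so an
# ordinal can be attached for use with ``byweekday`` (n == 0 is rejected).
def _example_weekday_ordinals():
    first_friday = FR(+1)   # "the first Friday" within the frequency period
    last_sunday = SU(-1)    # "the last Sunday" within the frequency period
    return first_friday, last_sunday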


def _invalidates_cache(f):
    """
    Decorator for rruleset methods which may invalidate the
    cached length.
    """
    def inner_func(self, *args, **kwargs):
        rv = f(self, *args, **kwargs)
        self._invalidate_cache()
        return rv

    return inner_func


class rrulebase(object):
    def __init__(self, cache=False):
        if cache:
            self._cache = []
            self._cache_lock = _thread.allocate_lock()
            self._invalidate_cache()
        else:
            self._cache = None
            self._cache_complete = False
            self._len = None

    def __iter__(self):
        if self._cache_complete:
            return iter(self._cache)
        elif self._cache is None:
            return self._iter()
        else:
            return self._iter_cached()

    def _invalidate_cache(self):
        if self._cache is not None:
            self._cache = []
            self._cache_complete = False
            self._cache_gen = self._iter()

            if self._cache_lock.locked():
                self._cache_lock.release()

        self._len = None

    def _iter_cached(self):
        i = 0
        gen = self._cache_gen
        cache = self._cache
        acquire = self._cache_lock.acquire
        release = self._cache_lock.release
        while gen:
            if i == len(cache):
                acquire()
                if self._cache_complete:
                    break
                try:
                    for j in range(10):
                        cache.append(advance_iterator(gen))
                except StopIteration:
                    self._cache_gen = gen = None
                    self._cache_complete = True
                    break
                release()
            yield cache[i]
            i += 1
        while i < self._len:
            yield cache[i]
            i += 1

    def __getitem__(self, item):
        if self._cache_complete:
            return self._cache[item]
        elif isinstance(item, slice):
            if item.step and item.step < 0:
                return list(iter(self))[item]
            else:
                return list(itertools.islice(self,
                                             item.start or 0,
                                             item.stop or sys.maxsize,
                                             item.step or 1))
        elif item >= 0:
            gen = iter(self)
            try:
                for i in range(item+1):
                    res = advance_iterator(gen)
            except StopIteration:
                raise IndexError
            return res
        else:
            return list(iter(self))[item]

    def __contains__(self, item):
        if self._cache_complete:
            return item in self._cache
        else:
            for i in self:
                if i == item:
                    return True
                elif i > item:
                    return False
        return False

    # __len__() introduces a large performance penalty.
    def count(self):
        """ Returns the number of recurrences in this set. It will have to go
            through the whole recurrence, if this hasn't been done before. """
        if self._len is None:
            for x in self:
                pass
        return self._len

    def before(self, dt, inc=False):
        """ Returns the last recurrence before the given datetime instance. The
            inc keyword defines what happens if dt is an occurrence. With
            inc=True, if dt itself is an occurrence, it will be returned. """
        if self._cache_complete:
            gen = self._cache
        else:
            gen = self
        last = None
        if inc:
            for i in gen:
                if i > dt:
                    break
                last = i
        else:
            for i in gen:
                if i >= dt:
                    break
                last = i
        return last

    def after(self, dt, inc=False):
        """ Returns the first recurrence after the given datetime instance. The
            inc keyword defines what happens if dt is an occurrence. With
            inc=True, if dt itself is an occurrence, it will be returned.  """
        if self._cache_complete:
            gen = self._cache
        else:
            gen = self
        if inc:
            for i in gen:
                if i >= dt:
                    return i
        else:
            for i in gen:
                if i > dt:
                    return i
        return None

    def xafter(self, dt, count=None, inc=False):
        """
        Generator which yields up to `count` recurrences after the given
        datetime instance, equivalent to `after`.

        :param dt:
            The datetime at which to start generating recurrences.

        :param count:
            The maximum number of recurrences to generate. If `None` (default),
            dates are generated until the recurrence rule is exhausted.

        :param inc:
            If `dt` is an instance of the rule and `inc` is `True`, it is
            included in the output.

        :yields: Yields a sequence of `datetime` objects.
        """

        if self._cache_complete:
            gen = self._cache
        else:
            gen = self

        # Select the comparison function
        if inc:
            comp = lambda dc, dtc: dc >= dtc
        else:
            comp = lambda dc, dtc: dc > dtc

        # Generate dates
        n = 0
        for d in gen:
            if comp(d, dt):
                if count is not None:
                    n += 1
                    if n > count:
                        break

                yield d

    def between(self, after, before, inc=False, count=1):
        """ Returns all the occurrences of the rrule between after and before.
        The inc keyword defines what happens if after and/or before are
        themselves occurrences. With inc=True, they will be included in the
        list, if they are found in the recurrence set. """
        if self._cache_complete:
            gen = self._cache
        else:
            gen = self
        started = False
        l = []
        if inc:
            for i in gen:
                if i > before:
                    break
                elif not started:
                    if i >= after:
                        started = True
                        l.append(i)
                else:
                    l.append(i)
        else:
            for i in gen:
                if i >= before:
                    break
                elif not started:
                    if i > after:
                        started = True
                        l.append(i)
                else:
                    l.append(i)
        return l
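

# Added illustrative sketch of the query methods above (``before``, ``after``,
# ``xafter``, ``between`` and ``count``).  ``rrule`` is defined further down in
# this module, so this helper is only callable once the module has loaded.
def _example_rrulebase_queries():
    start = datetime.datetime(2020, 1, 1)
    rule = rrule(DAILY, dtstart=start, count=10)      # 2020-01-01 .. 2020-01-10
    probe = datetime.datetime(2020, 1, 5)
    assert rule.before(probe) == datetime.datetime(2020, 1, 4)
    assert rule.after(probe) == datetime.datetime(2020, 1, 6)
    assert rule.after(probe, inc=True) == probe
    assert rule.between(datetime.datetime(2020, 1, 3),
                        datetime.datetime(2020, 1, 7)) == [
        datetime.datetime(2020, 1, 4),
        datetime.datetime(2020, 1, 5),
        datetime.datetime(2020, 1, 6)]
    assert list(rule.xafter(probe, count=2)) == [datetime.datetime(2020, 1, 6),
                                                 datetime.datetime(2020, 1, 7)]
    assert rule[0] == start and rule[-1] == datetime.datetime(2020, 1, 10)
    return rule.count()                               # -> 10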


class rrule(rrulebase):
    """
    That's the base of the rrule operation. It accepts all the keywords
    defined in the RFC as its constructor parameters (except byday,
    which was renamed to byweekday) and more. The constructor prototype is::

            rrule(freq)

    Where freq must be one of YEARLY, MONTHLY, WEEKLY, DAILY, HOURLY, MINUTELY,
    or SECONDLY.

    .. note::
        Per RFC section 3.3.10, recurrence instances falling on invalid dates
        and times are ignored rather than coerced:

            Recurrence rules may generate recurrence instances with an invalid
            date (e.g., February 30) or nonexistent local time (e.g., 1:30 AM
            on a day where the local time is moved forward by an hour at 1:00
            AM).  Such recurrence instances MUST be ignored and MUST NOT be
            counted as part of the recurrence set.

        This can lead to possibly surprising behavior when, for example, the
        start date occurs at the end of the month:

        >>> from dateutil.rrule import rrule, MONTHLY
        >>> from datetime import datetime
        >>> start_date = datetime(2014, 12, 31)
        >>> list(rrule(freq=MONTHLY, count=4, dtstart=start_date))
        ... # doctest: +NORMALIZE_WHITESPACE
        [datetime.datetime(2014, 12, 31, 0, 0),
         datetime.datetime(2015, 1, 31, 0, 0),
         datetime.datetime(2015, 3, 31, 0, 0),
         datetime.datetime(2015, 5, 31, 0, 0)]

    Additionally, it supports the following keyword arguments:

    :param cache:
        If given, it must be a boolean value specifying to enable or disable
        caching of results. If you will use the same rrule instance multiple
        times, enabling caching will improve the performance considerably.
    :param dtstart:
        The recurrence start. Besides being the base for the recurrence,
        missing parameters in the final recurrence instances will also be
        extracted from this date. If not given, datetime.now() will be used
        instead.
    :param interval:
        The interval between each freq iteration. For example, when using
        YEARLY, an interval of 2 means once every two years, but with HOURLY,
        it means once every two hours. The default interval is 1.
    :param wkst:
        The week start day. Must be one of the MO, TU, WE constants, or an
        integer, specifying the first day of the week. This will affect
        recurrences based on weekly periods. The default week start is
        retrieved from calendar.firstweekday(), and may be modified by
        calendar.setfirstweekday().
    :param count:
        How many occurrences will be generated.

        .. note::
            As of version 2.5.0, the use of the ``until`` keyword together
            with the ``count`` keyword is deprecated per RFC-2445 Sec. 4.3.10.
    :param until:
        If given, this must be a datetime instance, that will specify the
        limit of the recurrence. The last recurrence in the rule is the greatest
        datetime that is less than or equal to the value specified in the
        ``until`` parameter.

        .. note::
            As of version 2.5.0, the use of the ``until`` keyword together
            with the ``count`` keyword is deprecated per RFC-2445 Sec. 4.3.10.
    :param bysetpos:
        If given, it must be either an integer, or a sequence of integers,
        positive or negative. Each given integer will specify an occurrence
        number, corresponding to the nth occurrence of the rule inside the
        frequency period. For example, a bysetpos of -1 if combined with a
        MONTHLY frequency, and a byweekday of (MO, TU, WE, TH, FR), will
        result in the last work day of every month.
    :param bymonth:
        If given, it must be either an integer, or a sequence of integers,
        meaning the months to apply the recurrence to.
    :param bymonthday:
        If given, it must be either an integer, or a sequence of integers,
        meaning the month days to apply the recurrence to.
    :param byyearday:
        If given, it must be either an integer, or a sequence of integers,
        meaning the year days to apply the recurrence to.
    :param byweekno:
        If given, it must be either an integer, or a sequence of integers,
        meaning the week numbers to apply the recurrence to. Week numbers
        have the meaning described in ISO8601, that is, the first week of
        the year is that containing at least four days of the new year.
    :param byweekday:
        If given, it must be either an integer (0 == MO), a sequence of
        integers, one of the weekday constants (MO, TU, etc), or a sequence
        of these constants. When given, these variables will define the
        weekdays where the recurrence will be applied. It's also possible to
        use an argument n for the weekday instances, which will mean the nth
        occurrence of this weekday in the period. For example, with MONTHLY,
        or with YEARLY and BYMONTH, using FR(+1) in byweekday will specify the
        first Friday of the month where the recurrence happens. Notice that in
        the RFC documentation, this is specified as BYDAY, but was renamed to
        avoid the ambiguity of that keyword.
    :param byhour:
        If given, it must be either an integer, or a sequence of integers,
        meaning the hours to apply the recurrence to.
    :param byminute:
        If given, it must be either an integer, or a sequence of integers,
        meaning the minutes to apply the recurrence to.
    :param bysecond:
        If given, it must be either an integer, or a sequence of integers,
        meaning the seconds to apply the recurrence to.
    :param byeaster:
        If given, it must be either an integer, or a sequence of integers,
        positive or negative. Each integer will define an offset from the
        Easter Sunday. Passing the offset 0 to byeaster will yield the Easter
        Sunday itself. This is an extension to the RFC specification.
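
    For example, the "last work day of every month" combination described
    under ``bysetpos`` above can be written as:

    >>> from dateutil.rrule import rrule, MONTHLY, MO, TU, WE, TH, FR
    >>> from datetime import datetime
    >>> list(rrule(MONTHLY, count=3, byweekday=(MO, TU, WE, TH, FR),
    ...            bysetpos=-1, dtstart=datetime(2015, 1, 1)))
    ... # doctest: +NORMALIZE_WHITESPACE
    [datetime.datetime(2015, 1, 30, 0, 0),
     datetime.datetime(2015, 2, 27, 0, 0),
     datetime.datetime(2015, 3, 31, 0, 0)]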
     """
    def __init__(self, freq, dtstart=None,
                 interval=1, wkst=None, count=None, until=None, bysetpos=None,
                 bymonth=None, bymonthday=None, byyearday=None, byeaster=None,
                 byweekno=None, byweekday=None,
                 byhour=None, byminute=None, bysecond=None,
                 cache=False):
        super(rrule, self).__init__(cache)
        global easter
        if not dtstart:
            dtstart = datetime.datetime.now().replace(microsecond=0)
        elif not isinstance(dtstart, datetime.datetime):
            dtstart = datetime.datetime.fromordinal(dtstart.toordinal())
        else:
            dtstart = dtstart.replace(microsecond=0)
        self._dtstart = dtstart
        self._tzinfo = dtstart.tzinfo
        self._freq = freq
        self._interval = interval
        self._count = count

        # Cache the original byxxx rules, if they are provided, as the _byxxx
        # attributes do not necessarily map to the inputs, and this can be
        # a problem in generating the strings. Only store things if they've
        # been supplied (the string retrieval will just use .get())
        self._original_rule = {}

        if until and not isinstance(until, datetime.datetime):
            until = datetime.datetime.fromordinal(until.toordinal())
        self._until = until

        if count is not None and until:
            warn("Using both 'count' and 'until' is inconsistent with RFC 2445"
                 " and has been deprecated in dateutil. Future versions will "
                 "raise an error.", DeprecationWarning)

        if wkst is None:
            self._wkst = calendar.firstweekday()
        elif isinstance(wkst, integer_types):
            self._wkst = wkst
        else:
            self._wkst = wkst.weekday

        if bysetpos is None:
            self._bysetpos = None
        elif isinstance(bysetpos, integer_types):
            if bysetpos == 0 or not (-366 <= bysetpos <= 366):
                raise ValueError("bysetpos must be between 1 and 366, "
                                 "or between -366 and -1")
            self._bysetpos = (bysetpos,)
        else:
            self._bysetpos = tuple(bysetpos)
            for pos in self._bysetpos:
                if pos == 0 or not (-366 <= pos <= 366):
                    raise ValueError("bysetpos must be between 1 and 366, "
                                     "or between -366 and -1")

        if self._bysetpos:
            self._original_rule['bysetpos'] = self._bysetpos

        if (byweekno is None and byyearday is None and bymonthday is None and
                byweekday is None and byeaster is None):
            if freq == YEARLY:
                if bymonth is None:
                    bymonth = dtstart.month
                    self._original_rule['bymonth'] = None
                bymonthday = dtstart.day
                self._original_rule['bymonthday'] = None
            elif freq == MONTHLY:
                bymonthday = dtstart.day
                self._original_rule['bymonthday'] = None
            elif freq == WEEKLY:
                byweekday = dtstart.weekday()
                self._original_rule['byweekday'] = None

        # bymonth
        if bymonth is None:
            self._bymonth = None
        else:
            if isinstance(bymonth, integer_types):
                bymonth = (bymonth,)

            self._bymonth = tuple(sorted(set(bymonth)))

            if 'bymonth' not in self._original_rule:
                self._original_rule['bymonth'] = self._bymonth

        # byyearday
        if byyearday is None:
            self._byyearday = None
        else:
            if isinstance(byyearday, integer_types):
                byyearday = (byyearday,)

            self._byyearday = tuple(sorted(set(byyearday)))
            self._original_rule['byyearday'] = self._byyearday

        # byeaster
        if byeaster is not None:
            if not easter:
                from dateutil import easter
            if isinstance(byeaster, integer_types):
                self._byeaster = (byeaster,)
            else:
                self._byeaster = tuple(sorted(byeaster))

            self._original_rule['byeaster'] = self._byeaster
        else:
            self._byeaster = None

        # bymonthday
        if bymonthday is None:
            self._bymonthday = ()
            self._bynmonthday = ()
        else:
            if isinstance(bymonthday, integer_types):
                bymonthday = (bymonthday,)

            bymonthday = set(bymonthday)            # Ensure it's unique

            self._bymonthday = tuple(sorted(x for x in bymonthday if x > 0))
            self._bynmonthday = tuple(sorted(x for x in bymonthday if x < 0))

            # Storing positive numbers first, then negative numbers
            if 'bymonthday' not in self._original_rule:
                self._original_rule['bymonthday'] = tuple(
                    itertools.chain(self._bymonthday, self._bynmonthday))

        # byweekno
        if byweekno is None:
            self._byweekno = None
        else:
            if isinstance(byweekno, integer_types):
                byweekno = (byweekno,)

            self._byweekno = tuple(sorted(set(byweekno)))

            self._original_rule['byweekno'] = self._byweekno

        # byweekday / bynweekday
        if byweekday is None:
            self._byweekday = None
            self._bynweekday = None
        else:
            # If it's one of the valid non-sequence types, convert to a
            # single-element sequence before the iterator that builds the
            # byweekday set.
            if isinstance(byweekday, integer_types) or hasattr(byweekday, "n"):
                byweekday = (byweekday,)

            self._byweekday = set()
            self._bynweekday = set()
            for wday in byweekday:
                if isinstance(wday, integer_types):
                    self._byweekday.add(wday)
                elif not wday.n or freq > MONTHLY:
                    self._byweekday.add(wday.weekday)
                else:
                    self._bynweekday.add((wday.weekday, wday.n))

            if not self._byweekday:
                self._byweekday = None
            elif not self._bynweekday:
                self._bynweekday = None

            if self._byweekday is not None:
                self._byweekday = tuple(sorted(self._byweekday))
                orig_byweekday = [weekday(x) for x in self._byweekday]
            else:
                orig_byweekday = tuple()

            if self._bynweekday is not None:
                self._bynweekday = tuple(sorted(self._bynweekday))
                orig_bynweekday = [weekday(*x) for x in self._bynweekday]
            else:
                orig_bynweekday = tuple()

            if 'byweekday' not in self._original_rule:
                self._original_rule['byweekday'] = tuple(itertools.chain(
                    orig_byweekday, orig_bynweekday))

        # byhour
        if byhour is None:
            if freq < HOURLY:
                self._byhour = set((dtstart.hour,))
            else:
                self._byhour = None
        else:
            if isinstance(byhour, integer_types):
                byhour = (byhour,)

            if freq == HOURLY:
                self._byhour = self.__construct_byset(start=dtstart.hour,
                                                      byxxx=byhour,
                                                      base=24)
            else:
                self._byhour = set(byhour)

            self._byhour = tuple(sorted(self._byhour))
            self._original_rule['byhour'] = self._byhour

        # byminute
        if byminute is None:
            if freq < MINUTELY:
                self._byminute = set((dtstart.minute,))
            else:
                self._byminute = None
        else:
            if isinstance(byminute, integer_types):
                byminute = (byminute,)

            if freq == MINUTELY:
                self._byminute = self.__construct_byset(start=dtstart.minute,
                                                        byxxx=byminute,
                                                        base=60)
            else:
                self._byminute = set(byminute)

            self._byminute = tuple(sorted(self._byminute))
            self._original_rule['byminute'] = self._byminute

        # bysecond
        if bysecond is None:
            if freq < SECONDLY:
                self._bysecond = ((dtstart.second,))
            else:
                self._bysecond = None
        else:
            if isinstance(bysecond, integer_types):
                bysecond = (bysecond,)

            self._bysecond = set(bysecond)

            if freq == SECONDLY:
                self._bysecond = self.__construct_byset(start=dtstart.second,
                                                        byxxx=bysecond,
                                                        base=60)
            else:
                self._bysecond = set(bysecond)

            self._bysecond = tuple(sorted(self._bysecond))
            self._original_rule['bysecond'] = self._bysecond

        if self._freq >= HOURLY:
            self._timeset = None
        else:
            self._timeset = []
            for hour in self._byhour:
                for minute in self._byminute:
                    for second in self._bysecond:
                        self._timeset.append(
                            datetime.time(hour, minute, second,
                                          tzinfo=self._tzinfo))
            self._timeset.sort()
            self._timeset = tuple(self._timeset)

    def __str__(self):
        """
        Output a string that would generate this RRULE if passed to rrulestr.
        This is mostly compatible with RFC2445, except for the
        dateutil-specific extension BYEASTER.
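
        For example, with the default week start (so no ``WKST`` component
        is emitted):

        >>> print(rrule(DAILY, count=3, dtstart=datetime.datetime(2020, 1, 1)))
        DTSTART:20200101T000000
        FREQ=DAILY;COUNT=3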
        """

        output = []
        h, m, s = [None] * 3
        if self._dtstart:
            output.append(self._dtstart.strftime('DTSTART:%Y%m%dT%H%M%S'))
            h, m, s = self._dtstart.timetuple()[3:6]

        parts = ['FREQ=' + FREQNAMES[self._freq]]
        if self._interval != 1:
            parts.append('INTERVAL=' + str(self._interval))

        if self._wkst:
            parts.append('WKST=' + repr(weekday(self._wkst))[0:2])

        if self._count is not None:
            parts.append('COUNT=' + str(self._count))

        if self._until:
            parts.append(self._until.strftime('UNTIL=%Y%m%dT%H%M%S'))

        if self._original_rule.get('byweekday') is not None:
            # The str() method on weekday objects doesn't generate
            # RFC2445-compliant strings, so they are converted here.
            original_rule = dict(self._original_rule)
            wday_strings = []
            for wday in original_rule['byweekday']:
                if wday.n:
                    wday_strings.append('{n:+d}{wday}'.format(
                        n=wday.n,
                        wday=repr(wday)[0:2]))
                else:
                    wday_strings.append(repr(wday))

            original_rule['byweekday'] = wday_strings
        else:
            original_rule = self._original_rule

        partfmt = '{name}={vals}'
        for name, key in [('BYSETPOS', 'bysetpos'),
                          ('BYMONTH', 'bymonth'),
                          ('BYMONTHDAY', 'bymonthday'),
                          ('BYYEARDAY', 'byyearday'),
                          ('BYWEEKNO', 'byweekno'),
                          ('BYDAY', 'byweekday'),
                          ('BYHOUR', 'byhour'),
                          ('BYMINUTE', 'byminute'),
                          ('BYSECOND', 'bysecond'),
                          ('BYEASTER', 'byeaster')]:
            value = original_rule.get(key)
            if value:
                parts.append(partfmt.format(name=name, vals=(','.join(str(v)
                                                             for v in value))))

        output.append(';'.join(parts))
        return '\n'.join(output)
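
    # Illustrative sketch (added; not part of the original module): the string
    # built above is meant to round-trip through ``rrulestr``.  Assuming a
    # plain weekly rule, the output would look roughly like:
    #
    #     >>> import datetime
    #     >>> from dateutil.rrule import rrule, WEEKLY, MO, WE
    #     >>> print(rrule(WEEKLY, byweekday=(MO, WE), count=4,
    #     ...             dtstart=datetime.datetime(2020, 1, 1)))
    #     DTSTART:20200101T000000
    #     FREQ=WEEKLY;COUNT=4;BYDAY=MO,WE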

    def replace(self, **kwargs):
        """Return new rrule with same attributes except for those attributes given new
           values by whichever keyword arguments are specified."""
        new_kwargs = {"interval": self._interval,
                      "count": self._count,
                      "dtstart": self._dtstart,
                      "freq": self._freq,
                      "until": self._until,
                      "wkst": self._wkst,
                      "cache": False if self._cache is None else True }
        new_kwargs.update(self._original_rule)
        new_kwargs.update(kwargs)
        return rrule(**new_kwargs)
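
    # Hypothetical usage sketch (added): because replace() rebuilds the rule
    # from the stored keyword arguments, something like
    #
    #     >>> weekly = rrule(WEEKLY, count=4,
    #     ...                dtstart=datetime.datetime(2020, 1, 1))
    #     >>> daily = weekly.replace(freq=DAILY)
    #
    # would be expected to keep count and dtstart while switching the frequency.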

    def _iter(self):
        year, month, day, hour, minute, second, weekday, yearday, _ = \
            self._dtstart.timetuple()

        # Some local variables to speed things up a bit
        freq = self._freq
        interval = self._interval
        wkst = self._wkst
        until = self._until
        bymonth = self._bymonth
        byweekno = self._byweekno
        byyearday = self._byyearday
        byweekday = self._byweekday
        byeaster = self._byeaster
        bymonthday = self._bymonthday
        bynmonthday = self._bynmonthday
        bysetpos = self._bysetpos
        byhour = self._byhour
        byminute = self._byminute
        bysecond = self._bysecond

        ii = _iterinfo(self)
        ii.rebuild(year, month)

        getdayset = {YEARLY: ii.ydayset,
                     MONTHLY: ii.mdayset,
                     WEEKLY: ii.wdayset,
                     DAILY: ii.ddayset,
                     HOURLY: ii.ddayset,
                     MINUTELY: ii.ddayset,
                     SECONDLY: ii.ddayset}[freq]

        if freq < HOURLY:
            timeset = self._timeset
        else:
            gettimeset = {HOURLY: ii.htimeset,
                          MINUTELY: ii.mtimeset,
                          SECONDLY: ii.stimeset}[freq]
            if ((freq >= HOURLY and
                 self._byhour and hour not in self._byhour) or
                (freq >= MINUTELY and
                 self._byminute and minute not in self._byminute) or
                (freq >= SECONDLY and
                 self._bysecond and second not in self._bysecond)):
                timeset = ()
            else:
                timeset = gettimeset(hour, minute, second)

        total = 0
        count = self._count
        while True:
            # Get dayset with the right frequency
            dayset, start, end = getdayset(year, month, day)

            # Do the "hard" work ;-)
            filtered = False
            for i in dayset[start:end]:
                if ((bymonth and ii.mmask[i] not in bymonth) or
                    (byweekno and not ii.wnomask[i]) or
                    (byweekday and ii.wdaymask[i] not in byweekday) or
                    (ii.nwdaymask and not ii.nwdaymask[i]) or
                    (byeaster and not ii.eastermask[i]) or
                    ((bymonthday or bynmonthday) and
                     ii.mdaymask[i] not in bymonthday and
                     ii.nmdaymask[i] not in bynmonthday) or
                    (byyearday and
                     ((i < ii.yearlen and i+1 not in byyearday and
                       -ii.yearlen+i not in byyearday) or
                      (i >= ii.yearlen and i+1-ii.yearlen not in byyearday and
                       -ii.nextyearlen+i-ii.yearlen not in byyearday)))):
                    dayset[i] = None
                    filtered = True

            # Output results
            if bysetpos and timeset:
                poslist = []
                for pos in bysetpos:
                    if pos < 0:
                        daypos, timepos = divmod(pos, len(timeset))
                    else:
                        daypos, timepos = divmod(pos-1, len(timeset))
                    try:
                        i = [x for x in dayset[start:end]
                             if x is not None][daypos]
                        time = timeset[timepos]
                    except IndexError:
                        pass
                    else:
                        date = datetime.date.fromordinal(ii.yearordinal+i)
                        res = datetime.datetime.combine(date, time)
                        if res not in poslist:
                            poslist.append(res)
                poslist.sort()
                for res in poslist:
                    if until and res > until:
                        self._len = total
                        return
                    elif res >= self._dtstart:
                        if count is not None:
                            count -= 1
                            if count < 0:
                                self._len = total
                                return
                        total += 1
                        yield res
            else:
                for i in dayset[start:end]:
                    if i is not None:
                        date = datetime.date.fromordinal(ii.yearordinal + i)
                        for time in timeset:
                            res = datetime.datetime.combine(date, time)
                            if until and res > until:
                                self._len = total
                                return
                            elif res >= self._dtstart:
                                if count is not None:
                                    count -= 1
                                    if count < 0:
                                        self._len = total
                                        return

                                total += 1
                                yield res

            # Handle frequency and interval
            fixday = False
            if freq == YEARLY:
                year += interval
                if year > datetime.MAXYEAR:
                    self._len = total
                    return
                ii.rebuild(year, month)
            elif freq == MONTHLY:
                month += interval
                if month > 12:
                    div, mod = divmod(month, 12)
                    month = mod
                    year += div
                    if month == 0:
                        month = 12
                        year -= 1
                    if year > datetime.MAXYEAR:
                        self._len = total
                        return
                ii.rebuild(year, month)
            elif freq == WEEKLY:
                if wkst > weekday:
                    day += -(weekday+1+(6-wkst))+self._interval*7
                else:
                    day += -(weekday-wkst)+self._interval*7
                weekday = wkst
                fixday = True
            elif freq == DAILY:
                day += interval
                fixday = True
            elif freq == HOURLY:
                if filtered:
                    # Jump to one iteration before next day
                    hour += ((23-hour)//interval)*interval

                if byhour:
                    ndays, hour = self.__mod_distance(value=hour,
                                                      byxxx=self._byhour,
                                                      base=24)
                else:
                    ndays, hour = divmod(hour+interval, 24)

                if ndays:
                    day += ndays
                    fixday = True

                timeset = gettimeset(hour, minute, second)
            elif freq == MINUTELY:
                if filtered:
                    # Jump to one iteration before next day
                    minute += ((1439-(hour*60+minute))//interval)*interval

                valid = False
                rep_rate = (24*60)
                for j in range(rep_rate // gcd(interval, rep_rate)):
                    if byminute:
                        nhours, minute = \
                            self.__mod_distance(value=minute,
                                                byxxx=self._byminute,
                                                base=60)
                    else:
                        nhours, minute = divmod(minute+interval, 60)

                    div, hour = divmod(hour+nhours, 24)
                    if div:
                        day += div
                        fixday = True
                        filtered = False

                    if not byhour or hour in byhour:
                        valid = True
                        break

                if not valid:
                    raise ValueError('Invalid combination of interval and ' +
                                     'byhour resulting in empty rule.')

                timeset = gettimeset(hour, minute, second)
            elif freq == SECONDLY:
                if filtered:
                    # Jump to one iteration before next day
                    second += (((86399 - (hour * 3600 + minute * 60 + second))
                                // interval) * interval)

                rep_rate = (24 * 3600)
                valid = False
                for j in range(0, rep_rate // gcd(interval, rep_rate)):
                    if bysecond:
                        nminutes, second = \
                            self.__mod_distance(value=second,
                                                byxxx=self._bysecond,
                                                base=60)
                    else:
                        nminutes, second = divmod(second+interval, 60)

                    div, minute = divmod(minute+nminutes, 60)
                    if div:
                        hour += div
                        div, hour = divmod(hour, 24)
                        if div:
                            day += div
                            fixday = True

                    if ((not byhour or hour in byhour) and
                            (not byminute or minute in byminute) and
                            (not bysecond or second in bysecond)):
                        valid = True
                        break

                if not valid:
                    raise ValueError('Invalid combination of interval, ' +
                                     'byhour and byminute resulting in empty' +
                                     ' rule.')

                timeset = gettimeset(hour, minute, second)

            if fixday and day > 28:
                daysinmonth = calendar.monthrange(year, month)[1]
                if day > daysinmonth:
                    while day > daysinmonth:
                        day -= daysinmonth
                        month += 1
                        if month == 13:
                            month = 1
                            year += 1
                            if year > datetime.MAXYEAR:
                                self._len = total
                                return
                        daysinmonth = calendar.monthrange(year, month)[1]
                    ii.rebuild(year, month)

    def __construct_byset(self, start, byxxx, base):
        """
        If a `BYXXX` sequence is passed to the constructor at the same level as
        `FREQ` (e.g. `FREQ=HOURLY,BYHOUR={2,4,7},INTERVAL=3`), there are some
        specifications which cannot be reached given some starting conditions.

        This occurs whenever the interval is not coprime with the base of a
        given unit and the difference between the starting position and a
        candidate value is not a multiple of the greatest common divisor of
        the interval and the base. For example, with a FREQ of hourly
        starting at 17:00 and an interval of 4, the only valid values for
        BYHOUR would be {21, 1, 5, 9, 13, 17}, because 4 and 24 are not
        coprime.

        :param start:
            Specifies the starting position.
        :param byxxx:
            An iterable containing the list of allowed values.
        :param base:
            The largest allowable value for the specified frequency (e.g.
            24 hours, 60 minutes).

        This does not preserve the type of the iterable; it returns a set,
        since the values should be unique and the order is irrelevant, and a
        set also speeds up later lookups.

        In the event of an empty set, raises a :exception:`ValueError`, as this
        results in an empty rrule.
        """

        cset = set()

        # Support a single byxxx value.
        if isinstance(byxxx, integer_types):
            byxxx = (byxxx, )

        for num in byxxx:
            i_gcd = gcd(self._interval, base)
            # Use divmod rather than % because we need to wrap negative nums.
            if i_gcd == 1 or divmod(num - start, i_gcd)[1] == 0:
                cset.add(num)

        if len(cset) == 0:
            raise ValueError("Invalid rrule byxxx generates an empty set.")

        return cset
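
    # Worked example (added for illustration): with interval=4 and base=24,
    # gcd(4, 24) == 4, so a candidate hour is kept only when (hour - start) is
    # a multiple of 4.  Starting at 17:00 this keeps exactly the values from
    # the docstring above:
    #
    #     >>> sorted(h for h in range(24) if (h - 17) % gcd(4, 24) == 0)
    #     [1, 5, 9, 13, 17, 21]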

    def __mod_distance(self, value, byxxx, base):
        """
        Calculates the next value in a sequence where the `FREQ` parameter is
        specified along with a `BYXXX` parameter at the same "level"
        (e.g. `HOURLY` specified with `BYHOUR`).

        :param value:
            The old value of the component.
        :param byxxx:
            The `BYXXX` set, which should have been generated by
            `rrule._construct_byset`, or something else which checks that a
            valid rule is present.
        :param base:
            The largest allowable value for the specified frequency (e.g.
            24 hours, 60 minutes).

        If no valid value is found within `base` iterations (the maximum
        number before the sequence would start to repeat), the loop simply
        exhausts; for sets built by `__construct_byset` this cannot happen,
        since unreachable values are rejected up front.

        This returns a tuple of `divmod(n*interval, base)`, where `n` is the
        smallest number of `interval` repetitions until the next specified
        value in `byxxx` is found.
        """
        accumulator = 0
        for ii in range(1, base + 1):
            # Using divmod() over % to account for negative intervals
            div, value = divmod(value + self._interval, base)
            accumulator += div
            if value in byxxx:
                return (accumulator, value)
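
    # Worked example (added for illustration, assuming an interval of 4):
    # starting from value=17 with byxxx={5, 13} and base=24, the loop visits
    # 21, then 1 (rolling the day counter once), then 5, which is in byxxx,
    # so it returns (1, 5): the next matching hour is 05:00 on the following
    # day, three interval steps later.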


class _iterinfo(object):
    __slots__ = ["rrule", "lastyear", "lastmonth",
                 "yearlen", "nextyearlen", "yearordinal", "yearweekday",
                 "mmask", "mrange", "mdaymask", "nmdaymask",
                 "wdaymask", "wnomask", "nwdaymask", "eastermask"]

    def __init__(self, rrule):
        for attr in self.__slots__:
            setattr(self, attr, None)
        self.rrule = rrule

    def rebuild(self, year, month):
        # Every mask is 7 days longer to handle cross-year weekly periods.
        rr = self.rrule
        if year != self.lastyear:
            self.yearlen = 365 + calendar.isleap(year)
            self.nextyearlen = 365 + calendar.isleap(year + 1)
            firstyday = datetime.date(year, 1, 1)
            self.yearordinal = firstyday.toordinal()
            self.yearweekday = firstyday.weekday()

            wday = datetime.date(year, 1, 1).weekday()
            if self.yearlen == 365:
                self.mmask = M365MASK
                self.mdaymask = MDAY365MASK
                self.nmdaymask = NMDAY365MASK
                self.wdaymask = WDAYMASK[wday:]
                self.mrange = M365RANGE
            else:
                self.mmask = M366MASK
                self.mdaymask = MDAY366MASK
                self.nmdaymask = NMDAY366MASK
                self.wdaymask = WDAYMASK[wday:]
                self.mrange = M366RANGE

            if not rr._byweekno:
                self.wnomask = None
            else:
                self.wnomask = [0]*(self.yearlen+7)
                # no1wkst = firstwkst = self.wdaymask.index(rr._wkst)
                no1wkst = firstwkst = (7-self.yearweekday+rr._wkst) % 7
                if no1wkst >= 4:
                    no1wkst = 0
                    # Number of days in the year, plus the days we got
                    # from last year.
                    wyearlen = self.yearlen+(self.yearweekday-rr._wkst) % 7
                else:
                    # Number of days in the year, minus the days we
                    # left in last year.
                    wyearlen = self.yearlen-no1wkst
                div, mod = divmod(wyearlen, 7)
                numweeks = div+mod//4
                for n in rr._byweekno:
                    if n < 0:
                        n += numweeks+1
                    if not (0 < n <= numweeks):
                        continue
                    if n > 1:
                        i = no1wkst+(n-1)*7
                        if no1wkst != firstwkst:
                            i -= 7-firstwkst
                    else:
                        i = no1wkst
                    for j in range(7):
                        self.wnomask[i] = 1
                        i += 1
                        if self.wdaymask[i] == rr._wkst:
                            break
                if 1 in rr._byweekno:
                    # Check week number 1 of next year as well
                    # TODO: Check -numweeks for next year.
                    i = no1wkst+numweeks*7
                    if no1wkst != firstwkst:
                        i -= 7-firstwkst
                    if i < self.yearlen:
                        # If week starts in next year, we
                        # don't care about it.
                        for j in range(7):
                            self.wnomask[i] = 1
                            i += 1
                            if self.wdaymask[i] == rr._wkst:
                                break
                if no1wkst:
                    # Check last week number of last year as
                    # well. If no1wkst is 0, either the year
                    # started on week start, or week number 1
                    # got days from last year, so there are no
                    # days from last year's last week number in
                    # this year.
                    if -1 not in rr._byweekno:
                        lyearweekday = datetime.date(year-1, 1, 1).weekday()
                        lno1wkst = (7-lyearweekday+rr._wkst) % 7
                        lyearlen = 365+calendar.isleap(year-1)
                        if lno1wkst >= 4:
                            lno1wkst = 0
                            lnumweeks = 52+(lyearlen +
                                            (lyearweekday-rr._wkst) % 7) % 7//4
                        else:
                            lnumweeks = 52+(self.yearlen-no1wkst) % 7//4
                    else:
                        lnumweeks = -1
                    if lnumweeks in rr._byweekno:
                        for i in range(no1wkst):
                            self.wnomask[i] = 1

        if (rr._bynweekday and (month != self.lastmonth or
                                year != self.lastyear)):
            ranges = []
            if rr._freq == YEARLY:
                if rr._bymonth:
                    for month in rr._bymonth:
                        ranges.append(self.mrange[month-1:month+1])
                else:
                    ranges = [(0, self.yearlen)]
            elif rr._freq == MONTHLY:
                ranges = [self.mrange[month-1:month+1]]
            if ranges:
                # Weekly frequency won't get here, so we don't need to
                # care about cross-year weekly periods.
                self.nwdaymask = [0]*self.yearlen
                for first, last in ranges:
                    last -= 1
                    for wday, n in rr._bynweekday:
                        if n < 0:
                            i = last+(n+1)*7
                            i -= (self.wdaymask[i]-wday) % 7
                        else:
                            i = first+(n-1)*7
                            i += (7-self.wdaymask[i]+wday) % 7
                        if first <= i <= last:
                            self.nwdaymask[i] = 1

        if rr._byeaster:
            self.eastermask = [0]*(self.yearlen+7)
            eyday = easter.easter(year).toordinal()-self.yearordinal
            for offset in rr._byeaster:
                self.eastermask[eyday+offset] = 1

        self.lastyear = year
        self.lastmonth = month

    def ydayset(self, year, month, day):
        return list(range(self.yearlen)), 0, self.yearlen

    def mdayset(self, year, month, day):
        dset = [None]*self.yearlen
        start, end = self.mrange[month-1:month+1]
        for i in range(start, end):
            dset[i] = i
        return dset, start, end

    def wdayset(self, year, month, day):
        # We need to handle cross-year weeks here.
        dset = [None]*(self.yearlen+7)
        i = datetime.date(year, month, day).toordinal()-self.yearordinal
        start = i
        for j in range(7):
            dset[i] = i
            i += 1
            # if (not (0 <= i < self.yearlen) or
            #    self.wdaymask[i] == self.rrule._wkst):
            # This will cross the year boundary, if necessary.
            if self.wdaymask[i] == self.rrule._wkst:
                break
        return dset, start, i

    def ddayset(self, year, month, day):
        dset = [None] * self.yearlen
        i = datetime.date(year, month, day).toordinal() - self.yearordinal
        dset[i] = i
        return dset, i, i + 1

    def htimeset(self, hour, minute, second):
        tset = []
        rr = self.rrule
        for minute in rr._byminute:
            for second in rr._bysecond:
                tset.append(datetime.time(hour, minute, second,
                                          tzinfo=rr._tzinfo))
        tset.sort()
        return tset

    def mtimeset(self, hour, minute, second):
        tset = []
        rr = self.rrule
        for second in rr._bysecond:
            tset.append(datetime.time(hour, minute, second, tzinfo=rr._tzinfo))
        tset.sort()
        return tset

    def stimeset(self, hour, minute, second):
        return (datetime.time(hour, minute, second,
                tzinfo=self.rrule._tzinfo),)
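
    # Added note: the three *timeset helpers above build the candidate times
    # within a single day for HOURLY, MINUTELY and SECONDLY rules respectively,
    # mirroring the precomputed rrule._timeset used for coarser frequencies.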


class rruleset(rrulebase):
    """ The rruleset type allows more complex recurrence setups, mixing
    multiple rules, dates, exclusion rules, and exclusion dates. The type
    constructor takes the following keyword arguments:

    :param cache: If True, caching of results will be enabled, improving
                  performance of multiple queries considerably. """

    class _genitem(object):
        def __init__(self, genlist, gen):
            try:
                self.dt = advance_iterator(gen)
                genlist.append(self)
            except StopIteration:
                pass
            self.genlist = genlist
            self.gen = gen

        def __next__(self):
            try:
                self.dt = advance_iterator(self.gen)
            except StopIteration:
                if self.genlist[0] is self:
                    heapq.heappop(self.genlist)
                else:
                    self.genlist.remove(self)
                    heapq.heapify(self.genlist)

        next = __next__

        def __lt__(self, other):
            return self.dt < other.dt

        def __gt__(self, other):
            return self.dt > other.dt

        def __eq__(self, other):
            return self.dt == other.dt

        def __ne__(self, other):
            return self.dt != other.dt

    def __init__(self, cache=False):
        super(rruleset, self).__init__(cache)
        self._rrule = []
        self._rdate = []
        self._exrule = []
        self._exdate = []

    @_invalidates_cache
    def rrule(self, rrule):
        """ Include the given :py:class:`rrule` instance in the recurrence set
            generation. """
        self._rrule.append(rrule)

    @_invalidates_cache
    def rdate(self, rdate):
        """ Include the given :py:class:`datetime` instance in the recurrence
            set generation. """
        self._rdate.append(rdate)

    @_invalidates_cache
    def exrule(self, exrule):
        """ Include the given rrule instance in the recurrence set exclusion
            list. Dates which are part of the given recurrence rules will not
            be generated, even if some inclusive rrule or rdate matches them.
        """
        self._exrule.append(exrule)

    @_invalidates_cache
    def exdate(self, exdate):
        """ Include the given datetime instance in the recurrence set
            exclusion list. Dates included that way will not be generated,
            even if some inclusive rrule or rdate matches them. """
        self._exdate.append(exdate)
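
    # Hypothetical usage sketch (added): a set that recurs daily but skips one
    # explicit date might be built as
    #
    #     >>> rs = rruleset()
    #     >>> rs.rrule(rrule(DAILY, count=5,
    #     ...                dtstart=datetime.datetime(2020, 1, 1)))
    #     >>> rs.exdate(datetime.datetime(2020, 1, 3))
    #     >>> len(list(rs))
    #     4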

    def _iter(self):
        rlist = []
        self._rdate.sort()
        self._genitem(rlist, iter(self._rdate))
        for gen in [iter(x) for x in self._rrule]:
            self._genitem(rlist, gen)
        exlist = []
        self._exdate.sort()
        self._genitem(exlist, iter(self._exdate))
        for gen in [iter(x) for x in self._exrule]:
            self._genitem(exlist, gen)
        lastdt = None
        total = 0
        heapq.heapify(rlist)
        heapq.heapify(exlist)
        while rlist:
            ritem = rlist[0]
            if not lastdt or lastdt != ritem.dt:
                while exlist and exlist[0] < ritem:
                    exitem = exlist[0]
                    advance_iterator(exitem)
                    if exlist and exlist[0] is exitem:
                        heapq.heapreplace(exlist, exitem)
                if not exlist or ritem != exlist[0]:
                    total += 1
                    yield ritem.dt
                lastdt = ritem.dt
            advance_iterator(ritem)
            if rlist and rlist[0] is ritem:
                heapq.heapreplace(rlist, ritem)
        self._len = total
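
    # Added note: _iter() above is essentially a heap-based merge of the sorted
    # rdate/rrule streams, with a second heap of exrule/exdate streams consumed
    # in lockstep so that any datetime also present in an exclusion stream is
    # skipped, and consecutive duplicate datetimes are emitted only once.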


class _rrulestr(object):

    _freq_map = {"YEARLY": YEARLY,
                 "MONTHLY": MONTHLY,
                 "WEEKLY": WEEKLY,
                 "DAILY": DAILY,
                 "HOURLY": HOURLY,
                 "MINUTELY": MINUTELY,
                 "SECONDLY": SECONDLY}

    _weekday_map = {"MO": 0, "TU": 1, "WE": 2, "TH": 3,
                    "FR": 4, "SA": 5, "SU": 6}

    def _handle_int(self, rrkwargs, name, value, **kwargs):
        rrkwargs[name.lower()] = int(value)

    def _handle_int_list(self, rrkwargs, name, value, **kwargs):
        rrkwargs[name.lower()] = [int(x) for x in value.split(',')]

    _handle_INTERVAL = _handle_int
    _handle_COUNT = _handle_int
    _handle_BYSETPOS = _handle_int_list
    _handle_BYMONTH = _handle_int_list
    _handle_BYMONTHDAY = _handle_int_list
    _handle_BYYEARDAY = _handle_int_list
    _handle_BYEASTER = _handle_int_list
    _handle_BYWEEKNO = _handle_int_list
    _handle_BYHOUR = _handle_int_list
    _handle_BYMINUTE = _handle_int_list
    _handle_BYSECOND = _handle_int_list

    def _handle_FREQ(self, rrkwargs, name, value, **kwargs):
        rrkwargs["freq"] = self._freq_map[value]

    def _handle_UNTIL(self, rrkwargs, name, value, **kwargs):
        global parser
        if not parser:
            from dateutil import parser
        try:
            rrkwargs["until"] = parser.parse(value,
                                             ignoretz=kwargs.get("ignoretz"),
                                             tzinfos=kwargs.get("tzinfos"))
        except ValueError:
            raise ValueError("invalid until date")

    def _handle_WKST(self, rrkwargs, name, value, **kwargs):
        rrkwargs["wkst"] = self._weekday_map[value]

    def _handle_BYWEEKDAY(self, rrkwargs, name, value, **kwargs):
        """
        Two ways to specify this: +1MO or MO(+1)
        """
        l = []
        for wday in value.split(','):
            if '(' in wday:
                # If it's of the form TH(+1), etc.
                splt = wday.split('(')
                w = splt[0]
                n = int(splt[1][:-1])
            elif len(wday):
                # If it's of the form +1MO
                for i in range(len(wday)):
                    if wday[i] not in '+-0123456789':
                        break
                n = wday[:i] or None
                w = wday[i:]
                if n:
                    n = int(n)
            else:
                raise ValueError("Invalid (empty) BYDAY specification.")

            l.append(weekdays[self._weekday_map[w]](n))
        rrkwargs["byweekday"] = l

    _handle_BYDAY = _handle_BYWEEKDAY
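
    # Illustrative note (added): both spellings accepted above end up as the
    # same weekday instance, e.g.
    #
    #     "BYDAY=+1MO"   -> weekdays[0](1)   # i.e. MO(+1)
    #     "BYDAY=MO(+1)" -> weekdays[0](1)
    #
    # before being handed to rrule(byweekday=...).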

    def _parse_rfc_rrule(self, line,
                         dtstart=None,
                         cache=False,
                         ignoretz=False,
                         tzinfos=None):
        if line.find(':') != -1:
            name, value = line.split(':')
            if name != "RRULE":
                raise ValueError("unknown parameter name")
        else:
            value = line
        rrkwargs = {}
        for pair in value.split(';'):
            name, value = pair.split('=')
            name = name.upper()
            value = value.upper()
            try:
                getattr(self, "_handle_"+name)(rrkwargs, name, value,
                                               ignoretz=ignoretz,
                                               tzinfos=tzinfos)
            except AttributeError:
                raise ValueError("unknown parameter '%s'" % name)
            except (KeyError, ValueError):
                raise ValueError("invalid '%s': %s" % (name, value))
        return rrule(dtstart=dtstart, cache=cache, **rrkwargs)

    def _parse_rfc(self, s,
                   dtstart=None,
                   cache=False,
                   unfold=False,
                   forceset=False,
                   compatible=False,
                   ignoretz=False,
                   tzinfos=None):
        global parser
        if compatible:
            forceset = True
            unfold = True
        s = s.upper()
        if not s.strip():
            raise ValueError("empty string")
        if unfold:
            lines = s.splitlines()
            i = 0
            while i < len(lines):
                line = lines[i].rstrip()
                if not line:
                    del lines[i]
                elif i > 0 and line[0] == " ":
                    lines[i-1] += line[1:]
                    del lines[i]
                else:
                    i += 1
        else:
            lines = s.split()
        if (not forceset and len(lines) == 1 and (s.find(':') == -1 or
                                                  s.startswith('RRULE:'))):
            return self._parse_rfc_rrule(lines[0], cache=cache,
                                         dtstart=dtstart, ignoretz=ignoretz,
                                         tzinfos=tzinfos)
        else:
            rrulevals = []
            rdatevals = []
            exrulevals = []
            exdatevals = []
            for line in lines:
                if not line:
                    continue
                if line.find(':') == -1:
                    name = "RRULE"
                    value = line
                else:
                    name, value = line.split(':', 1)
                parms = name.split(';')
                if not parms:
                    raise ValueError("empty property name")
                name = parms[0]
                parms = parms[1:]
                if name == "RRULE":
                    for parm in parms:
                        raise ValueError("unsupported RRULE parm: "+parm)
                    rrulevals.append(value)
                elif name == "RDATE":
                    for parm in parms:
                        if parm != "VALUE=DATE-TIME":
                            raise ValueError("unsupported RDATE parm: "+parm)
                    rdatevals.append(value)
                elif name == "EXRULE":
                    for parm in parms:
                        raise ValueError("unsupported EXRULE parm: "+parm)
                    exrulevals.append(value)
                elif name == "EXDATE":
                    for parm in parms:
                        if parm != "VALUE=DATE-TIME":
                            raise ValueError("unsupported EXDATE parm: "+parm)
                    exdatevals.append(value)
                elif name == "DTSTART":
                    for parm in parms:
                        raise ValueError("unsupported DTSTART parm: "+parm)
                    if not parser:
                        from dateutil import parser
                    dtstart = parser.parse(value, ignoretz=ignoretz,
                                           tzinfos=tzinfos)
                else:
                    raise ValueError("unsupported property: "+name)
            if (forceset or len(rrulevals) > 1 or rdatevals
                    or exrulevals or exdatevals):
                if not parser and (rdatevals or exdatevals):
                    from dateutil import parser
                rset = rruleset(cache=cache)
                for value in rrulevals:
                    rset.rrule(self._parse_rfc_rrule(value, dtstart=dtstart,
                                                     ignoretz=ignoretz,
                                                     tzinfos=tzinfos))
                for value in rdatevals:
                    for datestr in value.split(','):
                        rset.rdate(parser.parse(datestr,
                                                ignoretz=ignoretz,
                                                tzinfos=tzinfos))
                for value in exrulevals:
                    rset.exrule(self._parse_rfc_rrule(value, dtstart=dtstart,
                                                      ignoretz=ignoretz,
                                                      tzinfos=tzinfos))
                for value in exdatevals:
                    for datestr in value.split(','):
                        rset.exdate(parser.parse(datestr,
                                                 ignoretz=ignoretz,
                                                 tzinfos=tzinfos))
                if compatible and dtstart:
                    rset.rdate(dtstart)
                return rset
            else:
                return self._parse_rfc_rrule(rrulevals[0],
                                             dtstart=dtstart,
                                             cache=cache,
                                             ignoretz=ignoretz,
                                             tzinfos=tzinfos)

    def __call__(self, s, **kwargs):
        return self._parse_rfc(s, **kwargs)


rrulestr = _rrulestr()
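
# Hypothetical usage sketch (added; not part of the original module): rrulestr
# parses an iCalendar-style string back into an rrule or rruleset, e.g.
#
#     >>> rule = rrulestr("DTSTART:20200101T000000\nRRULE:FREQ=DAILY;COUNT=3")
#     >>> list(rule)
#     [datetime.datetime(2020, 1, 1, 0, 0),
#      datetime.datetime(2020, 1, 2, 0, 0),
#      datetime.datetime(2020, 1, 3, 0, 0)]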

# vim:ts=4:sw=4:et
site-packages/dateutil/__pycache__/rrule.cpython-36.pyc000064400000117164147511334560017111 0ustar003

6�cY���@sJdZddlZddlZddlZddlZyddlmZWn ek
rTddlmZYnXddl	m
Z
mZddlm
Z
mZddlZddlmZddlmZd	d
ddd
ddddddddddddgZedgddgddgddgdd gdd!gdd"gdd#gdd$gdd%gdd&gdd'gddgd"�Zee�Zeedd��eedd��eedd(��ZZZeeeeeeeeeeeeeedd"��Zee�ZeedNd��eedOd��eedPd��ZZZeeeeeeeeeeeeeedd"��Zee�Z dQZ!dRZ"dddddd d!gd?Z#[[[ed4=ed4=e d=ee�Zee�Zdd
dddddgZ$eed"��\Z%Z&Z'Z(Z)Z*Z+da,da-Gd@dA�dAe�ZedBdC�ed"�D��\Z.Z/Z0Z1Z2Z3Z4Z5dDdE�Z6GdFdG�dGe7�Z8GdHd	�d	e8�Z9GdIdJ�dJe7�Z:GdKd
�d
e8�Z;GdLdM�dMe7�Z<e<�Z=dS)Sz�
The rrule module offers a small, complete, and very fast, implementation of
the recurrence rules documented in the
`iCalendar RFC <http://www.ietf.org/rfc/rfc2445.txt>`_,
including support for caching of results.
�N)�gcd)�advance_iterator�
integer_types)�_thread�range�)�weekday)�warn�rrule�rruleset�rrulestr�YEARLY�MONTHLY�WEEKLY�DAILY�HOURLY�MINUTELY�SECONDLY�MO�TU�WE�TH�FR�SA�SU�����������	�
��� �<�[�y��������1�O�n�;�Z�x��������0�N�m�7cs"eZdZdZd�fdd�	Z�ZS)rz7
    This version of weekday does not allow n = 0.
    Ncs&|dkrtd��tt|�j||�dS)NrzCan't create weekday with n==0)�
ValueError�superr�__init__)�selfZwkday�n)�	__class__��/usr/lib/python3.6/rrule.pyrCDszweekday.__init__)N)�__name__�
__module__�__qualname__�__doc__rC�
__classcell__rGrG)rFrHr@srccs|]}t|�VqdS)N)r)�.0�xrGrGrH�	<genexpr>KsrPcs�fdd�}|S)zT
    Decorator for rruleset methods which may invalidate the
    cached length.
    cs�|f|�|�}|j�|S)N)�_invalidate_cache)rD�args�kwargs�rv)�frGrH�
inner_funcSsz&_invalidates_cache.<locals>.inner_funcrG)rUrVrG)rUrH�_invalidates_cacheNsrWc@sneZdZddd�Zdd�Zdd�Zdd	�Zd
d�Zdd
�Zdd�Z	ddd�Z
ddd�Zddd�Zddd�Z
dS)�	rrulebaseFcCs4|rg|_tj�|_|j�nd|_d|_d|_dS)NF)�_cacher�
allocate_lock�_cache_lockrQ�_cache_complete�_len)rD�cacherGrGrHrC\s

zrrulebase.__init__cCs.|jrt|j�S|jdkr"|j�S|j�SdS)N)r\�iterrY�_iter�_iter_cached)rDrGrGrH�__iter__fs


zrrulebase.__iter__cCs>|jdk	r4g|_d|_|j�|_|jj�r4|jj�d|_dS)NF)rYr\r`�
_cache_genr[�locked�releaser])rDrGrGrHrQns



zrrulebase._invalidate_cacheccs�d}|j}|j}|jj}|jj}x�|r�|t|�kr�|�|jr@Py$xtd�D]}|jt	|��qLWWn&t
k
r�d|_}d|_PYnX|�||V|d7}q"Wx ||jkr�||V|d7}q�WdS)Nrr&Tr)rcrYr[�acquirere�lenr\r�appendr�
StopIterationr])rD�i�genr^rfre�jrGrGrHrays.


zrrulebase._iter_cachedcCs�|jr|j|St|t�rd|jr:|jdkr:tt|��|Sttj||j	pJd|j
pTtj|jp\d��Sn`|dkr�t|�}y"xt
|d�D]}t|�}q�WWntk
r�t�YnX|Stt|��|SdS)Nrr)r\rY�
isinstance�slice�step�listr_�	itertools�islice�start�stop�sys�maxsizerrri�
IndexError)rD�itemrkrj�resrGrGrH�__getitem__�s$



zrrulebase.__getitem__cCs:|jr||jkSx$|D]}||kr&dS||krdSqWdS)NTF)r\rY)rDrxrjrGrGrH�__contains__�s

zrrulebase.__contains__cCs|jdkrx|D]}qW|jS)z� Returns the number of recurrences in this set. It will have go
            trough the whole recurrence, if this hasn't been done before. N)r])rDrOrGrGrH�count�s

zrrulebase.countcCsX|jr|j}n|}d}|r8x8|D]}||kr.P|}q Wnx|D]}||krLP|}q>W|S)z� Returns the last recurrence before the given datetime instance. The
            inc keyword defines what happens if dt is an occurrence. With
            inc=True, if dt itself is an occurrence, it will be returned. N)r\rY)rD�dt�incrk�lastrjrGrGrH�before�s


zrrulebase.beforecCsP|jr|j}n|}|r2x4|D]}||kr|SqWnx|D]}||kr8|Sq8WdS)z� Returns the first recurrence after the given datetime instance. The
            inc keyword defines what happens if dt is an occurrence. With
            inc=True, if dt itself is an occurrence, it will be returned.  N)r\rY)rDr}r~rkrjrGrGrH�after�s


zrrulebase.afterNccsh|jr|j}n|}|r dd�}ndd�}d}x6|D].}|||�r2|dk	rZ|d7}||krZP|Vq2WdS)aH
        Generator which yields up to `count` recurrences after the given
        datetime instance, equivalent to `after`.

        :param dt:
            The datetime at which to start generating recurrences.

        :param count:
            The maximum number of recurrences to generate. If `None` (default),
            dates are generated until the recurrence rule is exhausted.

        :param inc:
            If `dt` is an instance of the rule and `inc` is `True`, it is
            included in the output.

        :yields: Yields a sequence of `datetime` objects.
        cSs||kS)NrG)�dc�dtcrGrGrH�<lambda>�sz"rrulebase.xafter.<locals>.<lambda>cSs||kS)NrG)r�r�rGrGrHr��srNr)r\rY)rDr}r|r~rk�comprE�drGrGrH�xafter�s


zrrulebase.xafterrc	Cs�|jr|j}n|}d}g}|r`x�|D]6}||kr4Pq$|sP||krZd}|j|�q$|j|�q$Wn@x>|D]6}||krvPqf|s�||kr�d}|j|�qf|j|�qfW|S)a Returns all the occurrences of the rrule between after and before.
        The inc keyword defines what happens if after and/or before are
        themselves occurrences. With inc=True, they will be included in the
        list, if they are found in the recurrence set. FT)r\rYrh)	rDr�r�r~r|rkZstarted�lrjrGrGrH�betweens.

zrrulebase.between)F)F)F)NF)Fr)rIrJrKrCrbrQrarzr{r|r�r�r�r�rGrGrGrHrX[s




)rXcsJeZdZdZd�fdd�	Zdd�Zd	d
�Zdd�Zd
d�Zdd�Z	�Z
S)r
a|
    That's the base of the rrule operation. It accepts all the keywords
    defined in the RFC as its constructor parameters (except byday,
    which was renamed to byweekday) and more. The constructor prototype is::

            rrule(freq)

    Where freq must be one of YEARLY, MONTHLY, WEEKLY, DAILY, HOURLY, MINUTELY,
    or SECONDLY.

    .. note::
        Per RFC section 3.3.10, recurrence instances falling on invalid dates
        and times are ignored rather than coerced:

            Recurrence rules may generate recurrence instances with an invalid
            date (e.g., February 30) or nonexistent local time (e.g., 1:30 AM
            on a day where the local time is moved forward by an hour at 1:00
            AM).  Such recurrence instances MUST be ignored and MUST NOT be
            counted as part of the recurrence set.

        This can lead to possibly surprising behavior when, for example, the
        start date occurs at the end of the month:

        >>> from dateutil.rrule import rrule, MONTHLY
        >>> from datetime import datetime
        >>> start_date = datetime(2014, 12, 31)
        >>> list(rrule(freq=MONTHLY, count=4, dtstart=start_date))
        ... # doctest: +NORMALIZE_WHITESPACE
        [datetime.datetime(2014, 12, 31, 0, 0),
         datetime.datetime(2015, 1, 31, 0, 0),
         datetime.datetime(2015, 3, 31, 0, 0),
         datetime.datetime(2015, 5, 31, 0, 0)]

    Additionally, it supports the following keyword arguments:

    :param cache:
        If given, it must be a boolean value specifying to enable or disable
        caching of results. If you will use the same rrule instance multiple
        times, enabling caching will improve the performance considerably.
    :param dtstart:
        The recurrence start. Besides being the base for the recurrence,
        missing parameters in the final recurrence instances will also be
        extracted from this date. If not given, datetime.now() will be used
        instead.
    :param interval:
        The interval between each freq iteration. For example, when using
        YEARLY, an interval of 2 means once every two years, but with HOURLY,
        it means once every two hours. The default interval is 1.
    :param wkst:
        The week start day. Must be one of the MO, TU, WE constants, or an
        integer, specifying the first day of the week. This will affect
        recurrences based on weekly periods. The default week start is got
        from calendar.firstweekday(), and may be modified by
        calendar.setfirstweekday().
    :param count:
        How many occurrences will be generated.

        .. note::
            As of version 2.5.0, the use of the ``until`` keyword together
            with the ``count`` keyword is deprecated per RFC-2445 Sec. 4.3.10.
    :param until:
        If given, this must be a datetime instance, that will specify the
        limit of the recurrence. The last recurrence in the rule is the greatest
        datetime that is less than or equal to the value specified in the
        ``until`` parameter.

        .. note::
            As of version 2.5.0, the use of the ``until`` keyword together
            with the ``count`` keyword is deprecated per RFC-2445 Sec. 4.3.10.
    :param bysetpos:
        If given, it must be either an integer, or a sequence of integers,
        positive or negative. Each given integer will specify an occurrence
        number, corresponding to the nth occurrence of the rule inside the
        frequency period. For example, a bysetpos of -1 if combined with a
        MONTHLY frequency, and a byweekday of (MO, TU, WE, TH, FR), will
        result in the last work day of every month.
    :param bymonth:
        If given, it must be either an integer, or a sequence of integers,
        meaning the months to apply the recurrence to.
    :param bymonthday:
        If given, it must be either an integer, or a sequence of integers,
        meaning the month days to apply the recurrence to.
    :param byyearday:
        If given, it must be either an integer, or a sequence of integers,
        meaning the year days to apply the recurrence to.
    :param byweekno:
        If given, it must be either an integer, or a sequence of integers,
        meaning the week numbers to apply the recurrence to. Week numbers
        have the meaning described in ISO8601, that is, the first week of
        the year is that containing at least four days of the new year.
    :param byweekday:
        If given, it must be either an integer (0 == MO), a sequence of
        integers, one of the weekday constants (MO, TU, etc), or a sequence
        of these constants. When given, these variables will define the
        weekdays where the recurrence will be applied. It's also possible to
        use an argument n for the weekday instances, which will mean the nth
        occurrence of this weekday in the period. For example, with MONTHLY,
        or with YEARLY and BYMONTH, using FR(+1) in byweekday will specify the
        first friday of the month where the recurrence happens. Notice that in
        the RFC documentation, this is specified as BYDAY, but was renamed to
        avoid the ambiguity of that keyword.
    :param byhour:
        If given, it must be either an integer, or a sequence of integers,
        meaning the hours to apply the recurrence to.
    :param byminute:
        If given, it must be either an integer, or a sequence of integers,
        meaning the minutes to apply the recurrence to.
    :param bysecond:
        If given, it must be either an integer, or a sequence of integers,
        meaning the seconds to apply the recurrence to.
    :param byeaster:
        If given, it must be either an integer, or a sequence of integers,
        positive or negative. Each integer will define an offset from the
        Easter Sunday. Passing the offset 0 to byeaster will yield the Easter
        Sunday itself. This is an extension to the RFC specification.
     NrFc
sRtt|�j|�|s(tjj�jdd�}n*t|tj�sFtjj|j��}n|jdd�}||_	|j
|_||_||_
||_i|_|r�t|tj�r�tjj|j��}||_|dk	r�|r�tdt�|dkr�tj�|_nt|t�r�||_n|j|_|dkr�d|_n�t|t��r:|dk�s(d|k�odkn�r0td��|f|_nLt|�|_x@|jD]6}|dk�sxd|k�ondkn�rLtd���qLW|j�r�|j|jd<|dk�r:|
dk�r:|	dk�r:|
dk�r:|dk�r:|tk�r|dk�r�|j}d|jd<|j}	d|jd<n8|tk�r|j}	d|jd<n|tk�r:|j�}
d|jd	<|dk�rLd|_ n<t|t��r^|f}tt!t"|���|_ d|jk�r�|j |jd<|
dk�r�d|_#n0t|
t��r�|
f}
tt!t"|
���|_#|j#|jd
<|dk	�rt$�s�ddl%m$a$t|t��r�|f|_&ntt!|��|_&|j&|jd<nd|_&|	dk�r6f|_'f|_(npt|	t��rH|	f}	t"|	�}	tt!d
d�|	D���|_'tt!dd�|	D���|_(d|jk�r�tt)j*|j'|j(��|jd<|dk�r�d|_+n0t|t��r�|f}tt!t"|���|_+|j+|jd<|
dk�rd|_,d|_-�n8t|
t��st.|
d��r |
f}
t"�|_,t"�|_-x`|
D]X}t|t��rT|j,j/|�n8|j0�sh|tk�rx|j,j/|j�n|j-j/|j|j0f��q6W|j,�s�d|_,n|j-�s�d|_-|j,dk	�r�tt!|j,��|_,dd�|j,D�}nt�}|j-dk	�rtt!|j-��|_-dd�|j-D�}nt�}d	|jk�r:tt)j*||��|jd	<|dk�rf|t1k�r^t"|j2f�|_3nd|_3nXt|t��rx|f}|t1k�r�|j4|j2|dd�|_3n
t"|�|_3tt!|j3��|_3|j3|jd<|dk�r�|t5k�r�t"|j6f�|_7nd|_7nXt|t��r�|f}|t5k�r|j4|j6|dd�|_7n
t"|�|_7tt!|j7��|_7|j7|jd<|dk�rj|t8k�rb|j9f|_:nd|_:nbt|t��r||f}t"|�|_:|t8k�r�|j4|j9|dd�|_:n
t"|�|_:tt!|j:��|_:|j:|jd<|jt1k�r�d|_;nng|_;xP|j3D]F}x>|j7D]4}x,|j:D]"}|j;j<tj=||||jd���qW�q�W�q�W|j;j>�t|j;�|_;dS)Nr)Zmicrosecondz�Using both 'count' and 'until' is inconsistent with RFC 2445 and has been deprecated in dateutil. Future versions will raise an error.inz:bysetpos must be between 1 and 366, or between -366 and -1�bysetpos�bymonth�
bymonthday�	byweekday�	byyearday)�easter�byeastercss|]}|dkr|VqdS)rNrG)rNrOrGrGrHrPsz!rrule.__init__.<locals>.<genexpr>css|]}|dkr|VqdS)rNrG)rNrOrGrGrHrPs�byweeknorEcSsg|]}t|��qSrG)r)rNrOrGrGrH�
<listcomp>Isz"rrule.__init__.<locals>.<listcomp>cSsg|]}t|��qSrG)r)rNrOrGrGrHr�Os�)rs�byxxx�base�byhourr*�byminute�bysecond)�tzinfoi����i����)?rBr
rC�datetimeZnow�replacerm�fromordinal�	toordinal�_dtstartr��_tzinfo�_freq�	_interval�_count�_original_rule�_untilr	�DeprecationWarning�calendarZfirstweekday�_wkstrr�	_bysetposrA�tupler
�month�dayrr�_bymonth�sorted�set�
_byyeardayr��dateutil�	_byeaster�_bymonthday�_bynmonthdayrq�chain�	_byweekno�
_byweekday�_bynweekday�hasattr�addrEr�hour�_byhour�_rrule__construct_bysetr�minute�	_byminuter�second�	_bysecond�_timesetrh�time�sort)rD�freq�dtstart�interval�wkstr|�untilr�r�r�r�r�r�r�r�r�r�r^�pos�wdayZorig_byweekdayZorig_bynweekdayr�r�r�)rFrGrHrC�sL
(

(





























zrrule.__init__c
Cs�g}dgd\}}}|jrD|j|jjd��|jj�dd�\}}}dt|jg}|jdkrr|jdt|j��|jr�|jdt	t
|j��d	d
��|jdk	r�|jdt|j��|jr�|j|jjd��|j
jd
�dk	�rDt|j
�}g}xJ|d
D]>}|j�r(|jdj|jt	|�d	d
�d��q�|jt	|��q�W||d
<n|j
}d}	xFd4D]>\}
}|j|�}|�rT|j|	j|
d$jd%d&�|D��d'���qTW|jd(j|��d)j|�S)5z�
        Output a string that would generate this RRULE if passed to rrulestr.
        This is mostly compatible with RFC2445, except for the
        dateutil-specific extension BYEASTER.
        NrzDTSTART:%Y%m%dT%H%M%Sr"zFREQ=rz	INTERVAL=zWKST=rrzCOUNT=zUNTIL=%Y%m%dT%H%M%Sr�z{n:+d}{wday})rEr�z
{name}={vals}�BYSETPOSr��BYMONTHr��
BYMONTHDAYr��	BYYEARDAYr��BYWEEKNOr��BYDAY�BYHOURr��BYMINUTEr��BYSECONDr��BYEASTERr��,css|]}t|�VqdS)N)�str)rN�vrGrGrHrP�sz rrule.__str__.<locals>.<genexpr>)�name�vals�;�
�r�r��r�r��r�r��r�r��r�r��r�r��r�r��r�r��r�r��r�r�)
r�r�r�r�r�r�r�r�r�r�)r�rhZstrftime�	timetuple�	FREQNAMESr�r�r�r��reprrr�r�r��get�dictrE�format�join)
rD�output�h�m�s�partsZ
original_ruleZwday_stringsr�Zpartfmtr��key�valuerGrGrH�__str__�sT
 



z
rrule.__str__cKsN|j|j|j|j|j|j|jdkr&dndd�}|j|j�|j|�t	f|�S)z�Return new rrule with same attributes except for those attributes given new
           values by whichever keyword arguments are specified.NFT)r�r|r�r�r�r�r^)
r�r�r�r�r�r�rY�updater�r
)rDrSZ
new_kwargsrGrGrHr��s
z
rrule.replacec5cs�|jj�\	}}}}}}}}}	|j}
|j}|j}|j}
|j}|j}|j}|j	}|j
}|j}|j}|j
}|j}|j}|j}t|�}|j||�t|jt|jt|jt|jt|jt|jt|ji|
}|
tkr�|j}n�t|jt|j t|j!i|
}|
tko�|jo�||jk�s<|
tk�r|j�r||jk�s<|
tk�rB|j�rB||jk�rBf}n||||�}d}|j"}�xX||||�\}} }!d}"�x$|| |!�D�]}#|�r�|j#|#|k�s�|�r�|j$|#�s�|�r�|j%|#|k�s�|j&�r�|j&|#�s�|�r�|j'|#�s�|�s�|�r|j(|#|k�r|j)|#|k�s�|�r�|#|j*k�rP|#d|k�rP|j*|#|k�s�|#|j*k�r�|#d|j*|k�r�|j+|#|j*|k�r�d||#<d}"�q�W|�r�|�r�g}$x�|D]�}%|%dk�r�t,|%t-|��\}&}'nt,|%dt-|��\}&}'y&dd�|| |!�D�|&}#||'}(Wnt.k
�r$Yn6Xt/j0j1|j2|#�})t/j/j3|)|(�}*|*|$k�r�|$j4|*��q�W|$j5�xh|$D]`}*|
�r�|*|
k�r�||_6dS|*|jk�rn|dk	�r�|d8}|dk�r�||_6dS|d7}|*V�qnWn�x�|| |!�D]�}#|#dk	�r�t/j0j1|j2|#�})xv|D]n}(t/j/j3|)|(�}*|
�r4|*|
k�r4||_6dS|*|jk�r|dk	�rf|d8}|dk�rf||_6dS|d7}|*V�qW�q�Wd}+|
tk�r�||7}|t/j7k�r�||_6dS|j||��n^|
tk�r.||7}|dk�rt,|d�\},}-|-}||,7}|dk�rd}|d8}|t/j7k�r||_6dS|j||��n�|
tk�r�||k�rd||dd||jd	7}n||||jd	7}|}d}+�n�|
tk�r�||7}d}+�nx|
tk�r|"�r�|d
|||7}|�r�|j8||jdd�\}.}nt,||d�\}.}|.�r||.7}d}+||||�}�n|
tk�r�|"�rD|d
|d|||7}d}/d}0x�t9|0t:||0��D]v}1|�r�|j8||jdd�\}2}nt,||d�\}2}t,||2d�\},}|,�r�||,7}d}+d}"|�s�||k�r`d}/P�q`W|/�s�t;d��||||�}�n"|
tk�r|"�r,|d|d|d|||7}d}0d}/x�t9d|0t:||0��D]�}1|�rl|j8||jdd�\}3}nt,||d�\}3}t,||3d�\},}|,�r�||,7}t,|d�\},}|,�r�||,7}d}+|�s�||k�rJ|�s�||k�rJ|�s�||k�rJd}/P�qJW|/�st;d��||||�}|+�r\|dk�r\t<j=||�d}4||4k�r\x\||4k�r�||48}|d7}|dk�r�d}|d7}|t/j7k�r�||_6dSt<j=||�d}4�qFW|j||��q\WdS)NrFrTcSsg|]}|dk	r|�qS)NrG)rNrOrGrGrHr�@szrrule._iter.<locals>.<listcomp>r(r"r#�r�)r�r�r�i�r*z$Invalid combination of interval and zbyhour resulting in empty rule.iQiz!Invalid combination of interval, z&byhour and byminute resulting in emptyz rule.��
i�zCInvalid combination of interval and byhour resulting in empty rule.i�QzGInvalid combination of interval, byhour and byminute resulting in emptyzMInvalid combination of interval, byhour and byminute resulting in empty rule.)>r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r��	_iterinfo�rebuildr
�ydaysetr�mdaysetr�wdaysetr�ddaysetrrrr��htimeset�mtimeset�stimesetr��mmask�wnomask�wdaymask�	nwdaymask�
eastermask�mdaymask�	nmdaymask�yearlen�nextyearlen�divmodrgrwr��dater��yearordinalZcombinerhr�r]ZMAXYEAR�_rrule__mod_distancerrrAr�Z
monthrange)5rD�yearr�r�r�r�r�rZyearday�_r�r�r�r�r�r�r�r�r�r�Zbynmonthdayr�r�r�r��iiZ	getdaysetZtimesetZ
gettimeset�totalr|Zdaysetrs�endZfilteredrjZposlistr�ZdayposZtimeposr�rryZfixday�div�modZndaysZvalidZrep_raterlZnhoursZnminutesZdaysinmonthrGrGrHr`�s�

 
















"





zrrule._itercCspt�}t|t�r|f}x@|D]8}t|j|�}|dksJt|||�ddkr|j|�qWt|�dkrltd��|S)a
        If a `BYXXX` sequence is passed to the constructor at the same level as
        `FREQ` (e.g. `FREQ=HOURLY,BYHOUR={2,4,7},INTERVAL=3`), there are some
        specifications which cannot be reached given some starting conditions.

        This occurs whenever the interval is not coprime with the base of a
        given unit and the difference between the starting position and the
        ending position is not coprime with the greatest common denominator
        between the interval and the base. For example, with a FREQ of hourly
        starting at 17:00 and an interval of 4, the only valid values for
        BYHOUR would be {21, 1, 5, 9, 13, 17}, because 4 and 24 are not
        coprime.

        :param start:
            Specifies the starting position.
        :param byxxx:
            An iterable containing the list of allowed values.
        :param base:
            The largest allowable value for the specified frequency (e.g.
            24 hours, 60 minutes).

        This does not preserve the type of the iterable, returning a set, since
        the values should be unique and the order is irrelevant, this will
        speed up later lookups.

        In the event of an empty set, raises a :exception:`ValueError`, as this
        results in an empty rrule.
        rrz+Invalid rrule byxxx generates an empty set.)	r�rmrrr�rr�rgrA)rDrsr�r�ZcsetZnumZi_gcdrGrGrHZ__construct_byset�s

zrrule.__construct_bysetcCsLd}xBtd|d�D]0}t||j|�\}}||7}||kr||fSqWdS)a�
        Calculates the next value in a sequence where the `FREQ` parameter is
        specified along with a `BYXXX` parameter at the same "level"
        (e.g. `HOURLY` specified with `BYHOUR`).

        :param value:
            The old value of the component.
        :param byxxx:
            The `BYXXX` set, which should have been generated by
            `rrule._construct_byset`, or something else which checks that a
            valid rule is present.
        :param base:
            The largest allowable value for the specified frequency (e.g.
            24 hours, 60 minutes).

        If a valid value is not found after `base` iterations (the maximum
        number before the sequence would start to repeat), this raises a
        :exception:`ValueError`, as no valid values were found.

        This returns a tuple of `divmod(n*interval, base)`, where `n` is the
        smallest number of `interval` repetitions until the next specified
        value in `byxxx` is found.
        rrN)rrr�)rDr�r�r�ZaccumulatorrrrGrGrHZ__mod_distanceszrrule.__mod_distance)NrNNNNNNNNNNNNNF)rIrJrKrLrCr�r�r`r�rrMrGrG)rFrHr
[compiled CPython 3.6 bytecode: the _iterinfo helper class (rebuild, ydayset, mdayset, wdayset, ddayset, htimeset, mtimeset, stimeset); binary content not representable as text]
S)raL The rruleset type allows more complex recurrence setups, mixing
    multiple rules, dates, exclusion rules, and exclusion dates. The type
    constructor takes the following keyword arguments:

    :param cache: If True, caching of results will be enabled, improving
                  performance of multiple queries considerably. c@s@eZdZdd�Zdd�ZeZdd�Zdd�Zd	d
�Zdd�Z	d
[compiled CPython 3.6 bytecode: rruleset._genitem and rruleset.__init__; binary content not representable as text]
zrruleset.__init__cCs|jj|�dS)z\ Include the given :py:class:`rrule` instance in the recurrence set
            generation. N)r:rh)rDr
rGrGrHr
2szrruleset.rrulecCs|jj|�dS)z_ Include the given :py:class:`datetime` instance in the recurrence
            set generation. N)r;rh)rD�rdaterGrGrHr>8szrruleset.rdatecCs|jj|�dS)z� Include the given rrule instance in the recurrence set exclusion
            list. Dates which are part of the given recurrence rules will not
            be generated, even if some inclusive rrule or rdate matches them.
        N)r<rh)rD�exrulerGrGrHr?>szrruleset.exrulecCs|jj|�dS)z� Include the given datetime instance in the recurrence set
            exclusion list. Dates included that way will not be generated,
            even if some inclusive rrule or rdate matches them. N)r=rh)rD�exdaterGrGrHr@Fszrruleset.exdateccslg}|jj�|j|t|j��x$dd�|jD�D]}|j||�q2Wg}|jj�|j|t|j��x$dd�|jD�D]}|j||�qxWd}d}tj|�tj|�x�|�r`|d}|s�||j	k�r:xB|o�|d|k�r|d}t
|�|o�|d|kr�tj||�q�W|�s$||dk�r4|d7}|j	V|j	}t
|�|r�|d|kr�tj||�q�W||_dS)NcSsg|]}t|��qSrG)r_)rNrOrGrGrHr�Qsz"rruleset._iter.<locals>.<listcomp>cSsg|]}t|��qSrG)r_)rNrOrGrGrHr�Vsrr)
r;r�r9r_r:r=r<r.r1r}r�heapreplacer])rDZrlistrkZexlistZlastdtrZritemZexitemrGrGrHr`Ms<



zrruleset._iter)F)rIrJrKrL�objectr9rCrWr
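A short usage sketch combining the four hooks documented above (the dates are arbitrary):

from datetime import datetime
from dateutil.rrule import rrule, rruleset, DAILY

rset = rruleset()
rset.rrule(rrule(DAILY, count=5, dtstart=datetime(2024, 1, 1)))    # include Jan 1-5
rset.rdate(datetime(2024, 1, 10))                                  # include one extra date
rset.exrule(rrule(DAILY, interval=2, count=3,
                  dtstart=datetime(2024, 1, 1)))                   # exclude Jan 1, 3, 5
rset.exdate(datetime(2024, 1, 4))                                  # exclude one date
print(list(rset))   # only Jan 2 and Jan 10 remain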
r>r?r@r`rMrGrG)rFrHrs"c@s�eZdZeeeeeee	d�Z
dddddddd	�Zd
d�Zdd
�Z
eZeZe
Ze
Ze
Ze
Ze
Ze
Ze
Ze
Ze
Zdd�Zdd�Zdd�Zdd�ZeZddd�Zddd�Zdd�Z dS) �	_rrulestr)r
rrrrrrrrrrrr!r")rrrrrrrcKst|�||j�<dS)N)�int�lower)rD�rrkwargsr�r�rSrGrGrH�_handle_int{sz_rrulestr._handle_intcKs dd�|jd�D�||j�<dS)NcSsg|]}t|��qSrG)rD)rNrOrGrGrHr�sz._rrulestr._handle_int_list.<locals>.<listcomp>r�)�splitrE)rDrFr�r�rSrGrGrH�_handle_int_list~sz_rrulestr._handle_int_listcKs|j||d<dS)Nr�)�	_freq_map)rDrFr�r�rSrGrGrH�_handle_FREQ�sz_rrulestr._handle_FREQcKsVtsddlmay$tj||jd�|jd�d�|d<Wntk
rPtd��YnXdS)Nr)�parser�ignoretz�tzinfos)rMrNr�zinvalid until date)rLr��parser�rA)rDrFr�r�rSrGrGrH�
_handle_UNTIL�sz_rrulestr._handle_UNTILcKs|j||d<dS)Nr�)�_weekday_map)rDrFr�r�rSrGrGrH�_handle_WKST�sz_rrulestr._handle_WKSTcKs�g}x�|jd�D]�}d|krD|jd�}|d}t|ddd	��}	n^t|�r�x"tt|��D]}
||
dkrZPqZW|d|
�p~d}	||
d�}|	r�t|	�}	ntd��|jt|j||	��qW||d<dS)
z:
        Two ways to specify this: +1MO or MO(+1)
        r��(rrNz+-0123456789z$Invalid (empty) BYDAY specification.r�r)rHrDrgrrArh�weekdaysrQ)rDrFr�r�rSr�r�Zsplt�wrErjrGrGrH�_handle_BYWEEKDAY�s"
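Both spellings accepted by the BYDAY handler above parse to the same rule. A sketch (the dtstart value is arbitrary):

from datetime import datetime
from dateutil.rrule import rrulestr

dtstart = datetime(2024, 1, 1)
r1 = rrulestr("FREQ=MONTHLY;COUNT=3;BYDAY=+1MO", dtstart=dtstart)
r2 = rrulestr("FREQ=MONTHLY;COUNT=3;BYDAY=MO(+1)", dtstart=dtstart)
print(list(r1) == list(r2))   # True: first Monday of each month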

[compiled CPython 3.6 bytecode: _rrulestr._handle_BYWEEKDAY, _parse_rfc_rrule, _parse_rfc, __call__ and the module tail; binary content not representable as text]
site-packages/dateutil/__pycache__/__init__.cpython-36.pyc

[compiled CPython 3.6 bytecode: dateutil/__init__.py, which imports VERSION from _version as __version__]
site-packages/dateutil/__pycache__/relativedelta.cpython-36.opt-1.pyc

[compiled CPython 3.6 bytecode: relativedelta module header (imports, the MO..SU weekday constants, __all__); binary content not representable as text]
    The relativedelta type is based on the specification of the excellent
    work done by M.-A. Lemburg in his
    `mx.DateTime <http://www.egenix.com/files/python/mxDateTime.html>`_ extension.
    However, notice that this type does *NOT* implement the same algorithm as
    his work. Do *NOT* expect it to behave like mx.DateTime's counterpart.

    There are two different ways to build a relativedelta instance. The
    first one is passing it two date/datetime classes::

        relativedelta(datetime1, datetime2)

    The second one is passing it any number of the following keyword arguments::

        relativedelta(arg1=x,arg2=y,arg3=z...)

        year, month, day, hour, minute, second, microsecond:
            Absolute information (argument is singular); adding or subtracting a
            relativedelta with absolute information does not perform an aritmetic
            operation, but rather REPLACES the corresponding value in the
            original datetime with the value(s) in relativedelta.

        years, months, weeks, days, hours, minutes, seconds, microseconds:
            Relative information, may be negative (argument is plural); adding
            or subtracting a relativedelta with relative information performs
            the corresponding aritmetic operation on the original datetime value
            with the information in the relativedelta.

        weekday:
            One of the weekday instances (MO, TU, etc). These instances may
            receive a parameter N, specifying the Nth weekday, which could
            be positive or negative (like MO(+1) or MO(-2). Not specifying
            it is the same as specifying +1. You can also use an integer,
            where 0=MO.

        leapdays:
            Will add given days to the date found, if year is a leap
            year, and the date found is post 28 of february.

        yearday, nlyearday:
            Set the yearday or the non-leap year day (jump leap days).
            These are converted to day/month/leapdays information.

    Here is the behavior of operations with relativedelta:

    1. Calculate the absolute year, using the 'year' argument, or the
       original datetime year, if the argument is not present.

    2. Add the relative 'years' argument to the absolute year.

    3. Do steps 1 and 2 for month/months.

    4. Calculate the absolute day, using the 'day' argument, or the
       original datetime day, if the argument is not present. Then,
       subtract from the day until it fits in the year and month
       found after their operations.

    5. Add the relative 'days' argument to the absolute day. Notice
       that the 'weeks' argument is multiplied by 7 and added to
       'days'.

    6. Do steps 1 and 2 for hour/hours, minute/minutes, second/seconds,
       microsecond/microseconds.

    7. If the 'weekday' argument is present, calculate the weekday,
       with the given (wday, nth) tuple. wday is the index of the
       weekday (0-6, 0=Mon), and nth is the number of weeks to add
       forward or backward, depending on its signal. Notice that if
       the calculated date is already Monday, for example, using
       (0, 1) or (0, -1) won't change the day.
    NrcCstdd�||fD��rtd��|o$|�r�t|tj�o>t|tj�sHtd��t|tj�t|tj�kr�t|tj�s~tjj|j��}nt|tj�s�tjj|j��}d|_d|_	d|_
d|_d|_d|_
d|_d|_d|_d|_d|_d|_d|_d|_d|_d|_d|_|j|jd|j|j}|j|�|j|�}||k�rFtj}d}n
tj}d}x.|||��r~||7}|j|�|j|�}�qRW||}|j|j
d|_|j|_�nV||_||_	||d	|_
||_||_|	|_
|
|_||_||_|
|_||_||_||_||_||_td
d�||
|||||fD���r4tdt�t|t ��rLt!||_n||_d}|�rb|}n|�r||}|dk�r|d|_|�r�ddddddddddddg}x\t"|�D]D\}}||k�r�|d|_|dk�r�||_n|||d|_P�q�Wtd|��|j#�dS)Ncss"|]}|dk	o|t|�kVqdS)N)�int)rrr	r	r
rcsz)relativedelta.__init__.<locals>.<genexpr>zGNon-integer years and months are ambiguous and not currently supported.z&relativedelta only diffs datetime/dater�ri�Qrcss"|]}|dk	ot|�|kVqdS)N)r)rrr	r	r
r�sz2Non-integer value passed as absolute information. z4This is not a well-defined condition and will raise zerrors in future versions.�;��Z�x������ii0iNinzinvalid year day (%d)���zfNon-integer value passed as absolute information. This is not a well-defined condition and will raise z�Non-integer value passed as absolute information. This is not a well-defined condition and will raise errors in future versions.r)$�any�
ValueError�
isinstance�datetime�date�	TypeError�fromordinal�	toordinal�years�months�days�leapdays�hours�minutes�seconds�microseconds�year�month�dayr�hour�minute�second�microsecond�	_has_time�_set_months�__radd__�operator�gt�ltr�DeprecationWarningr�weekdays�	enumerate�_fix)�selfZdt1Zdt2r(r)r*r+�weeksr,r-r.r/r0r1r2rZyeardayZ	nlyeardayr3r4r5r6ZdtmZcompareZ	incrementZdeltaZydayZydayidx�idxZydaysr	r	r
�__init__[s�









zrelativedelta.__init__cCs�t|j�dkrHt|j�}t|j|d�\}}|||_|j||7_t|j�dkr�t|j�}t|j|d�\}}|||_|j||7_t|j�dkr�t|j�}t|j|d�\}}|||_|j||7_t|j�dk�r"t|j�}t|j|d�\}}|||_|j||7_t|j�dk�rlt|j�}t|j|d�\}}|||_|j	||7_	|j�s�|j�s�|j�s�|j�s�|j
dk	�s�|jdk	�s�|jdk	�s�|j
dk	�r�d	|_nd
|_dS)Ni?Bi@Br�<���rrr)�absr/�_sign�divmodr.r-r,r*r)r(r3r4r5r6r7)rA�s�div�modr	r	r
r@�s<









 zrelativedelta._fixcCs
|jdS)Nr)r*)rAr	r	r
rB�szrelativedelta.weekscCs|j|jd|d|_dS)Nr)r*rB)rA�valuer	r	r
rB�scCsR||_t|j�dkrHt|j�}t|j|d�\}}|||_|||_nd|_dS)NrHrr)r)rIrJrKr()rAr)rLrMrNr	r	r
r8s

zrelativedelta._set_monthsc	Cs�t|j�}t|jd|j|d�}t|�}t|jd||d�}t|�}t|jd||d�}t|�}t|jd||�}|j|j|j	||||||j
|j|j|j
|j|j|j|j|jd�S)aA
        Return a version of this object represented entirely using integer
        values for the relative attributes.

        >>> relativedelta(days=1.5, hours=2).normalized()
        relativedelta(days=1, hours=14)

        :return:
            Returns a :class:`dateutil.relativedelta.relativedelta` object.
        rGrHrE�
�g��.A)r(r)r*r,r-r.r/r+r0r1r2rr3r4r5r6)rr*�roundr,r-r.r/�	__class__r(r)r+r0r1r2rr3r4r5r6)	rAr*Zhours_fr,Z	minutes_fr-Z	seconds_fr.r/r	r	r
�
normalizeds 
zrelativedelta.normalizedc
CsFt|t��r|j|j|j|j|j|j|j|j|j|j|j|j|j|j	|j	|j
p`|j
|jdk	rp|jn|j|jdk	r�|jn|j|j
dk	r�|j
n|j
|jdk	r�|jn|j|jdk	r�|jn|j|jdk	r�|jn|j|jdk	r�|jn|j|jdk	r�|jn|jd�St|tj��rp|j|j|j|j|j|j|j|j|j|j	|j	|j
|j|j|j
|j|j|j|j|jd�St|tj��s�tS|j�r�t|tj��r�tjj|j��}|j�p�|j|j}|j�p�|j}|j�r||j7}|dk�r�|d7}|d8}n|dk�r|d8}|d7}ttj||�d|j
�p0|j
�}|||d�}x*dD]"}t||�}|dk	�rF|||<�qFW|j}|j
�r�|d	k�r�tj|��r�||j
7}|jf|�tj||j|j|j|j	d
�}	|j�rB|jj|jj �p�d}
}t!|�dd}|dk�r|d|	j�|
d7}n||	j�|
d7}|d9}|	tj|d
�7}	|	S)N)r(r)r*r,r-r.r/r+r0r1r2rr3r4r5r6rr)r0r1r2r3r4r5r6�)r*r,r-r.r/rr)r*)r3r4r5r6r)"r"r
rSr(r)r*r,r-r.r/r+r0r1r2rr3r4r5r6r#Z	timedeltar$�NotImplementedr7r&r'�min�calendarZ
monthrange�getattrZisleap�replace�nrI)
rA�otherr0r1r2�repl�attrrOr*�retrZnthZjumpdaysr	r	r
�__add__/s�




















zrelativedelta.__add__cCs
|j|�S)N)r`)rAr\r	r	r
r9�szrelativedelta.__radd__cCs|j�j|�S)N)�__neg__r9)rAr\r	r	r
�__rsub__�szrelativedelta.__rsub__cCs
t|t�stS|j|j|j|j|j|j|j|j|j|j|j|j	|j	|j
|j
|jpb|j|jdk	rr|jn|j|j
dk	r�|j
n|j
|jdk	r�|jn|j|jdk	r�|jn|j|jdk	r�|jn|j|jdk	r�|jn|j|jdk	r�|jn|j|jdk	�r|jn|jd�S)N)r(r)r*r,r-r.r/r+r0r1r2rr3r4r5r6)r"r
rVrSr(r)r*r,r-r.r/r+r0r1r2rr3r4r5r6)rAr\r	r	r
�__sub__�s6







zrelativedelta.__sub__cCsX|j|j|j|j|j|j|j|j|j|j	|j
|j|j|j
|j|j|jd�S)N)r(r)r*r,r-r.r/r+r0r1r2rr3r4r5r6)rSr(r)r*r,r-r.r/r+r0r1r2rr3r4r5r6)rAr	r	r
ra�s 
zrelativedelta.__neg__cCs�|jo�|jo�|jo�|jo�|jo�|jo�|jo�|jo�|jdko�|j	dko�|j
dko�|jdko�|jdko�|j
dko�|jdko�|jdkS)N)r(r)r*r,r-r.r/r+r0r1r2rr3r4r5r6)rAr	r	r
�__bool__�s 






zrelativedelta.__bool__cCs�yt|�}Wntk
r tSX|jt|j|�t|j|�t|j|�t|j|�t|j	|�t|j
|�t|j|�|j|j
|j|j|j|j|j|j|jd�S)N)r(r)r*r,r-r.r/r+r0r1r2rr3r4r5r6)�floatr%rVrSrr(r)r*r,r-r.r/r+r0r1r2rr3r4r5r6)rAr\�fr	r	r
�__mul__�s(zrelativedelta.__mul__cCsNt|t�stS|js|jr~|js*|jr.dS|jj|jjkrBdS|jj|jj}}||kr~|sj|dkov|pv|dkr~dS|j|jk�oL|j|jk�oL|j|jk�oL|j|jk�oL|j	|j	k�oL|j
|j
k�oL|j|jk�oL|j|jk�oL|j
|j
k�oL|j|jk�oL|j|jk�oL|j|jk�oL|j|jk�oL|j|jk�oL|j|jkS)NFr)r"r
rVrr[r(r)r*r,r-r.r/r+r0r1r2r3r4r5r6)rAr\Zn1Zn2r	r	r
�__eq__�s2
&zrelativedelta.__eq__cCs|j|�S)N)rh)rAr\r	r	r
�__ne__szrelativedelta.__ne__cCs0ydt|�}Wntk
r$tSX|j|�S)Nr)rer%rVrg)rAr\Z
reciprocalr	r	r
�__div__s
zrelativedelta.__div__cCs�g}x.dD]&}t||�}|r
|jd	j||d
��q
Wx6dD].}t||�}|dk	r:|jdj|t|�d
��q:Wdj|jjdj|�d�S)Nr(r)r*r+r,r-r.r/z{attr}={value:+g})r^rOr0r1r2rr3r4r5r6z{attr}={value}z{classname}({attrs})z, )Z	classnameZattrs)r(r)r*r+r,r-r.r/)r0r1r2rr3r4r5r6)rY�append�format�reprrS�__name__�join)rA�lr^rOr	r	r
�__repr__s


zrelativedelta.__repr__)NNrrrrrrrrrNNNNNNNNNN)rn�
__module__�__qualname__�__doc__rDr@�propertyrB�setterr8rTr`r9rbrcrardZ__nonzero__rg�__rmul__rh�__hash__rirj�__truediv__rqr	r	r	r
r
s6G
y!
#WcCsttd|��S)Nr)rr)rr	r	r
rJ"srJ)r#rXr:ZmathrZsixr�warningsrZ_commonr�tuple�rangerrrrrrrr>�__all__�objectr
rJr	r	r	r
�<module>s(site-packages/dateutil/__pycache__/rrule.cpython-36.opt-1.pyc000064400000117164147511334560020050 0ustar003

6�cY���@sJdZddlZddlZddlZddlZyddlmZWn ek
rTddlmZYnXddl	m
Z
mZddlm
Z
mZddlZddlmZddlmZd	d
ddd
ddddddddddddgZedgddgddgddgdd gdd!gdd"gdd#gdd$gdd%gdd&gdd'gddgd"�Zee�Zeedd��eedd��eedd(��ZZZeeeeeeeeeeeeeedd"��Zee�ZeedNd��eedOd��eedPd��ZZZeeeeeeeeeeeeeedd"��Zee�Z dQZ!dRZ"dddddd d!gd?Z#[[[ed4=ed4=e d=ee�Zee�Zdd
dddddgZ$eed"��\Z%Z&Z'Z(Z)Z*Z+da,da-Gd@dA�dAe�ZedBdC�ed"�D��\Z.Z/Z0Z1Z2Z3Z4Z5dDdE�Z6GdFdG�dGe7�Z8GdHd	�d	e8�Z9GdIdJ�dJe7�Z:GdKd
�d
e8�Z;GdLdM�dMe7�Z<e<�Z=dS)Sz�
The rrule module offers a small, complete, and very fast, implementation of
the recurrence rules documented in the
`iCalendar RFC <http://www.ietf.org/rfc/rfc2445.txt>`_,
including support for caching of results.
�N)�gcd)�advance_iterator�
integer_types)�_thread�range�)�weekday)�warn�rrule�rruleset�rrulestr�YEARLY�MONTHLY�WEEKLY�DAILY�HOURLY�MINUTELY�SECONDLY�MO�TU�WE�TH�FR�SA�SU�����������	�
��� �<�[�y��������1�O�n�;�Z�x��������0�N�m�7cs"eZdZdZd�fdd�	Z�ZS)rz7
    This version of weekday does not allow n = 0.
    Ncs&|dkrtd��tt|�j||�dS)NrzCan't create weekday with n==0)�
ValueError�superr�__init__)�selfZwkday�n)�	__class__��/usr/lib/python3.6/rrule.pyrCDszweekday.__init__)N)�__name__�
__module__�__qualname__�__doc__rC�
__classcell__rGrG)rFrHr@srccs|]}t|�VqdS)N)r)�.0�xrGrGrH�	<genexpr>KsrPcs�fdd�}|S)zT
    Decorator for rruleset methods which may invalidate the
    cached length.
    cs�|f|�|�}|j�|S)N)�_invalidate_cache)rD�args�kwargs�rv)�frGrH�
inner_funcSsz&_invalidates_cache.<locals>.inner_funcrG)rUrVrG)rUrH�_invalidates_cacheNsrWc@sneZdZddd�Zdd�Zdd�Zdd	�Zd
d�Zdd
�Zdd�Z	ddd�Z
ddd�Zddd�Zddd�Z
dS)�	rrulebaseFcCs4|rg|_tj�|_|j�nd|_d|_d|_dS)NF)�_cacher�
allocate_lock�_cache_lockrQ�_cache_complete�_len)rD�cacherGrGrHrC\s

zrrulebase.__init__cCs.|jrt|j�S|jdkr"|j�S|j�SdS)N)r\�iterrY�_iter�_iter_cached)rDrGrGrH�__iter__fs


zrrulebase.__iter__cCs>|jdk	r4g|_d|_|j�|_|jj�r4|jj�d|_dS)NF)rYr\r`�
_cache_genr[�locked�releaser])rDrGrGrHrQns



zrrulebase._invalidate_cacheccs�d}|j}|j}|jj}|jj}x�|r�|t|�kr�|�|jr@Py$xtd�D]}|jt	|��qLWWn&t
k
r�d|_}d|_PYnX|�||V|d7}q"Wx ||jkr�||V|d7}q�WdS)Nrr&Tr)rcrYr[�acquirere�lenr\r�appendr�
StopIterationr])rD�i�genr^rfre�jrGrGrHrays.


zrrulebase._iter_cachedcCs�|jr|j|St|t�rd|jr:|jdkr:tt|��|Sttj||j	pJd|j
pTtj|jp\d��Sn`|dkr�t|�}y"xt
|d�D]}t|�}q�WWntk
r�t�YnX|Stt|��|SdS)Nrr)r\rY�
isinstance�slice�step�listr_�	itertools�islice�start�stop�sys�maxsizerrri�
IndexError)rD�itemrkrj�resrGrGrH�__getitem__�s$



zrrulebase.__getitem__cCs:|jr||jkSx$|D]}||kr&dS||krdSqWdS)NTF)r\rY)rDrxrjrGrGrH�__contains__�s

zrrulebase.__contains__cCs|jdkrx|D]}qW|jS)z� Returns the number of recurrences in this set. It will have go
            trough the whole recurrence, if this hasn't been done before. N)r])rDrOrGrGrH�count�s

zrrulebase.countcCsX|jr|j}n|}d}|r8x8|D]}||kr.P|}q Wnx|D]}||krLP|}q>W|S)z� Returns the last recurrence before the given datetime instance. The
            inc keyword defines what happens if dt is an occurrence. With
            inc=True, if dt itself is an occurrence, it will be returned. N)r\rY)rD�dt�incrk�lastrjrGrGrH�before�s


zrrulebase.beforecCsP|jr|j}n|}|r2x4|D]}||kr|SqWnx|D]}||kr8|Sq8WdS)z� Returns the first recurrence after the given datetime instance. The
            inc keyword defines what happens if dt is an occurrence. With
            inc=True, if dt itself is an occurrence, it will be returned.  N)r\rY)rDr}r~rkrjrGrGrH�after�s


zrrulebase.afterNccsh|jr|j}n|}|r dd�}ndd�}d}x6|D].}|||�r2|dk	rZ|d7}||krZP|Vq2WdS)aH
        Generator which yields up to `count` recurrences after the given
        datetime instance, equivalent to `after`.

        :param dt:
            The datetime at which to start generating recurrences.

        :param count:
            The maximum number of recurrences to generate. If `None` (default),
            dates are generated until the recurrence rule is exhausted.

        :param inc:
            If `dt` is an instance of the rule and `inc` is `True`, it is
            included in the output.

        :yields: Yields a sequence of `datetime` objects.
        cSs||kS)NrG)�dc�dtcrGrGrH�<lambda>�sz"rrulebase.xafter.<locals>.<lambda>cSs||kS)NrG)r�r�rGrGrHr��srNr)r\rY)rDr}r|r~rk�comprE�drGrGrH�xafter�s


zrrulebase.xafterrc	Cs�|jr|j}n|}d}g}|r`x�|D]6}||kr4Pq$|sP||krZd}|j|�q$|j|�q$Wn@x>|D]6}||krvPqf|s�||kr�d}|j|�qf|j|�qfW|S)a Returns all the occurrences of the rrule between after and before.
        The inc keyword defines what happens if after and/or before are
        themselves occurrences. With inc=True, they will be included in the
        list, if they are found in the recurrence set. FT)r\rYrh)	rDr�r�r~r|rkZstarted�lrjrGrGrH�betweens.

zrrulebase.between)F)F)F)NF)Fr)rIrJrKrCrbrQrarzr{r|r�r�r�r�rGrGrGrHrX[s




)rXcsJeZdZdZd�fdd�	Zdd�Zd	d
�Zdd�Zd
d�Zdd�Z	�Z
S)r
a|
    That's the base of the rrule operation. It accepts all the keywords
    defined in the RFC as its constructor parameters (except byday,
    which was renamed to byweekday) and more. The constructor prototype is::

            rrule(freq)

    Where freq must be one of YEARLY, MONTHLY, WEEKLY, DAILY, HOURLY, MINUTELY,
    or SECONDLY.

    .. note::
        Per RFC section 3.3.10, recurrence instances falling on invalid dates
        and times are ignored rather than coerced:

            Recurrence rules may generate recurrence instances with an invalid
            date (e.g., February 30) or nonexistent local time (e.g., 1:30 AM
            on a day where the local time is moved forward by an hour at 1:00
            AM).  Such recurrence instances MUST be ignored and MUST NOT be
            counted as part of the recurrence set.

        This can lead to possibly surprising behavior when, for example, the
        start date occurs at the end of the month:

        >>> from dateutil.rrule import rrule, MONTHLY
        >>> from datetime import datetime
        >>> start_date = datetime(2014, 12, 31)
        >>> list(rrule(freq=MONTHLY, count=4, dtstart=start_date))
        ... # doctest: +NORMALIZE_WHITESPACE
        [datetime.datetime(2014, 12, 31, 0, 0),
         datetime.datetime(2015, 1, 31, 0, 0),
         datetime.datetime(2015, 3, 31, 0, 0),
         datetime.datetime(2015, 5, 31, 0, 0)]

    Additionally, it supports the following keyword arguments:

    :param cache:
        If given, it must be a boolean value specifying to enable or disable
        caching of results. If you will use the same rrule instance multiple
        times, enabling caching will improve the performance considerably.
    :param dtstart:
        The recurrence start. Besides being the base for the recurrence,
        missing parameters in the final recurrence instances will also be
        extracted from this date. If not given, datetime.now() will be used
        instead.
    :param interval:
        The interval between each freq iteration. For example, when using
        YEARLY, an interval of 2 means once every two years, but with HOURLY,
        it means once every two hours. The default interval is 1.
    :param wkst:
        The week start day. Must be one of the MO, TU, WE constants, or an
        integer, specifying the first day of the week. This will affect
        recurrences based on weekly periods. The default week start is got
        from calendar.firstweekday(), and may be modified by
        calendar.setfirstweekday().
    :param count:
        How many occurrences will be generated.

        .. note::
            As of version 2.5.0, the use of the ``until`` keyword together
            with the ``count`` keyword is deprecated per RFC-2445 Sec. 4.3.10.
    :param until:
        If given, this must be a datetime instance, that will specify the
        limit of the recurrence. The last recurrence in the rule is the greatest
        datetime that is less than or equal to the value specified in the
        ``until`` parameter.

        .. note::
            As of version 2.5.0, the use of the ``until`` keyword together
            with the ``count`` keyword is deprecated per RFC-2445 Sec. 4.3.10.
    :param bysetpos:
        If given, it must be either an integer, or a sequence of integers,
        positive or negative. Each given integer will specify an occurrence
        number, corresponding to the nth occurrence of the rule inside the
        frequency period. For example, a bysetpos of -1 if combined with a
        MONTHLY frequency, and a byweekday of (MO, TU, WE, TH, FR), will
        result in the last work day of every month.
    :param bymonth:
        If given, it must be either an integer, or a sequence of integers,
        meaning the months to apply the recurrence to.
    :param bymonthday:
        If given, it must be either an integer, or a sequence of integers,
        meaning the month days to apply the recurrence to.
    :param byyearday:
        If given, it must be either an integer, or a sequence of integers,
        meaning the year days to apply the recurrence to.
    :param byweekno:
        If given, it must be either an integer, or a sequence of integers,
        meaning the week numbers to apply the recurrence to. Week numbers
        have the meaning described in ISO8601, that is, the first week of
        the year is that containing at least four days of the new year.
    :param byweekday:
        If given, it must be either an integer (0 == MO), a sequence of
        integers, one of the weekday constants (MO, TU, etc), or a sequence
        of these constants. When given, these variables will define the
        weekdays where the recurrence will be applied. It's also possible to
        use an argument n for the weekday instances, which will mean the nth
        occurrence of this weekday in the period. For example, with MONTHLY,
        or with YEARLY and BYMONTH, using FR(+1) in byweekday will specify the
        first friday of the month where the recurrence happens. Notice that in
        the RFC documentation, this is specified as BYDAY, but was renamed to
        avoid the ambiguity of that keyword.
    :param byhour:
        If given, it must be either an integer, or a sequence of integers,
        meaning the hours to apply the recurrence to.
    :param byminute:
        If given, it must be either an integer, or a sequence of integers,
        meaning the minutes to apply the recurrence to.
    :param bysecond:
        If given, it must be either an integer, or a sequence of integers,
        meaning the seconds to apply the recurrence to.
    :param byeaster:
        If given, it must be either an integer, or a sequence of integers,
        positive or negative. Each integer will define an offset from the
        Easter Sunday. Passing the offset 0 to byeaster will yield the Easter
        Sunday itself. This is an extension to the RFC specification.
     NrFc
sRtt|�j|�|s(tjj�jdd�}n*t|tj�sFtjj|j��}n|jdd�}||_	|j
|_||_||_
||_i|_|r�t|tj�r�tjj|j��}||_|dk	r�|r�tdt�|dkr�tj�|_nt|t�r�||_n|j|_|dkr�d|_n�t|t��r:|dk�s(d|k�odkn�r0td��|f|_nLt|�|_x@|jD]6}|dk�sxd|k�ondkn�rLtd���qLW|j�r�|j|jd<|dk�r:|
dk�r:|	dk�r:|
dk�r:|dk�r:|tk�r|dk�r�|j}d|jd<|j}	d|jd<n8|tk�r|j}	d|jd<n|tk�r:|j�}
d|jd	<|dk�rLd|_ n<t|t��r^|f}tt!t"|���|_ d|jk�r�|j |jd<|
dk�r�d|_#n0t|
t��r�|
f}
tt!t"|
���|_#|j#|jd
<|dk	�rt$�s�ddl%m$a$t|t��r�|f|_&ntt!|��|_&|j&|jd<nd|_&|	dk�r6f|_'f|_(npt|	t��rH|	f}	t"|	�}	tt!d
d�|	D���|_'tt!dd�|	D���|_(d|jk�r�tt)j*|j'|j(��|jd<|dk�r�d|_+n0t|t��r�|f}tt!t"|���|_+|j+|jd<|
dk�rd|_,d|_-�n8t|
t��st.|
d��r |
f}
t"�|_,t"�|_-x`|
D]X}t|t��rT|j,j/|�n8|j0�sh|tk�rx|j,j/|j�n|j-j/|j|j0f��q6W|j,�s�d|_,n|j-�s�d|_-|j,dk	�r�tt!|j,��|_,dd�|j,D�}nt�}|j-dk	�rtt!|j-��|_-dd�|j-D�}nt�}d	|jk�r:tt)j*||��|jd	<|dk�rf|t1k�r^t"|j2f�|_3nd|_3nXt|t��rx|f}|t1k�r�|j4|j2|dd�|_3n
t"|�|_3tt!|j3��|_3|j3|jd<|dk�r�|t5k�r�t"|j6f�|_7nd|_7nXt|t��r�|f}|t5k�r|j4|j6|dd�|_7n
t"|�|_7tt!|j7��|_7|j7|jd<|dk�rj|t8k�rb|j9f|_:nd|_:nbt|t��r||f}t"|�|_:|t8k�r�|j4|j9|dd�|_:n
t"|�|_:tt!|j:��|_:|j:|jd<|jt1k�r�d|_;nng|_;xP|j3D]F}x>|j7D]4}x,|j:D]"}|j;j<tj=||||jd���qW�q�W�q�W|j;j>�t|j;�|_;dS)Nr)Zmicrosecondz�Using both 'count' and 'until' is inconsistent with RFC 2445 and has been deprecated in dateutil. Future versions will raise an error.inz:bysetpos must be between 1 and 366, or between -366 and -1�bysetpos�bymonth�
bymonthday�	byweekday�	byyearday)�easter�byeastercss|]}|dkr|VqdS)rNrG)rNrOrGrGrHrPsz!rrule.__init__.<locals>.<genexpr>css|]}|dkr|VqdS)rNrG)rNrOrGrGrHrPs�byweeknorEcSsg|]}t|��qSrG)r)rNrOrGrGrH�
<listcomp>Isz"rrule.__init__.<locals>.<listcomp>cSsg|]}t|��qSrG)r)rNrOrGrGrHr�Os�)rs�byxxx�base�byhourr*�byminute�bysecond)�tzinfoi����i����)?rBr
rC�datetimeZnow�replacerm�fromordinal�	toordinal�_dtstartr��_tzinfo�_freq�	_interval�_count�_original_rule�_untilr	�DeprecationWarning�calendarZfirstweekday�_wkstrr�	_bysetposrA�tupler
�month�dayrr�_bymonth�sorted�set�
_byyeardayr��dateutil�	_byeaster�_bymonthday�_bynmonthdayrq�chain�	_byweekno�
_byweekday�_bynweekday�hasattr�addrEr�hour�_byhour�_rrule__construct_bysetr�minute�	_byminuter�second�	_bysecond�_timesetrh�time�sort)rD�freq�dtstart�interval�wkstr|�untilr�r�r�r�r�r�r�r�r�r�r^�pos�wdayZorig_byweekdayZorig_bynweekdayr�r�r�)rFrGrHrC�sL
(

(





























zrrule.__init__c
Cs�g}dgd\}}}|jrD|j|jjd��|jj�dd�\}}}dt|jg}|jdkrr|jdt|j��|jr�|jdt	t
|j��d	d
��|jdk	r�|jdt|j��|jr�|j|jjd��|j
jd
�dk	�rDt|j
�}g}xJ|d
D]>}|j�r(|jdj|jt	|�d	d
�d��q�|jt	|��q�W||d
<n|j
}d}	xFd4D]>\}
}|j|�}|�rT|j|	j|
d$jd%d&�|D��d'���qTW|jd(j|��d)j|�S)5z�
        Output a string that would generate this RRULE if passed to rrulestr.
        This is mostly compatible with RFC2445, except for the
        dateutil-specific extension BYEASTER.
        NrzDTSTART:%Y%m%dT%H%M%Sr"zFREQ=rz	INTERVAL=zWKST=rrzCOUNT=zUNTIL=%Y%m%dT%H%M%Sr�z{n:+d}{wday})rEr�z
{name}={vals}�BYSETPOSr��BYMONTHr��
BYMONTHDAYr��	BYYEARDAYr��BYWEEKNOr��BYDAY�BYHOURr��BYMINUTEr��BYSECONDr��BYEASTERr��,css|]}t|�VqdS)N)�str)rN�vrGrGrHrP�sz rrule.__str__.<locals>.<genexpr>)�name�vals�;�
�r�r��r�r��r�r��r�r��r�r��r�r��r�r��r�r��r�r��r�r�)
r�r�r�r�r�r�r�r�r�r�)r�rhZstrftime�	timetuple�	FREQNAMESr�r�r�r��reprrr�r�r��get�dictrE�format�join)
rD�output�h�m�s�partsZ
original_ruleZwday_stringsr�Zpartfmtr��key�valuerGrGrH�__str__�sT
 



z
rrule.__str__cKsN|j|j|j|j|j|j|jdkr&dndd�}|j|j�|j|�t	f|�S)z�Return new rrule with same attributes except for those attributes given new
           values by whichever keyword arguments are specified.NFT)r�r|r�r�r�r�r^)
r�r�r�r�r�r�rY�updater�r
)rDrSZ
new_kwargsrGrGrHr��s
z
rrule.replacec5cs�|jj�\	}}}}}}}}}	|j}
|j}|j}|j}
|j}|j}|j}|j	}|j
}|j}|j}|j
}|j}|j}|j}t|�}|j||�t|jt|jt|jt|jt|jt|jt|ji|
}|
tkr�|j}n�t|jt|j t|j!i|
}|
tko�|jo�||jk�s<|
tk�r|j�r||jk�s<|
tk�rB|j�rB||jk�rBf}n||||�}d}|j"}�xX||||�\}} }!d}"�x$|| |!�D�]}#|�r�|j#|#|k�s�|�r�|j$|#�s�|�r�|j%|#|k�s�|j&�r�|j&|#�s�|�r�|j'|#�s�|�s�|�r|j(|#|k�r|j)|#|k�s�|�r�|#|j*k�rP|#d|k�rP|j*|#|k�s�|#|j*k�r�|#d|j*|k�r�|j+|#|j*|k�r�d||#<d}"�q�W|�r�|�r�g}$x�|D]�}%|%dk�r�t,|%t-|��\}&}'nt,|%dt-|��\}&}'y&dd�|| |!�D�|&}#||'}(Wnt.k
�r$Yn6Xt/j0j1|j2|#�})t/j/j3|)|(�}*|*|$k�r�|$j4|*��q�W|$j5�xh|$D]`}*|
�r�|*|
k�r�||_6dS|*|jk�rn|dk	�r�|d8}|dk�r�||_6dS|d7}|*V�qnWn�x�|| |!�D]�}#|#dk	�r�t/j0j1|j2|#�})xv|D]n}(t/j/j3|)|(�}*|
�r4|*|
k�r4||_6dS|*|jk�r|dk	�rf|d8}|dk�rf||_6dS|d7}|*V�qW�q�Wd}+|
tk�r�||7}|t/j7k�r�||_6dS|j||��n^|
tk�r.||7}|dk�rt,|d�\},}-|-}||,7}|dk�rd}|d8}|t/j7k�r||_6dS|j||��n�|
tk�r�||k�rd||dd||jd	7}n||||jd	7}|}d}+�n�|
tk�r�||7}d}+�nx|
tk�r|"�r�|d
|||7}|�r�|j8||jdd�\}.}nt,||d�\}.}|.�r||.7}d}+||||�}�n|
tk�r�|"�rD|d
|d|||7}d}/d}0x�t9|0t:||0��D]v}1|�r�|j8||jdd�\}2}nt,||d�\}2}t,||2d�\},}|,�r�||,7}d}+d}"|�s�||k�r`d}/P�q`W|/�s�t;d��||||�}�n"|
tk�r|"�r,|d|d|d|||7}d}0d}/x�t9d|0t:||0��D]�}1|�rl|j8||jdd�\}3}nt,||d�\}3}t,||3d�\},}|,�r�||,7}t,|d�\},}|,�r�||,7}d}+|�s�||k�rJ|�s�||k�rJ|�s�||k�rJd}/P�qJW|/�st;d��||||�}|+�r\|dk�r\t<j=||�d}4||4k�r\x\||4k�r�||48}|d7}|dk�r�d}|d7}|t/j7k�r�||_6dSt<j=||�d}4�qFW|j||��q\WdS)NrFrTcSsg|]}|dk	r|�qS)NrG)rNrOrGrGrHr�@szrrule._iter.<locals>.<listcomp>r(r"r#�r�)r�r�r�i�r*z$Invalid combination of interval and zbyhour resulting in empty rule.iQiz!Invalid combination of interval, z&byhour and byminute resulting in emptyz rule.��
i�zCInvalid combination of interval and byhour resulting in empty rule.i�QzGInvalid combination of interval, byhour and byminute resulting in emptyzMInvalid combination of interval, byhour and byminute resulting in empty rule.)>r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r��	_iterinfo�rebuildr
�ydaysetr�mdaysetr�wdaysetr�ddaysetrrrr��htimeset�mtimeset�stimesetr��mmask�wnomask�wdaymask�	nwdaymask�
eastermask�mdaymask�	nmdaymask�yearlen�nextyearlen�divmodrgrwr��dater��yearordinalZcombinerhr�r]ZMAXYEAR�_rrule__mod_distancerrrAr�Z
monthrange)5rD�yearr�r�r�r�r�rZyearday�_r�r�r�r�r�r�r�r�r�r�Zbynmonthdayr�r�r�r��iiZ	getdaysetZtimesetZ
gettimeset�totalr|Zdaysetrs�endZfilteredrjZposlistr�ZdayposZtimeposr�rryZfixday�div�modZndaysZvalidZrep_raterlZnhoursZnminutesZdaysinmonthrGrGrHr`�s�

 
















"





zrrule._itercCspt�}t|t�r|f}x@|D]8}t|j|�}|dksJt|||�ddkr|j|�qWt|�dkrltd��|S)a
        If a `BYXXX` sequence is passed to the constructor at the same level as
        `FREQ` (e.g. `FREQ=HOURLY,BYHOUR={2,4,7},INTERVAL=3`), there are some
        specifications which cannot be reached given some starting conditions.

        This occurs whenever the interval is not coprime with the base of a
        given unit and the difference between the starting position and the
        ending position is not coprime with the greatest common denominator
        between the interval and the base. For example, with a FREQ of hourly
        starting at 17:00 and an interval of 4, the only valid values for
        BYHOUR would be {21, 1, 5, 9, 13, 17}, because 4 and 24 are not
        coprime.

        :param start:
            Specifies the starting position.
        :param byxxx:
            An iterable containing the list of allowed values.
        :param base:
            The largest allowable value for the specified frequency (e.g.
            24 hours, 60 minutes).

        This does not preserve the type of the iterable, returning a set, since
        the values should be unique and the order is irrelevant, this will
        speed up later lookups.

        In the event of an empty set, raises a :exception:`ValueError`, as this
        results in an empty rrule.
        rrz+Invalid rrule byxxx generates an empty set.)	r�rmrrr�rr�rgrA)rDrsr�r�ZcsetZnumZi_gcdrGrGrHZ__construct_byset�s

zrrule.__construct_bysetcCsLd}xBtd|d�D]0}t||j|�\}}||7}||kr||fSqWdS)a�
        Calculates the next value in a sequence where the `FREQ` parameter is
        specified along with a `BYXXX` parameter at the same "level"
        (e.g. `HOURLY` specified with `BYHOUR`).

        :param value:
            The old value of the component.
        :param byxxx:
            The `BYXXX` set, which should have been generated by
            `rrule._construct_byset`, or something else which checks that a
            valid rule is present.
        :param base:
            The largest allowable value for the specified frequency (e.g.
            24 hours, 60 minutes).

        If a valid value is not found after `base` iterations (the maximum
        number before the sequence would start to repeat), this raises a
        :exception:`ValueError`, as no valid values were found.

        This returns a tuple of `divmod(n*interval, base)`, where `n` is the
        smallest number of `interval` repetitions until the next specified
        value in `byxxx` is found.
        rrN)rrr�)rDr�r�r�ZaccumulatorrrrGrGrHZ__mod_distanceszrrule.__mod_distance)NrNNNNNNNNNNNNNF)rIrJrKrLrCr�r�r`r�rrMrGrG)rFrHr
.st{>/c@sveZdZddddddddd	d
ddd
ddgZdd�Zdd�Zdd�Zdd�Zdd�Zdd�Z	dd�Z
dd�Zd d!�Zd"S)#r�r
�lastyear�	lastmonthr	r
r
�yearweekdayr�mrangerrrrrrcCs&x|jD]}t||d�qW||_dS)N)�	__slots__�setattrr
)rDr
�attrrGrGrHrCDsz_iterinfo.__init__cCs�|j}||jk�r2dtj|�|_dtj|d�|_tj|dd�}|j�|_	|j
�|_tj|dd�j
�}|jdkr�t|_
t|_t|_t|d�|_t|_n&t|_
t|_t|_t|d�|_t|_|js�d|_�n`dg|jd|_d|j|jd}}|dk�r"d}|j|j|jd}n
|j|}t|d�\}	}
|	|
d}x�|jD]�}|dk�rh||d7}d|k�o||kn�s��qN|dk�r�||dd}
||k�r�|
d|8}
n|}
x8td�D],}d|j|
<|
d7}
|j|
|jk�r�P�q�W�qNWd|jk�rr||d}
||k�r,|
d|8}
|
|jk�rrx8td�D],}d|j|
<|
d7}
|j|
|jk�rBP�qBW|�r2d|jk�rtj|ddd�j
�}d||jd}dtj|d�}|dk�r�d}d|||jddd}nd|j|dd}nd}||jk�r2xt|�D]}
d|j|
<�qW|j�r�||j k�sR||jk�r�g}|j!t"k�r�|j#�r�x:|j#D]"}|j$|j|d|d���qrWnd|jfg}n$|j!t%k�r�|j|d|d�g}|�r�dg|j|_&x�|D]�\}}|d8}x�|jD]�\}}|dk�r8||dd}
|
|j|
|d8}
n*||dd}
|
d|j|
|d7}
||
k�ov|kn�r�d|j&|
<�q�W�q�W|j'�r�dg|jd|_(t)j)|�j�|j	}x|j'D]}d|j(||<�q�W||_||_ dS)	Nimrrr#r�4���r)*r
rr�Zisleapr	r
r�rr�r
rr�M365MASKr�MDAY365MASKr�NMDAY365MASKr�WDAYMASKr�	M365RANGEr�M366MASK�MDAY366MASK�NMDAY366MASK�	M366RANGEr�rr�rrr�rr�r
r�rhrrr�rr�)rDrr��rrZ	firstydayr�Zno1wkstZ	firstwkstZwyearlenrrZnumweeksrErjrlZlyearweekdayZlno1wkstZlyearlenZ	lnumweeksZranges�firstrZeyday�offsetrGrGrHr�Is�












$
z_iterinfo.rebuildcCstt|j��d|jfS)Nr)rprr	)rDrr�r�rGrGrHr��sz_iterinfo.ydaysetcCsLdg|j}|j|d|d�\}}xt||�D]}|||<q2W|||fS)Nr)r	rr)rDrr�r��dsetrsrrjrGrGrHr��s
z_iterinfo.mdaysetcCsldg|jd}tj|||�j�|j}|}x4td�D](}|||<|d7}|j||jjkr6Pq6W|||fS)Nr#r)	r	r�rr�r
rrr
r�)rDrr�r�r+rjrsrlrGrGrHr��sz_iterinfo.wdaysetcCs:dg|j}tj|||�j�|j}|||<|||dfS)Nr)r	r�rr�r
)rDrr�r�r+rjrGrGrHr��sz_iterinfo.ddaysetc	CsPg}|j}x8|jD].}x(|jD]}|jtj||||jd��qWqW|j�|S)N)r�)r
r�r�rhr�r�r�r�)rDr�r�r��tsetr(rGrGrHr��sz_iterinfo.htimesetcCs@g}|j}x(|jD]}|jtj||||jd��qW|j�|S)N)r�)r
r�rhr�r�r�r�)rDr�r�r�r,r(rGrGrHr�sz_iterinfo.mtimesetcCstj||||jjd�fS)N)r�)r�r�r
r�)rDr�r�r�rGrGrHr�s
z_iterinfo.stimesetN)
rIrJrKrrCr�r�r�r�r�r�rrrGrGrGrHr�>s
r�csjeZdZdZGdd�de�Zd�fdd�	Zedd��Zed	d
��Z	edd��Z
ed
d��Zdd�Z�Z
S)raL The rruleset type allows more complex recurrence setups, mixing
    multiple rules, dates, exclusion rules, and exclusion dates. The type
    constructor takes the following keyword arguments:

    :param cache: If True, caching of results will be enabled, improving
                  performance of multiple queries considerably. c@s@eZdZdd�Zdd�ZeZdd�Zdd�Zd	d
�Zdd�Z	d
S)zrruleset._genitemcCs>yt|�|_|j|�Wntk
r,YnX||_||_dS)N)rr}rhri�genlistrk)rDr-rkrGrGrHrC
s
zrruleset._genitem.__init__cCs^yt|j�|_WnHtk
rX|jd|kr<tj|j�n|jj|�tj|j�YnXdS)Nr)	rrkr}rir-�heapq�heappop�remove�heapify)rDrGrGrH�__next__szrruleset._genitem.__next__cCs|j|jkS)N)r})rD�otherrGrGrH�__lt__szrruleset._genitem.__lt__cCs|j|jkS)N)r})rDr3rGrGrH�__gt__"szrruleset._genitem.__gt__cCs|j|jkS)N)r})rDr3rGrGrH�__eq__%szrruleset._genitem.__eq__cCs|j|jkS)N)r})rDr3rGrGrH�__ne__(szrruleset._genitem.__ne__N)
rIrJrKrCr2�nextr4r5r6r7rGrGrGrH�_genitem	s	
r9Fcs,tt|�j|�g|_g|_g|_g|_dS)N)rBrrC�_rrule�_rdate�_exrule�_exdate)rDr^)rFrGrHrC+s
zrruleset.__init__cCs|jj|�dS)z\ Include the given :py:class:`rrule` instance in the recurrence set
            generation. N)r:rh)rDr
rGrGrHr
2szrruleset.rrulecCs|jj|�dS)z_ Include the given :py:class:`datetime` instance in the recurrence
            set generation. N)r;rh)rD�rdaterGrGrHr>8szrruleset.rdatecCs|jj|�dS)z� Include the given rrule instance in the recurrence set exclusion
            list. Dates which are part of the given recurrence rules will not
            be generated, even if some inclusive rrule or rdate matches them.
        N)r<rh)rD�exrulerGrGrHr?>szrruleset.exrulecCs|jj|�dS)z� Include the given datetime instance in the recurrence set
            exclusion list. Dates included that way will not be generated,
            even if some inclusive rrule or rdate matches them. N)r=rh)rD�exdaterGrGrHr@Fszrruleset.exdateccslg}|jj�|j|t|j��x$dd�|jD�D]}|j||�q2Wg}|jj�|j|t|j��x$dd�|jD�D]}|j||�qxWd}d}tj|�tj|�x�|�r`|d}|s�||j	k�r:xB|o�|d|k�r|d}t
|�|o�|d|kr�tj||�q�W|�s$||dk�r4|d7}|j	V|j	}t
|�|r�|d|kr�tj||�q�W||_dS)NcSsg|]}t|��qSrG)r_)rNrOrGrGrHr�Qsz"rruleset._iter.<locals>.<listcomp>cSsg|]}t|��qSrG)r_)rNrOrGrGrHr�Vsrr)
r;r�r9r_r:r=r<r.r1r}r�heapreplacer])rDZrlistrkZexlistZlastdtrZritemZexitemrGrGrHr`Ms<



zrruleset._iter)F)rIrJrKrL�objectr9rCrWr
r>r?r@r`rMrGrG)rFrHrs"c@s�eZdZeeeeeee	d�Z
dddddddd	�Zd
d�Zdd
�Z
eZeZe
Ze
Ze
Ze
Ze
Ze
Ze
Ze
Ze
Zdd�Zdd�Zdd�Zdd�ZeZddd�Zddd�Zdd�Z dS) �	_rrulestr)r
rrrrrrrrrrrr!r")rrrrrrrcKst|�||j�<dS)N)�int�lower)rD�rrkwargsr�r�rSrGrGrH�_handle_int{sz_rrulestr._handle_intcKs dd�|jd�D�||j�<dS)NcSsg|]}t|��qSrG)rD)rNrOrGrGrHr�sz._rrulestr._handle_int_list.<locals>.<listcomp>r�)�splitrE)rDrFr�r�rSrGrGrH�_handle_int_list~sz_rrulestr._handle_int_listcKs|j||d<dS)Nr�)�	_freq_map)rDrFr�r�rSrGrGrH�_handle_FREQ�sz_rrulestr._handle_FREQcKsVtsddlmay$tj||jd�|jd�d�|d<Wntk
rPtd��YnXdS)Nr)�parser�ignoretz�tzinfos)rMrNr�zinvalid until date)rLr��parser�rA)rDrFr�r�rSrGrGrH�
_handle_UNTIL�sz_rrulestr._handle_UNTILcKs|j||d<dS)Nr�)�_weekday_map)rDrFr�r�rSrGrGrH�_handle_WKST�sz_rrulestr._handle_WKSTcKs�g}x�|jd�D]�}d|krD|jd�}|d}t|ddd	��}	n^t|�r�x"tt|��D]}
||
dkrZPqZW|d|
�p~d}	||
d�}|	r�t|	�}	ntd��|jt|j||	��qW||d<dS)
z:
        Two ways to specify this: +1MO or MO(+1)
        r��(rrNz+-0123456789z$Invalid (empty) BYDAY specification.r�r)rHrDrgrrArh�weekdaysrQ)rDrFr�r�rSr�r�Zsplt�wrErjrGrGrH�_handle_BYWEEKDAY�s"

z_rrulestr._handle_BYWEEKDAYNFc
Cs�|jd�dkr.|jd�\}}|dkr2td��n|}i}x�|jd�D]�}	|	jd�\}}|j�}|j�}y t|d|�|||||d�WqBtk
r�td	|��YqBttfk
r�td
||f��YqBXqBWtf||d�|��S)
N�:r�RRULEzunknown parameter namer��=Z_handle_)rMrNzunknown parameter '%s'zinvalid '%s': %s)r�r^r)�findrHrA�upper�getattr�AttributeError�KeyErrorr
)
rD�liner�r^rMrNr�r�rFZpairrGrGrH�_parse_rfc_rrule�s&
z_rrulestr._parse_rfc_rrulec	Cs�|rd}d}|j�}|j�s$td��|r�|j�}	d}
xr|
t|	�kr�|	|
j�}|sZ|	|
=q6|
dkr�|ddkr�|	|
d|dd�7<|	|
=q6|
d7}
q6Wn|j�}	|r�t|	�dkr�|jd�dks�|jd�r�|j	|	d||||d�Sg}g}
g}g}�x�|	D�]�}|�s�q|jd�dk�r,d	}|}n|jdd�\}}|jd
�}|�sTtd��|d}|dd�}|d	k�r�x|D]}td|���qxW|j
|��q|d
k�r�x$|D]}|dk�r�td|���q�W|
j
|�n�|dk�r
x|D]}td|���q�W|j
|�n�|dk�rFx$|D]}|dk�rtd|���qW|j
|�nV|dk�r�x|D]}td|���qVWt�s~ddlmatj
|||d�}ntd|���qW|�s�t|�dk�s�|
�s�|�s�|�r�t�r�|
�s�|�r�ddlmat|d�}x&|D]}|j|j	||||d���q�Wx:|
D]2}x*|jd�D]}|jtj
|||d���q0W�q Wx&|D]}|j|j	||||d���q\Wx:|D]2}x*|jd�D]}|jtj
|||d���q�W�q�W|�r�|�r�|j|�|S|j	|d||||d�SdS)NTzempty stringr� rrWzRRULE:)r^r�rMrNrXr�zempty property namezunsupported RRULE parm: ZRDATEzVALUE=DATE-TIMEzunsupported RDATE parm: ZEXRULEzunsupported EXRULE parm: ZEXDATEzunsupported EXDATE parm: ZDTSTARTzunsupported DTSTART parm: )rL)rMrNzunsupported property: )r^)r�rMrNr�)r�r^rMrNrr)r[�striprA�
splitlinesrg�rstriprHrZ�
startswithr`rhrLr�rOrr
r>r?r@)rDr�r�r^ZunfoldZforcesetZ
compatiblerMrN�linesrjr_Z	rrulevalsZ	rdatevalsZ
exrulevalsZ
exdatevalsr�r�ZparmsZparmZrsetZdatestrrGrGrH�
_parse_rfc�s�	 























z_rrulestr._parse_rfccKs|j|f|�S)N)rg)rDr�rSrGrGrH�__call__Dsz_rrulestr.__call__)NFFN)NFFFFFN)!rIrJrKr
rrrrrrrJrQrGrIZ_handle_INTERVALZ
_handle_COUNTZ_handle_BYSETPOSZ_handle_BYMONTHZ_handle_BYMONTHDAYZ_handle_BYYEARDAYZ_handle_BYEASTERZ_handle_BYWEEKNOZ_handle_BYHOURZ_handle_BYMINUTEZ_handle_BYSECONDrKrPrRrVZ
_handle_BYDAYr`rgrhrGrGrGrHrCnsN

irCi��i��i��)
rrr*r+r,r-r.r/r0r1r2r3r4)
rrr5r6r7r8r9r:r;r<r=r>r?)>rLrqr�r�ruZmathr�ImportErrorZ	fractionsZsixrrZ	six.movesrrr.Z_commonrZweekdaybase�warningsr	�__all__r�r$rprZM29ZM30ZM31r%r r&r!r'r#r"r�r
rrrrrrr�rLrrrrrrrrTrWrBrXr
r�rrCrrGrGrGrH�<module>sl�.@.@(
TDm[site-packages/dateutil/__pycache__/_version.cpython-36.opt-1.pyc000064400000000506147511334560020532 0ustar003

6�cY��@s.dZdZdZdZeeefZdjeee��ZdS)z2
Contains information about the dateutil version.
����.N)	�__doc__Z
VERSION_MAJORZ
VERSION_MINORZ
VERSION_PATCHZ
VERSION_TUPLE�join�map�str�VERSION�r
r
�/usr/lib/python3.6/_version.py�<module>s

site-packages/dateutil/__pycache__/parser.cpython-36.pyc000064400000102174147511334560017247 0ustar003

6�cY���@sdZddlmZddlZddlZddlZddlZddlZddlm	Z	ddl
mZddlm
Z
mZmZddlmZdd	lmZd
dgZGdd
�d
e�ZGdd�de�ZGdd�de�ZGdd�de�ZGdd�de�Ze�Zddd
�ZGdd�de�Ze�Zdd�Zdd�Z dS)a�
This module offers a generic date/time string parser which is able to parse
most known formats to represent a date and/or time.

This module attempts to be forgiving with regards to unlikely input formats,
returning a datetime object even for dates which are ambiguous. If an element
of a date/time stamp is omitted, the following rules are applied:
- If AM or PM is left unspecified, a 24-hour clock is assumed, however, an hour
  on a 12-hour clock (``0 <= hour <= 12``) *must* be specified if AM or PM is
  specified.
- If a time zone is omitted, a timezone-naive datetime is returned.

If any other elements are missing, they are taken from the
:class:`datetime.datetime` object passed to the parameter ``default``. If this
results in a day number exceeding the valid number of days per month, the
value falls back to the end of the month.

Additional resources about date/time string formats can be found below:

- `A summary of the international standard date and time notation
  <http://www.cl.cam.ac.uk/~mgk25/iso-time.html>`_
- `W3C Date and Time Formats <http://www.w3.org/TR/NOTE-datetime>`_
- `Time Formats (Planetary Rings Node) <http://pds-rings.seti.org/tools/time_formats.html>`_
- `CPAN ParseDate module
  <http://search.cpan.org/~muir/Time-modules-2013.0912/lib/Time/ParseDate.pm>`_
- `Java SimpleDateFormat Class
  <https://docs.oracle.com/javase/6/docs/api/java/text/SimpleDateFormat.html>`_
�)�unicode_literalsN)�StringIO)�
monthrange)�	text_type�binary_type�
integer_types�)�
relativedelta)�tz�parse�
parserinfoc@sneZdZejd�Zdd�Zdd�Zdd�Zdd	�Z	d
d�Z
edd
��Zedd��Z
edd��Zedd��ZdS)�_timelexz([.,])cCsdt|t�r|j�}t|t�r$t|�}t|dd�dkrHtdj|jj	d���||_
g|_g|_d|_
dS)N�readz8Parser must be a string or character stream, not {itype})ZitypeF)�
isinstancer�decoderr�getattr�	TypeError�format�	__class__�__name__�instream�	charstack�
tokenstack�eof)�selfr�r�/usr/lib/python3.6/parser.py�__init__4s

z_timelex.__init__cCs�|jr|jjd�Sd}d}d}�x�|j�s|jr>|jjd�}n&|jjd�}x|dkrb|jjd�}qLW|srd|_Pq"|s�|}|j|�r�d}n$|j|�r�d}n|j|�r�d	}PnPq"|dk�r�d}|j|�r�||7}n$|d
kr�||7}d}n|jj	|�Pq"|dk�rX|j|��r||7}n>|d
k�s:|dk�rHt
|�d
k�rH||7}d}n|jj	|�Pq"|dk�r�d}|d
k�s||j|��r�||7}n6|j|��r�|dd
k�r�||7}d}n|jj	|�Pq"|dkr"|d
k�s�|j|��r�||7}q"|j|��r|dd
k�r||7}d}q"|jj	|�Pq"W|dk�r�|�sN|jd
�dk�sN|ddk�r�|jj
|�}|d}x(|dd�D]}|�rp|jj	|��qpW|dk�r�|jd
�dk�r�|jdd
�}|S)a�
        This function breaks the time string into lexical units (tokens), which
        can be parsed by the parser. Lexical units are demarcated by changes in
        the character set, so any continuous string of letters is considered
        one unit, any continuous string of numbers is considered one unit.

        The main complication arises from the fact that dots ('.') can be used
        both as separators (e.g. "Sep.20.2009") or decimal points (e.g.
        "4:30:21.447"). As such, it is necessary to read the full context of
        any dot-separated strings before breaking it into tokens; as such, this
        function maintains a "token stack", for when the ambiguous context
        demands that multiple tokens be parsed at once.
        rFNr�T�a�0� �.�a.�,��0.z.,���r')r#r&r')r�poprrrr�isword�isnum�isspace�append�len�count�_split_decimal�split�replace)rZseenletters�token�state�nextchar�l�tokrrr�	get_tokenDs�








"


 z_timelex.get_tokencCs|S)Nr)rrrr�__iter__�sz_timelex.__iter__cCs|j�}|dkrt�|S)N)r7�
StopIteration)rr2rrr�__next__�sz_timelex.__next__cCs|j�S)N)r:)rrrr�next�sz
_timelex.nextcCst||��S)N)�list)�cls�srrrr0�sz_timelex.splitcCs|j�S)z5 Whether or not the next character is part of a word )�isalpha)r=r4rrrr)�sz_timelex.iswordcCs|j�S)z0 Whether the next character is part of a number )�isdigit)r=r4rrrr*�sz_timelex.isnumcCs|j�S)z* Whether the next character is whitespace )r+)r=r4rrrr+�sz_timelex.isspaceN)r�
__module__�__qualname__�re�compiler/rr7r8r:r;�classmethodr0r)r*r+rrrrr
0s
mr
c@s,eZdZdd�Zdd�Zdd�Zdd�Zd	S)
�_resultbasecCs x|jD]}t||d�qWdS)N)�	__slots__�setattr)r�attrrrrr�sz_resultbase.__init__cCsNg}x6|jD],}t||�}|dk	r|jd|t|�f�qWd|dj|�fS)Nz%s=%sz%s(%s)z, )rGrr,�repr�join)rZ	classnamer5rI�valuerrr�_repr�s
z_resultbase._reprcst�fdd��jD��S)Nc3s|]}t�|�dk	VqdS)N)r)�.0rI)rrr�	<genexpr>�sz&_resultbase.__len__.<locals>.<genexpr>)�sumrG)rr)rr�__len__�sz_resultbase.__len__cCs|j|jj�S)N)rMrr)rrrr�__repr__�sz_resultbase.__repr__N)rrArBrrMrQrRrrrrrF�srFc@s�eZdZdZdddddddd	d
ddd
ddddddgZdcdddedfdgdhdigZdjdkdldmdndodpdqdrdsdtdugZdvdwdxgZdydzgZdFdGdHgZ	dgZ
iZd{dJdK�ZdLdM�Z
dNdO�ZdPdQ�ZdRdS�ZdTdU�ZdVdW�ZdXdY�ZdZd[�Zd\d]�Zd|d^d_�Zd`da�ZdbS)}ra�
    Class which handles what inputs are accepted. Subclass this to customize
    the language and acceptable values for each parameter.

    :param dayfirst:
            Whether to interpret the first value in an ambiguous 3-integer date
            (e.g. 01/05/09) as the day (``True``) or month (``False``). If
            ``yearfirst`` is set to ``True``, this distinguishes between YDM
            and YMD. Default is ``False``.

    :param yearfirst:
            Whether to interpret the first value in an ambiguous 3-integer date
            (e.g. 01/05/09) as the year. If ``True``, the first number is taken
            to be the year, otherwise the last number is taken to be the year.
            Default is ``False``.
    r!r"r$�;�-�/�'ZatZon�andZad�m�tZof�stZndZrdZth�Mon�Monday�Tue�Tuesday�Wed�	Wednesday�Thu�Thursday�Fri�Friday�Sat�Saturday�Sun�Sunday�Jan�January�Feb�February�Mar�March�Apr�April�May�Jun�June�Jul�July�Aug�August�Sep�Sept�	September�Oct�October�Nov�November�Dec�December�h�hour�hours�minute�minutesr>�second�seconds�amr�pm�p�UTCZGMT�ZFcCs�|j|j�|_|j|j�|_|j|j�|_|j|j�|_|j|j	�|_
|j|j�|_|j|j
�|_||_||_tj�j|_|jdd|_dS)N�d)�_convert�JUMP�_jump�WEEKDAYS�	_weekdays�MONTHS�_months�HMS�_hms�AMPM�_ampm�UTCZONE�_utczone�PERTAIN�_pertain�dayfirst�	yearfirst�timeZ	localtimeZtm_year�_year�_century)rr�r�rrrrszparserinfo.__init__cCsPi}xFt|�D]:\}}t|t�r<x&|D]}|||j�<q&Wq|||j�<qW|S)N)�	enumerater�tuple�lower)rZlstZdct�i�vrrrr�*s

zparserinfo._convertcCs|j�|jkS)N)r�r�)r�namerrr�jump4szparserinfo.jumpcCsHt|�tdd�|jj�D��krDy|j|j�Stk
rBYnXdS)Ncss|]}t|�VqdS)N)r-)rN�nrrrrO8sz%parserinfo.weekday.<locals>.<genexpr>)r-�minr��keysr��KeyError)rr�rrr�weekday7s zparserinfo.weekdaycCsLt|�tdd�|jj�D��krHy|j|j�dStk
rFYnXdS)Ncss|]}t|�VqdS)N)r-)rNr�rrrrO@sz#parserinfo.month.<locals>.<genexpr>r)r-r�r�r�r�r�)rr�rrr�month?s zparserinfo.monthcCs(y|j|j�Stk
r"dSXdS)N)r�r�r�)rr�rrr�hmsGszparserinfo.hmscCs(y|j|j�Stk
r"dSXdS)N)r�r�r�)rr�rrr�ampmMszparserinfo.ampmcCs|j�|jkS)N)r�r�)rr�rrr�pertainSszparserinfo.pertaincCs|j�|jkS)N)r�r�)rr�rrr�utczoneVszparserinfo.utczonecCs||jkrdS|jj|�S)Nr)r��TZOFFSET�get)rr�rrr�tzoffsetYs
zparserinfo.tzoffsetcCsJ|dkrF|rF||j7}t||j�dkrF||jkr>|d7}n|d8}|S)Nr��2)r��absr�)r�year�century_specifiedrrr�convertyear_s


zparserinfo.convertyearcCsl|jdk	r|j|j|j�|_|jdkr.|js8|jdkrFd|_d|_n"|jdkrh|jrh|j|j�rhd|_dS)Nrr�r�T)r�r�r�r��tznamer�)r�resrrr�validateis
zparserinfo.validateN)r[r\)r]r^)r_r`)rarb)rcrd)rerf)rgrh)rirj)rkrl)rmrn)rorp)rqrq)rrrs)rtru)rvrw)rxryrz)r{r|)r}r~)rr�)r�r�r�)rXr�r�)r>r�r�)r�r)r�r�)FF)F)rrArB�__doc__r�r�r�r�r�r�r�r�rr�r�r�r�r�r�r�r�r�r�r�rrrrr�sV




csPeZdZ�fdd�Zedd��Zedd��Zdd�Z�fd	d
�Zdd�Z	�Z
S)
�_ymdcs$t|j|�j||�d|_||_dS)NF)�superrrr��tzstr)rr��args�kwargs)rrrrwsz
_ymd.__init__cCs&yt|�|kStk
r dSXdS)NF)�int�
ValueError)r2r�rrr�token_could_be_year|sz_ymd.token_could_be_yearcs�fdd�|D�S)Ncsg|]}tj|��r|�qSr)r�r�)rNr2)r�rr�
<listcomp>�sz3_ymd.find_potential_year_tokens.<locals>.<listcomp>r)r��tokensr)r�r�find_potential_year_tokens�sz_ymd.find_potential_year_tokenscCsFx@t|�D]4\}}tj||�}t|�dkr
t|d�dkr
|Sq
WdS)zk
        attempt to deduce if a pre 100 year was lost
         due to padded zeros being taken off
        rrr%N)r�r�r�r-)rr��indexr2Zpotential_year_tokensrrr�find_probable_year_index�sz_ymd.find_probable_year_indexcsNt|d�r&|j�r4t|�dkr4d|_n|dkr4d|_t|j|�jt|��dS)NrQr%Tr�)�hasattrr@r-r�r�rr,r�)r�val)rrrr,�s
z_ymd.appendcCs,t|�}d\}}}|dkr&td���n�|dks>|d	kr�|dkr�|d
krT||}||=|dksd|dkr�|ddkrz|d}n|d}�n�|dkr�|ddkr�|\}}n8|ddkr�|\}}n"|r�|ddkr�|\}}n|\}}�nB|dk�r"|dk�r|\}}}�n |dk�rF|ddk�s.|�r:|ddk�r:|\}}}n
|\}}}n�|dk�rv|ddk�rj|\}}}n
|\}}}n�|ddk�s�|jtj|j��dk�s�|�r�|ddk�r�|ddk�r�|�r�|ddk�r�|\}}}n
|\}}}n8|ddk�s|�r|ddk�r|\}}}n
|\}}}|||fS)N�zMore than three YMD valuesrr%r��)NNNr'r'r')r-r�r�r
r0r�)r�mstridxr�r�Zlen_ymdr�r��dayrrr�resolve_ymd�sR







"
""
z_ymd.resolve_ymd)rrArBr�staticmethodr�r�r�r,r��
__classcell__rr)rrr�vs
	r�c@sFeZdZd
dd�Zddd�ZGdd�de�Zdd	d
�Zedd��Z	dS)�parserNcCs|pt�|_dS)N)r�info)rr�rrrr�szparser.__init__FcKsV|dkr tjj�jddddd�}|j|f|�\}}|dkrBtd��t|�dkrVtd��i}x&dD]}	t||	�}
|
dk	r`|
||	<q`Wd|kr�|jdkr�|jn|j}|jdkr�|jn|j}|j	dkr�|j	n|j	}
|
t
||�d
kr�t
||�d
|d<|jf|�}|jdk	�r$|j	�r$|tj|jd�}|�s8t
|tj��sJ|�r�|j|k�r�t
|tj��rh||j|j�}n|j|j�}t
|tj��r�|}n<t
|t��r�tj|�}n$t
|t��r�tj|j|�}ntd��|j|d�}nf|j�r�|jtjk�r�|jtj�d�}n>|jdk�r|jtj�d�}n |j�r8|jtj|j|j�d�}|jdd��rN||fS|SdS)aV

        Parse the date/time string into a :class:`datetime.datetime` object.

        :param timestr:
            Any date/time string using the supported formats.

        :param default:
            The default datetime object, if this is a datetime object and not
            ``None``, elements specified in ``timestr`` replace elements in the
            default object.

        :param ignoretz:
            If set ``True``, time zones in parsed strings are ignored and a
            naive :class:`datetime.datetime` object is returned.

        :param tzinfos:
            Additional time zone names / aliases which may be present in the
            string. This argument maps time zone names (and optionally offsets
            from those time zones) to time zones. This parameter can be a
            dictionary with timezone aliases mapping time zone names to time
            zones or a function taking two parameters (``tzname`` and
            ``tzoffset``) and returning a time zone.

            The timezones to which the names are mapped can be an integer
            offset from UTC in minutes or a :class:`tzinfo` object.

            .. doctest::
               :options: +NORMALIZE_WHITESPACE

                >>> from dateutil.parser import parse
                >>> from dateutil.tz import gettz
                >>> tzinfos = {"BRST": -10800, "CST": gettz("America/Chicago")}
                >>> parse("2012-01-19 17:21:00 BRST", tzinfos=tzinfos)
                datetime.datetime(2012, 1, 19, 17, 21, tzinfo=tzoffset(u'BRST', -10800))
                >>> parse("2012-01-19 17:21:00 CST", tzinfos=tzinfos)
                datetime.datetime(2012, 1, 19, 17, 21,
                                  tzinfo=tzfile('/usr/share/zoneinfo/America/Chicago'))

            This parameter is ignored if ``ignoretz`` is set.

        :param **kwargs:
            Keyword arguments as passed to ``_parse()``.

        :return:
            Returns a :class:`datetime.datetime` object or, if the
            ``fuzzy_with_tokens`` option is ``True``, returns a tuple, the
            first element being a :class:`datetime.datetime` object, the second
            a tuple containing the fuzzy tokens.

        :raises ValueError:
            Raised for invalid or unknown string format, if the provided
            :class:`tzinfo` is not in a valid format, or if an invalid date
            would be created.

        :raises TypeError:
            Raised for non-string or character stream input.

        :raises OverflowError:
            Raised if the parsed date exceeds the largest valid C integer on
            your system.
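A sketch of how the default argument fills in fields that the string omits (the default date is arbitrary):

from datetime import datetime
from dateutil.parser import parse

print(parse("10:36", default=datetime(2003, 9, 25)))   # 2003-09-25 10:36:00
print(parse("Sep 25", default=datetime(2003, 1, 1)))   # 2003-09-25 00:00:00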
        Nr)r�r�r��microsecondzUnknown string formatzString does not contain a date.r�r�r�r�r�r�r�r)r�z9Offset must be tzinfo subclass, tz string, or int offset.)�tzinfo�fuzzy_with_tokensF)r�r�r�r�r�r�r�)�datetimeZnowr1�_parser�r-rr�r�r�rr�r	r�collections�Callabler�r�r�r�rr
r�rr�ZtzlocalZtzutc)r�timestr�defaultZignoretzZtzinfosr�r��skipped_tokens�replrIrLZcyearZcmonthZcday�retZtzdatar�rrrr�s\?

zparser.parsec@s&eZdZddddddddd	d
dgZdS)
zparser._resultr�r�r�r�r�r�r�r�r�r�r�N)rrArBrGrrrr�_resultisr�cCst
|rd}|j}|dkr|j}|dkr*|j}|j�}tj|�}d }	t�}
�y�t|�}d!}t|�}
d}�xt||
k�r�y||}t	|�}Wnt
k
r�d}YnX|dk	�r�t||�}|d7}t|�dk�rH|d"k�rH|jdk�rH||
k�s||dk�rH|j||�dk�rH||d}t
|dd��|_|dk�r�t
|dd��|_qf|d	k�st|d	k�r||djd
�d	k�r||d}|�r�||djd
�d#k�r�|j|dd��|j|dd��|j|dd��n<t
|dd��|_t
|dd��|_t|dd��\|_|_qf|d$k�r�||d}|j|dd��|j|dd	��|j|d	d��|dk�r�t
|dd��|_t
|dd��|_|dk�r�t
|dd��|_qf||
k�r�|j||�dk	�s|d|
k�r0||dk�r0|j||d�dk	�r0||dk�r|d7}|j||�}�x�|dk�rZt
|�|_|d�r�t
d|d�|_nL|dk�r�t
|�|_|d�r�t
d|d�|_n|dk�r�t|�\|_|_|d7}||
k�s�|dk�r�Py||}t	|�}Wnt
k
�r�PYn8X|d7}|d7}||
k�r(|j||�}|dk	�r(|}�q(Wqf||
k�r�||ddk�r�|j||d�dk	�r�|j||d�}|dk�r�t
|�|_|d}|�r�t
d|�|_n|dk�r�t|�\|_|_qf|d|
k�rf||dk�rft
|�|_|d7}t	||�}t
|�|_|d�r$t
d|d�|_|d7}||
k�r�||dk�r�t||d�\|_|_|d7}qf||
k�r�||d%k�r�||}|j|�|d7}||
k�r�|j||��r�y|j||�WnXt
k
�r|j||�}|dk	�r|j|�|d&k�st�t|�d}nd'SYnX|d7}||
k�r�|||k�r�|d7}|j||�}|dk	�r�|j|�t|�d}|d(k�s�t�n|j||�|d7}qf||
k�s�|j||��rd|d|
k�rP|j||d�dk	�rPt
|�|_|jdk�r|j||d�dk�r|jd7_n*|jdk�rF|j||d�dk�rFd|_|d7}n
|j|�|d7}qf|j||�dk	�r�t
|�|_|jdk�r�|j||�dk�r�|jd7_n&|jdk�r�|j||�dk�r�d|_|d7}qf|�s�d)S|d7}qf|j||�}|dk	�r||_|d7}qf|j||�}|dk	�	rd|j|�|d*k�sNt�t|�d}|d7}||
krf||d+k�r�||}|d7}|j||�|d7}||
k�	rb|||k�	rb|d7}|j||�|d7}qf|d|
krf||||dk�	odknrf|j||d�rfyt
||d�}Wnt
k
�	rDYnX|jt|j|���|d7}qf|j||�}|dk	�
rZd}|�	r�|jdk	�	r�d}|jdk�	r�|�	r�d}nt
d,��n2d|jk�	o�dkn�	s�|�	r�d}nt
d-��|�
r:|dk�
r|jdk�
r|jd7_n|dk�
r2|jdk�
r2d|_||_n|�
rP|j|
|	||�}	|d7}qf|jdk	�rt||�dk�r|jdk�r|jdk�rdd�||D��r|||_|j|j�|_|d7}||
krf||d.krfd/||dk||<d|_|j|j�rfd|_qf|jdk	�r�||d0k�r�d2||dk}|d7}t||�}|dk�r�t
||dd��dt
||dd��d|_nz|d|
k�r�||ddk�r�t
||�dt
||d�d|_|d7}n*|dk�r�t
||dd��d|_nd3S|d7}|j|9_|d|
krf|j||�rf||ddkrf||ddkrfdt||d�k�ondknrfdd�||dD�rf||d|_|d7}qf|j||��p�|�s�d4S|j|
|	||�}	|d7}qfW|j |||�\}}}|dk	�
r||_!|j"|_"|dk	�
r||_|dk	�
r&||_#Wnt$t
tfk
�
rDd5SX|j%|��
sVd6S|�
rh|t&|
�fS|dfSdS)7a
        Private method which performs the heavy lifting of parsing, called from
        ``parse()``, which passes on its ``kwargs`` to this function.

        :param timestr:
            The string to parse.

        :param dayfirst:
            Whether to interpret the first value in an ambiguous 3-integer date
            (e.g. 01/05/09) as the day (``True``) or month (``False``). If
            ``yearfirst`` is set to ``True``, this distinguishes between YDM
            and YMD. If set to ``None``, this value is retrieved from the
            current :class:`parserinfo` object (which itself defaults to
            ``False``).

        :param yearfirst:
            Whether to interpret the first value in an ambiguous 3-integer date
            (e.g. 01/05/09) as the year. If ``True``, the first number is taken
            to be the year, otherwise the last number is taken to be the year.
            If this is set to ``None``, the value is retrieved from the current
            :class:`parserinfo` object (which itself defaults to ``False``).

        :param fuzzy:
            Whether to allow fuzzy parsing, allowing for strings like "Today is
            January 1, 2047 at 8:21:00AM".

        :param fuzzy_with_tokens:
            If ``True``, ``fuzzy`` is automatically set to ``True``, and the parser
            will return a tuple where the first element is the parsed
            :class:`datetime.datetime` timestamp and the second element is
            a tuple containing the portions of the string which were ignored:

            .. doctest::

                >>> from dateutil.parser import parse
                >>> parse("Today is January 1, 2047 at 8:21:00AM", fuzzy_with_tokens=True)
                (datetime.datetime(2047, 1, 1, 8, 21), (u'Today is ', u' ', u'at '))
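        A short, hedged sketch of how the ``dayfirst``/``yearfirst`` switches
        resolve the ambiguous date used in the examples above (the literal
        strings are illustrative, not taken from the compiled code):

            >>> from dateutil.parser import parse
            >>> d1 = parse("01/05/09")                  # month first: 2009-01-05
            >>> d2 = parse("01/05/09", dayfirst=True)   # day first:   2009-05-01
            >>> d3 = parse("01/05/09", yearfirst=True)  # year first:  2001-05-09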

        TNr%rrr���:�r"�r���
r!�<rTrUFzNo hour specified with zAM or PM flag.zInvalid hour specified for z12-hour clock.�cSsg|]}|tjkr|�qSr)�string�ascii_uppercase)rN�xrrrr��sz!parser._parse.<locals>.<listcomp>�+i�(�)cSsg|]}|tjkr|�qSr)r�r�)rNr�rrrr�s���r')r%r�r')r�r�r�)rTrUr"r')NNr')NNr')rTrUz%No hour specified with AM or PM flag.z)Invalid hour specified for 12-hour clock.)r�rT)r�rT)r�rTr')r'r)NN)NN)NN)NN)'r�r�r�r�r
r0r<r�r-�floatr�r�r�r�r��findr,�_parsemsr�r�r�r��AssertionErrorr�r�r��strr��_skip_tokenr�r�r�r�r�r�r��
IndexErrorr�r�)rr�r�r�Zfuzzyr�r�r�r5�last_skipped_token_ir�Zymdr��len_lr�Z
value_reprrL�len_lir>�idxZnewidxZ
sec_remainder�sepZval_is_ampm�signalr�r�r�rrrr�ns)


$
, 

























&
$$



  




2



4 &

.&




z
parser._parsecCs8||dkr"|d||7<n|j||�|}|S)Nrr')r,)r�r�r�r5rrrr�-s
zparser._skip_token)N)NFN)NNFF)
rrArBrrrFr�r�r�r�rrrrr��s


Ar�cKs(|rt|�j|f|�Stj|f|�SdS)a)

    Parse a string in one of the supported formats, using the
    ``parserinfo`` parameters.

    :param timestr:
        A string containing a date/time stamp.

    :param parserinfo:
        A :class:`parserinfo` object containing parameters for the parser.
        If ``None``, the default arguments to the :class:`parserinfo`
        constructor are used.

    The ``**kwargs`` parameter takes the following keyword arguments:

    :param default:
        The default datetime object. If this is a datetime object and not
        ``None``, elements specified in ``timestr`` replace elements in the
        default object.

    :param ignoretz:
        If set ``True``, time zones in parsed strings are ignored and a naive
        :class:`datetime` object is returned.

    :param tzinfos:
            Additional time zone names / aliases which may be present in the
            string. This argument maps time zone names (and optionally offsets
            from those time zones) to time zones. This parameter can be a
            dictionary with timezone aliases mapping time zone names to time
            zones or a function taking two parameters (``tzname`` and
            ``tzoffset``) and returning a time zone.

            The timezones to which the names are mapped can be an integer
            offset from UTC in seconds or a :class:`tzinfo` object.

            .. doctest::
               :options: +NORMALIZE_WHITESPACE

                >>> from dateutil.parser import parse
                >>> from dateutil.tz import gettz
                >>> tzinfos = {"BRST": -10800, "CST": gettz("America/Chicago")}
                >>> parse("2012-01-19 17:21:00 BRST", tzinfos=tzinfos)
                datetime.datetime(2012, 1, 19, 17, 21, tzinfo=tzoffset(u'BRST', -10800))
                >>> parse("2012-01-19 17:21:00 CST", tzinfos=tzinfos)
                datetime.datetime(2012, 1, 19, 17, 21,
                                  tzinfo=tzfile('/usr/share/zoneinfo/America/Chicago'))

            This parameter is ignored if ``ignoretz`` is set.

    :param dayfirst:
        Whether to interpret the first value in an ambiguous 3-integer date
        (e.g. 01/05/09) as the day (``True``) or month (``False``). If
        ``yearfirst`` is set to ``True``, this distinguishes between YDM and
        YMD. If set to ``None``, this value is retrieved from the current
        :class:`parserinfo` object (which itself defaults to ``False``).

    :param yearfirst:
        Whether to interpret the first value in an ambiguous 3-integer date
        (e.g. 01/05/09) as the year. If ``True``, the first number is taken to
        be the year, otherwise the last number is taken to be the year. If
        this is set to ``None``, the value is retrieved from the current
        :class:`parserinfo` object (which itself defaults to ``False``).

    :param fuzzy:
        Whether to allow fuzzy parsing, allowing for strings like "Today is
        January 1, 2047 at 8:21:00AM".

    :param fuzzy_with_tokens:
        If ``True``, ``fuzzy`` is automatically set to ``True``, and the parser
        will return a tuple where the first element is the parsed
        :class:`datetime.datetime` timestamp and the second element is
        a tuple containing the portions of the string which were ignored:

        .. doctest::

            >>> from dateutil.parser import parse
            >>> parse("Today is January 1, 2047 at 8:21:00AM", fuzzy_with_tokens=True)
            (datetime.datetime(2047, 1, 1, 8, 21), (u'Today is ', u' ', u'at '))

    :return:
        Returns a :class:`datetime.datetime` object or, if the
        ``fuzzy_with_tokens`` option is ``True``, returns a tuple, the
        first element being a :class:`datetime.datetime` object, the second
        a tuple containing the fuzzy tokens.

    :raises ValueError:
        Raised for invalid or unknown string format, if the provided
        :class:`tzinfo` is not in a valid format, or if an invalid date
        would be created.

    :raises OverflowError:
        Raised if the parsed date exceeds the largest valid C integer on
        your system.
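    A brief sketch of the ``parserinfo`` hook described above (the strings and
    the ``dayfirst`` setting are ordinary examples, not fixtures from this
    archive):

        >>> from dateutil.parser import parse, parserinfo
        >>> info = parserinfo(dayfirst=True)
        >>> dt = parse("01.05.09", parserinfo=info)     # read as 1 May 2009
        >>> dt_tokens = parse("Today is January 1, 2047 at 8:21:00AM",
        ...                   fuzzy_with_tokens=True)   # (datetime, skipped tokens)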
    N)r�r�
DEFAULTPARSER)r�rr�rrrr<s_c@s$eZdZGdd�de�Zdd�ZdS)�	_tzparserc@s<eZdZddddddgZGdd�de�Zd	d
�Zdd�Zd
S)z_tzparser._result�stdabbr�	stdoffset�dstabbr�	dstoffset�start�endc@seZdZdddddddgZdS)	z_tzparser._result._attrr��weekr��yday�jydayr�r�N)rrArBrGrrrr�_attr�srcCs
|jd�S)N�)rM)rrrrrR�sz_tzparser._result.__repr__cCs"tj|�|j�|_|j�|_dS)N)rFrrrr)rrrrr�s

z_tzparser._result.__init__N)rrArBrGrFrrRrrrrrr��s
r�cCsZ|j�}tj|�}�y$t|�}d}�x�||k�r�|}x(||kr\dd�||D�r\|d7}q6W||k�r�|js�d}dj|||��|_nd}dj|||��|_|}||ko�||dks�||dd
k�r�||dk�r�d||dk}|d7}nd }t||�}	|	dk�rJt||t||dd��d
t||dd��d|�n�|d|k�r�||ddk�r�t||t||�d
t||d�d|�|d7}n4|	dk�r�t||t||dd��d
|�ndS|d7}|j�r�Pq&Pq&W||k�rBx*t	||�D]}||dk�rd||<�qW||dk�s:t
�|d7}||k�rP�n�d|jd�k�ojdkn�r�dd�||d�D��r�x�|j|j
fD]�}
t||�|
_|d7}||d	k�r�t||d�d!}|d7}nt||�}|d7}|�r||
_t||�dd|
_nt||�|
_|d7}t||�|
_|d7}�q�W||k�r6||d"k�r|d$||dk}|d7}nd}|jt||�||_�n�|jd�dk�r6||d�jd�dk�r6dd�||d�D��r6�xF|j|j
fD�]4}
||dk�r|d7}t||�|
_n�||dk�r�|d7}t||�|
_|d7}||d%k�sXt
�|d7}t||�|
_|
jdk�r�d&|
_|d7}||d'k�s�t
�|d7}t||�dd|
_nt||�d|
_|d7}||k�r�||dk�r�|d7}t||�}	|	dk�r>t||dd��d
t||dd��d|
_n�|d|k�r�||ddk�r�t||�d
t||d�d|
_|d7}|d|k�r�||ddk�r�|d7}|
jt||�7_n*|	dk�r�t||dd��d
|
_ndS|d7}||k�s||dk�st
�|d7}�q�W||k�s6t
�Wnttt
fk
�rTdSX|S)(NrcSsg|]}|dkr|�qS)z0123456789:,-+r)rNr�rrrr��sz#_tzparser.parse.<locals>.<listcomp>rrr
rr�rT�
0123456789r�r%ir�r�rSr$r��	cSs*g|]"}|dkr|D]}|dkr|�qqS)r$rr)rNr��yrrrr��s
�rUc
Ss*g|]"}|dkr|D]}|dkr|�qqS)	r$rU�J�Mr"rTr�r)r$rUrrr"rTr�r)rNr�rrrrr�srrr"r�)r�rT)r�rTr')rr'r'r')rTr�r')r'r)rTr"r')rTr")r�r
r0r-rrKrrHr��ranger�r.rrr�r	r�r�r�rrrr
r�r�)rr�r�r5r�r��jZoffattrrr�r�rLrrrr�s�



" *

 

"


(
4 & 
z_tzparser.parseN)rrArBrFr�rrrrrr�srcCs
tj|�S)N)�DEFAULTTZPARSERr)r�rrr�_parsetzQsrcCsFd|krt|�dfS|jd�\}}t|�t|jdd�dd��fSdS)z9Parse a I[.F] seconds value into (seconds, microseconds).r"rr�r N)r�r0�ljust)rLr��frrrr�Usr�)N)!r�Z
__future__rr�r�r�r�rC�iorZcalendarrZsixrrrr
r	r
�__all__�objectr
rFrr<r�r�rrrrrr�rrrr�<module>s<#oX
e.site-packages/dateutil/__pycache__/tzwin.cpython-36.pyc000064400000000205147511334560017116 0ustar003

6�cY;�@sddlTdS)�)�*N)Ztz.win�rr�/usr/lib/python3.6/tzwin.py�<module>ssite-packages/dateutil/__pycache__/easter.cpython-36.pyc000064400000004036147511334560017234 0ustar003

6�cYE
�@s4dZddlZddddgZdZdZd	Zefd
d�ZdS)zx
This module offers a generic easter computing method for any given year, using
Western, Orthodox or Julian algorithms.
�N�easter�
EASTER_JULIAN�EASTER_ORTHODOX�EASTER_WESTERN���cCsld|kodkns td��|}|d}d}|dkr�d|dd}||d|d	}|d
kr�d}|dkr�||d
d|d
dd}n�|d
}||dd|ddd|dd}||dd|dd|dd|d}||d|d
||dd	}|||}	d|	d|	ddd}
d|	dd}tjt|�t|�t|
��S)a�
    This method was ported from the work done by GM Arts,
    on top of the algorithm by Claus Tondering, which was
    based in part on the algorithm of Ouding (1940), as
    quoted in "Explanatory Supplement to the Astronomical
    Almanac", P.  Kenneth Seidelmann, editor.

    This algorithm implements three different easter
    calculation methods:

    1 - Original calculation in Julian calendar, valid in
        dates after 326 AD
    2 - Original method, with date converted to Gregorian
        calendar, valid in years 1583 to 4099
    3 - Revised method, in Gregorian calendar, valid in
        years 1583 to 4099 as well

    These methods are represented by the constants:

    * ``EASTER_JULIAN   = 1``
    * ``EASTER_ORTHODOX = 2``
    * ``EASTER_WESTERN  = 3``

    The default method is method 3.

    More about the algorithm may be found at:

    http://users.chariot.net.au/~gmarts/eastalg.htm

    and

    http://www.tondering.dk/claus/calendar.html
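    A hedged usage sketch of the three methods listed above (no results are
    asserted here; run it to compare the calendars for a given year):

        >>> from dateutil.easter import (easter, EASTER_JULIAN,
        ...                              EASTER_ORTHODOX, EASTER_WESTERN)
        >>> western = easter(2017)                      # default, method 3
        >>> orthodox = easter(2017, EASTER_ORTHODOX)    # method 2
        >>> julian = easter(2017, EASTER_JULIAN)        # method 1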

    rrzinvalid method�r����r�
i@�d���
��������(��)�
ValueError�datetimeZdate�int)Zyear�method�y�g�e�i�j�c�h�p�d�m�r+�/usr/lib/python3.6/easter.pyrs($",0$)�__doc__r�__all__rrrrr+r+r+r,�<module>ssite-packages/dateutil/__pycache__/parser.cpython-36.opt-1.pyc000064400000101672147511334560020210 0ustar003

6�cY���@sdZddlmZddlZddlZddlZddlZddlZddlm	Z	ddl
mZddlm
Z
mZmZddlmZdd	lmZd
dgZGdd
�d
e�ZGdd�de�ZGdd�de�ZGdd�de�ZGdd�de�Ze�Zddd
�ZGdd�de�Ze�Zdd�Zdd�Z dS)a�
This module offers a generic date/time string parser which is able to parse
most known formats to represent a date and/or time.

This module attempts to be forgiving with regards to unlikely input formats,
returning a datetime object even for dates which are ambiguous. If an element
of a date/time stamp is omitted, the following rules are applied:
- If AM or PM is left unspecified, a 24-hour clock is assumed, however, an hour
  on a 12-hour clock (``0 <= hour <= 12``) *must* be specified if AM or PM is
  specified.
- If a time zone is omitted, a timezone-naive datetime is returned.

If any other elements are missing, they are taken from the
:class:`datetime.datetime` object passed to the parameter ``default``. If this
results in a day number exceeding the valid number of days per month, the
value falls back to the end of the month.

Additional resources about date/time string formats can be found below:

- `A summary of the international standard date and time notation
  <http://www.cl.cam.ac.uk/~mgk25/iso-time.html>`_
- `W3C Date and Time Formats <http://www.w3.org/TR/NOTE-datetime>`_
- `Time Formats (Planetary Rings Node) <http://pds-rings.seti.org/tools/time_formats.html>`_
- `CPAN ParseDate module
  <http://search.cpan.org/~muir/Time-modules-2013.0912/lib/Time/ParseDate.pm>`_
- `Java SimpleDateFormat Class
  <https://docs.oracle.com/javase/6/docs/api/java/text/SimpleDateFormat.html>`_
�)�unicode_literalsN)�StringIO)�
monthrange)�	text_type�binary_type�
integer_types�)�
relativedelta)�tz�parse�
parserinfoc@sneZdZejd�Zdd�Zdd�Zdd�Zdd	�Z	d
d�Z
edd
��Zedd��Z
edd��Zedd��ZdS)�_timelexz([.,])cCsdt|t�r|j�}t|t�r$t|�}t|dd�dkrHtdj|jj	d���||_
g|_g|_d|_
dS)N�readz8Parser must be a string or character stream, not {itype})ZitypeF)�
isinstancer�decoderr�getattr�	TypeError�format�	__class__�__name__�instream�	charstack�
tokenstack�eof)�selfr�r�/usr/lib/python3.6/parser.py�__init__4s

z_timelex.__init__cCs�|jr|jjd�Sd}d}d}�x�|j�s|jr>|jjd�}n&|jjd�}x|dkrb|jjd�}qLW|srd|_Pq"|s�|}|j|�r�d}n$|j|�r�d}n|j|�r�d	}PnPq"|dk�r�d}|j|�r�||7}n$|d
kr�||7}d}n|jj	|�Pq"|dk�rX|j|��r||7}n>|d
k�s:|dk�rHt
|�d
k�rH||7}d}n|jj	|�Pq"|dk�r�d}|d
k�s||j|��r�||7}n6|j|��r�|dd
k�r�||7}d}n|jj	|�Pq"|dkr"|d
k�s�|j|��r�||7}q"|j|��r|dd
k�r||7}d}q"|jj	|�Pq"W|dk�r�|�sN|jd
�dk�sN|ddk�r�|jj
|�}|d}x(|dd�D]}|�rp|jj	|��qpW|dk�r�|jd
�dk�r�|jdd
�}|S)a�
        This function breaks the time string into lexical units (tokens), which
        can be parsed by the parser. Lexical units are demarcated by changes in
        the character set, so any continuous string of letters is considered
        one unit, any continuous string of numbers is considered one unit.

        The main complication arises from the fact that dots ('.') can be used
        both as separators (e.g. "Sep.20.2009") or decimal points (e.g.
        "4:30:21.447"). As such, it is necessary to read the full context of
        any dot-separated strings before breaking it into tokens; as such, this
        function maintains a "token stack", for when the ambiguous context
        demands that multiple tokens be parsed at once.
        rFNr�T�a�0� �.�a.�,��0.z.,���r')r#r&r')r�poprrrr�isword�isnum�isspace�append�len�count�_split_decimal�split�replace)rZseenletters�token�state�nextchar�l�tokrrr�	get_tokenDs�








"


 z_timelex.get_tokencCs|S)Nr)rrrr�__iter__�sz_timelex.__iter__cCs|j�}|dkrt�|S)N)r7�
StopIteration)rr2rrr�__next__�sz_timelex.__next__cCs|j�S)N)r:)rrrr�next�sz
_timelex.nextcCst||��S)N)�list)�cls�srrrr0�sz_timelex.splitcCs|j�S)z5 Whether or not the next character is part of a word )�isalpha)r=r4rrrr)�sz_timelex.iswordcCs|j�S)z0 Whether the next character is part of a number )�isdigit)r=r4rrrr*�sz_timelex.isnumcCs|j�S)z* Whether the next character is whitespace )r+)r=r4rrrr+�sz_timelex.isspaceN)r�
__module__�__qualname__�re�compiler/rr7r8r:r;�classmethodr0r)r*r+rrrrr
0s
mr
c@s,eZdZdd�Zdd�Zdd�Zdd�Zd	S)
�_resultbasecCs x|jD]}t||d�qWdS)N)�	__slots__�setattr)r�attrrrrr�sz_resultbase.__init__cCsNg}x6|jD],}t||�}|dk	r|jd|t|�f�qWd|dj|�fS)Nz%s=%sz%s(%s)z, )rGrr,�repr�join)rZ	classnamer5rI�valuerrr�_repr�s
z_resultbase._reprcst�fdd��jD��S)Nc3s|]}t�|�dk	VqdS)N)r)�.0rI)rrr�	<genexpr>�sz&_resultbase.__len__.<locals>.<genexpr>)�sumrG)rr)rr�__len__�sz_resultbase.__len__cCs|j|jj�S)N)rMrr)rrrr�__repr__�sz_resultbase.__repr__N)rrArBrrMrQrRrrrrrF�srFc@s�eZdZdZdddddddd	d
ddd
ddddddgZdcdddedfdgdhdigZdjdkdldmdndodpdqdrdsdtdugZdvdwdxgZdydzgZdFdGdHgZ	dgZ
iZd{dJdK�ZdLdM�Z
dNdO�ZdPdQ�ZdRdS�ZdTdU�ZdVdW�ZdXdY�ZdZd[�Zd\d]�Zd|d^d_�Zd`da�ZdbS)}ra�
    Class which handles what inputs are accepted. Subclass this to customize
    the language and acceptable values for each parameter.

    :param dayfirst:
            Whether to interpret the first value in an ambiguous 3-integer date
            (e.g. 01/05/09) as the day (``True``) or month (``False``). If
            ``yearfirst`` is set to ``True``, this distinguishes between YDM
            and YMD. Default is ``False``.

    :param yearfirst:
            Whether to interpret the first value in an ambiguous 3-integer date
            (e.g. 01/05/09) as the year. If ``True``, the first number is taken
            to be the year, otherwise the last number is taken to be the year.
            Default is ``False``.
    r!r"r$�;�-�/�'ZatZon�andZad�m�tZof�stZndZrdZth�Mon�Monday�Tue�Tuesday�Wed�	Wednesday�Thu�Thursday�Fri�Friday�Sat�Saturday�Sun�Sunday�Jan�January�Feb�February�Mar�March�Apr�April�May�Jun�June�Jul�July�Aug�August�Sep�Sept�	September�Oct�October�Nov�November�Dec�December�h�hour�hours�minute�minutesr>�second�seconds�amr�pm�p�UTCZGMT�ZFcCs�|j|j�|_|j|j�|_|j|j�|_|j|j�|_|j|j	�|_
|j|j�|_|j|j
�|_||_||_tj�j|_|jdd|_dS)N�d)�_convert�JUMP�_jump�WEEKDAYS�	_weekdays�MONTHS�_months�HMS�_hms�AMPM�_ampm�UTCZONE�_utczone�PERTAIN�_pertain�dayfirst�	yearfirst�timeZ	localtimeZtm_year�_year�_century)rr�r�rrrrszparserinfo.__init__cCsPi}xFt|�D]:\}}t|t�r<x&|D]}|||j�<q&Wq|||j�<qW|S)N)�	enumerater�tuple�lower)rZlstZdct�i�vrrrr�*s

zparserinfo._convertcCs|j�|jkS)N)r�r�)r�namerrr�jump4szparserinfo.jumpcCsHt|�tdd�|jj�D��krDy|j|j�Stk
rBYnXdS)Ncss|]}t|�VqdS)N)r-)rN�nrrrrO8sz%parserinfo.weekday.<locals>.<genexpr>)r-�minr��keysr��KeyError)rr�rrr�weekday7s zparserinfo.weekdaycCsLt|�tdd�|jj�D��krHy|j|j�dStk
rFYnXdS)Ncss|]}t|�VqdS)N)r-)rNr�rrrrO@sz#parserinfo.month.<locals>.<genexpr>r)r-r�r�r�r�r�)rr�rrr�month?s zparserinfo.monthcCs(y|j|j�Stk
r"dSXdS)N)r�r�r�)rr�rrr�hmsGszparserinfo.hmscCs(y|j|j�Stk
r"dSXdS)N)r�r�r�)rr�rrr�ampmMszparserinfo.ampmcCs|j�|jkS)N)r�r�)rr�rrr�pertainSszparserinfo.pertaincCs|j�|jkS)N)r�r�)rr�rrr�utczoneVszparserinfo.utczonecCs||jkrdS|jj|�S)Nr)r��TZOFFSET�get)rr�rrr�tzoffsetYs
zparserinfo.tzoffsetcCsJ|dkrF|rF||j7}t||j�dkrF||jkr>|d7}n|d8}|S)Nr��2)r��absr�)r�year�century_specifiedrrr�convertyear_s


zparserinfo.convertyearcCsl|jdk	r|j|j|j�|_|jdkr.|js8|jdkrFd|_d|_n"|jdkrh|jrh|j|j�rhd|_dS)Nrr�r�T)r�r�r�r��tznamer�)r�resrrr�validateis
zparserinfo.validateN)r[r\)r]r^)r_r`)rarb)rcrd)rerf)rgrh)rirj)rkrl)rmrn)rorp)rqrq)rrrs)rtru)rvrw)rxryrz)r{r|)r}r~)rr�)r�r�r�)rXr�r�)r>r�r�)r�r)r�r�)FF)F)rrArB�__doc__r�r�r�r�r�r�r�r�rr�r�r�r�r�r�r�r�r�r�r�rrrrr�sV




csPeZdZ�fdd�Zedd��Zedd��Zdd�Z�fd	d
�Zdd�Z	�Z
S)
�_ymdcs$t|j|�j||�d|_||_dS)NF)�superrrr��tzstr)rr��args�kwargs)rrrrwsz
_ymd.__init__cCs&yt|�|kStk
r dSXdS)NF)�int�
ValueError)r2r�rrr�token_could_be_year|sz_ymd.token_could_be_yearcs�fdd�|D�S)Ncsg|]}tj|��r|�qSr)r�r�)rNr2)r�rr�
<listcomp>�sz3_ymd.find_potential_year_tokens.<locals>.<listcomp>r)r��tokensr)r�r�find_potential_year_tokens�sz_ymd.find_potential_year_tokenscCsFx@t|�D]4\}}tj||�}t|�dkr
t|d�dkr
|Sq
WdS)zk
        attempt to deduce if a pre 100 year was lost
         due to padded zeros being taken off
        rrr%N)r�r�r�r-)rr��indexr2Zpotential_year_tokensrrr�find_probable_year_index�sz_ymd.find_probable_year_indexcsNt|d�r&|j�r4t|�dkr4d|_n|dkr4d|_t|j|�jt|��dS)NrQr%Tr�)�hasattrr@r-r�r�rr,r�)r�val)rrrr,�s
z_ymd.appendcCs,t|�}d\}}}|dkr&td���n�|dks>|d	kr�|dkr�|d
krT||}||=|dksd|dkr�|ddkrz|d}n|d}�n�|dkr�|ddkr�|\}}n8|ddkr�|\}}n"|r�|ddkr�|\}}n|\}}�nB|dk�r"|dk�r|\}}}�n |dk�rF|ddk�s.|�r:|ddk�r:|\}}}n
|\}}}n�|dk�rv|ddk�rj|\}}}n
|\}}}n�|ddk�s�|jtj|j��dk�s�|�r�|ddk�r�|ddk�r�|�r�|ddk�r�|\}}}n
|\}}}n8|ddk�s|�r|ddk�r|\}}}n
|\}}}|||fS)N�zMore than three YMD valuesrr%r��)NNNr'r'r')r-r�r�r
r0r�)r�mstridxr�r�Zlen_ymdr�r��dayrrr�resolve_ymd�sR







"
""
z_ymd.resolve_ymd)rrArBr�staticmethodr�r�r�r,r��
__classcell__rr)rrr�vs
	r�c@sFeZdZd
dd�Zddd�ZGdd�de�Zdd	d
�Zedd��Z	dS)�parserNcCs|pt�|_dS)N)r�info)rr�rrrr�szparser.__init__FcKsV|dkr tjj�jddddd�}|j|f|�\}}|dkrBtd��t|�dkrVtd��i}x&dD]}	t||	�}
|
dk	r`|
||	<q`Wd|kr�|jdkr�|jn|j}|jdkr�|jn|j}|j	dkr�|j	n|j	}
|
t
||�d
kr�t
||�d
|d<|jf|�}|jdk	�r$|j	�r$|tj|jd�}|�s8t
|tj��sJ|�r�|j|k�r�t
|tj��rh||j|j�}n|j|j�}t
|tj��r�|}n<t
|t��r�tj|�}n$t
|t��r�tj|j|�}ntd��|j|d�}nf|j�r�|jtjk�r�|jtj�d�}n>|jdk�r|jtj�d�}n |j�r8|jtj|j|j�d�}|jdd��rN||fS|SdS)aV

        Parse the date/time string into a :class:`datetime.datetime` object.

        :param timestr:
            Any date/time string using the supported formats.

        :param default:
            The default datetime object, if this is a datetime object and not
            ``None``, elements specified in ``timestr`` replace elements in the
            default object.

        :param ignoretz:
            If set ``True``, time zones in parsed strings are ignored and a
            naive :class:`datetime.datetime` object is returned.

        :param tzinfos:
            Additional time zone names / aliases which may be present in the
            string. This argument maps time zone names (and optionally offsets
            from those time zones) to time zones. This parameter can be a
            dictionary with timezone aliases mapping time zone names to time
            zones or a function taking two parameters (``tzname`` and
            ``tzoffset``) and returning a time zone.

            The timezones to which the names are mapped can be an integer
            offset from UTC in minutes or a :class:`tzinfo` object.

            .. doctest::
               :options: +NORMALIZE_WHITESPACE

                >>> from dateutil.parser import parse
                >>> from dateutil.tz import gettz
                >>> tzinfos = {"BRST": -10800, "CST": gettz("America/Chicago")}
                >>> parse("2012-01-19 17:21:00 BRST", tzinfos=tzinfos)
                datetime.datetime(2012, 1, 19, 17, 21, tzinfo=tzoffset(u'BRST', -10800))
                >>> parse("2012-01-19 17:21:00 CST", tzinfos=tzinfos)
                datetime.datetime(2012, 1, 19, 17, 21,
                                  tzinfo=tzfile('/usr/share/zoneinfo/America/Chicago'))

            This parameter is ignored if ``ignoretz`` is set.

        :param **kwargs:
            Keyword arguments as passed to ``_parse()``.

        :return:
            Returns a :class:`datetime.datetime` object or, if the
            ``fuzzy_with_tokens`` option is ``True``, returns a tuple, the
            first element being a :class:`datetime.datetime` object, the second
            a tuple containing the fuzzy tokens.

        :raises ValueError:
            Raised for invalid or unknown string format, if the provided
            :class:`tzinfo` is not in a valid format, or if an invalid date
            would be created.

        :raises TypeError:
            Raised for non-string or character stream input.

        :raises OverflowError:
            Raised if the parsed date exceeds the largest valid C integer on
            your system.
        Nr)r�r�r��microsecondzUnknown string formatzString does not contain a date.r�r�r�r�r�r�r�r)r�z9Offset must be tzinfo subclass, tz string, or int offset.)�tzinfo�fuzzy_with_tokensF)r�r�r�r�r�r�r�)�datetimeZnowr1�_parser�r-rr�r�r�rr�r	r�collections�Callabler�r�r�r�rr
r�rr�ZtzlocalZtzutc)r�timestr�defaultZignoretzZtzinfosr�r��skipped_tokens�replrIrLZcyearZcmonthZcday�retZtzdatar�rrrr�s\?

zparser.parsec@s&eZdZddddddddd	d
dgZdS)
zparser._resultr�r�r�r�r�r�r�r�r�r�r�N)rrArBrGrrrr�_resultisr�cCsJ
|rd}|j}|dkr|j}|dkr*|j}|j�}tj|�}d }	t�}
�y�t|�}d!}t|�}
d}�xJ||
k�r�y||}t	|�}Wnt
k
r�d}YnX|dk	�r�t||�}|d7}t|�dk�rH|d"k�rH|jdk�rH||
k�s||dk�rH|j||�dk�rH||d}t
|dd��|_|dk�r�t
|dd��|_qf|d	k�st|d	k�r||djd
�d	k�r||d}|�r�||djd
�d#k�r�|j|dd��|j|dd��|j|dd��n<t
|dd��|_t
|dd��|_t|dd��\|_|_qf|d$k�r�||d}|j|dd��|j|dd	��|j|d	d��|dk�r�t
|dd��|_t
|dd��|_|dk�r�t
|dd��|_qf||
k�r�|j||�dk	�s|d|
k�r0||dk�r0|j||d�dk	�r0||dk�r|d7}|j||�}�x�|dk�rZt
|�|_|d�r�t
d|d�|_nL|dk�r�t
|�|_|d�r�t
d|d�|_n|dk�r�t|�\|_|_|d7}||
k�s�|dk�r�Py||}t	|�}Wnt
k
�r�PYn8X|d7}|d7}||
k�r(|j||�}|dk	�r(|}�q(Wqf||
k�r�||ddk�r�|j||d�dk	�r�|j||d�}|dk�r�t
|�|_|d}|�r�t
d|�|_n|dk�r�t|�\|_|_qf|d|
k�rf||dk�rft
|�|_|d7}t	||�}t
|�|_|d�r$t
d|d�|_|d7}||
k�r�||dk�r�t||d�\|_|_|d7}qf||
k�r�||d%k�r�||}|j|�|d7}||
k�r�|j||��r�y|j||�WnJt
k
�r|j||�}|dk	�r|j|�t|�d}nd&SYnX|d7}||
k�r�|||k�r�|d7}|j||�}|dk	�rj|j|�t|�d}n|j||�|d7}qf||
k�s�|j||��rH|d|
k�r4|j||d�dk	�r4t
|�|_|jdk�r|j||d�dk�r|jd7_n*|jdk�r*|j||d�dk�r*d|_|d7}n
|j|�|d7}qf|j||�dk	�r�t
|�|_|jdk�r�|j||�dk�r�|jd7_n&|jdk�r�|j||�dk�r�d|_|d7}qf|�s�d'S|d7}qf|j||�}|dk	�r||_|d7}qf|j||�}|dk	�	r:|j|�t|�d}|d7}||
krf||d(k�r�||}|d7}|j||�|d7}||
k�	r8|||k�	r8|d7}|j||�|d7}qf|d|
krf||||dk�o�dknrf|j||d�rfyt
||d�}Wnt
k
�	rYnX|jt|j|���|d7}qf|j||�}|dk	�
r0d}|�	rl|jdk	�	rld}|jdk�	r�|�	r�d}nt
d)��n2d|jk�	o�dkn�	s�|�	r�d}nt
d*��|�
r|dk�	r�|jdk�	r�|jd7_n|dk�
r|jdk�
rd|_||_n|�
r&|j|
|	||�}	|d7}qf|jdk	�
r�t||�dk�
r�|jdk�
r�|jdk�
r�dd�||D��
r�|||_|j|j�|_|d7}||
krf||d+krfd,||dk||<d|_|j|j�rfd|_qf|jdk	�rz||d-k�rzd/||dk}|d7}t||�}|dk�rZt
||dd��dt
||dd��d|_nz|d|
k�r�||ddk�r�t
||�dt
||d�d|_|d7}n*|dk�r�t
||dd��d|_nd0S|d7}|j|9_|d|
krf|j||�rf||ddkrf||ddkrfdt||d�k�oDdknrfdd�||dD�rf||d|_|d7}qf|j||��p�|�s�d1S|j|
|	||�}	|d7}qfW|j|||�\}}}|dk	�r�||_ |j!|_!|dk	�r�||_|dk	�r�||_"Wnt#t
t$fk
�
rd2SX|j%|��
s,d3S|�
r>|t&|
�fS|dfSdS)4a
        Private method which performs the heavy lifting of parsing, called from
        ``parse()``, which passes on its ``kwargs`` to this function.

        :param timestr:
            The string to parse.

        :param dayfirst:
            Whether to interpret the first value in an ambiguous 3-integer date
            (e.g. 01/05/09) as the day (``True``) or month (``False``). If
            ``yearfirst`` is set to ``True``, this distinguishes between YDM
            and YMD. If set to ``None``, this value is retrieved from the
            current :class:`parserinfo` object (which itself defaults to
            ``False``).

        :param yearfirst:
            Whether to interpret the first value in an ambiguous 3-integer date
            (e.g. 01/05/09) as the year. If ``True``, the first number is taken
            to be the year, otherwise the last number is taken to be the year.
            If this is set to ``None``, the value is retrieved from the current
            :class:`parserinfo` object (which itself defaults to ``False``).

        :param fuzzy:
            Whether to allow fuzzy parsing, allowing for string like "Today is
            January 1, 2047 at 8:21:00AM".

        :param fuzzy_with_tokens:
            If ``True``, ``fuzzy`` is automatically set to True, and the parser
            will return a tuple where the first element is the parsed
            :class:`datetime.datetime` datetimestamp and the second element is
            a tuple containing the portions of the string which were ignored:

            .. doctest::

                >>> from dateutil.parser import parse
                >>> parse("Today is January 1, 2047 at 8:21:00AM", fuzzy_with_tokens=True)
                (datetime.datetime(2047, 1, 1, 8, 21), (u'Today is ', u' ', u'at '))

        TNr%rrr���:�r"�r���
r!�<rTrUFzNo hour specified with zAM or PM flag.zInvalid hour specified for z12-hour clock.�cSsg|]}|tjkr|�qSr)�string�ascii_uppercase)rN�xrrrr��sz!parser._parse.<locals>.<listcomp>�+i�(�)cSsg|]}|tjkr|�qSr)r�r�)rNr�rrrr�s���r')r%r�r')r�r�r�)rTrUr")NN)NN)rTrUz%No hour specified with AM or PM flag.z)Invalid hour specified for 12-hour clock.)r�rT)r�rT)r�rTr')r'r)NN)NN)NN)NN)'r�r�r�r�r
r0r<r�r-�floatr�r�r�r�r��findr,�_parsemsr�r�r�r�r�r�r��strr��_skip_tokenr�r�r�r�r�r�r��
IndexError�AssertionErrorr�r�)rr�r�r�Zfuzzyr�r�r�r5�last_skipped_token_ir�Zymdr��len_lr�Z
value_reprrL�len_lir>�idxZnewidxZ
sec_remainder�sepZval_is_ampm�signalr�r�r�rrrr�ns�)


$
, 

























&
$$



  




2



4 &

.&




z
parser._parsecCs8||dkr"|d||7<n|j||�|}|S)Nrr')r,)r�r�r�r5rrrr�-s
zparser._skip_token)N)NFN)NNFF)
rrArBrrrFr�r�r�r�rrrrr��s


Ar�cKs(|rt|�j|f|�Stj|f|�SdS)a)

    Parse a string in one of the supported formats, using the
    ``parserinfo`` parameters.

    :param timestr:
        A string containing a date/time stamp.

    :param parserinfo:
        A :class:`parserinfo` object containing parameters for the parser.
        If ``None``, the default arguments to the :class:`parserinfo`
        constructor are used.

    The ``**kwargs`` parameter takes the following keyword arguments:

    :param default:
        The default datetime object, if this is a datetime object and not
        ``None``, elements specified in ``timestr`` replace elements in the
        default object.

    :param ignoretz:
        If set ``True``, time zones in parsed strings are ignored and a naive
        :class:`datetime` object is returned.

    :param tzinfos:
            Additional time zone names / aliases which may be present in the
            string. This argument maps time zone names (and optionally offsets
            from those time zones) to time zones. This parameter can be a
            dictionary with timezone aliases mapping time zone names to time
            zones or a function taking two parameters (``tzname`` and
            ``tzoffset``) and returning a time zone.

            The timezones to which the names are mapped can be an integer
            offset from UTC in minutes or a :class:`tzinfo` object.

            .. doctest::
               :options: +NORMALIZE_WHITESPACE

                >>> from dateutil.parser import parse
                >>> from dateutil.tz import gettz
                >>> tzinfos = {"BRST": -10800, "CST": gettz("America/Chicago")}
                >>> parse("2012-01-19 17:21:00 BRST", tzinfos=tzinfos)
                datetime.datetime(2012, 1, 19, 17, 21, tzinfo=tzoffset(u'BRST', -10800))
                >>> parse("2012-01-19 17:21:00 CST", tzinfos=tzinfos)
                datetime.datetime(2012, 1, 19, 17, 21,
                                  tzinfo=tzfile('/usr/share/zoneinfo/America/Chicago'))

            This parameter is ignored if ``ignoretz`` is set.

    :param dayfirst:
        Whether to interpret the first value in an ambiguous 3-integer date
        (e.g. 01/05/09) as the day (``True``) or month (``False``). If
        ``yearfirst`` is set to ``True``, this distinguishes between YDM and
        YMD. If set to ``None``, this value is retrieved from the current
        :class:`parserinfo` object (which itself defaults to ``False``).

    :param yearfirst:
        Whether to interpret the first value in an ambiguous 3-integer date
        (e.g. 01/05/09) as the year. If ``True``, the first number is taken to
        be the year, otherwise the last number is taken to be the year. If
        this is set to ``None``, the value is retrieved from the current
        :class:`parserinfo` object (which itself defaults to ``False``).

    :param fuzzy:
        Whether to allow fuzzy parsing, allowing for string like "Today is
        January 1, 2047 at 8:21:00AM".

    :param fuzzy_with_tokens:
        If ``True``, ``fuzzy`` is automatically set to True, and the parser
        will return a tuple where the first element is the parsed
        :class:`datetime.datetime` datetimestamp and the second element is
        a tuple containing the portions of the string which were ignored:

        .. doctest::

            >>> from dateutil.parser import parse
            >>> parse("Today is January 1, 2047 at 8:21:00AM", fuzzy_with_tokens=True)
            (datetime.datetime(2047, 1, 1, 8, 21), (u'Today is ', u' ', u'at '))

    :return:
        Returns a :class:`datetime.datetime` object or, if the
        ``fuzzy_with_tokens`` option is ``True``, returns a tuple, the
        first element being a :class:`datetime.datetime` object, the second
        a tuple containing the fuzzy tokens.

    :raises ValueError:
        Raised for invalid or unknown string format, if the provided
        :class:`tzinfo` is not in a valid format, or if an invalid date
        would be created.

    :raises OverflowError:
        Raised if the parsed date exceeds the largest valid C integer on
        your system.
    N)r�r�
DEFAULTPARSER)r�rr�rrrr<s_c@s$eZdZGdd�de�Zdd�ZdS)�	_tzparserc@s<eZdZddddddgZGdd�de�Zd	d
�Zdd�Zd
S)z_tzparser._result�stdabbr�	stdoffset�dstabbr�	dstoffset�start�endc@seZdZdddddddgZdS)	z_tzparser._result._attrr��weekr��yday�jydayr�r�N)rrArBrGrrrr�_attr�srcCs
|jd�S)N�)rM)rrrrrR�sz_tzparser._result.__repr__cCs"tj|�|j�|_|j�|_dS)N)rFrrrr)rrrrr�s

z_tzparser._result.__init__N)rrArBrGrFrrRrrrrrr��s
r�cCs�|j�}tj|�}�y�t|�}d}�x�||k�r�|}x(||kr\dd�||D�r\|d7}q6W||k�r�|js�d}dj|||��|_nd}dj|||��|_|}||ko�||dks�||dd
k�r�||dk�r�d||dk}|d7}nd}t||�}	|	dk�rJt||t||dd��d
t||dd��d|�n�|d|k�r�||ddk�r�t||t||�d
t||d�d|�|d7}n4|	dk�r�t||t||dd��d
|�ndS|d7}|j�r�Pq&Pq&W||k�r0x*t	||�D]}||dk�rd||<�qW|d7}||k�r>�n�d|j
d�k�oXdkn�r�dd�||d�D��r�x�|j|jfD]�}
t||�|
_
|d7}||d	k�r�t||d�d }|d7}nt||�}|d7}|�r||
_t||�dd|
_nt||�|
_|d7}t||�|
_|d7}�q�W||k�r�||d!k�rjd#||dk}|d7}nd}|jt||�||_�nL|j
d�dk�r�||d�j
d�dk�r�dd�||d�D��r֐x|j|jfD�]�}
||dk�r|d7}t||�|
_n�||dk�r�|d7}t||�|
_
|d7}|d7}t||�|
_|
jdk�r\d$|
_|d7}|d7}t||�dd|
_nt||�d|
_|d7}||k�r�||dk�r�|d7}t||�}	|	dk�rt||dd��d
t||dd��d|
_n�|d|k�r�||ddk�r�t||�d
t||d�d|
_|d7}|d|k�r�||ddk�r�|d7}|
jt||�7_n*|	dk�r�t||dd��d
|
_ndS|d7}|d7}�q�WWntttfk
�r�dSX|S)%NrcSsg|]}|dkr|�qS)z0123456789:,-+r)rNr�rrrr��sz#_tzparser.parse.<locals>.<listcomp>rrr
rr�rT�
0123456789r�r%ir�r�rSr$r��	cSs*g|]"}|dkr|D]}|dkr|�qqS)r$rr)rNr��yrrrr��s
�rUc
Ss*g|]"}|dkr|D]}|dkr|�qqS)	r$rU�J�Mr"rTr�r)r$rUrrr"rTr�r)rNr�rrrrr�srrr�)r�rT)r�rTr')rr'r'r')rTr�r')r'rr')r�r
r0r-rrKrrHr��ranger.rrr�r	r�r�r�rrrr
r�r�r�)rr�r�r5r�r��jZoffattrrr�r�rLrrrr�s�



" *

 

"


(
4 & 
z_tzparser.parseN)rrArBrFr�rrrrrr�srcCs
tj|�S)N)�DEFAULTTZPARSERr)r�rrr�_parsetzQsrcCsFd|krt|�dfS|jd�\}}t|�t|jdd�dd��fSdS)z9Parse a I[.F] seconds value into (seconds, microseconds).r"rr�r N)r�r0�ljust)rLr��frrrr�Usr�)N)!r�Z
__future__rr�r�r�r�rC�iorZcalendarrZsixrrrr
r	r
�__all__�objectr
rFrr<r�r�rrrrrr�rrrr�<module>s<#oX
e.site-packages/dateutil/__pycache__/relativedelta.cpython-36.pyc000064400000033731147511334560020602 0ustar003

6�cYiZ�@s�ddlZddlZddlZddlmZddlmZddlmZddl	m
Z
edd�ed	�D��\Z
ZZZZZZZd
ddd
ddddgZGdd
�d
e�Zdd�ZdS)�N)�copysign)�
integer_types)�warn�)�weekdayccs|]}t|�VqdS)N)r)�.0�x�r	�#/usr/lib/python3.6/relativedelta.py�	<genexpr>
sr��
relativedelta�MO�TU�WE�TH�FR�SA�SUc@s�eZdZdZd%dd�Zdd�Zedd	��Zejd
d	��Zdd�Z	d
d�Z
dd�Zdd�Zdd�Z
dd�Zdd�Zdd�ZeZdd�ZeZdd�ZdZdd �Zd!d"�ZeZd#d$�ZdS)&r
a�
    The relativedelta type is based on the specification of the excellent
    work done by M.-A. Lemburg in his
    `mx.DateTime <http://www.egenix.com/files/python/mxDateTime.html>`_ extension.
    However, notice that this type does *NOT* implement the same algorithm as
    his work. Do *NOT* expect it to behave like mx.DateTime's counterpart.

    There are two different ways to build a relativedelta instance. The
    first one is passing it two date/datetime classes::

        relativedelta(datetime1, datetime2)

    The second one is passing it any number of the following keyword arguments::

        relativedelta(arg1=x,arg2=y,arg3=z...)

        year, month, day, hour, minute, second, microsecond:
            Absolute information (argument is singular); adding or subtracting a
            relativedelta with absolute information does not perform an aritmetic
            operation, but rather REPLACES the corresponding value in the
            original datetime with the value(s) in relativedelta.

        years, months, weeks, days, hours, minutes, seconds, microseconds:
            Relative information, may be negative (argument is plural); adding
            or subtracting a relativedelta with relative information performs
            the corresponding aritmetic operation on the original datetime value
            with the information in the relativedelta.

        weekday:
            One of the weekday instances (MO, TU, etc). These instances may
            receive a parameter N, specifying the Nth weekday, which could
            be positive or negative (like MO(+1) or MO(-2). Not specifying
            it is the same as specifying +1. You can also use an integer,
            where 0=MO.

        leapdays:
            Will add given days to the date found, if year is a leap
            year, and the date found is post 28 of february.

        yearday, nlyearday:
            Set the yearday or the non-leap year day (jump leap days).
            These are converted to day/month/leapdays information.

    Here is the behavior of operations with relativedelta:

    1. Calculate the absolute year, using the 'year' argument, or the
       original datetime year, if the argument is not present.

    2. Add the relative 'years' argument to the absolute year.

    3. Do steps 1 and 2 for month/months.

    4. Calculate the absolute day, using the 'day' argument, or the
       original datetime day, if the argument is not present. Then,
       subtract from the day until it fits in the year and month
       found after their operations.

    5. Add the relative 'days' argument to the absolute day. Notice
       that the 'weeks' argument is multiplied by 7 and added to
       'days'.

    6. Do steps 1 and 2 for hour/hours, minute/minutes, second/seconds,
       microsecond/microseconds.

    7. If the 'weekday' argument is present, calculate the weekday,
       with the given (wday, nth) tuple. wday is the index of the
       weekday (0-6, 0=Mon), and nth is the number of weeks to add
       forward or backward, depending on its signal. Notice that if
       the calculated date is already Monday, for example, using
       (0, 1) or (0, -1) won't change the day.
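    A compact sketch of the absolute/relative distinction and the weekday rule
    described above (the base date is an arbitrary example):

        >>> from datetime import datetime
        >>> from dateutil.relativedelta import relativedelta, FR
        >>> base = datetime(2003, 9, 17, 20, 54, 47)
        >>> later = base + relativedelta(months=+1)        # relative: one month on
        >>> january = base + relativedelta(month=1)        # absolute: jump to January
        >>> friday = base + relativedelta(weekday=FR(+1))  # next (or same) Friday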
    NrcCstdd�||fD��rtd��|o$|�r�t|tj�o>t|tj�sHtd��t|tj�t|tj�kr�t|tj�s~tjj|j��}nt|tj�s�tjj|j��}d|_d|_	d|_
d|_d|_d|_
d|_d|_d|_d|_d|_d|_d|_d|_d|_d|_d|_|j|jd|j|j}|j|�|j|�}||k�rFtj}d}n
tj}d}x.|||��r~||7}|j|�|j|�}�qRW||}|j|j
d|_|j|_�nV||_||_	||d	|_
||_||_|	|_
|
|_||_||_|
|_||_||_||_||_||_td
d�||
|||||fD���r4tdt�t|t ��rLt!||_n||_d}|�rb|}n|�r||}|dk�r|d|_|�r�ddddddddddddg}x\t"|�D]D\}}||k�r�|d|_|dk�r�||_n|||d|_P�q�Wtd|��|j#�dS)Ncss"|]}|dk	o|t|�kVqdS)N)�int)rrr	r	r
rcsz)relativedelta.__init__.<locals>.<genexpr>zGNon-integer years and months are ambiguous and not currently supported.z&relativedelta only diffs datetime/dater�ri�Qrcss"|]}|dk	ot|�|kVqdS)N)r)rrr	r	r
r�sz2Non-integer value passed as absolute information. z4This is not a well-defined condition and will raise zerrors in future versions.�;��Z�x������ii0iNinzinvalid year day (%d)���zfNon-integer value passed as absolute information. This is not a well-defined condition and will raise z�Non-integer value passed as absolute information. This is not a well-defined condition and will raise errors in future versions.r)$�any�
ValueError�
isinstance�datetime�date�	TypeError�fromordinal�	toordinal�years�months�days�leapdays�hours�minutes�seconds�microseconds�year�month�dayr�hour�minute�second�microsecond�	_has_time�_set_months�__radd__�operator�gt�ltr�DeprecationWarningr�weekdays�	enumerate�_fix)�selfZdt1Zdt2r(r)r*r+�weeksr,r-r.r/r0r1r2rZyeardayZ	nlyeardayr3r4r5r6ZdtmZcompareZ	incrementZdeltaZydayZydayidx�idxZydaysr	r	r
�__init__[s�









zrelativedelta.__init__cCs�t|j�dkrHt|j�}t|j|d�\}}|||_|j||7_t|j�dkr�t|j�}t|j|d�\}}|||_|j||7_t|j�dkr�t|j�}t|j|d�\}}|||_|j||7_t|j�dk�r"t|j�}t|j|d�\}}|||_|j||7_t|j�dk�rlt|j�}t|j|d�\}}|||_|j	||7_	|j�s�|j�s�|j�s�|j�s�|j
dk	�s�|jdk	�s�|jdk	�s�|j
dk	�r�d	|_nd
|_dS)Ni?Bi@Br�<���rrr)�absr/�_sign�divmodr.r-r,r*r)r(r3r4r5r6r7)rA�s�div�modr	r	r
r@�s<









 zrelativedelta._fixcCs
|jdS)Nr)r*)rAr	r	r
rB�szrelativedelta.weekscCs|j|jd|d|_dS)Nr)r*rB)rA�valuer	r	r
rB�scCsR||_t|j�dkrHt|j�}t|j|d�\}}|||_|||_nd|_dS)NrHrr)r)rIrJrKr()rAr)rLrMrNr	r	r
r8s

zrelativedelta._set_monthsc	Cs�t|j�}t|jd|j|d�}t|�}t|jd||d�}t|�}t|jd||d�}t|�}t|jd||�}|j|j|j	||||||j
|j|j|j
|j|j|j|j|jd�S)aA
        Return a version of this object represented entirely using integer
        values for the relative attributes.

        >>> relativedelta(days=1.5, hours=2).normalized()
        relativedelta(days=1, hours=14)

        :return:
            Returns a :class:`dateutil.relativedelta.relativedelta` object.
        rGrHrE�
�g��.A)r(r)r*r,r-r.r/r+r0r1r2rr3r4r5r6)rr*�roundr,r-r.r/�	__class__r(r)r+r0r1r2rr3r4r5r6)	rAr*Zhours_fr,Z	minutes_fr-Z	seconds_fr.r/r	r	r
�
normalizeds 
zrelativedelta.normalizedc
Cslt|t��r|j|j|j|j|j|j|j|j|j|j|j|j|j|j	|j	|j
p`|j
|jdk	rp|jn|j|jdk	r�|jn|j|j
dk	r�|j
n|j
|jdk	r�|jn|j|jdk	r�|jn|j|jdk	r�|jn|j|jdk	r�|jn|j|jdk	r�|jn|jd�St|tj��rp|j|j|j|j|j|j|j|j|j|j	|j	|j
|j|j|j
|j|j|j|j|jd�St|tj��s�tS|j�r�t|tj��r�tjj|j��}|j�p�|j|j}|j�p�|j}|j�r:dt|j�k�o�dkn�s�t�||j7}|dk�r |d7}|d8}n|dk�r:|d8}|d7}ttj||�d|j
�pV|j
�}|||d�}x*dD]"}t||�}|dk	�rl|||<�qlW|j}|j
�r�|d	k�r�tj |��r�||j
7}|j!f|�tj||j|j|j|j	d
�}	|j�rh|jj|jj"�pd}
}t|�dd}|dk�r<|d|	j�|
d7}n||	j�|
d7}|d9}|	tj|d
�7}	|	S)N)r(r)r*r,r-r.r/r+r0r1r2rr3r4r5r6rr)r0r1r2r3r4r5r6�)r*r,r-r.r/rr)r*)r3r4r5r6r)#r"r
rSr(r)r*r,r-r.r/r+r0r1r2rr3r4r5r6r#Z	timedeltar$�NotImplementedr7r&r'rI�AssertionError�min�calendarZ
monthrange�getattrZisleap�replace�n)
rA�otherr0r1r2�repl�attrrOr*�retrZnthZjumpdaysr	r	r
�__add__/s�










&









zrelativedelta.__add__cCs
|j|�S)N)ra)rAr]r	r	r
r9�szrelativedelta.__radd__cCs|j�j|�S)N)�__neg__r9)rAr]r	r	r
�__rsub__�szrelativedelta.__rsub__cCs
t|t�stS|j|j|j|j|j|j|j|j|j|j|j|j	|j	|j
|j
|jpb|j|jdk	rr|jn|j|j
dk	r�|j
n|j
|jdk	r�|jn|j|jdk	r�|jn|j|jdk	r�|jn|j|jdk	r�|jn|j|jdk	r�|jn|j|jdk	�r|jn|jd�S)N)r(r)r*r,r-r.r/r+r0r1r2rr3r4r5r6)r"r
rVrSr(r)r*r,r-r.r/r+r0r1r2rr3r4r5r6)rAr]r	r	r
�__sub__�s6







zrelativedelta.__sub__cCsX|j|j|j|j|j|j|j|j|j|j	|j
|j|j|j
|j|j|jd�S)N)r(r)r*r,r-r.r/r+r0r1r2rr3r4r5r6)rSr(r)r*r,r-r.r/r+r0r1r2rr3r4r5r6)rAr	r	r
rb�s 
zrelativedelta.__neg__cCs�|jo�|jo�|jo�|jo�|jo�|jo�|jo�|jo�|jdko�|j	dko�|j
dko�|jdko�|jdko�|j
dko�|jdko�|jdkS)N)r(r)r*r,r-r.r/r+r0r1r2rr3r4r5r6)rAr	r	r
�__bool__�s 






zrelativedelta.__bool__cCs�yt|�}Wntk
r tSX|jt|j|�t|j|�t|j|�t|j|�t|j	|�t|j
|�t|j|�|j|j
|j|j|j|j|j|j|jd�S)N)r(r)r*r,r-r.r/r+r0r1r2rr3r4r5r6)�floatr%rVrSrr(r)r*r,r-r.r/r+r0r1r2rr3r4r5r6)rAr]�fr	r	r
�__mul__�s(zrelativedelta.__mul__cCsNt|t�stS|js|jr~|js*|jr.dS|jj|jjkrBdS|jj|jj}}||kr~|sj|dkov|pv|dkr~dS|j|jk�oL|j|jk�oL|j|jk�oL|j|jk�oL|j	|j	k�oL|j
|j
k�oL|j|jk�oL|j|jk�oL|j
|j
k�oL|j|jk�oL|j|jk�oL|j|jk�oL|j|jk�oL|j|jk�oL|j|jkS)NFr)r"r
rVrr\r(r)r*r,r-r.r/r+r0r1r2r3r4r5r6)rAr]Zn1Zn2r	r	r
�__eq__�s2
&zrelativedelta.__eq__cCs|j|�S)N)ri)rAr]r	r	r
�__ne__szrelativedelta.__ne__cCs0ydt|�}Wntk
r$tSX|j|�S)Nr)rfr%rVrh)rAr]Z
reciprocalr	r	r
�__div__s
zrelativedelta.__div__cCs�g}x.dD]&}t||�}|r
|jd	j||d
��q
Wx6dD].}t||�}|dk	r:|jdj|t|�d
��q:Wdj|jjdj|�d�S)Nr(r)r*r+r,r-r.r/z{attr}={value:+g})r_rOr0r1r2rr3r4r5r6z{attr}={value}z{classname}({attrs})z, )Z	classnameZattrs)r(r)r*r+r,r-r.r/)r0r1r2rr3r4r5r6)rZ�append�format�reprrS�__name__�join)rA�lr_rOr	r	r
�__repr__s


zrelativedelta.__repr__)NNrrrrrrrrrNNNNNNNNNN)ro�
__module__�__qualname__�__doc__rDr@�propertyrB�setterr8rTrar9rcrdrbreZ__nonzero__rh�__rmul__ri�__hash__rjrk�__truediv__rrr	r	r	r
r
s6G
y!
#WcCsttd|��S)Nr)rr)rr	r	r
rJ"srJ)r#rYr:ZmathrZsixr�warningsrZ_commonr�tuple�rangerrrrrrrr>�__all__�objectr
rJr	r	r	r
�<module>s(site-packages/dateutil/__pycache__/_common.cpython-36.pyc000064400000002145147511334560017377 0ustar003

6�cY�@sdZGdd�de�ZdS)z'
Common code used in multiple modules.
c@s:eZdZddgZddd�Zdd�Zdd�ZdZd	d
�ZdS)�weekday�nNcCs||_||_dS)N)rr)�selfrr�r�/usr/lib/python3.6/_common.py�__init__	szweekday.__init__cCs ||jkr|S|j|j|�SdS)N)r�	__class__r)rrrrr�__call__
s
zweekday.__call__cCs:y |j|jks|j|jkrdSWntk
r4dSXdS)NFT)rr�AttributeError)r�otherrrr�__eq__szweekday.__eq__cCs&d	|j}|js|Sd||jfSdS)
N�MO�TU�WE�TH�FR�SA�SUz%s(%+d))rr
rrrrr)rr)r�srrr�__repr__s
zweekday.__repr__)N)	�__name__�
__module__�__qualname__�	__slots__rrr�__hash__rrrrrrs
rN)�__doc__�objectrrrrr�<module>ssite-packages/dateutil/__pycache__/_common.cpython-36.opt-1.pyc000064400000002145147511334560020336 0ustar003

6�cY�@sdZGdd�de�ZdS)z'
Common code used in multiple modules.
c@s:eZdZddgZddd�Zdd�Zdd�ZdZd	d
�ZdS)�weekday�nNcCs||_||_dS)N)rr)�selfrr�r�/usr/lib/python3.6/_common.py�__init__	szweekday.__init__cCs ||jkr|S|j|j|�SdS)N)r�	__class__r)rrrrr�__call__
s
zweekday.__call__cCs:y |j|jks|j|jkrdSWntk
r4dSXdS)NFT)rr�AttributeError)r�otherrrr�__eq__szweekday.__eq__cCs&d	|j}|js|Sd||jfSdS)
N�MO�TU�WE�TH�FR�SA�SUz%s(%+d))rr
rrrrr)rr)r�srrr�__repr__s
zweekday.__repr__)N)	�__name__�
__module__�__qualname__�	__slots__rrr�__hash__rrrrrrs
rN)�__doc__�objectrrrrr�<module>ssite-packages/dateutil/__pycache__/_version.cpython-36.pyc000064400000000506147511334560017573 0ustar003

6�cY��@s.dZdZdZdZeeefZdjeee��ZdS)z2
Contains information about the dateutil version.
����.N)	�__doc__Z
VERSION_MAJORZ
VERSION_MINORZ
VERSION_PATCHZ
VERSION_TUPLE�join�map�str�VERSION�r
r
�/usr/lib/python3.6/_version.py�<module>s

site-packages/dateutil/__pycache__/tzwin.cpython-36.opt-1.pyc000064400000000205147511334560020055 0ustar003

6�cY;�@sddlTdS)�)�*N)Ztz.win�rr�/usr/lib/python3.6/tzwin.py�<module>ssite-packages/dateutil/__pycache__/__init__.cpython-36.opt-1.pyc000064400000000246147511334560020446 0ustar003

6�cYE�@sddlmZdS)�)�VERSIONN)Z_versionr�__version__�rr�/usr/lib/python3.6/__init__.py�<module>ssite-packages/dateutil/__pycache__/easter.cpython-36.opt-1.pyc000064400000004036147511334560020173 0ustar003

6�cYE
�@s4dZddlZddddgZdZdZd	Zefd
d�ZdS)zx
This module offers a generic easter computing method for any given year, using
Western, Orthodox or Julian algorithms.
�N�easter�
EASTER_JULIAN�EASTER_ORTHODOX�EASTER_WESTERN���cCsld|kodkns td��|}|d}d}|dkr�d|dd}||d|d	}|d
kr�d}|dkr�||d
d|d
dd}n�|d
}||dd|ddd|dd}||dd|dd|dd|d}||d|d
||dd	}|||}	d|	d|	ddd}
d|	dd}tjt|�t|�t|
��S)a�
    This method was ported from the work done by GM Arts,
    on top of the algorithm by Claus Tondering, which was
    based in part on the algorithm of Ouding (1940), as
    quoted in "Explanatory Supplement to the Astronomical
    Almanac", P.  Kenneth Seidelmann, editor.

    This algorithm implements three different easter
    calculation methods:

    1 - Original calculation in Julian calendar, valid in
        dates after 326 AD
    2 - Original method, with date converted to Gregorian
        calendar, valid in years 1583 to 4099
    3 - Revised method, in Gregorian calendar, valid in
        years 1583 to 4099 as well

    These methods are represented by the constants:

    * ``EASTER_JULIAN   = 1``
    * ``EASTER_ORTHODOX = 2``
    * ``EASTER_WESTERN  = 3``

    The default method is method 3.

    More about the algorithm may be found at:

    http://users.chariot.net.au/~gmarts/eastalg.htm

    and

    http://www.tondering.dk/claus/calendar.html

    rrzinvalid method�r����r�
i@�d���
��������(��)�
ValueError�datetimeZdate�int)Zyear�method�y�g�e�i�j�c�h�p�d�m�r+�/usr/lib/python3.6/easter.pyrs($",0$)�__doc__r�__all__rrrrr+r+r+r,�<module>ssite-packages/dateutil/zoneinfo/__pycache__/rebuild.cpython-36.pyc000064400000003405147511334560021225 0ustar003

6�cY��@sfddlZddlZddlZddlZddlZddlmZddlmZm	Z	m
Z
ddgdfdd�Zdd�ZdS)	�N)�
check_call)�tar_open�METADATA_FN�ZONEFILENAMEZgzc-sHtj��tjj�d�}tjjt�}�zt|��v}x|D]}|j|��q6W�fdd�|D�}	yt	dd|g|	�Wn,t
k
r�}
zt|
��WYdd}
~
XnXWdQRXttjj|t
�d��}tj||dd	d
�WdQRXtjj|t�}t|d|��6}x.tj|�D] }
tjj||
�}|j||
��qWWdQRXWdtj��XdS)z�Rebuild the internal timezone info in dateutil/zoneinfo/zoneinfo*tar*

    filename is the timezone tarball from ftp.iana.org/tz.

    Zzoneinfocsg|]}tjj�|��qS�)�os�path�join)�.0�n)�tmpdirr�/usr/lib/python3.6/rebuild.py�
<listcomp>szrebuild.<locals>.<listcomp>Zzicz-dN�w�T)�indentZ	sort_keyszw:%s)�tempfileZmkdtemprrr	�dirname�__file__r�extractr�OSError�_print_on_nosuchfile�openr�json�dumpr�listdir�add�shutilZrmtree)�filename�tag�formatZ
zonegroupsZmetadataZzonedirZ	moduledirZtf�nameZ	filepaths�e�f�target�entryZ	entrypathr)rr
�rebuilds*

 r&cCs|jdkrtjd�dS)zdPrint helpful troubleshooting message

    e is an exception raised by subprocess.check_call()

    �zzCould not find zic. Perhaps you need to install libc-bin or some other package that provides it, or it's not in your PATH?N)�errno�logging�error)r"rrr
r*s
r)
r)rrrr�
subprocessrZdateutil.zoneinforrrr&rrrrr
�<module>ssite-packages/dateutil/zoneinfo/__pycache__/__init__.cpython-36.opt-1.pyc000064400000013255147511334560022301 0ustar003

6�cY[�@s�ddlZddlZddlmZddlmZddlmZddlm	Z	ddl
mZddd	d
gZdZ
dZejZeed
�sxdd�ZGdd�de�Zdd�ZGdd�de�Ze�Zddd�Zdd�Zdd	�ZdS)�N)�TarFile)�get_data)�BytesIO)�closing)�tzfile�get_zonefile_instance�gettz�gettz_db_metadataZrebuildzdateutil-zoneinfo.tar.gzZMETADATA�__exit__cOsttj||��S)N)rr�open)�args�kwargs�r�/usr/lib/python3.6/__init__.py�tar_opensrc@seZdZdd�ZdS)rcCst|jffS)N)rZ	_filename)�selfrrr�
__reduce__sztzfile.__reduce__N)�__name__�
__module__�__qualname__rrrrrrsrcCsJytttt��Stk
rD}ztjdj|j|j	��dSd}~XnXdS)NzI/O error({0}): {1})
rrr�ZONEFILENAME�IOError�warnings�warn�format�errno�strerror)�errr�getzoneinfofile_streams
rc@s eZdZddd�Zddd�ZdS)�ZoneInfoFileNcs�|dk	r�t|dd����t�fdd��j�D���_t�fdd��j�D��}�jj|�y.�j�jt��}|j�j	d�}t
j|��_Wnt
k
r�d�_YnXWdQRXnt��_d�_dS)N�r)Zfileobj�modec3s:|]2}|j�r|jtkr|jt�j|�|jd�fVqdS))�filenameN)�isfile�name�METADATA_FNr�extractfile)�.0Zzf)�tfrr�	<genexpr>/sz(ZoneInfoFile.__init__.<locals>.<genexpr>c3s0|](}|j�s|j�r|j�j|jfVqdS)N)ZislnkZissymr$�zonesZlinkname)r'Zzl)rrrr)7szUTF-8)r�dictZ
getmembersr*�updater&Z	getmemberr%�read�decode�json�loads�metadata�KeyError)rZzonefile_streamZlinksZ
metadata_jsonZmetadata_strr)rr(r�__init__'szZoneInfoFile.__init__cCs|jj||�S)ak
        Wrapper for :func:`ZoneInfoFile.zones.get`. This is a convenience method
        for retrieving zones from the zone dictionary.

        :param name:
            The name of the zone to retrieve. (Generally IANA zone names)

        :param default:
            The value to return in the event of a missing key.

        .. versionadded:: 2.6.0

        )r*�get)rr$�defaultrrrr4FszZoneInfoFile.get)N)N)rrrr3r4rrrrr&s
rFcCs2|r
d}nttdd�}|dkr.tt��}|t_|S)a%
    This is a convenience function which provides a :class:`ZoneInfoFile`
    instance using the data provided by the ``dateutil`` package. By default, it
    caches a single instance of the ZoneInfoFile object and returns that.

    :param new_instance:
        If ``True``, a new instance of :class:`ZoneInfoFile` is instantiated and
        used as the cached instance for the next call. Otherwise, new instances
        are created only as necessary.

    :return:
        Returns a :class:`ZoneInfoFile` object.

    .. versionadded:: 2.6
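    A small sketch of the access pattern recommended above (the zone name is an
    ordinary IANA example):

        >>> from dateutil.zoneinfo import get_zonefile_instance
        >>> zonefile = get_zonefile_instance()
        >>> tz = zonefile.get("America/Chicago")   # ``None`` if the key is missing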
    N�_cached_instance)�getattrrrrr6)Znew_instanceZzifrrrr`s
cCs8tjdt�tt�dkr(tjtt���tdjj	|�S)a+
    This retrieves a time zone from the local zoneinfo tarball that is packaged
    with dateutil.

    :param name:
        An IANA-style time zone name, as found in the zoneinfo file.

    :return:
        Returns a :class:`dateutil.tz.tzfile` time zone object.

    .. warning::
        It is generally inadvisable to use this function, and it is only
        provided for API compatibility with earlier versions. This is *not*
        equivalent to ``dateutil.tz.gettz()``, which selects an appropriate
        time zone based on the inputs, favoring system zoneinfo. This is ONLY
        for accessing the dateutil-specific zoneinfo (which may be out of
        date compared to the system zoneinfo).

    .. deprecated:: 2.6
        If you need to use a specific zoneinfofile over the system zoneinfo,
        instantiate a :class:`dateutil.zoneinfo.ZoneInfoFile` object and call
        :func:`dateutil.zoneinfo.ZoneInfoFile.get(name)` instead.

        Use :func:`get_zonefile_instance` to retrieve an instance of the
        dateutil-provided zoneinfo.
    z�zoneinfo.gettz() will be removed in future versions, to use the dateutil-provided zoneinfo files, instantiate a ZoneInfoFile object and use ZoneInfoFile.zones.get() instead. See the documentation for details.r)
rr�DeprecationWarning�len�_CLASS_ZONE_INSTANCE�appendrrr*r4)r$rrrr}s
cCs2tjdt�tt�dkr(tjtt���tdjS)a! Get the zonefile metadata

    See `zonefile_metadata`_

    :returns:
        A dictionary with the database metadata

    .. deprecated:: 2.6
        See deprecation warning in :func:`zoneinfo.gettz`. To get metadata,
        query the attribute ``zoneinfo.ZoneInfoFile.metadata``.
    z�zoneinfo.gettz_db_metadata() will be removed in future versions, to use the dateutil-provided zoneinfo files, ZoneInfoFile object and query the 'metadata' attribute instead. See the documentation for details.r)	rrr8r9r:r;rrr1rrrrr	�s
)F)rr/ZtarfilerZpkgutilr�ior�
contextlibrZdateutil.tzr�__all__rr%rr�hasattrr�objectr�listr:rrr	rrrr�<module>s&
7
&site-packages/dateutil/zoneinfo/__pycache__/rebuild.cpython-36.opt-1.pyc000064400000003405147511334560022164 0ustar003

6�cY��@sfddlZddlZddlZddlZddlZddlmZddlmZm	Z	m
Z
ddgdfdd�Zdd�ZdS)	�N)�
check_call)�tar_open�METADATA_FN�ZONEFILENAMEZgzc-sHtj��tjj�d�}tjjt�}�zt|��v}x|D]}|j|��q6W�fdd�|D�}	yt	dd|g|	�Wn,t
k
r�}
zt|
��WYdd}
~
XnXWdQRXttjj|t
�d��}tj||dd	d
�WdQRXtjj|t�}t|d|��6}x.tj|�D] }
tjj||
�}|j||
��qWWdQRXWdtj��XdS)z�Rebuild the internal timezone info in dateutil/zoneinfo/zoneinfo*tar*

    filename is the timezone tarball from ftp.iana.org/tz.

    Zzoneinfocsg|]}tjj�|��qS�)�os�path�join)�.0�n)�tmpdirr�/usr/lib/python3.6/rebuild.py�
<listcomp>szrebuild.<locals>.<listcomp>Zzicz-dN�w�T)�indentZ	sort_keyszw:%s)�tempfileZmkdtemprrr	�dirname�__file__r�extractr�OSError�_print_on_nosuchfile�openr�json�dumpr�listdir�add�shutilZrmtree)�filename�tag�formatZ
zonegroupsZmetadataZzonedirZ	moduledirZtf�nameZ	filepaths�e�f�target�entryZ	entrypathr)rr
�rebuilds*

 r&cCs|jdkrtjd�dS)zdPrint helpful troubleshooting message

    e is an exception raised by subprocess.check_call()

    �zzCould not find zic. Perhaps you need to install libc-bin or some other package that provides it, or it's not in your PATH?N)�errno�logging�error)r"rrr
r*s
r)
r)rrrr�
subprocessrZdateutil.zoneinforrrr&rrrrr
�<module>ssite-packages/dateutil/zoneinfo/__pycache__/__init__.cpython-36.pyc000064400000013255147511334560021342 0ustar003

6�cY[�@s�ddlZddlZddlmZddlmZddlmZddlm	Z	ddl
mZddd	d
gZdZ
dZejZeed
�sxdd�ZGdd�de�Zdd�ZGdd�de�Ze�Zddd�Zdd�Zdd	�ZdS)�N)�TarFile)�get_data)�BytesIO)�closing)�tzfile�get_zonefile_instance�gettz�gettz_db_metadataZrebuildzdateutil-zoneinfo.tar.gzZMETADATA�__exit__cOsttj||��S)N)rr�open)�args�kwargs�r�/usr/lib/python3.6/__init__.py�tar_opensrc@seZdZdd�ZdS)rcCst|jffS)N)rZ	_filename)�selfrrr�
__reduce__sztzfile.__reduce__N)�__name__�
__module__�__qualname__rrrrrrsrcCsJytttt��Stk
rD}ztjdj|j|j	��dSd}~XnXdS)NzI/O error({0}): {1})
rrr�ZONEFILENAME�IOError�warnings�warn�format�errno�strerror)�errr�getzoneinfofile_streams
rc@s eZdZddd�Zddd�ZdS)�ZoneInfoFileNcs�|dk	r�t|dd����t�fdd��j�D���_t�fdd��j�D��}�jj|�y.�j�jt��}|j�j	d�}t
j|��_Wnt
k
r�d�_YnXWdQRXnt��_d�_dS)N�r)Zfileobj�modec3s:|]2}|j�r|jtkr|jt�j|�|jd�fVqdS))�filenameN)�isfile�name�METADATA_FNr�extractfile)�.0Zzf)�tfrr�	<genexpr>/sz(ZoneInfoFile.__init__.<locals>.<genexpr>c3s0|](}|j�s|j�r|j�j|jfVqdS)N)ZislnkZissymr$�zonesZlinkname)r'Zzl)rrrr)7szUTF-8)r�dictZ
getmembersr*�updater&Z	getmemberr%�read�decode�json�loads�metadata�KeyError)rZzonefile_streamZlinksZ
metadata_jsonZmetadata_strr)rr(r�__init__'szZoneInfoFile.__init__cCs|jj||�S)ak
        Wrapper for :func:`ZoneInfoFile.zones.get`. This is a convenience method
        for retrieving zones from the zone dictionary.

        :param name:
            The name of the zone to retrieve. (Generally IANA zone names)

        :param default:
            The value to return in the event of a missing key.

        .. versionadded:: 2.6.0

        )r*�get)rr$�defaultrrrr4FszZoneInfoFile.get)N)N)rrrr3r4rrrrr&s
rFcCs2|r
d}nttdd�}|dkr.tt��}|t_|S)a%
    This is a convenience function which provides a :class:`ZoneInfoFile`
    instance using the data provided by the ``dateutil`` package. By default, it
    caches a single instance of the ZoneInfoFile object and returns that.

    :param new_instance:
        If ``True``, a new instance of :class:`ZoneInfoFile` is instantiated and
        used as the cached instance for the next call. Otherwise, new instances
        are created only as necessary.

    :return:
        Returns a :class:`ZoneInfoFile` object.

    .. versionadded:: 2.6
    N�_cached_instance)�getattrrrrr6)Znew_instanceZzifrrrr`s
cCs8tjdt�tt�dkr(tjtt���tdjj	|�S)a+
    This retrieves a time zone from the local zoneinfo tarball that is packaged
    with dateutil.

    :param name:
        An IANA-style time zone name, as found in the zoneinfo file.

    :return:
        Returns a :class:`dateutil.tz.tzfile` time zone object.

    .. warning::
        It is generally inadvisable to use this function, and it is only
        provided for API compatibility with earlier versions. This is *not*
        equivalent to ``dateutil.tz.gettz()``, which selects an appropriate
        time zone based on the inputs, favoring system zoneinfo. This is ONLY
        for accessing the dateutil-specific zoneinfo (which may be out of
        date compared to the system zoneinfo).

    .. deprecated:: 2.6
        If you need to use a specific zoneinfofile over the system zoneinfo,
        instantiate a :class:`dateutil.zoneinfo.ZoneInfoFile` object and call
        :func:`dateutil.zoneinfo.ZoneInfoFile.get(name)` instead.

        Use :func:`get_zonefile_instance` to retrieve an instance of the
        dateutil-provided zoneinfo.
    z�zoneinfo.gettz() will be removed in future versions, to use the dateutil-provided zoneinfo files, instantiate a ZoneInfoFile object and use ZoneInfoFile.zones.get() instead. See the documentation for details.r)
rr�DeprecationWarning�len�_CLASS_ZONE_INSTANCE�appendrrr*r4)r$rrrr}s
cCs2tjdt�tt�dkr(tjtt���tdjS)a! Get the zonefile metadata

    See `zonefile_metadata`_

    :returns:
        A dictionary with the database metadata

    .. deprecated:: 2.6
        See deprecation warning in :func:`zoneinfo.gettz`. To get metadata,
        query the attribute ``zoneinfo.ZoneInfoFile.metadata``.
    z�zoneinfo.gettz_db_metadata() will be removed in future versions, to use the dateutil-provided zoneinfo files, ZoneInfoFile object and query the 'metadata' attribute instead. See the documentation for details.r)	rrr8r9r:r;rrr1rrrrr	�s
)F)rr/ZtarfilerZpkgutilr�ior�
contextlibrZdateutil.tzr�__all__rr%rr�hasattrr�objectr�listr:rrr	rrrr�<module>s&
7
&site-packages/dateutil/zoneinfo/dateutil-zoneinfo.tar.gz000064400000417201147511334560017533 0ustar00��!8Y�dateutil-zoneinfo.tar�]`�֝jhHoCMB˦o�!$�.z�@B%!��GAEA�
Ei�)�A�ҋ+
D�"�{�4��f��3�IHv�;�{�ʹ
�hܴqDc��Lt��c��7ᵷx�M�@����;��c
���}|�M�b���m�Є�xE���F����?�O>$�~�'�Qn
�*�N���L��H�2vp\床OS-���ёC���?���
��+�ԡu���G��%����oX�:���F�*��&��#�\'62.����.�G�ī2��G��?߫o�h��/}h��CǭN���O~c���"��}��}�MA��ޑ�}�}�������5��1��������? 2�wT@ ���k
�����l��6����������MϽ�{�����sAf>�ёf?ST�)Ќo��	���������/귿�[����?x�:��]�7>�Od�G��O�����k���v�jt�w��'���_3t�?|_���7�O\���߄>�	��#�&DJ|�G�I?"�[�#������G�Ge��1R���޺8�Y��?�z�O�xg��Et������.�̯s���O�s��s�z�{��F`r�&`�:�n�!���k"��;����d�?�_$����5z�٫��
���S���97�b�N�t�82�yR�6��"�ByP��>��Q�ԏ+�X�-,m+��D���y�[K�)�Z�|!{��2�޵��l8�-��,?�g�)C�GG9�����7sVn�R��Y�Vig�rn���:�s���7J�gJj�I�i���Qk|uA����Zk�n����j�	+U�C5EMU�{lv���p�6���k7��_^s�ts���4�MR�'�:��vU�n��w��ZQ��`S���jj���j�Gm<���I�Ojp�Φ�v�!�g�\���*_8C/�p��.t��Y�l��Dg�o:[�Z��zk�3�ey��kל9r�ʝ'o��r�w����
>��_��dV	��BϤ�ޓ\��p�n4å�$�-.t�j��⏶�Y]JKR9E*���5L.��o�d~_*޲U�Ԋ��є��7�G���_
	������K.��O?��<_���C�u\�����#�9Fw@{�<�y2�8�Xd���1���Hc���'#�Q��<Ny�2�xҘe�q��c���/#�aFnj<�y<3�f�q��c���7�&�qF�<�y�3�Ҹg����H`d02���	F�#s�E�`d�02O�+��F�#�H�ad��C��#F�#�9�ȼbdn12��c��3F�#�9�ȼcd�12���AF�!����|ddN22/����O�LP7��ھ�F�z�˕�ȁY���]z���+��#9���?��em�����	�>���Gd���_�v����0b����7��O��a��Gd)�������_��?ƃo�dl��DS��3���<��>?��c��W�[k����/����}���'�߷/�=ٱ�Kokڸ˶����w4�a�f�fݹs�mיϭ��\p���z`~����փ-:y���y�q$_��3lGjd?ve������m'׼m���F۵�O�7��R�l��O.��sh��-޳��h��ښ;�]5O�b-��aR�ҋ�|��:ܜy��=�G�Z.�Z4�ZxBY�<q��h���ZH/��L-^�V½{jI�ϴRy���<�Z�Ĵ�r����,n�ku�|�5�7�;�G�*�}G�\;��*�ɡ���ᮟ��{���?,�[��Dn��3���'�����m��d��z�N���F���[刖�z�����
��o��N{�X9]�����s�>ʱ���7��\�T�U�~�(M΋4����u���sx��,���&?z�>�]�Re��ih2�M/c�adڠ�E�}�B�L#���J(O��R(S��Z@�#S��i��T��r�L;F�#ӏ�)��4dd*22���LKF�&#ӓ�)
$�22U���LYF�-#S�����f��FT�.':32��֌LmF�7#S��i�+�;�(�*ў��$�3�0�0�0�0�$0�,�eB��	I"Pte8�
�I.�$(�l�,H��l˂�ԧ*���/�Y�DYp�.ʂ�ɢ,xVe�����Y�?�EY�3�K!z$��M�$2�,2�42�<2�D�2b(^�T2�\I2�f#ٔ����V���0���������D��㣳0����������xd�V8�M��OcZ�n?ھYsƶsj����9KL���%�k�\�s呤����L*�[城Q���H��$����TJ�TM*ǚҾKI ?|L����A�0���9ߣ�9��~#F���Y`���$�g�)�υ O�Mz^��ܰ��9)f�6-����m��}�����,���ƶ��3靳%n�vw�n��q��[g���w��kҹ"-�ӽԤ�s?�;_�]��]��t��~���H�<��~N�!I��^).F�<��^�u���rə��4l���ϴ��L�)�K&W2"��Ѣi��d�U�?!#J�}RF������S3�'g�O��{H{�����#�F�+��gy�0�BJ@{	)�)����<�=�F_��m�I�vm�������f����j	��/;�k�#�h��}�d��`He�P�Ϝ��&��W=Z��v���c�K��v�']���H�S��[�:F�*}?}��t���;�^�Θ�l�!ҹ���9j�׏8��t�`������9/ݼ']~�+畣^�s�-5Ͻ�R�K���NN��lq9>O+qRu���TwI�Ο�<Z&�d��wX&�3d2�1�|�V���V�8K1�DX�����HT�hg��F;7ک��юEōv.*m�����Qq����lT�����;��w>#F>kt �` ��Ax����F��  $F>Pot���!��ACx���F�� �DT��@½��s��`D��,*qtp�g~t�����_��up|°�g:����?�?����o`��� ��S�e��Q~f�A��۵Y�.��w�W猋W�~��i�P�q^���� n���J�Gm�;?�uE��緜�w|�\�q�J���<)�s����jeuM�k�ZkNu�OK5�p���p����'��
�-)��[�R?{�ug�N��I�_���sk�W�r�_�M�mu�;�w^�n?�����K�qUQ�9�۹s];�΋���]'�;w����e۞���=_��V¹wY���n���<u_�-�)�>}�_������u΃A��C�_(�K�Q��[����r�ʉ�]���'��3�{���x���}|�q�w��C�ƏM�g|
g+�7Ε�T��Q���W~:O9�r�S��O�*�>IU.�J��|�9�]�j�	ʵ>+���
����
�W�_�5n�2n�v0n]ofXZC*8��T�R�	�����*M1��^,������x�z�b'���׷�~�b�>`�e����ݼE.�l�\~�\��]�8�eY/W��G��2B�R?D�Z+@�V�]�^���+Xw��{���{,��:�^s�-��Ÿ��ӎ�u&�Խ6릨�w���O�Zٷ�&ٯ�tٿ�l9�'I�?N6����OF�Aw��[��z�;���l��7��Xo4��n�[Eo<!Hn�,�UVnZ�|_��>�[S�?�c��c��D�s�fr��E�.7<��ֻt��E�k/�=�]�{M�#�z�#ǜ�{7� ��K�����`��7�:=���Ǥ��ݏ�c�F�OF�w�n"z�����<�_E]��/i[_��[R�]AN�-+"ɉ^�D�i9��M=��<��Q}d�"yԞM����Ǭ����P��8]�ŧ�������ϓ�|n���Ç7��'��|T�˼]Nq)��pYM<�;%2��N܏���!��#��-��E�2{8?toI�$��,�X�@b��	$�X�@���	$�X�@∮/	$��$��P��Kb��.	&�D�����œ���E�����Ŕ���E������H��"��B��b�Ȃ�Ȣ�)	/#�/#0�,�0#1#�1#2#�2���hg�U�-Y�Qp%��X�;M�
<!�7�ě�H"��B$1gdA����M�$q��&��*�=<8�=<8	��zd�]�)���D xn
���)(�sS`����M���<7
xn
�s�rЀצ����M���<7xn
&��P�)��sS`���.<t�)�0r�a�`��<t�)�0r�a����.���"P����³sP���*���K���
�t
R��~�Jt�w����:.t�)x1rC������]�`&��,��/ˢ�/��zYt�#d�]�Ew=@�uw <�]�Ew�O�j@�H�(��-]t�r@����o�Ew}�.��Ge�]_+��&Ytקˢ�>[�$�8Ew}�.�둲�'ꢻ,��zg]t�{颻^W�ƺ讗�Ew��.��A�9��S��s��ߗ�+]u
���N����3w�9�)�3r0g���A��;#wF��9У�C���>#}F����0�	�`d3�Ȇ��M#F6�l�$0�Q@�Jf��
#�F6�l����`d$#��fH���M#F6�l0�d0��`�6�l��7��'j���_���]������^��`d\T���}|��8��_���%�_��]�c��k���c�YGW����l��f�Ƈ�y�[�EcF��Oɻε�w/)��,�}�O�kKl�	�J��{G��&ہ��o[T��\.�\+�UF>�o�|4��џ���R�յ�O����7]���t��k�K�,�.��t�ۮ�s�p���ҵ�]���*H?׭'�SW��Ly�`��j���'sd7
�|}/j��[l���J^O�J�0O+}��̗㵲�W��0�VL%K�
ى�d%��Ft�*;��Γ_�<�xs�̛;�k�/�C�������:���N�
N��4��6߮a�y��R?z~�U�Va'8$�t�_'�=6�)X�����l�4^f��ӗ�!Qb@�K*�ė�@�ʟdE@�K���̤��L����:R
$r�H�H1��H�H9���z q#A�F*�č�D$nk��("q�.��ʟ�"qK�kV �+R$V�6H�HqDb�F��B�� �AbE
$��@V"$V�FH�H��X�*!�"eBbE�ĉ
H*�t	����Ȥ`"�ʘ�LJ&(1�M$P����&��H�VJ"�B~�	�+�H�D�c�	�8Y�_v"��He'R�&��G�*����-'z�yeO�_��_�����,�\30��2�uw��;Ҿ9��Y���b�8U�O��V��\���V
������.U���鑧�䱿�3����$҇!�:WXTMʋ�M����#�>�#���	�I4��p��G��>�Χ�N��ݡ�C6�>l��X��6�> l}H }P���@������mJ��������~>O���>��3�oN�Ḱ9���s7�ь���y��͏~�/��[Z����H�9��������R�M�D���4�v�����ҝ[��靥{G:���JR�䇋�I�ee�I���ˮC�h9#K빬��ܖ�z�)-ϛ�z>9U˷|��术�w���L*�:^/�q�Tx�!����L~E.:"Z*%��%T*�L.�)���KU�J�s���p��\���=�O+�?�^��Z�U��vkJby���+���U��/Wk�B�^�Kٽ�.�?�N�t��<O}��8!�ܺ@�u@�j�yU�����56F7��y�L�}���|C��~�ni��ar��1-0O
�\��d>W@�{T���\�&�~���`�l����F3�I��{��fjM�$�Qc��^��!M{i!׿�Ck��B|,��)���,7�d�Z�Hoy���j�$����Z�u�����Bo��Ck_����G-�^a���6��Zr�
?H��.$wy��u�S�v�c����r�����Qk�^sH�Vv=r�$�w��z��X-��4=:��})Q�q�������)��.���?{U0��<�^H�$ǭtJ�;��շ�KC�����JC�''��.%<��'�Wk��N�I��iI[?�G�J�FΙ��:�Y=v�>&��6�g�>�rگ���	1��Ç��q�Yj��m]ʋ�x�q����i�Dk�)��S�o�#�el=s�Y����#Qo!V��B����53
��@b*��V�S��@b-��\4z����`Ff1����lfdF32��l�1����ȇ��|c�#�AQYb������J����B*$%@�$5@!�!�aV�SRE>��z
)�)�����a�X�Ȋ�z
��)�"f�@�S�ì"����0�	��R�SHYPO!u��X%�aVRRRRRRR �Oj࿁�~Bj$EB�]�L���:��B
�:
��(cŚV+�QH�PG!�B��K�Qn�5+����kV2QG���NJ&�(G�M�Q6�5+�����kV:QG�׬x��2�?+�����kV@�QH������IQ?!U�2��B�:
)$�(�����R��Z��B��lh:9J�����z��B
�hI*
$%E���H���	�*��uRW�QHaQG!�E��uR[�Q�D��UuR^�QH}QG!F�Tu���~�j�$�W��ݢ~��\.�W�Ҩ��R��Bj�:J��_�j���Q��u�WVq�Q����9��+�:�XF���*�W���|e�Ge��+�>�qOQ�b���w�ߍ��/s�U�:�Ϸ/ܺk���Y�M�O�M��_��r�6B�c�7�+uߑo�G�+��Ϲ�(p�sG����R8
�Lu���(�p��hR��X�	��:�;^z��Q�\�Q��x[�Ja�Rۣme\j��|j+�}[�I���mb��*v��쒭���Vٔ�Q%v��j��ju\l�s4q����~������6�/r;j��Vsi[��m���e����W�s6ӠV6��;l>c�5��Z8��޴�������Ka������Wu�r������m
�8m
�z;��o�tomk�s��I}�-��t[�rEm!�l!w�5{��z�UGة����1��[+�Zlw��C>�����Ӷ6�:�}8��j��5s8�[
�"���P�c��Ng'::���e[?G�Ϧٺ�|��}q����1������58�f��lS��;����WA[���]芭o�����;b�����W���^ǀ件�3V9�'mq��uV?���dې.��wlC��8�۰�s�RG[���f[��>��el#7�8F-	p����6fr)��W�����pə�5K���eM�Z�ywR�5-���Ϭ�5�}����׏�}�Y������G�=��n���d�0�D �F��#Jt�%J��-`D�0�DQ��(�F��#Jt�%��60�D �F�($����D'Q��(�
F��#J�%����`D�j0�D7Q��(�F��#J�%
���taD��0�DIQ�%�(QF��	#J�%�B=��0�DWQ�,�(�F��#J�%
��0�DeQ�3�(QF�h
#JԆ%zÈ�aD��0�DuQ�;�(QF�h#Jԇ%�À�I��#)��)�+K��aq\Y`D7���#:CW�
Q�Q�Q�8�,0�q\YB�Ы(�+K	�8�,)0�߉���#J�#J#J2#JR#Jr#J�#J�#J�#J�#J#J2JR����$	F�d	
=�&�}�4�,Q0�$S0�$U0�$W0�$Y0�$[0�$]0�$_0�$a0�$c0�$e��$g0�$i@�5F�6Q�7Q�8Q�9Q�:Q�;Q�<Q�=Q�>�����;
k{���ojX�Uo�V�V~l?�|i;��ϸ���ǟ�}�~��
�^����?Oƃo�2����O^�:}U�I�W'��&��mm���U�n[o^�o��[J�7��\�ڱ�֠~��ر�H�~�HwlY�K��/�q|�֑�����}����_����U��u����Z�i�^Ȗ�Vk�-��~��}h��5����|��=���I���ce7q���ũ����R��9D�3���]����>��(8��IZ.�*@�$V-�
���U���tfLQ\�h��TŪ@W1��)$�bU Q�G#�22��J�h�U�De�$:c� Q���X%H��*A�7V	űJ�h�U�Du�$�cu Q�y�=��/r�-ٹVv���k�s�����W)!�/e�b�fa�%k��g��3��|B׿t��I�����F�E�?����_�}|����O��?_��O���>Z�����X߷�mC�~[J�[��ίO%۶����pձ�I���{7t�����x�x�wL��34���Y�kw^�w�+��R��eӝ�����8��ȩ�[�����{Z�Mo9�}Jk4SuZ�j�'��6i3_��6����4m�6+�W�꫆�/��,������pSm�*��]���}O����6I�%��ζ��Z������"Z�tvP.hk�qv�vk���vv�Ժ��vM��u;�S�d��c�E�9Y�z�WY���K�c:{�GH}z*�(�)�agߢ��?:c��i��os��/��?��9`�-mද�AS�iq�>V٬�'�C���P���$%4k�+3[J�u&�'%n�L:)��R�9��`i�����zicV\u��X��������?�j�Ç
p�1[�rr7�;����MϢ��^�Y���NsXe�ixx�'4Đ��0C�BC
9

7 
9�4�K����^~@�XHF�HCgˣሳ�ѐ_%1�%�*&�t��ė�RW_v�b⋘��CxVLh�L�x��,)c1��-���"&4�_Ą&�b⋘��C\L|� &��P�����KCxKLh�<"&41�SĄ&��}1��)�$�15�#ń&�<pc1�����	MLx_ILhb�bBS�hLT�^"&41��'�	ML-x�X1��).FS
�S��f����z�kbBSx����T�^%&41%ၧ�	MLMx`�'<0Q�����T��+&41eʈ	ML]�1��)$���?�-���Ą&�6<0��ŧ8OQf�
��"�� ��W�_�����g��3=Q��
0e_�-+�_泤�����o�!�#S�:yA�S��QT��~�S�oV8A��X�5U�YE?2Tn�c.�EJ��������sRP+-[���`毷�FA����
^�����P�:�w.�X��آw���5u����@����r�u�����Ƚ�3i�XM�ۊ�Po�OO/
>M��E(�����)�4��?HA����i����m�AS��E�C��Չ3�!]���}GhC���uӆ�Y*'z%Q�xSN*ܕ��`yĕ��ʣ�V��0H���A}�[���@��)�[��g�}x[�2&0��3Gk{1������(�z~�4jMظ���b4�Nظ��e��Ɖp�#Zظ*��qC�e���u�i�(ց/g%�/l�YI�8��u�i��q%3ց"	w;c�Iظ���eԘ�ƽ�	77c�@Mظ�b�2�Eظ9b�2�F�8M�'��[��)*M�$a�H���2ց_���k����$l\Ռu��&l�Mظ�u��4a��X>C6�\�:���q;�eb��qK3ց'	;G�6N�Ǚ���՗��k+ց[��$�/��)����`�oӐ���?����L.�g�?�~��}���(�oz�v)�~і�hCp�s}�(uCYYJq�F��M
�>?�T=\����������{���x��gC��状���k���|��y
���/�u�Wݔ/m�/_�zT��Xi8�[.�:�(�q�\x�
E�`�\d��1T.��ԥ�\<<B)(��PJUn���WJ)��_/s��({��^n	���
�n�וĚJ����;T��?&W���Tk�Y�^|��^c�����ɞ��5��kn]e�:0V�=�]����5v�az�����3�������ϭ��wT|��<f�\.�l>WF	�{N���R��~r)����r÷��f�)[�0OX�7i���zCo�e!M��
�b�Vm��ޚ�����Ï��4����b���z����׷��tnX������b�K������������;���t�pU�u9�˃rׅ�n����'8�[g�=��Pz�!�Z�2"'�����j��G�oD�vѣ/M0b���1���y���Ì�g��G+���A��J�������C�_���<��%%��B9�a>#ќ�'��`$���'m�m��5A9g�1�H_}�ةƘ�0}lϡƸ�k�(s�-_��Ç�-�RL���\�gDV��EV��V���N��X�#���5�Ȱǽ�/X��ly�Sxdq!f�1�Ox�7���rq���\T�H�|�X��3q�ab'N>L�ɇ���m�9���@b,΍E�sq�ab/ΑE�9���8G�Tqn&f3�i�1M�X�i�lL� vc���3�1Mc���vL��"<0��3��f�cz)#��g����4N̪�i���i�f��4���f��4
RL� �4����z`zF
�YE0=�Mx`VL�8/<0�
�i��`���i��U�]�YmDn#<0�r�������Fx`V!�4�D"�)���Fx`V%�6Džfu���R"���J�6Kt��l5Dn�.r����b"���L�6޺�m^VDnS�Yݐې����!�!�CnCj�܆�
�rR>�6�~�d��@RA�6��@RC�4��@RE�6���mH�ېB"�!�DnCJ�܆��
)&rRM�6���mH=�ې�"�!EnCJ
$5ENC�
$UENC�
$uEnC
�܆T�
)-rR[�6���mHu�ې�"�!�EnC
�܆T�
)1��9
)2�T9
)3���
)4r�9�����m�
��b#�!�Fn�?��n�E�5K�g>�[���#"����x>��3�g��ϙ�@ ��.�޵ϫ�ܴJ>Ue�����g��g��r\h��Y�&��yo�%��t�:_�Y�x/[�]+�e��h+�v���ܒ�
��W+�r٦���J�c���U�4uuT���V-�Vݲ����ͷi�����a����6��.v��]�X�M��?2����UHg���t~����2�3����
\�.�$���*�;@�C�K�Ek���K^Ѫ�=C{
>����.�(�� >�E�ړ�/�7�_^�y�⏦=�B{���0���h����
A{���:��y�]���tw��Qp��}�KN��bxd�ȁѱqY��&߀'�蛭�/���fl�'���J؜JL��Y���fu��5�J�:Do����A�+��Ѷ��Y�J�j5���8ѯ�����G^���2�ݚG��q�w�������A��/c�<ܰ�~�b	����ѝ���y��Ǘ�N�}(�|c���Y.��ܜS��#ǹ.�7�^�H�\��p{�GdV��/��Q@����J�Ӽ���~�G���菅u�?X�[�[�K�7��'2K�����l�� �/�9��[x��<o�V��m��Os�9'��;^U��(���8�Qx�x��A_G���΢#��bC�/u��(��,T�Q�3��TeO[鼗�er䷹��*N�Ӝ�s�|D��lOuAxoA�b��_��J�����Te#2~�?d�>!�>%�>��g���z�XNȟZԳ�rB��-��^��0��g�:E=��CԳ=E>H{FԳ�;E=[,'�$��m�gW��gx=P�J����������s���N���]��?��ZZ�Վ]g>��^�@ڳ�g�޷�:�KF�M�9���C=��};�a?�%���*��Z����qLNw���q|��tb��8�xS:�v�����4����H��6���g�q�M�L���	�����J��m�yG���{?ٞ��6�E��r[o9�O�j��롭�����*[��u��=�8�(=�U��lդF��/9�:kw�����j�����ƒ������&��k�u����5��a
���6��S?�����W�ݿ�{6�����9�j�{]T�/����u��z+���/0��j[�񯨍F�uX����mMZ�-��Q��"�T��٬@{���>�Ћ�Rعj���R󭳜-�ޖZN��l�^q����j�����g'�^��~����5�/��:�*�v�y�ֹ����Z���]�kݾ���>�&�X���sL�״UN[��R�=j���>=�ըr�Rt�wԾ�sj1�Ԙ3g�~�;��_�i�O��}�
����-nQ�spt��N,��
)>��sh�ARB�	�Bm��:�����RR��jҞb҈�[ԑ�oH���UG�(��Y�:6�6�>�K&�ʙ�=I�����P��W�(�H$�lq��;�}�'�o����;�#�<~U�/����%,x'>�jH�)\
�x��!��x�mq�Z���D<WA�!._K|�U��s���; qv��;L�"�!��[L|��U�y��&q��%~�2�o���2O�I&���L|E'�8�J'��d�.:��_�I~(��D<�}�\�i��O�k�ۨt�Q�$���I<�#���I|G��8�J'��ήbA9���ba4�:���h�tjIЩ%]@����tjI#Щ%�@����Oz�N-i*�����v��J��N-i�t�Z� �	:���١�+�Ԓ��SK��N-i:��Xh�Z�N�h��5�Z�p.�=�Ԓ��SK�N-�:��E@�#tj/���K��J���	�B�(�H�Щ}!Ab�B��4cҭ�*Ƥ_�Ԓ��7�c��e�В�I�С-.k:�7�#b�C��[�X�Щ%�C��4�Z�=tjI�Щ%�C����%� :����В2�&�CK�$mdd}D��4�Z�ItjI+ѩ%�D��4�Z�MtjI;ѩ%�D��4�Z�QtjIKѡ%=�� ME��tHڊ-�+�4�Z�YtjIkѩ%�E�vZ4�.:����Ԓ�J���F��	Z�:~^>ٓW��-<�����t������{zg��2��|���r,&���xѴB��N�m�߲��Ku-E:Ĥ
i�R,�b�K���x(�i�g�xVck唚��a��v4�~�5��69�kp`�)r����?ͧ�3���J�_�a)�/U	Ƚ�p�cZ�)�C����u7���=s���5����pTbX�A�̖�K���JiҨNZ��ڔ��;���1��:���o�-����<)�&�k=���M��4k����-��kp=�}�ya��
~��Q���JްN߿e��Ͱ._3w]����f)��9��đ����z�,O�uZ���'���֧f_sT�bi�>H����ns��]a���̱[��_�j��7l�;̓^�7�b�����󐐊a���Vo��P�Hʰ<Æ�)`NLߘ6�x����
�Fl�*e��Qsn�G��Q��f�ظ�a�"{�%�9I�s=�;�n3Ƹ����̕�fV>Չ��W^j����Y�#��b�5ѳ?=���k�^f���!�.4Qv����
H 
J�_h`��B���(�� ����
��
H&�-L 
\�@�0�4�ai��@ƾ��H&�5L 
l�@�04�y�i���`�	�H�&�>L 
~�@"L �&��Hd�	$B�)`�0�D�<"L����Gd��#��i`�80yD�<"L�&���Gd��#B��`�X0wD. ��I�GD��#����`�t0yD<�<"L&�H�GD��#2��!a�0yDL&"'�
�G$��#���Ya�0yDZ�<".L�&��G$��#"���a��0yDj�<"6L�q�$������0yDx�<"=L&���G�G"�GB���Y���{���k���hX����e���'��+��=3��'z`d\TV�??�'�?�/+�_�VH�6�?o����V�f�{M��V�{�U6ڿ*�[�/��-ܘ okUF��]�~�d��Ǝ/�+;sW3v.kb�:�[�=���g�Je�܆}�Te_�tc��ˍ5�1��~�8�O92�8\z�r�Z{���c�T������3%��('��)��3�[QBq�����[���A���)?6)k�i�Y9[�q.�����S�O����S�2˸��)�⡱ƥOR���mƕ��+��M���'(��R��7��{5Vn��b�Ҩ�r�X�q�b.��f���;<���ƽ�!�c�wt���>� *_C檓�7�,�ĬW�M�Rh+щ�����A��`�2�S�e���0I����ˊ�ƔG:ȸ|h l�}�8��IHgᡃ��%*4�4P��'
<iP��IO(x���A��'
<i���4`p�w4(|�A����^� �@B��(|v�kT(|�B�
�4�P��A��'
4>i���I�Ot(|�C�
�4qyy�@�XBC�SBi@b	

Jq��T������e構���������Ӏ�e�i��2�4pq�y���
`\΍1�2.�F��s�-.��
|a������-lp\�zdm�b�'mf�,���O��,c��Wc����|��o��]��E���eo�@�
�o�m��`�d�7.>ܰqʑ:U��r�P��ǎ�M9~)}�)'w���2��u�p�|�����ĚH\��W�^��-\�|�ђ��n���T��?��r���R
�\����K3��\N�qe�����W;�I������n��_Q��j�/�?{ܬ6�|�x���\6���w�41�=�k�==S�om���
K��9UKoUNq�kq���)�=�-���lJ�М�]�y����3��9_�4�|�l7����#��%�ۦx�d.:x�b�g��;��P�_/���
�mVɰ��יh�f��|(Ź�:��j�tH��"�ҡE��ËhK�і3�-j1��1~����A)"��^,�mY��!�hK�і����e��@��a�hKCі��-
D[&��4Tmi� �ҐҰA�����hKCі��-
%D[N��4�miX!���B���hKCі��-
5D[n��4�mi�!���C����!�hK�і�"�-
G\<��$.���6��&���(��<Ly�b�
Wp��}4lQJ����*
a�fݎ�<�1�鐨D�FŌ�5*fY?����r��-4>::nxl�~Y���'�f����"=Z���Y�£?��=��y�wC�A��1�OG!���p�w__�@I�~���?�ڦ���e��t�k����'��?��ߤvHl|tV�@o�'���w������x��z�6�?3�/7fʥ��k2f/�Ҙ3k�1wF�x�Se^�=c~�,e��4ca��Fw�E�l���۔�5j)�/�2>��,�,�ɩ,�>PY�c������Jc��<ɤ�z�n�V[k�Wk�u�u�]��&���h��fl,���x�[�ӟ$eӑ���㝔-)������NI�YV�<i����ʗ�G(_�]h|m�l������!c��|�7%�������WA�5���{���u���u{\O}_|	y�]y��@O��p�r�B%�`�������˔C\�#u�R~�M]�(zS;�('��m�Z{^��2��3NHP~hW�1B7�0�6X��+멤�a��ͯ�t�r��%�����O�*���1.�J��r��;F�n�7ռ���T���7%]�)�>ћJ]#zS����oR]�mJN��J.�b%w��J��J���)��J�������u��ݤ�z�(�\T)<�!P�(�6�(:B6�y�7^�|�(^�^¼M/�[/Uj�^ꇕz�[��e������c�ܲr��'���r�yde`�\��Vr�r�H/�j��Z�z��d��u�~��[a��`	��OW��ko�v�kO;��Y�N�Jج�&��M�J�=��>�ke��d�v�e����$9�8�\v�n>)�M��n	����_�Ko����pbc���r��o�� �I�\zpTY�i�szH�}�Y�zh�����g�p����=ir��-V|"�����)r�e��6qq�u�x�m��v��������z������<��7Mz����·��]nx�]��w;�&w7��=�^�{��.���G��:(G�9��n�A��c�U�9��
�o�uzLy]�I�����Ǧ�����,��6D�Z=nQ{yp���:�_Ҷ�߷�<Ի��r[VD������rR��z��+y����E�=���|��Y1[;b�>�q�_��z���'s�~��\���=Ͽ��߻�%Y�
��'L9v)���|�}~�T���W�4������B;~����ez����Z
<$�r��hÒ���Gڊ6,�+ڰ��hÒ�2��2�ޢK��3��2��2��2�I�Y���nj�Ɍ�ˌ�͌��8!i4.8I:�N�Vん�׸�$i6.8I��N�v㌅��8c!i8�XH:�M��I�q�I�tq�ɫ@�v\p���$��'I�q�I�z\p���$��'I�q�I�~\$��mj�hOSm�|@�@�	(����96�=M��i�hOS�@{�b��/�3P8����9��)~�pN1D����8���N��@�_9��aۚb�cmk�5(�S��u�?޶�؃B:��)��Nq�t�E(�S<B!�b
������B:�(�)N��N�
�t�W(�S�B!��
��PH���B:Ű����k[SL{�mM��5Ÿ����k[S�C!��
��P ���9�@��3r,�$��1�&�ȱr��(�S�d�8	!�X���c&
�m�9v2r�d���q��c)#�SF��@���[9�2r�e�8�ȱ���-#�\F���{9�2rf�8�ȱ���1#�d �eF��@�ό�9N3r�f�x��1���6,1�n�9��c8#�qF���9�3r\Rlg����q��c=#�{F����9�3r�gd��>���#�F��+C�]3�ȑ��#0�O�W`d��Ȟ��}#{F���!�G0��`d?�Ȟ��|#{F�@���3���f���C�ɒ<#�F���?ك0�ad/��~��=	#�F�&��O٣0�O���8K�8�%yF�-��]ٿ0��ad��^���#{F�5��m�� #���>H^����<#�F�>���1�B;��#�!)���5}�^��_��坽�ů�<�����>����?���?��������'�ށ>���,����x��l�K���f��C��<m��i����m�~��C��ȇky(ٵ��Z��ZV}=��~�+���M����}^�'��������ޝ�\?�S��iފ|�r��Q2��>�M.6o�ٿ6m%�%���J�f���6����)T��#U�\���턨쭙**{��<���jv���+|���+p����Jʟ]qˮ�eWܲ+n���\q˶-�mۂ�[�_���ČP�6��
x�>����?�Ys�el�?�ݵ��;��;�]q|�=ݾs�
ۮ�<�}�x�\�˭�}l�?׶�Ͽ�v���~����u��K{�9/ϸ&]I^�:GTӮN=�ֵ�t}h��zS�檿T(����0D�]*�z��o�;���w����;��y��W����'���9՜.1?i9�uQ]��K9#O��jΗrq�)pP�[5D�{9��+�鶷����}g�U�kwx��޹!�$�)'xhE�x_-ڡ�T,���R�R�6Vg��/i%�w�%�I�����Gie~��,�U��T��s��
�8+j1�2�Z)r�Tydw�JSE���]�V��V����{�I�G�_��>�jH��5�i��~�Zk�P����u�,Ҽ�bӀv�q7>�/�}{v�հ�mt���z�^�k
ʛb:��Z7�C[�5�Z맍�5xw���➶FC;Y-SG:wjo2��#�~{Ӷ
!E�[�y+�f��ְ"9a��Ý?:�����☋�e�{�ɝ-ޱ�fv�_6���f[D�W����u��3�=ZAT���[d�m����8�ד�?͘�����ۘ-���|�4Ɓ4α@��:��x��@�^���{�ӝ�s9��ɜ/��@�\>q.�x�O\��'���g��7p������J��!�|�\>q	.���O����[@�\>q.�x�O\��'�a�s�X/vq;����D6@<Ď'." >" N" ^��D�^·�3G��Sd�Ud�Wd�Yd��2w)������:�D�2��3��4���(iJ�m��Iϑ
ב
lŸ�������֋�g�# 
@6@:�l���@S$/�d��H��>�5H:�AOZ$�@6@��l`�H��H?�
�� $��  -A@z�,�4Y�
����F� ��?i
��Yi��d�=�H��! B@Z�,��Yi��%d�M�"�O�H���S��U@�+d�Y�H���v! ���I���Iˀ�g@�4���I؞�E����~֗�����������_��̧��%e:�ϲ�_h�L��N�aI��מv���za�o6J�U5ںkV��}_��w#���K�o�N�i�e��p��v����?�O�s�����-��^b�?���A�d���Ҍ�v�I_�����U=m�i�ޚ���ܖ/�yJ�Y�*Y����&M��9�[�;b��NL-���Zhc��«�[�13-Eڞ��!���.�K�>H-��{��RK֋����#�t�f��g����a/�����q7{��
-F��V|[�(]����Tnk��GK�*��ju�Y���^�t�����7��=OM��8��^s�k�
��紲֙�d��e5
�j��ia��T��Z��g�b�/��l
���j.��n>{�tw���u�z�u{����
6������hf��ҷ�=���5�{hՒ��K�᮷��k���?���5��r�M{�W'�[���f��n}�k�Ŭ���/{���Sk���֎����n�\!�����.>�w]�����h{�	�{l�i�5��k��nk��5rRk��#�}bY�
v�F�V�F_jh�qw���V����Xc��?��:`z���?�%J���_�w8mW��>�+{��������'<�Қh~͚x�#kR��֤���#n����ku��u���1)�ul���q3�Y]�����|�_�?|�:�]��T���T����F�G�����#c]�_8e�l�N�;�'`�BO\D����J�/�Y��@�'<+qH<�7%�›_1g�8��9���+����a �^���H|�W$N�+����hQ��F�~ ��+�E�^�8�H��!��+��I�����r�_gM�W$]�W$m�W$}�W$��W$�����r��Y7�I;�������1�<"i	��^�4^�t^��^��^�4^�t^��^��^�4^�t^��8>��"i�"���^���c�$�	�%xE�&xE�'xE�(xE�)xE�*�/�+xE�,xE�-xE�.xE�/Q1>h�d��e�b,�k��������d��x�f����w��I�P	&����g����q�IEE�&�gmDE��a�HT�I'Q&�DE��ސ4�$��v�L�	$
E%�tHZ��0�)*¤�����"Lڊ�0�+*¤���΢"LZ���$q��\T�c�5X{Q&����.�Y�Z�J0�1�4a�eT�I�Q&}FE�4���ը�^�"�/�Ⱥ��0i7*¤�����S���������|d}GEx��#�<*¤�(2��K�˂�Fo��z1q�_/'�坕�#�����z��?����C��?�?��94!:>|�9��>O���|�����?��Q�Ӱ����b������oy�Ν䇻v�J�'�]�YSsX“];���Tj&�j�ƒ[*��'�fK��$�}P̒�}�n'oX�/�l*�尥�kvS��-���6��	�fB��hB�V�d�\%�&$��*��c2r�U�d�lS�$#gS�%#g�jB�tׄ���IS5��Z�g���n���g�U��[�k^�Zs���Z^K�����uf�O���'٤M�x7	I���Z|6�YZX��75L��>�l�
[�;�~��Z���z;
��}����+�
��05zm_���7ɍ��Im�,98�Ԧ��J�^m	��&��f	���v5��y����Ζ)Rr�Uu-���1�����f֊dk���mG���j�bi�=>9"�K����N+/�v)�f��Ԯ7?1u����S�6��g�啐j�a5E��j�
L�}k�%��:9�HK_�)�oJ���R4���-�N�?��e�w��F�L�v�g��:x�A��O��S��%uh�LCsS���2
۱�2�{F��%s,#��H1Y����-yTlo��}
�DŽ7��]U1y�gu�˿��̳X���9|�đ��)r/58r��9f�祆�X�Nk=j��H�9���.S���I^L�"� ="
!=b!�-�#�\Q
��
�Q���QL��$�!="�ab
�����(��h�􈨈���(��h�XzD�|,="�fN�����#��c����h�XzDt~,="Z?��K���H���@�;�$�<�#��H���k��H��5ˀH�^�k����k��;I�#��G$bB�a�H�Z�5��"���I6��tI>��� ="AzDR����I
�#��G$-H�H^PJ'�A)�da���t��M$9"}R�H�R:I�'�!�O$EH�H��>�W�`YB�D҄��	�I�'�)�O$UH�H�VH�Y��$]H�H��$aH�HƐ.��!]"9C�D��t�d
�I�%�7�K$qH�H�$uH�H�$y�@C�$����? I �%�A�K$�H�H�.�$"]"YD�D҈t���I$�%�IЖ���%�$��d�C��H�4�dHR�t���I*�%�U�K$�H�H^�.��J���V���0����je�l)��ǧV+�����d�u�<8.!6.:.!�Y\
�������7{�OV���簘��
���)����g�sG��6O���s��U�XV7��k:��k+�Y�-��w��o(s8uÏ3�)���nܮXw����x�~��۩ߖݑ|зC��w,�Dv>������G.=r�yA�n���=g�{F]���]�G�>G�������ﯳ83/B�j���x�Q~#S�P��XlB��"�-yu�B"����X�j��X�AT�"�3y��ȣ�h
1��؃�-rE����@�:ZJDw�Rځ\�i$�g�BٹPv.�����?/>p���Џ�}˓�~͓��*O
�����|�
�������<��O�.��{����&�'�m��d��&s\�v%���L����a�$ϛ�_�9�#y����~�M�D���|�n�^r��x�.���J��@��g�R�r{�t�RJ��z�k~F��%�r�K�?��WXu˨8︮$�T*��E�ܱ�R��1����Z��r���sd��O7M�<�רq��\s�*�ց�z�9�u��t���ӻMt�w��Uu��ů}n�߽���1+�r�d�2J��sr�m��z�w��K
6-��}�h4�M�2���x��I��Fp�zS/�i:H�~@	���z`�^p��f��Rk�Ŭ5F�C��V��3Z�o���>ܰN������h�(����3":\�#�7:��w<SW�\���rJ��(�N���Op(=�ΐ{F}��3B��:eDNzO�];��3\�*8߈�G_�`ĸ��cv�5���+�����c�(nci��N��;��P���2D�dDD-ӱ\�*�̥,K���Y�I�5ʬ����n�cZ��ς�����3�Wp�`]��=�9�s�}�|j�Iˢ�a���s�n?��Hi�<19����4�Qae���
���g������FZ�KO�9e���>.s�1>w��D�
���!���/OeD��6&�k�煒��]����~���Hn������"���*�c�*=����_���}�ӎ��
;|�ޡ`i��zky�]σ"{F���5��yA$r�v�C~ق�=#�'��r�b[
b��Nk�ZG�C������#"�K�
�H��
�+"�[!��\@���<#r�F������;"�{D. ��\@��<$R���ж�5�$�C;Cχ���о�k�Q^����1Wy�,�Ɯ�uhc�s�ס�DϏ9���R�s���3�C;Aט�D* �)B~�:��"�C�*�:����ס�+�:���C{��1��\��z	X@��ש=�;��T6P| R�����5b�S[E�?b��
"�D.?�\��:�~t�XB��ש��;�Wx��7�s�(�G����w�#��:5�q�����L�ש��ˀ?�Nm<�\����E�N�w�#&�:5�q����Ԧ�N��\�"�X��Ԣ^���;���lv�:�H�ש���e�1^�f��e�3^���w.��:�E���ש�x�2��S{�w.��:�)�sp�ש
�ˀ�N-�w.R1,��`�a��Z��+ռ������^q��ǥ�06�O~��	��',л��n����!�<��
дk���3V��h��[�Q�G���������������������uk�(RJY�N��z��6G�R�~i}��
3����_*�_ު~qr��uFY��Da����oz%�9,�	߅n��Law��ݦ!��ۨ�u=-��Y��;]�ɜ���ifz�~0i�vhb�pt�Ȁ�h�zڱ�����'��'~�+�:f��w��l�!�{��}%[� 
�k 
��#
�cv#
�k�yc`chcyc�|�#w�p�w�;�:w�>�i�]�3��h�1�����Lw�"�):�	�E�ct&�5:�	���w��\�;�g.���̥Y�x7�%�����*횆;K�.�@9ɻ��.턫|Z�s��]���y:�	�>���:�	��@�
tV�#�&xW�YM�Π����Ag5�;��j�w	���:�	�-tV�c�&x��YM�Ρ����Cg5�;��j�w�ކwEx7ѮmxG���N޵}��Y��<����78iH�໢���ݤ���z�.�a��̂ǕX�Yg,X[Y�:���X]OܙVA���S�}�K����w�;�l�Kqw���ns��'o��ץ��rV��~O�ɜm��n��5&��������=�#ZG���"��z�'��K'~O��I'wTO˓N��$��bJg_�-�[qհF����1~��^���2.|S����Om�ꗚN1�W�\a�q���r�3ҕCī�	ҵ�(1/#R�_�P�5��=[V�g\��Q���q�F��zɶ�
��^J:e���}�m����˘+������\/�׬+��\_yQ�8C��?ګ}�����l������XM�	'VHK���<^�g�N����jH�[@�O��C��d@���ܛ��s.1���y�7j=��� \���s��&\��|HD��|>$��~>�Q�k?�S�hõ6���õ,���õ������ε���hĵ��4*q�o��5:��-�P��`D+��Q�NpY��|"z�	.�`|N�z�y�d|N�t����4N�Y5�bV
�h7.�E@:>�q]#��R@=ޕIׄ~�1��DAZj1��DCZj�H7P��Ӷ�� :�L*���(IK,VW-i����
�&-�p����IK-Ai��(-�$����^5ԫ�z�P�VC����o���n	c�$�G6����k�|�������t�'~�+x\��k_*���cW�3&>���]��ݷ0"�����s�ծ�R��_��uɀ����ui������]ߩ�w߻ŧt}�ں}��u���o{���&�s.��r���?���UӴ�<^���C��0�ˏ:u���k���������n��Ou�����}��kfz̯̈K.�t��*�������-�0����/�4�Lj��V-�W1㳪��f�.d|~����������>� ���l��o:����zٻ��.�Tvv�Ϣv������Q{+�8��؜�/'ձ��ǣr2cft�2�#��M��8}ԑǮ:��1�Xd���U�:Q�r����N�ܗ}r��Y��ef�^�Q֙-F��O3�έX�Q�g.p�0��ɓ�G��}��6��.5��ȭ?(��
��˥:d]�m�}�D����u��e��������d�0���bӿ̺g���{�+'%�[�Jھp��ˢJI�9J�3�|��֨�2��(����r����k��*�=2[|�GV����gݟ�]�_��ѕ�����`����ԮU^�j�~��v�GG���Q~ۿs�X�qT͕k��%����C�����b�a�7�κX��r�^�1��;�Rj߇�4�IMmx��F����w։H�Hm��D|Ӊ�R嗎�&�]J�<>8��Ő��Ƈֻ|1,��.FT�Hı̋ͮ��H�-��{?�rź�Vk�i=sAj���GlɓS�N�߮����r|��S;v�/�ㅱ#�����bt9�H�;���H���]wU����<�ۇ墺��p�g���Ѳ����Q��_u��*���Gl���#��5Wf96���7+�E�ُ���?��~,�oր����7�Rb8��F
���_n�#!�ZT™���zW���;�J�D%-+�v�5|N��_��z<�'{�̬�>�g�3��Q�[�S���]yYvj���Լ/i���r�s��28j\�����HWOf�GM����TF騉B�枊��ZX�<�3��|�ȳ����U�}���c���qV������8.ſ�j�
�}7��ݠ�w���LHk�Haܴ��fQfr�!�eG�cW�)O|��2`*�\��#����J����8KN���xK�,���ň�K���Lp�Α,&'�cr�L&'�e�$�L��i�`4m��M��4u���`6M�i��M��or��i�8E�r�:���0�� �)��#�;9BƓ#�<M'�i:xO�	�|�N��t�����4��@S	�4��Z@��=��PD���@
�R(�F�Bu�
��P@� �j)P7(B� e�E�!�P�:X�x<��X+ݣX��p�B�
E5+��5�R(�ְBQ%����Xw(B�!e�E�A�Pdf�B�Q+]cMb�b���$�(V(8X�x:���V(�X��`��S+�P�b���
E�,V(�g�BQ/���٬P��b��'��
�چP�(��ƑBu�
�u�P@�#�j)P�X�X�
�{V(fF�B��QQ�P�V(��b������,V(Ff�B�#�
{6+�Y�Pe�BQ9��Jt���
���PT���u��	�����B5�
���P|�
�+��5�YV(Q��
�V(f]d�B9�
��P�;�
�K��P<|���TV(Ja�"6��:�PD��BQ"��jt����0z~�٬PT�k�ݤP@�f��x������S�z�
ſ��Ba\d��q���
E�#�P,He�"�+�SY���
��TV(�xV(:��Bq_<+c/�B��+�.�B!a��]co�
�#��
�P4Oe��\+t�g`��]c�@
��P��k�#X��M��O�B���/�W�B1+���lV(�d�Bad�B�7����Y�h��
�+�Q�PLv�B�
�`+բX��H�ؓ�B�O�؛�BQ�
�+�����Wa��'��L�ƾ�
���a���,V(�e�BO��ϰB19�
�W�mX��|��
�W�uX��|Ş�����G�6?��������r7�������1#S������z�����`���!����|e^_�e�{Q7vL���	��)%�Eþ��nԖ�:.�Uy��4�sqi�����.��@˖��L�_�'�����_�Jڮ�>r�VJگ�]����k���1�ew
}�3z�5�3D�g�{Z�o|�_�b�~��wuh�@x�A}�J�Z�Ҿb�{��j?�ӫ*���(��XJ���^k�6MJ�!>4y�V�O�Xgؗ�7�v^��Fѿ�}���o
�G�'{C_��x���O՛��5���˳B���)b�����!bH�\-�?Jڣ��4#�_"���]�
}�Y��޵BK���j���ǜ��̟��Wz�)�vݝz����M_�;v�u��Y����[bt�З�;��A?�Xﺫ�3a�����Z��I�}���h�E�v���/@����j=6t�{���yXpl./�����%>���;��2g@��80}8�
n=n�tmP�����$-���zBd-�L����%ns�I>��e�a��k���G��C�o&�\a	ɱ�Eu����Ĕ���S�ρ�~EO�x_K�ٯ���6.�S}|n��D�B��l�6a�3�S͵��I�4���K��N�˳~ma�����:�ݯ-���iH2�"�%�Ô$�ҒdHMbU��$@�R�4���y�\������!0]k&�@�O�qU���U�����X��tl��t̥1c7Lu�cl�)�t�
�0���"ӱg�cO�L�����X��t�
�x`�n<0f7�
�c@��ҙ��Ҙ���B�k�И�� @
ӱ����c�if���L�ֳ�I~� �0S5�cS� ��X"�1���ILdž��j<0�*0�ۣ�a���%��e��vL����x`�[`����t�t���$���W��5�c�㿧4�c/� ���f�1c�E�<�cl����tl��t���tl��t���t,I灱�1c�E�F��"B$��x��ƍyP����"ӱ��1Kd:���tl��2c�E�T�cn���t���:ӱ$��."�2c�E�\�cl����t�
����y`\^`:Ɔ��<0f�E�f�cl���t�
���q�"d3�|E�f:���t������+B:�16\�C�/|;<���|���;�XXh�/���@o��)���E��R||J1�7�Z��J�/�����mѴǬ��)K����*��\��!k�i�|��s��oc�w��������l�������$�Ή���z��'vVVMV?�l���_Y�п��Q�KjFIK�=%�v(k��hў�#�Y�V���Mk�Og�o�ͷ�'�ln�򔵱{��)~������3A��Bs3+`���YK�9�s�#�7���}�q��t}��;[���]y�{Op�mO�[�S�}93�o?��dẄW���`R'��Ħ���ʑգ
|�c����'�+X�?�y��y��Oߘ���N��<�e�u�ӝʹ;Tk�j���&�U.$�Q/���}��S�G�KMQs�5��R�\��y9�a]9�¼��ܺ�U�������7��yV���{ƚֽj�R\�����JI��Gޠ��֫��E����kMSʘSղ��0˹��k���~�_bV��`�7>ʼ?)Ҫԯ�Y9���@���
�XUj7P��:cU���R�_���3���j��_+5W~��Z�R���ɳ��}Yu��0�d��Y��˿�h�?o����� g���d��(s��xgo% =Im�*Li:��*Ϫ�&&ZA)����h+��>%Կ�������"��1#����]��l�e��bo��r�Y��Z�l=ӭ��?ȴ%����2Ii�e�}|�ҡi�ڱC{�ㅥVd�@3r翬�rm��U#��g�]^��v�eSb&W�}X[��?F��)��h�T�9��˯�ڻO���56�%�h�Qs���t�ꗷ�|d�N�ќ������a��k
L�f*1�qӇ+�樃c��r�ԄȦJ™G�D��JⶖjR�JҲ��c�*��4�F|}�|<���b���'�Rg�6G��VJ�\st�
Vj��fj�5-b����^We�2.�Mu|�#��S�'�[*&&�Oe�T&�V'��?V��m�&��3W�B'��u��Q(�����ea'����W��y����S��	���������0p$I,@�X%�&K��4�Ф�I�ZO��Ұ@����4�0�a-*
kTiX�J�ZW��Ұ@����4���a-n�-b�4��-�E@����4�0�!-2
i�iH�LCZg�@Ӑ@����4���!-6
i���p�^��G�a,9��w��N�t�r��ܼ\|������kx^.�==cg�<�}Z�a�p����
ccT�vRx�T�al}���WTy����������L^.^���@�0�<]c��a�N6�������l�	����;��
c?Py;W�a�6������ql�	������<�k�0�����~&�y~B��als6��������a, ���'"ƚc��^c�O(L���7(�׫�)��T��OS��OU���a2�Od�O(`��a�O(d�,f�Q&��H��C��6���쿌�쿁���3��2
�_���gƞf�O(���R���T���b"�6��������E����h��?�b�d��Xe�n2��Ά�P@��'���f�]Tf�Uf��c/+��-f��f��,f��
��r��=&c�Y�?1��o�����<�=�2��Lf�n�� ���*��I
���Uf�q
��4��{��R��@���,f�mMf�#,f�uLf�3Tf�6��p��m����쿸�쿩���c
���*��,���Tf��(�+X���<�=o1�_j2��i1�������������쿛����p���������쿩����E��K���f�5Uf��*��[<�=o2������4y��
��Wl4��o����7��s�b���f���؈0��|ņ��?�+6&��Uf��"x;�[�P|�ۡ�1���S�@�������@��)�8��{����F��ܑ�߅[�|��p����6�Y+��_�9���ڂ�V7-��ݧX[���x�[���P�^��m�o�]�6�Џ�la����+�n�Ek/<�ެ�־����u5's�v`F�j�?�L���J;�PK;ڠ�z,��p���M��X'�9�=�:u�pz�&�̖M��?�.$/�.�~��Y�\*M���(��jv�p-�����H�ռrк�A�zN�wT5�x\�g�g4�=���y�Ƴ�YϞ�h<{~I��s����X�g�Ϟ��`�j��Ϟ�h4{�~Y���>�g��=�x��S���j�g��5�=��x�<[���X�g�4�=�	<{v<{n.��ُ:���=�4�i��Ƴ��g�.�g�oh8�i�Q�١��c�j�A[����hǺ[ѫ��6,����\��nj̄�Z�e�{�G5����-[i=�\T{���z�٥��ZL�
�Pc�քv�˱ۨ�G��i���&h����/�v����l�⦏W,�'>�Ɨ��%D�R���k���:kI>�Ԥe��a����DCۓo=��ڝ�Vrl9hs6Z�B�A{��]y;�5ӭԼ3ZZ�"5-�km\�i�̕�ܡ�鳴'�;�&�ў�h�N�W�4��z�Ӊ������Xt��5�b�9��e�����Zȫ�=� � |�p�[��O�_�?�|�Ξ/ƌ!����+'?E�`e`�Ō?O+:.m���oW,f��
��Y�?�񯰘�‡��вk�)�0㟫1��Uf��5f�)*3�G5f��Uf��4f�a*3�Z3�**3��3��M>$�7��낰Ō�����Ō���?���/��G8c��YcƞV0O
��☱7/��7
����	���QD��y}���Vt^��Vt^�R+:����׏ъ��_���i�l�y}�Pt^!��W���Z����t��z�ڢ�}Z�y}���Pt^�S(:�_-��o׊��WiE����Z�y}?��>M(:��ի��ͅ��z������zc^�B��+��ޘ��}�1��zc^����K3�)<����#��C�`�
��T0sR`�h1s��1s�2s�1s�Uf�j���S���Ҙ�_T����y=����y=�f�5y^����ǒ�����}���}���	�����{������x���"����*3�i<����}(��T1s���z(Y��k����ܣf�3�F3����9o��1s�����};���1s��R�̝�K3w�[,}��9o�2s��R�̽��̽�*xk"�D:V���d�s�k�:"nd�:���a7��
�y��h�v�]��#9�����3�*>�o�mo�+�EO��W�%/�e��\��\�r�m��;~l�lk.�{u��O_���Ͷ-.O^�IG�sRy���l�wQ��^�h����ꑊ���.�G���2J=�\��5�v�-�յv��:�1ۺ���?|`�T�#�vw�)ox��F{���qնY��-)-m���䬰����':�x��b����!���q�������>�.i����W�la�{w�	�ns�{_Γ��S�9��)f����6�����)���ُE�2��W�}¿�y�e�ɒ�͓��O�l�^��>?s��ä���*���}��S����]��I�+4Q.��⾜�ټr"�}5��y-��;/�����y�0�2�M��g��^ղW�)%�;�%m��{)i�RZ�c�f+��8{s�Rv���r�~J�5i�
o�1�W�3����o�?��Y����rtu�<�����UjW�W-u��V욽��J���~�s����k�t)��i��`>4y��v�o�:���Ь�w���L��	n���J�Hw���JÓ
܍2�+�wv��?�4YU��tb+E��kL�g���G�0Cz~e��a�9��%�͈�_�#�4�]Y�n�e��b��n��TZ�Mq���Ai3��ۖ�Hi;e��]�iJ��^�M�*;��;^xՌ���9��.�^5��|����kO*]w�c&<�t����{�6�}�O�-Rz�ɶ��W���c{�+G�����G����nǦ�f��Wݏ,�l>�3���[�c���g��mn%f�7�1���g���m���)	���tU��'nk�$�<jOZ&*Î}f>��9������ʛ#Wlv'��Lu�[�Q�;̔��ѕ?0S�w��mW�"����R�Uf�9[���D�X���&�	�)Oe���Z(��`�p)�T���7)X��uc��̅���`�n�-T�P���Y��_��#�aD�#��v��v?1Z?��Ҭ@�f����ڠ���"EF��/�kH��O�kJ��XҬ�g����f��4��Y?�(�Hi�`J�~T���Ҭ��f��`yֿ��hi�`K�f��4��Y?�/����� �Y?�1��i֟�no�4�p�M��)P�"pH)����p���ݬd���t�"�Ɋ�,7+LVƸYxRa ��
�c
+�vV�(�D�YxHa���-�73�b<�"�3��<�b�����3�<�_������
���y�?L�~��g�]��'�Yh��"��͊@g��7+�MV*�Y(o��
��-���nVv�<÷�<�?��"��Ί�v����RX�cgE`6���"0VaE�e7+�V�ܬ�1Yp�Y�k�"��͊@3�?7+���)P�X�ngE�]c���g���(����"�RXx�����"0�Ί�7&+ݬ|h�"��y<���NJ�j��H7+sV�Y��"��Ί��
+���RX�"����i;+-LV���"P�dE�I�X@Y��͊��bʊ�F��͊���ʊ��"ˊ��bˊ����J�P��C�e%�U��Cf%`���0�d%�����'V��<��P��J@��b�J�C
+�vV���=o����=qV����9������f%`��J�7+o��p�0�d%��f%`��J�cvV�UX	hcg%`��	`%�+��`%�	��)`%@TX	���J@c����nVʛ�lv����V8_�y`%���C�J�+6�p�bS�J�+6�p�b��J�+6���ko�qM�/|;��D�.%����߄ѩw��G`�/�p`�W����Ͻ��<.��Ϝw&uȨ��ʸ�͹�|;��9O�{">��9]ʶ��9׾5��+ul9�O��/������s�Xb���+�v�|cgn����vʗ}M��K:���d���;k|���渣�ZF{����䱶�ev9�6��쐅�E��>)��]�B5�,T����0�7�&����u7�O�N��@K�ᵠ��xM(��Bt^��݀׈��N��kE�?�^��kF�?�n��kG^?j��5��#���ZR��'���R��+�����?^_nw�����L�)��Ԟ��^��Z� ������n�px���]��)�?�(��|�Q!�̸f3֜�?�I��D,����,�zйoM?i���r��We�w�t���%H?���K�k�=˹�m�.��p^^�k\9]R��琉�� ���Q�uJ�_MN�k��2�8Dg�C�T7�g�o>��� ��;l4��
^����d9g��AFc�j9`�&��1Wn�i�$k��1FP_�\r��[�]/�I�䰯��?mqFdL��8,7���h���l�\�h��E�u7�h�b�lk��h;-Wn��9�}�
r�Kۥ�1���3Fd�ir�G����o8�玕:����2�~<���1������d&A�!53z��{	Ս��Ƚ���b딑c?�&9�9#;�^���Z~���ѭ������������4��+�J`�7�E砊�1��h���뤄F}�	���J)r��N�Ћ��WKþ	��?�Q�A��G����.���KK�S8GU=%���u���MJm�Ǚ�{���3m�Vc��t�wOH�i�'��i<��uy Q�S�'��� M�'w�������a�η��E'��G�-�����"���h�����""s�O���/�boz.��\�I\�+�\�s$.��\�������|��E�&�:@�p��,q��AϏ��E���E�)���2��.�N.�l;���E>��1�(BR���E���,H0��d�eA�Q�dcM����t�4���i�D�	H�$!a#$"iZ���iAB��II�$&EHNҴ A)B����J�$+iZ���iAҒ��K�$/iZ���iA���L�$3iZ��T�dLl�������6��!�1b“�IO�$>iZ�8 �����iZ�l @��iZ�i8���l�@A�s�q@����m8H�� M�4-ҴHH�0!M�4�@�q@`!M�"�8 �P�!-��"�
iY8t0"��C�iZB�i��`D�iZJ�i0���D�E)Ҳ�(X���E@#iZ^�i��� F�iZf�i����F��q>LNh���֠�����5�u�@�Ip���8&���b��!�[rJ�ȡqw���t�`P����~�/!>�BwƵ }�z��9C�=����ii_՝ƾ5]�W|���n��$�w]����Ə%��O�X�RyQ��QS��#��&��ғ��S���'�5>פ��>�k���^h�R�2�u�u
�1�~�F�)È�+6;���bz���F�dAnY�=���y��y=�)����R��H�����[� �[�z��Ȅ���%n!v�Bt��𑸅p��B<'qQ���5��.r�~��rQS��#v��ך[���s��[�k�-��|����Ox��xT��c�[�{��8ĎFxo��X#r��0�d�僤�^Q���%���%��{�|		��z��m�$�5�7(�f�����R�����gT"d\��Zs�n�8G��¾���}U����c�%О3����f��,��'�f�T��+�G�=����˾gݗ߮�r����_q��{�q\��n)���֛�9�R&�{N*�m���޼�V(x�X�����/���t�!�|^F�	��:0�6�?`g����O�ub���'�z1���O�uc=���q�	���A�ud���3��1Ο�� �2U�mg���8�/�יq���zY��;Qx퉢ܡ��:�u����GdL�;�3�����pA
�LIH96i�P/�ߩ�>���%����C�����{<9�n�0T�_��	�{
����q)�^,q�T1���qX�����@��z���g���T᮪O֑g1�O~����@o�����78i��u+����濧𿏅��.�2��ݘ��X��%�܏z⮼���2}O�F���,�с='s����fz����&v��i�Ȁ֏Ev�Y��c�O���I��1KD�3[v��~n�Y���t��~!�}����藚���s��N�肎��~����z����H�t�.��p����l����h��|{��K/�TBst/G�rt-G�rt+G�rt)G�rt'Ggrt%GGrt#G'rt!GrtG�qtG�qtG�qtG�qtGgqtGGqtG'qtGqtG�pt
G�ptG�pt	G�ptGgptGGptG'ptGpt�F�ot�F�ot�F�ot�F�ot�Fgot�FGot�F'ot�Fot�F�nt�F�nt�F�nt�F�nt�Fgnt�FGnt�F'nt�Fnt�F�mt�F�mt�F�mt�F�mt�Fgmt�FGmt�F'mt�Fmt�F�lt�F�lt�F�lt�F�lt�Fglt�FGlt�F'lt�Flt�F�kt�F�kt�F�kt�F�kt�Fgkt�FG��˨��X�vͬo	�
��Qj@�j�4 2�5Ee����CQ/�l-Q�(z�G=��Ԡ�:�|8�F�� �Qg��u��y��t�7꼠�F�$�7�:o���F��:���Q!:o��,�Ɲ"o�i�����(��7�X"oܹ$�Q;D>�c�ȇs���Ed�=n�7���yc�:o�yS�=/뼱g��{�X���=�7�La�P�7���ь7�D���'R�>��эn�l�(G
 FD;��=��Q#"E@?����ٲ���#""FDE���1"BbD�ĈH��#"&FDM���=1"�bDňHJ�#"*FDUR�YI�t%e��-@YV���5�-M.q1"����#�/FD`�y���b/�@Dƈ���#�3FDh���� RcD�ƈ��Q#"7FDo���4I��E����"Dt���4Qd��
�;FDx����1"�cDħ	@w^�ȏ�#V�PH)�J@�F��*`�ʀ�F��J`�J��F��j���#V�L��P�*�+	E�&��P���+F�.��`�*�+
F�6��`Ī�+F�>�a�*�+E�F�"Q���+E�N�Ba�*�+F�V�baĪ�+F�^��a�*�+�|�W�h��a��F�F�p��a�J��F�x��a��'�3K�/|;��43�wG��nO�
	
��nqiqÒ������C����	�y����M�_��?�+xL'�M]W��WZ�,�T�.t��-,4�����׽H|���"�{��J��?����eg/�W㦯���^E���ءP�_�H�fɘ71w~��s���X��9@N]�f��+�1fk^+fk^+fk^+fkc+��!��e�ޏ������q��~'V��^�&�޴�?��x7���|�Z�
�ӹ�ƴ���W�������ߓ�Z��}k�mi�y��5���O��V���`�z~����*��m�y��,>��1�����˪|�Y��w�>�k�����ބ��e��wZ|�K���j�y����a�y��,>�?\��t��+*���_��?U����*������X|�d��l1o������"d�?.���"��/~�a���p�^�k�-�?�z���ݛg�̭q���1+y�W������<���=>�����|@�ə�mTy&�����(z|~�U�FV��󹛿q|>w�7��/��L�����8>��yDfi��#�0K�n���N�Y���_���]?>?�b�v���X�Y���#,fi׏ϯf1K�~|~��,����s|>�������W��]?>�S�Y�����Ү���b�v��������Of�����[EY��Xyf1��c�p�'�C�76.)���ߛ��ݴ������w��/^�y]��Ǵ�km��Qk��:�U֎��o��F���#�d�@�� YF+�d7
R!�C�?�S}����}O��f����[P�߄����&:�����?�k��7�YZ�?����2��Xxp�M�ü󿻙��W���������1MP��My#��3ߛ��-W���,��V�����y��w��z'g����z/3�\����<���bU��v�hU�6�hQ���hM���hI�v�k��Vц-��n�@A�q�G�q�G�q�G�q�Gkq�GKq�G+�ܾ��l��͔�sK�G�o�l�vI��g������3&ڄ�u%�h���h���h�V�h��h�}�om����&�}��7�|��7�{��7�z��7�y��7�x��7�w�u7�v�e7�u�U7�t�E7�s�m�ђ��ъ{n���h����h���ۿ���^��坿����S�o���߮��-`���V>�k���TY�9��
#���l�k�H���$��J���d��/Hu@�$���"�'YC�R%��R��<%��T��\%��UlB��J��d8K��dxK��d�K�����Sf����1"&�E�2YD6�E�3YDF�E�4YDV��c]#f�E�6YDv�E�7YC�S'k�r���5KG|g���&�^[,V�>��Y;�>�^Ϙ�z����`��5Ce��i�U��*�^�
�^1*�^�V����z�WX�������d�+�b�Kd���v���yV������b��C�U���^;V�v��z�VX��@��s^�>Gel��+�ǩ��Q�W�?��J�&�Do�j� V�����5�X5�a�j�g��[`}bՌ��N�j���}�W�����m�:�]��]��]��X��m^n�y��&
�@�"���������7,�{����<�����}kk_dGY�s��۩��z���_r^/���A����D�ߟ~BU���Ew��;����v�vdj��p|�258�Q�_�����H��
�K<=k��>adjJ܈�6�	����?!��=��;�[�?�ؚ*���%kQ����%�voU��.
^+,�[e�s���;9��w/L��k-�',O�g�X�^pNla�|m��~�C]5���Adsuu�:ڇ�~�G-Kh%��5Տkk����Ņ�[6��9&�[�cen�>��\���;[��<'l�>��?R��t��%�_�g����ZVn��E�nw�O����x��O�~s��������O�~j��^j:T˭?��Bg�r�D�ﲮ�A�*��ɎV���Z㝃Ԁ�Z�UԦ�i�zj`b�4���Bz�TC��aA_��>~BD�Vı<�ٕ/���K����r�U�����3s�6�U˖���Ny^m�
�}|�ڡ�����we�B�����P�}ss�"jT���R������]�-��?�2q�gB���G���`(τv<���Lh��3�U<\��l�gB��	��x&'�L���3��τZX<��Lȡ�L���3��*τ�h<�k�c�	��,p�gB��+r�<ڠ�L�3��gBYτ\t��M3!'��y&4���H�gBC,�	�K�P��3�τ����/��[	����Jh�+f%4��	-*�2kτ�T�	
�x&4��Pg�gB��w%�筤��3��τfk<��Lh��3�*τ�i<���L(���0ττN�<j.�L�K�gB~τN�lj4τ��x&ĺ�k�	]�x&��3��gB��3!��3��U�	���L(Ye�c��])�O^)�mhn�������ڦ$=�<���{�CCn�B��ߞ���L�c�����>�1�O_����e��<�̣�'W�;F�r�Q����8?�Y>��u��E܊܊̊�6���+=�����Ws�"�јz��L���x�@Y��_�"�����v��--�>�������	cR���_!�7�?{�w3�K|�^�!��3:�Y?��.�����e�_��&��e/�l��q��{G�t���.�#�U�m��'����K����˾��Z{\e��/�ʹֺ˯�m��v�]|e��⌏��v�k�J�&�*GO�?�,�`�^��S����٫�3�U�Qm��vW�C�N�A�*;k�_�Uk�%��i{h�ng���:�~�+c��9[�W��Ϳ�Dz~iW���9�\
ON�ens5��3 }��ɪ	Φ_tɳ8�ۂFۜ���-�Wmg��-,��3ܧ�-��!9�x����{��U���{Tn�q�Z���Z��f�۲-9��v�fg��{\���rvh���c�Ύ��"�ȑ;�ۢ�=+G�Ze�|f��嵯]]wMs�LX���Pg���\�9��=Z�q�Lk������;�����f��P�{4��UNvl����;'?����ќ�r�)Wm�e������ϖ��ˮ��+����'�rƗ�t%D�q&�1\��}��۞v%�4s&-�w
;^�9|�d�w��x�`�Hg599��M�yU�oKIΑG?Pږ��%���u�E�s��q�������k|��|"}���짝&.s=��8`�kҼNN�b�K�֟��|��?����n,���xk7�;"=�D�Ū��OV�ՠ�U��w��mv;x��m��dŮ�Q��J2߂O�x���C�+Z�ϵ0>5��\�:���Zv6��Dd�e���HH[z�y&&�-��,��t����_��$�@���	�b��n[r],�|og�e��Ŗ�n[.�XlY�f�e��Ŗl;�-km,�|lg��u�-��Yl�dc�e��Ŗ8�-��Ŗ���a��2��Xt��5��.$�(������%���`�e7]#P���7�.�2�.gl,�|̾� ,��ƾ�9l��`¢�6�.�:YtY�b�e��E�],�p���Ƣ��}�pHt�a�%��|Xt� D�8�_#]�Xt9J�L,�|ƾ�:�_)]��"X���A�E�O\,��`_l0]��/6�.��,�����2Lf��k�.���E���
@ǢKg'�.c�A�E��t��ǢK3�.��E�X�y�,�D�Xt9Ǿ��,�l�Yt�jc�e�̢K��E��2�.�],��d_lN]f9Yt�t��2�}�HYt��d��i�.͜,���5�+�.���!Ȳ�2��lYt�hc��*]#��C��,�����E�}�
�̢K&�b0��b�/64�.O;YtY�b�%�ɢ�L�.������2�ҥ��?���A��;�7л�o��[j'�Ǎ~'
�~��
����������?��;��H#/c�-i'Y���V���=�m����,��U�G=gɶu]>���>�t�O:K/�k�t�1����^.�Z�*�f�\��/l�+��3޳�7~�|��J�Fɕ�U��z�6�i�R;�Y�t���==��U����Y�P]�o���|���t�Z��!�5���ը[]�3��p�5�n�/�z��
ߓ��8�� 5���lxR�e~�l��9# }��ɪ�FӉ�8�Yv#0ѐ�F��O�!�*���rX�OF�O'9��~)�x}�ٕ���-�~'�t�s�Z�Fj��r��?_�%�t���Ѯ�����.wv�����A92`��s�]�	)z�b��R���;��g�Lx����G��:�sZ=Z&:{��4z�E;{�
F鬒�����أQ��f����P�wPzdqY�ќ�R�)g��2K�����s�7�McP�W��S��rNgBd��p�%g����-ՙ���HZ�v��1|�hy����xZy����.�3�J�ª�)�nit�+rj�����i_i9?;�Uq�2�:��d<���dv�1a��S�������E���&{9<��D)�Z��#���U����j��:XB�B�h%����t����t�jҥן�����Ӏ�QL�\Ei@;�i@��4`��i@'ơy�N`~2
�v�i@{ӀzN��N`�2
�v�i�^Ӏ/�L���|f���5Ӏ/lL��L޳1
�)3
x��4`��4@�1
�%3
�ic��dbc���4���4���i@]�i��N�>Ӏ�N��L��L�L��LNH<{�&3
�Bbp�����^q2
xAb���4@��|�d�4`��i�H�i�3N�v�i�!3
2�<-3
�d0
��y����4��/b����IL|e��IL��"F1
�X�4��/b����]L�s��4� ���eL6r���4`��4���4`��i�8�i��N��L�:��4�$:��4�Ds��4�����L���Ld�PfpPbPVf�QbpVf�Xb�L���v2
x�`�i�T�i����R�/q���4 ���V��N����e�
%�}d�e%��2Ӏ�Ӏ�2��ij�+�3
x���d�����?`3Ӏ�N�/L�9��L8��L&;�D�H6�-�����M3�o�t�g@���wg�w��KhPp� ���m��Ԥ�w������^���wM�8>!.e���u��ߴ�7$,��{7��u�/<.����aa�7O*Y�(_�"n{�A��7ڋ_�"���+nOy@�1�G<�B�O/�*���J<�l�ܸZ�5T�;��?D� ^?,^�F����O��_jZR�d���j�uL�R�e�r�S�����w���L���z՝�^m�z�O։~�-k��@���.֚0Y�F���,���[�Ӳ��p�0����X�lѿD{����Ѻz�o�n��}��7^����ɔ�z��Ot9~���n1(�1��Z1��1���bX�81��$1���z�8�ٕ4����b���咁z�������m��!u�S���:����W;�=�wlM�T�s=��1��11��b�WYb��o�]��#v��S=���n�-Ի�)ڟ{F��_z�	z��zﮑz��鱍豗d�Q����>�ߏ��G>��?����ߨ(>��iq������ߊqO����H�ؗz|�WĄV�!�V�5t=��\}�=)k�>lo}��G�[F��?�J����<���NG�h)�yPXSL�8�� �5���Y��q�/��ܛ��gw�O8��O~�V��������/�'�0��t�-�+�;{����*;���*���_�;7�O�hE]G���`�[H���Ӧ0�O9����F!�h;�FL)��V1�h�
��Q�ÈiF�R���C��2pH9ZiG�!�h8�-��e�b1i8�#-���e��\�
LjK�5�(Q��x_.�*QwHW�D�!m��C�R��%
)LҘ(;�2QvHg�)��D�!���Cje��&�)N�Ҝ(;�:QvHw���D�!퉲C�U���a�(;@Qv���	D���4ex �Ah0A���(;�Qv���D�:����BT`�"@	S�4�FHa��k����)"�0e���PÔ��9L٫�5BQu��AD��(U8��D�`�(;@Qv�'��QD����Te�"��E�`�(;@��/�aD��(�U8��F�`�(;@Qv�7��qD�戲�e�#��G�`�(;@Qv�?���D�1"U8���a�(;@#Qv�G���D�&��Te�$��I�`�(;@'Qv�O���D�F��(%�pJ ��:�*E�V���D�b���e�%�pK� �(;���7�]_�V@ �%��P"�2Ʀ�ޕ̷��)npjr�;�~��ݼ�;$(�����X�,�
��-���-�������ڻ��.tL����m�?�a��^�篺��1��;}�����$H��)		#�&
�w��7�=���z�_x�<�{�_v����|;��0���#cF�����N�b�O���Ȥ4��ݱ#�n��q�����u�_`����n���{�{������&�V�-�������:0�+yF�zX�{�=0�옾����7��?�u`�7�=#�e�������+�~��x��u�w���L����78Ȼ�������f���7���Z7�g�����7�=/�gj�����7��?�u�7�=#��=-�������4@��������ý����i��������v���	0��Z�y��3�?����{���'�o���'��m�����@���3�o���'��i����`O����[7�o���'��n�ݵ�!��i��{`�61}���O���ڻ��C�?����;����?oa����7��?��w��g���� /���:�Eht��_�	|[Bl��� �����'a��~��x���^���i��{`��y��n_�+�������g����w���?��5���_�'��i�zF��zZ�{�{`�w�c���~��x����@���a���^��󿷂�v���	0��Zx{F��{Z�{����TL����O���:���3�?���߻���_�0}#n?�o<�D��`��f������Dӷ���'��o�:�{�g��=�+�&��;����G�F����@�ʭ��yH�zX�{����:1}���O����=#��<,�����.L�?����`�A�{zF�{X�{����1}���O����]���a��{`�o21}���O����]x��O��w�g�q����`�ӿ�7����n�a�?X	�������@�����a�pW�HpX�/����ݽ��x�'~�.x,�/�2����_��o�Q��Q確#��x9��ʉ!����ZV���)�/m(
�HŞ-+�3��t�z�8�D_�d�҆�|Z�y��QZ�J/��S��r�e�F����r���kV��^.���(V��K��-ޟ�"U��G��[z�Y��`�0�J�^z��U�j����~1�zP���Qc��z͕�F��{u)�����uz��rR�a{ćC�Iu;"֫�]�o�.��1�jb�����'��2W�wN��gMV)zӉcyV;=0�%)h��zpd�ҫ��+����}"��b��jR�+���[�J-�~.�tV1Z�}Wl=��f��-y��v��z��F����F���/�"�%F�\-E�!F��+u>�M���*��az̄�F�����5�s��=Z�3z��z/�F�>?软�0bC��c�6�5ϋ��~R����#��GsV�����c�s������E%&Lj�>[�2'���˽a$D���L1�[�ۆI>5�eQưc���9	����xZ�4rE���@Rg�⨰2RJ�qt�3Rj�Ebj~i#-¥��2�UyC����;E"}��d�}����2��F��5ԁ!���b�G��sʋ�[z^�}-�@	�)�§��H�I�|��DˆɄ
#&FL,��\!�0b�a�DÈɆ#%~`��$����1	1b"�/Ɉ#&%FLL���1A)B�b�DňɊ#&-FL\���1�1bc�DƈɌ#&5FLl���1�1b�c�D�Ɏ#&=FL|���0"`D ��`�#�F� (H`D��`�#�F�@0"�P� ��`�#�
F�.`0"�`D��`�#�F�>�0"aD ˆ`�	#�F&�N�$@Q�ˆ@E�
#E-�\�0"�aDÈ@����4�j�0"�Q��� ���"�F<�����#�F@��1"bD@Ĉ���#�#FH��()XbD������"�'FP��H1"�bD@ň���#�+FX��h)�s�"�R�ň�K�#0Fa���1" cDP��f߂��qLp��&r��&�M�KNn|�����w�����k}4c�����ﵭ3���W.���pW���ײ�y���~ЂT�M��{M�w��w%pQ����6f��&�"�e׮!*:**.Ɍ0�(2W6��V��՗�4m�Ѭl��KX��5[L�,+����e][M����PM%��.�gʹs��<�9��J�em)�6����do���_�\ϠJ�^[k��VJ�]+��m+q��h�
�N����:�g�%��!�p�X�
w�w�p��E��e=C��R��;ʭ��'��+��?�o�I�튽�1���{�H톻
�'}�����C��a���/�R�ɀ��M�t2R�sr������뺡R��T�_&��5�d6�Bc'†��L���!�Kޝ���������g��4��mH�>r������|��s �~$�����w�"G�̒���c����_vbgBJ}��p�����H���?+�Ϝ/��[<�.���!=�CsjH�|���H	��#L���v1$�}
���aT�.H��a̱!�0ܳe�����e�!�����o�ɩv)��b�;@��ʑS��Խ��ݻ�d��<��ii򒑆��F�����h��Zą���������4d
^�e��s"ߐr��s[,�r�l����Hӗ��gN�fξ_���]�=�&�y�W��4�ӋN+b��`���
P(A0�xH3��w��{K�}e�'��o ���-8��������9���}�]�|�"���	�����dy	�?����1��?�l$(����+%�+d�[9���Wp��s��3���q�S�c9�w�p��4��������K�;���I����'��?)���+��oc ����F1 ��!G����������;��������6��?%P
�D�����P���L�?L"�?�@���D�{��B����������Ph�S]��C��G���9�Tס�������?�u(H��Ca"���~(���P���|?���G��y���X���|��o���]&�o����א����#�ߘ|?��.�A��72���ő���_��0��_���_o �?�#�_ ���/��C&�o��/&�BI�?�|?&��D� ����OK�G�7��w3���8��WV�M�WX�_���eXH|E�%��|?/���2����?����?����M&���O��������%�Gy����x���"-�O�9����/��S(�����=�ʝ�>N;8��kOg�0�3�^�P�h��'�ƴ��v4kY�K��?�u�S�7_ z��_�gxC���s`+��]���)ZC5[j��1��K��}�Қ,��0Bkjv!Rk	6��v[@+�b˺��[��Bj}�����f�Q�o��튭�1�1��b��!ݿ�u0�4�\��l��{5��R�O�pl���d��y��R����eӤ��Š٣$~QG185�u]146��I�����b��/��������׺�vA~U�����G���Iw=-j�C�����w�J)*�!F͕b���1��	iu�{�IH����u3 �k�Y*
�d�?3O���#���"��K���8̷��0�1἗4”/������Z�.L��ѫ���6v�NH{ӵq)/@��Y�K�y�URr�1��<��K���(���S�����^���.Ҥ_����qi��ڔ�ΐ��k�
!M�Ц�~����l?���ZV�))'r��sl���b���e��wf�8}�"i������Ҭ�.��q#�9O4=4�lR��Bv��������4�����L�I�`��X��=��*G�?N#�Ǒ�﬑��̑�oH��G��'���O��"�����E��1���F����#��X$��H"��h�%�i���gi�8�#4�#8��چ�M��G�?�����K���~Hx�����H���s�W��Os������/��o �A��D�?Y#��N"����\"�?^$�?M"�-��%���(�������'r�O��s�?�S]�B���u(8�]��W�k�h��C!"�Ou
���P���%�E��M� T�$��E�G�?X#��T"�?H$��'���E�c$��{��(l�ے��#��E������~<�D6>�_q������O����������.����������#�BI�"�~L���p���"���Z$������g���q��?����?����~� ���(�����%��V"�?Q$�O|E!&�O|EA&�?R"���O�����
K0�x����x������?�s'�xj��h"�s��g�s�p�?�]�kOWa���~���j^�jLsi5�BK-�_bm��RG�R��[���R�gXe���b�G�g��z@l��JWC5Ul�i���K�k�''��,�i��s5��j����j�5��u[@}�E�K˺��V5�[Z��#�>~���Ͷ}���mW��s8�{Y�
�Jk?i���i�vG�G]����w�t��|���=/v:��e���`�%p�]�.�4{��/ja	NM�B2�ZBc㴰�G,&��Zx�K�wC-����?i��mvu߽_���2W�5?��ޖ\��w==�%8��̝c�;x��b�D�1�Q��_Vk��I�؃jq
���֥i�O�w
xf�8��?s�8hc;���yI-ː�A����a�MĄ�{,	�~G�^�������ە��6�p�k�ژc���ݠݳ%�5.e���l���^�Οl�DLN���4�m�A۩�b�K�ޞ���/�k~��,��t���vM�����t9�j��
��&���X��l�]����*�R̉\b�9�M�m�k���wf�e����{Zf�Ng�Yf���<�Y<4ͼ�z�i���SL3�
%&i&��?����,�9���^�M,�`�5�=\�k�۸���׌x��#��E���^#���,�$��$�O>	J��F�����"���.�"���.�k�'���4�\�c5�\�4�
\��k�,�Oi��[����OY�� ���g!���H����?���_d!���F�����N����.�/��� ��C$�?�E��y������|������]$�n!�?@$���B�?���((��4�G,�;k�X�S]�BC���:�5��\��C"�Ou
���P��S]��D�?�|?����A��?H����F���������`!�?Y$���B�?^$�_�B�?H$��B��	�~8� r�_���oL�ߏ
=�]�j�t��ߠ��Os��_����"��^$�?�B��H�?�B�?�|?%����A0��$��I��O$��B���F�������F�� {��J����?����v� ���(�����%���H����?����?����H��&f�"{���+}�V�K�������a&�W��������)��}�'O��W���(�}���Y�8b����������p�����ESN�_��5�_�_}��2��֠����t=�?����7���a�H��WS�g�t���p���j�E�%�ו�Ǣ��W�fw��*Wk`|ק�E\Ͽv�BRp���_��'�6A��a�b�#yru��B"p��A��$���hkz�5).�:ޖV���o˵';��Z'Y92�lz�WZ�udZ3�õ��H0Qf�}��j�u���W��[1��|l!�q�P��V�)�~���E���f�omxNy��Za�O�w>�Fݒ�l}����^�l{�Ya{�����N�ߕ]�Z®�N��}"�w]k�=�.ޏ_����<��x7��rJ=`R>�JQ�U��u���o���C��'�+g�F�g�=���4��p���z~�0��W�
�Ϙx��a��s�����7(��wD��ܡ�8�oo�$:8��twvr�;?9��n%rk'%vZ=%��c��+���V�W����]�5�gM����Q��~��*���}�������ˀϹht��;">����m���
T�Re<󚂈
�����
�����!4$"6&"6("6*"6,"6."60"62"64"66"68"6:ChxDl|���!1 BP b` bp b�0� A�@A�`a��A���ͩ�bs* �؜
$6��U�_,���
�-""""C6D8�t�x!�11111 BP"b`"bpr��>=᷽�=|ox�o�Hꓜa���ʃ��a�nO�������w�geجi���h�p<�-8ʚnM��X3�l�z�Wf���Sӭ�C#��?�W��o&��ݡ�y6U
P�)�\Dž��_P�*}F�V���uX�):��am�Y}���Z����ߒ�?:D<��Dž���L�!(�������I��@G�=3Ǟ�V���j���Ϥ��Uk��.�Ư7���b��t�U�p�7o"��p����R{B�b���z�ᇸ/�L�ȱ��їd�ؖ<���k��=A�jv/��p�qm�oZ�'b{r�E�i��q�w���w?��W�N|���N���܏[��?�qI��,�o��9��9�7�9��U�Y���2��uJ<J����۽�)�>��=��=��6��V��Kؗ��f+#��M��&�+f~Ic9�f��Z6�@e@Du`��!���mr����("�"*"�"*	"�	"*
"�
"*"�"*"�"*
"�
"*"�"*"�CP DT!DT"DT#DT$DT%DT��]�W�I��H�l}����Ez7�"==�!���q�7 ݑ�7�:�/�����KO�[ӭ�àJ�\�@z�?k��ٓ��*�=��ϛ.=�34,B���)��=n���N#U���P���S����n��=U���������a�B���C�Q�9f�Ts,�}���W�DG�=���_��p�G`�����nGFV�Hk^�G����ۿ*�?&=ӆ��<����M����l�(GF�c|�����?\�U�S�/nkZR?k�xGvF��)4��8%P��=���.z�Z_������93�Rr���kb����zw�ԯ{�O��Z.�n�e�����'�2KN��^iБ��k��v���k�u�5#�Z����!a!:�=������ۧy�P�r���ŠfXnO����k��=�7��?H����g�£S��I���f���I}��L�a�����+5���+g��+������z���=1�S>�����z��I������z���V4��'͖�e���r�?�H�SHp����ܜRo�-+i�==ٖ���&�����&:�X3=���z�Wq��N�{�x8���_��?�����)���U�_kF�D[�mJ^��D8������?��0���/$4To�*mk�=ՑU)�Q���"t�Wi��f[Sli�l�C���_��Qi�˷��^�Ui�G;�����S���_����,{j�����ۿ*�?5۞��1���*m��4[z�}��o�А�K����?��u��%5��K>o$��E~]����8�n��{p���_4�g ��/�u}oP�H0x#6�z�L?)����f�H2g�`f���a%�_e������3��W���E��_��e͛b���?C���c����E�`��W��7���7�(:�n�D���$�%�}~G"�wmo��%]�ϰ&W�쏫�?,��#B���ZE�U��-��DK;�1q�/�l�۽M@�n�F��E��,z��������x�:�o[ZZ	Tt�y�:�/�D�$��ׄs��z��ʇ�㿶{��Q���~.�__�_���W��8U�����uG�K�+/5*�Q��;�m]���7oݠ����-˗*[���s�m�.�ܞ4F�1+7r�]��]����ݶ��n���{���k���~��-��`x���oo��Ǔ�?�@����S1��S|b��i������3�$(g�e�ڟ�/8���c���/�*��K����y�Mﭮ�x��u�]���?)�F���`�Qg�����a�e�<�iz��	|����W|�th��xG����|:+�	ɾ�>���F�{� ��3jL��_f��uSbZ�~��ĭ��ӹ`�3I��
�g���`���yj󐞵�����Z'$��s�	^�5j�G��%7�f�k�m���~�b�;@?��5(�y�u�k��-n���^��9jd<�H\��t2�C)���7��C�����:[�By���V!�1�1�1�1��
%7�B	B��P��c+� ��
%?�B	B�V(-a��"[���V(AH��I���le�'[�!��a�B��P�pe{J@�"b�2��e�C�2�f�C3�PF�pf!�V&AX�IڥV&A���I�le�:[���_�a�ZB�%S�@D�k�"�)���@Dj "=�"�HD�
"�)��J+��B/]�tb)(���BDj!"��b�f,����r�H;D�"�)��4DD*""���HKD�&"ғ�7P�!Д��yzٵZ%�LrLH&Z+|�E������k�}|�?0�!e��:�_���Y��h�G�t��.�[o	��b��~�^�D��p ��b�I��2�J1�
-�ۗ�x,�å�}��r�opI��%���S�m�'N����'����C�s��p���}��;c�v�/��6P}�q��wݍU�;�~���G�Q��;�}]б�L5����X�=��7w"G�|��z������&��Y�è�f�z���~�vf����@�?��#���>�����B�φ����^����9�2�����2d�]���2�ӓ��j�������T'�Kn/ro�s�?����ڝ?��k�Ͼ,����k�T��
�G+�,��$%J�e�b���{�[#[
����[To���vVZ�L�[~�Nm��|�-���뭕6��������r�(ާ�n=(�歴�N:������C��,]�U�}%Og8�S:�{G���W�@����cj��>�!8|"2���
%, ���LJ7�Y
?�?%���m{�n�lT��X���zP�f#�k���^�����1��3�G�o�JԐ!��#JL�B�/���#5��P��YE�[��1u�����j��6�������7��Ai�P�K�u��o�7��Uˆ?*#�yB�]�Q��o�1��c������u\��JR^?Ւt���ꤎ�� �ܭ���vok�js/�Sk�R���'?%ط��'m�'L^=MI[�V��QJ������8i�:�_%���j��[�ꔩf7��s|��9Z!?�d��[����o�NW��3օ3�U��E-���<~N�Y��B�[M��=
�~�ص�jdܻ�zڧĮ%��|�D�Df���-�^ �k "�zW�d�����zY� ,�fQ��z]��#a�4�HD$"�!����dBDB��d��X�H.Jd��ߑd���cG��^ ��ұ^ Kt@>Jt���HB�@Dċ'\�k
�d�<@LD$'K�@PD$)"��'�	���ED�""y����ȈHfD$4"�����FD�#"���HvD$<K�@z�+�g�R@~�z��F�^)���1�^���5��J���(�Wj�O	�+BA�R��k�+��z�@8�W��^��P��v��ń!
�Qa]� ,�(.�(0�(2�(4�(6�(8�(:�(<�(>�(@�(B�(D�(FA�Q��0!�8!�@!�H!�P!�X!�`!�h!�p!�x!��!��!��!���<
�����P#���!��!��!��!��!��!��!��!��q7���q��K|0����xn^>��P�۾.��˫�6�>�+��ߡ�������9�{⳼���Z���9��(�
\D��sq��L��hp"�\�XҀ�;3	�wgƣ�W��q�GY�l����HH��_�l:����������,�V�lf�G�c�D?Vh9��R���6Żr^�[�[Q���oͳNͶ�U��K���B�����Y���U}[tn~�����?���p'2s�D*H�ء)�q����7����c"��ޟu+�50����s!UԿ~��l��]9Gl�g���}�OU�_G�#��꿲�j���L�"q�ةf���{��-e~��ݾ���	�X�Ϻ�����up�]�K�ߜ��eMꃻU���&S�a�_u�n�7~e=g�,�c�~z���3�ڽ���*��^;���F�`]���of��m���fj���>U��is�_c�M�����}Wp(t�z(�1�=����*���\��+۬�┚�C�Z��J�ރ�:\CŻ�f'/�-l*�ۺ_�9��P��|�w>���o��&���7xüCj�~��[&��6�3����'�fMS�w��6?����s��b��剻�Vk#���
���Rڬ�-��當��qNM�vG
�]�s|�A�Z����;���w�����
��?v���Q:}���y�r�˛�Q�W�t�4>Y�Bp�%�bB{}ʇ	S�h>�a�Q��5"?��w���j��O�=�k��|�C����ޓ^���~U���g�:��;�{\-5��d��S
���!n���?Q��)�v�w{�����y]1��L2SV�v�]66CI�Xq���51j���z]\`STjשԇ���\��w�*^�ɪz6t�E����@�o@����T�������1\��hsc�	����1��q������1��q�j�}:j�S�w�: "������
6���&�#G�<�/�
��D��Iްdj�	��D�"�a�B�"r��D�"r���CD�!"��o��9D�"r���DD2.""'����MD�'"ry��\ED�""g����]D�/"ry����|f=�iD�5"r�y�7��qD�9"r����g)x���GD�#� � � � �& �.0m@D}@D�@D�@D�@D�@D�@D�@D�`�}"M	�0�
:�x�!�&4��5u��5u��5u����#D�$��K��M��"N�-ya�gkz�ݚZ)�q��5ўf�r����������ݑ�7Ú������ӂ�$g�*�����?ƞ��nm��������_���+��]�s6����H��'p��;m6�uL>�|����GZ�UpdS�p�\����g�/�n4��>Ѻ��Ӆ�U�G���!��g�=.������K���~hg>�Гj��^6���u)0��O�������vQ��'��~�9�sj���yԬ�vo�JO�/;��t��%��#���])�����!�]bw
�C�c�x���!��C�;�8��HN��
�n���pG��a��t��q:��.C��l(�2[F��r���M�gn�:s�p癛���>4�s��
�N������UZ��]tU����tkv�l�S�����E��-3K����&��gV��U�?_&����ՙ�KN�iǕ�����9���#���)�N��:v�y�f���
GZ��ٔ�=�^8����GW��>��O4����F]�3C�x�Y�z��:ʹn�?���g��o���(�zT���iB����:��E|�/O���r�l���5b������Iѥ�^�'/���(ߏ�(�R(�(���S��Ly��T��<��x���}�@c�k�4��G��hw���x7)�סmC�R��z���R���z��.S�?J@�nS����
��7
�>�JUT��0=h��MaK��r�.���!z��)��WZ���������lQZ�wy���^����u����l�u|��
�(�CM&}�����?cqU��D?��W�i�LJx�����V8���I����j�ɂ�E�!N8�wu>/ݽ�g�0ڌQ��:�|X�x���7
�rJ��vW�<���k�r���B��:B���q�o�_�1�V5�u��_����/}�g�����x�ʠ�F��㜼��La�1V�<\�(	QB��͏h�R�ޝ|b�sB�K>����
���?�?ձ?��y�u\?�Oz����b��\��u%y��j��x[�dն?CI�UR�&(/؟��O�8Z�<}����0e��O_\Wpt��guP����ϸ�����|V��լ���s|�Ws��L;������|ޑG��=����0s�e��fx���C���ԇ�AZ$�{/��g,�7tH#�@%��!�R�H+�@-D�C�"��F�;�>����CD�1E,�D������tdX���H;$ 5��E�"^��ED�""uYj�""��W�#����҈�?�Yw�
~�8��@uD�;"Ri���GD�#���r�D�� AQ�4 �<0�@D�@D�@D�@D�@D�@D�@D�@D	ADAD)AD9a�����p?M�@ya��2��R��r��������҃����2��R��r�$	e�!H"�C�(D�)D�*D�+D�,D�-D�.D�/D�0��P��K�L��mHu�ڵ	[�E���h?���T���NI���[���0���`}�7O������-���BW��=��`~�a8��/����N��s�HdD$��ugxC8�k���
/�u$���i�1��?C"J�����?*}��u�����ˌ�����z���]�O��o��Kpn:ƿu,�y���;�iy\9�i���f�����<��J5��ԭ�V{��w5������̐.��ww�Pή>�<�C��C��m�{������iR������k�����l�hP�%�Y���w��l�w�vp����;�l�t���n�'�4vu.��~��Ky|p�xg�H�Zg�1,2�;��nlˇ��}�����ۗ_��vq���Vz:9{�|��=Hsz��28�1�?Kn�ߠ���*�wM�m���u�/�8���B�5�'Úe$�:Wh%2��2���dX��dX�ɰ>œa
e��ɰ�3�aM�ɰ�9ɰv�ɰk�eB�a�Óa��$�O��S'V�B�u��������'�J�Vb0�A�Vb4�a-:r��+mZ��A��V�a��a�M+1ZȰҦ�5dXi�J�2�F��$#V:��k[2�UdXwӊ.�.2�_�dX���c��am�$�(O�Us�a�MT�&�����xk��^9ۿ\u�Ohx��_=�{L��T:�O�Y��8�����~�o,���u�㱹��;]>7O���油�J�[�Ln�Ж��ܕ��O�55�jO�����e��>��S�?���g�����!�nN�\~�@Ad���g�,^��_}a|P�|�ň���m�	�����?�/�y]�=D��,����խ��|��5v^��E�W~��5<5_��q��^�u���n��/�����Ȳ�٦[=B��B�������[x�����������-t���^�ki7]�˧����5�6�:�����z�u����r%�^���~U=���e�U�c�t�?�/=U�	D�����#_2;�߿avh�n]�o��kK���ʳ�Ch�뿧��K��ʪj��
<\n�
�?de��n��]]�o|���p�O�V��_n�g��8�;�V�W��f�?Кd�N���χ��ʌ����?�R��,z���ߒo�<�1+T_L
�R�G]�hK׆� ��ԥ���{�o���t�Na�ޛ!�?W��7*�"��d��p�}kP��?�d���?&����Q�)����k�5�b�]w�		
-q�c�5�:Ś�l�ۿ��x��lk�����`(�/�9B?��:���o��U����C�7[���-�?��n�����9��3k��K>���uK������:z����g��=v�;f�;}WvYj	�
;��P�u��ܻXx?~���ԏ��G�)�`�I�8*E��W]���S�q�A���r�L�(;?��u�p�`������<���աB�3&�p��\�n����
J�b';�r�l�XA"�S�;;��Ic�[;)¦A�p��;��{d�ыN��G�Z%5k�k�{�4(;
�ݤx���VN7(6+��ai}�
Ի��J���g��J�F������Z�^c��hD�А�ؘ�ؠ�ب�ذ�ظ����Ȉ�Ј�؈��������B b b 0�`@Ā`A��������N @B���5�`aúN hJ��c�.���Y
;|�U�_,���
�-"""⅁P6�K� �/,M��C�D� D�0p
������!'W�ѩw��t�O������R�Օ���?���>��}�B�o��7R��OW^����/�ì�I��x�U�إ���|�^�Wg��.�.N/�7E���y��0������ꅜ��r�OM^`9�yg�
;[}]��g_���&��W�?����o�&���Zw�Vi3�f�n��@�1В�?�@D���&����q��RI��!D(��!R�	ݞ��;���-=�1�
��/3�ۂK��T�%���>�v��L�?������'e�U�{�c�WDx��ߊ�$������mM��/������zJ�o�w��;
W�.Y���F�n��U�	|��7�u�uC䱺�#m�~ݝ�W���\�a��{7A�7К4��d�������j��x���nK����t��M��+v0���T�7A�O�NΞR��5�������O����n�_nኹ���j�~gW*q������u��*0��M��t�3��Nջ���Y&��`0�*�p�< ^u7p����eU�@�U�2���az������ߔ��A�O�ɴ��*�=���?�s����ia_�����U��rd�8�W������?�:2��t:�>���mKwd&��g�2���CL|�?Iz�_�������W�ae�������o�X�E�׻�,�i��s5��j����j�5��u[@}�E�K˺��V5�[Z��#�>~���Ͷ}���mW��s8�{Y�
�Jk?i���i�vG�G]����w�t��|���=/v:��e���`�%p�]�.�4{��/ja	NM�B2�ZBc㴰�G,&��Zx�K�wC-����?i��mvu߽_���2W�5?��ޖ\��w==�%8��̝c�;x��b�D�1�Q��_Vk��I�؃jq
���֥i�O�w
xf�8��?s�8hc;���yI-ː�A����a�MĄ�{,	�~G�^�\�o����;f4r�^h�6)��
��:Z��3���>������m\z�-?f����w������������!>1F1N1V1^1f1n1v1~��bf∙p�%D�'f�!�B\1��⋙q�1f�!�1�1�1�1��!�1�11111&�ɇ�d&b�!�'"�("�)+�!VY��1��!nY�<�.���e1��q������1���!�Y�!�8C�sD�uD�w&�,H �1�1����D�"��p�!���y
9|��xG�K,�S�mi���x���А�������5��ܓ�^�܀��э����'���o
���w���H�`Z]pG�l�c��;�T�_4�tS�-0w:�Z�e���H!pY���VB�� 3��y!8u@AH�1!46� l�*�����"�.�l�E�<q���9��}���_ܧ�\s�������滞�[K�}�&}皣Rb����h!旇b��؃�q
�P�֙���V<3�<�B���A����͏/���<4�}aX���	�_�}jazD��/�~/��;�*|X��͂1��ձs�,�g�Y�2� iY�j��l�?X8ݜ�,�4k��6l�z�S;�&��5��u>��0O:�I��į m�!uJW���M�1��q��T�΂�5��Y��Q�
w�s"�9Ǟ7�+�n�o�;�[���n�q�(̜=�<+��0{\�y��o���տ}��$���m��XoR�a	ez��'�J�8f6 �	��#Ptj ��E*���d�U2�	��	x�L&��J&`��L-��@%)�	�7�	h%�	b�1x�( pL X@&@�4. ����&�E%p��L�S%���L�}*��f2)*���f2w�d���$dr�db2��d�2�	�S � d�P���x�d���t�6�	�#�	f��Dd�����^#���*�	��L&��	��$cDCD�!"��t�H<D$"I��DDD2""!���HLD$'C ("��!�#��!I���ED�""��ĈHdD$3"I���FDr3�#"������HzD$>"�E�ѣŠ�K~����5�+�%>8�kpW^�f��gζed9��ړ+�����G��	����W������@�_��S^5kx��۽K2߫�����}��=./�}
�W�r6`���7!�#�50����s!������)��¾��Cp����*vF��2�?�jv���|���F�?V����hb�J�y'��u��,�V�67r���c��i�箦뇘��;m`�u���O澉�+�.�9:!]�i;���תQ�9FD�{
ۀnM��W��7)���&''�oP��q��0��N���o�]f��}~�="~�?�gB���'�l���3��l�s��>++�������@�ܬ���ʰ�Ȯ�=`���2n��!���?#˞���U�0>��_��<���_}nOt�t�ְ���vo�Ѕ�4���]����.�^%b�:�%��ܺ���0"�E��'��\W�;����Ϟ�)����?3W�E�/�	���d{�5M�Vy���������?�U�K��Y�+q�突��Oq�>�is��[\�{�2G��+��=܅3#_�J��E�%����o�4��φ��0\�ka�p=�o�51���*�����GFV�5;МaOO�U)�C�Me��-�u�{F�7��9�ϤO=߲����~Z�����~�w��Ͷu���7�mW<.]l���]4�k���w=�],4�k���w�Yh~����e��]�\4�����\:�{��W�;Ok�
W�F�:��9��k��
�=[�\�R�jI��,��E�ɖ�K���xKJ�\�d��-��7����)��b���'N:q�2yIw-m�iה_-}�A�cx�&.��jri����f۵��y���/Ŝ�%��c������-ωygF[�/{@�q��e��TqV��e��8q��ū�\��e��q�7�v���{�cEeÒ�Q���x�S$�^^�#�C��SokF�P��a� b�0��A��A��A�B�0B�PbR�Ĥ�ˆ��$��}mdxq��/N��J��Ӿ����}���}�axq�W/��}��R�´��u񴯫����iS���fӴ0�1�1�1�1�1�1�1�1�1���HD�C�"R�!���聈AD� "U�.�HD�
"R郈BD!"���R�V�H-�@/D�"����tCD�!"��z��r�~a�?T������,�kZe[W���e��D��z���)�?{��W-�{���ki�����]w�[�0��v��+qz�+*��\%�W���j��tӅd���?\"^!���PO�^��X�?&-i�5-ǚ�Ȩr��F��	������S���z�xi��1y���^�r\�w�j��W�o��ì�Y��h�{z����Qv���P���R����F���e_�ٺ
ξ�Cp�x������k����s��5sͣ���60�޺��7����ܼ��a����
8�j�Z�k�ƽ��[,�Ze5߫	ȹ
site-packages/dateutil/zoneinfo/rebuild.py000064400000003231147511334560014736 0ustar00import logging
import os
import tempfile
import shutil
import json
from subprocess import check_call

from dateutil.zoneinfo import tar_open, METADATA_FN, ZONEFILENAME


def rebuild(filename, tag=None, format="gz", zonegroups=[], metadata=None):
    """Rebuild the internal timezone info in dateutil/zoneinfo/zoneinfo*tar*

    filename is the timezone tarball from ftp.iana.org/tz.

    """
    tmpdir = tempfile.mkdtemp()
    zonedir = os.path.join(tmpdir, "zoneinfo")
    moduledir = os.path.dirname(__file__)
    try:
        with tar_open(filename) as tf:
            for name in zonegroups:
                tf.extract(name, tmpdir)
            filepaths = [os.path.join(tmpdir, n) for n in zonegroups]
            try:
                check_call(["zic", "-d", zonedir] + filepaths)
            except OSError as e:
                _print_on_nosuchfile(e)
                raise
        # write metadata file
        with open(os.path.join(zonedir, METADATA_FN), 'w') as f:
            json.dump(metadata, f, indent=4, sort_keys=True)
        target = os.path.join(moduledir, ZONEFILENAME)
        with tar_open(target, "w:%s" % format) as tf:
            for entry in os.listdir(zonedir):
                entrypath = os.path.join(zonedir, entry)
                tf.add(entrypath, entry)
    finally:
        shutil.rmtree(tmpdir)


def _print_on_nosuchfile(e):
    """Print helpful troubleshooting message

    e is an exception raised by subprocess.check_call()

    """
    if e.errno == 2:
        logging.error(
            "Could not find zic. Perhaps you need to install "
            "libc-bin or some other package that provides it, "
            "or it's not in your PATH?")
site-packages/dateutil/zoneinfo/__init__.py000064400000015133147511334560015053 0ustar00# -*- coding: utf-8 -*-
import warnings
import json

from tarfile import TarFile
from pkgutil import get_data
from io import BytesIO
from contextlib import closing

from dateutil.tz import tzfile

__all__ = ["get_zonefile_instance", "gettz", "gettz_db_metadata", "rebuild"]

ZONEFILENAME = "dateutil-zoneinfo.tar.gz"
METADATA_FN = 'METADATA'

# python2.6 compatibility. Note that TarFile.__exit__ != TarFile.close, but
# it's close enough for python2.6
tar_open = TarFile.open
if not hasattr(TarFile, '__exit__'):
    def tar_open(*args, **kwargs):
        return closing(TarFile.open(*args, **kwargs))


class tzfile(tzfile):
    def __reduce__(self):
        return (gettz, (self._filename,))


def getzoneinfofile_stream():
    try:
        return BytesIO(get_data(__name__, ZONEFILENAME))
    except IOError as e:  # TODO  switch to FileNotFoundError?
        warnings.warn("I/O error({0}): {1}".format(e.errno, e.strerror))
        return None


class ZoneInfoFile(object):
    def __init__(self, zonefile_stream=None):
        if zonefile_stream is not None:
            with tar_open(fileobj=zonefile_stream, mode='r') as tf:
                # dict comprehension does not work on python2.6
                # TODO: get back to the nicer syntax when we ditch python2.6
                # self.zones = {zf.name: tzfile(tf.extractfile(zf),
                #               filename = zf.name)
                #              for zf in tf.getmembers() if zf.isfile()}
                self.zones = dict((zf.name, tzfile(tf.extractfile(zf),
                                                   filename=zf.name))
                                  for zf in tf.getmembers()
                                  if zf.isfile() and zf.name != METADATA_FN)
                # deal with links: They'll point to their parent object. Less
                # waste of memory
                # links = {zl.name: self.zones[zl.linkname]
                #        for zl in tf.getmembers() if zl.islnk() or zl.issym()}
                links = dict((zl.name, self.zones[zl.linkname])
                             for zl in tf.getmembers() if
                             zl.islnk() or zl.issym())
                self.zones.update(links)
                try:
                    metadata_json = tf.extractfile(tf.getmember(METADATA_FN))
                    metadata_str = metadata_json.read().decode('UTF-8')
                    self.metadata = json.loads(metadata_str)
                except KeyError:
                    # no metadata in tar file
                    self.metadata = None
        else:
            self.zones = dict()
            self.metadata = None

    def get(self, name, default=None):
        """
        Wrapper for :func:`ZoneInfoFile.zones.get`. This is a convenience method
        for retrieving zones from the zone dictionary.

        :param name:
            The name of the zone to retrieve. (Generally IANA zone names)

        :param default:
            The value to return in the event of a missing key.

        .. versionadded:: 2.6.0

        """
        return self.zones.get(name, default)


# The current API has gettz as a module function, although in fact it taps into
# a stateful class. So as a workaround for now, without changing the API, we
# will create a new "global" class instance the first time a user requests a
# timezone. Ugly, but adheres to the API.
#
# TODO: Remove after deprecation period.
_CLASS_ZONE_INSTANCE = list()


def get_zonefile_instance(new_instance=False):
    """
    This is a convenience function which provides a :class:`ZoneInfoFile`
    instance using the data provided by the ``dateutil`` package. By default, it
    caches a single instance of the ZoneInfoFile object and returns that.

    :param new_instance:
        If ``True``, a new instance of :class:`ZoneInfoFile` is instantiated and
        used as the cached instance for the next call. Otherwise, new instances
        are created only as necessary.

    :return:
        Returns a :class:`ZoneInfoFile` object.

    .. versionadded:: 2.6
    """
    if new_instance:
        zif = None
    else:
        zif = getattr(get_zonefile_instance, '_cached_instance', None)

    if zif is None:
        zif = ZoneInfoFile(getzoneinfofile_stream())

        get_zonefile_instance._cached_instance = zif

    return zif
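
# A minimal usage sketch of get_zonefile_instance() (illustrative only; the
# zone name below is just an example):
#
#     zinfo = get_zonefile_instance()
#     tz = zinfo.get("America/New_York")   # a tzfile instance, or None
#     names = sorted(zinfo.zones)          # all IANA names shipped in the tarball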


def gettz(name):
    """
    This retrieves a time zone from the local zoneinfo tarball that is packaged
    with dateutil.

    :param name:
        An IANA-style time zone name, as found in the zoneinfo file.

    :return:
        Returns a :class:`dateutil.tz.tzfile` time zone object.

    .. warning::
        It is generally inadvisable to use this function, and it is only
        provided for API compatibility with earlier versions. This is *not*
        equivalent to ``dateutil.tz.gettz()``, which selects an appropriate
        time zone based on the inputs, favoring system zoneinfo. This is ONLY
        for accessing the dateutil-specific zoneinfo (which may be out of
        date compared to the system zoneinfo).

    .. deprecated:: 2.6
        If you need to use a specific zoneinfofile over the system zoneinfo,
        instantiate a :class:`dateutil.zoneinfo.ZoneInfoFile` object and call
        :func:`dateutil.zoneinfo.ZoneInfoFile.get(name)` instead.

        Use :func:`get_zonefile_instance` to retrieve an instance of the
        dateutil-provided zoneinfo.
    """
    warnings.warn("zoneinfo.gettz() will be removed in future versions, "
                  "to use the dateutil-provided zoneinfo files, instantiate a "
                  "ZoneInfoFile object and use ZoneInfoFile.zones.get() "
                  "instead. See the documentation for details.",
                  DeprecationWarning)

    if len(_CLASS_ZONE_INSTANCE) == 0:
        _CLASS_ZONE_INSTANCE.append(ZoneInfoFile(getzoneinfofile_stream()))
    return _CLASS_ZONE_INSTANCE[0].zones.get(name)


def gettz_db_metadata():
    """ Get the zonefile metadata

    See `zonefile_metadata`_

    :returns:
        A dictionary with the database metadata

    .. deprecated:: 2.6
        See deprecation warning in :func:`zoneinfo.gettz`. To get metadata,
        query the attribute ``zoneinfo.ZoneInfoFile.metadata``.
    """
    warnings.warn("zoneinfo.gettz_db_metadata() will be removed in future "
                  "versions, to use the dateutil-provided zoneinfo files, "
                  "ZoneInfoFile object and query the 'metadata' attribute "
                  "instead. See the documentation for details.",
                  DeprecationWarning)

    if len(_CLASS_ZONE_INSTANCE) == 0:
        _CLASS_ZONE_INSTANCE.append(ZoneInfoFile(getzoneinfofile_stream()))
    return _CLASS_ZONE_INSTANCE[0].metadata
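

# Illustrative sketch (not part of the original module): the non-deprecated
# path to both the packaged zones and their metadata, as recommended in the
# docstrings above. The zone name is only an example.
if __name__ == "__main__":
    zinfo = get_zonefile_instance()
    print(zinfo.metadata)                    # tzdata metadata dict, or None
    print(zinfo.get("America/New_York"))     # dateutil.tz.tzfile, or None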
site-packages/dateutil/parser.py000064400000143373147511334560012771 0ustar00# -*- coding: utf-8 -*-
"""
This module offers a generic date/time string parser which is able to parse
most known formats to represent a date and/or time.

This module attempts to be forgiving with regards to unlikely input formats,
returning a datetime object even for dates which are ambiguous. If an element
of a date/time stamp is omitted, the following rules are applied:
- If AM or PM is left unspecified, a 24-hour clock is assumed; however, an hour
  on a 12-hour clock (``0 <= hour <= 12``) *must* be specified if AM or PM is
  specified.
- If a time zone is omitted, a timezone-naive datetime is returned.

If any other elements are missing, they are taken from the
:class:`datetime.datetime` object passed to the parameter ``default``. If this
results in a day number exceeding the valid number of days per month, the
value falls back to the end of the month.

Additional resources about date/time string formats can be found below:

- `A summary of the international standard date and time notation
  <http://www.cl.cam.ac.uk/~mgk25/iso-time.html>`_
- `W3C Date and Time Formats <http://www.w3.org/TR/NOTE-datetime>`_
- `Time Formats (Planetary Rings Node) <http://pds-rings.seti.org/tools/time_formats.html>`_
- `CPAN ParseDate module
  <http://search.cpan.org/~muir/Time-modules-2013.0912/lib/Time/ParseDate.pm>`_
- `Java SimpleDateFormat Class
  <https://docs.oracle.com/javase/6/docs/api/java/text/SimpleDateFormat.html>`_
"""
from __future__ import unicode_literals

import datetime
import string
import time
import collections
import re
from io import StringIO
from calendar import monthrange

from six import text_type, binary_type, integer_types

from . import relativedelta
from . import tz

__all__ = ["parse", "parserinfo"]


class _timelex(object):
    # Fractional seconds are sometimes split by a comma
    _split_decimal = re.compile("([.,])")

    def __init__(self, instream):
        if isinstance(instream, binary_type):
            instream = instream.decode()

        if isinstance(instream, text_type):
            instream = StringIO(instream)

        if getattr(instream, 'read', None) is None:
            raise TypeError('Parser must be a string or character stream, not '
                            '{itype}'.format(itype=instream.__class__.__name__))

        self.instream = instream
        self.charstack = []
        self.tokenstack = []
        self.eof = False

    def get_token(self):
        """
        This function breaks the time string into lexical units (tokens), which
        can be parsed by the parser. Lexical units are demarcated by changes in
        the character set, so any continuous string of letters is considered
        one unit, and any continuous string of numbers is considered another
        unit.

        The main complication arises from the fact that dots ('.') can be used
        both as separators (e.g. "Sep.20.2009") or decimal points (e.g.
        "4:30:21.447"). As such, it is necessary to read the full context of
        any dot-separated strings before breaking it into tokens; as such, this
        function maintains a "token stack", for when the ambiguous context
        demands that multiple tokens be parsed at once.
        """
        if self.tokenstack:
            return self.tokenstack.pop(0)

        seenletters = False
        token = None
        state = None

        while not self.eof:
            # We only realize that we've reached the end of a token when we
            # find a character that's not part of the current token - since
            # that character may be part of the next token, it's stored in the
            # charstack.
            if self.charstack:
                nextchar = self.charstack.pop(0)
            else:
                nextchar = self.instream.read(1)
                while nextchar == '\x00':
                    nextchar = self.instream.read(1)

            if not nextchar:
                self.eof = True
                break
            elif not state:
                # First character of the token - determines if we're starting
                # to parse a word, a number or something else.
                token = nextchar
                if self.isword(nextchar):
                    state = 'a'
                elif self.isnum(nextchar):
                    state = '0'
                elif self.isspace(nextchar):
                    token = ' '
                    break  # emit token
                else:
                    break  # emit token
            elif state == 'a':
                # If we've already started reading a word, we keep reading
                # letters until we find something that's not part of a word.
                seenletters = True
                if self.isword(nextchar):
                    token += nextchar
                elif nextchar == '.':
                    token += nextchar
                    state = 'a.'
                else:
                    self.charstack.append(nextchar)
                    break  # emit token
            elif state == '0':
                # If we've already started reading a number, we keep reading
                # numbers until we find something that doesn't fit.
                if self.isnum(nextchar):
                    token += nextchar
                elif nextchar == '.' or (nextchar == ',' and len(token) >= 2):
                    token += nextchar
                    state = '0.'
                else:
                    self.charstack.append(nextchar)
                    break  # emit token
            elif state == 'a.':
                # If we've seen some letters and a dot separator, continue
                # parsing, and the tokens will be broken up later.
                seenletters = True
                if nextchar == '.' or self.isword(nextchar):
                    token += nextchar
                elif self.isnum(nextchar) and token[-1] == '.':
                    token += nextchar
                    state = '0.'
                else:
                    self.charstack.append(nextchar)
                    break  # emit token
            elif state == '0.':
                # If we've seen at least one dot separator, keep going, we'll
                # break up the tokens later.
                if nextchar == '.' or self.isnum(nextchar):
                    token += nextchar
                elif self.isword(nextchar) and token[-1] == '.':
                    token += nextchar
                    state = 'a.'
                else:
                    self.charstack.append(nextchar)
                    break  # emit token

        if (state in ('a.', '0.') and (seenletters or token.count('.') > 1 or
                                       token[-1] in '.,')):
            l = self._split_decimal.split(token)
            token = l[0]
            for tok in l[1:]:
                if tok:
                    self.tokenstack.append(tok)

        if state == '0.' and token.count('.') == 0:
            token = token.replace(',', '.')

        return token

    def __iter__(self):
        return self

    def __next__(self):
        token = self.get_token()
        if token is None:
            raise StopIteration

        return token

    def next(self):
        return self.__next__()  # Python 2.x support

    @classmethod
    def split(cls, s):
        return list(cls(s))

    @classmethod
    def isword(cls, nextchar):
        """ Whether or not the next character is part of a word """
        return nextchar.isalpha()

    @classmethod
    def isnum(cls, nextchar):
        """ Whether the next character is part of a number """
        return nextchar.isdigit()

    @classmethod
    def isspace(cls, nextchar):
        """ Whether the next character is whitespace """
        return nextchar.isspace()
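
# For illustration, the tokenization described in _timelex.get_token() splits
# a representative string as follows (output traced by hand, shown only as an
# example):
#
#     _timelex.split("Sep.20.2009 4:30:21.447")
#     # -> ['Sep', '.', '20', '.', '2009', ' ', '4', ':', '30', ':', '21.447']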


class _resultbase(object):

    def __init__(self):
        for attr in self.__slots__:
            setattr(self, attr, None)

    def _repr(self, classname):
        l = []
        for attr in self.__slots__:
            value = getattr(self, attr)
            if value is not None:
                l.append("%s=%s" % (attr, repr(value)))
        return "%s(%s)" % (classname, ", ".join(l))

    def __len__(self):
        return (sum(getattr(self, attr) is not None
                    for attr in self.__slots__))

    def __repr__(self):
        return self._repr(self.__class__.__name__)


class parserinfo(object):
    """
    Class which handles what inputs are accepted. Subclass this to customize
    the language and acceptable values for each parameter.

    :param dayfirst:
            Whether to interpret the first value in an ambiguous 3-integer date
            (e.g. 01/05/09) as the day (``True``) or month (``False``). If
            ``yearfirst`` is set to ``True``, this distinguishes between YDM
            and YMD. Default is ``False``.

    :param yearfirst:
            Whether to interpret the first value in an ambiguous 3-integer date
            (e.g. 01/05/09) as the year. If ``True``, the first number is taken
            to be the year, otherwise the last number is taken to be the year.
            Default is ``False``.
    """

    # m from a.m/p.m, t from ISO T separator
    JUMP = [" ", ".", ",", ";", "-", "/", "'",
            "at", "on", "and", "ad", "m", "t", "of",
            "st", "nd", "rd", "th"]

    WEEKDAYS = [("Mon", "Monday"),
                ("Tue", "Tuesday"),
                ("Wed", "Wednesday"),
                ("Thu", "Thursday"),
                ("Fri", "Friday"),
                ("Sat", "Saturday"),
                ("Sun", "Sunday")]
    MONTHS = [("Jan", "January"),
              ("Feb", "February"),
              ("Mar", "March"),
              ("Apr", "April"),
              ("May", "May"),
              ("Jun", "June"),
              ("Jul", "July"),
              ("Aug", "August"),
              ("Sep", "Sept", "September"),
              ("Oct", "October"),
              ("Nov", "November"),
              ("Dec", "December")]
    HMS = [("h", "hour", "hours"),
           ("m", "minute", "minutes"),
           ("s", "second", "seconds")]
    AMPM = [("am", "a"),
            ("pm", "p")]
    UTCZONE = ["UTC", "GMT", "Z"]
    PERTAIN = ["of"]
    TZOFFSET = {}

    def __init__(self, dayfirst=False, yearfirst=False):
        self._jump = self._convert(self.JUMP)
        self._weekdays = self._convert(self.WEEKDAYS)
        self._months = self._convert(self.MONTHS)
        self._hms = self._convert(self.HMS)
        self._ampm = self._convert(self.AMPM)
        self._utczone = self._convert(self.UTCZONE)
        self._pertain = self._convert(self.PERTAIN)

        self.dayfirst = dayfirst
        self.yearfirst = yearfirst

        self._year = time.localtime().tm_year
        self._century = self._year // 100 * 100

    def _convert(self, lst):
        dct = {}
        for i, v in enumerate(lst):
            if isinstance(v, tuple):
                for v in v:
                    dct[v.lower()] = i
            else:
                dct[v.lower()] = i
        return dct

    def jump(self, name):
        return name.lower() in self._jump

    def weekday(self, name):
        if len(name) >= min(len(n) for n in self._weekdays.keys()):
            try:
                return self._weekdays[name.lower()]
            except KeyError:
                pass
        return None

    def month(self, name):
        if len(name) >= min(len(n) for n in self._months.keys()):
            try:
                return self._months[name.lower()] + 1
            except KeyError:
                pass
        return None

    def hms(self, name):
        try:
            return self._hms[name.lower()]
        except KeyError:
            return None

    def ampm(self, name):
        try:
            return self._ampm[name.lower()]
        except KeyError:
            return None

    def pertain(self, name):
        return name.lower() in self._pertain

    def utczone(self, name):
        return name.lower() in self._utczone

    def tzoffset(self, name):
        if name in self._utczone:
            return 0

        return self.TZOFFSET.get(name)

    def convertyear(self, year, century_specified=False):
        if year < 100 and not century_specified:
            year += self._century
            if abs(year - self._year) >= 50:
                if year < self._year:
                    year += 100
                else:
                    year -= 100
        return year
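
    # Worked examples for convertyear() (the expansion pivots on the current
    # year, so the exact results are time-dependent):
    #   convertyear(3)         -> 2003, given a present-day pivot
    #   convertyear(69)        -> 1969 or 2069, whichever falls within roughly
    #                             50 years of the current year
    #   convertyear(99, True)  -> 99 (the century was explicitly specified)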

    def validate(self, res):
        # move to info
        if res.year is not None:
            res.year = self.convertyear(res.year, res.century_specified)

        if res.tzoffset == 0 and not res.tzname or res.tzname == 'Z':
            res.tzname = "UTC"
            res.tzoffset = 0
        elif res.tzoffset != 0 and res.tzname and self.utczone(res.tzname):
            res.tzoffset = 0
        return True


class _ymd(list):
    def __init__(self, tzstr, *args, **kwargs):
        super(self.__class__, self).__init__(*args, **kwargs)
        self.century_specified = False
        self.tzstr = tzstr

    @staticmethod
    def token_could_be_year(token, year):
        try:
            return int(token) == year
        except ValueError:
            return False

    @staticmethod
    def find_potential_year_tokens(year, tokens):
        return [token for token in tokens if _ymd.token_could_be_year(token, year)]

    def find_probable_year_index(self, tokens):
        """
        Attempt to deduce if a pre-100 year was lost
        due to padded zeros being taken off.
        """
        for index, token in enumerate(self):
            potential_year_tokens = _ymd.find_potential_year_tokens(token, tokens)
            if len(potential_year_tokens) == 1 and len(potential_year_tokens[0]) > 2:
                return index

    def append(self, val):
        if hasattr(val, '__len__'):
            if val.isdigit() and len(val) > 2:
                self.century_specified = True
        elif val > 100:
            self.century_specified = True

        super(self.__class__, self).append(int(val))

    def resolve_ymd(self, mstridx, yearfirst, dayfirst):
        len_ymd = len(self)
        year, month, day = (None, None, None)

        if len_ymd > 3:
            raise ValueError("More than three YMD values")
        elif len_ymd == 1 or (mstridx != -1 and len_ymd == 2):
            # One member, or two members with a month string
            if mstridx != -1:
                month = self[mstridx]
                del self[mstridx]

            if len_ymd > 1 or mstridx == -1:
                if self[0] > 31:
                    year = self[0]
                else:
                    day = self[0]

        elif len_ymd == 2:
            # Two members with numbers
            if self[0] > 31:
                # 99-01
                year, month = self
            elif self[1] > 31:
                # 01-99
                month, year = self
            elif dayfirst and self[1] <= 12:
                # 13-01
                day, month = self
            else:
                # 01-13
                month, day = self

        elif len_ymd == 3:
            # Three members
            if mstridx == 0:
                month, day, year = self
            elif mstridx == 1:
                if self[0] > 31 or (yearfirst and self[2] <= 31):
                    # 99-Jan-01
                    year, month, day = self
                else:
                    # 01-Jan-01
                    # Give precedence to day-first, since
                    # two-digit years are usually hand-written.
                    day, month, year = self

            elif mstridx == 2:
                # WTF!?
                if self[1] > 31:
                    # 01-99-Jan
                    day, year, month = self
                else:
                    # 99-01-Jan
                    year, day, month = self

            else:
                if self[0] > 31 or \
                    self.find_probable_year_index(_timelex.split(self.tzstr)) == 0 or \
                   (yearfirst and self[1] <= 12 and self[2] <= 31):
                    # 99-01-01
                    if dayfirst and self[2] <= 12:
                        year, day, month = self
                    else:
                        year, month, day = self
                elif self[0] > 12 or (dayfirst and self[1] <= 12):
                    # 13-01-01
                    day, month, year = self
                else:
                    # 01-13-01
                    month, day, year = self

        return year, month, day


class parser(object):
    def __init__(self, info=None):
        self.info = info or parserinfo()

    def parse(self, timestr, default=None, ignoretz=False, tzinfos=None, **kwargs):
        """
        Parse the date/time string into a :class:`datetime.datetime` object.

        :param timestr:
            Any date/time string using the supported formats.

        :param default:
            The default datetime object; if this is a datetime object and not
            ``None``, elements specified in ``timestr`` replace elements in the
            default object.

        :param ignoretz:
            If set ``True``, time zones in parsed strings are ignored and a
            naive :class:`datetime.datetime` object is returned.

        :param tzinfos:
            Additional time zone names / aliases which may be present in the
            string. This argument maps time zone names (and optionally offsets
            from those time zones) to time zones. This parameter can be a
            dictionary with timezone aliases mapping time zone names to time
            zones or a function taking two parameters (``tzname`` and
            ``tzoffset``) and returning a time zone.

            The timezones to which the names are mapped can be an integer
            offset from UTC in minutes or a :class:`tzinfo` object.

            .. doctest::
               :options: +NORMALIZE_WHITESPACE

                >>> from dateutil.parser import parse
                >>> from dateutil.tz import gettz
                >>> tzinfos = {"BRST": -10800, "CST": gettz("America/Chicago")}
                >>> parse("2012-01-19 17:21:00 BRST", tzinfos=tzinfos)
                datetime.datetime(2012, 1, 19, 17, 21, tzinfo=tzoffset(u'BRST', -10800))
                >>> parse("2012-01-19 17:21:00 CST", tzinfos=tzinfos)
                datetime.datetime(2012, 1, 19, 17, 21,
                                  tzinfo=tzfile('/usr/share/zoneinfo/America/Chicago'))

            This parameter is ignored if ``ignoretz`` is set.

        :param **kwargs:
            Keyword arguments as passed to ``_parse()``.

        :return:
            Returns a :class:`datetime.datetime` object or, if the
            ``fuzzy_with_tokens`` option is ``True``, returns a tuple, the
            first element being a :class:`datetime.datetime` object, the second
            a tuple containing the fuzzy tokens.

        :raises ValueError:
            Raised for invalid or unknown string format, if the provided
            :class:`tzinfo` is not in a valid format, or if an invalid date
            would be created.

        :raises TypeError:
            Raised for non-string or character stream input.

        :raises OverflowError:
            Raised if the parsed date exceeds the largest valid C integer on
            your system.
        """

        if default is None:
            default = datetime.datetime.now().replace(hour=0, minute=0,
                                                      second=0, microsecond=0)

        res, skipped_tokens = self._parse(timestr, **kwargs)

        if res is None:
            raise ValueError("Unknown string format")

        if len(res) == 0:
            raise ValueError("String does not contain a date.")

        repl = {}
        for attr in ("year", "month", "day", "hour",
                     "minute", "second", "microsecond"):
            value = getattr(res, attr)
            if value is not None:
                repl[attr] = value

        if 'day' not in repl:
            # If the default day exceeds the last day of the month, fall back to
            # the end of the month.
            cyear = default.year if res.year is None else res.year
            cmonth = default.month if res.month is None else res.month
            cday = default.day if res.day is None else res.day

            if cday > monthrange(cyear, cmonth)[1]:
                repl['day'] = monthrange(cyear, cmonth)[1]

        ret = default.replace(**repl)

        if res.weekday is not None and not res.day:
            ret = ret+relativedelta.relativedelta(weekday=res.weekday)

        if not ignoretz:
            if (isinstance(tzinfos, collections.Callable) or
                    tzinfos and res.tzname in tzinfos):

                if isinstance(tzinfos, collections.Callable):
                    tzdata = tzinfos(res.tzname, res.tzoffset)
                else:
                    tzdata = tzinfos.get(res.tzname)

                if isinstance(tzdata, datetime.tzinfo):
                    tzinfo = tzdata
                elif isinstance(tzdata, text_type):
                    tzinfo = tz.tzstr(tzdata)
                elif isinstance(tzdata, integer_types):
                    tzinfo = tz.tzoffset(res.tzname, tzdata)
                else:
                    raise ValueError("Offset must be tzinfo subclass, "
                                     "tz string, or int offset.")
                ret = ret.replace(tzinfo=tzinfo)
            elif res.tzname and res.tzname in time.tzname:
                ret = ret.replace(tzinfo=tz.tzlocal())
            elif res.tzoffset == 0:
                ret = ret.replace(tzinfo=tz.tzutc())
            elif res.tzoffset:
                ret = ret.replace(tzinfo=tz.tzoffset(res.tzname, res.tzoffset))

        if kwargs.get('fuzzy_with_tokens', False):
            return ret, skipped_tokens
        else:
            return ret

    class _result(_resultbase):
        __slots__ = ["year", "month", "day", "weekday",
                     "hour", "minute", "second", "microsecond",
                     "tzname", "tzoffset", "ampm"]

    def _parse(self, timestr, dayfirst=None, yearfirst=None, fuzzy=False,
               fuzzy_with_tokens=False):
        """
        Private method which performs the heavy lifting of parsing, called from
        ``parse()``, which passes on its ``kwargs`` to this function.

        :param timestr:
            The string to parse.

        :param dayfirst:
            Whether to interpret the first value in an ambiguous 3-integer date
            (e.g. 01/05/09) as the day (``True``) or month (``False``). If
            ``yearfirst`` is set to ``True``, this distinguishes between YDM
            and YMD. If set to ``None``, this value is retrieved from the
            current :class:`parserinfo` object (which itself defaults to
            ``False``).

        :param yearfirst:
            Whether to interpret the first value in an ambiguous 3-integer date
            (e.g. 01/05/09) as the year. If ``True``, the first number is taken
            to be the year, otherwise the last number is taken to be the year.
            If this is set to ``None``, the value is retrieved from the current
            :class:`parserinfo` object (which itself defaults to ``False``).

        :param fuzzy:
            Whether to allow fuzzy parsing, allowing for strings like "Today is
            January 1, 2047 at 8:21:00AM".

        :param fuzzy_with_tokens:
            If ``True``, ``fuzzy`` is automatically set to True, and the parser
            will return a tuple where the first element is the parsed
            :class:`datetime.datetime` datetimestamp and the second element is
            a tuple containing the portions of the string which were ignored:

            .. doctest::

                >>> from dateutil.parser import parse
                >>> parse("Today is January 1, 2047 at 8:21:00AM", fuzzy_with_tokens=True)
                (datetime.datetime(2047, 1, 1, 8, 21), (u'Today is ', u' ', u'at '))

        """
        if fuzzy_with_tokens:
            fuzzy = True

        info = self.info

        if dayfirst is None:
            dayfirst = info.dayfirst

        if yearfirst is None:
            yearfirst = info.yearfirst

        res = self._result()
        l = _timelex.split(timestr)         # Splits the timestr into tokens

        # keep up with the last token skipped so we can recombine
        # consecutively skipped tokens (-2 for when i begins at 0).
        last_skipped_token_i = -2
        skipped_tokens = list()

        try:
            # year/month/day list
            ymd = _ymd(timestr)

            # Index of the month string in ymd
            mstridx = -1

            len_l = len(l)
            i = 0
            while i < len_l:

                # Check if it's a number
                try:
                    value_repr = l[i]
                    value = float(value_repr)
                except ValueError:
                    value = None

                if value is not None:
                    # Token is a number
                    len_li = len(l[i])
                    i += 1

                    if (len(ymd) == 3 and len_li in (2, 4)
                        and res.hour is None and (i >= len_l or (l[i] != ':' and
                                                  info.hms(l[i]) is None))):
                        # 19990101T23[59]
                        s = l[i-1]
                        res.hour = int(s[:2])

                        if len_li == 4:
                            res.minute = int(s[2:])

                    elif len_li == 6 or (len_li > 6 and l[i-1].find('.') == 6):
                        # YYMMDD or HHMMSS[.ss]
                        s = l[i-1]

                        if not ymd and l[i-1].find('.') == -1:
                            #ymd.append(info.convertyear(int(s[:2])))

                            ymd.append(s[:2])
                            ymd.append(s[2:4])
                            ymd.append(s[4:])
                        else:
                            # 19990101T235959[.59]
                            res.hour = int(s[:2])
                            res.minute = int(s[2:4])
                            res.second, res.microsecond = _parsems(s[4:])

                    elif len_li in (8, 12, 14):
                        # YYYYMMDD
                        s = l[i-1]
                        ymd.append(s[:4])
                        ymd.append(s[4:6])
                        ymd.append(s[6:8])

                        if len_li > 8:
                            res.hour = int(s[8:10])
                            res.minute = int(s[10:12])

                            if len_li > 12:
                                res.second = int(s[12:])

                    elif ((i < len_l and info.hms(l[i]) is not None) or
                          (i+1 < len_l and l[i] == ' ' and
                           info.hms(l[i+1]) is not None)):

                        # HH[ ]h or MM[ ]m or SS[.ss][ ]s
                        if l[i] == ' ':
                            i += 1

                        idx = info.hms(l[i])

                        while True:
                            if idx == 0:
                                res.hour = int(value)

                                if value % 1:
                                    res.minute = int(60*(value % 1))

                            elif idx == 1:
                                res.minute = int(value)

                                if value % 1:
                                    res.second = int(60*(value % 1))

                            elif idx == 2:
                                res.second, res.microsecond = \
                                    _parsems(value_repr)

                            i += 1

                            if i >= len_l or idx == 2:
                                break

                            # 12h00
                            try:
                                value_repr = l[i]
                                value = float(value_repr)
                            except ValueError:
                                break
                            else:
                                i += 1
                                idx += 1

                                if i < len_l:
                                    newidx = info.hms(l[i])

                                    if newidx is not None:
                                        idx = newidx

                    elif (i == len_l and l[i-2] == ' ' and
                          info.hms(l[i-3]) is not None):
                        # X h MM or X m SS
                        idx = info.hms(l[i-3])

                        if idx == 0:               # h
                            res.minute = int(value)

                            sec_remainder = value % 1
                            if sec_remainder:
                                res.second = int(60 * sec_remainder)
                        elif idx == 1:             # m
                            res.second, res.microsecond = \
                                _parsems(value_repr)

                        # We don't need to advance the tokens here because the
                        # i == len_l call indicates that we're looking at all
                        # the tokens already.

                    elif i+1 < len_l and l[i] == ':':
                        # HH:MM[:SS[.ss]]
                        res.hour = int(value)
                        i += 1
                        value = float(l[i])
                        res.minute = int(value)

                        if value % 1:
                            res.second = int(60*(value % 1))

                        i += 1

                        if i < len_l and l[i] == ':':
                            res.second, res.microsecond = _parsems(l[i+1])
                            i += 2

                    elif i < len_l and l[i] in ('-', '/', '.'):
                        sep = l[i]
                        ymd.append(value_repr)
                        i += 1

                        if i < len_l and not info.jump(l[i]):
                            try:
                                # 01-01[-01]
                                ymd.append(l[i])
                            except ValueError:
                                # 01-Jan[-01]
                                value = info.month(l[i])

                                if value is not None:
                                    ymd.append(value)
                                    assert mstridx == -1
                                    mstridx = len(ymd)-1
                                else:
                                    return None, None

                            i += 1

                            if i < len_l and l[i] == sep:
                                # We have three members
                                i += 1
                                value = info.month(l[i])

                                if value is not None:
                                    ymd.append(value)
                                    assert mstridx == -1
                                    mstridx = len(ymd)-1
                                else:
                                    ymd.append(l[i])

                                i += 1
                    elif i >= len_l or info.jump(l[i]):
                        if i+1 < len_l and info.ampm(l[i+1]) is not None:
                            # 12 am
                            res.hour = int(value)

                            if res.hour < 12 and info.ampm(l[i+1]) == 1:
                                res.hour += 12
                            elif res.hour == 12 and info.ampm(l[i+1]) == 0:
                                res.hour = 0

                            i += 1
                        else:
                            # Year, month or day
                            ymd.append(value)
                        i += 1
                    elif info.ampm(l[i]) is not None:

                        # 12am
                        res.hour = int(value)

                        if res.hour < 12 and info.ampm(l[i]) == 1:
                            res.hour += 12
                        elif res.hour == 12 and info.ampm(l[i]) == 0:
                            res.hour = 0
                        i += 1

                    elif not fuzzy:
                        return None, None
                    else:
                        i += 1
                    continue

                # Check weekday
                value = info.weekday(l[i])
                if value is not None:
                    res.weekday = value
                    i += 1
                    continue

                # Check month name
                value = info.month(l[i])
                if value is not None:
                    ymd.append(value)
                    assert mstridx == -1
                    mstridx = len(ymd)-1

                    i += 1
                    if i < len_l:
                        if l[i] in ('-', '/'):
                            # Jan-01[-99]
                            sep = l[i]
                            i += 1
                            ymd.append(l[i])
                            i += 1

                            if i < len_l and l[i] == sep:
                                # Jan-01-99
                                i += 1
                                ymd.append(l[i])
                                i += 1

                        elif (i+3 < len_l and l[i] == l[i+2] == ' '
                              and info.pertain(l[i+1])):
                            # Jan of 01
                            # In this case, 01 is clearly year
                            try:
                                value = int(l[i+3])
                            except ValueError:
                                # Wrong guess
                                pass
                            else:
                                # Convert it here to become unambiguous
                                ymd.append(str(info.convertyear(value)))
                            i += 4
                    continue

                # Check am/pm
                value = info.ampm(l[i])
                if value is not None:
                    # For fuzzy parsing, 'a' or 'am' (both valid English words)
                    # may erroneously trigger the AM/PM flag. Deal with that
                    # here.
                    val_is_ampm = True

                    # If there's already an AM/PM flag, this one isn't one.
                    if fuzzy and res.ampm is not None:
                        val_is_ampm = False

                    # If AM/PM is found and hour is not, raise a ValueError
                    if res.hour is None:
                        if fuzzy:
                            val_is_ampm = False
                        else:
                            raise ValueError('No hour specified with ' +
                                             'AM or PM flag.')
                    elif not 0 <= res.hour <= 12:
                        # If AM/PM is found, it's a 12 hour clock, so raise
                        # an error for invalid range
                        if fuzzy:
                            val_is_ampm = False
                        else:
                            raise ValueError('Invalid hour specified for ' +
                                             '12-hour clock.')

                    if val_is_ampm:
                        if value == 1 and res.hour < 12:
                            res.hour += 12
                        elif value == 0 and res.hour == 12:
                            res.hour = 0

                        res.ampm = value

                    elif fuzzy:
                        last_skipped_token_i = self._skip_token(skipped_tokens,
                                                    last_skipped_token_i, i, l)
                    i += 1
                    continue

                # Check for a timezone name
                if (res.hour is not None and len(l[i]) <= 5 and
                        res.tzname is None and res.tzoffset is None and
                        not [x for x in l[i] if x not in
                             string.ascii_uppercase]):
                    res.tzname = l[i]
                    res.tzoffset = info.tzoffset(res.tzname)
                    i += 1

                    # Check for something like GMT+3, or BRST+3. Notice
                    # that it doesn't mean "I am 3 hours after GMT", but
                    # "my time +3 is GMT". If found, we reverse the
                    # logic so that timezone parsing code will get it
                    # right.
                    if i < len_l and l[i] in ('+', '-'):
                        l[i] = ('+', '-')[l[i] == '+']
                        res.tzoffset = None
                        if info.utczone(res.tzname):
                            # With something like GMT+3, the timezone
                            # is *not* GMT.
                            res.tzname = None

                    continue

                # Check for a numbered timezone
                if res.hour is not None and l[i] in ('+', '-'):
                    signal = (-1, 1)[l[i] == '+']
                    i += 1
                    len_li = len(l[i])

                    if len_li == 4:
                        # -0300
                        res.tzoffset = int(l[i][:2])*3600+int(l[i][2:])*60
                    elif i+1 < len_l and l[i+1] == ':':
                        # -03:00
                        res.tzoffset = int(l[i])*3600+int(l[i+2])*60
                        i += 2
                    elif len_li <= 2:
                        # -[0]3
                        res.tzoffset = int(l[i][:2])*3600
                    else:
                        return None, None
                    i += 1

                    res.tzoffset *= signal

                    # Look for a timezone name between parenthesis
                    if (i+3 < len_l and
                        info.jump(l[i]) and l[i+1] == '(' and l[i+3] == ')' and
                        3 <= len(l[i+2]) <= 5 and
                        not [x for x in l[i+2]
                             if x not in string.ascii_uppercase]):
                        # -0300 (BRST)
                        res.tzname = l[i+2]
                        i += 4
                    continue

                # Check jumps
                if not (info.jump(l[i]) or fuzzy):
                    return None, None

                last_skipped_token_i = self._skip_token(skipped_tokens,
                                                last_skipped_token_i, i, l)
                i += 1

            # Process year/month/day
            year, month, day = ymd.resolve_ymd(mstridx, yearfirst, dayfirst)
            if year is not None:
                res.year = year
                res.century_specified = ymd.century_specified

            if month is not None:
                res.month = month

            if day is not None:
                res.day = day

        except (IndexError, ValueError, AssertionError):
            return None, None

        if not info.validate(res):
            return None, None

        if fuzzy_with_tokens:
            return res, tuple(skipped_tokens)
        else:
            return res, None

    @staticmethod
    def _skip_token(skipped_tokens, last_skipped_token_i, i, l):
        if last_skipped_token_i == i - 1:
            # recombine the tokens
            skipped_tokens[-1] += l[i]
        else:
            # just append
            skipped_tokens.append(l[i])
        last_skipped_token_i = i
        return last_skipped_token_i


DEFAULTPARSER = parser()


def parse(timestr, parserinfo=None, **kwargs):
    """

    Parse a string in one of the supported formats, using the
    ``parserinfo`` parameters.

    :param timestr:
        A string containing a date/time stamp.

    :param parserinfo:
        A :class:`parserinfo` object containing parameters for the parser.
        If ``None``, the default arguments to the :class:`parserinfo`
        constructor are used.

    The ``**kwargs`` parameter takes the following keyword arguments:

    :param default:
        The default datetime object; if this is a datetime object and not
        ``None``, elements specified in ``timestr`` replace elements in the
        default object.

    :param ignoretz:
        If set ``True``, time zones in parsed strings are ignored and a naive
        :class:`datetime` object is returned.

    :param tzinfos:
            Additional time zone names / aliases which may be present in the
            string. This argument maps time zone names (and optionally offsets
            from those time zones) to time zones. This parameter can be a
            dictionary with timezone aliases mapping time zone names to time
            zones or a function taking two parameters (``tzname`` and
            ``tzoffset``) and returning a time zone.

            The timezones to which the names are mapped can be an integer
            offset from UTC in minutes or a :class:`tzinfo` object.

            .. doctest::
               :options: +NORMALIZE_WHITESPACE

                >>> from dateutil.parser import parse
                >>> from dateutil.tz import gettz
                >>> tzinfos = {"BRST": -10800, "CST": gettz("America/Chicago")}
                >>> parse("2012-01-19 17:21:00 BRST", tzinfos=tzinfos)
                datetime.datetime(2012, 1, 19, 17, 21, tzinfo=tzoffset(u'BRST', -10800))
                >>> parse("2012-01-19 17:21:00 CST", tzinfos=tzinfos)
                datetime.datetime(2012, 1, 19, 17, 21,
                                  tzinfo=tzfile('/usr/share/zoneinfo/America/Chicago'))

            This parameter is ignored if ``ignoretz`` is set.

    :param dayfirst:
        Whether to interpret the first value in an ambiguous 3-integer date
        (e.g. 01/05/09) as the day (``True``) or month (``False``). If
        ``yearfirst`` is set to ``True``, this distinguishes between YDM and
        YMD. If set to ``None``, this value is retrieved from the current
        :class:`parserinfo` object (which itself defaults to ``False``).

    :param yearfirst:
        Whether to interpret the first value in an ambiguous 3-integer date
        (e.g. 01/05/09) as the year. If ``True``, the first number is taken to
        be the year, otherwise the last number is taken to be the year. If
        this is set to ``None``, the value is retrieved from the current
        :class:`parserinfo` object (which itself defaults to ``False``).

    :param fuzzy:
        Whether to allow fuzzy parsing, allowing for strings like "Today is
        January 1, 2047 at 8:21:00AM".

    :param fuzzy_with_tokens:
        If ``True``, ``fuzzy`` is automatically set to True, and the parser
        will return a tuple where the first element is the parsed
        :class:`datetime.datetime` datetimestamp and the second element is
        a tuple containing the portions of the string which were ignored:

        .. doctest::

            >>> from dateutil.parser import parse
            >>> parse("Today is January 1, 2047 at 8:21:00AM", fuzzy_with_tokens=True)
            (datetime.datetime(2047, 1, 1, 8, 21), (u'Today is ', u' ', u'at '))

    :return:
        Returns a :class:`datetime.datetime` object or, if the
        ``fuzzy_with_tokens`` option is ``True``, returns a tuple, the
        first element being a :class:`datetime.datetime` object, the second
        a tuple containing the fuzzy tokens.

    :raises ValueError:
        Raised for invalid or unknown string format, if the provided
        :class:`tzinfo` is not in a valid format, or if an invalid date
        would be created.

    :raises OverflowError:
        Raised if the parsed date exceeds the largest valid C integer on
        your system.
    """
    if parserinfo:
        return parser(parserinfo).parse(timestr, **kwargs)
    else:
        return DEFAULTPARSER.parse(timestr, **kwargs)
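
# The helper class below parses POSIX TZ-style specifications, e.g.
# "EST5EDT,M3.2.0/2,M11.1.0/2" (an illustrative example, not taken from the
# original code) as well as the purely numeric, comma-separated form shown in
# its inline comments; it is the machinery used by dateutil.tz.tzstr().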


class _tzparser(object):

    class _result(_resultbase):

        __slots__ = ["stdabbr", "stdoffset", "dstabbr", "dstoffset",
                     "start", "end"]

        class _attr(_resultbase):
            __slots__ = ["month", "week", "weekday",
                         "yday", "jyday", "day", "time"]

        def __repr__(self):
            return self._repr("")

        def __init__(self):
            _resultbase.__init__(self)
            self.start = self._attr()
            self.end = self._attr()

    def parse(self, tzstr):
        res = self._result()
        l = _timelex.split(tzstr)
        try:

            len_l = len(l)

            i = 0
            while i < len_l:
                # BRST+3[BRDT[+2]]
                j = i
                while j < len_l and not [x for x in l[j]
                                         if x in "0123456789:,-+"]:
                    j += 1
                if j != i:
                    if not res.stdabbr:
                        offattr = "stdoffset"
                        res.stdabbr = "".join(l[i:j])
                    else:
                        offattr = "dstoffset"
                        res.dstabbr = "".join(l[i:j])
                    i = j
                    if (i < len_l and (l[i] in ('+', '-') or l[i][0] in
                                       "0123456789")):
                        if l[i] in ('+', '-'):
                            # Yes, that's right.  See the TZ variable
                            # documentation.
                            signal = (1, -1)[l[i] == '+']
                            i += 1
                        else:
                            signal = -1
                        len_li = len(l[i])
                        if len_li == 4:
                            # -0300
                            setattr(res, offattr, (int(l[i][:2])*3600 +
                                                   int(l[i][2:])*60)*signal)
                        elif i+1 < len_l and l[i+1] == ':':
                            # -03:00
                            setattr(res, offattr,
                                    (int(l[i])*3600+int(l[i+2])*60)*signal)
                            i += 2
                        elif len_li <= 2:
                            # -[0]3
                            setattr(res, offattr,
                                    int(l[i][:2])*3600*signal)
                        else:
                            return None
                        i += 1
                    if res.dstabbr:
                        break
                else:
                    break

            if i < len_l:
                for j in range(i, len_l):
                    if l[j] == ';':
                        l[j] = ','

                assert l[i] == ','

                i += 1

            if i >= len_l:
                pass
            elif (8 <= l.count(',') <= 9 and
                  not [y for x in l[i:] if x != ','
                       for y in x if y not in "0123456789"]):
                # GMT0BST,3,0,30,3600,10,0,26,7200[,3600]
                for x in (res.start, res.end):
                    x.month = int(l[i])
                    i += 2
                    if l[i] == '-':
                        value = int(l[i+1])*-1
                        i += 1
                    else:
                        value = int(l[i])
                    i += 2
                    if value:
                        x.week = value
                        x.weekday = (int(l[i])-1) % 7
                    else:
                        x.day = int(l[i])
                    i += 2
                    x.time = int(l[i])
                    i += 2
                if i < len_l:
                    if l[i] in ('-', '+'):
                        signal = (-1, 1)[l[i] == "+"]
                        i += 1
                    else:
                        signal = 1
                    res.dstoffset = (res.stdoffset+int(l[i]))*signal
            elif (l.count(',') == 2 and l[i:].count('/') <= 2 and
                  not [y for x in l[i:] if x not in (',', '/', 'J', 'M',
                                                     '.', '-', ':')
                       for y in x if y not in "0123456789"]):
                for x in (res.start, res.end):
                    if l[i] == 'J':
                        # non-leap year day (1 based)
                        i += 1
                        x.jyday = int(l[i])
                    elif l[i] == 'M':
                        # month[-.]week[-.]weekday
                        i += 1
                        x.month = int(l[i])
                        i += 1
                        assert l[i] in ('-', '.')
                        i += 1
                        x.week = int(l[i])
                        if x.week == 5:
                            x.week = -1
                        i += 1
                        assert l[i] in ('-', '.')
                        i += 1
                        x.weekday = (int(l[i])-1) % 7
                    else:
                        # year day (zero based)
                        x.yday = int(l[i])+1

                    i += 1

                    if i < len_l and l[i] == '/':
                        i += 1
                        # start time
                        len_li = len(l[i])
                        if len_li == 4:
                            # -0300
                            x.time = (int(l[i][:2])*3600+int(l[i][2:])*60)
                        elif i+1 < len_l and l[i+1] == ':':
                            # -03:00
                            x.time = int(l[i])*3600+int(l[i+2])*60
                            i += 2
                            if i+1 < len_l and l[i+1] == ':':
                                i += 2
                                x.time += int(l[i])
                        elif len_li <= 2:
                            # -[0]3
                            x.time = (int(l[i][:2])*3600)
                        else:
                            return None
                        i += 1

                    assert i == len_l or l[i] == ','

                    i += 1

                assert i >= len_l

        except (IndexError, ValueError, AssertionError):
            return None

        return res


DEFAULTTZPARSER = _tzparser()


def _parsetz(tzstr):
    return DEFAULTTZPARSER.parse(tzstr)
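
# Illustrative sketch (not part of the original module): what the POSIX-TZ-style
# parser above yields for a simple "EST5EDT" string.
#
#   res = _parsetz("EST5EDT")
#   res.stdabbr    -> 'EST'
#   res.stdoffset  -> -18000     # UTC-5, expressed in seconds
#   res.dstabbr    -> 'EDT'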


def _parsems(value):
    """Parse a I[.F] seconds value into (seconds, microseconds)."""
    if "." not in value:
        return int(value), 0
    else:
        i, f = value.split(".")
        return int(i), int(f.ljust(6, "0")[:6])
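
# Illustrative values (chosen for demonstration): _parsems splits an "I[.F]"
# seconds string, right-padding or truncating the fraction to six digits of
# microseconds.
#
#   _parsems("21")         -> (21, 0)
#   _parsems("21.1")       -> (21, 100000)
#   _parsems("21.1234567") -> (21, 123456)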


# vim:ts=4:sw=4:et
site-packages/dateutil/_common.py000064400000001400147511334560013104 0ustar00"""
Common code used in multiple modules.
"""


class weekday(object):
    __slots__ = ["weekday", "n"]

    def __init__(self, weekday, n=None):
        self.weekday = weekday
        self.n = n

    def __call__(self, n):
        if n == self.n:
            return self
        else:
            return self.__class__(self.weekday, n)

    def __eq__(self, other):
        try:
            if self.weekday != other.weekday or self.n != other.n:
                return False
        except AttributeError:
            return False
        return True

    __hash__ = None

    def __repr__(self):
        s = ("MO", "TU", "WE", "TH", "FR", "SA", "SU")[self.weekday]
        if not self.n:
            return s
        else:
            return "%s(%+d)" % (s, self.n)
site-packages/dateutil/relativedelta.py000064400000055151147511334560014316 0ustar00# -*- coding: utf-8 -*-
import datetime
import calendar

import operator
from math import copysign

from six import integer_types
from warnings import warn

from ._common import weekday

MO, TU, WE, TH, FR, SA, SU = weekdays = tuple(weekday(x) for x in range(7))

__all__ = ["relativedelta", "MO", "TU", "WE", "TH", "FR", "SA", "SU"]


class relativedelta(object):
    """
    The relativedelta type is based on the specification of the excellent
    work done by M.-A. Lemburg in his
    `mx.DateTime <http://www.egenix.com/files/python/mxDateTime.html>`_ extension.
    However, notice that this type does *NOT* implement the same algorithm as
    his work. Do *NOT* expect it to behave like mx.DateTime's counterpart.

    There are two different ways to build a relativedelta instance. The
    first one is passing it two date/datetime classes::

        relativedelta(datetime1, datetime2)

    The second one is passing it any number of the following keyword arguments::

        relativedelta(arg1=x,arg2=y,arg3=z...)

        year, month, day, hour, minute, second, microsecond:
            Absolute information (argument is singular); adding or subtracting a
            relativedelta with absolute information does not perform an arithmetic
            operation, but rather REPLACES the corresponding value in the
            original datetime with the value(s) in relativedelta.

        years, months, weeks, days, hours, minutes, seconds, microseconds:
            Relative information, may be negative (argument is plural); adding
            or subtracting a relativedelta with relative information performs
            the corresponding arithmetic operation on the original datetime value
            with the information in the relativedelta.

        weekday:
            One of the weekday instances (MO, TU, etc). These instances may
            receive a parameter N, specifying the Nth weekday, which could
            be positive or negative (like MO(+1) or MO(-2)). Not specifying
            it is the same as specifying +1. You can also use an integer,
            where 0=MO.

        leapdays:
            Will add the given days to the date found, if the year is a leap
            year and the date found is after February 28.

        yearday, nlyearday:
            Set the yearday or the non-leap year day (jump leap days).
            These are converted to day/month/leapdays information.

    Here is the behavior of operations with relativedelta:

    1. Calculate the absolute year, using the 'year' argument, or the
       original datetime year, if the argument is not present.

    2. Add the relative 'years' argument to the absolute year.

    3. Do steps 1 and 2 for month/months.

    4. Calculate the absolute day, using the 'day' argument, or the
       original datetime day, if the argument is not present. Then,
       subtract from the day until it fits in the year and month
       found after their operations.

    5. Add the relative 'days' argument to the absolute day. Notice
       that the 'weeks' argument is multiplied by 7 and added to
       'days'.

    6. Do steps 1 and 2 for hour/hours, minute/minutes, second/seconds,
       microsecond/microseconds.

    7. If the 'weekday' argument is present, calculate the weekday,
       with the given (wday, nth) tuple. wday is the index of the
       weekday (0-6, 0=Mon), and nth is the number of weeks to add
       forward or backward, depending on its sign. Notice that if
       the calculated date is already Monday, for example, using
       (0, 1) or (0, -1) won't change the day.
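
    For illustration (dates chosen arbitrarily; both construction forms are
    exercised):

    >>> from datetime import datetime
    >>> from dateutil.relativedelta import relativedelta, MO
    >>> relativedelta(datetime(2003, 10, 24), datetime(2003, 9, 17))
    relativedelta(months=+1, days=+7)
    >>> datetime(2003, 9, 17) + relativedelta(months=+1, weekday=MO(+1))
    datetime.datetime(2003, 10, 20, 0, 0)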
    """

    def __init__(self, dt1=None, dt2=None,
                 years=0, months=0, days=0, leapdays=0, weeks=0,
                 hours=0, minutes=0, seconds=0, microseconds=0,
                 year=None, month=None, day=None, weekday=None,
                 yearday=None, nlyearday=None,
                 hour=None, minute=None, second=None, microsecond=None):

        # Check for non-integer values in integer-only quantities
        if any(x is not None and x != int(x) for x in (years, months)):
            raise ValueError("Non-integer years and months are "
                             "ambiguous and not currently supported.")

        if dt1 and dt2:
            # datetime is a subclass of date. So both must be date
            if not (isinstance(dt1, datetime.date) and
                    isinstance(dt2, datetime.date)):
                raise TypeError("relativedelta only diffs datetime/date")

            # We allow two dates, or two datetimes, so we coerce them to be
            # of the same type
            if (isinstance(dt1, datetime.datetime) !=
                    isinstance(dt2, datetime.datetime)):
                if not isinstance(dt1, datetime.datetime):
                    dt1 = datetime.datetime.fromordinal(dt1.toordinal())
                elif not isinstance(dt2, datetime.datetime):
                    dt2 = datetime.datetime.fromordinal(dt2.toordinal())

            self.years = 0
            self.months = 0
            self.days = 0
            self.leapdays = 0
            self.hours = 0
            self.minutes = 0
            self.seconds = 0
            self.microseconds = 0
            self.year = None
            self.month = None
            self.day = None
            self.weekday = None
            self.hour = None
            self.minute = None
            self.second = None
            self.microsecond = None
            self._has_time = 0

            # Get year / month delta between the two
            months = (dt1.year - dt2.year) * 12 + (dt1.month - dt2.month)
            self._set_months(months)

            # Remove the year/month delta so the timedelta is just well-defined
            # time units (seconds, days and microseconds)
            dtm = self.__radd__(dt2)

            # If we've overshot our target, make an adjustment
            if dt1 < dt2:
                compare = operator.gt
                increment = 1
            else:
                compare = operator.lt
                increment = -1

            while compare(dt1, dtm):
                months += increment
                self._set_months(months)
                dtm = self.__radd__(dt2)

            # Get the timedelta between the "months-adjusted" date and dt1
            delta = dt1 - dtm
            self.seconds = delta.seconds + delta.days * 86400
            self.microseconds = delta.microseconds
        else:
            # Relative information
            self.years = years
            self.months = months
            self.days = days + weeks * 7
            self.leapdays = leapdays
            self.hours = hours
            self.minutes = minutes
            self.seconds = seconds
            self.microseconds = microseconds

            # Absolute information
            self.year = year
            self.month = month
            self.day = day
            self.hour = hour
            self.minute = minute
            self.second = second
            self.microsecond = microsecond

            if any(x is not None and int(x) != x
                   for x in (year, month, day, hour,
                             minute, second, microsecond)):
                # For now we'll deprecate floats - later it'll be an error.
                warn("Non-integer value passed as absolute information. " +
                     "This is not a well-defined condition and will raise " +
                     "errors in future versions.", DeprecationWarning)

            if isinstance(weekday, integer_types):
                self.weekday = weekdays[weekday]
            else:
                self.weekday = weekday

            yday = 0
            if nlyearday:
                yday = nlyearday
            elif yearday:
                yday = yearday
                if yearday > 59:
                    self.leapdays = -1
            if yday:
                ydayidx = [31, 59, 90, 120, 151, 181, 212,
                           243, 273, 304, 334, 366]
                for idx, ydays in enumerate(ydayidx):
                    if yday <= ydays:
                        self.month = idx+1
                        if idx == 0:
                            self.day = yday
                        else:
                            self.day = yday-ydayidx[idx-1]
                        break
                else:
                    raise ValueError("invalid year day (%d)" % yday)

        self._fix()

    def _fix(self):
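        # Normalize the relative fields in place: carry overflow upward
        # (microseconds -> seconds -> minutes -> hours -> days, and
        # months -> years), then record whether any time component is set.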
        if abs(self.microseconds) > 999999:
            s = _sign(self.microseconds)
            div, mod = divmod(self.microseconds * s, 1000000)
            self.microseconds = mod * s
            self.seconds += div * s
        if abs(self.seconds) > 59:
            s = _sign(self.seconds)
            div, mod = divmod(self.seconds * s, 60)
            self.seconds = mod * s
            self.minutes += div * s
        if abs(self.minutes) > 59:
            s = _sign(self.minutes)
            div, mod = divmod(self.minutes * s, 60)
            self.minutes = mod * s
            self.hours += div * s
        if abs(self.hours) > 23:
            s = _sign(self.hours)
            div, mod = divmod(self.hours * s, 24)
            self.hours = mod * s
            self.days += div * s
        if abs(self.months) > 11:
            s = _sign(self.months)
            div, mod = divmod(self.months * s, 12)
            self.months = mod * s
            self.years += div * s
        if (self.hours or self.minutes or self.seconds or self.microseconds
                or self.hour is not None or self.minute is not None or
                self.second is not None or self.microsecond is not None):
            self._has_time = 1
        else:
            self._has_time = 0

    @property
    def weeks(self):
        return self.days // 7

    @weeks.setter
    def weeks(self, value):
        self.days = self.days - (self.weeks * 7) + value * 7
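
    # Illustrative only: with relativedelta(days=10), .weeks reads as 1; setting
    # .weeks = 3 rewrites days to 24, keeping the 3 leftover days intact.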

    def _set_months(self, months):
        self.months = months
        if abs(self.months) > 11:
            s = _sign(self.months)
            div, mod = divmod(self.months * s, 12)
            self.months = mod * s
            self.years = div * s
        else:
            self.years = 0

    def normalized(self):
        """
        Return a version of this object represented entirely using integer
        values for the relative attributes.

        >>> relativedelta(days=1.5, hours=2).normalized()
        relativedelta(days=1, hours=14)

        :return:
            Returns a :class:`dateutil.relativedelta.relativedelta` object.
        """
        # Cascade remainders down (rounding each to roughly nearest microsecond)
        days = int(self.days)

        hours_f = round(self.hours + 24 * (self.days - days), 11)
        hours = int(hours_f)

        minutes_f = round(self.minutes + 60 * (hours_f - hours), 10)
        minutes = int(minutes_f)

        seconds_f = round(self.seconds + 60 * (minutes_f - minutes), 8)
        seconds = int(seconds_f)

        microseconds = round(self.microseconds + 1e6 * (seconds_f - seconds))

        # Constructor carries overflow back up with call to _fix()
        return self.__class__(years=self.years, months=self.months,
                              days=days, hours=hours, minutes=minutes,
                              seconds=seconds, microseconds=microseconds,
                              leapdays=self.leapdays, year=self.year,
                              month=self.month, day=self.day,
                              weekday=self.weekday, hour=self.hour,
                              minute=self.minute, second=self.second,
                              microsecond=self.microsecond)

    def __add__(self, other):
        if isinstance(other, relativedelta):
            return self.__class__(years=other.years + self.years,
                                 months=other.months + self.months,
                                 days=other.days + self.days,
                                 hours=other.hours + self.hours,
                                 minutes=other.minutes + self.minutes,
                                 seconds=other.seconds + self.seconds,
                                 microseconds=(other.microseconds +
                                               self.microseconds),
                                 leapdays=other.leapdays or self.leapdays,
                                 year=(other.year if other.year is not None
                                       else self.year),
                                 month=(other.month if other.month is not None
                                        else self.month),
                                 day=(other.day if other.day is not None
                                      else self.day),
                                 weekday=(other.weekday if other.weekday is not None
                                          else self.weekday),
                                 hour=(other.hour if other.hour is not None
                                       else self.hour),
                                 minute=(other.minute if other.minute is not None
                                         else self.minute),
                                 second=(other.second if other.second is not None
                                         else self.second),
                                 microsecond=(other.microsecond if other.microsecond
                                              is not None else
                                              self.microsecond))
        if isinstance(other, datetime.timedelta):
            return self.__class__(years=self.years,
                                  months=self.months,
                                  days=self.days + other.days,
                                  hours=self.hours,
                                  minutes=self.minutes,
                                  seconds=self.seconds + other.seconds,
                                  microseconds=self.microseconds + other.microseconds,
                                  leapdays=self.leapdays,
                                  year=self.year,
                                  month=self.month,
                                  day=self.day,
                                  weekday=self.weekday,
                                  hour=self.hour,
                                  minute=self.minute,
                                  second=self.second,
                                  microsecond=self.microsecond)
        if not isinstance(other, datetime.date):
            return NotImplemented
        elif self._has_time and not isinstance(other, datetime.datetime):
            other = datetime.datetime.fromordinal(other.toordinal())
        year = (self.year or other.year)+self.years
        month = self.month or other.month
        if self.months:
            assert 1 <= abs(self.months) <= 12
            month += self.months
            if month > 12:
                year += 1
                month -= 12
            elif month < 1:
                year -= 1
                month += 12
        day = min(calendar.monthrange(year, month)[1],
                  self.day or other.day)
        repl = {"year": year, "month": month, "day": day}
        for attr in ["hour", "minute", "second", "microsecond"]:
            value = getattr(self, attr)
            if value is not None:
                repl[attr] = value
        days = self.days
        if self.leapdays and month > 2 and calendar.isleap(year):
            days += self.leapdays
        ret = (other.replace(**repl)
               + datetime.timedelta(days=days,
                                    hours=self.hours,
                                    minutes=self.minutes,
                                    seconds=self.seconds,
                                    microseconds=self.microseconds))
        if self.weekday:
            weekday, nth = self.weekday.weekday, self.weekday.n or 1
            jumpdays = (abs(nth) - 1) * 7
            if nth > 0:
                jumpdays += (7 - ret.weekday() + weekday) % 7
            else:
                jumpdays += (ret.weekday() - weekday) % 7
                jumpdays *= -1
            ret += datetime.timedelta(days=jumpdays)
        return ret

    def __radd__(self, other):
        return self.__add__(other)

    def __rsub__(self, other):
        return self.__neg__().__radd__(other)

    def __sub__(self, other):
        if not isinstance(other, relativedelta):
            return NotImplemented   # In case the other object defines __rsub__
        return self.__class__(years=self.years - other.years,
                             months=self.months - other.months,
                             days=self.days - other.days,
                             hours=self.hours - other.hours,
                             minutes=self.minutes - other.minutes,
                             seconds=self.seconds - other.seconds,
                             microseconds=self.microseconds - other.microseconds,
                             leapdays=self.leapdays or other.leapdays,
                             year=(self.year if self.year is not None
                                   else other.year),
                             month=(self.month if self.month is not None else
                                    other.month),
                             day=(self.day if self.day is not None else
                                  other.day),
                             weekday=(self.weekday if self.weekday is not None else
                                      other.weekday),
                             hour=(self.hour if self.hour is not None else
                                   other.hour),
                             minute=(self.minute if self.minute is not None else
                                     other.minute),
                             second=(self.second if self.second is not None else
                                     other.second),
                             microsecond=(self.microsecond if self.microsecond
                                          is not None else
                                          other.microsecond))

    def __neg__(self):
        return self.__class__(years=-self.years,
                             months=-self.months,
                             days=-self.days,
                             hours=-self.hours,
                             minutes=-self.minutes,
                             seconds=-self.seconds,
                             microseconds=-self.microseconds,
                             leapdays=self.leapdays,
                             year=self.year,
                             month=self.month,
                             day=self.day,
                             weekday=self.weekday,
                             hour=self.hour,
                             minute=self.minute,
                             second=self.second,
                             microsecond=self.microsecond)

    def __bool__(self):
        return not (not self.years and
                    not self.months and
                    not self.days and
                    not self.hours and
                    not self.minutes and
                    not self.seconds and
                    not self.microseconds and
                    not self.leapdays and
                    self.year is None and
                    self.month is None and
                    self.day is None and
                    self.weekday is None and
                    self.hour is None and
                    self.minute is None and
                    self.second is None and
                    self.microsecond is None)
    # Compatibility with Python 2.x
    __nonzero__ = __bool__

    def __mul__(self, other):
        try:
            f = float(other)
        except TypeError:
            return NotImplemented

        return self.__class__(years=int(self.years * f),
                             months=int(self.months * f),
                             days=int(self.days * f),
                             hours=int(self.hours * f),
                             minutes=int(self.minutes * f),
                             seconds=int(self.seconds * f),
                             microseconds=int(self.microseconds * f),
                             leapdays=self.leapdays,
                             year=self.year,
                             month=self.month,
                             day=self.day,
                             weekday=self.weekday,
                             hour=self.hour,
                             minute=self.minute,
                             second=self.second,
                             microsecond=self.microsecond)

    __rmul__ = __mul__

    def __eq__(self, other):
        if not isinstance(other, relativedelta):
            return NotImplemented
        if self.weekday or other.weekday:
            if not self.weekday or not other.weekday:
                return False
            if self.weekday.weekday != other.weekday.weekday:
                return False
            n1, n2 = self.weekday.n, other.weekday.n
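            # An unspecified n and an explicit n of 1 are treated as equivalent,
            # so MO compares equal to MO(+1).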
            if n1 != n2 and not ((not n1 or n1 == 1) and (not n2 or n2 == 1)):
                return False
        return (self.years == other.years and
                self.months == other.months and
                self.days == other.days and
                self.hours == other.hours and
                self.minutes == other.minutes and
                self.seconds == other.seconds and
                self.microseconds == other.microseconds and
                self.leapdays == other.leapdays and
                self.year == other.year and
                self.month == other.month and
                self.day == other.day and
                self.hour == other.hour and
                self.minute == other.minute and
                self.second == other.second and
                self.microsecond == other.microsecond)

    __hash__ = None

    def __ne__(self, other):
        return not self.__eq__(other)

    def __div__(self, other):
        try:
            reciprocal = 1 / float(other)
        except TypeError:
            return NotImplemented

        return self.__mul__(reciprocal)

    __truediv__ = __div__

    def __repr__(self):
        l = []
        for attr in ["years", "months", "days", "leapdays",
                     "hours", "minutes", "seconds", "microseconds"]:
            value = getattr(self, attr)
            if value:
                l.append("{attr}={value:+g}".format(attr=attr, value=value))
        for attr in ["year", "month", "day", "weekday",
                     "hour", "minute", "second", "microsecond"]:
            value = getattr(self, attr)
            if value is not None:
                l.append("{attr}={value}".format(attr=attr, value=repr(value)))
        return "{classname}({attrs})".format(classname=self.__class__.__name__,
                                             attrs=", ".join(l))


def _sign(x):
    return int(copysign(1, x))

# vim:ts=4:sw=4:et
site-packages/dnfpluginscore/__pycache__/__init__.cpython-36.pyc000064400000000670147511334560020717 0ustar00
site-packages/dnfpluginscore/__pycache__/__init__.cpython-36.opt-1.pyc000064400000000670147511334560021656 0ustar00
site-packages/dnfpluginscore/__init__.py000064400000002340147511334560014427 0ustar00# Copyright (C) 2014  Red Hat, Inc.
#
# This copyrighted material is made available to anyone wishing to use,
# modify, copy, or redistribute it subject to the terms and conditions of
# the GNU General Public License v.2, or (at your option) any later version.
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY expressed or implied, including the implied warranties of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General
# Public License for more details.  You should have received a copy of the
# GNU General Public License along with this program; if not, write to the
# Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
# 02110-1301, USA.  Any Red Hat trademarks that are incorporated in the
# source code or documentation are not subject to the GNU General Public
# License and may only be used or replicated with the express permission of
# Red Hat, Inc.
#

""" Common code for dnf-plugins-core"""
from __future__ import absolute_import
from __future__ import unicode_literals

import dnf.exceptions
import logging

_, P_ = dnf.i18n.translation('dnf-plugins-core')
logger = logging.getLogger('dnf.plugin')
rpm_logger = logging.getLogger('dnf.rpm')
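
# Illustrative usage (an assumption about the typical plugin pattern, not part
# of this file): a dnf-plugins-core plugin would usually do
#   from dnfpluginscore import _, logger
#   logger.debug(_("some translatable message"))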


site-packages/pip-9.0.3.dist-info/top_level.txt000064400000000004147511334560015151 0ustar00pip
site-packages/pip-9.0.3.dist-info/INSTALLER000064400000000004147511334560013677 0ustar00rpm
site-packages/pip-9.0.3.dist-info/DESCRIPTION.rst000064400000002407147511334560014745 0ustar00pip
===

The `PyPA recommended
<https://packaging.python.org/en/latest/current/>`_
tool for installing Python packages.

* `Installation <https://pip.pypa.io/en/stable/installing.html>`_
* `Documentation <https://pip.pypa.io/>`_
* `Changelog <https://pip.pypa.io/en/stable/news.html>`_
* `Github Page <https://github.com/pypa/pip>`_
* `Issue Tracking <https://github.com/pypa/pip/issues>`_
* `User mailing list <http://groups.google.com/group/python-virtualenv>`_
* `Dev mailing list <http://groups.google.com/group/pypa-dev>`_
* User IRC: #pypa on Freenode.
* Dev IRC: #pypa-dev on Freenode.


.. image:: https://img.shields.io/pypi/v/pip.svg
   :target: https://pypi.python.org/pypi/pip

.. image:: https://img.shields.io/travis/pypa/pip/master.svg
   :target: http://travis-ci.org/pypa/pip

.. image:: https://img.shields.io/appveyor/ci/pypa/pip.svg
   :target: https://ci.appveyor.com/project/pypa/pip/history

.. image:: https://readthedocs.org/projects/pip/badge/?version=stable
   :target: https://pip.pypa.io/en/stable

Code of Conduct
---------------

Everyone interacting in the pip project's codebases, issue trackers, chat
rooms, and mailing lists is expected to follow the `PyPA Code of Conduct`_.

.. _PyPA Code of Conduct: https://www.pypa.io/en/latest/code-of-conduct/


site-packages/pip-9.0.3.dist-info/RECORD000064400000117164147511334560013340 0ustar00pip/__init__.py,sha256=4lW9KYLzqHLnjA4LqDJpzrQvCoV1WcsiCpQv6UcaWhU,11934
pip/__main__.py,sha256=V6Kh-IEDEFpt1cahRE6MajUF_14qJR_Qsvn4MjWZXzE,584
pip/basecommand.py,sha256=TTlmZesQ4Vuxcto2KqwZGmgmN5ioHEl_DeFev9ie_SA,11910
pip/baseparser.py,sha256=AKMOeF3fTrRroiv0DmTQbdiLW0DQux2KqGC_dJJB9d0,10465
pip/cmdoptions.py,sha256=8JCcF2kKAF2cFnV77oW-3DsHJifr9jF2WuChzzwgcwg,16474
pip/download.py,sha256=qZIbS-XFZeHSy4Ub_4nRwS9eyEj6vfwm0K8fSLIdzAQ,32847
pip/exceptions.py,sha256=BvqH-Jw3tP2b-2IJ2kjrQemOAPMqKrQMLRIZHZQpJXk,8121
pip/index.py,sha256=n31FFa3urQ8xHpMXOnVUAsVao71UZiltOX7jVz31PhA,41559
pip/locations.py,sha256=9rJRlgonC6QC2zGDIn_7mXaoZ9_tF_IHM2BQhWVRgbo,5626
pip/pep425tags.py,sha256=q3kec4f6NHszuGYIhGIbVvs896D06uJAnKFgJ_wce44,10980
pip/status_codes.py,sha256=F6uDG6Gj7RNKQJUDnd87QKqI16Us-t-B0wPF_4QMpWc,156
pip/wheel.py,sha256=sYlucHaRqyQZJ-e6H-190HHnJQhL5mT0tlNQN3yQ0Ws,32287
pip/_vendor/__init__.py,sha256=WaaSJ3roSSJ_Uv4yKAxlGohKEH9YUA3aIh1Xg2IjfgU,4670
pip/_vendor/appdirs.py,sha256=-9UOIZy62ahCQVY9-b7Nn6_5_4Y6ooHnv72tM8iHi9Y,22368
pip/_vendor/distro.py,sha256=8yTxDnhLRpIS9EUcz1r_143-KeZFapXaxeoTivnlZig,38381
pip/_vendor/ipaddress.py,sha256=wimbqcE7rwwETlucn8A_4Qd_-NKXPOBcNxJHarUoXng,80176
pip/_vendor/pyparsing.py,sha256=7vAuUVbh6txUKQR2IzJ8_9DKmD5vtm5MDssWkI0ka8o,224171
pip/_vendor/re-vendor.py,sha256=PcdZ40d0ohMsdJmA4t0AeAWbPXi1tFsvAwA5KE5FGeY,773
pip/_vendor/retrying.py,sha256=k3fflf5_Mm0XcIJYhB7Tj34bqCCPhUDkYbx1NvW2FPE,9972
pip/_vendor/six.py,sha256=A6hdJZVjI3t_geebZ9BzUvwRrIXo0lfwzQlM2LcKyas,30098
pip/_vendor/cachecontrol/__init__.py,sha256=UPyFlz0dIjxusu5ITig9UDFJdSY5LTwijhldn0AfyzU,302
pip/_vendor/cachecontrol/_cmd.py,sha256=MPxZfZd2LKDzVrs55X3wA1rsI2YuP8evLZSwQj0dIk0,1320
pip/_vendor/cachecontrol/adapter.py,sha256=RaGYyRA-RA1J0AnE67GzEYFPBu4YH4EQUvQqTKa57iM,4608
pip/_vendor/cachecontrol/cache.py,sha256=xtl-V-pr9KSt9VvFDRCB9yrHPEvqvbk-5M1vAInZb5k,790
pip/_vendor/cachecontrol/compat.py,sha256=2MTOyI1JlG_gJpfuy3-UQQlKMRiJimR-XXB0sr44wj0,380
pip/_vendor/cachecontrol/controller.py,sha256=elDsLcaYA15ncodRmHnWQp6ekU_ocEGtDeGLbsnTjzo,13024
pip/_vendor/cachecontrol/filewrapper.py,sha256=_K8cStmXqD33m15PfsQ8rlpo6FfXjVbKmjvLXyICRgI,2531
pip/_vendor/cachecontrol/heuristics.py,sha256=WtJrVsyWjpP9WoUiDVdTZZRNBCz5ZVptaQpYnqofDQU,4141
pip/_vendor/cachecontrol/serialize.py,sha256=XM6elG9DSNexwaOCgMjUtfrHHW5NAB6TSbIf3x235xs,6536
pip/_vendor/cachecontrol/wrapper.py,sha256=Kqyu_3TW_54XDudha4-HF21vyEOAJ4ZnRXFysTiLmXA,498
pip/_vendor/cachecontrol/caches/__init__.py,sha256=uWnUtyMvHY_LULaL_4_IR1F_xPgK5zHfJyRnBq4DnPE,369
pip/_vendor/cachecontrol/caches/file_cache.py,sha256=FsDug3bwUAQ3okjjfGzxlDaBf2fwVSn1iBKMTL6SyGU,3532
pip/_vendor/cachecontrol/caches/redis_cache.py,sha256=XywqxkS9MkCaflTOY_wjrE02neKdywB9YwlOBbP7Ywc,973
pip/_vendor/certifi/__init__.py,sha256=QSRy1UztE-i09IuGIKKuc190k07lt6ktabbelPMIZoc,63
pip/_vendor/certifi/__main__.py,sha256=FiOYt1Fltst7wk9DRa6GCoBr8qBUxlNQu_MKJf04E6s,41
pip/_vendor/certifi/core.py,sha256=9MGV_bfdXHlJJ18qDuEEi_QvAbPUsgK8YggA2b70tqg,806
pip/_vendor/chardet/__init__.py,sha256=YsP5wQlsHJ2auF1RZJfypiSrCA7_bQiRm3ES_NI76-Y,1559
pip/_vendor/chardet/big5freq.py,sha256=D_zK5GyzoVsRes0HkLJziltFQX0bKCLOrFe9_xDvO_8,31254
pip/_vendor/chardet/big5prober.py,sha256=kBxHbdetBpPe7xrlb-e990iot64g_eGSLd32lB7_h3M,1757
pip/_vendor/chardet/chardistribution.py,sha256=3woWS62KrGooKyqz4zQSnjFbJpa6V7g02daAibTwcl8,9411
pip/_vendor/chardet/charsetgroupprober.py,sha256=6bDu8YIiRuScX4ca9Igb0U69TA2PGXXDej6Cc4_9kO4,3787
pip/_vendor/chardet/charsetprober.py,sha256=KSmwJErjypyj0bRZmC5F5eM7c8YQgLYIjZXintZNstg,5110
pip/_vendor/chardet/codingstatemachine.py,sha256=VYp_6cyyki5sHgXDSZnXW4q1oelHc3cu9AyQTX7uug8,3590
pip/_vendor/chardet/compat.py,sha256=PKTzHkSbtbHDqS9PyujMbX74q1a8mMpeQTDVsQhZMRw,1134
pip/_vendor/chardet/cp949prober.py,sha256=TZ434QX8zzBsnUvL_8wm4AQVTZ2ZkqEEQL_lNw9f9ow,1855
pip/_vendor/chardet/enums.py,sha256=Aimwdb9as1dJKZaFNUH2OhWIVBVd6ZkJJ_WK5sNY8cU,1661
pip/_vendor/chardet/escprober.py,sha256=kkyqVg1Yw3DIOAMJ2bdlyQgUFQhuHAW8dUGskToNWSc,3950
pip/_vendor/chardet/escsm.py,sha256=RuXlgNvTIDarndvllNCk5WZBIpdCxQ0kcd9EAuxUh84,10510
pip/_vendor/chardet/eucjpprober.py,sha256=iD8Jdp0ISRjgjiVN7f0e8xGeQJ5GM2oeZ1dA8nbSeUw,3749
pip/_vendor/chardet/euckrfreq.py,sha256=-7GdmvgWez4-eO4SuXpa7tBiDi5vRXQ8WvdFAzVaSfo,13546
pip/_vendor/chardet/euckrprober.py,sha256=MqFMTQXxW4HbzIpZ9lKDHB3GN8SP4yiHenTmf8g_PxY,1748
pip/_vendor/chardet/euctwfreq.py,sha256=No1WyduFOgB5VITUA7PLyC5oJRNzRyMbBxaKI1l16MA,31621
pip/_vendor/chardet/euctwprober.py,sha256=13p6EP4yRaxqnP4iHtxHOJ6R2zxHq1_m8hTRjzVZ95c,1747
pip/_vendor/chardet/gb2312freq.py,sha256=JX8lsweKLmnCwmk8UHEQsLgkr_rP_kEbvivC4qPOrlc,20715
pip/_vendor/chardet/gb2312prober.py,sha256=gGvIWi9WhDjE-xQXHvNIyrnLvEbMAYgyUSZ65HUfylw,1754
pip/_vendor/chardet/hebrewprober.py,sha256=c3SZ-K7hvyzGY6JRAZxJgwJ_sUS9k0WYkvMY00YBYFo,13838
pip/_vendor/chardet/jisfreq.py,sha256=vpmJv2Bu0J8gnMVRPHMFefTRvo_ha1mryLig8CBwgOg,25777
pip/_vendor/chardet/jpcntx.py,sha256=PYlNqRUQT8LM3cT5FmHGP0iiscFlTWED92MALvBungo,19643
pip/_vendor/chardet/langbulgarianmodel.py,sha256=1HqQS9Pbtnj1xQgxitJMvw8X6kKr5OockNCZWfEQrPE,12839
pip/_vendor/chardet/langcyrillicmodel.py,sha256=LODajvsetH87yYDDQKA2CULXUH87tI223dhfjh9Zx9c,17948
pip/_vendor/chardet/langgreekmodel.py,sha256=8YAW7bU8YwSJap0kIJSbPMw1BEqzGjWzqcqf0WgUKAA,12688
pip/_vendor/chardet/langhebrewmodel.py,sha256=JSnqmE5E62tDLTPTvLpQsg5gOMO4PbdWRvV7Avkc0HA,11345
pip/_vendor/chardet/langhungarianmodel.py,sha256=RhapYSG5l0ZaO-VV4Fan5sW0WRGQqhwBM61yx3yxyOA,12592
pip/_vendor/chardet/langthaimodel.py,sha256=8l0173Gu_W6G8mxmQOTEF4ls2YdE7FxWf3QkSxEGXJQ,11290
pip/_vendor/chardet/langturkishmodel.py,sha256=W22eRNJsqI6uWAfwXSKVWWnCerYqrI8dZQTm_M0lRFk,11102
pip/_vendor/chardet/latin1prober.py,sha256=S2IoORhFk39FEFOlSFWtgVybRiP6h7BlLldHVclNkU8,5370
pip/_vendor/chardet/mbcharsetprober.py,sha256=AR95eFH9vuqSfvLQZN-L5ijea25NOBCoXqw8s5O9xLQ,3413
pip/_vendor/chardet/mbcsgroupprober.py,sha256=h6TRnnYq2OxG1WdD5JOyxcdVpn7dG0q-vB8nWr5mbh4,2012
pip/_vendor/chardet/mbcssm.py,sha256=SY32wVIF3HzcjY3BaEspy9metbNSKxIIB0RKPn7tjpI,25481
pip/_vendor/chardet/sbcharsetprober.py,sha256=LDSpCldDCFlYwUkGkwD2oFxLlPWIWXT09akH_2PiY74,5657
pip/_vendor/chardet/sbcsgroupprober.py,sha256=1IprcCB_k1qfmnxGC6MBbxELlKqD3scW6S8YIwdeyXA,3546
pip/_vendor/chardet/sjisprober.py,sha256=IIt-lZj0WJqK4rmUZzKZP4GJlE8KUEtFYVuY96ek5MQ,3774
pip/_vendor/chardet/universaldetector.py,sha256=qL0174lSZE442eB21nnktT9_VcAye07laFWUeUrjttY,12485
pip/_vendor/chardet/utf8prober.py,sha256=IdD8v3zWOsB8OLiyPi-y_fqwipRFxV9Nc1eKBLSuIEw,2766
pip/_vendor/chardet/version.py,sha256=sp3B08mrDXB-pf3K9fqJ_zeDHOCLC8RrngQyDFap_7g,242
pip/_vendor/chardet/cli/__init__.py,sha256=AbpHGcgLb-kRsJGnwFEktk7uzpZOCcBY74-YBdrKVGs,1
pip/_vendor/chardet/cli/chardetect.py,sha256=YBO8L4mXo0WR6_-Fjh_8QxPBoEBNqB9oNxNrdc54AQs,2738
pip/_vendor/colorama/__init__.py,sha256=9xByrTvk9upkL5NGV5It2Eje4-kzNLwa_1lGPWpXoNU,240
pip/_vendor/colorama/ansi.py,sha256=Fi0un-QLqRm-v7o_nKiOqyC8PapBJK7DLV_q9LKtTO0,2524
pip/_vendor/colorama/ansitowin32.py,sha256=gJZB35Lbdjatykd2zrUUnokMzkvcFgscyn_tNxxMFHA,9668
pip/_vendor/colorama/initialise.py,sha256=cHqVJtb82OG7HUCxvQ2joG7N_CoxbIKbI_fgryZkj20,1917
pip/_vendor/colorama/win32.py,sha256=_SCEoTK_GA2tU1nhbayKKac-v9Jn98lCPIFOeFMGCHQ,5365
pip/_vendor/colorama/winterm.py,sha256=V7U7ojwG1q4n6PKripjEvW_htYQi5ueXSM3LUUoqqDY,6290
pip/_vendor/distlib/__init__.py,sha256=-aUeNNCfiIG_1Tqf19BH0xLNuBKGX1I7lNhcLYgFUEA,581
pip/_vendor/distlib/compat.py,sha256=FzKlP9dNUMH-j_1LCVnjgx6KgUbpnRjTjYkTkDYRPlI,40801
pip/_vendor/distlib/database.py,sha256=jniJmYk0Mj2t6gZYbnn68TvQwnVZ0kXyeuf_3AxFclk,49672
pip/_vendor/distlib/index.py,sha256=Cw8gxFq_7xXvdgExL3efjLAY3EAPDMSL3VA42RkbQBs,21085
pip/_vendor/distlib/locators.py,sha256=hD_Hm3aSL9DklY9Cxyct2n_74gZ0xNFFGB5L7M6ds14,51013
pip/_vendor/distlib/manifest.py,sha256=3qEuZhHlDbvyYZ1BZbdapDAivgMgUwWpZ00cmXqcn18,14810
pip/_vendor/distlib/markers.py,sha256=iRrVWwpyVwjkKJSX8NEQ92_MRMwpROcfNGKCD-Ch1QM,6282
pip/_vendor/distlib/metadata.py,sha256=hUsf7Qh2Ae4CCkL33qK8ppwC8ZTzT7ep6Hj9RKpijKU,38833
pip/_vendor/distlib/resources.py,sha256=VFBVbFqLVqDBSQDXcFQHrX1KEcuoDxTK699Ydi_beyc,10766
pip/_vendor/distlib/scripts.py,sha256=xpehNfISGPTNxQZu02K9Rw2QbNx_2Q4emePv3W5X0iw,15224
pip/_vendor/distlib/util.py,sha256=KNYa6VsljZ1EiMPsB1qpc_e0R3w6BfJXCd2_0pn7vKg,53609
pip/_vendor/distlib/version.py,sha256=CgghOUylxGD7dEA2S3MvWjx7mY_2bWsluF0Of3Yxl4Y,23711
pip/_vendor/distlib/wheel.py,sha256=UP53cKxOM5r7bHSS-n5prF6hwJEVsMW9ZNJutOuC26c,39115
pip/_vendor/distlib/_backport/__init__.py,sha256=bqS_dTOH6uW9iGgd0uzfpPjo6vZ4xpPZ7kyfZJ2vNaw,274
pip/_vendor/distlib/_backport/misc.py,sha256=KWecINdbFNOxSOP1fGF680CJnaC6S4fBRgEtaYTw0ig,971
pip/_vendor/distlib/_backport/sysconfig.cfg,sha256=swZKxq9RY5e9r3PXCrlvQPMsvOdiWZBTHLEbqS8LJLU,2617
pip/_vendor/distlib/_backport/sysconfig.py,sha256=eSEyJg7jxF_eHlHG8IOtl93kb07UoMIRp1wYsPeGi9k,26955
pip/_vendor/html5lib/__init__.py,sha256=JsIwmFldk-9raBadPSTS74JrfmJvozc-3aekMi7Hr9s,780
pip/_vendor/html5lib/_ihatexml.py,sha256=tzXygYmisUmiEUt2v7E1Ab50AKQsrD-SglPRnY75vME,16705
pip/_vendor/html5lib/_inputstream.py,sha256=C4lX5gUBwebOWy41hYP2ZBpkPVNvxk_hZBm3OVyPZM4,32532
pip/_vendor/html5lib/_tokenizer.py,sha256=YAaOEBD6qc5ISq9Xt9Nif1OFgcybTTfMdwqBkZhpAq4,76580
pip/_vendor/html5lib/_utils.py,sha256=bS6THVlL8ZyTcI6CIxiM6xxuHsE8i1j5Ogd3Ha1G84U,4096
pip/_vendor/html5lib/constants.py,sha256=Dfc1Fv3_9frktgWjg4tbj-CjMMp02Ko9qMe4il1BVdo,83387
pip/_vendor/html5lib/html5parser.py,sha256=Dmlu9hlq5w_id6mBZyY_sE5LukIACgvG4kpgIsded8Q,117170
pip/_vendor/html5lib/serializer.py,sha256=Urrsa0cPPLqNX-UbJWS2gUhs_06qVbNxZvUnrmGZK6E,14177
pip/_vendor/html5lib/_trie/__init__.py,sha256=8VR1bcgD2OpeS2XExpu5yBhP_Q1K-lwKbBKICBPf1kU,289
pip/_vendor/html5lib/_trie/_base.py,sha256=6P_AcIoGjtwB2qAlhV8H4VP-ztQxoXFGwt4NyMqG_Kw,979
pip/_vendor/html5lib/_trie/datrie.py,sha256=EQpqSfkZRuTbE-DuhW7xMdVDxdZNZ0CfmnYfHA_3zxM,1178
pip/_vendor/html5lib/_trie/py.py,sha256=wXmQLrZRf4MyWNyg0m3h81m9InhLR7GJ002mIIZh-8o,1775
pip/_vendor/html5lib/filters/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
pip/_vendor/html5lib/filters/alphabeticalattributes.py,sha256=DXv-P2vdQ5F3OTWM6QZ6KhyDlAWm90pbfrD1Bk9D_l0,621
pip/_vendor/html5lib/filters/base.py,sha256=z-IU9ZAYjpsVsqmVt7kuWC63jR11hDMr6CVrvuao8W0,286
pip/_vendor/html5lib/filters/inject_meta_charset.py,sha256=2Q_JnMscn_tNbV_qpgYN_5M3PnBGfmuvECMKDExHUcY,2742
pip/_vendor/html5lib/filters/lint.py,sha256=qf5cLrT6xXd8V7GH1R_3lKxIjuJSfpbWTpSwaglYdDw,3365
pip/_vendor/html5lib/filters/optionaltags.py,sha256=EHig4kM-QiLjuxVJ3FAAFNy-10k4aV6HJbQzHKZ_3u8,10534
pip/_vendor/html5lib/filters/sanitizer.py,sha256=7PqJrhm6mo3JvaHk2IQW7i74Or7Qtd-FV8UftJIyDys,25112
pip/_vendor/html5lib/filters/whitespace.py,sha256=KPt067nYTqqi8KLTClyynn4eVzNDC_-MApXNVHRXVX0,1139
pip/_vendor/html5lib/treeadapters/__init__.py,sha256=l3LcqMSEyoh99Jh_eWjGexHnIvKhLAXoP-LDz88whuM,208
pip/_vendor/html5lib/treeadapters/genshi.py,sha256=6VIuHDNoExv1JWv3ePj6V5CM-tcyiUSWe5_Hd2ejbwY,1555
pip/_vendor/html5lib/treeadapters/sax.py,sha256=3of4vvaUYIAic7pngebwJV24hpOS7Zg9ggJa_WQegy4,1661
pip/_vendor/html5lib/treebuilders/__init__.py,sha256=UlB4orkTgZhFIKQdXrtiWn9cpKSsuhnOQOIHeD0Fv4k,3406
pip/_vendor/html5lib/treebuilders/base.py,sha256=4vdjm_Z2f_GTQBwKnWlrzVcctTb-K5sfN8pXDaWODiA,13942
pip/_vendor/html5lib/treebuilders/dom.py,sha256=SY3MsijXyzdNPc8aK5IQsupBoM8J67y56DgNtGvsb9g,8835
pip/_vendor/html5lib/treebuilders/etree.py,sha256=aqIBOGj_dFYqBURIcTegGNBhAIJOw5iFDHb4jrkYH-8,12764
pip/_vendor/html5lib/treebuilders/etree_lxml.py,sha256=CEgwHMIQZvIDFAqct4kqPkVtyKIm9efHFq_VeExEPCA,14161
pip/_vendor/html5lib/treewalkers/__init__.py,sha256=CFpUOCfLuhAgVJ8NYk9wviCu1khYnv7XRStvyzU1Fws,5544
pip/_vendor/html5lib/treewalkers/base.py,sha256=ei-2cFbNFd0gRjyaFmxnxZGLNID4o0bHFCH9bMyZ5Bk,4939
pip/_vendor/html5lib/treewalkers/dom.py,sha256=EHyFR8D8lYNnyDU9lx_IKigVJRyecUGua0mOi7HBukc,1413
pip/_vendor/html5lib/treewalkers/etree.py,sha256=8jVLEY2FjgN4RFugwhAh44l9ScVYoDStQFCnlPwvafI,4684
pip/_vendor/html5lib/treewalkers/etree_lxml.py,sha256=sY6wfRshWTllu6n48TPWpKsQRPp-0CQrT0hj_AdzHSU,6309
pip/_vendor/html5lib/treewalkers/genshi.py,sha256=4D2PECZ5n3ZN3qu3jMl9yY7B81jnQApBQSVlfaIuYbA,2309
pip/_vendor/idna/__init__.py,sha256=9Nt7xpyet3DmOrPUGooDdAwmHZZu1qUAy2EaJ93kGiQ,58
pip/_vendor/idna/codec.py,sha256=lvYb7yu7PhAqFaAIAdWcwgaWI2UmgseUua-1c0AsG0A,3299
pip/_vendor/idna/compat.py,sha256=R-h29D-6mrnJzbXxymrWUW7iZUvy-26TQwZ0ij57i4U,232
pip/_vendor/idna/core.py,sha256=GafiWdYQIK5TSjWdRzCYCho704ALtMCrV_dnXXn57U0,11390
pip/_vendor/idna/idnadata.py,sha256=-Cg83lurKoA9p7lb0lMAsos0rFz1dnKrGeBE3o8UuCA,32999
pip/_vendor/idna/intranges.py,sha256=TY1lpxZIQWEP6tNqjZkFA5hgoMWOj1OBmnUG8ihT87E,1749
pip/_vendor/idna/package_data.py,sha256=KMSUTS_M7ZZ7Ugl_V_EOxV-D3o7v7yVkt45JK_bpW24,21
pip/_vendor/idna/uts46data.py,sha256=YylQYBfljAx_WVqR2D7HgcGGyVCWwPm6uF38aERuhyw,184944
pip/_vendor/lockfile/__init__.py,sha256=Tqpz90DwKYfhPsfzVOJl84TL87pdFE5ePNHdXAxs4Tk,9371
pip/_vendor/lockfile/linklockfile.py,sha256=C7OH3H4GdK68u4FQgp8fkP2kO4fyUTSyj3X6blgfobc,2652
pip/_vendor/lockfile/mkdirlockfile.py,sha256=e3qgIL-etZMLsS-3ft19iW_8IQ360HNkGOqE3yBKsUw,3096
pip/_vendor/lockfile/pidlockfile.py,sha256=ukH9uk6NFuxyVmG5QiWw4iKq3fT7MjqUguX95avYPIY,6090
pip/_vendor/lockfile/sqlitelockfile.py,sha256=o2TMkMRY0iwn-iL1XMRRIFStMUkS4i3ajceeYNntKFg,5506
pip/_vendor/lockfile/symlinklockfile.py,sha256=ABwXXmvTHvCl5viPblShL3PG-gGsLiT1roAMfDRwhi8,2616
pip/_vendor/packaging/__about__.py,sha256=zkcCPTN_6TcLW0Nrlg0176-R1QQ_WVPTm8sz1R4-HjM,720
pip/_vendor/packaging/__init__.py,sha256=_vNac5TrzwsrzbOFIbF-5cHqc_Y2aPT2D7zrIR06BOo,513
pip/_vendor/packaging/_compat.py,sha256=Vi_A0rAQeHbU-a9X0tt1yQm9RqkgQbDSxzRw8WlU9kA,860
pip/_vendor/packaging/_structures.py,sha256=RImECJ4c_wTlaTYYwZYLHEiebDMaAJmK1oPARhw1T5o,1416
pip/_vendor/packaging/markers.py,sha256=mtg2nphJE1oQO39g1DgsdPsMO-guBBClpR-AEYFrbMg,8230
pip/_vendor/packaging/requirements.py,sha256=SD7dVJGjdPUqtoHb47qwK6wWJTQd-ZXWjxpJg83UcBA,4327
pip/_vendor/packaging/specifiers.py,sha256=SAMRerzO3fK2IkFZCaZkuwZaL_EGqHNOz4pni4vhnN0,28025
pip/_vendor/packaging/utils.py,sha256=3m6WvPm6NNxE8rkTGmn0r75B_GZSGg7ikafxHsBN1WA,421
pip/_vendor/packaging/version.py,sha256=OwGnxYfr2ghNzYx59qWIBkrK3SnB6n-Zfd1XaLpnnM0,11556
pip/_vendor/pkg_resources/__init__.py,sha256=CcwuHtCBZn9OTkmgF9cFpadIAMhlrnZTVKTOo4V2p58,103230
pip/_vendor/progress/__init__.py,sha256=Wn1074LUDZovd4zfoVYojnPBgOc6ctHbQX7rp_p8lRA,3023
pip/_vendor/progress/bar.py,sha256=YNPJeRrwYVKFO2nyaEwsQjYByamMWTgJMvQO1NpD-AY,2685
pip/_vendor/progress/counter.py,sha256=kEqA8jWEdwrc6P_9VaRx7bjOHwk9gxl-Q9oVbQ08v5c,1502
pip/_vendor/progress/helpers.py,sha256=FehfwZTv-5cCfsbcMlvlUkm3xZ0cRhsev6XVpmeTF4c,2854
pip/_vendor/progress/spinner.py,sha256=iCVtUQbaJUFHTjn1ZLPQLPYeao4lC9aXAa_HxIeUK6k,1314
pip/_vendor/requests/__init__.py,sha256=JRFVBw6JyV98WQSqv8jshc5_g9xIbLhevI5LHaVi9I4,3575
pip/_vendor/requests/__version__.py,sha256=BQ279bjqQ_8PHhvD_FN36UuFqjbSUqsm7bMeyJV-kVo,436
pip/_vendor/requests/_internal_utils.py,sha256=Zx3PnEUccyfsB-ie11nZVAW8qClJy0gx1qNME7rgT18,1096
pip/_vendor/requests/adapters.py,sha256=LAay3OH0ZbvI6bDW_M5Of06tU5z2fnCflrp_Xm38KsY,21016
pip/_vendor/requests/api.py,sha256=BqVZnvsWu6Pwm0vQ3fw_Dj9_I-gcOR9CbScB2htPArA,6237
pip/_vendor/requests/auth.py,sha256=4KCFQHrL1Lcox3uMh4tjOh3OrJhw-F5zti91wY-ZyTY,9728
pip/_vendor/requests/certs.py,sha256=nXRVq9DtGmv_1AYbwjTu9UrgAcdJv05ZvkNeaoLOZxY,465
pip/_vendor/requests/compat.py,sha256=kcqhV7U43c4i8Ouk5e5YdJXMDEmWNLT97LYm6Uor-74,1626
pip/_vendor/requests/cookies.py,sha256=u7QC5hmloMwdT9-2taz5GpwAvzp2LTUS9cP4SWUKnfM,18208
pip/_vendor/requests/exceptions.py,sha256=oZwYwCm65Y0FMuFqojEgUlWUBQ4MkXRy5URHV1b98L4,3115
pip/_vendor/requests/help.py,sha256=UuBTtc7tEpnU_ivnpuk2hjgzuS6z7GnTEkSbENlc1XQ,3667
pip/_vendor/requests/hooks.py,sha256=HXAHoC1FNTFRZX6-lNdvPM7Tst4kvGwYTN-AOKRxoRU,767
pip/_vendor/requests/models.py,sha256=lcz2GEOe2eOu-GqAGdpA0vJUpI7EE4eDSlWI78R8Y64,34051
pip/_vendor/requests/packages.py,sha256=njJmVifY4aSctuW3PP5EFRCxjEwMRDO6J_feG2dKWsI,695
pip/_vendor/requests/sessions.py,sha256=Ug4EFuHIkqS3EPmD__hWy0JVCzoAGv_F342kzVLnHrg,28689
pip/_vendor/requests/status_codes.py,sha256=a9bwuU7lMr4HshsRZdzBbUsYVsIo4Fu9GYg1XFTFFPc,3323
pip/_vendor/requests/structures.py,sha256=yexCvWbX40M6E8mLQOpAGZZ-ZoAnyaT2dni-Bp-b42g,3012
pip/_vendor/requests/utils.py,sha256=YEGPbyXaf3hosTL0dI4JrCIDpGYaFgHtLhYj96py7AQ,27695
pip/_vendor/urllib3/__init__.py,sha256=EfUPF9RHveaF9g5dcK5kBsGDp5LrIB-396MpK2RQk1I,2853
pip/_vendor/urllib3/_collections.py,sha256=b0-x45LBArs96Rum5xGAbPsriwfB3MZVQbsv7lFiUwE,10204
pip/_vendor/urllib3/connection.py,sha256=4GKR8uXHz6IpztYhtuhKBqXj84HbbFUF3CvwymGfLD0,13003
pip/_vendor/urllib3/connectionpool.py,sha256=5ub8CXnKs1wc2X-MyuVVPo3_9scWLghSgbHOJTpQXuE,35358
pip/_vendor/urllib3/exceptions.py,sha256=dz1gBEgtROnLrW8V911KhVZWeAn3H2OhDGztWNXQpr0,6603
pip/_vendor/urllib3/fields.py,sha256=YrNRM8RBUmM8guXKUQFa3kwj6XvQZ78Z8inE6l-YK-E,5943
pip/_vendor/urllib3/filepost.py,sha256=NF6Rly66bilWU-sdULXjCdQgN1uRxfFRedeifcRLzkU,2321
pip/_vendor/urllib3/poolmanager.py,sha256=V843K_nTlkV8u3GIj6M-ProgIuUIkqpgOsLy_epC-q4,16820
pip/_vendor/urllib3/request.py,sha256=wrt2D0SWLLgTRKrRnaZophq2xXpCvNRd7RMT6F5o5hY,5946
pip/_vendor/urllib3/response.py,sha256=7mGUH35L2IPuZVOY7QvDQ1GSSKIf6V6geJXThdjmQD4,22903
pip/_vendor/urllib3/contrib/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
pip/_vendor/urllib3/contrib/appengine.py,sha256=41l3arTy-kBBpOdVpSPYVC64Qo7RLnXnDED6hcIthA0,10865
pip/_vendor/urllib3/contrib/ntlmpool.py,sha256=Q9-rO5Rh2-IqyEd4ZicpTDfMnOlf0IPPCkjhChBCjV4,4478
pip/_vendor/urllib3/contrib/pyopenssl.py,sha256=BxJ1yMPE62duuHJP6jlZxLz_FNTKbWI0b-nVRKlBDgI,15354
pip/_vendor/urllib3/contrib/securetransport.py,sha256=ZRPz6Q1tnsu9H1BOAQVUSHIyowWmfaBb7pGlbG5iOPk,30501
pip/_vendor/urllib3/contrib/socks.py,sha256=zPYUKMg_c0n9HFjZPG9nGN2kjpaH7qUCZxrD5B7G0_I,6195
pip/_vendor/urllib3/contrib/_securetransport/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
pip/_vendor/urllib3/contrib/_securetransport/bindings.py,sha256=x2kLSh-ASZKsun0FxtraBuLVe3oHuth4YW6yZ5Vof-w,17560
pip/_vendor/urllib3/contrib/_securetransport/low_level.py,sha256=UbhUykEH6HUIJud9_rn_6YWjionk5iq_rq6YrhVM6Co,12062
pip/_vendor/urllib3/packages/__init__.py,sha256=nlChrGzkjCkmhCX9HrF_qHPUgosfsPQkVIJxiiLhk9g,109
pip/_vendor/urllib3/packages/ordered_dict.py,sha256=VQaPONfhVMsb8B63Xg7ZOydJqIE_jzeMhVN3Pec6ogw,8935
pip/_vendor/urllib3/packages/six.py,sha256=A6hdJZVjI3t_geebZ9BzUvwRrIXo0lfwzQlM2LcKyas,30098
pip/_vendor/urllib3/packages/backports/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
pip/_vendor/urllib3/packages/backports/makefile.py,sha256=r1IADol_pBBq2Y1ub4CPyuS2hXuShK47nfFngZRcRhI,1461
pip/_vendor/urllib3/packages/ssl_match_hostname/__init__.py,sha256=WBVbxQBojNAxfZwNavkox3BgJiMA9BJmm-_fwd0jD_o,688
pip/_vendor/urllib3/packages/ssl_match_hostname/_implementation.py,sha256=lAj7qGCZLOldhn8gZDY6Tqp4mvgkbTfy4k4gDIDRo8g,5702
pip/_vendor/urllib3/util/__init__.py,sha256=6Ran4oAVIy40Cu_oEPWnNV9bwF5rXx6G1DUZ7oehjPY,1044
pip/_vendor/urllib3/util/connection.py,sha256=_6_5JZJF3HHRXR7HaxHg3mk7qMKK3N0nl3DL8gFAfo4,4237
pip/_vendor/urllib3/util/request.py,sha256=H5_lrHvtwl2U2BbT1UYN9HpruNc1gsNFlz2njQmhPrQ,3705
pip/_vendor/urllib3/util/response.py,sha256=SSNL888W-MQ8t3HAi44kNGgF682p6H__ytEXzBYxV_M,2343
pip/_vendor/urllib3/util/retry.py,sha256=ZfL_m5PNUz8XSNy4VJT77Z3EnykjPBLYYHF-3rF_jeM,15104
pip/_vendor/urllib3/util/selectors.py,sha256=PIINzwjiD5Z6IyTKA1tR5n1kCOXyThpDCE2fCVFzLeM,21147
pip/_vendor/urllib3/util/ssl_.py,sha256=4cgfqqgM5U_71CgpKG-aqcUDInWv5-YaPeguohjC97I,12214
pip/_vendor/urllib3/util/timeout.py,sha256=7lHNrgL5YH2cI1j-yZnzV_J8jBlRVdmFhQaNyM1_2b8,9757
pip/_vendor/urllib3/util/url.py,sha256=2hwSEH6nZjUTeE6o54BmZy_8irGtOZpGezXExDcsP1g,6798
pip/_vendor/urllib3/util/wait.py,sha256=Q_pd_bD6iaPgRKwEmcjTYDrSPj4Dd4ojykmqA398b8o,1451
pip/_vendor/webencodings/__init__.py,sha256=t7rAQQxXwalY-ak9hTl73qHjhia9UH-sL-e00qQrBpo,10576
pip/_vendor/webencodings/labels.py,sha256=4AO_KxTddqGtrL9ns7kAPjb0CcN6xsCIxbK37HY9r3E,8979
pip/_vendor/webencodings/mklabels.py,sha256=GYIeywnpaLnP0GSic8LFWgd0UVvO_l1Nc6YoF-87R_4,1305
pip/_vendor/webencodings/tests.py,sha256=7vTk7LgOJn_t1XtT_viofZlEJ7cJCzPe_hvVHOkcQl8,6562
pip/_vendor/webencodings/x_user_defined.py,sha256=72cfPRhbfkRCGkkA8ZnvVV7UnoiLb5uPMhXwhrXiLPk,4306
pip/commands/__init__.py,sha256=2Uq3HCdjchJD9FL1LB7rd5v6UySVAVizX0W3EX3hIoE,2244
pip/commands/check.py,sha256=-A7GI1-WZBh9a4P6UoH_aR-J7I8Lz8ly7m3wnCjmevs,1382
pip/commands/completion.py,sha256=kkPgVX7SUcJ_8Juw5GkgWaxHN9_45wmAr9mGs1zXEEs,2453
pip/commands/download.py,sha256=8RuuPmSYgAq3iEDTqZY_1PDXRqREdUULHNjWJeAv7Mo,7810
pip/commands/freeze.py,sha256=h6-yFMpjCjbNj8-gOm5UuoF6cg14N5rPV4TCi3_CeuI,2835
pip/commands/hash.py,sha256=MCt4jEFyfoce0lVeNEz1x49uaTY-VDkKiBvvxrVcHkw,1597
pip/commands/help.py,sha256=84HWkEdnGP_AEBHnn8gJP2Te0XTXRKFoXqXopbOZTNo,982
pip/commands/install.py,sha256=q45kfTQUKkUJLCdPs38FKYfrVeFz4i9WyeRLfcr4b-Y,18289
pip/commands/list.py,sha256=93bCiFyt2Qut_YHkYHJMZHpXladmxsjS-yOtZeb3uqI,11369
pip/commands/search.py,sha256=oTs9QNdefnrmCV_JeftG0PGiMuYVmiEDF1OUaYsmDao,4502
pip/commands/show.py,sha256=ZYM57_7U8KP9MQIIyHKQdZxmiEZByy-DRzB697VFoTY,5891
pip/commands/uninstall.py,sha256=tz8cXz4WdpUdnt3RvpdQwH6_SNMB50egBIZWa1dwfcc,2884
pip/commands/wheel.py,sha256=z5SEhws2YRMb0Ml1IEkg6jFZMLRpLl86bHCrQbYt5zo,7729
pip/compat/__init__.py,sha256=2Xs_IpsmdRgHbQgQO0c8_lPvHJnQXHyGWxPbLbYJL4c,4672
pip/compat/dictconfig.py,sha256=dRrelPDWrceDSzFT51RTEVY2GuM7UDyc5Igh_tn4Fvk,23096
pip/models/__init__.py,sha256=0Rs7_RA4DxeOkWT5Cq4CQzDrSEhvYcN3TH2cazr72PE,71
pip/models/index.py,sha256=pUfbO__v3mD9j-2n_ClwPS8pVyx4l2wIwyvWt8GMCRA,487
pip/operations/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
pip/operations/check.py,sha256=uwUN9cs1sPo7c0Sj6pRrSv7b22Pk29SXUImTelVchMQ,1590
pip/operations/freeze.py,sha256=k-7w7LsM-RpPv7ERBzHiPpYkH-GuYfHLyR-Cp_1VPL0,5194
pip/req/__init__.py,sha256=vFwZY8_Vc1WU1zFAespg1My_r_AT3n7cN0W9eX0EFqk,276
pip/req/req_file.py,sha256=fG9MDsXUNPhmGwxUiwrIXEynyD8Q7s3L47-hLZPDXq0,11926
pip/req/req_install.py,sha256=g0HClWrzqIqm5Lxiyd3wlaCFrdzQVqMJEAZIkv1XTtc,46741
pip/req/req_set.py,sha256=T6bbKdaL8XAdj6RfXxFUlinNUYEnbcMXlQ24kMvkq0s,34846
pip/req/req_uninstall.py,sha256=fdH2VgCjEC8NRYDS7fRu3ZJaBBUEy-N5muwxDX5MBNM,6897
pip/utils/__init__.py,sha256=zggUku3mu_75m-RCTdBD_mYJMg15wKMQ7UGjTZkRDJs,27757
pip/utils/appdirs.py,sha256=kj2LK-I2fC5QnEh_A_v-ev_IQMcXaWWF5DE39sNvCLQ,8811
pip/utils/build.py,sha256=4smLRrfSCmXmjEnVnMFh2tBEpNcSLRe6J0ejZJ-wWJE,1312
pip/utils/deprecation.py,sha256=X_FMjtDbMJqfqEkdRrki-mYyIdPB6I6DHUTCA_ChY6M,2232
pip/utils/encoding.py,sha256=NQxGiFS5GbeAveLZTnx92t5r0PYqvt0iRnP2u9SGG1w,971
pip/utils/filesystem.py,sha256=ZEVBuYM3fqr2_lgOESh4Y7fPFszGD474zVm_M3Mb5Tk,899
pip/utils/glibc.py,sha256=jcQYjt_oJLPKVZB28Kauy4Sw70zS-wawxoU1HHX36_0,2939
pip/utils/hashes.py,sha256=oMk7cd3PbJgzpSQyXq1MytMud5f6H5Oa2YY5hYuCq6I,2866
pip/utils/logging.py,sha256=7yWu4gZw-Qclj7X80QVdpGWkdTWGKT4LiUVKcE04pro,3327
pip/utils/outdated.py,sha256=9xLA0dbtgGBb07OTI1bHbrA5rVFtOv7XbBCdSosRe6s,5989
pip/utils/packaging.py,sha256=qhmli14odw6DIhWJgQYS2Q0RrSbr8nXNcG48f5yTRms,2080
pip/utils/setuptools_build.py,sha256=0blfscmNJW_iZ5DcswJeDB_PbtTEjfK9RL1R1WEDW2E,278
pip/utils/ui.py,sha256=pbDkSAeumZ6jdZcOJ2yAbx8iBgeP2zfpqNnLJK1gskQ,11597
pip/vcs/__init__.py,sha256=WafFliUTHMmsSISV8PHp1M5EXDNSWyJr78zKaQmPLdY,12374
pip/vcs/bazaar.py,sha256=tYTwc4b4off8mr0O2o8SiGejqBDJxcbDBMSMd9-ISYc,3803
pip/vcs/git.py,sha256=gjC_g3tv-DA4JXiYi8tlg6TRI1W2xZgHmATlf5TsR80,11564
pip/vcs/mercurial.py,sha256=xG6rDiwHCRytJEs23SIHBXl_SwQo2jkkdD_6rVVP5h4,3472
pip/vcs/subversion.py,sha256=GAuX2Sk7IZvJyEzENKcVld_wGBrQ3fpXDlXjapZEYdI,9350
pip-9.0.3.dist-info/DESCRIPTION.rst,sha256=Va8Wj1XBpTbVQ2Z41mZRJdALEeziiS_ZewWn1H2ecY4,1287
pip-9.0.3.dist-info/METADATA,sha256=kM2zXDoIPiZ2qojfkaDC7aKqTFwuwgMRVJ7FYUtAAVk,2553
pip-9.0.3.dist-info/RECORD,,
pip-9.0.3.dist-info/WHEEL,sha256=kdsN-5OJAZIiHN-iO4Rhl82KyS0bDWf4uBwMbkNafr8,110
pip-9.0.3.dist-info/entry_points.txt,sha256=Q-fR2tcp9DRdeXoGn1wR67Xecy32o5EyQEnzDghwqqk,68
pip-9.0.3.dist-info/metadata.json,sha256=4Lmrui3knL0bgoxl18scys1UztOfugUfCzARcpKW4zA,1565
pip-9.0.3.dist-info/top_level.txt,sha256=zuuue4knoyJ-UwPPXg8fezS7VCrXJQrAP7zeNuwvFQg,4
../../../bin/pip,sha256=a5Jycd6ha0m0MHdBHQlJybs8bw48knoS20snPPooX78,218
../../../bin/pip3,sha256=a5Jycd6ha0m0MHdBHQlJybs8bw48knoS20snPPooX78,218
../../../bin/pip3.6,sha256=a5Jycd6ha0m0MHdBHQlJybs8bw48knoS20snPPooX78,218
pip-9.0.3.dist-info/INSTALLER,sha256=zuuue4knoyJ-UwPPXg8fezS7VCrXJQrAP7zeNuwvFQg,4
pip/compat/__pycache__/dictconfig.cpython-36.pyc,,
pip/compat/__pycache__/__init__.cpython-36.pyc,,
pip/utils/__pycache__/build.cpython-36.pyc,,
pip/utils/__pycache__/appdirs.cpython-36.pyc,,
pip/utils/__pycache__/logging.cpython-36.pyc,,
pip/utils/__pycache__/hashes.cpython-36.pyc,,
pip/utils/__pycache__/encoding.cpython-36.pyc,,
pip/utils/__pycache__/packaging.cpython-36.pyc,,
pip/utils/__pycache__/deprecation.cpython-36.pyc,,
pip/utils/__pycache__/glibc.cpython-36.pyc,,
pip/utils/__pycache__/filesystem.cpython-36.pyc,,
pip/utils/__pycache__/setuptools_build.cpython-36.pyc,,
pip/utils/__pycache__/outdated.cpython-36.pyc,,
pip/utils/__pycache__/ui.cpython-36.pyc,,
pip/utils/__pycache__/__init__.cpython-36.pyc,,
pip/vcs/__pycache__/subversion.cpython-36.pyc,,
pip/vcs/__pycache__/mercurial.cpython-36.pyc,,
pip/vcs/__pycache__/bazaar.cpython-36.pyc,,
pip/vcs/__pycache__/git.cpython-36.pyc,,
pip/vcs/__pycache__/__init__.cpython-36.pyc,,
pip/_vendor/colorama/__pycache__/ansi.cpython-36.pyc,,
pip/_vendor/colorama/__pycache__/initialise.cpython-36.pyc,,
pip/_vendor/colorama/__pycache__/winterm.cpython-36.pyc,,
pip/_vendor/colorama/__pycache__/win32.cpython-36.pyc,,
pip/_vendor/colorama/__pycache__/ansitowin32.cpython-36.pyc,,
pip/_vendor/colorama/__pycache__/__init__.cpython-36.pyc,,
pip/_vendor/urllib3/contrib/_securetransport/__pycache__/bindings.cpython-36.pyc,,
pip/_vendor/urllib3/contrib/_securetransport/__pycache__/low_level.cpython-36.pyc,,
pip/_vendor/urllib3/contrib/_securetransport/__pycache__/__init__.cpython-36.pyc,,
pip/_vendor/urllib3/contrib/__pycache__/appengine.cpython-36.pyc,,
pip/_vendor/urllib3/contrib/__pycache__/securetransport.cpython-36.pyc,,
pip/_vendor/urllib3/contrib/__pycache__/ntlmpool.cpython-36.pyc,,
pip/_vendor/urllib3/contrib/__pycache__/pyopenssl.cpython-36.pyc,,
pip/_vendor/urllib3/contrib/__pycache__/socks.cpython-36.pyc,,
pip/_vendor/urllib3/contrib/__pycache__/__init__.cpython-36.pyc,,
pip/_vendor/urllib3/util/__pycache__/selectors.cpython-36.pyc,,
pip/_vendor/urllib3/util/__pycache__/retry.cpython-36.pyc,,
pip/_vendor/urllib3/util/__pycache__/request.cpython-36.pyc,,
pip/_vendor/urllib3/util/__pycache__/ssl_.cpython-36.pyc,,
pip/_vendor/urllib3/util/__pycache__/response.cpython-36.pyc,,
pip/_vendor/urllib3/util/__pycache__/timeout.cpython-36.pyc,,
pip/_vendor/urllib3/util/__pycache__/connection.cpython-36.pyc,,
pip/_vendor/urllib3/util/__pycache__/__init__.cpython-36.pyc,,
pip/_vendor/urllib3/util/__pycache__/url.cpython-36.pyc,,
pip/_vendor/urllib3/util/__pycache__/wait.cpython-36.pyc,,
pip/_vendor/urllib3/__pycache__/_collections.cpython-36.pyc,,
pip/_vendor/urllib3/__pycache__/request.cpython-36.pyc,,
pip/_vendor/urllib3/__pycache__/poolmanager.cpython-36.pyc,,
pip/_vendor/urllib3/__pycache__/connectionpool.cpython-36.pyc,,
pip/_vendor/urllib3/__pycache__/response.cpython-36.pyc,,
pip/_vendor/urllib3/__pycache__/exceptions.cpython-36.pyc,,
pip/_vendor/urllib3/__pycache__/filepost.cpython-36.pyc,,
pip/_vendor/urllib3/__pycache__/fields.cpython-36.pyc,,
pip/_vendor/urllib3/__pycache__/connection.cpython-36.pyc,,
pip/_vendor/urllib3/__pycache__/__init__.cpython-36.pyc,,
pip/_vendor/urllib3/packages/__pycache__/six.cpython-36.pyc,,
pip/_vendor/urllib3/packages/__pycache__/ordered_dict.cpython-36.pyc,,
pip/_vendor/urllib3/packages/__pycache__/__init__.cpython-36.pyc,,
pip/_vendor/urllib3/packages/ssl_match_hostname/__pycache__/_implementation.cpython-36.pyc,,
pip/_vendor/urllib3/packages/ssl_match_hostname/__pycache__/__init__.cpython-36.pyc,,
pip/_vendor/urllib3/packages/backports/__pycache__/makefile.cpython-36.pyc,,
pip/_vendor/urllib3/packages/backports/__pycache__/__init__.cpython-36.pyc,,
pip/_vendor/distlib/__pycache__/util.cpython-36.pyc,,
pip/_vendor/distlib/__pycache__/database.cpython-36.pyc,,
pip/_vendor/distlib/__pycache__/markers.cpython-36.pyc,,
pip/_vendor/distlib/__pycache__/manifest.cpython-36.pyc,,
pip/_vendor/distlib/__pycache__/scripts.cpython-36.pyc,,
pip/_vendor/distlib/__pycache__/compat.cpython-36.pyc,,
pip/_vendor/distlib/__pycache__/metadata.cpython-36.pyc,,
pip/_vendor/distlib/__pycache__/resources.cpython-36.pyc,,
pip/_vendor/distlib/__pycache__/version.cpython-36.pyc,,
pip/_vendor/distlib/__pycache__/locators.cpython-36.pyc,,
pip/_vendor/distlib/__pycache__/index.cpython-36.pyc,,
pip/_vendor/distlib/__pycache__/wheel.cpython-36.pyc,,
pip/_vendor/distlib/__pycache__/__init__.cpython-36.pyc,,
pip/_vendor/distlib/_backport/__pycache__/misc.cpython-36.pyc,,
pip/_vendor/distlib/_backport/__pycache__/sysconfig.cpython-36.pyc,,
pip/_vendor/distlib/_backport/__pycache__/__init__.cpython-36.pyc,,
pip/_vendor/__pycache__/re-vendor.cpython-36.pyc,,
pip/_vendor/__pycache__/appdirs.cpython-36.pyc,,
pip/_vendor/__pycache__/six.cpython-36.pyc,,
pip/_vendor/__pycache__/ipaddress.cpython-36.pyc,,
pip/_vendor/__pycache__/retrying.cpython-36.pyc,,
pip/_vendor/__pycache__/distro.cpython-36.pyc,,
pip/_vendor/__pycache__/pyparsing.cpython-36.pyc,,
pip/_vendor/__pycache__/__init__.cpython-36.pyc,,
pip/_vendor/cachecontrol/caches/__pycache__/redis_cache.cpython-36.pyc,,
pip/_vendor/cachecontrol/caches/__pycache__/file_cache.cpython-36.pyc,,
pip/_vendor/cachecontrol/caches/__pycache__/__init__.cpython-36.pyc,,
pip/_vendor/cachecontrol/__pycache__/filewrapper.cpython-36.pyc,,
pip/_vendor/cachecontrol/__pycache__/compat.cpython-36.pyc,,
pip/_vendor/cachecontrol/__pycache__/heuristics.cpython-36.pyc,,
pip/_vendor/cachecontrol/__pycache__/adapter.cpython-36.pyc,,
pip/_vendor/cachecontrol/__pycache__/controller.cpython-36.pyc,,
pip/_vendor/cachecontrol/__pycache__/wrapper.cpython-36.pyc,,
pip/_vendor/cachecontrol/__pycache__/serialize.cpython-36.pyc,,
pip/_vendor/cachecontrol/__pycache__/_cmd.cpython-36.pyc,,
pip/_vendor/cachecontrol/__pycache__/cache.cpython-36.pyc,,
pip/_vendor/cachecontrol/__pycache__/__init__.cpython-36.pyc,,
pip/_vendor/chardet/__pycache__/langturkishmodel.cpython-36.pyc,,
pip/_vendor/chardet/__pycache__/enums.cpython-36.pyc,,
pip/_vendor/chardet/__pycache__/charsetprober.cpython-36.pyc,,
pip/_vendor/chardet/__pycache__/hebrewprober.cpython-36.pyc,,
pip/_vendor/chardet/__pycache__/euctwprober.cpython-36.pyc,,
pip/_vendor/chardet/__pycache__/sjisprober.cpython-36.pyc,,
pip/_vendor/chardet/__pycache__/cp949prober.cpython-36.pyc,,
pip/_vendor/chardet/__pycache__/mbcssm.cpython-36.pyc,,
pip/_vendor/chardet/__pycache__/langhebrewmodel.cpython-36.pyc,,
pip/_vendor/chardet/__pycache__/euckrfreq.cpython-36.pyc,,
pip/_vendor/chardet/__pycache__/utf8prober.cpython-36.pyc,,
pip/_vendor/chardet/__pycache__/compat.cpython-36.pyc,,
pip/_vendor/chardet/__pycache__/universaldetector.cpython-36.pyc,,
pip/_vendor/chardet/__pycache__/sbcsgroupprober.cpython-36.pyc,,
pip/_vendor/chardet/__pycache__/langbulgarianmodel.cpython-36.pyc,,
pip/_vendor/chardet/__pycache__/langgreekmodel.cpython-36.pyc,,
pip/_vendor/chardet/__pycache__/escsm.cpython-36.pyc,,
pip/_vendor/chardet/__pycache__/chardistribution.cpython-36.pyc,,
pip/_vendor/chardet/__pycache__/charsetgroupprober.cpython-36.pyc,,
pip/_vendor/chardet/__pycache__/gb2312prober.cpython-36.pyc,,
pip/_vendor/chardet/__pycache__/gb2312freq.cpython-36.pyc,,
pip/_vendor/chardet/__pycache__/langcyrillicmodel.cpython-36.pyc,,
pip/_vendor/chardet/__pycache__/euctwfreq.cpython-36.pyc,,
pip/_vendor/chardet/__pycache__/big5freq.cpython-36.pyc,,
pip/_vendor/chardet/__pycache__/langhungarianmodel.cpython-36.pyc,,
pip/_vendor/chardet/__pycache__/eucjpprober.cpython-36.pyc,,
pip/_vendor/chardet/__pycache__/latin1prober.cpython-36.pyc,,
pip/_vendor/chardet/__pycache__/mbcsgroupprober.cpython-36.pyc,,
pip/_vendor/chardet/__pycache__/version.cpython-36.pyc,,
pip/_vendor/chardet/__pycache__/jpcntx.cpython-36.pyc,,
pip/_vendor/chardet/__pycache__/escprober.cpython-36.pyc,,
pip/_vendor/chardet/__pycache__/sbcharsetprober.cpython-36.pyc,,
pip/_vendor/chardet/__pycache__/codingstatemachine.cpython-36.pyc,,
pip/_vendor/chardet/__pycache__/mbcharsetprober.cpython-36.pyc,,
pip/_vendor/chardet/__pycache__/langthaimodel.cpython-36.pyc,,
pip/_vendor/chardet/__pycache__/jisfreq.cpython-36.pyc,,
pip/_vendor/chardet/__pycache__/euckrprober.cpython-36.pyc,,
pip/_vendor/chardet/__pycache__/__init__.cpython-36.pyc,,
pip/_vendor/chardet/__pycache__/big5prober.cpython-36.pyc,,
pip/_vendor/chardet/cli/__pycache__/chardetect.cpython-36.pyc,,
pip/_vendor/chardet/cli/__pycache__/__init__.cpython-36.pyc,,
pip/_vendor/webencodings/__pycache__/labels.cpython-36.pyc,,
pip/_vendor/webencodings/__pycache__/tests.cpython-36.pyc,,
pip/_vendor/webencodings/__pycache__/mklabels.cpython-36.pyc,,
pip/_vendor/webencodings/__pycache__/x_user_defined.cpython-36.pyc,,
pip/_vendor/webencodings/__pycache__/__init__.cpython-36.pyc,,
pip/_vendor/pkg_resources/__pycache__/__init__.cpython-36.pyc,,
pip/_vendor/requests/__pycache__/__version__.cpython-36.pyc,,
pip/_vendor/requests/__pycache__/packages.cpython-36.pyc,,
pip/_vendor/requests/__pycache__/utils.cpython-36.pyc,,
pip/_vendor/requests/__pycache__/adapters.cpython-36.pyc,,
pip/_vendor/requests/__pycache__/_internal_utils.cpython-36.pyc,,
pip/_vendor/requests/__pycache__/help.cpython-36.pyc,,
pip/_vendor/requests/__pycache__/hooks.cpython-36.pyc,,
pip/_vendor/requests/__pycache__/compat.cpython-36.pyc,,
pip/_vendor/requests/__pycache__/auth.cpython-36.pyc,,
pip/_vendor/requests/__pycache__/api.cpython-36.pyc,,
pip/_vendor/requests/__pycache__/sessions.cpython-36.pyc,,
pip/_vendor/requests/__pycache__/structures.cpython-36.pyc,,
pip/_vendor/requests/__pycache__/status_codes.cpython-36.pyc,,
pip/_vendor/requests/__pycache__/models.cpython-36.pyc,,
pip/_vendor/requests/__pycache__/exceptions.cpython-36.pyc,,
pip/_vendor/requests/__pycache__/cookies.cpython-36.pyc,,
pip/_vendor/requests/__pycache__/certs.cpython-36.pyc,,
pip/_vendor/requests/__pycache__/__init__.cpython-36.pyc,,
pip/_vendor/packaging/__pycache__/requirements.cpython-36.pyc,,
pip/_vendor/packaging/__pycache__/utils.cpython-36.pyc,,
pip/_vendor/packaging/__pycache__/markers.cpython-36.pyc,,
pip/_vendor/packaging/__pycache__/_compat.cpython-36.pyc,,
pip/_vendor/packaging/__pycache__/__about__.cpython-36.pyc,,
pip/_vendor/packaging/__pycache__/_structures.cpython-36.pyc,,
pip/_vendor/packaging/__pycache__/version.cpython-36.pyc,,
pip/_vendor/packaging/__pycache__/specifiers.cpython-36.pyc,,
pip/_vendor/packaging/__pycache__/__init__.cpython-36.pyc,,
pip/_vendor/lockfile/__pycache__/sqlitelockfile.cpython-36.pyc,,
pip/_vendor/lockfile/__pycache__/symlinklockfile.cpython-36.pyc,,
pip/_vendor/lockfile/__pycache__/pidlockfile.cpython-36.pyc,,
pip/_vendor/lockfile/__pycache__/mkdirlockfile.cpython-36.pyc,,
pip/_vendor/lockfile/__pycache__/linklockfile.cpython-36.pyc,,
pip/_vendor/lockfile/__pycache__/__init__.cpython-36.pyc,,
pip/_vendor/html5lib/_trie/__pycache__/py.cpython-36.pyc,,
pip/_vendor/html5lib/_trie/__pycache__/datrie.cpython-36.pyc,,
pip/_vendor/html5lib/_trie/__pycache__/_base.cpython-36.pyc,,
pip/_vendor/html5lib/_trie/__pycache__/__init__.cpython-36.pyc,,
pip/_vendor/html5lib/__pycache__/_tokenizer.cpython-36.pyc,,
pip/_vendor/html5lib/__pycache__/_ihatexml.cpython-36.pyc,,
pip/_vendor/html5lib/__pycache__/html5parser.cpython-36.pyc,,
pip/_vendor/html5lib/__pycache__/_utils.cpython-36.pyc,,
pip/_vendor/html5lib/__pycache__/constants.cpython-36.pyc,,
pip/_vendor/html5lib/__pycache__/serializer.cpython-36.pyc,,
pip/_vendor/html5lib/__pycache__/_inputstream.cpython-36.pyc,,
pip/_vendor/html5lib/__pycache__/__init__.cpython-36.pyc,,
pip/_vendor/html5lib/treewalkers/__pycache__/etree.cpython-36.pyc,,
pip/_vendor/html5lib/treewalkers/__pycache__/genshi.cpython-36.pyc,,
pip/_vendor/html5lib/treewalkers/__pycache__/dom.cpython-36.pyc,,
pip/_vendor/html5lib/treewalkers/__pycache__/etree_lxml.cpython-36.pyc,,
pip/_vendor/html5lib/treewalkers/__pycache__/base.cpython-36.pyc,,
pip/_vendor/html5lib/treewalkers/__pycache__/__init__.cpython-36.pyc,,
pip/_vendor/html5lib/treebuilders/__pycache__/etree.cpython-36.pyc,,
pip/_vendor/html5lib/treebuilders/__pycache__/dom.cpython-36.pyc,,
pip/_vendor/html5lib/treebuilders/__pycache__/etree_lxml.cpython-36.pyc,,
pip/_vendor/html5lib/treebuilders/__pycache__/base.cpython-36.pyc,,
pip/_vendor/html5lib/treebuilders/__pycache__/__init__.cpython-36.pyc,,
pip/_vendor/html5lib/treeadapters/__pycache__/sax.cpython-36.pyc,,
pip/_vendor/html5lib/treeadapters/__pycache__/genshi.cpython-36.pyc,,
pip/_vendor/html5lib/treeadapters/__pycache__/__init__.cpython-36.pyc,,
pip/_vendor/html5lib/filters/__pycache__/optionaltags.cpython-36.pyc,,
pip/_vendor/html5lib/filters/__pycache__/alphabeticalattributes.cpython-36.pyc,,
pip/_vendor/html5lib/filters/__pycache__/sanitizer.cpython-36.pyc,,
pip/_vendor/html5lib/filters/__pycache__/lint.cpython-36.pyc,,
pip/_vendor/html5lib/filters/__pycache__/inject_meta_charset.cpython-36.pyc,,
pip/_vendor/html5lib/filters/__pycache__/base.cpython-36.pyc,,
pip/_vendor/html5lib/filters/__pycache__/whitespace.cpython-36.pyc,,
pip/_vendor/html5lib/filters/__pycache__/__init__.cpython-36.pyc,,
pip/_vendor/progress/__pycache__/spinner.cpython-36.pyc,,
pip/_vendor/progress/__pycache__/bar.cpython-36.pyc,,
pip/_vendor/progress/__pycache__/counter.cpython-36.pyc,,
pip/_vendor/progress/__pycache__/helpers.cpython-36.pyc,,
pip/_vendor/progress/__pycache__/__init__.cpython-36.pyc,,
pip/_vendor/certifi/__pycache__/__main__.cpython-36.pyc,,
pip/_vendor/certifi/__pycache__/core.cpython-36.pyc,,
pip/_vendor/certifi/__pycache__/__init__.cpython-36.pyc,,
pip/_vendor/idna/__pycache__/compat.cpython-36.pyc,,
pip/_vendor/idna/__pycache__/uts46data.cpython-36.pyc,,
pip/_vendor/idna/__pycache__/codec.cpython-36.pyc,,
pip/_vendor/idna/__pycache__/idnadata.cpython-36.pyc,,
pip/_vendor/idna/__pycache__/package_data.cpython-36.pyc,,
pip/_vendor/idna/__pycache__/core.cpython-36.pyc,,
pip/_vendor/idna/__pycache__/intranges.cpython-36.pyc,,
pip/_vendor/idna/__pycache__/__init__.cpython-36.pyc,,
pip/__pycache__/cmdoptions.cpython-36.pyc,,
pip/__pycache__/download.cpython-36.pyc,,
pip/__pycache__/__main__.cpython-36.pyc,,
pip/__pycache__/basecommand.cpython-36.pyc,,
pip/__pycache__/baseparser.cpython-36.pyc,,
pip/__pycache__/locations.cpython-36.pyc,,
pip/__pycache__/pep425tags.cpython-36.pyc,,
pip/__pycache__/status_codes.cpython-36.pyc,,
pip/__pycache__/exceptions.cpython-36.pyc,,
pip/__pycache__/index.cpython-36.pyc,,
pip/__pycache__/wheel.cpython-36.pyc,,
pip/__pycache__/__init__.cpython-36.pyc,,
pip/operations/__pycache__/check.cpython-36.pyc,,
pip/operations/__pycache__/freeze.cpython-36.pyc,,
pip/operations/__pycache__/__init__.cpython-36.pyc,,
pip/commands/__pycache__/completion.cpython-36.pyc,,
pip/commands/__pycache__/download.cpython-36.pyc,,
pip/commands/__pycache__/uninstall.cpython-36.pyc,,
pip/commands/__pycache__/check.cpython-36.pyc,,
pip/commands/__pycache__/help.cpython-36.pyc,,
pip/commands/__pycache__/list.cpython-36.pyc,,
pip/commands/__pycache__/show.cpython-36.pyc,,
pip/commands/__pycache__/search.cpython-36.pyc,,
pip/commands/__pycache__/install.cpython-36.pyc,,
pip/commands/__pycache__/hash.cpython-36.pyc,,
pip/commands/__pycache__/freeze.cpython-36.pyc,,
pip/commands/__pycache__/wheel.cpython-36.pyc,,
pip/commands/__pycache__/__init__.cpython-36.pyc,,
pip/models/__pycache__/index.cpython-36.pyc,,
pip/models/__pycache__/__init__.cpython-36.pyc,,
pip/req/__pycache__/req_file.cpython-36.pyc,,
pip/req/__pycache__/req_uninstall.cpython-36.pyc,,
pip/req/__pycache__/req_set.cpython-36.pyc,,
pip/req/__pycache__/req_install.cpython-36.pyc,,
pip/req/__pycache__/__init__.cpython-36.pyc,,
site-packages/pip-9.0.3.dist-info/metadata.json000064400000003035147511334560015101 0ustar00{"classifiers": ["Development Status :: 5 - Production/Stable", "Intended Audience :: Developers", "License :: OSI Approved :: MIT License", "Topic :: Software Development :: Build Tools", "Programming Language :: Python :: 2", "Programming Language :: Python :: 2.6", "Programming Language :: Python :: 2.7", "Programming Language :: Python :: 3", "Programming Language :: Python :: 3.3", "Programming Language :: Python :: 3.4", "Programming Language :: Python :: 3.5", "Programming Language :: Python :: Implementation :: PyPy"], "extensions": {"python.commands": {"wrap_console": {"pip": "pip:main", "pip3": "pip:main", "pip3.6": "pip:main"}}, "python.details": {"contacts": [{"email": "python-virtualenv@groups.google.com", "name": "The pip developers", "role": "author"}], "document_names": {"description": "DESCRIPTION.rst"}, "project_urls": {"Home": "https://pip.pypa.io/"}}, "python.exports": {"console_scripts": {"pip": "pip:main", "pip3": "pip:main", "pip3.6": "pip:main"}}}, "extras": ["testing"], "generator": "bdist_wheel (0.30.0)", "keywords": ["easy_install", "distutils", "setuptools", "egg", "virtualenv"], "license": "MIT", "metadata_version": "2.0", "name": "pip", "requires_python": ">=2.6,!=3.0.*,!=3.1.*,!=3.2.*", "run_requires": [{"extra": "testing", "requires": ["mock", "pretend", "pytest", "scripttest (>=1.3)", "virtualenv (>=1.10)"]}], "summary": "The PyPA recommended tool for installing Python packages.", "test_requires": [{"requires": ["mock", "pretend", "pytest", "scripttest (>=1.3)", "virtualenv (>=1.10)"]}], "version": "9.0.3"}site-packages/pip-9.0.3.dist-info/METADATA000064400000004771147511334560013541 0ustar00Metadata-Version: 2.0
Name: pip
Version: 9.0.3
Summary: The PyPA recommended tool for installing Python packages.
Home-page: https://pip.pypa.io/
Author: The pip developers
Author-email: python-virtualenv@groups.google.com
License: MIT
Keywords: easy_install distutils setuptools egg virtualenv
Platform: UNKNOWN
Classifier: Development Status :: 5 - Production/Stable
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: MIT License
Classifier: Topic :: Software Development :: Build Tools
Classifier: Programming Language :: Python :: 2
Classifier: Programming Language :: Python :: 2.6
Classifier: Programming Language :: Python :: 2.7
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.3
Classifier: Programming Language :: Python :: 3.4
Classifier: Programming Language :: Python :: 3.5
Classifier: Programming Language :: Python :: Implementation :: PyPy
Requires-Python: >=2.6,!=3.0.*,!=3.1.*,!=3.2.*
Provides-Extra: testing
Requires-Dist: pytest; extra == 'testing'
Requires-Dist: virtualenv (>=1.10); extra == 'testing'
Requires-Dist: scripttest (>=1.3); extra == 'testing'
Requires-Dist: mock; extra == 'testing'
Requires-Dist: pretend; extra == 'testing'

pip
===

The `PyPA recommended
<https://packaging.python.org/en/latest/current/>`_
tool for installing Python packages.

* `Installation <https://pip.pypa.io/en/stable/installing.html>`_
* `Documentation <https://pip.pypa.io/>`_
* `Changelog <https://pip.pypa.io/en/stable/news.html>`_
* `Github Page <https://github.com/pypa/pip>`_
* `Issue Tracking <https://github.com/pypa/pip/issues>`_
* `User mailing list <http://groups.google.com/group/python-virtualenv>`_
* `Dev mailing list <http://groups.google.com/group/pypa-dev>`_
* User IRC: #pypa on Freenode.
* Dev IRC: #pypa-dev on Freenode.


.. image:: https://img.shields.io/pypi/v/pip.svg
   :target: https://pypi.python.org/pypi/pip

.. image:: https://img.shields.io/travis/pypa/pip/master.svg
   :target: http://travis-ci.org/pypa/pip

.. image:: https://img.shields.io/appveyor/ci/pypa/pip.svg
   :target: https://ci.appveyor.com/project/pypa/pip/history

.. image:: https://readthedocs.org/projects/pip/badge/?version=stable
   :target: https://pip.pypa.io/en/stable

Code of Conduct
---------------

Everyone interacting in the pip project's codebases, issue trackers, chat
rooms, and mailing lists is expected to follow the `PyPA Code of Conduct`_.

.. _PyPA Code of Conduct: https://www.pypa.io/en/latest/code-of-conduct/


site-packages/pip-9.0.3.dist-info/WHEEL000064400000000156147511334560013216 0ustar00Wheel-Version: 1.0
Generator: bdist_wheel (0.30.0)
Root-Is-Purelib: true
Tag: py2-none-any
Tag: py3-none-any

site-packages/pip-9.0.3.dist-info/entry_points.txt000064400000000104147511334560015716 0ustar00[console_scripts]
pip = pip:main
pip3 = pip:main
pip3.6 = pip:main

site-packages/syspurpose-1.28.36-py3.6.egg-info/dependency_links.txt000064400000000001147511334560020763 0ustar00
site-packages/syspurpose-1.28.36-py3.6.egg-info/SOURCES.txt000064400000000575147511334560016610 0ustar00README.md
setup.cfg
setup.py
man/syspurpose.8
src/syspurpose/__init__.py
src/syspurpose/cli.py
src/syspurpose/files.py
src/syspurpose/i18n.py
src/syspurpose/main.py
src/syspurpose/utils.py
src/syspurpose.egg-info/PKG-INFO
src/syspurpose.egg-info/SOURCES.txt
src/syspurpose.egg-info/dependency_links.txt
src/syspurpose.egg-info/entry_points.txt
src/syspurpose.egg-info/top_level.txtsite-packages/syspurpose-1.28.36-py3.6.egg-info/entry_points.txt000064400000000065147511334560020214 0ustar00[console_scripts]
syspurpose = syspurpose.main:main

site-packages/syspurpose-1.28.36-py3.6.egg-info/PKG-INFO000064400000000367147511334560016020 0ustar00Metadata-Version: 1.0
Name: syspurpose
Version: 1.28.36
Summary: Manage Red Hat System Purpose
Home-page: http://www.candlepinproject.org
Author: Chris Snyder
Author-email: chainsaw@redhat.com
License: GPLv2
Description: UNKNOWN
Platform: UNKNOWN
site-packages/syspurpose-1.28.36-py3.6.egg-info/top_level.txt000064400000000013147511334560017441 0ustar00syspurpose
site-packages/slip-0.6.4-py3.6.egg-info000064400000000304147511334560013176 0ustar00Metadata-Version: 1.1
Name: slip
Version: 0.6.4
Summary: UNKNOWN
Home-page: UNKNOWN
Author: UNKNOWN
Author-email: UNKNOWN
License: UNKNOWN
Description: UNKNOWN
Platform: UNKNOWN
Requires: selinux
site-packages/pyparsing.py000064400000700753147511334560011677 0ustar00# module pyparsing.py
#
# Copyright (c) 2003-2016  Paul T. McGuire
#
# Permission is hereby granted, free of charge, to any person obtaining
# a copy of this software and associated documentation files (the
# "Software"), to deal in the Software without restriction, including
# without limitation the rights to use, copy, modify, merge, publish,
# distribute, sublicense, and/or sell copies of the Software, and to
# permit persons to whom the Software is furnished to do so, subject to
# the following conditions:
#
# The above copyright notice and this permission notice shall be
# included in all copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
# EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
# MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
# IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY
# CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT,
# TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE
# SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
#

__doc__ = \
"""
pyparsing module - Classes and methods to define and execute parsing grammars

The pyparsing module is an alternative approach to creating and executing simple grammars,
vs. the traditional lex/yacc approach, or the use of regular expressions.  With pyparsing, you
don't need to learn a new syntax for defining grammars or matching expressions - the parsing module
provides a library of classes that you use to construct the grammar directly in Python.

Here is a program to parse "Hello, World!" (or any greeting of the form 
C{"<salutation>, <addressee>!"}), built up using L{Word}, L{Literal}, and L{And} elements 
(L{'+'<ParserElement.__add__>} operator gives L{And} expressions, strings are auto-converted to
L{Literal} expressions)::

    from pyparsing import Word, alphas

    # define grammar of a greeting
    greet = Word(alphas) + "," + Word(alphas) + "!"

    hello = "Hello, World!"
    print (hello, "->", greet.parseString(hello))

The program outputs the following::

    Hello, World! -> ['Hello', ',', 'World', '!']

The Python representation of the grammar is quite readable, owing to the self-explanatory
class names, and the use of '+', '|' and '^' operators.

The L{ParseResults} object returned from L{ParserElement.parseString<ParserElement.parseString>} can be accessed as a nested list, a dictionary, or an
object with named attributes.

The pyparsing module handles some of the problems that are typically vexing when writing text parsers:
 - extra or missing whitespace (the above program will also handle "Hello,World!", "Hello  ,  World  !", etc.)
 - quoted strings
 - embedded comments
"""

__version__ = "2.1.10"
__versionTime__ = "07 Oct 2016 01:31 UTC"
__author__ = "Paul McGuire <ptmcg@users.sourceforge.net>"

import string
from weakref import ref as wkref
import copy
import sys
import warnings
import re
import sre_constants
import collections
import pprint
import traceback
import types
from datetime import datetime

try:
    from _thread import RLock
except ImportError:
    from threading import RLock

try:
    from collections import OrderedDict as _OrderedDict
except ImportError:
    try:
        from ordereddict import OrderedDict as _OrderedDict
    except ImportError:
        _OrderedDict = None

#~ sys.stderr.write( "testing pyparsing module, version %s, %s\n" % (__version__,__versionTime__ ) )

__all__ = [
'And', 'CaselessKeyword', 'CaselessLiteral', 'CharsNotIn', 'Combine', 'Dict', 'Each', 'Empty',
'FollowedBy', 'Forward', 'GoToColumn', 'Group', 'Keyword', 'LineEnd', 'LineStart', 'Literal',
'MatchFirst', 'NoMatch', 'NotAny', 'OneOrMore', 'OnlyOnce', 'Optional', 'Or',
'ParseBaseException', 'ParseElementEnhance', 'ParseException', 'ParseExpression', 'ParseFatalException',
'ParseResults', 'ParseSyntaxException', 'ParserElement', 'QuotedString', 'RecursiveGrammarException',
'Regex', 'SkipTo', 'StringEnd', 'StringStart', 'Suppress', 'Token', 'TokenConverter', 
'White', 'Word', 'WordEnd', 'WordStart', 'ZeroOrMore',
'alphanums', 'alphas', 'alphas8bit', 'anyCloseTag', 'anyOpenTag', 'cStyleComment', 'col',
'commaSeparatedList', 'commonHTMLEntity', 'countedArray', 'cppStyleComment', 'dblQuotedString',
'dblSlashComment', 'delimitedList', 'dictOf', 'downcaseTokens', 'empty', 'hexnums',
'htmlComment', 'javaStyleComment', 'line', 'lineEnd', 'lineStart', 'lineno',
'makeHTMLTags', 'makeXMLTags', 'matchOnlyAtCol', 'matchPreviousExpr', 'matchPreviousLiteral',
'nestedExpr', 'nullDebugAction', 'nums', 'oneOf', 'opAssoc', 'operatorPrecedence', 'printables',
'punc8bit', 'pythonStyleComment', 'quotedString', 'removeQuotes', 'replaceHTMLEntity', 
'replaceWith', 'restOfLine', 'sglQuotedString', 'srange', 'stringEnd',
'stringStart', 'traceParseAction', 'unicodeString', 'upcaseTokens', 'withAttribute',
'indentedBlock', 'originalTextFor', 'ungroup', 'infixNotation','locatedExpr', 'withClass',
'CloseMatch', 'tokenMap', 'pyparsing_common',
]

system_version = tuple(sys.version_info)[:3]
PY_3 = system_version[0] == 3
if PY_3:
    _MAX_INT = sys.maxsize
    basestring = str
    unichr = chr
    _ustr = str

    # build list of single arg builtins, that can be used as parse actions
    singleArgBuiltins = [sum, len, sorted, reversed, list, tuple, set, any, all, min, max]

else:
    _MAX_INT = sys.maxint
    range = xrange

    def _ustr(obj):
        """Drop-in replacement for str(obj) that tries to be Unicode friendly. It first tries
           str(obj). If that fails with a UnicodeEncodeError, then it tries unicode(obj). It
           then encodes it with the default encoding, turning any characters it cannot encode into backslash-u escape sequences.
        """
        if isinstance(obj,unicode):
            return obj

        try:
            # If this works, then _ustr(obj) has the same behaviour as str(obj), so
            # it won't break any existing code.
            return str(obj)

        except UnicodeEncodeError:
            # Else encode it
            ret = unicode(obj).encode(sys.getdefaultencoding(), 'xmlcharrefreplace')
            xmlcharref = Regex('&#\d+;')
            xmlcharref.setParseAction(lambda t: '\\u' + hex(int(t[0][2:-1]))[2:])
            return xmlcharref.transformString(ret)

    # build list of single arg builtins, tolerant of Python version, that can be used as parse actions
    singleArgBuiltins = []
    import __builtin__
    for fname in "sum len sorted reversed list tuple set any all min max".split():
        try:
            singleArgBuiltins.append(getattr(__builtin__,fname))
        except AttributeError:
            continue
            
_generatorType = type((y for y in range(1)))
 
def _xml_escape(data):
    """Escape &, <, >, ", ', etc. in a string of data."""

    # ampersand must be replaced first
    from_symbols = '&><"\''
    to_symbols = ('&'+s+';' for s in "amp gt lt quot apos".split())
    for from_,to_ in zip(from_symbols, to_symbols):
        data = data.replace(from_, to_)
    return data

class _Constants(object):
    pass

alphas     = string.ascii_uppercase + string.ascii_lowercase
nums       = "0123456789"
hexnums    = nums + "ABCDEFabcdef"
alphanums  = alphas + nums
_bslash    = chr(92)
printables = "".join(c for c in string.printable if c not in string.whitespace)

class ParseBaseException(Exception):
    """base exception class for all parsing runtime exceptions"""
    # Performance tuning: we construct a *lot* of these, so keep this
    # constructor as small and fast as possible
    def __init__( self, pstr, loc=0, msg=None, elem=None ):
        self.loc = loc
        if msg is None:
            self.msg = pstr
            self.pstr = ""
        else:
            self.msg = msg
            self.pstr = pstr
        self.parserElement = elem
        self.args = (pstr, loc, msg)

    @classmethod
    def _from_exception(cls, pe):
        """
        internal factory method to simplify creating one type of ParseException 
        from another - avoids having __init__ signature conflicts among subclasses
        """
        return cls(pe.pstr, pe.loc, pe.msg, pe.parserElement)

    def __getattr__( self, aname ):
        """supported attributes by name are:
            - lineno - returns the line number of the exception text
            - col - returns the column number of the exception text
            - line - returns the line containing the exception text
        """
        if( aname == "lineno" ):
            return lineno( self.loc, self.pstr )
        elif( aname in ("col", "column") ):
            return col( self.loc, self.pstr )
        elif( aname == "line" ):
            return line( self.loc, self.pstr )
        else:
            raise AttributeError(aname)

    def __str__( self ):
        return "%s (at char %d), (line:%d, col:%d)" % \
                ( self.msg, self.loc, self.lineno, self.column )
    def __repr__( self ):
        return _ustr(self)
    def markInputline( self, markerString = ">!<" ):
        """Extracts the exception line from the input string, and marks
           the location of the exception with a special symbol.
        """
        line_str = self.line
        line_column = self.column - 1
        if markerString:
            line_str = "".join((line_str[:line_column],
                                markerString, line_str[line_column:]))
        return line_str.strip()
    def __dir__(self):
        return "lineno col line".split() + dir(type(self))

class ParseException(ParseBaseException):
    """
    Exception thrown when parse expressions don't match the input string;
    supported attributes by name are:
     - lineno - returns the line number of the exception text
     - col - returns the column number of the exception text
     - line - returns the line containing the exception text
        
    Example::
        try:
            Word(nums).setName("integer").parseString("ABC")
        except ParseException as pe:
            print(pe)
            print("column: {}".format(pe.col))
            
    prints::
        Expected integer (at char 0), (line:1, col:1)
        column: 1
    """
    pass

class ParseFatalException(ParseBaseException):
    """user-throwable exception thrown when inconsistent parse content
       is found; stops all parsing immediately"""
    pass

class ParseSyntaxException(ParseFatalException):
    """just like L{ParseFatalException}, but thrown internally when an
       L{ErrorStop<And._ErrorStop>} ('-' operator) indicates that parsing is to stop 
       immediately because an unbacktrackable syntax error has been found"""
    pass

#~ class ReparseException(ParseBaseException):
    #~ """Experimental class - parse actions can raise this exception to cause
       #~ pyparsing to reparse the input string:
        #~ - with a modified input string, and/or
        #~ - with a modified start location
       #~ Set the values of the ReparseException in the constructor, and raise the
       #~ exception in a parse action to cause pyparsing to use the new string/location.
       #~ Setting the values as None causes no change to be made.
       #~ """
    #~ def __init_( self, newstring, restartLoc ):
        #~ self.newParseText = newstring
        #~ self.reparseLoc = restartLoc

class RecursiveGrammarException(Exception):
    """exception thrown by L{ParserElement.validate} if the grammar could be improperly recursive"""
    def __init__( self, parseElementList ):
        self.parseElementTrace = parseElementList

    def __str__( self ):
        return "RecursiveGrammarException: %s" % self.parseElementTrace

class _ParseResultsWithOffset(object):
    def __init__(self,p1,p2):
        self.tup = (p1,p2)
    def __getitem__(self,i):
        return self.tup[i]
    def __repr__(self):
        return repr(self.tup[0])
    def setOffset(self,i):
        self.tup = (self.tup[0],i)

class ParseResults(object):
    """
    Structured parse results, to provide multiple means of access to the parsed data:
       - as a list (C{len(results)})
       - by list index (C{results[0], results[1]}, etc.)
       - by attribute (C{results.<resultsName>} - see L{ParserElement.setResultsName})

    Example::
        integer = Word(nums)
        date_str = (integer.setResultsName("year") + '/' 
                        + integer.setResultsName("month") + '/' 
                        + integer.setResultsName("day"))
        # equivalent form:
        # date_str = integer("year") + '/' + integer("month") + '/' + integer("day")

        # parseString returns a ParseResults object
        result = date_str.parseString("1999/12/31")

        def test(s, fn=repr):
            print("%s -> %s" % (s, fn(eval(s))))
        test("list(result)")
        test("result[0]")
        test("result['month']")
        test("result.day")
        test("'month' in result")
        test("'minutes' in result")
        test("result.dump()", str)
    prints::
        list(result) -> ['1999', '/', '12', '/', '31']
        result[0] -> '1999'
        result['month'] -> '12'
        result.day -> '31'
        'month' in result -> True
        'minutes' in result -> False
        result.dump() -> ['1999', '/', '12', '/', '31']
        - day: 31
        - month: 12
        - year: 1999
    """
    def __new__(cls, toklist=None, name=None, asList=True, modal=True ):
        if isinstance(toklist, cls):
            return toklist
        retobj = object.__new__(cls)
        retobj.__doinit = True
        return retobj

    # Performance tuning: we construct a *lot* of these, so keep this
    # constructor as small and fast as possible
    def __init__( self, toklist=None, name=None, asList=True, modal=True, isinstance=isinstance ):
        if self.__doinit:
            self.__doinit = False
            self.__name = None
            self.__parent = None
            self.__accumNames = {}
            self.__asList = asList
            self.__modal = modal
            if toklist is None:
                toklist = []
            if isinstance(toklist, list):
                self.__toklist = toklist[:]
            elif isinstance(toklist, _generatorType):
                self.__toklist = list(toklist)
            else:
                self.__toklist = [toklist]
            self.__tokdict = dict()

        if name is not None and name:
            if not modal:
                self.__accumNames[name] = 0
            if isinstance(name,int):
                name = _ustr(name) # will always return a str, but use _ustr for consistency
            self.__name = name
            if not (isinstance(toklist, (type(None), basestring, list)) and toklist in (None,'',[])):
                if isinstance(toklist,basestring):
                    toklist = [ toklist ]
                if asList:
                    if isinstance(toklist,ParseResults):
                        self[name] = _ParseResultsWithOffset(toklist.copy(),0)
                    else:
                        self[name] = _ParseResultsWithOffset(ParseResults(toklist[0]),0)
                    self[name].__name = name
                else:
                    try:
                        self[name] = toklist[0]
                    except (KeyError,TypeError,IndexError):
                        self[name] = toklist

    def __getitem__( self, i ):
        if isinstance( i, (int,slice) ):
            return self.__toklist[i]
        else:
            if i not in self.__accumNames:
                return self.__tokdict[i][-1][0]
            else:
                return ParseResults([ v[0] for v in self.__tokdict[i] ])

    def __setitem__( self, k, v, isinstance=isinstance ):
        if isinstance(v,_ParseResultsWithOffset):
            self.__tokdict[k] = self.__tokdict.get(k,list()) + [v]
            sub = v[0]
        elif isinstance(k,(int,slice)):
            self.__toklist[k] = v
            sub = v
        else:
            self.__tokdict[k] = self.__tokdict.get(k,list()) + [_ParseResultsWithOffset(v,0)]
            sub = v
        if isinstance(sub,ParseResults):
            sub.__parent = wkref(self)

    def __delitem__( self, i ):
        if isinstance(i,(int,slice)):
            mylen = len( self.__toklist )
            del self.__toklist[i]

            # convert int to slice
            if isinstance(i, int):
                if i < 0:
                    i += mylen
                i = slice(i, i+1)
            # get removed indices
            removed = list(range(*i.indices(mylen)))
            removed.reverse()
            # fixup indices in token dictionary
            for name,occurrences in self.__tokdict.items():
                for j in removed:
                    for k, (value, position) in enumerate(occurrences):
                        occurrences[k] = _ParseResultsWithOffset(value, position - (position > j))
        else:
            del self.__tokdict[i]

    def __contains__( self, k ):
        return k in self.__tokdict

    def __len__( self ): return len( self.__toklist )
    def __bool__(self): return ( not not self.__toklist )
    __nonzero__ = __bool__
    def __iter__( self ): return iter( self.__toklist )
    def __reversed__( self ): return iter( self.__toklist[::-1] )
    def _iterkeys( self ):
        if hasattr(self.__tokdict, "iterkeys"):
            return self.__tokdict.iterkeys()
        else:
            return iter(self.__tokdict)

    def _itervalues( self ):
        return (self[k] for k in self._iterkeys())
            
    def _iteritems( self ):
        return ((k, self[k]) for k in self._iterkeys())

    if PY_3:
        keys = _iterkeys       
        """Returns an iterator of all named result keys (Python 3.x only)."""

        values = _itervalues
        """Returns an iterator of all named result values (Python 3.x only)."""

        items = _iteritems
        """Returns an iterator of all named result key-value tuples (Python 3.x only)."""

    else:
        iterkeys = _iterkeys
        """Returns an iterator of all named result keys (Python 2.x only)."""

        itervalues = _itervalues
        """Returns an iterator of all named result values (Python 2.x only)."""

        iteritems = _iteritems
        """Returns an iterator of all named result key-value tuples (Python 2.x only)."""

        def keys( self ):
            """Returns all named result keys (as a list in Python 2.x, as an iterator in Python 3.x)."""
            return list(self.iterkeys())

        def values( self ):
            """Returns all named result values (as a list in Python 2.x, as an iterator in Python 3.x)."""
            return list(self.itervalues())
                
        def items( self ):
            """Returns all named result key-values (as a list of tuples in Python 2.x, as an iterator in Python 3.x)."""
            return list(self.iteritems())

    def haskeys( self ):
        """Since keys() returns an iterator, this method is helpful in bypassing
           code that looks for the existence of any defined results names."""
        return bool(self.__tokdict)
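
    # Illustrative sketch (editor addition): haskeys() distinguishes results
    # that carry named fields from plain token lists.
    #
    #     Word(nums).parseString("42").haskeys()            # -> False
    #     Word(nums)("value").parseString("42").haskeys()   # -> True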
        
    def pop( self, *args, **kwargs):
        """
        Removes and returns item at specified index (default=C{last}).
        Supports both C{list} and C{dict} semantics for C{pop()}. If passed no
        argument or an integer argument, it will use C{list} semantics
        and pop tokens from the list of parsed tokens. If passed a 
        non-integer argument (most likely a string), it will use C{dict}
        semantics and pop the corresponding value from any defined 
        results names. A second default return value argument is 
        supported, just as in C{dict.pop()}.

        Example::
            def remove_first(tokens):
                tokens.pop(0)
            print(OneOrMore(Word(nums)).parseString("0 123 321")) # -> ['0', '123', '321']
            print(OneOrMore(Word(nums)).addParseAction(remove_first).parseString("0 123 321")) # -> ['123', '321']

            label = Word(alphas)
            patt = label("LABEL") + OneOrMore(Word(nums))
            print(patt.parseString("AAB 123 321").dump())

            # Use pop() in a parse action to remove named result (note that corresponding value is not
            # removed from list form of results)
            def remove_LABEL(tokens):
                tokens.pop("LABEL")
                return tokens
            patt.addParseAction(remove_LABEL)
            print(patt.parseString("AAB 123 321").dump())
        prints::
            ['AAB', '123', '321']
            - LABEL: AAB

            ['AAB', '123', '321']
        """
        if not args:
            args = [-1]
        for k,v in kwargs.items():
            if k == 'default':
                args = (args[0], v)
            else:
                raise TypeError("pop() got an unexpected keyword argument '%s'" % k)
        if (isinstance(args[0], int) or 
                        len(args) == 1 or 
                        args[0] in self):
            index = args[0]
            ret = self[index]
            del self[index]
            return ret
        else:
            defaultvalue = args[1]
            return defaultvalue

    def get(self, key, defaultValue=None):
        """
        Returns named result matching the given key, or if there is no
        such name, then returns the given C{defaultValue} or C{None} if no
        C{defaultValue} is specified.

        Similar to C{dict.get()}.
        
        Example::
            integer = Word(nums)
            date_str = integer("year") + '/' + integer("month") + '/' + integer("day")           

            result = date_str.parseString("1999/12/31")
            print(result.get("year")) # -> '1999'
            print(result.get("hour", "not specified")) # -> 'not specified'
            print(result.get("hour")) # -> None
        """
        if key in self:
            return self[key]
        else:
            return defaultValue

    def insert( self, index, insStr ):
        """
        Inserts new element at location index in the list of parsed tokens.
        
        Similar to C{list.insert()}.

        Example::
            print(OneOrMore(Word(nums)).parseString("0 123 321")) # -> ['0', '123', '321']

            # use a parse action to insert the parse location in the front of the parsed results
            def insert_locn(locn, tokens):
                tokens.insert(0, locn)
            print(OneOrMore(Word(nums)).addParseAction(insert_locn).parseString("0 123 321")) # -> [0, '0', '123', '321']
        """
        self.__toklist.insert(index, insStr)
        # fixup indices in token dictionary
        for name,occurrences in self.__tokdict.items():
            for k, (value, position) in enumerate(occurrences):
                occurrences[k] = _ParseResultsWithOffset(value, position + (position > index))

    def append( self, item ):
        """
        Add single element to end of ParseResults list of elements.

        Example::
            print(OneOrMore(Word(nums)).parseString("0 123 321")) # -> ['0', '123', '321']
            
            # use a parse action to compute the sum of the parsed integers, and add it to the end
            def append_sum(tokens):
                tokens.append(sum(map(int, tokens)))
            print(OneOrMore(Word(nums)).addParseAction(append_sum).parseString("0 123 321")) # -> ['0', '123', '321', 444]
        """
        self.__toklist.append(item)

    def extend( self, itemseq ):
        """
        Add sequence of elements to end of ParseResults list of elements.

        Example::
            patt = OneOrMore(Word(alphas))
            
            # use a parse action to append the reverse of the matched strings, to make a palindrome
            def make_palindrome(tokens):
                tokens.extend(reversed([t[::-1] for t in tokens]))
                return ''.join(tokens)
            print(patt.addParseAction(make_palindrome).parseString("lskdj sdlkjf lksd")) # -> 'lskdjsdlkjflksddsklfjkldsjdksl'
        """
        if isinstance(itemseq, ParseResults):
            self += itemseq
        else:
            self.__toklist.extend(itemseq)

    def clear( self ):
        """
        Clear all elements and results names.
        """
        del self.__toklist[:]
        self.__tokdict.clear()

    def __getattr__( self, name ):
        try:
            return self[name]
        except KeyError:
            return ""

    def __add__( self, other ):
        ret = self.copy()
        ret += other
        return ret

    def __iadd__( self, other ):
        if other.__tokdict:
            offset = len(self.__toklist)
            addoffset = lambda a: offset if a<0 else a+offset
            otheritems = other.__tokdict.items()
            otherdictitems = [(k, _ParseResultsWithOffset(v[0],addoffset(v[1])) )
                                for (k,vlist) in otheritems for v in vlist]
            for k,v in otherdictitems:
                self[k] = v
                if isinstance(v[0],ParseResults):
                    v[0].__parent = wkref(self)
            
        self.__toklist += other.__toklist
        self.__accumNames.update( other.__accumNames )
        return self

    def __radd__(self, other):
        if isinstance(other,int) and other == 0:
            # useful for merging many ParseResults using sum() builtin
            return self.copy()
        else:
            # this may raise a TypeError - so be it
            return other + self
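
    # Illustrative sketch (editor addition): because of __radd__ above,
    # sum() can merge a list of ParseResults.
    #
    #     pieces = [Word(nums).parseString(s) for s in ("1", "2", "3")]
    #     merged = sum(pieces)     # 0 + first result goes through __radd__
    #     # list(merged) -> ['1', '2', '3']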
        
    def __repr__( self ):
        return "(%s, %s)" % ( repr( self.__toklist ), repr( self.__tokdict ) )

    def __str__( self ):
        return '[' + ', '.join(_ustr(i) if isinstance(i, ParseResults) else repr(i) for i in self.__toklist) + ']'

    def _asStringList( self, sep='' ):
        out = []
        for item in self.__toklist:
            if out and sep:
                out.append(sep)
            if isinstance( item, ParseResults ):
                out += item._asStringList()
            else:
                out.append( _ustr(item) )
        return out

    def asList( self ):
        """
        Returns the parse results as a nested list of matching tokens, all converted to strings.

        Example::
            patt = OneOrMore(Word(alphas))
            result = patt.parseString("sldkj lsdkj sldkj")
            # even though the result prints in string-like form, it is actually a pyparsing ParseResults
            print(type(result), result) # -> <class 'pyparsing.ParseResults'> ['sldkj', 'lsdkj', 'sldkj']
            
            # Use asList() to create an actual list
            result_list = result.asList()
            print(type(result_list), result_list) # -> <class 'list'> ['sldkj', 'lsdkj', 'sldkj']
        """
        return [res.asList() if isinstance(res,ParseResults) else res for res in self.__toklist]

    def asDict( self ):
        """
        Returns the named parse results as a nested dictionary.

        Example::
            integer = Word(nums)
            date_str = integer("year") + '/' + integer("month") + '/' + integer("day")
            
            result = date_str.parseString('12/31/1999')
            print(type(result), repr(result)) # -> <class 'pyparsing.ParseResults'> (['12', '/', '31', '/', '1999'], {'day': [('1999', 4)], 'year': [('12', 0)], 'month': [('31', 2)]})
            
            result_dict = result.asDict()
            print(type(result_dict), repr(result_dict)) # -> <class 'dict'> {'day': '1999', 'year': '12', 'month': '31'}

            # even though a ParseResults supports dict-like access, sometimes you just need to have a dict
            import json
            print(json.dumps(result)) # -> Exception: TypeError: ... is not JSON serializable
            print(json.dumps(result.asDict())) # -> {"month": "31", "day": "1999", "year": "12"}
        """
        if PY_3:
            item_fn = self.items
        else:
            item_fn = self.iteritems
            
        def toItem(obj):
            if isinstance(obj, ParseResults):
                if obj.haskeys():
                    return obj.asDict()
                else:
                    return [toItem(v) for v in obj]
            else:
                return obj
                
        return dict((k,toItem(v)) for k,v in item_fn())

    def copy( self ):
        """
        Returns a new copy of a C{ParseResults} object.
        """
        ret = ParseResults( self.__toklist )
        ret.__tokdict = self.__tokdict.copy()
        ret.__parent = self.__parent
        ret.__accumNames.update( self.__accumNames )
        ret.__name = self.__name
        return ret

    def asXML( self, doctag=None, namedItemsOnly=False, indent="", formatted=True ):
        """
        (Deprecated) Returns the parse results as XML. Tags are created for tokens and lists that have defined results names.
        """
        nl = "\n"
        out = []
        namedItems = dict((v[1],k) for (k,vlist) in self.__tokdict.items()
                                                            for v in vlist)
        nextLevelIndent = indent + "  "

        # collapse out indents if formatting is not desired
        if not formatted:
            indent = ""
            nextLevelIndent = ""
            nl = ""

        selfTag = None
        if doctag is not None:
            selfTag = doctag
        else:
            if self.__name:
                selfTag = self.__name

        if not selfTag:
            if namedItemsOnly:
                return ""
            else:
                selfTag = "ITEM"

        out += [ nl, indent, "<", selfTag, ">" ]

        for i,res in enumerate(self.__toklist):
            if isinstance(res,ParseResults):
                if i in namedItems:
                    out += [ res.asXML(namedItems[i],
                                        namedItemsOnly and doctag is None,
                                        nextLevelIndent,
                                        formatted)]
                else:
                    out += [ res.asXML(None,
                                        namedItemsOnly and doctag is None,
                                        nextLevelIndent,
                                        formatted)]
            else:
                # individual token, see if there is a name for it
                resTag = None
                if i in namedItems:
                    resTag = namedItems[i]
                if not resTag:
                    if namedItemsOnly:
                        continue
                    else:
                        resTag = "ITEM"
                xmlBodyText = _xml_escape(_ustr(res))
                out += [ nl, nextLevelIndent, "<", resTag, ">",
                                                xmlBodyText,
                                                "</", resTag, ">" ]

        out += [ nl, indent, "</", selfTag, ">" ]
        return "".join(out)

    def __lookup(self,sub):
        for k,vlist in self.__tokdict.items():
            for v,loc in vlist:
                if sub is v:
                    return k
        return None

    def getName(self):
        """
        Returns the results name for this token expression. Useful when several 
        different expressions might match at a particular location.

        Example::
            integer = Word(nums)
            ssn_expr = Regex(r"\d\d\d-\d\d-\d\d\d\d")
            house_number_expr = Suppress('#') + Word(nums, alphanums)
            user_data = (Group(house_number_expr)("house_number") 
                        | Group(ssn_expr)("ssn")
                        | Group(integer)("age"))
            user_info = OneOrMore(user_data)
            
            result = user_info.parseString("22 111-22-3333 #221B")
            for item in result:
                print(item.getName(), ':', item[0])
        prints::
            age : 22
            ssn : 111-22-3333
            house_number : 221B
        """
        if self.__name:
            return self.__name
        elif self.__parent:
            par = self.__parent()
            if par:
                return par.__lookup(self)
            else:
                return None
        elif (len(self) == 1 and
               len(self.__tokdict) == 1 and
               next(iter(self.__tokdict.values()))[0][1] in (0,-1)):
            return next(iter(self.__tokdict.keys()))
        else:
            return None

    def dump(self, indent='', depth=0, full=True):
        """
        Diagnostic method for listing out the contents of a C{ParseResults}.
        Accepts an optional C{indent} argument so that this string can be embedded
        in a nested display of other data.

        Example::
            integer = Word(nums)
            date_str = integer("year") + '/' + integer("month") + '/' + integer("day")
            
            result = date_str.parseString('12/31/1999')
            print(result.dump())
        prints::
            ['12', '/', '31', '/', '1999']
            - day: 1999
            - month: 31
            - year: 12
        """
        out = []
        NL = '\n'
        out.append( indent+_ustr(self.asList()) )
        if full:
            if self.haskeys():
                items = sorted((str(k), v) for k,v in self.items())
                for k,v in items:
                    if out:
                        out.append(NL)
                    out.append( "%s%s- %s: " % (indent,('  '*depth), k) )
                    if isinstance(v,ParseResults):
                        if v:
                            out.append( v.dump(indent,depth+1) )
                        else:
                            out.append(_ustr(v))
                    else:
                        out.append(repr(v))
            elif any(isinstance(vv,ParseResults) for vv in self):
                v = self
                for i,vv in enumerate(v):
                    if isinstance(vv,ParseResults):
                        out.append("\n%s%s[%d]:\n%s%s%s" % (indent,('  '*(depth)),i,indent,('  '*(depth+1)),vv.dump(indent,depth+1) ))
                    else:
                        out.append("\n%s%s[%d]:\n%s%s%s" % (indent,('  '*(depth)),i,indent,('  '*(depth+1)),_ustr(vv)))
            
        return "".join(out)

    def pprint(self, *args, **kwargs):
        """
        Pretty-printer for parsed results as a list, using the C{pprint} module.
        Accepts additional positional or keyword args as defined for the 
        C{pprint.pprint} method. (U{http://docs.python.org/3/library/pprint.html#pprint.pprint})

        Example::
            ident = Word(alphas, alphanums)
            num = Word(nums)
            func = Forward()
            term = ident | num | Group('(' + func + ')')
            func <<= ident + Group(Optional(delimitedList(term)))
            result = func.parseString("fna a,b,(fnb c,d,200),100")
            result.pprint(width=40)
        prints::
            ['fna',
             ['a',
              'b',
              ['(', 'fnb', ['c', 'd', '200'], ')'],
              '100']]
        """
        pprint.pprint(self.asList(), *args, **kwargs)

    # add support for pickle protocol
    def __getstate__(self):
        return ( self.__toklist,
                 ( self.__tokdict.copy(),
                   self.__parent is not None and self.__parent() or None,
                   self.__accumNames,
                   self.__name ) )

    def __setstate__(self,state):
        self.__toklist = state[0]
        (self.__tokdict,
         par,
         inAccumNames,
         self.__name) = state[1]
        self.__accumNames = {}
        self.__accumNames.update(inAccumNames)
        if par is not None:
            self.__parent = wkref(par)
        else:
            self.__parent = None
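
    # Illustrative sketch (not part of the library): because __getstate__ / __setstate__
    # are defined above, ParseResults objects can round-trip through pickle, e.g.:
    #   import pickle
    #   result = (Word(nums)("num") + Word(alphas)("word")).parseString("123 abc")
    #   restored = pickle.loads(pickle.dumps(result))
    #   assert restored.asList() == result.asList() and restored["num"] == "123"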

    def __getnewargs__(self):
        return self.__toklist, self.__name, self.__asList, self.__modal

    def __dir__(self):
        return (dir(type(self)) + list(self.keys()))

collections.MutableMapping.register(ParseResults)

def col (loc,strg):
    """Returns current column within a string, counting newlines as line separators.
   The first column is number 1.

   Note: the default parsing behavior is to expand tabs in the input string
   before starting the parsing process.  See L{I{ParserElement.parseString}<ParserElement.parseString>} for more information
   on parsing strings containing C{<TAB>}s, and suggested methods to maintain a
   consistent view of the parsed string, the parse location, and line and column
   positions within the parsed string.
   """
    s = strg
    return 1 if 0<loc<len(s) and s[loc-1] == '\n' else loc - s.rfind("\n", 0, loc)

def lineno(loc,strg):
    """Returns current line number within a string, counting newlines as line separators.
   The first line is number 1.

   Note: the default parsing behavior is to expand tabs in the input string
   before starting the parsing process.  See L{I{ParserElement.parseString}<ParserElement.parseString>} for more information
   on parsing strings containing C{<TAB>}s, and suggested methods to maintain a
   consistent view of the parsed string, the parse location, and line and column
   positions within the parsed string.
   """
    return strg.count("\n",0,loc) + 1

def line( loc, strg ):
    """Returns the line of text containing loc within a string, counting newlines as line separators.
       """
    lastCR = strg.rfind("\n", 0, loc)
    nextCR = strg.find("\n", loc)
    if nextCR >= 0:
        return strg[lastCR+1:nextCR]
    else:
        return strg[lastCR+1:]
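
# Illustrative sketch (not part of the library): a small worked example of the three
# helpers above.  Given strg = "ab\ncd" and loc = 4 (pointing at 'd'):
#   lineno(4, strg) -> 2      # 'd' is on the second line
#   col(4, strg)    -> 2      # 'd' is the second column of that line
#   line(4, strg)   -> "cd"   # the full text of the line containing loc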

def _defaultStartDebugAction( instring, loc, expr ):
    print (("Match " + _ustr(expr) + " at loc " + _ustr(loc) + "(%d,%d)" % ( lineno(loc,instring), col(loc,instring) )))

def _defaultSuccessDebugAction( instring, startloc, endloc, expr, toks ):
    print ("Matched " + _ustr(expr) + " -> " + str(toks.asList()))

def _defaultExceptionDebugAction( instring, loc, expr, exc ):
    print ("Exception raised:" + _ustr(exc))

def nullDebugAction(*args):
    """'Do-nothing' debug action, to suppress debugging output during parsing."""
    pass

# Only works on Python 3.x - nonlocal is toxic to Python 2 installs
#~ 'decorator to trim function calls to match the arity of the target'
#~ def _trim_arity(func, maxargs=3):
    #~ if func in singleArgBuiltins:
        #~ return lambda s,l,t: func(t)
    #~ limit = 0
    #~ foundArity = False
    #~ def wrapper(*args):
        #~ nonlocal limit,foundArity
        #~ while 1:
            #~ try:
                #~ ret = func(*args[limit:])
                #~ foundArity = True
                #~ return ret
            #~ except TypeError:
                #~ if limit == maxargs or foundArity:
                    #~ raise
                #~ limit += 1
                #~ continue
    #~ return wrapper

# this version is Python 2.x-3.x cross-compatible
# decorator to trim function calls to match the arity of the target
def _trim_arity(func, maxargs=2):
    if func in singleArgBuiltins:
        return lambda s,l,t: func(t)
    limit = [0]
    foundArity = [False]
    
    # traceback return data structure changed in Py3.5 - normalize back to plain tuples
    if system_version[:2] >= (3,5):
        def extract_stack(limit=0):
            # special handling for Python 3.5.0 - extra deep call stack by 1
            offset = -3 if system_version == (3,5,0) else -2
            frame_summary = traceback.extract_stack(limit=-offset+limit-1)[offset]
            return [(frame_summary.filename, frame_summary.lineno)]
        def extract_tb(tb, limit=0):
            frames = traceback.extract_tb(tb, limit=limit)
            frame_summary = frames[-1]
            return [(frame_summary.filename, frame_summary.lineno)]
    else:
        extract_stack = traceback.extract_stack
        extract_tb = traceback.extract_tb
    
    # synthesize what would be returned by traceback.extract_stack at the call to 
    # user's parse action 'func', so that we don't incur call penalty at parse time
    
    LINE_DIFF = 6
    # IF ANY CODE CHANGES, EVEN JUST COMMENTS OR BLANK LINES, BETWEEN THE NEXT LINE AND 
    # THE CALL TO FUNC INSIDE WRAPPER, LINE_DIFF MUST BE MODIFIED!!!!
    this_line = extract_stack(limit=2)[-1]
    pa_call_line_synth = (this_line[0], this_line[1]+LINE_DIFF)

    def wrapper(*args):
        while 1:
            try:
                ret = func(*args[limit[0]:])
                foundArity[0] = True
                return ret
            except TypeError:
                # re-raise TypeErrors if they did not come from our arity testing
                if foundArity[0]:
                    raise
                else:
                    try:
                        tb = sys.exc_info()[-1]
                        if not extract_tb(tb, limit=2)[-1][:2] == pa_call_line_synth:
                            raise
                    finally:
                        del tb

                if limit[0] <= maxargs:
                    limit[0] += 1
                    continue
                raise

    # copy func name to wrapper for sensible debug output
    func_name = "<parse action>"
    try:
        func_name = getattr(func, '__name__', 
                            getattr(func, '__class__').__name__)
    except Exception:
        func_name = str(func)
    wrapper.__name__ = func_name

    return wrapper
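
# Illustrative note (not part of the library): thanks to _trim_arity, parse actions may
# be written with any of the supported signatures; all of the following are acceptable
# arguments to setParseAction/addParseAction:
#   lambda: ...                  # no arguments
#   lambda toks: ...             # (toks)
#   lambda loc, toks: ...        # (loc, toks)
#   lambda s, loc, toks: ...     # (s, loc, toks)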

class ParserElement(object):
    """Abstract base level parser element class."""
    DEFAULT_WHITE_CHARS = " \n\t\r"
    verbose_stacktrace = False

    @staticmethod
    def setDefaultWhitespaceChars( chars ):
        r"""
        Overrides the default whitespace chars

        Example::
            # default whitespace chars are space, <TAB> and newline
            OneOrMore(Word(alphas)).parseString("abc def\nghi jkl")  # -> ['abc', 'def', 'ghi', 'jkl']
            
            # change to just treat newline as significant
            ParserElement.setDefaultWhitespaceChars(" \t")
            OneOrMore(Word(alphas)).parseString("abc def\nghi jkl")  # -> ['abc', 'def']
        """
        ParserElement.DEFAULT_WHITE_CHARS = chars

    @staticmethod
    def inlineLiteralsUsing(cls):
        """
        Set class to be used for inclusion of string literals into a parser.
        
        Example::
            # default literal class used is Literal
            integer = Word(nums)
            date_str = integer("year") + '/' + integer("month") + '/' + integer("day")           

            date_str.parseString("1999/12/31")  # -> ['1999', '/', '12', '/', '31']


            # change to Suppress
            ParserElement.inlineLiteralsUsing(Suppress)
            date_str = integer("year") + '/' + integer("month") + '/' + integer("day")           

            date_str.parseString("1999/12/31")  # -> ['1999', '12', '31']
        """
        ParserElement._literalStringClass = cls

    def __init__( self, savelist=False ):
        self.parseAction = list()
        self.failAction = None
        #~ self.name = "<unknown>"  # don't define self.name, let subclasses try/except upcall
        self.strRepr = None
        self.resultsName = None
        self.saveAsList = savelist
        self.skipWhitespace = True
        self.whiteChars = ParserElement.DEFAULT_WHITE_CHARS
        self.copyDefaultWhiteChars = True
        self.mayReturnEmpty = False # used when checking for left-recursion
        self.keepTabs = False
        self.ignoreExprs = list()
        self.debug = False
        self.streamlined = False
        self.mayIndexError = True # used to optimize exception handling for subclasses that don't advance parse index
        self.errmsg = ""
        self.modalResults = True # used to mark results names as modal (report only last) or cumulative (list all)
        self.debugActions = ( None, None, None ) #custom debug actions
        self.re = None
        self.callPreparse = True # used to avoid redundant calls to preParse
        self.callDuringTry = False

    def copy( self ):
        """
        Make a copy of this C{ParserElement}.  Useful for defining different parse actions
        for the same parsing pattern, using copies of the original parse element.
        
        Example::
            integer = Word(nums).setParseAction(lambda toks: int(toks[0]))
            integerK = integer.copy().addParseAction(lambda toks: toks[0]*1024) + Suppress("K")
            integerM = integer.copy().addParseAction(lambda toks: toks[0]*1024*1024) + Suppress("M")
            
            print(OneOrMore(integerK | integerM | integer).parseString("5K 100 640K 256M"))
        prints::
            [5120, 100, 655360, 268435456]
        Equivalent form of C{expr.copy()} is just C{expr()}::
            integerM = integer().addParseAction(lambda toks: toks[0]*1024*1024) + Suppress("M")
        """
        cpy = copy.copy( self )
        cpy.parseAction = self.parseAction[:]
        cpy.ignoreExprs = self.ignoreExprs[:]
        if self.copyDefaultWhiteChars:
            cpy.whiteChars = ParserElement.DEFAULT_WHITE_CHARS
        return cpy

    def setName( self, name ):
        """
        Define name for this expression, makes debugging and exception messages clearer.
        
        Example::
            Word(nums).parseString("ABC")  # -> Exception: Expected W:(0123...) (at char 0), (line:1, col:1)
            Word(nums).setName("integer").parseString("ABC")  # -> Exception: Expected integer (at char 0), (line:1, col:1)
        """
        self.name = name
        self.errmsg = "Expected " + self.name
        if hasattr(self,"exception"):
            self.exception.msg = self.errmsg
        return self

    def setResultsName( self, name, listAllMatches=False ):
        """
        Define name for referencing matching tokens as a nested attribute
        of the returned parse results.
        NOTE: this returns a *copy* of the original C{ParserElement} object;
        this is so that the client can define a basic element, such as an
        integer, and reference it in multiple places with different names.

        You can also set results names using the abbreviated syntax,
        C{expr("name")} in place of C{expr.setResultsName("name")} - 
        see L{I{__call__}<__call__>}.

        Example::
            date_str = (integer.setResultsName("year") + '/' 
                        + integer.setResultsName("month") + '/' 
                        + integer.setResultsName("day"))

            # equivalent form:
            date_str = integer("year") + '/' + integer("month") + '/' + integer("day")
        """
        newself = self.copy()
        if name.endswith("*"):
            name = name[:-1]
            listAllMatches=True
        newself.resultsName = name
        newself.modalResults = not listAllMatches
        return newself
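
    # Illustrative note (not part of the library): a trailing '*' in the results name is
    # shorthand for listAllMatches=True, so these two calls are equivalent:
    #   expr.setResultsName("value", listAllMatches=True)
    #   expr.setResultsName("value*")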

    def setBreak(self,breakFlag = True):
        """Method to invoke the Python pdb debugger when this element is
           about to be parsed. Set C{breakFlag} to True to enable, False to
           disable.
        """
        if breakFlag:
            _parseMethod = self._parse
            def breaker(instring, loc, doActions=True, callPreParse=True):
                import pdb
                pdb.set_trace()
                return _parseMethod( instring, loc, doActions, callPreParse )
            breaker._originalParseMethod = _parseMethod
            self._parse = breaker
        else:
            if hasattr(self._parse,"_originalParseMethod"):
                self._parse = self._parse._originalParseMethod
        return self

    def setParseAction( self, *fns, **kwargs ):
        """
        Define action to perform when successfully matching parse element definition.
        Parse action fn is a callable method with 0-3 arguments, called as C{fn(s,loc,toks)},
        C{fn(loc,toks)}, C{fn(toks)}, or just C{fn()}, where:
         - s   = the original string being parsed (see note below)
         - loc = the location of the matching substring
         - toks = a list of the matched tokens, packaged as a C{L{ParseResults}} object
        If the functions in fns modify the tokens, they can return them as the return
        value from fn, and the modified list of tokens will replace the original.
        Otherwise, fn does not need to return any value.

        Optional keyword arguments:
         - callDuringTry = (default=C{False}) indicate if parse action should be run during lookaheads and alternate testing

        Note: the default parsing behavior is to expand tabs in the input string
        before starting the parsing process.  See L{I{parseString}<parseString>} for more information
        on parsing strings containing C{<TAB>}s, and suggested methods to maintain a
        consistent view of the parsed string, the parse location, and line and column
        positions within the parsed string.
        
        Example::
            integer = Word(nums)
            date_str = integer + '/' + integer + '/' + integer

            date_str.parseString("1999/12/31")  # -> ['1999', '/', '12', '/', '31']

            # use parse action to convert to ints at parse time
            integer = Word(nums).setParseAction(lambda toks: int(toks[0]))
            date_str = integer + '/' + integer + '/' + integer

            # note that integer fields are now ints, not strings
            date_str.parseString("1999/12/31")  # -> [1999, '/', 12, '/', 31]
        """
        self.parseAction = list(map(_trim_arity, list(fns)))
        self.callDuringTry = kwargs.get("callDuringTry", False)
        return self

    def addParseAction( self, *fns, **kwargs ):
        """
        Add parse action to expression's list of parse actions. See L{I{setParseAction}<setParseAction>}.
        
        See examples in L{I{copy}<copy>}.
        """
        self.parseAction += list(map(_trim_arity, list(fns)))
        self.callDuringTry = self.callDuringTry or kwargs.get("callDuringTry", False)
        return self

    def addCondition(self, *fns, **kwargs):
        """Add a boolean predicate function to expression's list of parse actions. See 
        L{I{setParseAction}<setParseAction>} for function call signatures. Unlike C{setParseAction}, 
        functions passed to C{addCondition} need to return boolean success/fail of the condition.

        Optional keyword arguments:
         - message = define a custom message to be used in the raised exception
         - fatal   = if True, will raise ParseFatalException to stop parsing immediately; otherwise will raise ParseException
         
        Example::
            integer = Word(nums).setParseAction(lambda toks: int(toks[0]))
            year_int = integer.copy()
            year_int.addCondition(lambda toks: toks[0] >= 2000, message="Only support years 2000 and later")
            date_str = year_int + '/' + integer + '/' + integer

            result = date_str.parseString("1999/12/31")  # -> Exception: Only support years 2000 and later (at char 0), (line:1, col:1)
        """
        msg = kwargs.get("message", "failed user-defined condition")
        exc_type = ParseFatalException if kwargs.get("fatal", False) else ParseException
        for fn in fns:
            def pa(s,l,t):
                if not bool(_trim_arity(fn)(s,l,t)):
                    raise exc_type(s,l,msg)
            self.parseAction.append(pa)
        self.callDuringTry = self.callDuringTry or kwargs.get("callDuringTry", False)
        return self

    def setFailAction( self, fn ):
        """Define action to perform if parsing fails at this expression.
           Fail action fn is a callable function that takes the arguments
           C{fn(s,loc,expr,err)} where:
            - s = string being parsed
            - loc = location where expression match was attempted and failed
            - expr = the parse expression that failed
            - err = the exception thrown
           The function returns no value.  It may throw C{L{ParseFatalException}}
           if it is desired to stop parsing immediately."""
        self.failAction = fn
        return self

    def _skipIgnorables( self, instring, loc ):
        exprsFound = True
        while exprsFound:
            exprsFound = False
            for e in self.ignoreExprs:
                try:
                    while 1:
                        loc,dummy = e._parse( instring, loc )
                        exprsFound = True
                except ParseException:
                    pass
        return loc

    def preParse( self, instring, loc ):
        if self.ignoreExprs:
            loc = self._skipIgnorables( instring, loc )

        if self.skipWhitespace:
            wt = self.whiteChars
            instrlen = len(instring)
            while loc < instrlen and instring[loc] in wt:
                loc += 1

        return loc
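
    # Illustrative note (not part of the library): preParse first consumes any registered
    # ignore expressions (e.g. comments added via ignore()), then skips this element's
    # whitespace characters, returning the location where real matching should begin.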

    def parseImpl( self, instring, loc, doActions=True ):
        return loc, []

    def postParse( self, instring, loc, tokenlist ):
        return tokenlist

    #~ @profile
    def _parseNoCache( self, instring, loc, doActions=True, callPreParse=True ):
        debugging = ( self.debug ) #and doActions )

        if debugging or self.failAction:
            #~ print ("Match",self,"at loc",loc,"(%d,%d)" % ( lineno(loc,instring), col(loc,instring) ))
            if (self.debugActions[0] ):
                self.debugActions[0]( instring, loc, self )
            if callPreParse and self.callPreparse:
                preloc = self.preParse( instring, loc )
            else:
                preloc = loc
            tokensStart = preloc
            try:
                try:
                    loc,tokens = self.parseImpl( instring, preloc, doActions )
                except IndexError:
                    raise ParseException( instring, len(instring), self.errmsg, self )
            except ParseBaseException as err:
                #~ print ("Exception raised:", err)
                if self.debugActions[2]:
                    self.debugActions[2]( instring, tokensStart, self, err )
                if self.failAction:
                    self.failAction( instring, tokensStart, self, err )
                raise
        else:
            if callPreParse and self.callPreparse:
                preloc = self.preParse( instring, loc )
            else:
                preloc = loc
            tokensStart = preloc
            if self.mayIndexError or loc >= len(instring):
                try:
                    loc,tokens = self.parseImpl( instring, preloc, doActions )
                except IndexError:
                    raise ParseException( instring, len(instring), self.errmsg, self )
            else:
                loc,tokens = self.parseImpl( instring, preloc, doActions )

        tokens = self.postParse( instring, loc, tokens )

        retTokens = ParseResults( tokens, self.resultsName, asList=self.saveAsList, modal=self.modalResults )
        if self.parseAction and (doActions or self.callDuringTry):
            if debugging:
                try:
                    for fn in self.parseAction:
                        tokens = fn( instring, tokensStart, retTokens )
                        if tokens is not None:
                            retTokens = ParseResults( tokens,
                                                      self.resultsName,
                                                      asList=self.saveAsList and isinstance(tokens,(ParseResults,list)),
                                                      modal=self.modalResults )
                except ParseBaseException as err:
                    #~ print "Exception raised in user parse action:", err
                    if (self.debugActions[2] ):
                        self.debugActions[2]( instring, tokensStart, self, err )
                    raise
            else:
                for fn in self.parseAction:
                    tokens = fn( instring, tokensStart, retTokens )
                    if tokens is not None:
                        retTokens = ParseResults( tokens,
                                                  self.resultsName,
                                                  asList=self.saveAsList and isinstance(tokens,(ParseResults,list)),
                                                  modal=self.modalResults )

        if debugging:
            #~ print ("Matched",self,"->",retTokens.asList())
            if (self.debugActions[1] ):
                self.debugActions[1]( instring, tokensStart, loc, self, retTokens )

        return loc, retTokens

    def tryParse( self, instring, loc ):
        try:
            return self._parse( instring, loc, doActions=False )[0]
        except ParseFatalException:
            raise ParseException( instring, loc, self.errmsg, self)
    
    def canParseNext(self, instring, loc):
        try:
            self.tryParse(instring, loc)
        except (ParseException, IndexError):
            return False
        else:
            return True

    class _UnboundedCache(object):
        def __init__(self):
            cache = {}
            self.not_in_cache = not_in_cache = object()

            def get(self, key):
                return cache.get(key, not_in_cache)

            def set(self, key, value):
                cache[key] = value

            def clear(self):
                cache.clear()

            self.get = types.MethodType(get, self)
            self.set = types.MethodType(set, self)
            self.clear = types.MethodType(clear, self)

    if _OrderedDict is not None:
        class _FifoCache(object):
            def __init__(self, size):
                self.not_in_cache = not_in_cache = object()

                cache = _OrderedDict()

                def get(self, key):
                    return cache.get(key, not_in_cache)

                def set(self, key, value):
                    cache[key] = value
                    if len(cache) > size:
                        cache.popitem(False)

                def clear(self):
                    cache.clear()

                self.get = types.MethodType(get, self)
                self.set = types.MethodType(set, self)
                self.clear = types.MethodType(clear, self)

    else:
        class _FifoCache(object):
            def __init__(self, size):
                self.not_in_cache = not_in_cache = object()

                cache = {}
                key_fifo = collections.deque([], size)

                def get(self, key):
                    return cache.get(key, not_in_cache)

                def set(self, key, value):
                    cache[key] = value
                    if len(cache) > size:
                        cache.pop(key_fifo.popleft(), None)
                    key_fifo.append(key)

                def clear(self):
                    cache.clear()
                    key_fifo.clear()

                self.get = types.MethodType(get, self)
                self.set = types.MethodType(set, self)
                self.clear = types.MethodType(clear, self)
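
    # Illustrative note (not part of the library): both _FifoCache variants evict the oldest
    # entry once more than 'size' items are stored; e.g. with size=2, setting keys 'a', 'b',
    # 'c' leaves only 'b' and 'c' cached, and get('a') returns the not_in_cache sentinel.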

    # argument cache for optimizing repeated calls when backtracking through recursive expressions
    packrat_cache = {} # this is set later by enablePackrat(); this is here so that resetCache() doesn't fail
    packrat_cache_lock = RLock()
    packrat_cache_stats = [0, 0]

    # this method gets repeatedly called during backtracking with the same arguments -
    # we can cache these arguments and save ourselves the trouble of re-parsing the contained expression
    def _parseCache( self, instring, loc, doActions=True, callPreParse=True ):
        HIT, MISS = 0, 1
        lookup = (self, instring, loc, callPreParse, doActions)
        with ParserElement.packrat_cache_lock:
            cache = ParserElement.packrat_cache
            value = cache.get(lookup)
            if value is cache.not_in_cache:
                ParserElement.packrat_cache_stats[MISS] += 1
                try:
                    value = self._parseNoCache(instring, loc, doActions, callPreParse)
                except ParseBaseException as pe:
                    # cache a copy of the exception, without the traceback
                    cache.set(lookup, pe.__class__(*pe.args))
                    raise
                else:
                    cache.set(lookup, (value[0], value[1].copy()))
                    return value
            else:
                ParserElement.packrat_cache_stats[HIT] += 1
                if isinstance(value, Exception):
                    raise value
                return (value[0], value[1].copy())

    _parse = _parseNoCache

    @staticmethod
    def resetCache():
        ParserElement.packrat_cache.clear()
        ParserElement.packrat_cache_stats[:] = [0] * len(ParserElement.packrat_cache_stats)

    _packratEnabled = False
    @staticmethod
    def enablePackrat(cache_size_limit=128):
        """Enables "packrat" parsing, which adds memoizing to the parsing logic.
           Repeated parse attempts at the same string location (which happens
           often in many complex grammars) can immediately return a cached value,
           instead of re-executing parsing/validating code.  Both valid results
           and parsing exceptions are memoized.
           
           Parameters:
            - cache_size_limit - (default=C{128}) - if an integer value is provided
              will limit the size of the packrat cache; if None is passed, then
              the cache size will be unbounded; if 0 is passed, the cache will
              be effectively disabled.
            
           This speedup may break existing programs that use parse actions that
           have side-effects.  For this reason, packrat parsing is disabled when
           you first import pyparsing.  To activate the packrat feature, your
           program must call the class method C{ParserElement.enablePackrat()}.  If
           your program uses C{psyco} to "compile as you go", you must call
           C{enablePackrat} before calling C{psyco.full()}.  If you do not do this,
           Python will crash.  For best results, call C{enablePackrat()} immediately
           after importing pyparsing.
           
           Example::
               import pyparsing
               pyparsing.ParserElement.enablePackrat()
        """
        if not ParserElement._packratEnabled:
            ParserElement._packratEnabled = True
            if cache_size_limit is None:
                ParserElement.packrat_cache = ParserElement._UnboundedCache()
            else:
                ParserElement.packrat_cache = ParserElement._FifoCache(cache_size_limit)
            ParserElement._parse = ParserElement._parseCache

    def parseString( self, instring, parseAll=False ):
        """
        Execute the parse expression with the given string.
        This is the main interface to the client code, once the complete
        expression has been built.

        If you want the grammar to require that the entire input string be
        successfully parsed, then set C{parseAll} to True (equivalent to ending
        the grammar with C{L{StringEnd()}}).

        Note: C{parseString} implicitly calls C{expandtabs()} on the input string,
        in order to report proper column numbers in parse actions.
        If the input string contains tabs and
        the grammar uses parse actions that use the C{loc} argument to index into the
        string being parsed, you can ensure you have a consistent view of the input
        string by:
         - calling C{parseWithTabs} on your grammar before calling C{parseString}
           (see L{I{parseWithTabs}<parseWithTabs>})
         - defining your parse action using the full C{(s,loc,toks)} signature, and
           referencing the input string using the parse action's C{s} argument
         - explicitly expanding the tabs in your input string before calling
           C{parseString}
        
        Example::
            Word('a').parseString('aaaaabaaa')  # -> ['aaaaa']
            Word('a').parseString('aaaaabaaa', parseAll=True)  # -> Exception: Expected end of text
        """
        ParserElement.resetCache()
        if not self.streamlined:
            self.streamline()
            #~ self.saveAsList = True
        for e in self.ignoreExprs:
            e.streamline()
        if not self.keepTabs:
            instring = instring.expandtabs()
        try:
            loc, tokens = self._parse( instring, 0 )
            if parseAll:
                loc = self.preParse( instring, loc )
                se = Empty() + StringEnd()
                se._parse( instring, loc )
        except ParseBaseException as exc:
            if ParserElement.verbose_stacktrace:
                raise
            else:
                # catch and re-raise exception from here, clears out pyparsing internal stack trace
                raise exc
        else:
            return tokens

    def scanString( self, instring, maxMatches=_MAX_INT, overlap=False ):
        """
        Scan the input string for expression matches.  Each match will return the
        matching tokens, start location, and end location.  May be called with optional
        C{maxMatches} argument, to clip scanning after 'n' matches are found.  If
        C{overlap} is specified, then overlapping matches will be reported.

        Note that the start and end locations are reported relative to the string
        being parsed.  See L{I{parseString}<parseString>} for more information on parsing
        strings with embedded tabs.

        Example::
            source = "sldjf123lsdjjkf345sldkjf879lkjsfd987"
            print(source)
            for tokens,start,end in Word(alphas).scanString(source):
                print(' '*start + '^'*(end-start))
                print(' '*start + tokens[0])
        
        prints::
        
            sldjf123lsdjjkf345sldkjf879lkjsfd987
            ^^^^^
            sldjf
                    ^^^^^^^
                    lsdjjkf
                              ^^^^^^
                              sldkjf
                                       ^^^^^^
                                       lkjsfd
        """
        if not self.streamlined:
            self.streamline()
        for e in self.ignoreExprs:
            e.streamline()

        if not self.keepTabs:
            instring = _ustr(instring).expandtabs()
        instrlen = len(instring)
        loc = 0
        preparseFn = self.preParse
        parseFn = self._parse
        ParserElement.resetCache()
        matches = 0
        try:
            while loc <= instrlen and matches < maxMatches:
                try:
                    preloc = preparseFn( instring, loc )
                    nextLoc,tokens = parseFn( instring, preloc, callPreParse=False )
                except ParseException:
                    loc = preloc+1
                else:
                    if nextLoc > loc:
                        matches += 1
                        yield tokens, preloc, nextLoc
                        if overlap:
                            nextloc = preparseFn( instring, loc )
                            if nextloc > loc:
                                loc = nextLoc
                            else:
                                loc += 1
                        else:
                            loc = nextLoc
                    else:
                        loc = preloc+1
        except ParseBaseException as exc:
            if ParserElement.verbose_stacktrace:
                raise
            else:
                # catch and re-raise exception from here, clears out pyparsing internal stack trace
                raise exc

    def transformString( self, instring ):
        """
        Extension to C{L{scanString}}, to modify matching text with modified tokens that may
        be returned from a parse action.  To use C{transformString}, define a grammar and
        attach a parse action to it that modifies the returned token list.
        Invoking C{transformString()} on a target string will then scan for matches,
        and replace the matched text patterns according to the logic in the parse
        action.  C{transformString()} returns the resulting transformed string.
        
        Example::
            wd = Word(alphas)
            wd.setParseAction(lambda toks: toks[0].title())
            
            print(wd.transformString("now is the winter of our discontent made glorious summer by this sun of york."))
        Prints::
            Now Is The Winter Of Our Discontent Made Glorious Summer By This Sun Of York.
        """
        out = []
        lastE = 0
        # force preservation of <TAB>s, to minimize unwanted transformation of string, and to
        # keep string locs straight between transformString and scanString
        self.keepTabs = True
        try:
            for t,s,e in self.scanString( instring ):
                out.append( instring[lastE:s] )
                if t:
                    if isinstance(t,ParseResults):
                        out += t.asList()
                    elif isinstance(t,list):
                        out += t
                    else:
                        out.append(t)
                lastE = e
            out.append(instring[lastE:])
            out = [o for o in out if o]
            return "".join(map(_ustr,_flatten(out)))
        except ParseBaseException as exc:
            if ParserElement.verbose_stacktrace:
                raise
            else:
                # catch and re-raise exception from here, clears out pyparsing internal stack trace
                raise exc

    def searchString( self, instring, maxMatches=_MAX_INT ):
        """
        Another extension to C{L{scanString}}, simplifying the access to the tokens found
        to match the given parse expression.  May be called with optional
        C{maxMatches} argument, to clip searching after 'n' matches are found.
        
        Example::
            # a capitalized word starts with an uppercase letter, followed by zero or more lowercase letters
            cap_word = Word(alphas.upper(), alphas.lower())
            
            print(cap_word.searchString("More than Iron, more than Lead, more than Gold I need Electricity"))
        prints::
            ['More', 'Iron', 'Lead', 'Gold', 'I']
        """
        try:
            return ParseResults([ t for t,s,e in self.scanString( instring, maxMatches ) ])
        except ParseBaseException as exc:
            if ParserElement.verbose_stacktrace:
                raise
            else:
                # catch and re-raise exception from here, clears out pyparsing internal stack trace
                raise exc

    def split(self, instring, maxsplit=_MAX_INT, includeSeparators=False):
        """
        Generator method to split a string using the given expression as a separator.
        May be called with optional C{maxsplit} argument, to limit the number of splits;
        and the optional C{includeSeparators} argument (default=C{False}), indicating whether
        the matched separator text should be included in the split results.
        
        Example::        
            punc = oneOf(list(".,;:/-!?"))
            print(list(punc.split("This, this?, this sentence, is badly punctuated!")))
        prints::
            ['This', ' this', '', ' this sentence', ' is badly punctuated', '']
        """
        splits = 0
        last = 0
        for t,s,e in self.scanString(instring, maxMatches=maxsplit):
            yield instring[last:s]
            if includeSeparators:
                yield t[0]
            last = e
        yield instring[last:]
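
    # Illustrative sketch (not part of the library): with includeSeparators=True the matched
    # separator tokens are interleaved with the split fragments, e.g.:
    #   punc = oneOf(list(".,;:/-!?"))
    #   list(punc.split("a,b", includeSeparators=True))  # -> ['a', ',', 'b']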

    def __add__(self, other ):
        """
        Implementation of + operator - returns C{L{And}}. Adding strings to a ParserElement
        converts them to L{Literal}s by default.
        
        Example::
            greet = Word(alphas) + "," + Word(alphas) + "!"
            hello = "Hello, World!"
            print (hello, "->", greet.parseString(hello))
        Prints::
            Hello, World! -> ['Hello', ',', 'World', '!']
        """
        if isinstance( other, basestring ):
            other = ParserElement._literalStringClass( other )
        if not isinstance( other, ParserElement ):
            warnings.warn("Cannot combine element of type %s with ParserElement" % type(other),
                    SyntaxWarning, stacklevel=2)
            return None
        return And( [ self, other ] )

    def __radd__(self, other ):
        """
        Implementation of + operator when left operand is not a C{L{ParserElement}}
        """
        if isinstance( other, basestring ):
            other = ParserElement._literalStringClass( other )
        if not isinstance( other, ParserElement ):
            warnings.warn("Cannot combine element of type %s with ParserElement" % type(other),
                    SyntaxWarning, stacklevel=2)
            return None
        return other + self

    def __sub__(self, other):
        """
        Implementation of - operator, returns C{L{And}} with error stop
        """
        if isinstance( other, basestring ):
            other = ParserElement._literalStringClass( other )
        if not isinstance( other, ParserElement ):
            warnings.warn("Cannot combine element of type %s with ParserElement" % type(other),
                    SyntaxWarning, stacklevel=2)
            return None
        return And( [ self, And._ErrorStop(), other ] )

    def __rsub__(self, other ):
        """
        Implementation of - operator when left operand is not a C{L{ParserElement}}
        """
        if isinstance( other, basestring ):
            other = ParserElement._literalStringClass( other )
        if not isinstance( other, ParserElement ):
            warnings.warn("Cannot combine element of type %s with ParserElement" % type(other),
                    SyntaxWarning, stacklevel=2)
            return None
        return other - self

    def __mul__(self,other):
        """
        Implementation of * operator, allows use of C{expr * 3} in place of
        C{expr + expr + expr}.  Expressions may also be multiplied by a 2-integer
        tuple, similar to C{{min,max}} multipliers in regular expressions.  Tuples
        may also include C{None} as in:
         - C{expr*(n,None)} or C{expr*(n,)} is equivalent
              to C{expr*n + L{ZeroOrMore}(expr)}
              (read as "at least n instances of C{expr}")
         - C{expr*(None,n)} is equivalent to C{expr*(0,n)}
              (read as "0 to n instances of C{expr}")
         - C{expr*(None,None)} is equivalent to C{L{ZeroOrMore}(expr)}
         - C{expr*(1,None)} is equivalent to C{L{OneOrMore}(expr)}

        Note that C{expr*(None,n)} does not raise an exception if
        more than n exprs exist in the input stream; that is,
        C{expr*(None,n)} does not enforce a maximum number of expr
        occurrences.  If this behavior is desired, then write
        C{expr*(None,n) + ~expr}
        """
        if isinstance(other,int):
            minElements, optElements = other,0
        elif isinstance(other,tuple):
            other = (other + (None, None))[:2]
            if other[0] is None:
                other = (0, other[1])
            if isinstance(other[0],int) and other[1] is None:
                if other[0] == 0:
                    return ZeroOrMore(self)
                if other[0] == 1:
                    return OneOrMore(self)
                else:
                    return self*other[0] + ZeroOrMore(self)
            elif isinstance(other[0],int) and isinstance(other[1],int):
                minElements, optElements = other
                optElements -= minElements
            else:
                raise TypeError("cannot multiply 'ParserElement' and ('%s','%s') objects", type(other[0]),type(other[1]))
        else:
            raise TypeError("cannot multiply 'ParserElement' and '%s' objects", type(other))

        if minElements < 0:
            raise ValueError("cannot multiply ParserElement by negative value")
        if optElements < 0:
            raise ValueError("second tuple value must be greater or equal to first tuple value")
        if minElements == optElements == 0:
            raise ValueError("cannot multiply ParserElement by 0 or (0,0)")

        if (optElements):
            def makeOptionalList(n):
                if n>1:
                    return Optional(self + makeOptionalList(n-1))
                else:
                    return Optional(self)
            if minElements:
                if minElements == 1:
                    ret = self + makeOptionalList(optElements)
                else:
                    ret = And([self]*minElements) + makeOptionalList(optElements)
            else:
                ret = makeOptionalList(optElements)
        else:
            if minElements == 1:
                ret = self
            else:
                ret = And([self]*minElements)
        return ret
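
    # Illustrative sketch (not part of the library): a few concrete uses of the
    # multiplication forms described in the docstring above:
    #   Word(nums) * 3          # exactly three occurrences: expr + expr + expr
    #   Word(nums) * (2, 4)     # from two to four occurrences
    #   Word(nums) * (2, None)  # two or more occurrences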

    def __rmul__(self, other):
        return self.__mul__(other)

    def __or__(self, other ):
        """
        Implementation of | operator - returns C{L{MatchFirst}}
        """
        if isinstance( other, basestring ):
            other = ParserElement._literalStringClass( other )
        if not isinstance( other, ParserElement ):
            warnings.warn("Cannot combine element of type %s with ParserElement" % type(other),
                    SyntaxWarning, stacklevel=2)
            return None
        return MatchFirst( [ self, other ] )

    def __ror__(self, other ):
        """
        Implementation of | operator when left operand is not a C{L{ParserElement}}
        """
        if isinstance( other, basestring ):
            other = ParserElement._literalStringClass( other )
        if not isinstance( other, ParserElement ):
            warnings.warn("Cannot combine element of type %s with ParserElement" % type(other),
                    SyntaxWarning, stacklevel=2)
            return None
        return other | self

    def __xor__(self, other ):
        """
        Implementation of ^ operator - returns C{L{Or}}
        """
        if isinstance( other, basestring ):
            other = ParserElement._literalStringClass( other )
        if not isinstance( other, ParserElement ):
            warnings.warn("Cannot combine element of type %s with ParserElement" % type(other),
                    SyntaxWarning, stacklevel=2)
            return None
        return Or( [ self, other ] )

    def __rxor__(self, other ):
        """
        Implementation of ^ operator when left operand is not a C{L{ParserElement}}
        """
        if isinstance( other, basestring ):
            other = ParserElement._literalStringClass( other )
        if not isinstance( other, ParserElement ):
            warnings.warn("Cannot combine element of type %s with ParserElement" % type(other),
                    SyntaxWarning, stacklevel=2)
            return None
        return other ^ self

    def __and__(self, other ):
        """
        Implementation of & operator - returns C{L{Each}}
        """
        if isinstance( other, basestring ):
            other = ParserElement._literalStringClass( other )
        if not isinstance( other, ParserElement ):
            warnings.warn("Cannot combine element of type %s with ParserElement" % type(other),
                    SyntaxWarning, stacklevel=2)
            return None
        return Each( [ self, other ] )

    def __rand__(self, other ):
        """
        Implementation of & operator when left operand is not a C{L{ParserElement}}
        """
        if isinstance( other, basestring ):
            other = ParserElement._literalStringClass( other )
        if not isinstance( other, ParserElement ):
            warnings.warn("Cannot combine element of type %s with ParserElement" % type(other),
                    SyntaxWarning, stacklevel=2)
            return None
        return other & self

    def __invert__( self ):
        """
        Implementation of ~ operator - returns C{L{NotAny}}
        """
        return NotAny( self )
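
    # Illustrative summary (not part of the library) of the operator overloads defined
    # above and the combining expressions they build:
    #   a + b  -> And([a, b])              a - b  -> And with error stop
    #   a | b  -> MatchFirst([a, b])       a ^ b  -> Or([a, b])
    #   a & b  -> Each([a, b])             ~a     -> NotAny(a)
    #   expr * n, expr * (m, n)  -> repetition (see __mul__ above)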

    def __call__(self, name=None):
        """
        Shortcut for C{L{setResultsName}}, with C{listAllMatches=False}.
        
        If C{name} is given with a trailing C{'*'} character, then C{listAllMatches} will be
        passed as C{True}.
           
        If C{name} is omitted, same as calling C{L{copy}}.

        Example::
            # these are equivalent
            userdata = Word(alphas).setResultsName("name") + Word(nums+"-").setResultsName("socsecno")
            userdata = Word(alphas)("name") + Word(nums+"-")("socsecno")             
        """
        if name is not None:
            return self.setResultsName(name)
        else:
            return self.copy()

    def suppress( self ):
        """
        Suppresses the output of this C{ParserElement}; useful to keep punctuation from
        cluttering up returned output.
        """
        return Suppress( self )

    def leaveWhitespace( self ):
        """
        Disables the skipping of whitespace before matching the characters in the
        C{ParserElement}'s defined pattern.  This is normally only used internally by
        the pyparsing module, but may be needed in some whitespace-sensitive grammars.
        """
        self.skipWhitespace = False
        return self

    def setWhitespaceChars( self, chars ):
        """
        Overrides the default whitespace chars
        """
        self.skipWhitespace = True
        self.whiteChars = chars
        self.copyDefaultWhiteChars = False
        return self

    def parseWithTabs( self ):
        """
        Overrides default behavior to expand C{<TAB>}s to spaces before parsing the input string.
        Must be called before C{parseString} when the input grammar contains elements that
        match C{<TAB>} characters.
        """
        self.keepTabs = True
        return self

    def ignore( self, other ):
        """
        Define expression to be ignored (e.g., comments) while doing pattern
        matching; may be called repeatedly, to define multiple comment or other
        ignorable patterns.
        
        Example::
            patt = OneOrMore(Word(alphas))
            patt.parseString('ablaj /* comment */ lskjd') # -> ['ablaj']
            
            patt.ignore(cStyleComment)
            patt.parseString('ablaj /* comment */ lskjd') # -> ['ablaj', 'lskjd']
        """
        if isinstance(other, basestring):
            other = Suppress(other)

        if isinstance( other, Suppress ):
            if other not in self.ignoreExprs:
                self.ignoreExprs.append(other)
        else:
            self.ignoreExprs.append( Suppress( other.copy() ) )
        return self

    def setDebugActions( self, startAction, successAction, exceptionAction ):
        """
        Enable display of debugging messages while doing pattern matching.
        """
        self.debugActions = (startAction or _defaultStartDebugAction,
                             successAction or _defaultSuccessDebugAction,
                             exceptionAction or _defaultExceptionDebugAction)
        self.debug = True
        return self

    def setDebug( self, flag=True ):
        """
        Enable display of debugging messages while doing pattern matching.
        Set C{flag} to True to enable, False to disable.

        Example::
            wd = Word(alphas).setName("alphaword")
            integer = Word(nums).setName("numword")
            term = wd | integer
            
            # turn on debugging for wd
            wd.setDebug()

            OneOrMore(term).parseString("abc 123 xyz 890")
        
        prints::
            Match alphaword at loc 0(1,1)
            Matched alphaword -> ['abc']
            Match alphaword at loc 3(1,4)
            Exception raised:Expected alphaword (at char 4), (line:1, col:5)
            Match alphaword at loc 7(1,8)
            Matched alphaword -> ['xyz']
            Match alphaword at loc 11(1,12)
            Exception raised:Expected alphaword (at char 12), (line:1, col:13)
            Match alphaword at loc 15(1,16)
            Exception raised:Expected alphaword (at char 15), (line:1, col:16)

        The output shown is that produced by the default debug actions - custom debug actions can be
        specified using L{setDebugActions}. Prior to attempting
        to match the C{wd} expression, the debugging message C{"Match <exprname> at loc <n>(<line>,<col>)"}
        is shown. Then if the parse succeeds, a C{"Matched"} message is shown, or an C{"Exception raised"}
        message is shown. Also note the use of L{setName} to assign a human-readable name to the expression,
        which makes debugging and exception messages easier to understand - for instance, the default
        name created for the C{Word} expression without calling C{setName} is C{"W:(ABCD...)"}.
        """
        if flag:
            self.setDebugActions( _defaultStartDebugAction, _defaultSuccessDebugAction, _defaultExceptionDebugAction )
        else:
            self.debug = False
        return self

    def __str__( self ):
        return self.name

    def __repr__( self ):
        return _ustr(self)

    def streamline( self ):
        self.streamlined = True
        self.strRepr = None
        return self

    def checkRecursion( self, parseElementList ):
        pass

    def validate( self, validateTrace=[] ):
        """
        Check defined expressions for valid structure, check for infinite recursive definitions.
        """
        self.checkRecursion( [] )

    def parseFile( self, file_or_filename, parseAll=False ):
        """
        Execute the parse expression on the given file or filename.
        If a filename is specified (instead of a file object),
        the entire file is opened, read, and closed before parsing.
        """
        try:
            file_contents = file_or_filename.read()
        except AttributeError:
            with open(file_or_filename, "r") as f:
                file_contents = f.read()
        try:
            return self.parseString(file_contents, parseAll)
        except ParseBaseException as exc:
            if ParserElement.verbose_stacktrace:
                raise
            else:
                # catch and re-raise exception from here, clears out pyparsing internal stack trace
                raise exc

    def __eq__(self,other):
        if isinstance(other, ParserElement):
            return self is other or vars(self) == vars(other)
        elif isinstance(other, basestring):
            return self.matches(other)
        else:
            return super(ParserElement,self)==other

    def __ne__(self,other):
        return not (self == other)

    def __hash__(self):
        return hash(id(self))

    def __req__(self,other):
        return self == other

    def __rne__(self,other):
        return not (self == other)

    def matches(self, testString, parseAll=True):
        """
        Method for quick testing of a parser against a test string. Good for simple 
        inline microtests of sub-expressions while building up a larger parser.
           
        Parameters:
         - testString - to test against this expression for a match
         - parseAll - (default=C{True}) - flag to pass to C{L{parseString}} when running tests
            
        Example::
            expr = Word(nums)
            assert expr.matches("100")
        """
        try:
            self.parseString(_ustr(testString), parseAll=parseAll)
            return True
        except ParseBaseException:
            return False
                
    def runTests(self, tests, parseAll=True, comment='#', fullDump=True, printResults=True, failureTests=False):
        """
        Execute the parse expression on a series of test strings, showing each
        test, the parsed results or where the parse failed. Quick and easy way to
        run a parse expression against a list of sample strings.
           
        Parameters:
         - tests - a list of separate test strings, or a multiline string of test strings
         - parseAll - (default=C{True}) - flag to pass to C{L{parseString}} when running tests           
         - comment - (default=C{'#'}) - expression for indicating embedded comments in the test 
              string; pass None to disable comment filtering
         - fullDump - (default=C{True}) - dump results as list followed by results names in nested outline;
              if False, only dump nested list
         - printResults - (default=C{True}) prints test output to stdout
         - failureTests - (default=C{False}) indicates if these tests are expected to fail parsing

        Returns: a (success, results) tuple, where success indicates that all tests succeeded
        (or failed if C{failureTests} is True), and results is a list of
        (test string, parse results or exception) pairs, one for each test
        
        Example::
            number_expr = pyparsing_common.number.copy()

            result = number_expr.runTests('''
                # unsigned integer
                100
                # negative integer
                -100
                # float with scientific notation
                6.02e23
                # integer with scientific notation
                1e-12
                ''')
            print("Success" if result[0] else "Failed!")

            result = number_expr.runTests('''
                # stray character
                100Z
                # missing leading digit before '.'
                -.100
                # too many '.'
                3.14.159
                ''', failureTests=True)
            print("Success" if result[0] else "Failed!")
        prints::
            # unsigned integer
            100
            [100]

            # negative integer
            -100
            [-100]

            # float with scientific notation
            6.02e23
            [6.02e+23]

            # integer with scientific notation
            1e-12
            [1e-12]

            Success
            
            # stray character
            100Z
               ^
            FAIL: Expected end of text (at char 3), (line:1, col:4)

            # missing leading digit before '.'
            -.100
            ^
            FAIL: Expected {real number with scientific notation | real number | signed integer} (at char 0), (line:1, col:1)

            # too many '.'
            3.14.159
                ^
            FAIL: Expected end of text (at char 4), (line:1, col:5)

            Success

        Each test string must be on a single line. If you want to test a string that spans multiple
        lines, create a test like this::

            expr.runTest(r"this is a test\\n of strings that spans \\n 3 lines")
        
        (Note that this is a raw string literal, you must include the leading 'r'.)
        """
        if isinstance(tests, basestring):
            tests = list(map(str.strip, tests.rstrip().splitlines()))
        if isinstance(comment, basestring):
            comment = Literal(comment)
        allResults = []
        comments = []
        success = True
        for t in tests:
            if comment is not None and comment.matches(t, False) or comments and not t:
                comments.append(t)
                continue
            if not t:
                continue
            out = ['\n'.join(comments), t]
            comments = []
            try:
                t = t.replace(r'\n','\n')
                result = self.parseString(t, parseAll=parseAll)
                out.append(result.dump(full=fullDump))
                success = success and not failureTests
            except ParseBaseException as pe:
                fatal = "(FATAL)" if isinstance(pe, ParseFatalException) else ""
                if '\n' in t:
                    out.append(line(pe.loc, t))
                    out.append(' '*(col(pe.loc,t)-1) + '^' + fatal)
                else:
                    out.append(' '*pe.loc + '^' + fatal)
                out.append("FAIL: " + str(pe))
                success = success and failureTests
                result = pe
            except Exception as exc:
                out.append("FAIL-EXCEPTION: " + str(exc))
                success = success and failureTests
                result = exc

            if printResults:
                if fullDump:
                    out.append('')
                print('\n'.join(out))

            allResults.append((t, result))
        
        return success, allResults

        
class Token(ParserElement):
    """
    Abstract C{ParserElement} subclass, for defining atomic matching patterns.
    """
    def __init__( self ):
        super(Token,self).__init__( savelist=False )


class Empty(Token):
    """
    An empty token, will always match.
    """
    def __init__( self ):
        super(Empty,self).__init__()
        self.name = "Empty"
        self.mayReturnEmpty = True
        self.mayIndexError = False


class NoMatch(Token):
    """
    A token that will never match.
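
    Example::
        # a minimal sketch: NoMatch is occasionally useful as a placeholder
        # expression that is guaranteed to fail
        NoMatch().parseString("anything")  # -> Exception: Unmatchable token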
    """
    def __init__( self ):
        super(NoMatch,self).__init__()
        self.name = "NoMatch"
        self.mayReturnEmpty = True
        self.mayIndexError = False
        self.errmsg = "Unmatchable token"

    def parseImpl( self, instring, loc, doActions=True ):
        raise ParseException(instring, loc, self.errmsg, self)


class Literal(Token):
    """
    Token to exactly match a specified string.
    
    Example::
        Literal('blah').parseString('blah')  # -> ['blah']
        Literal('blah').parseString('blahfooblah')  # -> ['blah']
        Literal('blah').parseString('bla')  # -> Exception: Expected "blah"
    
    For case-insensitive matching, use L{CaselessLiteral}.
    
    For keyword matching (force word break before and after the matched string),
    use L{Keyword} or L{CaselessKeyword}.
    """
    def __init__( self, matchString ):
        super(Literal,self).__init__()
        self.match = matchString
        self.matchLen = len(matchString)
        try:
            self.firstMatchChar = matchString[0]
        except IndexError:
            warnings.warn("null string passed to Literal; use Empty() instead",
                            SyntaxWarning, stacklevel=2)
            self.__class__ = Empty
        self.name = '"%s"' % _ustr(self.match)
        self.errmsg = "Expected " + self.name
        self.mayReturnEmpty = False
        self.mayIndexError = False

    # Performance tuning: this routine gets called a *lot*
    # if this is a single character match string  and the first character matches,
    # short-circuit as quickly as possible, and avoid calling startswith
    #~ @profile
    def parseImpl( self, instring, loc, doActions=True ):
        if (instring[loc] == self.firstMatchChar and
            (self.matchLen==1 or instring.startswith(self.match,loc)) ):
            return loc+self.matchLen, self.match
        raise ParseException(instring, loc, self.errmsg, self)
_L = Literal
ParserElement._literalStringClass = Literal

class Keyword(Token):
    """
    Token to exactly match a specified string as a keyword, that is, it must be
    immediately followed by a non-keyword character.  Compare with C{L{Literal}}:
     - C{Literal("if")} will match the leading C{'if'} in C{'ifAndOnlyIf'}.
     - C{Keyword("if")} will not; it will only match the leading C{'if'} in C{'if x=1'}, or C{'if(y==2)'}
    Accepts two optional constructor arguments in addition to the keyword string:
     - C{identChars} is a string of characters that would be valid identifier characters,
          defaulting to all alphanumerics + "_" and "$"
     - C{caseless} allows case-insensitive matching, default is C{False}.
       
    Example::
        Keyword("start").parseString("start")  # -> ['start']
        Keyword("start").parseString("starting")  # -> Exception

    For case-insensitive matching, use L{CaselessKeyword}.
    """
    DEFAULT_KEYWORD_CHARS = alphanums+"_$"

    def __init__( self, matchString, identChars=None, caseless=False ):
        super(Keyword,self).__init__()
        if identChars is None:
            identChars = Keyword.DEFAULT_KEYWORD_CHARS
        self.match = matchString
        self.matchLen = len(matchString)
        try:
            self.firstMatchChar = matchString[0]
        except IndexError:
            warnings.warn("null string passed to Keyword; use Empty() instead",
                            SyntaxWarning, stacklevel=2)
        self.name = '"%s"' % self.match
        self.errmsg = "Expected " + self.name
        self.mayReturnEmpty = False
        self.mayIndexError = False
        self.caseless = caseless
        if caseless:
            self.caselessmatch = matchString.upper()
            identChars = identChars.upper()
        self.identChars = set(identChars)

    def parseImpl( self, instring, loc, doActions=True ):
        if self.caseless:
            if ( (instring[ loc:loc+self.matchLen ].upper() == self.caselessmatch) and
                 (loc >= len(instring)-self.matchLen or instring[loc+self.matchLen].upper() not in self.identChars) and
                 (loc == 0 or instring[loc-1].upper() not in self.identChars) ):
                return loc+self.matchLen, self.match
        else:
            if (instring[loc] == self.firstMatchChar and
                (self.matchLen==1 or instring.startswith(self.match,loc)) and
                (loc >= len(instring)-self.matchLen or instring[loc+self.matchLen] not in self.identChars) and
                (loc == 0 or instring[loc-1] not in self.identChars) ):
                return loc+self.matchLen, self.match
        raise ParseException(instring, loc, self.errmsg, self)

    def copy(self):
        c = super(Keyword,self).copy()
        c.identChars = Keyword.DEFAULT_KEYWORD_CHARS
        return c

    @staticmethod
    def setDefaultKeywordChars( chars ):
        """Overrides the default Keyword chars
        """
        Keyword.DEFAULT_KEYWORD_CHARS = chars

class CaselessLiteral(Literal):
    """
    Token to match a specified string, ignoring case of letters.
    Note: the matched results will always be in the case of the given
    match string, NOT the case of the input text.

    Example::
        OneOrMore(CaselessLiteral("CMD")).parseString("cmd CMD Cmd10") # -> ['CMD', 'CMD', 'CMD']
        
    (Contrast with example for L{CaselessKeyword}.)
    """
    def __init__( self, matchString ):
        super(CaselessLiteral,self).__init__( matchString.upper() )
        # Preserve the defining literal.
        self.returnString = matchString
        self.name = "'%s'" % self.returnString
        self.errmsg = "Expected " + self.name

    def parseImpl( self, instring, loc, doActions=True ):
        if instring[ loc:loc+self.matchLen ].upper() == self.match:
            return loc+self.matchLen, self.returnString
        raise ParseException(instring, loc, self.errmsg, self)

class CaselessKeyword(Keyword):
    """
    Caseless version of L{Keyword}.

    Example::
        OneOrMore(CaselessKeyword("CMD")).parseString("cmd CMD Cmd10") # -> ['CMD', 'CMD']
        
    (Contrast with example for L{CaselessLiteral}.)
    """
    def __init__( self, matchString, identChars=None ):
        super(CaselessKeyword,self).__init__( matchString, identChars, caseless=True )

    def parseImpl( self, instring, loc, doActions=True ):
        if ( (instring[ loc:loc+self.matchLen ].upper() == self.caselessmatch) and
             (loc >= len(instring)-self.matchLen or instring[loc+self.matchLen].upper() not in self.identChars) ):
            return loc+self.matchLen, self.match
        raise ParseException(instring, loc, self.errmsg, self)

class CloseMatch(Token):
    """
    A variation on L{Literal} which matches "close" matches, that is, 
    strings with at most 'n' mismatching characters. C{CloseMatch} takes parameters:
     - C{match_string} - string to be matched
     - C{maxMismatches} - (C{default=1}) maximum number of mismatches allowed to count as a match
    
    The results from a successful parse will contain the matched text from the input string and the following named results:
     - C{mismatches} - a list of the positions within the match_string where mismatches were found
     - C{original} - the original match_string used to compare against the input string
    
    If C{mismatches} is an empty list, then the match was an exact match.
    
    Example::
        patt = CloseMatch("ATCATCGAATGGA")
        patt.parseString("ATCATCGAAXGGA") # -> (['ATCATCGAAXGGA'], {'mismatches': [[9]], 'original': ['ATCATCGAATGGA']})
        patt.parseString("ATCAXCGAAXGGA") # -> Exception: Expected 'ATCATCGAATGGA' (with up to 1 mismatches) (at char 0), (line:1, col:1)

        # exact match
        patt.parseString("ATCATCGAATGGA") # -> (['ATCATCGAATGGA'], {'mismatches': [[]], 'original': ['ATCATCGAATGGA']})

        # close match allowing up to 2 mismatches
        patt = CloseMatch("ATCATCGAATGGA", maxMismatches=2)
        patt.parseString("ATCAXCGAAXGGA") # -> (['ATCAXCGAAXGGA'], {'mismatches': [[4, 9]], 'original': ['ATCATCGAATGGA']})
    """
    def __init__(self, match_string, maxMismatches=1):
        super(CloseMatch,self).__init__()
        self.name = match_string
        self.match_string = match_string
        self.maxMismatches = maxMismatches
        self.errmsg = "Expected %r (with up to %d mismatches)" % (self.match_string, self.maxMismatches)
        self.mayIndexError = False
        self.mayReturnEmpty = False

    def parseImpl( self, instring, loc, doActions=True ):
        start = loc
        instrlen = len(instring)
        maxloc = start + len(self.match_string)

        if maxloc <= instrlen:
            match_string = self.match_string
            match_stringloc = 0
            mismatches = []
            maxMismatches = self.maxMismatches

            for match_stringloc,s_m in enumerate(zip(instring[loc:maxloc], self.match_string)):
                src,mat = s_m
                if src != mat:
                    mismatches.append(match_stringloc)
                    if len(mismatches) > maxMismatches:
                        break
            else:
                # end location is relative to where the match started in instring,
                # not to the start of match_string
                loc = start + match_stringloc + 1
                results = ParseResults([instring[start:loc]])
                results['original'] = self.match_string
                results['mismatches'] = mismatches
                return loc, results

        raise ParseException(instring, loc, self.errmsg, self)


class Word(Token):
    """
    Token for matching words composed of allowed character sets.
    Defined with string containing all allowed initial characters,
    an optional string containing allowed body characters (if omitted,
    defaults to the initial character set), and an optional minimum,
    maximum, and/or exact length.  The default value for C{min} is 1 (a
    minimum value < 1 is not valid); the default values for C{max} and C{exact}
    are 0, meaning no maximum or exact length restriction. An optional
    C{excludeChars} parameter can list characters that might be found in 
    the input C{bodyChars} string; useful to define a word of all printables
    except for one or two characters, for instance.
    
    L{srange} is useful for defining custom character set strings for defining 
    C{Word} expressions, using range notation from regular expression character sets.
    
    A common mistake is to use C{Word} to match a specific literal string, as in 
    C{Word("Address")}. Remember that C{Word} uses the string argument to define
    I{sets} of matchable characters. This expression would match "Add", "AAA",
    "dAred", or any other word made up of the characters 'A', 'd', 'r', 'e', and 's'.
    To match an exact literal string, use L{Literal} or L{Keyword}.

    pyparsing includes helper strings for building Words:
     - L{alphas}
     - L{nums}
     - L{alphanums}
     - L{hexnums}
     - L{alphas8bit} (alphabetic characters in the Latin-1 range 128-255 - accented, tilded, umlauted, etc.)
     - L{punc8bit} (non-alphabetic characters in the Latin-1 range 128-255 - currency, symbols, superscripts, diacriticals, etc.)
     - L{printables} (any non-whitespace character)

    Example::
        # a word composed of digits
        integer = Word(nums) # equivalent to Word("0123456789") or Word(srange("0-9"))
        
        # a word with a leading capital, and zero or more lowercase
        capital_word = Word(alphas.upper(), alphas.lower())

        # hostnames are alphanumeric, with leading alpha, and '-'
        hostname = Word(alphas, alphanums+'-')
        
        # roman numeral (not a strict parser, accepts invalid mix of characters)
        roman = Word("IVXLCDM")
        
        # any string of non-whitespace characters, except for ','
        csv_value = Word(printables, excludeChars=",")
    """
    def __init__( self, initChars, bodyChars=None, min=1, max=0, exact=0, asKeyword=False, excludeChars=None ):
        super(Word,self).__init__()
        if excludeChars:
            initChars = ''.join(c for c in initChars if c not in excludeChars)
            if bodyChars:
                bodyChars = ''.join(c for c in bodyChars if c not in excludeChars)
        self.initCharsOrig = initChars
        self.initChars = set(initChars)
        if bodyChars :
            self.bodyCharsOrig = bodyChars
            self.bodyChars = set(bodyChars)
        else:
            self.bodyCharsOrig = initChars
            self.bodyChars = set(initChars)

        self.maxSpecified = max > 0

        if min < 1:
            raise ValueError("cannot specify a minimum length < 1; use Optional(Word()) if zero-length word is permitted")

        self.minLen = min

        if max > 0:
            self.maxLen = max
        else:
            self.maxLen = _MAX_INT

        if exact > 0:
            self.maxLen = exact
            self.minLen = exact

        self.name = _ustr(self)
        self.errmsg = "Expected " + self.name
        self.mayIndexError = False
        self.asKeyword = asKeyword

        if ' ' not in self.initCharsOrig+self.bodyCharsOrig and (min==1 and max==0 and exact==0):
            if self.bodyCharsOrig == self.initCharsOrig:
                self.reString = "[%s]+" % _escapeRegexRangeChars(self.initCharsOrig)
            elif len(self.initCharsOrig) == 1:
                self.reString = "%s[%s]*" % \
                                      (re.escape(self.initCharsOrig),
                                      _escapeRegexRangeChars(self.bodyCharsOrig),)
            else:
                self.reString = "[%s][%s]*" % \
                                      (_escapeRegexRangeChars(self.initCharsOrig),
                                      _escapeRegexRangeChars(self.bodyCharsOrig),)
            if self.asKeyword:
                self.reString = r"\b"+self.reString+r"\b"
            try:
                self.re = re.compile( self.reString )
            except Exception:
                self.re = None

    def parseImpl( self, instring, loc, doActions=True ):
        if self.re:
            result = self.re.match(instring,loc)
            if not result:
                raise ParseException(instring, loc, self.errmsg, self)

            loc = result.end()
            return loc, result.group()

        if not(instring[ loc ] in self.initChars):
            raise ParseException(instring, loc, self.errmsg, self)

        start = loc
        loc += 1
        instrlen = len(instring)
        bodychars = self.bodyChars
        maxloc = start + self.maxLen
        maxloc = min( maxloc, instrlen )
        while loc < maxloc and instring[loc] in bodychars:
            loc += 1

        throwException = False
        if loc - start < self.minLen:
            throwException = True
        if self.maxSpecified and loc < instrlen and instring[loc] in bodychars:
            throwException = True
        if self.asKeyword:
            if (start>0 and instring[start-1] in bodychars) or (loc<instrlen and instring[loc] in bodychars):
                throwException = True

        if throwException:
            raise ParseException(instring, loc, self.errmsg, self)

        return loc, instring[start:loc]

    def __str__( self ):
        try:
            return super(Word,self).__str__()
        except Exception:
            pass

        if self.strRepr is None:

            def charsAsStr(s):
                if len(s)>4:
                    return s[:4]+"..."
                else:
                    return s

            if ( self.initCharsOrig != self.bodyCharsOrig ):
                self.strRepr = "W:(%s,%s)" % ( charsAsStr(self.initCharsOrig), charsAsStr(self.bodyCharsOrig) )
            else:
                self.strRepr = "W:(%s)" % charsAsStr(self.initCharsOrig)

        return self.strRepr


class Regex(Token):
    """
    Token for matching strings that match a given regular expression.
    Defined with a string specifying the regular expression, in a form recognized by Python's built-in C{re} module.
    If the given regex contains named groups (defined using C{(?P<name>...)}), these will be preserved as 
    named parse results.

    Example::
        realnum = Regex(r"[+-]?\d+\.\d*")
        date = Regex(r'(?P<year>\d{4})-(?P<month>\d\d?)-(?P<day>\d\d?)')
        # ref: http://stackoverflow.com/questions/267399/how-do-you-match-only-valid-roman-numerals-with-a-regular-expression
        roman = Regex(r"M{0,4}(CM|CD|D?C{0,3})(XC|XL|L?X{0,3})(IX|IV|V?I{0,3})")
    """
    compiledREtype = type(re.compile("[A-Z]"))
    def __init__( self, pattern, flags=0):
        """The parameters C{pattern} and C{flags} are passed to the C{re.compile()} function as-is. See the Python C{re} module for an explanation of the acceptable patterns and flags."""
        super(Regex,self).__init__()

        if isinstance(pattern, basestring):
            if not pattern:
                warnings.warn("null string passed to Regex; use Empty() instead",
                        SyntaxWarning, stacklevel=2)

            self.pattern = pattern
            self.flags = flags

            try:
                self.re = re.compile(self.pattern, self.flags)
                self.reString = self.pattern
            except sre_constants.error:
                warnings.warn("invalid pattern (%s) passed to Regex" % pattern,
                    SyntaxWarning, stacklevel=2)
                raise

        elif isinstance(pattern, Regex.compiledREtype):
            self.re = pattern
            self.pattern = \
            self.reString = str(pattern)
            self.flags = flags
            
        else:
            raise ValueError("Regex may only be constructed with a string or a compiled RE object")

        self.name = _ustr(self)
        self.errmsg = "Expected " + self.name
        self.mayIndexError = False
        self.mayReturnEmpty = True

    def parseImpl( self, instring, loc, doActions=True ):
        result = self.re.match(instring,loc)
        if not result:
            raise ParseException(instring, loc, self.errmsg, self)

        loc = result.end()
        d = result.groupdict()
        ret = ParseResults(result.group())
        if d:
            for k in d:
                ret[k] = d[k]
        return loc,ret

    def __str__( self ):
        try:
            return super(Regex,self).__str__()
        except Exception:
            pass

        if self.strRepr is None:
            self.strRepr = "Re:(%s)" % repr(self.pattern)

        return self.strRepr


class QuotedString(Token):
    r"""
    Token for matching strings that are delimited by quoting characters.
    
    Defined with the following parameters:
        - quoteChar - string of one or more characters defining the quote delimiting string
        - escChar - character to escape quotes, typically backslash (default=C{None})
        - escQuote - special quote sequence to escape an embedded quote string (such as SQL's "" to escape an embedded ") (default=C{None})
        - multiline - boolean indicating whether quotes can span multiple lines (default=C{False})
        - unquoteResults - boolean indicating whether the matched text should be unquoted (default=C{True})
        - endQuoteChar - string of one or more characters defining the end of the quote delimited string (default=C{None} => same as quoteChar)
        - convertWhitespaceEscapes - convert escaped whitespace (C{'\t'}, C{'\n'}, etc.) to actual whitespace (default=C{True})

    Example::
        qs = QuotedString('"')
        print(qs.searchString('lsjdf "This is the quote" sldjf'))
        complex_qs = QuotedString('{{', endQuoteChar='}}')
        print(complex_qs.searchString('lsjdf {{This is the "quote"}} sldjf'))
        sql_qs = QuotedString('"', escQuote='""')
        print(sql_qs.searchString('lsjdf "This is the quote with ""embedded"" quotes" sldjf'))
    prints::
        [['This is the quote']]
        [['This is the "quote"']]
        [['This is the quote with "embedded" quotes']]
    """
    def __init__( self, quoteChar, escChar=None, escQuote=None, multiline=False, unquoteResults=True, endQuoteChar=None, convertWhitespaceEscapes=True):
        super(QuotedString,self).__init__()

        # remove white space from quote chars - won't work anyway
        quoteChar = quoteChar.strip()
        if not quoteChar:
            warnings.warn("quoteChar cannot be the empty string",SyntaxWarning,stacklevel=2)
            raise SyntaxError()

        if endQuoteChar is None:
            endQuoteChar = quoteChar
        else:
            endQuoteChar = endQuoteChar.strip()
            if not endQuoteChar:
                warnings.warn("endQuoteChar cannot be the empty string",SyntaxWarning,stacklevel=2)
                raise SyntaxError()

        self.quoteChar = quoteChar
        self.quoteCharLen = len(quoteChar)
        self.firstQuoteChar = quoteChar[0]
        self.endQuoteChar = endQuoteChar
        self.endQuoteCharLen = len(endQuoteChar)
        self.escChar = escChar
        self.escQuote = escQuote
        self.unquoteResults = unquoteResults
        self.convertWhitespaceEscapes = convertWhitespaceEscapes

        if multiline:
            self.flags = re.MULTILINE | re.DOTALL
            self.pattern = r'%s(?:[^%s%s]' % \
                ( re.escape(self.quoteChar),
                  _escapeRegexRangeChars(self.endQuoteChar[0]),
                  (escChar is not None and _escapeRegexRangeChars(escChar) or '') )
        else:
            self.flags = 0
            self.pattern = r'%s(?:[^%s\n\r%s]' % \
                ( re.escape(self.quoteChar),
                  _escapeRegexRangeChars(self.endQuoteChar[0]),
                  (escChar is not None and _escapeRegexRangeChars(escChar) or '') )
        if len(self.endQuoteChar) > 1:
            self.pattern += (
                '|(?:' + ')|(?:'.join("%s[^%s]" % (re.escape(self.endQuoteChar[:i]),
                                               _escapeRegexRangeChars(self.endQuoteChar[i]))
                                    for i in range(len(self.endQuoteChar)-1,0,-1)) + ')'
                )
        if escQuote:
            self.pattern += (r'|(?:%s)' % re.escape(escQuote))
        if escChar:
            self.pattern += (r'|(?:%s.)' % re.escape(escChar))
            self.escCharReplacePattern = re.escape(self.escChar)+"(.)"
        self.pattern += (r')*%s' % re.escape(self.endQuoteChar))

        try:
            self.re = re.compile(self.pattern, self.flags)
            self.reString = self.pattern
        except sre_constants.error:
            warnings.warn("invalid pattern (%s) passed to Regex" % self.pattern,
                SyntaxWarning, stacklevel=2)
            raise

        self.name = _ustr(self)
        self.errmsg = "Expected " + self.name
        self.mayIndexError = False
        self.mayReturnEmpty = True

    def parseImpl( self, instring, loc, doActions=True ):
        result = instring[loc] == self.firstQuoteChar and self.re.match(instring,loc) or None
        if not result:
            raise ParseException(instring, loc, self.errmsg, self)

        loc = result.end()
        ret = result.group()

        if self.unquoteResults:

            # strip off quotes
            ret = ret[self.quoteCharLen:-self.endQuoteCharLen]

            if isinstance(ret,basestring):
                # replace escaped whitespace
                if '\\' in ret and self.convertWhitespaceEscapes:
                    ws_map = {
                        r'\t' : '\t',
                        r'\n' : '\n',
                        r'\f' : '\f',
                        r'\r' : '\r',
                    }
                    for wslit,wschar in ws_map.items():
                        ret = ret.replace(wslit, wschar)

                # replace escaped characters
                if self.escChar:
                    ret = re.sub(self.escCharReplacePattern, r"\g<1>", ret)

                # replace escaped quotes
                if self.escQuote:
                    ret = ret.replace(self.escQuote, self.endQuoteChar)

        return loc, ret

    def __str__( self ):
        try:
            return super(QuotedString,self).__str__()
        except Exception:
            pass

        if self.strRepr is None:
            self.strRepr = "quoted string, starting with %s ending with %s" % (self.quoteChar, self.endQuoteChar)

        return self.strRepr


class CharsNotIn(Token):
    """
    Token for matching words composed of characters I{not} in a given set (will
    include whitespace in matched characters if not listed in the provided exclusion set - see example).
    Defined with string containing all disallowed characters, and an optional
    minimum, maximum, and/or exact length.  The default value for C{min} is 1 (a
    minimum value < 1 is not valid); the default values for C{max} and C{exact}
    are 0, meaning no maximum or exact length restriction.

    Example::
        # define a comma-separated-value as anything that is not a ','
        csv_value = CharsNotIn(',')
        print(delimitedList(csv_value).parseString("dkls,lsdkjf,s12 34,@!#,213"))
    prints::
        ['dkls', 'lsdkjf', 's12 34', '@!#', '213']
    """
    def __init__( self, notChars, min=1, max=0, exact=0 ):
        super(CharsNotIn,self).__init__()
        self.skipWhitespace = False
        self.notChars = notChars

        if min < 1:
            raise ValueError("cannot specify a minimum length < 1; use Optional(CharsNotIn()) if zero-length char group is permitted")

        self.minLen = min

        if max > 0:
            self.maxLen = max
        else:
            self.maxLen = _MAX_INT

        if exact > 0:
            self.maxLen = exact
            self.minLen = exact

        self.name = _ustr(self)
        self.errmsg = "Expected " + self.name
        self.mayReturnEmpty = ( self.minLen == 0 )
        self.mayIndexError = False

    def parseImpl( self, instring, loc, doActions=True ):
        if instring[loc] in self.notChars:
            raise ParseException(instring, loc, self.errmsg, self)

        start = loc
        loc += 1
        notchars = self.notChars
        maxlen = min( start+self.maxLen, len(instring) )
        while loc < maxlen and \
              (instring[loc] not in notchars):
            loc += 1

        if loc - start < self.minLen:
            raise ParseException(instring, loc, self.errmsg, self)

        return loc, instring[start:loc]

    def __str__( self ):
        try:
            return super(CharsNotIn, self).__str__()
        except Exception:
            pass

        if self.strRepr is None:
            if len(self.notChars) > 4:
                self.strRepr = "!W:(%s...)" % self.notChars[:4]
            else:
                self.strRepr = "!W:(%s)" % self.notChars

        return self.strRepr

class White(Token):
    """
    Special matching class for matching whitespace.  Normally, whitespace is ignored
    by pyparsing grammars.  This class is included when some whitespace structures
    are significant.  Define with a string containing the whitespace characters to be
    matched; default is C{" \\t\\r\\n"}.  Also takes optional C{min}, C{max}, and C{exact} arguments,
    as defined for the C{L{Word}} class.
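
    Example::
        # a sketch: capture the run of spaces between two words, instead of
        # letting pyparsing silently skip over it
        significant_ws = Word(alphas) + White(" ") + Word(alphas)
        significant_ws.parseString("hello   world")  # -> ['hello', '   ', 'world']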
    """
    whiteStrs = {
        " " : "<SPC>",
        "\t": "<TAB>",
        "\n": "<LF>",
        "\r": "<CR>",
        "\f": "<FF>",
        }
    def __init__(self, ws=" \t\r\n", min=1, max=0, exact=0):
        super(White,self).__init__()
        self.matchWhite = ws
        self.setWhitespaceChars( "".join(c for c in self.whiteChars if c not in self.matchWhite) )
        #~ self.leaveWhitespace()
        self.name = ("".join(White.whiteStrs[c] for c in self.matchWhite))
        self.mayReturnEmpty = True
        self.errmsg = "Expected " + self.name

        self.minLen = min

        if max > 0:
            self.maxLen = max
        else:
            self.maxLen = _MAX_INT

        if exact > 0:
            self.maxLen = exact
            self.minLen = exact

    def parseImpl( self, instring, loc, doActions=True ):
        if not(instring[ loc ] in self.matchWhite):
            raise ParseException(instring, loc, self.errmsg, self)
        start = loc
        loc += 1
        maxloc = start + self.maxLen
        maxloc = min( maxloc, len(instring) )
        while loc < maxloc and instring[loc] in self.matchWhite:
            loc += 1

        if loc - start < self.minLen:
            raise ParseException(instring, loc, self.errmsg, self)

        return loc, instring[start:loc]


class _PositionToken(Token):
    def __init__( self ):
        super(_PositionToken,self).__init__()
        self.name=self.__class__.__name__
        self.mayReturnEmpty = True
        self.mayIndexError = False

class GoToColumn(_PositionToken):
    """
    Token to advance to a specific column of input text; useful for tabular report scraping.
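
    Example::
        # a sketch: in a fixed-width report, advance to the column where the
        # quantity field begins before reading it
        item_name = Word(alphas)
        item_with_qty = item_name + GoToColumn(12) + Word(nums)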
    """
    def __init__( self, colno ):
        super(GoToColumn,self).__init__()
        self.col = colno

    def preParse( self, instring, loc ):
        if col(loc,instring) != self.col:
            instrlen = len(instring)
            if self.ignoreExprs:
                loc = self._skipIgnorables( instring, loc )
            while loc < instrlen and instring[loc].isspace() and col( loc, instring ) != self.col :
                loc += 1
        return loc

    def parseImpl( self, instring, loc, doActions=True ):
        thiscol = col( loc, instring )
        if thiscol > self.col:
            raise ParseException( instring, loc, "Text not in expected column", self )
        newloc = loc + self.col - thiscol
        ret = instring[ loc: newloc ]
        return newloc, ret


class LineStart(_PositionToken):
    """
    Matches if current position is at the beginning of a line within the parse string
    
    Example::
    
        test = '''\
        AAA this line
        AAA and this line
          AAA but not this one
        B AAA and definitely not this one
        '''

        for t in (LineStart() + 'AAA' + restOfLine).searchString(test):
            print(t)
    
    Prints::
        ['AAA', ' this line']
        ['AAA', ' and this line']    

    """
    def __init__( self ):
        super(LineStart,self).__init__()
        self.errmsg = "Expected start of line"

    def parseImpl( self, instring, loc, doActions=True ):
        if col(loc, instring) == 1:
            return loc, []
        raise ParseException(instring, loc, self.errmsg, self)

class LineEnd(_PositionToken):
    """
    Matches if current position is at the end of a line within the parse string
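
    Example::
        # a sketch: explicitly consume the newline at the end of a line
        eol_expr = Word(alphas) + LineEnd()
        eol_expr.parseString("hello\\n")  # -> ['hello', '\\n']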
    """
    def __init__( self ):
        super(LineEnd,self).__init__()
        self.setWhitespaceChars( ParserElement.DEFAULT_WHITE_CHARS.replace("\n","") )
        self.errmsg = "Expected end of line"

    def parseImpl( self, instring, loc, doActions=True ):
        if loc<len(instring):
            if instring[loc] == "\n":
                return loc+1, "\n"
            else:
                raise ParseException(instring, loc, self.errmsg, self)
        elif loc == len(instring):
            return loc+1, []
        else:
            raise ParseException(instring, loc, self.errmsg, self)

class StringStart(_PositionToken):
    """
    Matches if current position is at the beginning of the parse string
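
    Example::
        # a sketch: only report a match when it occurs at the very start of the text
        lead_word = StringStart() + Word(alphas)
        print(lead_word.searchString("hello world"))  # -> [['hello']]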
    """
    def __init__( self ):
        super(StringStart,self).__init__()
        self.errmsg = "Expected start of text"

    def parseImpl( self, instring, loc, doActions=True ):
        if loc != 0:
            # see if entire string up to here is just whitespace and ignorables
            if loc != self.preParse( instring, 0 ):
                raise ParseException(instring, loc, self.errmsg, self)
        return loc, []

class StringEnd(_PositionToken):
    """
    Matches if current position is at the end of the parse string
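
    Example::
        # a sketch: StringEnd is the check that C{parseAll=True} adds implicitly
        word = Word(alphas)
        (word + StringEnd()).parseString("hello")   # -> ['hello']
        (word + StringEnd()).parseString("hello!")  # -> Exception: Expected end of text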
    """
    def __init__( self ):
        super(StringEnd,self).__init__()
        self.errmsg = "Expected end of text"

    def parseImpl( self, instring, loc, doActions=True ):
        if loc < len(instring):
            raise ParseException(instring, loc, self.errmsg, self)
        elif loc == len(instring):
            return loc+1, []
        elif loc > len(instring):
            return loc, []
        else:
            raise ParseException(instring, loc, self.errmsg, self)

class WordStart(_PositionToken):
    """
    Matches if the current position is at the beginning of a Word, and
    is not preceded by any character in a given set of C{wordChars}
    (default=C{printables}). To emulate the C{\\b} behavior of regular expressions,
    use C{WordStart(alphanums)}. C{WordStart} will also match at the beginning of
    the string being parsed, or at the beginning of a line.
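
    Example::
        # a sketch: match 'cat' only at the start of a word - matches in "cat"
        # and "catalog", but not in "bobcat"
        word_cat = WordStart() + Literal("cat")
        print(word_cat.searchString("cat catalog bobcat"))  # -> [['cat'], ['cat']]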
    """
    def __init__(self, wordChars = printables):
        super(WordStart,self).__init__()
        self.wordChars = set(wordChars)
        self.errmsg = "Not at the start of a word"

    def parseImpl(self, instring, loc, doActions=True ):
        if loc != 0:
            if (instring[loc-1] in self.wordChars or
                instring[loc] not in self.wordChars):
                raise ParseException(instring, loc, self.errmsg, self)
        return loc, []

class WordEnd(_PositionToken):
    """
    Matches if the current position is at the end of a Word, and
    is not followed by any character in a given set of C{wordChars}
    (default=C{printables}). To emulate the C{\\b} behavior of regular expressions,
    use C{WordEnd(alphanums)}. C{WordEnd} will also match at the end of
    the string being parsed, or at the end of a line.
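
    Example::
        # a sketch: match 'cat' only at the end of a word - matches in "cat"
        # and "bobcat", but not in "catalog"
        cat_end = Literal("cat") + WordEnd()
        print(cat_end.searchString("cat catalog bobcat"))  # -> [['cat'], ['cat']]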
    """
    def __init__(self, wordChars = printables):
        super(WordEnd,self).__init__()
        self.wordChars = set(wordChars)
        self.skipWhitespace = False
        self.errmsg = "Not at the end of a word"

    def parseImpl(self, instring, loc, doActions=True ):
        instrlen = len(instring)
        if instrlen>0 and loc<instrlen:
            if (instring[loc] in self.wordChars or
                instring[loc-1] not in self.wordChars):
                raise ParseException(instring, loc, self.errmsg, self)
        return loc, []


class ParseExpression(ParserElement):
    """
    Abstract subclass of ParserElement, for combining and post-processing parsed tokens.
    """
    def __init__( self, exprs, savelist = False ):
        super(ParseExpression,self).__init__(savelist)
        if isinstance( exprs, _generatorType ):
            exprs = list(exprs)

        if isinstance( exprs, basestring ):
            self.exprs = [ ParserElement._literalStringClass( exprs ) ]
        elif isinstance( exprs, collections.Iterable ):
            exprs = list(exprs)
            # if sequence of strings provided, wrap with Literal
            if all(isinstance(expr, basestring) for expr in exprs):
                exprs = map(ParserElement._literalStringClass, exprs)
            self.exprs = list(exprs)
        else:
            try:
                self.exprs = list( exprs )
            except TypeError:
                self.exprs = [ exprs ]
        self.callPreparse = False

    def __getitem__( self, i ):
        return self.exprs[i]

    def append( self, other ):
        self.exprs.append( other )
        self.strRepr = None
        return self

    def leaveWhitespace( self ):
        """Extends C{leaveWhitespace} defined in base class, and also invokes C{leaveWhitespace} on
           all contained expressions."""
        self.skipWhitespace = False
        self.exprs = [ e.copy() for e in self.exprs ]
        for e in self.exprs:
            e.leaveWhitespace()
        return self

    def ignore( self, other ):
        if isinstance( other, Suppress ):
            if other not in self.ignoreExprs:
                super( ParseExpression, self).ignore( other )
                for e in self.exprs:
                    e.ignore( self.ignoreExprs[-1] )
        else:
            super( ParseExpression, self).ignore( other )
            for e in self.exprs:
                e.ignore( self.ignoreExprs[-1] )
        return self

    def __str__( self ):
        try:
            return super(ParseExpression,self).__str__()
        except Exception:
            pass

        if self.strRepr is None:
            self.strRepr = "%s:(%s)" % ( self.__class__.__name__, _ustr(self.exprs) )
        return self.strRepr

    def streamline( self ):
        super(ParseExpression,self).streamline()

        for e in self.exprs:
            e.streamline()

        # collapse nested And's of the form And( And( And( a,b), c), d) to And( a,b,c,d )
        # but only if there are no parse actions or resultsNames on the nested And's
        # (likewise for Or's and MatchFirst's)
        if ( len(self.exprs) == 2 ):
            other = self.exprs[0]
            if ( isinstance( other, self.__class__ ) and
                  not(other.parseAction) and
                  other.resultsName is None and
                  not other.debug ):
                self.exprs = other.exprs[:] + [ self.exprs[1] ]
                self.strRepr = None
                self.mayReturnEmpty |= other.mayReturnEmpty
                self.mayIndexError  |= other.mayIndexError

            other = self.exprs[-1]
            if ( isinstance( other, self.__class__ ) and
                  not(other.parseAction) and
                  other.resultsName is None and
                  not other.debug ):
                self.exprs = self.exprs[:-1] + other.exprs[:]
                self.strRepr = None
                self.mayReturnEmpty |= other.mayReturnEmpty
                self.mayIndexError  |= other.mayIndexError

        self.errmsg = "Expected " + _ustr(self)
        
        return self

    def setResultsName( self, name, listAllMatches=False ):
        ret = super(ParseExpression,self).setResultsName(name,listAllMatches)
        return ret

    def validate( self, validateTrace=[] ):
        tmp = validateTrace[:]+[self]
        for e in self.exprs:
            e.validate(tmp)
        self.checkRecursion( [] )
        
    def copy(self):
        ret = super(ParseExpression,self).copy()
        ret.exprs = [e.copy() for e in self.exprs]
        return ret

class And(ParseExpression):
    """
    Requires all given C{ParseExpression}s to be found in the given order.
    Expressions may be separated by whitespace.
    May be constructed using the C{'+'} operator.
    May also be constructed using the C{'-'} operator, which will suppress backtracking.

    Example::
        integer = Word(nums)
        name_expr = OneOrMore(Word(alphas))

        expr = And([integer("id"),name_expr("name"),integer("age")])
        # more easily written as:
        expr = integer("id") + name_expr("name") + integer("age")
    """

    class _ErrorStop(Empty):
        def __init__(self, *args, **kwargs):
            super(And._ErrorStop,self).__init__(*args, **kwargs)
            self.name = '-'
            self.leaveWhitespace()

    def __init__( self, exprs, savelist = True ):
        super(And,self).__init__(exprs, savelist)
        self.mayReturnEmpty = all(e.mayReturnEmpty for e in self.exprs)
        self.setWhitespaceChars( self.exprs[0].whiteChars )
        self.skipWhitespace = self.exprs[0].skipWhitespace
        self.callPreparse = True

    def parseImpl( self, instring, loc, doActions=True ):
        # pass False as last arg to _parse for first element, since we already
        # pre-parsed the string as part of our And pre-parsing
        loc, resultlist = self.exprs[0]._parse( instring, loc, doActions, callPreParse=False )
        errorStop = False
        for e in self.exprs[1:]:
            if isinstance(e, And._ErrorStop):
                errorStop = True
                continue
            if errorStop:
                try:
                    loc, exprtokens = e._parse( instring, loc, doActions )
                except ParseSyntaxException:
                    raise
                except ParseBaseException as pe:
                    pe.__traceback__ = None
                    raise ParseSyntaxException._from_exception(pe)
                except IndexError:
                    raise ParseSyntaxException(instring, len(instring), self.errmsg, self)
            else:
                loc, exprtokens = e._parse( instring, loc, doActions )
            if exprtokens or exprtokens.haskeys():
                resultlist += exprtokens
        return loc, resultlist

    def __iadd__(self, other ):
        if isinstance( other, basestring ):
            other = ParserElement._literalStringClass( other )
        return self.append( other ) #And( [ self, other ] )

    def checkRecursion( self, parseElementList ):
        subRecCheckList = parseElementList[:] + [ self ]
        for e in self.exprs:
            e.checkRecursion( subRecCheckList )
            if not e.mayReturnEmpty:
                break

    def __str__( self ):
        if hasattr(self,"name"):
            return self.name

        if self.strRepr is None:
            self.strRepr = "{" + " ".join(_ustr(e) for e in self.exprs) + "}"

        return self.strRepr


class Or(ParseExpression):
    """
    Requires that at least one C{ParseExpression} is found.
    If two expressions match, the expression that matches the longest string will be used.
    May be constructed using the C{'^'} operator.

    Example::
        # construct Or using '^' operator
        
        number = Word(nums) ^ Combine(Word(nums) + '.' + Word(nums))
        print(number.searchString("123 3.1416 789"))
    prints::
        [['123'], ['3.1416'], ['789']]
    """
    def __init__( self, exprs, savelist = False ):
        super(Or,self).__init__(exprs, savelist)
        if self.exprs:
            self.mayReturnEmpty = any(e.mayReturnEmpty for e in self.exprs)
        else:
            self.mayReturnEmpty = True

    def parseImpl( self, instring, loc, doActions=True ):
        maxExcLoc = -1
        maxException = None
        matches = []
        for e in self.exprs:
            try:
                loc2 = e.tryParse( instring, loc )
            except ParseException as err:
                err.__traceback__ = None
                if err.loc > maxExcLoc:
                    maxException = err
                    maxExcLoc = err.loc
            except IndexError:
                if len(instring) > maxExcLoc:
                    maxException = ParseException(instring,len(instring),e.errmsg,self)
                    maxExcLoc = len(instring)
            else:
                # save match among all matches, to retry longest to shortest
                matches.append((loc2, e))

        if matches:
            matches.sort(key=lambda x: -x[0])
            for _,e in matches:
                try:
                    return e._parse( instring, loc, doActions )
                except ParseException as err:
                    err.__traceback__ = None
                    if err.loc > maxExcLoc:
                        maxException = err
                        maxExcLoc = err.loc

        if maxException is not None:
            maxException.msg = self.errmsg
            raise maxException
        else:
            raise ParseException(instring, loc, "no defined alternatives to match", self)


    def __ixor__(self, other ):
        if isinstance( other, basestring ):
            other = ParserElement._literalStringClass( other )
        return self.append( other ) #Or( [ self, other ] )

    def __str__( self ):
        if hasattr(self,"name"):
            return self.name

        if self.strRepr is None:
            self.strRepr = "{" + " ^ ".join(_ustr(e) for e in self.exprs) + "}"

        return self.strRepr

    def checkRecursion( self, parseElementList ):
        subRecCheckList = parseElementList[:] + [ self ]
        for e in self.exprs:
            e.checkRecursion( subRecCheckList )


class MatchFirst(ParseExpression):
    """
    Requires that at least one C{ParseExpression} is found.
    If two expressions match, the first one listed is the one that will match.
    May be constructed using the C{'|'} operator.

    Example::
        # construct MatchFirst using '|' operator
        
        # watch the order of expressions to match
        number = Word(nums) | Combine(Word(nums) + '.' + Word(nums))
        print(number.searchString("123 3.1416 789")) #  Fail! -> [['123'], ['3'], ['1416'], ['789']]

        # put more selective expression first
        number = Combine(Word(nums) + '.' + Word(nums)) | Word(nums)
        print(number.searchString("123 3.1416 789")) #  Better -> [['123'], ['3.1416'], ['789']]
    """
    def __init__( self, exprs, savelist = False ):
        super(MatchFirst,self).__init__(exprs, savelist)
        if self.exprs:
            self.mayReturnEmpty = any(e.mayReturnEmpty for e in self.exprs)
        else:
            self.mayReturnEmpty = True

    def parseImpl( self, instring, loc, doActions=True ):
        maxExcLoc = -1
        maxException = None
        for e in self.exprs:
            try:
                ret = e._parse( instring, loc, doActions )
                return ret
            except ParseException as err:
                if err.loc > maxExcLoc:
                    maxException = err
                    maxExcLoc = err.loc
            except IndexError:
                if len(instring) > maxExcLoc:
                    maxException = ParseException(instring,len(instring),e.errmsg,self)
                    maxExcLoc = len(instring)

        # only got here if no expression matched, raise exception for match that made it the furthest
        else:
            if maxException is not None:
                maxException.msg = self.errmsg
                raise maxException
            else:
                raise ParseException(instring, loc, "no defined alternatives to match", self)

    def __ior__(self, other ):
        if isinstance( other, basestring ):
            other = ParserElement._literalStringClass( other )
        return self.append( other ) #MatchFirst( [ self, other ] )

    def __str__( self ):
        if hasattr(self,"name"):
            return self.name

        if self.strRepr is None:
            self.strRepr = "{" + " | ".join(_ustr(e) for e in self.exprs) + "}"

        return self.strRepr

    def checkRecursion( self, parseElementList ):
        subRecCheckList = parseElementList[:] + [ self ]
        for e in self.exprs:
            e.checkRecursion( subRecCheckList )


class Each(ParseExpression):
    """
    Requires all given C{ParseExpression}s to be found, but in any order.
    Expressions may be separated by whitespace.
    May be constructed using the C{'&'} operator.

    Example::
        color = oneOf("RED ORANGE YELLOW GREEN BLUE PURPLE BLACK WHITE BROWN")
        shape_type = oneOf("SQUARE CIRCLE TRIANGLE STAR HEXAGON OCTAGON")
        integer = Word(nums)
        shape_attr = "shape:" + shape_type("shape")
        posn_attr = "posn:" + Group(integer("x") + ',' + integer("y"))("posn")
        color_attr = "color:" + color("color")
        size_attr = "size:" + integer("size")

        # use Each (using operator '&') to accept attributes in any order 
        # (shape and posn are required, color and size are optional)
        shape_spec = shape_attr & posn_attr & Optional(color_attr) & Optional(size_attr)

        shape_spec.runTests('''
            shape: SQUARE color: BLACK posn: 100, 120
            shape: CIRCLE size: 50 color: BLUE posn: 50,80
            color:GREEN size:20 shape:TRIANGLE posn:20,40
            '''
            )
    prints::
        shape: SQUARE color: BLACK posn: 100, 120
        ['shape:', 'SQUARE', 'color:', 'BLACK', 'posn:', ['100', ',', '120']]
        - color: BLACK
        - posn: ['100', ',', '120']
          - x: 100
          - y: 120
        - shape: SQUARE


        shape: CIRCLE size: 50 color: BLUE posn: 50,80
        ['shape:', 'CIRCLE', 'size:', '50', 'color:', 'BLUE', 'posn:', ['50', ',', '80']]
        - color: BLUE
        - posn: ['50', ',', '80']
          - x: 50
          - y: 80
        - shape: CIRCLE
        - size: 50


        color: GREEN size: 20 shape: TRIANGLE posn: 20,40
        ['color:', 'GREEN', 'size:', '20', 'shape:', 'TRIANGLE', 'posn:', ['20', ',', '40']]
        - color: GREEN
        - posn: ['20', ',', '40']
          - x: 20
          - y: 40
        - shape: TRIANGLE
        - size: 20
    """
    def __init__( self, exprs, savelist = True ):
        super(Each,self).__init__(exprs, savelist)
        self.mayReturnEmpty = all(e.mayReturnEmpty for e in self.exprs)
        self.skipWhitespace = True
        self.initExprGroups = True

    def parseImpl( self, instring, loc, doActions=True ):
        if self.initExprGroups:
            self.opt1map = dict((id(e.expr),e) for e in self.exprs if isinstance(e,Optional))
            opt1 = [ e.expr for e in self.exprs if isinstance(e,Optional) ]
            opt2 = [ e for e in self.exprs if e.mayReturnEmpty and not isinstance(e,Optional)]
            self.optionals = opt1 + opt2
            self.multioptionals = [ e.expr for e in self.exprs if isinstance(e,ZeroOrMore) ]
            self.multirequired = [ e.expr for e in self.exprs if isinstance(e,OneOrMore) ]
            self.required = [ e for e in self.exprs if not isinstance(e,(Optional,ZeroOrMore,OneOrMore)) ]
            self.required += self.multirequired
            self.initExprGroups = False
        tmpLoc = loc
        tmpReqd = self.required[:]
        tmpOpt  = self.optionals[:]
        matchOrder = []

        keepMatching = True
        while keepMatching:
            tmpExprs = tmpReqd + tmpOpt + self.multioptionals + self.multirequired
            failed = []
            for e in tmpExprs:
                try:
                    tmpLoc = e.tryParse( instring, tmpLoc )
                except ParseException:
                    failed.append(e)
                else:
                    matchOrder.append(self.opt1map.get(id(e),e))
                    if e in tmpReqd:
                        tmpReqd.remove(e)
                    elif e in tmpOpt:
                        tmpOpt.remove(e)
            if len(failed) == len(tmpExprs):
                keepMatching = False

        if tmpReqd:
            missing = ", ".join(_ustr(e) for e in tmpReqd)
            raise ParseException(instring,loc,"Missing one or more required elements (%s)" % missing )

        # add any unmatched Optionals, in case they have default values defined
        matchOrder += [e for e in self.exprs if isinstance(e,Optional) and e.expr in tmpOpt]

        resultlist = []
        for e in matchOrder:
            loc,results = e._parse(instring,loc,doActions)
            resultlist.append(results)

        finalResults = sum(resultlist, ParseResults([]))
        return loc, finalResults

    def __str__( self ):
        if hasattr(self,"name"):
            return self.name

        if self.strRepr is None:
            self.strRepr = "{" + " & ".join(_ustr(e) for e in self.exprs) + "}"

        return self.strRepr

    def checkRecursion( self, parseElementList ):
        subRecCheckList = parseElementList[:] + [ self ]
        for e in self.exprs:
            e.checkRecursion( subRecCheckList )


class ParseElementEnhance(ParserElement):
    """
    Abstract subclass of C{ParserElement}, for combining and post-processing parsed tokens.
    """
    def __init__( self, expr, savelist=False ):
        super(ParseElementEnhance,self).__init__(savelist)
        if isinstance( expr, basestring ):
            if issubclass(ParserElement._literalStringClass, Token):
                expr = ParserElement._literalStringClass(expr)
            else:
                expr = ParserElement._literalStringClass(Literal(expr))
        self.expr = expr
        self.strRepr = None
        if expr is not None:
            self.mayIndexError = expr.mayIndexError
            self.mayReturnEmpty = expr.mayReturnEmpty
            self.setWhitespaceChars( expr.whiteChars )
            self.skipWhitespace = expr.skipWhitespace
            self.saveAsList = expr.saveAsList
            self.callPreparse = expr.callPreparse
            self.ignoreExprs.extend(expr.ignoreExprs)

    def parseImpl( self, instring, loc, doActions=True ):
        if self.expr is not None:
            return self.expr._parse( instring, loc, doActions, callPreParse=False )
        else:
            raise ParseException("",loc,self.errmsg,self)

    def leaveWhitespace( self ):
        self.skipWhitespace = False
        self.expr = self.expr.copy()
        if self.expr is not None:
            self.expr.leaveWhitespace()
        return self

    def ignore( self, other ):
        if isinstance( other, Suppress ):
            if other not in self.ignoreExprs:
                super( ParseElementEnhance, self).ignore( other )
                if self.expr is not None:
                    self.expr.ignore( self.ignoreExprs[-1] )
        else:
            super( ParseElementEnhance, self).ignore( other )
            if self.expr is not None:
                self.expr.ignore( self.ignoreExprs[-1] )
        return self

    def streamline( self ):
        super(ParseElementEnhance,self).streamline()
        if self.expr is not None:
            self.expr.streamline()
        return self

    def checkRecursion( self, parseElementList ):
        if self in parseElementList:
            raise RecursiveGrammarException( parseElementList+[self] )
        subRecCheckList = parseElementList[:] + [ self ]
        if self.expr is not None:
            self.expr.checkRecursion( subRecCheckList )

    def validate( self, validateTrace=[] ):
        tmp = validateTrace[:]+[self]
        if self.expr is not None:
            self.expr.validate(tmp)
        self.checkRecursion( [] )

    def __str__( self ):
        try:
            return super(ParseElementEnhance,self).__str__()
        except Exception:
            pass

        if self.strRepr is None and self.expr is not None:
            self.strRepr = "%s:(%s)" % ( self.__class__.__name__, _ustr(self.expr) )
        return self.strRepr


class FollowedBy(ParseElementEnhance):
    """
    Lookahead matching of the given parse expression.  C{FollowedBy}
    does I{not} advance the parsing position within the input string, it only
    verifies that the specified parse expression matches at the current
    position.  C{FollowedBy} always returns a null token list.

    Example::
        # use FollowedBy to match a label only if it is followed by a ':'
        data_word = Word(alphas)
        label = data_word + FollowedBy(':')
        attr_expr = Group(label + Suppress(':') + OneOrMore(data_word, stopOn=label).setParseAction(' '.join))
        
        OneOrMore(attr_expr).parseString("shape: SQUARE color: BLACK posn: upper left").pprint()
    prints::
        [['shape', 'SQUARE'], ['color', 'BLACK'], ['posn', 'upper left']]
    """
    def __init__( self, expr ):
        super(FollowedBy,self).__init__(expr)
        self.mayReturnEmpty = True

    def parseImpl( self, instring, loc, doActions=True ):
        self.expr.tryParse( instring, loc )
        return loc, []


class NotAny(ParseElementEnhance):
    """
    Lookahead to disallow matching with the given parse expression.  C{NotAny}
    does I{not} advance the parsing position within the input string, it only
    verifies that the specified parse expression does I{not} match at the current
    position.  Also, C{NotAny} does I{not} skip over leading whitespace. C{NotAny}
    always returns a null token list.  May be constructed using the '~' operator.

    Example::
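        # a sketch: use NotAny (the '~' operator) to reject integers that are
        # immediately followed by a '%' sign
        plain_integer = Word(nums) + ~Literal("%")
        plain_integer.parseString("100")   # -> ['100']
        plain_integer.parseString("100%")  # -> Exception: Found unwanted token, "%"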
        
    """
    def __init__( self, expr ):
        super(NotAny,self).__init__(expr)
        #~ self.leaveWhitespace()
        self.skipWhitespace = False  # do NOT use self.leaveWhitespace(), don't want to propagate to exprs
        self.mayReturnEmpty = True
        self.errmsg = "Found unwanted token, "+_ustr(self.expr)

    def parseImpl( self, instring, loc, doActions=True ):
        if self.expr.canParseNext(instring, loc):
            raise ParseException(instring, loc, self.errmsg, self)
        return loc, []

    def __str__( self ):
        if hasattr(self,"name"):
            return self.name

        if self.strRepr is None:
            self.strRepr = "~{" + _ustr(self.expr) + "}"

        return self.strRepr

class _MultipleMatch(ParseElementEnhance):
    def __init__( self, expr, stopOn=None):
        super(_MultipleMatch, self).__init__(expr)
        self.saveAsList = True
        ender = stopOn
        if isinstance(ender, basestring):
            ender = ParserElement._literalStringClass(ender)
        self.not_ender = ~ender if ender is not None else None

    def parseImpl( self, instring, loc, doActions=True ):
        self_expr_parse = self.expr._parse
        self_skip_ignorables = self._skipIgnorables
        check_ender = self.not_ender is not None
        if check_ender:
            try_not_ender = self.not_ender.tryParse
        
        # must be at least one (but first see if we are the stopOn sentinel;
        # if so, fail)
        if check_ender:
            try_not_ender(instring, loc)
        loc, tokens = self_expr_parse( instring, loc, doActions, callPreParse=False )
        try:
            hasIgnoreExprs = (not not self.ignoreExprs)
            while 1:
                if check_ender:
                    try_not_ender(instring, loc)
                if hasIgnoreExprs:
                    preloc = self_skip_ignorables( instring, loc )
                else:
                    preloc = loc
                loc, tmptokens = self_expr_parse( instring, preloc, doActions )
                if tmptokens or tmptokens.haskeys():
                    tokens += tmptokens
        except (ParseException,IndexError):
            pass

        return loc, tokens
        
class OneOrMore(_MultipleMatch):
    """
    Repetition of one or more of the given expression.
    
    Parameters:
     - expr - expression that must match one or more times
     - stopOn - (default=C{None}) - expression for a terminating sentinel
          (only required if the sentinel would ordinarily match the repetition 
          expression)          

    Example::
        data_word = Word(alphas)
        label = data_word + FollowedBy(':')
        attr_expr = Group(label + Suppress(':') + OneOrMore(data_word).setParseAction(' '.join))

        text = "shape: SQUARE posn: upper left color: BLACK"
        OneOrMore(attr_expr).parseString(text).pprint()  # Fail! read 'color' as data instead of next label -> [['shape', 'SQUARE color']]

        # use stopOn attribute for OneOrMore to avoid reading label string as part of the data
        attr_expr = Group(label + Suppress(':') + OneOrMore(data_word, stopOn=label).setParseAction(' '.join))
        OneOrMore(attr_expr).parseString(text).pprint() # Better -> [['shape', 'SQUARE'], ['posn', 'upper left'], ['color', 'BLACK']]
        
        # could also be written as
        (attr_expr * (1,)).parseString(text).pprint()
    """

    def __str__( self ):
        if hasattr(self,"name"):
            return self.name

        if self.strRepr is None:
            self.strRepr = "{" + _ustr(self.expr) + "}..."

        return self.strRepr

class ZeroOrMore(_MultipleMatch):
    """
    Optional repetition of zero or more of the given expression.
    
    Parameters:
     - expr - expression that must match zero or more times
     - stopOn - (default=C{None}) - expression for a terminating sentinel
          (only required if the sentinel would ordinarily match the repetition 
          expression)          

    Example: similar to L{OneOrMore}
    """
    def __init__( self, expr, stopOn=None):
        super(ZeroOrMore,self).__init__(expr, stopOn=stopOn)
        self.mayReturnEmpty = True
        
    def parseImpl( self, instring, loc, doActions=True ):
        try:
            return super(ZeroOrMore, self).parseImpl(instring, loc, doActions)
        except (ParseException,IndexError):
            return loc, []

    def __str__( self ):
        if hasattr(self,"name"):
            return self.name

        if self.strRepr is None:
            self.strRepr = "[" + _ustr(self.expr) + "]..."

        return self.strRepr

class _NullToken(object):
    def __bool__(self):
        return False
    __nonzero__ = __bool__
    def __str__(self):
        return ""

_optionalNotMatched = _NullToken()
class Optional(ParseElementEnhance):
    """
    Optional matching of the given expression.

    Parameters:
     - expr - expression that may or may not match
     - default (optional) - value to be returned if the optional expression is not found.

    Example::
        # US postal code can be a 5-digit zip, plus optional 4-digit qualifier
        zip = Combine(Word(nums, exact=5) + Optional('-' + Word(nums, exact=4)))
        zip.runTests('''
            # traditional ZIP code
            12345
            
            # ZIP+4 form
            12101-0001
            
            # invalid ZIP
            98765-
            ''')
    prints::
        # traditional ZIP code
        12345
        ['12345']

        # ZIP+4 form
        12101-0001
        ['12101-0001']

        # invalid ZIP
        98765-
             ^
        FAIL: Expected end of text (at char 5), (line:1, col:6)
    """
    def __init__( self, expr, default=_optionalNotMatched ):
        super(Optional,self).__init__( expr, savelist=False )
        self.saveAsList = self.expr.saveAsList
        self.defaultValue = default
        self.mayReturnEmpty = True

    def parseImpl( self, instring, loc, doActions=True ):
        try:
            loc, tokens = self.expr._parse( instring, loc, doActions, callPreParse=False )
        except (ParseException,IndexError):
            if self.defaultValue is not _optionalNotMatched:
                if self.expr.resultsName:
                    tokens = ParseResults([ self.defaultValue ])
                    tokens[self.expr.resultsName] = self.defaultValue
                else:
                    tokens = [ self.defaultValue ]
            else:
                tokens = []
        return loc, tokens

    def __str__( self ):
        if hasattr(self,"name"):
            return self.name

        if self.strRepr is None:
            self.strRepr = "[" + _ustr(self.expr) + "]"

        return self.strRepr

class SkipTo(ParseElementEnhance):
    """
    Token for skipping over all undefined text until the matched expression is found.

    Parameters:
     - expr - target expression marking the end of the data to be skipped
     - include - (default=C{False}) if True, the target expression is also parsed 
          (the skipped text and target expression are returned as a 2-element list).
     - ignore - (default=C{None}) used to define grammars (typically quoted strings and 
          comments) that might contain false matches to the target expression
     - failOn - (default=C{None}) define expressions that are not allowed to be 
          included in the skipped text; if found before the target expression is found, 
          the SkipTo is not a match

    Example::
        report = '''
            Outstanding Issues Report - 1 Jan 2000

               # | Severity | Description                               |  Days Open
            -----+----------+-------------------------------------------+-----------
             101 | Critical | Intermittent system crash                 |          6
              94 | Cosmetic | Spelling error on Login ('log|n')         |         14
              79 | Minor    | System slow when running too many reports |         47
            '''
        integer = Word(nums)
        SEP = Suppress('|')
        # use SkipTo to simply match everything up until the next SEP
        # - ignore quoted strings, so that a '|' character inside a quoted string does not match
        # - parse action will call token.strip() for each matched token, i.e., the description body
        string_data = SkipTo(SEP, ignore=quotedString)
        string_data.setParseAction(tokenMap(str.strip))
        ticket_expr = (integer("issue_num") + SEP 
                      + string_data("sev") + SEP 
                      + string_data("desc") + SEP 
                      + integer("days_open"))
        
        for tkt in ticket_expr.searchString(report):
            print(tkt.dump())
    prints::
        ['101', 'Critical', 'Intermittent system crash', '6']
        - days_open: 6
        - desc: Intermittent system crash
        - issue_num: 101
        - sev: Critical
        ['94', 'Cosmetic', "Spelling error on Login ('log|n')", '14']
        - days_open: 14
        - desc: Spelling error on Login ('log|n')
        - issue_num: 94
        - sev: Cosmetic
        ['79', 'Minor', 'System slow when running too many reports', '47']
        - days_open: 47
        - desc: System slow when running too many reports
        - issue_num: 79
        - sev: Minor
    """
    def __init__( self, other, include=False, ignore=None, failOn=None ):
        super( SkipTo, self ).__init__( other )
        self.ignoreExpr = ignore
        self.mayReturnEmpty = True
        self.mayIndexError = False
        self.includeMatch = include
        self.asList = False
        if isinstance(failOn, basestring):
            self.failOn = ParserElement._literalStringClass(failOn)
        else:
            self.failOn = failOn
        self.errmsg = "No match found for "+_ustr(self.expr)

    def parseImpl( self, instring, loc, doActions=True ):
        startloc = loc
        instrlen = len(instring)
        expr = self.expr
        expr_parse = self.expr._parse
        self_failOn_canParseNext = self.failOn.canParseNext if self.failOn is not None else None
        self_ignoreExpr_tryParse = self.ignoreExpr.tryParse if self.ignoreExpr is not None else None
        
        tmploc = loc
        while tmploc <= instrlen:
            if self_failOn_canParseNext is not None:
                # break if failOn expression matches
                if self_failOn_canParseNext(instring, tmploc):
                    break
                    
            if self_ignoreExpr_tryParse is not None:
                # advance past ignore expressions
                while 1:
                    try:
                        tmploc = self_ignoreExpr_tryParse(instring, tmploc)
                    except ParseBaseException:
                        break
            
            try:
                expr_parse(instring, tmploc, doActions=False, callPreParse=False)
            except (ParseException, IndexError):
                # no match, advance loc in string
                tmploc += 1
            else:
                # matched skipto expr, done
                break

        else:
            # ran off the end of the input string without matching skipto expr, fail
            raise ParseException(instring, loc, self.errmsg, self)

        # build up return values
        loc = tmploc
        skiptext = instring[startloc:loc]
        skipresult = ParseResults(skiptext)
        
        if self.includeMatch:
            loc, mat = expr_parse(instring,loc,doActions,callPreParse=False)
            skipresult += mat

        return loc, skipresult

class Forward(ParseElementEnhance):
    """
    Forward declaration of an expression to be defined later -
    used for recursive grammars, such as algebraic infix notation.
    When the expression is known, it is assigned to the C{Forward} variable using the '<<' operator.

    Note: take care when assigning to C{Forward} not to overlook precedence of operators.
    Specifically, '|' has a lower precedence than '<<', so that::
        fwdExpr << a | b | c
    will actually be evaluated as::
        (fwdExpr << a) | b | c
    thereby leaving b and c out as parseable alternatives.  It is recommended that you
    explicitly group the values inserted into the C{Forward}::
        fwdExpr << (a | b | c)
    Converting to use the '<<=' operator instead will avoid this problem.

    See L{ParseResults.pprint} for an example of a recursive parser created using
    C{Forward}.
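
    Example (an illustrative sketch, not part of the original docstring)::
        # nested parenthesized groups of words, defined recursively
        LPAR, RPAR = map(Suppress, "()")
        group = Forward()
        group <<= Group(LPAR + ZeroOrMore(Word(alphas) | group) + RPAR)
        print(group.parseString("(a (b c) d)"))  # -> [['a', ['b', 'c'], 'd']]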
    """
    def __init__( self, other=None ):
        super(Forward,self).__init__( other, savelist=False )

    def __lshift__( self, other ):
        if isinstance( other, basestring ):
            other = ParserElement._literalStringClass(other)
        self.expr = other
        self.strRepr = None
        self.mayIndexError = self.expr.mayIndexError
        self.mayReturnEmpty = self.expr.mayReturnEmpty
        self.setWhitespaceChars( self.expr.whiteChars )
        self.skipWhitespace = self.expr.skipWhitespace
        self.saveAsList = self.expr.saveAsList
        self.ignoreExprs.extend(self.expr.ignoreExprs)
        return self
        
    def __ilshift__(self, other):
        return self << other
    
    def leaveWhitespace( self ):
        self.skipWhitespace = False
        return self

    def streamline( self ):
        if not self.streamlined:
            self.streamlined = True
            if self.expr is not None:
                self.expr.streamline()
        return self

    def validate( self, validateTrace=[] ):
        if self not in validateTrace:
            tmp = validateTrace[:]+[self]
            if self.expr is not None:
                self.expr.validate(tmp)
        self.checkRecursion([])

    def __str__( self ):
        if hasattr(self,"name"):
            return self.name
        return self.__class__.__name__ + ": ..."

        # stubbed out for now - creates awful memory and perf issues
        self._revertClass = self.__class__
        self.__class__ = _ForwardNoRecurse
        try:
            if self.expr is not None:
                retString = _ustr(self.expr)
            else:
                retString = "None"
        finally:
            self.__class__ = self._revertClass
        return self.__class__.__name__ + ": " + retString

    def copy(self):
        if self.expr is not None:
            return super(Forward,self).copy()
        else:
            ret = Forward()
            ret <<= self
            return ret

class _ForwardNoRecurse(Forward):
    def __str__( self ):
        return "..."

class TokenConverter(ParseElementEnhance):
    """
    Abstract subclass of C{ParseElementEnhance}, for converting parsed results.
    """
    def __init__( self, expr, savelist=False ):
        super(TokenConverter,self).__init__( expr )#, savelist )
        self.saveAsList = False

class Combine(TokenConverter):
    """
    Converter to concatenate all matching tokens to a single string.
    By default, the matching patterns must also be contiguous in the input string;
    this can be disabled by specifying C{'adjacent=False'} in the constructor.

    Example::
        real = Word(nums) + '.' + Word(nums)
        print(real.parseString('3.1416')) # -> ['3', '.', '1416']
        # will also erroneously match the following
        print(real.parseString('3. 1416')) # -> ['3', '.', '1416']

        real = Combine(Word(nums) + '.' + Word(nums))
        print(real.parseString('3.1416')) # -> ['3.1416']
        # no match when there are internal spaces
        print(real.parseString('3. 1416')) # -> Exception: Expected W:(0123...)
    """
    def __init__( self, expr, joinString="", adjacent=True ):
        super(Combine,self).__init__( expr )
        # suppress whitespace-stripping in contained parse expressions, but re-enable it on the Combine itself
        if adjacent:
            self.leaveWhitespace()
        self.adjacent = adjacent
        self.skipWhitespace = True
        self.joinString = joinString
        self.callPreparse = True

    def ignore( self, other ):
        if self.adjacent:
            ParserElement.ignore(self, other)
        else:
            super( Combine, self).ignore( other )
        return self

    def postParse( self, instring, loc, tokenlist ):
        retToks = tokenlist.copy()
        del retToks[:]
        retToks += ParseResults([ "".join(tokenlist._asStringList(self.joinString)) ], modal=self.modalResults)

        if self.resultsName and retToks.haskeys():
            return [ retToks ]
        else:
            return retToks

class Group(TokenConverter):
    """
    Converter to return the matched tokens as a list - useful for returning tokens of C{L{ZeroOrMore}} and C{L{OneOrMore}} expressions.

    Example::
        ident = Word(alphas)
        num = Word(nums)
        term = ident | num
        func = ident + Optional(delimitedList(term))
        print(func.parseString("fn a,b,100"))  # -> ['fn', 'a', 'b', '100']

        func = ident + Group(Optional(delimitedList(term)))
        print(func.parseString("fn a,b,100"))  # -> ['fn', ['a', 'b', '100']]
    """
    def __init__( self, expr ):
        super(Group,self).__init__( expr )
        self.saveAsList = True

    def postParse( self, instring, loc, tokenlist ):
        return [ tokenlist ]

class Dict(TokenConverter):
    """
    Converter to return a repetitive expression as a list, but also as a dictionary.
    Each element can also be referenced using the first token in the expression as its key.
    Useful for tabular report scraping when the first column can be used as an item key.

    Example::
        data_word = Word(alphas)
        label = data_word + FollowedBy(':')
        attr_expr = Group(label + Suppress(':') + OneOrMore(data_word).setParseAction(' '.join))

        text = "shape: SQUARE posn: upper left color: light blue texture: burlap"
        attr_expr = (label + Suppress(':') + OneOrMore(data_word, stopOn=label).setParseAction(' '.join))
        
        # print attributes as plain groups
        print(OneOrMore(attr_expr).parseString(text).dump())
        
        # instead of OneOrMore(expr), parse using Dict(OneOrMore(Group(expr))) - Dict will auto-assign names
        result = Dict(OneOrMore(Group(attr_expr))).parseString(text)
        print(result.dump())
        
        # access named fields as dict entries, or output as dict
        print(result['shape'])        
        print(result.asDict())
    prints::
        ['shape', 'SQUARE', 'posn', 'upper left', 'color', 'light blue', 'texture', 'burlap']

        [['shape', 'SQUARE'], ['posn', 'upper left'], ['color', 'light blue'], ['texture', 'burlap']]
        - color: light blue
        - posn: upper left
        - shape: SQUARE
        - texture: burlap
        SQUARE
        {'color': 'light blue', 'posn': 'upper left', 'texture': 'burlap', 'shape': 'SQUARE'}
    See more examples at L{ParseResults} of accessing fields by results name.
    """
    def __init__( self, expr ):
        super(Dict,self).__init__( expr )
        self.saveAsList = True

    def postParse( self, instring, loc, tokenlist ):
        for i,tok in enumerate(tokenlist):
            if len(tok) == 0:
                continue
            ikey = tok[0]
            if isinstance(ikey,int):
                ikey = _ustr(tok[0]).strip()
            if len(tok)==1:
                tokenlist[ikey] = _ParseResultsWithOffset("",i)
            elif len(tok)==2 and not isinstance(tok[1],ParseResults):
                tokenlist[ikey] = _ParseResultsWithOffset(tok[1],i)
            else:
                dictvalue = tok.copy() #ParseResults(i)
                del dictvalue[0]
                if len(dictvalue)!= 1 or (isinstance(dictvalue,ParseResults) and dictvalue.haskeys()):
                    tokenlist[ikey] = _ParseResultsWithOffset(dictvalue,i)
                else:
                    tokenlist[ikey] = _ParseResultsWithOffset(dictvalue[0],i)

        if self.resultsName:
            return [ tokenlist ]
        else:
            return tokenlist


class Suppress(TokenConverter):
    """
    Converter for ignoring the results of a parsed expression.

    Example::
        source = "a, b, c,d"
        wd = Word(alphas)
        wd_list1 = wd + ZeroOrMore(',' + wd)
        print(wd_list1.parseString(source))

        # often, delimiters that are useful during parsing are just in the
        # way afterward - use Suppress to keep them out of the parsed output
        wd_list2 = wd + ZeroOrMore(Suppress(',') + wd)
        print(wd_list2.parseString(source))
    prints::
        ['a', ',', 'b', ',', 'c', ',', 'd']
        ['a', 'b', 'c', 'd']
    (See also L{delimitedList}.)
    """
    def postParse( self, instring, loc, tokenlist ):
        return []

    def suppress( self ):
        return self


class OnlyOnce(object):
    """
    Wrapper for parse actions, to ensure they are only called once.
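
    Example (an illustrative sketch, not part of the original docstring)::
        def announce(s, l, t):
            print("matched:", t[0])

        wd = Word(alphas).setParseAction(OnlyOnce(announce))
        wd.parseString("hello")   # parse action fires on the first match
        # a later parse raises ParseException until the wrapper's reset() is called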
    """
    def __init__(self, methodCall):
        self.callable = _trim_arity(methodCall)
        self.called = False
    def __call__(self,s,l,t):
        if not self.called:
            results = self.callable(s,l,t)
            self.called = True
            return results
        raise ParseException(s,l,"")
    def reset(self):
        self.called = False

def traceParseAction(f):
    """
    Decorator for debugging parse actions. 
    
    When the parse action is called, this decorator will print C{">> entering I{method-name}(line:I{current_source_line}, I{parse_location}, I{matched_tokens})".}
    When the parse action completes, the decorator will print C{"<<"} followed by the returned value, or any exception that the parse action raised.

    Example::
        wd = Word(alphas)

        @traceParseAction
        def remove_duplicate_chars(tokens):
            return ''.join(sorted(set(''.join(tokens))))

        wds = OneOrMore(wd).setParseAction(remove_duplicate_chars)
        print(wds.parseString("slkdjs sld sldd sdlf sdljf"))
    prints::
        >>entering remove_duplicate_chars(line: 'slkdjs sld sldd sdlf sdljf', 0, (['slkdjs', 'sld', 'sldd', 'sdlf', 'sdljf'], {}))
        <<leaving remove_duplicate_chars (ret: 'dfjkls')
        ['dfjkls']
    """
    f = _trim_arity(f)
    def z(*paArgs):
        thisFunc = f.__name__
        s,l,t = paArgs[-3:]
        if len(paArgs)>3:
            thisFunc = paArgs[0].__class__.__name__ + '.' + thisFunc
        sys.stderr.write( ">>entering %s(line: '%s', %d, %r)\n" % (thisFunc,line(l,s),l,t) )
        try:
            ret = f(*paArgs)
        except Exception as exc:
            sys.stderr.write( "<<leaving %s (exception: %s)\n" % (thisFunc,exc) )
            raise
        sys.stderr.write( "<<leaving %s (ret: %r)\n" % (thisFunc,ret) )
        return ret
    try:
        z.__name__ = f.__name__
    except AttributeError:
        pass
    return z

#
# global helpers
#
def delimitedList( expr, delim=",", combine=False ):
    """
    Helper to define a delimited list of expressions - the delimiter defaults to ','.
    By default, the list elements and delimiters can have intervening whitespace, and
    comments, but this can be overridden by passing C{combine=True} in the constructor.
    If C{combine} is set to C{True}, the matching tokens are returned as a single token
    string, with the delimiters included; otherwise, the matching tokens are returned
    as a list of tokens, with the delimiters suppressed.

    Example::
        delimitedList(Word(alphas)).parseString("aa,bb,cc") # -> ['aa', 'bb', 'cc']
        delimitedList(Word(hexnums), delim=':', combine=True).parseString("AA:BB:CC:DD:EE") # -> ['AA:BB:CC:DD:EE']
    """
    dlName = _ustr(expr)+" ["+_ustr(delim)+" "+_ustr(expr)+"]..."
    if combine:
        return Combine( expr + ZeroOrMore( delim + expr ) ).setName(dlName)
    else:
        return ( expr + ZeroOrMore( Suppress( delim ) + expr ) ).setName(dlName)

def countedArray( expr, intExpr=None ):
    """
    Helper to define a counted list of expressions.
    This helper defines a pattern of the form::
        integer expr expr expr...
    where the leading integer tells how many expr expressions follow.
    The matched expr tokens are returned as a list - the leading count token is suppressed.
    
    If C{intExpr} is specified, it should be a pyparsing expression that produces an integer value.

    Example::
        countedArray(Word(alphas)).parseString('2 ab cd ef')  # -> ['ab', 'cd']

        # in this parser, the leading integer value is given in binary,
        # '10' indicating that 2 values are in the array
        binaryConstant = Word('01').setParseAction(lambda t: int(t[0], 2))
        countedArray(Word(alphas), intExpr=binaryConstant).parseString('10 ab cd ef')  # -> ['ab', 'cd']
    """
    arrayExpr = Forward()
    def countFieldParseAction(s,l,t):
        n = t[0]
        arrayExpr << (n and Group(And([expr]*n)) or Group(empty))
        return []
    if intExpr is None:
        intExpr = Word(nums).setParseAction(lambda t:int(t[0]))
    else:
        intExpr = intExpr.copy()
    intExpr.setName("arrayLen")
    intExpr.addParseAction(countFieldParseAction, callDuringTry=True)
    return ( intExpr + arrayExpr ).setName('(len) ' + _ustr(expr) + '...')

def _flatten(L):
    ret = []
    for i in L:
        if isinstance(i,list):
            ret.extend(_flatten(i))
        else:
            ret.append(i)
    return ret

def matchPreviousLiteral(expr):
    """
    Helper to define an expression that is indirectly defined from
    the tokens matched in a previous expression, that is, it looks
    for a 'repeat' of a previous expression.  For example::
        first = Word(nums)
        second = matchPreviousLiteral(first)
        matchExpr = first + ":" + second
    will match C{"1:1"}, but not C{"1:2"}.  Because this matches a
    previous literal, it will also match the leading C{"1:1"} in C{"1:10"}.
    If this is not desired, use C{matchPreviousExpr}.
    Do I{not} use with packrat parsing enabled.
    """
    rep = Forward()
    def copyTokenToRepeater(s,l,t):
        if t:
            if len(t) == 1:
                rep << t[0]
            else:
                # flatten t tokens
                tflat = _flatten(t.asList())
                rep << And(Literal(tt) for tt in tflat)
        else:
            rep << Empty()
    expr.addParseAction(copyTokenToRepeater, callDuringTry=True)
    rep.setName('(prev) ' + _ustr(expr))
    return rep

def matchPreviousExpr(expr):
    """
    Helper to define an expression that is indirectly defined from
    the tokens matched in a previous expression, that is, it looks
    for a 'repeat' of a previous expression.  For example::
        first = Word(nums)
        second = matchPreviousExpr(first)
        matchExpr = first + ":" + second
    will match C{"1:1"}, but not C{"1:2"}.  Because this matches by
    expressions, it will I{not} match the leading C{"1:1"} in C{"1:10"};
    the expressions are evaluated first, and then compared, so
    C{"1"} is compared with C{"10"}.
    Do I{not} use with packrat parsing enabled.
    """
    rep = Forward()
    e2 = expr.copy()
    rep <<= e2
    def copyTokenToRepeater(s,l,t):
        matchTokens = _flatten(t.asList())
        def mustMatchTheseTokens(s,l,t):
            theseTokens = _flatten(t.asList())
            if  theseTokens != matchTokens:
                raise ParseException("",0,"")
        rep.setParseAction( mustMatchTheseTokens, callDuringTry=True )
    expr.addParseAction(copyTokenToRepeater, callDuringTry=True)
    rep.setName('(prev) ' + _ustr(expr))
    return rep

def _escapeRegexRangeChars(s):
    #~  escape these chars: ^-]
    for c in r"\^-]":
        s = s.replace(c,_bslash+c)
    s = s.replace("\n",r"\n")
    s = s.replace("\t",r"\t")
    return _ustr(s)

def oneOf( strs, caseless=False, useRegex=True ):
    """
    Helper to quickly define a set of alternative Literals, and makes sure to do
    longest-first testing when there is a conflict, regardless of the input order,
    but returns a C{L{MatchFirst}} for best performance.

    Parameters:
     - strs - a string of space-delimited literals, or a collection of string literals
     - caseless - (default=C{False}) - treat all literals as caseless
     - useRegex - (default=C{True}) - as an optimization, will generate a Regex
          object; otherwise, will generate a C{MatchFirst} object (if C{caseless=True}, or
          if creating a C{Regex} raises an exception)

    Example::
        comp_oper = oneOf("< = > <= >= !=")
        var = Word(alphas)
        number = Word(nums)
        term = var | number
        comparison_expr = term + comp_oper + term
        print(comparison_expr.searchString("B = 12  AA=23 B<=AA AA>12"))
    prints::
        [['B', '=', '12'], ['AA', '=', '23'], ['B', '<=', 'AA'], ['AA', '>', '12']]
    """
    if caseless:
        isequal = ( lambda a,b: a.upper() == b.upper() )
        masks = ( lambda a,b: b.upper().startswith(a.upper()) )
        parseElementClass = CaselessLiteral
    else:
        isequal = ( lambda a,b: a == b )
        masks = ( lambda a,b: b.startswith(a) )
        parseElementClass = Literal

    symbols = []
    if isinstance(strs,basestring):
        symbols = strs.split()
    elif isinstance(strs, collections.Iterable):
        symbols = list(strs)
    else:
        warnings.warn("Invalid argument to oneOf, expected string or iterable",
                SyntaxWarning, stacklevel=2)
    if not symbols:
        return NoMatch()

    i = 0
    while i < len(symbols)-1:
        cur = symbols[i]
        for j,other in enumerate(symbols[i+1:]):
            if ( isequal(other, cur) ):
                del symbols[i+j+1]
                break
            elif ( masks(cur, other) ):
                del symbols[i+j+1]
                symbols.insert(i,other)
                cur = other
                break
        else:
            i += 1

    if not caseless and useRegex:
        #~ print (strs,"->", "|".join( [ _escapeRegexChars(sym) for sym in symbols] ))
        try:
            if len(symbols)==len("".join(symbols)):
                return Regex( "[%s]" % "".join(_escapeRegexRangeChars(sym) for sym in symbols) ).setName(' | '.join(symbols))
            else:
                return Regex( "|".join(re.escape(sym) for sym in symbols) ).setName(' | '.join(symbols))
        except Exception:
            warnings.warn("Exception creating Regex for oneOf, building MatchFirst",
                    SyntaxWarning, stacklevel=2)


    # last resort, just use MatchFirst
    return MatchFirst(parseElementClass(sym) for sym in symbols).setName(' | '.join(symbols))

def dictOf( key, value ):
    """
    Helper to easily and clearly define a dictionary by specifying the respective patterns
    for the key and value.  Takes care of defining the C{L{Dict}}, C{L{ZeroOrMore}}, and C{L{Group}} tokens
    in the proper order.  The key pattern can include delimiting markers or punctuation,
    as long as they are suppressed, thereby leaving the significant key text.  The value
    pattern can include named results, so that the C{Dict} results can include named token
    fields.

    Example::
        text = "shape: SQUARE posn: upper left color: light blue texture: burlap"
        attr_expr = (label + Suppress(':') + OneOrMore(data_word, stopOn=label).setParseAction(' '.join))
        print(OneOrMore(attr_expr).parseString(text).dump())
        
        attr_label = label
        attr_value = Suppress(':') + OneOrMore(data_word, stopOn=label).setParseAction(' '.join)

        # similar to Dict, but simpler call format
        result = dictOf(attr_label, attr_value).parseString(text)
        print(result.dump())
        print(result['shape'])
        print(result.shape)  # object attribute access works too
        print(result.asDict())
    prints::
        [['shape', 'SQUARE'], ['posn', 'upper left'], ['color', 'light blue'], ['texture', 'burlap']]
        - color: light blue
        - posn: upper left
        - shape: SQUARE
        - texture: burlap
        SQUARE
        SQUARE
        {'color': 'light blue', 'shape': 'SQUARE', 'posn': 'upper left', 'texture': 'burlap'}
    """
    return Dict( ZeroOrMore( Group ( key + value ) ) )

def originalTextFor(expr, asString=True):
    """
    Helper to return the original, untokenized text for a given expression.  Useful to
    restore the parsed fields of an HTML start tag into the raw tag text itself, or to
    revert separate tokens with intervening whitespace back to the original matching
    input text. By default, returns a string containing the original parsed text.  
       
    If the optional C{asString} argument is passed as C{False}, then the return value is a 
    C{L{ParseResults}} containing any results names that were originally matched, and a 
    single token containing the original matched text from the input string.  So if 
    the expression passed to C{L{originalTextFor}} contains expressions with defined
    results names, you must set C{asString} to C{False} if you want to preserve those
    results name values.

    Example::
        src = "this is test <b> bold <i>text</i> </b> normal text "
        for tag in ("b","i"):
            opener,closer = makeHTMLTags(tag)
            patt = originalTextFor(opener + SkipTo(closer) + closer)
            print(patt.searchString(src)[0])
    prints::
        ['<b> bold <i>text</i> </b>']
        ['<i>text</i>']
    """
    locMarker = Empty().setParseAction(lambda s,loc,t: loc)
    endlocMarker = locMarker.copy()
    endlocMarker.callPreparse = False
    matchExpr = locMarker("_original_start") + expr + endlocMarker("_original_end")
    if asString:
        extractText = lambda s,l,t: s[t._original_start:t._original_end]
    else:
        def extractText(s,l,t):
            t[:] = [s[t.pop('_original_start'):t.pop('_original_end')]]
    matchExpr.setParseAction(extractText)
    matchExpr.ignoreExprs = expr.ignoreExprs
    return matchExpr

def ungroup(expr): 
    """
    Helper to undo pyparsing's default grouping of And expressions, even
    if all but one are non-empty.
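
    Example (an illustrative sketch, not part of the original docstring)::
        grouped = Group(Word(alphas) + Word(nums))
        print(grouped.parseString("abc 123"))           # -> [['abc', '123']]
        print(ungroup(grouped).parseString("abc 123"))  # -> ['abc', '123']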
    """
    return TokenConverter(expr).setParseAction(lambda t:t[0])

def locatedExpr(expr):
    """
    Helper to decorate a returned token with its starting and ending locations in the input string.
    This helper adds the following results names:
     - locn_start = location where matched expression begins
     - locn_end = location where matched expression ends
     - value = the actual parsed results

    Be careful if the input text contains C{<TAB>} characters, you may want to call
    C{L{ParserElement.parseWithTabs}}

    Example::
        wd = Word(alphas)
        for match in locatedExpr(wd).searchString("ljsdf123lksdjjf123lkkjj1222"):
            print(match)
    prints::
        [[0, 'ljsdf', 5]]
        [[8, 'lksdjjf', 15]]
        [[18, 'lkkjj', 23]]
    """
    locator = Empty().setParseAction(lambda s,l,t: l)
    return Group(locator("locn_start") + expr("value") + locator.copy().leaveWhitespace()("locn_end"))


# convenience constants for positional expressions
empty       = Empty().setName("empty")
lineStart   = LineStart().setName("lineStart")
lineEnd     = LineEnd().setName("lineEnd")
stringStart = StringStart().setName("stringStart")
stringEnd   = StringEnd().setName("stringEnd")

_escapedPunc = Word( _bslash, r"\[]-*.$+^?()~ ", exact=2 ).setParseAction(lambda s,l,t:t[0][1])
_escapedHexChar = Regex(r"\\0?[xX][0-9a-fA-F]+").setParseAction(lambda s,l,t:unichr(int(t[0].lstrip(r'\0x'),16)))
_escapedOctChar = Regex(r"\\0[0-7]+").setParseAction(lambda s,l,t:unichr(int(t[0][1:],8)))
_singleChar = _escapedPunc | _escapedHexChar | _escapedOctChar | Word(printables, excludeChars=r'\]', exact=1) | Regex(r"\w", re.UNICODE)
_charRange = Group(_singleChar + Suppress("-") + _singleChar)
_reBracketExpr = Literal("[") + Optional("^").setResultsName("negate") + Group( OneOrMore( _charRange | _singleChar ) ).setResultsName("body") + "]"

def srange(s):
    r"""
    Helper to easily define string ranges for use in Word construction.  Borrows
    syntax from regexp '[]' string range definitions::
        srange("[0-9]")   -> "0123456789"
        srange("[a-z]")   -> "abcdefghijklmnopqrstuvwxyz"
        srange("[a-z$_]") -> "abcdefghijklmnopqrstuvwxyz$_"
    The input string must be enclosed in []'s, and the returned string is the expanded
    character set joined into a single string.
    The values enclosed in the []'s may be:
     - a single character
     - an escaped character with a leading backslash (such as C{\-} or C{\]})
     - an escaped hex character with a leading C{'\x'} (C{\x21}, which is a C{'!'} character) 
         (C{\0x##} is also supported for backwards compatibility) 
     - an escaped octal character with a leading C{'\0'} (C{\041}, which is a C{'!'} character)
     - a range of any of the above, separated by a dash (C{'a-z'}, etc.)
     - any combination of the above (C{'aeiouy'}, C{'a-zA-Z0-9_$'}, etc.)
    """
    _expanded = lambda p: p if not isinstance(p,ParseResults) else ''.join(unichr(c) for c in range(ord(p[0]),ord(p[1])+1))
    try:
        return "".join(_expanded(part) for part in _reBracketExpr.parseString(s).body)
    except Exception:
        return ""

def matchOnlyAtCol(n):
    """
    Helper method for defining parse actions that require matching at a specific
    column in the input text.
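
    Example (an illustrative sketch, not part of the original docstring)::
        # only accept a word as a value if it starts in column 10
        value = Word(alphanums).setParseAction(matchOnlyAtCol(10))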
    """
    def verifyCol(strg,locn,toks):
        if col(locn,strg) != n:
            raise ParseException(strg,locn,"matched token not at column %d" % n)
    return verifyCol

def replaceWith(replStr):
    """
    Helper method for common parse actions that simply return a literal value.  Especially
    useful when used with C{L{transformString<ParserElement.transformString>}()}.

    Example::
        num = Word(nums).setParseAction(lambda toks: int(toks[0]))
        na = oneOf("N/A NA").setParseAction(replaceWith(math.nan))
        term = na | num
        
        OneOrMore(term).parseString("324 234 N/A 234") # -> [324, 234, nan, 234]
    """
    return lambda s,l,t: [replStr]

def removeQuotes(s,l,t):
    """
    Helper parse action for removing quotation marks from parsed quoted strings.

    Example::
        # by default, quotation marks are included in parsed results
        quotedString.parseString("'Now is the Winter of our Discontent'") # -> ["'Now is the Winter of our Discontent'"]

        # use removeQuotes to strip quotation marks from parsed results
        quotedString.setParseAction(removeQuotes)
        quotedString.parseString("'Now is the Winter of our Discontent'") # -> ["Now is the Winter of our Discontent"]
    """
    return t[0][1:-1]

def tokenMap(func, *args):
    """
    Helper to define a parse action by mapping a function to all elements of a ParseResults list. If any additional 
    args are passed, they are forwarded to the given function as additional arguments after
    the token, as in C{hex_integer = Word(hexnums).setParseAction(tokenMap(int, 16))}, which will convert the
    parsed data to an integer using base 16.

    Example (compare the last example to the one in L{ParserElement.transformString})::
        hex_ints = OneOrMore(Word(hexnums)).setParseAction(tokenMap(int, 16))
        hex_ints.runTests('''
            00 11 22 aa FF 0a 0d 1a
            ''')
        
        upperword = Word(alphas).setParseAction(tokenMap(str.upper))
        OneOrMore(upperword).runTests('''
            my kingdom for a horse
            ''')

        wd = Word(alphas).setParseAction(tokenMap(str.title))
        OneOrMore(wd).setParseAction(' '.join).runTests('''
            now is the winter of our discontent made glorious summer by this sun of york
            ''')
    prints::
        00 11 22 aa FF 0a 0d 1a
        [0, 17, 34, 170, 255, 10, 13, 26]

        my kingdom for a horse
        ['MY', 'KINGDOM', 'FOR', 'A', 'HORSE']

        now is the winter of our discontent made glorious summer by this sun of york
        ['Now Is The Winter Of Our Discontent Made Glorious Summer By This Sun Of York']
    """
    def pa(s,l,t):
        return [func(tokn, *args) for tokn in t]

    try:
        func_name = getattr(func, '__name__', 
                            getattr(func, '__class__').__name__)
    except Exception:
        func_name = str(func)
    pa.__name__ = func_name

    return pa

upcaseTokens = tokenMap(lambda t: _ustr(t).upper())
"""(Deprecated) Helper parse action to convert tokens to upper case. Deprecated in favor of L{pyparsing_common.upcaseTokens}"""

downcaseTokens = tokenMap(lambda t: _ustr(t).lower())
"""(Deprecated) Helper parse action to convert tokens to lower case. Deprecated in favor of L{pyparsing_common.downcaseTokens}"""
    
def _makeTags(tagStr, xml):
    """Internal helper to construct opening and closing tag expressions, given a tag name"""
    if isinstance(tagStr,basestring):
        resname = tagStr
        tagStr = Keyword(tagStr, caseless=not xml)
    else:
        resname = tagStr.name

    tagAttrName = Word(alphas,alphanums+"_-:")
    if (xml):
        tagAttrValue = dblQuotedString.copy().setParseAction( removeQuotes )
        openTag = Suppress("<") + tagStr("tag") + \
                Dict(ZeroOrMore(Group( tagAttrName + Suppress("=") + tagAttrValue ))) + \
                Optional("/",default=[False]).setResultsName("empty").setParseAction(lambda s,l,t:t[0]=='/') + Suppress(">")
    else:
        printablesLessRAbrack = "".join(c for c in printables if c not in ">")
        tagAttrValue = quotedString.copy().setParseAction( removeQuotes ) | Word(printablesLessRAbrack)
        openTag = Suppress("<") + tagStr("tag") + \
                Dict(ZeroOrMore(Group( tagAttrName.setParseAction(downcaseTokens) + \
                Optional( Suppress("=") + tagAttrValue ) ))) + \
                Optional("/",default=[False]).setResultsName("empty").setParseAction(lambda s,l,t:t[0]=='/') + Suppress(">")
    closeTag = Combine(_L("</") + tagStr + ">")

    openTag = openTag.setResultsName("start"+"".join(resname.replace(":"," ").title().split())).setName("<%s>" % resname)
    closeTag = closeTag.setResultsName("end"+"".join(resname.replace(":"," ").title().split())).setName("</%s>" % resname)
    openTag.tag = resname
    closeTag.tag = resname
    return openTag, closeTag

def makeHTMLTags(tagStr):
    """
    Helper to construct opening and closing tag expressions for HTML, given a tag name. Matches
    tags in either upper or lower case, attributes with namespaces and with quoted or unquoted values.

    Example::
        text = '<td>More info at the <a href="http://pyparsing.wikispaces.com">pyparsing</a> wiki page</td>'
        # makeHTMLTags returns pyparsing expressions for the opening and closing tags as a 2-tuple
        a,a_end = makeHTMLTags("A")
        link_expr = a + SkipTo(a_end)("link_text") + a_end
        
        for link in link_expr.searchString(text):
            # attributes in the <A> tag (like "href" shown here) are also accessible as named results
            print(link.link_text, '->', link.href)
    prints::
        pyparsing -> http://pyparsing.wikispaces.com
    """
    return _makeTags( tagStr, False )

def makeXMLTags(tagStr):
    """
    Helper to construct opening and closing tag expressions for XML, given a tag name. Matches
    tags only in the given upper/lower case.

    Example: similar to L{makeHTMLTags}
    """
    return _makeTags( tagStr, True )

def withAttribute(*args,**attrDict):
    """
    Helper to create a validating parse action to be used with start tags created
    with C{L{makeXMLTags}} or C{L{makeHTMLTags}}. Use C{withAttribute} to qualify a starting tag
    with a required attribute value, to avoid false matches on common tags such as
    C{<TD>} or C{<DIV>}.

    Call C{withAttribute} with a series of attribute names and values. Specify the list
    of filter attribute names and values as:
     - keyword arguments, as in C{(align="right")}, or
     - as an explicit dict with C{**} operator, when an attribute name is also a Python
          reserved word, as in C{**{"class":"Customer", "align":"right"}}
     - a list of name-value tuples, as in ( ("ns1:class", "Customer"), ("ns2:align","right") )
    For attribute names with a namespace prefix, you must use the second form.  Attribute
    names are matched insensitive to upper/lower case.
       
    If just testing for C{class} (with or without a namespace), use C{L{withClass}}.

    To verify that the attribute exists, but without specifying a value, pass
    C{withAttribute.ANY_VALUE} as the value.

    Example::
        html = '''
            <div>
            Some text
            <div type="grid">1 4 0 1 0</div>
            <div type="graph">1,3 2,3 1,1</div>
            <div>this has no type</div>
            </div>
                
        '''
        div,div_end = makeHTMLTags("div")

        # only match div tag having a type attribute with value "grid"
        div_grid = div().setParseAction(withAttribute(type="grid"))
        grid_expr = div_grid + SkipTo(div | div_end)("body")
        for grid_header in grid_expr.searchString(html):
            print(grid_header.body)
        
        # construct a match with any div tag having a type attribute, regardless of the value
        div_any_type = div().setParseAction(withAttribute(type=withAttribute.ANY_VALUE))
        div_expr = div_any_type + SkipTo(div | div_end)("body")
        for div_header in div_expr.searchString(html):
            print(div_header.body)
    prints::
        1 4 0 1 0

        1 4 0 1 0
        1,3 2,3 1,1
    """
    if args:
        attrs = args[:]
    else:
        attrs = attrDict.items()
    attrs = [(k,v) for k,v in attrs]
    def pa(s,l,tokens):
        for attrName,attrValue in attrs:
            if attrName not in tokens:
                raise ParseException(s,l,"no matching attribute " + attrName)
            if attrValue != withAttribute.ANY_VALUE and tokens[attrName] != attrValue:
                raise ParseException(s,l,"attribute '%s' has value '%s', must be '%s'" %
                                            (attrName, tokens[attrName], attrValue))
    return pa
withAttribute.ANY_VALUE = object()

def withClass(classname, namespace=''):
    """
    Simplified version of C{L{withAttribute}} when matching on a div class - made
    difficult because C{class} is a reserved word in Python.

    Example::
        html = '''
            <div>
            Some text
            <div class="grid">1 4 0 1 0</div>
            <div class="graph">1,3 2,3 1,1</div>
            <div>this &lt;div&gt; has no class</div>
            </div>
                
        '''
        div,div_end = makeHTMLTags("div")
        div_grid = div().setParseAction(withClass("grid"))
        
        grid_expr = div_grid + SkipTo(div | div_end)("body")
        for grid_header in grid_expr.searchString(html):
            print(grid_header.body)
        
        div_any_type = div().setParseAction(withClass(withAttribute.ANY_VALUE))
        div_expr = div_any_type + SkipTo(div | div_end)("body")
        for div_header in div_expr.searchString(html):
            print(div_header.body)
    prints::
        1 4 0 1 0

        1 4 0 1 0
        1,3 2,3 1,1
    """
    classattr = "%s:class" % namespace if namespace else "class"
    return withAttribute(**{classattr : classname})        

opAssoc = _Constants()
opAssoc.LEFT = object()
opAssoc.RIGHT = object()

def infixNotation( baseExpr, opList, lpar=Suppress('('), rpar=Suppress(')') ):
    """
    Helper method for constructing grammars of expressions made up of
    operators working in a precedence hierarchy.  Operators may be unary or
    binary, left- or right-associative.  Parse actions can also be attached
    to operator expressions. The generated parser will also recognize the use 
    of parentheses to override operator precedences (see example below).
    
    Note: if you define a deep operator list, you may see performance issues
    when using infixNotation. See L{ParserElement.enablePackrat} for a
    mechanism to potentially improve your parser performance.

    Parameters:
     - baseExpr - expression representing the most basic element of the nested expression grammar
     - opList - list of tuples, one for each operator precedence level in the
      expression grammar; each tuple is of the form
      (opExpr, numTerms, rightLeftAssoc, parseAction), where:
       - opExpr is the pyparsing expression for the operator;
          may also be a string, which will be converted to a Literal;
          if numTerms is 3, opExpr is a tuple of two expressions, for the
          two operators separating the 3 terms
       - numTerms is the number of terms for this operator (must
          be 1, 2, or 3)
       - rightLeftAssoc is the indicator whether the operator is
          right or left associative, using the pyparsing-defined
          constants C{opAssoc.RIGHT} and C{opAssoc.LEFT}.
       - parseAction is the parse action to be associated with
          expressions matching this operator expression (the
          parse action tuple member may be omitted)
     - lpar - expression for matching left-parentheses (default=C{Suppress('(')})
     - rpar - expression for matching right-parentheses (default=C{Suppress(')')})

    Example::
        # simple example of four-function arithmetic with ints and variable names
        integer = pyparsing_common.signed_integer
        varname = pyparsing_common.identifier 
        
        arith_expr = infixNotation(integer | varname,
            [
            ('-', 1, opAssoc.RIGHT),
            (oneOf('* /'), 2, opAssoc.LEFT),
            (oneOf('+ -'), 2, opAssoc.LEFT),
            ])
        
        arith_expr.runTests('''
            5+3*6
            (5+3)*6
            -2--11
            ''', fullDump=False)
    prints::
        5+3*6
        [[5, '+', [3, '*', 6]]]

        (5+3)*6
        [[[5, '+', 3], '*', 6]]

        -2--11
        [[['-', 2], '-', ['-', 11]]]
    """
    ret = Forward()
    lastExpr = baseExpr | ( lpar + ret + rpar )
    for i,operDef in enumerate(opList):
        opExpr,arity,rightLeftAssoc,pa = (operDef + (None,))[:4]
        termName = "%s term" % opExpr if arity < 3 else "%s%s term" % opExpr
        if arity == 3:
            if opExpr is None or len(opExpr) != 2:
                raise ValueError("if numterms=3, opExpr must be a tuple or list of two expressions")
            opExpr1, opExpr2 = opExpr
        thisExpr = Forward().setName(termName)
        if rightLeftAssoc == opAssoc.LEFT:
            if arity == 1:
                matchExpr = FollowedBy(lastExpr + opExpr) + Group( lastExpr + OneOrMore( opExpr ) )
            elif arity == 2:
                if opExpr is not None:
                    matchExpr = FollowedBy(lastExpr + opExpr + lastExpr) + Group( lastExpr + OneOrMore( opExpr + lastExpr ) )
                else:
                    matchExpr = FollowedBy(lastExpr+lastExpr) + Group( lastExpr + OneOrMore(lastExpr) )
            elif arity == 3:
                matchExpr = FollowedBy(lastExpr + opExpr1 + lastExpr + opExpr2 + lastExpr) + \
                            Group( lastExpr + opExpr1 + lastExpr + opExpr2 + lastExpr )
            else:
                raise ValueError("operator must be unary (1), binary (2), or ternary (3)")
        elif rightLeftAssoc == opAssoc.RIGHT:
            if arity == 1:
                # try to avoid LR with this extra test
                if not isinstance(opExpr, Optional):
                    opExpr = Optional(opExpr)
                matchExpr = FollowedBy(opExpr.expr + thisExpr) + Group( opExpr + thisExpr )
            elif arity == 2:
                if opExpr is not None:
                    matchExpr = FollowedBy(lastExpr + opExpr + thisExpr) + Group( lastExpr + OneOrMore( opExpr + thisExpr ) )
                else:
                    matchExpr = FollowedBy(lastExpr + thisExpr) + Group( lastExpr + OneOrMore( thisExpr ) )
            elif arity == 3:
                matchExpr = FollowedBy(lastExpr + opExpr1 + thisExpr + opExpr2 + thisExpr) + \
                            Group( lastExpr + opExpr1 + thisExpr + opExpr2 + thisExpr )
            else:
                raise ValueError("operator must be unary (1), binary (2), or ternary (3)")
        else:
            raise ValueError("operator must indicate right or left associativity")
        if pa:
            matchExpr.setParseAction( pa )
        thisExpr <<= ( matchExpr.setName(termName) | lastExpr )
        lastExpr = thisExpr
    ret <<= lastExpr
    return ret

operatorPrecedence = infixNotation
"""(Deprecated) Former name of C{L{infixNotation}}, will be dropped in a future release."""

dblQuotedString = Combine(Regex(r'"(?:[^"\n\r\\]|(?:"")|(?:\\(?:[^x]|x[0-9a-fA-F]+)))*')+'"').setName("string enclosed in double quotes")
sglQuotedString = Combine(Regex(r"'(?:[^'\n\r\\]|(?:'')|(?:\\(?:[^x]|x[0-9a-fA-F]+)))*")+"'").setName("string enclosed in single quotes")
quotedString = Combine(Regex(r'"(?:[^"\n\r\\]|(?:"")|(?:\\(?:[^x]|x[0-9a-fA-F]+)))*')+'"'|
                       Regex(r"'(?:[^'\n\r\\]|(?:'')|(?:\\(?:[^x]|x[0-9a-fA-F]+)))*")+"'").setName("quotedString using single or double quotes")
unicodeString = Combine(_L('u') + quotedString.copy()).setName("unicode string literal")

def nestedExpr(opener="(", closer=")", content=None, ignoreExpr=quotedString.copy()):
    """
    Helper method for defining nested lists enclosed in opening and closing
    delimiters ("(" and ")" are the default).

    Parameters:
     - opener - opening character for a nested list (default=C{"("}); can also be a pyparsing expression
     - closer - closing character for a nested list (default=C{")"}); can also be a pyparsing expression
     - content - expression for items within the nested lists (default=C{None})
     - ignoreExpr - expression for ignoring opening and closing delimiters (default=C{quotedString})

    If an expression is not provided for the content argument, the nested
    expression will capture all whitespace-delimited content between delimiters
    as a list of separate values.

    Use the C{ignoreExpr} argument to define expressions that may contain
    opening or closing characters that should not be treated as opening
    or closing characters for nesting, such as quotedString or a comment
    expression.  Specify multiple expressions using an C{L{Or}} or C{L{MatchFirst}}.
    The default is L{quotedString}, but if no expressions are to be ignored,
    then pass C{None} for this argument.

    Example::
        data_type = oneOf("void int short long char float double")
        decl_data_type = Combine(data_type + Optional(Word('*')))
        ident = Word(alphas+'_', alphanums+'_')
        number = pyparsing_common.number
        arg = Group(decl_data_type + ident)
        LPAR,RPAR = map(Suppress, "()")

        code_body = nestedExpr('{', '}', ignoreExpr=(quotedString | cStyleComment))

        c_function = (decl_data_type("type") 
                      + ident("name")
                      + LPAR + Optional(delimitedList(arg), [])("args") + RPAR 
                      + code_body("body"))
        c_function.ignore(cStyleComment)
        
        source_code = '''
            int is_odd(int x) { 
                return (x%2); 
            }
                
            int dec_to_hex(char hchar) { 
                if (hchar >= '0' && hchar <= '9') { 
                    return (ord(hchar)-ord('0')); 
                } else { 
                    return (10+ord(hchar)-ord('A'));
                } 
            }
        '''
        for func in c_function.searchString(source_code):
            print("%(name)s (%(type)s) args: %(args)s" % func)

    prints::
        is_odd (int) args: [['int', 'x']]
        dec_to_hex (int) args: [['char', 'hchar']]
    """
    if opener == closer:
        raise ValueError("opening and closing strings cannot be the same")
    if content is None:
        if isinstance(opener,basestring) and isinstance(closer,basestring):
            if len(opener) == 1 and len(closer)==1:
                if ignoreExpr is not None:
                    content = (Combine(OneOrMore(~ignoreExpr +
                                    CharsNotIn(opener+closer+ParserElement.DEFAULT_WHITE_CHARS,exact=1))
                                ).setParseAction(lambda t:t[0].strip()))
                else:
                    content = (empty.copy()+CharsNotIn(opener+closer+ParserElement.DEFAULT_WHITE_CHARS
                                ).setParseAction(lambda t:t[0].strip()))
            else:
                if ignoreExpr is not None:
                    content = (Combine(OneOrMore(~ignoreExpr + 
                                    ~Literal(opener) + ~Literal(closer) +
                                    CharsNotIn(ParserElement.DEFAULT_WHITE_CHARS,exact=1))
                                ).setParseAction(lambda t:t[0].strip()))
                else:
                    content = (Combine(OneOrMore(~Literal(opener) + ~Literal(closer) +
                                    CharsNotIn(ParserElement.DEFAULT_WHITE_CHARS,exact=1))
                                ).setParseAction(lambda t:t[0].strip()))
        else:
            raise ValueError("opening and closing arguments must be strings if no content expression is given")
    ret = Forward()
    if ignoreExpr is not None:
        ret <<= Group( Suppress(opener) + ZeroOrMore( ignoreExpr | ret | content ) + Suppress(closer) )
    else:
        ret <<= Group( Suppress(opener) + ZeroOrMore( ret | content )  + Suppress(closer) )
    ret.setName('nested %s%s expression' % (opener,closer))
    return ret

def indentedBlock(blockStatementExpr, indentStack, indent=True):
    """
    Helper method for defining space-delimited indentation blocks, such as
    those used to define block statements in Python source code.

    Parameters:
     - blockStatementExpr - expression defining syntax of statement that
            is repeated within the indented block
     - indentStack - list created by caller to manage indentation stack
            (multiple statementWithIndentedBlock expressions within a single grammar
            should share a common indentStack)
     - indent - boolean indicating whether block must be indented beyond
            the current level; set to False for a block of left-most statements
            (default=C{True})

    A valid block must contain at least one C{blockStatement}.

    Example::
        data = '''
        def A(z):
          A1
          B = 100
          G = A2
          A2
          A3
        B
        def BB(a,b,c):
          BB1
          def BBA():
            bba1
            bba2
            bba3
        C
        D
        def spam(x,y):
             def eggs(z):
                 pass
        '''


        indentStack = [1]
        stmt = Forward()

        identifier = Word(alphas, alphanums)
        funcDecl = ("def" + identifier + Group( "(" + Optional( delimitedList(identifier) ) + ")" ) + ":")
        func_body = indentedBlock(stmt, indentStack)
        funcDef = Group( funcDecl + func_body )

        rvalue = Forward()
        funcCall = Group(identifier + "(" + Optional(delimitedList(rvalue)) + ")")
        rvalue << (funcCall | identifier | Word(nums))
        assignment = Group(identifier + "=" + rvalue)
        stmt << ( funcDef | assignment | identifier )

        module_body = OneOrMore(stmt)

        parseTree = module_body.parseString(data)
        parseTree.pprint()
    prints::
        [['def',
          'A',
          ['(', 'z', ')'],
          ':',
          [['A1'], [['B', '=', '100']], [['G', '=', 'A2']], ['A2'], ['A3']]],
         'B',
         ['def',
          'BB',
          ['(', 'a', 'b', 'c', ')'],
          ':',
          [['BB1'], [['def', 'BBA', ['(', ')'], ':', [['bba1'], ['bba2'], ['bba3']]]]]],
         'C',
         'D',
         ['def',
          'spam',
          ['(', 'x', 'y', ')'],
          ':',
          [[['def', 'eggs', ['(', 'z', ')'], ':', [['pass']]]]]]] 
    """
    def checkPeerIndent(s,l,t):
        if l >= len(s): return
        curCol = col(l,s)
        if curCol != indentStack[-1]:
            if curCol > indentStack[-1]:
                raise ParseFatalException(s,l,"illegal nesting")
            raise ParseException(s,l,"not a peer entry")

    def checkSubIndent(s,l,t):
        curCol = col(l,s)
        if curCol > indentStack[-1]:
            indentStack.append( curCol )
        else:
            raise ParseException(s,l,"not a subentry")

    def checkUnindent(s,l,t):
        if l >= len(s): return
        curCol = col(l,s)
        if not(indentStack and curCol < indentStack[-1] and curCol <= indentStack[-2]):
            raise ParseException(s,l,"not an unindent")
        indentStack.pop()

    NL = OneOrMore(LineEnd().setWhitespaceChars("\t ").suppress())
    INDENT = (Empty() + Empty().setParseAction(checkSubIndent)).setName('INDENT')
    PEER   = Empty().setParseAction(checkPeerIndent).setName('')
    UNDENT = Empty().setParseAction(checkUnindent).setName('UNINDENT')
    if indent:
        smExpr = Group( Optional(NL) +
            #~ FollowedBy(blockStatementExpr) +
            INDENT + (OneOrMore( PEER + Group(blockStatementExpr) + Optional(NL) )) + UNDENT)
    else:
        smExpr = Group( Optional(NL) +
            (OneOrMore( PEER + Group(blockStatementExpr) + Optional(NL) )) )
    blockStatementExpr.ignore(_bslash + LineEnd())
    return smExpr.setName('indented block')

alphas8bit = srange(r"[\0xc0-\0xd6\0xd8-\0xf6\0xf8-\0xff]")
punc8bit = srange(r"[\0xa1-\0xbf\0xd7\0xf7]")

anyOpenTag,anyCloseTag = makeHTMLTags(Word(alphas,alphanums+"_:").setName('any tag'))
_htmlEntityMap = dict(zip("gt lt amp nbsp quot apos".split(),'><& "\''))
commonHTMLEntity = Regex('&(?P<entity>' + '|'.join(_htmlEntityMap.keys()) +");").setName("common HTML entity")
def replaceHTMLEntity(t):
    """Helper parser action to replace common HTML entities with their special characters"""
    return _htmlEntityMap.get(t.entity)
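
# Illustrative usage sketch (not part of the original module; kept in comments so that
# nothing runs at import time). replaceHTMLEntity is typically attached to
# commonHTMLEntity as a parse action and used with transformString:
#
#   transformer = commonHTMLEntity.copy().setParseAction(replaceHTMLEntity)
#   transformer.transformString("25 &gt; 15 &amp; 10 &lt; 20")   # -> '25 > 15 & 10 < 20'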

# it's easy to get these comment structures wrong - they're very common, so may as well make them available
cStyleComment = Combine(Regex(r"/\*(?:[^*]|\*(?!/))*") + '*/').setName("C style comment")
"Comment of the form C{/* ... */}"

htmlComment = Regex(r"<!--[\s\S]*?-->").setName("HTML comment")
"Comment of the form C{<!-- ... -->}"

restOfLine = Regex(r".*").leaveWhitespace().setName("rest of line")
dblSlashComment = Regex(r"//(?:\\\n|[^\n])*").setName("// comment")
"Comment of the form C{// ... (to end of line)}"

cppStyleComment = Combine(Regex(r"/\*(?:[^*]|\*(?!/))*") + '*/'| dblSlashComment).setName("C++ style comment")
"Comment of either form C{L{cStyleComment}} or C{L{dblSlashComment}}"

javaStyleComment = cppStyleComment
"Same as C{L{cppStyleComment}}"

pythonStyleComment = Regex(r"#.*").setName("Python style comment")
"Comment of the form C{# ... (to end of line)}"

_commasepitem = Combine(OneOrMore(Word(printables, excludeChars=',') +
                                  Optional( Word(" \t") +
                                            ~Literal(",") + ~LineEnd() ) ) ).streamline().setName("commaItem")
commaSeparatedList = delimitedList( Optional( quotedString.copy() | _commasepitem, default="") ).setName("commaSeparatedList")
"""(Deprecated) Predefined expression of 1 or more printable words or quoted strings, separated by commas.
   This expression is deprecated in favor of L{pyparsing_common.comma_separated_list}."""

# some other useful expressions - using lower-case class name since we are really using this as a namespace
class pyparsing_common:
    """
    Here are some common low-level expressions that may be useful in jump-starting parser development:
     - numeric forms (L{integers<integer>}, L{reals<real>}, L{scientific notation<sci_real>})
     - common L{programming identifiers<identifier>}
     - network addresses (L{MAC<mac_address>}, L{IPv4<ipv4_address>}, L{IPv6<ipv6_address>})
     - ISO8601 L{dates<iso8601_date>} and L{datetime<iso8601_datetime>}
     - L{UUID<uuid>}
     - L{comma-separated list<comma_separated_list>}
    Parse actions:
     - C{L{convertToInteger}}
     - C{L{convertToFloat}}
     - C{L{convertToDate}}
     - C{L{convertToDatetime}}
     - C{L{stripHTMLTags}}
     - C{L{upcaseTokens}}
     - C{L{downcaseTokens}}

    Example::
        pyparsing_common.number.runTests('''
            # any int or real number, returned as the appropriate type
            100
            -100
            +100
            3.14159
            6.02e23
            1e-12
            ''')

        pyparsing_common.fnumber.runTests('''
            # any int or real number, returned as float
            100
            -100
            +100
            3.14159
            6.02e23
            1e-12
            ''')

        pyparsing_common.hex_integer.runTests('''
            # hex numbers
            100
            FF
            ''')

        pyparsing_common.fraction.runTests('''
            # fractions
            1/2
            -3/4
            ''')

        pyparsing_common.mixed_integer.runTests('''
            # mixed fractions
            1
            1/2
            -3/4
            1-3/4
            ''')

        import uuid
        pyparsing_common.uuid.setParseAction(tokenMap(uuid.UUID))
        pyparsing_common.uuid.runTests('''
            # uuid
            12345678-1234-5678-1234-567812345678
            ''')
    prints::
        # any int or real number, returned as the appropriate type
        100
        [100]

        -100
        [-100]

        +100
        [100]

        3.14159
        [3.14159]

        6.02e23
        [6.02e+23]

        1e-12
        [1e-12]

        # any int or real number, returned as float
        100
        [100.0]

        -100
        [-100.0]

        +100
        [100.0]

        3.14159
        [3.14159]

        6.02e23
        [6.02e+23]

        1e-12
        [1e-12]

        # hex numbers
        100
        [256]

        FF
        [255]

        # fractions
        1/2
        [0.5]

        -3/4
        [-0.75]

        # mixed fractions
        1
        [1]

        1/2
        [0.5]

        -3/4
        [-0.75]

        1-3/4
        [1.75]

        # uuid
        12345678-1234-5678-1234-567812345678
        [UUID('12345678-1234-5678-1234-567812345678')]
    """

    convertToInteger = tokenMap(int)
    """
    Parse action for converting parsed integers to Python int
    """

    convertToFloat = tokenMap(float)
    """
    Parse action for converting parsed numbers to Python float
    """

    integer = Word(nums).setName("integer").setParseAction(convertToInteger)
    """expression that parses an unsigned integer, returns an int"""

    hex_integer = Word(hexnums).setName("hex integer").setParseAction(tokenMap(int,16))
    """expression that parses a hexadecimal integer, returns an int"""

    signed_integer = Regex(r'[+-]?\d+').setName("signed integer").setParseAction(convertToInteger)
    """expression that parses an integer with optional leading sign, returns an int"""

    fraction = (signed_integer().setParseAction(convertToFloat) + '/' + signed_integer().setParseAction(convertToFloat)).setName("fraction")
    """fractional expression of an integer divided by an integer, returns a float"""
    fraction.addParseAction(lambda t: t[0]/t[-1])

    mixed_integer = (fraction | signed_integer + Optional(Optional('-').suppress() + fraction)).setName("fraction or mixed integer-fraction")
    """mixed integer of the form 'integer - fraction', with optional leading integer, returns float"""
    mixed_integer.addParseAction(sum)

    real = Regex(r'[+-]?\d+\.\d*').setName("real number").setParseAction(convertToFloat)
    """expression that parses a floating point number and returns a float"""

    sci_real = Regex(r'[+-]?\d+([eE][+-]?\d+|\.\d*([eE][+-]?\d+)?)').setName("real number with scientific notation").setParseAction(convertToFloat)
    """expression that parses a floating point number with optional scientific notation and returns a float"""

    # streamlining this expression makes the docs nicer-looking
    number = (sci_real | real | signed_integer).streamline()
    """any numeric expression, returns the corresponding Python type"""

    fnumber = Regex(r'[+-]?\d+\.?\d*([eE][+-]?\d+)?').setName("fnumber").setParseAction(convertToFloat)
    """any int or real number, returned as float"""
    
    identifier = Word(alphas+'_', alphanums+'_').setName("identifier")
    """typical code identifier (leading alpha or '_', followed by 0 or more alphas, nums, or '_')"""
    
    ipv4_address = Regex(r'(25[0-5]|2[0-4][0-9]|1?[0-9]{1,2})(\.(25[0-5]|2[0-4][0-9]|1?[0-9]{1,2})){3}').setName("IPv4 address")
    "IPv4 address (C{0.0.0.0 - 255.255.255.255})"

    _ipv6_part = Regex(r'[0-9a-fA-F]{1,4}').setName("hex_integer")
    _full_ipv6_address = (_ipv6_part + (':' + _ipv6_part)*7).setName("full IPv6 address")
    _short_ipv6_address = (Optional(_ipv6_part + (':' + _ipv6_part)*(0,6)) + "::" + Optional(_ipv6_part + (':' + _ipv6_part)*(0,6))).setName("short IPv6 address")
    _short_ipv6_address.addCondition(lambda t: sum(1 for tt in t if pyparsing_common._ipv6_part.matches(tt)) < 8)
    _mixed_ipv6_address = ("::ffff:" + ipv4_address).setName("mixed IPv6 address")
    ipv6_address = Combine((_full_ipv6_address | _mixed_ipv6_address | _short_ipv6_address).setName("IPv6 address")).setName("IPv6 address")
    "IPv6 address (long, short, or mixed form)"
    
    mac_address = Regex(r'[0-9a-fA-F]{2}([:.-])[0-9a-fA-F]{2}(?:\1[0-9a-fA-F]{2}){4}').setName("MAC address")
    "MAC address xx:xx:xx:xx:xx (may also have '-' or '.' delimiters)"

    @staticmethod
    def convertToDate(fmt="%Y-%m-%d"):
        """
        Helper to create a parse action for converting parsed date string to Python datetime.date

        Params -
         - fmt - format to be passed to datetime.strptime (default=C{"%Y-%m-%d"})

        Example::
            date_expr = pyparsing_common.iso8601_date.copy()
            date_expr.setParseAction(pyparsing_common.convertToDate())
            print(date_expr.parseString("1999-12-31"))
        prints::
            [datetime.date(1999, 12, 31)]
        """
        def cvt_fn(s,l,t):
            try:
                return datetime.strptime(t[0], fmt).date()
            except ValueError as ve:
                raise ParseException(s, l, str(ve))
        return cvt_fn

    @staticmethod
    def convertToDatetime(fmt="%Y-%m-%dT%H:%M:%S.%f"):
        """
        Helper to create a parse action for converting parsed datetime string to Python datetime.datetime

        Params -
         - fmt - format to be passed to datetime.strptime (default=C{"%Y-%m-%dT%H:%M:%S.%f"})

        Example::
            dt_expr = pyparsing_common.iso8601_datetime.copy()
            dt_expr.setParseAction(pyparsing_common.convertToDatetime())
            print(dt_expr.parseString("1999-12-31T23:59:59.999"))
        prints::
            [datetime.datetime(1999, 12, 31, 23, 59, 59, 999000)]
        """
        def cvt_fn(s,l,t):
            try:
                return datetime.strptime(t[0], fmt)
            except ValueError as ve:
                raise ParseException(s, l, str(ve))
        return cvt_fn

    iso8601_date = Regex(r'(?P<year>\d{4})(?:-(?P<month>\d\d)(?:-(?P<day>\d\d))?)?').setName("ISO8601 date")
    "ISO8601 date (C{yyyy-mm-dd})"

    iso8601_datetime = Regex(r'(?P<year>\d{4})-(?P<month>\d\d)-(?P<day>\d\d)[T ](?P<hour>\d\d):(?P<minute>\d\d)(:(?P<second>\d\d(\.\d*)?)?)?(?P<tz>Z|[+-]\d\d:?\d\d)?').setName("ISO8601 datetime")
    "ISO8601 datetime (C{yyyy-mm-ddThh:mm:ss.s(Z|+-00:00)}) - trailing seconds, milliseconds, and timezone optional; accepts separating C{'T'} or C{' '}"

    uuid = Regex(r'[0-9a-fA-F]{8}(-[0-9a-fA-F]{4}){3}-[0-9a-fA-F]{12}').setName("UUID")
    "UUID (C{xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx})"

    _html_stripper = anyOpenTag.suppress() | anyCloseTag.suppress()
    @staticmethod
    def stripHTMLTags(s, l, tokens):
        """
        Parse action to remove HTML tags from web page HTML source

        Example::
            # strip HTML links from normal text 
            text = '<td>More info at the <a href="http://pyparsing.wikispaces.com">pyparsing</a> wiki page</td>'
            td,td_end = makeHTMLTags("TD")
            table_text = td + SkipTo(td_end).setParseAction(pyparsing_common.stripHTMLTags)("body") + td_end
            
            print(table_text.parseString(text).body) # -> 'More info at the pyparsing wiki page'
        """
        return pyparsing_common._html_stripper.transformString(tokens[0])

    _commasepitem = Combine(OneOrMore(~Literal(",") + ~LineEnd() + Word(printables, excludeChars=',') 
                                        + Optional( White(" \t") ) ) ).streamline().setName("commaItem")
    comma_separated_list = delimitedList( Optional( quotedString.copy() | _commasepitem, default="") ).setName("comma separated list")
    """Predefined expression of 1 or more printable words or quoted strings, separated by commas."""

    upcaseTokens = staticmethod(tokenMap(lambda t: _ustr(t).upper()))
    """Parse action to convert tokens to upper case."""

    downcaseTokens = staticmethod(tokenMap(lambda t: _ustr(t).lower()))
    """Parse action to convert tokens to lower case."""


if __name__ == "__main__":

    selectToken    = CaselessLiteral("select")
    fromToken      = CaselessLiteral("from")

    ident          = Word(alphas, alphanums + "_$")

    columnName     = delimitedList(ident, ".", combine=True).setParseAction(upcaseTokens)
    columnNameList = Group(delimitedList(columnName)).setName("columns")
    columnSpec     = ('*' | columnNameList)

    tableName      = delimitedList(ident, ".", combine=True).setParseAction(upcaseTokens)
    tableNameList  = Group(delimitedList(tableName)).setName("tables")
    
    simpleSQL      = selectToken("command") + columnSpec("columns") + fromToken + tableNameList("tables")

    # demo runTests method, including embedded comments in test string
    simpleSQL.runTests("""
        # '*' as column list and dotted table name
        select * from SYS.XYZZY

        # caseless match on "SELECT", and casts back to "select"
        SELECT * from XYZZY, ABC

        # list of column names, and mixed case SELECT keyword
        Select AA,BB,CC from Sys.dual

        # multiple tables
        Select A, B, C from Sys.dual, Table2

        # invalid SELECT keyword - should fail
        Xelect A, B, C from Sys.dual

        # incomplete command - should fail
        Select

        # invalid column name - should fail
        Select ^^^ frox Sys.dual

        """)

    pyparsing_common.number.runTests("""
        100
        -100
        +100
        3.14159
        6.02e23
        1e-12
        """)

    # any int or real number, returned as float
    pyparsing_common.fnumber.runTests("""
        100
        -100
        +100
        3.14159
        6.02e23
        1e-12
        """)

    pyparsing_common.hex_integer.runTests("""
        100
        FF
        """)

    import uuid
    pyparsing_common.uuid.setParseAction(tokenMap(uuid.UUID))
    pyparsing_common.uuid.runTests("""
        12345678-1234-5678-1234-567812345678
        """)
site-packages/six.py
# Copyright (c) 2010-2017 Benjamin Peterson
#
# Permission is hereby granted, free of charge, to any person obtaining a copy
# of this software and associated documentation files (the "Software"), to deal
# in the Software without restriction, including without limitation the rights
# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
# copies of the Software, and to permit persons to whom the Software is
# furnished to do so, subject to the following conditions:
#
# The above copyright notice and this permission notice shall be included in all
# copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
# SOFTWARE.

"""Utilities for writing code that runs on Python 2 and 3"""

from __future__ import absolute_import

import functools
import itertools
import operator
import sys
import types

__author__ = "Benjamin Peterson <benjamin@python.org>"
__version__ = "1.11.0"


# Useful for very coarse version differentiation.
PY2 = sys.version_info[0] == 2
PY3 = sys.version_info[0] == 3
PY34 = sys.version_info[0:2] >= (3, 4)

if PY3:
    string_types = str,
    integer_types = int,
    class_types = type,
    text_type = str
    binary_type = bytes

    MAXSIZE = sys.maxsize
else:
    string_types = basestring,
    integer_types = (int, long)
    class_types = (type, types.ClassType)
    text_type = unicode
    binary_type = str

    if sys.platform.startswith("java"):
        # Jython always uses 32 bits.
        MAXSIZE = int((1 << 31) - 1)
    else:
        # It's possible to have sizeof(long) != sizeof(Py_ssize_t).
        class X(object):

            def __len__(self):
                return 1 << 31
        try:
            len(X())
        except OverflowError:
            # 32-bit
            MAXSIZE = int((1 << 31) - 1)
        else:
            # 64-bit
            MAXSIZE = int((1 << 63) - 1)
        del X


def _add_doc(func, doc):
    """Add documentation to a function."""
    func.__doc__ = doc


def _import_module(name):
    """Import module, returning the module after the last dot."""
    __import__(name)
    return sys.modules[name]


class _LazyDescr(object):

    def __init__(self, name):
        self.name = name

    def __get__(self, obj, tp):
        result = self._resolve()
        setattr(obj, self.name, result)  # Invokes __set__.
        try:
            # This is a bit ugly, but it avoids running this again by
            # removing this descriptor.
            delattr(obj.__class__, self.name)
        except AttributeError:
            pass
        return result


class MovedModule(_LazyDescr):

    def __init__(self, name, old, new=None):
        super(MovedModule, self).__init__(name)
        if PY3:
            if new is None:
                new = name
            self.mod = new
        else:
            self.mod = old

    def _resolve(self):
        return _import_module(self.mod)

    def __getattr__(self, attr):
        _module = self._resolve()
        value = getattr(_module, attr)
        setattr(self, attr, value)
        return value


class _LazyModule(types.ModuleType):

    def __init__(self, name):
        super(_LazyModule, self).__init__(name)
        self.__doc__ = self.__class__.__doc__

    def __dir__(self):
        attrs = ["__doc__", "__name__"]
        attrs += [attr.name for attr in self._moved_attributes]
        return attrs

    # Subclasses should override this
    _moved_attributes = []


class MovedAttribute(_LazyDescr):

    def __init__(self, name, old_mod, new_mod, old_attr=None, new_attr=None):
        super(MovedAttribute, self).__init__(name)
        if PY3:
            if new_mod is None:
                new_mod = name
            self.mod = new_mod
            if new_attr is None:
                if old_attr is None:
                    new_attr = name
                else:
                    new_attr = old_attr
            self.attr = new_attr
        else:
            self.mod = old_mod
            if old_attr is None:
                old_attr = name
            self.attr = old_attr

    def _resolve(self):
        module = _import_module(self.mod)
        return getattr(module, self.attr)


class _SixMetaPathImporter(object):

    """
    A meta path importer to import six.moves and its submodules.

    This class implements a PEP302 finder and loader. It should be compatible
    with Python 2.5 and all existing versions of Python3
    """

    def __init__(self, six_module_name):
        self.name = six_module_name
        self.known_modules = {}

    def _add_module(self, mod, *fullnames):
        for fullname in fullnames:
            self.known_modules[self.name + "." + fullname] = mod

    def _get_module(self, fullname):
        return self.known_modules[self.name + "." + fullname]

    def find_module(self, fullname, path=None):
        if fullname in self.known_modules:
            return self
        return None

    def __get_module(self, fullname):
        try:
            return self.known_modules[fullname]
        except KeyError:
            raise ImportError("This loader does not know module " + fullname)

    def load_module(self, fullname):
        try:
            # in case of a reload
            return sys.modules[fullname]
        except KeyError:
            pass
        mod = self.__get_module(fullname)
        if isinstance(mod, MovedModule):
            mod = mod._resolve()
        else:
            mod.__loader__ = self
        sys.modules[fullname] = mod
        return mod

    def is_package(self, fullname):
        """
        Return True if the named module is a package.

        We need this method to get correct spec objects with
        Python 3.4 (see PEP451)
        """
        return hasattr(self.__get_module(fullname), "__path__")

    def get_code(self, fullname):
        """Return None

        Required if is_package is implemented"""
        self.__get_module(fullname)  # eventually raises ImportError
        return None
    get_source = get_code  # same as get_code

_importer = _SixMetaPathImporter(__name__)


class _MovedItems(_LazyModule):

    """Lazy loading of moved objects"""
    __path__ = []  # mark as package


_moved_attributes = [
    MovedAttribute("cStringIO", "cStringIO", "io", "StringIO"),
    MovedAttribute("filter", "itertools", "builtins", "ifilter", "filter"),
    MovedAttribute("filterfalse", "itertools", "itertools", "ifilterfalse", "filterfalse"),
    MovedAttribute("input", "__builtin__", "builtins", "raw_input", "input"),
    MovedAttribute("intern", "__builtin__", "sys"),
    MovedAttribute("map", "itertools", "builtins", "imap", "map"),
    MovedAttribute("getcwd", "os", "os", "getcwdu", "getcwd"),
    MovedAttribute("getcwdb", "os", "os", "getcwd", "getcwdb"),
    MovedAttribute("getoutput", "commands", "subprocess"),
    MovedAttribute("range", "__builtin__", "builtins", "xrange", "range"),
    MovedAttribute("reload_module", "__builtin__", "importlib" if PY34 else "imp", "reload"),
    MovedAttribute("reduce", "__builtin__", "functools"),
    MovedAttribute("shlex_quote", "pipes", "shlex", "quote"),
    MovedAttribute("StringIO", "StringIO", "io"),
    MovedAttribute("UserDict", "UserDict", "collections"),
    MovedAttribute("UserList", "UserList", "collections"),
    MovedAttribute("UserString", "UserString", "collections"),
    MovedAttribute("xrange", "__builtin__", "builtins", "xrange", "range"),
    MovedAttribute("zip", "itertools", "builtins", "izip", "zip"),
    MovedAttribute("zip_longest", "itertools", "itertools", "izip_longest", "zip_longest"),
    MovedModule("builtins", "__builtin__"),
    MovedModule("configparser", "ConfigParser"),
    MovedModule("copyreg", "copy_reg"),
    MovedModule("dbm_gnu", "gdbm", "dbm.gnu"),
    MovedModule("_dummy_thread", "dummy_thread", "_dummy_thread"),
    MovedModule("http_cookiejar", "cookielib", "http.cookiejar"),
    MovedModule("http_cookies", "Cookie", "http.cookies"),
    MovedModule("html_entities", "htmlentitydefs", "html.entities"),
    MovedModule("html_parser", "HTMLParser", "html.parser"),
    MovedModule("http_client", "httplib", "http.client"),
    MovedModule("email_mime_base", "email.MIMEBase", "email.mime.base"),
    MovedModule("email_mime_image", "email.MIMEImage", "email.mime.image"),
    MovedModule("email_mime_multipart", "email.MIMEMultipart", "email.mime.multipart"),
    MovedModule("email_mime_nonmultipart", "email.MIMENonMultipart", "email.mime.nonmultipart"),
    MovedModule("email_mime_text", "email.MIMEText", "email.mime.text"),
    MovedModule("BaseHTTPServer", "BaseHTTPServer", "http.server"),
    MovedModule("CGIHTTPServer", "CGIHTTPServer", "http.server"),
    MovedModule("SimpleHTTPServer", "SimpleHTTPServer", "http.server"),
    MovedModule("cPickle", "cPickle", "pickle"),
    MovedModule("queue", "Queue"),
    MovedModule("reprlib", "repr"),
    MovedModule("socketserver", "SocketServer"),
    MovedModule("_thread", "thread", "_thread"),
    MovedModule("tkinter", "Tkinter"),
    MovedModule("tkinter_dialog", "Dialog", "tkinter.dialog"),
    MovedModule("tkinter_filedialog", "FileDialog", "tkinter.filedialog"),
    MovedModule("tkinter_scrolledtext", "ScrolledText", "tkinter.scrolledtext"),
    MovedModule("tkinter_simpledialog", "SimpleDialog", "tkinter.simpledialog"),
    MovedModule("tkinter_tix", "Tix", "tkinter.tix"),
    MovedModule("tkinter_ttk", "ttk", "tkinter.ttk"),
    MovedModule("tkinter_constants", "Tkconstants", "tkinter.constants"),
    MovedModule("tkinter_dnd", "Tkdnd", "tkinter.dnd"),
    MovedModule("tkinter_colorchooser", "tkColorChooser",
                "tkinter.colorchooser"),
    MovedModule("tkinter_commondialog", "tkCommonDialog",
                "tkinter.commondialog"),
    MovedModule("tkinter_tkfiledialog", "tkFileDialog", "tkinter.filedialog"),
    MovedModule("tkinter_font", "tkFont", "tkinter.font"),
    MovedModule("tkinter_messagebox", "tkMessageBox", "tkinter.messagebox"),
    MovedModule("tkinter_tksimpledialog", "tkSimpleDialog",
                "tkinter.simpledialog"),
    MovedModule("urllib_parse", __name__ + ".moves.urllib_parse", "urllib.parse"),
    MovedModule("urllib_error", __name__ + ".moves.urllib_error", "urllib.error"),
    MovedModule("urllib", __name__ + ".moves.urllib", __name__ + ".moves.urllib"),
    MovedModule("urllib_robotparser", "robotparser", "urllib.robotparser"),
    MovedModule("xmlrpc_client", "xmlrpclib", "xmlrpc.client"),
    MovedModule("xmlrpc_server", "SimpleXMLRPCServer", "xmlrpc.server"),
]
# Add windows specific modules.
if sys.platform == "win32":
    _moved_attributes += [
        MovedModule("winreg", "_winreg"),
    ]

for attr in _moved_attributes:
    setattr(_MovedItems, attr.name, attr)
    if isinstance(attr, MovedModule):
        _importer._add_module(attr, "moves." + attr.name)
del attr

_MovedItems._moved_attributes = _moved_attributes

moves = _MovedItems(__name__ + ".moves")
_importer._add_module(moves, "moves")


class Module_six_moves_urllib_parse(_LazyModule):

    """Lazy loading of moved objects in six.moves.urllib_parse"""


_urllib_parse_moved_attributes = [
    MovedAttribute("ParseResult", "urlparse", "urllib.parse"),
    MovedAttribute("SplitResult", "urlparse", "urllib.parse"),
    MovedAttribute("parse_qs", "urlparse", "urllib.parse"),
    MovedAttribute("parse_qsl", "urlparse", "urllib.parse"),
    MovedAttribute("urldefrag", "urlparse", "urllib.parse"),
    MovedAttribute("urljoin", "urlparse", "urllib.parse"),
    MovedAttribute("urlparse", "urlparse", "urllib.parse"),
    MovedAttribute("urlsplit", "urlparse", "urllib.parse"),
    MovedAttribute("urlunparse", "urlparse", "urllib.parse"),
    MovedAttribute("urlunsplit", "urlparse", "urllib.parse"),
    MovedAttribute("quote", "urllib", "urllib.parse"),
    MovedAttribute("quote_plus", "urllib", "urllib.parse"),
    MovedAttribute("unquote", "urllib", "urllib.parse"),
    MovedAttribute("unquote_plus", "urllib", "urllib.parse"),
    MovedAttribute("unquote_to_bytes", "urllib", "urllib.parse", "unquote", "unquote_to_bytes"),
    MovedAttribute("urlencode", "urllib", "urllib.parse"),
    MovedAttribute("splitquery", "urllib", "urllib.parse"),
    MovedAttribute("splittag", "urllib", "urllib.parse"),
    MovedAttribute("splituser", "urllib", "urllib.parse"),
    MovedAttribute("splitvalue", "urllib", "urllib.parse"),
    MovedAttribute("uses_fragment", "urlparse", "urllib.parse"),
    MovedAttribute("uses_netloc", "urlparse", "urllib.parse"),
    MovedAttribute("uses_params", "urlparse", "urllib.parse"),
    MovedAttribute("uses_query", "urlparse", "urllib.parse"),
    MovedAttribute("uses_relative", "urlparse", "urllib.parse"),
]
for attr in _urllib_parse_moved_attributes:
    setattr(Module_six_moves_urllib_parse, attr.name, attr)
del attr

Module_six_moves_urllib_parse._moved_attributes = _urllib_parse_moved_attributes

_importer._add_module(Module_six_moves_urllib_parse(__name__ + ".moves.urllib_parse"),
                      "moves.urllib_parse", "moves.urllib.parse")


class Module_six_moves_urllib_error(_LazyModule):

    """Lazy loading of moved objects in six.moves.urllib_error"""


_urllib_error_moved_attributes = [
    MovedAttribute("URLError", "urllib2", "urllib.error"),
    MovedAttribute("HTTPError", "urllib2", "urllib.error"),
    MovedAttribute("ContentTooShortError", "urllib", "urllib.error"),
]
for attr in _urllib_error_moved_attributes:
    setattr(Module_six_moves_urllib_error, attr.name, attr)
del attr

Module_six_moves_urllib_error._moved_attributes = _urllib_error_moved_attributes

_importer._add_module(Module_six_moves_urllib_error(__name__ + ".moves.urllib.error"),
                      "moves.urllib_error", "moves.urllib.error")


class Module_six_moves_urllib_request(_LazyModule):

    """Lazy loading of moved objects in six.moves.urllib_request"""


_urllib_request_moved_attributes = [
    MovedAttribute("urlopen", "urllib2", "urllib.request"),
    MovedAttribute("install_opener", "urllib2", "urllib.request"),
    MovedAttribute("build_opener", "urllib2", "urllib.request"),
    MovedAttribute("pathname2url", "urllib", "urllib.request"),
    MovedAttribute("url2pathname", "urllib", "urllib.request"),
    MovedAttribute("getproxies", "urllib", "urllib.request"),
    MovedAttribute("Request", "urllib2", "urllib.request"),
    MovedAttribute("OpenerDirector", "urllib2", "urllib.request"),
    MovedAttribute("HTTPDefaultErrorHandler", "urllib2", "urllib.request"),
    MovedAttribute("HTTPRedirectHandler", "urllib2", "urllib.request"),
    MovedAttribute("HTTPCookieProcessor", "urllib2", "urllib.request"),
    MovedAttribute("ProxyHandler", "urllib2", "urllib.request"),
    MovedAttribute("BaseHandler", "urllib2", "urllib.request"),
    MovedAttribute("HTTPPasswordMgr", "urllib2", "urllib.request"),
    MovedAttribute("HTTPPasswordMgrWithDefaultRealm", "urllib2", "urllib.request"),
    MovedAttribute("AbstractBasicAuthHandler", "urllib2", "urllib.request"),
    MovedAttribute("HTTPBasicAuthHandler", "urllib2", "urllib.request"),
    MovedAttribute("ProxyBasicAuthHandler", "urllib2", "urllib.request"),
    MovedAttribute("AbstractDigestAuthHandler", "urllib2", "urllib.request"),
    MovedAttribute("HTTPDigestAuthHandler", "urllib2", "urllib.request"),
    MovedAttribute("ProxyDigestAuthHandler", "urllib2", "urllib.request"),
    MovedAttribute("HTTPHandler", "urllib2", "urllib.request"),
    MovedAttribute("HTTPSHandler", "urllib2", "urllib.request"),
    MovedAttribute("FileHandler", "urllib2", "urllib.request"),
    MovedAttribute("FTPHandler", "urllib2", "urllib.request"),
    MovedAttribute("CacheFTPHandler", "urllib2", "urllib.request"),
    MovedAttribute("UnknownHandler", "urllib2", "urllib.request"),
    MovedAttribute("HTTPErrorProcessor", "urllib2", "urllib.request"),
    MovedAttribute("urlretrieve", "urllib", "urllib.request"),
    MovedAttribute("urlcleanup", "urllib", "urllib.request"),
    MovedAttribute("URLopener", "urllib", "urllib.request"),
    MovedAttribute("FancyURLopener", "urllib", "urllib.request"),
    MovedAttribute("proxy_bypass", "urllib", "urllib.request"),
    MovedAttribute("parse_http_list", "urllib2", "urllib.request"),
    MovedAttribute("parse_keqv_list", "urllib2", "urllib.request"),
]
for attr in _urllib_request_moved_attributes:
    setattr(Module_six_moves_urllib_request, attr.name, attr)
del attr

Module_six_moves_urllib_request._moved_attributes = _urllib_request_moved_attributes

_importer._add_module(Module_six_moves_urllib_request(__name__ + ".moves.urllib.request"),
                      "moves.urllib_request", "moves.urllib.request")


class Module_six_moves_urllib_response(_LazyModule):

    """Lazy loading of moved objects in six.moves.urllib_response"""


_urllib_response_moved_attributes = [
    MovedAttribute("addbase", "urllib", "urllib.response"),
    MovedAttribute("addclosehook", "urllib", "urllib.response"),
    MovedAttribute("addinfo", "urllib", "urllib.response"),
    MovedAttribute("addinfourl", "urllib", "urllib.response"),
]
for attr in _urllib_response_moved_attributes:
    setattr(Module_six_moves_urllib_response, attr.name, attr)
del attr

Module_six_moves_urllib_response._moved_attributes = _urllib_response_moved_attributes

_importer._add_module(Module_six_moves_urllib_response(__name__ + ".moves.urllib.response"),
                      "moves.urllib_response", "moves.urllib.response")


class Module_six_moves_urllib_robotparser(_LazyModule):

    """Lazy loading of moved objects in six.moves.urllib_robotparser"""


_urllib_robotparser_moved_attributes = [
    MovedAttribute("RobotFileParser", "robotparser", "urllib.robotparser"),
]
for attr in _urllib_robotparser_moved_attributes:
    setattr(Module_six_moves_urllib_robotparser, attr.name, attr)
del attr

Module_six_moves_urllib_robotparser._moved_attributes = _urllib_robotparser_moved_attributes

_importer._add_module(Module_six_moves_urllib_robotparser(__name__ + ".moves.urllib.robotparser"),
                      "moves.urllib_robotparser", "moves.urllib.robotparser")


class Module_six_moves_urllib(types.ModuleType):

    """Create a six.moves.urllib namespace that resembles the Python 3 namespace"""
    __path__ = []  # mark as package
    parse = _importer._get_module("moves.urllib_parse")
    error = _importer._get_module("moves.urllib_error")
    request = _importer._get_module("moves.urllib_request")
    response = _importer._get_module("moves.urllib_response")
    robotparser = _importer._get_module("moves.urllib_robotparser")

    def __dir__(self):
        return ['parse', 'error', 'request', 'response', 'robotparser']

_importer._add_module(Module_six_moves_urllib(__name__ + ".moves.urllib"),
                      "moves.urllib")


def add_move(move):
    """Add an item to six.moves."""
    setattr(_MovedItems, move.name, move)


def remove_move(name):
    """Remove item from six.moves."""
    try:
        delattr(_MovedItems, name)
    except AttributeError:
        try:
            del moves.__dict__[name]
        except KeyError:
            raise AttributeError("no such move, %r" % (name,))


if PY3:
    _meth_func = "__func__"
    _meth_self = "__self__"

    _func_closure = "__closure__"
    _func_code = "__code__"
    _func_defaults = "__defaults__"
    _func_globals = "__globals__"
else:
    _meth_func = "im_func"
    _meth_self = "im_self"

    _func_closure = "func_closure"
    _func_code = "func_code"
    _func_defaults = "func_defaults"
    _func_globals = "func_globals"


try:
    advance_iterator = next
except NameError:
    def advance_iterator(it):
        return it.next()
next = advance_iterator


try:
    callable = callable
except NameError:
    def callable(obj):
        return any("__call__" in klass.__dict__ for klass in type(obj).__mro__)


if PY3:
    def get_unbound_function(unbound):
        return unbound

    create_bound_method = types.MethodType

    def create_unbound_method(func, cls):
        return func

    Iterator = object
else:
    def get_unbound_function(unbound):
        return unbound.im_func

    def create_bound_method(func, obj):
        return types.MethodType(func, obj, obj.__class__)

    def create_unbound_method(func, cls):
        return types.MethodType(func, None, cls)

    class Iterator(object):

        def next(self):
            return type(self).__next__(self)

    callable = callable
_add_doc(get_unbound_function,
         """Get the function out of a possibly unbound function""")


get_method_function = operator.attrgetter(_meth_func)
get_method_self = operator.attrgetter(_meth_self)
get_function_closure = operator.attrgetter(_func_closure)
get_function_code = operator.attrgetter(_func_code)
get_function_defaults = operator.attrgetter(_func_defaults)
get_function_globals = operator.attrgetter(_func_globals)


if PY3:
    def iterkeys(d, **kw):
        return iter(d.keys(**kw))

    def itervalues(d, **kw):
        return iter(d.values(**kw))

    def iteritems(d, **kw):
        return iter(d.items(**kw))

    def iterlists(d, **kw):
        return iter(d.lists(**kw))

    viewkeys = operator.methodcaller("keys")

    viewvalues = operator.methodcaller("values")

    viewitems = operator.methodcaller("items")
else:
    def iterkeys(d, **kw):
        return d.iterkeys(**kw)

    def itervalues(d, **kw):
        return d.itervalues(**kw)

    def iteritems(d, **kw):
        return d.iteritems(**kw)

    def iterlists(d, **kw):
        return d.iterlists(**kw)

    viewkeys = operator.methodcaller("viewkeys")

    viewvalues = operator.methodcaller("viewvalues")

    viewitems = operator.methodcaller("viewitems")

_add_doc(iterkeys, "Return an iterator over the keys of a dictionary.")
_add_doc(itervalues, "Return an iterator over the values of a dictionary.")
_add_doc(iteritems,
         "Return an iterator over the (key, value) pairs of a dictionary.")
_add_doc(iterlists,
         "Return an iterator over the (key, [values]) pairs of a dictionary.")


if PY3:
    def b(s):
        return s.encode("latin-1")

    def u(s):
        return s
    unichr = chr
    import struct
    int2byte = struct.Struct(">B").pack
    del struct
    byte2int = operator.itemgetter(0)
    indexbytes = operator.getitem
    iterbytes = iter
    import io
    StringIO = io.StringIO
    BytesIO = io.BytesIO
    _assertCountEqual = "assertCountEqual"
    if sys.version_info[1] <= 1:
        _assertRaisesRegex = "assertRaisesRegexp"
        _assertRegex = "assertRegexpMatches"
    else:
        _assertRaisesRegex = "assertRaisesRegex"
        _assertRegex = "assertRegex"
else:
    def b(s):
        return s
    # Workaround for standalone backslash

    def u(s):
        return unicode(s.replace(r'\\', r'\\\\'), "unicode_escape")
    unichr = unichr
    int2byte = chr

    def byte2int(bs):
        return ord(bs[0])

    def indexbytes(buf, i):
        return ord(buf[i])
    iterbytes = functools.partial(itertools.imap, ord)
    import StringIO
    StringIO = BytesIO = StringIO.StringIO
    _assertCountEqual = "assertItemsEqual"
    _assertRaisesRegex = "assertRaisesRegexp"
    _assertRegex = "assertRegexpMatches"
_add_doc(b, """Byte literal""")
_add_doc(u, """Text literal""")


def assertCountEqual(self, *args, **kwargs):
    return getattr(self, _assertCountEqual)(*args, **kwargs)


def assertRaisesRegex(self, *args, **kwargs):
    return getattr(self, _assertRaisesRegex)(*args, **kwargs)


def assertRegex(self, *args, **kwargs):
    return getattr(self, _assertRegex)(*args, **kwargs)
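
# Illustrative usage sketch (comments only, not part of the original module), called
# from inside a unittest.TestCase method:
#
#   six.assertCountEqual(self, [1, 2, 3], [3, 2, 1])
#   six.assertRaisesRegex(self, ValueError, "invalid literal", int, "not a number")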


if PY3:
    exec_ = getattr(moves.builtins, "exec")

    def reraise(tp, value, tb=None):
        try:
            if value is None:
                value = tp()
            if value.__traceback__ is not tb:
                raise value.with_traceback(tb)
            raise value
        finally:
            value = None
            tb = None

else:
    def exec_(_code_, _globs_=None, _locs_=None):
        """Execute code in a namespace."""
        if _globs_ is None:
            frame = sys._getframe(1)
            _globs_ = frame.f_globals
            if _locs_ is None:
                _locs_ = frame.f_locals
            del frame
        elif _locs_ is None:
            _locs_ = _globs_
        exec("""exec _code_ in _globs_, _locs_""")

    exec_("""def reraise(tp, value, tb=None):
    try:
        raise tp, value, tb
    finally:
        tb = None
""")


if sys.version_info[:2] == (3, 2):
    exec_("""def raise_from(value, from_value):
    try:
        if from_value is None:
            raise value
        raise value from from_value
    finally:
        value = None
""")
elif sys.version_info[:2] > (3, 2):
    exec_("""def raise_from(value, from_value):
    try:
        raise value from from_value
    finally:
        value = None
""")
else:
    def raise_from(value, from_value):
        raise value
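
# Illustrative usage sketch (comments only, not part of the original module);
# do_work() and orig_exc are placeholders:
#
#   try:
#       do_work()
#   except Exception:
#       six.reraise(*sys.exc_info())                    # re-raise, preserving the traceback
#
#   six.raise_from(ValueError("bad input"), orig_exc)   # PEP 3134 style exception chaining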


print_ = getattr(moves.builtins, "print", None)
if print_ is None:
    def print_(*args, **kwargs):
        """The new-style print function for Python 2.4 and 2.5."""
        fp = kwargs.pop("file", sys.stdout)
        if fp is None:
            return

        def write(data):
            if not isinstance(data, basestring):
                data = str(data)
            # If the file has an encoding, encode unicode with it.
            if (isinstance(fp, file) and
                    isinstance(data, unicode) and
                    fp.encoding is not None):
                errors = getattr(fp, "errors", None)
                if errors is None:
                    errors = "strict"
                data = data.encode(fp.encoding, errors)
            fp.write(data)
        want_unicode = False
        sep = kwargs.pop("sep", None)
        if sep is not None:
            if isinstance(sep, unicode):
                want_unicode = True
            elif not isinstance(sep, str):
                raise TypeError("sep must be None or a string")
        end = kwargs.pop("end", None)
        if end is not None:
            if isinstance(end, unicode):
                want_unicode = True
            elif not isinstance(end, str):
                raise TypeError("end must be None or a string")
        if kwargs:
            raise TypeError("invalid keyword arguments to print()")
        if not want_unicode:
            for arg in args:
                if isinstance(arg, unicode):
                    want_unicode = True
                    break
        if want_unicode:
            newline = unicode("\n")
            space = unicode(" ")
        else:
            newline = "\n"
            space = " "
        if sep is None:
            sep = space
        if end is None:
            end = newline
        for i, arg in enumerate(args):
            if i:
                write(sep)
            write(arg)
        write(end)
if sys.version_info[:2] < (3, 3):
    _print = print_

    def print_(*args, **kwargs):
        fp = kwargs.get("file", sys.stdout)
        flush = kwargs.pop("flush", False)
        _print(*args, **kwargs)
        if flush and fp is not None:
            fp.flush()

_add_doc(reraise, """Reraise an exception.""")

if sys.version_info[0:2] < (3, 4):
    def wraps(wrapped, assigned=functools.WRAPPER_ASSIGNMENTS,
              updated=functools.WRAPPER_UPDATES):
        def wrapper(f):
            f = functools.wraps(wrapped, assigned, updated)(f)
            f.__wrapped__ = wrapped
            return f
        return wrapper
else:
    wraps = functools.wraps


def with_metaclass(meta, *bases):
    """Create a base class with a metaclass."""
    # This requires a bit of explanation: the basic idea is to make a dummy
    # metaclass for one level of class instantiation that replaces itself with
    # the actual metaclass.
    class metaclass(type):

        def __new__(cls, name, this_bases, d):
            return meta(name, bases, d)

        @classmethod
        def __prepare__(cls, name, this_bases):
            return meta.__prepare__(name, bases)
    return type.__new__(metaclass, 'temporary_class', (), {})
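
# Illustrative usage sketch (comments only, not part of the original module); Meta and
# MyClass are placeholder names:
#
#   class Meta(type):
#       pass
#
#   class MyClass(with_metaclass(Meta, object)):
#       pass
#
#   type(MyClass) is Meta   # -> True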


def add_metaclass(metaclass):
    """Class decorator for creating a class with a metaclass."""
    def wrapper(cls):
        orig_vars = cls.__dict__.copy()
        slots = orig_vars.get('__slots__')
        if slots is not None:
            if isinstance(slots, str):
                slots = [slots]
            for slots_var in slots:
                orig_vars.pop(slots_var)
        orig_vars.pop('__dict__', None)
        orig_vars.pop('__weakref__', None)
        return metaclass(cls.__name__, cls.__bases__, orig_vars)
    return wrapper
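
# Illustrative usage sketch (comments only, not part of the original module); Meta and
# MyOtherClass are placeholder names:
#
#   @add_metaclass(Meta)
#   class MyOtherClass(object):
#       pass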


def python_2_unicode_compatible(klass):
    """
    A decorator that defines __unicode__ and __str__ methods under Python 2.
    Under Python 3 it does nothing.

    To support Python 2 and 3 with a single code base, define a __str__ method
    returning text and apply this decorator to the class.
    """
    if PY2:
        if '__str__' not in klass.__dict__:
            raise ValueError("@python_2_unicode_compatible cannot be applied "
                             "to %s because it doesn't define __str__()." %
                             klass.__name__)
        klass.__unicode__ = klass.__str__
        klass.__str__ = lambda self: self.__unicode__().encode('utf-8')
    return klass
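
# Illustrative usage sketch (comments only, not part of the original module); Greeting
# is a placeholder name:
#
#   @python_2_unicode_compatible
#   class Greeting(object):
#       def __str__(self):
#           return u'hello'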


# Complete the moves implementation.
# This code is at the end of this module to speed up module loading.
# Turn this module into a package.
__path__ = []  # required for PEP 302 and PEP 451
__package__ = __name__  # see PEP 366 @ReservedAssignment
if globals().get("__spec__") is not None:
    __spec__.submodule_search_locations = []  # PEP 451 @UndefinedVariable
# Remove other six meta path importers, since they cause problems. This can
# happen if six is removed from sys.modules and then reloaded. (Setuptools does
# this for some reason.)
if sys.meta_path:
    for i, importer in enumerate(sys.meta_path):
        # Here's some real nastiness: Another "instance" of the six module might
        # be floating around. Therefore, we can't use isinstance() to check for
        # the six meta path importer, since the other six instance will have
        # inserted an importer with different class.
        if (type(importer).__name__ == "_SixMetaPathImporter" and
                importer.name == __name__):
            del sys.meta_path[i]
            break
    del i, importer
# Finally, add the importer to the meta path import hook.
sys.meta_path.append(_importer)
site-packages/setuptools-39.2.0.dist-info/METADATA
Metadata-Version: 2.0
Name: setuptools
Version: 39.2.0
Summary: Easily download, build, install, upgrade, and uninstall Python packages
Home-page: https://github.com/pypa/setuptools
Author: Python Packaging Authority
Author-email: distutils-sig@python.org
License: UNKNOWN
Project-URL: Documentation, https://setuptools.readthedocs.io/
Keywords: CPAN PyPI distutils eggs package management
Platform: UNKNOWN
Classifier: Development Status :: 5 - Production/Stable
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: MIT License
Classifier: Operating System :: OS Independent
Classifier: Programming Language :: Python :: 2
Classifier: Programming Language :: Python :: 2.7
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.3
Classifier: Programming Language :: Python :: 3.4
Classifier: Programming Language :: Python :: 3.5
Classifier: Programming Language :: Python :: 3.6
Classifier: Topic :: Software Development :: Libraries :: Python Modules
Classifier: Topic :: System :: Archiving :: Packaging
Classifier: Topic :: System :: Systems Administration
Classifier: Topic :: Utilities
Requires-Python: >=2.7,!=3.0.*,!=3.1.*,!=3.2.*
Description-Content-Type: text/x-rst; charset=UTF-8
Provides-Extra: ssl
Provides-Extra: certs
Provides-Extra: certs
Requires-Dist: certifi (==2016.9.26); extra == 'certs'
Provides-Extra: ssl
Requires-Dist: wincertstore (==0.2); sys_platform=='win32' and extra == 'ssl'

.. image:: https://img.shields.io/pypi/v/setuptools.svg
   :target: https://pypi.org/project/setuptools

.. image:: https://readthedocs.org/projects/setuptools/badge/?version=latest
    :target: https://setuptools.readthedocs.io

.. image:: https://img.shields.io/travis/pypa/setuptools/master.svg?label=Linux%20build%20%40%20Travis%20CI
   :target: https://travis-ci.org/pypa/setuptools

.. image:: https://img.shields.io/appveyor/ci/pypa/setuptools/master.svg?label=Windows%20build%20%40%20Appveyor
   :target: https://ci.appveyor.com/project/pypa/setuptools/branch/master

.. image:: https://img.shields.io/codecov/c/github/pypa/setuptools/master.svg
   :target: https://codecov.io/gh/pypa/setuptools

.. image:: https://img.shields.io/pypi/pyversions/setuptools.svg

See the `Installation Instructions
<https://packaging.python.org/installing/>`_ in the Python Packaging
User's Guide for instructions on installing, upgrading, and uninstalling
Setuptools.

The project is `maintained at GitHub <https://github.com/pypa/setuptools>`_.

Questions and comments should be directed to the `distutils-sig
mailing list <http://mail.python.org/pipermail/distutils-sig/>`_.
Bug reports and especially tested patches may be
submitted directly to the `bug tracker
<https://github.com/pypa/setuptools/issues>`_.


Code of Conduct
---------------

Everyone interacting in the setuptools project's codebases, issue trackers,
chat rooms, and mailing lists is expected to follow the
`PyPA Code of Conduct <https://www.pypa.io/en/latest/code-of-conduct/>`_.


site-packages/setuptools-39.2.0.dist-info/INSTALLER
pip

site-packages/setuptools-39.2.0.dist-info/entry_points.txt
[console_scripts]
easy_install = setuptools.command.easy_install:main
easy_install-3.6 = setuptools.command.easy_install:main

[distutils.commands]
alias = setuptools.command.alias:alias
bdist_egg = setuptools.command.bdist_egg:bdist_egg
bdist_rpm = setuptools.command.bdist_rpm:bdist_rpm
bdist_wininst = setuptools.command.bdist_wininst:bdist_wininst
build_clib = setuptools.command.build_clib:build_clib
build_ext = setuptools.command.build_ext:build_ext
build_py = setuptools.command.build_py:build_py
develop = setuptools.command.develop:develop
dist_info = setuptools.command.dist_info:dist_info
easy_install = setuptools.command.easy_install:easy_install
egg_info = setuptools.command.egg_info:egg_info
install = setuptools.command.install:install
install_egg_info = setuptools.command.install_egg_info:install_egg_info
install_lib = setuptools.command.install_lib:install_lib
install_scripts = setuptools.command.install_scripts:install_scripts
register = setuptools.command.register:register
rotate = setuptools.command.rotate:rotate
saveopts = setuptools.command.saveopts:saveopts
sdist = setuptools.command.sdist:sdist
setopt = setuptools.command.setopt:setopt
test = setuptools.command.test:test
upload = setuptools.command.upload:upload
upload_docs = setuptools.command.upload_docs:upload_docs

[distutils.setup_keywords]
convert_2to3_doctests = setuptools.dist:assert_string_list
dependency_links = setuptools.dist:assert_string_list
eager_resources = setuptools.dist:assert_string_list
entry_points = setuptools.dist:check_entry_points
exclude_package_data = setuptools.dist:check_package_data
extras_require = setuptools.dist:check_extras
include_package_data = setuptools.dist:assert_bool
install_requires = setuptools.dist:check_requirements
namespace_packages = setuptools.dist:check_nsp
package_data = setuptools.dist:check_package_data
packages = setuptools.dist:check_packages
python_requires = setuptools.dist:check_specifier
setup_requires = setuptools.dist:check_requirements
test_loader = setuptools.dist:check_importable
test_runner = setuptools.dist:check_importable
test_suite = setuptools.dist:check_test_suite
tests_require = setuptools.dist:check_requirements
use_2to3 = setuptools.dist:assert_bool
use_2to3_exclude_fixers = setuptools.dist:assert_string_list
use_2to3_fixers = setuptools.dist:assert_string_list
zip_safe = setuptools.dist:assert_bool

[egg_info.writers]
PKG-INFO = setuptools.command.egg_info:write_pkg_info
dependency_links.txt = setuptools.command.egg_info:overwrite_arg
depends.txt = setuptools.command.egg_info:warn_depends_obsolete
eager_resources.txt = setuptools.command.egg_info:overwrite_arg
entry_points.txt = setuptools.command.egg_info:write_entries
namespace_packages.txt = setuptools.command.egg_info:overwrite_arg
requires.txt = setuptools.command.egg_info:write_requirements
top_level.txt = setuptools.command.egg_info:write_toplevel_names

[setuptools.installation]
eggsecutable = setuptools.command.easy_install:bootstrap

site-packages/setuptools-39.2.0.dist-info/DESCRIPTION.rst
.. image:: https://img.shields.io/pypi/v/setuptools.svg
   :target: https://pypi.org/project/setuptools

.. image:: https://readthedocs.org/projects/setuptools/badge/?version=latest
    :target: https://setuptools.readthedocs.io

.. image:: https://img.shields.io/travis/pypa/setuptools/master.svg?label=Linux%20build%20%40%20Travis%20CI
   :target: https://travis-ci.org/pypa/setuptools

.. image:: https://img.shields.io/appveyor/ci/pypa/setuptools/master.svg?label=Windows%20build%20%40%20Appveyor
   :target: https://ci.appveyor.com/project/pypa/setuptools/branch/master

.. image:: https://img.shields.io/codecov/c/github/pypa/setuptools/master.svg
   :target: https://codecov.io/gh/pypa/setuptools

.. image:: https://img.shields.io/pypi/pyversions/setuptools.svg

See the `Installation Instructions
<https://packaging.python.org/installing/>`_ in the Python Packaging
User's Guide for instructions on installing, upgrading, and uninstalling
Setuptools.

The project is `maintained at GitHub <https://github.com/pypa/setuptools>`_.

Questions and comments should be directed to the `distutils-sig
mailing list <http://mail.python.org/pipermail/distutils-sig/>`_.
Bug reports and especially tested patches may be
submitted directly to the `bug tracker
<https://github.com/pypa/setuptools/issues>`_.


Code of Conduct
---------------

Everyone interacting in the setuptools project's codebases, issue trackers,
chat rooms, and mailing lists is expected to follow the
`PyPA Code of Conduct <https://www.pypa.io/en/latest/code-of-conduct/>`_.


site-packages/setuptools-39.2.0.dist-info/zip-safe

site-packages/setuptools-39.2.0.dist-info/LICENSE.txt
Copyright (C) 2016 Jason R Coombs <jaraco@jaraco.com>

Permission is hereby granted, free of charge, to any person obtaining a copy of
this software and associated documentation files (the "Software"), to deal in
the Software without restriction, including without limitation the rights to
use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies
of the Software, and to permit persons to whom the Software is furnished to do
so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
site-packages/setuptools-39.2.0.dist-info/dependency_links.txt000064400000000357147511334560020224 0ustar00https://files.pythonhosted.org/packages/source/c/certifi/certifi-2016.9.26.tar.gz#md5=baa81e951a29958563689d868ef1064d
https://files.pythonhosted.org/packages/source/w/wincertstore/wincertstore-0.2.zip#md5=ae728f2f007185648d0c7a8679b361e2
site-packages/setuptools-39.2.0.dist-info/top_level.txt000064400000000046147511334560016672 0ustar00easy_install
pkg_resources
setuptools
site-packages/setuptools-39.2.0.dist-info/metadata.json000064400000011551147511334560016616 0ustar00{"classifiers": ["Development Status :: 5 - Production/Stable", "Intended Audience :: Developers", "License :: OSI Approved :: MIT License", "Operating System :: OS Independent", "Programming Language :: Python :: 2", "Programming Language :: Python :: 2.7", "Programming Language :: Python :: 3", "Programming Language :: Python :: 3.3", "Programming Language :: Python :: 3.4", "Programming Language :: Python :: 3.5", "Programming Language :: Python :: 3.6", "Topic :: Software Development :: Libraries :: Python Modules", "Topic :: System :: Archiving :: Packaging", "Topic :: System :: Systems Administration", "Topic :: Utilities"], "description_content_type": "text/x-rst; charset=UTF-8", "extensions": {"python.commands": {"wrap_console": {"easy_install": "setuptools.command.easy_install:main", "easy_install-3.6": "setuptools.command.easy_install:main"}}, "python.details": {"contacts": [{"email": "distutils-sig@python.org", "name": "Python Packaging Authority", "role": "author"}], "document_names": {"description": "DESCRIPTION.rst", "license": "LICENSE.txt"}, "project_urls": {"Home": "https://github.com/pypa/setuptools"}}, "python.exports": {"console_scripts": {"easy_install": "setuptools.command.easy_install:main", "easy_install-3.6": "setuptools.command.easy_install:main"}, "distutils.commands": {"alias": "setuptools.command.alias:alias", "bdist_egg": "setuptools.command.bdist_egg:bdist_egg", "bdist_rpm": "setuptools.command.bdist_rpm:bdist_rpm", "bdist_wininst": "setuptools.command.bdist_wininst:bdist_wininst", "build_clib": "setuptools.command.build_clib:build_clib", "build_ext": "setuptools.command.build_ext:build_ext", "build_py": "setuptools.command.build_py:build_py", "develop": "setuptools.command.develop:develop", "dist_info": "setuptools.command.dist_info:dist_info", "easy_install": "setuptools.command.easy_install:easy_install", "egg_info": "setuptools.command.egg_info:egg_info", "install": "setuptools.command.install:install", "install_egg_info": "setuptools.command.install_egg_info:install_egg_info", "install_lib": "setuptools.command.install_lib:install_lib", "install_scripts": "setuptools.command.install_scripts:install_scripts", "register": "setuptools.command.register:register", "rotate": "setuptools.command.rotate:rotate", "saveopts": "setuptools.command.saveopts:saveopts", "sdist": "setuptools.command.sdist:sdist", "setopt": "setuptools.command.setopt:setopt", "test": "setuptools.command.test:test", "upload": "setuptools.command.upload:upload", "upload_docs": "setuptools.command.upload_docs:upload_docs"}, "distutils.setup_keywords": {"convert_2to3_doctests": "setuptools.dist:assert_string_list", "dependency_links": "setuptools.dist:assert_string_list", "eager_resources": "setuptools.dist:assert_string_list", "entry_points": "setuptools.dist:check_entry_points", "exclude_package_data": "setuptools.dist:check_package_data", "extras_require": "setuptools.dist:check_extras", "include_package_data": "setuptools.dist:assert_bool", "install_requires": "setuptools.dist:check_requirements", "namespace_packages": "setuptools.dist:check_nsp", "package_data": "setuptools.dist:check_package_data", "packages": "setuptools.dist:check_packages", "python_requires": "setuptools.dist:check_specifier", "setup_requires": "setuptools.dist:check_requirements", "test_loader": "setuptools.dist:check_importable", "test_runner": "setuptools.dist:check_importable", "test_suite": 
"setuptools.dist:check_test_suite", "tests_require": "setuptools.dist:check_requirements", "use_2to3": "setuptools.dist:assert_bool", "use_2to3_exclude_fixers": "setuptools.dist:assert_string_list", "use_2to3_fixers": "setuptools.dist:assert_string_list", "zip_safe": "setuptools.dist:assert_bool"}, "egg_info.writers": {"PKG-INFO": "setuptools.command.egg_info:write_pkg_info", "dependency_links.txt": "setuptools.command.egg_info:overwrite_arg", "depends.txt": "setuptools.command.egg_info:warn_depends_obsolete", "eager_resources.txt": "setuptools.command.egg_info:overwrite_arg", "entry_points.txt": "setuptools.command.egg_info:write_entries", "namespace_packages.txt": "setuptools.command.egg_info:overwrite_arg", "requires.txt": "setuptools.command.egg_info:write_requirements", "top_level.txt": "setuptools.command.egg_info:write_toplevel_names"}, "setuptools.installation": {"eggsecutable": "setuptools.command.easy_install:bootstrap"}}}, "extras": ["certs", "ssl"], "generator": "bdist_wheel (0.30.0)", "keywords": ["CPAN", "PyPI", "distutils", "eggs", "package", "management"], "metadata_version": "2.0", "name": "setuptools", "project_url": "Documentation, https://setuptools.readthedocs.io/", "requires_python": ">=2.7,!=3.0.*,!=3.1.*,!=3.2.*", "run_requires": [{"extra": "certs", "requires": ["certifi (==2016.9.26)"]}, {"environment": "sys_platform=='win32'", "extra": "ssl", "requires": ["wincertstore (==0.2)"]}], "summary": "Easily download, build, install, upgrade, and uninstall Python packages", "version": "39.2.0"}site-packages/setuptools-39.2.0.dist-info/WHEEL000064400000000156147511334560014731 0ustar00Wheel-Version: 1.0
Generator: bdist_wheel (0.30.0)
Root-Is-Purelib: true
Tag: py2-none-any
Tag: py3-none-any

site-packages/setuptools-39.2.0.dist-info/RECORD000064400000032375147511334560015053 0ustar00easy_install.py,sha256=MDC9vt5AxDsXX5qcKlBz2TnW6Tpuv_AobnfhCJ9X3PM,126
pkg_resources/__init__.py,sha256=D6DGFHIzVnG-ByUliqZuw3dkB3ccE6z5jdhDJFap12Y,103822
pkg_resources/py31compat.py,sha256=-ysVqoxLetAnL94uM0kHkomKQTC1JZLN2ZUjqUhMeKE,600
pkg_resources/_vendor/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
pkg_resources/_vendor/appdirs.py,sha256=tgGaL0m4Jo2VeuGfoOOifLv7a7oUEJu2n1vRkqoPw-0,22374
pkg_resources/_vendor/pyparsing.py,sha256=PifeLY3-WhIcBVzLtv0U4T_pwDtPruBhBCkg5vLqa28,229867
pkg_resources/_vendor/six.py,sha256=A6hdJZVjI3t_geebZ9BzUvwRrIXo0lfwzQlM2LcKyas,30098
pkg_resources/_vendor/packaging/__about__.py,sha256=zkcCPTN_6TcLW0Nrlg0176-R1QQ_WVPTm8sz1R4-HjM,720
pkg_resources/_vendor/packaging/__init__.py,sha256=_vNac5TrzwsrzbOFIbF-5cHqc_Y2aPT2D7zrIR06BOo,513
pkg_resources/_vendor/packaging/_compat.py,sha256=Vi_A0rAQeHbU-a9X0tt1yQm9RqkgQbDSxzRw8WlU9kA,860
pkg_resources/_vendor/packaging/_structures.py,sha256=RImECJ4c_wTlaTYYwZYLHEiebDMaAJmK1oPARhw1T5o,1416
pkg_resources/_vendor/packaging/markers.py,sha256=uEcBBtGvzqltgnArqb9c4RrcInXezDLos14zbBHhWJo,8248
pkg_resources/_vendor/packaging/requirements.py,sha256=SikL2UynbsT0qtY9ltqngndha_sfo0w6XGFhAhoSoaQ,4355
pkg_resources/_vendor/packaging/specifiers.py,sha256=SAMRerzO3fK2IkFZCaZkuwZaL_EGqHNOz4pni4vhnN0,28025
pkg_resources/_vendor/packaging/utils.py,sha256=3m6WvPm6NNxE8rkTGmn0r75B_GZSGg7ikafxHsBN1WA,421
pkg_resources/_vendor/packaging/version.py,sha256=OwGnxYfr2ghNzYx59qWIBkrK3SnB6n-Zfd1XaLpnnM0,11556
pkg_resources/extern/__init__.py,sha256=JUtlHHvlxHSNuB4pWqNjcx7n6kG-fwXg7qmJ2zNJlIY,2487
setuptools/__init__.py,sha256=WWIdCbFJnZ9fZoaWDN_x1vDA_Rkm-Sc15iKvPtIYKFs,5700
setuptools/archive_util.py,sha256=kw8Ib_lKjCcnPKNbS7h8HztRVK0d5RacU3r_KRdVnmM,6592
setuptools/build_meta.py,sha256=FllaKTr1vSJyiUeRjVJEZmeEaRzhYueNlimtcwaJba8,5671
setuptools/config.py,sha256=3L9wwF1_uprsyHsUHXXsyLmJUA5HIczJYQ2BFzLWjc0,18006
setuptools/dep_util.py,sha256=fgixvC1R7sH3r13ktyf7N0FALoqEXL1cBarmNpSEoWg,935
setuptools/depends.py,sha256=hC8QIDcM3VDpRXvRVA6OfL9AaQfxvhxHcN_w6sAyNq8,5837
setuptools/dist.py,sha256=1j3kuNEGaaAzWz0iLWItxziNyJTZC8MgcTfMZ4U4Wes,42613
setuptools/extension.py,sha256=uc6nHI-MxwmNCNPbUiBnybSyqhpJqjbhvOQ-emdvt_E,1729
setuptools/glibc.py,sha256=X64VvGPL2AbURKwYRsWJOXXGAYOiF_v2qixeTkAULuU,3146
setuptools/glob.py,sha256=Y-fpv8wdHZzv9DPCaGACpMSBWJ6amq_1e0R_i8_el4w,5207
setuptools/launch.py,sha256=sd7ejwhBocCDx_wG9rIs0OaZ8HtmmFU8ZC6IR_S0Lvg,787
setuptools/lib2to3_ex.py,sha256=t5e12hbR2pi9V4ezWDTB4JM-AISUnGOkmcnYHek3xjg,2013
setuptools/monkey.py,sha256=H_yJ91EtDWu20v5JsEmFeDckiYVMhpE3nMcEdxxd-Ig,5261
setuptools/msvc.py,sha256=8EiV9ypb3EQJQssPcH1HZbdNsbRvqsFnJ7wPFEGwFIo,40877
setuptools/namespaces.py,sha256=F0Nrbv8KCT2OrO7rwa03om4N4GZKAlnce-rr-cgDQa8,3199
setuptools/package_index.py,sha256=6v4e6K62WmAhYAgLOBw1wbMDsnxAW8OIbAKcRVsWB9k,40133
setuptools/pep425tags.py,sha256=I7lxWpy9XKELBJ0CVYiT7OPW0hkBFe0kNpbEBwXV_XQ,10873
setuptools/py27compat.py,sha256=3mwxRMDk5Q5O1rSXOERbQDXhFqwDJhhUitfMW_qpUCo,536
setuptools/py31compat.py,sha256=XuU1HCsGE_3zGvBRIhYw2iB-IhCFK4-Pxw_jMiqdNVk,1192
setuptools/py33compat.py,sha256=NKS84nl4LjLIoad6OQfgmygZn4mMvrok_b1N1tzebew,1182
setuptools/py36compat.py,sha256=VUDWxmu5rt4QHlGTRtAFu6W5jvfL6WBjeDAzeoBy0OM,2891
setuptools/sandbox.py,sha256=9UbwfEL5QY436oMI1LtFWohhoZ-UzwHvGyZjUH_qhkw,14276
setuptools/script (dev).tmpl,sha256=f7MR17dTkzaqkCMSVseyOCMVrPVSMdmTQsaB8cZzfuI,201
setuptools/script.tmpl,sha256=WGTt5piezO27c-Dbx6l5Q4T3Ff20A5z7872hv3aAhYY,138
setuptools/site-patch.py,sha256=BVt6yIrDMXJoflA5J6DJIcsJUfW_XEeVhOzelTTFDP4,2307
setuptools/ssl_support.py,sha256=YBDJsCZjSp62CWjxmSkke9kn9rhHHj25Cus6zhJRW3c,8492
setuptools/unicode_utils.py,sha256=NOiZ_5hD72A6w-4wVj8awHFM3n51Kmw1Ic_vx15XFqw,996
setuptools/version.py,sha256=og_cuZQb0QI6ukKZFfZWPlr1HgJBPPn2vO2m_bI9ZTE,144
setuptools/wheel.py,sha256=2V7-XGD0jRFvEOo3btpl1I7kUwaZIUFmqnAVYeoMwPs,7778
setuptools/windows_support.py,sha256=5GrfqSP2-dLGJoZTq2g6dCKkyQxxa2n5IQiXlJCoYEE,714
setuptools/_vendor/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
setuptools/_vendor/pyparsing.py,sha256=PifeLY3-WhIcBVzLtv0U4T_pwDtPruBhBCkg5vLqa28,229867
setuptools/_vendor/six.py,sha256=A6hdJZVjI3t_geebZ9BzUvwRrIXo0lfwzQlM2LcKyas,30098
setuptools/_vendor/packaging/__about__.py,sha256=zkcCPTN_6TcLW0Nrlg0176-R1QQ_WVPTm8sz1R4-HjM,720
setuptools/_vendor/packaging/__init__.py,sha256=_vNac5TrzwsrzbOFIbF-5cHqc_Y2aPT2D7zrIR06BOo,513
setuptools/_vendor/packaging/_compat.py,sha256=Vi_A0rAQeHbU-a9X0tt1yQm9RqkgQbDSxzRw8WlU9kA,860
setuptools/_vendor/packaging/_structures.py,sha256=RImECJ4c_wTlaTYYwZYLHEiebDMaAJmK1oPARhw1T5o,1416
setuptools/_vendor/packaging/markers.py,sha256=Gvpk9EY20yKaMTiKgQZ8yFEEpodqVgVYtfekoic1Yts,8239
setuptools/_vendor/packaging/requirements.py,sha256=t44M2HVWtr8phIz2OhnILzuGT3rTATaovctV1dpnVIg,4343
setuptools/_vendor/packaging/specifiers.py,sha256=SAMRerzO3fK2IkFZCaZkuwZaL_EGqHNOz4pni4vhnN0,28025
setuptools/_vendor/packaging/utils.py,sha256=3m6WvPm6NNxE8rkTGmn0r75B_GZSGg7ikafxHsBN1WA,421
setuptools/_vendor/packaging/version.py,sha256=OwGnxYfr2ghNzYx59qWIBkrK3SnB6n-Zfd1XaLpnnM0,11556
setuptools/command/__init__.py,sha256=NWzJ0A1BEengZpVeqUyWLNm2bk4P3F4iL5QUErHy7kA,594
setuptools/command/alias.py,sha256=KjpE0sz_SDIHv3fpZcIQK-sCkJz-SrC6Gmug6b9Nkc8,2426
setuptools/command/bdist_egg.py,sha256=RQ9h8BmSVpXKJQST3i_b_sm093Z-aCXbfMBEM2IrI-Q,18185
setuptools/command/bdist_rpm.py,sha256=B7l0TnzCGb-0nLlm6rS00jWLkojASwVmdhW2w5Qz_Ak,1508
setuptools/command/bdist_wininst.py,sha256=_6dz3lpB1tY200LxKPLM7qgwTCceOMgaWFF-jW2-pm0,637
setuptools/command/build_clib.py,sha256=bQ9aBr-5ZSO-9fGsGsDLz0mnnFteHUZnftVLkhvHDq0,4484
setuptools/command/build_ext.py,sha256=PCRAZ2xYnqyEof7EFNtpKYl0sZzT0qdKUNTH3sUdPqk,13173
setuptools/command/build_py.py,sha256=yWyYaaS9F3o9JbIczn064A5g1C5_UiKRDxGaTqYbtLE,9596
setuptools/command/develop.py,sha256=wKbOw2_qUvcDti2lZmtxbDmYb54yAAibExzXIvToz-A,8046
setuptools/command/dist_info.py,sha256=5t6kOfrdgALT-P3ogss6PF9k-Leyesueycuk3dUyZnI,960
setuptools/command/easy_install.py,sha256=y1tcBQpk5YkR4ASnmoIcVfP4FjLxPq_IYM9xfUBHnXQ,87204
setuptools/command/egg_info.py,sha256=3b5Y3t_bl_zZRCkmlGi3igvRze9oOaxd-dVf2w1FBOc,24800
setuptools/command/install.py,sha256=a0EZpL_A866KEdhicTGbuyD_TYl1sykfzdrri-zazT4,4683
setuptools/command/install_egg_info.py,sha256=bMgeIeRiXzQ4DAGPV1328kcjwQjHjOWU4FngAWLV78Q,2203
setuptools/command/install_lib.py,sha256=11mxf0Ch12NsuYwS8PHwXBRvyh671QAM4cTRh7epzG0,3840
setuptools/command/install_scripts.py,sha256=UD0rEZ6861mTYhIdzcsqKnUl8PozocXWl9VBQ1VTWnc,2439
setuptools/command/launcher manifest.xml,sha256=xlLbjWrB01tKC0-hlVkOKkiSPbzMml2eOPtJ_ucCnbE,628
setuptools/command/py36compat.py,sha256=SzjZcOxF7zdFUT47Zv2n7AM3H8koDys_0OpS-n9gIfc,4986
setuptools/command/register.py,sha256=bHlMm1qmBbSdahTOT8w6UhA-EgeQIz7p6cD-qOauaiI,270
setuptools/command/rotate.py,sha256=co5C1EkI7P0GGT6Tqz-T2SIj2LBJTZXYELpmao6d4KQ,2164
setuptools/command/saveopts.py,sha256=za7QCBcQimKKriWcoCcbhxPjUz30gSB74zuTL47xpP4,658
setuptools/command/sdist.py,sha256=obDTe2BmWt2PlnFPZZh7e0LWvemEsbCCO9MzhrTZjm8,6711
setuptools/command/setopt.py,sha256=NTWDyx-gjDF-txf4dO577s7LOzHVoKR0Mq33rFxaRr8,5085
setuptools/command/test.py,sha256=MeBAcXUePGjPKqjz4zvTrHatLvNsjlPFcagt3XnFYdk,9214
setuptools/command/upload.py,sha256=i1gfItZ3nQOn5FKXb8tLC2Kd7eKC8lWO4bdE6NqGpE4,1172
setuptools/command/upload_docs.py,sha256=oXiGplM_cUKLwE4CWWw98RzCufAu8tBhMC97GegFcms,7311
setuptools/extern/__init__.py,sha256=2eKMsBMwsZqolIcYBtLZU3t96s6xSTP4PTaNfM5P-I0,2499
setuptools-39.2.0.dist-info/DESCRIPTION.rst,sha256=mOsk4uH4ma3S7RjGk8m4kEMQssSAZxI8Dj1Z8EZHkxA,1547
setuptools-39.2.0.dist-info/LICENSE.txt,sha256=wyo6w5WvYyHv0ovnPQagDw22q4h9HCHU_sRhKNIFbVo,1078
setuptools-39.2.0.dist-info/METADATA,sha256=gHeHu4S3oYwMrgCuSC88ChBAIaZOkzUWzKpg9Xomdro,3028
setuptools-39.2.0.dist-info/RECORD,,
setuptools-39.2.0.dist-info/WHEEL,sha256=kdsN-5OJAZIiHN-iO4Rhl82KyS0bDWf4uBwMbkNafr8,110
setuptools-39.2.0.dist-info/dependency_links.txt,sha256=HlkCFkoK5TbZ5EMLbLKYhLcY_E31kBWD8TqW2EgmatQ,239
setuptools-39.2.0.dist-info/entry_points.txt,sha256=jBqCYDlVjl__sjYFGXo1JQGIMAYFJE-prYWUtnMZEew,2990
setuptools-39.2.0.dist-info/metadata.json,sha256=20034ySWEQzNtb_Cyg6EYzoZfpYjyHsKleWelH--ROo,4969
setuptools-39.2.0.dist-info/top_level.txt,sha256=2HUXVVwA4Pff1xgTFr3GsTXXKaPaO6vlG6oNJ_4u4Tg,38
setuptools-39.2.0.dist-info/zip-safe,sha256=AbpHGcgLb-kRsJGnwFEktk7uzpZOCcBY74-YBdrKVGs,1
../../../bin/easy_install,sha256=Gezk0fVCHeU4cjSAPY5QeEYRqYmWoPAKE8Edpwhp15U,246
../../../bin/easy_install-3.6,sha256=Gezk0fVCHeU4cjSAPY5QeEYRqYmWoPAKE8Edpwhp15U,246
setuptools-39.2.0.dist-info/INSTALLER,sha256=zuuue4knoyJ-UwPPXg8fezS7VCrXJQrAP7zeNuwvFQg,4
pkg_resources/extern/__pycache__/__init__.cpython-36.pyc,,
pkg_resources/_vendor/packaging/__pycache__/markers.cpython-36.pyc,,
pkg_resources/_vendor/packaging/__pycache__/_structures.cpython-36.pyc,,
pkg_resources/_vendor/packaging/__pycache__/__init__.cpython-36.pyc,,
pkg_resources/_vendor/packaging/__pycache__/__about__.cpython-36.pyc,,
pkg_resources/_vendor/packaging/__pycache__/utils.cpython-36.pyc,,
pkg_resources/_vendor/packaging/__pycache__/version.cpython-36.pyc,,
pkg_resources/_vendor/packaging/__pycache__/requirements.cpython-36.pyc,,
pkg_resources/_vendor/packaging/__pycache__/_compat.cpython-36.pyc,,
pkg_resources/_vendor/packaging/__pycache__/specifiers.cpython-36.pyc,,
pkg_resources/_vendor/__pycache__/__init__.cpython-36.pyc,,
pkg_resources/_vendor/__pycache__/pyparsing.cpython-36.pyc,,
pkg_resources/_vendor/__pycache__/appdirs.cpython-36.pyc,,
pkg_resources/_vendor/__pycache__/six.cpython-36.pyc,,
pkg_resources/__pycache__/__init__.cpython-36.pyc,,
pkg_resources/__pycache__/py31compat.cpython-36.pyc,,
__pycache__/easy_install.cpython-36.pyc,,
setuptools/extern/__pycache__/__init__.cpython-36.pyc,,
setuptools/_vendor/packaging/__pycache__/markers.cpython-36.pyc,,
setuptools/_vendor/packaging/__pycache__/_structures.cpython-36.pyc,,
setuptools/_vendor/packaging/__pycache__/__init__.cpython-36.pyc,,
setuptools/_vendor/packaging/__pycache__/__about__.cpython-36.pyc,,
setuptools/_vendor/packaging/__pycache__/utils.cpython-36.pyc,,
setuptools/_vendor/packaging/__pycache__/version.cpython-36.pyc,,
setuptools/_vendor/packaging/__pycache__/requirements.cpython-36.pyc,,
setuptools/_vendor/packaging/__pycache__/_compat.cpython-36.pyc,,
setuptools/_vendor/packaging/__pycache__/specifiers.cpython-36.pyc,,
setuptools/_vendor/__pycache__/__init__.cpython-36.pyc,,
setuptools/_vendor/__pycache__/pyparsing.cpython-36.pyc,,
setuptools/_vendor/__pycache__/six.cpython-36.pyc,,
setuptools/__pycache__/pep425tags.cpython-36.pyc,,
setuptools/__pycache__/dep_util.cpython-36.pyc,,
setuptools/__pycache__/py27compat.cpython-36.pyc,,
setuptools/__pycache__/build_meta.cpython-36.pyc,,
setuptools/__pycache__/monkey.cpython-36.pyc,,
setuptools/__pycache__/__init__.cpython-36.pyc,,
setuptools/__pycache__/extension.cpython-36.pyc,,
setuptools/__pycache__/package_index.cpython-36.pyc,,
setuptools/__pycache__/glibc.cpython-36.pyc,,
setuptools/__pycache__/depends.cpython-36.pyc,,
setuptools/__pycache__/lib2to3_ex.cpython-36.pyc,,
setuptools/__pycache__/namespaces.cpython-36.pyc,,
setuptools/__pycache__/dist.cpython-36.pyc,,
setuptools/__pycache__/unicode_utils.cpython-36.pyc,,
setuptools/__pycache__/py33compat.cpython-36.pyc,,
setuptools/__pycache__/msvc.cpython-36.pyc,,
setuptools/__pycache__/windows_support.cpython-36.pyc,,
setuptools/__pycache__/site-patch.cpython-36.pyc,,
setuptools/__pycache__/glob.cpython-36.pyc,,
setuptools/__pycache__/sandbox.cpython-36.pyc,,
setuptools/__pycache__/version.cpython-36.pyc,,
setuptools/__pycache__/py31compat.cpython-36.pyc,,
setuptools/__pycache__/archive_util.cpython-36.pyc,,
setuptools/__pycache__/ssl_support.cpython-36.pyc,,
setuptools/__pycache__/config.cpython-36.pyc,,
setuptools/__pycache__/wheel.cpython-36.pyc,,
setuptools/__pycache__/launch.cpython-36.pyc,,
setuptools/__pycache__/py36compat.cpython-36.pyc,,
setuptools/command/__pycache__/install_lib.cpython-36.pyc,,
setuptools/command/__pycache__/saveopts.cpython-36.pyc,,
setuptools/command/__pycache__/install_scripts.cpython-36.pyc,,
setuptools/command/__pycache__/setopt.cpython-36.pyc,,
setuptools/command/__pycache__/install.cpython-36.pyc,,
setuptools/command/__pycache__/bdist_wininst.cpython-36.pyc,,
setuptools/command/__pycache__/install_egg_info.cpython-36.pyc,,
setuptools/command/__pycache__/__init__.cpython-36.pyc,,
setuptools/command/__pycache__/build_py.cpython-36.pyc,,
setuptools/command/__pycache__/bdist_egg.cpython-36.pyc,,
setuptools/command/__pycache__/upload_docs.cpython-36.pyc,,
setuptools/command/__pycache__/bdist_rpm.cpython-36.pyc,,
setuptools/command/__pycache__/dist_info.cpython-36.pyc,,
setuptools/command/__pycache__/build_ext.cpython-36.pyc,,
setuptools/command/__pycache__/upload.cpython-36.pyc,,
setuptools/command/__pycache__/easy_install.cpython-36.pyc,,
setuptools/command/__pycache__/build_clib.cpython-36.pyc,,
setuptools/command/__pycache__/test.cpython-36.pyc,,
setuptools/command/__pycache__/rotate.cpython-36.pyc,,
setuptools/command/__pycache__/sdist.cpython-36.pyc,,
setuptools/command/__pycache__/alias.cpython-36.pyc,,
setuptools/command/__pycache__/register.cpython-36.pyc,,
setuptools/command/__pycache__/develop.cpython-36.pyc,,
setuptools/command/__pycache__/egg_info.cpython-36.pyc,,
setuptools/command/__pycache__/py36compat.cpython-36.pyc,,
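Each populated row in the RECORD file above has the form "path,sha256=<digest>,size", where the digest is an unpadded urlsafe-base64 SHA-256 of the installed file. A minimal verification sketch (a hypothetical helper, not part of the wheel metadata):

import base64
import hashlib

def record_digest(path):
    # Hash the installed file and encode it the way RECORD expects:
    # urlsafe base64 with the trailing '=' padding stripped.
    digest = hashlib.sha256(open(path, "rb").read()).digest()
    return "sha256=" + base64.urlsafe_b64encode(digest).rstrip(b"=").decode("ascii")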
site-packages/procfs/__init__.py000075500000000620147511334560012703 0ustar00#! /usr/bin/python3
# -*- python -*-
# -*- coding: utf-8 -*-
# SPDX-License-Identifier: GPL-2.0-only
#
# Copyright (C) 2008, 2009  Red Hat, Inc.
#
"""
Copyright (c) 2008, 2009  Red Hat Inc.

Abstractions to extract information from the Linux kernel /proc files.
"""
__author__ = "Arnaldo Carvalho de Melo <acme@redhat.com>"
__license__ = "GPLv2 License"

from .procfs import *
from .utilist import *
site-packages/procfs/utilist.py000075500000001567147511334560012654 0ustar00#! /usr/bin/python3
# -*- python -*-
# -*- coding: utf-8 -*-
# SPDX-License-Identifier: GPL-2.0-only
#
# Copyright (C) 2007 Red Hat, Inc.
#

from six.moves import range


def hexbitmask(l, nr_entries):
    hexbitmask = []
    bit = 0
    mask = 0
    for entry in range(nr_entries):
        if entry in l:
            mask |= (1 << bit)
        bit += 1
        if bit == 32:
            bit = 0
            hexbitmask.insert(0, mask)
            mask = 0

    if bit < 32 and mask != 0:
        hexbitmask.insert(0, mask)

    return hexbitmask

def bitmasklist(line, nr_entries):
    hexmask = line.strip().replace(",", "")
    bitmasklist = []
    entry = 0
    bitmask = bin(int(hexmask, 16))[2::]
    for i in reversed(bitmask):
        if int(i) & 1:
            bitmasklist.append(entry)
        entry += 1
        if entry == nr_entries:
            break
    return bitmasklist
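

# --- Illustrative sketch (added by the editor, not part of the upstream module) ---
# Round-trips a CPU list through the helpers above: hexbitmask() packs a list
# of bit indices into 32-bit words, and bitmasklist() parses the hex mask
# format used by files such as /proc/irq/*/smp_affinity back into indices.
def _example_roundtrip(cpus=(0, 2, 3), nr_cpus=8):
    words = hexbitmask(cpus, nr_cpus)           # [0xd] for CPUs 0, 2 and 3
    line = ",".join("%08x" % w for w in words)  # "0000000d", smp_affinity layout
    return bitmasklist(line, nr_cpus)           # -> [0, 2, 3]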
site-packages/procfs/procfs.py000075500000106712147511334560012451 0ustar00#!/usr/bin/python3
# -*- python -*-
# -*- coding: utf-8 -*-
# SPDX-License-Identifier: GPL-2.0-only
#
# Copyright (C) 2007-2015 Red Hat, Inc.
#

import os
import platform
import re
import time
from functools import reduce
from six.moves import range
from procfs.utilist import bitmasklist

VERSION = "0.7.3"


def is_s390():
    """ Return True if running on s390 or s390x """
    machine = platform.machine()
    return bool(re.search('s390', machine))


def process_cmdline(pid_info):
    """
    Returns the process command line, if available in the given `process'
    instance; if it is not available, falls back to the comm (short process
    name) found in its pidstat key.
    """
    if pid_info["cmdline"]:
        return reduce(lambda a, b: a + " %s" % b, pid_info["cmdline"]).strip()

    # If the pid disappears before we query it, return None
    try:
        return pid_info["stat"]["comm"]
    except:
        return None


class pidstat:
    """
    Provides a dictionary to access the fields in the
    per process /proc/PID/stat files.

    One can obtain the available fields by asking for the keys of the
    dictionary, e.g.:

        >>> p = procfs.pidstat(1)
        >>> print p.keys()
        ['majflt', 'rss', 'cnswap', 'cstime', 'pid', 'session', 'startstack', 'startcode', 'cmajflt', 'blocked', 'exit_signal', 'minflt', 'nswap', 'environ', 'priority', 'state', 'delayacct_blkio_ticks', 'policy', 'rt_priority', 'ppid', 'nice', 'cutime', 'endcode', 'wchan', 'num_threads', 'sigcatch', 'comm', 'stime', 'sigignore', 'tty_nr', 'kstkeip', 'utime', 'tpgid', 'itrealvalue', 'kstkesp', 'rlim', 'signal', 'pgrp', 'flags', 'starttime', 'cminflt', 'vsize', 'processor']

       And then access the various process properties using it as a dictionary:

        >>> print p['comm']
        systemd
        >>> print p['priority']
        20
        >>> print p['state']
        S

       Please refer to the 'procfs(5)' man page, by using:

        $ man 5 procfs

       to see information about each of the above fields; the man page is
       part of the 'man-pages' RPM package.
    """

    # For entries that share the same value, the one with a trailing comment
    # is the more recent name; it replaced the other around the v4.1-rc kernels.

    PF_ALIGNWARN = 0x00000001
    PF_STARTING = 0x00000002
    PF_EXITING = 0x00000004
    PF_EXITPIDONE = 0x00000008
    PF_VCPU = 0x00000010
    PF_WQ_WORKER = 0x00000020  # /* I'm a workqueue worker */
    PF_FORKNOEXEC = 0x00000040
    PF_MCE_PROCESS = 0x00000080  # /* process policy on mce errors */
    PF_SUPERPRIV = 0x00000100
    PF_DUMPCORE = 0x00000200
    PF_SIGNALED = 0x00000400
    PF_MEMALLOC = 0x00000800
    # /* set_user noticed that RLIMIT_NPROC was exceeded */
    PF_NPROC_EXCEEDED = 0x00001000
    PF_FLUSHER = 0x00001000
    PF_USED_MATH = 0x00002000
    PF_USED_ASYNC = 0x00004000  # /* used async_schedule*(), used by module init */
    PF_NOFREEZE = 0x00008000
    PF_FROZEN = 0x00010000
    PF_FSTRANS = 0x00020000
    PF_KSWAPD = 0x00040000
    PF_MEMALLOC_NOIO = 0x00080000  # /* Allocating memory without IO involved */
    PF_SWAPOFF = 0x00080000
    PF_LESS_THROTTLE = 0x00100000
    PF_KTHREAD = 0x00200000
    PF_RANDOMIZE = 0x00400000
    PF_SWAPWRITE = 0x00800000
    PF_SPREAD_PAGE = 0x01000000
    PF_SPREAD_SLAB = 0x02000000
    PF_THREAD_BOUND = 0x04000000
    # /* Userland is not allowed to meddle with cpus_allowed */
    PF_NO_SETAFFINITY = 0x04000000
    PF_MCE_EARLY = 0x08000000  # /* Early kill for mce process policy */
    PF_MEMPOLICY = 0x10000000
    PF_MUTEX_TESTER = 0x20000000
    PF_FREEZER_SKIP = 0x40000000
    PF_FREEZER_NOSIG = 0x80000000
    # /* this thread called freeze_processes and should not be frozen */
    PF_SUSPEND_TASK = 0x80000000

    proc_stat_fields = ["pid", "comm", "state", "ppid", "pgrp", "session",
                        "tty_nr", "tpgid", "flags", "minflt", "cminflt",
                        "majflt", "cmajflt", "utime", "stime", "cutime",
                        "cstime", "priority", "nice", "num_threads",
                        "itrealvalue", "starttime", "vsize", "rss",
                        "rlim", "startcode", "endcode", "startstack",
                        "kstkesp", "kstkeip", "signal", "blocked",
                        "sigignore", "sigcatch", "wchan", "nswap",
                        "cnswap", "exit_signal", "processor",
                        "rt_priority", "policy",
                        "delayacct_blkio_ticks", "environ"]

    def __init__(self, pid, basedir="/proc"):
        self.pid = pid
        try:
            self.load(basedir)
        except FileNotFoundError:
            # The file representing the pid has disappeared
            #  propagate the error to the user to handle
            raise

    def __getitem__(self, fieldname):
        return self.fields[fieldname]

    def keys(self):
        return list(self.fields.keys())

    def values(self):
        return list(self.fields.values())

    def has_key(self, fieldname):
        return fieldname in self.fields

    def items(self):
        return self.fields

    def __contains__(self, fieldname):
        return fieldname in self.fields

    def load(self, basedir="/proc"):
        try:
            f = open(f"{basedir}/{self.pid}/stat")
        except FileNotFoundError:
            # The pid has disappeared, propagate the error
            raise
        fields = f.readline().strip().split(') ')
        f.close()
        fields = fields[0].split(' (') + fields[1].split()
        self.fields = {}
        nr_fields = min(len(fields), len(self.proc_stat_fields))
        for i in range(nr_fields):
            attrname = self.proc_stat_fields[i]
            value = fields[i]
            if attrname == "comm":
                self.fields["comm"] = value.strip('()')
            else:
                try:
                    self.fields[attrname] = int(value)
                except:
                    self.fields[attrname] = value

    def is_bound_to_cpu(self):
        """
        Returns true if this process has a fixed smp affinity mask,
                not allowing it to be moved to a different set of CPUs.
        """
        return bool(self.fields["flags"] & self.PF_THREAD_BOUND)

    def process_flags(self):
        """
        Returns a list with all the process flags known, details depend
        on kernel version, declared in the file include/linux/sched.h in
        the kernel sources.

        As of v4.2-rc7 these include (from include/linux/sched.h comments):

            PF_EXITING        Getting shut down
            PF_EXITPIDONE     Pi exit done on shut down
            PF_VCPU           I'm a virtual CPU
            PF_WQ_WORKER      I'm a workqueue worker
            PF_FORKNOEXEC     Forked but didn't exec
            PF_MCE_PROCESS    Process policy on mce errors
            PF_SUPERPRIV      Used super-user privileges
            PF_DUMPCORE       Dumped core
            PF_SIGNALED       Killed by a signal
            PF_MEMALLOC       Allocating memory
            PF_NPROC_EXCEEDED Set_user noticed that RLIMIT_NPROC was exceeded
            PF_USED_MATH      If unset the fpu must be initialized before use
            PF_USED_ASYNC     Used async_schedule*(), used by module init
            PF_NOFREEZE       This thread should not be frozen
            PF_FROZEN          Frozen for system suspend
            PF_FSTRANS         Inside a filesystem transaction
            PF_KSWAPD          I am kswapd
            PF_MEMALLOC_NOIO   Allocating memory without IO involved
            PF_LESS_THROTTLE   Throttle me less: I clean memory
            PF_KTHREAD         I am a kernel thread
            PF_RANDOMIZE       Randomize virtual address space
            PF_SWAPWRITE       Allowed to write to swap
            PF_NO_SETAFFINITY  Userland is not allowed to meddle with cpus_allowed
            PF_MCE_EARLY       Early kill for mce process policy
            PF_MUTEX_TESTER    Thread belongs to the rt mutex tester
            PF_FREEZER_SKIP    Freezer should not count it as freezable
            PF_SUSPEND_TASK    This thread called freeze_processes and
                               should not be frozen

        """
        sflags = []
        for attr in dir(self):
            if attr[:3] != "PF_":
                continue
            value = getattr(self, attr)
            if value & self.fields["flags"]:
                sflags.append(attr)

        return sflags
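

# --- Illustrative sketch (added by the editor, not part of the upstream module) ---
# Typical use of the pidstat wrapper above on a live Linux system: read
# /proc/1/stat, print a few fields and decode the kernel per-process flags.
# The helper name is hypothetical.
def _example_pid1_stat():
    p = pidstat(1)
    print(p["comm"], p["state"], p["priority"])
    print(p.process_flags())      # e.g. ['PF_EXITPIDONE', 'PF_USED_MATH', ...]
    print(p.is_bound_to_cpu())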


def cannot_set_affinity(self, pid):
    PF_NO_SETAFFINITY = 0x04000000
    try:
        return bool(int(self.processes[pid]["stat"]["flags"]) &
                    PF_NO_SETAFFINITY)
    except:
        return True


def cannot_set_thread_affinity(self, pid, tid):
    PF_NO_SETAFFINITY = 0x04000000
    try:
        return bool(int(self.processes[pid].threads[tid]["stat"]["flags"]) &
                    PF_NO_SETAFFINITY)
    except:
        return True


class pidstatus:
    """
    Provides a dictionary to access the fields
    in the per process /proc/PID/status files.
    This provides additional information about processes and threads to
    what can be obtained with the procfs.pidstat() class.

    One can obtain the available fields by asking for the keys of the
    dictionary, e.g.:

        >>> import procfs
        >>> p = procfs.pidstatus(1)
        >>> print p.keys()
        ['VmExe', 'CapBnd', 'NSpgid', 'Tgid', 'NSpid', 'VmSize', 'VmPMD', 'ShdPnd', 'State', 'Gid', 'nonvoluntary_ctxt_switches', 'SigIgn', 'VmStk', 'VmData', 'SigCgt', 'CapEff', 'VmPTE', 'Groups', 'NStgid', 'Threads', 'PPid', 'VmHWM', 'NSsid', 'VmSwap', 'Name', 'SigBlk', 'Mems_allowed_list', 'VmPeak', 'Ngid', 'VmLck', 'SigQ', 'VmPin', 'Mems_allowed', 'CapPrm', 'Seccomp', 'VmLib', 'Cpus_allowed', 'Uid', 'SigPnd', 'Pid', 'Cpus_allowed_list', 'TracerPid', 'CapInh', 'voluntary_ctxt_switches', 'VmRSS', 'FDSize']
        >>> print p["Pid"]
        1
        >>> print p["Threads"]
        1
        >>> print p["VmExe"]
        1248 kB
        >>> print p["Cpus_allowed"]
        f
        >>> print p["SigQ"]
        0/30698
        >>> print p["VmPeak"]
        320300 kB
        >>>

    Please refer to the 'procfs(5)' man page, by using:

        $ man 5 procfs

    to see information about each of the above fields; the man page is part
    of the 'man-pages' RPM package.

    In the man page there will be references to further documentation, like
        referring to the "getrlimit(2)" man page when explaining the "SigQ"
        line/field.
    """

    def __init__(self, pid, basedir="/proc"):
        self.pid = pid
        self.load(basedir)

    def __getitem__(self, fieldname):
        return self.fields[fieldname]

    def keys(self):
        return list(self.fields.keys())

    def values(self):
        return list(self.fields.values())

    def has_key(self, fieldname):
        return fieldname in self.fields

    def items(self):
        return self.fields

    def __contains__(self, fieldname):
        return fieldname in self.fields

    def load(self, basedir="/proc"):
        self.fields = {}
        with open(f"{basedir}/{self.pid}/status") as f:
            for line in f.readlines():
                fields = line.split(":")
                if len(fields) != 2:
                    continue
                name = fields[0]
                value = fields[1].strip()
                try:
                    self.fields[name] = int(value)
                except:
                    self.fields[name] = value
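

# --- Illustrative sketch (added by the editor, not part of the upstream module) ---
# Typical use of pidstatus above: dictionary-style access to /proc/PID/status
# fields on a live Linux system. The helper name is hypothetical.
def _example_pid1_status():
    s = pidstatus(1)
    print(s["Name"], s["Threads"])
    print(s["VmRSS"])             # e.g. "12345 kB"; string fields keep their units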


class process:
    """
    Information about a process with a given pid, provides a dictionary with
    two entries, instances of different wrappers for /proc/ process related
    meta files: "stat" and "status", see the documentation for procfs.pidstat
    and procfs.pidstatus for further info about those classes.
    """

    def __init__(self, pid, basedir="/proc"):
        self.pid = pid
        self.basedir = basedir

    def __getitem__(self, attr):
        if not hasattr(self, attr):
            if attr in ("stat", "status"):
                if attr == "stat":
                    sclass = pidstat
                else:
                    sclass = pidstatus

                try:
                    setattr(self, attr, sclass(self.pid, self.basedir))
                except FileNotFoundError:
                    # The pid has disappeared, propagate the error
                    raise
            elif attr == "cmdline":
                self.load_cmdline()
            elif attr == "threads":
                self.load_threads()
            elif attr == "cgroups":
                self.load_cgroups()
            elif attr == "environ":
                self.load_environ()

        return getattr(self, attr)

    def has_key(self, attr):
        return hasattr(self, attr)

    def __contains__(self, attr):
        return hasattr(self, attr)

    def load_cmdline(self):
        try:
            with open(f"/proc/{self.pid}/cmdline") as f:
                self.cmdline = f.readline().strip().split('\0')[:-1]
        except FileNotFoundError:
            # This can happen when a pid disappears
            self.cmdline = None
        except UnicodeDecodeError:
            # TODO - this shouldn't happen, needs to be investigated
            self.cmdline = None

    def load_threads(self):
        self.threads = pidstats(f"/proc/{self.pid}/task/")
        # remove thread leader
        del self.threads[self.pid]

    def load_cgroups(self):
        self.cgroups = ""
        with open(f"/proc/{self.pid}/cgroup") as f:
            for line in reversed(f.readlines()):
                if len(self.cgroups) != 0:
                    self.cgroups = self.cgroups + "," + line[:-1]
                else:
                    self.cgroups = line[:-1]

    def load_environ(self):
        """
        Loads the environment variables for this process. The entries then
        become available via the 'environ' member, or via the 'environ'
        dict key when accessing as p["environ"].

        E.g.:


        >>> all_processes = procfs.pidstats()
        >>> firefox_pid = all_processes.find_by_name("firefox")
        >>> firefox_process = all_processes[firefox_pid[0]]
        >>> print firefox_process["environ"]["PWD"]
        /home/acme
        >>> print len(firefox_process.environ.keys())
        66
        >>> print firefox_process["environ"]["SHELL"]
        /bin/bash
        >>> print firefox_process["environ"]["USERNAME"]
        acme
        >>> print firefox_process["environ"]["HOME"]
        /home/acme
        >>> print firefox_process["environ"]["MAIL"]
        /var/spool/mail/acme
        >>>
        """
        self.environ = {}
        with open(f"/proc/{self.pid}/environ") as f:
            for x in f.readline().split('\0'):
                if len(x) > 0:
                    # split only on the first '=' so values containing '=' are kept
                    y = x.split('=', 1)
                    self.environ[y[0]] = y[1]


class pidstats:
    """
    Provides access to all the processes in the system, to get a picture of
    how many processes there are at any given moment.

    The entries can be accessed as a dictionary, keyed by pid. Also there are
    methods to find processes that match a given COMM or regular expression.
    """

    def __init__(self, basedir="/proc"):
        self.basedir = basedir
        self.processes = {}
        self.reload()

    def __getitem__(self, key):
        return self.processes[key]

    def __delitem__(self, key):
        # not clear on why this can fail, but it can
        try:
            del self.processes[key]
        except:
            pass

    def keys(self):
        return list(self.processes.keys())

    def values(self):
        return list(self.processes.values())

    def has_key(self, key):
        return key in self.processes

    def items(self):
        return self.processes

    def __contains__(self, key):
        return key in self.processes

    def reload(self):
        """
        This operation will throw away the current dictionary contents,
        if any, and read all the pid files from /proc/, instantiating a
        'process' instance for each of them.

        This is a high overhead operation, and should be avoided if the
        perf python binding can be used to detect when new threads appear
        and existing ones terminate.

        In RHEL it is found in the python-perf rpm package.

        More information about the perf facilities can be found in the
        'perf_event_open' man page.
        """
        del self.processes
        self.processes = {}
        pids = os.listdir(self.basedir)
        for spid in pids:
            try:
                pid = int(spid)
            except:
                continue

            self.processes[pid] = process(pid, self.basedir)

    def reload_threads(self):
        to_remove = []
        for pid in list(self.processes.keys()):
            try:
                self.processes[pid].load_threads()
            except OSError:
                # process vanished, remove it
                to_remove.append(pid)
        for pid in to_remove:
            del self.processes[pid]

    def find_by_name(self, name):
        name = name[:15]
        pids = []
        for pid in list(self.processes.keys()):
            try:
                if name == self.processes[pid]["stat"]["comm"]:
                    pids.append(pid)
            except IOError:
                # We're doing lazy loading of /proc files,
                # so getting this exception means the
                # process vanished; remove it
                del self.processes[pid]

        return pids

    def find_by_regex(self, regex):
        pids = []
        for pid in list(self.processes.keys()):
            try:
                if regex.match(self.processes[pid]["stat"]["comm"]):
                    pids.append(pid)
            except IOError:
                # We're doing lazy loading of /proc files,
                # so getting this exception means the
                # process vanished; remove it
                del self.processes[pid]
        return pids

    def find_by_cmdline_regex(self, regex):
        pids = []
        for pid in list(self.processes.keys()):
            try:
                if regex.match(process_cmdline(self.processes[pid])):
                    pids.append(pid)
            except IOError:
                # We're doing lazy loading of /proc files,
                # so getting this exception means the
                # process vanished; remove it
                del self.processes[pid]
        return pids

    def get_per_cpu_rtprios(self, basename):
        cpu = 0
        priorities = ""
        processed_pids = []
        while True:
            name = f"{basename}/{cpu}"
            pids = self.find_by_name(name)
            if not pids or len([n for n in pids if n not in processed_pids]) == 0:
                break
            for pid in pids:
                try:
                    priorities += f'{self.processes[pid]["stat"]["rt_priority"]},'
                except IOError:
                    # We're doing lazy loading of /proc files,
                    # so getting this exception means the
                    # process vanished; remove it
                    del self.processes[pid]
            processed_pids += pids
            cpu += 1

        priorities = priorities.strip(',')
        return priorities

    def get_rtprios(self, name):
        cpu = 0
        priorities = ""
        processed_pids = []
        while True:
            pids = self.find_by_name(name)
            if not pids or len([n for n in pids if n not in processed_pids]) == 0:
                break
            for pid in pids:
                try:
                    priorities += f'{self.processes[pid]["stat"]["rt_priority"]},'
                except IOError:
                    # We're doing lazy loading of /proc files,
                    # so getting this exception means the
                    # process vanished; remove it
                    del self.processes[pid]
            processed_pids += pids
            cpu += 1

        priorities = priorities.strip(',')
        return priorities

    def is_bound_to_cpu(self, pid):
        """
        Checks if a given pid can't have its SMP affinity mask changed.
        """
        return self.processes[pid]["stat"].is_bound_to_cpu()
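

# --- Illustrative sketch (added by the editor, not part of the upstream module) ---
# Scans every process once and looks some up by COMM name and by regex,
# mirroring the class docstrings above in Python 3 syntax. The helper name
# is hypothetical.
def _example_find_processes():
    ps = pidstats()
    for pid in ps.find_by_name("sshd"):
        print(pid, process_cmdline(ps[pid]))
    return ps.find_by_regex(re.compile("kworker.*"))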


class interrupts:
    """
    Information about IRQs in the system. A dictionary keyed by IRQ number
    will have as its value another dictionary with "cpu", "type" and "users"
    keys, with the SMP affinity mask, type of IRQ and the drivers associated
    with each interrupt.

    The information comes from the /proc/interrupts file, documented in
    'man procfs(5)'. For instance, the 'cpu' entry is a list with one element
    per CPU present in the system, each value being the number of interrupts
    that were handled on that CPU.

    E.g.:

    >>> import procfs
    >>> interrupts = procfs.interrupts()
    >>> thunderbolt_irq = interrupts.find_by_user("thunderbolt")
    >>> print thunderbolt_irq
    34
    >>> thunderbolt = interrupts[thunderbolt_irq]
    >>> print thunderbolt
    {'affinity': [0, 1, 2, 3], 'type': 'PCI-MSI', 'cpu': [3495, 0, 81, 0], 'users': ['thunderbolt']}
    >>>
    """

    def __init__(self):
        self.interrupts = {}
        self.reload()

    def __getitem__(self, key):
        return self.interrupts[str(key)]

    def keys(self):
        return list(self.interrupts.keys())

    def values(self):
        return list(self.interrupts.values())

    def has_key(self, key):
        return str(key) in self.interrupts

    def items(self):
        return self.interrupts

    def __contains__(self, key):
        return str(key) in self.interrupts

    def reload(self):
        del self.interrupts
        self.interrupts = {}
        with open("/proc/interrupts") as f:
            for line in f.readlines():
                line = line.strip()
                fields = line.split()
                if fields[0][:3] == "CPU":
                    self.nr_cpus = len(fields)
                    continue
                irq = fields[0].strip(":")
                self.interrupts[irq] = {}
                self.interrupts[irq] = self.parse_entry(fields[1:], line)
                try:
                    nirq = int(irq)
                except:
                    continue
                self.interrupts[irq]["affinity"] = self.parse_affinity(nirq)

    def parse_entry(self, fields, line):
        dict = {}
        dict["cpu"] = []
        dict["cpu"].append(int(fields[0]))
        nr_fields = len(fields)
        if nr_fields >= self.nr_cpus:
            dict["cpu"] += [int(i) for i in fields[1:self.nr_cpus]]
            if nr_fields > self.nr_cpus:
                dict["type"] = fields[self.nr_cpus]
                # check whether there are any users (some interrupts, e.g. 3 and 4, have none)
                if nr_fields > self.nr_cpus + 1:
                    dict["users"] = [a.strip()
                                     for a in fields[nr_fields - 1].split(',')]
                else:
                    dict["users"] = []
        return dict

    def parse_affinity(self, irq):
        try:
            with open(f"/proc/irq/{irq}/smp_affinity") as f:
                line = f.readline()
            return bitmasklist(line, self.nr_cpus)
        except IOError:
            return [0, ]

    def find_by_user(self, user):
        """
        Looks up an interrupt number by the name of one of its users.

        E.g.:

        >>> import procfs
        >>> interrupts = procfs.interrupts()
        >>> thunderbolt_irq = interrupts.find_by_user("thunderbolt")
        >>> print thunderbolt_irq
        34
        >>> thunderbolt = interrupts[thunderbolt_irq]
        >>> print thunderbolt
        {'affinity': [0, 1, 2, 3], 'type': 'PCI-MSI', 'cpu': [3495, 0, 81, 0], 'users': ['thunderbolt']}
        >>>
        """
        for i in list(self.interrupts.keys()):
            if "users" in self.interrupts[i] and \
               user in self.interrupts[i]["users"]:
                return i
        return None

    def find_by_user_regex(self, regex):
        """
        Looks up interrupt numbers by a regex that matches the names of their users.

        E.g.:

        >>> import procfs
        >>> import re
        >>> interrupts = procfs.interrupts()
        >>> usb_controllers = interrupts.find_by_user_regex(re.compile(".*hcd"))
        >>> print usb_controllers
        ['22', '23', '31']
        >>> print [ interrupts[irq]["users"] for irq in usb_controllers ]
        [['ehci_hcd:usb4'], ['ehci_hcd:usb3'], ['xhci_hcd']]
        >>>
        """
        irqs = []
        for i in list(self.interrupts.keys()):
            if "users" not in self.interrupts[i]:
                continue
            for user in self.interrupts[i]["users"]:
                if regex.match(user):
                    irqs.append(i)
                    break
        return irqs
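

# --- Illustrative sketch (added by the editor, not part of the upstream module) ---
# Python 3 rendition of the doctest-style examples in the interrupts class
# above: print type, users and SMP affinity for every IRQ that has users.
# The helper name is hypothetical.
def _example_irq_summary():
    irqs = interrupts()
    for irq in irqs.keys():
        entry = irqs[irq]
        if entry.get("users"):
            print(irq, entry["type"], entry["users"], entry.get("affinity"))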


class cmdline:
    """
    Parses the kernel command line (/proc/cmdline), turning it into a dictionary.

    Useful to figure out if some kernel boolean knob has been turned on,
    as well as to find the value associated to other kernel knobs.

    It can also be used to find out about parameters passed to the
    init process, such as 'BOOT_IMAGE', etc.

    E.g.:
    >>> import procfs
    >>> kcmd = procfs.cmdline()
    >>> print kcmd.keys()
    ['LANG', 'BOOT_IMAGE', 'quiet', 'rhgb', 'rd.lvm.lv', 'ro', 'root']
    >>> print kcmd["BOOT_IMAGE"]
    /vmlinuz-4.3.0-rc1+
    >>>
    """

    def __init__(self):
        self.options = {}
        self.parse()

    def parse(self):
        with open("/proc/cmdline") as f:
            for option in f.readline().strip().split():
                fields = option.split("=")
                if len(fields) == 1:
                    self.options[fields[0]] = True
                else:
                    self.options[fields[0]] = fields[1]

    def __getitem__(self, key):
        return self.options[key]

    def keys(self):
        return list(self.options.keys())

    def values(self):
        return list(self.options.values())

    def items(self):
        return self.options


class cpuinfo:
    """
    Dictionary with information about CPUs in the system.

    Please refer to 'man procfs(5)' for further information about the
    '/proc/cpuinfo' file, that is the source of the information provided
    by this class. The 'man lscpu(1)' also has information about a program that
    uses the '/proc/cpuinfo' file.

    Using this class one can obtain the number of CPUs in a system:

      >>> cpus = procfs.cpuinfo()
      >>> print cpus.nr_cpus
      4

    It is also possible to figure out aspects of the CPU topology, such as
    how many CPU physical sockets exists, i.e. groups of CPUs sharing
    components such as CPU memory caches:

      >>> print len(cpus.sockets)
      1

    Additionally, a dictionary with information common to all CPUs in the
    system is available:

      >>> print cpus["model name"]
      Intel(R) Core(TM) i7-3667U CPU @ 2.00GHz
      >>> print cpus["cache size"]
      4096 KB
      >>>
    """

    def __init__(self, filename="/proc/cpuinfo"):
        self.tags = {}
        self.nr_cpus = 0
        self.sockets = []
        self.parse(filename)

    def __getitem__(self, key):
        return self.tags[key.lower()]

    def keys(self):
        return list(self.tags.keys())

    def values(self):
        return list(self.tags.values())

    def items(self):
        return self.tags

    def parse(self, filename):
        with open(filename) as f:
            for line in f.readlines():
                line = line.strip()
                if not line:
                    continue
                fields = line.split(":")
                tagname = fields[0].strip().lower()
                if tagname == "processor":
                    self.nr_cpus += 1
                    continue
                if is_s390() and tagname == "cpu number":
                    self.nr_cpus += 1
                    continue
                if tagname == "core id":
                    continue
                self.tags[tagname] = fields[1].strip()
                if tagname == "physical id":
                    socket_id = self.tags[tagname]
                    if socket_id not in self.sockets:
                        self.sockets.append(socket_id)
        # Use integer division; under Python 3, '/' would turn nr_sockets into a float.
        self.nr_sockets = self.sockets and len(self.sockets) or \
            (self.nr_cpus //
             ("siblings" in self.tags and int(self.tags["siblings"]) or 1))
        self.nr_cores = ("cpu cores" in self.tags and int(
            self.tags["cpu cores"]) or 1) * self.nr_sockets


class smaps_lib:
    """
    Representation of an mmap in place for a process. Can be used to figure
    out which processes have an library mapped, etc.

    The 'perm' member can be used to figure out executable mmaps,
    i.e. libraries.

    The 'vm_start' and 'vm_end' in turn can be used when trying to resolve
    processor instruction pointer addresses to a symbol name in a library.
    """

    def __init__(self, lines):
        fields = lines[0].split()
        self.vm_start, self.vm_end = [int(a, 16) for a in fields[0].split("-")]
        self.perms = fields[1]
        self.offset = int(fields[2], 16)
        self.major, self.minor = fields[3].split(":")
        self.inode = int(fields[4])
        if len(fields) > 5:
            self.name = fields[5]
        else:
            self.name = None
        self.tags = {}
        for line in lines[1:]:
            fields = line.split()
            tag = fields[0][:-1].lower()
            try:
                self.tags[tag] = int(fields[1])
            except:
                # VmFlags are strings
                self.tags[tag] = fields

    def __getitem__(self, key):
        return self.tags[key.lower()]

    def keys(self):
        return list(self.tags.keys())

    def values(self):
        return list(self.tags.values())

    def items(self):
        return self.tags


class smaps:
    """
    List of libraries mapped by a process. Parses the lines in
    the /proc/PID/smaps file, that is further documented in the
    procfs(5) man page.

    Example: Listing the executable maps for the 'sshd' process:

          >>> import procfs
          >>> processes = procfs.pidstats()
          >>> sshd = processes.find_by_name("sshd")
          >>> sshd_maps = procfs.smaps(sshd[0])
          >>> for i in range(len(sshd_maps)):
          ...     if 'x' in sshd_maps[i].perms:
          ...         print "%s: %s" % (sshd_maps[i].name, sshd_maps[i].perms)
          ...
          /usr/sbin/sshd: r-xp
          /usr/lib64/libnss_files-2.20.so: r-xp
          /usr/lib64/librt-2.20.so: r-xp
          /usr/lib64/libkeyutils.so.1.5: r-xp
          /usr/lib64/libkrb5support.so.0.1: r-xp
          /usr/lib64/libfreebl3.so: r-xp
          /usr/lib64/libpthread-2.20.so: r-xp
      ...
    """

    def __init__(self, pid):
        self.pid = pid
        self.entries = []
        self.reload()

    def parse_entry(self, f, line):
        lines = []
        if not line:
            line = f.readline().strip()
        if not line:
            return
        lines.append(line)
        while True:
            line = f.readline()
            if not line:
                break
            line = line.strip()
            if line.split()[0][-1] == ':':
                lines.append(line)
            else:
                break
        self.entries.append(smaps_lib(lines))
        return line

    def __len__(self):
        return len(self.entries)

    def __getitem__(self, index):
        return self.entries[index]

    def reload(self):
        line = None
        with open(f"/proc/{self.pid}/smaps") as f:
            while True:
                line = self.parse_entry(f, line)
                if not line:
                    break
        self.nr_entries = len(self.entries)

    def find_by_name_fragment(self, fragment):
        result = []
        for i in range(self.nr_entries):
            if self.entries[i].name and \
               self.entries[i].name.find(fragment) >= 0:
                result.append(self.entries[i])

        return result


class cpustat:
    """
    CPU statistics, obtained from a line in the '/proc/stat' file, Please
    refer to 'man procfs(5)' for further information about the '/proc/stat'
    file, that is the source of the information provided by this class.
    """

    def __init__(self, fields):
        self.name = fields[0]
        (self.user,
         self.nice,
         self.system,
         self.idle,
         self.iowait,
         self.irq,
         self.softirq) = [int(i) for i in fields[1:8]]
        # fields[1:8] (user..softirq) were consumed above; steal and guest,
        # when present, are the next two columns of /proc/stat.
        if len(fields) > 8:
            self.steal = int(fields[8])
            if len(fields) > 9:
                self.guest = int(fields[9])

    def __repr__(self):
        s = f"< user: {self.user}, nice: {self.nice}, system: {self.system}, idle: {self.idle}, iowait: {self.iowait}, irq: {self.irq}, softirq: {self.softirq}"
        if hasattr(self, 'steal'):
            s += f", steal: {self.steal}"
        if hasattr(self, 'guest'):
            s += f", guest: {self.guest}"
        return s + ">"


class cpusstats:
    """
    Dictionary with information about CPUs in the system. First entry in the
    dictionary gives an aggregate view of all CPUs, each other entry is about
    separate CPUs. Please refer to 'man procfs(5)' for further information
    about the '/proc/stat' file, that is the source of the information provided
    by this class.
    """

    def __init__(self, filename="/proc/stat"):
        self.entries = {}
        self.time = None
        self.hertz = os.sysconf(2)
        self.filename = filename
        self.reload()

    def __iter__(self):
        return iter(self.entries)

    def __getitem__(self, key):
        return self.entries[key]

    def __len__(self):
        return len(list(self.entries.keys()))

    def keys(self):
        return list(self.entries.keys())

    def values(self):
        return list(self.entries.values())

    def items(self):
        return self.entries

    def reload(self):
        last_entries = self.entries
        self.entries = {}
        with open(self.filename) as f:
            for line in f.readlines():
                fields = line.strip().split()
                if fields[0][:3].lower() != "cpu":
                    continue
                c = cpustat(fields)
                if c.name == "cpu":
                    idx = 0
                else:
                    idx = int(c.name[3:]) + 1
                self.entries[idx] = c
        last_time = self.time
        self.time = time.time()
        if last_entries:
            delta_sec = self.time - last_time
            interval_hz = delta_sec * self.hertz
            for cpu in list(self.entries.keys()):
                curr = self.entries[cpu]
                if cpu not in last_entries:
                    # No previous sample for this CPU yet, so no usage to compute
                    curr.usage = 0
                    continue
                prev = last_entries[cpu]
                delta = (curr.user - prev.user) + \
                    (curr.nice - prev.nice) + \
                    (curr.system - prev.system)
                curr.usage = (delta / interval_hz) * 100
                curr.usage = min(curr.usage, 100)


if __name__ == '__main__':
    import sys
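    # Demo sketch: invoke as 'python procfs.py <pid> <name-fragment>';
    # sys.argv[1] picks the process for the smaps dump below and sys.argv[2]
    # is passed to find_by_name_fragment().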

    ints = interrupts()

    for i in list(ints.interrupts.keys()):
        print(f"{i}: {ints.interrupts[i]}")

    options = cmdline()
    for o in list(options.options.keys()):
        print(f"{o}: {options.options[o]}")

    cpu = cpuinfo()
    print(f"\ncpuinfo data: {cpu.nr_cpus} processors")
    for tag in list(cpu.keys()):
        print(f"{tag}={cpu[tag]}")

    print("smaps:\n" + ("-" * 40))
    s = smaps(int(sys.argv[1]))
    for i in range(s.nr_entries):
        print(f"{s.entries[i].vm_start:#x} {s.entries[i].name}")
    print("-" * 40)
    for a in s.find_by_name_fragment(sys.argv[2]):
        print(a["Size"])

    ps = pidstats()
    print(ps[1])

    cs = cpusstats()
    while True:
        time.sleep(1)
        cs.reload()
        for cpu in cs:
            print(f"{cpu}: {cs[cpu]}")
        print("-" * 10)
site-packages/procfs/__pycache__/__init__.cpython-36.pyc   [binary: compiled CPython 3.6 bytecode omitted]
site-packages/procfs/__pycache__/utilist.cpython-36.opt-1.pyc   [binary: compiled CPython 3.6 bytecode omitted]
site-packages/procfs/__pycache__/procfs.cpython-36.pyc   [binary: compiled CPython 3.6 bytecode omitted]
site-packages/procfs/__pycache__/__init__.cpython-36.opt-1.pyc   [binary: compiled CPython 3.6 bytecode omitted]
site-packages/procfs/__pycache__/procfs.cpython-36.opt-1.pyc   [binary: compiled CPython 3.6 bytecode omitted]
site-packages/procfs/__pycache__/utilist.cpython-36.pyc   [binary: compiled CPython 3.6 bytecode omitted]
�<module>	ssite-packages/easy_install.py000064400000000176147511334560012342 0ustar00"""Run the EasyInstall command"""

if __name__ == '__main__':
    from setuptools.command.easy_install import main
    main()
site-packages/python_linux_procfs-0.7.3-py3.6.egg-info/PKG-INFO000064400000000536147511334560017524 0ustar00Metadata-Version: 1.0
Name: python-linux-procfs
Version: 0.7.3
Summary: Linux /proc abstraction classes
Home-page: http://userweb.kernel.org/python-linux-procfs
Author: Arnaldo Carvalho de Melo
Author-email: acme@redhat.com
License: GPLv2
Description: Abstractions to extract information from the Linux kernel /proc files.
        
Platform: UNKNOWN
site-packages/python_linux_procfs-0.7.3-py3.6.egg-info/SOURCES.txt000064400000000433147511334560020307 0ustar00pflags
setup.py
procfs/__init__.py
procfs/procfs.py
procfs/utilist.py
python_linux_procfs.egg-info/PKG-INFO
python_linux_procfs.egg-info/SOURCES.txt
python_linux_procfs.egg-info/dependency_links.txt
python_linux_procfs.egg-info/requires.txt
python_linux_procfs.egg-info/top_level.txtsite-packages/python_linux_procfs-0.7.3-py3.6.egg-info/top_level.txt000064400000000007147511334560021152 0ustar00procfs
site-packages/python_linux_procfs-0.7.3-py3.6.egg-info/dependency_links.txt000064400000000001147511334560022471 0ustar00
site-packages/python_linux_procfs-0.7.3-py3.6.egg-info/requires.txt000064400000000004147511334560021015 0ustar00six
site-packages/syspurpose-1.28.40-py3.6.egg-info/SOURCES.txt000064400000000575147511334560016603 0ustar00README.md
setup.cfg
setup.py
man/syspurpose.8
src/syspurpose/__init__.py
src/syspurpose/cli.py
src/syspurpose/files.py
src/syspurpose/i18n.py
src/syspurpose/main.py
src/syspurpose/utils.py
src/syspurpose.egg-info/PKG-INFO
src/syspurpose.egg-info/SOURCES.txt
src/syspurpose.egg-info/dependency_links.txt
src/syspurpose.egg-info/entry_points.txt
src/syspurpose.egg-info/top_level.txtsite-packages/syspurpose-1.28.40-py3.6.egg-info/PKG-INFO000064400000000367147511334560016013 0ustar00Metadata-Version: 1.0
Name: syspurpose
Version: 1.28.40
Summary: Manage Red Hat System Purpose
Home-page: http://www.candlepinproject.org
Author: Chris Snyder
Author-email: chainsaw@redhat.com
License: GPLv2
Description: UNKNOWN
Platform: UNKNOWN
site-packages/syspurpose-1.28.40-py3.6.egg-info/dependency_links.txt000064400000000001147511334560020756 0ustar00
site-packages/syspurpose-1.28.40-py3.6.egg-info/entry_points.txt000064400000000065147511334560020207 0ustar00[console_scripts]
syspurpose = syspurpose.main:main

site-packages/syspurpose-1.28.40-py3.6.egg-info/top_level.txt000064400000000013147511334560017434 0ustar00syspurpose
site-packages/configobj-5.0.6-py3.6.egg-info000064400000005722147511334560014201 0ustar00Metadata-Version: 1.1
Name: configobj
Version: 5.0.6
Summary: Config file reading, writing and validation.
Home-page: https://github.com/DiffSK/configobj
Author: Rob Dennis, Eli Courtwright (Michael Foord & Nicola Larosa original maintainers)
Author-email: rdennis+configobj@gmail.com, eli@courtwright.org, fuzzyman@voidspace.co.uk, nico@tekNico.net
License: UNKNOWN
Description: **ConfigObj** is a simple but powerful config file reader and writer: an *ini
        file round tripper*. Its main feature is that it is very easy to use, with a
        straightforward programmer's interface and a simple syntax for config files.
        It has lots of other features though :
        
        * Nested sections (subsections), to any level
        * List values
        * Multiple line values
        * Full Unicode support
        * String interpolation (substitution)
        * Integrated with a powerful validation system
        
            - including automatic type checking/conversion
            - and allowing default values
            - repeated sections
        
        * All comments in the file are preserved
        * The order of keys/sections is preserved
        * Powerful ``unrepr`` mode for storing/retrieving Python data-types
        
        | Release 5.0.6 improves error messages in certain edge cases
        | Release 5.0.5 corrects a unicode-bug that still existed in writing files
        | Release 5.0.4 corrects a unicode-bug that still existed in reading files after
        | fixing lists of string in 5.0.3
        | Release 5.0.3 corrects errors related to the incorrectly handling unicode
        | encoding and writing out files
        | Release 5.0.2 adds a specific error message when trying to install on
        | Python versions older than 2.5
        | Release 5.0.1 fixes a regression with unicode conversion not happening
        | in certain cases PY2
        | Release 5.0.0 updates the supported Python versions to 2.6, 2.7, 3.2, 3.3
        | and is otherwise unchanged
        | Release 4.7.2 fixes several bugs in 4.7.1
        | Release 4.7.1 fixes a bug with the deprecated options keyword in
        | 4.7.0.
        | Release 4.7.0 improves performance adds features for validation and
        | fixes some bugs.
Keywords: config,ini,dictionary,application,admin,sysadmin,configuration,validation
Platform: UNKNOWN
Classifier: Development Status :: 6 - Mature
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: BSD License
Classifier: Programming Language :: Python
Classifier: Programming Language :: Python :: 2
Classifier: Programming Language :: Python :: 2.6
Classifier: Programming Language :: Python :: 2.7
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.2
Classifier: Programming Language :: Python :: 3.3
Classifier: Operating System :: OS Independent
Classifier: Topic :: Software Development :: Libraries
Classifier: Topic :: Software Development :: Libraries :: Python Modules
site-packages/appdirs-1.4.3.dist-info/RECORD
appdirs.py,sha256=MievUEuv3l_mQISH5SF0shDk_BNhHHzYiAPrT3ITN4I,24701
appdirs-1.4.3.dist-info/DESCRIPTION.rst,sha256=77Fe8OIOLSjDSNdLiL5xywMKO-AGE42rdXkqKo4Ee-k,7531
appdirs-1.4.3.dist-info/METADATA,sha256=0b4qJnCe47dSs3DnCEyiRjjem0za-TOoA6YmjOvxLg0,8831
appdirs-1.4.3.dist-info/RECORD,,
appdirs-1.4.3.dist-info/WHEEL,sha256=kdsN-5OJAZIiHN-iO4Rhl82KyS0bDWf4uBwMbkNafr8,110
appdirs-1.4.3.dist-info/metadata.json,sha256=pS8bByGso9wTgo8oC6prGmLOQ1fNH9QNW4Egsi9knSA,1434
appdirs-1.4.3.dist-info/top_level.txt,sha256=nKncE8CUqZERJ6VuQWL4_bkunSPDNfn7KZqb4Tr5YEM,8
appdirs-1.4.3.dist-info/INSTALLER,sha256=zuuue4knoyJ-UwPPXg8fezS7VCrXJQrAP7zeNuwvFQg,4
__pycache__/appdirs.cpython-36.pyc,,
site-packages/appdirs-1.4.3.dist-info/DESCRIPTION.rst
.. image:: https://secure.travis-ci.org/ActiveState/appdirs.png
    :target: http://travis-ci.org/ActiveState/appdirs

the problem
===========

What directory should your app use for storing user data? If running on Mac OS X, you
should use::

    ~/Library/Application Support/<AppName>

If on Windows (at least English Win XP) that should be::

    C:\Documents and Settings\<User>\Application Data\Local Settings\<AppAuthor>\<AppName>

or possibly::

    C:\Documents and Settings\<User>\Application Data\<AppAuthor>\<AppName>

for `roaming profiles <http://bit.ly/9yl3b6>`_ but that is another story.

On Linux (and other Unices) the dir, according to the `XDG
spec <http://standards.freedesktop.org/basedir-spec/basedir-spec-latest.html>`_, is::

    ~/.local/share/<AppName>


``appdirs`` to the rescue
=========================

This kind of thing is what the ``appdirs`` module is for. ``appdirs`` will
help you choose an appropriate:

- user data dir (``user_data_dir``)
- user config dir (``user_config_dir``)
- user cache dir (``user_cache_dir``)
- site data dir (``site_data_dir``)
- site config dir (``site_config_dir``)
- user log dir (``user_log_dir``)

and also:

- is a single module so other Python packages can include their own private copy
- is slightly opinionated on the directory names used. Look for "OPINION" in
  documentation and code for when an opinion is being applied.


some example output
===================

On Mac OS X::

    >>> from appdirs import *
    >>> appname = "SuperApp"
    >>> appauthor = "Acme"
    >>> user_data_dir(appname, appauthor)
    '/Users/trentm/Library/Application Support/SuperApp'
    >>> site_data_dir(appname, appauthor)
    '/Library/Application Support/SuperApp'
    >>> user_cache_dir(appname, appauthor)
    '/Users/trentm/Library/Caches/SuperApp'
    >>> user_log_dir(appname, appauthor)
    '/Users/trentm/Library/Logs/SuperApp'

On Windows 7::

    >>> from appdirs import *
    >>> appname = "SuperApp"
    >>> appauthor = "Acme"
    >>> user_data_dir(appname, appauthor)
    'C:\\Users\\trentm\\AppData\\Local\\Acme\\SuperApp'
    >>> user_data_dir(appname, appauthor, roaming=True)
    'C:\\Users\\trentm\\AppData\\Roaming\\Acme\\SuperApp'
    >>> user_cache_dir(appname, appauthor)
    'C:\\Users\\trentm\\AppData\\Local\\Acme\\SuperApp\\Cache'
    >>> user_log_dir(appname, appauthor)
    'C:\\Users\\trentm\\AppData\\Local\\Acme\\SuperApp\\Logs'

On Linux::

    >>> from appdirs import *
    >>> appname = "SuperApp"
    >>> appauthor = "Acme"
    >>> user_data_dir(appname, appauthor)
    '/home/trentm/.local/share/SuperApp'
    >>> site_data_dir(appname, appauthor)
    '/usr/local/share/SuperApp'
    >>> site_data_dir(appname, appauthor, multipath=True)
    '/usr/local/share/SuperApp:/usr/share/SuperApp'
    >>> user_cache_dir(appname, appauthor)
    '/home/trentm/.cache/SuperApp'
    >>> user_log_dir(appname, appauthor)
    '/home/trentm/.cache/SuperApp/log'
    >>> user_config_dir(appname)
    '/home/trentm/.config/SuperApp'
    >>> site_config_dir(appname)
    '/etc/xdg/SuperApp'
    >>> os.environ['XDG_CONFIG_DIRS'] = '/etc:/usr/local/etc'
    >>> site_config_dir(appname, multipath=True)
    '/etc/SuperApp:/usr/local/etc/SuperApp'


``AppDirs`` for convenience
===========================

::

    >>> from appdirs import AppDirs
    >>> dirs = AppDirs("SuperApp", "Acme")
    >>> dirs.user_data_dir
    '/Users/trentm/Library/Application Support/SuperApp'
    >>> dirs.site_data_dir
    '/Library/Application Support/SuperApp'
    >>> dirs.user_cache_dir
    '/Users/trentm/Library/Caches/SuperApp'
    >>> dirs.user_log_dir
    '/Users/trentm/Library/Logs/SuperApp'
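
Note that ``appdirs`` only computes these paths; it does not create the
directories for you. A small sketch of a typical pattern (the app name,
author and file name are illustrative)::

    import os
    from appdirs import user_config_dir

    # user_config_dir() just returns a string, so create the directory
    # before writing into it (exist_ok requires Python 3.2+).
    config_dir = user_config_dir("SuperApp", "Acme")
    os.makedirs(config_dir, exist_ok=True)

    settings_path = os.path.join(config_dir, "settings.ini")
    with open(settings_path, "a"):
        pass  # create the file if it does not exist yet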



Per-version isolation
=====================

If you have multiple versions of your app in use that you want to be
able to run side-by-side, then you may want version-isolation for these
dirs::

    >>> from appdirs import AppDirs
    >>> dirs = AppDirs("SuperApp", "Acme", version="1.0")
    >>> dirs.user_data_dir
    '/Users/trentm/Library/Application Support/SuperApp/1.0'
    >>> dirs.site_data_dir
    '/Library/Application Support/SuperApp/1.0'
    >>> dirs.user_cache_dir
    '/Users/trentm/Library/Caches/SuperApp/1.0'
    >>> dirs.user_log_dir
    '/Users/trentm/Library/Logs/SuperApp/1.0'



appdirs Changelog
=================

appdirs 1.4.3
-------------
- [PR #76] Python 3.6 invalid escape sequence deprecation fixes
- Fix for Python 3.6 support

appdirs 1.4.2
-------------
- [PR #84] Allow installing without setuptools
- [PR #86] Fix string delimiters in setup.py description
- Add Python 3.6 support

appdirs 1.4.1
-------------
- [issue #38] Fix _winreg import on Windows Py3
- [issue #55] Make appname optional

appdirs 1.4.0
-------------
- [PR #42] AppAuthor is now optional on Windows
- [issue 41] Support Jython on Windows, Mac, and Unix-like platforms. Windows
  support requires `JNA <https://github.com/twall/jna>`_.
- [PR #44] Fix incorrect behaviour of the site_config_dir method

appdirs 1.3.0
-------------
- [Unix, issue 16] Conform to XDG standard, instead of breaking it for
  everybody
- [Unix] Removes gratuitous case mangling, since \*nix-es are usually
  case-sensitive, so mangling is not wise
- [Unix] Fixes the utterly wrong behaviour in ``site_data_dir``, return result
  based on XDG_DATA_DIRS and make room for respecting the standard which
  specifies XDG_DATA_DIRS is a multiple-value variable
- [Issue 6] Add ``*_config_dir`` which are distinct on nix-es, according to
  XDG specs; on Windows and Mac return the corresponding ``*_data_dir``

appdirs 1.2.0
-------------

- [Unix] Put ``user_log_dir`` under the *cache* dir on Unix. Seems to be more
  typical.
- [issue 9] Make ``unicode`` work on py3k.

appdirs 1.1.0
-------------

- [issue 4] Add ``AppDirs.user_log_dir``.
- [Unix, issue 2, issue 7] appdirs now conforms to `XDG base directory spec
  <http://standards.freedesktop.org/basedir-spec/basedir-spec-latest.html>`_.
- [Mac, issue 5] Fix ``site_data_dir()`` on Mac.
- [Mac] Drop use of 'Carbon' module in favour of hardcoded paths; supports
  Python3 now.
- [Windows] Append "Cache" to ``user_cache_dir`` on Windows by default. Use
  ``opinion=False`` option to disable this.
- Add ``appdirs.AppDirs`` convenience class. Usage:

        >>> dirs = AppDirs("SuperApp", "Acme", version="1.0")
        >>> dirs.user_data_dir
        '/Users/trentm/Library/Application Support/SuperApp/1.0'

- [Windows] Cherry-pick Komodo's change to downgrade paths to the Windows short
  paths if there are high bit chars.
- [Linux] Change default ``user_cache_dir()`` on Linux to be singular, e.g.
  "~/.superapp/cache".
- [Windows] Add ``roaming`` option to ``user_data_dir()`` (for use on Windows only)
  and change the default ``user_data_dir`` behaviour to use a *non*-roaming
  profile dir (``CSIDL_LOCAL_APPDATA`` instead of ``CSIDL_APPDATA``). Why? Because
  a large roaming profile can cause login speed issues. The "only syncs on
  logout" behaviour can cause surprises in appdata info.


appdirs 1.0.1 (never released)
------------------------------

Started this changelog 27 July 2010. Before that this module originated in the
`Komodo <http://www.activestate.com/komodo>`_ product as ``applib.py`` and then
as `applib/location.py
<http://github.com/ActiveState/applib/blob/master/applib/location.py>`_ (used by
`PyPM <http://code.activestate.com/pypm/>`_ in `ActivePython
<http://www.activestate.com/activepython>`_). This is basically a fork of
applib.py 1.0.1 and applib/location.py 1.0.1.



site-packages/appdirs-1.4.3.dist-info/METADATA
Metadata-Version: 2.0
Name: appdirs
Version: 1.4.3
Summary: A small Python module for determining appropriate platform-specific dirs, e.g. a "user data dir".
Home-page: http://github.com/ActiveState/appdirs
Author: Trent Mick
Author-email: trentm@gmail.com
Maintainer: Trent Mick; Sridhar Ratnakumar; Jeff Rouse
Maintainer-email: trentm@gmail.com; github@srid.name; jr@its.to
License: MIT
Keywords: application directory log cache user
Platform: UNKNOWN
Classifier: Development Status :: 4 - Beta
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: MIT License
Classifier: Operating System :: OS Independent
Classifier: Programming Language :: Python :: 2
Classifier: Programming Language :: Python :: 2.6
Classifier: Programming Language :: Python :: 2.7
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.2
Classifier: Programming Language :: Python :: 3.3
Classifier: Programming Language :: Python :: 3.4
Classifier: Programming Language :: Python :: 3.5
Classifier: Programming Language :: Python :: 3.6
Classifier: Programming Language :: Python :: Implementation :: PyPy
Classifier: Programming Language :: Python :: Implementation :: CPython
Classifier: Topic :: Software Development :: Libraries :: Python Modules


.. image:: https://secure.travis-ci.org/ActiveState/appdirs.png
    :target: http://travis-ci.org/ActiveState/appdirs

the problem
===========

What directory should your app use for storing user data? If running on Mac OS X, you
should use::

    ~/Library/Application Support/<AppName>

If on Windows (at least English Win XP) that should be::

    C:\Documents and Settings\<User>\Application Data\Local Settings\<AppAuthor>\<AppName>

or possibly::

    C:\Documents and Settings\<User>\Application Data\<AppAuthor>\<AppName>

for `roaming profiles <http://bit.ly/9yl3b6>`_ but that is another story.

On Linux (and other Unices) the dir, according to the `XDG
spec <http://standards.freedesktop.org/basedir-spec/basedir-spec-latest.html>`_, is::

    ~/.local/share/<AppName>


``appdirs`` to the rescue
=========================

This kind of thing is what the ``appdirs`` module is for. ``appdirs`` will
help you choose an appropriate:

- user data dir (``user_data_dir``)
- user config dir (``user_config_dir``)
- user cache dir (``user_cache_dir``)
- site data dir (``site_data_dir``)
- site config dir (``site_config_dir``)
- user log dir (``user_log_dir``)

and also:

- is a single module so other Python packages can include their own private copy
- is slightly opinionated on the directory names used. Look for "OPINION" in
  documentation and code for when an opinion is being applied.


some example output
===================

On Mac OS X::

    >>> from appdirs import *
    >>> appname = "SuperApp"
    >>> appauthor = "Acme"
    >>> user_data_dir(appname, appauthor)
    '/Users/trentm/Library/Application Support/SuperApp'
    >>> site_data_dir(appname, appauthor)
    '/Library/Application Support/SuperApp'
    >>> user_cache_dir(appname, appauthor)
    '/Users/trentm/Library/Caches/SuperApp'
    >>> user_log_dir(appname, appauthor)
    '/Users/trentm/Library/Logs/SuperApp'

On Windows 7::

    >>> from appdirs import *
    >>> appname = "SuperApp"
    >>> appauthor = "Acme"
    >>> user_data_dir(appname, appauthor)
    'C:\\Users\\trentm\\AppData\\Local\\Acme\\SuperApp'
    >>> user_data_dir(appname, appauthor, roaming=True)
    'C:\\Users\\trentm\\AppData\\Roaming\\Acme\\SuperApp'
    >>> user_cache_dir(appname, appauthor)
    'C:\\Users\\trentm\\AppData\\Local\\Acme\\SuperApp\\Cache'
    >>> user_log_dir(appname, appauthor)
    'C:\\Users\\trentm\\AppData\\Local\\Acme\\SuperApp\\Logs'

On Linux::

    >>> from appdirs import *
    >>> appname = "SuperApp"
    >>> appauthor = "Acme"
    >>> user_data_dir(appname, appauthor)
    '/home/trentm/.local/share/SuperApp'
    >>> site_data_dir(appname, appauthor)
    '/usr/local/share/SuperApp'
    >>> site_data_dir(appname, appauthor, multipath=True)
    '/usr/local/share/SuperApp:/usr/share/SuperApp'
    >>> user_cache_dir(appname, appauthor)
    '/home/trentm/.cache/SuperApp'
    >>> user_log_dir(appname, appauthor)
    '/home/trentm/.cache/SuperApp/log'
    >>> user_config_dir(appname)
    '/home/trentm/.config/SuperApp'
    >>> site_config_dir(appname)
    '/etc/xdg/SuperApp'
    >>> os.environ['XDG_CONFIG_DIRS'] = '/etc:/usr/local/etc'
    >>> site_config_dir(appname, multipath=True)
    '/etc/SuperApp:/usr/local/etc/SuperApp'


``AppDirs`` for convenience
===========================

::

    >>> from appdirs import AppDirs
    >>> dirs = AppDirs("SuperApp", "Acme")
    >>> dirs.user_data_dir
    '/Users/trentm/Library/Application Support/SuperApp'
    >>> dirs.site_data_dir
    '/Library/Application Support/SuperApp'
    >>> dirs.user_cache_dir
    '/Users/trentm/Library/Caches/SuperApp'
    >>> dirs.user_log_dir
    '/Users/trentm/Library/Logs/SuperApp'



Per-version isolation
=====================

If you have multiple versions of your app in use that you want to be
able to run side-by-side, then you may want version-isolation for these
dirs::

    >>> from appdirs import AppDirs
    >>> dirs = AppDirs("SuperApp", "Acme", version="1.0")
    >>> dirs.user_data_dir
    '/Users/trentm/Library/Application Support/SuperApp/1.0'
    >>> dirs.site_data_dir
    '/Library/Application Support/SuperApp/1.0'
    >>> dirs.user_cache_dir
    '/Users/trentm/Library/Caches/SuperApp/1.0'
    >>> dirs.user_log_dir
    '/Users/trentm/Library/Logs/SuperApp/1.0'



appdirs Changelog
=================

appdirs 1.4.3
-------------
- [PR #76] Python 3.6 invalid escape sequence deprecation fixes
- Fix for Python 3.6 support

appdirs 1.4.2
-------------
- [PR #84] Allow installing without setuptools
- [PR #86] Fix string delimiters in setup.py description
- Add Python 3.6 support

appdirs 1.4.1
-------------
- [issue #38] Fix _winreg import on Windows Py3
- [issue #55] Make appname optional

appdirs 1.4.0
-------------
- [PR #42] AppAuthor is now optional on Windows
- [issue 41] Support Jython on Windows, Mac, and Unix-like platforms. Windows
  support requires `JNA <https://github.com/twall/jna>`_.
- [PR #44] Fix incorrect behaviour of the site_config_dir method

appdirs 1.3.0
-------------
- [Unix, issue 16] Conform to XDG standard, instead of breaking it for
  everybody
- [Unix] Removes gratuitous case mangling, since \*nix-es are usually
  case-sensitive, so mangling is not wise
- [Unix] Fixes the utterly wrong behaviour in ``site_data_dir``, return result
  based on XDG_DATA_DIRS and make room for respecting the standard which
  specifies XDG_DATA_DIRS is a multiple-value variable
- [Issue 6] Add ``*_config_dir`` which are distinct on nix-es, according to
  XDG specs; on Windows and Mac return the corresponding ``*_data_dir``

appdirs 1.2.0
-------------

- [Unix] Put ``user_log_dir`` under the *cache* dir on Unix. Seems to be more
  typical.
- [issue 9] Make ``unicode`` work on py3k.

appdirs 1.1.0
-------------

- [issue 4] Add ``AppDirs.user_log_dir``.
- [Unix, issue 2, issue 7] appdirs now conforms to `XDG base directory spec
  <http://standards.freedesktop.org/basedir-spec/basedir-spec-latest.html>`_.
- [Mac, issue 5] Fix ``site_data_dir()`` on Mac.
- [Mac] Drop use of 'Carbon' module in favour of hardcoded paths; supports
  Python3 now.
- [Windows] Append "Cache" to ``user_cache_dir`` on Windows by default. Use
  ``opinion=False`` option to disable this.
- Add ``appdirs.AppDirs`` convenience class. Usage:

        >>> dirs = AppDirs("SuperApp", "Acme", version="1.0")
        >>> dirs.user_data_dir
        '/Users/trentm/Library/Application Support/SuperApp/1.0'

- [Windows] Cherry-pick Komodo's change to downgrade paths to the Windows short
  paths if there are high bit chars.
- [Linux] Change default ``user_cache_dir()`` on Linux to be singular, e.g.
  "~/.superapp/cache".
- [Windows] Add ``roaming`` option to ``user_data_dir()`` (for use on Windows only)
  and change the default ``user_data_dir`` behaviour to use a *non*-roaming
  profile dir (``CSIDL_LOCAL_APPDATA`` instead of ``CSIDL_APPDATA``). Why? Because
  a large roaming profile can cause login speed issues. The "only syncs on
  logout" behaviour can cause surprises in appdata info.


appdirs 1.0.1 (never released)
------------------------------

Started this changelog 27 July 2010. Before that this module originated in the
`Komodo <http://www.activestate.com/komodo>`_ product as ``applib.py`` and then
as `applib/location.py
<http://github.com/ActiveState/applib/blob/master/applib/location.py>`_ (used by
`PyPM <http://code.activestate.com/pypm/>`_ in `ActivePython
<http://www.activestate.com/activepython>`_). This is basically a fork of
applib.py 1.0.1 and applib/location.py 1.0.1.



site-packages/appdirs-1.4.3.dist-info/WHEEL
Wheel-Version: 1.0
Generator: bdist_wheel (0.30.0)
Root-Is-Purelib: true
Tag: py2-none-any
Tag: py3-none-any

site-packages/appdirs-1.4.3.dist-info/metadata.json
{"classifiers": ["Development Status :: 4 - Beta", "Intended Audience :: Developers", "License :: OSI Approved :: MIT License", "Operating System :: OS Independent", "Programming Language :: Python :: 2", "Programming Language :: Python :: 2.6", "Programming Language :: Python :: 2.7", "Programming Language :: Python :: 3", "Programming Language :: Python :: 3.2", "Programming Language :: Python :: 3.3", "Programming Language :: Python :: 3.4", "Programming Language :: Python :: 3.5", "Programming Language :: Python :: 3.6", "Programming Language :: Python :: Implementation :: PyPy", "Programming Language :: Python :: Implementation :: CPython", "Topic :: Software Development :: Libraries :: Python Modules"], "extensions": {"python.details": {"contacts": [{"email": "trentm@gmail.com", "name": "Trent Mick", "role": "author"}, {"email": "trentm@gmail.com; github@srid.name; jr@its.to", "name": "Trent Mick; Sridhar Ratnakumar; Jeff Rouse", "role": "maintainer"}], "document_names": {"description": "DESCRIPTION.rst"}, "project_urls": {"Home": "http://github.com/ActiveState/appdirs"}}}, "generator": "bdist_wheel (0.30.0)", "keywords": ["application", "directory", "log", "cache", "user"], "license": "MIT", "metadata_version": "2.0", "name": "appdirs", "summary": "A small Python module for determining appropriate platform-specific dirs, e.g. a \"user data dir\".", "test_requires": [{"requires": []}], "version": "1.4.3"}
site-packages/appdirs-1.4.3.dist-info/top_level.txt
appdirs
site-packages/appdirs-1.4.3.dist-info/INSTALLER
pip
site-packages/pyparsing-2.1.10.dist-info/METADATA
Metadata-Version: 2.1
Name: pyparsing
Version: 2.1.10
Summary: Python parsing module
Home-page: http://pyparsing.wikispaces.com/
Author: Paul McGuire
Author-email: ptmcg@users.sourceforge.net
License: MIT License
Download-URL: http://sourceforge.net/project/showfiles.php?group_id=97203
Platform: UNKNOWN
Classifier: Development Status :: 5 - Production/Stable
Classifier: Intended Audience :: Developers
Classifier: Intended Audience :: Information Technology
Classifier: License :: OSI Approved :: MIT License
Classifier: Operating System :: OS Independent
Classifier: Programming Language :: Python
Classifier: Programming Language :: Python :: 2.6
Classifier: Programming Language :: Python :: 2.7
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.3
Classifier: Programming Language :: Python :: 3.4
Classifier: Programming Language :: Python :: 3.5

UNKNOWN


site-packages/pyparsing-2.1.10.dist-info/top_level.txt
pyparsing
site-packages/pyparsing-2.1.10.dist-info/WHEEL
Wheel-Version: 1.0
Generator: bdist_wheel (0.31.1)
Root-Is-Purelib: true
Tag: py2-none-any
Tag: py3-none-any

site-packages/pyparsing-2.1.10.dist-info/RECORD
pyparsing.py,sha256=PifeLY3-WhIcBVzLtv0U4T_pwDtPruBhBCkg5vLqa28,229867
pyparsing-2.1.10.dist-info/METADATA,sha256=9nn52Op3W0UI2og8TbACWXta1fMSoZ_jWFZEQqOd3S4,911
pyparsing-2.1.10.dist-info/RECORD,,
pyparsing-2.1.10.dist-info/WHEEL,sha256=gduuPyBvFJQSQ0zdyxF7k0zynDXbIbvg5ZBHoXum5uk,110
pyparsing-2.1.10.dist-info/top_level.txt,sha256=eUOjGzJVhlQ3WS2rFAy2mN3LX_7FKTM5GSJ04jfnLmU,10
pyparsing-2.1.10.dist-info/INSTALLER,sha256=zuuue4knoyJ-UwPPXg8fezS7VCrXJQrAP7zeNuwvFQg,4
__pycache__/pyparsing.cpython-36.pyc,,
site-packages/pyparsing-2.1.10.dist-info/INSTALLER
pip
site-packages/isc/__pycache__/checkds.cpython-36.pyc000064400000010272147511334560016317 0ustar003

��f&�@shddlZddlZddlZddlmZmZddlmZmZdZ	Gdd�d�Z
d
dd�Zd	d
�Zdd�Z
dS)�N)�Popen�PIPE)�prefix�versionzdnssec-checkdsc@sPeZdZddddd�ZdZdZdZdZdZdZ	d	Z
dd
d�Zdd
�Zdd�Z
dS)�SECRRzSHA-1zSHA-256ZGOSTzSHA-384)������INNrcCs�|st�t|�tk	r$|jd�j�}n|j�}t|�dkr<t�|r�d|_|j�|_|dj�j	d�jd�}|j
�|jd�}|j
�xDt|�dkr�t|�dkr�|d|dkr�|dd�}|dd�}q�W|r�t�|j
�dj|�|_|jd|jd|_
nd|_|dj�|_
|dd�}|dj�dk�rJ|dj�|_|dd�}n(t|d�|_|dj�|_|dd�}|dj�|jk�r�td|dj�|jf��tt|dd
��\|_|_|_dj|d
d��j�|_dS)N�ascii��DLVr�.r�DSr�CH�HSrz%s does not match %sr
r)rrr)�	Exception�type�str�decode�split�len�rrtype�lower�dlvname�strip�reverse�join�parent�rrname�upper�rrclass�int�ttl�map�keyid�keyalg�hashalg�digest)�selfZrrtextrZfieldsr �dlv�r-�/usr/lib/python3.6/checkds.py�__init__$sH

*zSECRR.__init__cCs$d|j|j|j|j|j|j|jfS)Nz%s %s %s %d %d %d %s)r!r#rr'r(r)r*)r+r-r-r.�__repr__SszSECRR.__repr__cCs|j�|j�kS)N)r0)r+�otherr-r-r.�__eq__XszSECRR.__eq__)N)�__name__�
__module__�__qualname__�hashalgsr!r#r'r(r)r*r%r/r0r2r-r-r-r.rs
/rc
	Cs&g}|jddd|rdndd|r*|d|n|g}t|td�j�\}}x6|j�D]*}t|�tk	rh|jd	�}|jt	||��qNWt
|d
d�d�}g}	|r�|jd
|g}|r�|d|g7}|j|�t|td�j�\}}ndt|jddddd|gtd�j�\}
}|jd
dg}|�r|d|g7}|j|�t|ttd�j|
�\}}x:|j�D].}t|�tk	�rZ|jd	�}|	jt	||���q>Wt|	�dk�r�t
d�dSd}xv|	D]n}||k�r�t
d|j|jjd�|j|jt	j|jf�d}n,t
d|j|jjd�|j|jt	j|jf��q�W|�s"t
d|�rdnd�|S)Nz+noallz+answerz-tr,Zdsz-qr)�stdoutr
cSs|j|j|jfS)N)r'r(r))�rrr-r-r.�<lambda>mszcheck.<locals>.<lambda>)�keyz-fz-lZdnskey�-)�stdinr7rz$No DNSKEY records found in zone apexFz,%s for KSK %s/%03d/%05d (%s) found in parentTz0%s for KSK %s/%03d/%05d (%s) missing from parentz'No %s records were found for any DNSKEYrr)�digrrZcommunicate�
splitlinesrrr�appendr�sorted�	dsfromkeyr�printrr!rr(r'r6r))
�zone�args�
masterfile�	lookasideZrrlist�cmd�fp�_�lineZklistZintods�foundr8r-r-r.�checkcsV





rLcCs�tjtdd�}d}tjdkr"dnd}|jdtdd�|jd	d
dtdd
�|jdddtdd
�|jdddtjjt	|�d�tdd�|jdddtjjt	|�d�tdd�|jdddt
d�|j�}|jj
d �|_|jr�|jj
d �|_|S)!Nz: checks DS coverage)�description�bin�ntZsbinrCz
zone to check)r�helpz-fz--filerEzzone master file)�destrrPz-lz--lookasiderFzDLV lookaside zonez-dz--digr=z
path to 'dig')rQ�defaultrrPz-Dz--dsfromkeyrAzdnssec-dsfromkeyzpath to 'dnssec-dsfromkey'z-vz	--versionr)�actionrr)�argparse�ArgumentParser�prog�os�name�add_argumentr�pathrrr�
parse_argsrCrrF)�parserZbindirZsbindirrDr-r-r.r[�s,




r[cCs.t�}t|j||j|j�}t|r$dnd�dS)Nrr)r[rLrCrErF�exit)rDrKr-r-r.�main�sr^)NN)rTrW�sys�
subprocessrrZ	isc.utilsrrrVrrLr[r^r-r-r-r.�<module>sI
; site-packages/isc/__pycache__/checkds.cpython-36.opt-1.pyc000064400000010272147511334560017256 0ustar003

��f&�@shddlZddlZddlZddlmZmZddlmZmZdZ	Gdd�d�Z
d
dd�Zd	d
�Zdd�Z
dS)�N)�Popen�PIPE)�prefix�versionzdnssec-checkdsc@sPeZdZddddd�ZdZdZdZdZdZdZ	d	Z
dd
d�Zdd
�Zdd�Z
dS)�SECRRzSHA-1zSHA-256ZGOSTzSHA-384)������INNrcCs�|st�t|�tk	r$|jd�j�}n|j�}t|�dkr<t�|r�d|_|j�|_|dj�j	d�jd�}|j
�|jd�}|j
�xDt|�dkr�t|�dkr�|d|dkr�|dd�}|dd�}q�W|r�t�|j
�dj|�|_|jd|jd|_
nd|_|dj�|_
|dd�}|dj�dk�rJ|dj�|_|dd�}n(t|d�|_|dj�|_|dd�}|dj�|jk�r�td|dj�|jf��tt|dd
��\|_|_|_dj|d
d��j�|_dS)N�ascii��DLVr�.r�DSr�CH�HSrz%s does not match %sr
r)rrr)�	Exception�type�str�decode�split�len�rrtype�lower�dlvname�strip�reverse�join�parent�rrname�upper�rrclass�int�ttl�map�keyid�keyalg�hashalg�digest)�selfZrrtextrZfieldsr �dlv�r-�/usr/lib/python3.6/checkds.py�__init__$sH

*zSECRR.__init__cCs$d|j|j|j|j|j|j|jfS)Nz%s %s %s %d %d %d %s)r!r#rr'r(r)r*)r+r-r-r.�__repr__SszSECRR.__repr__cCs|j�|j�kS)N)r0)r+�otherr-r-r.�__eq__XszSECRR.__eq__)N)�__name__�
__module__�__qualname__�hashalgsr!r#r'r(r)r*r%r/r0r2r-r-r-r.rs
/rc
	Cs&g}|jddd|rdndd|r*|d|n|g}t|td�j�\}}x6|j�D]*}t|�tk	rh|jd	�}|jt	||��qNWt
|d
d�d�}g}	|r�|jd
|g}|r�|d|g7}|j|�t|td�j�\}}ndt|jddddd|gtd�j�\}
}|jd
dg}|�r|d|g7}|j|�t|ttd�j|
�\}}x:|j�D].}t|�tk	�rZ|jd	�}|	jt	||���q>Wt|	�dk�r�t
d�dSd}xv|	D]n}||k�r�t
d|j|jjd�|j|jt	j|jf�d}n,t
d|j|jjd�|j|jt	j|jf��q�W|�s"t
d|�rdnd�|S)Nz+noallz+answerz-tr,Zdsz-qr)�stdoutr
cSs|j|j|jfS)N)r'r(r))�rrr-r-r.�<lambda>mszcheck.<locals>.<lambda>)�keyz-fz-lZdnskey�-)�stdinr7rz$No DNSKEY records found in zone apexFz,%s for KSK %s/%03d/%05d (%s) found in parentTz0%s for KSK %s/%03d/%05d (%s) missing from parentz'No %s records were found for any DNSKEYrr)�digrrZcommunicate�
splitlinesrrr�appendr�sorted�	dsfromkeyr�printrr!rr(r'r6r))
�zone�args�
masterfile�	lookasideZrrlist�cmd�fp�_�lineZklistZintods�foundr8r-r-r.�checkcsV





rLcCs�tjtdd�}d}tjdkr"dnd}|jdtdd�|jd	d
dtdd
�|jdddtdd
�|jdddtjjt	|�d�tdd�|jdddtjjt	|�d�tdd�|jdddt
d�|j�}|jj
d �|_|jr�|jj
d �|_|S)!Nz: checks DS coverage)�description�bin�ntZsbinrCz
zone to check)r�helpz-fz--filerEzzone master file)�destrrPz-lz--lookasiderFzDLV lookaside zonez-dz--digr=z
path to 'dig')rQ�defaultrrPz-Dz--dsfromkeyrAzdnssec-dsfromkeyzpath to 'dnssec-dsfromkey'z-vz	--versionr)�actionrr)�argparse�ArgumentParser�prog�os�name�add_argumentr�pathrrr�
parse_argsrCrrF)�parserZbindirZsbindirrDr-r-r.r[�s,




r[cCs.t�}t|j||j|j�}t|r$dnd�dS)Nrr)r[rLrCrErF�exit)rDrKr-r-r.�main�sr^)NN)rTrW�sys�
subprocessrrZ	isc.utilsrrrVrrLr[r^r-r-r-r.�<module>sI
; site-packages/isc/__pycache__/__init__.cpython-36.opt-1.pyc000064400000000744147511334560017414 0ustar003

��f��
@sjdddddddddd	d
ddg
Zd
dlTd
dlTd
dlTd
dlTd
dlTd
dlTd
dlTd
dlTd
dl	TdS)ZcheckdsZcoverageZkeymgrZdnskeyZ	eventlistZkeydictZkeyeventZ	keyseriesZkeyzoneZpolicyZparsetabZrndcZutils�)�*N)
�__all__Z
isc.dnskeyZ
isc.eventlistZisc.keydictZisc.keyeventZ
isc.keyseriesZisc.keyzoneZ
isc.policyZisc.rndcZ	isc.utils�rr�/usr/lib/python3.6/__init__.py�<module>s


site-packages/isc/__pycache__/keyevent.cpython-36.pyc000064400000003500147511334560016541 0ustar003

��f�@sddlZGdd�d�ZdS)�Nc@s4eZdZdZddd�Zdd�Zdd�Zdd	d
�ZdS)
�keyeventz� A discrete key event, e.g., Publish, Activate, Inactive, Delete,
    etc. Stores the date of the event, and identifying information
    about the key to which the event will occur.NcCs@||_|p|j|�|_||_|j|_|j|_|j|_|j|_dS)N)	�whatZgettime�when�key�sep�name�zone�alg�keyid)�selfrrr�r�/usr/lib/python3.6/keyevent.py�__init__szkeyevent.__init__cCs t|j|j|j|j|j|jf�S)N)�reprrrr
rrr	)rrrr
�__repr__ szkeyevent.__repr__cCstjd|j�S)Nz%a %b %d %H:%M:%S UTC %Y)�timeZstrftimer)rrrr
�showtime$szkeyevent.showtimecCs�dd�}|s|}|st�}|s$t�}|jdkr<|j|j�n�|jdkrT|j|j�n�|jdkr�|j|kr||dt|j��q�|j|j�nl|jdkr�|j|kr�|j|j�q�|dt|j��n6|jd	kr�|j|kr�|j|j�|j|kr�|j|j�||fS)
Nc_sdS)Nr)�args�kwargsrrr
�noop*szkeyevent.status.<locals>.noopZActivateZPublishZInactivez=	WARNING: %s scheduled to become inactive before it is activeZDeletez@WARNING: key %s is scheduled for deletion before it is publishedZRevoke)�setr�addr
rr�remove)rZactiveZ	published�outputrrrr
�status)s6








zkeyevent.status)N)N)�__name__�
__module__�__qualname__�__doc__rrrrrrrr
rs

	r)rrrrrr
�<module>ssite-packages/isc/__pycache__/parsetab.cpython-36.pyc000064400000015224147511334560016516 0ustar003

��f}�;@s
dZdZdZddddddd	d
ddd
gddddddd	ddddgfddddddd	d
ddd
gddddddd	ddddgfddddddd	d
ddddd
ddgddddddd	ddddddddgfddddd	d
ddd
g	ddddd	ddddg	fddgddgfdddgdddgfdd gd!d"gfddd#d$d%d&d'dgdd(d)d*d+d,d-d(gfdddd.gdd/d/d0gfdd1dd!ddddd(d/d2ddg
d!d!d
dd1dd!dddd3ddg
fdd(d/d4d5d6d7d8d9d:d;d<d=d>d?d@dAdBdCdDdEdFdGdHdIdJdKdLd.dMdNdOdPdQd"dd0dRdSdTdUdVg*dddddd
dWd=d>dXd#d$d%dYdd4d3d5dd6ddZd7d8d9d:d[ddd(dCd2dd&d'dDd/dd?d@dAdBg*fd3ddZddWdd[dgdXdXdXdXd;dd<dgfd3ddZddWdd[dgd#d#d#d#d;dd<dgfd3ddZddWdd[dgd$d$d$d$d;dd<dgfd3ddZddWdd[dgd%d%d%d%d;dd<dgfd3ddZddWdd[dgdYdYdYdYd;dd<dgfd3ddZddWdd[dgd&d&d&d&d;dd<dgfd3ddZddWdd[dgd'd'd'd'd;dd<dgfddddgd d ddgfddddgddddgfdZddWdd[dgdJdOd;dd<dgfdXdYd)d*d+d,d-gd.d.d.d.d.dUdVgfdXdYd)d*d+gdMdMdMdMdMgfd\�ZiZxXej�D]L\ZZx@eeded�D]*\Z	Z
e	ek�r�iee	<e
ee	e<�q�W�q�W[dgdgfdgdgfddgdd
gfddgddgfddgddgfddgd	d	gfdgd1gfddgddQgfdd1dgd2ddgfd2gd4gfddgd5d6gfd3gdZgfd3dZgd7dKgfd3ddZdgd8dBd8dBgfd3ddZdgd9dCd9dCgfd3ddZdgd:dDd:dDgfd3ddZdgd;dEd;dEgfd3ddZdgd<dHd<dHgfd3ddZdgd=dFd=dFgfd3ddZdgd>dId>dIgfdgdgfddgd?dPgfddgd@d@gfddgdAdAgfddgdGdGgfdXdYd)d*d+gdLdNdRdSdTgfd]�ZiZxXej�D]L\ZZx@eeded�D]*\Z	Z
e	ek�r�iee	<e
ee	e<�q�W�q�W[d^d_dd`d`d`fdadbddcdddefdfdbddcdddgfdhdiddjdddkfdldmddndddofdpdmddndddqfdrdmddndddsfdtduddvdddwfdxduddvdddyfdzduddvddd{fd|d}dd~dddfd�d}dd~ddd�fd�d}dd~ddd�fd�d}dd~ddd�fd�d�dd�ddd�fd�d�dd�ddd�fd�d�dd�ddd�fd�d�dd�ddd�fd�d�dd�ddd�fd�d�dd�ddd�fd�d�dd�ddd�fd�d�dd�ddd�fd�d�dd�ddd�fd�d�dd�ddd�fd�d�dd�ddd�fd�d�dd�ddd�fd�d�dd�ddd�fd�d�dd�ddd�fd�d�dd�ddd�fd�d�dd�ddd�fd�d�dd�ddd�fd�d�dd�ddd�fd�d�dd�ddd�fd�d�dd�ddd�fd�d�dd�ddd�fd�d�dd�ddd�fd�d�dd�ddd�fd�d�dd�ddd�fd�d�dd�ddd�fd�d�dd�ddd�fd�d�dd�ddd�fd�d�dd�ddd�fd�d�dd�ddd�fd�d�dd�ddd�fd�d�dd�ddd�fd�d�dd�ddd�fd�d�dd�ddd�fd�d�dd�ddd�fd�d�dd�ddd�fd�d�dd�ddd�fd�d�dd�ddd�fd�d�dd�ddd�fd�d�dd�ddd�fd�d�d�ddd�dfg6Z
d`S(z3.8ZLALRZ C2FE91AF256AF1CCEB499107DA2CFE7E��������
��.�>�������/�M�X�	�;���=��P�
��<��O�(�)�*�,�-��E�F�G�I�J�C��R�������� �!�"�#�$�%�&�0�1�2�3�4�5�6�7�8�9�:�?�@�B�D�H�K�L�N�S�T�U�V�W�A�'�+��Q)ZALGORITHM_POLICYZZONEZPOLICYz$endZALGNAMEZSTRZQSTRINGZKEYTYPEZ
DATESUFFIX�LBRACE�SEMIZCOVERAGEZROLL_PERIODZPRE_PUBLISHZPOST_PUBLISHZKEYTTLZKEY_SIZEZSTANDBYZ	DIRECTORYZ	ALGORITHM�RBRACE�NUMBERZNONE)�
policylist�init�policy�
alg_policy�zone_policy�named_policy�domain�name�
new_policy�alg_option_group�policy_option_group�alg_option_list�
alg_option�coverage_option�rollperiod_option�prepublish_option�postpublish_option�
keyttl_option�keysize_option�standby_option�policy_option_list�
policy_option�
parent_option�directory_option�algorithm_option�durationzS' -> policylistzS'Nzpolicylist -> init policyr^Zp_policylistz	policy.pyizpolicylist -> policylist policyizinit -> <empty>r_Zp_initizpolicy -> alg_policyr`Zp_policyizpolicy -> zone_policyizpolicy -> named_policyizname -> STRreZp_nameizname -> KEYTYPEizname -> DATESUFFIXiz
domain -> STRrdZp_domaini!zdomain -> QSTRINGi"zdomain -> KEYTYPEi#zdomain -> DATESUFFIXi$znew_policy -> <empty>rfZp_new_policyi+zGalg_policy -> ALGORITHM_POLICY ALGNAME new_policy alg_option_group SEMIraZp_alg_policyi/z>zone_policy -> ZONE domain new_policy policy_option_group SEMIrbZ
p_zone_policyi6z?named_policy -> POLICY name new_policy policy_option_group SEMIrcZp_named_policyi=zduration -> NUMBERrwZp_duration_1iCzduration -> NONEZp_duration_2iHzduration -> NUMBER DATESUFFIXZp_duration_3iMz7policy_option_group -> LBRACE policy_option_list RBRACErhZp_policy_option_groupi`z(policy_option_list -> policy_option SEMIrrZp_policy_option_listidz;policy_option_list -> policy_option_list policy_option SEMIiezpolicy_option -> parent_optionrsZp_policy_optioniiz!policy_option -> directory_optionijz policy_option -> coverage_optionikz"policy_option -> rollperiod_optionilz"policy_option -> prepublish_optionimz#policy_option -> postpublish_optioninzpolicy_option -> keysize_optionioz!policy_option -> algorithm_optionipzpolicy_option -> keyttl_optioniqzpolicy_option -> standby_optionirz1alg_option_group -> LBRACE alg_option_list RBRACErgZp_alg_option_groupivz"alg_option_list -> alg_option SEMIriZp_alg_option_listizz2alg_option_list -> alg_option_list alg_option SEMIi{zalg_option -> coverage_optionrjZp_alg_optionizalg_option -> rollperiod_optioni�zalg_option -> prepublish_optioni�z alg_option -> postpublish_optioni�zalg_option -> keyttl_optioni�zalg_option -> keysize_optioni�zalg_option -> standby_optioni�zparent_option -> POLICY namertZp_parent_optioni�z%directory_option -> DIRECTORY QSTRINGruZp_directory_optioni�z$coverage_option -> COVERAGE durationrkZp_coverage_optioni�z1rollperiod_option -> ROLL_PERIOD KEYTYPE durationrlZp_rollperiod_optioni�z1prepublish_option -> PRE_PUBLISH KEYTYPE durationrmZp_prepublish_optioni�z3postpublish_option -> POST_PUBLISH KEYTYPE durationrnZp_postpublish_optioni�z)keysize_option -> KEY_SIZE KEYTYPE NUMBERrpZp_keysize_optioni�z(standby_option -> STANDBY KEYTYPE NUMBERrqZp_standby_optioni�z keyttl_option -> KEYTTL durationroZp_keyttl_optioni�z%algorithm_option -> ALGORITHM ALGNAMErvZp_algorithm_optioni�)Z_tabversionZ
_lr_methodZ
_lr_signatureZ_lr_action_itemsZ
_lr_action�itemsZ_kZ_v�zipZ_xZ_yZ_lr_goto_itemsZ_lr_gotoZ_lr_productions�rzrz�/usr/lib/python3.6/parsetab.py�<module>s��������site-packages/isc/__pycache__/keyzone.cpython-36.opt-1.pyc000064400000002344147511334560017337 0ustar003

��f��@sJddlZddlZddlZddlmZmZGdd�de�ZGdd�d�ZdS)�N)�Popen�PIPEc@seZdZdS)�KeyZoneExceptionN)�__name__�
__module__�__qualname__�rr�/usr/lib/python3.6/keyzone.pyrsrc@seZdZdZdd�ZdS)�keyzonez/reads a zone file to find data relevant to keysc
Cs�d|_d|_|sdS|s8tjj|�s8tj|tj�rDtd��dSd}}t|dd||gt	t	d�j
�\}}xv|j�D]j}t|�t
k	r�|jd�}tjd|�r�qv|j�}	|s�t|	d�|kr�t|	d�}|	dd	krvt|	d�}qvW||_||_dS)
Nz"named-compilezone" not foundz-o�-)�stdout�stderr�asciiz^[:space:]*;��ZDNSKEY)�maxttl�keyttl�os�path�isfile�access�X_OKrrrZcommunicate�
splitlines�type�str�decode�re�search�split�int)
�self�name�filenameZczpathrr�fp�_�lineZfieldsrrr	�__init__s.
zkeyzone.__init__N)rrr�__doc__r&rrrr	r
sr
)	r�sysr�
subprocessrr�	Exceptionrr
rrrr	�<module>s
site-packages/isc/__pycache__/dnskey.cpython-36.opt-1.pyc000064400000032530147511334560017150 0ustar003

��f @�@sJddlZddlZddlZddlmZmZGdd�de�ZGdd�d�ZdS)�N)�Popen�PIPEcseZdZ�fdd�Z�ZS)�TimePastcstt|�jd|||f�dS)Nz'%s time for key %s (%d) is already past)�superr�__init__)�self�key�prop�value)�	__class__��/usr/lib/python3.6/dnskey.pyrszTimePast.__init__)�__name__�
__module__�__qualname__r�
__classcell__rr)rr
rsrc@s�eZdZdZdqZdrZdsZdtd!d"�Zd#d$�Zd%d&�Z	e
dud'd(��Zd)d*�Ze
d+d,��Ze
d-d.��Zdvd/d0�Ze
d1d2��Ze
d3d4��Ze
d5d6��Ze
d7d8��Zd9d:�Zd;d<�Zd=d>�Zd?d@�ZdAdB�ZdCdD�Zej�fdEdF�ZdGdH�Zej�fdIdJ�ZdKdL�Zej�fdMdN�Z dOdP�Z!ej�fdQdR�Z"dSdT�Z#ej�fdUdV�Z$dWdX�Z%ej�fdYdZ�Z&d[d\�Z'ej�fd]d^�Z(d_d`�Z)dadb�Z*dcdd�Z+dedf�Z,dgdh�Z-didj�Z.dwdkdl�Z/dxdmdn�Z0e
dodp��Z1dS)y�dnskeyztAn individual DNSSEC key.  Identified by path, name, algorithm, keyid.
    Contains a dictionary of metadata events.�Created�Publish�Activate�Inactive�Delete�Revoke�	DSPublish�SyncPublish�
SyncDeleteN�-P�-A�-I�-D�-R�-Psync�-Dsync�RSAMD5�DH�DSA�ECC�RSASHA1�NSEC3DSA�NSEC3RSASHA1�	RSASHA256�	RSASHA512�ECCGOST�ECDSAP256SHA256�ECDSAP384SHA384�ED25519�ED448cCs�t|t�r:t|�dkr:|pd|_|\}}}|j||||�|pLtjj|�pLd|_tjj|�}|j	d�\}}}|dd�}t
|�}t
|j	d�d�}|j||||�dS)N��.�+�r���)�
isinstance�tuple�len�_dir�	fromtuple�os�path�dirname�basename�split�int)rrZ	directory�keyttl�name�alg�keyidrrr
r&s

zdnskey.__init__cs�|jd�r|}|jd�}n|d}d|||f}|j|jr@tjpBd|d}|j|jr^tjp`d|d}||_||_t|�|_t|�|_	||_
t|d�}	x�|	D]z��ddkr�q��j�}
|
s�q�|
d	j
�dkr�d
}||_nd}|s�t|
d	�n||_t|
|�d	@d	k�rd|_q�d|_q�W|	j�t|d�}t�|_t�|_t�|_t�|_t�|_t�|_t�|_d|_x�|D]���j����sv�ddk�r��qv�fdd�dD�t��g}
tdd�|
D��}�d|�j�}�|d�jd�j�}||j|<�qvWx�tjD]�}d|j|<||jk�rn|j|j|�}||j|<|j |�|j|<|j!|�|j|<|j||j|<n(d|j|<d|j|<d|j|<d|j|<�qW|j�dS)Nr2z
K%s+%03d+%05d�z.keyz.private�rr�;r4�in�ch�hsr1�TFZrUz!#csg|]}�j|��qSr)�find)�.0�c)�linerr
�
<listcomp>lsz$dnskey.fromtuple.<locals>.<listcomp>z:= cSsg|]}|dkr|�qS)r4r5r)rM�posrrr
rPms)rHrIrJ)"�endswith�rstripr9r;�sep�keystrrBr@rCrD�fullname�openr?�lower�ttl�close�dictZmetadata�_changed�_delete�_times�_fmttime�_timestamps�	_original�_origttl�stripr8�min�lstripr�_PROPS�	parsetime�
formattime�
epochfromtime)rrBrCrDrArVrUZkey_fileZprivate_fileZkfp�tokensZseptokenZpfpZpunctuation�foundr
r	�tr)rOr
r:5sv












zdnskey.fromtuplecKsr|jdd�}g}d}|jdk	r0|dt|j�g7}xlttjtj�D]Z\}}|s@|j|r\q@d}||j	krx|j	|rxd}|r�dn|j
|}	|||	g7}d}q@W|�rn|d|jg||jg}
|s�t
ddj|
��y0t|
ttd	�}|j�\}}
|
�rtt|
���Wn8tk
�r:}ztd
|t|�f��WYdd}~XnXd|_x*tjD] }|j||j|<d|j|<�qJWdS)N�quietFTz-LZnonez-Kz# � )�stdout�stderrzunable to run %s: %s)�getrb�strrY�ziprrf�_OPTSr\r]r_r9rU�print�joinrr�communicate�	Exceptionr`ra)rZsettime_bin�kwargsrm�cmd�firstr	�opt�deleteZwhenZfullcmd�prorp�errr
�commit�s<
"z
dnskey.commitcKsL|jdd�}|dd|dt|�g}
|r0|
d|g7}
|r>|
jd�|rN|
d|g7}
|rb|
d	t|�g7}
|	r�tj|	�}|
d
tj|�g7}
|
r�tj|
�}|
dtj|
�g7}
|
j|�|s�tdd
j|
��t|
t	t	d�}|j
�\}}|r�tdt|���y"|j�dj
d�}t|||�}|Stk
�rF}ztdt|���WYdd}~XnXdS)NrmFz-qz-Kz-Lz-rz-fkz-az-bz-Pz-Az# rn)rorpzunable to generate key: r�asciiz!unable to parse generated key: %s)rqrr�appendr�
timefromepochrhrurvrrrwrx�
splitlines�decode)�cls�
keygen_bin�	randomdevZkeys_dirrBrCZkeysizerTrY�publish�activateryrm�
keygen_cmdrlr~rorprU�newkeyrrrr
�generate�s:



zdnskey.generatec
Ks�|jdd�}|j�s td|��|dd|jd|jg}|jrL|dt|j�g7}|r\|d|g7}|rp|d	t|�g7}|s�td
dj|��t	|t
t
d�}|j�\}}	|	r�td
|	��y&|j�dj
d�}
t|
|j|j�}|Std|��YnXdS)NrmFz'predecessor key %s has no inactive datez-qz-Kz-Sz-Lz-rz-iz# rn)rorpzunable to generate key: rr�z'unable to generate successor for key %s)rq�inactiverxr9rUrYrrrurvrrrwr�r�r)rr�r�Z
prepublishryrmr�r~rorprUr�rrr
�generate_successor�s,zdnskey.generate_successorcCs0d}|tttj��kr tj|}|r(|Sd|S)Nz%03d)�ranger8r�	_ALGNAMES)rCrBrrr
�algstr�s
z
dnskey.algstrcCs6|sdS|j�}ytjj|�Stk
r0dSXdS)N)�upperrr��index�
ValueError)rCrrr
�algnum�sz
dnskey.algnumcCs|j|p|j�S)N)r�rC)rrCrrr
�algnameszdnskey.algnamecCs
tj|�S)N)�timeZgmtime)�secsrrr
r�szdnskey.timefromepochcCstj|d�S)Nz%Y%m%d%H%M%S)r�Zstrptime)�stringrrr
rgszdnskey.parsetimecCs
tj|�S)N)�calendarZtimegm)rlrrr
riszdnskey.epochfromtimecCstjd|�S)Nz%Y%m%d%H%M%S)r�Zstrftime)rlrrr
rhszdnskey.formattimecKs�|jdd�}|j||krdS|j|dk	rR|j||krR|rRt|||j|��|dkr�|j|dkrldnd|j|<d|j|<d|j|<d|j|<d|j|<dS|j|�}||j|<||j|<|j	|�|j|<|j||j|kr�dnd|j|<dS)N�forceFT)
rqr`rarr\r]r^r_r�rh)rr	r��nowryr�rlrrr
�setmetas$






zdnskey.setmetacCs
|j|S)N)r^)rr	rrr
�gettime2szdnskey.gettimecCs
|j|S)N)r_)rr	rrr
�
getfmttime5szdnskey.getfmttimecCs
|j|S)N)r`)rr	rrr
�gettimestamp8szdnskey.gettimestampcCs
|jdS)Nr)r`)rrrr
�created;szdnskey.createdcCs
|jdS)Nr)r`)rrrr
�syncpublish>szdnskey.syncpublishcKs|jd||f|�dS)Nr)r�)rr�r�ryrrr
�setsyncpublishAszdnskey.setsyncpublishcCs
|jdS)Nr)r`)rrrr
r�Dszdnskey.publishcKs|jd||f|�dS)Nr)r�)rr�r�ryrrr
�
setpublishGszdnskey.setpublishcCs
|jdS)Nr)r`)rrrr
r�Jszdnskey.activatecKs|jd||f|�dS)Nr)r�)rr�r�ryrrr
�setactivateMszdnskey.setactivatecCs
|jdS)Nr)r`)rrrr
�revokePsz
dnskey.revokecKs|jd||f|�dS)Nr)r�)rr�r�ryrrr
�	setrevokeSszdnskey.setrevokecCs
|jdS)Nr)r`)rrrr
r�Vszdnskey.inactivecKs|jd||f|�dS)Nr)r�)rr�r�ryrrr
�setinactiveYszdnskey.setinactivecCs
|jdS)Nr)r`)rrrr
r}\sz
dnskey.deletecKs|jd||f|�dS)Nr)r�)rr�r�ryrrr
�	setdelete_szdnskey.setdeletecCs
|jdS)Nr)r`)rrrr
�
syncdeletebszdnskey.syncdeletecKs|jd||f|�dS)Nr)r�)rr�r�ryrrr
�
setsyncdeleteeszdnskey.setsyncdeletecCsR|dks|j|krdS|jdkr0|j|_||_n|j|krHd|_||_n||_dS)N)rYrb)rrYrrr
�setttlhs

z
dnskey.setttlcCs|jr
dSdS)N�KSK�ZSK)rT)rrrr
�keytypetszdnskey.keytypecCsd|j|j�|jfS)Nz
%s/%s/%05d)rBr�rD)rrrr
�__str__wszdnskey.__str__cCs"d|j|j�|j|jrdndfS)Nz%s/%s/%05d (%s)r�r�)rBr�rDrT)rrrr
�__repr__{szdnskey.__repr__cCs|j�p|j�p|j�S)N)r�r�r�)rrrr
�date�szdnskey.datecCs@|j|jkr|j|jkS|j|jkr0|j|jkS|j�|j�kS)N)rBrCr�)r�otherrrr
�__lt__�s
z
dnskey.__lt__cCs�dd�}|s|}ttj��}|j�}|j�}|s4dS|sT||krP|dt|��dS||krh||krhdS||kr�|dt|�tj|j�p�df�dS||kr�|dt|��dS|jdk	r�|||jkr�|d	t|�tj|j�p�df�dSdS)
Nc_sdS)Nr)�argsryrrr
�noop�sz!dnskey.check_prepub.<locals>.noopFzFWARNING: Key %s is scheduled for
	 activation but not for publication.Tz�WARNING: %s is scheduled to be
	 published and activated at the same time. This
	 could result in a coverage gap if the zone was
	 previously signed. Activation should be at least
	 %s after publication.zone DNSKEY TTLz0WARNING: Key %s is active before it is publishedz�WARNING: Key %s is activated too soon
	 after publication; this could result in coverage 
	 gaps due to resolver caches containing old data.
	 Activation should be at least %s after
	 publication.)r@r�r�r��reprr�durationrY)r�outputr�r��ar~rrr
�check_prepub�s<zdnskey.check_prepubcCs�dd�}|dkr|}|dkr"|j}|dkr>|dt|��d}tj�}|j�}|j�}|s^dS|s~||krz|dt|��dS||kr�||kr�dS||kr�|d	t|��dS|||kr�|d
t|�tj|�f�dSdS)
Nc_sdS)Nr)r�ryrrr
r��sz"dnskey.check_postpub.<locals>.noopz"WARNING: Key %s using default TTL.�<�FzEWARNING: Key %s is scheduled for
	 deletion but not for inactivation.Tz@WARNING: Key %s is scheduled for
	 deletion before inactivation.z�WARNING: Key %s scheduled for
	 deletion too soon after deactivation; this may 
	 result in coverage gaps due to resolver caches
	 containing old data.  Deletion should be at least
	 %s after inactivation.ii�Q)rYr�r�r}r�rr�)rr�Ztimespanr�r��d�irrr
�
check_postpub�s:zdnskey.check_postpubcCsz|sdSddddddg}g}xR|D]J}||d||d}}|dkr"|jd
||d|dkrbdndf�q"Wdj|�S) N�yearr�r�im�month��day�hour�minute�secondr4rz%d %s%s�srEz, ii�Q�3�)r�r�ii�Q��')r�r�i�Q)r�r��)r�r�)r�r�)r�r4)r�rv)r�Zunitsr�Zunit�vrrr
r��s
(zdnskey.duration)	rrrrrrrrr)	Nrrrrr Nr!r")Nr#r$r%r&r'r(r)r*Nr+Nr,r-r.r/r0)NN)NN)N)N)NN)2rrr�__doc__rfrtr�rr:r��classmethodr�r��staticmethodr�r�r�r�rgrirhr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r}r�r�r�r�r�r�r�r�r�r�r�r�rrrr
rsb
M%* 


1
-r)	r;r�r��
subprocessrrrxrrrrrr
�<module>s
site-packages/isc/__pycache__/keymgr.cpython-36.pyc000064400000010165147511334560016212 0ustar003

��fk�@s�ddlmZddlZddlZddlZddlZddlZddlZddlZddl	Z	ddl
mZdZddl
mZmZmZmZmZmZdd�Zddd	�Zd
d�Zdd
�ZdS)�)�print_functionN)�defaultdictz
dnssec-keymgr)�dnskey�keydict�	keyseries�policy�parsetab�utilscOst||�tjd�dS)N�)�print�sys�exit)�args�kwargs�r�/usr/lib/python3.6/keymgr.py�fatals
rcCs�|}|s(tjj|�s(tj|tj�r�tjd}|s>tjj}xB|jtj�D]2}|tj	|}tjj|�rztj|tj�rzPd}qLW|S)a2 find the location of a specified command. If a default is supplied,
    exists and it's an executable, we use it; otherwise we search PATH
    for an alternative.
    :param command: command to look for
    :param default: default value to use
    :return: PATH with the location of a suitable binary
    �PATHN)
�os�path�isfile�access�X_OK�environ�defpath�split�pathsep�sep)Zcommand�defaultZfpathrZ	directoryrrr�set_paths$
rcCs�tdtjjtjd�d��}tdtjjtjd�d��}tjtdd�}|j	dt
ddd;d�|j	dd
t
ddd�|j	ddt
ddd�|j	dd|t
dd
d�|j	ddt
ddd
d�|j	dd|t
dd
d�|j	d d!d"d#d$d%�|j	d&d'd"d#d(d%�|j	d)d*d+d"d#d<d%�|j	d.d/d0d"d#d1d%�|j	d2d3d4tjd5�|j�}|j
�rJ|j�rJtd6�|jdk�r^td7�|jdk�rrtd8�|jdk	�r�tjj|j��s�td9|j�n(tjjtjd:�|_tjj|j��s�d|_|S)=zc Read command line arguments, returns 'args' object
    :return: args object properly prepared
    z
dnssec-keygenZsbinzdnssec-settimezA: schedule DNSSEC key rollovers according to a pre-defined policy)�description�zone�*Nz.Zone(s) to which the policy should be applied z%(default: all zones in the directory))�type�nargsr�helpz-KrzDirectory containing keys�dir)�destr#r%�metavarz-c�
policyfilezPolicy definition file�filez-g�keygenzPath to 'dnssec-keygen')r'rr#r%r(z-r�	randomdevz@Path to a file containing random data to pass to 'dnssec-keygen')r'r#rr%r(z-s�settimezPath to 'dnssec-settime'z-k�no_zsk�
store_trueFz,Only apply policy to key-signing keys (KSKs))r'�actionrr%z-z�no_kskz-Only apply policy to zone-signing keys (ZSKs)z-fz--force�forcezForce updates to key events zeven if they are in the pastz-qz--quiet�quietzUpdate keys silentlyz-vz	--version�version)r0r4z)ERROR: -z and -k cannot be used together.zERROR: dnssec-keygen not foundzERROR: dnssec-settime not foundz!ERROR: Policy file "%s" not foundzdnssec-policy.confzSZone(s) to which the policy should be applied (default: all zones in the directory)z8Force updates to key events even if they are in the past)rrr�joinr	�prefix�argparse�ArgumentParser�prog�add_argument�strr4�
parse_argsr.r1rr+r-r)�existsZ
sysconfdir)r+r-�parserrrrrr<6sb







r<c:CsHt�}|j|j|j|jd�}ytj|j�}Wn2tk
r^}zt	dt
|��WYdd}~XnXyt||j|jd�}Wn2tk
r�}zt	dt
|��WYdd}~XnXyt
||d�}Wn2tk
r�}zt	dt
|��WYdd}~XnXy |j||j|j|j|jd�Wn4tk
�rB}zt	dt
|��WYdd}~XnXdS)	N)Zkeygen_pathZsettime_pathZ	keys_pathr,zUnable to load DNSSEC policy: )rZzonesz Unable to build key dictionary: )�contextzUnable to build key series: )ZkskZzskr2r3zUnable to apply policy: )r<r+r-rr,rZ
dnssec_policyr)�	Exceptionrr;rr!rZenforce_policyr.r1r2r3)rr?Zdp�eZkdZksrrr�main}s,
"""rB)N)Z
__future__rrrr7Zglob�reZtimeZcalendar�pprint�collectionsrr9Ziscrrrrrr	rrr<rBrrrr�<module>s@ 
Gsite-packages/isc/__pycache__/keydict.cpython-36.opt-1.pyc000064400000005013147511334560017303 0ustar003

��f!�@s:ddlmZddlmZddlZddlZGdd�d�ZdS)�)�defaultdict�)�dnskeyNc@sneZdZdZedd��ZdZgZddd�Zdd�Z	d	d
�Z
dd�Zd
d�Zdd�Z
dd�Zdd�Zdd�ZdS)�keydictz> A dictionary of keys, indexed by name, algorithm, and key id cCstt�S)N)r�dict�rr�/usr/lib/python3.6/keydict.py�<lambda>szkeydict.<lambda>NcKs�|jdd�|_|jdd�}|s:|jdd�p,d}|j|�nXxV|D]N}d|krb|ddk	rb|d}n|rr|j|�jptd}|j||�s@|jj|�q@WdS)NZkeyttl�zones�path�.)�get�_defttl�readallZpolicyZ	directory�readone�_missing�append)�selfZdp�kwargsr
r�zonerrr�__init__s

zkeydict.__init__cCsLtjtjj|d��}x2|D]*}t|||j�}||j|j|j|j	<qWdS)Nz	*.private)
�glob�osr�joinrr�_keydict�name�alg�keyid)rr�files�infile�keyrrrr,s
zkeydict.readallc	Cs�|jd�s|d7}d|d}tjtjj||��}d}xR|D]J}t|||j�}|j|krZq<|dkrh|jnd}||j	||j
|j<d}q<W|S)Nr�Kz
+*.privateFT)�endswithrrrrrr�fullnamerrrr)	rrr�matchr�foundrr Zkeynamerrrr3s


zkeydict.readoneccsJxD|jj�D]6\}}x,|j�D] \}}x|j�D]
}|Vq0WqWqWdS)N)r�items�values)rr�
algorithmsr�keysr rrr�__iter__Dszkeydict.__iter__cCs
|j|S)N)r)rrrrr�__getitem__Jszkeydict.__getitem__cCs
|jj�S)N)rr))rrrrr
Msz
keydict.zonescCs|j|j�S)N)rr))rrrrrr(Pszkeydict.algorithmscCs|j||j�S)N)rr))rrrrrrr)Sszkeydict.keyscCs|jS)N)r)rrrr�missingVszkeydict.missing)N)�__name__�
__module__�__qualname__�__doc__rrrrrrrr*r+r
r(r)r,rrrrrs
r)�collectionsr�rrrrrrrr�<module>ssite-packages/isc/__pycache__/rndc.cpython-36.pyc000064400000012000147511334560015630 0ustar003

��f+�@sXddlmZddlZddlZddlZddlZddlZddlZddlZGdd�de	�Z
dS)�)�OrderedDictNc@sleZdZdZddddddd�Zd	d
�Zdd�Zddd�Zdd�Zdd�Z	dd�Z
dd�Zdd�Zdd�Z
dS)�rndczRNDC protocol client library������)�md5Zsha1Zsha224Zsha256Zsha384Zsha512cCsb||_|j�}|jd�r$|dd�}||_tt|�|_tj|�|_	t
jdd�|_d|_
|j�dS)z�Creates a persistent connection to RNDC and logs in
        host - (ip, port) tuple
        algo - HMAC algorithm: one of md5, sha1, sha224, sha256, sha384, sha512
               (with optional prefix 'hmac-')
        secret - HMAC secret, base64 encodedzhmac-�Nr��i)�host�lower�
startswith�algo�getattr�hashlib�hlalgo�base64�	b64decode�secret�randomZrandint�ser�nonce�_rndc__connect_login)�selfrrr�r�/usr/lib/python3.6/rndc.py�__init__$s
z
rndc.__init__cCst|j|d�d�S)z�Call a RNDC command, all parsing is done on the server side
        cmd - a complete string with a command (eg 'reload zone example.com')
        )�type�_data)�dict�_rndc__command)r�cmdrrr�call5sz	rndc.callFcCst�}�x|j�D]�\}}|r(|dkr(q|tjdt|��|jd�7}t|�tkrt|tjddt|��|jd�7}qt|�tkr�|tjddt|��|7}qt|�tkr�|tjddt|��|7}qt|�t	kr�|j
|�}|tjddt|��|7}qtdt|���qW|S)N�_auth�B�asciiz>BIr�z#Cannot serialize element of type %s)�	bytearray�items�struct�pack�len�encoder �str�bytesr�_rndc__serialize_dict�NotImplementedError)r�data�ignore_auth�rv�k�vZsdrrrZ__serialize_dict;s""
zrndc.__serialize_dictc	Os,|jd7_ttj��}t||�}t�}t�|d<t�|d<t|j�|dd<t|�|dd<t|d�|dd<|jdk	r�|j|dd<||d	<|j|d
d�}tj|j	||j
�j�}tj
|�}|jdkr�tjd
|�|dd<n"ttjd|j|j|��|dd<|j|�}tjdt|�dd�|}|S)Nrr&�_ctrlZ_serZ_tim�<Z_exp�_noncer!T)r5r
Z22s�hmd5ZB88s�hshaz>II�)r�int�timerr0rr2�hmac�newrr�digestrZ	b64encoderr,r-r*�_rndc__algosr.)	r�args�kwargsZnowr4�d�msg�hash�bhashrrrZ__prep_messageOs,






zrndc.__prep_messagecCs�|jdk	r |dd|jkr dS|jdkr8|dd}n|dddd�}t|�tkrb|jd	�}|d
dt|�d7}tj|�}|j|dd
�}t	j
|j||j�j
�}||kS)Nr9r;Fr
r&r<r=rr(�=r>T)r5)rrr r1�decoder.rrr2rArBrrrC)rrHrJZremote_hashZmy_msgZmy_hashrrrZ__verify_msgjs


zrndc.__verify_msgc	Os�|j||�}|jj|�}|t|�kr,td��|jjd�}t|�dkrLtd��tjd|�\}}|dkrptd|��|d8}|jj|tj	�}t|�|kr�td��t
|�tkr�t|�}|j
|�}|j|�s�td	��|S)
NzCannot send the message�zCan't read response headerz>IIrzWrong message version %dr>zCan't read response datazAuthentication failure)�_rndc__prep_message�socket�sendr.�IOErrorZrecvr,�unpackr3ZMSG_WAITALLr r0r*�_rndc__parse_message�_rndc__verify_msg)	rrErFrHZsent�headerZlength�versionr4rrrZ	__commandys(

zrndc.__commandcCs2tj|j�|_d|_|jdd�}|dd|_dS)NZnull)r r9r;)rOZcreate_connectionrrr#)rrHrrrZ__connect_login�szrndc.__connect_logincCs�d}||}|d7}||||�jd�}||7}||}|d7}tjd|||d��d}|d7}||||�}||7}||d�}|dkr�|||fS|dkr�t�}	x(t|�dkr�|j|�\}
}}||	|
<q�W||	|fStd|��dS)Nrrr(z>Ir>r)zUnknown element type %d)rLr,rRrr.�_rndc__parse_elementr3)r�input�posZlabellen�labelr Zdatalenr4�restrGZilabel�valuerrrZ__parse_element�s*

zrndc.__parse_elementcCs8t�}d}x(t|�dkr2|j|�\}}}|||<qW|S)Nr)rr.rW)rrXr6ZhdatarZr\rrrZ__parse_message�szrndc.__parse_messageN)F)�__name__�
__module__�__qualname__�__doc__rDrr%r2rNrTr#rrWrSrrrrrs 
r)�collectionsrr@r,rrArrrO�objectrrrrr�<module>ssite-packages/isc/__pycache__/coverage.cpython-36.pyc000064400000014323147511334560016507 0ustar003

��f�&�@s�ddlmZddlZddlZddlZddlZddlZddlZddlZddl	Z	ddl
mZdZddl
mZmZmZmZmZmZdd�Zdad	d
�Zdd�Zd
d�Zddd�Zdd�Zdd�ZdS)�)�print_functionN)�defaultdictzdnssec-coverage)�dnskey�	eventlist�keydict�keyevent�keyzone�utilscOst||�tjd�dS)N�)�print�sys�exit)�args�kwargs�r�/usr/lib/python3.6/coverage.py�fatals
rTcOsJd|kr|d}|jdd�nd}tr,dan|r8td�|rFt||�dS)zuoutput text, adding a vertical space this is *not* the first
    first section being printed since a call to vreset()�skipNTF�)�pop�
_firstliner)rrrrrr�output'srcCsdadS)zreset vertical spacingTN)rrrrr�vreset8srcCs�|j�}yt|�Stk
r$YnXtjd�}|j|�}|sJtd|��|j�\}}t|�}|j�}|jd�rx|dS|jd�r�|dS|jd�r�|dS|jd	�r�|d
S|jd�r�|dS|jd
�r�|dS|jd�r�|Std|��dS)z� convert a formatted time (e.g., 1y, 6mo, 15mi, etc) into seconds
    :param s: String with some text representing a time interval
    :return: Integer with the number of seconds in the time interval
    z([0-9][0-9]*)\s*([A-Za-z]*)zCannot parse %s�yi�3��moi�'�wi�:	�di�Q�hiZmi�<�szInvalid suffix %sN)	�strip�int�
ValueError�re�compile�match�groups�lower�
startswith)r�r�m�nZunitrrr�
parse_timeAs6








r,cCs�|}|s(tjj|�s(tj|tj�r�tjd}|s>tjj}xB|jtj�D]2}tjj	||�}tjj|�rztj|tj�rzPd}qLW|S)a1 find the location of a specified command.  if a default is supplied
    and it works, we use it; otherwise we search PATH for a match.
    :param command: string with a command to look for in the path
    :param default: default location to use
    :return: detected location for the desired command
    �PATHN)
�os�path�isfile�access�X_OK�environ�defpath�split�pathsep�join)Zcommand�defaultZfpathr/Z	directoryrrr�set_pathks$
r9c	0CsDtdtjjtjd�d��}tjtddd�}|j	dt
dddFd�|j	dd
dt
ddd�|j	ddt
ddd�|j	ddt
ddd�|j	ddt
ddd�|j	ddd t
d!dd�|j	d"d#|t
d$d
d�|j	d%d&t
d'd(dd)�|j	d*d+d,d-d.d/�|j	d0d1d,d-d2d/�|j	d3d4d5d,d-d6d/�|j	d7d8d9tjd:�|j�}|j
�rJ|j�rJtd;�n*|j
�sZ|j�rn|j
�rfd<nd=|_nd|_|j�r�t|j�d>k�r�td?�d@dA�|jD�|_y|j�r�t|j�}||_Wntk
�r�YnXy|j�r�t|j�}||_Wntk
�rYnXy|j�r(t|j�}||_Wntk
�r@YnXy<|j�r||j}t|j�}|dBk�rnd|_ntj�||_Wntk
�r�YnX|j�r�|j�r�|S|j�r*|j�r*y:t|jdB|j|j�}|j�p�|j|_|j�p�|j|_Wn4tk
�r(}ztdC|j|�WYdd}~XnX|j�s@tdD�dE|_|S)Gz8Read command line arguments, set global 'args' structureznamed-compilezoneZsbinz: checks future zDNSKEY coverage for a zone)�description�zone�*Nzzone(s) to checkz%(default: all zones in the directory))�type�nargsr8�helpz-Kr/�.z&a directory containing keys to process�dir)�destr8r=r?�metavarz-f�filenamezzone master file�file)rBr=r?rCz-m�maxttlzthe longest TTL in the zone(s)�timez-d�keyttlzthe DNSKEY TTLz-r�resignZ1944000z:the RRSIG refresh interval in seconds [default: 22.5 days]z-c�compilezonezpath to 'named-compilezone'z-l�
checklimit�0zDLength of time to check for DNSSEC coverage [default: 0 (unlimited)])rBr=r8r?rCz-z�no_ksk�
store_trueFz#Only check zone-signing keys (ZSKs))rB�actionr8r?z-k�no_zskz"Only check key-signing keys (KSKs)z-Dz--debugZ
debug_modezTurn on debugging outputz-vz	--version�version)rOrQz)ERROR: -z and -k cannot be used together.ZKSKZZSKr
z)ERROR: -f can only be used with one zone.cSs4g|],}t|�dkr,|ddkr,|dd�n|�qS)r
r@N���rR)�len)�.0�xrrr�
<listcomp>�szparse_args.<locals>.<listcomp>rz"Unable to load zone data from %s: z�WARNING: Maximum TTL value was not specified.  Using 1 week
	 (604800 seconds); re-run with the -m option to get more
	 accurate results.i�:	z5zone(s) to check(default: all zones in the directory)) r9r.r/r7r	�prefix�argparse�ArgumentParser�prog�add_argument�strrQ�
parse_argsrPrMr�keytyperDrSr;rFr,r"rHrIrKrGrrJ�	Exceptionrr)	rJ�parserrr*�kr)Zlimr;�errrr]�s�



















"r]c(Cspt�}td�yt|j|j|jd�}Wn2tk
rX}ztdt|��WYdd}~XnXx<|D]4}|j	t
�|jr�|jt
�q`|jt
|j
|j�q`Wt
d�t�yt|�}Wn2tk
r�}ztdt|��WYdd}~XnXd}|j�s|jd|j|jt
��sXd}nJxH|jD]>}y|j||j|jt
��s6d}Wnt
d|�YnX�qWtj|�rfd	nd
�dS)Nz;PHASE 1--Loading keys to check for internal timing problems)r/ZzonesrHz'ERROR: Unable to build key dictionary: z9PHASE 2--Scanning future key events for coverage failuresz#ERROR: Unable to build event list: FTz&ERROR: Coverage check failed for zone r
r)r]rrr/r;rHr_rr\Zcheck_prepubr�sepZ
check_postpubrFrIrrZcoverager^rKrr
)rZkdrb�keyZelist�errorsr;rrr�main�s:"

"
rf)N)Z
__future__rr.rrXZglobr#rGZcalendar�pprint�collectionsrrZZiscrrrrrr	rrrrr,r9r]rfrrrr�<module>s& 	*
site-packages/isc/__pycache__/keyevent.cpython-36.opt-1.pyc
[compiled CPython 3.6 bytecode; binary data omitted. Recoverable docstring: "A discrete key event, e.g., Publish, Activate, Inactive, Delete, etc. Stores the date of the event, and identifying information about the key to which the event will occur."]

site-packages/isc/__pycache__/keymgr.cpython-36.opt-1.pyc
[compiled CPython 3.6 bytecode for dnssec-keymgr ("schedule DNSSEC key rollovers according to a pre-defined policy"); binary data omitted. Recoverable pieces include the set_path() helper ("find the location of a specified command...") and the command-line options: -K key directory, -c policy definition file, -g path to 'dnssec-keygen', -s path to 'dnssec-settime', -k/-z key-type filters, -f/--force, -q/--quiet, -v/--version.]
site-packages/isc/__pycache__/keydict.cpython-36.pyc
[compiled CPython 3.6 bytecode; binary data omitted. Recoverable docstring: "A dictionary of keys, indexed by name, algorithm, and key id".]

site-packages/isc/__pycache__/eventlist.cpython-36.pyc
[compiled CPython 3.6 bytecode; binary data omitted. Implements the eventlist class, which orders keyevent objects (SyncPublish, Publish, SyncDelete, Activate, Inactive, Delete) per zone and algorithm and reports coverage failures such as "ERROR: No key events found" and "ERROR: No %s's are both active and published after this event".]

site-packages/isc/__pycache__/utils.cpython-36.opt-1.pyc
[compiled CPython 3.6 bytecode; binary data omitted. Provides prefix(), shellquote(), version ("9.11.36-RedHat-9.11.36-16.el8_10.2") and sysconfdir ("/etc").]
site-packages/isc/__pycache__/utils.cpython-36.pyc
[compiled CPython 3.6 bytecode; binary data omitted. Non-optimized duplicate of the utils module listed above.]
site-packages/isc/__pycache__/keyzone.cpython-36.pyc
[compiled CPython 3.6 bytecode; binary data omitted. Recoverable docstring: "reads a zone file to find data relevant to keys"; the class shells out to named-compilezone and records the zone's maximum TTL and the DNSKEY TTL.]
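A minimal usage sketch for the keyzone class recovered above; the zone name, master-file path and named-compilezone path are placeholders, not values taken from this dump:

# Sketch only: keyzone(name, filename, czpath) shells out to named-compilezone
# and records the largest TTL in the zone plus the DNSKEY TTL (see bytecode above).
from isc.keyzone import keyzone

z = keyzone("example.com",                    # zone name (placeholder)
            "/var/named/example.com.db",      # zone master file (placeholder)
            "/usr/sbin/named-compilezone")    # compilezone binary (placeholder)
print(z.maxttl, z.keyttl)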
site-packages/isc/__pycache__/rndc.cpython-36.opt-1.pyc
[compiled CPython 3.6 bytecode; binary data omitted. Recoverable docstring: "RNDC protocol client library". The constructor docstring reads "Creates a persistent connection to RNDC and logs in; host - (ip, port) tuple; algo - HMAC algorithm: one of md5, sha1, sha224, sha256, sha384, sha512 (with optional prefix 'hmac-'); secret - HMAC secret, base64 encoded", and call() takes "a complete string with a command (eg 'reload zone example.com')".]
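The docstrings still legible in the rndc bytecode describe a small client API; a hedged sketch of driving it, with the control-channel address and the base64 secret as obvious placeholders:

# Sketch only: rndc(host, algo, secret) opens a persistent control connection,
# and call() sends one complete command string (parsing is done server side).
from isc.rndc import rndc

r = rndc(("127.0.0.1", 953),            # (ip, port) tuple
         "hmac-sha256",                 # HMAC algorithm; "hmac-" prefix is optional
         "bm90LWEtcmVhbC1zZWNyZXQ=")    # base64 HMAC secret (placeholder)
print(r.call("status"))
print(r.call("reload zone example.com"))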
site-packages/isc/__pycache__/dnskey.cpython-36.pyc
[compiled CPython 3.6 bytecode; binary data omitted. Recoverable docstring: "An individual DNSSEC key. Identified by path, name, algorithm, keyid. Contains a dictionary of metadata events." The class tracks the Created, Publish, Activate, Inactive, Delete, Revoke, DSPublish, SyncPublish and SyncDelete timing events, can shell out to dnssec-keygen/dnssec-settime, and emits pre-/post-publication warnings such as "Key %s is active before it is published".]
site-packages/isc/__pycache__/keyseries.cpython-36.pyc
[compiled CPython 3.6 bytecode; binary data omitted. Implements the keyseries class, whose fixseries()/enforce_policy() methods adjust publish/activate/inactive/delete times and generate successor keys so that each zone and algorithm stays covered under the configured policy.]

site-packages/isc/__pycache__/parsetab.cpython-36.opt-1.pyc
[compiled CPython 3.6 bytecode; binary data omitted. PLY-generated LALR parse tables (table version 3.8) for the dnssec-policy grammar: POLICY, ALGORITHM_POLICY, ZONE, DIRECTORY, KEYTTL, KEY_SIZE, ROLL_PERIOD, PRE_PUBLISH, POST_PUBLISH, COVERAGE, STANDBY, ALGORITHM and related productions.]

site-packages/isc/__pycache__/__init__.cpython-36.pyc
[compiled CPython 3.6 bytecode; binary data omitted. Defines __all__ = checkds, coverage, keymgr, dnskey, eventlist, keydict, keyevent, keyseries, keyzone, policy, parsetab, rndc, utils and re-exports those submodules.]

site-packages/isc/__pycache__/policy.cpython-36.pyc
[compiled CPython 3.6 bytecode; binary data omitted. Contains the PolicyLex tokenizer, the Policy container (directory, algorithm, coverage, ksk/zsk key sizes, roll/pre-publish/post-publish periods, standby counts, keyttl) with its validate() checks, and the dnssec_policy parser that loads a policy file and resolves the effective policy for a zone.]

site-packages/isc/__pycache__/eventlist.cpython-36.opt-1.pyc
site-packages/isc/__pycache__/keyseries.cpython-36.opt-1.pyc
site-packages/isc/__pycache__/coverage.cpython-36.opt-1.pyc
[compiled CPython 3.6 bytecode; binary data omitted. Optimized (.opt-1) duplicates of the eventlist, keyseries and coverage modules already listed above.]
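And a sketch of the policy API: dnssec_policy loads a policy definition file and policy() resolves the effective settings for one zone; the file path and zone name are placeholders:

# Sketch only: dnssec_policy(filename) parses a dnssec-keymgr policy file;
# policy(zone) merges zone, named and algorithm policies into one Policy object.
from isc.policy import dnssec_policy

dp = dnssec_policy("/etc/dnssec-policy.conf")    # placeholder path
pol = dp.policy("example.com")                   # placeholder zone
print(pol.algorithm, pol.keyttl, pol.zsk_rollperiod)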
site-packages/isc/__pycache__/policy.cpython-36.opt-1.pyc
[compiled CPython 3.6 bytecode; binary data omitted. Optimized (.opt-1) duplicate of the policy module listed above.]
|jdk�r6|j�p�|jd}x|j�r|j�r|j}�qW|�r.|j�p2|j|_|jdk�r�|j�pR|jd}x|j�rr|j�rr|j}�qVW|�r�|j�p�|j|_|jdk�r�|j�p�|jd}x|j�r�|j�r�|j}�q�W|�r�|j�p�|j|_|jdk�r2|j�p�|jd}x|j�r|j�r|j}�q�W|�r*|j�p.|j|_|jdk�r�|j�pN|jd}x|j�rn|j�rn|j}�qRW|�r~|j�p�|j|_|jdk�r�|j�p�|jd}x|j�r�|j�r�|j}�q�W|�r�|j�p�|j|_|jdk�r(|j�p�|jd}x |dk	�r|j�r|j}�q�W|�o$|j|_d|k�s>|d�r\|j�\}}|�s\t	|��dS|S)N�defaultTzalgorithm not foundZ
novalidate)r+�zone_policyr�named_policyr\rbr]r^r�ryrerfrgrhrirjrkrmrlrnrqrx)	r!ZzonerD�zr�r^ZapZvalid�msgr#r#r$�policy�s�





zdnssec_policy.policycCsdS)zBpolicylist : init policy
                      | policylist policyNr#)r!r�r#r#r$�p_policylist
szdnssec_policy.p_policylistcCs
d|_dS)zinit :FN)r�)r!r�r#r#r$�p_initszdnssec_policy.p_initcCsdS)zTpolicy : alg_policy
                  | zone_policy
                  | named_policyNr#)r!r�r#r#r$�p_policyszdnssec_policy.p_policycCs|d|d<dS)zAname : STR
                | KEYTYPE
                | DATESUFFIXr'rNr#)r!r�r#r#r$�p_nameszdnssec_policy.p_namecCs,|dj�|d<tjd|d�s(td��dS)zcdomain : STR
                  | QSTRING
                  | KEYTYPE
                  | DATESUFFIXr'rz^[\w.-][\w.-]*$zinvalid domainN)�stripr(r)ry)r!r�r#r#r$�p_domain szdnssec_policy.p_domaincCst�|_dS)znew_policy :N)rO�current)r!r�r#r#r$�p_new_policy*szdnssec_policy.p_new_policycCs(|d|j_d|j_|j|j|d<dS)zFalg_policy : ALGORITHM_POLICY ALGNAME new_policy alg_option_group SEMI�TN)r�r\rdr�)r!r�r#r#r$�p_alg_policy.szdnssec_policy.p_alg_policycCs8|djd�|j_d|j_|j|j|djd�j�<dS)z=zone_policy : ZONE domain new_policy policy_option_group SEMIr��.TN)�rstripr�r\rcr�r+)r!r�r#r#r$�
p_zone_policy5szdnssec_policy.p_zone_policycCs$|d|j_|j|j|dj�<dS)z>named_policy : POLICY name new_policy policy_option_group SEMIr�N)r�r\r�r+)r!r�r#r#r$�p_named_policy<szdnssec_policy.p_named_policycCs|d|d<dS)zduration : NUMBERr'rNr#)r!r�r#r#r$�p_duration_1Bszdnssec_policy.p_duration_1cCsd|d<dS)zduration : NONENrr#)r!r�r#r#r$�p_duration_2Gszdnssec_policy.p_duration_2cCs�|ddkr|dd|d<n�|ddkr<|dd|d<n�|ddkrZ|dd	|d<n||dd
krx|dd|d<n^|ddkr�|dd
|d<n@|ddkr�|dd|d<n"|ddkr�|d|d<ntd��dS)zduration : NUMBER DATESUFFIXr��yr'i�3�r�moi�'�wi�:	�di�Q�hiZmi�<�szinvalid durationN)ry)r!r�r#r#r$�p_duration_3Lszdnssec_policy.p_duration_3cCsdS)z6policy_option_group : LBRACE policy_option_list RBRACENr#)r!r�r#r#r$�p_policy_option_group_sz#dnssec_policy.p_policy_option_groupcCsdS)zmpolicy_option_list : policy_option SEMI
                              | policy_option_list policy_option SEMINr#)r!r�r#r#r$�p_policy_option_listcsz"dnssec_policy.p_policy_option_listcCsdS)a�policy_option : parent_option
                         | directory_option
                         | coverage_option
                         | rollperiod_option
                         | prepublish_option
                         | postpublish_option
                         | keysize_option
                         | algorithm_option
                         | keyttl_option
                         | standby_optionNr#)r!r�r#r#r$�p_policy_optionhszdnssec_policy.p_policy_optioncCsdS)z0alg_option_group : LBRACE alg_option_list RBRACENr#)r!r�r#r#r$�p_alg_option_groupusz dnssec_policy.p_alg_option_groupcCsdS)z^alg_option_list : alg_option SEMI
                           | alg_option_list alg_option SEMINr#)r!r�r#r#r$�p_alg_option_listyszdnssec_policy.p_alg_option_listcCsdS)aalg_option : coverage_option
                      | rollperiod_option
                      | prepublish_option
                      | postpublish_option
                      | keyttl_option
                      | keysize_option
                      | standby_optionNr#)r!r�r#r#r$�p_alg_option~szdnssec_policy.p_alg_optioncCs|j|dj�|j_dS)zparent_option : POLICY namer�N)r�r+r�r^)r!r�r#r#r$�p_parent_option�szdnssec_policy.p_parent_optioncCs|d|j_dS)z$directory_option : DIRECTORY QSTRINGr�N)r�re)r!r�r#r#r$�p_directory_option�sz dnssec_policy.p_directory_optioncCs|d|j_dS)z#coverage_option : COVERAGE durationr�N)r�rf)r!r�r#r#r$�p_coverage_option�szdnssec_policy.p_coverage_optioncCs*|ddkr|d|j_n|d|j_dS)z0rollperiod_option : ROLL_PERIOD KEYTYPE durationr��KSK�N)r�rirj)r!r�r#r#r$�p_rollperiod_option�sz!dnssec_policy.p_rollperiod_optioncCs*|ddkr|d|j_n|d|j_dS)z0prepublish_option : PRE_PUBLISH KEYTYPE durationr�r�r�N)r�rkrm)r!r�r#r#r$�p_prepublish_option�sz!dnssec_policy.p_prepublish_optioncCs*|ddkr|d|j_n|d|j_dS)z2postpublish_option : POST_PUBLISH KEYTYPE durationr�r�r�N)r�rlrn)r!r�r#r#r$�p_postpublish_option�sz"dnssec_policy.p_postpublish_optioncCs*|ddkr|d|j_n|d|j_dS)z(keysize_option : KEY_SIZE KEYTYPE NUMBERr�r�r�N)r�rgrh)r!r�r#r#r$�p_keysize_option�szdnssec_policy.p_keysize_optioncCs*|ddkr|d|j_n|d|j_dS)z'standby_option : STANDBY KEYTYPE NUMBERr�r�r�N)r�rorp)r!r�r#r#r$�p_standby_option�szdnssec_policy.p_standby_optioncCs|d|j_dS)zkeyttl_option : KEYTTL durationr�N)r�rq)r!r�r#r#r$�p_keyttl_option�szdnssec_policy.p_keyttl_optioncCs|d|j_dS)z$algorithm_option : ALGORITHM ALGNAMEr�N)r�r])r!r�r#r#r$�p_algorithm_option�sz dnssec_policy.p_algorithm_optioncCsd|r.td|jpd|jrdnd|j|jf�n2|js`td|jp@d|jrJdnd|rV|jpXdf��dS)Nz%s%s%d:syntax error near '%s'r_�:z%s%s%d:unexpected end of inputr)r8r�rrr�ry)r!r�r#r#r$�p_error�szdnssec_policy.p_error)N)*rKrLrMr�r�r�r�r�r�rFr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r#r#r#r$rz,sN
_
h


rz�__main__r'rCr�)r{r�T)r|r{r�znonexistent.zone)r(Zply.lexrCZply.yaccr�stringrrrO�	ExceptionryrzrK�sys�argvr��filer�rI�closer~rJZppr8r�r��e�argsr#r#r#r$�<module>s6

`4!

site-packages/isc/checkds.py000064400000015446147511334560012043 0ustar00############################################################################
# Copyright (C) Internet Systems Consortium, Inc. ("ISC")
#
# This Source Code Form is subject to the terms of the Mozilla Public
# License, v. 2.0. If a copy of the MPL was not distributed with this
# file, you can obtain one at https://mozilla.org/MPL/2.0/.
#
# See the COPYRIGHT file distributed with this work for additional
# information regarding copyright ownership.
############################################################################

import argparse
import os
import sys
from subprocess import Popen, PIPE

from isc.utils import prefix,version

prog = 'dnssec-checkds'


############################################################################
# SECRR class:
# Class for DS/DLV resource record
############################################################################
class SECRR:
    hashalgs = {1: 'SHA-1', 2: 'SHA-256', 3: 'GOST', 4: 'SHA-384'}
    rrname = ''
    rrclass = 'IN'
    keyid = None
    keyalg = None
    hashalg = None
    digest = ''
    ttl = 0

    def __init__(self, rrtext, dlvname = None):
        if not rrtext:
            raise Exception

        # 'str' does not have decode method in python3
        if type(rrtext) is not str:
            fields = rrtext.decode('ascii').split()
        else:
            fields = rrtext.split()
        if len(fields) < 7:
            raise Exception

        if dlvname:
            self.rrtype = "DLV"
            self.dlvname = dlvname.lower()
            parent = fields[0].lower().strip('.').split('.')
            parent.reverse()
            dlv = dlvname.split('.')
            dlv.reverse()
            while len(dlv) != 0 and len(parent) != 0 and parent[0] == dlv[0]:
                parent = parent[1:]
                dlv = dlv[1:]
            if dlv:
                raise Exception
            parent.reverse()
            self.parent = '.'.join(parent)
            self.rrname = self.parent + '.' + self.dlvname + '.'
        else:
            self.rrtype = "DS"
            self.rrname = fields[0].lower()

        fields = fields[1:]
        if fields[0].upper() in ['IN', 'CH', 'HS']:
            self.rrclass = fields[0].upper()
            fields = fields[1:]
        else:
            self.ttl = int(fields[0])
            self.rrclass = fields[1].upper()
            fields = fields[2:]

        if fields[0].upper() != self.rrtype:
            raise Exception('%s does not match %s' %
                            (fields[0].upper(), self.rrtype))

        self.keyid, self.keyalg, self.hashalg = map(int, fields[1:4])
        self.digest = ''.join(fields[4:]).upper()

    def __repr__(self):
        return '%s %s %s %d %d %d %s' % \
               (self.rrname, self.rrclass, self.rrtype,
                self.keyid, self.keyalg, self.hashalg, self.digest)

    def __eq__(self, other):
        return self.__repr__() == other.__repr__()
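
# Illustrative usage (not part of the original module; the record text and
# digest below are made-up assumptions, shown only to document the parser):
#
#   rr = SECRR("example.com. 3600 IN DS 12345 8 2 " + "AB" * 32)
#   rr.keyid                     # -> 12345
#   rr.keyalg                    # -> 8
#   SECRR.hashalgs[rr.hashalg]   # -> 'SHA-256'
#
# Records without an explicit TTL ("example.com. IN DS ...") are also
# accepted; in that case rr.ttl keeps its default of 0.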


############################################################################
# check:
# Fetch DS/DLV RRset for the given zone from the DNS; fetch DNSKEY
# RRset from the masterfile if specified, or from DNS if not.
# Generate a set of expected DS/DLV records from the DNSKEY RRset,
# and report on congruency.
############################################################################
def check(zone, args, masterfile=None, lookaside=None):
    rrlist = []
    cmd = [args.dig, "+noall", "+answer", "-t", "dlv" if lookaside else "ds",
           "-q", zone + "." + lookaside if lookaside else zone]
    fp, _ = Popen(cmd, stdout=PIPE).communicate()

    for line in fp.splitlines():
        if type(line) is not str:
            line = line.decode('ascii')
        rrlist.append(SECRR(line, lookaside))
    rrlist = sorted(rrlist, key=lambda rr: (rr.keyid, rr.keyalg, rr.hashalg))

    klist = []

    if masterfile:
        cmd = [args.dsfromkey, "-f", masterfile]
        if lookaside:
            cmd += ["-l", lookaside]
        cmd.append(zone)
        fp, _ = Popen(cmd, stdout=PIPE).communicate()
    else:
        intods, _ = Popen([args.dig, "+noall", "+answer", "-t", "dnskey",
                           "-q", zone], stdout=PIPE).communicate()
        cmd = [args.dsfromkey, "-f", "-"]
        if lookaside:
            cmd += ["-l", lookaside]
        cmd.append(zone)
        fp, _ = Popen(cmd, stdin=PIPE, stdout=PIPE).communicate(intods)

    for line in fp.splitlines():
        if type(line) is not str:
            line = line.decode('ascii')
        klist.append(SECRR(line, lookaside))

    if len(klist) < 1:
        print("No DNSKEY records found in zone apex")
        return False

    found = False
    for rr in klist:
        if rr in rrlist:
            print("%s for KSK %s/%03d/%05d (%s) found in parent" %
                  (rr.rrtype, rr.rrname.strip('.'), rr.keyalg,
                   rr.keyid, SECRR.hashalgs[rr.hashalg]))
            found = True
        else:
            print("%s for KSK %s/%03d/%05d (%s) missing from parent" %
                  (rr.rrtype, rr.rrname.strip('.'), rr.keyalg,
                   rr.keyid, SECRR.hashalgs[rr.hashalg]))

    if not found:
        print("No %s records were found for any DNSKEY" % ("DLV" if lookaside else "DS"))

    return found
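
# Illustrative invocation (not part of the original module).  'args' only
# needs the 'dig' and 'dsfromkey' attributes used above; the paths below are
# assumptions and would normally come from parse_args():
#
#   import argparse
#   fake_args = argparse.Namespace(dig='/usr/bin/dig',
#                                  dsfromkey='/usr/sbin/dnssec-dsfromkey')
#   ok = check('example.com', fake_args)            # DS lookup in the parent
#   ok = check('example.com', fake_args,
#              lookaside='dlv.example.net')         # DLV lookup instead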

############################################################################
# parse_args:
# Read command line arguments, set global 'args' structure
############################################################################
def parse_args():
    parser = argparse.ArgumentParser(description=prog + ': checks DS coverage')

    bindir = 'bin'
    sbindir = 'bin' if os.name == 'nt' else 'sbin'

    parser.add_argument('zone', type=str, help='zone to check')
    parser.add_argument('-f', '--file', dest='masterfile', type=str,
                        help='zone master file')
    parser.add_argument('-l', '--lookaside', dest='lookaside', type=str,
                        help='DLV lookaside zone')
    parser.add_argument('-d', '--dig', dest='dig',
                        default=os.path.join(prefix(bindir), 'dig'),
                        type=str, help='path to \'dig\'')
    parser.add_argument('-D', '--dsfromkey', dest='dsfromkey',
                        default=os.path.join(prefix(sbindir),
                                             'dnssec-dsfromkey'),
                        type=str, help='path to \'dnssec-dsfromkey\'')
    parser.add_argument('-v', '--version', action='version',
                        version=version)
    args = parser.parse_args()

    args.zone = args.zone.strip('.')
    if args.lookaside:
        args.lookaside = args.lookaside.strip('.')

    return args


############################################################################
# Main
############################################################################
def main():
    args = parse_args()
    found = check(args.zone, args, args.masterfile, args.lookaside)
    sys.exit(0 if found else 1)
site-packages/isc/dnskey.py000064400000040040147511334560011720 0ustar00############################################################################
# Copyright (C) Internet Systems Consortium, Inc. ("ISC")
#
# This Source Code Form is subject to the terms of the Mozilla Public
# License, v. 2.0. If a copy of the MPL was not distributed with this
# file, you can obtain one at https://mozilla.org/MPL/2.0/.
#
# See the COPYRIGHT file distributed with this work for additional
# information regarding copyright ownership.
############################################################################

import os
import time
import calendar
from subprocess import Popen, PIPE

########################################################################
# Class dnskey
########################################################################
class TimePast(Exception):
    def __init__(self, key, prop, value):
        super(TimePast, self).__init__('%s time for key %s (%d) is already past'
                                       % (prop, key, value))

class dnskey:
    """An individual DNSSEC key.  Identified by path, name, algorithm, keyid.
    Contains a dictionary of metadata events."""

    _PROPS = ('Created', 'Publish', 'Activate', 'Inactive', 'Delete',
              'Revoke', 'DSPublish', 'SyncPublish', 'SyncDelete')
    _OPTS = (None, '-P', '-A', '-I', '-D', '-R', None, '-Psync', '-Dsync')

    _ALGNAMES = (None, 'RSAMD5', 'DH', 'DSA', 'ECC', 'RSASHA1',
                 'NSEC3DSA', 'NSEC3RSASHA1', 'RSASHA256', None,
                 'RSASHA512', None, 'ECCGOST', 'ECDSAP256SHA256',
                 'ECDSAP384SHA384', 'ED25519', 'ED448')

    def __init__(self, key, directory=None, keyttl=None):
        # this makes it possible to use algname as a class or instance method
        if isinstance(key, tuple) and len(key) == 3:
            self._dir = directory or '.'
            (name, alg, keyid) = key
            self.fromtuple(name, alg, keyid, keyttl)
            return

        self._dir = directory or os.path.dirname(key) or '.'
        key = os.path.basename(key)
        (name, alg, keyid) = key.split('+')
        name = name[1:-1]
        alg = int(alg)
        keyid = int(keyid.split('.')[0])
        self.fromtuple(name, alg, keyid, keyttl)

    def fromtuple(self, name, alg, keyid, keyttl):
        if name.endswith('.'):
            fullname = name
            name = name.rstrip('.')
        else:
            fullname = name + '.'

        keystr = "K%s+%03d+%05d" % (fullname, alg, keyid)
        key_file = self._dir + (self._dir and os.sep or '') + keystr + ".key"
        private_file = (self._dir + (self._dir and os.sep or '') +
                        keystr + ".private")

        self.keystr = keystr

        self.name = name
        self.alg = int(alg)
        self.keyid = int(keyid)
        self.fullname = fullname

        kfp = open(key_file, "r")
        for line in kfp:
            if line[0] == ';':
                continue
            tokens = line.split()
            if not tokens:
                continue

            if tokens[1].lower() in ('in', 'ch', 'hs'):
                septoken = 3
                self.ttl = keyttl
            else:
                septoken = 4
                self.ttl = int(tokens[1]) if not keyttl else keyttl

            if (int(tokens[septoken]) & 0x1) == 1:
                self.sep = True
            else:
                self.sep = False
        kfp.close()

        pfp = open(private_file, "rU")

        self.metadata = dict()
        self._changed = dict()
        self._delete = dict()
        self._times = dict()
        self._fmttime = dict()
        self._timestamps = dict()
        self._original = dict()
        self._origttl = None

        for line in pfp:
            line = line.strip()
            if not line or line[0] in ('!#'):
                continue
            punctuation = [line.find(c) for c in ':= '] + [len(line)]
            found = min([pos for pos in punctuation if pos != -1])
            name = line[:found].rstrip()
            value = line[found:].lstrip(":= ").rstrip()
            self.metadata[name] = value

        for prop in dnskey._PROPS:
            self._changed[prop] = False
            if prop in self.metadata:
                t = self.parsetime(self.metadata[prop])
                self._times[prop] = t
                self._fmttime[prop] = self.formattime(t)
                self._timestamps[prop] = self.epochfromtime(t)
                self._original[prop] = self._timestamps[prop]
            else:
                self._times[prop] = None
                self._fmttime[prop] = None
                self._timestamps[prop] = None
                self._original[prop] = None

        pfp.close()

    def commit(self, settime_bin, **kwargs):
        quiet = kwargs.get('quiet', False)
        cmd = []
        first = True

        if self._origttl is not None:
            cmd += ["-L", str(self.ttl)]

        for prop, opt in zip(dnskey._PROPS, dnskey._OPTS):
            if not opt or not self._changed[prop]:
                continue

            delete = False
            if prop in self._delete and self._delete[prop]:
                delete = True

            when = 'none' if delete else self._fmttime[prop]
            cmd += [opt, when]
            first = False

        if cmd:
            fullcmd = [settime_bin, "-K", self._dir] + cmd + [self.keystr,]
            if not quiet:
                print('# ' + ' '.join(fullcmd))
            try:
                p = Popen(fullcmd, stdout=PIPE, stderr=PIPE)
                stdout, stderr = p.communicate()
                if stderr:
                    raise Exception(str(stderr))
            except Exception as e:
                raise Exception('unable to run %s: %s' %
                                (settime_bin, str(e)))
            self._origttl = None
            for prop in dnskey._PROPS:
                self._original[prop] = self._timestamps[prop]
                self._changed[prop] = False

    @classmethod
    def generate(cls, keygen_bin, randomdev, keys_dir, name, alg, keysize, sep,
                 ttl, publish=None, activate=None, **kwargs):
        quiet = kwargs.get('quiet', False)

        keygen_cmd = [keygen_bin, "-q", "-K", keys_dir, "-L", str(ttl)]

        if randomdev:
            keygen_cmd += ["-r", randomdev]

        if sep:
            keygen_cmd.append("-fk")

        if alg:
            keygen_cmd += ["-a", alg]

        if keysize:
            keygen_cmd += ["-b", str(keysize)]

        if publish:
            t = dnskey.timefromepoch(publish)
            keygen_cmd += ["-P", dnskey.formattime(t)]

        if activate:
            t = dnskey.timefromepoch(activate)
            keygen_cmd += ["-A", dnskey.formattime(activate)]

        keygen_cmd.append(name)

        if not quiet:
            print('# ' + ' '.join(keygen_cmd))

        p = Popen(keygen_cmd, stdout=PIPE, stderr=PIPE)
        stdout, stderr = p.communicate()
        if stderr:
            raise Exception('unable to generate key: ' + str(stderr))

        try:
            keystr = stdout.splitlines()[0].decode('ascii')
            newkey = dnskey(keystr, keys_dir, ttl)
            return newkey
        except Exception as e:
            raise Exception('unable to parse generated key: %s' % str(e))

    def generate_successor(self, keygen_bin, randomdev, prepublish, **kwargs):
        quiet = kwargs.get('quiet', False)

        if not self.inactive():
            raise Exception("predecessor key %s has no inactive date" % self)

        keygen_cmd = [keygen_bin, "-q", "-K", self._dir, "-S", self.keystr]

        if self.ttl:
            keygen_cmd += ["-L", str(self.ttl)]

        if randomdev:
            keygen_cmd += ["-r", randomdev]

        if prepublish:
            keygen_cmd += ["-i", str(prepublish)]

        if not quiet:
            print('# ' + ' '.join(keygen_cmd))

        p = Popen(keygen_cmd, stdout=PIPE, stderr=PIPE)
        stdout, stderr = p.communicate()
        if stderr:
            raise Exception('unable to generate key: ' + str(stderr))

        try:
            keystr = stdout.splitlines()[0].decode('ascii')
            newkey = dnskey(keystr, self._dir, self.ttl)
            return newkey
        except:
            raise Exception('unable to generate successor for key %s' % self)

    @staticmethod
    def algstr(alg):
        name = None
        if alg in range(len(dnskey._ALGNAMES)):
            name = dnskey._ALGNAMES[alg]
        return name if name else ("%03d" % alg)

    @staticmethod
    def algnum(alg):
        if not alg:
            return None
        alg = alg.upper()
        try:
            return dnskey._ALGNAMES.index(alg)
        except ValueError:
            return None

    def algname(self, alg=None):
        return self.algstr(alg or self.alg)

    @staticmethod
    def timefromepoch(secs):
        return time.gmtime(secs)

    @staticmethod
    def parsetime(string):
        return time.strptime(string, "%Y%m%d%H%M%S")

    @staticmethod
    def epochfromtime(t):
        return calendar.timegm(t)

    @staticmethod
    def formattime(t):
        return time.strftime("%Y%m%d%H%M%S", t)

    def setmeta(self, prop, secs, now, **kwargs):
        force = kwargs.get('force', False)

        if self._timestamps[prop] == secs:
            return

        if self._original[prop] is not None and \
           self._original[prop] < now and not force:
            raise TimePast(self, prop, self._original[prop])

        if secs is None:
            self._changed[prop] = False \
                if self._original[prop] is None else True

            self._delete[prop] = True
            self._timestamps[prop] = None
            self._times[prop] = None
            self._fmttime[prop] = None
            return

        t = self.timefromepoch(secs)
        self._timestamps[prop] = secs
        self._times[prop] = t
        self._fmttime[prop] = self.formattime(t)
        self._changed[prop] = False if \
            self._original[prop] == self._timestamps[prop] else True

    def gettime(self, prop):
        return self._times[prop]

    def getfmttime(self, prop):
        return self._fmttime[prop]

    def gettimestamp(self, prop):
        return self._timestamps[prop]

    def created(self):
        return self._timestamps["Created"]

    def syncpublish(self):
        return self._timestamps["SyncPublish"]

    def setsyncpublish(self, secs, now=time.time(), **kwargs):
        self.setmeta("SyncPublish", secs, now, **kwargs)

    def publish(self):
        return self._timestamps["Publish"]

    def setpublish(self, secs, now=time.time(), **kwargs):
        self.setmeta("Publish", secs, now, **kwargs)

    def activate(self):
        return self._timestamps["Activate"]

    def setactivate(self, secs, now=time.time(), **kwargs):
        self.setmeta("Activate", secs, now, **kwargs)

    def revoke(self):
        return self._timestamps["Revoke"]

    def setrevoke(self, secs, now=time.time(), **kwargs):
        self.setmeta("Revoke", secs, now, **kwargs)

    def inactive(self):
        return self._timestamps["Inactive"]

    def setinactive(self, secs, now=time.time(), **kwargs):
        self.setmeta("Inactive", secs, now, **kwargs)

    def delete(self):
        return self._timestamps["Delete"]

    def setdelete(self, secs, now=time.time(), **kwargs):
        self.setmeta("Delete", secs, now, **kwargs)

    def syncdelete(self):
        return self._timestamps["SyncDelete"]

    def setsyncdelete(self, secs, now=time.time(), **kwargs):
        self.setmeta("SyncDelete", secs, now, **kwargs)

    def setttl(self, ttl):
        if ttl is None or self.ttl == ttl:
            return
        elif self._origttl is None:
            self._origttl = self.ttl
            self.ttl = ttl
        elif self._origttl == ttl:
            self._origttl = None
            self.ttl = ttl
        else:
            self.ttl = ttl

    def keytype(self):
        return ("KSK" if self.sep else "ZSK")

    def __str__(self):
        return ("%s/%s/%05d"
                % (self.name, self.algname(), self.keyid))

    def __repr__(self):
        return ("%s/%s/%05d (%s)"
                % (self.name, self.algname(), self.keyid,
                   ("KSK" if self.sep else "ZSK")))

    def date(self):
        return (self.activate() or self.publish() or self.created())

    # keys are sorted first by zone name, then by algorithm. within
    # the same name/algorithm, they are sorted according to their
    # 'date' value: the activation date if set, OR the publication
    # if set, OR the creation date.
    def __lt__(self, other):
        if self.name != other.name:
            return self.name < other.name
        if self.alg != other.alg:
            return self.alg < other.alg
        return self.date() < other.date()

    def check_prepub(self, output=None):
        def noop(*args, **kwargs): pass
        if not output:
            output = noop

        now = int(time.time())
        a = self.activate()
        p = self.publish()

        if not a:
            return False

        if not p:
            if a > now:
                output("WARNING: Key %s is scheduled for\n"
                       "\t activation but not for publication."
                       % repr(self))
            return False

        if p <= now and a <= now:
            return True

        if p == a:
            output("WARNING: %s is scheduled to be\n"
                   "\t published and activated at the same time. This\n"
                   "\t could result in a coverage gap if the zone was\n"
                   "\t previously signed. Activation should be at least\n"
                   "\t %s after publication."
                   % (repr(self),
                       dnskey.duration(self.ttl) or 'one DNSKEY TTL'))
            return True

        if a < p:
            output("WARNING: Key %s is active before it is published"
                   % repr(self))
            return False

        if self.ttl is not None and a - p < self.ttl:
            output("WARNING: Key %s is activated too soon\n"
                   "\t after publication; this could result in coverage \n"
                   "\t gaps due to resolver caches containing old data.\n"
                   "\t Activation should be at least %s after\n"
                   "\t publication."
                   % (repr(self),
                      dnskey.duration(self.ttl) or 'one DNSKEY TTL'))
            return False

        return True

    def check_postpub(self, output = None, timespan = None):
        def noop(*args, **kwargs): pass
        if output is None:
            output = noop

        if timespan is None:
            timespan = self.ttl

        if timespan is None:
            output("WARNING: Key %s using default TTL." % repr(self))
            timespan = (60*60*24)

        now = time.time()
        d = self.delete()
        i = self.inactive()

        if not d:
            return False

        if not i:
            if d > now:
                output("WARNING: Key %s is scheduled for\n"
                       "\t deletion but not for inactivation." % repr(self))
            return False

        if d < now and i < now:
            return True

        if d < i:
            output("WARNING: Key %s is scheduled for\n"
                   "\t deletion before inactivation."
                   % repr(self))
            return False

        if d - i < timespan:
            output("WARNING: Key %s scheduled for\n"
                   "\t deletion too soon after deactivation; this may \n"
                   "\t result in coverage gaps due to resolver caches\n"
                   "\t containing old data.  Deletion should be at least\n"
                   "\t %s after inactivation."
                   % (repr(self), dnskey.duration(timespan)))
            return False

        return True

    @staticmethod
    def duration(secs):
        if not secs:
            return None

        units = [("year", 60*60*24*365),
                 ("month", 60*60*24*30),
                 ("day", 60*60*24),
                 ("hour", 60*60),
                 ("minute", 60),
                 ("second", 1)]

        output = []
        for unit in units:
            v, secs = secs // unit[1], secs % unit[1]
            if v > 0:
                output.append("%d %s%s" % (v, unit[0], "s" if v > 1 else ""))

        return ", ".join(output)

site-packages/isc/keyevent.py000064400000005413147511334560012262 0ustar00############################################################################
# Copyright (C) Internet Systems Consortium, Inc. ("ISC")
#
# This Source Code Form is subject to the terms of the Mozilla Public
# License, v. 2.0. If a copy of the MPL was not distributed with this
# file, you can obtain one at https://mozilla.org/MPL/2.0/.
#
# See the COPYRIGHT file distributed with this work for additional
# information regarding copyright ownership.
############################################################################

import time


########################################################################
# Class keyevent
########################################################################
class keyevent:
    """ A discrete key event, e.g., Publish, Activate, Inactive, Delete,
    etc. Stores the date of the event, and identifying information
    about the key to which the event will occur."""

    def __init__(self, what, key, when=None):
        self.what = what
        self.when = when or key.gettime(what)
        self.key = key
        self.sep = key.sep
        self.zone = key.name
        self.alg = key.alg
        self.keyid = key.keyid

    def __repr__(self):
        return repr((self.when, self.what, self.keyid, self.sep,
                     self.zone, self.alg))

    def showtime(self):
        return time.strftime("%a %b %d %H:%M:%S UTC %Y", self.when)

    # update sets of active and published keys, based on
    # the contents of this keyevent
    def status(self, active, published, output = None):
        def noop(*args, **kwargs): pass
        if not output:
            output = noop

        if not active:
            active = set()
        if not published:
            published = set()

        if self.what == "Activate":
            active.add(self.keyid)
        elif self.what == "Publish":
            published.add(self.keyid)
        elif self.what == "Inactive":
            if self.keyid not in active:
                output("\tWARNING: %s scheduled to become inactive "
                       "before it is active"
                       % repr(self.key))
            else:
                active.remove(self.keyid)
        elif self.what == "Delete":
            if self.keyid in published:
                published.remove(self.keyid)
            else:
                output("WARNING: key %s is scheduled for deletion "
                       "before it is published" % repr(self.key))
        elif self.what == "Revoke":
            # We don't need to worry about the logic of this one;
            # just stop counting this key as either active or published
            if self.keyid in published:
                published.remove(self.keyid)
            if self.keyid in active:
                active.remove(self.keyid)

        return active, published
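
# Illustrative usage (not part of the original module; 'key' is assumed to be
# a dnskey object with a Publish time set).  Events are normally built and
# ordered by the eventlist module, then replayed through status():
#
#   ev = keyevent("Publish", key)
#   active, published = ev.status(set(), set())
#   # 'published' now contains key.keyid; 'active' is unchanged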
site-packages/isc/keydict.py000064400000005441147511334560012065 0ustar00############################################################################
# Copyright (C) Internet Systems Consortium, Inc. ("ISC")
#
# This Source Code Form is subject to the terms of the Mozilla Public
# License, v. 2.0. If a copy of the MPL was not distributed with this
# file, you can obtain one at https://mozilla.org/MPL/2.0/.
#
# See the COPYRIGHT file distributed with this work for additional
# information regarding copyright ownership.
############################################################################

from collections import defaultdict
from . import dnskey
import os
import glob


########################################################################
# Class keydict
########################################################################
class keydict:
    """ A dictionary of keys, indexed by name, algorithm, and key id """

    _keydict = defaultdict(lambda: defaultdict(dict))
    _defttl = None
    _missing = []

    def __init__(self, dp=None, **kwargs):
        self._defttl = kwargs.get('keyttl', None)
        zones = kwargs.get('zones', None)

        if not zones:
            path = kwargs.get('path',None) or '.'
            self.readall(path)
        else:
            for zone in zones:
                if 'path' in kwargs and kwargs['path'] is not None:
                    path = kwargs['path']
                else:
                    path = dp and dp.policy(zone).directory or '.'
                if not self.readone(path, zone):
                    self._missing.append(zone)

    def readall(self, path):
        files = glob.glob(os.path.join(path, '*.private'))

        for infile in files:
            key = dnskey(infile, path, self._defttl)
            self._keydict[key.name][key.alg][key.keyid] = key

    def readone(self, path, zone):
        if not zone.endswith('.'):
            zone += '.'
        match='K' + zone + '+*.private'
        files = glob.glob(os.path.join(path, match))

        found = False
        for infile in files:
            key = dnskey(infile, path, self._defttl)
            if key.fullname != zone: # shouldn't ever happen
                continue
            keyname=key.name if zone != '.' else '.'
            self._keydict[keyname][key.alg][key.keyid] = key
            found = True

        return found

    def __iter__(self):
        for zone, algorithms in self._keydict.items():
            for alg, keys in algorithms.items():
                for key in keys.values():
                    yield key

    def __getitem__(self, name):
        return self._keydict[name]

    def zones(self):
        return (self._keydict.keys())

    def algorithms(self, zone):
        return (self._keydict[zone].keys())

    def keys(self, zone, alg):
        return (self._keydict[zone][alg].keys())

    def missing(self):
        return (self._missing)
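
# Illustrative usage (not part of the original module; the path is an
# assumption).  A keydict scans '*.private' files and indexes the resulting
# dnskey objects by zone name, algorithm and key id:
#
#   kd = keydict(path='/etc/bind/keys', keyttl=3600)
#   for key in kd:
#       print(repr(key))                 # every key found under the path
#   kd.zones()                           # zone names seen
#   kd.algorithms('example.com')         # algorithms used by that zone
#   kd.keys('example.com', 8)            # key ids for that zone/algorithm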
site-packages/isc/__init__.py000064400000001651147511334560012167 0ustar00############################################################################
# Copyright (C) Internet Systems Consortium, Inc. ("ISC")
#
# This Source Code Form is subject to the terms of the Mozilla Public
# License, v. 2.0. If a copy of the MPL was not distributed with this
# file, you can obtain one at https://mozilla.org/MPL/2.0/.
#
# See the COPYRIGHT file distributed with this work for additional
# information regarding copyright ownership.
############################################################################

__all__ = ['checkds', 'coverage', 'keymgr', 'dnskey', 'eventlist',
           'keydict', 'keyevent', 'keyseries', 'keyzone', 'policy',
           'parsetab', 'rndc', 'utils']

from isc.dnskey import *
from isc.eventlist import *
from isc.keydict import *
from isc.keyevent import *
from isc.keyseries import *
from isc.keyzone import *
from isc.policy import *
from isc.rndc import *
from isc.utils import *
site-packages/isc/coverage.py000064400000023375147511334560012232 0ustar00############################################################################
# Copyright (C) Internet Systems Consortium, Inc. ("ISC")
#
# This Source Code Form is subject to the terms of the Mozilla Public
# License, v. 2.0. If a copy of the MPL was not distributed with this
# file, you can obtain one at https://mozilla.org/MPL/2.0/.
#
# See the COPYRIGHT file distributed with this work for additional
# information regarding copyright ownership.
############################################################################

from __future__ import print_function
import os
import sys
import argparse
import glob
import re
import time
import calendar
import pprint
from collections import defaultdict

prog = 'dnssec-coverage'

from isc import dnskey, eventlist, keydict, keyevent, keyzone, utils

############################################################################
# print a fatal error and exit
############################################################################
def fatal(*args, **kwargs):
    print(*args, **kwargs)
    sys.exit(1)


############################################################################
# output:
############################################################################
_firstline = True
def output(*args, **kwargs):
    """output text, adding a vertical space this is *not* the first
    first section being printed since a call to vreset()"""
    global _firstline
    if 'skip' in kwargs:
        skip = kwargs['skip']
        kwargs.pop('skip', None)
    else:
        skip = True
    if _firstline:
        _firstline = False
    elif skip:
        print('')
    if args:
        print(*args, **kwargs)


def vreset():
    """reset vertical spacing"""
    global _firstline
    _firstline = True


############################################################################
# parse_time
############################################################################
def parse_time(s):
    """ convert a formatted time (e.g., 1y, 6mo, 15mi, etc) into seconds
    :param s: String with some text representing a time interval
    :return: Integer with the number of seconds in the time interval
    """
    s = s.strip()

    # if s is an integer, we're done already
    try:
        return int(s)
    except ValueError:
        pass

    # try to parse as a number with a suffix indicating unit of time
    r = re.compile(r'([0-9][0-9]*)\s*([A-Za-z]*)')
    m = r.match(s)
    if not m:
        raise ValueError("Cannot parse %s" % s)
    n, unit = m.groups()
    n = int(n)
    unit = unit.lower()
    if unit.startswith('y'):
        return n * 31536000
    elif unit.startswith('mo'):
        return n * 2592000
    elif unit.startswith('w'):
        return n * 604800
    elif unit.startswith('d'):
        return n * 86400
    elif unit.startswith('h'):
        return n * 3600
    elif unit.startswith('mi'):
        return n * 60
    elif unit.startswith('s'):
        return n
    else:
        raise ValueError("Invalid suffix %s" % unit)


############################################################################
# set_path:
############################################################################
def set_path(command, default=None):
    """ find the location of a specified command.  if a default is supplied
    and it works, we use it; otherwise we search PATH for a match.
    :param command: string with a command to look for in the path
    :param default: default location to use
    :return: detected location for the desired command
    """

    fpath = default
    if not fpath or not os.path.isfile(fpath) or not os.access(fpath, os.X_OK):
        path = os.environ["PATH"]
        if not path:
            path = os.path.defpath
        for directory in path.split(os.pathsep):
            fpath = os.path.join(directory, command)
            if os.path.isfile(fpath) and os.access(fpath, os.X_OK):
                break
            fpath = None

    return fpath


############################################################################
# parse_args:
############################################################################
def parse_args():
    """Read command line arguments, set global 'args' structure"""
    compilezone = set_path('named-compilezone',
                           os.path.join(utils.prefix('sbin'),
                           'named-compilezone'))

    parser = argparse.ArgumentParser(description=prog + ': checks future ' +
                                     'DNSKEY coverage for a zone')

    parser.add_argument('zone', type=str, nargs='*', default=None,
                        help='zone(s) to check ' +
                        '(default: all zones in the directory)')
    parser.add_argument('-K', dest='path', default='.', type=str,
                        help='a directory containing keys to process',
                        metavar='dir')
    parser.add_argument('-f', dest='filename', type=str,
                        help='zone master file', metavar='file')
    parser.add_argument('-m', dest='maxttl', type=str,
                        help='the longest TTL in the zone(s)',
                        metavar='time')
    parser.add_argument('-d', dest='keyttl', type=str,
                        help='the DNSKEY TTL', metavar='time')
    parser.add_argument('-r', dest='resign', default='1944000',
                        type=str, help='the RRSIG refresh interval '
                                       'in seconds [default: 22.5 days]',
                        metavar='time')
    parser.add_argument('-c', dest='compilezone',
                        default=compilezone, type=str,
                        help='path to \'named-compilezone\'',
                        metavar='path')
    parser.add_argument('-l', dest='checklimit',
                        type=str, default='0',
                        help='Length of time to check for '
                             'DNSSEC coverage [default: 0 (unlimited)]',
                        metavar='time')
    parser.add_argument('-z', dest='no_ksk',
                        action='store_true', default=False,
                        help='Only check zone-signing keys (ZSKs)')
    parser.add_argument('-k', dest='no_zsk',
                        action='store_true', default=False,
                        help='Only check key-signing keys (KSKs)')
    parser.add_argument('-D', '--debug', dest='debug_mode',
                        action='store_true', default=False,
                        help='Turn on debugging output')
    parser.add_argument('-v', '--version', action='version',
                        version=utils.version)

    args = parser.parse_args()

    if args.no_zsk and args.no_ksk:
        fatal("ERROR: -z and -k cannot be used together.")
    elif args.no_zsk or args.no_ksk:
        args.keytype = "KSK" if args.no_zsk else "ZSK"
    else:
        args.keytype = None

    if args.filename and len(args.zone) > 1:
        fatal("ERROR: -f can only be used with one zone.")

    # strip trailing dots if any
    args.zone = [x[:-1] if (len(x) > 1 and x[-1] == '.') else x
                        for x in args.zone]

    # convert from time arguments to seconds
    try:
        if args.maxttl:
            m = parse_time(args.maxttl)
            args.maxttl = m
    except ValueError:
        pass

    try:
        if args.keyttl:
            k = parse_time(args.keyttl)
            args.keyttl = k
    except ValueError:
        pass

    try:
        if args.resign:
            r = parse_time(args.resign)
            args.resign = r
    except ValueError:
        pass

    try:
        if args.checklimit:
            lim = args.checklimit
            r = parse_time(args.checklimit)
            if r == 0:
                args.checklimit = None
            else:
                args.checklimit = time.time() + r
    except ValueError:
        pass

    # if we've got the values we need from the command line, stop now
    if args.maxttl and args.keyttl:
        return args

    # load keyttl and maxttl data from zonefile
    if args.zone and args.filename:
        try:
            zone = keyzone(args.zone[0], args.filename, args.compilezone)
            args.maxttl = args.maxttl or zone.maxttl
            args.keyttl = args.keyttl or zone.keyttl
        except Exception as e:
            print("Unable to load zone data from %s: " % args.filename, e)

    if not args.maxttl:
        output("WARNING: Maximum TTL value was not specified.  Using 1 week\n"
               "\t (604800 seconds); re-run with the -m option to get more\n"
               "\t accurate results.")
        args.maxttl = 604800

    return args

############################################################################
# Main
############################################################################
def main():
    args = parse_args()

    print("PHASE 1--Loading keys to check for internal timing problems")

    try:
        kd = keydict(path=args.path, zones=args.zone, keyttl=args.keyttl)
    except Exception as e:
        fatal('ERROR: Unable to build key dictionary: ' + str(e))

    for key in kd:
        key.check_prepub(output)
        if key.sep:
            key.check_postpub(output)
        else:
            key.check_postpub(output, args.maxttl + args.resign)

    output("PHASE 2--Scanning future key events for coverage failures")
    vreset()

    try:
        elist = eventlist(kd)
    except Exception as e:
        fatal('ERROR: Unable to build event list: ' + str(e))

    errors = False
    if not args.zone:
        if not elist.coverage(None, args.keytype, args.checklimit, output):
            errors = True
    else:
        for zone in args.zone:
            try:
                if not elist.coverage(zone, args.keytype,
                                      args.checklimit, output):
                    errors = True
            except:
                output('ERROR: Coverage check failed for zone ' + zone)

    sys.exit(1 if errors else 0)
site-packages/isc/policy.py000064400000063462147511334560011737 0ustar00############################################################################
# Copyright (C) Internet Systems Consortium, Inc. ("ISC")
#
# This Source Code Form is subject to the terms of the Mozilla Public
# License, v. 2.0. If a copy of the MPL was not distributed with this
# file, you can obtain one at https://mozilla.org/MPL/2.0/.
#
# See the COPYRIGHT file distributed with this work for additional
# information regarding copyright ownership.
############################################################################

import re
import ply.lex as lex
import ply.yacc as yacc
from string import *
from copy import copy


############################################################################
# PolicyLex: a lexer for the policy file syntax.
############################################################################
class PolicyLex:
    reserved = ('POLICY',
                'ALGORITHM_POLICY',
                'ZONE',
                'ALGORITHM',
                'DIRECTORY',
                'KEYTTL',
                'KEY_SIZE',
                'ROLL_PERIOD',
                'PRE_PUBLISH',
                'POST_PUBLISH',
                'COVERAGE',
                'STANDBY',
                'NONE')

    tokens = reserved + ('DATESUFFIX',
                         'KEYTYPE',
                         'ALGNAME',
                         'STR',
                         'QSTRING',
                         'NUMBER',
                         'LBRACE',
                         'RBRACE',
                         'SEMI')
    reserved_map = {}

    t_ignore           = ' \t'
    t_ignore_olcomment = r'(//|\#).*'

    t_LBRACE           = r'\{'
    t_RBRACE           = r'\}'
    t_SEMI             = r';'

    def t_newline(self, t):
        r'\n+'
        t.lexer.lineno += t.value.count("\n")

    def t_comment(self, t):
        r'/\*(.|\n)*?\*/'
        t.lexer.lineno += t.value.count('\n')

    def t_DATESUFFIX(self, t):
        r'(?i)(?<=[0-9 \t])(y(?:ears|ear|ea|e)?|mo(?:nths|nth|nt|n)?|w(?:eeks|eek|ee|e)?|d(?:ays|ay|a)?|h(?:ours|our|ou|o)?|mi(?:nutes|nute|nut|nu|n)?|s(?:econds|econd|econ|eco|ec|e)?)\b'
        t.value = re.match(r'(?i)(y|mo|w|d|h|mi|s)([a-z]*)', t.value).group(1).lower()
        return t

    def t_KEYTYPE(self, t):
        r'(?i)\b(KSK|ZSK)\b'
        t.value = t.value.upper()
        return t

    def t_ALGNAME(self, t):
        r'(?i)\b(RSAMD5|DH|DSA|NSEC3DSA|ECC|RSASHA1|NSEC3RSASHA1|RSASHA256|RSASHA512|ECCGOST|ECDSAP256SHA256|ECDSAP384SHA384|ED25519|ED448)\b'
        t.value = t.value.upper()
        return t

    def t_STR(self, t):
        r'[A-Za-z._-][\w._-]*'
        t.type = self.reserved_map.get(t.value, "STR")
        return t

    def t_QSTRING(self, t):
        r'"([^"\n]|(\\"))*"'
        t.type = self.reserved_map.get(t.value, "QSTRING")
        t.value = t.value[1:-1]
        return t

    def t_NUMBER(self, t):
        r'\d+'
        t.value = int(t.value)
        return t

    def t_error(self, t):
        print("Illegal character '%s'" % t.value[0])
        t.lexer.skip(1)

    def __init__(self, **kwargs):
        if 'maketrans' in dir(str):
            trans = str.maketrans('_', '-')
        else:
            trans = maketrans('_', '-')
        for r in self.reserved:
            self.reserved_map[r.lower().translate(trans)] = r
        self.lexer = lex.lex(object=self, **kwargs)

    def test(self, text):
        self.lexer.input(text)
        while True:
            t = self.lexer.token()
            if not t:
                break
            print(t)
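
# Illustrative usage (not part of the original module): tokenizing a one-line
# policy statement.  The policy text below is a made-up example:
#
#   plex = PolicyLex()
#   plex.test('policy standard { coverage 6mo; };')
#   # prints POLICY, STR, LBRACE, COVERAGE, NUMBER, DATESUFFIX, SEMI,
#   # RBRACE and SEMI tokens, one per line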

############################################################################
# Policy: this object holds a set of DNSSEC policy settings.
############################################################################
class Policy:
    is_zone = False
    is_alg = False
    is_constructed = False
    ksk_rollperiod = None
    zsk_rollperiod = None
    ksk_prepublish = None
    zsk_prepublish = None
    ksk_postpublish = None
    zsk_postpublish = None
    ksk_keysize = None
    zsk_keysize = None
    ksk_standby = None
    zsk_standby = None
    keyttl = None
    coverage = None
    directory = None
    valid_key_sz_per_algo = {'DSA': [512, 1024],
                             'NSEC3DSA': [512, 1024],
                             'RSAMD5': [512, 4096],
                             'RSASHA1': [512, 4096],
                             'NSEC3RSASHA1': [512, 4096],
                             'RSASHA256': [512, 4096],
                             'RSASHA512': [512, 4096],
                             'ECCGOST': None,
                             'ECDSAP256SHA256': None,
                             'ECDSAP384SHA384': None,
                             'ED25519': None,
                             'ED448': None}

    def __init__(self, name=None, algorithm=None, parent=None):
        self.name = name
        self.algorithm = algorithm
        self.parent = parent
        pass

    def __repr__(self):
        return ("%spolicy %s:\n"
                "\tinherits %s\n"
                "\tdirectory %s\n"
                "\talgorithm %s\n"
                "\tcoverage %s\n"
                "\tksk_keysize %s\n"
                "\tzsk_keysize %s\n"
                "\tksk_rollperiod %s\n"
                "\tzsk_rollperiod %s\n"
                "\tksk_prepublish %s\n"
                "\tksk_postpublish %s\n"
                "\tzsk_prepublish %s\n"
                "\tzsk_postpublish %s\n"
                "\tksk_standby %s\n"
                "\tzsk_standby %s\n"
                "\tkeyttl %s\n"
                %
                ((self.is_constructed and 'constructed ' or \
                  self.is_zone and 'zone ' or \
                  self.is_alg and 'algorithm ' or ''),
                 self.name or 'UNKNOWN',
                 self.parent and self.parent.name or 'None',
                 self.directory and ('"' + str(self.directory) + '"') or 'None',
                 self.algorithm or 'None',
                 self.coverage and str(self.coverage) or 'None',
                 self.ksk_keysize and str(self.ksk_keysize) or 'None',
                 self.zsk_keysize and str(self.zsk_keysize) or 'None',
                 self.ksk_rollperiod and str(self.ksk_rollperiod) or 'None',
                 self.zsk_rollperiod and str(self.zsk_rollperiod) or 'None',
                 self.ksk_prepublish and str(self.ksk_prepublish) or 'None',
                 self.ksk_postpublish and str(self.ksk_postpublish) or 'None',
                 self.zsk_prepublish and str(self.zsk_prepublish) or 'None',
                 self.zsk_postpublish and str(self.zsk_postpublish) or 'None',
                 self.ksk_standby and str(self.ksk_standby) or 'None',
                 self.zsk_standby and str(self.zsk_standby) or 'None',
                 self.keyttl and str(self.keyttl) or 'None'))

    def __verify_size(self, key_size, size_range):
        return (size_range[0] <= key_size <= size_range[1])

    def get_name(self):
        return self.name

    def constructed(self):
        return self.is_constructed

    def validate(self):
        """ Check if the values in the policy make sense
        :return: True/False if the policy passes validation
        """
        if self.ksk_rollperiod and \
           self.ksk_prepublish is not None and \
           self.ksk_prepublish > self.ksk_rollperiod:
            return (False,
                    ('KSK pre-publish period (%d) exceeds rollover period %d'
                     % (self.ksk_prepublish, self.ksk_rollperiod)))

        if self.ksk_rollperiod and \
           self.ksk_postpublish is not None and \
           self.ksk_postpublish > self.ksk_rollperiod:
            return (False,
                    ('KSK post-publish period (%d) exceeds rollover period %d'
                     % (self.ksk_postpublish, self.ksk_rollperiod)))

        if self.zsk_rollperiod and \
           self.zsk_prepublish is not None and \
           self.zsk_prepublish > self.zsk_rollperiod:
            return (False,
                    ('ZSK pre-publish period (%d) exceeds rollover period %d'
                     % (self.zsk_prepublish, self.zsk_rollperiod)))

        if self.zsk_rollperiod and \
           self.zsk_postpublish is not None and \
           self.zsk_postpublish > self.zsk_rollperiod:
            return (False,
                    ('ZSK post-publish period (%d) exceeds rollover period %d'
                     % (self.zsk_postpublish, self.zsk_rollperiod)))

        if self.ksk_rollperiod and \
           self.ksk_prepublish and self.ksk_postpublish and \
           self.ksk_prepublish + self.ksk_postpublish >= self.ksk_rollperiod:
            return (False,
                    (('KSK pre/post-publish periods (%d/%d) ' +
                      'combined exceed rollover period %d') %
                     (self.ksk_prepublish,
                      self.ksk_postpublish,
                      self.ksk_rollperiod)))

        if self.zsk_rollperiod and \
           self.zsk_prepublish and self.zsk_postpublish and \
           self.zsk_prepublish + self.zsk_postpublish >= self.zsk_rollperiod:
            return (False,
                    (('ZSK pre/post-publish periods (%d/%d) ' +
                      'combined exceed rollover period %d') %
                     (self.zsk_prepublish,
                      self.zsk_postpublish,
                      self.zsk_rollperiod)))

        if self.algorithm is not None:
            # Validate the key size
            key_sz_range = self.valid_key_sz_per_algo.get(self.algorithm)
            if key_sz_range is not None:
                # Verify KSK
                if not self.__verify_size(self.ksk_keysize, key_sz_range):
                    return False, 'KSK key size %d outside valid range %s' \
                            % (self.ksk_keysize, key_sz_range)

                # Verify ZSK
                if not self.__verify_size(self.zsk_keysize, key_sz_range):
                    return False, 'ZSK key size %d outside valid range %s' \
                            % (self.zsk_keysize, key_sz_range)

            # Specific check for DSA keys
            if self.algorithm in ['DSA', 'NSEC3DSA'] and \
               self.ksk_keysize % 64 != 0:
                return False, \
                        ('KSK key size %d not divisible by 64 ' +
                         'as required for DSA') % self.ksk_keysize

            if self.algorithm in ['DSA', 'NSEC3DSA'] and \
               self.zsk_keysize % 64 != 0:
                return False, \
                        ('ZSK key size %d not divisible by 64 ' +
                         'as required for DSA') % self.zsk_keysize

            if self.algorithm in ['ECCGOST', \
                                  'ECDSAP256SHA256', \
                                  'ECDSAP384SHA384', \
                                  'ED25519', \
                                  'ED448']:
                self.ksk_keysize = None
                self.zsk_keysize = None

        return True, ''
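
# Illustrative usage (not part of the original module; the sizes and periods
# below are assumptions).  A hand-built Policy can be checked directly:
#
#   p = Policy(name='example.com', algorithm='RSASHA256')
#   p.ksk_keysize, p.zsk_keysize = 2048, 1024
#   p.ksk_rollperiod, p.ksk_prepublish = 31536000, 2592000
#   ok, msg = p.validate()               # -> (True, '')
#
#   p.zsk_keysize = 256
#   ok, msg = p.validate()               # -> (False, 'ZSK key size 256 ...')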

############################################################################
# dnssec_policy:
# This class reads a dnssec.policy file and creates a dictionary of
# DNSSEC policy rules from which a policy for a specific zone can
# be generated.
############################################################################
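#
# Illustrative usage of the dnssec_policy class defined below (not part of
# the original module; the policy file name is an assumption):
#
#   dp = dnssec_policy('/etc/dnssec-policy.conf')  # omit the file name to
#                                                  # use the built-in defaults
#   pol = dp.policy('example.com')                 # merged policy for a zone
#   ok, msg = pol.validate()
#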
class PolicyException(Exception):
    pass

class dnssec_policy:
    alg_policy = {}
    named_policy = {}
    zone_policy = {}
    current = None
    filename = None
    initial = True

    def __init__(self, filename=None, **kwargs):
        self.plex = PolicyLex()
        self.tokens = self.plex.tokens
        if 'debug' not in kwargs:
            kwargs['debug'] = False
        if 'write_tables' not in kwargs:
            kwargs['write_tables'] = False
        self.parser = yacc.yacc(module=self, **kwargs)

        # set defaults
        self.setup('''policy global { algorithm rsasha256;
                                      key-size ksk 2048;
                                      key-size zsk 2048;
                                      roll-period ksk 0;
                                      roll-period zsk 1y;
                                      pre-publish ksk 1mo;
                                      pre-publish zsk 1mo;
                                      post-publish ksk 1mo;
                                      post-publish zsk 1mo;
                                      standby ksk 0;
                                      standby zsk 0;
                                      keyttl 1h;
                                      coverage 6mo; };
                      policy default { policy global; };''')
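
        # For reference, a user-supplied policy file uses the same grammar
        # as the built-in defaults above; an illustrative (not
        # authoritative) example:
        #
        #   policy standard {
        #       policy default;
        #       coverage 1y;
        #   };
        #   zone example.com {
        #       policy standard;
        #       algorithm ecdsap256sha256;
        #   };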

        p = Policy()
        p.algorithm = None
        p.is_alg = True
        p.ksk_keysize = 2048
        p.zsk_keysize = 2048

        # set default algorithm policies
        # these need a lower default key size:
        self.alg_policy['DSA'] = copy(p)
        self.alg_policy['DSA'].algorithm = "DSA"
        self.alg_policy['DSA'].name = "DSA"
        self.alg_policy['DSA'].ksk_keysize = 1024

        self.alg_policy['NSEC3DSA'] = copy(p)
        self.alg_policy['NSEC3DSA'].algorithm = "NSEC3DSA"
        self.alg_policy['NSEC3DSA'].name = "NSEC3DSA"
        self.alg_policy['NSEC3DSA'].ksk_keysize = 1024

        # these can use default settings
        self.alg_policy['RSAMD5'] = copy(p)
        self.alg_policy['RSAMD5'].algorithm = "RSAMD5"
        self.alg_policy['RSAMD5'].name = "RSAMD5"

        self.alg_policy['RSASHA1'] = copy(p)
        self.alg_policy['RSASHA1'].algorithm = "RSASHA1"
        self.alg_policy['RSASHA1'].name = "RSASHA1"

        self.alg_policy['NSEC3RSASHA1'] = copy(p)
        self.alg_policy['NSEC3RSASHA1'].algorithm = "NSEC3RSASHA1"
        self.alg_policy['NSEC3RSASHA1'].name = "NSEC3RSASHA1"

        self.alg_policy['RSASHA256'] = copy(p)
        self.alg_policy['RSASHA256'].algorithm = "RSASHA256"
        self.alg_policy['RSASHA256'].name = "RSASHA256"

        self.alg_policy['RSASHA512'] = copy(p)
        self.alg_policy['RSASHA512'].algorithm = "RSASHA512"
        self.alg_policy['RSASHA512'].name = "RSASHA512"

        self.alg_policy['ECCGOST'] = copy(p)
        self.alg_policy['ECCGOST'].algorithm = "ECCGOST"
        self.alg_policy['ECCGOST'].name = "ECCGOST"

        self.alg_policy['ECDSAP256SHA256'] = copy(p)
        self.alg_policy['ECDSAP256SHA256'].algorithm = "ECDSAP256SHA256"
        self.alg_policy['ECDSAP256SHA256'].name = "ECDSAP256SHA256"
        self.alg_policy['ECDSAP256SHA256'].ksk_keysize = None
        self.alg_policy['ECDSAP256SHA256'].zsk_keysize = None

        self.alg_policy['ECDSAP384SHA384'] = copy(p)
        self.alg_policy['ECDSAP384SHA384'].algorithm = "ECDSAP384SHA384"
        self.alg_policy['ECDSAP384SHA384'].name = "ECDSAP384SHA384"
        self.alg_policy['ECDSAP384SHA384'].ksk_keysize = None
        self.alg_policy['ECDSAP384SHA384'].zsk_keysize = None

        self.alg_policy['ED25519'] = copy(p)
        self.alg_policy['ED25519'].algorithm = "ED25519"
        self.alg_policy['ED25519'].name = "ED25519"
        self.alg_policy['ED25519'].ksk_keysize = None
        self.alg_policy['ED25519'].zsk_keysize = None

        self.alg_policy['ED448'] = copy(p)
        self.alg_policy['ED448'].algorithm = "ED448"
        self.alg_policy['ED448'].name = "ED448"
        self.alg_policy['ED448'].ksk_keysize = None
        self.alg_policy['ED448'].zsk_keysize = None

        if filename:
            self.load(filename)

    def load(self, filename):
        self.filename = filename
        self.initial = True
        with open(filename) as f:
            text = f.read()
            self.plex.lexer.lineno = 0
            self.parser.parse(text)

        self.filename = None

    def setup(self, text):
        self.initial = True
        self.plex.lexer.lineno = 0
        self.parser.parse(text)

    def policy(self, zone, **kwargs):
        z = zone.lower()
        p = None

        if z in self.zone_policy:
            p = self.zone_policy[z]

        if p is None:
            p = copy(self.named_policy['default'])
            p.name = zone
            p.is_constructed = True

        if p.algorithm is None:
            parent = p.parent or self.named_policy['default']
            while parent and not parent.algorithm:
                parent = parent.parent
            p.algorithm = parent and parent.algorithm or None

        if p.algorithm in self.alg_policy:
            ap = self.alg_policy[p.algorithm]
        else:
            raise PolicyException('algorithm not found')

        if p.directory is None:
            parent = p.parent or self.named_policy['default']
            while parent is not None and not parent.directory:
                parent = parent.parent
            p.directory = parent and parent.directory

        if p.coverage is None:
            parent = p.parent or self.named_policy['default']
            while parent and not parent.coverage:
                parent = parent.parent
            p.coverage = parent and parent.coverage or ap.coverage

        if p.ksk_keysize is None:
            parent = p.parent or self.named_policy['default']
            while parent.parent and not parent.ksk_keysize:
                parent = parent.parent
            p.ksk_keysize = parent and parent.ksk_keysize or ap.ksk_keysize

        if p.zsk_keysize is None:
            parent = p.parent or self.named_policy['default']
            while parent.parent and not parent.zsk_keysize:
                parent = parent.parent
            p.zsk_keysize = parent and parent.zsk_keysize or ap.zsk_keysize

        if p.ksk_rollperiod is None:
            parent = p.parent or self.named_policy['default']
            while parent.parent and not parent.ksk_rollperiod:
                parent = parent.parent
            p.ksk_rollperiod = parent and \
                parent.ksk_rollperiod or ap.ksk_rollperiod

        if p.zsk_rollperiod is None:
            parent = p.parent or self.named_policy['default']
            while parent.parent and not parent.zsk_rollperiod:
                parent = parent.parent
            p.zsk_rollperiod = parent and \
                parent.zsk_rollperiod or ap.zsk_rollperiod

        if p.ksk_prepublish is None:
            parent = p.parent or self.named_policy['default']
            while parent.parent and not parent.ksk_prepublish:
                parent = parent.parent
            p.ksk_prepublish = parent and \
                parent.ksk_prepublish or ap.ksk_prepublish

        if p.zsk_prepublish is None:
            parent = p.parent or self.named_policy['default']
            while parent.parent and not parent.zsk_prepublish:
                parent = parent.parent
            p.zsk_prepublish = parent and \
                parent.zsk_prepublish or ap.zsk_prepublish

        if p.ksk_postpublish is None:
            parent = p.parent or self.named_policy['default']
            while parent.parent and not parent.ksk_postpublish:
                parent = parent.parent
            p.ksk_postpublish = parent and \
                parent.ksk_postpublish or ap.ksk_postpublish

        if p.zsk_postpublish is None:
            parent = p.parent or self.named_policy['default']
            while parent.parent and not parent.zsk_postpublish:
                parent = parent.parent
            p.zsk_postpublish = parent and \
                parent.zsk_postpublish or ap.zsk_postpublish

        if p.keyttl is None:
            parent = p.parent or self.named_policy['default']
            while parent is not None and not parent.keyttl:
                parent = parent.parent
            p.keyttl = parent and parent.keyttl

        if 'novalidate' not in kwargs or not kwargs['novalidate']:
            (valid, msg) = p.validate()
            if not valid:
                raise PolicyException(msg)

        return p


    def p_policylist(self, p):
        '''policylist : init policy
                      | policylist policy'''
        pass

    def p_init(self, p):
        "init :"
        self.initial = False

    def p_policy(self, p):
        '''policy : alg_policy
                  | zone_policy
                  | named_policy'''
        pass

    def p_name(self, p):
        '''name : STR
                | KEYTYPE
                | DATESUFFIX'''
        p[0] = p[1]
        pass

    def p_domain(self, p):
        '''domain : STR
                  | QSTRING
                  | KEYTYPE
                  | DATESUFFIX'''
        p[0] = p[1].strip()
        if not re.match(r'^[\w.-][\w.-]*$', p[0]):
            raise PolicyException('invalid domain')
        pass

    def p_new_policy(self, p):
        "new_policy :"
        self.current = Policy()

    def p_alg_policy(self, p):
        "alg_policy : ALGORITHM_POLICY ALGNAME new_policy alg_option_group SEMI"
        self.current.name = p[2]
        self.current.is_alg = True
        self.alg_policy[p[2]] = self.current
        pass

    def p_zone_policy(self, p):
        "zone_policy : ZONE domain new_policy policy_option_group SEMI"
        self.current.name = p[2].rstrip('.')
        self.current.is_zone = True
        self.zone_policy[p[2].rstrip('.').lower()] = self.current
        pass

    def p_named_policy(self, p):
        "named_policy : POLICY name new_policy policy_option_group SEMI"
        self.current.name = p[2]
        self.named_policy[p[2].lower()] = self.current
        pass

    def p_duration_1(self, p):
        "duration : NUMBER"
        p[0] = p[1]
        pass

    def p_duration_2(self, p):
        "duration : NONE"
        p[0] = None
        pass

    def p_duration_3(self, p):
        "duration : NUMBER DATESUFFIX"
        if p[2] == "y":
            p[0] = p[1] * 31536000 # year
        elif p[2] == "mo":
            p[0] = p[1] * 2592000  # month
        elif p[2] == "w":
            p[0] = p[1] * 604800   # week
        elif p[2] == "d":
            p[0] = p[1] * 86400    # day
        elif p[2] == "h":
            p[0] = p[1] * 3600     # hour
        elif p[2] == "mi":
            p[0] = p[1] * 60       # minute
        elif p[2] == "s":
            p[0] = p[1]            # second
        else:
            raise PolicyException('invalid duration')

    def p_policy_option_group(self, p):
        "policy_option_group : LBRACE policy_option_list RBRACE"
        pass

    def p_policy_option_list(self, p):
        '''policy_option_list : policy_option SEMI
                              | policy_option_list policy_option SEMI'''
        pass

    def p_policy_option(self, p):
        '''policy_option : parent_option
                         | directory_option
                         | coverage_option
                         | rollperiod_option
                         | prepublish_option
                         | postpublish_option
                         | keysize_option
                         | algorithm_option
                         | keyttl_option
                         | standby_option'''
        pass

    def p_alg_option_group(self, p):
        "alg_option_group : LBRACE alg_option_list RBRACE"
        pass

    def p_alg_option_list(self, p):
        '''alg_option_list : alg_option SEMI
                           | alg_option_list alg_option SEMI'''
        pass

    def p_alg_option(self, p):
        '''alg_option : coverage_option
                      | rollperiod_option
                      | prepublish_option
                      | postpublish_option
                      | keyttl_option
                      | keysize_option
                      | standby_option'''
        pass

    def p_parent_option(self, p):
        "parent_option : POLICY name"
        self.current.parent = self.named_policy[p[2].lower()]

    def p_directory_option(self, p):
        "directory_option : DIRECTORY QSTRING"
        self.current.directory = p[2]

    def p_coverage_option(self, p):
        "coverage_option : COVERAGE duration"
        self.current.coverage = p[2]

    def p_rollperiod_option(self, p):
        "rollperiod_option : ROLL_PERIOD KEYTYPE duration"
        if p[2] == "KSK":
            self.current.ksk_rollperiod = p[3]
        else:
            self.current.zsk_rollperiod = p[3]

    def p_prepublish_option(self, p):
        "prepublish_option : PRE_PUBLISH KEYTYPE duration"
        if p[2] == "KSK":
            self.current.ksk_prepublish = p[3]
        else:
            self.current.zsk_prepublish = p[3]

    def p_postpublish_option(self, p):
        "postpublish_option : POST_PUBLISH KEYTYPE duration"
        if p[2] == "KSK":
            self.current.ksk_postpublish = p[3]
        else:
            self.current.zsk_postpublish = p[3]

    def p_keysize_option(self, p):
        "keysize_option : KEY_SIZE KEYTYPE NUMBER"
        if p[2] == "KSK":
            self.current.ksk_keysize = p[3]
        else:
            self.current.zsk_keysize = p[3]

    def p_standby_option(self, p):
        "standby_option : STANDBY KEYTYPE NUMBER"
        if p[2] == "KSK":
            self.current.ksk_standby = p[3]
        else:
            self.current.zsk_standby = p[3]

    def p_keyttl_option(self, p):
        "keyttl_option : KEYTTL duration"
        self.current.keyttl = p[2]

    def p_algorithm_option(self, p):
        "algorithm_option : ALGORITHM ALGNAME"
        self.current.algorithm = p[2]

    def p_error(self, p):
        if p:
            print("%s%s%d:syntax error near '%s'" %
                    (self.filename or "", ":" if self.filename else "",
                     p.lineno, p.value))
        else:
            if not self.initial:
                raise PolicyException("%s%s%d:unexpected end of input" %
                    (self.filename or "", ":" if self.filename else "",
                     p and p.lineno or 0))

if __name__ == "__main__":
    import sys
    if sys.argv[1] == "lex":
        with open(sys.argv[2]) as f:
            text = f.read()
        plex = PolicyLex(debug=1)
        plex.test(text)
    elif sys.argv[1] == "parse":
        try:
            pp = dnssec_policy(sys.argv[2], write_tables=True, debug=True)
            print(pp.named_policy['default'])
            print(pp.policy("nonexistent.zone"))
        except Exception as e:
            print(e.args[0])
site-packages/isc/keymgr.py
############################################################################
# Copyright (C) Internet Systems Consortium, Inc. ("ISC")
#
# This Source Code Form is subject to the terms of the Mozilla Public
# License, v. 2.0. If a copy of the MPL was not distributed with this
# file, you can obtain one at https://mozilla.org/MPL/2.0/.
#
# See the COPYRIGHT file distributed with this work for additional
# information regarding copyright ownership.
############################################################################

from __future__ import print_function
import os, sys, argparse, glob, re, time, calendar, pprint
from collections import defaultdict

prog='dnssec-keymgr'

from isc import dnskey, keydict, keyseries, policy, parsetab, utils

############################################################################
# print a fatal error and exit
############################################################################
def fatal(*args, **kwargs):
    print(*args, **kwargs)
    sys.exit(1)

############################################################################
# find the location of an external command
############################################################################
def set_path(command, default=None):
    """ find the location of a specified command. If a default is supplied,
    exists and it's an executable, we use it; otherwise we search PATH
    for an alternative.
    :param command: command to look for
    :param default: default value to use
    :return: PATH with the location of a suitable binary
    """
    fpath = default
    if not fpath or not os.path.isfile(fpath) or not os.access(fpath, os.X_OK):
        path = os.environ["PATH"]
        if not path:
            path = os.path.defpath
        for directory in path.split(os.pathsep):
            fpath = directory + os.sep + command
            if os.path.isfile(fpath) and os.access(fpath, os.X_OK):
                break
            fpath = None

    return fpath
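
# Example (illustrative paths; not guaranteed to exist on every system):
#   set_path('dnssec-keygen', '/nonexistent/dnssec-keygen') ignores the
#   unusable default and searches PATH, returning e.g.
#   '/usr/sbin/dnssec-keygen', or None if no such executable is found.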

############################################################################
# parse arguments
############################################################################
def parse_args():
    """ Read command line arguments, returns 'args' object
    :return: args object properly prepared
    """

    keygen = set_path('dnssec-keygen',
                      os.path.join(utils.prefix('sbin'), 'dnssec-keygen'))
    settime = set_path('dnssec-settime',
                       os.path.join(utils.prefix('sbin'), 'dnssec-settime'))

    parser = argparse.ArgumentParser(description=prog + ': schedule '
                                     'DNSSEC key rollovers according to a '
                                     'pre-defined policy')

    parser.add_argument('zone', type=str, nargs='*', default=None,
                        help='Zone(s) to which the policy should be applied ' +
                        '(default: all zones in the directory)')
    parser.add_argument('-K', dest='path', type=str,
                        help='Directory containing keys', metavar='dir')
    parser.add_argument('-c', dest='policyfile', type=str,
                        help='Policy definition file', metavar='file')
    parser.add_argument('-g', dest='keygen', default=keygen, type=str,
                        help='Path to \'dnssec-keygen\'',
                        metavar='path')
    parser.add_argument('-r', dest='randomdev', type=str, default=None,
                        help='Path to a file containing random data to pass to \'dnssec-keygen\'',
                        metavar='path')
    parser.add_argument('-s', dest='settime', default=settime, type=str,
                        help='Path to \'dnssec-settime\'',
                        metavar='path')
    parser.add_argument('-k', dest='no_zsk',
                        action='store_true', default=False,
                        help='Only apply policy to key-signing keys (KSKs)')
    parser.add_argument('-z', dest='no_ksk',
                        action='store_true', default=False,
                        help='Only apply policy to zone-signing keys (ZSKs)')
    parser.add_argument('-f', '--force', dest='force', action='store_true',
                        default=False, help='Force updates to key events '+
                        'even if they are in the past')
    parser.add_argument('-q', '--quiet', dest='quiet', action='store_true',
                        default=False, help='Update keys silently')
    parser.add_argument('-v', '--version', action='version',
                        version=utils.version)

    args = parser.parse_args()

    if args.no_zsk and args.no_ksk:
        fatal("ERROR: -z and -k cannot be used together.")

    if args.keygen is None:
        fatal("ERROR: dnssec-keygen not found")

    if args.settime is None:
        fatal("ERROR: dnssec-settime not found")

    # if a policy file was specified, check that it exists;
    # otherwise fall back to the default file, if that exists
    if args.policyfile is not None:
        if not os.path.exists(args.policyfile):
            fatal('ERROR: Policy file "%s" not found' % args.policyfile)
    else:
        args.policyfile = os.path.join(utils.sysconfdir,
                                       'dnssec-policy.conf')
        if not os.path.exists(args.policyfile):
            args.policyfile = None

    return args

############################################################################
# main
############################################################################
def main():
    args = parse_args()

    # As we may have specific locations for the binaries, we put that info
    # into a context object that can be passed around
    context = {'keygen_path': args.keygen,
               'settime_path': args.settime,
               'keys_path': args.path,
               'randomdev': args.randomdev}

    try:
        dp = policy.dnssec_policy(args.policyfile)
    except Exception as e:
        fatal('Unable to load DNSSEC policy: ' + str(e))

    try:
        kd = keydict(dp, path=args.path, zones=args.zone)
    except Exception as e:
        fatal('Unable to build key dictionary: ' + str(e))

    try:
        ks = keyseries(kd, context=context)
    except Exception as e:
        fatal('Unable to build key series: ' + str(e))

    try:
        ks.enforce_policy(dp, ksk=args.no_zsk, zsk=args.no_ksk,
                          force=args.force, quiet=args.quiet)
    except Exception as e:
        fatal('Unable to apply policy: ' + str(e))
site-packages/isc/rndc.py
############################################################################
# Copyright (C) Internet Systems Consortium, Inc. ("ISC")
#
# This Source Code Form is subject to the terms of the Mozilla Public
# License, v. 2.0. If a copy of the MPL was not distributed with this
# file, you can obtain one at https://mozilla.org/MPL/2.0/.
#
# See the COPYRIGHT file distributed with this work for additional
# information regarding copyright ownership.
############################################################################

############################################################################
# rndc.py
# This module implements the RNDC control protocol.
############################################################################

from collections import OrderedDict
import time
import struct
import hashlib
import hmac
import base64
import random
import socket


class rndc(object):
    """RNDC protocol client library"""
    __algos = {'md5':    157,
               'sha1':   161,
               'sha224': 162,
               'sha256': 163,
               'sha384': 164,
               'sha512': 165}

    def __init__(self, host, algo, secret):
        """Creates a persistent connection to RNDC and logs in
        host - (ip, port) tuple
        algo - HMAC algorithm: one of md5, sha1, sha224, sha256, sha384, sha512
               (with optional prefix 'hmac-')
        secret - HMAC secret, base64 encoded"""
        self.host = host
        algo = algo.lower()
        if algo.startswith('hmac-'):
            algo = algo[5:]
        self.algo = algo
        self.hlalgo = getattr(hashlib, algo)
        self.secret = base64.b64decode(secret)
        self.ser = random.randint(0, 1 << 24)
        self.nonce = None
        self.__connect_login()

    def call(self, cmd):
        """Call a RNDC command, all parsing is done on the server side
        cmd - a complete string with a command (eg 'reload zone example.com')
        """
        return dict(self.__command(type=cmd)['_data'])
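
    # A minimal usage sketch (comments only); the address, algorithm and
    # secret are illustrative assumptions and must match the server's
    # controls/key configuration:
    #
    #     r = rndc(('127.0.0.1', 953), 'hmac-sha256', '<base64-secret>')
    #     result = r.call('status')
    #     print(result.get('text'))   # textual reply, when the server sends one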

    def __serialize_dict(self, data, ignore_auth=False):
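        # Wire format produced here (and read back by __parse_element):
        # for each item, a 1-byte label length and the label itself, then a
        # 1-byte type (1 = binary value, 2 = nested dictionary) and a 4-byte
        # big-endian value length, followed by the value.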
        rv = bytearray()
        for k, v in data.items():
            if ignore_auth and k == '_auth':
                continue
            rv += struct.pack('B', len(k)) + k.encode('ascii')
            if type(v) == str:
                rv += struct.pack('>BI', 1, len(v)) + v.encode('ascii')
            elif type(v) == bytes:
                rv += struct.pack('>BI', 1, len(v)) + v
            elif type(v) == bytearray:
                rv += struct.pack('>BI', 1, len(v)) + v
            elif type(v) == OrderedDict:
                sd = self.__serialize_dict(v)
                rv += struct.pack('>BI', 2, len(sd)) + sd
            else:
                raise NotImplementedError('Cannot serialize element of type %s'
                                          % type(v))
        return rv

    def __prep_message(self, *args, **kwargs):
        self.ser += 1
        now = int(time.time())
        data = OrderedDict(*args, **kwargs)

        d = OrderedDict()
        d['_auth'] = OrderedDict()
        d['_ctrl'] = OrderedDict()
        d['_ctrl']['_ser'] = str(self.ser)
        d['_ctrl']['_tim'] = str(now)
        d['_ctrl']['_exp'] = str(now+60)
        if self.nonce is not None:
            d['_ctrl']['_nonce'] = self.nonce
        d['_data'] = data

        msg = self.__serialize_dict(d, ignore_auth=True)
        hash = hmac.new(self.secret, msg, self.hlalgo).digest()
        bhash = base64.b64encode(hash)
        if self.algo == 'md5':
            d['_auth']['hmd5'] = struct.pack('22s', bhash)
        else:
            d['_auth']['hsha'] = bytearray(struct.pack('B88s',
                                             self.__algos[self.algo], bhash))
        msg = self.__serialize_dict(d)
        msg = struct.pack('>II', len(msg) + 4, 1) + msg
        return msg

    def __verify_msg(self, msg):
        if self.nonce is not None and msg['_ctrl']['_nonce'] != self.nonce:
            return False
        if self.algo == 'md5':
            bhash = msg['_auth']['hmd5']
        else:
            bhash = msg['_auth']['hsha'][1:]
        if type(bhash) == bytes:
            bhash = bhash.decode('ascii')
        bhash += '=' * (4 - (len(bhash) % 4))
        remote_hash = base64.b64decode(bhash)
        my_msg = self.__serialize_dict(msg, ignore_auth=True)
        my_hash = hmac.new(self.secret, my_msg, self.hlalgo).digest()
        return (my_hash == remote_hash)

    def __command(self, *args, **kwargs):
        msg = self.__prep_message(*args, **kwargs)
        sent = self.socket.send(msg)
        if sent != len(msg):
            raise IOError("Cannot send the message")

        header = self.socket.recv(8)
        if len(header) != 8:
            # What should we throw here? Bad auth can cause this...
            raise IOError("Can't read response header")

        length, version = struct.unpack('>II', header)
        if version != 1:
            raise NotImplementedError('Wrong message version %d' % version)

        # the length field counts the 4-byte version word we already read
        length -= 4
        data = self.socket.recv(length, socket.MSG_WAITALL)
        if len(data) != length:
            raise IOError("Can't read response data")

        if type(data) == str:
            data = bytearray(data)
        msg = self.__parse_message(data)
        if not self.__verify_msg(msg):
            raise IOError("Authentication failure")

        return msg

    def __connect_login(self):
        self.socket = socket.create_connection(self.host)
        self.nonce = None
        msg = self.__command(type='null')
        self.nonce = msg['_ctrl']['_nonce']

    def __parse_element(self, input):
        pos = 0
        labellen = input[pos]
        pos += 1
        label = input[pos:pos+labellen].decode('ascii')
        pos += labellen
        type = input[pos]
        pos += 1
        datalen = struct.unpack('>I', input[pos:pos+4])[0]
        pos += 4
        data = input[pos:pos+datalen]
        pos += datalen
        rest = input[pos:]

        if type == 1:         # raw binary value
            return label, data, rest
        elif type == 2:       # dictionary
            d = OrderedDict()
            while len(data) > 0:
                ilabel, value, data = self.__parse_element(data)
                d[ilabel] = value
            return label, d, rest
        # TODO type 3 - list
        else:
            raise NotImplementedError('Unknown element type %d' % type)

    def __parse_message(self, input):
        rv = OrderedDict()
        while len(input) > 0:
            label, value, input = self.__parse_element(input)
            rv[label] = value
        return rv
site-packages/isc/eventlist.py
############################################################################
# Copyright (C) Internet Systems Consortium, Inc. ("ISC")
#
# This Source Code Form is subject to the terms of the Mozilla Public
# License, v. 2.0. If a copy of the MPL was not distributed with this
# file, you can obtain one at https://mozilla.org/MPL/2.0/.
#
# See the COPYRIGHT file distributed with this work for additional
# information regarding copyright ownership.
############################################################################

from collections import defaultdict
# calendar and time are used by coverage()/checkset() below
import calendar
import time
from .dnskey import *
from .keydict import *
from .keyevent import *


class eventlist:
    _K = defaultdict(lambda: defaultdict(list))
    _Z = defaultdict(lambda: defaultdict(list))
    _zones = set()
    _kdict = None

    def __init__(self, kdict):
        properties = ["SyncPublish", "Publish", "SyncDelete",
                      "Activate", "Inactive", "Delete"]
        self._kdict = kdict
        for zone in kdict.zones():
            self._zones.add(zone)
            for alg, keys in kdict[zone].items():
                for k in keys.values():
                    for prop in properties:
                        t = k.gettime(prop)
                        if not t:
                            continue
                        e = keyevent(prop, k, t)
                        if k.sep:
                            self._K[zone][alg].append(e)
                        else:
                            self._Z[zone][alg].append(e)

                self._K[zone][alg] = sorted(self._K[zone][alg],
                                            key=lambda event: event.when)
                self._Z[zone][alg] = sorted(self._Z[zone][alg],
                                            key=lambda event: event.when)

    # scan events per zone, algorithm, and key type, in order of
    # occurrence, noting inconsistent states when found
    def coverage(self, zone, keytype, until, output = None):
        def noop(*args, **kwargs): pass
        if not output:
            output = noop

        no_zsk = (keytype == "KSK")
        no_ksk = (keytype == "ZSK")
        kok = zok = True
        found = False

        if zone and zone not in self._zones:
            output("ERROR: No key events found for %s" % zone)
            return False

        if zone:
            found = True
            if not no_ksk:
                kok = self.checkzone(zone, "KSK", until, output)
            if not no_zsk:
                zok = self.checkzone(zone, "ZSK", until, output)
        else:
            for z in self._zones:
                if not no_ksk and z in self._K.keys():
                    found = True
                    kok = self.checkzone(z, "KSK", until, output)
                if not no_zsk and z in self._Z.keys():
                    found = True
                    zok = self.checkzone(z, "ZSK", until, output)

        if not found:
            output("ERROR: No key events found")
            return False

        return (kok and zok)
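
    # A minimal usage sketch (comments only); building the keydict is the
    # caller's job, as the dnssec-coverage tool does:
    #
    #     evl = eventlist(kd)        # kd: a populated isc.keydict
    #     until = calendar.timegm(time.gmtime()) + 180 * 86400
    #     ok = evl.coverage("example.com", "ZSK", until,
    #                       output=lambda *a, **kw: print(*a))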

    def checkzone(self, zone, keytype, until, output):
        allok = True
        if keytype == "KSK":
            kz = self._K[zone]
        else:
            kz = self._Z[zone]

        for alg in kz.keys():
            output("Checking scheduled %s events for zone %s, "
                   "algorithm %s..." %
                   (keytype, zone, dnskey.algstr(alg)))
            ok = eventlist.checkset(kz[alg], keytype, until, output)
            if ok:
                output("No errors found")
            allok = allok and ok

        return allok

    @staticmethod
    def showset(eventset, output):
        if not eventset:
            return
        output("  " + eventset[0].showtime() + ":", skip=False)
        for event in eventset:
            output("    %s: %s" % (event.what, repr(event.key)), skip=False)

    @staticmethod
    def checkset(eventset, keytype, until, output):
        groups = list()
        group = list()

        # collect up all events that have the same time
        eventsfound = False
        for event in eventset:
            # we found an event
            eventsfound = True

            # add event to current group
            if (not group or group[0].when == event.when):
                group.append(event)

            # if we're at the end of the list, we're done.  if
            # we've found an event with a later time, start a new group
            if (group[0].when != event.when):
                groups.append(group)
                group = list()
                group.append(event)

        if group:
            groups.append(group)

        if not eventsfound:
            output("ERROR: No %s events found" % keytype)
            return False

        active = published = None
        for group in groups:
            if (until and calendar.timegm(group[0].when) > until):
                output("Ignoring events after %s" %
                      time.strftime("%a %b %d %H:%M:%S UTC %Y", 
                          time.gmtime(until)))
                return True

            for event in group:
                (active, published) = event.status(active, published)

            eventlist.showset(group, output)

            # and then check for inconsistencies:
            if not active:
                output("ERROR: No %s's are active after this event" % keytype)
                return False
            elif not published:
                output("ERROR: No %s's are published after this event"
                        % keytype)
                return False
            elif not published.intersection(active):
                output("ERROR: No %s's are both active and published "
                       "after this event" % keytype)
                return False

        return True

site-packages/isc/parsetab.py
# parsetab.py
# This file is automatically generated. Do not edit.
_tabversion = '3.8'

_lr_method = 'LALR'

_lr_signature = 'C2FE91AF256AF1CCEB499107DA2CFE7E'
    
_lr_action_items = {'ALGORITHM_POLICY':([0,1,2,3,4,5,6,10,29,46,62,],[-3,7,7,-2,-4,-5,-6,-1,-15,-16,-17,]),'ZONE':([0,1,2,3,4,5,6,10,29,46,62,],[-3,8,8,-2,-4,-5,-6,-1,-15,-16,-17,]),'POLICY':([0,1,2,3,4,5,6,10,27,29,46,47,62,77,88,],[-3,9,9,-2,-4,-5,-6,-1,59,-15,-16,59,-17,-22,-23,]),'$end':([1,3,4,5,6,10,29,46,62,],[0,-2,-4,-5,-6,-1,-15,-16,-17,]),'ALGNAME':([7,61,],[11,80,]),'STR':([8,9,59,],[13,18,18,]),'QSTRING':([8,60,],[14,79,]),'KEYTYPE':([8,9,40,41,42,44,45,59,],[15,19,69,70,71,73,74,19,]),'DATESUFFIX':([8,9,59,67,],[16,20,20,82,]),'LBRACE':([11,12,13,14,15,16,17,18,19,20,21,22,23,],[-14,-14,-10,-11,-12,-13,-14,-7,-8,-9,25,27,27,]),'SEMI':([18,19,20,24,26,28,31,32,33,34,35,36,37,38,48,49,50,51,52,53,54,55,56,57,58,63,64,66,67,68,72,75,76,78,79,80,82,83,84,85,86,87,],[-7,-8,-9,29,46,62,65,-37,-38,-39,-40,-41,-42,-43,77,-24,-25,-26,-27,-28,-29,-30,-31,-32,-33,-34,81,-46,-18,-19,-52,-21,88,-44,-45,-53,-20,-47,-48,-49,-50,-51,]),'COVERAGE':([25,27,30,47,65,77,81,88,],[39,39,39,39,-35,-22,-36,-23,]),'ROLL_PERIOD':([25,27,30,47,65,77,81,88,],[40,40,40,40,-35,-22,-36,-23,]),'PRE_PUBLISH':([25,27,30,47,65,77,81,88,],[41,41,41,41,-35,-22,-36,-23,]),'POST_PUBLISH':([25,27,30,47,65,77,81,88,],[42,42,42,42,-35,-22,-36,-23,]),'KEYTTL':([25,27,30,47,65,77,81,88,],[43,43,43,43,-35,-22,-36,-23,]),'KEY_SIZE':([25,27,30,47,65,77,81,88,],[44,44,44,44,-35,-22,-36,-23,]),'STANDBY':([25,27,30,47,65,77,81,88,],[45,45,45,45,-35,-22,-36,-23,]),'DIRECTORY':([27,47,77,88,],[60,60,-22,-23,]),'ALGORITHM':([27,47,77,88,],[61,61,-22,-23,]),'RBRACE':([30,47,65,77,81,88,],[63,75,-35,-22,-36,-23,]),'NUMBER':([39,43,69,70,71,73,74,],[67,67,67,67,67,86,87,]),'NONE':([39,43,69,70,71,],[68,68,68,68,68,]),}

_lr_action = {}
for _k, _v in _lr_action_items.items():
   for _x,_y in zip(_v[0],_v[1]):
      if not _x in _lr_action:  _lr_action[_x] = {}
      _lr_action[_x][_k] = _y
del _lr_action_items

_lr_goto_items = {'policylist':([0,],[1,]),'init':([0,],[2,]),'policy':([1,2,],[3,10,]),'alg_policy':([1,2,],[4,4,]),'zone_policy':([1,2,],[5,5,]),'named_policy':([1,2,],[6,6,]),'domain':([8,],[12,]),'name':([9,59,],[17,78,]),'new_policy':([11,12,17,],[21,22,23,]),'alg_option_group':([21,],[24,]),'policy_option_group':([22,23,],[26,28,]),'alg_option_list':([25,],[30,]),'alg_option':([25,30,],[31,64,]),'coverage_option':([25,27,30,47,],[32,51,32,51,]),'rollperiod_option':([25,27,30,47,],[33,52,33,52,]),'prepublish_option':([25,27,30,47,],[34,53,34,53,]),'postpublish_option':([25,27,30,47,],[35,54,35,54,]),'keyttl_option':([25,27,30,47,],[36,57,36,57,]),'keysize_option':([25,27,30,47,],[37,55,37,55,]),'standby_option':([25,27,30,47,],[38,58,38,58,]),'policy_option_list':([27,],[47,]),'policy_option':([27,47,],[48,76,]),'parent_option':([27,47,],[49,49,]),'directory_option':([27,47,],[50,50,]),'algorithm_option':([27,47,],[56,56,]),'duration':([39,43,69,70,71,],[66,72,83,84,85,]),}

_lr_goto = {}
for _k, _v in _lr_goto_items.items():
   for _x, _y in zip(_v[0], _v[1]):
       if not _x in _lr_goto: _lr_goto[_x] = {}
       _lr_goto[_x][_k] = _y
del _lr_goto_items
_lr_productions = [
  ("S' -> policylist","S'",1,None,None,None),
  ('policylist -> init policy','policylist',2,'p_policylist','policy.py',523),
  ('policylist -> policylist policy','policylist',2,'p_policylist','policy.py',524),
  ('init -> <empty>','init',0,'p_init','policy.py',528),
  ('policy -> alg_policy','policy',1,'p_policy','policy.py',532),
  ('policy -> zone_policy','policy',1,'p_policy','policy.py',533),
  ('policy -> named_policy','policy',1,'p_policy','policy.py',534),
  ('name -> STR','name',1,'p_name','policy.py',538),
  ('name -> KEYTYPE','name',1,'p_name','policy.py',539),
  ('name -> DATESUFFIX','name',1,'p_name','policy.py',540),
  ('domain -> STR','domain',1,'p_domain','policy.py',545),
  ('domain -> QSTRING','domain',1,'p_domain','policy.py',546),
  ('domain -> KEYTYPE','domain',1,'p_domain','policy.py',547),
  ('domain -> DATESUFFIX','domain',1,'p_domain','policy.py',548),
  ('new_policy -> <empty>','new_policy',0,'p_new_policy','policy.py',555),
  ('alg_policy -> ALGORITHM_POLICY ALGNAME new_policy alg_option_group SEMI','alg_policy',5,'p_alg_policy','policy.py',559),
  ('zone_policy -> ZONE domain new_policy policy_option_group SEMI','zone_policy',5,'p_zone_policy','policy.py',566),
  ('named_policy -> POLICY name new_policy policy_option_group SEMI','named_policy',5,'p_named_policy','policy.py',573),
  ('duration -> NUMBER','duration',1,'p_duration_1','policy.py',579),
  ('duration -> NONE','duration',1,'p_duration_2','policy.py',584),
  ('duration -> NUMBER DATESUFFIX','duration',2,'p_duration_3','policy.py',589),
  ('policy_option_group -> LBRACE policy_option_list RBRACE','policy_option_group',3,'p_policy_option_group','policy.py',608),
  ('policy_option_list -> policy_option SEMI','policy_option_list',2,'p_policy_option_list','policy.py',612),
  ('policy_option_list -> policy_option_list policy_option SEMI','policy_option_list',3,'p_policy_option_list','policy.py',613),
  ('policy_option -> parent_option','policy_option',1,'p_policy_option','policy.py',617),
  ('policy_option -> directory_option','policy_option',1,'p_policy_option','policy.py',618),
  ('policy_option -> coverage_option','policy_option',1,'p_policy_option','policy.py',619),
  ('policy_option -> rollperiod_option','policy_option',1,'p_policy_option','policy.py',620),
  ('policy_option -> prepublish_option','policy_option',1,'p_policy_option','policy.py',621),
  ('policy_option -> postpublish_option','policy_option',1,'p_policy_option','policy.py',622),
  ('policy_option -> keysize_option','policy_option',1,'p_policy_option','policy.py',623),
  ('policy_option -> algorithm_option','policy_option',1,'p_policy_option','policy.py',624),
  ('policy_option -> keyttl_option','policy_option',1,'p_policy_option','policy.py',625),
  ('policy_option -> standby_option','policy_option',1,'p_policy_option','policy.py',626),
  ('alg_option_group -> LBRACE alg_option_list RBRACE','alg_option_group',3,'p_alg_option_group','policy.py',630),
  ('alg_option_list -> alg_option SEMI','alg_option_list',2,'p_alg_option_list','policy.py',634),
  ('alg_option_list -> alg_option_list alg_option SEMI','alg_option_list',3,'p_alg_option_list','policy.py',635),
  ('alg_option -> coverage_option','alg_option',1,'p_alg_option','policy.py',639),
  ('alg_option -> rollperiod_option','alg_option',1,'p_alg_option','policy.py',640),
  ('alg_option -> prepublish_option','alg_option',1,'p_alg_option','policy.py',641),
  ('alg_option -> postpublish_option','alg_option',1,'p_alg_option','policy.py',642),
  ('alg_option -> keyttl_option','alg_option',1,'p_alg_option','policy.py',643),
  ('alg_option -> keysize_option','alg_option',1,'p_alg_option','policy.py',644),
  ('alg_option -> standby_option','alg_option',1,'p_alg_option','policy.py',645),
  ('parent_option -> POLICY name','parent_option',2,'p_parent_option','policy.py',649),
  ('directory_option -> DIRECTORY QSTRING','directory_option',2,'p_directory_option','policy.py',653),
  ('coverage_option -> COVERAGE duration','coverage_option',2,'p_coverage_option','policy.py',657),
  ('rollperiod_option -> ROLL_PERIOD KEYTYPE duration','rollperiod_option',3,'p_rollperiod_option','policy.py',661),
  ('prepublish_option -> PRE_PUBLISH KEYTYPE duration','prepublish_option',3,'p_prepublish_option','policy.py',668),
  ('postpublish_option -> POST_PUBLISH KEYTYPE duration','postpublish_option',3,'p_postpublish_option','policy.py',675),
  ('keysize_option -> KEY_SIZE KEYTYPE NUMBER','keysize_option',3,'p_keysize_option','policy.py',682),
  ('standby_option -> STANDBY KEYTYPE NUMBER','standby_option',3,'p_standby_option','policy.py',689),
  ('keyttl_option -> KEYTTL duration','keyttl_option',2,'p_keyttl_option','policy.py',696),
  ('algorithm_option -> ALGORITHM ALGNAME','algorithm_option',2,'p_algorithm_option','policy.py',700),
]
site-packages/isc/keyseries.py
############################################################################
# Copyright (C) Internet Systems Consortium, Inc. ("ISC")
#
# This Source Code Form is subject to the terms of the Mozilla Public
# License, v. 2.0. If a copy of the MPL was not distributed with this
# file, you can obtain one at https://mozilla.org/MPL/2.0/.
#
# See the COPYRIGHT file distributed with this work for additional
# information regarding copyright ownership.
############################################################################

from collections import defaultdict
from .dnskey import *
from .keydict import *
from .keyevent import *
from .policy import *
import time


class keyseries:
    _K = defaultdict(lambda: defaultdict(list))
    _Z = defaultdict(lambda: defaultdict(list))
    _zones = set()
    _kdict = None
    _context = None

    def __init__(self, kdict, now=time.time(), context=None):
        self._kdict = kdict
        self._context = context
        self._zones = set(kdict.missing())

        for zone in kdict.zones():
            self._zones.add(zone)
            for alg, keys in kdict[zone].items():
                for k in keys.values():
                    if k.sep:
                        if not (k.delete() and k.delete() < now):
                            self._K[zone][alg].append(k)
                    else:
                        if not (k.delete() and k.delete() < now):
                            self._Z[zone][alg].append(k)

                self._K[zone][alg].sort()
                self._Z[zone][alg].sort()

    def __iter__(self):
        for zone in self._zones:
            for collection in [self._K, self._Z]:
                if zone not in collection:
                    continue
                for alg, keys in collection[zone].items():
                    for key in keys:
                        yield key

    def dump(self):
        for k in self:
            print("%s" % repr(k))

    def fixseries(self, keys, policy, now, **kwargs):
        force = kwargs.get('force', False)
        if not keys:
            return

        # handle the first key
        key = keys[0]
        if key.sep:
            rp = policy.ksk_rollperiod
            prepub = policy.ksk_prepublish or (30 * 86400)
            postpub = policy.ksk_postpublish or (30 * 86400)
        else:
            rp = policy.zsk_rollperiod
            prepub = policy.zsk_prepublish or (30 * 86400)
            postpub = policy.zsk_postpublish or (30 * 86400)
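
        # Rollover timeline maintained below (illustrative; rp, prepub and
        # postpub are the values chosen above):
        #   successor.Publish    = predecessor.Inactive - prepub
        #   successor.Activate   = predecessor.Inactive
        #   successor.Inactive   = successor.Activate + rp
        #   predecessor.Delete   = predecessor.Inactive + postpub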

        # the first key should be published and active
        p = key.publish()
        a = key.activate()
        if not p or p > now:
            key.setpublish(now)
            p = now
        if not a or a > now:
            key.setactivate(now)
            a = now

        i = key.inactive()
        fudge = 300
        if not rp:
            key.setinactive(None, **kwargs)
            key.setdelete(None, **kwargs)
        elif not i or a + rp != i:
            if not i and a + rp > now + prepub + fudge:
                key.setinactive(a + rp, **kwargs)
                key.setdelete(a + rp + postpub, **kwargs)
            elif not i:
                key.setinactive(now + prepub + fudge, **kwargs)
                key.setdelete(now + prepub + postpub + fudge, **kwargs)
            elif i < now:
                pass
            elif a + rp > i:
                key.setinactive(a + rp, **kwargs)
                key.setdelete(a + rp + postpub, **kwargs)
            elif a + rp > now + prepub + fudge:
                key.setinactive(a + rp, **kwargs)
                key.setdelete(a + rp + postpub, **kwargs)
            else:
                key.setinactive(now + prepub + fudge, **kwargs)
                key.setdelete(now + prepub + postpub + fudge, **kwargs)
        else:
            d = key.delete()
            if not d and i + postpub > now + fudge:
                key.setdelete(i + postpub, **kwargs)
            elif not d:
                key.setdelete(now + postpub + fudge, **kwargs)
            elif d < now + fudge:
                pass
            elif d < i + postpub:
                key.setdelete(i + postpub, **kwargs)

        if policy.keyttl != key.ttl:
            key.setttl(policy.keyttl)

        # handle all the subsequent keys
        prev = key
        for key in keys[1:]:
            # if there is no rollperiod, all keys after the first in
            # the series are kept inactive.
            # (XXX: we need to change this to allow standby keys)
            if not rp:
                key.setpublish(None, **kwargs)
                key.setactivate(None, **kwargs)
                key.setinactive(None, **kwargs)
                key.setdelete(None, **kwargs)
                if policy.keyttl != key.ttl:
                    key.setttl(policy.keyttl)
                continue

            # otherwise, ensure all dates are set correctly based on
            # the initial key
            a = prev.inactive()
            p = a - prepub
            key.setactivate(a, **kwargs)
            key.setpublish(p, **kwargs)
            key.setinactive(a + rp, **kwargs)
            key.setdelete(a + rp + postpub, **kwargs)
            prev.setdelete(a + postpub, **kwargs)
            if policy.keyttl != key.ttl:
                key.setttl(policy.keyttl)
            prev = key

        # if we haven't got sufficient coverage, create successor key(s)
        while rp and prev.inactive() and \
              prev.inactive() < now + policy.coverage:
            # commit changes to predecessor: a successor can only be
            # generated if Inactive has been set in the predecessor key
            prev.commit(self._context['settime_path'], **kwargs)
            key = prev.generate_successor(self._context['keygen_path'],
                                          self._context['randomdev'],
                                          prepub, **kwargs)

            key.setinactive(key.activate() + rp, **kwargs)
            key.setdelete(key.inactive() + postpub, **kwargs)
            keys.append(key)
            prev = key

        # last key? we already know we have sufficient coverage now, so
        # disable the inactivation of the final key (if it was set),
        # ensuring that if dnssec-keymgr isn't run again, the last key
        # in the series will at least remain usable.
        prev.setinactive(None, **kwargs)
        prev.setdelete(None, **kwargs)

        # commit changes
        for key in keys:
            key.commit(self._context['settime_path'], **kwargs)


    def enforce_policy(self, policies, now=time.time(), **kwargs):
        # If zones is provided as a parameter, use that list.
        # If not, use what we have in this object
        zones = kwargs.get('zones', self._zones)
        keys_dir = kwargs.get('dir', self._context.get('keys_path', None))
        force = kwargs.get('force', False)

        for zone in zones:
            collections = []
            policy = policies.policy(zone)
            keys_dir = keys_dir or policy.directory or '.'
            alg = policy.algorithm
            algnum = dnskey.algnum(alg)
            if 'ksk' not in kwargs or not kwargs['ksk']:
                if len(self._Z[zone][algnum]) == 0:
                    k = dnskey.generate(self._context['keygen_path'],
                                        self._context['randomdev'],
                                        keys_dir, zone, alg,
                                        policy.zsk_keysize, False,
                                        policy.keyttl or 3600,
                                        **kwargs)
                    self._Z[zone][algnum].append(k)
                collections.append(self._Z[zone])

            if 'zsk' not in kwargs or not kwargs['zsk']:
                if len(self._K[zone][algnum]) == 0:
                    k = dnskey.generate(self._context['keygen_path'],
                                        self._context['randomdev'],
                                        keys_dir, zone, alg,
                                        policy.ksk_keysize, True,
                                        policy.keyttl or 3600,
                                        **kwargs)
                    self._K[zone][algnum].append(k)
                collections.append(self._K[zone])

            for collection in collections:
                for algorithm, keys in collection.items():
                    if algorithm != algnum:
                        continue
                    try:
                        self.fixseries(keys, policy, now, **kwargs)
                    except Exception as e:
                        raise Exception('%s/%s: %s' %
                                        (zone, dnskey.algstr(algnum), str(e)))
site-packages/isc/utils.py
############################################################################
# Copyright (C) Internet Systems Consortium, Inc. ("ISC")
#
# This Source Code Form is subject to the terms of the Mozilla Public
# License, v. 2.0. If a copy of the MPL was not distributed with this
# file, you can obtain one at https://mozilla.org/MPL/2.0/.
#
# See the COPYRIGHT file distributed with this work for additional
# information regarding copyright ownership.
############################################################################

import os

# These routines permit platform-independent location of BIND 9 tools
if os.name == 'nt':
    import win32con
    import win32api


def prefix(bindir=''):
    if os.name != 'nt':
        return os.path.join('/usr', bindir)

    hklm = win32con.HKEY_LOCAL_MACHINE
    bind_subkey = "Software\\ISC\\BIND"
    sam = win32con.KEY_READ
    h_key = None
    key_found = True
    # can fail if the registry redirected for 32/64 bits
    try:
        h_key = win32api.RegOpenKeyEx(hklm, bind_subkey, 0, sam)
    except:
        key_found = False
    # retry for 32 bit python with 64 bit bind9
    if not key_found:
        key_found = True
        sam64 = sam | win32con.KEY_WOW64_64KEY
        try:
            h_key = win32api.RegOpenKeyEx(hklm, bind_subkey, 0, sam64)
        except:
            key_found = False
    # retry 64 bit python with 32 bit bind9
    if not key_found:
        key_found = True
        sam32 = sam | win32con.KEY_WOW64_32KEY
        try:
            h_key = win32api.RegOpenKeyEx(hklm, bind_subkey, 0, sam32)
        except:
            key_found = False
    if key_found:
        try:
            (named_base, _) = win32api.RegQueryValueEx(h_key, "InstallDir")
        except:
            key_found = False
        win32api.RegCloseKey(h_key)
    if key_found:
        return os.path.join(named_base, bindir)
    return os.path.join(win32api.GetSystemDirectory(), bindir)
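
# Example (hedged): on POSIX systems prefix('sbin') is simply '/usr/sbin';
# on Windows the BIND InstallDir registry value is used, falling back to
# the system directory when the registry key cannot be read.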


def shellquote(s):
    if os.name == 'nt':
        return '"' + s.replace('"', '"\\"') + '"'
    return "'" + s.replace("'", "'\\''") + "'"


version = '9.11.36-RedHat-9.11.36-16.el8_10.2'
if os.name != 'nt':
    sysconfdir = '/etc'
else:
    sysconfdir = prefix('etc')
site-packages/isc/keyzone.py
############################################################################
# Copyright (C) Internet Systems Consortium, Inc. ("ISC")
#
# This Source Code Form is subject to the terms of the Mozilla Public
# License, v. 2.0. If a copy of the MPL was not distributed with this
# file, you can obtain one at https://mozilla.org/MPL/2.0/.
#
# See the COPYRIGHT file distributed with this work for additional
# information regarding copyright ownership.
############################################################################

import os
import sys
import re
from subprocess import Popen, PIPE

########################################################################
# Exceptions
########################################################################
class KeyZoneException(Exception):
    pass

########################################################################
# class keyzone
########################################################################
class keyzone:
    """reads a zone file to find data relevant to keys"""

    def __init__(self, name, filename, czpath):
        self.maxttl = None
        self.keyttl = None

        if not name:
            return

        if not czpath or not os.path.isfile(czpath) \
                or not os.access(czpath, os.X_OK):
            raise KeyZoneException('"named-compilezone" not found')

        maxttl = keyttl = None

        fp, _ = Popen([czpath, "-o", "-", name, filename],
                      stdout=PIPE, stderr=PIPE).communicate()
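        # named-compilezone output is assumed to be one record per line in
        # canonical form, "<owner> <ttl> <class> <type> <rdata...>", so
        # fields[1] below is the TTL and fields[3] is the RR type.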
        for line in fp.splitlines():
            if type(line) is not str:
                line = line.decode('ascii')
            if re.search(r'^\s*;', line):
                continue
            fields = line.split()
            if not maxttl or int(fields[1]) > maxttl:
                maxttl = int(fields[1])
            if fields[3] == "DNSKEY":
                keyttl = int(fields[1])

        self.keyttl = keyttl
        self.maxttl = maxttl
site-packages/decorator.py
# #########################     LICENSE     ############################ #

# Copyright (c) 2005-2018, Michele Simionato
# All rights reserved.

# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions are
# met:

#   Redistributions of source code must retain the above copyright
#   notice, this list of conditions and the following disclaimer.
#   Redistributions in bytecode form must reproduce the above copyright
#   notice, this list of conditions and the following disclaimer in
#   the documentation and/or other materials provided with the
#   distribution.

# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
# "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
# LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
# A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
# HOLDERS OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT,
# INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING,
# BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS
# OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND
# ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR
# TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE
# USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH
# DAMAGE.

"""
Decorator module, see http://pypi.python.org/pypi/decorator
for the documentation.
"""
from __future__ import print_function

import re
import sys
import inspect
import operator
import itertools
import collections

__version__ = '4.2.1'

if sys.version >= '3':
    from inspect import getfullargspec

    def get_init(cls):
        return cls.__init__
else:
    FullArgSpec = collections.namedtuple(
        'FullArgSpec', 'args varargs varkw defaults '
        'kwonlyargs kwonlydefaults')

    def getfullargspec(f):
        "A quick and dirty replacement for getfullargspec for Python 2.X"
        return FullArgSpec._make(inspect.getargspec(f) + ([], None))

    def get_init(cls):
        return cls.__init__.__func__

try:
    iscoroutinefunction = inspect.iscoroutinefunction
except AttributeError:
    # let's assume there are no coroutine functions in old Python
    def iscoroutinefunction(f):
        return False

# getargspec has been deprecated in Python 3.5
ArgSpec = collections.namedtuple(
    'ArgSpec', 'args varargs varkw defaults')


def getargspec(f):
    """A replacement for inspect.getargspec"""
    spec = getfullargspec(f)
    return ArgSpec(spec.args, spec.varargs, spec.varkw, spec.defaults)


DEF = re.compile(r'\s*def\s*([_\w][_\w\d]*)\s*\(')


# basic functionality
class FunctionMaker(object):
    """
    An object with the ability to create functions with a given signature.
    It has attributes name, doc, module, signature, defaults, dict and
    methods update and make.
    """

    # Atomic get-and-increment provided by the GIL
    _compile_count = itertools.count()

    # make pylint happy
    args = varargs = varkw = defaults = kwonlyargs = kwonlydefaults = ()

    def __init__(self, func=None, name=None, signature=None,
                 defaults=None, doc=None, module=None, funcdict=None):
        self.shortsignature = signature
        if func:
            # func can be a class or a callable, but not an instance method
            self.name = func.__name__
            if self.name == '<lambda>':  # small hack for lambda functions
                self.name = '_lambda_'
            self.doc = func.__doc__
            self.module = func.__module__
            if inspect.isfunction(func):
                argspec = getfullargspec(func)
                self.annotations = getattr(func, '__annotations__', {})
                for a in ('args', 'varargs', 'varkw', 'defaults', 'kwonlyargs',
                          'kwonlydefaults'):
                    setattr(self, a, getattr(argspec, a))
                for i, arg in enumerate(self.args):
                    setattr(self, 'arg%d' % i, arg)
                allargs = list(self.args)
                allshortargs = list(self.args)
                if self.varargs:
                    allargs.append('*' + self.varargs)
                    allshortargs.append('*' + self.varargs)
                elif self.kwonlyargs:
                    allargs.append('*')  # single star syntax
                for a in self.kwonlyargs:
                    allargs.append('%s=None' % a)
                    allshortargs.append('%s=%s' % (a, a))
                if self.varkw:
                    allargs.append('**' + self.varkw)
                    allshortargs.append('**' + self.varkw)
                self.signature = ', '.join(allargs)
                self.shortsignature = ', '.join(allshortargs)
                self.dict = func.__dict__.copy()
        # func=None happens when decorating a caller
        if name:
            self.name = name
        if signature is not None:
            self.signature = signature
        if defaults:
            self.defaults = defaults
        if doc:
            self.doc = doc
        if module:
            self.module = module
        if funcdict:
            self.dict = funcdict
        # check that the required attributes exist
        assert hasattr(self, 'name')
        if not hasattr(self, 'signature'):
            raise TypeError('You are decorating a non-function: %s' % func)

    def update(self, func, **kw):
        "Update the signature of func with the data in self"
        func.__name__ = self.name
        func.__doc__ = getattr(self, 'doc', None)
        func.__dict__ = getattr(self, 'dict', {})
        func.__defaults__ = self.defaults
        func.__kwdefaults__ = self.kwonlydefaults or None
        func.__annotations__ = getattr(self, 'annotations', None)
        try:
            frame = sys._getframe(3)
        except AttributeError:  # for IronPython and similar implementations
            callermodule = '?'
        else:
            callermodule = frame.f_globals.get('__name__', '?')
        func.__module__ = getattr(self, 'module', callermodule)
        func.__dict__.update(kw)

    def make(self, src_templ, evaldict=None, addsource=False, **attrs):
        "Make a new function from a given template and update the signature"
        src = src_templ % vars(self)  # expand name and signature
        evaldict = evaldict or {}
        mo = DEF.search(src)
        if mo is None:
            raise SyntaxError('not a valid function template\n%s' % src)
        name = mo.group(1)  # extract the function name
        names = set([name] + [arg.strip(' *') for arg in
                              self.shortsignature.split(',')])
        for n in names:
            if n in ('_func_', '_call_'):
                raise NameError('%s is overridden in\n%s' % (n, src))

        if not src.endswith('\n'):  # add a newline for old Pythons
            src += '\n'

        # Ensure each generated function has a unique filename for profilers
        # (such as cProfile) that depend on the tuple of (<filename>,
        # <definition line>, <function name>) being unique.
        filename = '<decorator-gen-%d>' % (next(self._compile_count),)
        try:
            code = compile(src, filename, 'single')
            exec(code, evaldict)
        except:
            print('Error in generated code:', file=sys.stderr)
            print(src, file=sys.stderr)
            raise
        func = evaldict[name]
        if addsource:
            attrs['__source__'] = src
        self.update(func, **attrs)
        return func

    @classmethod
    def create(cls, obj, body, evaldict, defaults=None,
               doc=None, module=None, addsource=True, **attrs):
        """
        Create a function from the strings name, signature and body.
        evaldict is the evaluation dictionary. If addsource is true an
        attribute __source__ is added to the result. The attributes attrs
        are added, if any.
        """
        if isinstance(obj, str):  # "name(signature)"
            name, rest = obj.strip().split('(', 1)
            signature = rest[:-1]  # strip the trailing right paren
            func = None
        else:  # a function
            name = None
            signature = None
            func = obj
        self = cls(func, name, signature, defaults, doc, module)
        ibody = '\n'.join('    ' + line for line in body.splitlines())
        caller = evaldict.get('_call_')  # when called from `decorate`
        if caller and iscoroutinefunction(caller):
            body = ('async def %(name)s(%(signature)s):\n' + ibody).replace(
                'return', 'return await')
        else:
            body = 'def %(name)s(%(signature)s):\n' + ibody
        return self.make(body, evaldict, addsource, **attrs)
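
# Editor's illustration (not part of the upstream module): FunctionMaker can
# compile a function from a "name(signature)" string and a body template.
# The names ``_example_functionmaker``, ``add_one`` and ``_inc_`` are
# hypothetical.
def _example_functionmaker():
    add_one = FunctionMaker.create(
        'add_one(x)', 'return _inc_(x)',
        dict(_inc_=lambda v: v + 1), doc='Return x + 1.')
    assert add_one(41) == 42
    assert add_one.__name__ == 'add_one'
    return add_one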


def decorate(func, caller, extras=()):
    """
    decorate(func, caller) decorates a function using a caller.
    """
    evaldict = dict(_call_=caller, _func_=func)
    es = ''
    for i, extra in enumerate(extras):
        ex = '_e%d_' % i
        evaldict[ex] = extra
        es += ex + ', '
    fun = FunctionMaker.create(
        func, "return _call_(_func_, %s%%(shortsignature)s)" % es,
        evaldict, __wrapped__=func)
    if hasattr(func, '__qualname__'):
        fun.__qualname__ = func.__qualname__
    return fun
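
# Editor's illustration (not part of the upstream module): a caller receives
# the wrapped function plus its arguments, and ``decorate`` builds a wrapper
# that keeps the original signature.  ``memoize_caller`` and ``add`` are
# hypothetical names.
def _example_decorate():
    cache = {}

    def memoize_caller(func, *args, **kw):
        # naive memoization keyed on the positional arguments only
        if args not in cache:
            cache[args] = func(*args, **kw)
        return cache[args]

    def add(x, y):
        return x + y

    memoized_add = decorate(add, memoize_caller)
    assert memoized_add(1, 2) == 3
    assert memoized_add(1, 2) == 3                       # second call hits the cache
    assert getargspec(memoized_add).args == ['x', 'y']   # signature preserved
    return memoized_add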


def decorator(caller, _func=None):
    """decorator(caller) converts a caller function into a decorator"""
    if _func is not None:  # return a decorated function
        # this is obsolete behavior; you should use decorate instead
        return decorate(_func, caller)
    # else return a decorator function
    defaultargs, defaults = '', ()
    if inspect.isclass(caller):
        name = caller.__name__.lower()
        doc = 'decorator(%s) converts functions/generators into ' \
            'factories of %s objects' % (caller.__name__, caller.__name__)
    elif inspect.isfunction(caller):
        if caller.__name__ == '<lambda>':
            name = '_lambda_'
        else:
            name = caller.__name__
        doc = caller.__doc__
        nargs = caller.__code__.co_argcount
        ndefs = len(caller.__defaults__ or ())
        defaultargs = ', '.join(caller.__code__.co_varnames[nargs-ndefs:nargs])
        if defaultargs:
            defaultargs += ','
        defaults = caller.__defaults__
    else:  # assume caller is an object with a __call__ method
        name = caller.__class__.__name__.lower()
        doc = caller.__call__.__doc__
    evaldict = dict(_call_=caller, _decorate_=decorate)
    dec = FunctionMaker.create(
        '%s(func, %s)' % (name, defaultargs),
        'if func is None: return lambda func:  _decorate_(func, _call_, (%s))\n'
        'return _decorate_(func, _call_, (%s))' % (defaultargs, defaultargs),
        evaldict, doc=doc, module=caller.__module__, __wrapped__=caller)
    if defaults:
        dec.__defaults__ = (None,) + defaults
    return dec
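
# Editor's illustration (not part of the upstream module): ``decorator`` turns
# a caller into a signature- and metadata-preserving decorator.  ``trace`` and
# ``double`` are hypothetical names.
def _example_decorator():
    calls = []

    def trace(func, *args, **kw):
        calls.append(func.__name__)
        return func(*args, **kw)

    traced = decorator(trace)

    @traced
    def double(n):
        return 2 * n

    assert double(21) == 42
    assert calls == ['double']
    assert double.__name__ == 'double'   # unlike a plain closure-based wrapper
    return double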

# ####################### contextmanager ####################### #

try:  # Python >= 3.2
    from contextlib import _GeneratorContextManager
except ImportError:  # Python >= 2.5
    from contextlib import GeneratorContextManager as _GeneratorContextManager


class ContextManager(_GeneratorContextManager):
    def __call__(self, func):
        """Context manager decorator"""
        return FunctionMaker.create(
            func, "with _self_: return _func_(%(shortsignature)s)",
            dict(_self_=self, _func_=func), __wrapped__=func)


init = getfullargspec(_GeneratorContextManager.__init__)
n_args = len(init.args)
if n_args == 2 and not init.varargs:  # (self, genobj) Python 2.7
    def __init__(self, g, *a, **k):
        return _GeneratorContextManager.__init__(self, g(*a, **k))
    ContextManager.__init__ = __init__
elif n_args == 2 and init.varargs:  # (self, gen, *a, **k) Python 3.4
    pass
elif n_args == 4:  # (self, gen, args, kwds) Python 3.5
    def __init__(self, g, *a, **k):
        return _GeneratorContextManager.__init__(self, g, a, k)
    ContextManager.__init__ = __init__

_contextmanager = decorator(ContextManager)


def contextmanager(func):
    # Enable Pylint config: contextmanager-decorators=decorator.contextmanager
    return _contextmanager(func)
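
# Editor's illustration (not part of the upstream module): the decorator-based
# contextmanager behaves like contextlib.contextmanager inside ``with``
# blocks.  ``record`` and ``collected`` are hypothetical names.
def _example_contextmanager():
    collected = []

    @contextmanager
    def record(label):
        collected.append('enter ' + label)
        yield
        collected.append('exit ' + label)

    with record('block'):
        collected.append('body')

    assert collected == ['enter block', 'body', 'exit block']
    return collected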


# ############################ dispatch_on ############################ #

def append(a, vancestors):
    """
    Append ``a`` to the list of the virtual ancestors, unless it is already
    included.
    """
    add = True
    for j, va in enumerate(vancestors):
        if issubclass(va, a):
            add = False
            break
        if issubclass(a, va):
            vancestors[j] = a
            add = False
    if add:
        vancestors.append(a)


# inspired by simplegeneric (P.J. Eby) and functools.singledispatch
def dispatch_on(*dispatch_args):
    """
    Factory of decorators turning a function into a generic function
    dispatching on the given arguments.
    """
    assert dispatch_args, 'No dispatch args passed'
    dispatch_str = '(%s,)' % ', '.join(dispatch_args)

    def check(arguments, wrong=operator.ne, msg=''):
        """Make sure one passes the expected number of arguments"""
        if wrong(len(arguments), len(dispatch_args)):
            raise TypeError('Expected %d arguments, got %d%s' %
                            (len(dispatch_args), len(arguments), msg))

    def gen_func_dec(func):
        """Decorator turning a function into a generic function"""

        # first check the dispatch arguments
        argset = set(getfullargspec(func).args)
        if not set(dispatch_args) <= argset:
            raise NameError('Unknown dispatch arguments %s' % dispatch_str)

        typemap = {}

        def vancestors(*types):
            """
            Get a list of sets of virtual ancestors for the given types
            """
            check(types)
            ras = [[] for _ in range(len(dispatch_args))]
            for types_ in typemap:
                for t, type_, ra in zip(types, types_, ras):
                    if issubclass(t, type_) and type_ not in t.mro():
                        append(type_, ra)
            return [set(ra) for ra in ras]

        def ancestors(*types):
            """
            Get a list of virtual MROs, one for each type
            """
            check(types)
            lists = []
            for t, vas in zip(types, vancestors(*types)):
                n_vas = len(vas)
                if n_vas > 1:
                    raise RuntimeError(
                        'Ambiguous dispatch for %s: %s' % (t, vas))
                elif n_vas == 1:
                    va, = vas
                    mro = type('t', (t, va), {}).mro()[1:]
                else:
                    mro = t.mro()
                lists.append(mro[:-1])  # discard t and object
            return lists

        def register(*types):
            """
            Decorator to register an implementation for the given types
            """
            check(types)

            def dec(f):
                check(getfullargspec(f).args, operator.lt, ' in ' + f.__name__)
                typemap[types] = f
                return f
            return dec

        def dispatch_info(*types):
            """
            A utility to introspect the dispatch algorithm
            """
            check(types)
            lst = []
            for anc in itertools.product(*ancestors(*types)):
                lst.append(tuple(a.__name__ for a in anc))
            return lst

        def _dispatch(dispatch_args, *args, **kw):
            types = tuple(type(arg) for arg in dispatch_args)
            try:  # fast path
                f = typemap[types]
            except KeyError:
                pass
            else:
                return f(*args, **kw)
            combinations = itertools.product(*ancestors(*types))
            next(combinations)  # the first one has been already tried
            for types_ in combinations:
                f = typemap.get(types_)
                if f is not None:
                    return f(*args, **kw)

            # else call the default implementation
            return func(*args, **kw)

        return FunctionMaker.create(
            func, 'return _f_(%s, %%(shortsignature)s)' % dispatch_str,
            dict(_f_=_dispatch), register=register, default=func,
            typemap=typemap, vancestors=vancestors, ancestors=ancestors,
            dispatch_info=dispatch_info, __wrapped__=func)

    gen_func_dec.__name__ = 'dispatch_on' + dispatch_str
    return gen_func_dec
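
# Editor's illustration (not part of the upstream module): a generic function
# dispatching on the class of its ``obj`` argument, with per-type
# implementations added through the generated ``register`` attribute.  All
# names below are hypothetical.
def _example_dispatch_on():
    @dispatch_on('obj')
    def describe(obj):
        return 'something else'

    @describe.register(int)
    def describe_int(obj):
        return 'an integer'

    @describe.register(str)
    def describe_str(obj):
        return 'a string'

    assert describe(3) == 'an integer'
    assert describe('x') == 'a string'
    assert describe(2.5) == 'something else'   # falls back to the default
    return describe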
site-packages/ply-3.9-py3.6.egg-info/SOURCES.txt000064400000007354147511334560014654 0ustar00ANNOUNCE
CHANGES
MANIFEST.in
README.md
TODO
setup.cfg
setup.py
doc/internal.html
doc/makedoc.py
doc/ply.html
example/README
example/cleanup.sh
example/BASIC/README
example/BASIC/basic.py
example/BASIC/basiclex.py
example/BASIC/basiclog.py
example/BASIC/basinterp.py
example/BASIC/basparse.py
example/BASIC/dim.bas
example/BASIC/func.bas
example/BASIC/gcd.bas
example/BASIC/gosub.bas
example/BASIC/hello.bas
example/BASIC/linear.bas
example/BASIC/maxsin.bas
example/BASIC/powers.bas
example/BASIC/rand.bas
example/BASIC/sales.bas
example/BASIC/sears.bas
example/BASIC/sqrt1.bas
example/BASIC/sqrt2.bas
example/GardenSnake/GardenSnake.py
example/GardenSnake/README
example/ansic/README
example/ansic/clex.py
example/ansic/cparse.py
example/calc/calc.py
example/calcdebug/calc.py
example/calceof/calc.py
example/classcalc/calc.py
example/closurecalc/calc.py
example/hedit/hedit.py
example/newclasscalc/calc.py
example/optcalc/README
example/optcalc/calc.py
example/unicalc/calc.py
example/yply/README
example/yply/ylex.py
example/yply/yparse.py
example/yply/yply.py
ply/__init__.py
ply/cpp.py
ply/ctokens.py
ply/lex.py
ply/yacc.py
ply/ygen.py
ply.egg-info/PKG-INFO
ply.egg-info/SOURCES.txt
ply.egg-info/dependency_links.txt
ply.egg-info/top_level.txt
test/README
test/calclex.py
test/cleanup.sh
test/lex_closure.py
test/lex_doc1.py
test/lex_dup1.py
test/lex_dup2.py
test/lex_dup3.py
test/lex_empty.py
test/lex_error1.py
test/lex_error2.py
test/lex_error3.py
test/lex_error4.py
test/lex_hedit.py
test/lex_ignore.py
test/lex_ignore2.py
test/lex_literal1.py
test/lex_literal2.py
test/lex_literal3.py
test/lex_many_tokens.py
test/lex_module.py
test/lex_module_import.py
test/lex_object.py
test/lex_opt_alias.py
test/lex_optimize.py
test/lex_optimize2.py
test/lex_optimize3.py
test/lex_re1.py
test/lex_re2.py
test/lex_re3.py
test/lex_rule1.py
test/lex_rule2.py
test/lex_rule3.py
test/lex_state1.py
test/lex_state2.py
test/lex_state3.py
test/lex_state4.py
test/lex_state5.py
test/lex_state_noerror.py
test/lex_state_norule.py
test/lex_state_try.py
test/lex_token1.py
test/lex_token2.py
test/lex_token3.py
test/lex_token4.py
test/lex_token5.py
test/lex_token_dup.py
test/testlex.py
test/testyacc.py
test/yacc_badargs.py
test/yacc_badid.py
test/yacc_badprec.py
test/yacc_badprec2.py
test/yacc_badprec3.py
test/yacc_badrule.py
test/yacc_badtok.py
test/yacc_dup.py
test/yacc_error1.py
test/yacc_error2.py
test/yacc_error3.py
test/yacc_error4.py
test/yacc_error5.py
test/yacc_error6.py
test/yacc_error7.py
test/yacc_inf.py
test/yacc_literal.py
test/yacc_misplaced.py
test/yacc_missing1.py
test/yacc_nested.py
test/yacc_nodoc.py
test/yacc_noerror.py
test/yacc_nop.py
test/yacc_notfunc.py
test/yacc_notok.py
test/yacc_prec1.py
test/yacc_rr.py
test/yacc_rr_unused.py
test/yacc_simple.py
test/yacc_sr.py
test/yacc_term1.py
test/yacc_unicode_literals.py
test/yacc_unused.py
test/yacc_unused_rule.py
test/yacc_uprec.py
test/yacc_uprec2.py
test/pkg_test1/__init__.py
test/pkg_test1/parsing/__init__.py
test/pkg_test1/parsing/calclex.py
test/pkg_test1/parsing/calcparse.py
test/pkg_test2/__init__.py
test/pkg_test2/parsing/__init__.py
test/pkg_test2/parsing/calclex.py
test/pkg_test2/parsing/calcparse.py
test/pkg_test3/__init__.py
test/pkg_test3/generated/__init__.py
test/pkg_test3/parsing/__init__.py
test/pkg_test3/parsing/calclex.py
test/pkg_test3/parsing/calcparse.py
test/pkg_test4/__init__.py
test/pkg_test4/parsing/__init__.py
test/pkg_test4/parsing/calclex.py
test/pkg_test4/parsing/calcparse.py
test/pkg_test5/__init__.py
test/pkg_test5/parsing/__init__.py
test/pkg_test5/parsing/calclex.py
test/pkg_test5/parsing/calcparse.py
test/pkg_test6/__init__.py
test/pkg_test6/parsing/__init__.py
test/pkg_test6/parsing/calclex.py
test/pkg_test6/parsing/calcparse.py
test/pkg_test6/parsing/expression.py
test/pkg_test6/parsing/statement.pysite-packages/ply-3.9-py3.6.egg-info/dependency_links.txt000064400000000001147511334560017025 0ustar00
site-packages/ply-3.9-py3.6.egg-info/top_level.txt000064400000000004147511334560015503 0ustar00ply
site-packages/ply-3.9-py3.6.egg-info/PKG-INFO000064400000001700147511334560014052 0ustar00Metadata-Version: 1.2
Name: ply
Version: 3.9
Summary: Python Lex & Yacc
Home-page: http://www.dabeaz.com/ply/
Author: David Beazley
Author-email: dave@dabeaz.com
Maintainer: David Beazley
Maintainer-email: dave@dabeaz.com
License: BSD
Description: 
        PLY is yet another implementation of lex and yacc for Python. Some notable
        features include the fact that it's implemented entirely in Python and it
        uses LALR(1) parsing which is efficient and well suited for larger grammars.
        
        PLY provides most of the standard lex/yacc features including support for empty 
        productions, precedence rules, error recovery, and support for ambiguous grammars. 
        
        PLY is extremely easy to use and provides very extensive error checking. 
        It is compatible with both Python 2 and Python 3.
        
Platform: UNKNOWN
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 2
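# Editor's illustration (not part of the PLY package metadata): a minimal
# ply.lex tokenizer showing the module-level conventions (a ``tokens`` tuple
# plus ``t_*`` rules) that the description above refers to.  The token names
# and sample input are arbitrary.
import ply.lex as lex

tokens = ('NUMBER', 'PLUS', 'TIMES')

t_PLUS = r'\+'
t_TIMES = r'\*'
t_ignore = ' \t'

def t_NUMBER(t):
    r'\d+'
    t.value = int(t.value)
    return t

def t_error(t):
    print("Illegal character %r" % t.value[0])
    t.lexer.skip(1)

if __name__ == '__main__':
    lexer = lex.lex()
    lexer.input("3 + 4 * 5")
    for tok in lexer:
        print(tok.type, tok.value)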
site-packages/__pycache__/_version.cpython-36.opt-1.pyc000064400000000206147511334560016714 0ustar003

��S�@sdZdS)z5.0.6N)�__version__�rr�/usr/lib/python3.6/_version.py�<module>ssite-packages/__pycache__/easy_install.cpython-36.pyc000064400000000403147511334560016617 0ustar003

��f~�@s"dZedkrddlmZe�dS)zRun the EasyInstall command�__main__�)�mainN)�__doc__�__name__Zsetuptools.command.easy_installr�rr�"/usr/lib/python3.6/easy_install.py�<module>ssite-packages/__pycache__/validate.cpython-36.pyc000064400000126537147511334560015742 0ustar003

��S���@s�dZdZdXZdd lZdd lZdd!lmZejdYkr8eZne	Zd#d$�Z
eZej
d%ejejB�Zej
d&ejejB�Zd'Zd(eZyeWnek
r�d)d*�ZYnXd+d�Zd,d�ZGd-d�de�ZGd.d�de�ZGd/d�de�ZGd0d�de�ZGd1d�de�ZGd2d	�d	e�ZGd3d
�d
e�Z Gd4d�de�Z!Gd5d�de�Z"Gd6d
�d
e�Z#Gd7d�de$�Z%dZd9d:�Z&d[d;d�Z'd\d<d�Z(d=d=d=d=d=d8d8d8d8d8d>�
Z)d?d�Z*d@d�Z+d]dAd�Z,d^dBd�Z-d_dCd�Z.d`dDd�Z/dadEd�Z0dbdFd�Z1dcdGd�Z2dddHd�Z3dedIdJ�Z4e'e(e+e.e*dK�Z5dLd�Z6dMd�Z7dNdO�Z8dPdQ�Z9dRdS�Z:e;dTk�r�dd lZdd l<Z<ej=j>dT�Z?e?j@jA�ZBeBjCdUe%�i�e<jDe?eBe<jEe<jFBdV�\ZGZHeG�s�tIdWjJeGeH���d S)fa
    The Validator object is used to check that supplied values 
    conform to a specification.
    
    The value can be supplied as a string - e.g. from a config file.
    In this case the check will also *convert* the value to
    the required type. This allows you to add validation
    as a transparent layer to access data stored as strings.
    The validation checks that the data is correct *and*
    converts it to the expected type.
    
    Some standard checks are provided for basic data types.
    Additional checks are easy to write. They can be
    provided when the ``Validator`` is instantiated or
    added afterwards.
    
    The standard functions work with the following basic data types :
    
    * integers
    * floats
    * booleans
    * strings
    * ip_addr
    
    plus lists of these datatypes
    
    Adding additional checks is done through coding simple functions.
    
    The full set of standard checks are : 
    
    * 'integer': matches integer values (including negative)
                 Takes optional 'min' and 'max' arguments : ::
    
                   integer()
                   integer(3, 9)  # any value from 3 to 9
                   integer(min=0) # any positive value
                   integer(max=9)
    
    * 'float': matches float values
               Has the same parameters as the integer check.
    
    * 'boolean': matches boolean values - ``True`` or ``False``
                 Acceptable string values for True are :
                   true, on, yes, 1
                 Acceptable string values for False are :
                   false, off, no, 0
    
                 Any other value raises an error.
    
    * 'ip_addr': matches an Internet Protocol address, v.4, represented
                 by a dotted-quad string, i.e. '1.2.3.4'.
    
    * 'string': matches any string.
                Takes optional keyword args 'min' and 'max'
                to specify min and max lengths of the string.
    
    * 'list': matches any list.
              Takes optional keyword args 'min', and 'max' to specify min and
              max sizes of the list. (Always returns a list.)
    
    * 'tuple': matches any tuple.
              Takes optional keyword args 'min', and 'max' to specify min and
              max sizes of the tuple. (Always returns a tuple.)
    
    * 'int_list': Matches a list of integers.
                  Takes the same arguments as list.
    
    * 'float_list': Matches a list of floats.
                    Takes the same arguments as list.
    
    * 'bool_list': Matches a list of boolean values.
                   Takes the same arguments as list.
    
    * 'ip_addr_list': Matches a list of IP addresses.
                     Takes the same arguments as list.
    
    * 'string_list': Matches a list of strings.
                     Takes the same arguments as list.
    
    * 'mixed_list': Matches a list with different types in 
                    specific positions. List size must match
                    the number of arguments.
    
                    Each position can be one of :
                    'integer', 'float', 'ip_addr', 'string', 'boolean'
    
                    So to specify a list with two strings followed
                    by two integers, you write the check as : ::
    
                      mixed_list('string', 'string', 'integer', 'integer')
    
    * 'pass': This check matches everything ! It never fails
              and the value is unchanged.
    
              It is also the default if no check is specified.
    
    * 'option': This check matches any from a list of options.
                You specify this check with : ::
    
                  option('option 1', 'option 2', 'option 3')
    
    You can supply a default value (returned if no value is supplied)
    using the default keyword argument.
    
    You specify a list argument for default using a list constructor syntax in
    the check : ::
    
        checkname(arg1, arg2, default=list('val 1', 'val 2', 'val 3'))
    
    A badly formatted set of arguments will raise a ``VdtParamError``.
z1.0.1�__version__�dottedQuadToNum�numToDottedQuad�
ValidateError�VdtUnknownCheckError�
VdtParamError�VdtTypeError�
VdtValueError�VdtValueTooSmallError�VdtValueTooBigError�VdtValueTooShortError�VdtValueTooLongError�VdtMissingValue�	Validator�
is_integer�is_float�
is_boolean�is_list�is_tuple�
is_ip_addr�	is_string�is_int_list�is_bool_list�
is_float_list�is_string_list�is_ip_addr_list�
is_mixed_list�	is_option�
__docformat__�N)�pprint�cCs|S)N�)�xr!r!�/usr/lib/python3.6/validate.py�<lambda>�sr$a�
    (?:
        ([a-zA-Z_][a-zA-Z0-9_]*)\s*=\s*list\(
            (
                (?:
                    \s*
                    (?:
                        (?:".*?")|              # double quotes
                        (?:'.*?')|              # single quotes
                        (?:[^'",\s\)][^,\)]*?)  # unquoted
                    )
                    \s*,\s*
                )*
                (?:
                    (?:".*?")|              # double quotes
                    (?:'.*?')|              # single quotes
                    (?:[^'",\s\)][^,\)]*?)  # unquoted
                )?                          # last one
            )
        \)
    )
z�
    (
        (?:".*?")|              # double quotes
        (?:'.*?')|              # single quotes
        (?:[^'",\s=][^,=]*?)       # unquoted
    )
    (?:
    (?:\s*,\s*)|(?:\s*$)            # comma
    )
a�
    (?:
        (
            (?:
                [a-zA-Z_][a-zA-Z0-9_]*\s*=\s*list\(
                    (?:
                        \s*
                        (?:
                            (?:".*?")|              # double quotes
                            (?:'.*?')|              # single quotes
                            (?:[^'",\s\)][^,\)]*?)       # unquoted
                        )
                        \s*,\s*
                    )*
                    (?:
                        (?:".*?")|              # double quotes
                        (?:'.*?')|              # single quotes
                        (?:[^'",\s\)][^,\)]*?)       # unquoted
                    )?                              # last one
                \)
            )|
            (?:
                (?:".*?")|              # double quotes
                (?:'.*?')|              # single quotes
                (?:[^'",\s=][^,=]*?)|       # unquoted
                (?:                         # keyword argument
                    [a-zA-Z_][a-zA-Z0-9_]*\s*=\s*
                    (?:
                        (?:".*?")|              # double quotes
                        (?:'.*?')|              # single quotes
                        (?:[^'",\s=][^,=]*?)       # unquoted
                    )
                )
            )
        )
        (?:
            (?:\s*,\s*)|(?:\s*$)            # comma
        )
    )
    z^%s*cCs|rdSdSdS)z$Simple boolean equivalent function. �rNr!)�valr!r!r#�bool
sr'cCsRddl}ddl}y|jd|j|j���dS|jk
rLtd|��YnXdS)a�
    Convert decimal dotted quad string to long integer
    
    >>> int(dottedQuadToNum('1 '))
    1
    >>> int(dottedQuadToNum(' 1.2'))
    16777218
    >>> int(dottedQuadToNum(' 1.2.3 '))
    16908291
    >>> int(dottedQuadToNum('1.2.3.4'))
    16909060
    >>> dottedQuadToNum('255.255.255.255')
    4294967295
    >>> dottedQuadToNum('255.255.255.256')
    Traceback (most recent call last):
    ValueError: Not a good dotted-quad IP: 255.255.255.256
    rNz!LzNot a good dotted-quad IP: %s)�socket�struct�unpackZ	inet_aton�strip�error�
ValueError)Zipr(r)r!r!r#rsc
Csvddl}ddl}|td�ks$|dkr0td|��y|j|jdt|���S|j|jtfk
rptd|��YnXdS)a!
    Convert int or long int to dotted quad string
    
    >>> numToDottedQuad(long(-1))
    Traceback (most recent call last):
    ValueError: Not a good numeric IP: -1
    >>> numToDottedQuad(long(1))
    '0.0.0.1'
    >>> numToDottedQuad(long(16777218))
    '1.0.0.2'
    >>> numToDottedQuad(long(16908291))
    '1.2.0.3'
    >>> numToDottedQuad(long(16909060))
    '1.2.3.4'
    >>> numToDottedQuad(long(4294967295))
    '255.255.255.255'
    >>> numToDottedQuad(long(4294967296))
    Traceback (most recent call last):
    ValueError: Not a good numeric IP: 4294967296
    >>> numToDottedQuad(-1)
    Traceback (most recent call last):
    ValueError: Not a good numeric IP: -1
    >>> numToDottedQuad(1)
    '0.0.0.1'
    >>> numToDottedQuad(16777218)
    '1.0.0.2'
    >>> numToDottedQuad(16908291)
    '1.2.0.3'
    >>> numToDottedQuad(16909060)
    '1.2.3.4'
    >>> numToDottedQuad(4294967295)
    '255.255.255.255'
    >>> numToDottedQuad(4294967296)
    Traceback (most recent call last):
    ValueError: Not a good numeric IP: 4294967296

    rNl��zNot a good numeric IP: %sz!L)r(r)�longr-Z	inet_ntoa�packr,�
OverflowError)Znumr(r)r!r!r#r0s(c@seZdZdZdS)ra
    This error indicates that the check failed.
    It can be the base class for more specific errors.
    
    Any check function that fails ought to raise this error.
    (or a subclass)
    
    >>> raise ValidateError
    Traceback (most recent call last):
    ValidateError
    N)�__name__�
__module__�__qualname__�__doc__r!r!r!r#rdsc@seZdZdZdS)r
z1No value was supplied to a check that needed one.N)r1r2r3r4r!r!r!r#r
rsc@seZdZdZdd�ZdS)rz'An unknown check function was requestedcCstj|d|f�dS)z�
        >>> raise VdtUnknownCheckError('yoda')
        Traceback (most recent call last):
        VdtUnknownCheckError: the check "yoda" is unknown.
        zthe check "%s" is unknown.N)r�__init__)�self�valuer!r!r#r5yszVdtUnknownCheckError.__init__N)r1r2r3r4r5r!r!r!r#rvsc@seZdZdZdd�ZdS)rz!An incorrect parameter was passedcCstj|d||f�dS)z�
        >>> raise VdtParamError('yoda', 'jedi')
        Traceback (most recent call last):
        VdtParamError: passed an incorrect value "jedi" for parameter "yoda".
        z2passed an incorrect value "%s" for parameter "%s".N)�SyntaxErrorr5)r6�namer7r!r!r#r5�szVdtParamError.__init__N)r1r2r3r4r5r!r!r!r#r�sc@seZdZdZdd�ZdS)rz(The value supplied was of the wrong typecCstj|d|f�dS)z�
        >>> raise VdtTypeError('jedi')
        Traceback (most recent call last):
        VdtTypeError: the value "jedi" is of the wrong type.
        z$the value "%s" is of the wrong type.N)rr5)r6r7r!r!r#r5�szVdtTypeError.__init__N)r1r2r3r4r5r!r!r!r#r�sc@seZdZdZdd�ZdS)rzIThe value supplied was of the correct type, but was not an allowed value.cCstj|d|f�dS)z�
        >>> raise VdtValueError('jedi')
        Traceback (most recent call last):
        VdtValueError: the value "jedi" is unacceptable.
        zthe value "%s" is unacceptable.N)rr5)r6r7r!r!r#r5�szVdtValueError.__init__N)r1r2r3r4r5r!r!r!r#r�sc@seZdZdZdd�ZdS)r	z>The value supplied was of the correct type, but was too small.cCstj|d|f�dS)z�
        >>> raise VdtValueTooSmallError('0')
        Traceback (most recent call last):
        VdtValueTooSmallError: the value "0" is too small.
        zthe value "%s" is too small.N)rr5)r6r7r!r!r#r5�szVdtValueTooSmallError.__init__N)r1r2r3r4r5r!r!r!r#r	�sc@seZdZdZdd�ZdS)r
z<The value supplied was of the correct type, but was too big.cCstj|d|f�dS)z�
        >>> raise VdtValueTooBigError('1')
        Traceback (most recent call last):
        VdtValueTooBigError: the value "1" is too big.
        zthe value "%s" is too big.N)rr5)r6r7r!r!r#r5�szVdtValueTooBigError.__init__N)r1r2r3r4r5r!r!r!r#r
�sc@seZdZdZdd�ZdS)rz>The value supplied was of the correct type, but was too short.cCstj|d|f�dS)z�
        >>> raise VdtValueTooShortError('jed')
        Traceback (most recent call last):
        VdtValueTooShortError: the value "jed" is too short.
        zthe value "%s" is too short.N)rr5)r6r7r!r!r#r5�szVdtValueTooShortError.__init__N)r1r2r3r4r5r!r!r!r#r�sc@seZdZdZdd�ZdS)rz=The value supplied was of the correct type, but was too long.cCstj|d|f�dS)z�
        >>> raise VdtValueTooLongError('jedie')
        Traceback (most recent call last):
        VdtValueTooLongError: the value "jedie" is too long.
        zthe value "%s" is too long.N)rr5)r6r7r!r!r#r5�szVdtValueTooLongError.__init__N)r1r2r3r4r5r!r!r!r#r�sc@s�eZdZdZejdej�Zejdej�Ze	Z	e
Z
ejeejejB�Z
ejeejejB�Zddd�Zddd	�Zd
d�Zdd
�Zdd�Zdd�Zdd�Zdd�Zdd�Zdd�ZdS)ra3

    Validator is an object that allows you to register a set of 'checks'.
    These checks take input and test that it conforms to the check.
    
    This can also involve converting the value from a string into
    the correct datatype.
    
    The ``check`` method takes an input string which configures which
    check is to be used and applies that check to a supplied value.
    
    An example input string would be:
    'int_range(param1, param2)'
    
    You would then provide something like:
    
    >>> def int_range_check(value, min, max):
    ...     # turn min and max from strings to integers
    ...     min = int(min)
    ...     max = int(max)
    ...     # check that value is of the correct type.
    ...     # possible valid inputs are integers or strings
    ...     # that represent integers
    ...     if not isinstance(value, (int, long, string_type)):
    ...         raise VdtTypeError(value)
    ...     elif isinstance(value, string_type):
    ...         # if we are given a string
    ...         # attempt to convert to an integer
    ...         try:
    ...             value = int(value)
    ...         except ValueError:
    ...             raise VdtValueError(value)
    ...     # check the value is between our constraints
    ...     if not min <= value:
    ...          raise VdtValueTooSmallError(value)
    ...     if not value <= max:
    ...          raise VdtValueTooBigError(value)
    ...     return value
    
    >>> fdict = {'int_range': int_range_check}
    >>> vtr1 = Validator(fdict)
    >>> vtr1.check('int_range(20, 40)', '30')
    30
    >>> vtr1.check('int_range(20, 40)', '60')
    Traceback (most recent call last):
    VdtValueTooBigError: the value "60" is too big.
    
    New functions can be added with : ::
    
    >>> vtr2 = Validator()       
    >>> vtr2.functions['int_range'] = int_range_check
    
    Or by passing in a dictionary of functions when Validator 
    is instantiated.
    
    Your functions *can* use keyword arguments,
    but the first argument should always be 'value'.
    
    If the function doesn't take additional arguments,
    the parentheses are optional in the check.
    It can be written with either of : ::
    
        keyword = function_name
        keyword = function_name()
    
    The first program to utilise Validator() was Michael Foord's
    ConfigObj, an alternative to ConfigParser which supports lists and
    can validate a config file using a config schema.
    For more details on using Validator with ConfigObj see:
    https://configobj.readthedocs.org/en/latest/configobj.html
    z
(.+?)\((.*)\)z%^([a-zA-Z_][a-zA-Z0-9_]*)\s*=\s*(.*)$NcCsR|jttttttttt	t
ttt
|jttd�|_|dk	rB|jj|�t|_i|_dS)z(
        >>> vtri = Validator()
        )��integer�float�boolean�ip_addr�string�list�tupleZint_listZ
float_listZ	bool_listZip_addr_listZstring_list�
mixed_list�passZoption�
force_listN)�_passrrrrrrrrrrrrrrrD�	functions�updaterZbaseErrorClass�_cache)r6rFr!r!r#r53s*
zValidator.__init__FcCsJ|j|�\}}}}|r.|dkr$t��|j|�}|dkr:dS|j||||�S)a�
        Usage: check(check, value)
        
        Arguments:
            check: string representing check to apply (including arguments)
            value: object to be checked
        Returns value, converted to correct type if necessary
        
        If the check fails, raises a ``ValidateError`` subclass.
        
        >>> vtor.check('yoda', '')
        Traceback (most recent call last):
        VdtUnknownCheckError: the check "yoda" is unknown.
        >>> vtor.check('yoda()', '')
        Traceback (most recent call last):
        VdtUnknownCheckError: the check "yoda" is unknown.
        
        >>> vtor.check('string(default="")', '', missing=True)
        ''
        N)�_parse_with_cachingr
�_handle_none�_check_value)r6�checkr7Zmissing�fun_name�fun_args�
fun_kwargs�defaultr!r!r#rLQs
zValidator.checkcCs"|dkrdS|dkr|j|�}|S)N�None�'None'�"None")rRrS)�_unquote)r6r7r!r!r#rJts

zValidator._handle_nonecCs�||jkr.|j|\}}}}t|�}t|�}nF|j|�\}}}}tdd�t|j��D��}|t|�t|�|f|j|<||||fS)NcSsg|]\}}t|�|f�qSr!)�str)�.0�keyr7r!r!r#�
<listcomp>�sz1Validator._parse_with_caching.<locals>.<listcomp>)rHr@�dict�_parse_check�items)r6rLrMrNrOrPr!r!r#rI}s

zValidator._parse_with_cachingcCs@y|j|}Wntk
r*t|��YnX||f|�|�SdS)N)rF�KeyErrorr)r6r7rMrNrO�funr!r!r#rK�s
zValidator._check_valuecCs|jj|�}|r�|jd�}|jd�}|jj|�}|dkrDtd|��g}i}x�|jj|�D]�}|j�}|jj|�}	|	r�|j	|	�\}
}|||
<qZ|j
j|�}|r�|jd�}|dkr�|j|�}|||jd�<qZ|j|j|��qZWn|fidfS|j
dd�}
||||
fS)Nr%�zBad syntax in check "%s".�'None'�"None"rP)r_r`)�_func_re�match�group�_matchfinderr�_paramfinder�findallr+�	_list_arg�_list_handle�_key_argrT�append�pop)r6rLZ	fun_matchrMZ
arg_stringZ	arg_matchrNrO�arg�	listmatchrWr&ZkeymatchrPr!r!r#rZ�s6



zValidator._parse_checkcCs8t|�dkr4|ddkr4|d|dkr4|dd�}|S)	zUnquote a value if necessary.r^r�'�"r%)rnro���rp)�len)r6r&r!r!r#rT�s(zValidator._unquotecCsFg}|jd�}|jd�}x$|jj|�D]}|j|j|��q&W||fS)z7Take apart a ``keyword=list('val, 'val')`` type string.r%r^)rc�
_list_membersrfrjrT)r6rm�outr9�argsrlr!r!r#rh�s

zValidator._list_handlecCs|S)z�
        Dummy check that always passes
        
        >>> vtor.check('', 0)
        0
        >>> vtor.check('', '0')
        '0'
        r!)r6r7r!r!r#rE�s	zValidator._passcCsL|j|�\}}}}|dkr&td|��|j|�}|dkr<|S|j||||�S)z�
        Given a check, return the default value for the check
        (converted to the right type).
        
        If the check doesn't specify a default value then a
        ``KeyError`` will be raised.
        Nz Check "%s" has no default value.)rIr\rJrK)r6rLrMrNrOrPr7r!r!r#�get_default_value�s
zValidator.get_default_value)N)F)r1r2r3r4�re�compile�DOTALLrarirgrr�_paramstring�VERBOSEre�_matchstringrdr5rLrJrIrKrZrTrhrErur!r!r!r#r�s"F

#		(
FcCs�|rtp
t}g}x�t||�D]z\}}|dkr8|j|�qt|ttttf�r�y|j||��Wq�tk
r�}zt||��WYdd}~Xq�Xqt||��qW|S)a�
    Return numbers from inputs or raise VdtParamError.
    
    Lets ``None`` pass through.
    Pass in keyword argument ``to_float=True`` to
    use float for the conversion rather than int.
    
    >>> _is_num_param(('', ''), (0, 1.0))
    [0, 1]
    >>> _is_num_param(('', ''), (0, 1.0), to_float=True)
    [0.0, 1.0]
    >>> _is_num_param(('a'), ('a'))
    Traceback (most recent call last):
    VdtParamError: passed an incorrect value "a" for parameter "a".
    N)	r<�int�ziprj�
isinstancer.�string_typer-r)�names�values�to_floatr]Z
out_paramsr9r&�er!r!r#�
_is_num_param�sr�cCs�td||f�\}}t|tttf�s*t|��t|t�r^yt|�}Wntk
r\t|��YnX|dk	rv||krvt|��|dk	r�||kr�t|��|S)aH
    A check that tests that a given value is an integer (int, or long)
    and optionally, between bounds. A negative value is accepted, while
    a float will fail.
    
    If the value is a string, then the conversion is done - if possible.
    Otherwise a VdtError is raised.
    
    >>> vtor.check('integer', '-1')
    -1
    >>> vtor.check('integer', '0')
    0
    >>> vtor.check('integer', 9)
    9
    >>> vtor.check('integer', 'a')
    Traceback (most recent call last):
    VdtTypeError: the value "a" is of the wrong type.
    >>> vtor.check('integer', '2.2')
    Traceback (most recent call last):
    VdtTypeError: the value "2.2" is of the wrong type.
    >>> vtor.check('integer(10)', '20')
    20
    >>> vtor.check('integer(max=20)', '15')
    15
    >>> vtor.check('integer(10)', '9')
    Traceback (most recent call last):
    VdtValueTooSmallError: the value "9" is too small.
    >>> vtor.check('integer(10)', 9)
    Traceback (most recent call last):
    VdtValueTooSmallError: the value "9" is too small.
    >>> vtor.check('integer(max=20)', '35')
    Traceback (most recent call last):
    VdtValueTooBigError: the value "35" is too big.
    >>> vtor.check('integer(max=20)', 35)
    Traceback (most recent call last):
    VdtValueTooBigError: the value "35" is too big.
    >>> vtor.check('integer(0, 9)', False)
    0
    �min�maxN)r�r�)	r�r~r|r.rrr-r	r
)r7r�r��min_val�max_valr!r!r#rs(
cCs�td||fdd�\}}t|ttttf�s0t|��t|t�sdyt|�}Wntk
rbt|��YnX|dk	r|||kr|t|��|dk	r�||kr�t	|��|S)a<
    A check that tests that a given value is a float
    (an integer will be accepted), and optionally - that it is between bounds.
    
    If the value is a string, then the conversion is done - if possible.
    Otherwise a VdtError is raised.
    
    This can accept negative values.
    
    >>> vtor.check('float', '2')
    2.0
    
    From now on we multiply the value to avoid comparing decimals
    
    >>> vtor.check('float', '-6.8') * 10
    -68.0
    >>> vtor.check('float', '12.2') * 10
    122.0
    >>> vtor.check('float', 8.4) * 10
    84.0
    >>> vtor.check('float', 'a')
    Traceback (most recent call last):
    VdtTypeError: the value "a" is of the wrong type.
    >>> vtor.check('float(10.1)', '10.2') * 10
    102.0
    >>> vtor.check('float(max=20.2)', '15.1') * 10
    151.0
    >>> vtor.check('float(10.0)', '9.0')
    Traceback (most recent call last):
    VdtValueTooSmallError: the value "9.0" is too small.
    >>> vtor.check('float(max=20.0)', '35.0')
    Traceback (most recent call last):
    VdtValueTooBigError: the value "35.0" is too big.
    r�r�T)r�N)r�r�)
r�r~r|r.r<rrr-r	r
)r7r�r�r�r�r!r!r#rGs#
T)
TZon�1�true�yesFZoff�0Zfalse�nocCsXt|t�r4yt|j�Stk
r2t|��YnX|dkr@dS|dkrLdSt|��dS)a�
    Check if the value represents a boolean.
    
    >>> vtor.check('boolean', 0)
    0
    >>> vtor.check('boolean', False)
    0
    >>> vtor.check('boolean', '0')
    0
    >>> vtor.check('boolean', 'off')
    0
    >>> vtor.check('boolean', 'false')
    0
    >>> vtor.check('boolean', 'no')
    0
    >>> vtor.check('boolean', 'nO')
    0
    >>> vtor.check('boolean', 'NO')
    0
    >>> vtor.check('boolean', 1)
    1
    >>> vtor.check('boolean', True)
    1
    >>> vtor.check('boolean', '1')
    1
    >>> vtor.check('boolean', 'on')
    1
    >>> vtor.check('boolean', 'true')
    1
    >>> vtor.check('boolean', 'yes')
    1
    >>> vtor.check('boolean', 'Yes')
    1
    >>> vtor.check('boolean', 'YES')
    1
    >>> vtor.check('boolean', '')
    Traceback (most recent call last):
    VdtTypeError: the value "" is of the wrong type.
    >>> vtor.check('boolean', 'up')
    Traceback (most recent call last):
    VdtTypeError: the value "up" is of the wrong type.
    
    FTN)r~r�	bool_dict�lowerr\r)r7r!r!r#r�s,
cCsHt|t�st|��|j�}yt|�Wntk
rBt|��YnX|S)as
    Check that the supplied value is an Internet Protocol address, v.4,
    represented by a dotted-quad string, i.e. '1.2.3.4'.
    
    >>> vtor.check('ip_addr', '1 ')
    '1'
    >>> vtor.check('ip_addr', ' 1.2')
    '1.2'
    >>> vtor.check('ip_addr', ' 1.2.3 ')
    '1.2.3'
    >>> vtor.check('ip_addr', '1.2.3.4')
    '1.2.3.4'
    >>> vtor.check('ip_addr', '0.0.0.0')
    '0.0.0.0'
    >>> vtor.check('ip_addr', '255.255.255.255')
    '255.255.255.255'
    >>> vtor.check('ip_addr', '255.255.255.256')
    Traceback (most recent call last):
    VdtValueError: the value "255.255.255.256" is unacceptable.
    >>> vtor.check('ip_addr', '1.2.3.4.5')
    Traceback (most recent call last):
    VdtValueError: the value "1.2.3.4.5" is unacceptable.
    >>> vtor.check('ip_addr', 0)
    Traceback (most recent call last):
    VdtTypeError: the value "0" is of the wrong type.
    )r~rrr+rr-r)r7r!r!r#r�s
cCs�td||f�\}}t|t�r$t|��yt|�}Wntk
rLt|��YnX|dk	rf||krft|��|dk	r~||kr~t|��t|�S)a�
    Check that the value is a list of values.
    
    You can optionally specify the minimum and maximum number of members.
    
    It does no check on list members.
    
    >>> vtor.check('list', ())
    []
    >>> vtor.check('list', [])
    []
    >>> vtor.check('list', (1, 2))
    [1, 2]
    >>> vtor.check('list', [1, 2])
    [1, 2]
    >>> vtor.check('list(3)', (1, 2))
    Traceback (most recent call last):
    VdtValueTooShortError: the value "(1, 2)" is too short.
    >>> vtor.check('list(max=5)', (1, 2, 3, 4, 5, 6))
    Traceback (most recent call last):
    VdtValueTooLongError: the value "(1, 2, 3, 4, 5, 6)" is too long.
    >>> vtor.check('list(min=3, max=5)', (1, 2, 3, 4))
    [1, 2, 3, 4]
    >>> vtor.check('list', 0)
    Traceback (most recent call last):
    VdtTypeError: the value "0" is of the wrong type.
    >>> vtor.check('list', '12')
    Traceback (most recent call last):
    VdtTypeError: the value "12" is of the wrong type.
    r�r�N)r�r�)	r�r~rrrq�	TypeErrorrrr@)r7r�r��min_len�max_len�num_membersr!r!r#r�s
cCstt|||��S)a�
    Check that the value is a tuple of values.
    
    You can optionally specify the minimum and maximum number of members.
    
    It does no check on members.
    
    >>> vtor.check('tuple', ())
    ()
    >>> vtor.check('tuple', [])
    ()
    >>> vtor.check('tuple', (1, 2))
    (1, 2)
    >>> vtor.check('tuple', [1, 2])
    (1, 2)
    >>> vtor.check('tuple(3)', (1, 2))
    Traceback (most recent call last):
    VdtValueTooShortError: the value "(1, 2)" is too short.
    >>> vtor.check('tuple(max=5)', (1, 2, 3, 4, 5, 6))
    Traceback (most recent call last):
    VdtValueTooLongError: the value "(1, 2, 3, 4, 5, 6)" is too long.
    >>> vtor.check('tuple(min=3, max=5)', (1, 2, 3, 4))
    (1, 2, 3, 4)
    >>> vtor.check('tuple', 0)
    Traceback (most recent call last):
    VdtTypeError: the value "0" is of the wrong type.
    >>> vtor.check('tuple', '12')
    Traceback (most recent call last):
    VdtTypeError: the value "12" is of the wrong type.
    )rAr)r7r�r�r!r!r#rscCs�t|t�st|��td||f�\}}yt|�}Wntk
rLt|��YnX|dk	rf||krft|��|dk	r~||kr~t|��|S)a�
    Check that the supplied value is a string.
    
    You can optionally specify the minimum and maximum number of members.
    
    >>> vtor.check('string', '0')
    '0'
    >>> vtor.check('string', 0)
    Traceback (most recent call last):
    VdtTypeError: the value "0" is of the wrong type.
    >>> vtor.check('string(2)', '12')
    '12'
    >>> vtor.check('string(2)', '1')
    Traceback (most recent call last):
    VdtValueTooShortError: the value "1" is too short.
    >>> vtor.check('string(min=2, max=3)', '123')
    '123'
    >>> vtor.check('string(min=2, max=3)', '1234')
    Traceback (most recent call last):
    VdtValueTooLongError: the value "1234" is too long.
    r�r�N)r�r�)r~rrr�rqr�rr)r7r�r�r�r�r�r!r!r#r1s
cCsdd�t|||�D�S)a
    Check that the value is a list of integers.
    
    You can optionally specify the minimum and maximum number of members.
    
    Each list member is checked that it is an integer.
    
    >>> vtor.check('int_list', ())
    []
    >>> vtor.check('int_list', [])
    []
    >>> vtor.check('int_list', (1, 2))
    [1, 2]
    >>> vtor.check('int_list', [1, 2])
    [1, 2]
    >>> vtor.check('int_list', [1, 'a'])
    Traceback (most recent call last):
    VdtTypeError: the value "a" is of the wrong type.
    cSsg|]}t|��qSr!)r)rV�memr!r!r#rXiszis_int_list.<locals>.<listcomp>)r)r7r�r�r!r!r#rUscCsdd�t|||�D�S)al
    Check that the value is a list of booleans.
    
    You can optionally specify the minimum and maximum number of members.
    
    Each list member is checked that it is a boolean.
    
    >>> vtor.check('bool_list', ())
    []
    >>> vtor.check('bool_list', [])
    []
    >>> check_res = vtor.check('bool_list', (True, False))
    >>> check_res == [True, False]
    1
    >>> check_res = vtor.check('bool_list', [True, False])
    >>> check_res == [True, False]
    1
    >>> vtor.check('bool_list', [True, 'a'])
    Traceback (most recent call last):
    VdtTypeError: the value "a" is of the wrong type.
    cSsg|]}t|��qSr!)r)rVr�r!r!r#rX�sz is_bool_list.<locals>.<listcomp>)r)r7r�r�r!r!r#rlscCsdd�t|||�D�S)a
    Check that the value is a list of floats.
    
    You can optionally specify the minimum and maximum number of members.
    
    Each list member is checked that it is a float.
    
    >>> vtor.check('float_list', ())
    []
    >>> vtor.check('float_list', [])
    []
    >>> vtor.check('float_list', (1, 2.0))
    [1.0, 2.0]
    >>> vtor.check('float_list', [1, 2.0])
    [1.0, 2.0]
    >>> vtor.check('float_list', [1, 'a'])
    Traceback (most recent call last):
    VdtTypeError: the value "a" is of the wrong type.
    cSsg|]}t|��qSr!)r)rVr�r!r!r#rX�sz!is_float_list.<locals>.<listcomp>)r)r7r�r�r!r!r#r�scCs(t|t�rt|��dd�t|||�D�S)an
    Check that the value is a list of strings.
    
    You can optionally specify the minimum and maximum number of members.
    
    Each list member is checked that it is a string.
    
    >>> vtor.check('string_list', ())
    []
    >>> vtor.check('string_list', [])
    []
    >>> vtor.check('string_list', ('a', 'b'))
    ['a', 'b']
    >>> vtor.check('string_list', ['a', 1])
    Traceback (most recent call last):
    VdtTypeError: the value "1" is of the wrong type.
    >>> vtor.check('string_list', 'hello')
    Traceback (most recent call last):
    VdtTypeError: the value "hello" is of the wrong type.
    cSsg|]}t|��qSr!)r)rVr�r!r!r#rX�sz"is_string_list.<locals>.<listcomp>)r~rrr)r7r�r�r!r!r#r�s
cCsdd�t|||�D�S)a
    Check that the value is a list of IP addresses.
    
    You can optionally specify the minimum and maximum number of members.
    
    Each list member is checked that it is an IP address.
    
    >>> vtor.check('ip_addr_list', ())
    []
    >>> vtor.check('ip_addr_list', [])
    []
    >>> vtor.check('ip_addr_list', ('1.2.3.4', '5.6.7.8'))
    ['1.2.3.4', '5.6.7.8']
    >>> vtor.check('ip_addr_list', ['a'])
    Traceback (most recent call last):
    VdtValueError: the value "a" is unacceptable.
    cSsg|]}t|��qSr!)r)rVr�r!r!r#rX�sz#is_ip_addr_list.<locals>.<listcomp>)r)r7r�r�r!r!r#r�scCs t|ttf�s|g}t|||�S)a�
    Check that a value is a list, coercing strings into
    a list with one member. Useful where users forget the
    trailing comma that turns a single value into a list.
    
    You can optionally specify the minimum and maximum number of members.
    A minumum of greater than one will fail if the user only supplies a
    string.
    
    >>> vtor.check('force_list', ())
    []
    >>> vtor.check('force_list', [])
    []
    >>> vtor.check('force_list', 'hello')
    ['hello']
    )r~r@rAr)r7r�r�r!r!r#rD�srD)r;r<r>r?r=cGs�yt|�}Wntk
r(t|��YnX|t|�kr@t|��n|t|�krTt|��ydd�t||�D�Stk
r�}ztd|��WYdd}~XnXdS)a�
    Check that the value is a list.
    Allow specifying the type of each member.
    Work on lists of specific lengths.
    
    You specify each member as a positional argument specifying type
    
    Each type should be one of the following strings :
      'integer', 'float', 'ip_addr', 'string', 'boolean'
    
    So you can specify a list of two strings, followed by
    two integers as :
    
      mixed_list('string', 'string', 'integer', 'integer')
    
    The length of the list must match the number of positional
    arguments you supply.
    
    >>> mix_str = "mixed_list('integer', 'float', 'ip_addr', 'string', 'boolean')"
    >>> check_res = vtor.check(mix_str, (1, 2.0, '1.2.3.4', 'a', True))
    >>> check_res == [1, 2.0, '1.2.3.4', 'a', True]
    1
    >>> check_res = vtor.check(mix_str, ('1', '2.0', '1.2.3.4', 'a', 'True'))
    >>> check_res == [1, 2.0, '1.2.3.4', 'a', True]
    1
    >>> vtor.check(mix_str, ('b', 2.0, '1.2.3.4', 'a', True))
    Traceback (most recent call last):
    VdtTypeError: the value "b" is of the wrong type.
    >>> vtor.check(mix_str, (1, 2.0, '1.2.3.4', 'a'))
    Traceback (most recent call last):
    VdtValueTooShortError: the value "(1, 2.0, '1.2.3.4', 'a')" is too short.
    >>> vtor.check(mix_str, (1, 2.0, '1.2.3.4', 'a', 1, 'b'))
    Traceback (most recent call last):
    VdtValueTooLongError: the value "(1, 2.0, '1.2.3.4', 'a', 1, 'b')" is too long.
    >>> vtor.check(mix_str, 0)
    Traceback (most recent call last):
    VdtTypeError: the value "0" is of the wrong type.

    >>> vtor.check('mixed_list("yoda")', ('a'))
    Traceback (most recent call last):
    VdtParamError: passed an incorrect value "KeyError('yoda',)" for parameter "'mixed_list'"
    cSsg|]\}}t||��qSr!)�fun_dict)rVrlr&r!r!r#rXsz!is_mixed_list.<locals>.<listcomp>rBN)rqr�rrrr}r\r)r7rtZlengthr�r!r!r#r�s+
cGs&t|t�st|��||kr"t|��|S)a�
    This check matches the value to any of a set of options.
    
    >>> vtor.check('option("yoda", "jedi")', 'yoda')
    'yoda'
    >>> vtor.check('option("yoda", "jedi")', 'jed')
    Traceback (most recent call last):
    VdtValueError: the value "jed" is unacceptable.
    >>> vtor.check('option("yoda", "jedi")', 0)
    Traceback (most recent call last):
    VdtTypeError: the value "0" is of the wrong type.
    )r~rrr)r7Zoptionsr!r!r#r$s

cOs
|||fS)ag

    A function that exists for test purposes.
    
    >>> checks = [
    ...     '3, 6, min=1, max=3, test=list(a, b, c)',
    ...     '3',
    ...     '3, 6',
    ...     '3,',
    ...     'min=1, test="a b c"',
    ...     'min=5, test="a, b, c"',
    ...     'min=1, max=3, test="a, b, c"',
    ...     'min=-100, test=-99',
    ...     'min=1, max=3',
    ...     '3, 6, test="36"',
    ...     '3, 6, test="a, b, c"',
    ...     '3, max=3, test=list("a", "b", "c")',
    ...     '''3, max=3, test=list("'a'", 'b', "x=(c)")''',
    ...     "test='x=fish(3)'",
    ...    ]
    >>> v = Validator({'test': _test})
    >>> for entry in checks:
    ...     pprint(v.check(('test(%s)' % entry), 3))
    (3, ('3', '6'), {'max': '3', 'min': '1', 'test': ['a', 'b', 'c']})
    (3, ('3',), {})
    (3, ('3', '6'), {})
    (3, ('3',), {})
    (3, (), {'min': '1', 'test': 'a b c'})
    (3, (), {'min': '5', 'test': 'a, b, c'})
    (3, (), {'max': '3', 'min': '1', 'test': 'a, b, c'})
    (3, (), {'min': '-100', 'test': '-99'})
    (3, (), {'max': '3', 'min': '1'})
    (3, ('3', '6'), {'test': '36'})
    (3, ('3', '6'), {'test': 'a, b, c'})
    (3, ('3',), {'max': '3', 'test': ['a', 'b', 'c']})
    (3, ('3',), {'max': '3', 'test': ["'a'", 'b', 'x=(c)']})
    (3, (), {'test': 'x=fish(3)'})
    
    >>> v = Validator()
    >>> v.check('integer(default=6)', '3')
    3
    >>> v.check('integer(default=6)', None, True)
    6
    >>> v.get_default_value('integer(default=6)')
    6
    >>> v.get_default_value('float(default=6)')
    6.0
    >>> v.get_default_value('pass(default=None)')
    >>> v.get_default_value("string(default='None')")
    'None'
    >>> v.get_default_value('pass')
    Traceback (most recent call last):
    KeyError: 'Check "pass" has no default value.'
    >>> v.get_default_value('pass(default=list(1, 2, 3, 4))')
    ['1', '2', '3', '4']
    
    >>> v = Validator()
    >>> v.check("pass(default=None)", None, True)
    >>> v.check("pass(default='None')", None, True)
    'None'
    >>> v.check('pass(default="None")', None, True)
    'None'
    >>> v.check('pass(default=list(1, 2, 3, 4))', None, True)
    ['1', '2', '3', '4']
    
    Bug test for unicode arguments
    >>> v = Validator()
    >>> v.check(unicode('string(min=4)'), unicode('test')) == unicode('test')
    True
    
    >>> v = Validator()
    >>> v.get_default_value(unicode('string(min=4, default="1234")')) == unicode('1234')
    True
    >>> v.check(unicode('string(min=4, default="1234")'), unicode('test')) == unicode('test')
    True
    
    >>> v = Validator()
    >>> default = v.get_default_value('string(default=None)')
    >>> default == None
    1
    r!)r7rtZkeywargsr!r!r#�_test8sQr�cCsdS)z�
    >>> 
    >>> v = Validator()
    >>> v.get_default_value('string(default="#ff00dd")')
    '#ff00dd'
    >>> v.get_default_value('integer(default=3) # comment')
    3
    Nr!r!r!r!r#�_test2�sr�cCsdS)a�
    >>> vtor.check('string(default="")', '', missing=True)
    ''
    >>> vtor.check('string(default="\n")', '', missing=True)
    '\n'
    >>> print(vtor.check('string(default="\n")', '', missing=True))
    <BLANKLINE>
    <BLANKLINE>
    >>> vtor.check('string()', '\n')
    '\n'
    >>> vtor.check('string(default="\n\n\n")', '', missing=True)
    '\n\n\n'
    >>> vtor.check('string()', 'random \n text goes here\n\n')
    'random \n text goes here\n\n'
    >>> vtor.check('string(default=" \nrandom text\ngoes \n here\n\n ")',
    ... '', missing=True)
    ' \nrandom text\ngoes \n here\n\n '
    >>> vtor.check("string(default='\n\n\n')", '', missing=True)
    '\n\n\n'
    >>> vtor.check("option('\n','a','b',default='\n')", '', missing=True)
    '\n'
    >>> vtor.check("string_list()", ['foo', '\n', 'bar'])
    ['foo', '\n', 'bar']
    >>> vtor.check("string_list(default=list('\n'))", '', missing=True)
    ['\n']

if __name__ == '__main__':
    # Run the module's doctests when executed directly.
    import sys
    import doctest
    m = sys.modules.get('__main__')
    globs = m.__dict__.copy()
    globs.update({'vtor': Validator()})
    failures, tests = doctest.testmod(
        m, globs=globs,
        optionflags=doctest.IGNORE_EXCEPTION_DETAIL | doctest.ELLIPSIS)
    assert not failures, '{} failures out of {} tests'.format(failures, tests)
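# A short sketch of registering a custom check with Validator (the
# 'hex_color' check below is illustrative only, not part of this module).
from validate import Validator, VdtTypeError, VdtValueError

def is_hex_color(value):
    # Accept strings of the form '#rrggbb'; anything else fails the check.
    if not isinstance(value, str):
        raise VdtTypeError(value)
    if not value.startswith('#') or len(value) != 7:
        raise VdtValueError(value)
    return value

vtor = Validator({'hex_color': is_hex_color})
vtor.check('hex_color', '#ff00dd')                               # -> '#ff00dd'
vtor.check('hex_color(default="#000000")', None, missing=True)   # -> '#000000'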


site-packages/__pycache__/decorator.cpython-36.pyc000064400000030213147511334560016114 0ustar003

Decorator module, see http://pypi.python.org/pypi/decorator
for the documentation.

__version__ = '4.2.1'

FunctionMaker
    An object with the ability to create functions with a given signature.
    It has attributes name, doc, module, signature, defaults, dict and
    methods update and make.

decorate(func, caller)
    decorate(func, caller) decorates a function using a caller.

decorator(caller)
    decorator(caller) converts a caller function into a decorator.

ContextManager / contextmanager
    Context manager decorator.

append(a, vancestors)
    Append ``a`` to the list of the virtual ancestors, unless it is already
    included.

dispatch_on(*dispatch_args)
    Factory of decorators turning a function into a generic function
    dispatching on the given arguments.
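# A minimal usage sketch of the decorator() factory summarised above
# (assumes the module is importable as "decorator"; the trace/add functions
# are illustrative only, not part of the package).
from decorator import decorator

@decorator
def trace(func, *args, **kw):
    # The caller runs around every invocation of the decorated function,
    # while decorator() preserves the original signature.
    print('calling %s with args %s, %s' % (func.__name__, args, kw))
    return func(*args, **kw)

@trace
def add(a, b):
    return a + b

add(1, 2)   # prints "calling add with args (1, 2), {}" and returns 3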

site-packages/__pycache__/easy_install.cpython-36.opt-1.pyc000064400000000403147511334560017556 0ustar003

"""Run the EasyInstall command"""

if __name__ == '__main__':
    from setuptools.command.easy_install import main
    main()
site-packages/__pycache__/validate.cpython-36.opt-1.pyc000064400000126422147511334560016672 0ustar00

z1.0.1�__version__�dottedQuadToNum�numToDottedQuad�
ValidateError�VdtUnknownCheckError�
VdtParamError�VdtTypeError�
VdtValueError�VdtValueTooSmallError�VdtValueTooBigError�VdtValueTooShortError�VdtValueTooLongError�VdtMissingValue�	Validator�
is_integer�is_float�
is_boolean�is_list�is_tuple�
is_ip_addr�	is_string�is_int_list�is_bool_list�
is_float_list�is_string_list�is_ip_addr_list�
is_mixed_list�	is_option�
__docformat__�N)�pprint�cCs|S)N�)�xr!r!�/usr/lib/python3.6/validate.py�<lambda>�sr$a�
    (?:
        ([a-zA-Z_][a-zA-Z0-9_]*)\s*=\s*list\(
            (
                (?:
                    \s*
                    (?:
                        (?:".*?")|              # double quotes
                        (?:'.*?')|              # single quotes
                        (?:[^'",\s\)][^,\)]*?)  # unquoted
                    )
                    \s*,\s*
                )*
                (?:
                    (?:".*?")|              # double quotes
                    (?:'.*?')|              # single quotes
                    (?:[^'",\s\)][^,\)]*?)  # unquoted
                )?                          # last one
            )
        \)
    )
z�
    (
        (?:".*?")|              # double quotes
        (?:'.*?')|              # single quotes
        (?:[^'",\s=][^,=]*?)       # unquoted
    )
    (?:
    (?:\s*,\s*)|(?:\s*$)            # comma
    )
a�
    (?:
        (
            (?:
                [a-zA-Z_][a-zA-Z0-9_]*\s*=\s*list\(
                    (?:
                        \s*
                        (?:
                            (?:".*?")|              # double quotes
                            (?:'.*?')|              # single quotes
                            (?:[^'",\s\)][^,\)]*?)       # unquoted
                        )
                        \s*,\s*
                    )*
                    (?:
                        (?:".*?")|              # double quotes
                        (?:'.*?')|              # single quotes
                        (?:[^'",\s\)][^,\)]*?)       # unquoted
                    )?                              # last one
                \)
            )|
            (?:
                (?:".*?")|              # double quotes
                (?:'.*?')|              # single quotes
                (?:[^'",\s=][^,=]*?)|       # unquoted
                (?:                         # keyword argument
                    [a-zA-Z_][a-zA-Z0-9_]*\s*=\s*
                    (?:
                        (?:".*?")|              # double quotes
                        (?:'.*?')|              # single quotes
                        (?:[^'",\s=][^,=]*?)       # unquoted
                    )
                )
            )
        )
        (?:
            (?:\s*,\s*)|(?:\s*$)            # comma
        )
    )
    z^%s*cCs|rdSdSdS)z$Simple boolean equivalent function. �rNr!)�valr!r!r#�bool
sr'cCsRddl}ddl}y|jd|j|j���dS|jk
rLtd|��YnXdS)a�
    Convert decimal dotted quad string to long integer
    
    >>> int(dottedQuadToNum('1 '))
    1
    >>> int(dottedQuadToNum(' 1.2'))
    16777218
    >>> int(dottedQuadToNum(' 1.2.3 '))
    16908291
    >>> int(dottedQuadToNum('1.2.3.4'))
    16909060
    >>> dottedQuadToNum('255.255.255.255')
    4294967295
    >>> dottedQuadToNum('255.255.255.256')
    Traceback (most recent call last):
    ValueError: Not a good dotted-quad IP: 255.255.255.256
    rNz!LzNot a good dotted-quad IP: %s)�socket�struct�unpackZ	inet_aton�strip�error�
ValueError)Zipr(r)r!r!r#rsc
Csvddl}ddl}|td�ks$|dkr0td|��y|j|jdt|���S|j|jtfk
rptd|��YnXdS)a!
    Convert int or long int to dotted quad string
    
    >>> numToDottedQuad(long(-1))
    Traceback (most recent call last):
    ValueError: Not a good numeric IP: -1
    >>> numToDottedQuad(long(1))
    '0.0.0.1'
    >>> numToDottedQuad(long(16777218))
    '1.0.0.2'
    >>> numToDottedQuad(long(16908291))
    '1.2.0.3'
    >>> numToDottedQuad(long(16909060))
    '1.2.3.4'
    >>> numToDottedQuad(long(4294967295))
    '255.255.255.255'
    >>> numToDottedQuad(long(4294967296))
    Traceback (most recent call last):
    ValueError: Not a good numeric IP: 4294967296
    >>> numToDottedQuad(-1)
    Traceback (most recent call last):
    ValueError: Not a good numeric IP: -1
    >>> numToDottedQuad(1)
    '0.0.0.1'
    >>> numToDottedQuad(16777218)
    '1.0.0.2'
    >>> numToDottedQuad(16908291)
    '1.2.0.3'
    >>> numToDottedQuad(16909060)
    '1.2.3.4'
    >>> numToDottedQuad(4294967295)
    '255.255.255.255'
    >>> numToDottedQuad(4294967296)
    Traceback (most recent call last):
    ValueError: Not a good numeric IP: 4294967296

    rNl��zNot a good numeric IP: %sz!L)r(r)�longr-Z	inet_ntoa�packr,�
OverflowError)Znumr(r)r!r!r#r0s(c@seZdZdZdS)ra
    This error indicates that the check failed.
    It can be the base class for more specific errors.
    
    Any check function that fails ought to raise this error.
    (or a subclass)
    
    >>> raise ValidateError
    Traceback (most recent call last):
    ValidateError
    N)�__name__�
__module__�__qualname__�__doc__r!r!r!r#rdsc@seZdZdZdS)r
z1No value was supplied to a check that needed one.N)r1r2r3r4r!r!r!r#r
rsc@seZdZdZdd�ZdS)rz'An unknown check function was requestedcCstj|d|f�dS)z�
        >>> raise VdtUnknownCheckError('yoda')
        Traceback (most recent call last):
        VdtUnknownCheckError: the check "yoda" is unknown.
        zthe check "%s" is unknown.N)r�__init__)�self�valuer!r!r#r5yszVdtUnknownCheckError.__init__N)r1r2r3r4r5r!r!r!r#rvsc@seZdZdZdd�ZdS)rz!An incorrect parameter was passedcCstj|d||f�dS)z�
        >>> raise VdtParamError('yoda', 'jedi')
        Traceback (most recent call last):
        VdtParamError: passed an incorrect value "jedi" for parameter "yoda".
        z2passed an incorrect value "%s" for parameter "%s".N)�SyntaxErrorr5)r6�namer7r!r!r#r5�szVdtParamError.__init__N)r1r2r3r4r5r!r!r!r#r�sc@seZdZdZdd�ZdS)rz(The value supplied was of the wrong typecCstj|d|f�dS)z�
        >>> raise VdtTypeError('jedi')
        Traceback (most recent call last):
        VdtTypeError: the value "jedi" is of the wrong type.
        z$the value "%s" is of the wrong type.N)rr5)r6r7r!r!r#r5�szVdtTypeError.__init__N)r1r2r3r4r5r!r!r!r#r�sc@seZdZdZdd�ZdS)rzIThe value supplied was of the correct type, but was not an allowed value.cCstj|d|f�dS)z�
        >>> raise VdtValueError('jedi')
        Traceback (most recent call last):
        VdtValueError: the value "jedi" is unacceptable.
        zthe value "%s" is unacceptable.N)rr5)r6r7r!r!r#r5�szVdtValueError.__init__N)r1r2r3r4r5r!r!r!r#r�sc@seZdZdZdd�ZdS)r	z>The value supplied was of the correct type, but was too small.cCstj|d|f�dS)z�
        >>> raise VdtValueTooSmallError('0')
        Traceback (most recent call last):
        VdtValueTooSmallError: the value "0" is too small.
        zthe value "%s" is too small.N)rr5)r6r7r!r!r#r5�szVdtValueTooSmallError.__init__N)r1r2r3r4r5r!r!r!r#r	�sc@seZdZdZdd�ZdS)r
z<The value supplied was of the correct type, but was too big.cCstj|d|f�dS)z�
        >>> raise VdtValueTooBigError('1')
        Traceback (most recent call last):
        VdtValueTooBigError: the value "1" is too big.
        zthe value "%s" is too big.N)rr5)r6r7r!r!r#r5�szVdtValueTooBigError.__init__N)r1r2r3r4r5r!r!r!r#r
�sc@seZdZdZdd�ZdS)rz>The value supplied was of the correct type, but was too short.cCstj|d|f�dS)z�
        >>> raise VdtValueTooShortError('jed')
        Traceback (most recent call last):
        VdtValueTooShortError: the value "jed" is too short.
        zthe value "%s" is too short.N)rr5)r6r7r!r!r#r5�szVdtValueTooShortError.__init__N)r1r2r3r4r5r!r!r!r#r�sc@seZdZdZdd�ZdS)rz=The value supplied was of the correct type, but was too long.cCstj|d|f�dS)z�
        >>> raise VdtValueTooLongError('jedie')
        Traceback (most recent call last):
        VdtValueTooLongError: the value "jedie" is too long.
        zthe value "%s" is too long.N)rr5)r6r7r!r!r#r5�szVdtValueTooLongError.__init__N)r1r2r3r4r5r!r!r!r#r�sc@s�eZdZdZejdej�Zejdej�Ze	Z	e
Z
ejeejejB�Z
ejeejejB�Zddd�Zddd	�Zd
d�Zdd
�Zdd�Zdd�Zdd�Zdd�Zdd�Zdd�ZdS)ra3

    Validator is an object that allows you to register a set of 'checks'.
    These checks take input and test that it conforms to the check.
    
    This can also involve converting the value from a string into
    the correct datatype.
    
    The ``check`` method takes an input string which configures which
    check is to be used and applies that check to a supplied value.
    
    An example input string would be:
    'int_range(param1, param2)'
    
    You would then provide something like:
    
    >>> def int_range_check(value, min, max):
    ...     # turn min and max from strings to integers
    ...     min = int(min)
    ...     max = int(max)
    ...     # check that value is of the correct type.
    ...     # possible valid inputs are integers or strings
    ...     # that represent integers
    ...     if not isinstance(value, (int, long, string_type)):
    ...         raise VdtTypeError(value)
    ...     elif isinstance(value, string_type):
    ...         # if we are given a string
    ...         # attempt to convert to an integer
    ...         try:
    ...             value = int(value)
    ...         except ValueError:
    ...             raise VdtValueError(value)
    ...     # check the value is between our constraints
    ...     if not min <= value:
    ...          raise VdtValueTooSmallError(value)
    ...     if not value <= max:
    ...          raise VdtValueTooBigError(value)
    ...     return value
    
    >>> fdict = {'int_range': int_range_check}
    >>> vtr1 = Validator(fdict)
    >>> vtr1.check('int_range(20, 40)', '30')
    30
    >>> vtr1.check('int_range(20, 40)', '60')
    Traceback (most recent call last):
    VdtValueTooBigError: the value "60" is too big.
    
    New functions can be added with : ::
    
    >>> vtr2 = Validator()       
    >>> vtr2.functions['int_range'] = int_range_check
    
    Or by passing in a dictionary of functions when Validator 
    is instantiated.
    
    Your functions *can* use keyword arguments,
    but the first argument should always be 'value'.
    
    If the function doesn't take additional arguments,
    the parentheses are optional in the check.
    It can be written with either of : ::
    
        keyword = function_name
        keyword = function_name()
    
    The first program to utilise Validator() was Michael Foord's
    ConfigObj, an alternative to ConfigParser which supports lists and
    can validate a config file using a config schema.
    For more details on using Validator with ConfigObj see:
    https://configobj.readthedocs.org/en/latest/configobj.html
    z
(.+?)\((.*)\)z%^([a-zA-Z_][a-zA-Z0-9_]*)\s*=\s*(.*)$NcCsR|jttttttttt	t
ttt
|jttd�|_|dk	rB|jj|�t|_i|_dS)z(
        >>> vtri = Validator()
        )��integer�float�boolean�ip_addr�string�list�tupleZint_listZ
float_listZ	bool_listZip_addr_listZstring_list�
mixed_list�passZoption�
force_listN)�_passrrrrrrrrrrrrrrrD�	functions�updaterZbaseErrorClass�_cache)r6rFr!r!r#r53s*
zValidator.__init__FcCsJ|j|�\}}}}|r.|dkr$t��|j|�}|dkr:dS|j||||�S)a�
        Usage: check(check, value)
        
        Arguments:
            check: string representing check to apply (including arguments)
            value: object to be checked
        Returns value, converted to correct type if necessary
        
        If the check fails, raises a ``ValidateError`` subclass.
        
        >>> vtor.check('yoda', '')
        Traceback (most recent call last):
        VdtUnknownCheckError: the check "yoda" is unknown.
        >>> vtor.check('yoda()', '')
        Traceback (most recent call last):
        VdtUnknownCheckError: the check "yoda" is unknown.
        
        >>> vtor.check('string(default="")', '', missing=True)
        ''
        N)�_parse_with_cachingr
�_handle_none�_check_value)r6�checkr7Zmissing�fun_name�fun_args�
fun_kwargs�defaultr!r!r#rLQs
zValidator.checkcCs"|dkrdS|dkr|j|�}|S)N�None�'None'�"None")rRrS)�_unquote)r6r7r!r!r#rJts

zValidator._handle_nonecCs�||jkr.|j|\}}}}t|�}t|�}nF|j|�\}}}}tdd�t|j��D��}|t|�t|�|f|j|<||||fS)NcSsg|]\}}t|�|f�qSr!)�str)�.0�keyr7r!r!r#�
<listcomp>�sz1Validator._parse_with_caching.<locals>.<listcomp>)rHr@�dict�_parse_check�items)r6rLrMrNrOrPr!r!r#rI}s

zValidator._parse_with_cachingcCs@y|j|}Wntk
r*t|��YnX||f|�|�SdS)N)rF�KeyErrorr)r6r7rMrNrO�funr!r!r#rK�s
zValidator._check_valuecCs|jj|�}|r�|jd�}|jd�}|jj|�}|dkrDtd|��g}i}x�|jj|�D]�}|j�}|jj|�}	|	r�|j	|	�\}
}|||
<qZ|j
j|�}|r�|jd�}|dkr�|j|�}|||jd�<qZ|j|j|��qZWn|fidfS|j
dd�}
||||
fS)Nr%�zBad syntax in check "%s".�'None'�"None"rP)r_r`)�_func_re�match�group�_matchfinderr�_paramfinder�findallr+�	_list_arg�_list_handle�_key_argrT�append�pop)r6rLZ	fun_matchrMZ
arg_stringZ	arg_matchrNrO�arg�	listmatchrWr&ZkeymatchrPr!r!r#rZ�s6



zValidator._parse_checkcCs8t|�dkr4|ddkr4|d|dkr4|dd�}|S)	zUnquote a value if necessary.r^r�'�"r%)rnro���rp)�len)r6r&r!r!r#rT�s(zValidator._unquotecCsFg}|jd�}|jd�}x$|jj|�D]}|j|j|��q&W||fS)z7Take apart a ``keyword=list('val, 'val')`` type string.r%r^)rc�
_list_membersrfrjrT)r6rm�outr9�argsrlr!r!r#rh�s

zValidator._list_handlecCs|S)z�
        Dummy check that always passes
        
        >>> vtor.check('', 0)
        0
        >>> vtor.check('', '0')
        '0'
        r!)r6r7r!r!r#rE�s	zValidator._passcCsL|j|�\}}}}|dkr&td|��|j|�}|dkr<|S|j||||�S)z�
        Given a check, return the default value for the check
        (converted to the right type).
        
        If the check doesn't specify a default value then a
        ``KeyError`` will be raised.
        Nz Check "%s" has no default value.)rIr\rJrK)r6rLrMrNrOrPr7r!r!r#�get_default_value�s
zValidator.get_default_value)N)F)r1r2r3r4�re�compile�DOTALLrarirgrr�_paramstring�VERBOSEre�_matchstringrdr5rLrJrIrKrZrTrhrErur!r!r!r#r�s"F

#		(
FcCs�|rtp
t}g}x�t||�D]z\}}|dkr8|j|�qt|ttttf�r�y|j||��Wq�tk
r�}zt||��WYdd}~Xq�Xqt||��qW|S)a�
    Return numbers from inputs or raise VdtParamError.
    
    Lets ``None`` pass through.
    Pass in keyword argument ``to_float=True`` to
    use float for the conversion rather than int.
    
    >>> _is_num_param(('', ''), (0, 1.0))
    [0, 1]
    >>> _is_num_param(('', ''), (0, 1.0), to_float=True)
    [0.0, 1.0]
    >>> _is_num_param(('a'), ('a'))
    Traceback (most recent call last):
    VdtParamError: passed an incorrect value "a" for parameter "a".
    N)	r<�int�ziprj�
isinstancer.�string_typer-r)�names�values�to_floatr]Z
out_paramsr9r&�er!r!r#�
_is_num_param�sr�cCs�td||f�\}}t|tttf�s*t|��t|t�r^yt|�}Wntk
r\t|��YnX|dk	rv||krvt|��|dk	r�||kr�t|��|S)aH
    A check that tests that a given value is an integer (int, or long)
    and optionally, between bounds. A negative value is accepted, while
    a float will fail.
    
    If the value is a string, then the conversion is done - if possible.
    Otherwise a VdtError is raised.
    
    >>> vtor.check('integer', '-1')
    -1
    >>> vtor.check('integer', '0')
    0
    >>> vtor.check('integer', 9)
    9
    >>> vtor.check('integer', 'a')
    Traceback (most recent call last):
    VdtTypeError: the value "a" is of the wrong type.
    >>> vtor.check('integer', '2.2')
    Traceback (most recent call last):
    VdtTypeError: the value "2.2" is of the wrong type.
    >>> vtor.check('integer(10)', '20')
    20
    >>> vtor.check('integer(max=20)', '15')
    15
    >>> vtor.check('integer(10)', '9')
    Traceback (most recent call last):
    VdtValueTooSmallError: the value "9" is too small.
    >>> vtor.check('integer(10)', 9)
    Traceback (most recent call last):
    VdtValueTooSmallError: the value "9" is too small.
    >>> vtor.check('integer(max=20)', '35')
    Traceback (most recent call last):
    VdtValueTooBigError: the value "35" is too big.
    >>> vtor.check('integer(max=20)', 35)
    Traceback (most recent call last):
    VdtValueTooBigError: the value "35" is too big.
    >>> vtor.check('integer(0, 9)', False)
    0
    �min�maxN)r�r�)	r�r~r|r.rrr-r	r
)r7r�r��min_val�max_valr!r!r#rs(
cCs�td||fdd�\}}t|ttttf�s0t|��t|t�sdyt|�}Wntk
rbt|��YnX|dk	r|||kr|t|��|dk	r�||kr�t	|��|S)a<
    A check that tests that a given value is a float
    (an integer will be accepted), and optionally - that it is between bounds.
    
    If the value is a string, then the conversion is done - if possible.
    Otherwise a VdtError is raised.
    
    This can accept negative values.
    
    >>> vtor.check('float', '2')
    2.0
    
    From now on we multiply the value to avoid comparing decimals
    
    >>> vtor.check('float', '-6.8') * 10
    -68.0
    >>> vtor.check('float', '12.2') * 10
    122.0
    >>> vtor.check('float', 8.4) * 10
    84.0
    >>> vtor.check('float', 'a')
    Traceback (most recent call last):
    VdtTypeError: the value "a" is of the wrong type.
    >>> vtor.check('float(10.1)', '10.2') * 10
    102.0
    >>> vtor.check('float(max=20.2)', '15.1') * 10
    151.0
    >>> vtor.check('float(10.0)', '9.0')
    Traceback (most recent call last):
    VdtValueTooSmallError: the value "9.0" is too small.
    >>> vtor.check('float(max=20.0)', '35.0')
    Traceback (most recent call last):
    VdtValueTooBigError: the value "35.0" is too big.
    r�r�T)r�N)r�r�)
r�r~r|r.r<rrr-r	r
)r7r�r�r�r�r!r!r#rGs#
T)
TZon�1�true�yesFZoff�0Zfalse�nocCsXt|t�r4yt|j�Stk
r2t|��YnX|dkr@dS|dkrLdSt|��dS)a�
    Check if the value represents a boolean.
    
    >>> vtor.check('boolean', 0)
    0
    >>> vtor.check('boolean', False)
    0
    >>> vtor.check('boolean', '0')
    0
    >>> vtor.check('boolean', 'off')
    0
    >>> vtor.check('boolean', 'false')
    0
    >>> vtor.check('boolean', 'no')
    0
    >>> vtor.check('boolean', 'nO')
    0
    >>> vtor.check('boolean', 'NO')
    0
    >>> vtor.check('boolean', 1)
    1
    >>> vtor.check('boolean', True)
    1
    >>> vtor.check('boolean', '1')
    1
    >>> vtor.check('boolean', 'on')
    1
    >>> vtor.check('boolean', 'true')
    1
    >>> vtor.check('boolean', 'yes')
    1
    >>> vtor.check('boolean', 'Yes')
    1
    >>> vtor.check('boolean', 'YES')
    1
    >>> vtor.check('boolean', '')
    Traceback (most recent call last):
    VdtTypeError: the value "" is of the wrong type.
    >>> vtor.check('boolean', 'up')
    Traceback (most recent call last):
    VdtTypeError: the value "up" is of the wrong type.
    
    FTN)r~r�	bool_dict�lowerr\r)r7r!r!r#r�s,
cCsHt|t�st|��|j�}yt|�Wntk
rBt|��YnX|S)as
    Check that the supplied value is an Internet Protocol address, v.4,
    represented by a dotted-quad string, i.e. '1.2.3.4'.
    
    >>> vtor.check('ip_addr', '1 ')
    '1'
    >>> vtor.check('ip_addr', ' 1.2')
    '1.2'
    >>> vtor.check('ip_addr', ' 1.2.3 ')
    '1.2.3'
    >>> vtor.check('ip_addr', '1.2.3.4')
    '1.2.3.4'
    >>> vtor.check('ip_addr', '0.0.0.0')
    '0.0.0.0'
    >>> vtor.check('ip_addr', '255.255.255.255')
    '255.255.255.255'
    >>> vtor.check('ip_addr', '255.255.255.256')
    Traceback (most recent call last):
    VdtValueError: the value "255.255.255.256" is unacceptable.
    >>> vtor.check('ip_addr', '1.2.3.4.5')
    Traceback (most recent call last):
    VdtValueError: the value "1.2.3.4.5" is unacceptable.
    >>> vtor.check('ip_addr', 0)
    Traceback (most recent call last):
    VdtTypeError: the value "0" is of the wrong type.
    )r~rrr+rr-r)r7r!r!r#r�s
cCs�td||f�\}}t|t�r$t|��yt|�}Wntk
rLt|��YnX|dk	rf||krft|��|dk	r~||kr~t|��t|�S)a�
    Check that the value is a list of values.
    
    You can optionally specify the minimum and maximum number of members.
    
    It does no check on list members.
    
    >>> vtor.check('list', ())
    []
    >>> vtor.check('list', [])
    []
    >>> vtor.check('list', (1, 2))
    [1, 2]
    >>> vtor.check('list', [1, 2])
    [1, 2]
    >>> vtor.check('list(3)', (1, 2))
    Traceback (most recent call last):
    VdtValueTooShortError: the value "(1, 2)" is too short.
    >>> vtor.check('list(max=5)', (1, 2, 3, 4, 5, 6))
    Traceback (most recent call last):
    VdtValueTooLongError: the value "(1, 2, 3, 4, 5, 6)" is too long.
    >>> vtor.check('list(min=3, max=5)', (1, 2, 3, 4))
    [1, 2, 3, 4]
    >>> vtor.check('list', 0)
    Traceback (most recent call last):
    VdtTypeError: the value "0" is of the wrong type.
    >>> vtor.check('list', '12')
    Traceback (most recent call last):
    VdtTypeError: the value "12" is of the wrong type.
    r�r�N)r�r�)	r�r~rrrq�	TypeErrorrrr@)r7r�r��min_len�max_len�num_membersr!r!r#r�s
cCstt|||��S)a�
    Check that the value is a tuple of values.
    
    You can optionally specify the minimum and maximum number of members.
    
    It does no check on members.
    
    >>> vtor.check('tuple', ())
    ()
    >>> vtor.check('tuple', [])
    ()
    >>> vtor.check('tuple', (1, 2))
    (1, 2)
    >>> vtor.check('tuple', [1, 2])
    (1, 2)
    >>> vtor.check('tuple(3)', (1, 2))
    Traceback (most recent call last):
    VdtValueTooShortError: the value "(1, 2)" is too short.
    >>> vtor.check('tuple(max=5)', (1, 2, 3, 4, 5, 6))
    Traceback (most recent call last):
    VdtValueTooLongError: the value "(1, 2, 3, 4, 5, 6)" is too long.
    >>> vtor.check('tuple(min=3, max=5)', (1, 2, 3, 4))
    (1, 2, 3, 4)
    >>> vtor.check('tuple', 0)
    Traceback (most recent call last):
    VdtTypeError: the value "0" is of the wrong type.
    >>> vtor.check('tuple', '12')
    Traceback (most recent call last):
    VdtTypeError: the value "12" is of the wrong type.
    )rAr)r7r�r�r!r!r#rscCs�t|t�st|��td||f�\}}yt|�}Wntk
rLt|��YnX|dk	rf||krft|��|dk	r~||kr~t|��|S)a�
    Check that the supplied value is a string.
    
    You can optionally specify the minimum and maximum number of members.
    
    >>> vtor.check('string', '0')
    '0'
    >>> vtor.check('string', 0)
    Traceback (most recent call last):
    VdtTypeError: the value "0" is of the wrong type.
    >>> vtor.check('string(2)', '12')
    '12'
    >>> vtor.check('string(2)', '1')
    Traceback (most recent call last):
    VdtValueTooShortError: the value "1" is too short.
    >>> vtor.check('string(min=2, max=3)', '123')
    '123'
    >>> vtor.check('string(min=2, max=3)', '1234')
    Traceback (most recent call last):
    VdtValueTooLongError: the value "1234" is too long.
    r�r�N)r�r�)r~rrr�rqr�rr)r7r�r�r�r�r�r!r!r#r1s
cCsdd�t|||�D�S)a
    Check that the value is a list of integers.
    
    You can optionally specify the minimum and maximum number of members.
    
    Each list member is checked that it is an integer.
    
    >>> vtor.check('int_list', ())
    []
    >>> vtor.check('int_list', [])
    []
    >>> vtor.check('int_list', (1, 2))
    [1, 2]
    >>> vtor.check('int_list', [1, 2])
    [1, 2]
    >>> vtor.check('int_list', [1, 'a'])
    Traceback (most recent call last):
    VdtTypeError: the value "a" is of the wrong type.
    cSsg|]}t|��qSr!)r)rV�memr!r!r#rXiszis_int_list.<locals>.<listcomp>)r)r7r�r�r!r!r#rUscCsdd�t|||�D�S)al
    Check that the value is a list of booleans.
    
    You can optionally specify the minimum and maximum number of members.
    
    Each list member is checked that it is a boolean.
    
    >>> vtor.check('bool_list', ())
    []
    >>> vtor.check('bool_list', [])
    []
    >>> check_res = vtor.check('bool_list', (True, False))
    >>> check_res == [True, False]
    1
    >>> check_res = vtor.check('bool_list', [True, False])
    >>> check_res == [True, False]
    1
    >>> vtor.check('bool_list', [True, 'a'])
    Traceback (most recent call last):
    VdtTypeError: the value "a" is of the wrong type.
    cSsg|]}t|��qSr!)r)rVr�r!r!r#rX�sz is_bool_list.<locals>.<listcomp>)r)r7r�r�r!r!r#rlscCsdd�t|||�D�S)a
    Check that the value is a list of floats.
    
    You can optionally specify the minimum and maximum number of members.
    
    Each list member is checked that it is a float.
    
    >>> vtor.check('float_list', ())
    []
    >>> vtor.check('float_list', [])
    []
    >>> vtor.check('float_list', (1, 2.0))
    [1.0, 2.0]
    >>> vtor.check('float_list', [1, 2.0])
    [1.0, 2.0]
    >>> vtor.check('float_list', [1, 'a'])
    Traceback (most recent call last):
    VdtTypeError: the value "a" is of the wrong type.
    cSsg|]}t|��qSr!)r)rVr�r!r!r#rX�sz!is_float_list.<locals>.<listcomp>)r)r7r�r�r!r!r#r�scCs(t|t�rt|��dd�t|||�D�S)an
    Check that the value is a list of strings.
    
    You can optionally specify the minimum and maximum number of members.
    
    Each list member is checked that it is a string.
    
    >>> vtor.check('string_list', ())
    []
    >>> vtor.check('string_list', [])
    []
    >>> vtor.check('string_list', ('a', 'b'))
    ['a', 'b']
    >>> vtor.check('string_list', ['a', 1])
    Traceback (most recent call last):
    VdtTypeError: the value "1" is of the wrong type.
    >>> vtor.check('string_list', 'hello')
    Traceback (most recent call last):
    VdtTypeError: the value "hello" is of the wrong type.
    cSsg|]}t|��qSr!)r)rVr�r!r!r#rX�sz"is_string_list.<locals>.<listcomp>)r~rrr)r7r�r�r!r!r#r�s
cCsdd�t|||�D�S)a
    Check that the value is a list of IP addresses.
    
    You can optionally specify the minimum and maximum number of members.
    
    Each list member is checked that it is an IP address.
    
    >>> vtor.check('ip_addr_list', ())
    []
    >>> vtor.check('ip_addr_list', [])
    []
    >>> vtor.check('ip_addr_list', ('1.2.3.4', '5.6.7.8'))
    ['1.2.3.4', '5.6.7.8']
    >>> vtor.check('ip_addr_list', ['a'])
    Traceback (most recent call last):
    VdtValueError: the value "a" is unacceptable.
    cSsg|]}t|��qSr!)r)rVr�r!r!r#rX�sz#is_ip_addr_list.<locals>.<listcomp>)r)r7r�r�r!r!r#r�scCs t|ttf�s|g}t|||�S)a�
    Check that a value is a list, coercing strings into
    a list with one member. Useful where users forget the
    trailing comma that turns a single value into a list.
    
    You can optionally specify the minimum and maximum number of members.
    A minimum of greater than one will fail if the user only supplies a
    string.
    
    >>> vtor.check('force_list', ())
    []
    >>> vtor.check('force_list', [])
    []
    >>> vtor.check('force_list', 'hello')
    ['hello']
    )r~r@rAr)r7r�r�r!r!r#rD�srD)r;r<r>r?r=cGs�yt|�}Wntk
r(t|��YnX|t|�kr@t|��n|t|�krTt|��ydd�t||�D�Stk
r�}ztd|��WYdd}~XnXdS)a�
    Check that the value is a list.
    Allow specifying the type of each member.
    Work on lists of specific lengths.
    
    You specify each member as a positional argument specifying type
    
    Each type should be one of the following strings :
      'integer', 'float', 'ip_addr', 'string', 'boolean'
    
    So you can specify a list of two strings, followed by
    two integers as :
    
      mixed_list('string', 'string', 'integer', 'integer')
    
    The length of the list must match the number of positional
    arguments you supply.
    
    >>> mix_str = "mixed_list('integer', 'float', 'ip_addr', 'string', 'boolean')"
    >>> check_res = vtor.check(mix_str, (1, 2.0, '1.2.3.4', 'a', True))
    >>> check_res == [1, 2.0, '1.2.3.4', 'a', True]
    1
    >>> check_res = vtor.check(mix_str, ('1', '2.0', '1.2.3.4', 'a', 'True'))
    >>> check_res == [1, 2.0, '1.2.3.4', 'a', True]
    1
    >>> vtor.check(mix_str, ('b', 2.0, '1.2.3.4', 'a', True))
    Traceback (most recent call last):
    VdtTypeError: the value "b" is of the wrong type.
    >>> vtor.check(mix_str, (1, 2.0, '1.2.3.4', 'a'))
    Traceback (most recent call last):
    VdtValueTooShortError: the value "(1, 2.0, '1.2.3.4', 'a')" is too short.
    >>> vtor.check(mix_str, (1, 2.0, '1.2.3.4', 'a', 1, 'b'))
    Traceback (most recent call last):
    VdtValueTooLongError: the value "(1, 2.0, '1.2.3.4', 'a', 1, 'b')" is too long.
    >>> vtor.check(mix_str, 0)
    Traceback (most recent call last):
    VdtTypeError: the value "0" is of the wrong type.

    >>> vtor.check('mixed_list("yoda")', ('a'))
    Traceback (most recent call last):
    VdtParamError: passed an incorrect value "KeyError('yoda',)" for parameter "'mixed_list'"
    cSsg|]\}}t||��qSr!)�fun_dict)rVrlr&r!r!r#rXsz!is_mixed_list.<locals>.<listcomp>rBN)rqr�rrrr}r\r)r7rtZlengthr�r!r!r#r�s+
cGs&t|t�st|��||kr"t|��|S)a�
    This check matches the value to any of a set of options.
    
    >>> vtor.check('option("yoda", "jedi")', 'yoda')
    'yoda'
    >>> vtor.check('option("yoda", "jedi")', 'jed')
    Traceback (most recent call last):
    VdtValueError: the value "jed" is unacceptable.
    >>> vtor.check('option("yoda", "jedi")', 0)
    Traceback (most recent call last):
    VdtTypeError: the value "0" is of the wrong type.


site-packages/__pycache__/appdirs.cpython-36.opt-1.pyc000064400000050253147511334560016541 0ustar003

Utilities for determining application-specific dirs.

See <http://github.com/ActiveState/appdirs> for details and usage.
����.�N�javaZWindows�win32ZMac�darwinZlinux2FcCs�tdkr^|dkr|}|rdpd}tjjt|��}|r�|dk	rNtjj|||�}q�tjj||�}nNtdkr�tjjd�}|r�tjj||�}n&tjdtjjd	��}|r�tjj||�}|r�|r�tjj||�}|S)
aJReturn full path to the user-specific data dir for this application.

        "appname" is the name of application.
            If None, just the system directory is returned.
        "appauthor" (only used on Windows) is the name of the
            appauthor or distributing body for this application. Typically
            it is the owning company name. This falls back to appname. You may
            pass False to disable it.
        "version" is an optional version path element to append to the
            path. You might want to use this if you want multiple versions
            of your app to be able to run independently. If used, this
            would typically be "<major>.<minor>".
            Only applied when appname is present.
        "roaming" (boolean, default False) can be set True to use the Windows
            roaming appdata directory. That means that for users on a Windows
            network setup for roaming profiles, this user data will be
            sync'd on login. See
            <http://technet.microsoft.com/en-us/library/cc766489(WS.10).aspx>
            for a discussion of issues.

    Typical user data directories are:
        Mac OS X:               ~/Library/Application Support/<AppName>
        Unix:                   ~/.local/share/<AppName>    # or in $XDG_DATA_HOME, if defined
        Win XP (not roaming):   C:\Documents and Settings\<username>\Application Data\<AppAuthor>\<AppName>
        Win XP (roaming):       C:\Documents and Settings\<username>\Local Settings\Application Data\<AppAuthor>\<AppName>
        Win 7  (not roaming):   C:\Users\<username>\AppData\Local\<AppAuthor>\<AppName>
        Win 7  (roaming):       C:\Users\<username>\AppData\Roaming\<AppAuthor>\<AppName>

    For Unix, we follow the XDG spec and support $XDG_DATA_HOME.
    That means, by default "~/.local/share/<AppName>".
    rN�
CSIDL_APPDATA�CSIDL_LOCAL_APPDATAFrz~/Library/Application Support/Z
XDG_DATA_HOMEz~/.local/share)�system�os�path�normpath�_get_win_folder�join�
expanduser�getenv)�appname�	appauthor�version�roaming�constr
�r�/usr/lib/python3.6/appdirs.py�
user_data_dir,s& rcs
tdkrR|dkr�}tjjtd��}�r�|dk	rBtjj||��}q�tjj|��}n�tdkrztjjd�}�r�tjj|��}nttjdtjjdd	g��}d
d�|j	tj�D�}�r�|r�tjj�|���fdd�|D�}|r�tjj|�}n|d
}|S�o�|�rtjj||�}|S)aiReturn full path to the user-shared data dir for this application.

        "appname" is the name of application.
            If None, just the system directory is returned.
        "appauthor" (only used on Windows) is the name of the
            appauthor or distributing body for this application. Typically
            it is the owning company name. This falls back to appname. You may
            pass False to disable it.
        "version" is an optional version path element to append to the
            path. You might want to use this if you want multiple versions
            of your app to be able to run independently. If used, this
            would typically be "<major>.<minor>".
            Only applied when appname is present.
        "multipath" is an optional parameter only applicable to *nix
            which indicates that the entire list of data dirs should be
            returned. By default, the first item from XDG_DATA_DIRS is
            returned, or '/usr/local/share/<AppName>',
            if XDG_DATA_DIRS is not set

    Typical site data directories are:
        Mac OS X:   /Library/Application Support/<AppName>
        Unix:       /usr/local/share/<AppName> or /usr/share/<AppName>
        Win XP:     C:\Documents and Settings\All Users\Application Data\<AppAuthor>\<AppName>
        Vista:      (Fail! "C:\ProgramData" is a hidden *system* directory on Vista.)
        Win 7:      C:\ProgramData\<AppAuthor>\<AppName>   # Hidden, but writeable on Win 7.

    For Unix, this is using the $XDG_DATA_DIRS[0] default.

    WARNING: Do not use this on Windows. See the Vista-Fail note above for why.
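# Sketch of the shared (site-wide) data dir; with multipath=True every entry
# of $XDG_DATA_DIRS is returned, joined by os.pathsep, instead of only the
# first one ("SuperApp"/"Acme" are illustrative names).
from appdirs import site_data_dir

print(site_data_dir("SuperApp", "Acme"))                  # e.g. '/usr/local/share/SuperApp'
print(site_data_dir("SuperApp", "Acme", multipath=True))  # e.g. '/usr/local/share/SuperApp:/usr/share/SuperApp'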
    rN�CSIDL_COMMON_APPDATAFrz/Library/Application SupportZ
XDG_DATA_DIRSz/usr/local/sharez
/usr/sharecSs g|]}tjj|jtj���qSr)rr
r�rstrip�sep)�.0�xrrr�
<listcomp>�sz!site_data_dir.<locals>.<listcomp>csg|]}tjj|�g��qSr)rrr)rr)rrrr �sr)
rrr
rrrrr�pathsep�split)rrr�	multipathr
�pathlistr)rr�
site_data_dircs4
r%cCsXtdkrt||d|�}n&tjdtjjd��}|r>tjj||�}|rT|rTtjj||�}|S)a�Return full path to the user-specific config dir for this application.

        "appname" is the name of application.
            If None, just the system directory is returned.
        "appauthor" (only used on Windows) is the name of the
            appauthor or distributing body for this application. Typically
            it is the owning company name. This falls back to appname. You may
            pass False to disable it.
        "version" is an optional version path element to append to the
            path. You might want to use this if you want multiple versions
            of your app to be able to run independently. If used, this
            would typically be "<major>.<minor>".
            Only applied when appname is present.
        "roaming" (boolean, default False) can be set True to use the Windows
            roaming appdata directory. That means that for users on a Windows
            network setup for roaming profiles, this user data will be
            sync'd on login. See
            <http://technet.microsoft.com/en-us/library/cc766489(WS.10).aspx>
            for a discussion of issues.

    Typical user config directories are:
        Mac OS X:               same as user_data_dir
        Unix:                   ~/.config/<AppName>     # or in $XDG_CONFIG_HOME, if defined
        Win *:                  same as user_data_dir

    For Unix, we follow the XDG spec and support $XDG_CONFIG_HOME.
    That means, by default "~/.config/<AppName>".
    rrNZXDG_CONFIG_HOMEz	~/.config)rr)rrrrr
rr)rrrrr
rrr�user_config_dir�sr&cs�td	kr*t�|�}�r�|r�tjj||�}ndtjdd�}dd�|jtj�D�}�rt|rbtjj�|���fdd�|D�}|r�tjj|�}n|d}|S)
aReturn full path to the user-shared data dir for this application.

        "appname" is the name of application.
            If None, just the system directory is returned.
        "appauthor" (only used on Windows) is the name of the
            appauthor or distributing body for this application. Typically
            it is the owning company name. This falls back to appname. You may
            pass False to disable it.
        "version" is an optional version path element to append to the
            path. You might want to use this if you want multiple versions
            of your app to be able to run independently. If used, this
            would typically be "<major>.<minor>".
            Only applied when appname is present.
        "multipath" is an optional parameter only applicable to *nix
            which indicates that the entire list of config dirs should be
            returned. By default, the first item from XDG_CONFIG_DIRS is
            returned, or '/etc/xdg/<AppName>', if XDG_CONFIG_DIRS is not set

    Typical site config directories are:
        Mac OS X:   same as site_data_dir
        Unix:       /etc/xdg/<AppName> or $XDG_CONFIG_DIRS[i]/<AppName> for each value in
                    $XDG_CONFIG_DIRS
        Win *:      same as site_data_dir
        Vista:      (Fail! "C:\ProgramData" is a hidden *system* directory on Vista.)

    For Unix, this is using the $XDG_CONFIG_DIRS[0] default, if multipath=False

    WARNING: Do not use this on Windows. See the Vista-Fail note above for why.
    rrZXDG_CONFIG_DIRSz/etc/xdgcSs g|]}tjj|jtj���qSr)rr
rrr)rrrrrr �sz#site_config_dir.<locals>.<listcomp>csg|]}tjj|�g��qSr)rrr)rr)rrrr �sr)rr)rr%rr
rrr"r!)rrrr#r
r$r)rr�site_config_dir�s
r'TcCs�tdkrd|dkr|}tjjtd��}|r�|dk	rBtjj|||�}ntjj||�}|r�tjj|d�}nNtdkr�tjjd�}|r�tjj||�}n&tjdtjjd	��}|r�tjj||�}|r�|r�tjj||�}|S)
aReturn full path to the user-specific cache dir for this application.

        "appname" is the name of application.
            If None, just the system directory is returned.
        "appauthor" (only used on Windows) is the name of the
            appauthor or distributing body for this application. Typically
            it is the owning company name. This falls back to appname. You may
            pass False to disable it.
        "version" is an optional version path element to append to the
            path. You might want to use this if you want multiple versions
            of your app to be able to run independently. If used, this
            would typically be "<major>.<minor>".
            Only applied when appname is present.
        "opinion" (boolean) can be False to disable the appending of
            "Cache" to the base app data dir for Windows. See
            discussion below.

    Typical user cache directories are:
        Mac OS X:   ~/Library/Caches/<AppName>
        Unix:       ~/.cache/<AppName> (XDG default)
        Win XP:     C:\Documents and Settings\<username>\Local Settings\Application Data\<AppAuthor>\<AppName>\Cache
        Vista:      C:\Users\<username>\AppData\Local\<AppAuthor>\<AppName>\Cache

    On Windows the only suggestion in the MSDN docs is that local settings go in
    the `CSIDL_LOCAL_APPDATA` directory. This is identical to the non-roaming
    app data dir (the default returned by `user_data_dir` above). Apps typically
    put cache data somewhere *under* the given dir here. Some examples:
        ...\Mozilla\Firefox\Profiles\<ProfileName>\Cache
        ...\Acme\SuperApp\Cache\1.0
    OPINION: This function appends "Cache" to the `CSIDL_LOCAL_APPDATA` value.
    This can be disabled with the `opinion=False` option.
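# A small sketch of the user_cache_dir() call described above; the names are
# placeholders, and opinion=False only changes the result on Windows, where it
# drops the trailing "Cache" component as the docstring explains.
import appdirs

print(appdirs.user_cache_dir("MyApp", "MyCompany"))                 # ~/.cache/MyApp on XDG systems
print(appdirs.user_cache_dir("MyApp", "MyCompany", opinion=False))  # no "Cache" suffix on Windows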
    rNr
FZCacherz~/Library/CachesZXDG_CACHE_HOMEz~/.cache)rrr
rrrrr)rrr�opinionr
rrr�user_cache_dirs(!r)cCsXtdkrt||d|�}n&tjdtjjd��}|r>tjj||�}|rT|rTtjj||�}|S)aReturn full path to the user-specific state dir for this application.

        "appname" is the name of application.
            If None, just the system directory is returned.
        "appauthor" (only used on Windows) is the name of the
            appauthor or distributing body for this application. Typically
            it is the owning company name. This falls back to appname. You may
            pass False to disable it.
        "version" is an optional version path element to append to the
            path. You might want to use this if you want multiple versions
            of your app to be able to run independently. If used, this
            would typically be "<major>.<minor>".
            Only applied when appname is present.
        "roaming" (boolean, default False) can be set True to use the Windows
            roaming appdata directory. That means that for users on a Windows
            network setup for roaming profiles, this user data will be
            sync'd on login. See
            <http://technet.microsoft.com/en-us/library/cc766489(WS.10).aspx>
            for a discussion of issues.

    Typical user state directories are:
        Mac OS X:  same as user_data_dir
        Unix:      ~/.local/state/<AppName>   # or in $XDG_STATE_HOME, if defined
        Win *:     same as user_data_dir

    For Unix, we follow this Debian proposal <https://wiki.debian.org/XDGBaseDirectorySpecification#state>
    to extend the XDG spec and support $XDG_STATE_HOME.

    That means, by default "~/.local/state/<AppName>".
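# Illustrative only: user_state_dir() exists in the copy of appdirs dumped here,
# but older releases of the library do not export it; the app/author names are
# placeholders.
import appdirs

# ~/.local/state/MyApp on Unix, or under $XDG_STATE_HOME if that is set.
print(appdirs.user_state_dir("MyApp", "MyCompany"))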
    rrNZXDG_STATE_HOMEz~/.local/state)rr)rrrrr
rr)rrrrr
rrr�user_state_dir9sr*cCs�tdkr tjjtjjd�|�}nNtdkrLt|||�}d}|rntjj|d�}n"t|||�}d}|rntjj|d�}|r�|r�tjj||�}|S)a�Return full path to the user-specific log dir for this application.

        "appname" is the name of application.
            If None, just the system directory is returned.
        "appauthor" (only used on Windows) is the name of the
            appauthor or distributing body for this application. Typically
            it is the owning company name. This falls back to appname. You may
            pass False to disable it.
        "version" is an optional version path element to append to the
            path. You might want to use this if you want multiple versions
            of your app to be able to run independently. If used, this
            would typically be "<major>.<minor>".
            Only applied when appname is present.
        "opinion" (boolean) can be False to disable the appending of
            "Logs" to the base app data dir for Windows, and "log" to the
            base cache dir for Unix. See discussion below.

    Typical user log directories are:
        Mac OS X:   ~/Library/Logs/<AppName>
        Unix:       ~/.cache/<AppName>/log  # or under $XDG_CACHE_HOME if defined
        Win XP:     C:\Documents and Settings\<username>\Local Settings\Application Data\<AppAuthor>\<AppName>\Logs
        Vista:      C:\Users\<username>\AppData\Local\<AppAuthor>\<AppName>\Logs

    On Windows the only suggestion in the MSDN docs is that local settings
    go in the `CSIDL_LOCAL_APPDATA` directory. (Note: I'm interested in
    examples of what some Windows apps use for a logs dir.)

    OPINION: This function appends "Logs" to the `CSIDL_LOCAL_APPDATA`
    value for Windows and appends "log" to the user cache dir for Unix.
    This can be disabled with the `opinion=False` option.
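# A hedged sketch of user_log_dir(); the version argument is optional and the
# names below are invented for illustration.
import os
import appdirs

log_dir = appdirs.user_log_dir("MyApp", "MyCompany", version="1.0")
os.makedirs(log_dir, exist_ok=True)  # ~/.cache/MyApp/1.0/log on Unix, ...\Logs on Windows
print(log_dir)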
    rz~/Library/LogsrFZLogs�log)rrr
rrrr))rrrr(r
rrr�user_log_dircs  
r,c@sneZdZdZddd�Zedd��Zedd	��Zed
d��Zedd
��Z	edd��Z
edd��Zedd��ZdS)�AppDirsz1Convenience wrapper for getting application dirs.NFcCs"||_||_||_||_||_dS)N)rrrrr#)�selfrrrrr#rrr�__init__�s
zAppDirs.__init__cCst|j|j|j|jd�S)N)rr)rrrrr)r.rrrr�s
zAppDirs.user_data_dircCst|j|j|j|jd�S)N)rr#)r%rrrr#)r.rrrr%�s
zAppDirs.site_data_dircCst|j|j|j|jd�S)N)rr)r&rrrr)r.rrrr&�s
zAppDirs.user_config_dircCst|j|j|j|jd�S)N)rr#)r'rrrr#)r.rrrr'�s
zAppDirs.site_config_dircCst|j|j|jd�S)N)r)r)rrr)r.rrrr)�s
zAppDirs.user_cache_dircCst|j|j|jd�S)N)r)r*rrr)r.rrrr*�s
zAppDirs.user_state_dircCst|j|j|jd�S)N)r)r,rrr)r.rrrr,�s
zAppDirs.user_log_dir)NNNFF)
�__name__�
__module__�__qualname__�__doc__r/�propertyrr%r&r'r)r*r,rrrrr-�s
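# A minimal sketch of the AppDirs convenience wrapper defined above; "SuperApp"
# and "Acme" are made-up values, and the properties shown are the ones this
# dumped version of the class exposes.
import appdirs

dirs = appdirs.AppDirs("SuperApp", "Acme", version="1.0")
print(dirs.user_data_dir)    # per-user data
print(dirs.user_config_dir)  # per-user config
print(dirs.user_cache_dir)   # per-user cache
print(dirs.site_config_dir)  # system-wide config
print(dirs.user_log_dir)     # per-user logs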
r-cCsHtrddl}nddl}dddd�|}|j|jd�}|j||�\}}|S)z�This is a fallback technique at best. I'm not sure if using the
    registry for this guarantees us the correct answer for all CSIDL_*
    names.
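# A standalone sketch of the registry fallback described above. It only does
# anything on Windows and reads the same "Shell Folders" key the dumped helper
# uses; "AppData" is one of the folder names that helper resolves.
import sys

if sys.platform == "win32":
    import winreg
    key = winreg.OpenKey(
        winreg.HKEY_CURRENT_USER,
        r"Software\Microsoft\Windows\CurrentVersion\Explorer\Shell Folders")
    appdata, _value_type = winreg.QueryValueEx(key, "AppData")
    winreg.CloseKey(key)
    print(appdata)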
    rNZAppDatazCommon AppDataz
Local AppData)r	rr
z@Software\Microsoft\Windows\CurrentVersion\Explorer\Shell Folders)�PY3�winreg�_winreg�OpenKey�HKEY_CURRENT_USERZQueryValueEx)�
csidl_namer7Zshell_folder_name�key�dir�typerrr�_get_win_folder_from_registry�s
r>cCs�ddlm}m}|jdt||�dd�}y`t|�}d}x|D]}t|�dkr:d}Pq:W|r�yddl}|j|�}Wnt	k
r�YnXWnt
k
r�YnX|S)Nr)�shellcon�shellF�T)�win32com.shellr?r@�SHGetFolderPath�getattr�unicode�ord�win32api�GetShortPathName�ImportError�UnicodeError)r:r?r@r<�
has_high_char�crGrrr�_get_win_folder_with_pywin32�s$

rMcCs�ddl}dddd�|}|jd�}|jjjd|dd|�d}x|D]}t|�dkrBd	}PqBW|r�|jd�}|jjj|j|d�r�|}|jS)
Nr��#�)r	rr
iFrAT)	�ctypesZcreate_unicode_buffer�windllZshell32ZSHGetFolderPathWrFZkernel32ZGetShortPathNameW�value)r:rQZcsidl_const�bufrKrLZbuf2rrr�_get_win_folder_with_ctypes�s"


rUcCs�ddl}ddlm}ddlm}|jjd}|jd|�}|jj	}|j
dt|j|�d|jj
|�|jj|j��jd�}d}x|D]}	t|	�dkr~d	}Pq~W|r�|jd|�}|jj	}
|
j|||�r�|jj|j��jd�}|S)
Nr)�jna)r�rL�FrAT)�arrayZcom.sunrVZcom.sun.jna.platformrZWinDefZMAX_PATHZzerosZShell32ZINSTANCErCrDZShlObjZSHGFP_TYPE_CURRENTZNativeZtoStringZtostringrrFZKernel32rH)r:rYrVrZbuf_sizerTr@r<rKrLZkernelrrr�_get_win_folder_with_jnas&
rZ)rR�__main__ZMyAppZ	MyCompanyz-- app dirs %s --z%-- app dirs (with optional 'version')z1.0)rz%s: %sz)
-- app dirs (without optional 'version')z+
-- app dirs (without optional 'appauthor')z(
-- app dirs (with disabled 'appauthor'))r)rrr)NNNF)NNNF)NNNF)NNNF)NNNT)NNNF)NNNT)rr&r)r*r,r%r')-r3Z__version_info__r�map�str�__version__�sysr�version_infor5rE�platform�
startswithZjava_verZos_namerrr%r&r'r)r*r,�objectr-r>rMrUrZrBZwin32comrrIrQrRZcom.sun.jnaZcomr0rrZprops�print�dirsZproprDrrrr�<module>s�


7
B
(
3
9
*
30






site-packages/__pycache__/six.cpython-36.opt-1.pyc000064400000060674147511334560015712 0ustar003

��s`�x�K@s�dZddlmZddlZddlZddlZddlZddlZdZdZ	ej
ddkZej
ddkZej
dd��d�kZ
er�efZefZefZeZeZejZn�efZeefZeejfZeZeZejjd	�r�e�d��ZnLGdd
�d
e�Z ye!e ��Wn e"k
�re�d��ZYnXe�d��Z[ dd�Z#dd�Z$Gdd�de�Z%Gdd�de%�Z&Gdd�dej'�Z(Gdd�de%�Z)Gdd�de�Z*e*e+�Z,Gdd�de(�Z-e)ddd d!�e)d"d#d$d%d"�e)d&d#d#d'd&�e)d(d)d$d*d(�e)d+d)d,�e)d-d#d$d.d-�e)d/d0d0d1d/�e)d2d0d0d/d2�e)d3d4d5�e)d6d)d$d7d6�e)d8d)e
�r&d9nd:d;�e)d<d)d=�e)d>d?d@dA�e)d!d!d �e)dBdBdC�e)dDdDdC�e)dEdEdC�e)d7d)d$d7d6�e)dFd#d$dGdF�e)dHd#d#dIdH�e&d$d)�e&dJdK�e&dLdM�e&dNdOdP�e&dQdRdQ�e&dSdTdU�e&dVdWdX�e&dYdZd[�e&d\d]d^�e&d_d`da�e&dbdcdd�e&dedfdg�e&dhdidj�e&dkdldm�e&dndodp�e&dqdqdr�e&dsdsdr�e&dtdtdr�e&dududv�e&dwdx�e&dydz�e&d{d|�e&d}d~d}�e&dd��e&d�d�d��e&d�d�d��e&d�d�d��e&d�d�d��e&d�d�d��e&d�d�d��e&d�d�d��e&d�d�d��e&d�d�d��e&d�d�d��e&d�d�d��e&d�d�d��e&d�d�d��e&d�d�d��e&d�e+d�d��e&d�e+d�d��e&d�e+d�e+d��e&d�d�d��e&d�d�d��e&d�d�d��g@Z.ejd�k�rne.e&d�d��g7Z.x:e.D]2Z/e0e-e/j1e/�e2e/e&��rte,j3e/d�e/j1��qtW[/e.e-_.e-e+d��Z4e,j3e4d��Gd�d��d�e(�Z5e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)dAd�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d�d�dσe)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��gZ6xe6D]Z/e0e5e/j1e/��q�W[/e6e5_.e,j3e5e+d��d�dۃGd�d݄d�e(�Z7e)d�d�d��e)d�d�d��e)d�d�d��gZ8xe8D]Z/e0e7e/j1e/��qPW[/e8e7_.e,j3e7e+d��d�d�Gd�d�d�e(�Z9e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)�dd�d�e)�dd�d�e)�dd�d�e)�dd�d�e)�dd�d�e)�dd�d�e)�dd�d�e)�dd�d�e)�dd�d�e)�d	d�d�e)�d
d�d�g#Z:xe:D]Z/e0e9e/j1e/��qW[/e:e9_.e,j3e9e+�d��d�d
�G�d�d��de(�Z;e)�dd��d�e)�dd��d�e)�dd��d�e)�dd��d�gZ<xe<D]Z/e0e;e/j1e/��q�W[/e<e;_.e,j3e;e+�d��d�d�G�d�d��de(�Z=e)�dd�d��gZ>xe>D]Z/e0e=e/j1e/��	qW[/e>e=_.e,j3e=e+�d��d�d�G�d�d��dej'�Z?e,j3e?e+d���d ��d!�d"�Z@�d#�d$�ZAe�	r��d%ZB�d&ZC�d'ZD�d(ZE�d)ZF�d*ZGn$�d+ZB�d,ZC�d-ZD�d.ZE�d/ZF�d0ZGyeHZIWn"eJk
�
r�d1�d2�ZIYnXeIZHyeKZKWn"eJk
�
r<�d3�d4�ZKYnXe�
rh�d5�d6�ZLejMZN�d7�d8�ZOeZPn>�d9�d6�ZL�d:�d;�ZN�d<�d8�ZOG�d=�d>��d>e�ZPeKZKe#eL�d?�ejQeB�ZRejQeC�ZSejQeD�ZTejQeE�ZUejQeF�ZVejQeG�ZWe�rJ�d@�dA�ZX�dB�dC�ZY�dD�dE�ZZ�dF�dG�Z[ej\�dH�Z]ej\�dI�Z^ej\�dJ�Z_nT�dK�dA�ZX�dL�dC�ZY�dM�dE�ZZ�dN�dG�Z[ej\�dO�Z]ej\�dP�Z^ej\�dQ�Z_e#eX�dR�e#eY�dS�e#eZ�dT�e#e[�dU�e�rb�dV�dW�Z`�dX�dY�ZaebZcddldZdedje�dZ�jfZg[dejhd�ZiejjZkelZmddlnZnenjoZoenjpZp�d[Zqej
d
d
k�rT�d\Zr�d]Zsn�d^Zr�d_Zsnj�d`�dW�Z`�da�dY�ZaecZcebZg�db�dc�Zi�dd�de�Zkejtejuev�ZmddloZoeojoZoZp�dfZq�d\Zr�d]Zse#e`�dg�e#ea�dh��di�d[�Zw�dj�d^�Zx�dk�d_�Zye�
r.eze4j{�dl�Z|�d��dm�dn�Z}n�d��do�dp�Z|e|�dq�ej
dd��d�k�
rje|�dr�n.ej
dd��d�k�
r�e|�ds�n�dt�du�Z~eze4j{�dvd�Zedk�
r��dw�dx�Zej
dd��d�k�
r�eZ��dy�dx�Ze#e}�dz�ej
dd��d�k�rej�ej�f�d{�d|�Z�nej�Z��d}�d~�Z��d�d��Z��d��d��Z�gZ�e+Z�e��j��d��dk	�rjge�_�ej��r�x>e�ej��D]0\Z�Z�ee��j+dk�r~e�j1e+k�r~ej�e�=P�q~W[�[�ej�j�e,�dS(�z6Utilities for writing code that runs on Python 2 and 3�)�absolute_importNz'Benjamin Peterson <benjamin@python.org>z1.11.0����java��c@seZdZdd�ZdS)�XcCsdS)Nrrl�)�selfr
r
�/usr/lib/python3.6/six.py�__len__>sz	X.__len__N)�__name__�
__module__�__qualname__r
r
r
r
rr	<sr	�?cCs
||_dS)z Add documentation to a function.N)�__doc__)�func�docr
r
r�_add_docKsrcCst|�tj|S)z7Import module, returning the module after the last dot.)�
__import__�sys�modules)�namer
r
r�_import_modulePsrc@seZdZdd�Zdd�ZdS)�
_LazyDescrcCs
||_dS)N)r)rrr
r
r�__init__Xsz_LazyDescr.__init__cCsB|j�}t||j|�yt|j|j�Wntk
r<YnX|S)N)�_resolve�setattrr�delattr�	__class__�AttributeError)r�obj�tp�resultr
r
r�__get__[sz_LazyDescr.__get__N)rrrrr%r
r
r
rrVsrcs.eZdZd�fdd�	Zdd�Zdd�Z�ZS)	�MovedModuleNcs2tt|�j|�tr(|dkr |}||_n||_dS)N)�superr&r�PY3�mod)rr�old�new)r r
rriszMovedModule.__init__cCs
t|j�S)N)rr))rr
r
rrrszMovedModule._resolvecCs"|j�}t||�}t|||�|S)N)r�getattrr)r�attr�_module�valuer
r
r�__getattr__us
zMovedModule.__getattr__)N)rrrrrr0�
__classcell__r
r
)r rr&gs	r&cs(eZdZ�fdd�Zdd�ZgZ�ZS)�_LazyModulecstt|�j|�|jj|_dS)N)r'r2rr r)rr)r r
rr~sz_LazyModule.__init__cCs ddg}|dd�|jD�7}|S)NrrcSsg|]
}|j�qSr
)r)�.0r-r
r
r�
<listcomp>�sz'_LazyModule.__dir__.<locals>.<listcomp>)�_moved_attributes)rZattrsr
r
r�__dir__�sz_LazyModule.__dir__)rrrrr6r5r1r
r
)r rr2|sr2cs&eZdZd�fdd�	Zdd�Z�ZS)�MovedAttributeNcsdtt|�j|�trH|dkr |}||_|dkr@|dkr<|}n|}||_n||_|dkrZ|}||_dS)N)r'r7rr(r)r-)rrZold_modZnew_modZold_attrZnew_attr)r r
rr�szMovedAttribute.__init__cCst|j�}t||j�S)N)rr)r,r-)r�moduler
r
rr�s
zMovedAttribute._resolve)NN)rrrrrr1r
r
)r rr7�sr7c@sVeZdZdZdd�Zdd�Zdd�Zdd	d
�Zdd�Zd
d�Z	dd�Z
dd�ZeZdS)�_SixMetaPathImporterz�
    A meta path importer to import six.moves and its submodules.

    This class implements a PEP302 finder and loader. It should be compatible
    with Python 2.5 and all existing versions of Python 3.
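# An illustrative use of the six.moves namespace that this meta path importer
# makes importable; range, zip and configparser are documented moved names in
# six, resolved lazily to the right module on either Python major version.
from six.moves import range, zip, configparser

print(list(range(3)))             # builtins.range on Python 3, xrange on Python 2
print(list(zip("ab", "cd")))      # builtins.zip on Python 3, itertools.izip on Python 2
print(configparser.ConfigParser)  # configparser / ConfigParser module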
    cCs||_i|_dS)N)r�
known_modules)rZsix_module_namer
r
rr�sz_SixMetaPathImporter.__init__cGs&x |D]}||j|jd|<qWdS)N�.)r:r)rr)Z	fullnames�fullnamer
r
r�_add_module�s
z _SixMetaPathImporter._add_modulecCs|j|jd|S)Nr;)r:r)rr<r
r
r�_get_module�sz _SixMetaPathImporter._get_moduleNcCs||jkr|SdS)N)r:)rr<�pathr
r
r�find_module�s
z _SixMetaPathImporter.find_modulecCs0y
|j|Stk
r*td|��YnXdS)Nz!This loader does not know module )r:�KeyError�ImportError)rr<r
r
rZ__get_module�s
z!_SixMetaPathImporter.__get_modulecCsRy
tj|Stk
rYnX|j|�}t|t�r>|j�}n||_|tj|<|S)N)rrrA� _SixMetaPathImporter__get_module�
isinstancer&r�
__loader__)rr<r)r
r
r�load_module�s




z _SixMetaPathImporter.load_modulecCst|j|�d�S)z�
        Return true, if the named module is a package.

        We need this method to get correct spec objects with
        Python 3.4 (see PEP451)
        �__path__)�hasattrrC)rr<r
r
r�
is_package�sz_SixMetaPathImporter.is_packagecCs|j|�dS)z;Return None

        Required, if is_package is implementedN)rC)rr<r
r
r�get_code�s
z_SixMetaPathImporter.get_code)N)
rrrrrr=r>r@rCrFrIrJ�
get_sourcer
r
r
rr9�s
	r9c@seZdZdZgZdS)�_MovedItemszLazy loading of moved objectsN)rrrrrGr
r
r
rrL�srLZ	cStringIO�io�StringIO�filter�	itertools�builtinsZifilter�filterfalseZifilterfalse�inputZ__builtin__Z	raw_input�internr�map�imap�getcwd�osZgetcwdu�getcwdbZ	getoutputZcommands�
subprocess�rangeZxrangeZ
reload_module�	importlibZimp�reload�reduce�	functoolsZshlex_quoteZpipesZshlexZquote�UserDict�collections�UserList�
UserString�zipZizip�zip_longestZizip_longestZconfigparserZConfigParser�copyregZcopy_regZdbm_gnuZgdbmzdbm.gnuZ
_dummy_threadZdummy_threadZhttp_cookiejarZ	cookielibzhttp.cookiejarZhttp_cookiesZCookiezhttp.cookiesZ
html_entitiesZhtmlentitydefsz
html.entitiesZhtml_parserZ
HTMLParserzhtml.parserZhttp_clientZhttplibzhttp.clientZemail_mime_basezemail.MIMEBasezemail.mime.baseZemail_mime_imagezemail.MIMEImagezemail.mime.imageZemail_mime_multipartzemail.MIMEMultipartzemail.mime.multipartZemail_mime_nonmultipartzemail.MIMENonMultipartzemail.mime.nonmultipartZemail_mime_textzemail.MIMETextzemail.mime.textZBaseHTTPServerzhttp.serverZ
CGIHTTPServerZSimpleHTTPServerZcPickle�pickleZqueueZQueue�reprlib�reprZsocketserverZSocketServer�_threadZthreadZtkinterZTkinterZtkinter_dialogZDialogztkinter.dialogZtkinter_filedialogZ
FileDialogztkinter.filedialogZtkinter_scrolledtextZScrolledTextztkinter.scrolledtextZtkinter_simpledialogZSimpleDialogztkinter.simpledialogZtkinter_tixZTixztkinter.tixZtkinter_ttkZttkztkinter.ttkZtkinter_constantsZTkconstantsztkinter.constantsZtkinter_dndZTkdndztkinter.dndZtkinter_colorchooserZtkColorChooserztkinter.colorchooserZtkinter_commondialogZtkCommonDialogztkinter.commondialogZtkinter_tkfiledialogZtkFileDialogZtkinter_fontZtkFontztkinter.fontZtkinter_messageboxZtkMessageBoxztkinter.messageboxZtkinter_tksimpledialogZtkSimpleDialogZurllib_parsez.moves.urllib_parsezurllib.parseZurllib_errorz.moves.urllib_errorzurllib.errorZurllibz
.moves.urllibZurllib_robotparser�robotparserzurllib.robotparserZ
xmlrpc_clientZ	xmlrpclibz
xmlrpc.clientZ
xmlrpc_serverZSimpleXMLRPCServerz
xmlrpc.serverZwin32�winreg�_winregzmoves.z.moves�movesc@seZdZdZdS)�Module_six_moves_urllib_parsez7Lazy loading of moved objects in six.moves.urllib_parseN)rrrrr
r
r
rroBsroZParseResultZurlparseZSplitResultZparse_qsZ	parse_qslZ	urldefragZurljoinZurlsplitZ
urlunparseZ
urlunsplitZ
quote_plusZunquoteZunquote_plusZunquote_to_bytesZ	urlencodeZ
splitqueryZsplittagZ	splituserZ
splitvalueZ
uses_fragmentZuses_netlocZuses_paramsZ
uses_queryZ
uses_relativezmoves.urllib_parsezmoves.urllib.parsec@seZdZdZdS)�Module_six_moves_urllib_errorz7Lazy loading of moved objects in six.moves.urllib_errorN)rrrrr
r
r
rrplsrpZURLErrorZurllib2Z	HTTPErrorZContentTooShortErrorz.moves.urllib.errorzmoves.urllib_errorzmoves.urllib.errorc@seZdZdZdS)�Module_six_moves_urllib_requestz9Lazy loading of moved objects in six.moves.urllib_requestN)rrrrr
r
r
rrq�srqZurlopenzurllib.requestZinstall_openerZbuild_openerZpathname2urlZurl2pathnameZ
getproxiesZRequestZOpenerDirectorZHTTPDefaultErrorHandlerZHTTPRedirectHandlerZHTTPCookieProcessorZProxyHandlerZBaseHandlerZHTTPPasswordMgrZHTTPPasswordMgrWithDefaultRealmZAbstractBasicAuthHandlerZHTTPBasicAuthHandlerZProxyBasicAuthHandlerZAbstractDigestAuthHandlerZHTTPDigestAuthHandlerZProxyDigestAuthHandlerZHTTPHandlerZHTTPSHandlerZFileHandlerZ
FTPHandlerZCacheFTPHandlerZUnknownHandlerZHTTPErrorProcessorZurlretrieveZ
urlcleanupZ	URLopenerZFancyURLopenerZproxy_bypassZparse_http_listZparse_keqv_listz.moves.urllib.requestzmoves.urllib_requestzmoves.urllib.requestc@seZdZdZdS)� Module_six_moves_urllib_responsez:Lazy loading of moved objects in six.moves.urllib_responseN)rrrrr
r
r
rrr�srrZaddbasezurllib.responseZaddclosehookZaddinfoZ
addinfourlz.moves.urllib.responsezmoves.urllib_responsezmoves.urllib.responsec@seZdZdZdS)�#Module_six_moves_urllib_robotparserz=Lazy loading of moved objects in six.moves.urllib_robotparserN)rrrrr
r
r
rrs�srsZRobotFileParserz.moves.urllib.robotparserzmoves.urllib_robotparserzmoves.urllib.robotparserc@sNeZdZdZgZejd�Zejd�Zejd�Z	ejd�Z
ejd�Zdd�Zd	S)
�Module_six_moves_urllibzICreate a six.moves.urllib namespace that resembles the Python 3 namespacezmoves.urllib_parsezmoves.urllib_errorzmoves.urllib_requestzmoves.urllib_responsezmoves.urllib_robotparsercCsdddddgS)N�parse�error�request�responserkr
)rr
r
rr6�szModule_six_moves_urllib.__dir__N)
rrrrrG�	_importerr>rurvrwrxrkr6r
r
r
rrt�s




rtzmoves.urllibcCstt|j|�dS)zAdd an item to six.moves.N)rrLr)Zmover
r
r�add_move�srzcCsXytt|�WnDtk
rRytj|=Wn"tk
rLtd|f��YnXYnXdS)zRemove item from six.moves.zno such move, %rN)rrLr!rn�__dict__rA)rr
r
r�remove_move�sr|�__func__�__self__�__closure__�__code__�__defaults__�__globals__�im_funcZim_selfZfunc_closureZ	func_codeZ
func_defaultsZfunc_globalscCs|j�S)N)�next)�itr
r
r�advance_iteratorsr�cCstdd�t|�jD��S)Ncss|]}d|jkVqdS)�__call__N)r{)r3�klassr
r
r�	<genexpr>szcallable.<locals>.<genexpr>)�any�type�__mro__)r"r
r
r�callablesr�cCs|S)Nr
)�unboundr
r
r�get_unbound_functionsr�cCs|S)Nr
)r�clsr
r
r�create_unbound_method#sr�cCs|jS)N)r�)r�r
r
rr�(scCstj|||j�S)N)�types�
MethodTyper )rr"r
r
r�create_bound_method+sr�cCstj|d|�S)N)r�r�)rr�r
r
rr�.sc@seZdZdd�ZdS)�IteratorcCst|�j|�S)N)r��__next__)rr
r
rr�3sz
Iterator.nextN)rrrr�r
r
r
rr�1sr�z3Get the function out of a possibly unbound functioncKst|jf|��S)N)�iter�keys)�d�kwr
r
r�iterkeysDsr�cKst|jf|��S)N)r��values)r�r�r
r
r�
itervaluesGsr�cKst|jf|��S)N)r��items)r�r�r
r
r�	iteritemsJsr�cKst|jf|��S)N)r�Zlists)r�r�r
r
r�	iterlistsMsr�r�r�r�cKs|jf|�S)N)r�)r�r�r
r
rr�VscKs|jf|�S)N)r�)r�r�r
r
rr�YscKs|jf|�S)N)r�)r�r�r
r
rr�\scKs|jf|�S)N)r�)r�r�r
r
rr�_s�viewkeys�
viewvalues�	viewitemsz1Return an iterator over the keys of a dictionary.z3Return an iterator over the values of a dictionary.z?Return an iterator over the (key, value) pairs of a dictionary.zBReturn an iterator over the (key, [values]) pairs of a dictionary.cCs
|jd�S)Nzlatin-1)�encode)�sr
r
r�bqsr�cCs|S)Nr
)r�r
r
r�utsr�z>B�assertCountEqualZassertRaisesRegexpZassertRegexpMatches�assertRaisesRegex�assertRegexcCs|S)Nr
)r�r
r
rr��scCst|jdd�d�S)Nz\\z\\\\Zunicode_escape)�unicode�replace)r�r
r
rr��scCst|d�S)Nr)�ord)Zbsr
r
r�byte2int�sr�cCst||�S)N)r�)Zbuf�ir
r
r�
indexbytes�sr�ZassertItemsEqualzByte literalzText literalcOst|t�||�S)N)r,�_assertCountEqual)r�args�kwargsr
r
rr��scOst|t�||�S)N)r,�_assertRaisesRegex)rr�r�r
r
rr��scOst|t�||�S)N)r,�_assertRegex)rr�r�r
r
rr��s�execc
Cs:z*|dkr|�}|j|k	r$|j|��|�Wdd}d}XdS)N)�
__traceback__�with_traceback)r#r/�tbr
r
r�reraise�s

r�cCsB|dkr*tjd�}|j}|dkr&|j}~n|dkr6|}td�dS)zExecute code in a namespace.Nrzexec _code_ in _globs_, _locs_)r�	_getframe�	f_globals�f_localsr�)Z_code_Z_globs_Z_locs_�framer
r
r�exec_�s
r�zedef reraise(tp, value, tb=None):
    try:
        raise tp, value, tb
    finally:
        tb = None
z�def raise_from(value, from_value):
    try:
        if from_value is None:
            raise value
        raise value from from_value
    finally:
        value = None
zrdef raise_from(value, from_value):
    try:
        raise value from from_value
    finally:
        value = None
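# A small sketch of the raise_from()/reraise() helpers whose version-specific
# source strings appear above; the KeyError/ValueError pairing is invented for
# illustration, and the exception chaining is only visible on Python 3.
import six

try:
    try:
        {}["missing"]
    except KeyError as exc:
        six.raise_from(ValueError("bad lookup"), exc)
except ValueError as exc:
    print(repr(exc))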
cCs|�dS)Nr
)r/Z
from_valuer
r
r�
raise_from�sr��printc
s6|jdtj���dkrdS�fdd�}d}|jdd�}|dk	r`t|t�rNd}nt|t�s`td��|jd	d�}|dk	r�t|t�r�d}nt|t�s�td
��|r�td��|s�x|D]}t|t�r�d}Pq�W|r�td�}td
�}nd}d
}|dkr�|}|dk�r�|}x,t|�D] \}	}|	�r||�||��qW||�dS)z4The new-style print function for Python 2.4 and 2.5.�fileNcsdt|t�st|�}t�t�rVt|t�rV�jdk	rVt�dd�}|dkrHd}|j�j|�}�j|�dS)N�errors�strict)	rD�
basestring�strr�r��encodingr,r��write)�datar�)�fpr
rr��s



zprint_.<locals>.writeF�sepTzsep must be None or a string�endzend must be None or a stringz$invalid keyword arguments to print()�
� )�popr�stdoutrDr�r��	TypeError�	enumerate)
r�r�r�Zwant_unicoder�r��arg�newlineZspacer�r
)r�r�print_�sL







r�cOs<|jdtj�}|jdd�}t||�|r8|dk	r8|j�dS)Nr��flushF)�getrr�r��_printr�)r�r�r�r�r
r
rr�s

zReraise an exception.cs���fdd�}|S)Ncstj����|�}�|_|S)N)r_�wraps�__wrapped__)�f)�assigned�updated�wrappedr
r�wrapper*szwraps.<locals>.wrapperr
)r�r�r�r�r
)r�r�r�rr�(sr�cs&G��fdd�dt�}tj|dfi�S)z%Create a base class with a metaclass.cs,eZdZ��fdd�Ze��fdd��ZdS)z!with_metaclass.<locals>.metaclasscs�|�|�S)Nr
)r�r�
this_basesr�)�bases�metar
r�__new__:sz)with_metaclass.<locals>.metaclass.__new__cs�j|��S)N)�__prepare__)r�rr�)r�r�r
rr�=sz-with_metaclass.<locals>.metaclass.__prepare__N)rrrr��classmethodr�r
)r�r�r
r�	metaclass8sr�Ztemporary_class)r�r�)r�r�r�r
)r�r�r�with_metaclass3sr�cs�fdd�}|S)z6Class decorator for creating a class with a metaclass.csl|jj�}|jd�}|dk	rDt|t�r,|g}x|D]}|j|�q2W|jdd�|jdd��|j|j|�S)N�	__slots__r{�__weakref__)r{�copyr�rDr�r�r�	__bases__)r�Z	orig_vars�slotsZ	slots_var)r�r
rr�Es



zadd_metaclass.<locals>.wrapperr
)r�r�r
)r�r�
add_metaclassCsr�cCs2tr.d|jkrtd|j��|j|_dd�|_|S)a
    A decorator that defines __unicode__ and __str__ methods under Python 2.
    Under Python 3 it does nothing.

    To support Python 2 and 3 with a single code base, define a __str__ method
    returning text and apply this decorator to the class.
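# A usage sketch for the decorator documented above; the Tag class is invented
# for illustration.
import six

@six.python_2_unicode_compatible
class Tag(object):
    def __init__(self, name):
        self.name = name

    def __str__(self):
        return u"tag:" + self.name

print(Tag(u"caf\u00e9"))  # text on Python 3; encoded to UTF-8 bytes by the shim on Python 2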
    �__str__zY@python_2_unicode_compatible cannot be applied to %s because it doesn't define __str__().cSs|j�jd�S)Nzutf-8)�__unicode__r�)rr
r
r�<lambda>asz-python_2_unicode_compatible.<locals>.<lambda>)�PY2r{�
ValueErrorrr�r�)r�r
r
r�python_2_unicode_compatibleSs


r��__spec__)rrli���li���ll����)N)NN)rr)rr)rr)rr)�rZ
__future__rr_rP�operatorrr��
__author__�__version__�version_infor�r(ZPY34r�Zstring_types�intZ
integer_typesr�Zclass_typesZ	text_type�bytesZbinary_type�maxsizeZMAXSIZEr�ZlongZ	ClassTyper��platform�
startswith�objectr	�len�
OverflowErrorrrrr&�
ModuleTyper2r7r9rryrLr5r-rrrDr=rnroZ_urllib_parse_moved_attributesrpZ_urllib_error_moved_attributesrqZ _urllib_request_moved_attributesrrZ!_urllib_response_moved_attributesrsZ$_urllib_robotparser_moved_attributesrtrzr|Z
_meth_funcZ
_meth_selfZ
_func_closureZ
_func_codeZ_func_defaultsZ
_func_globalsr�r��	NameErrorr�r�r�r�r�r��
attrgetterZget_method_functionZget_method_selfZget_function_closureZget_function_codeZget_function_defaultsZget_function_globalsr�r�r�r��methodcallerr�r�r�r�r��chrZunichr�struct�Struct�packZint2byte�
itemgetterr��getitemr�r�Z	iterbytesrMrN�BytesIOr�r�r��partialrVr�r�r�r�r,rQr�r�r�r�r��WRAPPER_ASSIGNMENTS�WRAPPER_UPDATESr�r�r�r�rG�__package__�globalsr�r��submodule_search_locations�	meta_pathr�r�Zimporter�appendr
r
r
r�<module>s�

>







































































































5site-packages/__pycache__/appdirs.cpython-36.pyc000064400000050253147511334560015602 0ustar003

I�5]g`�@s�dZd4Zdjeee��ZddlZddlZejddkZ	e	r>eZ
ejjd�r�ddlZej
�ddZejd�rrd	Zq�ejd
�r�dZq�dZnejZd5dd�Zd6dd�Zd7dd�Zd8dd�Zd9dd�Zd:dd�Zd;dd�ZGdd�de�Zdd �Zd!d"�Zd#d$�Zd%d&�Zed	k�r�yddlZeZWnne k
�r�ydd'l!m"Z"eZWnBe k
�r�yddl#Z$eZWne k
�r�eZYnXYnXYnXe%d(k�r�d)Z&d*Z'd<Z(e)d+e�e)d,�ee&e'd-d.�Z*x$e(D]Z+e)d/e+e,e*e+�f��q�We)d0�ee&e'�Z*x$e(D]Z+e)d/e+e,e*e+�f��qWe)d1�ee&�Z*x$e(D]Z+e)d/e+e,e*e+�f��q:We)d2�ee&d
d3�Z*x$e(D]Z+e)d/e+e,e*e+�f��qtWdS)=zyUtilities for determining application-specific dirs.

See <http://github.com/ActiveState/appdirs> for details and usage.
����.�N�javaZWindows�win32ZMac�darwinZlinux2FcCs�tdkr^|dkr|}|rdpd}tjjt|��}|r�|dk	rNtjj|||�}q�tjj||�}nNtdkr�tjjd�}|r�tjj||�}n&tjdtjjd	��}|r�tjj||�}|r�|r�tjj||�}|S)
aJReturn full path to the user-specific data dir for this application.

        "appname" is the name of application.
            If None, just the system directory is returned.
        "appauthor" (only used on Windows) is the name of the
            appauthor or distributing body for this application. Typically
            it is the owning company name. This falls back to appname. You may
            pass False to disable it.
        "version" is an optional version path element to append to the
            path. You might want to use this if you want multiple versions
            of your app to be able to run independently. If used, this
            would typically be "<major>.<minor>".
            Only applied when appname is present.
        "roaming" (boolean, default False) can be set True to use the Windows
            roaming appdata directory. That means that for users on a Windows
            network setup for roaming profiles, this user data will be
            sync'd on login. See
            <http://technet.microsoft.com/en-us/library/cc766489(WS.10).aspx>
            for a discussion of issues.

    Typical user data directories are:
        Mac OS X:               ~/Library/Application Support/<AppName>
        Unix:                   ~/.local/share/<AppName>    # or in $XDG_DATA_HOME, if defined
        Win XP (not roaming):   C:\Documents and Settings\<username>\Application Data\<AppAuthor>\<AppName>
        Win XP (roaming):       C:\Documents and Settings\<username>\Local Settings\Application Data\<AppAuthor>\<AppName>
        Win 7  (not roaming):   C:\Users\<username>\AppData\Local\<AppAuthor>\<AppName>
        Win 7  (roaming):       C:\Users\<username>\AppData\Roaming\<AppAuthor>\<AppName>

    For Unix, we follow the XDG spec and support $XDG_DATA_HOME.
    That means, by default "~/.local/share/<AppName>".
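# An illustrative call matching the user_data_dir() docstring above; the app
# and author names are placeholders.
import appdirs

print(appdirs.user_data_dir("MyApp", "MyCompany"))                # ~/.local/share/MyApp on XDG systems
print(appdirs.user_data_dir("MyApp", "MyCompany", roaming=True))  # roaming AppData dir on Windows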
    rN�
CSIDL_APPDATA�CSIDL_LOCAL_APPDATAFrz~/Library/Application Support/Z
XDG_DATA_HOMEz~/.local/share)�system�os�path�normpath�_get_win_folder�join�
expanduser�getenv)�appname�	appauthor�version�roaming�constr
�r�/usr/lib/python3.6/appdirs.py�
user_data_dir,s& rcs
tdkrR|dkr�}tjjtd��}�r�|dk	rBtjj||��}q�tjj|��}n�tdkrztjjd�}�r�tjj|��}nttjdtjjdd	g��}d
d�|j	tj�D�}�r�|r�tjj�|���fdd�|D�}|r�tjj|�}n|d
}|S�o�|�rtjj||�}|S)aiReturn full path to the user-shared data dir for this application.

        "appname" is the name of application.
            If None, just the system directory is returned.
        "appauthor" (only used on Windows) is the name of the
            appauthor or distributing body for this application. Typically
            it is the owning company name. This falls back to appname. You may
            pass False to disable it.
        "version" is an optional version path element to append to the
            path. You might want to use this if you want multiple versions
            of your app to be able to run independently. If used, this
            would typically be "<major>.<minor>".
            Only applied when appname is present.
        "multipath" is an optional parameter only applicable to *nix
            which indicates that the entire list of data dirs should be
            returned. By default, the first item from XDG_DATA_DIRS is
            returned, or '/usr/local/share/<AppName>',
            if XDG_DATA_DIRS is not set

    Typical site data directories are:
        Mac OS X:   /Library/Application Support/<AppName>
        Unix:       /usr/local/share/<AppName> or /usr/share/<AppName>
        Win XP:     C:\Documents and Settings\All Users\Application Data\<AppAuthor>\<AppName>
        Vista:      (Fail! "C:\ProgramData" is a hidden *system* directory on Vista.)
        Win 7:      C:\ProgramData\<AppAuthor>\<AppName>   # Hidden, but writeable on Win 7.

    For Unix, this is using the $XDG_DATA_DIRS[0] default.

    WARNING: Do not use this on Windows. See the Vista-Fail note above for why.
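# A minimal sketch of site_data_dir(); multipath=True is the *nix variant the
# docstring above describes, and the names are placeholders.
import appdirs

print(appdirs.site_data_dir("MyApp", "MyCompany"))                  # /usr/local/share/MyApp by default
print(appdirs.site_data_dir("MyApp", "MyCompany", multipath=True))  # full $XDG_DATA_DIRS list, os.pathsep-joined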
    rN�CSIDL_COMMON_APPDATAFrz/Library/Application SupportZ
XDG_DATA_DIRSz/usr/local/sharez
/usr/sharecSs g|]}tjj|jtj���qSr)rr
r�rstrip�sep)�.0�xrrr�
<listcomp>�sz!site_data_dir.<locals>.<listcomp>csg|]}tjj|�g��qSr)rrr)rr)rrrr �sr)
rrr
rrrrr�pathsep�split)rrr�	multipathr
�pathlistr)rr�
site_data_dircs4
r%cCsXtdkrt||d|�}n&tjdtjjd��}|r>tjj||�}|rT|rTtjj||�}|S)a�Return full path to the user-specific config dir for this application.

        "appname" is the name of application.
            If None, just the system directory is returned.
        "appauthor" (only used on Windows) is the name of the
            appauthor or distributing body for this application. Typically
            it is the owning company name. This falls back to appname. You may
            pass False to disable it.
        "version" is an optional version path element to append to the
            path. You might want to use this if you want multiple versions
            of your app to be able to run independently. If used, this
            would typically be "<major>.<minor>".
            Only applied when appname is present.
        "roaming" (boolean, default False) can be set True to use the Windows
            roaming appdata directory. That means that for users on a Windows
            network setup for roaming profiles, this user data will be
            sync'd on login. See
            <http://technet.microsoft.com/en-us/library/cc766489(WS.10).aspx>
            for a discussion of issues.

    Typical user config directories are:
        Mac OS X:               same as user_data_dir
        Unix:                   ~/.config/<AppName>     # or in $XDG_CONFIG_HOME, if defined
        Win *:                  same as user_data_dir

    For Unix, we follow the XDG spec and support $XDG_CONFIG_HOME.
    That means, by default "~/.config/<AppName>".
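# Illustrative only; mirrors the user_config_dir() docstring above with a
# hypothetical settings file name.
import os
import appdirs

cfg_dir = appdirs.user_config_dir("MyApp", "MyCompany")  # ~/.config/MyApp on XDG systems
print(os.path.join(cfg_dir, "settings.ini"))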
    rrNZXDG_CONFIG_HOMEz	~/.config)rr)rrrrr
rr)rrrrr
rrr�user_config_dir�sr&cs�td	kr*t�|�}�r�|r�tjj||�}ndtjdd�}dd�|jtj�D�}�rt|rbtjj�|���fdd�|D�}|r�tjj|�}n|d}|S)
aReturn full path to the site-specific config dir for this application.

        "appname" is the name of application.
            If None, just the system directory is returned.
        "appauthor" (only used on Windows) is the name of the
            appauthor or distributing body for this application. Typically
            it is the owning company name. This falls back to appname. You may
            pass False to disable it.
        "version" is an optional version path element to append to the
            path. You might want to use this if you want multiple versions
            of your app to be able to run independently. If used, this
            would typically be "<major>.<minor>".
            Only applied when appname is present.
        "multipath" is an optional parameter only applicable to *nix
            which indicates that the entire list of config dirs should be
            returned. By default, the first item from XDG_CONFIG_DIRS is
            returned, or '/etc/xdg/<AppName>', if XDG_CONFIG_DIRS is not set

    Typical site config directories are:
        Mac OS X:   same as site_data_dir
        Unix:       /etc/xdg/<AppName> or $XDG_CONFIG_DIRS[i]/<AppName> for each value in
                    $XDG_CONFIG_DIRS
        Win *:      same as site_data_dir
        Vista:      (Fail! "C:\ProgramData" is a hidden *system* directory on Vista.)

    For Unix, this is using the $XDG_CONFIG_DIRS[0] default, if multipath=False

    WARNING: Do not use this on Windows. See the Vista-Fail note above for why.
    rrZXDG_CONFIG_DIRSz/etc/xdgcSs g|]}tjj|jtj���qSr)rr
rrr)rrrrrr �sz#site_config_dir.<locals>.<listcomp>csg|]}tjj|�g��qSr)rrr)rr)rrrr �sr)rr)rr%rr
rrr"r!)rrrr#r
r$r)rr�site_config_dir�s
r'TcCs�tdkrd|dkr|}tjjtd��}|r�|dk	rBtjj|||�}ntjj||�}|r�tjj|d�}nNtdkr�tjjd�}|r�tjj||�}n&tjdtjjd	��}|r�tjj||�}|r�|r�tjj||�}|S)
aReturn full path to the user-specific cache dir for this application.

        "appname" is the name of application.
            If None, just the system directory is returned.
        "appauthor" (only used on Windows) is the name of the
            appauthor or distributing body for this application. Typically
            it is the owning company name. This falls back to appname. You may
            pass False to disable it.
        "version" is an optional version path element to append to the
            path. You might want to use this if you want multiple versions
            of your app to be able to run independently. If used, this
            would typically be "<major>.<minor>".
            Only applied when appname is present.
        "opinion" (boolean) can be False to disable the appending of
            "Cache" to the base app data dir for Windows. See
            discussion below.

    Typical user cache directories are:
        Mac OS X:   ~/Library/Caches/<AppName>
        Unix:       ~/.cache/<AppName> (XDG default)
        Win XP:     C:\Documents and Settings\<username>\Local Settings\Application Data\<AppAuthor>\<AppName>\Cache
        Vista:      C:\Users\<username>\AppData\Local\<AppAuthor>\<AppName>\Cache

    On Windows the only suggestion in the MSDN docs is that local settings go in
    the `CSIDL_LOCAL_APPDATA` directory. This is identical to the non-roaming
    app data dir (the default returned by `user_data_dir` above). Apps typically
    put cache data somewhere *under* the given dir here. Some examples:
        ...\Mozilla\Firefox\Profiles\<ProfileName>\Cache
        ...\Acme\SuperApp\Cache\1.0
    OPINION: This function appends "Cache" to the `CSIDL_LOCAL_APPDATA` value.
    This can be disabled with the `opinion=False` option.
    rNr
FZCacherz~/Library/CachesZXDG_CACHE_HOMEz~/.cache)rrr
rrrrr)rrr�opinionr
rrr�user_cache_dirs(!r)cCsXtdkrt||d|�}n&tjdtjjd��}|r>tjj||�}|rT|rTtjj||�}|S)aReturn full path to the user-specific state dir for this application.

        "appname" is the name of application.
            If None, just the system directory is returned.
        "appauthor" (only used on Windows) is the name of the
            appauthor or distributing body for this application. Typically
            it is the owning company name. This falls back to appname. You may
            pass False to disable it.
        "version" is an optional version path element to append to the
            path. You might want to use this if you want multiple versions
            of your app to be able to run independently. If used, this
            would typically be "<major>.<minor>".
            Only applied when appname is present.
        "roaming" (boolean, default False) can be set True to use the Windows
            roaming appdata directory. That means that for users on a Windows
            network setup for roaming profiles, this user data will be
            sync'd on login. See
            <http://technet.microsoft.com/en-us/library/cc766489(WS.10).aspx>
            for a discussion of issues.

    Typical user state directories are:
        Mac OS X:  same as user_data_dir
        Unix:      ~/.local/state/<AppName>   # or in $XDG_STATE_HOME, if defined
        Win *:     same as user_data_dir

    For Unix, we follow this Debian proposal <https://wiki.debian.org/XDGBaseDirectorySpecification#state>
    to extend the XDG spec and support $XDG_STATE_HOME.

    That means, by default "~/.local/state/<AppName>".
    rrNZXDG_STATE_HOMEz~/.local/state)rr)rrrrr
rr)rrrrr
rrr�user_state_dir9sr*cCs�tdkr tjjtjjd�|�}nNtdkrLt|||�}d}|rntjj|d�}n"t|||�}d}|rntjj|d�}|r�|r�tjj||�}|S)a�Return full path to the user-specific log dir for this application.

        "appname" is the name of application.
            If None, just the system directory is returned.
        "appauthor" (only used on Windows) is the name of the
            appauthor or distributing body for this application. Typically
            it is the owning company name. This falls back to appname. You may
            pass False to disable it.
        "version" is an optional version path element to append to the
            path. You might want to use this if you want multiple versions
            of your app to be able to run independently. If used, this
            would typically be "<major>.<minor>".
            Only applied when appname is present.
        "opinion" (boolean) can be False to disable the appending of
            "Logs" to the base app data dir for Windows, and "log" to the
            base cache dir for Unix. See discussion below.

    Typical user log directories are:
        Mac OS X:   ~/Library/Logs/<AppName>
        Unix:       ~/.cache/<AppName>/log  # or under $XDG_CACHE_HOME if defined
        Win XP:     C:\Documents and Settings\<username>\Local Settings\Application Data\<AppAuthor>\<AppName>\Logs
        Vista:      C:\Users\<username>\AppData\Local\<AppAuthor>\<AppName>\Logs

    On Windows the only suggestion in the MSDN docs is that local settings
    go in the `CSIDL_LOCAL_APPDATA` directory. (Note: I'm interested in
    examples of what some Windows apps use for a logs dir.)

    OPINION: This function appends "Logs" to the `CSIDL_LOCAL_APPDATA`
    value for Windows and appends "log" to the user cache dir for Unix.
    This can be disabled with the `opinion=False` option.
    rz~/Library/LogsrFZLogs�log)rrr
rrrr))rrrr(r
rrr�user_log_dircs  
r,c@sneZdZdZddd�Zedd��Zedd	��Zed
d��Zedd
��Z	edd��Z
edd��Zedd��ZdS)�AppDirsz1Convenience wrapper for getting application dirs.NFcCs"||_||_||_||_||_dS)N)rrrrr#)�selfrrrrr#rrr�__init__�s
zAppDirs.__init__cCst|j|j|j|jd�S)N)rr)rrrrr)r.rrrr�s
zAppDirs.user_data_dircCst|j|j|j|jd�S)N)rr#)r%rrrr#)r.rrrr%�s
zAppDirs.site_data_dircCst|j|j|j|jd�S)N)rr)r&rrrr)r.rrrr&�s
zAppDirs.user_config_dircCst|j|j|j|jd�S)N)rr#)r'rrrr#)r.rrrr'�s
zAppDirs.site_config_dircCst|j|j|jd�S)N)r)r)rrr)r.rrrr)�s
zAppDirs.user_cache_dircCst|j|j|jd�S)N)r)r*rrr)r.rrrr*�s
zAppDirs.user_state_dircCst|j|j|jd�S)N)r)r,rrr)r.rrrr,�s
zAppDirs.user_log_dir)NNNFF)
�__name__�
__module__�__qualname__�__doc__r/�propertyrr%r&r'r)r*r,rrrrr-�s
r-cCsHtrddl}nddl}dddd�|}|j|jd�}|j||�\}}|S)z�This is a fallback technique at best. I'm not sure if using the
    registry for this guarantees us the correct answer for all CSIDL_*
    names.
    rNZAppDatazCommon AppDataz
Local AppData)r	rr
z@Software\Microsoft\Windows\CurrentVersion\Explorer\Shell Folders)�PY3�winreg�_winreg�OpenKey�HKEY_CURRENT_USERZQueryValueEx)�
csidl_namer7Zshell_folder_name�key�dir�typerrr�_get_win_folder_from_registry�s
r>cCs�ddlm}m}|jdt||�dd�}y`t|�}d}x|D]}t|�dkr:d}Pq:W|r�yddl}|j|�}Wnt	k
r�YnXWnt
k
r�YnX|S)Nr)�shellcon�shellF�T)�win32com.shellr?r@�SHGetFolderPath�getattr�unicode�ord�win32api�GetShortPathName�ImportError�UnicodeError)r:r?r@r<�
has_high_char�crGrrr�_get_win_folder_with_pywin32�s$

rMcCs�ddl}dddd�|}|jd�}|jjjd|dd|�d}x|D]}t|�dkrBd	}PqBW|r�|jd�}|jjj|j|d�r�|}|jS)
Nr��#�)r	rr
iFrAT)	�ctypesZcreate_unicode_buffer�windllZshell32ZSHGetFolderPathWrFZkernel32ZGetShortPathNameW�value)r:rQZcsidl_const�bufrKrLZbuf2rrr�_get_win_folder_with_ctypes�s"


rUcCs�ddl}ddlm}ddlm}|jjd}|jd|�}|jj	}|j
dt|j|�d|jj
|�|jj|j��jd�}d}x|D]}	t|	�dkr~d	}Pq~W|r�|jd|�}|jj	}
|
j|||�r�|jj|j��jd�}|S)
Nr)�jna)r�rL�FrAT)�arrayZcom.sunrVZcom.sun.jna.platformrZWinDefZMAX_PATHZzerosZShell32ZINSTANCErCrDZShlObjZSHGFP_TYPE_CURRENTZNativeZtoStringZtostringrrFZKernel32rH)r:rYrVrZbuf_sizerTr@r<rKrLZkernelrrr�_get_win_folder_with_jnas&
rZ)rR�__main__ZMyAppZ	MyCompanyz-- app dirs %s --z%-- app dirs (with optional 'version')z1.0)rz%s: %sz)
-- app dirs (without optional 'version')z+
-- app dirs (without optional 'appauthor')z(
-- app dirs (with disabled 'appauthor'))r)rrr)NNNF)NNNF)NNNF)NNNF)NNNT)NNNF)NNNT)rr&r)r*r,r%r')-r3Z__version_info__r�map�str�__version__�sysr�version_infor5rE�platform�
startswithZjava_verZos_namerrr%r&r'r)r*r,�objectr-r>rMrUrZrBZwin32comrrIrQrRZcom.sun.jnaZcomr0rrZprops�print�dirsZproprDrrrr�<module>s�


7
B
(
3
9
*
30






site-packages/__pycache__/seobject.cpython-36.opt-1.pyc000064400000251644147511334560016704 0ustar003

��f���@s2ddlZddlZddlZddlZddlZddlZddlZddlZddlTdZ	ddl
Z
ddlZddlZy:ddl
Z
iZejdRkr�ded<e
je	fddd	�e��WnJyddlZeejd
<Wn&ek
r�ddlZeejd
<YnXYnXddlZiZeed<eed<eed
<eed<eed<eed<eed<eed<eed<eed<eed<eed<eed<eed<eed<eed<eed<eed<eed<eed<eed<e ed <e ed!<e ed"<d
ddddddd d#�Z!d$d$d%d&d'd(d)d*dd+�	Z"y(ddl#Z#e#j$e#j%��Gd,d-�d-�Z&Wn(e'efk
�r4Gd.d-�d-�Z&YnXGd/d0�d0�Z(d1d2�Z)dSd4d5�Z*dTd6d7�Z+Gd8d9�d9�Z,Gd:d;�d;e,�Z-Gd<d=�d=e,�Z.Gd>d?�d?e,�Z/Gd@dA�dAe,�Z0GdBdC�dCe,�Z1GdDdE�dEe,�Z2GdFdG�dGe,�Z3GdHdI�dIe,�Z4GdJdK�dKe,�Z5GdLdM�dMe,�Z6GdNdO�dOe,�Z7GdPdQ�dQe,�Z8dS)U�N)�*zselinux-python�T�unicodez/usr/share/localezutf-8)Z	localedirZcodeset�_�z	all files�azregular filez--�fz-d�	directory�dz-czcharacter device�cz-bzblock device�bz-s�socket�sz-l�lz
symbolic link�pz-pz
named pipe)z	all fileszregular filer	zcharacter devicezblock devicer
z
symbolic linkz
named pipe�any�block�char�dir�file�symlink�pipe)	rrrrr
rrrrc@s8eZdZdd�Zd
dd�Zddd�Zdd	�Zd
d�ZdS)�loggercCstj�|_g|_g|_dS)N)�audit�
audit_open�audit_fd�log_list�log_change_list)�self�r�/usr/lib/python3.6/seobject.py�__init__ls
zlogger.__init__rc	
Cs�d}	||kr||	d7}d}	||kr4||	d7}d}	||krL||	d7}d}	|jj|jtjtjdt|�|d||||||dddg�dS)N�-�sename�,�role�rangerr)r�appendrrZAUDIT_ROLE_ASSIGN�sys�argv�str)
r�msg�namer#�serole�serange�	oldsename�	oldserole�
oldserange�seprrr �logqsz
logger.logc		Cs<|jj|jtjtjdt|�|d||||||dddg�dS)Nrr)rr'rrZAUDIT_ROLE_REMOVEr(r)r*)	rr+r,r#r-r.r/r0r1rrr �
log_remove�szlogger.log_removecCs&|jj|jtjt|�ddddg�dS)N�semanager)rr'rrZAUDIT_USER_MAC_CONFIG_CHANGEr*)rr+rrr �
log_change�szlogger.log_changecCsPx|jD]}tj||g�qWx|jD]}tj||g�q(Wg|_g|_dS)N)rrZaudit_log_semanage_messagerZaudit_log_user_comm_message)r�successrrrr �commit�sz
logger.commitN)rrrrrrr)rrrrrrr)�__name__�
__module__�__qualname__r!r3r4r6r8rrrr rjs


rc@s8eZdZdd�Zd
dd�Zddd�Zdd	�Zd
d�ZdS)rcCs
g|_dS)N)r)rrrr r!�szlogger.__init__rc	
Cs�d||f}	|dkr |	d|7}	|dkr4|	d|7}	|dkrH|	d|7}	|dkr\|	d|7}	|dkrx|dk	rx|	d|7}	|dkr�|dk	r�|	d|7}	|jj|	�dS)	Nz %s name=%srz sename=z oldsename=z role=z
 old_role=z
 MLSRange=z old_MLSRange=)rr')
rr+r,r#r-r.r/r0r1�messagerrr r3�sz
logger.logc			Cs|j||||||||�dS)N)r3)	rr+r,r#r-r.r/r0r1rrr r4�szlogger.log_removecCs|jjd|�dS)Nz %s)rr')rr+rrr r6�szlogger.log_changecCs8|dkrd}nd}x |jD]}tjtj||�qWdS)N�zSuccessful: zFailed: )r�syslogZLOG_INFO)rr7r<rrrr r8�s
z
logger.commitN)rrrrrrr)rrrrrrr)r9r:r;r!r3r4r6r8rrrr r�s


c@s0eZdZddd�Zddd�Zdd�Zdd	�Zd
S)
�
nullloggerrc		CsdS)Nr)	rr+r,r#r-r.r/r0r1rrr r3�sznulllogger.logc		CsdS)Nr)	rr+r,r#r-r.r/r0r1rrr r4�sznulllogger.log_removecCsdS)Nr)rr+rrr r6�sznulllogger.log_changecCsdS)Nr)rr7rrr r8�sznulllogger.commitN)rrrrrrr)rrrrrrr)r9r:r;r3r4r6r8rrrr r?�s

r?cCsXd}d}|d|d}|d|d}|d|dd|d}tjd	|d
|�S)Nzs[0-9]*zc[0-9]*z(\.z)?z(\,z)*z(-z(:�^�$)�re�search)�rawZsensitivity�categoryZ	cat_rangeZ
categoriesZregrrr �validate_level�srFr=cCs`d}|dkrd||f}n|}tj|�\}}|dkr8|S|rL|t|�d�}|dkrX|S|SdS)Nza:b:c:r=z%s%srr)�selinuxZselinux_raw_to_trans_context�len)rD�prepend�filler�context�rc�transrrr �	translate�srNcCs`d}|dkrd||f}n|}tj|�\}}|dkr8|S|rL|t|�d�}|dkrX|S|SdS)Nza:b:c:r=z%s%srr)rGZselinux_trans_to_raw_contextrH)rMrIrJrKrLrDrrr �untranslate�srOc@sfeZdZdZdZdZdZddd�Zdd�Zdd�Z	d	d
�Z
dd�Zd
d�Zdd�Z
dd�Zdd�ZdS)�semanageRecordsFNcCs�|rt|�tkr||_n||_t|dd�|_|js@t|dd�|_|j|j�|_tj	�\}}|jdksn|j|krxt
�|_n,tj
|j�tjdtj�|jf�t�|_dS)N�noreloadF�storerz%s%s)�typer*rR�args�getattrrQ�
get_handle�shrG�selinux_getpolicytyper�mylog�sepolicyZload_store_policyZselinux_set_policy_rootZselinux_pathr?)rrTrLZ
localstorerrr r!�s
zsemanageRecords.__init__cCs||_dS)N)rQ)r�loadrrr �
set_reload
szsemanageRecords.set_reloadcCs�tjrtjSt�}|s"ttd���tjrD|dkrDt||t�|t_t	|�s`t
|�ttd���t|�}|tkr�t
|�ttd���t
|�}|dkr�t
|�ttd���t|�atdkr�t
|�ttd���|t_tjS)Nz Could not create semanage handlerz:SELinux policy is not managed or store cannot be accessed.zCannot read policy store.rz'Could not establish semanage connectionz!Could not test MLS enabled status)rP�handleZsemanage_handle_create�
ValueErrorr�transactionZsemanage_select_storeZSEMANAGE_CON_DIRECTrRZsemanage_is_managedZsemanage_handle_destroyZsemanage_access_checkZSEMANAGE_CAN_READZsemanage_connectZsemanage_mls_enabled�is_mls_enabled)rrRr]rLrrr rV
s2zsemanageRecords.get_handlecCsttd���dS)NzNot yet implemented)r^r)rrrr �	deleteall1szsemanageRecords.deleteallcCs$tjrttd���|j�dt_dS)Nz(Semanage transaction already in progressT)rPr_r^r�begin)rrrr �start4szsemanageRecords.startcCs,tjr
dSt|j�}|dkr(ttd���dS)Nrz$Could not start semanage transaction)rPr_Zsemanage_begin_transactionrWr^r)rrLrrr rb:s

zsemanageRecords.begincCsttd���dS)NzNot yet implemented)r^r)rrrr �
customizedAszsemanageRecords.customizedcCsVtjr
dS|jrt|jd�t|j�}|dkrF|jjd�tt	d���|jjd�dS)Nrz%Could not commit semanage transactionr=)
rPr_rQZsemanage_set_reloadrWZsemanage_commitrYr8r^r)rrLrrr r8Ds
zsemanageRecords.commitcCs$tjsttd���dt_|j�dS)Nz$Semanage transaction not in progressF)rPr_r^rr8)rrrr �finishPszsemanageRecords.finish)N)r9r:r;r_r]rRrTr!r\rVrarcrbrdr8rerrrr rP�s
$rPc@sPeZdZddd�Zdd�Zdd�Zdd
d�Zdd
�Zdd�Zdd�Z	dd�Z
dS)�
moduleRecordsNcCstj||�dS)N)rPr!)rrTrrr r!YszmoduleRecords.__init__cCsg}t|j�\}}}|dkr(ttd���x�t|�D]�}t||�}t|j|�\}}|dkrdttd���t|j|�\}}|dkr�ttd���t|j|�\}}	|dkr�ttd���t	|j|�\}}
|dkr�ttd���|j
|||	|
f�q2W|jdd�d	d
�|jdd�d�|S)
NrzCould not list SELinux moduleszCould not get module namezCould not get module enabledzCould not get module priorityzCould not get module lang_extcSs|dS)Nrr)�trrr �<lambda>xsz'moduleRecords.get_all.<locals>.<lambda>T)�key�reversecSs|dS)Nrr)rgrrr rhys)ri)Zsemanage_module_list_allrWr^rr&�semanage_module_list_nthZsemanage_module_info_get_nameZ semanage_module_info_get_enabledZ!semanage_module_info_get_priorityZ!semanage_module_info_get_lang_extr'�sort)rrrL�mlist�number�i�modr,Zenabled�priorityZlang_extrrr �get_all\s,
zmoduleRecords.get_allcCs0|j�}t|�dkrgSdd�dd�|D�D�S)NrcSsg|]}d|d�qS)z-d %srr)�.0�xrrr �
<listcomp>�sz,moduleRecords.customized.<locals>.<listcomp>cSsg|]}|ddkr|�qS)r=rr)rsrgrrr ru�s)rrrH)r�allrrr rd|szmoduleRecords.customizedr=rcCs�|j�}t|�dkrdS|r:tdtd�td�td�f�xL|D]D}|ddkrZtd�}n
|r`q@d}td	|d|d
|d|f�q@WdS)Nrz
%-25s %-9s %s
zModule NameZPriorityZLanguager=ZDisabledrz%-25s %-9s %-5s %s�r)rrrH�printr)r�heading�	locallistrvrgZdisabledrrr �list�s

zmoduleRecords.listcCs`tjj|�sttd�|��t|j|�}|dkr@ttd�|��t|j|�}|dkr\|j�dS)NzModule does not exist: %s rz3Invalid priority %d (needs to be between 1 and 999))	�os�path�existsr^r�semanage_set_default_priorityrWZsemanage_module_install_filer8)rrrqrLrrr �add�szmoduleRecords.addcCs�x�|j�D]�}t|j�\}}|dkr0ttd���t|j||�}|dkrRttd���t|j||�}|dkr
|r~ttd�|��q
ttd�|��q
W|j�dS)NrzCould not create module keyzCould not set module key namezCould not enable module %szCould not disable module %s)�splitZsemanage_module_key_createrWr^rZsemanage_module_key_set_nameZsemanage_module_set_enabledr8)r�module�enable�mrLrirrr �set_enabled�szmoduleRecords.set_enabledcCsnt|j|�}|dkr$ttd�|��x<|j�D]0}t|j|�}|dkr.|dkr.ttd�|��q.W|j�dS)Nrz3Invalid priority %d (needs to be between 1 and 999)rwz*Could not remove module %s (remove failed)���)rrWr^rr��semanage_module_remover8)rr�rqrLr�rrr �delete�szmoduleRecords.deletecCs:dd�dd�|j�D�D�}x|D]}|j|d�q"WdS)NcSsg|]}|d�qS)rr)rsrtrrr ru�sz+moduleRecords.deleteall.<locals>.<listcomp>cSsg|]}|ddkr|�qS)r=rr)rsrgrrr ru�sT)rrr�)rrr�rrr ra�s
zmoduleRecords.deleteall)N)r=r)r9r:r;r!rrrdr{r�r�r�rarrrr rfWs
 
rfc@seZdZddd�Zdd�ZdS)�dontauditClassNcCstj||�dS)N)rPr!)rrTrrr r!�szdontauditClass.__init__cCs8|dkrttd���|j�t|j|dk�|j�dS)N�on�offz'dontaudit requires either 'on' or 'off')r�r�)r^rrbZsemanage_set_disable_dontauditrWr8)rZ	dontauditrrr �toggle�s
zdontauditClass.toggle)N)r9r:r;r!r�rrrr r��s
r�c@sHeZdZddd�Zdd�Zdd�Zdd
d�Zdd
�Zdd�Zdd�Z	dS)�permissiveRecordsNcCstj||�dS)N)rPr!)rrTrrr r!�szpermissiveRecords.__init__cCsrg}t|j�\}}}|dkr(ttd���xDt|�D]8}t||�}t|�}|r2|jd�r2|j|j	d�d�q2W|S)NrzCould not list SELinux modulesZpermissive_r=)
Zsemanage_module_listrWr^rr&rkZsemanage_module_get_name�
startswithr'r�)rrrLrmrnrorpr,rrr rr�s
zpermissiveRecords.get_allcCsdd�t|j��D�S)NcSsg|]}d|�qS)z-a %sr)rsrtrrr ru�sz0permissiveRecords.customized.<locals>.<listcomp>)�sortedrr)rrrr rd�szpermissiveRecords.customizedr=rcCs�dd�dd�tjtj�D�D�}t|�dkr0dS|rDtdtd��|j�}x|D]}||krRt|�qRWt|�dkrzdS|r�tdtd��x|D]}t|�q�WdS)NcSsg|]}|d�qS)r,r)rs�yrrr ru�sz*permissiveRecords.list.<locals>.<listcomp>cSsg|]}|dr|�qS)Z
permissiver)rsrtrrr ru�srz
%-25s
zBuiltin Permissive TypeszCustomized Permissive Types)rZ�infoZTYPErHrxrrr)rryrzrvrdrgrrr r{�s 

zpermissiveRecords.listcCs�yddlj}Wn tk
r.ttd���YnXd|}d|}t|j|t|�|d�}|dkrf|j�|dkr~ttd�|��dS)Nrz�The sepolgen python module is required to setup permissive domains.
In some distributions it is included in the policycoreutils-devel package.
# yum install policycoreutils-devel
Or similar for your distro.z
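# A heavily hedged sketch of the permissiveRecords class whose bytecode appears
# above, roughly equivalent to "semanage permissive -a httpd_t". It needs root
# on an SELinux-managed system; "targeted" (the policy store) and "httpd_t" are
# example values, and some policycoreutils releases expect an argparse-style
# namespace instead of a plain store name.
import seobject

permissive = seobject.permissiveRecords("targeted")
permissive.add("httpd_t")       # builds and installs a permissive_httpd_t module
print(permissive.get_all())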
permissive_%sz(typepermissive %s)Zcilz?Could not set permissive domain %s (module installation failed))	Zsepolgen.moduler��ImportErrorr^rZsemanage_module_installrWrHr8)rrSr�r,ZmodtxtrLrrr r��szpermissiveRecords.addcCsFx8|j�D],}t|jd|�}|dkr
ttd�|��q
W|j�dS)Nz
permissive_%srz5Could not remove permissive domain %s (remove failed))r�r�rWr^rr8)rr,�nrLrrr r�s
zpermissiveRecords.deletecCs,|j�}t|�dkr(dj|�}|j|�dS)Nr� )rrrH�joinr�)rrrvrrr ras
zpermissiveRecords.deleteall)N)r=r)
r9r:r;r!rrrdr{r�r�rarrrr r��s


r�c@s~eZdZddd�Zdd�Zdd�Zdd	�Zd dd�Zd!d
d�Zdd�Z	dd�Z
dd�Zdd�Zd"dd�Z
dd�Zd#dd�ZdS)$�loginRecordsNcCs(tj||�d|_d|_d|_d|_dS)N)rPr!r/r1r#r.)rrTrrr r!s
zloginRecords.__init__cCs�tj|�\}|_|_|dkr d}t|j�}|j|j�\}\}}|j|�\}\}}	tdkrn|dkrjt|�}n|}t	|j
|�\}}
|dkr�ttd�|��|ddkr�yt
j|dd��Wn$ttd�|dd���YnXn,ytj|�Wnttd�|��YnXt|j
�\}}|dk�r4ttd	�|��t|j
||�}|dk�r\ttd
�|��tdk�r�|dk�r�t|j
||�}|dk�r�ttd�|��t|j
||�}|dk�r�ttd�|��t|j
|
|�}|dk�r�ttd
�|��t|
�t|�dS)NrZuser_ur=rzCould not create a key for %s�%zLinux Group %s does not existzLinux User %s does not existz%Could not create login mapping for %szCould not set name for %szCould not set MLS range for %sz!Could not set SELinux user for %sz"Could not add login mapping for %s)rG�getseuserbynamer/r1�seluserRecordsrT�getr`rO�semanage_seuser_key_createrWr^r�grpZgetgrnam�pwd�getpwnamZsemanage_seuser_createZsemanage_seuser_set_name�semanage_seuser_set_mlsrange�semanage_seuser_set_sename�semanage_seuser_modify_local�semanage_seuser_key_free�semanage_seuser_free)rr,r#r.�rec�userrecr&rLr0r-�k�urrr �__add sP

 




zloginRecords.__addcCsxyL|j�|j|�r4ttd�|�|j|||�n|j|||�|j�Wn&tk
rr}z
|�WYdd}~XnXdS)Nz:Login mapping for %s is already defined, modifying instead)rb�_loginRecords__existsrxr�_loginRecords__modify�_loginRecords__addr8r^)rr,r#r.�errorrrr r�Vs
zloginRecords.addcCs\t|j|�\}}|dkr(ttd�|��t|j|�\}}|dkrPttd�|��t|�|S)NrzCould not create a key for %sz2Could not check if login mapping for %s is defined)r�rWr^r�semanage_seuser_existsr�)rr,rLr�r~rrr �__existsdszloginRecords.__existsrc
Cs�tj|�\}|_|_|dkr0|dkr0ttd���t|j�}|j|j�\}\}}|dkrj|j|�\}\}}	n|}	|dkr~||_	n||_	t
|j|�\}}
|dkr�ttd�|��t|j|
�\}}|dkr�ttd�|��|s�ttd�|��t
|j|
�\}}|dk�rttd�|��t|�|_t|�|_tdk�rL|dk�rLt|j|t|��|dk�rlt|j||�||_n|j|_t|j|
|�}|dk�r�ttd	�|��t|
�t|�dS)
NrzRequires seuser or serangerzCould not create a key for %sz2Could not check if login mapping for %s is definedz#Login mapping for %s is not definedzCould not query seuser for %sr=z%Could not modify login mapping for %s)rGr�r/r1r^rr�rTr�r.r�rWr�Zsemanage_seuser_query�semanage_seuser_get_mlsrange�semanage_seuser_get_senamer`r�rOr�r#r�r�r�)
rr,r#r.r�r�r&rLr0r-r�r~r�rrr �__modifypsF





zloginRecords.__modifycCsNy"|j�|j|||�|j�Wn&tk
rH}z
|�WYdd}~XnXdS)N)rbr�r8r^)rr,r#r.r�rrr �modify�szloginRecords.modifyc
Cs*tj|�\}|_|_t|j�}|j|j�\}\}}t|j|�\}}|dkrZt	t
d�|��t|j|�\}}|dkr�t	t
d�|��|s�t	t
d�|��t|j|�\}}|dkr�t	t
d�|��|s�t	t
d�|��t
|j|�}|dkr�t	t
d�|��t|�tjd�\}|_|_|j|j�\}\}}	dS)NrzCould not create a key for %sz2Could not check if login mapping for %s is definedz#Login mapping for %s is not definedz<Login mapping for %s is defined in policy, cannot be deletedz%Could not delete login mapping for %sZ__default__)rGr�r/r1r�rTr�r�rWr^rr�Zsemanage_seuser_exists_localZsemanage_seuser_del_localr�r#r.)
rr,r�r�r&rLr0r�r~r-rrr �__delete�s,
zloginRecords.__deletecCsJy|j�|j|�|j�Wn&tk
rD}z
|�WYdd}~XnXdS)N)rb�_loginRecords__deleter8r^)rr,r�rrr r��s
zloginRecords.deletecCs~t|j�\}}|dkr"ttd���y0|j�x|D]}|jt|��q2W|j�Wn&tk
rx}z
|�WYdd}~XnXdS)NrzCould not list login mappings)�semanage_seuser_list_localrWr^rrbr��semanage_seuser_get_namer8)rrL�ulistr�r�rrr ra�s
zloginRecords.deleteallc
Cs�i}tj�d|_x�tj|j�D]�\}}}||jkr xj|D]b}yHt|d|�}|j�j�jd�}|j	�|d|d|df||<Wq:t
k
r�Yq:Xq:Wq W|S)Nz/logins�/�:r=rwr)rGZselinux_policy_root�logins_pathr|�walk�open�read�rstripr��close�
IndexError)r�ddictr}�dirs�filesr,�fdr�rrr �get_all_logins�s

zloginRecords.get_all_loginsrcCspi}|rt|j�\}|_nt|j�\}|_|dkr>ttd���x,|jD]"}t|�}t|�t|�df||<qFW|S)NrzCould not list login mappingsr)	r�rWr�Zsemanage_seuser_listr^rr�r�r�)rrzr�rLr�r,rrr rr�szloginRecords.get_allcCstg}|jd�}x`t|j��D]P}||drR|jd||d||d|f�q|jd||d|f�qW|S)NTr=z-a -s %s -r '%s' %srz-a -s %s %s)rrr��keysr')rrr�r�rrr rd�s
&zloginRecords.customizedr=c	CsN|j|�}|j�}t|j��}t|j��}t|�dkrFt|�dkrFdStdk�r|rxtdtd�td�td�td�f�x8|D]0}||}td||dt|d�|d	f�q~Wt|�r�td
|j	�x�|D]0}||}td||dt|d�|d	f�q�WnF|�r"tdtd�td�f�x&|D]}td|||df��q(WdS)
Nrr=z
%-20s %-20s %-20s %s
z
Login NamezSELinux Userz
MLS/MCS RangeZServicez%-20s %-20s %-20s %srwz
Local customization in %sz
%-25s %-25s
z%-25s %-25s)
rrr�r�r�rHr`rxrrNr�)	rryrzr�ZldictZlkeysr�r�r�rrr r{s*

$
(
*
zloginRecords.list)N)rr)rr)r)r=r)r9r:r;r!r�r�r�r�r�r�r�rar�rrrdr{rrrr r�s
6
2
	


r�c@s�eZdZddd�Zdd�Zdd�Zdd	�Zd
d�Zgdddfd
d�Zgdddfdd�Z	dd�Z
dd�Zdd�Zd dd�Z
dd�Zd!dd�ZdS)"r�NcCstj||�dS)N)rPr!)rrTrrr r!"szseluserRecords.__init__cCs�t|j|�\}}|dkr(ttd�|��t|j|�\}}|dkrPttd�|��t|j|�\}}|dkrxttd�|��t|�}t|j|�}t|�t	|�||fS)NrzCould not create a key for %sz-Could not check if SELinux user %s is definedzCould not query user for %s)
[binary data omitted -- tail of a marshalled CPython bytecode member (compiled Python, not human-readable). The recoverable string constants identify it as the SELinux "seobject" module shipped with policycoreutils/semanage. This portion contains the record-management classes seluserRecords, portRecords, ibpkeyRecords, ibendportRecords, nodeRecords, interfaceRecords, fcontextRecords and booleanRecords, each wrapping the libsemanage C API (semanage_*_key_create, _query, _modify_local, _del_local, _list_local) behind add/modify/delete/deleteall/get_all/customized/list methods, together with gettext error strings such as "Could not create a key for %s" and audit messages of the form "resrc=port op=add lport=%s proto=%s tcontext=%s:%s:%s:%s".]
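As a rough illustration only (not part of the archive): the classes named above are normally driven the same way the semanage command-line tool drives them. The sketch below assumes the standard policycoreutils seobject API -- a no-argument constructor, portRecords.add(port, proto, serange, setype) and fcontextRecords.add(target, setype, ftype, serange) -- which can differ slightly between versions; the port number, path and types are made-up examples.

# Hypothetical usage sketch of the seobject module (assumed API; typically requires root
# and a writable SELinux policy store).
import seobject

# Label TCP port 8080 as http_port_t, like `semanage port -a -t http_port_t -p tcp 8080`.
ports = seobject.portRecords()
ports.add("8080", "tcp", "s0", "http_port_t")

# Add a local file-context rule, like
# `semanage fcontext -a -t httpd_sys_content_t "/srv/www(/.*)?"`.
fcontexts = seobject.fcontextRecords()
fcontexts.add("/srv/www(/.*)?", "httpd_sys_content_t", "", "s0")

# Print the local customizations (arguments are heading=1, locallist=1).
ports.list(1, 1)
fcontexts.list(1, 1)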
site-packages/__pycache__/seobject.cpython-36.pyc000064400000251644147511334560015745 0ustar00

��f���@s2ddlZddlZddlZddlZddlZddlZddlZddlZddlTdZ	ddl
Z
ddlZddlZy:ddl
Z
iZejdRkr�ded<e
je	fddd	�e��WnJyddlZeejd
<Wn&ek
r�ddlZeejd
<YnXYnXddlZiZeed<eed<eed
<eed<eed<eed<eed<eed<eed<eed<eed<eed<eed<eed<eed<eed<eed<eed<eed<eed<eed<e ed <e ed!<e ed"<d
ddddddd d#�Z!d$d$d%d&d'd(d)d*dd+�	Z"y(ddl#Z#e#j$e#j%��Gd,d-�d-�Z&Wn(e'efk
�r4Gd.d-�d-�Z&YnXGd/d0�d0�Z(d1d2�Z)dSd4d5�Z*dTd6d7�Z+Gd8d9�d9�Z,Gd:d;�d;e,�Z-Gd<d=�d=e,�Z.Gd>d?�d?e,�Z/Gd@dA�dAe,�Z0GdBdC�dCe,�Z1GdDdE�dEe,�Z2GdFdG�dGe,�Z3GdHdI�dIe,�Z4GdJdK�dKe,�Z5GdLdM�dMe,�Z6GdNdO�dOe,�Z7GdPdQ�dQe,�Z8dS)U�N)�*zselinux-python�T�unicodez/usr/share/localezutf-8)Z	localedirZcodeset�_�z	all files�azregular filez--�fz-d�	directory�dz-czcharacter device�cz-bzblock device�bz-s�socket�sz-l�lz
symbolic link�pz-pz
named pipe)z	all fileszregular filer	zcharacter devicezblock devicer
z
symbolic linkz
named pipe�any�block�char�dir�file�symlink�pipe)	rrrrr
rrrrc@s8eZdZdd�Zd
dd�Zddd�Zdd	�Zd
d�ZdS)�loggercCstj�|_g|_g|_dS)N)�audit�
audit_open�audit_fd�log_list�log_change_list)�self�r�/usr/lib/python3.6/seobject.py�__init__ls
zlogger.__init__rc	
Cs�d}	||kr||	d7}d}	||kr4||	d7}d}	||krL||	d7}d}	|jj|jtjtjdt|�|d||||||dddg�dS)N�-�sename�,�role�rangerr)r�appendrrZAUDIT_ROLE_ASSIGN�sys�argv�str)
r�msg�namer#�serole�serange�	oldsename�	oldserole�
oldserange�seprrr �logqsz
logger.logc		Cs<|jj|jtjtjdt|�|d||||||dddg�dS)Nrr)rr'rrZAUDIT_ROLE_REMOVEr(r)r*)	rr+r,r#r-r.r/r0r1rrr �
log_remove�szlogger.log_removecCs&|jj|jtjt|�ddddg�dS)N�semanager)rr'rrZAUDIT_USER_MAC_CONFIG_CHANGEr*)rr+rrr �
log_change�szlogger.log_changecCsPx|jD]}tj||g�qWx|jD]}tj||g�q(Wg|_g|_dS)N)rrZaudit_log_semanage_messagerZaudit_log_user_comm_message)r�successrrrr �commit�sz
logger.commitN)rrrrrrr)rrrrrrr)�__name__�
__module__�__qualname__r!r3r4r6r8rrrr rjs


rc@s8eZdZdd�Zd
dd�Zddd�Zdd	�Zd
d�ZdS)rcCs
g|_dS)N)r)rrrr r!�szlogger.__init__rc	
Cs�d||f}	|dkr |	d|7}	|dkr4|	d|7}	|dkrH|	d|7}	|dkr\|	d|7}	|dkrx|dk	rx|	d|7}	|dkr�|dk	r�|	d|7}	|jj|	�dS)	Nz %s name=%srz sename=z oldsename=z role=z
 old_role=z
 MLSRange=z old_MLSRange=)rr')
rr+r,r#r-r.r/r0r1�messagerrr r3�sz
logger.logc			Cs|j||||||||�dS)N)r3)	rr+r,r#r-r.r/r0r1rrr r4�szlogger.log_removecCs|jjd|�dS)Nz %s)rr')rr+rrr r6�szlogger.log_changecCs8|dkrd}nd}x |jD]}tjtj||�qWdS)N�zSuccessful: zFailed: )r�syslogZLOG_INFO)rr7r<rrrr r8�s
z
logger.commitN)rrrrrrr)rrrrrrr)r9r:r;r!r3r4r6r8rrrr r�s


c@s0eZdZddd�Zddd�Zdd�Zdd	�Zd
S)
�
nullloggerrc		CsdS)Nr)	rr+r,r#r-r.r/r0r1rrr r3�sznulllogger.logc		CsdS)Nr)	rr+r,r#r-r.r/r0r1rrr r4�sznulllogger.log_removecCsdS)Nr)rr+rrr r6�sznulllogger.log_changecCsdS)Nr)rr7rrr r8�sznulllogger.commitN)rrrrrrr)rrrrrrr)r9r:r;r3r4r6r8rrrr r?�s

r?cCsXd}d}|d|d}|d|d}|d|dd|d}tjd	|d
|�S)Nzs[0-9]*zc[0-9]*z(\.z)?z(\,z)*z(-z(:�^�$)�re�search)�rawZsensitivity�categoryZ	cat_rangeZ
categoriesZregrrr �validate_level�srFr=cCs`d}|dkrd||f}n|}tj|�\}}|dkr8|S|rL|t|�d�}|dkrX|S|SdS)Nza:b:c:r=z%s%srr)�selinuxZselinux_raw_to_trans_context�len)rD�prepend�filler�context�rc�transrrr �	translate�srNcCs`d}|dkrd||f}n|}tj|�\}}|dkr8|S|rL|t|�d�}|dkrX|S|SdS)Nza:b:c:r=z%s%srr)rGZselinux_trans_to_raw_contextrH)rMrIrJrKrLrDrrr �untranslate�srOc@sfeZdZdZdZdZdZddd�Zdd�Zdd�Z	d	d
�Z
dd�Zd
d�Zdd�Z
dd�Zdd�ZdS)�semanageRecordsFNcCs�|rt|�tkr||_n||_t|dd�|_|js@t|dd�|_|j|j�|_tj	�\}}|jdksn|j|krxt
�|_n,tj
|j�tjdtj�|jf�t�|_dS)N�noreloadF�storerz%s%s)�typer*rR�args�getattrrQ�
get_handle�shrG�selinux_getpolicytyper�mylog�sepolicyZload_store_policyZselinux_set_policy_rootZselinux_pathr?)rrTrLZ
localstorerrr r!�s
zsemanageRecords.__init__cCs||_dS)N)rQ)r�loadrrr �
set_reload
szsemanageRecords.set_reloadcCs�tjrtjSt�}|s"ttd���tjrD|dkrDt||t�|t_t	|�s`t
|�ttd���t|�}|tkr�t
|�ttd���t
|�}|dkr�t
|�ttd���t|�atdkr�t
|�ttd���|t_tjS)Nz Could not create semanage handlerz:SELinux policy is not managed or store cannot be accessed.zCannot read policy store.rz'Could not establish semanage connectionz!Could not test MLS enabled status)rP�handleZsemanage_handle_create�
ValueErrorr�transactionZsemanage_select_storeZSEMANAGE_CON_DIRECTrRZsemanage_is_managedZsemanage_handle_destroyZsemanage_access_checkZSEMANAGE_CAN_READZsemanage_connectZsemanage_mls_enabled�is_mls_enabled)rrRr]rLrrr rV
s2zsemanageRecords.get_handlecCsttd���dS)NzNot yet implemented)r^r)rrrr �	deleteall1szsemanageRecords.deleteallcCs$tjrttd���|j�dt_dS)Nz(Semanage transaction already in progressT)rPr_r^r�begin)rrrr �start4szsemanageRecords.startcCs,tjr
dSt|j�}|dkr(ttd���dS)Nrz$Could not start semanage transaction)rPr_Zsemanage_begin_transactionrWr^r)rrLrrr rb:s

zsemanageRecords.begincCsttd���dS)NzNot yet implemented)r^r)rrrr �
customizedAszsemanageRecords.customizedcCsVtjr
dS|jrt|jd�t|j�}|dkrF|jjd�tt	d���|jjd�dS)Nrz%Could not commit semanage transactionr=)
rPr_rQZsemanage_set_reloadrWZsemanage_commitrYr8r^r)rrLrrr r8Ds
zsemanageRecords.commitcCs$tjsttd���dt_|j�dS)Nz$Semanage transaction not in progressF)rPr_r^rr8)rrrr �finishPszsemanageRecords.finish)N)r9r:r;r_r]rRrTr!r\rVrarcrbrdr8rerrrr rP�s
$rPc@sPeZdZddd�Zdd�Zdd�Zdd
d�Zdd
�Zdd�Zdd�Z	dd�Z
dS)�
moduleRecordsNcCstj||�dS)N)rPr!)rrTrrr r!YszmoduleRecords.__init__cCsg}t|j�\}}}|dkr(ttd���x�t|�D]�}t||�}t|j|�\}}|dkrdttd���t|j|�\}}|dkr�ttd���t|j|�\}}	|dkr�ttd���t	|j|�\}}
|dkr�ttd���|j
|||	|
f�q2W|jdd�d	d
�|jdd�d�|S)
NrzCould not list SELinux moduleszCould not get module namezCould not get module enabledzCould not get module priorityzCould not get module lang_extcSs|dS)Nrr)�trrr �<lambda>xsz'moduleRecords.get_all.<locals>.<lambda>T)�key�reversecSs|dS)Nrr)rgrrr rhys)ri)Zsemanage_module_list_allrWr^rr&�semanage_module_list_nthZsemanage_module_info_get_nameZ semanage_module_info_get_enabledZ!semanage_module_info_get_priorityZ!semanage_module_info_get_lang_extr'�sort)rrrL�mlist�number�i�modr,Zenabled�priorityZlang_extrrr �get_all\s,
zmoduleRecords.get_allcCs0|j�}t|�dkrgSdd�dd�|D�D�S)NrcSsg|]}d|d�qS)z-d %srr)�.0�xrrr �
<listcomp>�sz,moduleRecords.customized.<locals>.<listcomp>cSsg|]}|ddkr|�qS)r=rr)rsrgrrr ru�s)rrrH)r�allrrr rd|szmoduleRecords.customizedr=rcCs�|j�}t|�dkrdS|r:tdtd�td�td�f�xL|D]D}|ddkrZtd�}n
|r`q@d}td	|d|d
|d|f�q@WdS)Nrz
%-25s %-9s %s
zModule NameZPriorityZLanguager=ZDisabledrz%-25s %-9s %-5s %s�r)rrrH�printr)r�heading�	locallistrvrgZdisabledrrr �list�s

zmoduleRecords.listcCs`tjj|�sttd�|��t|j|�}|dkr@ttd�|��t|j|�}|dkr\|j�dS)NzModule does not exist: %s rz3Invalid priority %d (needs to be between 1 and 999))	�os�path�existsr^r�semanage_set_default_priorityrWZsemanage_module_install_filer8)rrrqrLrrr �add�szmoduleRecords.addcCs�x�|j�D]�}t|j�\}}|dkr0ttd���t|j||�}|dkrRttd���t|j||�}|dkr
|r~ttd�|��q
ttd�|��q
W|j�dS)NrzCould not create module keyzCould not set module key namezCould not enable module %szCould not disable module %s)�splitZsemanage_module_key_createrWr^rZsemanage_module_key_set_nameZsemanage_module_set_enabledr8)r�module�enable�mrLrirrr �set_enabled�szmoduleRecords.set_enabledcCsnt|j|�}|dkr$ttd�|��x<|j�D]0}t|j|�}|dkr.|dkr.ttd�|��q.W|j�dS)Nrz3Invalid priority %d (needs to be between 1 and 999)rwz*Could not remove module %s (remove failed)���)rrWr^rr��semanage_module_remover8)rr�rqrLr�rrr �delete�szmoduleRecords.deletecCs:dd�dd�|j�D�D�}x|D]}|j|d�q"WdS)NcSsg|]}|d�qS)rr)rsrtrrr ru�sz+moduleRecords.deleteall.<locals>.<listcomp>cSsg|]}|ddkr|�qS)r=rr)rsrgrrr ru�sT)rrr�)rrr�rrr ra�s
zmoduleRecords.deleteall)N)r=r)r9r:r;r!rrrdr{r�r�r�rarrrr rfWs
 
rfc@seZdZddd�Zdd�ZdS)�dontauditClassNcCstj||�dS)N)rPr!)rrTrrr r!�szdontauditClass.__init__cCs8|dkrttd���|j�t|j|dk�|j�dS)N�on�offz'dontaudit requires either 'on' or 'off')r�r�)r^rrbZsemanage_set_disable_dontauditrWr8)rZ	dontauditrrr �toggle�s
zdontauditClass.toggle)N)r9r:r;r!r�rrrr r��s
r�c@sHeZdZddd�Zdd�Zdd�Zdd
d�Zdd
�Zdd�Zdd�Z	dS)�permissiveRecordsNcCstj||�dS)N)rPr!)rrTrrr r!�szpermissiveRecords.__init__cCsrg}t|j�\}}}|dkr(ttd���xDt|�D]8}t||�}t|�}|r2|jd�r2|j|j	d�d�q2W|S)NrzCould not list SELinux modulesZpermissive_r=)
Zsemanage_module_listrWr^rr&rkZsemanage_module_get_name�
startswithr'r�)rrrLrmrnrorpr,rrr rr�s
zpermissiveRecords.get_allcCsdd�t|j��D�S)NcSsg|]}d|�qS)z-a %sr)rsrtrrr ru�sz0permissiveRecords.customized.<locals>.<listcomp>)�sortedrr)rrrr rd�szpermissiveRecords.customizedr=rcCs�dd�dd�tjtj�D�D�}t|�dkr0dS|rDtdtd��|j�}x|D]}||krRt|�qRWt|�dkrzdS|r�tdtd��x|D]}t|�q�WdS)NcSsg|]}|d�qS)r,r)rs�yrrr ru�sz*permissiveRecords.list.<locals>.<listcomp>cSsg|]}|dr|�qS)Z
permissiver)rsrtrrr ru�srz
%-25s
zBuiltin Permissive TypeszCustomized Permissive Types)rZ�infoZTYPErHrxrrr)rryrzrvrdrgrrr r{�s 

zpermissiveRecords.listcCs�yddlj}Wn tk
r.ttd���YnXd|}d|}t|j|t|�|d�}|dkrf|j�|dkr~ttd�|��dS)Nrz�The sepolgen python module is required to setup permissive domains.
In some distributions it is included in the policycoreutils-devel package.
# yum install policycoreutils-devel
Or similar for your distro.z
permissive_%sz(typepermissive %s)Zcilz?Could not set permissive domain %s (module installation failed))	Zsepolgen.moduler��ImportErrorr^rZsemanage_module_installrWrHr8)rrSr�r,ZmodtxtrLrrr r��szpermissiveRecords.addcCsFx8|j�D],}t|jd|�}|dkr
ttd�|��q
W|j�dS)Nz
permissive_%srz5Could not remove permissive domain %s (remove failed))r�r�rWr^rr8)rr,�nrLrrr r�s
zpermissiveRecords.deletecCs,|j�}t|�dkr(dj|�}|j|�dS)Nr� )rrrH�joinr�)rrrvrrr ras
zpermissiveRecords.deleteall)N)r=r)
r9r:r;r!rrrdr{r�r�rarrrr r��s


r�c@s~eZdZddd�Zdd�Zdd�Zdd	�Zd dd�Zd!d
d�Zdd�Z	dd�Z
dd�Zdd�Zd"dd�Z
dd�Zd#dd�ZdS)$�loginRecordsNcCs(tj||�d|_d|_d|_d|_dS)N)rPr!r/r1r#r.)rrTrrr r!s
zloginRecords.__init__cCs�tj|�\}|_|_|dkr d}t|j�}|j|j�\}\}}|j|�\}\}}	tdkrn|dkrjt|�}n|}t	|j
|�\}}
|dkr�ttd�|��|ddkr�yt
j|dd��Wn$ttd�|dd���YnXn,ytj|�Wnttd�|��YnXt|j
�\}}|dk�r4ttd	�|��t|j
||�}|dk�r\ttd
�|��tdk�r�|dk�r�t|j
||�}|dk�r�ttd�|��t|j
||�}|dk�r�ttd�|��t|j
|
|�}|dk�r�ttd
�|��t|
�t|�dS)NrZuser_ur=rzCould not create a key for %s�%zLinux Group %s does not existzLinux User %s does not existz%Could not create login mapping for %szCould not set name for %szCould not set MLS range for %sz!Could not set SELinux user for %sz"Could not add login mapping for %s)rG�getseuserbynamer/r1�seluserRecordsrT�getr`rO�semanage_seuser_key_createrWr^r�grpZgetgrnam�pwd�getpwnamZsemanage_seuser_createZsemanage_seuser_set_name�semanage_seuser_set_mlsrange�semanage_seuser_set_sename�semanage_seuser_modify_local�semanage_seuser_key_free�semanage_seuser_free)rr,r#r.�rec�userrecr&rLr0r-�k�urrr �__add sP

 




zloginRecords.__addcCsxyL|j�|j|�r4ttd�|�|j|||�n|j|||�|j�Wn&tk
rr}z
|�WYdd}~XnXdS)Nz:Login mapping for %s is already defined, modifying instead)rb�_loginRecords__existsrxr�_loginRecords__modify�_loginRecords__addr8r^)rr,r#r.�errorrrr r�Vs
zloginRecords.addcCs\t|j|�\}}|dkr(ttd�|��t|j|�\}}|dkrPttd�|��t|�|S)NrzCould not create a key for %sz2Could not check if login mapping for %s is defined)r�rWr^r�semanage_seuser_existsr�)rr,rLr�r~rrr �__existsdszloginRecords.__existsrc
Cs�tj|�\}|_|_|dkr0|dkr0ttd���t|j�}|j|j�\}\}}|dkrj|j|�\}\}}	n|}	|dkr~||_	n||_	t
|j|�\}}
|dkr�ttd�|��t|j|
�\}}|dkr�ttd�|��|s�ttd�|��t
|j|
�\}}|dk�rttd�|��t|�|_t|�|_tdk�rL|dk�rLt|j|t|��|dk�rlt|j||�||_n|j|_t|j|
|�}|dk�r�ttd	�|��t|
�t|�dS)
NrzRequires seuser or serangerzCould not create a key for %sz2Could not check if login mapping for %s is definedz#Login mapping for %s is not definedzCould not query seuser for %sr=z%Could not modify login mapping for %s)rGr�r/r1r^rr�rTr�r.r�rWr�Zsemanage_seuser_query�semanage_seuser_get_mlsrange�semanage_seuser_get_senamer`r�rOr�r#r�r�r�)
rr,r#r.r�r�r&rLr0r-r�r~r�rrr �__modifypsF





zloginRecords.__modifycCsNy"|j�|j|||�|j�Wn&tk
rH}z
|�WYdd}~XnXdS)N)rbr�r8r^)rr,r#r.r�rrr �modify�szloginRecords.modifyc
Cs*tj|�\}|_|_t|j�}|j|j�\}\}}t|j|�\}}|dkrZt	t
d�|��t|j|�\}}|dkr�t	t
d�|��|s�t	t
d�|��t|j|�\}}|dkr�t	t
d�|��|s�t	t
d�|��t
|j|�}|dkr�t	t
d�|��t|�tjd�\}|_|_|j|j�\}\}}	dS)NrzCould not create a key for %sz2Could not check if login mapping for %s is definedz#Login mapping for %s is not definedz<Login mapping for %s is defined in policy, cannot be deletedz%Could not delete login mapping for %sZ__default__)rGr�r/r1r�rTr�r�rWr^rr�Zsemanage_seuser_exists_localZsemanage_seuser_del_localr�r#r.)
rr,r�r�r&rLr0r�r~r-rrr �__delete�s,
zloginRecords.__deletecCsJy|j�|j|�|j�Wn&tk
rD}z
|�WYdd}~XnXdS)N)rb�_loginRecords__deleter8r^)rr,r�rrr r��s
zloginRecords.deletecCs~t|j�\}}|dkr"ttd���y0|j�x|D]}|jt|��q2W|j�Wn&tk
rx}z
|�WYdd}~XnXdS)NrzCould not list login mappings)�semanage_seuser_list_localrWr^rrbr��semanage_seuser_get_namer8)rrL�ulistr�r�rrr ra�s
zloginRecords.deleteallc
Cs�i}tj�d|_x�tj|j�D]�\}}}||jkr xj|D]b}yHt|d|�}|j�j�jd�}|j	�|d|d|df||<Wq:t
k
r�Yq:Xq:Wq W|S)Nz/logins�/�:r=rwr)rGZselinux_policy_root�logins_pathr|�walk�open�read�rstripr��close�
IndexError)r�ddictr}�dirs�filesr,�fdr�rrr �get_all_logins�s

zloginRecords.get_all_loginsrcCspi}|rt|j�\}|_nt|j�\}|_|dkr>ttd���x,|jD]"}t|�}t|�t|�df||<qFW|S)NrzCould not list login mappingsr)	r�rWr�Zsemanage_seuser_listr^rr�r�r�)rrzr�rLr�r,rrr rr�szloginRecords.get_allcCstg}|jd�}x`t|j��D]P}||drR|jd||d||d|f�q|jd||d|f�qW|S)NTr=z-a -s %s -r '%s' %srz-a -s %s %s)rrr��keysr')rrr�r�rrr rd�s
&zloginRecords.customizedr=c	CsN|j|�}|j�}t|j��}t|j��}t|�dkrFt|�dkrFdStdk�r|rxtdtd�td�td�td�f�x8|D]0}||}td||dt|d�|d	f�q~Wt|�r�td
|j	�x�|D]0}||}td||dt|d�|d	f�q�WnF|�r"tdtd�td�f�x&|D]}td|||df��q(WdS)
Nrr=z
%-20s %-20s %-20s %s
z
Login NamezSELinux Userz
MLS/MCS RangeZServicez%-20s %-20s %-20s %srwz
Local customization in %sz
%-25s %-25s
z%-25s %-25s)
rrr�r�r�rHr`rxrrNr�)	rryrzr�ZldictZlkeysr�r�r�rrr r{s*

$
(
*
zloginRecords.list)N)rr)rr)r)r=r)r9r:r;r!r�r�r�r�r�r�r�rar�rrrdr{rrrr r�s
6
2
	


r�c@s�eZdZddd�Zdd�Zdd�Zdd	�Zd
d�Zgdddfd
d�Zgdddfdd�Z	dd�Z
dd�Zdd�Zd dd�Z
dd�Zd!dd�ZdS)"r�NcCstj||�dS)N)rPr!)rrTrrr r!"szseluserRecords.__init__cCs�t|j|�\}}|dkr(ttd�|��t|j|�\}}|dkrPttd�|��t|j|�\}}|dkrxttd�|��t|�}t|j|�}t|�t	|�||fS)NrzCould not create a key for %sz-Could not check if SELinux user %s is definedzCould not query user for %s)
�semanage_user_key_createrWr^r�semanage_user_exists�semanage_user_query�semanage_user_get_mlsrange�semanage_user_get_roles�semanage_user_key_free�semanage_user_free)rr,rLr�r~r�r.r-rrr r�%szseluserRecords.getcCstdkr4|dkrd}nt|�}|dkr,d}nt|�}t|�dkrPttd�|��t|j|�\}}|dkrxttd�|��t|j�\}}|dkr�ttd�|��t|j||�}|dkr�ttd�|��x6|D].}	t	|j||	�}|dkr�ttd	�|	|f��q�Wtdk�rVt
|j||�}|dk�r.ttd
�|��t|j||�}|dk�rVttd�|��t|j||�}|dk�r�ttd�|	|f��t
|j|�\}}
|dk�r�ttd
�|��t|j||�}|dk�r�ttd�|��t|�t|�|jjd|dj|�|d�dS)Nr=r�s0z%You must add at least one role for %srzCould not create a key for %sz$Could not create SELinux user for %szCould not set name for %szCould not add role %s for %szCould not set MLS range for %szCould not set MLS level for %szCould not add prefix %s for %szCould not extract key for %szCould not add SELinux user %s�seuserr$)r#r-r.)r`rOrHr^rr�rWZsemanage_user_createZsemanage_user_set_name�semanage_user_add_role�semanage_user_set_mlsrange�semanage_user_set_mlslevel�semanage_user_set_prefixZsemanage_user_key_extract�semanage_user_modify_localr�r�rYr3r�)rr,�roles�selevelr.�prefixrLr�r��rrirrr r�5sR






zseluserRecords.__addcCs�yT|j�|j|�r8ttd�|�|j|||||�n|j|||||�|j�Wn2tk
r�}z|jjd�|�WYdd}~XnXdS)Nz5SELinux user %s is already defined, modifying insteadr)	rb�_seluserRecords__existsrxr�_seluserRecords__modify�_seluserRecords__addr8r^rY)rr,r�r�r.r�r�rrr r�ls
zseluserRecords.addcCs\t|j|�\}}|dkr(ttd�|��t|j|�\}}|dkrPttd�|��t|�|S)NrzCould not create a key for %sz-Could not check if SELinux user %s is defined)r�rWr^rr�r�)rr,rLr�r~rrr r�yszseluserRecords.__existsrc	Cs@d}d}dj|�}|dkrXt|�dkrX|dkrX|dkrXtdkrLttd���nttd���t|j|�\}	}
|	dkr�ttd�|��t|j|
�\}	}|	dkr�ttd�|��|s�ttd	�|��t|j|
�\}	}|	dkr�ttd
�|��t	|�}t
|j|�\}	}
|	dk�rdj|
�}tdk�r6|dk�r6t|j|t|��tdk�r\|dk�r\t
|j|t|��|dk�rtt|j||�t|�dk�r�x"|
D]}||k�r�t||��q�Wx&|D]}||
k�r�t|j||��q�Wt|j|
|�}	|	dk�r�ttd�|��t|
�t|�dj|j��}dj|j��}|jjd
||||||d�dS)Nrr�rr=z&Requires prefix, roles, level or rangezRequires prefix or roleszCould not create a key for %sz-Could not check if SELinux user %s is definedzSELinux user %s is not definedzCould not query user for %sz Could not modify SELinux user %sr$r�)r#r/r-r.r0r1)r�rHr`r^rr�rWr�r�r�r�r�rOr�r�Zsemanage_user_del_roler�r�r�r�r�rYr3)rr,r�r�r.r�r0r1ZnewrolesrLr�r~r��rlistr�r%rrr r��sV
$







zseluserRecords.__modifycCs^y&|j�|j|||||�|j�Wn2tk
rX}z|jjd�|�WYdd}~XnXdS)Nr)rbr�r8r^rY)rr,r�r�r.r�r�rrr r��szseluserRecords.modifyc	Cs8t|j|�\}}|dkr(ttd�|��t|j|�\}}|dkrPttd�|��|sdttd�|��t|j|�\}}|dkr�ttd�|��|s�ttd�|��t|j|�\}}|dkr�ttd�|��t|�}t|j|�\}}dj	|�}t
|j|�}|dk�rttd�|��t|�t|�|j
jd	|||d
�dS)NrzCould not create a key for %sz-Could not check if SELinux user %s is definedzSELinux user %s is not definedz7SELinux user %s is defined in policy, cannot be deletedzCould not query user for %sr$z Could not delete SELinux user %sr�)r/r1r0)r�rWr^rr�Zsemanage_user_exists_localr�r�r�r�Zsemanage_user_del_localr�r�rYr4)	rr,rLr�r~r�r1r�r0rrr r��s2

zseluserRecords.__deletecCsVy|j�|j|�|j�Wn2tk
rP}z|jjd�|�WYdd}~XnXdS)Nr)rb�_seluserRecords__deleter8r^rY)rr,r�rrr r��s
zseluserRecords.deletecCs�t|j�\}}|dkr"ttd���y0|j�x|D]}|jt|��q2W|j�Wn2tk
r�}z|jjd�|�WYdd}~XnXdS)NrzCould not list login mappings)	�semanage_user_list_localrWr^rrbr��semanage_user_get_namer8rY)rrLr�r�r�rrr ra�s
zseluserRecords.deleteallrcCs�i}|rt|j�\}|_nt|j�\}|_|dkr>ttd���xh|jD]^}t|�}t|j|�\}}|dkrzttd�|��dj|�}t	|�t
|�t|�|f|t|�<qFW|S)NrzCould not list SELinux usersz Could not list roles for user %sr�)r�rWr�Zsemanage_user_listr^rr�r�r�Zsemanage_user_get_prefixZsemanage_user_get_mlslevelr�)rrzr�rLr�r,r�r�rrr rr�s
$zseluserRecords.get_allcCs�g}|jd�}xvt|j��D]f}||ds8||drh|jd||d||d||d|f�q|jd||d|f�qW|S)NTr=rwz-a -L %s -r %s -R '%s' %srz
-a -R '%s' %s)rrr�r�r')rrr�r�rrr rds
0zseluserRecords.customizedr=c	Cs|j|�}t|�dkrdSt|j��}tdkr�|r|tddtd�td�td�f�tdtd�td	�td
�td�td�f�x�|D]B}td
|||dt||d�t||d�||df�q�WnB|r�tdtd�td�f�x$|D]}td|||df�q�WdS)Nrr=z
%-15s %-10s %-10s %-30srZLabelingzMLS/z%-15s %-10s %-10s %-30s %s
zSELinux UserZPrefixz	MCS Levelz	MCS Rangez
SELinux Rolesz%-15s %-10s %-10s %-30s %srwrz	%-15s %s
z%-15s %s)rrrHr�r�r`rxrrN)rryrzr�r�r�rrr r{s
 *
D
zseluserRecords.list)N)r)r=r)r9r:r;r!r�r�r�r�r�r�r�r�rarrrdr{rrrr r� s
7
8	!


r�c@s�eZdZgZd dd�Zdd�Zdd�Zdd	�Zd
d�Zdd
�Z	dd�Z
dd�Zdd�Zdd�Z
d!dd�Zd"dd�Zdd�Zd#dd�ZdS)$�portRecordsNcCsJtj||�y$tttjtjd��dd�|_Wntk
rDYnXdS)NZ	port_typer�types)rPr!r{rZr��	ATTRIBUTE�valid_types�RuntimeError)rrTrrr r!4s
$zportRecords.__init__c
Cs�ttttd�}||j�kr$||}nttd���|dkrDttd���|jd�}t|�dkrlt	|d�}}nt	|d�}t	|d�}|dkr�ttd	���t
|j|||�\}}	|dkr�ttd
�||f��|	|||fS)N)ZtcpZudpZsctpZdccpz0Protocol has to be one of udp, tcp, dccp or sctprzPort is requiredr"r=ri��zInvalid Portz Could not create a key for %s/%s)ZSEMANAGE_PROTO_TCPZSEMANAGE_PROTO_UDPZSEMANAGE_PROTO_SCTPZSEMANAGE_PROTO_DCCPr�r^rr�rH�intZsemanage_port_key_createrW)
r�port�protoZ	protocols�proto_dZports�high�lowrLr�rrr �__genkey;s(

zportRecords.__genkeycCs,tdkr|dkrd}nt|�}|dkr2ttd���tj|�}||jkrVttd�|��|j||�\}}}}t|j	�\}	}
|	dkr�ttd�||f��t
|
|�t|
||�t|j	�\}	}|	dkr�ttd�||f��t
|j	|d	�}	|	dkr�ttd
�||f��t|j	|d�}	|	dk�r*ttd�||f��t|j	||�}	|	dk�rVttd
�||f��tdk�r�|dk�r�t|j	||�}	|	dk�r�ttd�||f��t|j	|
|�}	|	dk�r�ttd�||f��t|j	||
�}	|	dk�r�ttd�||f��t|�t|�t|
�|jjd|tj|�d	d||f�dS)Nr=rr�zType is requiredz'Type %s is invalid, must be a port typerzCould not create port for %s/%sz"Could not create context for %s/%s�system_uz,Could not set user in port context for %s/%s�object_rz,Could not set role in port context for %s/%sz,Could not set type in port context for %s/%sz2Could not set mls fields in port context for %s/%sz$Could not set port context for %s/%szCould not add port %s/%sz8resrc=port op=add lport=%s proto=%s tcontext=%s:%s:%s:%s)r`rOr^rrZ�get_real_type_namer��_portRecords__genkeyZsemanage_port_createrWZsemanage_port_set_protoZsemanage_port_set_range�semanage_context_create�semanage_context_set_user�semanage_context_set_role�semanage_context_set_type�semanage_context_set_mlsZsemanage_port_set_con�semanage_port_modify_local�semanage_context_free�semanage_port_key_free�semanage_port_freerYr6r
�getprotobyname)rr�r�r.rSr�r�r�r�rLr�conrrr r�WsR







zportRecords.__addcCsX|j�|j||�r<ttd�j||d��|j||||�n|j||||�|j�dS)Nz6Port {proto}/{port} already defined, modifying instead)r�r�)rb�_portRecords__existsrxr�format�_portRecords__modify�_portRecords__addr8)rr�r�r.rSrrr r��szportRecords.addc	CsN|j||�\}}}}t|j|�\}}|dkrBttd�j||d���t|�|S)Nrz1Could not check if port {proto}/{port} is defined)r�r�)r��semanage_port_existsrWr^rr�r�)	rr�r�r�r�r�r�rLr~rrr r��szportRecords.__existsc
Cs�|dkr2|dkr2tdkr&ttd���nttd���tj|�}|rZ||jkrZttd�|��|j||�\}}}}t|j|�\}	}
|	dkr�ttd�||f��|
s�ttd�||f��t	|j|�\}	}|	dkr�ttd	�||f��t
|�}tdk�r|dk�rd
}nt|j|t|��|dk�r*t
|j||�t|j||�}	|	dk�rVttd�||f��t|�t|�|jjd|tj|�d
d||f�dS)Nrr=zRequires setype or serangezRequires setypez'Type %s is invalid, must be a port typerz(Could not check if port %s/%s is definedzPort %s/%s is not definedzCould not query port %s/%sr�zCould not modify port %s/%sz;resrc=port op=modify lport=%s proto=%s tcontext=%s:%s:%s:%sr�r�)r`r^rrZr�r�r�r�rWZsemanage_port_query�semanage_port_get_conr�rOr�r�r�r�rYr6r
r�)
rr�r�r.�setyper�r�r�r�rLr~rr�rrr r��s:




zportRecords.__modifycCs$|j�|j||||�|j�dS)N)rbr�r8)rr�r�r.rrrr r��szportRecords.modifycCs�t|j�\}}|dkr"ttd���|j�x�|D]�}t|�}t|�}t|�}t|�}d||f}|j	||�\}	}
}}|dkr�ttd�|��t
|j|	�}|dkr�ttd�|��t|	�||kr�|}|jj
d|tj|�f�q0W|j�dS)NrzCould not list the portsz%s-%szCould not create a key for %szCould not delete the port %sz&resrc=port op=delete lport=%s proto=%s)�semanage_port_list_localrWr^rrb�semanage_port_get_proto�semanage_port_get_proto_str�semanage_port_get_low�semanage_port_get_highr��semanage_port_del_localr�rYr6r
r�r8)rrL�plistr�r��	proto_strr�r�Zport_strr�r�rrr ra�s*
zportRecords.deleteallc	Cs�|j||�\}}}}t|j|�\}}|dkr@ttd�||f��|sXttd�||f��t|j|�\}}|dkr�ttd�||f��|s�ttd�||f��t|j|�}|dkr�ttd�||f��t|�|jj	d|t
j|�f�dS)Nrz(Could not check if port %s/%s is definedzPort %s/%s is not definedz2Port %s/%s is defined in policy, cannot be deletedzCould not delete port %s/%sz&resrc=port op=delete lport=%s proto=%s)r�r�rWr^rZsemanage_port_exists_localrr�rYr6r
r�)	rr�r�r�r�r�r�rLr~rrr r��s zportRecords.__deletecCs |j�|j||�|j�dS)N)rb�_portRecords__deleter8)rr�r�rrr r�szportRecords.deletercCs�i}|rt|j�\}|_nt|j�\}|_|dkr>ttd���xX|jD]N}t|�}t|�}t|�}t	|�}t
|�}	t|�}
t|�}||f||
||	f<qFW|S)NrzCould not list ports)
rrWr�semanage_port_listr^rr�semanage_context_get_type�semanage_context_get_mlsrrrr)rrzr�rLr�r��ctype�levelr�r	r�r�rrr rrs zportRecords.get_allcCs�i}|rt|j�\}|_nt|j�\}|_|dkr>ttd���x�|jD]�}t|�}t|�}t|�}t	|�}t
|�}	t|�}
||f|j�kr�g|||f<|	|
kr�|||fj
d|	�qF|||fj
d|	|
f�qFW|S)NrzCould not list portsz%dz%d-%d)rrWrrr^rrrrrrrr�r')rrzr�rLr�r�rr�r	r�r�rrr �get_all_by_type s&zportRecords.get_all_by_typecCs�g}|jd�}x�t|j��D]�}|d|dkr8|dnd|d|df}||dr�|jd||d||d|d|f�q|jd||d|d|f�qW|S)NTrr=z%s-%sz-a -t %s -r '%s' -p %s %srwz-a -t %s -p %s %s)rrr�r�r')rrr�r�r�rrr rd8s
,,$zportRecords.customizedr=cCs�|j|�}t|�dkrdSt|j��}|rHtdtd�td�td�f�xV|D]N}d|}|d||d7}x$||dd�D]}|d	|7}q�Wt|�qNWdS)
Nrz%-30s %-8s %s
zSELinux Port TypeZProtozPort Numberz%-30s %-8s z%sr=z, %s)rrHr�r�rxr)rryrzr�r�ror�rrrr r{Cs

zportRecords.list)N)r)r)r=r)r9r:r;r�r!r�r�r�r�r�r�rar
r�rrrrdr{rrrr r�0s
:	
*

r�c@s�eZdZgZd dd�Zdd�Zdd�Zdd	�Zd
d�Zdd
�Z	dd�Z
dd�Zdd�Zdd�Z
d!dd�Zd"dd�Zdd�Zd#dd�ZdS)$�
ibpkeyRecordsNc
CsXtj||�y:tjtjtj|j��dgd�}tdd�|j	�D��|_
WnYnXdS)NZibpkey_type)�attrscss|]}t|�VqdS)N)r*)rsrgrrr �	<genexpr>Zsz)ibpkeyRecords.__init__.<locals>.<genexpr>)rPr!�setools�	TypeQuery�
SELinuxPolicyrZ�get_store_policyrRr��resultsr�)rrT�qrrr r!VszibpkeyRecords.__init__cCs�|dkrttd���|jd�}t|�dkr>t|dd�}}nt|dd�}t|dd�}|dkrnttd���t|j|||�\}}|dkr�ttd�||f��||||fS)	NrzSubnet Prefix is requiredr"r=ri��zInvalid Pkeyz Could not create a key for %s/%s)r^rr�rHr�Zsemanage_ibpkey_key_createrW)r�pkey�
subnet_prefixZpkeysr�r�rLr�rrr r�^s
zibpkeyRecords.__genkeycCstdkr|dkrd}nt|�}|dkr2ttd���tj|�}||jkrVttd�|��|j||�\}}}}t|j	�\}}	|dkr�ttd�||f��t
|j	|	|�t|	||�t|j	�\}}
|dkr�ttd�||f��t
|j	|
d	�}|dk�rttd
�||f��t|j	|
d�}|dk�r0ttd�||f��t|j	|
|�}|dk�r\ttd
�||f��tdk�r�|dk�r�t|j	|
|�}|dk�r�ttd�||f��t|j	|	|
�}|dk�r�ttd�||f��t|j	||	�}|dk�r�ttd�||f��t|
�t|�t|	�dS)Nr=rr�zType is requiredz)Type %s is invalid, must be a ibpkey typerz!Could not create ibpkey for %s/%sz"Could not create context for %s/%sr�z.Could not set user in ibpkey context for %s/%sr�z.Could not set role in ibpkey context for %s/%sz.Could not set type in ibpkey context for %s/%sz4Could not set mls fields in ibpkey context for %s/%sz&Could not set ibpkey context for %s/%szCould not add ibpkey %s/%s)r`rOr^rrZr�r��_ibpkeyRecords__genkeyZsemanage_ibpkey_createrWZ!semanage_ibpkey_set_subnet_prefixZsemanage_ibpkey_set_ranger�r�r�r�r�Zsemanage_ibpkey_set_con�semanage_ibpkey_modify_localr��semanage_ibpkey_key_free�semanage_ibpkey_free)rrrr.rSr�r�r�rLrr�rrr r�qsP







zibpkeyRecords.__addcCsX|j�|j||�r<ttd�j||d��|j||||�n|j||||�|j�dS)Nz@ibpkey {subnet_prefix}/{pkey} already defined, modifying instead)rr)rb�_ibpkeyRecords__existsrxrr��_ibpkeyRecords__modify�_ibpkeyRecords__addr8)rrrr.rSrrr r��szibpkeyRecords.addcCsN|j||�\}}}}t|j|�\}}|dkrBttd�j||d���t|�|S)Nrz;Could not check if ibpkey {subnet_prefix}/{pkey} is defined)rr)r�semanage_ibpkey_existsrWr^rZformnatr)rrrr�r�r�rLr~rrr r��szibpkeyRecords.__existscCsb|dkr2|dkr2tdkr&ttd���nttd���tj|�}|rZ||jkrZttd�|��|j||�\}}}}t|j|�\}}	|dkr�ttd�||f��|	s�ttd�||f��t	|j|�\}}
|dkr�ttd	�||f��t
|
�}tdko�|dk�r
t|j|t|��|dk�r"t
|j||�t|j||
�}|dk�rNttd
�||f��t|�t|
�dS)Nrr=zRequires setype or serangezRequires setypez)Type %s is invalid, must be a ibpkey typerz*Could not check if ibpkey %s/%s is definedzibpkey %s/%s is not definedzCould not query ibpkey %s/%szCould not modify ibpkey %s/%s)r`r^rrZr�r�rr#rWZsemanage_ibpkey_query�semanage_ibpkey_get_conr�rOr�rrr)rrrr.rr�r�r�rLr~rr�rrr r��s4


zibpkeyRecords.__modifycCs$|j�|j||||�|j�dS)N)rbr!r8)rrrr.rrrr r��szibpkeyRecords.modifyc	Cs�t|j�\}}|dkr"ttd���|j�x�|D]�}t|j|�\}}t|�}t|�}d||f}|j||�\}}}}|dkr�ttd�|��t	|j|�}|dkr�ttd�|��t
|�q0W|j�dS)NrzCould not list the ibpkeysz%s-%szCould not create a key for %szCould not delete the ibpkey %s)�semanage_ibpkey_list_localrWr^rrb�!semanage_ibpkey_get_subnet_prefix�semanage_ibpkey_get_low�semanage_ibpkey_get_highr�semanage_ibpkey_del_localrr8)	rrLr�ibpkeyrr�r�Zpkey_strr�rrr ra�s"
zibpkeyRecords.deleteallcCs�|j||�\}}}}t|j|�\}}|dkr@ttd�||f��|sXttd�||f��t|j|�\}}|dkr�ttd�||f��|s�ttd�||f��t|j|�}|dkr�ttd�||f��t|�dS)Nrz*Could not check if ibpkey %s/%s is definedzibpkey %s/%s is not definedz4ibpkey %s/%s is defined in policy, cannot be deletedzCould not delete ibpkey %s/%s)rr#rWr^rZsemanage_ibpkey_exists_localr)r)rrrr�r�r�rLr~rrr r��szibpkeyRecords.__deletecCs |j�|j||�|j�dS)N)rb�_ibpkeyRecords__deleter8)rrrrrr r�szibpkeyRecords.deletercCs�i}|rt|j�\}|_nt|j�\}|_|dkr>ttd���xb|jD]X}t|�}t|�}|dkrdqFt|�}t	|j|�\}}t
|�}	t|�}
||f||	|
|f<qFW|S)NrzCould not list ibpkeysZreserved_ibpkey_t)r%rWr�semanage_ibpkey_listr^rr$rr
r&r'r()rrzr�rLr*r�rrrr�r�rrr rrs"zibpkeyRecords.get_allc
Cs�i}|rt|j�\}|_nt|j�\}|_|dkr>ttd���x�|jD]�}t|�}t|�}t|j|�\}}t	|�}t
|�}	||f|j�kr�g|||f<||	kr�|||fjd|�qF|||fjd||	f�qFW|S)NrzCould not list ibpkeysz0x%xz	0x%x-0x%x)
r%rWrr,r^rr$rr&r'r(r�r')
rrzr�rLr*r�rrr�r�rrr r,s$zibpkeyRecords.get_all_by_typecCs�g}|jd�}x�t|j��D]�}|d|dkr8|dnd|d|df}||dr�|jd||d||d|d|f�q|jd||d|d|f�qW|S)NTrr=z%s-%sz-a -t %s -r '%s' -x %s %srwz-a -t %s -x %s %s)rrr�r�r')rrr�r�r�rrr rdCs
,,$zibpkeyRecords.customizedr=cCs�|j|�}|j�}t|�dkr"dS|rDtdtd�td�td�f�xZt|�D]N}d|}|d||d7}x$||dd�D]}|d	|7}q�Wt|�qNWdS)
Nrz%-30s %-18s %s
zSELinux IB Pkey TypeZ
Subnet_PrefixzPkey Numberz%-30s %-18s z%sr=z, %s)rr�rHrxrr�)rryrzr�r�ror�rrrr r{Os
zibpkeyRecords.list)N)r)r)r=r)r9r:r;r�r!rr"r�r r!r�rar+r�rrrrdr{rrrr rRs
8	
&

rc@s�eZdZgZd dd�Zdd�Zdd�Zdd	�Zd
d�Zdd
�Z	dd�Z
dd�Zdd�Zdd�Z
d!dd�Zd"dd�Zdd�Zd#dd�ZdS)$�ibendportRecordsNc
CsXtj||�y:tjtjtj|j��dgd�}tdd�|j	�D��|_
WnYnXdS)NZibendport_type)rcss|]}t|�VqdS)N)r*)rsrgrrr rfsz,ibendportRecords.__init__.<locals>.<genexpr>)rPr!rrrrZrrR�setrr�)rrTrrrr r!bszibendportRecords.__init__cCsp|dkrttd���t|�}|dks,|dkr8ttd���t|j||�\}}|dkrfttd�||f��|||fS)NrzIB device name is required�r=zInvalid Port Numberrz*Could not create a key for ibendport %s/%s)r^rr�Zsemanage_ibendport_key_createrW)r�	ibendport�
ibdev_namer�rLr�rrr r�jszibendportRecords.__genkeyc
Cs
tdkr|dkrd}nt|�}|dkr2ttd���tj|�}||jkrVttd�|��|j||�\}}}t|j	�\}}|dkr�ttd�||f��t
|j	||�t||�t|j	�\}}	|dkr�ttd�||f��t
|j	|	d	�}|dkr�ttd
�||f��t|j	|	d�}|dk�r*ttd�||f��t|j	|	|�}|dk�rVttd
�||f��tdk�r�|dk�r�t|j	|	|�}|dk�r�ttd�||f��t|j	||	�}|dk�r�ttd�||f��t|j	||�}|dk�r�ttd�||f��t|	�t|�t|�dS)Nr=rr�zType is requiredz-Type %s is invalid, must be an ibendport typerz$Could not create ibendport for %s/%sz"Could not create context for %s/%sr�z1Could not set user in ibendport context for %s/%sr�z1Could not set role in ibendport context for %s/%sz1Could not set type in ibendport context for %s/%sz7Could not set mls fields in ibendport context for %s/%sz)Could not set ibendport context for %s/%szCould not add ibendport %s/%s)r`rOr^rrZr�r��_ibendportRecords__genkeyZsemanage_ibendport_createrWZ!semanage_ibendport_set_ibdev_nameZsemanage_ibendport_set_portr�r�r�r�r�Zsemanage_ibendport_set_con�semanage_ibendport_modify_localr��semanage_ibendport_key_free�semanage_ibendport_free)
rr0r1r.rSr�r�rLrr�rrr r�xsP







zibendportRecords.__addcCsX|j�|j||�r<ttd�j|td��|j||||�n|j||||�|j�dS)Nz@ibendport {ibdev_name}/{port} already defined, modifying instead)r1r�)	rb�_ibendportRecords__existsrxrr�r��_ibendportRecords__modify�_ibendportRecords__addr8)rr0r1r.rSrrr r��szibendportRecords.addcCsL|j||�\}}}t|j|�\}}|dkr@ttd�j||d���t|�|S)Nrz;Could not check if ibendport {ibdev_name}/{port} is defined)r1r�)r2�semanage_ibendport_existsrWr^rr�r4)rr0r1r�r�rLr~rrr r��szibendportRecords.__existscCs`|dkr2|dkr2tdkr&ttd���nttd���tj|�}|rZ||jkrZttd�|��|j||�\}}}t|j|�\}}|dkr�ttd�||f��|s�ttd�||f��t	|j|�\}}	|dkr�ttd	�||f��t
|	�}
tdko�|dk�rt|j|
t|��|dk�r t
|j|
|�t|j||	�}|dk�rLttd
�||f��t|�t|	�dS)Nrr=zRequires setype or serangezRequires setypez-Type %s is invalid, must be an ibendport typerz-Could not check if ibendport %s/%s is definedzibendport %s/%s is not definedzCould not query ibendport %s/%sz Could not modify ibendport %s/%s)r`r^rrZr�r�r2r9rWZsemanage_ibendport_query�semanage_ibendport_get_conr�rOr�r3r4r5)rr0r1r.rr�r�rLr~rr�rrr r��s4


zibendportRecords.__modifycCs$|j�|j||||�|j�dS)N)rbr7r8)rr0r1r.rrrr r��szibendportRecords.modifycCs�t|j�\}}|dkr"ttd���|j�x�|D]~}t|j|�\}}t|�}|jt|�|�\}}}|dkr~ttd�t	|f��t
|j|�}|dkr�ttd�||f��t|�q0W|j�dS)NrzCould not list the ibendportsz Could not create a key for %s/%dz$Could not delete the ibendport %s/%d)
�semanage_ibendport_list_localrWr^rrb�!semanage_ibendport_get_ibdev_name�semanage_ibendport_get_portr2r*Z	ibdevname�semanage_ibendport_del_localr4r8)rrLrr0r1r�r�rrr ra�s
zibendportRecords.deleteallcCs�|j||�\}}}t|j|�\}}|dkr>ttd�||f��|sVttd�||f��t|j|�\}}|dkr�ttd�||f��|s�ttd�||f��t|j|�}|dkr�ttd�||f��t|�dS)Nrz-Could not check if ibendport %s/%s is definedzibendport %s/%s is not definedz7ibendport %s/%s is defined in policy, cannot be deletedz Could not delete ibendport %s/%s)r2r9rWr^rZsemanage_ibendport_exists_localr>r4)rr0r1r�r�rLr~rrr r�szibendportRecords.__deletecCs |j�|j||�|j�dS)N)rb�_ibendportRecords__deleter8)rr0r1rrr r�szibendportRecords.deleterc
Cs�i}|rt|j�\}|_nt|j�\}|_|dkr>ttd���xX|jD]N}t|�}t|�}|dkrdqFt|�}t	|j|�\}}t
|�}	||f||	|f<qFW|S)NrzCould not list ibendportsZreserved_ibendport_t)r;rWr�semanage_ibendport_listr^rr:rr
r<r=)
rrzr�rLr0r�rrr1r�rrr rrs zibendportRecords.get_allc	Cs�i}|rt|j�\}|_nt|j�\}|_|dkr>ttd���xh|jD]^}t|�}t|�}t|j|�\}}t	|�}||f|j
�kr�g|||f<|||fjd|�qFW|S)NrzCould not list ibendportsz0x%x)r;rWrr@r^rr:rr<r=r�r')	rrzr�rLr0r�rr1r�rrr r/sz ibendportRecords.get_all_by_typecCs�g}|jd�}xtt|j��D]d}||dr\|jd||d||d|d|df�q|jd||d|d|df�qW|S)NTr=z-a -t %s -r '%s' -z %s %srz-a -t %s -z %s %s)rrr�r�r')rrr�r�rrr rdBs
0(zibendportRecords.customizedr=cCs�|j|�}|j�}t|�dkr"dS|rDtdtd�td�td�f�xZt|�D]N}d|}|d||d7}x$||dd�D]}|d	|7}q�Wt|�qNWdS)
Nrz%-30s %-18s %s
zSELinux IB End Port TypezIB Device NamezPort Numberz%-30s %-18s z%sr=z, %s)rr�rHrxrr�)rryrzr�r�ror�rrrr r{Ms
zibendportRecords.list)N)r)r)r=r)r9r:r;r�r!r2r8r�r6r7r�rar?r�rrrrdr{rrrr r-^s
7	
&

r-c@s~eZdZgZddd�Zdd�Zdd�Zdd	�Zd
d�Zdd
�Z	dd�Z
dd�Zdd�Zdd�Z
ddd�Zdd�Zd dd�ZdS)!�nodeRecordsNcCsTtj||�ddg|_y$tttjtjd��dd�|_Wntk
rNYnXdS)NZipv4Zipv6Z	node_typerr�)	rPr!�protocolr{rZr�r�r�r�)rrTrrr r!`s
$znodeRecords.__init__c	Cs�|}|}d}|dkr ttd���t|�dks8|ddkrztj||�}t|j�}t|j�}|dkrp|jdkrpd}d|j}y|j	j
|�}Wnttd	���YnX|||fS)
NrzNode Address is requiredrr�z0.0.0.0�z::zipv%dzUnknown or missing protocol)r^rrH�	ipaddressZ
ip_networkr*Znetwork_addressZnetmask�versionrB�index)r�addr�maskrBZnewaddrZnewmaskZnewprotocolrorrr �validatehs"


znodeRecords.validatec
	Csp|j|||�\}}}tdkr2|dkr*d}nt|�}|dkrFttd���tj|�}||jkrjttd�|��t|j	|||�\}}|dkr�ttd�|��t
|j	�\}}|dkr�ttd�|��t||�t|j	|||�}t
|j	�\}}	|dkr�ttd	�|��t|j	|||�}|dk�r&ttd
�|��t|j	|	d�}|dk�rNttd�|��t|j	|	d
�}|dk�rvttd�|��t|j	|	|�}|dk�r�ttd�|��tdk�r�|dk�r�t|j	|	|�}|dk�r�ttd�|��t|j	||	�}|dk�rttd�|��t|j	||�}|dk�r*ttd�|��t|	�t|�t|�|jjd||tj|j|�dd
||f�dS)Nr=rr�zSELinux node type is requiredz'Type %s is invalid, must be a node typerzCould not create key for %szCould not create addr for %szCould not create context for %szCould not set mask for %sr�z)Could not set user in addr context for %sr�z)Could not set role in addr context for %sz)Could not set type in addr context for %sz/Could not set mls fields in addr context for %sz!Could not set addr context for %szCould not add addr %szCresrc=node op=add laddr=%s netmask=%s proto=%s tcontext=%s:%s:%s:%s)rIr`rOr^rrZr�r��semanage_node_key_createrWZsemanage_node_createZsemanage_node_set_protoZsemanage_node_set_addrr�Zsemanage_node_set_maskr�r�r�r�Zsemanage_node_set_con�semanage_node_modify_localr��semanage_node_key_free�semanage_node_freerYr6r
r�rB)
rrGrHr�r.rrLr��noder�rrr r��s^









znodeRecords.__addcCsX|j�|j|||�r:ttd�|�|j|||||�n|j|||||�|j�dS)Nz*Addr %s already defined, modifying instead)rb�_nodeRecords__existsrxr�_nodeRecords__modify�_nodeRecords__addr8)rrGrHr�r.rrrr r��sznodeRecords.addcCst|j|||�\}}}t|j|||�\}}|dkr@ttd�|��t|j|�\}}|dkrhttd�|��t|�|S)NrzCould not create key for %sz%Could not check if addr %s is defined)rIrJrWr^r�semanage_node_existsrL)rrGrHr�rLr�r~rrr r��sznodeRecords.__existsc	Cs�|j|||�\}}}|dkr0|dkr0ttd���tj|�}|rX||jkrXttd�|��t|j|||�\}}|dkr�ttd�|��t|j|�\}}|dkr�ttd�|��|s�ttd�|��t	|j|�\}}	|dkr�ttd�|��t
|	�}
td	k�r|dk�rt|j|
t
|��|dk�r.t|j|
|�t|j||	�}|dk�rVttd
�|��t|�t|	�|jjd||tj|j|�dd
||f�dS)NrzRequires setype or serangez'Type %s is invalid, must be a node typerzCould not create key for %sz%Could not check if addr %s is definedzAddr %s is not definedzCould not query addr %sr=zCould not modify addr %szFresrc=node op=modify laddr=%s netmask=%s proto=%s tcontext=%s:%s:%s:%sr�r�)rIr^rrZr�r�rJrWrRZsemanage_node_query�semanage_node_get_conr`r�rOr�rKrLrMrYr6r
r�rB)rrGrHr�r.rrLr�r~rNr�rrr r��s8


znodeRecords.__modifycCs&|j�|j|||||�|j�dS)N)rbrPr8)rrGrHr�r.rrrr r�sznodeRecords.modifycCs
|j|||�\}}}t|j|||�\}}|dkr@ttd�|��t|j|�\}}|dkrhttd�|��|s|ttd�|��t|j|�\}}|dkr�ttd�|��|s�ttd�|��t|j|�}|dkr�ttd�|��t|�|j	j
d||tj|j
|�f�dS)NrzCould not create key for %sz%Could not check if addr %s is definedzAddr %s is not definedz/Addr %s is defined in policy, cannot be deletedzCould not delete addr %sz1resrc=node op=delete laddr=%s netmask=%s proto=%s)rIrJrWr^rrRZsemanage_node_exists_localZsemanage_node_del_localrLrYr6r
r�rB)rrGrHr�rLr�r~rrr r�s&znodeRecords.__deletecCs"|j�|j|||�|j�dS)N)rb�_nodeRecords__deleter8)rrGrHr�rrr r�#sznodeRecords.deletecCstt|j�\}}|dkr"ttd���|j�x<|D]4}|jt|j|�dt|j|�d|jt	|��q0W|j
�dS)Nrz!Could not deleteall node mappingsr=)�semanage_node_list_localrWr^rrbrT�semanage_node_get_addr�semanage_node_get_maskrB�semanage_node_get_protor8)rrLZnlistrNrrr ra(s
4znodeRecords.deleteallrc	Cs�i}|rt|j�\}|_nt|j�\}|_|dkr>ttd���xj|jD]`}t|�}t|j|�}t|j|�}|j	t
|�}t|�t|�t
|�t|�f||d|d|f<qFW|S)NrzCould not list addrsr=)rUrW�ilistZsemanage_node_listr^rrSrVrWrBrX�semanage_context_get_user�semanage_context_get_rolerr
)	rrzr�rLrNr�rGrHr�rrr rr2s2znodeRecords.get_allc	Cs�g}|jd�}x�t|j��D]p}||drb|jd|d|d||d||d|df�q|jd|d|d||d|df�qW|S)NTrz-a -M %s -p %s -t %s -r '%s' %sr=rwrz-a -M %s -p %s -t %s %s)rrr�r�r')rrr�r�rrr rdDs
6.znodeRecords.customizedr=cCs|j|�}t|�dkrdSt|j��}|r6tdd�tr�x�|D]r}d}x|D]}|dt|�}qNWtd	|d|d
|d||d||d
||dt||dd
�f�q@WnJxH|D]@}td|d|d
|d||d||d
||df�q�WdS)Nrz%-18s %-18s %-5s %-5s
�
IP Address�Netmask�Protocol�Contextr�	z%-18s %-18s %-5s %s:%s:%s:%s r=rwrFz%-18s %-18s %-5s %s:%s:%s )r\r]r^r_)rrrHr�r�rxr`r*rN)rryrzr�r�r��valZfieldsrrr r{Ns


R
znodeRecords.list)N)r)r=r)r9r:r;r�r!rIrQr�rOrPr�rTr�rarrrdr{rrrr rA\s
B	(


rAc@sreZdZddd�Zdd�Zdd�Zdd	�Zd
d�Zdd
�Zdd�Z	dd�Z
dd�Zddd�Zdd�Z
ddd�ZdS)�interfaceRecordsNcCstj||�dS)N)rPr!)rrTrrr r!cszinterfaceRecords.__init__cCstdkr|dkrd}nt|�}|dkr2ttd���t|j|�\}}|dkrZttd�|��t|j�\}}|dkr�ttd�|��t|j||�}t|j�\}}|dkr�ttd�|��t	|j|d	�}|dkr�ttd
�|��t
|j|d�}|dk�rttd�|��t|j||�}|dk�r*ttd
�|��tdk�rf|dk�rft|j||�}|dk�rfttd�|��t
|j||�}|dk�r�ttd�|��t|j||�}|dk�r�ttd�|��t|j||�}|dk�r�ttd�|��t|�t|�t|�|jjd|d	d||f�dS)Nr=rr�zSELinux Type is requiredrzCould not create key for %sz!Could not create interface for %szCould not create context for %sr�z.Could not set user in interface context for %sr�z.Could not set role in interface context for %sz.Could not set type in interface context for %sz4Could not set mls fields in interface context for %sz&Could not set interface context for %sz$Could not set message context for %szCould not add interface %sz4resrc=interface op=add netif=%s tcontext=%s:%s:%s:%s)r`rOr^r�semanage_iface_key_createrWZsemanage_iface_createZsemanage_iface_set_namer�r�r�r�r�Zsemanage_iface_set_ifconZsemanage_iface_set_msgcon�semanage_iface_modify_localr��semanage_iface_key_free�semanage_iface_freerYr6)r�	interfacer.rrLr��ifacer�rrr r�fsT





zinterfaceRecords.__addcCsL|j�|j|�r2ttd�|�|j|||�n|j|||�|j�dS)Nz/Interface %s already defined, modifying instead)rb�_interfaceRecords__existsrxr�_interfaceRecords__modify�_interfaceRecords__addr8)rrgr.rrrr r��s
zinterfaceRecords.addcCs\t|j|�\}}|dkr(ttd�|��t|j|�\}}|dkrPttd�|��t|�|S)NrzCould not create key for %sz*Could not check if interface %s is defined)rcrWr^r�semanage_iface_existsre)rrgrLr�r~rrr r��szinterfaceRecords.__existsc	Cs>|dkr|dkrttd���t|j|�\}}|dkrDttd�|��t|j|�\}}|dkrlttd�|��|s�ttd�|��t|j|�\}}|dkr�ttd�|��t|�}tdkr�|dkr�t|j|t	|��|dkr�t
|j||�t|j||�}|dk�rttd	�|��t|�t
|�|jjd
|dd||f�dS)
NrzRequires setype or serangerzCould not create key for %sz*Could not check if interface %s is definedzInterface %s is not definedzCould not query interface %sr=zCould not modify interface %sz7resrc=interface op=modify netif=%s tcontext=%s:%s:%s:%sr�r�)r^rrcrWrlZsemanage_iface_query�semanage_iface_get_ifconr`r�rOr�rdrerfrYr6)	rrgr.rrLr�r~rhr�rrr r��s0
zinterfaceRecords.__modifycCs"|j�|j|||�|j�dS)N)rbrjr8)rrgr.rrrr r��szinterfaceRecords.modifycCs�t|j|�\}}|dkr(ttd�|��t|j|�\}}|dkrPttd�|��|sdttd�|��t|j|�\}}|dkr�ttd�|��|s�ttd�|��t|j|�}|dkr�ttd�|��t|�|jj	d|�dS)NrzCould not create key for %sz*Could not check if interface %s is definedzInterface %s is not definedz4Interface %s is defined in policy, cannot be deletedzCould not delete interface %sz"resrc=interface op=delete netif=%s)
rcrWr^rrlZsemanage_iface_exists_localZsemanage_iface_del_localrerYr6)rrgrLr�r~rrr r��s$zinterfaceRecords.__deletecCs|j�|j|�|j�dS)N)rb�_interfaceRecords__deleter8)rrgrrr r��s
zinterfaceRecords.deletecCsRt|j�\}}|dkr"ttd���|j�x|D]}|jt|��q0W|j�dS)Nrz(Could not delete all interface  mappings)�semanage_iface_list_localrWr^rrbrn�semanage_iface_get_namer8)rrLr�rorrr ra�s
zinterfaceRecords.deleteallrcCs~i}|rt|j�\}|_nt|j�\}|_|dkr>ttd���x:|jD]0}t|�}t|�t|�t	|�t
|�f|t|�<qFW|S)NrzCould not list interfaces)rorWrYZsemanage_iface_listr^rrmrZr[rr
rp)rrzr�rLrgr�rrr rr	s(zinterfaceRecords.get_allcCstg}|jd�}x`t|j��D]P}||drR|jd||d||d|f�q|jd||d|f�qW|S)NTrz-a -t %s -r '%s' %srwz-a -t %s %s)rrr�r�r')rrr�r�rrr rd	s
&zinterfaceRecords.customizedr=c
Cs�|j|�}t|�dkrdSt|j��}|rBtdtd�td�f�tr�x�|D]@}td|||d||d||dt||dd	�f�qLWn:x8|D]0}td
|||d||d||df�q�WdS)Nrz	%-30s %s
zSELinux Interfacer_z%-30s %s:%s:%s:%s r=rwrFz%-30s %s:%s:%s )rrrHr�r�rxrr`rN)rryrzr�r�r�rrr r{	s

B
zinterfaceRecords.list)N)r)r=r)r9r:r;r!rkr�rirjr�rnr�rarrrdr{rrrr rbas
:	"


rbc@s�eZdZgZd(dd�Zdd�Zdd�Zdd	�Zd)dd�Zd
d�Z	d*dd�Z
d+dd�Zdd�Zdd�Z
dd�Zdd�Zdd�Zdd�Zd,d!d"�Zd#d$�Zd-d&d'�ZdS).�fcontextRecordsNcCs�tj||�yLtttjtjd��dd�|_|jtttjtjd��dd�7_Wntk
rlYnXi|_i|_	d|_
ydttj
�d�}xH|j�D]<}|j�}t|�dkr�q�|jd�r�q�|j�\}}||j|<q�W|j�Wntk
r�YnXynttj�d�}xR|j�D]F}|j�}t|�dk�r2�q|jd��rB�q|j�\}}||j	|<�qW|j�Wntk
�r~YnXdS)NZ	file_typerr�Zdevice_nodeFr��#)rPr!r{rZr�r�r�r��equiv�
equiv_dist�	equal_indr�rG�selinux_file_context_subs_path�	readlines�striprHr�r�r��IOErrorZ#selinux_file_context_subs_dist_path)rrTr�ro�target�
substituterrr r!1	sF ,
zfcontextRecords.__init__c
Cs�|jr�tj�}d|}t|d�}x*|jj�D]}|jd||j|f�q,W|j�ytj	|tj
|�t
j�WnYnXtj||�d|_t
j|�dS)Nz%s.tmp�wz%s %s
F)rurGrvr�rsr��writer�r|�chmod�stat�ST_MODE�renamerPr8)rZ	subs_fileZtmpfiler�rzrrr r8W	s
zfcontextRecords.commitcCsL|j�|dkr,|d
dkr,ttd�|��|dkrP|ddkrPttd�|��||jj�kr�ttd�|�||j|<d|_|jjdt	j
d|d	�t	j
d
|d	�f�|j�dS|j|�xJ|j|j
fD]:}x4|D],}|j|d�r�ttd�||||f��q�Wq�W|jjdt	j
d|d	�t	j
d
|d	�f�||j|<d|_|j�dS)Nr�r=z=Target %s is not valid. Target is not allowed to end with '/'zESubstitute %s is not valid. Substitute is not allowed to end with '/'z:Equivalence class for %s already exists, modifying insteadTz$resrc=fcontext op=modify-equal %s %s�sglobr�tglobz4File spec %s conflicts with equivalency rule '%s %s'z!resrc=fcontext op=add-equal %s %s���r�)rbr^rrsr�rxrurYr6r�audit_encode_nv_stringr8rIrtr�)rrzr{�fdictrorrr �	add_equalg	s*
(

"(
zfcontextRecords.add_equalcCsj|j�||jj�kr&ttd�|��||j|<d|_|jjdtj	d|d�tj	d|d�f�|j
�dS)Nz'Equivalence class for %s does not existTz$resrc=fcontext op=modify-equal %s %sr�rr�)rbrsr�r^rrurYr6rr�r8)rrzr{rrr �modify_equal�	s
(zfcontextRecords.modify_equalr�cCs�t|j�\}}|dkr&ttd�|��|dkr2d}t|j||�}|dkrXttd�|��t|j|d�}|dkr~ttd�|��tdkr�t|j|d	�}|dkr�ttd
�|��|S)NrzCould not create context for %srr�z)Could not set user in file context for %sr�z)Could not set role in file context for %sr=r�z/Could not set mls fields in file context for %s)r�rWr^rr�r�r`r�)rrzr�rLr�rrr �	createcon�	s zfcontextRecords.createconcCs�|dks|jd�dkr"ttd���|jd�d
kr<ttd���x^|j|jfD]N}xH|D]@}|j|d�rTtj||||�}ttd	�|||||f��qTWqJWdS)Nr�
rzInvalid file specificationr�r=z)File specification can not include spacesr�zMFile spec %s conflicts with equivalency rule '%s %s'; Try adding '%s' insteadr�)�findr^rrsrtr�rB�sub)rrzr�rorgrrr rI�	s
zfcontextRecords.validaterc
Cs�|j|�tdkrt|�}|dkr.ttd���|dkrZtj|�}||jkrZttd�|��t|j	|t
|�\}}|dkr�ttd�|��t|j	�\}}|dkr�ttd�|��t|j	||�}|dk�r\|j
||�}	t|j	|	|�}|dkr�ttd	�|��tdk�r4|dk�r4t|j	|	|�}|dk�r4ttd
�|��t|j	||	�}|dk�r\ttd�|��t|t
|�t|j	||�}|dk�r�ttd�|��|dk�r�t|	�t|�t|�|�s�d
}|jjdtjd|d�t||d||f�dS)Nr=rzSELinux Type is requiredz<<none>>z1Type %s is invalid, must be a file or device typerzCould not create key for %sz$Could not create file context for %sz)Could not set type in file context for %sz/Could not set mls fields in file context for %sz!Could not set file context for %sz!Could not add file context for %sr�z6resrc=fcontext op=add %s ftype=%s tcontext=%s:%s:%s:%sr�r�)rIr`rOr^rrZr�r��semanage_fcontext_key_createrW�
file_typesZsemanage_fcontext_createZsemanage_fcontext_set_exprr�r�r��semanage_fcontext_set_conZsemanage_fcontext_set_type�semanage_fcontext_modify_localr��semanage_fcontext_key_free�semanage_fcontext_freerYr6rr��ftype_to_audit)
rrzrS�ftyper.r�rLr��fcontextr�rrr r��	sN







zfcontextRecords.__addcCsV|j�|j||�r8ttd�|�|j|||||�n|j|||||�|j�dS)Nz6File context for %s already defined, modifying instead)rb�_fcontextRecords__existsrxr�_fcontextRecords__modify�_fcontextRecords__addr8)rrzrSr�r.r�rrr r��	szfcontextRecords.addcCs�t|j|t|�\}}|dkr.ttd�|��t|j|�\}}|dkrVttd�|��|s�t|j|�\}}|dkr�ttd�|��t|�|S)NrzCould not create key for %sz1Could not check if file context for %s is defined)r�rWr�r^r�semanage_fcontext_exists�semanage_fcontext_exists_localr�)rrzr�rLr�r~rrr r��	szfcontextRecords.__existscCs�|dkr$|dkr$|dkr$ttd���|dkrPtj|�}||jkrPttd�|��|j|�t|j|t|�\}}|dkr�ttd�|��t	|j|�\}}|dkr�ttd�|��|r�yt
|j|�\}}	Wn$tk
r�ttd�|��YnXn|t|j|�\}}|dk�rttd�|��|�s0ttd	�|��yt
|j|�\}}	Wn&tk
�rjttd�|��YnX|dk�rt|	�}
|
dk�r�|j|�}
td
k�r�|dk�r�t|j|
t|��|dk�r�t|j|
|�|dk�r�t|j|
|�t|j|	|
�}|dk�r:ttd�|��n(t|j|	d�}|dk�r:ttd�|��t|j||	�}|dk�rbttd�|��t|�t|	�|�s|d
}|jjdtjd|d�t||d||f�dS)Nrz"Requires setype, serange or seuser�<<none>>z1Type %s is invalid, must be a file or device typerzCould not create a key for %sz1Could not check if file context for %s is definedz#Could not query file context for %sz"File context for %s is not definedr=z!Could not set file context for %sz$Could not modify file context for %sr�z9resrc=fcontext op=modify %s ftype=%s tcontext=%s:%s:%s:%sr�r�)rr�)r^rrZr�r�rIr�rWr�r�Zsemanage_fcontext_query�OSErrorr�Zsemanage_fcontext_query_local�semanage_fcontext_get_conr�r`r�rOr�r�r�r�r�r�rYr6rr�r�)rrzrr�r.r�rLr�r~r�r�rrr r�
sf











zfcontextRecords.__modifycCs&|j�|j|||||�|j�dS)N)rbr�r8)rrzrr�r.r�rrr r�C
szfcontextRecords.modifycCs�t|j�\}}|dkr"ttd���|j�x�|D]�}t|�}t|�}t|�}t|j|t	|�\}}|dkrzttd�|��t
|j|�}|dkr�ttd�|��t|�|jj
dtjd|d�tt|f�q0Wi|_d|_|j�dS)Nrz Could not list the file contextszCould not create a key for %sz$Could not delete the file context %sz$resrc=fcontext op=delete %s ftype=%sr�T)�semanage_fcontext_list_localrWr^rrb�semanage_fcontext_get_expr�semanage_fcontext_get_type�semanage_fcontext_get_type_strr�r��semanage_fcontext_del_localr�rYr6rr�r��file_type_str_to_optionrsrur8)rrL�flistr�rzr��	ftype_strr�rrr raH
s&
*zfcontextRecords.deleteallcCs:||jj�kr>|jj|�d|_|jjdtjd|d��dSt|j	|t
|�\}}|dkrlttd�|��t
|j	|�\}}|dkr�ttd�|��|s�t|j	|�\}}|dkr�ttd�|��|r�ttd�|��nttd�|��t|j	|�}|dk�rttd	�|��t|�|jjd
tjd|d�t|f�dS)NTz!resrc=fcontext op=delete-equal %sr�rzCould not create a key for %sz1Could not check if file context for %s is definedz;File context for %s is defined in policy, cannot be deletedz"File context for %s is not definedz$Could not delete file context for %sz$resrc=fcontext op=delete %s ftype=%s)rsr��poprurYr6rr�r�rWr�r^rr�r�r�r�r�)rrzr�rLr�r~rrr r�b
s.
zfcontextRecords.__deletecCs |j�|j||�|j�dS)N)rb�_fcontextRecords__deleter8)rrzr�rrr r��
szfcontextRecords.deletercCs|rt|j�\}|_n�t|j�\}|_|dkr:ttd���t|j�\}}|dkr\ttd���t|j�\}}|dkr~ttd���|j|7_|j|7_i}xd|jD]Z}t|�}t|�}t	|�}	t
|�}
|
r�t|
�t|
�t
|
�t|
�f|||	f<q�|
|||	f<q�W|S)NrzCould not list file contextsz1Could not list file contexts for home directoriesz"Could not list local file contexts)r�rWr�Zsemanage_fcontext_listr^rZsemanage_fcontext_list_homedirsr�r�r�r�rZr[rr
)rrzrLZ
fchomedirsZfclocalr�r��exprr�r�r�rrr rr�
s.&zfcontextRecords.get_allcCs�g}|jd�}x�|j�D]t}||r||drd|jdt|d||d||d|df�q|jdt|d||d|df�qWt|j�r�x*|jj�D]}|jd|j||f�q�W|S)	NTrz-a -f %s -t %s -r '%s' '%s'r=rwrz-a -f %s -t %s '%s'z-a -e %s %s)rrr�r'r�rHrs)rr�	fcon_dictr�rzrrr rd�
s
4,
zfcontextRecords.customizedr=cCs�|j|�}t|�dk�r|r:tdtd�td�td�f�|rH|j�}nt|j��}x�|D]�}||r�tr�td|d|d||d||d||dt||d	d
�f�n6td|d|d||d||d||df�qZtd|d|df�qZWt|j��rV|�sV|�r*ttd
��x*|jj�D]}td||j|f��q6Wt|j	��r�|�rtttd��x*|j	j�D]}td||j	|f��q�WdS)Nrz%-50s %-18s %s
zSELinux fcontextrSr_z%-50s %-18s %s:%s:%s:%s r=rwrFz%-50s %-18s %s:%s:%s z%-50s %-18s <<None>>z,
SELinux Distribution fcontext Equivalence 
z%s = %sz%
SELinux Local fcontext Equivalence 
)
rrrHrxrr�r�r`rNrtrs)rryrzr�Zfkeysr�rzrrr r{�
s0


H8zfcontextRecords.list)N)r�)rrr�)rrr�)r)r=r)r9r:r;r�r!r8r�r�r�rIr�r�r�r�r�rar�r�rrrdr{rrrr rq-	s$
&

6
	C!
 rqc@sleZdZddd�Zdd�Zddd�Zd	d
�Zdd�Zd
d�Zddd�Z	dd�Z
dd�Zdd�Zddd�Z
dS)�booleanRecordsNc	Cs�tj||�i|_d|jd<d|jd<d|jd<d|jd<d|jd<d|jd<ytj�\}|_tj�\}}Wng|_d}YnX|jd	ks�|j|kr�d
|_nd|_dS)Nr=ZTRUErZFALSEZONZOFF�1�0rTF)	rPr!�dictrGZsecurity_get_boolean_names�current_booleansrXrR�modify_local)rrTrLZptyperrr r!�
s"






zbooleanRecords.__init__cCsLtj|�}t|j|�\}}|dkr2ttd�|��t|j|�\}}|dkrZttd�|��|snttd�|��t|j|�\}}|dkr�ttd�|��|j�|j	kr�t
||j	|j��nttd�dj|j	j����|j
o�||jk�rt|j||�}|dk�rttd�|��t|j||�}|dk�r8ttd	�|��t|�t|�dS)
NrzCould not create a key for %sz(Could not check if boolean %s is definedzBoolean %s is not definedzCould not query file context %sz0You must specify one of the following values: %sz, z(Could not set active value of boolean %szCould not modify boolean %s)rG�selinux_boolean_sub�semanage_bool_key_createrWr^r�semanage_bool_existsZsemanage_bool_query�upperr�Zsemanage_bool_set_valuer�r�r�r�Zsemanage_bool_set_activeZsemanage_bool_modify_local�semanage_bool_key_freeZsemanage_bool_free)rr,�valuerLr�r~rrrr Z__mod�
s0


zbooleanRecords.__modFcCs�|j�|r�t|�}x||j�jd�D]j}|j�}t|�dkr>q$y|jd�\}}Wn(tk
rxttd||f���YnX|j|j�|j��q$W|j	�n|j||�|j
�dS)Nr�r�=zBad format %s: Record %s)rbr�r�r�rxrHr^r�_booleanRecords__modr�r8)rr,r��use_filer�rZboolnamerarrr r�s
zbooleanRecords.modifycCs�tj|�}t|j|�\}}|dkr2ttd�|��t|j|�\}}|dkrZttd�|��|snttd�|��t|j|�\}}|dkr�ttd�|��|s�ttd�|��t|j|�}|dkr�ttd�|��t	|�dS)NrzCould not create a key for %sz(Could not check if boolean %s is definedzBoolean %s is not definedz2Boolean %s is defined in policy, cannot be deletedzCould not delete boolean %s)
rGr�r�rWr^rr�Zsemanage_bool_exists_localZsemanage_bool_del_localr�)rr,rLr�r~rrr r�#s$
zbooleanRecords.__deletecCs|j�|j|�|j�dS)N)rb�_booleanRecords__deleter8)rr,rrr r�;s
zbooleanRecords.deletecCsZt|j�\}|_|dkr$ttd���|j�x |jD]}t|�}|j|�q4W|j�dS)NrzCould not list booleans)	�semanage_bool_list_localrW�blistr^rrb�semanage_bool_get_namer�r8)rrL�booleanr,rrr ra@szbooleanRecords.deleteallrcCs�i}|rt|j�\}|_nt|j�\}|_|dkr>ttd���x~|jD]t}g}t|�}|jt|��|j	r�||j
kr�|jtj|��|jtj
|��n|j|d�|j|d�|||<qFW|S)NrzCould not list booleans)r�rWr�Zsemanage_bool_listr^rr�r'Zsemanage_bool_get_valuer�r�rGZsecurity_get_boolean_pendingZsecurity_get_boolean_active)rrzr�rLr�r�r,rrr rrMs"zbooleanRecords.get_allcCstj|�}tj|�S)N)rGr�rZZboolean_desc)rr,rrr �get_descds
zbooleanRecords.get_desccCstj|�}tj|�S)N)rGr�rZZboolean_category)rr,rrr �get_categoryhs
zbooleanRecords.get_categorycCsJg}|jd�}x6t|j��D]&}||r|jd||d|f�qW|S)NTz	-m -%s %srw)rrr�r�r')rrr�r�rrr rdls
zbooleanRecords.customizedTcCs�td�td�f}|rX|j|�}x4t|j��D]$}||r,td|||df�q,WdS|j|�}t|�dkrrdS|r�tdtd�td�td	�td
�f�xNt|j��D]>}||r�td||||d|||d|j|�f�q�WdS)Nr�r�z%s=%srwrz%-30s %s  %s %s
zSELinux booleanZStateZDefaultZDescriptionz%-30s (%-5s,%5s)  %s)rrrr�r�rxrHr�)rryrzr�Zon_offr�r�rrr r{ts

$zbooleanRecords.list)N)NF)r)TFF)r9r:r;r!r�r�r�r�rarrr�r�rdr{rrrr r��
s


r�)r)r=)r=)9r�r�rGr|rBr(rr
r5ZPROGNAMErZrrD�gettext�kwargs�version_infoZinstall�builtinsr*�__dict__r�Z__builtin__rr>r�ZSEMANAGE_FCONTEXT_ALLZSEMANAGE_FCONTEXT_REGZSEMANAGE_FCONTEXT_DIRZSEMANAGE_FCONTEXT_CHARZSEMANAGE_FCONTEXT_BLOCKZSEMANAGE_FCONTEXT_SOCKZSEMANAGE_FCONTEXT_LINKZSEMANAGE_FCONTEXT_PIPEr�r�rZaudit_closerrr�r?rFrNrOrPrfr�r�r�r�r�rr-rArbrqr�rrrr �<module>s�
$$	

ik
H$M.site-packages/__pycache__/six.cpython-36.pyc000064400000060674147511334560014753 0ustar003

��s`�x�K@s�dZddlmZddlZddlZddlZddlZddlZdZdZ	ej
ddkZej
ddkZej
dd��d�kZ
er�efZefZefZeZeZejZn�efZeefZeejfZeZeZejjd	�r�e�d��ZnLGdd
�d
e�Z ye!e ��Wn e"k
�re�d��ZYnXe�d��Z[ dd�Z#dd�Z$Gdd�de�Z%Gdd�de%�Z&Gdd�dej'�Z(Gdd�de%�Z)Gdd�de�Z*e*e+�Z,Gdd�de(�Z-e)ddd d!�e)d"d#d$d%d"�e)d&d#d#d'd&�e)d(d)d$d*d(�e)d+d)d,�e)d-d#d$d.d-�e)d/d0d0d1d/�e)d2d0d0d/d2�e)d3d4d5�e)d6d)d$d7d6�e)d8d)e
�r&d9nd:d;�e)d<d)d=�e)d>d?d@dA�e)d!d!d �e)dBdBdC�e)dDdDdC�e)dEdEdC�e)d7d)d$d7d6�e)dFd#d$dGdF�e)dHd#d#dIdH�e&d$d)�e&dJdK�e&dLdM�e&dNdOdP�e&dQdRdQ�e&dSdTdU�e&dVdWdX�e&dYdZd[�e&d\d]d^�e&d_d`da�e&dbdcdd�e&dedfdg�e&dhdidj�e&dkdldm�e&dndodp�e&dqdqdr�e&dsdsdr�e&dtdtdr�e&dududv�e&dwdx�e&dydz�e&d{d|�e&d}d~d}�e&dd��e&d�d�d��e&d�d�d��e&d�d�d��e&d�d�d��e&d�d�d��e&d�d�d��e&d�d�d��e&d�d�d��e&d�d�d��e&d�d�d��e&d�d�d��e&d�d�d��e&d�d�d��e&d�d�d��e&d�e+d�d��e&d�e+d�d��e&d�e+d�e+d��e&d�d�d��e&d�d�d��e&d�d�d��g@Z.ejd�k�rne.e&d�d��g7Z.x:e.D]2Z/e0e-e/j1e/�e2e/e&��rte,j3e/d�e/j1��qtW[/e.e-_.e-e+d��Z4e,j3e4d��Gd�d��d�e(�Z5e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)dAd�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d�d�dσe)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��gZ6xe6D]Z/e0e5e/j1e/��q�W[/e6e5_.e,j3e5e+d��d�dۃGd�d݄d�e(�Z7e)d�d�d��e)d�d�d��e)d�d�d��gZ8xe8D]Z/e0e7e/j1e/��qPW[/e8e7_.e,j3e7e+d��d�d�Gd�d�d�e(�Z9e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)�dd�d�e)�dd�d�e)�dd�d�e)�dd�d�e)�dd�d�e)�dd�d�e)�dd�d�e)�dd�d�e)�dd�d�e)�d	d�d�e)�d
d�d�g#Z:xe:D]Z/e0e9e/j1e/��qW[/e:e9_.e,j3e9e+�d��d�d
�G�d�d��de(�Z;e)�dd��d�e)�dd��d�e)�dd��d�e)�dd��d�gZ<xe<D]Z/e0e;e/j1e/��q�W[/e<e;_.e,j3e;e+�d��d�d�G�d�d��de(�Z=e)�dd�d��gZ>xe>D]Z/e0e=e/j1e/��	qW[/e>e=_.e,j3e=e+�d��d�d�G�d�d��dej'�Z?e,j3e?e+d���d ��d!�d"�Z@�d#�d$�ZAe�	r��d%ZB�d&ZC�d'ZD�d(ZE�d)ZF�d*ZGn$�d+ZB�d,ZC�d-ZD�d.ZE�d/ZF�d0ZGyeHZIWn"eJk
�
r�d1�d2�ZIYnXeIZHyeKZKWn"eJk
�
r<�d3�d4�ZKYnXe�
rh�d5�d6�ZLejMZN�d7�d8�ZOeZPn>�d9�d6�ZL�d:�d;�ZN�d<�d8�ZOG�d=�d>��d>e�ZPeKZKe#eL�d?�ejQeB�ZRejQeC�ZSejQeD�ZTejQeE�ZUejQeF�ZVejQeG�ZWe�rJ�d@�dA�ZX�dB�dC�ZY�dD�dE�ZZ�dF�dG�Z[ej\�dH�Z]ej\�dI�Z^ej\�dJ�Z_nT�dK�dA�ZX�dL�dC�ZY�dM�dE�ZZ�dN�dG�Z[ej\�dO�Z]ej\�dP�Z^ej\�dQ�Z_e#eX�dR�e#eY�dS�e#eZ�dT�e#e[�dU�e�rb�dV�dW�Z`�dX�dY�ZaebZcddldZdedje�dZ�jfZg[dejhd�ZiejjZkelZmddlnZnenjoZoenjpZp�d[Zqej
d
d
k�rT�d\Zr�d]Zsn�d^Zr�d_Zsnj�d`�dW�Z`�da�dY�ZaecZcebZg�db�dc�Zi�dd�de�Zkejtejuev�ZmddloZoeojoZoZp�dfZq�d\Zr�d]Zse#e`�dg�e#ea�dh��di�d[�Zw�dj�d^�Zx�dk�d_�Zye�
r.eze4j{�dl�Z|�d��dm�dn�Z}n�d��do�dp�Z|e|�dq�ej
dd��d�k�
rje|�dr�n.ej
dd��d�k�
r�e|�ds�n�dt�du�Z~eze4j{�dvd�Zedk�
r��dw�dx�Zej
dd��d�k�
r�eZ��dy�dx�Ze#e}�dz�ej
dd��d�k�rej�ej�f�d{�d|�Z�nej�Z��d}�d~�Z��d�d��Z��d��d��Z�gZ�e+Z�e��j��d��dk	�rjge�_�ej��r�x>e�ej��D]0\Z�Z�ee��j+dk�r~e�j1e+k�r~ej�e�=P�q~W[�[�ej�j�e,�dS(�z6Utilities for writing code that runs on Python 2 and 3�)�absolute_importNz'Benjamin Peterson <benjamin@python.org>z1.11.0����java��c@seZdZdd�ZdS)�XcCsdS)Nrrl�)�selfr
r
�/usr/lib/python3.6/six.py�__len__>sz	X.__len__N)�__name__�
__module__�__qualname__r
r
r
r
rr	<sr	�?cCs
||_dS)z Add documentation to a function.N)�__doc__)�func�docr
r
r�_add_docKsrcCst|�tj|S)z7Import module, returning the module after the last dot.)�
__import__�sys�modules)�namer
r
r�_import_modulePsrc@seZdZdd�Zdd�ZdS)�
_LazyDescrcCs
||_dS)N)r)rrr
r
r�__init__Xsz_LazyDescr.__init__cCsB|j�}t||j|�yt|j|j�Wntk
r<YnX|S)N)�_resolve�setattrr�delattr�	__class__�AttributeError)r�obj�tp�resultr
r
r�__get__[sz_LazyDescr.__get__N)rrrrr%r
r
r
rrVsrcs.eZdZd�fdd�	Zdd�Zdd�Z�ZS)	�MovedModuleNcs2tt|�j|�tr(|dkr |}||_n||_dS)N)�superr&r�PY3�mod)rr�old�new)r r
rriszMovedModule.__init__cCs
t|j�S)N)rr))rr
r
rrrszMovedModule._resolvecCs"|j�}t||�}t|||�|S)N)r�getattrr)r�attr�_module�valuer
r
r�__getattr__us
zMovedModule.__getattr__)N)rrrrrr0�
__classcell__r
r
)r rr&gs	r&cs(eZdZ�fdd�Zdd�ZgZ�ZS)�_LazyModulecstt|�j|�|jj|_dS)N)r'r2rr r)rr)r r
rr~sz_LazyModule.__init__cCs ddg}|dd�|jD�7}|S)NrrcSsg|]
}|j�qSr
)r)�.0r-r
r
r�
<listcomp>�sz'_LazyModule.__dir__.<locals>.<listcomp>)�_moved_attributes)rZattrsr
r
r�__dir__�sz_LazyModule.__dir__)rrrrr6r5r1r
r
)r rr2|sr2cs&eZdZd�fdd�	Zdd�Z�ZS)�MovedAttributeNcsdtt|�j|�trH|dkr |}||_|dkr@|dkr<|}n|}||_n||_|dkrZ|}||_dS)N)r'r7rr(r)r-)rrZold_modZnew_modZold_attrZnew_attr)r r
rr�szMovedAttribute.__init__cCst|j�}t||j�S)N)rr)r,r-)r�moduler
r
rr�s
zMovedAttribute._resolve)NN)rrrrrr1r
r
)r rr7�sr7c@sVeZdZdZdd�Zdd�Zdd�Zdd	d
�Zdd�Zd
d�Z	dd�Z
dd�ZeZdS)�_SixMetaPathImporterz�
    A meta path importer to import six.moves and its submodules.

    This class implements a PEP302 finder and loader. It should be compatible
    with Python 2.5 and all existing versions of Python3
    cCs||_i|_dS)N)r�
known_modules)rZsix_module_namer
r
rr�sz_SixMetaPathImporter.__init__cGs&x |D]}||j|jd|<qWdS)N�.)r:r)rr)Z	fullnames�fullnamer
r
r�_add_module�s
z _SixMetaPathImporter._add_modulecCs|j|jd|S)Nr;)r:r)rr<r
r
r�_get_module�sz _SixMetaPathImporter._get_moduleNcCs||jkr|SdS)N)r:)rr<�pathr
r
r�find_module�s
z _SixMetaPathImporter.find_modulecCs0y
|j|Stk
r*td|��YnXdS)Nz!This loader does not know module )r:�KeyError�ImportError)rr<r
r
rZ__get_module�s
z!_SixMetaPathImporter.__get_modulecCsRy
tj|Stk
rYnX|j|�}t|t�r>|j�}n||_|tj|<|S)N)rrrA� _SixMetaPathImporter__get_module�
isinstancer&r�
__loader__)rr<r)r
r
r�load_module�s




z _SixMetaPathImporter.load_modulecCst|j|�d�S)z�
        Return true, if the named module is a package.

        We need this method to get correct spec objects with
        Python 3.4 (see PEP451)
        �__path__)�hasattrrC)rr<r
r
r�
is_package�sz_SixMetaPathImporter.is_packagecCs|j|�dS)z;Return None

        Required, if is_package is implementedN)rC)rr<r
r
r�get_code�s
z_SixMetaPathImporter.get_code)N)
rrrrrr=r>r@rCrFrIrJ�
get_sourcer
r
r
rr9�s
	r9c@seZdZdZgZdS)�_MovedItemszLazy loading of moved objectsN)rrrrrGr
r
r
rrL�srLZ	cStringIO�io�StringIO�filter�	itertools�builtinsZifilter�filterfalseZifilterfalse�inputZ__builtin__Z	raw_input�internr�map�imap�getcwd�osZgetcwdu�getcwdbZ	getoutputZcommands�
subprocess�rangeZxrangeZ
reload_module�	importlibZimp�reload�reduce�	functoolsZshlex_quoteZpipesZshlexZquote�UserDict�collections�UserList�
UserString�zipZizip�zip_longestZizip_longestZconfigparserZConfigParser�copyregZcopy_regZdbm_gnuZgdbmzdbm.gnuZ
_dummy_threadZdummy_threadZhttp_cookiejarZ	cookielibzhttp.cookiejarZhttp_cookiesZCookiezhttp.cookiesZ
html_entitiesZhtmlentitydefsz
html.entitiesZhtml_parserZ
HTMLParserzhtml.parserZhttp_clientZhttplibzhttp.clientZemail_mime_basezemail.MIMEBasezemail.mime.baseZemail_mime_imagezemail.MIMEImagezemail.mime.imageZemail_mime_multipartzemail.MIMEMultipartzemail.mime.multipartZemail_mime_nonmultipartzemail.MIMENonMultipartzemail.mime.nonmultipartZemail_mime_textzemail.MIMETextzemail.mime.textZBaseHTTPServerzhttp.serverZ
CGIHTTPServerZSimpleHTTPServerZcPickle�pickleZqueueZQueue�reprlib�reprZsocketserverZSocketServer�_threadZthreadZtkinterZTkinterZtkinter_dialogZDialogztkinter.dialogZtkinter_filedialogZ
FileDialogztkinter.filedialogZtkinter_scrolledtextZScrolledTextztkinter.scrolledtextZtkinter_simpledialogZSimpleDialogztkinter.simpledialogZtkinter_tixZTixztkinter.tixZtkinter_ttkZttkztkinter.ttkZtkinter_constantsZTkconstantsztkinter.constantsZtkinter_dndZTkdndztkinter.dndZtkinter_colorchooserZtkColorChooserztkinter.colorchooserZtkinter_commondialogZtkCommonDialogztkinter.commondialogZtkinter_tkfiledialogZtkFileDialogZtkinter_fontZtkFontztkinter.fontZtkinter_messageboxZtkMessageBoxztkinter.messageboxZtkinter_tksimpledialogZtkSimpleDialogZurllib_parsez.moves.urllib_parsezurllib.parseZurllib_errorz.moves.urllib_errorzurllib.errorZurllibz
.moves.urllibZurllib_robotparser�robotparserzurllib.robotparserZ
xmlrpc_clientZ	xmlrpclibz
xmlrpc.clientZ
xmlrpc_serverZSimpleXMLRPCServerz
xmlrpc.serverZwin32�winreg�_winregzmoves.z.moves�movesc@seZdZdZdS)�Module_six_moves_urllib_parsez7Lazy loading of moved objects in six.moves.urllib_parseN)rrrrr
r
r
rroBsroZParseResultZurlparseZSplitResultZparse_qsZ	parse_qslZ	urldefragZurljoinZurlsplitZ
urlunparseZ
urlunsplitZ
quote_plusZunquoteZunquote_plusZunquote_to_bytesZ	urlencodeZ
splitqueryZsplittagZ	splituserZ
splitvalueZ
uses_fragmentZuses_netlocZuses_paramsZ
uses_queryZ
uses_relativezmoves.urllib_parsezmoves.urllib.parsec@seZdZdZdS)�Module_six_moves_urllib_errorz7Lazy loading of moved objects in six.moves.urllib_errorN)rrrrr
r
r
rrplsrpZURLErrorZurllib2Z	HTTPErrorZContentTooShortErrorz.moves.urllib.errorzmoves.urllib_errorzmoves.urllib.errorc@seZdZdZdS)�Module_six_moves_urllib_requestz9Lazy loading of moved objects in six.moves.urllib_requestN)rrrrr
r
r
rrq�srqZurlopenzurllib.requestZinstall_openerZbuild_openerZpathname2urlZurl2pathnameZ
getproxiesZRequestZOpenerDirectorZHTTPDefaultErrorHandlerZHTTPRedirectHandlerZHTTPCookieProcessorZProxyHandlerZBaseHandlerZHTTPPasswordMgrZHTTPPasswordMgrWithDefaultRealmZAbstractBasicAuthHandlerZHTTPBasicAuthHandlerZProxyBasicAuthHandlerZAbstractDigestAuthHandlerZHTTPDigestAuthHandlerZProxyDigestAuthHandlerZHTTPHandlerZHTTPSHandlerZFileHandlerZ
FTPHandlerZCacheFTPHandlerZUnknownHandlerZHTTPErrorProcessorZurlretrieveZ
urlcleanupZ	URLopenerZFancyURLopenerZproxy_bypassZparse_http_listZparse_keqv_listz.moves.urllib.requestzmoves.urllib_requestzmoves.urllib.requestc@seZdZdZdS)� Module_six_moves_urllib_responsez:Lazy loading of moved objects in six.moves.urllib_responseN)rrrrr
r
r
rrr�srrZaddbasezurllib.responseZaddclosehookZaddinfoZ
addinfourlz.moves.urllib.responsezmoves.urllib_responsezmoves.urllib.responsec@seZdZdZdS)�#Module_six_moves_urllib_robotparserz=Lazy loading of moved objects in six.moves.urllib_robotparserN)rrrrr
r
r
rrs�srsZRobotFileParserz.moves.urllib.robotparserzmoves.urllib_robotparserzmoves.urllib.robotparserc@sNeZdZdZgZejd�Zejd�Zejd�Z	ejd�Z
ejd�Zdd�Zd	S)
�Module_six_moves_urllibzICreate a six.moves.urllib namespace that resembles the Python 3 namespacezmoves.urllib_parsezmoves.urllib_errorzmoves.urllib_requestzmoves.urllib_responsezmoves.urllib_robotparsercCsdddddgS)N�parse�error�request�responserkr
)rr
r
rr6�szModule_six_moves_urllib.__dir__N)
rrrrrG�	_importerr>rurvrwrxrkr6r
r
r
rrt�s




rtzmoves.urllibcCstt|j|�dS)zAdd an item to six.moves.N)rrLr)Zmover
r
r�add_move�srzcCsXytt|�WnDtk
rRytj|=Wn"tk
rLtd|f��YnXYnXdS)zRemove item from six.moves.zno such move, %rN)rrLr!rn�__dict__rA)rr
r
r�remove_move�sr|�__func__�__self__�__closure__�__code__�__defaults__�__globals__�im_funcZim_selfZfunc_closureZ	func_codeZ
func_defaultsZfunc_globalscCs|j�S)N)�next)�itr
r
r�advance_iteratorsr�cCstdd�t|�jD��S)Ncss|]}d|jkVqdS)�__call__N)r{)r3�klassr
r
r�	<genexpr>szcallable.<locals>.<genexpr>)�any�type�__mro__)r"r
r
r�callablesr�cCs|S)Nr
)�unboundr
r
r�get_unbound_functionsr�cCs|S)Nr
)r�clsr
r
r�create_unbound_method#sr�cCs|jS)N)r�)r�r
r
rr�(scCstj|||j�S)N)�types�
MethodTyper )rr"r
r
r�create_bound_method+sr�cCstj|d|�S)N)r�r�)rr�r
r
rr�.sc@seZdZdd�ZdS)�IteratorcCst|�j|�S)N)r��__next__)rr
r
rr�3sz
Iterator.nextN)rrrr�r
r
r
rr�1sr�z3Get the function out of a possibly unbound functioncKst|jf|��S)N)�iter�keys)�d�kwr
r
r�iterkeysDsr�cKst|jf|��S)N)r��values)r�r�r
r
r�
itervaluesGsr�cKst|jf|��S)N)r��items)r�r�r
r
r�	iteritemsJsr�cKst|jf|��S)N)r�Zlists)r�r�r
r
r�	iterlistsMsr�r�r�r�cKs|jf|�S)N)r�)r�r�r
r
rr�VscKs|jf|�S)N)r�)r�r�r
r
rr�YscKs|jf|�S)N)r�)r�r�r
r
rr�\scKs|jf|�S)N)r�)r�r�r
r
rr�_s�viewkeys�
viewvalues�	viewitemsz1Return an iterator over the keys of a dictionary.z3Return an iterator over the values of a dictionary.z?Return an iterator over the (key, value) pairs of a dictionary.zBReturn an iterator over the (key, [values]) pairs of a dictionary.cCs
|jd�S)Nzlatin-1)�encode)�sr
r
r�bqsr�cCs|S)Nr
)r�r
r
r�utsr�z>B�assertCountEqualZassertRaisesRegexpZassertRegexpMatches�assertRaisesRegex�assertRegexcCs|S)Nr
)r�r
r
rr��scCst|jdd�d�S)Nz\\z\\\\Zunicode_escape)�unicode�replace)r�r
r
rr��scCst|d�S)Nr)�ord)Zbsr
r
r�byte2int�sr�cCst||�S)N)r�)Zbuf�ir
r
r�
indexbytes�sr�ZassertItemsEqualzByte literalzText literalcOst|t�||�S)N)r,�_assertCountEqual)r�args�kwargsr
r
rr��scOst|t�||�S)N)r,�_assertRaisesRegex)rr�r�r
r
rr��scOst|t�||�S)N)r,�_assertRegex)rr�r�r
r
rr��s�execc
Cs:z*|dkr|�}|j|k	r$|j|��|�Wdd}d}XdS)N)�
__traceback__�with_traceback)r#r/�tbr
r
r�reraise�s

r�cCsB|dkr*tjd�}|j}|dkr&|j}~n|dkr6|}td�dS)zExecute code in a namespace.Nrzexec _code_ in _globs_, _locs_)r�	_getframe�	f_globals�f_localsr�)Z_code_Z_globs_Z_locs_�framer
r
r�exec_�s
r�zedef reraise(tp, value, tb=None):
    try:
        raise tp, value, tb
    finally:
        tb = None
z�def raise_from(value, from_value):
    try:
        if from_value is None:
            raise value
        raise value from from_value
    finally:
        value = None
zrdef raise_from(value, from_value):
    try:
        raise value from from_value
    finally:
        value = None
cCs|�dS)Nr
)r/Z
from_valuer
r
r�
raise_from�sr��printc
s6|jdtj���dkrdS�fdd�}d}|jdd�}|dk	r`t|t�rNd}nt|t�s`td��|jd	d�}|dk	r�t|t�r�d}nt|t�s�td
��|r�td��|s�x|D]}t|t�r�d}Pq�W|r�td�}td
�}nd}d
}|dkr�|}|dk�r�|}x,t|�D] \}	}|	�r||�||��qW||�dS)z4The new-style print function for Python 2.4 and 2.5.�fileNcsdt|t�st|�}t�t�rVt|t�rV�jdk	rVt�dd�}|dkrHd}|j�j|�}�j|�dS)N�errors�strict)	rD�
basestring�strr�r��encodingr,r��write)�datar�)�fpr
rr��s



zprint_.<locals>.writeF�sepTzsep must be None or a string�endzend must be None or a stringz$invalid keyword arguments to print()�
� )�popr�stdoutrDr�r��	TypeError�	enumerate)
r�r�r�Zwant_unicoder�r��arg�newlineZspacer�r
)r�r�print_�sL







r�cOs<|jdtj�}|jdd�}t||�|r8|dk	r8|j�dS)Nr��flushF)�getrr�r��_printr�)r�r�r�r�r
r
rr�s

zReraise an exception.cs���fdd�}|S)Ncstj����|�}�|_|S)N)r_�wraps�__wrapped__)�f)�assigned�updated�wrappedr
r�wrapper*szwraps.<locals>.wrapperr
)r�r�r�r�r
)r�r�r�rr�(sr�cs&G��fdd�dt�}tj|dfi�S)z%Create a base class with a metaclass.cs,eZdZ��fdd�Ze��fdd��ZdS)z!with_metaclass.<locals>.metaclasscs�|�|�S)Nr
)r�r�
this_basesr�)�bases�metar
r�__new__:sz)with_metaclass.<locals>.metaclass.__new__cs�j|��S)N)�__prepare__)r�rr�)r�r�r
rr�=sz-with_metaclass.<locals>.metaclass.__prepare__N)rrrr��classmethodr�r
)r�r�r
r�	metaclass8sr�Ztemporary_class)r�r�)r�r�r�r
)r�r�r�with_metaclass3sr�cs�fdd�}|S)z6Class decorator for creating a class with a metaclass.csl|jj�}|jd�}|dk	rDt|t�r,|g}x|D]}|j|�q2W|jdd�|jdd��|j|j|�S)N�	__slots__r{�__weakref__)r{�copyr�rDr�r�r�	__bases__)r�Z	orig_vars�slotsZ	slots_var)r�r
rr�Es



zadd_metaclass.<locals>.wrapperr
)r�r�r
)r�r�
add_metaclassCsr�cCs2tr.d|jkrtd|j��|j|_dd�|_|S)a
    A decorator that defines __unicode__ and __str__ methods under Python 2.
    Under Python 3 it does nothing.

    To support Python 2 and 3 with a single code base, define a __str__ method
    returning text and apply this decorator to the class.
    �__str__zY@python_2_unicode_compatible cannot be applied to %s because it doesn't define __str__().cSs|j�jd�S)Nzutf-8)�__unicode__r�)rr
r
r�<lambda>asz-python_2_unicode_compatible.<locals>.<lambda>)�PY2r{�
ValueErrorrr�r�)r�r
r
r�python_2_unicode_compatibleSs


r��__spec__)rrli���li���ll����)N)NN)rr)rr)rr)rr)�rZ
__future__rr_rP�operatorrr��
__author__�__version__�version_infor�r(ZPY34r�Zstring_types�intZ
integer_typesr�Zclass_typesZ	text_type�bytesZbinary_type�maxsizeZMAXSIZEr�ZlongZ	ClassTyper��platform�
startswith�objectr	�len�
OverflowErrorrrrr&�
ModuleTyper2r7r9rryrLr5r-rrrDr=rnroZ_urllib_parse_moved_attributesrpZ_urllib_error_moved_attributesrqZ _urllib_request_moved_attributesrrZ!_urllib_response_moved_attributesrsZ$_urllib_robotparser_moved_attributesrtrzr|Z
_meth_funcZ
_meth_selfZ
_func_closureZ
_func_codeZ_func_defaultsZ
_func_globalsr�r��	NameErrorr�r�r�r�r�r��
attrgetterZget_method_functionZget_method_selfZget_function_closureZget_function_codeZget_function_defaultsZget_function_globalsr�r�r�r��methodcallerr�r�r�r�r��chrZunichr�struct�Struct�packZint2byte�
itemgetterr��getitemr�r�Z	iterbytesrMrN�BytesIOr�r�r��partialrVr�r�r�r�r,rQr�r�r�r�r��WRAPPER_ASSIGNMENTS�WRAPPER_UPDATESr�r�r�r�rG�__package__�globalsr�r��submodule_search_locations�	meta_pathr�r�Zimporter�appendr
r
r
r�<module>s�

>







































































































5site-packages/__pycache__/pyparsing.cpython-36.pyc000064400000610513147511334560016155 0ustar003

��s`��@s�dZdZdZdZddlZddlmZddlZddl	Z	ddl
Z
ddlZddlZddl
Z
ddlZddlZddlZddlmZyddlmZWn ek
r�ddlmZYnXydd	l
mZWn>ek
r�ydd	lmZWnek
r�dZYnXYnXd
ddd
ddddddddddddddddddd d!d"d#d$d%d&d'd(d)d*d+d,d-d.d/d0d1d2d3d4d5d6d7d8d9d:d;d<d=d>d?d@dAdBdCdDdEdFdGdHdIdJdKdLdMdNdOdPdQdRdSdTdUdVdWdXdYdZd[d\d]d^d_d`dadbdcdddedfdgdhdidjdkdldmdndodpdqdrgiZee	j�dds�ZeddskZe�r"e	jZe Z!e"Z#e Z$e%e&e'e(e)ee*e+e,e-e.gZ/nbe	j0Ze1Z2dtdu�Z$gZ/ddl3Z3xBdvj4�D]6Z5ye/j6e7e3e5��Wne8k
�r|�wJYnX�qJWe9dwdx�e2dy�D��Z:dzd{�Z;Gd|d}�d}e<�Z=ej>ej?Z@d~ZAeAdZBe@eAZCe"d��ZDd�jEd�dx�ejFD��ZGGd�d!�d!eH�ZIGd�d#�d#eI�ZJGd�d%�d%eI�ZKGd�d'�d'eK�ZLGd�d*�d*eH�ZMGd�d��d�e<�ZNGd�d&�d&e<�ZOe
jPjQeO�d�d=�ZRd�dN�ZSd�dK�ZTd�d��ZUd�d��ZVd�d��ZWd�dU�ZX�d/d�d��ZYGd�d(�d(e<�ZZGd�d0�d0eZ�Z[Gd�d�de[�Z\Gd�d�de[�Z]Gd�d�de[�Z^e^Z_e^eZ_`Gd�d�de[�ZaGd�d�de^�ZbGd�d�dea�ZcGd�dp�dpe[�ZdGd�d3�d3e[�ZeGd�d+�d+e[�ZfGd�d)�d)e[�ZgGd�d
�d
e[�ZhGd�d2�d2e[�ZiGd�d��d�e[�ZjGd�d�dej�ZkGd�d�dej�ZlGd�d�dej�ZmGd�d.�d.ej�ZnGd�d-�d-ej�ZoGd�d5�d5ej�ZpGd�d4�d4ej�ZqGd�d$�d$eZ�ZrGd�d
�d
er�ZsGd�d �d er�ZtGd�d�der�ZuGd�d�der�ZvGd�d"�d"eZ�ZwGd�d�dew�ZxGd�d�dew�ZyGd�d��d�ew�ZzGd�d�dez�Z{Gd�d6�d6ez�Z|Gd�d��d�e<�Z}e}�Z~Gd�d�dew�ZGd�d,�d,ew�Z�Gd�d�dew�Z�Gd�d��d�e��Z�Gd�d1�d1ew�Z�Gd�d�de��Z�Gd�d�de��Z�Gd�d�de��Z�Gd�d/�d/e��Z�Gd�d�de<�Z�d�df�Z��d0d�dD�Z��d1d�d@�Z�d�d΄Z�d�dS�Z�d�dR�Z�d�d҄Z��d2d�dW�Z�d�dE�Z��d3d�dk�Z�d�dl�Z�d�dn�Z�e\�j�dG�Z�el�j�dM�Z�em�j�dL�Z�en�j�de�Z�eo�j�dd�Z�eeeDd�d�dڍj�d�d܄�Z�efd݃j�d�d܄�Z�efd߃j�d�d܄�Z�e�e�Be�BeeeGd�dyd�Befd�ej��BZ�e�e�e�d�e��Z�e^d�ed�j�d�e�e{e�e�B��j�d�d�Z�d�dc�Z�d�dQ�Z�d�d`�Z�d�d^�Z�d�dq�Z�e�d�d܄�Z�e�d�d܄�Z�d�d�Z�d�dO�Z�d�dP�Z�d�di�Z�e<�e�_��d4d�do�Z�e=�Z�e<�e�_�e<�e�_�e�d��e�d��fd�dm�Z�e�Z�e�efd��d��j�d��Z�e�efd��d��j�d��Z�e�efd��d�efd��d�B�j��d�Z�e�e_�d�e�j��j��d�Z�d�d�de�j�f�ddT�Z��d5�ddj�Z�e��d�Z�e��d�Z�e�eee@eC�d�j��d��\Z�Z�e�e��d	j4��d
��Z�ef�d�djEe�j��d
�j��d�ZĐdd_�Z�e�ef�d��d�j��d�Z�ef�d�j��d�Z�ef�d�jȃj��d�Z�ef�d�j��d�Z�e�ef�d��de�B�j��d�Z�e�Z�ef�d�j��d�Z�e�e{eeeGdɐd�eee�d�e^dɃem����j΃j��d�Z�e�ee�j�e�Bd��d��j�d>�Z�G�d dr�dr�Z�eҐd!k�r�eb�d"�Z�eb�d#�Z�eee@eC�d$�Z�e�eՐd%dӐd&�j�e��Z�e�e�eփ�j��d'�Zאd(e�BZ�e�eՐd%dӐd&�j�e��Z�e�e�eك�j��d)�Z�eӐd*�eؐd'�e�eڐd)�Z�e�jܐd+�e�j�jܐd,�e�j�jܐd,�e�j�jܐd-�ddl�Z�e�j�j�e�e�j��e�j�jܐd.�dS(6aS
pyparsing module - Classes and methods to define and execute parsing grammars

The pyparsing module is an alternative approach to creating and executing simple grammars,
vs. the traditional lex/yacc approach, or the use of regular expressions.  With pyparsing, you
don't need to learn a new syntax for defining grammars or matching expressions - the parsing module
provides a library of classes that you use to construct the grammar directly in Python.

Here is a program to parse "Hello, World!" (or any greeting of the form 
C{"<salutation>, <addressee>!"}), built up using L{Word}, L{Literal}, and L{And} elements 
(L{'+'<ParserElement.__add__>} operator gives L{And} expressions, strings are auto-converted to
L{Literal} expressions)::

    from pyparsing import Word, alphas

    # define grammar of a greeting
    greet = Word(alphas) + "," + Word(alphas) + "!"

    hello = "Hello, World!"
    print (hello, "->", greet.parseString(hello))

The program outputs the following::

    Hello, World! -> ['Hello', ',', 'World', '!']

The Python representation of the grammar is quite readable, owing to the self-explanatory
class names, and the use of '+', '|' and '^' operators.

The L{ParseResults} object returned from L{ParserElement.parseString<ParserElement.parseString>} can be accessed as a nested list, a dictionary, or an
object with named attributes.

The pyparsing module handles some of the problems that are typically vexing when writing text parsers:
 - extra or missing whitespace (the above program will also handle "Hello,World!", "Hello  ,  World  !", etc.)
 - quoted strings
 - embedded comments
z2.1.10z07 Oct 2016 01:31 UTCz*Paul McGuire <ptmcg@users.sourceforge.net>�N)�ref)�datetime)�RLock)�OrderedDict�And�CaselessKeyword�CaselessLiteral�
CharsNotIn�Combine�Dict�Each�Empty�
FollowedBy�Forward�
GoToColumn�Group�Keyword�LineEnd�	LineStart�Literal�
MatchFirst�NoMatch�NotAny�	OneOrMore�OnlyOnce�Optional�Or�ParseBaseException�ParseElementEnhance�ParseException�ParseExpression�ParseFatalException�ParseResults�ParseSyntaxException�
ParserElement�QuotedString�RecursiveGrammarException�Regex�SkipTo�	StringEnd�StringStart�Suppress�Token�TokenConverter�White�Word�WordEnd�	WordStart�
ZeroOrMore�	alphanums�alphas�
alphas8bit�anyCloseTag�
anyOpenTag�
cStyleComment�col�commaSeparatedList�commonHTMLEntity�countedArray�cppStyleComment�dblQuotedString�dblSlashComment�
delimitedList�dictOf�downcaseTokens�empty�hexnums�htmlComment�javaStyleComment�line�lineEnd�	lineStart�lineno�makeHTMLTags�makeXMLTags�matchOnlyAtCol�matchPreviousExpr�matchPreviousLiteral�
nestedExpr�nullDebugAction�nums�oneOf�opAssoc�operatorPrecedence�
printables�punc8bit�pythonStyleComment�quotedString�removeQuotes�replaceHTMLEntity�replaceWith�
restOfLine�sglQuotedString�srange�	stringEnd�stringStart�traceParseAction�
unicodeString�upcaseTokens�
withAttribute�
indentedBlock�originalTextFor�ungroup�
infixNotation�locatedExpr�	withClass�
CloseMatch�tokenMap�pyparsing_common�cCs`t|t�r|Syt|�Stk
rZt|�jtj�d�}td�}|jdd��|j	|�SXdS)aDrop-in replacement for str(obj) that tries to be Unicode friendly. It first tries
           str(obj). If that fails with a UnicodeEncodeError, then it tries unicode(obj). It
           then < returns the unicode object | encodes it with the default encoding | ... >.
        �xmlcharrefreplacez&#\d+;cSs$dtt|ddd���dd�S)Nz\ur�����)�hex�int)�t�rw�/usr/lib/python3.6/pyparsing.py�<lambda>�sz_ustr.<locals>.<lambda>N)
�
isinstanceZunicode�str�UnicodeEncodeError�encode�sys�getdefaultencodingr'�setParseAction�transformString)�obj�retZ
xmlcharrefrwrwrx�_ustr�s
r�z6sum len sorted reversed list tuple set any all min maxccs|]
}|VqdS)Nrw)�.0�yrwrwrx�	<genexpr>�sr�rrcCs>d}dd�dj�D�}x"t||�D]\}}|j||�}q"W|S)z/Escape &, <, >, ", ', etc. in a string of data.z&><"'css|]}d|dVqdS)�&�;Nrw)r��srwrwrxr��sz_xml_escape.<locals>.<genexpr>zamp gt lt quot apos)�split�zip�replace)�dataZfrom_symbolsZ
to_symbolsZfrom_Zto_rwrwrx�_xml_escape�s
r�c@seZdZdS)�
_ConstantsN)�__name__�
__module__�__qualname__rwrwrwrxr��sr��
0123456789ZABCDEFabcdef�\�ccs|]}|tjkr|VqdS)N)�stringZ
whitespace)r��crwrwrxr��sc@sPeZdZdZddd�Zedd��Zdd	�Zd
d�Zdd
�Z	ddd�Z
dd�ZdS)rz7base exception class for all parsing runtime exceptionsrNcCs>||_|dkr||_d|_n||_||_||_|||f|_dS)Nr�)�loc�msg�pstr�
parserElement�args)�selfr�r�r��elemrwrwrx�__init__�szParseBaseException.__init__cCs||j|j|j|j�S)z�
        internal factory method to simplify creating one type of ParseException 
        from another - avoids having __init__ signature conflicts among subclasses
        )r�r�r�r�)�cls�perwrwrx�_from_exception�sz"ParseBaseException._from_exceptioncCsN|dkrt|j|j�S|dkr,t|j|j�S|dkrBt|j|j�St|��dS)z�supported attributes by name are:
            - lineno - returns the line number of the exception text
            - col - returns the column number of the exception text
            - line - returns the line containing the exception text
        rJr9�columnrGN)r9r�)rJr�r�r9rG�AttributeError)r�Zanamerwrwrx�__getattr__�szParseBaseException.__getattr__cCsd|j|j|j|jfS)Nz"%s (at char %d), (line:%d, col:%d))r�r�rJr�)r�rwrwrx�__str__�szParseBaseException.__str__cCst|�S)N)r�)r�rwrwrx�__repr__�szParseBaseException.__repr__�>!<cCs<|j}|jd}|r4dj|d|�|||d�f�}|j�S)z�Extracts the exception line from the input string, and marks
           the location of the exception with a special symbol.
        rrr�N)rGr��join�strip)r�ZmarkerStringZline_strZline_columnrwrwrx�
markInputline�s
z ParseBaseException.markInputlinecCsdj�tt|��S)Nzlineno col line)r��dir�type)r�rwrwrx�__dir__�szParseBaseException.__dir__)rNN)r�)r�r�r��__doc__r��classmethodr�r�r�r�r�r�rwrwrwrxr�s


c@seZdZdZdS)raN
    Exception thrown when parse expressions don't match class;
    supported attributes by name are:
     - lineno - returns the line number of the exception text
     - col - returns the column number of the exception text
     - line - returns the line containing the exception text
        
    Example::
        try:
            Word(nums).setName("integer").parseString("ABC")
        except ParseException as pe:
            print(pe)
            print("column: {}".format(pe.col))
            
    prints::
       Expected integer (at char 0), (line:1, col:1)
        column: 1
    N)r�r�r�r�rwrwrwrxr�sc@seZdZdZdS)r!znuser-throwable exception thrown when inconsistent parse content
       is found; stops all parsing immediatelyN)r�r�r�r�rwrwrwrxr!sc@seZdZdZdS)r#z�just like L{ParseFatalException}, but thrown internally when an
       L{ErrorStop<And._ErrorStop>} ('-' operator) indicates that parsing is to stop 
       immediately because an unbacktrackable syntax error has been foundN)r�r�r�r�rwrwrwrxr#sc@s eZdZdZdd�Zdd�ZdS)r&zZexception thrown by L{ParserElement.validate} if the grammar could be improperly recursivecCs
||_dS)N)�parseElementTrace)r��parseElementListrwrwrxr�sz"RecursiveGrammarException.__init__cCs
d|jS)NzRecursiveGrammarException: %s)r�)r�rwrwrxr� sz!RecursiveGrammarException.__str__N)r�r�r�r�r�r�rwrwrwrxr&sc@s,eZdZdd�Zdd�Zdd�Zdd�Zd	S)
�_ParseResultsWithOffsetcCs||f|_dS)N)�tup)r�Zp1Zp2rwrwrxr�$sz _ParseResultsWithOffset.__init__cCs
|j|S)N)r�)r��irwrwrx�__getitem__&sz#_ParseResultsWithOffset.__getitem__cCst|jd�S)Nr)�reprr�)r�rwrwrxr�(sz _ParseResultsWithOffset.__repr__cCs|jd|f|_dS)Nr)r�)r�r�rwrwrx�	setOffset*sz!_ParseResultsWithOffset.setOffsetN)r�r�r�r�r�r�r�rwrwrwrxr�#sr�c@s�eZdZdZd[dd�Zddddefdd�Zdd	�Zefd
d�Zdd
�Z	dd�Z
dd�Zdd�ZeZ
dd�Zdd�Zdd�Zdd�Zdd�Zer�eZeZeZn$eZeZeZdd�Zd d!�Zd"d#�Zd$d%�Zd&d'�Zd\d(d)�Zd*d+�Zd,d-�Zd.d/�Zd0d1�Z d2d3�Z!d4d5�Z"d6d7�Z#d8d9�Z$d:d;�Z%d<d=�Z&d]d?d@�Z'dAdB�Z(dCdD�Z)dEdF�Z*d^dHdI�Z+dJdK�Z,dLdM�Z-d_dOdP�Z.dQdR�Z/dSdT�Z0dUdV�Z1dWdX�Z2dYdZ�Z3dS)`r"aI
    Structured parse results, to provide multiple means of access to the parsed data:
       - as a list (C{len(results)})
       - by list index (C{results[0], results[1]}, etc.)
       - by attribute (C{results.<resultsName>} - see L{ParserElement.setResultsName})

    Example::
        integer = Word(nums)
        date_str = (integer.setResultsName("year") + '/' 
                        + integer.setResultsName("month") + '/' 
                        + integer.setResultsName("day"))
        # equivalent form:
        # date_str = integer("year") + '/' + integer("month") + '/' + integer("day")

        # parseString returns a ParseResults object
        result = date_str.parseString("1999/12/31")

        def test(s, fn=repr):
            print("%s -> %s" % (s, fn(eval(s))))
        test("list(result)")
        test("result[0]")
        test("result['month']")
        test("result.day")
        test("'month' in result")
        test("'minutes' in result")
        test("result.dump()", str)
    prints::
        list(result) -> ['1999', '/', '12', '/', '31']
        result[0] -> '1999'
        result['month'] -> '12'
        result.day -> '31'
        'month' in result -> True
        'minutes' in result -> False
        result.dump() -> ['1999', '/', '12', '/', '31']
        - day: 31
        - month: 12
        - year: 1999
    NTcCs"t||�r|Stj|�}d|_|S)NT)rz�object�__new__�_ParseResults__doinit)r��toklist�name�asList�modalZretobjrwrwrxr�Ts


zParseResults.__new__c
Cs`|jrvd|_d|_d|_i|_||_||_|dkr6g}||t�rP|dd�|_n||t�rft|�|_n|g|_t	�|_
|dk	o�|�r\|s�d|j|<||t�r�t|�}||_||t
d�ttf�o�|ddgfk�s\||t�r�|g}|�r&||t��rt|j�d�||<ntt|d�d�||<|||_n6y|d||<Wn$tttfk
�rZ|||<YnXdS)NFrr�)r��_ParseResults__name�_ParseResults__parent�_ParseResults__accumNames�_ParseResults__asList�_ParseResults__modal�list�_ParseResults__toklist�_generatorType�dict�_ParseResults__tokdictrur�r��
basestringr"r��copy�KeyError�	TypeError�
IndexError)r�r�r�r�r�rzrwrwrxr�]sB



$
zParseResults.__init__cCsPt|ttf�r|j|S||jkr4|j|ddStdd�|j|D��SdS)NrrrcSsg|]}|d�qS)rrw)r��vrwrwrx�
<listcomp>�sz,ParseResults.__getitem__.<locals>.<listcomp>rs)rzru�slicer�r�r�r")r�r�rwrwrxr��s


zParseResults.__getitem__cCs�||t�r0|jj|t��|g|j|<|d}nD||ttf�rN||j|<|}n&|jj|t��t|d�g|j|<|}||t�r�t|�|_	dS)Nr)
r�r��getr�rur�r�r"�wkrefr�)r��kr�rz�subrwrwrx�__setitem__�s


"
zParseResults.__setitem__c
Cs�t|ttf�r�t|j�}|j|=t|t�rH|dkr:||7}t||d�}tt|j|���}|j�x^|j	j
�D]F\}}x<|D]4}x.t|�D]"\}\}}	t||	|	|k�||<q�Wq|WqnWn|j	|=dS)Nrrr)
rzrur��lenr�r��range�indices�reverser��items�	enumerater�)
r�r�ZmylenZremovedr��occurrences�jr��value�positionrwrwrx�__delitem__�s


$zParseResults.__delitem__cCs
||jkS)N)r�)r�r�rwrwrx�__contains__�szParseResults.__contains__cCs
t|j�S)N)r�r�)r�rwrwrx�__len__�szParseResults.__len__cCs
|jS)N)r�)r�rwrwrx�__bool__�szParseResults.__bool__cCs
t|j�S)N)�iterr�)r�rwrwrx�__iter__�szParseResults.__iter__cCst|jddd��S)Nrrrs)r�r�)r�rwrwrx�__reversed__�szParseResults.__reversed__cCs$t|jd�r|jj�St|j�SdS)N�iterkeys)�hasattrr�r�r�)r�rwrwrx�	_iterkeys�s
zParseResults._iterkeyscs�fdd��j�D�S)Nc3s|]}�|VqdS)Nrw)r�r�)r�rwrxr��sz+ParseResults._itervalues.<locals>.<genexpr>)r�)r�rw)r�rx�_itervalues�szParseResults._itervaluescs�fdd��j�D�S)Nc3s|]}|�|fVqdS)Nrw)r�r�)r�rwrxr��sz*ParseResults._iteritems.<locals>.<genexpr>)r�)r�rw)r�rx�
_iteritems�szParseResults._iteritemscCst|j��S)zVReturns all named result keys (as a list in Python 2.x, as an iterator in Python 3.x).)r�r�)r�rwrwrx�keys�szParseResults.keyscCst|j��S)zXReturns all named result values (as a list in Python 2.x, as an iterator in Python 3.x).)r��
itervalues)r�rwrwrx�values�szParseResults.valuescCst|j��S)zfReturns all named result key-values (as a list of tuples in Python 2.x, as an iterator in Python 3.x).)r��	iteritems)r�rwrwrxr��szParseResults.itemscCs
t|j�S)z�Since keys() returns an iterator, this method is helpful in bypassing
           code that looks for the existence of any defined results names.)�boolr�)r�rwrwrx�haskeys�szParseResults.haskeyscOs�|s
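
           A minimal sketch of C{haskeys()} (assuming the standard pyparsing names C{Word}, C{nums} and C{alphas})::

               result = (Word(nums)("id") + Word(alphas)).parseString("42 abc")
               print(result.haskeys())   # -> True, because "id" is a defined results name
               print(Word(alphas).parseString("abc").haskeys())   # -> False, no names defined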
dg}x6|j�D]*\}}|dkr2|d|f}qtd|��qWt|dt�sht|�dksh|d|kr�|d}||}||=|S|d}|SdS)a�
        Removes and returns item at specified index (default=C{last}).
        Supports both C{list} and C{dict} semantics for C{pop()}. If passed no
        argument or an integer argument, it will use C{list} semantics
        and pop tokens from the list of parsed tokens. If passed a 
        non-integer argument (most likely a string), it will use C{dict}
        semantics and pop the corresponding value from any defined 
        results names. A second default return value argument is 
        supported, just as in C{dict.pop()}.

        Example::
            def remove_first(tokens):
                tokens.pop(0)
            print(OneOrMore(Word(nums)).parseString("0 123 321")) # -> ['0', '123', '321']
            print(OneOrMore(Word(nums)).addParseAction(remove_first).parseString("0 123 321")) # -> ['123', '321']

            label = Word(alphas)
            patt = label("LABEL") + OneOrMore(Word(nums))
            print(patt.parseString("AAB 123 321").dump())

            # Use pop() in a parse action to remove named result (note that corresponding value is not
            # removed from list form of results)
            def remove_LABEL(tokens):
                tokens.pop("LABEL")
                return tokens
            patt.addParseAction(remove_LABEL)
            print(patt.parseString("AAB 123 321").dump())
        prints::
            ['AAB', '123', '321']
            - LABEL: AAB

            ['AAB', '123', '321']
        rr�defaultrz-pop() got an unexpected keyword argument '%s'Nrs)r�r�rzrur�)r�r��kwargsr�r��indexr�Zdefaultvaluerwrwrx�pop�s"zParseResults.popcCs||kr||S|SdS)ai
        Returns the named result matching the given key.  If there is no
        such name, returns the given C{defaultValue}, or C{None} if no
        C{defaultValue} is specified.

        Similar to C{dict.get()}.
        
        Example::
            integer = Word(nums)
            date_str = integer("year") + '/' + integer("month") + '/' + integer("day")           

            result = date_str.parseString("1999/12/31")
            print(result.get("year")) # -> '1999'
            print(result.get("hour", "not specified")) # -> 'not specified'
            print(result.get("hour")) # -> None
        Nrw)r��key�defaultValuerwrwrxr�szParseResults.getcCsZ|jj||�xF|jj�D]8\}}x.t|�D]"\}\}}t||||k�||<q,WqWdS)a
        Inserts new element at location index in the list of parsed tokens.
        
        Similar to C{list.insert()}.

        Example::
            print(OneOrMore(Word(nums)).parseString("0 123 321")) # -> ['0', '123', '321']

            # use a parse action to insert the parse location in the front of the parsed results
            def insert_locn(locn, tokens):
                tokens.insert(0, locn)
            print(OneOrMore(Word(nums)).addParseAction(insert_locn).parseString("0 123 321")) # -> [0, '0', '123', '321']
        N)r��insertr�r�r�r�)r�r�ZinsStrr�r�r�r�r�rwrwrxr�2szParseResults.insertcCs|jj|�dS)a�
        Add single element to end of ParseResults list of elements.

        Example::
            print(OneOrMore(Word(nums)).parseString("0 123 321")) # -> ['0', '123', '321']
            
            # use a parse action to compute the sum of the parsed integers, and add it to the end
            def append_sum(tokens):
                tokens.append(sum(map(int, tokens)))
            print(OneOrMore(Word(nums)).addParseAction(append_sum).parseString("0 123 321")) # -> ['0', '123', '321', 444]
        N)r��append)r��itemrwrwrxr�FszParseResults.appendcCs$t|t�r||7}n|jj|�dS)a
        Add sequence of elements to end of ParseResults list of elements.

        Example::
            patt = OneOrMore(Word(alphas))
            
            # use a parse action to append the reverse of the matched strings, to make a palindrome
            def make_palindrome(tokens):
                tokens.extend(reversed([t[::-1] for t in tokens]))
                return ''.join(tokens)
            print(patt.addParseAction(make_palindrome).parseString("lskdj sdlkjf lksd")) # -> 'lskdjsdlkjflksddsklfjkldsjdksl'
        N)rzr"r��extend)r�Zitemseqrwrwrxr�Ts

zParseResults.extendcCs|jdd�=|jj�dS)z7
        Clear all elements and results names.
        N)r�r��clear)r�rwrwrxr�fszParseResults.clearcCsfy||Stk
rdSX||jkr^||jkrD|j|ddStdd�|j|D��SndSdS)Nr�rrrcSsg|]}|d�qS)rrw)r�r�rwrwrxr�wsz,ParseResults.__getattr__.<locals>.<listcomp>rs)r�r�r�r")r�r�rwrwrxr�ms

zParseResults.__getattr__cCs|j�}||7}|S)N)r�)r��otherr�rwrwrx�__add__{szParseResults.__add__cs�|jrnt|j���fdd��|jj�}�fdd�|D�}x4|D],\}}|||<t|dt�r>t|�|d_q>W|j|j7_|jj	|j�|S)Ncs|dkr�S|�S)Nrrw)�a)�offsetrwrxry�sz'ParseResults.__iadd__.<locals>.<lambda>c	s4g|],\}}|D]}|t|d�|d��f�qqS)rrr)r�)r�r��vlistr�)�	addoffsetrwrxr��sz)ParseResults.__iadd__.<locals>.<listcomp>r)
r�r�r�r�rzr"r�r�r��update)r�r�Z
otheritemsZotherdictitemsr�r�rw)rrrx�__iadd__�s


zParseResults.__iadd__cCs&t|t�r|dkr|j�S||SdS)Nr)rzrur�)r�r�rwrwrx�__radd__�szParseResults.__radd__cCsdt|j�t|j�fS)Nz(%s, %s))r�r�r�)r�rwrwrxr��szParseResults.__repr__cCsddjdd�|jD��dS)N�[z, css(|] }t|t�rt|�nt|�VqdS)N)rzr"r�r�)r�r�rwrwrxr��sz'ParseResults.__str__.<locals>.<genexpr>�])r�r�)r�rwrwrxr��szParseResults.__str__r�cCsPg}xF|jD]<}|r"|r"|j|�t|t�r:||j�7}q|jt|��qW|S)N)r�r�rzr"�
_asStringListr�)r��sep�outr�rwrwrxr
�s

zParseResults._asStringListcCsdd�|jD�S)a�
        Returns the parse results as a nested list of matching tokens, all converted to strings.

        Example::
            patt = OneOrMore(Word(alphas))
            result = patt.parseString("sldkj lsdkj sldkj")
            # even though the result prints in string-like form, it is actually a pyparsing ParseResults
            print(type(result), result) # -> <class 'pyparsing.ParseResults'> ['sldkj', 'lsdkj', 'sldkj']
            
            # Use asList() to create an actual list
            result_list = result.asList()
            print(type(result_list), result_list) # -> <class 'list'> ['sldkj', 'lsdkj', 'sldkj']
        cSs"g|]}t|t�r|j�n|�qSrw)rzr"r�)r��resrwrwrxr��sz'ParseResults.asList.<locals>.<listcomp>)r�)r�rwrwrxr��szParseResults.asListcs6tr|j}n|j}�fdd��t�fdd�|�D��S)a�
        Returns the named parse results as a nested dictionary.

        Example::
            integer = Word(nums)
            date_str = integer("year") + '/' + integer("month") + '/' + integer("day")
            
            result = date_str.parseString('12/31/1999')
            print(type(result), repr(result)) # -> <class 'pyparsing.ParseResults'> (['12', '/', '31', '/', '1999'], {'day': [('1999', 4)], 'year': [('12', 0)], 'month': [('31', 2)]})
            
            result_dict = result.asDict()
            print(type(result_dict), repr(result_dict)) # -> <class 'dict'> {'day': '1999', 'year': '12', 'month': '31'}

            # even though a ParseResults supports dict-like access, sometimes you just need to have a dict
            import json
            print(json.dumps(result)) # -> Exception: TypeError: ... is not JSON serializable
            print(json.dumps(result.asDict())) # -> {"month": "31", "day": "1999", "year": "12"}
        cs6t|t�r.|j�r|j�S�fdd�|D�Sn|SdS)Ncsg|]}�|��qSrwrw)r�r�)�toItemrwrxr��sz7ParseResults.asDict.<locals>.toItem.<locals>.<listcomp>)rzr"r��asDict)r�)rrwrxr�s

z#ParseResults.asDict.<locals>.toItemc3s|]\}}|�|�fVqdS)Nrw)r�r�r�)rrwrxr��sz&ParseResults.asDict.<locals>.<genexpr>)�PY_3r�r�r�)r�Zitem_fnrw)rrxr�s
	zParseResults.asDictcCs8t|j�}|jj�|_|j|_|jj|j�|j|_|S)zA
        Returns a new copy of a C{ParseResults} object.
        )r"r�r�r�r�r�rr�)r�r�rwrwrxr��s
zParseResults.copyFcCsPd}g}tdd�|jj�D��}|d}|s8d}d}d}d}	|dk	rJ|}	n|jrV|j}	|	sf|rbdSd}	|||d|	d	g7}x�t|j�D]�\}
}t|t�r�|
|kr�||j||
|o�|dk||�g7}n||jd|o�|dk||�g7}q�d}|
|kr�||
}|�s
|�rq�nd}t	t
|��}
|||d|d	|
d
|d	g	7}q�W|||d
|	d	g7}dj|�S)z�
        (Deprecated) Returns the parse results as XML. Tags are created for tokens and lists that have defined results names.
        �
css(|] \}}|D]}|d|fVqqdS)rrNrw)r�r�rr�rwrwrxr��sz%ParseResults.asXML.<locals>.<genexpr>z  r�NZITEM�<�>z</)r�r�r�r�r�r�rzr"�asXMLr�r�r�)r�ZdoctagZnamedItemsOnly�indentZ	formatted�nlrZ
namedItemsZnextLevelIndentZselfTagr�r
ZresTagZxmlBodyTextrwrwrxr�sT


zParseResults.asXMLcCs:x4|jj�D]&\}}x|D]\}}||kr|SqWqWdS)N)r�r�)r�r�r�rr�r�rwrwrxZ__lookup$s
zParseResults.__lookupcCs�|jr|jS|jr.|j�}|r(|j|�SdSnNt|�dkrxt|j�dkrxtt|jj���dddkrxtt|jj���SdSdS)a(
        Returns the results name for this token expression. Useful when several 
        different expressions might match at a particular location.

        Example::
            integer = Word(nums)
            ssn_expr = Regex(r"\d\d\d-\d\d-\d\d\d\d")
            house_number_expr = Suppress('#') + Word(nums, alphanums)
            user_data = (Group(house_number_expr)("house_number") 
                        | Group(ssn_expr)("ssn")
                        | Group(integer)("age"))
            user_info = OneOrMore(user_data)
            
            result = user_info.parseString("22 111-22-3333 #221B")
            for item in result:
                print(item.getName(), ':', item[0])
        prints::
            age : 22
            ssn : 111-22-3333
            house_number : 221B
        Nrrrrs)rrs)	r�r��_ParseResults__lookupr�r��nextr�r�r�)r��parrwrwrx�getName+s
zParseResults.getNamercCsbg}d}|j|t|j���|�rX|j�r�tdd�|j�D��}xz|D]r\}}|r^|j|�|jd|d||f�t|t�r�|r�|j|j||d��q�|jt|��qH|jt	|��qHWn�t
dd�|D���rX|}x~t|�D]r\}	}
t|
t��r*|jd|d||	|d|d|
j||d�f�q�|jd|d||	|d|dt|
�f�q�Wd	j|�S)
aH
        Diagnostic method for listing out the contents of a C{ParseResults}.
        Accepts an optional C{indent} argument so that this string can be embedded
        in a nested display of other data.

        Example::
            integer = Word(nums)
            date_str = integer("year") + '/' + integer("month") + '/' + integer("day")
            
            result = date_str.parseString('12/31/1999')
            print(result.dump())
        prints::
            ['12', '/', '31', '/', '1999']
            - day: 1999
            - month: 31
            - year: 12
        rcss|]\}}t|�|fVqdS)N)r{)r�r�r�rwrwrxr�gsz$ParseResults.dump.<locals>.<genexpr>z
%s%s- %s: z  rrcss|]}t|t�VqdS)N)rzr")r��vvrwrwrxr�ssz
%s%s[%d]:
%s%s%sr�)
r�r�r�r��sortedr�rzr"�dumpr��anyr�r�)r�r�depth�fullr�NLr�r�r�r�rrwrwrxrPs,

4.zParseResults.dumpcOstj|j�f|�|�dS)a�
        Pretty-printer for parsed results as a list, using the C{pprint} module.
        Accepts additional positional or keyword args as defined for the 
        C{pprint.pprint} method. (U{http://docs.python.org/3/library/pprint.html#pprint.pprint})

        Example::
            ident = Word(alphas, alphanums)
            num = Word(nums)
            func = Forward()
            term = ident | num | Group('(' + func + ')')
            func <<= ident + Group(Optional(delimitedList(term)))
            result = func.parseString("fna a,b,(fnb c,d,200),100")
            result.pprint(width=40)
        prints::
            ['fna',
             ['a',
              'b',
              ['(', 'fnb', ['c', 'd', '200'], ')'],
              '100']]
        N)�pprintr�)r�r�r�rwrwrxr"}szParseResults.pprintcCs.|j|jj�|jdk	r|j�p d|j|jffS)N)r�r�r�r�r�r�)r�rwrwrx�__getstate__�s
zParseResults.__getstate__cCsN|d|_|d\|_}}|_i|_|jj|�|dk	rDt|�|_nd|_dS)Nrrr)r�r�r�r�rr�r�)r��staterZinAccumNamesrwrwrx�__setstate__�s
zParseResults.__setstate__cCs|j|j|j|jfS)N)r�r�r�r�)r�rwrwrx�__getnewargs__�szParseResults.__getnewargs__cCstt|��t|j��S)N)r�r�r�r�)r�rwrwrxr��szParseResults.__dir__)NNTT)N)r�)NFr�T)r�rT)4r�r�r�r�r�rzr�r�r�r�r�r�r��__nonzero__r�r�r�r�r�rr�r�r�r�r�r�r�r�r�r�r�r�r�r�rrrr�r�r
r�rr�rrrrr"r#r%r&r�rwrwrwrxr"-sh&
cCsF|}d|kot|�knr4||ddkr4dS||jdd|�S)aReturns current column within a string, counting newlines as line separators.
   The first column is number 1.

   Note: the default parsing behavior is to expand tabs in the input string
   before starting the parsing process.  See L{I{ParserElement.parseString}<ParserElement.parseString>} for more information
   on parsing strings containing C{<TAB>}s, and suggested methods to maintain a
   consistent view of the parsed string, the parse location, and line and column
   positions within the parsed string.
   rrrr)r��rfind)r��strgr�rwrwrxr9�s
cCs|jdd|�dS)aReturns current line number within a string, counting newlines as line separators.
   The first line is number 1.

   Note: the default parsing behavior is to expand tabs in the input string
   before starting the parsing process.  See L{I{ParserElement.parseString}<ParserElement.parseString>} for more information
   on parsing strings containing C{<TAB>}s, and suggested methods to maintain a
   consistent view of the parsed string, the parse location, and line and column
   positions within the parsed string.
   rrrr)�count)r�r)rwrwrxrJ�s
cCsF|jdd|�}|jd|�}|dkr2||d|�S||dd�SdS)zfReturns the line of text containing loc within a string, counting newlines as line separators.
       rrrrN)r(�find)r�r)ZlastCRZnextCRrwrwrxrG�s
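
   A minimal sketch showing C{col}, C{lineno} and C{line} together (assuming the module-level helpers documented above)::

       data = "abc\ndef"
       loc = data.index("e")
       print(lineno(loc, data), col(loc, data))   # -> 2 2
       print(line(loc, data))                     # -> 'def'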
cCs8tdt|�dt|�dt||�t||�f�dS)NzMatch z at loc z(%d,%d))�printr�rJr9)�instringr��exprrwrwrx�_defaultStartDebugAction�sr/cCs$tdt|�dt|j���dS)NzMatched z -> )r,r�r{r�)r-�startlocZendlocr.�toksrwrwrx�_defaultSuccessDebugAction�sr2cCstdt|��dS)NzException raised:)r,r�)r-r�r.�excrwrwrx�_defaultExceptionDebugAction�sr4cGsdS)zG'Do-nothing' debug action, to suppress debugging output during parsing.Nrw)r�rwrwrxrQ�srqcs��tkr�fdd�Sdg�dg�tdd�dkrFddd	�}dd
d��ntj}tj�d}|dd
�d}|d|d|f�������fdd�}d}yt�dt�d�j�}Wntk
r�t��}YnX||_|S)Ncs�|�S)Nrw)r��lrv)�funcrwrxry�sz_trim_arity.<locals>.<lambda>rFrqro�cSs8tdkrdnd	}tj||dd�|}|j|jfgS)
Nror7rrqrr)�limit)ror7r������)�system_version�	traceback�
extract_stack�filenamerJ)r8r�
frame_summaryrwrwrxr=sz"_trim_arity.<locals>.extract_stackcSs$tj||d�}|d}|j|jfgS)N)r8rrrs)r<�
extract_tbr>rJ)�tbr8Zframesr?rwrwrxr@sz_trim_arity.<locals>.extract_tb�)r8rrcs�x�y �|�dd��}d�d<|Stk
r��dr>�n4z.tj�d}�|dd�ddd��ksj�Wd~X�d�kr��dd7<w�YqXqWdS)NrTrrrq)r8rsrs)r�r~�exc_info)r�r�rA)r@�
foundArityr6r8�maxargs�pa_call_line_synthrwrx�wrappers"z_trim_arity.<locals>.wrapperz<parse action>r��	__class__)ror7)r)rrs)	�singleArgBuiltinsr;r<r=r@�getattrr��	Exceptionr{)r6rEr=Z	LINE_DIFFZ	this_linerG�	func_namerw)r@rDr6r8rErFrx�_trim_arity�s*
rMcs�eZdZdZdZdZedd��Zedd��Zd�dd	�Z	d
d�Z
dd
�Zd�dd�Zd�dd�Z
dd�Zdd�Zdd�Zdd�Zdd�Zdd�Zd�dd �Zd!d"�Zd�d#d$�Zd%d&�Zd'd(�ZGd)d*�d*e�Zed+k	r�Gd,d-�d-e�ZnGd.d-�d-e�ZiZe�Zd/d/gZ d�d0d1�Z!eZ"ed2d3��Z#dZ$ed�d5d6��Z%d�d7d8�Z&e'dfd9d:�Z(d;d<�Z)e'fd=d>�Z*e'dfd?d@�Z+dAdB�Z,dCdD�Z-dEdF�Z.dGdH�Z/dIdJ�Z0dKdL�Z1dMdN�Z2dOdP�Z3dQdR�Z4dSdT�Z5dUdV�Z6dWdX�Z7dYdZ�Z8d�d[d\�Z9d]d^�Z:d_d`�Z;dadb�Z<dcdd�Z=dedf�Z>dgdh�Z?d�didj�Z@dkdl�ZAdmdn�ZBdodp�ZCdqdr�ZDgfdsdt�ZEd�dudv�ZF�fdwdx�ZGdydz�ZHd{d|�ZId}d~�ZJdd��ZKd�d�d��ZLd�d�d��ZM�ZNS)�r$z)Abstract base level parser element class.z 
	
FcCs
|t_dS)a�
        Overrides the default whitespace chars

        Example::
            # default whitespace chars are space, <TAB> and newline
            OneOrMore(Word(alphas)).parseString("abc def\nghi jkl")  # -> ['abc', 'def', 'ghi', 'jkl']
            
            # change to just treat newline as significant
            ParserElement.setDefaultWhitespaceChars(" \t")
            OneOrMore(Word(alphas)).parseString("abc def\nghi jkl")  # -> ['abc', 'def']
        N)r$�DEFAULT_WHITE_CHARS)�charsrwrwrx�setDefaultWhitespaceChars=s
z'ParserElement.setDefaultWhitespaceCharscCs
|t_dS)a�
        Set class to be used for inclusion of string literals into a parser.
        
        Example::
            # default literal class used is Literal
            integer = Word(nums)
            date_str = integer("year") + '/' + integer("month") + '/' + integer("day")           

            date_str.parseString("1999/12/31")  # -> ['1999', '/', '12', '/', '31']


            # change to Suppress
            ParserElement.inlineLiteralsUsing(Suppress)
            date_str = integer("year") + '/' + integer("month") + '/' + integer("day")           

            date_str.parseString("1999/12/31")  # -> ['1999', '12', '31']
        N)r$�_literalStringClass)r�rwrwrx�inlineLiteralsUsingLsz!ParserElement.inlineLiteralsUsingcCs�t�|_d|_d|_d|_||_d|_tj|_	d|_
d|_d|_t�|_
d|_d|_d|_d|_d|_d|_d|_d|_d|_dS)NTFr�)NNN)r��parseAction�
failAction�strRepr�resultsName�
saveAsList�skipWhitespacer$rN�
whiteChars�copyDefaultWhiteChars�mayReturnEmpty�keepTabs�ignoreExprs�debug�streamlined�
mayIndexError�errmsg�modalResults�debugActions�re�callPreparse�
callDuringTry)r��savelistrwrwrxr�as(zParserElement.__init__cCs<tj|�}|jdd�|_|jdd�|_|jr8tj|_|S)a$
        Make a copy of this C{ParserElement}.  Useful for defining different parse actions
        for the same parsing pattern, using copies of the original parse element.
        
        Example::
            integer = Word(nums).setParseAction(lambda toks: int(toks[0]))
            integerK = integer.copy().addParseAction(lambda toks: toks[0]*1024) + Suppress("K")
            integerM = integer.copy().addParseAction(lambda toks: toks[0]*1024*1024) + Suppress("M")
            
            print(OneOrMore(integerK | integerM | integer).parseString("5K 100 640K 256M"))
        prints::
            [5120, 100, 655360, 268435456]
        Equivalent form of C{expr.copy()} is just C{expr()}::
            integerM = integer().addParseAction(lambda toks: toks[0]*1024*1024) + Suppress("M")
        N)r�rSr]rZr$rNrY)r�Zcpyrwrwrxr�xs
zParserElement.copycCs*||_d|j|_t|d�r&|j|j_|S)af
        Define name for this expression, makes debugging and exception messages clearer.
        
        Example::
            Word(nums).parseString("ABC")  # -> Exception: Expected W:(0123...) (at char 0), (line:1, col:1)
            Word(nums).setName("integer").parseString("ABC")  # -> Exception: Expected integer (at char 0), (line:1, col:1)
        z	Expected �	exception)r�rar�rhr�)r�r�rwrwrx�setName�s


zParserElement.setNamecCs4|j�}|jd�r"|dd�}d}||_||_|S)aP
        Define name for referencing matching tokens as a nested attribute
        of the returned parse results.
        NOTE: this returns a *copy* of the original C{ParserElement} object;
        this is so that the client can define a basic element, such as an
        integer, and reference it in multiple places with different names.

        You can also set results names using the abbreviated syntax,
        C{expr("name")} in place of C{expr.setResultsName("name")} - 
        see L{I{__call__}<__call__>}.

        Example::
            date_str = (integer.setResultsName("year") + '/' 
                        + integer.setResultsName("month") + '/' 
                        + integer.setResultsName("day"))

            # equivalent form:
            date_str = integer("year") + '/' + integer("month") + '/' + integer("day")
        �*NrrTrs)r��endswithrVrb)r�r��listAllMatchesZnewselfrwrwrx�setResultsName�s
zParserElement.setResultsNameTcs@|r&|j�d�fdd�	}�|_||_nt|jd�r<|jj|_|S)z�Method to invoke the Python pdb debugger when this element is
           about to be parsed. Set C{breakFlag} to True to enable, False to
           disable.
        Tcsddl}|j��||||�S)Nr)�pdbZ	set_trace)r-r��	doActions�callPreParsern)�_parseMethodrwrx�breaker�sz'ParserElement.setBreak.<locals>.breaker�_originalParseMethod)TT)�_parsersr�)r�Z	breakFlagrrrw)rqrx�setBreak�s
zParserElement.setBreakcOs&tttt|���|_|jdd�|_|S)a
        Define action to perform when successfully matching parse element definition.
        Parse action fn is a callable method with 0-3 arguments, called as C{fn(s,loc,toks)},
        C{fn(loc,toks)}, C{fn(toks)}, or just C{fn()}, where:
         - s   = the original string being parsed (see note below)
         - loc = the location of the matching substring
         - toks = a list of the matched tokens, packaged as a C{L{ParseResults}} object
        If the functions in fns modify the tokens, they can return them as the return
        value from fn, and the modified list of tokens will replace the original.
        Otherwise, fn does not need to return any value.

        Optional keyword arguments:
         - callDuringTry = (default=C{False}) indicate if parse action should be run during lookaheads and alternate testing

        Note: the default parsing behavior is to expand tabs in the input string
        before starting the parsing process.  See L{I{parseString}<parseString>} for more information
        on parsing strings containing C{<TAB>}s, and suggested methods to maintain a
        consistent view of the parsed string, the parse location, and line and column
        positions within the parsed string.
        
        Example::
            integer = Word(nums)
            date_str = integer + '/' + integer + '/' + integer

            date_str.parseString("1999/12/31")  # -> ['1999', '/', '12', '/', '31']

            # use parse action to convert to ints at parse time
            integer = Word(nums).setParseAction(lambda toks: int(toks[0]))
            date_str = integer + '/' + integer + '/' + integer

            # note that integer fields are now ints, not strings
            date_str.parseString("1999/12/31")  # -> [1999, '/', 12, '/', 31]
        rfF)r��maprMrSr�rf)r��fnsr�rwrwrxr��s"zParserElement.setParseActioncOs4|jtttt|���7_|jp,|jdd�|_|S)z�
        Add parse action to expression's list of parse actions. See L{I{setParseAction}<setParseAction>}.
        
        See examples in L{I{copy}<copy>}.
        rfF)rSr�rvrMrfr�)r�rwr�rwrwrx�addParseAction�szParserElement.addParseActioncsb|jdd��|jdd�rtnt�x(|D] ����fdd�}|jj|�q&W|jpZ|jdd�|_|S)a�Add a boolean predicate function to expression's list of parse actions. See 
        L{I{setParseAction}<setParseAction>} for function call signatures. Unlike C{setParseAction}, 
        functions passed to C{addCondition} need to return boolean success/fail of the condition.

        Optional keyword arguments:
         - message = define a custom message to be used in the raised exception
         - fatal   = if True, will raise ParseFatalException to stop parsing immediately; otherwise will raise ParseException
         
        Example::
            integer = Word(nums).setParseAction(lambda toks: int(toks[0]))
            year_int = integer.copy()
            year_int.addCondition(lambda toks: toks[0] >= 2000, message="Only support years 2000 and later")
            date_str = year_int + '/' + integer + '/' + integer

            result = date_str.parseString("1999/12/31")  # -> Exception: Only support years 2000 and later (at char 0), (line:1, col:1)
        �messagezfailed user-defined condition�fatalFcs$tt��|||��s �||���dS)N)r�rM)r�r5rv)�exc_type�fnr�rwrx�pasz&ParserElement.addCondition.<locals>.parf)r�r!rrSr�rf)r�rwr�r}rw)r{r|r�rx�addCondition�s
zParserElement.addConditioncCs
||_|S)aDefine action to perform if parsing fails at this expression.
           Fail action fn is a callable function that takes the arguments
           C{fn(s,loc,expr,err)} where:
            - s = string being parsed
            - loc = location where expression match was attempted and failed
            - expr = the parse expression that failed
            - err = the exception thrown
           The function returns no value.  It may throw C{L{ParseFatalException}}
           if it is desired to stop parsing immediately.)rT)r�r|rwrwrx�
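
           A minimal sketch (assuming the standard pyparsing names C{Word} and C{nums})::

               def report_failure(s, loc, expr, err):
                   print("no match for %s at char %d" % (expr, loc))

               integer = Word(nums).setFailAction(report_failure)
               integer.searchString("abc")   # report_failure fires at each failed match attempt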
setFailActions
zParserElement.setFailActioncCsZd}xP|rTd}xB|jD]8}yx|j||�\}}d}qWWqtk
rLYqXqWqW|S)NTF)r]rtr)r�r-r�Z
exprsFound�eZdummyrwrwrx�_skipIgnorables#szParserElement._skipIgnorablescCsL|jr|j||�}|jrH|j}t|�}x ||krF|||krF|d7}q(W|S)Nrr)r]r�rXrYr�)r�r-r�Zwt�instrlenrwrwrx�preParse0szParserElement.preParsecCs|gfS)Nrw)r�r-r�rorwrwrx�	parseImpl<szParserElement.parseImplcCs|S)Nrw)r�r-r��	tokenlistrwrwrx�	postParse?szParserElement.postParsec"Cs�|j}|s|jr�|jdr,|jd|||�|rD|jrD|j||�}n|}|}yDy|j|||�\}}Wn(tk
r�t|t|�|j	|��YnXWnXt
k
r�}	z<|jdr�|jd||||	�|jr�|j||||	��WYdd}	~	XnXn�|o�|j�r|j||�}n|}|}|j�s$|t|�k�rhy|j|||�\}}Wn*tk
�rdt|t|�|j	|��YnXn|j|||�\}}|j|||�}t
||j|j|jd�}
|j�r�|�s�|j�r�|�rVyRxL|jD]B}||||
�}|dk	�r�t
||j|j�o�t|t
tf�|jd�}
�q�WWnFt
k
�rR}	z(|jd�r@|jd||||	��WYdd}	~	XnXnNxL|jD]B}||||
�}|dk	�r^t
||j|j�o�t|t
tf�|jd�}
�q^W|�r�|jd�r�|jd|||||
�||
fS)Nrrq)r�r�rr)r^rTrcrer�r�r�rr�rarr`r�r"rVrWrbrSrfrzr�)r�r-r�rorpZ	debugging�prelocZtokensStart�tokens�errZ	retTokensr|rwrwrx�
_parseNoCacheCsp





zParserElement._parseNoCachecCs>y|j||dd�dStk
r8t|||j|��YnXdS)NF)ror)rtr!rra)r�r-r�rwrwrx�tryParse�szParserElement.tryParsecCs2y|j||�Wnttfk
r(dSXdSdS)NFT)r�rr�)r�r-r�rwrwrx�canParseNext�s
zParserElement.canParseNextc@seZdZdd�ZdS)zParserElement._UnboundedCachecsdi�t�|_���fdd�}�fdd�}�fdd�}tj||�|_tj||�|_tj||�|_dS)Ncs�j|��S)N)r�)r�r�)�cache�not_in_cacherwrxr��sz3ParserElement._UnboundedCache.__init__.<locals>.getcs|�|<dS)Nrw)r�r�r�)r�rwrx�set�sz3ParserElement._UnboundedCache.__init__.<locals>.setcs�j�dS)N)r�)r�)r�rwrxr��sz5ParserElement._UnboundedCache.__init__.<locals>.clear)r�r��types�
MethodTyper�r�r�)r�r�r�r�rw)r�r�rxr��sz&ParserElement._UnboundedCache.__init__N)r�r�r�r�rwrwrwrx�_UnboundedCache�sr�Nc@seZdZdd�ZdS)zParserElement._FifoCachecsht�|_�t����fdd�}��fdd�}�fdd�}tj||�|_tj||�|_tj||�|_dS)Ncs�j|��S)N)r�)r�r�)r�r�rwrxr��sz.ParserElement._FifoCache.__init__.<locals>.getcs"|�|<t���kr�jd�dS)NF)r��popitem)r�r�r�)r��sizerwrxr��sz.ParserElement._FifoCache.__init__.<locals>.setcs�j�dS)N)r�)r�)r�rwrxr��sz0ParserElement._FifoCache.__init__.<locals>.clear)r�r��_OrderedDictr�r�r�r�r�)r�r�r�r�r�rw)r�r�r�rxr��sz!ParserElement._FifoCache.__init__N)r�r�r�r�rwrwrwrx�
_FifoCache�sr�c@seZdZdd�ZdS)zParserElement._FifoCachecsvt�|_�i�tjg�����fdd�}���fdd�}��fdd�}tj||�|_tj||�|_tj||�|_dS)Ncs�j|��S)N)r�)r�r�)r�r�rwrxr��sz.ParserElement._FifoCache.__init__.<locals>.getcs2|�|<t���kr$�j�j�d��j|�dS)N)r�r��popleftr�)r�r�r�)r��key_fifor�rwrxr��sz.ParserElement._FifoCache.__init__.<locals>.setcs�j��j�dS)N)r�)r�)r�r�rwrxr��sz0ParserElement._FifoCache.__init__.<locals>.clear)	r�r��collections�dequer�r�r�r�r�)r�r�r�r�r�rw)r�r�r�r�rxr��sz!ParserElement._FifoCache.__init__N)r�r�r�r�rwrwrwrxr��srcCs�d\}}|||||f}tj��tj}|j|�}	|	|jkr�tj|d7<y|j||||�}	Wn8tk
r�}
z|j||
j	|
j
���WYdd}
~
Xq�X|j||	d|	dj�f�|	Sn4tj|d7<t|	t
�r�|	�|	d|	dj�fSWdQRXdS)Nrrr)rrr)r$�packrat_cache_lock�
packrat_cacher�r��packrat_cache_statsr�rr�rHr�r�rzrK)r�r-r�rorpZHITZMISS�lookupr�r�r�rwrwrx�_parseCache�s$


zParserElement._parseCachecCs(tjj�dgttj�tjdd�<dS)Nr)r$r�r�r�r�rwrwrwrx�
resetCache�s
zParserElement.resetCache�cCs8tjs4dt_|dkr tj�t_ntj|�t_tjt_dS)a�Enables "packrat" parsing, which adds memoizing to the parsing logic.
           Repeated parse attempts at the same string location (which happens
           often in many complex grammars) can immediately return a cached value,
           instead of re-executing parsing/validating code.  Memoizing is done for
           both valid results and parsing exceptions.
           
           Parameters:
            - cache_size_limit - (default=C{128}) - if an integer value is provided
              will limit the size of the packrat cache; if None is passed, then
              the cache size will be unbounded; if 0 is passed, the cache will
              be effectively disabled.
            
           This speedup may break existing programs that use parse actions that
           have side-effects.  For this reason, packrat parsing is disabled when
           you first import pyparsing.  To activate the packrat feature, your
           program must call the class method C{ParserElement.enablePackrat()}.  If
           your program uses C{psyco} to "compile as you go", you must call
           C{enablePackrat} before calling C{psyco.full()}.  If you do not do this,
           Python will crash.  For best results, call C{enablePackrat()} immediately
           after importing pyparsing.
           
           Example::
               import pyparsing
               pyparsing.ParserElement.enablePackrat()
        TN)r$�_packratEnabledr�r�r�r�rt)Zcache_size_limitrwrwrx�
enablePackratszParserElement.enablePackratcCs�tj�|js|j�x|jD]}|j�qW|js<|j�}y<|j|d�\}}|rv|j||�}t	�t
�}|j||�Wn0tk
r�}ztjr��n|�WYdd}~XnX|SdS)aB
        Execute the parse expression with the given string.
        This is the main interface to the client code, once the complete
        expression has been built.

        If you want the grammar to require that the entire input string be
        successfully parsed, then set C{parseAll} to True (equivalent to ending
        the grammar with C{L{StringEnd()}}).

        Note: C{parseString} implicitly calls C{expandtabs()} on the input string,
        in order to report proper column numbers in parse actions.
        If the input string contains tabs and
        the grammar uses parse actions that use the C{loc} argument to index into the
        string being parsed, you can ensure you have a consistent view of the input
        string by:
         - calling C{parseWithTabs} on your grammar before calling C{parseString}
           (see L{I{parseWithTabs}<parseWithTabs>})
         - define your parse action using the full C{(s,loc,toks)} signature, and
           reference the input string using the parse action's C{s} argument
          - explicitly expand the tabs in your input string before calling
           C{parseString}
        
        Example::
            Word('a').parseString('aaaaabaaa')  # -> ['aaaaa']
            Word('a').parseString('aaaaabaaa', parseAll=True)  # -> Exception: Expected end of text
        rN)
r$r�r_�
streamliner]r\�
expandtabsrtr�r
r)r�verbose_stacktrace)r�r-�parseAllr�r�r�Zser3rwrwrx�parseString#s$zParserElement.parseStringccs@|js|j�x|jD]}|j�qW|js8t|�j�}t|�}d}|j}|j}t	j
�d}	y�x�||kon|	|k�ry |||�}
|||
dd�\}}Wntk
r�|
d}Yq`X||kr�|	d7}	||
|fV|r�|||�}
|
|kr�|}q�|d7}n|}q`|
d}q`WWn4tk
�r:}zt	j
�r&�n|�WYdd}~XnXdS)a�
        Scan the input string for expression matches.  Each match will return the
        matching tokens, start location, and end location.  May be called with optional
        C{maxMatches} argument, to clip scanning after 'n' matches are found.  If
        C{overlap} is specified, then overlapping matches will be reported.

        Note that the start and end locations are reported relative to the string
        being parsed.  See L{I{parseString}<parseString>} for more information on parsing
        strings with embedded tabs.

        Example::
            source = "sldjf123lsdjjkf345sldkjf879lkjsfd987"
            print(source)
            for tokens,start,end in Word(alphas).scanString(source):
                print(' '*start + '^'*(end-start))
                print(' '*start + tokens[0])
        
        prints::
        
            sldjf123lsdjjkf345sldkjf879lkjsfd987
            ^^^^^
            sldjf
                    ^^^^^^^
                    lsdjjkf
                              ^^^^^^
                              sldkjf
                                       ^^^^^^
                                       lkjsfd
        rF)rprrN)r_r�r]r\r�r�r�r�rtr$r�rrr�)r�r-�
maxMatchesZoverlapr�r�r�Z
preparseFnZparseFn�matchesr�ZnextLocr�Znextlocr3rwrwrx�
scanStringUsB


zParserElement.scanStringcCs�g}d}d|_y�xh|j|�D]Z\}}}|j|||��|rrt|t�rT||j�7}nt|t�rh||7}n
|j|�|}qW|j||d��dd�|D�}djtt	t
|���Stk
r�}ztj
rȂn|�WYdd}~XnXdS)af
        Extension to C{L{scanString}}, to modify matching text with modified tokens that may
        be returned from a parse action.  To use C{transformString}, define a grammar and
        attach a parse action to it that modifies the returned token list.
        Invoking C{transformString()} on a target string will then scan for matches,
        and replace the matched text patterns according to the logic in the parse
        action.  C{transformString()} returns the resulting transformed string.
        
        Example::
            wd = Word(alphas)
            wd.setParseAction(lambda toks: toks[0].title())
            
            print(wd.transformString("now is the winter of our discontent made glorious summer by this sun of york."))
        Prints::
            Now Is The Winter Of Our Discontent Made Glorious Summer By This Sun Of York.
        rTNcSsg|]}|r|�qSrwrw)r��orwrwrxr��sz1ParserElement.transformString.<locals>.<listcomp>r�)r\r�r�rzr"r�r�r�rvr��_flattenrr$r�)r�r-rZlastErvr�r�r3rwrwrxr��s(



zParserElement.transformStringcCsPytdd�|j||�D��Stk
rJ}ztjr6�n|�WYdd}~XnXdS)a~
        Another extension to C{L{scanString}}, simplifying the access to the tokens found
        to match the given parse expression.  May be called with optional
        C{maxMatches} argument, to clip searching after 'n' matches are found.
        
        Example::
            # a capitalized word starts with an uppercase letter, followed by zero or more lowercase letters
            cap_word = Word(alphas.upper(), alphas.lower())
            
            print(cap_word.searchString("More than Iron, more than Lead, more than Gold I need Electricity"))
        prints::
            ['More', 'Iron', 'Lead', 'Gold', 'I']
        cSsg|]\}}}|�qSrwrw)r�rvr�r�rwrwrxr��sz.ParserElement.searchString.<locals>.<listcomp>N)r"r�rr$r�)r�r-r�r3rwrwrx�searchString�szParserElement.searchStringc	csXd}d}x<|j||d�D]*\}}}|||�V|r>|dV|}qW||d�VdS)a[
        Generator method to split a string using the given expression as a separator.
        May be called with optional C{maxsplit} argument, to limit the number of splits;
        and the optional C{includeSeparators} argument (default=C{False}), if the separating
        matching text should be included in the split results.
        
        Example::        
            punc = oneOf(list(".,;:/-!?"))
            print(list(punc.split("This, this?, this sentence, is badly punctuated!")))
        prints::
            ['This', ' this', '', ' this sentence', ' is badly punctuated', '']
        r)r�N)r�)	r�r-�maxsplitZincludeSeparatorsZsplitsZlastrvr�r�rwrwrxr��s

zParserElement.splitcCsFt|t�rtj|�}t|t�s:tjdt|�tdd�dSt||g�S)a�
        Implementation of + operator - returns C{L{And}}. Adding strings to a ParserElement
        converts them to L{Literal}s by default.
        
        Example::
            greet = Word(alphas) + "," + Word(alphas) + "!"
            hello = "Hello, World!"
            print (hello, "->", greet.parseString(hello))
        Prints::
            Hello, World! -> ['Hello', ',', 'World', '!']
        z4Cannot combine element of type %s with ParserElementrq)�
stacklevelN)	rzr�r$rQ�warnings�warnr��
SyntaxWarningr)r�r�rwrwrxr�s



zParserElement.__add__cCsBt|t�rtj|�}t|t�s:tjdt|�tdd�dS||S)z]
        Implementation of + operator when left operand is not a C{L{ParserElement}}
        z4Cannot combine element of type %s with ParserElementrq)r�N)rzr�r$rQr�r�r�r�)r�r�rwrwrxrs



zParserElement.__radd__cCsLt|t�rtj|�}t|t�s:tjdt|�tdd�dSt|tj	�|g�S)zQ
        Implementation of - operator, returns C{L{And}} with error stop
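
        A minimal sketch of the error stop (assuming the standard pyparsing names C{Literal}, C{Word} and C{nums}): once the tokens before C{'-'} have matched, a failure in the remainder is reported at that point instead of being backtracked::

            port_setting = Literal("port") - "=" - Word(nums)
            port_setting.parseString("port = 8080")   # -> ['port', '=', '8080']
            port_setting.parseString("port = abc")    # raises ParseSyntaxException at "abc"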
        z4Cannot combine element of type %s with ParserElementrq)r�N)
rzr�r$rQr�r�r�r�r�
_ErrorStop)r�r�rwrwrx�__sub__s



zParserElement.__sub__cCsBt|t�rtj|�}t|t�s:tjdt|�tdd�dS||S)z]
        Implementation of - operator when left operand is not a C{L{ParserElement}}
        z4Cannot combine element of type %s with ParserElementrq)r�N)rzr�r$rQr�r�r�r�)r�r�rwrwrx�__rsub__ s



zParserElement.__rsub__cs�t|t�r|d}}n�t|t�r�|ddd�}|ddkrHd|df}t|dt�r�|ddkr�|ddkrvt��S|ddkr�t��S�|dt��SnJt|dt�r�t|dt�r�|\}}||8}ntdt|d�t|d���ntdt|���|dk�rtd��|dk�rtd��||k�o2dkn�rBtd	��|�r���fd
d��|�r�|dk�rt��|�}nt�g|��|�}n�|�}n|dk�r��}nt�g|�}|S)
a�
        Implementation of * operator, allows use of C{expr * 3} in place of
        C{expr + expr + expr}.  Expressions may also be multiplied by a 2-integer
        tuple, similar to C{{min,max}} multipliers in regular expressions.  Tuples
        may also include C{None} as in:
         - C{expr*(n,None)} or C{expr*(n,)} is equivalent
              to C{expr*n + L{ZeroOrMore}(expr)}
              (read as "at least n instances of C{expr}")
         - C{expr*(None,n)} is equivalent to C{expr*(0,n)}
              (read as "0 to n instances of C{expr}")
         - C{expr*(None,None)} is equivalent to C{L{ZeroOrMore}(expr)}
         - C{expr*(1,None)} is equivalent to C{L{OneOrMore}(expr)}

        Note that C{expr*(None,n)} does not raise an exception if
        more than n exprs exist in the input stream; that is,
        C{expr*(None,n)} does not enforce a maximum number of expr
        occurrences.  If this behavior is desired, then write
        C{expr*(None,n) + ~expr}
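
        A minimal sketch (assuming the standard pyparsing names C{Word} and C{nums})::

            triple = Word(nums) * 3
            print(triple.parseString("12 34 56"))    # -> ['12', '34', '56']

            one_to_three = Word(nums) * (1, 3)
            print(one_to_three.parseString("7 8"))   # -> ['7', '8']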
        rNrqrrz7cannot multiply 'ParserElement' and ('%s','%s') objectsz0cannot multiply 'ParserElement' and '%s' objectsz/cannot multiply ParserElement by negative valuez@second tuple value must be greater or equal to first tuple valuez+cannot multiply ParserElement by 0 or (0,0)cs(|dkrt��|d��St��SdS)Nrr)r)�n)�makeOptionalListr�rwrxr�]sz/ParserElement.__mul__.<locals>.makeOptionalList)NN)	rzru�tupler2rr�r��
ValueErrorr)r�r�ZminElementsZoptElementsr�rw)r�r�rx�__mul__,sD







zParserElement.__mul__cCs
|j|�S)N)r�)r�r�rwrwrx�__rmul__pszParserElement.__rmul__cCsFt|t�rtj|�}t|t�s:tjdt|�tdd�dSt||g�S)zI
        Implementation of | operator - returns C{L{MatchFirst}}
        z4Cannot combine element of type %s with ParserElementrq)r�N)	rzr�r$rQr�r�r�r�r)r�r�rwrwrx�__or__ss



zParserElement.__or__cCsBt|t�rtj|�}t|t�s:tjdt|�tdd�dS||BS)z]
        Implementation of | operator when left operand is not a C{L{ParserElement}}
        z4Cannot combine element of type %s with ParserElementrq)r�N)rzr�r$rQr�r�r�r�)r�r�rwrwrx�__ror__s



zParserElement.__ror__cCsFt|t�rtj|�}t|t�s:tjdt|�tdd�dSt||g�S)zA
        Implementation of ^ operator - returns C{L{Or}}
        z4Cannot combine element of type %s with ParserElementrq)r�N)	rzr�r$rQr�r�r�r�r)r�r�rwrwrx�__xor__�s



zParserElement.__xor__cCsBt|t�rtj|�}t|t�s:tjdt|�tdd�dS||AS)z]
        Implementation of ^ operator when left operand is not a C{L{ParserElement}}
        z4Cannot combine element of type %s with ParserElementrq)r�N)rzr�r$rQr�r�r�r�)r�r�rwrwrx�__rxor__�s



zParserElement.__rxor__cCsFt|t�rtj|�}t|t�s:tjdt|�tdd�dSt||g�S)zC
        Implementation of & operator - returns C{L{Each}}
        z4Cannot combine element of type %s with ParserElementrq)r�N)	rzr�r$rQr�r�r�r�r)r�r�rwrwrx�__and__�s



zParserElement.__and__cCsBt|t�rtj|�}t|t�s:tjdt|�tdd�dS||@S)z]
        Implementation of & operator when left operand is not a C{L{ParserElement}}
        z4Cannot combine element of type %s with ParserElementrq)r�N)rzr�r$rQr�r�r�r�)r�r�rwrwrx�__rand__�s



zParserElement.__rand__cCst|�S)zE
        Implementation of ~ operator - returns C{L{NotAny}}
        )r)r�rwrwrx�
__invert__�szParserElement.__invert__cCs|dk	r|j|�S|j�SdS)a

        Shortcut for C{L{setResultsName}}, with C{listAllMatches=False}.
        
        If C{name} is given with a trailing C{'*'} character, then C{listAllMatches} will be
        passed as C{True}.
           
        If C{name} is omitted, same as calling C{L{copy}}.

        Example::
            # these are equivalent
            userdata = Word(alphas).setResultsName("name") + Word(nums+"-").setResultsName("socsecno")
            userdata = Word(alphas)("name") + Word(nums+"-")("socsecno")             
        N)rmr�)r�r�rwrwrx�__call__�s
zParserElement.__call__cCst|�S)z�
        Suppresses the output of this C{ParserElement}; useful to keep punctuation from
        cluttering up returned output.
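
        A minimal sketch (assuming the standard pyparsing names C{Word}, C{alphas} and C{Literal})::

            wd = Word(alphas)
            print((wd + '/' + wd).parseString("july/august"))
            # -> ['july', '/', 'august']
            print((wd + Literal('/').suppress() + wd).parseString("july/august"))
            # -> ['july', 'august']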
        )r+)r�rwrwrx�suppress�szParserElement.suppresscCs
d|_|S)a
        Disables the skipping of whitespace before matching the characters in the
        C{ParserElement}'s defined pattern.  This is normally only used internally by
        the pyparsing module, but may be needed in some whitespace-sensitive grammars.
        F)rX)r�rwrwrx�leaveWhitespace�szParserElement.leaveWhitespacecCsd|_||_d|_|S)z8
        Overrides the default whitespace chars
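
        A minimal sketch (assuming the standard pyparsing names C{Word}, C{alphas} and C{OneOrMore}); removing the newline from this element's whitespace set stops matching at the line break::

            word = Word(alphas).setWhitespaceChars(" \t")
            print(OneOrMore(word).parseString("abc def\nghi"))   # -> ['abc', 'def']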
        TF)rXrYrZ)r�rOrwrwrx�setWhitespaceChars�sz ParserElement.setWhitespaceCharscCs
d|_|S)z�
        Overrides default behavior to expand C{<TAB>}s to spaces before parsing the input string.
        Must be called before C{parseString} when the input grammar contains elements that
        match C{<TAB>} characters.
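
        A minimal sketch (assuming the standard pyparsing names C{Literal}, C{Word} and C{printables})::

            row = Literal("name\t") + Word(printables)
            row.parseWithTabs().parseString("name\tvalue")   # -> ['name\t', 'value']
            # without parseWithTabs(), the tab is expanded to spaces and the literal no longer matches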
        T)r\)r�rwrwrx�
parseWithTabs�szParserElement.parseWithTabscCsLt|t�rt|�}t|t�r4||jkrH|jj|�n|jjt|j���|S)a�
        Define expression to be ignored (e.g., comments) while doing pattern
        matching; may be called repeatedly, to define multiple comment or other
        ignorable patterns.
        
        Example::
            patt = OneOrMore(Word(alphas))
            patt.parseString('ablaj /* comment */ lskjd') # -> ['ablaj']
            
            patt.ignore(cStyleComment)
            patt.parseString('ablaj /* comment */ lskjd') # -> ['ablaj', 'lskjd']
        )rzr�r+r]r�r�)r�r�rwrwrx�ignore�s


zParserElement.ignorecCs"|pt|pt|ptf|_d|_|S)zT
        Enable display of debugging messages while doing pattern matching.
        T)r/r2r4rcr^)r�ZstartActionZ
successActionZexceptionActionrwrwrx�setDebugActions
s
zParserElement.setDebugActionscCs|r|jttt�nd|_|S)a�
        Enable display of debugging messages while doing pattern matching.
        Set C{flag} to True to enable, False to disable.

        Example::
            wd = Word(alphas).setName("alphaword")
            integer = Word(nums).setName("numword")
            term = wd | integer
            
            # turn on debugging for wd
            wd.setDebug()

            OneOrMore(term).parseString("abc 123 xyz 890")
        
        prints::
            Match alphaword at loc 0(1,1)
            Matched alphaword -> ['abc']
            Match alphaword at loc 3(1,4)
            Exception raised:Expected alphaword (at char 4), (line:1, col:5)
            Match alphaword at loc 7(1,8)
            Matched alphaword -> ['xyz']
            Match alphaword at loc 11(1,12)
            Exception raised:Expected alphaword (at char 12), (line:1, col:13)
            Match alphaword at loc 15(1,16)
            Exception raised:Expected alphaword (at char 15), (line:1, col:16)

        The output shown is that produced by the default debug actions - custom debug actions can be
        specified using L{setDebugActions}. Prior to attempting
        to match the C{wd} expression, the debugging message C{"Match <exprname> at loc <n>(<line>,<col>)"}
        is shown. Then if the parse succeeds, a C{"Matched"} message is shown, or an C{"Exception raised"}
        message is shown. Also note the use of L{setName} to assign a human-readable name to the expression,
        which makes debugging and exception messages easier to understand - for instance, the default
        name created for the C{Word} expression without calling C{setName} is C{"W:(ABCD...)"}.
        F)r�r/r2r4r^)r��flagrwrwrx�setDebugs#zParserElement.setDebugcCs|jS)N)r�)r�rwrwrxr�@szParserElement.__str__cCst|�S)N)r�)r�rwrwrxr�CszParserElement.__repr__cCsd|_d|_|S)NT)r_rU)r�rwrwrxr�FszParserElement.streamlinecCsdS)Nrw)r�r�rwrwrx�checkRecursionKszParserElement.checkRecursioncCs|jg�dS)zj
        Check defined expressions for valid structure, check for infinite recursive definitions.
        N)r�)r��
validateTracerwrwrx�validateNszParserElement.validatecCs�y|j�}Wn2tk
r>t|d��}|j�}WdQRXYnXy|j||�Stk
r|}ztjrh�n|�WYdd}~XnXdS)z�
        Execute the parse expression on the given file or filename.
        If a filename is specified (instead of a file object),
        the entire file is opened, read, and closed before parsing.
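
        A minimal sketch (the file name "settings.txt" is hypothetical; assumes the standard pyparsing names C{Word}, C{alphas}, C{nums} and C{OneOrMore})::

            setting = Word(alphas)("key") + "=" + Word(nums)("value")
            results = OneOrMore(setting).parseFile("settings.txt")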
        �rN)�readr��openr�rr$r�)r�Zfile_or_filenamer�Z
file_contents�fr3rwrwrx�	parseFileTszParserElement.parseFilecsHt|t�r"||kp t|�t|�kSt|t�r6|j|�Stt|�|kSdS)N)rzr$�varsr�r��super)r�r�)rHrwrx�__eq__hs



zParserElement.__eq__cCs
||kS)Nrw)r�r�rwrwrx�__ne__pszParserElement.__ne__cCstt|��S)N)�hash�id)r�rwrwrx�__hash__sszParserElement.__hash__cCs||kS)Nrw)r�r�rwrwrx�__req__vszParserElement.__req__cCs
||kS)Nrw)r�r�rwrwrx�__rne__yszParserElement.__rne__cCs0y|jt|�|d�dStk
r*dSXdS)a�
        Method for quick testing of a parser against a test string. Good for simple 
        inline microtests of sub expressions while building up a larger parser.
           
        Parameters:
         - testString - to test against this expression for a match
         - parseAll - (default=C{True}) - flag to pass to C{L{parseString}} when running tests
            
        Example::
            expr = Word(nums)
            assert expr.matches("100")
        )r�TFN)r�r�r)r�Z
testStringr�rwrwrxr�|s

zParserElement.matches�#cCs�t|t�r"tttj|j�j���}t|t�r4t|�}g}g}d}	�x�|D�]�}
|dk	rb|j	|
d�sl|rx|
rx|j
|
�qH|
s~qHdj|�|
g}g}y:|
jdd�}
|j
|
|d�}|j
|j|d��|	o�|}	Wn�tk
�rx}
z�t|
t�r�dnd	}d|
k�r0|j
t|
j|
��|j
d
t|
j|
�dd|�n|j
d
|
jd|�|j
d
t|
��|	�ob|}	|
}WYdd}
~
XnDtk
�r�}z&|j
dt|��|	�o�|}	|}WYdd}~XnX|�r�|�r�|j
d	�tdj|��|j
|
|f�qHW|	|fS)a3
        Execute the parse expression on a series of test strings, showing each
        test, the parsed results or where the parse failed. Quick and easy way to
        run a parse expression against a list of sample strings.
           
        Parameters:
         - tests - a list of separate test strings, or a multiline string of test strings
         - parseAll - (default=C{True}) - flag to pass to C{L{parseString}} when running tests           
         - comment - (default=C{'#'}) - expression for indicating embedded comments in the test 
              string; pass None to disable comment filtering
         - fullDump - (default=C{True}) - dump results as list followed by results names in nested outline;
              if False, only dump nested list
         - printResults - (default=C{True}) prints test output to stdout
         - failureTests - (default=C{False}) indicates if these tests are expected to fail parsing

        Returns: a (success, results) tuple, where success indicates that all tests succeeded
        (or failed if C{failureTests} is True), and the results contain a list of lines of each 
        test's output
        
        Example::
            number_expr = pyparsing_common.number.copy()

            result = number_expr.runTests('''
                # unsigned integer
                100
                # negative integer
                -100
                # float with scientific notation
                6.02e23
                # integer with scientific notation
                1e-12
                ''')
            print("Success" if result[0] else "Failed!")

            result = number_expr.runTests('''
                # stray character
                100Z
                # missing leading digit before '.'
                -.100
                # too many '.'
                3.14.159
                ''', failureTests=True)
            print("Success" if result[0] else "Failed!")
        prints::
            # unsigned integer
            100
            [100]

            # negative integer
            -100
            [-100]

            # float with scientific notation
            6.02e23
            [6.02e+23]

            # integer with scientific notation
            1e-12
            [1e-12]

            Success
            
            # stray character
            100Z
               ^
            FAIL: Expected end of text (at char 3), (line:1, col:4)

            # missing leading digit before '.'
            -.100
            ^
            FAIL: Expected {real number with scientific notation | real number | signed integer} (at char 0), (line:1, col:1)

            # too many '.'
            3.14.159
                ^
            FAIL: Expected end of text (at char 4), (line:1, col:5)

            Success

        Each test string must be on a single line. If you want to test a string that spans multiple
        lines, create a test like this::

            expr.runTests(r"this is a test\n of strings that spans \n 3 lines")
        
        (Note that this is a raw string literal, you must include the leading 'r'.)
        TNFrz\n)r�)r z(FATAL)r�� rr�^zFAIL: zFAIL-EXCEPTION: )rzr�r�rvr{r��rstrip�
splitlinesrr�r�r�r�r�rrr!rGr�r9rKr,)r�Ztestsr�ZcommentZfullDumpZprintResultsZfailureTestsZ
allResultsZcomments�successrvr�resultr�rzr3rwrwrx�runTests�sNW



$


zParserElement.runTests)F)F)T)T)TT)TT)r�)F)N)T)F)T)Tr�TTF)Or�r�r�r�rNr��staticmethodrPrRr�r�rirmrur�rxr~rr�r�r�r�r�r�r�r�r�r�r�r�rr�r�r�rtr�r�r�r��_MAX_INTr�r�r�r�rrr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r��
__classcell__rwrw)rHrxr$8s�


&




H
"
2G+D
			

)

cs eZdZdZ�fdd�Z�ZS)r,zT
    Abstract C{ParserElement} subclass, for defining atomic matching patterns.
    cstt|�jdd�dS)NF)rg)r�r,r�)r�)rHrwrxr�	szToken.__init__)r�r�r�r�r�r�rwrw)rHrxr,	scs eZdZdZ�fdd�Z�ZS)r
z,
    An empty token, will always match.
    cs$tt|�j�d|_d|_d|_dS)Nr
TF)r�r
r�r�r[r`)r�)rHrwrxr�	szEmpty.__init__)r�r�r�r�r�r�rwrw)rHrxr
	scs*eZdZdZ�fdd�Zddd�Z�ZS)rz(
    A token that will never match.
    cs*tt|�j�d|_d|_d|_d|_dS)NrTFzUnmatchable token)r�rr�r�r[r`ra)r�)rHrwrxr�*	s
zNoMatch.__init__TcCst|||j|��dS)N)rra)r�r-r�rorwrwrxr�1	szNoMatch.parseImpl)T)r�r�r�r�r�r�r�rwrw)rHrxr&	scs*eZdZdZ�fdd�Zddd�Z�ZS)ra�
    Token to exactly match a specified string.
    
    Example::
        Literal('blah').parseString('blah')  # -> ['blah']
        Literal('blah').parseString('blahfooblah')  # -> ['blah']
        Literal('blah').parseString('bla')  # -> Exception: Expected "blah"
    
    For case-insensitive matching, use L{CaselessLiteral}.
    
    For keyword matching (force word break before and after the matched string),
    use L{Keyword} or L{CaselessKeyword}.
    cs�tt|�j�||_t|�|_y|d|_Wn*tk
rVtj	dt
dd�t|_YnXdt
|j�|_d|j|_d|_d|_dS)Nrz2null string passed to Literal; use Empty() insteadrq)r�z"%s"z	Expected F)r�rr��matchr��matchLen�firstMatchCharr�r�r�r�r
rHr�r�rar[r`)r��matchString)rHrwrxr�C	s

zLiteral.__init__TcCsJ|||jkr6|jdks&|j|j|�r6||j|jfSt|||j|��dS)Nrr)r�r��
startswithr�rra)r�r-r�rorwrwrxr�V	szLiteral.parseImpl)T)r�r�r�r�r�r�r�rwrw)rHrxr5	s
csLeZdZdZedZd�fdd�	Zddd	�Z�fd
d�Ze	dd
��Z
�ZS)ra\
    Token to exactly match a specified string as a keyword, that is, it must be
    immediately followed by a non-keyword character.  Compare with C{L{Literal}}:
     - C{Literal("if")} will match the leading C{'if'} in C{'ifAndOnlyIf'}.
     - C{Keyword("if")} will not; it will only match the leading C{'if'} in C{'if x=1'}, or C{'if(y==2)'}
    Accepts two optional constructor arguments in addition to the keyword string:
     - C{identChars} is a string of characters that would be valid identifier characters,
          defaulting to all alphanumerics + "_" and "$"
     - C{caseless} allows case-insensitive matching, default is C{False}.
       
    Example::
        Keyword("start").parseString("start")  # -> ['start']
        Keyword("start").parseString("starting")  # -> Exception

    For case-insensitive matching, use L{CaselessKeyword}.
    z_$NFcs�tt|�j�|dkrtj}||_t|�|_y|d|_Wn$tk
r^t	j
dtdd�YnXd|j|_d|j|_
d|_d|_||_|r�|j�|_|j�}t|�|_dS)Nrz2null string passed to Keyword; use Empty() insteadrq)r�z"%s"z	Expected F)r�rr��DEFAULT_KEYWORD_CHARSr�r�r�r�r�r�r�r�r�rar[r`�caseless�upper�
caselessmatchr��
identChars)r�r�r�r�)rHrwrxr�q	s&

zKeyword.__init__TcCs|jr|||||j�j�|jkr�|t|�|jksL|||jj�|jkr�|dksj||dj�|jkr�||j|jfSnv|||jkr�|jdks�|j|j|�r�|t|�|jks�|||j|jkr�|dks�||d|jkr�||j|jfSt	|||j
|��dS)Nrrr)r�r�r�r�r�r�r�r�r�rra)r�r-r�rorwrwrxr��	s*&zKeyword.parseImplcstt|�j�}tj|_|S)N)r�rr�r�r�)r�r�)rHrwrxr��	szKeyword.copycCs
|t_dS)z,Overrides the default Keyword chars
        N)rr�)rOrwrwrx�setDefaultKeywordChars�	szKeyword.setDefaultKeywordChars)NF)T)r�r�r�r�r3r�r�r�r�r�r�r�rwrw)rHrxr^	s
cs*eZdZdZ�fdd�Zddd�Z�ZS)ral
    Token to match a specified string, ignoring case of letters.
    Note: the matched results will always be in the case of the given
    match string, NOT the case of the input text.

    Example::
        OneOrMore(CaselessLiteral("CMD")).parseString("cmd CMD Cmd10") # -> ['CMD', 'CMD', 'CMD']
        
    (Contrast with example for L{CaselessKeyword}.)
    cs6tt|�j|j��||_d|j|_d|j|_dS)Nz'%s'z	Expected )r�rr�r��returnStringr�ra)r�r�)rHrwrxr��	szCaselessLiteral.__init__TcCs@||||j�j�|jkr,||j|jfSt|||j|��dS)N)r�r�r�r�rra)r�r-r�rorwrwrxr��	szCaselessLiteral.parseImpl)T)r�r�r�r�r�r�r�rwrw)rHrxr�	s
cs,eZdZdZd�fdd�	Zd	dd�Z�ZS)
rz�
    Caseless version of L{Keyword}.

    Example::
        OneOrMore(CaselessKeyword("CMD")).parseString("cmd CMD Cmd10") # -> ['CMD', 'CMD']
        
    (Contrast with example for L{CaselessLiteral}.)
    Ncstt|�j||dd�dS)NT)r�)r�rr�)r�r�r�)rHrwrxr��	szCaselessKeyword.__init__TcCsj||||j�j�|jkrV|t|�|jksF|||jj�|jkrV||j|jfSt|||j|��dS)N)r�r�r�r�r�r�rra)r�r-r�rorwrwrxr��	s*zCaselessKeyword.parseImpl)N)T)r�r�r�r�r�r�r�rwrw)rHrxr�	scs,eZdZdZd�fdd�	Zd	dd�Z�ZS)
rlax
    A variation on L{Literal} which matches "close" matches, that is, 
    strings with at most 'n' mismatching characters. C{CloseMatch} takes parameters:
     - C{match_string} - string to be matched
     - C{maxMismatches} - (C{default=1}) maximum number of mismatches allowed to count as a match
    
    The results from a successful parse will contain the matched text from the input string and the following named results:
     - C{mismatches} - a list of the positions within the match_string where mismatches were found
     - C{original} - the original match_string used to compare against the input string
    
    If C{mismatches} is an empty list, then the match was an exact match.
    
    Example::
        patt = CloseMatch("ATCATCGAATGGA")
        patt.parseString("ATCATCGAAXGGA") # -> (['ATCATCGAAXGGA'], {'mismatches': [[9]], 'original': ['ATCATCGAATGGA']})
        patt.parseString("ATCAXCGAAXGGA") # -> Exception: Expected 'ATCATCGAATGGA' (with up to 1 mismatches) (at char 0), (line:1, col:1)

        # exact match
        patt.parseString("ATCATCGAATGGA") # -> (['ATCATCGAATGGA'], {'mismatches': [[]], 'original': ['ATCATCGAATGGA']})

        # close match allowing up to 2 mismatches
        patt = CloseMatch("ATCATCGAATGGA", maxMismatches=2)
        patt.parseString("ATCAXCGAAXGGA") # -> (['ATCAXCGAAXGGA'], {'mismatches': [[4, 9]], 'original': ['ATCATCGAATGGA']})
    rrcsBtt|�j�||_||_||_d|j|jf|_d|_d|_dS)Nz&Expected %r (with up to %d mismatches)F)	r�rlr�r��match_string�
maxMismatchesrar`r[)r�r�r�)rHrwrxr��	szCloseMatch.__init__TcCs�|}t|�}|t|j�}||kr�|j}d}g}	|j}
x�tt|||�|j��D]0\}}|\}}
||
krP|	j|�t|	�|
krPPqPW|d}t|||�g�}|j|d<|	|d<||fSt|||j|��dS)NrrrZoriginal�
mismatches)	r�r�r�r�r�r�r"rra)r�r-r�ro�startr��maxlocr�Zmatch_stringlocr�r�Zs_m�src�mat�resultsrwrwrxr��	s("

zCloseMatch.parseImpl)rr)T)r�r�r�r�r�r�r�rwrw)rHrxrl�	s	cs8eZdZdZd
�fdd�	Zdd	d
�Z�fdd�Z�ZS)r/a	
    Token for matching words composed of allowed character sets.
    Defined with string containing all allowed initial characters,
    an optional string containing allowed body characters (if omitted,
    defaults to the initial character set), and an optional minimum,
    maximum, and/or exact length.  The default value for C{min} is 1 (a
    minimum value < 1 is not valid); the default values for C{max} and C{exact}
    are 0, meaning no maximum or exact length restriction. An optional
    C{excludeChars} parameter can list characters that might be found in 
    the input C{bodyChars} string; useful to define a word of all printables
    except for one or two characters, for instance.
    
    L{srange} is useful for defining custom character set strings for defining 
    C{Word} expressions, using range notation from regular expression character sets.
    
    A common mistake is to use C{Word} to match a specific literal string, as in 
    C{Word("Address")}. Remember that C{Word} uses the string argument to define
    I{sets} of matchable characters. This expression would match "Add", "AAA",
    "dAred", or any other word made up of the characters 'A', 'd', 'r', 'e', and 's'.
    To match an exact literal string, use L{Literal} or L{Keyword}.

    pyparsing includes helper strings for building Words:
     - L{alphas}
     - L{nums}
     - L{alphanums}
     - L{hexnums}
     - L{alphas8bit} (alphabetic characters in ASCII range 128-255 - accented, tilded, umlauted, etc.)
     - L{punc8bit} (non-alphabetic characters in ASCII range 128-255 - currency, symbols, superscripts, diacriticals, etc.)
     - L{printables} (any non-whitespace character)

    Example::
        # a word composed of digits
        integer = Word(nums) # equivalent to Word("0123456789") or Word(srange("0-9"))
        
        # a word with a leading capital, and zero or more lowercase
        capital_word = Word(alphas.upper(), alphas.lower())

        # hostnames are alphanumeric, with leading alpha, and '-'
        hostname = Word(alphas, alphanums+'-')
        
        # roman numeral (not a strict parser, accepts invalid mix of characters)
        roman = Word("IVXLCDM")
        
        # any string of non-whitespace characters, except for ','
        csv_value = Word(printables, excludeChars=",")
    def __init__(self, initChars, bodyChars=None, min=1, max=0, exact=0,
                 asKeyword=False, excludeChars=None):
        # body not recoverable from this dump: stores the initial/body character
        # sets (minus any excludeChars), validates that min >= 1, records the
        # min/max/exact length limits, and precompiles an equivalent regular
        # expression in self.re when the character sets allow it
        ...

    def parseImpl(self, instring, loc, doActions=True):
        # body not recoverable from this dump: matches via self.re when available,
        # otherwise scans initChars/bodyChars directly and enforces the length
        # limits and the asKeyword word-boundary check
        ...

    def __str__(self):
        # body not recoverable from this dump: renders as W:(initChars,bodyChars)
        ...


class Regex(Token):
    Token for matching strings that match a given regular expression.
    Defined with string specifying the regular expression in a form recognized by the inbuilt Python re module.
    If the given regex contains named groups (defined using C{(?P<name>...)}), these will be preserved as 
    named parse results.

    Example::
        realnum = Regex(r"[+-]?\d+\.\d*")
        date = Regex(r'(?P<year>\d{4})-(?P<month>\d\d?)-(?P<day>\d\d?)')
        # ref: http://stackoverflow.com/questions/267399/how-do-you-match-only-valid-roman-numerals-with-a-regular-expression
        roman = Regex(r"M{0,4}(CM|CD|D?C{0,3})(XC|XL|L?X{0,3})(IX|IV|V?I{0,3})")
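        # illustrative sketch (not from the original example): named groups
        # become named parse results, so the fields can be read back directly
        # date.parseString("1999-12-31")["year"]  # -> '1999'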
    compiledREtype = type(re.compile("[A-Z]"))

    def __init__(self, pattern, flags=0):
        """The parameters C{pattern} and C{flags} are passed to the C{re.compile()} function as-is. See the Python C{re} module for an explanation of the acceptable patterns and flags."""
        # body not recoverable from this dump: accepts either a pattern string
        # (warning if empty) or an already-compiled RE object, and stores the
        # compiled expression in self.re and its text in self.pattern
        ...

    def parseImpl(self, instring, loc, doActions=True):
        # body not recoverable from this dump: applies self.re.match at loc and
        # copies any named groups into named parse results
        ...

    def __str__(self):
        ...


class QuotedString(Token):
    Token for matching strings that are delimited by quoting characters.
    
    Defined with the following parameters:
        - quoteChar - string of one or more characters defining the quote delimiting string
        - escChar - character to escape quotes, typically backslash (default=C{None})
        - escQuote - special quote sequence to escape an embedded quote string (such as SQL's "" to escape an embedded ") (default=C{None})
        - multiline - boolean indicating whether quotes can span multiple lines (default=C{False})
        - unquoteResults - boolean indicating whether the matched text should be unquoted (default=C{True})
        - endQuoteChar - string of one or more characters defining the end of the quote delimited string (default=C{None} => same as quoteChar)
        - convertWhitespaceEscapes - convert escaped whitespace (C{'\t'}, C{'\n'}, etc.) to actual whitespace (default=C{True})

    Example::
        qs = QuotedString('"')
        print(qs.searchString('lsjdf "This is the quote" sldjf'))
        complex_qs = QuotedString('{{', endQuoteChar='}}')
        print(complex_qs.searchString('lsjdf {{This is the "quote"}} sldjf'))
        sql_qs = QuotedString('"', escQuote='""')
        print(sql_qs.searchString('lsjdf "This is the quote with ""embedded"" quotes" sldjf'))
    prints::
        [['This is the quote']]
        [['This is the "quote"']]
        [['This is the quote with "embedded" quotes']]
    def __init__(self, quoteChar, escChar=None, escQuote=None, multiline=False,
                 unquoteResults=True, endQuoteChar=None, convertWhitespaceEscapes=True):
        # body not recoverable from this dump: validates that quoteChar and
        # endQuoteChar are non-empty, builds the matching regular expression
        # (honoring multiline, escChar and escQuote), and remembers whether
        # results should be unquoted
        ...

    def parseImpl(self, instring, loc, doActions=True):
        # body not recoverable from this dump: matches the quoted string with
        # self.re and, when unquoteResults is set, strips the quotes and
        # replaces escaped characters (including whitespace escapes such as
        # '\t' and '\n')
        ...

    def __str__(self):
        # renders as "quoted string, starting with %s ending with %s"
        ...


class CharsNotIn(Token):
    Token for matching words composed of characters I{not} in a given set (will
    include whitespace in matched characters if not listed in the provided exclusion set - see example).
    Defined with string containing all disallowed characters, and an optional
    minimum, maximum, and/or exact length.  The default value for C{min} is 1 (a
    minimum value < 1 is not valid); the default values for C{max} and C{exact}
    are 0, meaning no maximum or exact length restriction.

    Example::
        # define a comma-separated-value as anything that is not a ','
        csv_value = CharsNotIn(',')
        print(delimitedList(csv_value).parseString("dkls,lsdkjf,s12 34,@!#,213"))
    prints::
        ['dkls', 'lsdkjf', 's12 34', '@!#', '213']
    def __init__(self, notChars, min=1, max=0, exact=0):
        # body not recoverable from this dump: records the disallowed character
        # set and the min/max/exact length limits (min < 1 raises ValueError)
        ...

    def parseImpl(self, instring, loc, doActions=True):
        # body not recoverable from this dump: consumes characters until one
        # from notChars (or the length limit) is reached
        ...

    def __str__(self):
        ...


class White(Token):
    Special matching class for matching whitespace.  Normally, whitespace is ignored
    by pyparsing grammars.  This class is included when some whitespace structures
    are significant.  Define with a string containing the whitespace characters to be
    matched; default is C{" \t\r\n"}.  Also takes optional C{min}, C{max}, and C{exact} arguments,
    as defined for the C{L{Word}} class.
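
    Example (an illustrative sketch, assuming the classes defined in this module)::
        # make leading spaces/tabs significant instead of being skipped
        indent = White(" \t")
        labeled = indent("indent") + Word(alphas)("word")
        print(labeled.parseString("    hello"))  # -> ['    ', 'hello']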
    whiteStrs = {
        " " : "<SPC>",
        "\t": "<TAB>",
        "\n": "<LF>",
        "\r": "<CR>",
        "\f": "<FF>",
        }

    def __init__(self, ws=" \t\r\n", min=1, max=0, exact=0):
        # body not recoverable from this dump: records the whitespace characters
        # to match (self.matchWhite) and the min/max/exact length limits
        ...

    def parseImpl(self, instring, loc, doActions=True):
        # body not recoverable from this dump: consumes a run of matchWhite characters
        ...


class _PositionToken(Token):
    def __init__(self):
        super(_PositionToken, self).__init__()
        self.name = self.__class__.__name__
        self.mayReturnEmpty = True
        self.mayIndexError = False
rzb
    Token to advance to a specific column of input text; useful for tabular report scraping.
    cstt|�j�||_dS)N)r�rr�r9)r��colno)rHrwrxr��szGoToColumn.__init__cCs`t||�|jkr\t|�}|jr*|j||�}x0||krZ||j�rZt||�|jkrZ|d7}q,W|S)Nrr)r9r�r]r��isspace)r�r-r�r�rwrwrxr��s&zGoToColumn.preParseTcCsDt||�}||jkr"t||d|��||j|}|||�}||fS)NzText not in expected column)r9r)r�r-r�roZthiscolZnewlocr�rwrwrxr�s

zGoToColumn.parseImpl)T)r�r�r�r�r�r�r�r�rwrw)rHrxr�s	cs*eZdZdZ�fdd�Zddd�Z�ZS)ra�
    Matches if current position is at the beginning of a line within the parse string
    
    Example::
    
        test = '''        AAA this line
        AAA and this line
          AAA but not this one
        B AAA and definitely not this one
        '''

        for t in (LineStart() + 'AAA' + restOfLine).searchString(test):
            print(t)
    
    Prints::
        ['AAA', ' this line']
        ['AAA', ' and this line']    

    cstt|�j�d|_dS)NzExpected start of line)r�rr�ra)r�)rHrwrxr�&szLineStart.__init__TcCs*t||�dkr|gfSt|||j|��dS)Nrr)r9rra)r�r-r�rorwrwrxr�*szLineStart.parseImpl)T)r�r�r�r�r�r�r�rwrw)rHrxrscs*eZdZdZ�fdd�Zddd�Z�ZS)rzU
    Matches if current position is at the end of a line within the parse string
    cs,tt|�j�|jtjjdd��d|_dS)Nrr�zExpected end of line)r�rr�r�r$rNr�ra)r�)rHrwrxr�3szLineEnd.__init__TcCsb|t|�kr6||dkr$|ddfSt|||j|��n(|t|�krN|dgfSt|||j|��dS)Nrrr)r�rra)r�r-r�rorwrwrxr�8szLineEnd.parseImpl)T)r�r�r�r�r�r�r�rwrw)rHrxr/scs*eZdZdZ�fdd�Zddd�Z�ZS)r*zM
    Matches if current position is at the beginning of the parse string
    cstt|�j�d|_dS)NzExpected start of text)r�r*r�ra)r�)rHrwrxr�GszStringStart.__init__TcCs0|dkr(||j|d�kr(t|||j|��|gfS)Nr)r�rra)r�r-r�rorwrwrxr�KszStringStart.parseImpl)T)r�r�r�r�r�r�r�rwrw)rHrxr*Cscs*eZdZdZ�fdd�Zddd�Z�ZS)r)zG
    Matches if current position is at the end of the parse string
    cstt|�j�d|_dS)NzExpected end of text)r�r)r�ra)r�)rHrwrxr�VszStringEnd.__init__TcCs^|t|�krt|||j|��n<|t|�kr6|dgfS|t|�krJ|gfSt|||j|��dS)Nrr)r�rra)r�r-r�rorwrwrxr�ZszStringEnd.parseImpl)T)r�r�r�r�r�r�r�rwrw)rHrxr)Rscs.eZdZdZef�fdd�	Zddd�Z�ZS)r1ap
    Matches if the current position is at the beginning of a Word, and
    is not preceded by any character in a given set of C{wordChars}
    (default=C{printables}). To emulate the C{\b} behavior of regular expressions,
    use C{WordStart(alphanums)}. C{WordStart} will also match at the beginning of
    the string being parsed, or at the beginning of a line.
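
    Example (an illustrative sketch)::
        # match 'able' only where it begins a word
        print((WordStart(alphas) + Literal("able")).searchString("able unable tablet"))
        # -> [['able']]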
    def __init__(self, wordChars=printables):
        super(WordStart, self).__init__()
        self.wordChars = set(wordChars)
        self.errmsg = "Not at the start of a word"

    def parseImpl(self, instring, loc, doActions=True):
        if loc != 0:
            if (instring[loc-1] in self.wordChars or
                    instring[loc] not in self.wordChars):
                raise ParseException(instring, loc, self.errmsg, self)
        return loc, []


class WordEnd(_PositionToken):
    Matches if the current position is at the end of a Word, and
    is not followed by any character in a given set of C{wordChars}
    (default=C{printables}). To emulate the C{\b} behavior of regular expressions,
    use C{WordEnd(alphanums)}. C{WordEnd} will also match at the end of
    the string being parsed, or at the end of a line.
    def __init__(self, wordChars=printables):
        super(WordEnd, self).__init__()
        self.wordChars = set(wordChars)
        self.skipWhitespace = False
        self.errmsg = "Not at the end of a word"

    def parseImpl(self, instring, loc, doActions=True):
        instrlen = len(instring)
        if instrlen > 0 and loc < instrlen:
            if (instring[loc] in self.wordChars or
                    instring[loc-1] not in self.wordChars):
                raise ParseException(instring, loc, self.errmsg, self)
        return loc, []


class ParseExpression(ParserElement):
    Abstract subclass of ParserElement, for combining and post-processing parsed tokens.
    Fcs�tt|�j|�t|t�r"t|�}t|t�r<tj|�g|_	njt|t
j�rzt|�}tdd�|D��rnt
tj|�}t|�|_	n,yt|�|_	Wntk
r�|g|_	YnXd|_dS)Ncss|]}t|t�VqdS)N)rzr�)r�r.rwrwrxr��sz+ParseExpression.__init__.<locals>.<genexpr>F)r�r r�rzr�r�r�r$rQ�exprsr��Iterable�allrvr�re)r�r3rg)rHrwrxr��s

zParseExpression.__init__cCs
|j|S)N)r3)r�r�rwrwrxr��szParseExpression.__getitem__cCs|jj|�d|_|S)N)r3r�rU)r�r�rwrwrxr��szParseExpression.appendcCs4d|_dd�|jD�|_x|jD]}|j�q W|S)z~Extends C{leaveWhitespace} defined in base class, and also invokes C{leaveWhitespace} on
           all contained expressions.FcSsg|]}|j��qSrw)r�)r�r�rwrwrxr��sz3ParseExpression.leaveWhitespace.<locals>.<listcomp>)rXr3r�)r�r�rwrwrxr��s
zParseExpression.leaveWhitespacecszt|t�rF||jkrvtt|�j|�xP|jD]}|j|jd�q,Wn0tt|�j|�x|jD]}|j|jd�q^W|S)Nrrrsrs)rzr+r]r�r r�r3)r�r�r�)rHrwrxr��s

zParseExpression.ignorecsLytt|�j�Stk
r"YnX|jdkrFd|jjt|j�f|_|jS)Nz%s:(%s))	r�r r�rKrUrHr�r�r3)r�)rHrwrxr��s
zParseExpression.__str__cs0tt|�j�x|jD]}|j�qWt|j�dk�r|jd}t||j�r�|jr�|jdkr�|j	r�|jdd�|jdg|_d|_
|j|jO_|j|jO_|jd}t||j�o�|jo�|jdko�|j	�r|jdd�|jdd�|_d|_
|j|jO_|j|jO_dt
|�|_|S)Nrqrrrz	Expected rsrs)r�r r�r3r�rzrHrSrVr^rUr[r`r�ra)r�r�r�)rHrwrxr��s0




zParseExpression.streamlinecstt|�j||�}|S)N)r�r rm)r�r�rlr�)rHrwrxrm�szParseExpression.setResultsNamecCs:|dd�|g}x|jD]}|j|�qW|jg�dS)N)r3r�r�)r�r��tmpr�rwrwrxr��szParseExpression.validatecs$tt|�j�}dd�|jD�|_|S)NcSsg|]}|j��qSrw)r�)r�r�rwrwrxr��sz(ParseExpression.copy.<locals>.<listcomp>)r�r r�r3)r�r�)rHrwrxr��szParseExpression.copy)F)F)r�r�r�r�r�r�r�r�r�r�r�rmr�r�r�rwrw)rHrxr �s	
"csTeZdZdZGdd�de�Zd�fdd�	Zddd�Zd	d
�Zdd�Z	d
d�Z
�ZS)ra

    Requires all given C{ParseExpression}s to be found in the given order.
    Expressions may be separated by whitespace.
    May be constructed using the C{'+'} operator.
    May also be constructed using the C{'-'} operator, which will suppress backtracking.

    Example::
        integer = Word(nums)
        name_expr = OneOrMore(Word(alphas))

        expr = And([integer("id"),name_expr("name"),integer("age")])
        # more easily written as:
        expr = integer("id") + name_expr("name") + integer("age")
    cseZdZ�fdd�Z�ZS)zAnd._ErrorStopcs&ttj|�j||�d|_|j�dS)N�-)r�rr�r�r�r�)r�r�r�)rHrwrxr�
szAnd._ErrorStop.__init__)r�r�r�r�r�rwrw)rHrxr�
sr�TcsRtt|�j||�tdd�|jD��|_|j|jdj�|jdj|_d|_	dS)Ncss|]}|jVqdS)N)r[)r�r�rwrwrxr�
szAnd.__init__.<locals>.<genexpr>rT)
r�rr�r5r3r[r�rYrXre)r�r3rg)rHrwrxr�
s
zAnd.__init__c	Cs|jdj|||dd�\}}d}x�|jdd�D]�}t|tj�rFd}q0|r�y|j|||�\}}Wq�tk
rv�Yq�tk
r�}zd|_tj|��WYdd}~Xq�t	k
r�t|t
|�|j|��Yq�Xn|j|||�\}}|s�|j�r0||7}q0W||fS)NrF)rprrT)
r3rtrzrr�r#r�
__traceback__r�r�r�rar�)	r�r-r�ro�
resultlistZ	errorStopr�Z
exprtokensr�rwrwrxr�
s(z
And.parseImplcCst|t�rtj|�}|j|�S)N)rzr�r$rQr�)r�r�rwrwrxr5
s

zAnd.__iadd__cCs8|dd�|g}x |jD]}|j|�|jsPqWdS)N)r3r�r[)r�r��subRecCheckListr�rwrwrxr�:
s

zAnd.checkRecursioncCs@t|d�r|jS|jdkr:ddjdd�|jD��d|_|jS)Nr��{r�css|]}t|�VqdS)N)r�)r�r�rwrwrxr�F
szAnd.__str__.<locals>.<genexpr>�})r�r�rUr�r3)r�rwrwrxr�A
s


 zAnd.__str__)T)T)r�r�r�r�r
r�r�r�rr�r�r�rwrw)rHrxr�s
csDeZdZdZd�fdd�	Zddd�Zdd	�Zd
d�Zdd
�Z�Z	S)ra�
    Requires that at least one C{ParseExpression} is found.
    If two expressions match, the expression that matches the longest string will be used.
    May be constructed using the C{'^'} operator.

    Example::
        # construct Or using '^' operator
        
        number = Word(nums) ^ Combine(Word(nums) + '.' + Word(nums))
        print(number.searchString("123 3.1416 789"))
    prints::
        [['123'], ['3.1416'], ['789']]
    Fcs:tt|�j||�|jr0tdd�|jD��|_nd|_dS)Ncss|]}|jVqdS)N)r[)r�r�rwrwrxr�\
szOr.__init__.<locals>.<genexpr>T)r�rr�r3rr[)r�r3rg)rHrwrxr�Y
szOr.__init__TcCsTd}d}g}x�|jD]�}y|j||�}Wnvtk
rd}	z d|	_|	j|krT|	}|	j}WYdd}	~	Xqtk
r�t|�|kr�t|t|�|j|�}t|�}YqX|j||f�qW|�r*|j	dd�d�x`|D]X\}
}y|j
|||�Stk
�r$}	z"d|	_|	j|k�r|	}|	j}WYdd}	~	Xq�Xq�W|dk	�rB|j|_|�nt||d|��dS)NrrcSs
|dS)Nrrw)�xrwrwrxryu
szOr.parseImpl.<locals>.<lambda>)r�z no defined alternatives to matchrs)r3r�rr8r�r�r�rar��sortrtr�)r�r-r�ro�	maxExcLoc�maxExceptionr�r�Zloc2r��_rwrwrxr�`
s<

zOr.parseImplcCst|t�rtj|�}|j|�S)N)rzr�r$rQr�)r�r�rwrwrx�__ixor__�
s

zOr.__ixor__cCs@t|d�r|jS|jdkr:ddjdd�|jD��d|_|jS)Nr�r;z ^ css|]}t|�VqdS)N)r�)r�r�rwrwrxr��
szOr.__str__.<locals>.<genexpr>r<)r�r�rUr�r3)r�rwrwrxr��
s


 z
Or.__str__cCs0|dd�|g}x|jD]}|j|�qWdS)N)r3r�)r�r�r:r�rwrwrxr��
szOr.checkRecursion)F)T)
r�r�r�r�r�r�rBr�r�r�rwrw)rHrxrK
s

&	csDeZdZdZd�fdd�	Zddd�Zdd	�Zd
d�Zdd
�Z�Z	S)ra�
    Requires that at least one C{ParseExpression} is found.
    If two expressions match, the first one listed is the one that will match.
    May be constructed using the C{'|'} operator.

    Example::
        # construct MatchFirst using '|' operator
        
        # watch the order of expressions to match
        number = Word(nums) | Combine(Word(nums) + '.' + Word(nums))
        print(number.searchString("123 3.1416 789")) #  Fail! -> [['123'], ['3'], ['1416'], ['789']]

        # put more selective expression first
        number = Combine(Word(nums) + '.' + Word(nums)) | Word(nums)
        print(number.searchString("123 3.1416 789")) #  Better -> [['123'], ['3.1416'], ['789']]
    Fcs:tt|�j||�|jr0tdd�|jD��|_nd|_dS)Ncss|]}|jVqdS)N)r[)r�r�rwrwrxr��
sz&MatchFirst.__init__.<locals>.<genexpr>T)r�rr�r3rr[)r�r3rg)rHrwrxr��
szMatchFirst.__init__Tc	Cs�d}d}x�|jD]�}y|j|||�}|Stk
r\}z|j|krL|}|j}WYdd}~Xqtk
r�t|�|kr�t|t|�|j|�}t|�}YqXqW|dk	r�|j|_|�nt||d|��dS)Nrrz no defined alternatives to matchrs)r3rtrr�r�r�rar�)	r�r-r�ror?r@r�r�r�rwrwrxr��
s$
zMatchFirst.parseImplcCst|t�rtj|�}|j|�S)N)rzr�r$rQr�)r�r�rwrwrx�__ior__�
s

zMatchFirst.__ior__cCs@t|d�r|jS|jdkr:ddjdd�|jD��d|_|jS)Nr�r;z | css|]}t|�VqdS)N)r�)r�r�rwrwrxr��
sz%MatchFirst.__str__.<locals>.<genexpr>r<)r�r�rUr�r3)r�rwrwrxr��
s


 zMatchFirst.__str__cCs0|dd�|g}x|jD]}|j|�qWdS)N)r3r�)r�r�r:r�rwrwrxr��
szMatchFirst.checkRecursion)F)T)
r�r�r�r�r�r�rCr�r�r�rwrw)rHrxr�
s
	cs<eZdZdZd�fdd�	Zddd�Zdd�Zd	d
�Z�ZS)
ram
    Requires all given C{ParseExpression}s to be found, but in any order.
    Expressions may be separated by whitespace.
    May be constructed using the C{'&'} operator.

    Example::
        color = oneOf("RED ORANGE YELLOW GREEN BLUE PURPLE BLACK WHITE BROWN")
        shape_type = oneOf("SQUARE CIRCLE TRIANGLE STAR HEXAGON OCTAGON")
        integer = Word(nums)
        shape_attr = "shape:" + shape_type("shape")
        posn_attr = "posn:" + Group(integer("x") + ',' + integer("y"))("posn")
        color_attr = "color:" + color("color")
        size_attr = "size:" + integer("size")

        # use Each (using operator '&') to accept attributes in any order 
        # (shape and posn are required, color and size are optional)
        shape_spec = shape_attr & posn_attr & Optional(color_attr) & Optional(size_attr)

        shape_spec.runTests('''
            shape: SQUARE color: BLACK posn: 100, 120
            shape: CIRCLE size: 50 color: BLUE posn: 50,80
            color:GREEN size:20 shape:TRIANGLE posn:20,40
            '''
            )
    prints::
        shape: SQUARE color: BLACK posn: 100, 120
        ['shape:', 'SQUARE', 'color:', 'BLACK', 'posn:', ['100', ',', '120']]
        - color: BLACK
        - posn: ['100', ',', '120']
          - x: 100
          - y: 120
        - shape: SQUARE


        shape: CIRCLE size: 50 color: BLUE posn: 50,80
        ['shape:', 'CIRCLE', 'size:', '50', 'color:', 'BLUE', 'posn:', ['50', ',', '80']]
        - color: BLUE
        - posn: ['50', ',', '80']
          - x: 50
          - y: 80
        - shape: CIRCLE
        - size: 50


        color: GREEN size: 20 shape: TRIANGLE posn: 20,40
        ['color:', 'GREEN', 'size:', '20', 'shape:', 'TRIANGLE', 'posn:', ['20', ',', '40']]
        - color: GREEN
        - posn: ['20', ',', '40']
          - x: 20
          - y: 40
        - shape: TRIANGLE
        - size: 20
    Tcs8tt|�j||�tdd�|jD��|_d|_d|_dS)Ncss|]}|jVqdS)N)r[)r�r�rwrwrxr�sz Each.__init__.<locals>.<genexpr>T)r�rr�r5r3r[rX�initExprGroups)r�r3rg)rHrwrxr�sz
Each.__init__c
s�|jr�tdd�|jD��|_dd�|jD�}dd�|jD�}|||_dd�|jD�|_dd�|jD�|_dd�|jD�|_|j|j7_d	|_|}|jdd�}|jdd��g}d
}	x�|	�rp|�|j|j}
g}x~|
D]v}y|j||�}Wn t	k
�r|j
|�Yq�X|j
|jjt|�|��||k�rD|j
|�q�|�kr�j
|�q�Wt|�t|
�kr�d	}	q�W|�r�djdd�|D��}
t	||d
|
��|�fdd�|jD�7}g}x*|D]"}|j|||�\}}|j
|��q�Wt|tg��}||fS)Ncss&|]}t|t�rt|j�|fVqdS)N)rzrr�r.)r�r�rwrwrxr�sz!Each.parseImpl.<locals>.<genexpr>cSsg|]}t|t�r|j�qSrw)rzrr.)r�r�rwrwrxr�sz"Each.parseImpl.<locals>.<listcomp>cSs"g|]}|jrt|t�r|�qSrw)r[rzr)r�r�rwrwrxr�scSsg|]}t|t�r|j�qSrw)rzr2r.)r�r�rwrwrxr� scSsg|]}t|t�r|j�qSrw)rzrr.)r�r�rwrwrxr�!scSs g|]}t|tttf�s|�qSrw)rzrr2r)r�r�rwrwrxr�"sFTz, css|]}t|�VqdS)N)r�)r�r�rwrwrxr�=sz*Missing one or more required elements (%s)cs$g|]}t|t�r|j�kr|�qSrw)rzrr.)r�r�)�tmpOptrwrxr�As)rDr�r3Zopt1mapZ	optionalsZmultioptionalsZ
multirequiredZrequiredr�rr�r�r��remover�r�rt�sumr")r�r-r�roZopt1Zopt2ZtmpLocZtmpReqdZ
matchOrderZkeepMatchingZtmpExprsZfailedr�Zmissingr9r�ZfinalResultsrw)rErxr�sP



zEach.parseImplcCs@t|d�r|jS|jdkr:ddjdd�|jD��d|_|jS)Nr�r;z & css|]}t|�VqdS)N)r�)r�r�rwrwrxr�PszEach.__str__.<locals>.<genexpr>r<)r�r�rUr�r3)r�rwrwrxr�Ks


 zEach.__str__cCs0|dd�|g}x|jD]}|j|�qWdS)N)r3r�)r�r�r:r�rwrwrxr�TszEach.checkRecursion)T)T)	r�r�r�r�r�r�r�r�r�rwrw)rHrxr�
s
5
1	csleZdZdZd�fdd�	Zddd�Zdd	�Z�fd
d�Z�fdd
�Zdd�Z	gfdd�Z
�fdd�Z�ZS)rza
    Abstract subclass of C{ParserElement}, for combining and post-processing parsed tokens.
    Fcs�tt|�j|�t|t�r@ttjt�r2tj|�}ntjt	|��}||_
d|_|dk	r�|j|_|j
|_
|j|j�|j|_|j|_|j|_|jj|j�dS)N)r�rr�rzr��
issubclassr$rQr,rr.rUr`r[r�rYrXrWrer]r�)r�r.rg)rHrwrxr�^s
zParseElementEnhance.__init__TcCs2|jdk	r|jj|||dd�Std||j|��dS)NF)rpr�)r.rtrra)r�r-r�rorwrwrxr�ps
zParseElementEnhance.parseImplcCs*d|_|jj�|_|jdk	r&|jj�|S)NF)rXr.r�r�)r�rwrwrxr�vs


z#ParseElementEnhance.leaveWhitespacecsrt|t�rB||jkrntt|�j|�|jdk	rn|jj|jd�n,tt|�j|�|jdk	rn|jj|jd�|S)Nrrrsrs)rzr+r]r�rr�r.)r�r�)rHrwrxr�}s



zParseElementEnhance.ignorecs&tt|�j�|jdk	r"|jj�|S)N)r�rr�r.)r�)rHrwrxr��s

zParseElementEnhance.streamlinecCsB||krt||g��|dd�|g}|jdk	r>|jj|�dS)N)r&r.r�)r�r�r:rwrwrxr��s

z"ParseElementEnhance.checkRecursioncCs6|dd�|g}|jdk	r(|jj|�|jg�dS)N)r.r�r�)r�r�r6rwrwrxr��s
zParseElementEnhance.validatecsVytt|�j�Stk
r"YnX|jdkrP|jdk	rPd|jjt|j�f|_|jS)Nz%s:(%s))	r�rr�rKrUr.rHr�r�)r�)rHrwrxr��szParseElementEnhance.__str__)F)T)
r�r�r�r�r�r�r�r�r�r�r�r�r�rwrw)rHrxrZs
cs*eZdZdZ�fdd�Zddd�Z�ZS)ra�
    Lookahead matching of the given parse expression.  C{FollowedBy}
    does I{not} advance the parsing position within the input string, it only
    verifies that the specified parse expression matches at the current
    position.  C{FollowedBy} always returns a null token list.

    Example::
        # use FollowedBy to match a label only if it is followed by a ':'
        data_word = Word(alphas)
        label = data_word + FollowedBy(':')
        attr_expr = Group(label + Suppress(':') + OneOrMore(data_word, stopOn=label).setParseAction(' '.join))
        
        OneOrMore(attr_expr).parseString("shape: SQUARE color: BLACK posn: upper left").pprint()
    prints::
        [['shape', 'SQUARE'], ['color', 'BLACK'], ['posn', 'upper left']]
    def __init__(self, expr):
        super(FollowedBy, self).__init__(expr)
        self.mayReturnEmpty = True

    def parseImpl(self, instring, loc, doActions=True):
        self.expr.tryParse(instring, loc)
        return loc, []


class NotAny(ParseElementEnhance):
    Lookahead to disallow matching with the given parse expression.  C{NotAny}
    does I{not} advance the parsing position within the input string, it only
    verifies that the specified parse expression does I{not} match at the current
    position.  Also, C{NotAny} does I{not} skip over leading whitespace. C{NotAny}
    always returns a null token list.  May be constructed using the '~' operator.

    Example::
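        # an illustrative sketch: reject reserved words when matching identifiers
        AND, OR, NOT = map(CaselessKeyword, "AND OR NOT".split())
        keyword = AND | OR | NOT
        ident = ~keyword + Word(alphas, alphanums + "_")
        ident.parseString("counter")   # -> ['counter']
        # ident.parseString("AND") raises ParseException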
        
    cs0tt|�j|�d|_d|_dt|j�|_dS)NFTzFound unwanted token, )r�rr�rXr[r�r.ra)r�r.)rHrwrxr��szNotAny.__init__TcCs&|jj||�rt|||j|��|gfS)N)r.r�rra)r�r-r�rorwrwrxr��szNotAny.parseImplcCs4t|d�r|jS|jdkr.dt|j�d|_|jS)Nr�z~{r<)r�r�rUr�r.)r�rwrwrxr��s


zNotAny.__str__)T)r�r�r�r�r�r�r�r�rwrw)rHrxr�s

cs(eZdZd�fdd�	Zddd�Z�ZS)	�_MultipleMatchNcsFtt|�j|�d|_|}t|t�r.tj|�}|dk	r<|nd|_dS)NT)	r�rIr�rWrzr�r$rQ�	not_ender)r�r.�stopOnZender)rHrwrxr��s

z_MultipleMatch.__init__TcCs�|jj}|j}|jdk	}|r$|jj}|r2|||�||||dd�\}}yZ|j}	xJ|rb|||�|	rr|||�}
n|}
|||
|�\}}|s�|j�rT||7}qTWWnttfk
r�YnX||fS)NF)rp)	r.rtr�rJr�r]r�rr�)r�r-r�roZself_expr_parseZself_skip_ignorablesZcheck_enderZ
try_not_enderr�ZhasIgnoreExprsr�Z	tmptokensrwrwrxr��s,



z_MultipleMatch.parseImpl)N)T)r�r�r�r�r�r�rwrw)rHrxrI�srIc@seZdZdZdd�ZdS)ra�
    Repetition of one or more of the given expression.
    
    Parameters:
     - expr - expression that must match one or more times
     - stopOn - (default=C{None}) - expression for a terminating sentinel
          (only required if the sentinel would ordinarily match the repetition 
          expression)          

    Example::
        data_word = Word(alphas)
        label = data_word + FollowedBy(':')
        attr_expr = Group(label + Suppress(':') + OneOrMore(data_word).setParseAction(' '.join))

        text = "shape: SQUARE posn: upper left color: BLACK"
        OneOrMore(attr_expr).parseString(text).pprint()  # Fail! read 'color' as data instead of next label -> [['shape', 'SQUARE color']]

        # use stopOn attribute for OneOrMore to avoid reading label string as part of the data
        attr_expr = Group(label + Suppress(':') + OneOrMore(data_word, stopOn=label).setParseAction(' '.join))
        OneOrMore(attr_expr).parseString(text).pprint() # Better -> [['shape', 'SQUARE'], ['posn', 'upper left'], ['color', 'BLACK']]
        
        # could also be written as
        (attr_expr * (1,)).parseString(text).pprint()
    cCs4t|d�r|jS|jdkr.dt|j�d|_|jS)Nr�r;z}...)r�r�rUr�r.)r�rwrwrxr�!s


zOneOrMore.__str__N)r�r�r�r�r�rwrwrwrxrscs8eZdZdZd
�fdd�	Zd�fdd�	Zdd	�Z�ZS)r2aw
    Optional repetition of zero or more of the given expression.
    
    Parameters:
     - expr - expression that must match zero or more times
     - stopOn - (default=C{None}) - expression for a terminating sentinel
          (only required if the sentinel would ordinarily match the repetition 
          expression)          

    Example: similar to L{OneOrMore}
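
        # an illustrative sketch of stopOn ending the repetition
        words = ZeroOrMore(Word(alphas), stopOn=Literal("end"))
        print((words + "end").parseString("a b c end"))  # -> ['a', 'b', 'c', 'end']
        print((words + "end").parseString("end"))        # -> ['end']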
    Ncstt|�j||d�d|_dS)N)rKT)r�r2r�r[)r�r.rK)rHrwrxr�6szZeroOrMore.__init__Tcs6ytt|�j|||�Sttfk
r0|gfSXdS)N)r�r2r�rr�)r�r-r�ro)rHrwrxr�:szZeroOrMore.parseImplcCs4t|d�r|jS|jdkr.dt|j�d|_|jS)Nr�rz]...)r�r�rUr�r.)r�rwrwrxr�@s


zZeroOrMore.__str__)N)T)r�r�r�r�r�r�r�r�rwrw)rHrxr2*sc@s eZdZdd�ZeZdd�ZdS)�
_NullTokencCsdS)NFrw)r�rwrwrxr�Jsz_NullToken.__bool__cCsdS)Nr�rw)r�rwrwrxr�Msz_NullToken.__str__N)r�r�r�r�r'r�rwrwrwrxrLIsrLcs6eZdZdZef�fdd�	Zd	dd�Zdd�Z�ZS)
raa
    Optional matching of the given expression.

    Parameters:
     - expr - expression that must match zero or more times
     - default (optional) - value to be returned if the optional expression is not found.

    Example::
        # US postal code can be a 5-digit zip, plus optional 4-digit qualifier
        zip = Combine(Word(nums, exact=5) + Optional('-' + Word(nums, exact=4)))
        zip.runTests('''
            # traditional ZIP code
            12345
            
            # ZIP+4 form
            12101-0001
            
            # invalid ZIP
            98765-
            ''')
    prints::
        # traditional ZIP code
        12345
        ['12345']

        # ZIP+4 form
        12101-0001
        ['12101-0001']

        # invalid ZIP
        98765-
             ^
        FAIL: Expected end of text (at char 5), (line:1, col:6)
    cs.tt|�j|dd�|jj|_||_d|_dS)NF)rgT)r�rr�r.rWr�r[)r�r.r�)rHrwrxr�ts
zOptional.__init__TcCszy|jj|||dd�\}}WnTttfk
rp|jtk	rh|jjr^t|jg�}|j||jj<ql|jg}ng}YnX||fS)NF)rp)r.rtrr�r��_optionalNotMatchedrVr")r�r-r�ror�rwrwrxr�zs


zOptional.parseImplcCs4t|d�r|jS|jdkr.dt|j�d|_|jS)Nr�rr	)r�r�rUr�r.)r�rwrwrxr��s


zOptional.__str__)T)	r�r�r�r�rMr�r�r�r�rwrw)rHrxrQs"
cs,eZdZdZd	�fdd�	Zd
dd�Z�ZS)r(a�	
    Token for skipping over all undefined text until the matched expression is found.

    Parameters:
     - expr - target expression marking the end of the data to be skipped
     - include - (default=C{False}) if True, the target expression is also parsed 
          (the skipped text and target expression are returned as a 2-element list).
     - ignore - (default=C{None}) used to define grammars (typically quoted strings and 
          comments) that might contain false matches to the target expression
     - failOn - (default=C{None}) define expressions that are not allowed to be 
          included in the skipped text; if found before the target expression is found, 
          the SkipTo is not a match

    Example::
        report = '''
            Outstanding Issues Report - 1 Jan 2000

               # | Severity | Description                               |  Days Open
            -----+----------+-------------------------------------------+-----------
             101 | Critical | Intermittent system crash                 |          6
              94 | Cosmetic | Spelling error on Login ('log|n')         |         14
              79 | Minor    | System slow when running too many reports |         47
            '''
        integer = Word(nums)
        SEP = Suppress('|')
        # use SkipTo to simply match everything up until the next SEP
        # - ignore quoted strings, so that a '|' character inside a quoted string does not match
        # - parse action will call token.strip() for each matched token, i.e., the description body
        string_data = SkipTo(SEP, ignore=quotedString)
        string_data.setParseAction(tokenMap(str.strip))
        ticket_expr = (integer("issue_num") + SEP 
                      + string_data("sev") + SEP 
                      + string_data("desc") + SEP 
                      + integer("days_open"))
        
        for tkt in ticket_expr.searchString(report):
            print(tkt.dump())
    prints::
        ['101', 'Critical', 'Intermittent system crash', '6']
        - days_open: 6
        - desc: Intermittent system crash
        - issue_num: 101
        - sev: Critical
        ['94', 'Cosmetic', "Spelling error on Login ('log|n')", '14']
        - days_open: 14
        - desc: Spelling error on Login ('log|n')
        - issue_num: 94
        - sev: Cosmetic
        ['79', 'Minor', 'System slow when running too many reports', '47']
        - days_open: 47
        - desc: System slow when running too many reports
        - issue_num: 79
        - sev: Minor
    FNcs`tt|�j|�||_d|_d|_||_d|_t|t	�rFt
j|�|_n||_dt
|j�|_dS)NTFzNo match found for )r�r(r��
ignoreExprr[r`�includeMatchr�rzr�r$rQ�failOnr�r.ra)r�r��includer�rP)rHrwrxr��s
zSkipTo.__init__TcCs,|}t|�}|j}|jj}|jdk	r,|jjnd}|jdk	rB|jjnd}	|}
x�|
|kr�|dk	rh|||
�rhP|	dk	r�x*y|	||
�}
Wqrtk
r�PYqrXqrWy|||
ddd�Wn tt	fk
r�|
d7}
YqLXPqLWt|||j
|��|
}|||�}t|�}|j�r$||||dd�\}}
||
7}||fS)NF)rorprr)rp)
r�r.rtrPr�rNr�rrr�rar"rO)r�r-r�ror0r�r.Z
expr_parseZself_failOn_canParseNextZself_ignoreExpr_tryParseZtmplocZskiptextZ
skipresultr�rwrwrxr��s<

zSkipTo.parseImpl)FNN)T)r�r�r�r�r�r�r�rwrw)rHrxr(�s6
csbeZdZdZd�fdd�	Zdd�Zdd�Zd	d
�Zdd�Zgfd
d�Z	dd�Z
�fdd�Z�ZS)raK
    Forward declaration of an expression to be defined later -
    used for recursive grammars, such as algebraic infix notation.
    When the expression is known, it is assigned to the C{Forward} variable using the '<<' operator.

    Note: take care when assigning to C{Forward} not to overlook precedence of operators.
    Specifically, '|' has a lower precedence than '<<', so that::
        fwdExpr << a | b | c
    will actually be evaluated as::
        (fwdExpr << a) | b | c
    thereby leaving b and c out as parseable alternatives.  It is recommended that you
    explicitly group the values inserted into the C{Forward}::
        fwdExpr << (a | b | c)
    Converting to use the '<<=' operator instead will avoid this problem.

    See L{ParseResults.pprint} for an example of a recursive parser created using
    C{Forward}.
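
    A minimal recursive-grammar sketch (illustrative, using the helpers defined in this module)::
        LPAR, RPAR = map(Suppress, "()")
        nested = Forward()
        nested <<= Word(nums) | Group(LPAR + ZeroOrMore(nested) + RPAR)
        print(nested.parseString("(1 (2 3) 4)"))  # -> [['1', ['2', '3'], '4']]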
    Ncstt|�j|dd�dS)NF)rg)r�rr�)r�r�)rHrwrxr�szForward.__init__cCsjt|t�rtj|�}||_d|_|jj|_|jj|_|j|jj	�|jj
|_
|jj|_|jj
|jj�|S)N)rzr�r$rQr.rUr`r[r�rYrXrWr]r�)r�r�rwrwrx�
__lshift__s





zForward.__lshift__cCs||>S)Nrw)r�r�rwrwrx�__ilshift__'szForward.__ilshift__cCs
d|_|S)NF)rX)r�rwrwrxr�*szForward.leaveWhitespacecCs$|js d|_|jdk	r |jj�|S)NT)r_r.r�)r�rwrwrxr�.s


zForward.streamlinecCs>||kr0|dd�|g}|jdk	r0|jj|�|jg�dS)N)r.r�r�)r�r�r6rwrwrxr�5s

zForward.validatecCs>t|d�r|jS|jjdSd}Wd|j|_X|jjd|S)Nr�z: ...�Nonez: )r�r�rHr�Z_revertClass�_ForwardNoRecurser.r�)r�Z	retStringrwrwrxr�<s

zForward.__str__cs.|jdk	rtt|�j�St�}||K}|SdS)N)r.r�rr�)r�r�)rHrwrxr�Ms

zForward.copy)N)
r�r�r�r�r�rRrSr�r�r�r�r�r�rwrw)rHrxrs
c@seZdZdd�ZdS)rUcCsdS)Nz...rw)r�rwrwrxr�Vsz_ForwardNoRecurse.__str__N)r�r�r�r�rwrwrwrxrUUsrUcs"eZdZdZd�fdd�	Z�ZS)r-zQ
    Abstract subclass of C{ParseExpression}, for converting parsed results.
    Fcstt|�j|�d|_dS)NF)r�r-r�rW)r�r.rg)rHrwrxr�]szTokenConverter.__init__)F)r�r�r�r�r�r�rwrw)rHrxr-Yscs6eZdZdZd
�fdd�	Z�fdd�Zdd	�Z�ZS)r
a�
    Converter to concatenate all matching tokens to a single string.
    By default, the matching patterns must also be contiguous in the input string;
    this can be disabled by specifying C{'adjacent=False'} in the constructor.

    Example::
        real = Word(nums) + '.' + Word(nums)
        print(real.parseString('3.1416')) # -> ['3', '.', '1416']
        # will also erroneously match the following
        print(real.parseString('3. 1416')) # -> ['3', '.', '1416']

        real = Combine(Word(nums) + '.' + Word(nums))
        print(real.parseString('3.1416')) # -> ['3.1416']
        # no match when there are internal spaces
        print(real.parseString('3. 1416')) # -> Exception: Expected W:(0123...)
    r�Tcs8tt|�j|�|r|j�||_d|_||_d|_dS)NT)r�r
r�r��adjacentrX�
joinStringre)r�r.rWrV)rHrwrxr�rszCombine.__init__cs(|jrtj||�ntt|�j|�|S)N)rVr$r�r�r
)r�r�)rHrwrxr�|szCombine.ignorecCsP|j�}|dd�=|tdj|j|j��g|jd�7}|jrH|j�rH|gS|SdS)Nr�)r�)r�r"r�r
rWrbrVr�)r�r-r�r�ZretToksrwrwrxr��s
"zCombine.postParse)r�T)r�r�r�r�r�r�r�r�rwrw)rHrxr
as
cs(eZdZdZ�fdd�Zdd�Z�ZS)ra�
    Converter to return the matched tokens as a list - useful for returning tokens of C{L{ZeroOrMore}} and C{L{OneOrMore}} expressions.

    Example::
        ident = Word(alphas)
        num = Word(nums)
        term = ident | num
        func = ident + Optional(delimitedList(term))
        print(func.parseString("fn a,b,100"))  # -> ['fn', 'a', 'b', '100']

        func = ident + Group(Optional(delimitedList(term)))
        print(func.parseString("fn a,b,100"))  # -> ['fn', ['a', 'b', '100']]
    cstt|�j|�d|_dS)NT)r�rr�rW)r�r.)rHrwrxr��szGroup.__init__cCs|gS)Nrw)r�r-r�r�rwrwrxr��szGroup.postParse)r�r�r�r�r�r�r�rwrw)rHrxr�s
cs(eZdZdZ�fdd�Zdd�Z�ZS)raW
    Converter to return a repetitive expression as a list, but also as a dictionary.
    Each element can also be referenced using the first token in the expression as its key.
    Useful for tabular report scraping when the first column can be used as an item key.

    Example::
        data_word = Word(alphas)
        label = data_word + FollowedBy(':')
        attr_expr = Group(label + Suppress(':') + OneOrMore(data_word).setParseAction(' '.join))

        text = "shape: SQUARE posn: upper left color: light blue texture: burlap"
        attr_expr = (label + Suppress(':') + OneOrMore(data_word, stopOn=label).setParseAction(' '.join))
        
        # print attributes as plain groups
        print(OneOrMore(attr_expr).parseString(text).dump())
        
        # instead of OneOrMore(expr), parse using Dict(OneOrMore(Group(expr))) - Dict will auto-assign names
        result = Dict(OneOrMore(Group(attr_expr))).parseString(text)
        print(result.dump())
        
        # access named fields as dict entries, or output as dict
        print(result['shape'])        
        print(result.asDict())
    prints::
        ['shape', 'SQUARE', 'posn', 'upper left', 'color', 'light blue', 'texture', 'burlap']

        [['shape', 'SQUARE'], ['posn', 'upper left'], ['color', 'light blue'], ['texture', 'burlap']]
        - color: light blue
        - posn: upper left
        - shape: SQUARE
        - texture: burlap
        SQUARE
        {'color': 'light blue', 'posn': 'upper left', 'texture': 'burlap', 'shape': 'SQUARE'}
    See more examples at L{ParseResults} of accessing fields by results name.
    cstt|�j|�d|_dS)NT)r�rr�rW)r�r.)rHrwrxr��sz
Dict.__init__cCs�x�t|�D]�\}}t|�dkr q
|d}t|t�rBt|d�j�}t|�dkr^td|�||<q
t|�dkr�t|dt�r�t|d|�||<q
|j�}|d=t|�dks�t|t�r�|j	�r�t||�||<q
t|d|�||<q
W|j
r�|gS|SdS)Nrrrr�rq)r�r�rzrur�r�r�r"r�r�rV)r�r-r�r�r��tokZikeyZ	dictvaluerwrwrxr��s$
zDict.postParse)r�r�r�r�r�r�r�rwrw)rHrxr�s#c@s eZdZdZdd�Zdd�ZdS)r+aV
    Converter for ignoring the results of a parsed expression.

    Example::
        source = "a, b, c,d"
        wd = Word(alphas)
        wd_list1 = wd + ZeroOrMore(',' + wd)
        print(wd_list1.parseString(source))

        # often, delimiters that are useful during parsing are just in the
        # way afterward - use Suppress to keep them out of the parsed output
        wd_list2 = wd + ZeroOrMore(Suppress(',') + wd)
        print(wd_list2.parseString(source))
    prints::
        ['a', ',', 'b', ',', 'c', ',', 'd']
        ['a', 'b', 'c', 'd']
    (See also L{delimitedList}.)
    cCsgS)Nrw)r�r-r�r�rwrwrxr��szSuppress.postParsecCs|S)Nrw)r�rwrwrxr��szSuppress.suppressN)r�r�r�r�r�r�rwrwrwrxr+�sc@s(eZdZdZdd�Zdd�Zdd�ZdS)	rzI
    Wrapper for parse actions, to ensure they are only called once.
    cCst|�|_d|_dS)NF)rM�callable�called)r�Z
methodCallrwrwrxr�s
zOnlyOnce.__init__cCs.|js|j|||�}d|_|St||d��dS)NTr�)rZrYr)r�r�r5rvr�rwrwrxr�s
zOnlyOnce.__call__cCs
d|_dS)NF)rZ)r�rwrwrx�reset
szOnlyOnce.resetN)r�r�r�r�r�r�r[rwrwrwrxr�scs:t����fdd�}y�j|_Wntk
r4YnX|S)as
    Decorator for debugging parse actions. 
    
    When the parse action is called, this decorator will print C{">> entering I{method-name}(line:I{current_source_line}, I{parse_location}, I{matched_tokens})".}
    When the parse action completes, the decorator will print C{"<<"} followed by the returned value, or any exception that the parse action raised.

    Example::
        wd = Word(alphas)

        @traceParseAction
        def remove_duplicate_chars(tokens):
            return ''.join(sorted(set(''.join(tokens)))

        wds = OneOrMore(wd).setParseAction(remove_duplicate_chars)
        print(wds.parseString("slkdjs sld sldd sdlf sdljf"))
    prints::
        >>entering remove_duplicate_chars(line: 'slkdjs sld sldd sdlf sdljf', 0, (['slkdjs', 'sld', 'sldd', 'sdlf', 'sdljf'], {}))
        <<leaving remove_duplicate_chars (ret: 'dfjkls')
        ['dfjkls']
    cs��j}|dd�\}}}t|�dkr8|djjd|}tjjd|t||�||f�y�|�}Wn8tk
r�}ztjjd||f��WYdd}~XnXtjjd||f�|S)Nror�.z">>entering %s(line: '%s', %d, %r)
z<<leaving %s (exception: %s)
z<<leaving %s (ret: %r)
r9)r�r�rHr~�stderr�writerGrK)ZpaArgsZthisFuncr�r5rvr�r3)r�rwrx�z#sztraceParseAction.<locals>.z)rMr�r�)r�r_rw)r�rxrb
s
�,FcCs`t|�dt|�dt|�d}|rBt|t||��j|�S|tt|�|�j|�SdS)a�
    Helper to define a delimited list of expressions - the delimiter defaults to ','.
    By default, the list elements and delimiters can have intervening whitespace, and
    comments, but this can be overridden by passing C{combine=True} in the constructor.
    If C{combine} is set to C{True}, the matching tokens are returned as a single token
    string, with the delimiters included; otherwise, the matching tokens are returned
    as a list of tokens, with the delimiters suppressed.

    Example::
        delimitedList(Word(alphas)).parseString("aa,bb,cc") # -> ['aa', 'bb', 'cc']
        delimitedList(Word(hexnums), delim=':', combine=True).parseString("AA:BB:CC:DD:EE") # -> ['AA:BB:CC:DD:EE']
    z [r�z]...N)r�r
r2rir+)r.Zdelim�combineZdlNamerwrwrxr@9s
$csjt����fdd�}|dkr0tt�jdd��}n|j�}|jd�|j|dd�|�jd	t��d
�S)a:
    Helper to define a counted list of expressions.
    This helper defines a pattern of the form::
        integer expr expr expr...
    where the leading integer tells how many expr expressions follow.
    The matched tokens returns the array of expr tokens as a list - the leading count token is suppressed.
    
    If C{intExpr} is specified, it should be a pyparsing expression that produces an integer value.

    Example::
        countedArray(Word(alphas)).parseString('2 ab cd ef')  # -> ['ab', 'cd']

        # in this parser, the leading integer value is given in binary,
        # '10' indicating that 2 values are in the array
        binaryConstant = Word('01').setParseAction(lambda t: int(t[0], 2))
        countedArray(Word(alphas), intExpr=binaryConstant).parseString('10 ab cd ef')  # -> ['ab', 'cd']
    cs.|d}�|r tt�g|��p&tt�>gS)Nr)rrrC)r�r5rvr�)�	arrayExprr.rwrx�countFieldParseAction_s"z+countedArray.<locals>.countFieldParseActionNcSst|d�S)Nr)ru)rvrwrwrxrydszcountedArray.<locals>.<lambda>ZarrayLenT)rfz(len) z...)rr/rRr�r�rirxr�)r.ZintExprrcrw)rbr.rxr<Ls
cCs:g}x0|D](}t|t�r(|jt|��q
|j|�q
W|S)N)rzr�r�r�r�)�Lr�r�rwrwrxr�ks

r�cs6t���fdd�}|j|dd��jdt|���S)a*
    Helper to define an expression that is indirectly defined from
    the tokens matched in a previous expression, that is, it looks
    for a 'repeat' of a previous expression.  For example::
        first = Word(nums)
        second = matchPreviousLiteral(first)
        matchExpr = first + ":" + second
    will match C{"1:1"}, but not C{"1:2"}.  Because this matches a
    previous literal, will also match the leading C{"1:1"} in C{"1:10"}.
    If this is not desired, use C{matchPreviousExpr}.
    Do I{not} use with packrat parsing enabled.
    csP|rBt|�dkr�|d>qLt|j��}�tdd�|D��>n
�t�>dS)Nrrrcss|]}t|�VqdS)N)r)r��ttrwrwrxr��szDmatchPreviousLiteral.<locals>.copyTokenToRepeater.<locals>.<genexpr>)r�r�r�rr
)r�r5rvZtflat)�reprwrx�copyTokenToRepeater�sz1matchPreviousLiteral.<locals>.copyTokenToRepeaterT)rfz(prev) )rrxrir�)r.rgrw)rfrxrOts


csFt��|j�}�|K��fdd�}|j|dd��jdt|���S)aS
    Helper to define an expression that is indirectly defined from
    the tokens matched in a previous expression, that is, it looks
    for a 'repeat' of a previous expression.  For example::
        first = Word(nums)
        second = matchPreviousExpr(first)
        matchExpr = first + ":" + second
    will match C{"1:1"}, but not C{"1:2"}.  Because this matches by
    expressions, will I{not} match the leading C{"1:1"} in C{"1:10"};
    the expressions are evaluated first, and then compared, so
    C{"1"} is compared with C{"10"}.
    Do I{not} use with packrat parsing enabled.
    cs*t|j����fdd�}�j|dd�dS)Ncs$t|j��}|�kr tddd��dS)Nr�r)r�r�r)r�r5rvZtheseTokens)�matchTokensrwrx�mustMatchTheseTokens�szLmatchPreviousExpr.<locals>.copyTokenToRepeater.<locals>.mustMatchTheseTokensT)rf)r�r�r�)r�r5rvri)rf)rhrxrg�sz.matchPreviousExpr.<locals>.copyTokenToRepeaterT)rfz(prev) )rr�rxrir�)r.Ze2rgrw)rfrxrN�scCs>xdD]}|j|t|�}qW|jdd�}|jdd�}t|�S)Nz\^-]rz\nr'z\t)r��_bslashr�)r�r�rwrwrxr�s

rTc
s�|rdd�}dd�}t�ndd�}dd�}t�g}t|t�rF|j�}n&t|tj�r\t|�}ntj	dt
dd�|svt�Sd	}x�|t|�d
k�r||}xnt
||d
d��D]N\}}	||	|�r�|||d
=Pq�|||	�r�|||d
=|j||	�|	}Pq�W|d
7}q|W|�r�|�r�yht|�tdj|��k�rZtd
djdd�|D���jdj|��Stdjdd�|D���jdj|��SWn&tk
�r�tj	dt
dd�YnXt�fdd�|D��jdj|��S)a�
    Helper to quickly define a set of alternative Literals, and makes sure to do
    longest-first testing when there is a conflict, regardless of the input order,
    but returns a C{L{MatchFirst}} for best performance.

    Parameters:
     - strs - a string of space-delimited literals, or a collection of string literals
     - caseless - (default=C{False}) - treat all literals as caseless
     - useRegex - (default=C{True}) - as an optimization, will generate a Regex
          object; otherwise, will generate a C{MatchFirst} object (if C{caseless=True}, or
          if creating a C{Regex} raises an exception)

    Example::
        comp_oper = oneOf("< = > <= >= !=")
        var = Word(alphas)
        number = Word(nums)
        term = var | number
        comparison_expr = term + comp_oper + term
        print(comparison_expr.searchString("B = 12  AA=23 B<=AA AA>12"))
    prints::
        [['B', '=', '12'], ['AA', '=', '23'], ['B', '<=', 'AA'], ['AA', '>', '12']]
    cSs|j�|j�kS)N)r�)r�brwrwrxry�szoneOf.<locals>.<lambda>cSs|j�j|j��S)N)r�r�)rrkrwrwrxry�scSs||kS)Nrw)rrkrwrwrxry�scSs
|j|�S)N)r�)rrkrwrwrxry�sz6Invalid argument to oneOf, expected string or iterablerq)r�rrrNr�z[%s]css|]}t|�VqdS)N)r)r��symrwrwrxr��szoneOf.<locals>.<genexpr>z | �|css|]}tj|�VqdS)N)rdr)r�rlrwrwrxr��sz7Exception creating Regex for oneOf, building MatchFirstc3s|]}�|�VqdS)Nrw)r�rl)�parseElementClassrwrxr��s)rrrzr�r�r�r4r�r�r�r�rr�r�r�r�r'rirKr)
Zstrsr�ZuseRegexZisequalZmasksZsymbolsr�Zcurr�r�rw)rnrxrS�sL





((cCsttt||���S)a�
    Helper to easily and clearly define a dictionary by specifying the respective patterns
    for the key and value.  Takes care of defining the C{L{Dict}}, C{L{ZeroOrMore}}, and C{L{Group}} tokens
    in the proper order.  The key pattern can include delimiting markers or punctuation,
    as long as they are suppressed, thereby leaving the significant key text.  The value
    pattern can include named results, so that the C{Dict} results can include named token
    fields.

    Example::
        text = "shape: SQUARE posn: upper left color: light blue texture: burlap"
        attr_expr = (label + Suppress(':') + OneOrMore(data_word, stopOn=label).setParseAction(' '.join))
        print(OneOrMore(attr_expr).parseString(text).dump())
        
        attr_label = label
        attr_value = Suppress(':') + OneOrMore(data_word, stopOn=label).setParseAction(' '.join)

        # similar to Dict, but simpler call format
        result = dictOf(attr_label, attr_value).parseString(text)
        print(result.dump())
        print(result['shape'])
        print(result.shape)  # object attribute access works too
        print(result.asDict())
    prints::
        [['shape', 'SQUARE'], ['posn', 'upper left'], ['color', 'light blue'], ['texture', 'burlap']]
        - color: light blue
        - posn: upper left
        - shape: SQUARE
        - texture: burlap
        SQUARE
        SQUARE
        {'color': 'light blue', 'shape': 'SQUARE', 'posn': 'upper left', 'texture': 'burlap'}
    )rr2r)r�r�rwrwrxrA�s!cCs^t�jdd��}|j�}d|_|d�||d�}|r@dd�}ndd�}|j|�|j|_|S)	a�
    Helper to return the original, untokenized text for a given expression.  Useful to
    restore the parsed fields of an HTML start tag into the raw tag text itself, or to
    revert separate tokens with intervening whitespace back to the original matching
    input text. By default, returns a string containing the original parsed text.
       
    If the optional C{asString} argument is passed as C{False}, then the return value is a 
    C{L{ParseResults}} containing any results names that were originally matched, and a 
    single token containing the original matched text from the input string.  So if 
    the expression passed to C{L{originalTextFor}} contains expressions with defined
    results names, you must set C{asString} to C{False} if you want to preserve those
    results name values.

    Example::
        src = "this is test <b> bold <i>text</i> </b> normal text "
        for tag in ("b","i"):
            opener,closer = makeHTMLTags(tag)
            patt = originalTextFor(opener + SkipTo(closer) + closer)
            print(patt.searchString(src)[0])
    prints::
        ['<b> bold <i>text</i> </b>']
        ['<i>text</i>']
    cSs|S)Nrw)r�r�rvrwrwrxry8sz!originalTextFor.<locals>.<lambda>F�_original_start�
_original_endcSs||j|j�S)N)rorp)r�r5rvrwrwrxry=scSs&||jd�|jd��g|dd�<dS)Nrorp)r�)r�r5rvrwrwrx�extractText?sz$originalTextFor.<locals>.extractText)r
r�r�rer])r.ZasStringZ	locMarkerZendlocMarker�	matchExprrqrwrwrxrg s

cCst|�jdd��S)zp
    Helper to undo pyparsing's default grouping of And expressions, even
    if all but one are non-empty.
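
    Example (an illustrative sketch)::
        grouped = Group(Word(alphas) + Word(nums))
        print(grouped.parseString("abc 123"))           # -> [['abc', '123']]
        print(ungroup(grouped).parseString("abc 123"))  # -> ['abc', '123']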
    cSs|dS)Nrrw)rvrwrwrxryJszungroup.<locals>.<lambda>)r-r�)r.rwrwrxrhEscCs4t�jdd��}t|d�|d�|j�j�d��S)a�
    Helper to decorate a returned token with its starting and ending locations in the input string.
    This helper adds the following results names:
     - locn_start = location where matched expression begins
     - locn_end = location where matched expression ends
     - value = the actual parsed results

    Be careful if the input text contains C{<TAB>} characters, you may want to call
    C{L{ParserElement.parseWithTabs}}

    Example::
        wd = Word(alphas)
        for match in locatedExpr(wd).searchString("ljsdf123lksdjjf123lkkjj1222"):
            print(match)
    prints::
        [[0, 'ljsdf', 5]]
        [[8, 'lksdjjf', 15]]
        [[18, 'lkkjj', 23]]
    cSs|S)Nrw)r�r5rvrwrwrxry`szlocatedExpr.<locals>.<lambda>Z
locn_startr�Zlocn_end)r
r�rr�r�)r.ZlocatorrwrwrxrjLsz\[]-*.$+^?()~ )rcCs|ddS)Nrrrrw)r�r5rvrwrwrxryksryz\\0?[xX][0-9a-fA-F]+cCstt|djd�d��S)Nrz\0x�)�unichrru�lstrip)r�r5rvrwrwrxrylsz	\\0[0-7]+cCstt|ddd�d��S)Nrrr�)rtru)r�r5rvrwrwrxrymsz\])r�rz\wr7rr�Znegate�bodyr	csBdd��y dj�fdd�tj|�jD��Stk
r<dSXdS)a�
    Helper to easily define string ranges for use in Word construction.  Borrows
    syntax from regexp '[]' string range definitions::
        srange("[0-9]")   -> "0123456789"
        srange("[a-z]")   -> "abcdefghijklmnopqrstuvwxyz"
        srange("[a-z$_]") -> "abcdefghijklmnopqrstuvwxyz$_"
    The input string must be enclosed in []'s, and the returned string is the expanded
    character set joined into a single string.
    The values enclosed in the []'s may be:
     - a single character
     - an escaped character with a leading backslash (such as C{\-} or C{\]})
     - an escaped hex character with a leading C{'\x'} (C{\x21}, which is a C{'!'} character) 
         (C{\0x##} is also supported for backwards compatibility) 
     - an escaped octal character with a leading C{'\0'} (C{\041}, which is a C{'!'} character)
     - a range of any of the above, separated by a dash (C{'a-z'}, etc.)
     - any combination of the above (C{'aeiouy'}, C{'a-zA-Z0-9_$'}, etc.)
    cSs<t|t�s|Sdjdd�tt|d�t|d�d�D��S)Nr�css|]}t|�VqdS)N)rt)r�r�rwrwrxr��sz+srange.<locals>.<lambda>.<locals>.<genexpr>rrr)rzr"r�r��ord)�prwrwrxry�szsrange.<locals>.<lambda>r�c3s|]}�|�VqdS)Nrw)r��part)�	_expandedrwrxr��szsrange.<locals>.<genexpr>N)r��_reBracketExprr�rwrK)r�rw)r{rxr_rs
 cs�fdd�}|S)zt
    Helper method for defining parse actions that require matching at a specific
    column in the input text.
    cs"t||��krt||d���dS)Nzmatched token not at column %d)r9r)r)Zlocnr1)r�rwrx�	verifyCol�sz!matchOnlyAtCol.<locals>.verifyColrw)r�r}rw)r�rxrM�scs�fdd�S)a�
    Helper method for common parse actions that simply return a literal value.  Especially
    useful when used with C{L{transformString<ParserElement.transformString>}()}.

    Example::
        num = Word(nums).setParseAction(lambda toks: int(toks[0]))
        na = oneOf("N/A NA").setParseAction(replaceWith(math.nan))
        term = na | num
        
        OneOrMore(term).parseString("324 234 N/A 234") # -> [324, 234, nan, 234]
    cs�gS)Nrw)r�r5rv)�replStrrwrxry�szreplaceWith.<locals>.<lambda>rw)r~rw)r~rxr\�scCs|ddd�S)a
    Helper parse action for removing quotation marks from parsed quoted strings.

    Example::
        # by default, quotation marks are included in parsed results
        quotedString.parseString("'Now is the Winter of our Discontent'") # -> ["'Now is the Winter of our Discontent'"]

        # use removeQuotes to strip quotation marks from parsed results
        quotedString.setParseAction(removeQuotes)
        quotedString.parseString("'Now is the Winter of our Discontent'") # -> ["Now is the Winter of our Discontent"]
    rrrrsrw)r�r5rvrwrwrxrZ�scsN��fdd�}yt�dt�d�j�}Wntk
rBt��}YnX||_|S)aG
    Helper to define a parse action by mapping a function to all elements of a ParseResults list. If any additional 
    args are passed, they are forwarded to the given function as additional arguments after
    the token, as in C{hex_integer = Word(hexnums).setParseAction(tokenMap(int, 16))}, which will convert the
    parsed data to an integer using base 16.

    Example (compare the last example to the one in L{ParserElement.transformString})::
        hex_ints = OneOrMore(Word(hexnums)).setParseAction(tokenMap(int, 16))
        hex_ints.runTests('''
            00 11 22 aa FF 0a 0d 1a
            ''')
        
        upperword = Word(alphas).setParseAction(tokenMap(str.upper))
        OneOrMore(upperword).runTests('''
            my kingdom for a horse
            ''')

        wd = Word(alphas).setParseAction(tokenMap(str.title))
        OneOrMore(wd).setParseAction(' '.join).runTests('''
            now is the winter of our discontent made glorious summer by this sun of york
            ''')
    prints::
        00 11 22 aa FF 0a 0d 1a
        [0, 17, 34, 170, 255, 10, 13, 26]

        my kingdom for a horse
        ['MY', 'KINGDOM', 'FOR', 'A', 'HORSE']

        now is the winter of our discontent made glorious summer by this sun of york
        ['Now Is The Winter Of Our Discontent Made Glorious Summer By This Sun Of York']
    cs��fdd�|D�S)Ncsg|]}�|f����qSrwrw)r�Ztokn)r�r6rwrxr��sz(tokenMap.<locals>.pa.<locals>.<listcomp>rw)r�r5rv)r�r6rwrxr}�sztokenMap.<locals>.par�rH)rJr�rKr{)r6r�r}rLrw)r�r6rxrm�s cCst|�j�S)N)r�r�)rvrwrwrxry�scCst|�j�S)N)r��lower)rvrwrwrxry�scCs�t|t�r|}t||d�}n|j}tttd�}|r�tj�j	t
�}td�|d�tt
t|td�|���tddgd�jd	�j	d
d��td�}n�d
jdd�tD��}tj�j	t
�t|�B}td�|d�tt
t|j	t�ttd�|����tddgd�jd	�j	dd��td�}ttd�|d�}|jdd
j|jdd�j�j���jd|�}|jdd
j|jdd�j�j���jd|�}||_||_||fS)zRInternal helper to construct opening and closing tag expressions, given a tag name)r�z_-:r�tag�=�/F)r�rCcSs|ddkS)Nrr�rw)r�r5rvrwrwrxry�sz_makeTags.<locals>.<lambda>rr�css|]}|dkr|VqdS)rNrw)r�r�rwrwrxr��sz_makeTags.<locals>.<genexpr>cSs|ddkS)Nrr�rw)r�r5rvrwrwrxry�sz</r��:r�z<%s>r
z</%s>)rzr�rr�r/r4r3r>r�r�rZr+rr2rrrmr�rVrYrBr
�_Lr��titler�rir�)�tagStrZxmlZresnameZtagAttrNameZtagAttrValueZopenTagZprintablesLessRAbrackZcloseTagrwrwrx�	_makeTags�s"
T\..r�cCs
t|d�S)a 
    Helper to construct opening and closing tag expressions for HTML, given a tag name. Matches
    tags in either upper or lower case, attributes with namespaces and with quoted or unquoted values.

    Example::
        text = '<td>More info at the <a href="http://pyparsing.wikispaces.com">pyparsing</a> wiki page</td>'
        # makeHTMLTags returns pyparsing expressions for the opening and closing tags as a 2-tuple
        a,a_end = makeHTMLTags("A")
        link_expr = a + SkipTo(a_end)("link_text") + a_end
        
        for link in link_expr.searchString(text):
            # attributes in the <A> tag (like "href" shown here) are also accessible as named results
            print(link.link_text, '->', link.href)
    prints::
        pyparsing -> http://pyparsing.wikispaces.com
    F)r�)r�rwrwrxrK�scCs
t|d�S)z�
    Helper to construct opening and closing tag expressions for XML, given a tag name. Matches
    tags only in the given upper/lower case.

    Example: similar to L{makeHTMLTags}
    T)r�)r�rwrwrxrLscs8|r|dd��n|j��dd��D���fdd�}|S)a<
    Helper to create a validating parse action to be used with start tags created
    with C{L{makeXMLTags}} or C{L{makeHTMLTags}}. Use C{withAttribute} to qualify a starting tag
    with a required attribute value, to avoid false matches on common tags such as
    C{<TD>} or C{<DIV>}.

    Call C{withAttribute} with a series of attribute names and values. Specify the list
    of filter attributes names and values as:
     - keyword arguments, as in C{(align="right")}, or
     - as an explicit dict with C{**} operator, when an attribute name is also a Python
          reserved word, as in C{**{"class":"Customer", "align":"right"}}
     - a list of name-value tuples, as in ( ("ns1:class", "Customer"), ("ns2:align","right") )
    For attribute names with a namespace prefix, you must use the second form.  Attribute
    names are matched insensitive to upper/lower case.
       
    If just testing for C{class} (with or without a namespace), use C{L{withClass}}.

    To verify that the attribute exists, but without specifying a value, pass
    C{withAttribute.ANY_VALUE} as the value.

    Example::
        html = '''
            <div>
            Some text
            <div type="grid">1 4 0 1 0</div>
            <div type="graph">1,3 2,3 1,1</div>
            <div>this has no type</div>
            </div>
                
        '''
        div,div_end = makeHTMLTags("div")

        # only match div tag having a type attribute with value "grid"
        div_grid = div().setParseAction(withAttribute(type="grid"))
        grid_expr = div_grid + SkipTo(div | div_end)("body")
        for grid_header in grid_expr.searchString(html):
            print(grid_header.body)
        
        # construct a match with any div tag having a type attribute, regardless of the value
        div_any_type = div().setParseAction(withAttribute(type=withAttribute.ANY_VALUE))
        div_expr = div_any_type + SkipTo(div | div_end)("body")
        for div_header in div_expr.searchString(html):
            print(div_header.body)
    prints::
        1 4 0 1 0

        1 4 0 1 0
        1,3 2,3 1,1
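
    A short sketch of the explicit-dict form for the reserved word ``class``
    (assuming pyparsing is importable; the HTML snippet is illustrative)::

        from pyparsing import makeHTMLTags, withAttribute, SkipTo

        html = '<div class="grid">1 4 0 1 0</div><div class="graph">1,3 2,3 1,1</div>'
        div, div_end = makeHTMLTags("div")
        div_grid = div.copy().setParseAction(withAttribute(**{"class": "grid"}))
        for match in (div_grid + SkipTo(div | div_end)("body")).searchString(html):
            print(match.body)   # only the "grid" div matches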
    NcSsg|]\}}||f�qSrwrw)r�r�r�rwrwrxr�Qsz!withAttribute.<locals>.<listcomp>cs^xX�D]P\}}||kr&t||d|��|tjkr|||krt||d||||f��qWdS)Nzno matching attribute z+attribute '%s' has value '%s', must be '%s')rre�	ANY_VALUE)r�r5r�ZattrNameZ	attrValue)�attrsrwrxr}RszwithAttribute.<locals>.pa)r�)r�ZattrDictr}rw)r�rxres2cCs|rd|nd}tf||i�S)a�
    Simplified version of C{L{withAttribute}} when matching on a div class - made
    difficult because C{class} is a reserved word in Python.

    Example::
        html = '''
            <div>
            Some text
            <div class="grid">1 4 0 1 0</div>
            <div class="graph">1,3 2,3 1,1</div>
            <div>this &lt;div&gt; has no class</div>
            </div>
                
        '''
        div,div_end = makeHTMLTags("div")
        div_grid = div().setParseAction(withClass("grid"))
        
        grid_expr = div_grid + SkipTo(div | div_end)("body")
        for grid_header in grid_expr.searchString(html):
            print(grid_header.body)
        
        div_any_type = div().setParseAction(withClass(withAttribute.ANY_VALUE))
        div_expr = div_any_type + SkipTo(div | div_end)("body")
        for div_header in div_expr.searchString(html):
            print(div_header.body)
    prints::
        1 4 0 1 0

        1 4 0 1 0
        1,3 2,3 1,1
    z%s:class�class)re)Z	classname�	namespaceZ	classattrrwrwrxrk\s �(rcCs�t�}||||B}�x`t|�D�]R\}}|ddd�\}}	}
}|	dkrTd|nd|}|	dkr�|dksxt|�dkr�td��|\}
}t�j|�}|
tjk�rd|	dkr�t||�t|t	|��}n�|	dk�r|dk	�rt|||�t|t	||��}nt||�t|t	|��}nD|	dk�rZt||
|||�t||
|||�}ntd	��n�|
tj
k�rH|	dk�r�t|t��s�t|�}t|j
|�t||�}n�|	dk�r|dk	�r�t|||�t|t	||��}nt||�t|t	|��}nD|	dk�r>t||
|||�t||
|||�}ntd	��ntd
��|�r`|j|�||j|�|BK}|}q"W||K}|S)a�	
    Helper method for constructing grammars of expressions made up of
    operators working in a precedence hierarchy.  Operators may be unary or
    binary, left- or right-associative.  Parse actions can also be attached
    to operator expressions. The generated parser will also recognize the use 
    of parentheses to override operator precedences (see example below).
    
    Note: if you define a deep operator list, you may see performance issues
    when using infixNotation. See L{ParserElement.enablePackrat} for a
    mechanism to potentially improve your parser performance.

    Parameters:
     - baseExpr - expression representing the most basic operand element for the nested expression
     - opList - list of tuples, one for each operator precedence level in the
      expression grammar; each tuple is of the form
      (opExpr, numTerms, rightLeftAssoc, parseAction), where:
       - opExpr is the pyparsing expression for the operator;
          may also be a string, which will be converted to a Literal;
          if numTerms is 3, opExpr is a tuple of two expressions, for the
          two operators separating the 3 terms
       - numTerms is the number of terms for this operator (must
          be 1, 2, or 3)
       - rightLeftAssoc is the indicator whether the operator is
          right or left associative, using the pyparsing-defined
          constants C{opAssoc.RIGHT} and C{opAssoc.LEFT}.
       - parseAction is the parse action to be associated with
          expressions matching this operator expression (the
          parse action tuple member may be omitted)
     - lpar - expression for matching left-parentheses (default=C{Suppress('(')})
     - rpar - expression for matching right-parentheses (default=C{Suppress(')')})

    Example::
        # simple example of four-function arithmetic with ints and variable names
        integer = pyparsing_common.signed_integer
        varname = pyparsing_common.identifier 
        
        arith_expr = infixNotation(integer | varname,
            [
            ('-', 1, opAssoc.RIGHT),
            (oneOf('* /'), 2, opAssoc.LEFT),
            (oneOf('+ -'), 2, opAssoc.LEFT),
            ])
        
        arith_expr.runTests('''
            5+3*6
            (5+3)*6
            -2--11
            ''', fullDump=False)
    prints::
        5+3*6
        [[5, '+', [3, '*', 6]]]

        (5+3)*6
        [[[5, '+', 3], '*', 6]]

        -2--11
        [[['-', 2], '-', ['-', 11]]]
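
    A second, minimal sketch for a small boolean grammar (assuming pyparsing is
    importable; the keyword spellings are illustrative)::

        from pyparsing import infixNotation, opAssoc, Keyword

        operand = Keyword("true") | Keyword("false")
        bool_expr = infixNotation(operand, [
            ("not", 1, opAssoc.RIGHT),
            ("and", 2, opAssoc.LEFT),
            ("or",  2, opAssoc.LEFT),
        ])
        print(bool_expr.parseString("true and not false or false"))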
    Nrroz%s termz	%s%s termrqz@if numterms=3, opExpr must be a tuple or list of two expressionsrrz6operator must be unary (1), binary (2), or ternary (3)z2operator must indicate right or left associativity)N)rr�r�r�rirT�LEFTrrr�RIGHTrzrr.r�)ZbaseExprZopListZlparZrparr�ZlastExprr�ZoperDefZopExprZarityZrightLeftAssocr}ZtermNameZopExpr1ZopExpr2ZthisExprrrrwrwrxri�sR;

&




&


z4"(?:[^"\n\r\\]|(?:"")|(?:\\(?:[^x]|x[0-9a-fA-F]+)))*�"z string enclosed in double quotesz4'(?:[^'\n\r\\]|(?:'')|(?:\\(?:[^x]|x[0-9a-fA-F]+)))*�'z string enclosed in single quotesz*quotedString using single or double quotes�uzunicode string literalcCs�||krtd��|dk�r(t|t�o,t|t��r t|�dkr�t|�dkr�|dk	r�tt|t||tjdd���j	dd��}n$t
j�t||tj�j	dd��}nx|dk	r�tt|t|�t|�ttjdd���j	dd��}n4ttt|�t|�ttjdd���j	d	d��}ntd
��t
�}|dk	�rb|tt|�t||B|B�t|��K}n$|tt|�t||B�t|��K}|jd||f�|S)a~	
    Helper method for defining nested lists enclosed in opening and closing
    delimiters ("(" and ")" are the default).

    Parameters:
     - opener - opening character for a nested list (default=C{"("}); can also be a pyparsing expression
     - closer - closing character for a nested list (default=C{")"}); can also be a pyparsing expression
     - content - expression for items within the nested lists (default=C{None})
     - ignoreExpr - expression for ignoring opening and closing delimiters (default=C{quotedString})

    If an expression is not provided for the content argument, the nested
    expression will capture all whitespace-delimited content between delimiters
    as a list of separate values.

    Use the C{ignoreExpr} argument to define expressions that may contain
    opening or closing characters that should not be treated as opening
    or closing characters for nesting, such as quotedString or a comment
    expression.  Specify multiple expressions using an C{L{Or}} or C{L{MatchFirst}}.
    The default is L{quotedString}, but if no expressions are to be ignored,
    then pass C{None} for this argument.

    Example::
        data_type = oneOf("void int short long char float double")
        decl_data_type = Combine(data_type + Optional(Word('*')))
        ident = Word(alphas+'_', alphanums+'_')
        number = pyparsing_common.number
        arg = Group(decl_data_type + ident)
        LPAR,RPAR = map(Suppress, "()")

        code_body = nestedExpr('{', '}', ignoreExpr=(quotedString | cStyleComment))

        c_function = (decl_data_type("type") 
                      + ident("name")
                      + LPAR + Optional(delimitedList(arg), [])("args") + RPAR 
                      + code_body("body"))
        c_function.ignore(cStyleComment)
        
        source_code = '''
            int is_odd(int x) { 
                return (x%2); 
            }
                
            int dec_to_hex(char hchar) { 
                if (hchar >= '0' && hchar <= '9') { 
                    return (ord(hchar)-ord('0')); 
                } else { 
                    return (10+ord(hchar)-ord('A'));
                } 
            }
        '''
        for func in c_function.searchString(source_code):
            print("%(name)s (%(type)s) args: %(args)s" % func)

    prints::
        is_odd (int) args: [['int', 'x']]
        dec_to_hex (int) args: [['char', 'hchar']]
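
    A minimal sketch using the defaults (assuming pyparsing is importable)::

        from pyparsing import nestedExpr

        print(nestedExpr().parseString("(a (b c) (d (e f)))").asList())
        # -> [['a', ['b', 'c'], ['d', ['e', 'f']]]]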
    z.opening and closing strings cannot be the sameNrr)rcSs|dj�S)Nr)r�)rvrwrwrxry9sznestedExpr.<locals>.<lambda>cSs|dj�S)Nr)r�)rvrwrwrxry<scSs|dj�S)Nr)r�)rvrwrwrxryBscSs|dj�S)Nr)r�)rvrwrwrxryFszOopening and closing arguments must be strings if no content expression is givenznested %s%s expression)r�rzr�r�r
rr	r$rNr�rCr�rrrr+r2ri)�openerZcloserZcontentrNr�rwrwrxrP�s4:

*$cs��fdd�}�fdd�}�fdd�}tt�jd�j��}t�t�j|�jd�}t�j|�jd	�}t�j|�jd
�}	|r�tt|�|t|t|�t|��|	�}
n$tt|�t|t|�t|���}
|j	t
t��|
jd�S)a
	
    Helper method for defining space-delimited indentation blocks, such as
    those used to define block statements in Python source code.

    Parameters:
     - blockStatementExpr - expression defining syntax of statement that
            is repeated within the indented block
     - indentStack - list created by caller to manage indentation stack
            (multiple statementWithIndentedBlock expressions within a single grammar
            should share a common indentStack)
     - indent - boolean indicating whether block must be indented beyond
            the current level; set to False for block of left-most statements
            (default=C{True})

    A valid block must contain at least one C{blockStatement}.

    Example::
        data = '''
        def A(z):
          A1
          B = 100
          G = A2
          A2
          A3
        B
        def BB(a,b,c):
          BB1
          def BBA():
            bba1
            bba2
            bba3
        C
        D
        def spam(x,y):
             def eggs(z):
                 pass
        '''


        indentStack = [1]
        stmt = Forward()

        identifier = Word(alphas, alphanums)
        funcDecl = ("def" + identifier + Group( "(" + Optional( delimitedList(identifier) ) + ")" ) + ":")
        func_body = indentedBlock(stmt, indentStack)
        funcDef = Group( funcDecl + func_body )

        rvalue = Forward()
        funcCall = Group(identifier + "(" + Optional(delimitedList(rvalue)) + ")")
        rvalue << (funcCall | identifier | Word(nums))
        assignment = Group(identifier + "=" + rvalue)
        stmt << ( funcDef | assignment | identifier )

        module_body = OneOrMore(stmt)

        parseTree = module_body.parseString(data)
        parseTree.pprint()
    prints::
        [['def',
          'A',
          ['(', 'z', ')'],
          ':',
          [['A1'], [['B', '=', '100']], [['G', '=', 'A2']], ['A2'], ['A3']]],
         'B',
         ['def',
          'BB',
          ['(', 'a', 'b', 'c', ')'],
          ':',
          [['BB1'], [['def', 'BBA', ['(', ')'], ':', [['bba1'], ['bba2'], ['bba3']]]]]],
         'C',
         'D',
         ['def',
          'spam',
          ['(', 'x', 'y', ')'],
          ':',
          [[['def', 'eggs', ['(', 'z', ')'], ':', [['pass']]]]]]] 
    csN|t|�krdSt||�}|�dkrJ|�dkr>t||d��t||d��dS)Nrrzillegal nestingznot a peer entryrsrs)r�r9r!r)r�r5rv�curCol)�indentStackrwrx�checkPeerIndent�s
z&indentedBlock.<locals>.checkPeerIndentcs2t||�}|�dkr"�j|�nt||d��dS)Nrrznot a subentryrs)r9r�r)r�r5rvr�)r�rwrx�checkSubIndent�s
z%indentedBlock.<locals>.checkSubIndentcsN|t|�krdSt||�}�o4|�dko4|�dksBt||d���j�dS)Nrrrqznot an unindentrsr:)r�r9rr�)r�r5rvr�)r�rwrx�
checkUnindent�s
z$indentedBlock.<locals>.checkUnindentz	 �INDENTr�ZUNINDENTzindented block)rrr�r�r
r�rirrr�rj)ZblockStatementExprr�rr�r�r�r!r�ZPEERZUNDENTZsmExprrw)r�rxrfQsN,z#[\0xc0-\0xd6\0xd8-\0xf6\0xf8-\0xff]z[\0xa1-\0xbf\0xd7\0xf7]z_:zany tagzgt lt amp nbsp quot aposz><& "'z&(?P<entity>rmz);zcommon HTML entitycCstj|j�S)zRHelper parser action to replace common HTML entities with their special characters)�_htmlEntityMapr�Zentity)rvrwrwrxr[�sz/\*(?:[^*]|\*(?!/))*z*/zC style commentz<!--[\s\S]*?-->zHTML commentz.*zrest of linez//(?:\\\n|[^\n])*z
// commentzC++ style commentz#.*zPython style comment)r�z 	�	commaItem)r�c@s�eZdZdZee�Zee�Ze	e
�jd�je�Z
e	e�jd�jeed��Zed�jd�je�Ze�je�de�je�jd�Zejd	d
��eeeed�j�e�Bjd�Zeje�ed
�jd�je�Zed�jd�je�ZeeBeBj�Zed�jd�je�Ze	eded�jd�Zed�jd�Z ed�jd�Z!e!de!djd�Z"ee!de!d>�dee!de!d?�jd�Z#e#j$d d
��d!e jd"�Z%e&e"e%Be#Bjd#��jd#�Z'ed$�jd%�Z(e)d@d'd(��Z*e)dAd*d+��Z+ed,�jd-�Z,ed.�jd/�Z-ed0�jd1�Z.e/j�e0j�BZ1e)d2d3��Z2e&e3e4d4�e5�e	e6d4d5�ee7d6����j�jd7�Z8e9ee:j;�e8Bd8d9��jd:�Z<e)ed;d
���Z=e)ed<d
���Z>d=S)Brna�

    Here are some common low-level expressions that may be useful in jump-starting parser development:
     - numeric forms (L{integers<integer>}, L{reals<real>}, L{scientific notation<sci_real>})
     - common L{programming identifiers<identifier>}
     - network addresses (L{MAC<mac_address>}, L{IPv4<ipv4_address>}, L{IPv6<ipv6_address>})
     - ISO8601 L{dates<iso8601_date>} and L{datetime<iso8601_datetime>}
     - L{UUID<uuid>}
     - L{comma-separated list<comma_separated_list>}
    Parse actions:
     - C{L{convertToInteger}}
     - C{L{convertToFloat}}
     - C{L{convertToDate}}
     - C{L{convertToDatetime}}
     - C{L{stripHTMLTags}}
     - C{L{upcaseTokens}}
     - C{L{downcaseTokens}}

    Example::
        pyparsing_common.number.runTests('''
            # any int or real number, returned as the appropriate type
            100
            -100
            +100
            3.14159
            6.02e23
            1e-12
            ''')

        pyparsing_common.fnumber.runTests('''
            # any int or real number, returned as float
            100
            -100
            +100
            3.14159
            6.02e23
            1e-12
            ''')

        pyparsing_common.hex_integer.runTests('''
            # hex numbers
            100
            FF
            ''')

        pyparsing_common.fraction.runTests('''
            # fractions
            1/2
            -3/4
            ''')

        pyparsing_common.mixed_integer.runTests('''
            # mixed fractions
            1
            1/2
            -3/4
            1-3/4
            ''')

        import uuid
        pyparsing_common.uuid.setParseAction(tokenMap(uuid.UUID))
        pyparsing_common.uuid.runTests('''
            # uuid
            12345678-1234-5678-1234-567812345678
            ''')
    prints::
        # any int or real number, returned as the appropriate type
        100
        [100]

        -100
        [-100]

        +100
        [100]

        3.14159
        [3.14159]

        6.02e23
        [6.02e+23]

        1e-12
        [1e-12]

        # any int or real number, returned as float
        100
        [100.0]

        -100
        [-100.0]

        +100
        [100.0]

        3.14159
        [3.14159]

        6.02e23
        [6.02e+23]

        1e-12
        [1e-12]

        # hex numbers
        100
        [256]

        FF
        [255]

        # fractions
        1/2
        [0.5]

        -3/4
        [-0.75]

        # mixed fractions
        1
        [1]

        1/2
        [0.5]

        -3/4
        [-0.75]

        1-3/4
        [1.75]

        # uuid
        12345678-1234-5678-1234-567812345678
        [UUID('12345678-1234-5678-1234-567812345678')]
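
    A few of these expressions used directly (assuming a pyparsing version that
    ships ``pyparsing_common``)::

        from pyparsing import pyparsing_common

        print(pyparsing_common.number.parseString("6.02e23")[0])         # 6.02e+23 (float)
        print(pyparsing_common.ipv4_address.parseString("127.0.0.1")[0]) # '127.0.0.1'
        print(pyparsing_common.identifier.parseString("parse_all")[0])   # 'parse_all'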
    �integerzhex integerrsz[+-]?\d+zsigned integerr��fractioncCs|d|dS)Nrrrrsrw)rvrwrwrxry�szpyparsing_common.<lambda>r7z"fraction or mixed integer-fractionz
[+-]?\d+\.\d*zreal numberz+[+-]?\d+([eE][+-]?\d+|\.\d*([eE][+-]?\d+)?)z$real number with scientific notationz[+-]?\d+\.?\d*([eE][+-]?\d+)?�fnumberrA�
identifierzK(25[0-5]|2[0-4][0-9]|1?[0-9]{1,2})(\.(25[0-5]|2[0-4][0-9]|1?[0-9]{1,2})){3}zIPv4 addressz[0-9a-fA-F]{1,4}�hex_integerr��zfull IPv6 addressrrBz::zshort IPv6 addresscCstdd�|D��dkS)Ncss|]}tjj|�rdVqdS)rrN)rn�
_ipv6_partr�)r�rerwrwrxr��sz,pyparsing_common.<lambda>.<locals>.<genexpr>rv)rG)rvrwrwrxry�sz::ffff:zmixed IPv6 addresszIPv6 addressz:[0-9a-fA-F]{2}([:.-])[0-9a-fA-F]{2}(?:\1[0-9a-fA-F]{2}){4}zMAC address�%Y-%m-%dcs�fdd�}|S)a�
        Helper to create a parse action for converting parsed date string to Python datetime.date

        Params -
         - fmt - format to be passed to datetime.strptime (default=C{"%Y-%m-%d"})

        Example::
            date_expr = pyparsing_common.iso8601_date.copy()
            date_expr.setParseAction(pyparsing_common.convertToDate())
            print(date_expr.parseString("1999-12-31"))
        prints::
            [datetime.date(1999, 12, 31)]
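
        A sketch with a non-default format (the ``%d/%m/%Y`` pattern and the
        ``uk_date`` name are illustrative)::

            from pyparsing import Regex, pyparsing_common

            uk_date = Regex(r'\d{2}/\d{2}/\d{4}')
            uk_date.setParseAction(pyparsing_common.convertToDate("%d/%m/%Y"))
            print(uk_date.parseString("31/12/1999"))  # [datetime.date(1999, 12, 31)]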
        csLytj|d��j�Stk
rF}zt||t|���WYdd}~XnXdS)Nr)r�strptimeZdater�rr{)r�r5rv�ve)�fmtrwrx�cvt_fn�sz.pyparsing_common.convertToDate.<locals>.cvt_fnrw)r�r�rw)r�rx�
convertToDate�szpyparsing_common.convertToDate�%Y-%m-%dT%H:%M:%S.%fcs�fdd�}|S)a
        Helper to create a parse action for converting parsed datetime string to Python datetime.datetime

        Params -
         - fmt - format to be passed to datetime.strptime (default=C{"%Y-%m-%dT%H:%M:%S.%f"})

        Example::
            dt_expr = pyparsing_common.iso8601_datetime.copy()
            dt_expr.setParseAction(pyparsing_common.convertToDatetime())
            print(dt_expr.parseString("1999-12-31T23:59:59.999"))
        prints::
            [datetime.datetime(1999, 12, 31, 23, 59, 59, 999000)]
        csHytj|d��Stk
rB}zt||t|���WYdd}~XnXdS)Nr)rr�r�rr{)r�r5rvr�)r�rwrxr��sz2pyparsing_common.convertToDatetime.<locals>.cvt_fnrw)r�r�rw)r�rx�convertToDatetime�sz"pyparsing_common.convertToDatetimez7(?P<year>\d{4})(?:-(?P<month>\d\d)(?:-(?P<day>\d\d))?)?zISO8601 datez�(?P<year>\d{4})-(?P<month>\d\d)-(?P<day>\d\d)[T ](?P<hour>\d\d):(?P<minute>\d\d)(:(?P<second>\d\d(\.\d*)?)?)?(?P<tz>Z|[+-]\d\d:?\d\d)?zISO8601 datetimez2[0-9a-fA-F]{8}(-[0-9a-fA-F]{4}){3}-[0-9a-fA-F]{12}�UUIDcCstjj|d�S)a
        Parse action to remove HTML tags from web page HTML source

        Example::
            # strip HTML links from normal text 
            text = '<td>More info at the <a href="http://pyparsing.wikispaces.com">pyparsing</a> wiki page</td>'
            td,td_end = makeHTMLTags("TD")
            table_text = td + SkipTo(td_end).setParseAction(pyparsing_common.stripHTMLTags)("body") + td_end
            
            print(table_text.parseString(text).body) # -> 'More info at the pyparsing wiki page'
        r)rn�_html_stripperr�)r�r5r�rwrwrx�
stripHTMLTags�s
zpyparsing_common.stripHTMLTagsr`)r�z 	r�r�)r�zcomma separated listcCst|�j�S)N)r�r�)rvrwrwrxry�scCst|�j�S)N)r�r)rvrwrwrxry�sN)rrB)rrB)r�)r�)?r�r�r�r�rmruZconvertToInteger�floatZconvertToFloatr/rRrir�r�rDr�r'Zsigned_integerr�rxrr�Z
mixed_integerrG�realZsci_realr��numberr�r4r3r�Zipv4_addressr�Z_full_ipv6_addressZ_short_ipv6_addressr~Z_mixed_ipv6_addressr
Zipv6_addressZmac_addressr�r�r�Ziso8601_dateZiso8601_datetime�uuidr7r6r�r�rrrrVr.�
_commasepitemr@rYr�Zcomma_separated_listrdrBrwrwrwrxrn�sN""
28�__main__Zselect�fromz_$r\)ra�columnsrjZtablesZcommandaK
        # '*' as column list and dotted table name
        select * from SYS.XYZZY

        # caseless match on "SELECT", and casts back to "select"
        SELECT * from XYZZY, ABC

        # list of column names, and mixed case SELECT keyword
        Select AA,BB,CC from Sys.dual

        # multiple tables
        Select A, B, C from Sys.dual, Table2

        # invalid SELECT keyword - should fail
        Xelect A, B, C from Sys.dual

        # incomplete command - should fail
        Select

        # invalid column name - should fail
        Select ^^^ frox Sys.dual

        z]
        100
        -100
        +100
        3.14159
        6.02e23
        1e-12
        z 
        100
        FF
        z6
        12345678-1234-5678-1234-567812345678
        )rq)r`F)N)FT)T)r�)T)�r��__version__Z__versionTime__�
__author__r��weakrefrr�r�r~r�rdrr�r"r<r�r�_threadr�ImportErrorZ	threadingrr�Zordereddict�__all__r��version_infor;r�maxsizer�r{r��chrrtr�rGr�r�reversedr�r�rr5r
rrIZmaxintZxranger�Z__builtin__r�Zfnamer�rJr�r�r�r�r�r�Zascii_uppercaseZascii_lowercaser4rRrDr3rjr�Z	printablerVrKrrr!r#r&r�r"�MutableMapping�registerr9rJrGr/r2r4rQrMr$r,r
rrr�rQrrrrlr/r'r%r	r.r/rrrr*r)r1r0r rrrrrrrrIrr2rLrMrr(rrUr-r
rrr+rrbr@r<r�rOrNrrSrArgrhrjrirCrIrHrar`r�Z_escapedPuncZ_escapedHexCharZ_escapedOctChar�UNICODEZ_singleCharZ
_charRangermr|r_rMr\rZrmrdrBr�rKrLrer�rkrTr�r�rirUr>r^rYrcrPrfr5rWr7r6r�r�r�r�r;r[r8rEr�r]r?r=rFrXr�r�r:rnr�ZselectTokenZ	fromTokenZidentZ
columnNameZcolumnNameListZ
columnSpecZ	tableNameZ
tableNameListZ	simpleSQLr�r�r�r�r�r�rwrwrwrx�<module>=s�









8


@d&A= I
G3pLOD|M &#@sQ,A,	I#%&0
,	?#kZr

 (
 0 


"site-packages/__pycache__/configobj.cpython-36.pyc000064400000163466147511334560016113 0ustar003

��S^�@sVddlZddlZddlZddlmZmZmZmZddlZddl	m
Z
daedKedLedMedNiZdddddddddddddddd�Z
eeeeed	�Zd
d�ZdZd
ZdZdZdZdZe�ZdOZd$Zd%Zd&Zd'd(d'd(d(dd'dddd(d(d)�Zd*d+�ZGd,d!�d!e�ZGd-d.�d.e�Z e �Z!d/d0�Z"Gd1d�de#�Z$Gd2d�de$�Z%Gd3d�de$�Z&Gd4d�de'�Z(Gd5d�de$�Z)Gd6d�de$�Z*Gd7d�de$�Z+Gd8d�de+�Z,Gd9d�de$�Z-Gd:d�de+�Z.Gd;d �d e$�Z/Gd<d=�d=e�Z0Gd>d?�d?e0�Z1Gd@dA�dAe0�Z2e1e2dB�Z3dCdD�Z4GdEdF�dFe5�Z6GdGd�de6�Z7GdHd�de�Z8dPdId"�Z9ffdJd#�Z:dS)Q�N)�BOM_UTF8�	BOM_UTF16�BOM_UTF16_BE�BOM_UTF16_LE)�__version__�utf_8�utf16_be�utf_16�utf16_le)r	�u16�utf16zutf-16r�	utf_16_bezutf-16ber
�	utf_16_lezutf-16ler�u8�utf�utf8zutf-8)rr	rr
NcCstj|j��dkS)Nr)�BOM_LIST�get�lower)�encoding�r�/usr/lib/python3.6/configobj.py�
match_utf8Dsrz'%s'z"%s"z%sz 
	'"z"""%s"""z'''%s'''�DEFAULT_INDENT_TYPE�DEFAULT_INTERPOLATION�ConfigObjError�NestingError�
ParseError�DuplicateError�ConfigspecError�	ConfigObj�	SimpleVal�InterpolationError�InterpolationLoopError�MissingInterpolationOption�RepeatSectionError�ReloadError�UnreprError�UnknownType�flatten_errors�get_extra_values�configparserz    �
TF)�
interpolation�raise_errors�list_values�create_empty�
file_error�
configspec�	stringify�indent_typer�default_encoding�unrepr�write_empty_valuescCs>tdkrddlad|}tj|�}|j�dj�dj�dS)Nrza=�)�compiler�parse�getChildren)�s�prrr�getObj�s

r>c@seZdZdS)r(N)�__name__�
__module__�__qualname__rrrrr(�sc@s\eZdZdd�Zdd�Zdd�Zdd�Zd	d
�Zdd�Zd
d�Z	dd�Z
dd�Zdd�ZdS)�BuildercCstdkrt|jj��t|�S)N)�mr(�	__class__r?)�self�orrr�build�sz
Builder.buildcCstt|j|j���S)N)�list�maprGr;)rErFrrr�
build_List�szBuilder.build_ListcCs|jS)N)�value)rErFrrr�build_Const�szBuilder.build_ConstcCs6i}tt|j|j���}x|D]}t|�||<qW|S)N)�iterrIrGr;�next)rErF�d�iZelrrr�
build_Dict�s

zBuilder.build_DictcCst|j|��S)N)�tuplerJ)rErFrrr�build_Tuple�szBuilder.build_TuplecCs6|jdkrdS|jdkrdS|jdkr*dStd��dS)N�None�TrueT�FalseFzUndefined Name)�namer()rErFrrr�
build_Name�s


zBuilder.build_NamecCshtt|j|j���\}}yt|�}Wntk
r@td��YnXt|t�sX|j	dkr`td��||S)NZAddg)
rHrIrLr;�float�	TypeErrorr(�
isinstance�complex�real)rErFr]�imagrrr�	build_Add�szBuilder.build_AddcCs|j|j�}t||j�S)N)rG�expr�getattrZattrname)rErF�parentrrr�
build_Getattr�szBuilder.build_GetattrcCs|j|j�d�S)Nr)rLr;)rErFrrr�build_UnarySub�szBuilder.build_UnarySubcCs|j|j�d�S)Nr)rLr;)rErFrrr�build_UnaryAdd�szBuilder.build_UnaryAddN)
r?r@rArGrJrLrQrSrXr_rcrdrerrrrrB�s
rBcCs|s|Sddl}|j|�S)Nr)�astZliteral_eval)r<rfrrrr6�sr6c@seZdZdZddd�ZdS)rzk
    This is the base class for all errors that ConfigObj raises.
    It is a subclass of SyntaxError.
    �NcCs||_||_tj||�dS)N)�line�line_number�SyntaxError�__init__)rE�messagerirhrrrrk�szConfigObjError.__init__)rgNrg)r?r@rA�__doc__rkrrrrr�sc@seZdZdZdS)rzE
    This error indicates a level of nesting that doesn't match.
    N)r?r@rArmrrrrr�sc@seZdZdZdS)rz�
    This error indicates that a line is badly written.
    It is neither a valid ``key = value`` line,
    nor a valid section marker line.
    N)r?r@rArmrrrrr�sc@seZdZdZdd�ZdS)r&zW
    A 'reload' operation failed.
    This exception is a subclass of ``IOError``.
    cCstj|d�dS)Nz#reload failed, filename is not set.)�IOErrorrk)rErrrrk�szReloadError.__init__N)r?r@rArmrkrrrrr&�sc@seZdZdZdS)rz:
    The keyword or section specified already exists.
    N)r?r@rArmrrrrr�sc@seZdZdZdS)rz7
    An error occurred whilst parsing a configspec.
    N)r?r@rArmrrrrr�sc@seZdZdZdS)r"z,Base class for the two interpolation errors.N)r?r@rArmrrrrr"�sc@seZdZdZdd�ZdS)r#z=Maximum interpolation depth exceeded in string interpolation.cCstj|d|�dS)Nz*interpolation loop detected in value "%s".)r"rk)rE�optionrrrrkszInterpolationLoopError.__init__N)r?r@rArmrkrrrrr#sc@seZdZdZdS)r%zk
    This error indicates additional sections in a section with a
    ``__many__`` (repeated) section.
    N)r?r@rArmrrrrr%sc@seZdZdZdd�ZdS)r$z0A value specified for interpolation was missing.cCsd|}tj||�dS)Nz%missing option "%s" in interpolation.)r"rk)rEro�msgrrrrksz#MissingInterpolationOption.__init__N)r?r@rArmrkrrrrr$sc@seZdZdZdS)r'z An error parsing in unrepr mode.N)r?r@rArmrrrrr'sc@s>eZdZdZejd�ZdZdd�Zdd�Z	dd	�Z
d
d�ZdS)
�InterpolationEnginez�
    A helper class to help perform string interpolation.

    This class is an abstract base class; its descendants perform
    the actual work.
    z
%\(([^)]*)\)s�%cCs
||_dS)N)�section)rErsrrrrk*szInterpolationEngine.__init__cs0�j|kr|S��fdd���||�ji�}|S)Ncs�||jf|krt|��d|||jf<�jj|�}xz|r��j|�\}}}|dkrT|}n�||||�}|j�\}	}
dj|d|	�|||
d�f�}|	t|�}�jj||�}q2W|||jf=|S)axThe function that does the actual work.

            ``value``: the string we're trying to interpolate.
            ``section``: the section in which that string was found
            ``backtrail``: a dict to keep track of where we've been,
            to detect and prevent infinite recursion loops

            This is similar to a depth-first-search algorithm.
            r8Nrg)rWr#�_KEYCRE�search�_parse_match�span�join�len)�keyrKrsZ	backtrail�match�k�vr<Zreplacement�start�endZnew_search_start)�recursive_interpolaterErrr�4s z>InterpolationEngine.interpolate.<locals>.recursive_interpolate)�_cookiers)rErzrKr)r�rEr�interpolate/s

,zInterpolationEngine.interpolatecCs�|jjj}d|jj_|j}x^|j|�}|dk	r<t|t�r<P|jdi�j|�}|dk	rdt|t�rdP|j|krpP|j}qW||jj_|dkr�t|��||fS)z�Helper function to fetch values from owning section.

        Returns a 2-tuple: the value, and the section where it was found.
        FN�DEFAULT)rs�mainr-rr[�Sectionrbr$)rErzZsave_interpZcurrent_section�valrrr�_fetchds"





zInterpolationEngine._fetchcCs
t��dS)a�Implementation-dependent helper function.

        Will be passed a match object corresponding to the interpolation
        key we just found (e.g., "%(foo)s" or "$foo"). Should look up that
        key in the appropriate config file section (using the ``_fetch()``
        helper function) and return a 3-tuple: (key, value, section)

        ``key`` is the name of the key we're looking for
        ``value`` is the value found for that key
        ``section`` is a reference to the section where it was found

        ``key`` and ``section`` should be None if no further
        interpolation should be performed on the resulting value
        (e.g., if we interpolated "$$" and returned "$").
        N)�NotImplementedError)rEr{rrrrv�sz InterpolationEngine._parse_matchN)r?r@rArm�re�compilertr�rkr�r�rvrrrrrqs
5"rqc@s&eZdZdZdZejd�Zdd�ZdS)�ConfigParserInterpolationzBehaves like ConfigParser.rrz
%\(([^)]*)\)scCs"|jd�}|j|�\}}|||fS)Nr8)�groupr�)rEr{rzrKrsrrrrv�s
z&ConfigParserInterpolation._parse_matchN)	r?r@rArmr�r�r�rtrvrrrrr��s
r�c@s4eZdZdZdZdZejdejej	B�Z
dd�ZdS)�TemplateInterpolationzBehaves like string.Template.�$z�
        \$(?:
          (?P<escaped>\$)              |   # Two $ signs
          (?P<named>[_a-z][_a-z0-9]*)  |   # $name format
          {(?P<braced>[^}]*)}              # ${name} format
        )
        cCs\|jd�p|jd�}|dk	r4|j|�\}}|||fS|jd�dk	rNd|jdfSd|j�dfS)NZnamedZbracedZescaped)r�r��
_delimiter)rEr{rzrKrsrrrrv�s
z"TemplateInterpolation._parse_matchN)r?r@rArmr�r�r�r��
IGNORECASE�VERBOSErtrvrrrrr��sr�)r+�templatecGs|j|f|��S)N)�__new__)�cls�argsrrr�
__newobj__�sr�c@s$eZdZdZdd�Zdd�ZdDdd�Zd	d
�Zdd�Zd
d�Z	dEdd�Z
dd�ZdFdd�Zdd�Z
efdd�Zdd�Zdd�ZdGdd�Zd d!�Zd"d#�Zd$d%�Zd&d'�Zd(d)�ZeZd*d+�Zd,d-�ZeZd.e_d/d0�Zd1d2�Zd3d4�ZdHd6d7�Zd8d9�Z d:d;�Z!d<d=�Z"d>d?�Z#d@dA�Z$dBdC�Z%dS)Ir�a�
    A dictionary-like object that represents a section in a config file.
    
    It does string interpolation if the 'interpolation' attribute
    of the 'main' object is set to True.
    
    Interpolation is tried first from this object, then from the 'DEFAULT'
    section of this object, next from the parent and its 'DEFAULT' section,
    and so on until the main object is reached.
    
    A Section will behave like an ordered dictionary - following the
    order of the ``scalars`` and ``sections`` attributes.
    You can use this to change the order of members.
    
    Iteration follows the order: scalars, then sections.
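
    A small sketch of the interpolation behaviour described above (assuming the
    ``configobj`` module is importable; the key names are illustrative)::

        from configobj import ConfigObj

        cfg = ConfigObj()
        cfg['home'] = '/home/user'
        cfg['log_dir'] = '%(home)s/logs'   # ConfigParser-style interpolation
        print(cfg['log_dir'])              # -> /home/user/logs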
    cCs$tj||d�|jj|d�dS)Nrr8)�dict�update�__dict__)rE�staterrr�__setstate__�szSection.__setstate__cCst|�|jf}t|jf|fS)N)r�r�r�rD)rEr�rrr�
__reduce__�szSection.__reduce__NcCsX|dkri}tj|�||_||_||_||_|j�x|j�D]\}}|||<q@WdS)z�
        * parent is the section above
        * depth is the depth level of this section
        * main is the main ConfigObj
        * indict is a dictionary to initialise the section with
        N)r�rkrbr��depthrW�_initialise�items)rErbr�r��indictrW�entryrKrrrrk�s
zSection.__init__cCs:g|_g|_i|_i|_d|_g|_i|_g|_d|_dS)NF)	�scalars�sections�comments�inline_commentsr2�defaults�default_values�extra_values�_created)rErrrr�szSection._initialisecCsvy
|j}Wn^tk
rh|jj}|dkr.t}|j�}tj|d�}|dkrVd|j_|S||�}|_YnX|j||�S)NTF)	Z_interpolation_engine�AttributeErrorr�r-rr�interpolation_enginesrr�)rErzrKZenginerWZclass_rrr�_interpolates
zSection._interpolatecsftj���}�jjrbt|tj�r,�j�|�St|t�rb��fdd���fdd�|D�}||krb|S|S)z+Fetch the item and do string interpolation.cst|tj�r�j�|�S|S)N)r[�six�string_typesr�)r�)rzrErr�_check/sz#Section.__getitem__.<locals>._checkcsg|]}�|��qSrr)�.0r�)r�rr�
<listcomp>3sz'Section.__getitem__.<locals>.<listcomp>)	r��__getitem__r�r-r[r�r�r�rH)rErzr��newr)r�rzrErr�(s
zSection.__getitem__Fc
CsNt|tj�std|��||jkr6g|j|<d|j|<||jkrL|jj|�t|t�rz||krj|j	j
|�tj|||�n�t|t�r�|r�||kr�|j	j
|�|j
d}tj||t|||j||d��n�||kr�|jj
|�|jj�s<t|tj�r�nHt|ttf��r0x6|D] }t|tj��s
td|���q
Wntd|��tj|||�dS)a�
        Correctly set a value.
        
        Making dictionary values Section instances.
        (We have to special case 'Section' instances - which are also dicts)
        
        Keys must be strings.
        Values need only be strings (or lists of strings) if
        ``main.stringify`` is set.
        
        ``unrepr`` must be set when setting a value to a dictionary, without
        creating a new sub-section.
        zThe key "%s" is not a string.rgr8)r�rWzValue is not a string "%s".N)r[r�r��
ValueErrorr�r�r��remover�r��appendr��__setitem__r�r�r�r3rHrRrZ)rErzrKr6Z	new_depthr�rrrr�9sF







zSection.__setitem__cCsDtj||�||jkr$|jj|�n|jj|�|j|=|j|=dS)z-Remove items from the sequence when deleting.N)r��__delitem__r�r�r�r�r�)rErzrrrr�ts
zSection.__delitem__cCs"y||Stk
r|SXdS)z>A version of ``get`` that doesn't bypass string interpolation.N)�KeyError)rErz�defaultrrrrszSection.getcCsx|D]}||||<qWdS)zD
        A version of update that uses our ``__setitem__``.
        Nr)rEr�r�rrrr��s
zSection.updatecCs:y||}Wn"tk
r.|tkr&�|}YnX||=|S)z�
        'D.pop(k[,d]) -> v, remove specified key and return the corresponding value.
        If key is not found, d is returned if given, otherwise KeyError is raised'
        )r��MISSING)rErzr�r�rrr�pop�s
zSection.popcCs6|j|j}|std��|d}||}||=||fS)zPops the first (key,val)z": 'popitem(): dictionary is empty'r)r�r�r�)rEZsequencerzr�rrr�popitem�szSection.popitemcCs8tj|�g|_g|_i|_i|_d|_g|_g|_dS)z�
        A version of clear that also affects scalars/sections
        Also clears comments and configspec.
        
        Leaves other attributes alone :
            depth/main/parent are not affected
        N)	r��clearr�r�r�r�r2r�r�)rErrrr��s
z
Section.clearcCs.y||Stk
r(|||<||SXdS)z:A version of setdefault that sets sequence if appropriate.N)r�)rErzr�rrr�
setdefault�s
zSection.setdefaultcCstt|j|jt|j����S)z8D.items() -> list of D's (key, value) pairs, as 2-tuples)rH�zipr�r��values)rErrrr��sz
Section.itemscCs|j|jS)zD.keys() -> list of D's keys)r�r�)rErrr�keys�szSection.keyscs�fdd��j�jD�S)z D.values() -> list of D's valuescsg|]}�|�qSrr)r�rz)rErrr��sz"Section.values.<locals>.<listcomp>)r�r�)rEr)rErr��szSection.valuescCstt|j���S)z=D.iteritems() -> an iterator over the (key, value) items of D)rMrHr�)rErrr�	iteritems�szSection.iteritemscCst|j|j�S)z.D.iterkeys() -> an iterator over the keys of D)rMr�r�)rErrr�iterkeys�szSection.iterkeyscCstt|j���S)z2D.itervalues() -> an iterator over the values of D)rMrHr�)rErrr�
itervalues�szSection.itervaluescs0�fdd��ddj�fdd��j�jD��S)zx.__repr__() <==> repr(x)cs*y�|Stk
r$tj�|�SXdS)N)r$r�r�)rz)rErr�_getval�sz!Section.__repr__.<locals>._getvalz{%s}z, cs$g|]}dt|�t�|��f�qS)z%s: %s)�repr)r�rz)r�rrr��sz$Section.__repr__.<locals>.<listcomp>)rxr�r�)rEr)r�rEr�__repr__�szSection.__repr__zx.__str__() <==> str(x)cCs`i}xV|D]N}||}t|t�r*|j�}n&t|t�r>t|�}nt|t�rPt|�}|||<q
W|S)a0
        Return a deepcopy of self as a dictionary.
        
        All members that are ``Section`` instances are recursively turned to
        ordinary dictionaries - by calling their ``dict`` method.
        
        >>> n = a.dict()
        >>> n == a
        1
        >>> n is a
        0
        )r[r�r�rHrR)rEZnewdictr��
this_entryrrrr��s






zSection.dictcCsVxPt|j��D]@\}}||krFt||t�rFt|t�rF||j|�q|||<qWdS)aQ
        A recursive update - useful for merging config files.
        
        >>> a = '''[section1]
        ...     option1 = True
        ...     [[subsection]]
        ...     more_options = False
        ...     # end of file'''.splitlines()
        >>> b = '''# File is user.ini
        ...     [section1]
        ...     option1 = False
        ...     # end of file'''.splitlines()
        >>> c1 = ConfigObj(b)
        >>> c2 = ConfigObj(a)
        >>> c2.merge(c1)
        >>> c2
        ConfigObj({'section1': {'option1': 'False', 'subsection': {'more_options': 'False'}}})
        N)rHr�r[r��merge)rEr�rzr�rrrr�s

z
Section.mergecCs�||jkr|j}n||jkr$|j}ntd|��|j|�}||}tj||�tj|||�|j|�|j||�|j	|}|j
|}|j	|=|j
|=||j	|<||j
|<dS)a
        Change a keyname to another, without changing position in sequence.
        
        Implemented so that transformations can be made on keys,
        as well as on values. (used by encode and decode)
        
        Also renames comments.
        zKey "%s" not found.N)r�r�r��indexr�r�r�r��insertr�r�)rEZoldkeyZnewkey�the_list�posr�ZcommZinline_commentrrr�rename,s"	






zSection.renameTc	Ksi}xttt|j��D]b}|j|}y$|||f|�}|j|}|||<Wqtk
rt|r^�n|j|}d||<YqXqWx�tt|j��D]~}|j|}|r�y|||f|�Wn.tk
r�|rƂn|j|}d||<YnX|j|}||j|f||d�|��||<q�W|S)a�
        Walk every member and call a function on the keyword and value.
        
        Return a dictionary of the return values
        
        If the function raises an exception, raise the error
        unless ``raise_errors=False``, in which case set the return value to
        ``False``.
        
        Any unrecognised keyword arguments you pass to walk will be passed on
        to the function you pass in.
        
        Note: if ``call_on_sections`` is ``True`` then - on encountering a
        subsection, *first* the function is called for the *whole* subsection,
        and then recurses into its members. This means your function must be
        able to handle strings, dictionaries and lists. This allows you
        to change the key of subsections as well as for ordinary members. The
        return value when called on the whole subsection has to be discarded.
        
        See  the encode and decode methods for examples, including functions.
        
        .. admonition:: caution
        
            You can use ``walk`` to transform the names of members of a section
            but you mustn't add or delete members.
        
        >>> config = '''[XXXXsection]
        ... XXXXkey = XXXXvalue'''.splitlines()
        >>> cfg = ConfigObj(config)
        >>> cfg
        ConfigObj({'XXXXsection': {'XXXXkey': 'XXXXvalue'}})
        >>> def transform(section, key):
        ...     val = section[key]
        ...     newkey = key.replace('XXXX', 'CLIENT1')
        ...     section.rename(key, newkey)
        ...     if isinstance(val, (tuple, list, dict)):
        ...         pass
        ...     else:
        ...         val = val.replace('XXXX', 'CLIENT1')
        ...         section[newkey] = val
        >>> cfg.walk(transform, call_on_sections=True)
        {'CLIENT1section': {'CLIENT1key': None}}
        >>> cfg
        ConfigObj({'CLIENT1section': {'CLIENT1key': 'CLIENT1value'}})
        F)r.�call_on_sections)�rangeryr��	Exceptionr��walk)	rEZfunctionr.r�Zkeywargs�outrPr�r�rrrr�Js:/





zSection.walkcCsn||}|dkrdS|dkr dSy(t|tj�s6t��n|jj|j�SWn tk
rhtd|��YnXdS)a_
        Accepts a key as input. The corresponding value must be a string or
        the objects (``True`` or 1) or (``False`` or 0). We allow 0 and 1 to
        retain compatibility with Python 2.2.
        
        If the string is one of  ``True``, ``On``, ``Yes``, or ``1`` it returns 
        ``True``.
        
        If the string is one of  ``False``, ``Off``, ``No``, or ``0`` it returns 
        ``False``.
        
        ``as_bool`` is not case sensitive.
        
        Any other input will raise a ``ValueError``.
        
        >>> a = ConfigObj()
        >>> a['a'] = 'fish'
        >>> a.as_bool('a')
        Traceback (most recent call last):
        ValueError: Value "fish" is neither True nor False
        >>> a['b'] = 'True'
        >>> a.as_bool('b')
        1
        >>> a['b'] = 'off'
        >>> a.as_bool('b')
        0
        TFz$Value "%s" is neither True nor FalseN)r[r�r�r�r��_boolsrr�)rErzr�rrr�as_bool�szSection.as_boolcCst||�S)ai
        A convenience method which coerces the specified value to an integer.
        
        If the value is an invalid literal for ``int``, a ``ValueError`` will
        be raised.
        
        >>> a = ConfigObj()
        >>> a['a'] = 'fish'
        >>> a.as_int('a')
        Traceback (most recent call last):
        ValueError: invalid literal for int() with base 10: 'fish'
        >>> a['b'] = '1'
        >>> a.as_int('b')
        1
        >>> a['b'] = '3.2'
        >>> a.as_int('b')
        Traceback (most recent call last):
        ValueError: invalid literal for int() with base 10: '3.2'
        )�int)rErzrrr�as_int�szSection.as_intcCst||�S)a>
        A convenience method which coerces the specified value to a float.
        
        If the value is an invalid literal for ``float``, a ``ValueError`` will
        be raised.
        
        >>> a = ConfigObj()
        >>> a['a'] = 'fish'
        >>> a.as_float('a')  #doctest: +IGNORE_EXCEPTION_DETAIL
        Traceback (most recent call last):
        ValueError: invalid literal for float(): fish
        >>> a['b'] = '1'
        >>> a.as_float('b')
        1.0
        >>> a['b'] = '3.2'
        >>> a.as_float('b')  #doctest: +ELLIPSIS
        3.2...
        )rY)rErzrrr�as_float�szSection.as_floatcCs$||}t|ttf�rt|�S|gS)aU
        A convenience method which fetches the specified value, guaranteeing
        that it is a list.
        
        >>> a = ConfigObj()
        >>> a['a'] = 1
        >>> a.as_list('a')
        [1]
        >>> a['a'] = (1,)
        >>> a.as_list('a')
        [1]
        >>> a['a'] = [1]
        >>> a.as_list('a')
        [1]
        )r[rRrH)rErz�resultrrr�as_list�szSection.as_listcCs2|j|}tj|||�||jkr.|jj|�|S)a
        Restore (and return) default value for the specified key.
        
        This method will only work for a ConfigObj that was created
        with a configspec and has been validated.
        
        If there is no default value for this key, ``KeyError`` is raised.
        )r�r�r�r�r�)rErzr�rrr�restore_defaults
	

zSection.restore_defaultcCs:x|jD]}|j|�qWx|jD]}||j�q"WdS)a'
        Recursively restore default values to all members
        that have them.
        
        This method will only work for a ConfigObj that was created
        with a configspec and has been validated.
        
        It doesn't delete or modify entries without default values.
        N)r�r�r��restore_defaults)rErzrsrrrr�s
zSection.restore_defaults)NN)F)N)N)TF)&r?r@rArmr�r�rkr�r�r�r�r�rr�r�r�r�r�r�r�r�r�r�r��__iter__r�r��__str__r�r�r�r�r�r�r�r�r�r�rrrrr��sH

;

	

T,r�c@s�eZdZdZejdej�Zejdej�Zejdej�Z	ejdej�Z
ejdej�Zejd�Zejd�Z
ejd	�Zejd
�Zeefe
efd�Zdd
dd
dd
dd
d�ZdFdd�Zdd�ZdGdd�Zdd�Zdd�Zdd�Zdd�Zdd�Zd d!�Zd"d#�Zd$d%�Zd&d'�Zd(d)�ZdHd*d+�Zd,d-�Z d.d/�Z!d0d1�Z"d2d3�Z#d4d5�Z$d6d7�Z%d8d9�Z&d:d;�Z'd<d=�Z(dId>d?�Z)dJd@dA�Z*dBdC�Z+dDdE�Z,dS)Kr z2An object to read, create, and write config files.a�^ # line start
        (\s*)                   # indentation
        (                       # keyword
            (?:".*?")|          # double quotes
            (?:'.*?')|          # single quotes
            (?:[^'"=].*?)       # no quotes
        )
        \s*=\s*                 # divider
        (.*)                    # value (including list values and comments)
        $   # line end
        a=^
        (\s*)                     # 1: indentation
        ((?:\[\s*)+)              # 2: section marker open
        (                         # 3: section name open
            (?:"\s*\S.*?\s*")|    # at least one non-space with double quotes
            (?:'\s*\S.*?\s*')|    # at least one non-space with single quotes
            (?:[^'"\s].*?)        # at least one non-space unquoted
        )                         # section name close
        ((?:\s*\])+)              # 4: section marker close
        \s*(\#.*)?                # 5: optional comment
        $a�^
        (?:
            (?:
                (
                    (?:
                        (?:
                            (?:".*?")|              # double quotes
                            (?:'.*?')|              # single quotes
                            (?:[^'",\#][^,\#]*?)    # unquoted
                        )
                        \s*,\s*                     # comma
                    )*      # match all list items ending in a comma (if any)
                )
                (
                    (?:".*?")|                      # double quotes
                    (?:'.*?')|                      # single quotes
                    (?:[^'",\#\s][^,]*?)|           # unquoted
                    (?:(?<!,))                      # Empty value
                )?          # last item in a list - or string value
            )|
            (,)             # alternatively a single comma - empty list
        )
        \s*(\#.*)?          # optional comment
        $z�
        (
            (?:".*?")|          # double quotes
            (?:'.*?')|          # single quotes
            (?:[^'",\#]?.*?)       # unquoted
        )
        \s*,\s*                 # comma
        a^
        (
            (?:".*?")|          # double quotes
            (?:'.*?')|          # single quotes
            (?:[^'"\#].*?)|     # unquoted
            (?:)                # Empty value
        )
        \s*(\#.*)?              # optional comment
        $z^'''(.*?)'''\s*(#.*)?$z^"""(.*?)"""\s*(#.*)?$z^(.*?)'''\s*(#.*)?$z^(.*?)"""\s*(#.*)?$)z'''z"""TF)�yes�noZonZoff�1�0�trueZfalseNc
Cs�||_tj||d|�|pg}|||||||	|
|||
|d�}|dkrJ|}n|ddl}|jdtdd�x |D]}|tkrhtd|��qhWx@ttj	��D]0\}}||kr�|||<||}||kr�|||<q�W|r�d|d	<|j
|�|d
}||_|j||�dS)a�
        Parse a config file or create a config file object.
        
        ``ConfigObj(infile=None, configspec=None, encoding=None,
                    interpolation=True, raise_errors=False, list_values=True,
                    create_empty=False, file_error=False, stringify=True,
                    indent_type=None, default_encoding=None, unrepr=False,
                    write_empty_values=False, _inspec=False)``
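
        A minimal usage sketch (the ``example.ini`` filename is illustrative)::

            from configobj import ConfigObj

            config = ConfigObj('example.ini', encoding='utf-8')
            config['server'] = {'host': 'localhost', 'port': '8080'}
            config.write()   # writes the (possibly new) file back to disk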
        r)r2rr-r.r/r0r1r3r4r5r6r7NzUPassing in an options dictionary to ConfigObj() is deprecated. Use **options instead.�)�
stacklevelzUnrecognised option "%s".Fr/r2)
�_inspecr�rk�warnings�warn�DeprecationWarning�OPTION_DEFAULTSrZrHr�r��_original_configspec�_load)rE�infile�optionsr2rr-r.r/r0r1r3r4r5r6r7r�Z_optionsr�r�rKZ
keyword_valuerrrrk�s<


zConfigObj.__init__csBt|tj�r�||_tjj|�rBt|d��}|j�p4g}WdQRXn@|j	rXt
d|j��n*|jr~t|d��}|jd�WdQRXg}n�t|t
tf�r�t
|�}n�t|t��rt|t�rʇfdd���||�nx|D]}||||<q�W|`|dk	r�|j|�nd|_dSt|dt�tk	�r(|j��p$g}ntd��|�r�|j|�}xN|D]F}|�sF|ddk�rd�qFx"dD]}|j|��rj||_P�qjWP�qFWtd
d�|D���s�tt|���dd�|D�}|j|�|j�rd|jdj}t |j�d	k�rd|}	t!|	�}
n
|jd}
|j|
_"||
_#|
�|`|dk�r4d|_n
|j|�dS)N�rbzConfig file not found: "%s".�wrgcsJx|jD]}||||<qWx(|jD]}i||<�||||�q$WdS)N)r�r�)Z
in_section�this_sectionr�rs)�set_sectionrrr��s
z$ConfigObj._load.<locals>.set_section�readz>infile must be a filename, file like object, or list of lines.r8�
�
�
css|]}t|tj�VqdS)N)r[r�r�)r�rhrrr�	<genexpr>sz"ConfigObj._load.<locals>.<genexpr>cSsg|]}|jd��qS)z
)�rstrip)r�rhrrrr�sz#ConfigObj._load.<locals>.<listcomp>zat line %s.rz2Parsing failed with several errors.
First error %s���)r�r�)r�r�r�)$r[r�r��filename�os�path�isfile�open�	readlinesr1rnr0�writerHrRr�r �_errors�_handle_configspecr2rar�r�rZ�_handle_bom�endswith�newlines�all�AssertionErrorr��_parseriryr�errors�config)rEr�r2�hZcontentr�rhr�inforp�errorr)r�rr��sj





 



zConfigObj._loadcCs�|dkrt}d|_g|_|d|_|d|_|d|_|d|_|d|_|d|_|d|_	|d|_
|d	|_d
|_d|_
|d|_|d|_g|_g|_d|_|jr�d
|_tj|�dS)
Nr.r-r/r0r1r3r4rr5Fr7r6)r�r�rr.r-r/r0r1r3r4rr5�BOMrr7r6�initial_comment�
final_commentr2r�r�r�)rEr�rrrr�0s.










zConfigObj._initialisecs0�fdd��ddj�fdd��j�jD��S)Ncs*y�|Stk
r$tj�|�SXdS)N)r$r�r�)rz)rErrr�Qsz#ConfigObj.__repr__.<locals>._getvalzConfigObj({%s})z, cs$g|]}dt|�t�|��f�qS)z%s: %s)r�)r�rz)r�rrr�Wsz&ConfigObj.__repr__.<locals>.<listcomp>)rxr�r�)rEr)r�rErr�PszConfigObj.__repr__cCsH|jdk	r&|jj�tkr&|j||j�St|ttf�r>|d}n|}t|tj�r\|j||j�S|jdk	�r(t|jj�}|dkr�x8tt	j
��D](\}\}}|s�q�|j|�r�|j||�Sq�W|j||j�St|}|j|�s�|j||j�S|t
|�d�}t|ttf��r||d<n|}d|_|j||j�Sx�tt	j
��D]�\}\}}t|tj��s6|j|��rf�q6n�||_|�s�d|_|t
|�d�}t|ttf��r�||d<n|}t|tj��r�|jd�St|tj��r�|jd�jd�S|j|d�S|j||�S�q6Wtj�rt|t��r|j|d�St|tj��r8|jd�jd�S|j|d�SdS)a1
        Handle any BOM, and decode if necessary.
        
        If an encoding is specified, that *must* be used - but the BOM should
        still be removed (and the BOM attribute set).
        
        (If the encoding is wrongly specified, then a BOM for an alternative
        encoding won't be discovered or removed.)
        
        If an encoding is not specified, UTF8 or UTF16 BOM will be detected and
        removed. The BOM attribute will be set. UTF16 will be decoded to
        unicode.
        
        NOTE: This method must not be called with an empty ``infile``.
        
        Specifying the *wrong* encoding is likely to cause a
        ``UnicodeDecodeError``.
        
        ``infile`` must always be returned as a list of lines, but may be
        passed in as a single string.
        Nrr	Tzutf-8)rrr�_decoder[rHrRr�Z	text_type�BOMSr��
startswith�BOM_SETryr�binary_type�
splitlines�decodeZPY2�str)rEr�rh�encrrZfinal_encoding�newlinerrrr[s^






zConfigObj._handle_bomcCs&t|tj�r|jr|j|j�S|SdS)z@Decode ASCII strings to unicode if a self.encoding is specified.N)r[r�rrr)rEZaStringrrr�_a_to_u�szConfigObj._a_to_ucCsxt|tj�r|jd�St|tj�r@|r6|j|�jd�S|jd�S|rtx.t|�D]"\}}t|tj�rN|j|�||<qNW|S)z�
        Decode infile to unicode. Using the specified encoding.
        
        if is a string, it also needs converting to a list.
        T)r[r�r�rrr�	enumerate)rEr�rrPrhrrrr�s

zConfigObj._decodecCs&t|tj�r|jr|j|j�S|SdS)z'Decode element to unicode if necessary.N)r[r�rr5r)rErhrrr�_decode_element�szConfigObj._decode_elementcCst|tj�st|�S|SdS)zh
        Used by ``stringify`` within validate, to turn non-string values
        into strings.
        N)r[r�r�r)rErKrrr�_str�szConfigObj._strcCs"|j}|jrd|_g}d}|}t|�d}d}d}�x�||k�r�|rHg}|d7}||}	|	j�}
|
sp|
jd�r�d}|j|	�q6|s�||_g}d}d}|jj|	�}|dk	�r�|j	�\}}
}}}|r�|j
dkr�||_
|
jd�}||jd�k�r�|jdt
||�q6||jk�rHy|j||�j}Wn(tk
�rD|jd	t
||�w6YnXn<||jk�r\|j}n(||jdk�rr|}n|jd
t
||�q6|j|�}||k�r�|jdt||�q6t||||d�}|||<||j|<||j|<q6|jj|	�}|dk�r|jd
j|	�t||�q6|j	�\}}}|�r,|j
dk�r,||_
|dd�dk�r�y|j||||�\}}}Wn(tk
�r�|jdt||�w6YnjX|j�r�d}yt|�}WnNtk
�r�}z0t|�tk�r�d}nd}|j|t||�w6WYdd}~XnXn�|j�rTd}yt|�}WnLtk
�rP}z.t|t��r*d}nd}|j|t||�w6WYdd}~XnXn<y|j |�\}}Wn(tk
�r�|jdt||�w6YnX|j|�}||k�r�|jdt||�q6|j!||dd�||j|<||j|<q6q6W|j
dk�r�d|_
|�r|j�r||_n|�s||_"||_dS)zActually parse the config file.Fr8�#TN�[�]z Cannot compute the section depthzCannot compute nesting levelzSection too nestedzDuplicate section name)rWz=Invalid line ({0!r}) (matched as neither section nor keyword)��"""�'''zParse error in multiline valuergzUnknown name or type in valuez+Parse error from unrepr-ing multiline valuez!Parse error from unrepr-ing valuezParse error in valuezDuplicate keyword name)r6r�)r$r%)#r/r6ry�striprr�r�_sectionmarkerr{�groupsr4�count�
_handle_errorrr��_match_depthrbrj�_unquoterr�r�r��_keyword�formatr�
_multiliner��typer(r'r[�
_handle_valuer�r)rEr�Ztemp_list_valuesZcomment_listZ
done_startr��maxline�	cur_indexZ
reset_commentrhZsline�mat�indentZ	sect_openZ	sect_nameZ
sect_close�commentZ	cur_depthrbrzrK�erprrrr	s�





















zConfigObj._parsecCs>x$||jkr$||jkrt��|j}qW|j|kr4|St��dS)z�
        Given a section and a depth level, walk back through the sections
        parents to see if the depth level matches a previous section.
        
        Return a reference to the right section,
        or raise a SyntaxError.
        N)r�rbrj)rEZsectr�rrrr+�s


zConfigObj._match_depthcCsB||}|d7}dj||�}||||�}|jr2|�|jj|�dS)z�
        Handle an error according to the error settings.
        
        Either raise the error or store it.
        The error will have occured at ``cur_index``
        r8z{0} at line {1}.N)r.r.rr�)rE�textZ
ErrorClassr�r3rhrlrrrrr*�szConfigObj._handle_errorcCs4|st�|d|dkr0|ddkr0|dd�}|S)z%Return an unquoted version of a valuerr8�"�'r�)r9r:r�)rj)rErKrrrr,�s
zConfigObj._unquotecs�|r�jr|dkrdS|rjt|ttf�rj|s0dSt|�dkrR�j|ddd�dSdj�fdd	�|D��St|tj�s��j	r�t
|�}ntd
|��|s�dS�jo�d|ko�d
|k}|o�d|kr�d|kp�d|k}|o�|o�d|ko�d|ko�d
|k}|�s�|�o|}|�rh�j�st
}nNd|k�r0td|��n6|dtk�r\|dtk�r\d|k�r\t
}n
�j|�}n
�j|�}|t
k�r�d
|k�r��j�r��j|�}||S)a�
        Return a safely quoted version of a value.
        
        Raise a ConfigObjError if the value cannot be safely quoted.
        If multiline is ``True`` (default) then use triple quotes
        if necessary.
        
        * Don't quote values that don't need it.
        * Recursively quote members of a list and return a comma joined list.
        * Multiline is ``False`` for lists.
        * Obey list syntax for empty and single member lists.
        
        If ``list_values=False`` then the value is only quoted if it contains
        a ``\n`` (is multiline) or '#'.
        
        If ``write_empty_values`` is set, and the value is an empty string, it
        won't be quoted.
        rg�,r8rF)�	multilinez, csg|]}�j|dd��qS)F)r<)�_quote)r�r�)rErrr��sz$ConfigObj._quote.<locals>.<listcomp>zValue "%s" is not a string.z""r�r r:r9z#Value "%s" cannot be safely quoted.r�)r7r[rHrRryr=rxr�r�r3rrZr/�noquotr�wspace_plus�_get_single_quote�_get_triple_quote)rErKr<Zno_lists_no_quotesZneed_tripleZhash_triple_quoteZcheck_for_single�quotr)rErr=�sB

"



zConfigObj._quotecCs4d|krd|krtd|��nd|kr,t}nt}|S)Nr:r9z#Value "%s" cannot be safely quoted.)r�squot�dquot)rErKrBrrrr@%szConfigObj._get_single_quotecCsD|jd�dkr(|jd�dkr(td|��|jd�dkr<t}nt}|S)Nz"""r8z'''z#Value "%s" cannot be safely quoted.r�r�r�)�findr�tdquot�tsquot)rErKrBrrrrA/szConfigObj._get_triple_quotecs��jr|dfS�js6�jj|�}|dkr.t��|j�S�jj|�}|dkrPt��|j�\}}}}|dkrv|dkrvt��|dk	r�g|fS|dk	r�|r�|r�d}n|p�d}�j|�}|dkr�||fS�jj	|�}�fdd�|D�}|dk	r�||g7}||fS)z�
        Given a value string, unquote, remove comment,
        handle lists. (including empty and single member lists)
        rgNz""csg|]}�j|��qSr)r,)r�r�)rErrr�dsz+ConfigObj._handle_value.<locals>.<listcomp>)
r�r/�_nolistvaluer{rjr(�	_valueexpr,�
_listvalueexp�findall)rErKr4r/ZsingleZ
empty_listr6r�r)rErr19s6


zConfigObj._handle_valuec
Cs�|dd�}|dd�}|j|d}|j|d}|j|�}	|	dk	r`t|	j��}
|
j|�|
S|j|�dkrtt��xD||kr�|d7}|d7}||}|j|�dkr�||7}qvPqvWt��|j|�}	|	dkr�t��|	j�\}}||||fS)z9Extract the value, where we are in a multiline situation.Nr#rr8r�r�r�)�
_triple_quoter{rHr(r�rErj)
rErKr�r3r2rBZnewvalueZsingle_lineZ
multi_liner4Zretvalrhr6rrrr/js0




zConfigObj._multilinecCs�t|t�szyt|dddd�}WnZtk
rL}ztd|��WYdd}~Xn.tk
rx}ztd|��WYdd}~XnX||_dS)zParse the configspec.T)r.r1r�zParsing configspec failed: %sNzReading configspec failed: %s)r[r rrrnr2)rEr2r7rrrr�s
zConfigObj._handle_configspeccCs�|j}|jd�}t|t�r<x |jD]}||kr"|||_q"Wxz|jD]p}|dkrRqD||kr�i||<d||_|r�|jj|g�|j|<|jj|d�|j|<t||t�rD||||_qDWdS)z�
        Called by validate. Handles setting the configspec on subsections
        including sections to be validated by __many__
        �__many__TrgN)	r2rr[r�r�r�r�r�r�)rErs�copyr2�manyr�rrr�_set_configspec�s"


zConfigObj._set_configspeccCsN|js|j|j|��}nt|�}d||j|j|dd��|jd�||j|�fS)z.Write an individual line, for the write methodz
%s%s%s%s%sF)r<z = )r6rr=r�r)rE�
indent_stringr�r�r6r�rrr�_write_line�szConfigObj._write_linecCs<d||jd|�|j|j|�dd�|jd|�|j|�fS)zWrite a section marker linez
%s%s%s%s%sr!F)r<r")rr=r)rErQr�r�r6rrr�
_write_marker�s
zConfigObj._write_markercCs.|sdS|j}|jd�s&||jd�7}||S)zDeal with a comment.rgr z # )r4rr)rEr6r~rrr�_handle_comment�s
zConfigObj._handle_commentc	sB�jdkrt�_g}�jd�}�jd�}|dkr��j}d�_�}xB�jD]8}�j|�}|j�}|rv|j|�rv||}|j|�qHW�j|j	}	x�|j
|jD]�}
|
|jkr�q�xF|j
|
D]8}�j|j��}|r�|j|�r�||}|j|	|�q�W||
}�j|j|
�}
t|t��rF|j�j|	|j	|
|
��|j�j|d��q�|j�j|	|
||
��q�W|�k�r�xH�jD]>}�j|�}|j�}|�r�|j|��r�||}|j|��qrW|�_|�k	�r�|S�jdk�rF|dk�rF�j�r��fdd�|D�}�j�rB�jdk�s"tj�jj��dk�rB|�s2|jd	�t|d
|d
<|S�j�pRt j!}t"|dd�dk	�r�|j#dk�r�t$j%d
k�r�|dk�r�d}�j|�j&|�}|j'|��s�||7}t|t(j)��r�|}n|j*�j�pڈj+�p�d�}�j�r�jdk�s�t,�j��rt|}|dk	�r|j|�n"t-�jd��}|j|�WdQRXdS)a~
        Write the current ConfigObj as a file
        
        tekNico: FIXME: use StringIO instead of real files
        
        >>> filename = a.filename
        >>> a.filename = 'test.ini'
        >>> a.write()
        >>> a.filename = filename
        >>> a == ConfigObj('test.ini', raise_errors=True)
        1
        >>> import os
        >>> os.remove('test.ini')
        Nr z# F)rscsg|]}|j�j��qSr)�encoder)r��l)rErrr�/sz#ConfigObj.write.<locals>.<listcomp>rrgr�moder�Zwin32z
r��ascii�wb).r4rrr-rrr&rr�r�r�r�r�r��lstriprTr�r[r�rS�extendrrRrr�rrrrrrrr��lineseprarW�sys�platformrxrr�rrUr5rr�)rEZoutfilersr�ZcsZcspZint_valrhZ
stripped_linerQr�Zcomment_liner�r6r�outputZoutput_bytesrr)rErr�s�








 
zConfigObj.writecst�dkrt�jdkrtd���r0ddlm}|�_���rt�jj�_�jj�_�jj�_�jj�_�jj	�_	�jj
�_
�j��j����������fdd�}i�d}d}�fdd	��jD�}	�fd
d	��j
D�}
�fdd	��jD�}x��jD]�}|dk�r�q�|�jk�s|�jk�rZd}
d}��rf|�jk�rf�jj|g��j|<�jj|d��j|<nd}
�|}||�|||
||�\}}q�Wd}d�jk�r��d}nd
�jk�r��d
}|dk	�r�x,|	D]$}�|}||||d||�\}}�q�Wg}	x<|D]4}d}��sd�|<nd}d|}�j|��|<�q�Wx<|
D]4}d}��sJd�|<nd}d|}�j|��|<�q2Wx��j
D]�}��k�r�|dk�r��qr�|jdk�r�|	j|��qr��rڈjj|g��j|<�jj|d��j|<�j����|d�}|�|<|dk�rd}n|dk�rd}nd}�qrW|	�_��r<�j�r<d}|�r\��r\��r\t�j��}|�rfdS|�rpdS�S)a;
        Test the ConfigObj against a configspec.
        
        It uses the ``validator`` object from *validate.py*.
        
        To run ``validate`` on the current ConfigObj, call: ::
        
            test = config.validate(validator)
        
        (Normally having previously passed in the configspec when the ConfigObj
        was created - you can dynamically assign a dictionary of checks to the
        ``configspec`` attribute of a section though).
        
        It returns ``True`` if everything passes, or a dictionary of
        pass/fails (True/False). If every member of a subsection passes, it
        will just have the value ``True``. (It also returns ``False`` if all
        members fail).
        
        In addition, it converts the values from strings to their native
        types if their checks pass (and ``stringify`` is set).
        
        If ``preserve_errors`` is ``True`` (``False`` is default) then instead
        of marking a fail with a ``False``, it will preserve the actual
        exception object. This can contain info about the reason for failure.
        For example the ``VdtValueTooSmallError`` indicates that the value
        supplied was too small. If a value (or section) is missing it will
        still be marked as ``False``.
        
        You must have the validate module to use ``preserve_errors=True``.
        
        You can then use the ``flatten_errors`` function to turn your nested
        results dictionary into a flattened list of failures - useful for
        displaying meaningful error messages.
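
        A sketch of the round trip (assuming the companion *validate.py* module
        is importable; the spec and values are illustrative)::

            from configobj import ConfigObj
            from validate import Validator

            spec = '''
            port = integer(min=1, max=65535, default=8080)
            debug = boolean(default=False)
            '''.splitlines()
            config = ConfigObj(['port = 1234'], configspec=spec)
            print(config.validate(Validator()))      # True when every check passes
            print(config['port'], config['debug'])   # 1234 (int) False (bool)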
[compiled bytecode omitted: body of ConfigObj.validate(), including its nested validate_entry() helper]
[compiled bytecode omitted: end of ConfigObj.validate() and ConfigObj.reset() - "Clear ConfigObj instance and restore to 'freshly created' state."]
        Reload a ConfigObj from file.
        
        This method raises a ``ReloadError`` if the ConfigObj doesn't have
        a filename attribute pointing to a file.
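        A minimal sketch (the filename is illustrative)::

            config = ConfigObj('app.ini')
            # ... 'app.ini' is modified on disk by some other process ...
            config.reload()    # re-read the values from 'app.ini'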
[compiled bytecode omitted: body of ConfigObj.reload() and the ConfigObj class footer]
    A simple validator.
    Can be used to check that all members expected are present.
    
    To use it, provide a configspec with all your members in it (the value given
    will be ignored). Pass an instance of ``SimpleVal`` to the ``validate``
    method of your ``ConfigObj``. ``validate`` will return ``True`` if all
    members are present, or a dictionary with True/False meaning
    present/missing. (Whole missing sections will be replaced with ``False``)
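    A minimal usage sketch (the filenames are illustrative)::

        vtor = SimpleVal()
        config = ConfigObj('app.ini', configspec='app.spec')
        result = config.validate(vtor)
        if result is not True:
            print('some expected members are missing')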
[compiled bytecode omitted: SimpleVal.__init__() and SimpleVal.check() - "A dummy check method, always returns the value unchanged."]
    An example function that will turn a nested dictionary of results
    (as returned by ``ConfigObj.validate``) into a flat list.
    
    ``cfg`` is the ConfigObj instance being checked, ``res`` is the results
    dictionary returned by ``validate``.
    
    (This is a recursive function, so you shouldn't use the ``levels`` or
    ``results`` arguments - they are used by the function.)
    
    Returns a list of keys that failed. Each member of the list is a tuple::
    
        ([list of sections...], key, result)
    
    If ``validate`` was called with ``preserve_errors=False`` (the default)
    then ``result`` will always be ``False``.

    *list of sections* is a flattened list of sections that the key was found
    in.
    
    If the section was missing (or a section was expected and a scalar provided
    - or vice-versa) then key will be ``None``.
    
    If the value (or section) was missing then ``result`` will be ``False``.
    
    If ``validate`` was called with ``preserve_errors=True`` and a value
    was present, but failed the check, then ``result`` will be the exception
    object returned. You can use this as a string that describes the failure.
    
    For example *The value "3" is of the wrong type*.
[compiled bytecode omitted: body of flatten_errors()]
    Find all the values and sections not in the configspec from a validated
    ConfigObj.
    
    ``get_extra_values`` returns a list of tuples where each tuple represents
    either an extra section, or an extra value.
    
    The tuples contain two values, a tuple representing the section the value
    is in and the name of the extra value. For extra values in the top level
    section the first member will be an empty tuple. For values in the 'foo'
    section the first member will be ``('foo',)``. For members in the 'bar'
    subsection of the 'foo' section the first member will be ``('foo', 'bar')``.
    
    NOTE: If you call ``get_extra_values`` on a ConfigObj instance that hasn't
    been validated it will return an empty list.
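    A minimal sketch (assuming ``config`` has already been validated)::

        for section_path, name in get_extra_values(config):
            if section_path:
                print('extra entry %r in section %r' % (name, '/'.join(section_path)))
            else:
                print('extra entry %r in the top level section' % name)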
    csg|]}�|f�qSrr)r�rW)�_prependrrr��	sz$get_extra_values.<locals>.<listcomp>)r[r�r�r*)Zconfrur�rWr)rurr*�	s
)rN)rr	)r
r	)r	r	)rrrrrrrr r!r"r#r$r%r&r'r(r)r*)NN);r�r�r]�codecsrrrrr�Z_versionrr9rrrrrCrDr>r?rGrF�objectr��__all__rrZMAX_INTERPOL_DEPTHr�r>r�r(rBZ_builderr6rjrrrrnr&rrr"r#r%r$r'rqr�r�r�r�r�r�r r!r)r*rrrr�<module>s�	7			|i"
site-packages/__pycache__/pyparsing.cpython-36.opt-1.pyc000064400000610513147511334560017114 0ustar00

[compiled bytecode omitted: pyc header and module-level import code for pyparsing 2.1.10; the module docstring follows]
pyparsing module - Classes and methods to define and execute parsing grammars

The pyparsing module is an alternative approach to creating and executing simple grammars,
vs. the traditional lex/yacc approach, or the use of regular expressions.  With pyparsing, you
don't need to learn a new syntax for defining grammars or matching expressions - the parsing module
provides a library of classes that you use to construct the grammar directly in Python.

Here is a program to parse "Hello, World!" (or any greeting of the form 
C{"<salutation>, <addressee>!"}), built up using L{Word}, L{Literal}, and L{And} elements 
(L{'+'<ParserElement.__add__>} operator gives L{And} expressions, strings are auto-converted to
L{Literal} expressions)::

    from pyparsing import Word, alphas

    # define grammar of a greeting
    greet = Word(alphas) + "," + Word(alphas) + "!"

    hello = "Hello, World!"
    print (hello, "->", greet.parseString(hello))

The program outputs the following::

    Hello, World! -> ['Hello', ',', 'World', '!']

The Python representation of the grammar is quite readable, owing to the self-explanatory
class names, and the use of '+', '|' and '^' operators.

The L{ParseResults} object returned from L{ParserElement.parseString<ParserElement.parseString>} can be accessed as a nested list, a dictionary, or an
object with named attributes.

The pyparsing module handles some of the problems that are typically vexing when writing text parsers:
 - extra or missing whitespace (the above program will also handle "Hello,World!", "Hello  ,  World  !", etc.)
 - quoted strings
 - embedded comments
[compiled constants omitted: version "2.1.10", build date "07 Oct 2016 01:31 UTC", author "Paul McGuire <ptmcg@users.sourceforge.net>", and the __all__ list of exported names]
withAttribute�
indentedBlock�originalTextFor�ungroup�
infixNotation�locatedExpr�	withClass�
CloseMatch�tokenMap�pyparsing_common�cCs`t|t�r|Syt|�Stk
rZt|�jtj�d�}td�}|jdd��|j	|�SXdS)aDrop-in replacement for str(obj) that tries to be Unicode friendly. It first tries
           str(obj). If that fails with a UnicodeEncodeError, then it tries unicode(obj). It
           then < returns the unicode object | encodes it with the default encoding | ... >.
        �xmlcharrefreplacez&#\d+;cSs$dtt|ddd���dd�S)Nz\ur�����)�hex�int)�t�rw�/usr/lib/python3.6/pyparsing.py�<lambda>�sz_ustr.<locals>.<lambda>N)
�
isinstanceZunicode�str�UnicodeEncodeError�encode�sys�getdefaultencodingr'�setParseAction�transformString)�obj�retZ
xmlcharrefrwrwrx�_ustr�s
r�z6sum len sorted reversed list tuple set any all min maxccs|]
}|VqdS)Nrw)�.0�yrwrwrx�	<genexpr>�sr�rrcCs>d}dd�dj�D�}x"t||�D]\}}|j||�}q"W|S)z/Escape &, <, >, ", ', etc. in a string of data.z&><"'css|]}d|dVqdS)�&�;Nrw)r��srwrwrxr��sz_xml_escape.<locals>.<genexpr>zamp gt lt quot apos)�split�zip�replace)�dataZfrom_symbolsZ
to_symbolsZfrom_Zto_rwrwrx�_xml_escape�s
r�c@seZdZdS)�
_ConstantsN)�__name__�
__module__�__qualname__rwrwrwrxr��sr��
0123456789ZABCDEFabcdef�\�ccs|]}|tjkr|VqdS)N)�stringZ
whitespace)r��crwrwrxr��sc@sPeZdZdZddd�Zedd��Zdd	�Zd
d�Zdd
�Z	ddd�Z
dd�ZdS)rz7base exception class for all parsing runtime exceptionsrNcCs>||_|dkr||_d|_n||_||_||_|||f|_dS)Nr�)�loc�msg�pstr�
parserElement�args)�selfr�r�r��elemrwrwrx�__init__�szParseBaseException.__init__cCs||j|j|j|j�S)z�
        internal factory method to simplify creating one type of ParseException 
        from another - avoids having __init__ signature conflicts among subclasses
        )r�r�r�r�)�cls�perwrwrx�_from_exception�sz"ParseBaseException._from_exceptioncCsN|dkrt|j|j�S|dkr,t|j|j�S|dkrBt|j|j�St|��dS)z�supported attributes by name are:
            - lineno - returns the line number of the exception text
            - col - returns the column number of the exception text
            - line - returns the line containing the exception text
        rJr9�columnrGN)r9r�)rJr�r�r9rG�AttributeError)r�Zanamerwrwrx�__getattr__�szParseBaseException.__getattr__cCsd|j|j|j|jfS)Nz"%s (at char %d), (line:%d, col:%d))r�r�rJr�)r�rwrwrx�__str__�szParseBaseException.__str__cCst|�S)N)r�)r�rwrwrx�__repr__�szParseBaseException.__repr__�>!<cCs<|j}|jd}|r4dj|d|�|||d�f�}|j�S)z�Extracts the exception line from the input string, and marks
           the location of the exception with a special symbol.
        rrr�N)rGr��join�strip)r�ZmarkerStringZline_strZline_columnrwrwrx�
markInputline�s
z ParseBaseException.markInputlinecCsdj�tt|��S)Nzlineno col line)r��dir�type)r�rwrwrx�__dir__�szParseBaseException.__dir__)rNN)r�)r�r�r��__doc__r��classmethodr�r�r�r�r�r�rwrwrwrxr�s


c@seZdZdZdS)raN
    Exception thrown when parse expressions don't match class;
    supported attributes by name are:
     - lineno - returns the line number of the exception text
     - col - returns the column number of the exception text
     - line - returns the line containing the exception text
        
    Example::
        try:
            Word(nums).setName("integer").parseString("ABC")
        except ParseException as pe:
            print(pe)
            print("column: {}".format(pe.col))
            
    prints::
       Expected integer (at char 0), (line:1, col:1)
        column: 1
    N)r�r�r�r�rwrwrwrxr�sc@seZdZdZdS)r!znuser-throwable exception thrown when inconsistent parse content
       is found; stops all parsing immediatelyN)r�r�r�r�rwrwrwrxr!sc@seZdZdZdS)r#z�just like L{ParseFatalException}, but thrown internally when an
       L{ErrorStop<And._ErrorStop>} ('-' operator) indicates that parsing is to stop 
       immediately because an unbacktrackable syntax error has been foundN)r�r�r�r�rwrwrwrxr#sc@s eZdZdZdd�Zdd�ZdS)r&zZexception thrown by L{ParserElement.validate} if the grammar could be improperly recursivecCs
||_dS)N)�parseElementTrace)r��parseElementListrwrwrxr�sz"RecursiveGrammarException.__init__cCs
d|jS)NzRecursiveGrammarException: %s)r�)r�rwrwrxr� sz!RecursiveGrammarException.__str__N)r�r�r�r�r�r�rwrwrwrxr&sc@s,eZdZdd�Zdd�Zdd�Zdd�Zd	S)
�_ParseResultsWithOffsetcCs||f|_dS)N)�tup)r�Zp1Zp2rwrwrxr�$sz _ParseResultsWithOffset.__init__cCs
|j|S)N)r�)r��irwrwrx�__getitem__&sz#_ParseResultsWithOffset.__getitem__cCst|jd�S)Nr)�reprr�)r�rwrwrxr�(sz _ParseResultsWithOffset.__repr__cCs|jd|f|_dS)Nr)r�)r�r�rwrwrx�	setOffset*sz!_ParseResultsWithOffset.setOffsetN)r�r�r�r�r�r�r�rwrwrwrxr�#sr�c@s�eZdZdZd[dd�Zddddefdd�Zdd	�Zefd
d�Zdd
�Z	dd�Z
dd�Zdd�ZeZ
dd�Zdd�Zdd�Zdd�Zdd�Zer�eZeZeZn$eZeZeZdd�Zd d!�Zd"d#�Zd$d%�Zd&d'�Zd\d(d)�Zd*d+�Zd,d-�Zd.d/�Zd0d1�Z d2d3�Z!d4d5�Z"d6d7�Z#d8d9�Z$d:d;�Z%d<d=�Z&d]d?d@�Z'dAdB�Z(dCdD�Z)dEdF�Z*d^dHdI�Z+dJdK�Z,dLdM�Z-d_dOdP�Z.dQdR�Z/dSdT�Z0dUdV�Z1dWdX�Z2dYdZ�Z3dS)`r"aI
    Structured parse results, to provide multiple means of access to the parsed data:
       - as a list (C{len(results)})
       - by list index (C{results[0], results[1]}, etc.)
       - by attribute (C{results.<resultsName>} - see L{ParserElement.setResultsName})

    Example::
        integer = Word(nums)
        date_str = (integer.setResultsName("year") + '/' 
                        + integer.setResultsName("month") + '/' 
                        + integer.setResultsName("day"))
        # equivalent form:
        # date_str = integer("year") + '/' + integer("month") + '/' + integer("day")

        # parseString returns a ParseResults object
        result = date_str.parseString("1999/12/31")

        def test(s, fn=repr):
            print("%s -> %s" % (s, fn(eval(s))))
        test("list(result)")
        test("result[0]")
        test("result['month']")
        test("result.day")
        test("'month' in result")
        test("'minutes' in result")
        test("result.dump()", str)
    prints::
        list(result) -> ['1999', '/', '12', '/', '31']
        result[0] -> '1999'
        result['month'] -> '12'
        result.day -> '31'
        'month' in result -> True
        'minutes' in result -> False
        result.dump() -> ['1999', '/', '12', '/', '31']
        - day: 31
        - month: 12
        - year: 1999
    NTcCs"t||�r|Stj|�}d|_|S)NT)rz�object�__new__�_ParseResults__doinit)r��toklist�name�asList�modalZretobjrwrwrxr�Ts


zParseResults.__new__c
Cs`|jrvd|_d|_d|_i|_||_||_|dkr6g}||t�rP|dd�|_n||t�rft|�|_n|g|_t	�|_
|dk	o�|�r\|s�d|j|<||t�r�t|�}||_||t
d�ttf�o�|ddgfk�s\||t�r�|g}|�r&||t��rt|j�d�||<ntt|d�d�||<|||_n6y|d||<Wn$tttfk
�rZ|||<YnXdS)NFrr�)r��_ParseResults__name�_ParseResults__parent�_ParseResults__accumNames�_ParseResults__asList�_ParseResults__modal�list�_ParseResults__toklist�_generatorType�dict�_ParseResults__tokdictrur�r��
basestringr"r��copy�KeyError�	TypeError�
IndexError)r�r�r�r�r�rzrwrwrxr�]sB



$
zParseResults.__init__cCsPt|ttf�r|j|S||jkr4|j|ddStdd�|j|D��SdS)NrrrcSsg|]}|d�qS)rrw)r��vrwrwrx�
<listcomp>�sz,ParseResults.__getitem__.<locals>.<listcomp>rs)rzru�slicer�r�r�r")r�r�rwrwrxr��s


zParseResults.__getitem__cCs�||t�r0|jj|t��|g|j|<|d}nD||ttf�rN||j|<|}n&|jj|t��t|d�g|j|<|}||t�r�t|�|_	dS)Nr)
r�r��getr�rur�r�r"�wkrefr�)r��kr�rz�subrwrwrx�__setitem__�s


"
zParseResults.__setitem__c
Cs�t|ttf�r�t|j�}|j|=t|t�rH|dkr:||7}t||d�}tt|j|���}|j�x^|j	j
�D]F\}}x<|D]4}x.t|�D]"\}\}}	t||	|	|k�||<q�Wq|WqnWn|j	|=dS)Nrrr)
rzrur��lenr�r��range�indices�reverser��items�	enumerater�)
r�r�ZmylenZremovedr��occurrences�jr��value�positionrwrwrx�__delitem__�s


$zParseResults.__delitem__cCs
||jkS)N)r�)r�r�rwrwrx�__contains__�szParseResults.__contains__cCs
t|j�S)N)r�r�)r�rwrwrx�__len__�szParseResults.__len__cCs
|jS)N)r�)r�rwrwrx�__bool__�szParseResults.__bool__cCs
t|j�S)N)�iterr�)r�rwrwrx�__iter__�szParseResults.__iter__cCst|jddd��S)Nrrrs)r�r�)r�rwrwrx�__reversed__�szParseResults.__reversed__cCs$t|jd�r|jj�St|j�SdS)N�iterkeys)�hasattrr�r�r�)r�rwrwrx�	_iterkeys�s
zParseResults._iterkeyscs�fdd��j�D�S)Nc3s|]}�|VqdS)Nrw)r�r�)r�rwrxr��sz+ParseResults._itervalues.<locals>.<genexpr>)r�)r�rw)r�rx�_itervalues�szParseResults._itervaluescs�fdd��j�D�S)Nc3s|]}|�|fVqdS)Nrw)r�r�)r�rwrxr��sz*ParseResults._iteritems.<locals>.<genexpr>)r�)r�rw)r�rx�
_iteritems�szParseResults._iteritemscCst|j��S)zVReturns all named result keys (as a list in Python 2.x, as an iterator in Python 3.x).)r�r�)r�rwrwrx�keys�szParseResults.keyscCst|j��S)zXReturns all named result values (as a list in Python 2.x, as an iterator in Python 3.x).)r��
itervalues)r�rwrwrx�values�szParseResults.valuescCst|j��S)zfReturns all named result key-values (as a list of tuples in Python 2.x, as an iterator in Python 3.x).)r��	iteritems)r�rwrwrxr��szParseResults.itemscCs
t|j�S)z�Since keys() returns an iterator, this method is helpful in bypassing
           code that looks for the existence of any defined results names.)�boolr�)r�rwrwrx�haskeys�szParseResults.haskeyscOs�|s
dg}x6|j�D]*\}}|dkr2|d|f}qtd|��qWt|dt�sht|�dksh|d|kr�|d}||}||=|S|d}|SdS)a�
        Removes and returns item at specified index (default=C{last}).
        Supports both C{list} and C{dict} semantics for C{pop()}. If passed no
        argument or an integer argument, it will use C{list} semantics
        and pop tokens from the list of parsed tokens. If passed a 
        non-integer argument (most likely a string), it will use C{dict}
        semantics and pop the corresponding value from any defined 
        results names. A second default return value argument is 
        supported, just as in C{dict.pop()}.

        Example::
            def remove_first(tokens):
                tokens.pop(0)
            print(OneOrMore(Word(nums)).parseString("0 123 321")) # -> ['0', '123', '321']
            print(OneOrMore(Word(nums)).addParseAction(remove_first).parseString("0 123 321")) # -> ['123', '321']

            label = Word(alphas)
            patt = label("LABEL") + OneOrMore(Word(nums))
            print(patt.parseString("AAB 123 321").dump())

            # Use pop() in a parse action to remove named result (note that corresponding value is not
            # removed from list form of results)
            def remove_LABEL(tokens):
                tokens.pop("LABEL")
                return tokens
            patt.addParseAction(remove_LABEL)
            print(patt.parseString("AAB 123 321").dump())
        prints::
            ['AAB', '123', '321']
            - LABEL: AAB

            ['AAB', '123', '321']
        rr�defaultrz-pop() got an unexpected keyword argument '%s'Nrs)r�r�rzrur�)r�r��kwargsr�r��indexr�Zdefaultvaluerwrwrx�pop�s"zParseResults.popcCs||kr||S|SdS)ai
        Returns named result matching the given key, or if there is no
        such name, then returns the given C{defaultValue} or C{None} if no
        C{defaultValue} is specified.

        Similar to C{dict.get()}.
        
        Example::
            integer = Word(nums)
            date_str = integer("year") + '/' + integer("month") + '/' + integer("day")           

            result = date_str.parseString("1999/12/31")
            print(result.get("year")) # -> '1999'
            print(result.get("hour", "not specified")) # -> 'not specified'
            print(result.get("hour")) # -> None
        Nrw)r��key�defaultValuerwrwrxr�szParseResults.getcCsZ|jj||�xF|jj�D]8\}}x.t|�D]"\}\}}t||||k�||<q,WqWdS)a
        Inserts new element at location index in the list of parsed tokens.
        
        Similar to C{list.insert()}.

        Example::
            print(OneOrMore(Word(nums)).parseString("0 123 321")) # -> ['0', '123', '321']

            # use a parse action to insert the parse location in the front of the parsed results
            def insert_locn(locn, tokens):
                tokens.insert(0, locn)
            print(OneOrMore(Word(nums)).addParseAction(insert_locn).parseString("0 123 321")) # -> [0, '0', '123', '321']
        N)r��insertr�r�r�r�)r�r�ZinsStrr�r�r�r�r�rwrwrxr�2szParseResults.insertcCs|jj|�dS)a�
        Add single element to end of ParseResults list of elements.

        Example::
            print(OneOrMore(Word(nums)).parseString("0 123 321")) # -> ['0', '123', '321']
            
            # use a parse action to compute the sum of the parsed integers, and add it to the end
            def append_sum(tokens):
                tokens.append(sum(map(int, tokens)))
            print(OneOrMore(Word(nums)).addParseAction(append_sum).parseString("0 123 321")) # -> ['0', '123', '321', 444]
        N)r��append)r��itemrwrwrxr�FszParseResults.appendcCs$t|t�r||7}n|jj|�dS)a
        Add sequence of elements to end of ParseResults list of elements.

        Example::
            patt = OneOrMore(Word(alphas))
            
            # use a parse action to append the reverse of the matched strings, to make a palindrome
            def make_palindrome(tokens):
                tokens.extend(reversed([t[::-1] for t in tokens]))
                return ''.join(tokens)
            print(patt.addParseAction(make_palindrome).parseString("lskdj sdlkjf lksd")) # -> 'lskdjsdlkjflksddsklfjkldsjdksl'
        N)rzr"r��extend)r�Zitemseqrwrwrxr�Ts

zParseResults.extendcCs|jdd�=|jj�dS)z7
        Clear all elements and results names.
        N)r�r��clear)r�rwrwrxr�fszParseResults.clearcCsfy||Stk
rdSX||jkr^||jkrD|j|ddStdd�|j|D��SndSdS)Nr�rrrcSsg|]}|d�qS)rrw)r�r�rwrwrxr�wsz,ParseResults.__getattr__.<locals>.<listcomp>rs)r�r�r�r")r�r�rwrwrxr�ms

zParseResults.__getattr__cCs|j�}||7}|S)N)r�)r��otherr�rwrwrx�__add__{szParseResults.__add__cs�|jrnt|j���fdd��|jj�}�fdd�|D�}x4|D],\}}|||<t|dt�r>t|�|d_q>W|j|j7_|jj	|j�|S)Ncs|dkr�S|�S)Nrrw)�a)�offsetrwrxry�sz'ParseResults.__iadd__.<locals>.<lambda>c	s4g|],\}}|D]}|t|d�|d��f�qqS)rrr)r�)r�r��vlistr�)�	addoffsetrwrxr��sz)ParseResults.__iadd__.<locals>.<listcomp>r)
r�r�r�r�rzr"r�r�r��update)r�r�Z
otheritemsZotherdictitemsr�r�rw)rrrx�__iadd__�s


zParseResults.__iadd__cCs&t|t�r|dkr|j�S||SdS)Nr)rzrur�)r�r�rwrwrx�__radd__�szParseResults.__radd__cCsdt|j�t|j�fS)Nz(%s, %s))r�r�r�)r�rwrwrxr��szParseResults.__repr__cCsddjdd�|jD��dS)N�[z, css(|] }t|t�rt|�nt|�VqdS)N)rzr"r�r�)r�r�rwrwrxr��sz'ParseResults.__str__.<locals>.<genexpr>�])r�r�)r�rwrwrxr��szParseResults.__str__r�cCsPg}xF|jD]<}|r"|r"|j|�t|t�r:||j�7}q|jt|��qW|S)N)r�r�rzr"�
_asStringListr�)r��sep�outr�rwrwrxr
�s

zParseResults._asStringListcCsdd�|jD�S)a�
        Returns the parse results as a nested list of matching tokens, all converted to strings.

        Example::
            patt = OneOrMore(Word(alphas))
            result = patt.parseString("sldkj lsdkj sldkj")
            # even though the result prints in string-like form, it is actually a pyparsing ParseResults
            print(type(result), result) # -> <class 'pyparsing.ParseResults'> ['sldkj', 'lsdkj', 'sldkj']
            
            # Use asList() to create an actual list
            result_list = result.asList()
            print(type(result_list), result_list) # -> <class 'list'> ['sldkj', 'lsdkj', 'sldkj']
        cSs"g|]}t|t�r|j�n|�qSrw)rzr"r�)r��resrwrwrxr��sz'ParseResults.asList.<locals>.<listcomp>)r�)r�rwrwrxr��szParseResults.asListcs6tr|j}n|j}�fdd��t�fdd�|�D��S)a�
        Returns the named parse results as a nested dictionary.

        Example::
            integer = Word(nums)
            date_str = integer("year") + '/' + integer("month") + '/' + integer("day")
            
            result = date_str.parseString('12/31/1999')
            print(type(result), repr(result)) # -> <class 'pyparsing.ParseResults'> (['12', '/', '31', '/', '1999'], {'day': [('1999', 4)], 'year': [('12', 0)], 'month': [('31', 2)]})
            
            result_dict = result.asDict()
            print(type(result_dict), repr(result_dict)) # -> <class 'dict'> {'day': '1999', 'year': '12', 'month': '31'}

            # even though a ParseResults supports dict-like access, sometime you just need to have a dict
            import json
            print(json.dumps(result)) # -> Exception: TypeError: ... is not JSON serializable
            print(json.dumps(result.asDict())) # -> {"month": "31", "day": "1999", "year": "12"}
        cs6t|t�r.|j�r|j�S�fdd�|D�Sn|SdS)Ncsg|]}�|��qSrwrw)r�r�)�toItemrwrxr��sz7ParseResults.asDict.<locals>.toItem.<locals>.<listcomp>)rzr"r��asDict)r�)rrwrxr�s

z#ParseResults.asDict.<locals>.toItemc3s|]\}}|�|�fVqdS)Nrw)r�r�r�)rrwrxr��sz&ParseResults.asDict.<locals>.<genexpr>)�PY_3r�r�r�)r�Zitem_fnrw)rrxr�s
	zParseResults.asDictcCs8t|j�}|jj�|_|j|_|jj|j�|j|_|S)zA
        Returns a new copy of a C{ParseResults} object.
        )r"r�r�r�r�r�rr�)r�r�rwrwrxr��s
zParseResults.copyFcCsPd}g}tdd�|jj�D��}|d}|s8d}d}d}d}	|dk	rJ|}	n|jrV|j}	|	sf|rbdSd}	|||d|	d	g7}x�t|j�D]�\}
}t|t�r�|
|kr�||j||
|o�|dk||�g7}n||jd|o�|dk||�g7}q�d}|
|kr�||
}|�s
|�rq�nd}t	t
|��}
|||d|d	|
d
|d	g	7}q�W|||d
|	d	g7}dj|�S)z�
        (Deprecated) Returns the parse results as XML. Tags are created for tokens and lists that have defined results names.
        �
css(|] \}}|D]}|d|fVqqdS)rrNrw)r�r�rr�rwrwrxr��sz%ParseResults.asXML.<locals>.<genexpr>z  r�NZITEM�<�>z</)r�r�r�r�r�r�rzr"�asXMLr�r�r�)r�ZdoctagZnamedItemsOnly�indentZ	formatted�nlrZ
namedItemsZnextLevelIndentZselfTagr�r
ZresTagZxmlBodyTextrwrwrxr�sT


zParseResults.asXMLcCs:x4|jj�D]&\}}x|D]\}}||kr|SqWqWdS)N)r�r�)r�r�r�rr�r�rwrwrxZ__lookup$s
zParseResults.__lookupcCs�|jr|jS|jr.|j�}|r(|j|�SdSnNt|�dkrxt|j�dkrxtt|jj���dddkrxtt|jj���SdSdS)a(
        Returns the results name for this token expression. Useful when several 
        different expressions might match at a particular location.

        Example::
            integer = Word(nums)
            ssn_expr = Regex(r"\d\d\d-\d\d-\d\d\d\d")
            house_number_expr = Suppress('#') + Word(nums, alphanums)
            user_data = (Group(house_number_expr)("house_number") 
                        | Group(ssn_expr)("ssn")
                        | Group(integer)("age"))
            user_info = OneOrMore(user_data)
            
            result = user_info.parseString("22 111-22-3333 #221B")
            for item in result:
                print(item.getName(), ':', item[0])
        prints::
            age : 22
            ssn : 111-22-3333
            house_number : 221B
        Nrrrrs)rrs)	r�r��_ParseResults__lookupr�r��nextr�r�r�)r��parrwrwrx�getName+s
zParseResults.getNamercCsbg}d}|j|t|j���|�rX|j�r�tdd�|j�D��}xz|D]r\}}|r^|j|�|jd|d||f�t|t�r�|r�|j|j||d��q�|jt|��qH|jt	|��qHWn�t
dd�|D���rX|}x~t|�D]r\}	}
t|
t��r*|jd|d||	|d|d|
j||d�f�q�|jd|d||	|d|dt|
�f�q�Wd	j|�S)
aH
        Diagnostic method for listing out the contents of a C{ParseResults}.
        Accepts an optional C{indent} argument so that this string can be embedded
        in a nested display of other data.

        Example::
            integer = Word(nums)
            date_str = integer("year") + '/' + integer("month") + '/' + integer("day")
            
            result = date_str.parseString('12/31/1999')
            print(result.dump())
        prints::
            ['12', '/', '31', '/', '1999']
            - day: 1999
            - month: 31
            - year: 12
        rcss|]\}}t|�|fVqdS)N)r{)r�r�r�rwrwrxr�gsz$ParseResults.dump.<locals>.<genexpr>z
%s%s- %s: z  rrcss|]}t|t�VqdS)N)rzr")r��vvrwrwrxr�ssz
%s%s[%d]:
%s%s%sr�)
r�r�r�r��sortedr�rzr"�dumpr��anyr�r�)r�r�depth�fullr�NLr�r�r�r�rrwrwrxrPs,

4.zParseResults.dumpcOstj|j�f|�|�dS)a�
        Pretty-printer for parsed results as a list, using the C{pprint} module.
        Accepts additional positional or keyword args as defined for the 
        C{pprint.pprint} method. (U{http://docs.python.org/3/library/pprint.html#pprint.pprint})

        Example::
            ident = Word(alphas, alphanums)
            num = Word(nums)
            func = Forward()
            term = ident | num | Group('(' + func + ')')
            func <<= ident + Group(Optional(delimitedList(term)))
            result = func.parseString("fna a,b,(fnb c,d,200),100")
            result.pprint(width=40)
        prints::
            ['fna',
             ['a',
              'b',
              ['(', 'fnb', ['c', 'd', '200'], ')'],
              '100']]
        N)�pprintr�)r�r�r�rwrwrxr"}szParseResults.pprintcCs.|j|jj�|jdk	r|j�p d|j|jffS)N)r�r�r�r�r�r�)r�rwrwrx�__getstate__�s
zParseResults.__getstate__cCsN|d|_|d\|_}}|_i|_|jj|�|dk	rDt|�|_nd|_dS)Nrrr)r�r�r�r�rr�r�)r��staterZinAccumNamesrwrwrx�__setstate__�s
zParseResults.__setstate__cCs|j|j|j|jfS)N)r�r�r�r�)r�rwrwrx�__getnewargs__�szParseResults.__getnewargs__cCstt|��t|j��S)N)r�r�r�r�)r�rwrwrxr��szParseResults.__dir__)NNTT)N)r�)NFr�T)r�rT)4r�r�r�r�r�rzr�r�r�r�r�r�r��__nonzero__r�r�r�r�r�rr�r�r�r�r�r�r�r�r�r�r�r�r�r�rrrr�r�r
r�rr�rrrrr"r#r%r&r�rwrwrwrxr"-sh&
	'	
4

#
=%
-
cCsF|}d|kot|�knr4||ddkr4dS||jdd|�S)aReturns current column within a string, counting newlines as line separators.
   The first column is number 1.

   Note: the default parsing behavior is to expand tabs in the input string
   before starting the parsing process.  See L{I{ParserElement.parseString}<ParserElement.parseString>} for more information
   on parsing strings containing C{<TAB>}s, and suggested methods to maintain a
   consistent view of the parsed string, the parse location, and line and column
   positions within the parsed string.
   rrrr)r��rfind)r��strgr�rwrwrxr9�s
cCs|jdd|�dS)aReturns current line number within a string, counting newlines as line separators.
   The first line is number 1.

   Note: the default parsing behavior is to expand tabs in the input string
   before starting the parsing process.  See L{I{ParserElement.parseString}<ParserElement.parseString>} for more information
   on parsing strings containing C{<TAB>}s, and suggested methods to maintain a
   consistent view of the parsed string, the parse location, and line and column
   positions within the parsed string.
   rrrr)�count)r�r)rwrwrxrJ�s
cCsF|jdd|�}|jd|�}|dkr2||d|�S||dd�SdS)zfReturns the line of text containing loc within a string, counting newlines as line separators.
       rrrrN)r(�find)r�r)ZlastCRZnextCRrwrwrxrG�s
cCs8tdt|�dt|�dt||�t||�f�dS)NzMatch z at loc z(%d,%d))�printr�rJr9)�instringr��exprrwrwrx�_defaultStartDebugAction�sr/cCs$tdt|�dt|j���dS)NzMatched z -> )r,r�r{r�)r-�startlocZendlocr.�toksrwrwrx�_defaultSuccessDebugAction�sr2cCstdt|��dS)NzException raised:)r,r�)r-r�r.�excrwrwrx�_defaultExceptionDebugAction�sr4cGsdS)zG'Do-nothing' debug action, to suppress debugging output during parsing.Nrw)r�rwrwrxrQ�srqcs��tkr�fdd�Sdg�dg�tdd�dkrFddd	�}dd
d��ntj}tj�d}|dd
�d}|d|d|f�������fdd�}d}yt�dt�d�j�}Wntk
r�t��}YnX||_|S)Ncs�|�S)Nrw)r��lrv)�funcrwrxry�sz_trim_arity.<locals>.<lambda>rFrqro�cSs8tdkrdnd	}tj||dd�|}|j|jfgS)
Nror7rrqrr)�limit)ror7r������)�system_version�	traceback�
extract_stack�filenamerJ)r8r�
frame_summaryrwrwrxr=sz"_trim_arity.<locals>.extract_stackcSs$tj||d�}|d}|j|jfgS)N)r8rrrs)r<�
extract_tbr>rJ)�tbr8Zframesr?rwrwrxr@sz_trim_arity.<locals>.extract_tb�)r8rrcs�x�y �|�dd��}d�d<|Stk
r��dr>�n4z.tj�d}�|dd�ddd��ksj�Wd~X�d�kr��dd7<w�YqXqWdS)NrTrrrq)r8rsrs)r�r~�exc_info)r�r�rA)r@�
foundArityr6r8�maxargs�pa_call_line_synthrwrx�wrappers"z_trim_arity.<locals>.wrapperz<parse action>r��	__class__)ror7)r)rrs)	�singleArgBuiltinsr;r<r=r@�getattrr��	Exceptionr{)r6rEr=Z	LINE_DIFFZ	this_linerG�	func_namerw)r@rDr6r8rErFrx�_trim_arity�s*
rMcs�eZdZdZdZdZedd��Zedd��Zd�dd	�Z	d
d�Z
dd
�Zd�dd�Zd�dd�Z
dd�Zdd�Zdd�Zdd�Zdd�Zdd�Zd�dd �Zd!d"�Zd�d#d$�Zd%d&�Zd'd(�ZGd)d*�d*e�Zed+k	r�Gd,d-�d-e�ZnGd.d-�d-e�ZiZe�Zd/d/gZ d�d0d1�Z!eZ"ed2d3��Z#dZ$ed�d5d6��Z%d�d7d8�Z&e'dfd9d:�Z(d;d<�Z)e'fd=d>�Z*e'dfd?d@�Z+dAdB�Z,dCdD�Z-dEdF�Z.dGdH�Z/dIdJ�Z0dKdL�Z1dMdN�Z2dOdP�Z3dQdR�Z4dSdT�Z5dUdV�Z6dWdX�Z7dYdZ�Z8d�d[d\�Z9d]d^�Z:d_d`�Z;dadb�Z<dcdd�Z=dedf�Z>dgdh�Z?d�didj�Z@dkdl�ZAdmdn�ZBdodp�ZCdqdr�ZDgfdsdt�ZEd�dudv�ZF�fdwdx�ZGdydz�ZHd{d|�ZId}d~�ZJdd��ZKd�d�d��ZLd�d�d��ZM�ZNS)�r$z)Abstract base level parser element class.z 
	
FcCs
|t_dS)a�
        Overrides the default whitespace chars

        Example::
            # default whitespace chars are space, <TAB> and newline
            OneOrMore(Word(alphas)).parseString("abc def\nghi jkl")  # -> ['abc', 'def', 'ghi', 'jkl']
            
            # change to just treat newline as significant
            ParserElement.setDefaultWhitespaceChars(" \t")
            OneOrMore(Word(alphas)).parseString("abc def\nghi jkl")  # -> ['abc', 'def']
        N)r$�DEFAULT_WHITE_CHARS)�charsrwrwrx�setDefaultWhitespaceChars=s
z'ParserElement.setDefaultWhitespaceCharscCs
|t_dS)a�
        Set class to be used for inclusion of string literals into a parser.
        
        Example::
            # default literal class used is Literal
            integer = Word(nums)
            date_str = integer("year") + '/' + integer("month") + '/' + integer("day")           

            date_str.parseString("1999/12/31")  # -> ['1999', '/', '12', '/', '31']


            # change to Suppress
            ParserElement.inlineLiteralsUsing(Suppress)
            date_str = integer("year") + '/' + integer("month") + '/' + integer("day")           

            date_str.parseString("1999/12/31")  # -> ['1999', '12', '31']
        N)r$�_literalStringClass)r�rwrwrx�inlineLiteralsUsingLsz!ParserElement.inlineLiteralsUsingcCs�t�|_d|_d|_d|_||_d|_tj|_	d|_
d|_d|_t�|_
d|_d|_d|_d|_d|_d|_d|_d|_d|_dS)NTFr�)NNN)r��parseAction�
failAction�strRepr�resultsName�
saveAsList�skipWhitespacer$rN�
whiteChars�copyDefaultWhiteChars�mayReturnEmpty�keepTabs�ignoreExprs�debug�streamlined�
mayIndexError�errmsg�modalResults�debugActions�re�callPreparse�
callDuringTry)r��savelistrwrwrxr�as(zParserElement.__init__cCs<tj|�}|jdd�|_|jdd�|_|jr8tj|_|S)a$
        Make a copy of this C{ParserElement}.  Useful for defining different parse actions
        for the same parsing pattern, using copies of the original parse element.
        
        Example::
            integer = Word(nums).setParseAction(lambda toks: int(toks[0]))
            integerK = integer.copy().addParseAction(lambda toks: toks[0]*1024) + Suppress("K")
            integerM = integer.copy().addParseAction(lambda toks: toks[0]*1024*1024) + Suppress("M")
            
            print(OneOrMore(integerK | integerM | integer).parseString("5K 100 640K 256M"))
        prints::
            [5120, 100, 655360, 268435456]
        Equivalent form of C{expr.copy()} is just C{expr()}::
            integerM = integer().addParseAction(lambda toks: toks[0]*1024*1024) + Suppress("M")
        N)r�rSr]rZr$rNrY)r�Zcpyrwrwrxr�xs
zParserElement.copycCs*||_d|j|_t|d�r&|j|j_|S)af
        Define name for this expression, makes debugging and exception messages clearer.
        
        Example::
            Word(nums).parseString("ABC")  # -> Exception: Expected W:(0123...) (at char 0), (line:1, col:1)
            Word(nums).setName("integer").parseString("ABC")  # -> Exception: Expected integer (at char 0), (line:1, col:1)
        z	Expected �	exception)r�rar�rhr�)r�r�rwrwrx�setName�s


zParserElement.setNamecCs4|j�}|jd�r"|dd�}d}||_||_|S)aP
        Define name for referencing matching tokens as a nested attribute
        of the returned parse results.
        NOTE: this returns a *copy* of the original C{ParserElement} object;
        this is so that the client can define a basic element, such as an
        integer, and reference it in multiple places with different names.

        You can also set results names using the abbreviated syntax,
        C{expr("name")} in place of C{expr.setResultsName("name")} - 
        see L{I{__call__}<__call__>}.

        Example::
            date_str = (integer.setResultsName("year") + '/' 
                        + integer.setResultsName("month") + '/' 
                        + integer.setResultsName("day"))

            # equivalent form:
            date_str = integer("year") + '/' + integer("month") + '/' + integer("day")
        �*NrrTrs)r��endswithrVrb)r�r��listAllMatchesZnewselfrwrwrx�setResultsName�s
zParserElement.setResultsNameTcs@|r&|j�d�fdd�	}�|_||_nt|jd�r<|jj|_|S)z�Method to invoke the Python pdb debugger when this element is
           about to be parsed. Set C{breakFlag} to True to enable, False to
           disable.
        Tcsddl}|j��||||�S)Nr)�pdbZ	set_trace)r-r��	doActions�callPreParsern)�_parseMethodrwrx�breaker�sz'ParserElement.setBreak.<locals>.breaker�_originalParseMethod)TT)�_parsersr�)r�Z	breakFlagrrrw)rqrx�setBreak�s
zParserElement.setBreakcOs&tttt|���|_|jdd�|_|S)a
        Define action to perform when successfully matching parse element definition.
        Parse action fn is a callable method with 0-3 arguments, called as C{fn(s,loc,toks)},
        C{fn(loc,toks)}, C{fn(toks)}, or just C{fn()}, where:
         - s   = the original string being parsed (see note below)
         - loc = the location of the matching substring
         - toks = a list of the matched tokens, packaged as a C{L{ParseResults}} object
        If the functions in fns modify the tokens, they can return them as the return
        value from fn, and the modified list of tokens will replace the original.
        Otherwise, fn does not need to return any value.

        Optional keyword arguments:
         - callDuringTry = (default=C{False}) indicate if parse action should be run during lookaheads and alternate testing

        Note: the default parsing behavior is to expand tabs in the input string
        before starting the parsing process.  See L{I{parseString}<parseString>} for more information
        on parsing strings containing C{<TAB>}s, and suggested methods to maintain a
        consistent view of the parsed string, the parse location, and line and column
        positions within the parsed string.
        
        Example::
            integer = Word(nums)
            date_str = integer + '/' + integer + '/' + integer

            date_str.parseString("1999/12/31")  # -> ['1999', '/', '12', '/', '31']

            # use parse action to convert to ints at parse time
            integer = Word(nums).setParseAction(lambda toks: int(toks[0]))
            date_str = integer + '/' + integer + '/' + integer

            # note that integer fields are now ints, not strings
            date_str.parseString("1999/12/31")  # -> [1999, '/', 12, '/', 31]
        rfF)r��maprMrSr�rf)r��fnsr�rwrwrxr��s"zParserElement.setParseActioncOs4|jtttt|���7_|jp,|jdd�|_|S)z�
        Add parse action to expression's list of parse actions. See L{I{setParseAction}<setParseAction>}.
        
        See examples in L{I{copy}<copy>}.
        rfF)rSr�rvrMrfr�)r�rwr�rwrwrx�addParseAction�szParserElement.addParseActioncsb|jdd��|jdd�rtnt�x(|D] ����fdd�}|jj|�q&W|jpZ|jdd�|_|S)a�Add a boolean predicate function to expression's list of parse actions. See 
        L{I{setParseAction}<setParseAction>} for function call signatures. Unlike C{setParseAction}, 
        functions passed to C{addCondition} need to return boolean success/fail of the condition.

        Optional keyword arguments:
         - message = define a custom message to be used in the raised exception
         - fatal   = if True, will raise ParseFatalException to stop parsing immediately; otherwise will raise ParseException
         
        Example::
            integer = Word(nums).setParseAction(lambda toks: int(toks[0]))
            year_int = integer.copy()
            year_int.addCondition(lambda toks: toks[0] >= 2000, message="Only support years 2000 and later")
            date_str = year_int + '/' + integer + '/' + integer

            result = date_str.parseString("1999/12/31")  # -> Exception: Only support years 2000 and later (at char 0), (line:1, col:1)
        �messagezfailed user-defined condition�fatalFcs$tt��|||��s �||���dS)N)r�rM)r�r5rv)�exc_type�fnr�rwrx�pasz&ParserElement.addCondition.<locals>.parf)r�r!rrSr�rf)r�rwr�r}rw)r{r|r�rx�addCondition�s
zParserElement.addConditioncCs
||_|S)aDefine action to perform if parsing fails at this expression.
           Fail acton fn is a callable function that takes the arguments
           C{fn(s,loc,expr,err)} where:
            - s = string being parsed
            - loc = location where expression match was attempted and failed
            - expr = the parse expression that failed
            - err = the exception thrown
           The function returns no value.  It may throw C{L{ParseFatalException}}
           if it is desired to stop parsing immediately.)rT)r�r|rwrwrx�
setFailActions
zParserElement.setFailActioncCsZd}xP|rTd}xB|jD]8}yx|j||�\}}d}qWWqtk
rLYqXqWqW|S)NTF)r]rtr)r�r-r�Z
exprsFound�eZdummyrwrwrx�_skipIgnorables#szParserElement._skipIgnorablescCsL|jr|j||�}|jrH|j}t|�}x ||krF|||krF|d7}q(W|S)Nrr)r]r�rXrYr�)r�r-r�Zwt�instrlenrwrwrx�preParse0szParserElement.preParsecCs|gfS)Nrw)r�r-r�rorwrwrx�	parseImpl<szParserElement.parseImplcCs|S)Nrw)r�r-r��	tokenlistrwrwrx�	postParse?szParserElement.postParsec"Cs�|j}|s|jr�|jdr,|jd|||�|rD|jrD|j||�}n|}|}yDy|j|||�\}}Wn(tk
r�t|t|�|j	|��YnXWnXt
k
r�}	z<|jdr�|jd||||	�|jr�|j||||	��WYdd}	~	XnXn�|o�|j�r|j||�}n|}|}|j�s$|t|�k�rhy|j|||�\}}Wn*tk
�rdt|t|�|j	|��YnXn|j|||�\}}|j|||�}t
||j|j|jd�}
|j�r�|�s�|j�r�|�rVyRxL|jD]B}||||
�}|dk	�r�t
||j|j�o�t|t
tf�|jd�}
�q�WWnFt
k
�rR}	z(|jd�r@|jd||||	��WYdd}	~	XnXnNxL|jD]B}||||
�}|dk	�r^t
||j|j�o�t|t
tf�|jd�}
�q^W|�r�|jd�r�|jd|||||
�||
fS)Nrrq)r�r�rr)r^rTrcrer�r�r�rr�rarr`r�r"rVrWrbrSrfrzr�)r�r-r�rorpZ	debugging�prelocZtokensStart�tokens�errZ	retTokensr|rwrwrx�
_parseNoCacheCsp





zParserElement._parseNoCachecCs>y|j||dd�dStk
r8t|||j|��YnXdS)NF)ror)rtr!rra)r�r-r�rwrwrx�tryParse�szParserElement.tryParsecCs2y|j||�Wnttfk
r(dSXdSdS)NFT)r�rr�)r�r-r�rwrwrx�canParseNext�s
zParserElement.canParseNextc@seZdZdd�ZdS)zParserElement._UnboundedCachecsdi�t�|_���fdd�}�fdd�}�fdd�}tj||�|_tj||�|_tj||�|_dS)Ncs�j|��S)N)r�)r�r�)�cache�not_in_cacherwrxr��sz3ParserElement._UnboundedCache.__init__.<locals>.getcs|�|<dS)Nrw)r�r�r�)r�rwrx�set�sz3ParserElement._UnboundedCache.__init__.<locals>.setcs�j�dS)N)r�)r�)r�rwrxr��sz5ParserElement._UnboundedCache.__init__.<locals>.clear)r�r��types�
MethodTyper�r�r�)r�r�r�r�rw)r�r�rxr��sz&ParserElement._UnboundedCache.__init__N)r�r�r�r�rwrwrwrx�_UnboundedCache�sr�Nc@seZdZdd�ZdS)zParserElement._FifoCachecsht�|_�t����fdd�}��fdd�}�fdd�}tj||�|_tj||�|_tj||�|_dS)Ncs�j|��S)N)r�)r�r�)r�r�rwrxr��sz.ParserElement._FifoCache.__init__.<locals>.getcs"|�|<t���kr�jd�dS)NF)r��popitem)r�r�r�)r��sizerwrxr��sz.ParserElement._FifoCache.__init__.<locals>.setcs�j�dS)N)r�)r�)r�rwrxr��sz0ParserElement._FifoCache.__init__.<locals>.clear)r�r��_OrderedDictr�r�r�r�r�)r�r�r�r�r�rw)r�r�r�rxr��sz!ParserElement._FifoCache.__init__N)r�r�r�r�rwrwrwrx�
_FifoCache�sr�c@seZdZdd�ZdS)zParserElement._FifoCachecsvt�|_�i�tjg�����fdd�}���fdd�}��fdd�}tj||�|_tj||�|_tj||�|_dS)Ncs�j|��S)N)r�)r�r�)r�r�rwrxr��sz.ParserElement._FifoCache.__init__.<locals>.getcs2|�|<t���kr$�j�j�d��j|�dS)N)r�r��popleftr�)r�r�r�)r��key_fifor�rwrxr��sz.ParserElement._FifoCache.__init__.<locals>.setcs�j��j�dS)N)r�)r�)r�r�rwrxr��sz0ParserElement._FifoCache.__init__.<locals>.clear)	r�r��collections�dequer�r�r�r�r�)r�r�r�r�r�rw)r�r�r�r�rxr��sz!ParserElement._FifoCache.__init__N)r�r�r�r�rwrwrwrxr��srcCs�d\}}|||||f}tj��tj}|j|�}	|	|jkr�tj|d7<y|j||||�}	Wn8tk
r�}
z|j||
j	|
j
���WYdd}
~
Xq�X|j||	d|	dj�f�|	Sn4tj|d7<t|	t
�r�|	�|	d|	dj�fSWdQRXdS)Nrrr)rrr)r$�packrat_cache_lock�
packrat_cacher�r��packrat_cache_statsr�rr�rHr�r�rzrK)r�r-r�rorpZHITZMISS�lookupr�r�r�rwrwrx�_parseCache�s$


zParserElement._parseCachecCs(tjj�dgttj�tjdd�<dS)Nr)r$r�r�r�r�rwrwrwrx�
resetCache�s
zParserElement.resetCache�cCs8tjs4dt_|dkr tj�t_ntj|�t_tjt_dS)a�Enables "packrat" parsing, which adds memoizing to the parsing logic.
           Repeated parse attempts at the same string location (which happens
           often in many complex grammars) can immediately return a cached value,
           instead of re-executing parsing/validating code.  Memoizing is done of
           both valid results and parsing exceptions.
           
           Parameters:
            - cache_size_limit - (default=C{128}) - if an integer value is provided
              will limit the size of the packrat cache; if None is passed, then
              the cache size will be unbounded; if 0 is passed, the cache will
              be effectively disabled.
            
           This speedup may break existing programs that use parse actions that
           have side-effects.  For this reason, packrat parsing is disabled when
           you first import pyparsing.  To activate the packrat feature, your
           program must call the class method C{ParserElement.enablePackrat()}.  If
           your program uses C{psyco} to "compile as you go", you must call
           C{enablePackrat} before calling C{psyco.full()}.  If you do not do this,
           Python will crash.  For best results, call C{enablePackrat()} immediately
           after importing pyparsing.
           
           Example::
               import pyparsing
               pyparsing.ParserElement.enablePackrat()
        TN)r$�_packratEnabledr�r�r�r�rt)Zcache_size_limitrwrwrx�
enablePackratszParserElement.enablePackratcCs�tj�|js|j�x|jD]}|j�qW|js<|j�}y<|j|d�\}}|rv|j||�}t	�t
�}|j||�Wn0tk
r�}ztjr��n|�WYdd}~XnX|SdS)aB
        Execute the parse expression with the given string.
        This is the main interface to the client code, once the complete
        expression has been built.

        If you want the grammar to require that the entire input string be
        successfully parsed, then set C{parseAll} to True (equivalent to ending
        the grammar with C{L{StringEnd()}}).

        Note: C{parseString} implicitly calls C{expandtabs()} on the input string,
        in order to report proper column numbers in parse actions.
        If the input string contains tabs and
        the grammar uses parse actions that use the C{loc} argument to index into the
        string being parsed, you can ensure you have a consistent view of the input
        string by:
         - calling C{parseWithTabs} on your grammar before calling C{parseString}
           (see L{I{parseWithTabs}<parseWithTabs>})
         - define your parse action using the full C{(s,loc,toks)} signature, and
           reference the input string using the parse action's C{s} argument
         - explictly expand the tabs in your input string before calling
           C{parseString}
        
        Example::
            Word('a').parseString('aaaaabaaa')  # -> ['aaaaa']
            Word('a').parseString('aaaaabaaa', parseAll=True)  # -> Exception: Expected end of text
        rN)
r$r�r_�
streamliner]r\�
expandtabsrtr�r
r)r�verbose_stacktrace)r�r-�parseAllr�r�r�Zser3rwrwrx�parseString#s$zParserElement.parseStringccs@|js|j�x|jD]}|j�qW|js8t|�j�}t|�}d}|j}|j}t	j
�d}	y�x�||kon|	|k�ry |||�}
|||
dd�\}}Wntk
r�|
d}Yq`X||kr�|	d7}	||
|fV|r�|||�}
|
|kr�|}q�|d7}n|}q`|
d}q`WWn4tk
�r:}zt	j
�r&�n|�WYdd}~XnXdS)a�
        Scan the input string for expression matches.  Each match will return the
        matching tokens, start location, and end location.  May be called with optional
        C{maxMatches} argument, to clip scanning after 'n' matches are found.  If
        C{overlap} is specified, then overlapping matches will be reported.

        Note that the start and end locations are reported relative to the string
        being parsed.  See L{I{parseString}<parseString>} for more information on parsing
        strings with embedded tabs.

        Example::
            source = "sldjf123lsdjjkf345sldkjf879lkjsfd987"
            print(source)
            for tokens,start,end in Word(alphas).scanString(source):
                print(' '*start + '^'*(end-start))
                print(' '*start + tokens[0])
        
        prints::
        
            sldjf123lsdjjkf345sldkjf879lkjsfd987
            ^^^^^
            sldjf
                    ^^^^^^^
                    lsdjjkf
                              ^^^^^^
                              sldkjf
                                       ^^^^^^
                                       lkjsfd
        rF)rprrN)r_r�r]r\r�r�r�r�rtr$r�rrr�)r�r-�
maxMatchesZoverlapr�r�r�Z
preparseFnZparseFn�matchesr�ZnextLocr�Znextlocr3rwrwrx�
scanStringUsB


zParserElement.scanStringcCs�g}d}d|_y�xh|j|�D]Z\}}}|j|||��|rrt|t�rT||j�7}nt|t�rh||7}n
|j|�|}qW|j||d��dd�|D�}djtt	t
|���Stk
r�}ztj
rȂn|�WYdd}~XnXdS)af
        Extension to C{L{scanString}}, to modify matching text with modified tokens that may
        be returned from a parse action.  To use C{transformString}, define a grammar and
        attach a parse action to it that modifies the returned token list.
        Invoking C{transformString()} on a target string will then scan for matches,
        and replace the matched text patterns according to the logic in the parse
        action.  C{transformString()} returns the resulting transformed string.
        
        Example::
            wd = Word(alphas)
            wd.setParseAction(lambda toks: toks[0].title())
            
            print(wd.transformString("now is the winter of our discontent made glorious summer by this sun of york."))
        Prints::
            Now Is The Winter Of Our Discontent Made Glorious Summer By This Sun Of York.
        rTNcSsg|]}|r|�qSrwrw)r��orwrwrxr��sz1ParserElement.transformString.<locals>.<listcomp>r�)r\r�r�rzr"r�r�r�rvr��_flattenrr$r�)r�r-rZlastErvr�r�r3rwrwrxr��s(



zParserElement.transformStringcCsPytdd�|j||�D��Stk
rJ}ztjr6�n|�WYdd}~XnXdS)a~
        Another extension to C{L{scanString}}, simplifying the access to the tokens found
        to match the given parse expression.  May be called with optional
        C{maxMatches} argument, to clip searching after 'n' matches are found.
        
        Example::
            # a capitalized word starts with an uppercase letter, followed by zero or more lowercase letters
            cap_word = Word(alphas.upper(), alphas.lower())
            
            print(cap_word.searchString("More than Iron, more than Lead, more than Gold I need Electricity"))
        prints::
            ['More', 'Iron', 'Lead', 'Gold', 'I']
        cSsg|]\}}}|�qSrwrw)r�rvr�r�rwrwrxr��sz.ParserElement.searchString.<locals>.<listcomp>N)r"r�rr$r�)r�r-r�r3rwrwrx�searchString�szParserElement.searchStringc	csXd}d}x<|j||d�D]*\}}}|||�V|r>|dV|}qW||d�VdS)a[
        Generator method to split a string using the given expression as a separator.
        May be called with optional C{maxsplit} argument, to limit the number of splits;
        and the optional C{includeSeparators} argument (default=C{False}), if the separating
        matching text should be included in the split results.
        
        Example::        
            punc = oneOf(list(".,;:/-!?"))
            print(list(punc.split("This, this?, this sentence, is badly punctuated!")))
        prints::
            ['This', ' this', '', ' this sentence', ' is badly punctuated', '']
        r)r�N)r�)	r�r-�maxsplitZincludeSeparatorsZsplitsZlastrvr�r�rwrwrxr��s

zParserElement.splitcCsFt|t�rtj|�}t|t�s:tjdt|�tdd�dSt||g�S)a�
        Implementation of + operator - returns C{L{And}}. Adding strings to a ParserElement
        converts them to L{Literal}s by default.
        
        Example::
            greet = Word(alphas) + "," + Word(alphas) + "!"
            hello = "Hello, World!"
            print (hello, "->", greet.parseString(hello))
        Prints::
            Hello, World! -> ['Hello', ',', 'World', '!']
        z4Cannot combine element of type %s with ParserElementrq)�
stacklevelN)	rzr�r$rQ�warnings�warnr��
SyntaxWarningr)r�r�rwrwrxr�s



zParserElement.__add__cCsBt|t�rtj|�}t|t�s:tjdt|�tdd�dS||S)z]
        Implementation of + operator when left operand is not a C{L{ParserElement}}
        z4Cannot combine element of type %s with ParserElementrq)r�N)rzr�r$rQr�r�r�r�)r�r�rwrwrxrs



zParserElement.__radd__cCsLt|t�rtj|�}t|t�s:tjdt|�tdd�dSt|tj	�|g�S)zQ
        Implementation of - operator, returns C{L{And}} with error stop
        z4Cannot combine element of type %s with ParserElementrq)r�N)
rzr�r$rQr�r�r�r�r�
_ErrorStop)r�r�rwrwrx�__sub__s



zParserElement.__sub__cCsBt|t�rtj|�}t|t�s:tjdt|�tdd�dS||S)z]
        Implementation of - operator when left operand is not a C{L{ParserElement}}
        z4Cannot combine element of type %s with ParserElementrq)r�N)rzr�r$rQr�r�r�r�)r�r�rwrwrx�__rsub__ s



zParserElement.__rsub__cs�t|t�r|d}}n�t|t�r�|ddd�}|ddkrHd|df}t|dt�r�|ddkr�|ddkrvt��S|ddkr�t��S�|dt��SnJt|dt�r�t|dt�r�|\}}||8}ntdt|d�t|d���ntdt|���|dk�rtd��|dk�rtd��||k�o2dkn�rBtd	��|�r���fd
d��|�r�|dk�rt��|�}nt�g|��|�}n�|�}n|dk�r��}nt�g|�}|S)
a�
        Implementation of * operator, allows use of C{expr * 3} in place of
        C{expr + expr + expr}.  Expressions may also me multiplied by a 2-integer
        tuple, similar to C{{min,max}} multipliers in regular expressions.  Tuples
        may also include C{None} as in:
         - C{expr*(n,None)} or C{expr*(n,)} is equivalent
              to C{expr*n + L{ZeroOrMore}(expr)}
              (read as "at least n instances of C{expr}")
         - C{expr*(None,n)} is equivalent to C{expr*(0,n)}
              (read as "0 to n instances of C{expr}")
         - C{expr*(None,None)} is equivalent to C{L{ZeroOrMore}(expr)}
         - C{expr*(1,None)} is equivalent to C{L{OneOrMore}(expr)}

        Note that C{expr*(None,n)} does not raise an exception if
        more than n exprs exist in the input stream; that is,
        C{expr*(None,n)} does not enforce a maximum number of expr
        occurrences.  If this behavior is desired, then write
        C{expr*(None,n) + ~expr}
        rNrqrrz7cannot multiply 'ParserElement' and ('%s','%s') objectsz0cannot multiply 'ParserElement' and '%s' objectsz/cannot multiply ParserElement by negative valuez@second tuple value must be greater or equal to first tuple valuez+cannot multiply ParserElement by 0 or (0,0)cs(|dkrt��|d��St��SdS)Nrr)r)�n)�makeOptionalListr�rwrxr�]sz/ParserElement.__mul__.<locals>.makeOptionalList)NN)	rzru�tupler2rr�r��
ValueErrorr)r�r�ZminElementsZoptElementsr�rw)r�r�rx�__mul__,sD







zParserElement.__mul__cCs
|j|�S)N)r�)r�r�rwrwrx�__rmul__pszParserElement.__rmul__cCsFt|t�rtj|�}t|t�s:tjdt|�tdd�dSt||g�S)zI
        Implementation of | operator - returns C{L{MatchFirst}}
        z4Cannot combine element of type %s with ParserElementrq)r�N)	rzr�r$rQr�r�r�r�r)r�r�rwrwrx�__or__ss



zParserElement.__or__cCsBt|t�rtj|�}t|t�s:tjdt|�tdd�dS||BS)z]
        Implementation of | operator when left operand is not a C{L{ParserElement}}
        z4Cannot combine element of type %s with ParserElementrq)r�N)rzr�r$rQr�r�r�r�)r�r�rwrwrx�__ror__s



zParserElement.__ror__cCsFt|t�rtj|�}t|t�s:tjdt|�tdd�dSt||g�S)zA
        Implementation of ^ operator - returns C{L{Or}}
        z4Cannot combine element of type %s with ParserElementrq)r�N)	rzr�r$rQr�r�r�r�r)r�r�rwrwrx�__xor__�s



zParserElement.__xor__cCsBt|t�rtj|�}t|t�s:tjdt|�tdd�dS||AS)z]
        Implementation of ^ operator when left operand is not a C{L{ParserElement}}
        z4Cannot combine element of type %s with ParserElementrq)r�N)rzr�r$rQr�r�r�r�)r�r�rwrwrx�__rxor__�s



zParserElement.__rxor__cCsFt|t�rtj|�}t|t�s:tjdt|�tdd�dSt||g�S)zC
        Implementation of & operator - returns C{L{Each}}
        z4Cannot combine element of type %s with ParserElementrq)r�N)	rzr�r$rQr�r�r�r�r)r�r�rwrwrx�__and__�s



zParserElement.__and__cCsBt|t�rtj|�}t|t�s:tjdt|�tdd�dS||@S)z]
        Implementation of & operator when left operand is not a C{L{ParserElement}}
        z4Cannot combine element of type %s with ParserElementrq)r�N)rzr�r$rQr�r�r�r�)r�r�rwrwrx�__rand__�s



zParserElement.__rand__cCst|�S)zE
        Implementation of ~ operator - returns C{L{NotAny}}
        )r)r�rwrwrx�
__invert__�szParserElement.__invert__cCs|dk	r|j|�S|j�SdS)a

        Shortcut for C{L{setResultsName}}, with C{listAllMatches=False}.
        
        If C{name} is given with a trailing C{'*'} character, then C{listAllMatches} will be
        passed as C{True}.
           
        If C{name} is omitted, same as calling C{L{copy}}.

        Example::
            # these are equivalent
            userdata = Word(alphas).setResultsName("name") + Word(nums+"-").setResultsName("socsecno")
            userdata = Word(alphas)("name") + Word(nums+"-")("socsecno")             
        N)rmr�)r�r�rwrwrx�__call__�s
zParserElement.__call__cCst|�S)z�
        Suppresses the output of this C{ParserElement}; useful to keep punctuation from
        cluttering up returned output.
        )r+)r�rwrwrx�suppress�szParserElement.suppresscCs
d|_|S)a
        Disables the skipping of whitespace before matching the characters in the
        C{ParserElement}'s defined pattern.  This is normally only used internally by
        the pyparsing module, but may be needed in some whitespace-sensitive grammars.
        F)rX)r�rwrwrx�leaveWhitespace�szParserElement.leaveWhitespacecCsd|_||_d|_|S)z8
        Overrides the default whitespace chars
        TF)rXrYrZ)r�rOrwrwrx�setWhitespaceChars�sz ParserElement.setWhitespaceCharscCs
d|_|S)z�
        Overrides default behavior to expand C{<TAB>}s to spaces before parsing the input string.
        Must be called before C{parseString} when the input grammar contains elements that
        match C{<TAB>} characters.
        T)r\)r�rwrwrx�
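# Illustrative sketch (not part of the original module): the whitespace-control
# methods documented above, assuming the pyparsing package is importable.
from pyparsing import Word, alphas

word = Word(alphas).setWhitespaceChars(" \t")   # skip spaces/tabs, but not newlines
line = word + word
print(line.parseString("hello world"))          # -> ['hello', 'world']
# parseWithTabs() would additionally keep literal <TAB> characters in the input
# instead of expanding them to spaces before matching.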
parseWithTabs�szParserElement.parseWithTabscCsLt|t�rt|�}t|t�r4||jkrH|jj|�n|jjt|j���|S)a�
        Define expression to be ignored (e.g., comments) while doing pattern
        matching; may be called repeatedly, to define multiple comment or other
        ignorable patterns.
        
        Example::
            patt = OneOrMore(Word(alphas))
            patt.parseString('ablaj /* comment */ lskjd') # -> ['ablaj']
            
            patt.ignore(cStyleComment)
            patt.parseString('ablaj /* comment */ lskjd') # -> ['ablaj', 'lskjd']
        )rzr�r+r]r�r�)r�r�rwrwrx�ignore�s


zParserElement.ignorecCs"|pt|pt|ptf|_d|_|S)zT
        Enable display of debugging messages while doing pattern matching.
        T)r/r2r4rcr^)r�ZstartActionZ
successActionZexceptionActionrwrwrx�setDebugActions
s
zParserElement.setDebugActionscCs|r|jttt�nd|_|S)a�
        Enable display of debugging messages while doing pattern matching.
        Set C{flag} to True to enable, False to disable.

        Example::
            wd = Word(alphas).setName("alphaword")
            integer = Word(nums).setName("numword")
            term = wd | integer
            
            # turn on debugging for wd
            wd.setDebug()

            OneOrMore(term).parseString("abc 123 xyz 890")
        
        prints::
            Match alphaword at loc 0(1,1)
            Matched alphaword -> ['abc']
            Match alphaword at loc 3(1,4)
            Exception raised:Expected alphaword (at char 4), (line:1, col:5)
            Match alphaword at loc 7(1,8)
            Matched alphaword -> ['xyz']
            Match alphaword at loc 11(1,12)
            Exception raised:Expected alphaword (at char 12), (line:1, col:13)
            Match alphaword at loc 15(1,16)
            Exception raised:Expected alphaword (at char 15), (line:1, col:16)

        The output shown is that produced by the default debug actions - custom debug actions can be
        specified using L{setDebugActions}. Prior to attempting
        to match the C{wd} expression, the debugging message C{"Match <exprname> at loc <n>(<line>,<col>)"}
        is shown. Then if the parse succeeds, a C{"Matched"} message is shown, or an C{"Exception raised"}
        message is shown. Also note the use of L{setName} to assign a human-readable name to the expression,
        which makes debugging and exception messages easier to understand - for instance, the default
        name created for the C{Word} expression without calling C{setName} is C{"W:(ABCD...)"}.
        F)r�r/r2r4r^)r��flagrwrwrx�setDebugs#zParserElement.setDebugcCs|jS)N)r�)r�rwrwrxr�@szParserElement.__str__cCst|�S)N)r�)r�rwrwrxr�CszParserElement.__repr__cCsd|_d|_|S)NT)r_rU)r�rwrwrxr�FszParserElement.streamlinecCsdS)Nrw)r�r�rwrwrx�checkRecursionKszParserElement.checkRecursioncCs|jg�dS)zj
        Check defined expressions for valid structure, check for infinite recursive definitions.
        N)r�)r��
validateTracerwrwrx�validateNszParserElement.validatecCs�y|j�}Wn2tk
r>t|d��}|j�}WdQRXYnXy|j||�Stk
r|}ztjrh�n|�WYdd}~XnXdS)z�
        Execute the parse expression on the given file or filename.
        If a filename is specified (instead of a file object),
        the entire file is opened, read, and closed before parsing.
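# Illustrative sketch (not part of the original module): parseFile accepts either a
# file name or an open file object; 'words.txt' is a hypothetical file, and the
# pyparsing package is assumed to be importable.
from pyparsing import OneOrMore, Word, alphas

grammar = OneOrMore(Word(alphas))
result = grammar.parseFile("words.txt")      # opened, read, and closed for us
with open("words.txt") as fh:
    result = grammar.parseFile(fh)           # or pass an already opened file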
        �rN)�readr��openr�rr$r�)r�Zfile_or_filenamer�Z
file_contents�fr3rwrwrx�	parseFileTszParserElement.parseFilecsHt|t�r"||kp t|�t|�kSt|t�r6|j|�Stt|�|kSdS)N)rzr$�varsr�r��super)r�r�)rHrwrx�__eq__hs



zParserElement.__eq__cCs
||kS)Nrw)r�r�rwrwrx�__ne__pszParserElement.__ne__cCstt|��S)N)�hash�id)r�rwrwrx�__hash__sszParserElement.__hash__cCs||kS)Nrw)r�r�rwrwrx�__req__vszParserElement.__req__cCs
||kS)Nrw)r�r�rwrwrx�__rne__yszParserElement.__rne__cCs0y|jt|�|d�dStk
r*dSXdS)a�
        Method for quick testing of a parser against a test string. Good for simple 
        inline microtests of sub expressions while building up larger parser.
           
        Parameters:
         - testString - to test against this expression for a match
         - parseAll - (default=C{True}) - flag to pass to C{L{parseString}} when running tests
            
        Example::
            expr = Word(nums)
            assert expr.matches("100")
        )r�TFN)r�r�r)r�Z
testStringr�rwrwrxr�|s

zParserElement.matches�#cCs�t|t�r"tttj|j�j���}t|t�r4t|�}g}g}d}	�x�|D�]�}
|dk	rb|j	|
d�sl|rx|
rx|j
|
�qH|
s~qHdj|�|
g}g}y:|
jdd�}
|j
|
|d�}|j
|j|d��|	o�|}	Wn�tk
�rx}
z�t|
t�r�dnd	}d|
k�r0|j
t|
j|
��|j
d
t|
j|
�dd|�n|j
d
|
jd|�|j
d
t|
��|	�ob|}	|
}WYdd}
~
XnDtk
�r�}z&|j
dt|��|	�o�|}	|}WYdd}~XnX|�r�|�r�|j
d	�tdj|��|j
|
|f�qHW|	|fS)a3
        Execute the parse expression on a series of test strings, showing each
        test, the parsed results or where the parse failed. Quick and easy way to
        run a parse expression against a list of sample strings.
           
        Parameters:
         - tests - a list of separate test strings, or a multiline string of test strings
         - parseAll - (default=C{True}) - flag to pass to C{L{parseString}} when running tests           
         - comment - (default=C{'#'}) - expression for indicating embedded comments in the test 
              string; pass None to disable comment filtering
         - fullDump - (default=C{True}) - dump results as list followed by results names in nested outline;
              if False, only dump nested list
         - printResults - (default=C{True}) prints test output to stdout
         - failureTests - (default=C{False}) indicates if these tests are expected to fail parsing

        Returns: a (success, results) tuple, where success indicates that all tests succeeded
        (or failed if C{failureTests} is True), and the results contain a list of lines of each 
        test's output
        
        Example::
            number_expr = pyparsing_common.number.copy()

            result = number_expr.runTests('''
                # unsigned integer
                100
                # negative integer
                -100
                # float with scientific notation
                6.02e23
                # integer with scientific notation
                1e-12
                ''')
            print("Success" if result[0] else "Failed!")

            result = number_expr.runTests('''
                # stray character
                100Z
                # missing leading digit before '.'
                -.100
                # too many '.'
                3.14.159
                ''', failureTests=True)
            print("Success" if result[0] else "Failed!")
        prints::
            # unsigned integer
            100
            [100]

            # negative integer
            -100
            [-100]

            # float with scientific notation
            6.02e23
            [6.02e+23]

            # integer with scientific notation
            1e-12
            [1e-12]

            Success
            
            # stray character
            100Z
               ^
            FAIL: Expected end of text (at char 3), (line:1, col:4)

            # missing leading digit before '.'
            -.100
            ^
            FAIL: Expected {real number with scientific notation | real number | signed integer} (at char 0), (line:1, col:1)

            # too many '.'
            3.14.159
                ^
            FAIL: Expected end of text (at char 4), (line:1, col:5)

            Success

        Each test string must be on a single line. If you want to test a string that spans multiple
        lines, create a test like this::

            expr.runTests(r"this is a test\n of strings that span \n 3 lines")
        
        (Note that this is a raw string literal, you must include the leading 'r'.)
        TNFrz\n)r�)r z(FATAL)r�� rr�^zFAIL: zFAIL-EXCEPTION: )rzr�r�rvr{r��rstrip�
splitlinesrr�r�r�r�r�rrr!rGr�r9rKr,)r�Ztestsr�ZcommentZfullDumpZprintResultsZfailureTestsZ
allResultsZcomments�successrvr�resultr�rzr3rwrwrx�runTests�sNW



$


zParserElement.runTests)F)F)T)T)TT)TT)r�)F)N)T)F)T)Tr�TTF)Or�r�r�r�rNr��staticmethodrPrRr�r�rirmrur�rxr~rr�r�r�r�r�r�r�r�r�r�r�r�rr�r�r�rtr�r�r�r��_MAX_INTr�r�r�r�rrr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r��
__classcell__rwrw)rHrxr$8s�


&




H
"
2G+D
			

)

cs eZdZdZ�fdd�Z�ZS)r,zT
    Abstract C{ParserElement} subclass, for defining atomic matching patterns.
    cstt|�jdd�dS)NF)rg)r�r,r�)r�)rHrwrxr�	szToken.__init__)r�r�r�r�r�r�rwrw)rHrxr,	scs eZdZdZ�fdd�Z�ZS)r
z,
    An empty token, will always match.
    cs$tt|�j�d|_d|_d|_dS)Nr
TF)r�r
r�r�r[r`)r�)rHrwrxr�	szEmpty.__init__)r�r�r�r�r�r�rwrw)rHrxr
	scs*eZdZdZ�fdd�Zddd�Z�ZS)rz(
    A token that will never match.
    cs*tt|�j�d|_d|_d|_d|_dS)NrTFzUnmatchable token)r�rr�r�r[r`ra)r�)rHrwrxr�*	s
zNoMatch.__init__TcCst|||j|��dS)N)rra)r�r-r�rorwrwrxr�1	szNoMatch.parseImpl)T)r�r�r�r�r�r�r�rwrw)rHrxr&	scs*eZdZdZ�fdd�Zddd�Z�ZS)ra�
    Token to exactly match a specified string.
    
    Example::
        Literal('blah').parseString('blah')  # -> ['blah']
        Literal('blah').parseString('blahfooblah')  # -> ['blah']
        Literal('blah').parseString('bla')  # -> Exception: Expected "blah"
    
    For case-insensitive matching, use L{CaselessLiteral}.
    
    For keyword matching (force word break before and after the matched string),
    use L{Keyword} or L{CaselessKeyword}.
    cs�tt|�j�||_t|�|_y|d|_Wn*tk
rVtj	dt
dd�t|_YnXdt
|j�|_d|j|_d|_d|_dS)Nrz2null string passed to Literal; use Empty() insteadrq)r�z"%s"z	Expected F)r�rr��matchr��matchLen�firstMatchCharr�r�r�r�r
rHr�r�rar[r`)r��matchString)rHrwrxr�C	s

zLiteral.__init__TcCsJ|||jkr6|jdks&|j|j|�r6||j|jfSt|||j|��dS)Nrr)r�r��
startswithr�rra)r�r-r�rorwrwrxr�V	szLiteral.parseImpl)T)r�r�r�r�r�r�r�rwrw)rHrxr5	s
csLeZdZdZedZd�fdd�	Zddd	�Z�fd
d�Ze	dd
��Z
�ZS)ra\
    Token to exactly match a specified string as a keyword, that is, it must be
    immediately followed by a non-keyword character.  Compare with C{L{Literal}}:
     - C{Literal("if")} will match the leading C{'if'} in C{'ifAndOnlyIf'}.
     - C{Keyword("if")} will not; it will only match the leading C{'if'} in C{'if x=1'}, or C{'if(y==2)'}
    Accepts two optional constructor arguments in addition to the keyword string:
     - C{identChars} is a string of characters that would be valid identifier characters,
          defaulting to all alphanumerics + "_" and "$"
     - C{caseless} allows case-insensitive matching, default is C{False}.
       
    Example::
        Keyword("start").parseString("start")  # -> ['start']
        Keyword("start").parseString("starting")  # -> Exception

    For case-insensitive matching, use L{CaselessKeyword}.
    z_$NFcs�tt|�j�|dkrtj}||_t|�|_y|d|_Wn$tk
r^t	j
dtdd�YnXd|j|_d|j|_
d|_d|_||_|r�|j�|_|j�}t|�|_dS)Nrz2null string passed to Keyword; use Empty() insteadrq)r�z"%s"z	Expected F)r�rr��DEFAULT_KEYWORD_CHARSr�r�r�r�r�r�r�r�r�rar[r`�caseless�upper�
caselessmatchr��
identChars)r�r�r�r�)rHrwrxr�q	s&

zKeyword.__init__TcCs|jr|||||j�j�|jkr�|t|�|jksL|||jj�|jkr�|dksj||dj�|jkr�||j|jfSnv|||jkr�|jdks�|j|j|�r�|t|�|jks�|||j|jkr�|dks�||d|jkr�||j|jfSt	|||j
|��dS)Nrrr)r�r�r�r�r�r�r�r�r�rra)r�r-r�rorwrwrxr��	s*&zKeyword.parseImplcstt|�j�}tj|_|S)N)r�rr�r�r�)r�r�)rHrwrxr��	szKeyword.copycCs
|t_dS)z,Overrides the default Keyword chars
        N)rr�)rOrwrwrx�setDefaultKeywordChars�	szKeyword.setDefaultKeywordChars)NF)T)r�r�r�r�r3r�r�r�r�r�r�r�rwrw)rHrxr^	s
cs*eZdZdZ�fdd�Zddd�Z�ZS)ral
    Token to match a specified string, ignoring case of letters.
    Note: the matched results will always be in the case of the given
    match string, NOT the case of the input text.

    Example::
        OneOrMore(CaselessLiteral("CMD")).parseString("cmd CMD Cmd10") # -> ['CMD', 'CMD', 'CMD']
        
    (Contrast with example for L{CaselessKeyword}.)
    cs6tt|�j|j��||_d|j|_d|j|_dS)Nz'%s'z	Expected )r�rr�r��returnStringr�ra)r�r�)rHrwrxr��	szCaselessLiteral.__init__TcCs@||||j�j�|jkr,||j|jfSt|||j|��dS)N)r�r�r�r�rra)r�r-r�rorwrwrxr��	szCaselessLiteral.parseImpl)T)r�r�r�r�r�r�r�rwrw)rHrxr�	s
cs,eZdZdZd�fdd�	Zd	dd�Z�ZS)
rz�
    Caseless version of L{Keyword}.

    Example::
        OneOrMore(CaselessKeyword("CMD")).parseString("cmd CMD Cmd10") # -> ['CMD', 'CMD']
        
    (Contrast with example for L{CaselessLiteral}.)
    Ncstt|�j||dd�dS)NT)r�)r�rr�)r�r�r�)rHrwrxr��	szCaselessKeyword.__init__TcCsj||||j�j�|jkrV|t|�|jksF|||jj�|jkrV||j|jfSt|||j|��dS)N)r�r�r�r�r�r�rra)r�r-r�rorwrwrxr��	s*zCaselessKeyword.parseImpl)N)T)r�r�r�r�r�r�r�rwrw)rHrxr�	scs,eZdZdZd�fdd�	Zd	dd�Z�ZS)
rlax
    A variation on L{Literal} which matches "close" matches, that is, 
    strings with at most 'n' mismatching characters. C{CloseMatch} takes parameters:
     - C{match_string} - string to be matched
     - C{maxMismatches} - (C{default=1}) maximum number of mismatches allowed to count as a match
    
    The results from a successful parse will contain the matched text from the input string and the following named results:
     - C{mismatches} - a list of the positions within the match_string where mismatches were found
     - C{original} - the original match_string used to compare against the input string
    
    If C{mismatches} is an empty list, then the match was an exact match.
    
    Example::
        patt = CloseMatch("ATCATCGAATGGA")
        patt.parseString("ATCATCGAAXGGA") # -> (['ATCATCGAAXGGA'], {'mismatches': [[9]], 'original': ['ATCATCGAATGGA']})
        patt.parseString("ATCAXCGAAXGGA") # -> Exception: Expected 'ATCATCGAATGGA' (with up to 1 mismatches) (at char 0), (line:1, col:1)

        # exact match
        patt.parseString("ATCATCGAATGGA") # -> (['ATCATCGAATGGA'], {'mismatches': [[]], 'original': ['ATCATCGAATGGA']})

        # close match allowing up to 2 mismatches
        patt = CloseMatch("ATCATCGAATGGA", maxMismatches=2)
        patt.parseString("ATCAXCGAAXGGA") # -> (['ATCAXCGAAXGGA'], {'mismatches': [[4, 9]], 'original': ['ATCATCGAATGGA']})
    rrcsBtt|�j�||_||_||_d|j|jf|_d|_d|_dS)Nz&Expected %r (with up to %d mismatches)F)	r�rlr�r��match_string�
maxMismatchesrar`r[)r�r�r�)rHrwrxr��	szCloseMatch.__init__TcCs�|}t|�}|t|j�}||kr�|j}d}g}	|j}
x�tt|||�|j��D]0\}}|\}}
||
krP|	j|�t|	�|
krPPqPW|d}t|||�g�}|j|d<|	|d<||fSt|||j|��dS)NrrrZoriginal�
mismatches)	r�r�r�r�r�r�r"rra)r�r-r�ro�startr��maxlocr�Zmatch_stringlocr�r�Zs_m�src�mat�resultsrwrwrxr��	s("

zCloseMatch.parseImpl)rr)T)r�r�r�r�r�r�r�rwrw)rHrxrl�	s	cs8eZdZdZd
�fdd�	Zdd	d
�Z�fdd�Z�ZS)r/a	
    Token for matching words composed of allowed character sets.
    Defined with string containing all allowed initial characters,
    an optional string containing allowed body characters (if omitted,
    defaults to the initial character set), and an optional minimum,
    maximum, and/or exact length.  The default value for C{min} is 1 (a
    minimum value < 1 is not valid); the default values for C{max} and C{exact}
    are 0, meaning no maximum or exact length restriction. An optional
    C{excludeChars} parameter can list characters that might be found in 
    the input C{bodyChars} string; useful to define a word of all printables
    except for one or two characters, for instance.
    
    L{srange} is useful for defining custom character set strings for defining 
    C{Word} expressions, using range notation from regular expression character sets.
    
    A common mistake is to use C{Word} to match a specific literal string, as in 
    C{Word("Address")}. Remember that C{Word} uses the string argument to define
    I{sets} of matchable characters. This expression would match "Add", "AAA",
    "dAred", or any other word made up of the characters 'A', 'd', 'r', 'e', and 's'.
    To match an exact literal string, use L{Literal} or L{Keyword}.

    pyparsing includes helper strings for building Words:
     - L{alphas}
     - L{nums}
     - L{alphanums}
     - L{hexnums}
     - L{alphas8bit} (alphabetic characters in ASCII range 128-255 - accented, tilded, umlauted, etc.)
     - L{punc8bit} (non-alphabetic characters in ASCII range 128-255 - currency, symbols, superscripts, diacriticals, etc.)
     - L{printables} (any non-whitespace character)

    Example::
        # a word composed of digits
        integer = Word(nums) # equivalent to Word("0123456789") or Word(srange("0-9"))
        
        # a word with a leading capital, and zero or more lowercase
        capital_word = Word(alphas.upper(), alphas.lower())

        # hostnames are alphanumeric, with leading alpha, and '-'
        hostname = Word(alphas, alphanums+'-')
        
        # roman numeral (not a strict parser, accepts invalid mix of characters)
        roman = Word("IVXLCDM")
        
        # any string of non-whitespace characters, except for ','
        csv_value = Word(printables, excludeChars=",")
    NrrrFcs�tt|�j��rFdj�fdd�|D��}|rFdj�fdd�|D��}||_t|�|_|rl||_t|�|_n||_t|�|_|dk|_	|dkr�t
d��||_|dkr�||_nt
|_|dkr�||_||_t|�|_d|j|_d	|_||_d
|j|jk�r�|dk�r�|dk�r�|dk�r�|j|jk�r8dt|j�|_nHt|j�dk�rfdtj|j�t|j�f|_nd
t|j�t|j�f|_|j�r�d|jd|_ytj|j�|_Wntk
�r�d|_YnXdS)Nr�c3s|]}|�kr|VqdS)Nrw)r�r�)�excludeCharsrwrxr�7
sz Word.__init__.<locals>.<genexpr>c3s|]}|�kr|VqdS)Nrw)r�r�)r�rwrxr�9
srrrzZcannot specify a minimum length < 1; use Optional(Word()) if zero-length word is permittedz	Expected Fr�z[%s]+z%s[%s]*z	[%s][%s]*z\b)r�r/r�r��
initCharsOrigr��	initChars�
bodyCharsOrig�	bodyChars�maxSpecifiedr��minLen�maxLenr�r�r�rar`�	asKeyword�_escapeRegexRangeChars�reStringr�rd�escape�compilerK)r�r�r�min�max�exactrr�)rH)r�rxr�4
sT



0
z
Word.__init__Tc
CsD|jr<|jj||�}|s(t|||j|��|j�}||j�fS|||jkrZt|||j|��|}|d7}t|�}|j}||j	}t
||�}x ||kr�|||kr�|d7}q�Wd}	|||jkr�d}	|jr�||kr�|||kr�d}	|j
�r|dk�r||d|k�s||k�r|||k�rd}	|	�r4t|||j|��||||�fS)NrrFTr)rdr�rra�end�groupr�r�rrr
rrr)
r�r-r�ror�r�r�Z	bodycharsr�ZthrowExceptionrwrwrxr�j
s6

4zWord.parseImplcstytt|�j�Stk
r"YnX|jdkrndd�}|j|jkr^d||j�||j�f|_nd||j�|_|jS)NcSs$t|�dkr|dd�dS|SdS)N�z...)r�)r�rwrwrx�
charsAsStr�
sz Word.__str__.<locals>.charsAsStrz	W:(%s,%s)zW:(%s))r�r/r�rKrUr�r)r�r)rHrwrxr��
s
zWord.__str__)NrrrrFN)T)r�r�r�r�r�r�r�r�rwrw)rHrxr/
s.6
#csFeZdZdZeejd��Zd�fdd�	Zddd�Z	�fd	d
�Z
�ZS)
r'a�
    Token for matching strings that match a given regular expression.
    Defined with string specifying the regular expression in a form recognized by the inbuilt Python re module.
    If the given regex contains named groups (defined using C{(?P<name>...)}), these will be preserved as 
    named parse results.

    Example::
        realnum = Regex(r"[+-]?\d+\.\d*")
        date = Regex(r'(?P<year>\d{4})-(?P<month>\d\d?)-(?P<day>\d\d?)')
        # ref: http://stackoverflow.com/questions/267399/how-do-you-match-only-valid-roman-numerals-with-a-regular-expression
        roman = Regex(r"M{0,4}(CM|CD|D?C{0,3})(XC|XL|L?X{0,3})(IX|IV|V?I{0,3})")
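# Illustrative sketch (not part of the original module): named groups in the pattern
# become named parse results, assuming the pyparsing package is importable.
from pyparsing import Regex

date = Regex(r'(?P<year>\d{4})-(?P<month>\d\d?)-(?P<day>\d\d?)')
result = date.parseString("1999-12-31")
print(result["year"], result["month"], result["day"])   # -> 1999 12 31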
    z[A-Z]rcs�tt|�j�t|t�r�|s,tjdtdd�||_||_	yt
j|j|j	�|_
|j|_Wq�t
jk
r�tjd|tdd��Yq�Xn2t|tj�r�||_
t|�|_|_||_	ntd��t|�|_d|j|_d|_d|_d	S)
z�The parameters C{pattern} and C{flags} are passed to the C{re.compile()} function as-is. See the Python C{re} module for an explanation of the acceptable patterns and flags.z0null string passed to Regex; use Empty() insteadrq)r�z$invalid pattern (%s) passed to RegexzCRegex may only be constructed with a string or a compiled RE objectz	Expected FTN)r�r'r�rzr�r�r�r��pattern�flagsrdr	r�
sre_constants�error�compiledREtyper{r�r�r�rar`r[)r�rr)rHrwrxr��
s.





zRegex.__init__TcCsd|jj||�}|s"t|||j|��|j�}|j�}t|j��}|r\x|D]}||||<qHW||fS)N)rdr�rrar
�	groupdictr"r)r�r-r�ror��dr�r�rwrwrxr��
s
zRegex.parseImplcsDytt|�j�Stk
r"YnX|jdkr>dt|j�|_|jS)NzRe:(%s))r�r'r�rKrUr�r)r�)rHrwrxr��
s
z
Regex.__str__)r)T)r�r�r�r�r�rdr	rr�r�r�r�rwrw)rHrxr'�
s
"

cs8eZdZdZd�fdd�	Zddd�Z�fd	d
�Z�ZS)
r%a�
    Token for matching strings that are delimited by quoting characters.
    
    Defined with the following parameters:
        - quoteChar - string of one or more characters defining the quote delimiting string
        - escChar - character to escape quotes, typically backslash (default=C{None})
        - escQuote - special quote sequence to escape an embedded quote string (such as SQL's "" to escape an embedded ") (default=C{None})
        - multiline - boolean indicating whether quotes can span multiple lines (default=C{False})
        - unquoteResults - boolean indicating whether the matched text should be unquoted (default=C{True})
        - endQuoteChar - string of one or more characters defining the end of the quote delimited string (default=C{None} => same as quoteChar)
        - convertWhitespaceEscapes - convert escaped whitespace (C{'\t'}, C{'\n'}, etc.) to actual whitespace (default=C{True})

    Example::
        qs = QuotedString('"')
        print(qs.searchString('lsjdf "This is the quote" sldjf'))
        complex_qs = QuotedString('{{', endQuoteChar='}}')
        print(complex_qs.searchString('lsjdf {{This is the "quote"}} sldjf'))
        sql_qs = QuotedString('"', escQuote='""')
        print(sql_qs.searchString('lsjdf "This is the quote with ""embedded"" quotes" sldjf'))
    prints::
        [['This is the quote']]
        [['This is the "quote"']]
        [['This is the quote with "embedded" quotes']]
    NFTcsNtt��j�|j�}|s0tjdtdd�t��|dkr>|}n"|j�}|s`tjdtdd�t��|�_t	|��_
|d�_|�_t	|��_
|�_|�_|�_|�_|r�tjtjB�_dtj�j�t�jd�|dk	r�t|�p�df�_n<d�_dtj�j�t�jd�|dk	�rt|��pdf�_t	�j�d	k�rp�jd
dj�fdd
�tt	�j�d	dd�D��d7_|�r��jdtj|�7_|�r��jdtj|�7_tj�j�d�_�jdtj�j�7_ytj�j�j��_�j�_Wn0tjk
�r&tjd�jtdd��YnXt ���_!d�j!�_"d�_#d�_$dS)Nz$quoteChar cannot be the empty stringrq)r�z'endQuoteChar cannot be the empty stringrz%s(?:[^%s%s]r�z%s(?:[^%s\n\r%s]rrz|(?:z)|(?:c3s4|],}dtj�jd|��t�j|�fVqdS)z%s[^%s]N)rdr�endQuoteCharr)r�r�)r�rwrxr�/sz(QuotedString.__init__.<locals>.<genexpr>�)z|(?:%s)z|(?:%s.)z(.)z)*%sz$invalid pattern (%s) passed to Regexz	Expected FTrs)%r�r%r�r�r�r�r��SyntaxError�	quoteCharr��quoteCharLen�firstQuoteCharr�endQuoteCharLen�escChar�escQuote�unquoteResults�convertWhitespaceEscapesrd�	MULTILINE�DOTALLrrrrr�r��escCharReplacePatternr	rrrr�r�rar`r[)r�rrr Z	multiliner!rr")rH)r�rxr�sf




6

zQuotedString.__init__c	Cs�|||jkr|jj||�pd}|s4t|||j|��|j�}|j�}|jr�||j|j	�}t
|t�r�d|kr�|jr�ddddd�}x |j
�D]\}}|j||�}q�W|jr�tj|jd|�}|jr�|j|j|j�}||fS)N�\�	r��
)z\tz\nz\fz\rz\g<1>)rrdr�rrar
rr!rrrzr�r"r�r�rr�r%r r)	r�r-r�ror�r�Zws_mapZwslitZwscharrwrwrxr�Gs( 
zQuotedString.parseImplcsFytt|�j�Stk
r"YnX|jdkr@d|j|jf|_|jS)Nz.quoted string, starting with %s ending with %s)r�r%r�rKrUrr)r�)rHrwrxr�js
zQuotedString.__str__)NNFTNT)T)r�r�r�r�r�r�r�r�rwrw)rHrxr%�
sA
#cs8eZdZdZd�fdd�	Zddd�Z�fd	d
�Z�ZS)
r	a�
    Token for matching words composed of characters I{not} in a given set (will
    include whitespace in matched characters if not listed in the provided exclusion set - see example).
    Defined with string containing all disallowed characters, and an optional
    minimum, maximum, and/or exact length.  The default value for C{min} is 1 (a
    minimum value < 1 is not valid); the default values for C{max} and C{exact}
    are 0, meaning no maximum or exact length restriction.

    Example::
        # define a comma-separated-value as anything that is not a ','
        csv_value = CharsNotIn(',')
        print(delimitedList(csv_value).parseString("dkls,lsdkjf,s12 34,@!#,213"))
    prints::
        ['dkls', 'lsdkjf', 's12 34', '@!#', '213']
    rrrcs�tt|�j�d|_||_|dkr*td��||_|dkr@||_nt|_|dkrZ||_||_t	|�|_
d|j
|_|jdk|_d|_
dS)NFrrzfcannot specify a minimum length < 1; use Optional(CharsNotIn()) if zero-length char group is permittedrz	Expected )r�r	r�rX�notCharsr�rrr�r�r�rar[r`)r�r*r
rr)rHrwrxr��s 
zCharsNotIn.__init__TcCs�|||jkrt|||j|��|}|d7}|j}t||jt|��}x ||krd|||krd|d7}qFW|||jkr�t|||j|��||||�fS)Nrr)r*rrar
rr�r)r�r-r�ror�Znotchars�maxlenrwrwrxr��s
zCharsNotIn.parseImplcsdytt|�j�Stk
r"YnX|jdkr^t|j�dkrRd|jdd�|_nd|j|_|jS)Nrz
!W:(%s...)z!W:(%s))r�r	r�rKrUr�r*)r�)rHrwrxr��s
zCharsNotIn.__str__)rrrr)T)r�r�r�r�r�r�r�r�rwrw)rHrxr	vs
cs<eZdZdZdddddd�Zd�fdd�	Zddd�Z�ZS)r.a�
    Special matching class for matching whitespace.  Normally, whitespace is ignored
    by pyparsing grammars.  This class is included when some whitespace structures
    are significant.  Define with a string containing the whitespace characters to be
    matched; default is C{" \t\r\n"}.  Also takes optional C{min}, C{max}, and C{exact} arguments,
    as defined for the C{L{Word}} class.
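# Illustrative sketch (not part of the original module): using White to make a line
# break significant, assuming the pyparsing package is importable.
from pyparsing import Word, alphas, White

word = Word(alphas)
two_lines = word + White("\n").suppress() + word   # the words must be newline-separated
print(two_lines.parseString("first\nsecond"))      # -> ['first', 'second']
# without the White term, "first second" on a single line would match as well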
    z<SPC>z<TAB>z<LF>z<CR>z<FF>)r�r'rr)r(� 	
rrrcs�tt��j�|�_�jdj�fdd��jD���djdd��jD���_d�_d�j�_	|�_
|dkrt|�_nt�_|dkr�|�_|�_
dS)Nr�c3s|]}|�jkr|VqdS)N)�
matchWhite)r�r�)r�rwrxr��sz!White.__init__.<locals>.<genexpr>css|]}tj|VqdS)N)r.�	whiteStrs)r�r�rwrwrxr��sTz	Expected r)
r�r.r�r-r�r�rYr�r[rarrr�)r�Zwsr
rr)rH)r�rxr��s zWhite.__init__TcCs�|||jkrt|||j|��|}|d7}||j}t|t|��}x"||krd|||jkrd|d7}qDW|||jkr�t|||j|��||||�fS)Nrr)r-rrarr
r�r)r�r-r�ror�r�rwrwrxr��s
zWhite.parseImpl)r,rrrr)T)r�r�r�r�r.r�r�r�rwrw)rHrxr.�scseZdZ�fdd�Z�ZS)�_PositionTokencs(tt|�j�|jj|_d|_d|_dS)NTF)r�r/r�rHr�r�r[r`)r�)rHrwrxr��s
z_PositionToken.__init__)r�r�r�r�r�rwrw)rHrxr/�sr/cs2eZdZdZ�fdd�Zdd�Zd	dd�Z�ZS)
rzb
    Token to advance to a specific column of input text; useful for tabular report scraping.
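# Illustrative sketch (not part of the original module): GoToColumn skips ahead to a
# 1-based column, assuming the pyparsing package is importable.
from pyparsing import Word, alphas, nums, GoToColumn

row = Word(alphas) + GoToColumn(8).suppress() + Word(nums)
print(row.parseString("abc    42"))    # value starts in column 8 -> ['abc', '42']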
    cstt|�j�||_dS)N)r�rr�r9)r��colno)rHrwrxr��szGoToColumn.__init__cCs`t||�|jkr\t|�}|jr*|j||�}x0||krZ||j�rZt||�|jkrZ|d7}q,W|S)Nrr)r9r�r]r��isspace)r�r-r�r�rwrwrxr��s&zGoToColumn.preParseTcCsDt||�}||jkr"t||d|��||j|}|||�}||fS)NzText not in expected column)r9r)r�r-r�roZthiscolZnewlocr�rwrwrxr�s

zGoToColumn.parseImpl)T)r�r�r�r�r�r�r�r�rwrw)rHrxr�s	cs*eZdZdZ�fdd�Zddd�Z�ZS)ra�
    Matches if current position is at the beginning of a line within the parse string
    
    Example::
    
        test = '''        AAA this line
        AAA and this line
          AAA but not this one
        B AAA and definitely not this one
        '''

        for t in (LineStart() + 'AAA' + restOfLine).searchString(test):
            print(t)
    
    Prints::
        ['AAA', ' this line']
        ['AAA', ' and this line']    

    cstt|�j�d|_dS)NzExpected start of line)r�rr�ra)r�)rHrwrxr�&szLineStart.__init__TcCs*t||�dkr|gfSt|||j|��dS)Nrr)r9rra)r�r-r�rorwrwrxr�*szLineStart.parseImpl)T)r�r�r�r�r�r�r�rwrw)rHrxrscs*eZdZdZ�fdd�Zddd�Z�ZS)rzU
    Matches if current position is at the end of a line within the parse string
    cs,tt|�j�|jtjjdd��d|_dS)Nrr�zExpected end of line)r�rr�r�r$rNr�ra)r�)rHrwrxr�3szLineEnd.__init__TcCsb|t|�kr6||dkr$|ddfSt|||j|��n(|t|�krN|dgfSt|||j|��dS)Nrrr)r�rra)r�r-r�rorwrwrxr�8szLineEnd.parseImpl)T)r�r�r�r�r�r�r�rwrw)rHrxr/scs*eZdZdZ�fdd�Zddd�Z�ZS)r*zM
    Matches if current position is at the beginning of the parse string
    cstt|�j�d|_dS)NzExpected start of text)r�r*r�ra)r�)rHrwrxr�GszStringStart.__init__TcCs0|dkr(||j|d�kr(t|||j|��|gfS)Nr)r�rra)r�r-r�rorwrwrxr�KszStringStart.parseImpl)T)r�r�r�r�r�r�r�rwrw)rHrxr*Cscs*eZdZdZ�fdd�Zddd�Z�ZS)r)zG
    Matches if current position is at the end of the parse string
    cstt|�j�d|_dS)NzExpected end of text)r�r)r�ra)r�)rHrwrxr�VszStringEnd.__init__TcCs^|t|�krt|||j|��n<|t|�kr6|dgfS|t|�krJ|gfSt|||j|��dS)Nrr)r�rra)r�r-r�rorwrwrxr�ZszStringEnd.parseImpl)T)r�r�r�r�r�r�r�rwrw)rHrxr)Rscs.eZdZdZef�fdd�	Zddd�Z�ZS)r1ap
    Matches if the current position is at the beginning of a Word, and
    is not preceded by any character in a given set of C{wordChars}
    (default=C{printables}). To emulate the C{} behavior of regular expressions,
    use C{WordStart(alphanums)}. C{WordStart} will also match at the beginning of
    the string being parsed, or at the beginning of a line.
    cs"tt|�j�t|�|_d|_dS)NzNot at the start of a word)r�r1r�r��	wordCharsra)r�r2)rHrwrxr�ls
zWordStart.__init__TcCs@|dkr8||d|jks(|||jkr8t|||j|��|gfS)Nrrr)r2rra)r�r-r�rorwrwrxr�qs
zWordStart.parseImpl)T)r�r�r�r�rVr�r�r�rwrw)rHrxr1dscs.eZdZdZef�fdd�	Zddd�Z�ZS)r0aZ
    Matches if the current position is at the end of a Word, and
    is not followed by any character in a given set of C{wordChars}
    (default=C{printables}). To emulate the C{} behavior of regular expressions,
    use C{WordEnd(alphanums)}. C{WordEnd} will also match at the end of
    the string being parsed, or at the end of a line.
    cs(tt|�j�t|�|_d|_d|_dS)NFzNot at the end of a word)r�r0r�r�r2rXra)r�r2)rHrwrxr��s
zWordEnd.__init__TcCsPt|�}|dkrH||krH|||jks8||d|jkrHt|||j|��|gfS)Nrrr)r�r2rra)r�r-r�ror�rwrwrxr��szWordEnd.parseImpl)T)r�r�r�r�rVr�r�r�rwrw)rHrxr0xscs�eZdZdZd�fdd�	Zdd�Zdd�Zd	d
�Z�fdd�Z�fd
d�Z	�fdd�Z
d�fdd�	Zgfdd�Z�fdd�Z
�ZS)r z^
    Abstract subclass of ParserElement, for combining and post-processing parsed tokens.
    Fcs�tt|�j|�t|t�r"t|�}t|t�r<tj|�g|_	njt|t
j�rzt|�}tdd�|D��rnt
tj|�}t|�|_	n,yt|�|_	Wntk
r�|g|_	YnXd|_dS)Ncss|]}t|t�VqdS)N)rzr�)r�r.rwrwrxr��sz+ParseExpression.__init__.<locals>.<genexpr>F)r�r r�rzr�r�r�r$rQ�exprsr��Iterable�allrvr�re)r�r3rg)rHrwrxr��s

zParseExpression.__init__cCs
|j|S)N)r3)r�r�rwrwrxr��szParseExpression.__getitem__cCs|jj|�d|_|S)N)r3r�rU)r�r�rwrwrxr��szParseExpression.appendcCs4d|_dd�|jD�|_x|jD]}|j�q W|S)z~Extends C{leaveWhitespace} defined in base class, and also invokes C{leaveWhitespace} on
           all contained expressions.FcSsg|]}|j��qSrw)r�)r�r�rwrwrxr��sz3ParseExpression.leaveWhitespace.<locals>.<listcomp>)rXr3r�)r�r�rwrwrxr��s
zParseExpression.leaveWhitespacecszt|t�rF||jkrvtt|�j|�xP|jD]}|j|jd�q,Wn0tt|�j|�x|jD]}|j|jd�q^W|S)Nrrrsrs)rzr+r]r�r r�r3)r�r�r�)rHrwrxr��s

zParseExpression.ignorecsLytt|�j�Stk
r"YnX|jdkrFd|jjt|j�f|_|jS)Nz%s:(%s))	r�r r�rKrUrHr�r�r3)r�)rHrwrxr��s
zParseExpression.__str__cs0tt|�j�x|jD]}|j�qWt|j�dk�r|jd}t||j�r�|jr�|jdkr�|j	r�|jdd�|jdg|_d|_
|j|jO_|j|jO_|jd}t||j�o�|jo�|jdko�|j	�r|jdd�|jdd�|_d|_
|j|jO_|j|jO_dt
|�|_|S)Nrqrrrz	Expected rsrs)r�r r�r3r�rzrHrSrVr^rUr[r`r�ra)r�r�r�)rHrwrxr��s0




zParseExpression.streamlinecstt|�j||�}|S)N)r�r rm)r�r�rlr�)rHrwrxrm�szParseExpression.setResultsNamecCs:|dd�|g}x|jD]}|j|�qW|jg�dS)N)r3r�r�)r�r��tmpr�rwrwrxr��szParseExpression.validatecs$tt|�j�}dd�|jD�|_|S)NcSsg|]}|j��qSrw)r�)r�r�rwrwrxr��sz(ParseExpression.copy.<locals>.<listcomp>)r�r r�r3)r�r�)rHrwrxr��szParseExpression.copy)F)F)r�r�r�r�r�r�r�r�r�r�r�rmr�r�r�rwrw)rHrxr �s	
"csTeZdZdZGdd�de�Zd�fdd�	Zddd�Zd	d
�Zdd�Z	d
d�Z
�ZS)ra

    Requires all given C{ParseExpression}s to be found in the given order.
    Expressions may be separated by whitespace.
    May be constructed using the C{'+'} operator.
    May also be constructed using the C{'-'} operator, which will suppress backtracking.

    Example::
        integer = Word(nums)
        name_expr = OneOrMore(Word(alphas))

        expr = And([integer("id"),name_expr("name"),integer("age")])
        # more easily written as:
        expr = integer("id") + name_expr("name") + integer("age")
    cseZdZ�fdd�Z�ZS)zAnd._ErrorStopcs&ttj|�j||�d|_|j�dS)N�-)r�rr�r�r�r�)r�r�r�)rHrwrxr�
szAnd._ErrorStop.__init__)r�r�r�r�r�rwrw)rHrxr�
sr�TcsRtt|�j||�tdd�|jD��|_|j|jdj�|jdj|_d|_	dS)Ncss|]}|jVqdS)N)r[)r�r�rwrwrxr�
szAnd.__init__.<locals>.<genexpr>rT)
r�rr�r5r3r[r�rYrXre)r�r3rg)rHrwrxr�
s
zAnd.__init__c	Cs|jdj|||dd�\}}d}x�|jdd�D]�}t|tj�rFd}q0|r�y|j|||�\}}Wq�tk
rv�Yq�tk
r�}zd|_tj|��WYdd}~Xq�t	k
r�t|t
|�|j|��Yq�Xn|j|||�\}}|s�|j�r0||7}q0W||fS)NrF)rprrT)
r3rtrzrr�r#r�
__traceback__r�r�r�rar�)	r�r-r�ro�
resultlistZ	errorStopr�Z
exprtokensr�rwrwrxr�
s(z
And.parseImplcCst|t�rtj|�}|j|�S)N)rzr�r$rQr�)r�r�rwrwrxr5
s

zAnd.__iadd__cCs8|dd�|g}x |jD]}|j|�|jsPqWdS)N)r3r�r[)r�r��subRecCheckListr�rwrwrxr�:
s

zAnd.checkRecursioncCs@t|d�r|jS|jdkr:ddjdd�|jD��d|_|jS)Nr��{r�css|]}t|�VqdS)N)r�)r�r�rwrwrxr�F
szAnd.__str__.<locals>.<genexpr>�})r�r�rUr�r3)r�rwrwrxr�A
s


 zAnd.__str__)T)T)r�r�r�r�r
r�r�r�rr�r�r�rwrw)rHrxr�s
csDeZdZdZd�fdd�	Zddd�Zdd	�Zd
d�Zdd
�Z�Z	S)ra�
    Requires that at least one C{ParseExpression} is found.
    If two expressions match, the expression that matches the longest string will be used.
    May be constructed using the C{'^'} operator.

    Example::
        # construct Or using '^' operator
        
        number = Word(nums) ^ Combine(Word(nums) + '.' + Word(nums))
        print(number.searchString("123 3.1416 789"))
    prints::
        [['123'], ['3.1416'], ['789']]
    Fcs:tt|�j||�|jr0tdd�|jD��|_nd|_dS)Ncss|]}|jVqdS)N)r[)r�r�rwrwrxr�\
szOr.__init__.<locals>.<genexpr>T)r�rr�r3rr[)r�r3rg)rHrwrxr�Y
szOr.__init__TcCsTd}d}g}x�|jD]�}y|j||�}Wnvtk
rd}	z d|	_|	j|krT|	}|	j}WYdd}	~	Xqtk
r�t|�|kr�t|t|�|j|�}t|�}YqX|j||f�qW|�r*|j	dd�d�x`|D]X\}
}y|j
|||�Stk
�r$}	z"d|	_|	j|k�r|	}|	j}WYdd}	~	Xq�Xq�W|dk	�rB|j|_|�nt||d|��dS)NrrcSs
|dS)Nrrw)�xrwrwrxryu
szOr.parseImpl.<locals>.<lambda>)r�z no defined alternatives to matchrs)r3r�rr8r�r�r�rar��sortrtr�)r�r-r�ro�	maxExcLoc�maxExceptionr�r�Zloc2r��_rwrwrxr�`
s<

zOr.parseImplcCst|t�rtj|�}|j|�S)N)rzr�r$rQr�)r�r�rwrwrx�__ixor__�
s

zOr.__ixor__cCs@t|d�r|jS|jdkr:ddjdd�|jD��d|_|jS)Nr�r;z ^ css|]}t|�VqdS)N)r�)r�r�rwrwrxr��
szOr.__str__.<locals>.<genexpr>r<)r�r�rUr�r3)r�rwrwrxr��
s


 z
Or.__str__cCs0|dd�|g}x|jD]}|j|�qWdS)N)r3r�)r�r�r:r�rwrwrxr��
szOr.checkRecursion)F)T)
r�r�r�r�r�r�rBr�r�r�rwrw)rHrxrK
s

&	csDeZdZdZd�fdd�	Zddd�Zdd	�Zd
d�Zdd
�Z�Z	S)ra�
    Requires that at least one C{ParseExpression} is found.
    If two expressions match, the first one listed is the one that will match.
    May be constructed using the C{'|'} operator.

    Example::
        # construct MatchFirst using '|' operator
        
        # watch the order of expressions to match
        number = Word(nums) | Combine(Word(nums) + '.' + Word(nums))
        print(number.searchString("123 3.1416 789")) #  Fail! -> [['123'], ['3'], ['1416'], ['789']]

        # put more selective expression first
        number = Combine(Word(nums) + '.' + Word(nums)) | Word(nums)
        print(number.searchString("123 3.1416 789")) #  Better -> [['123'], ['3.1416'], ['789']]
    Fcs:tt|�j||�|jr0tdd�|jD��|_nd|_dS)Ncss|]}|jVqdS)N)r[)r�r�rwrwrxr��
sz&MatchFirst.__init__.<locals>.<genexpr>T)r�rr�r3rr[)r�r3rg)rHrwrxr��
szMatchFirst.__init__Tc	Cs�d}d}x�|jD]�}y|j|||�}|Stk
r\}z|j|krL|}|j}WYdd}~Xqtk
r�t|�|kr�t|t|�|j|�}t|�}YqXqW|dk	r�|j|_|�nt||d|��dS)Nrrz no defined alternatives to matchrs)r3rtrr�r�r�rar�)	r�r-r�ror?r@r�r�r�rwrwrxr��
s$
zMatchFirst.parseImplcCst|t�rtj|�}|j|�S)N)rzr�r$rQr�)r�r�rwrwrx�__ior__�
s

zMatchFirst.__ior__cCs@t|d�r|jS|jdkr:ddjdd�|jD��d|_|jS)Nr�r;z | css|]}t|�VqdS)N)r�)r�r�rwrwrxr��
sz%MatchFirst.__str__.<locals>.<genexpr>r<)r�r�rUr�r3)r�rwrwrxr��
s


 zMatchFirst.__str__cCs0|dd�|g}x|jD]}|j|�qWdS)N)r3r�)r�r�r:r�rwrwrxr��
szMatchFirst.checkRecursion)F)T)
r�r�r�r�r�r�rCr�r�r�rwrw)rHrxr�
s
	cs<eZdZdZd�fdd�	Zddd�Zdd�Zd	d
�Z�ZS)
ram
    Requires all given C{ParseExpression}s to be found, but in any order.
    Expressions may be separated by whitespace.
    May be constructed using the C{'&'} operator.

    Example::
        color = oneOf("RED ORANGE YELLOW GREEN BLUE PURPLE BLACK WHITE BROWN")
        shape_type = oneOf("SQUARE CIRCLE TRIANGLE STAR HEXAGON OCTAGON")
        integer = Word(nums)
        shape_attr = "shape:" + shape_type("shape")
        posn_attr = "posn:" + Group(integer("x") + ',' + integer("y"))("posn")
        color_attr = "color:" + color("color")
        size_attr = "size:" + integer("size")

        # use Each (using operator '&') to accept attributes in any order 
        # (shape and posn are required, color and size are optional)
        shape_spec = shape_attr & posn_attr & Optional(color_attr) & Optional(size_attr)

        shape_spec.runTests('''
            shape: SQUARE color: BLACK posn: 100, 120
            shape: CIRCLE size: 50 color: BLUE posn: 50,80
            color:GREEN size:20 shape:TRIANGLE posn:20,40
            '''
            )
    prints::
        shape: SQUARE color: BLACK posn: 100, 120
        ['shape:', 'SQUARE', 'color:', 'BLACK', 'posn:', ['100', ',', '120']]
        - color: BLACK
        - posn: ['100', ',', '120']
          - x: 100
          - y: 120
        - shape: SQUARE


        shape: CIRCLE size: 50 color: BLUE posn: 50,80
        ['shape:', 'CIRCLE', 'size:', '50', 'color:', 'BLUE', 'posn:', ['50', ',', '80']]
        - color: BLUE
        - posn: ['50', ',', '80']
          - x: 50
          - y: 80
        - shape: CIRCLE
        - size: 50


        color: GREEN size: 20 shape: TRIANGLE posn: 20,40
        ['color:', 'GREEN', 'size:', '20', 'shape:', 'TRIANGLE', 'posn:', ['20', ',', '40']]
        - color: GREEN
        - posn: ['20', ',', '40']
          - x: 20
          - y: 40
        - shape: TRIANGLE
        - size: 20
    Tcs8tt|�j||�tdd�|jD��|_d|_d|_dS)Ncss|]}|jVqdS)N)r[)r�r�rwrwrxr�sz Each.__init__.<locals>.<genexpr>T)r�rr�r5r3r[rX�initExprGroups)r�r3rg)rHrwrxr�sz
Each.__init__c
s�|jr�tdd�|jD��|_dd�|jD�}dd�|jD�}|||_dd�|jD�|_dd�|jD�|_dd�|jD�|_|j|j7_d	|_|}|jdd�}|jdd��g}d
}	x�|	�rp|�|j|j}
g}x~|
D]v}y|j||�}Wn t	k
�r|j
|�Yq�X|j
|jjt|�|��||k�rD|j
|�q�|�kr�j
|�q�Wt|�t|
�kr�d	}	q�W|�r�djdd�|D��}
t	||d
|
��|�fdd�|jD�7}g}x*|D]"}|j|||�\}}|j
|��q�Wt|tg��}||fS)Ncss&|]}t|t�rt|j�|fVqdS)N)rzrr�r.)r�r�rwrwrxr�sz!Each.parseImpl.<locals>.<genexpr>cSsg|]}t|t�r|j�qSrw)rzrr.)r�r�rwrwrxr�sz"Each.parseImpl.<locals>.<listcomp>cSs"g|]}|jrt|t�r|�qSrw)r[rzr)r�r�rwrwrxr�scSsg|]}t|t�r|j�qSrw)rzr2r.)r�r�rwrwrxr� scSsg|]}t|t�r|j�qSrw)rzrr.)r�r�rwrwrxr�!scSs g|]}t|tttf�s|�qSrw)rzrr2r)r�r�rwrwrxr�"sFTz, css|]}t|�VqdS)N)r�)r�r�rwrwrxr�=sz*Missing one or more required elements (%s)cs$g|]}t|t�r|j�kr|�qSrw)rzrr.)r�r�)�tmpOptrwrxr�As)rDr�r3Zopt1mapZ	optionalsZmultioptionalsZ
multirequiredZrequiredr�rr�r�r��remover�r�rt�sumr")r�r-r�roZopt1Zopt2ZtmpLocZtmpReqdZ
matchOrderZkeepMatchingZtmpExprsZfailedr�Zmissingr9r�ZfinalResultsrw)rErxr�sP



zEach.parseImplcCs@t|d�r|jS|jdkr:ddjdd�|jD��d|_|jS)Nr�r;z & css|]}t|�VqdS)N)r�)r�r�rwrwrxr�PszEach.__str__.<locals>.<genexpr>r<)r�r�rUr�r3)r�rwrwrxr�Ks


 zEach.__str__cCs0|dd�|g}x|jD]}|j|�qWdS)N)r3r�)r�r�r:r�rwrwrxr�TszEach.checkRecursion)T)T)	r�r�r�r�r�r�r�r�r�rwrw)rHrxr�
s
5
1	csleZdZdZd�fdd�	Zddd�Zdd	�Z�fd
d�Z�fdd
�Zdd�Z	gfdd�Z
�fdd�Z�ZS)rza
    Abstract subclass of C{ParserElement}, for combining and post-processing parsed tokens.
    Fcs�tt|�j|�t|t�r@ttjt�r2tj|�}ntjt	|��}||_
d|_|dk	r�|j|_|j
|_
|j|j�|j|_|j|_|j|_|jj|j�dS)N)r�rr�rzr��
issubclassr$rQr,rr.rUr`r[r�rYrXrWrer]r�)r�r.rg)rHrwrxr�^s
zParseElementEnhance.__init__TcCs2|jdk	r|jj|||dd�Std||j|��dS)NF)rpr�)r.rtrra)r�r-r�rorwrwrxr�ps
zParseElementEnhance.parseImplcCs*d|_|jj�|_|jdk	r&|jj�|S)NF)rXr.r�r�)r�rwrwrxr�vs


z#ParseElementEnhance.leaveWhitespacecsrt|t�rB||jkrntt|�j|�|jdk	rn|jj|jd�n,tt|�j|�|jdk	rn|jj|jd�|S)Nrrrsrs)rzr+r]r�rr�r.)r�r�)rHrwrxr�}s



zParseElementEnhance.ignorecs&tt|�j�|jdk	r"|jj�|S)N)r�rr�r.)r�)rHrwrxr��s

zParseElementEnhance.streamlinecCsB||krt||g��|dd�|g}|jdk	r>|jj|�dS)N)r&r.r�)r�r�r:rwrwrxr��s

z"ParseElementEnhance.checkRecursioncCs6|dd�|g}|jdk	r(|jj|�|jg�dS)N)r.r�r�)r�r�r6rwrwrxr��s
zParseElementEnhance.validatecsVytt|�j�Stk
r"YnX|jdkrP|jdk	rPd|jjt|j�f|_|jS)Nz%s:(%s))	r�rr�rKrUr.rHr�r�)r�)rHrwrxr��szParseElementEnhance.__str__)F)T)
r�r�r�r�r�r�r�r�r�r�r�r�r�rwrw)rHrxrZs
cs*eZdZdZ�fdd�Zddd�Z�ZS)ra�
    Lookahead matching of the given parse expression.  C{FollowedBy}
    does I{not} advance the parsing position within the input string, it only
    verifies that the specified parse expression matches at the current
    position.  C{FollowedBy} always returns a null token list.

    Example::
        # use FollowedBy to match a label only if it is followed by a ':'
        data_word = Word(alphas)
        label = data_word + FollowedBy(':')
        attr_expr = Group(label + Suppress(':') + OneOrMore(data_word, stopOn=label).setParseAction(' '.join))
        
        OneOrMore(attr_expr).parseString("shape: SQUARE color: BLACK posn: upper left").pprint()
    prints::
        [['shape', 'SQUARE'], ['color', 'BLACK'], ['posn', 'upper left']]
    cstt|�j|�d|_dS)NT)r�rr�r[)r�r.)rHrwrxr��szFollowedBy.__init__TcCs|jj||�|gfS)N)r.r�)r�r-r�rorwrwrxr��szFollowedBy.parseImpl)T)r�r�r�r�r�r�r�rwrw)rHrxr�scs2eZdZdZ�fdd�Zd	dd�Zdd�Z�ZS)
ra�
    Lookahead to disallow matching with the given parse expression.  C{NotAny}
    does I{not} advance the parsing position within the input string, it only
    verifies that the specified parse expression does I{not} match at the current
    position.  Also, C{NotAny} does I{not} skip over leading whitespace. C{NotAny}
    always returns a null token list.  May be constructed using the '~' operator.

    Example::
        
    cs0tt|�j|�d|_d|_dt|j�|_dS)NFTzFound unwanted token, )r�rr�rXr[r�r.ra)r�r.)rHrwrxr��szNotAny.__init__TcCs&|jj||�rt|||j|��|gfS)N)r.r�rra)r�r-r�rorwrwrxr��szNotAny.parseImplcCs4t|d�r|jS|jdkr.dt|j�d|_|jS)Nr�z~{r<)r�r�rUr�r.)r�rwrwrxr��s


zNotAny.__str__)T)r�r�r�r�r�r�r�r�rwrw)rHrxr�s

cs(eZdZd�fdd�	Zddd�Z�ZS)	�_MultipleMatchNcsFtt|�j|�d|_|}t|t�r.tj|�}|dk	r<|nd|_dS)NT)	r�rIr�rWrzr�r$rQ�	not_ender)r�r.�stopOnZender)rHrwrxr��s

z_MultipleMatch.__init__TcCs�|jj}|j}|jdk	}|r$|jj}|r2|||�||||dd�\}}yZ|j}	xJ|rb|||�|	rr|||�}
n|}
|||
|�\}}|s�|j�rT||7}qTWWnttfk
r�YnX||fS)NF)rp)	r.rtr�rJr�r]r�rr�)r�r-r�roZself_expr_parseZself_skip_ignorablesZcheck_enderZ
try_not_enderr�ZhasIgnoreExprsr�Z	tmptokensrwrwrxr��s,



z_MultipleMatch.parseImpl)N)T)r�r�r�r�r�r�rwrw)rHrxrI�srIc@seZdZdZdd�ZdS)ra�
    Repetition of one or more of the given expression.
    
    Parameters:
     - expr - expression that must match one or more times
     - stopOn - (default=C{None}) - expression for a terminating sentinel
          (only required if the sentinel would ordinarily match the repetition 
          expression)          

    Example::
        data_word = Word(alphas)
        label = data_word + FollowedBy(':')
        attr_expr = Group(label + Suppress(':') + OneOrMore(data_word).setParseAction(' '.join))

        text = "shape: SQUARE posn: upper left color: BLACK"
        OneOrMore(attr_expr).parseString(text).pprint()  # Fail! read 'color' as data instead of next label -> [['shape', 'SQUARE color']]

        # use stopOn attribute for OneOrMore to avoid reading label string as part of the data
        attr_expr = Group(label + Suppress(':') + OneOrMore(data_word, stopOn=label).setParseAction(' '.join))
        OneOrMore(attr_expr).parseString(text).pprint() # Better -> [['shape', 'SQUARE'], ['posn', 'upper left'], ['color', 'BLACK']]
        
        # could also be written as
        (attr_expr * (1,)).parseString(text).pprint()
    cCs4t|d�r|jS|jdkr.dt|j�d|_|jS)Nr�r;z}...)r�r�rUr�r.)r�rwrwrxr�!s


zOneOrMore.__str__N)r�r�r�r�r�rwrwrwrxrscs8eZdZdZd
�fdd�	Zd�fdd�	Zdd	�Z�ZS)r2aw
    Optional repetition of zero or more of the given expression.
    
    Parameters:
     - expr - expression that must match zero or more times
     - stopOn - (default=C{None}) - expression for a terminating sentinel
          (only required if the sentinel would ordinarily match the repetition 
          expression)          

    Example: similar to L{OneOrMore}
    Ncstt|�j||d�d|_dS)N)rKT)r�r2r�r[)r�r.rK)rHrwrxr�6szZeroOrMore.__init__Tcs6ytt|�j|||�Sttfk
r0|gfSXdS)N)r�r2r�rr�)r�r-r�ro)rHrwrxr�:szZeroOrMore.parseImplcCs4t|d�r|jS|jdkr.dt|j�d|_|jS)Nr�rz]...)r�r�rUr�r.)r�rwrwrxr�@s


zZeroOrMore.__str__)N)T)r�r�r�r�r�r�r�r�rwrw)rHrxr2*sc@s eZdZdd�ZeZdd�ZdS)�
_NullTokencCsdS)NFrw)r�rwrwrxr�Jsz_NullToken.__bool__cCsdS)Nr�rw)r�rwrwrxr�Msz_NullToken.__str__N)r�r�r�r�r'r�rwrwrwrxrLIsrLcs6eZdZdZef�fdd�	Zd	dd�Zdd�Z�ZS)
raa
    Optional matching of the given expression.

    Parameters:
     - expr - expression that must match zero or more times
     - default (optional) - value to be returned if the optional expression is not found.

    Example::
        # US postal code can be a 5-digit zip, plus optional 4-digit qualifier
        zip = Combine(Word(nums, exact=5) + Optional('-' + Word(nums, exact=4)))
        zip.runTests('''
            # traditional ZIP code
            12345
            
            # ZIP+4 form
            12101-0001
            
            # invalid ZIP
            98765-
            ''')
    prints::
        # traditional ZIP code
        12345
        ['12345']

        # ZIP+4 form
        12101-0001
        ['12101-0001']

        # invalid ZIP
        98765-
             ^
        FAIL: Expected end of text (at char 5), (line:1, col:6)
    cs.tt|�j|dd�|jj|_||_d|_dS)NF)rgT)r�rr�r.rWr�r[)r�r.r�)rHrwrxr�ts
zOptional.__init__TcCszy|jj|||dd�\}}WnTttfk
rp|jtk	rh|jjr^t|jg�}|j||jj<ql|jg}ng}YnX||fS)NF)rp)r.rtrr�r��_optionalNotMatchedrVr")r�r-r�ror�rwrwrxr�zs


zOptional.parseImplcCs4t|d�r|jS|jdkr.dt|j�d|_|jS)Nr�rr	)r�r�rUr�r.)r�rwrwrxr��s


zOptional.__str__)T)	r�r�r�r�rMr�r�r�r�rwrw)rHrxrQs"
cs,eZdZdZd	�fdd�	Zd
dd�Z�ZS)r(a�	
    Token for skipping over all undefined text until the matched expression is found.

    Parameters:
     - expr - target expression marking the end of the data to be skipped
     - include - (default=C{False}) if True, the target expression is also parsed 
          (the skipped text and target expression are returned as a 2-element list).
     - ignore - (default=C{None}) used to define grammars (typically quoted strings and 
          comments) that might contain false matches to the target expression
     - failOn - (default=C{None}) define expressions that are not allowed to be 
          included in the skipped test; if found before the target expression is found, 
          the SkipTo is not a match

    Example::
        report = '''
            Outstanding Issues Report - 1 Jan 2000

               # | Severity | Description                               |  Days Open
            -----+----------+-------------------------------------------+-----------
             101 | Critical | Intermittent system crash                 |          6
              94 | Cosmetic | Spelling error on Login ('log|n')         |         14
              79 | Minor    | System slow when running too many reports |         47
            '''
        integer = Word(nums)
        SEP = Suppress('|')
        # use SkipTo to simply match everything up until the next SEP
        # - ignore quoted strings, so that a '|' character inside a quoted string does not match
        # - parse action will call token.strip() for each matched token, i.e., the description body
        string_data = SkipTo(SEP, ignore=quotedString)
        string_data.setParseAction(tokenMap(str.strip))
        ticket_expr = (integer("issue_num") + SEP 
                      + string_data("sev") + SEP 
                      + string_data("desc") + SEP 
                      + integer("days_open"))
        
        for tkt in ticket_expr.searchString(report):
            print tkt.dump()
    prints::
        ['101', 'Critical', 'Intermittent system crash', '6']
        - days_open: 6
        - desc: Intermittent system crash
        - issue_num: 101
        - sev: Critical
        ['94', 'Cosmetic', "Spelling error on Login ('log|n')", '14']
        - days_open: 14
        - desc: Spelling error on Login ('log|n')
        - issue_num: 94
        - sev: Cosmetic
        ['79', 'Minor', 'System slow when running too many reports', '47']
        - days_open: 47
        - desc: System slow when running too many reports
        - issue_num: 79
        - sev: Minor
    FNcs`tt|�j|�||_d|_d|_||_d|_t|t	�rFt
j|�|_n||_dt
|j�|_dS)NTFzNo match found for )r�r(r��
ignoreExprr[r`�includeMatchr�rzr�r$rQ�failOnr�r.ra)r�r��includer�rP)rHrwrxr��s
zSkipTo.__init__TcCs,|}t|�}|j}|jj}|jdk	r,|jjnd}|jdk	rB|jjnd}	|}
x�|
|kr�|dk	rh|||
�rhP|	dk	r�x*y|	||
�}
Wqrtk
r�PYqrXqrWy|||
ddd�Wn tt	fk
r�|
d7}
YqLXPqLWt|||j
|��|
}|||�}t|�}|j�r$||||dd�\}}
||
7}||fS)NF)rorprr)rp)
r�r.rtrPr�rNr�rrr�rar"rO)r�r-r�ror0r�r.Z
expr_parseZself_failOn_canParseNextZself_ignoreExpr_tryParseZtmplocZskiptextZ
skipresultr�rwrwrxr��s<

zSkipTo.parseImpl)FNN)T)r�r�r�r�r�r�r�rwrw)rHrxr(�s6
csbeZdZdZd�fdd�	Zdd�Zdd�Zd	d
�Zdd�Zgfd
d�Z	dd�Z
�fdd�Z�ZS)raK
    Forward declaration of an expression to be defined later -
    used for recursive grammars, such as algebraic infix notation.
    When the expression is known, it is assigned to the C{Forward} variable using the '<<' operator.

    Note: take care when assigning to C{Forward} not to overlook precedence of operators.
    Specifically, '|' has a lower precedence than '<<', so that::
        fwdExpr << a | b | c
    will actually be evaluated as::
        (fwdExpr << a) | b | c
    thereby leaving b and c out as parseable alternatives.  It is recommended that you
    explicitly group the values inserted into the C{Forward}::
        fwdExpr << (a | b | c)
    Converting to use the '<<=' operator instead will avoid this problem.

    See L{ParseResults.pprint} for an example of a recursive parser created using
    C{Forward}.
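# Illustrative sketch (not part of the original module): a small recursive grammar
# built with Forward, assuming the pyparsing package is importable.
from pyparsing import Forward, Word, nums, Group, Suppress, ZeroOrMore

expr = Forward()
atom = Word(nums) | Group(Suppress('(') + expr + Suppress(')'))
expr <<= atom + ZeroOrMore('+' + atom)
print(expr.parseString("(1+(2+3))+4"))
# -> [['1', '+', ['2', '+', '3']], '+', '4']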
    Ncstt|�j|dd�dS)NF)rg)r�rr�)r�r�)rHrwrxr�szForward.__init__cCsjt|t�rtj|�}||_d|_|jj|_|jj|_|j|jj	�|jj
|_
|jj|_|jj
|jj�|S)N)rzr�r$rQr.rUr`r[r�rYrXrWr]r�)r�r�rwrwrx�
__lshift__s





zForward.__lshift__cCs||>S)Nrw)r�r�rwrwrx�__ilshift__'szForward.__ilshift__cCs
d|_|S)NF)rX)r�rwrwrxr�*szForward.leaveWhitespacecCs$|js d|_|jdk	r |jj�|S)NT)r_r.r�)r�rwrwrxr�.s


zForward.streamlinecCs>||kr0|dd�|g}|jdk	r0|jj|�|jg�dS)N)r.r�r�)r�r�r6rwrwrxr�5s

zForward.validatecCs>t|d�r|jS|jjdSd}Wd|j|_X|jjd|S)Nr�z: ...�Nonez: )r�r�rHr�Z_revertClass�_ForwardNoRecurser.r�)r�Z	retStringrwrwrxr�<s

zForward.__str__cs.|jdk	rtt|�j�St�}||K}|SdS)N)r.r�rr�)r�r�)rHrwrxr�Ms

zForward.copy)N)
r�r�r�r�r�rRrSr�r�r�r�r�r�rwrw)rHrxrs
c@seZdZdd�ZdS)rUcCsdS)Nz...rw)r�rwrwrxr�Vsz_ForwardNoRecurse.__str__N)r�r�r�r�rwrwrwrxrUUsrUcs"eZdZdZd�fdd�	Z�ZS)r-zQ
    Abstract subclass of C{ParseExpression}, for converting parsed results.
    Fcstt|�j|�d|_dS)NF)r�r-r�rW)r�r.rg)rHrwrxr�]szTokenConverter.__init__)F)r�r�r�r�r�r�rwrw)rHrxr-Yscs6eZdZdZd
�fdd�	Z�fdd�Zdd	�Z�ZS)r
a�
    Converter to concatenate all matching tokens to a single string.
    By default, the matching patterns must also be contiguous in the input string;
    this can be disabled by specifying C{'adjacent=False'} in the constructor.

    Example::
        real = Word(nums) + '.' + Word(nums)
        print(real.parseString('3.1416')) # -> ['3', '.', '1416']
        # will also erroneously match the following
        print(real.parseString('3. 1416')) # -> ['3', '.', '1416']

        real = Combine(Word(nums) + '.' + Word(nums))
        print(real.parseString('3.1416')) # -> ['3.1416']
        # no match when there are internal spaces
        print(real.parseString('3. 1416')) # -> Exception: Expected W:(0123...)
    r�Tcs8tt|�j|�|r|j�||_d|_||_d|_dS)NT)r�r
r�r��adjacentrX�
joinStringre)r�r.rWrV)rHrwrxr�rszCombine.__init__cs(|jrtj||�ntt|�j|�|S)N)rVr$r�r�r
)r�r�)rHrwrxr�|szCombine.ignorecCsP|j�}|dd�=|tdj|j|j��g|jd�7}|jrH|j�rH|gS|SdS)Nr�)r�)r�r"r�r
rWrbrVr�)r�r-r�r�ZretToksrwrwrxr��s
"zCombine.postParse)r�T)r�r�r�r�r�r�r�r�rwrw)rHrxr
as
cs(eZdZdZ�fdd�Zdd�Z�ZS)ra�
    Converter to return the matched tokens as a list - useful for returning tokens of C{L{ZeroOrMore}} and C{L{OneOrMore}} expressions.

    Example::
        ident = Word(alphas)
        num = Word(nums)
        term = ident | num
        func = ident + Optional(delimitedList(term))
        print(func.parseString("fn a,b,100"))  # -> ['fn', 'a', 'b', '100']

        func = ident + Group(Optional(delimitedList(term)))
        print(func.parseString("fn a,b,100"))  # -> ['fn', ['a', 'b', '100']]
    cstt|�j|�d|_dS)NT)r�rr�rW)r�r.)rHrwrxr��szGroup.__init__cCs|gS)Nrw)r�r-r�r�rwrwrxr��szGroup.postParse)r�r�r�r�r�r�r�rwrw)rHrxr�s
cs(eZdZdZ�fdd�Zdd�Z�ZS)raW
    Converter to return a repetitive expression as a list, but also as a dictionary.
    Each element can also be referenced using the first token in the expression as its key.
    Useful for tabular report scraping when the first column can be used as an item key.

    Example::
        data_word = Word(alphas)
        label = data_word + FollowedBy(':')
        attr_expr = Group(label + Suppress(':') + OneOrMore(data_word).setParseAction(' '.join))

        text = "shape: SQUARE posn: upper left color: light blue texture: burlap"
        attr_expr = (label + Suppress(':') + OneOrMore(data_word, stopOn=label).setParseAction(' '.join))
        
        # print attributes as plain groups
        print(OneOrMore(attr_expr).parseString(text).dump())
        
        # instead of OneOrMore(expr), parse using Dict(OneOrMore(Group(expr))) - Dict will auto-assign names
        result = Dict(OneOrMore(Group(attr_expr))).parseString(text)
        print(result.dump())
        
        # access named fields as dict entries, or output as dict
        print(result['shape'])        
        print(result.asDict())
    prints::
        ['shape', 'SQUARE', 'posn', 'upper left', 'color', 'light blue', 'texture', 'burlap']

        [['shape', 'SQUARE'], ['posn', 'upper left'], ['color', 'light blue'], ['texture', 'burlap']]
        - color: light blue
        - posn: upper left
        - shape: SQUARE
        - texture: burlap
        SQUARE
        {'color': 'light blue', 'posn': 'upper left', 'texture': 'burlap', 'shape': 'SQUARE'}
    See more examples at L{ParseResults} of accessing fields by results name.
    cstt|�j|�d|_dS)NT)r�rr�rW)r�r.)rHrwrxr��sz
Dict.__init__cCs�x�t|�D]�\}}t|�dkr q
|d}t|t�rBt|d�j�}t|�dkr^td|�||<q
t|�dkr�t|dt�r�t|d|�||<q
|j�}|d=t|�dks�t|t�r�|j	�r�t||�||<q
t|d|�||<q
W|j
r�|gS|SdS)Nrrrr�rq)r�r�rzrur�r�r�r"r�r�rV)r�r-r�r�r��tokZikeyZ	dictvaluerwrwrxr��s$
zDict.postParse)r�r�r�r�r�r�r�rwrw)rHrxr�s#c@s eZdZdZdd�Zdd�ZdS)r+aV
    Converter for ignoring the results of a parsed expression.

    Example::
        source = "a, b, c,d"
        wd = Word(alphas)
        wd_list1 = wd + ZeroOrMore(',' + wd)
        print(wd_list1.parseString(source))

        # often, delimiters that are useful during parsing are just in the
        # way afterward - use Suppress to keep them out of the parsed output
        wd_list2 = wd + ZeroOrMore(Suppress(',') + wd)
        print(wd_list2.parseString(source))
    prints::
        ['a', ',', 'b', ',', 'c', ',', 'd']
        ['a', 'b', 'c', 'd']
    (See also L{delimitedList}.)
    cCsgS)Nrw)r�r-r�r�rwrwrxr��szSuppress.postParsecCs|S)Nrw)r�rwrwrxr��szSuppress.suppressN)r�r�r�r�r�r�rwrwrwrxr+�sc@s(eZdZdZdd�Zdd�Zdd�ZdS)	rzI
    Wrapper for parse actions, to ensure they are only called once.
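# Illustrative sketch (not part of the original module): OnlyOnce lets a parse action
# fire a single time until reset() is called; assumes pyparsing is importable.
from pyparsing import OnlyOnce, Word, alphas

def report(tokens):
    print("saw", tokens[0])

once = OnlyOnce(report)
wd = Word(alphas).setParseAction(once)
wd.parseString("first")    # prints: saw first
once.reset()               # re-arm the wrapped action
wd.parseString("second")   # prints: saw second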
    cCst|�|_d|_dS)NF)rM�callable�called)r�Z
methodCallrwrwrxr�s
zOnlyOnce.__init__cCs.|js|j|||�}d|_|St||d��dS)NTr�)rZrYr)r�r�r5rvr�rwrwrxr�s
zOnlyOnce.__call__cCs
d|_dS)NF)rZ)r�rwrwrx�reset
szOnlyOnce.resetN)r�r�r�r�r�r�r[rwrwrwrxr�scs:t����fdd�}y�j|_Wntk
r4YnX|S)as
    Decorator for debugging parse actions. 
    
    When the parse action is called, this decorator will print C{">> entering I{method-name}(line:I{current_source_line}, I{parse_location}, I{matched_tokens})".}
    When the parse action completes, the decorator will print C{"<<"} followed by the returned value, or any exception that the parse action raised.

    Example::
        wd = Word(alphas)

        @traceParseAction
        def remove_duplicate_chars(tokens):
            return ''.join(sorted(set(''.join(tokens))))

        wds = OneOrMore(wd).setParseAction(remove_duplicate_chars)
        print(wds.parseString("slkdjs sld sldd sdlf sdljf"))
    prints::
        >>entering remove_duplicate_chars(line: 'slkdjs sld sldd sdlf sdljf', 0, (['slkdjs', 'sld', 'sldd', 'sdlf', 'sdljf'], {}))
        <<leaving remove_duplicate_chars (ret: 'dfjkls')
        ['dfjkls']
    [compiled bytecode omitted - traceParseAction internals and the delimitedList helper]
    Helper to define a delimited list of expressions - the delimiter defaults to ','.
    By default, the list elements and delimiters can have intervening whitespace, and
    comments, but this can be overridden by passing C{combine=True} in the constructor.
    If C{combine} is set to C{True}, the matching tokens are returned as a single token
    string, with the delimiters included; otherwise, the matching tokens are returned
    as a list of tokens, with the delimiters suppressed.

    Example::
        delimitedList(Word(alphas)).parseString("aa,bb,cc") # -> ['aa', 'bb', 'cc']
        delimitedList(Word(hexnums), delim=':', combine=True).parseString("AA:BB:CC:DD:EE") # -> ['AA:BB:CC:DD:EE']
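
    A self-contained version of the calls above (pyparsing 2.x assumed)::

        from pyparsing import Word, alphas, delimitedList, hexnums

        print(delimitedList(Word(alphas)).parseString("aa, bb ,cc"))
        # -> ['aa', 'bb', 'cc']
        print(delimitedList(Word(hexnums), delim=':', combine=True).parseString("AA:BB:CC"))
        # -> ['AA:BB:CC']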
    [compiled bytecode omitted - delimitedList body and the countedArray helper]
    Helper to define a counted list of expressions.
    This helper defines a pattern of the form::
        integer expr expr expr...
    where the leading integer tells how many expr expressions follow.
    The matched tokens are returned as a list of expr tokens - the leading count token is suppressed.
    
    If C{intExpr} is specified, it should be a pyparsing expression that produces an integer value.

    Example::
        countedArray(Word(alphas)).parseString('2 ab cd ef')  # -> ['ab', 'cd']

        # in this parser, the leading integer value is given in binary,
        # '10' indicating that 2 values are in the array
        binaryConstant = Word('01').setParseAction(lambda t: int(t[0], 2))
        countedArray(Word(alphas), intExpr=binaryConstant).parseString('10 ab cd ef')  # -> ['ab', 'cd']
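
    A runnable sketch (pyparsing 2.x assumed)::

        from pyparsing import Word, alphas, countedArray

        counted = countedArray(Word(alphas))
        print(counted.parseString('3 ab cd ef'))   # leading count is consumed, not returned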
    [compiled bytecode omitted - countedArray internals, _flatten, and the matchPreviousLiteral helper]
    Helper to define an expression that is indirectly defined from
    the tokens matched in a previous expression, that is, it looks
    for a 'repeat' of a previous expression.  For example::
        first = Word(nums)
        second = matchPreviousLiteral(first)
        matchExpr = first + ":" + second
    will match C{"1:1"}, but not C{"1:2"}.  Because this matches a
    previous literal, it will also match the leading C{"1:1"} in C{"1:10"}.
    If this is not desired, use C{matchPreviousExpr}.
    Do I{not} use with packrat parsing enabled.
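
    A runnable sketch (pyparsing 2.x assumed)::

        from pyparsing import ParseException, Word, matchPreviousLiteral, nums

        first = Word(nums)
        match_expr = first + ":" + matchPreviousLiteral(first)
        print(match_expr.parseString("1:1"))    # -> ['1', ':', '1']
        try:
            match_expr.parseString("1:2")
        except ParseException:
            print("'1:2' rejected, as expected")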
    [compiled bytecode omitted - matchPreviousLiteral.copyTokenToRepeater and the matchPreviousExpr helper]
    Helper to define an expression that is indirectly defined from
    the tokens matched in a previous expression, that is, it looks
    for a 'repeat' of a previous expression.  For example::
        first = Word(nums)
        second = matchPreviousExpr(first)
        matchExpr = first + ":" + second
    will match C{"1:1"}, but not C{"1:2"}.  Because this matches by
    expressions, it will I{not} match the leading C{"1:1"} in C{"1:10"};
    the expressions are evaluated first, and then compared, so
    C{"1"} is compared with C{"10"}.
    Do I{not} use with packrat parsing enabled.
    [compiled bytecode omitted - matchPreviousExpr internals and _escapeRegexRangeChars]
s�|rdd�}dd�}t�ndd�}dd�}t�g}t|t�rF|j�}n&t|tj�r\t|�}ntj	dt
dd�|svt�Sd	}x�|t|�d
k�r||}xnt
||d
d��D]N\}}	||	|�r�|||d
=Pq�|||	�r�|||d
=|j||	�|	}Pq�W|d
7}q|W|�r�|�r�yht|�tdj|��k�rZtd
djdd�|D���jdj|��Stdjdd�|D���jdj|��SWn&tk
�r�tj	dt
dd�YnXt�fdd�|D��jdj|��S)a�
    Helper to quickly define a set of alternative Literals. This helper makes sure to do
    longest-first testing when there is a conflict, regardless of the input order,
    but returns a C{L{MatchFirst}} for best performance.

    Parameters:
     - strs - a string of space-delimited literals, or a collection of string literals
     - caseless - (default=C{False}) - treat all literals as caseless
     - useRegex - (default=C{True}) - as an optimization, will generate a Regex
          object; otherwise, will generate a C{MatchFirst} object (if C{caseless=True}, or
          if creating a C{Regex} raises an exception)

    Example::
        comp_oper = oneOf("< = > <= >= !=")
        var = Word(alphas)
        number = Word(nums)
        term = var | number
        comparison_expr = term + comp_oper + term
        print(comparison_expr.searchString("B = 12  AA=23 B<=AA AA>12"))
    prints::
        [['B', '=', '12'], ['AA', '=', '23'], ['B', '<=', 'AA'], ['AA', '>', '12']]
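
    A self-contained version of the example above (pyparsing 2.x assumed)::

        from pyparsing import Word, alphas, nums, oneOf

        comp_oper = oneOf("< = > <= >= !=")     # '<=' is tested before '<'
        term = Word(alphas) | Word(nums)
        comparison_expr = term + comp_oper + term
        print(comparison_expr.searchString("B = 12  AA=23 B<=AA"))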
    [compiled bytecode omitted - oneOf internals and the dictOf helper]
    Helper to easily and clearly define a dictionary by specifying the respective patterns
    for the key and value.  Takes care of defining the C{L{Dict}}, C{L{ZeroOrMore}}, and C{L{Group}} tokens
    in the proper order.  The key pattern can include delimiting markers or punctuation,
    as long as they are suppressed, thereby leaving the significant key text.  The value
    pattern can include named results, so that the C{Dict} results can include named token
    fields.

    Example::
        text = "shape: SQUARE posn: upper left color: light blue texture: burlap"
        attr_expr = (label + Suppress(':') + OneOrMore(data_word, stopOn=label).setParseAction(' '.join))
        print(OneOrMore(attr_expr).parseString(text).dump())
        
        attr_label = label
        attr_value = Suppress(':') + OneOrMore(data_word, stopOn=label).setParseAction(' '.join)

        # similar to Dict, but simpler call format
        result = dictOf(attr_label, attr_value).parseString(text)
        print(result.dump())
        print(result['shape'])
        print(result.shape)  # object attribute access works too
        print(result.asDict())
    prints::
        [['shape', 'SQUARE'], ['posn', 'upper left'], ['color', 'light blue'], ['texture', 'burlap']]
        - color: light blue
        - posn: upper left
        - shape: SQUARE
        - texture: burlap
        SQUARE
        SQUARE
        {'color': 'light blue', 'shape': 'SQUARE', 'posn': 'upper left', 'texture': 'burlap'}
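
    A smaller, runnable sketch (pyparsing 2.x assumed; the names are illustrative)::

        from pyparsing import Suppress, Word, alphanums, alphas, dictOf

        attr_label = Word(alphas)
        attr_value = Suppress(':') + Word(alphanums)
        result = dictOf(attr_label, attr_value).parseString("shape: SQUARE color: blue")
        print(result.shape)       # -> 'SQUARE'
        print(result.asDict())    # -> {'shape': 'SQUARE', 'color': 'blue'}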
    [compiled bytecode omitted - dictOf body and the originalTextFor helper]
    Helper to return the original, untokenized text for a given expression.  Useful to
    restore the parsed fields of an HTML start tag into the raw tag text itself, or to
    revert separate tokens with intervening whitespace back to the original matching
    input text. By default, returns a string containing the original parsed text.
       
    If the optional C{asString} argument is passed as C{False}, then the return value is a 
    C{L{ParseResults}} containing any results names that were originally matched, and a 
    single token containing the original matched text from the input string.  So if 
    the expression passed to C{L{originalTextFor}} contains expressions with defined
    results names, you must set C{asString} to C{False} if you want to preserve those
    results name values.

    Example::
        src = "this is test <b> bold <i>text</i> </b> normal text "
        for tag in ("b","i"):
            opener,closer = makeHTMLTags(tag)
            patt = originalTextFor(opener + SkipTo(closer) + closer)
            print(patt.searchString(src)[0])
    prints::
        ['<b> bold <i>text</i> </b>']
        ['<i>text</i>']
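
    A self-contained version of the example above (pyparsing 2.x assumed)::

        from pyparsing import SkipTo, makeHTMLTags, originalTextFor

        src = "this is test <b> bold <i>text</i> </b> normal text "
        opener, closer = makeHTMLTags("b")
        patt = originalTextFor(opener + SkipTo(closer) + closer)
        print(patt.searchString(src)[0])    # -> ['<b> bold <i>text</i> </b>']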
    [compiled bytecode omitted - originalTextFor internals and the ungroup helper]
    Helper to undo pyparsing's default grouping of And expressions, even
    if all but one are non-empty.
    [compiled bytecode omitted - ungroup body and the locatedExpr helper]
    Helper to decorate a returned token with its starting and ending locations in the input string.
    This helper adds the following results names:
     - locn_start = location where matched expression begins
     - locn_end = location where matched expression ends
     - value = the actual parsed results

    Be careful if the input text contains C{<TAB>} characters; you may want to call
    C{L{ParserElement.parseWithTabs}}

    Example::
        wd = Word(alphas)
        for match in locatedExpr(wd).searchString("ljsdf123lksdjjf123lkkjj1222"):
            print(match)
    prints::
        [[0, 'ljsdf', 5]]
        [[8, 'lksdjjf', 15]]
        [[18, 'lkkjj', 23]]
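
    A runnable sketch (pyparsing 2.x assumed)::

        from pyparsing import Word, alphas, locatedExpr

        wd = Word(alphas)
        for match in locatedExpr(wd).searchString("ljsdf123lksdjjf123lkkjj1222"):
            print(match)    # each match carries locn_start, value, and locn_end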
    [compiled bytecode omitted - locatedExpr body, escaped-character expressions, and the srange helper]
    Helper to easily define string ranges for use in Word construction.  Borrows
    syntax from regexp '[]' string range definitions::
        srange("[0-9]")   -> "0123456789"
        srange("[a-z]")   -> "abcdefghijklmnopqrstuvwxyz"
        srange("[a-z$_]") -> "abcdefghijklmnopqrstuvwxyz$_"
    The input string must be enclosed in []'s, and the returned string is the expanded
    character set joined into a single string.
    The values enclosed in the []'s may be:
     - a single character
     - an escaped character with a leading backslash (such as C{\-} or C{\]})
     - an escaped hex character with a leading C{'\x'} (C{\x21}, which is a C{'!'} character) 
         (C{\0x##} is also supported for backwards compatibility) 
     - an escaped octal character with a leading C{'\0'} (C{\041}, which is a C{'!'} character)
     - a range of any of the above, separated by a dash (C{'a-z'}, etc.)
     - any combination of the above (C{'aeiouy'}, C{'a-zA-Z0-9_$'}, etc.)
    cSs<t|t�s|Sdjdd�tt|d�t|d�d�D��S)Nr�css|]}t|�VqdS)N)rt)r�r�rwrwrxr��sz+srange.<locals>.<lambda>.<locals>.<genexpr>rrr)rzr"r�r��ord)�prwrwrxry�szsrange.<locals>.<lambda>r�c3s|]}�|�VqdS)Nrw)r��part)�	_expandedrwrxr��szsrange.<locals>.<genexpr>N)r��_reBracketExprr�rwrK)r�rw)r{rxr_rs
 cs�fdd�}|S)zt
    Helper method for defining parse actions that require matching at a specific
    column in the input text.
    cs"t||��krt||d���dS)Nzmatched token not at column %d)r9r)r)Zlocnr1)r�rwrx�	verifyCol�sz!matchOnlyAtCol.<locals>.verifyColrw)r�r}rw)r�rxrM�scs�fdd�S)a�
    Helper method for common parse actions that simply return a literal value.  Especially
    useful when used with C{L{transformString<ParserElement.transformString>}()}.

    Example::
        num = Word(nums).setParseAction(lambda toks: int(toks[0]))
        na = oneOf("N/A NA").setParseAction(replaceWith(math.nan))
        term = na | num
        
        OneOrMore(term).parseString("324 234 N/A 234") # -> [324, 234, nan, 234]
    cs�gS)Nrw)r�r5rv)�replStrrwrxry�szreplaceWith.<locals>.<lambda>rw)r~rw)r~rxr\�scCs|ddd�S)a
    Helper parse action for removing quotation marks from parsed quoted strings.

    Example::
        # by default, quotation marks are included in parsed results
        quotedString.parseString("'Now is the Winter of our Discontent'") # -> ["'Now is the Winter of our Discontent'"]

        # use removeQuotes to strip quotation marks from parsed results
        quotedString.setParseAction(removeQuotes)
        quotedString.parseString("'Now is the Winter of our Discontent'") # -> ["Now is the Winter of our Discontent"]
    rrrrsrw)r�r5rvrwrwrxrZ�scsN��fdd�}yt�dt�d�j�}Wntk
rBt��}YnX||_|S)aG
    Helper to define a parse action by mapping a function to all elements of a ParseResults list. If any additional
    args are passed, they are forwarded to the given function as additional arguments after
    the token, as in C{hex_integer = Word(hexnums).setParseAction(tokenMap(int, 16))}, which will convert the
    parsed data to an integer using base 16.

    Example (compare the last example to the one in L{ParserElement.transformString})::
        hex_ints = OneOrMore(Word(hexnums)).setParseAction(tokenMap(int, 16))
        hex_ints.runTests('''
            00 11 22 aa FF 0a 0d 1a
            ''')
        
        upperword = Word(alphas).setParseAction(tokenMap(str.upper))
        OneOrMore(upperword).runTests('''
            my kingdom for a horse
            ''')

        wd = Word(alphas).setParseAction(tokenMap(str.title))
        OneOrMore(wd).setParseAction(' '.join).runTests('''
            now is the winter of our discontent made glorious summer by this sun of york
            ''')
    prints::
        00 11 22 aa FF 0a 0d 1a
        [0, 17, 34, 170, 255, 10, 13, 26]

        my kingdom for a horse
        ['MY', 'KINGDOM', 'FOR', 'A', 'HORSE']

        now is the winter of our discontent made glorious summer by this sun of york
        ['Now Is The Winter Of Our Discontent Made Glorious Summer By This Sun Of York']
    [compiled bytecode omitted - tokenMap internals, upcaseTokens/downcaseTokens, and the _makeTags helper]
t|d�S)a 
    Helper to construct opening and closing tag expressions for HTML, given a tag name. Matches
    tags in either upper or lower case, attributes with namespaces and with quoted or unquoted values.

    Example::
        text = '<td>More info at the <a href="http://pyparsing.wikispaces.com">pyparsing</a> wiki page</td>'
        # makeHTMLTags returns pyparsing expressions for the opening and closing tags as a 2-tuple
        a,a_end = makeHTMLTags("A")
        link_expr = a + SkipTo(a_end)("link_text") + a_end
        
        for link in link_expr.searchString(text):
            # attributes in the <A> tag (like "href" shown here) are also accessible as named results
            print(link.link_text, '->', link.href)
    prints::
        pyparsing -> http://pyparsing.wikispaces.com
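
    A self-contained version of the example above (pyparsing 2.x assumed)::

        from pyparsing import SkipTo, makeHTMLTags

        text = '<td>More info at the <a href="http://pyparsing.wikispaces.com">pyparsing</a> wiki page</td>'
        a, a_end = makeHTMLTags("A")
        link_expr = a + SkipTo(a_end)("link_text") + a_end
        for link in link_expr.searchString(text):
            print(link.link_text, '->', link.href)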
    [compiled bytecode omitted - makeHTMLTags body and the makeXMLTags helper]
    Helper to construct opening and closing tag expressions for XML, given a tag name. Matches
    tags only in the given upper/lower case.

    Example: similar to L{makeHTMLTags}
    [compiled bytecode omitted - makeXMLTags body and the withAttribute helper]
    Helper to create a validating parse action to be used with start tags created
    with C{L{makeXMLTags}} or C{L{makeHTMLTags}}. Use C{withAttribute} to qualify a starting tag
    with a required attribute value, to avoid false matches on common tags such as
    C{<TD>} or C{<DIV>}.

    Call C{withAttribute} with a series of attribute names and values. Specify the list
    of filter attributes names and values as:
     - keyword arguments, as in C{(align="right")}, or
     - as an explicit dict with C{**} operator, when an attribute name is also a Python
          reserved word, as in C{**{"class":"Customer", "align":"right"}}
     - a list of name-value tuples, as in ( ("ns1:class", "Customer"), ("ns2:align","right") )
    For attribute names with a namespace prefix, you must use the second form.  Attribute
    names are matched insensitive to upper/lower case.
       
    If just testing for C{class} (with or without a namespace), use C{L{withClass}}.

    To verify that the attribute exists, but without specifying a value, pass
    C{withAttribute.ANY_VALUE} as the value.

    Example::
        html = '''
            <div>
            Some text
            <div type="grid">1 4 0 1 0</div>
            <div type="graph">1,3 2,3 1,1</div>
            <div>this has no type</div>
            </div>
                
        '''
        div,div_end = makeHTMLTags("div")

        # only match div tag having a type attribute with value "grid"
        div_grid = div().setParseAction(withAttribute(type="grid"))
        grid_expr = div_grid + SkipTo(div | div_end)("body")
        for grid_header in grid_expr.searchString(html):
            print(grid_header.body)
        
        # construct a match with any div tag having a type attribute, regardless of the value
        div_any_type = div().setParseAction(withAttribute(type=withAttribute.ANY_VALUE))
        div_expr = div_any_type + SkipTo(div | div_end)("body")
        for div_header in div_expr.searchString(html):
            print(div_header.body)
    prints::
        1 4 0 1 0

        1 4 0 1 0
        1,3 2,3 1,1
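
    A condensed, runnable sketch (pyparsing 2.x assumed)::

        from pyparsing import SkipTo, makeHTMLTags, withAttribute

        html = '<div type="grid">1 4 0 1 0</div><div type="graph">1,3 2,3 1,1</div>'
        div, div_end = makeHTMLTags("div")
        div_grid = div().setParseAction(withAttribute(type="grid"))
        grid_expr = div_grid + SkipTo(div | div_end)("body")
        for grid_header in grid_expr.searchString(html):
            print(grid_header.body)     # -> 1 4 0 1 0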
    [compiled bytecode omitted - withAttribute internals and the withClass helper]
    Simplified version of C{L{withAttribute}} when matching on a div class - made
    difficult because C{class} is a reserved word in Python.

    Example::
        html = '''
            <div>
            Some text
            <div class="grid">1 4 0 1 0</div>
            <div class="graph">1,3 2,3 1,1</div>
            <div>this &lt;div&gt; has no class</div>
            </div>
                
        '''
        div,div_end = makeHTMLTags("div")
        div_grid = div().setParseAction(withClass("grid"))
        
        grid_expr = div_grid + SkipTo(div | div_end)("body")
        for grid_header in grid_expr.searchString(html):
            print(grid_header.body)
        
        div_any_type = div().setParseAction(withClass(withAttribute.ANY_VALUE))
        div_expr = div_any_type + SkipTo(div | div_end)("body")
        for div_header in div_expr.searchString(html):
            print(div_header.body)
    prints::
        1 4 0 1 0

        1 4 0 1 0
        1,3 2,3 1,1
    [compiled bytecode omitted - withClass body and the infixNotation helper]
    Helper method for constructing grammars of expressions made up of
    operators working in a precedence hierarchy.  Operators may be unary or
    binary, left- or right-associative.  Parse actions can also be attached
    to operator expressions. The generated parser will also recognize the use 
    of parentheses to override operator precedences (see example below).
    
    Note: if you define a deep operator list, you may see performance issues
    when using infixNotation. See L{ParserElement.enablePackrat} for a
    mechanism to potentially improve your parser performance.

    Parameters:
     - baseExpr - expression representing the most basic element for the nested expression
     - opList - list of tuples, one for each operator precedence level in the
      expression grammar; each tuple is of the form
      (opExpr, numTerms, rightLeftAssoc, parseAction), where:
       - opExpr is the pyparsing expression for the operator;
          may also be a string, which will be converted to a Literal;
          if numTerms is 3, opExpr is a tuple of two expressions, for the
          two operators separating the 3 terms
       - numTerms is the number of terms for this operator (must
          be 1, 2, or 3)
       - rightLeftAssoc is the indicator whether the operator is
          right or left associative, using the pyparsing-defined
          constants C{opAssoc.RIGHT} and C{opAssoc.LEFT}.
       - parseAction is the parse action to be associated with
          expressions matching this operator expression (the
          parse action tuple member may be omitted)
     - lpar - expression for matching left-parentheses (default=C{Suppress('(')})
     - rpar - expression for matching right-parentheses (default=C{Suppress(')')})

    Example::
        # simple example of four-function arithmetic with ints and variable names
        integer = pyparsing_common.signed_integer
        varname = pyparsing_common.identifier 
        
        arith_expr = infixNotation(integer | varname,
            [
            ('-', 1, opAssoc.RIGHT),
            (oneOf('* /'), 2, opAssoc.LEFT),
            (oneOf('+ -'), 2, opAssoc.LEFT),
            ])
        
        arith_expr.runTests('''
            5+3*6
            (5+3)*6
            -2--11
            ''', fullDump=False)
    prints::
        5+3*6
        [[5, '+', [3, '*', 6]]]

        (5+3)*6
        [[[5, '+', 3], '*', 6]]

        -2--11
        [[['-', 2], '-', ['-', 11]]]
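
    A runnable sketch of the same grammar (pyparsing 2.x assumed)::

        from pyparsing import Word, infixNotation, nums, oneOf, opAssoc

        integer = Word(nums).setParseAction(lambda t: int(t[0]))
        arith_expr = infixNotation(integer, [
            ('-', 1, opAssoc.RIGHT),
            (oneOf('* /'), 2, opAssoc.LEFT),
            (oneOf('+ -'), 2, opAssoc.LEFT),
        ])
        print(arith_expr.parseString("5+3*6"))    # -> [[5, '+', [3, '*', 6]]]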
    [compiled bytecode omitted - infixNotation internals, quoted-string expressions, and the nestedExpr helper]
    Helper method for defining nested lists enclosed in opening and closing
    delimiters ("(" and ")" are the default).

    Parameters:
     - opener - opening character for a nested list (default=C{"("}); can also be a pyparsing expression
     - closer - closing character for a nested list (default=C{")"}); can also be a pyparsing expression
     - content - expression for items within the nested lists (default=C{None})
     - ignoreExpr - expression for ignoring opening and closing delimiters (default=C{quotedString})

    If an expression is not provided for the content argument, the nested
    expression will capture all whitespace-delimited content between delimiters
    as a list of separate values.

    Use the C{ignoreExpr} argument to define expressions that may contain
    opening or closing characters that should not be treated as opening
    or closing characters for nesting, such as quotedString or a comment
    expression.  Specify multiple expressions using an C{L{Or}} or C{L{MatchFirst}}.
    The default is L{quotedString}, but if no expressions are to be ignored,
    then pass C{None} for this argument.

    Example::
        data_type = oneOf("void int short long char float double")
        decl_data_type = Combine(data_type + Optional(Word('*')))
        ident = Word(alphas+'_', alphanums+'_')
        number = pyparsing_common.number
        arg = Group(decl_data_type + ident)
        LPAR,RPAR = map(Suppress, "()")

        code_body = nestedExpr('{', '}', ignoreExpr=(quotedString | cStyleComment))

        c_function = (decl_data_type("type") 
                      + ident("name")
                      + LPAR + Optional(delimitedList(arg), [])("args") + RPAR 
                      + code_body("body"))
        c_function.ignore(cStyleComment)
        
        source_code = '''
            int is_odd(int x) { 
                return (x%2); 
            }
                
            int dec_to_hex(char hchar) { 
                if (hchar >= '0' && hchar <= '9') { 
                    return (ord(hchar)-ord('0')); 
                } else { 
                    return (10+ord(hchar)-ord('A'));
                } 
            }
        '''
        for func in c_function.searchString(source_code):
            print("%(name)s (%(type)s) args: %(args)s" % func)

    prints::
        is_odd (int) args: [['int', 'x']]
        dec_to_hex (int) args: [['char', 'hchar']]
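
    A minimal runnable sketch (pyparsing 2.x assumed)::

        from pyparsing import nestedExpr

        print(nestedExpr().parseString("(a (b c) d)"))
        # -> [['a', ['b', 'c'], 'd']]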
    [compiled bytecode omitted - nestedExpr internals and the indentedBlock helper]
	
    Helper method for defining space-delimited indentation blocks, such as
    those used to define block statements in Python source code.

    Parameters:
     - blockStatementExpr - expression defining syntax of statement that
            is repeated within the indented block
     - indentStack - list created by caller to manage indentation stack
            (multiple statementWithIndentedBlock expressions within a single grammar
            should share a common indentStack)
     - indent - boolean indicating whether block must be indented beyond the
            current level; set to False for block of left-most statements
            (default=C{True})

    A valid block must contain at least one C{blockStatement}.

    Example::
        data = '''
        def A(z):
          A1
          B = 100
          G = A2
          A2
          A3
        B
        def BB(a,b,c):
          BB1
          def BBA():
            bba1
            bba2
            bba3
        C
        D
        def spam(x,y):
             def eggs(z):
                 pass
        '''


        indentStack = [1]
        stmt = Forward()

        identifier = Word(alphas, alphanums)
        funcDecl = ("def" + identifier + Group( "(" + Optional( delimitedList(identifier) ) + ")" ) + ":")
        func_body = indentedBlock(stmt, indentStack)
        funcDef = Group( funcDecl + func_body )

        rvalue = Forward()
        funcCall = Group(identifier + "(" + Optional(delimitedList(rvalue)) + ")")
        rvalue << (funcCall | identifier | Word(nums))
        assignment = Group(identifier + "=" + rvalue)
        stmt << ( funcDef | assignment | identifier )

        module_body = OneOrMore(stmt)

        parseTree = module_body.parseString(data)
        parseTree.pprint()
    prints::
        [['def',
          'A',
          ['(', 'z', ')'],
          ':',
          [['A1'], [['B', '=', '100']], [['G', '=', 'A2']], ['A2'], ['A3']]],
         'B',
         ['def',
          'BB',
          ['(', 'a', 'b', 'c', ')'],
          ':',
          [['BB1'], [['def', 'BBA', ['(', ')'], ':', [['bba1'], ['bba2'], ['bba3']]]]]],
         'C',
         'D',
         ['def',
          'spam',
          ['(', 'x', 'y', ')'],
          ':',
          [[['def', 'eggs', ['(', 'z', ')'], ':', [['pass']]]]]]] 
    [compiled bytecode omitted - indentedBlock indent checkers, HTML entity and comment expressions, and the pyparsing_common class header]
���Z>d=S)Brna�

    Here are some common low-level expressions that may be useful in jump-starting parser development:
     - numeric forms (L{integers<integer>}, L{reals<real>}, L{scientific notation<sci_real>})
     - common L{programming identifiers<identifier>}
     - network addresses (L{MAC<mac_address>}, L{IPv4<ipv4_address>}, L{IPv6<ipv6_address>})
     - ISO8601 L{dates<iso8601_date>} and L{datetime<iso8601_datetime>}
     - L{UUID<uuid>}
     - L{comma-separated list<comma_separated_list>}
    Parse actions:
     - C{L{convertToInteger}}
     - C{L{convertToFloat}}
     - C{L{convertToDate}}
     - C{L{convertToDatetime}}
     - C{L{stripHTMLTags}}
     - C{L{upcaseTokens}}
     - C{L{downcaseTokens}}

    Example::
        pyparsing_common.number.runTests('''
            # any int or real number, returned as the appropriate type
            100
            -100
            +100
            3.14159
            6.02e23
            1e-12
            ''')

        pyparsing_common.fnumber.runTests('''
            # any int or real number, returned as float
            100
            -100
            +100
            3.14159
            6.02e23
            1e-12
            ''')

        pyparsing_common.hex_integer.runTests('''
            # hex numbers
            100
            FF
            ''')

        pyparsing_common.fraction.runTests('''
            # fractions
            1/2
            -3/4
            ''')

        pyparsing_common.mixed_integer.runTests('''
            # mixed fractions
            1
            1/2
            -3/4
            1-3/4
            ''')

        import uuid
        pyparsing_common.uuid.setParseAction(tokenMap(uuid.UUID))
        pyparsing_common.uuid.runTests('''
            # uuid
            12345678-1234-5678-1234-567812345678
            ''')
    prints::
        # any int or real number, returned as the appropriate type
        100
        [100]

        -100
        [-100]

        +100
        [100]

        3.14159
        [3.14159]

        6.02e23
        [6.02e+23]

        1e-12
        [1e-12]

        # any int or real number, returned as float
        100
        [100.0]

        -100
        [-100.0]

        +100
        [100.0]

        3.14159
        [3.14159]

        6.02e23
        [6.02e+23]

        1e-12
        [1e-12]

        # hex numbers
        100
        [256]

        FF
        [255]

        # fractions
        1/2
        [0.5]

        -3/4
        [-0.75]

        # mixed fractions
        1
        [1]

        1/2
        [0.5]

        -3/4
        [-0.75]

        1-3/4
        [1.75]

        # uuid
        12345678-1234-5678-1234-567812345678
        [UUID('12345678-1234-5678-1234-567812345678')]
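
    A few of these expressions in a runnable sketch (pyparsing 2.1.10+ assumed)::

        from pyparsing import pyparsing_common

        print(pyparsing_common.number.parseString("6.02e23")[0])        # float
        print(pyparsing_common.integer.parseString("100")[0])           # int
        print(pyparsing_common.ipv4_address.parseString("127.0.0.1")[0])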
    [compiled bytecode omitted - pyparsing_common numeric and network-address expressions]
        Helper to create a parse action for converting parsed date string to Python datetime.date

        Params -
         - fmt - format to be passed to datetime.strptime (default=C{"%Y-%m-%d"})

        Example::
            date_expr = pyparsing_common.iso8601_date.copy()
            date_expr.setParseAction(pyparsing_common.convertToDate())
            print(date_expr.parseString("1999-12-31"))
        prints::
            [datetime.date(1999, 12, 31)]
    [compiled bytecode omitted - convertToDate internals and the convertToDatetime helper]
        Helper to create a parse action for converting parsed datetime string to Python datetime.datetime

        Params -
         - fmt - format to be passed to datetime.strptime (default=C{"%Y-%m-%dT%H:%M:%S.%f"})

        Example::
            dt_expr = pyparsing_common.iso8601_datetime.copy()
            dt_expr.setParseAction(pyparsing_common.convertToDatetime())
            print(dt_expr.parseString("1999-12-31T23:59:59.999"))
        prints::
            [datetime.datetime(1999, 12, 31, 23, 59, 59, 999000)]
    [compiled bytecode omitted - convertToDatetime internals and the ISO8601/UUID expressions]
        Parse action to remove HTML tags from web page HTML source

        Example::
            # strip HTML links from normal text 
            text = '<td>More info at the <a href="http://pyparsing.wikispaces.com">pyparsing</a> wiki page</td>'
            td,td_end = makeHTMLTags("TD")
            table_text = td + SkipTo(td_end).setParseAction(pyparsing_common.stripHTMLTags)("body") + td_end
            
            print(table_text.parseString(text).body) # -> 'More info at the pyparsing wiki page'
    [compiled bytecode omitted - stripHTMLTags body, comma_separated_list, and the module self-test setup]
        # '*' as column list and dotted table name
        select * from SYS.XYZZY

        # caseless match on "SELECT", and casts back to "select"
        SELECT * from XYZZY, ABC

        # list of column names, and mixed case SELECT keyword
        Select AA,BB,CC from Sys.dual

        # multiple tables
        Select A, B, C from Sys.dual, Table2

        # invalid SELECT keyword - should fail
        Xelect A, B, C from Sys.dual

        # incomplete command - should fail
        Select

        # invalid column name - should fail
        Select ^^^ frox Sys.dual

        z]
        100
        -100
        +100
        3.14159
        6.02e23
        1e-12
        z 
        100
        FF
        z6
        12345678-1234-5678-1234-567812345678
        )rq)r`F)N)FT)T)r�)T)�r��__version__Z__versionTime__�
__author__r��weakrefrr�r�r~r�rdrr�r"r<r�r�_threadr�ImportErrorZ	threadingrr�Zordereddict�__all__r��version_infor;r�maxsizer�r{r��chrrtr�rGr�r�reversedr�r�rr5r
rrIZmaxintZxranger�Z__builtin__r�Zfnamer�rJr�r�r�r�r�r�Zascii_uppercaseZascii_lowercaser4rRrDr3rjr�Z	printablerVrKrrr!r#r&r�r"�MutableMapping�registerr9rJrGr/r2r4rQrMr$r,r
rrr�rQrrrrlr/r'r%r	r.r/rrrr*r)r1r0r rrrrrrrrIrr2rLrMrr(rrUr-r
rrr+rrbr@r<r�rOrNrrSrArgrhrjrirCrIrHrar`r�Z_escapedPuncZ_escapedHexCharZ_escapedOctChar�UNICODEZ_singleCharZ
_charRangermr|r_rMr\rZrmrdrBr�rKrLrer�rkrTr�r�rirUr>r^rYrcrPrfr5rWr7r6r�r�r�r�r;r[r8rEr�r]r?r=rFrXr�r�r:rnr�ZselectTokenZ	fromTokenZidentZ
columnNameZcolumnNameListZ
columnSpecZ	tableNameZ
tableNameListZ	simpleSQLr�r�r�r�r�r�rwrwrwrx�<module>=s�









8


@d&A= I
G3pLOD|M &#@sQ,A,	I#%&0
,	?#kZr

 (
 0 


"site-packages/__pycache__/decorator.cpython-36.opt-1.pyc000064400000030070147511334560017054 0ustar003

5*[Z�@�@s�dZddlmZddlZddlZddlZddlZddlZddlZdZ	ej
dkrdddlmZdd�Znej
d	d
�Zdd�Zd
d�Zy
ejZWnek
r�dd�ZYnXej
dd�Zdd�Zejd�ZGdd�de�Zffdd�Zd*dd�ZyddlmZWn"ek
�rddlmZYnXGdd�de�Zeej�Ze ej!�Z"e"dk�rhej#�rhd d!�Zee_n,e"dk�r|ej#�r|ne"d"k�r�d#d!�Zee_ee�Z$d$d%�Z%d&d'�Z&d(d)�Z'dS)+zT
Decorator module, see http://pypi.python.org/pypi/decorator
for the documentation.
�)�print_functionNz4.2.1�3)�getfullargspeccCs|jS)N)�__init__)�cls�r�/usr/lib/python3.6/decorator.py�get_init0sr	�FullArgSpecz5args varargs varkw defaults kwonlyargs kwonlydefaultscCstjtj|�gdf�S)z?A quick and dirty replacement for getfullargspec for Python 2.XN)r
�_make�inspect�
getargspec)�frrrr7srcCs|jjS)N)r�__func__)rrrrr	;scCsdS)NFr)rrrr�iscoroutinefunctionBsr�ArgSpeczargs varargs varkw defaultscCst|�}t|j|j|j|j�S)z$A replacement for inspect.getargspec)rr�args�varargs�varkw�defaults)r�specrrrr
Jsr
z\s*def\s*([_\w][_\w\d]*)\s*\(c@sZeZdZdZej�ZfZZZ	Z
ZZd
dd�Z
dd�Zddd	�Zeddd��ZdS)�
FunctionMakerz�
    An object with the ability to create functions with a given signature.
    It has attributes name, doc, module, signature, defaults, dict and
    methods update and make.
    NcCs�||_|�rf|j|_|jdkr$d|_|j|_|j|_tj|��rft	|�}t
|di�|_xdD]}	t||	t
||	��q\Wx&t
|j�D]\}
}t|d
|
|�q�Wt|j�}t|j�}
|jr�|jd|j�|
jd|j�n|jr�|jd�x.|jD]$}	|jd|	�|
jd
|	|	f�q�W|j�rB|jd|j�|
jd|j�dj|�|_dj|
�|_|jj�|_|�rr||_|dk	�r�||_|�r�||_|�r�||_|�r�||_|�r�||_t|d��s�td|��dS)Nz<lambda>�_lambda_�__annotations__rrrr�
kwonlyargs�kwonlydefaultszarg%d�*z%s=Nonez%s=%sz**z, �	signaturez%You are decorating a non function: %s)rrrrrr)�shortsignature�__name__�name�__doc__�doc�
__module__�moduler�
isfunctionr�getattr�annotations�setattr�	enumerater�listr�appendrr�joinr�__dict__�copy�dictr�hasattr�	TypeError)�self�funcr rrr"r$ZfuncdictZargspec�a�i�argZallargsZallshortargsrrrrasZ




zFunctionMaker.__init__cKs�|j|_t|dd�|_t|di�|_|j|_|jp4d|_t|dd�|_	yt
jd�}Wntk
rld}YnX|j
jdd�}t|d|�|_|jj|�dS)	z2Update the signature of func with the data in selfr"Nr/r'��?rr$)r rr&r!r-r�__defaults__r�__kwdefaults__r�sys�	_getframe�AttributeError�	f_globals�getr#�update)r2r3�kw�frameZcallermodulerrrr@�s
zFunctionMaker.updateFc

Ks|t|�}|pi}tj|�}|dkr2td|��|jd�}t|gdd�|jjd�D��}x$|D]}	|	dkrbtd	|	|f��qbW|j	d
�s�|d
7}dt
|j�f}
yt||
d�}t
||�Wn*td
tjd�t|tjd��YnX||}|�r||d<|j|f|�|S)zBMake a new function from a given template and update the signatureNz not a valid function template
%s�cSsg|]}|jd��qS)z *)�strip)�.0r6rrr�
<listcomp>�sz&FunctionMaker.make.<locals>.<listcomp>�,�_func_�_call_z%s is overridden in
%s�
z<decorator-gen-%d>ZsinglezError in generated code:)�fileZ
__source__)rHrI)�vars�DEF�search�SyntaxError�group�setr�split�	NameError�endswith�next�_compile_count�compile�exec�printr;�stderrr@)
r2Z	src_templ�evaldict�	addsource�attrs�src�mor �names�n�filename�coder3rrr�make�s4



zFunctionMaker.makeTcKs�t|t�r0|j�jdd�\}	}
|
dd�}d}nd}	d}|}|||	||||�}
djdd�|j�D��}|jd�}|r�t|�r�d|jd	d
�}nd|}|
j	|||f|�S)
z�
        Create a function from the strings name, signature and body.
        evaldict is the evaluation dictionary. If addsource is true an
        attribute __source__ is added to the result. The attributes attrs
        are added, if any.
        �(rCNrJcss|]}d|VqdS)z    Nr)rE�linerrr�	<genexpr>�sz'FunctionMaker.create.<locals>.<genexpr>rIz#async def %(name)s(%(signature)s):
�returnzreturn awaitzdef %(name)s(%(signature)s):
���)
�
isinstance�strrDrRr,�
splitlinesr?r�replacerd)r�objZbodyr[rr"r$r\r]r �restrr3r2Zibody�callerrrr�create�s	


zFunctionMaker.create)NNNNNNN)NF)NNNT)rr#�__qualname__r!�	itertools�countrVrrrrrrrr@rd�classmethodrqrrrrrTs
3
"rc	Csnt||d�}d}x0t|�D]$\}}d|}|||<||d7}qWtj|d|||d�}t|d�rj|j|_|S)zE
    decorate(func, caller) decorates a function using a caller.
    )rIrH�z_e%d_z, z,return _call_(_func_, %s%%(shortsignature)s))�__wrapped__rr)r/r)rrqr0rr)	r3rpZextrasr[Zesr5ZextraZexZfunrrr�decorate�s

rxc
Cs|dk	rt||�Sdf}}tj|�rB|jj�}d|j|jf}n~tj|�r�|jdkr\d}n|j}|j}|jj}t	|j
pzf�}dj|jj|||��}|r�|d7}|j
}n|j
jj�}|jj}t|td�}tjd	||fd
||f|||j|d�}	|�rd||	_
|	S)
z=decorator(caller) converts a caller function into a decoratorNrvzHdecorator(%s) converts functions/generators into factories of %s objectsz<lambda>rz, rG)Z_callZ
_decorate_z%s(func, %s)zhif func is None: return lambda func:  _decorate_(func, _call, (%s))
return _decorate_(func, _call, (%s)))r"r$rw)N)rxrZisclassr�lowerr%r!�__code__�co_argcount�lenr9r,�co_varnames�	__class__�__call__r/rrqr#)
rpZ_funcZdefaultargsrr r"�nargsZndefsr[�decrrr�	decorator�s:







r�)�_GeneratorContextManager)�GeneratorContextManagerc@seZdZdd�ZdS)�ContextManagercCstj|dt||d�|d�S)zContext manager decoratorz.with _self_: return _func_(%(shortsignature)s))Z_self_rH)rw)rrqr/)r2r3rrrr#szContextManager.__call__N)rr#rrrrrrrr�"sr��cOstj||||��S)N)r�r)r2�gr4�krrrr-sr�cOstj||||�S)N)r�r)r2r�r4r�rrrr3scCst|�S)N)�_contextmanager)r3rrr�contextmanager:sr�cCsRd}x:t|�D].\}}t||�r&d}Pt||�r|||<d}qW|rN|j|�dS)z_
    Append ``a`` to the list of the virtual ancestors, unless it is already
    included.
    TFN)r)�
issubclassr+)r4�
vancestors�add�j�varrrr+As

r+cs@ddj���tjdf�fdd�	����fdd�}d�|_|S)	zr
    Factory of decorators turning a function into a generic function
    dispatching on the given arguments.
    z(%s,)z, rvcs0|t|�t���r,tdt��t|�|f��dS)z5Make sure one passes the expected number of argumentszExpected %d arguments, got %d%sN)r|r1)Z	argumentsZwrong�msg)�
dispatch_argsrr�check[szdispatch_on.<locals>.checkcs�tt��j�}t��|ks&td���i����fdd����fdd����fdd�}��fdd	�}���fd
d�}tj�d�t|d
�|����|�d�
S)z4Decorator turning a function into a generic functionzUnknown dispatch arguments %scsv�|�dd�tt���D�}xH�D]@}x:t|||�D]*\}}}t||�r6||j�kr6t||�q6Wq$Wdd�|D�S)zU
            Get a list of sets of virtual ancestors for the given types
            cSsg|]}g�qSrr)rE�_rrrrFpszIdispatch_on.<locals>.gen_func_dec.<locals>.vancestors.<locals>.<listcomp>cSsg|]}t|��qSr)rQ)rE�rarrrrFus)�ranger|�zipr��mror+)�typesZras�types_�tZtype_r�)r�r��typemaprrr�ks
z5dispatch_on.<locals>.gen_func_dec.<locals>.vancestorscs��|�g}x�t|�|��D]p\}}t|�}|dkrFtd||f��n4|dkrr|\}td||fi�j�dd�}n|j�}|j|dd��qW|S)zG
            Get a list of virtual MROs, one for each type
            rCzAmbiguous dispatch for %s: %sr�Nri)r�r|�RuntimeError�typer�r+)r�Zlistsr�ZvasZn_vasr�r�)r�r�rr�	ancestorswsz4dispatch_on.<locals>.gen_func_dec.<locals>.ancestorscs������fdd�}|S)zU
            Decorator to register an implementation for the given types
            cs&�t|�jtjd|j�|��<|S)Nz in )rr�operator�ltr)r)r�r�r�rrr��sz@dispatch_on.<locals>.gen_func_dec.<locals>.register.<locals>.decr)r�r�)r�r�)r�r�register�sz3dispatch_on.<locals>.gen_func_dec.<locals>.registercs@�|�g}x.tj�|��D]}|jtdd�|D���qW|S)zI
            An utility to introspect the dispatch algorithm
            css|]}|jVqdS)N)r)rEr4rrrrg�szKdispatch_on.<locals>.gen_func_dec.<locals>.dispatch_info.<locals>.<genexpr>)rs�productr+�tuple)r�ZlstZanc)r�r�rr�
dispatch_info�s
z8dispatch_on.<locals>.gen_func_dec.<locals>.dispatch_infocs�tdd�|D��}y�|}Wntk
r2YnX|||�Stj�|��}t|�x(|D] }�j|�}|dk	rZ|||�SqZW�||�S)Ncss|]}t|�VqdS)N)r�)rEr6rrrrg�szGdispatch_on.<locals>.gen_func_dec.<locals>._dispatch.<locals>.<genexpr>)r��KeyErrorrsr�rUr?)r�rrAr�r�combinationsr�)r�r3r�rr�	_dispatch�s


z4dispatch_on.<locals>.gen_func_dec.<locals>._dispatchz#return _f_(%s, %%(shortsignature)s))Z_f_)r��defaultr�r�r�r�rw)rQrrrSrrqr/)r3Zargsetr�r�r�)r�r��dispatch_str)r�r3r�r�r�gen_func_decas
z!dispatch_on.<locals>.gen_func_dec�dispatch_on)r,r��ner)r�r�r)r�r�r�rr�Ss
W
r�)N)(r!Z
__future__r�rer;rr�rs�collections�__version__�versionrr	�
namedtupler
rr=rr
rWrM�objectrrxr��
contextlibr��ImportErrorr�r�rZinitr|rZn_argsrr�r�r+r�rrrr�<module>!s\




&


site-packages/__pycache__/configobj.cpython-36.opt-1.pyc000064400000163135147511334560017043 0ustar003

��S^�@sVddlZddlZddlZddlmZmZmZmZddlZddl	m
Z
daedKedLedMedNiZdddddddddddddddd�Z
eeeeed	�Zd
d�ZdZd
ZdZdZdZdZe�ZdOZd$Zd%Zd&Zd'd(d'd(d(dd'dddd(d(d)�Zd*d+�ZGd,d!�d!e�ZGd-d.�d.e�Z e �Z!d/d0�Z"Gd1d�de#�Z$Gd2d�de$�Z%Gd3d�de$�Z&Gd4d�de'�Z(Gd5d�de$�Z)Gd6d�de$�Z*Gd7d�de$�Z+Gd8d�de+�Z,Gd9d�de$�Z-Gd:d�de+�Z.Gd;d �d e$�Z/Gd<d=�d=e�Z0Gd>d?�d?e0�Z1Gd@dA�dAe0�Z2e1e2dB�Z3dCdD�Z4GdEdF�dFe5�Z6GdGd�de6�Z7GdHd�de�Z8dPdId"�Z9ffdJd#�Z:dS)Q�N)�BOM_UTF8�	BOM_UTF16�BOM_UTF16_BE�BOM_UTF16_LE)�__version__�utf_8�utf16_be�utf_16�utf16_le)r	�u16�utf16zutf-16r�	utf_16_bezutf-16ber
�	utf_16_lezutf-16ler�u8�utf�utf8zutf-8)rr	rr
NcCstj|j��dkS)Nr)�BOM_LIST�get�lower)�encoding�r�/usr/lib/python3.6/configobj.py�
match_utf8Dsrz'%s'z"%s"z%sz 
	'"z"""%s"""z'''%s'''�DEFAULT_INDENT_TYPE�DEFAULT_INTERPOLATION�ConfigObjError�NestingError�
ParseError�DuplicateError�ConfigspecError�	ConfigObj�	SimpleVal�InterpolationError�InterpolationLoopError�MissingInterpolationOption�RepeatSectionError�ReloadError�UnreprError�UnknownType�flatten_errors�get_extra_values�configparserz    �
TF)�
interpolation�raise_errors�list_values�create_empty�
file_error�
configspec�	stringify�indent_typer�default_encoding�unrepr�write_empty_valuescCs>tdkrddlad|}tj|�}|j�dj�dj�dS)Nrza=�)�compiler�parse�getChildren)�s�prrr�getObj�s

r>c@seZdZdS)r(N)�__name__�
__module__�__qualname__rrrrr(�sc@s\eZdZdd�Zdd�Zdd�Zdd�Zd	d
�Zdd�Zd
d�Z	dd�Z
dd�Zdd�ZdS)�BuildercCstdkrt|jj��t|�S)N)�mr(�	__class__r?)�self�orrr�build�sz
Builder.buildcCstt|j|j���S)N)�list�maprGr;)rErFrrr�
build_List�szBuilder.build_ListcCs|jS)N)�value)rErFrrr�build_Const�szBuilder.build_ConstcCs6i}tt|j|j���}x|D]}t|�||<qW|S)N)�iterrIrGr;�next)rErF�d�iZelrrr�
build_Dict�s

zBuilder.build_DictcCst|j|��S)N)�tuplerJ)rErFrrr�build_Tuple�szBuilder.build_TuplecCs6|jdkrdS|jdkrdS|jdkr*dStd��dS)N�None�TrueT�FalseFzUndefined Name)�namer()rErFrrr�
build_Name�s


zBuilder.build_NamecCshtt|j|j���\}}yt|�}Wntk
r@td��YnXt|t�sX|j	dkr`td��||S)NZAddg)
rHrIrLr;�float�	TypeErrorr(�
isinstance�complex�real)rErFr]�imagrrr�	build_Add�szBuilder.build_AddcCs|j|j�}t||j�S)N)rG�expr�getattrZattrname)rErF�parentrrr�
build_Getattr�szBuilder.build_GetattrcCs|j|j�d�S)Nr)rLr;)rErFrrr�build_UnarySub�szBuilder.build_UnarySubcCs|j|j�d�S)Nr)rLr;)rErFrrr�build_UnaryAdd�szBuilder.build_UnaryAddN)
r?r@rArGrJrLrQrSrXr_rcrdrerrrrrB�s
rBcCs|s|Sddl}|j|�S)Nr)�astZliteral_eval)r<rfrrrr6�sr6c@seZdZdZddd�ZdS)rzk
    This is the base class for all errors that ConfigObj raises.
    It is a subclass of SyntaxError.
    �NcCs||_||_tj||�dS)N)�line�line_number�SyntaxError�__init__)rE�messagerirhrrrrk�szConfigObjError.__init__)rgNrg)r?r@rA�__doc__rkrrrrr�sc@seZdZdZdS)rzE
    This error indicates a level of nesting that doesn't match.
    N)r?r@rArmrrrrr�sc@seZdZdZdS)rz�
    This error indicates that a line is badly written.
    It is neither a valid ``key = value`` line,
    nor a valid section marker line.
    N)r?r@rArmrrrrr�sc@seZdZdZdd�ZdS)r&zW
    A 'reload' operation failed.
    This exception is a subclass of ``IOError``.
    cCstj|d�dS)Nz#reload failed, filename is not set.)�IOErrorrk)rErrrrk�szReloadError.__init__N)r?r@rArmrkrrrrr&�sc@seZdZdZdS)rz:
    The keyword or section specified already exists.
    N)r?r@rArmrrrrr�sc@seZdZdZdS)rz7
    An error occured whilst parsing a configspec.
    N)r?r@rArmrrrrr�sc@seZdZdZdS)r"z,Base class for the two interpolation errors.N)r?r@rArmrrrrr"�sc@seZdZdZdd�ZdS)r#z=Maximum interpolation depth exceeded in string interpolation.cCstj|d|�dS)Nz*interpolation loop detected in value "%s".)r"rk)rE�optionrrrrkszInterpolationLoopError.__init__N)r?r@rArmrkrrrrr#sc@seZdZdZdS)r%zk
    This error indicates additional sections in a section with a
    ``__many__`` (repeated) section.
    N)r?r@rArmrrrrr%sc@seZdZdZdd�ZdS)r$z0A value specified for interpolation was missing.cCsd|}tj||�dS)Nz%missing option "%s" in interpolation.)r"rk)rEro�msgrrrrksz#MissingInterpolationOption.__init__N)r?r@rArmrkrrrrr$sc@seZdZdZdS)r'z An error parsing in unrepr mode.N)r?r@rArmrrrrr'sc@s>eZdZdZejd�ZdZdd�Zdd�Z	dd	�Z
d
d�ZdS)
�InterpolationEnginez�
    A helper class to help perform string interpolation.

    This class is an abstract base class; its descendants perform
    the actual work.
    z
%\(([^)]*)\)s�%cCs
||_dS)N)�section)rErsrrrrk*szInterpolationEngine.__init__cs0�j|kr|S��fdd���||�ji�}|S)Ncs�||jf|krt|��d|||jf<�jj|�}xz|r��j|�\}}}|dkrT|}n�||||�}|j�\}	}
dj|d|	�|||
d�f�}|	t|�}�jj||�}q2W|||jf=|S)axThe function that does the actual work.

            ``value``: the string we're trying to interpolate.
            ``section``: the section in which that string was found
            ``backtrail``: a dict to keep track of where we've been,
            to detect and prevent infinite recursion loops

            This is similar to a depth-first-search algorithm.
            r8Nrg)rWr#�_KEYCRE�search�_parse_match�span�join�len)�keyrKrsZ	backtrail�match�k�vr<Zreplacement�start�endZnew_search_start)�recursive_interpolaterErrr�4s z>InterpolationEngine.interpolate.<locals>.recursive_interpolate)�_cookiers)rErzrKr)r�rEr�interpolate/s

,zInterpolationEngine.interpolatecCs�|jjj}d|jj_|j}x^|j|�}|dk	r<t|t�r<P|jdi�j|�}|dk	rdt|t�rdP|j|krpP|j}qW||jj_|dkr�t|��||fS)z�Helper function to fetch values from owning section.

        Returns a 2-tuple: the value, and the section where it was found.
        FN�DEFAULT)rs�mainr-rr[�Sectionrbr$)rErzZsave_interpZcurrent_section�valrrr�_fetchds"





zInterpolationEngine._fetchcCs
t��dS)a�Implementation-dependent helper function.

        Will be passed a match object corresponding to the interpolation
        key we just found (e.g., "%(foo)s" or "$foo"). Should look up that
        key in the appropriate config file section (using the ``_fetch()``
        helper function) and return a 3-tuple: (key, value, section)

        ``key`` is the name of the key we're looking for
        ``value`` is the value found for that key
        ``section`` is a reference to the section where it was found

        ``key`` and ``section`` should be None if no further
        interpolation should be performed on the resulting value
        (e.g., if we interpolated "$$" and returned "$").
        N)�NotImplementedError)rEr{rrrrv�sz InterpolationEngine._parse_matchN)r?r@rArm�re�compilertr�rkr�r�rvrrrrrqs
5"rqc@s&eZdZdZdZejd�Zdd�ZdS)�ConfigParserInterpolationzBehaves like ConfigParser.rrz
%\(([^)]*)\)scCs"|jd�}|j|�\}}|||fS)Nr8)�groupr�)rEr{rzrKrsrrrrv�s
z&ConfigParserInterpolation._parse_matchN)	r?r@rArmr�r�r�rtrvrrrrr��s
r�c@s4eZdZdZdZdZejdejej	B�Z
dd�ZdS)�TemplateInterpolationzBehaves like string.Template.�$z�
        \$(?:
          (?P<escaped>\$)              |   # Two $ signs
          (?P<named>[_a-z][_a-z0-9]*)  |   # $name format
          {(?P<braced>[^}]*)}              # ${name} format
        )
        cCs\|jd�p|jd�}|dk	r4|j|�\}}|||fS|jd�dk	rNd|jdfSd|j�dfS)NZnamedZbracedZescaped)r�r��
_delimiter)rEr{rzrKrsrrrrv�s
z"TemplateInterpolation._parse_matchN)r?r@rArmr�r�r�r��
IGNORECASE�VERBOSErtrvrrrrr��sr�)r+�templatecGs|j|f|��S)N)�__new__)�cls�argsrrr�
__newobj__�sr�c@s$eZdZdZdd�Zdd�ZdDdd�Zd	d
�Zdd�Zd
d�Z	dEdd�Z
dd�ZdFdd�Zdd�Z
efdd�Zdd�Zdd�ZdGdd�Zd d!�Zd"d#�Zd$d%�Zd&d'�Zd(d)�ZeZd*d+�Zd,d-�ZeZd.e_d/d0�Zd1d2�Zd3d4�ZdHd6d7�Zd8d9�Z d:d;�Z!d<d=�Z"d>d?�Z#d@dA�Z$dBdC�Z%dS)Ir�a�
    A dictionary-like object that represents a section in a config file.
    
    It does string interpolation if the 'interpolation' attribute
    of the 'main' object is set to True.
    
    Interpolation is tried first from this object, then from the 'DEFAULT'
    section of this object, next from the parent and its 'DEFAULT' section,
    and so on until the main object is reached.
    
    A Section will behave like an ordered dictionary - following the
    order of the ``scalars`` and ``sections`` attributes.
    You can use this to change the order of members.
    
    Iteration follows the order: scalars, then sections.
    cCs$tj||d�|jj|d�dS)Nrr8)�dict�update�__dict__)rE�staterrr�__setstate__�szSection.__setstate__cCst|�|jf}t|jf|fS)N)r�r�r�rD)rEr�rrr�
__reduce__�szSection.__reduce__NcCsX|dkri}tj|�||_||_||_||_|j�x|j�D]\}}|||<q@WdS)z�
        * parent is the section above
        * depth is the depth level of this section
        * main is the main ConfigObj
        * indict is a dictionary to initialise the section with
        N)r�rkrbr��depthrW�_initialise�items)rErbr�r��indictrW�entryrKrrrrk�s
zSection.__init__cCs:g|_g|_i|_i|_d|_g|_i|_g|_d|_dS)NF)	�scalars�sections�comments�inline_commentsr2�defaults�default_values�extra_values�_created)rErrrr�szSection._initialisecCsvy
|j}Wn^tk
rh|jj}|dkr.t}|j�}tj|d�}|dkrVd|j_|S||�}|_YnX|j||�S)NTF)	Z_interpolation_engine�AttributeErrorr�r-rr�interpolation_enginesrr�)rErzrKZenginerWZclass_rrr�_interpolates
zSection._interpolatecsftj���}�jjrbt|tj�r,�j�|�St|t�rb��fdd���fdd�|D�}||krb|S|S)z+Fetch the item and do string interpolation.cst|tj�r�j�|�S|S)N)r[�six�string_typesr�)r�)rzrErr�_check/sz#Section.__getitem__.<locals>._checkcsg|]}�|��qSrr)�.0r�)r�rr�
<listcomp>3sz'Section.__getitem__.<locals>.<listcomp>)	r��__getitem__r�r-r[r�r�r�rH)rErzr��newr)r�rzrErr�(s
zSection.__getitem__Fc
CsNt|tj�std|��||jkr6g|j|<d|j|<||jkrL|jj|�t|t�rz||krj|j	j
|�tj|||�n�t|t�r�|r�||kr�|j	j
|�|j
d}tj||t|||j||d��n�||kr�|jj
|�|jj�s<t|tj�r�nHt|ttf��r0x6|D] }t|tj��s
td|���q
Wntd|��tj|||�dS)a�
        Correctly set a value.
        
        Making dictionary values Section instances.
        (We have to special case 'Section' instances - which are also dicts)
        
        Keys must be strings.
        Values need only be strings (or lists of strings) if
        ``main.stringify`` is set.
        
        ``unrepr`` must be set when setting a value to a dictionary, without
        creating a new sub-section.
        zThe key "%s" is not a string.rgr8)r�rWzValue is not a string "%s".N)r[r�r��
ValueErrorr�r�r��remover�r��appendr��__setitem__r�r�r�r3rHrRrZ)rErzrKr6Z	new_depthr�rrrr�9sF







zSection.__setitem__cCsDtj||�||jkr$|jj|�n|jj|�|j|=|j|=dS)z-Remove items from the sequence when deleting.N)r��__delitem__r�r�r�r�r�)rErzrrrr�ts
zSection.__delitem__cCs"y||Stk
r|SXdS)z>A version of ``get`` that doesn't bypass string interpolation.N)�KeyError)rErz�defaultrrrrszSection.getcCsx|D]}||||<qWdS)zD
        A version of update that uses our ``__setitem__``.
        Nr)rEr�r�rrrr��s
zSection.updatecCs:y||}Wn"tk
r.|tkr&�|}YnX||=|S)z�
        'D.pop(k[,d]) -> v, remove specified key and return the corresponding value.
        If key is not found, d is returned if given, otherwise KeyError is raised'
        )r��MISSING)rErzr�r�rrr�pop�s
zSection.popcCs6|j|j}|std��|d}||}||=||fS)zPops the first (key,val)z": 'popitem(): dictionary is empty'r)r�r�r�)rEZsequencerzr�rrr�popitem�szSection.popitemcCs8tj|�g|_g|_i|_i|_d|_g|_g|_dS)z�
        A version of clear that also affects scalars/sections
        Also clears comments and configspec.
        
        Leaves other attributes alone :
            depth/main/parent are not affected
        N)	r��clearr�r�r�r�r2r�r�)rErrrr��s
z
Section.clearcCs.y||Stk
r(|||<||SXdS)z:A version of setdefault that sets sequence if appropriate.N)r�)rErzr�rrr�
setdefault�s
zSection.setdefaultcCstt|j|jt|j����S)z8D.items() -> list of D's (key, value) pairs, as 2-tuples)rH�zipr�r��values)rErrrr��sz
Section.itemscCs|j|jS)zD.keys() -> list of D's keys)r�r�)rErrr�keys�szSection.keyscs�fdd��j�jD�S)z D.values() -> list of D's valuescsg|]}�|�qSrr)r�rz)rErrr��sz"Section.values.<locals>.<listcomp>)r�r�)rEr)rErr��szSection.valuescCstt|j���S)z=D.iteritems() -> an iterator over the (key, value) items of D)rMrHr�)rErrr�	iteritems�szSection.iteritemscCst|j|j�S)z.D.iterkeys() -> an iterator over the keys of D)rMr�r�)rErrr�iterkeys�szSection.iterkeyscCstt|j���S)z2D.itervalues() -> an iterator over the values of D)rMrHr�)rErrr�
itervalues�szSection.itervaluescs0�fdd��ddj�fdd��j�jD��S)zx.__repr__() <==> repr(x)cs*y�|Stk
r$tj�|�SXdS)N)r$r�r�)rz)rErr�_getval�sz!Section.__repr__.<locals>._getvalz{%s}z, cs$g|]}dt|�t�|��f�qS)z%s: %s)�repr)r�rz)r�rrr��sz$Section.__repr__.<locals>.<listcomp>)rxr�r�)rEr)r�rEr�__repr__�szSection.__repr__zx.__str__() <==> str(x)cCs`i}xV|D]N}||}t|t�r*|j�}n&t|t�r>t|�}nt|t�rPt|�}|||<q
W|S)a0
        Return a deepcopy of self as a dictionary.
        
        All members that are ``Section`` instances are recursively turned to
        ordinary dictionaries - by calling their ``dict`` method.
        
        >>> n = a.dict()
        >>> n == a
        1
        >>> n is a
        0
        )r[r�r�rHrR)rEZnewdictr��
this_entryrrrr��s






zSection.dictcCsVxPt|j��D]@\}}||krFt||t�rFt|t�rF||j|�q|||<qWdS)aQ
        A recursive update - useful for merging config files.
        
        >>> a = '''[section1]
        ...     option1 = True
        ...     [[subsection]]
        ...     more_options = False
        ...     # end of file'''.splitlines()
        >>> b = '''# File is user.ini
        ...     [section1]
        ...     option1 = False
        ...     # end of file'''.splitlines()
        >>> c1 = ConfigObj(b)
        >>> c2 = ConfigObj(a)
        >>> c2.merge(c1)
        >>> c2
        ConfigObj({'section1': {'option1': 'False', 'subsection': {'more_options': 'False'}}})
        N)rHr�r[r��merge)rEr�rzr�rrrr�s

z
Section.mergecCs�||jkr|j}n||jkr$|j}ntd|��|j|�}||}tj||�tj|||�|j|�|j||�|j	|}|j
|}|j	|=|j
|=||j	|<||j
|<dS)a
        Change a keyname to another, without changing position in sequence.
        
        Implemented so that transformations can be made on keys,
        as well as on values. (used by encode and decode)
        
        Also renames comments.
        zKey "%s" not found.N)r�r�r��indexr�r�r�r��insertr�r�)rEZoldkeyZnewkey�the_list�posr�ZcommZinline_commentrrr�rename,s"	






zSection.renameTc	Ksi}xttt|j��D]b}|j|}y$|||f|�}|j|}|||<Wqtk
rt|r^�n|j|}d||<YqXqWx�tt|j��D]~}|j|}|r�y|||f|�Wn.tk
r�|rƂn|j|}d||<YnX|j|}||j|f||d�|��||<q�W|S)a�
        Walk every member and call a function on the keyword and value.
        
        Return a dictionary of the return values
        
        If the function raises an exception, raise the error
        unless ``raise_errors=False``, in which case set the return value to
        ``False``.
        
        Any unrecognised keyword arguments you pass to walk will be passed on
        to the function you pass in.
        
        Note: if ``call_on_sections`` is ``True`` then - on encountering a
        subsection, *first* the function is called for the *whole* subsection,
        and then recurses into its members. This means your function must be
        able to handle strings, dictionaries and lists. This allows you
        to change the key of subsections as well as for ordinary members. The
        return value when called on the whole subsection has to be discarded.
        
        See  the encode and decode methods for examples, including functions.
        
        .. admonition:: caution
        
            You can use ``walk`` to transform the names of members of a section
            but you mustn't add or delete members.
        
        >>> config = '''[XXXXsection]
        ... XXXXkey = XXXXvalue'''.splitlines()
        >>> cfg = ConfigObj(config)
        >>> cfg
        ConfigObj({'XXXXsection': {'XXXXkey': 'XXXXvalue'}})
        >>> def transform(section, key):
        ...     val = section[key]
        ...     newkey = key.replace('XXXX', 'CLIENT1')
        ...     section.rename(key, newkey)
        ...     if isinstance(val, (tuple, list, dict)):
        ...         pass
        ...     else:
        ...         val = val.replace('XXXX', 'CLIENT1')
        ...         section[newkey] = val
        >>> cfg.walk(transform, call_on_sections=True)
        {'CLIENT1section': {'CLIENT1key': None}}
        >>> cfg
        ConfigObj({'CLIENT1section': {'CLIENT1key': 'CLIENT1value'}})
        F)r.�call_on_sections)�rangeryr��	Exceptionr��walk)	rEZfunctionr.r�Zkeywargs�outrPr�r�rrrr�Js:/





zSection.walkcCsn||}|dkrdS|dkr dSy(t|tj�s6t��n|jj|j�SWn tk
rhtd|��YnXdS)a_
        Accepts a key as input. The corresponding value must be a string or
        the objects (``True`` or 1) or (``False`` or 0). We allow 0 and 1 to
        retain compatibility with Python 2.2.
        
        If the string is one of  ``True``, ``On``, ``Yes``, or ``1`` it returns 
        ``True``.
        
        If the string is one of  ``False``, ``Off``, ``No``, or ``0`` it returns 
        ``False``.
        
        ``as_bool`` is not case sensitive.
        
        Any other input will raise a ``ValueError``.
        
        >>> a = ConfigObj()
        >>> a['a'] = 'fish'
        >>> a.as_bool('a')
        Traceback (most recent call last):
        ValueError: Value "fish" is neither True nor False
        >>> a['b'] = 'True'
        >>> a.as_bool('b')
        1
        >>> a['b'] = 'off'
        >>> a.as_bool('b')
        0
        TFz$Value "%s" is neither True nor FalseN)r[r�r�r�r��_boolsrr�)rErzr�rrr�as_bool�szSection.as_boolcCst||�S)ai
        A convenience method which coerces the specified value to an integer.
        
        If the value is an invalid literal for ``int``, a ``ValueError`` will
        be raised.
        
        >>> a = ConfigObj()
        >>> a['a'] = 'fish'
        >>> a.as_int('a')
        Traceback (most recent call last):
        ValueError: invalid literal for int() with base 10: 'fish'
        >>> a['b'] = '1'
        >>> a.as_int('b')
        1
        >>> a['b'] = '3.2'
        >>> a.as_int('b')
        Traceback (most recent call last):
        ValueError: invalid literal for int() with base 10: '3.2'
        )�int)rErzrrr�as_int�szSection.as_intcCst||�S)a>
        A convenience method which coerces the specified value to a float.
        
        If the value is an invalid literal for ``float``, a ``ValueError`` will
        be raised.
        
        >>> a = ConfigObj()
        >>> a['a'] = 'fish'
        >>> a.as_float('a')  #doctest: +IGNORE_EXCEPTION_DETAIL
        Traceback (most recent call last):
        ValueError: invalid literal for float(): fish
        >>> a['b'] = '1'
        >>> a.as_float('b')
        1.0
        >>> a['b'] = '3.2'
        >>> a.as_float('b')  #doctest: +ELLIPSIS
        3.2...
        )rY)rErzrrr�as_float�szSection.as_floatcCs$||}t|ttf�rt|�S|gS)aU
        A convenience method which fetches the specified value, guaranteeing
        that it is a list.
        
        >>> a = ConfigObj()
        >>> a['a'] = 1
        >>> a.as_list('a')
        [1]
        >>> a['a'] = (1,)
        >>> a.as_list('a')
        [1]
        >>> a['a'] = [1]
        >>> a.as_list('a')
        [1]
        )r[rRrH)rErz�resultrrr�as_list�szSection.as_listcCs2|j|}tj|||�||jkr.|jj|�|S)a
        Restore (and return) default value for the specified key.
        
        This method will only work for a ConfigObj that was created
        with a configspec and has been validated.
        
        If there is no default value for this key, ``KeyError`` is raised.
        )r�r�r�r�r�)rErzr�rrr�restore_defaults
	

zSection.restore_defaultcCs:x|jD]}|j|�qWx|jD]}||j�q"WdS)a'
        Recursively restore default values to all members
        that have them.
        
        This method will only work for a ConfigObj that was created
        with a configspec and has been validated.
        
        It doesn't delete or modify entries without default values.
        N)r�r�r��restore_defaults)rErzrsrrrr�s
zSection.restore_defaults)NN)F)N)N)TF)&r?r@rArmr�r�rkr�r�r�r�r�rr�r�r�r�r�r�r�r�r�r�r��__iter__r�r��__str__r�r�r�r�r�r�r�r�r�r�rrrrr��sH

;

	

T,r�c@s�eZdZdZejdej�Zejdej�Zejdej�Z	ejdej�Z
ejdej�Zejd�Zejd�Z
ejd	�Zejd
�Zeefe
efd�Zdd
dd
dd
dd
d�ZdFdd�Zdd�ZdGdd�Zdd�Zdd�Zdd�Zdd�Zdd�Zd d!�Zd"d#�Zd$d%�Zd&d'�Zd(d)�ZdHd*d+�Zd,d-�Z d.d/�Z!d0d1�Z"d2d3�Z#d4d5�Z$d6d7�Z%d8d9�Z&d:d;�Z'd<d=�Z(dId>d?�Z)dJd@dA�Z*dBdC�Z+dDdE�Z,dS)Kr z2An object to read, create, and write config files.a�^ # line start
        (\s*)                   # indentation
        (                       # keyword
            (?:".*?")|          # double quotes
            (?:'.*?')|          # single quotes
            (?:[^'"=].*?)       # no quotes
        )
        \s*=\s*                 # divider
        (.*)                    # value (including list values and comments)
        $   # line end
        a=^
        (\s*)                     # 1: indentation
        ((?:\[\s*)+)              # 2: section marker open
        (                         # 3: section name open
            (?:"\s*\S.*?\s*")|    # at least one non-space with double quotes
            (?:'\s*\S.*?\s*')|    # at least one non-space with single quotes
            (?:[^'"\s].*?)        # at least one non-space unquoted
        )                         # section name close
        ((?:\s*\])+)              # 4: section marker close
        \s*(\#.*)?                # 5: optional comment
        $a�^
        (?:
            (?:
                (
                    (?:
                        (?:
                            (?:".*?")|              # double quotes
                            (?:'.*?')|              # single quotes
                            (?:[^'",\#][^,\#]*?)    # unquoted
                        )
                        \s*,\s*                     # comma
                    )*      # match all list items ending in a comma (if any)
                )
                (
                    (?:".*?")|                      # double quotes
                    (?:'.*?')|                      # single quotes
                    (?:[^'",\#\s][^,]*?)|           # unquoted
                    (?:(?<!,))                      # Empty value
                )?          # last item in a list - or string value
            )|
            (,)             # alternatively a single comma - empty list
        )
        \s*(\#.*)?          # optional comment
        $z�
        (
            (?:".*?")|          # double quotes
            (?:'.*?')|          # single quotes
            (?:[^'",\#]?.*?)       # unquoted
        )
        \s*,\s*                 # comma
        a^
        (
            (?:".*?")|          # double quotes
            (?:'.*?')|          # single quotes
            (?:[^'"\#].*?)|     # unquoted
            (?:)                # Empty value
        )
        \s*(\#.*)?              # optional comment
        $z^'''(.*?)'''\s*(#.*)?$z^"""(.*?)"""\s*(#.*)?$z^(.*?)'''\s*(#.*)?$z^(.*?)"""\s*(#.*)?$)z'''z"""TF)�yes�noZonZoff�1�0�trueZfalseNc
Cs�||_tj||d|�|pg}|||||||	|
|||
|d�}|dkrJ|}n|ddl}|jdtdd�x |D]}|tkrhtd|��qhWx@ttj	��D]0\}}||kr�|||<||}||kr�|||<q�W|r�d|d	<|j
|�|d
}||_|j||�dS)a�
        Parse a config file or create a config file object.
        
        ``ConfigObj(infile=None, configspec=None, encoding=None,
                    interpolation=True, raise_errors=False, list_values=True,
                    create_empty=False, file_error=False, stringify=True,
                    indent_type=None, default_encoding=None, unrepr=False,
                    write_empty_values=False, _inspec=False)``
        r)r2rr-r.r/r0r1r3r4r5r6r7NzUPassing in an options dictionary to ConfigObj() is deprecated. Use **options instead.�)�
stacklevelzUnrecognised option "%s".Fr/r2)
�_inspecr�rk�warnings�warn�DeprecationWarning�OPTION_DEFAULTSrZrHr�r��_original_configspec�_load)rE�infile�optionsr2rr-r.r/r0r1r3r4r5r6r7r�Z_optionsr�r�rKZ
keyword_valuerrrrk�s<


zConfigObj.__init__cs"t|tj�r�||_tjj|�rBt|d��}|j�p4g}WdQRXn@|j	rXt
d|j��n*|jr~t|d��}|jd�WdQRXg}n�t|t
tf�r�t
|�}n�t|t��rt|t�rʇfdd���||�nx|D]}||||<q�W|`|dk	r�|j|�nd|_dSt|dt�tk	�r(|j��p$g}ntd��|�r�|j|�}xN|D]F}|�sF|ddk�rd�qFx"dD]}|j|��rj||_P�qjWP�qFWd
d�|D�}|j|�|j�r�d|jdj}t|j�d	k�r�d|}	t|	�}
n
|jd}
|j|
_||
_ |
�|`|dk�rd|_n
|j|�dS)N�rbzConfig file not found: "%s".�wrgcsJx|jD]}||||<qWx(|jD]}i||<�||||�q$WdS)N)r�r�)Z
in_section�this_sectionr�rs)�set_sectionrrr��s
z$ConfigObj._load.<locals>.set_section�readz>infile must be a filename, file like object, or list of lines.r8�
�
�
cSsg|]}|jd��qS)z
)�rstrip)r�rhrrrr�sz#ConfigObj._load.<locals>.<listcomp>zat line %s.rz2Parsing failed with several errors.
First error %s���)r�r�)r�r�r�)!r[r�r��filename�os�path�isfile�open�	readlinesr1rnr0�writerHrRr�r �_errors�_handle_configspecr2rar�r�rZ�_handle_bom�endswith�newlines�_parseriryr�errors�config)rEr�r2�hZcontentr�rhr�inforp�errorr)r�rr��sh









zConfigObj._loadcCs�|dkrt}d|_g|_|d|_|d|_|d|_|d|_|d|_|d|_|d|_	|d|_
|d	|_d
|_d|_
|d|_|d|_g|_g|_d|_|jr�d
|_tj|�dS)
Nr.r-r/r0r1r3r4rr5Fr7r6)r�r�rr.r-r/r0r1r3r4rr5�BOMrr7r6�initial_comment�
final_commentr2r�r�r�)rEr�rrrr�0s.










zConfigObj._initialisecs0�fdd��ddj�fdd��j�jD��S)Ncs*y�|Stk
r$tj�|�SXdS)N)r$r�r�)rz)rErrr�Qsz#ConfigObj.__repr__.<locals>._getvalzConfigObj({%s})z, cs$g|]}dt|�t�|��f�qS)z%s: %s)r�)r�rz)r�rrr�Wsz&ConfigObj.__repr__.<locals>.<listcomp>)rxr�r�)rEr)r�rErr�PszConfigObj.__repr__cCsH|jdk	r&|jj�tkr&|j||j�St|ttf�r>|d}n|}t|tj�r\|j||j�S|jdk	�r(t|jj�}|dkr�x8tt	j
��D](\}\}}|s�q�|j|�r�|j||�Sq�W|j||j�St|}|j|�s�|j||j�S|t
|�d�}t|ttf��r||d<n|}d|_|j||j�Sx�tt	j
��D]�\}\}}t|tj��s6|j|��rf�q6n�||_|�s�d|_|t
|�d�}t|ttf��r�||d<n|}t|tj��r�|jd�St|tj��r�|jd�jd�S|j|d�S|j||�S�q6Wtj�rt|t��r|j|d�St|tj��r8|jd�jd�S|j|d�SdS)a1
        Handle any BOM, and decode if necessary.
        
        If an encoding is specified, that *must* be used - but the BOM should
        still be removed (and the BOM attribute set).
        
        (If the encoding is wrongly specified, then a BOM for an alternative
        encoding won't be discovered or removed.)
        
        If an encoding is not specified, UTF8 or UTF16 BOM will be detected and
        removed. The BOM attribute will be set. UTF16 will be decoded to
        unicode.
        
        NOTE: This method must not be called with an empty ``infile``.
        
        Specifying the *wrong* encoding is likely to cause a
        ``UnicodeDecodeError``.
        
        ``infile`` must always be returned as a list of lines, but may be
        passed in as a single string.
        Nrr	Tzutf-8)rrr�_decoder[rHrRr�Z	text_type�BOMSr��
startswith�BOM_SETryr�binary_type�
splitlines�decodeZPY2�str)rEr�rh�encrrZfinal_encoding�newlinerrrr[s^






zConfigObj._handle_bomcCs&t|tj�r|jr|j|j�S|SdS)z@Decode ASCII strings to unicode if a self.encoding is specified.N)r[r�rrr)rEZaStringrrr�_a_to_u�szConfigObj._a_to_ucCsxt|tj�r|jd�St|tj�r@|r6|j|�jd�S|jd�S|rtx.t|�D]"\}}t|tj�rN|j|�||<qNW|S)z�
        Decode infile to unicode. Using the specified encoding.
        
        If infile is a string, it also needs converting to a list.
        T)r[r�r�rrr�	enumerate)rEr�rrPrhrrrr�s

zConfigObj._decodecCs&t|tj�r|jr|j|j�S|SdS)z'Decode element to unicode if necessary.N)r[r�rr5r)rErhrrr�_decode_element�szConfigObj._decode_elementcCst|tj�st|�S|SdS)zh
        Used by ``stringify`` within validate, to turn non-string values
        into strings.
        N)r[r�r�r)rErKrrr�_str�szConfigObj._strcCs"|j}|jrd|_g}d}|}t|�d}d}d}�x�||k�r�|rHg}|d7}||}	|	j�}
|
sp|
jd�r�d}|j|	�q6|s�||_g}d}d}|jj|	�}|dk	�r�|j	�\}}
}}}|r�|j
dkr�||_
|
jd�}||jd�k�r�|jdt
||�q6||jk�rHy|j||�j}Wn(tk
�rD|jd	t
||�w6YnXn<||jk�r\|j}n(||jdk�rr|}n|jd
t
||�q6|j|�}||k�r�|jdt||�q6t||||d�}|||<||j|<||j|<q6|jj|	�}|dk�r|jd
j|	�t||�q6|j	�\}}}|�r,|j
dk�r,||_
|dd�dk�r�y|j||||�\}}}Wn(tk
�r�|jdt||�w6YnjX|j�r�d}yt|�}WnNtk
�r�}z0t|�tk�r�d}nd}|j|t||�w6WYdd}~XnXn�|j�rTd}yt|�}WnLtk
�rP}z.t|t��r*d}nd}|j|t||�w6WYdd}~XnXn<y|j |�\}}Wn(tk
�r�|jdt||�w6YnX|j|�}||k�r�|jdt||�q6|j!||dd�||j|<||j|<q6q6W|j
dk�r�d|_
|�r|j�r||_n|�s||_"||_dS)zActually parse the config file.Fr8�#TN�[�]z Cannot compute the section depthzCannot compute nesting levelzSection too nestedzDuplicate section name)rWz=Invalid line ({0!r}) (matched as neither section nor keyword)��"""�'''zParse error in multiline valuergzUnknown name or type in valuez+Parse error from unrepr-ing multiline valuez!Parse error from unrepr-ing valuezParse error in valuezDuplicate keyword name)r6r�)r!r")#r/r6ry�striprr�r
�_sectionmarkerr{�groupsr4�count�
_handle_errorrr��_match_depthrbrj�_unquoterr�r�r��_keyword�formatr�
_multiliner��typer(r'r[�
_handle_valuer�r)rEr�Ztemp_list_valuesZcomment_listZ
done_startr��maxline�	cur_indexZ
reset_commentrhZsline�mat�indentZ	sect_openZ	sect_nameZ
sect_close�commentZ	cur_depthrbrzrK�erprrrrs�





















zConfigObj._parsecCs>x$||jkr$||jkrt��|j}qW|j|kr4|St��dS)z�
        Given a section and a depth level, walk back through the sections
        parents to see if the depth level matches a previous section.
        
        Return a reference to the right section,
        or raise a SyntaxError.
        N)r�rbrj)rEZsectr�rrrr(�s


zConfigObj._match_depthcCsB||}|d7}dj||�}||||�}|jr2|�|jj|�dS)z�
        Handle an error according to the error settings.
        
        Either raise the error or store it.
        The error will have occured at ``cur_index``
        r8z{0} at line {1}.N)r+r.rr�)rE�textZ
ErrorClassr�r0rhrlrrrrr'�szConfigObj._handle_errorcCs4|st�|d|dkr0|ddkr0|dd�}|S)z%Return an unquoted version of a valuerr8�"�'r�)r6r7r�)rj)rErKrrrr)�s
zConfigObj._unquotecs�|r�jr|dkrdS|rjt|ttf�rj|s0dSt|�dkrR�j|ddd�dSdj�fdd	�|D��St|tj�s��j	r�t
|�}ntd
|��|s�dS�jo�d|ko�d
|k}|o�d|kr�d|kp�d|k}|o�|o�d|ko�d|ko�d
|k}|�s�|�o|}|�rh�j�st
}nNd|k�r0td|��n6|dtk�r\|dtk�r\d|k�r\t
}n
�j|�}n
�j|�}|t
k�r�d
|k�r��j�r��j|�}||S)a�
        Return a safely quoted version of a value.
        
        Raise a ConfigObjError if the value cannot be safely quoted.
        If multiline is ``True`` (default) then use triple quotes
        if necessary.
        
        * Don't quote values that don't need it.
        * Recursively quote members of a list and return a comma joined list.
        * Multiline is ``False`` for lists.
        * Obey list syntax for empty and single member lists.
        
        If ``list_values=False`` then the value is only quoted if it contains
        a ``\n`` (is multiline) or '#'.
        
        If ``write_empty_values`` is set, and the value is an empty string, it
        won't be quoted.
        rg�,r8rF)�	multilinez, csg|]}�j|dd��qS)F)r9)�_quote)r�r�)rErrr��sz$ConfigObj._quote.<locals>.<listcomp>zValue "%s" is not a string.z""r�rr7r6z#Value "%s" cannot be safely quoted.r�)r7r[rHrRryr:rxr�r�r3rrZr/�noquotr�wspace_plus�_get_single_quote�_get_triple_quote)rErKr9Zno_lists_no_quotesZneed_tripleZhash_triple_quoteZcheck_for_single�quotr)rErr:�sB

"



zConfigObj._quotecCs4d|krd|krtd|��nd|kr,t}nt}|S)Nr7r6z#Value "%s" cannot be safely quoted.)r�squot�dquot)rErKr?rrrr=%szConfigObj._get_single_quotecCsD|jd�dkr(|jd�dkr(td|��|jd�dkr<t}nt}|S)Nz"""r8z'''z#Value "%s" cannot be safely quoted.r�r�r�)�findr�tdquot�tsquot)rErKr?rrrr>/szConfigObj._get_triple_quotecs��jr|dfS�js6�jj|�}|dkr.t��|j�S�jj|�}|dkrPt��|j�\}}}}|dkrv|dkrvt��|dk	r�g|fS|dk	r�|r�|r�d}n|p�d}�j|�}|dkr�||fS�jj	|�}�fdd�|D�}|dk	r�||g7}||fS)z�
        Given a value string, unquote, remove comment,
        handle lists. (including empty and single member lists)
        rgNz""csg|]}�j|��qSr)r))r�r�)rErrr�dsz+ConfigObj._handle_value.<locals>.<listcomp>)
r�r/�_nolistvaluer{rjr%�	_valueexpr)�
_listvalueexp�findall)rErKr1r/ZsingleZ
empty_listr3r�r)rErr.9s6


zConfigObj._handle_valuec
Cs�|dd�}|dd�}|j|d}|j|d}|j|�}	|	dk	r`t|	j��}
|
j|�|
S|j|�dkrtt��xD||kr�|d7}|d7}||}|j|�dkr�||7}qvPqvWt��|j|�}	|	dkr�t��|	j�\}}||||fS)z9Extract the value, where we are in a multiline situation.Nr rr8r�r�r�)�
_triple_quoter{rHr%r�rBrj)
rErKr�r0r/r?ZnewvalueZsingle_lineZ
multi_liner1Zretvalrhr3rrrr,js0




zConfigObj._multilinecCs�t|t�szyt|dddd�}WnZtk
rL}ztd|��WYdd}~Xn.tk
rx}ztd|��WYdd}~XnX||_dS)zParse the configspec.T)r.r1r�zParsing configspec failed: %sNzReading configspec failed: %s)r[r rrrnr2)rEr2r4rrrr�s
zConfigObj._handle_configspeccCs�|j}|jd�}t|t�r<x |jD]}||kr"|||_q"Wxz|jD]p}|dkrRqD||kr�i||<d||_|r�|jj|g�|j|<|jj|d�|j|<t||t�rD||||_qDWdS)z�
        Called by validate. Handles setting the configspec on subsections
        including sections to be validated by __many__
        �__many__TrgN)	r2rr[r�r�r�r�r�r�)rErs�copyr2�manyr�rrr�_set_configspec�s"


zConfigObj._set_configspeccCsN|js|j|j|��}nt|�}d||j|j|dd��|jd�||j|�fS)z.Write an individual line, for the write methodz
%s%s%s%s%sF)r9z = )r6rr:r�r)rE�
indent_stringr�r�r3r�rrr�_write_line�szConfigObj._write_linecCs<d||jd|�|j|j|�dd�|jd|�|j|�fS)zWrite a section marker linez
%s%s%s%s%srF)r9r)rr:r)rErNr�r�r3rrr�
_write_marker�s
zConfigObj._write_markercCs.|sdS|j}|jd�s&||jd�7}||S)zDeal with a comment.rgrz # )r4rr)rEr3r~rrr�_handle_comment�s
zConfigObj._handle_commentc	sB�jdkrt�_g}�jd�}�jd�}|dkr��j}d�_�}xB�jD]8}�j|�}|j�}|rv|j|�rv||}|j|�qHW�j|j	}	x�|j
|jD]�}
|
|jkr�q�xF|j
|
D]8}�j|j��}|r�|j|�r�||}|j|	|�q�W||
}�j|j|
�}
t|t��rF|j�j|	|j	|
|
��|j�j|d��q�|j�j|	|
||
��q�W|�k�r�xH�jD]>}�j|�}|j�}|�r�|j|��r�||}|j|��qrW|�_|�k	�r�|S�jdk�rF|dk�rF�j�r��fdd�|D�}�j�rB�jdk�s"tj�jj��dk�rB|�s2|jd	�t|d
|d
<|S�j�pRt j!}t"|dd�dk	�r�|j#dk�r�t$j%d
k�r�|dk�r�d}�j|�j&|�}|j'|��s�||7}t|t(j)��r�|}n|j*�j�pڈj+�p�d�}�j�r�jdk�s�t,�j��rt|}|dk	�r|j|�n"t-�jd��}|j|�WdQRXdS)a~
        Write the current ConfigObj as a file
        
        tekNico: FIXME: use StringIO instead of real files
        
        >>> filename = a.filename
        >>> a.filename = 'test.ini'
        >>> a.write()
        >>> a.filename = filename
        >>> a == ConfigObj('test.ini', raise_errors=True)
        1
        >>> import os
        >>> os.remove('test.ini')
        Nrz# F)rscsg|]}|j�j��qSr)�encoder)r��l)rErrr�/sz#ConfigObj.write.<locals>.<listcomp>rrgr�moder�Zwin32z
r��ascii�wb).r4rrr-r
rr#rr�r�r�r�r�r��lstriprQr�r[r�rP�extendrrOrr�rrrrrrrr��lineseprarT�sys�platformrxrr�rrRr5rr�)rEZoutfilersr�ZcsZcspZint_valrhZ
stripped_linerNr�Zcomment_liner�r3r�outputZoutput_bytesr	r)rErr�s�








 
zConfigObj.writecst�dkrt�jdkrtd���r0ddlm}|�_���rt�jj�_�jj�_�jj�_�jj�_�jj	�_	�jj
�_
�j��j����������fdd�}i�d}d}�fdd	��jD�}	�fd
d	��j
D�}
�fdd	��jD�}x��jD]�}|dk�r�q�|�jk�s|�jk�rZd}
d}��rf|�jk�rf�jj|g��j|<�jj|d��j|<nd}
�|}||�|||
||�\}}q�Wd}d�jk�r��d}nd
�jk�r��d
}|dk	�r�x,|	D]$}�|}||||d||�\}}�q�Wg}	x<|D]4}d}��sd�|<nd}d|}�j|��|<�q�Wx<|
D]4}d}��sJd�|<nd}d|}�j|��|<�q2Wx��j
D]�}��k�r�|dk�r��qr�|jdk�r�|	j|��qr��rڈjj|g��j|<�jj|d��j|<�j����|d�}|�|<|dk�rd}n|dk�rd}nd}�qrW|	�_��r<�j�r<d}|�r\��r\��r\t�j��}|�rfdS|�rpdS�S)a;
        Test the ConfigObj against a configspec.
        
        It uses the ``validator`` object from *validate.py*.
        
        To run ``validate`` on the current ConfigObj, call: ::
        
            test = config.validate(validator)
        
        (Normally having previously passed in the configspec when the ConfigObj
        was created - you can dynamically assign a dictionary of checks to the
        ``configspec`` attribute of a section though).
        
        It returns ``True`` if everything passes, or a dictionary of
        pass/fails (True/False). If every member of a subsection passes, it
        will just have the value ``True``. (It also returns ``False`` if all
        members fail).
        
        In addition, it converts the values from strings to their native
        types if their checks pass (and ``stringify`` is set).
        
        If ``preserve_errors`` is ``True`` (``False`` is default) then instead
        of marking a fail with ``False``, it will preserve the actual
        exception object. This can contain info about the reason for failure.
        For example the ``VdtValueTooSmallError`` indicates that the value
        supplied was too small. If a value (or section) is missing it will
        still be marked as ``False``.
        
        You must have the validate module to use ``preserve_errors=True``.
        
        You can then use the ``flatten_errors`` function to turn your nested
        results dictionary into a flattened list of failures - useful for
        displaying meaningful error messages.
        NzNo configspec supplied.r)�VdtMissingValuecsP�jj|d�y�j�|��j|<Wntt�jfk
rBYnXy�j|||d�}WnP�jk
r�}z2�s~t|�j�r�d�|<n|�|<d}d}WYdd}~Xn�Xd}d�|<�j	s�|�r"�j	�s
t|t
tf�r�fdd�|D�}n|o�|dk�rd}n
�j|�}||k�s|�r"|�|<��rH|�rH|�j
k�rH�j
j|�||fS)N)�missingFTcsg|]}�j|��qSr)r)r��item)rErrr��sz>ConfigObj.validate.<locals>.validate_entry.<locals>.<listcomp>rg)r�r�Zget_default_valuer�r��baseErrorClass�checkr[�_vdtMissingValuer3rHrRrr�r�)r��specr�r^�ret_true�	ret_falserar4)r2rKr��preserve_errorsrsrE�	validatorrr�validate_entry�s:

z*ConfigObj.validate.<locals>.validate_entryTcsg|]}|�kr|�qSrr)r�r|)r2rrr��sz&ConfigObj.validate.<locals>.<listcomp>csg|]}|�jkr|�qSr)r�)r�r|)rsrrr��scsg|]}|�jkr|�qSr)r�)r�r|)rsrrr��srJ�
___many___rgFz"Value %r was provided as a sectionz)Section %r was provided as a single valuer�)rfrKrs)rJri)r2r��validater]rbr
rrrrr4rMr�r�r�r�rr�r`r�r�r��anyr�)rErgrfrKrsr]rhrdreZunvalidatedZincorrect_sectionsZincorrect_scalarsr�r^r�rLrprar)r2rKr�rfrsrErgrrjSs�$






-












zConfigObj.validatecCs |j�|j�d|_d|_dS)z@Clear ConfigObj instance and restore to 'freshly created' state.N)r�r�r2r�)rErrr�reset"	szConfigObj.resetcCstt|jtj�st��|j}i}x$tD]}|dkr2q$t||�||<q$W|j}||d<|j�|j	|�|j
||�dS)z�
        Reload a ConfigObj from file.
        
        This method raises a ``ReloadError`` if the ConfigObj doesn't have
        a filename attribute pointing to a file.
        r2N)r[r�r�r�r&r�rar�r�r�r�)rEr�Zcurrent_optionsr�r2rrr�reload-	s

zConfigObj.reload)NNNNTFTFFTNNFFF)N)T)NN)FFN)-r?r@rArmr�r�r�r*r$rFrGrEZ_single_line_singleZ_single_line_doubleZ_multi_line_singleZ_multi_line_doublerIr�rkr�r�r�rrrrrrr(r'r)r:r=r>r.r,rrMrOrPrQrrjrlrmrrrrr /sx







6`
 u	
(

G

1#	
r
Oc@s"eZdZdZdd�Zddd�ZdS)	r!a�
    A simple validator.
    Can be used to check that all members expected are present.
    
    To use it, provide a configspec with all your members in (the value given
    will be ignored). Pass an instance of ``SimpleVal`` to the ``validate``
    method of your ``ConfigObj``. ``validate`` will return ``True`` if all
    members are present, or a dictionary with True/False meaning
    present/missing. (Whole missing sections will be replaced with ``False``)
    cCs
t|_dS)N)rr`)rErrrrkS	szSimpleVal.__init__FcCs|r|j��|S)z9A dummy check method, always returns the value unchanged.)r`)rEra�memberr^rrrraV	szSimpleVal.checkN)F)r?r@rArmrkrarrrrr!G	s
cCs�|dkrg}g}|dkr t|�S|dks2t|t�r^|j|dd�d|f�|rV|j�t|�Sxht|j��D]X\}}|dkr~qlt|j|�t�r�|j|�t	|||||�ql|j|dd�||f�qlW|r�|j�t|�S)a�
    An example function that will turn a nested dictionary of results
    (as returned by ``ConfigObj.validate``) into a flat list.
    
    ``cfg`` is the ConfigObj instance being checked, ``res`` is the results
    dictionary returned by ``validate``.
    
    (This is a recursive function, so you shouldn't use the ``levels`` or
    ``results`` arguments - they are used by the function.)
    
    Returns a list of keys that failed. Each member of the list is a tuple::
    
        ([list of sections...], key, result)
    
    If ``validate`` was called with ``preserve_errors=False`` (the default)
    then ``result`` will always be ``False``.

    *list of sections* is a flattened list of sections that the key was found
    in.
    
    If the section was missing (or a section was expected and a scalar provided
    - or vice-versa) then key will be ``None``.
    
    If the value (or section) was missing then ``result`` will be ``False``.
    
    If ``validate`` was called with ``preserve_errors=True`` and a value
    was present, but failed the check, then ``result`` will be the exception
    object returned. You can use this as a string that describes the failure.
    
    For example *The value "3" is of the wrong type*.
    NTF)
�sortedr[r�r�r�rHr�rr�r))Zcfg�resZlevels�resultsrzr�rrrr)]	s* 
csVg}|j�fdd�|jD��x2|jD](}||jkr&|jt||�|f��q&W|S)a�
    Find all the values and sections not in the configspec from a validated
    ConfigObj.
    
    ``get_extra_values`` returns a list of tuples where each tuple represents
    either an extra section, or an extra value.
    
    The tuples contain two values, a tuple representing the section the value 
    is in and the name of the extra values. For extra values in the top level
    section the first member will be an empty tuple. For values in the 'foo'
    section the first member will be ``('foo',)``. For members in the 'bar'
    subsection of the 'foo' section the first member will be ``('foo', 'bar')``.
    
    NOTE: If you call ``get_extra_values`` on a ConfigObj instance that hasn't
    been validated it will return an empty list.
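    A hedged sketch of how ``validate``, ``flatten_errors`` and
    ``get_extra_values`` fit together (the spec, keys and values below are
    illustrative, and the separate *validate* module must be installed):
    
    >>> from configobj import ConfigObj, flatten_errors, get_extra_values
    >>> from validate import Validator
    >>> spec = ['port = integer(min=1, max=65535)']
    >>> cfg = ConfigObj(['port = 99999', 'colour = blue'], configspec=spec)
    >>> result = cfg.validate(Validator(), preserve_errors=True)
    >>> for sections, key, error in flatten_errors(cfg, result):
    ...     print(key, error)       # e.g. port, "the value ... is too big"
    >>> get_extra_values(cfg)       # [((), 'colour')]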
    csg|]}�|f�qSrr)r�rW)�_prependrrr��	sz$get_extra_values.<locals>.<listcomp>)rXr�r�r*)Zconfrrr�rWr)rrrr*�	s
)rN)rr	)r
r	)r	r	)rrrrrrrr r!r"r#r$r%r&r'r(r)r*)NN);r�r�rZ�codecsrrrrr�Z_versionrr9rrrrr@rAr;r<rDrC�objectr��__all__rrZMAX_INTERPOL_DEPTHr�r>r�r(rBZ_builderr6rjrrrrnr&rrr"r#r%r$r'rqr�r�r�r�r�r�r r!r)r*rrrr�<module>s�	7			|i"
<site-packages/__pycache__/_version.cpython-36.pyc000064400000000206147511334560015755 0ustar003

��S�@sdZdS)z5.0.6N)�__version__�rr�/usr/lib/python3.6/_version.py�<module>ssite-packages/sepolgen/module.py000064400000016167147511334560012763 0ustar00# Authors: Karl MacMillan <kmacmillan@mentalrootkit.com>
#
# Copyright (C) 2006 Red Hat 
# see file 'COPYING' for use and warranty information
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License as
# published by the Free Software Foundation; version 2 only
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software
# Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
#

"""
Utilities for dealing with the compilation of modules and creation
of module tress.
"""

import re
import tempfile
try:
    from subprocess import getstatusoutput
except ImportError:
    from commands import getstatusoutput
import os
import os.path
import shutil

import selinux

from . import defaults


def is_valid_name(modname):
    """Check that a module name is valid.
    """
    m = re.findall(r"[^a-zA-Z0-9_\-\.]", modname)
    if len(m) == 0 and modname[0].isalpha():
        return True
    else:
        return False

class ModuleTree:
    def __init__(self, modname):
        self.modname = modname
        self.dirname = None

    def dir_name(self):
        return self.dirname

    def te_name(self):
        return self.dirname + "/" + self.modname + ".te"

    def fc_name(self):
        return self.dirname + "/" + self.modname + ".fc"

    def if_name(self):
        return self.dirname + "/" + self.modname + ".if"

    def package_name(self):
        return self.dirname + "/" + self.modname + ".pp"

    def makefile_name(self):
        return self.dirname + "/Makefile"

    def create(self, parent_dirname, makefile_include=None):
        self.dirname = parent_dirname + "/" + self.modname
        os.mkdir(self.dirname)
        fd = open(self.makefile_name(), "w")
        if makefile_include:
            fd.write("include " + makefile_include)
        else:
            fd.write("include " + defaults.refpolicy_makefile())
        fd.close()

        # Create empty files for the standard refpolicy
        # module files
        open(self.te_name(), "w").close()
        open(self.fc_name(), "w").close()
        open(self.if_name(), "w").close()

def modname_from_sourcename(sourcename):
    return os.path.splitext(os.path.split(sourcename)[1])[0]

class ModuleCompiler:
    """ModuleCompiler eases running of the module compiler.

    The ModuleCompiler class encapsulates running the commandline
    module compiler (checkmodule) and module packager (semodule_package).
    You are likely interested in the create_module_package method.
    
    Several options are controlled via parameters (only affects the
    non-refpol builds):
    
     .mls          [boolean] Generate an MLS module (by passing -M to
                   checkmodule). True to generate an MLS module, false
                   otherwise.
                   
     .module       [boolean] Generate a module instead of a base module.
                   True to generate a module, false to generate a base.
                   
     .checkmodule  [string] Fully qualified path to the module compiler.
                   Default is /usr/bin/checkmodule.
                   
     .semodule_package [string] Fully qualified path to the module
                   packager. Defaults to /usr/bin/semodule_package.
     .output       [file object] File object used to write verbose
                   output of the compilation and packaging process.
    """
    def __init__(self, output=None):
        """Create a ModuleCompiler instance, optionally with an
        output file object for verbose output of the compilation process.
        """
        self.mls = selinux.is_selinux_mls_enabled()
        self.module = True
        self.checkmodule = "/usr/bin/checkmodule"
        self.semodule_package = "/usr/bin/semodule_package"
        self.output = output
        self.last_output = ""
        self.refpol_makefile = defaults.refpolicy_makefile()
        self.make = "/usr/bin/make"

    def o(self, str):
        if self.output:
            self.output.write(str + "\n")
        self.last_output = str

    def run(self, command):
        self.o(command)
        rc, output = getstatusoutput(command)
        self.o(output)
        
        return rc
    
    def gen_filenames(self, sourcename):
        """Generate the module and policy package filenames from
        a source file name. The source file must be in the form
        of "foo.te". This will generate "foo.mod" and "foo.pp".
        
        Returns a tuple with (modname, policypackage).
        """
        splitname = sourcename.split(".")
        if len(splitname) < 2:
            raise RuntimeError("invalid sourcefile name %s (must end in .te)", sourcename)
        # Handle other periods in the filename correctly
        basename = ".".join(splitname[0:-1])
        modname = basename + ".mod"
        packagename = basename + ".pp"
        
        return (modname, packagename)

    def create_module_package(self, sourcename, refpolicy=True):
        """Create a module package saved in a packagename from a
        sourcename.

        The create_module_package creates a module package, saved in a
        file named after the source (.pp is the standard extension), from a
        source file (.te is the standard extension). The source file
        should contain SELinux policy statements appropriate for a
        base or non-base module (depending on the setting of .module).

        Only file names are accepted, not open file objects or
        descriptors because the command line SELinux tools are used.

        On error a RuntimeError will be raised with a descriptive
        error message.
        """
        if refpolicy:
            self.refpol_build(sourcename)
        else:
            modname, packagename = self.gen_filenames(sourcename)
            self.compile(sourcename, modname)
            self.package(modname, packagename)
            os.unlink(modname)
            
    def refpol_build(self, sourcename):
        # Compile
        command = self.make + " -f " + self.refpol_makefile
        rc = self.run(command)

        # Raise an error if the process failed
        if rc != 0:
            raise RuntimeError("compilation failed:\n%s" % self.last_output)
        
    def compile(self, sourcename, modname):
        s = [self.checkmodule]
        if self.mls:
            s.append("-M")
        if self.module:
            s.append("-m")
        s.append("-o")
        s.append(modname)
        s.append(sourcename)

        rc = self.run(" ".join(s))
        if rc != 0:
            raise RuntimeError("compilation failed:\n%s" % self.last_output)

    def package(self, modname, packagename):
        s = [self.semodule_package]
        s.append("-o")
        s.append(packagename)
        s.append("-m")
        s.append(modname)
        
        rc = self.run(" ".join(s))
        if rc != 0:
            raise RuntimeError("packaging failed [%s]" % self.last_output)
        
    
site-packages/sepolgen/matching.py000064400000020720147511334560013256 0ustar00# Authors: Karl MacMillan <kmacmillan@mentalrootkit.com>
#
# Copyright (C) 2006 Red Hat
# see file 'COPYING' for use and warranty information
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License as
# published by the Free Software Foundation; version 2 only
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software
# Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
#

"""
Classes and algorithms for matching requested access to access vectors.
"""

import itertools

from . import access
from . import objectmodel
from . import util


class Match(util.Comparison):
    def __init__(self, interface=None, dist=0):
        self.interface = interface
        self.dist = dist
        self.info_dir_change = False
        # when implementing __eq__ also __hash__ is needed on py2
        # if object is mutable __hash__ should be None
        self.__hash__ = None

    def _compare(self, other, method):
        try:
            a = (self.dist, self.info_dir_change)
            b = (other.dist, other.info_dir_change)
            return method(a, b)
        except (AttributeError, TypeError):
            # trying to compare to foreign type
            return NotImplemented

class MatchList:
    DEFAULT_THRESHOLD = 150
    def __init__(self):
        # Match objects that pass the threshold
        self.children = []
        # Match objects over the threshold
        self.bastards = []
        self.threshold = self.DEFAULT_THRESHOLD
        self.allow_info_dir_change = False
        self.av = None

    def best(self):
        if len(self.children):
            return self.children[0]
        if len(self.bastards):
            return self.bastards[0]
        return None

    def __len__(self):
        # Only return the length of the matches so
        # that this can be used to test if there is
        # a match.
        return len(self.children) + len(self.bastards)

    def __iter__(self):
        return iter(self.children)

    def all(self):
        return itertools.chain(self.children, self.bastards)

    def append(self, match):
        if match.dist <= self.threshold:
            if not match.info_dir_change or self.allow_info_dir_change:
                self.children.append(match)
            else:
                self.bastards.append(match)
        else:
            self.bastards.append(match)

    def sort(self):
        self.children.sort()
        self.bastards.sort()
                

class AccessMatcher:
    def __init__(self, perm_maps=None):
        self.type_penalty = 10
        self.obj_penalty = 10
        if perm_maps:
            self.perm_maps = perm_maps
        else:
            self.perm_maps = objectmodel.PermMappings()
        # We want a change in the information flow direction
        # to be a strong penalty - stronger than access to
        # a few unrelated types.
        self.info_dir_penalty = 100

    def type_distance(self, a, b):
        if a == b or access.is_idparam(b):
            return 0
        else:
            return -self.type_penalty


    def perm_distance(self, av_req, av_prov):
        # First check that we have enough perms
        diff = av_req.perms.difference(av_prov.perms)

        if len(diff) != 0:
            total = self.perm_maps.getdefault_distance(av_req.obj_class, diff)
            return -total
        else:
            diff = av_prov.perms.difference(av_req.perms)
            return self.perm_maps.getdefault_distance(av_req.obj_class, diff)

    def av_distance(self, req, prov):
        """Determine the 'distance' between 2 access vectors.

        This function is used to find an access vector that matches
        a 'required' access. To do this we compute a signed numeric
        value that indicates how close the req access is to the
        'provided' access vector. The closer the value is to 0
        the closer the match, with 0 being an exact match.

        A value over 0 indicates that the prov access vector provides more
        access than the req (in practice, this means that the source type,
        target type, and object class are the same and the perms in prov are
        a superset of those in req).

        A value under 0 indicates that the prov av provides less - or unrelated
        - access than the req access. A different type or object class will
        result in a very low value.

        The values other than 0 should only be interpreted relative to
        one another - they have no exact meaning and are likely to
        change.

        Params:
          req - [AccessVector] The access that is required. This is the
                access being matched.
          prov - [AccessVector] The access provided. This is the potential
                 match that is being evaluated for req.
        Returns:
          0   : Exact match between the access vectors.

          < 0 : The prov av does not provide all of the access in req.
                A smaller value indicates that the access is further from matching.

          > 0 : The prov av provides more access than req. The larger
                the value the more access over req.
        """
        # FUTURE - this is _very_ expensive and probably needs some
        # thorough performance work. This version is meant to give
        # meaningful results relatively simply.
        dist = 0

        # Get the difference between the types. The addition is safe
        # here because type_distance only returns 0 or negative.
        dist += self.type_distance(req.src_type, prov.src_type)
        dist += self.type_distance(req.tgt_type, prov.tgt_type)

        # Object class distance
        if req.obj_class != prov.obj_class and not access.is_idparam(prov.obj_class):
            dist -= self.obj_penalty

        # Permission distance

        # If this av doesn't have a matching source type, target type, and object class
        # count all of the permissions against it. Otherwise determine the perm
        # distance and dir.
        if dist < 0:
            pdist = self.perm_maps.getdefault_distance(prov.obj_class, prov.perms)
        else:
            pdist = self.perm_distance(req, prov)

        # Combine the perm and other distance
        if dist < 0:
            if pdist < 0:
                return dist + pdist
            else:
                return dist - pdist
        elif dist >= 0:
            if pdist < 0:
                return pdist - dist
            else:
                return dist + pdist

    def av_set_match(self, av_set, av):
        """

        """
        dist = None

        # Get the distance for each access vector
        for x in av_set:
            tmp = self.av_distance(av, x)
            if dist is None:
                dist = tmp
            elif tmp >= 0:
                if dist >= 0:
                    dist += tmp
                else:
                    dist = tmp + -dist
            else:
                if dist < 0:
                    dist += tmp
                else:
                    dist -= tmp

        # Penalize for information flow - we want to prevent the
        # addition of a write if the requested flow is read-only or none. We are
        # much less concerned about the reverse.
        av_dir = self.perm_maps.getdefault_direction(av.obj_class, av.perms)

        if av_set.info_dir is None:
            av_set.info_dir = objectmodel.FLOW_NONE
            for x in av_set:
                av_set.info_dir = av_set.info_dir | \
                                  self.perm_maps.getdefault_direction(x.obj_class, x.perms)
        if (av_dir & objectmodel.FLOW_WRITE == 0) and (av_set.info_dir & objectmodel.FLOW_WRITE):
            if dist < 0:
                dist -= self.info_dir_penalty
            else:
                dist += self.info_dir_penalty

        return dist

    def search_ifs(self, ifset, av, match_list):
        match_list.av = av
        for iv in itertools.chain(ifset.tgt_type_all,
                                  ifset.tgt_type_map.get(av.tgt_type, [])):
            if not iv.enabled:
                #print "iv %s not enabled" % iv.name
                continue

            dist = self.av_set_match(iv.access, av)
            if dist >= 0:
                m = Match(iv, dist)
                match_list.append(m)


        match_list.sort()
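# Illustrative sketch (not part of the original sepolgen sources) of how Match
# and MatchList behave; the distance values below are made up.
def _example_match_list():
    ml = MatchList()
    ml.append(Match(interface=None, dist=10))    # within the threshold -> children
    ml.append(Match(interface=None, dist=500))   # over the threshold -> bastards
    ml.sort()
    return ml.best()    # the lowest-distance match that passed the threshold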


site-packages/sepolgen/objectmodel.py000064400000014566147511334560013766 0ustar00# Authors: Karl MacMillan <kmacmillan@mentalrootkit.com>
#
# Copyright (C) 2006 Red Hat
# see file 'COPYING' for use and warranty information
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License as
# published by the Free Software Foundation; version 2 only
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software
# Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
#

"""
This module provides knowledge of object classes and permissions. It should
be used to keep this knowledge from leaking into the more generic parts of
the policy generation.
"""

# Objects that can be implicitly typed - these objects do
# not _have_ to be implicitly typed (e.g., sockets can be
# explicitly labeled), but they often are.
#
# File is in this list for /proc/self
#
# This list is useful when dealing with rules that have a
# type (or param) used as both a subject and object. For
# example:
#
#   allow httpd_t httpd_t : socket read;
#
# This rule makes sense because the socket was (presumably) created
# by a process with the type httpd_t.
implicitly_typed_objects = ["socket", "fd", "process", "file", "lnk_file", "fifo_file",
                            "dbus", "capability", "unix_stream_socket"]

#::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::
#
#Information Flow
#
# All of the permissions in SELinux can be described in terms of
# information flow. For example, a read of a file is a flow of
# information from that file to the process reading. Viewing
# permissions in these terms can be used to model a variety of
# security properties.
#
# Here we have some infrastructure for understanding permissions
# in terms of information flow
#
#::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::

# Information flow deals with information either flowing from a subject
# to an object ("write") or to a subject from an object ("read"). Read
# or write is described from the subject point-of-view. It is also possible
# for a permission to represent both a read and write (though the flow is
# typically asymmetric in terms of bandwidth). It is also possible for a
# permission to not flow information (meaning that the result is pure
# side-effect).
#
# The following constants are for representing the directionality
# of information flow.
FLOW_NONE  = 0
FLOW_READ  = 1
FLOW_WRITE = 2
FLOW_BOTH  = FLOW_READ | FLOW_WRITE

# These are used by the parser and for nice display of the directions
str_to_dir = { "n" : FLOW_NONE, "r" : FLOW_READ, "w" : FLOW_WRITE, "b" : FLOW_BOTH }
dir_to_str = { FLOW_NONE : "n", FLOW_READ : "r", FLOW_WRITE : "w", FLOW_BOTH : "b" }

class PermMap:
    """A mapping between a permission and its information flow properties.

    PermMap represents the information flow properties of a single permission
    including the direction (read, write, etc.) and an abstract representation
    of the bandwidth of the flow (weight).
    """
    def __init__(self, perm, dir, weight):
        self.perm = perm
        self.dir = dir
        self.weight = weight

    def __repr__(self):
        return "<sepolgen.objectmodel.PermMap %s %s %d>" % (self.perm,
                                                           dir_to_str[self.dir],
                                                           self.weight)

class PermMappings:
    """The information flow properties of a set of object classes and permissions.

    PermMappings maps one or more classes and permissions to their PermMap objects
    describing their information flow characteristics.
    """
    def __init__(self):
        self.classes = { }
        self.default_weight = 5
        self.default_dir = FLOW_BOTH

    def from_file(self, fd):
        """Read the permission mappings from a file. This reads the format used
        by Apol in the setools suite.
        """
        # This parsing is deliberately picky and bails at the least error. It
        # is assumed that the permission map file will be shipped as part
        # of sepolgen and not user modified, so this is a reasonable design
        # choice. If user supplied permission mappings are needed the parser
        # should be made a little more robust and give better error messages.
        cur = None
        for line in fd:
            fields = line.split()
            if len(fields) == 0 or len(fields) == 1 or fields[0] == "#":
                continue
            if fields[0] == "class":
                c = fields[1]
                if c in self.classes:
                    raise ValueError("duplicate class in perm map")
                self.classes[c] = { }
                cur = self.classes[c]
            else:
                if len(fields) != 3:
                    raise ValueError("error in object classs permissions")
                if cur is None:
                    raise ValueError("permission outside of class")
                pm = PermMap(fields[0], str_to_dir[fields[1]], int(fields[2]))
                cur[pm.perm] = pm

    def get(self, obj, perm):
        """Get the permission map for the object permission.

        Returns:
          PermMap representing the permission
        Raises:
          KeyError if the object or permission is not defined
        """
        return self.classes[obj][perm]

    def getdefault(self, obj, perm):
        """Get the permission map for the object permission or a default.

        getdefault is the same as get except that a default PermMap is
        returned if the object class or permission is not defined. The
        default is FLOW_BOTH with a weight of 5.
        """
        try:
            pm = self.classes[obj][perm]
        except KeyError:
            return PermMap(perm, self.default_dir, self.default_weight)
        return pm

    def getdefault_direction(self, obj, perms):
        dir = FLOW_NONE
        for perm in perms:
            pm = self.getdefault(obj, perm)
            dir = dir | pm.dir
        return dir

    def getdefault_distance(self, obj, perms):
        total = 0
        for perm in perms:
            pm = self.getdefault(obj, perm)
            total += pm.weight

        return total
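# Illustrative sketch (not part of the original sepolgen sources) of the perm
# map format parsed by PermMappings.from_file(); the weights below are made up.
def _example_perm_mappings():
    import io

    pm_text = (
        "class file\n"
        "read r 10\n"
        "write w 10\n"
        "getattr r 3\n"
    )
    pm = PermMappings()
    pm.from_file(io.StringIO(pm_text))
    # read | write combines FLOW_READ and FLOW_WRITE into FLOW_BOTH
    return pm.getdefault_direction("file", ["read", "write"])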



site-packages/sepolgen/lex.py000064400000123631147511334560012261 0ustar00# -----------------------------------------------------------------------------
# ply: lex.py
#
# Copyright (C) 2001-2018
# David M. Beazley (Dabeaz LLC)
# All rights reserved.
#
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions are
# met:
#
# * Redistributions of source code must retain the above copyright notice,
#   this list of conditions and the following disclaimer.
# * Redistributions in binary form must reproduce the above copyright notice,
#   this list of conditions and the following disclaimer in the documentation
#   and/or other materials provided with the distribution.
# * Neither the name of the David Beazley or Dabeaz LLC may be used to
#   endorse or promote products derived from this software without
#  specific prior written permission.
#
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
# "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
# LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
# A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
# OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
# SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
# LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
# DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
# THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
# (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
# OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
# -----------------------------------------------------------------------------

__version__    = '3.11'
__tabversion__ = '3.10'

import re
import sys
import types
import copy
import os
import inspect

# This tuple contains known string types
try:
    # Python 2.6
    StringTypes = (types.StringType, types.UnicodeType)
except AttributeError:
    # Python 3.0
    StringTypes = (str, bytes)

# This regular expression is used to match valid token names
_is_identifier = re.compile(r'^[a-zA-Z0-9_]+$')

# Exception thrown when invalid token encountered and no default error
# handler is defined.
class LexError(Exception):
    def __init__(self, message, s):
        self.args = (message,)
        self.text = s


# Token class.  This class is used to represent the tokens produced.
class LexToken(object):
    def __str__(self):
        return 'LexToken(%s,%r,%d,%d)' % (self.type, self.value, self.lineno, self.lexpos)

    def __repr__(self):
        return str(self)


# This object is a stand-in for a logging object created by the
# logging module.

class PlyLogger(object):
    def __init__(self, f):
        self.f = f

    def critical(self, msg, *args, **kwargs):
        self.f.write((msg % args) + '\n')

    def warning(self, msg, *args, **kwargs):
        self.f.write('WARNING: ' + (msg % args) + '\n')

    def error(self, msg, *args, **kwargs):
        self.f.write('ERROR: ' + (msg % args) + '\n')

    info = critical
    debug = critical


# Null logger is used when no output is generated. Does nothing.
class NullLogger(object):
    def __getattribute__(self, name):
        return self

    def __call__(self, *args, **kwargs):
        return self


# -----------------------------------------------------------------------------
#                        === Lexing Engine ===
#
# The following Lexer class implements the lexer runtime.   There are only
# a few public methods and attributes:
#
#    input()          -  Store a new string in the lexer
#    token()          -  Get the next token
#    clone()          -  Clone the lexer
#
#    lineno           -  Current line number
#    lexpos           -  Current position in the input string
# -----------------------------------------------------------------------------
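# A minimal usage sketch (standard PLY conventions; the token rules below are
# illustrative and not part of this file):
#
#   from sepolgen import lex
#
#   tokens = ('NUMBER', 'PLUS')
#   t_PLUS = r'\+'
#   t_ignore = ' \t'
#   def t_NUMBER(t):
#       r'\d+'
#       t.value = int(t.value)
#       return t
#   def t_error(t):
#       t.lexer.skip(1)
#
#   lexer = lex.lex()          # build the lexer from the enclosing module
#   lexer.input("1 + 2")
#   while True:
#       tok = lexer.token()    # None when the input is exhausted
#       if not tok:
#           break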

class Lexer:
    def __init__(self):
        self.lexre = None             # Master regular expression. This is a list of
                                      # tuples (re, findex) where re is a compiled
                                      # regular expression and findex is a list
                                      # mapping regex group numbers to rules
        self.lexretext = None         # Current regular expression strings
        self.lexstatere = {}          # Dictionary mapping lexer states to master regexs
        self.lexstateretext = {}      # Dictionary mapping lexer states to regex strings
        self.lexstaterenames = {}     # Dictionary mapping lexer states to symbol names
        self.lexstate = 'INITIAL'     # Current lexer state
        self.lexstatestack = []       # Stack of lexer states
        self.lexstateinfo = None      # State information
        self.lexstateignore = {}      # Dictionary of ignored characters for each state
        self.lexstateerrorf = {}      # Dictionary of error functions for each state
        self.lexstateeoff = {}        # Dictionary of eof functions for each state
        self.lexreflags = 0           # Optional re compile flags
        self.lexdata = None           # Actual input data (as a string)
        self.lexpos = 0               # Current position in input text
        self.lexlen = 0               # Length of the input text
        self.lexerrorf = None         # Error rule (if any)
        self.lexeoff = None           # EOF rule (if any)
        self.lextokens = None         # List of valid tokens
        self.lexignore = ''           # Ignored characters
        self.lexliterals = ''         # Literal characters that can be passed through
        self.lexmodule = None         # Module
        self.lineno = 1               # Current line number
        self.lexoptimize = False      # Optimized mode

    def clone(self, object=None):
        c = copy.copy(self)

        # If the object parameter has been supplied, it means we are attaching the
        # lexer to a new object.  In this case, we have to rebind all methods in
        # the lexstatere and lexstateerrorf tables.

        if object:
            newtab = {}
            for key, ritem in self.lexstatere.items():
                newre = []
                for cre, findex in ritem:
                    newfindex = []
                    for f in findex:
                        if not f or not f[0]:
                            newfindex.append(f)
                            continue
                        newfindex.append((getattr(object, f[0].__name__), f[1]))
                newre.append((cre, newfindex))
                newtab[key] = newre
            c.lexstatere = newtab
            c.lexstateerrorf = {}
            for key, ef in self.lexstateerrorf.items():
                c.lexstateerrorf[key] = getattr(object, ef.__name__)
            c.lexmodule = object
        return c

    # ------------------------------------------------------------
    # writetab() - Write lexer information to a table file
    # ------------------------------------------------------------
    def writetab(self, lextab, outputdir=''):
        if isinstance(lextab, types.ModuleType):
            raise IOError("Won't overwrite existing lextab module")
        basetabmodule = lextab.split('.')[-1]
        filename = os.path.join(outputdir, basetabmodule) + '.py'
        with open(filename, 'w') as tf:
            tf.write('# %s.py. This file automatically created by PLY (version %s). Don\'t edit!\n' % (basetabmodule, __version__))
            tf.write('_tabversion   = %s\n' % repr(__tabversion__))
            tf.write('_lextokens    = set(%s)\n' % repr(tuple(sorted(self.lextokens))))
            tf.write('_lexreflags   = %s\n' % repr(int(self.lexreflags)))
            tf.write('_lexliterals  = %s\n' % repr(self.lexliterals))
            tf.write('_lexstateinfo = %s\n' % repr(self.lexstateinfo))

            # Rewrite the lexstatere table, replacing function objects with function names
            tabre = {}
            for statename, lre in self.lexstatere.items():
                titem = []
                for (pat, func), retext, renames in zip(lre, self.lexstateretext[statename], self.lexstaterenames[statename]):
                    titem.append((retext, _funcs_to_names(func, renames)))
                tabre[statename] = titem

            tf.write('_lexstatere   = %s\n' % repr(tabre))
            tf.write('_lexstateignore = %s\n' % repr(self.lexstateignore))

            taberr = {}
            for statename, ef in self.lexstateerrorf.items():
                taberr[statename] = ef.__name__ if ef else None
            tf.write('_lexstateerrorf = %s\n' % repr(taberr))

            tabeof = {}
            for statename, ef in self.lexstateeoff.items():
                tabeof[statename] = ef.__name__ if ef else None
            tf.write('_lexstateeoff = %s\n' % repr(tabeof))

    # ------------------------------------------------------------
    # readtab() - Read lexer information from a tab file
    # ------------------------------------------------------------
    def readtab(self, tabfile, fdict):
        if isinstance(tabfile, types.ModuleType):
            lextab = tabfile
        else:
            exec('import %s' % tabfile)
            lextab = sys.modules[tabfile]

        if getattr(lextab, '_tabversion', '0.0') != __tabversion__:
            raise ImportError('Inconsistent PLY version')

        self.lextokens      = lextab._lextokens
        self.lexreflags     = lextab._lexreflags
        self.lexliterals    = lextab._lexliterals
        self.lextokens_all  = self.lextokens | set(self.lexliterals)
        self.lexstateinfo   = lextab._lexstateinfo
        self.lexstateignore = lextab._lexstateignore
        self.lexstatere     = {}
        self.lexstateretext = {}
        for statename, lre in lextab._lexstatere.items():
            titem = []
            txtitem = []
            for pat, func_name in lre:
                titem.append((re.compile(pat, lextab._lexreflags), _names_to_funcs(func_name, fdict)))

            self.lexstatere[statename] = titem
            self.lexstateretext[statename] = txtitem

        self.lexstateerrorf = {}
        for statename, ef in lextab._lexstateerrorf.items():
            self.lexstateerrorf[statename] = fdict[ef]

        self.lexstateeoff = {}
        for statename, ef in lextab._lexstateeoff.items():
            self.lexstateeoff[statename] = fdict[ef]

        self.begin('INITIAL')

    # ------------------------------------------------------------
    # input() - Push a new string into the lexer
    # ------------------------------------------------------------
    def input(self, s):
        # Pull off the first character to see if s looks like a string
        c = s[:1]
        if not isinstance(c, StringTypes):
            raise ValueError('Expected a string')
        self.lexdata = s
        self.lexpos = 0
        self.lexlen = len(s)

    # ------------------------------------------------------------
    # begin() - Changes the lexing state
    # ------------------------------------------------------------
    def begin(self, state):
        if state not in self.lexstatere:
            raise ValueError('Undefined state')
        self.lexre = self.lexstatere[state]
        self.lexretext = self.lexstateretext[state]
        self.lexignore = self.lexstateignore.get(state, '')
        self.lexerrorf = self.lexstateerrorf.get(state, None)
        self.lexeoff = self.lexstateeoff.get(state, None)
        self.lexstate = state

    # ------------------------------------------------------------
    # push_state() - Changes the lexing state and saves old on stack
    # ------------------------------------------------------------
    def push_state(self, state):
        self.lexstatestack.append(self.lexstate)
        self.begin(state)

    # ------------------------------------------------------------
    # pop_state() - Restores the previous state
    # ------------------------------------------------------------
    def pop_state(self):
        self.begin(self.lexstatestack.pop())

    # ------------------------------------------------------------
    # current_state() - Returns the current lexing state
    # ------------------------------------------------------------
    def current_state(self):
        return self.lexstate

    # ------------------------------------------------------------
    # skip() - Skip ahead n characters
    # ------------------------------------------------------------
    def skip(self, n):
        self.lexpos += n

    # ------------------------------------------------------------
    # token() - Return the next token from the Lexer
    #
    # Note: This function has been carefully implemented to be as fast
    # as possible.  Don't make changes unless you really know what
    # you are doing
    # ------------------------------------------------------------
    def token(self):
        # Make local copies of frequently referenced attributes
        lexpos    = self.lexpos
        lexlen    = self.lexlen
        lexignore = self.lexignore
        lexdata   = self.lexdata

        while lexpos < lexlen:
            # This provides a short-circuit for whitespace, tabs, and other ignored characters
            if lexdata[lexpos] in lexignore:
                lexpos += 1
                continue

            # Look for a regular expression match
            for lexre, lexindexfunc in self.lexre:
                m = lexre.match(lexdata, lexpos)
                if not m:
                    continue

                # Create a token for return
                tok = LexToken()
                tok.value = m.group()
                tok.lineno = self.lineno
                tok.lexpos = lexpos

                i = m.lastindex
                func, tok.type = lexindexfunc[i]

                if not func:
                    # If no token type was set, it's an ignored token
                    if tok.type:
                        self.lexpos = m.end()
                        return tok
                    else:
                        lexpos = m.end()
                        break

                lexpos = m.end()

                # If token is processed by a function, call it

                tok.lexer = self      # Set additional attributes useful in token rules
                self.lexmatch = m
                self.lexpos = lexpos

                newtok = func(tok)

                # Every function must return a token. If it returns nothing, we just move on to the next token
                if not newtok:
                    lexpos    = self.lexpos         # This is here in case user has updated lexpos.
                    lexignore = self.lexignore      # This is here in case there was a state change
                    break

                # Verify type of the token.  If not in the token map, raise an error
                if not self.lexoptimize:
                    if newtok.type not in self.lextokens_all:
                        raise LexError("%s:%d: Rule '%s' returned an unknown token type '%s'" % (
                            func.__code__.co_filename, func.__code__.co_firstlineno,
                            func.__name__, newtok.type), lexdata[lexpos:])

                return newtok
            else:
                # No match, see if in literals
                if lexdata[lexpos] in self.lexliterals:
                    tok = LexToken()
                    tok.value = lexdata[lexpos]
                    tok.lineno = self.lineno
                    tok.type = tok.value
                    tok.lexpos = lexpos
                    self.lexpos = lexpos + 1
                    return tok

                # No match. Call t_error() if defined.
                if self.lexerrorf:
                    tok = LexToken()
                    tok.value = self.lexdata[lexpos:]
                    tok.lineno = self.lineno
                    tok.type = 'error'
                    tok.lexer = self
                    tok.lexpos = lexpos
                    self.lexpos = lexpos
                    newtok = self.lexerrorf(tok)
                    if lexpos == self.lexpos:
                        # Error method didn't change text position at all. This is an error.
                        raise LexError("Scanning error. Illegal character '%s'" % (lexdata[lexpos]), lexdata[lexpos:])
                    lexpos = self.lexpos
                    if not newtok:
                        continue
                    return newtok

                self.lexpos = lexpos
                raise LexError("Illegal character '%s' at index %d" % (lexdata[lexpos], lexpos), lexdata[lexpos:])

        if self.lexeoff:
            tok = LexToken()
            tok.type = 'eof'
            tok.value = ''
            tok.lineno = self.lineno
            tok.lexpos = lexpos
            tok.lexer = self
            self.lexpos = lexpos
            newtok = self.lexeoff(tok)
            return newtok

        self.lexpos = lexpos + 1
        if self.lexdata is None:
            raise RuntimeError('No input string given with input()')
        return None

    # Iterator interface
    def __iter__(self):
        return self

    def next(self):
        t = self.token()
        if t is None:
            raise StopIteration
        return t

    __next__ = next
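
# ------------------------------------------------------------
# A minimal runtime sketch (illustrative only; it is not called anywhere in
# this module).  It assumes `lexer` is an object already built by lex(),
# defined further below, and simply exercises the public input()/token()
# interface described in the comment block above the Lexer class.
# ------------------------------------------------------------
def _example_token_loop(lexer, data):
    # Feed a string to the lexer and pull tokens until token() returns None
    lexer.input(data)
    while True:
        tok = lexer.token()
        if not tok:
            break
        sys.stdout.write('%s\n' % tok)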

# -----------------------------------------------------------------------------
#                           ==== Lex Builder ===
#
# The functions and classes below are used to collect lexing information
# and build a Lexer object from it.
# -----------------------------------------------------------------------------

# -----------------------------------------------------------------------------
# _get_regex(func)
#
# Returns the regular expression assigned to a function either as a doc string
# or as a .regex attribute attached by the @TOKEN decorator.
# -----------------------------------------------------------------------------
def _get_regex(func):
    return getattr(func, 'regex', func.__doc__)

# -----------------------------------------------------------------------------
# get_caller_module_dict()
#
# This function returns a dictionary containing all of the symbols defined within
# a caller further down the call stack.  This is used to get the environment
# associated with the lex() call if none was provided.
# -----------------------------------------------------------------------------
def get_caller_module_dict(levels):
    f = sys._getframe(levels)
    ldict = f.f_globals.copy()
    if f.f_globals != f.f_locals:
        ldict.update(f.f_locals)
    return ldict
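
# A small sketch of what get_caller_module_dict() returns (illustrative only;
# not called anywhere in this module).  Note that levels counts frames from
# inside get_caller_module_dict() itself, so lex() passes 2 to reach its own
# caller, while the direct call below only needs 1.
def _example_caller_dict():
    local_marker = 42
    d = get_caller_module_dict(1)      # frame of this function
    return 'local_marker' in d         # True: locals are merged over globals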

# -----------------------------------------------------------------------------
# _funcs_to_names()
#
# Given a list of regular expression functions, this converts it to a list
# suitable for output to a table file
# -----------------------------------------------------------------------------
def _funcs_to_names(funclist, namelist):
    result = []
    for f, name in zip(funclist, namelist):
        if f and f[0]:
            result.append((name, f[1]))
        else:
            result.append(f)
    return result

# -----------------------------------------------------------------------------
# _names_to_funcs()
#
# Given a list of regular expression function names, this converts it back to
# functions.
# -----------------------------------------------------------------------------
def _names_to_funcs(namelist, fdict):
    result = []
    for n in namelist:
        if n and n[0]:
            result.append((fdict[n[0]], n[1]))
        else:
            result.append(n)
    return result

# -----------------------------------------------------------------------------
# _form_master_re()
#
# This function takes a list of all of the regex components and attempts to
# form the master regular expression.  Given limitations in the Python re
# module, it may be necessary to break the master regex into separate expressions.
# -----------------------------------------------------------------------------
def _form_master_re(relist, reflags, ldict, toknames):
    if not relist:
        return []
    regex = '|'.join(relist)
    try:
        lexre = re.compile(regex, reflags)

        # Build the index to function map for the matching engine
        lexindexfunc = [None] * (max(lexre.groupindex.values()) + 1)
        lexindexnames = lexindexfunc[:]

        for f, i in lexre.groupindex.items():
            handle = ldict.get(f, None)
            if type(handle) in (types.FunctionType, types.MethodType):
                lexindexfunc[i] = (handle, toknames[f])
                lexindexnames[i] = f
            elif handle is not None:
                lexindexnames[i] = f
                if f.find('ignore_') > 0:
                    lexindexfunc[i] = (None, None)
                else:
                    lexindexfunc[i] = (None, toknames[f])

        return [(lexre, lexindexfunc)], [regex], [lexindexnames]
    except Exception:
        m = int(len(relist)/2)
        if m == 0:
            m = 1
        llist, lre, lnames = _form_master_re(relist[:m], reflags, ldict, toknames)
        rlist, rre, rnames = _form_master_re(relist[m:], reflags, ldict, toknames)
        return (llist+rlist), (lre+rre), (lnames+rnames)

# -----------------------------------------------------------------------------
# def _statetoken(s,names)
#
# Given a declaration name s of the form "t_" followed by optional state names
# and a token name, and a dictionary whose keys are state names, this function
# returns a tuple (states,tokenname) where states
# is a tuple of state names and tokenname is the name of the token.  For example,
# calling this with s = "t_foo_bar_SPAM" might return (('foo','bar'),'SPAM')
# -----------------------------------------------------------------------------
def _statetoken(s, names):
    parts = s.split('_')
    for i, part in enumerate(parts[1:], 1):
        if part not in names and part != 'ANY':
            break

    if i > 1:
        states = tuple(parts[1:i])
    else:
        states = ('INITIAL',)

    if 'ANY' in states:
        states = tuple(names)

    tokenname = '_'.join(parts[i:])
    return (states, tokenname)
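
# A small sketch of the behaviour described above (illustrative only; the
# state names 'foo' and 'bar' are hypothetical and not used elsewhere).
def _example_statetoken():
    names = {'INITIAL': 'inclusive', 'foo': 'exclusive', 'bar': 'exclusive'}
    return _statetoken('t_foo_bar_SPAM', names)    # (('foo', 'bar'), 'SPAM')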


# -----------------------------------------------------------------------------
# LexerReflect()
#
# This class represents information needed to build a lexer as extracted from a
# user's input file.
# -----------------------------------------------------------------------------
class LexerReflect(object):
    def __init__(self, ldict, log=None, reflags=0):
        self.ldict      = ldict
        self.error_func = None
        self.tokens     = []
        self.reflags    = reflags
        self.stateinfo  = {'INITIAL': 'inclusive'}
        self.modules    = set()
        self.error      = False
        self.log        = PlyLogger(sys.stderr) if log is None else log

    # Get all of the basic information
    def get_all(self):
        self.get_tokens()
        self.get_literals()
        self.get_states()
        self.get_rules()

    # Validate all of the information
    def validate_all(self):
        self.validate_tokens()
        self.validate_literals()
        self.validate_rules()
        return self.error

    # Get the tokens map
    def get_tokens(self):
        tokens = self.ldict.get('tokens', None)
        if not tokens:
            self.log.error('No token list is defined')
            self.error = True
            return

        if not isinstance(tokens, (list, tuple)):
            self.log.error('tokens must be a list or tuple')
            self.error = True
            return

        if not tokens:
            self.log.error('tokens is empty')
            self.error = True
            return

        self.tokens = tokens

    # Validate the tokens
    def validate_tokens(self):
        terminals = {}
        for n in self.tokens:
            if not _is_identifier.match(n):
                self.log.error("Bad token name '%s'", n)
                self.error = True
            if n in terminals:
                self.log.warning("Token '%s' multiply defined", n)
            terminals[n] = 1

    # Get the literals specifier
    def get_literals(self):
        self.literals = self.ldict.get('literals', '')
        if not self.literals:
            self.literals = ''

    # Validate literals
    def validate_literals(self):
        try:
            for c in self.literals:
                if not isinstance(c, StringTypes) or len(c) > 1:
                    self.log.error('Invalid literal %s. Must be a single character', repr(c))
                    self.error = True

        except TypeError:
            self.log.error('Invalid literals specification. literals must be a sequence of characters')
            self.error = True

    def get_states(self):
        self.states = self.ldict.get('states', None)
        # Build statemap
        if self.states:
            if not isinstance(self.states, (tuple, list)):
                self.log.error('states must be defined as a tuple or list')
                self.error = True
            else:
                for s in self.states:
                    if not isinstance(s, tuple) or len(s) != 2:
                        self.log.error("Invalid state specifier %s. Must be a tuple (statename,'exclusive|inclusive')", repr(s))
                        self.error = True
                        continue
                    name, statetype = s
                    if not isinstance(name, StringTypes):
                        self.log.error('State name %s must be a string', repr(name))
                        self.error = True
                        continue
                    if not (statetype == 'inclusive' or statetype == 'exclusive'):
                        self.log.error("State type for state %s must be 'inclusive' or 'exclusive'", name)
                        self.error = True
                        continue
                    if name in self.stateinfo:
                        self.log.error("State '%s' already defined", name)
                        self.error = True
                        continue
                    self.stateinfo[name] = statetype

    # Get all of the symbols with a t_ prefix and sort them into various
    # categories (functions, strings, error functions, and ignore characters)

    def get_rules(self):
        tsymbols = [f for f in self.ldict if f[:2] == 't_']

        # Now build up a list of functions and a list of strings
        self.toknames = {}        # Mapping of symbols to token names
        self.funcsym  = {}        # Symbols defined as functions
        self.strsym   = {}        # Symbols defined as strings
        self.ignore   = {}        # Ignore strings by state
        self.errorf   = {}        # Error functions by state
        self.eoff     = {}        # EOF functions by state

        for s in self.stateinfo:
            self.funcsym[s] = []
            self.strsym[s] = []

        if len(tsymbols) == 0:
            self.log.error('No rules of the form t_rulename are defined')
            self.error = True
            return

        for f in tsymbols:
            t = self.ldict[f]
            states, tokname = _statetoken(f, self.stateinfo)
            self.toknames[f] = tokname

            if hasattr(t, '__call__'):
                if tokname == 'error':
                    for s in states:
                        self.errorf[s] = t
                elif tokname == 'eof':
                    for s in states:
                        self.eoff[s] = t
                elif tokname == 'ignore':
                    line = t.__code__.co_firstlineno
                    file = t.__code__.co_filename
                    self.log.error("%s:%d: Rule '%s' must be defined as a string", file, line, t.__name__)
                    self.error = True
                else:
                    for s in states:
                        self.funcsym[s].append((f, t))
            elif isinstance(t, StringTypes):
                if tokname == 'ignore':
                    for s in states:
                        self.ignore[s] = t
                    if '\\' in t:
                        self.log.warning("%s contains a literal backslash '\\'", f)

                elif tokname == 'error':
                    self.log.error("Rule '%s' must be defined as a function", f)
                    self.error = True
                else:
                    for s in states:
                        self.strsym[s].append((f, t))
            else:
                self.log.error('%s not defined as a function or string', f)
                self.error = True

        # Sort the functions by line number
        for f in self.funcsym.values():
            f.sort(key=lambda x: x[1].__code__.co_firstlineno)

        # Sort the strings by regular expression length
        for s in self.strsym.values():
            s.sort(key=lambda x: len(x[1]), reverse=True)

    # Validate all of the t_rules collected
    def validate_rules(self):
        for state in self.stateinfo:
            # Validate all rules defined by functions

            for fname, f in self.funcsym[state]:
                line = f.__code__.co_firstlineno
                file = f.__code__.co_filename
                module = inspect.getmodule(f)
                self.modules.add(module)

                tokname = self.toknames[fname]
                if isinstance(f, types.MethodType):
                    reqargs = 2
                else:
                    reqargs = 1
                nargs = f.__code__.co_argcount
                if nargs > reqargs:
                    self.log.error("%s:%d: Rule '%s' has too many arguments", file, line, f.__name__)
                    self.error = True
                    continue

                if nargs < reqargs:
                    self.log.error("%s:%d: Rule '%s' requires an argument", file, line, f.__name__)
                    self.error = True
                    continue

                if not _get_regex(f):
                    self.log.error("%s:%d: No regular expression defined for rule '%s'", file, line, f.__name__)
                    self.error = True
                    continue

                try:
                    c = re.compile('(?P<%s>%s)' % (fname, _get_regex(f)), self.reflags)
                    if c.match(''):
                        self.log.error("%s:%d: Regular expression for rule '%s' matches empty string", file, line, f.__name__)
                        self.error = True
                except re.error as e:
                    self.log.error("%s:%d: Invalid regular expression for rule '%s'. %s", file, line, f.__name__, e)
                    if '#' in _get_regex(f):
                        self.log.error("%s:%d. Make sure '#' in rule '%s' is escaped with '\\#'", file, line, f.__name__)
                    self.error = True

            # Validate all rules defined by strings
            for name, r in self.strsym[state]:
                tokname = self.toknames[name]
                if tokname == 'error':
                    self.log.error("Rule '%s' must be defined as a function", name)
                    self.error = True
                    continue

                if tokname not in self.tokens and tokname.find('ignore_') < 0:
                    self.log.error("Rule '%s' defined for an unspecified token %s", name, tokname)
                    self.error = True
                    continue

                try:
                    c = re.compile('(?P<%s>%s)' % (name, r), self.reflags)
                    if (c.match('')):
                        self.log.error("Regular expression for rule '%s' matches empty string", name)
                        self.error = True
                except re.error as e:
                    self.log.error("Invalid regular expression for rule '%s'. %s", name, e)
                    if '#' in r:
                        self.log.error("Make sure '#' in rule '%s' is escaped with '\\#'", name)
                    self.error = True

            if not self.funcsym[state] and not self.strsym[state]:
                self.log.error("No rules defined for state '%s'", state)
                self.error = True

            # Validate the error function
            efunc = self.errorf.get(state, None)
            if efunc:
                f = efunc
                line = f.__code__.co_firstlineno
                file = f.__code__.co_filename
                module = inspect.getmodule(f)
                self.modules.add(module)

                if isinstance(f, types.MethodType):
                    reqargs = 2
                else:
                    reqargs = 1
                nargs = f.__code__.co_argcount
                if nargs > reqargs:
                    self.log.error("%s:%d: Rule '%s' has too many arguments", file, line, f.__name__)
                    self.error = True

                if nargs < reqargs:
                    self.log.error("%s:%d: Rule '%s' requires an argument", file, line, f.__name__)
                    self.error = True

        for module in self.modules:
            self.validate_module(module)

    # -----------------------------------------------------------------------------
    # validate_module()
    #
    # This checks to see if there are duplicated t_rulename() functions or strings
    # in the parser input file.  This is done using a simple regular expression
    # match on each line in the source code of the given module.
    # -----------------------------------------------------------------------------

    def validate_module(self, module):
        try:
            lines, linen = inspect.getsourcelines(module)
        except IOError:
            return

        fre = re.compile(r'\s*def\s+(t_[a-zA-Z_0-9]*)\(')
        sre = re.compile(r'\s*(t_[a-zA-Z_0-9]*)\s*=')

        counthash = {}
        linen += 1
        for line in lines:
            m = fre.match(line)
            if not m:
                m = sre.match(line)
            if m:
                name = m.group(1)
                prev = counthash.get(name)
                if not prev:
                    counthash[name] = linen
                else:
                    filename = inspect.getsourcefile(module)
                    self.log.error('%s:%d: Rule %s redefined. Previously defined on line %d', filename, linen, name, prev)
                    self.error = True
            linen += 1

# -----------------------------------------------------------------------------
# lex(module)
#
# Build all of the regular expression rules from definitions in the supplied module
# -----------------------------------------------------------------------------
def lex(module=None, object=None, debug=False, optimize=False, lextab='lextab',
        reflags=int(re.VERBOSE), nowarn=False, outputdir=None, debuglog=None, errorlog=None):

    if lextab is None:
        lextab = 'lextab'

    global lexer

    ldict = None
    stateinfo  = {'INITIAL': 'inclusive'}
    lexobj = Lexer()
    lexobj.lexoptimize = optimize
    global token, input

    if errorlog is None:
        errorlog = PlyLogger(sys.stderr)

    if debug:
        if debuglog is None:
            debuglog = PlyLogger(sys.stderr)

    # If an object instance was supplied, use it as the source of the token rules
    if object:
        module = object

    # Get the module dictionary used for the lexer
    if module:
        _items = [(k, getattr(module, k)) for k in dir(module)]
        ldict = dict(_items)
        # If no __file__ attribute is available, try to obtain it from the __module__ instead
        if '__file__' not in ldict:
            ldict['__file__'] = sys.modules[ldict['__module__']].__file__
    else:
        ldict = get_caller_module_dict(2)

    # Determine if the module is part of a package or not.
    # If so, fix the lextab setting so that tables load correctly
    pkg = ldict.get('__package__')
    if pkg and isinstance(lextab, str):
        if '.' not in lextab:
            lextab = pkg + '.' + lextab

    # Collect lexer information from the dictionary
    linfo = LexerReflect(ldict, log=errorlog, reflags=reflags)
    linfo.get_all()
    if not optimize:
        if linfo.validate_all():
            raise SyntaxError("Can't build lexer")

    if optimize and lextab:
        try:
            lexobj.readtab(lextab, ldict)
            token = lexobj.token
            input = lexobj.input
            lexer = lexobj
            return lexobj

        except ImportError:
            pass

    # Dump some basic debugging information
    if debug:
        debuglog.info('lex: tokens   = %r', linfo.tokens)
        debuglog.info('lex: literals = %r', linfo.literals)
        debuglog.info('lex: states   = %r', linfo.stateinfo)

    # Build a dictionary of valid token names
    lexobj.lextokens = set()
    for n in linfo.tokens:
        lexobj.lextokens.add(n)

    # Get literals specification
    if isinstance(linfo.literals, (list, tuple)):
        lexobj.lexliterals = type(linfo.literals[0])().join(linfo.literals)
    else:
        lexobj.lexliterals = linfo.literals

    lexobj.lextokens_all = lexobj.lextokens | set(lexobj.lexliterals)

    # Get the stateinfo dictionary
    stateinfo = linfo.stateinfo

    regexs = {}
    # Build the master regular expressions
    for state in stateinfo:
        regex_list = []

        # Add rules defined by functions first
        for fname, f in linfo.funcsym[state]:
            regex_list.append('(?P<%s>%s)' % (fname, _get_regex(f)))
            if debug:
                debuglog.info("lex: Adding rule %s -> '%s' (state '%s')", fname, _get_regex(f), state)

        # Now add all of the simple rules
        for name, r in linfo.strsym[state]:
            regex_list.append('(?P<%s>%s)' % (name, r))
            if debug:
                debuglog.info("lex: Adding rule %s -> '%s' (state '%s')", name, r, state)

        regexs[state] = regex_list

    # Build the master regular expressions

    if debug:
        debuglog.info('lex: ==== MASTER REGEXS FOLLOW ====')

    for state in regexs:
        lexre, re_text, re_names = _form_master_re(regexs[state], reflags, ldict, linfo.toknames)
        lexobj.lexstatere[state] = lexre
        lexobj.lexstateretext[state] = re_text
        lexobj.lexstaterenames[state] = re_names
        if debug:
            for i, text in enumerate(re_text):
                debuglog.info("lex: state '%s' : regex[%d] = '%s'", state, i, text)

    # For inclusive states, we need to add the regular expressions from the INITIAL state
    for state, stype in stateinfo.items():
        if state != 'INITIAL' and stype == 'inclusive':
            lexobj.lexstatere[state].extend(lexobj.lexstatere['INITIAL'])
            lexobj.lexstateretext[state].extend(lexobj.lexstateretext['INITIAL'])
            lexobj.lexstaterenames[state].extend(lexobj.lexstaterenames['INITIAL'])

    lexobj.lexstateinfo = stateinfo
    lexobj.lexre = lexobj.lexstatere['INITIAL']
    lexobj.lexretext = lexobj.lexstateretext['INITIAL']
    lexobj.lexreflags = reflags

    # Set up ignore variables
    lexobj.lexstateignore = linfo.ignore
    lexobj.lexignore = lexobj.lexstateignore.get('INITIAL', '')

    # Set up error functions
    lexobj.lexstateerrorf = linfo.errorf
    lexobj.lexerrorf = linfo.errorf.get('INITIAL', None)
    if not lexobj.lexerrorf:
        errorlog.warning('No t_error rule is defined')

    # Set up eof functions
    lexobj.lexstateeoff = linfo.eoff
    lexobj.lexeoff = linfo.eoff.get('INITIAL', None)

    # Check state information for ignore and error rules
    for s, stype in stateinfo.items():
        if stype == 'exclusive':
            if s not in linfo.errorf:
                errorlog.warning("No error rule is defined for exclusive state '%s'", s)
            if s not in linfo.ignore and lexobj.lexignore:
                errorlog.warning("No ignore rule is defined for exclusive state '%s'", s)
        elif stype == 'inclusive':
            if s not in linfo.errorf:
                linfo.errorf[s] = linfo.errorf.get('INITIAL', None)
            if s not in linfo.ignore:
                linfo.ignore[s] = linfo.ignore.get('INITIAL', '')

    # Create global versions of the token() and input() functions
    token = lexobj.token
    input = lexobj.input
    lexer = lexobj

    # If in optimize mode, we write the lextab
    if lextab and optimize:
        if outputdir is None:
            # If no output directory is set, the location of the output files
            # is determined according to the following rules:
            #     - If lextab specifies a package, files go into that package directory
            #     - Otherwise, files go in the same directory as the specifying module
            if isinstance(lextab, types.ModuleType):
                srcfile = lextab.__file__
            else:
                if '.' not in lextab:
                    srcfile = ldict['__file__']
                else:
                    parts = lextab.split('.')
                    pkgname = '.'.join(parts[:-1])
                    exec('import %s' % pkgname)
                    srcfile = getattr(sys.modules[pkgname], '__file__', '')
            outputdir = os.path.dirname(srcfile)
        try:
            lexobj.writetab(lextab, outputdir)
            if lextab in sys.modules:
                del sys.modules[lextab]
        except IOError as e:
            errorlog.warning("Couldn't write lextab module %r. %s" % (lextab, e))

    return lexobj
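
# -----------------------------------------------------------------------------
# A minimal end-to-end sketch of building and using a lexer (illustrative only;
# it is not called anywhere in this module).  The token names and rules below
# are hypothetical.  Because lex() falls back to get_caller_module_dict(), the
# rule definitions are picked up from this function's local scope; in normal
# use they live at the top level of the calling module.
# -----------------------------------------------------------------------------
def _example_build_and_tokenize(data='3 + 4'):
    tokens = ('NUMBER', 'PLUS')

    t_PLUS = r'\+'
    t_ignore = ' \t'

    def t_NUMBER(t):
        r'\d+'
        t.value = int(t.value)
        return t

    def t_error(t):
        t.lexer.skip(1)

    lexer = lex()
    lexer.input(data)
    return [(tok.type, tok.value) for tok in lexer]   # [('NUMBER', 3), ('PLUS', '+'), ('NUMBER', 4)]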

# -----------------------------------------------------------------------------
# runmain()
#
# This runs the lexer as a main program
# -----------------------------------------------------------------------------

def runmain(lexer=None, data=None):
    if not data:
        try:
            filename = sys.argv[1]
            f = open(filename)
            data = f.read()
            f.close()
        except IndexError:
            sys.stdout.write('Reading from standard input (type EOF to end):\n')
            data = sys.stdin.read()

    if lexer:
        _input = lexer.input
    else:
        _input = input
    _input(data)
    if lexer:
        _token = lexer.token
    else:
        _token = token

    while True:
        tok = _token()
        if not tok:
            break
        sys.stdout.write('(%s,%r,%d,%d)\n' % (tok.type, tok.value, tok.lineno, tok.lexpos))

# -----------------------------------------------------------------------------
# @TOKEN(regex)
#
# This decorator function can be used to set the regular expression on a function
# when its docstring needs to be set in an alternative way
# -----------------------------------------------------------------------------

def TOKEN(r):
    def set_regex(f):
        if hasattr(r, '__call__'):
            f.regex = _get_regex(r)
        else:
            f.regex = r
        return f
    return set_regex

# Alternative spelling of the TOKEN decorator
Token = TOKEN
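
# A small sketch of applying the decorator (illustrative only; it is not called
# anywhere in this module and the 'ID' rule below is hypothetical).
def _example_token_decorator():
    identifier = r'[a-zA-Z_][a-zA-Z0-9_]*'

    @TOKEN(identifier)
    def t_ID(t):
        return t

    return _get_regex(t_ID)    # the pattern is now carried by the .regex attribute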
site-packages/sepolgen/audit.py
# Authors: Karl MacMillan <kmacmillan@mentalrootkit.com>
#
# Copyright (C) 2006 Red Hat 
# see file 'COPYING' for use and warranty information
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License as
# published by the Free Software Foundation; version 2 only
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software
# Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
#

import re
import sys

from . import refpolicy
from . import access
from . import util

# Convenience functions

def get_audit_boot_msgs():
    """Obtain all of the avc and policy load messages from the audit
    log. This function uses ausearch and requires that the current
    process have sufficient rights to run ausearch.

    Returns:
       string containing all of the audit messages returned by ausearch.
    """
    import subprocess
    import time
    fd = open("/proc/uptime", "r")
    off = float(fd.read().split()[0])
    fd.close()
    s = time.localtime(time.time() - off)
    bootdate = time.strftime("%x", s)
    boottime = time.strftime("%X", s)
    output = subprocess.Popen(["/sbin/ausearch", "-m", "AVC,USER_AVC,MAC_POLICY_LOAD,DAEMON_START,SELINUX_ERR", "-ts", bootdate, boottime],
                              stdout=subprocess.PIPE).communicate()[0]
    if util.PY3:
        output = util.decode_input(output)
    return output

def get_audit_msgs():
    """Obtain all of the avc and policy load messages from the audit
    log. This function uses ausearch and requires that the current
    process have sufficient rights to run ausearch.

    Returns:
       string containing all of the audit messages returned by ausearch.
    """
    import subprocess
    output = subprocess.Popen(["/sbin/ausearch", "-m", "AVC,USER_AVC,MAC_POLICY_LOAD,DAEMON_START,SELINUX_ERR"],
                              stdout=subprocess.PIPE).communicate()[0]
    if util.PY3:
        output = util.decode_input(output)
    return output

def get_dmesg_msgs():
    """Obtain all of the avc and policy load messages from /bin/dmesg.

    Returns:
       string containing all of the audit messages returned by dmesg.
    """
    import subprocess
    output = subprocess.Popen(["/bin/dmesg"],
                              stdout=subprocess.PIPE).communicate()[0]
    if util.PY3:
        output = util.decode_input(output)
    return output

# Classes representing audit messages

class AuditMessage:
    """Base class for all objects representing audit messages.

    AuditMessage is a base class for all audit messages and only
    provides storage for the raw message (as a string) and a
    parsing function that does nothing.
    """
    def __init__(self, message):
        self.message = message
        self.header = ""

    def from_split_string(self, recs):
        """Parse a string that has been split into records by space into
        an audit message.

        This method should be overridden by subclasses. Error reporting
        should be done by raise ValueError exceptions.
        """
        for msg in recs:
            fields = msg.split("=")
            if len(fields) != 2:
                if msg[:6] == "audit(":
                    self.header = msg
                    return
                else:
                    continue
            
            if fields[0] == "msg":
                self.header = fields[1]
                return


class InvalidMessage(AuditMessage):
    """Class representing invalid audit messages. This is used to differentiate
    between audit messages that aren't recognized (that should return None from
    the audit message parser) and a message that is recognized but is malformed
    in some way.
    """
    def __init__(self, message):
        AuditMessage.__init__(self, message)

class PathMessage(AuditMessage):
    """Class representing a path message"""
    def __init__(self, message):
        AuditMessage.__init__(self, message)
        self.path = ""

    def from_split_string(self, recs):
        AuditMessage.from_split_string(self, recs)
        
        for msg in recs:
            fields = msg.split("=")
            if len(fields) != 2:
                continue
            if fields[0] == "path":
                self.path = fields[1][1:-1]
                return
import selinux.audit2why as audit2why

avcdict = {}

class AVCMessage(AuditMessage):
    """AVC message representing an access denial or granted message.

    This is a very basic class and does not represent all possible fields
    in an avc message. Currently the fields are:
       scontext - context for the source (process) that generated the message
       tcontext - context for the target
       tclass - object class for the target (only one)
       comm - the process name
       exe - the on-disc binary
       path - the path of the target
       access - list of accesses that were allowed or denied
       denial - boolean indicating whether this was a denial (True) or granted
          (False) message.
       ioctlcmd - ioctl 'request' parameter

    An example audit message generated from the audit daemon looks like (line breaks
    added):
       'type=AVC msg=audit(1155568085.407:10877): avc:  denied  { search } for
       pid=677 comm="python" name="modules" dev=dm-0 ino=13716388
       scontext=user_u:system_r:setroubleshootd_t:s0
       tcontext=system_u:object_r:modules_object_t:s0 tclass=dir'

    An example audit message stored in syslog (not processed by the audit daemon - line
    breaks added):
       'Sep 12 08:26:43 dhcp83-5 kernel: audit(1158064002.046:4): avc:  denied  { read }
       for  pid=2496 comm="bluez-pin" name=".gdm1K3IFT" dev=dm-0 ino=3601333
       scontext=user_u:system_r:bluetooth_helper_t:s0-s0:c0
       tcontext=system_u:object_r:xdm_tmp_t:s0 tclass=file'
    """
    def __init__(self, message):
        AuditMessage.__init__(self, message)
        self.scontext = refpolicy.SecurityContext()
        self.tcontext = refpolicy.SecurityContext()
        self.tclass = ""
        self.comm = ""
        self.exe = ""
        self.path = ""
        self.name = ""
        self.accesses = []
        self.denial = True
        self.ioctlcmd = None
        self.type = audit2why.TERULE

    def __parse_access(self, recs, start):
        # This is kind of sucky - the accesses are given in a space separated
        # list like '{ read write }'. This doesn't fit particularly well with splitting
        # the string on spaces. This function takes the list of recs and a starting
        # position one beyond the open brace. It then adds the accesses until it finds
        # the close brace or the end of the list (which is an error if reached without
        # seeing a close brace).
        found_close = False
        i = start
        if i == (len(recs) - 1):
            raise ValueError("AVC message in invalid format [%s]\n" % self.message)
        while i < len(recs):
            if recs[i] == "}":
                found_close = True
                break
            self.accesses.append(recs[i])
            i = i + 1
        if not found_close:
            raise ValueError("AVC message in invalid format [%s]\n" % self.message)
        return i + 1
        

    def from_split_string(self, recs):
        AuditMessage.from_split_string(self, recs)        
        # FUTURE - fully parse avc messages and store all possible fields
        # Required fields
        found_src = False
        found_tgt = False
        found_class = False
        found_access = False
        
        for i in range(len(recs)):
            if recs[i] == "{":
                i = self.__parse_access(recs, i + 1)
                found_access = True
                continue
            elif recs[i] == "granted":
                self.denial = False
            
            fields = recs[i].split("=")
            if len(fields) != 2:
                continue
            if fields[0] == "scontext":
                self.scontext = refpolicy.SecurityContext(fields[1])
                found_src = True
            elif fields[0] == "tcontext":
                self.tcontext = refpolicy.SecurityContext(fields[1])
                found_tgt = True
            elif fields[0] == "tclass":
                self.tclass = fields[1]
                found_class = True
            elif fields[0] == "comm":
                self.comm = fields[1][1:-1]
            elif fields[0] == "exe":
                self.exe = fields[1][1:-1]
            elif fields[0] == "name":
                self.name = fields[1][1:-1]
            elif fields[0] == "ioctlcmd":
                try:
                    self.ioctlcmd = int(fields[1], 16)
                except ValueError:
                    pass

        if not found_src or not found_tgt or not found_class or not found_access:
            raise ValueError("AVC message in invalid format [%s]\n" % self.message)
        self.analyze()

    def analyze(self):
        tcontext = self.tcontext.to_string()
        scontext = self.scontext.to_string()
        access_tuple = tuple( self.accesses)
        self.data = []

        if (scontext, tcontext, self.tclass, access_tuple) in avcdict.keys():
            self.type, self.data = avcdict[(scontext, tcontext, self.tclass, access_tuple)]
        else:
            self.type, self.data = audit2why.analyze(scontext, tcontext, self.tclass, self.accesses)
            if self.type == audit2why.NOPOLICY:
                self.type = audit2why.TERULE
            if self.type == audit2why.BADTCON:
                raise ValueError("Invalid Target Context %s\n" % tcontext)
            if self.type == audit2why.BADSCON:
                raise ValueError("Invalid Source Context %s\n" % scontext)
            if self.type == audit2why.BADTCLASS:
                raise ValueError("Invalid Type Class %s\n" % self.tclass)
            if self.type == audit2why.BADPERM:
                raise ValueError("Invalid permission %s\n" % " ".join(self.accesses))
            if self.type == audit2why.BADCOMPUTE:
                raise ValueError("Error during access vector computation")

            if self.type == audit2why.CONSTRAINT:
                self.data = [ self.data ]
                if self.scontext.user != self.tcontext.user:
                    self.data.append(("user (%s)" % self.scontext.user, 'user (%s)' % self.tcontext.user))
                if self.scontext.role != self.tcontext.role and self.tcontext.role != "object_r":
                    self.data.append(("role (%s)" % self.scontext.role, 'role (%s)' % self.tcontext.role))
                if self.scontext.level != self.tcontext.level:
                    self.data.append(("level (%s)" % self.scontext.level, 'level (%s)' % self.tcontext.level))

            avcdict[(scontext, tcontext, self.tclass, access_tuple)] = (self.type, self.data)
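
# A small sketch of feeding one raw avc line through AVCMessage (illustrative
# only; it is not called anywhere in this module).  Note that from_split_string()
# finishes by calling analyze(), which uses audit2why and therefore needs a
# loaded SELinux policy to succeed.
def _example_parse_one_avc(line):
    msg = AVCMessage(line)
    msg.from_split_string(line.split())
    return (msg.scontext.type, msg.tcontext.type, msg.tclass, msg.accesses)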

class PolicyLoadMessage(AuditMessage):
    """Audit message indicating that the policy was reloaded."""
    def __init__(self, message):
        AuditMessage.__init__(self, message)

class DaemonStartMessage(AuditMessage):
    """Audit message indicating that a daemon was started."""
    def __init__(self, message):
        AuditMessage.__init__(self, message)
        self.auditd = False

    def from_split_string(self, recs):
        AuditMessage.from_split_string(self, recs)
        if "auditd" in recs:
            self.auditd = True
        

class ComputeSidMessage(AuditMessage):
    """Audit message indicating that a sid was not valid.

    Compute sid messages are generated on attempting to create a security
    context that is not valid. Security contexts are invalid if the role is
    not authorized for the user or the type is not authorized for the role.

    This class does not store all of the fields from the compute sid message -
    just the type and role.
    """
    def __init__(self, message):
        AuditMessage.__init__(self, message)
        self.invalid_context = refpolicy.SecurityContext()
        self.scontext = refpolicy.SecurityContext()
        self.tcontext = refpolicy.SecurityContext()
        self.tclass = ""

    def from_split_string(self, recs):
        AuditMessage.from_split_string(self, recs)
        if len(recs) < 10:
            raise ValueError("Split string does not represent a valid compute sid message")

        try:
            self.invalid_context = refpolicy.SecurityContext(recs[5])
            self.scontext = refpolicy.SecurityContext(recs[7].split("=")[1])
            self.tcontext = refpolicy.SecurityContext(recs[8].split("=")[1])
            self.tclass = recs[9].split("=")[1]
        except:
            raise ValueError("Split string does not represent a valid compute sid message")

    def output(self):
        return "role %s types %s;\n" % (self.invalid_context.role, self.invalid_context.type)
        
# Parser for audit messages

class AuditParser:
    """Parser for audit messages.

    This class parses audit messages and stores them according to their message
    type. This is not a general purpose audit message parser - it only extracts
    selinux related messages.

    Each audit message is stored in one of four lists:
       avc_msgs - avc denial or granted messages. Messages are stored in
          AVCMessage objects.
       compute_sid_msgs - invalid sid messages. Messages are stored in
          ComputeSidMessage objects.
       invalid_msgs - selinux related messages that are not valid. Messages
          are stored in InvalidMessage objects.
       policy_load_msgs - policy load messages. Messages are stored in
          PolicyLoadMessage objects.

    These lists will be reset when a policy load message is seen if
    AuditParser.last_load_only is set to true. It is assumed that messages
    are fed to the parser in chronological order - time stamps are not
    parsed.
    """
    def __init__(self, last_load_only=False):
        self.__initialize()
        self.last_load_only = last_load_only

    def __initialize(self):
        self.avc_msgs = []
        self.compute_sid_msgs = []
        self.invalid_msgs = []
        self.policy_load_msgs = []
        self.path_msgs = []
        self.by_header = { }
        self.check_input_file = False
                
    # Low-level parsing function - tries to determine if this audit
    # message is an SELinux related message and then parses it into
    # the appropriate AuditMessage subclass. This function deliberately
    # does not impose policy (e.g., on policy load message) or store
    # messages to make as simple and reusable as possible.
    #
    # Return values:
    #   None - no recognized audit message found in this line
    #
    #   InvalidMessage - a recognized but invalid message was found.
    #
    #   AuditMessage (or subclass) - object representing a parsed
    #      and valid audit message.
    def __parse_line(self, line):
        # strip("\x1c\x1d\x1e\x85") is only needed for python2
        # since str.split() in python3 already does this
        rec = [x.strip("\x1c\x1d\x1e\x85") for x in line.split()]
        for i in rec:
            found = False
            if i == "avc:" or i == "message=avc:" or i == "msg='avc:":
                msg = AVCMessage(line)
                found = True
            elif i == "security_compute_sid:":
                msg = ComputeSidMessage(line)
                found = True
            elif i == "type=MAC_POLICY_LOAD" or i == "type=1403":
                msg = PolicyLoadMessage(line)
                found = True
            elif i == "type=AVC_PATH":
                msg = PathMessage(line)
                found = True
            elif i == "type=DAEMON_START":
                msg = DaemonStartMessage(line)
                found = True
                
            if found:
                self.check_input_file = True
                try:
                    msg.from_split_string(rec)
                except ValueError:
                    msg = InvalidMessage(line)
                return msg
        return None

    # Higher-level parse function - take a line, parse it into an
    # AuditMessage object, and store it in the appropriate list.
    # This function will optionally reset all of the lists when
    # it sees a load policy message depending on the value of
    # self.last_load_only.
    def __parse(self, line):
        msg = self.__parse_line(line)
        if msg is None:
            return

        # Append to the correct list
        if isinstance(msg, PolicyLoadMessage):
            if self.last_load_only:
                self.__initialize()
        elif isinstance(msg, DaemonStartMessage):
            # We initialize every time the auditd is started. This
            # is less than ideal, but unfortunately it is the only
            # way to catch reboots since the initial policy load
            # by init is not stored in the audit log.
            if msg.auditd and self.last_load_only:
                self.__initialize()
            self.policy_load_msgs.append(msg)
        elif isinstance(msg, AVCMessage):
            self.avc_msgs.append(msg)
        elif isinstance(msg, ComputeSidMessage):
            self.compute_sid_msgs.append(msg)
        elif isinstance(msg, InvalidMessage):
            self.invalid_msgs.append(msg)
        elif isinstance(msg, PathMessage):
            self.path_msgs.append(msg)

        # Group by audit header
        if msg.header != "":
            if msg.header in self.by_header:
                self.by_header[msg.header].append(msg)
            else:
                self.by_header[msg.header] = [msg]
            

    # Post processing adds additional information to AVC messages from
    # related messages sharing the same audit header (e.g., the path from
    # an AVC_PATH record) - this only works on messages generated by
    # the audit system.
    def __post_process(self):
        for value in self.by_header.values():
            avc = []
            path = None
            for msg in value:
                if isinstance(msg, PathMessage):
                    path = msg
                elif isinstance(msg, AVCMessage):
                    avc.append(msg)
            if len(avc) > 0 and path:
                for a in avc:
                    a.path = path.path

    def parse_file(self, input):
        """Parse the contents of a file object. This method can be called
        multiple times (along with parse_string)."""
        line = input.readline()
        while line:
            self.__parse(line)
            line = input.readline()
        if not self.check_input_file:
            sys.stderr.write("Nothing to do\n")
            sys.exit(0)
        self.__post_process()
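
    # Hedged usage sketch (not part of the original module): driving
    # parse_file() with a saved audit log. The path below is only an
    # example; note that parse_file() exits the process when no SELinux
    # records are found.
    #
    #   parser = AuditParser(last_load_only=True)
    #   with open("/var/log/audit/audit.log") as fd:
    #       parser.parse_file(fd)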

    def parse_string(self, input):
        """Parse a string containing audit messages - messages should
        be separated by new lines. This method can be called multiple
        times (along with parse_file)."""
        lines = input.split('\n')
        for l in lines:
            self.__parse(l)
        self.__post_process()
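
    # Hedged usage sketch: parse_string() pairs naturally with the
    # get_audit_msgs() helper defined earlier in this module (which shells
    # out to ausearch and therefore needs sufficient privileges).
    #
    #   parser = AuditParser()
    #   parser.parse_string(get_audit_msgs())
    #   print(len(parser.avc_msgs), "AVC messages parsed")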

    def to_role(self, role_filter=None):
        """Return RoleAllowSet statements matching the specified filter

        Filter out types that match the filer, or all roles

        Params:
           role_filter - [optional] Filter object used to filter the
              output.
        Returns:
           Access vector set representing the denied access in the
           audit logs parsed by this object.
        """
        role_types = access.RoleTypeSet()
        for cs in self.compute_sid_msgs:
            if not role_filter or role_filter.filter(cs):
                role_types.add(cs.invalid_context.role, cs.invalid_context.type)
        
        return role_types
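
    # Hedged usage sketch: collecting the role/type pairs behind invalid
    # context messages, optionally narrowed by the ComputeSidTypeFilter
    # class defined at the bottom of this file (the regex is arbitrary).
    #
    #   role_types = parser.to_role(ComputeSidTypeFilter("^staff_t$"))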

    def to_access(self, avc_filter=None, only_denials=True):
        """Convert the audit logs access into a an access vector set.

        Convert the audit logs into an access vector set, optionally
        filtering the restults with the passed in filter object.

        Filter objects are object instances with a .filter method
        that takes and access vector and returns True if the message
        should be included in the final output and False otherwise.

        Params:
           avc_filter - [optional] Filter object used to filter the
              output.
        Returns:
           Access vector set representing the denied access in the
           audit logs parsed by this object.
        """
        av_set = access.AccessVectorSet()
        for avc in self.avc_msgs:
            if only_denials and not avc.denial:
                continue

            if not avc_filter or avc_filter.filter(avc):
                av = access.AccessVector([avc.scontext.type, avc.tcontext.type,
                                         avc.tclass] + avc.accesses)
                av.data = avc.data
                av.type = avc.type

                if avc.ioctlcmd:
                    xperm_set = refpolicy.XpermSet()
                    xperm_set.add(avc.ioctlcmd)
                    av.xperms["ioctl"] = xperm_set

                av_set.add_av(av, audit_msg=avc)

        return av_set
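
    # Hedged usage sketch: building an access vector set from the denials
    # recorded for a particular domain (the pattern is an arbitrary
    # example).
    #
    #   av_set = parser.to_access(avc_filter=AVCTypeFilter("^httpd"),
    #                             only_denials=True)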

class AVCTypeFilter:
    def __init__(self, regex):
        self.regex = re.compile(regex)

    def filter(self, avc):
        if self.regex.match(avc.scontext.type):
            return True
        if self.regex.match(avc.tcontext.type):
            return True
        return False

class ComputeSidTypeFilter:
    def __init__(self, regex):
        self.regex = re.compile(regex)

    def filter(self, avc):
        if self.regex.match(avc.invalid_context.type):
            return True
        if self.regex.match(avc.scontext.type):
            return True
        if self.regex.match(avc.tcontext.type):
            return True
        return False
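
# Hedged note (illustrative): both filter classes apply the regular
# expression with re.match(), i.e. at the start of the type name only;
# anchor the pattern explicitly when an exact match is wanted.
#
#   AVCTypeFilter("httpd")      # matches httpd_t, httpd_sys_content_t, ...
#   AVCTypeFilter("^httpd_t$")  # matches only httpd_t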


site-packages/sepolgen/__pycache__/module.cpython-36.pyc000064400000015711147511334560017241 0ustar003

��fw�@s�dZddlZddlZyddlmZWn ek
rDddlmZYnXddlZddlZddl	Z	ddl
Z
ddlmZdd�Z
Gdd	�d	�Zd
d�ZGdd
�d
�ZdS)zU
Utilities for dealing with the compilation of modules and creation
of module tress.
�N)�getstatusoutput�)�defaultscCs0tjd|�}t|�dkr(|dj�r(dSdSdS)z'Check that a module name is valid.
    z[^a-zA-Z0-9_\-\.]rTFN)�re�findall�len�isalpha)�modname�m�r�/usr/lib/python3.6/module.py�
is_valid_name(sr
c@sNeZdZdd�Zdd�Zdd�Zdd�Zd	d
�Zdd�Zd
d�Z	ddd�Z
dS)�
ModuleTreecCs||_d|_dS)N)r	�dirname)�selfr	rrr�__init__2szModuleTree.__init__cCs|jS)N)r)rrrr�dir_name6szModuleTree.dir_namecCs|jd|jdS)N�/z.te)rr	)rrrr�te_name9szModuleTree.te_namecCs|jd|jdS)Nrz.fc)rr	)rrrr�fc_name<szModuleTree.fc_namecCs|jd|jdS)Nrz.if)rr	)rrrr�if_name?szModuleTree.if_namecCs|jd|jdS)Nrz.pp)rr	)rrrr�package_nameBszModuleTree.package_namecCs
|jdS)Nz	/Makefile)r)rrrr�
makefile_nameEszModuleTree.makefile_nameNcCs�|d|j|_tj|j�t|j�d�}|r>|jd|�n|jdtj��|j	�t|j
�d�j	�t|j�d�j	�t|j�d�j	�dS)Nr�wzinclude )
r	r�os�mkdir�openr�writer�refpolicy_makefile�closerrr)rZparent_dirnameZmakefile_include�fdrrr�createHszModuleTree.create)N)�__name__�
__module__�__qualname__rrrrrrrr!rrrrr1srcCstjjtjj|�d�dS)Nrr)r�path�splitext�split)�
sourcenamerrr�modname_from_sourcenameXsr)c@sTeZdZdZddd�Zdd�Zdd�Zd	d
�Zddd
�Zdd�Z	dd�Z
dd�ZdS)�ModuleCompileratModuleCompiler eases running of the module compiler.

    The ModuleCompiler class encapsulates running the commandline
    module compiler (checkmodule) and module packager (semodule_package).
    You are likely interested in the create_module_package method.
    
    Several options are controlled via paramaters (only effects the 
    non-refpol builds):
    
     .mls          [boolean] Generate an MLS module (by passed -M to
                   checkmodule). True to generate an MLS module, false
                   otherwise.
                   
     .module       [boolean] Generate a module instead of a base module.
                   True to generate a module, false to generate a base.
                   
     .checkmodule  [string] Fully qualified path to the module compiler.
                   Default is /usr/bin/checkmodule.
                   
     .semodule_package [string] Fully qualified path to the module
                   packager. Defaults to /usr/bin/semodule_package.
     .output       [file object] File object used to write verbose
                   output of the compililation and packaging process.
    NcCs<tj�|_d|_d|_d|_||_d|_tj	�|_
d|_dS)z�Create a ModuleCompiler instance, optionally with an
        output file object for verbose output of the compilation process.
        Tz/usr/bin/checkmodulez/usr/bin/semodule_package�z
/usr/bin/makeN)�selinuxZis_selinux_mls_enabled�mls�module�checkmodule�semodule_package�output�last_outputrr�refpol_makefile�make)rr1rrrrts

zModuleCompiler.__init__cCs |jr|jj|d�||_dS)N�
)r1rr2)r�strrrr�o�szModuleCompiler.ocCs$|j|�t|�\}}|j|�|S)N)r7r)r�command�rcr1rrr�run�s

zModuleCompiler.runcCsJ|jd�}t|�dkr td|��dj|dd��}|d}|d}||fS)	z�Generate the module and policy package filenames from
        a source file name. The source file must be in the form
        of "foo.te". This will generate "foo.mod" and "foo.pp".
        
        Returns a tuple with (modname, policypackage).
        �.�z,invalid sourcefile name %s (must end in .te)rrz.modz.pp���)r'r�RuntimeError�join)rr(Z	splitname�basenamer	�packagenamerrr�
gen_filenames�s

zModuleCompiler.gen_filenamesTcCsD|r|j|�n0|j|�\}}|j||�|j||�tj|�dS)a�Create a module package saved in a packagename from a
        sourcename.

        The create_module_package creates a module package saved in a
        file named sourcename (.pp is the standard extension) from a
        source file (.te is the standard extension). The source file
        should contain SELinux policy statements appropriate for a
        base or non-base module (depending on the setting of .module).

        Only file names are accepted, not open file objects or
        descriptors because the command line SELinux tools are used.

        On error a RuntimeError will be raised with a descriptive
        error message.
        N)�refpol_buildrB�compile�packager�unlink)rr(Z	refpolicyr	rArrr�create_module_package�sz$ModuleCompiler.create_module_packagecCs4|jd|j}|j|�}|dkr0td|j��dS)Nz -f rzcompilation failed:
%s)r4r3r:r>r2)rr(r8r9rrrrC�s
zModuleCompiler.refpol_buildcCsp|jg}|jr|jd�|jr(|jd�|jd�|j|�|j|�|jdj|��}|dkrltd|j��dS)Nz-Mz-mz-o� rzcompilation failed:
%s)r/r-�appendr.r:r?r>r2)rr(r	�sr9rrrrD�s




zModuleCompiler.compilecCsZ|jg}|jd�|j|�|jd�|j|�|jdj|��}|dkrVtd|j��dS)Nz-oz-mrHrzpackaging failed [%s])r0rIr:r?r>r2)rr	rArJr9rrrrE�s



zModuleCompiler.package)N)T)r"r#r$�__doc__rr7r:rBrGrCrDrErrrrr*[s


	r*)rKrZtempfile�
subprocessr�ImportErrorZcommandsrZos.pathZshutilr,r+rr
rr)r*rrrr�<module>s	'site-packages/sepolgen/__pycache__/defaults.cpython-36.pyc000064400000004306147511334560017561 0ustar003

��fU�@sTddlZddlZGdd�de�Zdd�Zdd�Zdd	�Zd
d�Zdd
�Zdd�Z	dS)�Nc@seZdZdd�Zddd�ZdS)�PathChooserc
Cs�t�|_tjj|�s(d|_d|jd<dS||_tjd�}tjd�}t|d��`}xXt	|�D]L\}}|j
|�rlqX|j
|�}|s�td||df��|jd	�|j|jd�<qXWWdQRXdS)
Nz
(defaults)zJ/usr/share/selinux/default:/usr/share/selinux/mls:/usr/share/selinux/devel�SELINUX_DEVEL_PATHz
^\s*(?:#.+)?$z^\s*(\w+)\s*=\s*(.+?)\s*$�rz(%s:%d: line is not in key = value format��)
�dict�config�os�path�exists�config_pathname�re�compile�open�	enumerate�match�
ValueError�group)�self�pathname�ignoreZconsider�fd�lineno�line�mo�r�/usr/lib/python3.6/defaults.py�__init__s 




zPathChooser.__init__rcCsp|jj|d�}|dkr(td||jf��|jd�}x*|D]"}tjj||�}tjj|�r8|Sq8Wtjj|d|�S)Nz%s was not in %s�:r)	r�getrr�splitr	r
�joinr)rZtestfilenameZpathset�paths�p�targetrrr�__call__,s

zPathChooser.__call__N)r)�__name__�
__module__�__qualname__rr%rrrrrsrcCsdS)Nz/var/lib/sepolgenrrrrr�data_dir;sr)cCs
t�dS)Nz	/perm_map)r)rrrr�perm_map>sr*cCs
t�dS)Nz/interface_info)r)rrrr�interface_infoAsr+cCs
t�dS)Nz/attribute_info)r)rrrr�attribute_infoDsr,cCs(td�}|d�}tjj|�s$|d�}|S)Nz/etc/selinux/sepolgen.conf�Makefilezinclude/Makefile)rr	r
r)�chooser�resultrrr�refpolicy_makefileGs
r0cCstd�}|d�S)Nz/etc/selinux/sepolgen.conf�include)r)r.rrr�headersNsr2)
r	r
�objectrr)r*r+r,r0r2rrrr�<module>s"site-packages/sepolgen/__pycache__/matching.cpython-36.opt-1.pyc000064400000013715147511334560020507 0ustar003

��f�!�@sbdZddlZddlmZddlmZddlmZGdd�dej�ZGd	d
�d
�ZGdd�d�Z	dS)
zI
Classes and algorithms for matching requested access to access vectors.
�N�)�access)�objectmodel)�utilc@seZdZddd�Zdd�ZdS)�MatchNrcCs||_||_d|_d|_dS)NF)�	interface�dist�info_dir_change�__hash__)�selfrr�r�/usr/lib/python3.6/matching.py�__init__ szMatch.__init__cCs@y"|j|jf}|j|jf}|||�Sttfk
r:tSXdS)N)rr	�AttributeError�	TypeError�NotImplemented)r�other�method�a�brrr
�_compare(s
zMatch._compare)Nr)�__name__�
__module__�__qualname__rrrrrr
rs
rc@sHeZdZdZdd�Zdd�Zdd�Zdd	�Zd
d�Zdd
�Z	dd�Z
dS)�	MatchList�cCs$g|_g|_|j|_d|_d|_dS)NF)�children�bastards�DEFAULT_THRESHOLD�	threshold�allow_info_dir_change�av)rrrr
r3s
zMatchList.__init__cCs,t|j�r|jdSt|j�r(|jdSdS)Nr)�lenrr)rrrr
�best<s




zMatchList.bestcCst|j�t|j�S)N)r"rr)rrrr
�__len__CszMatchList.__len__cCs
t|j�S)N)�iterr)rrrr
�__iter__IszMatchList.__iter__cCstj|j|j�S)N)�	itertools�chainrr)rrrr
�allLsz
MatchList.allcCsF|j|jkr6|js|jr(|jj|�qB|jj|�n|jj|�dS)N)rrr	r r�appendr)r�matchrrr
r*Os
zMatchList.appendcCs|jj�|jj�dS)N)r�sortr)rrrr
r,Xs
zMatchList.sortN)rrrrrr#r$r&r)r*r,rrrr
r1s		rc@s>eZdZddd�Zdd�Zdd�Zdd	�Zd
d�Zdd
�ZdS)�
AccessMatcherNcCs,d|_d|_|r||_n
tj�|_d|_dS)N�
�d)�type_penalty�obj_penalty�	perm_mapsrZPermMappings�info_dir_penalty)rr2rrr
r^s
zAccessMatcher.__init__cCs"||kstj|�rdS|jSdS)Nr)r�
is_idparamr0)rrrrrr
�
type_distancejszAccessMatcher.type_distancecCsR|jj|j�}t|�dkr0|jj|j|�}|S|jj|j�}|jj|j|�SdS)Nr)�perms�
differencer"r2�getdefault_distance�	obj_class)rZav_reqZav_provZdiffZtotalrrr
�
perm_distanceqszAccessMatcher.perm_distancecCs�d}||j|j|j�7}||j|j|j�7}|j|jkrPtj|j�rP||j8}|dkrl|jj|j|j	�}n|j
||�}|dkr�|dkr�||S||Sn |dkr�|dkr�||S||SdS)a+Determine the 'distance' between 2 access vectors.

        This function is used to find an access vector that matches
        a 'required' access. To do this we comput a signed numeric
        value that indicates how close the req access is to the
        'provided' access vector. The closer the value is to 0
        the closer the match, with 0 being an exact match.

        A value over 0 indicates that the prov access vector provides more
        access than the req (in practice, this means that the source type,
        target type, and object class is the same and the perms in prov is
        a superset of those in req.

        A value under 0 indicates that the prov access less - or unrelated
        - access to the req access. A different type or object class will
        result in a very low value.

        The values other than 0 should only be interpreted relative to
        one another - they have no exact meaning and are likely to
        change.

        Params:
          req - [AccessVector] The access that is required. This is the
                access being matched.
          prov - [AccessVector] The access provided. This is the potential
                 match that is being evaluated for req.
        Returns:
          0   : Exact match between the acess vectors.

          < 0 : The prov av does not provide all of the access in req.
                A smaller value indicates that the access is further.

          > 0 : The prov av provides more access than req. The larger
                the value the more access over req.
        rN)r5Zsrc_type�tgt_typer9rr4r1r2r8r6r:)rZreqZprovrZpdistrrr
�av_distance|s '

zAccessMatcher.av_distancecCs�d}xf|D]^}|j||�}|dkr(|}q
|dkrN|dkrB||7}qh||}q
|dkr`||7}q
||8}q
W|jj|j|j�}|jdkr�tj|_x&|D]}|j|jj|j|j�B|_q�W|tj@dkr�|jtj@r�|dkr�||j	8}n
||j	7}|S)z


        Nr)
r<r2Zgetdefault_directionr9r6Zinfo_dirrZ	FLOW_NONEZ
FLOW_WRITEr3)rZav_setr!r�xZtmpZav_dirrrr
�av_set_match�s.





zAccessMatcher.av_set_matchcCsh||_xTtj|j|jj|jg��D]6}|js.q"|j|j	|�}|dkr"t
||�}|j|�q"W|j�dS)Nr)
r!r'r(Ztgt_type_allZtgt_type_map�getr;Zenabledr>rrr*r,)rZifsetr!Z
match_listZivr�mrrr
�
search_ifs�s

zAccessMatcher.search_ifs)N)	rrrrr5r:r<r>rArrrr
r-]s
H(r-)
�__doc__r'�rrrZ
Comparisonrrr-rrrr
�<module>s,site-packages/sepolgen/__pycache__/classperms.cpython-36.pyc000064400000004665147511334560020136 0ustar003

��f�
@s�ddlZd*ZddiZd
ZdZdZdZdZdZdZ	dZ
dZdd�Zdd�Z
ddlmZej�dd�Zdd�Zd d!�Zd"d#�Zd$d%�Zdd&lmZej�ed'�Zej�Zej�d(Zd)Zeje�Zee�dS)+�N�DEFINE�NAME�TICK�SQUOTE�OBRACE�CBRACE�SEMI�OPAREN�CPAREN�COMMAZdefinez\`z\'z\{z\}z\;z\(z\)z\,z 	
cCstj|jd�|_|S)z[a-zA-Z_][a-zA-Z0-9_]*r)�reserved�get�value�type)�t�r� /usr/lib/python3.6/classperms.py�t_NAME.srcCs td|jd�|jd�dS)NzIllegal character '%s'r�)�printr�skip)rrrr�t_error3srr)�lexcCs8t|�dkr|dg|d<n|dg|dg|d<dS)zHstatements : define_stmt
                  | define_stmt statements
    �rrN)�len)�prrr�p_statements:srcCs|d|dg|d<dS)zOdefine_stmt : DEFINE OPAREN TICK NAME SQUOTE COMMA TICK list SQUOTE CPAREN
    ��rNr)rrrr�
p_define_stmtCsrcCs,|ddkr|d|d<n|dg|d<dS)z2list : NAME
            | OBRACE names CBRACE
    r�{rrNr)rrrr�p_listJsr!cCs6t|�dkr|dg|d<n|dg|d|d<dS)z+names : NAME
             | NAME names
    rrrN)r)rrrr�p_namesSsr"cCstd|j|j|jf�dS)Nz$Syntax error on line %d %s [type=%s])r�linenorr)rrrr�p_error\sr$)�yaccz
all_perms.sptz%define(`foo',`{ read write append }')a2define(`all_filesystem_perms',`{ mount remount unmount getattr relabelfrom relabelto transition associate quotamod quotaget }')
define(`all_security_perms',`{ compute_av compute_create compute_member check_context load_policy compute_relabel compute_user setenforce setbool setsecparam setcheckreqprot }')
)
rrrrrrrr	r
r) �sys�tokensrZt_TICKZt_SQUOTEZt_OBRACEZt_CBRACEZt_SEMIZt_OPARENZt_CPARENZt_COMMAZt_ignorerr�rrrr!r"r$r%�open�f�readZtxt�closeZtestZtest2�parse�resultrrrrr�<module>sL				
site-packages/sepolgen/__pycache__/matching.cpython-36.pyc000064400000013715147511334560017550 0ustar003

��f�!�@sbdZddlZddlmZddlmZddlmZGdd�dej�ZGd	d
�d
�ZGdd�d�Z	dS)
zI
Classes and algorithms for matching requested access to access vectors.
�N�)�access)�objectmodel)�utilc@seZdZddd�Zdd�ZdS)�MatchNrcCs||_||_d|_d|_dS)NF)�	interface�dist�info_dir_change�__hash__)�selfrr�r�/usr/lib/python3.6/matching.py�__init__ szMatch.__init__cCs@y"|j|jf}|j|jf}|||�Sttfk
r:tSXdS)N)rr	�AttributeError�	TypeError�NotImplemented)r�other�method�a�brrr
�_compare(s
zMatch._compare)Nr)�__name__�
__module__�__qualname__rrrrrr
rs
rc@sHeZdZdZdd�Zdd�Zdd�Zdd	�Zd
d�Zdd
�Z	dd�Z
dS)�	MatchList�cCs$g|_g|_|j|_d|_d|_dS)NF)�children�bastards�DEFAULT_THRESHOLD�	threshold�allow_info_dir_change�av)rrrr
r3s
zMatchList.__init__cCs,t|j�r|jdSt|j�r(|jdSdS)Nr)�lenrr)rrrr
�best<s




zMatchList.bestcCst|j�t|j�S)N)r"rr)rrrr
�__len__CszMatchList.__len__cCs
t|j�S)N)�iterr)rrrr
�__iter__IszMatchList.__iter__cCstj|j|j�S)N)�	itertools�chainrr)rrrr
�allLsz
MatchList.allcCsF|j|jkr6|js|jr(|jj|�qB|jj|�n|jj|�dS)N)rrr	r r�appendr)r�matchrrr
r*Os
zMatchList.appendcCs|jj�|jj�dS)N)r�sortr)rrrr
r,Xs
zMatchList.sortN)rrrrrr#r$r&r)r*r,rrrr
r1s		rc@s>eZdZddd�Zdd�Zdd�Zdd	�Zd
d�Zdd
�ZdS)�
AccessMatcherNcCs,d|_d|_|r||_n
tj�|_d|_dS)N�
�d)�type_penalty�obj_penalty�	perm_mapsrZPermMappings�info_dir_penalty)rr2rrr
r^s
zAccessMatcher.__init__cCs"||kstj|�rdS|jSdS)Nr)r�
is_idparamr0)rrrrrr
�
type_distancejszAccessMatcher.type_distancecCsR|jj|j�}t|�dkr0|jj|j|�}|S|jj|j�}|jj|j|�SdS)Nr)�perms�
differencer"r2�getdefault_distance�	obj_class)rZav_reqZav_provZdiffZtotalrrr
�
perm_distanceqszAccessMatcher.perm_distancecCs�d}||j|j|j�7}||j|j|j�7}|j|jkrPtj|j�rP||j8}|dkrl|jj|j|j	�}n|j
||�}|dkr�|dkr�||S||Sn |dkr�|dkr�||S||SdS)a+Determine the 'distance' between 2 access vectors.

        This function is used to find an access vector that matches
        a 'required' access. To do this we comput a signed numeric
        value that indicates how close the req access is to the
        'provided' access vector. The closer the value is to 0
        the closer the match, with 0 being an exact match.

        A value over 0 indicates that the prov access vector provides more
        access than the req (in practice, this means that the source type,
        target type, and object class is the same and the perms in prov is
        a superset of those in req.

        A value under 0 indicates that the prov access less - or unrelated
        - access to the req access. A different type or object class will
        result in a very low value.

        The values other than 0 should only be interpreted relative to
        one another - they have no exact meaning and are likely to
        change.

        Params:
          req - [AccessVector] The access that is required. This is the
                access being matched.
          prov - [AccessVector] The access provided. This is the potential
                 match that is being evaluated for req.
        Returns:
          0   : Exact match between the acess vectors.

          < 0 : The prov av does not provide all of the access in req.
                A smaller value indicates that the access is further.

          > 0 : The prov av provides more access than req. The larger
                the value the more access over req.
        rN)r5Zsrc_type�tgt_typer9rr4r1r2r8r6r:)rZreqZprovrZpdistrrr
�av_distance|s '

zAccessMatcher.av_distancecCs�d}xf|D]^}|j||�}|dkr(|}q
|dkrN|dkrB||7}qh||}q
|dkr`||7}q
||8}q
W|jj|j|j�}|jdkr�tj|_x&|D]}|j|jj|j|j�B|_q�W|tj@dkr�|jtj@r�|dkr�||j	8}n
||j	7}|S)z


        Nr)
r<r2Zgetdefault_directionr9r6Zinfo_dirrZ	FLOW_NONEZ
FLOW_WRITEr3)rZav_setr!r�xZtmpZav_dirrrr
�av_set_match�s.





zAccessMatcher.av_set_matchcCsh||_xTtj|j|jj|jg��D]6}|js.q"|j|j	|�}|dkr"t
||�}|j|�q"W|j�dS)Nr)
r!r'r(Ztgt_type_allZtgt_type_map�getr;Zenabledr>rrr*r,)rZifsetr!Z
match_listZivr�mrrr
�
search_ifs�s

zAccessMatcher.search_ifs)N)	rrrrr5r:r<r>rArrrr
r-]s
H(r-)
�__doc__r'�rrrZ
Comparisonrrr-rrrr
�<module>s,site-packages/sepolgen/__pycache__/util.cpython-36.pyc000064400000014346147511334560016734 0ustar003

��fp�@s�ddlZddlZejddkZer,eZeZneZeZGdd�d�Z	dd�Z
ddd	�Zd
d�Zdd
�Z
Gdd�d�Zejdkr�dd�ZnddlmZdd�Zedkr�ddlZe	ejdd�Zejd�x"ed�D]Zej�ejd�q�WdS)�N�c@s*eZdZddd�Zddd�Zd
d	d
�ZdS)�ConsoleProgressBar�d�#cCs(d|_d|_||_||_||_d|_dS)NrF)�blocks�current�steps�	indicator�out�done)�selfr
rr	�r
�/usr/lib/python3.6/util.py�__init__"szConsoleProgressBar.__init__NcCs*d|_|r|jjd|�|jjd�dS)NFz
%s:
z3%--10---20---30---40---50---60---70---80---90--100
)rr
�write)r�messager
r
r�start*szConsoleProgressBar.start�cCs�|j|7_|j}tt|jt|j�d�d�|_|jdkrFd|_|j|}|jj|j|�|jj	�|jdkr�|j
r�d|_
|jjd�dS)Nr��2T�
)rr�int�round�floatrr
rr	�flushr)r�n�old�newr
r
r�step0s"


zConsoleProgressBar.step)rr)N)r)�__name__�
__module__�__qualname__rrrr
r
r
rr!s

rcCsg}|j|�|S)N)�extend)�s�lr
r
r�set_to_listBs
r%FcCs@t|�std��|r,t|�}|j�|dSx|D]}|SWdS)a�
    Return the first element of a set.

    It sometimes useful to return the first element from a set but,
    because sets are not indexable, this is rather hard. This function
    will return the first element from a set. If sorted is True, then
    the set will first be sorted (making this an expensive operation).
    Otherwise a random element will be returned (as sets are not ordered).
    zempty containterrN)�len�
IndexErrorr%�sort)r#�sortedr$�xr
r
r�firstGs

r+cCs:tj�}y|j|�}Wntk
r4|jd�}YnX|S)z/Encode given text via preferred system encodingzutf-8)�locale�getpreferredencoding�encode�UnicodeError)�text�encodingZencoded_textr
r
r�encode_input\sr2cCs:tj�}y|j|�}Wntk
r4|jd�}YnX|S)z/Decode given text via preferred system encodingzutf-8)r,r-�decoder/)r0r1Zdecoded_textr
r
r�decode_inputisr4c@sHeZdZdZdd�Zdd�Zdd�Zdd	�Zd
d�Zdd
�Z	dd�Z
dS)�
Comparisonz�Class used when implementing rich comparison.

    Inherit from this class if you want to have a rich
    comparison withing the class, afterwards implement
    _compare function within your class.cCstS)N)�NotImplemented)r�other�methodr
r
r�_compare}szComparison._comparecCs|j|dd��S)NcSs||kS)Nr
)�a�br
r
r�<lambda>�sz#Comparison.__eq__.<locals>.<lambda>)r9)rr7r
r
r�__eq__�szComparison.__eq__cCs|j|dd��S)NcSs||kS)Nr
)r:r;r
r
rr<�sz#Comparison.__lt__.<locals>.<lambda>)r9)rr7r
r
r�__lt__�szComparison.__lt__cCs|j|dd��S)NcSs||kS)Nr
)r:r;r
r
rr<�sz#Comparison.__le__.<locals>.<lambda>)r9)rr7r
r
r�__le__�szComparison.__le__cCs|j|dd��S)NcSs||kS)Nr
)r:r;r
r
rr<�sz#Comparison.__ge__.<locals>.<lambda>)r9)rr7r
r
r�__ge__�szComparison.__ge__cCs|j|dd��S)NcSs||kS)Nr
)r:r;r
r
rr<�sz#Comparison.__gt__.<locals>.<lambda>)r9)rr7r
r
r�__gt__�szComparison.__gt__cCs|j|dd��S)NcSs||kS)Nr
)r:r;r
r
rr<�sz#Comparison.__ne__.<locals>.<lambda>)r9)rr7r
r
r�__ne__�szComparison.__ne__N)rr r!�__doc__r9r=r>r?r@rArBr
r
r
rr5vsr5r�csG�fdd�d�}|S)z,Convert a cmp= function into a key= functioncs\eZdZdd�Z�fdd�Z�fdd�Z�fdd�Z�fd	d
�Z�fdd�Z�fd
d�Z	dS)zcmp_to_key.<locals>.KcWs
||_dS)N)�obj)rrE�argsr
r
rr�szcmp_to_key.<locals>.K.__init__cs�|j|j�dkS)Nr)rE)rr7)�mycmpr
rr>�szcmp_to_key.<locals>.K.__lt__cs�|j|j�dkS)Nr)rE)rr7)rGr
rrA�szcmp_to_key.<locals>.K.__gt__cs�|j|j�dkS)Nr)rE)rr7)rGr
rr=�szcmp_to_key.<locals>.K.__eq__cs�|j|j�dkS)Nr)rE)rr7)rGr
rr?�szcmp_to_key.<locals>.K.__le__cs�|j|j�dkS)Nr)rE)rr7)rGr
rr@�szcmp_to_key.<locals>.K.__ge__cs�|j|j�dkS)Nr)rE)rr7)rGr
rrB�szcmp_to_key.<locals>.K.__ne__N)
rr r!rr>rAr=r?r@rBr
)rGr
r�K�srHr
)rGrHr
)rGr�
cmp_to_key�srI)rIcCs||k||kS)Nr
)r+�secondr
r
r�cmp�srK�__main__i�)rzcomputing pig����MbP?)F)rrD)r,�sys�version_infoZPY3�bytesZ
bytes_type�strZstring_typeZunicoderr%r+r2r4r5rI�	functoolsrKrZtime�stdout�pr�range�irZsleepr
r
r
r�<module>s0!





site-packages/sepolgen/__pycache__/sepolgeni18n.cpython-36.pyc000064400000000460147511334560020263 0ustar003

��f��	@s6yddlZejd�ZejZWndd�ZYnXdS)�Nzselinux-pythoncCs|S)N�)�strrr�"/usr/lib/python3.6/sepolgeni18n.py�_sr)�gettextZtranslation�trrrrr�<module>s


site-packages/sepolgen/__pycache__/lex.cpython-36.pyc000064400000051662147511334570016552 0ustar003

��f���@s<dZdZddlZddlZddlZddlZddlZddlZyejej	fZ
Wnek
rdee
fZ
YnXejd�ZGdd�de�ZGdd�de�ZGd	d
�d
e�ZGdd�de�ZGd
d�d�Zdd�Zdd�Zdd�Zdd�Zdd�Zdd�ZGdd�de�Zdddddeej�ddddf
dd �Z d%d!d"�Z!d#d$�Z"e"Z#dS)&z3.11z3.10�Nz^[a-zA-Z0-9_]+$c@seZdZdd�ZdS)�LexErrorcCs|f|_||_dS)N)�args�text)�self�message�s�r�/usr/lib/python3.6/lex.py�__init__:szLexError.__init__N)�__name__�
__module__�__qualname__r
rrrr	r9src@seZdZdd�Zdd�ZdS)�LexTokencCsd|j|j|j|jfS)NzLexToken(%s,%r,%d,%d))�type�value�lineno�lexpos)rrrr	�__str__AszLexToken.__str__cCst|�S)N)�str)rrrr	�__repr__DszLexToken.__repr__N)rrr
rrrrrr	r@src@s4eZdZdd�Zdd�Zdd�Zdd�ZeZeZd	S)
�	PlyLoggercCs
||_dS)N)�f)rrrrr	r
LszPlyLogger.__init__cOs|jj||d�dS)N�
)r�write)r�msgr�kwargsrrr	�criticalOszPlyLogger.criticalcOs|jjd||d�dS)Nz	WARNING: r)rr)rrrrrrr	�warningRszPlyLogger.warningcOs|jjd||d�dS)NzERROR: r)rr)rrrrrrr	�errorUszPlyLogger.errorN)	rrr
r
rrr�info�debugrrrr	rKsrc@seZdZdd�Zdd�ZdS)�
NullLoggercCs|S)Nr)r�namerrr	�__getattribute__^szNullLogger.__getattribute__cOs|S)Nr)rrrrrr	�__call__aszNullLogger.__call__N)rrr
r#r$rrrr	r!]sr!c@s|eZdZdd�Zddd�Zddd�Zd	d
�Zdd�Zd
d�Zdd�Z	dd�Z
dd�Zdd�Zdd�Z
dd�Zdd�ZeZdS)�LexercCs�d|_d|_i|_i|_i|_d|_g|_d|_i|_i|_	i|_
d|_d|_d|_
d|_d|_d|_d|_d|_d|_d|_d|_d|_dS)N�INITIALr��F)�lexre�	lexretext�
lexstatere�lexstateretext�lexstaterenames�lexstate�
lexstatestack�lexstateinfo�lexstateignore�lexstateerrorf�lexstateeoff�
lexreflags�lexdatar�lexlen�	lexerrorf�lexeoff�	lextokens�	lexignore�lexliterals�	lexmoduler�lexoptimize)rrrr	r
ts.zLexer.__init__NcCs�tj|�}|r�i}x�|jj�D]�\}}g}x\|D]T\}}g}	xF|D]>}
|
sV|
drb|	j|
�qB|	jt||
dj�|
df�qBWq0W|j||	f�|||<qW||_i|_x(|jj�D]\}}t||j�|j|<q�W||_|S)Nrr()�copyr+�items�append�getattrrr2r<)r�object�cZnewtab�keyZritemZnewreZcreZfindexZ	newfindexr�efrrr	�clone�s(


&zLexer.cloner'cCs�t|tj�rtd��|jd�d}tjj||�d}t|d����}|j	d|t
f�|j	dtt��|j	dtt
t|j����|j	d	tt|j���|j	d
t|j��|j	dt|j��i}xb|jj�D]T\}}g}	x>t||j||j|�D]"\\}
}}}
|	j|t||
�f�q�W|	||<q�W|j	dt|��|j	d
t|j��i}x,|jj�D]\}}|�rt|jnd||<�q`W|j	dt|��i}x,|jj�D]\}}|�r�|jnd||<�q�W|j	dt|��WdQRXdS)Nz&Won't overwrite existing lextab module�.r(z.py�wzJ# %s.py. This file automatically created by PLY (version %s). Don't edit!
z_tabversion   = %s
z_lextokens    = set(%s)
z_lexreflags   = %s
z_lexliterals  = %s
z_lexstateinfo = %s
z_lexstatere   = %s
z_lexstateignore = %s
z_lexstateerrorf = %s
z_lexstateeoff = %s
���)�
isinstance�types�
ModuleType�IOError�split�os�path�join�openr�__version__�repr�__tabversion__�tuple�sortedr9�intr4r;r0r+r?�zipr,r-r@�_funcs_to_namesr1r2rr3)r�lextab�	outputdirZ
basetabmodule�filenameZtfZtabre�	statename�lre�titem�pat�funcZretext�renamesZtaberrrEZtabeofrrr	�writetab�s6(zLexer.writetabcCsRt|tj�r|}ntd|�tj|}t|dd�tkr@td��|j	|_
|j|_|j
|_|j
t|j�B|_|j|_|j|_i|_i|_xb|jj�D]T\}}g}g}x.|D]&\}}	|jtj||j�t|	|�f�q�W||j|<||j|<q�Wi|_x$|jj�D]\}}
||
|j|<q�Wi|_x&|j j�D]\}}
||
|j|<�q(W|j!d�dS)Nz	import %sZ_tabversionz0.0zInconsistent PLY versionr&)"rJrKrL�exec�sys�modulesrArU�ImportErrorZ
_lextokensr9Z_lexreflagsr4Z_lexliteralsr;�set�
lextokens_allZ
_lexstateinfor0Z_lexstateignorer1r+r,Z_lexstaterer?r@�re�compile�_names_to_funcsr2Z_lexstateerrorfr3Z
_lexstateeoff�begin)rZtabfile�fdictr[r^r_r`ZtxtitemraZ	func_namerErrr	�readtab�s8
"
z
Lexer.readtabcCs8|dd�}t|t�std��||_d|_t|�|_dS)Nr(zExpected a stringr)rJ�StringTypes�
ValueErrorr5r�lenr6)rrrCrrr	�input�s
zLexer.inputcCsd||jkrtd��|j||_|j||_|jj|d�|_|jj|d�|_	|j
j|d�|_||_dS)NzUndefined stater')
r+rrr)r,r*r1�getr:r2r7r3r8r.)r�staterrr	rns
zLexer.begincCs|jj|j�|j|�dS)N)r/r@r.rn)rrvrrr	�
push_stateszLexer.push_statecCs|j|jj��dS)N)rnr/�pop)rrrr	�	pop_stateszLexer.pop_statecCs|jS)N)r.)rrrr	�
current_state!szLexer.current_statecCs|j|7_dS)N)r)r�nrrr	�skip'sz
Lexer.skipcCs~|j}|j}|j}|j}�x�||k�r|||kr<|d7}q�x�|jD]�\}}|j||�}|s`qFt�}|j�|_|j	|_	||_|j
}	||	\}
|_|
s�|jr�|j�|_|S|j�}P|j�}||_
||_||_|
|�}|s�|j}|j}P|j�s(|j|jk�r(td|
jj|
jj|
j|jf||d���|SW|||jk�rrt�}|||_|j	|_	|j|_||_|d|_|S|j�r�t�}|j|d�|_|j	|_	d|_||_
||_||_|j|�}||jk�r�td||||d���|j}|�s�q|S||_td|||f||d���qW|j�r\t�}d|_d|_|j	|_	||_||_
||_|j|�}|S|d|_|jdk�rztd��dS)	Nr(z4%s:%d: Rule '%s' returned an unknown token type '%s'rz&Scanning error. Illegal character '%s'z"Illegal character '%s' at index %d�eofr'z"No input string given with input())rr6r:r5r)�matchr�grouprr�	lastindexr�end�lexerZlexmatchr=rjr�__code__�co_filename�co_firstlinenorr;r7r8�RuntimeError)rrr6r:r5r)�lexindexfunc�m�tok�irbZnewtokrrr	�token1s�




"

zLexer.tokencCs|S)Nr)rrrr	�__iter__�szLexer.__iter__cCs|j�}|dkrt�|S)N)r��
StopIteration)r�trrr	�next�sz
Lexer.next)N)r')rrr
r
rFrdrprtrnrwryrzr|r�r�r��__next__rrrr	r%ss

%(

nr%cCst|d|j�S)N�regex)rA�__doc__)rbrrr	�
_get_regex�sr�cCs0tj|�}|jj�}|j|jkr,|j|j�|S)N)rf�	_getframe�	f_globalsr>�f_locals�update)Zlevelsr�ldictrrr	�get_caller_module_dict�s


r�cCsJg}x@t||�D]2\}}|r8|dr8|j||df�q|j|�qW|S)Nrr()rYr@)Zfunclist�namelist�resultrr"rrr	rZ�srZcCsHg}x>|D]6}|r6|dr6|j||d|df�q
|j|�q
W|S)Nrr()r@)r�ror�r{rrr	rm�s
rmcCsd|sgSdj|�}y�tj||�}dgt|jj��d}|dd�}x�|jj�D]z\}}	|j|d�}
t|
�t	j
t	jfkr�|
||f||	<|||	<qP|
dk	rP|||	<|jd�dkr�d||	<qPd||f||	<qPW||fg|g|gfSt
k
�r^tt|�d�}|dk�rd}t|d|�|||�\}}
}t||d�|||�\}}}|||
|||fSXdS)N�|r(�ignore_r�)NN)rQrkrl�max�
groupindex�valuesr?rurrK�FunctionType�
MethodType�find�	ExceptionrXrs�_form_master_re)Zrelist�reflagsr��toknamesr�r)r�Z
lexindexnamesrr�Zhandler�Zllistr_ZlnamesZrlistZrreZrnamesrrr	r��s2



r�cCs�|jd�}x0t|dd�d�D]\}}||kr|dkrPqW|dkrVt|d|��}nd}d|krjt|�}dj||d��}||fS)N�_r(�ANYr&)r&)rN�	enumeraterVrQ)r�names�partsr��part�statesZ	tokennamerrr	�_statetokens
r�c@sfeZdZddd�Zdd�Zdd�Zd	d
�Zdd�Zd
d�Zdd�Z	dd�Z
dd�Zdd�Zdd�Z
dS)�LexerReflectNrcCsL||_d|_g|_||_ddi|_t�|_d|_|dkrBtt	j
�n||_dS)Nr&�	inclusiveF)r�Z
error_func�tokensr��	stateinforirgrrrf�stderr�log)rr�r�r�rrr	r
.s
zLexerReflect.__init__cCs$|j�|j�|j�|j�dS)N)�
get_tokens�get_literals�
get_states�	get_rules)rrrr	�get_all9szLexerReflect.get_allcCs|j�|j�|j�|jS)N)�validate_tokens�validate_literals�validate_rulesr)rrrr	�validate_all@szLexerReflect.validate_allcCsp|jjdd�}|s(|jjd�d|_dSt|ttf�sL|jjd�d|_dS|sf|jjd�d|_dS||_dS)Nr�zNo token list is definedTztokens must be a list or tupleztokens is empty)r�rur�rrJ�listrVr�)rr�rrr	r�GszLexerReflect.get_tokenscCsTi}xJ|jD]@}tj|�s.|jjd|�d|_||krD|jjd|�d||<qWdS)NzBad token name '%s'TzToken '%s' multiply definedr()r��_is_identifierr~r�rr)rZ	terminalsr{rrr	r�[s
zLexerReflect.validate_tokenscCs |jjdd�|_|jsd|_dS)N�literalsr')r�rur�)rrrr	r�fszLexerReflect.get_literalscCspyDx>|jD]4}t|t�s&t|�dkr
|jjdt|��d|_q
WWn&tk
rj|jjd�d|_YnXdS)Nr(z.Invalid literal %s. Must be a single characterTzIInvalid literals specification. literals must be a sequence of characters)r�rJrqrsr�rrT�	TypeError)rrCrrr	r�lszLexerReflect.validate_literalscCs�|jjdd�|_|jr�t|jttf�s:|jjd�d|_n�x�|jD]�}t|t�s^t|�dkrx|jjdt	|��d|_qB|\}}t|t
�s�|jjdt	|��d|_qB|dkp�|dks�|jjd	|�d|_qB||jkr�|jjd
|�d|_qB||j|<qBWdS)Nr�z)states must be defined as a tuple or listTr�zMInvalid state specifier %s. Must be a tuple (statename,'exclusive|inclusive')zState name %s must be a stringr��	exclusivez:State type for state %s must be 'inclusive' or 'exclusive'zState '%s' already defined)r�rur�rJrVr�r�rrsrTrqr�)rrr"Z	statetyperrr	r�ws0

zLexerReflect.get_statesc	CsRdd�|jD�}i|_i|_i|_i|_i|_i|_x"|jD]}g|j|<g|j|<q<Wt|�dkrz|j	j
d�d|_
dS�x�|D�]x}|j|}t||j�\}}||j|<t|d��rX|dkr�x�|D]}||j|<q�Wn||dkr�xr|D]}||j|<q�WnZ|d	k�r2|j
j}|j
j}|j	j
d
|||j�d|_
n$x�|D]}|j|j||f��q8Wq�t|t��r�|d	k�r�x|D]}||j|<�qtWd|k�r�|j	jd|�nD|dk�r�|j	j
d
|�d|_
n$x8|D]}|j|j||f��q�Wq�|j	j
d|�d|_
q�Wx$|jj�D]}|jdd�d��qWx&|jj�D]}|jdd�dd��q2WdS)NcSs g|]}|dd�dkr|�qS)Nr�Zt_r)�.0rrrr	�
<listcomp>�sz*LexerReflect.get_rules.<locals>.<listcomp>rz+No rules of the form t_rulename are definedTr$rr}�ignorez,%s:%d: Rule '%s' must be defined as a string�\z#%s contains a literal backslash '\'z'Rule '%s' must be defined as a functionz&%s not defined as a function or stringcSs|djjS)Nr()r�r�)�xrrr	�<lambda>�sz(LexerReflect.get_rules.<locals>.<lambda>)rDcSst|d�S)Nr()rs)r�rrr	r��s)rD�reverse)r�r��funcsym�strsymr��errorf�eoffr�rsr�rr��hasattrr�r�r�rr@rJrqrr��sort)	rZtsymbolsrrr�r��tokname�line�filerrr	r��sb












zLexerReflect.get_rulescCs��xp|jD�]d}�x||j|D�]l\}}|jj}|jj}tj|�}|jj|�|j	|}t
|tj�rjd}nd}|jj
}	|	|kr�|jjd|||j�d|_q|	|kr�|jjd|||j�d|_qt|�s�|jjd|||j�d|_qyDtjd|t|�f|j�}
|
jd��r$|jjd	|||j�d|_Wqtjk
�r�}zD|jjd
|||j|�dt|�k�rt|jjd|||j�d|_WYdd}~XqXqW�x
|j|D]�\}}
|j	|}|d
k�r�|jjd|�d|_�q�||jk�r|jd�dk�r|jjd||�d|_�q�y:tjd||
f|j�}
|
jd��r@|jjd|�d|_WnTtjk
�r�}z4|jjd||�d|
k�r�|jjd|�d|_WYdd}~XnX�q�W|j|�r�|j|�r�|jjd|�d|_|jj|d�}|r
|}|jj}|jj}tj|�}|jj|�t
|tj��rd}nd}|jj
}	|	|k�rN|jjd|||j�d|_|	|kr
|jjd|||j�d|_q
Wx|jD]}|j|��q|WdS)Nr�r(z'%s:%d: Rule '%s' has too many argumentsTz%%s:%d: Rule '%s' requires an argumentz2%s:%d: No regular expression defined for rule '%s'z
(?P<%s>%s)r'z<%s:%d: Regular expression for rule '%s' matches empty stringz3%s:%d: Invalid regular expression for rule '%s'. %s�#z6%s:%d. Make sure '#' in rule '%s' is escaped with '\#'rz'Rule '%s' must be defined as a functionr�rz-Rule '%s' defined for an unspecified token %sz5Regular expression for rule '%s' matches empty stringz,Invalid regular expression for rule '%s'. %sz/Make sure '#' in rule '%s' is escaped with '\#'zNo rules defined for state '%s')r�r�r�r�r��inspectZ	getmodulerg�addr�rJrKr��co_argcountr�rrr�rkrlr�r~r�r�r�r�ru�validate_module)rrv�fnamerr�r��moduler�Zreqargs�nargsrC�er"�rZefuncrrr	r��s�









zLexerReflect.validate_rulescCs�ytj|�\}}Wntk
r&dSXtjd�}tjd�}i}|d7}xv|D]n}|j|�}|sj|j|�}|r�|jd�}	|j|	�}
|
s�|||	<n$tj|�}|j	j
d|||	|
�d|_
|d7}qNWdS)Nz\s*def\s+(t_[a-zA-Z_0-9]*)\(z\s*(t_[a-zA-Z_0-9]*)\s*=r(z7%s:%d: Rule %s redefined. Previously defined on line %dT)r�ZgetsourcelinesrMrkrlr~rruZ
getsourcefiler�r)rr��linesZlinenZfreZsreZ	counthashr�r�r"�prevr]rrr	r�?s*








zLexerReflect.validate_module)Nr)rrr
r
r�r�r�r�r�r�r�r�r�r�rrrr	r�-s
Bgr�Fr[c
#s�|dkrd}d}
ddi}t�}||_|	dkr6ttj�}	|rL|dkrLttj�}|rT|��r��fdd�t��D�}
t|
�}
d|
kr�tj|
dj|
d<nt	d�}
|
j
d	�}|r�t|t�r�d
|kr�|d
|}t
|
|	|d�}|j�|s�|j�r�td��|o�|�r4y |j||
�|ja|ja|a|Stk
�r2YnX|�rd|jd
|j�|jd|j�|jd|j�t�|_x|jD]}|jj|��qtWt|jttf��r�t|jd��j |j�|_!n|j|_!|jt|j!�B|_"|j}i}x�|D]�}g}xH|j#|D]:\}}|j$d|t%|�f�|�r�|jd|t%|�|��q�Wx@|j&|D]2\}}|j$d||f�|�r@|jd|||��q@W|||<�q�W|�r�|jd�xt|D]l}t'||||
|j(�\}}}||j)|<||j*|<||j+|<|�r�x&t,|�D]\}}|jd|||��q�W�q�Wxl|j-�D]`\}}|dk�r|dk�r|j)|j.|j)d�|j*|j.|j*d�|j+|j.|j+d��qW||_/|j)d|_0|j*d|_1||_2|j3|_4|j4j
dd�|_5|j6|_7|j6j
dd�|_8|j8�s�|	j9d�|j:|_;|j:j
dd�|_<x�|j-�D]�\}}|dk�rL||j6k�r*|	j9d|�||j3k�r�|j5�r�|	j9d|�nJ|dk�r||j6k�rv|j6j
dd�|j6|<||j3k�r|j3j
dd�|j3|<�qW|ja|ja|a|�r�|�r�|dk�r2t|t=j>��r�|j}nNd
|k�r�|
d}n:|j?d
�} d
j | dd��}!t@d|!�tAtj|!dd�}tBjCjD|�}y$|jE||�|tjk�rTtj|=Wn6tFk
�r�}"z|	j9d||"f�WYdd}"~"XnX|S)Nr[r&r�csg|]}|t�|�f�qSr)rA)r��k)r�rr	r�yszlex.<locals>.<listcomp>�__file__rr��__package__rG)r�r�zCan't build lexerzlex: tokens   = %rzlex: literals = %rzlex: states   = %rrz
(?P<%s>%s)z(lex: Adding rule %s -> '%s' (state '%s')z#lex: ==== MASTER REGEXS FOLLOW ====z"lex: state '%s' : regex[%d] = '%s'r'zNo t_error rule is definedr�z1No error rule is defined for exclusive state '%s'z2No ignore rule is defined for exclusive state '%s'r(z	import %sz#Couldn't write lextab module %r. %srI)Gr%r=rrfr��dir�dictrgr�r�rurJrr�r�r��SyntaxErrorrpr�rtr�rhrr�r�r�rir9r�r�rVrrQr;rjr�r@r�r�r�r�r+r,r-r�r?�extendr0r)r*r4r�r1r:r�r2r7rr�r3r8rKrLrNrerArOrP�dirnamerdrM)#r�rBr �optimizer[r�Znowarnr\ZdebuglogZerrorlogr�r�ZlexobjZ_itemsZpkgZlinfor{ZregexsrvZ
regex_listr�rr"r�r)Zre_textZre_namesr�r�styperZsrcfiler�Zpkgnamer�r)r�r	�lex^s�
















$r�cCs�|sVy&tjd}t|�}|j�}|j�Wn*tk
rTtjjd�tjj�}YnX|rb|j	}nt	}||�|rz|j
}nt
}x0|�}|s�Ptjjd|j|j|j
|jf�q�WdS)Nr(z/Reading from standard input (type EOF to end):
z(%s,%r,%d,%d)
)rf�argvrR�read�close�
IndexError�stdoutr�stdinrtr�rrrr)r��datar]rZ_inputZ_tokenr�rrr	�runmains*
r�cs�fdd�}|S)Ncs t�d�rt��|_n�|_|S)Nr$)r�r�r�)r)r�rr	�	set_regexAs
zTOKEN.<locals>.set_regexr)r�r�r)r�r	�TOKEN@sr�)NN)$rSrUrkrfrKr>rOr�Z
StringTypeZUnicodeTyperq�AttributeErrorr�bytesrlr�r�rrBrrr!r%r�r�rZrmr�r�r�rX�VERBOSEr�r�r��Tokenrrrr	�<module>"sD
F

(3
@
"
site-packages/sepolgen/__pycache__/__init__.cpython-36.opt-1.pyc000064400000000161147511334570020444 0ustar003

��f�@sdS)N�rrr�/usr/lib/python3.6/__init__.py�<module>ssite-packages/sepolgen/__pycache__/module.cpython-36.opt-1.pyc000064400000015711147511334570020201 0ustar003

��fw�@s�dZddlZddlZyddlmZWn ek
rDddlmZYnXddlZddlZddl	Z	ddl
Z
ddlmZdd�Z
Gdd	�d	�Zd
d�ZGdd
�d
�ZdS)zU
Utilities for dealing with the compilation of modules and creation
of module tress.
�N)�getstatusoutput�)�defaultscCs0tjd|�}t|�dkr(|dj�r(dSdSdS)z'Check that a module name is valid.
    z[^a-zA-Z0-9_\-\.]rTFN)�re�findall�len�isalpha)�modname�m�r�/usr/lib/python3.6/module.py�
is_valid_name(sr
c@sNeZdZdd�Zdd�Zdd�Zdd�Zd	d
�Zdd�Zd
d�Z	ddd�Z
dS)�
ModuleTreecCs||_d|_dS)N)r	�dirname)�selfr	rrr�__init__2szModuleTree.__init__cCs|jS)N)r)rrrr�dir_name6szModuleTree.dir_namecCs|jd|jdS)N�/z.te)rr	)rrrr�te_name9szModuleTree.te_namecCs|jd|jdS)Nrz.fc)rr	)rrrr�fc_name<szModuleTree.fc_namecCs|jd|jdS)Nrz.if)rr	)rrrr�if_name?szModuleTree.if_namecCs|jd|jdS)Nrz.pp)rr	)rrrr�package_nameBszModuleTree.package_namecCs
|jdS)Nz	/Makefile)r)rrrr�
makefile_nameEszModuleTree.makefile_nameNcCs�|d|j|_tj|j�t|j�d�}|r>|jd|�n|jdtj��|j	�t|j
�d�j	�t|j�d�j	�t|j�d�j	�dS)Nr�wzinclude )
r	r�os�mkdir�openr�writer�refpolicy_makefile�closerrr)rZparent_dirnameZmakefile_include�fdrrr�createHszModuleTree.create)N)�__name__�
__module__�__qualname__rrrrrrrr!rrrrr1srcCstjjtjj|�d�dS)Nrr)r�path�splitext�split)�
sourcenamerrr�modname_from_sourcenameXsr)c@sTeZdZdZddd�Zdd�Zdd�Zd	d
�Zddd
�Zdd�Z	dd�Z
dd�ZdS)�ModuleCompileratModuleCompiler eases running of the module compiler.

    The ModuleCompiler class encapsulates running the commandline
    module compiler (checkmodule) and module packager (semodule_package).
    You are likely interested in the create_module_package method.
    
    Several options are controlled via paramaters (only effects the 
    non-refpol builds):
    
     .mls          [boolean] Generate an MLS module (by passed -M to
                   checkmodule). True to generate an MLS module, false
                   otherwise.
                   
     .module       [boolean] Generate a module instead of a base module.
                   True to generate a module, false to generate a base.
                   
     .checkmodule  [string] Fully qualified path to the module compiler.
                   Default is /usr/bin/checkmodule.
                   
     .semodule_package [string] Fully qualified path to the module
                   packager. Defaults to /usr/bin/semodule_package.
     .output       [file object] File object used to write verbose
                   output of the compililation and packaging process.
    NcCs<tj�|_d|_d|_d|_||_d|_tj	�|_
d|_dS)z�Create a ModuleCompiler instance, optionally with an
        output file object for verbose output of the compilation process.
        Tz/usr/bin/checkmodulez/usr/bin/semodule_package�z
/usr/bin/makeN)�selinuxZis_selinux_mls_enabled�mls�module�checkmodule�semodule_package�output�last_outputrr�refpol_makefile�make)rr1rrrrts

zModuleCompiler.__init__cCs |jr|jj|d�||_dS)N�
)r1rr2)r�strrrr�o�szModuleCompiler.ocCs$|j|�t|�\}}|j|�|S)N)r7r)r�command�rcr1rrr�run�s

zModuleCompiler.runcCsJ|jd�}t|�dkr td|��dj|dd��}|d}|d}||fS)	z�Generate the module and policy package filenames from
        a source file name. The source file must be in the form
        of "foo.te". This will generate "foo.mod" and "foo.pp".
        
        Returns a tuple with (modname, policypackage).
        �.�z,invalid sourcefile name %s (must end in .te)rrz.modz.pp���)r'r�RuntimeError�join)rr(Z	splitname�basenamer	�packagenamerrr�
gen_filenames�s

zModuleCompiler.gen_filenamesTcCsD|r|j|�n0|j|�\}}|j||�|j||�tj|�dS)a�Create a module package saved in a packagename from a
        sourcename.

        The create_module_package creates a module package saved in a
        file named sourcename (.pp is the standard extension) from a
        source file (.te is the standard extension). The source file
        should contain SELinux policy statements appropriate for a
        base or non-base module (depending on the setting of .module).

        Only file names are accepted, not open file objects or
        descriptors because the command line SELinux tools are used.

        On error a RuntimeError will be raised with a descriptive
        error message.
        N)�refpol_buildrB�compile�packager�unlink)rr(Z	refpolicyr	rArrr�create_module_package�sz$ModuleCompiler.create_module_packagecCs4|jd|j}|j|�}|dkr0td|j��dS)Nz -f rzcompilation failed:
%s)r4r3r:r>r2)rr(r8r9rrrrC�s
zModuleCompiler.refpol_buildcCsp|jg}|jr|jd�|jr(|jd�|jd�|j|�|j|�|jdj|��}|dkrltd|j��dS)Nz-Mz-mz-o� rzcompilation failed:
%s)r/r-�appendr.r:r?r>r2)rr(r	�sr9rrrrD�s




zModuleCompiler.compilecCsZ|jg}|jd�|j|�|jd�|j|�|jdj|��}|dkrVtd|j��dS)Nz-oz-mrHrzpackaging failed [%s])r0rIr:r?r>r2)rr	rArJr9rrrrE�s



zModuleCompiler.package)N)T)r"r#r$�__doc__rr7r:rBrGrCrDrErrrrr*[s


	r*)rKrZtempfile�
subprocessr�ImportErrorZcommandsrZos.pathZshutilr,r+rr
rr)r*rrrr�<module>s	'site-packages/sepolgen/__pycache__/util.cpython-36.opt-1.pyc000064400000014346147511334570017674 0ustar003

��fp�@s�ddlZddlZejddkZer,eZeZneZeZGdd�d�Z	dd�Z
ddd	�Zd
d�Zdd
�Z
Gdd�d�Zejdkr�dd�ZnddlmZdd�Zedkr�ddlZe	ejdd�Zejd�x"ed�D]Zej�ejd�q�WdS)�N�c@s*eZdZddd�Zddd�Zd
d	d
�ZdS)�ConsoleProgressBar�d�#cCs(d|_d|_||_||_||_d|_dS)NrF)�blocks�current�steps�	indicator�out�done)�selfr
rr	�r
�/usr/lib/python3.6/util.py�__init__"szConsoleProgressBar.__init__NcCs*d|_|r|jjd|�|jjd�dS)NFz
%s:
z3%--10---20---30---40---50---60---70---80---90--100
)rr
�write)r�messager
r
r�start*szConsoleProgressBar.start�cCs�|j|7_|j}tt|jt|j�d�d�|_|jdkrFd|_|j|}|jj|j|�|jj	�|jdkr�|j
r�d|_
|jjd�dS)Nr��2T�
)rr�int�round�floatrr
rr	�flushr)r�n�old�newr
r
r�step0s"


zConsoleProgressBar.step)rr)N)r)�__name__�
__module__�__qualname__rrrr
r
r
rr!s

rcCsg}|j|�|S)N)�extend)�s�lr
r
r�set_to_listBs
r%FcCs@t|�std��|r,t|�}|j�|dSx|D]}|SWdS)a�
    Return the first element of a set.

    It sometimes useful to return the first element from a set but,
    because sets are not indexable, this is rather hard. This function
    will return the first element from a set. If sorted is True, then
    the set will first be sorted (making this an expensive operation).
    Otherwise a random element will be returned (as sets are not ordered).
    zempty containterrN)�len�
IndexErrorr%�sort)r#�sortedr$�xr
r
r�firstGs

r+cCs:tj�}y|j|�}Wntk
r4|jd�}YnX|S)z/Encode given text via preferred system encodingzutf-8)�locale�getpreferredencoding�encode�UnicodeError)�text�encodingZencoded_textr
r
r�encode_input\sr2cCs:tj�}y|j|�}Wntk
r4|jd�}YnX|S)z/Decode given text via preferred system encodingzutf-8)r,r-�decoder/)r0r1Zdecoded_textr
r
r�decode_inputisr4c@sHeZdZdZdd�Zdd�Zdd�Zdd	�Zd
d�Zdd
�Z	dd�Z
dS)�
Comparisonz�Class used when implementing rich comparison.

    Inherit from this class if you want to have a rich
    comparison withing the class, afterwards implement
    _compare function within your class.cCstS)N)�NotImplemented)r�other�methodr
r
r�_compare}szComparison._comparecCs|j|dd��S)NcSs||kS)Nr
)�a�br
r
r�<lambda>�sz#Comparison.__eq__.<locals>.<lambda>)r9)rr7r
r
r�__eq__�szComparison.__eq__cCs|j|dd��S)NcSs||kS)Nr
)r:r;r
r
rr<�sz#Comparison.__lt__.<locals>.<lambda>)r9)rr7r
r
r�__lt__�szComparison.__lt__cCs|j|dd��S)NcSs||kS)Nr
)r:r;r
r
rr<�sz#Comparison.__le__.<locals>.<lambda>)r9)rr7r
r
r�__le__�szComparison.__le__cCs|j|dd��S)NcSs||kS)Nr
)r:r;r
r
rr<�sz#Comparison.__ge__.<locals>.<lambda>)r9)rr7r
r
r�__ge__�szComparison.__ge__cCs|j|dd��S)NcSs||kS)Nr
)r:r;r
r
rr<�sz#Comparison.__gt__.<locals>.<lambda>)r9)rr7r
r
r�__gt__�szComparison.__gt__cCs|j|dd��S)NcSs||kS)Nr
)r:r;r
r
rr<�sz#Comparison.__ne__.<locals>.<lambda>)r9)rr7r
r
r�__ne__�szComparison.__ne__N)rr r!�__doc__r9r=r>r?r@rArBr
r
r
rr5vsr5r�csG�fdd�d�}|S)z,Convert a cmp= function into a key= functioncs\eZdZdd�Z�fdd�Z�fdd�Z�fdd�Z�fd	d
�Z�fdd�Z�fd
d�Z	dS)zcmp_to_key.<locals>.KcWs
||_dS)N)�obj)rrE�argsr
r
rr�szcmp_to_key.<locals>.K.__init__cs�|j|j�dkS)Nr)rE)rr7)�mycmpr
rr>�szcmp_to_key.<locals>.K.__lt__cs�|j|j�dkS)Nr)rE)rr7)rGr
rrA�szcmp_to_key.<locals>.K.__gt__cs�|j|j�dkS)Nr)rE)rr7)rGr
rr=�szcmp_to_key.<locals>.K.__eq__cs�|j|j�dkS)Nr)rE)rr7)rGr
rr?�szcmp_to_key.<locals>.K.__le__cs�|j|j�dkS)Nr)rE)rr7)rGr
rr@�szcmp_to_key.<locals>.K.__ge__cs�|j|j�dkS)Nr)rE)rr7)rGr
rrB�szcmp_to_key.<locals>.K.__ne__N)
rr r!rr>rAr=r?r@rBr
)rGr
r�K�srHr
)rGrHr
)rGr�
cmp_to_key�srI)rIcCs||k||kS)Nr
)r+�secondr
r
r�cmp�srK�__main__i�)rzcomputing pig����MbP?)F)rrD)r,�sys�version_infoZPY3�bytesZ
bytes_type�strZstring_typeZunicoderr%r+r2r4r5rI�	functoolsrKrZtime�stdout�pr�range�irZsleepr
r
r
r�<module>s0!





site-packages/sepolgen/__pycache__/audit.cpython-36.pyc000064400000042452147511334570017065 0ustar003

��f�U�@s�ddlZddlZddlmZddlmZddlmZdd�Zdd	�Zd
d�ZGdd
�d
�Z	Gdd�de	�Z
Gdd�de	�Zddlj
Z
iZGdd�de	�ZGdd�de	�ZGdd�de	�ZGdd�de	�ZGdd�d�ZGdd�d�ZGdd�d�ZdS) �N�)�	refpolicy)�access)�utilcCs�ddl}ddl}tdd�}t|j�j�d�}|j|j|j�|�}|jd|�}|jd|�}|j	ddd	d
||g|j
d�j�d}tj
r�tj|�}|S)a
Obtain all of the avc and policy load messages from the audit
    log. This function uses ausearch and requires that the current
    process have sufficient rights to run ausearch.

    Returns:
       string contain all of the audit messages returned by ausearch.
    rNz/proc/uptime�rz%xz%Xz/sbin/ausearchz-mz5AVC,USER_AVC,MAC_POLICY_LOAD,DAEMON_START,SELINUX_ERRz-ts)�stdout)�
subprocess�time�open�float�read�split�closeZ	localtimeZstrftime�Popen�PIPE�communicater�PY3�decode_input)rr	�fdZoff�sZbootdateZboottime�output�r�/usr/lib/python3.6/audit.py�get_audit_boot_msgss

rcCs:ddl}|jdddg|jd�j�d}tjr6tj|�}|S)a
Obtain all of the avc and policy load messages from the audit
    log. This function uses ausearch and requires that the current
    process have sufficient rights to run ausearch.

    Returns:
       string contain all of the audit messages returned by ausearch.
    rNz/sbin/ausearchz-mz5AVC,USER_AVC,MAC_POLICY_LOAD,DAEMON_START,SELINUX_ERR)r)rrrrrrr)rrrrr�get_audit_msgs2s
rcCs6ddl}|jdg|jd�j�d}tjr2tj|�}|S)z�Obtain all of the avc and policy load messages from /bin/dmesg.

    Returns:
       string contain all of the audit messages returned by dmesg.
    rNz
/bin/dmesg)r)rrrrrrr)rrrrr�get_dmesg_msgsAs
rc@s eZdZdZdd�Zdd�ZdS)�AuditMessagez�Base class for all objects representing audit messages.

    AuditMessage is a base class for all audit messages and only
    provides storage for the raw message (as a string) and a
    parsing function that does nothing.
    cCs||_d|_dS)N�)�message�header)�selfrrrr�__init__WszAuditMessage.__init__cCs^xX|D]P}|jd�}t|�dkr<|dd�dkr||_dSq|ddkr|d|_dSqWdS)	z�Parse a string that has been split into records by space into
        an audit message.

        This method should be overridden by subclasses. Error reporting
        should be done by raise ValueError exceptions.
        �=�N�zaudit(r�msgr)r
�lenr)r �recsr%�fieldsrrr�from_split_string[s


zAuditMessage.from_split_stringN)�__name__�
__module__�__qualname__�__doc__r!r)rrrrrPsrc@seZdZdZdd�ZdS)�InvalidMessagez�Class representing invalid audit messages. This is used to differentiate
    between audit messages that aren't recognized (that should return None from
    the audit message parser) and a message that is recognized but is malformed
    in some way.
    cCstj||�dS)N)rr!)r rrrrr!vszInvalidMessage.__init__N)r*r+r,r-r!rrrrr.psr.c@s eZdZdZdd�Zdd�ZdS)�PathMessagez!Class representing a path messagecCstj||�d|_dS)Nr)rr!�path)r rrrrr!{szPathMessage.__init__cCsXtj||�xF|D]>}|jd�}t|�dkr.q|ddkr|ddd�|_dSqWdS)Nr"r#rr0r���)rr)r
r&r0)r r'r%r(rrrr)s

zPathMessage.from_split_stringN)r*r+r,r-r!r)rrrrr/ysr/c@s0eZdZdZdd�Zdd�Zdd�Zdd	�Zd
S)�
AVCMessagea�AVC message representing an access denial or granted message.

    This is a very basic class and does not represent all possible fields
    in an avc message. Currently the fields are:
       scontext - context for the source (process) that generated the message
       tcontext - context for the target
       tclass - object class for the target (only one)
       comm - the process name
       exe - the on-disc binary
       path - the path of the target
       access - list of accesses that were allowed or denied
       denial - boolean indicating whether this was a denial (True) or granted
          (False) message.
       ioctlcmd - ioctl 'request' parameter

    An example audit message generated from the audit daemon looks like (line breaks
    added):
       'type=AVC msg=audit(1155568085.407:10877): avc:  denied  { search } for
       pid=677 comm="python" name="modules" dev=dm-0 ino=13716388
       scontext=user_u:system_r:setroubleshootd_t:s0
       tcontext=system_u:object_r:modules_object_t:s0 tclass=dir'

    An example audit message stored in syslog (not processed by the audit daemon - line
    breaks added):
       'Sep 12 08:26:43 dhcp83-5 kernel: audit(1158064002.046:4): avc:  denied  { read }
       for  pid=2 496 comm="bluez-pin" name=".gdm1K3IFT" dev=dm-0 ino=3601333
       scontext=user_u:system_r:bluetooth_helper_t:s0-s0:c0
       tcontext=system_u:object_r:xdm_tmp_t:s0 tclass=file
    cCs\tj||�tj�|_tj�|_d|_d|_d|_d|_	d|_
g|_d|_d|_
tj|_dS)NrT)rr!r�SecurityContext�scontext�tcontext�tclass�comm�exer0�name�accesses�denial�ioctlcmd�	audit2why�TERULE�type)r rrrrr!�s

zAVCMessage.__init__cCs|d}|}|t|�dkr&td|j��x:|t|�kr`||dkrFd}P|jj||�|d}q(W|sttd|j��|dS)NFrz#AVC message in invalid format [%s]
�}T)r&�
ValueErrorrr:�append)r r'�startZfound_close�irrrZ__parse_access�szAVCMessage.__parse_accesscCs�tj||�d}d}d}d}�xftt|��D�]T}||dkrV|j||d�}d}q,n||dkrhd|_||jd�}t|�dkr�q,|dd	kr�tj|d�|_	d}q,|dd
kr�tj|d�|_
d}q,|ddkr�|d|_d}q,|ddk�r|ddd�|_q,|dd
k�r(|ddd�|_
q,|ddk�rJ|ddd�|_q,|ddkr,yt|dd�|_Wq,tk
�r�Yq,Xq,W|�s�|�s�|�s�|�r�td|j��|j�dS)NF�{rTZgrantedr"r#rr4r5r6r7r8r9r<�z#AVC message in invalid format [%s]
r1r1r1)rr)�ranger&�_AVCMessage__parse_accessr;r
rr3r4r5r6r7r8r9�intr<rAr�analyze)r r'Z	found_srcZ	found_tgtZfound_classZfound_accessrDr(rrrr)�sL

 zAVCMessage.from_split_stringcCs�|jj�}|jj�}t|j�}g|_|||j|ftj�krXt|||j|f\|_	|_�n�t
j|||j|j�\|_	|_|j	t
jkr�t
j
|_	|j	t
jkr�td|��|j	t
jkr�td|��|j	t
jkr�td|j��|j	t
jkr�tddj|j���|j	t
jk�rtd��|j	t
jk�r�|jg|_|jj|jjk�rR|jjd|jjd|jjf�|jj|jjk�r�|jjdk�r�|jjd	|jjd	|jjf�|jj|jjk�r�|jjd
|jjd
|jjf�|j	|jft|||j|f<dS)NzInvalid Target Context %s
zInvalid Source Context %s
zInvalid Type Class %s
zInvalid permission %s
� z&Error during access vector computationz	user (%s)Zobject_rz	role (%s)z
level (%s))r5Z	to_stringr4�tupler:�datar6�avcdict�keysr?r=rJZNOPOLICYr>ZBADTCONrAZBADSCONZBADPERM�joinZ
BADCOMPUTEZ
CONSTRAINT�userrB�role�level)r r5r4Zaccess_tuplerrrrJ�s8



    zAVCMessage.analyzeN)r*r+r,r-r!rHr)rJrrrrr2�s
-r2c@seZdZdZdd�ZdS)�PolicyLoadMessagez6Audit message indicating that the policy was reloaded.cCstj||�dS)N)rr!)r rrrrr! szPolicyLoadMessage.__init__N)r*r+r,r-r!rrrrrTsrTc@s eZdZdZdd�Zdd�ZdS)�DaemonStartMessagez3Audit message indicating that a daemon was started.cCstj||�d|_dS)NF)rr!�auditd)r rrrrr!%szDaemonStartMessage.__init__cCstj||�d|krd|_dS)NrVT)rr)rV)r r'rrrr))sz$DaemonStartMessage.from_split_stringN)r*r+r,r-r!r)rrrrrU#srUc@s(eZdZdZdd�Zdd�Zdd�ZdS)	�ComputeSidMessagea�Audit message indicating that a sid was not valid.

    Compute sid messages are generated on attempting to create a security
    context that is not valid. Security contexts are invalid if the role is
    not authorized for the user or the type is not authorized for the role.

    This class does not store all of the fields from the compute sid message -
    just the type and role.
    cCs4tj||�tj�|_tj�|_tj�|_d|_dS)Nr)rr!rr3�invalid_contextr4r5r6)r rrrrr!9s



zComputeSidMessage.__init__c	Cs�tj||�t|�dkr td��y\tj|d�|_tj|djd�d�|_tj|djd�d�|_	|djd�d|_
Wntd��YnXdS)	N�
z;Split string does not represent a valid compute sid message��r"r��	)rr)r&rArr3rXr
r4r5r6)r r'rrrr)@sz#ComputeSidMessage.from_split_stringcCsd|j|jfS)Nzrole %s types %s;
)rRr?)r rrrrLszComputeSidMessage.outputN)r*r+r,r-r!r)rrrrrrW/s	rWc@s^eZdZdZddd�Zdd�Zdd�Zd	d
�Zdd�Zd
d�Z	dd�Z
ddd�Zddd�ZdS)�AuditParsera�Parser for audit messages.

    This class parses audit messages and stores them according to their message
    type. This is not a general purpose audit message parser - it only extracts
    selinux related messages.

    Each audit messages are stored in one of four lists:
       avc_msgs - avc denial or granted messages. Messages are stored in
          AVCMessage objects.
       comput_sid_messages - invalid sid messages. Messages are stored in
          ComputSidMessage objects.
       invalid_msgs - selinux related messages that are not valid. Messages
          are stored in InvalidMessageObjects.
       policy_load_messages - policy load messages. Messages are stored in
          PolicyLoadMessage objects.

    These lists will be reset when a policy load message is seen if
    AuditParser.last_load_only is set to true. It is assumed that messages
    are fed to the parser in chronological order - time stamps are not
    parsed.
    FcCs|j�||_dS)N)�_AuditParser__initialize�last_load_only)r r`rrrr!gszAuditParser.__init__cCs.g|_g|_g|_g|_g|_i|_d|_dS)NF)�avc_msgs�compute_sid_msgs�invalid_msgs�policy_load_msgs�	path_msgs�	by_header�check_input_file)r rrrZ__initializekszAuditParser.__initializecCs�dd�|j�D�}x�|D]�}d}|dks8|dks8|dkrFt|�}d}n^|dkr\t|�}d}nH|d	ksl|d
krzt|�}d}n*|dkr�t|�}d}n|dkr�tt�}d}|rd|_y|j|�Wnt	k
r�t
|�}YnX|SqWdS)
NcSsg|]}|jd��qS)u…)�strip)�.0�xrrr�
<listcomp>�sz,AuditParser.__parse_line.<locals>.<listcomp>Fzavc:zmessage=avc:z	msg='avc:Tzsecurity_compute_sid:ztype=MAC_POLICY_LOADz	type=1403z
type=AVC_PATHztype=DAEMON_START)r
r2rWrTr/rU�listrgr)rAr.)r �lineZrecrD�foundr%rrrZ__parse_line�s4
zAuditParser.__parse_linecCs�|j|�}|dkrdSt|t�r0|jr�|j�n�t|t�r\|jrN|jrN|j�|jj|�n^t|t	�rt|j
j|�nFt|t�r�|jj|�n.t|t
�r�|jj|�nt|t�r�|jj|�|jdkr�|j|jkr�|j|jj|�n|g|j|j<dS)Nr)�_AuditParser__parse_line�
isinstancerTr`r_rUrVrdrBr2rarWrbr.rcr/rerrf)r rmr%rrrZ__parse�s,








zAuditParser.__parsecCsxxr|jj�D]d}g}d}x0|D](}t|t�r2|}qt|t�r|j|�qWt|�dkr|rx|D]}|j|_q`WqWdS)Nr)rf�valuesrpr/r2rBr&r0)r �value�avcr0r%�arrrZ__post_process�s



zAuditParser.__post_processcCsL|j�}x|r"|j|�|j�}q
W|js@tjjd�tjd�|j�dS)zpParse the contents of a file object. This method can be called
        multiple times (along with parse_string).zNothing to do
rN)�readline�_AuditParser__parserg�sys�stderr�write�exit�_AuditParser__post_process)r �inputrmrrr�
parse_file�s

zAuditParser.parse_filecCs.|jd�}x|D]}|j|�qW|j�dS)z�Parse a string containing audit messages - messages should
        be separated by new lines. This method can be called multiple
        times (along with parse_file).�
N)r
rvr{)r r|�lines�lrrr�parse_string�s

zAuditParser.parse_stringNcCs@tj�}x2|jD](}|s$|j|�r|j|jj|jj�qW|S)aoReturn RoleAllowSet statements matching the specified filter

        Filter out types that match the filer, or all roles

        Params:
           role_filter - [optional] Filter object used to filter the
              output.
        Returns:
           Access vector set representing the denied access in the
           audit logs parsed by this object.
        )rZRoleTypeSetrb�filter�addrXrRr?)r Zrole_filterZ
role_typesZcsrrr�to_role�s
zAuditParser.to_roleTcCs�tj�}x�|jD]�}|jdkr$|r$q|s4|j|�rtj|jj|jj|j	g|j
�}|j|_|j|_|jr�t
j�}|j|j�||jd<|j||d�qW|S)a�Convert the audit logs access into a an access vector set.

        Convert the audit logs into an access vector set, optionally
        filtering the restults with the passed in filter object.

        Filter objects are object instances with a .filter method
        that takes and access vector and returns True if the message
        should be included in the final output and False otherwise.

        Params:
           avc_filter - [optional] Filter object used to filter the
              output.
        Returns:
           Access vector set representing the denied access in the
           audit logs parsed by this object.
        TZioctl)Z	audit_msg)rZAccessVectorSetrar;r�ZAccessVectorr4r?r5r6r:rMr<rZXpermSetr�ZxpermsZadd_av)r Z
avc_filterZonly_denialsZav_setrs�avZ	xperm_setrrr�	to_access�s
zAuditParser.to_access)F)N)NT)
r*r+r,r-r!r_rorvr{r}r�r�r�rrrrr^Qs
$%
	
r^c@seZdZdd�Zdd�ZdS)�
AVCTypeFiltercCstj|�|_dS)N)�re�compile�regex)r r�rrrr!%szAVCTypeFilter.__init__cCs,|jj|jj�rdS|jj|jj�r(dSdS)NTF)r��matchr4r?r5)r rsrrrr�(s
zAVCTypeFilter.filterN)r*r+r,r!r�rrrrr�$sr�c@seZdZdd�Zdd�ZdS)�ComputeSidTypeFiltercCstj|�|_dS)N)r�r�r�)r r�rrrr!0szComputeSidTypeFilter.__init__cCs@|jj|jj�rdS|jj|jj�r(dS|jj|jj�r<dSdS)NTF)r�r�rXr?r4r5)r rsrrrr�3szComputeSidTypeFilter.filterN)r*r+r,r!r�rrrrr�/sr�)r�rwrrrrrrrrr.r/Zselinux.audit2whyr=rNr2rTrUrWr^r�r�rrrr�<module>s* 	
"Tsite-packages/sepolgen/__pycache__/__init__.cpython-36.pyc000064400000000161147511334570017505 0ustar003
[compiled bytecode, not human-readable: the empty package initializer sepolgen/__init__.py.]

site-packages/sepolgen/__pycache__/lex.cpython-36.opt-1.pyc
[compiled bytecode, not human-readable. The recoverable strings identify this as a bundled copy of the PLY lexer module (lex.py): LexError and LexToken types; a Lexer class with input(), token(), state handling (begin/push_state/pop_state), clone() and lextab read/write support; a LexerReflect helper that collects and validates the t_* token rules of the calling module; the lex() builder; the TOKEN decorator; and a runmain() test harness.]
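A hedged illustration of how a PLY-style lexer such as the one above is normally defined and driven (the token names and patterns here are invented for the example):

# Hedged sketch: a tiny tokenizer built with the bundled lex module.
from sepolgen import lex

tokens = ('IDENTIFIER', 'NUMBER')

t_IDENTIFIER = r'[A-Za-z_][A-Za-z0-9_]*'
t_ignore = ' \t'

def t_NUMBER(t):
    r'\d+'
    t.value = int(t.value)
    return t

def t_error(t):
    print("Illegal character %r" % t.value[0])
    t.lexer.skip(1)

lexer = lex.lex()
lexer.input("port 8080")
for tok in iter(lexer.token, None):
    print(tok.type, tok.value)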
"
site-packages/sepolgen/__pycache__/interfaces.cpython-36.pyc
[compiled bytecode, not human-readable. Recoverable docstrings: "Classes for representing and manipulating interfaces." A Param object models an interface parameter ($1, $2, ...) with a name, a field type (source type, target type, object class or role) and a set of object classes; av_extract_params() and role_extract_params() infer how parameters are used from access vectors; AttributeVector and AttributeSet hold per-attribute access; InterfaceVector and InterfaceSet summarise reference-policy interfaces, expand nested interface calls and index them by target type.]

site-packages/sepolgen/__pycache__/audit.cpython-36.opt-1.pyc
[compiled bytecode, not human-readable; an optimised duplicate of the audit module shown earlier in this archive: get_audit_boot_msgs(), get_audit_msgs() and get_dmesg_msgs() collect AVC and policy-load messages via ausearch or dmesg; the AuditMessage hierarchy (InvalidMessage, PathMessage, AVCMessage, PolicyLoadMessage, DaemonStartMessage, ComputeSidMessage) models individual records; AuditParser and the AVCTypeFilter/ComputeSidTypeFilter classes turn audit logs into access vector sets.]

site-packages/sepolgen/__pycache__/output.cpython-36.pyc
[compiled bytecode, not human-readable. Recoverable docstring: "Classes and functions for the output of reference policy modules." ModuleWriter takes a refpolicy.Module, optionally sorts and groups its rules (sort_filter with the avrule_cmp, ifcall_cmp and role_type_cmp comparators) and writes the formatted policy text to a file object.]

site-packages/sepolgen/__pycache__/refpolicy.cpython-36.opt-1.pyc
[compiled bytecode, not human-readable. Recoverable strings describe the policy object model: Node and Leaf tree classes with walktree/walknode helpers; value types such as IdSet, SecurityContext, ObjectClass and XpermSet; statement classes including Type, TypeAttribute, TypeAlias, Attribute, Role, RoleAllow, RoleType, AVRule, AVExtRule, TypeRule, TypeBound, Require, InterfaceCall, Interface, Template, Module and ModuleDeclaration; labeling statements such as GenfsCon, FilesystemUse, PortCon, NodeCon, NetifCon, IomemCon, IoportCon, PciDeviceCon and DeviceTreeCon; each statement renders itself with to_string().]

site-packages/sepolgen/__pycache__/policygen.cpython-36.opt-1.pyc
[compiled bytecode, not human-readable; its remaining bytes continue below. Recoverable docstrings: "classes and algorithms for the generation of SELinux policy." PolicyGenerator builds or updates a reference policy module from access vectors; set_gen_refpol(), set_gen_requires(), set_gen_explain(), set_gen_dontaudit() and set_gen_xperms() control interface generation, require blocks, explanatory comments, dontaudit rules and extended-permission rules; set_module_name() names the module and get_module() returns the result for writing with output.ModuleWriter.]
# %s
zM
#!!!! The source type '%s' can write to a '%s' of the following types:
# %s
)!rZAVRulerZ	DONTAUDIT�	rule_type�commentr�str�Comment�explain_access�type�	audit2whyZALLOWrZBOOLEAN�len�data�joinZ
CONSTRAINTZTERULE�perms�	obj_classrZseinfoZ	ATTRIBUTEZsesearchZSCONTEXT�src_typeZCLASSZPERMS�appendrr#)r�avZrule�reasonr2�irrrZ
__add_av_rule�sN
&.$&zPolicyGenerator.__add_av_rulecCs@x:|jj�D],}tj||�}|jr*|j|_|jjj	|�qWdS)z5Add extended permission access vector rules.
        N)
r�keysrZ	AVExtRulerZDONTAUDITXPERMr3rr#r@)rrA�opZextrulerrrZ__add_ext_av_rules�s
z"PolicyGenerator.__add_ext_av_rulescCs`|jr*|jj||j�\}}|jjj|�n|}x,|D]$}|j|�|jr4|jr4|j|�q4WdS)zJAdd the access from the access vector set to this
        module.
        N)	r�genrrr#�extend�_PolicyGenerator__add_av_ruler�"_PolicyGenerator__add_ext_av_rules)rZav_setZ	raw_allow�ifcallsrArrr�
add_access�s	

zPolicyGenerator.add_accesscCs x|D]}|jjj|�qWdS)N)rr#r@)rZ
role_type_set�	role_typerrr�add_role_types�s
zPolicyGenerator.add_role_types)N)NN)T)r")�__name__�
__module__�__qualname__�__doc__rrr�SHORT_EXPLANATIONrrrrr(r)rHrIrKrMrrrrr-s




5rcsg���fdd�}|tkr�x�|jD]�}�jd|j��jdt|j�t|j�f��jd|jtj	|j
�f��jd|j|j|j
f��jtjd|jdd	d
dd��q"W|�nb|�r�jd
|j|j|j|jj�f�t|j�dk�r|jd}�jd|j|j|j
f�|��S)a�Explain why a policy statement was generated.

    Return a string containing a text explanation of
    why a policy statement was generated. The string is
    commented and wrapped and can be directly inserted
    into a policy.

    Params:
      av - access vector representing the access. Should
       have .audit_msgs set appropriately.
      verbosity - the amount of explanation provided. Should
       be set to NO_EXPLANATION, SHORT_EXPLANATION, or
       LONG_EXPLANATION.
    Returns:
      list of strings - strings explaining the access or an empty
       string if verbosity=NO_EXPLANATION or there is not sufficient
       information to provide an explanation.
    csN�sdS�jd�x6�j�D]*}t|j�j�}�jd|j�|jf�qWdS)Nz Interface options:z   %s # [%d])r@�all�call_interface�	interfacerAZ	to_stringZdist)�match�ifcall)�ml�srr�explain_interfacess
z*explain_access.<locals>.explain_interfacesz %sz  scontext="%s" tcontext="%s"z  class="%s" perms="%s"z  comm="%s" exe="%s" path="%s"z	message="�"�Pz  z   )Zinitial_indentZsubsequent_indentz) src="%s" tgt="%s" class="%s", perms="%s"rz comm="%s" exe="%s" path="%s")�LONG_EXPLANATIONZ
audit_msgsr@�headerr5ZscontextZtcontextZtclassrZlist_to_space_strZaccessesZcommZexe�pathrG�textwrapZwrap�messager?�tgt_typer>r=Zto_space_strr:)rArXr+rZ�msgr)rXrYrr7�s*
r7cCs�g}g}|j|jj��|jdd�dd�tj�}|j|_x�tt	|��D]r}||j
tjkrl|jj
|j�qH||j
tjkr�|jj
|j�qH||j
tjkr�|jj
|j�qHt||j
�qHW|S)NcSs|jS)N)�num)�paramrrr�<lambda>9sz call_interface.<locals>.<lambda>T)�key�reverse)rG�params�values�sortrZ
InterfaceCallr%Zifname�ranger:r8�SRC_TYPE�argsr@r?�TGT_TYPErb�	OBJ_CLASSr>�print)rUrArirnrWrCrrrrT4s rTc@s.eZdZd
dd�Zdd�Zdd�Zdd	�ZdS)rNcCs&||_|j|�tj|�|_g|_dS)N)�ifs�hack_check_ifsrZ
AccessMatcher�matcher�calls)rrrrrrrrNs
zInterfaceGenerator.__init__cCs�x�|jj�D]|}g}|j|jj��|jdd�dd�xPtt|��D]@}|d||jkrbd|_P||j	t
jt
jt
j
gkrDd|_PqDWqWdS)NcSs|jS)N)rd)rerrrrf\sz3InterfaceGenerator.hack_check_ifs.<locals>.<lambda>T)rgrhrF)rrjrGrirkrlr:rdZenabledr8rrmrorp)rrrr-rirCrrrrsTs
z!InterfaceGenerator.hack_check_ifscCs�|j|�}g}xH|jD]>}t|j�j|j�}|rFtjt|j||��|_	|j
||f�qWg}xX|D]P\}}d}	x4|D],}
|
j|�rt|
j	r�|j	r�|
j	j|j	�d}	qtW|	sb|j
|�qbW||fS)NFT)
rVrurTZbestrUrArr6r7r4r@Zmatches�merge)r�avsr+�raw_avrJrXrW�drr�foundZo_ifcallrrrrFks$


zInterfaceGenerator.gencCsPg}xF|D]>}tj�}|jj|j||�t|�r>|jj|�q
|j|�q
W|S)N)rZ	MatchListrtZ
search_ifsrrr:rur@)rrwrxrAZansrrrrV�s
zInterfaceGenerator.match)N)rNrOrPrrsrFrVrrrrrMs
rcCs&dd�}x|j�D]}||�qWdS)z*Add require statements to the module.
    cSs�tj�}xJ|j�D]>}|jj|j�|jj|j�x|jD]}|j||j	�q:WqWx,|j
�D] }x|jD]}|jj|�qjWq^Wx,|j
�D] }|jj|j�|jj|j�q�W|jjd�|jjd|�dS)Nrr)rZRequireZavrulesr2�updateZ	src_typesZ	tgt_typesZobj_classesZ
add_obj_classr=Zinterface_callsrn�addZ
role_typesZrolesZrole�discardr#r$)�node�rZavrule�objrW�argrLrrr�collect_requires�sz&gen_requires.<locals>.collect_requiresN)Znodes)rr�r~rrrr�sr)rQ�	itertoolsr`Zselinux.audit2whyr9Zsetoolsr*rrrrrr	r
rRr]rr7rTrrrrrr�<module>s,
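The docstrings above outline how audit2allow-style tools drive this module.  The
following is a hedged usage sketch, not code recovered from the archive: the
AccessVectorSet.add() argument order and the output.ModuleWriter.write() call are
assumptions based on the class and method names visible in the bytecode, and
"myapp_t"/"var_log_t" are made-up example types.

# Hypothetical sketch: turning a hand-built access vector set into a policy module.
from sepolgen import access, policygen, output

avs = access.AccessVectorSet()
# assumed argument order: source type, target type, object class, permissions
avs.add("myapp_t", "var_log_t", "file", ["read", "getattr"])

gen = policygen.PolicyGenerator()
gen.set_gen_requires(True)            # also emit require { ... } statements
gen.set_module_name("myapp", "1.0")   # module name and version
gen.add_access(avs)                   # access vectors -> allow rules

writer = output.ModuleWriter()
with open("myapp.te", "w") as fd:
    writer.write(gen.get_module(), fd)   # text representation of the module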
site-packages/sepolgen/__pycache__/refparser.cpython-36.pyc000064400000070600147511334570017744 0ustar00
[CPython 3.6 bytecode for sepolgen/refparser.py -- raw marshal data omitted.  The
readable strings show a lex/yacc based parser for SELinux reference policy sources:]

  - a token list (TICK, SQUOTE, OBRACE, CBRACE, SEMI, COLON, OPAREN, CPAREN,
    COMMA, MINUS, TILDE, ASTERISK, AMP, BAR, EXPL, EQUAL, FILENAME, IDENTIFIER,
    NUMBER, PATH, IPV6_ADDR, ...) and a reserved-word map for the policy
    keywords: module, policy_module, require, sid, genfscon, fs_use_xattr,
    fs_use_trans, fs_use_task, portcon, nodecon, netifcon, pirqcon, iomemcon,
    ioportcon, pcidevicecon, devicetreecon, class, typeattribute,
    roleattribute, type, attribute, attribute_role, alias, typealias, bool,
    true, false, if, else, role, types, allow, dontaudit, auditallow,
    neverallow, permissive, typebounds, type_transition, type_change,
    type_member, range_transition, role_transition, optional_policy,
    interface, tunable_policy, gen_require, template, gen_context, ifelse,
    ifdef, ifndef, define;
  - lexer rules (t_IPV6_ADDR, m4/dnl and refpolicywarn comment handling,
    t_IDENTIFIER, t_FILENAME, t_comment, t_error, t_newline);
  - grammar rule functions (p_statements, p_statement, p_policy_module_stmt,
    p_interface, p_template, p_optional_policy, p_tunable_policy, p_ifdef,
    p_interface_call, p_require, p_security_context, p_gen_context,
    p_avrule_def, p_typerule_def, p_type_def, p_typealias_def, p_role_def,
    p_bool, p_conditional, p_cond_expr, p_names, p_nested_id_set,
    p_comma_list, p_error, ...) whose docstrings carry the BNF productions
    consumed by the bundled yacc module;
  - module entry points create_globals(), parse(), list_headers() and
    parse_headers(), with progress strings such as "Parsing support macros
    (%s): " and "Parsing interface files".
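The grammar summarised above is what sepolgen uses to read reference-policy
sources and interface headers.  As a hedged illustration (not code from the
archive), the recovered parse() entry point can be fed a policy fragment
directly; the exact signature and the avrules()/to_string() helpers on the
returned tree are assumptions based on names visible in this and the
surrounding entries, and the policy fragment itself is invented.

# Hypothetical sketch: parsing a small policy fragment with the recovered parser.
from sepolgen import refparser

policy_text = '''
policy_module(myapp, 1.0)
type myapp_t;
type myapp_log_t;
allow myapp_t myapp_log_t:file { create append };
'''

tree = refparser.parse(policy_text)   # assumed to return the parsed refpolicy tree
for rule in tree.avrules():           # walk the allow/dontaudit/auditallow rules
    print(rule.to_string())

Each p_* function listed above follows the bundled yacc module's convention:
the production it implements is written in the function's docstring, and the
parser generator assembles the grammar from those docstrings.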
site-packages/sepolgen/__pycache__/interfaces.cpython-36.opt-1.pyc000064400000030375147511334570021042 0ustar00
[CPython 3.6 bytecode for sepolgen/interfaces.py -- raw marshal data omitted.]

Module docstring:
    Classes for representing and manipulating interfaces.

Recoverable contents: class Param (a parameter of an interface, with name,
type and obj_classes fields); av_extract_params(), whose docstring explains
how the $N parameters are extracted from an access vector so that, for
example, $1 in

    interface(`foo', `
       allow $1 foo : file read;
    ')

is recognised as a SRC_TYPE domain parameter (returns 0 on success, 1 if an
unresolvable conflict is found); role_extract_params(),
type_rule_extract_params() and ifcall_extract_params(); and the classes
AttributeVector, AttributeSet, InterfaceVector and InterfaceSet (add_headers,
expand_ifcalls, index, from_file/to_file), which turn parsed reference policy
headers into per-interface access summaries.

site-packages/sepolgen/__pycache__/yacc.cpython-36.pyc000064400000151562147511334570016701 0ustar00
[CPython 3.6 bytecode, apparently the PLY (Python lex-yacc) parser generator
yacc.py bundled with sepolgen -- raw marshal data omitted.  Readable strings
include the LALR machinery (Grammar, Production, LRItem, LRParser,
LRGeneratedTable, ParserReflect and the yacc() entry point), the
"parser.out"/"parsetab" table files, debug output such as "PLY: PARSE DEBUG
START", and the warning against using the global errok(), token() and
restart() helpers inside p_error() instead of the parser instance methods.]
j
j|�Wntk
�rLYnX�q\W||_dS)Nrz+no rules of the form p_rulename are definedTr�rdz%%s:%d: Rule %r has too many argumentsz#%s:%d: Rule %r requires an argumentzA%s:%d: No documentation string specified in function %r (ignored)rzZt_rpz%r not defined as a functionryrNz9%s:%d: Possible grammar rule %r defined without p_ prefix)rrdrrrjrlr�rAr�rrrsrwrr}rrSrbrVr7r�r�rfr|rq�__func__r=rurtr�r)rrr�r�rrRr�r�ZreqargsZparsed_gr"rArBrDr	r	r
raNs\


 z!ParserReflect.validate_pfunctions)N)rrrrr\rcrrbrWr]rXr^rYr_rZr`r[rar	r	r	r
rTzs

rTc
<Os�	|dkrt}|rd}|dkr&ttj�}�r��fdd�t��D�}
t|
�}d|krdtj|dj|d<d|kr�d|kr�ttj|dd�r�tj|dj	|d<nt
d�}|	dk�rt|tj
�r�|j}nLd|kr�|d}n:|jd�}dj|dd6��}td
|�ttj|dd�}tjj|�}	|jd�}|�rNt|t��rNd|k�rN|d|}|dk	�r`||d<t||d
�}|j�|j�r�td��|j�}y�t�}|�r�|j|�}n
|j|�}|�s�||k�ry"|j|j �t!||j"�}|j#a#|St$k
�r}z|j%d|�WYdd}~XnXWnFt&k
�rH}z|j%t|��WYdd}~Xnt'k
�r\YnX|
dk�r�|�r�ytt(tjj|	|�d��}
Wn<t)k
�r�}z|j%d||f�t*�}
WYdd}~XnXnt*�}
|
j+dt,�d}|j-��r�td��|j"�s|j%d�t.|j/�}xZ|j0D]P\}}}y|j1|||�Wn0t2k
�rb}z|j%d|�WYdd}~XnX�qWxl|j3D]b\}}|\} }!}"}#y|j4|"|#|| |!�Wn4t2k
�r�}z|jd|�d}WYdd}~XnX�qrWy&|dk�r�|j5|j6�n
|j5|�Wn6t2k
�r4}z|jt|��d}WYdd}~XnX|�rDtd��|j7�}$x*|$D]"\}%}&|jd|&j8|&j9|%�d}�qRW|j:�}'|'�r�|
j+d�|
j+d�|
j+d�x&|'D]}|j%d|�|
j+d|��q�W|�r|
j+d�|
j+d�|
j+d�x&t;|j<�D]\}(})|
j+d|(|)��q�W|j=�}*x$|*D]}&|j%d|&j8|&j9|&j>��q&Wt?|'�d	k�r^|j%d�t?|'�d	k�r||j%dt?|'��t?|*�d	k�r�|j%d �t?|*�d	k�r�|j%d!t?|*��|�r�|
j+d�|
j+d"�|
j+d�t@|jA�}+|+jB�x2|+D]*}|
j+d#|d$jd%d�|jA|D����q�W|
j+d�|
j+d&�|
j+d�t@|jC�},|,jB�x2|,D]*}-|
j+d#|-d$jd'd�|jC|-D����qRW|
j+d�|�r�|jD�}.x|.D]}/|j%d(|/��q�W|jE�}0x|0D]}1|jd)|1�d}�q�W|jF�}2x$|2D]\}}|jd*||�d}�q�W|�rtd��|�r*|jGd+|�tH|||
�}|�r�t?|jI�}3|3d	k�r\|j%d,�n|3d	k�rr|j%d-|3�t?|jJ�}4|4d	k�r�|j%d.�n|4d	k�r�|j%d/|4�|�r�|jI�s�|jJ�r�|
j%d�|
j%d0�|
j%d�x&|jID]\}5}6}7|
j%d1|6|5|7��q�WtK�}8x�|jJD]x\}5}9}:|5tL|9�tL|:�f|8k�r8�q|
j%d2|5|9�|
j%d3|:|5�|j%d2|5|9�|j%d3|:|5�|8jM|5tL|9�tL|:�f��qWg};xL|jJD]B\}5}9}:|:jN�r�|:|;k�r�|
j%d4|:�|j%d4|:�|;jO|:��q�W|�	rDy&|jP||	|�|tjk�	r
tj|=Wn6t)k
�	rB}z|j%d5||f�WYdd}~XnX|�	r�y|jQ||�Wn6t)k
�	r�}z|j%d5||f�WYdd}~XnX|j|j �t!||j"�}|j#a#|S)7Nrcsg|]}|t�|�f�qSr	)rK)r?r�)r�r	r
r@�szyacc.<locals>.<listcomp>�__file__r�__package__r�r�rdz	import %srxr�)rzUnable to build parserz.There was a problem loading the table file: %rr;zCouldn't open %r. %sz5Created by PLY version %s (http://www.dabeaz.com/ply)Fz no p_error() function is definedz%sTz;%s:%d: Symbol %r used, but not defined as a token or a rulezUnused terminals:zToken %r defined, but not usedz    %sr�zRule %-5d %sz$%s:%d: Rule %r defined, but not usedzThere is 1 unused tokenzThere are %d unused tokenszThere is 1 unused rulezThere are %d unused rulesz'Terminals, with rules where they appearz
%-20s : %srycSsg|]}t|��qSr	)r7)r?r=r	r	r
r@M
sz*Nonterminals, with rules where they appearcSsg|]}t|��qSr	)r7)r?r=r	r	r
r@U
szSymbol %r is unreachablez)Infinite recursion detected for symbol %rz0Precedence rule %r defined for unknown symbol %rzGenerating %s tablesz1 shift/reduce conflictz%d shift/reduce conflictsz1 reduce/reduce conflictz%d reduce/reduce conflictsz
Conflicts:z7shift/reduce conflict for %s in state %d resolved as %sz;reduce/reduce conflict in state %d resolved using rule (%s)zrejected rule (%s) in state %dzRule (%s) is never reducedzCouldn't create %r. %sr~)R�
tab_modulerrnro�dir�dictr�r�r�r�rLrAr�r�r=r�r�rKr�r��dirnamer�r7rTr\rrrr�rr�rr�rWrUrvrPrr�r�r�r<rr�__version__rcr�rVryr�r�rr�r�r�r�r�r�r�r�r�r�rrrgr�r~r�r�r�r�rrrrr�r!r�r�rbrBrF)<rrr�r?r�Zcheck_recursion�optimizeZwrite_tablesZ	debugfiler@ZdebuglogZerrorlogZ
picklefileZ_itemsr�ZsrcfilergZpkgnameZpkgZpinforZlrZread_signaturer3rA�errorsrr�r�r��funcnameZgramr�r�r�r�r�rcr�r�rBr�r�r,ZnontermsZnontermZunreachable�ur��infZunused_precZnum_srZnum_rrrir�Z
resolutionZalready_reportedZruleZrejectedZwarned_neverr	)r�r
�yacc�s�







"



$
















*




*













$$r�);rhr�rnZos.pathr�rjr(r�r�Z	yaccdebugZ
debug_filer�Z
default_lrr�rlrrE�version_infoZ
basestringror7�maxsizer�objectrrrPrr%r'r+r/r-r*r,r.r0r4r5r9rWrir�r�r�r�r�r�r�r�r�r
rrrrLrSrTr�r	r	r	r
�<module>>s�

7m
H.rT
)
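For reference, the write_table() template strings recovered above emit a plain Python module. The following is a minimal, hand-written sketch of the generated layout; the concrete states, tokens and productions are illustrative placeholders, not output from any real grammar in this archive:

# parsetab.py (illustrative sketch of the generated table layout)
_tabversion = '3.10'

_lr_method = 'LALR'

_lr_signature = 'example grammar signature'

# Compact per-token form: token -> ([state, ...], [action, ...])
_lr_action_items = {'NUMBER': ([0, 2], [1, 3])}

# The generated file expands this into a nested state -> token -> action map,
# following the loop embedded in write_table() above.
_lr_action = {}
for _k, _v in _lr_action_items.items():
    for _x, _y in zip(_v[0], _v[1]):
        if _x not in _lr_action:
            _lr_action[_x] = {}
        _lr_action[_x][_k] = _y
del _lr_action_items

# Productions are stored as (str, name, length, func, file, line) tuples;
# entries without an associated function use None for the last three fields.
_lr_productions = [
    ("S' -> expr", "S'", 1, None, None, None),
]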
site-packages/sepolgen/__pycache__/objectmodel.cpython-36.opt-1.pyc000064400000007523147511334570021205 0ustar003

[Compiled CPython 3.6 bytecode; the binary data is not reproduced. The recoverable docstrings identify the sepolgen objectmodel module: "This module provides knowledge object classes and permissions. It should be used to keep this knowledge from leaking into the more generic parts of the policy generation." It defines a list of implicitly typed object classes, the flow constants FLOW_NONE, FLOW_READ, FLOW_WRITE and FLOW_BOTH, a PermMap class (the information-flow properties of a single permission: its direction and an abstract weight for the bandwidth of the flow), and a PermMappings class mapping object classes and permissions to PermMap objects. PermMappings.from_file() reads the permission-map format used by Apol in the setools suite, get() raises KeyError for an undefined class or permission, getdefault() falls back to a FLOW_BOTH mapping with weight 5, and getdefault_direction()/getdefault_distance() aggregate direction and weight over a set of permissions.]
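The docstrings recovered above describe a small information-flow API. A hedged usage sketch follows; the perm_map path is an assumption (any file in the Apol/setools permission-map format will do) and the class and permission names are only examples:

# Illustrative use of the PermMappings API summarized above.
from sepolgen import objectmodel

perm_maps = objectmodel.PermMappings()
# from_file() iterates over lines in the Apol/setools perm_map format;
# the path below is an assumption, not taken from this archive.
with open("/usr/share/selinux/devel/include/support/perm_map") as fd:
    perm_maps.from_file(fd)

# get() raises KeyError for unknown class/permission pairs;
# getdefault() falls back to FLOW_BOTH with weight 5.
pm = perm_maps.getdefault("file", "read")
print(pm.perm, pm.dir, pm.weight)

# Aggregate flow direction and total weight over a set of permissions.
direction = perm_maps.getdefault_direction("file", ["read", "write"])
distance = perm_maps.getdefault_distance("file", ["read", "write"])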
site-packages/sepolgen/__pycache__/yacc.cpython-36.opt-1.pyc000064400000151423147511334570017634 0ustar003

[Compiled CPython 3.6 bytecode for the optimized (.opt-1) build of the bundled PLY yacc module (version string 3.11, table version 3.10); the binary data is not reproduced. The recoverable names span the whole module: PlyLogger and NullLogger, the YaccError, YaccSymbol and YaccProduction classes, the LRParser class with its parse()/parsedebug()/parseopt()/parseopt_notrack() drivers and errok()/restart()/token() helpers, the Production, MiniProduction, LRItem and Grammar classes (production bookkeeping, precedence handling, FIRST/FOLLOW computation, detection of undefined, unused and unreachable symbols and of infinite recursion), the LRTable reader and the LRGeneratedTable SLR/LALR table generator (LR(0) items, lookahead computation, parse-table construction, write_table()/pickle_table()), plus ParserReflect, parse_grammar(), get_caller_module_dict() and the top-level yacc() entry point.]
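Only the compiled form of the parser generator is present here, so as orientation a minimal, conventional PLY-style driver is sketched below. It is illustrative only: the lexer module, token names and grammar are assumptions, not content from this archive. yacc.yacc() collects the 'tokens' list and the p_* functions, whose docstrings hold the grammar rules, from the calling module:

# Illustrative PLY-style grammar module ('mylexer' is a hypothetical companion lexer).
from sepolgen import yacc
from mylexer import tokens, lexer   # assumed lexer providing tokens and a Lexer object

def p_expr_plus(p):
    'expr : expr PLUS term'
    p[0] = p[1] + p[3]

def p_expr_term(p):
    'expr : term'
    p[0] = p[1]

def p_term_number(p):
    'term : NUMBER'
    p[0] = p[1]

def p_error(p):
    # called by the parser when a syntax error is detected
    print("Syntax error at", p)

parser = yacc.yacc()                  # builds or loads the LALR tables
result = parser.parse("1 + 2", lexer=lexer)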
site-packages/sepolgen/__pycache__/refpolicy.cpython-36.pyc000064400000117634147511334570017760 0ustar003

[Compiled CPython 3.6 bytecode; the binary data is not reproduced. The recoverable docstrings and names identify the sepolgen refpolicy module, which models parsed reference-policy statements as a simple tree. It provides a Node base class whose children mirror the structure of the policy ("children are not separated out by type"), with iterators over modules, interfaces, templates, interface calls, AV rules, type rules, requires, roles and so on, plus walktree()/walknode() traversal helpers and list_to_space_str()/list_to_comma_str() formatting helpers. Leaf classes include SecurityContext (an SELinux context with optional MCS/MLS fields, parsed from 'user:role:type:level' strings), ObjectClass, XpermSet (extended-permission ranges with an optional complement), Type, TypeAlias, Attribute, TypeAttribute, RoleAttribute, Role, AVRule (allow/dontaudit/auditallow/neverallow rules with src_types, tgt_types, obj_classes and perms string sets), AVExtRule (allowxperm and related extended-permission rules), TypeRule (type_transition/type_change/type_member), TypeBound, RoleAllow, RoleType, ModuleDeclaration, Conditional, Bool, InitialSid, and the labeling statements (GenfsCon, FilesystemUse, PortCon, NodeCon, NetifCon, PirqCon, IomemCon, IoportCon, PciDeviceCon, DeviceTreeCon). Structural classes Headers, Module, Interface, Template, IfDef, InterfaceCall, OptionalPolicy, TunablePolicy, SupportMacros, Require, ObjPermSet, ClassMap and Comment round out the module; each statement class implements to_string() to render policy language.]
&

	P
=
	@:!


	



				!&$site-packages/sepolgen/__pycache__/objectmodel.cpython-36.pyc000064400000007523147511334570020246 0ustar003

��fv�	@spdZddddddddd	g	Zd
ZdZdZeeBZeeeed
�ZedededediZGdd�d�ZGdd�d�Z	dS)z�
This module provides knowledge object classes and permissions. It should
be used to keep this knowledge from leaking into the more generic parts of
the policy generation.
Zsocket�fdZprocess�fileZlnk_fileZ	fifo_fileZdbusZ
capabilityZunix_stream_socket���)�n�r�w�brrrr	c@s eZdZdZdd�Zdd�ZdS)�PermMapaA mapping between a permission and its information flow properties.

    PermMap represents the information flow properties of a single permission
    including the direction (read, write, etc.) and an abstract representation
    of the bandwidth of the flow (weight).
    cCs||_||_||_dS)N)�perm�dir�weight)�selfrrr
�r�!/usr/lib/python3.6/objectmodel.py�__init__TszPermMap.__init__cCsd|jt|j|jfS)Nz'<sepolgen.objectmodel.PermMap %s %s %d>)r�
dir_to_strrr
)rrrr�__repr__YszPermMap.__repr__N)�__name__�
__module__�__qualname__�__doc__rrrrrrr
Msr
c@s@eZdZdZdd�Zdd�Zdd�Zdd	�Zd
d�Zdd
�Z	dS)�PermMappingsz�The information flow properties of a set of object classes and permissions.

    PermMappings maps one or more classes and permissions to their PermMap objects
    describing their information flow charecteristics.
    cCsi|_d|_t|_dS)N�)�classes�default_weight�	FLOW_BOTH�default_dir)rrrrrdszPermMappings.__init__cCs�d}x�|D]�}|j�}t|�dks
t|�dks
|ddkr<q
|ddkrx|d}||jkrbtd��i|j|<|j|}q
t|�dkr�td��|dkr�td	��t|dt|dt|d
��}|||j<q
WdS)zsRead the permission mappings from a file. This reads the format used
        by Apol in the setools suite.
        Nrr�#�classzduplicate class in perm map�z"error in object classs permissionszpermission outside of classr)�split�lenr�
ValueErrorr
�
str_to_dir�intr)rrZcur�lineZfields�c�pmrrr�	from_fileis"	
$

 zPermMappings.from_filecCs|j||S)z�Get the permission map for the object permission.

        Returns:
          PermMap representing the permission
        Raises:
          KeyError if the object or permission is not defined
        )r)r�objrrrr�get�szPermMappings.getcCs8y|j||}Wn tk
r2t||j|j�SX|S)aGet the permission map for the object permission or a default.

        getdefault is the same as get except that a default PermMap is
        returned if the object class or permission is not defined. The
        default is FLOW_BOTH with a weight of 5.
        )r�KeyErrorr
rr)rr*rr(rrr�
getdefault�s
zPermMappings.getdefaultcCs,t}x"|D]}|j||�}||jB}q
W|S)N)�	FLOW_NONEr-r)rr*�permsrrr(rrr�getdefault_direction�s

z!PermMappings.getdefault_directioncCs,d}x"|D]}|j||�}||j7}q
W|S)Nr)r-r
)rr*r/Ztotalrr(rrr�getdefault_distance�s

z PermMappings.getdefault_distanceN)
rrrrrr)r+r-r0r1rrrrr^s

rN)
rZimplicitly_typed_objectsr.Z	FLOW_READZ
FLOW_WRITErr$rr
rrrrr�<module>s
site-packages/sepolgen/__pycache__/classperms.cpython-36.opt-1.pyc000064400000004665147511334570021076 0ustar003

��f�
@s�ddlZd*ZddiZd
ZdZdZdZdZdZdZ	dZ
dZdd�Zdd�Z
ddlmZej�dd�Zdd�Zd d!�Zd"d#�Zd$d%�Zdd&lmZej�ed'�Zej�Zej�d(Zd)Zeje�Zee�dS)+�N�DEFINE�NAME�TICK�SQUOTE�OBRACE�CBRACE�SEMI�OPAREN�CPAREN�COMMAZdefinez\`z\'z\{z\}z\;z\(z\)z\,z 	
cCstj|jd�|_|S)z[a-zA-Z_][a-zA-Z0-9_]*r)�reserved�get�value�type)�t�r� /usr/lib/python3.6/classperms.py�t_NAME.srcCs td|jd�|jd�dS)NzIllegal character '%s'r�)�printr�skip)rrrr�t_error3srr)�lexcCs8t|�dkr|dg|d<n|dg|dg|d<dS)zHstatements : define_stmt
                  | define_stmt statements
    �rrN)�len)�prrr�p_statements:srcCs|d|dg|d<dS)zOdefine_stmt : DEFINE OPAREN TICK NAME SQUOTE COMMA TICK list SQUOTE CPAREN
    ��rNr)rrrr�
p_define_stmtCsrcCs,|ddkr|d|d<n|dg|d<dS)z2list : NAME
            | OBRACE names CBRACE
    r�{rrNr)rrrr�p_listJsr!cCs6t|�dkr|dg|d<n|dg|d|d<dS)z+names : NAME
             | NAME names
    rrrN)r)rrrr�p_namesSsr"cCstd|j|j|jf�dS)Nz$Syntax error on line %d %s [type=%s])r�linenorr)rrrr�p_error\sr$)�yaccz
all_perms.sptz%define(`foo',`{ read write append }')a2define(`all_filesystem_perms',`{ mount remount unmount getattr relabelfrom relabelto transition associate quotamod quotaget }')
define(`all_security_perms',`{ compute_av compute_create compute_member check_context load_policy compute_relabel compute_user setenforce setbool setsecparam setcheckreqprot }')
)
rrrrrrrr	r
r) �sys�tokensrZt_TICKZt_SQUOTEZt_OBRACEZt_CBRACEZt_SEMIZt_OPARENZt_CPARENZt_COMMAZt_ignorerr�rrrr!r"r$r%�open�f�readZtxt�closeZtestZtest2�parse�resultrrrrr�<module>sL				
site-packages/sepolgen/__pycache__/access.cpython-36.pyc000064400000026026147511334570017217 0ustar003

��f�.�@szdZddlmZddlmZddlmZdd�ZGdd	�d	ej�Zd
d�Z	Gdd
�d
�Z
dd�Zdd�ZGdd�d�Z
dS)aV
Classes representing basic access.

SELinux - at the most basic level - represents access as
the 4-tuple subject (type or context), target (type or context),
object class, permission. The policy language elaborates this basic
access to faciliate more concise rules (e.g., allow rules can have multiple
source or target types - see refpolicy for more information).

This module has objects for representing the most basic access (AccessVector)
and sets of that access (AccessVectorSet). These objects are used in Madison
in a variety of ways, but they are the fundamental representation of access.
�)�	refpolicy)�util�)�	audit2whycCsNt|�dkrF|ddkrFyt|dd��Wntk
r@dSXdSdSdS)z�Determine if an id is a paramater in the form $N, where N is
    an integer.

    Returns:
      True if the id is a paramater
      False if the id is not a paramater
    rr�$NFT)�len�int�
ValueError)�id�r�/usr/lib/python3.6/access.py�
is_idparam'sr
c@sJeZdZdZddd�Zdd�Zdd�Zd	d
�Zdd�Zd
d�Z	dd�Z
dS)�AccessVectora
    An access vector is the basic unit of access in SELinux.

    Access vectors are the most basic representation of access within
    SELinux. It represents the access a source type has to a target
    type in terms of an object class and a set of permissions.

    Access vectors are distinct from AVRules in that they can only
    store a single source type, target type, and object class. The
    simplicity of AccessVectors makes them useful for storing access
    in a form that is easy to search and compare.

    The source, target, and object are stored as string. No checking
    done to verify that the strings are valid SELinux identifiers.
    Identifiers in the form $N (where N is an integer) are reserved as
    interface parameters and are treated as wild cards in many
    circumstances.

    Properties:
     .src_type - The source type allowed access. [String or None]
     .tgt_type - The target type to which access is allowed. [String or None]
     .obj_class - The object class to which access is allowed. [String or None]
     .perms - The permissions allowed to the object class. [IdSet]
     .audit_msgs - The audit messages that generated this access vector [List of strings]
     .xperms - Extended permissions attached to the AV. [Dictionary {operation: xperm set}]
    NcCsV|r|j|�nd|_d|_d|_tj�|_g|_tj	|_
g|_i|_d|_
d|_dS)N)�	from_list�src_type�tgt_type�	obj_classr�IdSet�perms�
audit_msgsr�TERULE�type�data�xperms�__hash__Z
info_flow_dir)�selfZ	init_listrrr�__init__Ss
zAccessVector.__init__cCsRt|�dkrtdt|���|d|_|d|_|d|_tj|dd��|_dS)axInitialize an access vector from a list.

        Initialize an access vector from a list treating the list as
        positional arguments - i.e., 0 = src_type, 1 = tgt_type, etc.
        All of the list elements 3 and greater are treated as perms.
        For example, the list ['foo_t', 'bar_t', 'file', 'read', 'write']
        would create an access vector list with the source type 'foo_t',
        target type 'bar_t', object class 'file', and permissions 'read'
        and 'write'.

        This format is useful for very simple storage to strings or disc
        (see to_list) and for initializing access vectors.
        �z+List must contain at least four elements %srr��N)	rr	�strrrrrrr)r�listrrrrhs


zAccessVector.from_listcCs$|j|j|jg}|jt|j��|S)z�
        Convert an access vector to a list.

        Convert an access vector to a list treating the list as positional
        values. See from_list for more information on how an access vector
        is represented in a list.
        )rrr�extend�sortedr)r�lrrr�to_list}szAccessVector.to_listcCsP|jj|j�x<|jD]2}||jkr2tj�|j|<|j|j|j|�qWdS)z0Add permissions and extended permissions from AVN)r�updaterrZXpermSetr")r�av�oprrr�merge�s

zAccessVector.mergecCs|j�S)N)�	to_string)rrrr�__str__�szAccessVector.__str__cCsd|j|j|j|jj�fS)Nzallow %s %s:%s %s;)rrrrZto_space_str)rrrrr*�s
zAccessVector.to_stringcCspyRt|j�}|j|j|j|f}t|j�}|j�|j�|j|j|j|f}|||�Sttfk
rjtSXdS)N)	r!rrrr�sort�AttributeError�	TypeError�NotImplemented)r�other�method�x�a�y�brrr�_compare�s


zAccessVector._compare)N)�__name__�
__module__�__qualname__�__doc__rrr%r)r+r*r6rrrrr8s
	rcCsvt|t�r|gSg}x\|jD]R}xL|jD]B}x<|jD]2}t�}||_||_||_|jj	�|_|j
|�q4Wq(WqW|S)aQConvert an avrule into a list of access vectors.

    AccessVectors and AVRules are similary, but differ in that
    an AVRule can more than one source type, target type, and
    object class. This function expands a single avrule into a
    list of one or more AccessVectors representing the access
    defined in the AVRule.

    
    )�
isinstancerZ	src_typesZ	tgt_typesZobj_classesrrrr�copy�append)Zavruler3rrr�accessrrr�avrule_to_access_vectors�s
r?c@sTeZdZdZdd�Zdd�Zdd�Zdd	�Zd
d�Zde	j
gfd
d�Zddd�ZdS)�AccessVectorSeta/A non-overlapping set of access vectors.

    An AccessVectorSet is designed to store one or more access vectors
    that are non-overlapping. Access can be added to the set
    incrementally and access vectors will be added or merged as
    necessary.  For example, adding the following access vectors using
    add_av:
       allow $1 etc_t : read;
       allow $1 etc_t : write;
       allow $1 var_log_t : read;
    Would result in an access vector set with the access vectors:
       allow $1 etc_t : { read write};
       allow $1 var_log_t : read;
    cCsi|_d|_dS)z)Initialize an access vector set.
        N)�srcZinfo_dir)rrrrr�szAccessVectorSet.__init__ccsBx<|jj�D].}x(|j�D]}x|j�D]
}|Vq(WqWqWdS)z9Iterate over all of the unique access vectors in the set.N)rA�values)r�tgts�objsr'rrr�__iter__�szAccessVectorSet.__iter__cCs:d}x0|jj�D]"}x|j�D]}|t|�7}qWqW|S)a5Return the number of unique access vectors in the set.

        Because of the inernal representation of the access vector set,
        __len__ is not a constant time operation. Worst case is O(N)
        where N is the number of unique access vectors, but the common
        case is probably better.
        r)rArBr)rr$rCrDrrr�__len__�s
zAccessVectorSet.__len__cCs$g}x|D]}|j|j��q
W|S)abReturn the unique access vectors in the set as a list.

        The format of the returned list is a set of nested lists,
        each access vector represented by a list. This format is
        designed to be simply  serializable to a file.

        For example, consider an access vector set with the following
        access vectors:
          allow $1 user_t : file read;
          allow $1 etc_t : file { read write};
        to_list would return the following:
          [[$1, user_t, file, read]
           [$1, etc_t, file, read, write]]

        See AccessVector.to_list for more information.
        )r=r%)rr$r'rrrr%�s
zAccessVectorSet.to_listcCs x|D]}|jt|��qWdS)aAdd access vectors stored in a list.

        See to list for more information on the list format that this
        method accepts.

        This will add all of the access from the list. Any existing
        access vectors in the set will be retained.
        N)�add_avr)rr$r'rrrrs	
zAccessVectorSet.from_listNc	Cs:t�}||_||_||_||_||_||_|j||�dS)z)Add an access vector to the set.
        N)rrrrrrrrG)	rrrrr�	audit_msgZavc_typerr'rrr�addszAccessVectorSet.addcCsv|jj|ji�}|j|ji�}|j|jf|krF||j|jfj|�n|||j|jf<|rr||j|jfjj|�dS)z Add an access vector to the set.N)	rA�
setdefaultrrrrr)rr=)rr'rHZtgt�clsrrrrGszAccessVectorSet.add_av)N)
r7r8r9r:rrErFr%rrrrIrGrrrrr@�s	
r@cCs2tj�}x$|D]}|j|j�|j|j�qW|S)N)rrrIrr)�avs�typesr'rrr�avs_extract_types*s

rNcCsJi}x@|D]8}|j|kr$||j}ntj�}|||j<|j|j�q
W|S)N)rrrr&r)rLrr'�srrr�avs_extract_obj_perms2s


rPc@s0eZdZdZdd�Zdd�Zdd�Zdd	�Zd
S)�RoleTypeSetz�A non-overlapping set of role type statements.

    This clas allows the incremental addition of role type statements and
    maintains a non-overlapping list of statements.
    cCs
i|_dS)z Initialize an access vector set.N)�
role_types)rrrrrCszRoleTypeSet.__init__ccsx|jj�D]
}|VqWdS)zAIterate over all of the unique role allows statements in the set.N)rRrB)r�	role_typerrrrEGszRoleTypeSet.__iter__cCst|jj��S)z2Return the unique number of role allow statements.)rrR�keys)rrrrrFLszRoleTypeSet.__len__cCs>||jkr|j|}ntj�}||_||j|<|jj|�dS)N)rRrZRoleType�rolerMrI)rrUrrSrrrrIPs

zRoleTypeSet.addN)r7r8r9r:rrErFrIrrrrrQ=s
rQN)r:�rrZselinuxrr
Z
Comparisonrr?r@rNrPrQrrrr�<module> sojsite-packages/sepolgen/__pycache__/sepolgeni18n.cpython-36.opt-1.pyc000064400000000460147511334570021223 0ustar003

��f��	@s6yddlZejd�ZejZWndd�ZYnXdS)�Nzselinux-pythoncCs|S)N�)�strrr�"/usr/lib/python3.6/sepolgeni18n.py�_sr)�gettextZtranslation�trrrrr�<module>s


site-packages/sepolgen/__pycache__/access.cpython-36.opt-1.pyc000064400000026026147511334570020156 0ustar003

��f�.�@szdZddlmZddlmZddlmZdd�ZGdd	�d	ej�Zd
d�Z	Gdd
�d
�Z
dd�Zdd�ZGdd�d�Z
dS)aV
Classes representing basic access.

SELinux - at the most basic level - represents access as
the 4-tuple subject (type or context), target (type or context),
object class, permission. The policy language elaborates this basic
access to faciliate more concise rules (e.g., allow rules can have multiple
source or target types - see refpolicy for more information).

This module has objects for representing the most basic access (AccessVector)
and sets of that access (AccessVectorSet). These objects are used in Madison
in a variety of ways, but they are the fundamental representation of access.
�)�	refpolicy)�util�)�	audit2whycCsNt|�dkrF|ddkrFyt|dd��Wntk
r@dSXdSdSdS)z�Determine if an id is a paramater in the form $N, where N is
    an integer.

    Returns:
      True if the id is a paramater
      False if the id is not a paramater
    rr�$NFT)�len�int�
ValueError)�id�r�/usr/lib/python3.6/access.py�
is_idparam'sr
c@sJeZdZdZddd�Zdd�Zdd�Zd	d
�Zdd�Zd
d�Z	dd�Z
dS)�AccessVectora
    An access vector is the basic unit of access in SELinux.

    Access vectors are the most basic representation of access within
    SELinux. It represents the access a source type has to a target
    type in terms of an object class and a set of permissions.

    Access vectors are distinct from AVRules in that they can only
    store a single source type, target type, and object class. The
    simplicity of AccessVectors makes them useful for storing access
    in a form that is easy to search and compare.

    The source, target, and object are stored as string. No checking
    done to verify that the strings are valid SELinux identifiers.
    Identifiers in the form $N (where N is an integer) are reserved as
    interface parameters and are treated as wild cards in many
    circumstances.

    Properties:
     .src_type - The source type allowed access. [String or None]
     .tgt_type - The target type to which access is allowed. [String or None]
     .obj_class - The object class to which access is allowed. [String or None]
     .perms - The permissions allowed to the object class. [IdSet]
     .audit_msgs - The audit messages that generated this access vector [List of strings]
     .xperms - Extended permissions attached to the AV. [Dictionary {operation: xperm set}]
    NcCsV|r|j|�nd|_d|_d|_tj�|_g|_tj	|_
g|_i|_d|_
d|_dS)N)�	from_list�src_type�tgt_type�	obj_classr�IdSet�perms�
audit_msgsr�TERULE�type�data�xperms�__hash__Z
info_flow_dir)�selfZ	init_listrrr�__init__Ss
zAccessVector.__init__cCsRt|�dkrtdt|���|d|_|d|_|d|_tj|dd��|_dS)axInitialize an access vector from a list.

        Initialize an access vector from a list treating the list as
        positional arguments - i.e., 0 = src_type, 1 = tgt_type, etc.
        All of the list elements 3 and greater are treated as perms.
        For example, the list ['foo_t', 'bar_t', 'file', 'read', 'write']
        would create an access vector list with the source type 'foo_t',
        target type 'bar_t', object class 'file', and permissions 'read'
        and 'write'.

        This format is useful for very simple storage to strings or disc
        (see to_list) and for initializing access vectors.
        �z+List must contain at least four elements %srr��N)	rr	�strrrrrrr)r�listrrrrhs


zAccessVector.from_listcCs$|j|j|jg}|jt|j��|S)z�
        Convert an access vector to a list.

        Convert an access vector to a list treating the list as positional
        values. See from_list for more information on how an access vector
        is represented in a list.
        )rrr�extend�sortedr)r�lrrr�to_list}szAccessVector.to_listcCsP|jj|j�x<|jD]2}||jkr2tj�|j|<|j|j|j|�qWdS)z0Add permissions and extended permissions from AVN)r�updaterrZXpermSetr")r�av�oprrr�merge�s

zAccessVector.mergecCs|j�S)N)�	to_string)rrrr�__str__�szAccessVector.__str__cCsd|j|j|j|jj�fS)Nzallow %s %s:%s %s;)rrrrZto_space_str)rrrrr*�s
zAccessVector.to_stringcCspyRt|j�}|j|j|j|f}t|j�}|j�|j�|j|j|j|f}|||�Sttfk
rjtSXdS)N)	r!rrrr�sort�AttributeError�	TypeError�NotImplemented)r�other�method�x�a�y�brrr�_compare�s


zAccessVector._compare)N)�__name__�
__module__�__qualname__�__doc__rrr%r)r+r*r6rrrrr8s
	rcCsvt|t�r|gSg}x\|jD]R}xL|jD]B}x<|jD]2}t�}||_||_||_|jj	�|_|j
|�q4Wq(WqW|S)aQConvert an avrule into a list of access vectors.

    AccessVectors and AVRules are similary, but differ in that
    an AVRule can more than one source type, target type, and
    object class. This function expands a single avrule into a
    list of one or more AccessVectors representing the access
    defined in the AVRule.

    
    )�
isinstancerZ	src_typesZ	tgt_typesZobj_classesrrrr�copy�append)Zavruler3rrr�accessrrr�avrule_to_access_vectors�s
r?c@sTeZdZdZdd�Zdd�Zdd�Zdd	�Zd
d�Zde	j
gfd
d�Zddd�ZdS)�AccessVectorSeta/A non-overlapping set of access vectors.

    An AccessVectorSet is designed to store one or more access vectors
    that are non-overlapping. Access can be added to the set
    incrementally and access vectors will be added or merged as
    necessary.  For example, adding the following access vectors using
    add_av:
       allow $1 etc_t : read;
       allow $1 etc_t : write;
       allow $1 var_log_t : read;
    Would result in an access vector set with the access vectors:
       allow $1 etc_t : { read write};
       allow $1 var_log_t : read;
    cCsi|_d|_dS)z)Initialize an access vector set.
        N)�srcZinfo_dir)rrrrr�szAccessVectorSet.__init__ccsBx<|jj�D].}x(|j�D]}x|j�D]
}|Vq(WqWqWdS)z9Iterate over all of the unique access vectors in the set.N)rA�values)r�tgts�objsr'rrr�__iter__�szAccessVectorSet.__iter__cCs:d}x0|jj�D]"}x|j�D]}|t|�7}qWqW|S)a5Return the number of unique access vectors in the set.

        Because of the inernal representation of the access vector set,
        __len__ is not a constant time operation. Worst case is O(N)
        where N is the number of unique access vectors, but the common
        case is probably better.
        r)rArBr)rr$rCrDrrr�__len__�s
zAccessVectorSet.__len__cCs$g}x|D]}|j|j��q
W|S)abReturn the unique access vectors in the set as a list.

        The format of the returned list is a set of nested lists,
        each access vector represented by a list. This format is
        designed to be simply  serializable to a file.

        For example, consider an access vector set with the following
        access vectors:
          allow $1 user_t : file read;
          allow $1 etc_t : file { read write};
        to_list would return the following:
          [[$1, user_t, file, read]
           [$1, etc_t, file, read, write]]

        See AccessVector.to_list for more information.
        )r=r%)rr$r'rrrr%�s
zAccessVectorSet.to_listcCs x|D]}|jt|��qWdS)aAdd access vectors stored in a list.

        See to list for more information on the list format that this
        method accepts.

        This will add all of the access from the list. Any existing
        access vectors in the set will be retained.
        N)�add_avr)rr$r'rrrrs	
zAccessVectorSet.from_listNc	Cs:t�}||_||_||_||_||_||_|j||�dS)z)Add an access vector to the set.
        N)rrrrrrrrG)	rrrrr�	audit_msgZavc_typerr'rrr�addszAccessVectorSet.addcCsv|jj|ji�}|j|ji�}|j|jf|krF||j|jfj|�n|||j|jf<|rr||j|jfjj|�dS)z Add an access vector to the set.N)	rA�
setdefaultrrrrr)rr=)rr'rHZtgt�clsrrrrGszAccessVectorSet.add_av)N)
r7r8r9r:rrErFr%rrrrIrGrrrrr@�s	
r@cCs2tj�}x$|D]}|j|j�|j|j�qW|S)N)rrrIrr)�avs�typesr'rrr�avs_extract_types*s

rNcCsJi}x@|D]8}|j|kr$||j}ntj�}|||j<|j|j�q
W|S)N)rrrr&r)rLrr'�srrr�avs_extract_obj_perms2s


rPc@s0eZdZdZdd�Zdd�Zdd�Zdd	�Zd
S)�RoleTypeSetz�A non-overlapping set of role type statements.

    This clas allows the incremental addition of role type statements and
    maintains a non-overlapping list of statements.
    cCs
i|_dS)z Initialize an access vector set.N)�
role_types)rrrrrCszRoleTypeSet.__init__ccsx|jj�D]
}|VqWdS)zAIterate over all of the unique role allows statements in the set.N)rRrB)r�	role_typerrrrEGszRoleTypeSet.__iter__cCst|jj��S)z2Return the unique number of role allow statements.)rrR�keys)rrrrrFLszRoleTypeSet.__len__cCs>||jkr|j|}ntj�}||_||j|<|jj|�dS)N)rRrZRoleType�rolerMrI)rrUrrSrrrrIPs

zRoleTypeSet.addN)r7r8r9r:rrErFrIrrrrrQ=s
rQN)r:�rrZselinuxrr
Z
Comparisonrr?r@rNrPrQrrrr�<module> sojsite-packages/sepolgen/__pycache__/refparser.cpython-36.opt-1.pyc000064400000070600147511334570020703 0ustar003

��f�x�I@s�ddlZddlZddlZddlZddlmZddlmZddlmZddlmZddlm	Z	�dZ
dddd d!d"d#d$d%d&d'd(d)d*d+d,d-d.d/d0d1d2d3d4d5d6d7d8d9d:d;d<d=d>d?d@dAdBdCdDdEdFdGdHdIdJdKdLdMdOdNdPdQ�4ZdRZdSZ
dTZdUZdVZdWZdXZdYZdZZd[Zd\Zd]Zd^Zd_Zd`ZdaZdbZdcZddZdedf�Zdgdh�Z didj�Z!dkdl�Z"dmdn�Z#dodp�Z$dqdr�Z%dsdt�Z&dudv�Z'da(da)dwa*da+dxa,�d	dydz�Z-d{d|�Z.d}d~�Z/dd��Z0d�d��Z1d�d��Z2d�d��Z3d�d��Z4d�d��Z5d�d��Z6d�d��Z7d�d��Z8d�d��Z9d�d��Z:d�d��Z;d�d��Z<d�d��Z=d�d��Z>d�d��Z?d�d��Z@d�d��ZAd�d��ZBd�d��ZCd�d��ZDd�d��ZEd�d��ZFd�d��ZGd�d��ZHd�d��ZId�d��ZJd�d��ZKd�d��ZLd�d��ZMd�d��ZNd�d��ZOd�d��ZPd�d„ZQd�dĄZRd�dƄZSd�dȄZTd�dʄZUd�d̄ZVd�d΄ZWd�dЄZXd�d҄ZYd�dԄZZd�dքZ[d�d؄Z\d�dڄZ]d�d܄Z^d�dބZ_d�d�Z`d�d�Zad�d�Zbd�d�Zcd�d�Zdd�d�Zed�d�Zfd�d�Zgd�d�Zhd�d�Zid�d�Zjd�d��Zkd�d��Zld�d��Zmd�d��Znd�d��Zodapdaqd��d�Zr�d
�d�d�Zs�d�d�Zt�d�d�d�ZudS(�N�)�access)�defaults)�lex)�	refpolicy)�yacc�TICK�SQUOTE�OBRACE�CBRACE�SEMI�COLON�OPAREN�CPAREN�COMMA�MINUS�TILDE�ASTERISK�AMP�BAR�EXPL�EQUAL�FILENAME�
IDENTIFIER�NUMBER�PATH�	IPV6_ADDR�MODULE�
POLICY_MODULE�REQUIRE�SID�GENFSCON�FS_USE_XATTR�FS_USE_TRANS�FS_USE_TASK�PORTCON�NODECON�NETIFCON�PIRQCON�IOMEMCON�	IOPORTCON�PCIDEVICECON�
DEVICETREECON�CLASS�
TYPEATTRIBUTE�
ROLEATTRIBUTE�TYPE�	ATTRIBUTE�ATTRIBUTE_ROLE�ALIAS�	TYPEALIAS�BOOL�TRUE�FALSE�IF�ELSE�ROLE�TYPES�ALLOW�	DONTAUDIT�
AUDITALLOW�
NEVERALLOW�
PERMISSIVE�
TYPEBOUNDS�TYPE_TRANSITION�TYPE_CHANGE�TYPE_MEMBER�RANGE_TRANSITION�ROLE_TRANSITION�
OPT_POLICY�	INTERFACE�TUNABLE_POLICY�GEN_REQ�TEMPLATE�GEN_CONTEXT�IFELSE�IFDEF�IFNDEF�DEFINE)4�moduleZ
policy_moduleZrequireZsidZgenfscon�fs_use_xattr�fs_use_trans�fs_use_taskZportconZnodeconZnetifconZpirqconZiomemconZ	ioportconZpcideviceconZ
devicetreecon�classZ
typeattributeZ
roleattribute�typeZ	attributeZattribute_role�aliasZ	typealias�bool�trueZfalse�if�else�role�typesZallow�	dontaudit�
auditallow�
neverallowZ
permissiveZ
typeboundsZtype_transition�type_change�type_memberZrange_transitionZrole_transitionZoptional_policy�	interfaceZtunable_policyZgen_require�templateZgen_contextZifelseZifndef�ifdefZdefinez\`z\'z\{z\}z\;+z\:z\(z\)z\,z\-z\~z\*z\&z\|z\!z\=z[0-9\.]+z/[a-zA-Z0-9)_\.\*/\$]*z 	cCs|S)z2[a-fA-F0-9]{0,4}:[a-fA-F0-9]{0,4}:([a-fA-F0-9]|:)*�)�trfrf�/usr/lib/python3.6/refparser.py�t_IPV6_ADDR�sricCs|jjd7_dS)zdnl.*\nrN)�lexer�lineno)rgrfrfrh�t_m4comment�srlcCs|jd�dS)zdefine.*refpolicywarn\(.*\nrN)�skip)rgrfrfrh�t_refpolicywarn1�srncCs|jjd7_dS)zrefpolicywarn\(.*\nrN)rjrk)rgrfrfrh�t_refpolicywarn�srocCstj|jd�|_|S)z#[a-zA-Z_\$][a-zA-Z0-9_\-\+\.\$\*~]*r)�reserved�get�valuerV)rgrfrfrh�t_IDENTIFIER�srscCstj|jd�|_|S)z\"[a-zA-Z0-9_\-\+\.\$\*~ :]+\"r)rprqrrrV)rgrfrfrh�
t_FILENAMEsrtcCs|jjd7_dS)z\#.*\nrN)rjrk)rgrfrfrh�	t_commentsrucCs td|jd�|jd�dS)NzIllegal character '%s'rr)�printrrrm)rgrfrfrh�t_errorsrwcCs|jjt|j�7_dS)z\n+N)rjrk�lenrr)rgrfrfrh�	t_newlinesry�TcCsX|dkrdSxF|D]>}|dkr q||_|dk	rB|jjd||f�q|jjd|�qWdS)Nr)�parent�children�insert)Zstmtsr{�val�srfrfrh�collect-s
r�cCs8x2|D]*}tj|�r&|jtj|��q|j|�qWdS)N)�sptZhas_key�updateZby_name�add)Zidsr�idrfrfrh�expand9s

r�cCsNt|�dkr&|dr&tjj|d�n$t|�dkrJ|drJtjj|d�dS)z^statements : statement
                  | statements statement
                  | empty
    �rN)rx�mr|�append)�prfrfrh�p_statementsAsr�cCs|d|d<dS)z�statement : interface
                 | template
                 | obj_perm_set
                 | policy
                 | policy_module_stmt
                 | module_stmt
    rrNrf)r�rfrfrh�p_statementKsr�cCsdS)zempty :Nrf)r�rfrfrh�p_emptyUsr�cCs.tj�}|d|_|d|_d|_||d<dS)zHpolicy_module_stmt : POLICY_MODULE OPAREN IDENTIFIER COMMA NUMBER CPAREN��TrN)r�ModuleDeclaration�name�version)r�r�rfrfrh�p_policy_module_stmt`s


r�cCs(tj|d�}t|d|�||d<dS)zainterface : INTERFACE OPAREN TICK IDENTIFIER SQUOTE COMMA TICK interface_stmts SQUOTE CPAREN
    ��rN)r�	Interfacer�)r��xrfrfrh�p_interfacehsr�cCs(tj|d�}t|d|�||d<dS)z�template : TEMPLATE OPAREN TICK IDENTIFIER SQUOTE COMMA TICK interface_stmts SQUOTE CPAREN
                | DEFINE OPAREN TICK IDENTIFIER SQUOTE COMMA TICK interface_stmts SQUOTE CPAREN
    r�r�rN)rZTemplater�)r�r�rfrfrh�
p_templateosr�cCsd|d<dS)z4define : DEFINE OPAREN TICK IDENTIFIER SQUOTE CPARENNrrf)r�rfrfrh�p_definewsr�cCszt|�dkr"|dr"|d|d<nTt|�dkrv|dsL|drv|d|d<n*|dsb|d|d<n|d|d|d<dS)zlinterface_stmts : policy
                       | interface_stmts policy
                       | empty
    r�rrN)rx)r�rfrfrh�p_interface_stmts~sr�cCsFtj�}t|d|dd�t|�dkr8t|d|dd�|g|d<dS)	z�optional_policy : OPT_POLICY OPAREN TICK interface_stmts SQUOTE CPAREN
                       | OPT_POLICY OPAREN TICK interface_stmts SQUOTE COMMA TICK interface_stmts SQUOTE CPAREN
    r�T)r~�r�FrN)rZOptionalPolicyr�rx)r��orfrfrh�p_optional_policy�s
r�cCsPtj�}|d|_t|d|dd�t|�dkrBt|d|dd�|g|d<d	S)
z�tunable_policy : TUNABLE_POLICY OPAREN TICK cond_expr SQUOTE COMMA TICK interface_stmts SQUOTE CPAREN
                      | TUNABLE_POLICY OPAREN TICK cond_expr SQUOTE COMMA TICK interface_stmts SQUOTE COMMA TICK interface_stmts SQUOTE CPAREN
    r�r�T)r~��FrN)rZ
TunablePolicy�	cond_exprr�rx)r�r�rfrfrh�p_tunable_policy�s
r�cCsdS)a�ifelse : IFELSE OPAREN TICK IDENTIFIER SQUOTE COMMA COMMA TICK IDENTIFIER SQUOTE COMMA TICK interface_stmts SQUOTE CPAREN optional_semi
              | IFELSE OPAREN TICK IDENTIFIER SQUOTE COMMA TICK IDENTIFIER SQUOTE COMMA TICK interface_stmts SQUOTE COMMA TICK interface_stmts SQUOTE CPAREN optional_semi
              | IFELSE OPAREN TICK IDENTIFIER SQUOTE COMMA TICK SQUOTE COMMA TICK interface_stmts SQUOTE COMMA TICK interface_stmts SQUOTE CPAREN optional_semi
    Nrf)r�rfrfrh�p_ifelse�sr�cCsbtj|d�}|ddkr d}nd}t|d||d�t|�dkrTt|d|dd�|g|d	<d
S)aJifdef : IFDEF OPAREN TICK IDENTIFIER SQUOTE COMMA TICK statements SQUOTE CPAREN optional_semi
             | IFNDEF OPAREN TICK IDENTIFIER SQUOTE COMMA TICK statements SQUOTE CPAREN optional_semi
             | IFDEF OPAREN TICK IDENTIFIER SQUOTE COMMA TICK statements SQUOTE COMMA TICK statements SQUOTE CPAREN optional_semi
    r�rreTFr�)r~r�rN)rZIfDefr�rx)r�r��vrfrfrh�p_ifdef�sr�cCs8tj|dd�}t|�dkr,|jj|d�||d<dS)z�interface_call : IDENTIFIER OPAREN interface_call_param_list CPAREN
                      | IDENTIFIER OPAREN CPAREN
                      | IDENTIFIER OPAREN interface_call_param_list CPAREN SEMIr)Zifnamer�r�rN)rZ
InterfaceCallrx�args�extend)r��irfrfrh�p_interface_call�sr�cCs6t|�dkr|d|d<n|dd|dg|d<dS)z�interface_call_param : IDENTIFIER
                            | IDENTIFIER MINUS IDENTIFIER
                            | nested_id_set
                            | TRUE
                            | FALSE
                            | FILENAME
    r�rr�-r�N)rx)r�rfrfrh�p_interface_call_param�s
r�cCs6t|�dkr|dg|d<n|d|dg|d<dS)z�interface_call_param_list : interface_call_param
                                 | interface_call_param_list COMMA interface_call_param
    r�rrr�N)rx)r�rfrfrh�p_interface_call_param_list�sr�cCs$tj|d�}|d|_||d<dS)zRobj_perm_set : DEFINE OPAREN TICK IDENTIFIER SQUOTE COMMA TICK names SQUOTE CPARENr�r�rN)rZ
ObjPermSet�perms)r�rrfrfrh�p_obj_perm_set�s
r�cCs|d|d<dS)z�policy : policy_stmt
              | optional_policy
              | tunable_policy
              | ifdef
              | ifelse
              | conditional
    rrNrf)r�rfrfrh�p_policy�sr�cCs|dr|dg|d<dS)a�policy_stmt : gen_require
                   | avrule_def
                   | typerule_def
                   | typebound_def
                   | typeattribute_def
                   | roleattribute_def
                   | interface_call
                   | role_def
                   | role_allow
                   | permissive
                   | type_def
                   | typealias_def
                   | attribute_def
                   | attribute_role_def
                   | range_transition_def
                   | role_transition_def
                   | bool
                   | define
                   | initial_sid
                   | genfscon
                   | fs_use
                   | portcon
                   | nodecon
                   | netifcon
                   | pirqcon
                   | iomemcon
                   | ioportcon
                   | pcidevicecon
                   | devicetreecon
    rrNrf)r�rfrfrh�
p_policy_stmt�sr�cCs.tj�}|d|_|d|_d|_||d<dS)z+module_stmt : MODULE IDENTIFIER NUMBER SEMIr�r�FrN)rr�r�r�)r�r�rfrfrh�
p_module_stmts


r�cCsdS)zlgen_require : GEN_REQ OPAREN TICK requires SQUOTE CPAREN
                   | REQUIRE OBRACE requires CBRACENrf)r�rfrfrh�
p_gen_require!sr�cCsdS)zsrequires : require
                | requires require
                | ifdef
                | requires ifdef
    Nrf)r�rfrfrh�
p_requires)sr�cCsdS)z�require : TYPE comma_list SEMI
               | ROLE comma_list SEMI
               | ATTRIBUTE comma_list SEMI
               | ATTRIBUTE_ROLE comma_list SEMI
               | CLASS comma_list SEMI
               | BOOL comma_list SEMI
    Nrf)r�rfrfrh�	p_require1sr�cCsHtj�}|d|_|d|_|d|_t|�dkr<|d|_||d<dS)z�security_context : IDENTIFIER COLON IDENTIFIER COLON IDENTIFIER
                        | IDENTIFIER COLON IDENTIFIER COLON IDENTIFIER COLON mls_range_defrr�r��r�rN)rZSecurityContext�userr\rVrx�level)r�rrfrfrh�p_security_context;s



r�cCs|d}|d|_||d<dS)zQgen_context : GEN_CONTEXT OPAREN security_context COMMA mls_range_def CPAREN
    r�r�rN)r�)r�rrfrfrh�
p_gen_contextHs
r�cCs|d|d<dS)z<context : security_context
               | gen_context
    rrNrf)r�rfrfrh�	p_contextSsr�cCs(tj�}|d|_|d|_||d<dS)z$initial_sid : SID IDENTIFIER contextr�r�rN)rZ
InitialSidr��context)r�rrfrfrh�
p_initial_sidYs

r�cCs2tj�}|d|_|d|_|d|_||d<dS)z+genfscon : GENFSCON IDENTIFIER PATH contextr�r�r�rN)rZGenfsCon�
filesystem�pathr�)r��grfrfrh�
p_genfscon`s



r�cCsntj�}|ddkr tjj|_n.|ddkr8tjj|_n|ddkrNtjj|_|d|_|d|_||d<dS)	z�fs_use : FS_USE_XATTR IDENTIFIER context SEMI
              | FS_USE_TASK IDENTIFIER context SEMI
              | FS_USE_TRANS IDENTIFIER context SEMI
    rrRrTrSr�r�rN)rZ
FilesystemUseZXATTRrVZTASKZTRANSr�r�)r��frfrfrh�p_fs_usejs


r�cCs`tj�}|d|_t|�dkr4|d|_|d|_n |dd|d|_|d|_||d<dS)zkportcon : PORTCON IDENTIFIER NUMBER context
               | PORTCON IDENTIFIER NUMBER MINUS NUMBER contextr�r�r�r�r�rN)rZPortConZ	port_typerxZport_numberr�)r��crfrfrh�	p_portcon|s


r�cCs2tj�}|d|_|d|_|d|_||d<dS)zanodecon : NODECON NUMBER NUMBER context
               | NODECON IPV6_ADDR IPV6_ADDR context
    r�r�r�rN)rZNodeCon�start�endr�)r��nrfrfrh�	p_nodecon�s



r�cCs2tj�}|d|_|d|_|d|_||d<dS)z.netifcon : NETIFCON IDENTIFIER context contextr�r�r�rN)rZNetifConrcZinterface_contextZpacket_context)r�r�rfrfrh�
p_netifcon�s



r�cCs(tj�}|d|_|d|_||d<dS)z pirqcon : PIRQCON NUMBER contextr�r�rN)rZPirqConZpirq_numberr�)r�r�rfrfrh�	p_pirqcon�s

r�cCsVtj�}t|�dkr*|d|_|d|_n |dd|d|_|d|_||d<dS)zYiomemcon : IOMEMCON NUMBER context
                | IOMEMCON NUMBER MINUS NUMBER contextr�r�r�r�rN)rZIomemConrxZ
device_memr�)r�r�rfrfrh�
p_iomemcon�s

r�cCsVtj�}t|�dkr*|d|_|d|_n |dd|d|_|d|_||d<dS)z\ioportcon : IOPORTCON NUMBER context
                | IOPORTCON NUMBER MINUS NUMBER contextr�r�r�r�rN)rZ	IoportConrxZioportr�)r�r�rfrfrh�p_ioportcon�s

r�cCs(tj�}|d|_|d|_||d<dS)z*pcidevicecon : PCIDEVICECON NUMBER contextr�r�rN)rZPciDeviceConZdevicer�)r�r�rfrfrh�p_pcidevicecon�s

r�cCs(tj�}|d|_|d|_||d<dS)z,devicetreecon : DEVICETREECON NUMBER contextr�r�rN)rZ
DevicetTeeConr�r�)r�r�rfrfrh�p_devicetreecon�s

r�cCs4|d|d<t|�dkr0|dd|d|d<dS)z[mls_range_def : mls_level_def MINUS mls_level_def
                     | mls_level_def
    rrr�r�r�N)rx)r�rfrfrh�p_mls_range_def�sr�cCs:|d|d<t|�dkr6|dddj|d�|d<dS)zRmls_level_def : IDENTIFIER COLON comma_list
                     | IDENTIFIER
    rrr��:�,r�N)rx�join)r�rfrfrh�p_mls_level_def�sr�cCs�tj|d�}t|�dkrD|ddkr8|jj|d�qv|d|_n2t|�dkrv|d|_t|�dkrv|jj|d�||d<dS)	z�type_def : TYPE IDENTIFIER COMMA comma_list SEMI
                | TYPE IDENTIFIER SEMI
                | TYPE IDENTIFIER ALIAS names SEMI
                | TYPE IDENTIFIER ALIAS names COMMA comma_list SEMI
    r�r�r�r�r�r�rN)rZTyperx�
attributesr��aliases)r�rgrfrfrh�
p_type_def�s
r�cCstj|d�}||d<dS)z)attribute_def : ATTRIBUTE IDENTIFIER SEMIr�rN)rZ	Attribute)r��arfrfrh�p_attribute_def�sr�cCstj|d�}||d<dS)z3attribute_role_def : ATTRIBUTE_ROLE IDENTIFIER SEMIr�rN)rZAttribute_Role)r�r�rfrfrh�p_attribute_role_def�sr�cCs(tj�}|d|_|d|_||d<dS)z5typealias_def : TYPEALIAS IDENTIFIER ALIAS names SEMIr�r�rN)rZ	TypeAliasrVr�)r�rgrfrfrh�p_typealias_def�s

r�cCs:tj�}|d|_t|�dkr.|jj|d�||d<dS)zWrole_def : ROLE IDENTIFIER TYPES comma_list SEMI
                | ROLE IDENTIFIER SEMIr�r�rN)rZRoler\rxr]r�)r��rrfrfrh�
p_role_defs

r�cCs(tj�}|d|_|d|_||d<dS)z#role_allow : ALLOW names names SEMIr�r�rN)rZ	RoleAllowZ	src_rolesZ	tgt_roles)r�r�rfrfrh�p_role_allows

r�cCsdS)z"permissive : PERMISSIVE names SEMINrf)r�rfrfrh�p_permissivesr�cCs�tj�}|ddkr tjj|_n.|ddkr8tjj|_n|ddkrNtjj|_|d|_|d|_|d|_|d|_	||d	<d
S)z�avrule_def : ALLOW names names COLON names names SEMI
                  | DONTAUDIT names names COLON names names SEMI
                  | AUDITALLOW names names COLON names names SEMI
                  | NEVERALLOW names names COLON names names SEMI
    rr^r_r`r�r�r�r�rN)
r�AVRuler=�	rule_typer>r?�	src_types�	tgt_types�obj_classesr�)r�r�rfrfrh�p_avrule_defs




r�cCsttj�}|ddkr tjj|_n|ddkr6tjj|_|d|_|d|_|d|_|d|_|d|_	||d	<d
S)a�typerule_def : TYPE_TRANSITION names names COLON names IDENTIFIER SEMI
                    | TYPE_TRANSITION names names COLON names IDENTIFIER FILENAME SEMI
                    | TYPE_TRANSITION names names COLON names IDENTIFIER IDENTIFIER SEMI
                    | TYPE_CHANGE names names COLON names IDENTIFIER SEMI
                    | TYPE_MEMBER names names COLON names IDENTIFIER SEMI
    rrarbr�r�r�r�r�rN)
rZTypeRulerCr�rDr�r�r�Z	dest_type�	file_name)r�rgrfrfrh�p_typerule_def*s





r�cCs.tj�}|d|_|jj|d�||d<dS)z5typebound_def : TYPEBOUNDS IDENTIFIER comma_list SEMIr�r�rN)rZ	TypeBoundrVr�r�)r�rgrfrfrh�p_typebound_def=s
r�cCs8tj�}|d|_|ddkr&d|_nd|_||d<dS)zIbool : BOOL IDENTIFIER TRUE SEMI
            | BOOL IDENTIFIER FALSE SEMIr�r�rYTFrN)rZBoolr��state)r��brfrfrh�p_boolDs
r�cCsPtj�}|d|_t|d|dd�t|�dkrBt|d|dd�|g|d<d	S)
z� conditional : IF OPAREN cond_expr CPAREN OBRACE interface_stmts CBRACE
                    | IF OPAREN cond_expr CPAREN OBRACE interface_stmts CBRACE ELSE OBRACE interface_stmts CBRACE
    r�r�T)r~r��
FrN)rZConditionalr�r�rx)r�r�rfrfrh�
p_conditionalOs
r�cCs.tj�}|d|_|jj|d�||d<dS)z<typeattribute_def : TYPEATTRIBUTE IDENTIFIER comma_list SEMIr�r�rN)rZ
TypeAttributerVr�r�)r�rgrfrfrh�p_typeattribute_defZs
r�cCs.tj�}|d|_|jj|d�||d<dS)z<roleattribute_def : ROLEATTRIBUTE IDENTIFIER comma_list SEMIr�r�rN)rZ
RoleAttributer\Zroleattributesr�)r�rgrfrfrh�p_roleattribute_defas
r�cCsdS)z�range_transition_def : RANGE_TRANSITION names names COLON names mls_range_def SEMI
                            | RANGE_TRANSITION names names names SEMINrf)r�rfrfrh�p_range_transition_defhsr�cCsdS)z<role_transition_def : ROLE_TRANSITION names names names SEMINrf)r�rfrfrh�p_role_transition_defmsr�cCsjt|�}|dkr |dg|d<nF|dkr@|dg|d|d<n&|d|d|dg|d|d<dS)acond_expr : IDENTIFIER
                 | EXPL cond_expr
                 | cond_expr AMP AMP cond_expr
                 | cond_expr BAR BAR cond_expr
                 | cond_expr EQUAL EQUAL cond_expr
                 | cond_expr EXPL EQUAL cond_expr
    r�rrr�r�N)rx)r��lrfrfrh�p_cond_exprqsr�cCsrtj�}t|�dkr$t|d|�nBt|�dkrFt|d|�d|_n t|dg�|jd|d�||d<dS)z�names : identifier
             | nested_id_set
             | asterisk
             | TILDE identifier
             | TILDE nested_id_set
             | IDENTIFIER MINUS IDENTIFIER
    r�rr�Tr�rN)rZIdSetrxr�Z
complimentr�)r�rrfrfrh�p_names�sr�cCs|dg|d<dS)zidentifier : IDENTIFIERrrNrf)r�rfrfrh�p_identifier�sr�cCs|dg|d<dS)zasterisk : ASTERISKrrNrf)r�rfrfrh�
p_asterisk�sr�cCs|d|d<dS)z1nested_id_set : OBRACE nested_id_list CBRACE
    r�rNrf)r�rfrfrh�p_nested_id_set�sr�cCs2t|�dkr|d|d<n|d|d|d<dS)z`nested_id_list : nested_id_element
                      | nested_id_list nested_id_element
    r�rrN)rx)r�rfrfrh�p_nested_id_list�sr�cCs4t|�dkr|d|d<nd|d}|g|d<dS)zxnested_id_element : identifier
                         | MINUS IDENTIFIER
                         | nested_id_set
    r�rrr�N)rx)r��strrfrfrh�p_nested_id_element�sr�cCs0t|�dkr |d|d|d<|d|d<dS)zTcomma_list : nested_id_list
                  | comma_list COMMA nested_id_list
    r�rr�rN)rx)r�rfrfrh�p_comma_list�sr�cCsdS)z/optional_semi : SEMI
                   | emptyNrf)r�rfrfrh�p_optional_semi�sr�cCs&dt|j|j|jfatt�dadS)Nz(%s: Syntax error on line %d %s [type=%s]F)�
parse_filerkrrrV�errorrv�success)�tokrfrfrh�p_error�sr�cCs(|siSi}x|D]}|||j<qWdS)N)r�)r��mapr�rfrfrh�prep_spt�s

rcCsHtstj�atjd|dd�a|dk	r*|antj�a|s@tj�an|adS)NZLALRr)�method�debugZwrite_tables)	�parserrrjrr�r�Module�
SupportMacrosr�)rQ�supportrrfrfrh�create_globals�s
rFcCs�t|||�dt_daytj||td�WnBtk
rl}z&dadadt|�dtj	�a
WYdd}~XnXts�dadt
}t|��tS)NrT)rrjzinternal parser error: %s�
zcould not parse text: "%s")
rrjrkr�r�parse�	Exceptionr��	traceback�
format_excr��
ValueErrorr�)�textrQrr�e�msgrfrfrhr
�s*r
c	Cs�g}d}x�tj|�D]�\}}}x�|D]~}tjj|�}tjj||�}|ddkr�|dkr\|}q�ttjd|d��r�|j|d|f�q$|ddkr$|j|d|f�q$WqW||fS)Nrz.sptzobj_perm_sets.sptZpatternsrz.if)	�os�walkr��splitextr�rx�re�findallr�)	�root�modules�support_macros�dirpathZdirnames�	filenamesr��modname�filenamerfrfrh�list_headers
s
rcs>ddlm}tj�}g}d}tjj|�r|tjj|�d}|dkrLtd|��tjj	|�}	|j
|	d|f�ttj
��\}
}nt|�\}}|r�|r�td���fdd��d��fd	d
�	}d}|�r8�d|�tj�}|||�|jj
|�tjd�}
tjd
dddddddddg
�}|
jj
tj|��|jj
|
��d���rd��rd|jtjt|�d�}|jd�g}x�|D]�}tj�}|d|_y*|�r�||d||�n||d|�WnFtk
�r�}z(�t|�d�|j
|d��wnWYdd}~XnX|jj
|���rn��rn|j��qnWt|��r:�ddj|��|S)Nr)�utilrzzInvalid file name %srz1could not find support macros (obj_perm_sets.spt)cs�r�j|�dS)N)�write)r)�outputrfrhr�3szparse_headers.<locals>.ocs��r�d|�y.t|�}|j�}|j�|at|||��WnTtk
r^}zdSd}~Xn6tk
r�}ztd|t|�f��WYdd}~XnXdS)Nzparsing file %s
zerror parsing file %s: %s)�open�read�closer�r
�IOErrorrr�)r�rQr��fdZtxtr)rr�rfrhr�7sz!parse_headers.<locals>.parse_filezParsing support macros (%s): �can_execz$1z$2�fileZexecute_no_transr"r#�getattr�lockZexecuteZioctlzdone.
)ZstepszParsing interface filesr	z failed to parse some headers: %sz, )N)rzrrZHeadersrr��isfile�splitrrr�rr�headersrr|r�rZAccessVectorr�ZConsoleProgressBar�sys�stdoutrxr�rr�r��stepr�)rr!r�rrr-rrr�rZall_modulesr�r�r'�avZstatusZfailuresr�r�rrf)rr�r!rh�
parse_headerssb






r2)Irr	r
rrr
rrrrrrrrrrrrrrrrrrr r!r"r#r$r%r&r'r(r)r*r+r,r-r.r/r0r1r2r3r4r5r6r7r8r9r:r;r<r=r>r?r@rArBrCrDrErFrGrHrIrJrKrLrMrNrOrP)N)NNF)NTF)vr.rrrrzrrrrr�tokensrpZt_TICKZt_SQUOTEZt_OBRACEZt_CBRACEZt_SEMIZt_COLONZt_OPARENZt_CPARENZt_COMMAZt_MINUSZt_TILDEZ
t_ASTERISKZt_AMPZt_BARZt_EXPLZt_EQUALZt_NUMBERZt_PATHZt_ignorerirlrnrorsrtrurwryr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrrjrr
rr2rfrfrfrh�<module> s�







"


	

		

site-packages/sepolgen/__pycache__/defaults.cpython-36.opt-1.pyc000064400000004306147511334570020521 0ustar003

��fU�@sTddlZddlZGdd�de�Zdd�Zdd�Zdd	�Zd
d�Zdd
�Zdd�Z	dS)�Nc@seZdZdd�Zddd�ZdS)�PathChooserc
Cs�t�|_tjj|�s(d|_d|jd<dS||_tjd�}tjd�}t|d��`}xXt	|�D]L\}}|j
|�rlqX|j
|�}|s�td||df��|jd	�|j|jd�<qXWWdQRXdS)
Nz
(defaults)zJ/usr/share/selinux/default:/usr/share/selinux/mls:/usr/share/selinux/devel�SELINUX_DEVEL_PATHz
^\s*(?:#.+)?$z^\s*(\w+)\s*=\s*(.+?)\s*$�rz(%s:%d: line is not in key = value format��)
�dict�config�os�path�exists�config_pathname�re�compile�open�	enumerate�match�
ValueError�group)�self�pathname�ignoreZconsider�fd�lineno�line�mo�r�/usr/lib/python3.6/defaults.py�__init__s 




zPathChooser.__init__rcCsp|jj|d�}|dkr(td||jf��|jd�}x*|D]"}tjj||�}tjj|�r8|Sq8Wtjj|d|�S)Nz%s was not in %s�:r)	r�getrr�splitr	r
�joinr)rZtestfilenameZpathset�paths�p�targetrrr�__call__,s

zPathChooser.__call__N)r)�__name__�
__module__�__qualname__rr%rrrrrsrcCsdS)Nz/var/lib/sepolgenrrrrr�data_dir;sr)cCs
t�dS)Nz	/perm_map)r)rrrr�perm_map>sr*cCs
t�dS)Nz/interface_info)r)rrrr�interface_infoAsr+cCs
t�dS)Nz/attribute_info)r)rrrr�attribute_infoDsr,cCs(td�}|d�}tjj|�s$|d�}|S)Nz/etc/selinux/sepolgen.conf�Makefilezinclude/Makefile)rr	r
r)�chooser�resultrrr�refpolicy_makefileGs
r0cCstd�}|d�S)Nz/etc/selinux/sepolgen.conf�include)r)r.rrr�headersNsr2)
r	r
�objectrr)r*r+r,r0r2rrrr�<module>s"site-packages/sepolgen/__pycache__/output.cpython-36.opt-1.pyc000064400000006656147511334570020264 0ustar003

��f��@spdZddlmZddlmZejr.ddlmZGdd�d�Zdd�Zd	d
�Zdd�Z	d
d�Z
dd�Zdd�ZdS)ai
Classes and functions for the output of reference policy modules.

This module takes a refpolicy.Module object and formats it for
output using the ModuleWriter object. By separating the output
in this way the other parts of Madison can focus solely on
generating policy. This keeps the semantic / syntactic issues
cleanly separated from the formatting issues.
�)�	refpolicy)�util)�cmpc@seZdZdd�Zdd�ZdS)�ModuleWritercCsd|_d|_d|_d|_dS)NT)�fd�module�sort�requires)�self�r�/usr/lib/python3.6/output.py�__init__&szModuleWriter.__init__cCsJ||_|jrt|j�x.tj|jdd�D]\}}|jdt|��q(WdS)NT)Z	showdepthz%s
)rr�sort_filterrZwalktree�write�str)r
rr�node�depthrrrr,s

zModuleWriter.writeN)�__name__�
__module__�__qualname__r
rrrrrr%srcCs�tj|�}|j�tj|�}|j�t|�t|�krFt|d|d�Sx4t||�D]&}|d|dkrRt|d|d�SqRWdS)N�r)rZset_to_listr�lenr�zip)�x�yZxlZyl�vrrr�
id_set_cmp=s

rcCsdt|j|j�}|dkr|St|j|j�}|dkr4|St|j|j�}|dkrN|Stt|j�t|j��S)Nr)r�	src_typesZ	tgt_typesZobj_classesrrZperms)�a�b�retrrr�
avrule_cmpKsr!cCs8|jd|jdkr*t|jd|jd�St|j|j�S)Nr)�argsrZifname)rrrrr�
ifcall_cmpZsr#cCsft|tj�r8t|tj�r"t||�St|jdg|j�Sn*t|tj�rNt||�St|j|jdg�SdS)Nr)	�
isinstancer�
InterfaceCallr#rr"rZAVRuler!)rrrrr�rule_cmp`s

r&cCst|j|j�S)N)rZrole)rrrrr�
role_type_cmplsr'cCs&dd�}x|j�D]}||�qWdS)z/Sort and group the output for readability.
    cSs�g}x(|j�D]}|j|�|jtj��qWx|j�D]}|j|�q8W|jtj��g}|j|j��|j|j��|jt	j
t�d�d}g}x||D]t}t|tj
�r�|jd}nt	j|j�}||k�r|r�|jtj��|}tj�}	|	jjd|�|j|	�|j|�q�W|j|�g}
|
j|j��|
jt	j
t�d�t|
��rftj�}	|	jjd�|j|	�|j|
�x$|jD]}||k�rx|j|��qxW||_dS)N)�keyrz============= %s ==============z"============= ROLES ==============)Zmodule_declarations�appendr�Commentr	�extendZavrulesZinterface_callsrr�
cmp_to_keyr&r$r%r"�firstr�linesZ
role_typesr'rZchildren)r�c�modZrequireZrulesZcurZ	sep_rulesZrulerZcommentZrasZchildrrr�	sort_nodersL








zsort_filter.<locals>.sort_nodeN)Znodes)rr1rrrrros<rN)
�__doc__�rrZPY3rrrr!r#r&r'rrrrr�<module>ssite-packages/sepolgen/__pycache__/policygen.cpython-36.pyc000064400000031111147511334570017736 0ustar003

��f�;�	@s�dZddlZddlZddljZyddlTWnYnXddlmZddlmZddlm	Z	ddlm
Z
dd	lmZdd
lmZdZ
dZdZGdd
�d
�Zdefdd�Zdd�ZGdd�d�Zdd�ZdS)z>
classes and algorithms for the generation of SELinux policy.
�N)�*�)�	refpolicy)�objectmodel)�access)�
interfaces)�matching)�util�c@s�eZdZdZddd�Zd dd�Zd!dd	�Zefd
d�Zdd
�Z	dd�Z
dd�Zd"dd�Zdd�Z
dd�Zdd�Zdd�Zdd�ZdS)#�PolicyGeneratora�Generate a reference policy module from access vectors.

    PolicyGenerator generates a new reference policy module
    or updates an existing module based on requested access
    in the form of access vectors.

    It generates allow rules and optionally module require
    statements, reference policy interfaces, and extended
    permission access vector rules. By default only allow rules
    are generated. The methods .set_gen_refpol, .set_gen_requires
    and .set_gen_xperms turns on interface generation,
    requires generation, and xperms rules genration respectively.

    PolicyGenerator can also optionally add comments explaining
    why a particular access was allowed based on the audit
    messages that generated the access. The access vectors
    passed in must have the .audit_msgs field set correctly
    and .explain set to SHORT|LONG_EXPLANATION to enable this
    feature.

    The module created by PolicyGenerator can be passed to
    output.ModuleWriter to output a text representation.
    NcCs>d|_t|_d|_|r||_n
tj�|_d|_d|_d|_	dS)z�Initialize a PolicyGenerator with an optional
        existing module.

        If the module paramater is not None then access
        will be added to the passed in module. Otherwise
        a new reference policy module will be created.
        NF)
�ifgen�NO_EXPLANATION�explain�gen_requires�modulerZModule�	dontaudit�xperms�domains)�selfr�r�/usr/lib/python3.6/policygen.py�__init__Es
zPolicyGenerator.__init__cCs*|rt||�|_d|_nd|_|j�dS)a?Set whether reference policy interfaces are generated.

        To turn on interface generation pass in an interface set
        to use for interface generation. To turn off interface
        generation pass in None.

        If interface generation is enabled requires generation
        will also be enabled.
        TN)�InterfaceGeneratorrr�"_PolicyGenerator__set_module_style)rZif_set�	perm_mapsrrr�set_gen_refpolYs

zPolicyGenerator.set_gen_refpolTcCs
||_dS)a&Set whether module requires are generated.

        Passing in true will turn on requires generation and
        False will disable generation. If requires generation is
        disabled interface generation will also be disabled and
        can only be re-enabled via .set_gen_refpol.
        N)r)rZstatusrrr�set_gen_requiresksz PolicyGenerator.set_gen_requirescCs
||_dS)z)Set whether access is explained.
        N)r)rrrrr�set_gen_explainuszPolicyGenerator.set_gen_explaincCs
||_dS)N)r)rrrrr�set_gen_dontauditzsz!PolicyGenerator.set_gen_dontauditcCs
||_dS)zSSet whether extended permission access vector rules
        are generated.
        N)r)rrrrr�set_gen_xperms}szPolicyGenerator.set_gen_xpermscCs.|jrd}nd}x|jj�D]
}||_qWdS)NTF)rr�module_declarationsr)rr�modrrrZ__set_module_style�s
z"PolicyGenerator.__set_module_style�1.0cCs\d}x|jj�D]}|}qW|s8tj�}|jjjd|�||_||_|jrRd|_nd|_dS)z?Set the name of the module and optionally the version.
        NrTF)	rr rZModuleDeclaration�children�insert�name�versionr)rr%r&�mr!rrr�set_module_name�szPolicyGenerator.set_module_namecCs|jrt|j�|jS)N)rr)rrrr�
get_module�s
zPolicyGenerator.get_modulecCsvtj|�}|jr|j|_d|_|jr>ttjt	||jd���|_|j
tjkrl|jd7_|j
rl|jd7_|j
tjkr�|jd7_|j
tjkr�t|j�dkr�|jddjd	d
�|jD��7_n|jd|jdd7_|j
tjk�rP|jd
7_|jd7_|jd|jd7_x*|jdd�D]}|jd|7_�q4W�y|j
tjk�rTd|jk�rTd|jk�s�d|jk�rT|j�s�ttdd�dd|_g}xHdd
�ttgt|jt|jt|ji�D�D]}||jk�r�|j|��q�Wt|�dk�r$|jd|j|jdj|�f7_n0t|�dk�rT|jd|j|jdj|�f7_WnYnX|jj j|�dS)z Add access vector rule.
        �)�	verbosityz0
#!!!! This avc is allowed in the current policyzN
#!!!! This av rule may have been overridden by an extended permission av rulez:
#!!!! This avc has a dontaudit rule in the current policyrzH
#!!!! This avc can be allowed using one of the these booleans:
#     %sz, cSsg|]}|d�qS)rr)�.0�xrrr�
<listcomp>�sz1PolicyGenerator.__add_av_rule.<locals>.<listcomp>z5
#!!!! This avc can be allowed using the boolean '%s'rz�
#!!!! This avc is a constraint violation.  You would need to modify the attributes of either the source or target types to allow this access.z
#Constraint rule: z
#	Nz?
#	Possible cause is the source %s and target %s are different.�write�dir�openZdomain)r%�typescSsg|]}|t�qSr)ZTCONTEXT)r,r-rrrr.�szL
#!!!! The source type '%s' can write to a '%s' of the following type:
# %s
zM
#!!!! The source type '%s' can write to a '%s' of the following types:
# %s
)!rZAVRulerZ	DONTAUDIT�	rule_type�commentr�str�Comment�explain_access�type�	audit2whyZALLOWrZBOOLEAN�len�data�joinZ
CONSTRAINTZTERULE�perms�	obj_classrZseinfoZ	ATTRIBUTEZsesearchZSCONTEXT�src_typeZCLASSZPERMS�appendrr#)r�avZrule�reasonr2�irrrZ
__add_av_rule�sN
&.$&zPolicyGenerator.__add_av_rulecCs@x:|jj�D],}tj||�}|jr*|j|_|jjj	|�qWdS)z5Add extended permission access vector rules.
        N)
r�keysrZ	AVExtRulerZDONTAUDITXPERMr3rr#r@)rrA�opZextrulerrrZ__add_ext_av_rules�s
z"PolicyGenerator.__add_ext_av_rulescCs`|jr*|jj||j�\}}|jjj|�n|}x,|D]$}|j|�|jr4|jr4|j|�q4WdS)zJAdd the access from the access vector set to this
        module.
        N)	r�genrrr#�extend�_PolicyGenerator__add_av_ruler�"_PolicyGenerator__add_ext_av_rules)rZav_setZ	raw_allow�ifcallsrArrr�
add_access�s	

zPolicyGenerator.add_accesscCs x|D]}|jjj|�qWdS)N)rr#r@)rZ
role_type_set�	role_typerrr�add_role_types�s
zPolicyGenerator.add_role_types)N)NN)T)r")�__name__�
__module__�__qualname__�__doc__rrr�SHORT_EXPLANATIONrrrrr(r)rHrIrKrMrrrrr-s




5rcsg���fdd�}|tkr�x�|jD]�}�jd|j��jdt|j�t|j�f��jd|jtj	|j
�f��jd|j|j|j
f��jtjd|jdd	d
dd��q"W|�nb|�r�jd
|j|j|j|jj�f�t|j�dk�r|jd}�jd|j|j|j
f�|��S)a�Explain why a policy statement was generated.

    Return a string containing a text explanation of
    why a policy statement was generated. The string is
    commented and wrapped and can be directly inserted
    into a policy.

    Params:
      av - access vector representing the access. Should
       have .audit_msgs set appropriately.
      verbosity - the amount of explanation provided. Should
       be set to NO_EXPLANATION, SHORT_EXPLANATION, or
       LONG_EXPLANATION.
    Returns:
      list of strings - strings explaining the access or an empty
       string if verbosity=NO_EXPLANATION or there is not sufficient
       information to provide an explanation.
    csN�sdS�jd�x6�j�D]*}t|j�j�}�jd|j�|jf�qWdS)Nz Interface options:z   %s # [%d])r@�all�call_interface�	interfacerAZ	to_stringZdist)�match�ifcall)�ml�srr�explain_interfacess
z*explain_access.<locals>.explain_interfacesz %sz  scontext="%s" tcontext="%s"z  class="%s" perms="%s"z  comm="%s" exe="%s" path="%s"z	message="�"�Pz  z   )Zinitial_indentZsubsequent_indentz) src="%s" tgt="%s" class="%s", perms="%s"rz comm="%s" exe="%s" path="%s")�LONG_EXPLANATIONZ
audit_msgsr@�headerr5ZscontextZtcontextZtclassrZlist_to_space_strZaccessesZcommZexe�pathrG�textwrapZwrap�messager?�tgt_typer>r=Zto_space_strr:)rArXr+rZ�msgr)rXrYrr7�s*
r7cCs�g}g}|j|jj��|jdd�dd�tj�}|j|_x�tt	|��D]z}||j
tjkrl|jj
|j�qH||j
tjkr�|jj
|j�qH||j
tjkr�|jj
|j�qHt||j
�dsHt�qHWt	|j�dks�t�|S)NcSs|jS)N)�num)�paramrrr�<lambda>9sz call_interface.<locals>.<lambda>T)�key�reverser)rG�params�values�sortrZ
InterfaceCallr%Zifname�ranger:r8�SRC_TYPE�argsr@r?�TGT_TYPErb�	OBJ_CLASSr>�print�AssertionError)rUrArirnrWrCrrrrT4s"rTc@s.eZdZd
dd�Zdd�Zdd�Zdd	�ZdS)rNcCs&||_|j|�tj|�|_g|_dS)N)�ifs�hack_check_ifsrZ
AccessMatcher�matcher�calls)rrsrrrrrNs
zInterfaceGenerator.__init__cCs�x�|jj�D]|}g}|j|jj��|jdd�dd�xPtt|��D]@}|d||jkrbd|_P||j	t
jt
jt
j
gkrDd|_PqDWqWdS)NcSs|jS)N)rd)rerrrrf\sz3InterfaceGenerator.hack_check_ifs.<locals>.<lambda>T)rgrhrF)rrjrGrirkrlr:rdZenabledr8rrmrorp)rrsr-rirCrrrrtTs
z!InterfaceGenerator.hack_check_ifscCs�|j|�}g}xH|jD]>}t|j�j|j�}|rFtjt|j||��|_	|j
||f�qWg}xX|D]P\}}d}	x4|D],}
|
j|�rt|
j	r�|j	r�|
j	j|j	�d}	qtW|	sb|j
|�qbW||fS)NFT)
rVrvrTZbestrUrArr6r7r4r@Zmatches�merge)r�avsr+�raw_avrJrXrW�drs�foundZo_ifcallrrrrFks$


zInterfaceGenerator.gencCsPg}xF|D]>}tj�}|jj|j||�t|�r>|jj|�q
|j|�q
W|S)N)rZ	MatchListruZ
search_ifsrsr:rvr@)rrxryrAZansrrrrV�s
zInterfaceGenerator.match)N)rNrOrPrrtrFrVrrrrrMs
rcCs&dd�}x|j�D]}||�qWdS)z*Add require statements to the module.
    cSs�tj�}xJ|j�D]>}|jj|j�|jj|j�x|jD]}|j||j	�q:WqWx,|j
�D] }x|jD]}|jj|�qjWq^Wx,|j
�D] }|jj|j�|jj|j�q�W|jjd�|jjd|�dS)Nrr)rZRequireZavrulesr2�updateZ	src_typesZ	tgt_typesZobj_classesZ
add_obj_classr=Zinterface_callsrn�addZ
role_typesZrolesZrole�discardr#r$)�node�rZavrule�objrW�argrLrrr�collect_requires�sz&gen_requires.<locals>.collect_requiresN)Znodes)rr�rrrrr�sr)rQ�	itertoolsr`Zselinux.audit2whyr9Zsetoolsr*rrrrrr	r
rRr]rr7rTrrrrrr�<module>s,
site-packages/sepolgen/policygen.py
# Authors: Karl MacMillan <kmacmillan@mentalrootkit.com>
#
# Copyright (C) 2006 Red Hat
# see file 'COPYING' for use and warranty information
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License as
# published by the Free Software Foundation; version 2 only
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software
# Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
#

"""
classes and algorithms for the generation of SELinux policy.
"""

import itertools
import textwrap

import selinux.audit2why as audit2why
try:
    from setools import *
except:
    pass

from . import refpolicy
from . import objectmodel
from . import access
from . import interfaces
from . import matching
from . import util
# Constants for the level of explanation from the generation
# routines
NO_EXPLANATION    = 0
SHORT_EXPLANATION = 1
LONG_EXPLANATION  = 2

class PolicyGenerator:
    """Generate a reference policy module from access vectors.

    PolicyGenerator generates a new reference policy module
    or updates an existing module based on requested access
    in the form of access vectors.

    It generates allow rules and optionally module require
    statements, reference policy interfaces, and extended
    permission access vector rules. By default only allow rules
    are generated. The methods .set_gen_refpol, .set_gen_requires
    and .set_gen_xperms turn on interface generation,
    requires generation, and xperms rule generation respectively.

    PolicyGenerator can also optionally add comments explaining
    why a particular access was allowed based on the audit
    messages that generated the access. The access vectors
    passed in must have the .audit_msgs field set correctly
    and .explain set to SHORT|LONG_EXPLANATION to enable this
    feature.

    The module created by PolicyGenerator can be passed to
    output.ModuleWriter to output a text representation.
    """
    def __init__(self, module=None):
        """Initialize a PolicyGenerator with an optional
        existing module.

        If the module parameter is not None then access
        will be added to the passed in module. Otherwise
        a new reference policy module will be created.
        """
        self.ifgen = None
        self.explain = NO_EXPLANATION
        self.gen_requires = False
        if module:
            self.module = module
        else:
            self.module = refpolicy.Module()

        self.dontaudit = False
        self.xperms = False

        self.domains = None
    def set_gen_refpol(self, if_set=None, perm_maps=None):
        """Set whether reference policy interfaces are generated.

        To turn on interface generation pass in an interface set
        to use for interface generation. To turn off interface
        generation pass in None.

        If interface generation is enabled requires generation
        will also be enabled.
        """
        if if_set:
            self.ifgen = InterfaceGenerator(if_set, perm_maps)
            self.gen_requires = True
        else:
            self.ifgen = None
        self.__set_module_style()


    def set_gen_requires(self, status=True):
        """Set whether module requires are generated.

        Passing in True will turn on requires generation and
        False will disable generation. If requires generation is
        disabled interface generation will also be disabled and
        can only be re-enabled via .set_gen_refpol.
        """
        self.gen_requires = status

    def set_gen_explain(self, explain=SHORT_EXPLANATION):
        """Set whether access is explained.
        """
        self.explain = explain

    def set_gen_dontaudit(self, dontaudit):
        self.dontaudit = dontaudit

    def set_gen_xperms(self, xperms):
        """Set whether extended permission access vector rules
        are generated.
        """
        self.xperms = xperms

    def __set_module_style(self):
        if self.ifgen:
            refpolicy = True
        else:
            refpolicy = False
        for mod in self.module.module_declarations():
            mod.refpolicy = refpolicy

    def set_module_name(self, name, version="1.0"):
        """Set the name of the module and optionally the version.
        """
        # find an existing module declaration
        m = None
        for mod in self.module.module_declarations():
            m = mod
        if not m:
            m = refpolicy.ModuleDeclaration()
            self.module.children.insert(0, m)
        m.name = name
        m.version = version
        if self.ifgen:
            m.refpolicy = True
        else:
            m.refpolicy = False

    def get_module(self):
        """Return the generated module."""
        # Generate the requires
        if self.gen_requires:
            gen_requires(self.module)

        return self.module

    def __add_av_rule(self, av):
        """Add access vector rule.
        """
        rule = refpolicy.AVRule(av)

        if self.dontaudit:
            rule.rule_type = rule.DONTAUDIT
        rule.comment = ""
        if self.explain:
            rule.comment = str(refpolicy.Comment(explain_access(av, verbosity=self.explain)))

        if av.type == audit2why.ALLOW:
            rule.comment += "\n#!!!! This avc is allowed in the current policy"

            if av.xperms:
                rule.comment += "\n#!!!! This av rule may have been overridden by an extended permission av rule"

        if av.type == audit2why.DONTAUDIT:
            rule.comment += "\n#!!!! This avc has a dontaudit rule in the current policy"

        if av.type == audit2why.BOOLEAN:
            if len(av.data) > 1:
                rule.comment += "\n#!!!! This avc can be allowed using one of the these booleans:\n#     %s" % ", ".join([x[0] for x in av.data])
            else:
                rule.comment += "\n#!!!! This avc can be allowed using the boolean '%s'" % av.data[0][0]

        if av.type == audit2why.CONSTRAINT:
            rule.comment += "\n#!!!! This avc is a constraint violation.  You would need to modify the attributes of either the source or target types to allow this access."
            rule.comment += "\n#Constraint rule: "
            rule.comment += "\n#\t" + av.data[0]
            for reason in av.data[1:]:
                rule.comment += "\n#\tPossible cause is the source %s and target %s are different." % reason

        try:
            if ( av.type == audit2why.TERULE and
                 "write" in av.perms and
                 ( "dir" in av.obj_class or "open" in av.perms )):
                if not self.domains:
                    self.domains = seinfo(ATTRIBUTE, name="domain")[0]["types"]
                types=[]

                for i in [x[TCONTEXT] for x in sesearch([ALLOW], {SCONTEXT: av.src_type, CLASS: av.obj_class, PERMS: av.perms})]:
                    if i not in self.domains:
                        types.append(i)
                if len(types) == 1:
                    rule.comment += "\n#!!!! The source type '%s' can write to a '%s' of the following type:\n# %s\n" % ( av.src_type, av.obj_class, ", ".join(types))
                elif len(types) >= 1:
                    rule.comment += "\n#!!!! The source type '%s' can write to a '%s' of the following types:\n# %s\n" % ( av.src_type, av.obj_class, ", ".join(types))
        except:
            pass

        self.module.children.append(rule)

    def __add_ext_av_rules(self, av):
        """Add extended permission access vector rules.
        """
        for op in av.xperms.keys():
            extrule = refpolicy.AVExtRule(av, op)

            if self.dontaudit:
                extrule.rule_type = extrule.DONTAUDITXPERM

            self.module.children.append(extrule)

    def add_access(self, av_set):
        """Add the access from the access vector set to this
        module.
        """
        # Use the interface generator to split the access
        # into raw allow rules and interfaces. After this
        # raw_allow will contain a list of access that should be
        # used as raw allow rules and the interfaces will
        # be added to the module.
        if self.ifgen:
            raw_allow, ifcalls = self.ifgen.gen(av_set, self.explain)
            self.module.children.extend(ifcalls)
        else:
            raw_allow = av_set

        # Generate the raw allow rules from the filtered list
        for av in raw_allow:
            self.__add_av_rule(av)
            if self.xperms and av.xperms:
                self.__add_ext_av_rules(av)

    def add_role_types(self, role_type_set):
        for role_type in role_type_set:
            self.module.children.append(role_type)

def explain_access(av, ml=None, verbosity=SHORT_EXPLANATION):
    """Explain why a policy statement was generated.

    Return a string containing a text explanation of
    why a policy statement was generated. The string is
    commented and wrapped and can be directly inserted
    into a policy.

    Params:
      av - access vector representing the access. Should
       have .audit_msgs set appropriately.
      verbosity - the amount of explanation provided. Should
       be set to NO_EXPLANATION, SHORT_EXPLANATION, or
       LONG_EXPLANATION.
    Returns:
      list of strings - strings explaining the access, or an empty
       list if verbosity=NO_EXPLANATION or there is not sufficient
       information to provide an explanation.
    """
    s = []

    def explain_interfaces():
        if not ml:
            return
        s.append(" Interface options:")
        for match in ml.all():
            ifcall = call_interface(match.interface, ml.av)
            s.append('   %s # [%d]' % (ifcall.to_string(), match.dist))


    # Format the raw audit data to explain why the
    # access was requested - either long or short.
    if verbosity == LONG_EXPLANATION:
        for msg in av.audit_msgs:
            s.append(' %s' % msg.header)
            s.append('  scontext="%s" tcontext="%s"' %
                     (str(msg.scontext), str(msg.tcontext)))
            s.append('  class="%s" perms="%s"' %
                     (msg.tclass, refpolicy.list_to_space_str(msg.accesses)))
            s.append('  comm="%s" exe="%s" path="%s"' % (msg.comm, msg.exe, msg.path))
            s.extend(textwrap.wrap('message="' + msg.message + '"', 80, initial_indent="  ",
                                   subsequent_indent="   "))
        explain_interfaces()
    elif verbosity:
        s.append(' src="%s" tgt="%s" class="%s", perms="%s"' %
                 (av.src_type, av.tgt_type, av.obj_class, av.perms.to_space_str()))
        # For the short display we are only going to use the additional information
        # from the first audit message. For the vast majority of cases this info
        # will always be the same anyway.
        if len(av.audit_msgs) > 0:
            msg = av.audit_msgs[0]
            s.append(' comm="%s" exe="%s" path="%s"' % (msg.comm, msg.exe, msg.path))
        explain_interfaces()
    return s

def call_interface(interface, av):
    params = []
    args = []

    params.extend(interface.params.values())
    params.sort(key=lambda param: param.num, reverse=True)

    ifcall = refpolicy.InterfaceCall()
    ifcall.ifname = interface.name

    for i in range(len(params)):
        if params[i].type == refpolicy.SRC_TYPE:
            ifcall.args.append(av.src_type)
        elif params[i].type == refpolicy.TGT_TYPE:
            ifcall.args.append(av.tgt_type)
        elif params[i].type == refpolicy.OBJ_CLASS:
            ifcall.args.append(av.obj_class)
        else:
            print(params[i].type)
            assert(0)

    assert(len(ifcall.args) > 0)

    return ifcall

class InterfaceGenerator:
    def __init__(self, ifs, perm_maps=None):
        self.ifs = ifs
        self.hack_check_ifs(ifs)
        self.matcher = matching.AccessMatcher(perm_maps)
        self.calls = []

    def hack_check_ifs(self, ifs):
        # FIXME: Disable interfaces we can't call - this is a hack.
        # Because we don't handle roles, multiple parameters, etc.,
        # etc., we must make certain we can actually use a returned
        # interface.
        for x in ifs.interfaces.values():
            params = []
            params.extend(x.params.values())
            params.sort(key=lambda param: param.num, reverse=True)
            for i in range(len(params)):
                # Check that the parameter position matches
                # the number (e.g., $1 is the first arg). This
                # will fail if the parser missed something.
                if (i + 1) != params[i].num:
                    x.enabled = False
                    break
                # Check that we can handle the param type (currently excludes
                # roles).
                if params[i].type not in [refpolicy.SRC_TYPE, refpolicy.TGT_TYPE,
                                          refpolicy.OBJ_CLASS]:
                    x.enabled = False
                    break

    def gen(self, avs, verbosity):
        raw_av = self.match(avs)
        ifcalls = []
        for ml in self.calls:
            ifcall = call_interface(ml.best().interface, ml.av)
            if verbosity:
                ifcall.comment = refpolicy.Comment(explain_access(ml.av, ml, verbosity))
            ifcalls.append((ifcall, ml))

        d = []
        for ifcall, ifs in ifcalls:
            found = False
            for o_ifcall in d:
                if o_ifcall.matches(ifcall):
                    if o_ifcall.comment and ifcall.comment:
                        o_ifcall.comment.merge(ifcall.comment)
                    found = True
            if not found:
                d.append(ifcall)

        return (raw_av, d)


    def match(self, avs):
        raw_av = []
        for av in avs:
            ans = matching.MatchList()
            self.matcher.search_ifs(self.ifs, av, ans)
            if len(ans):
                self.calls.append(ans)
            else:
                raw_av.append(av)

        return raw_av


def gen_requires(module):
    """Add require statements to the module.
    """
    def collect_requires(node):
        r = refpolicy.Require()
        for avrule in node.avrules():
            r.types.update(avrule.src_types)
            r.types.update(avrule.tgt_types)
            for obj in avrule.obj_classes:
                r.add_obj_class(obj, avrule.perms)

        for ifcall in node.interface_calls():
            for arg in ifcall.args:
                # FIXME - handle non-type arguments when we
                # can actually figure those out.
                r.types.add(arg)

        for role_type in node.role_types():
            r.roles.add(role_type.role)
            r.types.update(role_type.types)
                
        r.types.discard("self")

        node.children.insert(0, r)

    # FUTURE - this is untested on modules with any sort of
    # nesting
    for node in module.nodes():
        collect_requires(node)


site-packages/sepolgen/access.py000064400000027306147511334570012735 0ustar00# Authors: Karl MacMillan <kmacmillan@mentalrootkit.com>
#
# Copyright (C) 2006 Red Hat 
# see file 'COPYING' for use and warranty information
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License as
# published by the Free Software Foundation; version 2 only
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software
# Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
#

"""
Classes representing basic access.

SELinux - at the most basic level - represents access as
the 4-tuple subject (type or context), target (type or context),
object class, permission. The policy language elaborates this basic
access to facilitate more concise rules (e.g., allow rules can have multiple
source or target types - see refpolicy for more information).

This module has objects for representing the most basic access (AccessVector)
and sets of that access (AccessVectorSet). These objects are used in Madison
in a variety of ways, but they are the fundamental representation of access.
"""

from . import refpolicy
from . import util

from selinux import audit2why

def is_idparam(id):
    """Determine if an id is a paramater in the form $N, where N is
    an integer.

    Returns:
      True if the id is a parameter
      False if the id is not a parameter
    """
    if len(id) > 1 and id[0] == '$':
        try:
            int(id[1:])
        except ValueError:
            return False
        return True
    else:
        return False
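
# Illustrative: is_idparam("$1") and is_idparam("$2") are True, while
# is_idparam("foo_t") and is_idparam("$x") are False.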

class AccessVector(util.Comparison):
    """
    An access vector is the basic unit of access in SELinux.

    Access vectors are the most basic representation of access within
    SELinux. It represents the access a source type has to a target
    type in terms of an object class and a set of permissions.

    Access vectors are distinct from AVRules in that they can only
    store a single source type, target type, and object class. The
    simplicity of AccessVectors makes them useful for storing access
    in a form that is easy to search and compare.

    The source, target, and object are stored as strings. No checking
    is done to verify that the strings are valid SELinux identifiers.
    Identifiers in the form $N (where N is an integer) are reserved as
    interface parameters and are treated as wild cards in many
    circumstances.

    Properties:
     .src_type - The source type allowed access. [String or None]
     .tgt_type - The target type to which access is allowed. [String or None]
     .obj_class - The object class to which access is allowed. [String or None]
     .perms - The permissions allowed to the object class. [IdSet]
     .audit_msgs - The audit messages that generated this access vector [List of strings]
     .xperms - Extended permissions attached to the AV. [Dictionary {operation: xperm set}]
    """
    def __init__(self, init_list=None):
        if init_list:
            self.from_list(init_list)
        else:
            self.src_type = None
            self.tgt_type = None
            self.obj_class = None
            self.perms = refpolicy.IdSet()

        self.audit_msgs = []
        self.type = audit2why.TERULE
        self.data = []
        self.xperms = {}
        # when implementing __eq__, __hash__ is also needed on py2;
        # if the object is mutable, __hash__ should be None
        self.__hash__ = None

        # The direction of the information flow represented by this
        # access vector - used for matching
        self.info_flow_dir = None

    def from_list(self, list):
        """Initialize an access vector from a list.

        Initialize an access vector from a list treating the list as
        positional arguments - i.e., 0 = src_type, 1 = tgt_type, etc.
        All of the list elements 3 and greater are treated as perms.
        For example, the list ['foo_t', 'bar_t', 'file', 'read', 'write']
        would create an access vector list with the source type 'foo_t',
        target type 'bar_t', object class 'file', and permissions 'read'
        and 'write'.

        This format is useful for very simple storage to strings or disc
        (see to_list) and for initializing access vectors.
        """
        if len(list) < 4:
            raise ValueError("List must contain at least four elements %s" % str(list))
        self.src_type = list[0]
        self.tgt_type = list[1]
        self.obj_class = list[2]
        self.perms = refpolicy.IdSet(list[3:])

    def to_list(self):
        """
        Convert an access vector to a list.

        Convert an access vector to a list treating the list as positional
        values. See from_list for more information on how an access vector
        is represented in a list.
        """
        l = [self.src_type, self.tgt_type, self.obj_class]
        l.extend(sorted(self.perms))
        return l

    def merge(self, av):
        """Add permissions and extended permissions from AV"""
        self.perms.update(av.perms)

        for op in av.xperms:
            if op not in self.xperms:
                self.xperms[op] = refpolicy.XpermSet()
            self.xperms[op].extend(av.xperms[op])

    def __str__(self):
        return self.to_string()

    def to_string(self):
        return "allow %s %s:%s %s;" % (self.src_type, self.tgt_type,
                                        self.obj_class, self.perms.to_space_str())

    def _compare(self, other, method):
        try:
            x = list(self.perms)
            a = (self.src_type, self.tgt_type, self.obj_class, x)
            y = list(other.perms)
            x.sort()
            y.sort()
            b = (other.src_type, other.tgt_type, other.obj_class, y)
            return method(a, b)
        except (AttributeError, TypeError):
            # trying to compare to foreign type
            return NotImplemented


def avrule_to_access_vectors(avrule):
    """Convert an avrule into a list of access vectors.

    AccessVectors and AVRules are similar, but differ in that
    an AVRule can have more than one source type, target type, and
    object class. This function expands a single avrule into a
    list of one or more AccessVectors representing the access
    defined in the AVRule.

    
    """
    if isinstance(avrule, AccessVector):
        return [avrule]
    a = []
    for src_type in avrule.src_types:
        for tgt_type in avrule.tgt_types:
            for obj_class in avrule.obj_classes:
                access = AccessVector()
                access.src_type = src_type
                access.tgt_type = tgt_type
                access.obj_class = obj_class
                access.perms = avrule.perms.copy()
                a.append(access)
    return a

class AccessVectorSet:
    """A non-overlapping set of access vectors.

    An AccessVectorSet is designed to store one or more access vectors
    that are non-overlapping. Access can be added to the set
    incrementally and access vectors will be added or merged as
    necessary.  For example, adding the following access vectors using
    add_av:
       allow $1 etc_t : read;
       allow $1 etc_t : write;
       allow $1 var_log_t : read;
    Would result in an access vector set with the access vectors:
       allow $1 etc_t : { read write};
       allow $1 var_log_t : read;
    """
    def __init__(self):
        """Initialize an access vector set.
        """
        self.src = {}
        # The information flow direction of this access vector
        # set - see objectmodel.py for more information. This is
        # stored here to speed up searching - see matching.py.
        self.info_dir = None

    def __iter__(self):
        """Iterate over all of the unique access vectors in the set."""
        for tgts in self.src.values():
            for objs in tgts.values():
                for av in objs.values():
                    yield av

    def __len__(self):
        """Return the number of unique access vectors in the set.

        Because of the internal representation of the access vector set,
        __len__ is not a constant time operation. Worst case is O(N)
        where N is the number of unique access vectors, but the common
        case is probably better.
        """
        l = 0
        for tgts in self.src.values():
            for objs in tgts.values():
               l += len(objs)
        return l

    def to_list(self):
        """Return the unique access vectors in the set as a list.

        The format of the returned list is a set of nested lists,
        each access vector represented by a list. This format is
        designed to be simply serializable to a file.

        For example, consider an access vector set with the following
        access vectors:
          allow $1 user_t : file read;
          allow $1 etc_t : file { read write};
        to_list would return the following:
          [[$1, user_t, file, read]
           [$1, etc_t, file, read, write]]

        See AccessVector.to_list for more information.
        """
        l = []
        for av in self:
            l.append(av.to_list())

        return l

    def from_list(self, l):
        """Add access vectors stored in a list.

        See to_list for more information on the list format that this
        method accepts.

        This will add all of the access from the list. Any existing
        access vectors in the set will be retained.
        """
        for av in l:
            self.add_av(AccessVector(av))

    def add(self, src_type, tgt_type, obj_class, perms, audit_msg=None, avc_type=audit2why.TERULE, data=[]):
        """Add an access vector to the set.
        """
        av = AccessVector()
        av.src_type = src_type
        av.tgt_type = tgt_type
        av.obj_class = obj_class
        av.perms = perms
        av.data = data
        av.type = avc_type

        self.add_av(av, audit_msg)

    def add_av(self, av, audit_msg=None):
        """Add an access vector to the set."""
        tgt = self.src.setdefault(av.src_type, { })
        cls = tgt.setdefault(av.tgt_type, { })

        if (av.obj_class, av.type) in cls:
            cls[av.obj_class, av.type].merge(av)
        else:
            cls[av.obj_class, av.type] = av

        if audit_msg:
            cls[av.obj_class, av.type].audit_msgs.append(audit_msg)

def avs_extract_types(avs):
    types = refpolicy.IdSet()
    for av in avs:
        types.add(av.src_type)
        types.add(av.tgt_type)
        
    return types

def avs_extract_obj_perms(avs):
    perms = { }
    for av in avs:
        if av.obj_class in perms:
            s = perms[av.obj_class]
        else:
            s = refpolicy.IdSet()
            perms[av.obj_class] = s
        s.update(av.perms)
    return perms

class RoleTypeSet:
    """A non-overlapping set of role type statements.

    This class allows the incremental addition of role type statements and
    maintains a non-overlapping list of statements.
    """
    def __init__(self):
        """Initialize an access vector set."""
        self.role_types = {}

    def __iter__(self):
        """Iterate over all of the unique role allows statements in the set."""
        for role_type in self.role_types.values():
            yield role_type

    def __len__(self):
        """Return the unique number of role allow statements."""
        return len(self.role_types.keys())

    def add(self, role, type):
        if role in self.role_types:
            role_type = self.role_types[role]
        else:
            role_type = refpolicy.RoleType()
            role_type.role = role
            self.role_types[role] = role_type

        role_type.types.add(type)
site-packages/sepolgen/util.py000064400000012560147511334570012445 0ustar00# Authors: Karl MacMillan <kmacmillan@mentalrootkit.com>
#
# Copyright (C) 2006 Red Hat 
# see file 'COPYING' for use and warranty information
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License as
# published by the Free Software Foundation; version 2 only
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software
# Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
#
import locale
import sys


PY3 = sys.version_info[0] == 3

if PY3:
    bytes_type=bytes
    string_type=str
else:
    bytes_type=str
    string_type=unicode


class ConsoleProgressBar:
    def __init__(self, out, steps=100, indicator='#'):
        self.blocks = 0
        self.current = 0
        self.steps = steps
        self.indicator = indicator
        self.out = out
        self.done = False

    def start(self, message=None):
        self.done = False
        if message:
            self.out.write('\n%s:\n' % message)
        self.out.write('%--10---20---30---40---50---60---70---80---90--100\n')

    def step(self, n=1):
        self.current += n

        old = self.blocks
        self.blocks = int(round(self.current / float(self.steps) * 100) / 2)

        if self.blocks > 50:
            self.blocks = 50

        new = self.blocks - old

        self.out.write(self.indicator * new)
        self.out.flush()

        if self.blocks == 50 and not self.done:
            self.done = True
            self.out.write("\n")

def set_to_list(s):
    l = []
    l.extend(s)
    return l

def first(s, sorted=False):
    """
    Return the first element of a set.

    It is sometimes useful to return the first element from a set but,
    because sets are not indexable, this is rather hard. This function
    will return the first element from a set. If sorted is True, then
    the set will first be sorted (making this an expensive operation).
    Otherwise a random element will be returned (as sets are not ordered).
    """
    if not len(s):
        raise IndexError("empty containter")
    
    if sorted:
        l = set_to_list(s)
        l.sort()
        return l[0]
    else:
        for x in s:
            return x
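
# Illustrative: first({"b", "a"}, sorted=True) returns "a"; without
# sorted=True an arbitrary element of the set is returned.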

def encode_input(text):
    """Encode given text via preferred system encoding"""
    # locale will often find out the correct encoding
    encoding = locale.getpreferredencoding()
    try:
        encoded_text = text.encode(encoding)
    except UnicodeError:
        # if it fails to find the correct encoding then ascii is used,
        # which may lead to UnicodeError if `text` contains non-ascii
        # characters; utf-8 is our guess to fix the situation
        encoded_text = text.encode('utf-8')
    return encoded_text

def decode_input(text):
    """Decode given text via preferred system encoding"""
    # locale will often find out the correct encoding
    encoding = locale.getpreferredencoding()
    try:
        decoded_text = text.decode(encoding)
    except UnicodeError:
        # if it fails to find the correct encoding then ascii is used,
        # which may lead to UnicodeError if `text` contains non-ascii
        # characters; utf-8 is our guess to fix the situation
        decoded_text = text.decode('utf-8')
    return decoded_text

class Comparison():
    """Class used when implementing rich comparison.

    Inherit from this class if you want to have rich
    comparisons within the class; afterwards, implement the
    _compare method in your class."""
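    # A minimal sketch (illustrative, not part of this module): a subclass
    # only needs to supply _compare and the operators below then all work.
    #
    #   class Version(Comparison):
    #       def __init__(self, num):
    #           self.num = num
    #       def _compare(self, other, method):
    #           try:
    #               return method(self.num, other.num)
    #           except AttributeError:
    #               return NotImplemented
    #
    #   Version(1) < Version(2)   # True, via Comparison.__lt__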

    def _compare(self, other, method):
        return NotImplemented

    def __eq__(self, other):
        return self._compare(other, lambda a, b: a == b)

    def __lt__(self, other):
        return self._compare(other, lambda a, b: a < b)

    def __le__(self, other):
        return self._compare(other, lambda a, b: a <= b)

    def __ge__(self, other):
        return self._compare(other, lambda a, b: a >= b)

    def __gt__(self, other):
        return self._compare(other, lambda a, b: a > b)

    def __ne__(self, other):
        return self._compare(other, lambda a, b: a != b)

if sys.version_info < (2,7):
    # cmp_to_key function is missing in python2.6
    def cmp_to_key(mycmp):
        'Convert a cmp= function into a key= function'
        class K:
            def __init__(self, obj, *args):
                self.obj = obj
            def __lt__(self, other):
                return mycmp(self.obj, other.obj) < 0
            def __gt__(self, other):
                return mycmp(self.obj, other.obj) > 0
            def __eq__(self, other):
                return mycmp(self.obj, other.obj) == 0
            def __le__(self, other):
                return mycmp(self.obj, other.obj) <= 0
            def __ge__(self, other):
                return mycmp(self.obj, other.obj) >= 0
            def __ne__(self, other):
                return mycmp(self.obj, other.obj) != 0
        return K
else:
    from functools import cmp_to_key

def cmp(first, second):
    return (first > second) - (second > first)

if __name__ == "__main__":
    import time
    p = ConsoleProgressBar(sys.stdout, steps=999)
    p.start("computing pi")
    for i in range(999):
        p.step()
        time.sleep(0.001)
site-packages/sepolgen/interfaces.py000064400000040121147511334570013605 0ustar00# Authors: Karl MacMillan <kmacmillan@mentalrootkit.com>
#
# Copyright (C) 2006 Red Hat
# see file 'COPYING' for use and warranty information
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License as
# published by the Free Software Foundation; version 2 only
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software
# Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
#

"""
Classes for representing and manipulating interfaces.
"""

import copy
import itertools

from . import access
from . import refpolicy
from . import objectmodel
from . import matching
from .sepolgeni18n import _


class Param:
    """
    Object representing a parameter for an interface.
    """
    def __init__(self):
        self.__name = ""
        self.type = refpolicy.SRC_TYPE
        self.obj_classes = refpolicy.IdSet()
        self.required = True

    def set_name(self, name):
        if not access.is_idparam(name):
            raise ValueError("Name [%s] is not a param" % name)
        self.__name = name

    def get_name(self):
        return self.__name

    name = property(get_name, set_name)

    num = property(fget=lambda self: int(self.name[1:]))

    def __repr__(self):
        return "<sepolgen.policygen.Param instance [%s, %s, %s]>" % \
               (self.name, refpolicy.field_to_str[self.type], " ".join(self.obj_classes))


# Helper for the param extraction functions below
def __param_insert(name, type, av, params):
    ret = 0
    if name in params:
        p = params[name]
        # The entries are identical - we're done
        if type == p.type:
            return
        # Handle implicitly typed objects (like process)
        if (type == refpolicy.SRC_TYPE or type == refpolicy.TGT_TYPE) and \
           (p.type == refpolicy.TGT_TYPE or p.type == refpolicy.SRC_TYPE):
            #print name, refpolicy.field_to_str[p.type]
            # If the object is not implicitly typed, tell the
            # caller there is a likely conflict.
            ret = 1
            if av:
                avobjs = [av.obj_class]
            else:
                avobjs = []
            for obj in itertools.chain(p.obj_classes, avobjs):
                if obj in objectmodel.implicitly_typed_objects:
                    ret = 0
                    break
            # "Promote" to a SRC_TYPE as this is the likely usage.
            # We do this on purpose even if the above test fails,
            # as there is really no sane way to resolve the conflict
            # here. The caller can take other actions if needed.
            p.type = refpolicy.SRC_TYPE
        else:
            # There is some conflict - no way to resolve it really
            # so we just leave the first entry and tell the caller
            # there was a conflict.
            ret = 1
    else:
        p = Param()
        p.name = name
        p.type = type
        params[p.name] = p

    if av:
        p.obj_classes.add(av.obj_class)
    return ret



def av_extract_params(av, params):
    """Extract the paramaters from an access vector.

    Extract the paramaters (in the form $N) from an access
    vector, storing them as Param objects in a dictionary.
    Some attempt is made at resolving conflicts with other
    entries in the dict, but if an unresolvable conflict is
    found it is reported to the caller.

    The goal here is to figure out how interface parameters are
    actually used in the interface - e.g., that $1 is a domain used as
    a SRC_TYPE. In general an interface will look like this:

    interface(`foo', `
       allow $1 foo : file read;
    ')

    This is simple to figure out - $1 is a SRC_TYPE. A few interfaces
    are more complex, for example:

    interface(`foo_trans',`
       domain_auto_trans($1,fingerd_exec_t,fingerd_t)

       allow $1 fingerd_t:fd use;
       allow fingerd_t $1:fd use;
       allow fingerd_t $1:fifo_file rw_file_perms;
       allow fingerd_t $1:process sigchld;
    ')

    Here the usage seems ambiguous, but it is not. $1 is still a domain
    and therefore should be returned as a SRC_TYPE.

    Returns:
      0  - success
      1  - conflict found
    """
    ret = 0
    found_src = False
    if access.is_idparam(av.src_type):
        if __param_insert(av.src_type, refpolicy.SRC_TYPE, av, params) == 1:
            ret = 1

    if access.is_idparam(av.tgt_type):
        if __param_insert(av.tgt_type, refpolicy.TGT_TYPE, av, params) == 1:
            ret = 1

    if access.is_idparam(av.obj_class):
        if __param_insert(av.obj_class, refpolicy.OBJ_CLASS, av, params) == 1:
            ret = 1

    return ret
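
# A minimal sketch (illustrative) of extracting parameters from a single
# access vector such as "allow $1 foo_t : file read":
#
#   av = access.AccessVector(["$1", "foo_t", "file", "read"])
#   params = {}
#   av_extract_params(av, params)
#   params["$1"].type == refpolicy.SRC_TYPE   # True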

def role_extract_params(role, params):
    if access.is_idparam(role.role):
        return __param_insert(role.role, refpolicy.ROLE, None, params)
    
def type_rule_extract_params(rule, params):
    def extract_from_set(set, type):
        ret = 0
        for x in set:
            if access.is_idparam(x):
                if __param_insert(x, type, None, params):
                    ret = 1
        return ret

    ret = 0
    if extract_from_set(rule.src_types, refpolicy.SRC_TYPE):
        ret = 1

    if extract_from_set(rule.tgt_types, refpolicy.TGT_TYPE):
        ret = 1
        
    if extract_from_set(rule.obj_classes, refpolicy.OBJ_CLASS):
        ret = 1

    if access.is_idparam(rule.dest_type):
        if __param_insert(rule.dest_type, refpolicy.DEST_TYPE, None, params):
            ret = 1
            
    return ret

def ifcall_extract_params(ifcall, params):
    ret = 0
    for arg in ifcall.args:
        if access.is_idparam(arg):
            # Assume interface arguments are source types. Fairly safe
            # assumption for most interfaces
            if __param_insert(arg, refpolicy.SRC_TYPE, None, params):
                ret = 1

    return ret

class AttributeVector:
    def __init__(self):
        self.name = ""
        self.access = access.AccessVectorSet()

    def add_av(self, av):
        self.access.add_av(av)

class AttributeSet:
    def __init__(self):
        self.attributes = { }

    def add_attr(self, attr):
        self.attributes[attr.name] = attr
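
    # The attribute file parsed by from_file below is expected to contain an
    # "[Attribute <name>]" header per attribute followed by one
    # comma-separated access vector per line, e.g. (illustrative):
    #
    #   [Attribute domain]
    #   domain,etc_t,file,read
    #   domain,var_log_t,file,read,write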

    def from_file(self, fd):
        def parse_attr(line):
            fields = line[1:-1].split()
            if len(fields) != 2 or fields[0] != "Attribute":
                raise SyntaxError("Syntax error Attribute statement %s" % line)
            a = AttributeVector()
            a.name = fields[1]

            return a

        a = None
        for line in fd:
            line = line[:-1]
            if line[0] == "[":
                if a:
                    self.add_attr(a)
                a = parse_attr(line)
            elif a:
                l = line.split(",")
                av = access.AccessVector(l)
                a.add_av(av)
        if a:
            self.add_attr(a)

class InterfaceVector:
    def __init__(self, interface=None, attributes={}):
        # Enabled is a loose concept currently - we are essentially
        # not enabling interfaces that we can't handle.
        # See InterfaceVector.add_ifv for more information.
        self.enabled = True
        self.name = ""
        # The access that is enabled by this interface - eventually
        # this will include indirect access from typeattribute
        # statements.
        self.access = access.AccessVectorSet()
        # Parameters are stored in a dictionary (key: param name
        # value: Param object).
        self.params = { }
        if interface:
            self.from_interface(interface, attributes)
        self.expanded = False

    def from_interface(self, interface, attributes={}):
        self.name = interface.name

        # Add allow rules
        for avrule in interface.avrules():
            if avrule.rule_type != refpolicy.AVRule.ALLOW:
                continue
            # Handle some policy bugs
            if "dontaudit" in interface.name:
                #print "allow rule in interface: %s" % interface
                continue
            avs = access.avrule_to_access_vectors(avrule)
            for av in avs:
                self.add_av(av)

        # Add typeattribute access
        if attributes:
            for typeattribute in interface.typeattributes():
                for attr in typeattribute.attributes:
                    if attr not in attributes.attributes:
                        # print "missing attribute " + attr
                        continue
                    attr_vec = attributes.attributes[attr]
                    for a in attr_vec.access:
                        av = copy.copy(a)
                        if av.src_type == attr_vec.name:
                            av.src_type = typeattribute.type
                        if av.tgt_type == attr_vec.name:
                            av.tgt_type = typeattribute.type
                        self.add_av(av)


        # Extract parameters from roles
        for role in interface.roles():
            if role_extract_params(role, self.params):
                pass
                #print "found conflicting role param %s for interface %s" % \
                #      (role.name, interface.name)
        # Extract parameters from type rules
        for rule in interface.typerules():
            if type_rule_extract_params(rule, self.params):
                pass
                #print "found conflicting params in rule %s in interface %s" % \
                #      (str(rule), interface.name)

        for ifcall in interface.interface_calls():
            if ifcall_extract_params(ifcall, self.params):
                pass
                #print "found conflicting params in ifcall %s in interface %s" % \
                #      (str(ifcall), interface.name)
            

    def add_av(self, av):
        if av_extract_params(av, self.params) == 1:
            pass
            #print "found conflicting perms [%s]" % str(av)
        self.access.add_av(av)

    def to_string(self):
        s = []
        s.append("[InterfaceVector %s]" % self.name)
        for av in self.access:
            s.append(str(av))
        return "\n".join(s)

    def __str__(self):
        return self.__repr__()

    def __repr__(self):
        return "<InterfaceVector %s:%s>" % (self.name, self.enabled)


class InterfaceSet:
    def __init__(self, output=None):
        self.interfaces = { }
        self.tgt_type_map = { }
        self.tgt_type_all = []
        self.output = output

    def o(self, str):
        if self.output:
            self.output.write(str + "\n")

    def to_file(self, fd):
        for iv in sorted(self.interfaces.values(), key=lambda x: x.name):
            fd.write("[InterfaceVector %s " % iv.name)
            for param in sorted(iv.params.values(), key=lambda x: x.name):
                fd.write("%s:%s " % (param.name, refpolicy.field_to_str[param.type]))
            fd.write("]\n")
            avl = sorted(iv.access.to_list())
            for av in avl:
                fd.write(",".join(av))
                fd.write("\n")

    def from_file(self, fd):
        def parse_ifv(line):
            fields = line[1:-1].split()
            if len(fields) < 2 or fields[0] != "InterfaceVector":
                raise SyntaxError("Syntax error InterfaceVector statement %s" % line)
            ifv = InterfaceVector()
            ifv.name = fields[1]
            if len(fields) == 2:
                return
            for field in fields[2:]:
                p = field.split(":")
                if len(p) != 2:
                    raise SyntaxError("Invalid param in InterfaceVector statement %s" % line)
                param = Param()
                param.name = p[0]
                param.type = refpolicy.str_to_field[p[1]]
                ifv.params[param.name] = param
            return ifv

        ifv = None
        for line in fd:
            line = line[:-1]
            if line[0] == "[":
                if ifv:
                    self.add_ifv(ifv)
                ifv = parse_ifv(line)
            elif ifv:
                l = line.split(",")
                av = access.AccessVector(l)
                ifv.add_av(av)
        if ifv:
            self.add_ifv(ifv)

        self.index()

    def add_ifv(self, ifv):
        self.interfaces[ifv.name] = ifv

    def index(self):
        for ifv in self.interfaces.values():
            tgt_types = set()
            for av in ifv.access:
                if access.is_idparam(av.tgt_type):
                    self.tgt_type_all.append(ifv)
                    tgt_types = set()
                    break
                tgt_types.add(av.tgt_type)

            for type in tgt_types:
                l = self.tgt_type_map.setdefault(type, [])
                l.append(ifv)

    def add(self, interface, attributes={}):
        ifv = InterfaceVector(interface, attributes)
        self.add_ifv(ifv)

    def add_headers(self, headers, output=None, attributes={}):
        for i in itertools.chain(headers.interfaces(), headers.templates()):
            self.add(i, attributes)

        self.expand_ifcalls(headers)
        self.index()

    def map_param(self, id, ifcall):
        if access.is_idparam(id):
            num = int(id[1:])
            if num > len(ifcall.args):
                # Tell caller to drop this because it must have
                # been generated from an optional param.
                return None
            else:
                arg = ifcall.args[num - 1]
                if isinstance(arg, list):
                    return arg
                else:
                    return [arg]
        else:
            return [id]

    def map_add_av(self, ifv, av, ifcall):
        src_types = self.map_param(av.src_type, ifcall)
        if src_types is None:
            return

        tgt_types = self.map_param(av.tgt_type, ifcall)
        if tgt_types is None:
            return

        obj_classes = self.map_param(av.obj_class, ifcall)
        if obj_classes is None:
            return

        new_perms = refpolicy.IdSet()
        for perm in av.perms:
            p = self.map_param(perm, ifcall)
            if p is None:
                continue
            else:
                new_perms.update(p)
        if len(new_perms) == 0:
            return

        for src_type in src_types:
            for tgt_type in tgt_types:
                for obj_class in obj_classes:
                    ifv.access.add(src_type, tgt_type, obj_class, new_perms)

    def do_expand_ifcalls(self, interface, if_by_name):
        # Descend an interface call tree adding the access
        # from each interface. This is a depth first walk
        # of the tree.

        stack = [(interface, None)]
        ifv = self.interfaces[interface.name]
        ifv.expanded = True

        while len(stack) > 0:
            cur, cur_ifcall = stack.pop(-1)

            cur_ifv = self.interfaces[cur.name]
            if cur != interface:

                for av in cur_ifv.access:
                    self.map_add_av(ifv, av, cur_ifcall)

                # If we have already fully expanded this interface
                # there is no reason to descend further.
                if cur_ifv.expanded:
                    continue

            for ifcall in cur.interface_calls():
                if ifcall.ifname == interface.name:
                    self.o(_("Found circular interface class"))
                    return
                try:
                    newif = if_by_name[ifcall.ifname]
                except KeyError:
                    self.o(_("Missing interface definition for %s" % ifcall.ifname))
                    continue

                stack.append((newif, ifcall))


    def expand_ifcalls(self, headers):
        # Create a map of interface names to interfaces -
        # this mirrors the interface vector map we already
        # have.
        if_by_name = { }

        for i in itertools.chain(headers.interfaces(), headers.templates()):
            if_by_name[i.name] = i


        for interface in itertools.chain(headers.interfaces(), headers.templates()):
            self.do_expand_ifcalls(interface, if_by_name)

site-packages/sepolgen/classperms.py000064400000005402147511334570013641 0ustar00# Authors: Karl MacMillan <kmacmillan@mentalrootkit.com>
#
# Copyright (C) 2006 Red Hat 
# see file 'COPYING' for use and warranty information
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License as
# published by the Free Software Foundation; version 2 only
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software
# Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
#
import sys

tokens = ('DEFINE',
          'NAME',
          'TICK',
          'SQUOTE',
          'OBRACE',
          'CBRACE',
          'SEMI',
          'OPAREN',
          'CPAREN',
          'COMMA')

reserved = {
    'define' : 'DEFINE' }

t_TICK      = r'\`'
t_SQUOTE    = r'\''
t_OBRACE    = r'\{'
t_CBRACE    = r'\}'
t_SEMI      = r'\;'
t_OPAREN    = r'\('
t_CPAREN    = r'\)'
t_COMMA     = r'\,'

t_ignore    = " \t\n"

def t_NAME(t):
    r'[a-zA-Z_][a-zA-Z0-9_]*'
    t.type = reserved.get(t.value,'NAME')
    return t

def t_error(t):
    print("Illegal character '%s'" % t.value[0])
    t.skip(1)

from . import lex
lex.lex()

def p_statements(p):
    '''statements : define_stmt
                  | define_stmt statements
    '''
    if len(p) == 2:
        p[0] = [p[1]]
    else:
        p[0] = [p[1]] + [p[2]]

def p_define_stmt(p):
    # This sucks - corresponds to 'define(`foo',`{ read write }')
    '''define_stmt : DEFINE OPAREN TICK NAME SQUOTE COMMA TICK list SQUOTE CPAREN
    '''
    
    p[0] = [p[4], p[8]]

def p_list(p):
    '''list : NAME
            | OBRACE names CBRACE
    '''
    if p[1] == "{":
        p[0] = p[2]
    else:
        p[0] = [p[1]]

def p_names(p):
    '''names : NAME
             | NAME names
    '''
    if len(p) == 2:
        p[0] = [p[1]]
    else:
        p[0] = [p[1]] + p[2]

def p_error(p):
    print("Syntax error on line %d %s [type=%s]" % (p.lineno, p.value, p.type))
    
from . import yacc
yacc.yacc()


f = open("all_perms.spt")
txt = f.read()
f.close()

#lex.input(txt)
#while 1:
#    tok = lex.token()
#    if not tok:
#        break
#    print tok

test = "define(`foo',`{ read write append }')"
test2 = """define(`all_filesystem_perms',`{ mount remount unmount getattr relabelfrom relabelto transition associate quotamod quotaget }')
define(`all_security_perms',`{ compute_av compute_create compute_member check_context load_policy compute_relabel compute_user setenforce setbool setsecparam setcheckreqprot }')
"""
result = yacc.parse(txt)
print(result)
    
site-packages/sepolgen/output.py000064400000011774147511334570013036 0ustar00# Authors: Karl MacMillan <kmacmillan@mentalrootkit.com>
#
# Copyright (C) 2006 Red Hat
# see file 'COPYING' for use and warranty information
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License as
# published by the Free Software Foundation; version 2 only
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software
# Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
#

"""
Classes and functions for the output of reference policy modules.

This module takes a refpolicy.Module object and formats it for
output using the ModuleWriter object. By separating the output
in this way the other parts of Madison can focus solely on
generating policy. This keeps the semantic / syntactic issues
cleanly separated from the formatting issues.
"""

from . import refpolicy
from . import util

if util.PY3:
    from .util import cmp


class ModuleWriter:
    def __init__(self):
        self.fd = None
        self.module = None
        self.sort = True
        self.requires = True

    def write(self, module, fd):
        self.module = module

        if self.sort:
            sort_filter(self.module)

        # FIXME - make this handle nesting
        for node, depth in refpolicy.walktree(self.module, showdepth=True):
            fd.write("%s\n" % str(node))

# Helper functions for sort_filter - this is all done old school
# C style rather than with polymorphic methods because this sorting
# is specific to output. It is not necessarily the comparison you
# want generally.

# Compare two IdSets - we could probably do something clever
# with set difference here, but this works.
def id_set_cmp(x, y):
    xl = util.set_to_list(x)
    xl.sort()
    yl = util.set_to_list(y)
    yl.sort()

    if len(xl) != len(yl):
        return cmp(xl[0], yl[0])
    for v in zip(xl, yl):
        if v[0] != v[1]:
            return cmp(v[0], v[1])
    return 0

# Compare two avrules
def avrule_cmp(a, b):
    ret = id_set_cmp(a.src_types, b.src_types)
    if ret != 0:
        return ret
    ret = id_set_cmp(a.tgt_types, b.tgt_types)
    if ret != 0:
        return ret
    ret = id_set_cmp(a.obj_classes, b.obj_classes)
    if ret != 0:
        return ret

    # At this point, who cares - just return something
    return cmp(len(a.perms), len(b.perms))

# Compare two interface calls
def ifcall_cmp(a, b):
    if a.args[0] != b.args[0]:
        return cmp(a.args[0], b.args[0])
    return cmp(a.ifname, b.ifname)

# Compare two avrules or interface calls
def rule_cmp(a, b):
    if isinstance(a, refpolicy.InterfaceCall):
        if isinstance(b, refpolicy.InterfaceCall):
            return ifcall_cmp(a, b)
        else:
            return id_set_cmp([a.args[0]], b.src_types)
    else:
        if isinstance(b, refpolicy.AVRule):
            return avrule_cmp(a,b)
        else:
            return id_set_cmp(a.src_types, [b.args[0]])
                
def role_type_cmp(a, b):
    return cmp(a.role, b.role)

def sort_filter(module):
    """Sort and group the output for readability.
    """
    def sort_node(node):
        c = []

        # Module statement
        for mod in node.module_declarations():
            c.append(mod)
            c.append(refpolicy.Comment())

        # Requires
        for require in node.requires():
            c.append(require)
        c.append(refpolicy.Comment())

        # Rules
        #
        # We are going to group output by source type (which
        # we assume is the first argument for interfaces).
        rules = []
        rules.extend(node.avrules())
        rules.extend(node.interface_calls())
        rules.sort(key=util.cmp_to_key(rule_cmp))

        cur = None
        sep_rules = []
        for rule in rules:
            if isinstance(rule, refpolicy.InterfaceCall):
                x = rule.args[0]
            else:
                x = util.first(rule.src_types)

            if cur != x:
                if cur:
                    sep_rules.append(refpolicy.Comment())
                cur = x
                comment = refpolicy.Comment()
                comment.lines.append("============= %s ==============" % cur)
                sep_rules.append(comment)
            sep_rules.append(rule)

        c.extend(sep_rules)


        ras = []
        ras.extend(node.role_types())
        ras.sort(key=util.cmp_to_key(role_type_cmp))
        if len(ras):
            comment = refpolicy.Comment()
            comment.lines.append("============= ROLES ==============")
            c.append(comment)
        

        c.extend(ras)

        # Everything else
        for child in node.children:
            if child not in c:
                c.append(child)

        node.children = c

    for node in module.nodes():
        sort_node(node)
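
# A rough sketch of the grouping this produces in the emitted policy text
# (hypothetical types, rules and interface calls, shown only to illustrate the
# banner comments added above):
#
#     # ============= httpd_t ==============
#     allow httpd_t httpd_log_t:file { getattr read };
#     apache_read_log(httpd_t)
#
#     # ============= ROLES ==============
#     role system_r types httpd_t;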


site-packages/sepolgen/yacc.py000064400000414144147511334570012413 0ustar00# -----------------------------------------------------------------------------
# ply: yacc.py
#
# Copyright (C) 2001-2018
# David M. Beazley (Dabeaz LLC)
# All rights reserved.
#
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions are
# met:
#
# * Redistributions of source code must retain the above copyright notice,
#   this list of conditions and the following disclaimer.
# * Redistributions in binary form must reproduce the above copyright notice,
#   this list of conditions and the following disclaimer in the documentation
#   and/or other materials provided with the distribution.
# * Neither the name of the David Beazley or Dabeaz LLC may be used to
#   endorse or promote products derived from this software without
#  specific prior written permission.
#
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
# "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
# LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
# A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
# OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
# SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
# LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
# DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
# THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
# (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
# OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
# -----------------------------------------------------------------------------
#
# This implements an LR parser that is constructed from grammar rules defined
# as Python functions. The grammar is specified by supplying the BNF inside
# Python documentation strings.  The inspiration for this technique was borrowed
# from John Aycock's Spark parsing system.  PLY might be viewed as a cross between
# Spark and the GNU bison utility.
#
# The current implementation is only somewhat object-oriented. The
# LR parser itself is defined in terms of an object (which allows multiple
# parsers to co-exist).  However, most of the variables used during table
# construction are defined in terms of global variables.  Users shouldn't
# notice unless they are trying to define multiple parsers at the same
# time using threads (in which case they should have their head examined).
#
# This implementation supports both SLR and LALR(1) parsing.  LALR(1)
# support was originally implemented by Elias Ioup (ezioup@alumni.uchicago.edu),
# using the algorithm found in Aho, Sethi, and Ullman "Compilers: Principles,
# Techniques, and Tools" (The Dragon Book).  LALR(1) has since been replaced
# by the more efficient DeRemer and Pennello algorithm.
#
# :::::::: WARNING :::::::
#
# Construction of LR parsing tables is fairly complicated and expensive.
# To make this module run fast, a *LOT* of work has been put into
# optimization, often at the expense of readability and what some might
# consider to be good Python "coding style."   Modify the code at your
# own risk!
# ----------------------------------------------------------------------------

import re
import types
import sys
import os.path
import inspect
import warnings

__version__    = '3.11'
__tabversion__ = '3.10'

#-----------------------------------------------------------------------------
#                     === User configurable parameters ===
#
# Change these to modify the default behavior of yacc (if you wish)
#-----------------------------------------------------------------------------

yaccdebug   = True             # Debugging mode.  If set, yacc generates a
                               # 'parser.out' file in the current directory

debug_file  = 'parser.out'     # Default name of the debugging file
tab_module  = 'parsetab'       # Default name of the table module
default_lr  = 'LALR'           # Default LR table generation method

error_count = 3                # Number of symbols that must be shifted to leave recovery mode

yaccdevel   = False            # Set to True if developing yacc.  This turns off optimized
                               # implementations of certain functions.

resultlimit = 40               # Size limit of results when running in debug mode.

pickle_protocol = 0            # Protocol to use when writing pickle files

# String type-checking compatibility
if sys.version_info[0] < 3:
    string_types = basestring
else:
    string_types = str

MAXINT = sys.maxsize

# This object is a stand-in for a logging object created by the
# logging module.   PLY will use this by default to create things
# such as the parser.out file.  If a user wants more detailed
# information, they can create their own logging object and pass
# it into PLY.

class PlyLogger(object):
    def __init__(self, f):
        self.f = f

    def debug(self, msg, *args, **kwargs):
        self.f.write((msg % args) + '\n')

    info = debug

    def warning(self, msg, *args, **kwargs):
        self.f.write('WARNING: ' + (msg % args) + '\n')

    def error(self, msg, *args, **kwargs):
        self.f.write('ERROR: ' + (msg % args) + '\n')

    critical = debug

# Null logger is used when no output is generated. Does nothing.
class NullLogger(object):
    def __getattribute__(self, name):
        return self

    def __call__(self, *args, **kwargs):
        return self
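
# A minimal sketch (not part of PLY) of routing diagnostics through PlyLogger:
# any object with a write() method will do, sys.stderr is just an example
# destination, and the resulting logger can be handed to the yacc() table
# builder through its errorlog/debuglog arguments.
def _example_stderr_logger():
    log = PlyLogger(sys.stderr)
    log.warning('%d shift/reduce conflicts', 2)        # -> "WARNING: 2 shift/reduce conflicts"
    log.info('writing debug output to %s', debug_file)
    return log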

# Exception raised for yacc-related errors
class YaccError(Exception):
    pass

# Format the result message that the parser produces when running in debug mode.
def format_result(r):
    repr_str = repr(r)
    if '\n' in repr_str:
        repr_str = repr(repr_str)
    if len(repr_str) > resultlimit:
        repr_str = repr_str[:resultlimit] + ' ...'
    result = '<%s @ 0x%x> (%s)' % (type(r).__name__, id(r), repr_str)
    return result

# Format stack entries when the parser is running in debug mode
def format_stack_entry(r):
    repr_str = repr(r)
    if '\n' in repr_str:
        repr_str = repr(repr_str)
    if len(repr_str) < 16:
        return repr_str
    else:
        return '<%s @ 0x%x>' % (type(r).__name__, id(r))

# Panic mode error recovery support.   This feature is being reworked--much of the
# code here is to offer a deprecation/backwards compatible transition

_errok = None
_token = None
_restart = None
_warnmsg = '''PLY: Don't use global functions errok(), token(), and restart() in p_error().
Instead, invoke the methods on the associated parser instance:

    def p_error(p):
        ...
        # Use parser.errok(), parser.token(), parser.restart()
        ...

    parser = yacc.yacc()
'''

def errok():
    warnings.warn(_warnmsg)
    return _errok()

def restart():
    warnings.warn(_warnmsg)
    return _restart()

def token():
    warnings.warn(_warnmsg)
    return _token()

# Utility function to call the p_error() function with some deprecation hacks
def call_errorfunc(errorfunc, token, parser):
    global _errok, _token, _restart
    _errok = parser.errok
    _token = parser.token
    _restart = parser.restart
    r = errorfunc(token)
    try:
        del _errok, _token, _restart
    except NameError:
        pass
    return r

#-----------------------------------------------------------------------------
#                        ===  LR Parsing Engine ===
#
# The following classes are used for the LR parser itself.  These are not
# used during table construction and are independent of the actual LR
# table generation algorithm
#-----------------------------------------------------------------------------

# This class is used to hold non-terminal grammar symbols during parsing.
# It normally has the following attributes set:
#        .type       = Grammar symbol type
#        .value      = Symbol value
#        .lineno     = Starting line number
#        .endlineno  = Ending line number (optional, set automatically)
#        .lexpos     = Starting lex position
#        .endlexpos  = Ending lex position (optional, set automatically)

class YaccSymbol:
    def __str__(self):
        return self.type

    def __repr__(self):
        return str(self)

# This class is a wrapper around the objects actually passed to each
# grammar rule.   Index lookup and assignment actually assign the
# .value attribute of the underlying YaccSymbol object.
# The lineno() method returns the line number of a given
# item (or 0 if not defined).   The linespan() method returns
# a tuple of (startline,endline) representing the range of lines
# for a symbol.  The lexspan() method returns a tuple (lexpos,endlexpos)
# representing the range of positional information for a symbol.

class YaccProduction:
    def __init__(self, s, stack=None):
        self.slice = s
        self.stack = stack
        self.lexer = None
        self.parser = None

    def __getitem__(self, n):
        if isinstance(n, slice):
            return [s.value for s in self.slice[n]]
        elif n >= 0:
            return self.slice[n].value
        else:
            return self.stack[n].value

    def __setitem__(self, n, v):
        self.slice[n].value = v

    def __getslice__(self, i, j):
        return [s.value for s in self.slice[i:j]]

    def __len__(self):
        return len(self.slice)

    def lineno(self, n):
        return getattr(self.slice[n], 'lineno', 0)

    def set_lineno(self, n, lineno):
        self.slice[n].lineno = lineno

    def linespan(self, n):
        startline = getattr(self.slice[n], 'lineno', 0)
        endline = getattr(self.slice[n], 'endlineno', startline)
        return startline, endline

    def lexpos(self, n):
        return getattr(self.slice[n], 'lexpos', 0)

    def set_lexpos(self, n, lexpos):
        self.slice[n].lexpos = lexpos

    def lexspan(self, n):
        startpos = getattr(self.slice[n], 'lexpos', 0)
        endpos = getattr(self.slice[n], 'endlexpos', startpos)
        return startpos, endpos

    def error(self):
        raise SyntaxError
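
# A minimal sketch of how a grammar rule sees a YaccProduction (this is a
# hypothetical user rule, not part of this module): indexing reads and writes
# the .value of the underlying symbols, and lineno()/set_lineno() expose and
# propagate position information.
def _example_p_expr(p):
    'expr : expr PLUS term'
    p[0] = p[1] + p[3]               # combine the values of 'expr' and 'term'
    p.set_lineno(0, p.lineno(1))     # carry the left operand's line number upward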

# -----------------------------------------------------------------------------
#                               == LRParser ==
#
# The LR Parsing engine.
# -----------------------------------------------------------------------------

class LRParser:
    def __init__(self, lrtab, errorf):
        self.productions = lrtab.lr_productions
        self.action = lrtab.lr_action
        self.goto = lrtab.lr_goto
        self.errorfunc = errorf
        self.set_defaulted_states()
        self.errorok = True

    def errok(self):
        self.errorok = True

    def restart(self):
        del self.statestack[:]
        del self.symstack[:]
        sym = YaccSymbol()
        sym.type = '$end'
        self.symstack.append(sym)
        self.statestack.append(0)

    # Defaulted state support.
    # This method identifies parser states where there is only one possible reduction action.
    # For such states, the parser can choose to make a rule reduction without consuming
    # the next look-ahead token.  This delayed invocation of the tokenizer can be useful in
    # certain kinds of advanced parsing situations where the lexer and parser interact with
    # each other or change states (i.e., manipulation of scope, lexer states, etc.).
    #
    # See:  http://www.gnu.org/software/bison/manual/html_node/Default-Reductions.html#Default-Reductions
    def set_defaulted_states(self):
        self.defaulted_states = {}
        for state, actions in self.action.items():
            rules = list(actions.values())
            if len(rules) == 1 and rules[0] < 0:
                self.defaulted_states[state] = rules[0]

    def disable_defaulted_states(self):
        self.defaulted_states = {}
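
    # A minimal usage sketch (not part of PLY itself): callers that need the
    # lexer to be consulted for every lookahead token can switch defaulted
    # reductions off on the parser instance returned by yacc.yacc(), e.g.
    #
    #     parser = yacc.yacc()
    #     parser.disable_defaulted_states()
    #     result = parser.parse(data)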

    def parse(self, input=None, lexer=None, debug=False, tracking=False, tokenfunc=None):
        if debug or yaccdevel:
            if isinstance(debug, int):
                debug = PlyLogger(sys.stderr)
            return self.parsedebug(input, lexer, debug, tracking, tokenfunc)
        elif tracking:
            return self.parseopt(input, lexer, debug, tracking, tokenfunc)
        else:
            return self.parseopt_notrack(input, lexer, debug, tracking, tokenfunc)
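
    # A minimal usage sketch (assumes a parser built by yacc.yacc() and a lexer
    # built by lex.lex(); 'data' and 'mylexer' are placeholders):
    #
    #     result = parser.parse(data, lexer=mylexer)      # optimized path
    #     result = parser.parse(data, debug=True)         # full debug trace to stderr
    #     result = parser.parse(data, tracking=True)      # propagate line/position info
    #
    # A caller can also substitute its own token source via the tokenfunc
    # argument.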


    # !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
    # parsedebug().
    #
    # This is the debugging enabled version of parse().  All changes made to the
    # parsing engine should be made here.   Optimized versions of this function
    # are automatically created by the ply/ygen.py script.  This script cuts out
    # sections enclosed in markers such as this:
    #
    #      #--! DEBUG
    #      statements
    #      #--! DEBUG
    #
    # !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!

    def parsedebug(self, input=None, lexer=None, debug=False, tracking=False, tokenfunc=None):
        #--! parsedebug-start
        lookahead = None                         # Current lookahead symbol
        lookaheadstack = []                      # Stack of lookahead symbols
        actions = self.action                    # Local reference to action table (to avoid lookup on self.)
        goto    = self.goto                      # Local reference to goto table (to avoid lookup on self.)
        prod    = self.productions               # Local reference to production list (to avoid lookup on self.)
        defaulted_states = self.defaulted_states # Local reference to defaulted states
        pslice  = YaccProduction(None)           # Production object passed to grammar rules
        errorcount = 0                           # Used during error recovery

        #--! DEBUG
        debug.info('PLY: PARSE DEBUG START')
        #--! DEBUG

        # If no lexer was given, we will try to use the lex module
        if not lexer:
            from . import lex
            lexer = lex.lexer

        # Set up the lexer and parser objects on pslice
        pslice.lexer = lexer
        pslice.parser = self

        # If input was supplied, pass to lexer
        if input is not None:
            lexer.input(input)

        if tokenfunc is None:
            # Tokenize function
            get_token = lexer.token
        else:
            get_token = tokenfunc

        # Set the parser() token method (sometimes used in error recovery)
        self.token = get_token

        # Set up the state and symbol stacks

        statestack = []                # Stack of parsing states
        self.statestack = statestack
        symstack   = []                # Stack of grammar symbols
        self.symstack = symstack

        pslice.stack = symstack         # Put in the production
        errtoken   = None               # Err token

        # The start state is assumed to be (0,$end)

        statestack.append(0)
        sym = YaccSymbol()
        sym.type = '$end'
        symstack.append(sym)
        state = 0
        while True:
            # Get the next symbol on the input.  If a lookahead symbol
            # is already set, we just use that. Otherwise, we'll pull
            # the next token off of the lookaheadstack or from the lexer

            #--! DEBUG
            debug.debug('')
            debug.debug('State  : %s', state)
            #--! DEBUG

            if state not in defaulted_states:
                if not lookahead:
                    if not lookaheadstack:
                        lookahead = get_token()     # Get the next token
                    else:
                        lookahead = lookaheadstack.pop()
                    if not lookahead:
                        lookahead = YaccSymbol()
                        lookahead.type = '$end'

                # Check the action table
                ltype = lookahead.type
                t = actions[state].get(ltype)
            else:
                t = defaulted_states[state]
                #--! DEBUG
                debug.debug('Defaulted state %s: Reduce using %d', state, -t)
                #--! DEBUG

            #--! DEBUG
            debug.debug('Stack  : %s',
                        ('%s . %s' % (' '.join([xx.type for xx in symstack][1:]), str(lookahead))).lstrip())
            #--! DEBUG

            if t is not None:
                if t > 0:
                    # shift a symbol on the stack
                    statestack.append(t)
                    state = t

                    #--! DEBUG
                    debug.debug('Action : Shift and goto state %s', t)
                    #--! DEBUG

                    symstack.append(lookahead)
                    lookahead = None

                    # Decrease error count on successful shift
                    if errorcount:
                        errorcount -= 1
                    continue

                if t < 0:
                    # reduce a symbol on the stack, emit a production
                    p = prod[-t]
                    pname = p.name
                    plen  = p.len

                    # Get production function
                    sym = YaccSymbol()
                    sym.type = pname       # Production name
                    sym.value = None

                    #--! DEBUG
                    if plen:
                        debug.info('Action : Reduce rule [%s] with %s and goto state %d', p.str,
                                   '['+','.join([format_stack_entry(_v.value) for _v in symstack[-plen:]])+']',
                                   goto[statestack[-1-plen]][pname])
                    else:
                        debug.info('Action : Reduce rule [%s] with %s and goto state %d', p.str, [],
                                   goto[statestack[-1]][pname])

                    #--! DEBUG

                    if plen:
                        targ = symstack[-plen-1:]
                        targ[0] = sym

                        #--! TRACKING
                        if tracking:
                            t1 = targ[1]
                            sym.lineno = t1.lineno
                            sym.lexpos = t1.lexpos
                            t1 = targ[-1]
                            sym.endlineno = getattr(t1, 'endlineno', t1.lineno)
                            sym.endlexpos = getattr(t1, 'endlexpos', t1.lexpos)
                        #--! TRACKING

                        # !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
                        # The code enclosed in this section is duplicated
                        # below as a performance optimization.  Make sure
                        # changes get made in both locations.

                        pslice.slice = targ

                        try:
                            # Call the grammar rule with our special slice object
                            del symstack[-plen:]
                            self.state = state
                            p.callable(pslice)
                            del statestack[-plen:]
                            #--! DEBUG
                            debug.info('Result : %s', format_result(pslice[0]))
                            #--! DEBUG
                            symstack.append(sym)
                            state = goto[statestack[-1]][pname]
                            statestack.append(state)
                        except SyntaxError:
                            # If an error was set. Enter error recovery state
                            lookaheadstack.append(lookahead)    # Save the current lookahead token
                            symstack.extend(targ[1:-1])         # Put the production slice back on the stack
                            statestack.pop()                    # Pop back one state (before the reduce)
                            state = statestack[-1]
                            sym.type = 'error'
                            sym.value = 'error'
                            lookahead = sym
                            errorcount = error_count
                            self.errorok = False

                        continue
                        # !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!

                    else:

                        #--! TRACKING
                        if tracking:
                            sym.lineno = lexer.lineno
                            sym.lexpos = lexer.lexpos
                        #--! TRACKING

                        targ = [sym]

                        # !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
                        # The code enclosed in this section is duplicated
                        # above as a performance optimization.  Make sure
                        # changes get made in both locations.

                        pslice.slice = targ

                        try:
                            # Call the grammar rule with our special slice object
                            self.state = state
                            p.callable(pslice)
                            #--! DEBUG
                            debug.info('Result : %s', format_result(pslice[0]))
                            #--! DEBUG
                            symstack.append(sym)
                            state = goto[statestack[-1]][pname]
                            statestack.append(state)
                        except SyntaxError:
                            # If an error was set. Enter error recovery state
                            lookaheadstack.append(lookahead)    # Save the current lookahead token
                            statestack.pop()                    # Pop back one state (before the reduce)
                            state = statestack[-1]
                            sym.type = 'error'
                            sym.value = 'error'
                            lookahead = sym
                            errorcount = error_count
                            self.errorok = False

                        continue
                        # !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!

                if t == 0:
                    n = symstack[-1]
                    result = getattr(n, 'value', None)
                    #--! DEBUG
                    debug.info('Done   : Returning %s', format_result(result))
                    debug.info('PLY: PARSE DEBUG END')
                    #--! DEBUG
                    return result

            if t is None:

                #--! DEBUG
                debug.error('Error  : %s',
                            ('%s . %s' % (' '.join([xx.type for xx in symstack][1:]), str(lookahead))).lstrip())
                #--! DEBUG

                # We have some kind of parsing error here.  To handle
                # this, we are going to push the current token onto
                # the tokenstack and replace it with an 'error' token.
                # If there are any synchronization rules, they may
                # catch it.
                #
                # In addition to pushing the error token, we call
                # the user-defined p_error() function if this is the
                # first syntax error.  This function is only called if
                # errorcount == 0.
                if errorcount == 0 or self.errorok:
                    errorcount = error_count
                    self.errorok = False
                    errtoken = lookahead
                    if errtoken.type == '$end':
                        errtoken = None               # End of file!
                    if self.errorfunc:
                        if errtoken and not hasattr(errtoken, 'lexer'):
                            errtoken.lexer = lexer
                        self.state = state
                        tok = call_errorfunc(self.errorfunc, errtoken, self)
                        if self.errorok:
                            # User must have done some kind of panic
                            # mode recovery on their own.  The
                            # returned token is the next lookahead
                            lookahead = tok
                            errtoken = None
                            continue
                    else:
                        if errtoken:
                            if hasattr(errtoken, 'lineno'):
                                lineno = lookahead.lineno
                            else:
                                lineno = 0
                            if lineno:
                                sys.stderr.write('yacc: Syntax error at line %d, token=%s\n' % (lineno, errtoken.type))
                            else:
                                sys.stderr.write('yacc: Syntax error, token=%s\n' % errtoken.type)
                        else:
                            sys.stderr.write('yacc: Parse error in input. EOF\n')
                            return

                else:
                    errorcount = error_count

                # case 1:  the statestack only has 1 entry on it.  If we're in this state, the
                # entire parse has been rolled back and we're completely hosed.   The token is
                # discarded and we just keep going.

                if len(statestack) <= 1 and lookahead.type != '$end':
                    lookahead = None
                    errtoken = None
                    state = 0
                    # Nuke the pushback stack
                    del lookaheadstack[:]
                    continue

                # case 2: the statestack has a couple of entries on it, but we're
                # at the end of the file. nuke the top entry and generate an error token

                # Start nuking entries on the stack
                if lookahead.type == '$end':
                    # Whoa. We're really hosed here. Bail out
                    return

                if lookahead.type != 'error':
                    sym = symstack[-1]
                    if sym.type == 'error':
                        # Hmmm. Error is on top of stack, we'll just nuke input
                        # symbol and continue
                        #--! TRACKING
                        if tracking:
                            sym.endlineno = getattr(lookahead, 'lineno', sym.lineno)
                            sym.endlexpos = getattr(lookahead, 'lexpos', sym.lexpos)
                        #--! TRACKING
                        lookahead = None
                        continue

                    # Create the error symbol for the first time and make it the new lookahead symbol
                    t = YaccSymbol()
                    t.type = 'error'

                    if hasattr(lookahead, 'lineno'):
                        t.lineno = t.endlineno = lookahead.lineno
                    if hasattr(lookahead, 'lexpos'):
                        t.lexpos = t.endlexpos = lookahead.lexpos
                    t.value = lookahead
                    lookaheadstack.append(lookahead)
                    lookahead = t
                else:
                    sym = symstack.pop()
                    #--! TRACKING
                    if tracking:
                        lookahead.lineno = sym.lineno
                        lookahead.lexpos = sym.lexpos
                    #--! TRACKING
                    statestack.pop()
                    state = statestack[-1]

                continue

            # Call an error function here
            raise RuntimeError('yacc: internal parser error!!!\n')

        #--! parsedebug-end

    # !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
    # parseopt().
    #
    # Optimized version of parse() method.  DO NOT EDIT THIS CODE DIRECTLY!
    # This code is automatically generated by the ply/ygen.py script. Make
    # changes to the parsedebug() method instead.
    # !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!

    def parseopt(self, input=None, lexer=None, debug=False, tracking=False, tokenfunc=None):
        #--! parseopt-start
        lookahead = None                         # Current lookahead symbol
        lookaheadstack = []                      # Stack of lookahead symbols
        actions = self.action                    # Local reference to action table (to avoid lookup on self.)
        goto    = self.goto                      # Local reference to goto table (to avoid lookup on self.)
        prod    = self.productions               # Local reference to production list (to avoid lookup on self.)
        defaulted_states = self.defaulted_states # Local reference to defaulted states
        pslice  = YaccProduction(None)           # Production object passed to grammar rules
        errorcount = 0                           # Used during error recovery


        # If no lexer was given, we will try to use the lex module
        if not lexer:
            from . import lex
            lexer = lex.lexer

        # Set up the lexer and parser objects on pslice
        pslice.lexer = lexer
        pslice.parser = self

        # If input was supplied, pass to lexer
        if input is not None:
            lexer.input(input)

        if tokenfunc is None:
            # Tokenize function
            get_token = lexer.token
        else:
            get_token = tokenfunc

        # Set the parser() token method (sometimes used in error recovery)
        self.token = get_token

        # Set up the state and symbol stacks

        statestack = []                # Stack of parsing states
        self.statestack = statestack
        symstack   = []                # Stack of grammar symbols
        self.symstack = symstack

        pslice.stack = symstack         # Put in the production
        errtoken   = None               # Err token

        # The start state is assumed to be (0,$end)

        statestack.append(0)
        sym = YaccSymbol()
        sym.type = '$end'
        symstack.append(sym)
        state = 0
        while True:
            # Get the next symbol on the input.  If a lookahead symbol
            # is already set, we just use that. Otherwise, we'll pull
            # the next token off of the lookaheadstack or from the lexer


            if state not in defaulted_states:
                if not lookahead:
                    if not lookaheadstack:
                        lookahead = get_token()     # Get the next token
                    else:
                        lookahead = lookaheadstack.pop()
                    if not lookahead:
                        lookahead = YaccSymbol()
                        lookahead.type = '$end'

                # Check the action table
                ltype = lookahead.type
                t = actions[state].get(ltype)
            else:
                t = defaulted_states[state]


            if t is not None:
                if t > 0:
                    # shift a symbol on the stack
                    statestack.append(t)
                    state = t


                    symstack.append(lookahead)
                    lookahead = None

                    # Decrease error count on successful shift
                    if errorcount:
                        errorcount -= 1
                    continue

                if t < 0:
                    # reduce a symbol on the stack, emit a production
                    p = prod[-t]
                    pname = p.name
                    plen  = p.len

                    # Get production function
                    sym = YaccSymbol()
                    sym.type = pname       # Production name
                    sym.value = None


                    if plen:
                        targ = symstack[-plen-1:]
                        targ[0] = sym

                        #--! TRACKING
                        if tracking:
                            t1 = targ[1]
                            sym.lineno = t1.lineno
                            sym.lexpos = t1.lexpos
                            t1 = targ[-1]
                            sym.endlineno = getattr(t1, 'endlineno', t1.lineno)
                            sym.endlexpos = getattr(t1, 'endlexpos', t1.lexpos)
                        #--! TRACKING

                        # !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
                        # The code enclosed in this section is duplicated
                        # below as a performance optimization.  Make sure
                        # changes get made in both locations.

                        pslice.slice = targ

                        try:
                            # Call the grammar rule with our special slice object
                            del symstack[-plen:]
                            self.state = state
                            p.callable(pslice)
                            del statestack[-plen:]
                            symstack.append(sym)
                            state = goto[statestack[-1]][pname]
                            statestack.append(state)
                        except SyntaxError:
                            # If an error was set. Enter error recovery state
                            lookaheadstack.append(lookahead)    # Save the current lookahead token
                            symstack.extend(targ[1:-1])         # Put the production slice back on the stack
                            statestack.pop()                    # Pop back one state (before the reduce)
                            state = statestack[-1]
                            sym.type = 'error'
                            sym.value = 'error'
                            lookahead = sym
                            errorcount = error_count
                            self.errorok = False

                        continue
                        # !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!

                    else:

                        #--! TRACKING
                        if tracking:
                            sym.lineno = lexer.lineno
                            sym.lexpos = lexer.lexpos
                        #--! TRACKING

                        targ = [sym]

                        # !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
                        # The code enclosed in this section is duplicated
                        # above as a performance optimization.  Make sure
                        # changes get made in both locations.

                        pslice.slice = targ

                        try:
                            # Call the grammar rule with our special slice object
                            self.state = state
                            p.callable(pslice)
                            symstack.append(sym)
                            state = goto[statestack[-1]][pname]
                            statestack.append(state)
                        except SyntaxError:
                            # If an error was set. Enter error recovery state
                            lookaheadstack.append(lookahead)    # Save the current lookahead token
                            statestack.pop()                    # Pop back one state (before the reduce)
                            state = statestack[-1]
                            sym.type = 'error'
                            sym.value = 'error'
                            lookahead = sym
                            errorcount = error_count
                            self.errorok = False

                        continue
                        # !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!

                if t == 0:
                    n = symstack[-1]
                    result = getattr(n, 'value', None)
                    return result

            if t is None:


                # We have some kind of parsing error here.  To handle
                # this, we are going to push the current token onto
                # the tokenstack and replace it with an 'error' token.
                # If there are any synchronization rules, they may
                # catch it.
                #
                # In addition to pushing the error token, we call
                # the user-defined p_error() function if this is the
                # first syntax error.  This function is only called if
                # errorcount == 0.
                if errorcount == 0 or self.errorok:
                    errorcount = error_count
                    self.errorok = False
                    errtoken = lookahead
                    if errtoken.type == '$end':
                        errtoken = None               # End of file!
                    if self.errorfunc:
                        if errtoken and not hasattr(errtoken, 'lexer'):
                            errtoken.lexer = lexer
                        self.state = state
                        tok = call_errorfunc(self.errorfunc, errtoken, self)
                        if self.errorok:
                            # User must have done some kind of panic
                            # mode recovery on their own.  The
                            # returned token is the next lookahead
                            lookahead = tok
                            errtoken = None
                            continue
                    else:
                        if errtoken:
                            if hasattr(errtoken, 'lineno'):
                                lineno = lookahead.lineno
                            else:
                                lineno = 0
                            if lineno:
                                sys.stderr.write('yacc: Syntax error at line %d, token=%s\n' % (lineno, errtoken.type))
                            else:
                                sys.stderr.write('yacc: Syntax error, token=%s\n' % errtoken.type)
                        else:
                            sys.stderr.write('yacc: Parse error in input. EOF\n')
                            return

                else:
                    errorcount = error_count

                # case 1:  the statestack only has 1 entry on it.  If we're in this state, the
                # entire parse has been rolled back and we're completely hosed.   The token is
                # discarded and we just keep going.

                if len(statestack) <= 1 and lookahead.type != '$end':
                    lookahead = None
                    errtoken = None
                    state = 0
                    # Nuke the pushback stack
                    del lookaheadstack[:]
                    continue

                # case 2: the statestack has a couple of entries on it, but we're
                # at the end of the file. nuke the top entry and generate an error token

                # Start nuking entries on the stack
                if lookahead.type == '$end':
                    # Whoa. We're really hosed here. Bail out
                    return

                if lookahead.type != 'error':
                    sym = symstack[-1]
                    if sym.type == 'error':
                        # Hmmm. Error is on top of stack, we'll just nuke input
                        # symbol and continue
                        #--! TRACKING
                        if tracking:
                            sym.endlineno = getattr(lookahead, 'lineno', sym.lineno)
                            sym.endlexpos = getattr(lookahead, 'lexpos', sym.lexpos)
                        #--! TRACKING
                        lookahead = None
                        continue

                    # Create the error symbol for the first time and make it the new lookahead symbol
                    t = YaccSymbol()
                    t.type = 'error'

                    if hasattr(lookahead, 'lineno'):
                        t.lineno = t.endlineno = lookahead.lineno
                    if hasattr(lookahead, 'lexpos'):
                        t.lexpos = t.endlexpos = lookahead.lexpos
                    t.value = lookahead
                    lookaheadstack.append(lookahead)
                    lookahead = t
                else:
                    sym = symstack.pop()
                    #--! TRACKING
                    if tracking:
                        lookahead.lineno = sym.lineno
                        lookahead.lexpos = sym.lexpos
                    #--! TRACKING
                    statestack.pop()
                    state = statestack[-1]

                continue

            # Call an error function here
            raise RuntimeError('yacc: internal parser error!!!\n')

        #--! parseopt-end

    # !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
    # parseopt_notrack().
    #
    # Optimized version of parseopt() with line number tracking removed.
    # DO NOT EDIT THIS CODE DIRECTLY. This code is automatically generated
    # by the ply/ygen.py script. Make changes to the parsedebug() method instead.
    # !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!

    def parseopt_notrack(self, input=None, lexer=None, debug=False, tracking=False, tokenfunc=None):
        #--! parseopt-notrack-start
        lookahead = None                         # Current lookahead symbol
        lookaheadstack = []                      # Stack of lookahead symbols
        actions = self.action                    # Local reference to action table (to avoid lookup on self.)
        goto    = self.goto                      # Local reference to goto table (to avoid lookup on self.)
        prod    = self.productions               # Local reference to production list (to avoid lookup on self.)
        defaulted_states = self.defaulted_states # Local reference to defaulted states
        pslice  = YaccProduction(None)           # Production object passed to grammar rules
        errorcount = 0                           # Used during error recovery


        # If no lexer was given, we will try to use the lex module
        if not lexer:
            from . import lex
            lexer = lex.lexer

        # Set up the lexer and parser objects on pslice
        pslice.lexer = lexer
        pslice.parser = self

        # If input was supplied, pass to lexer
        if input is not None:
            lexer.input(input)

        if tokenfunc is None:
            # Tokenize function
            get_token = lexer.token
        else:
            get_token = tokenfunc

        # Set the parser() token method (sometimes used in error recovery)
        self.token = get_token

        # Set up the state and symbol stacks

        statestack = []                # Stack of parsing states
        self.statestack = statestack
        symstack   = []                # Stack of grammar symbols
        self.symstack = symstack

        pslice.stack = symstack         # Put in the production
        errtoken   = None               # Err token

        # The start state is assumed to be (0,$end)

        statestack.append(0)
        sym = YaccSymbol()
        sym.type = '$end'
        symstack.append(sym)
        state = 0
        while True:
            # Get the next symbol on the input.  If a lookahead symbol
            # is already set, we just use that. Otherwise, we'll pull
            # the next token off of the lookaheadstack or from the lexer


            if state not in defaulted_states:
                if not lookahead:
                    if not lookaheadstack:
                        lookahead = get_token()     # Get the next token
                    else:
                        lookahead = lookaheadstack.pop()
                    if not lookahead:
                        lookahead = YaccSymbol()
                        lookahead.type = '$end'

                # Check the action table
                ltype = lookahead.type
                t = actions[state].get(ltype)
            else:
                t = defaulted_states[state]


            if t is not None:
                if t > 0:
                    # shift a symbol on the stack
                    statestack.append(t)
                    state = t


                    symstack.append(lookahead)
                    lookahead = None

                    # Decrease error count on successful shift
                    if errorcount:
                        errorcount -= 1
                    continue

                if t < 0:
                    # reduce a symbol on the stack, emit a production
                    p = prod[-t]
                    pname = p.name
                    plen  = p.len

                    # Get production function
                    sym = YaccSymbol()
                    sym.type = pname       # Production name
                    sym.value = None


                    if plen:
                        targ = symstack[-plen-1:]
                        targ[0] = sym


                        # !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
                        # The code enclosed in this section is duplicated
                        # below as a performance optimization.  Make sure
                        # changes get made in both locations.

                        pslice.slice = targ

                        try:
                            # Call the grammar rule with our special slice object
                            del symstack[-plen:]
                            self.state = state
                            p.callable(pslice)
                            del statestack[-plen:]
                            symstack.append(sym)
                            state = goto[statestack[-1]][pname]
                            statestack.append(state)
                        except SyntaxError:
                            # If an error was set. Enter error recovery state
                            lookaheadstack.append(lookahead)    # Save the current lookahead token
                            symstack.extend(targ[1:-1])         # Put the production slice back on the stack
                            statestack.pop()                    # Pop back one state (before the reduce)
                            state = statestack[-1]
                            sym.type = 'error'
                            sym.value = 'error'
                            lookahead = sym
                            errorcount = error_count
                            self.errorok = False

                        continue
                        # !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!

                    else:


                        targ = [sym]

                        # !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
                        # The code enclosed in this section is duplicated
                        # above as a performance optimization.  Make sure
                        # changes get made in both locations.

                        pslice.slice = targ

                        try:
                            # Call the grammar rule with our special slice object
                            self.state = state
                            p.callable(pslice)
                            symstack.append(sym)
                            state = goto[statestack[-1]][pname]
                            statestack.append(state)
                        except SyntaxError:
                            # If an error was set. Enter error recovery state
                            lookaheadstack.append(lookahead)    # Save the current lookahead token
                            statestack.pop()                    # Pop back one state (before the reduce)
                            state = statestack[-1]
                            sym.type = 'error'
                            sym.value = 'error'
                            lookahead = sym
                            errorcount = error_count
                            self.errorok = False

                        continue
                        # !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!

                if t == 0:
                    n = symstack[-1]
                    result = getattr(n, 'value', None)
                    return result

            if t is None:


                # We have some kind of parsing error here.  To handle
                # this, we are going to push the current token onto
                # the tokenstack and replace it with an 'error' token.
                # If there are any synchronization rules, they may
                # catch it.
                #
                # In addition to pushing the error token, we call
                # the user-defined p_error() function if this is the
                # first syntax error.  This function is only called if
                # errorcount == 0.
                if errorcount == 0 or self.errorok:
                    errorcount = error_count
                    self.errorok = False
                    errtoken = lookahead
                    if errtoken.type == '$end':
                        errtoken = None               # End of file!
                    if self.errorfunc:
                        if errtoken and not hasattr(errtoken, 'lexer'):
                            errtoken.lexer = lexer
                        self.state = state
                        tok = call_errorfunc(self.errorfunc, errtoken, self)
                        if self.errorok:
                            # User must have done some kind of panic
                            # mode recovery on their own.  The
                            # returned token is the next lookahead
                            lookahead = tok
                            errtoken = None
                            continue
                    else:
                        if errtoken:
                            if hasattr(errtoken, 'lineno'):
                                lineno = lookahead.lineno
                            else:
                                lineno = 0
                            if lineno:
                                sys.stderr.write('yacc: Syntax error at line %d, token=%s\n' % (lineno, errtoken.type))
                            else:
                                sys.stderr.write('yacc: Syntax error, token=%s\n' % errtoken.type)
                        else:
                            sys.stderr.write('yacc: Parse error in input. EOF\n')
                            return

                else:
                    errorcount = error_count

                # case 1:  the statestack only has 1 entry on it.  If we're in this state, the
                # entire parse has been rolled back and we're completely hosed.   The token is
                # discarded and we just keep going.

                if len(statestack) <= 1 and lookahead.type != '$end':
                    lookahead = None
                    errtoken = None
                    state = 0
                    # Nuke the pushback stack
                    del lookaheadstack[:]
                    continue

                # case 2: the statestack has a couple of entries on it, but we're
                # at the end of the file. nuke the top entry and generate an error token

                # Start nuking entries on the stack
                if lookahead.type == '$end':
                    # Whoa. We're really hosed here. Bail out
                    return

                if lookahead.type != 'error':
                    sym = symstack[-1]
                    if sym.type == 'error':
                        # Hmmm. Error is on top of stack, we'll just nuke input
                        # symbol and continue
                        lookahead = None
                        continue

                    # Create the error symbol for the first time and make it the new lookahead symbol
                    t = YaccSymbol()
                    t.type = 'error'

                    if hasattr(lookahead, 'lineno'):
                        t.lineno = t.endlineno = lookahead.lineno
                    if hasattr(lookahead, 'lexpos'):
                        t.lexpos = t.endlexpos = lookahead.lexpos
                    t.value = lookahead
                    lookaheadstack.append(lookahead)
                    lookahead = t
                else:
                    sym = symstack.pop()
                    statestack.pop()
                    state = statestack[-1]

                continue

            # Call an error function here
            raise RuntimeError('yacc: internal parser error!!!\n')

        #--! parseopt-notrack-end

# -----------------------------------------------------------------------------
#                          === Grammar Representation ===
#
# The following functions, classes, and variables are used to represent and
# manipulate the rules that make up a grammar.
# -----------------------------------------------------------------------------

# regex matching identifiers
_is_identifier = re.compile(r'^[a-zA-Z0-9_-]+$')

# -----------------------------------------------------------------------------
# class Production:
#
# This class stores the raw information about a single production or grammar rule.
# A grammar rule refers to a specification such as this:
#
#       expr : expr PLUS term
#
# Here are the basic attributes defined on all productions
#
#       name     - Name of the production.  For example 'expr'
#       prod     - A list of symbols on the right side ['expr','PLUS','term']
#       prec     - Production precedence level
#       number   - Production number.
#       func     - Function that executes on reduce
#       file     - File where production function is defined
#       lineno   - Line number where production function is defined
#
# The following attributes are defined or optional.
#
#       len       - Length of the production (number of symbols on right hand side)
#       usyms     - Set of unique symbols found in the production
# -----------------------------------------------------------------------------

class Production(object):
    reduced = 0
    def __init__(self, number, name, prod, precedence=('right', 0), func=None, file='', line=0):
        self.name     = name
        self.prod     = tuple(prod)
        self.number   = number
        self.func     = func
        self.callable = None
        self.file     = file
        self.line     = line
        self.prec     = precedence

        # Internal settings used during table construction

        self.len  = len(self.prod)   # Length of the production

        # Create a list of unique production symbols used in the production
        self.usyms = []
        for s in self.prod:
            if s not in self.usyms:
                self.usyms.append(s)

        # List of all LR items for the production
        self.lr_items = []
        self.lr_next = None

        # Create a string representation
        if self.prod:
            self.str = '%s -> %s' % (self.name, ' '.join(self.prod))
        else:
            self.str = '%s -> <empty>' % self.name

    def __str__(self):
        return self.str

    def __repr__(self):
        return 'Production(' + str(self) + ')'

    def __len__(self):
        return len(self.prod)

    def __nonzero__(self):
        return 1

    def __getitem__(self, index):
        return self.prod[index]

    # Return the nth lr_item from the production (or None if at the end)
    def lr_item(self, n):
        if n > len(self.prod):
            return None
        p = LRItem(self, n)
        # Precompute the list of productions immediately following.
        try:
            p.lr_after = self.Prodnames[p.prod[n+1]]
        except (IndexError, KeyError):
            p.lr_after = []
        try:
            p.lr_before = p.prod[n-1]
        except IndexError:
            p.lr_before = None
        return p

    # Bind the production function name to a callable
    def bind(self, pdict):
        if self.func:
            self.callable = pdict[self.func]
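
# A minimal sketch with made-up grammar data (nothing here comes from a real
# parser specification); it simply exercises the attributes described above.
def _example_production():
    p = Production(number=1, name='expr', prod=['expr', 'PLUS', 'term'],
                   func='p_expr', file='calc.py', line=42)
    assert str(p) == 'expr -> expr PLUS term'
    assert len(p) == 3 and p[1] == 'PLUS'
    p.bind({'p_expr': lambda t: None})   # resolve the stored function name to a callable
    return p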

# This class serves as a minimal stand-in for Production objects when
# reading table data from files.   It only contains information
# actually used by the LR parsing engine, plus some additional
# debugging information.
class MiniProduction(object):
    def __init__(self, str, name, len, func, file, line):
        self.name     = name
        self.len      = len
        self.func     = func
        self.callable = None
        self.file     = file
        self.line     = line
        self.str      = str

    def __str__(self):
        return self.str

    def __repr__(self):
        return 'MiniProduction(%s)' % self.str

    # Bind the production function name to a callable
    def bind(self, pdict):
        if self.func:
            self.callable = pdict[self.func]


# -----------------------------------------------------------------------------
# class LRItem
#
# This class represents a specific stage of parsing a production rule.  For
# example:
#
#       expr : expr . PLUS term
#
# In the above, the "." represents the current location of the parse.  Here
# are the basic attributes:
#
#       name       - Name of the production.  For example 'expr'
#       prod       - A list of symbols on the right side ['expr','.', 'PLUS','term']
#       number     - Production number.
#
#       lr_next    - Next LR item. For example, if the item is 'expr -> expr . PLUS term',
#                    then lr_next refers to 'expr -> expr PLUS . term'
#       lr_index   - LR item index (location of the ".") in the prod list.
#       lookaheads - LALR lookahead symbols for this item
#       len        - Length of the production (number of symbols on right hand side)
#       lr_after    - List of all productions that immediately follow
#       lr_before   - Grammar symbol immediately before
# -----------------------------------------------------------------------------

class LRItem(object):
    def __init__(self, p, n):
        self.name       = p.name
        self.prod       = list(p.prod)
        self.number     = p.number
        self.lr_index   = n
        self.lookaheads = {}
        self.prod.insert(n, '.')
        self.prod       = tuple(self.prod)
        self.len        = len(self.prod)
        self.usyms      = p.usyms

    def __str__(self):
        if self.prod:
            s = '%s -> %s' % (self.name, ' '.join(self.prod))
        else:
            s = '%s -> <empty>' % self.name
        return s

    def __repr__(self):
        return 'LRItem(' + str(self) + ')'
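
    # For example (illustrative only):
    #
    #     p = Production(1, 'expr', ['expr', 'PLUS', 'term'])
    #     str(LRItem(p, 1))   # -> 'expr -> expr . PLUS term'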

# -----------------------------------------------------------------------------
# rightmost_terminal()
#
# Return the rightmost terminal from a list of symbols.  Used in add_production()
# -----------------------------------------------------------------------------
def rightmost_terminal(symbols, terminals):
    i = len(symbols) - 1
    while i >= 0:
        if symbols[i] in terminals:
            return symbols[i]
        i -= 1
    return None
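
# For example (illustrative only):
#
#     rightmost_terminal(['expr', 'PLUS', 'term'], {'PLUS': [], 'NUMBER': []})
#     # -> 'PLUS'   (the only right-hand-side symbol defined as a terminal)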

# -----------------------------------------------------------------------------
#                           === GRAMMAR CLASS ===
#
# The following class represents the contents of the specified grammar along
# with various computed properties such as first sets, follow sets, LR items, etc.
# This data is used for critical parts of the table generation process later.
# -----------------------------------------------------------------------------

class GrammarError(YaccError):
    pass

class Grammar(object):
    def __init__(self, terminals):
        self.Productions  = [None]  # A list of all of the productions.  The first
                                    # entry is always reserved for the purpose of
                                    # building an augmented grammar

        self.Prodnames    = {}      # A dictionary mapping the names of nonterminals to a list of all
                                    # productions of that nonterminal.

        self.Prodmap      = {}      # A dictionary that is only used to detect duplicate
                                    # productions.

        self.Terminals    = {}      # A dictionary mapping the names of terminal symbols to a
                                    # list of the rules where they are used.

        for term in terminals:
            self.Terminals[term] = []

        self.Terminals['error'] = []

        self.Nonterminals = {}      # A dictionary mapping names of nonterminals to a list
                                    # of rule numbers where they are used.

        self.First        = {}      # A dictionary of precomputed FIRST(x) symbols

        self.Follow       = {}      # A dictionary of precomputed FOLLOW(x) symbols

        self.Precedence   = {}      # Precedence rules for each terminal. Contains tuples of the
                                    # form ('right',level) or ('nonassoc', level) or ('left',level)

        self.UsedPrecedence = set() # Precedence rules that were actually used by the grammar.
                                    # This is only used to provide error checking and to generate
                                    # a warning about unused precedence rules.

        self.Start = None           # Starting symbol for the grammar


    def __len__(self):
        return len(self.Productions)

    def __getitem__(self, index):
        return self.Productions[index]

    # -----------------------------------------------------------------------------
    # set_precedence()
    #
    # Sets the precedence for a given terminal. assoc is the associativity such as
    # 'left','right', or 'nonassoc'.  level is a numeric level.
    #
    # -----------------------------------------------------------------------------

    def set_precedence(self, term, assoc, level):
        assert self.Productions == [None], 'Must call set_precedence() before add_production()'
        if term in self.Precedence:
            raise GrammarError('Precedence already specified for terminal %r' % term)
        if assoc not in ['left', 'right', 'nonassoc']:
            raise GrammarError("Associativity must be one of 'left','right', or 'nonassoc'")
        self.Precedence[term] = (assoc, level)
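
    # Illustrative sketch (hypothetical grammar, not part of this module):
    #
    #     g = Grammar(['NUMBER', 'PLUS', 'TIMES'])
    #     g.set_precedence('PLUS',  'left', 1)
    #     g.set_precedence('TIMES', 'left', 2)
    #     # Note: all set_precedence() calls must come before add_production().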

    # -----------------------------------------------------------------------------
    # add_production()
    #
    # Given an action function, this function assembles a production rule and
    # computes its precedence level.
    #
    # The production rule is supplied as a list of symbols.   For example,
    # a rule such as 'expr : expr PLUS term' has a production name of 'expr' and
    # symbols ['expr','PLUS','term'].
    #
    # Precedence is determined by the precedence of the rightmost terminal symbol
    # in the rule, or by the precedence of a terminal specified by %prec.
    #
    # A variety of error checks are performed to make sure production symbols
    # are valid and that %prec is used correctly.
    # -----------------------------------------------------------------------------

    def add_production(self, prodname, syms, func=None, file='', line=0):

        if prodname in self.Terminals:
            raise GrammarError('%s:%d: Illegal rule name %r. Already defined as a token' % (file, line, prodname))
        if prodname == 'error':
            raise GrammarError('%s:%d: Illegal rule name %r. error is a reserved word' % (file, line, prodname))
        if not _is_identifier.match(prodname):
            raise GrammarError('%s:%d: Illegal rule name %r' % (file, line, prodname))

        # Look for literal tokens
        for n, s in enumerate(syms):
            if s[0] in "'\"":
                try:
                    c = eval(s)
                    if (len(c) > 1):
                        raise GrammarError('%s:%d: Literal token %s in rule %r may only be a single character' %
                                           (file, line, s, prodname))
                    if c not in self.Terminals:
                        self.Terminals[c] = []
                    syms[n] = c
                    continue
                except SyntaxError:
                    pass
            if not _is_identifier.match(s) and s != '%prec':
                raise GrammarError('%s:%d: Illegal name %r in rule %r' % (file, line, s, prodname))

        # Determine the precedence level
        if '%prec' in syms:
            if syms[-1] == '%prec':
                raise GrammarError('%s:%d: Syntax error. Nothing follows %%prec' % (file, line))
            if syms[-2] != '%prec':
                raise GrammarError('%s:%d: Syntax error. %%prec can only appear at the end of a grammar rule' %
                                   (file, line))
            precname = syms[-1]
            prodprec = self.Precedence.get(precname)
            if not prodprec:
                raise GrammarError('%s:%d: Nothing known about the precedence of %r' % (file, line, precname))
            else:
                self.UsedPrecedence.add(precname)
            del syms[-2:]     # Drop %prec from the rule
        else:
            # If no %prec, precedence is determined by the rightmost terminal symbol
            precname = rightmost_terminal(syms, self.Terminals)
            prodprec = self.Precedence.get(precname, ('right', 0))

        # See if the rule is already in the rulemap
        map = '%s -> %s' % (prodname, syms)
        if map in self.Prodmap:
            m = self.Prodmap[map]
            raise GrammarError('%s:%d: Duplicate rule %s. ' % (file, line, m) +
                               'Previous definition at %s:%d' % (m.file, m.line))

        # From this point on, everything is valid.  Create a new Production instance
        pnumber  = len(self.Productions)
        if prodname not in self.Nonterminals:
            self.Nonterminals[prodname] = []

        # Add the production number to Terminals and Nonterminals
        for t in syms:
            if t in self.Terminals:
                self.Terminals[t].append(pnumber)
            else:
                if t not in self.Nonterminals:
                    self.Nonterminals[t] = []
                self.Nonterminals[t].append(pnumber)

        # Create a production and add it to the list of productions
        p = Production(pnumber, prodname, syms, prodprec, func, file, line)
        self.Productions.append(p)
        self.Prodmap[map] = p

        # Add to the global productions list
        try:
            self.Prodnames[prodname].append(p)
        except KeyError:
            self.Prodnames[prodname] = [p]

    # -----------------------------------------------------------------------------
    # set_start()
    #
    # Sets the starting symbol and creates the augmented grammar.  Production
    # rule 0 is S' -> start where start is the start symbol.
    # -----------------------------------------------------------------------------

    def set_start(self, start=None):
        if not start:
            start = self.Productions[1].name
        if start not in self.Nonterminals:
            raise GrammarError('start symbol %s undefined' % start)
        self.Productions[0] = Production(0, "S'", [start])
        self.Nonterminals[start].append(0)
        self.Start = start

    # -----------------------------------------------------------------------------
    # find_unreachable()
    #
    # Find all of the nonterminal symbols that can't be reached from the starting
    # symbol.  Returns a list of nonterminals that can't be reached.
    # -----------------------------------------------------------------------------

    def find_unreachable(self):

        # Mark all symbols that are reachable from a symbol s
        def mark_reachable_from(s):
            if s in reachable:
                return
            reachable.add(s)
            for p in self.Prodnames.get(s, []):
                for r in p.prod:
                    mark_reachable_from(r)

        reachable = set()
        mark_reachable_from(self.Productions[0].prod[0])
        return [s for s in self.Nonterminals if s not in reachable]

    # -----------------------------------------------------------------------------
    # infinite_cycles()
    #
    # This function looks at the various parsing rules and tries to detect
    # infinite recursion cycles (grammar rules where there is no possible way
    # to derive a string of only terminals).
    # -----------------------------------------------------------------------------

    def infinite_cycles(self):
        terminates = {}

        # Terminals:
        for t in self.Terminals:
            terminates[t] = True

        terminates['$end'] = True

        # Nonterminals:

        # Initialize to false:
        for n in self.Nonterminals:
            terminates[n] = False

        # Then propagate termination until no change:
        while True:
            some_change = False
            for (n, pl) in self.Prodnames.items():
                # Nonterminal n terminates iff any of its productions terminates.
                for p in pl:
                    # Production p terminates iff all of its rhs symbols terminate.
                    for s in p.prod:
                        if not terminates[s]:
                            # The symbol s does not terminate,
                            # so production p does not terminate.
                            p_terminates = False
                            break
                    else:
                        # didn't break from the loop,
                        # so every symbol s terminates
                        # so production p terminates.
                        p_terminates = True

                    if p_terminates:
                        # symbol n terminates!
                        if not terminates[n]:
                            terminates[n] = True
                            some_change = True
                        # Don't need to consider any more productions for this n.
                        break

            if not some_change:
                break

        infinite = []
        for (s, term) in terminates.items():
            if not term:
                if s not in self.Prodnames and s not in self.Terminals and s != 'error':
                    # s is used-but-not-defined, and we've already warned of that,
                    # so it would be overkill to say that it's also non-terminating.
                    pass
                else:
                    infinite.append(s)

        return infinite

    # -----------------------------------------------------------------------------
    # undefined_symbols()
    #
    # Find all symbols that were used in the grammar, but not defined as tokens or
    # grammar rules.  Returns a list of tuples (sym, prod) where sym is the symbol
    # and prod is the production where the symbol was used.
    # -----------------------------------------------------------------------------
    def undefined_symbols(self):
        result = []
        for p in self.Productions:
            if not p:
                continue

            for s in p.prod:
                if s not in self.Prodnames and s not in self.Terminals and s != 'error':
                    result.append((s, p))
        return result

    # -----------------------------------------------------------------------------
    # unused_terminals()
    #
    # Find all terminals that were defined, but not used by the grammar.  Returns
    # a list of all symbols.
    # -----------------------------------------------------------------------------
    def unused_terminals(self):
        unused_tok = []
        for s, v in self.Terminals.items():
            if s != 'error' and not v:
                unused_tok.append(s)

        return unused_tok

    # ------------------------------------------------------------------------------
    # unused_rules()
    #
    # Find all grammar rules that were defined,  but not used (maybe not reachable)
    # Returns a list of productions.
    # ------------------------------------------------------------------------------

    def unused_rules(self):
        unused_prod = []
        for s, v in self.Nonterminals.items():
            if not v:
                p = self.Prodnames[s][0]
                unused_prod.append(p)
        return unused_prod

    # -----------------------------------------------------------------------------
    # unused_precedence()
    #
    # Returns a list of tuples (term,precedence) corresponding to precedence
    # rules that were never used by the grammar.  term is the name of the terminal
    # on which precedence was applied and precedence is a string such as 'left' or
    # 'right' corresponding to the type of precedence.
    # -----------------------------------------------------------------------------

    def unused_precedence(self):
        unused = []
        for termname in self.Precedence:
            if not (termname in self.Terminals or termname in self.UsedPrecedence):
                unused.append((termname, self.Precedence[termname][0]))

        return unused

    # -------------------------------------------------------------------------
    # _first()
    #
    # Compute the value of FIRST1(beta) where beta is a tuple of symbols.
    #
    # During execution of compute_first(), the result may be incomplete.
    # Afterward (e.g., when called from compute_follow()), it will be complete.
    # -------------------------------------------------------------------------
    def _first(self, beta):

        # We are computing First(x1,x2,x3,...,xn)
        result = []
        for x in beta:
            x_produces_empty = False

            # Add all the non-<empty> symbols of First[x] to the result.
            for f in self.First[x]:
                if f == '<empty>':
                    x_produces_empty = True
                else:
                    if f not in result:
                        result.append(f)

            if x_produces_empty:
                # We have to consider the next x in beta,
                # i.e. stay in the loop.
                pass
            else:
                # We don't have to consider any further symbols in beta.
                break
        else:
            # There was no 'break' from the loop,
            # so x_produces_empty was true for all x in beta,
            # so beta produces empty as well.
            result.append('<empty>')

        return result
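
    # Illustrative sketch: once compute_first() has run on the example grammar
    # above (where First['expr'] == ['NUMBER']),
    #
    #     g._first(('expr', 'PLUS', 'term'))   # -> ['NUMBER']
    #
    # because 'expr' cannot derive <empty>, so the remaining symbols of beta
    # are never consulted.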

    # -------------------------------------------------------------------------
    # compute_first()
    #
    # Compute the value of FIRST1(X) for all symbols
    # -------------------------------------------------------------------------
    def compute_first(self):
        if self.First:
            return self.First

        # Terminals:
        for t in self.Terminals:
            self.First[t] = [t]

        self.First['$end'] = ['$end']

        # Nonterminals:

        # Initialize to the empty set:
        for n in self.Nonterminals:
            self.First[n] = []

        # Then propagate symbols until no change:
        while True:
            some_change = False
            for n in self.Nonterminals:
                for p in self.Prodnames[n]:
                    for f in self._first(p.prod):
                        if f not in self.First[n]:
                            self.First[n].append(f)
                            some_change = True
            if not some_change:
                break

        return self.First
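
    # Illustrative sketch (example grammar above):
    #
    #     g.compute_first()
    #     # First['term'] == ['NUMBER']
    #     # First['expr'] == ['NUMBER']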

    # ---------------------------------------------------------------------
    # compute_follow()
    #
    # Computes all of the follow sets for every non-terminal symbol.  The
    # follow set is the set of all symbols that might follow a given
    # non-terminal.  See the Dragon book, 2nd Ed. p. 189.
    # ---------------------------------------------------------------------
    def compute_follow(self, start=None):
        # If already computed, return the result
        if self.Follow:
            return self.Follow

        # If first sets not computed yet, do that first.
        if not self.First:
            self.compute_first()

        # Add '$end' to the follow list of the start symbol
        for k in self.Nonterminals:
            self.Follow[k] = []

        if not start:
            start = self.Productions[1].name

        self.Follow[start] = ['$end']

        while True:
            didadd = False
            for p in self.Productions[1:]:
                # Here is the production set
                for i, B in enumerate(p.prod):
                    if B in self.Nonterminals:
                        # Okay. We got a non-terminal in a production
                        fst = self._first(p.prod[i+1:])
                        hasempty = False
                        for f in fst:
                            if f != '<empty>' and f not in self.Follow[B]:
                                self.Follow[B].append(f)
                                didadd = True
                            if f == '<empty>':
                                hasempty = True
                        if hasempty or i == (len(p.prod)-1):
                            # Add elements of Follow(p.name) to Follow(B)
                            for f in self.Follow[p.name]:
                                if f not in self.Follow[B]:
                                    self.Follow[B].append(f)
                                    didadd = True
            if not didadd:
                break
        return self.Follow
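
    # Illustrative sketch (example grammar above, start symbol 'expr'):
    #
    #     g.compute_follow()
    #     # Follow['expr'] == ['$end', 'PLUS']
    #     # Follow['term'] == ['$end', 'PLUS']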


    # -----------------------------------------------------------------------------
    # build_lritems()
    #
    # This function walks the list of productions and builds a complete set of the
    # LR items.  The LR items are stored in two ways:  First, each production keeps
    # the complete list of its items in p.lr_items.  Second, the items are chained
    # together through their lr_next attributes.  For example:
    #
    #   E -> E PLUS E
    #
    # Creates the list
    #
    #  [E -> . E PLUS E, E -> E . PLUS E, E -> E PLUS . E, E -> E PLUS E . ]
    # -----------------------------------------------------------------------------

    def build_lritems(self):
        for p in self.Productions:
            lastlri = p
            i = 0
            lr_items = []
            while True:
                if i > len(p):
                    lri = None
                else:
                    lri = LRItem(p, i)
                    # Precompute the list of productions immediately following
                    try:
                        lri.lr_after = self.Prodnames[lri.prod[i+1]]
                    except (IndexError, KeyError):
                        lri.lr_after = []
                    try:
                        lri.lr_before = lri.prod[i-1]
                    except IndexError:
                        lri.lr_before = None

                lastlri.lr_next = lri
                if not lri:
                    break
                lr_items.append(lri)
                lastlri = lri
                i += 1
            p.lr_items = lr_items

# -----------------------------------------------------------------------------
#                            == Class LRTable ==
#
# This class represents a basic table of LR parsing information.
# Methods for generating the tables are not defined here.  They are defined
# in the derived class LRGeneratedTable.
# -----------------------------------------------------------------------------

class VersionError(YaccError):
    pass

class LRTable(object):
    def __init__(self):
        self.lr_action = None
        self.lr_goto = None
        self.lr_productions = None
        self.lr_method = None

    def read_table(self, module):
        if isinstance(module, types.ModuleType):
            parsetab = module
        else:
            exec('import %s' % module)
            parsetab = sys.modules[module]

        if parsetab._tabversion != __tabversion__:
            raise VersionError('yacc table file version is out of date')

        self.lr_action = parsetab._lr_action
        self.lr_goto = parsetab._lr_goto

        self.lr_productions = []
        for p in parsetab._lr_productions:
            self.lr_productions.append(MiniProduction(*p))

        self.lr_method = parsetab._lr_method
        return parsetab._lr_signature

    def read_pickle(self, filename):
        try:
            import cPickle as pickle
        except ImportError:
            import pickle

        if not os.path.exists(filename):
            raise ImportError

        in_f = open(filename, 'rb')

        tabversion = pickle.load(in_f)
        if tabversion != __tabversion__:
            raise VersionError('yacc table file version is out of date')
        self.lr_method = pickle.load(in_f)
        signature      = pickle.load(in_f)
        self.lr_action = pickle.load(in_f)
        self.lr_goto   = pickle.load(in_f)
        productions    = pickle.load(in_f)

        self.lr_productions = []
        for p in productions:
            self.lr_productions.append(MiniProduction(*p))

        in_f.close()
        return signature

    # Bind all production function names to callable objects in pdict
    def bind_callables(self, pdict):
        for p in self.lr_productions:
            p.bind(pdict)


# -----------------------------------------------------------------------------
#                           === LR Generator ===
#
# The following classes and functions are used to generate LR parsing tables on
# a grammar.
# -----------------------------------------------------------------------------

# -----------------------------------------------------------------------------
# digraph()
# traverse()
#
# The following two functions are used to compute set valued functions
# of the form:
#
#     F(x) = F'(x) U U{F(y) | x R y}
#
# This is used to compute the values of Read() sets as well as FOLLOW sets
# in LALR(1) generation.
#
# Inputs:  X    - An input set
#          R    - A relation
#          FP   - Set-valued function
# ------------------------------------------------------------------------------

def digraph(X, R, FP):
    N = {}
    for x in X:
        N[x] = 0
    stack = []
    F = {}
    for x in X:
        if N[x] == 0:
            traverse(x, N, stack, F, X, R, FP)
    return F

def traverse(x, N, stack, F, X, R, FP):
    stack.append(x)
    d = len(stack)
    N[x] = d
    F[x] = FP(x)             # F(X) <- F'(x)

    rel = R(x)               # Get y's related to x
    for y in rel:
        if N[y] == 0:
            traverse(y, N, stack, F, X, R, FP)
        N[x] = min(N[x], N[y])
        for a in F.get(y, []):
            if a not in F[x]:
                F[x].append(a)
    if N[x] == d:
        N[stack[-1]] = MAXINT
        F[stack[-1]] = F[x]
        element = stack.pop()
        while element != x:
            N[stack[-1]] = MAXINT
            F[stack[-1]] = F[x]
            element = stack.pop()
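
# Illustrative sketch (hypothetical relation, not part of this module):
#
#     X  = ['a', 'b']
#     R  = lambda x: ['b'] if x == 'a' else []
#     FP = lambda x: {'a': ['x'], 'b': ['y']}[x]
#     digraph(X, R, FP)   # -> {'a': ['x', 'y'], 'b': ['y']}
#
# F('a') picks up F'('a') plus everything in F('b') because 'a' R 'b'.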

class LALRError(YaccError):
    pass

# -----------------------------------------------------------------------------
#                             == LRGeneratedTable ==
#
# This class implements the LR table generation algorithm.  There are no
# public methods except for write()
# -----------------------------------------------------------------------------

class LRGeneratedTable(LRTable):
    def __init__(self, grammar, method='LALR', log=None):
        if method not in ['SLR', 'LALR']:
            raise LALRError('Unsupported method %s' % method)

        self.grammar = grammar
        self.lr_method = method

        # Set up the logger
        if not log:
            log = NullLogger()
        self.log = log

        # Internal attributes
        self.lr_action     = {}        # Action table
        self.lr_goto       = {}        # Goto table
        self.lr_productions  = grammar.Productions    # Grammar Production array (shared reference)
        self.lr_goto_cache = {}        # Cache of computed gotos
        self.lr0_cidhash   = {}        # Cache of closures

        self._add_count    = 0         # Internal counter used to detect cycles

        # Diagnostic information filled in by the table generator
        self.sr_conflict   = 0
        self.rr_conflict   = 0
        self.conflicts     = []        # List of conflicts

        self.sr_conflicts  = []
        self.rr_conflicts  = []

        # Build the tables
        self.grammar.build_lritems()
        self.grammar.compute_first()
        self.grammar.compute_follow()
        self.lr_parse_table()

    # Compute the LR(0) closure operation on I, where I is a set of LR(0) items.

    def lr0_closure(self, I):
        self._add_count += 1

        # Add everything in I to J
        J = I[:]
        didadd = True
        while didadd:
            didadd = False
            for j in J:
                for x in j.lr_after:
                    if getattr(x, 'lr0_added', 0) == self._add_count:
                        continue
                    # Add B --> .G to J
                    J.append(x.lr_next)
                    x.lr0_added = self._add_count
                    didadd = True

        return J

    # Compute the LR(0) goto function goto(I,X) where I is a set
    # of LR(0) items and X is a grammar symbol.   This function is written
    # in a way that guarantees uniqueness of the generated goto sets
    # (i.e. the same goto set will never be returned as two different Python
    # objects).  With uniqueness, we can later do fast set comparisons using
    # id(obj) instead of element-wise comparison.

    def lr0_goto(self, I, x):
        # First we look for a previously cached entry
        g = self.lr_goto_cache.get((id(I), x))
        if g:
            return g

        # Now we generate the goto set in a way that guarantees uniqueness
        # of the result

        s = self.lr_goto_cache.get(x)
        if not s:
            s = {}
            self.lr_goto_cache[x] = s

        gs = []
        for p in I:
            n = p.lr_next
            if n and n.lr_before == x:
                s1 = s.get(id(n))
                if not s1:
                    s1 = {}
                    s[id(n)] = s1
                gs.append(n)
                s = s1
        g = s.get('$end')
        if not g:
            if gs:
                g = self.lr0_closure(gs)
                s['$end'] = g
            else:
                s['$end'] = gs
        self.lr_goto_cache[(id(I), x)] = g
        return g

    # Compute the LR(0) sets of item function
    def lr0_items(self):
        C = [self.lr0_closure([self.grammar.Productions[0].lr_next])]
        i = 0
        for I in C:
            self.lr0_cidhash[id(I)] = i
            i += 1

        # Loop over the items in C and each grammar symbol
        i = 0
        while i < len(C):
            I = C[i]
            i += 1

            # Collect all of the symbols that could possibly be in the goto(I,X) sets
            asyms = {}
            for ii in I:
                for s in ii.usyms:
                    asyms[s] = None

            for x in asyms:
                g = self.lr0_goto(I, x)
                if not g or id(g) in self.lr0_cidhash:
                    continue
                self.lr0_cidhash[id(g)] = len(C)
                C.append(g)

        return C

    # -----------------------------------------------------------------------------
    #                       ==== LALR(1) Parsing ====
    #
    # LALR(1) parsing is almost exactly the same as SLR except that instead of
    # relying upon Follow() sets when performing reductions, a more selective
    # lookahead set that incorporates the state of the LR(0) machine is utilized.
    # Thus, we mainly just have to focus on calculating the lookahead sets.
    #
    # The method used here is due to DeRemer and Pennello (1982).
    #
    # DeRemer, F. L., and T. J. Pennello: "Efficient Computation of LALR(1)
    #     Lookahead Sets", ACM Transactions on Programming Languages and Systems,
    #     Vol. 4, No. 4, Oct. 1982, pp. 615-649
    #
    # Further details can also be found in:
    #
    #  J. Tremblay and P. Sorenson, "The Theory and Practice of Compiler Writing",
    #      McGraw-Hill Book Company, (1985).
    #
    # -----------------------------------------------------------------------------

    # -----------------------------------------------------------------------------
    # compute_nullable_nonterminals()
    #
    # Creates a dictionary containing all of the non-terminals that might produce
    # an empty production.
    # -----------------------------------------------------------------------------

    def compute_nullable_nonterminals(self):
        nullable = set()
        num_nullable = 0
        while True:
            for p in self.grammar.Productions[1:]:
                if p.len == 0:
                    nullable.add(p.name)
                    continue
                for t in p.prod:
                    if t not in nullable:
                        break
                else:
                    nullable.add(p.name)
            if len(nullable) == num_nullable:
                break
            num_nullable = len(nullable)
        return nullable
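
    # Illustrative sketch: given the rules
    #
    #     opt  : <empty>
    #     list : opt opt
    #
    # both 'opt' and 'list' end up in the returned set, since every
    # right-hand-side symbol of 'list' is itself nullable.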

    # -----------------------------------------------------------------------------
    # find_nonterminal_transitions(C)
    #
    # Given a set of LR(0) items, this function finds all of the non-terminal
    # transitions.    These are transitions in which a dot appears immediately before
    # a non-terminal.   Returns a list of tuples of the form (state,N) where state
    # is the state number and N is the nonterminal symbol.
    #
    # The input C is the set of LR(0) items.
    # -----------------------------------------------------------------------------

    def find_nonterminal_transitions(self, C):
        trans = []
        for stateno, state in enumerate(C):
            for p in state:
                if p.lr_index < p.len - 1:
                    t = (stateno, p.prod[p.lr_index+1])
                    if t[1] in self.grammar.Nonterminals:
                        if t not in trans:
                            trans.append(t)
        return trans

    # -----------------------------------------------------------------------------
    # dr_relation()
    #
    # Computes the DR(p,A) relationships for non-terminal transitions.  The input
    # is a tuple (state,N) where state is a number and N is a nonterminal symbol.
    #
    # Returns a list of terminals.
    # -----------------------------------------------------------------------------

    def dr_relation(self, C, trans, nullable):
        state, N = trans
        terms = []

        g = self.lr0_goto(C[state], N)
        for p in g:
            if p.lr_index < p.len - 1:
                a = p.prod[p.lr_index+1]
                if a in self.grammar.Terminals:
                    if a not in terms:
                        terms.append(a)

        # This extra bit is to handle the start state
        if state == 0 and N == self.grammar.Productions[0].prod[0]:
            terms.append('$end')

        return terms

    # -----------------------------------------------------------------------------
    # reads_relation()
    #
    # Computes the READS() relation (p,A) READS (t,C).
    # -----------------------------------------------------------------------------

    def reads_relation(self, C, trans, empty):
        # Look for empty transitions
        rel = []
        state, N = trans

        g = self.lr0_goto(C[state], N)
        j = self.lr0_cidhash.get(id(g), -1)
        for p in g:
            if p.lr_index < p.len - 1:
                a = p.prod[p.lr_index + 1]
                if a in empty:
                    rel.append((j, a))

        return rel

    # -----------------------------------------------------------------------------
    # compute_lookback_includes()
    #
    # Determines the lookback and includes relations
    #
    # LOOKBACK:
    #
    # This relation is determined by running the LR(0) state machine forward.
    # For example, starting with a production "N : . A B C", we run it forward
    # to obtain "N : A B C ."   We then build a relationship between this final
    # state and the starting state.   These relationships are stored in a dictionary
    # lookdict.
    #
    # INCLUDES:
    #
    # Computes the INCLUDE() relation (p,A) INCLUDES (p',B).
    #
    # This relation is used to determine non-terminal transitions that occur
    # inside of other non-terminal transition states.   (p,A) INCLUDES (p', B)
    # if the following holds:
    #
    #       B -> LAT, where T -> epsilon and p' -L-> p
    #
    # L is essentially a prefix (which may be empty), T is a suffix that must be
    # able to derive an empty string.  State p' must lead to state p with the string L.
    #
    # -----------------------------------------------------------------------------

    def compute_lookback_includes(self, C, trans, nullable):
        lookdict = {}          # Dictionary of lookback relations
        includedict = {}       # Dictionary of include relations

        # Make a dictionary of non-terminal transitions
        dtrans = {}
        for t in trans:
            dtrans[t] = 1

        # Loop over all transitions and compute lookbacks and includes
        for state, N in trans:
            lookb = []
            includes = []
            for p in C[state]:
                if p.name != N:
                    continue

                # Okay, we have a name match.  We now follow the production all the way
                # through the state machine until we get the . on the right hand side

                lr_index = p.lr_index
                j = state
                while lr_index < p.len - 1:
                    lr_index = lr_index + 1
                    t = p.prod[lr_index]

                    # Check to see if this symbol and state are a non-terminal transition
                    if (j, t) in dtrans:
                        # Yes.  Okay, there is some chance that this is an includes relation;
                        # the only way to know for certain is to check whether the rest of the
                        # production derives empty.

                        li = lr_index + 1
                        while li < p.len:
                            if p.prod[li] in self.grammar.Terminals:
                                break      # No, forget it
                            if p.prod[li] not in nullable:
                                break
                            li = li + 1
                        else:
                            # Appears to be a relation between (j,t) and (state,N)
                            includes.append((j, t))

                    g = self.lr0_goto(C[j], t)               # Go to next set
                    j = self.lr0_cidhash.get(id(g), -1)      # Go to next state

                # When we get here, j is the final state, now we have to locate the production
                for r in C[j]:
                    if r.name != p.name:
                        continue
                    if r.len != p.len:
                        continue
                    i = 0
                    # This loop is comparing a production ". A B C" with "A B C ."
                    while i < r.lr_index:
                        if r.prod[i] != p.prod[i+1]:
                            break
                        i = i + 1
                    else:
                        lookb.append((j, r))
            for i in includes:
                if i not in includedict:
                    includedict[i] = []
                includedict[i].append((state, N))
            lookdict[(state, N)] = lookb

        return lookdict, includedict

    # -----------------------------------------------------------------------------
    # compute_read_sets()
    #
    # Given a set of LR(0) items, this function computes the read sets.
    #
    # Inputs:  C        =  Set of LR(0) items
    #          ntrans   = Set of nonterminal transitions
    #          nullable = Set of nullable non-terminals
    #
    # Returns a dictionary mapping each nonterminal transition to its read set
    # -----------------------------------------------------------------------------

    def compute_read_sets(self, C, ntrans, nullable):
        FP = lambda x: self.dr_relation(C, x, nullable)
        R =  lambda x: self.reads_relation(C, x, nullable)
        F = digraph(ntrans, R, FP)
        return F

    # -----------------------------------------------------------------------------
    # compute_follow_sets()
    #
    # Given a set of LR(0) items, a set of non-terminal transitions, a readset,
    # and an include set, this function computes the follow sets
    #
    # Follow(p,A) = Read(p,A) U U {Follow(p',B) | (p,A) INCLUDES (p',B)}
    #
    # Inputs:
    #            ntrans     = Set of nonterminal transitions
    #            readsets   = Readset (previously computed)
    #            inclsets   = Include sets (previously computed)
    #
    # Returns a dictionary mapping each nonterminal transition to its follow set
    # -----------------------------------------------------------------------------

    def compute_follow_sets(self, ntrans, readsets, inclsets):
        FP = lambda x: readsets[x]
        R  = lambda x: inclsets.get(x, [])
        F = digraph(ntrans, R, FP)
        return F

    # -----------------------------------------------------------------------------
    # add_lookaheads()
    #
    # Attaches the lookahead symbols to grammar rules.
    #
    # Inputs:    lookbacks         -  Set of lookback relations
    #            followset         -  Computed follow set
    #
    # This function directly attaches the lookaheads to productions contained
    # in the lookbacks set
    # -----------------------------------------------------------------------------

    def add_lookaheads(self, lookbacks, followset):
        for trans, lb in lookbacks.items():
            # Loop over productions in lookback
            for state, p in lb:
                if state not in p.lookaheads:
                    p.lookaheads[state] = []
                f = followset.get(trans, [])
                for a in f:
                    if a not in p.lookaheads[state]:
                        p.lookaheads[state].append(a)

    # -----------------------------------------------------------------------------
    # add_lalr_lookaheads()
    #
    # This function does all of the work of adding lookahead information for use
    # with LALR parsing
    # -----------------------------------------------------------------------------

    def add_lalr_lookaheads(self, C):
        # Determine all of the nullable nonterminals
        nullable = self.compute_nullable_nonterminals()

        # Find all non-terminal transitions
        trans = self.find_nonterminal_transitions(C)

        # Compute read sets
        readsets = self.compute_read_sets(C, trans, nullable)

        # Compute lookback/includes relations
        lookd, included = self.compute_lookback_includes(C, trans, nullable)

        # Compute LALR FOLLOW sets
        followsets = self.compute_follow_sets(trans, readsets, included)

        # Add all of the lookaheads
        self.add_lookaheads(lookd, followsets)

    # -----------------------------------------------------------------------------
    # lr_parse_table()
    #
    # This function constructs the parse tables for SLR or LALR
    # -----------------------------------------------------------------------------
    def lr_parse_table(self):
        Productions = self.grammar.Productions
        Precedence  = self.grammar.Precedence
        goto   = self.lr_goto         # Goto array
        action = self.lr_action       # Action array
        log    = self.log             # Logger for output

        actionp = {}                  # Action production array (temporary)

        log.info('Parsing method: %s', self.lr_method)

        # Step 1: Construct C = { I0, I1, ... IN}, collection of LR(0) items
        # This determines the number of states

        C = self.lr0_items()

        if self.lr_method == 'LALR':
            self.add_lalr_lookaheads(C)

        # Build the parser table, state by state
        st = 0
        for I in C:
            # Loop over each production in I
            actlist = []              # List of actions
            st_action  = {}
            st_actionp = {}
            st_goto    = {}
            log.info('')
            log.info('state %d', st)
            log.info('')
            for p in I:
                log.info('    (%d) %s', p.number, p)
            log.info('')

            for p in I:
                if p.len == p.lr_index + 1:
                    if p.name == "S'":
                        # Start symbol. Accept!
                        st_action['$end'] = 0
                        st_actionp['$end'] = p
                    else:
                        # We are at the end of a production.  Reduce!
                        if self.lr_method == 'LALR':
                            laheads = p.lookaheads[st]
                        else:
                            laheads = self.grammar.Follow[p.name]
                        for a in laheads:
                            actlist.append((a, p, 'reduce using rule %d (%s)' % (p.number, p)))
                            r = st_action.get(a)
                            if r is not None:
                                # Whoa. Have a shift/reduce or reduce/reduce conflict
                                if r > 0:
                                    # Need to decide on shift or reduce here
                                    # By default we favor shifting. Need to add
                                    # some precedence rules here.

                                    # Shift precedence comes from the token
                                    sprec, slevel = Precedence.get(a, ('right', 0))

                                    # Reduce precedence comes from rule being reduced (p)
                                    rprec, rlevel = Productions[p.number].prec

                                    if (slevel < rlevel) or ((slevel == rlevel) and (rprec == 'left')):
                                        # We really need to reduce here.
                                        st_action[a] = -p.number
                                        st_actionp[a] = p
                                        if not slevel and not rlevel:
                                            log.info('  ! shift/reduce conflict for %s resolved as reduce', a)
                                            self.sr_conflicts.append((st, a, 'reduce'))
                                        Productions[p.number].reduced += 1
                                    elif (slevel == rlevel) and (rprec == 'nonassoc'):
                                        st_action[a] = None
                                    else:
                                        # Hmmm. Guess we'll keep the shift
                                        if not rlevel:
                                            log.info('  ! shift/reduce conflict for %s resolved as shift', a)
                                            self.sr_conflicts.append((st, a, 'shift'))
                                elif r < 0:
                                    # Reduce/reduce conflict.   In this case, we favor the rule
                                    # that was defined first in the grammar file
                                    oldp = Productions[-r]
                                    pp = Productions[p.number]
                                    if oldp.line > pp.line:
                                        st_action[a] = -p.number
                                        st_actionp[a] = p
                                        chosenp, rejectp = pp, oldp
                                        Productions[p.number].reduced += 1
                                        Productions[oldp.number].reduced -= 1
                                    else:
                                        chosenp, rejectp = oldp, pp
                                    self.rr_conflicts.append((st, chosenp, rejectp))
                                    log.info('  ! reduce/reduce conflict for %s resolved using rule %d (%s)',
                                             a, st_actionp[a].number, st_actionp[a])
                                else:
                                    raise LALRError('Unknown conflict in state %d' % st)
                            else:
                                st_action[a] = -p.number
                                st_actionp[a] = p
                                Productions[p.number].reduced += 1
                else:
                    i = p.lr_index
                    a = p.prod[i+1]       # Get symbol right after the "."
                    if a in self.grammar.Terminals:
                        g = self.lr0_goto(I, a)
                        j = self.lr0_cidhash.get(id(g), -1)
                        if j >= 0:
                            # We are in a shift state
                            actlist.append((a, p, 'shift and go to state %d' % j))
                            r = st_action.get(a)
                            if r is not None:
                                # Whoa. Have a shift/reduce or shift/shift conflict
                                if r > 0:
                                    if r != j:
                                        raise LALRError('Shift/shift conflict in state %d' % st)
                                elif r < 0:
                                    # Do a precedence check.
                                    #   -  if precedence of reduce rule is higher, we reduce.
                                    #   -  if precedence of reduce is same and left assoc, we reduce.
                                    #   -  otherwise we shift

                                    # Shift precedence comes from the token
                                    sprec, slevel = Precedence.get(a, ('right', 0))

                                    # Reduce precedence comes from the rule that could have been reduced
                                    rprec, rlevel = Productions[st_actionp[a].number].prec

                                    if (slevel > rlevel) or ((slevel == rlevel) and (rprec == 'right')):
                                        # We decide to shift here... highest precedence to shift
                                        Productions[st_actionp[a].number].reduced -= 1
                                        st_action[a] = j
                                        st_actionp[a] = p
                                        if not rlevel:
                                            log.info('  ! shift/reduce conflict for %s resolved as shift', a)
                                            self.sr_conflicts.append((st, a, 'shift'))
                                    elif (slevel == rlevel) and (rprec == 'nonassoc'):
                                        st_action[a] = None
                                    else:
                                        # Hmmm. Guess we'll keep the reduce
                                        if not slevel and not rlevel:
                                            log.info('  ! shift/reduce conflict for %s resolved as reduce', a)
                                            self.sr_conflicts.append((st, a, 'reduce'))

                                else:
                                    raise LALRError('Unknown conflict in state %d' % st)
                            else:
                                st_action[a] = j
                                st_actionp[a] = p

            # Print the actions associated with each terminal
            _actprint = {}
            for a, p, m in actlist:
                if a in st_action:
                    if p is st_actionp[a]:
                        log.info('    %-15s %s', a, m)
                        _actprint[(a, m)] = 1
            log.info('')
            # Print the actions that were not used. (debugging)
            not_used = 0
            for a, p, m in actlist:
                if a in st_action:
                    if p is not st_actionp[a]:
                        if not (a, m) in _actprint:
                            log.debug('  ! %-15s [ %s ]', a, m)
                            not_used = 1
                            _actprint[(a, m)] = 1
            if not_used:
                log.debug('')

            # Construct the goto table for this state

            nkeys = {}
            for ii in I:
                for s in ii.usyms:
                    if s in self.grammar.Nonterminals:
                        nkeys[s] = None
            for n in nkeys:
                g = self.lr0_goto(I, n)
                j = self.lr0_cidhash.get(id(g), -1)
                if j >= 0:
                    st_goto[n] = j
                    log.info('    %-30s shift and go to state %d', n, j)

            action[st] = st_action
            actionp[st] = st_actionp
            goto[st] = st_goto
            st += 1

    # -----------------------------------------------------------------------------
    # write_table()
    #
    # This function writes the LR parsing tables to a file
    # -----------------------------------------------------------------------------

    def write_table(self, tabmodule, outputdir='', signature=''):
        if isinstance(tabmodule, types.ModuleType):
            raise IOError("Won't overwrite existing tabmodule")

        basemodulename = tabmodule.split('.')[-1]
        filename = os.path.join(outputdir, basemodulename) + '.py'
        try:
            f = open(filename, 'w')

            f.write('''
# %s
# This file is automatically generated. Do not edit.
# pylint: disable=W,C,R
_tabversion = %r

_lr_method = %r

_lr_signature = %r
    ''' % (os.path.basename(filename), __tabversion__, self.lr_method, signature))

            # Change smaller to 0 to go back to original tables
            smaller = 1

            # Factor out names to try and make smaller
            if smaller:
                items = {}

                for s, nd in self.lr_action.items():
                    for name, v in nd.items():
                        i = items.get(name)
                        if not i:
                            i = ([], [])
                            items[name] = i
                        i[0].append(s)
                        i[1].append(v)

                f.write('\n_lr_action_items = {')
                for k, v in items.items():
                    f.write('%r:([' % k)
                    for i in v[0]:
                        f.write('%r,' % i)
                    f.write('],[')
                    for i in v[1]:
                        f.write('%r,' % i)

                    f.write(']),')
                f.write('}\n')

                f.write('''
_lr_action = {}
for _k, _v in _lr_action_items.items():
   for _x,_y in zip(_v[0],_v[1]):
      if not _x in _lr_action:  _lr_action[_x] = {}
      _lr_action[_x][_k] = _y
del _lr_action_items
''')

            else:
                f.write('\n_lr_action = { ')
                for k, v in self.lr_action.items():
                    f.write('(%r,%r):%r,' % (k[0], k[1], v))
                f.write('}\n')

            if smaller:
                # Factor out names to try and make smaller
                items = {}

                for s, nd in self.lr_goto.items():
                    for name, v in nd.items():
                        i = items.get(name)
                        if not i:
                            i = ([], [])
                            items[name] = i
                        i[0].append(s)
                        i[1].append(v)

                f.write('\n_lr_goto_items = {')
                for k, v in items.items():
                    f.write('%r:([' % k)
                    for i in v[0]:
                        f.write('%r,' % i)
                    f.write('],[')
                    for i in v[1]:
                        f.write('%r,' % i)

                    f.write(']),')
                f.write('}\n')

                f.write('''
_lr_goto = {}
for _k, _v in _lr_goto_items.items():
   for _x, _y in zip(_v[0], _v[1]):
       if not _x in _lr_goto: _lr_goto[_x] = {}
       _lr_goto[_x][_k] = _y
del _lr_goto_items
''')
            else:
                f.write('\n_lr_goto = { ')
                for k, v in self.lr_goto.items():
                    f.write('(%r,%r):%r,' % (k[0], k[1], v))
                f.write('}\n')

            # Write production table
            f.write('_lr_productions = [\n')
            for p in self.lr_productions:
                if p.func:
                    f.write('  (%r,%r,%d,%r,%r,%d),\n' % (p.str, p.name, p.len,
                                                          p.func, os.path.basename(p.file), p.line))
                else:
                    f.write('  (%r,%r,%d,None,None,None),\n' % (str(p), p.name, p.len))
            f.write(']\n')
            f.close()

        except IOError as e:
            raise


    # -----------------------------------------------------------------------------
    # pickle_table()
    #
    # This function pickles the LR parsing tables to a supplied file object
    # -----------------------------------------------------------------------------

    def pickle_table(self, filename, signature=''):
        try:
            import cPickle as pickle
        except ImportError:
            import pickle
        with open(filename, 'wb') as outf:
            pickle.dump(__tabversion__, outf, pickle_protocol)
            pickle.dump(self.lr_method, outf, pickle_protocol)
            pickle.dump(signature, outf, pickle_protocol)
            pickle.dump(self.lr_action, outf, pickle_protocol)
            pickle.dump(self.lr_goto, outf, pickle_protocol)

            outp = []
            for p in self.lr_productions:
                if p.func:
                    outp.append((p.str, p.name, p.len, p.func, os.path.basename(p.file), p.line))
                else:
                    outp.append((str(p), p.name, p.len, None, None, None))
            pickle.dump(outp, outf, pickle_protocol)

# -----------------------------------------------------------------------------
#                            === INTROSPECTION ===
#
# The following functions and classes are used to implement the PLY
# introspection features followed by the yacc() function itself.
# -----------------------------------------------------------------------------

# -----------------------------------------------------------------------------
# get_caller_module_dict()
#
# This function returns a dictionary containing all of the symbols defined within
# a caller further down the call stack.  This is used to get the environment
# associated with the yacc() call if none was provided.
# -----------------------------------------------------------------------------

def get_caller_module_dict(levels):
    f = sys._getframe(levels)
    ldict = f.f_globals.copy()
    if f.f_globals != f.f_locals:
        ldict.update(f.f_locals)
    return ldict

# -----------------------------------------------------------------------------
# parse_grammar()
#
# This takes a raw grammar rule string and parses it into production data
# -----------------------------------------------------------------------------
def parse_grammar(doc, file, line):
    grammar = []
    # Split the doc string into lines
    pstrings = doc.splitlines()
    lastp = None
    dline = line
    for ps in pstrings:
        dline += 1
        p = ps.split()
        if not p:
            continue
        try:
            if p[0] == '|':
                # This is a continuation of a previous rule
                if not lastp:
                    raise SyntaxError("%s:%d: Misplaced '|'" % (file, dline))
                prodname = lastp
                syms = p[1:]
            else:
                prodname = p[0]
                lastp = prodname
                syms   = p[2:]
                assign = p[1]
                if assign != ':' and assign != '::=':
                    raise SyntaxError("%s:%d: Syntax error. Expected ':'" % (file, dline))

            grammar.append((file, dline, prodname, syms))
        except SyntaxError:
            raise
        except Exception:
            raise SyntaxError('%s:%d: Syntax error in rule %r' % (file, dline, ps.strip()))

    return grammar
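
# Illustrative sketch (not part of PLY): parse_grammar() turns a rule
# docstring into one (file, line, prodname, syms) tuple per alternative.
# The rule text and file name below are made up for demonstration.
def _parse_grammar_example():
    doc = '''expression : expression PLUS term
                        | term'''
    # Returns tuples such as
    #   ('<example>', 2, 'expression', ['expression', 'PLUS', 'term'])
    #   ('<example>', 3, 'expression', ['term'])
    return parse_grammar(doc, '<example>', 1)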

# -----------------------------------------------------------------------------
# ParserReflect()
#
# This class represents information extracted for building a parser including
# start symbol, error function, tokens, precedence list, action functions,
# etc.
# -----------------------------------------------------------------------------
class ParserReflect(object):
    def __init__(self, pdict, log=None):
        self.pdict      = pdict
        self.start      = None
        self.error_func = None
        self.tokens     = None
        self.modules    = set()
        self.grammar    = []
        self.error      = False

        if log is None:
            self.log = PlyLogger(sys.stderr)
        else:
            self.log = log

    # Get all of the basic information
    def get_all(self):
        self.get_start()
        self.get_error_func()
        self.get_tokens()
        self.get_precedence()
        self.get_pfunctions()

    # Validate all of the information
    def validate_all(self):
        self.validate_start()
        self.validate_error_func()
        self.validate_tokens()
        self.validate_precedence()
        self.validate_pfunctions()
        self.validate_modules()
        return self.error

    # Compute a signature over the grammar
    def signature(self):
        parts = []
        try:
            if self.start:
                parts.append(self.start)
            if self.prec:
                parts.append(''.join([''.join(p) for p in self.prec]))
            if self.tokens:
                parts.append(' '.join(self.tokens))
            for f in self.pfuncs:
                if f[3]:
                    parts.append(f[3])
        except (TypeError, ValueError):
            pass
        return ''.join(parts)

    # -----------------------------------------------------------------------------
    # validate_modules()
    #
    # This method checks to see if there are duplicated p_rulename() functions
    # in the parser module file.  Without this function, it is really easy for
    # users to make mistakes by cutting and pasting code fragments (and it's a real
    # bugger to try and figure out why the resulting parser doesn't work).  Therefore,
    # we just do a little regular expression pattern matching of def statements
    # to try and detect duplicates.
    # -----------------------------------------------------------------------------

    def validate_modules(self):
        # Match def p_funcname(
        fre = re.compile(r'\s*def\s+(p_[a-zA-Z_0-9]*)\(')

        for module in self.modules:
            try:
                lines, linen = inspect.getsourcelines(module)
            except IOError:
                continue

            counthash = {}
            for linen, line in enumerate(lines):
                linen += 1
                m = fre.match(line)
                if m:
                    name = m.group(1)
                    prev = counthash.get(name)
                    if not prev:
                        counthash[name] = linen
                    else:
                        filename = inspect.getsourcefile(module)
                        self.log.warning('%s:%d: Function %s redefined. Previously defined on line %d',
                                         filename, linen, name, prev)

    # Get the start symbol
    def get_start(self):
        self.start = self.pdict.get('start')

    # Validate the start symbol
    def validate_start(self):
        if self.start is not None:
            if not isinstance(self.start, string_types):
                self.log.error("'start' must be a string")

    # Look for error handler
    def get_error_func(self):
        self.error_func = self.pdict.get('p_error')

    # Validate the error function
    def validate_error_func(self):
        if self.error_func:
            if isinstance(self.error_func, types.FunctionType):
                ismethod = 0
            elif isinstance(self.error_func, types.MethodType):
                ismethod = 1
            else:
                self.log.error("'p_error' defined, but is not a function or method")
                self.error = True
                return

            eline = self.error_func.__code__.co_firstlineno
            efile = self.error_func.__code__.co_filename
            module = inspect.getmodule(self.error_func)
            self.modules.add(module)

            argcount = self.error_func.__code__.co_argcount - ismethod
            if argcount != 1:
                self.log.error('%s:%d: p_error() requires 1 argument', efile, eline)
                self.error = True

    # Get the tokens map
    def get_tokens(self):
        tokens = self.pdict.get('tokens')
        if not tokens:
            self.log.error('No token list is defined')
            self.error = True
            return

        if not isinstance(tokens, (list, tuple)):
            self.log.error('tokens must be a list or tuple')
            self.error = True
            return

        if not tokens:
            self.log.error('tokens is empty')
            self.error = True
            return

        self.tokens = sorted(tokens)

    # Validate the tokens
    def validate_tokens(self):
        # Validate the tokens.
        if 'error' in self.tokens:
            self.log.error("Illegal token name 'error'. Is a reserved word")
            self.error = True
            return

        terminals = set()
        for n in self.tokens:
            if n in terminals:
                self.log.warning('Token %r multiply defined', n)
            terminals.add(n)

    # Get the precedence map (if any)
    def get_precedence(self):
        self.prec = self.pdict.get('precedence')

    # Validate and parse the precedence map
    def validate_precedence(self):
        preclist = []
        if self.prec:
            if not isinstance(self.prec, (list, tuple)):
                self.log.error('precedence must be a list or tuple')
                self.error = True
                return
            for level, p in enumerate(self.prec):
                if not isinstance(p, (list, tuple)):
                    self.log.error('Bad precedence table')
                    self.error = True
                    return

                if len(p) < 2:
                    self.log.error('Malformed precedence entry %s. Must be (assoc, term, ..., term)', p)
                    self.error = True
                    return
                assoc = p[0]
                if not isinstance(assoc, string_types):
                    self.log.error('precedence associativity must be a string')
                    self.error = True
                    return
                for term in p[1:]:
                    if not isinstance(term, string_types):
                        self.log.error('precedence items must be strings')
                        self.error = True
                        return
                    preclist.append((term, assoc, level+1))
        self.preclist = preclist

    # Get all p_functions from the grammar
    def get_pfunctions(self):
        p_functions = []
        for name, item in self.pdict.items():
            if not name.startswith('p_') or name == 'p_error':
                continue
            if isinstance(item, (types.FunctionType, types.MethodType)):
                line = getattr(item, 'co_firstlineno', item.__code__.co_firstlineno)
                module = inspect.getmodule(item)
                p_functions.append((line, module, name, item.__doc__))

        # Sort all of the actions by line number; make sure to stringify
        # modules to make them sortable, since `line` may not uniquely sort all
        # p functions
        p_functions.sort(key=lambda p_function: (
            p_function[0],
            str(p_function[1]),
            p_function[2],
            p_function[3]))
        self.pfuncs = p_functions

    # Validate all of the p_functions
    def validate_pfunctions(self):
        grammar = []
        # Check for non-empty symbols
        if len(self.pfuncs) == 0:
            self.log.error('no rules of the form p_rulename are defined')
            self.error = True
            return

        for line, module, name, doc in self.pfuncs:
            file = inspect.getsourcefile(module)
            func = self.pdict[name]
            if isinstance(func, types.MethodType):
                reqargs = 2
            else:
                reqargs = 1
            if func.__code__.co_argcount > reqargs:
                self.log.error('%s:%d: Rule %r has too many arguments', file, line, func.__name__)
                self.error = True
            elif func.__code__.co_argcount < reqargs:
                self.log.error('%s:%d: Rule %r requires an argument', file, line, func.__name__)
                self.error = True
            elif not func.__doc__:
                self.log.warning('%s:%d: No documentation string specified in function %r (ignored)',
                                 file, line, func.__name__)
            else:
                try:
                    parsed_g = parse_grammar(doc, file, line)
                    for g in parsed_g:
                        grammar.append((name, g))
                except SyntaxError as e:
                    self.log.error(str(e))
                    self.error = True

                # Looks like a valid grammar rule
                # Mark the file in which defined.
                self.modules.add(module)

        # Secondary validation step that looks for p_ definitions that are not functions
        # or functions that look like they might be grammar rules.

        for n, v in self.pdict.items():
            if n.startswith('p_') and isinstance(v, (types.FunctionType, types.MethodType)):
                continue
            if n.startswith('t_'):
                continue
            if n.startswith('p_') and n != 'p_error':
                self.log.warning('%r not defined as a function', n)
            if ((isinstance(v, types.FunctionType) and v.__code__.co_argcount == 1) or
                   (isinstance(v, types.MethodType) and v.__func__.__code__.co_argcount == 2)):
                if v.__doc__:
                    try:
                        doc = v.__doc__.split(' ')
                        if doc[1] == ':':
                            self.log.warning('%s:%d: Possible grammar rule %r defined without p_ prefix',
                                             v.__code__.co_filename, v.__code__.co_firstlineno, n)
                    except IndexError:
                        pass

        self.grammar = grammar

# -----------------------------------------------------------------------------
# yacc(module)
#
# Build a parser
# -----------------------------------------------------------------------------

def yacc(method='LALR', debug=yaccdebug, module=None, tabmodule=tab_module, start=None,
         check_recursion=True, optimize=False, write_tables=True, debugfile=debug_file,
         outputdir=None, debuglog=None, errorlog=None, picklefile=None):

    if tabmodule is None:
        tabmodule = tab_module

    # Reference to the parsing method of the last built parser
    global parse

    # If pickling is enabled, table files are not created
    if picklefile:
        write_tables = 0

    if errorlog is None:
        errorlog = PlyLogger(sys.stderr)

    # Get the module dictionary used for the parser
    if module:
        _items = [(k, getattr(module, k)) for k in dir(module)]
        pdict = dict(_items)
        # If no __file__ or __package__ attributes are available, try to obtain them
        # from the __module__ instead
        if '__file__' not in pdict:
            pdict['__file__'] = sys.modules[pdict['__module__']].__file__
        if '__package__' not in pdict and '__module__' in pdict:
            if hasattr(sys.modules[pdict['__module__']], '__package__'):
                pdict['__package__'] = sys.modules[pdict['__module__']].__package__
    else:
        pdict = get_caller_module_dict(2)

    if outputdir is None:
        # If no output directory is set, the location of the output files
        # is determined according to the following rules:
        #     - If tabmodule specifies a package, files go into that package directory
        #     - Otherwise, files go in the same directory as the specifying module
        if isinstance(tabmodule, types.ModuleType):
            srcfile = tabmodule.__file__
        else:
            if '.' not in tabmodule:
                srcfile = pdict['__file__']
            else:
                parts = tabmodule.split('.')
                pkgname = '.'.join(parts[:-1])
                exec('import %s' % pkgname)
                srcfile = getattr(sys.modules[pkgname], '__file__', '')
        outputdir = os.path.dirname(srcfile)

    # Determine if the module providing the parser is part of a package.
    # If so, fix the tabmodule setting so that tables load correctly
    pkg = pdict.get('__package__')
    if pkg and isinstance(tabmodule, str):
        if '.' not in tabmodule:
            tabmodule = pkg + '.' + tabmodule



    # Set start symbol if it's specified directly using an argument
    if start is not None:
        pdict['start'] = start

    # Collect parser information from the dictionary
    pinfo = ParserReflect(pdict, log=errorlog)
    pinfo.get_all()

    if pinfo.error:
        raise YaccError('Unable to build parser')

    # Check signature against table files (if any)
    signature = pinfo.signature()

    # Read the tables
    try:
        lr = LRTable()
        if picklefile:
            read_signature = lr.read_pickle(picklefile)
        else:
            read_signature = lr.read_table(tabmodule)
        if optimize or (read_signature == signature):
            try:
                lr.bind_callables(pinfo.pdict)
                parser = LRParser(lr, pinfo.error_func)
                parse = parser.parse
                return parser
            except Exception as e:
                errorlog.warning('There was a problem loading the table file: %r', e)
    except VersionError as e:
        errorlog.warning(str(e))
    except ImportError:
        pass

    if debuglog is None:
        if debug:
            try:
                debuglog = PlyLogger(open(os.path.join(outputdir, debugfile), 'w'))
            except IOError as e:
                errorlog.warning("Couldn't open %r. %s" % (debugfile, e))
                debuglog = NullLogger()
        else:
            debuglog = NullLogger()

    debuglog.info('Created by PLY version %s (http://www.dabeaz.com/ply)', __version__)

    errors = False

    # Validate the parser information
    if pinfo.validate_all():
        raise YaccError('Unable to build parser')

    if not pinfo.error_func:
        errorlog.warning('no p_error() function is defined')

    # Create a grammar object
    grammar = Grammar(pinfo.tokens)

    # Set precedence level for terminals
    for term, assoc, level in pinfo.preclist:
        try:
            grammar.set_precedence(term, assoc, level)
        except GrammarError as e:
            errorlog.warning('%s', e)

    # Add productions to the grammar
    for funcname, gram in pinfo.grammar:
        file, line, prodname, syms = gram
        try:
            grammar.add_production(prodname, syms, funcname, file, line)
        except GrammarError as e:
            errorlog.error('%s', e)
            errors = True

    # Set the grammar start symbols
    try:
        if start is None:
            grammar.set_start(pinfo.start)
        else:
            grammar.set_start(start)
    except GrammarError as e:
        errorlog.error(str(e))
        errors = True

    if errors:
        raise YaccError('Unable to build parser')

    # Verify the grammar structure
    undefined_symbols = grammar.undefined_symbols()
    for sym, prod in undefined_symbols:
        errorlog.error('%s:%d: Symbol %r used, but not defined as a token or a rule', prod.file, prod.line, sym)
        errors = True

    unused_terminals = grammar.unused_terminals()
    if unused_terminals:
        debuglog.info('')
        debuglog.info('Unused terminals:')
        debuglog.info('')
        for term in unused_terminals:
            errorlog.warning('Token %r defined, but not used', term)
            debuglog.info('    %s', term)

    # Print out all productions to the debug log
    if debug:
        debuglog.info('')
        debuglog.info('Grammar')
        debuglog.info('')
        for n, p in enumerate(grammar.Productions):
            debuglog.info('Rule %-5d %s', n, p)

    # Find unused non-terminals
    unused_rules = grammar.unused_rules()
    for prod in unused_rules:
        errorlog.warning('%s:%d: Rule %r defined, but not used', prod.file, prod.line, prod.name)

    if len(unused_terminals) == 1:
        errorlog.warning('There is 1 unused token')
    if len(unused_terminals) > 1:
        errorlog.warning('There are %d unused tokens', len(unused_terminals))

    if len(unused_rules) == 1:
        errorlog.warning('There is 1 unused rule')
    if len(unused_rules) > 1:
        errorlog.warning('There are %d unused rules', len(unused_rules))

    if debug:
        debuglog.info('')
        debuglog.info('Terminals, with rules where they appear')
        debuglog.info('')
        terms = list(grammar.Terminals)
        terms.sort()
        for term in terms:
            debuglog.info('%-20s : %s', term, ' '.join([str(s) for s in grammar.Terminals[term]]))

        debuglog.info('')
        debuglog.info('Nonterminals, with rules where they appear')
        debuglog.info('')
        nonterms = list(grammar.Nonterminals)
        nonterms.sort()
        for nonterm in nonterms:
            debuglog.info('%-20s : %s', nonterm, ' '.join([str(s) for s in grammar.Nonterminals[nonterm]]))
        debuglog.info('')

    if check_recursion:
        unreachable = grammar.find_unreachable()
        for u in unreachable:
            errorlog.warning('Symbol %r is unreachable', u)

        infinite = grammar.infinite_cycles()
        for inf in infinite:
            errorlog.error('Infinite recursion detected for symbol %r', inf)
            errors = True

    unused_prec = grammar.unused_precedence()
    for term, assoc in unused_prec:
        errorlog.error('Precedence rule %r defined for unknown symbol %r', assoc, term)
        errors = True

    if errors:
        raise YaccError('Unable to build parser')

    # Run the LRGeneratedTable on the grammar
    if debug:
        errorlog.debug('Generating %s tables', method)

    lr = LRGeneratedTable(grammar, method, debuglog)

    if debug:
        num_sr = len(lr.sr_conflicts)

        # Report shift/reduce and reduce/reduce conflicts
        if num_sr == 1:
            errorlog.warning('1 shift/reduce conflict')
        elif num_sr > 1:
            errorlog.warning('%d shift/reduce conflicts', num_sr)

        num_rr = len(lr.rr_conflicts)
        if num_rr == 1:
            errorlog.warning('1 reduce/reduce conflict')
        elif num_rr > 1:
            errorlog.warning('%d reduce/reduce conflicts', num_rr)

    # Write out conflicts to the output file
    if debug and (lr.sr_conflicts or lr.rr_conflicts):
        debuglog.warning('')
        debuglog.warning('Conflicts:')
        debuglog.warning('')

        for state, tok, resolution in lr.sr_conflicts:
            debuglog.warning('shift/reduce conflict for %s in state %d resolved as %s',  tok, state, resolution)

        already_reported = set()
        for state, rule, rejected in lr.rr_conflicts:
            if (state, id(rule), id(rejected)) in already_reported:
                continue
            debuglog.warning('reduce/reduce conflict in state %d resolved using rule (%s)', state, rule)
            debuglog.warning('rejected rule (%s) in state %d', rejected, state)
            errorlog.warning('reduce/reduce conflict in state %d resolved using rule (%s)', state, rule)
            errorlog.warning('rejected rule (%s) in state %d', rejected, state)
            already_reported.add((state, id(rule), id(rejected)))

        warned_never = []
        for state, rule, rejected in lr.rr_conflicts:
            if not rejected.reduced and (rejected not in warned_never):
                debuglog.warning('Rule (%s) is never reduced', rejected)
                errorlog.warning('Rule (%s) is never reduced', rejected)
                warned_never.append(rejected)

    # Write the table file if requested
    if write_tables:
        try:
            lr.write_table(tabmodule, outputdir, signature)
            if tabmodule in sys.modules:
                del sys.modules[tabmodule]
        except IOError as e:
            errorlog.warning("Couldn't create %r. %s" % (tabmodule, e))

    # Write a pickled version of the tables
    if picklefile:
        try:
            lr.pickle_table(picklefile, signature)
        except IOError as e:
            errorlog.warning("Couldn't create %r. %s" % (picklefile, e))

    # Build the parser
    lr.bind_callables(pinfo.pdict)
    parser = LRParser(lr, pinfo.error_func)

    parse = parser.parse
    return parser
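
# Typical use of this module (an illustrative sketch, not part of PLY itself;
# 'calclex' and the grammar rule below are hypothetical):
#
#   import ply.yacc as yacc
#   from calclex import tokens      # token list exported by the lexer module
#
#   def p_expression_plus(p):
#       'expression : expression PLUS term'
#       p[0] = p[1] + p[3]
#
#   def p_error(p):
#       print("Syntax error at", p)
#
#   parser = yacc.yacc()            # reflects on the calling module, builds tables
#   result = parser.parse("1 + 2")  # uses the most recently built lexer
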
site-packages/sepolgen/refpolicy.py
# Authors: Karl MacMillan <kmacmillan@mentalrootkit.com>
#
# Copyright (C) 2006 Red Hat
# see file 'COPYING' for use and warranty information
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License as
# published by the Free Software Foundation; version 2 only
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software
# Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
#

import string
import selinux

# OVERVIEW
#
# This file contains objects and functions used to represent the reference
# policy (including the headers, M4 macros, and policy language statements).
#
# This representation is very different from the semantic representation
# used in libsepol. Instead, it is a more typical abstract representation
# used by the first stage of compilers. It is basically a parse tree.
#
# This choice is intentional as it allows us to handle the unprocessed
# M4 statements - including the $1 style arguments - and to more easily generate
# the data structures that we need for policy generation.
#

# Constants for referring to fields
SRC_TYPE  = 0
TGT_TYPE  = 1
OBJ_CLASS = 2
PERMS     = 3
ROLE      = 4
DEST_TYPE = 5

# String representations of the above constants
field_to_str = ["source", "target", "object", "permission", "role", "destination" ]
str_to_field = { "source" : SRC_TYPE, "target" : TGT_TYPE, "object" : OBJ_CLASS,
                "permission" : PERMS, "role" : ROLE, "destination" : DEST_TYPE }

# Base Classes

class PolicyBase:
    def __init__(self, parent=None):
        self.parent = parent
        self.comment = None

class Node(PolicyBase):
    """Base class objects produced from parsing the reference policy.

    The Node class is used as the base class for any non-leaf
    object produced by parsing the reference policy. This object
    should contain a reference to its parent (or None for a top-level
    object) and 0 or more children.

    The general idea here is to have a very simple tree structure. Children
    are not separated out by type. Instead the tree structure represents
    fairly closely the real structure of the policy statements.

    The object should be iterable - by default over all children but
    subclasses are free to provide additional iterators over a subset
    of their children (see Interface for example).
    """

    def __init__(self, parent=None):
        PolicyBase.__init__(self, parent)
        self.children = []

    def __iter__(self):
        return iter(self.children)

    # Not all of the iterators will return something on all Nodes, but
    # they won't explode either. Putting them here is just easier.

    # Top level nodes

    def nodes(self):
        return filter(lambda x: isinstance(x, Node), walktree(self))

    def modules(self):
        return filter(lambda x: isinstance(x, Module), walktree(self))

    def interfaces(self):
        return filter(lambda x: isinstance(x, Interface), walktree(self))

    def templates(self):
        return filter(lambda x: isinstance(x, Template), walktree(self))

    def support_macros(self):
        return filter(lambda x: isinstance(x, SupportMacros), walktree(self))

    # Common policy statements

    def module_declarations(self):
        return filter(lambda x: isinstance(x, ModuleDeclaration), walktree(self))

    def interface_calls(self):
        return filter(lambda x: isinstance(x, InterfaceCall), walktree(self))

    def avrules(self):
        return filter(lambda x: isinstance(x, AVRule), walktree(self))

    def avextrules(self):
        return filter(lambda x: isinstance(x, AVExtRule), walktree(self))

    def typerules(self):
        return filter(lambda x: isinstance(x, TypeRule), walktree(self))

    def typebounds(self):
        return filter(lambda x: isinstance(x, TypeBound), walktree(self))

    def typeattributes(self):
        """Iterate over all of the TypeAttribute children of this Interface."""
        return filter(lambda x: isinstance(x, TypeAttribute), walktree(self))

    def roleattributes(self):
        """Iterate over all of the RoleAttribute children of this Interface."""
        return filter(lambda x: isinstance(x, RoleAttribute), walktree(self))

    def requires(self):
        return filter(lambda x: isinstance(x, Require), walktree(self))

    def roles(self):
        return filter(lambda x: isinstance(x, Role), walktree(self))

    def role_allows(self):
        return filter(lambda x: isinstance(x, RoleAllow), walktree(self))

    def role_types(self):
        return filter(lambda x: isinstance(x, RoleType), walktree(self))

    def __str__(self):
        if self.comment:
            return str(self.comment) + "\n" + self.to_string()
        else:
            return self.to_string()

    def __repr__(self):
        return "<%s(%s)>" % (self.__class__.__name__, self.to_string())

    def to_string(self):
        return ""


class Leaf(PolicyBase):
    def __init__(self, parent=None):
        PolicyBase.__init__(self, parent)

    def __str__(self):
        if self.comment:
            return str(self.comment) + "\n" + self.to_string()
        else:
            return self.to_string()

    def __repr__(self):
        return "<%s(%s)>" % (self.__class__.__name__, self.to_string())

    def to_string(self):
        return ""



# Utility functions

def walktree(node, depthfirst=True, showdepth=False, type=None):
    """Iterate over a Node and its Children.

    The walktree function iterates over a tree containing Nodes and
    leaf objects. The iteration can perform a depth first or a breadth
    first traversal of the tree (controlled by the depthfirst
    parameter). The passed in node will be returned.

    This function will only work correctly for trees - arbitrary graphs
    will likely cause infinite looping.
    """
    # We control depth-first versus breadth-first traversal by
    # how we pop items off of the node stack.
    if depthfirst:
        index = -1
    else:
        index = 0

    stack = [(node, 0)]
    while len(stack) > 0:
        cur, depth = stack.pop(index)
        if showdepth:
            yield cur, depth
        else:
            yield cur

        # If the node is not a Node instance it must
        # be a leaf - so no need to add it to the stack
        if isinstance(cur, Node):
            items = []
            i = len(cur.children) - 1
            while i >= 0:
                if type is None or isinstance(cur.children[i], type):
                    items.append((cur.children[i], depth + 1))
                i -= 1

            stack.extend(items)
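
# Illustrative sketch (not part of sepolgen): the Node iterators above
# (avrules(), interfaces(), ...) are thin wrappers around walktree().  The
# helper below does the same thing explicitly for AVRule objects.
def _collect_avrules_example(head):
    # walktree() yields 'head' itself plus every descendant, so a simple
    # isinstance() filter picks out the rules anywhere in the tree.
    return [n for n in walktree(head) if isinstance(n, AVRule)]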

def walknode(node, type=None):
    """Iterate over the direct children of a Node.

    The walknode function iterates over the children of a Node.
    Unlike walktree it does not return the passed in node or
    the children of any Node objects (that is, it does not go
    beyond the current level in the tree).
    """
    for x in node:
        if type is None or isinstance(x, type):
            yield x


def list_to_space_str(s, cont=('{', '}')):
    """Convert a set (or any sequence type) into a string representation
    formatted to match SELinux space separated list conventions.

    For example the list ['read', 'write'] would be converted into:
    '{ read write }'
    """
    l = len(s)
    if l < 1:
        raise ValueError("cannot convert 0 len set to string")
    joined = " ".join(s)
    if l == 1:
        return joined
    else:
        return cont[0] + " " + joined + " " + cont[1]

def list_to_comma_str(s):
    l = len(s)
    if l < 1:
        raise ValueError("cannot conver 0 len set to comma string")

    return ", ".join(s)

# Basic SELinux types

class IdSet(set):
    def __init__(self, list=None):
        if list:
            set.__init__(self, list)
        else:
            set.__init__(self)
        self.compliment = False

    def to_space_str(self):
        return list_to_space_str(sorted(self))

    def to_comma_str(self):
        return list_to_comma_str(sorted(self))
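
# Illustrative sketch (not part of sepolgen): IdSet renders with the SELinux
# space separated list conventions used throughout the generated policy.
def _idset_example():
    perms = IdSet(["write", "read"])
    return perms.to_space_str()    # '{ read write }'; single items get no braces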

class SecurityContext(Leaf):
    """An SELinux security context with optional MCS / MLS fields."""
    def __init__(self, context=None, parent=None):
        """Create a SecurityContext object, optionally from a string.

        Parameters:
           [context] - string representing a security context. Same format
              as a string passed to the from_string method.
        """
        Leaf.__init__(self, parent)
        self.user = ""
        self.role = ""
        self.type = ""
        self.level = None
        if context is not None:
            self.from_string(context)

    def from_string(self, context):
        """Parse a string representing a context into a SecurityContext.

        The string should be in the standard format - e.g.,
        'user:role:type:level'.

        Raises ValueError if the string is not parsable as a security context.
        """
        # try to translate the context string to raw form
        raw = selinux.selinux_trans_to_raw_context(context)
        if raw[0] == 0:
            context = raw[1]

        fields = context.split(":")
        if len(fields) < 3:
            raise ValueError("context string [%s] not in a valid format" % context)

        self.user = fields[0]
        self.role = fields[1]
        self.type = fields[2]
        if len(fields) > 3:
            # FUTURE - normalize level fields to allow more comparisons to succeed.
            self.level = ':'.join(fields[3:])
        else:
            self.level = None

    def __eq__(self, other):
        """Compare two SecurityContext objects - all fields must be exactly the
        same for the comparison to work. It is possible for the level fields
        to be semantically the same yet syntactically different - in this case
        this function will return false.
        """
        return self.user == other.user and \
               self.role == other.role and \
               self.type == other.type and \
               self.level == other.level

    def to_string(self, default_level=None):
        """Return a string representing this security context.

        By default, the string will contain an MCS / MLS level
        potentially from the default which is passed in if none was
        set.

        Arguments:
           default_level - the default level to use if self.level is not
             set.

        Returns:
           A string representing the security context in the form
              'user:role:type:level'.
        """
        fields = [self.user, self.role, self.type]
        if self.level is None:
            if default_level is None:
                if selinux.is_selinux_mls_enabled() == 1:
                    fields.append("s0")
            else:
                fields.append(default_level)
        else:
            fields.append(self.level)
        return ":".join(fields)

class ObjectClass(Leaf):
    """SELinux object class and permissions.

    This class is a basic representation of an SELinux object
    class - it does not represent separate common permissions -
    just the union of the common and class specific permissions.
    It is meant to be convenient for policy generation.
    """
    def __init__(self, name="", parent=None):
        Leaf.__init__(self, parent)
        self.name = name
        self.perms = IdSet()

class XpermSet():
    """Extended permission set.

    This class represents one or more extended permissions
    represented by numeric values or ranges of values. The
    .complement attribute is used to specify all permission
    except those specified.

    Two xperm sets can be merged using the .extend() method.
    """
    def __init__(self, complement=False):
        self.complement = complement
        self.ranges = []

    def __normalize_ranges(self):
        """Ensure that ranges are not overlapping.
        """
        self.ranges.sort()

        i = 0
        while i < len(self.ranges):
            while i + 1 < len(self.ranges):
                if self.ranges[i + 1][0] <= self.ranges[i][1] + 1:
                    self.ranges[i] = (self.ranges[i][0], max(self.ranges[i][1],
                                                             self.ranges[i + 1][1]))
                    del self.ranges[i + 1]
                else:
                    break
            i += 1

    def extend(self, s):
        """Add ranges from an xperm set
        """
        self.ranges.extend(s.ranges)
        self.__normalize_ranges()

    def add(self, minimum, maximum=None):
        """Add value of range of values to the xperm set.
        """
        if maximum is None:
            maximum = minimum
        self.ranges.append((minimum, maximum))
        self.__normalize_ranges()

    def to_string(self):
        if not self.ranges:
            return ""

        compl = "~ " if self.complement else ""

        # print single value without braces
        if len(self.ranges) == 1 and self.ranges[0][0] == self.ranges[0][1]:
            return compl + str(self.ranges[0][0])

        vals = map(lambda x: str(x[0]) if x[0] == x[1] else "%s-%s" % x,
                   self.ranges)

        return "%s{ %s }" % (compl, " ".join(vals))

# Basic statements

class TypeAttribute(Leaf):
    """SElinux typeattribute statement.

    This class represents a typeattribute statement.
    """
    def __init__(self, parent=None):
        Leaf.__init__(self, parent)
        self.type = ""
        self.attributes = IdSet()

    def to_string(self):
        return "typeattribute %s %s;" % (self.type, self.attributes.to_comma_str())

class RoleAttribute(Leaf):
    """SElinux roleattribute statement.

    This class represents a roleattribute statement.
    """
    def __init__(self, parent=None):
        Leaf.__init__(self, parent)
        self.role = ""
        self.roleattributes = IdSet()

    def to_string(self):
        return "roleattribute %s %s;" % (self.role, self.roleattributes.to_comma_str())


class Role(Leaf):
    def __init__(self, parent=None):
        Leaf.__init__(self, parent)
        self.role = ""
        self.types = IdSet()

    def to_string(self):
        s = ""
        for t in self.types:
            s += "role %s types %s;\n" % (self.role, t)
        return s

class Type(Leaf):
    def __init__(self, name="", parent=None):
        Leaf.__init__(self, parent)
        self.name = name
        self.attributes = IdSet()
        self.aliases = IdSet()

    def to_string(self):
        s = "type %s" % self.name
        if len(self.aliases) > 0:
            s = s + "alias %s" % self.aliases.to_space_str()
        if len(self.attributes) > 0:
            s = s + ", %s" % self.attributes.to_comma_str()
        return s + ";"

class TypeAlias(Leaf):
    def __init__(self, parent=None):
        Leaf.__init__(self, parent)
        self.type = ""
        self.aliases = IdSet()

    def to_string(self):
        return "typealias %s alias %s;" % (self.type, self.aliases.to_space_str())

class Attribute(Leaf):
    def __init__(self, name="", parent=None):
        Leaf.__init__(self, parent)
        self.name = name

    def to_string(self):
        return "attribute %s;" % self.name

class Attribute_Role(Leaf):
    def __init__(self, name="", parent=None):
        Leaf.__init__(self, parent)
        self.name = name

    def to_string(self):
        return "attribute_role %s;" % self.name


# Classes representing rules

class AVRule(Leaf):
    """SELinux access vector (AV) rule.

    The AVRule class represents all varieties of AV rules including
    allow, dontaudit, and auditallow (indicated by the flags self.ALLOW,
    self.DONTAUDIT, and self.AUDITALLOW respectively).

    The source and target types, object classes, and perms are all represented
    by sets containing strings. Sets are used to make it simple to add
    strings repeatedly while avoiding duplicates.

    No checking is done to make certain that the symbols are valid or
    consistent (e.g., perms that don't match the object classes). It is
    even possible to put invalid types like '$1' into the rules to allow
    storage of the reference policy interfaces.
    """
    ALLOW = 0
    DONTAUDIT = 1
    AUDITALLOW = 2
    NEVERALLOW = 3

    def __init__(self, av=None, parent=None):
        Leaf.__init__(self, parent)
        self.src_types = IdSet()
        self.tgt_types = IdSet()
        self.obj_classes = IdSet()
        self.perms = IdSet()
        self.rule_type = self.ALLOW
        if av:
            self.from_av(av)

    def __rule_type_str(self):
        if self.rule_type == self.ALLOW:
            return "allow"
        elif self.rule_type == self.DONTAUDIT:
            return "dontaudit"
        elif self.rule_type == self.AUDITALLOW:
            return "auditallow"
        elif self.rule_type == self.NEVERALLOW:
            return "neverallow"

    def from_av(self, av):
        """Add the access from an access vector to this allow
        rule.
        """
        self.src_types.add(av.src_type)
        if av.src_type == av.tgt_type:
            self.tgt_types.add("self")
        else:
            self.tgt_types.add(av.tgt_type)
        self.obj_classes.add(av.obj_class)
        self.perms.update(av.perms)

    def to_string(self):
        """Return a string representation of the rule
        that is a valid policy language representation (assuming
        that the types, object class, etc. are valid).
        """
        return "%s %s %s:%s %s;" % (self.__rule_type_str(),
                                     self.src_types.to_space_str(),
                                     self.tgt_types.to_space_str(),
                                     self.obj_classes.to_space_str(),
                                     self.perms.to_space_str())
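
# Illustrative sketch (not part of sepolgen; the type names are made up):
# assembling an allow rule directly from its component sets.
def _avrule_example():
    rule = AVRule()
    rule.src_types.add("httpd_t")
    rule.tgt_types.add("httpd_log_t")
    rule.obj_classes.add("file")
    rule.perms.update(["read", "getattr"])
    return rule.to_string()    # 'allow httpd_t httpd_log_t:file { getattr read };'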

class AVExtRule(Leaf):
    """Extended permission access vector rule.

    The AVExtRule class represents allowxperm, dontauditxperm,
    auditallowxperm, and neverallowxperm rules.

    The source and target types, and object classes are represented
    by sets containing strings. The operation is a single string,
    e.g. 'ioctl'. Extended permissions are represented by an XpermSet.
    """
    ALLOWXPERM = 0
    DONTAUDITXPERM = 1
    AUDITALLOWXPERM = 2
    NEVERALLOWXPERM = 3

    def __init__(self, av=None, op=None, parent=None):
        Leaf.__init__(self, parent)
        self.src_types = IdSet()
        self.tgt_types = IdSet()
        self.obj_classes = IdSet()
        self.rule_type = self.ALLOWXPERM
        self.xperms = XpermSet()
        self.operation = op
        if av:
            self.from_av(av, op)

    def __rule_type_str(self):
        if self.rule_type == self.ALLOWXPERM:
            return "allowxperm"
        elif self.rule_type == self.DONTAUDITXPERM:
            return "dontauditxperm"
        elif self.rule_type == self.AUDITALLOWXPERM:
            return "auditallowxperm"
        elif self.rule_type == self.NEVERALLOWXPERM:
            return "neverallowxperm"

    def from_av(self, av, op):
        self.src_types.add(av.src_type)
        if av.src_type == av.tgt_type:
            self.tgt_types.add("self")
        else:
            self.tgt_types.add(av.tgt_type)
        self.obj_classes.add(av.obj_class)
        self.operation = op
        self.xperms = av.xperms[op]

    def to_string(self):
        """Return a string representation of the rule that is
        a valid policy language representation (assuming that
        the types, object class, etc. are valid).
        """
        return "%s %s %s:%s %s %s;" % (self.__rule_type_str(),
                                     self.src_types.to_space_str(),
                                     self.tgt_types.to_space_str(),
                                     self.obj_classes.to_space_str(),
                                     self.operation,
                                     self.xperms.to_string())

class TypeRule(Leaf):
    """SELinux type rules.

    This class is very similar to the AVRule class, but is for representing
    the type rules (type_trans, type_change, and type_member). The major
    difference is the lack of perms and the single destination type.
    """
    TYPE_TRANSITION = 0
    TYPE_CHANGE = 1
    TYPE_MEMBER = 2

    def __init__(self, parent=None):
        Leaf.__init__(self, parent)
        self.src_types = IdSet()
        self.tgt_types = IdSet()
        self.obj_classes = IdSet()
        self.dest_type = ""
        self.rule_type = self.TYPE_TRANSITION

    def __rule_type_str(self):
        if self.rule_type == self.TYPE_TRANSITION:
            return "type_transition"
        elif self.rule_type == self.TYPE_CHANGE:
            return "type_change"
        else:
            return "type_member"

    def to_string(self):
        return "%s %s %s:%s %s;" % (self.__rule_type_str(),
                                     self.src_types.to_space_str(),
                                     self.tgt_types.to_space_str(),
                                     self.obj_classes.to_space_str(),
                                     self.dest_type)
class TypeBound(Leaf):
    """SElinux typebound statement.

    This class represents a typebound statement.
    """
    def __init__(self, parent=None):
        Leaf.__init__(self, parent)
        self.type = ""
        self.tgt_types = IdSet()

    def to_string(self):
        return "typebounds %s %s;" % (self.type, self.tgt_types.to_comma_str())


class RoleAllow(Leaf):
    def __init__(self, parent=None):
        Leaf.__init__(self, parent)
        self.src_roles = IdSet()
        self.tgt_roles = IdSet()

    def to_string(self):
        return "allow %s %s;" % (self.src_roles.to_comma_str(),
                                 self.tgt_roles.to_comma_str())

class RoleType(Leaf):
    def __init__(self, parent=None):
        Leaf.__init__(self, parent)
        self.role = ""
        self.types = IdSet()

    def to_string(self):
        s = ""
        for t in self.types:
            s += "role %s types %s;\n" % (self.role, t)
        return s

class ModuleDeclaration(Leaf):
    def __init__(self, parent=None):
        Leaf.__init__(self, parent)
        self.name = ""
        self.version = ""
        self.refpolicy = False

    def to_string(self):
        if self.refpolicy:
            return "policy_module(%s, %s)" % (self.name, self.version)
        else:
            return "module %s %s;" % (self.name, self.version)

class Conditional(Node):
    def __init__(self, parent=None):
        Node.__init__(self, parent)
        self.cond_expr = []

    def to_string(self):
        return "[If %s]" % list_to_space_str(self.cond_expr, cont=("", ""))

class Bool(Leaf):
    def __init__(self, parent=None):
        Leaf.__init__(self, parent)
        self.name = ""
        self.state = False

    def to_string(self):
        s = "bool %s " % self.name
        if self.state:
            return s + "true"
        else:
            return s + "false"

class InitialSid(Leaf):
    def __init__(self, parent=None):
        Leaf.__init__(self, parent)
        self.name = ""
        self.context = None

    def to_string(self):
        return "sid %s %s" % (self.name, str(self.context))

class GenfsCon(Leaf):
    def __init__(self, parent=None):
        Leaf.__init__(self, parent)
        self.filesystem = ""
        self.path = ""
        self.context = None

    def to_string(self):
        return "genfscon %s %s %s" % (self.filesystem, self.path, str(self.context))

class FilesystemUse(Leaf):
    XATTR = 1
    TRANS = 2
    TASK = 3
    
    def __init__(self, parent=None):
        Leaf.__init__(self, parent)
        self.type = self.XATTR
        self.filesystem = ""
        self.context = None

    def to_string(self):
        s = ""
        if self.type == self.XATTR:
            s = "fs_use_xattr"
        elif self.type == self.TRANS:
            s = "fs_use_trans"
        elif self.type == self.TASK:
            s = "fs_use_task"

        return "%s %s %s;" % (s, self.filesystem, str(self.context))

class PortCon(Leaf):
    def __init__(self, parent=None):
        Leaf.__init__(self, parent)
        self.port_type = ""
        self.port_number = ""
        self.context = None

    def to_string(self):
        return "portcon %s %s %s" % (self.port_type, self.port_number, str(self.context))

class NodeCon(Leaf):
    def __init__(self, parent=None):
        Leaf.__init__(self, parent)
        self.start = ""
        self.end = ""
        self.context = None

    def to_string(self):
        return "nodecon %s %s %s" % (self.start, self.end, str(self.context))

class NetifCon(Leaf):
    def __init__(self, parent=None):
        Leaf.__init__(self, parent)
        self.interface = ""
        self.interface_context = None
        self.packet_context = None

    def to_string(self):
        return "netifcon %s %s %s" % (self.interface, str(self.interface_context),
                                   str(self.packet_context))
class PirqCon(Leaf):
    def __init__(self, parent=None):
        Leaf.__init__(self, parent)
        self.pirq_number = ""
        self.context = None

    def to_string(self):
        return "pirqcon %s %s" % (self.pirq_number, str(self.context))

class IomemCon(Leaf):
    def __init__(self, parent=None):
        Leaf.__init__(self, parent)
        self.device_mem = ""
        self.context = None

    def to_string(self):
        return "iomemcon %s %s" % (self.device_mem, str(self.context))

class IoportCon(Leaf):
    def __init__(self, parent=None):
        Leaf.__init__(self, parent)
        self.ioport = ""
        self.context = None

    def to_string(self):
        return "ioportcon %s %s" % (self.ioport, str(self.context))

class PciDeviceCon(Leaf):
    def __init__(self, parent=None):
        Leaf.__init__(self, parent)
        self.device = ""
        self.context = None

    def to_string(self):
        return "pcidevicecon %s %s" % (self.device, str(self.context))

class DeviceTreeCon(Leaf):
    def __init__(self, parent=None):
        Leaf.__init__(self, parent)
        self.path = ""
        self.context = None

    def to_string(self):
        return "devicetreecon %s %s" % (self.path, str(self.context))

# Reference policy specific types

def print_tree(head):
    for node, depth in walktree(head, showdepth=True):
        s = ""
        for i in range(depth):
            s = s + "\t"
        print(s + str(node))


class Headers(Node):
    def __init__(self, parent=None):
        Node.__init__(self, parent)

    def to_string(self):
        return "[Headers]"


class Module(Node):
    def __init__(self, parent=None):
        Node.__init__(self, parent)

    def to_string(self):
        return ""

class Interface(Node):
    """A reference policy interface definition.

    This class represents a reference policy interface definition.
    """
    def __init__(self, name="", parent=None):
        Node.__init__(self, parent)
        self.name = name

    def to_string(self):
        return "[Interface name: %s]" % self.name

class TunablePolicy(Node):
    def __init__(self, parent=None):
        Node.__init__(self, parent)
        self.cond_expr = []

    def to_string(self):
        return "[Tunable Policy %s]" % list_to_space_str(self.cond_expr, cont=("", ""))

class Template(Node):
    def __init__(self, name="", parent=None):
        Node.__init__(self, parent)
        self.name = name

    def to_string(self):
        return "[Template name: %s]" % self.name

class IfDef(Node):
    def __init__(self, name="", parent=None):
        Node.__init__(self, parent)
        self.name = name

    def to_string(self):
        return "[Ifdef name: %s]" % self.name

class InterfaceCall(Leaf):
    def __init__(self, ifname="", parent=None):
        Leaf.__init__(self, parent)
        self.ifname = ifname
        self.args = []
        self.comments = []

    def matches(self, other):
        if self.ifname != other.ifname:
            return False
        if len(self.args) != len(other.args):
            return False
        for a,b in zip(self.args, other.args):
            if a != b:
                return False
        return True

    def to_string(self):
        s = "%s(" % self.ifname
        i = 0
        for a in self.args:
            if isinstance(a, list):
                str = list_to_space_str(a)
            else:
                str = a
                
            if i != 0:
                s = s + ", %s" % str
            else:
                s = s + str
            i += 1
        return s + ")"

class OptionalPolicy(Node):
    def __init__(self, parent=None):
        Node.__init__(self, parent)

    def to_string(self):
        return "[Optional Policy]"

class SupportMacros(Node):
    def __init__(self, parent=None):
        Node.__init__(self, parent)
        self.map = None

    def to_string(self):
        return "[Support Macros]"

    def __expand_perm(self, perm):
        # Recursive expansion - the assumption is that these
        # are ordered correctly so that no macro is used before
        # it is defined
        s = set()
        if perm in self.map:
            for p in self.by_name(perm):
                s.update(self.__expand_perm(p))
        else:
            s.add(perm)
        return s

    def __gen_map(self):
        self.map = {}
        for x in self:
            exp_perms = set()
            for perm in x.perms:
                exp_perms.update(self.__expand_perm(perm))
            self.map[x.name] = exp_perms

    def by_name(self, name):
        if not self.map:
            self.__gen_map()
        return self.map[name]

    def has_key(self, name):
        if not self.map:
            self.__gen_map()
        return name in self.map
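
# Illustrative sketch (not part of sepolgen; the macro name and permissions are
# made up): expanding a support macro.  Real definitions come from the
# refpolicy support macro files, parsed into ObjPermSet children.
def _support_macros_example():
    sm = SupportMacros()
    m = ObjPermSet("rw_file_perms")
    m.perms.update(["read", "write", "getattr"])
    sm.children.append(m)
    return sm.by_name("rw_file_perms")    # {'read', 'write', 'getattr'}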

class Require(Leaf):
    def __init__(self, parent=None):
        Leaf.__init__(self, parent)
        self.types = IdSet()
        self.obj_classes = { }
        self.roles = IdSet()
        self.data = IdSet()
        self.users = IdSet()

    def add_obj_class(self, obj_class, perms):
        p = self.obj_classes.setdefault(obj_class, IdSet())
        p.update(perms)


    def to_string(self):
        s = []
        s.append("require {")
        for type in self.types:
            s.append("\ttype %s;" % type)
        for obj_class, perms in self.obj_classes.items():
            s.append("\tclass %s %s;" % (obj_class, perms.to_space_str()))
        for role in self.roles:
            s.append("\trole %s;" % role)
        for bool in self.data:
            s.append("\tbool %s;" % bool)
        for user in self.users:
            s.append("\tuser %s;" % user)
        s.append("}")

        # Handle empty requires
        if len(s) == 2:
            return ""

        return "\n".join(s)


class ObjPermSet:
    def __init__(self, name):
        self.name = name
        self.perms = IdSet()

    def to_string(self):
        return "define(`%s', `%s')" % (self.name, self.perms.to_space_str())

class ClassMap:
    def __init__(self, obj_class, perms):
        self.obj_class = obj_class
        self.perms = perms

    def to_string(self):
        return self.obj_class + ": " + self.perms

class Comment:
    def __init__(self, l=None):
        if l:
            self.lines = l
        else:
            self.lines = []

    def to_string(self):
        # If there are no lines, treat this as a spacer between
        # policy statements and return a new line.
        if len(self.lines) == 0:
            return ""
        else:
            out = []
            for line in self.lines:
                out.append("#" + line)
            return "\n".join(out)

    def merge(self, other):
        if len(other.lines):
            for line in other.lines:
                if line != "":
                    self.lines.append(line)

    def __str__(self):
        return self.to_string()


site-packages/sepolgen/refparser.py
# Authors: Karl MacMillan <kmacmillan@mentalrootkit.com>
#
# Copyright (C) 2006-2007 Red Hat
# see file 'COPYING' for use and warranty information
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License as
# published by the Free Software Foundation; version 2 only
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software
# Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
#

# OVERVIEW
#
#
# This is a parser for the refpolicy policy "language" - i.e., the
# normal SELinux policy language plus the refpolicy style M4 macro
# constructs on top of that base language. This parser is primarily
# aimed at parsing the policy headers in order to create an abstract
# policy representation suitable for generating policy.
#
# Both the lexer and parser are included in this file. They are implemented
# using the Ply library (included with sepolgen).
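# (Ply builds the lexer from the module-level t_* rules and the parser from
# the grammar productions in the p_* function docstrings; the two are wired
# together in create_globals() below via lex.lex() and yacc.yacc().)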

import sys
import os
import re
import traceback

from . import access
from . import defaults
from . import lex
from . import refpolicy
from . import yacc

# :::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::
#
# lexer
#
# :::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::

tokens = (
    # basic tokens, punctuation
    'TICK',
    'SQUOTE',
    'OBRACE',
    'CBRACE',
    'SEMI',
    'COLON',
    'OPAREN',
    'CPAREN',
    'COMMA',
    'MINUS',
    'TILDE',
    'ASTERISK',
    'AMP',
    'BAR',
    'EXPL',
    'EQUAL',
    'FILENAME',
    'IDENTIFIER',
    'NUMBER',
    'PATH',
    'IPV6_ADDR',
    # reserved words
    #   module
    'MODULE',
    'POLICY_MODULE',
    'REQUIRE',
    #   flask
    'SID',
    'GENFSCON',
    'FS_USE_XATTR',
    'FS_USE_TRANS',
    'FS_USE_TASK',
    'PORTCON',
    'NODECON',
    'NETIFCON',
    'PIRQCON',
    'IOMEMCON',
    'IOPORTCON',
    'PCIDEVICECON',
    'DEVICETREECON',
    #   object classes
    'CLASS',
    #   types and attributes
    'TYPEATTRIBUTE',
    'ROLEATTRIBUTE',
    'TYPE',
    'ATTRIBUTE',
    'ATTRIBUTE_ROLE',
    'ALIAS',
    'TYPEALIAS',
    #   conditional policy
    'BOOL',
    'TRUE',
    'FALSE',
    'IF',
    'ELSE',
    #   users and roles
    'ROLE',
    'TYPES',
    #   rules
    'ALLOW',
    'DONTAUDIT',
    'AUDITALLOW',
    'NEVERALLOW',
    'PERMISSIVE',
    'TYPEBOUNDS',
    'TYPE_TRANSITION',
    'TYPE_CHANGE',
    'TYPE_MEMBER',
    'RANGE_TRANSITION',
    'ROLE_TRANSITION',
    #   refpolicy keywords
    'OPT_POLICY',
    'INTERFACE',
    'TUNABLE_POLICY',
    'GEN_REQ',
    'TEMPLATE',
    'GEN_CONTEXT',
    #   m4
    'IFELSE',
    'IFDEF',
    'IFNDEF',
    'DEFINE'
    )

# All reserved keywords - see t_IDENTIFIER for how these are matched in
# the lexer.
reserved = {
    # module
    'module' : 'MODULE',
    'policy_module' : 'POLICY_MODULE',
    'require' : 'REQUIRE',
    # flask
    'sid' : 'SID',
    'genfscon' : 'GENFSCON',
    'fs_use_xattr' : 'FS_USE_XATTR',
    'fs_use_trans' : 'FS_USE_TRANS',
    'fs_use_task' : 'FS_USE_TASK',
    'portcon' : 'PORTCON',
    'nodecon' : 'NODECON',
    'netifcon' : 'NETIFCON',
    'pirqcon' : 'PIRQCON',
    'iomemcon' : 'IOMEMCON',
    'ioportcon' : 'IOPORTCON',
    'pcidevicecon' : 'PCIDEVICECON',
    'devicetreecon' : 'DEVICETREECON',
    # object classes
    'class' : 'CLASS',
    # types and attributes
    'typeattribute' : 'TYPEATTRIBUTE',
    'roleattribute' : 'ROLEATTRIBUTE',
    'type' : 'TYPE',
    'attribute' : 'ATTRIBUTE',
    'attribute_role' : 'ATTRIBUTE_ROLE',
    'alias' : 'ALIAS',
    'typealias' : 'TYPEALIAS',
    # conditional policy
    'bool' : 'BOOL',
    'true' : 'TRUE',
    'false' : 'FALSE',
    'if' : 'IF',
    'else' : 'ELSE',
    # users and roles
    'role' : 'ROLE',
    'types' : 'TYPES',
    # rules
    'allow' : 'ALLOW',
    'dontaudit' : 'DONTAUDIT',
    'auditallow' : 'AUDITALLOW',
    'neverallow' : 'NEVERALLOW',
    'permissive' : 'PERMISSIVE',
    'typebounds' : 'TYPEBOUNDS',
    'type_transition' : 'TYPE_TRANSITION',
    'type_change' : 'TYPE_CHANGE',
    'type_member' : 'TYPE_MEMBER',
    'range_transition' : 'RANGE_TRANSITION',
    'role_transition' : 'ROLE_TRANSITION',
    # refpolicy keywords
    'optional_policy' : 'OPT_POLICY',
    'interface' : 'INTERFACE',
    'tunable_policy' : 'TUNABLE_POLICY',
    'gen_require' : 'GEN_REQ',
    'template' : 'TEMPLATE',
    'gen_context' : 'GEN_CONTEXT',
    # M4
    'ifelse' : 'IFELSE',
    'ifndef' : 'IFNDEF',
    'ifdef' : 'IFDEF',
    'define' : 'DEFINE'
    }
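
# For example, t_IDENTIFIER below does reserved.get(t.value, 'IDENTIFIER'),
# so 'allow' is tokenized as ALLOW while an ordinary name such as 'httpd_t'
# falls through and remains an IDENTIFIER.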

# The ply lexer allows definition of tokens in 2 ways: regular expressions
# or functions.

# Simple regex tokens
t_TICK      = r'\`'
t_SQUOTE    = r'\''
t_OBRACE    = r'\{'
t_CBRACE    = r'\}'
# This will handle spurious extra ';' characters via the +
t_SEMI      = r'\;+'
t_COLON     = r'\:'
t_OPAREN    = r'\('
t_CPAREN    = r'\)'
t_COMMA     = r'\,'
t_MINUS     = r'\-'
t_TILDE     = r'\~'
t_ASTERISK  = r'\*'
t_AMP       = r'\&'
t_BAR       = r'\|'
t_EXPL      = r'\!'
t_EQUAL     = r'\='
t_NUMBER    = r'[0-9\.]+'
t_PATH      = r'/[a-zA-Z0-9)_\.\*/\$]*'
#t_IPV6_ADDR = r'[a-fA-F0-9]{0,4}:[a-fA-F0-9]{0,4}:([a-fA-F0-9]{0,4}:)*'

# Ignore whitespace - this is a special token for ply that more efficiently
# ignores uninteresting tokens.
t_ignore    = " \t"

# More complex tokens
def t_IPV6_ADDR(t):
    r'[a-fA-F0-9]{0,4}:[a-fA-F0-9]{0,4}:([a-fA-F0-9]|:)*'
    # This is defined as a function simply to force it earlier into
    # the master regex list, so it takes precedence over the simple tokens.
    return t

def t_m4comment(t):
    r'dnl.*\n'
    # Ignore all comments
    t.lexer.lineno += 1

def t_refpolicywarn1(t):
    r'define.*refpolicywarn\(.*\n'
    # Ignore refpolicywarn statements - they sometimes
    # contain text that we can't parse.
    t.skip(1)

def t_refpolicywarn(t):
    r'refpolicywarn\(.*\n'
    # Ignore refpolicywarn statements - they sometimes
    # contain text that we can't parse.
    t.lexer.lineno += 1

def t_IDENTIFIER(t):
    r'[a-zA-Z_\$][a-zA-Z0-9_\-\+\.\$\*~]*'
    # Handle any keywords
    t.type = reserved.get(t.value,'IDENTIFIER')
    return t

def t_FILENAME(t):
    r'\"[a-zA-Z0-9_\-\+\.\$\*~ :]+\"'
    # Handle any keywords
    t.type = reserved.get(t.value,'FILENAME')
    return t

def t_comment(t):
    r'\#.*\n'
    # Ignore all comments
    t.lexer.lineno += 1

def t_error(t):
    print("Illegal character '%s'" % t.value[0])
    t.skip(1)

def t_newline(t):
    r'\n+'
    t.lexer.lineno += len(t.value)

# :::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::
#
# Parser
#
# :::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::

# Global data used during parsing - making it global is easier than
# passing the state through the parsing functions.

#   m is the top-level data structure (stands for modules).
m = None
#   error is either None (indicating no error) or a string error message.
error = None
parse_file = ""
#   spt is the support macros (e.g., obj/perm sets) - it is an instance of
#     refpolicy.SupportMacros and should always be present during parsing
#     though it may not contain any macros.
spt = None
success = True

# utilities
def collect(stmts, parent, val=None):
    if stmts is None:
        return
    for s in stmts:
        if s is None:
            continue
        s.parent = parent
        if val is not None:
            parent.children.insert(0, (val, s))
        else:
            parent.children.insert(0, s)

def expand(ids, s):
    for id in ids:
        if spt.has_key(id):  # noqa
            s.update(spt.by_name(id))
        else:
            s.add(id)
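
# Illustrative example (the macro name is hypothetical): if spt maps
# 'rw_file_perms' to the permissions {'read', 'write', 'getattr', 'open'},
# then after
#
#   s = refpolicy.IdSet()
#   expand(['rw_file_perms', 'lock'], s)
#
# s contains read, write, getattr, open and lock - known macro names are
# expanded through spt.by_name() and unknown names are added verbatim.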

# Top-level non-terminal
def p_statements(p):
    '''statements : statement
                  | statements statement
                  | empty
    '''
    if len(p) == 2 and p[1]:
        m.children.append(p[1])
    elif len(p) > 2 and p[2]:
        m.children.append(p[2])

def p_statement(p):
    '''statement : interface
                 | template
                 | obj_perm_set
                 | policy
                 | policy_module_stmt
                 | module_stmt
    '''
    p[0] = p[1]

def p_empty(p):
    'empty :'
    pass

#
# Reference policy language constructs
#

# This is for the policy module statement (e.g., policy_module(foo,1.2.0)).
# A separate terminal is used here, distinct from both the basic language
# module statement and interface calls, to make the statement easier to identify.
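# For example, "policy_module(apache, 2.2.0)" yields a ModuleDeclaration with
# name 'apache' and version '2.2.0' (the version string is matched by the
# NUMBER token) and refpolicy set to True.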
def p_policy_module_stmt(p):
    'policy_module_stmt : POLICY_MODULE OPAREN IDENTIFIER COMMA NUMBER CPAREN'
    m = refpolicy.ModuleDeclaration()
    m.name = p[3]
    m.version = p[5]
    m.refpolicy = True
    p[0] = m

def p_interface(p):
    '''interface : INTERFACE OPAREN TICK IDENTIFIER SQUOTE COMMA TICK interface_stmts SQUOTE CPAREN
    '''
    x = refpolicy.Interface(p[4])
    collect(p[8], x)
    p[0] = x

def p_template(p):
    '''template : TEMPLATE OPAREN TICK IDENTIFIER SQUOTE COMMA TICK interface_stmts SQUOTE CPAREN
                | DEFINE OPAREN TICK IDENTIFIER SQUOTE COMMA TICK interface_stmts SQUOTE CPAREN
    '''
    x = refpolicy.Template(p[4])
    collect(p[8], x)
    p[0] = x

def p_define(p):
    '''define : DEFINE OPAREN TICK IDENTIFIER SQUOTE CPAREN'''
    # This is for defining single M4 values (to be used later in ifdef statements).
    # Example: define(`sulogin_no_pam'). We don't currently do anything with these
    # but we should in the future when we correctly resolve ifdef statements.
    p[0] = None

def p_interface_stmts(p):
    '''interface_stmts : policy
                       | interface_stmts policy
                       | empty
    '''
    if len(p) == 2 and p[1]:
        p[0] = p[1]
    elif len(p) > 2:
        if not p[1]:
            if p[2]:
                p[0] = p[2]
        elif not p[2]:
            p[0] = p[1]
        else:
            p[0] = p[1] + p[2]

def p_optional_policy(p):
    '''optional_policy : OPT_POLICY OPAREN TICK interface_stmts SQUOTE CPAREN
                       | OPT_POLICY OPAREN TICK interface_stmts SQUOTE COMMA TICK interface_stmts SQUOTE CPAREN
    '''
    o = refpolicy.OptionalPolicy()
    collect(p[4], o, val=True)
    if len(p) > 7:
        collect(p[8], o, val=False)
    p[0] = [o]

def p_tunable_policy(p):
    '''tunable_policy : TUNABLE_POLICY OPAREN TICK cond_expr SQUOTE COMMA TICK interface_stmts SQUOTE CPAREN
                      | TUNABLE_POLICY OPAREN TICK cond_expr SQUOTE COMMA TICK interface_stmts SQUOTE COMMA TICK interface_stmts SQUOTE CPAREN
    '''
    x = refpolicy.TunablePolicy()
    x.cond_expr = p[4]
    collect(p[8], x, val=True)
    if len(p) > 11:
        collect(p[12], x, val=False)
    p[0] = [x]

def p_ifelse(p):
    '''ifelse : IFELSE OPAREN TICK IDENTIFIER SQUOTE COMMA COMMA TICK IDENTIFIER SQUOTE COMMA TICK interface_stmts SQUOTE CPAREN optional_semi
              | IFELSE OPAREN TICK IDENTIFIER SQUOTE COMMA TICK IDENTIFIER SQUOTE COMMA TICK interface_stmts SQUOTE COMMA TICK interface_stmts SQUOTE CPAREN optional_semi
              | IFELSE OPAREN TICK IDENTIFIER SQUOTE COMMA TICK SQUOTE COMMA TICK interface_stmts SQUOTE COMMA TICK interface_stmts SQUOTE CPAREN optional_semi
    '''
#    x = refpolicy.IfDef(p[4])
#    v = True
#    collect(p[8], x, val=v)
#    if len(p) > 12:
#        collect(p[12], x, val=False)
#    p[0] = [x]
    pass


def p_ifdef(p):
    '''ifdef : IFDEF OPAREN TICK IDENTIFIER SQUOTE COMMA TICK statements SQUOTE CPAREN optional_semi
             | IFNDEF OPAREN TICK IDENTIFIER SQUOTE COMMA TICK statements SQUOTE CPAREN optional_semi
             | IFDEF OPAREN TICK IDENTIFIER SQUOTE COMMA TICK statements SQUOTE COMMA TICK statements SQUOTE CPAREN optional_semi
    '''
    x = refpolicy.IfDef(p[4])
    if p[1] == 'ifdef':
        v = True
    else:
        v = False
    collect(p[8], x, val=v)
    if len(p) > 12:
        collect(p[12], x, val=False)
    p[0] = [x]

def p_interface_call(p):
    '''interface_call : IDENTIFIER OPAREN interface_call_param_list CPAREN
                      | IDENTIFIER OPAREN CPAREN
                      | IDENTIFIER OPAREN interface_call_param_list CPAREN SEMI'''
    # Allow spurious semi-colons at the end of interface calls
    i = refpolicy.InterfaceCall(ifname=p[1])
    if len(p) > 4:
        i.args.extend(p[3])
    p[0] = i

def p_interface_call_param(p):
    '''interface_call_param : IDENTIFIER
                            | IDENTIFIER MINUS IDENTIFIER
                            | nested_id_set
                            | TRUE
                            | FALSE
                            | FILENAME
    '''
    # Intentionally let single identifiers pass through unchanged:
    # a list result denotes a set, a non-list result a single identifier.
    if len(p) == 2:
        p[0] = p[1]
    else:
        p[0] = [p[1], "-" + p[3]]

def p_interface_call_param_list(p):
    '''interface_call_param_list : interface_call_param
                                 | interface_call_param_list COMMA interface_call_param
    '''
    if len(p) == 2:
        p[0] = [p[1]]
    else:
        p[0] = p[1] + [p[3]]


def p_obj_perm_set(p):
    'obj_perm_set : DEFINE OPAREN TICK IDENTIFIER SQUOTE COMMA TICK names SQUOTE CPAREN'
    s = refpolicy.ObjPermSet(p[4])
    s.perms = p[8]
    p[0] = s
    
#
# Basic SELinux policy language
#

def p_policy(p):
    '''policy : policy_stmt
              | optional_policy
              | tunable_policy
              | ifdef
              | ifelse
              | conditional
    '''
    p[0] = p[1]

def p_policy_stmt(p):
    '''policy_stmt : gen_require
                   | avrule_def
                   | typerule_def
                   | typebound_def
                   | typeattribute_def
                   | roleattribute_def
                   | interface_call
                   | role_def
                   | role_allow
                   | permissive
                   | type_def
                   | typealias_def
                   | attribute_def
                   | attribute_role_def
                   | range_transition_def
                   | role_transition_def
                   | bool
                   | define
                   | initial_sid
                   | genfscon
                   | fs_use
                   | portcon
                   | nodecon
                   | netifcon
                   | pirqcon
                   | iomemcon
                   | ioportcon
                   | pcidevicecon
                   | devicetreecon
    '''
    if p[1]:
        p[0] = [p[1]]

def p_module_stmt(p):
    'module_stmt : MODULE IDENTIFIER NUMBER SEMI'
    m = refpolicy.ModuleDeclaration()
    m.name = p[2]
    m.version = p[3]
    m.refpolicy = False
    p[0] = m

def p_gen_require(p):
    '''gen_require : GEN_REQ OPAREN TICK requires SQUOTE CPAREN
                   | REQUIRE OBRACE requires CBRACE'''
    # We ignore the require statements - they are redundant data from our point-of-view.
    # Checkmodule will verify them later anyway so we just assume that they match what
    # is in the rest of the interface.
    pass

def p_requires(p):
    '''requires : require
                | requires require
                | ifdef
                | requires ifdef
    '''
    pass

def p_require(p):
    '''require : TYPE comma_list SEMI
               | ROLE comma_list SEMI
               | ATTRIBUTE comma_list SEMI
               | ATTRIBUTE_ROLE comma_list SEMI
               | CLASS comma_list SEMI
               | BOOL comma_list SEMI
    '''
    pass

def p_security_context(p):
    '''security_context : IDENTIFIER COLON IDENTIFIER COLON IDENTIFIER
                        | IDENTIFIER COLON IDENTIFIER COLON IDENTIFIER COLON mls_range_def'''
    # This will likely need some updates to handle complex levels
    s = refpolicy.SecurityContext()
    s.user = p[1]
    s.role = p[3]
    s.type = p[5]
    if len(p) > 6:
        s.level = p[7]

    p[0] = s

def p_gen_context(p):
    '''gen_context : GEN_CONTEXT OPAREN security_context COMMA mls_range_def CPAREN
    '''
    # We actually store gen_context statements in a SecurityContext
    # object - it knows how to output either a bare context or a
    # gen_context statement.
    s = p[3]
    s.level = p[5]
    
    p[0] = s

def p_context(p):
    '''context : security_context
               | gen_context
    '''
    p[0] = p[1]

def p_initial_sid(p):
    '''initial_sid : SID IDENTIFIER context'''
    s = refpolicy.InitialSid()
    s.name = p[2]
    s.context = p[3]
    p[0] = s

def p_genfscon(p):
    '''genfscon : GENFSCON IDENTIFIER PATH context'''
    
    g = refpolicy.GenfsCon()
    g.filesystem = p[2]
    g.path = p[3]
    g.context = p[4]

    p[0] = g

def p_fs_use(p):
    '''fs_use : FS_USE_XATTR IDENTIFIER context SEMI
              | FS_USE_TASK IDENTIFIER context SEMI
              | FS_USE_TRANS IDENTIFIER context SEMI
    '''
    f = refpolicy.FilesystemUse()
    if p[1] == "fs_use_xattr":
        f.type = refpolicy.FilesystemUse.XATTR
    elif p[1] == "fs_use_task":
        f.type = refpolicy.FilesystemUse.TASK
    elif p[1] == "fs_use_trans":
        f.type = refpolicy.FilesystemUse.TRANS

    f.filesystem = p[2]
    f.context = p[3]

    p[0] = f

def p_portcon(p):
    '''portcon : PORTCON IDENTIFIER NUMBER context
               | PORTCON IDENTIFIER NUMBER MINUS NUMBER context'''
    c = refpolicy.PortCon()
    c.port_type = p[2]
    if len(p) == 5:
        c.port_number = p[3]
        c.context = p[4]
    else:
        c.port_number = p[3] + "-" + p[5]
        c.context = p[6]

    p[0] = c

def p_nodecon(p):
    '''nodecon : NODECON NUMBER NUMBER context
               | NODECON IPV6_ADDR IPV6_ADDR context
    '''
    n = refpolicy.NodeCon()
    n.start = p[2]
    n.end = p[3]
    n.context = p[4]

    p[0] = n

def p_netifcon(p):
    'netifcon : NETIFCON IDENTIFIER context context'
    n = refpolicy.NetifCon()
    n.interface = p[2]
    n.interface_context = p[3]
    n.packet_context = p[4]

    p[0] = n

def p_pirqcon(p):
    'pirqcon : PIRQCON NUMBER context'
    c = refpolicy.PirqCon()
    c.pirq_number = p[2]
    c.context = p[3]

    p[0] = c

def p_iomemcon(p):
    '''iomemcon : IOMEMCON NUMBER context
                | IOMEMCON NUMBER MINUS NUMBER context'''
    c = refpolicy.IomemCon()
    if len(p) == 4:
        c.device_mem = p[2]
        c.context = p[3]
    else:
        c.device_mem = p[2] + "-" + p[4]
        c.context = p[5]

    p[0] = c

def p_ioportcon(p):
    '''ioportcon : IOPORTCON NUMBER context
                | IOPORTCON NUMBER MINUS NUMBER context'''
    c = refpolicy.IoportCon()
    if len(p) == 4:
        c.ioport = p[2]
        c.context = p[3]
    else:
        c.ioport = p[2] + "-" + p[4]
        c.context = p[5]

    p[0] = c

def p_pcidevicecon(p):
    'pcidevicecon : PCIDEVICECON NUMBER context'
    c = refpolicy.PciDeviceCon()
    c.device = p[2]
    c.context = p[3]

    p[0] = c

def p_devicetreecon(p):
    'devicetreecon : DEVICETREECON NUMBER context'
    c = refpolicy.DevicetTeeCon()
    c.path = p[2]
    c.context = p[3]

    p[0] = c

def p_mls_range_def(p):
    '''mls_range_def : mls_level_def MINUS mls_level_def
                     | mls_level_def
    '''
    p[0] = p[1]
    if len(p) > 2:
        p[0] = p[0] + "-" + p[3]

def p_mls_level_def(p):
    '''mls_level_def : IDENTIFIER COLON comma_list
                     | IDENTIFIER
    '''
    p[0] = p[1]
    if len(p) > 2:
        p[0] = p[0] + ":" + ",".join(p[3])
    
def p_type_def(p):
    '''type_def : TYPE IDENTIFIER COMMA comma_list SEMI
                | TYPE IDENTIFIER SEMI
                | TYPE IDENTIFIER ALIAS names SEMI
                | TYPE IDENTIFIER ALIAS names COMMA comma_list SEMI
    '''
    t = refpolicy.Type(p[2])
    if len(p) == 6:
        if p[3] == ',':
            t.attributes.update(p[4])
        else:
            t.aliases = p[4]
    elif len(p) > 4:
        t.aliases = p[4]
        if len(p) == 8:
            t.attributes.update(p[6])
    p[0] = t

def p_attribute_def(p):
    'attribute_def : ATTRIBUTE IDENTIFIER SEMI'
    a = refpolicy.Attribute(p[2])
    p[0] = a

def p_attribute_role_def(p):
    'attribute_role_def : ATTRIBUTE_ROLE IDENTIFIER SEMI'
    a = refpolicy.Attribute_Role(p[2])
    p[0] = a

def p_typealias_def(p):
    'typealias_def : TYPEALIAS IDENTIFIER ALIAS names SEMI'
    t = refpolicy.TypeAlias()
    t.type = p[2]
    t.aliases = p[4]
    p[0] = t

def p_role_def(p):
    '''role_def : ROLE IDENTIFIER TYPES comma_list SEMI
                | ROLE IDENTIFIER SEMI'''
    r = refpolicy.Role()
    r.role = p[2]
    if len(p) > 4:
        r.types.update(p[4])
    p[0] = r

def p_role_allow(p):
    'role_allow : ALLOW names names SEMI'
    r = refpolicy.RoleAllow()
    r.src_roles = p[2]
    r.tgt_roles = p[3]
    p[0] = r

def p_permissive(p):
    'permissive : PERMISSIVE names SEMI'
    pass

def p_avrule_def(p):
    '''avrule_def : ALLOW names names COLON names names SEMI
                  | DONTAUDIT names names COLON names names SEMI
                  | AUDITALLOW names names COLON names names SEMI
                  | NEVERALLOW names names COLON names names SEMI
    '''
    a = refpolicy.AVRule()
    if p[1] == 'dontaudit':
        a.rule_type = refpolicy.AVRule.DONTAUDIT
    elif p[1] == 'auditallow':
        a.rule_type = refpolicy.AVRule.AUDITALLOW
    elif p[1] == 'neverallow':
        a.rule_type = refpolicy.AVRule.NEVERALLOW
    a.src_types = p[2]
    a.tgt_types = p[3]
    a.obj_classes = p[5]
    a.perms = p[6]
    p[0] = a

def p_typerule_def(p):
    '''typerule_def : TYPE_TRANSITION names names COLON names IDENTIFIER SEMI
                    | TYPE_TRANSITION names names COLON names IDENTIFIER FILENAME SEMI
                    | TYPE_TRANSITION names names COLON names IDENTIFIER IDENTIFIER SEMI
                    | TYPE_CHANGE names names COLON names IDENTIFIER SEMI
                    | TYPE_MEMBER names names COLON names IDENTIFIER SEMI
    '''
    t = refpolicy.TypeRule()
    if p[1] == 'type_change':
        t.rule_type = refpolicy.TypeRule.TYPE_CHANGE
    elif p[1] == 'type_member':
        t.rule_type = refpolicy.TypeRule.TYPE_MEMBER
    t.src_types = p[2]
    t.tgt_types = p[3]
    t.obj_classes = p[5]
    t.dest_type = p[6]
    t.file_name = p[7]
    p[0] = t

def p_typebound_def(p):
    '''typebound_def : TYPEBOUNDS IDENTIFIER comma_list SEMI'''
    t = refpolicy.TypeBound()
    t.type = p[2]
    t.tgt_types.update(p[3])
    p[0] = t

def p_bool(p):
    '''bool : BOOL IDENTIFIER TRUE SEMI
            | BOOL IDENTIFIER FALSE SEMI'''
    b = refpolicy.Bool()
    b.name = p[2]
    if p[3] == "true":
        b.state = True
    else:
        b.state = False
    p[0] = b

def p_conditional(p):
    ''' conditional : IF OPAREN cond_expr CPAREN OBRACE interface_stmts CBRACE
                    | IF OPAREN cond_expr CPAREN OBRACE interface_stmts CBRACE ELSE OBRACE interface_stmts CBRACE
    '''
    c = refpolicy.Conditional()
    c.cond_expr = p[3]
    collect(p[6], c, val=True)
    if len(p) > 8:
        collect(p[10], c, val=False)
    p[0] = [c]

def p_typeattribute_def(p):
    '''typeattribute_def : TYPEATTRIBUTE IDENTIFIER comma_list SEMI'''
    t = refpolicy.TypeAttribute()
    t.type = p[2]
    t.attributes.update(p[3])
    p[0] = t

def p_roleattribute_def(p):
    '''roleattribute_def : ROLEATTRIBUTE IDENTIFIER comma_list SEMI'''
    t = refpolicy.RoleAttribute()
    t.role = p[2]
    t.roleattributes.update(p[3])
    p[0] = t

def p_range_transition_def(p):
    '''range_transition_def : RANGE_TRANSITION names names COLON names mls_range_def SEMI
                            | RANGE_TRANSITION names names names SEMI'''
    pass

def p_role_transition_def(p):
    '''role_transition_def : ROLE_TRANSITION names names names SEMI'''
    pass

def p_cond_expr(p):
    '''cond_expr : IDENTIFIER
                 | EXPL cond_expr
                 | cond_expr AMP AMP cond_expr
                 | cond_expr BAR BAR cond_expr
                 | cond_expr EQUAL EQUAL cond_expr
                 | cond_expr EXPL EQUAL cond_expr
    '''
    l = len(p)
    if l == 2:
        p[0] = [p[1]]
    elif l == 3:
        p[0] = [p[1]] + p[2]
    else:
        p[0] = p[1] + [p[2] + p[3]] + p[4]


#
# Basic terminals
#

# Identifiers and lists of identifiers. These must be handled somewhat
# gracefully. The 'names' terminal returns an IdSet, and care must be taken
# that the result is _assigned_ to an object (rather than passed to update())
# so that all of its flags are carried over correctly. The other terminals
# return lists - this preserves ordering where it matters for parsing (for
# example, interface_call must retain argument order). At other times the
# list should be used to update an IdSet.
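# For example, "{ read write }" becomes an IdSet containing read and write,
# while "~{ read write }" produces the same set with its 'compliment' flag
# set (see the TILDE branch of p_names below).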

def p_names(p):
    '''names : identifier
             | nested_id_set
             | asterisk
             | TILDE identifier
             | TILDE nested_id_set
             | IDENTIFIER MINUS IDENTIFIER
    '''
    s = refpolicy.IdSet()
    if len(p) < 3:
        expand(p[1], s)
    elif len(p) == 3:
        expand(p[2], s)
        s.compliment = True
    else:
        expand([p[1]], s)
        s.add("-" + p[3])
    p[0] = s

def p_identifier(p):
    'identifier : IDENTIFIER'
    p[0] = [p[1]]

def p_asterisk(p):
    'asterisk : ASTERISK'
    p[0] = [p[1]]

def p_nested_id_set(p):
    '''nested_id_set : OBRACE nested_id_list CBRACE
    '''
    p[0] = p[2]

def p_nested_id_list(p):
    '''nested_id_list : nested_id_element
                      | nested_id_list nested_id_element
    '''
    if len(p) == 2:
        p[0] = p[1]
    else:
        p[0] = p[1] + p[2]

def p_nested_id_element(p):
    '''nested_id_element : identifier
                         | MINUS IDENTIFIER
                         | nested_id_set
    '''
    if len(p) == 2:
        p[0] = p[1]
    else:
        # For now just leave the '-'
        str = "-" + p[2]
        p[0] = [str]

def p_comma_list(p):
    '''comma_list : nested_id_list
                  | comma_list COMMA nested_id_list
    '''
    if len(p) > 2:
        p[1] = p[1] + p[3]
    p[0] = p[1]

def p_optional_semi(p):
    '''optional_semi : SEMI
                   | empty'''
    pass


#
# Interface to the parser
#

def p_error(tok):
    global error, parse_file, success, parser
    error = "%s: Syntax error on line %d %s [type=%s]" % (parse_file, tok.lineno, tok.value, tok.type)
    print(error)
    success = False

def prep_spt(spt):
    if not spt:
        return { }
    map = {}
    for x in spt:
        map[x.name] = x
    return map

parser = None
lexer = None
def create_globals(module, support, debug):
    global parser, lexer, m, spt

    if not parser:
        lexer = lex.lex()
        parser = yacc.yacc(method="LALR", debug=debug, write_tables=0)

    if module is not None:
        m = module
    else:
        m = refpolicy.Module()

    if not support:
        spt = refpolicy.SupportMacros()
    else:
        spt = support

def parse(text, module=None, support=None, debug=False):
    create_globals(module, support, debug)
    global error, parser, lexer, success

    lexer.lineno = 1
    success = True

    try:
        parser.parse(text, debug=debug, lexer=lexer)
    except Exception as e:
        parser = None
        lexer = None
        error = "internal parser error: %s" % str(e) + "\n" + traceback.format_exc()

    if not success:
        # force the parser and lexer to be rebuilt - we have some problems otherwise
        parser = None
        msg = 'could not parse text: "%s"' % error
        raise ValueError(msg)
    return m
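
# A minimal usage sketch (illustrative only; the interface name and types
# below are hypothetical):
#
#   text = ("interface(`myapp_read_config',`\n"
#           "\tallow $1 myapp_conf_t:file { getattr open read };\n"
#           "')\n")
#   mod = parse(text)
#   # mod is a refpolicy.Module whose children include the parsed Interface.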

def list_headers(root):
    modules = []
    support_macros = None

    for dirpath, dirnames, filenames in os.walk(root):
        for name in filenames:
            modname = os.path.splitext(name)
            filename = os.path.join(dirpath, name)

            if modname[1] == '.spt':
                if name == "obj_perm_sets.spt":
                    support_macros = filename
                elif len(re.findall("patterns", modname[0])):
                    modules.append((modname[0], filename))
            elif modname[1] == '.if':
                modules.append((modname[0], filename))

    return (modules, support_macros)


def parse_headers(root, output=None, expand=True, debug=False):
    from . import util

    headers = refpolicy.Headers()

    modules = []
    support_macros = None

    if os.path.isfile(root):
        name = os.path.split(root)[1]
        if name == '':
            raise ValueError("Invalid file name %s" % root)
        modname = os.path.splitext(name)
        modules.append((modname[0], root))
        all_modules, support_macros = list_headers(defaults.headers())
    else:
        modules, support_macros = list_headers(root)

    if expand and not support_macros:
        raise ValueError("could not find support macros (obj_perm_sets.spt)")

    def o(msg):
        if output:
            output.write(msg)

    def parse_file(f, module, spt=None):
        global parse_file
        if debug:
            o("parsing file %s\n" % f)
        try:
            fd = open(f)
            txt = fd.read()
            fd.close()
            parse_file = f
            parse(txt, module, spt, debug)
        except IOError as e:
            return
        except ValueError as e:
            raise ValueError("error parsing file %s: %s" % (f, str(e)))

    spt = None
    if support_macros:
        o("Parsing support macros (%s): " % support_macros)
        spt = refpolicy.SupportMacros()
        parse_file(support_macros, spt)

        headers.children.append(spt)

        # FIXME: Total hack - add in can_exec rather than parse the insanity
        # of misc_macros. We are just going to pretend that this is an interface
        # to make the expansion work correctly.
        can_exec = refpolicy.Interface("can_exec")
        av = access.AccessVector(["$1","$2","file","execute_no_trans","open", "read",
                                  "getattr","lock","execute","ioctl"])

        can_exec.children.append(refpolicy.AVRule(av))
        headers.children.append(can_exec)

        o("done.\n")

    if output and not debug:
        status = util.ConsoleProgressBar(sys.stdout, steps=len(modules))
        status.start("Parsing interface files")

    failures = []
    for x in modules:
        m = refpolicy.Module()
        m.name = x[0]
        try:
            if expand:
                parse_file(x[1], m, spt)
            else:
                parse_file(x[1], m)
        except ValueError as e:
            o(str(e) + "\n")
            failures.append(x[1])
            continue

        headers.children.append(m)
        if output and not debug:
            status.step()

    if len(failures):
        o("failed to parse some headers: %s" % ", ".join(failures))

    return headers
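
# A minimal usage sketch (illustrative; assumes refpolicy headers are
# installed where defaults.headers() points):
#
#   from sepolgen import refparser, defaults
#   hdrs = refparser.parse_headers(defaults.headers(), output=sys.stdout)
#   # hdrs is a refpolicy.Headers tree with one child per parsed module.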
site-packages/sepolgen/sepolgeni18n.py
# Authors: Karl MacMillan <kmacmillan@mentalrootkit.com>
#
# Copyright (C) 2006 Red Hat 
# see file 'COPYING' for use and warranty information
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License as
# published by the Free Software Foundation; version 2 only
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software
# Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
#

try: 
    import gettext
    t = gettext.translation( 'selinux-python' )
    _ = t.gettext
except Exception:
    def _(msg):
        return msg
site-packages/sepolgen/defaults.py
# Authors: Karl MacMillan <kmacmillan@mentalrootkit.com>
#
# Copyright (C) 2006 Red Hat
# see file 'COPYING' for use and warranty information
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License as
# published by the Free Software Foundation; version 2 only
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software
# Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
#

import os
import re

# Select the correct location for the development files based on a
# path variable (optionally read from a configuration file)
class PathChooser(object):
    def __init__(self, pathname):
        self.config = dict()
        if not os.path.exists(pathname):
            self.config_pathname = "(defaults)"
            self.config["SELINUX_DEVEL_PATH"] = "/usr/share/selinux/default:/usr/share/selinux/mls:/usr/share/selinux/devel"
            return
        self.config_pathname = pathname
        ignore = re.compile(r"^\s*(?:#.+)?$")
        consider = re.compile(r"^\s*(\w+)\s*=\s*(.+?)\s*$")
        with open(pathname, "r") as fd:
            for lineno, line in enumerate(fd):
                if ignore.match(line): continue
                mo = consider.match(line)
                if not mo:
                    raise ValueError("%s:%d: line is not in key = value format" % (pathname, lineno+1))
                self.config[mo.group(1)] = mo.group(2)

    # We're only exporting one useful function, so why not be a function
    def __call__(self, testfilename, pathset="SELINUX_DEVEL_PATH"):
        paths = self.config.get(pathset, None)
        if paths is None:
            raise ValueError("%s was not in %s" % (pathset, self.config_pathname))
        paths = paths.split(":")
        for p in paths:
            target = os.path.join(p, testfilename)
            if os.path.exists(target): return target
        return os.path.join(paths[0], testfilename)
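
# The configuration file is expected to contain simple "key = value" lines;
# blank lines and '#' comments are ignored, and anything else raises
# ValueError. An illustrative /etc/selinux/sepolgen.conf:
#
#   SELINUX_DEVEL_PATH = /usr/share/selinux/devel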


"""
Various default settings, including file and directory locations.
"""

def data_dir():
    return "/var/lib/sepolgen"

def perm_map():
    return data_dir() + "/perm_map"

def interface_info():
    return data_dir() + "/interface_info"

def attribute_info():
    return data_dir() + "/attribute_info"

def refpolicy_makefile():
    chooser = PathChooser("/etc/selinux/sepolgen.conf")
    result = chooser("Makefile")
    if not os.path.exists(result):
        result = chooser("include/Makefile")
    return result

def headers():
    chooser = PathChooser("/etc/selinux/sepolgen.conf")
    return chooser("include")

site-packages/sepolgen/__init__.py
site-packages/pip/pep425tags.py
"""Generate and work with PEP 425 Compatibility Tags."""
from __future__ import absolute_import

import re
import sys
import warnings
import platform
import logging

try:
    import sysconfig
except ImportError:  # pragma nocover
    # Python < 2.7
    import distutils.sysconfig as sysconfig
import distutils.util

from pip.compat import OrderedDict
import pip.utils.glibc

logger = logging.getLogger(__name__)

_osx_arch_pat = re.compile(r'(.+)_(\d+)_(\d+)_(.+)')


def get_config_var(var):
    try:
        return sysconfig.get_config_var(var)
    except IOError as e:  # Issue #1074
        warnings.warn("{0}".format(e), RuntimeWarning)
        return None


def get_abbr_impl():
    """Return abbreviated implementation name."""
    if hasattr(sys, 'pypy_version_info'):
        pyimpl = 'pp'
    elif sys.platform.startswith('java'):
        pyimpl = 'jy'
    elif sys.platform == 'cli':
        pyimpl = 'ip'
    else:
        pyimpl = 'cp'
    return pyimpl


def get_impl_ver():
    """Return implementation version."""
    impl_ver = get_config_var("py_version_nodot")
    if not impl_ver or get_abbr_impl() == 'pp':
        impl_ver = ''.join(map(str, get_impl_version_info()))
    return impl_ver


def get_impl_version_info():
    """Return sys.version_info-like tuple for use in decrementing the minor
    version."""
    if get_abbr_impl() == 'pp':
        # as per https://github.com/pypa/pip/issues/2882
        return (sys.version_info[0], sys.pypy_version_info.major,
                sys.pypy_version_info.minor)
    else:
        return sys.version_info[0], sys.version_info[1]


def get_impl_tag():
    """
    Returns the Tag for this specific implementation.
    """
    return "{0}{1}".format(get_abbr_impl(), get_impl_ver())


def get_flag(var, fallback, expected=True, warn=True):
    """Use a fallback method for determining SOABI flags if the needed config
    var is unset or unavailable."""
    val = get_config_var(var)
    if val is None:
        if warn:
            logger.debug("Config variable '%s' is unset, Python ABI tag may "
                         "be incorrect", var)
        return fallback()
    return val == expected


def get_abi_tag():
    """Return the ABI tag based on SOABI (if available) or emulate SOABI
    (CPython 2, PyPy)."""
    soabi = get_config_var('SOABI')
    impl = get_abbr_impl()
    if not soabi and impl in ('cp', 'pp') and hasattr(sys, 'maxunicode'):
        d = ''
        m = ''
        u = ''
        if get_flag('Py_DEBUG',
                    lambda: hasattr(sys, 'gettotalrefcount'),
                    warn=(impl == 'cp')):
            d = 'd'
        if get_flag('WITH_PYMALLOC',
                    lambda: impl == 'cp',
                    warn=(impl == 'cp')):
            m = 'm'
        if get_flag('Py_UNICODE_SIZE',
                    lambda: sys.maxunicode == 0x10ffff,
                    expected=4,
                    warn=(impl == 'cp' and
                          sys.version_info < (3, 3))) \
                and sys.version_info < (3, 3):
            u = 'u'
        abi = '%s%s%s%s%s' % (impl, get_impl_ver(), d, m, u)
    elif soabi and soabi.startswith('cpython-'):
        abi = 'cp' + soabi.split('-')[1]
    elif soabi:
        abi = soabi.replace('.', '_').replace('-', '_')
    else:
        abi = None
    return abi
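
# For example, a CPython 3.6 build whose SOABI is 'cpython-36m-x86_64-linux-gnu'
# yields the ABI tag 'cp36m' via the soabi.startswith('cpython-') branch.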


def _is_running_32bit():
    return sys.maxsize == 2147483647


def get_platform():
    """Return our platform name 'win32', 'linux_x86_64'"""
    if sys.platform == 'darwin':
        # distutils.util.get_platform() returns the release based on the value
        # of MACOSX_DEPLOYMENT_TARGET on which Python was built, which may
        # be significantly older than the user's current machine.
        release, _, machine = platform.mac_ver()
        split_ver = release.split('.')

        if machine == "x86_64" and _is_running_32bit():
            machine = "i386"
        elif machine == "ppc64" and _is_running_32bit():
            machine = "ppc"

        return 'macosx_{0}_{1}_{2}'.format(split_ver[0], split_ver[1], machine)

    # XXX remove distutils dependency
    result = distutils.util.get_platform().replace('.', '_').replace('-', '_')
    if result == "linux_x86_64" and _is_running_32bit():
        # 32 bit Python program (running on a 64 bit Linux): pip should only
        # install and run 32 bit compiled extensions in that case.
        result = "linux_i686"

    return result
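
# For example, this typically returns 'linux_x86_64' on 64-bit Linux and
# something like 'macosx_10_12_x86_64' on macOS (values are illustrative and
# depend on the running interpreter and machine).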


def is_manylinux1_compatible():
    # Only Linux, and only x86-64 / i686
    if get_platform() not in ("linux_x86_64", "linux_i686"):
        return False

    # Check for presence of _manylinux module
    try:
        import _manylinux
        return bool(_manylinux.manylinux1_compatible)
    except (ImportError, AttributeError):
        # Fall through to heuristic check below
        pass

    # Check glibc version. CentOS 5 uses glibc 2.5.
    return pip.utils.glibc.have_compatible_glibc(2, 5)


def get_darwin_arches(major, minor, machine):
    """Return a list of supported arches (including group arches) for
    the given major, minor and machine architecture of a macOS machine.
    """
    arches = []

    def _supports_arch(major, minor, arch):
        # Looking at the application support for macOS versions in the chart
        # provided by https://en.wikipedia.org/wiki/OS_X#Versions it appears
        # our timeline looks roughly like:
        #
        # 10.0 - Introduces ppc support.
        # 10.4 - Introduces ppc64, i386, and x86_64 support, however the ppc64
        #        and x86_64 support is CLI only, and cannot be used for GUI
        #        applications.
        # 10.5 - Extends ppc64 and x86_64 support to cover GUI applications.
        # 10.6 - Drops support for ppc64
        # 10.7 - Drops support for ppc
        #
        # Given that we do not know if we're installing a CLI or a GUI
        # application, we must be conservative and assume it might be a GUI
        # application and behave as if ppc64 and x86_64 support did not occur
        # until 10.5.
        #
        # Note: The above information is taken from the "Application support"
        #       column in the chart not the "Processor support" since I believe
        #       that we care about what instruction sets an application can use
        #       not which processors the OS supports.
        if arch == 'ppc':
            return (major, minor) <= (10, 5)
        if arch == 'ppc64':
            return (major, minor) == (10, 5)
        if arch == 'i386':
            return (major, minor) >= (10, 4)
        if arch == 'x86_64':
            return (major, minor) >= (10, 5)
        if arch in groups:
            for garch in groups[arch]:
                if _supports_arch(major, minor, garch):
                    return True
        return False

    groups = OrderedDict([
        ("fat", ("i386", "ppc")),
        ("intel", ("x86_64", "i386")),
        ("fat64", ("x86_64", "ppc64")),
        ("fat32", ("x86_64", "i386", "ppc")),
    ])

    if _supports_arch(major, minor, machine):
        arches.append(machine)

    for garch in groups:
        if machine in groups[garch] and _supports_arch(major, minor, garch):
            arches.append(garch)

    arches.append('universal')

    return arches
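
# For example, get_darwin_arches(10, 9, 'x86_64') evaluates to
# ['x86_64', 'intel', 'fat64', 'fat32', 'universal']: the machine arch itself,
# every group arch containing it that is supported on 10.9, then 'universal'.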


def get_supported(versions=None, noarch=False, platform=None,
                  impl=None, abi=None):
    """Return a list of supported tags for each version specified in
    `versions`.

    :param versions: a list of string versions, of the form ["33", "32"],
        or None. The first version will be assumed to support our ABI.
    :param platform: specify the exact platform you want valid
        tags for, or None. If None, use the local system platform.
    :param impl: specify the exact implementation you want valid
        tags for, or None. If None, use the local interpreter impl.
    :param abi: specify the exact abi you want valid
        tags for, or None. If None, use the local interpreter abi.
    """
    supported = []

    # Versions must be given with respect to the preference
    if versions is None:
        versions = []
        version_info = get_impl_version_info()
        major = version_info[:-1]
        # Support all previous minor Python versions.
        for minor in range(version_info[-1], -1, -1):
            versions.append(''.join(map(str, major + (minor,))))

    impl = impl or get_abbr_impl()

    abis = []

    abi = abi or get_abi_tag()
    if abi:
        abis[0:0] = [abi]

    abi3s = set()
    import imp
    for suffix in imp.get_suffixes():
        if suffix[0].startswith('.abi'):
            abi3s.add(suffix[0].split('.', 2)[1])

    abis.extend(sorted(list(abi3s)))

    abis.append('none')

    if not noarch:
        arch = platform or get_platform()
        if arch.startswith('macosx'):
            # support macosx-10.6-intel on macosx-10.9-x86_64
            match = _osx_arch_pat.match(arch)
            if match:
                name, major, minor, actual_arch = match.groups()
                tpl = '{0}_{1}_%i_%s'.format(name, major)
                arches = []
                for m in reversed(range(int(minor) + 1)):
                    for a in get_darwin_arches(int(major), m, actual_arch):
                        arches.append(tpl % (m, a))
            else:
                # arch pattern didn't match (?!)
                arches = [arch]
        elif platform is None and is_manylinux1_compatible():
            arches = [arch.replace('linux', 'manylinux1'), arch]
        else:
            arches = [arch]

        # Current version, current API (built specifically for our Python):
        for abi in abis:
            for arch in arches:
                supported.append(('%s%s' % (impl, versions[0]), abi, arch))

        # abi3 modules compatible with older version of Python
        for version in versions[1:]:
            # abi3 was introduced in Python 3.2
            if version in ('31', '30'):
                break
            for abi in abi3s:   # empty set if not Python 3
                for arch in arches:
                    supported.append(("%s%s" % (impl, version), abi, arch))

        # Has binaries, does not use the Python API:
        for arch in arches:
            supported.append(('py%s' % (versions[0][0]), 'none', arch))

    # No abi / arch, but requires our implementation:
    supported.append(('%s%s' % (impl, versions[0]), 'none', 'any'))
    # Tagged specifically as being cross-version compatible
    # (with just the major version specified)
    supported.append(('%s%s' % (impl, versions[0][0]), 'none', 'any'))

    # No abi / arch, generic Python
    for i, version in enumerate(versions):
        supported.append(('py%s' % (version,), 'none', 'any'))
        if i == 0:
            supported.append(('py%s' % (version[0]), 'none', 'any'))

    return supported
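
# For example, on a manylinux1-compatible x86-64 Linux running CPython 3.6
# (ABI tag 'cp36m'), the first entry returned is
# ('cp36', 'cp36m', 'manylinux1_x86_64'), and the list ends with the generic
# pure-Python ('pyXY', 'none', 'any') tags (values are illustrative).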

supported_tags = get_supported()
supported_tags_noarch = get_supported(noarch=True)

implementation_tag = get_impl_tag()
site-packages/pip/_vendor/chardet/escsm.py
######################## BEGIN LICENSE BLOCK ########################
# The Original Code is mozilla.org code.
#
# The Initial Developer of the Original Code is
# Netscape Communications Corporation.
# Portions created by the Initial Developer are Copyright (C) 1998
# the Initial Developer. All Rights Reserved.
#
# Contributor(s):
#   Mark Pilgrim - port to Python
#
# This library is free software; you can redistribute it and/or
# modify it under the terms of the GNU Lesser General Public
# License as published by the Free Software Foundation; either
# version 2.1 of the License, or (at your option) any later version.
#
# This library is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
# Lesser General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public
# License along with this library; if not, write to the Free Software
# Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
# 02110-1301  USA
######################### END LICENSE BLOCK #########################

from .enums import MachineState

HZ_CLS = (
1,0,0,0,0,0,0,0,  # 00 - 07
0,0,0,0,0,0,0,0,  # 08 - 0f
0,0,0,0,0,0,0,0,  # 10 - 17
0,0,0,1,0,0,0,0,  # 18 - 1f
0,0,0,0,0,0,0,0,  # 20 - 27
0,0,0,0,0,0,0,0,  # 28 - 2f
0,0,0,0,0,0,0,0,  # 30 - 37
0,0,0,0,0,0,0,0,  # 38 - 3f
0,0,0,0,0,0,0,0,  # 40 - 47
0,0,0,0,0,0,0,0,  # 48 - 4f
0,0,0,0,0,0,0,0,  # 50 - 57
0,0,0,0,0,0,0,0,  # 58 - 5f
0,0,0,0,0,0,0,0,  # 60 - 67
0,0,0,0,0,0,0,0,  # 68 - 6f
0,0,0,0,0,0,0,0,  # 70 - 77
0,0,0,4,0,5,2,0,  # 78 - 7f
1,1,1,1,1,1,1,1,  # 80 - 87
1,1,1,1,1,1,1,1,  # 88 - 8f
1,1,1,1,1,1,1,1,  # 90 - 97
1,1,1,1,1,1,1,1,  # 98 - 9f
1,1,1,1,1,1,1,1,  # a0 - a7
1,1,1,1,1,1,1,1,  # a8 - af
1,1,1,1,1,1,1,1,  # b0 - b7
1,1,1,1,1,1,1,1,  # b8 - bf
1,1,1,1,1,1,1,1,  # c0 - c7
1,1,1,1,1,1,1,1,  # c8 - cf
1,1,1,1,1,1,1,1,  # d0 - d7
1,1,1,1,1,1,1,1,  # d8 - df
1,1,1,1,1,1,1,1,  # e0 - e7
1,1,1,1,1,1,1,1,  # e8 - ef
1,1,1,1,1,1,1,1,  # f0 - f7
1,1,1,1,1,1,1,1,  # f8 - ff
)

HZ_ST = (
MachineState.START,MachineState.ERROR,     3,MachineState.START,MachineState.START,MachineState.START,MachineState.ERROR,MachineState.ERROR,# 00-07
MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ITS_ME,MachineState.ITS_ME,MachineState.ITS_ME,MachineState.ITS_ME,# 08-0f
MachineState.ITS_ME,MachineState.ITS_ME,MachineState.ERROR,MachineState.ERROR,MachineState.START,MachineState.START,     4,MachineState.ERROR,# 10-17
     5,MachineState.ERROR,     6,MachineState.ERROR,     5,     5,     4,MachineState.ERROR,# 18-1f
     4,MachineState.ERROR,     4,     4,     4,MachineState.ERROR,     4,MachineState.ERROR,# 20-27
     4,MachineState.ITS_ME,MachineState.START,MachineState.START,MachineState.START,MachineState.START,MachineState.START,MachineState.START,# 28-2f
)

HZ_CHAR_LEN_TABLE = (0, 0, 0, 0, 0, 0)

HZ_SM_MODEL = {'class_table': HZ_CLS,
               'class_factor': 6,
               'state_table': HZ_ST,
               'char_len_table': HZ_CHAR_LEN_TABLE,
               'name': "HZ-GB-2312",
               'language': 'Chinese'}

ISO2022CN_CLS = (
2,0,0,0,0,0,0,0,  # 00 - 07
0,0,0,0,0,0,0,0,  # 08 - 0f
0,0,0,0,0,0,0,0,  # 10 - 17
0,0,0,1,0,0,0,0,  # 18 - 1f
0,0,0,0,0,0,0,0,  # 20 - 27
0,3,0,0,0,0,0,0,  # 28 - 2f
0,0,0,0,0,0,0,0,  # 30 - 37
0,0,0,0,0,0,0,0,  # 38 - 3f
0,0,0,4,0,0,0,0,  # 40 - 47
0,0,0,0,0,0,0,0,  # 48 - 4f
0,0,0,0,0,0,0,0,  # 50 - 57
0,0,0,0,0,0,0,0,  # 58 - 5f
0,0,0,0,0,0,0,0,  # 60 - 67
0,0,0,0,0,0,0,0,  # 68 - 6f
0,0,0,0,0,0,0,0,  # 70 - 77
0,0,0,0,0,0,0,0,  # 78 - 7f
2,2,2,2,2,2,2,2,  # 80 - 87
2,2,2,2,2,2,2,2,  # 88 - 8f
2,2,2,2,2,2,2,2,  # 90 - 97
2,2,2,2,2,2,2,2,  # 98 - 9f
2,2,2,2,2,2,2,2,  # a0 - a7
2,2,2,2,2,2,2,2,  # a8 - af
2,2,2,2,2,2,2,2,  # b0 - b7
2,2,2,2,2,2,2,2,  # b8 - bf
2,2,2,2,2,2,2,2,  # c0 - c7
2,2,2,2,2,2,2,2,  # c8 - cf
2,2,2,2,2,2,2,2,  # d0 - d7
2,2,2,2,2,2,2,2,  # d8 - df
2,2,2,2,2,2,2,2,  # e0 - e7
2,2,2,2,2,2,2,2,  # e8 - ef
2,2,2,2,2,2,2,2,  # f0 - f7
2,2,2,2,2,2,2,2,  # f8 - ff
)

ISO2022CN_ST = (
MachineState.START,     3,MachineState.ERROR,MachineState.START,MachineState.START,MachineState.START,MachineState.START,MachineState.START,# 00-07
MachineState.START,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,# 08-0f
MachineState.ERROR,MachineState.ERROR,MachineState.ITS_ME,MachineState.ITS_ME,MachineState.ITS_ME,MachineState.ITS_ME,MachineState.ITS_ME,MachineState.ITS_ME,# 10-17
MachineState.ITS_ME,MachineState.ITS_ME,MachineState.ITS_ME,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,     4,MachineState.ERROR,# 18-1f
MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ITS_ME,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,# 20-27
     5,     6,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,# 28-2f
MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ITS_ME,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,# 30-37
MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ITS_ME,MachineState.ERROR,MachineState.START,# 38-3f
)

ISO2022CN_CHAR_LEN_TABLE = (0, 0, 0, 0, 0, 0, 0, 0, 0)

ISO2022CN_SM_MODEL = {'class_table': ISO2022CN_CLS,
                      'class_factor': 9,
                      'state_table': ISO2022CN_ST,
                      'char_len_table': ISO2022CN_CHAR_LEN_TABLE,
                      'name': "ISO-2022-CN",
                      'language': 'Chinese'}

ISO2022JP_CLS = (
2,0,0,0,0,0,0,0,  # 00 - 07
0,0,0,0,0,0,2,2,  # 08 - 0f
0,0,0,0,0,0,0,0,  # 10 - 17
0,0,0,1,0,0,0,0,  # 18 - 1f
0,0,0,0,7,0,0,0,  # 20 - 27
3,0,0,0,0,0,0,0,  # 28 - 2f
0,0,0,0,0,0,0,0,  # 30 - 37
0,0,0,0,0,0,0,0,  # 38 - 3f
6,0,4,0,8,0,0,0,  # 40 - 47
0,9,5,0,0,0,0,0,  # 48 - 4f
0,0,0,0,0,0,0,0,  # 50 - 57
0,0,0,0,0,0,0,0,  # 58 - 5f
0,0,0,0,0,0,0,0,  # 60 - 67
0,0,0,0,0,0,0,0,  # 68 - 6f
0,0,0,0,0,0,0,0,  # 70 - 77
0,0,0,0,0,0,0,0,  # 78 - 7f
2,2,2,2,2,2,2,2,  # 80 - 87
2,2,2,2,2,2,2,2,  # 88 - 8f
2,2,2,2,2,2,2,2,  # 90 - 97
2,2,2,2,2,2,2,2,  # 98 - 9f
2,2,2,2,2,2,2,2,  # a0 - a7
2,2,2,2,2,2,2,2,  # a8 - af
2,2,2,2,2,2,2,2,  # b0 - b7
2,2,2,2,2,2,2,2,  # b8 - bf
2,2,2,2,2,2,2,2,  # c0 - c7
2,2,2,2,2,2,2,2,  # c8 - cf
2,2,2,2,2,2,2,2,  # d0 - d7
2,2,2,2,2,2,2,2,  # d8 - df
2,2,2,2,2,2,2,2,  # e0 - e7
2,2,2,2,2,2,2,2,  # e8 - ef
2,2,2,2,2,2,2,2,  # f0 - f7
2,2,2,2,2,2,2,2,  # f8 - ff
)

ISO2022JP_ST = (
MachineState.START,     3,MachineState.ERROR,MachineState.START,MachineState.START,MachineState.START,MachineState.START,MachineState.START,# 00-07
MachineState.START,MachineState.START,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,# 08-0f
MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ITS_ME,MachineState.ITS_ME,MachineState.ITS_ME,MachineState.ITS_ME,# 10-17
MachineState.ITS_ME,MachineState.ITS_ME,MachineState.ITS_ME,MachineState.ITS_ME,MachineState.ITS_ME,MachineState.ITS_ME,MachineState.ERROR,MachineState.ERROR,# 18-1f
MachineState.ERROR,     5,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,     4,MachineState.ERROR,MachineState.ERROR,# 20-27
MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,     6,MachineState.ITS_ME,MachineState.ERROR,MachineState.ITS_ME,MachineState.ERROR,# 28-2f
MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ITS_ME,MachineState.ITS_ME,# 30-37
MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ITS_ME,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,# 38-3f
MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ITS_ME,MachineState.ERROR,MachineState.START,MachineState.START,# 40-47
)

ISO2022JP_CHAR_LEN_TABLE = (0, 0, 0, 0, 0, 0, 0, 0, 0, 0)

ISO2022JP_SM_MODEL = {'class_table': ISO2022JP_CLS,
                      'class_factor': 10,
                      'state_table': ISO2022JP_ST,
                      'char_len_table': ISO2022JP_CHAR_LEN_TABLE,
                      'name': "ISO-2022-JP",
                      'language': 'Japanese'}

ISO2022KR_CLS = (
2,0,0,0,0,0,0,0,  # 00 - 07
0,0,0,0,0,0,0,0,  # 08 - 0f
0,0,0,0,0,0,0,0,  # 10 - 17
0,0,0,1,0,0,0,0,  # 18 - 1f
0,0,0,0,3,0,0,0,  # 20 - 27
0,4,0,0,0,0,0,0,  # 28 - 2f
0,0,0,0,0,0,0,0,  # 30 - 37
0,0,0,0,0,0,0,0,  # 38 - 3f
0,0,0,5,0,0,0,0,  # 40 - 47
0,0,0,0,0,0,0,0,  # 48 - 4f
0,0,0,0,0,0,0,0,  # 50 - 57
0,0,0,0,0,0,0,0,  # 58 - 5f
0,0,0,0,0,0,0,0,  # 60 - 67
0,0,0,0,0,0,0,0,  # 68 - 6f
0,0,0,0,0,0,0,0,  # 70 - 77
0,0,0,0,0,0,0,0,  # 78 - 7f
2,2,2,2,2,2,2,2,  # 80 - 87
2,2,2,2,2,2,2,2,  # 88 - 8f
2,2,2,2,2,2,2,2,  # 90 - 97
2,2,2,2,2,2,2,2,  # 98 - 9f
2,2,2,2,2,2,2,2,  # a0 - a7
2,2,2,2,2,2,2,2,  # a8 - af
2,2,2,2,2,2,2,2,  # b0 - b7
2,2,2,2,2,2,2,2,  # b8 - bf
2,2,2,2,2,2,2,2,  # c0 - c7
2,2,2,2,2,2,2,2,  # c8 - cf
2,2,2,2,2,2,2,2,  # d0 - d7
2,2,2,2,2,2,2,2,  # d8 - df
2,2,2,2,2,2,2,2,  # e0 - e7
2,2,2,2,2,2,2,2,  # e8 - ef
2,2,2,2,2,2,2,2,  # f0 - f7
2,2,2,2,2,2,2,2,  # f8 - ff
)

ISO2022KR_ST = (
MachineState.START,     3,MachineState.ERROR,MachineState.START,MachineState.START,MachineState.START,MachineState.ERROR,MachineState.ERROR,# 00-07
MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ITS_ME,MachineState.ITS_ME,MachineState.ITS_ME,MachineState.ITS_ME,# 08-0f
MachineState.ITS_ME,MachineState.ITS_ME,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,     4,MachineState.ERROR,MachineState.ERROR,# 10-17
MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,     5,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,# 18-1f
MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ITS_ME,MachineState.START,MachineState.START,MachineState.START,MachineState.START,# 20-27
)

ISO2022KR_CHAR_LEN_TABLE = (0, 0, 0, 0, 0, 0)

ISO2022KR_SM_MODEL = {'class_table': ISO2022KR_CLS,
                      'class_factor': 6,
                      'state_table': ISO2022KR_ST,
                      'char_len_table': ISO2022KR_CHAR_LEN_TABLE,
                      'name': "ISO-2022-KR",
                      'language': 'Korean'}


site-packages/pip/_vendor/chardet/big5freq.py
######################## BEGIN LICENSE BLOCK ########################
# The Original Code is Mozilla Communicator client code.
#
# The Initial Developer of the Original Code is
# Netscape Communications Corporation.
# Portions created by the Initial Developer are Copyright (C) 1998
# the Initial Developer. All Rights Reserved.
#
# Contributor(s):
#   Mark Pilgrim - port to Python
#
# This library is free software; you can redistribute it and/or
# modify it under the terms of the GNU Lesser General Public
# License as published by the Free Software Foundation; either
# version 2.1 of the License, or (at your option) any later version.
#
# This library is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
# Lesser General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public
# License along with this library; if not, write to the Free Software
# Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
# 02110-1301  USA
######################### END LICENSE BLOCK #########################

# Big5 frequency table
# by Taiwan's Mandarin Promotion Council
# <http://www.edu.tw:81/mandr/>
#
# 128  --> 0.42261
# 256  --> 0.57851
# 512  --> 0.74851
# 1024 --> 0.89384
# 2048 --> 0.97583
#
# Ideal Distribution Ratio = 0.74851/(1-0.74851) =2.98
# Random Distribution Ratio = 512/(5401-512)=0.105
#
# Typical Distribution Ratio is about 25% of the Ideal one, still much higher than RDR

BIG5_TYPICAL_DISTRIBUTION_RATIO = 0.75
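
# Illustrative sketch (editorial addition, not part of the original chardet
# source): the ratios in the header comment above follow from simple
# arithmetic, where 5401 is the number of Big5 characters in the frequency
# survey and 0.74851 is the share of typical text covered by the 512 most
# frequent characters.
_IDEAL_DISTRIBUTION_RATIO = 0.74851 / (1 - 0.74851)      # ~= 2.98
_RANDOM_DISTRIBUTION_RATIO = 512.0 / (5401 - 512)        # ~= 0.105
# BIG5_TYPICAL_DISTRIBUTION_RATIO (0.75) is roughly 25% of the ideal ratio but
# still far above the random ratio, which is what lets the distribution
# analyser tell genuine Big5 text from noise.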

# Char to FreqOrder table
BIG5_TABLE_SIZE = 5376

BIG5_CHAR_TO_FREQ_ORDER = (
   1,1801,1506, 255,1431, 198,   9,  82,   6,5008, 177, 202,3681,1256,2821, 110, #   16
3814,  33,3274, 261,  76,  44,2114,  16,2946,2187,1176, 659,3971,  26,3451,2653, #   32
1198,3972,3350,4202, 410,2215, 302, 590, 361,1964,   8, 204,  58,4510,5009,1932, #   48
  63,5010,5011, 317,1614,  75, 222, 159,4203,2417,1480,5012,3555,3091, 224,2822, #   64
3682,   3,  10,3973,1471,  29,2787,1135,2866,1940, 873, 130,3275,1123, 312,5013, #   80
4511,2052, 507, 252, 682,5014, 142,1915, 124, 206,2947,  34,3556,3204,  64, 604, #   96
5015,2501,1977,1978, 155,1991, 645, 641,1606,5016,3452, 337,  72, 406,5017,  80, #  112
 630, 238,3205,1509, 263, 939,1092,2654, 756,1440,1094,3453, 449,  69,2987, 591, #  128
 179,2096, 471, 115,2035,1844,  60,  50,2988, 134, 806,1869, 734,2036,3454, 180, #  144
 995,1607, 156, 537,2907, 688,5018, 319,1305, 779,2145, 514,2379, 298,4512, 359, #  160
2502,  90,2716,1338, 663,  11, 906,1099,2553,  20,2441, 182, 532,1716,5019, 732, #  176
1376,4204,1311,1420,3206,  25,2317,1056, 113, 399, 382,1950, 242,3455,2474, 529, #  192
3276, 475,1447,3683,5020, 117,  21, 656, 810,1297,2300,2334,3557,5021, 126,4205, #  208
 706, 456, 150, 613,4513,  71,1118,2037,4206, 145,3092,  85, 835, 486,2115,1246, #  224
1426, 428, 727,1285,1015, 800, 106, 623, 303,1281,5022,2128,2359, 347,3815, 221, #  240
3558,3135,5023,1956,1153,4207,  83, 296,1199,3093, 192, 624,  93,5024, 822,1898, #  256
2823,3136, 795,2065, 991,1554,1542,1592,  27,  43,2867, 859, 139,1456, 860,4514, #  272
 437, 712,3974, 164,2397,3137, 695, 211,3037,2097, 195,3975,1608,3559,3560,3684, #  288
3976, 234, 811,2989,2098,3977,2233,1441,3561,1615,2380, 668,2077,1638, 305, 228, #  304
1664,4515, 467, 415,5025, 262,2099,1593, 239, 108, 300, 200,1033, 512,1247,2078, #  320
5026,5027,2176,3207,3685,2682, 593, 845,1062,3277,  88,1723,2038,3978,1951, 212, #  336
 266, 152, 149, 468,1899,4208,4516,  77, 187,5028,3038,  37,   5,2990,5029,3979, #  352
5030,5031,  39,2524,4517,2908,3208,2079,  55, 148,  74,4518, 545, 483,1474,1029, #  368
1665, 217,1870,1531,3138,1104,2655,4209,  24, 172,3562, 900,3980,3563,3564,4519, #  384
  32,1408,2824,1312, 329, 487,2360,2251,2717, 784,2683,   4,3039,3351,1427,1789, #  400
 188, 109, 499,5032,3686,1717,1790, 888,1217,3040,4520,5033,3565,5034,3352,1520, #  416
3687,3981, 196,1034, 775,5035,5036, 929,1816, 249, 439,  38,5037,1063,5038, 794, #  432
3982,1435,2301,  46, 178,3278,2066,5039,2381,5040, 214,1709,4521, 804,  35, 707, #  448
 324,3688,1601,2554, 140, 459,4210,5041,5042,1365, 839, 272, 978,2262,2580,3456, #  464
2129,1363,3689,1423, 697, 100,3094,  48,  70,1231, 495,3139,2196,5043,1294,5044, #  480
2080, 462, 586,1042,3279, 853, 256, 988, 185,2382,3457,1698, 434,1084,5045,3458, #  496
 314,2625,2788,4522,2335,2336, 569,2285, 637,1817,2525, 757,1162,1879,1616,3459, #  512
 287,1577,2116, 768,4523,1671,2868,3566,2526,1321,3816, 909,2418,5046,4211, 933, #  528
3817,4212,2053,2361,1222,4524, 765,2419,1322, 786,4525,5047,1920,1462,1677,2909, #  544
1699,5048,4526,1424,2442,3140,3690,2600,3353,1775,1941,3460,3983,4213, 309,1369, #  560
1130,2825, 364,2234,1653,1299,3984,3567,3985,3986,2656, 525,1085,3041, 902,2001, #  576
1475, 964,4527, 421,1845,1415,1057,2286, 940,1364,3141, 376,4528,4529,1381,   7, #  592
2527, 983,2383, 336,1710,2684,1846, 321,3461, 559,1131,3042,2752,1809,1132,1313, #  608
 265,1481,1858,5049, 352,1203,2826,3280, 167,1089, 420,2827, 776, 792,1724,3568, #  624
4214,2443,3281,5050,4215,5051, 446, 229, 333,2753, 901,3818,1200,1557,4530,2657, #  640
1921, 395,2754,2685,3819,4216,1836, 125, 916,3209,2626,4531,5052,5053,3820,5054, #  656
5055,5056,4532,3142,3691,1133,2555,1757,3462,1510,2318,1409,3569,5057,2146, 438, #  672
2601,2910,2384,3354,1068, 958,3043, 461, 311,2869,2686,4217,1916,3210,4218,1979, #  688
 383, 750,2755,2627,4219, 274, 539, 385,1278,1442,5058,1154,1965, 384, 561, 210, #  704
  98,1295,2556,3570,5059,1711,2420,1482,3463,3987,2911,1257, 129,5060,3821, 642, #  720
 523,2789,2790,2658,5061, 141,2235,1333,  68, 176, 441, 876, 907,4220, 603,2602, #  736
 710, 171,3464, 404, 549,  18,3143,2398,1410,3692,1666,5062,3571,4533,2912,4534, #  752
5063,2991, 368,5064, 146, 366,  99, 871,3693,1543, 748, 807,1586,1185,  22,2263, #  768
 379,3822,3211,5065,3212, 505,1942,2628,1992,1382,2319,5066, 380,2362, 218, 702, #  784
1818,1248,3465,3044,3572,3355,3282,5067,2992,3694, 930,3283,3823,5068,  59,5069, #  800
 585, 601,4221, 497,3466,1112,1314,4535,1802,5070,1223,1472,2177,5071, 749,1837, #  816
 690,1900,3824,1773,3988,1476, 429,1043,1791,2236,2117, 917,4222, 447,1086,1629, #  832
5072, 556,5073,5074,2021,1654, 844,1090, 105, 550, 966,1758,2828,1008,1783, 686, #  848
1095,5075,2287, 793,1602,5076,3573,2603,4536,4223,2948,2302,4537,3825, 980,2503, #  864
 544, 353, 527,4538, 908,2687,2913,5077, 381,2629,1943,1348,5078,1341,1252, 560, #  880
3095,5079,3467,2870,5080,2054, 973, 886,2081, 143,4539,5081,5082, 157,3989, 496, #  896
4224,  57, 840, 540,2039,4540,4541,3468,2118,1445, 970,2264,1748,1966,2082,4225, #  912
3144,1234,1776,3284,2829,3695, 773,1206,2130,1066,2040,1326,3990,1738,1725,4226, #  928
 279,3145,  51,1544,2604, 423,1578,2131,2067, 173,4542,1880,5083,5084,1583, 264, #  944
 610,3696,4543,2444, 280, 154,5085,5086,5087,1739, 338,1282,3096, 693,2871,1411, #  960
1074,3826,2445,5088,4544,5089,5090,1240, 952,2399,5091,2914,1538,2688, 685,1483, #  976
4227,2475,1436, 953,4228,2055,4545, 671,2400,  79,4229,2446,3285, 608, 567,2689, #  992
3469,4230,4231,1691, 393,1261,1792,2401,5092,4546,5093,5094,5095,5096,1383,1672, # 1008
3827,3213,1464, 522,1119, 661,1150, 216, 675,4547,3991,1432,3574, 609,4548,2690, # 1024
2402,5097,5098,5099,4232,3045,   0,5100,2476, 315, 231,2447, 301,3356,4549,2385, # 1040
5101, 233,4233,3697,1819,4550,4551,5102,  96,1777,1315,2083,5103, 257,5104,1810, # 1056
3698,2718,1139,1820,4234,2022,1124,2164,2791,1778,2659,5105,3097, 363,1655,3214, # 1072
5106,2993,5107,5108,5109,3992,1567,3993, 718, 103,3215, 849,1443, 341,3357,2949, # 1088
1484,5110,1712, 127,  67, 339,4235,2403, 679,1412, 821,5111,5112, 834, 738, 351, # 1104
2994,2147, 846, 235,1497,1881, 418,1993,3828,2719, 186,1100,2148,2756,3575,1545, # 1120
1355,2950,2872,1377, 583,3994,4236,2581,2995,5113,1298,3699,1078,2557,3700,2363, # 1136
  78,3829,3830, 267,1289,2100,2002,1594,4237, 348, 369,1274,2197,2178,1838,4552, # 1152
1821,2830,3701,2757,2288,2003,4553,2951,2758, 144,3358, 882,4554,3995,2759,3470, # 1168
4555,2915,5114,4238,1726, 320,5115,3996,3046, 788,2996,5116,2831,1774,1327,2873, # 1184
3997,2832,5117,1306,4556,2004,1700,3831,3576,2364,2660, 787,2023, 506, 824,3702, # 1200
 534, 323,4557,1044,3359,2024,1901, 946,3471,5118,1779,1500,1678,5119,1882,4558, # 1216
 165, 243,4559,3703,2528, 123, 683,4239, 764,4560,  36,3998,1793, 589,2916, 816, # 1232
 626,1667,3047,2237,1639,1555,1622,3832,3999,5120,4000,2874,1370,1228,1933, 891, # 1248
2084,2917, 304,4240,5121, 292,2997,2720,3577, 691,2101,4241,1115,4561, 118, 662, # 1264
5122, 611,1156, 854,2386,1316,2875,   2, 386, 515,2918,5123,5124,3286, 868,2238, # 1280
1486, 855,2661, 785,2216,3048,5125,1040,3216,3578,5126,3146, 448,5127,1525,5128, # 1296
2165,4562,5129,3833,5130,4242,2833,3579,3147, 503, 818,4001,3148,1568, 814, 676, # 1312
1444, 306,1749,5131,3834,1416,1030, 197,1428, 805,2834,1501,4563,5132,5133,5134, # 1328
1994,5135,4564,5136,5137,2198,  13,2792,3704,2998,3149,1229,1917,5138,3835,2132, # 1344
5139,4243,4565,2404,3580,5140,2217,1511,1727,1120,5141,5142, 646,3836,2448, 307, # 1360
5143,5144,1595,3217,5145,5146,5147,3705,1113,1356,4002,1465,2529,2530,5148, 519, # 1376
5149, 128,2133,  92,2289,1980,5150,4003,1512, 342,3150,2199,5151,2793,2218,1981, # 1392
3360,4244, 290,1656,1317, 789, 827,2365,5152,3837,4566, 562, 581,4004,5153, 401, # 1408
4567,2252,  94,4568,5154,1399,2794,5155,1463,2025,4569,3218,1944,5156, 828,1105, # 1424
4245,1262,1394,5157,4246, 605,4570,5158,1784,2876,5159,2835, 819,2102, 578,2200, # 1440
2952,5160,1502, 436,3287,4247,3288,2836,4005,2919,3472,3473,5161,2721,2320,5162, # 1456
5163,2337,2068,  23,4571, 193, 826,3838,2103, 699,1630,4248,3098, 390,1794,1064, # 1472
3581,5164,1579,3099,3100,1400,5165,4249,1839,1640,2877,5166,4572,4573, 137,4250, # 1488
 598,3101,1967, 780, 104, 974,2953,5167, 278, 899, 253, 402, 572, 504, 493,1339, # 1504
5168,4006,1275,4574,2582,2558,5169,3706,3049,3102,2253, 565,1334,2722, 863,  41, # 1520
5170,5171,4575,5172,1657,2338,  19, 463,2760,4251, 606,5173,2999,3289,1087,2085, # 1536
1323,2662,3000,5174,1631,1623,1750,4252,2691,5175,2878, 791,2723,2663,2339, 232, # 1552
2421,5176,3001,1498,5177,2664,2630, 755,1366,3707,3290,3151,2026,1609, 119,1918, # 1568
3474, 862,1026,4253,5178,4007,3839,4576,4008,4577,2265,1952,2477,5179,1125, 817, # 1584
4254,4255,4009,1513,1766,2041,1487,4256,3050,3291,2837,3840,3152,5180,5181,1507, # 1600
5182,2692, 733,  40,1632,1106,2879, 345,4257, 841,2531, 230,4578,3002,1847,3292, # 1616
3475,5183,1263, 986,3476,5184, 735, 879, 254,1137, 857, 622,1300,1180,1388,1562, # 1632
4010,4011,2954, 967,2761,2665,1349, 592,2134,1692,3361,3003,1995,4258,1679,4012, # 1648
1902,2188,5185, 739,3708,2724,1296,1290,5186,4259,2201,2202,1922,1563,2605,2559, # 1664
1871,2762,3004,5187, 435,5188, 343,1108, 596,  17,1751,4579,2239,3477,3709,5189, # 1680
4580, 294,3582,2955,1693, 477, 979, 281,2042,3583, 643,2043,3710,2631,2795,2266, # 1696
1031,2340,2135,2303,3584,4581, 367,1249,2560,5190,3585,5191,4582,1283,3362,2005, # 1712
 240,1762,3363,4583,4584, 836,1069,3153, 474,5192,2149,2532, 268,3586,5193,3219, # 1728
1521,1284,5194,1658,1546,4260,5195,3587,3588,5196,4261,3364,2693,1685,4262, 961, # 1744
1673,2632, 190,2006,2203,3841,4585,4586,5197, 570,2504,3711,1490,5198,4587,2633, # 1760
3293,1957,4588, 584,1514, 396,1045,1945,5199,4589,1968,2449,5200,5201,4590,4013, # 1776
 619,5202,3154,3294, 215,2007,2796,2561,3220,4591,3221,4592, 763,4263,3842,4593, # 1792
5203,5204,1958,1767,2956,3365,3712,1174, 452,1477,4594,3366,3155,5205,2838,1253, # 1808
2387,2189,1091,2290,4264, 492,5206, 638,1169,1825,2136,1752,4014, 648, 926,1021, # 1824
1324,4595, 520,4596, 997, 847,1007, 892,4597,3843,2267,1872,3713,2405,1785,4598, # 1840
1953,2957,3103,3222,1728,4265,2044,3714,4599,2008,1701,3156,1551,  30,2268,4266, # 1856
5207,2027,4600,3589,5208, 501,5209,4267, 594,3478,2166,1822,3590,3479,3591,3223, # 1872
 829,2839,4268,5210,1680,3157,1225,4269,5211,3295,4601,4270,3158,2341,5212,4602, # 1888
4271,5213,4015,4016,5214,1848,2388,2606,3367,5215,4603, 374,4017, 652,4272,4273, # 1904
 375,1140, 798,5216,5217,5218,2366,4604,2269, 546,1659, 138,3051,2450,4605,5219, # 1920
2254, 612,1849, 910, 796,3844,1740,1371, 825,3845,3846,5220,2920,2562,5221, 692, # 1936
 444,3052,2634, 801,4606,4274,5222,1491, 244,1053,3053,4275,4276, 340,5223,4018, # 1952
1041,3005, 293,1168,  87,1357,5224,1539, 959,5225,2240, 721, 694,4277,3847, 219, # 1968
1478, 644,1417,3368,2666,1413,1401,1335,1389,4019,5226,5227,3006,2367,3159,1826, # 1984
 730,1515, 184,2840,  66,4607,5228,1660,2958, 246,3369, 378,1457, 226,3480, 975, # 2000
4020,2959,1264,3592, 674, 696,5229, 163,5230,1141,2422,2167, 713,3593,3370,4608, # 2016
4021,5231,5232,1186,  15,5233,1079,1070,5234,1522,3224,3594, 276,1050,2725, 758, # 2032
1126, 653,2960,3296,5235,2342, 889,3595,4022,3104,3007, 903,1250,4609,4023,3481, # 2048
3596,1342,1681,1718, 766,3297, 286,  89,2961,3715,5236,1713,5237,2607,3371,3008, # 2064
5238,2962,2219,3225,2880,5239,4610,2505,2533, 181, 387,1075,4024, 731,2190,3372, # 2080
5240,3298, 310, 313,3482,2304, 770,4278,  54,3054, 189,4611,3105,3848,4025,5241, # 2096
1230,1617,1850, 355,3597,4279,4612,3373, 111,4280,3716,1350,3160,3483,3055,4281, # 2112
2150,3299,3598,5242,2797,4026,4027,3009, 722,2009,5243,1071, 247,1207,2343,2478, # 2128
1378,4613,2010, 864,1437,1214,4614, 373,3849,1142,2220, 667,4615, 442,2763,2563, # 2144
3850,4028,1969,4282,3300,1840, 837, 170,1107, 934,1336,1883,5244,5245,2119,4283, # 2160
2841, 743,1569,5246,4616,4284, 582,2389,1418,3484,5247,1803,5248, 357,1395,1729, # 2176
3717,3301,2423,1564,2241,5249,3106,3851,1633,4617,1114,2086,4285,1532,5250, 482, # 2192
2451,4618,5251,5252,1492, 833,1466,5253,2726,3599,1641,2842,5254,1526,1272,3718, # 2208
4286,1686,1795, 416,2564,1903,1954,1804,5255,3852,2798,3853,1159,2321,5256,2881, # 2224
4619,1610,1584,3056,2424,2764, 443,3302,1163,3161,5257,5258,4029,5259,4287,2506, # 2240
3057,4620,4030,3162,2104,1647,3600,2011,1873,4288,5260,4289, 431,3485,5261, 250, # 2256
  97,  81,4290,5262,1648,1851,1558, 160, 848,5263, 866, 740,1694,5264,2204,2843, # 2272
3226,4291,4621,3719,1687, 950,2479, 426, 469,3227,3720,3721,4031,5265,5266,1188, # 2288
 424,1996, 861,3601,4292,3854,2205,2694, 168,1235,3602,4293,5267,2087,1674,4622, # 2304
3374,3303, 220,2565,1009,5268,3855, 670,3010, 332,1208, 717,5269,5270,3603,2452, # 2320
4032,3375,5271, 513,5272,1209,2882,3376,3163,4623,1080,5273,5274,5275,5276,2534, # 2336
3722,3604, 815,1587,4033,4034,5277,3605,3486,3856,1254,4624,1328,3058,1390,4035, # 2352
1741,4036,3857,4037,5278, 236,3858,2453,3304,5279,5280,3723,3859,1273,3860,4625, # 2368
5281, 308,5282,4626, 245,4627,1852,2480,1307,2583, 430, 715,2137,2454,5283, 270, # 2384
 199,2883,4038,5284,3606,2727,1753, 761,1754, 725,1661,1841,4628,3487,3724,5285, # 2400
5286, 587,  14,3305, 227,2608, 326, 480,2270, 943,2765,3607, 291, 650,1884,5287, # 2416
1702,1226, 102,1547,  62,3488, 904,4629,3489,1164,4294,5288,5289,1224,1548,2766, # 2432
 391, 498,1493,5290,1386,1419,5291,2056,1177,4630, 813, 880,1081,2368, 566,1145, # 2448
4631,2291,1001,1035,2566,2609,2242, 394,1286,5292,5293,2069,5294,  86,1494,1730, # 2464
4039, 491,1588, 745, 897,2963, 843,3377,4040,2767,2884,3306,1768, 998,2221,2070, # 2480
 397,1827,1195,1970,3725,3011,3378, 284,5295,3861,2507,2138,2120,1904,5296,4041, # 2496
2151,4042,4295,1036,3490,1905, 114,2567,4296, 209,1527,5297,5298,2964,2844,2635, # 2512
2390,2728,3164, 812,2568,5299,3307,5300,1559, 737,1885,3726,1210, 885,  28,2695, # 2528
3608,3862,5301,4297,1004,1780,4632,5302, 346,1982,2222,2696,4633,3863,1742, 797, # 2544
1642,4043,1934,1072,1384,2152, 896,4044,3308,3727,3228,2885,3609,5303,2569,1959, # 2560
4634,2455,1786,5304,5305,5306,4045,4298,1005,1308,3728,4299,2729,4635,4636,1528, # 2576
2610, 161,1178,4300,1983, 987,4637,1101,4301, 631,4046,1157,3229,2425,1343,1241, # 2592
1016,2243,2570, 372, 877,2344,2508,1160, 555,1935, 911,4047,5307, 466,1170, 169, # 2608
1051,2921,2697,3729,2481,3012,1182,2012,2571,1251,2636,5308, 992,2345,3491,1540, # 2624
2730,1201,2071,2406,1997,2482,5309,4638, 528,1923,2191,1503,1874,1570,2369,3379, # 2640
3309,5310, 557,1073,5311,1828,3492,2088,2271,3165,3059,3107, 767,3108,2799,4639, # 2656
1006,4302,4640,2346,1267,2179,3730,3230, 778,4048,3231,2731,1597,2667,5312,4641, # 2672
5313,3493,5314,5315,5316,3310,2698,1433,3311, 131,  95,1504,4049, 723,4303,3166, # 2688
1842,3610,2768,2192,4050,2028,2105,3731,5317,3013,4051,1218,5318,3380,3232,4052, # 2704
4304,2584, 248,1634,3864, 912,5319,2845,3732,3060,3865, 654,  53,5320,3014,5321, # 2720
1688,4642, 777,3494,1032,4053,1425,5322, 191, 820,2121,2846, 971,4643, 931,3233, # 2736
 135, 664, 783,3866,1998, 772,2922,1936,4054,3867,4644,2923,3234, 282,2732, 640, # 2752
1372,3495,1127, 922, 325,3381,5323,5324, 711,2045,5325,5326,4055,2223,2800,1937, # 2768
4056,3382,2224,2255,3868,2305,5327,4645,3869,1258,3312,4057,3235,2139,2965,4058, # 2784
4059,5328,2225, 258,3236,4646, 101,1227,5329,3313,1755,5330,1391,3314,5331,2924, # 2800
2057, 893,5332,5333,5334,1402,4305,2347,5335,5336,3237,3611,5337,5338, 878,1325, # 2816
1781,2801,4647, 259,1385,2585, 744,1183,2272,4648,5339,4060,2509,5340, 684,1024, # 2832
4306,5341, 472,3612,3496,1165,3315,4061,4062, 322,2153, 881, 455,1695,1152,1340, # 2848
 660, 554,2154,4649,1058,4650,4307, 830,1065,3383,4063,4651,1924,5342,1703,1919, # 2864
5343, 932,2273, 122,5344,4652, 947, 677,5345,3870,2637, 297,1906,1925,2274,4653, # 2880
2322,3316,5346,5347,4308,5348,4309,  84,4310, 112, 989,5349, 547,1059,4064, 701, # 2896
3613,1019,5350,4311,5351,3497, 942, 639, 457,2306,2456, 993,2966, 407, 851, 494, # 2912
4654,3384, 927,5352,1237,5353,2426,3385, 573,4312, 680, 921,2925,1279,1875, 285, # 2928
 790,1448,1984, 719,2168,5354,5355,4655,4065,4066,1649,5356,1541, 563,5357,1077, # 2944
5358,3386,3061,3498, 511,3015,4067,4068,3733,4069,1268,2572,3387,3238,4656,4657, # 2960
5359, 535,1048,1276,1189,2926,2029,3167,1438,1373,2847,2967,1134,2013,5360,4313, # 2976
1238,2586,3109,1259,5361, 700,5362,2968,3168,3734,4314,5363,4315,1146,1876,1907, # 2992
4658,2611,4070, 781,2427, 132,1589, 203, 147, 273,2802,2407, 898,1787,2155,4071, # 3008
4072,5364,3871,2803,5365,5366,4659,4660,5367,3239,5368,1635,3872, 965,5369,1805, # 3024
2699,1516,3614,1121,1082,1329,3317,4073,1449,3873,  65,1128,2848,2927,2769,1590, # 3040
3874,5370,5371,  12,2668,  45, 976,2587,3169,4661, 517,2535,1013,1037,3240,5372, # 3056
3875,2849,5373,3876,5374,3499,5375,2612, 614,1999,2323,3877,3110,2733,2638,5376, # 3072
2588,4316, 599,1269,5377,1811,3735,5378,2700,3111, 759,1060, 489,1806,3388,3318, # 3088
1358,5379,5380,2391,1387,1215,2639,2256, 490,5381,5382,4317,1759,2392,2348,5383, # 3104
4662,3878,1908,4074,2640,1807,3241,4663,3500,3319,2770,2349, 874,5384,5385,3501, # 3120
3736,1859,  91,2928,3737,3062,3879,4664,5386,3170,4075,2669,5387,3502,1202,1403, # 3136
3880,2969,2536,1517,2510,4665,3503,2511,5388,4666,5389,2701,1886,1495,1731,4076, # 3152
2370,4667,5390,2030,5391,5392,4077,2702,1216, 237,2589,4318,2324,4078,3881,4668, # 3168
4669,2703,3615,3504, 445,4670,5393,5394,5395,5396,2771,  61,4079,3738,1823,4080, # 3184
5397, 687,2046, 935, 925, 405,2670, 703,1096,1860,2734,4671,4081,1877,1367,2704, # 3200
3389, 918,2106,1782,2483, 334,3320,1611,1093,4672, 564,3171,3505,3739,3390, 945, # 3216
2641,2058,4673,5398,1926, 872,4319,5399,3506,2705,3112, 349,4320,3740,4082,4674, # 3232
3882,4321,3741,2156,4083,4675,4676,4322,4677,2408,2047, 782,4084, 400, 251,4323, # 3248
1624,5400,5401, 277,3742, 299,1265, 476,1191,3883,2122,4324,4325,1109, 205,5402, # 3264
2590,1000,2157,3616,1861,5403,5404,5405,4678,5406,4679,2573, 107,2484,2158,4085, # 3280
3507,3172,5407,1533, 541,1301, 158, 753,4326,2886,3617,5408,1696, 370,1088,4327, # 3296
4680,3618, 579, 327, 440, 162,2244, 269,1938,1374,3508, 968,3063,  56,1396,3113, # 3312
2107,3321,3391,5409,1927,2159,4681,3016,5410,3619,5411,5412,3743,4682,2485,5413, # 3328
2804,5414,1650,4683,5415,2613,5416,5417,4086,2671,3392,1149,3393,4087,3884,4088, # 3344
5418,1076,  49,5419, 951,3242,3322,3323, 450,2850, 920,5420,1812,2805,2371,4328, # 3360
1909,1138,2372,3885,3509,5421,3243,4684,1910,1147,1518,2428,4685,3886,5422,4686, # 3376
2393,2614, 260,1796,3244,5423,5424,3887,3324, 708,5425,3620,1704,5426,3621,1351, # 3392
1618,3394,3017,1887, 944,4329,3395,4330,3064,3396,4331,5427,3744, 422, 413,1714, # 3408
3325, 500,2059,2350,4332,2486,5428,1344,1911, 954,5429,1668,5430,5431,4089,2409, # 3424
4333,3622,3888,4334,5432,2307,1318,2512,3114, 133,3115,2887,4687, 629,  31,2851, # 3440
2706,3889,4688, 850, 949,4689,4090,2970,1732,2089,4335,1496,1853,5433,4091, 620, # 3456
3245, 981,1242,3745,3397,1619,3746,1643,3326,2140,2457,1971,1719,3510,2169,5434, # 3472
3246,5435,5436,3398,1829,5437,1277,4690,1565,2048,5438,1636,3623,3116,5439, 869, # 3488
2852, 655,3890,3891,3117,4092,3018,3892,1310,3624,4691,5440,5441,5442,1733, 558, # 3504
4692,3747, 335,1549,3065,1756,4336,3748,1946,3511,1830,1291,1192, 470,2735,2108, # 3520
2806, 913,1054,4093,5443,1027,5444,3066,4094,4693, 982,2672,3399,3173,3512,3247, # 3536
3248,1947,2807,5445, 571,4694,5446,1831,5447,3625,2591,1523,2429,5448,2090, 984, # 3552
4695,3749,1960,5449,3750, 852, 923,2808,3513,3751, 969,1519, 999,2049,2325,1705, # 3568
5450,3118, 615,1662, 151, 597,4095,2410,2326,1049, 275,4696,3752,4337, 568,3753, # 3584
3626,2487,4338,3754,5451,2430,2275, 409,3249,5452,1566,2888,3514,1002, 769,2853, # 3600
 194,2091,3174,3755,2226,3327,4339, 628,1505,5453,5454,1763,2180,3019,4096, 521, # 3616
1161,2592,1788,2206,2411,4697,4097,1625,4340,4341, 412,  42,3119, 464,5455,2642, # 3632
4698,3400,1760,1571,2889,3515,2537,1219,2207,3893,2643,2141,2373,4699,4700,3328, # 3648
1651,3401,3627,5456,5457,3628,2488,3516,5458,3756,5459,5460,2276,2092, 460,5461, # 3664
4701,5462,3020, 962, 588,3629, 289,3250,2644,1116,  52,5463,3067,1797,5464,5465, # 3680
5466,1467,5467,1598,1143,3757,4342,1985,1734,1067,4702,1280,3402, 465,4703,1572, # 3696
 510,5468,1928,2245,1813,1644,3630,5469,4704,3758,5470,5471,2673,1573,1534,5472, # 3712
5473, 536,1808,1761,3517,3894,3175,2645,5474,5475,5476,4705,3518,2929,1912,2809, # 3728
5477,3329,1122, 377,3251,5478, 360,5479,5480,4343,1529, 551,5481,2060,3759,1769, # 3744
2431,5482,2930,4344,3330,3120,2327,2109,2031,4706,1404, 136,1468,1479, 672,1171, # 3760
3252,2308, 271,3176,5483,2772,5484,2050, 678,2736, 865,1948,4707,5485,2014,4098, # 3776
2971,5486,2737,2227,1397,3068,3760,4708,4709,1735,2931,3403,3631,5487,3895, 509, # 3792
2854,2458,2890,3896,5488,5489,3177,3178,4710,4345,2538,4711,2309,1166,1010, 552, # 3808
 681,1888,5490,5491,2972,2973,4099,1287,1596,1862,3179, 358, 453, 736, 175, 478, # 3824
1117, 905,1167,1097,5492,1854,1530,5493,1706,5494,2181,3519,2292,3761,3520,3632, # 3840
4346,2093,4347,5495,3404,1193,2489,4348,1458,2193,2208,1863,1889,1421,3331,2932, # 3856
3069,2182,3521, 595,2123,5496,4100,5497,5498,4349,1707,2646, 223,3762,1359, 751, # 3872
3121, 183,3522,5499,2810,3021, 419,2374, 633, 704,3897,2394, 241,5500,5501,5502, # 3888
 838,3022,3763,2277,2773,2459,3898,1939,2051,4101,1309,3122,2246,1181,5503,1136, # 3904
2209,3899,2375,1446,4350,2310,4712,5504,5505,4351,1055,2615, 484,3764,5506,4102, # 3920
 625,4352,2278,3405,1499,4353,4103,5507,4104,4354,3253,2279,2280,3523,5508,5509, # 3936
2774, 808,2616,3765,3406,4105,4355,3123,2539, 526,3407,3900,4356, 955,5510,1620, # 3952
4357,2647,2432,5511,1429,3766,1669,1832, 994, 928,5512,3633,1260,5513,5514,5515, # 3968
1949,2293, 741,2933,1626,4358,2738,2460, 867,1184, 362,3408,1392,5516,5517,4106, # 3984
4359,1770,1736,3254,2934,4713,4714,1929,2707,1459,1158,5518,3070,3409,2891,1292, # 4000
1930,2513,2855,3767,1986,1187,2072,2015,2617,4360,5519,2574,2514,2170,3768,2490, # 4016
3332,5520,3769,4715,5521,5522, 666,1003,3023,1022,3634,4361,5523,4716,1814,2257, # 4032
 574,3901,1603, 295,1535, 705,3902,4362, 283, 858, 417,5524,5525,3255,4717,4718, # 4048
3071,1220,1890,1046,2281,2461,4107,1393,1599, 689,2575, 388,4363,5526,2491, 802, # 4064
5527,2811,3903,2061,1405,2258,5528,4719,3904,2110,1052,1345,3256,1585,5529, 809, # 4080
5530,5531,5532, 575,2739,3524, 956,1552,1469,1144,2328,5533,2329,1560,2462,3635, # 4096
3257,4108, 616,2210,4364,3180,2183,2294,5534,1833,5535,3525,4720,5536,1319,3770, # 4112
3771,1211,3636,1023,3258,1293,2812,5537,5538,5539,3905, 607,2311,3906, 762,2892, # 4128
1439,4365,1360,4721,1485,3072,5540,4722,1038,4366,1450,2062,2648,4367,1379,4723, # 4144
2593,5541,5542,4368,1352,1414,2330,2935,1172,5543,5544,3907,3908,4724,1798,1451, # 4160
5545,5546,5547,5548,2936,4109,4110,2492,2351, 411,4111,4112,3637,3333,3124,4725, # 4176
1561,2674,1452,4113,1375,5549,5550,  47,2974, 316,5551,1406,1591,2937,3181,5552, # 4192
1025,2142,3125,3182, 354,2740, 884,2228,4369,2412, 508,3772, 726,3638, 996,2433, # 4208
3639, 729,5553, 392,2194,1453,4114,4726,3773,5554,5555,2463,3640,2618,1675,2813, # 4224
 919,2352,2975,2353,1270,4727,4115,  73,5556,5557, 647,5558,3259,2856,2259,1550, # 4240
1346,3024,5559,1332, 883,3526,5560,5561,5562,5563,3334,2775,5564,1212, 831,1347, # 4256
4370,4728,2331,3909,1864,3073, 720,3910,4729,4730,3911,5565,4371,5566,5567,4731, # 4272
5568,5569,1799,4732,3774,2619,4733,3641,1645,2376,4734,5570,2938, 669,2211,2675, # 4288
2434,5571,2893,5572,5573,1028,3260,5574,4372,2413,5575,2260,1353,5576,5577,4735, # 4304
3183, 518,5578,4116,5579,4373,1961,5580,2143,4374,5581,5582,3025,2354,2355,3912, # 4320
 516,1834,1454,4117,2708,4375,4736,2229,2620,1972,1129,3642,5583,2776,5584,2976, # 4336
1422, 577,1470,3026,1524,3410,5585,5586, 432,4376,3074,3527,5587,2594,1455,2515, # 4352
2230,1973,1175,5588,1020,2741,4118,3528,4737,5589,2742,5590,1743,1361,3075,3529, # 4368
2649,4119,4377,4738,2295, 895, 924,4378,2171, 331,2247,3076, 166,1627,3077,1098, # 4384
5591,1232,2894,2231,3411,4739, 657, 403,1196,2377, 542,3775,3412,1600,4379,3530, # 4400
5592,4740,2777,3261, 576, 530,1362,4741,4742,2540,2676,3776,4120,5593, 842,3913, # 4416
5594,2814,2032,1014,4121, 213,2709,3413, 665, 621,4380,5595,3777,2939,2435,5596, # 4432
2436,3335,3643,3414,4743,4381,2541,4382,4744,3644,1682,4383,3531,1380,5597, 724, # 4448
2282, 600,1670,5598,1337,1233,4745,3126,2248,5599,1621,4746,5600, 651,4384,5601, # 4464
1612,4385,2621,5602,2857,5603,2743,2312,3078,5604, 716,2464,3079, 174,1255,2710, # 4480
4122,3645, 548,1320,1398, 728,4123,1574,5605,1891,1197,3080,4124,5606,3081,3082, # 4496
3778,3646,3779, 747,5607, 635,4386,4747,5608,5609,5610,4387,5611,5612,4748,5613, # 4512
3415,4749,2437, 451,5614,3780,2542,2073,4388,2744,4389,4125,5615,1764,4750,5616, # 4528
4390, 350,4751,2283,2395,2493,5617,4391,4126,2249,1434,4127, 488,4752, 458,4392, # 4544
4128,3781, 771,1330,2396,3914,2576,3184,2160,2414,1553,2677,3185,4393,5618,2494, # 4560
2895,2622,1720,2711,4394,3416,4753,5619,2543,4395,5620,3262,4396,2778,5621,2016, # 4576
2745,5622,1155,1017,3782,3915,5623,3336,2313, 201,1865,4397,1430,5624,4129,5625, # 4592
5626,5627,5628,5629,4398,1604,5630, 414,1866, 371,2595,4754,4755,3532,2017,3127, # 4608
4756,1708, 960,4399, 887, 389,2172,1536,1663,1721,5631,2232,4130,2356,2940,1580, # 4624
5632,5633,1744,4757,2544,4758,4759,5634,4760,5635,2074,5636,4761,3647,3417,2896, # 4640
4400,5637,4401,2650,3418,2815, 673,2712,2465, 709,3533,4131,3648,4402,5638,1148, # 4656
 502, 634,5639,5640,1204,4762,3649,1575,4763,2623,3783,5641,3784,3128, 948,3263, # 4672
 121,1745,3916,1110,5642,4403,3083,2516,3027,4132,3785,1151,1771,3917,1488,4133, # 4688
1987,5643,2438,3534,5644,5645,2094,5646,4404,3918,1213,1407,2816, 531,2746,2545, # 4704
3264,1011,1537,4764,2779,4405,3129,1061,5647,3786,3787,1867,2897,5648,2018, 120, # 4720
4406,4407,2063,3650,3265,2314,3919,2678,3419,1955,4765,4134,5649,3535,1047,2713, # 4736
1266,5650,1368,4766,2858, 649,3420,3920,2546,2747,1102,2859,2679,5651,5652,2000, # 4752
5653,1111,3651,2977,5654,2495,3921,3652,2817,1855,3421,3788,5655,5656,3422,2415, # 4768
2898,3337,3266,3653,5657,2577,5658,3654,2818,4135,1460, 856,5659,3655,5660,2899, # 4784
2978,5661,2900,3922,5662,4408, 632,2517, 875,3923,1697,3924,2296,5663,5664,4767, # 4800
3028,1239, 580,4768,4409,5665, 914, 936,2075,1190,4136,1039,2124,5666,5667,5668, # 4816
5669,3423,1473,5670,1354,4410,3925,4769,2173,3084,4137, 915,3338,4411,4412,3339, # 4832
1605,1835,5671,2748, 398,3656,4413,3926,4138, 328,1913,2860,4139,3927,1331,4414, # 4848
3029, 937,4415,5672,3657,4140,4141,3424,2161,4770,3425, 524, 742, 538,3085,1012, # 4864
5673,5674,3928,2466,5675, 658,1103, 225,3929,5676,5677,4771,5678,4772,5679,3267, # 4880
1243,5680,4142, 963,2250,4773,5681,2714,3658,3186,5682,5683,2596,2332,5684,4774, # 4896
5685,5686,5687,3536, 957,3426,2547,2033,1931,2941,2467, 870,2019,3659,1746,2780, # 4912
2781,2439,2468,5688,3930,5689,3789,3130,3790,3537,3427,3791,5690,1179,3086,5691, # 4928
3187,2378,4416,3792,2548,3188,3131,2749,4143,5692,3428,1556,2549,2297, 977,2901, # 4944
2034,4144,1205,3429,5693,1765,3430,3189,2125,1271, 714,1689,4775,3538,5694,2333, # 4960
3931, 533,4417,3660,2184, 617,5695,2469,3340,3539,2315,5696,5697,3190,5698,5699, # 4976
3932,1988, 618, 427,2651,3540,3431,5700,5701,1244,1690,5702,2819,4418,4776,5703, # 4992
3541,4777,5704,2284,1576, 473,3661,4419,3432, 972,5705,3662,5706,3087,5707,5708, # 5008
4778,4779,5709,3793,4145,4146,5710, 153,4780, 356,5711,1892,2902,4420,2144, 408, # 5024
 803,2357,5712,3933,5713,4421,1646,2578,2518,4781,4782,3934,5714,3935,4422,5715, # 5040
2416,3433, 752,5716,5717,1962,3341,2979,5718, 746,3030,2470,4783,4423,3794, 698, # 5056
4784,1893,4424,3663,2550,4785,3664,3936,5719,3191,3434,5720,1824,1302,4147,2715, # 5072
3937,1974,4425,5721,4426,3192, 823,1303,1288,1236,2861,3542,4148,3435, 774,3938, # 5088
5722,1581,4786,1304,2862,3939,4787,5723,2440,2162,1083,3268,4427,4149,4428, 344, # 5104
1173, 288,2316, 454,1683,5724,5725,1461,4788,4150,2597,5726,5727,4789, 985, 894, # 5120
5728,3436,3193,5729,1914,2942,3795,1989,5730,2111,1975,5731,4151,5732,2579,1194, # 5136
 425,5733,4790,3194,1245,3796,4429,5734,5735,2863,5736, 636,4791,1856,3940, 760, # 5152
1800,5737,4430,2212,1508,4792,4152,1894,1684,2298,5738,5739,4793,4431,4432,2213, # 5168
 479,5740,5741, 832,5742,4153,2496,5743,2980,2497,3797, 990,3132, 627,1815,2652, # 5184
4433,1582,4434,2126,2112,3543,4794,5744, 799,4435,3195,5745,4795,2113,1737,3031, # 5200
1018, 543, 754,4436,3342,1676,4796,4797,4154,4798,1489,5746,3544,5747,2624,2903, # 5216
4155,5748,5749,2981,5750,5751,5752,5753,3196,4799,4800,2185,1722,5754,3269,3270, # 5232
1843,3665,1715, 481, 365,1976,1857,5755,5756,1963,2498,4801,5757,2127,3666,3271, # 5248
 433,1895,2064,2076,5758, 602,2750,5759,5760,5761,5762,5763,3032,1628,3437,5764, # 5264
3197,4802,4156,2904,4803,2519,5765,2551,2782,5766,5767,5768,3343,4804,2905,5769, # 5280
4805,5770,2864,4806,4807,1221,2982,4157,2520,5771,5772,5773,1868,1990,5774,5775, # 5296
5776,1896,5777,5778,4808,1897,4158, 318,5779,2095,4159,4437,5780,5781, 485,5782, # 5312
 938,3941, 553,2680, 116,5783,3942,3667,5784,3545,2681,2783,3438,3344,2820,5785, # 5328
3668,2943,4160,1747,2944,2983,5786,5787, 207,5788,4809,5789,4810,2521,5790,3033, # 5344
 890,3669,3943,5791,1878,3798,3439,5792,2186,2358,3440,1652,5793,5794,5795, 941, # 5360
2299, 208,3546,4161,2020, 330,4438,3944,2906,2499,3799,4439,4811,5796,5797,5798, # 5376
)

site-packages/pip/_vendor/chardet/euctwprober.py
######################## BEGIN LICENSE BLOCK ########################
# The Original Code is mozilla.org code.
#
# The Initial Developer of the Original Code is
# Netscape Communications Corporation.
# Portions created by the Initial Developer are Copyright (C) 1998
# the Initial Developer. All Rights Reserved.
#
# Contributor(s):
#   Mark Pilgrim - port to Python
#
# This library is free software; you can redistribute it and/or
# modify it under the terms of the GNU Lesser General Public
# License as published by the Free Software Foundation; either
# version 2.1 of the License, or (at your option) any later version.
#
# This library is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
# Lesser General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public
# License along with this library; if not, write to the Free Software
# Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
# 02110-1301  USA
######################### END LICENSE BLOCK #########################

from .mbcharsetprober import MultiByteCharSetProber
from .codingstatemachine import CodingStateMachine
from .chardistribution import EUCTWDistributionAnalysis
from .mbcssm import EUCTW_SM_MODEL

class EUCTWProber(MultiByteCharSetProber):
    def __init__(self):
        super(EUCTWProber, self).__init__()
        self.coding_sm = CodingStateMachine(EUCTW_SM_MODEL)
        self.distribution_analyzer = EUCTWDistributionAnalysis()
        self.reset()

    @property
    def charset_name(self):
        return "EUC-TW"

    @property
    def language(self):
        return "Taiwan"
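

# Minimal usage sketch (editorial addition, not part of chardet itself): a
# prober is fed raw bytes and reports a confidence for its encoding, roughly:
#
#     from pip._vendor.chardet.euctwprober import EUCTWProber
#     prober = EUCTWProber()
#     prober.feed(sample_bytes)       # ``sample_bytes``: hypothetical EUC-TW data
#     print(prober.charset_name, prober.get_confidence())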
site-packages/pip/_vendor/chardet/universaldetector.py
######################## BEGIN LICENSE BLOCK ########################
# The Original Code is Mozilla Universal charset detector code.
#
# The Initial Developer of the Original Code is
# Netscape Communications Corporation.
# Portions created by the Initial Developer are Copyright (C) 2001
# the Initial Developer. All Rights Reserved.
#
# Contributor(s):
#   Mark Pilgrim - port to Python
#   Shy Shalom - original C code
#
# This library is free software; you can redistribute it and/or
# modify it under the terms of the GNU Lesser General Public
# License as published by the Free Software Foundation; either
# version 2.1 of the License, or (at your option) any later version.
#
# This library is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
# Lesser General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public
# License along with this library; if not, write to the Free Software
# Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
# 02110-1301  USA
######################### END LICENSE BLOCK #########################
"""
Module containing the UniversalDetector detector class, which is the primary
class a user of ``chardet`` should use.

:author: Mark Pilgrim (initial port to Python)
:author: Shy Shalom (original C code)
:author: Dan Blanchard (major refactoring for 3.0)
:author: Ian Cordasco
"""


import codecs
import logging
import re

from .charsetgroupprober import CharSetGroupProber
from .enums import InputState, LanguageFilter, ProbingState
from .escprober import EscCharSetProber
from .latin1prober import Latin1Prober
from .mbcsgroupprober import MBCSGroupProber
from .sbcsgroupprober import SBCSGroupProber


class UniversalDetector(object):
    """
    The ``UniversalDetector`` class underlies the ``chardet.detect`` function
    and coordinates all of the different charset probers.

    To get a ``dict`` containing an encoding and its confidence, you can simply
    run:

    .. code::

            u = UniversalDetector()
            u.feed(some_bytes)
            u.close()
            detected = u.result

    """

    MINIMUM_THRESHOLD = 0.20
    HIGH_BYTE_DETECTOR = re.compile(b'[\x80-\xFF]')
    ESC_DETECTOR = re.compile(b'(\033|~{)')
    WIN_BYTE_DETECTOR = re.compile(b'[\x80-\x9F]')
    ISO_WIN_MAP = {'iso-8859-1': 'Windows-1252',
                   'iso-8859-2': 'Windows-1250',
                   'iso-8859-5': 'Windows-1251',
                   'iso-8859-6': 'Windows-1256',
                   'iso-8859-7': 'Windows-1253',
                   'iso-8859-8': 'Windows-1255',
                   'iso-8859-9': 'Windows-1254',
                   'iso-8859-13': 'Windows-1257'}

    def __init__(self, lang_filter=LanguageFilter.ALL):
        self._esc_charset_prober = None
        self._charset_probers = []
        self.result = None
        self.done = None
        self._got_data = None
        self._input_state = None
        self._last_char = None
        self.lang_filter = lang_filter
        self.logger = logging.getLogger(__name__)
        self._has_win_bytes = None
        self.reset()

    def reset(self):
        """
        Reset the UniversalDetector and all of its probers back to their
        initial states.  This is called by ``__init__``, so you only need to
        call this directly in between analyses of different documents.
        """
        self.result = {'encoding': None, 'confidence': 0.0, 'language': None}
        self.done = False
        self._got_data = False
        self._has_win_bytes = False
        self._input_state = InputState.PURE_ASCII
        self._last_char = b''
        if self._esc_charset_prober:
            self._esc_charset_prober.reset()
        for prober in self._charset_probers:
            prober.reset()
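
    # Example (editorial addition): under the workflow the docstring above
    # describes, reusing one detector across documents is assumed to look like
    #
    #     detector.reset()
    #     detector.feed(next_document_bytes)
    #     detector.close()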

    def feed(self, byte_str):
        """
        Takes a chunk of a document and feeds it through all of the relevant
        charset probers.

        After calling ``feed``, you can check the value of the ``done``
        attribute to see if you need to continue feeding the
        ``UniversalDetector`` more data, or if it has made a prediction
        (in the ``result`` attribute).

        .. note::
           You should always call ``close`` when you're done feeding in your
           document if ``done`` is not already ``True``.
        """
        if self.done:
            return

        if not len(byte_str):
            return

        if not isinstance(byte_str, bytearray):
            byte_str = bytearray(byte_str)

        # First check for known BOMs, since these are guaranteed to be correct
        if not self._got_data:
            # If the data starts with BOM, we know it is UTF
            if byte_str.startswith(codecs.BOM_UTF8):
                # EF BB BF  UTF-8 with BOM
                self.result = {'encoding': "UTF-8-SIG",
                               'confidence': 1.0,
                               'language': ''}
            elif byte_str.startswith((codecs.BOM_UTF32_LE,
                                      codecs.BOM_UTF32_BE)):
                # FF FE 00 00  UTF-32, little-endian BOM
                # 00 00 FE FF  UTF-32, big-endian BOM
                self.result = {'encoding': "UTF-32",
                               'confidence': 1.0,
                               'language': ''}
            elif byte_str.startswith(b'\xFE\xFF\x00\x00'):
                # FE FF 00 00  UCS-4, unusual octet order BOM (3412)
                self.result = {'encoding': "X-ISO-10646-UCS-4-3412",
                               'confidence': 1.0,
                               'language': ''}
            elif byte_str.startswith(b'\x00\x00\xFF\xFE'):
                # 00 00 FF FE  UCS-4, unusual octet order BOM (2143)
                self.result = {'encoding': "X-ISO-10646-UCS-4-2143",
                               'confidence': 1.0,
                               'language': ''}
            elif byte_str.startswith((codecs.BOM_LE, codecs.BOM_BE)):
                # FF FE  UTF-16, little endian BOM
                # FE FF  UTF-16, big endian BOM
                self.result = {'encoding': "UTF-16",
                               'confidence': 1.0,
                               'language': ''}

            self._got_data = True
            if self.result['encoding'] is not None:
                self.done = True
                return

        # If none of those matched and we've only seen ASCII so far, check
        # for high bytes and escape sequences
        if self._input_state == InputState.PURE_ASCII:
            if self.HIGH_BYTE_DETECTOR.search(byte_str):
                self._input_state = InputState.HIGH_BYTE
            elif self._input_state == InputState.PURE_ASCII and \
                    self.ESC_DETECTOR.search(self._last_char + byte_str):
                self._input_state = InputState.ESC_ASCII

        self._last_char = byte_str[-1:]

        # If we've seen escape sequences, use the EscCharSetProber, which
        # uses a simple state machine to check for known escape sequences in
        # HZ and ISO-2022 encodings, since those are the only encodings that
        # use such sequences.
        if self._input_state == InputState.ESC_ASCII:
            if not self._esc_charset_prober:
                self._esc_charset_prober = EscCharSetProber(self.lang_filter)
            if self._esc_charset_prober.feed(byte_str) == ProbingState.FOUND_IT:
                self.result = {'encoding':
                               self._esc_charset_prober.charset_name,
                               'confidence':
                               self._esc_charset_prober.get_confidence(),
                               'language':
                               self._esc_charset_prober.language}
                self.done = True
        # If we've seen high bytes (i.e., those with values greater than 127),
        # we need to do more complicated checks using all our multi-byte and
        # single-byte probers that are left.  The single-byte probers
        # use character bigram distributions to determine the encoding, whereas
        # the multi-byte probers use a combination of character unigram and
        # bigram distributions.
        elif self._input_state == InputState.HIGH_BYTE:
            if not self._charset_probers:
                self._charset_probers = [MBCSGroupProber(self.lang_filter)]
                # If we're checking non-CJK encodings, use single-byte prober
                if self.lang_filter & LanguageFilter.NON_CJK:
                    self._charset_probers.append(SBCSGroupProber())
                self._charset_probers.append(Latin1Prober())
            for prober in self._charset_probers:
                if prober.feed(byte_str) == ProbingState.FOUND_IT:
                    self.result = {'encoding': prober.charset_name,
                                   'confidence': prober.get_confidence(),
                                   'language': prober.language}
                    self.done = True
                    break
            if self.WIN_BYTE_DETECTOR.search(byte_str):
                self._has_win_bytes = True
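
    # Illustrative sketch (editorial addition): the incremental workflow that
    # the ``feed`` docstring above describes typically looks like this,
    # assuming ``chunks`` is any iterable of byte strings:
    #
    #     detector = UniversalDetector()
    #     for chunk in chunks:
    #         detector.feed(chunk)
    #         if detector.done:
    #             break
    #     detector.close()
    #     print(detector.result)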

    def close(self):
        """
        Stop analyzing the current document and come up with a final
        prediction.

        :returns:  The ``result`` attribute, a ``dict`` with the keys
                   `encoding`, `confidence`, and `language`.
        """
        # Don't bother with checks if we're already done
        if self.done:
            return self.result
        self.done = True

        if not self._got_data:
            self.logger.debug('no data received!')

        # Default to ASCII if it is all we've seen so far
        elif self._input_state == InputState.PURE_ASCII:
            self.result = {'encoding': 'ascii',
                           'confidence': 1.0,
                           'language': ''}

        # If we have seen non-ASCII, return the best that met MINIMUM_THRESHOLD
        elif self._input_state == InputState.HIGH_BYTE:
            prober_confidence = None
            max_prober_confidence = 0.0
            max_prober = None
            for prober in self._charset_probers:
                if not prober:
                    continue
                prober_confidence = prober.get_confidence()
                if prober_confidence > max_prober_confidence:
                    max_prober_confidence = prober_confidence
                    max_prober = prober
            if max_prober and (max_prober_confidence > self.MINIMUM_THRESHOLD):
                charset_name = max_prober.charset_name
                lower_charset_name = max_prober.charset_name.lower()
                confidence = max_prober.get_confidence()
                # Use Windows encoding name instead of ISO-8859 if we saw any
                # extra Windows-specific bytes
                if lower_charset_name.startswith('iso-8859'):
                    if self._has_win_bytes:
                        charset_name = self.ISO_WIN_MAP.get(lower_charset_name,
                                                            charset_name)
                self.result = {'encoding': charset_name,
                               'confidence': confidence,
                               'language': max_prober.language}

        # Log all prober confidences if none met MINIMUM_THRESHOLD
        if self.logger.getEffectiveLevel() == logging.DEBUG:
            if self.result['encoding'] is None:
                self.logger.debug('no probers hit minimum threshold')
                for group_prober in self._charset_probers:
                    if not group_prober:
                        continue
                    if isinstance(group_prober, CharSetGroupProber):
                        for prober in group_prober.probers:
                            self.logger.debug('%s %s confidence = %s',
                                              prober.charset_name,
                                              prober.language,
                                              prober.get_confidence())
                    else:
                        self.logger.debug('%s %s confidence = %s',
                                          group_prober.charset_name,
                                          group_prober.language,
                                          group_prober.get_confidence())
        return self.result
site-packages/pip/_vendor/chardet/langcyrillicmodel.py
######################## BEGIN LICENSE BLOCK ########################
# The Original Code is Mozilla Communicator client code.
#
# The Initial Developer of the Original Code is
# Netscape Communications Corporation.
# Portions created by the Initial Developer are Copyright (C) 1998
# the Initial Developer. All Rights Reserved.
#
# Contributor(s):
#   Mark Pilgrim - port to Python
#
# This library is free software; you can redistribute it and/or
# modify it under the terms of the GNU Lesser General Public
# License as published by the Free Software Foundation; either
# version 2.1 of the License, or (at your option) any later version.
#
# This library is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
# Lesser General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public
# License along with this library; if not, write to the Free Software
# Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
# 02110-1301  USA
######################### END LICENSE BLOCK #########################

# KOI8-R language model
# Character Mapping Table:
KOI8R_char_to_order_map = (
255,255,255,255,255,255,255,255,255,255,254,255,255,254,255,255,  # 00
255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,  # 10
253,253,253,253,253,253,253,253,253,253,253,253,253,253,253,253,  # 20
252,252,252,252,252,252,252,252,252,252,253,253,253,253,253,253,  # 30
253,142,143,144,145,146,147,148,149,150,151,152, 74,153, 75,154,  # 40
155,156,157,158,159,160,161,162,163,164,165,253,253,253,253,253,  # 50
253, 71,172, 66,173, 65,174, 76,175, 64,176,177, 77, 72,178, 69,  # 60
 67,179, 78, 73,180,181, 79,182,183,184,185,253,253,253,253,253,  # 70
191,192,193,194,195,196,197,198,199,200,201,202,203,204,205,206,  # 80
207,208,209,210,211,212,213,214,215,216,217,218,219,220,221,222,  # 90
223,224,225, 68,226,227,228,229,230,231,232,233,234,235,236,237,  # a0
238,239,240,241,242,243,244,245,246,247,248,249,250,251,252,253,  # b0
 27,  3, 21, 28, 13,  2, 39, 19, 26,  4, 23, 11,  8, 12,  5,  1,  # c0
 15, 16,  9,  7,  6, 14, 24, 10, 17, 18, 20, 25, 30, 29, 22, 54,  # d0
 59, 37, 44, 58, 41, 48, 53, 46, 55, 42, 60, 36, 49, 38, 31, 34,  # e0
 35, 43, 45, 32, 40, 52, 56, 33, 61, 62, 51, 57, 47, 63, 50, 70,  # f0
)

win1251_char_to_order_map = (
255,255,255,255,255,255,255,255,255,255,254,255,255,254,255,255,  # 00
255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,  # 10
253,253,253,253,253,253,253,253,253,253,253,253,253,253,253,253,  # 20
252,252,252,252,252,252,252,252,252,252,253,253,253,253,253,253,  # 30
253,142,143,144,145,146,147,148,149,150,151,152, 74,153, 75,154,  # 40
155,156,157,158,159,160,161,162,163,164,165,253,253,253,253,253,  # 50
253, 71,172, 66,173, 65,174, 76,175, 64,176,177, 77, 72,178, 69,  # 60
 67,179, 78, 73,180,181, 79,182,183,184,185,253,253,253,253,253,  # 70
191,192,193,194,195,196,197,198,199,200,201,202,203,204,205,206,
207,208,209,210,211,212,213,214,215,216,217,218,219,220,221,222,
223,224,225,226,227,228,229,230,231,232,233,234,235,236,237,238,
239,240,241,242,243,244,245,246, 68,247,248,249,250,251,252,253,
 37, 44, 33, 46, 41, 48, 56, 51, 42, 60, 36, 49, 38, 31, 34, 35,
 45, 32, 40, 52, 53, 55, 58, 50, 57, 63, 70, 62, 61, 47, 59, 43,
  3, 21, 10, 19, 13,  2, 24, 20,  4, 23, 11,  8, 12,  5,  1, 15,
  9,  7,  6, 14, 39, 26, 28, 22, 25, 29, 54, 18, 17, 30, 27, 16,
)

latin5_char_to_order_map = (
255,255,255,255,255,255,255,255,255,255,254,255,255,254,255,255,  # 00
255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,  # 10
253,253,253,253,253,253,253,253,253,253,253,253,253,253,253,253,  # 20
252,252,252,252,252,252,252,252,252,252,253,253,253,253,253,253,  # 30
253,142,143,144,145,146,147,148,149,150,151,152, 74,153, 75,154,  # 40
155,156,157,158,159,160,161,162,163,164,165,253,253,253,253,253,  # 50
253, 71,172, 66,173, 65,174, 76,175, 64,176,177, 77, 72,178, 69,  # 60
 67,179, 78, 73,180,181, 79,182,183,184,185,253,253,253,253,253,  # 70
191,192,193,194,195,196,197,198,199,200,201,202,203,204,205,206,
207,208,209,210,211,212,213,214,215,216,217,218,219,220,221,222,
223,224,225,226,227,228,229,230,231,232,233,234,235,236,237,238,
 37, 44, 33, 46, 41, 48, 56, 51, 42, 60, 36, 49, 38, 31, 34, 35,
 45, 32, 40, 52, 53, 55, 58, 50, 57, 63, 70, 62, 61, 47, 59, 43,
  3, 21, 10, 19, 13,  2, 24, 20,  4, 23, 11,  8, 12,  5,  1, 15,
  9,  7,  6, 14, 39, 26, 28, 22, 25, 29, 54, 18, 17, 30, 27, 16,
239, 68,240,241,242,243,244,245,246,247,248,249,250,251,252,255,
)

macCyrillic_char_to_order_map = (
255,255,255,255,255,255,255,255,255,255,254,255,255,254,255,255,  # 00
255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,  # 10
253,253,253,253,253,253,253,253,253,253,253,253,253,253,253,253,  # 20
252,252,252,252,252,252,252,252,252,252,253,253,253,253,253,253,  # 30
253,142,143,144,145,146,147,148,149,150,151,152, 74,153, 75,154,  # 40
155,156,157,158,159,160,161,162,163,164,165,253,253,253,253,253,  # 50
253, 71,172, 66,173, 65,174, 76,175, 64,176,177, 77, 72,178, 69,  # 60
 67,179, 78, 73,180,181, 79,182,183,184,185,253,253,253,253,253,  # 70
 37, 44, 33, 46, 41, 48, 56, 51, 42, 60, 36, 49, 38, 31, 34, 35,
 45, 32, 40, 52, 53, 55, 58, 50, 57, 63, 70, 62, 61, 47, 59, 43,
191,192,193,194,195,196,197,198,199,200,201,202,203,204,205,206,
207,208,209,210,211,212,213,214,215,216,217,218,219,220,221,222,
223,224,225,226,227,228,229,230,231,232,233,234,235,236,237,238,
239,240,241,242,243,244,245,246,247,248,249,250,251,252, 68, 16,
  3, 21, 10, 19, 13,  2, 24, 20,  4, 23, 11,  8, 12,  5,  1, 15,
  9,  7,  6, 14, 39, 26, 28, 22, 25, 29, 54, 18, 17, 30, 27,255,
)

IBM855_char_to_order_map = (
255,255,255,255,255,255,255,255,255,255,254,255,255,254,255,255,  # 00
255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,  # 10
253,253,253,253,253,253,253,253,253,253,253,253,253,253,253,253,  # 20
252,252,252,252,252,252,252,252,252,252,253,253,253,253,253,253,  # 30
253,142,143,144,145,146,147,148,149,150,151,152, 74,153, 75,154,  # 40
155,156,157,158,159,160,161,162,163,164,165,253,253,253,253,253,  # 50
253, 71,172, 66,173, 65,174, 76,175, 64,176,177, 77, 72,178, 69,  # 60
 67,179, 78, 73,180,181, 79,182,183,184,185,253,253,253,253,253,  # 70
191,192,193,194, 68,195,196,197,198,199,200,201,202,203,204,205,
206,207,208,209,210,211,212,213,214,215,216,217, 27, 59, 54, 70,
  3, 37, 21, 44, 28, 58, 13, 41,  2, 48, 39, 53, 19, 46,218,219,
220,221,222,223,224, 26, 55,  4, 42,225,226,227,228, 23, 60,229,
230,231,232,233,234,235, 11, 36,236,237,238,239,240,241,242,243,
  8, 49, 12, 38,  5, 31,  1, 34, 15,244,245,246,247, 35, 16,248,
 43,  9, 45,  7, 32,  6, 40, 14, 52, 24, 56, 10, 33, 17, 61,249,
250, 18, 62, 20, 51, 25, 57, 30, 47, 29, 63, 22, 50,251,252,255,
)

IBM866_char_to_order_map = (
255,255,255,255,255,255,255,255,255,255,254,255,255,254,255,255,  # 00
255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,  # 10
253,253,253,253,253,253,253,253,253,253,253,253,253,253,253,253,  # 20
252,252,252,252,252,252,252,252,252,252,253,253,253,253,253,253,  # 30
253,142,143,144,145,146,147,148,149,150,151,152, 74,153, 75,154,  # 40
155,156,157,158,159,160,161,162,163,164,165,253,253,253,253,253,  # 50
253, 71,172, 66,173, 65,174, 76,175, 64,176,177, 77, 72,178, 69,  # 60
 67,179, 78, 73,180,181, 79,182,183,184,185,253,253,253,253,253,  # 70
 37, 44, 33, 46, 41, 48, 56, 51, 42, 60, 36, 49, 38, 31, 34, 35,
 45, 32, 40, 52, 53, 55, 58, 50, 57, 63, 70, 62, 61, 47, 59, 43,
  3, 21, 10, 19, 13,  2, 24, 20,  4, 23, 11,  8, 12,  5,  1, 15,
191,192,193,194,195,196,197,198,199,200,201,202,203,204,205,206,
207,208,209,210,211,212,213,214,215,216,217,218,219,220,221,222,
223,224,225,226,227,228,229,230,231,232,233,234,235,236,237,238,
  9,  7,  6, 14, 39, 26, 28, 22, 25, 29, 54, 18, 17, 30, 27, 16,
239, 68,240,241,242,243,244,245,246,247,248,249,250,251,252,255,
)

# Model Table:
# total sequences: 100%
# first 512 sequences: 97.6601%
# first 1024 sequences: 2.3389% (additional, beyond the first 512)
# rest  sequences:      0.1237%
# negative sequences:   0.0009%
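#
# Editorial note (not from the original source): each *_char_to_order_map
# above maps a raw byte to a frequency order, and RussianLangModel below is a
# flattened 64x64 matrix of sequence likelihoods (0-3).  A single-byte prober
# is assumed to score adjacent characters roughly like:
#
#     order_prev = char_to_order_map[prev_byte]
#     order_cur = char_to_order_map[cur_byte]
#     if order_prev < 64 and order_cur < 64:
#         likelihood = RussianLangModel[order_prev * 64 + order_cur]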
RussianLangModel = (
0,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,1,1,3,3,3,3,1,3,3,3,2,3,2,3,3,
3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,0,3,2,2,2,2,2,0,0,2,
3,3,3,2,3,3,3,3,3,3,3,3,3,3,2,3,3,0,0,3,3,3,3,3,3,3,3,3,2,3,2,0,
0,0,0,0,0,0,0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
3,3,3,2,2,3,3,3,3,3,3,3,3,3,2,3,3,0,0,3,3,3,3,3,3,3,3,2,3,3,1,0,
0,0,0,0,0,0,0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
3,2,3,2,3,3,3,3,3,3,3,3,3,3,3,3,3,0,0,3,3,3,3,3,3,3,3,3,3,3,2,1,
0,0,0,0,0,0,0,2,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
3,3,3,3,3,3,3,3,3,3,3,3,3,3,2,3,3,0,0,3,3,3,3,3,3,3,3,3,3,3,2,1,
0,0,0,0,0,1,0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
3,3,3,3,3,3,3,3,2,2,2,3,1,3,3,1,3,3,3,3,2,2,3,0,2,2,2,3,3,2,1,0,
0,0,0,0,0,0,0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,
3,3,3,3,3,3,2,3,3,3,3,3,2,2,3,2,3,3,3,2,1,2,2,0,1,2,2,2,2,2,2,0,
0,0,0,0,0,0,0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,2,0,0,0,0,0,0,0,0,0,
3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,2,2,2,3,0,2,2,3,3,2,1,2,0,
0,0,0,0,0,0,0,2,0,0,0,0,0,0,0,0,0,0,0,1,0,0,2,0,0,0,0,0,0,0,0,0,
3,3,3,3,3,3,2,3,3,1,2,3,2,2,3,2,3,3,3,3,2,2,3,0,3,2,2,3,1,1,1,0,
0,0,0,0,0,0,0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
3,3,3,3,3,3,3,3,2,2,3,3,3,3,3,2,3,3,3,3,2,2,2,0,3,3,3,2,2,2,2,0,
0,0,0,0,0,0,0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
3,3,3,3,3,3,3,3,3,3,2,3,2,3,3,3,3,3,3,2,3,2,2,0,1,3,2,1,2,2,1,0,
0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,2,0,0,0,0,0,0,0,0,0,
3,3,3,3,3,3,3,3,3,3,3,2,1,1,3,0,1,1,1,1,2,1,1,0,2,2,2,1,2,0,1,0,
0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
3,3,3,3,3,3,2,3,3,2,2,2,2,1,3,2,3,2,3,2,1,2,2,0,1,1,2,1,2,1,2,0,
0,0,0,0,0,0,0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
3,3,3,3,3,3,3,3,3,3,3,3,2,2,3,2,3,3,3,2,2,2,2,0,2,2,2,2,3,1,1,0,
0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,2,0,0,0,0,0,0,0,0,0,
3,2,3,2,2,3,3,3,3,3,3,3,3,3,1,3,2,0,0,3,3,3,3,2,3,3,3,3,2,3,2,0,
0,0,0,0,0,0,0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
2,3,3,3,3,3,2,2,3,3,0,2,1,0,3,2,3,2,3,0,0,1,2,0,0,1,0,1,2,1,1,0,
0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
3,0,3,0,2,3,3,3,3,2,3,3,3,3,1,2,2,0,0,2,3,2,2,2,3,2,3,2,2,3,0,0,
0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
3,2,3,0,2,3,2,3,0,1,2,3,3,2,0,2,3,0,0,2,3,2,2,0,1,3,1,3,2,2,1,0,
0,0,0,0,0,0,0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
3,1,3,0,2,3,3,3,3,3,3,3,3,2,1,3,2,0,0,2,2,3,3,3,2,3,3,0,2,2,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
3,3,3,3,3,3,2,2,3,3,2,2,2,3,3,0,0,1,1,1,1,1,2,0,0,1,1,1,1,0,1,0,
0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
3,3,3,3,3,3,2,2,3,3,3,3,3,3,3,0,3,2,3,3,2,3,2,0,2,1,0,1,1,0,1,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,2,0,0,0,0,0,0,0,0,0,
3,3,3,3,3,3,2,3,3,3,2,2,2,2,3,1,3,2,3,1,1,2,1,0,2,2,2,2,1,3,1,0,
0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,2,0,0,0,0,0,0,0,0,0,
2,2,3,3,3,3,3,1,2,2,1,3,1,0,3,0,0,3,0,0,0,1,1,0,1,2,1,0,0,0,0,0,
0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
3,2,2,1,1,3,3,3,2,2,1,2,2,3,1,1,2,0,0,2,2,1,3,0,0,2,1,1,2,1,1,0,
0,0,0,0,0,0,0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
3,2,3,3,3,3,1,2,2,2,1,2,1,3,3,1,1,2,1,2,1,2,2,0,2,0,0,1,1,0,1,0,
0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
2,3,3,3,3,3,2,1,3,2,2,3,2,0,3,2,0,3,0,1,0,1,1,0,0,1,1,1,1,0,1,0,
0,0,0,0,0,0,0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
3,3,2,3,3,3,2,2,2,3,3,1,2,1,2,1,0,1,0,1,1,0,1,0,0,2,1,1,1,0,1,0,
0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,
3,1,1,2,1,2,3,3,2,2,1,2,2,3,0,2,1,0,0,2,2,3,2,1,2,2,2,2,2,3,1,0,
0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
3,3,3,3,3,1,1,0,1,1,2,2,1,1,3,0,0,1,3,1,1,1,0,0,0,1,0,1,1,0,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
2,1,3,3,3,2,0,0,0,2,1,0,1,0,2,0,0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
2,0,1,0,0,2,3,2,2,2,1,2,2,2,1,2,1,0,0,1,1,1,0,2,0,1,1,1,0,0,1,1,
1,0,0,0,0,0,1,2,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,
2,3,3,3,3,0,0,0,0,1,0,0,0,0,3,0,1,2,1,0,0,0,0,0,0,0,1,1,0,0,1,1,
1,0,1,0,1,2,0,0,1,1,2,1,0,1,1,1,1,0,1,1,1,1,0,1,0,0,1,0,0,1,1,0,
2,2,3,2,2,2,3,1,2,2,2,2,2,2,2,2,1,1,1,1,1,1,1,0,1,0,1,1,1,0,2,1,
1,1,1,1,1,1,1,1,2,1,1,1,1,1,1,1,1,1,1,0,1,0,1,1,0,1,1,1,0,1,1,0,
3,3,3,2,2,2,2,3,2,2,1,1,2,2,2,2,1,1,3,1,2,1,2,0,0,1,1,0,1,0,2,1,
1,1,1,1,1,2,1,0,1,1,1,1,0,1,0,0,1,1,0,0,1,0,1,0,0,1,0,0,0,1,1,0,
2,0,0,1,0,3,2,2,2,2,1,2,1,2,1,2,0,0,0,2,1,2,2,1,1,2,2,0,1,1,0,2,
1,1,1,1,1,0,1,1,1,2,1,1,1,2,1,0,1,2,1,1,1,1,0,1,1,1,0,0,1,0,0,1,
1,3,2,2,2,1,1,1,2,3,0,0,0,0,2,0,2,2,1,0,0,0,0,0,0,1,0,0,0,0,1,1,
1,0,1,1,0,1,0,1,1,0,1,1,0,2,0,0,1,1,0,0,1,0,0,0,0,0,0,0,0,1,1,0,
2,3,2,3,2,1,2,2,2,2,1,0,0,0,2,0,0,1,1,0,0,0,0,0,0,0,1,1,0,0,2,1,
1,1,2,1,0,2,0,0,1,0,1,0,0,1,0,0,1,1,0,1,1,0,0,0,0,0,1,0,0,0,0,0,
3,0,0,1,0,2,2,2,3,2,2,2,2,2,2,2,0,0,0,2,1,2,1,1,1,2,2,0,0,0,1,2,
1,1,1,1,1,0,1,2,1,1,1,1,1,1,1,0,1,1,1,1,1,1,0,1,1,1,1,1,1,0,0,1,
2,3,2,3,3,2,0,1,1,1,0,0,1,0,2,0,1,1,3,1,0,0,0,0,0,0,0,1,0,0,2,1,
1,1,1,1,1,1,1,0,1,0,1,1,1,1,0,1,1,1,0,0,1,1,0,1,0,0,0,0,0,0,1,0,
2,3,3,3,3,1,2,2,2,2,0,1,1,0,2,1,1,1,2,1,0,1,1,0,0,1,0,1,0,0,2,0,
0,0,0,0,0,0,0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
2,3,3,3,2,0,0,1,1,2,2,1,0,0,2,0,1,1,3,0,0,1,0,0,0,0,0,1,0,1,2,1,
1,1,2,0,1,1,1,0,1,0,1,1,0,1,0,1,1,1,1,0,1,0,0,0,0,0,0,1,0,1,1,0,
1,3,2,3,2,1,0,0,2,2,2,0,1,0,2,0,1,1,1,0,1,0,0,0,3,0,1,1,0,0,2,1,
1,1,1,0,1,1,0,0,0,0,1,1,0,1,0,0,2,1,1,0,1,0,0,0,1,0,1,0,0,1,1,0,
3,1,2,1,1,2,2,2,2,2,2,1,2,2,1,1,0,0,0,2,2,2,0,0,0,1,2,1,0,1,0,1,
2,1,1,1,1,1,1,1,1,1,1,1,1,1,1,0,2,1,1,1,0,1,0,1,1,0,1,1,1,0,0,1,
3,0,0,0,0,2,0,1,1,1,1,1,1,1,0,1,0,0,0,1,1,1,0,1,0,1,1,0,0,1,0,1,
1,1,0,0,1,0,0,0,1,0,1,1,0,0,1,0,1,0,1,0,0,0,0,1,0,0,0,1,0,0,0,1,
1,3,3,2,2,0,0,0,2,2,0,0,0,1,2,0,1,1,2,0,0,0,0,0,0,0,0,1,0,0,2,1,
0,1,1,0,0,1,1,0,0,0,1,1,0,1,1,0,1,1,0,0,1,0,0,0,0,0,0,0,0,0,1,0,
2,3,2,3,2,0,0,0,0,1,1,0,0,0,2,0,2,0,2,0,0,0,0,0,1,0,0,1,0,0,1,1,
1,1,2,0,1,2,1,0,1,1,2,1,1,1,1,1,2,1,1,0,1,0,0,1,1,1,1,1,0,1,1,0,
1,3,2,2,2,1,0,0,2,2,1,0,1,2,2,0,0,1,0,0,0,0,0,0,0,0,0,1,0,0,1,1,
0,0,1,1,0,1,1,0,0,1,1,0,1,1,0,0,1,1,0,0,1,0,0,0,0,0,0,0,0,0,0,0,
1,0,0,1,0,2,3,1,2,2,2,2,2,2,1,1,0,0,0,1,0,1,0,2,1,1,1,0,0,0,0,1,
1,1,0,1,1,0,1,1,1,1,0,0,0,1,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,0,0,
2,0,2,0,0,1,0,3,2,1,2,1,2,2,0,1,0,0,0,2,1,0,0,2,1,1,1,1,0,2,0,2,
2,1,1,1,1,1,1,1,1,1,1,1,1,2,1,0,1,1,1,1,0,0,0,1,1,1,1,0,1,0,0,1,
1,2,2,2,2,1,0,0,1,0,0,0,0,0,2,0,1,1,1,1,0,0,0,0,1,0,1,2,0,0,2,0,
1,0,1,1,1,2,1,0,1,0,1,1,0,0,1,0,1,1,1,0,1,0,0,0,1,0,0,1,0,1,1,0,
2,1,2,2,2,0,3,0,1,1,0,0,0,0,2,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,
0,0,0,1,1,1,0,0,1,0,1,0,0,0,0,0,1,0,0,0,1,0,0,0,0,0,0,0,0,1,0,0,
1,2,2,3,2,2,0,0,1,1,2,0,1,2,1,0,1,0,1,0,0,1,0,0,0,0,0,0,0,0,0,1,
0,1,1,0,0,1,1,0,0,1,1,0,0,1,1,0,1,1,0,0,1,0,0,0,0,0,0,0,0,1,1,0,
2,2,1,1,2,1,2,2,2,2,2,1,2,2,0,1,0,0,0,1,2,2,2,1,2,1,1,1,1,1,2,1,
1,1,1,1,1,1,1,1,1,1,0,0,1,1,1,0,1,1,1,0,0,0,0,1,1,1,0,1,1,0,0,1,
1,2,2,2,2,0,1,0,2,2,0,0,0,0,2,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,2,0,
0,0,1,0,0,1,0,0,0,0,1,0,1,1,0,0,1,1,0,0,1,0,0,0,0,0,0,0,0,0,0,0,
0,0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,2,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
1,2,2,2,2,0,0,0,2,2,2,0,1,0,1,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,1,
0,1,1,0,0,1,1,0,0,0,1,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
1,2,2,2,2,0,0,0,0,1,0,0,1,1,2,0,0,0,0,1,0,1,0,0,1,0,0,2,0,0,0,1,
0,0,1,0,0,1,0,0,0,1,1,0,0,0,0,0,1,0,0,1,1,0,0,0,0,0,0,0,0,0,0,0,
1,2,2,2,1,1,2,0,2,1,1,1,1,0,2,2,0,0,0,0,0,0,0,0,0,1,1,0,0,0,1,1,
0,0,1,0,1,1,0,0,0,0,1,0,0,0,0,0,1,1,0,0,1,0,0,0,0,0,0,0,0,0,0,0,
1,0,2,1,2,0,0,0,0,0,1,0,0,0,1,0,0,0,1,0,0,0,0,0,0,0,0,1,0,0,0,0,
0,0,1,0,1,1,0,0,0,0,1,0,0,0,0,0,1,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,
1,0,0,0,0,2,0,1,2,1,0,1,1,1,0,1,0,0,0,1,0,1,0,0,1,0,1,0,0,0,0,1,
0,0,0,0,0,1,0,0,1,1,0,0,1,1,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,
2,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,
1,0,0,0,1,0,0,0,1,1,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,0,1,0,0,0,0,0,
2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,
1,1,1,0,1,0,1,0,0,1,1,1,1,0,0,0,1,0,0,0,0,1,0,0,0,1,0,1,0,0,0,0,
1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,
1,1,0,1,1,0,1,0,1,0,0,0,0,1,1,0,1,1,0,0,0,0,0,1,0,1,1,0,1,0,0,0,
0,1,1,1,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,
)

Koi8rModel = {
  'char_to_order_map': KOI8R_char_to_order_map,
  'precedence_matrix': RussianLangModel,
  'typical_positive_ratio': 0.976601,
  'keep_english_letter': False,
  'charset_name': "KOI8-R",
  'language': 'Russian',
}

Win1251CyrillicModel = {
  'char_to_order_map': win1251_char_to_order_map,
  'precedence_matrix': RussianLangModel,
  'typical_positive_ratio': 0.976601,
  'keep_english_letter': False,
  'charset_name': "windows-1251",
  'language': 'Russian',
}

Latin5CyrillicModel = {
  'char_to_order_map': latin5_char_to_order_map,
  'precedence_matrix': RussianLangModel,
  'typical_positive_ratio': 0.976601,
  'keep_english_letter': False,
  'charset_name': "ISO-8859-5",
  'language': 'Russian',
}

MacCyrillicModel = {
  'char_to_order_map': macCyrillic_char_to_order_map,
  'precedence_matrix': RussianLangModel,
  'typical_positive_ratio': 0.976601,
  'keep_english_letter': False,
  'charset_name': "MacCyrillic",
  'language': 'Russian',
}

Ibm866Model = {
  'char_to_order_map': IBM866_char_to_order_map,
  'precedence_matrix': RussianLangModel,
  'typical_positive_ratio': 0.976601,
  'keep_english_letter': False,
  'charset_name': "IBM866",
  'language': 'Russian',
}

Ibm855Model = {
  'char_to_order_map': IBM855_char_to_order_map,
  'precedence_matrix': RussianLangModel,
  'typical_positive_ratio': 0.976601,
  'keep_english_letter': False,
  'charset_name': "IBM855",
  'language': 'Russian',
}
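
# --- Usage sketch (illustrative; not part of the upstream chardet source) ---
# A model dict like the ones above bundles everything the single-byte prober
# needs: the byte -> frequency-order map, the pairwise precedence matrix and
# the normalisation ratio.  The lines below mirror what chardet's
# sbcsgroupprober module is assumed to do with these dicts (chardet 3.x API);
# they are left as comments so this vendored data module behaves exactly as
# before when imported.
#
#     from chardet.sbcharsetprober import SingleByteCharSetProber
#
#     probers = [
#         SingleByteCharSetProber(Win1251CyrillicModel),
#         SingleByteCharSetProber(Koi8rModel),
#         SingleByteCharSetProber(Latin5CyrillicModel),
#         SingleByteCharSetProber(MacCyrillicModel),
#         SingleByteCharSetProber(Ibm866Model),
#         SingleByteCharSetProber(Ibm855Model),
#     ]
#     # each prober is then fed raw bytes and asked for a confidence score,
#     # e.g. probers[0].feed(data); probers[0].get_confidence()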
site-packages/pip/_vendor/chardet/big5prober.py000064400000003335147511334570015552 0ustar00######################## BEGIN LICENSE BLOCK ########################
# The Original Code is Mozilla Communicator client code.
#
# The Initial Developer of the Original Code is
# Netscape Communications Corporation.
# Portions created by the Initial Developer are Copyright (C) 1998
# the Initial Developer. All Rights Reserved.
#
# Contributor(s):
#   Mark Pilgrim - port to Python
#
# This library is free software; you can redistribute it and/or
# modify it under the terms of the GNU Lesser General Public
# License as published by the Free Software Foundation; either
# version 2.1 of the License, or (at your option) any later version.
#
# This library is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
# Lesser General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public
# License along with this library; if not, write to the Free Software
# Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
# 02110-1301  USA
######################### END LICENSE BLOCK #########################

from .mbcharsetprober import MultiByteCharSetProber
from .codingstatemachine import CodingStateMachine
from .chardistribution import Big5DistributionAnalysis
from .mbcssm import BIG5_SM_MODEL


class Big5Prober(MultiByteCharSetProber):
    def __init__(self):
        super(Big5Prober, self).__init__()
        self.coding_sm = CodingStateMachine(BIG5_SM_MODEL)
        self.distribution_analyzer = Big5DistributionAnalysis()
        self.reset()

    @property
    def charset_name(self):
        return "Big5"

    @property
    def language(self):
        return "Chinese"
site-packages/pip/_vendor/chardet/gb2312freq.py000064400000050353147511334570015272 0ustar00######################## BEGIN LICENSE BLOCK ########################
# The Original Code is Mozilla Communicator client code.
#
# The Initial Developer of the Original Code is
# Netscape Communications Corporation.
# Portions created by the Initial Developer are Copyright (C) 1998
# the Initial Developer. All Rights Reserved.
#
# Contributor(s):
#   Mark Pilgrim - port to Python
#
# This library is free software; you can redistribute it and/or
# modify it under the terms of the GNU Lesser General Public
# License as published by the Free Software Foundation; either
# version 2.1 of the License, or (at your option) any later version.
#
# This library is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
# Lesser General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public
# License along with this library; if not, write to the Free Software
# Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
# 02110-1301  USA
######################### END LICENSE BLOCK #########################

# GB2312 most frequently used character table
#
# Char to FreqOrder table, from hz6763

# 512  --> 0.79  -- 0.79
# 1024 --> 0.92  -- 0.13
# 2048 --> 0.98  -- 0.06
# 6768 --> 1.00  -- 0.02
#
# Ideal Distribution Ratio = 0.79135/(1-0.79135) = 3.79
# Random Distribution Ratio = 512 / (3755 - 512) = 0.157
#
# Typical Distribution Ratio is about 25% of the Ideal one, still much higher than RDR

GB2312_TYPICAL_DISTRIBUTION_RATIO = 0.9

GB2312_TABLE_SIZE = 3760
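
# Worked example (added for clarity; the inputs are the figures quoted in the
# comment block above, not recomputed from the frequency data): both ratios
# follow directly from the coverage numbers.
#
#   Ideal Distribution Ratio  = 0.79135 / (1 - 0.79135) = 0.79135 / 0.20865 ~= 3.79
#   Random Distribution Ratio = 512 / (3755 - 512) = 512 / 3243 ~= 0.158
#                               (quoted above as 0.157, i.e. truncated)
#
# GB2312_TYPICAL_DISTRIBUTION_RATIO above (0.9) is the normalisation constant
# the GB2312 distribution analysis is assumed to use, i.e. roughly a quarter
# of the ideal ratio, as the comment block notes.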

GB2312_CHAR_TO_FREQ_ORDER = (
1671, 749,1443,2364,3924,3807,2330,3921,1704,3463,2691,1511,1515, 572,3191,2205,
2361, 224,2558, 479,1711, 963,3162, 440,4060,1905,2966,2947,3580,2647,3961,3842,
2204, 869,4207, 970,2678,5626,2944,2956,1479,4048, 514,3595, 588,1346,2820,3409,
 249,4088,1746,1873,2047,1774, 581,1813, 358,1174,3590,1014,1561,4844,2245, 670,
1636,3112, 889,1286, 953, 556,2327,3060,1290,3141, 613, 185,3477,1367, 850,3820,
1715,2428,2642,2303,2732,3041,2562,2648,3566,3946,1349, 388,3098,2091,1360,3585,
 152,1687,1539, 738,1559,  59,1232,2925,2267,1388,1249,1741,1679,2960, 151,1566,
1125,1352,4271, 924,4296, 385,3166,4459, 310,1245,2850,  70,3285,2729,3534,3575,
2398,3298,3466,1960,2265, 217,3647, 864,1909,2084,4401,2773,1010,3269,5152, 853,
3051,3121,1244,4251,1895, 364,1499,1540,2313,1180,3655,2268, 562, 715,2417,3061,
 544, 336,3768,2380,1752,4075, 950, 280,2425,4382, 183,2759,3272, 333,4297,2155,
1688,2356,1444,1039,4540, 736,1177,3349,2443,2368,2144,2225, 565, 196,1482,3406,
 927,1335,4147, 692, 878,1311,1653,3911,3622,1378,4200,1840,2969,3149,2126,1816,
2534,1546,2393,2760, 737,2494,  13, 447, 245,2747,  38,2765,2129,2589,1079, 606,
 360, 471,3755,2890, 404, 848, 699,1785,1236, 370,2221,1023,3746,2074,2026,2023,
2388,1581,2119, 812,1141,3091,2536,1519, 804,2053, 406,1596,1090, 784, 548,4414,
1806,2264,2936,1100, 343,4114,5096, 622,3358, 743,3668,1510,1626,5020,3567,2513,
3195,4115,5627,2489,2991,  24,2065,2697,1087,2719,  48,1634, 315,  68, 985,2052,
 198,2239,1347,1107,1439, 597,2366,2172, 871,3307, 919,2487,2790,1867, 236,2570,
1413,3794, 906,3365,3381,1701,1982,1818,1524,2924,1205, 616,2586,2072,2004, 575,
 253,3099,  32,1365,1182, 197,1714,2454,1201, 554,3388,3224,2748, 756,2587, 250,
2567,1507,1517,3529,1922,2761,2337,3416,1961,1677,2452,2238,3153, 615, 911,1506,
1474,2495,1265,1906,2749,3756,3280,2161, 898,2714,1759,3450,2243,2444, 563,  26,
3286,2266,3769,3344,2707,3677, 611,1402, 531,1028,2871,4548,1375, 261,2948, 835,
1190,4134, 353, 840,2684,1900,3082,1435,2109,1207,1674, 329,1872,2781,4055,2686,
2104, 608,3318,2423,2957,2768,1108,3739,3512,3271,3985,2203,1771,3520,1418,2054,
1681,1153, 225,1627,2929, 162,2050,2511,3687,1954, 124,1859,2431,1684,3032,2894,
 585,4805,3969,2869,2704,2088,2032,2095,3656,2635,4362,2209, 256, 518,2042,2105,
3777,3657, 643,2298,1148,1779, 190, 989,3544, 414,  11,2135,2063,2979,1471, 403,
3678, 126, 770,1563, 671,2499,3216,2877, 600,1179, 307,2805,4937,1268,1297,2694,
 252,4032,1448,1494,1331,1394, 127,2256, 222,1647,1035,1481,3056,1915,1048, 873,
3651, 210,  33,1608,2516, 200,1520, 415, 102,   0,3389,1287, 817,  91,3299,2940,
 836,1814, 549,2197,1396,1669,2987,3582,2297,2848,4528,1070, 687,  20,1819, 121,
1552,1364,1461,1968,2617,3540,2824,2083, 177, 948,4938,2291, 110,4549,2066, 648,
3359,1755,2110,2114,4642,4845,1693,3937,3308,1257,1869,2123, 208,1804,3159,2992,
2531,2549,3361,2418,1350,2347,2800,2568,1291,2036,2680,  72, 842,1990, 212,1233,
1154,1586,  75,2027,3410,4900,1823,1337,2710,2676, 728,2810,1522,3026,4995, 157,
 755,1050,4022, 710, 785,1936,2194,2085,1406,2777,2400, 150,1250,4049,1206, 807,
1910, 534, 529,3309,1721,1660, 274,  39,2827, 661,2670,1578, 925,3248,3815,1094,
4278,4901,4252,  41,1150,3747,2572,2227,4501,3658,4902,3813,3357,3617,2884,2258,
 887, 538,4187,3199,1294,2439,3042,2329,2343,2497,1255, 107, 543,1527, 521,3478,
3568, 194,5062,  15, 961,3870,1241,1192,2664,  66,5215,3260,2111,1295,1127,2152,
3805,4135, 901,1164,1976, 398,1278, 530,1460, 748, 904,1054,1966,1426,  53,2909,
 509, 523,2279,1534, 536,1019, 239,1685, 460,2353, 673,1065,2401,3600,4298,2272,
1272,2363, 284,1753,3679,4064,1695,  81, 815,2677,2757,2731,1386, 859, 500,4221,
2190,2566, 757,1006,2519,2068,1166,1455, 337,2654,3203,1863,1682,1914,3025,1252,
1409,1366, 847, 714,2834,2038,3209, 964,2970,1901, 885,2553,1078,1756,3049, 301,
1572,3326, 688,2130,1996,2429,1805,1648,2930,3421,2750,3652,3088, 262,1158,1254,
 389,1641,1812, 526,1719, 923,2073,1073,1902, 468, 489,4625,1140, 857,2375,3070,
3319,2863, 380, 116,1328,2693,1161,2244, 273,1212,1884,2769,3011,1775,1142, 461,
3066,1200,2147,2212, 790, 702,2695,4222,1601,1058, 434,2338,5153,3640,  67,2360,
4099,2502, 618,3472,1329, 416,1132, 830,2782,1807,2653,3211,3510,1662, 192,2124,
 296,3979,1739,1611,3684,  23, 118, 324, 446,1239,1225, 293,2520,3814,3795,2535,
3116,  17,1074, 467,2692,2201, 387,2922,  45,1326,3055,1645,3659,2817, 958, 243,
1903,2320,1339,2825,1784,3289, 356, 576, 865,2315,2381,3377,3916,1088,3122,1713,
1655, 935, 628,4689,1034,1327, 441, 800, 720, 894,1979,2183,1528,5289,2702,1071,
4046,3572,2399,1571,3281,  79, 761,1103, 327, 134, 758,1899,1371,1615, 879, 442,
 215,2605,2579, 173,2048,2485,1057,2975,3317,1097,2253,3801,4263,1403,1650,2946,
 814,4968,3487,1548,2644,1567,1285,   2, 295,2636,  97, 946,3576, 832, 141,4257,
3273, 760,3821,3521,3156,2607, 949,1024,1733,1516,1803,1920,2125,2283,2665,3180,
1501,2064,3560,2171,1592, 803,3518,1416, 732,3897,4258,1363,1362,2458, 119,1427,
 602,1525,2608,1605,1639,3175, 694,3064,  10, 465,  76,2000,4846,4208, 444,3781,
1619,3353,2206,1273,3796, 740,2483, 320,1723,2377,3660,2619,1359,1137,1762,1724,
2345,2842,1850,1862, 912, 821,1866, 612,2625,1735,2573,3369,1093, 844,  89, 937,
 930,1424,3564,2413,2972,1004,3046,3019,2011, 711,3171,1452,4178, 428, 801,1943,
 432, 445,2811, 206,4136,1472, 730, 349,  73, 397,2802,2547, 998,1637,1167, 789,
 396,3217, 154,1218, 716,1120,1780,2819,4826,1931,3334,3762,2139,1215,2627, 552,
3664,3628,3232,1405,2383,3111,1356,2652,3577,3320,3101,1703, 640,1045,1370,1246,
4996, 371,1575,2436,1621,2210, 984,4033,1734,2638,  16,4529, 663,2755,3255,1451,
3917,2257,1253,1955,2234,1263,2951, 214,1229, 617, 485, 359,1831,1969, 473,2310,
 750,2058, 165,  80,2864,2419, 361,4344,2416,2479,1134, 796,3726,1266,2943, 860,
2715, 938, 390,2734,1313,1384, 248, 202, 877,1064,2854, 522,3907, 279,1602, 297,
2357, 395,3740, 137,2075, 944,4089,2584,1267,3802,  62,1533,2285, 178, 176, 780,
2440, 201,3707, 590, 478,1560,4354,2117,1075,  30,  74,4643,4004,1635,1441,2745,
 776,2596, 238,1077,1692,1912,2844, 605, 499,1742,3947, 241,3053, 980,1749, 936,
2640,4511,2582, 515,1543,2162,5322,2892,2993, 890,2148,1924, 665,1827,3581,1032,
 968,3163, 339,1044,1896, 270, 583,1791,1720,4367,1194,3488,3669,  43,2523,1657,
 163,2167, 290,1209,1622,3378, 550, 634,2508,2510, 695,2634,2384,2512,1476,1414,
 220,1469,2341,2138,2852,3183,2900,4939,2865,3502,1211,3680, 854,3227,1299,2976,
3172, 186,2998,1459, 443,1067,3251,1495, 321,1932,3054, 909, 753,1410,1828, 436,
2441,1119,1587,3164,2186,1258, 227, 231,1425,1890,3200,3942, 247, 959, 725,5254,
2741, 577,2158,2079, 929, 120, 174, 838,2813, 591,1115, 417,2024,  40,3240,1536,
1037, 291,4151,2354, 632,1298,2406,2500,3535,1825,1846,3451, 205,1171, 345,4238,
  18,1163, 811, 685,2208,1217, 425,1312,1508,1175,4308,2552,1033, 587,1381,3059,
2984,3482, 340,1316,4023,3972, 792,3176, 519, 777,4690, 918, 933,4130,2981,3741,
  90,3360,2911,2200,5184,4550, 609,3079,2030, 272,3379,2736, 363,3881,1130,1447,
 286, 779, 357,1169,3350,3137,1630,1220,2687,2391, 747,1277,3688,2618,2682,2601,
1156,3196,5290,4034,3102,1689,3596,3128, 874, 219,2783, 798, 508,1843,2461, 269,
1658,1776,1392,1913,2983,3287,2866,2159,2372, 829,4076,  46,4253,2873,1889,1894,
 915,1834,1631,2181,2318, 298, 664,2818,3555,2735, 954,3228,3117, 527,3511,2173,
 681,2712,3033,2247,2346,3467,1652, 155,2164,3382, 113,1994, 450, 899, 494, 994,
1237,2958,1875,2336,1926,3727, 545,1577,1550, 633,3473, 204,1305,3072,2410,1956,
2471, 707,2134, 841,2195,2196,2663,3843,1026,4940, 990,3252,4997, 368,1092, 437,
3212,3258,1933,1829, 675,2977,2893, 412, 943,3723,4644,3294,3283,2230,2373,5154,
2389,2241,2661,2323,1404,2524, 593, 787, 677,3008,1275,2059, 438,2709,2609,2240,
2269,2246,1446,  36,1568,1373,3892,1574,2301,1456,3962, 693,2276,5216,2035,1143,
2720,1919,1797,1811,2763,4137,2597,1830,1699,1488,1198,2090, 424,1694, 312,3634,
3390,4179,3335,2252,1214, 561,1059,3243,2295,2561, 975,5155,2321,2751,3772, 472,
1537,3282,3398,1047,2077,2348,2878,1323,3340,3076, 690,2906,  51, 369, 170,3541,
1060,2187,2688,3670,2541,1083,1683, 928,3918, 459, 109,4427, 599,3744,4286, 143,
2101,2730,2490,  82,1588,3036,2121, 281,1860, 477,4035,1238,2812,3020,2716,3312,
1530,2188,2055,1317, 843, 636,1808,1173,3495, 649, 181,1002, 147,3641,1159,2414,
3750,2289,2795, 813,3123,2610,1136,4368,   5,3391,4541,2174, 420, 429,1728, 754,
1228,2115,2219, 347,2223,2733, 735,1518,3003,2355,3134,1764,3948,3329,1888,2424,
1001,1234,1972,3321,3363,1672,1021,1450,1584, 226, 765, 655,2526,3404,3244,2302,
3665, 731, 594,2184, 319,1576, 621, 658,2656,4299,2099,3864,1279,2071,2598,2739,
 795,3086,3699,3908,1707,2352,2402,1382,3136,2475,1465,4847,3496,3865,1085,3004,
2591,1084, 213,2287,1963,3565,2250, 822, 793,4574,3187,1772,1789,3050, 595,1484,
1959,2770,1080,2650, 456, 422,2996, 940,3322,4328,4345,3092,2742, 965,2784, 739,
4124, 952,1358,2498,2949,2565, 332,2698,2378, 660,2260,2473,4194,3856,2919, 535,
1260,2651,1208,1428,1300,1949,1303,2942, 433,2455,2450,1251,1946, 614,1269, 641,
1306,1810,2737,3078,2912, 564,2365,1419,1415,1497,4460,2367,2185,1379,3005,1307,
3218,2175,1897,3063, 682,1157,4040,4005,1712,1160,1941,1399, 394, 402,2952,1573,
1151,2986,2404, 862, 299,2033,1489,3006, 346, 171,2886,3401,1726,2932, 168,2533,
  47,2507,1030,3735,1145,3370,1395,1318,1579,3609,4560,2857,4116,1457,2529,1965,
 504,1036,2690,2988,2405, 745,5871, 849,2397,2056,3081, 863,2359,3857,2096,  99,
1397,1769,2300,4428,1643,3455,1978,1757,3718,1440,  35,4879,3742,1296,4228,2280,
 160,5063,1599,2013, 166, 520,3479,1646,3345,3012, 490,1937,1545,1264,2182,2505,
1096,1188,1369,1436,2421,1667,2792,2460,1270,2122, 727,3167,2143, 806,1706,1012,
1800,3037, 960,2218,1882, 805, 139,2456,1139,1521, 851,1052,3093,3089, 342,2039,
 744,5097,1468,1502,1585,2087, 223, 939, 326,2140,2577, 892,2481,1623,4077, 982,
3708, 135,2131,  87,2503,3114,2326,1106, 876,1616, 547,2997,2831,2093,3441,4530,
4314,   9,3256,4229,4148, 659,1462,1986,1710,2046,2913,2231,4090,4880,5255,3392,
3274,1368,3689,4645,1477, 705,3384,3635,1068,1529,2941,1458,3782,1509, 100,1656,
2548, 718,2339, 408,1590,2780,3548,1838,4117,3719,1345,3530, 717,3442,2778,3220,
2898,1892,4590,3614,3371,2043,1998,1224,3483, 891, 635, 584,2559,3355, 733,1766,
1729,1172,3789,1891,2307, 781,2982,2271,1957,1580,5773,2633,2005,4195,3097,1535,
3213,1189,1934,5693,3262, 586,3118,1324,1598, 517,1564,2217,1868,1893,4445,3728,
2703,3139,1526,1787,1992,3882,2875,1549,1199,1056,2224,1904,2711,5098,4287, 338,
1993,3129,3489,2689,1809,2815,1997, 957,1855,3898,2550,3275,3057,1105,1319, 627,
1505,1911,1883,3526, 698,3629,3456,1833,1431, 746,  77,1261,2017,2296,1977,1885,
 125,1334,1600, 525,1798,1109,2222,1470,1945, 559,2236,1186,3443,2476,1929,1411,
2411,3135,1777,3372,2621,1841,1613,3229, 668,1430,1839,2643,2916, 195,1989,2671,
2358,1387, 629,3205,2293,5256,4439, 123,1310, 888,1879,4300,3021,3605,1003,1162,
3192,2910,2010, 140,2395,2859,  55,1082,2012,2901, 662, 419,2081,1438, 680,2774,
4654,3912,1620,1731,1625,5035,4065,2328, 512,1344, 802,5443,2163,2311,2537, 524,
3399,  98,1155,2103,1918,2606,3925,2816,1393,2465,1504,3773,2177,3963,1478,4346,
 180,1113,4655,3461,2028,1698, 833,2696,1235,1322,1594,4408,3623,3013,3225,2040,
3022, 541,2881, 607,3632,2029,1665,1219, 639,1385,1686,1099,2803,3231,1938,3188,
2858, 427, 676,2772,1168,2025, 454,3253,2486,3556, 230,1950, 580, 791,1991,1280,
1086,1974,2034, 630, 257,3338,2788,4903,1017,  86,4790, 966,2789,1995,1696,1131,
 259,3095,4188,1308, 179,1463,5257, 289,4107,1248,  42,3413,1725,2288, 896,1947,
 774,4474,4254, 604,3430,4264, 392,2514,2588, 452, 237,1408,3018, 988,4531,1970,
3034,3310, 540,2370,1562,1288,2990, 502,4765,1147,   4,1853,2708, 207, 294,2814,
4078,2902,2509, 684,  34,3105,3532,2551, 644, 709,2801,2344, 573,1727,3573,3557,
2021,1081,3100,4315,2100,3681, 199,2263,1837,2385, 146,3484,1195,2776,3949, 997,
1939,3973,1008,1091,1202,1962,1847,1149,4209,5444,1076, 493, 117,5400,2521, 972,
1490,2934,1796,4542,2374,1512,2933,2657, 413,2888,1135,2762,2314,2156,1355,2369,
 766,2007,2527,2170,3124,2491,2593,2632,4757,2437, 234,3125,3591,1898,1750,1376,
1942,3468,3138, 570,2127,2145,3276,4131, 962, 132,1445,4196,  19, 941,3624,3480,
3366,1973,1374,4461,3431,2629, 283,2415,2275, 808,2887,3620,2112,2563,1353,3610,
 955,1089,3103,1053,  96,  88,4097, 823,3808,1583, 399, 292,4091,3313, 421,1128,
 642,4006, 903,2539,1877,2082, 596,  29,4066,1790, 722,2157, 130, 995,1569, 769,
1485, 464, 513,2213, 288,1923,1101,2453,4316, 133, 486,2445,  50, 625, 487,2207,
  57, 423, 481,2962, 159,3729,1558, 491, 303, 482, 501, 240,2837, 112,3648,2392,
1783, 362,   8,3433,3422, 610,2793,3277,1390,1284,1654,  21,3823, 734, 367, 623,
 193, 287, 374,1009,1483, 816, 476, 313,2255,2340,1262,2150,2899,1146,2581, 782,
2116,1659,2018,1880, 255,3586,3314,1110,2867,2137,2564, 986,2767,5185,2006, 650,
 158, 926, 762, 881,3157,2717,2362,3587, 306,3690,3245,1542,3077,2427,1691,2478,
2118,2985,3490,2438, 539,2305, 983, 129,1754, 355,4201,2386, 827,2923, 104,1773,
2838,2771, 411,2905,3919, 376, 767, 122,1114, 828,2422,1817,3506, 266,3460,1007,
1609,4998, 945,2612,4429,2274, 726,1247,1964,2914,2199,2070,4002,4108, 657,3323,
1422, 579, 455,2764,4737,1222,2895,1670, 824,1223,1487,2525, 558, 861,3080, 598,
2659,2515,1967, 752,2583,2376,2214,4180, 977, 704,2464,4999,2622,4109,1210,2961,
 819,1541, 142,2284,  44, 418, 457,1126,3730,4347,4626,1644,1876,3671,1864, 302,
1063,5694, 624, 723,1984,3745,1314,1676,2488,1610,1449,3558,3569,2166,2098, 409,
1011,2325,3704,2306, 818,1732,1383,1824,1844,3757, 999,2705,3497,1216,1423,2683,
2426,2954,2501,2726,2229,1475,2554,5064,1971,1794,1666,2014,1343, 783, 724, 191,
2434,1354,2220,5065,1763,2752,2472,4152, 131, 175,2885,3434,  92,1466,4920,2616,
3871,3872,3866, 128,1551,1632, 669,1854,3682,4691,4125,1230, 188,2973,3290,1302,
1213, 560,3266, 917, 763,3909,3249,1760, 868,1958, 764,1782,2097, 145,2277,3774,
4462,  64,1491,3062, 971,2132,3606,2442, 221,1226,1617, 218, 323,1185,3207,3147,
 571, 619,1473,1005,1744,2281, 449,1887,2396,3685, 275, 375,3816,1743,3844,3731,
 845,1983,2350,4210,1377, 773, 967,3499,3052,3743,2725,4007,1697,1022,3943,1464,
3264,2855,2722,1952,1029,2839,2467,  84,4383,2215, 820,1391,2015,2448,3672, 377,
1948,2168, 797,2545,3536,2578,2645,  94,2874,1678, 405,1259,3071, 771, 546,1315,
 470,1243,3083, 895,2468, 981, 969,2037, 846,4181, 653,1276,2928,  14,2594, 557,
3007,2474, 156, 902,1338,1740,2574, 537,2518, 973,2282,2216,2433,1928, 138,2903,
1293,2631,1612, 646,3457, 839,2935, 111, 496,2191,2847, 589,3186, 149,3994,2060,
4031,2641,4067,3145,1870,  37,3597,2136,1025,2051,3009,3383,3549,1121,1016,3261,
1301, 251,2446,2599,2153, 872,3246, 637, 334,3705, 831, 884, 921,3065,3140,4092,
2198,1944, 246,2964, 108,2045,1152,1921,2308,1031, 203,3173,4170,1907,3890, 810,
1401,2003,1690, 506, 647,1242,2828,1761,1649,3208,2249,1589,3709,2931,5156,1708,
 498, 666,2613, 834,3817,1231, 184,2851,1124, 883,3197,2261,3710,1765,1553,2658,
1178,2639,2351,  93,1193, 942,2538,2141,4402, 235,1821, 870,1591,2192,1709,1871,
3341,1618,4126,2595,2334, 603, 651,  69, 701, 268,2662,3411,2555,1380,1606, 503,
 448, 254,2371,2646, 574,1187,2309,1770, 322,2235,1292,1801, 305, 566,1133, 229,
2067,2057, 706, 167, 483,2002,2672,3295,1820,3561,3067, 316, 378,2746,3452,1112,
 136,1981, 507,1651,2917,1117, 285,4591, 182,2580,3522,1304, 335,3303,1835,2504,
1795,1792,2248, 674,1018,2106,2449,1857,2292,2845, 976,3047,1781,2600,2727,1389,
1281,  52,3152, 153, 265,3950, 672,3485,3951,4463, 430,1183, 365, 278,2169,  27,
1407,1336,2304, 209,1340,1730,2202,1852,2403,2883, 979,1737,1062, 631,2829,2542,
3876,2592, 825,2086,2226,3048,3625, 352,1417,3724, 542, 991, 431,1351,3938,1861,
2294, 826,1361,2927,3142,3503,1738, 463,2462,2723, 582,1916,1595,2808, 400,3845,
3891,2868,3621,2254,  58,2492,1123, 910,2160,2614,1372,1603,1196,1072,3385,1700,
3267,1980, 696, 480,2430, 920, 799,1570,2920,1951,2041,4047,2540,1321,4223,2469,
3562,2228,1271,2602, 401,2833,3351,2575,5157, 907,2312,1256, 410, 263,3507,1582,
 996, 678,1849,2316,1480, 908,3545,2237, 703,2322, 667,1826,2849,1531,2604,2999,
2407,3146,2151,2630,1786,3711, 469,3542, 497,3899,2409, 858, 837,4446,3393,1274,
 786, 620,1845,2001,3311, 484, 308,3367,1204,1815,3691,2332,1532,2557,1842,2020,
2724,1927,2333,4440, 567,  22,1673,2728,4475,1987,1858,1144,1597, 101,1832,3601,
  12, 974,3783,4391, 951,1412,   1,3720, 453,4608,4041, 528,1041,1027,3230,2628,
1129, 875,1051,3291,1203,2262,1069,2860,2799,2149,2615,3278, 144,1758,3040,  31,
 475,1680, 366,2685,3184, 311,1642,4008,2466,5036,1593,1493,2809, 216,1420,1668,
 233, 304,2128,3284, 232,1429,1768,1040,2008,3407,2740,2967,2543, 242,2133, 778,
1565,2022,2620, 505,2189,2756,1098,2273, 372,1614, 708, 553,2846,2094,2278, 169,
3626,2835,4161, 228,2674,3165, 809,1454,1309, 466,1705,1095, 900,3423, 880,2667,
3751,5258,2317,3109,2571,4317,2766,1503,1342, 866,4447,1118,  63,2076, 314,1881,
1348,1061, 172, 978,3515,1747, 532, 511,3970,   6, 601, 905,2699,3300,1751, 276,
1467,3725,2668,  65,4239,2544,2779,2556,1604, 578,2451,1802, 992,2331,2624,1320,
3446, 713,1513,1013, 103,2786,2447,1661, 886,1702, 916, 654,3574,2031,1556, 751,
2178,2821,2179,1498,1538,2176, 271, 914,2251,2080,1325, 638,1953,2937,3877,2432,
2754,  95,3265,1716, 260,1227,4083, 775, 106,1357,3254, 426,1607, 555,2480, 772,
1985, 244,2546, 474, 495,1046,2611,1851,2061,  71,2089,1675,2590, 742,3758,2843,
3222,1433, 267,2180,2576,2826,2233,2092,3913,2435, 956,1745,3075, 856,2113,1116,
 451,   3,1988,2896,1398, 993,2463,1878,2049,1341,2718,2721,2870,2108, 712,2904,
4363,2753,2324, 277,2872,2349,2649, 384, 987, 435, 691,3000, 922, 164,3939, 652,
1500,1184,4153,2482,3373,2165,4848,2335,3775,3508,3154,2806,2830,1554,2102,1664,
2530,1434,2408, 893,1547,2623,3447,2832,2242,2532,3169,2856,3223,2078,  49,3770,
3469, 462, 318, 656,2259,3250,3069, 679,1629,2758, 344,1138,1104,3120,1836,1283,
3115,2154,1437,4448, 934, 759,1999, 794,2862,1038, 533,2560,1722,2342, 855,2626,
1197,1663,4476,3127,  85,4240,2528,  25,1111,1181,3673, 407,3470,4561,2679,2713,
 768,1925,2841,3986,1544,1165, 932, 373,1240,2146,1930,2673, 721,4766, 354,4333,
 391,2963, 187,  61,3364,1442,1102, 330,1940,1767, 341,3809,4118, 393,2496,2062,
2211, 105, 331, 300, 439, 913,1332, 626, 379,3304,1557, 328, 689,3952, 309,1555,
 931, 317,2517,3027, 325, 569, 686,2107,3084,  60,1042,1333,2794, 264,3177,4014,
1628, 258,3712,   7,4464,1176,1043,1778, 683, 114,1975,  78,1492, 383,1886, 510,
 386, 645,5291,2891,2069,3305,4138,3867,2939,2603,2493,1935,1066,1848,3588,1015,
1282,1289,4609, 697,1453,3044,2666,3611,1856,2412,  54, 719,1330, 568,3778,2459,
1748, 788, 492, 551,1191,1000, 488,3394,3763, 282,1799, 348,2016,1523,3155,2390,
1049, 382,2019,1788,1170, 729,2968,3523, 897,3926,2785,2938,3292, 350,2319,3238,
1718,1717,2655,3453,3143,4465, 161,2889,2980,2009,1421,  56,1908,1640,2387,2232,
1917,1874,2477,4921, 148,  83,3438, 592,4245,2882,1822,1055, 741, 115,1496,1624,
 381,1638,4592,1020, 516,3214, 458, 947,4575,1432, 211,1514,2926,1865,2142, 189,
 852,1221,1400,1486, 882,2299,4036, 351,  28,1122, 700,6479,6480,6481,6482,6483,  #last 512
)

site-packages/pip/_vendor/chardet/gb2312prober.py000064400000003332147511334570015621 0ustar00######################## BEGIN LICENSE BLOCK ########################
# The Original Code is mozilla.org code.
#
# The Initial Developer of the Original Code is
# Netscape Communications Corporation.
# Portions created by the Initial Developer are Copyright (C) 1998
# the Initial Developer. All Rights Reserved.
#
# Contributor(s):
#   Mark Pilgrim - port to Python
#
# This library is free software; you can redistribute it and/or
# modify it under the terms of the GNU Lesser General Public
# License as published by the Free Software Foundation; either
# version 2.1 of the License, or (at your option) any later version.
#
# This library is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
# Lesser General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public
# License along with this library; if not, write to the Free Software
# Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
# 02110-1301  USA
######################### END LICENSE BLOCK #########################

from .mbcharsetprober import MultiByteCharSetProber
from .codingstatemachine import CodingStateMachine
from .chardistribution import GB2312DistributionAnalysis
from .mbcssm import GB2312_SM_MODEL

class GB2312Prober(MultiByteCharSetProber):
    def __init__(self):
        super(GB2312Prober, self).__init__()
        self.coding_sm = CodingStateMachine(GB2312_SM_MODEL)
        self.distribution_analyzer = GB2312DistributionAnalysis()
        self.reset()

    @property
    def charset_name(self):
        return "GB2312"

    @property
    def language(self):
        return "Chinese"
site-packages/pip/_vendor/chardet/langgreekmodel.py000064400000030620147511334570016467 0ustar00######################## BEGIN LICENSE BLOCK ########################
# The Original Code is Mozilla Communicator client code.
#
# The Initial Developer of the Original Code is
# Netscape Communications Corporation.
# Portions created by the Initial Developer are Copyright (C) 1998
# the Initial Developer. All Rights Reserved.
#
# Contributor(s):
#   Mark Pilgrim - port to Python
#
# This library is free software; you can redistribute it and/or
# modify it under the terms of the GNU Lesser General Public
# License as published by the Free Software Foundation; either
# version 2.1 of the License, or (at your option) any later version.
#
# This library is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
# Lesser General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public
# License along with this library; if not, write to the Free Software
# Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
# 02110-1301  USA
######################### END LICENSE BLOCK #########################

# 255: Control characters that usually do not exist in any text
# 254: Carriage Return / Line Feed
# 253: symbols (punctuation) that do not belong to a word
# 252: 0 - 9

# Character Mapping Table:
Latin7_char_to_order_map = (
255,255,255,255,255,255,255,255,255,255,254,255,255,254,255,255,  # 00
255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,  # 10
253,253,253,253,253,253,253,253,253,253,253,253,253,253,253,253,  # 20
252,252,252,252,252,252,252,252,252,252,253,253,253,253,253,253,  # 30
253, 82,100,104, 94, 98,101,116,102,111,187,117, 92, 88,113, 85,  # 40
 79,118,105, 83, 67,114,119, 95, 99,109,188,253,253,253,253,253,  # 50
253, 72, 70, 80, 81, 60, 96, 93, 89, 68,120, 97, 77, 86, 69, 55,  # 60
 78,115, 65, 66, 58, 76,106,103, 87,107,112,253,253,253,253,253,  # 70
255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,  # 80
255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,  # 90
253,233, 90,253,253,253,253,253,253,253,253,253,253, 74,253,253,  # a0
253,253,253,253,247,248, 61, 36, 46, 71, 73,253, 54,253,108,123,  # b0
110, 31, 51, 43, 41, 34, 91, 40, 52, 47, 44, 53, 38, 49, 59, 39,  # c0
 35, 48,250, 37, 33, 45, 56, 50, 84, 57,120,121, 17, 18, 22, 15,  # d0
124,  1, 29, 20, 21,  3, 32, 13, 25,  5, 11, 16, 10,  6, 30,  4,  # e0
  9,  8, 14,  7,  2, 12, 28, 23, 42, 24, 64, 75, 19, 26, 27,253,  # f0
)

win1253_char_to_order_map = (
255,255,255,255,255,255,255,255,255,255,254,255,255,254,255,255,  # 00
255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,  # 10
253,253,253,253,253,253,253,253,253,253,253,253,253,253,253,253,  # 20
252,252,252,252,252,252,252,252,252,252,253,253,253,253,253,253,  # 30
253, 82,100,104, 94, 98,101,116,102,111,187,117, 92, 88,113, 85,  # 40
 79,118,105, 83, 67,114,119, 95, 99,109,188,253,253,253,253,253,  # 50
253, 72, 70, 80, 81, 60, 96, 93, 89, 68,120, 97, 77, 86, 69, 55,  # 60
 78,115, 65, 66, 58, 76,106,103, 87,107,112,253,253,253,253,253,  # 70
255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,  # 80
255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,  # 90
253,233, 61,253,253,253,253,253,253,253,253,253,253, 74,253,253,  # a0
253,253,253,253,247,253,253, 36, 46, 71, 73,253, 54,253,108,123,  # b0
110, 31, 51, 43, 41, 34, 91, 40, 52, 47, 44, 53, 38, 49, 59, 39,  # c0
 35, 48,250, 37, 33, 45, 56, 50, 84, 57,120,121, 17, 18, 22, 15,  # d0
124,  1, 29, 20, 21,  3, 32, 13, 25,  5, 11, 16, 10,  6, 30,  4,  # e0
  9,  8, 14,  7,  2, 12, 28, 23, 42, 24, 64, 75, 19, 26, 27,253,  # f0
)

# Model Table:
# total sequences: 100%
# first 512 sequences: 98.2851%
# first 1024 sequences: 1.7001%
# rest  sequences:     0.0359%
# negative sequences:  0.0148%
GreekLangModel = (
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,0,3,2,2,3,3,3,3,3,3,3,3,1,3,3,3,0,2,2,3,3,0,3,0,3,2,0,3,3,3,0,
3,0,0,0,2,0,0,0,0,0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,3,3,3,3,3,0,3,3,0,3,2,3,3,0,3,2,3,3,3,0,0,3,0,3,0,3,3,2,0,0,0,
2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,2,0,0,0,0,0,0,0,0,
0,2,3,2,2,3,3,3,3,3,3,3,3,0,3,3,3,3,0,2,3,3,0,3,3,3,3,2,3,3,3,0,
2,0,0,0,2,0,0,0,0,0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,2,3,3,2,3,3,3,3,3,3,3,3,3,3,3,3,0,2,1,3,3,3,3,2,3,3,2,3,3,2,0,
0,0,0,0,2,0,0,0,0,0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,3,3,3,3,0,3,3,3,3,3,3,0,3,3,0,3,3,3,3,3,3,3,3,3,3,0,3,2,3,3,0,
2,0,1,0,2,0,0,0,0,0,2,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,
0,3,3,3,3,3,2,3,0,0,0,0,3,3,0,3,1,3,3,3,0,3,3,0,3,3,3,3,0,0,0,0,
2,0,0,0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,3,3,3,3,3,0,3,0,3,3,3,3,3,0,3,2,2,2,3,0,2,3,3,3,3,3,2,3,3,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,3,3,3,3,3,3,2,2,2,3,3,3,3,0,3,1,3,3,3,3,2,3,3,3,3,3,3,3,2,2,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,3,3,3,3,3,2,0,3,0,0,0,3,3,2,3,3,3,3,3,0,0,3,2,3,0,2,3,0,0,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,3,0,3,3,3,3,0,0,3,3,0,2,3,0,3,0,3,3,3,0,0,3,0,3,0,2,2,3,3,0,0,
0,0,1,0,0,0,0,0,0,0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,3,3,3,3,3,2,0,3,2,3,3,3,3,0,3,3,3,3,3,0,3,3,2,3,2,3,3,2,0,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,3,3,2,3,2,3,3,3,3,3,3,0,2,3,2,3,2,2,2,3,2,3,3,2,3,0,2,2,2,3,0,
2,0,0,0,0,0,0,0,0,0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,0,3,0,0,0,3,3,3,2,3,3,0,0,3,0,3,0,0,0,3,2,0,3,0,3,0,0,2,0,2,0,
0,0,0,0,2,0,0,0,0,0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,0,0,0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,3,3,3,3,0,3,3,3,3,3,3,0,3,3,0,3,0,0,0,3,3,0,3,3,3,0,0,1,2,3,0,
3,0,0,0,0,0,0,0,0,0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,3,3,3,3,3,2,0,0,3,2,2,3,3,0,3,3,3,3,3,2,1,3,0,3,2,3,3,2,1,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,0,3,3,0,2,3,3,3,3,3,3,0,0,3,0,3,0,0,0,3,3,0,3,2,3,0,0,3,3,3,0,
3,0,0,0,2,0,0,0,0,0,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,3,3,3,3,0,3,3,3,3,3,3,0,0,3,0,3,0,0,0,3,2,0,3,2,3,0,0,3,2,3,0,
2,0,0,0,0,0,0,0,0,0,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,0,3,1,2,2,3,3,3,3,3,3,0,2,3,0,3,0,0,0,3,3,0,3,0,2,0,0,2,3,1,0,
2,0,0,0,0,0,0,0,0,0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,3,0,3,3,3,3,0,3,0,3,3,2,3,0,3,3,3,3,3,3,0,3,3,3,0,2,3,0,0,3,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,3,0,3,3,3,0,0,3,0,0,0,3,3,0,3,0,2,3,3,0,0,3,0,3,0,3,3,0,0,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,0,3,0,0,0,3,3,3,3,3,3,0,0,3,0,2,0,0,0,3,3,0,3,0,3,0,0,2,0,2,0,
0,0,0,0,1,0,0,0,0,0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,3,3,3,3,3,3,0,3,0,2,0,3,2,0,3,2,3,2,3,0,0,3,2,3,2,3,3,0,0,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,0,3,0,0,2,3,3,3,3,3,0,0,0,3,0,2,1,0,0,3,2,2,2,0,3,0,0,2,2,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,3,0,3,3,3,2,0,3,0,3,0,3,3,0,2,1,2,3,3,0,0,3,0,3,0,3,3,0,0,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,2,3,3,3,0,3,3,3,3,3,3,0,2,3,0,3,0,0,0,2,1,0,2,2,3,0,0,2,2,2,0,
0,0,0,0,0,0,0,0,0,0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,0,3,0,0,2,3,3,3,2,3,0,0,1,3,0,2,0,0,0,0,3,0,1,0,2,0,0,1,1,1,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,3,3,3,3,3,1,0,3,0,0,0,3,2,0,3,2,3,3,3,0,0,3,0,3,2,2,2,1,0,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,3,0,3,3,3,0,0,3,0,0,0,0,2,0,2,3,3,2,2,2,2,3,0,2,0,2,2,0,0,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,3,3,3,3,2,0,0,0,0,0,0,2,3,0,2,0,2,3,2,0,0,3,0,3,0,3,1,0,0,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,0,0,0,0,0,3,2,3,3,2,2,3,0,2,0,3,0,0,0,2,0,0,0,0,1,2,0,2,0,2,0,
0,2,0,2,0,2,2,0,0,1,0,2,2,2,0,2,2,2,0,2,2,2,0,0,2,0,0,1,0,0,0,0,
0,2,0,3,3,2,0,0,0,0,0,0,1,3,0,2,0,2,2,2,0,0,2,0,3,0,0,2,0,0,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,3,0,2,3,2,0,2,2,0,2,0,2,2,0,2,0,2,2,2,0,0,0,0,0,0,2,3,0,0,0,2,
0,1,2,0,0,0,0,2,2,0,0,0,2,1,0,2,2,0,0,0,0,0,0,1,0,2,0,0,0,0,0,0,
0,0,2,1,0,2,3,2,2,3,2,3,2,0,0,3,3,3,0,0,3,2,0,0,0,1,1,0,2,0,2,2,
0,2,0,2,0,2,2,0,0,2,0,2,2,2,0,2,2,2,2,0,0,2,0,0,0,2,0,1,0,0,0,0,
0,3,0,3,3,2,2,0,3,0,0,0,2,2,0,2,2,2,1,2,0,0,1,2,2,0,0,3,0,0,0,2,
0,1,2,0,0,0,1,2,0,0,0,0,0,0,0,2,2,0,1,0,0,2,0,0,0,2,0,0,0,0,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,2,3,3,2,2,0,0,0,2,0,2,3,3,0,2,0,0,0,0,0,0,2,2,2,0,2,2,0,2,0,2,
0,2,2,0,0,2,2,2,2,1,0,0,2,2,0,2,0,0,2,0,0,0,0,0,0,2,0,0,0,0,0,0,
0,2,0,3,2,3,0,0,0,3,0,0,2,2,0,2,0,2,2,2,0,0,2,0,0,0,0,0,0,0,0,2,
0,0,2,2,0,0,2,2,2,0,0,0,0,0,0,2,0,0,0,2,0,0,0,0,0,0,0,0,0,0,0,0,
0,0,2,0,0,3,2,0,2,2,2,2,2,0,0,0,2,0,0,0,0,2,0,1,0,0,2,0,1,0,0,0,
0,2,2,2,0,2,2,0,1,2,0,2,2,2,0,2,2,2,2,1,2,2,0,0,2,0,0,0,0,0,0,0,
0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,2,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,
0,2,0,2,0,2,2,0,0,0,0,1,2,1,0,0,2,2,0,0,2,0,0,0,0,0,0,0,0,0,0,0,
0,0,0,3,2,3,0,0,2,0,0,0,2,2,0,2,0,0,0,1,0,0,2,0,2,0,2,2,0,0,0,0,
0,0,2,0,0,0,0,2,2,0,0,0,0,0,0,2,0,0,0,0,0,0,0,0,0,2,0,0,0,0,0,0,
0,2,2,3,2,2,0,0,0,0,0,0,1,3,0,2,0,2,2,0,0,0,1,0,2,0,0,0,0,0,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,2,0,2,0,3,2,0,2,0,0,0,0,0,0,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,
0,0,2,0,0,0,0,1,1,0,0,2,1,2,0,2,2,0,1,0,0,1,0,0,0,2,0,0,0,0,0,0,
0,3,0,2,2,2,0,0,2,0,0,0,2,0,0,0,2,3,0,2,0,0,0,0,0,0,2,2,0,0,0,2,
0,1,2,0,0,0,1,2,2,1,0,0,0,2,0,0,2,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,
0,0,0,0,0,0,0,0,0,3,0,0,0,0,0,0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,2,1,2,0,2,2,0,2,0,0,2,0,0,0,0,1,2,1,0,2,1,0,0,0,0,0,0,0,0,0,0,
0,0,2,0,0,0,3,1,2,2,0,2,0,0,0,0,2,0,0,0,2,0,0,3,0,0,0,0,2,2,2,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,2,1,0,2,0,1,2,0,0,0,0,0,0,0,0,0,0,0,0,0,2,0,0,1,0,0,0,0,0,0,2,
0,2,2,0,0,2,2,2,2,2,0,1,2,0,0,0,2,2,0,1,0,2,0,0,2,2,0,0,0,0,0,0,
0,0,0,0,1,0,0,0,0,0,0,0,3,0,0,2,0,0,0,0,0,0,0,0,2,0,2,0,0,0,0,2,
0,1,2,0,0,0,0,2,2,1,0,1,0,1,0,2,2,2,1,0,0,0,0,0,0,1,0,0,0,0,0,0,
0,2,0,1,2,0,0,0,0,0,0,0,0,0,0,2,0,0,2,2,0,0,0,0,1,0,0,0,0,0,0,2,
0,2,2,0,0,0,0,2,2,0,0,0,0,0,0,2,0,0,0,0,0,0,0,0,0,2,0,0,2,0,0,0,
0,2,2,2,2,0,0,0,3,0,0,0,0,0,0,0,0,2,0,0,0,0,0,0,2,0,0,0,0,0,0,1,
0,0,2,0,0,0,0,1,2,0,0,0,0,0,0,2,2,1,1,0,0,0,0,0,0,1,0,0,0,0,0,0,
0,2,0,2,2,2,0,0,2,0,0,0,0,0,0,0,2,2,2,0,0,0,2,0,0,0,0,0,0,0,0,2,
0,0,1,0,0,0,0,2,1,0,0,0,0,0,0,1,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,
0,3,0,2,0,0,0,0,0,0,0,0,2,0,0,0,0,0,2,0,0,0,0,0,0,0,2,0,0,0,0,2,
0,0,2,0,0,0,0,2,2,0,0,0,0,1,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,2,0,2,2,1,0,0,0,0,0,0,2,0,0,2,0,2,2,2,0,0,0,0,0,0,2,0,0,0,0,2,
0,0,2,0,0,2,0,2,2,0,0,0,0,2,0,2,0,0,0,0,0,2,0,0,0,2,0,0,0,0,0,0,
0,0,3,0,0,0,2,2,0,2,2,0,0,0,0,0,2,0,0,0,0,0,0,2,0,0,0,0,0,0,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,0,0,0,0,0,1,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,2,0,0,2,0,0,0,0,0,
0,2,2,2,2,2,0,0,0,0,0,0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,0,1,
0,0,0,0,0,0,0,2,1,0,0,0,0,0,0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,0,0,0,0,0,0,2,2,0,0,0,0,0,2,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,
0,2,0,0,0,2,0,0,0,0,0,1,0,0,0,0,2,2,0,0,0,1,0,0,0,0,0,0,0,0,0,0,
0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,2,0,0,0,
0,2,0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,0,1,0,0,0,0,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,2,0,2,0,0,0,
0,0,0,0,0,0,0,0,2,1,0,0,0,0,0,0,2,0,0,0,1,2,0,0,0,0,0,0,0,0,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
)

Latin7GreekModel = {
  'char_to_order_map': Latin7_char_to_order_map,
  'precedence_matrix': GreekLangModel,
  'typical_positive_ratio': 0.982851,
  'keep_english_letter': False,
  'charset_name': "ISO-8859-7",
  'language': 'Greek',
}

Win1253GreekModel = {
  'char_to_order_map': win1253_char_to_order_map,
  'precedence_matrix': GreekLangModel,
  'typical_positive_ratio': 0.982851,
  'keep_english_letter': False,
  'charset_name': "windows-1253",
  'language': 'Greek',
}
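
# --- How these tables are consumed (illustrative sketch; assumed chardet 3.x
# --- behaviour, not part of the upstream source) -----------------------------
# The single-byte prober is believed to walk the input one byte at a time,
# mapping each byte through char_to_order_map and, whenever two consecutive
# orders both fall inside the 64-symbol sample, looking up the likelihood
# class (0-3) of that pair in the precedence matrix.  `byte` and `last_order`
# below are placeholder names for the current input byte and the previous
# byte's order:
#
#     SAMPLE_SIZE = 64                      # matrix is 64 x 64 = 4096 entries
#     order = Win1253GreekModel['char_to_order_map'][byte]
#     if order < SAMPLE_SIZE and last_order < SAMPLE_SIZE:
#         likelihood = GreekLangModel[last_order * SAMPLE_SIZE + order]
#         # 3 = very common pair, 0 = essentially never seen in Greek text
#     last_order = order
#
# typical_positive_ratio (0.982851 here) is the share of pairs that fall into
# the most common class in typical Greek text; the prober is assumed to use it
# to normalise the observed ratio into a confidence value.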
site-packages/pip/_vendor/chardet/cli/__init__.py000064400000000001147511334570016003 0ustar00
site-packages/pip/_vendor/chardet/cli/__pycache__/__init__.cpython-36.opt-1.pyc
site-packages/pip/_vendor/chardet/cli/__pycache__/__init__.cpython-36.pyc
site-packages/pip/_vendor/chardet/cli/__pycache__/chardetect.cpython-36.pyc
site-packages/pip/_vendor/chardet/cli/__pycache__/chardetect.cpython-36.opt-1.pyc
(compiled CPython 3.6 bytecode caches of the cli package; binary contents not reproducible as text)
site-packages/pip/_vendor/chardet/cli/chardetect.py000064400000005262147511334570016370 0ustar00#!/usr/bin/env python
"""
Script which takes one or more file paths and reports on their detected
encodings

Example::

    % chardetect somefile someotherfile
    somefile: windows-1252 with confidence 0.5
    someotherfile: ascii with confidence 1.0

If no paths are provided, it takes its input from stdin.

"""

from __future__ import absolute_import, print_function, unicode_literals

import argparse
import sys

from chardet import __version__
from chardet.compat import PY2
from chardet.universaldetector import UniversalDetector


def description_of(lines, name='stdin'):
    """
    Return a string describing the probable encoding of a file or
    list of strings.

    :param lines: The lines to get the encoding of.
    :type lines: Iterable of bytes
    :param name: Name of file or collection of lines
    :type name: str
    """
    u = UniversalDetector()
    for line in lines:
        line = bytearray(line)
        u.feed(line)
        # shortcut out of the loop to save reading further - particularly useful if we read a BOM.
        if u.done:
            break
    u.close()
    result = u.result
    if PY2:
        name = name.decode(sys.getfilesystemencoding(), 'ignore')
    if result['encoding']:
        return '{0}: {1} with confidence {2}'.format(name, result['encoding'],
                                                     result['confidence'])
    else:
        return '{0}: no result'.format(name)


def main(argv=None):
    """
    Handles command line arguments and gets things started.

    :param argv: List of arguments, as if specified on the command-line.
                 If None, ``sys.argv[1:]`` is used instead.
    :type argv: list of str
    """
    # Get command line arguments
    parser = argparse.ArgumentParser(
        description="Takes one or more file paths and reports their detected \
                     encodings")
    parser.add_argument('input',
                        help='File whose encoding we would like to determine. \
                              (default: stdin)',
                        type=argparse.FileType('rb'), nargs='*',
                        default=[sys.stdin if PY2 else sys.stdin.buffer])
    parser.add_argument('--version', action='version',
                        version='%(prog)s {0}'.format(__version__))
    args = parser.parse_args(argv)

    for f in args.input:
        if f.isatty():
            print("You are running chardetect interactively. Press " +
                  "CTRL-D twice at the start of a blank line to signal the " +
                  "end of your input. If you want help, run chardetect " +
                  "--help\n", file=sys.stderr)
        print(description_of(f, f.name))


if __name__ == '__main__':
    main()
site-packages/pip/_vendor/chardet/mbcsgroupprober.py000064400000003734147511334570016730 0ustar00######################## BEGIN LICENSE BLOCK ########################
# The Original Code is Mozilla Universal charset detector code.
#
# The Initial Developer of the Original Code is
# Netscape Communications Corporation.
# Portions created by the Initial Developer are Copyright (C) 2001
# the Initial Developer. All Rights Reserved.
#
# Contributor(s):
#   Mark Pilgrim - port to Python
#   Shy Shalom - original C code
#   Proofpoint, Inc.
#
# This library is free software; you can redistribute it and/or
# modify it under the terms of the GNU Lesser General Public
# License as published by the Free Software Foundation; either
# version 2.1 of the License, or (at your option) any later version.
#
# This library is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
# Lesser General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public
# License along with this library; if not, write to the Free Software
# Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
# 02110-1301  USA
######################### END LICENSE BLOCK #########################

from .charsetgroupprober import CharSetGroupProber
from .utf8prober import UTF8Prober
from .sjisprober import SJISProber
from .eucjpprober import EUCJPProber
from .gb2312prober import GB2312Prober
from .euckrprober import EUCKRProber
from .cp949prober import CP949Prober
from .big5prober import Big5Prober
from .euctwprober import EUCTWProber


class MBCSGroupProber(CharSetGroupProber):
    def __init__(self, lang_filter=None):
        super(MBCSGroupProber, self).__init__(lang_filter=lang_filter)
        self.probers = [
            UTF8Prober(),
            SJISProber(),
            EUCJPProber(),
            GB2312Prober(),
            EUCKRProber(),
            CP949Prober(),
            Big5Prober(),
            EUCTWProber()
        ]
        self.reset()
site-packages/pip/_vendor/chardet/cp949prober.py000064400000003477147511334570015603 0ustar00######################## BEGIN LICENSE BLOCK ########################
# The Original Code is mozilla.org code.
#
# The Initial Developer of the Original Code is
# Netscape Communications Corporation.
# Portions created by the Initial Developer are Copyright (C) 1998
# the Initial Developer. All Rights Reserved.
#
# Contributor(s):
#   Mark Pilgrim - port to Python
#
# This library is free software; you can redistribute it and/or
# modify it under the terms of the GNU Lesser General Public
# License as published by the Free Software Foundation; either
# version 2.1 of the License, or (at your option) any later version.
#
# This library is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
# Lesser General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public
# License along with this library; if not, write to the Free Software
# Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
# 02110-1301  USA
######################### END LICENSE BLOCK #########################

from .chardistribution import EUCKRDistributionAnalysis
from .codingstatemachine import CodingStateMachine
from .mbcharsetprober import MultiByteCharSetProber
from .mbcssm import CP949_SM_MODEL


class CP949Prober(MultiByteCharSetProber):
    def __init__(self):
        super(CP949Prober, self).__init__()
        self.coding_sm = CodingStateMachine(CP949_SM_MODEL)
        # NOTE: CP949 is a superset of EUC-KR, so the distribution should not
        #       be different.
        self.distribution_analyzer = EUCKRDistributionAnalysis()
        self.reset()

    @property
    def charset_name(self):
        return "CP949"

    @property
    def language(self):
        return "Korean"
site-packages/pip/_vendor/chardet/sjisprober.py000064400000007276147511334570015704 0ustar00######################## BEGIN LICENSE BLOCK ########################
# The Original Code is mozilla.org code.
#
# The Initial Developer of the Original Code is
# Netscape Communications Corporation.
# Portions created by the Initial Developer are Copyright (C) 1998
# the Initial Developer. All Rights Reserved.
#
# Contributor(s):
#   Mark Pilgrim - port to Python
#
# This library is free software; you can redistribute it and/or
# modify it under the terms of the GNU Lesser General Public
# License as published by the Free Software Foundation; either
# version 2.1 of the License, or (at your option) any later version.
#
# This library is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
# Lesser General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public
# License along with this library; if not, write to the Free Software
# Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
# 02110-1301  USA
######################### END LICENSE BLOCK #########################

from .mbcharsetprober import MultiByteCharSetProber
from .codingstatemachine import CodingStateMachine
from .chardistribution import SJISDistributionAnalysis
from .jpcntx import SJISContextAnalysis
from .mbcssm import SJIS_SM_MODEL
from .enums import ProbingState, MachineState


class SJISProber(MultiByteCharSetProber):
    def __init__(self):
        super(SJISProber, self).__init__()
        self.coding_sm = CodingStateMachine(SJIS_SM_MODEL)
        self.distribution_analyzer = SJISDistributionAnalysis()
        self.context_analyzer = SJISContextAnalysis()
        self.reset()

    def reset(self):
        super(SJISProber, self).reset()
        self.context_analyzer.reset()

    @property
    def charset_name(self):
        return self.context_analyzer.charset_name

    @property
    def language(self):
        return "Japanese"

    def feed(self, byte_str):
        for i in range(len(byte_str)):
            coding_state = self.coding_sm.next_state(byte_str[i])
            if coding_state == MachineState.ERROR:
                self.logger.debug('%s %s prober hit error at byte %s',
                                  self.charset_name, self.language, i)
                self._state = ProbingState.NOT_ME
                break
            elif coding_state == MachineState.ITS_ME:
                self._state = ProbingState.FOUND_IT
                break
            elif coding_state == MachineState.START:
                char_len = self.coding_sm.get_current_charlen()
                if i == 0:
                    self._last_char[1] = byte_str[0]
                    self.context_analyzer.feed(self._last_char[2 - char_len:],
                                               char_len)
                    self.distribution_analyzer.feed(self._last_char, char_len)
                else:
                    self.context_analyzer.feed(byte_str[i + 1 - char_len:i + 3
                                                        - char_len], char_len)
                    self.distribution_analyzer.feed(byte_str[i - 1:i + 1],
                                                    char_len)

        self._last_char[0] = byte_str[-1]

        if self.state == ProbingState.DETECTING:
            if (self.context_analyzer.got_enough_data() and
               (self.get_confidence() > self.SHORTCUT_THRESHOLD)):
                self._state = ProbingState.FOUND_IT

        return self.state

    def get_confidence(self):
        context_conf = self.context_analyzer.get_confidence()
        distrib_conf = self.distribution_analyzer.get_confidence()
        return max(context_conf, distrib_conf)
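# --- usage sketch (not part of the original archive) ------------------------
# SJISProber scores input with two independent analyzers -- byte-pair character
# distribution and kana context -- and get_confidence() above reports the higher
# of the two.  Minimal hedged example; the import path assumes the copy of
# chardet vendored with pip.
if __name__ == '__main__':
    from pip._vendor.chardet.enums import ProbingState
    from pip._vendor.chardet.sjisprober import SJISProber

    prober = SJISProber()
    state = prober.feed('日本語のテキストです。'.encode('shift_jis') * 20)
    # Valid Shift_JIS never drives the state machine into an error state.
    assert state in (ProbingState.DETECTING, ProbingState.FOUND_IT)
    print(prober.charset_name, prober.get_confidence())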
[binary data omitted: compiled CPython 3.6 bytecode from site-packages/pip/_vendor/chardet/__pycache__/ --
 charsetgroupprober, langthaimodel, big5prober, utf8prober, euctwprober, euckrprober, mbcsgroupprober,
 escprober, sjisprober, codingstatemachine, cp949prober, sbcharsetprober, enums, latin1prober, compat,
 langgreekmodel, __init__, and euctwfreq .cpython-36.pyc / .cpython-36.opt-1.pyc files]
rrrrrrrrrrrrrrrrrrr r!r"r#r$r%r&r'r(r)r*r+r,r-r.r/r0r1r2r3r4r5r6r7r8r9r:r;r<r=r>r?r@rArBrCrDrErFrGrHrIrJrKrLrMrNrOrPrQrRrSrTrUrVrWrXrYrZr[r\r]r^r_r`rarbrcrdrerfrgrhrirjrkrlrmrnrorprqrrrsrtrurvrwrxryrzr{r|r}r~rr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrrrrrrrrr	r
rrr
rrrrrrrrrrrrrrrrrrr r!r"r#r$r%r&r'r(r)r*r+r,r-r.r/r0r1r2r3r4r5r6r7r8r9r:r;r<r=r>r?r@rArBrCrDrErFrGrHrIrJrKrLrMrNrOrPrQrRrSrTrUrVrWrXrYrZr[r\r]r^r_r`rarbrcrdrerfrgrhrirjrkrlrmrnrorprqrrrsrtrurvrwrxryrzr{r|r}r~rr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrrrrrrrrr	r
rrr
rrrrrrrrrrrrrrrrrrr r!r"r#r$r%r&r'r(r)r*r+r,r-r.r/r0r1r2r3r4r5r6r7r8r9r:r;r<r=r>r?r@rArBrCrDrErFrGrHrIrJrKrLrMrNrOrPrQrRrSrTrUrVrWrXrYrZr[r\r]r^r_r`rarbrcrdrerfrgrhrirjrkrlrmrnrorprqrrrsrtrurvrwrxryrzr{r|r}r~rr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrrrrrrrrr	r
rrr
rrrrrrrrrrrrrrrrrrr r!r"r#r$r%r&r'r(r)r*r+r,r-r.r/r0r1r2r3r4r5r6r7r8r9r:r;r<r=r>r?r@rArBrCrDrErFrGrHrIrJrKrLrMrNrOrPrQrRrSrTrUrVrWrXrYrZr[r\r]r^r_r`rarbrcrdrerfrgrhrirjrkrlrmrnrorprqrrrsrtrurvrwrxryrzr{r|r}r~rr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrrrrrrrrr	r
rrr
rrrrrrrrrrrrrrrrrrr r!r"r#r$r%r&r'r(r)r*r+r,r-r.r/r0r1r2r3r4r5r6r7r8r9r:r;r<r=r>r?r@rArBrCrDrErFrGrHrIrJrKrLrMrNrOrPrQrRrSrTrUrVrWrXrYrZr[r\r]r^r_r`rarbrcrdrerfrgrhrirjrkrlrmrnrorprqrrrsrtrurvrwrxryrzr{r|r}r~rr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r)Z EUCTW_TYPICAL_DISTRIBUTION_RATIOZEUCTW_TABLE_SIZEZEUCTW_CHAR_TO_FREQ_ORDER�rr�/usr/lib/python3.6/euctwfreq.py�<module>,s�site-packages/pip/_vendor/chardet/__pycache__/eucjpprober.cpython-36.pyc000064400000004436147511334570022321 0ustar003

���e��@s`ddlmZmZddlmZddlmZddlmZddl	m
Z
ddlmZGdd�de�Z
d	S)
�)�ProbingState�MachineState)�MultiByteCharSetProber)�CodingStateMachine)�EUCJPDistributionAnalysis)�EUCJPContextAnalysis)�EUCJP_SM_MODELcsPeZdZ�fdd�Z�fdd�Zedd��Zedd��Zd	d
�Zdd�Z	�Z
S)
�EUCJPProbercs4tt|�j�tt�|_t�|_t�|_	|j
�dS)N)�superr	�__init__rr�	coding_smr�distribution_analyzerr�context_analyzer�reset)�self)�	__class__��!/usr/lib/python3.6/eucjpprober.pyr%s

zEUCJPProber.__init__cstt|�j�|jj�dS)N)r
r	rr)r)rrrr,szEUCJPProber.resetcCsdS)NzEUC-JPr)rrrr�charset_name0szEUCJPProber.charset_namecCsdS)NZJapaneser)rrrr�language4szEUCJPProber.languagecCs6x�tt|��D]�}|jj||�}|tjkrN|jjd|j|j	|�t
j|_Pq|tj
krdt
j|_Pq|tjkr|jj�}|dkr�|d|jd<|jj|j|�|jj|j|�q|jj||d|d�|�|jj||d|d�|�qW|d|jd<|jt
jk�r0|jj��r0|j�|jk�r0t
j|_|jS)Nz!%s %s prober hit error at byte %s�r���)�range�lenrZ
next_staterZERRORZlogger�debugrrrZNOT_MEZ_stateZITS_MEZFOUND_ITZSTARTZget_current_charlenZ
_last_charr�feedr
�stateZ	DETECTINGZgot_enough_data�get_confidenceZSHORTCUT_THRESHOLD)rZbyte_str�iZcoding_stateZchar_lenrrrr8s4




zEUCJPProber.feedcCs|jj�}|jj�}t||�S)N)rrr
�max)rZcontext_confZdistrib_confrrrrYs

zEUCJPProber.get_confidence)�__name__�
__module__�__qualname__rr�propertyrrrr�
__classcell__rr)rrr	$s!r	N)ZenumsrrZmbcharsetproberrZcodingstatemachinerZchardistributionrZjpcntxrZmbcssmrr	rrrr�<module>ssite-packages/pip/_vendor/chardet/__pycache__/latin1prober.cpython-36.opt-1.pyc000064400000005456147511334570023345 0ustar003

���e��@s^ddlmZddlmZdZdZdZdZdZdZ	dZ
dZd	Zd
Z
eeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeee
ee
ee
eeeeeeeeeeeeeeeee
eeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeee	e	e	e	e	e	e
e
e	e	e	e	e	e	e	e	e
e
e	e	e	e	e	ee	e	e	e	e	e
e
e
eeeeeeeeeeeeeeeeeeeeeeeeeeeeeeee�fZdZGdd�de�Zd
S)�)�
CharSetProber)�ProbingState��������csLeZdZ�fdd�Zdd�Zedd��Zedd��Zd	d
�Zdd�Z	�Z
S)
�Latin1Probercs&tt|�j�d|_d|_|j�dS)N)�superr�__init__�_last_char_class�
_freq_counter�reset)�self)�	__class__��"/usr/lib/python3.6/latin1prober.pyraszLatin1Prober.__init__cCs t|_dgt|_tj|�dS)Nr)�OTHr�FREQ_CAT_NUMrrr)rrrrrgszLatin1Prober.resetcCsdS)Nz
ISO-8859-1r)rrrr�charset_namelszLatin1Prober.charset_namecCsdS)N�r)rrrr�languagepszLatin1Prober.languagecCsb|j|�}xP|D]H}t|}t|jt|}|dkr@tj|_P|j|d7<||_qW|j	S)Nrr)
Zfilter_with_english_letters�Latin1_CharToClass�Latin1ClassModelr�	CLASS_NUMr�NOT_MEZ_stater�state)rZbyte_str�cZ
char_classZfreqrrr�feedts



zLatin1Prober.feedcCs\|jtjkrdSt|j�}|dkr(d}n|jd|jdd|}|dkrPd}|d}|S)Ng{�G�z�?grrg4@g\��(\�?)rrr�sumr)rZtotalZ
confidencerrr�get_confidence�s
zLatin1Prober.get_confidence)�__name__�
__module__�__qualname__rr�propertyrrr!r#�
__classcell__rr)rrr`srN)@rrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr)Z
charsetproberrZenumsrrZUDFrZASCZASSZACVZACOZASVZASOrrrrrrrr�<module>sh	site-packages/pip/_vendor/chardet/__pycache__/version.cpython-36.pyc000064400000000550147511334570021457 0ustar003

���e��@sdZdZejd�ZdS)z�
This module exists only to simplify retrieving the version number of chardet
from within setup.py and from chardet subpackages.

:author: Dan Blanchard (dan.blanchard@gmail.com)
z3.0.4�.N)�__doc__�__version__�split�VERSION�rr�/usr/lib/python3.6/version.py�<module>ssite-packages/pip/_vendor/chardet/__pycache__/escsm.cpython-36.opt-1.pyc000064400000016167147511334570022056 0ustar003

���e)�@s�ddlmZdZejejdejejejejejejejejejejejejejejejejejejejdejdejdejdddejdejdddejdejdejejejejejejejf0ZdZedeedd	d
�ZdZ	ejdejejejejejejejejejejejejejejejejejejejejejejejejejejejejdejejejejejejejejejddejejejejejejejejejejejejejejejejejejejejejejf@Z
dZe	de
edd	d
�ZdZ
ejdejejejejejejejejejejejejejejejejejejejejejejejejejejejejejejejdejejejdejejejejejdejejejejejejejejejejejejejejejejejejejejejejejejejejejejfHZdZe
deeddd
�ZdZejdejejejejejejejejejejejejejejejejejejejdejejejejejejdejejejejejejejejejejejf(ZdZedeeddd
�ZdS)�)�MachineState������z
HZ-GB-2312ZChinese)Zclass_tableZclass_factorZstate_tableZchar_len_table�nameZlanguage�	zISO-2022-CN���
zISO-2022-JPZJapanesezISO-2022-KRZKoreanN(rrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr)rrrrrr(rrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr)	rrrrrrrrr(rrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr
rrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr)
rrrrrrrrrr(rrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr)rrrrrr)ZenumsrZHZ_CLSZSTARTZERRORZITS_MEZHZ_STZHZ_CHAR_LEN_TABLEZHZ_SM_MODELZ
ISO2022CN_CLSZISO2022CN_STZISO2022CN_CHAR_LEN_TABLEZISO2022CN_SM_MODELZ
ISO2022JP_CLSZISO2022JP_STZISO2022JP_CHAR_LEN_TABLEZISO2022JP_SM_MODELZ
ISO2022KR_CLSZISO2022KR_STZISO2022KR_CHAR_LEN_TABLEZISO2022KR_SM_MODEL�rr�/usr/lib/python3.6/escsm.py�<module>sp "    $     $ $site-packages/pip/_vendor/chardet/__pycache__/universaldetector.cpython-36.opt-1.pyc000064400000013173147511334570024500 0ustar003

���e�0�@s�dZddlZddlZddlZddlmZddlmZmZm	Z	ddl
mZddlm
Z
ddlmZdd	lmZGd
d�de�ZdS)a
Module containing the UniversalDetector detector class, which is the primary
class a user of ``chardet`` should use.

:author: Mark Pilgrim (initial port to Python)
:author: Shy Shalom (original C code)
:author: Dan Blanchard (major refactoring for 3.0)
:author: Ian Cordasco
�N�)�CharSetGroupProber)�
InputState�LanguageFilter�ProbingState)�EscCharSetProber)�Latin1Prober)�MBCSGroupProber)�SBCSGroupProberc	@sneZdZdZdZejd�Zejd�Zejd�Z	dddd	d
ddd
d�Z
ejfdd�Z
dd�Zdd�Zdd�ZdS)�UniversalDetectoraq
    The ``UniversalDetector`` class underlies the ``chardet.detect`` function
    and coordinates all of the different charset probers.

    To get a ``dict`` containing an encoding and its confidence, you can simply
    run:

    .. code::

            u = UniversalDetector()
            u.feed(some_bytes)
            u.close()
            detected = u.result

    g�������?s[�-�]s(|~{)s[�-�]zWindows-1252zWindows-1250zWindows-1251zWindows-1256zWindows-1253zWindows-1255zWindows-1254zWindows-1257)z
iso-8859-1z
iso-8859-2z
iso-8859-5z
iso-8859-6z
iso-8859-7z
iso-8859-8z
iso-8859-9ziso-8859-13cCsNd|_g|_d|_d|_d|_d|_d|_||_tj	t
�|_d|_|j
�dS)N)�_esc_charset_prober�_charset_probers�result�done�	_got_data�_input_state�
_last_char�lang_filter�loggingZ	getLogger�__name__�logger�_has_win_bytes�reset)�selfr�r�'/usr/lib/python3.6/universaldetector.py�__init__QszUniversalDetector.__init__cCsZdddd�|_d|_d|_d|_tj|_d|_|jr>|jj	�x|j
D]}|j	�qFWdS)z�
        Reset the UniversalDetector and all of its probers back to their
        initial states.  This is called by ``__init__``, so you only need to
        call this directly in between analyses of different documents.
        Ng)�encoding�
confidence�languageF�)rrrrr�
PURE_ASCIIrrrrr
)r�proberrrrr^s
zUniversalDetector.resetcCs>|jr
dSt|�sdSt|t�s(t|�}|js�|jtj�rJdddd�|_nv|jtj	tj
f�rldddd�|_nT|jd�r�dddd�|_n:|jd	�r�d
ddd�|_n |jtjtjf�r�dddd�|_d|_|jd
dk	r�d|_dS|j
tjk�r.|jj|��rtj|_
n*|j
tjk�r.|jj|j|��r.tj|_
|dd�|_|j
tjk�r�|j�s^t|j�|_|jj|�tjk�r:|jj|jj�|jjd�|_d|_n�|j
tjk�r:|j�s�t |j�g|_|jt!j"@�r�|jj#t$��|jj#t%��x@|jD]6}|j|�tjk�r�|j|j�|jd�|_d|_P�q�W|j&j|��r:d|_'dS)a�
        Takes a chunk of a document and feeds it through all of the relevant
        charset probers.

        After calling ``feed``, you can check the value of the ``done``
        attribute to see if you need to continue feeding the
        ``UniversalDetector`` more data, or if it has made a prediction
        (in the ``result`` attribute).

        .. note::
           You should always call ``close`` when you're done feeding in your
           document if ``done`` is not already ``True``.
        Nz	UTF-8-SIGg�?�)rrrzUTF-32s��zX-ISO-10646-UCS-4-3412s��zX-ISO-10646-UCS-4-2143zUTF-16Trr���)(r�len�
isinstance�	bytearrayr�
startswith�codecs�BOM_UTF8r�BOM_UTF32_LE�BOM_UTF32_BE�BOM_LE�BOM_BErrr!�HIGH_BYTE_DETECTOR�search�	HIGH_BYTE�ESC_DETECTORrZ	ESC_ASCIIrrr�feedrZFOUND_IT�charset_name�get_confidencerr
r	rZNON_CJK�appendr
r�WIN_BYTE_DETECTORr)rZbyte_strr"rrrr3os|





zUniversalDetector.feedc	Cs�|jr|jSd|_|js&|jjd�n�|jtjkrBdddd�|_n�|jtjkr�d}d}d}x,|j	D]"}|slqb|j
�}||krb|}|}qbW|r�||jkr�|j}|jj
�}|j
�}|jd	�r�|jr�|jj||�}|||jd�|_|jj�tjk�rz|jd
dk�rz|jjd�xn|j	D]d}|�s �qt|t��rZxF|jD] }|jjd|j|j|j
���q4Wn|jjd|j|j|j
���qW|jS)
z�
        Stop analyzing the current document and come up with a final
        prediction.

        :returns:  The ``result`` attribute, a ``dict`` with the keys
                   `encoding`, `confidence`, and `language`.
        Tzno data received!�asciig�?r#)rrrNgziso-8859rz no probers hit minimum thresholdz%s %s confidence = %s)rrrr�debugrrr!r1r
r5�MINIMUM_THRESHOLDr4�lowerr(r�ISO_WIN_MAP�getrZgetEffectiveLevelr�DEBUGr&rZprobers)	rZprober_confidenceZmax_prober_confidenceZ
max_proberr"r4Zlower_charset_namerZgroup_proberrrr�close�s`	

zUniversalDetector.closeN)r�
__module__�__qualname__�__doc__r:�re�compiler/r2r7r<rZALLrrr3r?rrrrr3s"



mr)rBr)rrCZcharsetgroupproberrZenumsrrrZ	escproberrZlatin1proberrZmbcsgroupproberr	Zsbcsgroupproberr
�objectrrrrr�<module>$ssite-packages/pip/_vendor/chardet/__pycache__/enums.cpython-36.opt-1.pyc000064400000004753147511334570022071 0ustar003

���e}�@shdZGdd�de�ZGdd�de�ZGdd�de�ZGdd�de�ZGd	d
�d
e�ZGdd�de�Zd
S)zr
All of the Enums that are used throughout the chardet package.

:author: Dan Blanchard (dan.blanchard@gmail.com)
c@seZdZdZdZdZdZdS)�
InputStatezS
    This enum represents the different states a universal detector can be in.
    ���N)�__name__�
__module__�__qualname__�__doc__Z
PURE_ASCIIZ	ESC_ASCIIZ	HIGH_BYTE�r	r	�/usr/lib/python3.6/enums.pyrsrc@s<eZdZdZdZdZdZdZdZdZ	eeBZ
e
eBeBZdS)	�LanguageFilterzj
    This enum represents the different language filters we can apply to a
    ``UniversalDetector``.
    rr����N)rrrrZCHINESE_SIMPLIFIEDZCHINESE_TRADITIONALZJAPANESEZKOREANZNON_CJKZALLZCHINESEZCJKr	r	r	r
rsrc@seZdZdZdZdZdZdS)�ProbingStatezG
    This enum represents the different states a prober can be in.
    rrrN)rrrrZ	DETECTINGZFOUND_ITZNOT_MEr	r	r	r
r src@seZdZdZdZdZdZdS)�MachineStatezN
    This enum represents the different states a state machine can be in.
    rrrN)rrrrZSTARTZERRORZITS_MEr	r	r	r
r)src@s,eZdZdZdZdZdZdZedd��Z	dS)	�SequenceLikelihoodzX
    This enum represents the likelihood of a character following the previous one.
    rrr�cCsdS)z::returns: The number of likelihood categories in the enum.rr	)�clsr	r	r
�get_num_categories;sz%SequenceLikelihood.get_num_categoriesN)
rrrrZNEGATIVEZUNLIKELYZLIKELYZPOSITIVE�classmethodrr	r	r	r
r2src@s$eZdZdZdZdZdZdZdZdS)�CharacterCategoryz�
    This enum represents the different categories language models for
    ``SingleByteCharsetProber`` put characters into.

    Anything less than CONTROL is considered a letter.
    �����N)	rrrrZ	UNDEFINEDZ
LINE_BREAKZSYMBOLZDIGITZCONTROLr	r	r	r
rAsrN)r�objectrrrrrrr	r	r	r
�<module>s			site-packages/pip/_vendor/chardet/__pycache__/jpcntx.cpython-36.opt-1.pyc000064400000113272147511334570022245 0ustar003

���e�L��@s8d`ZGdd�de�ZGdd	�d	e�ZGd
d�de�ZdS)a������c@sPeZdZdZdZdZdZdZdd�Zdd	�Z	d
d�Z
dd
�Zdd�Zdd�Z
dS)�JapaneseContextAnalysis�r�di�rcCs*d|_d|_d|_d|_d|_|j�dS)N)�
_total_rel�_rel_sample�_need_to_skip_char_num�_last_char_order�_done�reset)�self�r�/usr/lib/python3.6/jpcntx.py�__init__{sz JapaneseContextAnalysis.__init__cCs*d|_dg|j|_d|_d|_d|_dS)NrrF���)r
�NUM_OF_CATEGORYrrr
r)rrrrr�s
zJapaneseContextAnalysis.resetcCs�|jr
dS|j}x�||kr�|j|||d��\}}||7}||krV|||_d|_q|dkr�|jdkr�|jd7_|j|jkr�d|_P|jt|j|d7<||_qWdS)NrrTrrr)rr�	get_orderr
r
�MAX_REL_THRESHOLDr�jp2CharContext)r�byte_strZ	num_bytes�i�order�char_lenrrr�feed�s 	

zJapaneseContextAnalysis.feedcCs|j|jkS)N)r
�ENOUGH_REL_THRESHOLD)rrrr�got_enough_data�sz'JapaneseContextAnalysis.got_enough_datacCs,|j|jkr"|j|jd|jS|jSdS)Nr)r
�MINIMUM_DATA_THRESHOLDr�	DONT_KNOW)rrrr�get_confidence�sz&JapaneseContextAnalysis.get_confidencecCsdS)Nrr)rrr)rrrrrr�sz!JapaneseContextAnalysis.get_orderNr)�__name__�
__module__�__qualname__rr!rrr rrrrr"rrrrrrtsrcs0eZdZ�fdd�Zedd��Zdd�Z�ZS)�SJISContextAnalysiscstt|�j�d|_dS)NZ	SHIFT_JIS)�superr&r�
_charset_name)r)�	__class__rrr�szSJISContextAnalysis.__init__cCs|jS)N)r()rrrr�charset_name�sz SJISContextAnalysis.charset_namecCs�|sdS|d}d|ko"dkns@d|ko:dknrld}|dksdd	|ko^dknrpd
|_nd}t|�dkr�|d}|dkr�d|ko�dknr�|d|fSd|fS)Nrr�����r��ZCP932����r)rrr)r(�len)rr�
first_charr�second_charrrrr�s0  zSJISContextAnalysis.get_order)r#r$r%r�propertyr*r�
__classcell__rr)r)rr&�sr&c@seZdZdd�ZdS)�EUCJPContextAnalysiscCs�|sdS|d}|dks0d|ko*dknr6d}n|dkrDd}nd}t|�dkr�|d}|d	kr�d|kovd
knr�|d|fSd
|fS)Nrr���r�r���r)rrr)r3)rrr4rr5rrrr�s  zEUCJPContextAnalysis.get_orderN)r#r$r%rrrrrr8�sr8N�Srrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr�Srrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr�Srrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr�Srrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr�Srrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr�Srrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr�Srrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr�Srrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr�Srrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr�Srrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr�Srrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr�Srrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr�Srrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr�Srrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr�Srrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr�Srrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr�Srrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr�Srrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr�Srrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr�Srrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr�Srrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr�Srrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr�Srrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr�Srrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr�Srrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr�Srrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr�Srrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr�Srrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr�Srrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr�Srrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr�Srrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr�Srrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr�Srrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr�Srrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr�Srrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr�Srrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr�Srrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr�Srrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr�Srrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr�Srrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr�Srrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr
rrrrrrrrrrrrrrrrrrrrrrr�Srrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr�Srrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr�Srrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr�Srrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr�Srrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr�Srrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr�Srrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr�Srrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr�Srrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr�Srrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr�Srrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr�Srrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr�Srrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr�Srrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr�Srrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr�Srrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr�Srrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr�Srrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr�Srrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr�Srrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr�Srrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr�Srrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr�Srrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr�Srrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr�Srrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr�Srrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr�Srrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr�Srrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr�Srrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr�Srrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr�Srrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr�Srrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr�Srrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr�Srrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr�Srrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr�Srrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr�Srrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr�Srrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr�Srrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr�Srrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr�Srrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr�Srrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr
rrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr)Sr?r@rArBrCrDrErFrGrHrIrJrKrLrMrNrOrPrQrRrSrTrUrVrWrXrYrZr[r\r]r^r_r`rarbrcrdrerfrgrhrirjrkrlrmrnrorprqrrrsrtrurvrwrxryrzr{r|r}r~rr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�)r�objectrr&r8rrrr�<module>s�Csite-packages/pip/_vendor/chardet/__pycache__/big5freq.cpython-36.pyc000064400000152574147511334570021514 0ustar003

���ez�@sdZdZ�dZ�dS(g�?���	�������	�R������a����n���!����L�,�B��������{
�]
���
�j���.�N�i�����:����?���=�N�K����k�q	�����
�����b��
�����
�o�2��i����c�8���������{�|����"��
��@�\���	��������F��|
�Q�H���P�v��������D�^
����F�}
��E��O��0���s���4�<�2���&�M�����~
����G���[���?���a��K	�*��g��	�Z�
�:����K�	��	�������`�l�����
	� �q��~����
�	�������c��u���*���	��
��~�m������e��G�^��n���U�C���C��������� �j�o�/���P�7	�[������
�?����o�S�(����p�]��6�j��@�������8��+�3�[���\�������]	�A������1����H��
��
�d����+��2�����
�O�L	���f�1����������3�9���l�,���	���������e�z
�Q�M�&���X�������
�����k�p��M�����%�������'��	��\���7��J��!���������N��B�P�_
�q����
����
��
�� ��� �I���8	���
��{
����
����m����f���x�������
��
���g����
��������&��'������.������M	������$�#���D�h�A�	����r���U�G������
�
�Q�S�i���d��0�F�����C����� ���J����U�����N	�
���<��
�:�A
��
��	� 	�9���}���	���W�P�
��)�D����4��
��	�)����r	��s����t��9	�����s	�*�������]�����	�D�j�(
�
����
��u�5�Y�j�	�l��u����
���`
�
�=������������5��!����T�E�x���e���	���O	�P��|
�6�A�
�/�k���
��l�!�	���B��`��
����A�������
�v�	����w�����M�
�������a
����
�}
���x�,�}���B
����������F�k�m�	���
���	���
��b��)
�^�P	�
�,������7�5�~
�y�|��z������
�C
�{�����������1���b��	��
����t	���
��_�����������
��
�b
�����5�D���l��|�[�*
����
��%��G�^	��l�����
��`�����p����n�c�g�m����'�2�����{���������D
���f�	���|�:	�������
����
�
������n��������;���I�Y�}���
�X�"��
�����������-��l������������E��~��>�]���,�������v�L�B�i�&����������G������B���
�+
�����������	� �a����
�a���}�E
��D���=���0����
�6������v�!������������9�H�����
�F���������"��H�������
�o���R�*��.�������I�3��,
��*�S����X�����/��b�p��	�����������R����7��2���	�����������_	���b��
�����	�������`	�O��	���`�7�
�
��������a	�������������g������
�_��~��������
�a���
�b	�������������	�;���	�-�
���Q	������q��������`���#�#�������r�
�s�����d�t��
���c
����k�w���������������g��Q��U�
�������C�S��c	���5���B���_��c�N�����Y������
��L�d��
�
�	�K��8�a�G���
����s�6�	�t�;	�N����	�4���:��\�q����.�����u��
���������
��
�r�����
�
���c����@�����������/�9�����������
�<	�d
�����8�v��C����
���m��
��������Z��������w��	�{������$���M�d�0�r�����g��V�����:�Z����{�$�e�0���$��
�
��5��[���v���c��V�R	�$�;����f�����d����W�e
��������
��J�����u���	��
���
�K��2��L� �.���2����������%�������
����������
��
�x��M���}���T�����d	�
������`�����	�3���;�����y�Y�L����	��	�����U�\��������V�N����
��� 
��"�x�%��;�=	� ����2�E��!������^���"�w��
�#��������$�<�Q����r�%��]���&��<�'��3�6�B���(�����������g�
�
�)�
�	�*�+�!	������:��7��^�����(�
�,�+���x�-��/�h�=�.�������V����h����/�����<����;�0�����
�	�1�z������5�6�
�_�)�2�3���4�y�"	�����
��^�5����?�%�+�f
��6�_�W����
�7�>��
�g
�#	���u	�8����9�h
�F
���V�{���O���I�w�~�
�^���:�����������	�;�e�1������������������P�<�=���>�
���(�`�R�?�Y��I��	������7���
�?�����
�@���o��q�Y�n���l��������
�i
�E�P�V��!
�������n��A���|�
��
�B������-
�	�O��
��C��D�W�T�T�������
�}�E���&�
���������
���~�G
��
����$	�W�����o���
�F��G����"
�������#
�����D�-�Q���H�e��	���I�����J�z�
��K���L��$
�
�����H
����������M�:��	����N���I
������H������O����	�P�Q����k�R�R��������
�
������������S�T�����%
���������&
�S�U����S	��C������V�~��!�X�������,��������O���|�����P��e	���������������T������W�����X��Y��R�
�v���
���=���Z��U����[�����V�%	�\���]���^�8�T	�.
�'
�_��v�����w�t��`�a�b�>	����"�{����	��c���d�9������[�9���d�h�
�e�����J
�!���f����������T�g����%��W�M�h���i�������������(
�j
��y�7�m��j�k��?	�W�"�������B��l�|���)
�z����
����������m��n�u�v	�w���	�*
���o�p���q�7�.�r����
���
��f�����s�&	�y��� �������
��>�������Y���t��u�/
�+
��v����@�w���	��	���3�����,
�x���6�9�
�	���6�����!���y���Q�:�c�
���-
�o���F�X�
����f����z��
��������{�/���'	�	�b����`����u�	�v������
�
�
������0�E��S��8�[�|�}�G�����!�~���F�U	��
����e�s�����w	����"��a�	�Z�&������	�
�����A���
��i����������
�o������
�
��	��A��J�0���x	��
�����Y�������	�����Z�8�o����Q�����
���a�Q����p�;���P��b����������
����	��������������]������
��������'���.
�����
��������L�������	��/
�����B�0
�[��8������	���/�3������
�����0���n��������������	�����������4�����<�	��
����Y�	�����C�����
��������}�1��
����K������0
�F�������
��#��\�����f��>�
���
����������
�������j������-�p�9�@	�6�y�������
�1
���������V���������4�����K�1
����
�D����������#������2
�����	�Z�H�p����g������
�q�r�
����������K
�V	�
�\�,�
��������]���u��
������������Z���
������j����0�h�h��������E���	
���	���������������
����2
���������M���w�����y	�?������

�t�m�(	��	��+�����������i�
��	������
���L
����)	�
��
���f	���	��������R�"�A	�3
����-�1��$�
�(���]���#��$��
������ �*	������
����
�=�k
��!��
���������
�����_���������^�2���
������9������������4
������
��b�����������5��������"�	�
��������4�I����#����������j�����$�k���
��\�
�g��E�5
����������������
����6
�����	���%���������[�����������&�e�����������o�����l�	�}�������z���+	�����������n�-���
�'��i�
������(������	������������
��������B�i�q�����<��*�j�)�"�*���>�)�7
���+�����������z���,������M
�)�r����-�	�������������T���p�����#�#������������
�����	�	�����S���.�8
��������z	�9
�=�����m��S�������x�����/�����q����3���5���:
��
�������������
�;
��0�1�������n���_��]���n���������
�%���������`��������z�T�s�2�3
���
�{	��5������
�g	���k���������
���3�4����c� ����
�
����a�:�1�����!�A�h� �o��
�6�"����l
�-���
�a�5���	��
���#�!��$��
��4
�f���	�%�&�
�N
�
���W������
�'��$����<
��N���W	�k��O
�����������X	�,	��6�&�t���P
���7�
���
�-	�j��	�
��C�[�p���'�8�
�b���m
��
��{�(���	����	�9�
��	��:�
�
�^�������B	�;��������
����
���	���)�<�=�
��
��>������
�=�������������n
��H�D�
�?���U�W�
�=
��:��	�N��K�E�@�4�c�
��>
��Q
�
�A���h����
�
�(�]������B�*����l���C�D���E�h	���������X�����+������+�J�����U����
���m� �E����F��G�
�k�	�n��
�d����������F�!� ��r�@���H�"�C�G�����
��^�
����8�t�)�;��?
�!��o�I���"�#�#�$��J�	�%��
�&�r�K�'�5
�(�)��o
�@
�}�A
��,��*�4�1�+�������"��,��
�C	���u�r�D	�-�
�-��L�v�{���|	�M�.�.�N�Y	�6
����/�0�/����1�$��2�%�G�R�B
���_����C
����D
���3���������.	���	�4�@�w��5��6�7��i	���&�0���8�	�&��	�*��+�G�O�u��#�
�1�P�R��Q�����)�����=�9��l�������E
�S��k��\�	���
�y�:��;�<�F
�%�=��R���>�d�'�,�?�e�$��2�3�-����4��(�S�@�A�B���.�T��O�
��������
�&�����
�<�
����C��D���U���p
�G
�e�
����
�E�;�V�F�'�G�)�
���}	�H�*���W���I��T��
�
���������	��J�.�g�~��U��j	�	���X����8��*�	����K�~	�����L��H�
����%���+�f������t���M�N�������	�� 
���k	�Y��Y�����*�/���O�R
�Z�H
���#�I�
��	����5�S
�]�E	�[�\�
�s�I
�+�P�Q�,�	�
�R��S�T���,���U�]�V�����L�-�!��T
�\�4�W���X�Y�Z��[�>�w������+�^��J
���_�$��\�����l�.�]�`��^�_�q
�%��`�a�����
�6�g�U
�b�c�d�a�
�q�x�
�e�
�b�y��f�h�g�h���'�i�����	�j�r��
�0�	�=���b�|��������	��h�k��
�l���
�a��c�m�����n�
��u���d�e���s�K
�/�o�7��&�	�J�8�p�q�i�j�f���	�g�	����(��`�r�s�����<�F�k�f��������]���I�t�>��u��v��
����
�0��-��w�L
��	�����G�a��
�t���
�S�K�x��y�z���V
����O���1���
�{�
����F	�y��9�Z	���|�}�~�F�������
�	�:�����2�����p��;�G	���	�h�����7
������q����M
��������������
����
�(�8
��N
�	��3��	��O
�<����T��W
�	�����(�����1����������u�Z��
�	�c��j�P
�p���
�������v�i�j��
�����Q
�K����	�'��������9
���
��	�z��	�
���k���������2�	��l����>�=�C�'���>�
��Z�����m�n����b����	��q�?��
����	�"��
�?�
�}����o�@�>��A��1��)����?�
��
����x�	��	��	�3���h���l����)���
�p��'����4���
�
����A�_�	�B��L��
�P�q�����r�����X
��c�s�!
����H��	�w����C�D�t�������x�
��	�/	����5�
�4�u��r
���_���/��<��~�7�y�m���^�5�n�b�
�t���l	�����6���	�7��������v����	�8�:
��
��0	��1	��w��I������(����B����4�s��
�����
��
���?�C��x�	�E�H����F�y�z�G�����{����|��;
�}�9�m�H	�~���z���s
�	���M����������m	�����I������o�����������_��������2	�3	�H��*���
����<
��i�:����
�����A������R
���������
���"
���	�������
���
����
�����Q���
�Y
��������{�K�����[��J�����N��S
�����I	���T
�@���
�����
��@��R����	�t
�����J�I���
�������
�U
��m�����{�	���	�
�;�V
����	���<����
�d�������X����9����6�����U����� ���L�!�=
���)���
�	������	�����
��=�$�(�v����&���c������	�
���>�������{�"��������#��������W
��	��������	��$�
�%���������&�^����[	�	���'�����������(� ����2�\	�J�
�p�p�n	��u
�q�)���	�O�>
��
�*�X
�����	�+����,��
����
������K��
�		���I�-���!������.�D���J�s�#
����
���7����/�w��|������"�4	�|�,�������	���������?�Y
�P�0��1�Z
�Z
�
��
�	����
�#�@�2��|��z�����A�'��?
���	���8���y���L�V�
�3���	���$������M���%����	��
��
�.��4�N�����
��	�������
�5�9�%������K�Q����x�6�7��B��
	�O�v
�[
���&���
��
����X��*��\
�P��	�
�N�+�w
������W�C���	�Q�D��?�]
�����^
�o	�R�	
���E��
��F��'��X��G��S���T�R��8�x��	�k�S��T��� ������D��9�!�����(��L�"�#�$�%�_
��&�J�:�U��}��)��

�;�<�
�E�+�'�
��H�=�V�*�H�y�,�+�W�3�>����?�(�I�,�-�`
�q��a
�����
���)�*�X�	�+��O���Y�,�-��.��/�����0�.������1�
�J�r�2�3�$
�	�4��5�6�7��
��b
��	����}�	�f���K����
��
�	�	�8�Z�9���:����
�c
���:���;�s�J	�@����	�t�;�
�/�<�d
��	����U���0��e
�=���f
�u�M�������
�>�	�[��A�L��i�?�	�
��
�	�@�A�v�B�C�\���j��[
��
�g
�D�E����F��B��G��
��H���(���M�C�h
���I�N�J��K�L���M���1�2�N���d�O�d�V�D�`��#�5	�P�]�Q�E�n�
��	���^�R�_�F�S�p	�i
���T�U��

��V�����	��G�����e�H�O�	��P�`�W�w�j
�X� ��3�
�a��I�Y�J�x�7�����-��
�4�k
��b�Z�-���.�c��[�	�r�;���K�5�L�X�� �	����\�]���6�%
�^�_����~�`�l
�y�a�z�~�����b�?��c�7�d�
���e��z�����M�f�g�/�h�|��@�d���i�N�����8�f���j�k��O�P����l�m�@�n�9�	�o��	�����<�s��\
�Q�.�R�N�@��
��p��S�{�q��A���������T�
����:����r��
�s�@
�W�;�t�u��v�w�x�y�|�����z�����3�Q����m��A�{�|���	��}�O�R����g���~�Z�
��������\�m
��}���<�X����	��	��
����
���Y�����0��������=��	����L������h�����i�>�>��/�?�U�������e�)�x
�t��f�S���
�y
��
�n
�
���T��@����������������	����z�U�g��V���o
���6	�p
�t���������
�A���J�V�h�Z��	���W�����N(rrrrrrrr	r
rrr
rrrrrrrrrrrrrrrrrrr r!r"r#r$r%r&r'r(r)r*r+r,r-r.r/r0r1r2r3r4r5r6r7r8r9r:r;r<r=r>r?r@rArBrCrDrErFrGrHrIrJrKrLrMrNrOrPrQrRrSrTrUrVrWrXrYrZr[r\r]r^r_r`rarbrcrdrerfrgrhrirjrkrlrmrnrorprqrrrsrtrurvrwrxryrzr{r|r}r~rr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrrrrrrrrr	r
rrr
rrrrrrrrrrrrrrrrrrr r!r"r#r$r%r&r'r(r)r*r+r,r-r.r/r0r1r2r3r4r5r6r7r8r9r:r;r<r=r>r?r@rArBrCrDrErFrGrHrIrJrKrLrMrNrOrPrQrRrSrTrUrVrWrXrYrZr[r\r]r^r_r`rarbrcrdrerfrgrhrirjrkrlrmrnrorprqrrrsrtrurvrwrxryrzr{r|r}r~rr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrrrrrrrrr	r
rrr
rrrrrrrrrrrrrrrrrrr r!r"r#r$r%r&r'r(r)r*r+r,r-r.r/r0r1r2r3r4r5r6r7r8r9r:r;r<r=r>r?r@rArBrCrDrErFrGrHrIrJrKrLrMrNrOrPrQrRrSrTrUrVrWrXrYrZr[r\r]r^r_r`rarbrcrdrerfrgrhrirjrkrlrmrnrorprqrrrsrtrurvrwrxryrzr{r|r}r~rr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrrrrrrrrr	r
rrr
rrrrrrrrrrrrrrrrrrr r!r"r#r$r%r&r'r(r)r*r+r,r-r.r/r0r1r2r3r4r5r6r7r8r9r:r;r<r=r>r?r@rArBrCrDrErFrGrHrIrJrKrLrMrNrOrPrQrRrSrTrUrVrWrXrYrZr[r\r]r^r_r`rarbrcrdrerfrgrhrirjrkrlrmrnrorprqrrrsrtrurvrwrxryrzr{r|r}r~rr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrrrrrrrrr	r
rrr
rrrrrrrrrrrrrrrrrrr r!r"r#r$r%r&r'r(r)r*r+r,r-r.r/r0r1r2r3r4r5r6r7r8r9r:r;r<r=r>r?r@rArBrCrDrErFrGrHrIrJrKrLrMrNrOrPrQrRrSrTrUrVrWrXrYrZr[r\r]r^r_r`rarbrcrdrerfrgrhrirjrkrlrmrnrorprqrrrsrtrurvrwrxryrzr{r|r}r~rr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrrrrrrrrr	r
rrr
rrrrrrrrrrrrrrrrrrr r!r"r#r$r%r&r'r(r)r*r+r,r-r.r/r0r1r2r3r4r5r6r7r8r9r:r;r<r=r>r?r@rArBrCrDrErFrGrHrIrJrKrLrMrNrOrPrQrRrSrTrUrVrWrXrYrZr[r\r]r^r_r`rarbrcrdrerfrgrhrirjrkrlrmrnrorprqrrrsrtrurvrwrxryrzr{r|r}r~rr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrrrrrrrrr	r
rrr
rrrrrrrrrrrrrrrrrrr r!r"r#r$r%r&r'r(r)r*r+r,r-r.r/r0r1r2r3r4r5r6r7r8r9r:r;r<r=r>r?r@rArBrCrDrErFrGrHrIrJrKrLrMrNrOrPrQrRrSrTrUrVrWrXrYrZr[r\r]r^r_r`rarbrcrdrerfrgrhrirjrkrlrmrnrorprqrrrsrtrurvrwrxryrzr{r|r}r~rr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrrrrrrrrr	r
rrr
rrrrrrrrrrrrrrrrrrr r!r"r#r$r%r&r'r(r)r*r+r,r-r.r/r0r1r2r3r4r5r6r7r8r9r:r;r<r=r>r?r@rArBrCrDrErFrGrHrIrJrKrLrMrNrOrPrQrRrSrTrUrVrWrXrYrZr[r\r]r^r_r`rarbrcrdrerfrgrhrirjrkrlrmrnrorprqrrrsrtrurvrwrxryrzr{r|r}r~rr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrrrrrrrrr	r
rrr
rrrrrrrrrrrrrrrrrrr r!r"r#r$r%r&r'r(r)r*r+r,r-r.r/r0r1r2r3r4r5r6r7r8r9r:r;r<r=r>r?r@rArBrCrDrErFrGrHrIrJrKrLrMrNrOrPrQrRrSrTrUrVrWrXrYrZr[r\r]r^r_r`rarbrcrdrerfrgrhrirjrkrlrmrnrorprqrrrsrtrurvrwrxryrzr{r|r}r~rr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r	r	r	r	r	r	r	r	r	r		r
	r	r	r
	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r 	r!	r"	r#	r$	r%	r&	r'	r(	r)	r*	r+	r,	r-	r.	r/	r0	r1	r2	r3	r4	r5	r6	r7	r8	r9	r:	r;	r<	r=	r>	r?	r@	rA	rB	rC	rD	rE	rF	rG	rH	rI	rJ	rK	rL	rM	rN	rO	rP	rQ	rR	rS	rT	rU	rV	rW	rX	rY	rZ	r[	r\	r]	r^	r_	r`	ra	rb	rc	rd	re	rf	rg	rh	ri	rj	rk	rl	rm	rn	ro	rp	rq	rr	rs	rt	ru	rv	rw	rx	ry	rz	r{	r|	r}	r~	r	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r�	r
r
r
r
r
r
r
r
r
r	
r

r
r
r
r
r
r
r
r
r
r
r
r
r
r
r
r
r
r
r
r
r
r 
r!
r"
r#
r$
r%
r&
r'
r(
r)
r*
r+
r,
r-
r.
r/
r0
r1
r2
r3
r4
r5
r6
r7
r8
r9
r:
r;
r<
r=
r>
r?
r@
rA
rB
rC
rD
rE
rF
rG
rH
rI
rJ
rK
rL
rM
rN
rO
rP
rQ
rR
rS
rT
rU
rV
rW
rX
rY
rZ
r[
r\
r]
r^
r_
r`
ra
rb
rc
rd
re
rf
rg
rh
ri
rj
rk
rl
rm
rn
ro
rp
rq
rr
rs
rt
ru
rv
rw
rx
ry
rz
r{
r|
r}
r~
r
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
rrrrrrrrrr	r
rrr
rrrrrrrrrrrrrrrrrrr r!r"r#r$r%r&r'r(r)r*r+r,r-r.r/r0r1r2r3r4r5r6r7r8r9r:r;r<r=r>r?r@rArBrCrDrErFrGrHrIrJrKrLrMrNrOrPrQrRrSrTrUrVrWrXrYrZr[r\r]r^r_r`rarbrcrdrerfrgrhrirjrkrlrmrnrorprqrrrsrtrurvrwrxryrzr{r|r}r~rr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrrrrrrrrrr	r
rrr
rrrrrrrrrrrrrrrrrrr r!r"r#r$r%r&r'r(r)r*r+r,r-r.r/r0r1r2r3r4r5r6r7r8r9r:r;r<r=r>r?r@rArBrCrDrErFrGrHrIrJrKrLrMrNrOrPrQrRrSrTrUrVrWrXrYrZr[r\r]r^r_r`rarbrcrdrerfrgrhrirjrkrlrmrnrorprqrrrsrtrurvrwrxryrzr{r|r}r~rr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r
r
r
r
r
r
r
r
r
r	
r

r
r
r

r
r
r
r
r
r
r
r
r
r
r
r
r
r
r
r
r
r
r 
r!
r"
r#
r$
r%
r&
r'
r(
r)
r*
r+
r,
r-
r.
r/
r0
r1
r2
r3
r4
r5
r6
r7
r8
r9
r:
r;
r<
r=
r>
r?
r@
rA
rB
rC
rD
rE
rF
rG
rH
rI
rJ
rK
rL
rM
rN
rO
rP
rQ
rR
rS
rT
rU
rV
rW
rX
rY
rZ
r[
r\
r]
r^
r_
r`
ra
rb
rc
rd
re
rf
rg
rh
ri
rj
rk
rl
rm
rn
ro
rp
rq
rr
rs
rt
ru
rv
rw
rx
ry
rz
r{
r|
r}
r~
r
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
rrrrrrrrrr	r
rrr
rrrrrrrrrrrrrrrrrrr r!r"r#r$r%r&r'r(r)r*r+r,r-r.r/r0r1r2r3r4r5r6r7r8r9r:r;r<r=r>r?r@rArBrCrDrErFrGrHrIrJrKrLrMrNrOrPrQrRrSrTrUrVrWrXrYrZr[r\r]r^r_r`rarbrcrdrerfrgrhrirjrkrlrmrnrorprqrrrsrtrurvrwrxryrzr{r|r}r~rr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrrrrrrrrr	r
rrr
rrrrrrrrrrrrrrrrrrr r!r"r#r$r%r&r'r(r)r*r+r,r-r.r/r0r1r2r3r4r5r6r7r8r9r:r;r<r=r>r?r@rArBrCrDrErFrGrHrIrJrKrLrMrNrOrPrQrRrSrTrUrVrWrXrYrZr[r\r]r^r_r`rarbrcrdrerfrgrhrirjrkrlrmrnrorprqrrrsrtrurvrwrxryrzr{r|r}r~rr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrrrrrrrrr	r
rrr
rrrrrrrrrrrrrrrrrrr r!r"r#r$r%r&r'r(r)r*r+r,r-r.r/r0r1r2r3r4r5r6r7r8r9r:r;r<r=r>r?r@rArBrCrDrErFrGrHrIrJrKrLrMrNrOrPrQrRrSrTrUrVrWrXrYrZr[r\r]r^r_r`rarbrcrdrerfrgrhrirjrkrlrmrnrorprqrrrsrtrurvrwrxryrzr{r|r}r~rr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrrrrrrrrr	r
rrr
rrrrrrrrrrrrrrrrrrr r!r"r#r$r%r&r'r(r)r*r+r,r-r.r/r0r1r2r3r4r5r6r7r8r9r:r;r<r=r>r?r@rArBrCrDrErFrGrHrIrJrKrLrMrNrOrPrQrRrSrTrUrVrWrXrYrZr[r\r]r^r_r`rarbrcrdrerfrgrhrirjrkrlrmrnrorprqrrrsrtrurvrwrxryrzr{r|r}r~rr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrrrrrrrrr	r
rrr
rrrrrrrrrrrrrrrrrrr r!r"r#r$r%r&r'r(r)r*r+r,r-r.r/r0r1r2r3r4r5r6r7r8r9r:r;r<r=r>r?r@rArBrCrDrErFrGrHrIrJrKrLrMrNrOrPrQrRrSrTrUrVrWrXrYrZr[r\r]r^r_r`rarbrcrdrerfrgrhrirjrkrlrmrnrorprqrrrsrtrurvrwrxryrzr{r|r}r~rr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrrrrrrrrr	r
rrr
rrrrrrrrrrrrrrrrrrr r!r"r#r$r%r&r'r(r)r*r+r,r-r.r/r0r1r2r3r4r5r6r7r8r9r:r;r<r=r>r?r@rArBrCrDrErFrGrHrIrJrKrLrMrNrOrPrQrRrSrTrUrVrWrXrYrZr[r\r]r^r_r`rarbrcrdrerfrgrhrirjrkrlrmrnrorprqrrrsrtrurvrwrxryrzr{r|r}r~rr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrrrrrrrrr	r
rrr
rrrrrrrrrrrrrrrrrrr r!r"r#r$r%r&r'r(r)r*r+r,r-r.r/r0r1r2r3r4r5r6r7r8r9r:r;r<r=r>r?r@rArBrCrDrErFrGrHrIrJrKrLrMrNrOrPrQrRrSrTrUrVrWrXrYrZr[r\r]r^r_r`rarbrcrdrerfrgrhrirjrkrlrmrnrorprqrrrsrtrurvrwrxryrzr{r|r}r~rr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r)ZBIG5_TYPICAL_DISTRIBUTION_RATIOZBIG5_TABLE_SIZEZBIG5_CHAR_TO_FREQ_ORDER�rr�/usr/lib/python3.6/big5freq.py�<module>+s�site-packages/pip/_vendor/chardet/__pycache__/langturkishmodel.cpython-36.pyc000064400000055442147511334570023360 0ustar003

���e^+�@sd�Zd�Zeed�d�d�d�d��Zd�S)����%�/�'��4�$�-�5�<��1��.�*�0�E�,�#��3�&�>�A�+�8�����������
��
����@���	�� �9�:�������������������������e���������j��������d���������^�P�]��i���?�������~�}�|�h�I�c�O�U�{�6�z�b�\�y�x�[�g�w�D�v�u�a�t�s�2�Z�r�q�p�o�7�)�(�V�Y�F�;�N�G�R�X�!�M�B�T�S�n�K�=�`��C�m�J�W�f�"�_�Q�l�L�H����k�g�X4���?Tz
ISO-8859-9ZTurkish)Zchar_to_order_mapZprecedence_matrixZtypical_positive_ratioZkeep_english_letterZcharset_nameZlanguageN(rrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr	r
rrr
rrrrrrrrrrrrrrrrrrrrrrrrr r!r"r#r$r%r&r'r(r)r*r+r,r-r.r/r0r1r2r3r4r5rrrrrr6r7r8r9r:r;r<r=r>r?r@rArBrCrDrErFrGrHrIrJrKrLrMrNrOrPrQrRrSrTrUrVrWrXrYrZr[r\r]r^r_r`rarbrcrdrerfrgrhrirjrkrlrmrnrorprqrrrsrtrurvrwrxryrzr{r|r}r~rr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�(r$r r$r$r$rr$r$r$r$r$r$r$r$r rrr$r$rr$r$r�r$r$r$r$r$r�r$rr$r$r rr�r�rrr�r�r�rr�r�rrrrr�r�r�r�r�r�r�r r r�r�rr�r�rr$r r r$r$r�r$r$r$r$r$r$r$r r$rr�r$r$rr$r$r�r$r$r$r$r$r�r$r�r$r$rrr�rr�rr�r�r�r�r�r�rrrrr�r�r�r�r�r�r�r r r�r�r�rr�rr$r$r r$r$r�r$r$r$r$r$r$r$r r$rrr$r$r�r$r$rr r$r$r$r$r�r$r�r$r$rrr�r�r�rr�r�r�r�rrr�rr rr�r�r�rr�r�r�r�r r�r�r�r�r�rr$r$r$r$r$r$r r$r$r$r$r$r$r$r$rr$r$r r�r$r rr r rr$r$r�r�r�r r r r�rr�r�rr�r�rrr�r�r�r�r�r�r�r�r�r�r�r�rr�rrr�rr�r�rr$r$r$r r$r$rr r$r$r$r$r$r$r$rr$r rr�r$r r�rr r$r$r rr�r�r r rr�r�r�r�r�r�r�r�r�r�r�rr�r�r�r�r�r�r�r�r�r�r�r�r r�r r�r�r�rr�rr$r$rr$r$r$r$r$r$r$rr r�r�r r$r�r r$r�r�r r r r$r�r$r�rr rr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r$r$r$r$r$r$r$r$r$r$r$r$r$r$r$r�r$r$r$r�r$r r�r r$r r$r$rr�r�r r$r r�r�rr�r�r�r�r�r�r r�r�rr�r�r�r�r�r�r�r�r�rrrr�r r�r�rr$r$r$r r$r$r r$r$r$r$r r$r$r$r�r$r$r�r�r rr�r�r r$r r r�r�r�r r r r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrr�r�r�rr�rr�r r�r�rr$r$r$r r$r$r$r$r$r$r$r r$r$r$r�r$r r�rr$r rrr$r r$r rr�r�r r r r�r�rr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rr�rr�r�r�r�r�r$r$r$r r$r$r$r$r$r$r$r r$r$r$r�r$r r r�r r$r�r�r r r r r�r�r�r r$r$r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rr�r r�rr�r�r�r$r$r$r$r$r$r$r r r r r$r r$r$r�r$r$rrr r r�r�r r r$r r�r�rr$r�r$rr�r�r�rr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rr�rr�r�r�r�rr$r$r$r r$r$r$r rr r r$r r$r$r�r$r r�r�rrr�rrr rr r�r�r�rr�r$r�r�r�r�r�r�r�r�r�rr�r�r�r�r�r�r�r�r�r�r�r�rr�rr�rr�r�r�r$r$r$r r$r$r r$r r r r$r$r$r$rr$rrr�r$r rrr$r$r r$rr�r�rrr�r�rr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rr�r r�r�rr$r r r$r$r�r$r$r$r$r$r$r$r r rr�r$r$rr$r$r�rr$r$r r$r�r$r�r$r r�r�r�r�r�r�r�r�r�r�rr�r�r�r�r�r�r�r�r�r�r�r�r�rr�r�r�r�r�r�r r r r$r$r�r$r$r$r$r$r$r$r$r$r�r�r$r r�r$r$r�r$r r$r$r$r�r$rr$r r�r�r�r�r�r�r�r�r�r�rr�rr r�rr�r�r�r�r�r�r�r r r�r�rr�r�rr$r$r$rr r$r$rr�r�rr�r�r$r$r r$r�r�r r�r�r r�r r�r�r�r r�r r�r�r$rr�rr�r�r�r r rr�rrr rr r r r�r rrr�r�r�r r�r�r�r�r�rr rr$r$r�r$r$r$r$r$r r$r�r�r�r�r r$r�r r$rr�r r$rr$r�r$r�r r$r�r�r�r�r�r�r�r�rr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r$r$r$rr$r$r r r$r r r�rr r$r�rr rr�rr�r�r�rr�r r r�r�r�rrrr�r�r�r�r�r�r�r�r�r�r�r�r�rr�r�r�r�r�r�r�r�rrr�r�rr�r�r�r$r$r$rr$r$rrr$r$rrr$r$rr�r rr r�r rr�r�rrr rr�r�r�r r rr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r$r$r$rr�r rr$r�r�r r�r�r$r$r�r$r�r�rr�rr r�r�rrr r r�rr�r�rr rrr�rr�rrrrrr�rrrr r rr r�rr�r�r�r�r�r�rr�r�r$r$r$r r$r r$r$r�r r r r$r$r$r�r$r�r�r�r r r�rr rrrr�r�r�rr�r$r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rr�r�r�r$r$r$r$r$r$r rr r r$r$r$r$r r�r r�r�r�r r r�r�r rr$r$r�r�rrrrr�r�rr�r�r�r�r�r�rr�r�r�r�r�r�r�r�r�r�r�r�r�r�rr�rr�r�r�rrr r$r$r�r$r$r$r$r$r$r r r�r r�r r$r r$r r r r r r r rr$r r$r r�r rr r r r rrr r rr r rr r�r�r rrr�r rr�r�rr�r�r�rr r$r$rrrr�rrrr r$r rrr�r�r�r�r�r�r�r�r�r�rr�rr�r�r�r�r�rr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r$r$r$r r r r$r r$r r rr$r$r$r�r rr r�r rr�r�rrrrrr�r�rr rr�r�r�r�r�r�r�r�r�r�r�r�r�r�rr�r�r�r�r�r�r�rr�r r�rr�r�r�r$r$r$r r$r$r$r$r$r r$rr r$r$rr r�r�r�r�r�r�r�r$r rrr�r�r�r�r r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rr�r�r�r�r�r�r$r$r$r r r$r$r rrrrrr$r$r�r$rr�r�rrr�r�r$rr rr�r�r�r�r�r$r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rr�r�r�r�r�r$r$r$r r r$r r r r$r 
rrr$r$r�r$r�r�r�r�rr�r�r$rrr r�r�r�rrr�r�rr�r�r r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrrrr$r$r�r$r$r$r$r$r r r rr r�r rr r rrr�rr r r r r r r r�r�r rr rr rr�rrr$rr rrr r�r�r r�rr�rr�rr�r�r�rr�rr$r$r$rr$r$r$r�rrr�r r r$rr�r$r�r�r�rr�r�r�rr�r�rr�rr�r�rr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r$r$r r�r�r r rr�r�rr�r�r$r$rr$r�r�rrr�r r�r$r�r�r�r r�rrr�rr r�rr r r�r r r r rr�r rrr�r r�r rr r�r�r�r�r�r�r�r�r�r$r$r$rr$r r$r r�r r r rr$r r�r rr r�rr r�r�rr�r r r�r�r�r rr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrr�rr�r�r�r$r$r$r�r$r$rrr r$rr�r$r r$r�r$r�r�r�rr�r�r�rr�rr�r�r�r�r�rr r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r$r$r�r$r$r r$r$r r r�r�r�r�rr r�rr$r�r�r�r$rrr�r$r�r r r�r�r�r�r�rr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r$r$r$rr r rr�r$rrrrr$r$r r$r�r�rr�rr r�r r r�r r r�r rr�r r rrrrr�r rrr�rrrrr rr rr r�rr�rr�r�r�r�r�r�r�r$r$r$r�rrr$r�r�rrr�r�r r r�r$r�r�rrr�rr�r�r�r�r�r r�r�r�r�r$rr�rr�rr�r r�r�rr�rr�rrrr rrr�r r�r�r�r�r�r�r�r�r�r$r$r$r�r r�r r�rrrr�r�r$r$r�r r�r�rr�r�r rrr�rr�rr�rr�r�r r�rr r�r r�r rrr�rr�r rrr�r rrr�rr�r�r�rrr�r�r�r�r$r r$r�rr�r�r�r�r�r�r�r�rr r�rr�r�rr�r�rr�r�r�r�r�r r�r�r�r�r�rrr�r�rr�rr�r�rr�r�r�r rr�rr�r r�r�r�r�r�r�r�r�r�r�r�r$r$r$r�r�r r$r�r�rr�rr�r r$r r$r�r�rr$r�r rr�r�r�r�r r�rr�r�r rr�r�rrr�r rr�r�rr�r�rrr�rrr r�rr�r�r�r�rr�r�r�r�r$r r r�r�rrr�r�r�r�r�r�r$rrrr�r�r�r�r�rr�r�r�r�r�r r�rr�r�rr�r�r�r�r�r�rr�r�r�r�r�r�r�rr�r�r�rr�rr�r�r�r�r�r�r�r�r�r�r�r�r$r$r�r r$r r rr r rrr r�rr$r r r r�r�r r r�r�r�rr rr$r�r rrr�rrrr�rr r r rrr r�r�r�r�rr�rrr�r�r�r�r�r�r�r�rrr r$r�r$r$r$r r r r rr�rr�rr�rr r r�r�r r rr$rrr rr�r�rrr r�rrr�r�rr r�r rrr r�r�rr�r�r�rr�rr�rr�r�r�r�r$r$r r�r�r$rr�r�r�r�r�r�r$r rr r�r�rr�r�r r�r�r�r�r�r r�rr�r�r rrr�r�rr�rr r�r�rrr�r�r rrrrr�r r�r�r�r�r�r�r�r�r�r$r$r r�r�rr�r�r�r�rr�r�r$r$r r r�r�rr�r�r r�rr�r�r�r r�rr�r�r�rrr�r�r r�r rr�r�rrr rr r�r rr rrrr�r�rrr�r�r�r�r$r$r r�r�r r r�r�r�rrr�r r rr$rr�rr�rr r�r�r�r�r�rr�rr�r�rrr�r�r�r�r�rr�r�rr�r�r�rrr�rr�rr�r�r�r�r�r�r�r�r�r�r�r$r$r$r r�r�r�rr�r�rr�r�r r$rr r�r�rr�r�r r�r�r�rr�r r�r r�r�rrr r rr r�r rrr�r�rrr�rrrrr rrr�r�r�r�r�r�r�r�r�r$r$r$r�r rr rr�r�rrr�r$r$rr r�r�rr�r�r r�r r�rrr r�r�r�r�r�rrrrr r�rrr�rrrrr�r�r�rrrr�rr�r�r�rr�r�r�r�r�r$r$r$r�r r r$r r�r�rr�r�r r$rr�r�r�r�r�r�r r�r r�r�r�r r�r�r�r�rrr�r�r�rr�r�rr�rrr�rr�rrrr�rr�r�r�r�r�r�r�r�r�r�r�r$r r$r�r�r�r�r�r�r�rr�r�r r r r r�r�rr�r�r r�r�r�r�r�r r�rr�r�r�r rrr�rr�r rrr�r�rrr rr�r r�r r�rr�r�r�r r�r�r�r�r�r�r�r�r r r�r rrrrr r r�r�rr�rr�r�rr$r�r�r�r�rr�r�r rr�r�r�rr�rr�r�r�r�r�r rr�rr�r�r�r�r�r�r�r�r�r r�r�r�r�r�r�r�r�r r�r�r r$r�r r$rr r r�r r�r�r r�r rrrr rr�r�rr rrr rr�rr�r r�rr�rrr�r�r r rr rrr r�r�rr�r�r�r�r�r�r�r�r�r�r�r�r$r$r$r�r rr r�r�r�rr�r�r$r r�rr�r�rr�r�r r�r�r�rr rr�rr�r�r�r�r�rr�rr�r�rr�r�r�r�rr�rr�rrrr�rr�r�r�r�r�r�r�r�r�r�r�r�r r r�r r rrr�rrrrrr�r�rr rrrr�rr�r�r�rrrrr�r�r rr�rrrr�rrr rr rrr r�rrr rr�r r�r�r�r�r�r�r�r�r$r r r�r�r r�r�r�r�r�r�r�r r r�r r�r�rr�r�r r�r�r�r�r�r r�r�r�r�r rr�r�r�r�r�rr�r�r�r�r�r�r�rr�r�r�rr�rr�r�r�r�r�r�r�r�r�r�r�r�r$r r�r r r�rrr�rr�r�rr�r�r�rr�rr�r�r�r�r�rr�r�r�r�r r�rr�rr�rrr�r�rr r�rr�rrr�r�rr�rr�r r�r�r�r�r�r�r�r�r r r r�rrr�r�r�rr�r�r�rr r�rr�r�rr�r�rr�r�r�r�rr r�rr�r�r�rr�r�r�rr�r�rr�r�r�r�r�r�rr�rr�r r�r�r�r�r�r�r�r�r�r�r�r r r r rr�rrrr�r�r�r�rr r�r�rr�r�r�rr�r�rr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r r�r�r�r�r�r�r�rrr r�rr�r�r�rr�rr�r�r�rr�r�rr�r�r�r�r�r�r�rr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rr�r�r�r�r�r r�r�r�r�r�rr�r�rr r r�r rr rrr r r�r�r�r�rr�r�rrr�r�r 
r�r�r�r�rr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rr�r�r�r r r r�r�r�rr�r�r�r�r�r�r r rrr�r�r�r�r�rr�r�r�r�r�r�r�r�r�r�r�r�r�rr�r�r�rr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrr�r�r�rr�r�r�rr�r�r�r�r�r�r�rrr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r r r r�rr�rr�r�r�r�r�r�rrr�r�r�r�r�r�r�rr�rr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rr�r�r�r�rr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rr�r�r�r�r�r�r�r�r�r�r r r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�)ZLatin5_TurkishCharToOrderMapZTurkishLangModelZLatin5TurkishModel�r�r��&/usr/lib/python3.6/langturkishmodel.py�<module>%s*site-packages/pip/_vendor/chardet/__pycache__/euckrfreq.cpython-36.pyc000064400000056746147511334570022003 0ustar003

���e�4�0	@sdZdZ�	d2Z�	d1S(3	g@�0	�
��x�t�����H�a�������+��W�u��h��]�������������v�w�������m�F�!�p�������������x���/������������9�����t���-�y��K������������O��n������������0����<�4�{����������i���r����������������������X�X���������Y���&��P�������������^��������������9�������������Q���"��t������]�{�7����{��;��u���z�/��|�������7���.��������������{����#�|�}��~��t�8��_�	�
���
�!����_���������*��u��`�"���|�������a�������?��R�!� �/���!�"�=���#��$�%�&�'�(�)�,���'�b�$�*��+��-���,�������&�U����#���-�.�'���f�/�s��0�������� ��9�e�[�1����Z���:�����2�3���G����y�4�����5�6�7�,�w����s�8����9�:���~�;��<�;�}�=�>�?�o�)����@��A�B����2�Y�C�D��<�E�F�G�H�I�%�J�K�L�M�N�O��`�>�P��=�Q��R��S�T�;�������U��V�W�X�4�Y���Z�[��\�]���^���_�"�P���`�;��~�H�a��v��z�?����b���<�c�d�d�e�f�c�0��d��g�y���h�i�s�0�j�=�k��l�����<�b���U������I�m�n�o��p�q�r�s�t�u�������6�v�w��*��]�x�y���z�Z��-�:��b�
�{�|��&�'����5����>�}�~�w��g����6��%���(��v����w������E������������f���V��7�����B��N��[��'���������S�������e���x������������������������?�����q��f��(�)����~���\���������)��������������$����������l�����~�����C�����@��������2����K��z�V��������Q�r�f������ �h�+�3��1�������g�(����z��������������������A��<�j���M�g��2�������������������V�h����J�����0����b�������������Y���������a�!�*����������K�D�8��R��B��@����������y����������X�:����#������i���G����k����=��������!������J�����=����}����j�����������������������E����������j�O�4�����������������	�v�]��C���������o�����
����l�c�A������������T����k����������3�*��q�����>��������+������;��p��x�������������	��
����
������l�������)������m�8���D��������������7�L�B���D�������t���� �!�
�"�#�$�%�R�&�'�(�)�*��+�,��,�-�.�m���
�^�/�c��E�����a�m�0�E�1�2�3����4�T�����5������6�7��n���o�8�9�F����:�G�;�<�=�>�?�@�A�B�C�D�E�$�F�G�����%���p�H�I�J�K�L�����������M�N�O�P�Q�R���S�/�T��U�����V��I�H��������W��X��q�Y�Z�[�\���r��s�]�^�_�`��v�L�a����.��b�F�>�����j�c�Z����B�6��`�d�e����|��f���5�g��h��i�H��j����k��l���1�m�n�o�p�q�r�C�s�t�u�v�w���x�y�z�{���|�}�~���������O��������������E���q���I��\��-���S����
��e�����l��M���Q��P���^��
�-�F���������������������
�.���t�����J���������g�������������������������������������u�����9�	������	���$���������5�%��k�L����������������A��������������R��u���������)�����:��������������"�$�v�����c�Z�����������*�W��K���L���+���������������	�B�����?���������������M�[�5�������n����������C���'���������������
�������	��������������F�T�/����������8��u������K���(�M���i��������������T��?���������e� �(��������%�0����O��	�8�	�	�	�	�	�����	�	�	����#����		����V�
���P�� �M�W���
	��	��|�	�
	�S������������	�	�����	�	�	�	�	�	���	�	��9�D���������	�3����	�	�	�{�	����	�	�	�� 	�������!	�k�������N���Y���"	�#	�$	�%	�&	�'	�(	�]�)	�*	�+	�,	�-	�.	��/	�����>�����1	������\��2	��3	�����4	�5	�6	�w�7	��8	��3�����9	�:	�L�����;	���<	���������������=	��J�>	�?	�@	���A	���B	�C	�D	���E	�����F	�G	�H	�I	������������J	�K	����L	�M	�N	��I�O	�P	���Q	�R	�S	��T	���U	�V	�W	�X	���Y	�Z	�[	�\	�]	����^	�_	�`	�a	����b	�c	�d	�e	����f	�g	�U���x� �h	����i	�j	�����4�&�������!�����S�y������"�
���#�k	���l	�m	�n	�����p�Q��.�o	�U��O���p	���q	�r	�s	�x�t	��u	�v	�w	���$�x	��y	���`�z	�{	�|	�}	���~	�	�	�	�	�	�	�	���C�������%�y���o�	���	�	�	�	�	�	��	�	�	�	����	����	�	��@����	�G�	������2��������	�	�	�	�N�	���������	�	�	�	�	�r�	���z�n�����P�	�#�	�&�Q�	���m�	�c����	�'�	�w�	�	���(�l�@�������)���*�	�	�	�	���	�	�	�	���	����	�	�	�H��	�	��	���	�	�	���	�	�r�	���A�	�������	�}�,���:�����
��I�	�N��1�	�W������������	�+��	��	�1���b����	�q��	�,��	���e����_��d��	��	��	��	��	��	����2�����	���	��	��	��	���	���	��	��-��	�����	��	��	��	��	��	����	��	��	��	��	��	���	����	��	����7��	����	���	��	��	��	��	��	��	������������,�G�������	��^��	��	�.��	��g���	����	�	�	���	�����_��	�	�	���h�h��{�	�|����3�	�	�	�����	�
�
�\�����
�
�
�
�
�
�
��	
�

�
�
����
�j�
�i��4��
�
�
�
�
�
���
�
�
�
��6�d���/�
�
����o�
��R������	�&�
�k�n�z��������
��
����X��d��S��}�
�}����~������� 
�!
�N�"
����������#
���s�$
�%
�&
�f�D��1�'
���(
�@�)
�^�����*
���+
�,
�-
�.
�/
�J�+�0
�1
�2
���T��3
�4
�5
�6
���7
�A�.�����8
�9
��"�:
�;
���<
�=
�>
���0�?
��@
��A
�B
��C
�D
�E
�F
�G
���_�[�H
�I
���`�a�J
�����K
�L
�M
�N
��O
��i����P
�Q
�R
N(0	rrrrrrrr	r
rrr
rrrrrrrrrrrrrrrrrrr r!r"r#r$r%r&r'r(r)r*r+r,r-r.r/r0r1r2r3r4r5r6r7r8r9r:r;r<r=r>r?r@rArBrCrDrErFrGrHrIrJrKrLrMrNrOrPrQrRrSrTrUrVrWrXrYrZr[r\r]r^r_r`rarbrcrdrerfrgrhrirjrkrlrmrnrorprqrrrsrtrurvrwrxryrzr{r|r}r~rr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrrrrrrrrr	r
rrr
rrrrrrrrrrrrrrrrrrr r!r"r#r$r%r&r'r(r)r*r+r,r-r.r/r0r1r2r3r4r5r6r7r8r9r:r;r<r=r>r?r@rArBrCrDrErFrGrHrIrJrKrLrMrNrOrPrQrRrSrTrUrVrWrXrYrZr[r\r]r^r_r`rarbrcrdrerfrgrhrirjrkrlrmrnrorprqrrrsrtrurvrwrxryrzr{r|r}r~rr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrrrrrrrrr	r
rrr
rrrrrrrrrrrrrrrrrrr r!r"r#r$r%r&r'r(r)r*r+r,r-r.r/r0r1r2r3r4r5r6r7r8r9r:r;r<r=r>r?r@rArBrCrDrErFrGrHrIrJrKrLrMrNrOrPrQrRrSrTrUrVrWrXrYrZr[r\r]r^r_r`rarbrcrdrerfrgrhrirjrkrlrmrnrorprqrrrsrtrurvrwrxryrzr{r|r}r~rr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrrrrrrrrr	r
rrr
rrrrrrrrrrrrrrrrrrr r!r"r#r$r%r&r'r(r)r*r+r,r-r.r/r0r1r2r3r4r5r6r7r8r9r:r;r<r=r>r?r@rArBrCrDrErFrGrHrIrJrKrLrMrNrOrPrQrRrSrTrUrVrWrXrYrZr[r\r]r^r_r`rarbrcrdrerfrgrhrirjrkrlrmrnrorprqrrrsrtrurvrwrxryrzr{r|r}r~rr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrrrrrrrrr	r
rrr
rrrrrrrrrrrrrrrrrrr r!r"r#r$r%r&r'r(r)r*r+r,r-r.r/r0r1r2r3r4r5r6r7r8r9r:r;r<r=r>r?r@rArBrCrDrErFrGrHrIrJrKrLrMrNrOrPrQrRrSrTrUrVrWrXrYrZr[r\r]r^r_r`rarbrcrdrerfrgrhrirjrkrlrmrnrorprqrrrsrtrurvrwrxryrzr{r|r}r~rr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrrrrrrrrr	r
rrr
rrrrrrrrrrrrrrrrrrr r!r"r#r$r%r&r'r(r)r*r+r,r-r.r/r0r1r2r3r4r5r6r7r8r9r:r;r<r=r>r?r@rArBrCrDrErFrGrHrIrJrKrLrMrNrOrPrQrRrSrTrUrVrWrXrYrZr[r\r]r^r_r`rarbrcrdrerfrgrhrirjrkrlrmrnrorprqrrrsrtrurvrwrxryrzr{r|r}r~rr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrrrrrrrrr	r
rrr
rrrrrrrrrrrrrrrrrrr r!r"r#r$r%r&r'r(r)r*r+r,r-r.r/r0r1r2r3r4r5r6r7r8r9r:r;r<r=r>r?r@rArBrCrDrErFrGrHrIrJrKrLrMrNrOrPrQrRrSrTrUrVrWrXrYrZr[r\r]r^r_r`rarbrcrdrerfrgrhrirjrkrlrmrnrorprqrrrsrtrurvrwrxrryrzr{r|r}r~rr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrrrrrrrrr	r
rrr
rrrrrrrrrrrrrrrrrrr r!r"r#r$r%r&r'r(r)r*r+r,r-r.r/r0r1r2r3r4r5r6r7r8r9r:r;r<r=r>r?r@rArBrCrDrErFrGrHrIrJrKrLrMrNrOrPrQrRrSrTrUrVrWrXrYrZr[r\r]r^r_r`rarbrcrdrerfrgrhrirjrkrlrmrnrorprqrrrsrtrurvrwrxryrzr{r|r}r~rr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrrrrrrrrr	r
rrr
rrrrrrrrrrrrrrrrrrr r!r"r#r$r%r&r'r(r)r*r+r,r-r.r/r0r1r2r3r4r5r6r7r8r9r:r;r<r=r>r?r@rArBrCrDrErFrGrHrIrJrKrLrMrNrOrPrQrRrSrTrUrVrWrXrYrZr[r\r]r^r_r`rarbrcrdrerfrgrhrirjrkrlrmrnrorprqrrrsrtrurvrwrxryrzr{r|r}r~rr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r	r	r	r	r	r	r	r	r	r		r
	r	r	r
	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r 	r!	r"	r#	r$	r%	r&	r'	r(	r)	r*	r+	r,	r-	r.	r/	r0	)Z EUCKR_TYPICAL_DISTRIBUTION_RATIOZEUCKR_TABLE_SIZEZEUCKR_CHAR_TO_FREQ_ORDER�r1	r1	�/usr/lib/python3.6/euckrfreq.py�<module>)s(site-packages/pip/_vendor/chardet/__pycache__/sbcsgroupprober.cpython-36.pyc000064400000002776147511334570023227 0ustar003

���e�
�@s�ddlmZddlmZddlmZmZmZmZm	Z	m
Z
ddlmZm
Z
ddlmZmZddlmZddlmZddlmZdd	lmZGd
d�de�ZdS)
�)�CharSetGroupProber)�SingleByteCharSetProber)�Win1251CyrillicModel�
Koi8rModel�Latin5CyrillicModel�MacCyrillicModel�Ibm866Model�Ibm855Model)�Latin7GreekModel�Win1253GreekModel)�Latin5BulgarianModel�Win1251BulgarianModel)�TIS620ThaiModel)�Win1255HebrewModel)�HebrewProber)�Latin5TurkishModelcseZdZ�fdd�Z�ZS)�SBCSGroupProberc
s�tt|�j�tt�tt�tt�tt�tt�tt	�tt
�tt�tt�tt
�tt�tt�g|_t�}ttd|�}ttd|�}|j||�|jj|||g�|j�dS)NFT)�superr�__init__rrrrrrr	r
rrr
rrZprobersrrZset_model_probers�extend�reset)�selfZ
hebrew_proberZlogical_hebrew_proberZvisual_hebrew_prober)�	__class__��%/usr/lib/python3.6/sbcsgroupprober.pyr,s,
zSBCSGroupProber.__init__)�__name__�
__module__�__qualname__r�
__classcell__rr)rrr+srN)ZcharsetgroupproberrZsbcharsetproberrZlangcyrillicmodelrrrrrr	Zlanggreekmodelr
rZlangbulgarianmodelrr
Z
langthaimodelrZlanghebrewmodelrZhebrewproberrZlangturkishmodelrrrrrr�<module>s site-packages/pip/_vendor/chardet/__pycache__/langthaimodel.cpython-36.pyc000064400000055420147511334570022610 0ustar003

���e,�@sd�Zd�Zeed�d�d�d�d��Zd�S)�������j�k�d����e�^���l�m�n�o����Y�_�p�q������@�H�I�r�J�s�t�f�Q���u�Z�g�N�R�`���[�O�T�h�i�a�b�\�������������X���������������v���������c�U�S��������������������������������K���4�"�3�w�/�:�9�1�5�7�+���,��0����'�>��6�-�	���=�����*�.���L��B�?��
��$��
�(�� �#�V�������������)��!��2�%���C�M�&�]���D�8�;�A�E�<�F�P�G�W�����g��@��?FzTIS-620ZThai)Zchar_to_order_mapZprecedence_matrixZtypical_positive_ratioZkeep_english_letterZcharset_nameZlanguageN(rrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr	r
rrr
rrrrrrrrrrrrrrrrrrrrrrrrr r!r"r#r$r%r&r'r(r)r*r+r,r-r.r/r0r1r2r3r4r5r6r7r8rrrrrr9r:r;r<r=r>r?r@rArBrCrDrErFrGrHrIrJrKrLrMrNrOrPrQrRrSrTrUrVrWrXrYrZr[r\r]r^r_r`rarbrcrdrerfrgrhrirjrkrlrmrnrorprqrrrsrtrurvrwrxryrzr{r|r}r~rr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rr(r�r�rrrrrrrrr�r�rrrrr�rrrrr�rrrrrrrrrrrrrrrrr�r�rrrrrrr�rrrrrrrrr�rrrrr�r�r�r�rrr�rrrrr|rrrrr�r�r|rrrrrrrrr�r|r�r|r�r�rrr|r�r|r|rrr�rrrrr|rrr�r�rrrrr�rrrrr�rrrrrrrrrrrrrrrrrrr�rrr|rrr�r|r|r|rrr�r|rrr�r�r�r�r�r�r�r|rrr�r�rrr|r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrrrrrr|rrrrrrrrrrrrrrrrrrrrrrr|r|r|r|r|r|r|rrrrr|rrr|rrrrr|r|r|rrr�r|rrr�rrrrr|r|r�r|rrrrr�r|r�r�rrr�r�r�r�r�r�r�r�r�r�r�r�r�r�rrrrr|r|rrrrrrrrr�r|rrrrrrrrrrr|r|r|r|rrrrr|r|rrrrr|r|rrr|rrr|r|rrrrr�r|rrr�r|r|rrrrr�r�r|r�r�r�rrr�r|r�r�r�r�r�r�r�r�r�r�r�r�r�rrrrrrrrrrrrr|r|rrrrrrrrr|rrr|r|rrrrr|r|rrr|r|r|r|r�r�rrr�r|r�r�rrr|r�r�r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrrrrrr|rrr|rrrrr|r|rrr|rrrrr|rrr�r�r|rrr|r|r|rrr|r|r|r|r|r�r|r�r|r|r�r�rrrrr|r�r�r�r|r|r�r�rrr�r�r�r�r�r�r�r�r�r�r|rrr�r�r|r�r�rrrrr|rrrrr|r�r�rrrrr�rrrrr�r|r|rrr�r|r|r�r�r�r�r|r|r|r�r|r|r�r�r�r|r�r�r|r�r�r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrrrr|rrrrr|r�r�rrrrr�r|rrr�r|r�r|r|r|r|r�r|r�r�r|r|r|r�r|r|r�r�r�r|r�r�r|r�r�r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrrrr|rrr|rrr|r�r|r|r�rrr|r�rrr|r�r|rrr|r|rrr�r|rrr|r|r�r|r|r|r|r�r|r|r�r�r�r�r|r�r�r|r�r�r�r�r�r�r�rrr�r�r�r�r�r�r�r�r�r�r�r�r�rrrrr|rrrrr|rrr|r|r|rrr|r|rrr|r|r�r|rrr|r|rrr�rrr|r|r|rrr|r|r|rrrrr|r�rrr�r�r�r�r�r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r|r�r�r�r�r�rrr�rrrrrrrrrrr�r�rrr�r|r|rrrrrrrrrrr�r�r�r�r�rrr�r�r�r�r|r�r�r�r�r�r�r�r�r�r�r|rrr�r�r�rrr�r|r�r�r�r�r�rrr�r�r�r�r�r�r�r�r|r�rrrrrrrrr�r�r|rrr�r�rrr�rrrrr|rrrrrrrrrrr�r�rrrrrrr�r�r�rrrrr�r�rrr�r�r�r�r|r�r�r|r�r�rrr�r�r�r�r�r|rrr�r�r�r�r�r�r�r�r�r�r�rrrrrrrrr|rrrrrrrrrrrrrrr�r|r�rrrrr|r|r�r|r|r|rrr�r�r|r�r|r�r|r�r|r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrr�r|r�r|rrrrrrr�r|r�r|r|r�r|r�rrr|r|r�r|r�r�r�r|r|r�r�r|r�r|r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrrrrrrrr|r�rrrrr�r�rrr�r|rrr�r�rrr|r�r�r|r�r|r|rrr|r�r�r�r�r�r|rrr�r�r�rrr�r|r�r|r�rrr�r�r�r�r�rrr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrrrr�r�rrr|rrrrrrr�rrr|r�rrr|r�rrr|r|r|r|r�rrrrr�r|r�rrr�r|rrr�r|r�r�rrr|r|r|r�r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r|rrrrr|rrr|rrrrr|rrr|rrr|rrrrr|r�r�rrr|r|r|r�r|r|r|r�r|r|r�r|r�r�r|r|r|rrr�r�rrr�r�r�r�r�r�r�r�r|r�r�r|r�r�r�r�r�r�r�r�r�r�r�r�r�rrrrrrrrr|rrr|r|r�r�rrr|rrr|rrr|r�rrr|r|r�r|r�r|r|r|r�r|r|r|r|r�rrr|r�r|r|r�r�r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrrrrrrrrrr|rrr�r|rrrrr|r|rrr�r�r�r|r�rrrrr|r|rrr�r�r�rrr�r�r�r�rrr�r�rrrrr�r|r�r|r�r�r�rrr|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrrrrrr|rrr|rrrrr�r�rrr�r�r|r�r|r�r�rrr�r�r�r|rrr�r�r�r�r�r�r�r�rrr�r�r|r|r|r|r�r�r�r�r�r|r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrr|r|r�r�r|r�rrrrr|rrr|r|rrr|r|rrr�r|r|r�r|r�rrr|r�r|r|r|r|r|r�rrr|r�r|r|r|r�r�r�r�r�r�r�r�r�r�r�r�r|r�r�r�r�r�r�r�r�r�r�r�r�r�rrrrrrrrrrrrrrrrr�rrrrr�r|r�r�rrr|r�r�rrr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrr|r�r�r�r�r�r�r�r�r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrr�r|r|r|rrr�r�r�rrr�rrr|r�rrr|r|rrrrrrrrrrr�r�r|r|r|r�r|r|r�r|r�r|rrr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrr�r|rrr�rrrrr|rrrrr�rrrrr�rrr|r|rrr|rrrrrrr�r�r|r|rrr�r�r�r�rrr�r�rrr�r�r�r|r|r�r�rrr�r�r|r|r|rrr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrr|rrrrr|r�rrrrr|r|rrr�rrr|r�rrr|r�r�r|r|r�r|rrr|r�r�rrr�r�r�r�rrr�r�r|rrr�rrr�r�rrr�r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrr�rrr|r|r|r�r|r�r�rr
r�r�rrr�rrr�r�r|r�r�r�r�r|r�r�r�r�r|r�r�r�r�r|r�r�r�rrr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrr�r�r�r�r�r�rrrrrrrrr|r|r|r|r|r�rrr�r�r�r|r�r�r�r|r�r|r�rrr|r�r�rrr�r�r�r�r�rrr�r�r|rrr�r�r�rrr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r|rrr�rrrrr�r|r�r�r�r�r�r�r�rrr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r|rrr�rrr�r�r�r|r�r�r|r�rrrrr|rrrrrrr|rrr�r�r|r|r|r�r�r�r|r|r�r�r�r�r�r�r�rrr�r�r�r�r|r�r�r�r�r�r�r�r�r�r|r�r�r�r�r�r�r�r�r�r�r�r�rrr�r|r�r�r�r�r�r�r�r�r�r�r�r|rrr�rrrrr�r�r�r�rrr�r�r�r�r�r�r�rrr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrrrr�r|rrr�r|rrr�r�rrr�r|r|r�r�r|r�r�r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrr�r�r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrrrrrrrr|r�r�r�r�r�rrr�r|r|r|r|r|r|r�r�r�r�r�rrr�r�r�rrr�r�r�r�r�r�r�r|r�r�rrr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r|r�r�rrr�r|r|r�rrrrr|rrrrr�r�r�r�r|r|r�r|r�rrrrr�r�r�rrr|r�r�r�r�r|r�r�r�r�r�r�r�r�r|r�r�r�rrr�r�r|r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrr�r�r�r�r�r�rrr�r�rrr�rrr�r�r�r�r�rrr|r�r�r�rrr�r�r�r�r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r|r�r�r�r�r�r�r�r�r�rrrrr�rrr|r�rrrrr�r|r|r�r�r|r�r�r�r|r�r�r�r�r�rrr�r�r�rrr�r�r�r�rrr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrr�r�r|r�rrrrrrr|r|r�r�r�r�r�rrr�r�r�r|r|r�r�r�r�rrr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrr�r|rrr�r|r�r�r|r�r�rrr�r�r�r|r�r�r�r�r�rrr�r�rrr�r�r�r|r|r�r�r�r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrr�r�rrr�r|r�r�r|r|r�r�r|r�r�r�r�rrr�r|r�r�r�r�r|r�rrr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrr�r�r�r|r|r�r�r�r|r�r|r�r�r�r�r�r�r�r�r|r�r�r�r�r�r�r�r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r|r�r�rrr�r�r�r�r�r�r�r�rrr|r�r�r�r�r�r�r|r�r�r�r�r�r�r�r�r�r�r�rrr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrrrrrrrr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r|rrr|r|r�r�r�r�r�r�r�r�r|rrr|r�r|r|rrr�r�r�r|rrr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrrrr|r|r�r�r�r�r�r�r|r�r|r�r�r�r�r�r�r�r�r�r�r|r�r�r�r�r�r�r�r�r�r�r�r|r�r�r�r�rrr�r�r�r�r�r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrrrr�r�r�r�r�r�r�r�r�r�r�r|r�r�r�r�r�r�r�r�rrr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrr�r�r�r�r�r�rrr�r�r�r�r�r�r�rrr�r�r�r�r�r�r�r�r�r�rrrrr�r�r�r�r|rrr�r�r|r�r�r�r�r�r�r|r�r�r�r�r�r|r�r�r�r|r�r�r�r�r|r�r�rrr�r�r�r�rrr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrr�r�r�r�r�r�r�r�rrr�r�r�rrr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrrrr|r�r�r�r�r�r�r�r|r�r�r�r�r�r|r�r�r�r�r�r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r|r�r�r�r�rrr�r�r�r�r�r|r�r�r�r�r�r�r�r�r|r�r�r�r�r�rrrrr�r�r�r�r|r�r�r�r|r�r�r�r�r�r�r�r�r�r�r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r|r|r|r�r�r�r�r�r�r�r�r�r�rrr�r�r�r�r�r�r�r�r�rrr�r�r�r�r�r�r�r�r|r�r�r�r�r�rrr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r|rrr�r�r�r�r�r�r�r|r�r�r�r�r�r�r�r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r|r�r�r|r�r�r�r�r�r|r�r�r�r�r�r�r�r�r�r�r�r�r�r|r�r�r�r�r�r�r�r�r�r�r�r�rrr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r|r�rrr�r�r�r�r�r�r�r�r�r�r�r�r�rrr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r
�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r|r�r�r�r�r�r�r|r�r�r|r�r�r�r�r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r|r�r�r�r�r�r�r�r�r�r�r�r�rrr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r|r�r�r�r�r�r�r�r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r|r�r�r�r�r|r�r�r�r�r�r�r�r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrr�r�r�r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r|r�r�r�r�r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�)ZTIS620CharToOrderMapZ
ThaiLangModelZTIS620ThaiModel�r�r��#/usr/lib/python3.6/langthaimodel.py�<module>%s*
site-packages/pip/_vendor/chardet/__pycache__/langhungarianmodel.cpython-36.pyc000064400000060307147511334570023637 0ustar003

���e01�@s4d�Zd�Zd�Zeed�d�d�d�d��Zeed�d�d�d�d��Zd�S)�������(�6�-� �2�1�&�'�5�$�)�"�#�/�.�G�+�!�%�9�0�@�D�7�4���������	����
����C�
�����A�>����������������������������������������������K�����������������O���������������������������������3�Q���N���������,�������=�����������:���B�;�������<�E�?�������R��J���F�P���H����S�M�T��L�U�����I�*������8���V�W�g��(��P�?Tz
ISO-8859-2Z	Hungarian)Zchar_to_order_mapZprecedence_matrixZtypical_positive_ratioZkeep_english_letterZcharset_nameZlanguagezwindows-1250N(rrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr	r
rrr
rrrrrrrrrrrrrrrrrrrrrrrrr r!r"r#r$r%r&r'r(r)r*r+r,r-r.r/r0r1r2r3r4r5r6r7r8rrrrrr9r:r;r<r=r>r?r@rArBrCrDrErFrGrHrIrJrKrLrMrNrOrPrQrRrSrTrUrVrWrXrYrZr[r\r]r^r_r`rarbrcrdrerfrgrhrirjrkrlrmrnrorprqrrrsrtrurvrwrxryrzr{r|r}r~rr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rr(rrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr	r
rrr
rrrrrrrr�rrrrrrrrrrrrrrrrr r!r"r#r$r%r&r'r(r)r*r+r,r-r.r/r0r1r2r3r4r5r6r7r8rrrrrr;r<r=r>r?r@rArBrCrDrErFrGrHrIrJrKrLrMrNr}rOr�rPrQrRrSrTrUrVrWrXrYrZr[r\r]r^r_r�rarbrcrdrerfrgrhr{rjrkrlrmrnrorprqrrrsrtrurvrwrxryrzr�r|r�r~rr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r`r�rr�r�r�r�r�r�rir�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rr(r�r2r2r2r2r2r2r2r2r2r2r2r2r2r2r2r#r2r2r2r2r2r2r2r2r2r2r2r2r2r2r2r2r2r2r2r2r2r2r2r2r2rr2r2r2r2r2r2r2r2rrr2r2r#r#rrrrrr#rr2rrr2r2r2r2r2rr2r2r2r2r2r2r#rr2r2r2r2rr2r2r#r#r2r2r�r#r#r#r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rr�r2rr#r2r2r2r2r2rr2r2r2r2r2r#r#rr2r2r2r2r2r2r2r#r#r2rr�r#r#r#r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r#r�r2r2r2r2r2r2r2r2r2r2r2r#r#rr2r2r2r#r2r2r2r2r2r#r2r2rrr�r2rr2r�r�r�r�r�r�r�r�r�r�r2r�r�r�r�r�r�r�r�r�r�r�r�r�rr�r�r�r�r�r�r�r2r2r2r2r2r2rr2r2r2rr2r2rr2r2r2r2r2rr2r2rrr2rr2rr�r2rrr�r�r�r�r�r�r�r�r�r�rr�r�r�r�r�r�r�r�r�r�r�r�r�r#r�r�r�r�r�r#r�r2r2r2r2r2r2rr2r2r2r2r2rr2r2r2r#rr2rrr2r#rr2r2rrr�r2r2r2r�r�r�r�r�r�r�r�r�r�rr�r�r�r�r�r�r�r�r�r�r�r�r�r#r�r�r�r�r�r�r�r2r2r2r2r2r2r2r2r2r2rrr2r2r2r2r2r2rr2r2r2r2rr2r2r2r2r�rr2rr�r�r�r#r#r�r�r�r�r�r2r�r�r�r�r�r�r�r�r�r�r�r�r�r#r�r�r�r�r�r�r�r2r2r2r2r2r2r2r2r2r2r2r#r#r#r2r2rr#r2rrr2rr#r2rrr#r�r2r2r#r�r�r�r�r�r�r�r�r�r�r#r�r�r�r�r�r�r�r�r�r�r�r�r�r#r�r�r�r�r�r�r�r2rrr2r2r2r2r2r#rr2r2r2r2r#rr#r2r2r2r2rrr2r#r#r2rr�r#r#r#r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r#r�r�r�r�r�r�r�r�r�r�r�r�r�r�r#r�r2r2r2r2r2r2r2r2rrr2r2r2r2r2rr#r2r2r2r2r2rrr#r2r2r2r�r#r#rr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r#r�r�r�r�r�r#r�r2r2r2r2r2r2r2r2r2r2r2r2r2r2r2r2rr2r2r2rr2r2rr2r2r2rr�r2rr2r�r�r�r�r�r�r�r�r�r�rr�r�r�r�r�r�r�r�r�r�r�r�r�rr�r�r�r�r�r#r�r2r2r2r2r2r2rr2r2r2rr2rr2r2r2r#r2rrrr2r#r#r2r2r#r#r�r2r2rr�r�r�r�r�r�r�r�r�r�rr�r�r�r�r�r�r�r�r�r�r�r�r�r#r�r�r�r�r�r�r�r2r2r2r2r2r2r2rr2r2r2rr2rr2r2r2rr2r2r2r2r2r#rr2rrr�rrrr�r�r�r�r�r�r�r�r�r�rr�r�r�r�r�r�r�r�r�r�r�r�r�r#r�r�r�r�r�r�r�r2r2r2rrrr2r#r2r2rrr#r2r2r2r#r#r2r#rr2rr2rrrr#r�rrrr�r�r�r�r�r�r�r�r�r�r#r�r�r�r�r�r�r�r�r�r�r�r�r�rr�r�r�r�r�r�r�r2r#r#r2r2r2r2r2r#rr2r2r2r2r#rr#r2r2r2rrr2rr#r�r2rr�r#r#r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r2r#r#r2r2r2r2r2r#rr2r2r2r2r#r#r�r2r2r2r2r�rr2r�r�rr#r�r#r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r2r2r2r2r2r2rrr2r2rrrrr2r2r�r#rr2rr2rrr2rr#rr�rrrr�r�r�r�r�r�r�r�r�r�r#r�r�r�r�r�r�r�r�r�r�r�r�r�rr�r�r�r�r�r�r�r2r2r2r2r2r2r#rr2r2r2rr#rr2r2rrrr2rr2r2r#r2r2r#r#r�rr2rr�r�r�r�r�r�r�r�r�r�rr�r�r�r�r�r�r�r�r�r�r�r�r�r#r�r�r�r�r�r�r�r2r2r2r#rrrrr2r2r2r#r#r#r2r2r#r#r2r#r#r2rr#rr2r#r#r�rrrr�r�r�r�r�r�r�r�r�r�rr�r�r�r�r�r�r�r�r�r�r�r�r�r#r�r�r�r�r�r�r�r2r2r2rr#rr#r#r2r2r#r#r#r#r2r2r#r#rrr#rr#r#rrr#r#r�rrr#r�r�r�r�r�r�r�r�r�r�r#r�r�r�r�r�r�r�r�r�r�r�r�r�r#r�r�r�r�r�r�r�r2r2r2r#r#rr#r#r2r2r#r�r#r#r2r2rr�r#r#rr2r#r�rrr#r�r�r#r2rr�r�r�r�r�r�r�r�r�r�r#r�r�r�r�r�r�r�r�r�r�r�r�r�r#r�r�r�r�r�r�r�r2rr#r2r2r2r2r2r#rr2rr2r2rr#r#r2rr2rr#rrr�r#rr#r�r�r#r#r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r#r�r2r2r2r2rrrrr2r#rrr#r#r2r2r�r2rr#rr2rr#r2r2r#r#r�rr#r2r�r�r�r�r�r�r�r�r�r�r#r�r�r�r�r�r�r�r�r�r�r�r�r�r#r�r�r�r�r�r�r�r2r2r2rrrr2rr2r2r2rr#r#r2r2r#r#r#rrr2rr2rrrr#r�rrr#r�r�r�r�r�r�r�r�r�r�r#r�r�r�r�r�r�r�r�r�r�r�r�r�r#r�r�r�r�r�r�r�r#r�r�r2r2r2r2r2r�r�r2r2rr2r�r�r�rr2r2r#r�r#rr�r�r#r#r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r2r#rr2r2r2r2r2r#rr2r2rrr#r#r�r2r2rrr#rrr#r�rrr�r#r#r#r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r2r2rrr#r2r#rr2r2rrr#r#rrr#r#r#r#r2rr#r#r#r#rr#r�r#rr#r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r#r�r�r�r�r#r�r�r�r�
r�r�r�r�r�rr2r2r#r#r#r#r#r2r2r2r�r#r#r2r2r#r#r#r#r#rrr�r2r#r#rr�rr#r#r�r�r�r�r�r�r�r�r�r�r#r�r�r�r�r�r�r�r�r�r�r�r�r�r#r�r�r�r�r�r�r�r2r#r�r#rr#rrr�r#rr2r#rr�r�r�rr#r#r#r#r#rr�r�r#r#r�r�r�r�r#rr#rrrr#rr#rr�rr�rrr#r#rr#r#rr#r#r#r�r#r�r�r�r#r#r�r#r#r#rr2rr2r2r�r#rrr2r#r�r#r�rr#rrr�r#r#r�r�r#r#r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r#r�r�r2r2rrr#r�r�r2rr2rr�r�r�r#r#r2r�r�r#r#r�r�rr#r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r2r#r#rrr2r2r#r�r#r2rr2r#r#r#r�r#r#r#r#r#r2r#r�r�rrr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r2r#r#r#rrrr#r�r#rr2r2rr�r�r�rr#r#r#rr#r#r#r�r#r#r#r�r�r�r#rrrrrr#r#r#rr�rr#r#r#r#r#rr#r#r#r#r#r#r�r#r#r#r�r�r#r#r2rrr#r�r�r#r#rrr�r2r�r#rr#r#r�r�r#r#r#r�r#r#r#r#r�rr#r#r#rrr#r#r#rr#rr#r#r#r#r#r#r#rr#r#r#rr2r#r#r#r#r#r#r#r#r#r�r#rr2r2r�r#r�r�r�r2r2r#r�r�r#rrr#r�r�r�r�rr�r�r#r#r#r�rr#r#r#rr#r#r#r#r#r#rr#r#r�r#r#r�r#r#r#r�r#rr#r#r�r#r#r#r#r#r#r#r�r#rr2r2r�r#r�r�r�rrr�r�r�r�r#rrr�r�r�r�r#r�r�r#r#r�r�rr�r#r�rr#r#r#r#rr#r#r#r#r#r#r#rr#r#r#r#r#r#r#r#r#rr�r#r#r#r#r#r�r#r2rrr�r#r�r#r�rr2rr�r�r#rrr#r�r�r#r#r#r�r�rr#r�r#rrr#r#rr#r#r#r#r#r#rr#r#r#r#r#r#r�rr#r�r#r#r�r#r#r#r�r#r#rr#r#r�r#rrrr�r�r#r�r�rrr#r#r�r�rr#r#r�r�r�r#rr�r�rr#r�r�rr#r#r#rr#r#r#r#rr#rr#r#r#rrr#r#rr#r#r#rr#r#r#r#r#r#r#r#r#r#r�r#r#rr2r�r�r�r#r�r2rr#r�r�r#rr#r#r�r�r�r�rr#r�r#r#r�r�rr#rr#r#r#r�r�r�r#r�r#r#r#r#r#rr�r�r#r�r�r�rr�r�r#r#r#r#r#r#r#r#r�r#r2r�r�rr#rrr#r�r�rr#rrr�r�r�rr#r#r#r�r#r#r�r�r#r#rr�r�r�r#rr#rrr#r#rr#rr�r#r#r#r#r#r#r#r#r#rr#r#r�r�r#r#r#r#r�r�r#r#r2rr�r�r�r#r�rrrr�r�r�rrr#r�r�r�r�r2r#r#r#r#r�r�rr#r#r#rr#r�r#r#r#r�r#r#r#r#r#r#r#r�rr#r�r�r#r�r#r#r�r#r#r#r#r#r#r�r#rr2rr�r�r�r#r�rrr�r�r�r�rr#r#r�r�r�r�rr#r�r#r#r�r�rr#r#r�rr#r#r#r#rr#rr#rr�r#r#r#r�rr#r#r#rr#r#r#r#r�r#r#r#r#r#r�r#r2r#r#rrrr2rr#r#rrr#r#r�r#r�rrr#r#r#r#r#r�r�r#r#r�r#r#r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrrr�r�r�r�r�rrr�r�r�r�rrr#r�r�r�r#r#r�r�r#rr�r�rr#r#r#rrr#r#r#rr#rr#r#r�r#r#r#r#rr#r#r#rr#r#r#r#r�r#rr#r#r#r�r#r#r�r�r#rr2rr#r�r�rr�r#r#r�r�r�r#r#r#r#r�r#r#r�r�r#r�r�r�r�r�r#rr#rr#rr#r#r#rr�rr#r#r#r�r#rr�r�r#r#r#r�r�r�r�r�r�r�r�r�rr2rr�r�r�r�r�r#r#rr#r�r�r#r#r#r�r�r�r�rr�r�r#r#r�r�rr#r#r#rr#r#r#r#r#r#rr#r�r#r#r#r#r�rr#r#r#r#r#r#r�r#r�r#r#r#r#r#r�r#r#rrr�r#r#r#r�rrrr�r�r�r2rr#r�r�r�r#r#r�r�r#r#r�r#r#r#r�r�r#r#r�r#r#r#r#r#r#r#r#rr#r#r#r#r#r#r#rr#r#r#r�r�r#r#r#r�r#r�r#rr#r�rr#r#rrr#r#rr#r#r#r�r�r�r#r#r�r#r#r#r#r�r�r#r#r#r�r�r�r#rrrrrr#r#r#rr�rr#r#r#r#r#r#r#r#r#r#r#r#r�r#r#r�r�r�r#r�r#rr2r�r�r�r#r�rrr�r�r�r�rrr�r�r�r�r�r#r�r�r#r�r�r�rr�r#r�rr#r#r#r#r#r�rr�r�r�r#rr#r#r#r#r�r#rr�r#r�r#r�r#r#r#r�r#r�r#rrrr�r�r�r#r�rr#rr�r�r�r#r#rr�r�r�r�r#r�r�r#r#r�r�rr#r�r#rr#r#r#r#r#r#r#r#r#r#r#r#r#r#r#r#r#r#r#r#r#r#rr�r#r#r#r#r#r�r#r#rrr�r�r�r#r�rrrr�r�r�r#r#r�r�r�r�r�r#r#r�rr�r�r#r#r#r�r#r#r�r#r#r#r#r#r#r�r#r#r#r#r�r�r#r�r�r#r#r�r#r�r#r#r#r#r#r�r�r�r#r#r�r�r#r�r#rr#r�r�r#r#r#rr�r�r�r#r#r�r#r�r#r#r�r�r#r�r�r�r�r�r�rr#rr#r#r#r#r#rr�rr�r#r#r�r#rr#r�r#r#r#r�r�r�r�r�r�r#r�r�rr#r#r�r#rr�r�r#r#r#r�r�r�r#r#r�r�r�r�r�r#r�r�r#r�r�r�rr#r�r#rrr#r#r#r#r#rr#r#r�r#r#r#r#rr#r#r#rr#r#r�r#r�r#r#r#r#r#r�r#r#rrr�r�r�r�r�r#r#r�r�r�r�rr#r�r�r�r�r�rr�r�rrr�r�rr�r�r#rr#r#r#r#r#r#r#r�r#r#r�r#r#r�r#r�r�r�r#r#r#r#r�r�r#r#r#r#r�r�r#r#r#rr�r�r2r#r�rr#r#r#r�r�r#r#r#r�r�r�r#r#r�r�r�r#r�r�r#r�r#r�r#rr#r�r#r#r#rr#r#r�r#r#r#r#r#r�r�r�r#r#r#r#r#r�r#r�r�r�r#r�r�rr#r#r�r�r�r�r�r#r�r�r�r�r�r�r�r�r#r�r#r�r�r�r#r�r�r�r�rr�r�r�rr#r#r#r#r#r#r#r#r#r�r#r#r#r#r#r#r#r#r#rr#r#r�r�r#r#r#r#r#r�r#rr#r#r#rr#r#r#r�r#r#rr#r�r�r�r�r#r#r#r#r�r#r�r�r�r�r#r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r
�r�r�r�r�r�r�r�r�r#r#r�r#r#r#r#r#r�r�r#r#rr#r�r�r�r#r#r�r�r�r#r#r�r�r#r�r#r�r�r�r#rr#r#r#r#r#r#r#r#r�r#r�r#r#r#r#r#r#r�r#r#r#r�r�r�r�r�r�r#r�r�rr�r�r�r#r#r#r#r�r�r#r#r�r�r�r�r�r#r#r#rr�r�r#r�r�r#r�r#r�r�r�r�r#r#r#r#r#r#r#r#rr�r#r#r#r#r�r#r#r#r�r#r#r#r�r�r�r�r�r�r�r�r�r#r�r�r#r#r#r#r#r�r�rr#r�r#r�r�r�r#r�r#r�r�r�r�r�r�r#r�r�r�r�r�r�r#r#r#r#r#r#r�r#r#r�r#r�r#r#r�r#r#r�r�r#r#r#r�r�r�r�r�r�r�r�r�r#r�r�r#r#r#r�r�r�r�r#r�rr�r�r�r�r�r�r�r�r�rr�r�r�r�r�r�r�r�r�r�r#r#r#r#r#r�r�r#r#r�r#r�r#r�r�r#r#r#r�r#r#r#r�r�r�r�r�r�r�r�r�r�r�r�r#r�r�r�r�r�r�r#r#rr#r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r#r#r#r�r#r�r�r#r#r�r#r�r#r#r�r#r#r#r�r#r#r#r�r�r�r�r�r�r�r�r�rr#r#r#r#r#r#r#r#r#r#r�r�r#r#r#r�r�r#r�r�r#r�r#r�r#r#r#r�r�r#r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r#r�r�r#r#r#r#r�r�r�r#r#r#r�r�r�r�r#r#r#r�r�r�r�r�r�r�r�r�r�r�r�r�r#r#r#r#r#r#r�r#r#r�r#r�r#r�r�r#r#r�r�r#r#r�r�r�r�r�r�r�r�r�r�)ZLatin2_HungarianCharToOrderMapZwin1250HungarianCharToOrderMapZHungarianLangModelZLatin2HungarianModelZWin1250HungarianModel�r�r��(/usr/lib/python3.6/langhungarianmodel.py�<module>#sV
site-packages/pip/_vendor/chardet/__pycache__/utf8prober.cpython-36.pyc000064400000003543147511334570022077 0ustar003

���e�
�@sHddlmZddlmZmZddlmZddlmZGdd�de�Z	dS)�)�
CharSetProber)�ProbingState�MachineState)�CodingStateMachine)�
UTF8_SM_MODELcsTeZdZdZ�fdd�Z�fdd�Zedd��Zedd	��Zd
d�Z	dd
�Z
�ZS)�
UTF8Proberg�?cs*tt|�j�tt�|_d|_|j�dS)N)�superr�__init__rr�	coding_sm�
_num_mb_chars�reset)�self)�	__class__�� /usr/lib/python3.6/utf8prober.pyr	&s
zUTF8Prober.__init__cs"tt|�j�|jj�d|_dS)N�)rrrr
r)r
)rrrr,s
zUTF8Prober.resetcCsdS)Nzutf-8r)r
rrr�charset_name1szUTF8Prober.charset_namecCsdS)N�r)r
rrr�language5szUTF8Prober.languagecCs�xj|D]b}|jj|�}|tjkr,tj|_Pq|tjkrBtj|_Pq|tj	kr|jj
�dkr|jd7_qW|jtj
kr�|j�|jkr�tj|_|jS)N�r)r
Z
next_staterZERRORrZNOT_MEZ_stateZITS_MEZFOUND_ITZSTARTZget_current_charlenr�stateZ	DETECTING�get_confidenceZSHORTCUT_THRESHOLD)r
Zbyte_str�cZcoding_staterrr�feed9s



zUTF8Prober.feedcCs.d}|jdkr&||j|j9}d|S|SdS)Ng�G�z��?�g�?)r�
ONE_CHAR_PROB)r
ZunlikerrrrLs

zUTF8Prober.get_confidence)�__name__�
__module__�__qualname__rr	r�propertyrrrr�
__classcell__rr)rrr#srN)
Z
charsetproberrZenumsrrZcodingstatemachinerZmbcssmrrrrrr�<module>ssite-packages/pip/_vendor/chardet/__pycache__/hebrewprober.cpython-36.pyc000064400000005513147511334570022464 0ustar003

���e6�@s,ddlmZddlmZGdd�de�ZdS)�)�
CharSetProber)�ProbingStatecs�eZdZdZdZdZdZdZdZdZ	dZ
d	Zd
ZdZ
dZd
ZdZ�fdd�Zdd�Zdd�Zdd�Zdd�Zdd�Zedd��Zedd��Zedd ��Z�ZS)!�HebrewProber�������������������g{�G�z�?z
ISO-8859-8zwindows-1255cs>tt|�j�d|_d|_d|_d|_d|_d|_|j	�dS)N)
�superr�__init__�_final_char_logical_score�_final_char_visual_score�_prev�_before_prev�_logical_prober�_visual_prober�reset)�self)�	__class__��"/usr/lib/python3.6/hebrewprober.pyr�szHebrewProber.__init__cCsd|_d|_d|_d|_dS)N�� )rrrr)rrrrr�szHebrewProber.resetcCs||_||_dS)N)rr)rZ
logicalProberZvisualProberrrr�set_model_probers�szHebrewProber.set_model_proberscCs||j|j|j|j|jgkS)N)�	FINAL_KAF�	FINAL_MEM�	FINAL_NUN�FINAL_PE�FINAL_TSADI)r�crrr�is_final�szHebrewProber.is_finalcCs||j|j|j|jgkS)N)�
NORMAL_KAF�
NORMAL_MEM�
NORMAL_NUN�	NORMAL_PE)rr%rrr�is_non_final�s
zHebrewProber.is_non_finalcCs�|jtjkrtjS|j|�}x�|D]�}|dkrp|jdkr�|j|j�rT|jd7_q�|j|j�r�|j	d7_	n,|jdkr�|j|j�r�|dkr�|j	d7_	|j|_||_q"Wtj
S)Nrr)�stater�NOT_MEZfilter_high_byte_onlyrr&rrr+r�	DETECTING)rZbyte_strZcurrrr�feed�s 




zHebrewProber.feedcCsx|j|j}||jkr|jS||jkr.|jS|jj�|jj�}||jkrR|jS||jkrd|jS|dkrr|jS|jS)Ng)	rr�MIN_FINAL_CHAR_DISTANCE�LOGICAL_HEBREW_NAME�VISUAL_HEBREW_NAMErZget_confidencer�MIN_MODEL_DISTANCE)rZfinalsubZmodelsubrrr�charset_name�s

zHebrewProber.charset_namecCsdS)NZHebrewr)rrrr�languageszHebrewProber.languagecCs(|jjtjkr"|jjtjkr"tjStjS)N)rr,rr-rr.)rrrrr,szHebrewProber.state)�__name__�
__module__�__qualname__r r'r!r(r"r)r#r*r$ZNORMAL_TSADIr0r3r2r1rrrr&r+r/�propertyr4r5r,�
__classcell__rr)rrr�s.

;rN)Z
charsetproberrZenumsrrrrrr�<module>scsite-packages/pip/_vendor/chardet/__pycache__/euckrfreq.cpython-36.opt-1.pyc000064400000056746147511334570022742 0ustar003

���e�4�0	@sdZdZ�	d2Z�	d1S(3	g@�0	�
��x�t�����H�a�������+��W�u��h��]�������������v�w�������m�F�!�p�������������x���/������������9�����t���-�y��K������������O��n������������0����<�4�{����������i���r����������������������X�X���������Y���&��P�������������^��������������9�������������Q���"��t������]�{�7����{��;��u���z�/��|�������7���.��������������{����#�|�}��~��t�8��_�	�
���
�!����_���������*��u��`�"���|�������a�������?��R�!� �/���!�"�=���#��$�%�&�'�(�)�,���'�b�$�*��+��-���,�������&�U����#���-�.�'���f�/�s��0�������� ��9�e�[�1����Z���:�����2�3���G����y�4�����5�6�7�,�w����s�8����9�:���~�;��<�;�}�=�>�?�o�)����@��A�B����2�Y�C�D��<�E�F�G�H�I�%�J�K�L�M�N�O��`�>�P��=�Q��R��S�T�;�������U��V�W�X�4�Y���Z�[��\�]���^���_�"�P���`�;��~�H�a��v��z�?����b���<�c�d�d�e�f�c�0��d��g�y���h�i�s�0�j�=�k��l�����<�b���U������I�m�n�o��p�q�r�s�t�u�������6�v�w��*��]�x�y���z�Z��-�:��b�
�{�|��&�'����5����>�}�~�w��g����6��%���(��v����w������E������������f���V��7�����B��N��[��'���������S�������e���x������������������������?�����q��f��(�)����~���\���������)��������������$����������l�����~�����C�����@��������2����K��z�V��������Q�r�f������ �h�+�3��1�������g�(����z��������������������A��<�j���M�g��2�������������������V�h����J�����0����b�������������Y���������a�!�*����������K�D�8��R��B��@����������y����������X�:����#������i���G����k����=��������!������J�����=����}����j�����������������������E����������j�O�4�����������������	�v�]��C���������o�����
����l�c�A������������T����k����������3�*��q�����>��������+������;��p��x�������������	��
����
������l�������)������m�8���D��������������7�L�B���D�������t���� �!�
�"�#�$�%�R�&�'�(�)�*��+�,��,�-�.�m���
�^�/�c��E�����a�m�0�E�1�2�3����4�T�����5������6�7��n���o�8�9�F����:�G�;�<�=�>�?�@�A�B�C�D�E�$�F�G�����%���p�H�I�J�K�L�����������M�N�O�P�Q�R���S�/�T��U�����V��I�H��������W��X��q�Y�Z�[�\���r��s�]�^�_�`��v�L�a����.��b�F�>�����j�c�Z����B�6��`�d�e����|��f���5�g��h��i�H��j����k��l���1�m�n�o�p�q�r�C�s�t�u�v�w���x�y�z�{���|�}�~���������O��������������E���q���I��\��-���S����
��e�����l��M���Q��P���^��
�-�F���������������������
�.���t�����J���������g�������������������������������������u�����9�	������	���$���������5�%��k�L����������������A��������������R��u���������)�����:��������������"�$�v�����c�Z�����������*�W��K���L���+���������������	�B�����?���������������M�[�5�������n����������C���'���������������
�������	��������������F�T�/����������8��u������K���(�M���i��������������T��?���������e� �(��������%�0����O��	�8�	�	�	�	�	�����	�	�	����#����		����V�
���P�� �M�W���
	��	��|�	�
	�S������������	�	�����	�	�	�	�	�	���	�	��9�D���������	�3����	�	�	�{�	����	�	�	�� 	�������!	�k�������N���Y���"	�#	�$	�%	�&	�'	�(	�]�)	�*	�+	�,	�-	�.	��/	�����>�����1	������\��2	��3	�����4	�5	�6	�w�7	��8	��3�����9	�:	�L�����;	���<	���������������=	��J�>	�?	�@	���A	���B	�C	�D	���E	�����F	�G	�H	�I	������������J	�K	����L	�M	�N	��I�O	�P	���Q	�R	�S	��T	���U	�V	�W	�X	���Y	�Z	�[	�\	�]	����^	�_	�`	�a	����b	�c	�d	�e	����f	�g	�U���x� �h	����i	�j	�����4�&�������!�����S�y������"�
���#�k	���l	�m	�n	�����p�Q��.�o	�U��O���p	���q	�r	�s	�x�t	��u	�v	�w	���$�x	��y	���`�z	�{	�|	�}	���~	�	�	�	�	�	�	�	���C�������%�y���o�	���	�	�	�	�	�	��	�	�	�	����	����	�	��@����	�G�	������2��������	�	�	�	�N�	���������	�	�	�	�	�r�	���z�n�����P�	�#�	�&�Q�	���m�	�c����	�'�	�w�	�	���(�l�@�������)���*�	�	�	�	���	�	�	�	���	����	�	�	�H��	�	��	���	�	�	���	�	�r�	���A�	�������	�}�,���:�����
��I�	�N��1�	�W������������	�+��	��	�1���b����	�q��	�,��	���e����_��d��	��	��	��	��	��	����2�����	���	��	��	��	���	���	��	��-��	�����	��	��	��	��	��	����	��	��	��	��	��	���	����	��	����7��	����	���	��	��	��	��	��	��	������������,�G�������	��^��	��	�.��	��g���	����	�	�	���	�����_��	�	�	���h�h��{�	�|����3�	�	�	�����	�
�
�\�����
�
�
�
�
�
�
��	
�

�
�
����
�j�
�i��4��
�
�
�
�
�
���
�
�
�
��6�d���/�
�
����o�
��R������	�&�
�k�n�z��������
��
����X��d��S��}�
�}����~������� 
�!
�N�"
����������#
���s�$
�%
�&
�f�D��1�'
���(
�@�)
�^�����*
���+
�,
�-
�.
�/
�J�+�0
�1
�2
���T��3
�4
�5
�6
���7
�A�.�����8
�9
��"�:
�;
���<
�=
�>
���0�?
��@
��A
�B
��C
�D
�E
�F
�G
���_�[�H
�I
���`�a�J
�����K
�L
�M
�N
��O
��i����P
�Q
�R
N(0	rrrrrrrr	r
rrr
rrrrrrrrrrrrrrrrrrr r!r"r#r$r%r&r'r(r)r*r+r,r-r.r/r0r1r2r3r4r5r6r7r8r9r:r;r<r=r>r?r@rArBrCrDrErFrGrHrIrJrKrLrMrNrOrPrQrRrSrTrUrVrWrXrYrZr[r\r]r^r_r`rarbrcrdrerfrgrhrirjrkrlrmrnrorprqrrrsrtrurvrwrxryrzr{r|r}r~rr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrrrrrrrrr	r
rrr
rrrrrrrrrrrrrrrrrrr r!r"r#r$r%r&r'r(r)r*r+r,r-r.r/r0r1r2r3r4r5r6r7r8r9r:r;r<r=r>r?r@rArBrCrDrErFrGrHrIrJrKrLrMrNrOrPrQrRrSrTrUrVrWrXrYrZr[r\r]r^r_r`rarbrcrdrerfrgrhrirjrkrlrmrnrorprqrrrsrtrurvrwrxryrzr{r|r}r~rr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrrrrrrrrr	r
rrr
rrrrrrrrrrrrrrrrrrr r!r"r#r$r%r&r'r(r)r*r+r,r-r.r/r0r1r2r3r4r5r6r7r8r9r:r;r<r=r>r?r@rArBrCrDrErFrGrHrIrJrKrLrMrNrOrPrQrRrSrTrUrVrWrXrYrZr[r\r]r^r_r`rarbrcrdrerfrgrhrirjrkrlrmrnrorprqrrrsrtrurvrwrxryrzr{r|r}r~rr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrrrrrrrrr	r
rrr
rrrrrrrrrrrrrrrrrrr r!r"r#r$r%r&r'r(r)r*r+r,r-r.r/r0r1r2r3r4r5r6r7r8r9r:r;r<r=r>r?r@rArBrCrDrErFrGrHrIrJrKrLrMrNrOrPrQrRrSrTrUrVrWrXrYrZr[r\r]r^r_r`rarbrcrdrerfrgrhrirjrkrlrmrnrorprqrrrsrtrurvrwrxryrzr{r|r}r~rr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrrrrrrrrr	r
rrr
rrrrrrrrrrrrrrrrrrr r!r"r#r$r%r&r'r(r)r*r+r,r-r.r/r0r1r2r3r4r5r6r7r8r9r:r;r<r=r>r?r@rArBrCrDrErFrGrHrIrJrKrLrMrNrOrPrQrRrSrTrUrVrWrXrYrZr[r\r]r^r_r`rarbrcrdrerfrgrhrirjrkrlrmrnrorprqrrrsrtrurvrwrxryrzr{r|r}r~rr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrrrrrrrrr	r
rrr
rrrrrrrrrrrrrrrrrrr r!r"r#r$r%r&r'r(r)r*r+r,r-r.r/r0r1r2r3r4r5r6r7r8r9r:r;r<r=r>r?r@rArBrCrDrErFrGrHrIrJrKrLrMrNrOrPrQrRrSrTrUrVrWrXrYrZr[r\r]r^r_r`rarbrcrdrerfrgrhrirjrkrlrmrnrorprqrrrsrtrurvrwrxryrzr{r|r}r~rr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrrrrrrrrr	r
rrr
rrrrrrrrrrrrrrrrrrr r!r"r#r$r%r&r'r(r)r*r+r,r-r.r/r0r1r2r3r4r5r6r7r8r9r:r;r<r=r>r?r@rArBrCrDrErFrGrHrIrJrKrLrMrNrOrPrQrRrSrTrUrVrWrXrYrZr[r\r]r^r_r`rarbrcrdrerfrgrhrirjrkrlrmrnrorprqrrrsrtrurvrwrxrryrzr{r|r}r~rr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrrrrrrrrr	r
rrr
rrrrrrrrrrrrrrrrrrr r!r"r#r$r%r&r'r(r)r*r+r,r-r.r/r0r1r2r3r4r5r6r7r8r9r:r;r<r=r>r?r@rArBrCrDrErFrGrHrIrJrKrLrMrNrOrPrQrRrSrTrUrVrWrXrYrZr[r\r]r^r_r`rarbrcrdrerfrgrhrirjrkrlrmrnrorprqrrrsrtrurvrwrxryrzr{r|r}r~rr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrrrrrrrrr	r
rrr
rrrrrrrrrrrrrrrrrrr r!r"r#r$r%r&r'r(r)r*r+r,r-r.r/r0r1r2r3r4r5r6r7r8r9r:r;r<r=r>r?r@rArBrCrDrErFrGrHrIrJrKrLrMrNrOrPrQrRrSrTrUrVrWrXrYrZr[r\r]r^r_r`rarbrcrdrerfrgrhrirjrkrlrmrnrorprqrrrsrtrurvrwrxryrzr{r|r}r~rr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r	r	r	r	r	r	r	r	r	r		r
	r	r	r
	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r 	r!	r"	r#	r$	r%	r&	r'	r(	r)	r*	r+	r,	r-	r.	r/	r0	)Z EUCKR_TYPICAL_DISTRIBUTION_RATIOZEUCKR_TABLE_SIZEZEUCKR_CHAR_TO_FREQ_ORDER�r1	r1	�/usr/lib/python3.6/euckrfreq.py�<module>)s(site-packages/pip/_vendor/chardet/__pycache__/euctwfreq.cpython-36.pyc000064400000152605147511334570022010 0ustar003

���e�{�@sdZdZ�dZ�dS(g�?i���������	�R���������
�n��!���L�,�A��s������L
�S
���������.�N�i�����:�����?���=�N�K������l	����
����
� ��
�����
�o�$��i���c�8����������z�|���t�"�
�e�@�\��	��������F��M
�Q�H���P�v���f�����D�T
����F�N
��E��O��/���s���3�<�2���&�L�����O
����G���M���?���`��F	�*���g�	�Z�
�:����K��	��	�������`�����g��	� �q��~����P
�	������!��u���*���	�
��~��������e���G�^�������U�C���B��������� �j�o�/���O�2	�[����
�&������S�(����p�]��6�i�
�'�������8��+�%�[���\��������X	�(������0��� �H�
�
�"�!���+��1�"���
�O�G	���f�1�����������2�9���l�,���	�������}�h�#�q
�Q�M�&��X���#����
�����j�����M�����%����$���'��	���N�i��7��J���!���������M��)�P�U
�����
��%�
�
��� ��
� �I���3	���
��r
���������m����$���x�������
������%�&���
��������&��'���'���.�����H	�������$�#���D�&�A��	��������U�G������
�Q
�P�S�'���d��0�F�����*��������J���U�����I	�R
���<��S
�:�7
��
���	�	�9���}���	���V�P�T
��)�C�����&�
��	�)���m	��������4	������n	�*��������O������	�+�(�
������U
�(���5�Y�j�
�l��u��)�
�*�+�V
�
�=�������������4��!����T�,�x�����e���	���J	�P��s
�5�A�V
�/�k���
��l�!�	���A��`��
���A��
����
���	���������M�
�������W
���
�t
����+�}��j�8
�����������-�)�m��	���W
���		��
��a��
�P�K	��,������7�'�u
���{�k�������
�9
������������1���b���	�
����o	���X
�,�Q����������
��
�X
�����5�D���l����[� 
����Y
��%��.�Y	��*����
���R������p����n�c�g�+����'�2�����{��l���m���:
���f�
	���|�5	�������Z
���
������,������;���I�Y�����[
�X�"���	������~�����,��k����-�������D�����>�]���,�������v�L�B�i�&�����
�����G������B���
�!
�����u�������	� �a�����v
�S���}�;
��D���=���0����\
�(������v� ���������.�����9�H�������]
�E���������!���/�������-���Q�*��.�/�������0�3��"
��*�R�����W�����/��b�.���	�����������R����)��2��	������������Z	���T��w
������	��������[	�O��	��`�7�x
�^
��������\	�������������g���n��
�_��~������0��
�a���y
�]	�������������	�;���	�-����L	������/��������`���#�"�������0�
�s�����d�s��
���Y
����k�w�o����������1��2���g�p�Q��U��v������C�S��^	���5���B���_��b�N�����X�����
��L�c�
�
�	�K�w�*�a�G�3��
����1�6�	�2�6	�N����	�3���:��\�q����-�����3�
�������x�
���r���4�
�_
��U����@��5���������/�+�6���������
�7	�Z
�����8�4��C������l��`
��������Y������5��	�{�����$�7��M�V�0�r�����g��V��8��9�,�Z����{�#�W�0���$��
��
��4��[��v���c��V�M	�$�-����X����d����W�[
�������q��
��1�����t����	����
�2��2�:�3� �.���2���
�������%�������
��������
��
�6��4���|���S����_	��
������`�����	�3���;�r����7�Y�L�;���	��	�����T�\�����<���V�5����
�����"�x�%��;�8	� ���2�E�=�!�����^��w��
�"�����s��#�<�Q����r�$��]��%��.�&��3�5�B��y�'��������>�Y�a
�b
�(�
�	�)�*�	�����:��6��^�����(��
�+�+���x�,�	�.�h�/�-����
�V����h���z�.�����<����;�/�?���
�	�0�8������5�6�
�_�)�1�2�	�3�y�	����
��^�4���?�$�+�\
��5�_�W����z
�6�0��
�]
�	���p	�7����8�^
�<
���V�9��6���I�w�}�c
�^��
�9�@��
�A�����	�:�e�1���B��������������7�;�<���=�{
���(�`�R�1�Y��I��	�����6��d
�>�����e
�?���o��q�Y�n���l��C�D�{���
�_
�E�P�U��������E�m��@���:�
��
�A������#
�	�N�
��B��C�W�T�T����
��f
�;�D��&��
�|���������
���<�=
��
����	�V���
��o���	�E��
�F�������������D�-�8���G�d��	���
�H�t����I�z�
��J��
��
�K���|
�����>
��������L�:�	�=���M��?
����H������N���	�O�P��F�k�Q�9�������
�	�u��v������R�S����}�
�>�������
�:�T�	���N	��C������U�~�� �W���G����,������O���|�����O�?�V�`	�� ��~��w����@�!����;������W���"��
�X��Y��R�g
�u���
�h
��
�x�=�
��Z��<����[��#��=� 	�\�$��]�H�I�^�7�O	�$
�
�_�%�v�J�� �!�w�t��`�a�b�9	�&���"�{����	�'�c���d�8������[�9���d�Z�	�e�����@
�!�(�"�f��������#�$�T�g�K���%��W�M�h���i�����%��������
�`
��y�7�m�L�j�k��:	�>�!�������B�)�l�|���
�z����i
���M�����
���m��n�u�q	�v����
�
�*�N�o�p���q�7�.�r���y��
���
��f����s�!	�y��
�O�	�����+�P�j
��
�>������Y��A�t��u�%
�
��v���z�2�w�,��	��	���3�Q����
�x��6�9�k
���&�6����-�
��R�y���Q�9�c��
�'�.�
�o�(�B�F�?�l
���)�e����
�z��
�S�T������{�/���"	�	�b�/���`���0�u��v���1��
�	��U��*���/�E��S��8�Z�|�}�F�+����!�~�2�,�F�P	��m
��
��e�s��C���r	������a�3�Z�%�-�����	�4�����A���
��
�i�
����D�.����	�n������
���	��3�5�J�0���s	��
�����@���V��/��	���6�W�A�7�o��
���P�0��1��n
���a�Q�2��p�:���P��b�������{�3�7�E���	����|�F�G�X�������]��
�4���}
�����
�5��&��8�	
�����	�������L�������
�	�Y�

�����4�
�B�9�8������	�H��
�/�3�Z�[���
�o
����:�0���n�\���]���^������	�����I������;��4��<��=�;�	��
����X�	�����5�_���
�
��������}�0�>�p
�J���K������&
�F�������
��
�#��[�����f��>�q
��?�r
��6�������
�������j�����@�-�p�9�;	�6�y�A������	�'
��������V�����`���4�����K�
�a��
�6����������"���K��

������	�Y�G�o��b�f�c�7��s
�p�r�	�8��������A
�Q	�
�C�,�
��������\�L��u��~
��
����9�����B��Z���
�C������j�d��0�h�g��e���M�}�7��
��
��D�	�����f�:����N�;�
�E�F��(
���<����G�M�=�w�g��~�t	�?�����
�t�m�#	��	��+���h�������[�
�O�	�����
���B
����$	�t
��
���a	���	��H������Q�"�<	�
����-�1��#�u
�'���D�����
��
�I���>�J�%	����P��
�i��
�=�a
��K��v
���������
�����_���j���?�E�1��
��
��k���8�Q����l�����
��m�@�
��b�������R������5�������L�	�w
��n�����4�H����M�����������\��o���N�]���
��\�x
�g��E�
������������p���
��q�
���������O�������r��Z��s�t������P�e�����������o�����^��}�������z�A�&	�������
�����n�-����
�Q��i�
������R���u��	�����B������
�y
����v�w�B�h�q�����<��*�i�S�"�T�C�>�)�
�x�U�����~������z���V�������C
�)�q����W�
	�������D���E�T�F�p�����#�#�y���
����G���z
������	�����S���X�
��������u	�
�=�H���_��R�������w�����Y�z�{�q����3���5���
���{
���|�}�S�~���
�
��Z�[�������`���F��]���n�����I���
����������G�T�J���K�z�S�r�\�)
��
�v	��5������
�b	���j��������
���]�^����c�������
����
�a�:�1�������A�h��a��
�6������b
�-���
�H�_���	��
����������|
��*
�f���	����
�D
��
�L�W����U��
���$���
�
���N���R	�k��E
�������M���S	�'	��`���s��F
���a�}
����
�(	�j��	�~
�V�B�[�b�W�����b�
�I��c
��
��{�����	����	�c�
��	��d�
�
�]������=	�e�������
����
�N�	����f�g�
��
�
��h������
�=��X���������d
��H�C�
�i��T�W�
�
��9��	�N���K�E�j�4�J�
�Y�
��G
�	�k���h�O��
�
��]�P�Z��l���Q�[�k��m�n�R�o�c	������S�X����\�+��������I�T�U�U����
���l��
�D����p��q�
�k�	�m��
�K��������V�8��
� ��r�@�W�r��
�C�G����
��^�
�����8�t��:���
�!��n�s��"��
�#�$�]�t�	�%��
�&�r�u�'�+
�(�)��e
�
�}�
�����*�4�1�+�����������,���
�>	�X�t�r�?	���
�-��v�u�{���w	�w���.�x�T	�,
����/�0�������1��
��2��
�G�R�
��^��Y�
�Z���
�[�3�^��������
�)	�\�	�4�@�v��5��6�7��d	�]��
���^�8��&��	����9�y�u���
���z�R��{�����(�_���<�9��l������_� 
�S�`�k���[�	���
�x�:��;�<�!
�$�=��|���>�d��
��?�e��������������
�}�@�A�B���.�~�a�O�
�����`�b��
�%�����
�;��
����C��D�������f
�"
�L�
�����
�E�;��F�&�G��
�
���x	�H�)����c��I�d�T���
�
�e��������	��J��g�~��U��e	�	����f�a�8�g��
�	�b�h�K�y	�����L��:�
�������*�M�i����c�t���M�N������	��
���f	���Y�d�e��*����O�H
��#
���#�;�
��	������I
�\�@	�����s�$
��
�P�Q�
�	�
�R�j�S�T���+���U��V����L�
�!��J
�\�4�W����X�Y�Z��[�>�w�k�f����+���%
����$��\����l�
�]��l�^�_�g
�%��`�a�g�����
���N�K
�b�c�d��
�c�w��
�e���b�y��f�h�g�h�h��'�i��m���z	�j�d�i����	�<����|����������O�k��
�l���
�a���m�����n�
��u���n�����e�&
�
�o�����	�<���p�q�P�Q��j��	��	����(��_�r�s�����<�E�R�f��������]���I�t�=��u��v��
���o�
�
�k�,�l�w�'
��	�m����F�`����f����
�S�J�x��y�z�n��L
���p�O�����
�{��
���A	�y����U	���|�}�~�F��q����
�	������������p����B	��o�	����p��-
���r���q�q���(
���r����s������
����
�(�.
�s�)
��t���	��*
���u���T�v�M
�{	���t��'�����
�����������g�Z�w�
�	�c��j�+
�p����x������h����
������,
�=����	��u������/
�y��
��	�y�v�	����w���������
�z������>���C�'�����{��Z�����������a����	��q�?��
��|��	�"���
����}�������=��A��1��)����?�
�
����x�	��	��	�
���h��S�����(��
���'�x�y��
���
��
������_�	����>��}�P���������~��
�N
��c��
����H��	�i��������������j���	�*	����
������h
���_���/��<��~�7�k�T���]��U�b�
�t���g	��z���
���|	����������{���	��0
���
��+	��,	����I����������B���4�s�
��������
���?�C���	��G��������������������|�1
���m�C	�����l���i
�}	���?����������h	�����I������V�����������^�������-	�.	���)���
����2
��i�����
�����A�����-
���������
���
���	�������
��
����
�����Q���
�O
���������z�K������[���J�����@��.
�����D	��}�/
�@��
�����
��@��R�����	�j
�~����J�����
�������
�0
��m�����m�~	���	����1
�����	������
�d�������X����9���������U��������L��3
������
�	�������	������
���$�(�v����&���b����������������{�����������������2
��	�������	���
����������^����V	�	�����������������2�W	��
�W�o�i	��k
�X���	�A�4
��
��3
����	������
����
���������	���H�����������D���I�s�
���
��� �����w��{�������/	�n�,������	��	���
�����
��4
�B����P
�5
��
��
�	���
��	���|��z�����
�'��5
����!���y����V�����	��������������	�
���-�������
��
��	�������
��"�%����J�C����x������	��l
�6
�����
��
����X����7
���	�
�N��m
����� �W���!�	��
��
�>�8
��"�#�9
�j	�D�����$�	
�%��
���X�&��'�E��(�F��)��x��	�k������*�+������D���,�������K�-�.�/�:
��0�J���1�|����������E�*�2�
�������H�x�����3������3������;
�p���<
��������4�5�	�	�6��O���
�7�8���9���:����;���������<�
��Y�=�>�
�	�?���@�A�B�
����=
��	����o�	�f�������
��
�	�	�C��D��#��
�>
��E���F�Z�E	����	�[�$�
���G�?
���	�����G������@
�H���A
�\�L�������
�I�	������i�J�	���
�	�K�L�]�M�N�
���j��Q
�
�B
�O�P����Q�
����R�
�S�T���(�����C
���U��V��W�X�����Y������Z����d�[�c�H��_��#�0	�\��]��n�

��	������^���_�k	�D
���`�a���b�����	�c������d����	�����d�^�E
�e�����
����f��_�7����� �
���F
���g�-����!����h�	�q�;������X�� �	����i�j������
�k�l�����~�m�G
�`�n�y�p����o�>��p���q�
���r���a�����s�t�"�u�|���?����v���������e���w�x�y������z�{�@�|���	�}��	����%�s��R
��.���M�?�
���~����b����@������������������������
��6
�I����������c����������2�����m��@����	����N����f����Z�
��������\�H
��d�����J����	���	��
��������K�����#�����������	����K������g�����h���>��.������������)�n
�t�����
�o
��
�I
���
���q�����r��������p
����	����z����U��J
���1	�K
�t��������
�����J����L�	�������N(rrrrrrrrr	r
rrr
rrrrrrrrrrrrrrrrrrr r!r"r#r$r%r&r'r(r)r*r+r,r-r.r/r0r1r2r3r4r5r6r7r8r9r:r;r<r=r>r?r@rArBrCrDrErFrGrHrIrJrKrLrMrNrOrPrQrRrSrTrUrVrWrXrYrZr[r\r]r^r_r`rarbrcrdrerfrgrhrirjrkrlrmrnrorprqrrrsrtrurvrwrxryrzr{r|r}r~rr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrrrrrrrrr	r
rrr
rrrrrrrrrrrrrrrrrrr r!r"r#r$r%r&r'r(r)r*r+r,r-r.r/r0r1r2r3r4r5r6r7r8r9r:r;r<r=r>r?r@rArBrCrDrErFrGrHrIrJrKrLrMrNrOrPrQrRrSrTrUrVrWrXrYrZr[r\r]r^r_r`rarbrcrdrerfrgrhrirjrkrlrmrnrorprqrrrsrtrurvrwrxryrzr{r|r}r~rr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrrrrrrrrr	r
rrr
rrrrrrrrrrrrrrrrrrr r!r"r#r$r%r&r'r(r)r*r+r,r-r.r/r0r1r2r3r4r5r6r7r8r9r:r;r<r=r>r?r@rArBrCrDrErFrGrHrIrJrKrLrMrNrOrPrQrRrSrTrUrVrWrXrYrZr[r\r]r^r_r`rarbrcrdrerfrgrhrirjrkrlrmrnrorprqrrrsrtrurvrwrxryrzr{r|r}r~rr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrrrrrrrrr	r
rrr
rrrrrrrrrrrrrrrrrrr r!r"r#r$r%r&r'r(r)r*r+r,r-r.r/r0r1r2r3r4r5r6r7r8r9r:r;r<r=r>r?r@rArBrCrDrErFrGrHrIrJrKrLrMrNrOrPrQrRrSrTrUrVrWrXrYrZr[r\r]r^r_r`rarbrcrdrerfrgrhrirjrkrlrmrnrorprqrrrsrtrurvrwrxryrzr{r|r}r~rr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrrrrrrrrr	r
rrr
rrrrrrrrrrrrrrrrrrr r!r"r#r$r%r&r'r(r)r*r+r,r-r.r/r0r1r2r3r4r5r6r7r8r9r:r;r<r=r>r?r@rArBrCrDrErFrGrHrIrJrKrLrMrNrOrPrQrRrSrTrUrVrWrXrYrZr[r\r]r^r_r`rarbrcrdrerfrgrhrirjrkrlrmrnrorprqrrrsrtrurvrwrxryrzr{r|r}r~rr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrrrrrrrrr	r
rrr
rrrrrrrrrrrrrrrrrrr r!r"r#r$r%r&r'r(r)r*r+r,r-r.r/r0r1r2r3r4r5r6r7r8r9r:r;r<r=r>r?r@rArBrCrDrErFrGrHrIrJrKrLrMrNrOrPrQrRrSrTrUrVrWrXrYrZr[r\r]r^r_r`rarbrcrdrerfrgrhrirjrkrlrmrnrorprqrrrsrtrurvrwrxryrzr{r|r}r~rr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrrrrrrrrr	r
rrr
rrrrrrrrrrrrrrrrrrr r!r"r#r$r%r&r'r(r)r*r+r,r-r.r/r0r1r2r3r4r5r6r7r8r9r:r;r<r=r>r?r@rArBrCrDrErFrGrHrIrJrKrLrMrNrOrPrQrRrSrTrUrVrWrXrYrZr[r\r]r^r_r`rarbrcrdrerfrgrhrirjrkrlrmrnrorprqrrrsrtrurvrwrxryrzr{r|r}r~rr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrrrrrrrrr	r
rrr
rrrrrrrrrrrrrrrrrrr r!r"r#r$r%r&r'r(r)r*r+r,r-r.r/r0r1r2r3r4r5r6r7r8r9r:r;r<r=r>r?r@rArBrCrDrErFrGrHrIrJrKrLrMrNrOrPrQrRrSrTrUrVrWrXrYrZr[r\r]r^r_r`rarbrcrdrerfrgrhrirjrkrlrmrnrorprqrrrsrtrurvrwrxryrzr{r|r}r~rr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrrrrrrrrr	r
rrr
rrrrrrrrrrrrrrrrrrr r!r"r#r$r%r&r'r(r)r*r+r,r-r.r/r0r1r2r3r4r5r6r7r8r9r:r;r<r=r>r?r@rArBrCrDrErFrGrHrIrJrKrLrMrNrOrPrQrRrSrTrUrVrWrXrYrZr[r\r]r^r_r`rarbrcrdrerfrgrhrirjrkrlrmrnrorprqrrrsrtrurvrwrxryrzr{r|r}r~rr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r	r	r	r	r	r	r	r	r	r		r
	r	r	r
[binary data omitted — remainder of the compiled bytecode for the chardet euctwfreq module (EUCTW_CHAR_TO_FREQ_ORDER frequency table, EUCTW_TABLE_SIZE, EUCTW_TYPICAL_DISTRIBUTION_RATIO)]
[tar entry: site-packages/pip/_vendor/chardet/__pycache__/eucjpprober.cpython-36.opt-1.pyc — binary .pyc data omitted]

[tar entry: site-packages/pip/_vendor/chardet/__pycache__/sbcsgroupprober.cpython-36.opt-1.pyc — binary .pyc data omitted]
[tar entry: site-packages/pip/_vendor/chardet/__pycache__/charsetprober.cpython-36.pyc — binary .pyc data omitted]
[tar entry: site-packages/pip/_vendor/chardet/__pycache__/hebrewprober.cpython-36.opt-1.pyc — binary .pyc data omitted]
[tar entry: site-packages/pip/_vendor/chardet/__pycache__/mbcharsetprober.cpython-36.opt-1.pyc — binary .pyc data omitted]
[tar entry: site-packages/pip/_vendor/chardet/__pycache__/compat.cpython-36.pyc — binary .pyc data omitted]
[tar entry: site-packages/pip/_vendor/chardet/__pycache__/chardistribution.cpython-36.pyc — binary .pyc data omitted]
[tar entry: site-packages/pip/_vendor/chardet/__pycache__/langbulgarianmodel.cpython-36.opt-1.pyc — binary .pyc data omitted]
[tar entry: site-packages/pip/_vendor/chardet/__pycache__/mbcssm.cpython-36.opt-1.pyc — binary .pyc data omitted]
[tar entry: site-packages/pip/_vendor/chardet/__pycache__/codingstatemachine.cpython-36.opt-1.pyc — binary .pyc data omitted]
[tar entry: site-packages/pip/_vendor/chardet/__pycache__/universaldetector.cpython-36.pyc — binary .pyc data omitted]
[tar entry: site-packages/pip/_vendor/chardet/__pycache__/langhungarianmodel.cpython-36.opt-1.pyc — binary .pyc data omitted]
[tar entry: site-packages/pip/_vendor/chardet/__pycache__/langturkishmodel.cpython-36.opt-1.pyc — binary .pyc data omitted]
[tar entry: site-packages/pip/_vendor/chardet/__pycache__/langcyrillicmodel.cpython-36.pyc — binary .pyc data omitted]
rrr
rrrrrrrrrrrrrrrrrrrrrrrrr r!r"r#r$r%r&r'r(r)r*r+r,r-r.r/r0r1r2r3r4r5r6r7r8rrrrrr9r:r;r<r\r=r>r?r@rArBrCrDrErFrGrHrIrJrKrLrMrNrOrPrQrRrSrwr�r�r�rxr�ryr�rzr�r{r�r|r�r}r�r~r�rTrUrVrWrXrYrZrr�r�r�r[r]r^r_r�r�r`rarbrcrdrerfr�r�rgrhrirjrkrlrmrnr�r�r�r�r�r�r�r�r�rorprqrrr�r�rsr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rtrur�r�r�r�r�r�r�r�r�r�r�r�rvrr(rrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr	r
rrr
rrrrrrrrrrrrrrrrrrrrrrrrr r!r"r#r$r%r&r'r(r)r*r+r,r-r.r/r0r1r2r3r4r5r6r7r8rrrrrr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rxryr�r~r{r|r�r�r�r�r�r�r�r�r�r�r9r:r;r<r=r>r?r@rArBrCrDrErFrGrHrIrJrKrLrMrNrOrPrQrRrSrTrUrVrWrXrYrZr[r]r^r_r`rarbrcrdrerfrgrhrir�r�r�r�r}rrzr�r�r�r�r�r�r�rwr�rjr\rkrlrmrnrorprqrrrsrtrurvrr(r�rxrxrxrxrxrxrxrxrxrxrxrxrxrxrxrxr�r�rxrxrxrxr�rxrxrxr|rxr|rxrxrxrxrxrxrxrxrxrxrxrxrxrxrxrxrxrxrxrxrxrxrxrxr�rxr|r|r|r|r|r�r�r|rxrxrxr|rxrxrxrxrxrxrxrxrxrxr|rxrxr�r�rxrxrxrxrxrxrxrxrxr|rxr|r�r�r�r�r�r�r�r�r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rxrxrxr|r|rxrxrxrxrxrxrxrxrxr|rxrxr�r�rxrxrxrxrxrxrxrxr|rxrxr�r�r�r�r�r�r�r�r�r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rxr|rxr|rxrxrxrxrxrxrxrxrxrxrxrxrxr�r�rxrxrxrxrxrxrxrxrxrxrxr|r�r�r�r�r�r�r�r�r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rxrxrxrxrxrxrxrxrxrxrxrxrxrxr|rxrxr�r�rxrxrxrxrxrxrxrxrxrxrxr|r�r�r�r�r�r�r�r�r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rxrxrxrxrxrxrxrxr|r|r|rxr�rxrxr�rxrxrxrxr|r|rxr�r|r|r|rxrxr|r�r�r�r�r�r�r�r�r�r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rxrxrxrxrxrxr|rxrxrxrxrxr|r|rxr|rxrxrxr|r�r|r|r�r�r|r|r|r|r|r|r�r�r�r�r�r�r�r�r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r|r�r�r�r�r�r�r�r�r�rxrxrxrxrxrxrxrxrxrxrxrxrxrxrxrxrxrxrxr|r|r|rxr�r|r|rxrxr|r�r|r�r�r�r�r�r�r�r�r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r|r�r�r�r�r�r�r�r�r�rxrxrxrxrxrxr|rxrxr�r|rxr|r|rxr|rxrxrxrxr|r|rxr�rxr|r|rxr�r�r�r�r�r�r�r�r�r�r�r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rxrxrxrxrxrxrxrxr|r|rxrxrxrxrxr|rxrxrxrxr|r|r|r�rxrxrxr|r|r|r|r�r�r�r�r�r�r�r�r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rxrxrxrxrxrxrxrxrxrxr|rxr|rxrxrxrxrxrxr|rxr|r|r�r�rxr|r�r|r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r|r�r�r�r�r�r�r�r�r�rxrxrxrxrxrxrxrxrxrxrxr|r�r�rxr�r�r�r�r�r|r�r�r�r|r|r|r�r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rxrxrxrxrxrxr|rxrxr|r|r|r|r�rxr|rxr|rxr|r�r|r|r�r�r�r|r�r|r�r|r�r�r�r�r�r�r�r�r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rxrxrxrxrxrxrxrxrxrxrxrxr|r|rxr|rxrxrxr|r|r|r|r�r|r|r|r|rxr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r|r�r�r�r�r�r�r�r�r�rxr|rxr|r|rxrxrxrxrxrxrxrxrxr�rxr|r�r�rxrxrxrxr|rxrxrxrxr|rxr|r�r�r�r�r�r�r�r�r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r|rxrxrxrxrxr|r|rxrxr�r|r�r�rxr|rxr|rxr�r�r�r|r�r�r�r�r�r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rxr�rxr�r|rxrxrxrxr|rxrxrxrxr�r|r|r�r�r|rxr|r|r|rxr|rxr|r|rxr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rxr|rxr�r|rxr|rxr�r�r|rxrxr|r�r|rxr�r�r|rxr|r|r�r�rxr�rxr|r|r�r�r�r�r�r�r�r�r�r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rxr�rxr�r|rxrxrxrxrxrxrxrxr|r�rxr|r�r�r|r|rxrxrxr|rxrxr�r|r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rxrxrxrxrxrxr|r|rxrxr|r|r|rxrxr�r�r�r�r�r�r�r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rxrxrxrxrxrxr|r|rxrxrxrxrxrxrxr�rxr|rxrxr|rxr|r�r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r|r�r�r�r�r�r�r�r�r�rxrxrxrxrxrxr|rxrxrxr|r|r|r|rxr�rxr|rxr�r�r|r�r�r|r|r|r|r�rxr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r|r�r�r�r�r�r�r�r�r�r|r|rxrxrxrxrxr�r|r|r�rxr�r�rxr�r�rxr�r�r�r�r�r�r�r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rxr|r|r�r�rxrxrxr|r|r�r|r|rxr�r�r|r�r�r|r|r�rxr�r�r|r�r�r|r�r�r�r�r�r�r�r�r�r�r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rxr|rxrxrxrxr�r|r|r|r�r|r�rxrxr�r�r|r�r|r�r|r|r�r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r|rxrxrxrxrxr|r�rxr|r|
rxr|r�rxr|r�rxr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rxrxr|rxrxrxr|r|r|rxrxr�r|r�r|r�r�r�r�r�r�r�r�r�r�r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rxr�r�r|r�r|rxrxr|r|r�r|r|rxr�r|r�r�r�r|r|rxr|r�r|r|r|r|r|rxr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rxrxrxrxrxr�r�r�r�r�r|r|r�r�rxr�r�r�rxr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r|r�rxrxrxr|r�r�r�r|r�r�r�r�r|r�r�r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r|r�r�r�r�r|rxr|r|r|r�r|r|r|r�r|r�r�r�r�r�r�r�r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r|rxrxrxrxr�r�r�r�r�r�r�r�r�rxr�r�r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r|r�r�r�r�r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r|r|rxr|r|r|rxr�r|r|r|r|r|r|r|r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r|r�r�r�r�r�r�r�r�r�r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rxrxrxr|r|r|r|rxr|r|r�r�r|r|r|r|r�r�rxr�r|r�r|r�r�r�r�r�r�r�r|r�r�r�r�r�r�r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r|r�r�r�r�rxr|r|r|r|r�r|r�r|r�r|r�r�r�r|r�r|r|r�r�r|r|r�r�r�r�r|r�r�r�r�r�r�r�r�r�r|r�r�r�r|r�r�r�r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rxr|r|r|r�r�r�r|rxr�r�r�r�r|r�r|r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r|rxr|rxr|r�r|r|r|r|r�r�r�r�r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r|r�r�r�r|r�r�r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rxr�r�r�r�r|r|r|rxr|r|r|r|r|r|r|r�r�r�r|r�r|r�r�r�r|r|r�r�r�r�r|r�r�r�r�r�r�r�r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r|rxr|rxrxr|r�r�r�r�r�r�r�r�r|r�r�r�rxr�r�r�r�r�r�r�r�r�r�r�r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r|rxrxrxrxr�r|r|r|r|r�r�r�r�r|r�r�r�r|r�r�r�r�r�r�r�r�r�r�r�r|r�r�r�r�r�r�r�r�r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r|rxrxrxr|r�r�r�r�r|r|r�r�r�r|r�r�r�rxr�r�r�r�r�r�r�r�r�r�r�r|r�r�r�r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rxr|rxr|r�r�r�r|r|r|r�r�r�r|r�r�r�r�r�r�r�r�r�rxr�r�r�r�r�r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rxr�r|r�r�r|r|r|r|r|r|r�r|r|r�r�r�r�r�r|r|r|r�r�r�r�r|r�r�r�r�r�r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rxr�r�r�r�r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rxrxr|r|r�r�r�r|r|r�r�r�r�r|r�r�r�r|r�r�r�r�r�r�r�r�r�r�r�r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r|rxr|rxr|r�r�r�r�r�r�r�r�r�r|r�r|r�r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r|r�r�r|r�r�r�r�r|r�r�r�r�r�r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rxr|r|r|r�r�r�r|r|r�r�r�r|r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r|rxr�r|r|r|r|r|r|r�r�r�r�r�r�r�r�r�r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r|r�r|r�r�r�r�rxr|r�r|r�r|r|r�r�r�r�r�r|r�r�r�r|r�r�r�r�r�r|r�r|r|r�r�r�r�r�r�r�r�r�r�r�r�r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r|r|r|r|r�r�r�r�r�r�r�r�r�r|r�r�r�r�r�r�r�r�r�r�r�r�r|r�r�r|r�r�r�r�r�r�r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r|r�r|r|r|r�rxr�r�r�r�r�r�r�r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r|r|rxr|r|r�r�r�r�r|r�r�r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r|r|r�r�r|r�r|r|r|r|r|r�r|r|r�r�r�r�r�r�r|r|r|r�r|r�r�r�r�r�r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r
�r�r�r�r�r|r|r|r|r�r�r�r|r|r�r�r�r�r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r|r�r�r�r�r�r�r�r�r�r�r�r�r�r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r|r|r|r|r�r�r�r|r|r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r|r|r|r|r�r�r�r�r�r�r�r�r�r|r�r�r�r�r�r�r�r�r�r�r�r�r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r|r|r|r�r�r|r�r|r�r�r�r�r�r|r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r|r�r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r|r�r�r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r|r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r|r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�)
ZKOI8R_char_to_order_mapZwin1251_char_to_order_mapZlatin5_char_to_order_mapZmacCyrillic_char_to_order_mapZIBM855_char_to_order_mapZIBM866_char_to_order_mapZRussianLangModelZ
Koi8rModelZWin1251CyrillicModelZLatin5CyrillicModelZMacCyrillicModelZIbm866ModelZIbm855Model�r�r��'/usr/lib/python3.6/langcyrillicmodel.py�<module>s
site-packages/pip/_vendor/chardet/__pycache__/euckrprober.cpython-36.pyc000064400000002031147511334570022311 0ustar003

���e��@sDddlmZddlmZddlmZddlmZGdd�de�ZdS)�)�MultiByteCharSetProber)�CodingStateMachine)�EUCKRDistributionAnalysis)�EUCKR_SM_MODELcs4eZdZ�fdd�Zedd��Zedd��Z�ZS)�EUCKRProbercs,tt|�j�tt�|_t�|_|j�dS)N)	�superr�__init__rrZ	coding_smrZdistribution_analyzer�reset)�self)�	__class__��!/usr/lib/python3.6/euckrprober.pyr#s
zEUCKRProber.__init__cCsdS)NzEUC-KRr)r
rrr
�charset_name)szEUCKRProber.charset_namecCsdS)NZKoreanr)r
rrr
�language-szEUCKRProber.language)�__name__�
__module__�__qualname__r�propertyrr�
__classcell__rr)rr
r"srN)	ZmbcharsetproberrZcodingstatemachinerZchardistributionrZmbcssmrrrrrr
�<module>ssite-packages/pip/_vendor/chardet/__pycache__/version.cpython-36.opt-1.pyc000064400000000550147511334570022416 0ustar003

���e��@sdZdZejd�ZdS)z�
This module exists only to simplify retrieving the version number of chardet
from within setup.py and from chardet subpackages.

:author: Dan Blanchard (dan.blanchard@gmail.com)
z3.0.4�.N)�__doc__�__version__�split�VERSION�rr�/usr/lib/python3.6/version.py�<module>ssite-packages/pip/_vendor/chardet/__pycache__/mbcssm.cpython-36.pyc000064400000042131147511334570021257 0ustar003

���e�c�@sl	ddlmZdZejejejdejejejejejejejejejejejejejejejejejejejejfZdZedeedd�Zd Z	ejejdejejejddejd	ejejejejejejejejejejejejejejejejejejejejejejejejejejejejejejejejejejejejejejejejejejejejejejejejejejejejejejejejejejejejfFZ
d!Ze	d
e
edd�Zd"Z
ddddejejejejejejejejejejejejejejejejejejejejejejejejejejdejdejejejejejejejf(Zd#Ze
d	eedd�Zd$ZejejdejejejejejejejejejejejejejfZd%Zedeedd�Zd&Zejejejddddejejejejejejejejejejejejejejejejejejejejejejejejejdejejejejejejejejejejejejejejejf0Zd'Zed
eedd�Zd(Zejejejejejejdejejejejejejejejejejejejejejejejejdejejejejejejejejejdejejejejejejejejejejejejejf0Zd)Zed
eedd�Zd*ZejejejdejejejejejejejejejejejejejejejejejejejejfZd+Zed	eedd�Z d,Z!dd
d
ejddejejejejejejejejejejejejd	d	d	d	ejejd	d	d	d	d	ejd	d	d	d	d	d	dd
d
ejddd	d	ejd	d	d	d	d	d	d	ejejejejf8Z"d-Z#e!d	e"e#dd�Z$d.Z%d	d	d
d	ddejejejejejejejejejejejejdddejejejdddejdejd	d	d
d	dddddejdddejejejdddddejdejejejf8Z&d/Z'e%d	e&e'dd�Z(d0Z)ejejejejejejdd
dddd
d	dddejejejejejejejejejejejejejejejejejejejejejejejejejejejejejejejejejejddddejejejejejejejejejejejejejdddejejejejejejejejejejejejd
d
d
d
ejejejejejejejejejejejejejejd
d
ejejejejejejejejejejejejddddejejejejejejejejejejejejejejejdejejejejejejejejejejejejddddejejejejejejejejejejejejejejejdejejejejejejejejejejejejdddejejejejejejejejejejejejejejejejejejejejejejejejejejejf�Z*d1Z+e)de*e+dd�Z,dS)2�)�MachineState�����ZBig5)Zclass_tableZclass_factorZstate_tableZchar_len_table�name����	�
ZCP949zEUC-JPzEUC-KRzx-euc-twZGB2312Z	Shift_JISzUTF-16BEzUTF-16LE���
���zUTF-8N(rrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr)rrrrr(rrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r
r
r
r
r
r
r
r
r
r
r
r
rrrr
r
r
r
r
r
r
r
r
r
r
r
r
r
r
r
r
r
r
r
r
r
rrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr)
rrrrrrrrrr(rrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr)rrrrrr(rrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr)rrrr(rrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr	rrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr)rrrrrrr(rrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r)rrrrrrr(rrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr)rrrrrr(rrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr)rrrrrr(rrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr)rrrrrr(rrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r	r
rrrrrrrrrrrrrrrr
rrrrrrrrrrrrrrr)rrrrrrrrrrrrrrr	r	)-ZenumsrZBIG5_CLSZERRORZSTARTZITS_MEZBIG5_STZBIG5_CHAR_LEN_TABLEZ
BIG5_SM_MODELZ	CP949_CLSZCP949_STZCP949_CHAR_LEN_TABLEZCP949_SM_MODELZ	EUCJP_CLSZEUCJP_STZEUCJP_CHAR_LEN_TABLEZEUCJP_SM_MODELZ	EUCKR_CLSZEUCKR_STZEUCKR_CHAR_LEN_TABLEZEUCKR_SM_MODELZ	EUCTW_CLSZEUCTW_STZEUCTW_CHAR_LEN_TABLEZEUCTW_SM_MODELZ
GB2312_CLSZ	GB2312_STZGB2312_CHAR_LEN_TABLEZGB2312_SM_MODELZSJIS_CLSZSJIS_STZSJIS_CHAR_LEN_TABLEZ
SJIS_SM_MODELZ
UCS2BE_CLSZ	UCS2BE_STZUCS2BE_CHAR_LEN_TABLEZUCS2BE_SM_MODELZ
UCS2LE_CLSZ	UCS2LE_STZUCS2LE_CHAR_LEN_TABLEZUCS2LE_SM_MODELZUTF8_CLSZUTF8_STZUTF8_CHAR_LEN_TABLEZ
UTF8_SM_MODEL�rr�/usr/lib/python3.6/mbcssm.py�<module>sh $ (((((,  "$   $  $ $                $site-packages/pip/_vendor/chardet/__pycache__/cp949prober.cpython-36.pyc000064400000002030147511334570022047 0ustar003

���e?�@sDddlmZddlmZddlmZddlmZGdd�de�ZdS)�)�EUCKRDistributionAnalysis)�CodingStateMachine)�MultiByteCharSetProber)�CP949_SM_MODELcs4eZdZ�fdd�Zedd��Zedd��Z�ZS)�CP949Probercs,tt|�j�tt�|_t�|_|j�dS)N)	�superr�__init__rrZ	coding_smrZdistribution_analyzer�reset)�self)�	__class__��!/usr/lib/python3.6/cp949prober.pyr#s
zCP949Prober.__init__cCsdS)NZCP949r)r
rrr
�charset_name+szCP949Prober.charset_namecCsdS)NZKoreanr)r
rrr
�language/szCP949Prober.language)�__name__�
__module__�__qualname__r�propertyrr�
__classcell__rr)rr
r"srN)	ZchardistributionrZcodingstatemachinerZmbcharsetproberrZmbcssmrrrrrr
�<module>ssite-packages/pip/_vendor/chardet/__pycache__/langhebrewmodel.cpython-36.opt-1.pyc000064400000055445147511334570024105 0ustar003

���eQ,�@sd�Zd�Zeed�d�d�d�d��Zd�S)������E�[�O�P�\�Y�a�Z�D�o�p�R�I�_�U�N�y�V�G�C�f�k�T�r�g�s�2�J�<�=�*�L�F�@�5�i�]�8�A�6�1�B�n�3�+�,�?�Q�M�b�K�l�|���������(�:���������������������S�4�/�.�H� �^���q���m���������"�t���v�d�����u�w�h�}�����W�c���j�z�{���7�����e�����x���0�'�9����;�)�X�!�%�$���#���>����~�����&�-��������������������	������������������
�������
����`�gC��|�?Fzwindows-1255ZHebrew)Zchar_to_order_mapZprecedence_matrixZtypical_positive_ratioZkeep_english_letterZcharset_nameZlanguageN(rrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr	r
rrr
rrrrrrrrrrrrrrrrrrrrrrrrr r!r"r#r$r%r&r'r(r)r*r+r,r-r.r/r0r1r2r3r4r5r6r7r8rrrrrr9r:r;r<r=r>r?r@rArBrCrDrErFrGrHrIrJrKrLrMrNrOrPrQrRrSrTrUrVrWrXrYrZr[r\r]r^r_r`rarbrcrdrerfrgrhrirjrkrlrmrnrorprqrrrsrtrurvrwrxryrzr{r|r}r~rr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rr�r�r(r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�
r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r
�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�)ZWIN1255_CHAR_TO_ORDER_MAPZHEBREW_LANG_MODELZWin1255HebrewModel�r�r��%/usr/lib/python3.6/langhebrewmodel.py�<module>&s*
site-packages/pip/_vendor/chardet/__pycache__/langgreekmodel.cpython-36.pyc000064400000057637147511334570022774 0ustar003

���e�1�@s4d�Zd�Zd�Zeed�d�d�d�d��Zeed�d�d�d�d��Zd�S)������R�d�h�^�b�e�t�f�o��u�\�X�q�U�O�v�i�S�C�r�w�_�c�m��H�F�P�Q�<�`�]�Y�D�x�a�M�V�E�7�N�s�A�B�:�L�j�g�W�k�p���Z�J���=�$�.�G�I�6�l�{�n��3�+�)�"�[�(�4�/�,�5�&�1�;�'�#�0��%�!�-�8�2�T�9�y�����|������ �
�����
����	��������*��@�K����g���s�?Fz
ISO-8859-7ZGreek)Zchar_to_order_mapZprecedence_matrixZtypical_positive_ratioZkeep_english_letterZcharset_nameZlanguagezwindows-1253N(rrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr	r
rrr
rrrrrrrrrrrrrrrrrrrrrrrrr r!r"r#r$r%r&r'r(r)r*r+r,r-r.r/r0r1r2r3r4r5r6r7r8rrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr9r:rrrrrrrrrrr;rrrrrrr<r=r>r?r@rArBrrCrrDrErFrGrHrIrJrKrLrMrNrOrPrQrRrSrTrUrVrWrXrYrZr[r\r]r^r_r(r`rarbrcrdrerfrgrhrirjrkrlrmrnrorprqrrrsrtrurvrwrxryrzr{r|r}r~rr�r�r�r�r(rrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr	r
rrr
rrrrrrrrrrrrrrrrrrrrrrrrr r!r"r#r$r%r&r'r(r)r*r+r,r-r.r/r0r1r2r3r4r5r6r7r8rrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr9r>rrrrrrrrrrr;rrrrrrr<rrr?r@rArBrrCrrDrErFrGrHrIrJrKrLrMrNrOrPrQrRrSrTrUrVrWrXrYrZr[r\r]r^r_r(r`rarbrcrdrerfrgrhrirjrkrlrmrnrorprqrrrsrtrurvrwrxryrzr{r|r}r~rr�r�r�r�r(r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rjryryrjrjrjrjrjrjrjrjrfrjrjrjr�ryryrjrjr�rjr�rjryr�rjrjrjr�rjr�r�r�ryr�r�r�r�r�ryr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rjrjrjrjrjr�rjrjr�rjryrjrjr�rjryrjrjrjr�r�rjr�rjr�rjrjryr�r�r�ryr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�ryr�r�r�r�r�r�r�r�r�ryrjryryrjrjrjrjrjrjrjrjr�rjrjrjrjr�ryrjrjr�rjrjrjrjryrjrjrjr�ryr�r�r�ryr�r�r�r�r�ryr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�ryrjrjryrjrjrjrjrjrjrjrjrjrjrjrjr�ryrfrjrjrjrjryrjrjryrjrjryr�r�r�r�r�ryr�r�r�r�r�ryr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rjrjrjrjr�rjrjrjrjrjrjr�rjrjr�rjrjrjrjrjrjrjrjrjrjr�rjryrjrjr�ryr�rfr�ryr�r�r�r�r�ryr�r�r�r�r�r�r�r�r�r�r�r�rfr�r�r�r�r�r�r�r�r�rjrjrjrjrjryrjr�r�r�r�rjrjr�rjrfrjrjrjr�rjrjr�rjrjrjrjr�r�r�r�ryr�r�r�ryr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rjrjrjrjrjr�rjr�rjrjrjrjrjr�rjryryryrjr�ryrjrjrjrjrjryrjrjr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rjrjrjrjrjrjryryryrjrjrjrjr�rjrfrjrjrjrjryrjrjrjrjrjrjrjryryr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rjrjrjrjrjryr�rjr�r�r�rjrjryrjrjrjrjrjr�r�rjryrjr�ryrjr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rjr�rjrjrjrjr�r�rjrjr�ryrjr�rjr�rjrjrjr�r�rjr�rjr�ryryrjrjr�r�r�r�rfr�r�r�r�r�r�r�ryr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rjrjrjrjrjryr�rjryrjrjrjrjr�rjrjrjrjrjr�rjrjryrjryrjrjryr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rjrjryrjryrjrjrjrjrjrjr�ryrjryrjryryryrjryrjrjryrjr�ryryryrjr�ryr�r�r�r�r�r�r�r�r�ryr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rjr�r�r�rjrjrjryrjrjr�r�rjr�rjr�r�r�rjryr�rjr�rjr�r�ryr�ryr�r�r�r�r�ryr�r�r�r�r�ryr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�ryr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rjrjrjrjr�rjrjrjrjrjrjr�rjrjr�rjr�r�r�rjrjr�rjrjrjr�r�rfryrjr�rjr�r�r�r�r�r�r�r�r�ryr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rjrjrjrjrjryr�r�rjryryrjrjr�rjrjrjrjrjryrfrjr�rjryrjrjryrfr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rjrjr�ryrjrjrjrjrjrjr�r�rjr�rjr�r�r�rjrjr�rjryrjr�r�rjrjrjr�rjr�r�r�ryr�r�r�r�r�rjr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rjrjrjrjr�rjrjrjrjrjrjr�r�rjr�rjr�r�r�rjryr�rjryrjr�r�rjryrjr�ryr�r�r�r�r�r�r�r�r�rjr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rjrfryryrjrjrjrjrjrjr�ryrjr�rjr�r�r�rjrjr�rjr�ryr�r�ryrjrfr�ryr�r�r�r�r�r�r�r�r�ryr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rjr�rjrjrjrjr�rjr�rjrjryrjr�rjrjrjrjrjrjr�rjrjrjr�ryrjr�r�rjr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rjr�rjrjrjr�r�rjr�r�r�rjrjr�rjr�ryrjrjr�r�rjr�rjr�rjrjr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rjr�r�r�rjrjrjrjrjrjr�r�rjr�ryr�r�r�rjrjr�rjr�rjr�r�ryr�ryr�r�r�r�r�rfr�r�r�r�r�ryr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rjrjrjrjrjrjr�rjr�ryr�rjryr�rjryrjryrjr�r�rjryrjryrjrjr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rjr�r�ryrjrjrjrjrjr�r�r�rjr�ryrfr�r�rjryryryr�rjr�r�ryryr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rjr�rjrjrjryr�rjr�rjr�rjrjr�ryrfryrjrjr�r�rjr�rjr�rjrjr�r�r�r�r�r�r�r�r�
r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�ryrjrjrjr�rjrjrjrjrjrjr�ryrjr�rjr�r�r�ryrfr�ryryrjr�r�ryryryr�r�r�r�r�r�r�r�r�r�r�ryr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rjr�r�ryrjrjrjryrjr�r�rfrjr�ryr�r�r�r�rjr�rfr�ryr�r�rfrfrfr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rjrjrjrjrjrfr�rjr�r�r�rjryr�rjryrjrjrjr�r�rjr�rjryryryrfr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rjr�rjrjrjr�r�rjr�r�r�r�ryr�ryrjrjryryryryrjr�ryr�ryryr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rjrjrjrjryr�r�r�r�r�r�ryrjr�ryr�ryrjryr�r�rjr�rjr�rjrfr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rjryrjrjryryrjr�ryr�rjr�r�r�ryr�r�r�r�rfryr�ryr�ryr�r�ryr�ryr�ryryr�r�rfr�ryryryr�ryryryr�ryryryr�r�ryr�r�rfr�r�r�r�r�ryr�rjrjryr�r�r�r�r�r�rfrjr�ryr�ryryryr�r�ryr�rjr�r�ryr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rjr�ryrjryr�ryryr�ryr�ryryr�ryr�ryryryr�r�r�r�r�r�ryrjr�r�r�ryr�rfryr�r�r�r�ryryr�r�r�ryrfr�ryryr�r�r�r�r�r�rfr�ryr�r�r�r�r�r�r�r�ryrfr�ryrjryryrjryrjryr�r�rjrjrjr�r�rjryr�r�r�rfrfr�ryr�ryryr�ryr�ryr�ryryr�r�ryr�ryryryr�ryryryryr�r�ryr�r�r�ryr�rfr�r�r�r�r�rjr�rjrjryryr�rjr�r�r�ryryr�ryryryrfryr�r�rfryryr�r�rjr�r�r�ryr�rfryr�r�r�rfryr�r�r�r�r�r�r�ryryr�rfr�r�ryr�r�r�ryr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�ryrjrjryryr�r�r�ryr�ryrjrjr�ryr�r�r�r�r�r�ryryryr�ryryr�ryr�ryr�ryryr�r�ryryryryrfr�r�ryryr�ryr�r�ryr�r�r�r�r�r�ryr�r�r�r�r�r�r�ryr�rjryrjr�r�r�rjr�r�ryryr�ryr�ryryryr�r�ryr�r�r�r�r�r�r�r�ryr�r�ryryr�r�ryryryr�r�r�r�r�r�ryr�r�r�ryr�r�r�r�r�r�r�r�r�r�r�r�r�r�ryr�r�rjryr�ryryryryryr�r�r�ryr�r�r�r�ryr�rfr�r�ryr�rfr�r�r�r�ryryryr�ryryr�rfryr�ryryryr�ryryryryrfryryr�r�ryr�r�r�r�r�r�r�r�r�r�r�r�r�rfr�r�r�r�r�r�r�r�r�ryr�r�r�r�r�r�rfr�r�r�r�r�r�r�r�r�ryr�ryr�ryryr�r�r�r�rfryrfr�r�ryryr�r�ryr�r�r�r�r�r�r�r�r�r�r�r�r�r�rjryrjr�r�ryr�r�r�ryryr�ryr�r�r�rfr�r�ryr�ryr�ryryr�r�r�r�r�r�ryr�r�r�r�ryryr�r�r�r�r�r�ryr�r�r�r�r�r�r�r�r�ryr�r�r�r�r�r�r�ryryrjryryr�r�r�r�r�r�rfrjr�ryr�ryryr�r�r�rfr�ryr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�ryr�ryr�rjryr�ryr�r�r�r�r�r�ryryr�r�r�r�r�r�r�r�r�r�r�r�r�r�rfr�r�ryr�r�r�r�rfrfr�r�ryrfryr�ryryr�rfr�r�rfr�r�r�ryr�r�r�r�r�r�r�rjr�ryryryr�r�ryr�r�r�ryr�r�r�ryrjr�ryr�r�r�r�r�r�ryryr�r�r�ryr�rfryr�r�r�rfryryrfr�r�r�ryr�r�ryr�r�r�r�r�r�r�r�rfr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rjr�r�r�r�r�r�ryr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�ryrfryr�ryryr�ryr�r�ryr�r�r�r�rfryrfr�ryrfr�r�r�r�r�r�r�r�r�r�r�r�ryr�r�r�rjrfryryr�ryr�r�r�r�ryr�r�r�ryr�r�rjr�r�r�r�ryryryr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�ryrfr�ryr�rfryr�r�r�r�r�r�r�r�r�r�r�r�r�ryr�r�rfr�r�r�r�r�r�ryr�ryryr�r�ryryryryryr�rfryr�r�r�ryryr�rfr�ryr�r�ryryr�r�r�r�r�r�r�r�r�r�rfr�r�r�r�r�r�r�rjr�r�ryr�r�r�r�r�r�r�r�ryr�ryr�r�r�r�ryr�rfryr�r�r�r�ryryrfr�rfr�rfr�ryryryrfr�r�r�r�r�r�rfr�r�r�r�r�r�r�ryr�rfryr�r�r�r�r�r�r�r�r�r�ryr�r�ryryr�r�r�r�rfr�r�r�r�r�r�ryr�ryryr�r�r�r�ryryr�r�r�r�r�r�ryr�r�r�r�r�r�r�r�r�ryr�r�ryr�r�r�r�ryryryryr�r�r�rjr�r�r�r�r�r�r�r�ryr�r�r�r�r�r�ryr�r�r�r�r�r�rfr�r�ryr�r�r�r�rfryr�r�r�r�r�r�ryryrfrfr�r�r�r�r�r�rfr�r�r�r�r�r�r�ryr�ryryryr�r�ryr�r�r�r�r�r�r�ryryryr�r�r�ryr�r�r�r�r�r�r�r�ryr�r�rfr�r�r�r�ryrfr�r�r�r�r�r�rfr�r�r�r�r�rfr�r�r�r�r�r�r�r�r�r�r�rjr�ryr�r�r�r�r�r�r�r�ryr�r�r�r�r�ryr�r�r�r�r�r�r�ryr�r�r�r�ryr�r�ryr�r�r�r�ryryr�r�r�r�rfr�r�rfr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�ryr�ryryrfr�r�r�r�r�r�ryr�r�ryr�ryryryr�r�r
�r�r�r�ryr�r�r�r�ryr�r�ryr�r�ryr�ryryr�r�r�r�ryr�ryr�r�r�r�r�ryr�r�r�ryr�r�r�r�r�r�r�r�rjr�r�r�ryryr�ryryr�r�r�r�r�ryr�r�r�r�r�r�ryr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rfr�r�r�r�r�rfr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rfr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�ryr�r�ryr�r�r�r�r�r�ryryryryryr�r�r�r�r�r�ryr�r�r�r�r�r�r�r�r�r�r�r�r�rfrfr�r�r�rfr�r�r�r�r�r�r�ryrfr�r�r�r�r�r�ryr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�ryryr�r�r�r�r�ryr�r�r�r�r�r�r�r�rfr�r�r�r�r�r�r�r�r�ryr�r�r�ryr�r�r�r�r�rfr�r�r�r�ryryr�r�r�rfr�r�r�r�r�r�r�r�r�r�r�r�r�r�rfr�r�r�r�r�r�r�r�r�r�r�r�ryr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rfr�r�rfr�ryr�r�r�r�ryr�ryr�r�r�r�r�r�r�r�r�r�r�r�r�r�ryr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rfr�r�r�r�rfrfr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rfr�r�r�r�r�r�r�r�r�r�r�r�r�rfr�r�r�r�rfr�r�ryr�ryr�r�r�r�r�r�r�r�r�r�r�ryrfr�r�r�r�r�r�ryr�r�r�rfryr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�)ZLatin7_char_to_order_mapZwin1253_char_to_order_mapZGreekLangModelZLatin7GreekModelZWin1253GreekModel�r�r��$/usr/lib/python3.6/langgreekmodel.py�<module>#sV
site-packages/pip/_vendor/chardet/__pycache__/charsetprober.cpython-36.opt-1.pyc000064400000006456147511334570023607 0ustar003

���e��@s0ddlZddlZddlmZGdd�de�ZdS)�N�)�ProbingStatec@sneZdZdZddd�Zdd�Zedd��Zd	d
�Zedd��Z	d
d�Z
edd��Zedd��Z
edd��ZdS)�
CharSetProbergffffff�?NcCsd|_||_tjt�|_dS)N)�_state�lang_filter�loggingZ	getLogger�__name__Zlogger)�selfr�r
�#/usr/lib/python3.6/charsetprober.py�__init__'szCharSetProber.__init__cCstj|_dS)N)rZ	DETECTINGr)r	r
r
r�reset,szCharSetProber.resetcCsdS)Nr
)r	r
r
r�charset_name/szCharSetProber.charset_namecCsdS)Nr
)r	�bufr
r
r�feed3szCharSetProber.feedcCs|jS)N)r)r	r
r
r�state6szCharSetProber.statecCsdS)Ngr
)r	r
r
r�get_confidence:szCharSetProber.get_confidencecCstjdd|�}|S)Ns([-])+� )�re�sub)rr
r
r�filter_high_byte_only=sz#CharSetProber.filter_high_byte_onlycCsbt�}tjd|�}xJ|D]B}|j|dd��|dd�}|j�rP|dkrPd}|j|�qW|S)u9
        We define three types of bytes:
        alphabet: english alphabets [a-zA-Z]
        international: international characters [€-ÿ]
        marker: everything else [^a-zA-Z€-ÿ]

        The input buffer can be thought to contain a series of words delimited
        by markers. This function works to filter all words that contain at
        least one international character. All contiguous sequences of markers
        are replaced by a single space ascii character.

        This filter applies to all scripts which do not use English characters.
        s%[a-zA-Z]*[�-�]+[a-zA-Z]*[^a-zA-Z�-�]?Nr��r���r)�	bytearrayr�findall�extend�isalpha)r�filteredZwordsZwordZ	last_charr
r
r�filter_international_wordsBs
z(CharSetProber.filter_international_wordscCs�t�}d}d}x�tt|��D]r}|||d�}|dkr>d}n|dkrJd}|dkr|j�r||kr�|r�|j|||��|jd�|d}qW|s�|j||d	��|S)
a�
        Returns a copy of ``buf`` that retains only the sequences of English
        alphabet and high byte characters that are not between <> characters.
        Also retains English alphabet and high byte characters immediately
        before occurrences of >.

        This filter can be applied to all scripts which contain both English
        characters and extended ASCII characters, but is currently only used by
        ``Latin1Prober``.
        Frr�>�<TrrN)r�range�lenrr)rrZin_tag�prevZcurrZbuf_charr
r
r�filter_with_english_lettersgs"
z)CharSetProber.filter_with_english_letters)N)r�
__module__�__qualname__ZSHORTCUT_THRESHOLDrr
�propertyrrrr�staticmethodrrr$r
r
r
rr#s
%r)rrZenumsr�objectrr
r
r
r�<module>ssite-packages/pip/_vendor/chardet/__pycache__/escprober.cpython-36.pyc000064400000004742147511334570021765 0ustar003

���en�@sXddlmZddlmZddlmZmZmZddlm	Z	m
Z
mZmZGdd�de�Z
dS)�)�
CharSetProber)�CodingStateMachine)�LanguageFilter�ProbingState�MachineState)�HZ_SM_MODEL�ISO2022CN_SM_MODEL�ISO2022JP_SM_MODEL�ISO2022KR_SM_MODELcsVeZdZdZd�fdd�	Z�fdd�Zedd��Zed	d
��Zdd�Z	d
d�Z
�ZS)�EscCharSetProberz�
    This CharSetProber uses a "code scheme" approach for detecting encodings,
    whereby easily recognizable escape or shift sequences are relied on to
    identify these encodings.
    Ncs�tt|�j|d�g|_|jtj@rD|jjtt	��|jjtt
��|jtj@r`|jjtt��|jtj
@r||jjtt��d|_d|_d|_d|_|j�dS)N)�lang_filter)�superr�__init__�	coding_smrrZCHINESE_SIMPLIFIED�appendrrrZJAPANESEr	ZKOREANr
�active_sm_count�_detected_charset�_detected_language�_state�reset)�selfr)�	__class__��/usr/lib/python3.6/escprober.pyr*szEscCharSetProber.__init__csNtt|�j�x"|jD]}|s qd|_|j�qWt|j�|_d|_d|_dS)NT)	r
rrr�active�lenrrr)rr)rrrr:szEscCharSetProber.resetcCs|jS)N)r)rrrr�charset_nameEszEscCharSetProber.charset_namecCs|jS)N)r)rrrr�languageIszEscCharSetProber.languagecCs|jr
dSdSdS)Ng�G�z��?g)r)rrrr�get_confidenceMszEscCharSetProber.get_confidencecCs�x�|D]�}x�|jD]�}|s|jr&q|j|�}|tjkrhd|_|jd8_|jdkr�tj|_|j	Sq|tj
krtj|_|j�|_
|j|_|j	SqWqW|j	S)NFr�)rrZ
next_staterZERRORrrZNOT_MEr�stateZITS_MEZFOUND_ITZget_coding_state_machinerrr)rZbyte_str�crZcoding_staterrr�feedSs"





zEscCharSetProber.feed)N)�__name__�
__module__�__qualname__�__doc__rr�propertyrrrr"�
__classcell__rr)rrr#srN)Z
charsetproberrZcodingstatemachinerZenumsrrrZescsmrrr	r
rrrrr�<module>ssite-packages/pip/_vendor/chardet/__pycache__/gb2312prober.cpython-36.pyc000064400000002041147511334570022101 0ustar003

���e��@sDddlmZddlmZddlmZddlmZGdd�de�ZdS)�)�MultiByteCharSetProber)�CodingStateMachine)�GB2312DistributionAnalysis)�GB2312_SM_MODELcs4eZdZ�fdd�Zedd��Zedd��Z�ZS)�GB2312Probercs,tt|�j�tt�|_t�|_|j�dS)N)	�superr�__init__rrZ	coding_smrZdistribution_analyzer�reset)�self)�	__class__��"/usr/lib/python3.6/gb2312prober.pyr"s
zGB2312Prober.__init__cCsdS)NZGB2312r)r
rrr
�charset_name(szGB2312Prober.charset_namecCsdS)NZChineser)r
rrr
�language,szGB2312Prober.language)�__name__�
__module__�__qualname__r�propertyrr�
__classcell__rr)rr
r!srN)	ZmbcharsetproberrZcodingstatemachinerZchardistributionrZmbcssmrrrrrr
�<module>ssite-packages/pip/_vendor/chardet/__pycache__/big5prober.cpython-36.opt-1.pyc000064400000002021147511334570022764 0ustar003

���e��@sDddlmZddlmZddlmZddlmZGdd�de�ZdS)�)�MultiByteCharSetProber)�CodingStateMachine)�Big5DistributionAnalysis)�
BIG5_SM_MODELcs4eZdZ�fdd�Zedd��Zedd��Z�ZS)�
Big5Probercs,tt|�j�tt�|_t�|_|j�dS)N)	�superr�__init__rrZ	coding_smrZdistribution_analyzer�reset)�self)�	__class__�� /usr/lib/python3.6/big5prober.pyr#s
zBig5Prober.__init__cCsdS)NZBig5r)r
rrr
�charset_name)szBig5Prober.charset_namecCsdS)NZChineser)r
rrr
�language-szBig5Prober.language)�__name__�
__module__�__qualname__r�propertyrr�
__classcell__rr)rr
r"srN)	ZmbcharsetproberrZcodingstatemachinerZchardistributionrZmbcssmrrrrrr
�<module>ssite-packages/pip/_vendor/chardet/__pycache__/mbcharsetprober.cpython-36.pyc000064400000004151147511334570023155 0ustar003

���eU
�@s0ddlmZddlmZmZGdd�de�ZdS)�)�
CharSetProber)�ProbingState�MachineStatecsVeZdZdZd�fdd�	Z�fdd�Zedd��Zed	d
��Zdd�Z	d
d�Z
�ZS)�MultiByteCharSetProberz 
    MultiByteCharSetProber
    Ncs,tt|�j|d�d|_d|_ddg|_dS)N)�lang_filter�)�superr�__init__�distribution_analyzer�	coding_sm�
_last_char)�selfr)�	__class__��%/usr/lib/python3.6/mbcharsetprober.pyr	'szMultiByteCharSetProber.__init__cs<tt|�j�|jr|jj�|jr.|jj�ddg|_dS)Nr)rr�resetrr
r)r
)rrrr-s

zMultiByteCharSetProber.resetcCst�dS)N)�NotImplementedError)r
rrr�charset_name5sz#MultiByteCharSetProber.charset_namecCst�dS)N)r)r
rrr�language9szMultiByteCharSetProber.languagecCsx�tt|��D]�}|jj||�}|tjkrN|jjd|j|j	|�t
j|_Pq|tj
krdt
j|_Pq|tjkr|jj�}|dkr�|d|jd<|jj|j|�q|jj||d|d�|�qW|d|jd<|jt
jkr�|jj�r�|j�|jkr�t
j|_|jS)Nz!%s %s prober hit error at byte %srr���)�range�lenrZ
next_staterZERRORZlogger�debugrrrZNOT_MEZ_stateZITS_MEZFOUND_ITZSTARTZget_current_charlenrr
�feed�stateZ	DETECTINGZgot_enough_data�get_confidenceZSHORTCUT_THRESHOLD)r
Zbyte_str�iZcoding_stateZchar_lenrrrr=s.





zMultiByteCharSetProber.feedcCs
|jj�S)N)r
r)r
rrrrZsz%MultiByteCharSetProber.get_confidence)N)�__name__�
__module__�__qualname__�__doc__r	r�propertyrrrr�
__classcell__rr)rrr"srN)Z
charsetproberrZenumsrrrrrrr�<module>ssite-packages/pip/_vendor/chardet/__pycache__/sjisprober.cpython-36.pyc000064400000004470147511334570022161 0ustar003

���e��@s`ddlmZddlmZddlmZddlmZddlm	Z	ddl
mZmZGdd�de�Z
d	S)
�)�MultiByteCharSetProber)�CodingStateMachine)�SJISDistributionAnalysis)�SJISContextAnalysis)�
SJIS_SM_MODEL)�ProbingState�MachineStatecsPeZdZ�fdd�Z�fdd�Zedd��Zedd��Zd	d
�Zdd�Z	�Z
S)
�
SJISProbercs4tt|�j�tt�|_t�|_t�|_	|j
�dS)N)�superr	�__init__rr�	coding_smr�distribution_analyzerr�context_analyzer�reset)�self)�	__class__�� /usr/lib/python3.6/sjisprober.pyr%s

zSJISProber.__init__cstt|�j�|jj�dS)N)r
r	rr)r)rrrr,szSJISProber.resetcCs|jjS)N)r�charset_name)rrrrr0szSJISProber.charset_namecCsdS)NZJapaneser)rrrr�language4szSJISProber.languagecCsL�xtt|��D]�}|jj||�}|tjkrP|jjd|j|j	|�t
j|_Pq|tj
krft
j|_Pq|tjkr|jj�}|dkr�|d|jd<|jj|jd|d�|�|jj|j|�q|jj||d||d|�|�|jj||d|d�|�qW|d|jd<|jt
jk�rF|jj��rF|j�|jk�rFt
j|_|jS)Nz!%s %s prober hit error at byte %s�r�����)�range�lenrZ
next_staterZERRORZlogger�debugrrrZNOT_MEZ_stateZITS_MEZFOUND_ITZSTARTZget_current_charlenZ
_last_charr�feedr
�stateZ	DETECTINGZgot_enough_data�get_confidenceZSHORTCUT_THRESHOLD)rZbyte_str�iZcoding_stateZchar_lenrrrr8s6




zSJISProber.feedcCs|jj�}|jj�}t||�S)N)rrr
�max)rZcontext_confZdistrib_confrrrrYs

zSJISProber.get_confidence)�__name__�
__module__�__qualname__rr�propertyrrrr�
__classcell__rr)rrr	$s!r	N)ZmbcharsetproberrZcodingstatemachinerZchardistributionrZjpcntxrZmbcssmrZenumsrrr	rrrr�<module>ssite-packages/pip/_vendor/chardet/__pycache__/sbcharsetprober.cpython-36.pyc000064400000005532147511334570023167 0ustar003

���e�@s4ddlmZddlmZmZmZGdd�de�ZdS)�)�
CharSetProber)�CharacterCategory�ProbingState�SequenceLikelihoodcsbeZdZdZdZdZdZd�fdd�	Z�fd	d
�Ze	dd��Z
e	d
d��Zdd�Zdd�Z
�ZS)�SingleByteCharSetProber�@igffffff�?g�������?FNcsJtt|�j�||_||_||_d|_d|_d|_d|_	d|_
|j�dS)N)�superr�__init__�_model�	_reversed�_name_prober�_last_order�
_seq_counters�_total_seqs�_total_char�
_freq_char�reset)�self�model�reversedZname_prober)�	__class__��%/usr/lib/python3.6/sbcharsetprober.pyr	'sz SingleByteCharSetProber.__init__cs:tt|�j�d|_dgtj�|_d|_d|_d|_	dS)N��)
rrrr
rZget_num_categoriesrrrr)r)rrrr5szSingleByteCharSetProber.resetcCs|jr|jjS|jdSdS)N�charset_name)rrr
)rrrrr?sz$SingleByteCharSetProber.charset_namecCs|jr|jjS|jjd�SdS)N�language)rrr
�get)rrrrrFsz SingleByteCharSetProber.languagec	Csn|jds|j|�}|s|jS|jd}x�t|�D]�\}}||}|tjkrZ|jd7_||jkr�|jd7_|j	|jkr�|j
d7_
|js�|j	|j|}|jd|}n||j|j	}|jd|}|j|d7<||_	q2W|jd}|jt
jk�rh|j
|jk�rh|j�}||jk�r@|jjd||�t
j|_n(||jk�rh|jjd|||j�t
j|_|jS)NZkeep_english_letter�char_to_order_maprZprecedence_matrixrz$%s confidence = %s, we have a winnerz9%s confidence = %s, below negative shortcut threshhold %s)r
Zfilter_international_words�state�	enumeraterZCONTROLr�SAMPLE_SIZErr
rrrrZ	DETECTING�SB_ENOUGH_REL_THRESHOLD�get_confidence�POSITIVE_SHORTCUT_THRESHOLDZlogger�debugZFOUND_ITZ_state�NEGATIVE_SHORTCUT_THRESHOLDZNOT_ME)	rZbyte_strr�i�c�orderrrZ
confidencerrr�feedMsF







zSingleByteCharSetProber.feedcCsNd}|jdkrJd|jtj|j|jd}||j|j}|dkrJd}|S)Ng{�G�z�?rg�?Ztypical_positive_ratiog�G�z��?)rrrZPOSITIVEr
rr)r�rrrrr#|s
 z&SingleByteCharSetProber.get_confidence)FN)�__name__�
__module__�__qualname__r!r"r$r&r	r�propertyrrr*r#�
__classcell__rr)rrr!s
/rN)Z
charsetproberrZenumsrrrrrrrr�<module>ssite-packages/pip/_vendor/chardet/__pycache__/jisfreq.cpython-36.pyc000064400000126637147511334570021454 0ustar003

���e�d�@sdZdZ�dZ�dS(g@i�(������'�O��}�������]�
�
���
���������������X�}�����k��g
�
��k������������%�&�0�1�,�-�������������<���p�������������g������W�X�����h
�"�������
�	��\����
�/
�����������0
��
�h�����������������������������������������������������������������������������	�
���
������������������� ��!�"�#�$�%�&�'�(�)�*�+�,�-�.�/�0�1�2�3�4�5�6�7��V�j�4����B���8�9�:�;�<�=�>�v�����S��e��f���d	�+����a�w�����I���~���
��?�@�A�B�C�D�
��-���i���
��E�^�.���3��i
�F�/�Y�������j��
��G�H�I�J�q��1
��Y��k�/��2���#�����*�������[��\�5��!�!�	�%�@�l��'�A����4�
�������<����-���7�S����~�K�=��E��;���7�7�8�m�&���
��O�K�=�~�d���L�M�N�O�P�Q�R�S�T�U�V�Y�>�J�"�p	�p������T�_���.�X���L�j�e��9�P� �l�y�����D�����)�h���F�?��+���g�c����B�]�N����8�j�:�5���7���R�4�G�d�����n��h�t�6�3�$�W�C���:
�x����*�V	��W�X�Y�Z�[�\�]�^�_�`��a�b�c�d�e�f�g�h�i�j�k�l�m�n�o�p�q�r�s�t�
�u�v�w�x�y�z�{�|�������l�}��~���	��
�����
�������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������k���
�>��
��
�	��
	�j
�����Z�[�\�]������	�
���
������������������� �!�"�#�$�%�&�'�(�)�*�+�,�-�.�/�0�1�2�3�4�5�6�7�8�9�:�;�<�=�>�?�@�A�B�C�D�E�F�G�H�I�J�K�L�M�N�O�P�Q�R�S�T�U�V�W�X�Y�Z�[�\�]�^�_�`�a�b�c�d�e�f�g�h�i�j�k�l�m�n�o�p�q�r�s�t�u�v�w�x�y�z�{�|�}�~����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������	�
���
������������������� �!�"�#�$�%�&�'�(�)�*�+�,�-�.�/�0�1�2�3�4�5�6�7�8�9�:�;�<�=�>�?�@�A�B�C�D�E�F�G�H�I�J�K�L�M�N�O�P�Q�R�S�T�U�V�W�X�Y�Z�[�\�]�^�_�`�a�b�c�d�e�f�g�h�i�j�k�l�m�n�o�p�q�r�s�t�u�v�w�x�y�z�{�|�}�~����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������	�
���
������������������� �!�"�#�$�%�&�'�(�)�*�+�,�-�.�/�0�1�2�3�4�5�6�7�8�9�:�;�<�=�>�?�@�A�B�C�D�E�F�G�H�I�J�K�L�M�N�O�P�Q�R�S�T�U�V�W�X�Y�Z�[�\�]�^�_�`�a�b�c�d�e�f�g�h�i�j�k�l�m�n�o�p�q�r�s�t�u�v�w�x�y�z�{�|�}�~����������������������������������������������������������������������������������������������������������������������������������������
����������2
�����{�T���a
�
�^�����|��=��m�k
�����	�V�y�	��
�q	�����
�	������!��	��������C�O�3��������r	���n���+�����
�p�q���;
�A������C�0��n�
����T�C�o�
��
�#	���~��P����������	���	��$	�z�����&���������������_���U���
�s	��	����I����������3
����$�W�1�5�P��	����������X����l
����H��	�������
�
�W���~�p��$����@�L����	�����u���J�I	�W	����_���	�<�����]�D�������h���������f�-��}�t	���	�������,���
�b����X���F��{�`����+�3�q��m��4
�������B�l�X	�
��%
���%	�2�u��	��	��J��|�r��@����������
���	�L
�������b
�J	��c�����������4	�R��m
�T��������������e	����?�
������-�	��*�E����+����&	�	����/�����.�s�[���+��
��
�B�c
��������	�)��,���u	��	��9�&
�������v	���Y���3��
�����D�����d���|�(�y������<�8��	�i����d
�5
�B��n���������|��	�!���)�@�n
�m��	�0�a�����	�z�'�����	�W���[�v�����	����	�o
���	��
���'
���"���9������J���h����e
���\�����p
�@��Q�9������	����w	�i�E��J�I��	��������f����
���1�����
����'	�r�
������
���c��	�q��x����
�2�������	�*���b�F��v���d�
����!�R����Q����P����������(	����_��`���f
��6
���A����a�����u�T��2��������^�C����
��[����
�v�%������n�!�F��Z�^������d������x	���]���B��c��
�����P�q�	��H�Y	����(
���
���g�}�K�	����4���W�����C�g
�-�d���k��4�}��q
�~�+�	����h�@	�	��@�A	�D�:���e��
���q�w�����������d���Z��*�M
�[����s�Z�
����>��'�-����
�����)	�������t�F�7
���t�#��������Y���f�k�b�1���
��6�o�c��q��	������
�u��:�����N
���U�=��	�v���l�N������]��;������
����l�L�	�}�������B	��8
�Q���#�`�T���	�x��r��	���������	���3�n���������{���@��������9
����5�h
�d�Z	����i������s�-�r
�������[���0������.�S�	�R�O
�������;��4�y	�
�;��z	��p��� ����:
�(�|�
��"����������O��e����
�C	�t�)
��6�K����8��P
��h���������3�$�����*
�=�b�e���
�4�����	�.�7��\����j����������/������=���^���5	����	�\�������9�#�+�
�s
�,��7���Y�����
�M���+
�i
�Q���6	���������y���s�#�{	��
�U�f�<����v����)��
�������m�<���	�����g��	�l��� �D	�b�u��
��D�B����
�[	���8����>�����������]���D�������6�Z���	�5���
��
�<�m��
�,�	��u�^�	����g����I���g����\�:�
�M�t�	���E��o��������E�R�����E�j
���g�W���K�	�C��=���]�$�!��	�����`�K�v���0�i��^����3���"��������a�k
��;
�w��������
�y�������P���	�w�����t
�����������J�a���]�������	�����L�h�������
�u
�j���#�C����	�a���!�s�|	��	�Y��
����h�O�5���z�i�
�S���L���.��&��	�<
�7���s�A���M��
��s���*�,
�%�
��	��k���&�f���O�j�"��(�-��[�������-�=��}	���M�S���������������
�
��
��~	��J�t�k�v�e�y�����<
���f��M�k��
���o�������w�v
�l���]���.��
������/�f	�q�$�g��G���n��>�6�=
�������N��	�x��e�*	��h������$�����o�b�,�����t����	�����
�V���
�i������w
���p�H���V��i�<�Z�	�8�r�w��&������/�������>��>
�E	�x���&�e���w�	����5�����
��\����S�i������2������+�'���%����x
�O��
�Q
���������_�H������g����y
��
�0�&�+	������l
�'��'�z��	�	��_����������?
��
����	��z
���m
�|�
��	�g	�0���� ��*���
�`�w���
�#���c�)�����R
�������T����r��V�����_�����r����
�`�x���
���f���n
�=
�*�A���	�(���x��S
��{�T
��9���M��r����;��(��%��[�a�b���D����-
�\	�����o
�E�m�)���!���c��,	�1��]	����>�I����	�T���������y��k�z�x���N�	�����	�5����"���H�<�d��
���h	��	��
�p
���:�L�1�����
���>
���q
�?�Z�M�I���B�K	�D��#�
�
�Y�>������`�������������� ����c���������T��a��2�y�/���"�U
�����b����/���	����
�{�
��j����5�*�������a�x����	��F������7����	�?��w���������+��8����
�N��
����,���n��
���o�s�_�?���	�����	����?�0��	�	��y�3�f��=���l��	��v�]��	�	�����z��
��y�j��u����p���j���	
�i�z����r
������������F��'��	�{
�n��x�u�$����M�����k��p��
�^	�q�
�`�|��	��

�|
�	��
���	�
�����	�G���-���Z�1�Y���-	�	�e����[�C����e��.�-�����	���
�
�7	�b�l�	������{���.�	����z���
�
��?
����9��
�����}��
�a��	�
�Q�	�d�;�V
���/�^�D���?�m��
��
�9�I�
�����
�����
�{�}� ������&���~���X���}
��
���9����:��"�\����
�_	���8�W�~
�^�%�P�s��w����8	��L����.
�����
����N�����P��S������:�R�'�0�b���
�������z��	�����K�������
�r��q�
�����n�L	�����������X�@
���g�i���4��:����r�	��Q��
��	�;�'��>���b�m���w�����s
�����A
��t
�9	�����	�o�^���2��c��������r����!��t��1�{���������s��������x��v�a�p����
���L�O���U��.����)�.	��E�	�"��y��	������(�
���q�E�6�
��
�2�u�6������.��i	����V���s��
���G�G�j�t��W��	���`����u����
���{�Q��	�O�C���
���J�	�	�	�$������/	����_�����j	�R��7���/
��3�D�u
�6�X�4��������>����0
�`�`	�
������U��� �
�r���)���5���H��� 
���~�r��
������	���]����������@
����
�a���v����/�!�6������7�w��	�3�c�8�x����B
�����M	���F	��g����(��
�:	����H���
�f����"��
�~������
�y������ �!�N��
�4�A
���d�e����z������������0	�����������{������
��f�(�l���G�	����^�B
�m�F�|��U����2�R�����#���
����F����U����1���c�k��N	�S�	�����
����$�O	���Y����
���G����W
�	�C
�?����~����U�F�����������
�C
�	�����e�
������/�Q�_�v
�;���P�)���G�m�	���G�,�I�z�_�	���	�9�n��
�A�	�
�X
���#�V�)�<�%��}���H�w
�5��p�.�����m�����
�#�h�|�b����@�(�^�z����A��
� ��"��g��|���������6��X���,�����		����$�#�I�k	�����:�	��Q��o�h�V�$�
���t����Y
�,����
��D
���	�~��
�
���%��� ��������
� �7�/�
	������	�E�V����
�E
�u��&�������	���o���Z��i���	�����!���n�8���'����������	��G��R�2����:��,����Z
�l��(�X�P	�?���	�o��������W�*����;�8��&�)����
�o������H�_���0���x
��0��;	��
���X��	�����1��Z�*�%��	���|�d�`���S�j�}�\�����G	���	�f�q�%�B��1
�p��"�������|������	���l	�2
��
��
���@�h��;����	��	���&�����}��%�����?��	���a	�y
��
����	��=�1��<��{�#��������
�A��
��t���	��>�
�K�3
���=����\�����'�A�+�g���b	�K�������
���
���4
��*��m	�	��
�o�N��0�(�i��$�4� �{����Q	����U�����,�`���	����
�j�)� ���-�6���	�F
�2�)��
�*��YN(rrrrrrrrr	r
rrr
rrrrrrrrrrrrrrrrrrr r!r"r#r$r%r&r'r(r)r*r+r,r-r.r/r0r1r2r3r4r5r6r7r8r9r:r;r<r=r>r?r@rArBrCrDrErFrGrHrIrJrKrLrMrNrOrPrQrRrSrTrUrVrWrXrYrZr[r\r]r^r_r`rarbrcrdrerfrgrhrirjrkrlrmrnrorprqrrrsrtrurvrwrxryrzr{r|r}r~rr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrrrrrrrrr	r
rrr
[binary .pyc data omitted: remainder of the compiled chardet JIS character-frequency table (JIS_CHAR_TO_FREQ_ORDER, JIS_TABLE_SIZE, JIS_TYPICAL_DISTRIBUTION_RATIO), built from /usr/lib/python3.6/jisfreq.py]

site-packages/pip/_vendor/chardet/__pycache__/gb2312freq.cpython-36.opt-1.pyc
[binary .pyc data omitted: compiled GB2312 character-frequency table (GB2312_CHAR_TO_FREQ_ORDER, GB2312_TABLE_SIZE, GB2312_TYPICAL_DISTRIBUTION_RATIO), built from /usr/lib/python3.6/gb2312freq.py]

site-packages/pip/_vendor/chardet/__pycache__/gb2312prober.cpython-36.opt-1.pyc
[binary .pyc data omitted: compiled GB2312Prober class, built from /usr/lib/python3.6/gb2312prober.py; the names visible in the bytecode are MultiByteCharSetProber, CodingStateMachine, GB2312DistributionAnalysis and GB2312_SM_MODEL, with charset_name "GB2312" and language "Chinese"; a readable reconstruction follows after this listing]

site-packages/pip/_vendor/chardet/__pycache__/langhebrewmodel.cpython-36.pyc
[binary .pyc data omitted: compiled Hebrew language model (WIN1255_CHAR_TO_ORDER_MAP, HEBREW_LANG_MODEL, Win1255HebrewModel; charset "windows-1255", language "Hebrew"), built from /usr/lib/python3.6/langhebrewmodel.py]

site-packages/pip/_vendor/chardet/__pycache__/__init__.cpython-36.opt-1.pyc
[binary .pyc data omitted: compiled chardet package __init__, built from /usr/lib/python3.6/__init__.py; it imports PY2/PY3, UniversalDetector and __version__/VERSION, and defines detect(byte_str), which accepts bytes or bytearray, feeds a UniversalDetector and returns its result; a readable reconstruction follows after this listing]

site-packages/pip/_vendor/chardet/__pycache__/gb2312freq.cpython-36.pyc
[binary .pyc data omitted: same GB2312 frequency table as the .opt-1 member above]

site-packages/pip/_vendor/chardet/__pycache__/jisfreq.cpython-36.opt-1.pyc
[binary .pyc data omitted: same JIS frequency table as the member above]
r
r
r
r
r
r
r
r
r	
r

r
r
r

r
r
r
r
r
r
r
r
r
r
r
r
r
r
r
r
r
r
r 
r!
r"
r#
r$
r%
r&
r'
r(
r)
r*
r+
r,
r-
r.
r/
r0
r1
r2
r3
r4
r5
r6
r7
r8
r9
r:
r;
r<
r=
r>
r?
r@
rA
rB
rC
rD
rE
rF
rG
rH
rI
rJ
rK
rL
rM
rN
rO
rP
rQ
rR
rS
rT
rU
rV
rW
rX
rY
rZ
r[
r\
r]
r^
r_
r`
ra
rb
rc
rd
re
rf
rg
rh
ri
rj
rk
rl
rm
rn
ro
rp
rq
rr
rs
rt
ru
rv
rw
rx
ry
rz
r{
r|
r}
r~
r
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
r�
rrrrrrrrrr	r
rrr
rrrrrrrrrrrrrrrrrrr r!r"r#r$r%r&r'r(r)r*r+r,r-r.r/r0r1r2r3r4r5r6r7r8r9r:r;r<r=r>r?r@rArBrCrDrErFrGrHrIrJrKrLrMrNrOrPrQrRrSrTrUrVrWrXrYrZr[r\r]r^r_r`rarbrcrdrerfrgrhrirjrkrlrmrnrorprqrrrsrtrurvrwrxryrzr{r|r}r~rr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrrrrrrrrr	r
rrr
rrrrrrrrrrrrrrrrrrr r!r"r#r$r%r&r'r(r)r*r+r,r-r.r/r0r1r2r3r4r5r6r7r8r9r:r;r<r=r>r?r@rArBrCrDrErFrGrHrIrJrKrLrMrNrOrPrQrRrSrTrUrVrWrXrYrZr[r\r]r^r_r`rarbrcrdrerfrgrhrirjrkrlrmrnrorprqrrrsrtrurvrwrxryrzr{r|r}r~rr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrrrrrrrrr	r
rrr
rrrrrrrrrrrrrrrrrrr r!r"r#r$r%r&r'r(r)r*r+r,r-r.r/r0r1r2r3r4r5r6r7r8r9r:r;r<r=r>r?r@rArBrCrDrErFrGrHrIrJrKrLrMrNrOrPrQrRrSrTrUrVrWrXrYrZr[r\r]r^r_r`rarbrcrdrerfrgrhrirjrkrlrmrnrorprqrrrsrtrurvrwrxryrzr{r|r}r~rr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrrrrrrrrr	r
rrr
[jisfreq.cpython-36.pyc: binary bytecode for JIS_CHAR_TO_FREQ_ORDER, JIS_TABLE_SIZE and JIS_TYPICAL_DISTRIBUTION_RATIO, compiled from /usr/lib/python3.6/jisfreq.py]
site-packages/pip/_vendor/chardet/__pycache__/big5freq.cpython-36.opt-1.pyc

[big5freq.cpython-36.opt-1.pyc: binary bytecode for BIG5_CHAR_TO_FREQ_ORDER, BIG5_TABLE_SIZE and BIG5_TYPICAL_DISTRIBUTION_RATIO, compiled from /usr/lib/python3.6/big5freq.py]
site-packages/pip/_vendor/chardet/__pycache__/langbulgarianmodel.cpython-36.pyc

[langbulgarianmodel.cpython-36.pyc: binary bytecode for Latin5_BulgarianCharToOrderMap, win1251BulgarianCharToOrderMap, BulgarianLangModel, Latin5BulgarianModel (ISO-8859-5) and Win1251BulgarianModel (windows-1251), compiled from /usr/lib/python3.6/langbulgarianmodel.py]
site-packages/pip/_vendor/chardet/__pycache__/escsm.cpython-36.pyc

[escsm.cpython-36.pyc: binary bytecode for the HZ-GB-2312, ISO-2022-CN, ISO-2022-JP and ISO-2022-KR escape-sequence state machines (HZ_SM_MODEL, ISO2022CN_SM_MODEL, ISO2022JP_SM_MODEL, ISO2022KR_SM_MODEL), compiled from /usr/lib/python3.6/escsm.py]
site-packages/pip/_vendor/chardet/__pycache__/langcyrillicmodel.cpython-36.opt-1.pyc

[langcyrillicmodel.cpython-36.opt-1.pyc: binary bytecode for KOI8R_char_to_order_map, win1251_char_to_order_map, latin5_char_to_order_map, macCyrillic_char_to_order_map, IBM855_char_to_order_map, IBM866_char_to_order_map, RussianLangModel and the Koi8rModel, Win1251CyrillicModel, Latin5CyrillicModel, MacCyrillicModel, Ibm866Model and Ibm855Model definitions, compiled from /usr/lib/python3.6/langcyrillicmodel.py]
site-packages/pip/_vendor/chardet/__pycache__/chardistribution.cpython-36.opt-1.pyc

[chardistribution.cpython-36.opt-1.pyc: binary bytecode for CharDistributionAnalysis and the EUCTW, EUCKR, GB2312, Big5, SJIS and EUCJP distribution-analysis subclasses, compiled from /usr/lib/python3.6/chardistribution.py]
site-packages/pip/_vendor/chardet/__pycache__/jpcntx.cpython-36.pyc

[jpcntx.cpython-36.pyc: binary bytecode for JapaneseContextAnalysis, SJISContextAnalysis and EUCJPContextAnalysis with the jp2CharContext table, compiled from /usr/lib/python3.6/jpcntx.py]
rrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr)Sr?r@rArBrCrDrErFrGrHrIrJrKrLrMrNrOrPrQrRrSrTrUrVrWrXrYrZr[r\r]r^r_r`rarbrcrdrerfrgrhrirjrkrlrmrnrorprqrrrsrtrurvrwrxryrzr{r|r}r~rr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�)r�objectrr&r8rrrr�<module>s�Csite-packages/pip/_vendor/chardet/chardistribution.py000064400000022303147511334570017063 0ustar00######################## BEGIN LICENSE BLOCK ########################
# The Original Code is Mozilla Communicator client code.
#
# The Initial Developer of the Original Code is
# Netscape Communications Corporation.
# Portions created by the Initial Developer are Copyright (C) 1998
# the Initial Developer. All Rights Reserved.
#
# Contributor(s):
#   Mark Pilgrim - port to Python
#
# This library is free software; you can redistribute it and/or
# modify it under the terms of the GNU Lesser General Public
# License as published by the Free Software Foundation; either
# version 2.1 of the License, or (at your option) any later version.
#
# This library is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
# Lesser General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public
# License along with this library; if not, write to the Free Software
# Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
# 02110-1301  USA
######################### END LICENSE BLOCK #########################

from .euctwfreq import (EUCTW_CHAR_TO_FREQ_ORDER, EUCTW_TABLE_SIZE,
                        EUCTW_TYPICAL_DISTRIBUTION_RATIO)
from .euckrfreq import (EUCKR_CHAR_TO_FREQ_ORDER, EUCKR_TABLE_SIZE,
                        EUCKR_TYPICAL_DISTRIBUTION_RATIO)
from .gb2312freq import (GB2312_CHAR_TO_FREQ_ORDER, GB2312_TABLE_SIZE,
                         GB2312_TYPICAL_DISTRIBUTION_RATIO)
from .big5freq import (BIG5_CHAR_TO_FREQ_ORDER, BIG5_TABLE_SIZE,
                       BIG5_TYPICAL_DISTRIBUTION_RATIO)
from .jisfreq import (JIS_CHAR_TO_FREQ_ORDER, JIS_TABLE_SIZE,
                      JIS_TYPICAL_DISTRIBUTION_RATIO)


class CharDistributionAnalysis(object):
    ENOUGH_DATA_THRESHOLD = 1024
    SURE_YES = 0.99
    SURE_NO = 0.01
    MINIMUM_DATA_THRESHOLD = 3

    def __init__(self):
        # Mapping table to get frequency order from char order (obtained from
        # get_order())
        self._char_to_freq_order = None
        self._table_size = None  # Size of above table
        # This is a constant value which varies from language to language,
        # used in calculating confidence.  See
        # http://www.mozilla.org/projects/intl/UniversalCharsetDetection.html
        # for further detail.
        self.typical_distribution_ratio = None
        self._done = None
        self._total_chars = None
        self._freq_chars = None
        self.reset()

    def reset(self):
        """reset analyser, clear any state"""
        # If this flag is set to True, detection is done and conclusion has
        # been made
        self._done = False
        self._total_chars = 0  # Total characters encountered
        # The number of characters whose frequency order is less than 512
        self._freq_chars = 0

    def feed(self, char, char_len):
        """feed a character with known length"""
        if char_len == 2:
            # we only care about 2-byte characters in our distribution analysis
            order = self.get_order(char)
        else:
            order = -1
        if order >= 0:
            self._total_chars += 1
            # order is valid
            if order < self._table_size:
                if 512 > self._char_to_freq_order[order]:
                    self._freq_chars += 1

    def get_confidence(self):
        """return confidence based on existing data"""
        # if we didn't receive any characters in our consideration range,
        # return a negative answer
        if self._total_chars <= 0 or self._freq_chars <= self.MINIMUM_DATA_THRESHOLD:
            return self.SURE_NO

        if self._total_chars != self._freq_chars:
            r = (self._freq_chars / ((self._total_chars - self._freq_chars)
                 * self.typical_distribution_ratio))
            if r < self.SURE_YES:
                return r

        # normalize confidence (we don't want to be 100% sure)
        return self.SURE_YES
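
    # Worked example (editor's note; the numbers are hypothetical, not the
    # actual per-language constants): if typical_distribution_ratio were 6.0
    # and we had seen total_chars = 100 with freq_chars = 90, then
    # r = 90 / ((100 - 90) * 6.0) = 1.5, which is capped at SURE_YES (0.99);
    # with freq_chars = 30 instead, r = 30 / (70 * 6.0) ~= 0.07, a weak signal.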

    def got_enough_data(self):
        # It is not necessary to receive all of the data to draw a conclusion.
        # For charset detection, a certain amount of data is enough.
        return self._total_chars > self.ENOUGH_DATA_THRESHOLD

    def get_order(self, byte_str):
        # We do not handle characters based on the original encoding string,
        # but convert this encoding string to a number, here called order.
        # This allows multiple encodings of a language to share one frequency
        # table.
        return -1


class EUCTWDistributionAnalysis(CharDistributionAnalysis):
    def __init__(self):
        super(EUCTWDistributionAnalysis, self).__init__()
        self._char_to_freq_order = EUCTW_CHAR_TO_FREQ_ORDER
        self._table_size = EUCTW_TABLE_SIZE
        self.typical_distribution_ratio = EUCTW_TYPICAL_DISTRIBUTION_RATIO

    def get_order(self, byte_str):
        # for euc-TW encoding, we are interested
        #   first  byte range: 0xc4 -- 0xfe
        #   second byte range: 0xa1 -- 0xfe
        # no validation needed here. State machine has done that
        first_char = byte_str[0]
        if first_char >= 0xC4:
            return 94 * (first_char - 0xC4) + byte_str[1] - 0xA1
        else:
            return -1


class EUCKRDistributionAnalysis(CharDistributionAnalysis):
    def __init__(self):
        super(EUCKRDistributionAnalysis, self).__init__()
        self._char_to_freq_order = EUCKR_CHAR_TO_FREQ_ORDER
        self._table_size = EUCKR_TABLE_SIZE
        self.typical_distribution_ratio = EUCKR_TYPICAL_DISTRIBUTION_RATIO

    def get_order(self, byte_str):
        # for euc-KR encoding, we are interested
        #   first  byte range: 0xb0 -- 0xfe
        #   second byte range: 0xa1 -- 0xfe
        # no validation needed here. State machine has done that
        first_char = byte_str[0]
        if first_char >= 0xB0:
            return 94 * (first_char - 0xB0) + byte_str[1] - 0xA1
        else:
            return -1


class GB2312DistributionAnalysis(CharDistributionAnalysis):
    def __init__(self):
        super(GB2312DistributionAnalysis, self).__init__()
        self._char_to_freq_order = GB2312_CHAR_TO_FREQ_ORDER
        self._table_size = GB2312_TABLE_SIZE
        self.typical_distribution_ratio = GB2312_TYPICAL_DISTRIBUTION_RATIO

    def get_order(self, byte_str):
        # for GB2312 encoding, we are interested
        #  first  byte range: 0xb0 -- 0xfe
        #  second byte range: 0xa1 -- 0xfe
        # no validation needed here. State machine has done that
        first_char, second_char = byte_str[0], byte_str[1]
        if (first_char >= 0xB0) and (second_char >= 0xA1):
            return 94 * (first_char - 0xB0) + second_char - 0xA1
        else:
            return -1


class Big5DistributionAnalysis(CharDistributionAnalysis):
    def __init__(self):
        super(Big5DistributionAnalysis, self).__init__()
        self._char_to_freq_order = BIG5_CHAR_TO_FREQ_ORDER
        self._table_size = BIG5_TABLE_SIZE
        self.typical_distribution_ratio = BIG5_TYPICAL_DISTRIBUTION_RATIO

    def get_order(self, byte_str):
        # for big5 encoding, we are interested
        #   first  byte range: 0xa4 -- 0xfe
        #   second byte range: 0x40 -- 0x7e , 0xa1 -- 0xfe
        # no validation needed here. State machine has done that
        first_char, second_char = byte_str[0], byte_str[1]
        if first_char >= 0xA4:
            if second_char >= 0xA1:
                return 157 * (first_char - 0xA4) + second_char - 0xA1 + 63
            else:
                return 157 * (first_char - 0xA4) + second_char - 0x40
        else:
            return -1


class SJISDistributionAnalysis(CharDistributionAnalysis):
    def __init__(self):
        super(SJISDistributionAnalysis, self).__init__()
        self._char_to_freq_order = JIS_CHAR_TO_FREQ_ORDER
        self._table_size = JIS_TABLE_SIZE
        self.typical_distribution_ratio = JIS_TYPICAL_DISTRIBUTION_RATIO

    def get_order(self, byte_str):
        # for sjis encoding, we are interested
        #   first  byte range: 0x81 -- 0x9f , 0xe0 -- 0xfe
        #   second byte range: 0x40 -- 0x7e,  0x81 -- 0xfe
        # no validation needed here. State machine has done that
        first_char, second_char = byte_str[0], byte_str[1]
        if (first_char >= 0x81) and (first_char <= 0x9F):
            order = 188 * (first_char - 0x81)
        elif (first_char >= 0xE0) and (first_char <= 0xEF):
            order = 188 * (first_char - 0xE0 + 31)
        else:
            return -1
        order = order + second_char - 0x40
        if second_char > 0x7F:
            order = -1
        return order


class EUCJPDistributionAnalysis(CharDistributionAnalysis):
    def __init__(self):
        super(EUCJPDistributionAnalysis, self).__init__()
        self._char_to_freq_order = JIS_CHAR_TO_FREQ_ORDER
        self._table_size = JIS_TABLE_SIZE
        self.typical_distribution_ratio = JIS_TYPICAL_DISTRIBUTION_RATIO

    def get_order(self, byte_str):
        # for euc-JP encoding, we are interested
        #   first  byte range: 0xa0 -- 0xfe
        #   second byte range: 0xa1 -- 0xfe
        # no validation needed here. State machine has done that
        char = byte_str[0]
        if char >= 0xA0:
            return 94 * (char - 0xA1) + byte_str[1] - 0xa1
        else:
            return -1
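

# Minimal usage sketch (editor's addition, not part of upstream chardet): the
# analyzers above are normally driven by the multi-byte probers, but they can
# be exercised directly.  The sample bytes are hypothetical EUC-KR pairs, and
# bytearray is used so that indexing yields integers on Python 2 and 3 alike.
if __name__ == '__main__':
    _sample = bytearray(b'\xb0\xa1\xb0\xa2\xb4\xeb\xc7\xd1')
    _analyzer = EUCKRDistributionAnalysis()
    for _i in range(0, len(_sample), 2):
        _analyzer.feed(_sample[_i:_i + 2], 2)
    print(_analyzer.get_confidence())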
site-packages/pip/_vendor/chardet/sbcsgroupprober.py000064400000006732147511334570016737 0ustar00######################## BEGIN LICENSE BLOCK ########################
# The Original Code is Mozilla Universal charset detector code.
#
# The Initial Developer of the Original Code is
# Netscape Communications Corporation.
# Portions created by the Initial Developer are Copyright (C) 2001
# the Initial Developer. All Rights Reserved.
#
# Contributor(s):
#   Mark Pilgrim - port to Python
#   Shy Shalom - original C code
#
# This library is free software; you can redistribute it and/or
# modify it under the terms of the GNU Lesser General Public
# License as published by the Free Software Foundation; either
# version 2.1 of the License, or (at your option) any later version.
#
# This library is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
# Lesser General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public
# License along with this library; if not, write to the Free Software
# Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
# 02110-1301  USA
######################### END LICENSE BLOCK #########################

from .charsetgroupprober import CharSetGroupProber
from .sbcharsetprober import SingleByteCharSetProber
from .langcyrillicmodel import (Win1251CyrillicModel, Koi8rModel,
                                Latin5CyrillicModel, MacCyrillicModel,
                                Ibm866Model, Ibm855Model)
from .langgreekmodel import Latin7GreekModel, Win1253GreekModel
from .langbulgarianmodel import Latin5BulgarianModel, Win1251BulgarianModel
# from .langhungarianmodel import Latin2HungarianModel, Win1250HungarianModel
from .langthaimodel import TIS620ThaiModel
from .langhebrewmodel import Win1255HebrewModel
from .hebrewprober import HebrewProber
from .langturkishmodel import Latin5TurkishModel


class SBCSGroupProber(CharSetGroupProber):
    def __init__(self):
        super(SBCSGroupProber, self).__init__()
        self.probers = [
            SingleByteCharSetProber(Win1251CyrillicModel),
            SingleByteCharSetProber(Koi8rModel),
            SingleByteCharSetProber(Latin5CyrillicModel),
            SingleByteCharSetProber(MacCyrillicModel),
            SingleByteCharSetProber(Ibm866Model),
            SingleByteCharSetProber(Ibm855Model),
            SingleByteCharSetProber(Latin7GreekModel),
            SingleByteCharSetProber(Win1253GreekModel),
            SingleByteCharSetProber(Latin5BulgarianModel),
            SingleByteCharSetProber(Win1251BulgarianModel),
            # TODO: Restore Hungarian encodings (iso-8859-2 and windows-1250)
            #       after we retrain model.
            # SingleByteCharSetProber(Latin2HungarianModel),
            # SingleByteCharSetProber(Win1250HungarianModel),
            SingleByteCharSetProber(TIS620ThaiModel),
            SingleByteCharSetProber(Latin5TurkishModel),
        ]
        hebrew_prober = HebrewProber()
        logical_hebrew_prober = SingleByteCharSetProber(Win1255HebrewModel,
                                                        False, hebrew_prober)
        visual_hebrew_prober = SingleByteCharSetProber(Win1255HebrewModel, True,
                                                       hebrew_prober)
        hebrew_prober.set_model_probers(logical_hebrew_prober, visual_hebrew_prober)
        self.probers.extend([hebrew_prober, logical_hebrew_prober,
                             visual_hebrew_prober])

        self.reset()
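

# Minimal usage sketch (editor's addition, not part of upstream chardet): the
# group prober is normally owned by UniversalDetector, but it can be fed bytes
# directly.  The sample is hypothetical windows-1251 text; bytearray mirrors
# how UniversalDetector normalises its input before feeding probers.
if __name__ == '__main__':
    _prober = SBCSGroupProber()
    _prober.feed(bytearray(b'\xcf\xf0\xe8\xe2\xe5\xf2 \xec\xe8\xf0'))
    print(_prober.charset_name, _prober.get_confidence())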
site-packages/pip/_vendor/chardet/jisfreq.py000064400000062261147511334570015160 0ustar00######################## BEGIN LICENSE BLOCK ########################
# The Original Code is Mozilla Communicator client code.
#
# The Initial Developer of the Original Code is
# Netscape Communications Corporation.
# Portions created by the Initial Developer are Copyright (C) 1998
# the Initial Developer. All Rights Reserved.
#
# Contributor(s):
#   Mark Pilgrim - port to Python
#
# This library is free software; you can redistribute it and/or
# modify it under the terms of the GNU Lesser General Public
# License as published by the Free Software Foundation; either
# version 2.1 of the License, or (at your option) any later version.
#
# This library is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
# Lesser General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public
# License along with this library; if not, write to the Free Software
# Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
# 02110-1301  USA
######################### END LICENSE BLOCK #########################

# Sampling from about 20M of text material, including literature and computer technology
#
# Japanese frequency table, applied to both S-JIS and EUC-JP
# The characters are sorted in order of frequency.

# 128  --> 0.77094
# 256  --> 0.85710
# 512  --> 0.92635
# 1024 --> 0.97130
# 2048 --> 0.99431
#
# Ideal Distribution Ratio = 0.92635 / (1-0.92635) = 12.58
# Random Distribution Ratio = 512 / (2965+62+83+86-512) = 0.191
#
# Typical Distribution Ratio, 25% of IDR

JIS_TYPICAL_DISTRIBUTION_RATIO = 3.0
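# (Editor's note: 25% of the ideal ratio above would be 0.25 * 12.58 ~= 3.15;
# the value appears to have been rounded down to 3.0 upstream.)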

# Char to FreqOrder table ,
JIS_TABLE_SIZE = 4368

JIS_CHAR_TO_FREQ_ORDER = (
  40,   1,   6, 182, 152, 180, 295,2127, 285, 381,3295,4304,3068,4606,3165,3510, #   16
3511,1822,2785,4607,1193,2226,5070,4608, 171,2996,1247,  18, 179,5071, 856,1661, #   32
1262,5072, 619, 127,3431,3512,3230,1899,1700, 232, 228,1294,1298, 284, 283,2041, #   48
2042,1061,1062,  48,  49,  44,  45, 433, 434,1040,1041, 996, 787,2997,1255,4305, #   64
2108,4609,1684,1648,5073,5074,5075,5076,5077,5078,3687,5079,4610,5080,3927,3928, #   80
5081,3296,3432, 290,2285,1471,2187,5082,2580,2825,1303,2140,1739,1445,2691,3375, #   96
1691,3297,4306,4307,4611, 452,3376,1182,2713,3688,3069,4308,5083,5084,5085,5086, #  112
5087,5088,5089,5090,5091,5092,5093,5094,5095,5096,5097,5098,5099,5100,5101,5102, #  128
5103,5104,5105,5106,5107,5108,5109,5110,5111,5112,4097,5113,5114,5115,5116,5117, #  144
5118,5119,5120,5121,5122,5123,5124,5125,5126,5127,5128,5129,5130,5131,5132,5133, #  160
5134,5135,5136,5137,5138,5139,5140,5141,5142,5143,5144,5145,5146,5147,5148,5149, #  176
5150,5151,5152,4612,5153,5154,5155,5156,5157,5158,5159,5160,5161,5162,5163,5164, #  192
5165,5166,5167,5168,5169,5170,5171,5172,5173,5174,5175,1472, 598, 618, 820,1205, #  208
1309,1412,1858,1307,1692,5176,5177,5178,5179,5180,5181,5182,1142,1452,1234,1172, #  224
1875,2043,2149,1793,1382,2973, 925,2404,1067,1241, 960,1377,2935,1491, 919,1217, #  240
1865,2030,1406,1499,2749,4098,5183,5184,5185,5186,5187,5188,2561,4099,3117,1804, #  256
2049,3689,4309,3513,1663,5189,3166,3118,3298,1587,1561,3433,5190,3119,1625,2998, #  272
3299,4613,1766,3690,2786,4614,5191,5192,5193,5194,2161,  26,3377,   2,3929,  20, #  288
3691,  47,4100,  50,  17,  16,  35, 268,  27, 243,  42, 155,  24, 154,  29, 184, #  304
   4,  91,  14,  92,  53, 396,  33, 289,   9,  37,  64, 620,  21,  39, 321,   5, #  320
  12,  11,  52,  13,   3, 208, 138,   0,   7,  60, 526, 141, 151,1069, 181, 275, #  336
1591,  83, 132,1475, 126, 331, 829,  15,  69, 160,  59,  22, 157,  55,1079, 312, #  352
 109,  38,  23,  25,  10,  19,  79,5195,  61, 382,1124,   8,  30,5196,5197,5198, #  368
5199,5200,5201,5202,5203,5204,5205,5206,  89,  62,  74,  34,2416, 112, 139, 196, #  384
 271, 149,  84, 607, 131, 765,  46,  88, 153, 683,  76, 874, 101, 258,  57,  80, #  400
  32, 364, 121,1508, 169,1547,  68, 235, 145,2999,  41, 360,3027,  70,  63,  31, #  416
  43, 259, 262,1383,  99, 533, 194,  66,  93, 846, 217, 192,  56, 106,  58, 565, #  432
 280, 272, 311, 256, 146,  82, 308,  71, 100, 128, 214, 655, 110, 261, 104,1140, #  448
  54,  51,  36,  87,  67,3070, 185,2618,2936,2020,  28,1066,2390,2059,5207,5208, #  464
5209,5210,5211,5212,5213,5214,5215,5216,4615,5217,5218,5219,5220,5221,5222,5223, #  480
5224,5225,5226,5227,5228,5229,5230,5231,5232,5233,5234,5235,5236,3514,5237,5238, #  496
5239,5240,5241,5242,5243,5244,2297,2031,4616,4310,3692,5245,3071,5246,3598,5247, #  512
4617,3231,3515,5248,4101,4311,4618,3808,4312,4102,5249,4103,4104,3599,5250,5251, #  528
5252,5253,5254,5255,5256,5257,5258,5259,5260,5261,5262,5263,5264,5265,5266,5267, #  544
5268,5269,5270,5271,5272,5273,5274,5275,5276,5277,5278,5279,5280,5281,5282,5283, #  560
5284,5285,5286,5287,5288,5289,5290,5291,5292,5293,5294,5295,5296,5297,5298,5299, #  576
5300,5301,5302,5303,5304,5305,5306,5307,5308,5309,5310,5311,5312,5313,5314,5315, #  592
5316,5317,5318,5319,5320,5321,5322,5323,5324,5325,5326,5327,5328,5329,5330,5331, #  608
5332,5333,5334,5335,5336,5337,5338,5339,5340,5341,5342,5343,5344,5345,5346,5347, #  624
5348,5349,5350,5351,5352,5353,5354,5355,5356,5357,5358,5359,5360,5361,5362,5363, #  640
5364,5365,5366,5367,5368,5369,5370,5371,5372,5373,5374,5375,5376,5377,5378,5379, #  656
5380,5381, 363, 642,2787,2878,2788,2789,2316,3232,2317,3434,2011, 165,1942,3930, #  672
3931,3932,3933,5382,4619,5383,4620,5384,5385,5386,5387,5388,5389,5390,5391,5392, #  688
5393,5394,5395,5396,5397,5398,5399,5400,5401,5402,5403,5404,5405,5406,5407,5408, #  704
5409,5410,5411,5412,5413,5414,5415,5416,5417,5418,5419,5420,5421,5422,5423,5424, #  720
5425,5426,5427,5428,5429,5430,5431,5432,5433,5434,5435,5436,5437,5438,5439,5440, #  736
5441,5442,5443,5444,5445,5446,5447,5448,5449,5450,5451,5452,5453,5454,5455,5456, #  752
5457,5458,5459,5460,5461,5462,5463,5464,5465,5466,5467,5468,5469,5470,5471,5472, #  768
5473,5474,5475,5476,5477,5478,5479,5480,5481,5482,5483,5484,5485,5486,5487,5488, #  784
5489,5490,5491,5492,5493,5494,5495,5496,5497,5498,5499,5500,5501,5502,5503,5504, #  800
5505,5506,5507,5508,5509,5510,5511,5512,5513,5514,5515,5516,5517,5518,5519,5520, #  816
5521,5522,5523,5524,5525,5526,5527,5528,5529,5530,5531,5532,5533,5534,5535,5536, #  832
5537,5538,5539,5540,5541,5542,5543,5544,5545,5546,5547,5548,5549,5550,5551,5552, #  848
5553,5554,5555,5556,5557,5558,5559,5560,5561,5562,5563,5564,5565,5566,5567,5568, #  864
5569,5570,5571,5572,5573,5574,5575,5576,5577,5578,5579,5580,5581,5582,5583,5584, #  880
5585,5586,5587,5588,5589,5590,5591,5592,5593,5594,5595,5596,5597,5598,5599,5600, #  896
5601,5602,5603,5604,5605,5606,5607,5608,5609,5610,5611,5612,5613,5614,5615,5616, #  912
5617,5618,5619,5620,5621,5622,5623,5624,5625,5626,5627,5628,5629,5630,5631,5632, #  928
5633,5634,5635,5636,5637,5638,5639,5640,5641,5642,5643,5644,5645,5646,5647,5648, #  944
5649,5650,5651,5652,5653,5654,5655,5656,5657,5658,5659,5660,5661,5662,5663,5664, #  960
5665,5666,5667,5668,5669,5670,5671,5672,5673,5674,5675,5676,5677,5678,5679,5680, #  976
5681,5682,5683,5684,5685,5686,5687,5688,5689,5690,5691,5692,5693,5694,5695,5696, #  992
5697,5698,5699,5700,5701,5702,5703,5704,5705,5706,5707,5708,5709,5710,5711,5712, # 1008
5713,5714,5715,5716,5717,5718,5719,5720,5721,5722,5723,5724,5725,5726,5727,5728, # 1024
5729,5730,5731,5732,5733,5734,5735,5736,5737,5738,5739,5740,5741,5742,5743,5744, # 1040
5745,5746,5747,5748,5749,5750,5751,5752,5753,5754,5755,5756,5757,5758,5759,5760, # 1056
5761,5762,5763,5764,5765,5766,5767,5768,5769,5770,5771,5772,5773,5774,5775,5776, # 1072
5777,5778,5779,5780,5781,5782,5783,5784,5785,5786,5787,5788,5789,5790,5791,5792, # 1088
5793,5794,5795,5796,5797,5798,5799,5800,5801,5802,5803,5804,5805,5806,5807,5808, # 1104
5809,5810,5811,5812,5813,5814,5815,5816,5817,5818,5819,5820,5821,5822,5823,5824, # 1120
5825,5826,5827,5828,5829,5830,5831,5832,5833,5834,5835,5836,5837,5838,5839,5840, # 1136
5841,5842,5843,5844,5845,5846,5847,5848,5849,5850,5851,5852,5853,5854,5855,5856, # 1152
5857,5858,5859,5860,5861,5862,5863,5864,5865,5866,5867,5868,5869,5870,5871,5872, # 1168
5873,5874,5875,5876,5877,5878,5879,5880,5881,5882,5883,5884,5885,5886,5887,5888, # 1184
5889,5890,5891,5892,5893,5894,5895,5896,5897,5898,5899,5900,5901,5902,5903,5904, # 1200
5905,5906,5907,5908,5909,5910,5911,5912,5913,5914,5915,5916,5917,5918,5919,5920, # 1216
5921,5922,5923,5924,5925,5926,5927,5928,5929,5930,5931,5932,5933,5934,5935,5936, # 1232
5937,5938,5939,5940,5941,5942,5943,5944,5945,5946,5947,5948,5949,5950,5951,5952, # 1248
5953,5954,5955,5956,5957,5958,5959,5960,5961,5962,5963,5964,5965,5966,5967,5968, # 1264
5969,5970,5971,5972,5973,5974,5975,5976,5977,5978,5979,5980,5981,5982,5983,5984, # 1280
5985,5986,5987,5988,5989,5990,5991,5992,5993,5994,5995,5996,5997,5998,5999,6000, # 1296
6001,6002,6003,6004,6005,6006,6007,6008,6009,6010,6011,6012,6013,6014,6015,6016, # 1312
6017,6018,6019,6020,6021,6022,6023,6024,6025,6026,6027,6028,6029,6030,6031,6032, # 1328
6033,6034,6035,6036,6037,6038,6039,6040,6041,6042,6043,6044,6045,6046,6047,6048, # 1344
6049,6050,6051,6052,6053,6054,6055,6056,6057,6058,6059,6060,6061,6062,6063,6064, # 1360
6065,6066,6067,6068,6069,6070,6071,6072,6073,6074,6075,6076,6077,6078,6079,6080, # 1376
6081,6082,6083,6084,6085,6086,6087,6088,6089,6090,6091,6092,6093,6094,6095,6096, # 1392
6097,6098,6099,6100,6101,6102,6103,6104,6105,6106,6107,6108,6109,6110,6111,6112, # 1408
6113,6114,2044,2060,4621, 997,1235, 473,1186,4622, 920,3378,6115,6116, 379,1108, # 1424
4313,2657,2735,3934,6117,3809, 636,3233, 573,1026,3693,3435,2974,3300,2298,4105, # 1440
 854,2937,2463, 393,2581,2417, 539, 752,1280,2750,2480, 140,1161, 440, 708,1569, # 1456
 665,2497,1746,1291,1523,3000, 164,1603, 847,1331, 537,1997, 486, 508,1693,2418, # 1472
1970,2227, 878,1220, 299,1030, 969, 652,2751, 624,1137,3301,2619,  65,3302,2045, # 1488
1761,1859,3120,1930,3694,3516, 663,1767, 852, 835,3695, 269, 767,2826,2339,1305, # 1504
 896,1150, 770,1616,6118, 506,1502,2075,1012,2519, 775,2520,2975,2340,2938,4314, # 1520
3028,2086,1224,1943,2286,6119,3072,4315,2240,1273,1987,3935,1557, 175, 597, 985, # 1536
3517,2419,2521,1416,3029, 585, 938,1931,1007,1052,1932,1685,6120,3379,4316,4623, # 1552
 804, 599,3121,1333,2128,2539,1159,1554,2032,3810, 687,2033,2904, 952, 675,1467, # 1568
3436,6121,2241,1096,1786,2440,1543,1924, 980,1813,2228, 781,2692,1879, 728,1918, # 1584
3696,4624, 548,1950,4625,1809,1088,1356,3303,2522,1944, 502, 972, 373, 513,2827, # 1600
 586,2377,2391,1003,1976,1631,6122,2464,1084, 648,1776,4626,2141, 324, 962,2012, # 1616
2177,2076,1384, 742,2178,1448,1173,1810, 222, 102, 301, 445, 125,2420, 662,2498, # 1632
 277, 200,1476,1165,1068, 224,2562,1378,1446, 450,1880, 659, 791, 582,4627,2939, # 1648
3936,1516,1274, 555,2099,3697,1020,1389,1526,3380,1762,1723,1787,2229, 412,2114, # 1664
1900,2392,3518, 512,2597, 427,1925,2341,3122,1653,1686,2465,2499, 697, 330, 273, # 1680
 380,2162, 951, 832, 780, 991,1301,3073, 965,2270,3519, 668,2523,2636,1286, 535, # 1696
1407, 518, 671, 957,2658,2378, 267, 611,2197,3030,6123, 248,2299, 967,1799,2356, # 1712
 850,1418,3437,1876,1256,1480,2828,1718,6124,6125,1755,1664,2405,6126,4628,2879, # 1728
2829, 499,2179, 676,4629, 557,2329,2214,2090, 325,3234, 464, 811,3001, 992,2342, # 1744
2481,1232,1469, 303,2242, 466,1070,2163, 603,1777,2091,4630,2752,4631,2714, 322, # 1760
2659,1964,1768, 481,2188,1463,2330,2857,3600,2092,3031,2421,4632,2318,2070,1849, # 1776
2598,4633,1302,2254,1668,1701,2422,3811,2905,3032,3123,2046,4106,1763,1694,4634, # 1792
1604, 943,1724,1454, 917, 868,2215,1169,2940, 552,1145,1800,1228,1823,1955, 316, # 1808
1080,2510, 361,1807,2830,4107,2660,3381,1346,1423,1134,4108,6127, 541,1263,1229, # 1824
1148,2540, 545, 465,1833,2880,3438,1901,3074,2482, 816,3937, 713,1788,2500, 122, # 1840
1575, 195,1451,2501,1111,6128, 859, 374,1225,2243,2483,4317, 390,1033,3439,3075, # 1856
2524,1687, 266, 793,1440,2599, 946, 779, 802, 507, 897,1081, 528,2189,1292, 711, # 1872
1866,1725,1167,1640, 753, 398,2661,1053, 246, 348,4318, 137,1024,3440,1600,2077, # 1888
2129, 825,4319, 698, 238, 521, 187,2300,1157,2423,1641,1605,1464,1610,1097,2541, # 1904
1260,1436, 759,2255,1814,2150, 705,3235, 409,2563,3304, 561,3033,2005,2564, 726, # 1920
1956,2343,3698,4109, 949,3812,3813,3520,1669, 653,1379,2525, 881,2198, 632,2256, # 1936
1027, 778,1074, 733,1957, 514,1481,2466, 554,2180, 702,3938,1606,1017,1398,6129, # 1952
1380,3521, 921, 993,1313, 594, 449,1489,1617,1166, 768,1426,1360, 495,1794,3601, # 1968
1177,3602,1170,4320,2344, 476, 425,3167,4635,3168,1424, 401,2662,1171,3382,1998, # 1984
1089,4110, 477,3169, 474,6130,1909, 596,2831,1842, 494, 693,1051,1028,1207,3076, # 2000
 606,2115, 727,2790,1473,1115, 743,3522, 630, 805,1532,4321,2021, 366,1057, 838, # 2016
 684,1114,2142,4322,2050,1492,1892,1808,2271,3814,2424,1971,1447,1373,3305,1090, # 2032
1536,3939,3523,3306,1455,2199, 336, 369,2331,1035, 584,2393, 902, 718,2600,6131, # 2048
2753, 463,2151,1149,1611,2467, 715,1308,3124,1268, 343,1413,3236,1517,1347,2663, # 2064
2093,3940,2022,1131,1553,2100,2941,1427,3441,2942,1323,2484,6132,1980, 872,2368, # 2080
2441,2943, 320,2369,2116,1082, 679,1933,3941,2791,3815, 625,1143,2023, 422,2200, # 2096
3816,6133, 730,1695, 356,2257,1626,2301,2858,2637,1627,1778, 937, 883,2906,2693, # 2112
3002,1769,1086, 400,1063,1325,3307,2792,4111,3077, 456,2345,1046, 747,6134,1524, # 2128
 884,1094,3383,1474,2164,1059, 974,1688,2181,2258,1047, 345,1665,1187, 358, 875, # 2144
3170, 305, 660,3524,2190,1334,1135,3171,1540,1649,2542,1527, 927, 968,2793, 885, # 2160
1972,1850, 482, 500,2638,1218,1109,1085,2543,1654,2034, 876,  78,2287,1482,1277, # 2176
 861,1675,1083,1779, 724,2754, 454, 397,1132,1612,2332, 893, 672,1237, 257,2259, # 2192
2370, 135,3384, 337,2244, 547, 352, 340, 709,2485,1400, 788,1138,2511, 540, 772, # 2208
1682,2260,2272,2544,2013,1843,1902,4636,1999,1562,2288,4637,2201,1403,1533, 407, # 2224
 576,3308,1254,2071, 978,3385, 170, 136,1201,3125,2664,3172,2394, 213, 912, 873, # 2240
3603,1713,2202, 699,3604,3699, 813,3442, 493, 531,1054, 468,2907,1483, 304, 281, # 2256
4112,1726,1252,2094, 339,2319,2130,2639, 756,1563,2944, 748, 571,2976,1588,2425, # 2272
2715,1851,1460,2426,1528,1392,1973,3237, 288,3309, 685,3386, 296, 892,2716,2216, # 2288
1570,2245, 722,1747,2217, 905,3238,1103,6135,1893,1441,1965, 251,1805,2371,3700, # 2304
2601,1919,1078,  75,2182,1509,1592,1270,2640,4638,2152,6136,3310,3817, 524, 706, # 2320
1075, 292,3818,1756,2602, 317,  98,3173,3605,3525,1844,2218,3819,2502, 814, 567, # 2336
 385,2908,1534,6137, 534,1642,3239, 797,6138,1670,1529, 953,4323, 188,1071, 538, # 2352
 178, 729,3240,2109,1226,1374,2000,2357,2977, 731,2468,1116,2014,2051,6139,1261, # 2368
1593, 803,2859,2736,3443, 556, 682, 823,1541,6140,1369,2289,1706,2794, 845, 462, # 2384
2603,2665,1361, 387, 162,2358,1740, 739,1770,1720,1304,1401,3241,1049, 627,1571, # 2400
2427,3526,1877,3942,1852,1500, 431,1910,1503, 677, 297,2795, 286,1433,1038,1198, # 2416
2290,1133,1596,4113,4639,2469,1510,1484,3943,6141,2442, 108, 712,4640,2372, 866, # 2432
3701,2755,3242,1348, 834,1945,1408,3527,2395,3243,1811, 824, 994,1179,2110,1548, # 2448
1453, 790,3003, 690,4324,4325,2832,2909,3820,1860,3821, 225,1748, 310, 346,1780, # 2464
2470, 821,1993,2717,2796, 828, 877,3528,2860,2471,1702,2165,2910,2486,1789, 453, # 2480
 359,2291,1676,  73,1164,1461,1127,3311, 421, 604, 314,1037, 589, 116,2487, 737, # 2496
 837,1180, 111, 244, 735,6142,2261,1861,1362, 986, 523, 418, 581,2666,3822, 103, # 2512
 855, 503,1414,1867,2488,1091, 657,1597, 979, 605,1316,4641,1021,2443,2078,2001, # 2528
1209,  96, 587,2166,1032, 260,1072,2153, 173,  94, 226,3244, 819,2006,4642,4114, # 2544
2203, 231,1744, 782,  97,2667, 786,3387, 887, 391, 442,2219,4326,1425,6143,2694, # 2560
 633,1544,1202, 483,2015, 592,2052,1958,2472,1655, 419, 129,4327,3444,3312,1714, # 2576
1257,3078,4328,1518,1098, 865,1310,1019,1885,1512,1734, 469,2444, 148, 773, 436, # 2592
1815,1868,1128,1055,4329,1245,2756,3445,2154,1934,1039,4643, 579,1238, 932,2320, # 2608
 353, 205, 801, 115,2428, 944,2321,1881, 399,2565,1211, 678, 766,3944, 335,2101, # 2624
1459,1781,1402,3945,2737,2131,1010, 844, 981,1326,1013, 550,1816,1545,2620,1335, # 2640
1008, 371,2881, 936,1419,1613,3529,1456,1395,2273,1834,2604,1317,2738,2503, 416, # 2656
1643,4330, 806,1126, 229, 591,3946,1314,1981,1576,1837,1666, 347,1790, 977,3313, # 2672
 764,2861,1853, 688,2429,1920,1462,  77, 595, 415,2002,3034, 798,1192,4115,6144, # 2688
2978,4331,3035,2695,2582,2072,2566, 430,2430,1727, 842,1396,3947,3702, 613, 377, # 2704
 278, 236,1417,3388,3314,3174, 757,1869, 107,3530,6145,1194, 623,2262, 207,1253, # 2720
2167,3446,3948, 492,1117,1935, 536,1838,2757,1246,4332, 696,2095,2406,1393,1572, # 2736
3175,1782, 583, 190, 253,1390,2230, 830,3126,3389, 934,3245,1703,1749,2979,1870, # 2752
2545,1656,2204, 869,2346,4116,3176,1817, 496,1764,4644, 942,1504, 404,1903,1122, # 2768
1580,3606,2945,1022, 515, 372,1735, 955,2431,3036,6146,2797,1110,2302,2798, 617, # 2784
6147, 441, 762,1771,3447,3607,3608,1904, 840,3037,  86, 939,1385, 572,1370,2445, # 2800
1336, 114,3703, 898, 294, 203,3315, 703,1583,2274, 429, 961,4333,1854,1951,3390, # 2816
2373,3704,4334,1318,1381, 966,1911,2322,1006,1155, 309, 989, 458,2718,1795,1372, # 2832
1203, 252,1689,1363,3177, 517,1936, 168,1490, 562, 193,3823,1042,4117,1835, 551, # 2848
 470,4645, 395, 489,3448,1871,1465,2583,2641, 417,1493, 279,1295, 511,1236,1119, # 2864
  72,1231,1982,1812,3004, 871,1564, 984,3449,1667,2696,2096,4646,2347,2833,1673, # 2880
3609, 695,3246,2668, 807,1183,4647, 890, 388,2333,1801,1457,2911,1765,1477,1031, # 2896
3316,3317,1278,3391,2799,2292,2526, 163,3450,4335,2669,1404,1802,6148,2323,2407, # 2912
1584,1728,1494,1824,1269, 298, 909,3318,1034,1632, 375, 776,1683,2061, 291, 210, # 2928
1123, 809,1249,1002,2642,3038, 206,1011,2132, 144, 975, 882,1565, 342, 667, 754, # 2944
1442,2143,1299,2303,2062, 447, 626,2205,1221,2739,2912,1144,1214,2206,2584, 760, # 2960
1715, 614, 950,1281,2670,2621, 810, 577,1287,2546,4648, 242,2168, 250,2643, 691, # 2976
 123,2644, 647, 313,1029, 689,1357,2946,1650, 216, 771,1339,1306, 808,2063, 549, # 2992
 913,1371,2913,2914,6149,1466,1092,1174,1196,1311,2605,2396,1783,1796,3079, 406, # 3008
2671,2117,3949,4649, 487,1825,2220,6150,2915, 448,2348,1073,6151,2397,1707, 130, # 3024
 900,1598, 329, 176,1959,2527,1620,6152,2275,4336,3319,1983,2191,3705,3610,2155, # 3040
3706,1912,1513,1614,6153,1988, 646, 392,2304,1589,3320,3039,1826,1239,1352,1340, # 3056
2916, 505,2567,1709,1437,2408,2547, 906,6154,2672, 384,1458,1594,1100,1329, 710, # 3072
 423,3531,2064,2231,2622,1989,2673,1087,1882, 333, 841,3005,1296,2882,2379, 580, # 3088
1937,1827,1293,2585, 601, 574, 249,1772,4118,2079,1120, 645, 901,1176,1690, 795, # 3104
2207, 478,1434, 516,1190,1530, 761,2080, 930,1264, 355, 435,1552, 644,1791, 987, # 3120
 220,1364,1163,1121,1538, 306,2169,1327,1222, 546,2645, 218, 241, 610,1704,3321, # 3136
1984,1839,1966,2528, 451,6155,2586,3707,2568, 907,3178, 254,2947, 186,1845,4650, # 3152
 745, 432,1757, 428,1633, 888,2246,2221,2489,3611,2118,1258,1265, 956,3127,1784, # 3168
4337,2490, 319, 510, 119, 457,3612, 274,2035,2007,4651,1409,3128, 970,2758, 590, # 3184
2800, 661,2247,4652,2008,3950,1420,1549,3080,3322,3951,1651,1375,2111, 485,2491, # 3200
1429,1156,6156,2548,2183,1495, 831,1840,2529,2446, 501,1657, 307,1894,3247,1341, # 3216
 666, 899,2156,1539,2549,1559, 886, 349,2208,3081,2305,1736,3824,2170,2759,1014, # 3232
1913,1386, 542,1397,2948, 490, 368, 716, 362, 159, 282,2569,1129,1658,1288,1750, # 3248
2674, 276, 649,2016, 751,1496, 658,1818,1284,1862,2209,2087,2512,3451, 622,2834, # 3264
 376, 117,1060,2053,1208,1721,1101,1443, 247,1250,3179,1792,3952,2760,2398,3953, # 3280
6157,2144,3708, 446,2432,1151,2570,3452,2447,2761,2835,1210,2448,3082, 424,2222, # 3296
1251,2449,2119,2836, 504,1581,4338, 602, 817, 857,3825,2349,2306, 357,3826,1470, # 3312
1883,2883, 255, 958, 929,2917,3248, 302,4653,1050,1271,1751,2307,1952,1430,2697, # 3328
2719,2359, 354,3180, 777, 158,2036,4339,1659,4340,4654,2308,2949,2248,1146,2232, # 3344
3532,2720,1696,2623,3827,6158,3129,1550,2698,1485,1297,1428, 637, 931,2721,2145, # 3360
 914,2550,2587,  81,2450, 612, 827,2646,1242,4655,1118,2884, 472,1855,3181,3533, # 3376
3534, 569,1353,2699,1244,1758,2588,4119,2009,2762,2171,3709,1312,1531,6159,1152, # 3392
1938, 134,1830, 471,3710,2276,1112,1535,3323,3453,3535, 982,1337,2950, 488, 826, # 3408
 674,1058,1628,4120,2017, 522,2399, 211, 568,1367,3454, 350, 293,1872,1139,3249, # 3424
1399,1946,3006,1300,2360,3324, 588, 736,6160,2606, 744, 669,3536,3828,6161,1358, # 3440
 199, 723, 848, 933, 851,1939,1505,1514,1338,1618,1831,4656,1634,3613, 443,2740, # 3456
3829, 717,1947, 491,1914,6162,2551,1542,4121,1025,6163,1099,1223, 198,3040,2722, # 3472
 370, 410,1905,2589, 998,1248,3182,2380, 519,1449,4122,1710, 947, 928,1153,4341, # 3488
2277, 344,2624,1511, 615, 105, 161,1212,1076,1960,3130,2054,1926,1175,1906,2473, # 3504
 414,1873,2801,6164,2309, 315,1319,3325, 318,2018,2146,2157, 963, 631, 223,4342, # 3520
4343,2675, 479,3711,1197,2625,3712,2676,2361,6165,4344,4123,6166,2451,3183,1886, # 3536
2184,1674,1330,1711,1635,1506, 799, 219,3250,3083,3954,1677,3713,3326,2081,3614, # 3552
1652,2073,4657,1147,3041,1752, 643,1961, 147,1974,3955,6167,1716,2037, 918,3007, # 3568
1994, 120,1537, 118, 609,3184,4345, 740,3455,1219, 332,1615,3830,6168,1621,2980, # 3584
1582, 783, 212, 553,2350,3714,1349,2433,2082,4124, 889,6169,2310,1275,1410, 973, # 3600
 166,1320,3456,1797,1215,3185,2885,1846,2590,2763,4658, 629, 822,3008, 763, 940, # 3616
1990,2862, 439,2409,1566,1240,1622, 926,1282,1907,2764, 654,2210,1607, 327,1130, # 3632
3956,1678,1623,6170,2434,2192, 686, 608,3831,3715, 903,3957,3042,6171,2741,1522, # 3648
1915,1105,1555,2552,1359, 323,3251,4346,3457, 738,1354,2553,2311,2334,1828,2003, # 3664
3832,1753,2351,1227,6172,1887,4125,1478,6173,2410,1874,1712,1847, 520,1204,2607, # 3680
 264,4659, 836,2677,2102, 600,4660,3833,2278,3084,6174,4347,3615,1342, 640, 532, # 3696
 543,2608,1888,2400,2591,1009,4348,1497, 341,1737,3616,2723,1394, 529,3252,1321, # 3712
 983,4661,1515,2120, 971,2592, 924, 287,1662,3186,4349,2700,4350,1519, 908,1948, # 3728
2452, 156, 796,1629,1486,2223,2055, 694,4126,1259,1036,3392,1213,2249,2742,1889, # 3744
1230,3958,1015, 910, 408, 559,3617,4662, 746, 725, 935,4663,3959,3009,1289, 563, # 3760
 867,4664,3960,1567,2981,2038,2626, 988,2263,2381,4351, 143,2374, 704,1895,6175, # 3776
1188,3716,2088, 673,3085,2362,4352, 484,1608,1921,2765,2918, 215, 904,3618,3537, # 3792
 894, 509, 976,3043,2701,3961,4353,2837,2982, 498,6176,6177,1102,3538,1332,3393, # 3808
1487,1636,1637, 233, 245,3962, 383, 650, 995,3044, 460,1520,1206,2352, 749,3327, # 3824
 530, 700, 389,1438,1560,1773,3963,2264, 719,2951,2724,3834, 870,1832,1644,1000, # 3840
 839,2474,3717, 197,1630,3394, 365,2886,3964,1285,2133, 734, 922, 818,1106, 732, # 3856
 480,2083,1774,3458, 923,2279,1350, 221,3086,  85,2233,2234,3835,1585,3010,2147, # 3872
1387,1705,2382,1619,2475, 133, 239,2802,1991,1016,2084,2383, 411,2838,1113, 651, # 3888
1985,1160,3328, 990,1863,3087,1048,1276,2647, 265,2627,1599,3253,2056, 150, 638, # 3904
2019, 656, 853, 326,1479, 680,1439,4354,1001,1759, 413,3459,3395,2492,1431, 459, # 3920
4355,1125,3329,2265,1953,1450,2065,2863, 849, 351,2678,3131,3254,3255,1104,1577, # 3936
 227,1351,1645,2453,2193,1421,2887, 812,2121, 634,  95,2435, 201,2312,4665,1646, # 3952
1671,2743,1601,2554,2702,2648,2280,1315,1366,2089,3132,1573,3718,3965,1729,1189, # 3968
 328,2679,1077,1940,1136, 558,1283, 964,1195, 621,2074,1199,1743,3460,3619,1896, # 3984
1916,1890,3836,2952,1154,2112,1064, 862, 378,3011,2066,2113,2803,1568,2839,6178, # 4000
3088,2919,1941,1660,2004,1992,2194, 142, 707,1590,1708,1624,1922,1023,1836,1233, # 4016
1004,2313, 789, 741,3620,6179,1609,2411,1200,4127,3719,3720,4666,2057,3721, 593, # 4032
2840, 367,2920,1878,6180,3461,1521, 628,1168, 692,2211,2649, 300, 720,2067,2571, # 4048
2953,3396, 959,2504,3966,3539,3462,1977, 701,6181, 954,1043, 800, 681, 183,3722, # 4064
1803,1730,3540,4128,2103, 815,2314, 174, 467, 230,2454,1093,2134, 755,3541,3397, # 4080
1141,1162,6182,1738,2039, 270,3256,2513,1005,1647,2185,3837, 858,1679,1897,1719, # 4096
2954,2324,1806, 402, 670, 167,4129,1498,2158,2104, 750,6183, 915, 189,1680,1551, # 4112
 455,4356,1501,2455, 405,1095,2955, 338,1586,1266,1819, 570, 641,1324, 237,1556, # 4128
2650,1388,3723,6184,1368,2384,1343,1978,3089,2436, 879,3724, 792,1191, 758,3012, # 4144
1411,2135,1322,4357, 240,4667,1848,3725,1574,6185, 420,3045,1546,1391, 714,4358, # 4160
1967, 941,1864, 863, 664, 426, 560,1731,2680,1785,2864,1949,2363, 403,3330,1415, # 4176
1279,2136,1697,2335, 204, 721,2097,3838,  90,6186,2085,2505, 191,3967, 124,2148, # 4192
1376,1798,1178,1107,1898,1405, 860,4359,1243,1272,2375,2983,1558,2456,1638, 113, # 4208
3621, 578,1923,2609, 880, 386,4130, 784,2186,2266,1422,2956,2172,1722, 497, 263, # 4224
2514,1267,2412,2610, 177,2703,3542, 774,1927,1344, 616,1432,1595,1018, 172,4360, # 4240
2325, 911,4361, 438,1468,3622, 794,3968,2024,2173,1681,1829,2957, 945, 895,3090, # 4256
 575,2212,2476, 475,2401,2681, 785,2744,1745,2293,2555,1975,3133,2865, 394,4668, # 4272
3839, 635,4131, 639, 202,1507,2195,2766,1345,1435,2572,3726,1908,1184,1181,2457, # 4288
3727,3134,4362, 843,2611, 437, 916,4669, 234, 769,1884,3046,3047,3623, 833,6187, # 4304
1639,2250,2402,1355,1185,2010,2047, 999, 525,1732,1290,1488,2612, 948,1578,3728, # 4320
2413,2477,1216,2725,2159, 334,3840,1328,3624,2921,1525,4132, 564,1056, 891,4363, # 4336
1444,1698,2385,2251,3729,1365,2281,2235,1717,6188, 864,3841,2515, 444, 527,2767, # 4352
2922,3625, 544, 461,6189, 566, 209,2437,3398,2098,1065,2068,3331,3626,3257,2137, # 4368  #last 512
)


site-packages/pip/_vendor/chardet/euckrprober.py000064400000003324147511334570016033 0ustar00######################## BEGIN LICENSE BLOCK ########################
# The Original Code is mozilla.org code.
#
# The Initial Developer of the Original Code is
# Netscape Communications Corporation.
# Portions created by the Initial Developer are Copyright (C) 1998
# the Initial Developer. All Rights Reserved.
#
# Contributor(s):
#   Mark Pilgrim - port to Python
#
# This library is free software; you can redistribute it and/or
# modify it under the terms of the GNU Lesser General Public
# License as published by the Free Software Foundation; either
# version 2.1 of the License, or (at your option) any later version.
#
# This library is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
# Lesser General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public
# License along with this library; if not, write to the Free Software
# Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
# 02110-1301  USA
######################### END LICENSE BLOCK #########################

from .mbcharsetprober import MultiByteCharSetProber
from .codingstatemachine import CodingStateMachine
from .chardistribution import EUCKRDistributionAnalysis
from .mbcssm import EUCKR_SM_MODEL


class EUCKRProber(MultiByteCharSetProber):
    def __init__(self):
        super(EUCKRProber, self).__init__()
        self.coding_sm = CodingStateMachine(EUCKR_SM_MODEL)
        self.distribution_analyzer = EUCKRDistributionAnalysis()
        self.reset()

    @property
    def charset_name(self):
        return "EUC-KR"

    @property
    def language(self):
        return "Korean"
site-packages/pip/_vendor/chardet/compat.py000064400000002156147511334570014775 0ustar00######################## BEGIN LICENSE BLOCK ########################
# Contributor(s):
#   Dan Blanchard
#   Ian Cordasco
#
# This library is free software; you can redistribute it and/or
# modify it under the terms of the GNU Lesser General Public
# License as published by the Free Software Foundation; either
# version 2.1 of the License, or (at your option) any later version.
#
# This library is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
# Lesser General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public
# License along with this library; if not, write to the Free Software
# Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
# 02110-1301  USA
######################### END LICENSE BLOCK #########################

import sys


if sys.version_info < (3, 0):
    PY2 = True
    PY3 = False
    base_str = (str, unicode)
    text_type = unicode
else:
    PY2 = False
    PY3 = True
    base_str = (bytes, str)
    text_type = str
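

# Illustrative sketch (editor's addition, not part of the vendored module): a
# hypothetical helper showing how the aliases above let callers normalise
# values to the native text type without branching on PY2/PY3 themselves.
def _ensure_text(value, encoding='utf-8'):
    """Return ``value`` as ``text_type``, decoding bytes if necessary."""
    if isinstance(value, text_type):
        return value
    return value.decode(encoding)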
site-packages/pip/_vendor/chardet/langbulgarianmodel.py000064400000031047147511334570017342 0ustar00######################## BEGIN LICENSE BLOCK ########################
# The Original Code is Mozilla Communicator client code.
#
# The Initial Developer of the Original Code is
# Netscape Communications Corporation.
# Portions created by the Initial Developer are Copyright (C) 1998
# the Initial Developer. All Rights Reserved.
#
# Contributor(s):
#   Mark Pilgrim - port to Python
#
# This library is free software; you can redistribute it and/or
# modify it under the terms of the GNU Lesser General Public
# License as published by the Free Software Foundation; either
# version 2.1 of the License, or (at your option) any later version.
#
# This library is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
# Lesser General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public
# License along with this library; if not, write to the Free Software
# Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
# 02110-1301  USA
######################### END LICENSE BLOCK #########################

# 255: Control characters that usually do not exist in any text
# 254: Carriage/Return
# 253: symbol (punctuation) that does not belong to word
# 252: 0 - 9

# Character Mapping Table:
# this table is modified based on win1251BulgarianCharToOrderMap, so
# only numbers below 64 are guaranteed to be valid

Latin5_BulgarianCharToOrderMap = (
255,255,255,255,255,255,255,255,255,255,254,255,255,254,255,255,  # 00
255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,  # 10
253,253,253,253,253,253,253,253,253,253,253,253,253,253,253,253,  # 20
252,252,252,252,252,252,252,252,252,252,253,253,253,253,253,253,  # 30
253, 77, 90, 99,100, 72,109,107,101, 79,185, 81,102, 76, 94, 82,  # 40
110,186,108, 91, 74,119, 84, 96,111,187,115,253,253,253,253,253,  # 50
253, 65, 69, 70, 66, 63, 68,112,103, 92,194,104, 95, 86, 87, 71,  # 60
116,195, 85, 93, 97,113,196,197,198,199,200,253,253,253,253,253,  # 70
194,195,196,197,198,199,200,201,202,203,204,205,206,207,208,209,  # 80
210,211,212,213,214,215,216,217,218,219,220,221,222,223,224,225,  # 90
 81,226,227,228,229,230,105,231,232,233,234,235,236, 45,237,238,  # a0
 31, 32, 35, 43, 37, 44, 55, 47, 40, 59, 33, 46, 38, 36, 41, 30,  # b0
 39, 28, 34, 51, 48, 49, 53, 50, 54, 57, 61,239, 67,240, 60, 56,  # c0
  1, 18,  9, 20, 11,  3, 23, 15,  2, 26, 12, 10, 14,  6,  4, 13,  # d0
  7,  8,  5, 19, 29, 25, 22, 21, 27, 24, 17, 75, 52,241, 42, 16,  # e0
 62,242,243,244, 58,245, 98,246,247,248,249,250,251, 91,252,253,  # f0
)

win1251BulgarianCharToOrderMap = (
255,255,255,255,255,255,255,255,255,255,254,255,255,254,255,255,  # 00
255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,  # 10
253,253,253,253,253,253,253,253,253,253,253,253,253,253,253,253,  # 20
252,252,252,252,252,252,252,252,252,252,253,253,253,253,253,253,  # 30
253, 77, 90, 99,100, 72,109,107,101, 79,185, 81,102, 76, 94, 82,  # 40
110,186,108, 91, 74,119, 84, 96,111,187,115,253,253,253,253,253,  # 50
253, 65, 69, 70, 66, 63, 68,112,103, 92,194,104, 95, 86, 87, 71,  # 60
116,195, 85, 93, 97,113,196,197,198,199,200,253,253,253,253,253,  # 70
206,207,208,209,210,211,212,213,120,214,215,216,217,218,219,220,  # 80
221, 78, 64, 83,121, 98,117,105,222,223,224,225,226,227,228,229,  # 90
 88,230,231,232,233,122, 89,106,234,235,236,237,238, 45,239,240,  # a0
 73, 80,118,114,241,242,243,244,245, 62, 58,246,247,248,249,250,  # b0
 31, 32, 35, 43, 37, 44, 55, 47, 40, 59, 33, 46, 38, 36, 41, 30,  # c0
 39, 28, 34, 51, 48, 49, 53, 50, 54, 57, 61,251, 67,252, 60, 56,  # d0
  1, 18,  9, 20, 11,  3, 23, 15,  2, 26, 12, 10, 14,  6,  4, 13,  # e0
  7,  8,  5, 19, 29, 25, 22, 21, 27, 24, 17, 75, 52,253, 42, 16,  # f0
)

# Model Table:
# total sequences: 100%
# first 512 sequences: 96.9392%
# first 1024 sequences: 3.0618%
# rest  sequences:     0.2992%
# negative sequences:  0.0020%
BulgarianLangModel = (
0,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,2,3,3,3,3,3,3,3,3,2,3,3,3,3,3,
3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,0,3,3,3,2,2,3,2,2,1,2,2,
3,1,3,3,2,3,3,3,3,3,3,3,3,3,3,3,3,0,3,3,3,3,3,3,3,3,3,3,0,3,0,1,
0,0,0,0,0,0,0,0,0,0,1,0,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,
3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,2,3,2,3,3,3,3,3,3,3,3,0,3,1,0,
0,1,0,0,0,0,0,0,0,0,1,1,0,1,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,
3,2,2,2,3,3,3,3,3,3,3,3,3,3,3,3,3,1,3,2,3,3,3,3,3,3,3,3,0,3,0,0,
0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
3,2,3,3,2,3,3,3,3,3,3,3,3,3,3,3,3,1,3,2,3,3,3,3,3,3,3,3,0,3,0,0,
0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
3,3,3,3,3,3,3,3,3,3,3,2,3,2,2,1,3,3,3,3,2,2,2,1,1,2,0,1,0,1,0,0,
0,0,0,0,0,0,0,0,0,0,2,0,0,0,0,0,0,0,0,0,2,0,0,0,0,0,0,0,0,0,0,1,
3,3,3,3,3,3,3,2,3,2,2,3,3,1,1,2,3,3,2,3,3,3,3,2,1,2,0,2,0,3,0,0,
0,0,0,0,0,0,0,1,0,0,2,0,0,0,0,0,0,0,0,0,2,0,0,0,0,0,0,0,0,0,0,1,
3,3,3,3,3,3,3,1,3,3,3,3,3,2,3,2,3,3,3,3,3,2,3,3,1,3,0,3,0,2,0,0,
0,0,0,0,0,0,0,0,0,0,2,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,
3,3,3,3,3,3,3,3,1,3,3,2,3,3,3,1,3,3,2,3,2,2,2,0,0,2,0,2,0,2,0,0,
0,0,0,0,0,0,0,0,0,0,2,0,0,0,0,0,0,0,0,0,2,0,0,0,0,0,0,0,0,0,0,1,
3,3,3,3,3,3,3,3,3,0,3,3,3,2,2,3,3,3,1,2,2,3,2,1,1,2,0,2,0,0,0,0,
1,0,0,0,0,0,0,0,0,0,2,0,0,1,0,0,1,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,
3,3,3,3,3,3,3,2,3,3,1,2,3,2,2,2,3,3,3,3,3,2,2,3,1,2,0,2,1,2,0,0,
0,0,0,0,0,0,0,0,0,0,3,0,0,1,0,0,0,0,0,0,2,0,0,0,0,0,0,0,0,0,0,1,
3,3,3,3,3,1,3,3,3,3,3,2,3,3,3,2,3,3,2,3,2,2,2,3,1,2,0,1,0,1,0,0,
0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,
3,3,3,3,3,3,3,3,3,3,3,1,1,1,2,2,1,3,1,3,2,2,3,0,0,1,0,1,0,1,0,0,
0,0,0,1,0,0,0,0,1,0,2,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,
3,3,3,3,3,2,2,3,2,2,3,1,2,1,1,1,2,3,1,3,1,2,2,0,1,1,1,1,0,1,0,0,
0,0,0,0,0,0,0,0,0,0,2,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,
3,3,3,3,3,1,3,2,2,3,3,1,2,3,1,1,3,3,3,3,1,2,2,1,1,1,0,2,0,2,0,1,
0,0,0,0,0,0,0,0,0,0,2,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,
3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,1,2,2,3,3,3,2,2,1,1,2,0,2,0,1,0,0,
0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,
3,0,1,2,1,3,3,2,3,3,3,3,3,2,3,2,1,0,3,1,2,1,2,1,2,3,2,1,0,1,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
1,1,1,2,3,3,3,3,3,3,3,3,3,3,3,3,0,0,3,1,3,3,2,3,3,2,2,2,0,1,0,0,
0,0,0,0,0,0,0,0,0,0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
2,3,3,3,3,0,3,3,3,3,3,2,1,1,2,1,3,3,0,3,1,1,1,1,3,2,0,1,0,0,0,0,
0,0,0,0,0,0,0,0,0,0,2,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,
3,3,2,2,2,3,3,3,3,3,3,3,3,3,3,3,1,1,3,1,3,3,2,3,2,2,2,3,0,2,0,0,
0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
3,3,3,3,3,2,3,3,2,2,3,2,1,1,1,1,1,3,1,3,1,1,0,0,0,1,0,0,0,1,0,0,
0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,
3,3,3,3,3,2,3,2,0,3,2,0,3,0,2,0,0,2,1,3,1,0,0,1,0,0,0,1,0,0,0,0,
0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,
3,3,3,3,2,1,1,1,1,2,1,1,2,1,1,1,2,2,1,2,1,1,1,0,1,1,0,1,0,1,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,
3,3,3,3,2,1,3,1,1,2,1,3,2,1,1,0,1,2,3,2,1,1,1,0,0,0,0,0,0,0,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
2,3,3,3,3,2,2,1,0,1,0,0,1,0,0,0,2,1,0,3,0,0,1,0,0,0,0,0,0,0,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,
3,3,3,2,3,2,3,3,1,3,2,1,1,1,2,1,1,2,1,3,0,1,0,0,0,1,0,0,0,0,0,0,
0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
3,1,1,2,2,3,3,2,3,2,2,2,3,1,2,2,1,1,2,1,1,2,2,0,1,1,0,1,0,2,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
3,3,3,3,2,1,3,1,0,2,2,1,3,2,1,0,0,2,0,2,0,1,0,0,0,0,0,0,0,1,0,0,
0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,
3,3,3,3,3,3,1,2,0,2,3,1,2,3,2,0,1,3,1,2,1,1,1,0,0,1,0,0,2,2,2,3,
2,2,2,2,1,2,1,1,2,2,1,1,2,0,1,1,1,0,0,1,1,0,0,1,1,0,0,0,1,1,0,1,
3,3,3,3,3,2,1,2,2,1,2,0,2,0,1,0,1,2,1,2,1,1,0,0,0,1,0,1,0,0,0,0,
0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,2,0,0,0,0,0,0,0,0,0,0,1,
3,3,2,3,3,1,1,3,1,0,3,2,1,0,0,0,1,2,0,2,0,1,0,0,0,1,0,1,2,1,2,2,
1,1,1,1,1,1,1,2,2,2,1,1,1,1,1,1,1,0,1,2,1,1,1,0,0,0,0,0,1,1,0,0,
3,1,0,1,0,2,3,2,2,2,3,2,2,2,2,2,1,0,2,1,2,1,1,1,0,1,2,1,2,2,2,1,
1,1,2,2,2,2,1,2,1,1,0,1,2,1,2,2,2,1,1,1,0,1,1,1,1,2,0,1,0,0,0,0,
2,3,2,3,3,0,0,2,1,0,2,1,0,0,0,0,2,3,0,2,0,0,0,0,0,1,0,0,2,0,1,2,
2,1,2,1,2,2,1,1,1,2,1,1,1,0,1,2,2,1,1,1,1,1,0,1,1,1,0,0,1,2,0,0,
3,3,2,2,3,0,2,3,1,1,2,0,0,0,1,0,0,2,0,2,0,0,0,1,0,1,0,1,2,0,2,2,
1,1,1,1,2,1,0,1,2,2,2,1,1,1,1,1,1,1,0,1,1,1,0,0,0,0,0,0,1,1,0,0,
2,3,2,3,3,0,0,3,0,1,1,0,1,0,0,0,2,2,1,2,0,0,0,0,0,0,0,0,2,0,1,2,
2,2,1,1,1,1,1,2,2,2,1,0,2,0,1,0,1,0,0,1,0,1,0,0,1,0,0,0,0,1,0,0,
3,3,3,3,2,2,2,2,2,0,2,1,1,1,1,2,1,2,1,1,0,2,0,1,0,1,0,0,2,0,1,2,
1,1,1,1,1,1,1,2,2,1,1,0,2,0,1,0,2,0,0,1,1,1,0,0,2,0,0,0,1,1,0,0,
2,3,3,3,3,1,0,0,0,0,0,0,0,0,0,0,2,0,0,1,1,0,0,0,0,0,0,1,2,0,1,2,
2,2,2,1,1,2,1,1,2,2,2,1,2,0,1,1,1,1,1,1,0,1,1,1,1,0,0,1,1,1,0,0,
2,3,3,3,3,0,2,2,0,2,1,0,0,0,1,1,1,2,0,2,0,0,0,3,0,0,0,0,2,0,2,2,
1,1,1,2,1,2,1,1,2,2,2,1,2,0,1,1,1,0,1,1,1,1,0,2,1,0,0,0,1,1,0,0,
2,3,3,3,3,0,2,1,0,0,2,0,0,0,0,0,1,2,0,2,0,0,0,0,0,0,0,0,2,0,1,2,
1,1,1,2,1,1,1,1,2,2,2,0,1,0,1,1,1,0,0,1,1,1,0,0,1,0,0,0,0,1,0,0,
3,3,2,2,3,0,1,0,1,0,0,0,0,0,0,0,1,1,0,3,0,0,0,0,0,0,0,0,1,0,2,2,
1,1,1,1,1,2,1,1,2,2,1,2,2,1,0,1,1,1,1,1,0,1,0,0,1,0,0,0,1,1,0,0,
3,1,0,1,0,2,2,2,2,3,2,1,1,1,2,3,0,0,1,0,2,1,1,0,1,1,1,1,2,1,1,1,
1,2,2,1,2,1,2,2,1,1,0,1,2,1,2,2,1,1,1,0,0,1,1,1,2,1,0,1,0,0,0,0,
2,1,0,1,0,3,1,2,2,2,2,1,2,2,1,1,1,0,2,1,2,2,1,1,2,1,1,0,2,1,1,1,
1,2,2,2,2,2,2,2,1,2,0,1,1,0,2,1,1,1,1,1,0,0,1,1,1,1,0,1,0,0,0,0,
2,1,1,1,1,2,2,2,2,1,2,2,2,1,2,2,1,1,2,1,2,3,2,2,1,1,1,1,0,1,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
2,2,2,3,2,0,1,2,0,1,2,1,1,0,1,0,1,2,1,2,0,0,0,1,1,0,0,0,1,0,0,2,
1,1,0,0,1,1,0,1,1,1,1,0,2,0,1,1,1,0,0,1,1,0,0,0,0,1,0,0,0,1,0,0,
2,0,0,0,0,1,2,2,2,2,2,2,2,1,2,1,1,1,1,1,1,1,0,1,1,1,1,1,2,1,1,1,
1,2,2,2,2,1,1,2,1,2,1,1,1,0,2,1,2,1,1,1,0,2,1,1,1,1,0,1,0,0,0,0,
3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,
1,1,0,1,0,1,1,1,1,1,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
2,2,2,3,2,0,0,0,0,1,0,0,0,0,0,0,1,1,0,2,0,0,0,0,0,0,0,0,1,0,1,2,
1,1,1,1,1,1,0,0,2,2,2,2,2,0,1,1,0,1,1,1,1,1,0,0,1,0,0,0,1,1,0,1,
2,3,1,2,1,0,1,1,0,2,2,2,0,0,1,0,0,1,1,1,1,0,0,0,0,0,0,0,1,0,1,2,
1,1,1,1,2,1,1,1,1,1,1,1,1,0,1,1,0,1,0,1,0,1,0,0,1,0,0,0,0,1,0,0,
2,2,2,2,2,0,0,2,0,0,2,0,0,0,0,0,0,1,0,1,0,0,0,0,0,0,0,0,2,0,2,2,
1,1,1,1,1,0,0,1,2,1,1,0,1,0,1,0,0,0,0,1,1,0,0,0,0,0,0,0,0,0,0,0,
1,2,2,2,2,0,0,2,0,1,1,0,0,0,1,0,0,2,0,2,0,0,0,0,0,0,0,0,0,0,1,1,
0,0,0,1,1,1,1,1,1,1,1,1,1,0,1,0,0,1,0,0,1,0,0,0,0,0,0,0,0,0,0,0,
1,2,2,3,2,0,0,1,0,0,1,0,0,0,0,0,0,1,0,2,0,0,0,1,0,0,0,0,0,0,0,2,
1,1,0,0,1,0,0,0,1,1,0,0,1,0,1,1,0,0,0,1,1,0,0,0,0,0,0,0,0,0,0,0,
2,1,2,2,2,1,2,1,2,2,1,1,2,1,1,1,0,1,1,1,1,2,0,1,0,1,1,1,1,0,1,1,
1,1,2,1,1,1,1,1,1,0,0,1,2,1,1,1,1,1,1,0,0,1,1,1,0,0,0,0,0,0,0,0,
1,0,0,1,3,1,1,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,
0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
2,2,2,2,1,0,0,1,0,2,0,0,0,0,0,1,1,1,0,1,0,0,0,0,0,0,0,0,2,0,0,1,
0,2,0,1,0,0,1,1,2,0,1,0,1,0,1,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,
1,2,2,2,2,0,1,1,0,2,1,0,1,1,1,0,0,1,0,2,0,1,0,0,0,0,0,0,0,0,0,1,
0,1,0,0,1,0,0,0,1,1,0,0,1,0,0,1,0,0,0,1,1,0,0,0,0,0,0,0,0,0,0,0,
2,2,2,2,2,0,0,1,0,0,0,1,0,1,0,0,0,1,0,1,0,0,0,0,0,0,0,0,0,0,0,1,
0,1,0,1,1,1,0,0,1,1,1,0,1,0,0,0,0,0,0,1,1,0,0,0,0,0,0,0,0,0,0,0,
2,0,1,0,0,1,2,1,1,1,1,1,1,2,2,1,0,0,1,0,1,0,0,0,0,1,1,1,1,0,0,0,
1,1,2,1,1,1,1,0,0,0,1,1,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
2,2,1,2,1,0,0,1,0,0,0,0,0,0,0,0,1,1,0,1,0,0,0,0,0,0,0,0,0,0,0,1,
0,0,0,0,0,0,0,0,1,1,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
1,0,0,1,2,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,0,0,
0,1,1,0,1,1,1,0,0,1,0,0,1,0,1,0,0,0,1,0,0,0,0,0,1,0,0,0,0,0,0,0,
1,0,1,0,0,1,1,1,1,1,1,1,1,1,1,1,0,0,1,0,2,0,0,2,0,1,0,0,1,0,0,1,
1,1,0,0,1,1,0,1,0,0,0,1,0,0,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,
0,0,0,0,0,0,1,1,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,
1,1,1,1,1,1,1,2,0,0,0,0,0,0,2,1,0,1,1,0,0,1,1,1,0,1,0,0,0,0,0,0,
2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
1,0,0,1,1,1,1,1,1,1,1,1,1,1,1,1,1,0,1,0,1,1,0,1,1,1,1,1,0,1,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,
)

Latin5BulgarianModel = {
  'char_to_order_map': Latin5_BulgarianCharToOrderMap,
  'precedence_matrix': BulgarianLangModel,
  'typical_positive_ratio': 0.969392,
  'keep_english_letter': False,
  'charset_name': "ISO-8859-5",
  'language': 'Bulgarian',
}

Win1251BulgarianModel = {
  'char_to_order_map': win1251BulgarianCharToOrderMap,
  'precedence_matrix': BulgarianLangModel,
  'typical_positive_ratio': 0.969392,
  'keep_english_letter': False,
  'charset_name': "windows-1251",
  'language': 'Bulgarian',
}
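
# --- Hedged usage sketch (not part of the original module) ---
# A minimal example of how these single-byte models are exercised through the
# public detect() API. The import path assumes the pip-vendored copy of
# chardet (a standalone ``import chardet`` works the same way); the sample
# bytes are a short Bulgarian phrase pre-encoded as windows-1251, and the
# encoding/confidence actually reported will depend on the sample.
if __name__ == '__main__':
    from pip._vendor import chardet

    sample = (b'\xc7\xe4\xf0\xe0\xe2\xe5\xe9, \xf1\xe2\xff\xf2! '
              b'\xd2\xee\xe2\xe0 \xe5 \xf2\xe5\xf1\xf2.')
    # Expected shape: {'encoding': ..., 'confidence': ..., 'language': ...}
    print(chardet.detect(sample))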
site-packages/pip/_vendor/chardet/enums.py
"""
All of the Enums that are used throughout the chardet package.

:author: Dan Blanchard (dan.blanchard@gmail.com)
"""


class InputState(object):
    """
    This enum represents the different states a universal detector can be in.
    """
    PURE_ASCII = 0
    ESC_ASCII = 1
    HIGH_BYTE = 2


class LanguageFilter(object):
    """
    This enum represents the different language filters we can apply to a
    ``UniversalDetector``.
    """
    CHINESE_SIMPLIFIED = 0x01
    CHINESE_TRADITIONAL = 0x02
    JAPANESE = 0x04
    KOREAN = 0x08
    NON_CJK = 0x10
    ALL = 0x1F
    CHINESE = CHINESE_SIMPLIFIED | CHINESE_TRADITIONAL
    CJK = CHINESE | JAPANESE | KOREAN


class ProbingState(object):
    """
    This enum represents the different states a prober can be in.
    """
    DETECTING = 0
    FOUND_IT = 1
    NOT_ME = 2


class MachineState(object):
    """
    This enum represents the different states a state machine can be in.
    """
    START = 0
    ERROR = 1
    ITS_ME = 2


class SequenceLikelihood(object):
    """
    This enum represents the likelihood of a character following the previous one.
    """
    NEGATIVE = 0
    UNLIKELY = 1
    LIKELY = 2
    POSITIVE = 3

    @classmethod
    def get_num_categories(cls):
        """:returns: The number of likelihood categories in the enum."""
        return 4


class CharacterCategory(object):
    """
    This enum represents the different categories language models for
    ``SingleByteCharsetProber`` put characters into.

    Anything less than CONTROL is considered a letter.
    """
    UNDEFINED = 255
    LINE_BREAK = 254
    SYMBOL = 253
    DIGIT = 252
    CONTROL = 251
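
# --- Hedged usage sketch (not part of the original module) ---
# LanguageFilter values are bit flags, so filters can be combined and tested
# with bitwise operators; CharacterCategory values act as thresholds
# ("anything less than CONTROL is considered a letter").
if __name__ == '__main__':
    # CJK was built as CHINESE | JAPANESE | KOREAN, so each member is set.
    assert LanguageFilter.CJK & LanguageFilter.JAPANESE
    assert not (LanguageFilter.NON_CJK & LanguageFilter.KOREAN)

    order = 42  # a hypothetical order value produced by a char-to-order map
    print('order %d treated as a letter: %s'
          % (order, order < CharacterCategory.CONTROL))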
site-packages/pip/_vendor/chardet/langturkishmodel.py
# -*- coding: utf-8 -*-
######################## BEGIN LICENSE BLOCK ########################
# The Original Code is Mozilla Communicator client code.
#
# The Initial Developer of the Original Code is
# Netscape Communications Corporation.
# Portions created by the Initial Developer are Copyright (C) 1998
# the Initial Developer. All Rights Reserved.
#
# Contributor(s):
#   Mark Pilgrim - port to Python
#   Özgür Baskın - Turkish Language Model
#
# This library is free software; you can redistribute it and/or
# modify it under the terms of the GNU Lesser General Public
# License as published by the Free Software Foundation; either
# version 2.1 of the License, or (at your option) any later version.
#
# This library is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
# Lesser General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public
# License along with this library; if not, write to the Free Software
# Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
# 02110-1301  USA
######################### END LICENSE BLOCK #########################

# 255: Control characters that usually do not exist in any text
# 254: Carriage/Return
# 253: symbol (punctuation) that does not belong to word
# 252: 0 - 9

# Character Mapping Table:
Latin5_TurkishCharToOrderMap = (
255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,
255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,
255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,
255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,
255, 23, 37, 47, 39, 29, 52, 36, 45, 53, 60, 16, 49, 20, 46, 42,
 48, 69, 44, 35, 31, 51, 38, 62, 65, 43, 56,255,255,255,255,255,
255,  1, 21, 28, 12,  2, 18, 27, 25,  3, 24, 10,  5, 13,  4, 15,
 26, 64,  7,  8,  9, 14, 32, 57, 58, 11, 22,255,255,255,255,255,
180,179,178,177,176,175,174,173,172,171,170,169,168,167,166,165,
164,163,162,161,160,159,101,158,157,156,155,154,153,152,151,106,
150,149,148,147,146,145,144,100,143,142,141,140,139,138,137,136,
 94, 80, 93,135,105,134,133, 63,132,131,130,129,128,127,126,125,
124,104, 73, 99, 79, 85,123, 54,122, 98, 92,121,120, 91,103,119,
 68,118,117, 97,116,115, 50, 90,114,113,112,111, 55, 41, 40, 86,
 89, 70, 59, 78, 71, 82, 88, 33, 77, 66, 84, 83,110, 75, 61, 96,
 30, 67,109, 74, 87,102, 34, 95, 81,108, 76, 72, 17,  6, 19,107,
)

TurkishLangModel = (
3,2,3,3,3,1,3,3,3,3,3,3,3,3,2,1,1,3,3,1,3,3,0,3,3,3,3,3,0,3,1,3,
3,2,1,0,0,1,1,0,0,0,1,0,0,1,1,1,1,0,0,0,0,0,0,0,2,2,0,0,1,0,0,1,
3,2,2,3,3,0,3,3,3,3,3,3,3,2,3,1,0,3,3,1,3,3,0,3,3,3,3,3,0,3,0,3,
3,1,1,0,1,0,1,0,0,0,0,0,0,1,1,1,1,0,0,0,0,0,0,0,2,2,0,0,0,1,0,1,
3,3,2,3,3,0,3,3,3,3,3,3,3,2,3,1,1,3,3,0,3,3,1,2,3,3,3,3,0,3,0,3,
3,1,1,0,0,0,1,0,0,0,0,1,1,0,1,2,1,0,0,0,1,0,0,0,0,2,0,0,0,0,0,1,
3,3,3,3,3,3,2,3,3,3,3,3,3,3,3,1,3,3,2,0,3,2,1,2,2,1,3,3,0,0,0,2,
2,2,0,1,0,0,1,0,0,1,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0,1,0,0,1,
3,3,3,2,3,3,1,2,3,3,3,3,3,3,3,1,3,2,1,0,3,2,0,1,2,3,3,2,1,0,0,2,
2,1,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,2,0,2,0,0,0,
1,0,1,3,3,1,3,3,3,3,3,3,3,1,2,0,0,2,3,0,2,3,0,0,2,2,2,3,0,3,0,1,
2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,0,3,3,3,0,3,2,0,2,3,2,3,3,1,0,0,2,
3,2,0,0,1,0,0,0,0,0,0,2,0,0,1,0,0,0,0,0,0,0,0,0,1,1,1,0,2,0,0,1,
3,3,3,2,3,3,2,3,3,3,3,2,3,3,3,0,3,3,0,0,2,1,0,0,2,3,2,2,0,0,0,2,
2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,0,1,0,1,0,2,0,0,1,
3,3,3,2,3,3,3,3,3,3,3,2,3,3,3,0,3,2,0,1,3,2,1,1,3,2,3,2,1,0,0,2,
2,2,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,0,0,0,
3,3,3,2,3,3,3,3,3,3,3,2,3,3,3,0,3,2,2,0,2,3,0,0,2,2,2,2,0,0,0,2,
3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,2,0,1,0,0,0,
3,3,3,3,3,3,3,2,2,2,2,3,2,3,3,0,3,3,1,1,2,2,0,0,2,2,3,2,0,0,1,3,
0,3,1,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,0,0,1,
3,3,3,2,3,3,3,2,1,2,2,3,2,3,3,0,3,2,0,0,1,1,0,1,1,2,1,2,0,0,0,1,
0,3,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0,0,0,
3,3,3,2,3,3,2,3,2,2,2,3,3,3,3,1,3,1,1,0,3,2,1,1,3,3,2,3,1,0,0,1,
1,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,2,0,0,1,
3,2,2,3,3,0,3,3,3,3,3,3,3,2,2,1,0,3,3,1,3,3,0,1,3,3,2,3,0,3,0,3,
2,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,
2,2,2,3,3,0,3,3,3,3,3,3,3,3,3,0,0,3,2,0,3,3,0,3,2,3,3,3,0,3,1,3,
2,0,0,0,0,0,0,0,0,0,0,1,0,1,2,0,1,0,0,0,0,0,0,0,2,2,0,0,1,0,0,1,
3,3,3,1,2,3,3,1,0,0,1,0,0,3,3,2,3,0,0,2,0,0,2,0,2,0,0,0,2,0,2,0,
0,3,1,0,1,0,0,0,2,2,1,0,1,1,2,1,2,2,2,0,2,1,1,0,0,0,2,0,0,0,0,0,
1,2,1,3,3,0,3,3,3,3,3,2,3,0,0,0,0,2,3,0,2,3,1,0,2,3,1,3,0,3,0,2,
3,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
3,3,3,1,3,3,2,2,3,2,2,0,1,2,3,0,1,2,1,0,1,0,0,0,1,0,2,2,0,0,0,1,
1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,1,0,0,1,0,0,0,
3,3,3,1,3,3,1,1,3,3,1,1,3,3,1,0,2,1,2,0,2,1,0,0,1,1,2,1,0,0,0,2,
2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
3,3,3,1,0,2,1,3,0,0,2,0,0,3,3,0,3,0,0,1,0,1,2,0,0,1,1,2,2,0,1,0,
0,1,2,1,1,0,1,0,1,1,1,1,1,0,1,1,1,2,2,1,2,0,1,0,0,0,0,0,0,1,0,0,
3,3,3,2,3,2,3,3,0,2,2,2,3,3,3,0,3,0,0,0,2,2,0,1,2,1,1,1,0,0,0,1,
0,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,
3,3,3,3,3,3,2,1,2,2,3,3,3,3,2,0,2,0,0,0,2,2,0,0,2,1,3,3,0,0,1,1,
1,1,0,0,1,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,0,
1,1,2,3,3,0,3,3,3,3,3,3,2,2,0,2,0,2,3,2,3,2,2,2,2,2,2,2,1,3,2,3,
2,0,2,1,2,2,2,2,1,1,2,2,1,2,2,1,2,0,0,2,1,1,0,2,1,0,0,1,0,0,0,1,
2,3,3,1,1,1,0,1,1,1,2,3,2,1,1,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,0,0,
0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
3,3,3,2,2,2,3,2,3,2,2,1,3,3,3,0,2,1,2,0,2,1,0,0,1,1,1,1,1,0,0,1,
2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,2,0,1,0,0,0,
3,3,3,2,3,3,3,3,3,2,3,1,2,3,3,1,2,0,0,0,0,0,0,0,3,2,1,1,0,0,0,0,
2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,
3,3,3,2,2,3,3,2,1,1,1,1,1,3,3,0,3,1,0,0,1,1,0,0,3,1,2,1,0,0,0,0,
0,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,
3,3,3,2,2,3,2,2,2,3,2,1,1,3,3,0,3,0,0,0,0,1,0,0,3,1,1,2,0,0,0,1,
1,0,0,1,0,0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,
1,1,1,3,3,0,3,3,3,3,3,2,2,2,1,2,0,2,1,2,2,1,1,0,1,2,2,2,2,2,2,2,
0,0,2,1,2,1,2,1,0,1,1,3,1,2,1,1,2,0,0,2,0,1,0,1,0,1,0,0,0,1,0,1,
3,3,3,1,3,3,3,0,1,1,0,2,2,3,1,0,3,0,0,0,1,0,0,0,1,0,0,1,0,1,0,0,
1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
3,3,2,0,0,2,2,1,0,0,1,0,0,3,3,1,3,0,0,1,1,0,2,0,3,0,0,0,2,0,1,1,
0,1,2,0,1,2,2,0,2,2,2,2,1,0,2,1,1,0,2,0,2,1,2,0,0,0,0,0,0,0,0,0,
3,3,3,1,3,2,3,2,0,2,2,2,1,3,2,0,2,1,2,0,1,2,0,0,1,0,2,2,0,0,0,2,
1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0,0,0,
3,3,3,0,3,3,1,1,2,3,1,0,3,2,3,0,3,0,0,0,1,0,0,0,1,0,1,0,0,0,0,0,
1,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,0,0,3,3,0,3,3,2,3,3,2,2,0,0,0,0,1,2,0,1,3,0,0,0,3,1,1,0,3,0,2,
2,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
3,3,3,1,2,2,1,0,3,1,1,1,1,3,3,2,3,0,0,1,0,1,2,0,2,2,0,2,2,0,2,1,
0,2,2,1,1,1,1,0,2,1,1,0,1,1,1,1,2,1,2,1,2,0,1,0,1,0,0,0,0,0,0,0,
3,3,3,0,1,1,3,0,0,1,1,0,0,2,2,0,3,0,0,1,1,0,1,0,0,0,0,0,2,0,0,0,
0,3,1,0,1,0,1,0,2,0,0,1,0,1,0,1,1,1,2,1,1,0,2,0,0,0,0,0,0,0,0,0,
3,3,3,0,2,0,2,0,1,1,1,0,0,3,3,0,2,0,0,1,0,0,2,1,1,0,1,0,1,0,1,0,
0,2,0,1,2,0,2,0,2,1,1,0,1,0,2,1,1,0,2,1,1,0,1,0,0,0,1,1,0,0,0,0,
3,2,3,0,1,0,0,0,0,0,0,0,0,1,2,0,1,0,0,1,0,0,1,0,0,0,0,0,2,0,0,0,
0,0,1,1,0,0,1,0,1,0,0,1,0,0,0,2,1,0,1,0,2,0,0,0,0,0,0,0,0,0,0,0,
3,3,3,0,0,2,3,0,0,1,0,1,0,2,3,2,3,0,0,1,3,0,2,1,0,0,0,0,2,0,1,0,
0,2,1,0,0,1,1,0,2,1,0,0,1,0,0,1,1,0,1,1,2,0,1,0,0,0,0,1,0,0,0,0,
3,2,2,0,0,1,1,0,0,0,0,0,0,3,1,1,1,0,0,0,0,0,1,0,0,0,0,0,2,0,1,0,
0,1,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,0,0,1,0,1,0,0,0,0,0,0,0,0,0,
0,0,0,3,3,0,2,3,2,2,1,2,2,1,1,2,0,1,3,2,2,2,0,0,2,2,0,0,0,1,2,1,
3,0,2,1,1,0,1,1,1,0,1,2,2,2,1,1,2,0,0,0,0,1,0,1,1,0,0,0,0,0,0,0,
0,1,1,2,3,0,3,3,3,2,2,2,2,1,0,1,0,1,0,1,2,2,0,0,2,2,1,3,1,1,2,1,
0,0,1,1,2,0,1,1,0,0,1,2,0,2,1,1,2,0,0,1,0,0,0,1,0,1,0,1,0,0,0,0,
3,3,2,0,0,3,1,0,0,0,0,0,0,3,2,1,2,0,0,1,0,0,2,0,0,0,0,0,2,0,1,0,
0,2,1,1,0,0,1,0,1,2,0,0,1,1,0,0,2,1,1,1,1,0,2,0,0,0,0,0,0,0,0,0,
3,3,2,0,0,1,0,0,0,0,1,0,0,3,3,2,2,0,0,1,0,0,2,0,1,0,0,0,2,0,1,0,
0,0,1,1,0,0,2,0,2,1,0,0,1,1,2,1,2,0,2,1,2,1,1,1,0,0,1,1,0,0,0,0,
3,3,2,0,0,2,2,0,0,0,1,1,0,2,2,1,3,1,0,1,0,1,2,0,0,0,0,0,1,0,1,0,
0,1,1,0,0,0,0,0,1,0,0,1,0,0,0,1,1,0,1,0,1,0,0,0,0,0,0,0,0,0,0,0,
3,3,3,2,0,0,0,1,0,0,1,0,0,2,3,1,2,0,0,1,0,0,2,0,0,0,1,0,2,0,2,0,
0,1,1,2,2,1,2,0,2,1,1,0,0,1,1,0,1,1,1,1,2,1,1,0,0,0,0,0,0,0,0,0,
3,3,3,0,2,1,2,1,0,0,1,1,0,3,3,1,2,0,0,1,0,0,2,0,2,0,1,1,2,0,0,0,
0,0,1,1,1,1,2,0,1,1,0,1,1,1,1,0,0,0,1,1,1,0,1,0,0,0,1,0,0,0,0,0,
3,3,3,0,2,2,3,2,0,0,1,0,0,2,3,1,0,0,0,0,0,0,2,0,2,0,0,0,2,0,0,0,
0,1,1,0,0,0,1,0,0,1,0,1,1,0,1,0,1,1,1,0,1,0,0,0,0,0,0,0,0,0,0,0,
3,2,3,0,0,0,0,0,0,0,1,0,0,2,2,2,2,0,0,1,0,0,2,0,0,0,0,0,2,0,1,0,
0,0,2,1,1,0,1,0,2,1,1,0,0,1,1,2,1,0,2,0,2,0,1,0,0,0,2,0,0,0,0,0,
0,0,0,2,2,0,2,1,1,1,1,2,2,0,0,1,0,1,0,0,1,3,0,0,0,0,1,0,0,2,1,0,
0,0,1,0,1,0,0,0,0,0,2,1,0,1,0,0,0,0,0,0,0,0,0,2,0,0,0,0,0,0,0,0,
2,0,0,2,3,0,2,3,1,2,2,0,2,0,0,2,0,2,1,1,1,2,1,0,0,1,2,1,1,2,1,0,
1,0,2,0,1,0,1,1,0,0,2,2,1,2,1,1,2,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,
3,3,3,0,2,1,2,0,0,0,1,0,0,3,2,0,1,0,0,1,0,0,2,0,0,0,1,2,1,0,1,0,
0,0,0,0,1,0,1,0,0,1,0,0,0,0,1,0,1,0,1,1,1,0,1,0,0,0,0,0,0,0,0,0,
0,0,0,2,2,0,2,2,1,1,0,1,1,1,1,1,0,0,1,2,1,1,1,0,1,0,0,0,1,1,1,1,
0,0,2,1,0,1,1,1,0,1,1,2,1,2,1,1,2,0,1,1,2,1,0,2,0,0,0,0,0,0,0,0,
3,2,2,0,0,2,0,0,0,0,0,0,0,2,2,0,2,0,0,1,0,0,2,0,0,0,0,0,2,0,0,0,
0,2,1,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,0,0,1,0,1,0,0,0,0,0,0,0,0,0,
0,0,0,3,2,0,2,2,0,1,1,0,1,0,0,1,0,0,0,1,0,1,0,0,0,0,0,1,0,0,0,0,
2,0,1,0,1,0,1,1,0,0,1,2,0,1,0,1,1,0,0,1,0,1,0,2,0,0,0,0,0,0,0,0,
2,2,2,0,1,1,0,0,0,1,0,0,0,1,2,0,1,0,0,1,0,0,1,0,0,0,0,1,2,0,1,0,
0,0,1,0,0,0,1,0,0,1,0,0,0,0,0,0,1,0,1,0,2,0,0,0,0,0,0,0,0,0,0,0,
2,2,2,2,1,0,1,1,1,0,0,0,0,1,2,0,0,1,0,0,0,1,0,0,1,0,0,0,0,0,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,2,0,0,0,0,0,0,0,
1,1,2,0,1,0,0,0,1,0,1,0,0,0,1,0,0,1,0,0,0,0,0,0,0,1,0,0,0,0,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,2,0,0,0,0,0,1,
0,0,1,2,2,0,2,1,2,1,1,2,2,0,0,0,0,1,0,0,1,1,0,0,2,0,0,0,0,1,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,
2,2,2,0,0,0,1,0,0,0,0,0,0,2,2,1,1,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,
0,0,0,0,1,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,0,0,1,1,0,0,0,1,0,0,0,1,0,0,0,0,0,0,0,1,1,0,0,0,0,0,0,0,0,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
2,2,2,0,1,0,1,0,0,0,0,0,0,1,1,0,0,0,0,0,0,0,1,0,1,0,0,0,0,0,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,0,1,0,0,0,0,0,0,0,0,0,0,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
)

Latin5TurkishModel = {
  'char_to_order_map': Latin5_TurkishCharToOrderMap,
  'precedence_matrix': TurkishLangModel,
  'typical_positive_ratio': 0.970290,
  'keep_english_letter': True,
  'charset_name': "ISO-8859-9",
  'language': 'Turkish',
}
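
# --- Hedged indexing sketch (not part of the original module) ---
# Mirrors the lookup done by SingleByteCharSetProber.feed: each byte maps to
# an "order" through the char-to-order table, and the pair
# (previous_order, current_order) indexes the 64x64 precedence matrix
# (SAMPLE_SIZE = 64) to give a SequenceLikelihood value from 0 (NEGATIVE)
# to 3 (POSITIVE). The byte pair below is just an illustrative example.
if __name__ == '__main__':
    SAMPLE_SIZE = 64
    prev_order = Latin5_TurkishCharToOrderMap[ord('a')]
    curr_order = Latin5_TurkishCharToOrderMap[ord('n')]
    likelihood = TurkishLangModel[prev_order * SAMPLE_SIZE + curr_order]
    print('likelihood of the byte pair "an":', likelihood)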
site-packages/pip/_vendor/chardet/mbcharsetprober.py
######################## BEGIN LICENSE BLOCK ########################
# The Original Code is Mozilla Universal charset detector code.
#
# The Initial Developer of the Original Code is
# Netscape Communications Corporation.
# Portions created by the Initial Developer are Copyright (C) 2001
# the Initial Developer. All Rights Reserved.
#
# Contributor(s):
#   Mark Pilgrim - port to Python
#   Shy Shalom - original C code
#   Proofpoint, Inc.
#
# This library is free software; you can redistribute it and/or
# modify it under the terms of the GNU Lesser General Public
# License as published by the Free Software Foundation; either
# version 2.1 of the License, or (at your option) any later version.
#
# This library is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
# Lesser General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public
# License along with this library; if not, write to the Free Software
# Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
# 02110-1301  USA
######################### END LICENSE BLOCK #########################

from .charsetprober import CharSetProber
from .enums import ProbingState, MachineState


class MultiByteCharSetProber(CharSetProber):
    """
    MultiByteCharSetProber
    """

    def __init__(self, lang_filter=None):
        super(MultiByteCharSetProber, self).__init__(lang_filter=lang_filter)
        self.distribution_analyzer = None
        self.coding_sm = None
        self._last_char = [0, 0]

    def reset(self):
        super(MultiByteCharSetProber, self).reset()
        if self.coding_sm:
            self.coding_sm.reset()
        if self.distribution_analyzer:
            self.distribution_analyzer.reset()
        self._last_char = [0, 0]

    @property
    def charset_name(self):
        raise NotImplementedError

    @property
    def language(self):
        raise NotImplementedError

    def feed(self, byte_str):
        for i in range(len(byte_str)):
            coding_state = self.coding_sm.next_state(byte_str[i])
            if coding_state == MachineState.ERROR:
                self.logger.debug('%s %s prober hit error at byte %s',
                                  self.charset_name, self.language, i)
                self._state = ProbingState.NOT_ME
                break
            elif coding_state == MachineState.ITS_ME:
                self._state = ProbingState.FOUND_IT
                break
            elif coding_state == MachineState.START:
                char_len = self.coding_sm.get_current_charlen()
                if i == 0:
                    self._last_char[1] = byte_str[0]
                    self.distribution_analyzer.feed(self._last_char, char_len)
                else:
                    self.distribution_analyzer.feed(byte_str[i - 1:i + 1],
                                                    char_len)

        self._last_char[0] = byte_str[-1]

        if self.state == ProbingState.DETECTING:
            if (self.distribution_analyzer.got_enough_data() and
                    (self.get_confidence() > self.SHORTCUT_THRESHOLD)):
                self._state = ProbingState.FOUND_IT

        return self.state

    def get_confidence(self):
        return self.distribution_analyzer.get_confidence()
site-packages/pip/_vendor/chardet/charsetprober.py
######################## BEGIN LICENSE BLOCK ########################
# The Original Code is Mozilla Universal charset detector code.
#
# The Initial Developer of the Original Code is
# Netscape Communications Corporation.
# Portions created by the Initial Developer are Copyright (C) 2001
# the Initial Developer. All Rights Reserved.
#
# Contributor(s):
#   Mark Pilgrim - port to Python
#   Shy Shalom - original C code
#
# This library is free software; you can redistribute it and/or
# modify it under the terms of the GNU Lesser General Public
# License as published by the Free Software Foundation; either
# version 2.1 of the License, or (at your option) any later version.
#
# This library is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
# Lesser General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public
# License along with this library; if not, write to the Free Software
# Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
# 02110-1301  USA
######################### END LICENSE BLOCK #########################

import logging
import re

from .enums import ProbingState


class CharSetProber(object):

    SHORTCUT_THRESHOLD = 0.95

    def __init__(self, lang_filter=None):
        self._state = None
        self.lang_filter = lang_filter
        self.logger = logging.getLogger(__name__)

    def reset(self):
        self._state = ProbingState.DETECTING

    @property
    def charset_name(self):
        return None

    def feed(self, buf):
        pass

    @property
    def state(self):
        return self._state

    def get_confidence(self):
        return 0.0

    @staticmethod
    def filter_high_byte_only(buf):
        buf = re.sub(b'([\x00-\x7F])+', b' ', buf)
        return buf

    @staticmethod
    def filter_international_words(buf):
        """
        We define three types of bytes:
        alphabet: English letters [a-zA-Z]
        international: international characters [\x80-\xFF]
        marker: everything else [^a-zA-Z\x80-\xFF]

        The input buffer can be thought of as a series of words delimited by
        markers. This function keeps only the words that contain at least one
        international character. All contiguous sequences of markers are
        replaced by a single ASCII space character.

        This filter applies to all scripts which do not use English characters.
        """
        filtered = bytearray()

        # This regex matches only words that have at least one international
        # character. The word may include one marker character at the end.
        words = re.findall(b'[a-zA-Z]*[\x80-\xFF]+[a-zA-Z]*[^a-zA-Z\x80-\xFF]?',
                           buf)

        for word in words:
            filtered.extend(word[:-1])

            # If the last character in the word is a marker, replace it with a
            # space as markers shouldn't affect our analysis (they are used
            # similarly across all languages and may thus have similar
            # frequencies).
            last_char = word[-1:]
            if not last_char.isalpha() and last_char < b'\x80':
                last_char = b' '
            filtered.extend(last_char)

        return filtered

    @staticmethod
    def filter_with_english_letters(buf):
        """
        Returns a copy of ``buf`` that retains only the sequences of English
        alphabet and high byte characters that are not between <> characters.
        Also retains English alphabet and high byte characters immediately
        before occurrences of >.

        This filter can be applied to all scripts which contain both English
        characters and extended ASCII characters, but is currently only used by
        ``Latin1Prober``.
        """
        filtered = bytearray()
        in_tag = False
        prev = 0

        for curr in range(len(buf)):
            # Slice here to get bytes instead of an int with Python 3
            buf_char = buf[curr:curr + 1]
            # Check if we're coming out of or entering an HTML tag
            if buf_char == b'>':
                in_tag = False
            elif buf_char == b'<':
                in_tag = True

            # If current character is not extended-ASCII and not alphabetic...
            if buf_char < b'\x80' and not buf_char.isalpha():
                # ...and we're not in a tag
                if curr > prev and not in_tag:
                    # Keep everything after last non-extended-ASCII,
                    # non-alphabetic character
                    filtered.extend(buf[prev:curr])
                    # Output a space to delimit stretch we kept
                    filtered.extend(b' ')
                prev = curr + 1

        # If we're not in a tag...
        if not in_tag:
            # Keep everything after last non-extended-ASCII, non-alphabetic
            # character
            filtered.extend(buf[prev:])

        return filtered
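
# --- Hedged usage sketch (not part of the original module) ---
# Demonstrates the three static filters defined above on a made-up byte
# string. Because this module uses a relative import, run it as a module,
# e.g. ``python -m pip._vendor.chardet.charsetprober``.
if __name__ == '__main__':
    html = b'<p>caf\xc3\xa9 and tea</p>'
    # Runs of plain ASCII collapse to single spaces; high bytes survive.
    print(CharSetProber.filter_high_byte_only(html))
    # Only words containing at least one byte >= 0x80 are kept.
    print(CharSetProber.filter_international_words(html))
    # Tag handling follows the docstring above; kept stretches are
    # delimited by spaces.
    print(CharSetProber.filter_with_english_letters(html))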
site-packages/pip/_vendor/chardet/charsetgroupprober.py
######################## BEGIN LICENSE BLOCK ########################
# The Original Code is Mozilla Communicator client code.
#
# The Initial Developer of the Original Code is
# Netscape Communications Corporation.
# Portions created by the Initial Developer are Copyright (C) 1998
# the Initial Developer. All Rights Reserved.
#
# Contributor(s):
#   Mark Pilgrim - port to Python
#
# This library is free software; you can redistribute it and/or
# modify it under the terms of the GNU Lesser General Public
# License as published by the Free Software Foundation; either
# version 2.1 of the License, or (at your option) any later version.
#
# This library is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
# Lesser General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public
# License along with this library; if not, write to the Free Software
# Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
# 02110-1301  USA
######################### END LICENSE BLOCK #########################

from .enums import ProbingState
from .charsetprober import CharSetProber


class CharSetGroupProber(CharSetProber):
    def __init__(self, lang_filter=None):
        super(CharSetGroupProber, self).__init__(lang_filter=lang_filter)
        self._active_num = 0
        self.probers = []
        self._best_guess_prober = None

    def reset(self):
        super(CharSetGroupProber, self).reset()
        self._active_num = 0
        for prober in self.probers:
            if prober:
                prober.reset()
                prober.active = True
                self._active_num += 1
        self._best_guess_prober = None

    @property
    def charset_name(self):
        if not self._best_guess_prober:
            self.get_confidence()
            if not self._best_guess_prober:
                return None
        return self._best_guess_prober.charset_name

    @property
    def language(self):
        if not self._best_guess_prober:
            self.get_confidence()
            if not self._best_guess_prober:
                return None
        return self._best_guess_prober.language

    def feed(self, byte_str):
        for prober in self.probers:
            if not prober:
                continue
            if not prober.active:
                continue
            state = prober.feed(byte_str)
            if not state:
                continue
            if state == ProbingState.FOUND_IT:
                self._best_guess_prober = prober
                return self.state
            elif state == ProbingState.NOT_ME:
                prober.active = False
                self._active_num -= 1
                if self._active_num <= 0:
                    self._state = ProbingState.NOT_ME
                    return self.state
        return self.state

    def get_confidence(self):
        state = self.state
        if state == ProbingState.FOUND_IT:
            return 0.99
        elif state == ProbingState.NOT_ME:
            return 0.01
        best_conf = 0.0
        self._best_guess_prober = None
        for prober in self.probers:
            if not prober:
                continue
            if not prober.active:
                self.logger.debug('%s not active', prober.charset_name)
                continue
            conf = prober.get_confidence()
            self.logger.debug('%s %s confidence = %s', prober.charset_name, prober.language, conf)
            if best_conf < conf:
                best_conf = conf
                self._best_guess_prober = prober
        if not self._best_guess_prober:
            return 0.0
        return best_conf
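
# --- Hedged usage sketch (not part of the original module) ---
# Builds a tiny group out of two single-byte probers, mirroring on a small
# scale what the package's SBCSGroupProber does. The import paths assume the
# pip-vendored copy of chardet, and the sample bytes are a short Bulgarian
# phrase pre-encoded as windows-1251. Because this module uses relative
# imports, run it as a module, e.g.
# ``python -m pip._vendor.chardet.charsetgroupprober``.
if __name__ == '__main__':
    from pip._vendor.chardet.sbcharsetprober import SingleByteCharSetProber
    from pip._vendor.chardet.langbulgarianmodel import (Latin5BulgarianModel,
                                                        Win1251BulgarianModel)

    group = CharSetGroupProber()
    group.probers = [SingleByteCharSetProber(Latin5BulgarianModel),
                     SingleByteCharSetProber(Win1251BulgarianModel)]
    group.reset()  # marks every child prober as active
    group.feed(b'\xc7\xe4\xf0\xe0\xe2\xe5\xe9, \xf1\xe2\xff\xf2! '
               b'\xd2\xee\xe2\xe0 \xe5 \xf2\xe5\xf1\xf2.')
    print(group.charset_name, group.language, group.get_confidence())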
site-packages/pip/_vendor/chardet/codingstatemachine.py
######################## BEGIN LICENSE BLOCK ########################
# The Original Code is mozilla.org code.
#
# The Initial Developer of the Original Code is
# Netscape Communications Corporation.
# Portions created by the Initial Developer are Copyright (C) 1998
# the Initial Developer. All Rights Reserved.
#
# Contributor(s):
#   Mark Pilgrim - port to Python
#
# This library is free software; you can redistribute it and/or
# modify it under the terms of the GNU Lesser General Public
# License as published by the Free Software Foundation; either
# version 2.1 of the License, or (at your option) any later version.
#
# This library is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
# Lesser General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public
# License along with this library; if not, write to the Free Software
# Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
# 02110-1301  USA
######################### END LICENSE BLOCK #########################

import logging

from .enums import MachineState


class CodingStateMachine(object):
    """
    A state machine to verify a byte sequence for a particular encoding. For
    each byte the detector receives, it will feed that byte to every active
    state machine available, one byte at a time. The state machine changes its
    state based on its previous state and the byte it receives. There are 3
    states in a state machine that are of interest to an auto-detector:

    START state: This is the state to start with, or the state reached when a
                 legal byte sequence (i.e. a valid code point) for a character
                 has been identified.

    ME state:  This indicates that the state machine identified a byte sequence
               that is specific to the charset it is designed for and that
               there is no other possible encoding which can contain this byte
               sequence. This will lead to an immediate positive answer for
               the detector.

    ERROR state: This indicates the state machine identified an illegal byte
                 sequence for that encoding. This will lead to an immediate
                 negative answer for this encoding. The detector will exclude
                 this encoding from consideration from here on.
    """
    def __init__(self, sm):
        self._model = sm
        self._curr_byte_pos = 0
        self._curr_char_len = 0
        self._curr_state = None
        self.logger = logging.getLogger(__name__)
        self.reset()

    def reset(self):
        self._curr_state = MachineState.START

    def next_state(self, c):
        # for each byte we get its class
        # if it is first byte, we also get byte length
        byte_class = self._model['class_table'][c]
        if self._curr_state == MachineState.START:
            self._curr_byte_pos = 0
            self._curr_char_len = self._model['char_len_table'][byte_class]
        # from byte's class and state_table, we get its next state
        curr_state = (self._curr_state * self._model['class_factor']
                      + byte_class)
        self._curr_state = self._model['state_table'][curr_state]
        self._curr_byte_pos += 1
        return self._curr_state

    def get_current_charlen(self):
        return self._curr_char_len

    def get_coding_state_machine(self):
        return self._model['name']

    @property
    def language(self):
        return self._model['language']
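
# --- Hedged usage sketch (not part of the original module) ---
# A toy state-machine model, assuming only the keys read by next_state()
# above ('class_table', 'class_factor', 'char_len_table', 'state_table',
# 'name', 'language'). It accepts pure ASCII and errors on any high byte.
# Because this module uses a relative import, run it as a module, e.g.
# ``python -m pip._vendor.chardet.codingstatemachine``.
if __name__ == '__main__':
    _ASCII_ONLY_SM_MODEL = {
        'class_table': tuple([0] * 128 + [1] * 128),  # 0 = ASCII, 1 = other
        'class_factor': 2,
        'state_table': (
            # from START:  ASCII -> START, other -> ERROR
            MachineState.START, MachineState.ERROR,
            # from ERROR:  stay in ERROR
            MachineState.ERROR, MachineState.ERROR,
            # from ITS_ME: stay in ITS_ME (unused in this toy model)
            MachineState.ITS_ME, MachineState.ITS_ME,
        ),
        'char_len_table': (1, 0),
        'name': 'ascii-only (demo)',
        'language': '',
    }
    sm = CodingStateMachine(_ASCII_ONLY_SM_MODEL)
    for byte in bytearray(b'abc\xff'):
        print(byte, sm.next_state(byte))  # the last byte drives it to ERROR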
site-packages/pip/_vendor/chardet/sbcharsetprober.py
######################## BEGIN LICENSE BLOCK ########################
# The Original Code is Mozilla Universal charset detector code.
#
# The Initial Developer of the Original Code is
# Netscape Communications Corporation.
# Portions created by the Initial Developer are Copyright (C) 2001
# the Initial Developer. All Rights Reserved.
#
# Contributor(s):
#   Mark Pilgrim - port to Python
#   Shy Shalom - original C code
#
# This library is free software; you can redistribute it and/or
# modify it under the terms of the GNU Lesser General Public
# License as published by the Free Software Foundation; either
# version 2.1 of the License, or (at your option) any later version.
#
# This library is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
# Lesser General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public
# License along with this library; if not, write to the Free Software
# Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
# 02110-1301  USA
######################### END LICENSE BLOCK #########################

from .charsetprober import CharSetProber
from .enums import CharacterCategory, ProbingState, SequenceLikelihood


class SingleByteCharSetProber(CharSetProber):
    SAMPLE_SIZE = 64
    SB_ENOUGH_REL_THRESHOLD = 1024  #  0.25 * SAMPLE_SIZE^2
    POSITIVE_SHORTCUT_THRESHOLD = 0.95
    NEGATIVE_SHORTCUT_THRESHOLD = 0.05

    def __init__(self, model, reversed=False, name_prober=None):
        super(SingleByteCharSetProber, self).__init__()
        self._model = model
        # TRUE if we need to reverse every pair in the model lookup
        self._reversed = reversed
        # Optional auxiliary prober for name decision
        self._name_prober = name_prober
        self._last_order = None
        self._seq_counters = None
        self._total_seqs = None
        self._total_char = None
        self._freq_char = None
        self.reset()

    def reset(self):
        super(SingleByteCharSetProber, self).reset()
        # char order of last character
        self._last_order = 255
        self._seq_counters = [0] * SequenceLikelihood.get_num_categories()
        self._total_seqs = 0
        self._total_char = 0
        # characters that fall in our sampling range
        self._freq_char = 0

    @property
    def charset_name(self):
        if self._name_prober:
            return self._name_prober.charset_name
        else:
            return self._model['charset_name']

    @property
    def language(self):
        if self._name_prober:
            return self._name_prober.language
        else:
            return self._model.get('language')

    def feed(self, byte_str):
        if not self._model['keep_english_letter']:
            byte_str = self.filter_international_words(byte_str)
        if not byte_str:
            return self.state
        char_to_order_map = self._model['char_to_order_map']
        for i, c in enumerate(byte_str):
            # XXX: Order is in range 1-64, so one would think we want 0-63 here,
            #      but that leads to 27 more test failures than before.
            order = char_to_order_map[c]
            # XXX: This was SYMBOL_CAT_ORDER before, with a value of 250, but
            #      CharacterCategory.SYMBOL is actually 253, so we use CONTROL
            #      to make it closer to the original intent. The only difference
            #      is whether or not we count digits and control characters for
            #      _total_char purposes.
            if order < CharacterCategory.CONTROL:
                self._total_char += 1
            if order < self.SAMPLE_SIZE:
                self._freq_char += 1
                if self._last_order < self.SAMPLE_SIZE:
                    self._total_seqs += 1
                    if not self._reversed:
                        i = (self._last_order * self.SAMPLE_SIZE) + order
                        model = self._model['precedence_matrix'][i]
                    else:  # reverse the order of the letters in the lookup
                        i = (order * self.SAMPLE_SIZE) + self._last_order
                        model = self._model['precedence_matrix'][i]
                    self._seq_counters[model] += 1
            self._last_order = order

        charset_name = self._model['charset_name']
        if self.state == ProbingState.DETECTING:
            if self._total_seqs > self.SB_ENOUGH_REL_THRESHOLD:
                confidence = self.get_confidence()
                if confidence > self.POSITIVE_SHORTCUT_THRESHOLD:
                    self.logger.debug('%s confidence = %s, we have a winner',
                                      charset_name, confidence)
                    self._state = ProbingState.FOUND_IT
                elif confidence < self.NEGATIVE_SHORTCUT_THRESHOLD:
                    self.logger.debug('%s confidence = %s, below negative '
                                      'shortcut threshold %s', charset_name,
                                      confidence,
                                      self.NEGATIVE_SHORTCUT_THRESHOLD)
                    self._state = ProbingState.NOT_ME

        return self.state

    def get_confidence(self):
        r = 0.01
        if self._total_seqs > 0:
            r = ((1.0 * self._seq_counters[SequenceLikelihood.POSITIVE]) /
                 self._total_seqs / self._model['typical_positive_ratio'])
            r = r * self._freq_char / self._total_char
            if r >= 1.0:
                r = 0.99
        return r
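
# --- Hedged usage sketch (not part of the original module) ---
# Feeds one of the bundled single-byte models straight into the prober. The
# import path assumes the pip-vendored copy of chardet, and the sample bytes
# are a short Turkish phrase pre-encoded as ISO-8859-9; the confidence
# printed is illustrative only. Because this module uses relative imports,
# run it as a module, e.g. ``python -m pip._vendor.chardet.sbcharsetprober``.
if __name__ == '__main__':
    from pip._vendor.chardet.langturkishmodel import Latin5TurkishModel

    prober = SingleByteCharSetProber(Latin5TurkishModel)
    prober.feed(b'T\xfcrk\xe7e metin i\xe7in k\xfc\xe7\xfck bir deneme '
                b'c\xfcmlesi.')
    print(prober.charset_name, prober.language, prober.get_confidence())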
site-packages/pip/_vendor/chardet/langhungarianmodel.py
######################## BEGIN LICENSE BLOCK ########################
# The Original Code is Mozilla Communicator client code.
#
# The Initial Developer of the Original Code is
# Netscape Communications Corporation.
# Portions created by the Initial Developer are Copyright (C) 1998
# the Initial Developer. All Rights Reserved.
#
# Contributor(s):
#   Mark Pilgrim - port to Python
#
# This library is free software; you can redistribute it and/or
# modify it under the terms of the GNU Lesser General Public
# License as published by the Free Software Foundation; either
# version 2.1 of the License, or (at your option) any later version.
#
# This library is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
# Lesser General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public
# License along with this library; if not, write to the Free Software
# Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
# 02110-1301  USA
######################### END LICENSE BLOCK #########################

# 255: Control characters that usually do not exist in any text
# 254: Carriage/Return
# 253: symbol (punctuation) that does not belong to word
# 252: 0 - 9

# Character Mapping Table:
Latin2_HungarianCharToOrderMap = (
255,255,255,255,255,255,255,255,255,255,254,255,255,254,255,255,  # 00
255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,  # 10
253,253,253,253,253,253,253,253,253,253,253,253,253,253,253,253,  # 20
252,252,252,252,252,252,252,252,252,252,253,253,253,253,253,253,  # 30
253, 28, 40, 54, 45, 32, 50, 49, 38, 39, 53, 36, 41, 34, 35, 47,
 46, 71, 43, 33, 37, 57, 48, 64, 68, 55, 52,253,253,253,253,253,
253,  2, 18, 26, 17,  1, 27, 12, 20,  9, 22,  7,  6, 13,  4,  8,
 23, 67, 10,  5,  3, 21, 19, 65, 62, 16, 11,253,253,253,253,253,
159,160,161,162,163,164,165,166,167,168,169,170,171,172,173,174,
175,176,177,178,179,180,181,182,183,184,185,186,187,188,189,190,
191,192,193,194,195,196,197, 75,198,199,200,201,202,203,204,205,
 79,206,207,208,209,210,211,212,213,214,215,216,217,218,219,220,
221, 51, 81,222, 78,223,224,225,226, 44,227,228,229, 61,230,231,
232,233,234, 58,235, 66, 59,236,237,238, 60, 69, 63,239,240,241,
 82, 14, 74,242, 70, 80,243, 72,244, 15, 83, 77, 84, 30, 76, 85,
245,246,247, 25, 73, 42, 24,248,249,250, 31, 56, 29,251,252,253,
)

win1250HungarianCharToOrderMap = (
255,255,255,255,255,255,255,255,255,255,254,255,255,254,255,255,  # 00
255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,  # 10
253,253,253,253,253,253,253,253,253,253,253,253,253,253,253,253,  # 20
252,252,252,252,252,252,252,252,252,252,253,253,253,253,253,253,  # 30
253, 28, 40, 54, 45, 32, 50, 49, 38, 39, 53, 36, 41, 34, 35, 47,
 46, 72, 43, 33, 37, 57, 48, 64, 68, 55, 52,253,253,253,253,253,
253,  2, 18, 26, 17,  1, 27, 12, 20,  9, 22,  7,  6, 13,  4,  8,
 23, 67, 10,  5,  3, 21, 19, 65, 62, 16, 11,253,253,253,253,253,
161,162,163,164,165,166,167,168,169,170,171,172,173,174,175,176,
177,178,179,180, 78,181, 69,182,183,184,185,186,187,188,189,190,
191,192,193,194,195,196,197, 76,198,199,200,201,202,203,204,205,
 81,206,207,208,209,210,211,212,213,214,215,216,217,218,219,220,
221, 51, 83,222, 80,223,224,225,226, 44,227,228,229, 61,230,231,
232,233,234, 58,235, 66, 59,236,237,238, 60, 70, 63,239,240,241,
 84, 14, 75,242, 71, 82,243, 73,244, 15, 85, 79, 86, 30, 77, 87,
245,246,247, 25, 74, 42, 24,248,249,250, 31, 56, 29,251,252,253,
)

# Model Table:
# total sequences: 100%
# first 512 sequences: 94.7368%
# first 1024 sequences:5.2623%
# rest  sequences:     0.8894%
# negative sequences:  0.0009%
HungarianLangModel = (
0,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,1,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,
3,3,3,3,3,3,3,3,3,3,2,3,3,3,3,3,3,3,3,2,2,3,3,1,1,2,2,2,2,2,1,2,
3,2,2,3,3,3,3,3,2,3,3,3,3,3,3,1,2,3,3,3,3,2,3,3,1,1,3,3,0,1,1,1,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,2,0,
3,2,1,3,3,3,3,3,2,3,3,3,3,3,1,1,2,3,3,3,3,3,3,3,1,1,3,2,0,1,1,1,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,
3,3,3,3,3,3,3,3,3,3,3,1,1,2,3,3,3,1,3,3,3,3,3,1,3,3,2,2,0,3,2,3,
0,0,0,0,0,0,0,0,0,0,3,0,0,0,0,0,0,0,0,0,0,0,0,0,2,0,0,0,0,0,0,0,
3,3,3,3,3,3,2,3,3,3,2,3,3,2,3,3,3,3,3,2,3,3,2,2,3,2,3,2,0,3,2,2,
0,0,0,0,0,0,0,0,0,0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,
3,3,3,3,3,3,2,3,3,3,3,3,2,3,3,3,1,2,3,2,2,3,1,2,3,3,2,2,0,3,3,3,
0,0,0,0,0,0,0,0,0,0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,
3,3,3,3,3,3,3,3,3,3,2,2,3,3,3,3,3,3,2,3,3,3,3,2,3,3,3,3,0,2,3,2,
0,0,0,1,1,0,0,0,0,0,3,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,
3,3,3,3,3,3,3,3,3,3,3,1,1,1,3,3,2,1,3,2,2,3,2,1,3,2,2,1,0,3,3,1,
0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,
3,2,2,3,3,3,3,3,1,2,3,3,3,3,1,2,1,3,3,3,3,2,2,3,1,1,3,2,0,1,1,1,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,
3,3,3,3,3,3,3,3,2,2,3,3,3,3,3,2,1,3,3,3,3,3,2,2,1,3,3,3,0,1,1,2,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,
3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,2,3,3,3,2,3,3,2,3,3,3,2,0,3,2,3,
0,0,0,0,0,0,0,0,0,0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,2,0,0,0,0,0,1,0,
3,3,3,3,3,3,2,3,3,3,2,3,2,3,3,3,1,3,2,2,2,3,1,1,3,3,1,1,0,3,3,2,
0,0,0,0,0,0,0,0,0,0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,
3,3,3,3,3,3,3,2,3,3,3,2,3,2,3,3,3,2,3,3,3,3,3,1,2,3,2,2,0,2,2,2,
0,0,0,0,0,0,0,0,0,0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,
3,3,3,2,2,2,3,1,3,3,2,2,1,3,3,3,1,1,3,1,2,3,2,3,2,2,2,1,0,2,2,2,
0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,2,0,0,0,0,0,0,0,
3,1,1,3,3,3,3,3,1,2,3,3,3,3,1,2,1,3,3,3,2,2,3,2,1,0,3,2,0,1,1,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
3,1,1,3,3,3,3,3,1,2,3,3,3,3,1,1,0,3,3,3,3,0,2,3,0,0,2,1,0,1,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
3,3,3,3,3,3,2,2,3,3,2,2,2,2,3,3,0,1,2,3,2,3,2,2,3,2,1,2,0,2,2,2,
0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,2,0,0,0,0,0,0,0,
3,3,3,3,3,3,1,2,3,3,3,2,1,2,3,3,2,2,2,3,2,3,3,1,3,3,1,1,0,2,3,2,
0,0,0,0,0,0,0,0,0,0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,
3,3,3,1,2,2,2,2,3,3,3,1,1,1,3,3,1,1,3,1,1,3,2,1,2,3,1,1,0,2,2,2,
0,0,0,0,0,0,0,0,0,0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,
3,3,3,2,1,2,1,1,3,3,1,1,1,1,3,3,1,1,2,2,1,2,1,1,2,2,1,1,0,2,2,1,
0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,
3,3,3,1,1,2,1,1,3,3,1,0,1,1,3,3,2,0,1,1,2,3,1,0,2,2,1,0,0,1,3,2,
0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,
3,2,1,3,3,3,3,3,1,2,3,2,3,3,2,1,1,3,2,3,2,1,2,2,0,1,2,1,0,0,1,1,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,
3,3,3,3,2,2,2,2,3,1,2,2,1,1,3,3,0,3,2,1,2,3,2,1,3,3,1,1,0,2,1,3,
0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,
3,3,3,2,2,2,3,2,3,3,3,2,1,1,3,3,1,1,1,2,2,3,2,3,2,2,2,1,0,2,2,1,
0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,
1,0,0,3,3,3,3,3,0,0,3,3,2,3,0,0,0,2,3,3,1,0,1,2,0,0,1,1,0,0,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
3,1,2,3,3,3,3,3,1,2,3,3,2,2,1,1,0,3,3,2,2,1,2,2,1,0,2,2,0,1,1,1,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
3,3,2,2,1,3,1,2,3,3,2,2,1,1,2,2,1,1,1,1,3,2,1,1,1,1,2,1,0,1,2,1,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,0,0,0,0,0,0,0,
2,3,3,1,1,1,1,1,3,3,3,0,1,1,3,3,1,1,1,1,1,2,2,0,3,1,1,2,0,2,1,1,
0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,
3,1,0,1,2,1,2,2,0,1,2,3,1,2,0,0,0,2,1,1,1,1,1,2,0,0,1,1,0,0,0,0,
1,2,1,2,2,2,1,2,1,2,0,2,0,2,2,1,1,2,1,1,2,1,1,1,0,1,0,0,0,1,1,0,
1,1,1,2,3,2,3,3,0,1,2,2,3,1,0,1,0,2,1,2,2,0,1,1,0,0,1,1,0,0,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
1,0,0,3,3,2,2,1,0,0,3,2,3,2,0,0,0,1,1,3,0,0,1,1,0,0,2,1,0,0,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
3,1,1,2,2,3,3,1,0,1,3,2,3,1,1,1,0,1,1,1,1,1,3,1,0,0,2,2,0,0,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
3,1,1,1,2,2,2,1,0,1,2,3,3,2,0,0,0,2,1,1,1,2,1,1,1,0,1,1,1,0,0,0,
1,2,2,2,2,2,1,1,1,2,0,2,1,1,1,1,1,2,1,1,1,1,1,1,0,1,1,1,0,0,1,1,
3,2,2,1,0,0,1,1,2,2,0,3,0,1,2,1,1,0,0,1,1,1,0,1,1,1,1,0,2,1,1,1,
2,2,1,1,1,2,1,2,1,1,1,1,1,1,1,2,1,1,1,2,3,1,1,1,1,1,1,1,1,1,0,1,
2,3,3,0,1,0,0,0,3,3,1,0,0,1,2,2,1,0,0,0,0,2,0,0,1,1,1,0,2,1,1,1,
2,1,1,1,1,1,1,2,1,1,0,1,1,0,1,1,1,0,1,2,1,1,0,1,1,1,1,1,1,1,0,1,
2,3,3,0,1,0,0,0,2,2,0,0,0,0,1,2,2,0,0,0,0,1,0,0,1,1,0,0,2,0,1,0,
2,1,1,1,1,2,1,1,1,1,1,1,1,2,1,1,1,1,1,1,1,1,1,2,0,1,1,1,1,1,0,1,
3,2,2,0,1,0,1,0,2,3,2,0,0,1,2,2,1,0,0,1,1,1,0,0,2,1,0,1,2,2,1,1,
2,1,1,1,1,1,1,2,1,1,1,1,1,1,0,2,1,0,1,1,0,1,1,1,0,1,1,2,1,1,0,1,
2,2,2,0,0,1,0,0,2,2,1,1,0,0,2,1,1,0,0,0,1,2,0,0,2,1,0,0,2,1,1,1,
2,1,1,1,1,2,1,2,1,1,1,2,2,1,1,2,1,1,1,2,1,1,1,1,1,1,1,1,1,1,0,1,
1,2,3,0,0,0,1,0,3,2,1,0,0,1,2,1,1,0,0,0,0,2,1,0,1,1,0,0,2,1,2,1,
1,1,0,0,0,1,0,1,1,1,1,1,2,0,0,1,0,0,0,2,0,0,1,1,1,1,1,1,1,1,0,1,
3,0,0,2,1,2,2,1,0,0,2,1,2,2,0,0,0,2,1,1,1,0,1,1,0,0,1,1,2,0,0,0,
1,2,1,2,2,1,1,2,1,2,0,1,1,1,1,1,1,1,1,1,2,1,1,0,0,1,1,1,1,0,0,1,
1,3,2,0,0,0,1,0,2,2,2,0,0,0,2,2,1,0,0,0,0,3,1,1,1,1,0,0,2,1,1,1,
2,1,0,1,1,1,0,1,1,1,1,1,1,1,0,2,1,0,0,1,0,1,1,0,1,1,1,1,1,1,0,1,
2,3,2,0,0,0,1,0,2,2,0,0,0,0,2,1,1,0,0,0,0,2,1,0,1,1,0,0,2,1,1,0,
2,1,1,1,1,2,1,2,1,2,0,1,1,1,0,2,1,1,1,2,1,1,1,1,0,1,1,1,1,1,0,1,
3,1,1,2,2,2,3,2,1,1,2,2,1,1,0,1,0,2,2,1,1,1,1,1,0,0,1,1,0,1,1,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
2,2,2,0,0,0,0,0,2,2,0,0,0,0,2,2,1,0,0,0,1,1,0,0,1,2,0,0,2,1,1,1,
2,2,1,1,1,2,1,2,1,1,0,1,1,1,1,2,1,1,1,2,1,1,1,1,0,1,2,1,1,1,0,1,
1,0,0,1,2,3,2,1,0,0,2,0,1,1,0,0,0,1,1,1,1,0,1,1,0,0,1,0,0,0,0,0,
1,2,1,2,1,2,1,1,1,2,0,2,1,1,1,0,1,2,0,0,1,1,1,0,0,0,0,0,0,0,0,0,
2,3,2,0,0,0,0,0,1,1,2,1,0,0,1,1,1,0,0,0,0,2,0,0,1,1,0,0,2,1,1,1,
2,1,1,1,1,1,1,2,1,0,1,1,1,1,0,2,1,1,1,1,1,1,0,1,0,1,1,1,1,1,0,1,
1,2,2,0,1,1,1,0,2,2,2,0,0,0,3,2,1,0,0,0,1,1,0,0,1,1,0,1,1,1,0,0,
1,1,0,1,1,1,1,1,1,1,1,2,1,1,1,1,1,1,1,2,1,1,1,0,0,1,1,1,0,1,0,1,
2,1,0,2,1,1,2,2,1,1,2,1,1,1,0,0,0,1,1,0,1,1,1,1,0,0,1,1,1,0,0,0,
1,2,2,2,2,2,1,1,1,2,0,2,1,1,1,1,1,1,1,1,1,1,1,1,0,1,1,0,0,0,1,0,
1,2,3,0,0,0,1,0,2,2,0,0,0,0,2,2,0,0,0,0,0,1,0,0,1,0,0,0,2,0,1,0,
2,1,1,1,1,1,0,2,0,0,0,1,2,1,1,1,1,0,1,2,0,1,0,1,0,1,1,1,0,1,0,1,
2,2,2,0,0,0,1,0,2,1,2,0,0,0,1,1,2,0,0,0,0,1,0,0,1,1,0,0,2,1,0,1,
2,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,2,0,1,1,1,1,1,0,1,
1,2,2,0,0,0,1,0,2,2,2,0,0,0,1,1,0,0,0,0,0,1,1,0,2,0,0,1,1,1,0,1,
1,0,1,1,1,1,1,1,0,1,1,1,1,0,0,1,0,0,1,1,0,1,0,1,1,1,1,1,0,0,0,1,
1,0,0,1,0,1,2,1,0,0,1,1,1,2,0,0,0,1,1,0,1,0,1,1,0,0,1,0,0,0,0,0,
0,2,1,2,1,1,1,1,1,2,0,2,0,1,1,0,1,2,1,0,1,1,1,0,0,0,0,0,0,1,0,0,
2,1,1,0,1,2,0,0,1,1,1,0,0,0,1,1,0,0,0,0,0,1,0,0,1,0,0,0,2,1,0,1,
2,2,1,1,1,1,1,2,1,1,0,1,1,1,1,2,1,1,1,2,1,1,0,1,0,1,1,1,1,1,0,1,
1,2,2,0,0,0,0,0,1,1,0,0,0,0,2,1,0,0,0,0,0,2,0,0,2,2,0,0,2,0,0,1,
2,1,1,1,1,1,1,1,0,1,1,0,1,1,0,1,0,0,0,1,1,1,1,0,0,1,1,1,1,0,0,1,
1,1,2,0,0,3,1,0,2,1,1,1,0,0,1,1,1,0,0,0,1,1,0,0,0,1,0,0,1,0,1,0,
1,2,1,0,1,1,1,2,1,1,0,1,1,1,1,1,0,0,0,1,1,1,1,1,0,1,0,0,0,1,0,0,
2,1,1,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,1,0,0,0,1,0,0,0,0,2,0,0,0,
2,1,1,1,1,1,1,1,1,1,0,1,1,1,1,1,1,1,1,1,2,1,1,0,0,1,1,1,1,1,0,1,
2,1,1,1,2,1,1,1,0,1,1,2,1,0,0,0,0,1,1,1,1,0,1,0,0,0,0,1,0,0,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
1,1,0,1,1,1,1,1,0,0,1,1,2,1,0,0,0,1,1,0,0,0,1,1,0,0,1,0,1,0,0,0,
1,2,1,1,1,1,1,1,1,1,0,1,0,1,1,1,1,1,1,0,1,1,1,0,0,0,0,0,0,1,0,0,
2,0,0,0,1,1,1,1,0,0,1,1,0,0,0,0,0,1,1,1,2,0,0,1,0,0,1,0,1,0,0,0,
0,1,1,1,1,1,1,1,1,2,0,1,1,1,1,0,1,1,1,0,1,1,1,0,0,0,0,0,0,0,0,0,
1,0,0,1,1,1,1,1,0,0,2,1,0,1,0,0,0,1,0,1,0,0,0,0,0,0,1,0,0,0,0,0,
0,1,1,1,1,1,1,0,1,1,0,1,0,1,1,0,1,1,0,0,1,1,1,0,0,0,0,0,0,0,0,0,
1,0,0,1,1,1,0,0,0,0,1,0,2,0,0,0,0,0,0,0,0,0,2,0,0,0,0,0,0,0,0,0,
0,1,1,1,1,1,0,0,1,1,0,1,0,1,0,0,1,1,1,0,1,1,1,0,0,0,0,0,0,0,0,0,
0,0,0,1,0,0,0,0,0,0,1,1,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,1,1,1,0,1,0,0,1,1,0,1,0,1,1,0,1,1,1,0,1,1,1,0,0,0,0,0,0,0,0,0,
2,1,1,1,1,1,1,1,1,1,1,0,0,1,1,1,0,0,1,0,0,1,0,1,0,1,1,1,0,0,1,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
1,0,0,1,1,1,1,0,0,0,1,1,1,0,0,0,0,1,1,1,0,0,0,0,0,0,0,0,0,0,0,0,
0,1,1,1,1,1,1,0,1,1,0,1,0,1,0,0,1,1,0,0,1,1,0,0,0,0,0,0,0,0,0,0,
)

Latin2HungarianModel = {
  'char_to_order_map': Latin2_HungarianCharToOrderMap,
  'precedence_matrix': HungarianLangModel,
  'typical_positive_ratio': 0.947368,
  'keep_english_letter': True,
  'charset_name': "ISO-8859-2",
  'language': 'Hungarian',
}

Win1250HungarianModel = {
  'char_to_order_map': win1250HungarianCharToOrderMap,
  'precedence_matrix': HungarianLangModel,
  'typical_positive_ratio': 0.947368,
  'keep_english_letter': True,
  'charset_name': "windows-1250",
  'language': 'Hungarian',
}
site-packages/pip/_vendor/chardet/langthaimodel.py
######################## BEGIN LICENSE BLOCK ########################
# The Original Code is Mozilla Communicator client code.
#
# The Initial Developer of the Original Code is
# Netscape Communications Corporation.
# Portions created by the Initial Developer are Copyright (C) 1998
# the Initial Developer. All Rights Reserved.
#
# Contributor(s):
#   Mark Pilgrim - port to Python
#
# This library is free software; you can redistribute it and/or
# modify it under the terms of the GNU Lesser General Public
# License as published by the Free Software Foundation; either
# version 2.1 of the License, or (at your option) any later version.
#
# This library is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
# Lesser General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public
# License along with this library; if not, write to the Free Software
# Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
# 02110-1301  USA
######################### END LICENSE BLOCK #########################

# 255: Control characters that usually do not exist in any text
# 254: Carriage/Return
# 253: symbol (punctuation) that does not belong to word
# 252: 0 - 9

# The following result for Thai was collected from a limited sample (1M).

# Character Mapping Table:
TIS620CharToOrderMap = (
255,255,255,255,255,255,255,255,255,255,254,255,255,254,255,255,  # 00
255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,  # 10
253,253,253,253,253,253,253,253,253,253,253,253,253,253,253,253,  # 20
252,252,252,252,252,252,252,252,252,252,253,253,253,253,253,253,  # 30
253,182,106,107,100,183,184,185,101, 94,186,187,108,109,110,111,  # 40
188,189,190, 89, 95,112,113,191,192,193,194,253,253,253,253,253,  # 50
253, 64, 72, 73,114, 74,115,116,102, 81,201,117, 90,103, 78, 82,  # 60
 96,202, 91, 79, 84,104,105, 97, 98, 92,203,253,253,253,253,253,  # 70
209,210,211,212,213, 88,214,215,216,217,218,219,220,118,221,222,
223,224, 99, 85, 83,225,226,227,228,229,230,231,232,233,234,235,
236,  5, 30,237, 24,238, 75,  8, 26, 52, 34, 51,119, 47, 58, 57,
 49, 53, 55, 43, 20, 19, 44, 14, 48,  3, 17, 25, 39, 62, 31, 54,
 45,  9, 16,  2, 61, 15,239, 12, 42, 46, 18, 21, 76,  4, 66, 63,
 22, 10,  1, 36, 23, 13, 40, 27, 32, 35, 86,240,241,242,243,244,
 11, 28, 41, 29, 33,245, 50, 37,  6,  7, 67, 77, 38, 93,246,247,
 68, 56, 59, 65, 69, 60, 70, 80, 71, 87,248,249,250,251,252,253,
)

# Model Table:
# total sequences: 100%
# first 512 sequences: 92.6386%
# first 1024 sequences:7.3177%
# rest  sequences:     1.0230%
# negative sequences:  0.0436%
ThaiLangModel = (
0,1,3,3,3,3,0,0,3,3,0,3,3,0,3,3,3,3,3,3,3,3,0,0,3,3,3,0,3,3,3,3,
0,3,3,0,0,0,1,3,0,3,3,2,3,3,0,1,2,3,3,3,3,0,2,0,2,0,0,3,2,1,2,2,
3,0,3,3,2,3,0,0,3,3,0,3,3,0,3,3,3,3,3,3,3,3,3,0,3,2,3,0,2,2,2,3,
0,2,3,0,0,0,0,1,0,1,2,3,1,1,3,2,2,0,1,1,0,0,1,0,0,0,0,0,0,0,1,1,
3,3,3,2,3,3,3,3,3,3,3,3,3,3,3,2,2,2,2,2,2,2,3,3,2,3,2,3,3,2,2,2,
3,1,2,3,0,3,3,2,2,1,2,3,3,1,2,0,1,3,0,1,0,0,1,0,0,0,0,0,0,0,1,1,
3,3,2,2,3,3,3,3,1,2,3,3,3,3,3,2,2,2,2,3,3,2,2,3,3,2,2,3,2,3,2,2,
3,3,1,2,3,1,2,2,3,3,1,0,2,1,0,0,3,1,2,1,0,0,1,0,0,0,0,0,0,1,0,1,
3,3,3,3,3,3,2,2,3,3,3,3,2,3,2,2,3,3,2,2,3,2,2,2,2,1,1,3,1,2,1,1,
3,2,1,0,2,1,0,1,0,1,1,0,1,1,0,0,1,0,1,0,0,0,1,0,0,0,0,0,0,0,0,0,
3,3,3,2,3,2,3,3,2,2,3,2,3,3,2,3,1,1,2,3,2,2,2,3,2,2,2,2,2,1,2,1,
2,2,1,1,3,3,2,1,0,1,2,2,0,1,3,0,0,0,1,1,0,0,0,0,0,2,3,0,0,2,1,1,
3,3,2,3,3,2,0,0,3,3,0,3,3,0,2,2,3,1,2,2,1,1,1,0,2,2,2,0,2,2,1,1,
0,2,1,0,2,0,0,2,0,1,0,0,1,0,0,0,1,1,1,1,0,0,0,0,0,0,0,0,0,0,1,0,
3,3,2,3,3,2,0,0,3,3,0,2,3,0,2,1,2,2,2,2,1,2,0,0,2,2,2,0,2,2,1,1,
0,2,1,0,2,0,0,2,0,1,1,0,1,0,0,0,0,0,0,1,0,0,1,0,0,0,0,0,0,0,0,0,
3,3,2,3,2,3,2,0,2,2,1,3,2,1,3,2,1,2,3,2,2,3,0,2,3,2,2,1,2,2,2,2,
1,2,2,0,0,0,0,2,0,1,2,0,1,1,1,0,1,0,3,1,1,0,0,0,0,0,0,0,0,0,1,0,
3,3,2,3,3,2,3,2,2,2,3,2,2,3,2,2,1,2,3,2,2,3,1,3,2,2,2,3,2,2,2,3,
3,2,1,3,0,1,1,1,0,2,1,1,1,1,1,0,1,0,1,1,0,0,0,0,0,0,0,0,0,2,0,0,
1,0,0,3,0,3,3,3,3,3,0,0,3,0,2,2,3,3,3,3,3,0,0,0,1,1,3,0,0,0,0,2,
0,0,1,0,0,0,0,0,0,0,2,3,0,0,0,3,0,2,0,0,0,0,0,3,0,0,0,0,0,0,0,0,
2,0,3,3,3,3,0,0,2,3,0,0,3,0,3,3,2,3,3,3,3,3,0,0,3,3,3,0,0,0,3,3,
0,0,3,0,0,0,0,2,0,0,2,1,1,3,0,0,1,0,0,2,3,0,1,0,0,0,0,0,0,0,1,0,
3,3,3,3,2,3,3,3,3,3,3,3,1,2,1,3,3,2,2,1,2,2,2,3,1,1,2,0,2,1,2,1,
2,2,1,0,0,0,1,1,0,1,0,1,1,0,0,0,0,0,1,1,0,0,1,0,0,0,0,0,0,0,0,0,
3,0,2,1,2,3,3,3,0,2,0,2,2,0,2,1,3,2,2,1,2,1,0,0,2,2,1,0,2,1,2,2,
0,1,1,0,0,0,0,1,0,1,1,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,
3,3,3,3,2,1,3,3,1,1,3,0,2,3,1,1,3,2,1,1,2,0,2,2,3,2,1,1,1,1,1,2,
3,0,0,1,3,1,2,1,2,0,3,0,0,0,1,0,3,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,
3,3,1,1,3,2,3,3,3,1,3,2,1,3,2,1,3,2,2,2,2,1,3,3,1,2,1,3,1,2,3,0,
2,1,1,3,2,2,2,1,2,1,0,0,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,2,
3,3,2,3,2,3,3,2,3,2,3,2,3,3,2,1,0,3,2,2,2,1,2,2,2,1,2,2,1,2,1,1,
2,2,2,3,0,1,3,1,1,1,1,0,1,1,0,2,1,0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,
3,3,3,3,2,3,2,2,1,1,3,2,3,2,3,2,0,3,2,2,1,2,0,2,2,2,1,2,2,2,2,1,
3,2,1,2,2,1,0,2,0,1,0,0,1,1,0,0,0,0,0,1,1,0,1,0,0,0,0,0,0,0,0,1,
3,3,3,3,3,2,3,1,2,3,3,2,2,3,0,1,1,2,0,3,3,2,2,3,0,1,1,3,0,0,0,0,
3,1,0,3,3,0,2,0,2,1,0,0,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
3,3,3,2,3,2,3,3,0,1,3,1,1,2,1,2,1,1,3,1,1,0,2,3,1,1,1,1,1,1,1,1,
3,1,1,2,2,2,2,1,1,1,0,0,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,
3,2,2,1,1,2,1,3,3,2,3,2,2,3,2,2,3,1,2,2,1,2,0,3,2,1,2,2,2,2,2,1,
3,2,1,2,2,2,1,1,1,1,0,0,1,1,0,0,0,0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,
3,3,3,3,3,3,3,3,1,3,3,0,2,1,0,3,2,0,0,3,1,0,1,1,0,1,0,0,0,0,0,1,
1,0,0,1,0,3,2,0,0,0,0,0,0,0,0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
3,0,2,2,2,3,0,0,1,3,0,3,2,0,3,2,2,3,3,3,3,3,1,0,2,2,2,0,2,2,1,2,
0,2,3,0,0,0,0,1,0,1,0,0,1,1,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,
3,0,2,3,1,3,3,2,3,3,0,3,3,0,3,2,2,3,2,3,3,3,0,0,2,2,3,0,1,1,1,3,
0,0,3,0,0,0,2,2,0,1,3,0,1,2,2,2,3,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,
3,2,3,3,2,0,3,3,2,2,3,1,3,2,1,3,2,0,1,2,2,0,2,3,2,1,0,3,0,0,0,0,
3,0,0,2,3,1,3,0,0,3,0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
3,1,3,2,2,2,1,2,0,1,3,1,1,3,1,3,0,0,2,1,1,1,1,2,1,1,1,0,2,1,0,1,
1,2,0,0,0,3,1,1,0,0,0,0,1,0,1,0,0,1,0,1,0,0,0,0,0,3,1,0,0,0,1,0,
3,3,3,3,2,2,2,2,2,1,3,1,1,1,2,0,1,1,2,1,2,1,3,2,0,0,3,1,1,1,1,1,
3,1,0,2,3,0,0,0,3,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,0,0,2,3,0,3,3,0,2,0,0,0,0,0,0,0,3,0,0,1,0,0,0,0,0,0,0,0,0,0,0,
0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,0,2,3,1,3,0,0,1,2,0,0,2,0,3,3,2,3,3,3,2,3,0,0,2,2,2,0,0,0,2,2,
0,0,1,0,0,0,0,3,0,0,0,0,2,0,0,0,0,0,0,0,0,0,2,0,0,0,0,0,0,0,0,0,
0,0,0,3,0,2,0,0,0,0,0,0,0,0,0,0,1,2,3,1,3,3,0,0,1,0,3,0,0,0,0,0,
0,0,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
3,3,1,2,3,1,2,3,1,0,3,0,2,2,1,0,2,1,1,2,0,1,0,0,1,1,1,1,0,1,0,0,
1,0,0,0,0,1,1,0,3,0,0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
3,3,3,3,2,1,0,1,1,1,3,1,2,2,2,2,2,2,1,1,1,1,0,3,1,0,1,3,1,1,1,1,
1,1,0,2,0,1,3,1,1,0,0,1,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,2,0,1,
3,0,2,2,1,3,3,2,3,3,0,1,1,0,2,2,1,2,1,3,3,1,0,0,3,2,0,0,0,0,2,1,
0,1,0,0,0,0,1,2,0,1,1,3,1,1,2,2,1,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,
0,0,3,0,0,1,0,0,0,3,0,0,3,0,3,1,0,1,1,1,3,2,0,0,0,3,0,0,0,0,2,0,
0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,2,0,0,0,0,0,0,0,0,0,
3,3,1,3,2,1,3,3,1,2,2,0,1,2,1,0,1,2,0,0,0,0,0,3,0,0,0,3,0,0,0,0,
3,0,0,1,1,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
3,0,1,2,0,3,3,3,2,2,0,1,1,0,1,3,0,0,0,2,2,0,0,0,0,3,1,0,1,0,0,0,
0,0,0,0,0,0,0,0,0,1,0,1,0,0,0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
3,0,2,3,1,2,0,0,2,1,0,3,1,0,1,2,0,1,1,1,1,3,0,0,3,1,1,0,2,2,1,1,
0,2,0,0,0,0,0,1,0,1,0,0,1,1,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
3,0,0,3,1,2,0,0,2,2,0,1,2,0,1,0,1,3,1,2,1,0,0,0,2,0,3,0,0,0,1,0,
0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
3,0,1,1,2,2,0,0,0,2,0,2,1,0,1,1,0,1,1,1,2,1,0,0,1,1,1,0,2,1,1,1,
0,1,1,0,0,0,0,0,0,1,0,0,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,
0,0,0,2,0,1,3,1,1,1,1,0,0,0,0,3,2,0,1,0,0,0,1,2,0,0,0,1,0,0,0,0,
0,0,0,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,0,0,0,0,3,3,3,3,1,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
1,0,2,3,2,2,0,0,0,1,0,0,0,0,2,3,2,1,2,2,3,0,0,0,2,3,1,0,0,0,1,1,
0,0,1,0,0,0,0,0,0,0,1,0,0,1,0,0,0,0,0,1,1,0,1,0,0,0,0,0,0,0,0,0,
3,3,2,2,0,1,0,0,0,0,2,0,2,0,1,0,0,0,1,1,0,0,0,2,1,0,1,0,1,1,0,0,
0,1,0,2,0,0,1,0,3,0,1,0,0,0,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
3,3,1,0,0,1,0,0,0,0,0,1,1,2,0,0,0,0,1,0,0,1,3,1,0,0,0,0,1,1,0,0,
0,1,0,0,0,0,3,0,0,0,0,0,0,3,0,0,0,0,0,0,0,3,0,0,0,0,0,0,0,0,0,0,
3,3,1,1,1,1,2,3,0,0,2,1,1,1,1,1,0,2,1,1,0,0,0,2,1,0,1,2,1,1,0,1,
2,1,0,3,0,0,0,0,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
1,3,1,0,0,0,0,0,0,0,3,0,0,0,3,0,0,0,0,0,0,0,0,1,1,0,0,0,0,0,0,1,
0,0,0,2,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
3,3,2,0,0,0,0,0,0,1,2,1,0,1,1,0,2,0,0,1,0,0,2,0,0,0,0,0,0,0,0,0,
0,0,0,0,0,0,2,0,0,0,1,3,0,1,0,0,0,2,0,0,0,0,0,0,0,1,2,0,0,0,0,0,
3,3,0,0,1,1,2,0,0,1,2,1,0,1,1,1,0,1,1,0,0,2,1,1,0,1,0,0,1,1,1,0,
0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,3,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,
2,2,2,1,0,0,0,0,1,0,0,0,0,3,0,0,0,0,0,0,0,0,0,3,0,0,0,0,0,0,0,0,
2,0,0,0,0,0,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
2,3,0,0,1,1,0,0,0,2,0,0,0,0,0,0,0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,0,0,0,0,0,1,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
1,1,0,1,2,0,1,2,0,0,1,1,0,2,0,1,0,0,1,0,0,0,0,1,0,0,0,2,0,0,0,0,
1,0,0,1,0,1,1,0,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,1,0,0,0,0,0,0,0,1,1,0,1,1,0,2,1,3,0,0,0,0,1,1,0,0,0,0,0,0,0,3,
1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,2,0,0,0,0,0,0,0,0,
0,0,0,0,0,0,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
2,0,1,0,1,0,0,2,0,0,2,0,0,1,1,2,0,0,1,1,0,0,0,1,0,0,0,1,1,0,0,0,
1,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,2,0,0,0,0,0,0,0,0,0,
1,0,0,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,2,0,0,1,1,0,0,0,
2,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,3,0,0,0,0,0,0,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
2,0,0,0,0,2,0,0,0,0,0,0,0,2,0,0,0,0,0,0,0,1,0,1,0,0,0,0,0,0,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,3,0,0,0,
2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,2,0,0,1,0,0,0,0,
1,0,0,0,0,0,0,0,0,1,0,0,0,0,2,0,0,0,0,2,0,0,0,0,0,0,0,0,0,0,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,0,1,1,0,0,2,1,0,0,1,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,0,0,0,0,0,0,0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
)

TIS620ThaiModel = {
  'char_to_order_map': TIS620CharToOrderMap,
  'precedence_matrix': ThaiLangModel,
  'typical_positive_ratio': 0.926386,
  'keep_english_letter': False,
  'charset_name': "TIS-620",
  'language': 'Thai',
}
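
# --- Editor's note: a minimal usage sketch, not part of the original module.
# It assumes chardet's sbcharsetprober module (whose SingleByteCharSetProber
# consumes model dicts shaped like TIS620ThaiModel) is importable; the pip
# vendored import path below is an assumption based on the neighbouring files.
if __name__ == '__main__':
    from pip._vendor.chardet.sbcharsetprober import SingleByteCharSetProber

    # Feed Thai text encoded as TIS-620 and read back the prober's verdict.
    prober = SingleByteCharSetProber(TIS620ThaiModel)
    prober.feed('ภาษาไทย'.encode('tis-620'))
    print(prober.charset_name, prober.get_confidence())
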
site-packages/pip/_vendor/chardet/escprober.py
######################## BEGIN LICENSE BLOCK ########################
# The Original Code is mozilla.org code.
#
# The Initial Developer of the Original Code is
# Netscape Communications Corporation.
# Portions created by the Initial Developer are Copyright (C) 1998
# the Initial Developer. All Rights Reserved.
#
# Contributor(s):
#   Mark Pilgrim - port to Python
#
# This library is free software; you can redistribute it and/or
# modify it under the terms of the GNU Lesser General Public
# License as published by the Free Software Foundation; either
# version 2.1 of the License, or (at your option) any later version.
#
# This library is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
# Lesser General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public
# License along with this library; if not, write to the Free Software
# Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
# 02110-1301  USA
######################### END LICENSE BLOCK #########################

from .charsetprober import CharSetProber
from .codingstatemachine import CodingStateMachine
from .enums import LanguageFilter, ProbingState, MachineState
from .escsm import (HZ_SM_MODEL, ISO2022CN_SM_MODEL, ISO2022JP_SM_MODEL,
                    ISO2022KR_SM_MODEL)


class EscCharSetProber(CharSetProber):
    """
    This CharSetProber uses a "code scheme" approach for detecting encodings,
    whereby easily recognizable escape or shift sequences are relied on to
    identify these encodings.
    """

    def __init__(self, lang_filter=None):
        super(EscCharSetProber, self).__init__(lang_filter=lang_filter)
        self.coding_sm = []
        if self.lang_filter & LanguageFilter.CHINESE_SIMPLIFIED:
            self.coding_sm.append(CodingStateMachine(HZ_SM_MODEL))
            self.coding_sm.append(CodingStateMachine(ISO2022CN_SM_MODEL))
        if self.lang_filter & LanguageFilter.JAPANESE:
            self.coding_sm.append(CodingStateMachine(ISO2022JP_SM_MODEL))
        if self.lang_filter & LanguageFilter.KOREAN:
            self.coding_sm.append(CodingStateMachine(ISO2022KR_SM_MODEL))
        self.active_sm_count = None
        self._detected_charset = None
        self._detected_language = None
        self._state = None
        self.reset()

    def reset(self):
        super(EscCharSetProber, self).reset()
        for coding_sm in self.coding_sm:
            if not coding_sm:
                continue
            coding_sm.active = True
            coding_sm.reset()
        self.active_sm_count = len(self.coding_sm)
        self._detected_charset = None
        self._detected_language = None

    @property
    def charset_name(self):
        return self._detected_charset

    @property
    def language(self):
        return self._detected_language

    def get_confidence(self):
        if self._detected_charset:
            return 0.99
        else:
            return 0.00

    def feed(self, byte_str):
        for c in byte_str:
            for coding_sm in self.coding_sm:
                if not coding_sm or not coding_sm.active:
                    continue
                coding_state = coding_sm.next_state(c)
                if coding_state == MachineState.ERROR:
                    coding_sm.active = False
                    self.active_sm_count -= 1
                    if self.active_sm_count <= 0:
                        self._state = ProbingState.NOT_ME
                        return self.state
                elif coding_state == MachineState.ITS_ME:
                    self._state = ProbingState.FOUND_IT
                    self._detected_charset = coding_sm.get_coding_state_machine()
                    self._detected_language = coding_sm.language
                    return self.state

        return self.state
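
# --- Editor's note: a minimal usage sketch, not part of the original module.
# Run it as a module (python -m) so the relative imports above resolve.  It
# assumes that an ISO-2022-JP escape sequence (ESC $ B) drives the matching
# state machine to ITS_ME, after which charset_name and a 0.99 confidence are
# reported.
if __name__ == '__main__':
    prober = EscCharSetProber(lang_filter=LanguageFilter.ALL)
    # '日本語' encoded as ISO-2022-JP begins with the ESC $ B shift sequence.
    prober.feed('日本語'.encode('iso-2022-jp'))
    print(prober.charset_name, prober.get_confidence())
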
site-packages/pip/_vendor/chardet/latin1prober.py
######################## BEGIN LICENSE BLOCK ########################
# The Original Code is Mozilla Universal charset detector code.
#
# The Initial Developer of the Original Code is
# Netscape Communications Corporation.
# Portions created by the Initial Developer are Copyright (C) 2001
# the Initial Developer. All Rights Reserved.
#
# Contributor(s):
#   Mark Pilgrim - port to Python
#   Shy Shalom - original C code
#
# This library is free software; you can redistribute it and/or
# modify it under the terms of the GNU Lesser General Public
# License as published by the Free Software Foundation; either
# version 2.1 of the License, or (at your option) any later version.
#
# This library is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
# Lesser General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public
# License along with this library; if not, write to the Free Software
# Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
# 02110-1301  USA
######################### END LICENSE BLOCK #########################

from .charsetprober import CharSetProber
from .enums import ProbingState

FREQ_CAT_NUM = 4

UDF = 0  # undefined
OTH = 1  # other
ASC = 2  # ascii capital letter
ASS = 3  # ascii small letter
ACV = 4  # accent capital vowel
ACO = 5  # accent capital other
ASV = 6  # accent small vowel
ASO = 7  # accent small other
CLASS_NUM = 8  # total classes

Latin1_CharToClass = (
    OTH, OTH, OTH, OTH, OTH, OTH, OTH, OTH,   # 00 - 07
    OTH, OTH, OTH, OTH, OTH, OTH, OTH, OTH,   # 08 - 0F
    OTH, OTH, OTH, OTH, OTH, OTH, OTH, OTH,   # 10 - 17
    OTH, OTH, OTH, OTH, OTH, OTH, OTH, OTH,   # 18 - 1F
    OTH, OTH, OTH, OTH, OTH, OTH, OTH, OTH,   # 20 - 27
    OTH, OTH, OTH, OTH, OTH, OTH, OTH, OTH,   # 28 - 2F
    OTH, OTH, OTH, OTH, OTH, OTH, OTH, OTH,   # 30 - 37
    OTH, OTH, OTH, OTH, OTH, OTH, OTH, OTH,   # 38 - 3F
    OTH, ASC, ASC, ASC, ASC, ASC, ASC, ASC,   # 40 - 47
    ASC, ASC, ASC, ASC, ASC, ASC, ASC, ASC,   # 48 - 4F
    ASC, ASC, ASC, ASC, ASC, ASC, ASC, ASC,   # 50 - 57
    ASC, ASC, ASC, OTH, OTH, OTH, OTH, OTH,   # 58 - 5F
    OTH, ASS, ASS, ASS, ASS, ASS, ASS, ASS,   # 60 - 67
    ASS, ASS, ASS, ASS, ASS, ASS, ASS, ASS,   # 68 - 6F
    ASS, ASS, ASS, ASS, ASS, ASS, ASS, ASS,   # 70 - 77
    ASS, ASS, ASS, OTH, OTH, OTH, OTH, OTH,   # 78 - 7F
    OTH, UDF, OTH, ASO, OTH, OTH, OTH, OTH,   # 80 - 87
    OTH, OTH, ACO, OTH, ACO, UDF, ACO, UDF,   # 88 - 8F
    UDF, OTH, OTH, OTH, OTH, OTH, OTH, OTH,   # 90 - 97
    OTH, OTH, ASO, OTH, ASO, UDF, ASO, ACO,   # 98 - 9F
    OTH, OTH, OTH, OTH, OTH, OTH, OTH, OTH,   # A0 - A7
    OTH, OTH, OTH, OTH, OTH, OTH, OTH, OTH,   # A8 - AF
    OTH, OTH, OTH, OTH, OTH, OTH, OTH, OTH,   # B0 - B7
    OTH, OTH, OTH, OTH, OTH, OTH, OTH, OTH,   # B8 - BF
    ACV, ACV, ACV, ACV, ACV, ACV, ACO, ACO,   # C0 - C7
    ACV, ACV, ACV, ACV, ACV, ACV, ACV, ACV,   # C8 - CF
    ACO, ACO, ACV, ACV, ACV, ACV, ACV, OTH,   # D0 - D7
    ACV, ACV, ACV, ACV, ACV, ACO, ACO, ACO,   # D8 - DF
    ASV, ASV, ASV, ASV, ASV, ASV, ASO, ASO,   # E0 - E7
    ASV, ASV, ASV, ASV, ASV, ASV, ASV, ASV,   # E8 - EF
    ASO, ASO, ASV, ASV, ASV, ASV, ASV, OTH,   # F0 - F7
    ASV, ASV, ASV, ASV, ASV, ASO, ASO, ASO,   # F8 - FF
)

# 0 : illegal
# 1 : very unlikely
# 2 : normal
# 3 : very likely
Latin1ClassModel = (
# UDF OTH ASC ASS ACV ACO ASV ASO
    0,  0,  0,  0,  0,  0,  0,  0,  # UDF
    0,  3,  3,  3,  3,  3,  3,  3,  # OTH
    0,  3,  3,  3,  3,  3,  3,  3,  # ASC
    0,  3,  3,  3,  1,  1,  3,  3,  # ASS
    0,  3,  3,  3,  1,  2,  1,  2,  # ACV
    0,  3,  3,  3,  3,  3,  3,  3,  # ACO
    0,  3,  1,  3,  1,  1,  1,  3,  # ASV
    0,  3,  1,  3,  1,  1,  3,  3,  # ASO
)
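
# Editor's note (worked example, not part of the original module): the table is
# indexed as Latin1ClassModel[prev_class * CLASS_NUM + cur_class].  For an
# ASCII capital followed by an accented small vowel (ASC -> ASV) that is
# Latin1ClassModel[2 * 8 + 6] == 3, i.e. "very likely".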


class Latin1Prober(CharSetProber):
    def __init__(self):
        super(Latin1Prober, self).__init__()
        self._last_char_class = None
        self._freq_counter = None
        self.reset()

    def reset(self):
        self._last_char_class = OTH
        self._freq_counter = [0] * FREQ_CAT_NUM
        CharSetProber.reset(self)

    @property
    def charset_name(self):
        return "ISO-8859-1"

    @property
    def language(self):
        return ""

    def feed(self, byte_str):
        byte_str = self.filter_with_english_letters(byte_str)
        for c in byte_str:
            char_class = Latin1_CharToClass[c]
            freq = Latin1ClassModel[(self._last_char_class * CLASS_NUM)
                                    + char_class]
            if freq == 0:
                self._state = ProbingState.NOT_ME
                break
            self._freq_counter[freq] += 1
            self._last_char_class = char_class

        return self.state

    def get_confidence(self):
        if self.state == ProbingState.NOT_ME:
            return 0.01

        total = sum(self._freq_counter)
        if total < 0.01:
            confidence = 0.0
        else:
            confidence = ((self._freq_counter[3] - self._freq_counter[1] * 20.0)
                          / total)
        if confidence < 0.0:
            confidence = 0.0
        # lower the confidence of latin1 so that other, more accurate
        # detectors can take priority.
        confidence = confidence * 0.73
        return confidence
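
# --- Editor's note: a minimal usage sketch, not part of the original module.
# Run it as a module (python -m) so the relative import above resolves.  Only
# the counters for frequency categories 3 ("very likely") and 1 ("very
# unlikely") enter the confidence formula, and the result is scaled by 0.73 so
# that more specific probers can win close calls.
if __name__ == '__main__':
    prober = Latin1Prober()
    prober.feed('déjà vu, naïve café'.encode('latin-1'))
    # Hand calculation of the same value:
    #   confidence = (freq_counter[3] - 20.0 * freq_counter[1]) / total * 0.73
    print(prober.charset_name, prober.get_confidence())
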
site-packages/pip/_vendor/chardet/euctwfreq.py
######################## BEGIN LICENSE BLOCK ########################
# The Original Code is Mozilla Communicator client code.
#
# The Initial Developer of the Original Code is
# Netscape Communications Corporation.
# Portions created by the Initial Developer are Copyright (C) 1998
# the Initial Developer. All Rights Reserved.
#
# Contributor(s):
#   Mark Pilgrim - port to Python
#
# This library is free software; you can redistribute it and/or
# modify it under the terms of the GNU Lesser General Public
# License as published by the Free Software Foundation; either
# version 2.1 of the License, or (at your option) any later version.
#
# This library is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
# Lesser General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public
# License along with this library; if not, write to the Free Software
# Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
# 02110-1301  USA
######################### END LICENSE BLOCK #########################

# EUCTW frequency table
# Converted from big5 work
# by Taiwan's Mandarin Promotion Council
# <http://www.edu.tw:81/mandr/>

# 128  --> 0.42261
# 256  --> 0.57851
# 512  --> 0.74851
# 1024 --> 0.89384
# 2048 --> 0.97583
#
# Ideal Distribution Ratio = 0.74851/(1-0.74851) = 2.98
# Random Distribution Ratio = 512/(5401-512) = 0.105
#
# The typical distribution ratio is about 25% of the ideal one, still much higher than RDR

EUCTW_TYPICAL_DISTRIBUTION_RATIO = 0.75
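
# Editor's note (worked arithmetic, not part of the original module): the
# constant above is roughly 25% of the ideal distribution ratio quoted in the
# comment block, i.e. 2.98 * 0.25 ~= 0.745, rounded to 0.75 -- still far above
# the random distribution ratio of ~0.105.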

# Char to FreqOrder table
EUCTW_TABLE_SIZE = 5376

EUCTW_CHAR_TO_FREQ_ORDER = (
   1,1800,1506, 255,1431, 198,   9,  82,   6,7310, 177, 202,3615,1256,2808, 110,  # 2742
3735,  33,3241, 261,  76,  44,2113,  16,2931,2184,1176, 659,3868,  26,3404,2643,  # 2758
1198,3869,3313,4060, 410,2211, 302, 590, 361,1963,   8, 204,  58,4296,7311,1931,  # 2774
  63,7312,7313, 317,1614,  75, 222, 159,4061,2412,1480,7314,3500,3068, 224,2809,  # 2790
3616,   3,  10,3870,1471,  29,2774,1135,2852,1939, 873, 130,3242,1123, 312,7315,  # 2806
4297,2051, 507, 252, 682,7316, 142,1914, 124, 206,2932,  34,3501,3173,  64, 604,  # 2822
7317,2494,1976,1977, 155,1990, 645, 641,1606,7318,3405, 337,  72, 406,7319,  80,  # 2838
 630, 238,3174,1509, 263, 939,1092,2644, 756,1440,1094,3406, 449,  69,2969, 591,  # 2854
 179,2095, 471, 115,2034,1843,  60,  50,2970, 134, 806,1868, 734,2035,3407, 180,  # 2870
 995,1607, 156, 537,2893, 688,7320, 319,1305, 779,2144, 514,2374, 298,4298, 359,  # 2886
2495,  90,2707,1338, 663,  11, 906,1099,2545,  20,2436, 182, 532,1716,7321, 732,  # 2902
1376,4062,1311,1420,3175,  25,2312,1056, 113, 399, 382,1949, 242,3408,2467, 529,  # 2918
3243, 475,1447,3617,7322, 117,  21, 656, 810,1297,2295,2329,3502,7323, 126,4063,  # 2934
 706, 456, 150, 613,4299,  71,1118,2036,4064, 145,3069,  85, 835, 486,2114,1246,  # 2950
1426, 428, 727,1285,1015, 800, 106, 623, 303,1281,7324,2127,2354, 347,3736, 221,  # 2966
3503,3110,7325,1955,1153,4065,  83, 296,1199,3070, 192, 624,  93,7326, 822,1897,  # 2982
2810,3111, 795,2064, 991,1554,1542,1592,  27,  43,2853, 859, 139,1456, 860,4300,  # 2998
 437, 712,3871, 164,2392,3112, 695, 211,3017,2096, 195,3872,1608,3504,3505,3618,  # 3014
3873, 234, 811,2971,2097,3874,2229,1441,3506,1615,2375, 668,2076,1638, 305, 228,  # 3030
1664,4301, 467, 415,7327, 262,2098,1593, 239, 108, 300, 200,1033, 512,1247,2077,  # 3046
7328,7329,2173,3176,3619,2673, 593, 845,1062,3244,  88,1723,2037,3875,1950, 212,  # 3062
 266, 152, 149, 468,1898,4066,4302,  77, 187,7330,3018,  37,   5,2972,7331,3876,  # 3078
7332,7333,  39,2517,4303,2894,3177,2078,  55, 148,  74,4304, 545, 483,1474,1029,  # 3094
1665, 217,1869,1531,3113,1104,2645,4067,  24, 172,3507, 900,3877,3508,3509,4305,  # 3110
  32,1408,2811,1312, 329, 487,2355,2247,2708, 784,2674,   4,3019,3314,1427,1788,  # 3126
 188, 109, 499,7334,3620,1717,1789, 888,1217,3020,4306,7335,3510,7336,3315,1520,  # 3142
3621,3878, 196,1034, 775,7337,7338, 929,1815, 249, 439,  38,7339,1063,7340, 794,  # 3158
3879,1435,2296,  46, 178,3245,2065,7341,2376,7342, 214,1709,4307, 804,  35, 707,  # 3174
 324,3622,1601,2546, 140, 459,4068,7343,7344,1365, 839, 272, 978,2257,2572,3409,  # 3190
2128,1363,3623,1423, 697, 100,3071,  48,  70,1231, 495,3114,2193,7345,1294,7346,  # 3206
2079, 462, 586,1042,3246, 853, 256, 988, 185,2377,3410,1698, 434,1084,7347,3411,  # 3222
 314,2615,2775,4308,2330,2331, 569,2280, 637,1816,2518, 757,1162,1878,1616,3412,  # 3238
 287,1577,2115, 768,4309,1671,2854,3511,2519,1321,3737, 909,2413,7348,4069, 933,  # 3254
3738,7349,2052,2356,1222,4310, 765,2414,1322, 786,4311,7350,1919,1462,1677,2895,  # 3270
1699,7351,4312,1424,2437,3115,3624,2590,3316,1774,1940,3413,3880,4070, 309,1369,  # 3286
1130,2812, 364,2230,1653,1299,3881,3512,3882,3883,2646, 525,1085,3021, 902,2000,  # 3302
1475, 964,4313, 421,1844,1415,1057,2281, 940,1364,3116, 376,4314,4315,1381,   7,  # 3318
2520, 983,2378, 336,1710,2675,1845, 321,3414, 559,1131,3022,2742,1808,1132,1313,  # 3334
 265,1481,1857,7352, 352,1203,2813,3247, 167,1089, 420,2814, 776, 792,1724,3513,  # 3350
4071,2438,3248,7353,4072,7354, 446, 229, 333,2743, 901,3739,1200,1557,4316,2647,  # 3366
1920, 395,2744,2676,3740,4073,1835, 125, 916,3178,2616,4317,7355,7356,3741,7357,  # 3382
7358,7359,4318,3117,3625,1133,2547,1757,3415,1510,2313,1409,3514,7360,2145, 438,  # 3398
2591,2896,2379,3317,1068, 958,3023, 461, 311,2855,2677,4074,1915,3179,4075,1978,  # 3414
 383, 750,2745,2617,4076, 274, 539, 385,1278,1442,7361,1154,1964, 384, 561, 210,  # 3430
  98,1295,2548,3515,7362,1711,2415,1482,3416,3884,2897,1257, 129,7363,3742, 642,  # 3446
 523,2776,2777,2648,7364, 141,2231,1333,  68, 176, 441, 876, 907,4077, 603,2592,  # 3462
 710, 171,3417, 404, 549,  18,3118,2393,1410,3626,1666,7365,3516,4319,2898,4320,  # 3478
7366,2973, 368,7367, 146, 366,  99, 871,3627,1543, 748, 807,1586,1185,  22,2258,  # 3494
 379,3743,3180,7368,3181, 505,1941,2618,1991,1382,2314,7369, 380,2357, 218, 702,  # 3510
1817,1248,3418,3024,3517,3318,3249,7370,2974,3628, 930,3250,3744,7371,  59,7372,  # 3526
 585, 601,4078, 497,3419,1112,1314,4321,1801,7373,1223,1472,2174,7374, 749,1836,  # 3542
 690,1899,3745,1772,3885,1476, 429,1043,1790,2232,2116, 917,4079, 447,1086,1629,  # 3558
7375, 556,7376,7377,2020,1654, 844,1090, 105, 550, 966,1758,2815,1008,1782, 686,  # 3574
1095,7378,2282, 793,1602,7379,3518,2593,4322,4080,2933,2297,4323,3746, 980,2496,  # 3590
 544, 353, 527,4324, 908,2678,2899,7380, 381,2619,1942,1348,7381,1341,1252, 560,  # 3606
3072,7382,3420,2856,7383,2053, 973, 886,2080, 143,4325,7384,7385, 157,3886, 496,  # 3622
4081,  57, 840, 540,2038,4326,4327,3421,2117,1445, 970,2259,1748,1965,2081,4082,  # 3638
3119,1234,1775,3251,2816,3629, 773,1206,2129,1066,2039,1326,3887,1738,1725,4083,  # 3654
 279,3120,  51,1544,2594, 423,1578,2130,2066, 173,4328,1879,7386,7387,1583, 264,  # 3670
 610,3630,4329,2439, 280, 154,7388,7389,7390,1739, 338,1282,3073, 693,2857,1411,  # 3686
1074,3747,2440,7391,4330,7392,7393,1240, 952,2394,7394,2900,1538,2679, 685,1483,  # 3702
4084,2468,1436, 953,4085,2054,4331, 671,2395,  79,4086,2441,3252, 608, 567,2680,  # 3718
3422,4087,4088,1691, 393,1261,1791,2396,7395,4332,7396,7397,7398,7399,1383,1672,  # 3734
3748,3182,1464, 522,1119, 661,1150, 216, 675,4333,3888,1432,3519, 609,4334,2681,  # 3750
2397,7400,7401,7402,4089,3025,   0,7403,2469, 315, 231,2442, 301,3319,4335,2380,  # 3766
7404, 233,4090,3631,1818,4336,4337,7405,  96,1776,1315,2082,7406, 257,7407,1809,  # 3782
3632,2709,1139,1819,4091,2021,1124,2163,2778,1777,2649,7408,3074, 363,1655,3183,  # 3798
7409,2975,7410,7411,7412,3889,1567,3890, 718, 103,3184, 849,1443, 341,3320,2934,  # 3814
1484,7413,1712, 127,  67, 339,4092,2398, 679,1412, 821,7414,7415, 834, 738, 351,  # 3830
2976,2146, 846, 235,1497,1880, 418,1992,3749,2710, 186,1100,2147,2746,3520,1545,  # 3846
1355,2935,2858,1377, 583,3891,4093,2573,2977,7416,1298,3633,1078,2549,3634,2358,  # 3862
  78,3750,3751, 267,1289,2099,2001,1594,4094, 348, 369,1274,2194,2175,1837,4338,  # 3878
1820,2817,3635,2747,2283,2002,4339,2936,2748, 144,3321, 882,4340,3892,2749,3423,  # 3894
4341,2901,7417,4095,1726, 320,7418,3893,3026, 788,2978,7419,2818,1773,1327,2859,  # 3910
3894,2819,7420,1306,4342,2003,1700,3752,3521,2359,2650, 787,2022, 506, 824,3636,  # 3926
 534, 323,4343,1044,3322,2023,1900, 946,3424,7421,1778,1500,1678,7422,1881,4344,  # 3942
 165, 243,4345,3637,2521, 123, 683,4096, 764,4346,  36,3895,1792, 589,2902, 816,  # 3958
 626,1667,3027,2233,1639,1555,1622,3753,3896,7423,3897,2860,1370,1228,1932, 891,  # 3974
2083,2903, 304,4097,7424, 292,2979,2711,3522, 691,2100,4098,1115,4347, 118, 662,  # 3990
7425, 611,1156, 854,2381,1316,2861,   2, 386, 515,2904,7426,7427,3253, 868,2234,  # 4006
1486, 855,2651, 785,2212,3028,7428,1040,3185,3523,7429,3121, 448,7430,1525,7431,  # 4022
2164,4348,7432,3754,7433,4099,2820,3524,3122, 503, 818,3898,3123,1568, 814, 676,  # 4038
1444, 306,1749,7434,3755,1416,1030, 197,1428, 805,2821,1501,4349,7435,7436,7437,  # 4054
1993,7438,4350,7439,7440,2195,  13,2779,3638,2980,3124,1229,1916,7441,3756,2131,  # 4070
7442,4100,4351,2399,3525,7443,2213,1511,1727,1120,7444,7445, 646,3757,2443, 307,  # 4086
7446,7447,1595,3186,7448,7449,7450,3639,1113,1356,3899,1465,2522,2523,7451, 519,  # 4102
7452, 128,2132,  92,2284,1979,7453,3900,1512, 342,3125,2196,7454,2780,2214,1980,  # 4118
3323,7455, 290,1656,1317, 789, 827,2360,7456,3758,4352, 562, 581,3901,7457, 401,  # 4134
4353,2248,  94,4354,1399,2781,7458,1463,2024,4355,3187,1943,7459, 828,1105,4101,  # 4150
1262,1394,7460,4102, 605,4356,7461,1783,2862,7462,2822, 819,2101, 578,2197,2937,  # 4166
7463,1502, 436,3254,4103,3255,2823,3902,2905,3425,3426,7464,2712,2315,7465,7466,  # 4182
2332,2067,  23,4357, 193, 826,3759,2102, 699,1630,4104,3075, 390,1793,1064,3526,  # 4198
7467,1579,3076,3077,1400,7468,4105,1838,1640,2863,7469,4358,4359, 137,4106, 598,  # 4214
3078,1966, 780, 104, 974,2938,7470, 278, 899, 253, 402, 572, 504, 493,1339,7471,  # 4230
3903,1275,4360,2574,2550,7472,3640,3029,3079,2249, 565,1334,2713, 863,  41,7473,  # 4246
7474,4361,7475,1657,2333,  19, 463,2750,4107, 606,7476,2981,3256,1087,2084,1323,  # 4262
2652,2982,7477,1631,1623,1750,4108,2682,7478,2864, 791,2714,2653,2334, 232,2416,  # 4278
7479,2983,1498,7480,2654,2620, 755,1366,3641,3257,3126,2025,1609, 119,1917,3427,  # 4294
 862,1026,4109,7481,3904,3760,4362,3905,4363,2260,1951,2470,7482,1125, 817,4110,  # 4310
4111,3906,1513,1766,2040,1487,4112,3030,3258,2824,3761,3127,7483,7484,1507,7485,  # 4326
2683, 733,  40,1632,1106,2865, 345,4113, 841,2524, 230,4364,2984,1846,3259,3428,  # 4342
7486,1263, 986,3429,7487, 735, 879, 254,1137, 857, 622,1300,1180,1388,1562,3907,  # 4358
3908,2939, 967,2751,2655,1349, 592,2133,1692,3324,2985,1994,4114,1679,3909,1901,  # 4374
2185,7488, 739,3642,2715,1296,1290,7489,4115,2198,2199,1921,1563,2595,2551,1870,  # 4390
2752,2986,7490, 435,7491, 343,1108, 596,  17,1751,4365,2235,3430,3643,7492,4366,  # 4406
 294,3527,2940,1693, 477, 979, 281,2041,3528, 643,2042,3644,2621,2782,2261,1031,  # 4422
2335,2134,2298,3529,4367, 367,1249,2552,7493,3530,7494,4368,1283,3325,2004, 240,  # 4438
1762,3326,4369,4370, 836,1069,3128, 474,7495,2148,2525, 268,3531,7496,3188,1521,  # 4454
1284,7497,1658,1546,4116,7498,3532,3533,7499,4117,3327,2684,1685,4118, 961,1673,  # 4470
2622, 190,2005,2200,3762,4371,4372,7500, 570,2497,3645,1490,7501,4373,2623,3260,  # 4486
1956,4374, 584,1514, 396,1045,1944,7502,4375,1967,2444,7503,7504,4376,3910, 619,  # 4502
7505,3129,3261, 215,2006,2783,2553,3189,4377,3190,4378, 763,4119,3763,4379,7506,  # 4518
7507,1957,1767,2941,3328,3646,1174, 452,1477,4380,3329,3130,7508,2825,1253,2382,  # 4534
2186,1091,2285,4120, 492,7509, 638,1169,1824,2135,1752,3911, 648, 926,1021,1324,  # 4550
4381, 520,4382, 997, 847,1007, 892,4383,3764,2262,1871,3647,7510,2400,1784,4384,  # 4566
1952,2942,3080,3191,1728,4121,2043,3648,4385,2007,1701,3131,1551,  30,2263,4122,  # 4582
7511,2026,4386,3534,7512, 501,7513,4123, 594,3431,2165,1821,3535,3432,3536,3192,  # 4598
 829,2826,4124,7514,1680,3132,1225,4125,7515,3262,4387,4126,3133,2336,7516,4388,  # 4614
4127,7517,3912,3913,7518,1847,2383,2596,3330,7519,4389, 374,3914, 652,4128,4129,  # 4630
 375,1140, 798,7520,7521,7522,2361,4390,2264, 546,1659, 138,3031,2445,4391,7523,  # 4646
2250, 612,1848, 910, 796,3765,1740,1371, 825,3766,3767,7524,2906,2554,7525, 692,  # 4662
 444,3032,2624, 801,4392,4130,7526,1491, 244,1053,3033,4131,4132, 340,7527,3915,  # 4678
1041,2987, 293,1168,  87,1357,7528,1539, 959,7529,2236, 721, 694,4133,3768, 219,  # 4694
1478, 644,1417,3331,2656,1413,1401,1335,1389,3916,7530,7531,2988,2362,3134,1825,  # 4710
 730,1515, 184,2827,  66,4393,7532,1660,2943, 246,3332, 378,1457, 226,3433, 975,  # 4726
3917,2944,1264,3537, 674, 696,7533, 163,7534,1141,2417,2166, 713,3538,3333,4394,  # 4742
3918,7535,7536,1186,  15,7537,1079,1070,7538,1522,3193,3539, 276,1050,2716, 758,  # 4758
1126, 653,2945,3263,7539,2337, 889,3540,3919,3081,2989, 903,1250,4395,3920,3434,  # 4774
3541,1342,1681,1718, 766,3264, 286,  89,2946,3649,7540,1713,7541,2597,3334,2990,  # 4790
7542,2947,2215,3194,2866,7543,4396,2498,2526, 181, 387,1075,3921, 731,2187,3335,  # 4806
7544,3265, 310, 313,3435,2299, 770,4134,  54,3034, 189,4397,3082,3769,3922,7545,  # 4822
1230,1617,1849, 355,3542,4135,4398,3336, 111,4136,3650,1350,3135,3436,3035,4137,  # 4838
2149,3266,3543,7546,2784,3923,3924,2991, 722,2008,7547,1071, 247,1207,2338,2471,  # 4854
1378,4399,2009, 864,1437,1214,4400, 373,3770,1142,2216, 667,4401, 442,2753,2555,  # 4870
3771,3925,1968,4138,3267,1839, 837, 170,1107, 934,1336,1882,7548,7549,2118,4139,  # 4886
2828, 743,1569,7550,4402,4140, 582,2384,1418,3437,7551,1802,7552, 357,1395,1729,  # 4902
3651,3268,2418,1564,2237,7553,3083,3772,1633,4403,1114,2085,4141,1532,7554, 482,  # 4918
2446,4404,7555,7556,1492, 833,1466,7557,2717,3544,1641,2829,7558,1526,1272,3652,  # 4934
4142,1686,1794, 416,2556,1902,1953,1803,7559,3773,2785,3774,1159,2316,7560,2867,  # 4950
4405,1610,1584,3036,2419,2754, 443,3269,1163,3136,7561,7562,3926,7563,4143,2499,  # 4966
3037,4406,3927,3137,2103,1647,3545,2010,1872,4144,7564,4145, 431,3438,7565, 250,  # 4982
  97,  81,4146,7566,1648,1850,1558, 160, 848,7567, 866, 740,1694,7568,2201,2830,  # 4998
3195,4147,4407,3653,1687, 950,2472, 426, 469,3196,3654,3655,3928,7569,7570,1188,  # 5014
 424,1995, 861,3546,4148,3775,2202,2685, 168,1235,3547,4149,7571,2086,1674,4408,  # 5030
3337,3270, 220,2557,1009,7572,3776, 670,2992, 332,1208, 717,7573,7574,3548,2447,  # 5046
3929,3338,7575, 513,7576,1209,2868,3339,3138,4409,1080,7577,7578,7579,7580,2527,  # 5062
3656,3549, 815,1587,3930,3931,7581,3550,3439,3777,1254,4410,1328,3038,1390,3932,  # 5078
1741,3933,3778,3934,7582, 236,3779,2448,3271,7583,7584,3657,3780,1273,3781,4411,  # 5094
7585, 308,7586,4412, 245,4413,1851,2473,1307,2575, 430, 715,2136,2449,7587, 270,  # 5110
 199,2869,3935,7588,3551,2718,1753, 761,1754, 725,1661,1840,4414,3440,3658,7589,  # 5126
7590, 587,  14,3272, 227,2598, 326, 480,2265, 943,2755,3552, 291, 650,1883,7591,  # 5142
1702,1226, 102,1547,  62,3441, 904,4415,3442,1164,4150,7592,7593,1224,1548,2756,  # 5158
 391, 498,1493,7594,1386,1419,7595,2055,1177,4416, 813, 880,1081,2363, 566,1145,  # 5174
4417,2286,1001,1035,2558,2599,2238, 394,1286,7596,7597,2068,7598,  86,1494,1730,  # 5190
3936, 491,1588, 745, 897,2948, 843,3340,3937,2757,2870,3273,1768, 998,2217,2069,  # 5206
 397,1826,1195,1969,3659,2993,3341, 284,7599,3782,2500,2137,2119,1903,7600,3938,  # 5222
2150,3939,4151,1036,3443,1904, 114,2559,4152, 209,1527,7601,7602,2949,2831,2625,  # 5238
2385,2719,3139, 812,2560,7603,3274,7604,1559, 737,1884,3660,1210, 885,  28,2686,  # 5254
3553,3783,7605,4153,1004,1779,4418,7606, 346,1981,2218,2687,4419,3784,1742, 797,  # 5270
1642,3940,1933,1072,1384,2151, 896,3941,3275,3661,3197,2871,3554,7607,2561,1958,  # 5286
4420,2450,1785,7608,7609,7610,3942,4154,1005,1308,3662,4155,2720,4421,4422,1528,  # 5302
2600, 161,1178,4156,1982, 987,4423,1101,4157, 631,3943,1157,3198,2420,1343,1241,  # 5318
1016,2239,2562, 372, 877,2339,2501,1160, 555,1934, 911,3944,7611, 466,1170, 169,  # 5334
1051,2907,2688,3663,2474,2994,1182,2011,2563,1251,2626,7612, 992,2340,3444,1540,  # 5350
2721,1201,2070,2401,1996,2475,7613,4424, 528,1922,2188,1503,1873,1570,2364,3342,  # 5366
3276,7614, 557,1073,7615,1827,3445,2087,2266,3140,3039,3084, 767,3085,2786,4425,  # 5382
1006,4158,4426,2341,1267,2176,3664,3199, 778,3945,3200,2722,1597,2657,7616,4427,  # 5398
7617,3446,7618,7619,7620,3277,2689,1433,3278, 131,  95,1504,3946, 723,4159,3141,  # 5414
1841,3555,2758,2189,3947,2027,2104,3665,7621,2995,3948,1218,7622,3343,3201,3949,  # 5430
4160,2576, 248,1634,3785, 912,7623,2832,3666,3040,3786, 654,  53,7624,2996,7625,  # 5446
1688,4428, 777,3447,1032,3950,1425,7626, 191, 820,2120,2833, 971,4429, 931,3202,  # 5462
 135, 664, 783,3787,1997, 772,2908,1935,3951,3788,4430,2909,3203, 282,2723, 640,  # 5478
1372,3448,1127, 922, 325,3344,7627,7628, 711,2044,7629,7630,3952,2219,2787,1936,  # 5494
3953,3345,2220,2251,3789,2300,7631,4431,3790,1258,3279,3954,3204,2138,2950,3955,  # 5510
3956,7632,2221, 258,3205,4432, 101,1227,7633,3280,1755,7634,1391,3281,7635,2910,  # 5526
2056, 893,7636,7637,7638,1402,4161,2342,7639,7640,3206,3556,7641,7642, 878,1325,  # 5542
1780,2788,4433, 259,1385,2577, 744,1183,2267,4434,7643,3957,2502,7644, 684,1024,  # 5558
4162,7645, 472,3557,3449,1165,3282,3958,3959, 322,2152, 881, 455,1695,1152,1340,  # 5574
 660, 554,2153,4435,1058,4436,4163, 830,1065,3346,3960,4437,1923,7646,1703,1918,  # 5590
7647, 932,2268, 122,7648,4438, 947, 677,7649,3791,2627, 297,1905,1924,2269,4439,  # 5606
2317,3283,7650,7651,4164,7652,4165,  84,4166, 112, 989,7653, 547,1059,3961, 701,  # 5622
3558,1019,7654,4167,7655,3450, 942, 639, 457,2301,2451, 993,2951, 407, 851, 494,  # 5638
4440,3347, 927,7656,1237,7657,2421,3348, 573,4168, 680, 921,2911,1279,1874, 285,  # 5654
 790,1448,1983, 719,2167,7658,7659,4441,3962,3963,1649,7660,1541, 563,7661,1077,  # 5670
7662,3349,3041,3451, 511,2997,3964,3965,3667,3966,1268,2564,3350,3207,4442,4443,  # 5686
7663, 535,1048,1276,1189,2912,2028,3142,1438,1373,2834,2952,1134,2012,7664,4169,  # 5702
1238,2578,3086,1259,7665, 700,7666,2953,3143,3668,4170,7667,4171,1146,1875,1906,  # 5718
4444,2601,3967, 781,2422, 132,1589, 203, 147, 273,2789,2402, 898,1786,2154,3968,  # 5734
3969,7668,3792,2790,7669,7670,4445,4446,7671,3208,7672,1635,3793, 965,7673,1804,  # 5750
2690,1516,3559,1121,1082,1329,3284,3970,1449,3794,  65,1128,2835,2913,2759,1590,  # 5766
3795,7674,7675,  12,2658,  45, 976,2579,3144,4447, 517,2528,1013,1037,3209,7676,  # 5782
3796,2836,7677,3797,7678,3452,7679,2602, 614,1998,2318,3798,3087,2724,2628,7680,  # 5798
2580,4172, 599,1269,7681,1810,3669,7682,2691,3088, 759,1060, 489,1805,3351,3285,  # 5814
1358,7683,7684,2386,1387,1215,2629,2252, 490,7685,7686,4173,1759,2387,2343,7687,  # 5830
4448,3799,1907,3971,2630,1806,3210,4449,3453,3286,2760,2344, 874,7688,7689,3454,  # 5846
3670,1858,  91,2914,3671,3042,3800,4450,7690,3145,3972,2659,7691,3455,1202,1403,  # 5862
3801,2954,2529,1517,2503,4451,3456,2504,7692,4452,7693,2692,1885,1495,1731,3973,  # 5878
2365,4453,7694,2029,7695,7696,3974,2693,1216, 237,2581,4174,2319,3975,3802,4454,  # 5894
4455,2694,3560,3457, 445,4456,7697,7698,7699,7700,2761,  61,3976,3672,1822,3977,  # 5910
7701, 687,2045, 935, 925, 405,2660, 703,1096,1859,2725,4457,3978,1876,1367,2695,  # 5926
3352, 918,2105,1781,2476, 334,3287,1611,1093,4458, 564,3146,3458,3673,3353, 945,  # 5942
2631,2057,4459,7702,1925, 872,4175,7703,3459,2696,3089, 349,4176,3674,3979,4460,  # 5958
3803,4177,3675,2155,3980,4461,4462,4178,4463,2403,2046, 782,3981, 400, 251,4179,  # 5974
1624,7704,7705, 277,3676, 299,1265, 476,1191,3804,2121,4180,4181,1109, 205,7706,  # 5990
2582,1000,2156,3561,1860,7707,7708,7709,4464,7710,4465,2565, 107,2477,2157,3982,  # 6006
3460,3147,7711,1533, 541,1301, 158, 753,4182,2872,3562,7712,1696, 370,1088,4183,  # 6022
4466,3563, 579, 327, 440, 162,2240, 269,1937,1374,3461, 968,3043,  56,1396,3090,  # 6038
2106,3288,3354,7713,1926,2158,4467,2998,7714,3564,7715,7716,3677,4468,2478,7717,  # 6054
2791,7718,1650,4469,7719,2603,7720,7721,3983,2661,3355,1149,3356,3984,3805,3985,  # 6070
7722,1076,  49,7723, 951,3211,3289,3290, 450,2837, 920,7724,1811,2792,2366,4184,  # 6086
1908,1138,2367,3806,3462,7725,3212,4470,1909,1147,1518,2423,4471,3807,7726,4472,  # 6102
2388,2604, 260,1795,3213,7727,7728,3808,3291, 708,7729,3565,1704,7730,3566,1351,  # 6118
1618,3357,2999,1886, 944,4185,3358,4186,3044,3359,4187,7731,3678, 422, 413,1714,  # 6134
3292, 500,2058,2345,4188,2479,7732,1344,1910, 954,7733,1668,7734,7735,3986,2404,  # 6150
4189,3567,3809,4190,7736,2302,1318,2505,3091, 133,3092,2873,4473, 629,  31,2838,  # 6166
2697,3810,4474, 850, 949,4475,3987,2955,1732,2088,4191,1496,1852,7737,3988, 620,  # 6182
3214, 981,1242,3679,3360,1619,3680,1643,3293,2139,2452,1970,1719,3463,2168,7738,  # 6198
3215,7739,7740,3361,1828,7741,1277,4476,1565,2047,7742,1636,3568,3093,7743, 869,  # 6214
2839, 655,3811,3812,3094,3989,3000,3813,1310,3569,4477,7744,7745,7746,1733, 558,  # 6230
4478,3681, 335,1549,3045,1756,4192,3682,1945,3464,1829,1291,1192, 470,2726,2107,  # 6246
2793, 913,1054,3990,7747,1027,7748,3046,3991,4479, 982,2662,3362,3148,3465,3216,  # 6262
3217,1946,2794,7749, 571,4480,7750,1830,7751,3570,2583,1523,2424,7752,2089, 984,  # 6278
4481,3683,1959,7753,3684, 852, 923,2795,3466,3685, 969,1519, 999,2048,2320,1705,  # 6294
7754,3095, 615,1662, 151, 597,3992,2405,2321,1049, 275,4482,3686,4193, 568,3687,  # 6310
3571,2480,4194,3688,7755,2425,2270, 409,3218,7756,1566,2874,3467,1002, 769,2840,  # 6326
 194,2090,3149,3689,2222,3294,4195, 628,1505,7757,7758,1763,2177,3001,3993, 521,  # 6342
1161,2584,1787,2203,2406,4483,3994,1625,4196,4197, 412,  42,3096, 464,7759,2632,  # 6358
4484,3363,1760,1571,2875,3468,2530,1219,2204,3814,2633,2140,2368,4485,4486,3295,  # 6374
1651,3364,3572,7760,7761,3573,2481,3469,7762,3690,7763,7764,2271,2091, 460,7765,  # 6390
4487,7766,3002, 962, 588,3574, 289,3219,2634,1116,  52,7767,3047,1796,7768,7769,  # 6406
7770,1467,7771,1598,1143,3691,4198,1984,1734,1067,4488,1280,3365, 465,4489,1572,  # 6422
 510,7772,1927,2241,1812,1644,3575,7773,4490,3692,7774,7775,2663,1573,1534,7776,  # 6438
7777,4199, 536,1807,1761,3470,3815,3150,2635,7778,7779,7780,4491,3471,2915,1911,  # 6454
2796,7781,3296,1122, 377,3220,7782, 360,7783,7784,4200,1529, 551,7785,2059,3693,  # 6470
1769,2426,7786,2916,4201,3297,3097,2322,2108,2030,4492,1404, 136,1468,1479, 672,  # 6486
1171,3221,2303, 271,3151,7787,2762,7788,2049, 678,2727, 865,1947,4493,7789,2013,  # 6502
3995,2956,7790,2728,2223,1397,3048,3694,4494,4495,1735,2917,3366,3576,7791,3816,  # 6518
 509,2841,2453,2876,3817,7792,7793,3152,3153,4496,4202,2531,4497,2304,1166,1010,  # 6534
 552, 681,1887,7794,7795,2957,2958,3996,1287,1596,1861,3154, 358, 453, 736, 175,  # 6550
 478,1117, 905,1167,1097,7796,1853,1530,7797,1706,7798,2178,3472,2287,3695,3473,  # 6566
3577,4203,2092,4204,7799,3367,1193,2482,4205,1458,2190,2205,1862,1888,1421,3298,  # 6582
2918,3049,2179,3474, 595,2122,7800,3997,7801,7802,4206,1707,2636, 223,3696,1359,  # 6598
 751,3098, 183,3475,7803,2797,3003, 419,2369, 633, 704,3818,2389, 241,7804,7805,  # 6614
7806, 838,3004,3697,2272,2763,2454,3819,1938,2050,3998,1309,3099,2242,1181,7807,  # 6630
1136,2206,3820,2370,1446,4207,2305,4498,7808,7809,4208,1055,2605, 484,3698,7810,  # 6646
3999, 625,4209,2273,3368,1499,4210,4000,7811,4001,4211,3222,2274,2275,3476,7812,  # 6662
7813,2764, 808,2606,3699,3369,4002,4212,3100,2532, 526,3370,3821,4213, 955,7814,  # 6678
1620,4214,2637,2427,7815,1429,3700,1669,1831, 994, 928,7816,3578,1260,7817,7818,  # 6694
7819,1948,2288, 741,2919,1626,4215,2729,2455, 867,1184, 362,3371,1392,7820,7821,  # 6710
4003,4216,1770,1736,3223,2920,4499,4500,1928,2698,1459,1158,7822,3050,3372,2877,  # 6726
1292,1929,2506,2842,3701,1985,1187,2071,2014,2607,4217,7823,2566,2507,2169,3702,  # 6742
2483,3299,7824,3703,4501,7825,7826, 666,1003,3005,1022,3579,4218,7827,4502,1813,  # 6758
2253, 574,3822,1603, 295,1535, 705,3823,4219, 283, 858, 417,7828,7829,3224,4503,  # 6774
4504,3051,1220,1889,1046,2276,2456,4004,1393,1599, 689,2567, 388,4220,7830,2484,  # 6790
 802,7831,2798,3824,2060,1405,2254,7832,4505,3825,2109,1052,1345,3225,1585,7833,  # 6806
 809,7834,7835,7836, 575,2730,3477, 956,1552,1469,1144,2323,7837,2324,1560,2457,  # 6822
3580,3226,4005, 616,2207,3155,2180,2289,7838,1832,7839,3478,4506,7840,1319,3704,  # 6838
3705,1211,3581,1023,3227,1293,2799,7841,7842,7843,3826, 607,2306,3827, 762,2878,  # 6854
1439,4221,1360,7844,1485,3052,7845,4507,1038,4222,1450,2061,2638,4223,1379,4508,  # 6870
2585,7846,7847,4224,1352,1414,2325,2921,1172,7848,7849,3828,3829,7850,1797,1451,  # 6886
7851,7852,7853,7854,2922,4006,4007,2485,2346, 411,4008,4009,3582,3300,3101,4509,  # 6902
1561,2664,1452,4010,1375,7855,7856,  47,2959, 316,7857,1406,1591,2923,3156,7858,  # 6918
1025,2141,3102,3157, 354,2731, 884,2224,4225,2407, 508,3706, 726,3583, 996,2428,  # 6934
3584, 729,7859, 392,2191,1453,4011,4510,3707,7860,7861,2458,3585,2608,1675,2800,  # 6950
 919,2347,2960,2348,1270,4511,4012,  73,7862,7863, 647,7864,3228,2843,2255,1550,  # 6966
1346,3006,7865,1332, 883,3479,7866,7867,7868,7869,3301,2765,7870,1212, 831,1347,  # 6982
4226,4512,2326,3830,1863,3053, 720,3831,4513,4514,3832,7871,4227,7872,7873,4515,  # 6998
7874,7875,1798,4516,3708,2609,4517,3586,1645,2371,7876,7877,2924, 669,2208,2665,  # 7014
2429,7878,2879,7879,7880,1028,3229,7881,4228,2408,7882,2256,1353,7883,7884,4518,  # 7030
3158, 518,7885,4013,7886,4229,1960,7887,2142,4230,7888,7889,3007,2349,2350,3833,  # 7046
 516,1833,1454,4014,2699,4231,4519,2225,2610,1971,1129,3587,7890,2766,7891,2961,  # 7062
1422, 577,1470,3008,1524,3373,7892,7893, 432,4232,3054,3480,7894,2586,1455,2508,  # 7078
2226,1972,1175,7895,1020,2732,4015,3481,4520,7896,2733,7897,1743,1361,3055,3482,  # 7094
2639,4016,4233,4521,2290, 895, 924,4234,2170, 331,2243,3056, 166,1627,3057,1098,  # 7110
7898,1232,2880,2227,3374,4522, 657, 403,1196,2372, 542,3709,3375,1600,4235,3483,  # 7126
7899,4523,2767,3230, 576, 530,1362,7900,4524,2533,2666,3710,4017,7901, 842,3834,  # 7142
7902,2801,2031,1014,4018, 213,2700,3376, 665, 621,4236,7903,3711,2925,2430,7904,  # 7158
2431,3302,3588,3377,7905,4237,2534,4238,4525,3589,1682,4239,3484,1380,7906, 724,  # 7174
2277, 600,1670,7907,1337,1233,4526,3103,2244,7908,1621,4527,7909, 651,4240,7910,  # 7190
1612,4241,2611,7911,2844,7912,2734,2307,3058,7913, 716,2459,3059, 174,1255,2701,  # 7206
4019,3590, 548,1320,1398, 728,4020,1574,7914,1890,1197,3060,4021,7915,3061,3062,  # 7222
3712,3591,3713, 747,7916, 635,4242,4528,7917,7918,7919,4243,7920,7921,4529,7922,  # 7238
3378,4530,2432, 451,7923,3714,2535,2072,4244,2735,4245,4022,7924,1764,4531,7925,  # 7254
4246, 350,7926,2278,2390,2486,7927,4247,4023,2245,1434,4024, 488,4532, 458,4248,  # 7270
4025,3715, 771,1330,2391,3835,2568,3159,2159,2409,1553,2667,3160,4249,7928,2487,  # 7286
2881,2612,1720,2702,4250,3379,4533,7929,2536,4251,7930,3231,4252,2768,7931,2015,  # 7302
2736,7932,1155,1017,3716,3836,7933,3303,2308, 201,1864,4253,1430,7934,4026,7935,  # 7318
7936,7937,7938,7939,4254,1604,7940, 414,1865, 371,2587,4534,4535,3485,2016,3104,  # 7334
4536,1708, 960,4255, 887, 389,2171,1536,1663,1721,7941,2228,4027,2351,2926,1580,  # 7350
7942,7943,7944,1744,7945,2537,4537,4538,7946,4539,7947,2073,7948,7949,3592,3380,  # 7366
2882,4256,7950,4257,2640,3381,2802, 673,2703,2460, 709,3486,4028,3593,4258,7951,  # 7382
1148, 502, 634,7952,7953,1204,4540,3594,1575,4541,2613,3717,7954,3718,3105, 948,  # 7398
3232, 121,1745,3837,1110,7955,4259,3063,2509,3009,4029,3719,1151,1771,3838,1488,  # 7414
4030,1986,7956,2433,3487,7957,7958,2093,7959,4260,3839,1213,1407,2803, 531,2737,  # 7430
2538,3233,1011,1537,7960,2769,4261,3106,1061,7961,3720,3721,1866,2883,7962,2017,  # 7446
 120,4262,4263,2062,3595,3234,2309,3840,2668,3382,1954,4542,7963,7964,3488,1047,  # 7462
2704,1266,7965,1368,4543,2845, 649,3383,3841,2539,2738,1102,2846,2669,7966,7967,  # 7478
1999,7968,1111,3596,2962,7969,2488,3842,3597,2804,1854,3384,3722,7970,7971,3385,  # 7494
2410,2884,3304,3235,3598,7972,2569,7973,3599,2805,4031,1460, 856,7974,3600,7975,  # 7510
2885,2963,7976,2886,3843,7977,4264, 632,2510, 875,3844,1697,3845,2291,7978,7979,  # 7526
4544,3010,1239, 580,4545,4265,7980, 914, 936,2074,1190,4032,1039,2123,7981,7982,  # 7542
7983,3386,1473,7984,1354,4266,3846,7985,2172,3064,4033, 915,3305,4267,4268,3306,  # 7558
1605,1834,7986,2739, 398,3601,4269,3847,4034, 328,1912,2847,4035,3848,1331,4270,  # 7574
3011, 937,4271,7987,3602,4036,4037,3387,2160,4546,3388, 524, 742, 538,3065,1012,  # 7590
7988,7989,3849,2461,7990, 658,1103, 225,3850,7991,7992,4547,7993,4548,7994,3236,  # 7606
1243,7995,4038, 963,2246,4549,7996,2705,3603,3161,7997,7998,2588,2327,7999,4550,  # 7622
8000,8001,8002,3489,3307, 957,3389,2540,2032,1930,2927,2462, 870,2018,3604,1746,  # 7638
2770,2771,2434,2463,8003,3851,8004,3723,3107,3724,3490,3390,3725,8005,1179,3066,  # 7654
8006,3162,2373,4272,3726,2541,3163,3108,2740,4039,8007,3391,1556,2542,2292, 977,  # 7670
2887,2033,4040,1205,3392,8008,1765,3393,3164,2124,1271,1689, 714,4551,3491,8009,  # 7686
2328,3852, 533,4273,3605,2181, 617,8010,2464,3308,3492,2310,8011,8012,3165,8013,  # 7702
8014,3853,1987, 618, 427,2641,3493,3394,8015,8016,1244,1690,8017,2806,4274,4552,  # 7718
8018,3494,8019,8020,2279,1576, 473,3606,4275,3395, 972,8021,3607,8022,3067,8023,  # 7734
8024,4553,4554,8025,3727,4041,4042,8026, 153,4555, 356,8027,1891,2888,4276,2143,  # 7750
 408, 803,2352,8028,3854,8029,4277,1646,2570,2511,4556,4557,3855,8030,3856,4278,  # 7766
8031,2411,3396, 752,8032,8033,1961,2964,8034, 746,3012,2465,8035,4279,3728, 698,  # 7782
4558,1892,4280,3608,2543,4559,3609,3857,8036,3166,3397,8037,1823,1302,4043,2706,  # 7798
3858,1973,4281,8038,4282,3167, 823,1303,1288,1236,2848,3495,4044,3398, 774,3859,  # 7814
8039,1581,4560,1304,2849,3860,4561,8040,2435,2161,1083,3237,4283,4045,4284, 344,  # 7830
1173, 288,2311, 454,1683,8041,8042,1461,4562,4046,2589,8043,8044,4563, 985, 894,  # 7846
8045,3399,3168,8046,1913,2928,3729,1988,8047,2110,1974,8048,4047,8049,2571,1194,  # 7862
 425,8050,4564,3169,1245,3730,4285,8051,8052,2850,8053, 636,4565,1855,3861, 760,  # 7878
1799,8054,4286,2209,1508,4566,4048,1893,1684,2293,8055,8056,8057,4287,4288,2210,  # 7894
 479,8058,8059, 832,8060,4049,2489,8061,2965,2490,3731, 990,3109, 627,1814,2642,  # 7910
4289,1582,4290,2125,2111,3496,4567,8062, 799,4291,3170,8063,4568,2112,1737,3013,  # 7926
1018, 543, 754,4292,3309,1676,4569,4570,4050,8064,1489,8065,3497,8066,2614,2889,  # 7942
4051,8067,8068,2966,8069,8070,8071,8072,3171,4571,4572,2182,1722,8073,3238,3239,  # 7958
1842,3610,1715, 481, 365,1975,1856,8074,8075,1962,2491,4573,8076,2126,3611,3240,  # 7974
 433,1894,2063,2075,8077, 602,2741,8078,8079,8080,8081,8082,3014,1628,3400,8083,  # 7990
3172,4574,4052,2890,4575,2512,8084,2544,2772,8085,8086,8087,3310,4576,2891,8088,  # 8006
4577,8089,2851,4578,4579,1221,2967,4053,2513,8090,8091,8092,1867,1989,8093,8094,  # 8022
8095,1895,8096,8097,4580,1896,4054, 318,8098,2094,4055,4293,8099,8100, 485,8101,  # 8038
 938,3862, 553,2670, 116,8102,3863,3612,8103,3498,2671,2773,3401,3311,2807,8104,  # 8054
3613,2929,4056,1747,2930,2968,8105,8106, 207,8107,8108,2672,4581,2514,8109,3015,  # 8070
 890,3614,3864,8110,1877,3732,3402,8111,2183,2353,3403,1652,8112,8113,8114, 941,  # 8086
2294, 208,3499,4057,2019, 330,4294,3865,2892,2492,3733,4295,8115,8116,8117,8118,  # 8102
)

site-packages/pip/_vendor/chardet/eucjpprober.py
######################## BEGIN LICENSE BLOCK ########################
# The Original Code is mozilla.org code.
#
# The Initial Developer of the Original Code is
# Netscape Communications Corporation.
# Portions created by the Initial Developer are Copyright (C) 1998
# the Initial Developer. All Rights Reserved.
#
# Contributor(s):
#   Mark Pilgrim - port to Python
#
# This library is free software; you can redistribute it and/or
# modify it under the terms of the GNU Lesser General Public
# License as published by the Free Software Foundation; either
# version 2.1 of the License, or (at your option) any later version.
#
# This library is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
# Lesser General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public
# License along with this library; if not, write to the Free Software
# Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
# 02110-1301  USA
######################### END LICENSE BLOCK #########################

from .enums import ProbingState, MachineState
from .mbcharsetprober import MultiByteCharSetProber
from .codingstatemachine import CodingStateMachine
from .chardistribution import EUCJPDistributionAnalysis
from .jpcntx import EUCJPContextAnalysis
from .mbcssm import EUCJP_SM_MODEL


class EUCJPProber(MultiByteCharSetProber):
    def __init__(self):
        super(EUCJPProber, self).__init__()
        self.coding_sm = CodingStateMachine(EUCJP_SM_MODEL)
        self.distribution_analyzer = EUCJPDistributionAnalysis()
        self.context_analyzer = EUCJPContextAnalysis()
        self.reset()

    def reset(self):
        super(EUCJPProber, self).reset()
        self.context_analyzer.reset()

    @property
    def charset_name(self):
        return "EUC-JP"

    @property
    def language(self):
        return "Japanese"

    def feed(self, byte_str):
        for i in range(len(byte_str)):
            # PY3K: byte_str is a byte array, so byte_str[i] is an int, not a byte
            coding_state = self.coding_sm.next_state(byte_str[i])
            if coding_state == MachineState.ERROR:
                self.logger.debug('%s %s prober hit error at byte %s',
                                  self.charset_name, self.language, i)
                self._state = ProbingState.NOT_ME
                break
            elif coding_state == MachineState.ITS_ME:
                self._state = ProbingState.FOUND_IT
                break
            elif coding_state == MachineState.START:
                char_len = self.coding_sm.get_current_charlen()
                if i == 0:
                    self._last_char[1] = byte_str[0]
                    self.context_analyzer.feed(self._last_char, char_len)
                    self.distribution_analyzer.feed(self._last_char, char_len)
                else:
                    self.context_analyzer.feed(byte_str[i - 1:i + 1],
                                                char_len)
                    self.distribution_analyzer.feed(byte_str[i - 1:i + 1],
                                                     char_len)

        self._last_char[0] = byte_str[-1]

        if self.state == ProbingState.DETECTING:
            if (self.context_analyzer.got_enough_data() and
               (self.get_confidence() > self.SHORTCUT_THRESHOLD)):
                self._state = ProbingState.FOUND_IT

        return self.state

    def get_confidence(self):
        context_conf = self.context_analyzer.get_confidence()
        distrib_conf = self.distribution_analyzer.get_confidence()
        return max(context_conf, distrib_conf)
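
# --- Editor's note: a minimal usage sketch, not part of the original module.
# Run it as a module (python -m) so the relative imports above resolve.  The
# state machine validates the EUC-JP byte sequences while the distribution and
# context analyzers score them; get_confidence() reports the larger of the two.
if __name__ == '__main__':
    prober = EUCJPProber()
    prober.feed('これは日本語の文章です。'.encode('euc-jp'))
    print(prober.charset_name, prober.state, prober.get_confidence())
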
site-packages/pip/_vendor/chardet/hebrewprober.py
######################## BEGIN LICENSE BLOCK ########################
# The Original Code is Mozilla Universal charset detector code.
#
# The Initial Developer of the Original Code is
#          Shy Shalom
# Portions created by the Initial Developer are Copyright (C) 2005
# the Initial Developer. All Rights Reserved.
#
# Contributor(s):
#   Mark Pilgrim - port to Python
#
# This library is free software; you can redistribute it and/or
# modify it under the terms of the GNU Lesser General Public
# License as published by the Free Software Foundation; either
# version 2.1 of the License, or (at your option) any later version.
#
# This library is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
# Lesser General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public
# License along with this library; if not, write to the Free Software
# Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
# 02110-1301  USA
######################### END LICENSE BLOCK #########################

from .charsetprober import CharSetProber
from .enums import ProbingState

# This prober doesn't actually recognize a language or a charset.
# It is a helper prober for the use of the Hebrew model probers

### General ideas of the Hebrew charset recognition ###
#
# Four main charsets exist in Hebrew:
# "ISO-8859-8" - Visual Hebrew
# "windows-1255" - Logical Hebrew
# "ISO-8859-8-I" - Logical Hebrew
# "x-mac-hebrew" - ?? Logical Hebrew ??
#
# Both "ISO" charsets use a completely identical set of code points, whereas
# "windows-1255" and "x-mac-hebrew" are two different proper supersets of
# these code points. windows-1255 defines additional characters in the range
# 0x80-0x9F as some misc punctuation marks as well as some Hebrew-specific
# diacritics and additional 'Yiddish' ligature letters in the range 0xc0-0xd6.
# x-mac-hebrew defines similar additional code points but with a different
# mapping.
#
# As far as an average Hebrew text with no diacritics is concerned, all four
# charsets are identical with respect to code points. Meaning that for the
# main Hebrew alphabet, all four map the same values to all 27 Hebrew letters
# (including final letters).
#
# The dominant difference between these charsets is their directionality.
# "Visual" directionality means that the text is ordered as if the renderer is
# not aware of a BIDI rendering algorithm. The renderer sees the text and
# draws it from left to right. The text itself when ordered naturally is read
# backwards. A buffer of Visual Hebrew generally looks like so:
# "[last word of first line spelled backwards] [whole line ordered backwards
# and spelled backwards] [first word of first line spelled backwards]
# [end of line] [last word of second line] ... etc' "
# adding punctuation marks, numbers and English text to visual text is
# naturally also "visual" and from left to right.
#
# "Logical" directionality means the text is ordered "naturally" according to
# the order it is read. It is the responsibility of the renderer to display
# the text from right to left. A BIDI algorithm is used to place general
# punctuation marks, numbers and English text in the text.
#
# Texts in x-mac-hebrew are almost impossible to find on the Internet. From
# what little evidence I could find, it seems that its general directionality
# is Logical.
#
# To sum up all of the above, the Hebrew probing mechanism knows about two
# charsets:
# Visual Hebrew - "ISO-8859-8" - backwards text - Words and sentences are
#    backwards while line order is natural. For charset recognition purposes
#    the line order is unimportant (In fact, for this implementation, even
#    word order is unimportant).
# Logical Hebrew - "windows-1255" - normal, naturally ordered text.
#
# "ISO-8859-8-I" is a subset of windows-1255 and doesn't need to be
#    specifically identified.
# "x-mac-hebrew" is also identified as windows-1255. A text in x-mac-hebrew
#    that contains special punctuation marks or diacritics is displayed with
#    some unconverted characters showing as question marks. This problem might
#    be corrected using another model prober for x-mac-hebrew. Due to the fact
#    that x-mac-hebrew texts are so rare, writing another model prober isn't
#    worth the effort and performance hit.
#
#### The Prober ####
#
# The prober is divided between two SBCharSetProbers and a HebrewProber,
# all of which are managed, created, fed data, inquired and deleted by the
# SBCSGroupProber. The two SBCharSetProbers identify that the text is in
# fact some kind of Hebrew, Logical or Visual. The final decision about which
# one is it is made by the HebrewProber by combining final-letter scores
# with the scores of the two SBCharSetProbers to produce a final answer.
#
# The SBCSGroupProber is responsible for stripping the original text of HTML
# tags, English characters, numbers, low-ASCII punctuation characters, spaces
# and new lines. It reduces any sequence of such characters to a single space.
# The buffer fed to each prober in the SBCS group prober is pure text in
# high-ASCII.
# The two SBCharSetProbers (model probers) share the same language model:
# Win1255Model.
# The first SBCharSetProber uses the model normally as any other
# SBCharSetProber does, to recognize windows-1255, upon which this model was
# built. The second SBCharSetProber is told to make the pair-of-letter
# lookup in the language model backwards. This in practice exactly simulates
# a visual Hebrew model using the windows-1255 logical Hebrew model.
#
# The HebrewProber is not using any language model. All it does is look for
# final-letter evidence suggesting the text is either logical Hebrew or visual
# Hebrew. Disjointed from the model probers, the results of the HebrewProber
# alone are meaningless. HebrewProber always returns 0.00 as confidence
# since it never identifies a charset by itself. Instead, the pointer to the
# HebrewProber is passed to the model probers as a helper "Name Prober".
# When the Group prober receives a positive identification from any prober,
# it asks for the name of the charset identified. If the prober queried is a
# Hebrew model prober, the model prober forwards the call to the
# HebrewProber to make the final decision. In the HebrewProber, the
# decision is made according to the final-letter scores maintained and both
# model probers' scores. The answer is returned in the form of the name of the
# charset identified, either "windows-1255" or "ISO-8859-8".
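
# --- Editor's note: an illustrative sketch, not the library's code ----------
# A hedged paraphrase of the decision described above: final-letter evidence is
# consulted first, and the two model probers' scores only break near-ties.  The
# constants 5 and 0.01 mirror MIN_FINAL_CHAR_DISTANCE and MIN_MODEL_DISTANCE
# defined on the class below; the last tie-break is an assumption.
def _logical_vs_visual_sketch(final_logical, final_visual,
                              logical_conf, visual_conf):
    # Strong final-letter evidence decides on its own.
    if final_logical - final_visual >= 5:
        return "windows-1255"        # logical Hebrew
    if final_visual - final_logical >= 5:
        return "ISO-8859-8"          # visual Hebrew
    # Otherwise lean on the model probers' confidence difference.
    if logical_conf - visual_conf > 0.01:
        return "windows-1255"
    if visual_conf - logical_conf > 0.01:
        return "ISO-8859-8"
    # Still tied: assume logical Hebrew, by far the more common encoding.
    return "windows-1255"
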

class HebrewProber(CharSetProber):
    # windows-1255 / ISO-8859-8 code points of interest
    FINAL_KAF = 0xea
    NORMAL_KAF = 0xeb
    FINAL_MEM = 0xed
    NORMAL_MEM = 0xee
    FINAL_NUN = 0xef
    NORMAL_NUN = 0xf0
    FINAL_PE = 0xf3
    NORMAL_PE = 0xf4
    FINAL_TSADI = 0xf5
    NORMAL_TSADI = 0xf6

    # Minimum Visual vs Logical final letter score difference.
    # If the difference is below this, don't rely solely on the final letter score
    # distance.
    MIN_FINAL_CHAR_DISTANCE = 5

    # Minimum Visual vs Logical model score difference.
    # If the difference is below this, don't rely at all on the model score
    # distance.
    MIN_MODEL_DISTANCE = 0.01

    VISUAL_HEBREW_NAME = "ISO-8859-8"
    LOGICAL_HEBREW_NAME = "windows-1255"

    def __init__(self):
        super(HebrewProber, self).__init__()
        self._final_char_logical_score = None
        self._final_char_visual_score = None
        self._prev = None
        self._before_prev = None
        self._logical_prober = None
        self._visual_prober = None
        self.reset()

    def reset(self):
        self._final_char_logical_score = 0
        self._final_char_visual_score = 0
        # The last two characters seen in the previous buffer,
        # self._prev and self._before_prev, are initialized to a space in
        # order to simulate a word delimiter at the beginning of the data.
        self._prev = ' '
        self._before_prev = ' '
        # These probers are owned by the group prober.

    def set_model_probers(self, logicalProber, visualProber):
        self._logical_prober = logicalProber
        self._visual_prober = visualProber

    def is_final(self, c):
        return c in [self.FINAL_KAF, self.FINAL_MEM, self.FINAL_NUN,
                     self.FINAL_PE, self.FINAL_TSADI]

    def is_non_final(self, c):
        # The normal Tsadi is not a good Non-Final letter due to words like
        # 'lechotet' (to chat) containing an apostrophe after the tsadi. This
        # apostrophe is converted to a space in FilterWithoutEnglishLetters
        # causing the Non-Final tsadi to appear at an end of a word even
        # though this is not the case in the original text.
        # The letters Pe and Kaf can, in rare cases, show a similar problem of
        # not being good Non-Final letters: words like 'Pop', 'Winamp' and
        # 'Mubarak', for example, legitimately end with a Non-Final Pe or Kaf.
        # However, the benefit of keeping these letters as Non-Final letters
        # outweighs the damage, since such words are quite rare.
        return c in [self.NORMAL_KAF, self.NORMAL_MEM,
                     self.NORMAL_NUN, self.NORMAL_PE]

    def feed(self, byte_str):
        # Final letter analysis for logical-visual decision.
        # Look for evidence that the received buffer is either logical Hebrew
        # or visual Hebrew.
        # The following cases are checked:
        # 1) A word longer than 1 letter, ending with a final letter. This is
        #    an indication that the text is laid out "naturally" since the
        #    final letter really appears at the end. +1 for logical score.
        # 2) A word longer than 1 letter, ending with a Non-Final letter. In
        #    normal Hebrew, words ending with Kaf, Mem, Nun, Pe or Tsadi,
        #    should not end with the Non-Final form of that letter. Exceptions
        #    to this rule are mentioned above in isNonFinal(). This is an
        #    indication that the text is laid out backwards. +1 for visual
        #    score
        # 3) A word longer than 1 letter, starting with a final letter. Final
        #    letters should not appear at the beginning of a word. This is an
        #    indication that the text is laid out backwards. +1 for visual
        #    score.
        #
        # The visual score and logical score are accumulated throughout the
        # text and are finally checked against each other in the charset_name
        # property.
        # No checking for final letters in the middle of words is done since
        # that case is not an indication for either Logical or Visual text.
        #
        # We automatically filter out all 7-bit characters (replace them with
        # spaces) so the word boundary detection works properly. [MAP]

        if self.state == ProbingState.NOT_ME:
            # Both model probers say it's not them. No reason to continue.
            return ProbingState.NOT_ME

        byte_str = self.filter_high_byte_only(byte_str)

        for cur in byte_str:
            if cur == ' ':
                # We stand on a space - a word just ended
                if self._before_prev != ' ':
                    # next-to-last char was not a space so self._prev is not a
                    # 1 letter word
                    if self.is_final(self._prev):
                        # case (1) [-2:not space][-1:final letter][cur:space]
                        self._final_char_logical_score += 1
                    elif self.is_non_final(self._prev):
                        # case (2) [-2:not space][-1:Non-Final letter][
                        #  cur:space]
                        self._final_char_visual_score += 1
            else:
                # Not standing on a space
                if ((self._before_prev == ' ') and
                        (self.is_final(self._prev)) and (cur != ' ')):
                    # case (3) [-2:space][-1:final letter][cur:not space]
                    self._final_char_visual_score += 1
            self._before_prev = self._prev
            self._prev = cur

        # Forever detecting, till the end or until both model probers return
        # ProbingState.NOT_ME (handled above)
        return ProbingState.DETECTING

    @property
    def charset_name(self):
        # Make the decision: is it Logical or Visual?
        # If the final letter score distance is dominant enough, rely on it.
        finalsub = self._final_char_logical_score - self._final_char_visual_score
        if finalsub >= self.MIN_FINAL_CHAR_DISTANCE:
            return self.LOGICAL_HEBREW_NAME
        if finalsub <= -self.MIN_FINAL_CHAR_DISTANCE:
            return self.VISUAL_HEBREW_NAME

        # It's not dominant enough, try to rely on the model scores instead.
        modelsub = (self._logical_prober.get_confidence()
                    - self._visual_prober.get_confidence())
        if modelsub > self.MIN_MODEL_DISTANCE:
            return self.LOGICAL_HEBREW_NAME
        if modelsub < -self.MIN_MODEL_DISTANCE:
            return self.VISUAL_HEBREW_NAME

        # Still no good, back to final letter distance, maybe it'll save the
        # day.
        if finalsub < 0.0:
            return self.VISUAL_HEBREW_NAME

        # Either finalsub > 0 (pointing to Logical) or we still don't know;
        # default to Logical.
        return self.LOGICAL_HEBREW_NAME

    @property
    def language(self):
        return 'Hebrew'

    @property
    def state(self):
        # Remain active as long as any of the model probers are active.
        if (self._logical_prober.state == ProbingState.NOT_ME) and \
           (self._visual_prober.state == ProbingState.NOT_ME):
            return ProbingState.NOT_ME
        return ProbingState.DETECTING
site-packages/pip/_vendor/chardet/utf8prober.py
######################## BEGIN LICENSE BLOCK ########################
# The Original Code is mozilla.org code.
#
# The Initial Developer of the Original Code is
# Netscape Communications Corporation.
# Portions created by the Initial Developer are Copyright (C) 1998
# the Initial Developer. All Rights Reserved.
#
# Contributor(s):
#   Mark Pilgrim - port to Python
#
# This library is free software; you can redistribute it and/or
# modify it under the terms of the GNU Lesser General Public
# License as published by the Free Software Foundation; either
# version 2.1 of the License, or (at your option) any later version.
#
# This library is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
# Lesser General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public
# License along with this library; if not, write to the Free Software
# Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
# 02110-1301  USA
######################### END LICENSE BLOCK #########################

from .charsetprober import CharSetProber
from .enums import ProbingState, MachineState
from .codingstatemachine import CodingStateMachine
from .mbcssm import UTF8_SM_MODEL



class UTF8Prober(CharSetProber):
    ONE_CHAR_PROB = 0.5

    def __init__(self):
        super(UTF8Prober, self).__init__()
        self.coding_sm = CodingStateMachine(UTF8_SM_MODEL)
        self._num_mb_chars = None
        self.reset()

    def reset(self):
        super(UTF8Prober, self).reset()
        self.coding_sm.reset()
        self._num_mb_chars = 0

    @property
    def charset_name(self):
        return "utf-8"

    @property
    def language(self):
        return ""

    def feed(self, byte_str):
        for c in byte_str:
            coding_state = self.coding_sm.next_state(c)
            if coding_state == MachineState.ERROR:
                self._state = ProbingState.NOT_ME
                break
            elif coding_state == MachineState.ITS_ME:
                self._state = ProbingState.FOUND_IT
                break
            elif coding_state == MachineState.START:
                if self.coding_sm.get_current_charlen() >= 2:
                    self._num_mb_chars += 1

        if self.state == ProbingState.DETECTING:
            if self.get_confidence() > self.SHORTCUT_THRESHOLD:
                self._state = ProbingState.FOUND_IT

        return self.state

    def get_confidence(self):
        unlike = 0.99
        if self._num_mb_chars < 6:
            unlike *= self.ONE_CHAR_PROB ** self._num_mb_chars
            return 1.0 - unlike
        else:
            return unlike
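

# A minimal usage sketch (not part of the original module): feed UTF-8 bytes
# and read the result. With fewer than 6 multi-byte characters the confidence
# is 1.0 - 0.99 * ONE_CHAR_PROB ** n, so it rises quickly with every
# multi-byte sequence seen; the sample text here is an arbitrary choice.
def _example_utf8_probe():
    prober = UTF8Prober()
    state = prober.feed(u'\u65e5\u672c\u8a9e'.encode('utf-8'))
    return prober.charset_name, prober.get_confidence(), state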
site-packages/pip/_vendor/chardet/__init__.py
######################## BEGIN LICENSE BLOCK ########################
# This library is free software; you can redistribute it and/or
# modify it under the terms of the GNU Lesser General Public
# License as published by the Free Software Foundation; either
# version 2.1 of the License, or (at your option) any later version.
#
# This library is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
# Lesser General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public
# License along with this library; if not, write to the Free Software
# Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
# 02110-1301  USA
######################### END LICENSE BLOCK #########################


from .compat import PY2, PY3
from .universaldetector import UniversalDetector
from .version import __version__, VERSION


def detect(byte_str):
    """
    Detect the encoding of the given byte string.

    :param byte_str:     The byte sequence to examine.
    :type byte_str:      ``bytes`` or ``bytearray``
    """
    if not isinstance(byte_str, bytearray):
        if not isinstance(byte_str, bytes):
            raise TypeError('Expected object of type bytes or bytearray, got: '
                            '{0}'.format(type(byte_str)))
        else:
            byte_str = bytearray(byte_str)
    detector = UniversalDetector()
    detector.feed(byte_str)
    return detector.close()
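

# A minimal usage sketch (the input bytes are an arbitrary example): the
# returned dict carries 'encoding', 'confidence' and 'language' keys, and the
# exact values depend on the data fed in.
def _example_detect():
    data = u'\u05e2\u05d1\u05e8\u05d9\u05ea'.encode('utf-8')
    return detect(data)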
site-packages/pip/_vendor/chardet/langhebrewmodel.py
######################## BEGIN LICENSE BLOCK ########################
# The Original Code is Mozilla Universal charset detector code.
#
# The Initial Developer of the Original Code is
#          Simon Montagu
# Portions created by the Initial Developer are Copyright (C) 2005
# the Initial Developer. All Rights Reserved.
#
# Contributor(s):
#   Mark Pilgrim - port to Python
#   Shy Shalom - original C code
#   Shoshannah Forbes - original C code (?)
#
# This library is free software; you can redistribute it and/or
# modify it under the terms of the GNU Lesser General Public
# License as published by the Free Software Foundation; either
# version 2.1 of the License, or (at your option) any later version.
#
# This library is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
# Lesser General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public
# License along with this library; if not, write to the Free Software
# Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
# 02110-1301  USA
######################### END LICENSE BLOCK #########################

# 255: Control characters that usually do not exist in any text
# 254: Carriage Return / Line Feed
# 253: symbol (punctuation) that does not belong to word
# 252: 0 - 9

# Windows-1255 language model
# Character Mapping Table:
WIN1255_CHAR_TO_ORDER_MAP = (
255,255,255,255,255,255,255,255,255,255,254,255,255,254,255,255,  # 00
255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,  # 10
253,253,253,253,253,253,253,253,253,253,253,253,253,253,253,253,  # 20
252,252,252,252,252,252,252,252,252,252,253,253,253,253,253,253,  # 30
253, 69, 91, 79, 80, 92, 89, 97, 90, 68,111,112, 82, 73, 95, 85,  # 40
 78,121, 86, 71, 67,102,107, 84,114,103,115,253,253,253,253,253,  # 50
253, 50, 74, 60, 61, 42, 76, 70, 64, 53,105, 93, 56, 65, 54, 49,  # 60
 66,110, 51, 43, 44, 63, 81, 77, 98, 75,108,253,253,253,253,253,  # 70
124,202,203,204,205, 40, 58,206,207,208,209,210,211,212,213,214,
215, 83, 52, 47, 46, 72, 32, 94,216,113,217,109,218,219,220,221,
 34,116,222,118,100,223,224,117,119,104,125,225,226, 87, 99,227,
106,122,123,228, 55,229,230,101,231,232,120,233, 48, 39, 57,234,
 30, 59, 41, 88, 33, 37, 36, 31, 29, 35,235, 62, 28,236,126,237,
238, 38, 45,239,240,241,242,243,127,244,245,246,247,248,249,250,
  9,  8, 20, 16,  3,  2, 24, 14, 22,  1, 25, 15,  4, 11,  6, 23,
 12, 19, 13, 26, 18, 27, 21, 17,  7, 10,  5,251,252,128, 96,253,
)
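
# A small sketch of how the map above is read: indexing by a byte value yields
# either one of the special classes documented above (252-255) or a frequency
# order, where lower numbers are assumed to mean more frequent characters.
def _example_char_order(byte_value):
    order = WIN1255_CHAR_TO_ORDER_MAP[byte_value]
    return order, order >= 252  # (order, is_special_class)
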

# Model Table:
# total sequences: 100%
# first 512 sequences: 98.4004%
# first 1024 sequences: 1.5981%
# rest  sequences:      0.087%
# negative sequences:   0.0015%
HEBREW_LANG_MODEL = (
0,3,3,3,3,3,3,3,3,3,3,2,3,3,3,3,3,3,3,3,3,3,3,2,3,2,1,2,0,1,0,0,
3,0,3,1,0,0,1,3,2,0,1,1,2,0,2,2,2,1,1,1,1,2,1,1,1,2,0,0,2,2,0,1,
3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,2,2,2,2,
1,2,1,2,1,2,0,0,2,0,0,0,0,0,1,0,1,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,
3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,2,2,2,
1,2,1,3,1,1,0,0,2,0,0,0,1,0,1,0,1,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,
3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,1,0,1,2,2,1,3,
1,2,1,1,2,2,0,0,2,2,0,0,0,0,1,0,1,0,0,0,1,0,0,0,0,0,0,1,0,1,1,0,
3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,2,3,3,2,2,2,2,3,2,
1,2,1,2,2,2,0,0,1,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,
3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,2,3,3,2,3,2,2,3,2,2,2,1,2,2,2,2,
1,2,1,1,2,2,0,1,2,0,0,0,0,0,0,0,1,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,
3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,2,0,2,2,2,2,2,
0,2,0,2,2,2,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,
3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,2,3,0,2,2,2,
0,2,1,2,2,2,0,0,2,1,0,0,0,0,1,0,1,0,0,0,0,0,0,2,0,0,0,0,0,0,1,0,
3,3,3,3,3,3,3,3,3,3,3,2,3,3,3,3,3,3,3,3,3,3,3,3,3,2,1,2,3,2,2,2,
1,2,1,2,2,2,0,0,1,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,1,0,
3,3,3,3,3,3,3,3,3,2,3,3,3,2,3,3,3,3,3,3,3,3,3,3,3,3,3,1,0,2,0,2,
0,2,1,2,2,2,0,0,1,2,0,0,0,0,1,0,1,0,0,0,0,0,0,1,0,0,0,2,0,0,1,0,
3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,2,3,2,3,2,2,3,2,1,2,1,1,1,
0,1,1,1,1,1,3,0,1,0,0,0,0,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,
3,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,0,1,1,0,1,1,0,0,1,0,0,1,0,0,0,0,
0,0,1,0,0,0,0,0,2,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,2,2,2,2,2,2,2,
0,2,0,1,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,
3,3,3,3,3,3,3,3,3,2,3,3,3,2,1,2,3,3,2,3,3,3,3,2,3,2,1,2,0,2,1,2,
0,2,0,2,2,2,0,0,1,2,0,0,0,0,1,0,1,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,
3,3,3,3,3,3,3,3,3,2,3,3,3,1,2,2,3,3,2,3,2,3,2,2,3,1,2,2,0,2,2,2,
0,2,1,2,2,2,0,0,1,2,0,0,0,0,1,0,0,0,0,0,1,0,0,1,0,0,0,1,0,0,1,0,
3,3,3,3,3,3,3,3,3,3,3,3,3,2,3,3,3,2,3,3,2,2,2,3,3,3,3,1,3,2,2,2,
0,2,0,1,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,
3,3,3,3,3,3,3,3,3,3,3,3,3,3,2,2,3,3,3,2,3,2,2,2,1,2,2,0,2,2,2,2,
0,2,0,2,2,2,0,0,1,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,
3,3,3,3,3,3,3,3,3,3,3,2,3,3,3,1,3,2,3,3,2,3,3,2,2,1,2,2,2,2,2,2,
0,2,1,2,1,2,0,0,1,0,0,0,0,0,1,0,0,0,0,0,1,0,0,1,0,0,0,0,0,0,1,0,
3,3,3,3,3,3,2,3,2,3,3,2,3,3,3,3,2,3,2,3,3,3,3,3,2,2,2,2,2,2,2,1,
0,2,0,1,2,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,
3,3,3,3,3,3,3,3,3,2,1,2,3,3,3,3,3,3,3,2,3,2,3,2,1,2,3,0,2,1,2,2,
0,2,1,1,2,1,0,0,1,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,2,0,
3,3,3,3,3,3,3,3,3,2,3,3,3,3,2,1,3,1,2,2,2,1,2,3,3,1,2,1,2,2,2,2,
0,1,1,1,1,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,0,2,0,0,0,0,0,0,0,0,
3,3,3,3,3,3,3,3,3,3,0,2,3,3,3,1,3,3,3,1,2,2,2,2,1,1,2,2,2,2,2,2,
0,2,0,1,1,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,
3,3,3,3,3,3,2,3,3,3,2,2,3,3,3,2,1,2,3,2,3,2,2,2,2,1,2,1,1,1,2,2,
0,2,1,1,1,1,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,
3,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,0,1,0,0,0,1,0,0,0,0,0,
1,0,1,0,0,0,0,0,2,0,0,0,0,0,1,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
3,3,3,3,3,2,3,3,2,3,1,2,2,2,2,3,2,3,1,1,2,2,1,2,2,1,1,0,2,2,2,2,
0,1,0,1,2,2,0,0,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,
3,0,0,1,1,0,1,0,0,1,1,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,2,2,0,
0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
3,0,1,0,1,0,1,1,0,1,1,0,0,0,1,1,0,1,1,1,0,0,0,0,0,0,1,0,0,0,0,0,
0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
3,0,0,0,1,1,0,1,0,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,
0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,
3,2,2,1,2,2,2,2,2,2,2,1,2,2,1,2,2,1,1,1,1,1,1,1,1,2,1,1,0,3,3,3,
0,3,0,2,2,2,2,0,0,1,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,
2,2,2,3,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,1,2,2,1,2,2,2,1,1,1,2,0,1,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
2,2,2,2,2,2,2,2,2,2,2,1,2,2,2,2,2,2,2,2,2,2,2,0,2,2,0,0,0,0,0,0,
0,0,0,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
2,3,1,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,1,2,1,0,2,1,0,
0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
3,1,1,1,1,1,1,1,1,1,1,0,0,1,1,1,1,0,1,1,1,1,0,0,0,0,0,0,0,0,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,
0,3,1,1,2,2,2,2,2,1,2,2,2,1,1,2,2,2,2,2,2,2,1,2,2,1,0,1,1,1,1,0,
0,1,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
3,2,1,1,1,1,2,1,1,2,1,0,1,1,1,1,1,1,1,1,1,1,1,0,1,0,0,0,0,0,0,0,
0,0,2,0,0,0,0,0,0,0,0,1,1,0,0,0,0,1,1,0,0,1,1,0,0,0,0,0,0,1,0,0,
2,1,1,2,2,2,2,2,2,2,2,2,2,2,1,2,2,2,2,2,1,2,1,2,1,1,1,1,0,0,0,0,
0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
1,2,1,2,2,2,2,2,2,2,2,2,2,1,2,1,2,1,1,2,1,1,1,2,1,2,1,2,0,1,0,1,
0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,3,1,2,2,2,1,2,2,2,2,2,2,2,2,1,2,1,1,1,1,1,1,2,1,2,1,1,0,1,0,1,
0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
2,1,2,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,2,2,2,
0,2,0,1,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,
3,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,0,1,0,0,0,0,0,0,0,0,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
2,1,1,1,1,1,1,1,0,1,1,0,1,0,0,1,0,0,1,0,0,0,0,0,1,0,0,0,0,0,0,0,
0,0,0,0,0,0,0,0,2,0,1,1,1,0,1,0,0,0,1,1,0,1,1,0,0,0,0,0,1,1,0,0,
0,1,1,1,2,1,2,2,2,0,2,0,2,0,1,1,2,1,1,1,1,2,1,0,1,1,0,0,0,0,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,
1,0,1,0,0,0,0,0,1,0,1,2,2,0,1,0,0,1,1,2,2,1,2,0,2,0,0,0,1,2,0,1,
2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,0,0,0,0,0,0,0,2,0,2,1,2,0,2,0,0,1,1,1,1,1,1,0,1,0,0,0,1,0,0,1,
2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,0,1,0,0,0,0,0,1,0,2,1,1,0,1,0,0,1,1,1,2,2,0,0,1,0,0,0,1,0,0,1,
1,1,2,1,0,1,1,1,0,1,0,1,1,1,1,0,0,0,1,0,1,0,0,0,0,0,0,0,0,2,2,1,
0,2,0,1,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
2,1,0,0,1,0,1,1,1,1,0,0,0,0,0,1,0,0,0,0,1,1,0,0,0,0,0,0,0,0,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
1,1,1,1,1,1,1,1,1,2,1,0,1,1,1,1,1,1,1,1,1,1,1,0,1,0,0,0,0,0,0,0,
0,0,0,0,0,0,0,0,0,0,1,1,1,0,0,0,0,1,1,1,0,1,1,0,1,0,0,0,1,1,0,1,
2,0,1,0,1,0,1,0,0,1,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,0,0,0,0,0,0,0,1,0,1,1,1,0,1,0,0,1,1,2,1,1,2,0,1,0,0,0,1,1,0,1,
1,0,0,1,0,0,1,0,0,0,1,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,0,0,0,0,0,0,0,1,0,1,1,2,0,1,0,0,0,0,2,1,1,2,0,2,0,0,0,1,1,0,1,
1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,0,0,0,0,0,0,0,1,0,2,1,1,0,1,0,0,2,2,1,2,1,1,0,1,0,0,0,1,1,0,1,
2,0,1,0,0,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,0,0,0,0,0,0,0,0,0,1,2,2,0,0,0,0,0,1,1,0,1,0,0,1,0,0,0,0,1,0,1,
1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,0,0,0,0,0,0,0,0,0,1,2,2,0,0,0,0,2,1,1,1,0,2,1,1,0,0,0,2,1,0,1,
1,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,0,0,0,0,0,0,0,1,0,1,1,2,0,1,0,0,1,1,0,2,1,1,0,1,0,0,0,1,1,0,1,
2,2,1,1,1,0,1,1,0,1,1,0,1,0,0,0,0,0,0,1,0,0,0,1,0,0,0,0,0,0,0,0,
0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,0,0,0,0,0,0,0,1,0,2,1,1,0,1,0,0,1,1,0,1,2,1,0,2,0,0,0,1,1,0,1,
2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,2,0,0,0,0,0,
0,1,0,0,2,0,2,1,1,0,1,0,1,0,0,1,0,0,0,0,1,0,0,0,1,0,0,0,0,0,1,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
1,0,0,1,0,0,1,0,0,1,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,0,0,0,0,0,0,0,1,0,1,1,2,0,1,0,0,1,1,1,0,1,0,0,1,0,0,0,1,0,0,1,
1,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
1,0,0,0,0,0,0,0,1,0,1,1,0,0,1,0,0,2,1,1,1,1,1,0,1,0,0,0,0,1,0,1,
0,1,1,1,2,1,1,1,1,0,1,1,1,1,1,1,1,1,1,1,1,1,0,1,1,0,0,0,0,0,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,0,0,0,0,0,0,0,0,0,1,2,1,0,0,0,0,0,1,1,1,1,1,0,1,0,0,0,1,1,0,0,
)

Win1255HebrewModel = {
  'char_to_order_map': WIN1255_CHAR_TO_ORDER_MAP,
  'precedence_matrix': HEBREW_LANG_MODEL,
  'typical_positive_ratio': 0.984004,
  'keep_english_letter': False,
  'charset_name': "windows-1255",
  'language': 'Hebrew',
}
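

# A minimal sketch of how this model is assumed to be consumed by
# SingleByteCharSetProber: each byte is mapped to its frequency order, and the
# flattened 64x64 precedence matrix is indexed by
# (previous_order * 64 + current_order) to classify the pair; the visual-Hebrew
# prober simply swaps the two orders.
def _example_pair_score(prev_byte, cur_byte, sample_size=64):
    orders = Win1255HebrewModel['char_to_order_map']
    matrix = Win1255HebrewModel['precedence_matrix']
    prev_order, cur_order = orders[prev_byte], orders[cur_byte]
    if prev_order < sample_size and cur_order < sample_size:
        return matrix[prev_order * sample_size + cur_order]
    return None  # at least one byte falls outside the sampled (frequent) set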
site-packages/pip/_vendor/chardet/version.py
"""
This module exists only to simplify retrieving the version number of chardet
from within setup.py and from chardet subpackages.

:author: Dan Blanchard (dan.blanchard@gmail.com)
"""

__version__ = "3.0.4"
VERSION = __version__.split('.')
site-packages/pip/_vendor/chardet/mbcssm.py
######################## BEGIN LICENSE BLOCK ########################
# The Original Code is mozilla.org code.
#
# The Initial Developer of the Original Code is
# Netscape Communications Corporation.
# Portions created by the Initial Developer are Copyright (C) 1998
# the Initial Developer. All Rights Reserved.
#
# Contributor(s):
#   Mark Pilgrim - port to Python
#
# This library is free software; you can redistribute it and/or
# modify it under the terms of the GNU Lesser General Public
# License as published by the Free Software Foundation; either
# version 2.1 of the License, or (at your option) any later version.
#
# This library is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
# Lesser General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public
# License along with this library; if not, write to the Free Software
# Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
# 02110-1301  USA
######################### END LICENSE BLOCK #########################

from .enums import MachineState

# BIG5

BIG5_CLS = (
    1,1,1,1,1,1,1,1,  # 00 - 07    #allow 0x00 as legal value
    1,1,1,1,1,1,0,0,  # 08 - 0f
    1,1,1,1,1,1,1,1,  # 10 - 17
    1,1,1,0,1,1,1,1,  # 18 - 1f
    1,1,1,1,1,1,1,1,  # 20 - 27
    1,1,1,1,1,1,1,1,  # 28 - 2f
    1,1,1,1,1,1,1,1,  # 30 - 37
    1,1,1,1,1,1,1,1,  # 38 - 3f
    2,2,2,2,2,2,2,2,  # 40 - 47
    2,2,2,2,2,2,2,2,  # 48 - 4f
    2,2,2,2,2,2,2,2,  # 50 - 57
    2,2,2,2,2,2,2,2,  # 58 - 5f
    2,2,2,2,2,2,2,2,  # 60 - 67
    2,2,2,2,2,2,2,2,  # 68 - 6f
    2,2,2,2,2,2,2,2,  # 70 - 77
    2,2,2,2,2,2,2,1,  # 78 - 7f
    4,4,4,4,4,4,4,4,  # 80 - 87
    4,4,4,4,4,4,4,4,  # 88 - 8f
    4,4,4,4,4,4,4,4,  # 90 - 97
    4,4,4,4,4,4,4,4,  # 98 - 9f
    4,3,3,3,3,3,3,3,  # a0 - a7
    3,3,3,3,3,3,3,3,  # a8 - af
    3,3,3,3,3,3,3,3,  # b0 - b7
    3,3,3,3,3,3,3,3,  # b8 - bf
    3,3,3,3,3,3,3,3,  # c0 - c7
    3,3,3,3,3,3,3,3,  # c8 - cf
    3,3,3,3,3,3,3,3,  # d0 - d7
    3,3,3,3,3,3,3,3,  # d8 - df
    3,3,3,3,3,3,3,3,  # e0 - e7
    3,3,3,3,3,3,3,3,  # e8 - ef
    3,3,3,3,3,3,3,3,  # f0 - f7
    3,3,3,3,3,3,3,0  # f8 - ff
)

BIG5_ST = (
    MachineState.ERROR,MachineState.START,MachineState.START,     3,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,#00-07
    MachineState.ERROR,MachineState.ERROR,MachineState.ITS_ME,MachineState.ITS_ME,MachineState.ITS_ME,MachineState.ITS_ME,MachineState.ITS_ME,MachineState.ERROR,#08-0f
    MachineState.ERROR,MachineState.START,MachineState.START,MachineState.START,MachineState.START,MachineState.START,MachineState.START,MachineState.START#10-17
)

BIG5_CHAR_LEN_TABLE = (0, 1, 1, 2, 0)

BIG5_SM_MODEL = {'class_table': BIG5_CLS,
                 'class_factor': 5,
                 'state_table': BIG5_ST,
                 'char_len_table': BIG5_CHAR_LEN_TABLE,
                 'name': 'Big5'}
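
# A minimal sketch of how these state-machine models are assumed to be driven
# (mirroring CodingStateMachine in this package): a byte is first mapped to a
# class via class_table, then the flattened state_table is indexed by
# (current_state * class_factor + byte_class) to obtain the next state.
def _example_sm_step(model, current_state, byte_value):
    byte_class = model['class_table'][byte_value]
    return model['state_table'][current_state * model['class_factor']
                                + byte_class]
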

# CP949

CP949_CLS  = (
    1,1,1,1,1,1,1,1, 1,1,1,1,1,1,0,0,  # 00 - 0f
    1,1,1,1,1,1,1,1, 1,1,1,0,1,1,1,1,  # 10 - 1f
    1,1,1,1,1,1,1,1, 1,1,1,1,1,1,1,1,  # 20 - 2f
    1,1,1,1,1,1,1,1, 1,1,1,1,1,1,1,1,  # 30 - 3f
    1,4,4,4,4,4,4,4, 4,4,4,4,4,4,4,4,  # 40 - 4f
    4,4,5,5,5,5,5,5, 5,5,5,1,1,1,1,1,  # 50 - 5f
    1,5,5,5,5,5,5,5, 5,5,5,5,5,5,5,5,  # 60 - 6f
    5,5,5,5,5,5,5,5, 5,5,5,1,1,1,1,1,  # 70 - 7f
    0,6,6,6,6,6,6,6, 6,6,6,6,6,6,6,6,  # 80 - 8f
    6,6,6,6,6,6,6,6, 6,6,6,6,6,6,6,6,  # 90 - 9f
    6,7,7,7,7,7,7,7, 7,7,7,7,7,8,8,8,  # a0 - af
    7,7,7,7,7,7,7,7, 7,7,7,7,7,7,7,7,  # b0 - bf
    7,7,7,7,7,7,9,2, 2,3,2,2,2,2,2,2,  # c0 - cf
    2,2,2,2,2,2,2,2, 2,2,2,2,2,2,2,2,  # d0 - df
    2,2,2,2,2,2,2,2, 2,2,2,2,2,2,2,2,  # e0 - ef
    2,2,2,2,2,2,2,2, 2,2,2,2,2,2,2,0,  # f0 - ff
)

CP949_ST = (
#cls=    0      1      2      3      4      5      6      7      8      9  # previous state =
    MachineState.ERROR,MachineState.START,     3,MachineState.ERROR,MachineState.START,MachineState.START,     4,     5,MachineState.ERROR,     6, # MachineState.START
    MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR, # MachineState.ERROR
    MachineState.ITS_ME,MachineState.ITS_ME,MachineState.ITS_ME,MachineState.ITS_ME,MachineState.ITS_ME,MachineState.ITS_ME,MachineState.ITS_ME,MachineState.ITS_ME,MachineState.ITS_ME,MachineState.ITS_ME, # MachineState.ITS_ME
    MachineState.ERROR,MachineState.ERROR,MachineState.START,MachineState.START,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.START,MachineState.START,MachineState.START, # 3
    MachineState.ERROR,MachineState.ERROR,MachineState.START,MachineState.START,MachineState.START,MachineState.START,MachineState.START,MachineState.START,MachineState.START,MachineState.START, # 4
    MachineState.ERROR,MachineState.START,MachineState.START,MachineState.START,MachineState.START,MachineState.START,MachineState.START,MachineState.START,MachineState.START,MachineState.START, # 5
    MachineState.ERROR,MachineState.START,MachineState.START,MachineState.START,MachineState.START,MachineState.ERROR,MachineState.ERROR,MachineState.START,MachineState.START,MachineState.START, # 6
)

CP949_CHAR_LEN_TABLE = (0, 1, 2, 0, 1, 1, 2, 2, 0, 2)

CP949_SM_MODEL = {'class_table': CP949_CLS,
                  'class_factor': 10,
                  'state_table': CP949_ST,
                  'char_len_table': CP949_CHAR_LEN_TABLE,
                  'name': 'CP949'}

# EUC-JP

EUCJP_CLS = (
    4,4,4,4,4,4,4,4,  # 00 - 07
    4,4,4,4,4,4,5,5,  # 08 - 0f
    4,4,4,4,4,4,4,4,  # 10 - 17
    4,4,4,5,4,4,4,4,  # 18 - 1f
    4,4,4,4,4,4,4,4,  # 20 - 27
    4,4,4,4,4,4,4,4,  # 28 - 2f
    4,4,4,4,4,4,4,4,  # 30 - 37
    4,4,4,4,4,4,4,4,  # 38 - 3f
    4,4,4,4,4,4,4,4,  # 40 - 47
    4,4,4,4,4,4,4,4,  # 48 - 4f
    4,4,4,4,4,4,4,4,  # 50 - 57
    4,4,4,4,4,4,4,4,  # 58 - 5f
    4,4,4,4,4,4,4,4,  # 60 - 67
    4,4,4,4,4,4,4,4,  # 68 - 6f
    4,4,4,4,4,4,4,4,  # 70 - 77
    4,4,4,4,4,4,4,4,  # 78 - 7f
    5,5,5,5,5,5,5,5,  # 80 - 87
    5,5,5,5,5,5,1,3,  # 88 - 8f
    5,5,5,5,5,5,5,5,  # 90 - 97
    5,5,5,5,5,5,5,5,  # 98 - 9f
    5,2,2,2,2,2,2,2,  # a0 - a7
    2,2,2,2,2,2,2,2,  # a8 - af
    2,2,2,2,2,2,2,2,  # b0 - b7
    2,2,2,2,2,2,2,2,  # b8 - bf
    2,2,2,2,2,2,2,2,  # c0 - c7
    2,2,2,2,2,2,2,2,  # c8 - cf
    2,2,2,2,2,2,2,2,  # d0 - d7
    2,2,2,2,2,2,2,2,  # d8 - df
    0,0,0,0,0,0,0,0,  # e0 - e7
    0,0,0,0,0,0,0,0,  # e8 - ef
    0,0,0,0,0,0,0,0,  # f0 - f7
    0,0,0,0,0,0,0,5  # f8 - ff
)

EUCJP_ST = (
          3,     4,     3,     5,MachineState.START,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,#00-07
     MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ITS_ME,MachineState.ITS_ME,MachineState.ITS_ME,MachineState.ITS_ME,#08-0f
     MachineState.ITS_ME,MachineState.ITS_ME,MachineState.START,MachineState.ERROR,MachineState.START,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,#10-17
     MachineState.ERROR,MachineState.ERROR,MachineState.START,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,     3,MachineState.ERROR,#18-1f
          3,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.START,MachineState.START,MachineState.START,MachineState.START#20-27
)

EUCJP_CHAR_LEN_TABLE = (2, 2, 2, 3, 1, 0)

EUCJP_SM_MODEL = {'class_table': EUCJP_CLS,
                  'class_factor': 6,
                  'state_table': EUCJP_ST,
                  'char_len_table': EUCJP_CHAR_LEN_TABLE,
                  'name': 'EUC-JP'}

# EUC-KR

EUCKR_CLS  = (
    1,1,1,1,1,1,1,1,  # 00 - 07
    1,1,1,1,1,1,0,0,  # 08 - 0f
    1,1,1,1,1,1,1,1,  # 10 - 17
    1,1,1,0,1,1,1,1,  # 18 - 1f
    1,1,1,1,1,1,1,1,  # 20 - 27
    1,1,1,1,1,1,1,1,  # 28 - 2f
    1,1,1,1,1,1,1,1,  # 30 - 37
    1,1,1,1,1,1,1,1,  # 38 - 3f
    1,1,1,1,1,1,1,1,  # 40 - 47
    1,1,1,1,1,1,1,1,  # 48 - 4f
    1,1,1,1,1,1,1,1,  # 50 - 57
    1,1,1,1,1,1,1,1,  # 58 - 5f
    1,1,1,1,1,1,1,1,  # 60 - 67
    1,1,1,1,1,1,1,1,  # 68 - 6f
    1,1,1,1,1,1,1,1,  # 70 - 77
    1,1,1,1,1,1,1,1,  # 78 - 7f
    0,0,0,0,0,0,0,0,  # 80 - 87
    0,0,0,0,0,0,0,0,  # 88 - 8f
    0,0,0,0,0,0,0,0,  # 90 - 97
    0,0,0,0,0,0,0,0,  # 98 - 9f
    0,2,2,2,2,2,2,2,  # a0 - a7
    2,2,2,2,2,3,3,3,  # a8 - af
    2,2,2,2,2,2,2,2,  # b0 - b7
    2,2,2,2,2,2,2,2,  # b8 - bf
    2,2,2,2,2,2,2,2,  # c0 - c7
    2,3,2,2,2,2,2,2,  # c8 - cf
    2,2,2,2,2,2,2,2,  # d0 - d7
    2,2,2,2,2,2,2,2,  # d8 - df
    2,2,2,2,2,2,2,2,  # e0 - e7
    2,2,2,2,2,2,2,2,  # e8 - ef
    2,2,2,2,2,2,2,2,  # f0 - f7
    2,2,2,2,2,2,2,0   # f8 - ff
)

EUCKR_ST = (
    MachineState.ERROR,MachineState.START,     3,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,#00-07
    MachineState.ITS_ME,MachineState.ITS_ME,MachineState.ITS_ME,MachineState.ITS_ME,MachineState.ERROR,MachineState.ERROR,MachineState.START,MachineState.START #08-0f
)

EUCKR_CHAR_LEN_TABLE = (0, 1, 2, 0)

EUCKR_SM_MODEL = {'class_table': EUCKR_CLS,
                'class_factor': 4,
                'state_table': EUCKR_ST,
                'char_len_table': EUCKR_CHAR_LEN_TABLE,
                'name': 'EUC-KR'}

# EUC-TW

EUCTW_CLS = (
    2,2,2,2,2,2,2,2,  # 00 - 07
    2,2,2,2,2,2,0,0,  # 08 - 0f
    2,2,2,2,2,2,2,2,  # 10 - 17
    2,2,2,0,2,2,2,2,  # 18 - 1f
    2,2,2,2,2,2,2,2,  # 20 - 27
    2,2,2,2,2,2,2,2,  # 28 - 2f
    2,2,2,2,2,2,2,2,  # 30 - 37
    2,2,2,2,2,2,2,2,  # 38 - 3f
    2,2,2,2,2,2,2,2,  # 40 - 47
    2,2,2,2,2,2,2,2,  # 48 - 4f
    2,2,2,2,2,2,2,2,  # 50 - 57
    2,2,2,2,2,2,2,2,  # 58 - 5f
    2,2,2,2,2,2,2,2,  # 60 - 67
    2,2,2,2,2,2,2,2,  # 68 - 6f
    2,2,2,2,2,2,2,2,  # 70 - 77
    2,2,2,2,2,2,2,2,  # 78 - 7f
    0,0,0,0,0,0,0,0,  # 80 - 87
    0,0,0,0,0,0,6,0,  # 88 - 8f
    0,0,0,0,0,0,0,0,  # 90 - 97
    0,0,0,0,0,0,0,0,  # 98 - 9f
    0,3,4,4,4,4,4,4,  # a0 - a7
    5,5,1,1,1,1,1,1,  # a8 - af
    1,1,1,1,1,1,1,1,  # b0 - b7
    1,1,1,1,1,1,1,1,  # b8 - bf
    1,1,3,1,3,3,3,3,  # c0 - c7
    3,3,3,3,3,3,3,3,  # c8 - cf
    3,3,3,3,3,3,3,3,  # d0 - d7
    3,3,3,3,3,3,3,3,  # d8 - df
    3,3,3,3,3,3,3,3,  # e0 - e7
    3,3,3,3,3,3,3,3,  # e8 - ef
    3,3,3,3,3,3,3,3,  # f0 - f7
    3,3,3,3,3,3,3,0   # f8 - ff
)

EUCTW_ST = (
    MachineState.ERROR,MachineState.ERROR,MachineState.START,     3,     3,     3,     4,MachineState.ERROR,#00-07
    MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ITS_ME,MachineState.ITS_ME,#08-0f
    MachineState.ITS_ME,MachineState.ITS_ME,MachineState.ITS_ME,MachineState.ITS_ME,MachineState.ITS_ME,MachineState.ERROR,MachineState.START,MachineState.ERROR,#10-17
    MachineState.START,MachineState.START,MachineState.START,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,#18-1f
         5,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.START,MachineState.ERROR,MachineState.START,MachineState.START,#20-27
    MachineState.START,MachineState.ERROR,MachineState.START,MachineState.START,MachineState.START,MachineState.START,MachineState.START,MachineState.START #28-2f
)

EUCTW_CHAR_LEN_TABLE = (0, 0, 1, 2, 2, 2, 3)

EUCTW_SM_MODEL = {'class_table': EUCTW_CLS,
                'class_factor': 7,
                'state_table': EUCTW_ST,
                'char_len_table': EUCTW_CHAR_LEN_TABLE,
                'name': 'x-euc-tw'}

# GB2312

GB2312_CLS = (
    1,1,1,1,1,1,1,1,  # 00 - 07
    1,1,1,1,1,1,0,0,  # 08 - 0f
    1,1,1,1,1,1,1,1,  # 10 - 17
    1,1,1,0,1,1,1,1,  # 18 - 1f
    1,1,1,1,1,1,1,1,  # 20 - 27
    1,1,1,1,1,1,1,1,  # 28 - 2f
    3,3,3,3,3,3,3,3,  # 30 - 37
    3,3,1,1,1,1,1,1,  # 38 - 3f
    2,2,2,2,2,2,2,2,  # 40 - 47
    2,2,2,2,2,2,2,2,  # 48 - 4f
    2,2,2,2,2,2,2,2,  # 50 - 57
    2,2,2,2,2,2,2,2,  # 58 - 5f
    2,2,2,2,2,2,2,2,  # 60 - 67
    2,2,2,2,2,2,2,2,  # 68 - 6f
    2,2,2,2,2,2,2,2,  # 70 - 77
    2,2,2,2,2,2,2,4,  # 78 - 7f
    5,6,6,6,6,6,6,6,  # 80 - 87
    6,6,6,6,6,6,6,6,  # 88 - 8f
    6,6,6,6,6,6,6,6,  # 90 - 97
    6,6,6,6,6,6,6,6,  # 98 - 9f
    6,6,6,6,6,6,6,6,  # a0 - a7
    6,6,6,6,6,6,6,6,  # a8 - af
    6,6,6,6,6,6,6,6,  # b0 - b7
    6,6,6,6,6,6,6,6,  # b8 - bf
    6,6,6,6,6,6,6,6,  # c0 - c7
    6,6,6,6,6,6,6,6,  # c8 - cf
    6,6,6,6,6,6,6,6,  # d0 - d7
    6,6,6,6,6,6,6,6,  # d8 - df
    6,6,6,6,6,6,6,6,  # e0 - e7
    6,6,6,6,6,6,6,6,  # e8 - ef
    6,6,6,6,6,6,6,6,  # f0 - f7
    6,6,6,6,6,6,6,0   # f8 - ff
)

GB2312_ST = (
    MachineState.ERROR,MachineState.START,MachineState.START,MachineState.START,MachineState.START,MachineState.START,     3,MachineState.ERROR,#00-07
    MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ITS_ME,MachineState.ITS_ME,#08-0f
    MachineState.ITS_ME,MachineState.ITS_ME,MachineState.ITS_ME,MachineState.ITS_ME,MachineState.ITS_ME,MachineState.ERROR,MachineState.ERROR,MachineState.START,#10-17
         4,MachineState.ERROR,MachineState.START,MachineState.START,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,#18-1f
    MachineState.ERROR,MachineState.ERROR,     5,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ITS_ME,MachineState.ERROR,#20-27
    MachineState.ERROR,MachineState.ERROR,MachineState.START,MachineState.START,MachineState.START,MachineState.START,MachineState.START,MachineState.START #28-2f
)

# To be accurate, the length of class 6 can be either 2 or 4.
# But it is not necessary to discriminate between the two since
# it is used for frequency analysis only, and we are validating
# each code range there as well. So it is safe to set it to be
# 2 here.
GB2312_CHAR_LEN_TABLE = (0, 1, 1, 1, 1, 1, 2)

GB2312_SM_MODEL = {'class_table': GB2312_CLS,
                   'class_factor': 7,
                   'state_table': GB2312_ST,
                   'char_len_table': GB2312_CHAR_LEN_TABLE,
                   'name': 'GB2312'}

# Shift_JIS

SJIS_CLS = (
    1,1,1,1,1,1,1,1,  # 00 - 07
    1,1,1,1,1,1,0,0,  # 08 - 0f
    1,1,1,1,1,1,1,1,  # 10 - 17
    1,1,1,0,1,1,1,1,  # 18 - 1f
    1,1,1,1,1,1,1,1,  # 20 - 27
    1,1,1,1,1,1,1,1,  # 28 - 2f
    1,1,1,1,1,1,1,1,  # 30 - 37
    1,1,1,1,1,1,1,1,  # 38 - 3f
    2,2,2,2,2,2,2,2,  # 40 - 47
    2,2,2,2,2,2,2,2,  # 48 - 4f
    2,2,2,2,2,2,2,2,  # 50 - 57
    2,2,2,2,2,2,2,2,  # 58 - 5f
    2,2,2,2,2,2,2,2,  # 60 - 67
    2,2,2,2,2,2,2,2,  # 68 - 6f
    2,2,2,2,2,2,2,2,  # 70 - 77
    2,2,2,2,2,2,2,1,  # 78 - 7f
    3,3,3,3,3,2,2,3,  # 80 - 87
    3,3,3,3,3,3,3,3,  # 88 - 8f
    3,3,3,3,3,3,3,3,  # 90 - 97
    3,3,3,3,3,3,3,3,  # 98 - 9f
    # 0xa0 is illegal in Shift_JIS encoding, but some pages do
    # contain such a byte. We need to be more forgiving of errors here.
    2,2,2,2,2,2,2,2,  # a0 - a7
    2,2,2,2,2,2,2,2,  # a8 - af
    2,2,2,2,2,2,2,2,  # b0 - b7
    2,2,2,2,2,2,2,2,  # b8 - bf
    2,2,2,2,2,2,2,2,  # c0 - c7
    2,2,2,2,2,2,2,2,  # c8 - cf
    2,2,2,2,2,2,2,2,  # d0 - d7
    2,2,2,2,2,2,2,2,  # d8 - df
    3,3,3,3,3,3,3,3,  # e0 - e7
    3,3,3,3,3,4,4,4,  # e8 - ef
    3,3,3,3,3,3,3,3,  # f0 - f7
    3,3,3,3,3,0,0,0)  # f8 - ff


SJIS_ST = (
    MachineState.ERROR,MachineState.START,MachineState.START,     3,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,#00-07
    MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ITS_ME,MachineState.ITS_ME,MachineState.ITS_ME,MachineState.ITS_ME,#08-0f
    MachineState.ITS_ME,MachineState.ITS_ME,MachineState.ERROR,MachineState.ERROR,MachineState.START,MachineState.START,MachineState.START,MachineState.START #10-17
)

SJIS_CHAR_LEN_TABLE = (0, 1, 1, 2, 0, 0)

SJIS_SM_MODEL = {'class_table': SJIS_CLS,
               'class_factor': 6,
               'state_table': SJIS_ST,
               'char_len_table': SJIS_CHAR_LEN_TABLE,
               'name': 'Shift_JIS'}

# UCS2-BE

UCS2BE_CLS = (
    0,0,0,0,0,0,0,0,  # 00 - 07
    0,0,1,0,0,2,0,0,  # 08 - 0f
    0,0,0,0,0,0,0,0,  # 10 - 17
    0,0,0,3,0,0,0,0,  # 18 - 1f
    0,0,0,0,0,0,0,0,  # 20 - 27
    0,3,3,3,3,3,0,0,  # 28 - 2f
    0,0,0,0,0,0,0,0,  # 30 - 37
    0,0,0,0,0,0,0,0,  # 38 - 3f
    0,0,0,0,0,0,0,0,  # 40 - 47
    0,0,0,0,0,0,0,0,  # 48 - 4f
    0,0,0,0,0,0,0,0,  # 50 - 57
    0,0,0,0,0,0,0,0,  # 58 - 5f
    0,0,0,0,0,0,0,0,  # 60 - 67
    0,0,0,0,0,0,0,0,  # 68 - 6f
    0,0,0,0,0,0,0,0,  # 70 - 77
    0,0,0,0,0,0,0,0,  # 78 - 7f
    0,0,0,0,0,0,0,0,  # 80 - 87
    0,0,0,0,0,0,0,0,  # 88 - 8f
    0,0,0,0,0,0,0,0,  # 90 - 97
    0,0,0,0,0,0,0,0,  # 98 - 9f
    0,0,0,0,0,0,0,0,  # a0 - a7
    0,0,0,0,0,0,0,0,  # a8 - af
    0,0,0,0,0,0,0,0,  # b0 - b7
    0,0,0,0,0,0,0,0,  # b8 - bf
    0,0,0,0,0,0,0,0,  # c0 - c7
    0,0,0,0,0,0,0,0,  # c8 - cf
    0,0,0,0,0,0,0,0,  # d0 - d7
    0,0,0,0,0,0,0,0,  # d8 - df
    0,0,0,0,0,0,0,0,  # e0 - e7
    0,0,0,0,0,0,0,0,  # e8 - ef
    0,0,0,0,0,0,0,0,  # f0 - f7
    0,0,0,0,0,0,4,5   # f8 - ff
)

UCS2BE_ST  = (
          5,     7,     7,MachineState.ERROR,     4,     3,MachineState.ERROR,MachineState.ERROR,#00-07
     MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ITS_ME,MachineState.ITS_ME,MachineState.ITS_ME,MachineState.ITS_ME,#08-0f
     MachineState.ITS_ME,MachineState.ITS_ME,     6,     6,     6,     6,MachineState.ERROR,MachineState.ERROR,#10-17
          6,     6,     6,     6,     6,MachineState.ITS_ME,     6,     6,#18-1f
          6,     6,     6,     6,     5,     7,     7,MachineState.ERROR,#20-27
          5,     8,     6,     6,MachineState.ERROR,     6,     6,     6,#28-2f
          6,     6,     6,     6,MachineState.ERROR,MachineState.ERROR,MachineState.START,MachineState.START #30-37
)

UCS2BE_CHAR_LEN_TABLE = (2, 2, 2, 0, 2, 2)

UCS2BE_SM_MODEL = {'class_table': UCS2BE_CLS,
                   'class_factor': 6,
                   'state_table': UCS2BE_ST,
                   'char_len_table': UCS2BE_CHAR_LEN_TABLE,
                   'name': 'UTF-16BE'}

# UCS2-LE

UCS2LE_CLS = (
    0,0,0,0,0,0,0,0,  # 00 - 07
    0,0,1,0,0,2,0,0,  # 08 - 0f
    0,0,0,0,0,0,0,0,  # 10 - 17
    0,0,0,3,0,0,0,0,  # 18 - 1f
    0,0,0,0,0,0,0,0,  # 20 - 27
    0,3,3,3,3,3,0,0,  # 28 - 2f
    0,0,0,0,0,0,0,0,  # 30 - 37
    0,0,0,0,0,0,0,0,  # 38 - 3f
    0,0,0,0,0,0,0,0,  # 40 - 47
    0,0,0,0,0,0,0,0,  # 48 - 4f
    0,0,0,0,0,0,0,0,  # 50 - 57
    0,0,0,0,0,0,0,0,  # 58 - 5f
    0,0,0,0,0,0,0,0,  # 60 - 67
    0,0,0,0,0,0,0,0,  # 68 - 6f
    0,0,0,0,0,0,0,0,  # 70 - 77
    0,0,0,0,0,0,0,0,  # 78 - 7f
    0,0,0,0,0,0,0,0,  # 80 - 87
    0,0,0,0,0,0,0,0,  # 88 - 8f
    0,0,0,0,0,0,0,0,  # 90 - 97
    0,0,0,0,0,0,0,0,  # 98 - 9f
    0,0,0,0,0,0,0,0,  # a0 - a7
    0,0,0,0,0,0,0,0,  # a8 - af
    0,0,0,0,0,0,0,0,  # b0 - b7
    0,0,0,0,0,0,0,0,  # b8 - bf
    0,0,0,0,0,0,0,0,  # c0 - c7
    0,0,0,0,0,0,0,0,  # c8 - cf
    0,0,0,0,0,0,0,0,  # d0 - d7
    0,0,0,0,0,0,0,0,  # d8 - df
    0,0,0,0,0,0,0,0,  # e0 - e7
    0,0,0,0,0,0,0,0,  # e8 - ef
    0,0,0,0,0,0,0,0,  # f0 - f7
    0,0,0,0,0,0,4,5   # f8 - ff
)

UCS2LE_ST = (
          6,     6,     7,     6,     4,     3,MachineState.ERROR,MachineState.ERROR,#00-07
     MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ITS_ME,MachineState.ITS_ME,MachineState.ITS_ME,MachineState.ITS_ME,#08-0f
     MachineState.ITS_ME,MachineState.ITS_ME,     5,     5,     5,MachineState.ERROR,MachineState.ITS_ME,MachineState.ERROR,#10-17
          5,     5,     5,MachineState.ERROR,     5,MachineState.ERROR,     6,     6,#18-1f
          7,     6,     8,     8,     5,     5,     5,MachineState.ERROR,#20-27
          5,     5,     5,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,     5,     5,#28-2f
          5,     5,     5,MachineState.ERROR,     5,MachineState.ERROR,MachineState.START,MachineState.START #30-37
)

UCS2LE_CHAR_LEN_TABLE = (2, 2, 2, 2, 2, 2)

UCS2LE_SM_MODEL = {'class_table': UCS2LE_CLS,
                 'class_factor': 6,
                 'state_table': UCS2LE_ST,
                 'char_len_table': UCS2LE_CHAR_LEN_TABLE,
                 'name': 'UTF-16LE'}

# UTF-8

UTF8_CLS = (
    1,1,1,1,1,1,1,1,  # 00 - 07  #allow 0x00 as a legal value
    1,1,1,1,1,1,0,0,  # 08 - 0f
    1,1,1,1,1,1,1,1,  # 10 - 17
    1,1,1,0,1,1,1,1,  # 18 - 1f
    1,1,1,1,1,1,1,1,  # 20 - 27
    1,1,1,1,1,1,1,1,  # 28 - 2f
    1,1,1,1,1,1,1,1,  # 30 - 37
    1,1,1,1,1,1,1,1,  # 38 - 3f
    1,1,1,1,1,1,1,1,  # 40 - 47
    1,1,1,1,1,1,1,1,  # 48 - 4f
    1,1,1,1,1,1,1,1,  # 50 - 57
    1,1,1,1,1,1,1,1,  # 58 - 5f
    1,1,1,1,1,1,1,1,  # 60 - 67
    1,1,1,1,1,1,1,1,  # 68 - 6f
    1,1,1,1,1,1,1,1,  # 70 - 77
    1,1,1,1,1,1,1,1,  # 78 - 7f
    2,2,2,2,3,3,3,3,  # 80 - 87
    4,4,4,4,4,4,4,4,  # 88 - 8f
    4,4,4,4,4,4,4,4,  # 90 - 97
    4,4,4,4,4,4,4,4,  # 98 - 9f
    5,5,5,5,5,5,5,5,  # a0 - a7
    5,5,5,5,5,5,5,5,  # a8 - af
    5,5,5,5,5,5,5,5,  # b0 - b7
    5,5,5,5,5,5,5,5,  # b8 - bf
    0,0,6,6,6,6,6,6,  # c0 - c7
    6,6,6,6,6,6,6,6,  # c8 - cf
    6,6,6,6,6,6,6,6,  # d0 - d7
    6,6,6,6,6,6,6,6,  # d8 - df
    7,8,8,8,8,8,8,8,  # e0 - e7
    8,8,8,8,8,9,8,8,  # e8 - ef
    10,11,11,11,11,11,11,11,  # f0 - f7
    12,13,13,13,14,15,0,0    # f8 - ff
)

UTF8_ST = (
    MachineState.ERROR,MachineState.START,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,     12,   10,#00-07
         9,     11,     8,     7,     6,     5,     4,    3,#08-0f
    MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,#10-17
    MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,#18-1f
    MachineState.ITS_ME,MachineState.ITS_ME,MachineState.ITS_ME,MachineState.ITS_ME,MachineState.ITS_ME,MachineState.ITS_ME,MachineState.ITS_ME,MachineState.ITS_ME,#20-27
    MachineState.ITS_ME,MachineState.ITS_ME,MachineState.ITS_ME,MachineState.ITS_ME,MachineState.ITS_ME,MachineState.ITS_ME,MachineState.ITS_ME,MachineState.ITS_ME,#28-2f
    MachineState.ERROR,MachineState.ERROR,     5,     5,     5,     5,MachineState.ERROR,MachineState.ERROR,#30-37
    MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,#38-3f
    MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,     5,     5,     5,MachineState.ERROR,MachineState.ERROR,#40-47
    MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,#48-4f
    MachineState.ERROR,MachineState.ERROR,     7,     7,     7,     7,MachineState.ERROR,MachineState.ERROR,#50-57
    MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,#58-5f
    MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,     7,     7,MachineState.ERROR,MachineState.ERROR,#60-67
    MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,#68-6f
    MachineState.ERROR,MachineState.ERROR,     9,     9,     9,     9,MachineState.ERROR,MachineState.ERROR,#70-77
    MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,#78-7f
    MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,     9,MachineState.ERROR,MachineState.ERROR,#80-87
    MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,#88-8f
    MachineState.ERROR,MachineState.ERROR,    12,    12,    12,    12,MachineState.ERROR,MachineState.ERROR,#90-97
    MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,#98-9f
    MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,    12,MachineState.ERROR,MachineState.ERROR,#a0-a7
    MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,#a8-af
    MachineState.ERROR,MachineState.ERROR,    12,    12,    12,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,#b0-b7
    MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,#b8-bf
    MachineState.ERROR,MachineState.ERROR,MachineState.START,MachineState.START,MachineState.START,MachineState.START,MachineState.ERROR,MachineState.ERROR,#c0-c7
    MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR,MachineState.ERROR #c8-cf
)

UTF8_CHAR_LEN_TABLE = (0, 1, 0, 0, 0, 0, 2, 3, 3, 3, 4, 4, 5, 5, 6, 6)

UTF8_SM_MODEL = {'class_table': UTF8_CLS,
                 'class_factor': 16,
                 'state_table': UTF8_ST,
                 'char_len_table': UTF8_CHAR_LEN_TABLE,
                 'name': 'UTF-8'}
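

# A minimal usage sketch: stepping a CodingStateMachine over raw bytes with the
# UTF-8 model above, roughly the way UTF8Prober does (the import path assumes
# the pip-vendored layout). A MachineState.ERROR in the result means the byte
# sequence cannot be valid UTF-8.
def _example_utf8_sm_states(byte_str):
    from pip._vendor.chardet.codingstatemachine import CodingStateMachine
    sm = CodingStateMachine(UTF8_SM_MODEL)
    return [sm.next_state(byte) for byte in bytearray(byte_str)]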
site-packages/pip/_vendor/chardet/euckrfreq.py
######################## BEGIN LICENSE BLOCK ########################
# The Original Code is Mozilla Communicator client code.
#
# The Initial Developer of the Original Code is
# Netscape Communications Corporation.
# Portions created by the Initial Developer are Copyright (C) 1998
# the Initial Developer. All Rights Reserved.
#
# Contributor(s):
#   Mark Pilgrim - port to Python
#
# This library is free software; you can redistribute it and/or
# modify it under the terms of the GNU Lesser General Public
# License as published by the Free Software Foundation; either
# version 2.1 of the License, or (at your option) any later version.
#
# This library is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
# Lesser General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public
# License along with this library; if not, write to the Free Software
# Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
# 02110-1301  USA
######################### END LICENSE BLOCK #########################

# Sampled from about 20M of text material, including literature and computer technology

# 128  --> 0.79
# 256  --> 0.92
# 512  --> 0.986
# 1024 --> 0.99944
# 2048 --> 0.99999
#
# Ideal Distribution Ratio = 0.98653 / (1-0.98653) = 73.24
# Random Distribution Ratio = 512 / (2350-512) = 0.279.
#
# Typical Distribution Ratio

EUCKR_TYPICAL_DISTRIBUTION_RATIO = 6.0

EUCKR_TABLE_SIZE = 2352
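
# A small sketch of the ratio described above: coverage of the most frequent
# characters divided by the coverage of everything else, e.g.
# 0.98653 / (1 - 0.98653) ~= 73.24 for the 512 most frequent EUC-KR characters.
def _example_distribution_ratio(top_coverage):
    return top_coverage / (1.0 - top_coverage)
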

# Char to FreqOrder table ,
EUCKR_CHAR_TO_FREQ_ORDER = (
  13, 130, 120,1396, 481,1719,1720, 328, 609, 212,1721, 707, 400, 299,1722,  87,
1397,1723, 104, 536,1117,1203,1724,1267, 685,1268, 508,1725,1726,1727,1728,1398,
1399,1729,1730,1731, 141, 621, 326,1057, 368,1732, 267, 488,  20,1733,1269,1734,
 945,1400,1735,  47, 904,1270,1736,1737, 773, 248,1738, 409, 313, 786, 429,1739,
 116, 987, 813,1401, 683,  75,1204, 145,1740,1741,1742,1743,  16, 847, 667, 622,
 708,1744,1745,1746, 966, 787, 304, 129,1747,  60, 820, 123, 676,1748,1749,1750,
1751, 617,1752, 626,1753,1754,1755,1756, 653,1757,1758,1759,1760,1761,1762, 856,
 344,1763,1764,1765,1766,  89, 401, 418, 806, 905, 848,1767,1768,1769, 946,1205,
 709,1770,1118,1771, 241,1772,1773,1774,1271,1775, 569,1776, 999,1777,1778,1779,
1780, 337, 751,1058,  28, 628, 254,1781, 177, 906, 270, 349, 891,1079,1782,  19,
1783, 379,1784, 315,1785, 629, 754,1402, 559,1786, 636, 203,1206,1787, 710, 567,
1788, 935, 814,1789,1790,1207, 766, 528,1791,1792,1208,1793,1794,1795,1796,1797,
1403,1798,1799, 533,1059,1404,1405,1156,1406, 936, 884,1080,1800, 351,1801,1802,
1803,1804,1805, 801,1806,1807,1808,1119,1809,1157, 714, 474,1407,1810, 298, 899,
 885,1811,1120, 802,1158,1812, 892,1813,1814,1408, 659,1815,1816,1121,1817,1818,
1819,1820,1821,1822, 319,1823, 594, 545,1824, 815, 937,1209,1825,1826, 573,1409,
1022,1827,1210,1828,1829,1830,1831,1832,1833, 556, 722, 807,1122,1060,1834, 697,
1835, 900, 557, 715,1836,1410, 540,1411, 752,1159, 294, 597,1211, 976, 803, 770,
1412,1837,1838,  39, 794,1413, 358,1839, 371, 925,1840, 453, 661, 788, 531, 723,
 544,1023,1081, 869,  91,1841, 392, 430, 790, 602,1414, 677,1082, 457,1415,1416,
1842,1843, 475, 327,1024,1417, 795, 121,1844, 733, 403,1418,1845,1846,1847, 300,
 119, 711,1212, 627,1848,1272, 207,1849,1850, 796,1213, 382,1851, 519,1852,1083,
 893,1853,1854,1855, 367, 809, 487, 671,1856, 663,1857,1858, 956, 471, 306, 857,
1859,1860,1160,1084,1861,1862,1863,1864,1865,1061,1866,1867,1868,1869,1870,1871,
 282,  96, 574,1872, 502,1085,1873,1214,1874, 907,1875,1876, 827, 977,1419,1420,
1421, 268,1877,1422,1878,1879,1880, 308,1881,   2, 537,1882,1883,1215,1884,1885,
 127, 791,1886,1273,1423,1887,  34, 336, 404, 643,1888, 571, 654, 894, 840,1889,
   0, 886,1274, 122, 575, 260, 908, 938,1890,1275, 410, 316,1891,1892, 100,1893,
1894,1123,  48,1161,1124,1025,1895, 633, 901,1276,1896,1897, 115, 816,1898, 317,
1899, 694,1900, 909, 734,1424, 572, 866,1425, 691,  85, 524,1010, 543, 394, 841,
1901,1902,1903,1026,1904,1905,1906,1907,1908,1909,  30, 451, 651, 988, 310,1910,
1911,1426, 810,1216,  93,1912,1913,1277,1217,1914, 858, 759,  45,  58, 181, 610,
 269,1915,1916, 131,1062, 551, 443,1000, 821,1427, 957, 895,1086,1917,1918, 375,
1919, 359,1920, 687,1921, 822,1922, 293,1923,1924,  40, 662, 118, 692,  29, 939,
 887, 640, 482, 174,1925,  69,1162, 728,1428, 910,1926,1278,1218,1279, 386, 870,
 217, 854,1163, 823,1927,1928,1929,1930, 834,1931,  78,1932, 859,1933,1063,1934,
1935,1936,1937, 438,1164, 208, 595,1938,1939,1940,1941,1219,1125,1942, 280, 888,
1429,1430,1220,1431,1943,1944,1945,1946,1947,1280, 150, 510,1432,1948,1949,1950,
1951,1952,1953,1954,1011,1087,1955,1433,1043,1956, 881,1957, 614, 958,1064,1065,
1221,1958, 638,1001, 860, 967, 896,1434, 989, 492, 553,1281,1165,1959,1282,1002,
1283,1222,1960,1961,1962,1963,  36, 383, 228, 753, 247, 454,1964, 876, 678,1965,
1966,1284, 126, 464, 490, 835, 136, 672, 529, 940,1088,1435, 473,1967,1968, 467,
  50, 390, 227, 587, 279, 378, 598, 792, 968, 240, 151, 160, 849, 882,1126,1285,
 639,1044, 133, 140, 288, 360, 811, 563,1027, 561, 142, 523,1969,1970,1971,   7,
 103, 296, 439, 407, 506, 634, 990,1972,1973,1974,1975, 645,1976,1977,1978,1979,
1980,1981, 236,1982,1436,1983,1984,1089, 192, 828, 618, 518,1166, 333,1127,1985,
 818,1223,1986,1987,1988,1989,1990,1991,1992,1993, 342,1128,1286, 746, 842,1994,
1995, 560, 223,1287,  98,   8, 189, 650, 978,1288,1996,1437,1997,  17, 345, 250,
 423, 277, 234, 512, 226,  97, 289,  42, 167,1998, 201,1999,2000, 843, 836, 824,
 532, 338, 783,1090, 182, 576, 436,1438,1439, 527, 500,2001, 947, 889,2002,2003,
2004,2005, 262, 600, 314, 447,2006, 547,2007, 693, 738,1129,2008,  71,1440, 745,
 619, 688,2009, 829,2010,2011, 147,2012,  33, 948,2013,2014,  74, 224,2015,  61,
 191, 918, 399, 637,2016,1028,1130, 257, 902,2017,2018,2019,2020,2021,2022,2023,
2024,2025,2026, 837,2027,2028,2029,2030, 179, 874, 591,  52, 724, 246,2031,2032,
2033,2034,1167, 969,2035,1289, 630, 605, 911,1091,1168,2036,2037,2038,1441, 912,
2039, 623,2040,2041, 253,1169,1290,2042,1442, 146, 620, 611, 577, 433,2043,1224,
 719,1170, 959, 440, 437, 534,  84, 388, 480,1131, 159, 220, 198, 679,2044,1012,
 819,1066,1443, 113,1225, 194, 318,1003,1029,2045,2046,2047,2048,1067,2049,2050,
2051,2052,2053,  59, 913, 112,2054, 632,2055, 455, 144, 739,1291,2056, 273, 681,
 499,2057, 448,2058,2059, 760,2060,2061, 970, 384, 169, 245,1132,2062,2063, 414,
1444,2064,2065,  41, 235,2066, 157, 252, 877, 568, 919, 789, 580,2067, 725,2068,
2069,1292,2070,2071,1445,2072,1446,2073,2074,  55, 588,  66,1447, 271,1092,2075,
1226,2076, 960,1013, 372,2077,2078,2079,2080,2081,1293,2082,2083,2084,2085, 850,
2086,2087,2088,2089,2090, 186,2091,1068, 180,2092,2093,2094, 109,1227, 522, 606,
2095, 867,1448,1093, 991,1171, 926, 353,1133,2096, 581,2097,2098,2099,1294,1449,
1450,2100, 596,1172,1014,1228,2101,1451,1295,1173,1229,2102,2103,1296,1134,1452,
 949,1135,2104,2105,1094,1453,1454,1455,2106,1095,2107,2108,2109,2110,2111,2112,
2113,2114,2115,2116,2117, 804,2118,2119,1230,1231, 805,1456, 405,1136,2120,2121,
2122,2123,2124, 720, 701,1297, 992,1457, 927,1004,2125,2126,2127,2128,2129,2130,
  22, 417,2131, 303,2132, 385,2133, 971, 520, 513,2134,1174,  73,1096, 231, 274,
 962,1458, 673,2135,1459,2136, 152,1137,2137,2138,2139,2140,1005,1138,1460,1139,
2141,2142,2143,2144,  11, 374, 844,2145, 154,1232,  46,1461,2146, 838, 830, 721,
1233, 106,2147,  90, 428, 462, 578, 566,1175, 352,2148,2149, 538,1234, 124,1298,
2150,1462, 761, 565,2151, 686,2152, 649,2153,  72, 173,2154, 460, 415,2155,1463,
2156,1235, 305,2157,2158,2159,2160,2161,2162, 579,2163,2164,2165,2166,2167, 747,
2168,2169,2170,2171,1464, 669,2172,2173,2174,2175,2176,1465,2177,  23, 530, 285,
2178, 335, 729,2179, 397,2180,2181,2182,1030,2183,2184, 698,2185,2186, 325,2187,
2188, 369,2189, 799,1097,1015, 348,2190,1069, 680,2191, 851,1466,2192,2193,  10,
2194, 613, 424,2195, 979, 108, 449, 589,  27, 172,  81,1031,  80, 774, 281, 350,
1032, 525, 301, 582,1176,2196, 674,1045,2197,2198,1467, 730, 762,2199,2200,2201,
2202,1468,2203, 993,2204,2205, 266,1070, 963,1140,2206,2207,2208, 664,1098, 972,
2209,2210,2211,1177,1469,1470, 871,2212,2213,2214,2215,2216,1471,2217,2218,2219,
2220,2221,2222,2223,2224,2225,2226,2227,1472,1236,2228,2229,2230,2231,2232,2233,
2234,2235,1299,2236,2237, 200,2238, 477, 373,2239,2240, 731, 825, 777,2241,2242,
2243, 521, 486, 548,2244,2245,2246,1473,1300,  53, 549, 137, 875,  76, 158,2247,
1301,1474, 469, 396,1016, 278, 712,2248, 321, 442, 503, 767, 744, 941,1237,1178,
1475,2249,  82, 178,1141,1179, 973,2250,1302,2251, 297,2252,2253, 570,2254,2255,
2256,  18, 450, 206,2257, 290, 292,1142,2258, 511, 162,  99, 346, 164, 735,2259,
1476,1477,   4, 554, 343, 798,1099,2260,1100,2261,  43, 171,1303, 139, 215,2262,
2263, 717, 775,2264,1033, 322, 216,2265, 831,2266, 149,2267,1304,2268,2269, 702,
1238, 135, 845, 347, 309,2270, 484,2271, 878, 655, 238,1006,1478,2272,  67,2273,
 295,2274,2275, 461,2276, 478, 942, 412,2277,1034,2278,2279,2280, 265,2281, 541,
2282,2283,2284,2285,2286,  70, 852,1071,2287,2288,2289,2290,  21,  56, 509, 117,
 432,2291,2292, 331, 980, 552,1101, 148, 284, 105, 393,1180,1239, 755,2293, 187,
2294,1046,1479,2295, 340,2296,  63,1047, 230,2297,2298,1305, 763,1306, 101, 800,
 808, 494,2299,2300,2301, 903,2302,  37,1072,  14,   5,2303,  79, 675,2304, 312,
2305,2306,2307,2308,2309,1480,   6,1307,2310,2311,2312,   1, 470,  35,  24, 229,
2313, 695, 210,  86, 778,  15, 784, 592, 779,  32,  77, 855, 964,2314, 259,2315,
 501, 380,2316,2317,  83, 981, 153, 689,1308,1481,1482,1483,2318,2319, 716,1484,
2320,2321,2322,2323,2324,2325,1485,2326,2327, 128,  57,  68, 261,1048, 211, 170,
1240,  31,2328,  51, 435, 742,2329,2330,2331, 635,2332, 264, 456,2333,2334,2335,
 425,2336,1486, 143, 507, 263, 943,2337, 363, 920,1487, 256,1488,1102, 243, 601,
1489,2338,2339,2340,2341,2342,2343,2344, 861,2345,2346,2347,2348,2349,2350, 395,
2351,1490,1491,  62, 535, 166, 225,2352,2353, 668, 419,1241, 138, 604, 928,2354,
1181,2355,1492,1493,2356,2357,2358,1143,2359, 696,2360, 387, 307,1309, 682, 476,
2361,2362, 332,  12, 222, 156,2363, 232,2364, 641, 276, 656, 517,1494,1495,1035,
 416, 736,1496,2365,1017, 586,2366,2367,2368,1497,2369, 242,2370,2371,2372,1498,
2373, 965, 713,2374,2375,2376,2377, 740, 982,1499, 944,1500,1007,2378,2379,1310,
1501,2380,2381,2382, 785, 329,2383,2384,1502,2385,2386,2387, 932,2388,1503,2389,
2390,2391,2392,1242,2393,2394,2395,2396,2397, 994, 950,2398,2399,2400,2401,1504,
1311,2402,2403,2404,2405,1049, 749,2406,2407, 853, 718,1144,1312,2408,1182,1505,
2409,2410, 255, 516, 479, 564, 550, 214,1506,1507,1313, 413, 239, 444, 339,1145,
1036,1508,1509,1314,1037,1510,1315,2411,1511,2412,2413,2414, 176, 703, 497, 624,
 593, 921, 302,2415, 341, 165,1103,1512,2416,1513,2417,2418,2419, 376,2420, 700,
2421,2422,2423, 258, 768,1316,2424,1183,2425, 995, 608,2426,2427,2428,2429, 221,
2430,2431,2432,2433,2434,2435,2436,2437, 195, 323, 726, 188, 897, 983,1317, 377,
 644,1050, 879,2438, 452,2439,2440,2441,2442,2443,2444, 914,2445,2446,2447,2448,
 915, 489,2449,1514,1184,2450,2451, 515,  64, 427, 495,2452, 583,2453, 483, 485,
1038, 562, 213,1515, 748, 666,2454,2455,2456,2457, 334,2458, 780, 996,1008, 705,
1243,2459,2460,2461,2462,2463, 114,2464, 493,1146, 366, 163,1516, 961,1104,2465,
 291,2466,1318,1105,2467,1517, 365,2468, 355, 951,1244,2469,1319,2470, 631,2471,
2472, 218,1320, 364, 320, 756,1518,1519,1321,1520,1322,2473,2474,2475,2476, 997,
2477,2478,2479,2480, 665,1185,2481, 916,1521,2482,2483,2484, 584, 684,2485,2486,
 797,2487,1051,1186,2488,2489,2490,1522,2491,2492, 370,2493,1039,1187,  65,2494,
 434, 205, 463,1188,2495, 125, 812, 391, 402, 826, 699, 286, 398, 155, 781, 771,
 585,2496, 590, 505,1073,2497, 599, 244, 219, 917,1018, 952, 646,1523,2498,1323,
2499,2500,  49, 984, 354, 741,2501, 625,2502,1324,2503,1019, 190, 357, 757, 491,
  95, 782, 868,2504,2505,2506,2507,2508,2509, 134,1524,1074, 422,1525, 898,2510,
 161,2511,2512,2513,2514, 769,2515,1526,2516,2517, 411,1325,2518, 472,1527,2519,
2520,2521,2522,2523,2524, 985,2525,2526,2527,2528,2529,2530, 764,2531,1245,2532,
2533,  25, 204, 311,2534, 496,2535,1052,2536,2537,2538,2539,2540,2541,2542, 199,
 704, 504, 468, 758, 657,1528, 196,  44, 839,1246, 272, 750,2543, 765, 862,2544,
2545,1326,2546, 132, 615, 933,2547, 732,2548,2549,2550,1189,1529,2551, 283,1247,
1053, 607, 929,2552,2553,2554, 930, 183, 872, 616,1040,1147,2555,1148,1020, 441,
 249,1075,2556,2557,2558, 466, 743,2559,2560,2561,  92, 514, 426, 420, 526,2562,
2563,2564,2565,2566,2567,2568, 185,2569,2570,2571,2572, 776,1530, 658,2573, 362,
2574, 361, 922,1076, 793,2575,2576,2577,2578,2579,2580,1531, 251,2581,2582,2583,
2584,1532,  54, 612, 237,1327,2585,2586, 275, 408, 647, 111,2587,1533,1106, 465,
   3, 458,   9,  38,2588, 107, 110, 890, 209,  26, 737, 498,2589,1534,2590, 431,
 202,  88,1535, 356, 287,1107, 660,1149,2591, 381,1536, 986,1150, 445,1248,1151,
 974,2592,2593, 846,2594, 446, 953, 184,1249,1250, 727,2595, 923, 193, 883,2596,
2597,2598, 102, 324, 539, 817,2599, 421,1041,2600, 832,2601,  94, 175, 197, 406,
2602, 459,2603,2604,2605,2606,2607, 330, 555,2608,2609,2610, 706,1108, 389,2611,
2612,2613,2614, 233,2615, 833, 558, 931, 954,1251,2616,2617,1537, 546,2618,2619,
1009,2620,2621,2622,1538, 690,1328,2623, 955,2624,1539,2625,2626, 772,2627,2628,
2629,2630,2631, 924, 648, 863, 603,2632,2633, 934,1540, 864, 865,2634, 642,1042,
 670,1190,2635,2636,2637,2638, 168,2639, 652, 873, 542,1054,1541,2640,2641,2642,  # 512, 256
)

site-packages/pip/_vendor/chardet/jpcntx.py
######################## BEGIN LICENSE BLOCK ########################
# The Original Code is Mozilla Communicator client code.
#
# The Initial Developer of the Original Code is
# Netscape Communications Corporation.
# Portions created by the Initial Developer are Copyright (C) 1998
# the Initial Developer. All Rights Reserved.
#
# Contributor(s):
#   Mark Pilgrim - port to Python
#
# This library is free software; you can redistribute it and/or
# modify it under the terms of the GNU Lesser General Public
# License as published by the Free Software Foundation; either
# version 2.1 of the License, or (at your option) any later version.
#
# This library is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
# Lesser General Public License for more details.
#
# You should have received a copy of the GNU Lesser General Public
# License along with this library; if not, write to the Free Software
# Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
# 02110-1301  USA
######################### END LICENSE BLOCK #########################


# This is the hiragana 2-char sequence table; the number in each cell represents its frequency category
jp2CharContext = (
(0,0,0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1),
(2,4,0,4,0,3,0,4,0,3,4,4,4,2,4,3,3,4,3,2,3,3,4,2,3,3,3,2,4,1,4,3,3,1,5,4,3,4,3,4,3,5,3,0,3,5,4,2,0,3,1,0,3,3,0,3,3,0,1,1,0,4,3,0,3,3,0,4,0,2,0,3,5,5,5,5,4,0,4,1,0,3,4),
(0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,2),
(0,4,0,5,0,5,0,4,0,4,5,4,4,3,5,3,5,1,5,3,4,3,4,4,3,4,3,3,4,3,5,4,4,3,5,5,3,5,5,5,3,5,5,3,4,5,5,3,1,3,2,0,3,4,0,4,2,0,4,2,1,5,3,2,3,5,0,4,0,2,0,5,4,4,5,4,5,0,4,0,0,4,4),
(0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0),
(0,3,0,4,0,3,0,3,0,4,5,4,3,3,3,3,4,3,5,4,4,3,5,4,4,3,4,3,4,4,4,4,5,3,4,4,3,4,5,5,4,5,5,1,4,5,4,3,0,3,3,1,3,3,0,4,4,0,3,3,1,5,3,3,3,5,0,4,0,3,0,4,4,3,4,3,3,0,4,1,1,3,4),
(0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0),
(0,4,0,3,0,3,0,4,0,3,4,4,3,2,2,1,2,1,3,1,3,3,3,3,3,4,3,1,3,3,5,3,3,0,4,3,0,5,4,3,3,5,4,4,3,4,4,5,0,1,2,0,1,2,0,2,2,0,1,0,0,5,2,2,1,4,0,3,0,1,0,4,4,3,5,4,3,0,2,1,0,4,3),
(0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0),
(0,3,0,5,0,4,0,2,1,4,4,2,4,1,4,2,4,2,4,3,3,3,4,3,3,3,3,1,4,2,3,3,3,1,4,4,1,1,1,4,3,3,2,0,2,4,3,2,0,3,3,0,3,1,1,0,0,0,3,3,0,4,2,2,3,4,0,4,0,3,0,4,4,5,3,4,4,0,3,0,0,1,4),
(1,4,0,4,0,4,0,4,0,3,5,4,4,3,4,3,5,4,3,3,4,3,5,4,4,4,4,3,4,2,4,3,3,1,5,4,3,2,4,5,4,5,5,4,4,5,4,4,0,3,2,2,3,3,0,4,3,1,3,2,1,4,3,3,4,5,0,3,0,2,0,4,5,5,4,5,4,0,4,0,0,5,4),
(0,5,0,5,0,4,0,3,0,4,4,3,4,3,3,3,4,0,4,4,4,3,4,3,4,3,3,1,4,2,4,3,4,0,5,4,1,4,5,4,4,5,3,2,4,3,4,3,2,4,1,3,3,3,2,3,2,0,4,3,3,4,3,3,3,4,0,4,0,3,0,4,5,4,4,4,3,0,4,1,0,1,3),
(0,3,1,4,0,3,0,2,0,3,4,4,3,1,4,2,3,3,4,3,4,3,4,3,4,4,3,2,3,1,5,4,4,1,4,4,3,5,4,4,3,5,5,4,3,4,4,3,1,2,3,1,2,2,0,3,2,0,3,1,0,5,3,3,3,4,3,3,3,3,4,4,4,4,5,4,2,0,3,3,2,4,3),
(0,2,0,3,0,1,0,1,0,0,3,2,0,0,2,0,1,0,2,1,3,3,3,1,2,3,1,0,1,0,4,2,1,1,3,3,0,4,3,3,1,4,3,3,0,3,3,2,0,0,0,0,1,0,0,2,0,0,0,0,0,4,1,0,2,3,2,2,2,1,3,3,3,4,4,3,2,0,3,1,0,3,3),
(0,4,0,4,0,3,0,3,0,4,4,4,3,3,3,3,3,3,4,3,4,2,4,3,4,3,3,2,4,3,4,5,4,1,4,5,3,5,4,5,3,5,4,0,3,5,5,3,1,3,3,2,2,3,0,3,4,1,3,3,2,4,3,3,3,4,0,4,0,3,0,4,5,4,4,5,3,0,4,1,0,3,4),
(0,2,0,3,0,3,0,0,0,2,2,2,1,0,1,0,0,0,3,0,3,0,3,0,1,3,1,0,3,1,3,3,3,1,3,3,3,0,1,3,1,3,4,0,0,3,1,1,0,3,2,0,0,0,0,1,3,0,1,0,0,3,3,2,0,3,0,0,0,0,0,3,4,3,4,3,3,0,3,0,0,2,3),
(2,3,0,3,0,2,0,1,0,3,3,4,3,1,3,1,1,1,3,1,4,3,4,3,3,3,0,0,3,1,5,4,3,1,4,3,2,5,5,4,4,4,4,3,3,4,4,4,0,2,1,1,3,2,0,1,2,0,0,1,0,4,1,3,3,3,0,3,0,1,0,4,4,4,5,5,3,0,2,0,0,4,4),
(0,2,0,1,0,3,1,3,0,2,3,3,3,0,3,1,0,0,3,0,3,2,3,1,3,2,1,1,0,0,4,2,1,0,2,3,1,4,3,2,0,4,4,3,1,3,1,3,0,1,0,0,1,0,0,0,1,0,0,0,0,4,1,1,1,2,0,3,0,0,0,3,4,2,4,3,2,0,1,0,0,3,3),
(0,1,0,4,0,5,0,4,0,2,4,4,2,3,3,2,3,3,5,3,3,3,4,3,4,2,3,0,4,3,3,3,4,1,4,3,2,1,5,5,3,4,5,1,3,5,4,2,0,3,3,0,1,3,0,4,2,0,1,3,1,4,3,3,3,3,0,3,0,1,0,3,4,4,4,5,5,0,3,0,1,4,5),
(0,2,0,3,0,3,0,0,0,2,3,1,3,0,4,0,1,1,3,0,3,4,3,2,3,1,0,3,3,2,3,1,3,0,2,3,0,2,1,4,1,2,2,0,0,3,3,0,0,2,0,0,0,1,0,0,0,0,2,2,0,3,2,1,3,3,0,2,0,2,0,0,3,3,1,2,4,0,3,0,2,2,3),
(2,4,0,5,0,4,0,4,0,2,4,4,4,3,4,3,3,3,1,2,4,3,4,3,4,4,5,0,3,3,3,3,2,0,4,3,1,4,3,4,1,4,4,3,3,4,4,3,1,2,3,0,4,2,0,4,1,0,3,3,0,4,3,3,3,4,0,4,0,2,0,3,5,3,4,5,2,0,3,0,0,4,5),
(0,3,0,4,0,1,0,1,0,1,3,2,2,1,3,0,3,0,2,0,2,0,3,0,2,0,0,0,1,0,1,1,0,0,3,1,0,0,0,4,0,3,1,0,2,1,3,0,0,0,0,0,0,3,0,0,0,0,0,0,0,4,2,2,3,1,0,3,0,0,0,1,4,4,4,3,0,0,4,0,0,1,4),
(1,4,1,5,0,3,0,3,0,4,5,4,4,3,5,3,3,4,4,3,4,1,3,3,3,3,2,1,4,1,5,4,3,1,4,4,3,5,4,4,3,5,4,3,3,4,4,4,0,3,3,1,2,3,0,3,1,0,3,3,0,5,4,4,4,4,4,4,3,3,5,4,4,3,3,5,4,0,3,2,0,4,4),
(0,2,0,3,0,1,0,0,0,1,3,3,3,2,4,1,3,0,3,1,3,0,2,2,1,1,0,0,2,0,4,3,1,0,4,3,0,4,4,4,1,4,3,1,1,3,3,1,0,2,0,0,1,3,0,0,0,0,2,0,0,4,3,2,4,3,5,4,3,3,3,4,3,3,4,3,3,0,2,1,0,3,3),
(0,2,0,4,0,3,0,2,0,2,5,5,3,4,4,4,4,1,4,3,3,0,4,3,4,3,1,3,3,2,4,3,0,3,4,3,0,3,4,4,2,4,4,0,4,5,3,3,2,2,1,1,1,2,0,1,5,0,3,3,2,4,3,3,3,4,0,3,0,2,0,4,4,3,5,5,0,0,3,0,2,3,3),
(0,3,0,4,0,3,0,1,0,3,4,3,3,1,3,3,3,0,3,1,3,0,4,3,3,1,1,0,3,0,3,3,0,0,4,4,0,1,5,4,3,3,5,0,3,3,4,3,0,2,0,1,1,1,0,1,3,0,1,2,1,3,3,2,3,3,0,3,0,1,0,1,3,3,4,4,1,0,1,2,2,1,3),
(0,1,0,4,0,4,0,3,0,1,3,3,3,2,3,1,1,0,3,0,3,3,4,3,2,4,2,0,1,0,4,3,2,0,4,3,0,5,3,3,2,4,4,4,3,3,3,4,0,1,3,0,0,1,0,0,1,0,0,0,0,4,2,3,3,3,0,3,0,0,0,4,4,4,5,3,2,0,3,3,0,3,5),
(0,2,0,3,0,0,0,3,0,1,3,0,2,0,0,0,1,0,3,1,1,3,3,0,0,3,0,0,3,0,2,3,1,0,3,1,0,3,3,2,0,4,2,2,0,2,0,0,0,4,0,0,0,0,0,0,0,0,0,0,0,2,1,2,0,1,0,1,0,0,0,1,3,1,2,0,0,0,1,0,0,1,4),
(0,3,0,3,0,5,0,1,0,2,4,3,1,3,3,2,1,1,5,2,1,0,5,1,2,0,0,0,3,3,2,2,3,2,4,3,0,0,3,3,1,3,3,0,2,5,3,4,0,3,3,0,1,2,0,2,2,0,3,2,0,2,2,3,3,3,0,2,0,1,0,3,4,4,2,5,4,0,3,0,0,3,5),
(0,3,0,3,0,3,0,1,0,3,3,3,3,0,3,0,2,0,2,1,1,0,2,0,1,0,0,0,2,1,0,0,1,0,3,2,0,0,3,3,1,2,3,1,0,3,3,0,0,1,0,0,0,0,0,2,0,0,0,0,0,2,3,1,2,3,0,3,0,1,0,3,2,1,0,4,3,0,1,1,0,3,3),
(0,4,0,5,0,3,0,3,0,4,5,5,4,3,5,3,4,3,5,3,3,2,5,3,4,4,4,3,4,3,4,5,5,3,4,4,3,4,4,5,4,4,4,3,4,5,5,4,2,3,4,2,3,4,0,3,3,1,4,3,2,4,3,3,5,5,0,3,0,3,0,5,5,5,5,4,4,0,4,0,1,4,4),
(0,4,0,4,0,3,0,3,0,3,5,4,4,2,3,2,5,1,3,2,5,1,4,2,3,2,3,3,4,3,3,3,3,2,5,4,1,3,3,5,3,4,4,0,4,4,3,1,1,3,1,0,2,3,0,2,3,0,3,0,0,4,3,1,3,4,0,3,0,2,0,4,4,4,3,4,5,0,4,0,0,3,4),
(0,3,0,3,0,3,1,2,0,3,4,4,3,3,3,0,2,2,4,3,3,1,3,3,3,1,1,0,3,1,4,3,2,3,4,4,2,4,4,4,3,4,4,3,2,4,4,3,1,3,3,1,3,3,0,4,1,0,2,2,1,4,3,2,3,3,5,4,3,3,5,4,4,3,3,0,4,0,3,2,2,4,4),
(0,2,0,1,0,0,0,0,0,1,2,1,3,0,0,0,0,0,2,0,1,2,1,0,0,1,0,0,0,0,3,0,0,1,0,1,1,3,1,0,0,0,1,1,0,1,1,0,0,0,0,0,2,0,0,0,0,0,0,0,0,1,1,2,2,0,3,4,0,0,0,1,1,0,0,1,0,0,0,0,0,1,1),
(0,1,0,0,0,1,0,0,0,0,4,0,4,1,4,0,3,0,4,0,3,0,4,0,3,0,3,0,4,1,5,1,4,0,0,3,0,5,0,5,2,0,1,0,0,0,2,1,4,0,1,3,0,0,3,0,0,3,1,1,4,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0),
(1,4,0,5,0,3,0,2,0,3,5,4,4,3,4,3,5,3,4,3,3,0,4,3,3,3,3,3,3,2,4,4,3,1,3,4,4,5,4,4,3,4,4,1,3,5,4,3,3,3,1,2,2,3,3,1,3,1,3,3,3,5,3,3,4,5,0,3,0,3,0,3,4,3,4,4,3,0,3,0,2,4,3),
(0,1,0,4,0,0,0,0,0,1,4,0,4,1,4,2,4,0,3,0,1,0,1,0,0,0,0,0,2,0,3,1,1,1,0,3,0,0,0,1,2,1,0,0,1,1,1,1,0,1,0,0,0,1,0,0,3,0,0,0,0,3,2,0,2,2,0,1,0,0,0,2,3,2,3,3,0,0,0,0,2,1,0),
(0,5,1,5,0,3,0,3,0,5,4,4,5,1,5,3,3,0,4,3,4,3,5,3,4,3,3,2,4,3,4,3,3,0,3,3,1,4,4,3,4,4,4,3,4,5,5,3,2,3,1,1,3,3,1,3,1,1,3,3,2,4,5,3,3,5,0,4,0,3,0,4,4,3,5,3,3,0,3,4,0,4,3),
(0,5,0,5,0,3,0,2,0,4,4,3,5,2,4,3,3,3,4,4,4,3,5,3,5,3,3,1,4,0,4,3,3,0,3,3,0,4,4,4,4,5,4,3,3,5,5,3,2,3,1,2,3,2,0,1,0,0,3,2,2,4,4,3,1,5,0,4,0,3,0,4,3,1,3,2,1,0,3,3,0,3,3),
(0,4,0,5,0,5,0,4,0,4,5,5,5,3,4,3,3,2,5,4,4,3,5,3,5,3,4,0,4,3,4,4,3,2,4,4,3,4,5,4,4,5,5,0,3,5,5,4,1,3,3,2,3,3,1,3,1,0,4,3,1,4,4,3,4,5,0,4,0,2,0,4,3,4,4,3,3,0,4,0,0,5,5),
(0,4,0,4,0,5,0,1,1,3,3,4,4,3,4,1,3,0,5,1,3,0,3,1,3,1,1,0,3,0,3,3,4,0,4,3,0,4,4,4,3,4,4,0,3,5,4,1,0,3,0,0,2,3,0,3,1,0,3,1,0,3,2,1,3,5,0,3,0,1,0,3,2,3,3,4,4,0,2,2,0,4,4),
(2,4,0,5,0,4,0,3,0,4,5,5,4,3,5,3,5,3,5,3,5,2,5,3,4,3,3,4,3,4,5,3,2,1,5,4,3,2,3,4,5,3,4,1,2,5,4,3,0,3,3,0,3,2,0,2,3,0,4,1,0,3,4,3,3,5,0,3,0,1,0,4,5,5,5,4,3,0,4,2,0,3,5),
(0,5,0,4,0,4,0,2,0,5,4,3,4,3,4,3,3,3,4,3,4,2,5,3,5,3,4,1,4,3,4,4,4,0,3,5,0,4,4,4,4,5,3,1,3,4,5,3,3,3,3,3,3,3,0,2,2,0,3,3,2,4,3,3,3,5,3,4,1,3,3,5,3,2,0,0,0,0,4,3,1,3,3),
(0,1,0,3,0,3,0,1,0,1,3,3,3,2,3,3,3,0,3,0,0,0,3,1,3,0,0,0,2,2,2,3,0,0,3,2,0,1,2,4,1,3,3,0,0,3,3,3,0,1,0,0,2,1,0,0,3,0,3,1,0,3,0,0,1,3,0,2,0,1,0,3,3,1,3,3,0,0,1,1,0,3,3),
(0,2,0,3,0,2,1,4,0,2,2,3,1,1,3,1,1,0,2,0,3,1,2,3,1,3,0,0,1,0,4,3,2,3,3,3,1,4,2,3,3,3,3,1,0,3,1,4,0,1,1,0,1,2,0,1,1,0,1,1,0,3,1,3,2,2,0,1,0,0,0,2,3,3,3,1,0,0,0,0,0,2,3),
(0,5,0,4,0,5,0,2,0,4,5,5,3,3,4,3,3,1,5,4,4,2,4,4,4,3,4,2,4,3,5,5,4,3,3,4,3,3,5,5,4,5,5,1,3,4,5,3,1,4,3,1,3,3,0,3,3,1,4,3,1,4,5,3,3,5,0,4,0,3,0,5,3,3,1,4,3,0,4,0,1,5,3),
(0,5,0,5,0,4,0,2,0,4,4,3,4,3,3,3,3,3,5,4,4,4,4,4,4,5,3,3,5,2,4,4,4,3,4,4,3,3,4,4,5,5,3,3,4,3,4,3,3,4,3,3,3,3,1,2,2,1,4,3,3,5,4,4,3,4,0,4,0,3,0,4,4,4,4,4,1,0,4,2,0,2,4),
(0,4,0,4,0,3,0,1,0,3,5,2,3,0,3,0,2,1,4,2,3,3,4,1,4,3,3,2,4,1,3,3,3,0,3,3,0,0,3,3,3,5,3,3,3,3,3,2,0,2,0,0,2,0,0,2,0,0,1,0,0,3,1,2,2,3,0,3,0,2,0,4,4,3,3,4,1,0,3,0,0,2,4),
(0,0,0,4,0,0,0,0,0,0,1,0,1,0,2,0,0,0,0,0,1,0,2,0,1,0,0,0,0,0,3,1,3,0,3,2,0,0,0,1,0,3,2,0,0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,3,4,0,2,0,0,0,0,0,0,2),
(0,2,1,3,0,2,0,2,0,3,3,3,3,1,3,1,3,3,3,3,3,3,4,2,2,1,2,1,4,0,4,3,1,3,3,3,2,4,3,5,4,3,3,3,3,3,3,3,0,1,3,0,2,0,0,1,0,0,1,0,0,4,2,0,2,3,0,3,3,0,3,3,4,2,3,1,4,0,1,2,0,2,3),
(0,3,0,3,0,1,0,3,0,2,3,3,3,0,3,1,2,0,3,3,2,3,3,2,3,2,3,1,3,0,4,3,2,0,3,3,1,4,3,3,2,3,4,3,1,3,3,1,1,0,1,1,0,1,0,1,0,1,0,0,0,4,1,1,0,3,0,3,1,0,2,3,3,3,3,3,1,0,0,2,0,3,3),
(0,0,0,0,0,0,0,0,0,0,3,0,2,0,3,0,0,0,0,0,0,0,3,0,0,0,0,0,0,0,3,0,3,0,3,1,0,1,0,1,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,3,0,2,0,2,3,0,0,0,0,0,0,0,0,3),
(0,2,0,3,1,3,0,3,0,2,3,3,3,1,3,1,3,1,3,1,3,3,3,1,3,0,2,3,1,1,4,3,3,2,3,3,1,2,2,4,1,3,3,0,1,4,2,3,0,1,3,0,3,0,0,1,3,0,2,0,0,3,3,2,1,3,0,3,0,2,0,3,4,4,4,3,1,0,3,0,0,3,3),
(0,2,0,1,0,2,0,0,0,1,3,2,2,1,3,0,1,1,3,0,3,2,3,1,2,0,2,0,1,1,3,3,3,0,3,3,1,1,2,3,2,3,3,1,2,3,2,0,0,1,0,0,0,0,0,0,3,0,1,0,0,2,1,2,1,3,0,3,0,0,0,3,4,4,4,3,2,0,2,0,0,2,4),
(0,0,0,1,0,1,0,0,0,0,1,0,0,0,1,0,0,0,0,0,0,0,1,1,1,0,0,0,0,0,0,0,0,0,2,2,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,3,1,0,0,0,0,0,0,0,3),
(0,3,0,3,0,2,0,3,0,3,3,3,2,3,2,2,2,0,3,1,3,3,3,2,3,3,0,0,3,0,3,2,2,0,2,3,1,4,3,4,3,3,2,3,1,5,4,4,0,3,1,2,1,3,0,3,1,1,2,0,2,3,1,3,1,3,0,3,0,1,0,3,3,4,4,2,1,0,2,1,0,2,4),
(0,1,0,3,0,1,0,2,0,1,4,2,5,1,4,0,2,0,2,1,3,1,4,0,2,1,0,0,2,1,4,1,1,0,3,3,0,5,1,3,2,3,3,1,0,3,2,3,0,1,0,0,0,0,0,0,1,0,0,0,0,4,0,1,0,3,0,2,0,1,0,3,3,3,4,3,3,0,0,0,0,2,3),
(0,0,0,1,0,0,0,0,0,0,2,0,1,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,3,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,2,1,0,0,1,0,0,0,0,0,3),
(0,1,0,3,0,4,0,3,0,2,4,3,1,0,3,2,2,1,3,1,2,2,3,1,1,1,2,1,3,0,1,2,0,1,3,2,1,3,0,5,5,1,0,0,1,3,2,1,0,3,0,0,1,0,0,0,0,0,3,4,0,1,1,1,3,2,0,2,0,1,0,2,3,3,1,2,3,0,1,0,1,0,4),
(0,0,0,1,0,3,0,3,0,2,2,1,0,0,4,0,3,0,3,1,3,0,3,0,3,0,1,0,3,0,3,1,3,0,3,3,0,0,1,2,1,1,1,0,1,2,0,0,0,1,0,0,1,0,0,0,0,0,0,0,0,2,2,1,2,0,0,2,0,0,0,0,2,3,3,3,3,0,0,0,0,1,4),
(0,0,0,3,0,3,0,0,0,0,3,1,1,0,3,0,1,0,2,0,1,0,0,0,0,0,0,0,1,0,3,0,2,0,2,3,0,0,2,2,3,1,2,0,0,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,3,0,0,2,0,0,0,0,2,3),
(2,4,0,5,0,5,0,4,0,3,4,3,3,3,4,3,3,3,4,3,4,4,5,4,5,5,5,2,3,0,5,5,4,1,5,4,3,1,5,4,3,4,4,3,3,4,3,3,0,3,2,0,2,3,0,3,0,0,3,3,0,5,3,2,3,3,0,3,0,3,0,3,4,5,4,5,3,0,4,3,0,3,4),
(0,3,0,3,0,3,0,3,0,3,3,4,3,2,3,2,3,0,4,3,3,3,3,3,3,3,3,0,3,2,4,3,3,1,3,4,3,4,4,4,3,4,4,3,2,4,4,1,0,2,0,0,1,1,0,2,0,0,3,1,0,5,3,2,1,3,0,3,0,1,2,4,3,2,4,3,3,0,3,2,0,4,4),
(0,3,0,3,0,1,0,0,0,1,4,3,3,2,3,1,3,1,4,2,3,2,4,2,3,4,3,0,2,2,3,3,3,0,3,3,3,0,3,4,1,3,3,0,3,4,3,3,0,1,1,0,1,0,0,0,4,0,3,0,0,3,1,2,1,3,0,4,0,1,0,4,3,3,4,3,3,0,2,0,0,3,3),
(0,3,0,4,0,1,0,3,0,3,4,3,3,0,3,3,3,1,3,1,3,3,4,3,3,3,0,0,3,1,5,3,3,1,3,3,2,5,4,3,3,4,5,3,2,5,3,4,0,1,0,0,0,0,0,2,0,0,1,1,0,4,2,2,1,3,0,3,0,2,0,4,4,3,5,3,2,0,1,1,0,3,4),
(0,5,0,4,0,5,0,2,0,4,4,3,3,2,3,3,3,1,4,3,4,1,5,3,4,3,4,0,4,2,4,3,4,1,5,4,0,4,4,4,4,5,4,1,3,5,4,2,1,4,1,1,3,2,0,3,1,0,3,2,1,4,3,3,3,4,0,4,0,3,0,4,4,4,3,3,3,0,4,2,0,3,4),
(1,4,0,4,0,3,0,1,0,3,3,3,1,1,3,3,2,2,3,3,1,0,3,2,2,1,2,0,3,1,2,1,2,0,3,2,0,2,2,3,3,4,3,0,3,3,1,2,0,1,1,3,1,2,0,0,3,0,1,1,0,3,2,2,3,3,0,3,0,0,0,2,3,3,4,3,3,0,1,0,0,1,4),
(0,4,0,4,0,4,0,0,0,3,4,4,3,1,4,2,3,2,3,3,3,1,4,3,4,0,3,0,4,2,3,3,2,2,5,4,2,1,3,4,3,4,3,1,3,3,4,2,0,2,1,0,3,3,0,0,2,0,3,1,0,4,4,3,4,3,0,4,0,1,0,2,4,4,4,4,4,0,3,2,0,3,3),
(0,0,0,1,0,4,0,0,0,0,0,0,1,1,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,3,2,0,0,1,0,0,0,1,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,2),
(0,2,0,3,0,4,0,4,0,1,3,3,3,0,4,0,2,1,2,1,1,1,2,0,3,1,1,0,1,0,3,1,0,0,3,3,2,0,1,1,0,0,0,0,0,1,0,2,0,2,2,0,3,1,0,0,1,0,1,1,0,1,2,0,3,0,0,0,0,1,0,0,3,3,4,3,1,0,1,0,3,0,2),
(0,0,0,3,0,5,0,0,0,0,1,0,2,0,3,1,0,1,3,0,0,0,2,0,0,0,1,0,0,0,1,1,0,0,4,0,0,0,2,3,0,1,4,1,0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,3,0,0,0,0,0,1,0,0,0,0,0,0,0,2,0,0,3,0,0,0,0,0,3),
(0,2,0,5,0,5,0,1,0,2,4,3,3,2,5,1,3,2,3,3,3,0,4,1,2,0,3,0,4,0,2,2,1,1,5,3,0,0,1,4,2,3,2,0,3,3,3,2,0,2,4,1,1,2,0,1,1,0,3,1,0,1,3,1,2,3,0,2,0,0,0,1,3,5,4,4,4,0,3,0,0,1,3),
(0,4,0,5,0,4,0,4,0,4,5,4,3,3,4,3,3,3,4,3,4,4,5,3,4,5,4,2,4,2,3,4,3,1,4,4,1,3,5,4,4,5,5,4,4,5,5,5,2,3,3,1,4,3,1,3,3,0,3,3,1,4,3,4,4,4,0,3,0,4,0,3,3,4,4,5,0,0,4,3,0,4,5),
(0,4,0,4,0,3,0,3,0,3,4,4,4,3,3,2,4,3,4,3,4,3,5,3,4,3,2,1,4,2,4,4,3,1,3,4,2,4,5,5,3,4,5,4,1,5,4,3,0,3,2,2,3,2,1,3,1,0,3,3,3,5,3,3,3,5,4,4,2,3,3,4,3,3,3,2,1,0,3,2,1,4,3),
(0,4,0,5,0,4,0,3,0,3,5,5,3,2,4,3,4,0,5,4,4,1,4,4,4,3,3,3,4,3,5,5,2,3,3,4,1,2,5,5,3,5,5,2,3,5,5,4,0,3,2,0,3,3,1,1,5,1,4,1,0,4,3,2,3,5,0,4,0,3,0,5,4,3,4,3,0,0,4,1,0,4,4),
(1,3,0,4,0,2,0,2,0,2,5,5,3,3,3,3,3,0,4,2,3,4,4,4,3,4,0,0,3,4,5,4,3,3,3,3,2,5,5,4,5,5,5,4,3,5,5,5,1,3,1,0,1,0,0,3,2,0,4,2,0,5,2,3,2,4,1,3,0,3,0,4,5,4,5,4,3,0,4,2,0,5,4),
(0,3,0,4,0,5,0,3,0,3,4,4,3,2,3,2,3,3,3,3,3,2,4,3,3,2,2,0,3,3,3,3,3,1,3,3,3,0,4,4,3,4,4,1,1,4,4,2,0,3,1,0,1,1,0,4,1,0,2,3,1,3,3,1,3,4,0,3,0,1,0,3,1,3,0,0,1,0,2,0,0,4,4),
(0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0),
(0,3,0,3,0,2,0,3,0,1,5,4,3,3,3,1,4,2,1,2,3,4,4,2,4,4,5,0,3,1,4,3,4,0,4,3,3,3,2,3,2,5,3,4,3,2,2,3,0,0,3,0,2,1,0,1,2,0,0,0,0,2,1,1,3,1,0,2,0,4,0,3,4,4,4,5,2,0,2,0,0,1,3),
(0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,1,1,0,0,1,1,0,0,0,4,2,1,1,0,1,0,3,2,0,0,3,1,1,1,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,3,0,1,0,0,0,2,0,0,0,1,4,0,4,2,1,0,0,0,0,0,1),
(0,0,0,0,0,0,0,0,0,1,0,1,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0,0,0,0,3,1,0,0,0,2,0,2,1,0,0,1,2,1,0,1,1,0,0,3,0,0,0,0,0,0,0,0,0,0,0,1,3,1,0,0,0,0,0,1,0,0,2,1,0,0,0,0,0,0,0,0,2),
(0,4,0,4,0,4,0,3,0,4,4,3,4,2,4,3,2,0,4,4,4,3,5,3,5,3,3,2,4,2,4,3,4,3,1,4,0,2,3,4,4,4,3,3,3,4,4,4,3,4,1,3,4,3,2,1,2,1,3,3,3,4,4,3,3,5,0,4,0,3,0,4,3,3,3,2,1,0,3,0,0,3,3),
(0,4,0,3,0,3,0,3,0,3,5,5,3,3,3,3,4,3,4,3,3,3,4,4,4,3,3,3,3,4,3,5,3,3,1,3,2,4,5,5,5,5,4,3,4,5,5,3,2,2,3,3,3,3,2,3,3,1,2,3,2,4,3,3,3,4,0,4,0,2,0,4,3,2,2,1,2,0,3,0,0,4,1),
)

class JapaneseContextAnalysis(object):
    NUM_OF_CATEGORY = 6
    DONT_KNOW = -1
    ENOUGH_REL_THRESHOLD = 100
    MAX_REL_THRESHOLD = 1000
    MINIMUM_DATA_THRESHOLD = 4

    def __init__(self):
        self._total_rel = None
        self._rel_sample = None
        self._need_to_skip_char_num = None
        self._last_char_order = None
        self._done = None
        self.reset()

    def reset(self):
        self._total_rel = 0  # total sequence received
        # category counters, each integer counts sequence in its category
        self._rel_sample = [0] * self.NUM_OF_CATEGORY
        # if last byte in current buffer is not the last byte of a character,
        # we need to know how many bytes to skip in next buffer
        self._need_to_skip_char_num = 0
        self._last_char_order = -1  # The order of previous char
        # If this flag is set to True, detection is done and conclusion has
        # been made
        self._done = False

    def feed(self, byte_str, num_bytes):
        if self._done:
            return

        # The buffer we receive is byte oriented, and a character may span
        # more than one buffer. If the last one or two bytes of the previous
        # buffer did not form a complete character, we record how many bytes
        # are needed to complete it and skip those bytes here. We could save
        # those bytes and analyse the character once it is complete, but a
        # single character makes little difference, so simply skipping it
        # simplifies our logic and improves performance.
        i = self._need_to_skip_char_num
        while i < num_bytes:
            order, char_len = self.get_order(byte_str[i:i + 2])
            i += char_len
            if i > num_bytes:
                self._need_to_skip_char_num = i - num_bytes
                self._last_char_order = -1
            else:
                if (order != -1) and (self._last_char_order != -1):
                    self._total_rel += 1
                    if self._total_rel > self.MAX_REL_THRESHOLD:
                        self._done = True
                        break
                    self._rel_sample[jp2CharContext[self._last_char_order][order]] += 1
                self._last_char_order = order

    def got_enough_data(self):
        return self._total_rel > self.ENOUGH_REL_THRESHOLD

    def get_confidence(self):
        # This is just one way to calculate confidence. It works well for me.
        if self._total_rel > self.MINIMUM_DATA_THRESHOLD:
            return (self._total_rel - self._rel_sample[0]) / self._total_rel
        else:
            return self.DONT_KNOW

    def get_order(self, byte_str):
        return -1, 1

class SJISContextAnalysis(JapaneseContextAnalysis):
    def __init__(self):
        super(SJISContextAnalysis, self).__init__()
        self._charset_name = "SHIFT_JIS"

    @property
    def charset_name(self):
        return self._charset_name

    def get_order(self, byte_str):
        if not byte_str:
            return -1, 1
        # find out current char's byte length
        first_char = byte_str[0]
        if (0x81 <= first_char <= 0x9F) or (0xE0 <= first_char <= 0xFC):
            char_len = 2
            if (first_char == 0x87) or (0xFA <= first_char <= 0xFC):
                self._charset_name = "CP932"
        else:
            char_len = 1

        # return its order if it is hiragana
        if len(byte_str) > 1:
            second_char = byte_str[1]
            if (first_char == 202) and (0x9F <= second_char <= 0xF1):
                return second_char - 0x9F, char_len

        return -1, char_len

class EUCJPContextAnalysis(JapaneseContextAnalysis):
    def get_order(self, byte_str):
        if not byte_str:
            return -1, 1
        # find out current char's byte length
        first_char = byte_str[0]
        if (first_char == 0x8E) or (0xA1 <= first_char <= 0xFE):
            char_len = 2
        elif first_char == 0x8F:
            char_len = 3
        else:
            char_len = 1

        # return its order if it is hiragana
        if len(byte_str) > 1:
            second_char = byte_str[1]
            if (first_char == 0xA4) and (0xA1 <= second_char <= 0xF3):
                return second_char - 0xA1, char_len

        return -1, char_len
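
# A minimal usage sketch, not part of the original module: it shows how the
# analysers above are typically driven. The sample bytes (Shift_JIS-encoded
# hiragana) and the helper name are assumptions made for illustration only.
def _demo_context_analysis(byte_str=b"\x82\xa0\x82\xa2\x82\xa4"):
    analyser = SJISContextAnalysis()
    analyser.feed(byte_str, len(byte_str))
    # The sample is far too small, so got_enough_data() is False and
    # get_confidence() falls back to DONT_KNOW (-1).
    return (analyser.charset_name,
            analyser.got_enough_data(),
            analyser.get_confidence())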


site-packages/pip/_vendor/appdirs.py
#!/usr/bin/env python
# -*- coding: utf-8 -*-
# Copyright (c) 2005-2010 ActiveState Software Inc.
# Copyright (c) 2013 Eddy Petrișor

"""Utilities for determining application-specific dirs.

See <http://github.com/ActiveState/appdirs> for details and usage.
"""
# Dev Notes:
# - MSDN on where to store app data files:
#   http://support.microsoft.com/default.aspx?scid=kb;en-us;310294#XSLTH3194121123120121120120
# - macOS: http://developer.apple.com/documentation/MacOSX/Conceptual/BPFileSystem/index.html
# - XDG spec for Un*x: http://standards.freedesktop.org/basedir-spec/basedir-spec-latest.html

__version_info__ = (1, 4, 0)
__version__ = '.'.join(map(str, __version_info__))


import sys
import os

PY3 = sys.version_info[0] == 3

if PY3:
    unicode = str

if sys.platform.startswith('java'):
    import platform
    os_name = platform.java_ver()[3][0]
    if os_name.startswith('Windows'): # "Windows XP", "Windows 7", etc.
        system = 'win32'
    elif os_name.startswith('Mac'): # "macOS", etc.
        system = 'darwin'
    else: # "Linux", "SunOS", "FreeBSD", etc.
        # Setting this to "linux2" is not ideal, but only Windows or Mac
        # are actually checked for and the rest of the module expects
        # *sys.platform* style strings.
        system = 'linux2'
else:
    system = sys.platform



def user_data_dir(appname=None, appauthor=None, version=None, roaming=False):
    r"""Return full path to the user-specific data dir for this application.

        "appname" is the name of application.
            If None, just the system directory is returned.
        "appauthor" (only used on Windows) is the name of the
            appauthor or distributing body for this application. Typically
            it is the owning company name. This falls back to appname. You may
            pass False to disable it.
        "version" is an optional version path element to append to the
            path. You might want to use this if you want multiple versions
            of your app to be able to run independently. If used, this
            would typically be "<major>.<minor>".
            Only applied when appname is present.
        "roaming" (boolean, default False) can be set True to use the Windows
            roaming appdata directory. That means that for users on a Windows
            network setup for roaming profiles, this user data will be
            sync'd on login. See
            <http://technet.microsoft.com/en-us/library/cc766489(WS.10).aspx>
            for a discussion of issues.

    Typical user data directories are:
        macOS:                  ~/Library/Application Support/<AppName>
        Unix:                   ~/.local/share/<AppName>    # or in $XDG_DATA_HOME, if defined
        Win XP (not roaming):   C:\Documents and Settings\<username>\Application Data\<AppAuthor>\<AppName>
        Win XP (roaming):       C:\Documents and Settings\<username>\Local Settings\Application Data\<AppAuthor>\<AppName>
        Win 7  (not roaming):   C:\Users\<username>\AppData\Local\<AppAuthor>\<AppName>
        Win 7  (roaming):       C:\Users\<username>\AppData\Roaming\<AppAuthor>\<AppName>

    For Unix, we follow the XDG spec and support $XDG_DATA_HOME.
    That means, by default "~/.local/share/<AppName>".
    """
    if system == "win32":
        if appauthor is None:
            appauthor = appname
        const = roaming and "CSIDL_APPDATA" or "CSIDL_LOCAL_APPDATA"
        path = os.path.normpath(_get_win_folder(const))
        if appname:
            if appauthor is not False:
                path = os.path.join(path, appauthor, appname)
            else:
                path = os.path.join(path, appname)
    elif system == 'darwin':
        path = os.path.expanduser('~/Library/Application Support/')
        if appname:
            path = os.path.join(path, appname)
    else:
        path = os.getenv('XDG_DATA_HOME', os.path.expanduser("~/.local/share"))
        if appname:
            path = os.path.join(path, appname)
    if appname and version:
        path = os.path.join(path, version)
    return path
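
# Illustrative sketch, not part of the vendored module: a typical call to the
# function above. "ExampleApp" and "ExampleCorp" are made-up names.
def _demo_user_data_dir():
    # On Linux this usually resolves to ~/.local/share/ExampleApp/1.0 (or
    # under $XDG_DATA_HOME if set); on Windows, roaming=True selects the
    # CSIDL_APPDATA folder instead of CSIDL_LOCAL_APPDATA.
    return user_data_dir("ExampleApp", "ExampleCorp", version="1.0",
                         roaming=True)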


def site_data_dir(appname=None, appauthor=None, version=None, multipath=False):
    """Return full path to the user-shared data dir for this application.

        "appname" is the name of application.
            If None, just the system directory is returned.
        "appauthor" (only used on Windows) is the name of the
            appauthor or distributing body for this application. Typically
            it is the owning company name. This falls back to appname. You may
            pass False to disable it.
        "version" is an optional version path element to append to the
            path. You might want to use this if you want multiple versions
            of your app to be able to run independently. If used, this
            would typically be "<major>.<minor>".
            Only applied when appname is present.
        "multipath" is an optional parameter only applicable to *nix
            which indicates that the entire list of data dirs should be
            returned. By default, the first item from XDG_DATA_DIRS is
            returned, or '/usr/local/share/<AppName>',
            if XDG_DATA_DIRS is not set

    Typical site data directories are:
        macOS:      /Library/Application Support/<AppName>
        Unix:       /usr/local/share/<AppName> or /usr/share/<AppName>
        Win XP:     C:\Documents and Settings\All Users\Application Data\<AppAuthor>\<AppName>
        Vista:      (Fail! "C:\ProgramData" is a hidden *system* directory on Vista.)
        Win 7:      C:\ProgramData\<AppAuthor>\<AppName>   # Hidden, but writeable on Win 7.

    For Unix, this is using the $XDG_DATA_DIRS[0] default.

    WARNING: Do not use this on Windows. See the Vista-Fail note above for why.
    """
    if system == "win32":
        if appauthor is None:
            appauthor = appname
        path = os.path.normpath(_get_win_folder("CSIDL_COMMON_APPDATA"))
        if appname:
            if appauthor is not False:
                path = os.path.join(path, appauthor, appname)
            else:
                path = os.path.join(path, appname)
    elif system == 'darwin':
        path = os.path.expanduser('/Library/Application Support')
        if appname:
            path = os.path.join(path, appname)
    else:
        # XDG default for $XDG_DATA_DIRS
        # only first, if multipath is False
        path = os.getenv('XDG_DATA_DIRS',
                         os.pathsep.join(['/usr/local/share', '/usr/share']))
        pathlist = [os.path.expanduser(x.rstrip(os.sep)) for x in path.split(os.pathsep)]
        if appname:
            if version:
                appname = os.path.join(appname, version)
            pathlist = [os.sep.join([x, appname]) for x in pathlist]

        if multipath:
            path = os.pathsep.join(pathlist)
        else:
            path = pathlist[0]
        return path

    if appname and version:
        path = os.path.join(path, version)
    return path
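
# Illustrative sketch, not part of the vendored module: shows the effect of
# the "multipath" flag on *nix. "ExampleApp" is a made-up name.
def _demo_site_data_dir():
    # multipath=False returns only the first XDG_DATA_DIRS entry, e.g.
    # '/usr/local/share/ExampleApp'; multipath=True joins every entry with
    # os.pathsep, e.g. '/usr/local/share/ExampleApp:/usr/share/ExampleApp'.
    single = site_data_dir("ExampleApp")
    every = site_data_dir("ExampleApp", multipath=True)
    return single, every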


def user_config_dir(appname=None, appauthor=None, version=None, roaming=False):
    r"""Return full path to the user-specific config dir for this application.

        "appname" is the name of application.
            If None, just the system directory is returned.
        "appauthor" (only used on Windows) is the name of the
            appauthor or distributing body for this application. Typically
            it is the owning company name. This falls back to appname. You may
            pass False to disable it.
        "version" is an optional version path element to append to the
            path. You might want to use this if you want multiple versions
            of your app to be able to run independently. If used, this
            would typically be "<major>.<minor>".
            Only applied when appname is present.
        "roaming" (boolean, default False) can be set True to use the Windows
            roaming appdata directory. That means that for users on a Windows
            network setup for roaming profiles, this user data will be
            sync'd on login. See
            <http://technet.microsoft.com/en-us/library/cc766489(WS.10).aspx>
            for a discussion of issues.

    Typical user config directories are:
        macOS:                  same as user_data_dir
        Unix:                   ~/.config/<AppName>     # or in $XDG_CONFIG_HOME, if defined
        Win *:                  same as user_data_dir

    For Unix, we follow the XDG spec and support $XDG_CONFIG_HOME.
    That means, by default "~/.config/<AppName>".
    """
    if system in ["win32", "darwin"]:
        path = user_data_dir(appname, appauthor, None, roaming)
    else:
        path = os.getenv('XDG_CONFIG_HOME', os.path.expanduser("~/.config"))
        if appname:
            path = os.path.join(path, appname)
    if appname and version:
        path = os.path.join(path, version)
    return path


def site_config_dir(appname=None, appauthor=None, version=None, multipath=False):
    """Return full path to the user-shared data dir for this application.

        "appname" is the name of application.
            If None, just the system directory is returned.
        "appauthor" (only used on Windows) is the name of the
            appauthor or distributing body for this application. Typically
            it is the owning company name. This falls back to appname. You may
            pass False to disable it.
        "version" is an optional version path element to append to the
            path. You might want to use this if you want multiple versions
            of your app to be able to run independently. If used, this
            would typically be "<major>.<minor>".
            Only applied when appname is present.
        "multipath" is an optional parameter only applicable to *nix
            which indicates that the entire list of config dirs should be
            returned. By default, the first item from XDG_CONFIG_DIRS is
            returned, or '/etc/xdg/<AppName>', if XDG_CONFIG_DIRS is not set

    Typical site config directories are:
        macOS:      same as site_data_dir
        Unix:       /etc/xdg/<AppName> or $XDG_CONFIG_DIRS[i]/<AppName> for each value in
                    $XDG_CONFIG_DIRS
        Win *:      same as site_data_dir
        Vista:      (Fail! "C:\ProgramData" is a hidden *system* directory on Vista.)

    For Unix, this is using the $XDG_CONFIG_DIRS[0] default, if multipath=False

    WARNING: Do not use this on Windows. See the Vista-Fail note above for why.
    """
    if system in ["win32", "darwin"]:
        path = site_data_dir(appname, appauthor)
        if appname and version:
            path = os.path.join(path, version)
    else:
        # XDG default for $XDG_CONFIG_DIRS
        # only first, if multipath is False
        path = os.getenv('XDG_CONFIG_DIRS', '/etc/xdg')
        pathlist = [os.path.expanduser(x.rstrip(os.sep)) for x in path.split(os.pathsep)]
        if appname:
            if version:
                appname = os.path.join(appname, version)
            pathlist = [os.sep.join([x, appname]) for x in pathlist]

        if multipath:
            path = os.pathsep.join(pathlist)
        else:
            path = pathlist[0]
    return path


def user_cache_dir(appname=None, appauthor=None, version=None, opinion=True):
    r"""Return full path to the user-specific cache dir for this application.

        "appname" is the name of application.
            If None, just the system directory is returned.
        "appauthor" (only used on Windows) is the name of the
            appauthor or distributing body for this application. Typically
            it is the owning company name. This falls back to appname. You may
            pass False to disable it.
        "version" is an optional version path element to append to the
            path. You might want to use this if you want multiple versions
            of your app to be able to run independently. If used, this
            would typically be "<major>.<minor>".
            Only applied when appname is present.
        "opinion" (boolean) can be False to disable the appending of
            "Cache" to the base app data dir for Windows. See
            discussion below.

    Typical user cache directories are:
        macOS:      ~/Library/Caches/<AppName>
        Unix:       ~/.cache/<AppName> (XDG default)
        Win XP:     C:\Documents and Settings\<username>\Local Settings\Application Data\<AppAuthor>\<AppName>\Cache
        Vista:      C:\Users\<username>\AppData\Local\<AppAuthor>\<AppName>\Cache

    On Windows the only suggestion in the MSDN docs is that local settings go in
    the `CSIDL_LOCAL_APPDATA` directory. This is identical to the non-roaming
    app data dir (the default returned by `user_data_dir` above). Apps typically
    put cache data somewhere *under* the given dir here. Some examples:
        ...\Mozilla\Firefox\Profiles\<ProfileName>\Cache
        ...\Acme\SuperApp\Cache\1.0
    OPINION: This function appends "Cache" to the `CSIDL_LOCAL_APPDATA` value.
    This can be disabled with the `opinion=False` option.
    """
    if system == "win32":
        if appauthor is None:
            appauthor = appname
        path = os.path.normpath(_get_win_folder("CSIDL_LOCAL_APPDATA"))
        if appname:
            if appauthor is not False:
                path = os.path.join(path, appauthor, appname)
            else:
                path = os.path.join(path, appname)
            if opinion:
                path = os.path.join(path, "Cache")
    elif system == 'darwin':
        path = os.path.expanduser('~/Library/Caches')
        if appname:
            path = os.path.join(path, appname)
    else:
        path = os.getenv('XDG_CACHE_HOME', os.path.expanduser('~/.cache'))
        if appname:
            path = os.path.join(path, appname)
    if appname and version:
        path = os.path.join(path, version)
    return path


def user_log_dir(appname=None, appauthor=None, version=None, opinion=True):
    r"""Return full path to the user-specific log dir for this application.

        "appname" is the name of application.
            If None, just the system directory is returned.
        "appauthor" (only used on Windows) is the name of the
            appauthor or distributing body for this application. Typically
            it is the owning company name. This falls back to appname. You may
            pass False to disable it.
        "version" is an optional version path element to append to the
            path. You might want to use this if you want multiple versions
            of your app to be able to run independently. If used, this
            would typically be "<major>.<minor>".
            Only applied when appname is present.
        "opinion" (boolean) can be False to disable the appending of
            "Logs" to the base app data dir for Windows, and "log" to the
            base cache dir for Unix. See discussion below.

    Typical user log directories are:
        macOS:      ~/Library/Logs/<AppName>
        Unix:       ~/.cache/<AppName>/log  # or under $XDG_CACHE_HOME if defined
        Win XP:     C:\Documents and Settings\<username>\Local Settings\Application Data\<AppAuthor>\<AppName>\Logs
        Vista:      C:\Users\<username>\AppData\Local\<AppAuthor>\<AppName>\Logs

    On Windows the only suggestion in the MSDN docs is that local settings
    go in the `CSIDL_LOCAL_APPDATA` directory. (Note: I'm interested in
    examples of what some windows apps use for a logs dir.)

    OPINION: This function appends "Logs" to the `CSIDL_LOCAL_APPDATA`
    value for Windows and appends "log" to the user cache dir for Unix.
    This can be disabled with the `opinion=False` option.
    """
    if system == "darwin":
        path = os.path.join(
            os.path.expanduser('~/Library/Logs'),
            appname)
    elif system == "win32":
        path = user_data_dir(appname, appauthor, version)
        version = False
        if opinion:
            path = os.path.join(path, "Logs")
    else:
        path = user_cache_dir(appname, appauthor, version)
        version = False
        if opinion:
            path = os.path.join(path, "log")
    if appname and version:
        path = os.path.join(path, version)
    return path
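
# Illustrative sketch, not part of the vendored module: demonstrates how the
# "opinion" flag changes the result. "ExampleApp" is a made-up name.
def _demo_user_log_dir():
    # On Unix, opinion=True (the default) appends "log" to the user cache dir
    # and on Windows it appends "Logs" to the app data dir; opinion=False
    # returns the bare directory. On macOS the flag has no effect.
    opinionated = user_log_dir("ExampleApp")
    bare = user_log_dir("ExampleApp", opinion=False)
    return opinionated, bare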


class AppDirs(object):
    """Convenience wrapper for getting application dirs."""
    def __init__(self, appname, appauthor=None, version=None, roaming=False,
                 multipath=False):
        self.appname = appname
        self.appauthor = appauthor
        self.version = version
        self.roaming = roaming
        self.multipath = multipath

    @property
    def user_data_dir(self):
        return user_data_dir(self.appname, self.appauthor,
                             version=self.version, roaming=self.roaming)

    @property
    def site_data_dir(self):
        return site_data_dir(self.appname, self.appauthor,
                             version=self.version, multipath=self.multipath)

    @property
    def user_config_dir(self):
        return user_config_dir(self.appname, self.appauthor,
                               version=self.version, roaming=self.roaming)

    @property
    def site_config_dir(self):
        return site_config_dir(self.appname, self.appauthor,
                             version=self.version, multipath=self.multipath)

    @property
    def user_cache_dir(self):
        return user_cache_dir(self.appname, self.appauthor,
                              version=self.version)

    @property
    def user_log_dir(self):
        return user_log_dir(self.appname, self.appauthor,
                            version=self.version)


#---- internal support stuff

def _get_win_folder_from_registry(csidl_name):
    """This is a fallback technique at best. I'm not sure if using the
    registry for this guarantees us the correct answer for all CSIDL_*
    names.
    """
    import _winreg

    shell_folder_name = {
        "CSIDL_APPDATA": "AppData",
        "CSIDL_COMMON_APPDATA": "Common AppData",
        "CSIDL_LOCAL_APPDATA": "Local AppData",
    }[csidl_name]

    key = _winreg.OpenKey(
        _winreg.HKEY_CURRENT_USER,
        r"Software\Microsoft\Windows\CurrentVersion\Explorer\Shell Folders"
    )
    dir, type = _winreg.QueryValueEx(key, shell_folder_name)
    return dir


def _get_win_folder_with_pywin32(csidl_name):
    from win32com.shell import shellcon, shell
    dir = shell.SHGetFolderPath(0, getattr(shellcon, csidl_name), 0, 0)
    # Try to make this a unicode path because SHGetFolderPath does
    # not return unicode strings when there is unicode data in the
    # path.
    try:
        dir = unicode(dir)

        # Downgrade to short path name if have highbit chars. See
        # <http://bugs.activestate.com/show_bug.cgi?id=85099>.
        has_high_char = False
        for c in dir:
            if ord(c) > 255:
                has_high_char = True
                break
        if has_high_char:
            try:
                import win32api
                dir = win32api.GetShortPathName(dir)
            except ImportError:
                pass
    except UnicodeError:
        pass
    return dir


def _get_win_folder_with_ctypes(csidl_name):
    import ctypes

    csidl_const = {
        "CSIDL_APPDATA": 26,
        "CSIDL_COMMON_APPDATA": 35,
        "CSIDL_LOCAL_APPDATA": 28,
    }[csidl_name]

    buf = ctypes.create_unicode_buffer(1024)
    ctypes.windll.shell32.SHGetFolderPathW(None, csidl_const, None, 0, buf)

    # Downgrade to short path name if have highbit chars. See
    # <http://bugs.activestate.com/show_bug.cgi?id=85099>.
    has_high_char = False
    for c in buf:
        if ord(c) > 255:
            has_high_char = True
            break
    if has_high_char:
        buf2 = ctypes.create_unicode_buffer(1024)
        if ctypes.windll.kernel32.GetShortPathNameW(buf.value, buf2, 1024):
            buf = buf2

    return buf.value

def _get_win_folder_with_jna(csidl_name):
    import array
    from com.sun import jna
    from com.sun.jna.platform import win32

    buf_size = win32.WinDef.MAX_PATH * 2
    buf = array.zeros('c', buf_size)
    shell = win32.Shell32.INSTANCE
    shell.SHGetFolderPath(None, getattr(win32.ShlObj, csidl_name), None, win32.ShlObj.SHGFP_TYPE_CURRENT, buf)
    dir = jna.Native.toString(buf.tostring()).rstrip("\0")

    # Downgrade to short path name if have highbit chars. See
    # <http://bugs.activestate.com/show_bug.cgi?id=85099>.
    has_high_char = False
    for c in dir:
        if ord(c) > 255:
            has_high_char = True
            break
    if has_high_char:
        buf = array.zeros('c', buf_size)
        kernel = win32.Kernel32.INSTANCE
        if kernel.GetShortPathName(dir, buf, buf_size):
            dir = jna.Native.toString(buf.tostring()).rstrip("\0")

    return dir

if system == "win32":
    try:
        import win32com.shell
        _get_win_folder = _get_win_folder_with_pywin32
    except ImportError:
        try:
            from ctypes import windll
            _get_win_folder = _get_win_folder_with_ctypes
        except ImportError:
            try:
                import com.sun.jna
                _get_win_folder = _get_win_folder_with_jna
            except ImportError:
                _get_win_folder = _get_win_folder_from_registry


#---- self test code

if __name__ == "__main__":
    appname = "MyApp"
    appauthor = "MyCompany"

    props = ("user_data_dir", "site_data_dir",
             "user_config_dir", "site_config_dir",
             "user_cache_dir", "user_log_dir")

    print("-- app dirs (with optional 'version')")
    dirs = AppDirs(appname, appauthor, version="1.0")
    for prop in props:
        print("%s: %s" % (prop, getattr(dirs, prop)))

    print("\n-- app dirs (without optional 'version')")
    dirs = AppDirs(appname, appauthor)
    for prop in props:
        print("%s: %s" % (prop, getattr(dirs, prop)))

    print("\n-- app dirs (without optional 'appauthor')")
    dirs = AppDirs(appname)
    for prop in props:
        print("%s: %s" % (prop, getattr(dirs, prop)))

    print("\n-- app dirs (with disabled 'appauthor')")
    dirs = AppDirs(appname, appauthor=False)
    for prop in props:
        print("%s: %s" % (prop, getattr(dirs, prop)))
site-packages/pip/_vendor/packaging/__init__.py
# This file is dual licensed under the terms of the Apache License, Version
# 2.0, and the BSD License. See the LICENSE file in the root of this repository
# for complete details.
from __future__ import absolute_import, division, print_function

from .__about__ import (
    __author__, __copyright__, __email__, __license__, __summary__, __title__,
    __uri__, __version__
)

__all__ = [
    "__title__", "__summary__", "__uri__", "__version__", "__author__",
    "__email__", "__license__", "__copyright__",
]
site-packages/pip/_vendor/packaging/version.py
# This file is dual licensed under the terms of the Apache License, Version
# 2.0, and the BSD License. See the LICENSE file in the root of this repository
# for complete details.
from __future__ import absolute_import, division, print_function

import collections
import itertools
import re

from ._structures import Infinity


__all__ = [
    "parse", "Version", "LegacyVersion", "InvalidVersion", "VERSION_PATTERN"
]


_Version = collections.namedtuple(
    "_Version",
    ["epoch", "release", "dev", "pre", "post", "local"],
)


def parse(version):
    """
    Parse the given version string and return either a :class:`Version` object
    or a :class:`LegacyVersion` object depending on if the given version is
    a valid PEP 440 version or a legacy version.
    """
    try:
        return Version(version)
    except InvalidVersion:
        return LegacyVersion(version)
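
# A minimal sketch, not part of the vendored module: how parse() dispatches
# between the two version classes defined below. The strings are examples.
def _demo_parse():
    pep440 = parse("1.0.dev1")        # valid PEP 440 -> Version
    legacy = parse("not-a-version")   # invalid       -> LegacyVersion
    return isinstance(pep440, Version), isinstance(legacy, LegacyVersion)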


class InvalidVersion(ValueError):
    """
    An invalid version was found, users should refer to PEP 440.
    """


class _BaseVersion(object):

    def __hash__(self):
        return hash(self._key)

    def __lt__(self, other):
        return self._compare(other, lambda s, o: s < o)

    def __le__(self, other):
        return self._compare(other, lambda s, o: s <= o)

    def __eq__(self, other):
        return self._compare(other, lambda s, o: s == o)

    def __ge__(self, other):
        return self._compare(other, lambda s, o: s >= o)

    def __gt__(self, other):
        return self._compare(other, lambda s, o: s > o)

    def __ne__(self, other):
        return self._compare(other, lambda s, o: s != o)

    def _compare(self, other, method):
        if not isinstance(other, _BaseVersion):
            return NotImplemented

        return method(self._key, other._key)


class LegacyVersion(_BaseVersion):

    def __init__(self, version):
        self._version = str(version)
        self._key = _legacy_cmpkey(self._version)

    def __str__(self):
        return self._version

    def __repr__(self):
        return "<LegacyVersion({0})>".format(repr(str(self)))

    @property
    def public(self):
        return self._version

    @property
    def base_version(self):
        return self._version

    @property
    def local(self):
        return None

    @property
    def is_prerelease(self):
        return False

    @property
    def is_postrelease(self):
        return False


_legacy_version_component_re = re.compile(
    r"(\d+ | [a-z]+ | \.| -)", re.VERBOSE,
)

_legacy_version_replacement_map = {
    "pre": "c", "preview": "c", "-": "final-", "rc": "c", "dev": "@",
}


def _parse_version_parts(s):
    for part in _legacy_version_component_re.split(s):
        part = _legacy_version_replacement_map.get(part, part)

        if not part or part == ".":
            continue

        if part[:1] in "0123456789":
            # pad for numeric comparison
            yield part.zfill(8)
        else:
            yield "*" + part

    # ensure that alpha/beta/candidate are before final
    yield "*final"


def _legacy_cmpkey(version):
    # We hardcode an epoch of -1 here. A PEP 440 version can only have an
    # epoch greater than or equal to 0. This effectively puts the
    # LegacyVersion, which uses the de facto standard originally implemented
    # by setuptools, before all PEP 440 versions.
    epoch = -1

    # This scheme is taken from pkg_resources.parse_version in setuptools,
    # prior to its adoption of the packaging library.
    parts = []
    for part in _parse_version_parts(version.lower()):
        if part.startswith("*"):
            # remove "-" before a prerelease tag
            if part < "*final":
                while parts and parts[-1] == "*final-":
                    parts.pop()

            # remove trailing zeros from each series of numeric parts
            while parts and parts[-1] == "00000000":
                parts.pop()

        parts.append(part)
    parts = tuple(parts)

    return epoch, parts

# Deliberately not anchored to the start and end of the string, to make it
# easier for 3rd party code to reuse
VERSION_PATTERN = r"""
    v?
    (?:
        (?:(?P<epoch>[0-9]+)!)?                           # epoch
        (?P<release>[0-9]+(?:\.[0-9]+)*)                  # release segment
        (?P<pre>                                          # pre-release
            [-_\.]?
            (?P<pre_l>(a|b|c|rc|alpha|beta|pre|preview))
            [-_\.]?
            (?P<pre_n>[0-9]+)?
        )?
        (?P<post>                                         # post release
            (?:-(?P<post_n1>[0-9]+))
            |
            (?:
                [-_\.]?
                (?P<post_l>post|rev|r)
                [-_\.]?
                (?P<post_n2>[0-9]+)?
            )
        )?
        (?P<dev>                                          # dev release
            [-_\.]?
            (?P<dev_l>dev)
            [-_\.]?
            (?P<dev_n>[0-9]+)?
        )?
    )
    (?:\+(?P<local>[a-z0-9]+(?:[-_\.][a-z0-9]+)*))?       # local version
"""


class Version(_BaseVersion):

    _regex = re.compile(
        r"^\s*" + VERSION_PATTERN + r"\s*$",
        re.VERBOSE | re.IGNORECASE,
    )

    def __init__(self, version):
        # Validate the version and parse it into pieces
        match = self._regex.search(version)
        if not match:
            raise InvalidVersion("Invalid version: '{0}'".format(version))

        # Store the parsed out pieces of the version
        self._version = _Version(
            epoch=int(match.group("epoch")) if match.group("epoch") else 0,
            release=tuple(int(i) for i in match.group("release").split(".")),
            pre=_parse_letter_version(
                match.group("pre_l"),
                match.group("pre_n"),
            ),
            post=_parse_letter_version(
                match.group("post_l"),
                match.group("post_n1") or match.group("post_n2"),
            ),
            dev=_parse_letter_version(
                match.group("dev_l"),
                match.group("dev_n"),
            ),
            local=_parse_local_version(match.group("local")),
        )

        # Generate a key which will be used for sorting
        self._key = _cmpkey(
            self._version.epoch,
            self._version.release,
            self._version.pre,
            self._version.post,
            self._version.dev,
            self._version.local,
        )

    def __repr__(self):
        return "<Version({0})>".format(repr(str(self)))

    def __str__(self):
        parts = []

        # Epoch
        if self._version.epoch != 0:
            parts.append("{0}!".format(self._version.epoch))

        # Release segment
        parts.append(".".join(str(x) for x in self._version.release))

        # Pre-release
        if self._version.pre is not None:
            parts.append("".join(str(x) for x in self._version.pre))

        # Post-release
        if self._version.post is not None:
            parts.append(".post{0}".format(self._version.post[1]))

        # Development release
        if self._version.dev is not None:
            parts.append(".dev{0}".format(self._version.dev[1]))

        # Local version segment
        if self._version.local is not None:
            parts.append(
                "+{0}".format(".".join(str(x) for x in self._version.local))
            )

        return "".join(parts)

    @property
    def public(self):
        return str(self).split("+", 1)[0]

    @property
    def base_version(self):
        parts = []

        # Epoch
        if self._version.epoch != 0:
            parts.append("{0}!".format(self._version.epoch))

        # Release segment
        parts.append(".".join(str(x) for x in self._version.release))

        return "".join(parts)

    @property
    def local(self):
        version_string = str(self)
        if "+" in version_string:
            return version_string.split("+", 1)[1]

    @property
    def is_prerelease(self):
        return bool(self._version.dev or self._version.pre)

    @property
    def is_postrelease(self):
        return bool(self._version.post)


def _parse_letter_version(letter, number):
    if letter:
        # We consider there to be an implicit 0 in a pre-release if there is
        # not a numeral associated with it.
        if number is None:
            number = 0

        # We normalize any letters to their lower case form
        letter = letter.lower()

        # We consider some words to be alternate spellings of other words and
        # in those cases we want to normalize the spellings to our preferred
        # spelling.
        if letter == "alpha":
            letter = "a"
        elif letter == "beta":
            letter = "b"
        elif letter in ["c", "pre", "preview"]:
            letter = "rc"
        elif letter in ["rev", "r"]:
            letter = "post"

        return letter, int(number)
    if not letter and number:
        # We assume that if we are given a number but not a letter, then this
        # is using the implicit post release syntax (e.g. 1.0-1)
        letter = "post"

        return letter, int(number)


_local_version_seperators = re.compile(r"[\._-]")


def _parse_local_version(local):
    """
    Takes a string like abc.1.twelve and turns it into ("abc", 1, "twelve").
    """
    if local is not None:
        return tuple(
            part.lower() if not part.isdigit() else int(part)
            for part in _local_version_seperators.split(local)
        )


def _cmpkey(epoch, release, pre, post, dev, local):
    # When we compare a release version, we want to compare it with all of the
    # trailing zeros removed. So we'll reverse the list, drop all of the now
    # leading zeros until we come to something non-zero, then re-reverse the
    # rest back into the correct order, make it a tuple, and use that for our
    # sorting key.
    release = tuple(
        reversed(list(
            itertools.dropwhile(
                lambda x: x == 0,
                reversed(release),
            )
        ))
    )

    # We need to "trick" the sorting algorithm to put 1.0.dev0 before 1.0a0.
    # We'll do this by abusing the pre segment, but we _only_ want to do this
    # if there is not a pre or a post segment. If we have one of those then
    # the normal sorting rules will handle this case correctly.
    if pre is None and post is None and dev is not None:
        pre = -Infinity
    # Versions without a pre-release (except as noted above) should sort after
    # those with one.
    elif pre is None:
        pre = Infinity

    # Versions without a post segment should sort before those with one.
    if post is None:
        post = -Infinity

    # Versions without a development segment should sort after those with one.
    if dev is None:
        dev = Infinity

    if local is None:
        # Versions without a local segment should sort before those with one.
        local = -Infinity
    else:
        # Versions with a local segment need that segment parsed to implement
        # the sorting rules in PEP440.
        # - Alpha numeric segments sort before numeric segments
        # - Alpha numeric segments sort lexicographically
        # - Numeric segments sort numerically
        # - Shorter versions sort before longer versions when the prefixes
        #   match exactly
        local = tuple(
            (i, "") if isinstance(i, int) else (-Infinity, i)
            for i in local
        )

    return epoch, release, pre, post, dev, local
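
# Ordering sketch (illustrative, not part of the original vendored module):
# the keys built above give the PEP 440 ordering and make trailing zeros in
# the release segment insignificant, e.g.
#
#     >>> Version("1.0.dev0") < Version("1.0a0") < Version("1.0") < Version("1.0.post0")
#     True
#     >>> Version("1.0") == Version("1.0.0")
#     True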
site-packages/pip/_vendor/packaging/requirements.py000064400000010347147511334570016550 0ustar00# This file is dual licensed under the terms of the Apache License, Version
# 2.0, and the BSD License. See the LICENSE file in the root of this repository
# for complete details.
from __future__ import absolute_import, division, print_function

import string
import re

from pip._vendor.pyparsing import (
    stringStart, stringEnd, originalTextFor, ParseException
)
from pip._vendor.pyparsing import ZeroOrMore, Word, Optional, Regex, Combine
from pip._vendor.pyparsing import Literal as L  # noqa
from pip._vendor.six.moves.urllib import parse as urlparse

from .markers import MARKER_EXPR, Marker
from .specifiers import LegacySpecifier, Specifier, SpecifierSet


class InvalidRequirement(ValueError):
    """
    An invalid requirement was found; users should refer to PEP 508.
    """


ALPHANUM = Word(string.ascii_letters + string.digits)

LBRACKET = L("[").suppress()
RBRACKET = L("]").suppress()
LPAREN = L("(").suppress()
RPAREN = L(")").suppress()
COMMA = L(",").suppress()
SEMICOLON = L(";").suppress()
AT = L("@").suppress()

PUNCTUATION = Word("-_.")
IDENTIFIER_END = ALPHANUM | (ZeroOrMore(PUNCTUATION) + ALPHANUM)
IDENTIFIER = Combine(ALPHANUM + ZeroOrMore(IDENTIFIER_END))

NAME = IDENTIFIER("name")
EXTRA = IDENTIFIER

URI = Regex(r'[^ ]+')("url")
URL = (AT + URI)

EXTRAS_LIST = EXTRA + ZeroOrMore(COMMA + EXTRA)
EXTRAS = (LBRACKET + Optional(EXTRAS_LIST) + RBRACKET)("extras")

VERSION_PEP440 = Regex(Specifier._regex_str, re.VERBOSE | re.IGNORECASE)
VERSION_LEGACY = Regex(LegacySpecifier._regex_str, re.VERBOSE | re.IGNORECASE)

VERSION_ONE = VERSION_PEP440 ^ VERSION_LEGACY
VERSION_MANY = Combine(VERSION_ONE + ZeroOrMore(COMMA + VERSION_ONE),
                       joinString=",", adjacent=False)("_raw_spec")
_VERSION_SPEC = Optional(((LPAREN + VERSION_MANY + RPAREN) | VERSION_MANY))
_VERSION_SPEC.setParseAction(lambda s, l, t: t._raw_spec or '')

VERSION_SPEC = originalTextFor(_VERSION_SPEC)("specifier")
VERSION_SPEC.setParseAction(lambda s, l, t: t[1])

MARKER_EXPR = originalTextFor(MARKER_EXPR())("marker")
MARKER_EXPR.setParseAction(
    lambda s, l, t: Marker(s[t._original_start:t._original_end])
)
MARKER_SEPERATOR = SEMICOLON
MARKER = MARKER_SEPERATOR + MARKER_EXPR

VERSION_AND_MARKER = VERSION_SPEC + Optional(MARKER)
URL_AND_MARKER = URL + Optional(MARKER)

NAMED_REQUIREMENT = \
    NAME + Optional(EXTRAS) + (URL_AND_MARKER | VERSION_AND_MARKER)

REQUIREMENT = stringStart + NAMED_REQUIREMENT + stringEnd


class Requirement(object):
    """Parse a requirement.

    Parse a given requirement string into its parts, such as name, specifier,
    URL, and extras. Raises InvalidRequirement on a badly-formed requirement
    string.
    """

    # TODO: Can we test whether something is contained within a requirement?
    #       If so how do we do that? Do we need to test against the _name_ of
    #       the thing as well as the version? What about the markers?
    # TODO: Can we normalize the name and extra name?

    def __init__(self, requirement_string):
        try:
            req = REQUIREMENT.parseString(requirement_string)
        except ParseException as e:
            raise InvalidRequirement(
                "Invalid requirement, parse error at \"{0!r}\"".format(
                    requirement_string[e.loc:e.loc + 8]))

        self.name = req.name
        if req.url:
            parsed_url = urlparse.urlparse(req.url)
            if not (parsed_url.scheme and parsed_url.netloc) or (
                    not parsed_url.scheme and not parsed_url.netloc):
                raise InvalidRequirement("Invalid URL given")
            self.url = req.url
        else:
            self.url = None
        self.extras = set(req.extras.asList() if req.extras else [])
        self.specifier = SpecifierSet(req.specifier)
        self.marker = req.marker if req.marker else None

    def __str__(self):
        parts = [self.name]

        if self.extras:
            parts.append("[{0}]".format(",".join(sorted(self.extras))))

        if self.specifier:
            parts.append(str(self.specifier))

        if self.url:
            parts.append("@ {0}".format(self.url))

        if self.marker:
            parts.append("; {0}".format(self.marker))

        return "".join(parts)

    def __repr__(self):
        return "<Requirement({0!r})>".format(str(self))
site-packages/pip/_vendor/packaging/markers.py000064400000020046147511334570015466 0ustar00# This file is dual licensed under the terms of the Apache License, Version
# 2.0, and the BSD License. See the LICENSE file in the root of this repository
# for complete details.
from __future__ import absolute_import, division, print_function

import operator
import os
import platform
import sys

from pip._vendor.pyparsing import (
    ParseException, ParseResults, stringStart, stringEnd,
)
from pip._vendor.pyparsing import ZeroOrMore, Group, Forward, QuotedString
from pip._vendor.pyparsing import Literal as L  # noqa

from ._compat import string_types
from .specifiers import Specifier, InvalidSpecifier


__all__ = [
    "InvalidMarker", "UndefinedComparison", "UndefinedEnvironmentName",
    "Marker", "default_environment",
]


class InvalidMarker(ValueError):
    """
    An invalid marker was found; users should refer to PEP 508.
    """


class UndefinedComparison(ValueError):
    """
    An invalid operation was attempted on a value that doesn't support it.
    """


class UndefinedEnvironmentName(ValueError):
    """
    An attempt was made to use a name that does not exist inside of the
    environment.
    """


class Node(object):

    def __init__(self, value):
        self.value = value

    def __str__(self):
        return str(self.value)

    def __repr__(self):
        return "<{0}({1!r})>".format(self.__class__.__name__, str(self))

    def serialize(self):
        raise NotImplementedError


class Variable(Node):

    def serialize(self):
        return str(self)


class Value(Node):

    def serialize(self):
        return '"{0}"'.format(self)


class Op(Node):

    def serialize(self):
        return str(self)


VARIABLE = (
    L("implementation_version") |
    L("platform_python_implementation") |
    L("implementation_name") |
    L("python_full_version") |
    L("platform_release") |
    L("platform_version") |
    L("platform_machine") |
    L("platform_system") |
    L("python_version") |
    L("sys_platform") |
    L("os_name") |
    L("os.name") |  # PEP-345
    L("sys.platform") |  # PEP-345
    L("platform.version") |  # PEP-345
    L("platform.machine") |  # PEP-345
    L("platform.python_implementation") |  # PEP-345
    L("python_implementation") |  # undocumented setuptools legacy
    L("extra")
)
ALIASES = {
    'os.name': 'os_name',
    'sys.platform': 'sys_platform',
    'platform.version': 'platform_version',
    'platform.machine': 'platform_machine',
    'platform.python_implementation': 'platform_python_implementation',
    'python_implementation': 'platform_python_implementation'
}
VARIABLE.setParseAction(lambda s, l, t: Variable(ALIASES.get(t[0], t[0])))

VERSION_CMP = (
    L("===") |
    L("==") |
    L(">=") |
    L("<=") |
    L("!=") |
    L("~=") |
    L(">") |
    L("<")
)

MARKER_OP = VERSION_CMP | L("not in") | L("in")
MARKER_OP.setParseAction(lambda s, l, t: Op(t[0]))

MARKER_VALUE = QuotedString("'") | QuotedString('"')
MARKER_VALUE.setParseAction(lambda s, l, t: Value(t[0]))

BOOLOP = L("and") | L("or")

MARKER_VAR = VARIABLE | MARKER_VALUE

MARKER_ITEM = Group(MARKER_VAR + MARKER_OP + MARKER_VAR)
MARKER_ITEM.setParseAction(lambda s, l, t: tuple(t[0]))

LPAREN = L("(").suppress()
RPAREN = L(")").suppress()

MARKER_EXPR = Forward()
MARKER_ATOM = MARKER_ITEM | Group(LPAREN + MARKER_EXPR + RPAREN)
MARKER_EXPR << MARKER_ATOM + ZeroOrMore(BOOLOP + MARKER_EXPR)

MARKER = stringStart + MARKER_EXPR + stringEnd


def _coerce_parse_result(results):
    if isinstance(results, ParseResults):
        return [_coerce_parse_result(i) for i in results]
    else:
        return results


def _format_marker(marker, first=True):
    assert isinstance(marker, (list, tuple, string_types))

    # Sometimes we have a structure like [[...]] which is a single item list
    # where the single item is itself a list. In that case we want to skip the
    # rest of this function so that we don't get extraneous () on the outside.
    if (isinstance(marker, list) and len(marker) == 1 and
            isinstance(marker[0], (list, tuple))):
        return _format_marker(marker[0])

    if isinstance(marker, list):
        inner = (_format_marker(m, first=False) for m in marker)
        if first:
            return " ".join(inner)
        else:
            return "(" + " ".join(inner) + ")"
    elif isinstance(marker, tuple):
        return " ".join([m.serialize() for m in marker])
    else:
        return marker
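
# Serialization sketch (illustrative, not part of the original vendored
# module): a parsed item such as (Variable("os_name"), Op("=="),
# Value("posix")) is rendered as
#
#     os_name == "posix"
#
# and nested lists are re-wrapped in parentheses everywhere except at the
# top level.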


_operators = {
    "in": lambda lhs, rhs: lhs in rhs,
    "not in": lambda lhs, rhs: lhs not in rhs,
    "<": operator.lt,
    "<=": operator.le,
    "==": operator.eq,
    "!=": operator.ne,
    ">=": operator.ge,
    ">": operator.gt,
}


def _eval_op(lhs, op, rhs):
    try:
        spec = Specifier("".join([op.serialize(), rhs]))
    except InvalidSpecifier:
        pass
    else:
        return spec.contains(lhs)

    oper = _operators.get(op.serialize())
    if oper is None:
        raise UndefinedComparison(
            "Undefined {0!r} on {1!r} and {2!r}.".format(op, lhs, rhs)
        )

    return oper(lhs, rhs)


_undefined = object()


def _get_env(environment, name):
    value = environment.get(name, _undefined)

    if value is _undefined:
        raise UndefinedEnvironmentName(
            "{0!r} does not exist in evaluation environment.".format(name)
        )

    return value


def _evaluate_markers(markers, environment):
    groups = [[]]

    for marker in markers:
        assert isinstance(marker, (list, tuple, string_types))

        if isinstance(marker, list):
            groups[-1].append(_evaluate_markers(marker, environment))
        elif isinstance(marker, tuple):
            lhs, op, rhs = marker

            if isinstance(lhs, Variable):
                lhs_value = _get_env(environment, lhs.value)
                rhs_value = rhs.value
            else:
                lhs_value = lhs.value
                rhs_value = _get_env(environment, rhs.value)

            groups[-1].append(_eval_op(lhs_value, op, rhs_value))
        else:
            assert marker in ["and", "or"]
            if marker == "or":
                groups.append([])

    return any(all(item) for item in groups)
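
# Evaluation sketch (illustrative, not part of the original vendored module):
# "and" keeps appending to the current group while "or" starts a new one, so
# a marker equivalent to
#
#     A and B or C
#
# produces groups [[A, B], [C]] and evaluates as any(all(group)), i.e.
# (A and B) or C.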


def format_full_version(info):
    version = '{0.major}.{0.minor}.{0.micro}'.format(info)
    kind = info.releaselevel
    if kind != 'final':
        version += kind[0] + str(info.serial)
    return version


def default_environment():
    if hasattr(sys, 'implementation'):
        iver = format_full_version(sys.implementation.version)
        implementation_name = sys.implementation.name
    else:
        iver = '0'
        implementation_name = ''

    return {
        "implementation_name": implementation_name,
        "implementation_version": iver,
        "os_name": os.name,
        "platform_machine": platform.machine(),
        "platform_release": platform.release(),
        "platform_system": platform.system(),
        "platform_version": platform.version(),
        "python_full_version": platform.python_version(),
        "platform_python_implementation": platform.python_implementation(),
        "python_version": platform.python_version()[:3],
        "sys_platform": sys.platform,
    }


class Marker(object):

    def __init__(self, marker):
        try:
            self._markers = _coerce_parse_result(MARKER.parseString(marker))
        except ParseException as e:
            err_str = "Invalid marker: {0!r}, parse error at {1!r}".format(
                marker, marker[e.loc:e.loc + 8])
            raise InvalidMarker(err_str)

    def __str__(self):
        return _format_marker(self._markers)

    def __repr__(self):
        return "<Marker({0!r})>".format(str(self))

    def evaluate(self, environment=None):
        """Evaluate a marker.

        Return the boolean from evaluating the given marker against the
        environment. environment is an optional argument to override all or
        part of the determined environment.

        The environment is determined from the current Python process.
        """
        current_environment = default_environment()
        if environment is not None:
            current_environment.update(environment)

        return _evaluate_markers(self._markers, current_environment)
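
# Illustrative usage sketch (not part of the original vendored module; the
# environment overrides below are arbitrary examples):
#
#     >>> m = Marker('python_version >= "2.7" and os_name == "posix"')
#     >>> m.evaluate({"python_version": "3.6", "os_name": "posix"})
#     True
#     >>> m.evaluate({"python_version": "2.6", "os_name": "posix"})
#     False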
site-packages/pip/_vendor/packaging/specifiers.py000064400000066571147511334570016173 0ustar00# This file is dual licensed under the terms of the Apache License, Version
# 2.0, and the BSD License. See the LICENSE file in the root of this repository
# for complete details.
from __future__ import absolute_import, division, print_function

import abc
import functools
import itertools
import re

from ._compat import string_types, with_metaclass
from .version import Version, LegacyVersion, parse


class InvalidSpecifier(ValueError):
    """
    An invalid specifier was found; users should refer to PEP 440.
    """


class BaseSpecifier(with_metaclass(abc.ABCMeta, object)):

    @abc.abstractmethod
    def __str__(self):
        """
        Returns the str representation of this Specifier like object. This
        should be representative of the Specifier itself.
        """

    @abc.abstractmethod
    def __hash__(self):
        """
        Returns a hash value for this Specifier like object.
        """

    @abc.abstractmethod
    def __eq__(self, other):
        """
        Returns a boolean representing whether or not the two Specifier like
        objects are equal.
        """

    @abc.abstractmethod
    def __ne__(self, other):
        """
        Returns a boolean representing whether or not the two Specifier like
        objects are not equal.
        """

    @abc.abstractproperty
    def prereleases(self):
        """
        Returns whether or not pre-releases as a whole are allowed by this
        specifier.
        """

    @prereleases.setter
    def prereleases(self, value):
        """
        Sets whether or not pre-releases as a whole are allowed by this
        specifier.
        """

    @abc.abstractmethod
    def contains(self, item, prereleases=None):
        """
        Determines if the given item is contained within this specifier.
        """

    @abc.abstractmethod
    def filter(self, iterable, prereleases=None):
        """
        Takes an iterable of items and filters them so that only items which
        are contained within this specifier are allowed in it.
        """


class _IndividualSpecifier(BaseSpecifier):

    _operators = {}

    def __init__(self, spec="", prereleases=None):
        match = self._regex.search(spec)
        if not match:
            raise InvalidSpecifier("Invalid specifier: '{0}'".format(spec))

        self._spec = (
            match.group("operator").strip(),
            match.group("version").strip(),
        )

        # Store whether or not this Specifier should accept prereleases
        self._prereleases = prereleases

    def __repr__(self):
        pre = (
            ", prereleases={0!r}".format(self.prereleases)
            if self._prereleases is not None
            else ""
        )

        return "<{0}({1!r}{2})>".format(
            self.__class__.__name__,
            str(self),
            pre,
        )

    def __str__(self):
        return "{0}{1}".format(*self._spec)

    def __hash__(self):
        return hash(self._spec)

    def __eq__(self, other):
        if isinstance(other, string_types):
            try:
                other = self.__class__(other)
            except InvalidSpecifier:
                return NotImplemented
        elif not isinstance(other, self.__class__):
            return NotImplemented

        return self._spec == other._spec

    def __ne__(self, other):
        if isinstance(other, string_types):
            try:
                other = self.__class__(other)
            except InvalidSpecifier:
                return NotImplemented
        elif not isinstance(other, self.__class__):
            return NotImplemented

        return self._spec != other._spec

    def _get_operator(self, op):
        return getattr(self, "_compare_{0}".format(self._operators[op]))

    def _coerce_version(self, version):
        if not isinstance(version, (LegacyVersion, Version)):
            version = parse(version)
        return version

    @property
    def operator(self):
        return self._spec[0]

    @property
    def version(self):
        return self._spec[1]

    @property
    def prereleases(self):
        return self._prereleases

    @prereleases.setter
    def prereleases(self, value):
        self._prereleases = value

    def __contains__(self, item):
        return self.contains(item)

    def contains(self, item, prereleases=None):
        # Determine if prereleases are to be allowed or not.
        if prereleases is None:
            prereleases = self.prereleases

        # Normalize item to a Version or LegacyVersion; this allows us to have
        # a shortcut for ``"2.0" in Specifier(">=2")``.
        item = self._coerce_version(item)

        # Determine if we should be supporting prereleases in this specifier
        # or not; if we do not support prereleases then we can short-circuit
        # the logic if this version is a prerelease.
        if item.is_prerelease and not prereleases:
            return False

        # Actually do the comparison to determine if this item is contained
        # within this Specifier or not.
        return self._get_operator(self.operator)(item, self.version)
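
    # Containment sketch (illustrative, not part of the original vendored
    # module; it uses the Specifier subclass defined later in this file):
    #
    #     >>> Specifier(">=1.0").contains("1.5")
    #     True
    #     >>> Specifier(">=1.0").contains("2.0.dev1")   # prereleases excluded by default
    #     False
    #     >>> Specifier(">=1.0").contains("2.0.dev1", prereleases=True)
    #     True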

    def filter(self, iterable, prereleases=None):
        yielded = False
        found_prereleases = []

        kw = {"prereleases": prereleases if prereleases is not None else True}

        # Attempt to iterate over all the values in the iterable and if any of
        # them match, yield them.
        for version in iterable:
            parsed_version = self._coerce_version(version)

            if self.contains(parsed_version, **kw):
                # If our version is a prerelease, and we were not set to allow
                # prereleases, then we'll store it for later in case nothing
                # else matches this specifier.
                if (parsed_version.is_prerelease and not
                        (prereleases or self.prereleases)):
                    found_prereleases.append(version)
                # Either this is not a prerelease, or we should have been
                # accepting prereleases from the beginning.
                else:
                    yielded = True
                    yield version

        # Now that we've iterated over everything, determine if we've yielded
        # any values, and if we have not and we have any prereleases stored up
        # then we will go ahead and yield the prereleases.
        if not yielded and found_prereleases:
            for version in found_prereleases:
                yield version


class LegacySpecifier(_IndividualSpecifier):

    _regex_str = (
        r"""
        (?P<operator>(==|!=|<=|>=|<|>))
        \s*
        (?P<version>
            [^,;\s)]* # Since this is a "legacy" specifier, and the version
                      # string can be just about anything, we match everything
                      # except for whitespace, a semi-colon for marker support,
                      # a closing paren since versions can be enclosed in
                      # them, and a comma since it's a version separator.
        )
        """
    )

    _regex = re.compile(
        r"^\s*" + _regex_str + r"\s*$", re.VERBOSE | re.IGNORECASE)

    _operators = {
        "==": "equal",
        "!=": "not_equal",
        "<=": "less_than_equal",
        ">=": "greater_than_equal",
        "<": "less_than",
        ">": "greater_than",
    }

    def _coerce_version(self, version):
        if not isinstance(version, LegacyVersion):
            version = LegacyVersion(str(version))
        return version

    def _compare_equal(self, prospective, spec):
        return prospective == self._coerce_version(spec)

    def _compare_not_equal(self, prospective, spec):
        return prospective != self._coerce_version(spec)

    def _compare_less_than_equal(self, prospective, spec):
        return prospective <= self._coerce_version(spec)

    def _compare_greater_than_equal(self, prospective, spec):
        return prospective >= self._coerce_version(spec)

    def _compare_less_than(self, prospective, spec):
        return prospective < self._coerce_version(spec)

    def _compare_greater_than(self, prospective, spec):
        return prospective > self._coerce_version(spec)


def _require_version_compare(fn):
    @functools.wraps(fn)
    def wrapped(self, prospective, spec):
        if not isinstance(prospective, Version):
            return False
        return fn(self, prospective, spec)
    return wrapped


class Specifier(_IndividualSpecifier):

    _regex_str = (
        r"""
        (?P<operator>(~=|==|!=|<=|>=|<|>|===))
        (?P<version>
            (?:
                # The identity operators allow for an escape hatch that will
                # do an exact string match of the version you wish to install.
                # This will not be parsed by PEP 440 and we cannot determine
                # any semantic meaning from it. This operator is discouraged
                # but included entirely as an escape hatch.
                (?<====)  # Only match for the identity operator
                \s*
                [^\s]*    # We just match everything, except for whitespace
                          # since we are only testing for strict identity.
            )
            |
            (?:
                # The (non)equality operators allow for wild card and local
                # versions to be specified so we have to define these two
                # operators separately to enable that.
                (?<===|!=)            # Only match for equals and not equals

                \s*
                v?
                (?:[0-9]+!)?          # epoch
                [0-9]+(?:\.[0-9]+)*   # release
                (?:                   # pre release
                    [-_\.]?
                    (a|b|c|rc|alpha|beta|pre|preview)
                    [-_\.]?
                    [0-9]*
                )?
                (?:                   # post release
                    (?:-[0-9]+)|(?:[-_\.]?(post|rev|r)[-_\.]?[0-9]*)
                )?

                # You cannot use a wild card and a dev or local version
                # together so group them with a | and make them optional.
                (?:
                    (?:[-_\.]?dev[-_\.]?[0-9]*)?         # dev release
                    (?:\+[a-z0-9]+(?:[-_\.][a-z0-9]+)*)? # local
                    |
                    \.\*  # Wild card syntax of .*
                )?
            )
            |
            (?:
                # The compatible operator requires at least two digits in the
                # release segment.
                (?<=~=)               # Only match for the compatible operator

                \s*
                v?
                (?:[0-9]+!)?          # epoch
                [0-9]+(?:\.[0-9]+)+   # release  (We have a + instead of a *)
                (?:                   # pre release
                    [-_\.]?
                    (a|b|c|rc|alpha|beta|pre|preview)
                    [-_\.]?
                    [0-9]*
                )?
                (?:                                   # post release
                    (?:-[0-9]+)|(?:[-_\.]?(post|rev|r)[-_\.]?[0-9]*)
                )?
                (?:[-_\.]?dev[-_\.]?[0-9]*)?          # dev release
            )
            |
            (?:
                # All other operators only allow a sub set of what the
                # (non)equality operators do. Specifically they do not allow
                # local versions to be specified nor do they allow the prefix
                # matching wild cards.
                (?<!==|!=|~=)         # We have special cases for these
                                      # operators so we want to make sure they
                                      # don't match here.

                \s*
                v?
                (?:[0-9]+!)?          # epoch
                [0-9]+(?:\.[0-9]+)*   # release
                (?:                   # pre release
                    [-_\.]?
                    (a|b|c|rc|alpha|beta|pre|preview)
                    [-_\.]?
                    [0-9]*
                )?
                (?:                                   # post release
                    (?:-[0-9]+)|(?:[-_\.]?(post|rev|r)[-_\.]?[0-9]*)
                )?
                (?:[-_\.]?dev[-_\.]?[0-9]*)?          # dev release
            )
        )
        """
    )

    _regex = re.compile(
        r"^\s*" + _regex_str + r"\s*$", re.VERBOSE | re.IGNORECASE)

    _operators = {
        "~=": "compatible",
        "==": "equal",
        "!=": "not_equal",
        "<=": "less_than_equal",
        ">=": "greater_than_equal",
        "<": "less_than",
        ">": "greater_than",
        "===": "arbitrary",
    }

    @_require_version_compare
    def _compare_compatible(self, prospective, spec):
        # Compatible releases have an equivalent combination of >= and ==.
        # That is, ~=2.2 is equivalent to >=2.2,==2.*. This allows us to
        # implement this in terms of the other specifiers instead of
        # implementing it ourselves. The only thing we need to do is construct
        # the other specifiers.

        # We want everything but the last item in the version, but we want to
        # ignore post and dev releases and we want to treat the pre-release as
        # its own separate segment.
        prefix = ".".join(
            list(
                itertools.takewhile(
                    lambda x: (not x.startswith("post") and not
                               x.startswith("dev")),
                    _version_split(spec),
                )
            )[:-1]
        )

        # Add the prefix notation to the end of our string
        prefix += ".*"

        return (self._get_operator(">=")(prospective, spec) and
                self._get_operator("==")(prospective, prefix))

    @_require_version_compare
    def _compare_equal(self, prospective, spec):
        # We need special logic to handle prefix matching
        if spec.endswith(".*"):
            # In the case of prefix matching we want to ignore local segment.
            prospective = Version(prospective.public)
            # Split the spec out by dots, and pretend that there is an implicit
            # dot in between a release segment and a pre-release segment.
            spec = _version_split(spec[:-2])  # Remove the trailing .*

            # Split the prospective version out by dots, and pretend that there
            # is an implicit dot in between a release segment and a pre-release
            # segment.
            prospective = _version_split(str(prospective))

            # Shorten the prospective version to be the same length as the spec
            # so that we can determine if the specifier is a prefix of the
            # prospective version or not.
            prospective = prospective[:len(spec)]

            # Pad out our two sides with zeros so that they both equal the same
            # length.
            spec, prospective = _pad_version(spec, prospective)
        else:
            # Convert our spec string into a Version
            spec = Version(spec)

            # If the specifier does not have a local segment, then we want to
            # act as if the prospective version also does not have a local
            # segment.
            if not spec.local:
                prospective = Version(prospective.public)

        return prospective == spec
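
    # Prefix-matching sketch (illustrative, not part of the original vendored
    # module):
    #
    #     >>> Specifier("==2.2.*").contains("2.2.1")
    #     True
    #     >>> Specifier("==2.2.*").contains("2.3.0")
    #     False
    #     >>> Specifier("==1.0").contains("1.0+local.1")   # spec has no local segment
    #     True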

    @_require_version_compare
    def _compare_not_equal(self, prospective, spec):
        return not self._compare_equal(prospective, spec)

    @_require_version_compare
    def _compare_less_than_equal(self, prospective, spec):
        return prospective <= Version(spec)

    @_require_version_compare
    def _compare_greater_than_equal(self, prospective, spec):
        return prospective >= Version(spec)

    @_require_version_compare
    def _compare_less_than(self, prospective, spec):
        # Convert our spec to a Version instance, since we'll want to work with
        # it as a version.
        spec = Version(spec)

        # Check to see if the prospective version is less than the spec
        # version. If it's not we can short circuit and just return False now
        # instead of doing extra unneeded work.
        if not prospective < spec:
            return False

        # This special case is here so that, unless the specifier itself
        # includes a pre-release version, we do not accept pre-release
        # versions for the version mentioned in the specifier (e.g. <3.1
        # should not match 3.1.dev0, but should match 3.0.dev0).
        if not spec.is_prerelease and prospective.is_prerelease:
            if Version(prospective.base_version) == Version(spec.base_version):
                return False

        # If we've gotten to here, it means that prospective version is both
        # less than the spec version *and* it's not a pre-release of the same
        # version in the spec.
        return True

    @_require_version_compare
    def _compare_greater_than(self, prospective, spec):
        # Convert our spec to a Version instance, since we'll want to work with
        # it as a version.
        spec = Version(spec)

        # Check to see if the prospective version is greater than the spec
        # version. If it's not we can short circuit and just return False now
        # instead of doing extra unneeded work.
        if not prospective > spec:
            return False

        # This special case is here so that, unless the specifier itself
        # includes a post-release version, we do not accept post-release
        # versions for the version mentioned in the specifier (e.g. >3.1
        # should not match 3.0.post0, but should match 3.2.post0).
        if not spec.is_postrelease and prospective.is_postrelease:
            if Version(prospective.base_version) == Version(spec.base_version):
                return False

        # Ensure that we do not allow a local version of the version mentioned
        # in the specifier, which is technically greater than, to match.
        if prospective.local is not None:
            if Version(prospective.base_version) == Version(spec.base_version):
                return False

        # If we've gotten to here, it means that prospective version is both
        # greater than the spec version *and* it's not a pre-release of the
        # same version in the spec.
        return True

    def _compare_arbitrary(self, prospective, spec):
        return str(prospective).lower() == str(spec).lower()

    @property
    def prereleases(self):
        # If there is an explicit prereleases set for this, then we'll just
        # blindly use that.
        if self._prereleases is not None:
            return self._prereleases

        # Look at all of our specifiers and determine if they are inclusive
        # operators, and if they are, whether they include an explicit
        # prerelease.
        operator, version = self._spec
        if operator in ["==", ">=", "<=", "~=", "==="]:
            # The == specifier can include a trailing .*; if it does, we
            # want to remove it before parsing.
            if operator == "==" and version.endswith(".*"):
                version = version[:-2]

            # Parse the version, and if it is a pre-release then this
            # specifier allows pre-releases.
            if parse(version).is_prerelease:
                return True

        return False

    @prereleases.setter
    def prereleases(self, value):
        self._prereleases = value


_prefix_regex = re.compile(r"^([0-9]+)((?:a|b|c|rc)[0-9]+)$")


def _version_split(version):
    result = []
    for item in version.split("."):
        match = _prefix_regex.search(item)
        if match:
            result.extend(match.groups())
        else:
            result.append(item)
    return result
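
# Split sketch (illustrative, not part of the original vendored module): the
# pre-release suffix is broken out into its own segment, as if it were
# separated by a dot:
#
#     >>> _version_split("2.2rc1")
#     ['2', '2', 'rc1']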


def _pad_version(left, right):
    left_split, right_split = [], []

    # Get the release segment of our versions
    left_split.append(list(itertools.takewhile(lambda x: x.isdigit(), left)))
    right_split.append(list(itertools.takewhile(lambda x: x.isdigit(), right)))

    # Get the rest of our versions
    left_split.append(left[len(left_split[0]):])
    right_split.append(right[len(right_split[0]):])

    # Insert our padding
    left_split.insert(
        1,
        ["0"] * max(0, len(right_split[0]) - len(left_split[0])),
    )
    right_split.insert(
        1,
        ["0"] * max(0, len(left_split[0]) - len(right_split[0])),
    )

    return (
        list(itertools.chain(*left_split)),
        list(itertools.chain(*right_split)),
    )
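
# Padding sketch (illustrative, not part of the original vendored module):
# the shorter release segment is zero-padded so the two sides line up for
# prefix comparison:
#
#     >>> _pad_version(["1", "2"], ["1", "2", "3"])
#     (['1', '2', '0'], ['1', '2', '3'])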


class SpecifierSet(BaseSpecifier):

    def __init__(self, specifiers="", prereleases=None):
        # Split on , to break each individual specifier into its own item, and
        # strip each item to remove leading/trailing whitespace.
        specifiers = [s.strip() for s in specifiers.split(",") if s.strip()]

        # Parse each individual specifier, attempting first to make it a
        # Specifier and falling back to a LegacySpecifier.
        parsed = set()
        for specifier in specifiers:
            try:
                parsed.add(Specifier(specifier))
            except InvalidSpecifier:
                parsed.add(LegacySpecifier(specifier))

        # Turn our parsed specifiers into a frozen set and save them for later.
        self._specs = frozenset(parsed)

        # Store our prereleases value so we can use it later to determine if
        # we accept prereleases or not.
        self._prereleases = prereleases

    def __repr__(self):
        pre = (
            ", prereleases={0!r}".format(self.prereleases)
            if self._prereleases is not None
            else ""
        )

        return "<SpecifierSet({0!r}{1})>".format(str(self), pre)

    def __str__(self):
        return ",".join(sorted(str(s) for s in self._specs))

    def __hash__(self):
        return hash(self._specs)

    def __and__(self, other):
        if isinstance(other, string_types):
            other = SpecifierSet(other)
        elif not isinstance(other, SpecifierSet):
            return NotImplemented

        specifier = SpecifierSet()
        specifier._specs = frozenset(self._specs | other._specs)

        if self._prereleases is None and other._prereleases is not None:
            specifier._prereleases = other._prereleases
        elif self._prereleases is not None and other._prereleases is None:
            specifier._prereleases = self._prereleases
        elif self._prereleases == other._prereleases:
            specifier._prereleases = self._prereleases
        else:
            raise ValueError(
                "Cannot combine SpecifierSets with True and False prerelease "
                "overrides."
            )

        return specifier
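
    # Combination sketch (illustrative, not part of the original vendored
    # module):
    #
    #     >>> str(SpecifierSet(">=1.0") & "<2.0")
    #     '<2.0,>=1.0'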

    def __eq__(self, other):
        if isinstance(other, string_types):
            other = SpecifierSet(other)
        elif isinstance(other, _IndividualSpecifier):
            other = SpecifierSet(str(other))
        elif not isinstance(other, SpecifierSet):
            return NotImplemented

        return self._specs == other._specs

    def __ne__(self, other):
        if isinstance(other, string_types):
            other = SpecifierSet(other)
        elif isinstance(other, _IndividualSpecifier):
            other = SpecifierSet(str(other))
        elif not isinstance(other, SpecifierSet):
            return NotImplemented

        return self._specs != other._specs

    def __len__(self):
        return len(self._specs)

    def __iter__(self):
        return iter(self._specs)

    @property
    def prereleases(self):
        # If we have been given an explicit prerelease modifier, then we'll
        # pass that through here.
        if self._prereleases is not None:
            return self._prereleases

        # If we don't have any specifiers, and we don't have a forced value,
        # then we'll just return None since we don't know if this should have
        # pre-releases or not.
        if not self._specs:
            return None

        # Otherwise we'll see if any of the given specifiers accept
        # prereleases, if any of them do we'll return True, otherwise False.
        return any(s.prereleases for s in self._specs)

    @prereleases.setter
    def prereleases(self, value):
        self._prereleases = value

    def __contains__(self, item):
        return self.contains(item)

    def contains(self, item, prereleases=None):
        # Ensure that our item is a Version or LegacyVersion instance.
        if not isinstance(item, (LegacyVersion, Version)):
            item = parse(item)

        # Determine if we're forcing a prerelease or not, if we're not forcing
        # one for this particular filter call, then we'll use whatever the
        # SpecifierSet thinks for whether or not we should support prereleases.
        if prereleases is None:
            prereleases = self.prereleases

        # We can determine if we're going to allow pre-releases by looking to
        # see if any of the underlying items supports them. If none of them do
        # and this item is a pre-release then we do not allow it and we can
        # short circuit that here.
        # Note: This means that 1.0.dev1 would not be contained in something
        #       like >=1.0.devabc, however it would be in >=1.0.devabc,>0.0.dev0
        if not prereleases and item.is_prerelease:
            return False

        # We simply dispatch to the underlying specs here to make sure that the
        # given version is contained within all of them.
        # Note: This use of all() here means that an empty set of specifiers
        #       will always return True, this is an explicit design decision.
        return all(
            s.contains(item, prereleases=prereleases)
            for s in self._specs
        )
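
    # Containment sketch (illustrative, not part of the original vendored
    # module):
    #
    #     >>> ss = SpecifierSet(">=1.0,!=1.3")
    #     >>> "1.5" in ss, "1.3" in ss
    #     (True, False)
    #     >>> SpecifierSet("").contains("1.0")   # an empty set accepts any final release
    #     True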

    def filter(self, iterable, prereleases=None):
        # Determine if we're forcing a prerelease or not, if we're not forcing
        # one for this particular filter call, then we'll use whatever the
        # SpecifierSet thinks for whether or not we should support prereleases.
        if prereleases is None:
            prereleases = self.prereleases

        # If we have any specifiers, then we want to wrap our iterable in the
        # filter method for each one, this will act as a logical AND amongst
        # each specifier.
        if self._specs:
            for spec in self._specs:
                iterable = spec.filter(iterable, prereleases=bool(prereleases))
            return iterable
        # If we do not have any specifiers, then we need to have a rough filter
        # which will filter out any pre-releases, unless there are no final
        # releases, and which will filter out LegacyVersion in general.
        else:
            filtered = []
            found_prereleases = []

            for item in iterable:
                # Ensure that we have some kind of Version class for this item.
                if not isinstance(item, (LegacyVersion, Version)):
                    parsed_version = parse(item)
                else:
                    parsed_version = item

                # Filter out any item which is parsed as a LegacyVersion
                if isinstance(parsed_version, LegacyVersion):
                    continue

                # Store any item which is a pre-release for later unless we've
                # already found a final version or we are accepting prereleases
                if parsed_version.is_prerelease and not prereleases:
                    if not filtered:
                        found_prereleases.append(item)
                else:
                    filtered.append(item)

            # If we've found no items except for pre-releases, then we'll go
            # ahead and use the pre-releases
            if not filtered and found_prereleases and prereleases is None:
                return found_prereleases

            return filtered
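
    # Filtering sketch (illustrative, not part of the original vendored
    # module):
    #
    #     >>> list(SpecifierSet(">=1.0").filter(["0.9", "1.0", "1.5.dev1", "2.0"]))
    #     ['1.0', '2.0']
    #     >>> list(SpecifierSet("").filter(["1.0a1"]))   # only prereleases available
    #     ['1.0a1']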
_operators)r�oprrr�
_get_operator�sz"_IndividualSpecifier._get_operatorcCst|ttf�st|�}|S)N)r6r	rr
)rr&rrr�_coerce_version�sz$_IndividualSpecifier._coerce_versioncCs
|jdS)Nr)r,)rrrrr%�sz_IndividualSpecifier.operatorcCs
|jdS)Nr)r,)rrrrr&�sz_IndividualSpecifier.versioncCs|jS)N)r-)rrrrr�sz _IndividualSpecifier.prereleasescCs
||_dS)N)r-)rrrrrr�scCs
|j|�S)N)r)rrrrr�__contains__�sz!_IndividualSpecifier.__contains__cCs<|dkr|j}|j|�}|jr(|r(dS|j|j�||j�S)NF)rr<�
is_prereleaser;r%r&)rrrrrrr�s
z_IndividualSpecifier.containsccs�d}g}d|dk	r|ndi}xL|D]D}|j|�}|j|f|�r"|jr\|pL|jr\|j|�q"d}|Vq"W|r�|r�x|D]
}|VqzWdS)NFrT)r<rr>r�append)rrrZyielded�found_prereleases�kwr&�parsed_versionrrrr�s




z_IndividualSpecifier.filter)r$N)N)N)rr
rr9r0r4rrrrr;r<�propertyr%r&rr"r=rrrrrrr#Ns 



r#c@sveZdZdZejdedejejB�Zdddddd	d
�Z	dd�Z
d
d�Zdd�Zdd�Z
dd�Zdd�Zdd�ZdS)�LegacySpecifiera�
        (?P<operator>(==|!=|<=|>=|<|>))
        \s*
        (?P<version>
            [^,;\s)]* # Since this is a "legacy" specifier, and the version
                      # string can be just about anything, we match everything
                      # except for whitespace, a semi-colon for marker support,
                      # a closing paren since versions can be enclosed in
                      # them, and a comma since it's a version separator.
        )
        z^\s*z\s*$�equal�	not_equal�less_than_equal�greater_than_equal�	less_than�greater_than)z==z!=z<=z>=�<�>cCst|t�stt|��}|S)N)r6r	r2)rr&rrrr<�s
zLegacySpecifier._coerce_versioncCs||j|�kS)N)r<)r�prospectiver.rrr�_compare_equal�szLegacySpecifier._compare_equalcCs||j|�kS)N)r<)rrMr.rrr�_compare_not_equal�sz"LegacySpecifier._compare_not_equalcCs||j|�kS)N)r<)rrMr.rrr�_compare_less_than_equal�sz(LegacySpecifier._compare_less_than_equalcCs||j|�kS)N)r<)rrMr.rrr�_compare_greater_than_equalsz+LegacySpecifier._compare_greater_than_equalcCs||j|�kS)N)r<)rrMr.rrr�_compare_less_thansz"LegacySpecifier._compare_less_thancCs||j|�kS)N)r<)rrMr.rrr�_compare_greater_thansz%LegacySpecifier._compare_greater_thanN)rr
r�
_regex_str�re�compile�VERBOSE�
IGNORECASEr'r9r<rNrOrPrQrRrSrrrrrD�s 
rDcstj���fdd��}|S)Ncst|t�sdS�|||�S)NF)r6r)rrMr.)�fnrr�wrappeds
z)_require_version_compare.<locals>.wrapped)�	functools�wraps)rYrZr)rYr�_require_version_compare
sr]c	@s�eZdZdZejdedejejB�Zdddddd	d
dd�Z	e
d
d��Ze
dd��Ze
dd��Z
e
dd��Ze
dd��Ze
dd��Ze
dd��Zdd�Zedd��Zejdd��Zd S)!�	Specifiera
        (?P<operator>(~=|==|!=|<=|>=|<|>|===))
        (?P<version>
            (?:
                # The identity operators allow for an escape hatch that will
                # do an exact string match of the version you wish to install.
                # This will not be parsed by PEP 440 and we cannot determine
                # any semantic meaning from it. This operator is discouraged
                # but included entirely as an escape hatch.
                (?<====)  # Only match for the identity operator
                \s*
                [^\s]*    # We just match everything, except for whitespace
                          # since we are only testing for strict identity.
            )
            |
            (?:
                # The (non)equality operators allow for wild card and local
                # versions to be specified so we have to define these two
                # operators separately to enable that.
                (?<===|!=)            # Only match for equals and not equals

                \s*
                v?
                (?:[0-9]+!)?          # epoch
                [0-9]+(?:\.[0-9]+)*   # release
                (?:                   # pre release
                    [-_\.]?
                    (a|b|c|rc|alpha|beta|pre|preview)
                    [-_\.]?
                    [0-9]*
                )?
                (?:                   # post release
                    (?:-[0-9]+)|(?:[-_\.]?(post|rev|r)[-_\.]?[0-9]*)
                )?

                # You cannot use a wild card and a dev or local version
                # together so group them with a | and make them optional.
                (?:
                    (?:[-_\.]?dev[-_\.]?[0-9]*)?         # dev release
                    (?:\+[a-z0-9]+(?:[-_\.][a-z0-9]+)*)? # local
                    |
                    \.\*  # Wild card syntax of .*
                )?
            )
            |
            (?:
                # The compatible operator requires at least two digits in the
                # release segment.
                (?<=~=)               # Only match for the compatible operator

                \s*
                v?
                (?:[0-9]+!)?          # epoch
                [0-9]+(?:\.[0-9]+)+   # release  (We have a + instead of a *)
                (?:                   # pre release
                    [-_\.]?
                    (a|b|c|rc|alpha|beta|pre|preview)
                    [-_\.]?
                    [0-9]*
                )?
                (?:                                   # post release
                    (?:-[0-9]+)|(?:[-_\.]?(post|rev|r)[-_\.]?[0-9]*)
                )?
                (?:[-_\.]?dev[-_\.]?[0-9]*)?          # dev release
            )
            |
            (?:
                # All other operators only allow a sub set of what the
                # (non)equality operators do. Specifically they do not allow
                # local versions to be specified nor do they allow the prefix
                # matching wild cards.
                (?<!==|!=|~=)         # We have special cases for these
                                      # operators so we want to make sure they
                                      # don't match here.

                \s*
                v?
                (?:[0-9]+!)?          # epoch
                [0-9]+(?:\.[0-9]+)*   # release
                (?:                   # pre release
                    [-_\.]?
                    (a|b|c|rc|alpha|beta|pre|preview)
                    [-_\.]?
                    [0-9]*
                )?
                (?:                                   # post release
                    (?:-[0-9]+)|(?:[-_\.]?(post|rev|r)[-_\.]?[0-9]*)
                )?
                (?:[-_\.]?dev[-_\.]?[0-9]*)?          # dev release
            )
        )
        z^\s*z\s*$Z
compatiblerErFrGrHrIrJZ	arbitrary)z~=z==z!=z<=z>=rKrLz===cCsNdjttjdd�t|���dd��}|d7}|jd�||�oL|jd�||�S)	N�.cSs|jd�o|jd�S)NZpostZdev)�
startswith)�xrrr�<lambda>�sz/Specifier._compare_compatible.<locals>.<lambda>rz.*z>=z==���)�join�list�	itertools�	takewhile�_version_splitr;)rrMr.�prefixrrr�_compare_compatible�s
zSpecifier._compare_compatiblecCsp|jd�rPt|j�}t|dd��}tt|��}|dt|��}t||�\}}nt|�}|jsht|j�}||kS)Nz.*����)�endswithrZpublicrhr2�len�_pad_version�local)rrMr.rrrrN�s


zSpecifier._compare_equalcCs|j||�S)N)rN)rrMr.rrrrO�szSpecifier._compare_not_equalcCs|t|�kS)N)r)rrMr.rrrrP�sz"Specifier._compare_less_than_equalcCs|t|�kS)N)r)rrMr.rrrrQ�sz%Specifier._compare_greater_than_equalcCs>t|�}||ksdS|jr:|jr:t|j�t|j�kr:dSdS)NFT)rr>�base_version)rrMr.rrrrR�szSpecifier._compare_less_thancCs`t|�}||ksdS|jr:|jr:t|j�t|j�kr:dS|jdk	r\t|j�t|j�kr\dSdS)NFT)rZis_postreleaserqrp)rrMr.rrrrS�s
zSpecifier._compare_greater_thancCst|�j�t|�j�kS)N)r2�lower)rrMr.rrr�_compare_arbitraryszSpecifier._compare_arbitrarycCsR|jdk	r|jS|j\}}|d
krN|dkr@|jd�r@|dd�}t|�jrNdSd	S)N�==�>=�<=�~=�===z.*rkTF)rtrurvrwrxrl)r-r,rmr
r>)rr%r&rrrrs


zSpecifier.prereleasescCs
||_dS)N)r-)rrrrrrsN)rr
rrTrUrVrWrXr'r9r]rjrNrOrPrQrRrSrsrCrr"rrrrr^s*^#r^z^([0-9]+)((?:a|b|c|rc)[0-9]+)$cCsDg}x:|jd�D],}tj|�}|r2|j|j��q|j|�qW|S)Nr_)�split�
_prefix_regexr(�extend�groupsr?)r&�resultrr/rrrrh's
rhc	Cs�gg}}|jttjdd�|���|jttjdd�|���|j|t|d�d��|j|t|d�d��|jddgtdt|d�t|d���|jddgtdt|d�t|d���ttj|��ttj|��fS)NcSs|j�S)N)�isdigit)rarrrrb6sz_pad_version.<locals>.<lambda>cSs|j�S)N)r~)rarrrrb7srr�0)r?rerfrgrn�insert�max�chain)�left�rightZ
left_splitZright_splitrrrro2s
&&roc@s�eZdZddd�Zdd�Zdd�Zd	d
�Zdd�Zd
d�Zdd�Z	dd�Z
dd�Zedd��Z
e
jdd��Z
dd�Zddd�Zd dd�ZdS)!�SpecifierSetr$NcCsrdd�|jd�D�}t�}xB|D]:}y|jt|��Wq tk
rX|jt|��Yq Xq Wt|�|_||_dS)NcSsg|]}|j�r|j��qSr)r+)�.0�srrr�
<listcomp>Rsz)SpecifierSet.__init__.<locals>.<listcomp>�,)	ry�set�addr^rrD�	frozenset�_specsr-)rZ
specifiersrZparsed�	specifierrrrr0Os

zSpecifierSet.__init__cCs*|jdk	rdj|j�nd}djt|�|�S)Nz, prereleases={0!r}r$z<SpecifierSet({0!r}{1})>)r-r)rr2)rr3rrrr4dszSpecifierSet.__repr__cCsdjtdd�|jD���S)Nr�css|]}t|�VqdS)N)r2)r�r�rrr�	<genexpr>nsz'SpecifierSet.__str__.<locals>.<genexpr>)rd�sortedr�)rrrrrmszSpecifierSet.__str__cCs
t|j�S)N)r5r�)rrrrrpszSpecifierSet.__hash__cCs�t|t�rt|�}nt|t�s"tSt�}t|j|jB�|_|jdkrX|jdk	rX|j|_n<|jdk	rv|jdkrv|j|_n|j|jkr�|j|_ntd��|S)NzFCannot combine SpecifierSets with True and False prerelease overrides.)r6rr�r7r�r�r-�
ValueError)rrr�rrr�__and__ss





zSpecifierSet.__and__cCsFt|t�rt|�}n&t|t�r,tt|��}nt|t�s:tS|j|jkS)N)r6rr�r#r2r7r�)rrrrrr�s



zSpecifierSet.__eq__cCsFt|t�rt|�}n&t|t�r,tt|��}nt|t�s:tS|j|jkS)N)r6rr�r#r2r7r�)rrrrrr�s



zSpecifierSet.__ne__cCs
t|j�S)N)rnr�)rrrr�__len__�szSpecifierSet.__len__cCs
t|j�S)N)�iterr�)rrrr�__iter__�szSpecifierSet.__iter__cCs.|jdk	r|jS|jsdStdd�|jD��S)Ncss|]}|jVqdS)N)r)r�r�rrrr��sz+SpecifierSet.prereleases.<locals>.<genexpr>)r-r��any)rrrrr�s

zSpecifierSet.prereleasescCs
||_dS)N)r-)rrrrrr�scCs
|j|�S)N)r)rrrrrr=�szSpecifierSet.__contains__csNt�ttf�st����dkr$|j��r4�jr4dSt��fdd�|jD��S)NFc3s|]}|j��d�VqdS))rN)r)r�r�)rrrrr��sz(SpecifierSet.contains.<locals>.<genexpr>)r6r	rr
rr>�allr�)rrrr)rrrr�szSpecifierSet.containscCs�|dkr|j}|jr:x |jD]}|j|t|�d�}qW|Sg}g}xZ|D]R}t|ttf�sdt|�}n|}t|t�rtqH|jr�|r�|s�|j	|�qH|j	|�qHW|r�|r�|dkr�|S|SdS)N)r)
rr�r�boolr6r	rr
r>r?)rrrr.Zfilteredr@rrBrrrr�s*


zSpecifierSet.filter)r$N)N)N)rr
rr0r4rrr�rrr�r�rCrr"r=rrrrrrr�Ms
	


r�)Z
__future__rrrrr[rfrUZ_compatrrr&rr	r
r�r�ABCMeta�objectrr#rDr]r^rVrzrhror�rrrr�<module>s&9	4	
site-packages/pip/_vendor/packaging/__pycache__/requirements.cpython-36.opt-1.pyc000064400000007306147511334570023774 0ustar003

���e��@srddlmZmZmZddlZddlZddlmZmZm	Z	m
Z
ddlmZmZm
Z
mZmZddlmZddlmZddlmZmZdd	lmZmZmZGd
d�de�Zeejej�Z ed�j!�Z"ed
�j!�Z#ed�j!�Z$ed�j!�Z%ed�j!�Z&ed�j!�Z'ed�j!�Z(ed�Z)e ee)�e BZ*ee ee*��Z+e+d�Z,e+Z-ed�d�Z.e(e.Z/e-ee&e-�Z0e"e
e0�e#d�Z1eej2ej3ej4B�Z5eej2ej3ej4B�Z6e5e6AZ7ee7ee&e7�ddd�d�Z8e
e$e8e%e8B�Z9e9j:dd��e	e9�d�Z;e;j:dd��e	e��d�Zej:d d��e'Z<e<eZ=e;e
e=�Z>e/e
e=�Z?e,e
e1�e?e>BZ@ee@eZAGd!d"�d"eB�ZCdS)#�)�absolute_import�division�print_functionN)�stringStart�	stringEnd�originalTextFor�ParseException)�
ZeroOrMore�Word�Optional�Regex�Combine)�Literal)�parse�)�MARKER_EXPR�Marker)�LegacySpecifier�	Specifier�SpecifierSetc@seZdZdZdS)�InvalidRequirementzJ
    An invalid requirement was found, users should refer to PEP 508.
    N)�__name__�
__module__�__qualname__�__doc__�rr�"/usr/lib/python3.6/requirements.pyrsr�[�]�(�)�,�;�@z-_.�namez[^ ]+�url�extrasF)Z
joinStringZadjacent�	_raw_speccCs
|jpdS)N�)r')�s�l�trrr�<lambda>8sr,�	specifiercCs|dS)Nrr)r)r*r+rrrr,;s�markercCst||j|j��S)N)rZ_original_startZ
_original_end)r)r*r+rrrr,?sc@s(eZdZdZdd�Zdd�Zdd�ZdS)	�Requirementz�Parse a requirement.

    Parse a given requirement string into its parts, such as name, specifier,
    URL, and extras. Raises InvalidRequirement on a badly-formed requirement
    string.
    cCs�ytj|�}Wn@tk
rN}z$tdj||j|jd����WYdd}~XnX|j|_|jr�tj|j�}|j	ot|j
s�|j	r�|j
r�td��|j|_nd|_t|jr�|jj
�ng�|_t|j�|_|jr�|jnd|_dS)Nz+Invalid requirement, parse error at "{0!r}"�zInvalid URL given)�REQUIREMENTZparseStringrr�format�locr$r%�urlparse�schemeZnetloc�setr&ZasListrr-r.)�selfZrequirement_stringZreq�eZ
parsed_urlrrr�__init__Zs"*
zRequirement.__init__cCsz|jg}|jr*|jdjdjt|j����|jr@|jt|j��|jrX|jdj|j��|j	rp|jdj|j	��dj|�S)Nz[{0}]r!z@ {0}z; {0}r()
r$r&�appendr2�join�sortedr-�strr%r.)r7�partsrrr�__str__oszRequirement.__str__cCsdjt|��S)Nz<Requirement({0!r})>)r2r=)r7rrr�__repr__�szRequirement.__repr__N)rrrrr9r?r@rrrrr/Msr/)DZ
__future__rrr�string�reZpip._vendor.pyparsingrrrrr	r
rrr
r�LZpip._vendor.six.moves.urllibrr4ZmarkersrrZ
specifiersrrr�
ValueErrorrZ
ascii_lettersZdigitsZALPHANUM�suppressZLBRACKETZRBRACKETZLPARENZRPAREN�COMMAZ	SEMICOLON�ATZPUNCTUATIONZIDENTIFIER_ENDZ
IDENTIFIER�NAMEZEXTRAZURIZURLZEXTRAS_LISTZEXTRASZ
_regex_str�VERBOSE�
IGNORECASEZVERSION_PEP440ZVERSION_LEGACYZVERSION_ONEZVERSION_MANYZ
_VERSION_SPECZsetParseActionZVERSION_SPECZMARKER_SEPERATORZMARKERZVERSION_AND_MARKERZURL_AND_MARKERZNAMED_REQUIREMENTr1�objectr/rrrr�<module>sZ
site-packages/pip/_vendor/packaging/__pycache__/_structures.cpython-36.pyc000064400000005335147511334570022674 0ustar003

���e��@sDddlmZmZmZGdd�de�Ze�ZGdd�de�Ze�ZdS)�)�absolute_import�division�print_functionc@sTeZdZdd�Zdd�Zdd�Zdd�Zd	d
�Zdd�Zd
d�Z	dd�Z
dd�ZdS)�InfinitycCsdS)Nr�)�selfrr�!/usr/lib/python3.6/_structures.py�__repr__	szInfinity.__repr__cCstt|��S)N)�hash�repr)rrrr�__hash__szInfinity.__hash__cCsdS)NFr)r�otherrrr�__lt__szInfinity.__lt__cCsdS)NFr)rr
rrr�__le__szInfinity.__le__cCst||j�S)N)�
isinstance�	__class__)rr
rrr�__eq__szInfinity.__eq__cCst||j�S)N)rr)rr
rrr�__ne__szInfinity.__ne__cCsdS)NTr)rr
rrr�__gt__szInfinity.__gt__cCsdS)NTr)rr
rrr�__ge__szInfinity.__ge__cCstS)N)�NegativeInfinity)rrrr�__neg__!szInfinity.__neg__N)�__name__�
__module__�__qualname__r	rrrrrrrrrrrrrsrc@sTeZdZdd�Zdd�Zdd�Zdd�Zd	d
�Zdd�Zd
d�Z	dd�Z
dd�ZdS)rcCsdS)Nz	-Infinityr)rrrrr	)szNegativeInfinity.__repr__cCstt|��S)N)r
r)rrrrr,szNegativeInfinity.__hash__cCsdS)NTr)rr
rrrr/szNegativeInfinity.__lt__cCsdS)NTr)rr
rrrr2szNegativeInfinity.__le__cCst||j�S)N)rr)rr
rrrr5szNegativeInfinity.__eq__cCst||j�S)N)rr)rr
rrrr8szNegativeInfinity.__ne__cCsdS)NFr)rr
rrrr;szNegativeInfinity.__gt__cCsdS)NFr)rr
rrrr>szNegativeInfinity.__ge__cCstS)N)r)rrrrrAszNegativeInfinity.__neg__N)rrrr	rrrrrrrrrrrrr'srN)Z
__future__rrr�objectrrrrrr�<module>ssite-packages/pip/_vendor/packaging/__pycache__/_compat.cpython-36.pyc000064400000001634147511334570021732 0ustar003

���e\�@sVddlmZmZmZddlZejddkZejddkZerDefZ	ne
fZ	dd�ZdS)�)�absolute_import�division�print_functionN��cs&G��fdd�d��}tj|dfi�S)z/
    Create a base class with a metaclass.
    cseZdZ��fdd�ZdS)z!with_metaclass.<locals>.metaclasscs�|�|�S)N�)�cls�nameZ
this_bases�d)�bases�metar�/usr/lib/python3.6/_compat.py�__new__sz)with_metaclass.<locals>.metaclass.__new__N)�__name__�
__module__�__qualname__rr)rrrr
�	metaclasssrZtemporary_class)�typer)rrrr)rrr
�with_metaclasssr)Z
__future__rrr�sys�version_infoZPY2ZPY3�strZstring_typesZ
basestringrrrrr
�<module>ssite-packages/pip/_vendor/packaging/__pycache__/requirements.cpython-36.pyc000064400000007306147511334570023035 0ustar003

���e��@srddlmZmZmZddlZddlZddlmZmZm	Z	m
Z
ddlmZmZm
Z
mZmZddlmZddlmZddlmZmZdd	lmZmZmZGd
d�de�Zeejej�Z ed�j!�Z"ed
�j!�Z#ed�j!�Z$ed�j!�Z%ed�j!�Z&ed�j!�Z'ed�j!�Z(ed�Z)e ee)�e BZ*ee ee*��Z+e+d�Z,e+Z-ed�d�Z.e(e.Z/e-ee&e-�Z0e"e
e0�e#d�Z1eej2ej3ej4B�Z5eej2ej3ej4B�Z6e5e6AZ7ee7ee&e7�ddd�d�Z8e
e$e8e%e8B�Z9e9j:dd��e	e9�d�Z;e;j:dd��e	e��d�Zej:d d��e'Z<e<eZ=e;e
e=�Z>e/e
e=�Z?e,e
e1�e?e>BZ@ee@eZAGd!d"�d"eB�ZCdS)#�)�absolute_import�division�print_functionN)�stringStart�	stringEnd�originalTextFor�ParseException)�
ZeroOrMore�Word�Optional�Regex�Combine)�Literal)�parse�)�MARKER_EXPR�Marker)�LegacySpecifier�	Specifier�SpecifierSetc@seZdZdZdS)�InvalidRequirementzJ
    An invalid requirement was found, users should refer to PEP 508.
    N)�__name__�
__module__�__qualname__�__doc__�rr�"/usr/lib/python3.6/requirements.pyrsr�[�]�(�)�,�;�@z-_.�namez[^ ]+�url�extrasF)Z
joinStringZadjacent�	_raw_speccCs
|jpdS)N�)r')�s�l�trrr�<lambda>8sr,�	specifiercCs|dS)Nrr)r)r*r+rrrr,;s�markercCst||j|j��S)N)rZ_original_startZ
_original_end)r)r*r+rrrr,?sc@s(eZdZdZdd�Zdd�Zdd�ZdS)	�Requirementz�Parse a requirement.

    Parse a given requirement string into its parts, such as name, specifier,
    URL, and extras. Raises InvalidRequirement on a badly-formed requirement
    string.
    cCs�ytj|�}Wn@tk
rN}z$tdj||j|jd����WYdd}~XnX|j|_|jr�tj|j�}|j	ot|j
s�|j	r�|j
r�td��|j|_nd|_t|jr�|jj
�ng�|_t|j�|_|jr�|jnd|_dS)Nz+Invalid requirement, parse error at "{0!r}"�zInvalid URL given)�REQUIREMENTZparseStringrr�format�locr$r%�urlparse�schemeZnetloc�setr&ZasListrr-r.)�selfZrequirement_stringZreq�eZ
parsed_urlrrr�__init__Zs"*
zRequirement.__init__cCsz|jg}|jr*|jdjdjt|j����|jr@|jt|j��|jrX|jdj|j��|j	rp|jdj|j	��dj|�S)Nz[{0}]r!z@ {0}z; {0}r()
r$r&�appendr2�join�sortedr-�strr%r.)r7�partsrrr�__str__oszRequirement.__str__cCsdjt|��S)Nz<Requirement({0!r})>)r2r=)r7rrr�__repr__�szRequirement.__repr__N)rrrrr9r?r@rrrrr/Msr/)DZ
__future__rrr�string�reZpip._vendor.pyparsingrrrrr	r
rrr
r�LZpip._vendor.six.moves.urllibrr4ZmarkersrrZ
specifiersrrr�
ValueErrorrZ
ascii_lettersZdigitsZALPHANUM�suppressZLBRACKETZRBRACKETZLPARENZRPAREN�COMMAZ	SEMICOLON�ATZPUNCTUATIONZIDENTIFIER_ENDZ
IDENTIFIER�NAMEZEXTRAZURIZURLZEXTRAS_LISTZEXTRASZ
_regex_str�VERBOSE�
IGNORECASEZVERSION_PEP440ZVERSION_LEGACYZVERSION_ONEZVERSION_MANYZ
_VERSION_SPECZsetParseActionZVERSION_SPECZMARKER_SEPERATORZMARKERZVERSION_AND_MARKERZURL_AND_MARKERZNAMED_REQUIREMENTr1�objectr/rrrr�<module>sZ
site-packages/pip/_vendor/packaging/__pycache__/__init__.cpython-36.opt-1.pyc000064400000000735147511334570023007 0ustar003

���e�@sTddlmZmZmZddlmZmZmZmZm	Z	m
Z
mZmZdddddd	d
dgZ
dS)
�)�absolute_import�division�print_function�)�
__author__�
__copyright__�	__email__�__license__�__summary__�	__title__�__uri__�__version__rr
rr
rrr	rN)Z
__future__rrr�	__about__rrrr	r
rrr
�__all__�rr�/usr/lib/python3.6/__init__.py�<module>s(
site-packages/pip/_vendor/packaging/__pycache__/specifiers.cpython-36.pyc000064400000046437147511334570022456 0ustar003

���eym�@s�ddlmZmZmZddlZddlZddlZddlZddlm	Z	m
Z
ddlmZm
Z
mZGdd�de�ZGdd	�d	e
eje��ZGd
d�de�ZGdd
�d
e�Zdd�ZGdd�de�Zejd�Zdd�Zdd�ZGdd�de�ZdS)�)�absolute_import�division�print_functionN�)�string_types�with_metaclass)�Version�
LegacyVersion�parsec@seZdZdZdS)�InvalidSpecifierzH
    An invalid specifier was found, users should refer to PEP 440.
    N)�__name__�
__module__�__qualname__�__doc__�rr� /usr/lib/python3.6/specifiers.pyrsrc@s�eZdZejdd��Zejdd��Zejdd��Zejdd��Zej	d	d
��Z
e
jdd
��Z
ejdd
d��Zejddd��Z
dS)�
BaseSpecifiercCsdS)z�
        Returns the str representation of this Specifier like object. This
        should be representative of the Specifier itself.
        Nr)�selfrrr�__str__szBaseSpecifier.__str__cCsdS)zF
        Returns a hash value for this Specifier like object.
        Nr)rrrr�__hash__szBaseSpecifier.__hash__cCsdS)zq
        Returns a boolean representing whether or not the two Specifier like
        objects are equal.
        Nr)r�otherrrr�__eq__$szBaseSpecifier.__eq__cCsdS)zu
        Returns a boolean representing whether or not the two Specifier like
        objects are not equal.
        Nr)rrrrr�__ne__+szBaseSpecifier.__ne__cCsdS)zg
        Returns whether or not pre-releases as a whole are allowed by this
        specifier.
        Nr)rrrr�prereleases2szBaseSpecifier.prereleasescCsdS)zd
        Sets whether or not pre-releases as a whole are allowed by this
        specifier.
        Nr)r�valuerrrr9sNcCsdS)zR
        Determines if the given item is contained within this specifier.
        Nr)r�itemrrrr�contains@szBaseSpecifier.containscCsdS)z�
        Takes an iterable of items and filters them so that only items which
        are contained within this specifier are allowed in it.
        Nr)r�iterablerrrr�filterFszBaseSpecifier.filter)N)N)rr
r�abc�abstractmethodrrrr�abstractpropertyr�setterrrrrrrrsrc@s�eZdZiZd dd�Zdd�Zdd�Zd	d
�Zdd�Zd
d�Z	dd�Z
dd�Zedd��Z
edd��Zedd��Zejdd��Zdd�Zd!dd�Zd"dd�ZdS)#�_IndividualSpecifier�NcCsF|jj|�}|stdj|���|jd�j�|jd�j�f|_||_dS)NzInvalid specifier: '{0}'�operator�version)�_regex�searchr�format�group�strip�_spec�_prereleases)r�specr�matchrrr�__init__Rsz_IndividualSpecifier.__init__cCs0|jdk	rdj|j�nd}dj|jjt|�|�S)Nz, prereleases={0!r}r$z<{0}({1!r}{2})>)r-r)r�	__class__r�str)r�prerrr�__repr___sz_IndividualSpecifier.__repr__cCsdj|j�S)Nz{0}{1})r)r,)rrrrrlsz_IndividualSpecifier.__str__cCs
t|j�S)N)�hashr,)rrrrrosz_IndividualSpecifier.__hash__cCsLt|t�r0y|j|�}Wq@tk
r,tSXnt||j�s@tS|j|jkS)N)�
isinstancerr1r�NotImplementedr,)rrrrrrrs
z_IndividualSpecifier.__eq__cCsLt|t�r0y|j|�}Wq@tk
r,tSXnt||j�s@tS|j|jkS)N)r6rr1rr7r,)rrrrrr}s
z_IndividualSpecifier.__ne__cCst|dj|j|��S)Nz_compare_{0})�getattrr)�
_operators)r�oprrr�
_get_operator�sz"_IndividualSpecifier._get_operatorcCst|ttf�st|�}|S)N)r6r	rr
)rr&rrr�_coerce_version�sz$_IndividualSpecifier._coerce_versioncCs
|jdS)Nr)r,)rrrrr%�sz_IndividualSpecifier.operatorcCs
|jdS)Nr)r,)rrrrr&�sz_IndividualSpecifier.versioncCs|jS)N)r-)rrrrr�sz _IndividualSpecifier.prereleasescCs
||_dS)N)r-)rrrrrr�scCs
|j|�S)N)r)rrrrr�__contains__�sz!_IndividualSpecifier.__contains__cCs<|dkr|j}|j|�}|jr(|r(dS|j|j�||j�S)NF)rr<�
is_prereleaser;r%r&)rrrrrrr�s
z_IndividualSpecifier.containsccs�d}g}d|dk	r|ndi}xL|D]D}|j|�}|j|f|�r"|jr\|pL|jr\|j|�q"d}|Vq"W|r�|r�x|D]
}|VqzWdS)NFrT)r<rr>r�append)rrrZyielded�found_prereleases�kwr&�parsed_versionrrrr�s




z_IndividualSpecifier.filter)r$N)N)N)rr
rr9r0r4rrrrr;r<�propertyr%r&rr"r=rrrrrrr#Ns 



r#c@sveZdZdZejdedejejB�Zdddddd	d
�Z	dd�Z
d
d�Zdd�Zdd�Z
dd�Zdd�Zdd�ZdS)�LegacySpecifiera�
        (?P<operator>(==|!=|<=|>=|<|>))
        \s*
        (?P<version>
            [^,;\s)]* # Since this is a "legacy" specifier, and the version
                      # string can be just about anything, we match everything
                      # except for whitespace, a semi-colon for marker support,
                      # a closing paren since versions can be enclosed in
                      # them, and a comma since it's a version separator.
        )
        z^\s*z\s*$�equal�	not_equal�less_than_equal�greater_than_equal�	less_than�greater_than)z==z!=z<=z>=�<�>cCst|t�stt|��}|S)N)r6r	r2)rr&rrrr<�s
zLegacySpecifier._coerce_versioncCs||j|�kS)N)r<)r�prospectiver.rrr�_compare_equal�szLegacySpecifier._compare_equalcCs||j|�kS)N)r<)rrMr.rrr�_compare_not_equal�sz"LegacySpecifier._compare_not_equalcCs||j|�kS)N)r<)rrMr.rrr�_compare_less_than_equal�sz(LegacySpecifier._compare_less_than_equalcCs||j|�kS)N)r<)rrMr.rrr�_compare_greater_than_equalsz+LegacySpecifier._compare_greater_than_equalcCs||j|�kS)N)r<)rrMr.rrr�_compare_less_thansz"LegacySpecifier._compare_less_thancCs||j|�kS)N)r<)rrMr.rrr�_compare_greater_thansz%LegacySpecifier._compare_greater_thanN)rr
r�
_regex_str�re�compile�VERBOSE�
IGNORECASEr'r9r<rNrOrPrQrRrSrrrrrD�s 
rDcstj���fdd��}|S)Ncst|t�sdS�|||�S)NF)r6r)rrMr.)�fnrr�wrappeds
z)_require_version_compare.<locals>.wrapped)�	functools�wraps)rYrZr)rYr�_require_version_compare
sr]c	@s�eZdZdZejdedejejB�Zdddddd	d
dd�Z	e
d
d��Ze
dd��Ze
dd��Z
e
dd��Ze
dd��Ze
dd��Ze
dd��Zdd�Zedd��Zejdd��Zd S)!�	Specifiera
        (?P<operator>(~=|==|!=|<=|>=|<|>|===))
        (?P<version>
            (?:
                # The identity operators allow for an escape hatch that will
                # do an exact string match of the version you wish to install.
                # This will not be parsed by PEP 440 and we cannot determine
                # any semantic meaning from it. This operator is discouraged
                # but included entirely as an escape hatch.
                (?<====)  # Only match for the identity operator
                \s*
                [^\s]*    # We just match everything, except for whitespace
                          # since we are only testing for strict identity.
            )
            |
            (?:
                # The (non)equality operators allow for wild card and local
                # versions to be specified so we have to define these two
                # operators separately to enable that.
                (?<===|!=)            # Only match for equals and not equals

                \s*
                v?
                (?:[0-9]+!)?          # epoch
                [0-9]+(?:\.[0-9]+)*   # release
                (?:                   # pre release
                    [-_\.]?
                    (a|b|c|rc|alpha|beta|pre|preview)
                    [-_\.]?
                    [0-9]*
                )?
                (?:                   # post release
                    (?:-[0-9]+)|(?:[-_\.]?(post|rev|r)[-_\.]?[0-9]*)
                )?

                # You cannot use a wild card and a dev or local version
                # together so group them with a | and make them optional.
                (?:
                    (?:[-_\.]?dev[-_\.]?[0-9]*)?         # dev release
                    (?:\+[a-z0-9]+(?:[-_\.][a-z0-9]+)*)? # local
                    |
                    \.\*  # Wild card syntax of .*
                )?
            )
            |
            (?:
                # The compatible operator requires at least two digits in the
                # release segment.
                (?<=~=)               # Only match for the compatible operator

                \s*
                v?
                (?:[0-9]+!)?          # epoch
                [0-9]+(?:\.[0-9]+)+   # release  (We have a + instead of a *)
                (?:                   # pre release
                    [-_\.]?
                    (a|b|c|rc|alpha|beta|pre|preview)
                    [-_\.]?
                    [0-9]*
                )?
                (?:                                   # post release
                    (?:-[0-9]+)|(?:[-_\.]?(post|rev|r)[-_\.]?[0-9]*)
                )?
                (?:[-_\.]?dev[-_\.]?[0-9]*)?          # dev release
            )
            |
            (?:
                # All other operators only allow a sub set of what the
                # (non)equality operators do. Specifically they do not allow
                # local versions to be specified nor do they allow the prefix
                # matching wild cards.
                (?<!==|!=|~=)         # We have special cases for these
                                      # operators so we want to make sure they
                                      # don't match here.

                \s*
                v?
                (?:[0-9]+!)?          # epoch
                [0-9]+(?:\.[0-9]+)*   # release
                (?:                   # pre release
                    [-_\.]?
                    (a|b|c|rc|alpha|beta|pre|preview)
                    [-_\.]?
                    [0-9]*
                )?
                (?:                                   # post release
                    (?:-[0-9]+)|(?:[-_\.]?(post|rev|r)[-_\.]?[0-9]*)
                )?
                (?:[-_\.]?dev[-_\.]?[0-9]*)?          # dev release
            )
        )
        z^\s*z\s*$Z
compatiblerErFrGrHrIrJZ	arbitrary)z~=z==z!=z<=z>=rKrLz===cCsNdjttjdd�t|���dd��}|d7}|jd�||�oL|jd�||�S)	N�.cSs|jd�o|jd�S)NZpostZdev)�
startswith)�xrrr�<lambda>�sz/Specifier._compare_compatible.<locals>.<lambda>rz.*z>=z==���)�join�list�	itertools�	takewhile�_version_splitr;)rrMr.�prefixrrr�_compare_compatible�s
zSpecifier._compare_compatiblecCsp|jd�rPt|j�}t|dd��}tt|��}|dt|��}t||�\}}nt|�}|jsht|j�}||kS)Nz.*����)�endswithrZpublicrhr2�len�_pad_version�local)rrMr.rrrrN�s


zSpecifier._compare_equalcCs|j||�S)N)rN)rrMr.rrrrO�szSpecifier._compare_not_equalcCs|t|�kS)N)r)rrMr.rrrrP�sz"Specifier._compare_less_than_equalcCs|t|�kS)N)r)rrMr.rrrrQ�sz%Specifier._compare_greater_than_equalcCs>t|�}||ksdS|jr:|jr:t|j�t|j�kr:dSdS)NFT)rr>�base_version)rrMr.rrrrR�szSpecifier._compare_less_thancCs`t|�}||ksdS|jr:|jr:t|j�t|j�kr:dS|jdk	r\t|j�t|j�kr\dSdS)NFT)rZis_postreleaserqrp)rrMr.rrrrS�s
zSpecifier._compare_greater_thancCst|�j�t|�j�kS)N)r2�lower)rrMr.rrr�_compare_arbitraryszSpecifier._compare_arbitrarycCsR|jdk	r|jS|j\}}|d
krN|dkr@|jd�r@|dd�}t|�jrNdSd	S)N�==�>=�<=�~=�===z.*rkTF)rtrurvrwrxrl)r-r,rmr
r>)rr%r&rrrrs


zSpecifier.prereleasescCs
||_dS)N)r-)rrrrrrsN)rr
rrTrUrVrWrXr'r9r]rjrNrOrPrQrRrSrsrCrr"rrrrr^s*^#r^z^([0-9]+)((?:a|b|c|rc)[0-9]+)$cCsDg}x:|jd�D],}tj|�}|r2|j|j��q|j|�qW|S)Nr_)�split�
_prefix_regexr(�extend�groupsr?)r&�resultrr/rrrrh's
rhc	Cs�gg}}|jttjdd�|���|jttjdd�|���|j|t|d�d��|j|t|d�d��|jddgtdt|d�t|d���|jddgtdt|d�t|d���ttj|��ttj|��fS)NcSs|j�S)N)�isdigit)rarrrrb6sz_pad_version.<locals>.<lambda>cSs|j�S)N)r~)rarrrrb7srr�0)r?rerfrgrn�insert�max�chain)�left�rightZ
left_splitZright_splitrrrro2s
&&roc@s�eZdZddd�Zdd�Zdd�Zd	d
�Zdd�Zd
d�Zdd�Z	dd�Z
dd�Zedd��Z
e
jdd��Z
dd�Zddd�Zd dd�ZdS)!�SpecifierSetr$NcCsrdd�|jd�D�}t�}xB|D]:}y|jt|��Wq tk
rX|jt|��Yq Xq Wt|�|_||_dS)NcSsg|]}|j�r|j��qSr)r+)�.0�srrr�
<listcomp>Rsz)SpecifierSet.__init__.<locals>.<listcomp>�,)	ry�set�addr^rrD�	frozenset�_specsr-)rZ
specifiersrZparsed�	specifierrrrr0Os

zSpecifierSet.__init__cCs*|jdk	rdj|j�nd}djt|�|�S)Nz, prereleases={0!r}r$z<SpecifierSet({0!r}{1})>)r-r)rr2)rr3rrrr4dszSpecifierSet.__repr__cCsdjtdd�|jD���S)Nr�css|]}t|�VqdS)N)r2)r�r�rrr�	<genexpr>nsz'SpecifierSet.__str__.<locals>.<genexpr>)rd�sortedr�)rrrrrmszSpecifierSet.__str__cCs
t|j�S)N)r5r�)rrrrrpszSpecifierSet.__hash__cCs�t|t�rt|�}nt|t�s"tSt�}t|j|jB�|_|jdkrX|jdk	rX|j|_n<|jdk	rv|jdkrv|j|_n|j|jkr�|j|_ntd��|S)NzFCannot combine SpecifierSets with True and False prerelease overrides.)r6rr�r7r�r�r-�
ValueError)rrr�rrr�__and__ss





zSpecifierSet.__and__cCsFt|t�rt|�}n&t|t�r,tt|��}nt|t�s:tS|j|jkS)N)r6rr�r#r2r7r�)rrrrrr�s



zSpecifierSet.__eq__cCsFt|t�rt|�}n&t|t�r,tt|��}nt|t�s:tS|j|jkS)N)r6rr�r#r2r7r�)rrrrrr�s



zSpecifierSet.__ne__cCs
t|j�S)N)rnr�)rrrr�__len__�szSpecifierSet.__len__cCs
t|j�S)N)�iterr�)rrrr�__iter__�szSpecifierSet.__iter__cCs.|jdk	r|jS|jsdStdd�|jD��S)Ncss|]}|jVqdS)N)r)r�r�rrrr��sz+SpecifierSet.prereleases.<locals>.<genexpr>)r-r��any)rrrrr�s

zSpecifierSet.prereleasescCs
||_dS)N)r-)rrrrrr�scCs
|j|�S)N)r)rrrrrr=�szSpecifierSet.__contains__csNt�ttf�st����dkr$|j��r4�jr4dSt��fdd�|jD��S)NFc3s|]}|j��d�VqdS))rN)r)r�r�)rrrrr��sz(SpecifierSet.contains.<locals>.<genexpr>)r6r	rr
rr>�allr�)rrrr)rrrr�szSpecifierSet.containscCs�|dkr|j}|jr:x |jD]}|j|t|�d�}qW|Sg}g}xZ|D]R}t|ttf�sdt|�}n|}t|t�rtqH|jr�|r�|s�|j	|�qH|j	|�qHW|r�|r�|dkr�|S|SdS)N)r)
rr�r�boolr6r	rr
r>r?)rrrr.Zfilteredr@rrBrrrr�s*


zSpecifierSet.filter)r$N)N)N)rr
rr0r4rrr�rrr�r�rCrr"r=rrrrrrr�Ms
	


r�)Z
__future__rrrrr[rfrUZ_compatrrr&rr	r
r�r�ABCMeta�objectrr#rDr]r^rVrzrhror�rrrr�<module>s&9	4	
site-packages/pip/_vendor/packaging/__pycache__/markers.cpython-36.opt-1.pyc000064400000020761147511334570022715 0ustar003

���e& �	@s@ddlmZmZmZddlZddlZddlZddlZddlm	Z	m
Z
mZmZddlm
Z
mZmZmZddlmZddlmZddlmZmZd	d
ddd
gZGdd	�d	e�ZGdd
�d
e�ZGdd�de�ZGdd�de�ZGdd�de�ZGdd�de�Z Gdd�de�Z!ed�ed�Bed�Bed�Bed�Bed�Bed�Bed �Bed!�Bed"�Bed#�Bed$�Bed%�Bed&�Bed'�Bed(�Bed)�Bed*�BZ"d#d"ddddd+�Z#e"j$d,d-��ed.�ed/�Bed0�Bed1�Bed2�Bed3�Bed4�Bed5�BZ%e%ed6�Bed7�BZ&e&j$d8d-��ed9�ed:�BZ'e'j$d;d-��ed<�ed=�BZ(e"e'BZ)ee)e&e)�Z*e*j$d>d-��ed?�j+�Z,ed@�j+�Z-e�Z.e*ee,e.e-�BZ/e.e/e
e(e.�>ee.eZ0dAdB�Z1dSdDdE�Z2dFd-�dGd-�ej3ej4ej5ej6ej7ej8dH�Z9dIdJ�Z:e�Z;dKdL�Z<dMdN�Z=dOdP�Z>dQd
�Z?GdRd�de�Z@dS)T�)�absolute_import�division�print_functionN)�ParseException�ParseResults�stringStart�	stringEnd)�
ZeroOrMore�Group�Forward�QuotedString)�Literal�)�string_types)�	Specifier�InvalidSpecifier�
InvalidMarker�UndefinedComparison�UndefinedEnvironmentName�Marker�default_environmentc@seZdZdZdS)rzE
    An invalid marker was found, users should refer to PEP 508.
    N)�__name__�
__module__�__qualname__�__doc__�rr�/usr/lib/python3.6/markers.pyrsc@seZdZdZdS)rzP
    An invalid operation was attempted on a value that doesn't support it.
    N)rrrrrrrrr!sc@seZdZdZdS)rz\
    A name was attempted to be used that does not exist inside of the
    environment.
    N)rrrrrrrrr'sc@s,eZdZdd�Zdd�Zdd�Zdd�Zd	S)
�NodecCs
||_dS)N)�value)�selfrrrr�__init__0sz
Node.__init__cCs
t|j�S)N)�strr)rrrr�__str__3szNode.__str__cCsdj|jjt|��S)Nz<{0}({1!r})>)�format�	__class__rr!)rrrr�__repr__6sz
Node.__repr__cCst�dS)N)�NotImplementedError)rrrr�	serialize9szNode.serializeN)rrrr r"r%r'rrrrr.src@seZdZdd�ZdS)�VariablecCst|�S)N)r!)rrrrr'?szVariable.serializeN)rrrr'rrrrr(=sr(c@seZdZdd�ZdS)�ValuecCs
dj|�S)Nz"{0}")r#)rrrrr'EszValue.serializeN)rrrr'rrrrr)Csr)c@seZdZdd�ZdS)�OpcCst|�S)N)r!)rrrrr'KszOp.serializeN)rrrr'rrrrr*Isr*�implementation_version�platform_python_implementation�implementation_name�python_full_version�platform_release�platform_version�platform_machine�platform_system�python_version�sys_platform�os_namezos.namezsys.platformzplatform.versionzplatform.machinezplatform.python_implementation�python_implementationZextra)zos.namezsys.platformzplatform.versionzplatform.machinezplatform.python_implementationr6cCsttj|d|d��S)Nr)r(�ALIASES�get)�s�l�trrr�<lambda>ksr<z===z==z>=z<=z!=z~=�>�<znot in�incCst|d�S)Nr)r*)r9r:r;rrrr<ys�'�"cCst|d�S)Nr)r))r9r:r;rrrr<|s�and�orcCst|d�S)Nr)�tuple)r9r:r;rrrr<�s�(�)cCs t|t�rdd�|D�S|SdS)NcSsg|]}t|��qSr)�_coerce_parse_result)�.0�irrr�
<listcomp>�sz(_coerce_parse_result.<locals>.<listcomp>)�
isinstancer)�resultsrrrrG�s
rGTcCs�t|t�r4t|�dkr4t|dttf�r4t|d�St|t�rndd�|D�}|rZdj|�Sddj|�dSn"t|t�r�djdd	�|D��S|SdS)
Nrrcss|]}t|dd�VqdS)F)�firstN)�_format_marker)rH�mrrr�	<genexpr>�sz!_format_marker.<locals>.<genexpr>� rErFcSsg|]}|j��qSr)r')rHrOrrrrJ�sz"_format_marker.<locals>.<listcomp>)rK�list�lenrDrN�join)�markerrM�innerrrrrN�s


rNcCs||kS)Nr)�lhs�rhsrrrr<�scCs||kS)Nr)rWrXrrrr<�s)r?znot inr>z<=z==z!=z>=r=c
Cslytdj|j�|g��}Wntk
r.YnX|j|�Stj|j��}|dkrbtdj|||���|||�S)N�z#Undefined {0!r} on {1!r} and {2!r}.)	rrTr'r�contains�
_operatorsr8rr#)rW�oprX�specZoperrrr�_eval_op�s
r^cCs&|j|t�}|tkr"tdj|���|S)Nz/{0!r} does not exist in evaluation environment.)r8�
_undefinedrr#)�environment�namerrrr�_get_env�s
rbc	Cs�gg}x�|D]�}t|t�r0|djt||��qt|t�r�|\}}}t|t�rbt||j�}|j}n|j}t||j�}|djt|||��q|dkr|jg�qWt	dd�|D��S)NrrCcss|]}t|�VqdS)N)�all)rH�itemrrrrP�sz$_evaluate_markers.<locals>.<genexpr>���re)
rKrR�append�_evaluate_markersrDr(rbrr^�any)	Zmarkersr`�groupsrUrWr\rXZ	lhs_valueZ	rhs_valuerrrrg�s




rgcCs2dj|�}|j}|dkr.||dt|j�7}|S)Nz{0.major}.{0.minor}.{0.micro}�finalr)r#�releaselevelr!�serial)�info�versionZkindrrr�format_full_version�s

rocCslttd�r ttjj�}tjj}nd}d}||tjtj�tj	�tj
�tj�tj�tj�tj�dd�tjd�S)N�implementation�0rY�)r-r+r5r1r/r2r0r.r,r3r4)
�hasattr�sysrorprnra�os�platform�machine�release�systemr3r6)Ziverr-rrrr�s 

c@s.eZdZdd�Zdd�Zdd�Zd
dd	�ZdS)rcCs`yttj|��|_WnFtk
rZ}z*dj|||j|jd��}t|��WYdd}~XnXdS)Nz+Invalid marker: {0!r}, parse error at {1!r}�)rG�MARKERZparseString�_markersrr#�locr)rrU�eZerr_strrrrr szMarker.__init__cCs
t|j�S)N)rNr|)rrrrr"szMarker.__str__cCsdjt|��S)Nz<Marker({0!r})>)r#r!)rrrrr%szMarker.__repr__NcCs$t�}|dk	r|j|�t|j|�S)a$Evaluate a marker.

        Return the boolean from evaluating the given marker against the
        environment. environment is an optional argument to override all or
        part of the determined environment.

        The environment is determined from the current Python process.
        N)r�updatergr|)rr`Zcurrent_environmentrrr�evaluate"s	
zMarker.evaluate)N)rrrr r"r%r�rrrrrs)T)AZ
__future__rrr�operatorrurvrtZpip._vendor.pyparsingrrrrr	r
rrr
�LZ_compatrZ
specifiersrr�__all__�
ValueErrorrrr�objectrr(r)r*ZVARIABLEr7ZsetParseActionZVERSION_CMPZ	MARKER_OPZMARKER_VALUEZBOOLOPZ
MARKER_VARZMARKER_ITEM�suppressZLPARENZRPARENZMARKER_EXPRZMARKER_ATOMr{rGrN�lt�le�eq�ne�ge�gtr[r^r_rbrgrorrrrrr�<module>sx�
	6


site-packages/pip/_vendor/packaging/__pycache__/__about__.cpython-36.pyc000064400000001177147511334570022220 0ustar003

���e��@sPddlmZmZmZdddddddd	gZd
ZdZdZd
ZdZ	dZ
dZde	ZdS)�)�absolute_import�division�print_function�	__title__�__summary__�__uri__�__version__�
__author__�	__email__�__license__�
__copyright__Z	packagingz"Core utilities for Python packagesz!https://github.com/pypa/packagingz16.8z)Donald Stufft and individual contributorszdonald@stufft.ioz"BSD or Apache License, Version 2.0zCopyright 2014-2016 %sN)
Z
__future__rrr�__all__rrrrr	r
rr�rr�/usr/lib/python3.6/__about__.py�<module>s

site-packages/pip/_vendor/packaging/__pycache__/version.cpython-36.pyc000064400000024426147511334570022001 0ustar003

���e$-�@s�ddlmZmZmZddlZddlZddlZddlmZddddd	gZ	ej
d
ddd
dddg�Zdd�ZGdd�de
�ZGdd�de�ZGdd�de�Zejdej�Zdddddd�Zdd�Zdd�ZdZGd d�de�Zd!d"�Zejd#�Zd$d%�Zd&d'�ZdS)(�)�absolute_import�division�print_functionN�)�Infinity�parse�Version�
LegacyVersion�InvalidVersion�VERSION_PATTERN�_Version�epoch�release�dev�pre�post�localcCs&yt|�Stk
r t|�SXdS)z�
    Parse the given version string and return either a :class:`Version` object
    or a :class:`LegacyVersion` object depending on if the given version is
    a valid PEP 440 version or a legacy version.
    N)rr
r	)�version�r�/usr/lib/python3.6/version.pyrsc@seZdZdZdS)r
zF
    An invalid version was found, users should refer to PEP 440.
    N)�__name__�
__module__�__qualname__�__doc__rrrrr
$sc@sLeZdZdd�Zdd�Zdd�Zdd�Zd	d
�Zdd�Zd
d�Z	dd�Z
dS)�_BaseVersioncCs
t|j�S)N)�hash�_key)�selfrrr�__hash__,sz_BaseVersion.__hash__cCs|j|dd��S)NcSs||kS)Nr)�s�orrr�<lambda>0sz%_BaseVersion.__lt__.<locals>.<lambda>)�_compare)r�otherrrr�__lt__/sz_BaseVersion.__lt__cCs|j|dd��S)NcSs||kS)Nr)rr rrrr!3sz%_BaseVersion.__le__.<locals>.<lambda>)r")rr#rrr�__le__2sz_BaseVersion.__le__cCs|j|dd��S)NcSs||kS)Nr)rr rrrr!6sz%_BaseVersion.__eq__.<locals>.<lambda>)r")rr#rrr�__eq__5sz_BaseVersion.__eq__cCs|j|dd��S)NcSs||kS)Nr)rr rrrr!9sz%_BaseVersion.__ge__.<locals>.<lambda>)r")rr#rrr�__ge__8sz_BaseVersion.__ge__cCs|j|dd��S)NcSs||kS)Nr)rr rrrr!<sz%_BaseVersion.__gt__.<locals>.<lambda>)r")rr#rrr�__gt__;sz_BaseVersion.__gt__cCs|j|dd��S)NcSs||kS)Nr)rr rrrr!?sz%_BaseVersion.__ne__.<locals>.<lambda>)r")rr#rrr�__ne__>sz_BaseVersion.__ne__cCst|t�stS||j|j�S)N)�
isinstancer�NotImplementedr)rr#�methodrrrr"As
z_BaseVersion._compareN)rrrrr$r%r&r'r(r)r"rrrrr*src@s`eZdZdd�Zdd�Zdd�Zedd��Zed	d
��Zedd��Z	ed
d��Z
edd��ZdS)r	cCst|�|_t|j�|_dS)N)�str�_version�_legacy_cmpkeyr)rrrrr�__init__Js
zLegacyVersion.__init__cCs|jS)N)r.)rrrr�__str__NszLegacyVersion.__str__cCsdjtt|���S)Nz<LegacyVersion({0})>)�format�reprr-)rrrr�__repr__QszLegacyVersion.__repr__cCs|jS)N)r.)rrrr�publicTszLegacyVersion.publiccCs|jS)N)r.)rrrr�base_versionXszLegacyVersion.base_versioncCsdS)Nr)rrrrr\szLegacyVersion.localcCsdS)NFr)rrrr�
is_prerelease`szLegacyVersion.is_prereleasecCsdS)NFr)rrrr�is_postreleasedszLegacyVersion.is_postreleaseN)rrrr0r1r4�propertyr5r6rr7r8rrrrr	Hsz(\d+ | [a-z]+ | \.| -)�czfinal-�@)r�preview�-�rcrccsbxVtj|�D]H}tj||�}|s|dkr,q|dd�dkrJ|jd�Vqd|VqWdVdS)N�.r�
0123456789��*z*final)�_legacy_version_component_re�split�_legacy_version_replacement_map�get�zfill)r�partrrr�_parse_version_partsrsrIcCs�d}g}xlt|j��D]\}|jd�rh|dkrJx|rH|ddkrH|j�q.Wx|rf|ddkrf|j�qLW|j|�qWt|�}||fS)	NrrBz*finalz*final-Z00000000���rJrJ)rI�lower�
startswith�pop�append�tuple)rr
�partsrHrrrr/�s
r/a�
    v?
    (?:
        (?:(?P<epoch>[0-9]+)!)?                           # epoch
        (?P<release>[0-9]+(?:\.[0-9]+)*)                  # release segment
        (?P<pre>                                          # pre-release
            [-_\.]?
            (?P<pre_l>(a|b|c|rc|alpha|beta|pre|preview))
            [-_\.]?
            (?P<pre_n>[0-9]+)?
        )?
        (?P<post>                                         # post release
            (?:-(?P<post_n1>[0-9]+))
            |
            (?:
                [-_\.]?
                (?P<post_l>post|rev|r)
                [-_\.]?
                (?P<post_n2>[0-9]+)?
            )
        )?
        (?P<dev>                                          # dev release
            [-_\.]?
            (?P<dev_l>dev)
            [-_\.]?
            (?P<dev_n>[0-9]+)?
        )?
    )
    (?:\+(?P<local>[a-z0-9]+(?:[-_\.][a-z0-9]+)*))?       # local version
c@s|eZdZejdedejejB�Zdd�Z	dd�Z
dd�Zed	d
��Z
edd��Zed
d��Zedd��Zedd��ZdS)rz^\s*z\s*$c	Cs�|jj|�}|stdj|���t|jd�r8t|jd��ndtdd�|jd�jd�D��t	|jd�|jd	��t	|jd
�|jd�p�|jd��t	|jd
�|jd��t
|jd��d�|_t|jj
|jj|jj|jj|jj|jj�|_dS)NzInvalid version: '{0}'r
rcss|]}t|�VqdS)N)�int)�.0�irrr�	<genexpr>�sz#Version.__init__.<locals>.<genexpr>rr?Zpre_lZpre_nZpost_lZpost_n1Zpost_n2Zdev_lZdev_nr)r
rrrrr)�_regex�searchr
r2r�grouprQrOrD�_parse_letter_version�_parse_local_versionr.�_cmpkeyr
rrrrrr)rr�matchrrrr0�s.

zVersion.__init__cCsdjtt|���S)Nz<Version({0})>)r2r3r-)rrrrr4�szVersion.__repr__cCs�g}|jjdkr$|jdj|jj��|jdjdd�|jjD���|jjdk	rl|jdjdd�|jjD���|jjdk	r�|jdj|jjd	��|jjdk	r�|jd
j|jjd	��|jj	dk	r�|jdjdjdd�|jj	D����dj|�S)
Nrz{0}!r?css|]}t|�VqdS)N)r-)rR�xrrrrT�sz"Version.__str__.<locals>.<genexpr>�css|]}t|�VqdS)N)r-)rRr\rrrrT�sz.post{0}rz.dev{0}z+{0}css|]}t|�VqdS)N)r-)rRr\rrrrTs)
r.r
rNr2�joinrrrrr)rrPrrrr1�s zVersion.__str__cCst|�jdd�dS)N�+rr)r-rD)rrrrr5
szVersion.publiccCsLg}|jjdkr$|jdj|jj��|jdjdd�|jjD���dj|�S)Nrz{0}!r?css|]}t|�VqdS)N)r-)rRr\rrrrTsz'Version.base_version.<locals>.<genexpr>r])r.r
rNr2r^r)rrPrrrr6s
zVersion.base_versioncCs$t|�}d|kr |jdd�dSdS)Nr_r)r-rD)rZversion_stringrrrrsz
Version.localcCst|jjp|jj�S)N)�boolr.rr)rrrrr7!szVersion.is_prereleasecCst|jj�S)N)r`r.r)rrrrr8%szVersion.is_postreleaseN)rrr�re�compiler�VERBOSE�
IGNORECASErUr0r4r1r9r5r6rr7r8rrrrr�s
#
cCsx|rZ|dkrd}|j�}|dkr&d}n(|dkr4d}n|d
krBd	}n|dkrNd}|t|�fS|rt|rtd}|t|�fSdS)NrZalpha�aZbeta�br:rr<r>�rev�rr)r:rr<)rgrh)rKrQ)ZletterZnumberrrrrX*s 
rXz[\._-]cCs$|dk	r tdd�tj|�D��SdS)zR
    Takes a string like abc.1.twelve and turns it into ("abc", 1, "twelve").
    Ncss&|]}|j�s|j�nt|�VqdS)N)�isdigitrKrQ)rRrHrrrrTRsz'_parse_local_version.<locals>.<genexpr>)rO�_local_version_seperatorsrD)rrrrrYLsrYcCs�ttttjdd�t|�����}|dkr@|dkr@|dk	r@t}n|dkrLt}|dkrZt}|dkrft}|dkrvt}ntdd�|D��}||||||fS)NcSs|dkS)Nrr)r\rrrr!`sz_cmpkey.<locals>.<lambda>css*|]"}t|t�r|dfnt|fVqdS)r]N)r*rQr)rRrSrrrrT�sz_cmpkey.<locals>.<genexpr>)rO�reversed�list�	itertools�	dropwhiler)r
rrrrrrrrrZWs&		
rZ)Z
__future__rrr�collectionsrmraZ_structuresr�__all__�
namedtuplerr�
ValueErrorr
�objectrr	rbrcrCrErIr/rrrXrjrYrZrrrr�<module>s.!
9k
site-packages/pip/_vendor/packaging/_structures.py000064400000002610147511334570016401 0ustar00# This file is dual licensed under the terms of the Apache License, Version
# 2.0, and the BSD License. See the LICENSE file in the root of this repository
# for complete details.
from __future__ import absolute_import, division, print_function


class Infinity(object):

    def __repr__(self):
        return "Infinity"

    def __hash__(self):
        return hash(repr(self))

    def __lt__(self, other):
        return False

    def __le__(self, other):
        return False

    def __eq__(self, other):
        return isinstance(other, self.__class__)

    def __ne__(self, other):
        return not isinstance(other, self.__class__)

    def __gt__(self, other):
        return True

    def __ge__(self, other):
        return True

    def __neg__(self):
        return NegativeInfinity

Infinity = Infinity()


class NegativeInfinity(object):

    def __repr__(self):
        return "-Infinity"

    def __hash__(self):
        return hash(repr(self))

    def __lt__(self, other):
        return True

    def __le__(self, other):
        return True

    def __eq__(self, other):
        return isinstance(other, self.__class__)

    def __ne__(self, other):
        return not isinstance(other, self.__class__)

    def __gt__(self, other):
        return False

    def __ge__(self, other):
        return False

    def __neg__(self):
        return Infinity

NegativeInfinity = NegativeInfinity()
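

# Illustrative usage sketch (not part of the upstream file): these two
# sentinels compare as larger/smaller than any other object, which lets the
# version-comparison keys in this package order missing segments without
# special-casing them.
if __name__ == "__main__":
    # Infinity orders after every value it is compared with ...
    assert Infinity > 10 ** 9 and Infinity > "zzz"
    # ... and NegativeInfinity orders before every value.
    assert NegativeInfinity < -10 ** 9 and NegativeInfinity < ""
    # Negating one sentinel yields the other singleton instance.
    assert -Infinity is NegativeInfinity
    assert -NegativeInfinity is Infinity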
site-packages/pip/_vendor/packaging/__about__.py000064400000001320147511334600015714 0ustar00# This file is dual licensed under the terms of the Apache License, Version
# 2.0, and the BSD License. See the LICENSE file in the root of this repository
# for complete details.
from __future__ import absolute_import, division, print_function

__all__ = [
    "__title__", "__summary__", "__uri__", "__version__", "__author__",
    "__email__", "__license__", "__copyright__",
]

__title__ = "packaging"
__summary__ = "Core utilities for Python packages"
__uri__ = "https://github.com/pypa/packaging"

__version__ = "16.8"

__author__ = "Donald Stufft and individual contributors"
__email__ = "donald@stufft.io"

__license__ = "BSD or Apache License, Version 2.0"
__copyright__ = "Copyright 2014-2016 %s" % __author__
site-packages/pip/_vendor/packaging/utils.py000064400000000645147511334600015157 0ustar00# This file is dual licensed under the terms of the Apache License, Version
# 2.0, and the BSD License. See the LICENSE file in the root of this repository
# for complete details.
from __future__ import absolute_import, division, print_function

import re


_canonicalize_regex = re.compile(r"[-_.]+")


def canonicalize_name(name):
    # This is taken from PEP 503.
    return _canonicalize_regex.sub("-", name).lower()
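

# Illustrative usage sketch (not part of the upstream file): PEP 503
# canonicalization collapses every run of '-', '_' and '.' into a single '-'
# and lowercases the result, so differently spelled project names compare equal.
if __name__ == "__main__":
    assert canonicalize_name("Django") == "django"
    assert canonicalize_name("zope.interface") == "zope-interface"
    assert canonicalize_name("Foo__Bar..baz") == canonicalize_name("foo-bar-BAZ")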
site-packages/pip/_vendor/packaging/_compat.py000064400000001534147511334600015437 0ustar00# This file is dual licensed under the terms of the Apache License, Version
# 2.0, and the BSD License. See the LICENSE file in the root of this repository
# for complete details.
from __future__ import absolute_import, division, print_function

import sys


PY2 = sys.version_info[0] == 2
PY3 = sys.version_info[0] == 3

# flake8: noqa

if PY3:
    string_types = str,
else:
    string_types = basestring,


def with_metaclass(meta, *bases):
    """
    Create a base class with a metaclass.
    """
    # This requires a bit of explanation: the basic idea is to make a dummy
    # metaclass for one level of class instantiation that replaces itself with
    # the actual metaclass.
    class metaclass(meta):
        def __new__(cls, name, this_bases, d):
            return meta(name, bases, d)
    return type.__new__(metaclass, 'temporary_class', (), {})
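

# Illustrative usage sketch (not part of the upstream file): the throwaway
# metaclass above lets the same class statement request a real metaclass under
# both Python 2 and Python 3 syntax.
if __name__ == "__main__":
    import abc

    class Base(with_metaclass(abc.ABCMeta, object)):
        @abc.abstractmethod
        def run(self):
            """Subclasses must provide run()."""

    class Impl(Base):
        def run(self):
            return "ok"

    # The dummy 'temporary_class' has been replaced by a real ABCMeta class.
    assert type(Base) is abc.ABCMeta
    assert Impl().run() == "ok"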
site-packages/pip/_vendor/urllib3/poolmanager.py000064400000040664147511334600015760 0ustar00from __future__ import absolute_import
import collections
import functools
import logging

from ._collections import RecentlyUsedContainer
from .connectionpool import HTTPConnectionPool, HTTPSConnectionPool
from .connectionpool import port_by_scheme
from .exceptions import LocationValueError, MaxRetryError, ProxySchemeUnknown
from .packages.six.moves.urllib.parse import urljoin
from .request import RequestMethods
from .util.url import parse_url
from .util.retry import Retry


__all__ = ['PoolManager', 'ProxyManager', 'proxy_from_url']


log = logging.getLogger(__name__)

SSL_KEYWORDS = ('key_file', 'cert_file', 'cert_reqs', 'ca_certs',
                'ssl_version', 'ca_cert_dir', 'ssl_context')

# All known keyword arguments that could be provided to the pool manager, its
# pools, or the underlying connections. This is used to construct a pool key.
_key_fields = (
    'key_scheme',  # str
    'key_host',  # str
    'key_port',  # int
    'key_timeout',  # int or float or Timeout
    'key_retries',  # int or Retry
    'key_strict',  # bool
    'key_block',  # bool
    'key_source_address',  # str
    'key_key_file',  # str
    'key_cert_file',  # str
    'key_cert_reqs',  # str
    'key_ca_certs',  # str
    'key_ssl_version',  # str
    'key_ca_cert_dir',  # str
    'key_ssl_context',  # instance of ssl.SSLContext or urllib3.util.ssl_.SSLContext
    'key_maxsize',  # int
    'key_headers',  # dict
    'key__proxy',  # parsed proxy url
    'key__proxy_headers',  # dict
    'key_socket_options',  # list of (level (int), optname (int), value (int or str)) tuples
    'key__socks_options',  # dict
    'key_assert_hostname',  # bool or string
    'key_assert_fingerprint',  # str
)

#: The namedtuple class used to construct keys for the connection pool.
#: All custom key schemes should include the fields in this key at a minimum.
PoolKey = collections.namedtuple('PoolKey', _key_fields)


def _default_key_normalizer(key_class, request_context):
    """
    Create a pool key out of a request context dictionary.

    According to RFC 3986, both the scheme and host are case-insensitive.
    Therefore, this function normalizes both before constructing the pool
    key for an HTTPS request. If you wish to change this behaviour, provide
    alternate callables to ``key_fn_by_scheme``.

    :param key_class:
        The class to use when constructing the key. This should be a namedtuple
        with the ``scheme`` and ``host`` keys at a minimum.
    :type  key_class: namedtuple
    :param request_context:
        A dictionary-like object that contain the context for a request.
    :type  request_context: dict

    :return: A namedtuple that can be used as a connection pool key.
    :rtype:  PoolKey
    """
    # Since we mutate the dictionary, make a copy first
    context = request_context.copy()
    context['scheme'] = context['scheme'].lower()
    context['host'] = context['host'].lower()

    # These are both dictionaries and need to be transformed into frozensets
    for key in ('headers', '_proxy_headers', '_socks_options'):
        if key in context and context[key] is not None:
            context[key] = frozenset(context[key].items())

    # The socket_options key may be a list and needs to be transformed into a
    # tuple.
    socket_opts = context.get('socket_options')
    if socket_opts is not None:
        context['socket_options'] = tuple(socket_opts)

    # Map the kwargs to the names in the namedtuple - this is necessary since
    # namedtuples can't have fields starting with '_'.
    for key in list(context.keys()):
        context['key_' + key] = context.pop(key)

    # Default to ``None`` for keys missing from the context
    for field in key_class._fields:
        if field not in context:
            context[field] = None

    return key_class(**context)
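

# Rough illustration of what the normalizer above produces (hypothetical
# values, shown as a doctest-style sketch rather than executed code): scheme
# and host are lowercased, dict-valued options become frozensets, every name
# gains a 'key_' prefix, and fields missing from the context default to None.
#
#   >>> key = _default_key_normalizer(PoolKey, {
#   ...     'scheme': 'HTTPS', 'host': 'EXAMPLE.com', 'port': 443,
#   ...     'headers': {'User-Agent': 'demo'}})
#   >>> key.key_scheme, key.key_host, key.key_port
#   ('https', 'example.com', 443)
#   >>> key.key_headers
#   frozenset({('User-Agent', 'demo')})
#   >>> key.key_timeout is None
#   True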


#: A dictionary that maps a scheme to a callable that creates a pool key.
#: This can be used to alter the way pool keys are constructed, if desired.
#: Each PoolManager makes a copy of this dictionary so they can be configured
#: globally here, or individually on the instance.
key_fn_by_scheme = {
    'http': functools.partial(_default_key_normalizer, PoolKey),
    'https': functools.partial(_default_key_normalizer, PoolKey),
}
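
# Sketch of the customization hook described above (hypothetical callable,
# shown as a doctest-style sketch rather than executed code): a per-instance
# copy of this mapping can point a scheme at any callable that returns a
# hashable pool key.
#
#   >>> def host_only_key(request_context):
#   ...     return (request_context['scheme'], request_context['host'])
#   >>> manager = PoolManager()
#   >>> manager.key_fn_by_scheme['http'] = host_only_key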

pool_classes_by_scheme = {
    'http': HTTPConnectionPool,
    'https': HTTPSConnectionPool,
}


class PoolManager(RequestMethods):
    """
    Allows for arbitrary requests while transparently keeping track of
    necessary connection pools for you.

    :param num_pools:
        Number of connection pools to cache before discarding the least
        recently used pool.

    :param headers:
        Headers to include with all requests, unless other headers are given
        explicitly.

    :param \\**connection_pool_kw:
        Additional parameters are used to create fresh
        :class:`urllib3.connectionpool.ConnectionPool` instances.

    Example::

        >>> manager = PoolManager(num_pools=2)
        >>> r = manager.request('GET', 'http://google.com/')
        >>> r = manager.request('GET', 'http://google.com/mail')
        >>> r = manager.request('GET', 'http://yahoo.com/')
        >>> len(manager.pools)
        2

    """

    proxy = None

    def __init__(self, num_pools=10, headers=None, **connection_pool_kw):
        RequestMethods.__init__(self, headers)
        self.connection_pool_kw = connection_pool_kw
        self.pools = RecentlyUsedContainer(num_pools,
                                           dispose_func=lambda p: p.close())

        # Locally set the pool classes and keys so other PoolManagers can
        # override them.
        self.pool_classes_by_scheme = pool_classes_by_scheme
        self.key_fn_by_scheme = key_fn_by_scheme.copy()

    def __enter__(self):
        return self

    def __exit__(self, exc_type, exc_val, exc_tb):
        self.clear()
        # Return False to re-raise any potential exceptions
        return False

    def _new_pool(self, scheme, host, port, request_context=None):
        """
        Create a new :class:`ConnectionPool` based on host, port, scheme, and
        any additional pool keyword arguments.

        If ``request_context`` is provided, it is provided as keyword arguments
        to the pool class used. This method is used to actually create the
        connection pools handed out by :meth:`connection_from_url` and
        companion methods. It is intended to be overridden for customization.
        """
        pool_cls = self.pool_classes_by_scheme[scheme]
        if request_context is None:
            request_context = self.connection_pool_kw.copy()

        # Although the context has everything necessary to create the pool,
        # this function has historically only used the scheme, host, and port
        # in the positional args. When an API change is acceptable these can
        # be removed.
        for key in ('scheme', 'host', 'port'):
            request_context.pop(key, None)

        if scheme == 'http':
            for kw in SSL_KEYWORDS:
                request_context.pop(kw, None)

        return pool_cls(host, port, **request_context)
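
    # Sketch of the overriding pattern this docstring allows for (hypothetical
    # subclass, shown as a doctest-style sketch rather than executed code):
    #
    #   >>> class LoggingPoolManager(PoolManager):
    #   ...     def _new_pool(self, scheme, host, port, request_context=None):
    #   ...         log.debug('new pool: %s://%s:%s', scheme, host, port)
    #   ...         return super(LoggingPoolManager, self)._new_pool(
    #   ...             scheme, host, port, request_context=request_context)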

    def clear(self):
        """
        Empty our store of pools and direct them all to close.

        This will not affect in-flight connections, but they will not be
        re-used after completion.
        """
        self.pools.clear()

    def connection_from_host(self, host, port=None, scheme='http', pool_kwargs=None):
        """
        Get a :class:`ConnectionPool` based on the host, port, and scheme.

        If ``port`` isn't given, it will be derived from the ``scheme`` using
        ``urllib3.connectionpool.port_by_scheme``. If ``pool_kwargs`` is
        provided, it is merged with the instance's ``connection_pool_kw``
        variable and used to create the new connection pool, if one is
        needed.
        """

        if not host:
            raise LocationValueError("No host specified.")

        request_context = self._merge_pool_kwargs(pool_kwargs)
        request_context['scheme'] = scheme or 'http'
        if not port:
            port = port_by_scheme.get(request_context['scheme'].lower(), 80)
        request_context['port'] = port
        request_context['host'] = host

        return self.connection_from_context(request_context)

    def connection_from_context(self, request_context):
        """
        Get a :class:`ConnectionPool` based on the request context.

        ``request_context`` must at least contain the ``scheme`` key and its
        value must be a key in ``key_fn_by_scheme`` instance variable.
        """
        scheme = request_context['scheme'].lower()
        pool_key_constructor = self.key_fn_by_scheme[scheme]
        pool_key = pool_key_constructor(request_context)

        return self.connection_from_pool_key(pool_key, request_context=request_context)

    def connection_from_pool_key(self, pool_key, request_context=None):
        """
        Get a :class:`ConnectionPool` based on the provided pool key.

        ``pool_key`` should be a namedtuple that only contains immutable
        objects. At a minimum it must have the ``scheme``, ``host``, and
        ``port`` fields.
        """
        with self.pools.lock:
            # If the scheme, host, or port doesn't match existing open
            # connections, open a new ConnectionPool.
            pool = self.pools.get(pool_key)
            if pool:
                return pool

            # Make a fresh ConnectionPool of the desired type
            scheme = request_context['scheme']
            host = request_context['host']
            port = request_context['port']
            pool = self._new_pool(scheme, host, port, request_context=request_context)
            self.pools[pool_key] = pool

        return pool

    def connection_from_url(self, url, pool_kwargs=None):
        """
        Similar to :func:`urllib3.connectionpool.connection_from_url`.

        If ``pool_kwargs`` is not provided and a new pool needs to be
        constructed, ``self.connection_pool_kw`` is used to initialize
        the :class:`urllib3.connectionpool.ConnectionPool`. If ``pool_kwargs``
        is provided, it is used instead. Note that if a new pool does not
        need to be created for the request, the provided ``pool_kwargs`` are
        not used.
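
        A sketch (URL and keyword values illustrative)::

            >>> manager = PoolManager()
            >>> pool = manager.connection_from_url('https://example.com/path',
            ...                                    pool_kwargs={'maxsize': 25})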
        """
        u = parse_url(url)
        return self.connection_from_host(u.host, port=u.port, scheme=u.scheme,
                                         pool_kwargs=pool_kwargs)

    def _merge_pool_kwargs(self, override):
        """
        Merge a dictionary of override values for self.connection_pool_kw.

        This does not modify self.connection_pool_kw and returns a new dict.
        Any keys in the override dictionary with a value of ``None`` are
        removed from the merged dictionary.
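
        A sketch of the merge behaviour (keyword names illustrative)::

            >>> manager = PoolManager(maxsize=5, block=True)
            >>> manager._merge_pool_kwargs({'maxsize': 10, 'block': None})
            {'maxsize': 10}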
        """
        base_pool_kwargs = self.connection_pool_kw.copy()
        if override:
            for key, value in override.items():
                if value is None:
                    try:
                        del base_pool_kwargs[key]
                    except KeyError:
                        pass
                else:
                    base_pool_kwargs[key] = value
        return base_pool_kwargs

    def urlopen(self, method, url, redirect=True, **kw):
        """
        Same as :meth:`urllib3.connectionpool.HTTPConnectionPool.urlopen`
        with custom cross-host redirect logic and only sends the request-uri
        portion of the ``url``.

        The given ``url`` parameter must be absolute, such that an appropriate
        :class:`urllib3.connectionpool.ConnectionPool` can be chosen for it.
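
        A minimal sketch (URL and status illustrative; redirects are followed
        by default)::

            >>> manager = PoolManager()
            >>> response = manager.urlopen('GET', 'http://example.com/')
            >>> response.status
            200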
        """
        u = parse_url(url)
        conn = self.connection_from_host(u.host, port=u.port, scheme=u.scheme)

        kw['assert_same_host'] = False
        kw['redirect'] = False

        if 'headers' not in kw:
            kw['headers'] = self.headers.copy()

        if self.proxy is not None and u.scheme == "http":
            response = conn.urlopen(method, url, **kw)
        else:
            response = conn.urlopen(method, u.request_uri, **kw)

        redirect_location = redirect and response.get_redirect_location()
        if not redirect_location:
            return response

        # Support relative URLs for redirecting.
        redirect_location = urljoin(url, redirect_location)

        # RFC 7231, Section 6.4.4
        if response.status == 303:
            method = 'GET'

        retries = kw.get('retries')
        if not isinstance(retries, Retry):
            retries = Retry.from_int(retries, redirect=redirect)

        # Strip headers marked as unsafe to forward to the redirected location.
        # Check remove_headers_on_redirect to avoid a potential network call within
        # conn.is_same_host() which may use socket.gethostbyname() in the future.
        if (retries.remove_headers_on_redirect
                and not conn.is_same_host(redirect_location)):
            for header in retries.remove_headers_on_redirect:
                kw['headers'].pop(header, None)

        try:
            retries = retries.increment(method, url, response=response, _pool=conn)
        except MaxRetryError:
            if retries.raise_on_redirect:
                raise
            return response

        kw['retries'] = retries
        kw['redirect'] = redirect

        log.info("Redirecting %s -> %s", url, redirect_location)
        return self.urlopen(method, redirect_location, **kw)


class ProxyManager(PoolManager):
    """
    Behaves just like :class:`PoolManager`, but sends all requests through
    the defined proxy, using the CONNECT method for HTTPS URLs.

    :param proxy_url:
        The URL of the proxy to be used.

    :param proxy_headers:
        A dictionary containing headers that will be sent to the proxy. In the
        HTTP case they are sent with each request, while in the HTTPS/CONNECT
        case they are sent only once. Could be used for proxy authentication.

    Example:
        >>> proxy = urllib3.ProxyManager('http://localhost:3128/')
        >>> r1 = proxy.request('GET', 'http://google.com/')
        >>> r2 = proxy.request('GET', 'http://httpbin.org/')
        >>> len(proxy.pools)
        1
        >>> r3 = proxy.request('GET', 'https://httpbin.org/')
        >>> r4 = proxy.request('GET', 'https://twitter.com/')
        >>> len(proxy.pools)
        3

    """

    def __init__(self, proxy_url, num_pools=10, headers=None,
                 proxy_headers=None, **connection_pool_kw):

        if isinstance(proxy_url, HTTPConnectionPool):
            proxy_url = '%s://%s:%i' % (proxy_url.scheme, proxy_url.host,
                                        proxy_url.port)
        proxy = parse_url(proxy_url)
        if not proxy.port:
            port = port_by_scheme.get(proxy.scheme, 80)
            proxy = proxy._replace(port=port)

        if proxy.scheme not in ("http", "https"):
            raise ProxySchemeUnknown(proxy.scheme)

        self.proxy = proxy
        self.proxy_headers = proxy_headers or {}

        connection_pool_kw['_proxy'] = self.proxy
        connection_pool_kw['_proxy_headers'] = self.proxy_headers

        super(ProxyManager, self).__init__(
            num_pools, headers, **connection_pool_kw)

    def connection_from_host(self, host, port=None, scheme='http', pool_kwargs=None):
        if scheme == "https":
            return super(ProxyManager, self).connection_from_host(
                host, port, scheme, pool_kwargs=pool_kwargs)

        return super(ProxyManager, self).connection_from_host(
            self.proxy.host, self.proxy.port, self.proxy.scheme, pool_kwargs=pool_kwargs)

    def _set_proxy_headers(self, url, headers=None):
        """
        Sets headers needed by proxies: specifically, the Accept and Host
        headers. Only sets headers not provided by the user.
        """
        headers_ = {'Accept': '*/*'}

        netloc = parse_url(url).netloc
        if netloc:
            headers_['Host'] = netloc

        if headers:
            headers_.update(headers)
        return headers_

    def urlopen(self, method, url, redirect=True, **kw):
        "Same as HTTP(S)ConnectionPool.urlopen, ``url`` must be absolute."
        u = parse_url(url)

        if u.scheme == "http":
            # For proxied HTTPS requests, httplib sets the necessary headers
            # on the CONNECT to the proxy. For HTTP, we'll definitely
            # need to set 'Host' at the very least.
            headers = kw.get('headers', self.headers)
            kw['headers'] = self._set_proxy_headers(url, headers)

        return super(ProxyManager, self).urlopen(method, url, redirect=redirect, **kw)


def proxy_from_url(url, **kw):
    return ProxyManager(proxy_url=url, **kw)
# site-packages/pip/_vendor/urllib3/_collections.py
from __future__ import absolute_import
from collections import Mapping, MutableMapping
try:
    from threading import RLock
except ImportError:  # Platform-specific: No threads available
    class RLock:
        def __enter__(self):
            pass

        def __exit__(self, exc_type, exc_value, traceback):
            pass


try:  # Python 2.7+
    from collections import OrderedDict
except ImportError:
    from .packages.ordered_dict import OrderedDict
from .packages.six import iterkeys, itervalues, PY3


__all__ = ['RecentlyUsedContainer', 'HTTPHeaderDict']


_Null = object()


class RecentlyUsedContainer(MutableMapping):
    """
    Provides a thread-safe dict-like container which maintains up to
    ``maxsize`` keys while throwing away the least-recently-used keys beyond
    ``maxsize``.

    :param maxsize:
        Maximum number of recent elements to retain.

    :param dispose_func:
        Every time an item is evicted from the container,
        ``dispose_func(value)`` is called.
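
    A sketch of the eviction behaviour (keys and values illustrative)::

        >>> evicted = []
        >>> container = RecentlyUsedContainer(maxsize=2, dispose_func=evicted.append)
        >>> container['a'] = 1
        >>> container['b'] = 2
        >>> container['c'] = 3   # 'a' is the least recently used, so it is dropped
        >>> evicted
        [1]
        >>> container.keys()
        ['b', 'c']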
    """

    ContainerCls = OrderedDict

    def __init__(self, maxsize=10, dispose_func=None):
        self._maxsize = maxsize
        self.dispose_func = dispose_func

        self._container = self.ContainerCls()
        self.lock = RLock()

    def __getitem__(self, key):
        # Re-insert the item, moving it to the end of the eviction line.
        with self.lock:
            item = self._container.pop(key)
            self._container[key] = item
            return item

    def __setitem__(self, key, value):
        evicted_value = _Null
        with self.lock:
            # Possibly evict the existing value of 'key'
            evicted_value = self._container.get(key, _Null)
            self._container[key] = value

            # If we didn't evict an existing value, we might have to evict the
            # least recently used item from the beginning of the container.
            if len(self._container) > self._maxsize:
                _key, evicted_value = self._container.popitem(last=False)

        if self.dispose_func and evicted_value is not _Null:
            self.dispose_func(evicted_value)

    def __delitem__(self, key):
        with self.lock:
            value = self._container.pop(key)

        if self.dispose_func:
            self.dispose_func(value)

    def __len__(self):
        with self.lock:
            return len(self._container)

    def __iter__(self):
        raise NotImplementedError('Iteration over this class is unlikely to be threadsafe.')

    def clear(self):
        with self.lock:
            # Copy pointers to all values, then wipe the mapping
            values = list(itervalues(self._container))
            self._container.clear()

        if self.dispose_func:
            for value in values:
                self.dispose_func(value)

    def keys(self):
        with self.lock:
            return list(iterkeys(self._container))


class HTTPHeaderDict(MutableMapping):
    """
    :param headers:
        An iterable of field-value pairs. Must not contain multiple field names
        when compared case-insensitively.

    :param kwargs:
        Additional field-value pairs to pass in to ``dict.update``.

    A ``dict`` like container for storing HTTP Headers.

    Field names are stored and compared case-insensitively in compliance with
    RFC 7230. Iteration provides the first case-sensitive key seen for each
    case-insensitive pair.

    Using ``__setitem__`` syntax overwrites fields that compare equal
    case-insensitively in order to maintain ``dict``'s api. For fields that
    compare equal, instead create a new ``HTTPHeaderDict`` and use ``.add``
    in a loop.

    If multiple fields that are equal case-insensitively are passed to the
    constructor or ``.update``, the behavior is undefined and some will be
    lost.

    >>> headers = HTTPHeaderDict()
    >>> headers.add('Set-Cookie', 'foo=bar')
    >>> headers.add('set-cookie', 'baz=quxx')
    >>> headers['content-length'] = '7'
    >>> headers['SET-cookie']
    'foo=bar, baz=quxx'
    >>> headers['Content-Length']
    '7'
    """

    def __init__(self, headers=None, **kwargs):
        super(HTTPHeaderDict, self).__init__()
        self._container = OrderedDict()
        if headers is not None:
            if isinstance(headers, HTTPHeaderDict):
                self._copy_from(headers)
            else:
                self.extend(headers)
        if kwargs:
            self.extend(kwargs)

    def __setitem__(self, key, val):
        self._container[key.lower()] = [key, val]
        return self._container[key.lower()]

    def __getitem__(self, key):
        val = self._container[key.lower()]
        return ', '.join(val[1:])

    def __delitem__(self, key):
        del self._container[key.lower()]

    def __contains__(self, key):
        return key.lower() in self._container

    def __eq__(self, other):
        if not isinstance(other, Mapping) and not hasattr(other, 'keys'):
            return False
        if not isinstance(other, type(self)):
            other = type(self)(other)
        return (dict((k.lower(), v) for k, v in self.itermerged()) ==
                dict((k.lower(), v) for k, v in other.itermerged()))

    def __ne__(self, other):
        return not self.__eq__(other)

    if not PY3:  # Python 2
        iterkeys = MutableMapping.iterkeys
        itervalues = MutableMapping.itervalues

    __marker = object()

    def __len__(self):
        return len(self._container)

    def __iter__(self):
        # Only provide the originally cased names
        for vals in self._container.values():
            yield vals[0]

    def pop(self, key, default=__marker):
        '''D.pop(k[,d]) -> v, remove specified key and return the corresponding value.
          If key is not found, d is returned if given, otherwise KeyError is raised.
        '''
        # Using the MutableMapping function directly fails due to the private marker.
        # Using ordinary dict.pop would expose the internal structures.
        # So let's reinvent the wheel.
        try:
            value = self[key]
        except KeyError:
            if default is self.__marker:
                raise
            return default
        else:
            del self[key]
            return value

    def discard(self, key):
        try:
            del self[key]
        except KeyError:
            pass

    def add(self, key, val):
        """Adds a (name, value) pair, doesn't overwrite the value if it already
        exists.

        >>> headers = HTTPHeaderDict(foo='bar')
        >>> headers.add('Foo', 'baz')
        >>> headers['foo']
        'bar, baz'
        """
        key_lower = key.lower()
        new_vals = [key, val]
        # Keep the common case aka no item present as fast as possible
        vals = self._container.setdefault(key_lower, new_vals)
        if new_vals is not vals:
            vals.append(val)

    def extend(self, *args, **kwargs):
        """Generic import function for any type of header-like object.
        Adapted version of MutableMapping.update in order to insert items
        with self.add instead of self.__setitem__
        """
        if len(args) > 1:
            raise TypeError("extend() takes at most 1 positional "
                            "argument ({0} given)".format(len(args)))
        other = args[0] if len(args) >= 1 else ()

        if isinstance(other, HTTPHeaderDict):
            for key, val in other.iteritems():
                self.add(key, val)
        elif isinstance(other, Mapping):
            for key in other:
                self.add(key, other[key])
        elif hasattr(other, "keys"):
            for key in other.keys():
                self.add(key, other[key])
        else:
            for key, value in other:
                self.add(key, value)

        for key, value in kwargs.items():
            self.add(key, value)

    def getlist(self, key, default=__marker):
        """Returns a list of all the values for the named field. Returns an
        empty list if the key doesn't exist."""
        try:
            vals = self._container[key.lower()]
        except KeyError:
            if default is self.__marker:
                return []
            return default
        else:
            return vals[1:]

    # Backwards compatibility for httplib
    getheaders = getlist
    getallmatchingheaders = getlist
    iget = getlist

    # Backwards compatibility for http.cookiejar
    get_all = getlist

    def __repr__(self):
        return "%s(%s)" % (type(self).__name__, dict(self.itermerged()))

    def _copy_from(self, other):
        for key in other:
            val = other.getlist(key)
            if isinstance(val, list):
                # Don't need to convert tuples
                val = list(val)
            self._container[key.lower()] = [key] + val

    def copy(self):
        clone = type(self)()
        clone._copy_from(self)
        return clone

    def iteritems(self):
        """Iterate over all header lines, including duplicate ones."""
        for key in self:
            vals = self._container[key.lower()]
            for val in vals[1:]:
                yield vals[0], val

    def itermerged(self):
        """Iterate over all headers, merging duplicate ones together."""
        for key in self:
            val = self._container[key.lower()]
            yield val[0], ', '.join(val[1:])

    def items(self):
        return list(self.iteritems())

    @classmethod
    def from_httplib(cls, message):  # Python 2
        """Read headers from a Python 2 httplib message object."""
        # python2.7 does not expose a proper API for exporting multiheaders
        # efficiently. This function re-reads raw lines from the message
        # object and extracts the multiheaders properly.
        headers = []

        for line in message.headers:
            if line.startswith((' ', '\t')):
                key, value = headers[-1]
                headers[-1] = (key, value + '\r\n' + line.rstrip())
                continue

            key, value = line.split(':', 1)
            headers.append((key, value.strip()))

        return cls(headers)
# site-packages/pip/_vendor/urllib3/__init__.py
"""
urllib3 - Thread-safe connection pooling and re-using.
"""

from __future__ import absolute_import
import warnings

from .connectionpool import (
    HTTPConnectionPool,
    HTTPSConnectionPool,
    connection_from_url
)

from . import exceptions
from .filepost import encode_multipart_formdata
from .poolmanager import PoolManager, ProxyManager, proxy_from_url
from .response import HTTPResponse
from .util.request import make_headers
from .util.url import get_host
from .util.timeout import Timeout
from .util.retry import Retry


# Set default logging handler to avoid "No handler found" warnings.
import logging
try:  # Python 2.7+
    from logging import NullHandler
except ImportError:
    class NullHandler(logging.Handler):
        def emit(self, record):
            pass

__author__ = 'Andrey Petrov (andrey.petrov@shazow.net)'
__license__ = 'MIT'
__version__ = '1.22'

__all__ = (
    'HTTPConnectionPool',
    'HTTPSConnectionPool',
    'PoolManager',
    'ProxyManager',
    'HTTPResponse',
    'Retry',
    'Timeout',
    'add_stderr_logger',
    'connection_from_url',
    'disable_warnings',
    'encode_multipart_formdata',
    'get_host',
    'make_headers',
    'proxy_from_url',
)

logging.getLogger(__name__).addHandler(NullHandler())


def add_stderr_logger(level=logging.DEBUG):
    """
    Helper for quickly adding a StreamHandler to the logger. Useful for
    debugging.

    Returns the handler after adding it.
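
    A usage sketch::

        >>> import urllib3
        >>> handler = urllib3.add_stderr_logger()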
    """
    # This method needs to be in this __init__.py to get the __name__ correct
    # even if urllib3 is vendored within another package.
    logger = logging.getLogger(__name__)
    handler = logging.StreamHandler()
    handler.setFormatter(logging.Formatter('%(asctime)s %(levelname)s %(message)s'))
    logger.addHandler(handler)
    logger.setLevel(level)
    logger.debug('Added a stderr logging handler to logger: %s', __name__)
    return handler


# ... Clean up.
del NullHandler


# All warning filters *must* be appended unless you're really certain that they
# shouldn't be: otherwise, it's very hard for users to use most Python
# mechanisms to silence them.
# SecurityWarning's always go off by default.
warnings.simplefilter('always', exceptions.SecurityWarning, append=True)
# SubjectAltNameWarning's should go off once per host
warnings.simplefilter('default', exceptions.SubjectAltNameWarning, append=True)
# InsecurePlatformWarning's don't vary between requests, so we keep it default.
warnings.simplefilter('default', exceptions.InsecurePlatformWarning,
                      append=True)
# SNIMissingWarnings should go off only once.
warnings.simplefilter('default', exceptions.SNIMissingWarning, append=True)


def disable_warnings(category=exceptions.HTTPWarning):
    """
    Helper for quickly disabling all urllib3 warnings.
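
    For example, to silence only one category (a sketch; any
    :class:`~urllib3.exceptions.HTTPWarning` subclass can be passed)::

        >>> import urllib3
        >>> urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning)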
    """
    warnings.simplefilter('ignore', category)
# site-packages/pip/_vendor/urllib3/connection.py
from __future__ import absolute_import
import datetime
import logging
import os
import sys
import socket
from socket import error as SocketError, timeout as SocketTimeout
import warnings
from .packages import six
from .packages.six.moves.http_client import HTTPConnection as _HTTPConnection
from .packages.six.moves.http_client import HTTPException  # noqa: F401

try:  # Compiled with SSL?
    import ssl
    BaseSSLError = ssl.SSLError
except (ImportError, AttributeError):  # Platform-specific: No SSL.
    ssl = None

    class BaseSSLError(BaseException):
        pass


try:  # Python 3:
    # Not a no-op, we're adding this to the namespace so it can be imported.
    ConnectionError = ConnectionError
except NameError:  # Python 2:
    class ConnectionError(Exception):
        pass


from .exceptions import (
    NewConnectionError,
    ConnectTimeoutError,
    SubjectAltNameWarning,
    SystemTimeWarning,
)
from .packages.ssl_match_hostname import match_hostname, CertificateError

from .util.ssl_ import (
    resolve_cert_reqs,
    resolve_ssl_version,
    assert_fingerprint,
    create_urllib3_context,
    ssl_wrap_socket
)


from .util import connection

from ._collections import HTTPHeaderDict

log = logging.getLogger(__name__)

port_by_scheme = {
    'http': 80,
    'https': 443,
}

# When updating RECENT_DATE, move it to
# within two years of the current date, and no
# earlier than 6 months ago.
RECENT_DATE = datetime.date(2016, 1, 1)


class DummyConnection(object):
    """Used to detect a failed ConnectionCls import."""
    pass


class HTTPConnection(_HTTPConnection, object):
    """
    Based on httplib.HTTPConnection but provides an extra constructor
    backwards-compatibility layer between older and newer Pythons.

    Additional keyword parameters are used to configure attributes of the connection.
    Accepted parameters include:

      - ``strict``: See the documentation on :class:`urllib3.connectionpool.HTTPConnectionPool`
      - ``source_address``: Set the source address for the current connection.

        .. note:: This is ignored for Python 2.6. It is only applied for 2.7 and 3.x

      - ``socket_options``: Set specific options on the underlying socket. If not specified, then
        defaults are loaded from ``HTTPConnection.default_socket_options`` which includes disabling
        Nagle's algorithm (sets TCP_NODELAY to 1) unless the connection is behind a proxy.

        For example, if you wish to enable TCP Keep Alive in addition to the defaults,
        you might pass::

            HTTPConnection.default_socket_options + [
                (socket.SOL_SOCKET, socket.SO_KEEPALIVE, 1),
            ]

        Or you may want to disable the defaults by passing an empty list (e.g., ``[]``).
    """

    default_port = port_by_scheme['http']

    #: Disable Nagle's algorithm by default.
    #: ``[(socket.IPPROTO_TCP, socket.TCP_NODELAY, 1)]``
    default_socket_options = [(socket.IPPROTO_TCP, socket.TCP_NODELAY, 1)]

    #: Whether this connection verifies the host's certificate.
    is_verified = False

    def __init__(self, *args, **kw):
        if six.PY3:  # Python 3
            kw.pop('strict', None)

        # Pre-set source_address in case we have an older Python like 2.6.
        self.source_address = kw.get('source_address')

        if sys.version_info < (2, 7):  # Python 2.6
            # _HTTPConnection on Python 2.6 will balk at this keyword arg, but
            # not newer versions. We can still use it when creating a
            # connection though, so we pop it *after* we have saved it as
            # self.source_address.
            kw.pop('source_address', None)

        #: The socket options provided by the user. If no options are
        #: provided, we use the default options.
        self.socket_options = kw.pop('socket_options', self.default_socket_options)

        # Superclass also sets self.source_address in Python 2.7+.
        _HTTPConnection.__init__(self, *args, **kw)

    def _new_conn(self):
        """ Establish a socket connection and set nodelay settings on it.

        :return: New socket connection.
        """
        extra_kw = {}
        if self.source_address:
            extra_kw['source_address'] = self.source_address

        if self.socket_options:
            extra_kw['socket_options'] = self.socket_options

        try:
            conn = connection.create_connection(
                (self.host, self.port), self.timeout, **extra_kw)

        except SocketTimeout as e:
            raise ConnectTimeoutError(
                self, "Connection to %s timed out. (connect timeout=%s)" %
                (self.host, self.timeout))

        except SocketError as e:
            raise NewConnectionError(
                self, "Failed to establish a new connection: %s" % e)

        return conn

    def _prepare_conn(self, conn):
        self.sock = conn
        # the _tunnel_host attribute was added in python 2.6.3 (via
        # http://hg.python.org/cpython/rev/0f57b30a152f) so pythons 2.6(0-2) do
        # not have it.
        if getattr(self, '_tunnel_host', None):
            # TODO: Fix tunnel so it doesn't depend on self.sock state.
            self._tunnel()
            # Mark this connection as not reusable
            self.auto_open = 0

    def connect(self):
        conn = self._new_conn()
        self._prepare_conn(conn)

    def request_chunked(self, method, url, body=None, headers=None):
        """
        Alternative to the common request method, which sends the
        body with chunked transfer encoding rather than as one block.
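
        A usage sketch (host, path, and body illustrative)::

            >>> conn = HTTPConnection('example.com', 80)
            >>> conn.request_chunked('POST', '/upload',
            ...                      body=(b'first chunk', b'second chunk'))
            >>> response = conn.getresponse()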
        """
        headers = HTTPHeaderDict(headers if headers is not None else {})
        skip_accept_encoding = 'accept-encoding' in headers
        skip_host = 'host' in headers
        self.putrequest(
            method,
            url,
            skip_accept_encoding=skip_accept_encoding,
            skip_host=skip_host
        )
        for header, value in headers.items():
            self.putheader(header, value)
        if 'transfer-encoding' not in headers:
            self.putheader('Transfer-Encoding', 'chunked')
        self.endheaders()

        if body is not None:
            stringish_types = six.string_types + (six.binary_type,)
            if isinstance(body, stringish_types):
                body = (body,)
            for chunk in body:
                if not chunk:
                    continue
                if not isinstance(chunk, six.binary_type):
                    chunk = chunk.encode('utf8')
                len_str = hex(len(chunk))[2:]
                self.send(len_str.encode('utf-8'))
                self.send(b'\r\n')
                self.send(chunk)
                self.send(b'\r\n')

        # After the if clause, to always have a closed body
        self.send(b'0\r\n\r\n')


class HTTPSConnection(HTTPConnection):
    default_port = port_by_scheme['https']

    ssl_version = None

    def __init__(self, host, port=None, key_file=None, cert_file=None,
                 strict=None, timeout=socket._GLOBAL_DEFAULT_TIMEOUT,
                 ssl_context=None, **kw):

        HTTPConnection.__init__(self, host, port, strict=strict,
                                timeout=timeout, **kw)

        self.key_file = key_file
        self.cert_file = cert_file
        self.ssl_context = ssl_context

        # Required property for Google AppEngine 1.9.0 which otherwise causes
        # HTTPS requests to go out as HTTP. (See Issue #356)
        self._protocol = 'https'

    def connect(self):
        conn = self._new_conn()
        self._prepare_conn(conn)

        if self.ssl_context is None:
            self.ssl_context = create_urllib3_context(
                ssl_version=resolve_ssl_version(None),
                cert_reqs=resolve_cert_reqs(None),
            )

        self.sock = ssl_wrap_socket(
            sock=conn,
            keyfile=self.key_file,
            certfile=self.cert_file,
            ssl_context=self.ssl_context,
        )


class VerifiedHTTPSConnection(HTTPSConnection):
    """
    Based on httplib.HTTPSConnection but wraps the socket with
    SSL certification.
    """
    cert_reqs = None
    ca_certs = None
    ca_cert_dir = None
    ssl_version = None
    assert_fingerprint = None

    def set_cert(self, key_file=None, cert_file=None,
                 cert_reqs=None, ca_certs=None,
                 assert_hostname=None, assert_fingerprint=None,
                 ca_cert_dir=None):
        """
        This method should only be called once, before the connection is used.
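
        A configuration sketch (paths and hostname illustrative)::

            >>> conn = VerifiedHTTPSConnection('example.com', 443)
            >>> conn.set_cert(ca_certs='/path/to/ca_bundle.pem',
            ...               assert_hostname='example.com')
            >>> conn.connect()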
        """
        # If cert_reqs is not provided, we can try to guess. If the user gave
        # us a cert database, we assume they want to use it: otherwise, if
        # they gave us an SSL Context object we should use whatever is set for
        # it.
        if cert_reqs is None:
            if ca_certs or ca_cert_dir:
                cert_reqs = 'CERT_REQUIRED'
            elif self.ssl_context is not None:
                cert_reqs = self.ssl_context.verify_mode

        self.key_file = key_file
        self.cert_file = cert_file
        self.cert_reqs = cert_reqs
        self.assert_hostname = assert_hostname
        self.assert_fingerprint = assert_fingerprint
        self.ca_certs = ca_certs and os.path.expanduser(ca_certs)
        self.ca_cert_dir = ca_cert_dir and os.path.expanduser(ca_cert_dir)

    def connect(self):
        # Add certificate verification
        conn = self._new_conn()

        hostname = self.host
        if getattr(self, '_tunnel_host', None):
            # _tunnel_host was added in Python 2.6.3
            # (See: http://hg.python.org/cpython/rev/0f57b30a152f)

            self.sock = conn
            # Calls self._set_hostport(), so self.host is
            # self._tunnel_host below.
            self._tunnel()
            # Mark this connection as not reusable
            self.auto_open = 0

            # Override the host with the one we're requesting data from.
            hostname = self._tunnel_host

        is_time_off = datetime.date.today() < RECENT_DATE
        if is_time_off:
            warnings.warn((
                'System time is way off (before {0}). This will probably '
                'lead to SSL verification errors').format(RECENT_DATE),
                SystemTimeWarning
            )

        # Wrap socket using verification with the root certs in
        # trusted_root_certs
        if self.ssl_context is None:
            self.ssl_context = create_urllib3_context(
                ssl_version=resolve_ssl_version(self.ssl_version),
                cert_reqs=resolve_cert_reqs(self.cert_reqs),
            )

        context = self.ssl_context
        context.verify_mode = resolve_cert_reqs(self.cert_reqs)
        self.sock = ssl_wrap_socket(
            sock=conn,
            keyfile=self.key_file,
            certfile=self.cert_file,
            ca_certs=self.ca_certs,
            ca_cert_dir=self.ca_cert_dir,
            server_hostname=hostname,
            ssl_context=context)

        if self.assert_fingerprint:
            assert_fingerprint(self.sock.getpeercert(binary_form=True),
                               self.assert_fingerprint)
        elif context.verify_mode != ssl.CERT_NONE \
                and not getattr(context, 'check_hostname', False) \
                and self.assert_hostname is not False:
            # While urllib3 attempts to always turn off hostname matching from
            # the TLS library, this cannot always be done. So we check whether
            # the TLS Library still thinks it's matching hostnames.
            cert = self.sock.getpeercert()
            if not cert.get('subjectAltName', ()):
                warnings.warn((
                    'Certificate for {0} has no `subjectAltName`, falling back to check for a '
                    '`commonName` for now. This feature is being removed by major browsers and '
                    'deprecated by RFC 2818. (See https://github.com/shazow/urllib3/issues/497 '
                    'for details.)'.format(hostname)),
                    SubjectAltNameWarning
                )
            _match_hostname(cert, self.assert_hostname or hostname)

        self.is_verified = (
            context.verify_mode == ssl.CERT_REQUIRED or
            self.assert_fingerprint is not None
        )


def _match_hostname(cert, asserted_hostname):
    try:
        match_hostname(cert, asserted_hostname)
    except CertificateError as e:
        log.error(
            'Certificate did not match expected hostname: %s. '
            'Certificate: %s', asserted_hostname, cert
        )
        # Add cert to exception and reraise so client code can inspect
        # the cert when catching the exception, if they want to
        e._peer_cert = cert
        raise


if ssl:
    # Make a copy for testing.
    UnverifiedHTTPSConnection = HTTPSConnection
    HTTPSConnection = VerifiedHTTPSConnection
else:
    HTTPSConnection = DummyConnection
# site-packages/pip/_vendor/urllib3/fields.py
from __future__ import absolute_import
import email.utils
import mimetypes

from .packages import six


def guess_content_type(filename, default='application/octet-stream'):
    """
    Guess the "Content-Type" of a file.

    :param filename:
        The filename to guess the "Content-Type" of using :mod:`mimetypes`.
    :param default:
        If no "Content-Type" can be guessed, default to `default`.
    """
    if filename:
        return mimetypes.guess_type(filename)[0] or default
    return default


def format_header_param(name, value):
    """
    Helper function to format and quote a single header parameter.

    Particularly useful for header parameters which might contain
    non-ASCII values, like file names. This follows RFC 2231, as
    suggested by RFC 2388 Section 4.4.

    :param name:
        The name of the parameter, a string expected to be ASCII only.
    :param value:
        The value of the parameter, provided as a unicode string.
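
    A sketch of the simple (ASCII) case, with an illustrative value::

        >>> format_header_param('filename', 'report.pdf')
        'filename="report.pdf"'

    Values that cannot be represented in ASCII fall through to the
    RFC 2231 ``name*=...`` form.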
    """
    if not any(ch in value for ch in '"\\\r\n'):
        result = '%s="%s"' % (name, value)
        try:
            result.encode('ascii')
        except (UnicodeEncodeError, UnicodeDecodeError):
            pass
        else:
            return result
    if not six.PY3 and isinstance(value, six.text_type):  # Python 2:
        value = value.encode('utf-8')
    value = email.utils.encode_rfc2231(value, 'utf-8')
    value = '%s*=%s' % (name, value)
    return value


class RequestField(object):
    """
    A data container for request body parameters.

    :param name:
        The name of this request field.
    :param data:
        The data/value body.
    :param filename:
        An optional filename of the request field.
    :param headers:
        An optional dict-like object of headers to initially use for the field.
    """
    def __init__(self, name, data, filename=None, headers=None):
        self._name = name
        self._filename = filename
        self.data = data
        self.headers = {}
        if headers:
            self.headers = dict(headers)

    @classmethod
    def from_tuples(cls, fieldname, value):
        """
        A :class:`~urllib3.fields.RequestField` factory from old-style tuple parameters.

        Supports constructing :class:`~urllib3.fields.RequestField` from
        parameter of key/value strings AND key/filetuple. A filetuple is a
        (filename, data, MIME type) tuple where the MIME type is optional.
        For example::

            'foo': 'bar',
            'fakefile': ('foofile.txt', 'contents of foofile'),
            'realfile': ('barfile.txt', open('realfile').read()),
            'typedfile': ('bazfile.bin', open('bazfile').read(), 'image/jpeg'),
            'nonamefile': 'contents of nonamefile field',

        Field names and filenames must be unicode.
        """
        if isinstance(value, tuple):
            if len(value) == 3:
                filename, data, content_type = value
            else:
                filename, data = value
                content_type = guess_content_type(filename)
        else:
            filename = None
            content_type = None
            data = value

        request_param = cls(fieldname, data, filename=filename)
        request_param.make_multipart(content_type=content_type)

        return request_param

    def _render_part(self, name, value):
        """
        Overridable helper function to format a single header parameter.

        :param name:
            The name of the parameter, a string expected to be ASCII only.
        :param value:
            The value of the parameter, provided as a unicode string.
        """
        return format_header_param(name, value)

    def _render_parts(self, header_parts):
        """
        Helper function to format and quote a single header.

        Useful for single headers that are composed of multiple items. E.g.,
        'Content-Disposition' fields.

        :param header_parts:
            A sequence of (k, v) tuples or a :class:`dict` of (k, v) to format
            as `k1="v1"; k2="v2"; ...`.
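
        A sketch (names and values illustrative)::

            >>> field = RequestField('upload', 'data')
            >>> field._render_parts([('name', 'upload'), ('filename', 'a.txt')])
            'name="upload"; filename="a.txt"'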
        """
        parts = []
        iterable = header_parts
        if isinstance(header_parts, dict):
            iterable = header_parts.items()

        for name, value in iterable:
            if value is not None:
                parts.append(self._render_part(name, value))

        return '; '.join(parts)

    def render_headers(self):
        """
        Renders the headers for this request field.
        """
        lines = []

        sort_keys = ['Content-Disposition', 'Content-Type', 'Content-Location']
        for sort_key in sort_keys:
            if self.headers.get(sort_key, False):
                lines.append('%s: %s' % (sort_key, self.headers[sort_key]))

        for header_name, header_value in self.headers.items():
            if header_name not in sort_keys:
                if header_value:
                    lines.append('%s: %s' % (header_name, header_value))

        lines.append('\r\n')
        return '\r\n'.join(lines)

    def make_multipart(self, content_disposition=None, content_type=None,
                       content_location=None):
        """
        Makes this request field into a multipart request field.

        This method sets the "Content-Disposition", "Content-Type" and
        "Content-Location" headers on the request field.

        :param content_type:
            The 'Content-Type' of the request body.
        :param content_location:
            The 'Content-Location' of the request body.

        """
        self.headers['Content-Disposition'] = content_disposition or 'form-data'
        self.headers['Content-Disposition'] += '; '.join([
            '', self._render_parts(
                (('name', self._name), ('filename', self._filename))
            )
        ])
        self.headers['Content-Type'] = content_type
        self.headers['Content-Location'] = content_location
# site-packages/pip/_vendor/urllib3/response.py
from __future__ import absolute_import
from contextlib import contextmanager
import zlib
import io
import logging
from socket import timeout as SocketTimeout
from socket import error as SocketError

from ._collections import HTTPHeaderDict
from .exceptions import (
    BodyNotHttplibCompatible, ProtocolError, DecodeError, ReadTimeoutError,
    ResponseNotChunked, IncompleteRead, InvalidHeader
)
from .packages.six import string_types as basestring, binary_type, PY3
from .packages.six.moves import http_client as httplib
from .connection import HTTPException, BaseSSLError
from .util.response import is_fp_closed, is_response_to_head

log = logging.getLogger(__name__)


class DeflateDecoder(object):

    def __init__(self):
        self._first_try = True
        self._data = binary_type()
        self._obj = zlib.decompressobj()

    def __getattr__(self, name):
        return getattr(self._obj, name)

    def decompress(self, data):
        if not data:
            return data

        if not self._first_try:
            return self._obj.decompress(data)

        self._data += data
        try:
            decompressed = self._obj.decompress(data)
            if decompressed:
                self._first_try = False
                self._data = None
            return decompressed
        except zlib.error:
            self._first_try = False
            self._obj = zlib.decompressobj(-zlib.MAX_WBITS)
            try:
                return self.decompress(self._data)
            finally:
                self._data = None


class GzipDecoder(object):

    def __init__(self):
        self._obj = zlib.decompressobj(16 + zlib.MAX_WBITS)

    def __getattr__(self, name):
        return getattr(self._obj, name)

    def decompress(self, data):
        if not data:
            return data
        return self._obj.decompress(data)


def _get_decoder(mode):
    if mode == 'gzip':
        return GzipDecoder()

    return DeflateDecoder()


class HTTPResponse(io.IOBase):
    """
    HTTP Response container.

    Backwards-compatible to httplib's HTTPResponse but the response ``body`` is
    loaded and decoded on-demand when the ``data`` property is accessed.  This
    class is also compatible with the Python standard library's :mod:`io`
    module, and can hence be treated as a readable object in the context of that
    framework.

    Extra parameters for behaviour not present in httplib.HTTPResponse:

    :param preload_content:
        If True, the response's body will be preloaded during construction.

    :param decode_content:
        If True, will attempt to decode the response body based on the
        'content-encoding' header (e.g. 'gzip' and 'deflate'). If False,
        the raw bytes are returned.

    :param original_response:
        When this HTTPResponse wrapper is generated from an httplib.HTTPResponse
        object, it's convenient to include the original for debug purposes. It's
        otherwise unused.

    :param retries:
        The retries contains the last :class:`~urllib3.util.retry.Retry` that
        was used during the request.

    :param enforce_content_length:
        Enforce content length checking. Body returned by server must match
        value of Content-Length header, if present. Otherwise, raise error.
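
    A minimal usage sketch (URL and status illustrative)::

        >>> import urllib3
        >>> r = urllib3.PoolManager().request('GET', 'http://example.com/')
        >>> r.status
        200
        >>> isinstance(r.data, bytes)
        True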
    """

    CONTENT_DECODERS = ['gzip', 'deflate']
    REDIRECT_STATUSES = [301, 302, 303, 307, 308]

    def __init__(self, body='', headers=None, status=0, version=0, reason=None,
                 strict=0, preload_content=True, decode_content=True,
                 original_response=None, pool=None, connection=None,
                 retries=None, enforce_content_length=False, request_method=None):

        if isinstance(headers, HTTPHeaderDict):
            self.headers = headers
        else:
            self.headers = HTTPHeaderDict(headers)
        self.status = status
        self.version = version
        self.reason = reason
        self.strict = strict
        self.decode_content = decode_content
        self.retries = retries
        self.enforce_content_length = enforce_content_length

        self._decoder = None
        self._body = None
        self._fp = None
        self._original_response = original_response
        self._fp_bytes_read = 0

        if body and isinstance(body, (basestring, binary_type)):
            self._body = body

        self._pool = pool
        self._connection = connection

        if hasattr(body, 'read'):
            self._fp = body

        # Are we using the chunked-style of transfer encoding?
        self.chunked = False
        self.chunk_left = None
        tr_enc = self.headers.get('transfer-encoding', '').lower()
        # Don't incur the penalty of creating a list and then discarding it
        encodings = (enc.strip() for enc in tr_enc.split(","))
        if "chunked" in encodings:
            self.chunked = True

        # Determine length of response
        self.length_remaining = self._init_length(request_method)

        # If requested, preload the body.
        if preload_content and not self._body:
            self._body = self.read(decode_content=decode_content)

    def get_redirect_location(self):
        """
        Should we redirect and where to?

        :returns: Truthy redirect location string if we got a redirect status
            code and valid location. ``None`` if redirect status and no
            location. ``False`` if not a redirect status code.
        """
        if self.status in self.REDIRECT_STATUSES:
            return self.headers.get('location')

        return False

    def release_conn(self):
        if not self._pool or not self._connection:
            return

        self._pool._put_conn(self._connection)
        self._connection = None

    @property
    def data(self):
        # For backwards-compat with urllib3 0.4 and earlier.
        if self._body:
            return self._body

        if self._fp:
            return self.read(cache_content=True)

    @property
    def connection(self):
        return self._connection

    def tell(self):
        """
        Obtain the number of bytes pulled over the wire so far. May differ from
        the amount of content returned by :meth:`HTTPResponse.read` if bytes
        are encoded on the wire (e.g., compressed).
        """
        return self._fp_bytes_read

    def _init_length(self, request_method):
        """
        Set initial length value for Response content if available.
        """
        length = self.headers.get('content-length')

        if length is not None and self.chunked:
            # This Response will fail with an IncompleteRead if it can't be
            # received as chunked. This method falls back to attempt reading
            # the response before raising an exception.
            log.warning("Received response with both Content-Length and "
                        "Transfer-Encoding set. This is expressly forbidden "
                        "by RFC 7230 sec 3.3.2. Ignoring Content-Length and "
                        "attempting to process response as Transfer-Encoding: "
                        "chunked.")
            return None

        elif length is not None:
            try:
                # RFC 7230 section 3.3.2 specifies multiple content lengths can
                # be sent in a single Content-Length header
                # (e.g. Content-Length: 42, 42). This line ensures the values
                # are all valid ints and that as long as the `set` length is 1,
                # all values are the same. Otherwise, the header is invalid.
                lengths = set([int(val) for val in length.split(',')])
                if len(lengths) > 1:
                    raise InvalidHeader("Content-Length contained multiple "
                                        "unmatching values (%s)" % length)
                length = lengths.pop()
            except ValueError:
                length = None
            else:
                if length < 0:
                    length = None

        # Convert status to int for comparison
        # In some cases, httplib returns a status of "_UNKNOWN"
        try:
            status = int(self.status)
        except ValueError:
            status = 0

        # Check for responses that shouldn't include a body
        if status in (204, 304) or 100 <= status < 200 or request_method == 'HEAD':
            length = 0

        return length

    def _init_decoder(self):
        """
        Set-up the _decoder attribute if necessary.
        """
        # Note: content-encoding value should be case-insensitive, per RFC 7230
        # Section 3.2
        content_encoding = self.headers.get('content-encoding', '').lower()
        if self._decoder is None and content_encoding in self.CONTENT_DECODERS:
            self._decoder = _get_decoder(content_encoding)

    def _decode(self, data, decode_content, flush_decoder):
        """
        Decode the data passed in and potentially flush the decoder.
        """
        try:
            if decode_content and self._decoder:
                data = self._decoder.decompress(data)
        except (IOError, zlib.error) as e:
            content_encoding = self.headers.get('content-encoding', '').lower()
            raise DecodeError(
                "Received response with content-encoding: %s, but "
                "failed to decode it." % content_encoding, e)

        if flush_decoder and decode_content:
            data += self._flush_decoder()

        return data

    def _flush_decoder(self):
        """
        Flushes the decoder. Should only be called if the decoder is actually
        being used.
        """
        if self._decoder:
            buf = self._decoder.decompress(b'')
            return buf + self._decoder.flush()

        return b''

    @contextmanager
    def _error_catcher(self):
        """
        Catch low-level python exceptions, instead re-raising urllib3
        variants, so that low-level exceptions are not leaked in the
        high-level api.

        On exit, release the connection back to the pool.
        """
        clean_exit = False

        try:
            try:
                yield

            except SocketTimeout:
                # FIXME: Ideally we'd like to include the url in the ReadTimeoutError but
                # there is yet no clean way to get at it from this context.
                raise ReadTimeoutError(self._pool, None, 'Read timed out.')

            except BaseSSLError as e:
                # FIXME: Is there a better way to differentiate between SSLErrors?
                if 'read operation timed out' not in str(e):  # Defensive:
                    # This shouldn't happen but just in case we're missing an edge
                    # case, let's avoid swallowing SSL errors.
                    raise

                raise ReadTimeoutError(self._pool, None, 'Read timed out.')

            except (HTTPException, SocketError) as e:
                # This includes IncompleteRead.
                raise ProtocolError('Connection broken: %r' % e, e)

            # If no exception is thrown, we should avoid cleaning up
            # unnecessarily.
            clean_exit = True
        finally:
            # If we didn't terminate cleanly, we need to throw away our
            # connection.
            if not clean_exit:
                # The response may not be closed but we're not going to use it
                # anymore so close it now to ensure that the connection is
                # released back to the pool.
                if self._original_response:
                    self._original_response.close()

                # Closing the response may not actually be sufficient to close
                # everything, so if we have a hold of the connection close that
                # too.
                if self._connection:
                    self._connection.close()

            # If we hold the original response but it's closed now, we should
            # return the connection back to the pool.
            if self._original_response and self._original_response.isclosed():
                self.release_conn()

    def read(self, amt=None, decode_content=None, cache_content=False):
        """
        Similar to :meth:`httplib.HTTPResponse.read`, but with two additional
        parameters: ``decode_content`` and ``cache_content``.

        :param amt:
            How much of the content to read. If specified, caching is skipped
            because it doesn't make sense to cache partial content as the full
            response.

        :param decode_content:
            If True, will attempt to decode the body based on the
            'content-encoding' header.

        :param cache_content:
            If True, will save the returned data such that the same result is
            returned regardless of the state of the underlying file object. This
            is useful if you want the ``.data`` property to continue working
            after having ``.read()`` the file object. (Overridden if ``amt`` is
            set.)
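
        A sketch of incremental reading (URL illustrative; ``PoolManager``
        comes from the top-level :mod:`urllib3` package)::

            >>> r = PoolManager().request('GET', 'http://example.com/',
            ...                           preload_content=False)
            >>> first = r.read(64)
            >>> rest = r.read()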
        """
        self._init_decoder()
        if decode_content is None:
            decode_content = self.decode_content

        if self._fp is None:
            return

        flush_decoder = False
        data = None

        with self._error_catcher():
            if amt is None:
                # cStringIO doesn't like amt=None
                data = self._fp.read()
                flush_decoder = True
            else:
                cache_content = False
                data = self._fp.read(amt)
                if amt != 0 and not data:  # Platform-specific: Buggy versions of Python.
                    # Close the connection when no data is returned
                    #
                    # This is redundant to what httplib/http.client _should_
                    # already do.  However, versions of python released before
                    # December 15, 2012 (http://bugs.python.org/issue16298) do
                    # not properly close the connection in all cases. There is
                    # no harm in redundantly calling close.
                    self._fp.close()
                    flush_decoder = True
                    if self.enforce_content_length and self.length_remaining not in (0, None):
                        # This is an edge case that httplib failed to cover due
                        # to concerns of backward compatibility. We're
                        # addressing it here to make sure IncompleteRead is
                        # raised during streaming, so all calls with incorrect
                        # Content-Length are caught.
                        raise IncompleteRead(self._fp_bytes_read, self.length_remaining)

        if data:
            self._fp_bytes_read += len(data)
            if self.length_remaining is not None:
                self.length_remaining -= len(data)

            data = self._decode(data, decode_content, flush_decoder)

            if cache_content:
                self._body = data

        return data

    def stream(self, amt=2**16, decode_content=None):
        """
        A generator wrapper for the read() method. A call will block until
        ``amt`` bytes have been read from the connection or until the
        connection is closed.

        :param amt:
            How much of the content to read. The generator will return up to
            this much data per iteration, but may return less. This is particularly
            likely when using compressed data. However, the empty string will
            never be returned.

        :param decode_content:
            If True, will attempt to decode the body based on the
            'content-encoding' header.
        """
        if self.chunked and self.supports_chunked_reads():
            for line in self.read_chunked(amt, decode_content=decode_content):
                yield line
        else:
            while not is_fp_closed(self._fp):
                data = self.read(amt=amt, decode_content=decode_content)

                if data:
                    yield data
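    # Example (illustrative): streaming the body in roughly 8 KiB pieces;
    # assumes an unpreloaded response ``r`` as in the read() example above and
    # a placeholder ``handle()`` callback.
    #
    #   for chunk in r.stream(8192, decode_content=True):
    #       handle(chunk)
    #   r.release_conn()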

    @classmethod
    def from_httplib(ResponseCls, r, **response_kw):
        """
        Given an :class:`httplib.HTTPResponse` instance ``r``, return a
        corresponding :class:`urllib3.response.HTTPResponse` object.

        Remaining parameters are passed to the HTTPResponse constructor, along
        with ``original_response=r``.
        """
        headers = r.msg

        if not isinstance(headers, HTTPHeaderDict):
            if PY3:  # Python 3
                headers = HTTPHeaderDict(headers.items())
            else:  # Python 2
                headers = HTTPHeaderDict.from_httplib(headers)

        # HTTPResponse objects in Python 3 don't have a .strict attribute
        strict = getattr(r, 'strict', 0)
        resp = ResponseCls(body=r,
                           headers=headers,
                           status=r.status,
                           version=r.version,
                           reason=r.reason,
                           strict=strict,
                           original_response=r,
                           **response_kw)
        return resp
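    # Example (illustrative): wrapping a plain http.client response on
    # Python 3; the connection details are assumptions for demonstration only.
    #
    #   import http.client
    #   conn = http.client.HTTPConnection('example.com')
    #   conn.request('GET', '/')
    #   raw = conn.getresponse()
    #   resp = HTTPResponse.from_httplib(raw, preload_content=False)
    #   print(resp.status, resp.getheader('Content-Type'))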

    # Backwards-compatibility methods for httplib.HTTPResponse
    def getheaders(self):
        return self.headers

    def getheader(self, name, default=None):
        return self.headers.get(name, default)

    # Backwards compatibility for http.cookiejar
    def info(self):
        return self.headers

    # Overrides from io.IOBase
    def close(self):
        if not self.closed:
            self._fp.close()

        if self._connection:
            self._connection.close()

    @property
    def closed(self):
        if self._fp is None:
            return True
        elif hasattr(self._fp, 'isclosed'):
            return self._fp.isclosed()
        elif hasattr(self._fp, 'closed'):
            return self._fp.closed
        else:
            return True

    def fileno(self):
        if self._fp is None:
            raise IOError("HTTPResponse has no file to get a fileno from")
        elif hasattr(self._fp, "fileno"):
            return self._fp.fileno()
        else:
            raise IOError("The file-like object this HTTPResponse is wrapped "
                          "around has no file descriptor")

    def flush(self):
        if self._fp is not None and hasattr(self._fp, 'flush'):
            return self._fp.flush()

    def readable(self):
        # This method is required for `io` module compatibility.
        return True

    def readinto(self, b):
        # This method is required for `io` module compatibility.
        temp = self.read(len(b))
        if len(temp) == 0:
            return 0
        else:
            b[:len(temp)] = temp
            return len(temp)

    def supports_chunked_reads(self):
        """
        Checks if the underlying file-like object looks like an
        httplib.HTTPResponse object. We do this by testing for the fp
        attribute. If it is present, we assume it returns raw chunks as
        processed by read_chunked().
        """
        return hasattr(self._fp, 'fp')

    def _update_chunk_length(self):
        # First, we'll figure out the length of a chunk and then
        # we'll try to read it from the socket.
        if self.chunk_left is not None:
            return
        line = self._fp.fp.readline()
        line = line.split(b';', 1)[0]
        try:
            self.chunk_left = int(line, 16)
        except ValueError:
            # Invalid chunked protocol response, abort.
            self.close()
            raise httplib.IncompleteRead(line)
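        # Example (illustrative): a chunk-size line such as
        # b"1a2b;chunk-ext=val\r\n" is split on b';' and parsed as hex, so
        # int(b"1a2b", 16) == 6699 bytes remain in the current chunk.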

    def _handle_chunk(self, amt):
        returned_chunk = None
        if amt is None:
            chunk = self._fp._safe_read(self.chunk_left)
            returned_chunk = chunk
            self._fp._safe_read(2)  # Toss the CRLF at the end of the chunk.
            self.chunk_left = None
        elif amt < self.chunk_left:
            value = self._fp._safe_read(amt)
            self.chunk_left = self.chunk_left - amt
            returned_chunk = value
        elif amt == self.chunk_left:
            value = self._fp._safe_read(amt)
            self._fp._safe_read(2)  # Toss the CRLF at the end of the chunk.
            self.chunk_left = None
            returned_chunk = value
        else:  # amt > self.chunk_left
            returned_chunk = self._fp._safe_read(self.chunk_left)
            self._fp._safe_read(2)  # Toss the CRLF at the end of the chunk.
            self.chunk_left = None
        return returned_chunk

    def read_chunked(self, amt=None, decode_content=None):
        """
        Similar to :meth:`HTTPResponse.read`, but with an additional
        parameter: ``decode_content``.

        :param decode_content:
            If True, will attempt to decode the body based on the
            'content-encoding' header.
        """
        self._init_decoder()
        # FIXME: Rewrite this method and make it a class with a better structured logic.
        if not self.chunked:
            raise ResponseNotChunked(
                "Response is not chunked. "
                "Header 'transfer-encoding: chunked' is missing.")
        if not self.supports_chunked_reads():
            raise BodyNotHttplibCompatible(
                "Body should be httplib.HTTPResponse like. "
                "It should have an fp attribute which returns raw chunks.")

        # Don't bother reading the body of a HEAD request.
        if self._original_response and is_response_to_head(self._original_response):
            self._original_response.close()
            return

        with self._error_catcher():
            while True:
                self._update_chunk_length()
                if self.chunk_left == 0:
                    break
                chunk = self._handle_chunk(amt)
                decoded = self._decode(chunk, decode_content=decode_content,
                                       flush_decoder=False)
                if decoded:
                    yield decoded

            if decode_content:
                # On CPython and PyPy, we should never need to flush the
                # decoder. However, on Jython we *might* need to, so
                # let's defensively do it anyway.
                decoded = self._flush_decoder()
                if decoded:  # Platform-specific: Jython.
                    yield decoded

            # Chunk content ends with \r\n: discard it.
            while True:
                line = self._fp.fp.readline()
                if not line:
                    # Some sites may not end with '\r\n'.
                    break
                if line == b'\r\n':
                    break

            # We read everything; close the "file".
            if self._original_response:
                self._original_response.close()
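# Example (illustrative): consuming a chunked response explicitly; only valid
# when the server sent ``Transfer-Encoding: chunked`` and the body supports
# chunked reads. ``r`` is an unpreloaded response as in the earlier examples.
#
#   if r.chunked and r.supports_chunked_reads():
#       for chunk in r.read_chunked(decode_content=True):
#           handle(chunk)
#   r.release_conn()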
site-packages/pip/_vendor/urllib3/packages/six.py000064400000072622147511334600016034 0ustar00"""Utilities for writing code that runs on Python 2 and 3"""

# Copyright (c) 2010-2015 Benjamin Peterson
#
# Permission is hereby granted, free of charge, to any person obtaining a copy
# of this software and associated documentation files (the "Software"), to deal
# in the Software without restriction, including without limitation the rights
# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
# copies of the Software, and to permit persons to whom the Software is
# furnished to do so, subject to the following conditions:
#
# The above copyright notice and this permission notice shall be included in all
# copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
# SOFTWARE.

from __future__ import absolute_import

import functools
import itertools
import operator
import sys
import types

__author__ = "Benjamin Peterson <benjamin@python.org>"
__version__ = "1.10.0"


# Useful for very coarse version differentiation.
PY2 = sys.version_info[0] == 2
PY3 = sys.version_info[0] == 3
PY34 = sys.version_info[0:2] >= (3, 4)

if PY3:
    string_types = str,
    integer_types = int,
    class_types = type,
    text_type = str
    binary_type = bytes

    MAXSIZE = sys.maxsize
else:
    string_types = basestring,
    integer_types = (int, long)
    class_types = (type, types.ClassType)
    text_type = unicode
    binary_type = str

    if sys.platform.startswith("java"):
        # Jython always uses 32 bits.
        MAXSIZE = int((1 << 31) - 1)
    else:
        # It's possible to have sizeof(long) != sizeof(Py_ssize_t).
        class X(object):

            def __len__(self):
                return 1 << 31
        try:
            len(X())
        except OverflowError:
            # 32-bit
            MAXSIZE = int((1 << 31) - 1)
        else:
            # 64-bit
            MAXSIZE = int((1 << 63) - 1)
        del X


def _add_doc(func, doc):
    """Add documentation to a function."""
    func.__doc__ = doc


def _import_module(name):
    """Import module, returning the module after the last dot."""
    __import__(name)
    return sys.modules[name]


class _LazyDescr(object):

    def __init__(self, name):
        self.name = name

    def __get__(self, obj, tp):
        result = self._resolve()
        setattr(obj, self.name, result)  # Invokes __set__.
        try:
            # This is a bit ugly, but it avoids running this again by
            # removing this descriptor.
            delattr(obj.__class__, self.name)
        except AttributeError:
            pass
        return result


class MovedModule(_LazyDescr):

    def __init__(self, name, old, new=None):
        super(MovedModule, self).__init__(name)
        if PY3:
            if new is None:
                new = name
            self.mod = new
        else:
            self.mod = old

    def _resolve(self):
        return _import_module(self.mod)

    def __getattr__(self, attr):
        _module = self._resolve()
        value = getattr(_module, attr)
        setattr(self, attr, value)
        return value


class _LazyModule(types.ModuleType):

    def __init__(self, name):
        super(_LazyModule, self).__init__(name)
        self.__doc__ = self.__class__.__doc__

    def __dir__(self):
        attrs = ["__doc__", "__name__"]
        attrs += [attr.name for attr in self._moved_attributes]
        return attrs

    # Subclasses should override this
    _moved_attributes = []


class MovedAttribute(_LazyDescr):

    def __init__(self, name, old_mod, new_mod, old_attr=None, new_attr=None):
        super(MovedAttribute, self).__init__(name)
        if PY3:
            if new_mod is None:
                new_mod = name
            self.mod = new_mod
            if new_attr is None:
                if old_attr is None:
                    new_attr = name
                else:
                    new_attr = old_attr
            self.attr = new_attr
        else:
            self.mod = old_mod
            if old_attr is None:
                old_attr = name
            self.attr = old_attr

    def _resolve(self):
        module = _import_module(self.mod)
        return getattr(module, self.attr)


class _SixMetaPathImporter(object):

    """
    A meta path importer to import six.moves and its submodules.

    This class implements a PEP302 finder and loader. It should be compatible
    with Python 2.5 and all existing versions of Python 3.
    """

    def __init__(self, six_module_name):
        self.name = six_module_name
        self.known_modules = {}

    def _add_module(self, mod, *fullnames):
        for fullname in fullnames:
            self.known_modules[self.name + "." + fullname] = mod

    def _get_module(self, fullname):
        return self.known_modules[self.name + "." + fullname]

    def find_module(self, fullname, path=None):
        if fullname in self.known_modules:
            return self
        return None

    def __get_module(self, fullname):
        try:
            return self.known_modules[fullname]
        except KeyError:
            raise ImportError("This loader does not know module " + fullname)

    def load_module(self, fullname):
        try:
            # in case of a reload
            return sys.modules[fullname]
        except KeyError:
            pass
        mod = self.__get_module(fullname)
        if isinstance(mod, MovedModule):
            mod = mod._resolve()
        else:
            mod.__loader__ = self
        sys.modules[fullname] = mod
        return mod

    def is_package(self, fullname):
        """
        Return true if the named module is a package.

        We need this method to get correct spec objects with
        Python 3.4 (see PEP451)
        """
        return hasattr(self.__get_module(fullname), "__path__")

    def get_code(self, fullname):
        """Return None

        Required if is_package is implemented."""
        self.__get_module(fullname)  # eventually raises ImportError
        return None
    get_source = get_code  # same as get_code

_importer = _SixMetaPathImporter(__name__)


class _MovedItems(_LazyModule):

    """Lazy loading of moved objects"""
    __path__ = []  # mark as package


_moved_attributes = [
    MovedAttribute("cStringIO", "cStringIO", "io", "StringIO"),
    MovedAttribute("filter", "itertools", "builtins", "ifilter", "filter"),
    MovedAttribute("filterfalse", "itertools", "itertools", "ifilterfalse", "filterfalse"),
    MovedAttribute("input", "__builtin__", "builtins", "raw_input", "input"),
    MovedAttribute("intern", "__builtin__", "sys"),
    MovedAttribute("map", "itertools", "builtins", "imap", "map"),
    MovedAttribute("getcwd", "os", "os", "getcwdu", "getcwd"),
    MovedAttribute("getcwdb", "os", "os", "getcwd", "getcwdb"),
    MovedAttribute("range", "__builtin__", "builtins", "xrange", "range"),
    MovedAttribute("reload_module", "__builtin__", "importlib" if PY34 else "imp", "reload"),
    MovedAttribute("reduce", "__builtin__", "functools"),
    MovedAttribute("shlex_quote", "pipes", "shlex", "quote"),
    MovedAttribute("StringIO", "StringIO", "io"),
    MovedAttribute("UserDict", "UserDict", "collections"),
    MovedAttribute("UserList", "UserList", "collections"),
    MovedAttribute("UserString", "UserString", "collections"),
    MovedAttribute("xrange", "__builtin__", "builtins", "xrange", "range"),
    MovedAttribute("zip", "itertools", "builtins", "izip", "zip"),
    MovedAttribute("zip_longest", "itertools", "itertools", "izip_longest", "zip_longest"),
    MovedModule("builtins", "__builtin__"),
    MovedModule("configparser", "ConfigParser"),
    MovedModule("copyreg", "copy_reg"),
    MovedModule("dbm_gnu", "gdbm", "dbm.gnu"),
    MovedModule("_dummy_thread", "dummy_thread", "_dummy_thread"),
    MovedModule("http_cookiejar", "cookielib", "http.cookiejar"),
    MovedModule("http_cookies", "Cookie", "http.cookies"),
    MovedModule("html_entities", "htmlentitydefs", "html.entities"),
    MovedModule("html_parser", "HTMLParser", "html.parser"),
    MovedModule("http_client", "httplib", "http.client"),
    MovedModule("email_mime_multipart", "email.MIMEMultipart", "email.mime.multipart"),
    MovedModule("email_mime_nonmultipart", "email.MIMENonMultipart", "email.mime.nonmultipart"),
    MovedModule("email_mime_text", "email.MIMEText", "email.mime.text"),
    MovedModule("email_mime_base", "email.MIMEBase", "email.mime.base"),
    MovedModule("BaseHTTPServer", "BaseHTTPServer", "http.server"),
    MovedModule("CGIHTTPServer", "CGIHTTPServer", "http.server"),
    MovedModule("SimpleHTTPServer", "SimpleHTTPServer", "http.server"),
    MovedModule("cPickle", "cPickle", "pickle"),
    MovedModule("queue", "Queue"),
    MovedModule("reprlib", "repr"),
    MovedModule("socketserver", "SocketServer"),
    MovedModule("_thread", "thread", "_thread"),
    MovedModule("tkinter", "Tkinter"),
    MovedModule("tkinter_dialog", "Dialog", "tkinter.dialog"),
    MovedModule("tkinter_filedialog", "FileDialog", "tkinter.filedialog"),
    MovedModule("tkinter_scrolledtext", "ScrolledText", "tkinter.scrolledtext"),
    MovedModule("tkinter_simpledialog", "SimpleDialog", "tkinter.simpledialog"),
    MovedModule("tkinter_tix", "Tix", "tkinter.tix"),
    MovedModule("tkinter_ttk", "ttk", "tkinter.ttk"),
    MovedModule("tkinter_constants", "Tkconstants", "tkinter.constants"),
    MovedModule("tkinter_dnd", "Tkdnd", "tkinter.dnd"),
    MovedModule("tkinter_colorchooser", "tkColorChooser",
                "tkinter.colorchooser"),
    MovedModule("tkinter_commondialog", "tkCommonDialog",
                "tkinter.commondialog"),
    MovedModule("tkinter_tkfiledialog", "tkFileDialog", "tkinter.filedialog"),
    MovedModule("tkinter_font", "tkFont", "tkinter.font"),
    MovedModule("tkinter_messagebox", "tkMessageBox", "tkinter.messagebox"),
    MovedModule("tkinter_tksimpledialog", "tkSimpleDialog",
                "tkinter.simpledialog"),
    MovedModule("urllib_parse", __name__ + ".moves.urllib_parse", "urllib.parse"),
    MovedModule("urllib_error", __name__ + ".moves.urllib_error", "urllib.error"),
    MovedModule("urllib", __name__ + ".moves.urllib", __name__ + ".moves.urllib"),
    MovedModule("urllib_robotparser", "robotparser", "urllib.robotparser"),
    MovedModule("xmlrpc_client", "xmlrpclib", "xmlrpc.client"),
    MovedModule("xmlrpc_server", "SimpleXMLRPCServer", "xmlrpc.server"),
]
# Add windows specific modules.
if sys.platform == "win32":
    _moved_attributes += [
        MovedModule("winreg", "_winreg"),
    ]

for attr in _moved_attributes:
    setattr(_MovedItems, attr.name, attr)
    if isinstance(attr, MovedModule):
        _importer._add_module(attr, "moves." + attr.name)
del attr

_MovedItems._moved_attributes = _moved_attributes

moves = _MovedItems(__name__ + ".moves")
_importer._add_module(moves, "moves")
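# Example (illustrative): with the importer registered, renamed stdlib modules
# resolve lazily through ``six.moves`` on either Python version; the plain
# ``six`` name assumes the standalone distribution rather than this vendored
# copy.
#
#   from six.moves import http_client
#   from six.moves.urllib.parse import urlencode
#   conn = http_client.HTTPConnection('example.com')
#   query = urlencode({'q': 'value'})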


class Module_six_moves_urllib_parse(_LazyModule):

    """Lazy loading of moved objects in six.moves.urllib_parse"""


_urllib_parse_moved_attributes = [
    MovedAttribute("ParseResult", "urlparse", "urllib.parse"),
    MovedAttribute("SplitResult", "urlparse", "urllib.parse"),
    MovedAttribute("parse_qs", "urlparse", "urllib.parse"),
    MovedAttribute("parse_qsl", "urlparse", "urllib.parse"),
    MovedAttribute("urldefrag", "urlparse", "urllib.parse"),
    MovedAttribute("urljoin", "urlparse", "urllib.parse"),
    MovedAttribute("urlparse", "urlparse", "urllib.parse"),
    MovedAttribute("urlsplit", "urlparse", "urllib.parse"),
    MovedAttribute("urlunparse", "urlparse", "urllib.parse"),
    MovedAttribute("urlunsplit", "urlparse", "urllib.parse"),
    MovedAttribute("quote", "urllib", "urllib.parse"),
    MovedAttribute("quote_plus", "urllib", "urllib.parse"),
    MovedAttribute("unquote", "urllib", "urllib.parse"),
    MovedAttribute("unquote_plus", "urllib", "urllib.parse"),
    MovedAttribute("urlencode", "urllib", "urllib.parse"),
    MovedAttribute("splitquery", "urllib", "urllib.parse"),
    MovedAttribute("splittag", "urllib", "urllib.parse"),
    MovedAttribute("splituser", "urllib", "urllib.parse"),
    MovedAttribute("uses_fragment", "urlparse", "urllib.parse"),
    MovedAttribute("uses_netloc", "urlparse", "urllib.parse"),
    MovedAttribute("uses_params", "urlparse", "urllib.parse"),
    MovedAttribute("uses_query", "urlparse", "urllib.parse"),
    MovedAttribute("uses_relative", "urlparse", "urllib.parse"),
]
for attr in _urllib_parse_moved_attributes:
    setattr(Module_six_moves_urllib_parse, attr.name, attr)
del attr

Module_six_moves_urllib_parse._moved_attributes = _urllib_parse_moved_attributes

_importer._add_module(Module_six_moves_urllib_parse(__name__ + ".moves.urllib_parse"),
                      "moves.urllib_parse", "moves.urllib.parse")


class Module_six_moves_urllib_error(_LazyModule):

    """Lazy loading of moved objects in six.moves.urllib_error"""


_urllib_error_moved_attributes = [
    MovedAttribute("URLError", "urllib2", "urllib.error"),
    MovedAttribute("HTTPError", "urllib2", "urllib.error"),
    MovedAttribute("ContentTooShortError", "urllib", "urllib.error"),
]
for attr in _urllib_error_moved_attributes:
    setattr(Module_six_moves_urllib_error, attr.name, attr)
del attr

Module_six_moves_urllib_error._moved_attributes = _urllib_error_moved_attributes

_importer._add_module(Module_six_moves_urllib_error(__name__ + ".moves.urllib.error"),
                      "moves.urllib_error", "moves.urllib.error")


class Module_six_moves_urllib_request(_LazyModule):

    """Lazy loading of moved objects in six.moves.urllib_request"""


_urllib_request_moved_attributes = [
    MovedAttribute("urlopen", "urllib2", "urllib.request"),
    MovedAttribute("install_opener", "urllib2", "urllib.request"),
    MovedAttribute("build_opener", "urllib2", "urllib.request"),
    MovedAttribute("pathname2url", "urllib", "urllib.request"),
    MovedAttribute("url2pathname", "urllib", "urllib.request"),
    MovedAttribute("getproxies", "urllib", "urllib.request"),
    MovedAttribute("Request", "urllib2", "urllib.request"),
    MovedAttribute("OpenerDirector", "urllib2", "urllib.request"),
    MovedAttribute("HTTPDefaultErrorHandler", "urllib2", "urllib.request"),
    MovedAttribute("HTTPRedirectHandler", "urllib2", "urllib.request"),
    MovedAttribute("HTTPCookieProcessor", "urllib2", "urllib.request"),
    MovedAttribute("ProxyHandler", "urllib2", "urllib.request"),
    MovedAttribute("BaseHandler", "urllib2", "urllib.request"),
    MovedAttribute("HTTPPasswordMgr", "urllib2", "urllib.request"),
    MovedAttribute("HTTPPasswordMgrWithDefaultRealm", "urllib2", "urllib.request"),
    MovedAttribute("AbstractBasicAuthHandler", "urllib2", "urllib.request"),
    MovedAttribute("HTTPBasicAuthHandler", "urllib2", "urllib.request"),
    MovedAttribute("ProxyBasicAuthHandler", "urllib2", "urllib.request"),
    MovedAttribute("AbstractDigestAuthHandler", "urllib2", "urllib.request"),
    MovedAttribute("HTTPDigestAuthHandler", "urllib2", "urllib.request"),
    MovedAttribute("ProxyDigestAuthHandler", "urllib2", "urllib.request"),
    MovedAttribute("HTTPHandler", "urllib2", "urllib.request"),
    MovedAttribute("HTTPSHandler", "urllib2", "urllib.request"),
    MovedAttribute("FileHandler", "urllib2", "urllib.request"),
    MovedAttribute("FTPHandler", "urllib2", "urllib.request"),
    MovedAttribute("CacheFTPHandler", "urllib2", "urllib.request"),
    MovedAttribute("UnknownHandler", "urllib2", "urllib.request"),
    MovedAttribute("HTTPErrorProcessor", "urllib2", "urllib.request"),
    MovedAttribute("urlretrieve", "urllib", "urllib.request"),
    MovedAttribute("urlcleanup", "urllib", "urllib.request"),
    MovedAttribute("URLopener", "urllib", "urllib.request"),
    MovedAttribute("FancyURLopener", "urllib", "urllib.request"),
    MovedAttribute("proxy_bypass", "urllib", "urllib.request"),
]
for attr in _urllib_request_moved_attributes:
    setattr(Module_six_moves_urllib_request, attr.name, attr)
del attr

Module_six_moves_urllib_request._moved_attributes = _urllib_request_moved_attributes

_importer._add_module(Module_six_moves_urllib_request(__name__ + ".moves.urllib.request"),
                      "moves.urllib_request", "moves.urllib.request")


class Module_six_moves_urllib_response(_LazyModule):

    """Lazy loading of moved objects in six.moves.urllib_response"""


_urllib_response_moved_attributes = [
    MovedAttribute("addbase", "urllib", "urllib.response"),
    MovedAttribute("addclosehook", "urllib", "urllib.response"),
    MovedAttribute("addinfo", "urllib", "urllib.response"),
    MovedAttribute("addinfourl", "urllib", "urllib.response"),
]
for attr in _urllib_response_moved_attributes:
    setattr(Module_six_moves_urllib_response, attr.name, attr)
del attr

Module_six_moves_urllib_response._moved_attributes = _urllib_response_moved_attributes

_importer._add_module(Module_six_moves_urllib_response(__name__ + ".moves.urllib.response"),
                      "moves.urllib_response", "moves.urllib.response")


class Module_six_moves_urllib_robotparser(_LazyModule):

    """Lazy loading of moved objects in six.moves.urllib_robotparser"""


_urllib_robotparser_moved_attributes = [
    MovedAttribute("RobotFileParser", "robotparser", "urllib.robotparser"),
]
for attr in _urllib_robotparser_moved_attributes:
    setattr(Module_six_moves_urllib_robotparser, attr.name, attr)
del attr

Module_six_moves_urllib_robotparser._moved_attributes = _urllib_robotparser_moved_attributes

_importer._add_module(Module_six_moves_urllib_robotparser(__name__ + ".moves.urllib.robotparser"),
                      "moves.urllib_robotparser", "moves.urllib.robotparser")


class Module_six_moves_urllib(types.ModuleType):

    """Create a six.moves.urllib namespace that resembles the Python 3 namespace"""
    __path__ = []  # mark as package
    parse = _importer._get_module("moves.urllib_parse")
    error = _importer._get_module("moves.urllib_error")
    request = _importer._get_module("moves.urllib_request")
    response = _importer._get_module("moves.urllib_response")
    robotparser = _importer._get_module("moves.urllib_robotparser")

    def __dir__(self):
        return ['parse', 'error', 'request', 'response', 'robotparser']

_importer._add_module(Module_six_moves_urllib(__name__ + ".moves.urllib"),
                      "moves.urllib")


def add_move(move):
    """Add an item to six.moves."""
    setattr(_MovedItems, move.name, move)


def remove_move(name):
    """Remove item from six.moves."""
    try:
        delattr(_MovedItems, name)
    except AttributeError:
        try:
            del moves.__dict__[name]
        except KeyError:
            raise AttributeError("no such move, %r" % (name,))
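# Example (illustrative): registering and unregistering a custom rename; the
# dbm mapping below is only a demonstration and assumes the standalone six
# package name.
#
#   add_move(MovedModule('dbm_ndbm', 'dbm', 'dbm.ndbm'))
#   from six.moves import dbm_ndbm   # resolves lazily on first access
#   remove_move('dbm_ndbm')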


if PY3:
    _meth_func = "__func__"
    _meth_self = "__self__"

    _func_closure = "__closure__"
    _func_code = "__code__"
    _func_defaults = "__defaults__"
    _func_globals = "__globals__"
else:
    _meth_func = "im_func"
    _meth_self = "im_self"

    _func_closure = "func_closure"
    _func_code = "func_code"
    _func_defaults = "func_defaults"
    _func_globals = "func_globals"


try:
    advance_iterator = next
except NameError:
    def advance_iterator(it):
        return it.next()
next = advance_iterator


try:
    callable = callable
except NameError:
    def callable(obj):
        return any("__call__" in klass.__dict__ for klass in type(obj).__mro__)


if PY3:
    def get_unbound_function(unbound):
        return unbound

    create_bound_method = types.MethodType

    def create_unbound_method(func, cls):
        return func

    Iterator = object
else:
    def get_unbound_function(unbound):
        return unbound.im_func

    def create_bound_method(func, obj):
        return types.MethodType(func, obj, obj.__class__)

    def create_unbound_method(func, cls):
        return types.MethodType(func, None, cls)

    class Iterator(object):

        def next(self):
            return type(self).__next__(self)

    callable = callable
_add_doc(get_unbound_function,
         """Get the function out of a possibly unbound function""")


get_method_function = operator.attrgetter(_meth_func)
get_method_self = operator.attrgetter(_meth_self)
get_function_closure = operator.attrgetter(_func_closure)
get_function_code = operator.attrgetter(_func_code)
get_function_defaults = operator.attrgetter(_func_defaults)
get_function_globals = operator.attrgetter(_func_globals)


if PY3:
    def iterkeys(d, **kw):
        return iter(d.keys(**kw))

    def itervalues(d, **kw):
        return iter(d.values(**kw))

    def iteritems(d, **kw):
        return iter(d.items(**kw))

    def iterlists(d, **kw):
        return iter(d.lists(**kw))

    viewkeys = operator.methodcaller("keys")

    viewvalues = operator.methodcaller("values")

    viewitems = operator.methodcaller("items")
else:
    def iterkeys(d, **kw):
        return d.iterkeys(**kw)

    def itervalues(d, **kw):
        return d.itervalues(**kw)

    def iteritems(d, **kw):
        return d.iteritems(**kw)

    def iterlists(d, **kw):
        return d.iterlists(**kw)

    viewkeys = operator.methodcaller("viewkeys")

    viewvalues = operator.methodcaller("viewvalues")

    viewitems = operator.methodcaller("viewitems")

_add_doc(iterkeys, "Return an iterator over the keys of a dictionary.")
_add_doc(itervalues, "Return an iterator over the values of a dictionary.")
_add_doc(iteritems,
         "Return an iterator over the (key, value) pairs of a dictionary.")
_add_doc(iterlists,
         "Return an iterator over the (key, [values]) pairs of a dictionary.")

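# Example (illustrative): version-neutral dictionary iteration with the
# wrappers defined above.
#
#   d = {'a': 1, 'b': 2}
#   for key, value in iteritems(d):
#       pass                    # no intermediate list on either version
#   keys_view = viewkeys(d)     # set-like view object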

if PY3:
    def b(s):
        return s.encode("latin-1")

    def u(s):
        return s
    unichr = chr
    import struct
    int2byte = struct.Struct(">B").pack
    del struct
    byte2int = operator.itemgetter(0)
    indexbytes = operator.getitem
    iterbytes = iter
    import io
    StringIO = io.StringIO
    BytesIO = io.BytesIO
    _assertCountEqual = "assertCountEqual"
    if sys.version_info[1] <= 1:
        _assertRaisesRegex = "assertRaisesRegexp"
        _assertRegex = "assertRegexpMatches"
    else:
        _assertRaisesRegex = "assertRaisesRegex"
        _assertRegex = "assertRegex"
else:
    def b(s):
        return s
    # Workaround for standalone backslash

    def u(s):
        return unicode(s.replace(r'\\', r'\\\\'), "unicode_escape")
    unichr = unichr
    int2byte = chr

    def byte2int(bs):
        return ord(bs[0])

    def indexbytes(buf, i):
        return ord(buf[i])
    iterbytes = functools.partial(itertools.imap, ord)
    import StringIO
    StringIO = BytesIO = StringIO.StringIO
    _assertCountEqual = "assertItemsEqual"
    _assertRaisesRegex = "assertRaisesRegexp"
    _assertRegex = "assertRegexpMatches"
_add_doc(b, """Byte literal""")
_add_doc(u, """Text literal""")

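# Example (illustrative): literals and byte-level helpers that behave the same
# on both major versions.
#
#   raw = b("\x00\x01payload")   # always binary_type
#   text = u("hello")            # always text_type
#   first = byte2int(raw)        # 0 on both Python 2 and 3
#   third = indexbytes(raw, 2)   # ord('p') == 112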

def assertCountEqual(self, *args, **kwargs):
    return getattr(self, _assertCountEqual)(*args, **kwargs)


def assertRaisesRegex(self, *args, **kwargs):
    return getattr(self, _assertRaisesRegex)(*args, **kwargs)


def assertRegex(self, *args, **kwargs):
    return getattr(self, _assertRegex)(*args, **kwargs)


if PY3:
    exec_ = getattr(moves.builtins, "exec")

    def reraise(tp, value, tb=None):
        if value is None:
            value = tp()
        if value.__traceback__ is not tb:
            raise value.with_traceback(tb)
        raise value

else:
    def exec_(_code_, _globs_=None, _locs_=None):
        """Execute code in a namespace."""
        if _globs_ is None:
            frame = sys._getframe(1)
            _globs_ = frame.f_globals
            if _locs_ is None:
                _locs_ = frame.f_locals
            del frame
        elif _locs_ is None:
            _locs_ = _globs_
        exec("""exec _code_ in _globs_, _locs_""")

    exec_("""def reraise(tp, value, tb=None):
    raise tp, value, tb
""")


if sys.version_info[:2] == (3, 2):
    exec_("""def raise_from(value, from_value):
    if from_value is None:
        raise value
    raise value from from_value
""")
elif sys.version_info[:2] > (3, 2):
    exec_("""def raise_from(value, from_value):
    raise value from from_value
""")
else:
    def raise_from(value, from_value):
        raise value
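# Example (illustrative): re-raising with the original traceback preserved,
# and chaining a new exception where ``raise ... from ...`` exists.
#
#   import sys
#   try:
#       {}['missing']
#   except KeyError:
#       exc = sys.exc_info()
#       # reraise(*exc)                       # same exception, same traceback
#       raise_from(ValueError('bad lookup'), exc[1])   # no-op chaining on Python 2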


print_ = getattr(moves.builtins, "print", None)
if print_ is None:
    def print_(*args, **kwargs):
        """The new-style print function for Python 2.4 and 2.5."""
        fp = kwargs.pop("file", sys.stdout)
        if fp is None:
            return

        def write(data):
            if not isinstance(data, basestring):
                data = str(data)
            # If the file has an encoding, encode unicode with it.
            if (isinstance(fp, file) and
                    isinstance(data, unicode) and
                    fp.encoding is not None):
                errors = getattr(fp, "errors", None)
                if errors is None:
                    errors = "strict"
                data = data.encode(fp.encoding, errors)
            fp.write(data)
        want_unicode = False
        sep = kwargs.pop("sep", None)
        if sep is not None:
            if isinstance(sep, unicode):
                want_unicode = True
            elif not isinstance(sep, str):
                raise TypeError("sep must be None or a string")
        end = kwargs.pop("end", None)
        if end is not None:
            if isinstance(end, unicode):
                want_unicode = True
            elif not isinstance(end, str):
                raise TypeError("end must be None or a string")
        if kwargs:
            raise TypeError("invalid keyword arguments to print()")
        if not want_unicode:
            for arg in args:
                if isinstance(arg, unicode):
                    want_unicode = True
                    break
        if want_unicode:
            newline = unicode("\n")
            space = unicode(" ")
        else:
            newline = "\n"
            space = " "
        if sep is None:
            sep = space
        if end is None:
            end = newline
        for i, arg in enumerate(args):
            if i:
                write(sep)
            write(arg)
        write(end)
if sys.version_info[:2] < (3, 3):
    _print = print_

    def print_(*args, **kwargs):
        fp = kwargs.get("file", sys.stdout)
        flush = kwargs.pop("flush", False)
        _print(*args, **kwargs)
        if flush and fp is not None:
            fp.flush()

_add_doc(reraise, """Reraise an exception.""")

if sys.version_info[0:2] < (3, 4):
    def wraps(wrapped, assigned=functools.WRAPPER_ASSIGNMENTS,
              updated=functools.WRAPPER_UPDATES):
        def wrapper(f):
            f = functools.wraps(wrapped, assigned, updated)(f)
            f.__wrapped__ = wrapped
            return f
        return wrapper
else:
    wraps = functools.wraps


def with_metaclass(meta, *bases):
    """Create a base class with a metaclass."""
    # This requires a bit of explanation: the basic idea is to make a dummy
    # metaclass for one level of class instantiation that replaces itself with
    # the actual metaclass.
    class metaclass(meta):

        def __new__(cls, name, this_bases, d):
            return meta(name, bases, d)
    return type.__new__(metaclass, 'temporary_class', (), {})
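# Example (illustrative): a class whose metaclass is applied identically on
# Python 2 and 3 without version-specific syntax.
#
#   class Meta(type):
#       pass
#
#   class Base(object):
#       pass
#
#   class MyClass(with_metaclass(Meta, Base)):
#       pass
#
#   assert type(MyClass) is Meta and issubclass(MyClass, Base)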


def add_metaclass(metaclass):
    """Class decorator for creating a class with a metaclass."""
    def wrapper(cls):
        orig_vars = cls.__dict__.copy()
        slots = orig_vars.get('__slots__')
        if slots is not None:
            if isinstance(slots, str):
                slots = [slots]
            for slots_var in slots:
                orig_vars.pop(slots_var)
        orig_vars.pop('__dict__', None)
        orig_vars.pop('__weakref__', None)
        return metaclass(cls.__name__, cls.__bases__, orig_vars)
    return wrapper
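# Example (illustrative): the decorator form of the same idea; the class body
# is evaluated normally and then rebuilt with the requested metaclass.
#
#   @add_metaclass(Meta)
#   class Widget(object):
#       pass
#
#   assert type(Widget) is Meta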


def python_2_unicode_compatible(klass):
    """
    A decorator that defines __unicode__ and __str__ methods under Python 2.
    Under Python 3 it does nothing.

    To support Python 2 and 3 with a single code base, define a __str__ method
    returning text and apply this decorator to the class.
    """
    if PY2:
        if '__str__' not in klass.__dict__:
            raise ValueError("@python_2_unicode_compatible cannot be applied "
                             "to %s because it doesn't define __str__()." %
                             klass.__name__)
        klass.__unicode__ = klass.__str__
        klass.__str__ = lambda self: self.__unicode__().encode('utf-8')
    return klass
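# Example (illustrative): define only __str__ returning text; on Python 2 the
# decorator renames it to __unicode__ and adds a UTF-8-encoding __str__.
#
#   @python_2_unicode_compatible
#   class User(object):
#       def __init__(self, name):
#           self.name = name
#
#       def __str__(self):
#           return self.name     # text, never bytes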


# Complete the moves implementation.
# This code is at the end of this module to speed up module loading.
# Turn this module into a package.
__path__ = []  # required for PEP 302 and PEP 451
__package__ = __name__  # see PEP 366 @ReservedAssignment
if globals().get("__spec__") is not None:
    __spec__.submodule_search_locations = []  # PEP 451 @UndefinedVariable
# Remove other six meta path importers, since they cause problems. This can
# happen if six is removed from sys.modules and then reloaded. (Setuptools does
# this for some reason.)
if sys.meta_path:
    for i, importer in enumerate(sys.meta_path):
        # Here's some real nastiness: Another "instance" of the six module might
        # be floating around. Therefore, we can't use isinstance() to check for
        # the six meta path importer, since the other six instance will have
        # inserted an importer with a different class.
        if (type(importer).__name__ == "_SixMetaPathImporter" and
                importer.name == __name__):
            del sys.meta_path[i]
            break
    del i, importer
# Finally, add the importer to the meta path import hook.
sys.meta_path.append(_importer)
site-packages/pip/_vendor/urllib3/packages/__init__.py000064400000000155147511334600016760 0ustar00from __future__ import absolute_import

from . import ssl_match_hostname

__all__ = ('ssl_match_hostname', )
site-packages/pip/_vendor/urllib3/packages/ordered_dict.py000064400000021347147511334600017656 0ustar00# Backport of OrderedDict() class that runs on Python 2.4, 2.5, 2.6, 2.7 and pypy.
# Passes Python2.7's test suite and incorporates all the latest updates.
# Copyright 2009 Raymond Hettinger, released under the MIT License.
# http://code.activestate.com/recipes/576693/
try:
    from thread import get_ident as _get_ident
except ImportError:
    from dummy_thread import get_ident as _get_ident

try:
    from _abcoll import KeysView, ValuesView, ItemsView
except ImportError:
    pass


class OrderedDict(dict):
    'Dictionary that remembers insertion order'
    # An inherited dict maps keys to values.
    # The inherited dict provides __getitem__, __len__, __contains__, and get.
    # The remaining methods are order-aware.
    # Big-O running times for all methods are the same as for regular dictionaries.

    # The internal self.__map dictionary maps keys to links in a doubly linked list.
    # The circular doubly linked list starts and ends with a sentinel element.
    # The sentinel element never gets deleted (this simplifies the algorithm).
    # Each link is stored as a list of length three:  [PREV, NEXT, KEY].

    def __init__(self, *args, **kwds):
        '''Initialize an ordered dictionary.  Signature is the same as for
        regular dictionaries, but keyword arguments are not recommended
        because their insertion order is arbitrary.

        '''
        if len(args) > 1:
            raise TypeError('expected at most 1 arguments, got %d' % len(args))
        try:
            self.__root
        except AttributeError:
            self.__root = root = []                     # sentinel node
            root[:] = [root, root, None]
            self.__map = {}
        self.__update(*args, **kwds)

    def __setitem__(self, key, value, dict_setitem=dict.__setitem__):
        'od.__setitem__(i, y) <==> od[i]=y'
        # Setting a new item creates a new link which goes at the end of the linked
        # list, and the inherited dictionary is updated with the new key/value pair.
        if key not in self:
            root = self.__root
            last = root[0]
            last[1] = root[0] = self.__map[key] = [last, root, key]
        dict_setitem(self, key, value)

    def __delitem__(self, key, dict_delitem=dict.__delitem__):
        'od.__delitem__(y) <==> del od[y]'
        # Deleting an existing item uses self.__map to find the link which is
        # then removed by updating the links in the predecessor and successor nodes.
        dict_delitem(self, key)
        link_prev, link_next, key = self.__map.pop(key)
        link_prev[1] = link_next
        link_next[0] = link_prev

    def __iter__(self):
        'od.__iter__() <==> iter(od)'
        root = self.__root
        curr = root[1]
        while curr is not root:
            yield curr[2]
            curr = curr[1]

    def __reversed__(self):
        'od.__reversed__() <==> reversed(od)'
        root = self.__root
        curr = root[0]
        while curr is not root:
            yield curr[2]
            curr = curr[0]

    def clear(self):
        'od.clear() -> None.  Remove all items from od.'
        try:
            for node in self.__map.itervalues():
                del node[:]
            root = self.__root
            root[:] = [root, root, None]
            self.__map.clear()
        except AttributeError:
            pass
        dict.clear(self)

    def popitem(self, last=True):
        '''od.popitem() -> (k, v), return and remove a (key, value) pair.
        Pairs are returned in LIFO order if last is true or FIFO order if false.

        '''
        if not self:
            raise KeyError('dictionary is empty')
        root = self.__root
        if last:
            link = root[0]
            link_prev = link[0]
            link_prev[1] = root
            root[0] = link_prev
        else:
            link = root[1]
            link_next = link[1]
            root[1] = link_next
            link_next[0] = root
        key = link[2]
        del self.__map[key]
        value = dict.pop(self, key)
        return key, value

    # -- the following methods do not depend on the internal structure --

    def keys(self):
        'od.keys() -> list of keys in od'
        return list(self)

    def values(self):
        'od.values() -> list of values in od'
        return [self[key] for key in self]

    def items(self):
        'od.items() -> list of (key, value) pairs in od'
        return [(key, self[key]) for key in self]

    def iterkeys(self):
        'od.iterkeys() -> an iterator over the keys in od'
        return iter(self)

    def itervalues(self):
        'od.itervalues() -> an iterator over the values in od'
        for k in self:
            yield self[k]

    def iteritems(self):
        'od.iteritems() -> an iterator over the (key, value) pairs in od'
        for k in self:
            yield (k, self[k])

    def update(*args, **kwds):
        '''od.update(E, **F) -> None.  Update od from dict/iterable E and F.

        If E is a dict instance, does:           for k in E: od[k] = E[k]
        If E has a .keys() method, does:         for k in E.keys(): od[k] = E[k]
        Or if E is an iterable of items, does:   for k, v in E: od[k] = v
        In either case, this is followed by:     for k, v in F.items(): od[k] = v

        '''
        if len(args) > 2:
            raise TypeError('update() takes at most 2 positional '
                            'arguments (%d given)' % (len(args),))
        elif not args:
            raise TypeError('update() takes at least 1 argument (0 given)')
        self = args[0]
        # Make progressively weaker assumptions about "other"
        other = ()
        if len(args) == 2:
            other = args[1]
        if isinstance(other, dict):
            for key in other:
                self[key] = other[key]
        elif hasattr(other, 'keys'):
            for key in other.keys():
                self[key] = other[key]
        else:
            for key, value in other:
                self[key] = value
        for key, value in kwds.items():
            self[key] = value

    __update = update  # let subclasses override update without breaking __init__

    __marker = object()

    def pop(self, key, default=__marker):
        '''od.pop(k[,d]) -> v, remove specified key and return the corresponding value.
        If key is not found, d is returned if given, otherwise KeyError is raised.

        '''
        if key in self:
            result = self[key]
            del self[key]
            return result
        if default is self.__marker:
            raise KeyError(key)
        return default

    def setdefault(self, key, default=None):
        'od.setdefault(k[,d]) -> od.get(k,d), also set od[k]=d if k not in od'
        if key in self:
            return self[key]
        self[key] = default
        return default

    def __repr__(self, _repr_running={}):
        'od.__repr__() <==> repr(od)'
        call_key = id(self), _get_ident()
        if call_key in _repr_running:
            return '...'
        _repr_running[call_key] = 1
        try:
            if not self:
                return '%s()' % (self.__class__.__name__,)
            return '%s(%r)' % (self.__class__.__name__, self.items())
        finally:
            del _repr_running[call_key]

    def __reduce__(self):
        'Return state information for pickling'
        items = [[k, self[k]] for k in self]
        inst_dict = vars(self).copy()
        for k in vars(OrderedDict()):
            inst_dict.pop(k, None)
        if inst_dict:
            return (self.__class__, (items,), inst_dict)
        return self.__class__, (items,)

    def copy(self):
        'od.copy() -> a shallow copy of od'
        return self.__class__(self)

    @classmethod
    def fromkeys(cls, iterable, value=None):
        '''OD.fromkeys(S[, v]) -> New ordered dictionary with keys from S
        and values equal to v (which defaults to None).

        '''
        d = cls()
        for key in iterable:
            d[key] = value
        return d

    def __eq__(self, other):
        '''od.__eq__(y) <==> od==y.  Comparison to another OD is order-sensitive
        while comparison to a regular mapping is order-insensitive.

        '''
        if isinstance(other, OrderedDict):
            return len(self)==len(other) and self.items() == other.items()
        return dict.__eq__(self, other)

    def __ne__(self, other):
        return not self == other

    # -- the following methods are only used in Python 2.7 --

    def viewkeys(self):
        "od.viewkeys() -> a set-like object providing a view on od's keys"
        return KeysView(self)

    def viewvalues(self):
        "od.viewvalues() -> an object providing a view on od's values"
        return ValuesView(self)

    def viewitems(self):
        "od.viewitems() -> a set-like object providing a view on od's items"
        return ItemsView(self)
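# Example (illustrative): insertion order is preserved and popitem() can pop
# from either end of the ordering.
#
#   od = OrderedDict()
#   od['b'] = 1
#   od['a'] = 2
#   list(od.keys())            # ['b', 'a'] -- insertion order, not sorted
#   od.popitem(last=False)     # ('b', 1)  -- FIFO end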
site-packages/pip/_vendor/urllib3/packages/backports/makefile.py000064400000002665147511334600020776 0ustar00# -*- coding: utf-8 -*-
"""
backports.makefile
~~~~~~~~~~~~~~~~~~

Backports the Python 3 ``socket.makefile`` method for use with anything that
wants to create a "fake" socket object.
"""
import io

from socket import SocketIO


def backport_makefile(self, mode="r", buffering=None, encoding=None,
                      errors=None, newline=None):
    """
    Backport of ``socket.makefile`` from Python 3.5.
    """
    if not set(mode) <= set(["r", "w", "b"]):
        raise ValueError(
            "invalid mode %r (only r, w, b allowed)" % (mode,)
        )
    writing = "w" in mode
    reading = "r" in mode or not writing
    assert reading or writing
    binary = "b" in mode
    rawmode = ""
    if reading:
        rawmode += "r"
    if writing:
        rawmode += "w"
    raw = SocketIO(self, rawmode)
    self._makefile_refs += 1
    if buffering is None:
        buffering = -1
    if buffering < 0:
        buffering = io.DEFAULT_BUFFER_SIZE
    if buffering == 0:
        if not binary:
            raise ValueError("unbuffered streams must be binary")
        return raw
    if reading and writing:
        buffer = io.BufferedRWPair(raw, raw, buffering)
    elif reading:
        buffer = io.BufferedReader(raw, buffering)
    else:
        assert writing
        buffer = io.BufferedWriter(raw, buffering)
    if binary:
        return buffer
    text = io.TextIOWrapper(buffer, encoding, errors, newline)
    text.mode = mode
    return text
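# Example (illustrative): attaching the backport as the ``makefile`` method of
# a socket-like wrapper class; ``WrappedSocket`` is an assumed stand-in for
# such a class (urllib3's pyopenssl contrib module uses it this way).
#
#   class WrappedSocket(object):
#       _makefile_refs = 0
#       # ... fileno()/recv()/send()/etc. as required by socket.SocketIO ...
#
#   WrappedSocket.makefile = backport_makefile
#   # fileobj = WrappedSocket(sock).makefile('rb')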
site-packages/pip/_vendor/urllib3/packages/backports/__init__.py000064400000000000147511334600020735 0ustar00
getproxiesZRequestZOpenerDirectorZHTTPDefaultErrorHandlerZHTTPRedirectHandlerZHTTPCookieProcessorZProxyHandlerZBaseHandlerZHTTPPasswordMgrZHTTPPasswordMgrWithDefaultRealmZAbstractBasicAuthHandlerZHTTPBasicAuthHandlerZProxyBasicAuthHandlerZAbstractDigestAuthHandlerZHTTPDigestAuthHandlerZProxyDigestAuthHandlerZHTTPHandlerZHTTPSHandlerZFileHandlerZ
FTPHandlerZCacheFTPHandlerZUnknownHandlerZHTTPErrorProcessorZurlretrieveZ
urlcleanupZ	URLopenerZFancyURLopenerZproxy_bypassz.moves.urllib.requestzmoves.urllib_requestzmoves.urllib.requestc@seZdZdZdS)� Module_six_moves_urllib_responsez:Lazy loading of moved objects in six.moves.urllib_responseN)rrrrr
r
r
rrq�srqZaddbasezurllib.responseZaddclosehookZaddinfoZ
addinfourlz.moves.urllib.responsezmoves.urllib_responsezmoves.urllib.responsec@seZdZdZdS)�#Module_six_moves_urllib_robotparserz=Lazy loading of moved objects in six.moves.urllib_robotparserN)rrrrr
r
r
rrr�srrZRobotFileParserz.moves.urllib.robotparserzmoves.urllib_robotparserzmoves.urllib.robotparserc@sNeZdZdZgZejd�Zejd�Zejd�Z	ejd�Z
ejd�Zdd�Zd	S)
�Module_six_moves_urllibzICreate a six.moves.urllib namespace that resembles the Python 3 namespacezmoves.urllib_parsezmoves.urllib_errorzmoves.urllib_requestzmoves.urllib_responsezmoves.urllib_robotparsercCsdddddgS)N�parse�error�request�responserjr
)rr
r
rr6�szModule_six_moves_urllib.__dir__N)
rrrrrG�	_importerr>rtrurvrwrjr6r
r
r
rrs�s




rszmoves.urllibcCstt|j|�dS)zAdd an item to six.moves.N)rrLr)Zmover
r
r�add_move�srycCsXytt|�WnDtk
rRytj|=Wn"tk
rLtd|f��YnXYnXdS)zRemove item from six.moves.zno such move, %rN)rrLr!rm�__dict__rA)rr
r
r�remove_move�sr{�__func__�__self__�__closure__�__code__�__defaults__�__globals__�im_funcZim_selfZfunc_closureZ	func_codeZ
func_defaultsZfunc_globalscCs|j�S)N)�next)�itr
r
r�advance_iteratorsr�cCstdd�t|�jD��S)Ncss|]}d|jkVqdS)�__call__N)rz)r3�klassr
r
r�	<genexpr>szcallable.<locals>.<genexpr>)�any�type�__mro__)r"r
r
r�callablesr�cCs|S)Nr
)�unboundr
r
r�get_unbound_functionsr�cCs|S)Nr
)r�clsr
r
r�create_unbound_methodsr�cCs|jS)N)r�)r�r
r
rr�"scCstj|||j�S)N)�types�
MethodTyper )rr"r
r
r�create_bound_method%sr�cCstj|d|�S)N)r�r�)rr�r
r
rr�(sc@seZdZdd�ZdS)�IteratorcCst|�j|�S)N)r��__next__)rr
r
rr�-sz
Iterator.nextN)rrrr�r
r
r
rr�+sr�z3Get the function out of a possibly unbound functioncKst|jf|��S)N)�iter�keys)�d�kwr
r
r�iterkeys>sr�cKst|jf|��S)N)r��values)r�r�r
r
r�
itervaluesAsr�cKst|jf|��S)N)r��items)r�r�r
r
r�	iteritemsDsr�cKst|jf|��S)N)r�Zlists)r�r�r
r
r�	iterlistsGsr�r�r�r�cKs|jf|�S)N)r�)r�r�r
r
rr�PscKs|jf|�S)N)r�)r�r�r
r
rr�SscKs|jf|�S)N)r�)r�r�r
r
rr�VscKs|jf|�S)N)r�)r�r�r
r
rr�Ys�viewkeys�
viewvalues�	viewitemsz1Return an iterator over the keys of a dictionary.z3Return an iterator over the values of a dictionary.z?Return an iterator over the (key, value) pairs of a dictionary.zBReturn an iterator over the (key, [values]) pairs of a dictionary.cCs
|jd�S)Nzlatin-1)�encode)�sr
r
r�bksr�cCs|S)Nr
)r�r
r
r�unsr�z>B�assertCountEqualZassertRaisesRegexpZassertRegexpMatches�assertRaisesRegex�assertRegexcCs|S)Nr
)r�r
r
rr��scCst|jdd�d�S)Nz\\z\\\\Zunicode_escape)�unicode�replace)r�r
r
rr��scCst|d�S)Nr)�ord)Zbsr
r
r�byte2int�sr�cCst||�S)N)r�)Zbuf�ir
r
r�
indexbytes�sr�ZassertItemsEqualzByte literalzText literalcOst|t�||�S)N)r,�_assertCountEqual)r�args�kwargsr
r
rr��scOst|t�||�S)N)r,�_assertRaisesRegex)rr�r�r
r
rr��scOst|t�||�S)N)r,�_assertRegex)rr�r�r
r
rr��s�execcCs*|dkr|�}|j|k	r"|j|��|�dS)N)�
__traceback__�with_traceback)r#r/�tbr
r
r�reraise�s


r�cCsB|dkr*tjd�}|j}|dkr&|j}~n|dkr6|}td�dS)zExecute code in a namespace.Nrzexec _code_ in _globs_, _locs_)r�	_getframe�	f_globals�f_localsr�)Z_code_Z_globs_Z_locs_�framer
r
r�exec_�s
r�z9def reraise(tp, value, tb=None):
    raise tp, value, tb
zrdef raise_from(value, from_value):
    if from_value is None:
        raise value
    raise value from from_value
zCdef raise_from(value, from_value):
    raise value from from_value
cCs|�dS)Nr
)r/Z
from_valuer
r
r�
raise_from�sr��printc
s6|jdtj���dkrdS�fdd�}d}|jdd�}|dk	r`t|t�rNd}nt|t�s`td��|jd	d�}|dk	r�t|t�r�d}nt|t�s�td
��|r�td��|s�x|D]}t|t�r�d}Pq�W|r�td�}td
�}nd}d
}|dkr�|}|dk�r�|}x,t|�D] \}	}|	�r||�||��qW||�dS)z4The new-style print function for Python 2.4 and 2.5.�fileNcsdt|t�st|�}t�t�rVt|t�rV�jdk	rVt�dd�}|dkrHd}|j�j|�}�j|�dS)N�errors�strict)	rD�
basestring�strr�r��encodingr,r��write)�datar�)�fpr
rr��s



zprint_.<locals>.writeF�sepTzsep must be None or a string�endzend must be None or a stringz$invalid keyword arguments to print()�
� )�popr�stdoutrDr�r��	TypeError�	enumerate)
r�r�r�Zwant_unicoder�r��arg�newlineZspacer�r
)r�r�print_�sL







r�cOs<|jdtj�}|jdd�}t||�|r8|dk	r8|j�dS)Nr��flushF)�getrr�r��_printr�)r�r�r�r�r
r
rr�s

zReraise an exception.cs���fdd�}|S)Ncstj����|�}�|_|S)N)r^�wraps�__wrapped__)�f)�assigned�updated�wrappedr
r�wrapperszwraps.<locals>.wrapperr
)r�r�r�r�r
)r�r�r�rr�sr�cs&G��fdd�d��}tj|dfi�S)z%Create a base class with a metaclass.cseZdZ��fdd�ZdS)z!with_metaclass.<locals>.metaclasscs�|�|�S)Nr
)r�rZ
this_basesr�)�bases�metar
r�__new__'sz)with_metaclass.<locals>.metaclass.__new__N)rrrr�r
)r�r�r
r�	metaclass%sr�Ztemporary_class)r�r�)r�r�r�r
)r�r�r�with_metaclass sr�cs�fdd�}|S)z6Class decorator for creating a class with a metaclass.csl|jj�}|jd�}|dk	rDt|t�r,|g}x|D]}|j|�q2W|jdd�|jdd��|j|j|�S)N�	__slots__rz�__weakref__)rz�copyr�rDr�r�r�	__bases__)r�Z	orig_vars�slotsZ	slots_var)r�r
rr�.s



zadd_metaclass.<locals>.wrapperr
)r�r�r
)r�r�
add_metaclass,sr�cCs2tr.d|jkrtd|j��|j|_dd�|_|S)a
    A decorator that defines __unicode__ and __str__ methods under Python 2.
    Under Python 3 it does nothing.

    To support Python 2 and 3 with a single code base, define a __str__ method
    returning text and apply this decorator to the class.
    �__str__zY@python_2_unicode_compatible cannot be applied to %s because it doesn't define __str__().cSs|j�jd�S)Nzutf-8)�__unicode__r�)rr
r
r�<lambda>Jsz-python_2_unicode_compatible.<locals>.<lambda>)�PY2rz�
ValueErrorrr�r�)r�r
r
r�python_2_unicode_compatible<s


r��__spec__)rrli���li���ll����)N)NN)rr)rr)rr)rr)�rZ
__future__rr^rP�operatorrr��
__author__�__version__�version_infor�r(ZPY34r�Zstring_types�intZ
integer_typesr�Zclass_typesZ	text_type�bytesZbinary_type�maxsizeZMAXSIZEr�ZlongZ	ClassTyper��platform�
startswith�objectr	�len�
OverflowErrorrrrr&�
ModuleTyper2r7r9rrxrLr5r-rrrDr=rmrnZ_urllib_parse_moved_attributesroZ_urllib_error_moved_attributesrpZ _urllib_request_moved_attributesrqZ!_urllib_response_moved_attributesrrZ$_urllib_robotparser_moved_attributesrsryr{Z
_meth_funcZ
_meth_selfZ
_func_closureZ
_func_codeZ_func_defaultsZ
_func_globalsr�r��	NameErrorr�r�r�r�r�r��
attrgetterZget_method_functionZget_method_selfZget_function_closureZget_function_codeZget_function_defaultsZget_function_globalsr�r�r�r��methodcallerr�r�r�r�r��chrZunichr�struct�Struct�packZint2byte�
itemgetterr��getitemr�r�Z	iterbytesrMrN�BytesIOr�r�r��partialrVr�r�r�r�r,rQr�r�r�r�r��WRAPPER_ASSIGNMENTS�WRAPPER_UPDATESr�r�r�r�rG�__package__�globalsr�r��submodule_search_locations�	meta_pathr�r�Zimporter�appendr
r
r
r�<module>s�

>












































































































5site-packages/pip/_vendor/urllib3/packages/__pycache__/__init__.cpython-36.pyc000064400000000356147511334600023247 0ustar003

���em�@s ddlmZddlmZdZdS)�)�absolute_import�)�ssl_match_hostnamerN)r)Z
__future__r�r�__all__�rr�/usr/lib/python3.6/__init__.py�<module>ssite-packages/pip/_vendor/urllib3/packages/__pycache__/ordered_dict.cpython-36.opt-1.pyc000064400000020173147511334600025075 0ustar003

���e�"�@styddlmZWn ek
r0ddlmZYnXyddlmZmZmZWnek
r^YnXGdd�de	�Z
dS)�)�	get_ident)�KeysView�
ValuesView�	ItemsViewc@seZdZdZdd�Zejfdd�Zejfdd�Zdd	�Zd
d�Z	dd
�Z
d6dd�Zdd�Zdd�Z
dd�Zdd�Zdd�Zdd�Zdd�ZeZe�Zefdd �Zd7d"d#�Zifd$d%�Zd&d'�Zd(d)�Zed8d*d+��Zd,d-�Zd.d/�Zd0d1�Zd2d3�Z d4d5�Z!d!S)9�OrderedDictz)Dictionary that remembers insertion ordercOsnt|�dkrtdt|���y
|jWn6tk
r\g|_}||dg|dd�<i|_YnX|j||�dS)z�Initialize an ordered dictionary.  Signature is the same as for
        regular dictionaries, but keyword arguments are not recommended
        because their insertion order is arbitrary.

        �z$expected at most 1 arguments, got %dN)�len�	TypeError�_OrderedDict__root�AttributeError�_OrderedDict__map�_OrderedDict__update)�self�args�kwds�root�r�"/usr/lib/python3.6/ordered_dict.py�__init__s

zOrderedDict.__init__cCsF||kr6|j}|d}|||g|d<|d<|j|<||||�dS)z!od.__setitem__(i, y) <==> od[i]=yrrN)r
r)r�key�valueZdict_setitemr�lastrrr�__setitem__,s
 zOrderedDict.__setitem__cCs0|||�|jj|�\}}}||d<||d<dS)z od.__delitem__(y) <==> del od[y]rrN)r�pop)rrZdict_delitem�	link_prev�	link_nextrrr�__delitem__6s
zOrderedDict.__delitem__ccs2|j}|d}x||k	r,|dV|d}qWdS)zod.__iter__() <==> iter(od)r�N)r
)rr�currrrr�__iter__?s


zOrderedDict.__iter__ccs2|j}|d}x||k	r,|dV|d}qWdS)z#od.__reversed__() <==> reversed(od)rrN)r
)rrrrrr�__reversed__Gs


zOrderedDict.__reversed__cCshyDx|jj�D]}|dd�=qW|j}||dg|dd�<|jj�Wntk
rXYnXtj|�dS)z.od.clear() -> None.  Remove all items from od.N)r�
itervaluesr
�clearr�dict)rZnoderrrrr"OszOrderedDict.clearTcCs||std��|j}|r8|d}|d}||d<||d<n |d}|d}||d<||d<|d}|j|=tj||�}||fS)z�od.popitem() -> (k, v), return and remove a (key, value) pair.
        Pairs are returned in LIFO order if last is true or FIFO order if false.

        zdictionary is emptyrrr)�KeyErrorr
rr#r)rrr�linkrrrrrrr�popitem[s 
zOrderedDict.popitemcCst|�S)zod.keys() -> list of keys in od)�list)rrrr�keystszOrderedDict.keyscs�fdd��D�S)z#od.values() -> list of values in odcsg|]}�|�qSrr)�.0r)rrr�
<listcomp>zsz&OrderedDict.values.<locals>.<listcomp>r)rr)rr�valuesxszOrderedDict.valuescs�fdd��D�S)z.od.items() -> list of (key, value) pairs in odcsg|]}|�|f�qSrr)r)r)rrrr*~sz%OrderedDict.items.<locals>.<listcomp>r)rr)rr�items|szOrderedDict.itemscCst|�S)z0od.iterkeys() -> an iterator over the keys in od)�iter)rrrr�iterkeys�szOrderedDict.iterkeysccsx|D]}||VqWdS)z2od.itervalues -> an iterator over the values in odNr)r�krrrr!�s
zOrderedDict.itervaluesccs x|D]}|||fVqWdS)z=od.iteritems -> an iterator over the (key, value) items in odNr)rr/rrr�	iteritems�s
zOrderedDict.iteritemscOs�t|�dkr tdt|�f��n|s,td��|d}f}t|�dkrL|d}t|t�rrx^|D]}||||<q\WnDt|d�r�x8|j�D]}||||<q�Wnx|D]\}}|||<q�Wx|j�D]\}}|||<q�WdS)a�od.update(E, **F) -> None.  Update od from dict/iterable E and F.

        If E is a dict instance, does:           for k in E: od[k] = E[k]
        If E has a .keys() method, does:         for k in E.keys(): od[k] = E[k]
        Or if E is an iterable of items, does:   for k, v in E: od[k] = v
        In either case, this is followed by:     for k, v in F.items(): od[k] = v

        rz8update() takes at most 2 positional arguments (%d given)z,update() takes at least 1 argument (0 given)rrr(N)rr	�
isinstancer#�hasattrr(r,)rrr�otherrrrrr�update�s&	


zOrderedDict.updatecCs0||kr||}||=|S||jkr,t|��|S)z�od.pop(k[,d]) -> v, remove specified key and return the corresponding value.
        If key is not found, d is returned if given, otherwise KeyError is raised.

        )�_OrderedDict__markerr$)rr�default�resultrrrr�s
zOrderedDict.popNcCs||kr||S|||<|S)zDod.setdefault(k[,d]) -> od.get(k,d), also set od[k]=d if k not in odr)rrr6rrr�
setdefault�szOrderedDict.setdefaultcCsVt|�t�f}||krdSd||<z&|s6d|jjfSd|jj|j�fS||=XdS)zod.__repr__() <==> repr(od)z...rz%s()z%s(%r)N)�id�
_get_ident�	__class__�__name__r,)rZ
_repr_runningZcall_keyrrr�__repr__�szOrderedDict.__repr__cs\�fdd��D�}t��j�}xtt��D]}|j|d�q*W|rP�j|f|fS�j|ffS)z%Return state information for picklingcsg|]}|�|g�qSrr)r)r/)rrrr*�sz*OrderedDict.__reduce__.<locals>.<listcomp>N)�vars�copyrrr;)rr,Z	inst_dictr/r)rr�
__reduce__�szOrderedDict.__reduce__cCs
|j|�S)z!od.copy() -> a shallow copy of od)r;)rrrrr?�szOrderedDict.copycCs |�}x|D]}|||<qW|S)z�OD.fromkeys(S[, v]) -> New ordered dictionary with keys from S
        and values equal to v (which defaults to None).

        r)�cls�iterabler�drrrr�fromkeys�s
zOrderedDict.fromkeyscCs6t|t�r*t|�t|�ko(|j�|j�kStj||�S)z�od.__eq__(y) <==> od==y.  Comparison to another OD is order-sensitive
        while comparison to a regular mapping is order-insensitive.

        )r1rrr,r#�__eq__)rr3rrrrE�s
 zOrderedDict.__eq__cCs
||kS)Nr)rr3rrr�__ne__�szOrderedDict.__ne__cCst|�S)z@od.viewkeys() -> a set-like object providing a view on od's keys)r)rrrr�viewkeys�szOrderedDict.viewkeyscCst|�S)z<od.viewvalues() -> an object providing a view on od's values)r)rrrr�
viewvalues�szOrderedDict.viewvaluescCst|�S)zBod.viewitems() -> a set-like object providing a view on od's items)r)rrrr�	viewitemsszOrderedDict.viewitems)T)N)N)"r<�
__module__�__qualname__�__doc__rr#rrrr r"r&r(r+r,r.r!r0r4r
�objectr5rr8r=r@r?�classmethodrDrErFrGrHrIrrrrrs:
	




	rN)Zthreadrr:�ImportErrorZdummy_threadZ_abcollrrrr#rrrrr�<module>ssite-packages/pip/_vendor/urllib3/packages/__pycache__/ordered_dict.cpython-36.pyc000064400000020173147511334600024136 0ustar003

���e�"�@styddlmZWn ek
r0ddlmZYnXyddlmZmZmZWnek
r^YnXGdd�de	�Z
dS)�)�	get_ident)�KeysView�
ValuesView�	ItemsViewc@seZdZdZdd�Zejfdd�Zejfdd�Zdd	�Zd
d�Z	dd
�Z
d6dd�Zdd�Zdd�Z
dd�Zdd�Zdd�Zdd�Zdd�ZeZe�Zefdd �Zd7d"d#�Zifd$d%�Zd&d'�Zd(d)�Zed8d*d+��Zd,d-�Zd.d/�Zd0d1�Zd2d3�Z d4d5�Z!d!S)9�OrderedDictz)Dictionary that remembers insertion ordercOsnt|�dkrtdt|���y
|jWn6tk
r\g|_}||dg|dd�<i|_YnX|j||�dS)z�Initialize an ordered dictionary.  Signature is the same as for
        regular dictionaries, but keyword arguments are not recommended
        because their insertion order is arbitrary.

        �z$expected at most 1 arguments, got %dN)�len�	TypeError�_OrderedDict__root�AttributeError�_OrderedDict__map�_OrderedDict__update)�self�args�kwds�root�r�"/usr/lib/python3.6/ordered_dict.py�__init__s

zOrderedDict.__init__cCsF||kr6|j}|d}|||g|d<|d<|j|<||||�dS)z!od.__setitem__(i, y) <==> od[i]=yrrN)r
r)r�key�valueZdict_setitemr�lastrrr�__setitem__,s
 zOrderedDict.__setitem__cCs0|||�|jj|�\}}}||d<||d<dS)z od.__delitem__(y) <==> del od[y]rrN)r�pop)rrZdict_delitem�	link_prev�	link_nextrrr�__delitem__6s
zOrderedDict.__delitem__ccs2|j}|d}x||k	r,|dV|d}qWdS)zod.__iter__() <==> iter(od)r�N)r
)rr�currrrr�__iter__?s


zOrderedDict.__iter__ccs2|j}|d}x||k	r,|dV|d}qWdS)z#od.__reversed__() <==> reversed(od)rrN)r
)rrrrrr�__reversed__Gs


zOrderedDict.__reversed__cCshyDx|jj�D]}|dd�=qW|j}||dg|dd�<|jj�Wntk
rXYnXtj|�dS)z.od.clear() -> None.  Remove all items from od.N)r�
itervaluesr
�clearr�dict)rZnoderrrrr"OszOrderedDict.clearTcCs||std��|j}|r8|d}|d}||d<||d<n |d}|d}||d<||d<|d}|j|=tj||�}||fS)z�od.popitem() -> (k, v), return and remove a (key, value) pair.
        Pairs are returned in LIFO order if last is true or FIFO order if false.

        zdictionary is emptyrrr)�KeyErrorr
rr#r)rrr�linkrrrrrrr�popitem[s 
zOrderedDict.popitemcCst|�S)zod.keys() -> list of keys in od)�list)rrrr�keystszOrderedDict.keyscs�fdd��D�S)z#od.values() -> list of values in odcsg|]}�|�qSrr)�.0r)rrr�
<listcomp>zsz&OrderedDict.values.<locals>.<listcomp>r)rr)rr�valuesxszOrderedDict.valuescs�fdd��D�S)z.od.items() -> list of (key, value) pairs in odcsg|]}|�|f�qSrr)r)r)rrrr*~sz%OrderedDict.items.<locals>.<listcomp>r)rr)rr�items|szOrderedDict.itemscCst|�S)z0od.iterkeys() -> an iterator over the keys in od)�iter)rrrr�iterkeys�szOrderedDict.iterkeysccsx|D]}||VqWdS)z2od.itervalues -> an iterator over the values in odNr)r�krrrr!�s
zOrderedDict.itervaluesccs x|D]}|||fVqWdS)z=od.iteritems -> an iterator over the (key, value) items in odNr)rr/rrr�	iteritems�s
zOrderedDict.iteritemscOs�t|�dkr tdt|�f��n|s,td��|d}f}t|�dkrL|d}t|t�rrx^|D]}||||<q\WnDt|d�r�x8|j�D]}||||<q�Wnx|D]\}}|||<q�Wx|j�D]\}}|||<q�WdS)a�od.update(E, **F) -> None.  Update od from dict/iterable E and F.

        If E is a dict instance, does:           for k in E: od[k] = E[k]
        If E has a .keys() method, does:         for k in E.keys(): od[k] = E[k]
        Or if E is an iterable of items, does:   for k, v in E: od[k] = v
        In either case, this is followed by:     for k, v in F.items(): od[k] = v

        rz8update() takes at most 2 positional arguments (%d given)z,update() takes at least 1 argument (0 given)rrr(N)rr	�
isinstancer#�hasattrr(r,)rrr�otherrrrrr�update�s&	


zOrderedDict.updatecCs0||kr||}||=|S||jkr,t|��|S)z�od.pop(k[,d]) -> v, remove specified key and return the corresponding value.
        If key is not found, d is returned if given, otherwise KeyError is raised.

        )�_OrderedDict__markerr$)rr�default�resultrrrr�s
zOrderedDict.popNcCs||kr||S|||<|S)zDod.setdefault(k[,d]) -> od.get(k,d), also set od[k]=d if k not in odr)rrr6rrr�
setdefault�szOrderedDict.setdefaultcCsVt|�t�f}||krdSd||<z&|s6d|jjfSd|jj|j�fS||=XdS)zod.__repr__() <==> repr(od)z...rz%s()z%s(%r)N)�id�
_get_ident�	__class__�__name__r,)rZ
_repr_runningZcall_keyrrr�__repr__�szOrderedDict.__repr__cs\�fdd��D�}t��j�}xtt��D]}|j|d�q*W|rP�j|f|fS�j|ffS)z%Return state information for picklingcsg|]}|�|g�qSrr)r)r/)rrrr*�sz*OrderedDict.__reduce__.<locals>.<listcomp>N)�vars�copyrrr;)rr,Z	inst_dictr/r)rr�
__reduce__�szOrderedDict.__reduce__cCs
|j|�S)z!od.copy() -> a shallow copy of od)r;)rrrrr?�szOrderedDict.copycCs |�}x|D]}|||<qW|S)z�OD.fromkeys(S[, v]) -> New ordered dictionary with keys from S
        and values equal to v (which defaults to None).

        r)�cls�iterabler�drrrr�fromkeys�s
zOrderedDict.fromkeyscCs6t|t�r*t|�t|�ko(|j�|j�kStj||�S)z�od.__eq__(y) <==> od==y.  Comparison to another OD is order-sensitive
        while comparison to a regular mapping is order-insensitive.

        )r1rrr,r#�__eq__)rr3rrrrE�s
 zOrderedDict.__eq__cCs
||kS)Nr)rr3rrr�__ne__�szOrderedDict.__ne__cCst|�S)z@od.viewkeys() -> a set-like object providing a view on od's keys)r)rrrr�viewkeys�szOrderedDict.viewkeyscCst|�S)z<od.viewvalues() -> an object providing a view on od's values)r)rrrr�
viewvalues�szOrderedDict.viewvaluescCst|�S)zBod.viewitems() -> a set-like object providing a view on od's items)r)rrrr�	viewitemsszOrderedDict.viewitems)T)N)N)"r<�
__module__�__qualname__�__doc__rr#rrrr r"r&r(r+r,r.r!r0r4r
�objectr5rr8r=r@r?�classmethodrDrErFrGrHrIrrrrrs:
	




	rN)Zthreadrr:�ImportErrorZdummy_threadZ_abcollrrrr#rrrrr�<module>ssite-packages/pip/_vendor/urllib3/packages/__pycache__/six.cpython-36.pyc000064400000057532147511334600022323 0ustar003

���e�u�I@srdZddlmZddlZddlZddlZddlZddlZdZdZ	ej
ddkZej
ddkZej
dd��dzkZ
er�efZefZefZeZeZejZn�efZeefZeejfZeZeZejjd	�r�e�d|�ZnLGdd
�d
e�Z ye!e ��Wn e"k
�re�d~�ZYnXe�d��Z[ dd�Z#dd�Z$Gdd�de�Z%Gdd�de%�Z&Gdd�dej'�Z(Gdd�de%�Z)Gdd�de�Z*e*e+�Z,Gdd�de(�Z-e)ddd d!�e)d"d#d$d%d"�e)d&d#d#d'd&�e)d(d)d$d*d(�e)d+d)d,�e)d-d#d$d.d-�e)d/d0d0d1d/�e)d2d0d0d/d2�e)d3d)d$d4d3�e)d5d)e
�rd6nd7d8�e)d9d)d:�e)d;d<d=d>�e)d!d!d �e)d?d?d@�e)dAdAd@�e)dBdBd@�e)d4d)d$d4d3�e)dCd#d$dDdC�e)dEd#d#dFdE�e&d$d)�e&dGdH�e&dIdJ�e&dKdLdM�e&dNdOdN�e&dPdQdR�e&dSdTdU�e&dVdWdX�e&dYdZd[�e&d\d]d^�e&d_d`da�e&dbdcdd�e&dedfdg�e&dhdidj�e&dkdkdl�e&dmdmdl�e&dndndl�e&dododp�e&dqdr�e&dsdt�e&dudv�e&dwdxdw�e&dydz�e&d{d|d}�e&d~dd��e&d�d�d��e&d�d�d��e&d�d�d��e&d�d�d��e&d�d�d��e&d�d�d��e&d�d�d��e&d�d�d��e&d�d�d��e&d�d�d��e&d�d�d��e&d�d�d��e&d�e+d�d��e&d�e+d�d��e&d�e+d�e+d��e&d�d�d��e&d�d�d��e&d�d�d��g>Z.ejd�k�rZe.e&d�d��g7Z.x:e.D]2Z/e0e-e/j1e/�e2e/e&��r`e,j3e/d�e/j1��q`W[/e.e-_.e-e+d��Z4e,j3e4d��Gd�d��d�e(�Z5e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d>d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��gZ6xe6D]Z/e0e5e/j1e/��q�W[/e6e5_.e,j3e5e+d��d�dӃGd�dՄd�e(�Z7e)d�d�d��e)d�d�d��e)d�d�d��gZ8xe8D]Z/e0e7e/j1e/��q$W[/e8e7_.e,j3e7e+d��d�d܃Gd�dބd�e(�Z9e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)�dd�d�g!Z:xe:D]Z/e0e9e/j1e/��q�W[/e:e9_.e,j3e9e+�d��d�d�G�d�d��de(�Z;e)�dd��d�e)�dd��d�e)�d	d��d�e)�d
d��d�gZ<xe<D]Z/e0e;e/j1e/��qTW[/e<e;_.e,j3e;e+�d��d�d
�G�d�d��de(�Z=e)�dd�d��gZ>xe>D]Z/e0e=e/j1e/��q�W[/e>e=_.e,j3e=e+�d��d�d�G�d�d��dej'�Z?e,j3e?e+d���d��d�d�Z@�d�d�ZAe�	rj�dZB�dZC�dZD�dZE�dZF�d ZGn$�d!ZB�d"ZC�d#ZD�d$ZE�d%ZF�d&ZGyeHZIWn"eJk
�	r��d'�d(�ZIYnXeIZHyeKZKWn"eJk
�	r��d)�d*�ZKYnXe�
r�d+�d,�ZLejMZN�d-�d.�ZOeZPn>�d/�d,�ZL�d0�d1�ZN�d2�d.�ZOG�d3�d4��d4e�ZPeKZKe#eL�d5�ejQeB�ZRejQeC�ZSejQeD�ZTejQeE�ZUejQeF�ZVejQeG�ZWe�
r��d6�d7�ZX�d8�d9�ZY�d:�d;�ZZ�d<�d=�Z[ej\�d>�Z]ej\�d?�Z^ej\�d@�Z_nT�dA�d7�ZX�dB�d9�ZY�dC�d;�ZZ�dD�d=�Z[ej\�dE�Z]ej\�dF�Z^ej\�dG�Z_e#eX�dH�e#eY�dI�e#eZ�dJ�e#e[�dK�e�r�dL�dM�Z`�dN�dO�ZaebZcddldZdedje�dP�jfZg[dejhd�ZiejjZkelZmddlnZnenjoZoenjpZp�dQZqej
d
d
k�r�dRZr�dSZsn�dTZr�dUZsnj�dV�dM�Z`�dW�dO�ZaecZcebZg�dX�dY�Zi�dZ�d[�Zkejtejuev�ZmddloZoeojoZoZp�d\Zq�dRZr�dSZse#e`�d]�e#ea�d^��d_�dQ�Zw�d`�dT�Zx�da�dU�Zye�r�eze4j{�db�Z|�d��dc�dd�Z}n�d��de�df�Z|e|�dg�ej
dd��d�k�
re|�dh�n.ej
dd��d�k�
r8e|�di�n�dj�dk�Z~eze4j{�dld�Zedk�
rj�dm�dn�Zej
dd��d�k�
r�eZ��do�dn�Ze#e}�dp�ej
dd��d�k�
r�ej�ej�f�dq�dr�Z�nej�Z��ds�dt�Z��du�dv�Z��dw�dx�Z�gZ�e+Z�e��j��dy�dk	�rge�_�ej��rbx>e�ej��D]0\Z�Z�ee��j+dk�r*e�j1e+k�r*ej�e�=P�q*W[�[�ej�j�e,�dS(�z6Utilities for writing code that runs on Python 2 and 3�)�absolute_importNz'Benjamin Peterson <benjamin@python.org>z1.10.0����java��c@seZdZdd�ZdS)�XcCsdS)Nrrl�)�selfr
r
�/usr/lib/python3.6/six.py�__len__>sz	X.__len__N)�__name__�
__module__�__qualname__r
r
r
r
rr	<sr	�?cCs
||_dS)z Add documentation to a function.N)�__doc__)�func�docr
r
r�_add_docKsrcCst|�tj|S)z7Import module, returning the module after the last dot.)�
__import__�sys�modules)�namer
r
r�_import_modulePsrc@seZdZdd�Zdd�ZdS)�
_LazyDescrcCs
||_dS)N)r)rrr
r
r�__init__Xsz_LazyDescr.__init__cCsB|j�}t||j|�yt|j|j�Wntk
r<YnX|S)N)�_resolve�setattrr�delattr�	__class__�AttributeError)r�obj�tp�resultr
r
r�__get__[sz_LazyDescr.__get__N)rrrrr%r
r
r
rrVsrcs.eZdZd�fdd�	Zdd�Zdd�Z�ZS)	�MovedModuleNcs2tt|�j|�tr(|dkr |}||_n||_dS)N)�superr&r�PY3�mod)rr�old�new)r r
rriszMovedModule.__init__cCs
t|j�S)N)rr))rr
r
rrrszMovedModule._resolvecCs"|j�}t||�}t|||�|S)N)r�getattrr)r�attr�_module�valuer
r
r�__getattr__us
zMovedModule.__getattr__)N)rrrrrr0�
__classcell__r
r
)r rr&gs	r&cs(eZdZ�fdd�Zdd�ZgZ�ZS)�_LazyModulecstt|�j|�|jj|_dS)N)r'r2rr r)rr)r r
rr~sz_LazyModule.__init__cCs ddg}|dd�|jD�7}|S)NrrcSsg|]
}|j�qSr
)r)�.0r-r
r
r�
<listcomp>�sz'_LazyModule.__dir__.<locals>.<listcomp>)�_moved_attributes)rZattrsr
r
r�__dir__�sz_LazyModule.__dir__)rrrrr6r5r1r
r
)r rr2|sr2cs&eZdZd�fdd�	Zdd�Z�ZS)�MovedAttributeNcsdtt|�j|�trH|dkr |}||_|dkr@|dkr<|}n|}||_n||_|dkrZ|}||_dS)N)r'r7rr(r)r-)rrZold_modZnew_modZold_attrZnew_attr)r r
rr�szMovedAttribute.__init__cCst|j�}t||j�S)N)rr)r,r-)r�moduler
r
rr�s
zMovedAttribute._resolve)NN)rrrrrr1r
r
)r rr7�sr7c@sVeZdZdZdd�Zdd�Zdd�Zdd	d
�Zdd�Zd
d�Z	dd�Z
dd�ZeZdS)�_SixMetaPathImporterz�
    A meta path importer to import six.moves and its submodules.

    This class implements a PEP302 finder and loader. It should be compatible
    with Python 2.5 and all existing versions of Python3
    cCs||_i|_dS)N)r�
known_modules)rZsix_module_namer
r
rr�sz_SixMetaPathImporter.__init__cGs&x |D]}||j|jd|<qWdS)N�.)r:r)rr)Z	fullnames�fullnamer
r
r�_add_module�s
z _SixMetaPathImporter._add_modulecCs|j|jd|S)Nr;)r:r)rr<r
r
r�_get_module�sz _SixMetaPathImporter._get_moduleNcCs||jkr|SdS)N)r:)rr<�pathr
r
r�find_module�s
z _SixMetaPathImporter.find_modulecCs0y
|j|Stk
r*td|��YnXdS)Nz!This loader does not know module )r:�KeyError�ImportError)rr<r
r
rZ__get_module�s
z!_SixMetaPathImporter.__get_modulecCsRy
tj|Stk
rYnX|j|�}t|t�r>|j�}n||_|tj|<|S)N)rrrA� _SixMetaPathImporter__get_module�
isinstancer&r�
__loader__)rr<r)r
r
r�load_module�s




z _SixMetaPathImporter.load_modulecCst|j|�d�S)z�
        Return true, if the named module is a package.

        We need this method to get correct spec objects with
        Python 3.4 (see PEP451)
        �__path__)�hasattrrC)rr<r
r
r�
is_package�sz_SixMetaPathImporter.is_packagecCs|j|�dS)z;Return None

        Required, if is_package is implementedN)rC)rr<r
r
r�get_code�s
z_SixMetaPathImporter.get_code)N)
rrrrrr=r>r@rCrFrIrJ�
get_sourcer
r
r
rr9�s
	r9c@seZdZdZgZdS)�_MovedItemszLazy loading of moved objectsN)rrrrrGr
r
r
rrL�srLZ	cStringIO�io�StringIO�filter�	itertools�builtinsZifilter�filterfalseZifilterfalse�inputZ__builtin__Z	raw_input�internr�map�imap�getcwd�osZgetcwdu�getcwdb�rangeZxrangeZ
reload_module�	importlibZimp�reload�reduce�	functoolsZshlex_quoteZpipesZshlexZquote�UserDict�collections�UserList�
UserString�zipZizip�zip_longestZizip_longestZconfigparserZConfigParser�copyregZcopy_regZdbm_gnuZgdbmzdbm.gnuZ
_dummy_threadZdummy_threadZhttp_cookiejarZ	cookielibzhttp.cookiejarZhttp_cookiesZCookiezhttp.cookiesZ
html_entitiesZhtmlentitydefsz
html.entitiesZhtml_parserZ
HTMLParserzhtml.parserZhttp_clientZhttplibzhttp.clientZemail_mime_multipartzemail.MIMEMultipartzemail.mime.multipartZemail_mime_nonmultipartzemail.MIMENonMultipartzemail.mime.nonmultipartZemail_mime_textzemail.MIMETextzemail.mime.textZemail_mime_basezemail.MIMEBasezemail.mime.baseZBaseHTTPServerzhttp.serverZ
CGIHTTPServerZSimpleHTTPServerZcPickle�pickleZqueueZQueue�reprlib�reprZsocketserverZSocketServer�_threadZthreadZtkinterZTkinterZtkinter_dialogZDialogztkinter.dialogZtkinter_filedialogZ
FileDialogztkinter.filedialogZtkinter_scrolledtextZScrolledTextztkinter.scrolledtextZtkinter_simpledialogZSimpleDialogztkinter.simpledialogZtkinter_tixZTixztkinter.tixZtkinter_ttkZttkztkinter.ttkZtkinter_constantsZTkconstantsztkinter.constantsZtkinter_dndZTkdndztkinter.dndZtkinter_colorchooserZtkColorChooserztkinter.colorchooserZtkinter_commondialogZtkCommonDialogztkinter.commondialogZtkinter_tkfiledialogZtkFileDialogZtkinter_fontZtkFontztkinter.fontZtkinter_messageboxZtkMessageBoxztkinter.messageboxZtkinter_tksimpledialogZtkSimpleDialogZurllib_parsez.moves.urllib_parsezurllib.parseZurllib_errorz.moves.urllib_errorzurllib.errorZurllibz
.moves.urllibZurllib_robotparser�robotparserzurllib.robotparserZ
xmlrpc_clientZ	xmlrpclibz
xmlrpc.clientZ
xmlrpc_serverZSimpleXMLRPCServerz
xmlrpc.serverZwin32�winreg�_winregzmoves.z.moves�movesc@seZdZdZdS)�Module_six_moves_urllib_parsez7Lazy loading of moved objects in six.moves.urllib_parseN)rrrrr
r
r
rrn@srnZParseResultZurlparseZSplitResultZparse_qsZ	parse_qslZ	urldefragZurljoinZurlsplitZ
urlunparseZ
urlunsplitZ
quote_plusZunquoteZunquote_plusZ	urlencodeZ
splitqueryZsplittagZ	splituserZ
uses_fragmentZuses_netlocZuses_paramsZ
uses_queryZ
uses_relativezmoves.urllib_parsezmoves.urllib.parsec@seZdZdZdS)�Module_six_moves_urllib_errorz7Lazy loading of moved objects in six.moves.urllib_errorN)rrrrr
r
r
rrohsroZURLErrorZurllib2Z	HTTPErrorZContentTooShortErrorz.moves.urllib.errorzmoves.urllib_errorzmoves.urllib.errorc@seZdZdZdS)�Module_six_moves_urllib_requestz9Lazy loading of moved objects in six.moves.urllib_requestN)rrrrr
r
r
rrp|srpZurlopenzurllib.requestZinstall_openerZbuild_openerZpathname2urlZurl2pathnameZ
getproxiesZRequestZOpenerDirectorZHTTPDefaultErrorHandlerZHTTPRedirectHandlerZHTTPCookieProcessorZProxyHandlerZBaseHandlerZHTTPPasswordMgrZHTTPPasswordMgrWithDefaultRealmZAbstractBasicAuthHandlerZHTTPBasicAuthHandlerZProxyBasicAuthHandlerZAbstractDigestAuthHandlerZHTTPDigestAuthHandlerZProxyDigestAuthHandlerZHTTPHandlerZHTTPSHandlerZFileHandlerZ
FTPHandlerZCacheFTPHandlerZUnknownHandlerZHTTPErrorProcessorZurlretrieveZ
urlcleanupZ	URLopenerZFancyURLopenerZproxy_bypassz.moves.urllib.requestzmoves.urllib_requestzmoves.urllib.requestc@seZdZdZdS)� Module_six_moves_urllib_responsez:Lazy loading of moved objects in six.moves.urllib_responseN)rrrrr
r
r
rrq�srqZaddbasezurllib.responseZaddclosehookZaddinfoZ
addinfourlz.moves.urllib.responsezmoves.urllib_responsezmoves.urllib.responsec@seZdZdZdS)�#Module_six_moves_urllib_robotparserz=Lazy loading of moved objects in six.moves.urllib_robotparserN)rrrrr
r
r
rrr�srrZRobotFileParserz.moves.urllib.robotparserzmoves.urllib_robotparserzmoves.urllib.robotparserc@sNeZdZdZgZejd�Zejd�Zejd�Z	ejd�Z
ejd�Zdd�Zd	S)
�Module_six_moves_urllibzICreate a six.moves.urllib namespace that resembles the Python 3 namespacezmoves.urllib_parsezmoves.urllib_errorzmoves.urllib_requestzmoves.urllib_responsezmoves.urllib_robotparsercCsdddddgS)N�parse�error�request�responserjr
)rr
r
rr6�szModule_six_moves_urllib.__dir__N)
rrrrrG�	_importerr>rtrurvrwrjr6r
r
r
rrs�s




rszmoves.urllibcCstt|j|�dS)zAdd an item to six.moves.N)rrLr)Zmover
r
r�add_move�srycCsXytt|�WnDtk
rRytj|=Wn"tk
rLtd|f��YnXYnXdS)zRemove item from six.moves.zno such move, %rN)rrLr!rm�__dict__rA)rr
r
r�remove_move�sr{�__func__�__self__�__closure__�__code__�__defaults__�__globals__�im_funcZim_selfZfunc_closureZ	func_codeZ
func_defaultsZfunc_globalscCs|j�S)N)�next)�itr
r
r�advance_iteratorsr�cCstdd�t|�jD��S)Ncss|]}d|jkVqdS)�__call__N)rz)r3�klassr
r
r�	<genexpr>szcallable.<locals>.<genexpr>)�any�type�__mro__)r"r
r
r�callablesr�cCs|S)Nr
)�unboundr
r
r�get_unbound_functionsr�cCs|S)Nr
)r�clsr
r
r�create_unbound_methodsr�cCs|jS)N)r�)r�r
r
rr�"scCstj|||j�S)N)�types�
MethodTyper )rr"r
r
r�create_bound_method%sr�cCstj|d|�S)N)r�r�)rr�r
r
rr�(sc@seZdZdd�ZdS)�IteratorcCst|�j|�S)N)r��__next__)rr
r
rr�-sz
Iterator.nextN)rrrr�r
r
r
rr�+sr�z3Get the function out of a possibly unbound functioncKst|jf|��S)N)�iter�keys)�d�kwr
r
r�iterkeys>sr�cKst|jf|��S)N)r��values)r�r�r
r
r�
itervaluesAsr�cKst|jf|��S)N)r��items)r�r�r
r
r�	iteritemsDsr�cKst|jf|��S)N)r�Zlists)r�r�r
r
r�	iterlistsGsr�r�r�r�cKs|jf|�S)N)r�)r�r�r
r
rr�PscKs|jf|�S)N)r�)r�r�r
r
rr�SscKs|jf|�S)N)r�)r�r�r
r
rr�VscKs|jf|�S)N)r�)r�r�r
r
rr�Ys�viewkeys�
viewvalues�	viewitemsz1Return an iterator over the keys of a dictionary.z3Return an iterator over the values of a dictionary.z?Return an iterator over the (key, value) pairs of a dictionary.zBReturn an iterator over the (key, [values]) pairs of a dictionary.cCs
|jd�S)Nzlatin-1)�encode)�sr
r
r�bksr�cCs|S)Nr
)r�r
r
r�unsr�z>B�assertCountEqualZassertRaisesRegexpZassertRegexpMatches�assertRaisesRegex�assertRegexcCs|S)Nr
)r�r
r
rr��scCst|jdd�d�S)Nz\\z\\\\Zunicode_escape)�unicode�replace)r�r
r
rr��scCst|d�S)Nr)�ord)Zbsr
r
r�byte2int�sr�cCst||�S)N)r�)Zbuf�ir
r
r�
indexbytes�sr�ZassertItemsEqualzByte literalzText literalcOst|t�||�S)N)r,�_assertCountEqual)r�args�kwargsr
r
rr��scOst|t�||�S)N)r,�_assertRaisesRegex)rr�r�r
r
rr��scOst|t�||�S)N)r,�_assertRegex)rr�r�r
r
rr��s�execcCs*|dkr|�}|j|k	r"|j|��|�dS)N)�
__traceback__�with_traceback)r#r/�tbr
r
r�reraise�s


r�cCsB|dkr*tjd�}|j}|dkr&|j}~n|dkr6|}td�dS)zExecute code in a namespace.Nrzexec _code_ in _globs_, _locs_)r�	_getframe�	f_globals�f_localsr�)Z_code_Z_globs_Z_locs_�framer
r
r�exec_�s
r�z9def reraise(tp, value, tb=None):
    raise tp, value, tb
zrdef raise_from(value, from_value):
    if from_value is None:
        raise value
    raise value from from_value
zCdef raise_from(value, from_value):
    raise value from from_value
cCs|�dS)Nr
)r/Z
from_valuer
r
r�
raise_from�sr��printc
s6|jdtj���dkrdS�fdd�}d}|jdd�}|dk	r`t|t�rNd}nt|t�s`td��|jd	d�}|dk	r�t|t�r�d}nt|t�s�td
��|r�td��|s�x|D]}t|t�r�d}Pq�W|r�td�}td
�}nd}d
}|dkr�|}|dk�r�|}x,t|�D] \}	}|	�r||�||��qW||�dS)z4The new-style print function for Python 2.4 and 2.5.�fileNcsdt|t�st|�}t�t�rVt|t�rV�jdk	rVt�dd�}|dkrHd}|j�j|�}�j|�dS)N�errors�strict)	rD�
basestring�strr�r��encodingr,r��write)�datar�)�fpr
rr��s



zprint_.<locals>.writeF�sepTzsep must be None or a string�endzend must be None or a stringz$invalid keyword arguments to print()�
� )�popr�stdoutrDr�r��	TypeError�	enumerate)
r�r�r�Zwant_unicoder�r��arg�newlineZspacer�r
)r�r�print_�sL







r�cOs<|jdtj�}|jdd�}t||�|r8|dk	r8|j�dS)Nr��flushF)�getrr�r��_printr�)r�r�r�r�r
r
rr�s

zReraise an exception.cs���fdd�}|S)Ncstj����|�}�|_|S)N)r^�wraps�__wrapped__)�f)�assigned�updated�wrappedr
r�wrapperszwraps.<locals>.wrapperr
)r�r�r�r�r
)r�r�r�rr�sr�cs&G��fdd�d��}tj|dfi�S)z%Create a base class with a metaclass.cseZdZ��fdd�ZdS)z!with_metaclass.<locals>.metaclasscs�|�|�S)Nr
)r�rZ
this_basesr�)�bases�metar
r�__new__'sz)with_metaclass.<locals>.metaclass.__new__N)rrrr�r
)r�r�r
r�	metaclass%sr�Ztemporary_class)r�r�)r�r�r�r
)r�r�r�with_metaclass sr�cs�fdd�}|S)z6Class decorator for creating a class with a metaclass.csl|jj�}|jd�}|dk	rDt|t�r,|g}x|D]}|j|�q2W|jdd�|jdd��|j|j|�S)N�	__slots__rz�__weakref__)rz�copyr�rDr�r�r�	__bases__)r�Z	orig_vars�slotsZ	slots_var)r�r
rr�.s



zadd_metaclass.<locals>.wrapperr
)r�r�r
)r�r�
add_metaclass,sr�cCs2tr.d|jkrtd|j��|j|_dd�|_|S)a
    A decorator that defines __unicode__ and __str__ methods under Python 2.
    Under Python 3 it does nothing.

    To support Python 2 and 3 with a single code base, define a __str__ method
    returning text and apply this decorator to the class.
    �__str__zY@python_2_unicode_compatible cannot be applied to %s because it doesn't define __str__().cSs|j�jd�S)Nzutf-8)�__unicode__r�)rr
r
r�<lambda>Jsz-python_2_unicode_compatible.<locals>.<lambda>)�PY2rz�
ValueErrorrr�r�)r�r
r
r�python_2_unicode_compatible<s


r��__spec__)rrli���li���ll����)N)NN)rr)rr)rr)rr)�rZ
__future__rr^rP�operatorrr��
__author__�__version__�version_infor�r(ZPY34r�Zstring_types�intZ
integer_typesr�Zclass_typesZ	text_type�bytesZbinary_type�maxsizeZMAXSIZEr�ZlongZ	ClassTyper��platform�
startswith�objectr	�len�
OverflowErrorrrrr&�
ModuleTyper2r7r9rrxrLr5r-rrrDr=rmrnZ_urllib_parse_moved_attributesroZ_urllib_error_moved_attributesrpZ _urllib_request_moved_attributesrqZ!_urllib_response_moved_attributesrrZ$_urllib_robotparser_moved_attributesrsryr{Z
_meth_funcZ
_meth_selfZ
_func_closureZ
_func_codeZ_func_defaultsZ
_func_globalsr�r��	NameErrorr�r�r�r�r�r��
attrgetterZget_method_functionZget_method_selfZget_function_closureZget_function_codeZget_function_defaultsZget_function_globalsr�r�r�r��methodcallerr�r�r�r�r��chrZunichr�struct�Struct�packZint2byte�
itemgetterr��getitemr�r�Z	iterbytesrMrN�BytesIOr�r�r��partialrVr�r�r�r�r,rQr�r�r�r�r��WRAPPER_ASSIGNMENTS�WRAPPER_UPDATESr�r�r�r�rG�__package__�globalsr�r��submodule_search_locations�	meta_pathr�r�Zimporter�appendr
r
r
r�<module>s�

>












































































































5pip/_vendor/urllib3/packages/ssl_match_hostname/__pycache__/_implementation.cpython-36.opt-1.pyc000064400000006157147511334600031434 0ustar00site-packages3

���eF�@stdZddlZddlZyddlZWnek
r8dZYnXdZGdd�de�Zddd�Zd	d
�Z	dd�Z
d
d�ZdS)zJThe match_hostname() function from Python 3.3.3, essential when using SSL.�Nz3.5.0.1c@seZdZdS)�CertificateErrorN)�__name__�
__module__�__qualname__�rr�%/usr/lib/python3.6/_implementation.pyrsr�c
Cs�g}|sdS|jd�}|d}|dd�}|jd�}||krLtdt|���|s`|j�|j�kS|dkrt|jd�n>|jd	�s�|jd	�r�|jtj|��n|jtj|�j	d
d��x|D]}|jtj|��q�Wtj
dd
j|�dtj�}	|	j
|�S)zhMatching according to RFC 6125, section 6.4.3

    http://tools.ietf.org/html/rfc6125#section-6.4.3
    F�.rrN�*z,too many wildcards in certificate DNS name: z[^.]+zxn--z\*z[^.]*z\Az\.z\Z)�split�countr�repr�lower�append�
startswith�re�escape�replace�compile�join�
IGNORECASE�match)
Zdn�hostnameZ
max_wildcardsZpats�partsZleftmostZ	remainderZ	wildcardsZfragZpatrrr�_dnsname_matchs*


rcCs&t|t�r"tjdkr"t|ddd�}|S)N��ascii�strict)�encoding�errors)r)�
isinstance�str�sys�version_infoZunicode)�objrrr�_to_unicodeOsr%cCstjt|�j��}||kS)z�Exact matching of IP addresses.

    RFC 6125 explicitly doesn't define an algorithm for this
    (section 1.7.2 - "Out of Scope").
    )�	ipaddress�
ip_addressr%�rstrip)Zipname�host_ipZiprrr�_ipaddress_matchTsr*cCs�|std��ytjt|��}WnPtk
r6d}Yn:tk
rLd}Yn$tk
rntdkrhd}n�YnXg}|jdf�}xb|D]Z\}}|dkr�|dkr�t||�r�dS|j|�q�|dkr�|dk	r�t	||�r�dS|j|�q�W|�s8xL|jdf�D]<}x6|D].\}}|dk�rt||��r$dS|j|��qWq�Wt
|�dk�rdtd	|d
jt
t|��f��n,t
|�dk�r�td||df��ntd
��dS)a)Verify that *cert* (in decoded format as returned by
    SSLSocket.getpeercert()) matches the *hostname*.  RFC 2818 and RFC 6125
    rules are followed, but IP addresses are not accepted for *hostname*.

    CertificateError is raised on failure. On success, the function
    returns nothing.
    ztempty or no certificate, match_hostname needs a SSL socket or SSL context with either CERT_OPTIONAL or CERT_REQUIREDNZsubjectAltNameZDNSz
IP AddressZsubjectZ
commonNamerz&hostname %r doesn't match either of %sz, zhostname %r doesn't match %rrz=no appropriate commonName or subjectAltName fields were found)�
ValueErrorr&r'r%�UnicodeError�AttributeError�getrrr*�lenrr�mapr
)Zcertrr)ZdnsnamesZsan�key�value�subrrr�match_hostname`sJ
r4)r)�__doc__rr"r&�ImportError�__version__r+rrr%r*r4rrrr�<module>s

5site-packages/pip/_vendor/urllib3/packages/ssl_match_hostname/__pycache__/__init__.cpython-36.pyc000064400000000740147511334600027117 0ustar003

���e��@s�ddlZy&ejd	kred��ddlmZmZWnNek
r|yddlmZmZWn$ek
rvddlmZmZYnXYnXd
ZdS)�N��zFallback to vendored code)�CertificateError�match_hostname�rr)rr)rr)	�sys�version_info�ImportErrorZsslrrZbackports.ssl_match_hostnameZ_implementation�__all__�rr�/usr/lib/python3.6/__init__.py�<module>s
pip/_vendor/urllib3/packages/ssl_match_hostname/__pycache__/_implementation.cpython-36.pyc000064400000006157147511334600030475 0ustar00site-packages3

���eF�@stdZddlZddlZyddlZWnek
r8dZYnXdZGdd�de�Zddd�Zd	d
�Z	dd�Z
d
d�ZdS)zJThe match_hostname() function from Python 3.3.3, essential when using SSL.�Nz3.5.0.1c@seZdZdS)�CertificateErrorN)�__name__�
__module__�__qualname__�rr�%/usr/lib/python3.6/_implementation.pyrsr�c
Cs�g}|sdS|jd�}|d}|dd�}|jd�}||krLtdt|���|s`|j�|j�kS|dkrt|jd�n>|jd	�s�|jd	�r�|jtj|��n|jtj|�j	d
d��x|D]}|jtj|��q�Wtj
dd
j|�dtj�}	|	j
|�S)zhMatching according to RFC 6125, section 6.4.3

    http://tools.ietf.org/html/rfc6125#section-6.4.3
    F�.rrN�*z,too many wildcards in certificate DNS name: z[^.]+zxn--z\*z[^.]*z\Az\.z\Z)�split�countr�repr�lower�append�
startswith�re�escape�replace�compile�join�
IGNORECASE�match)
Zdn�hostnameZ
max_wildcardsZpats�partsZleftmostZ	remainderZ	wildcardsZfragZpatrrr�_dnsname_matchs*


rcCs&t|t�r"tjdkr"t|ddd�}|S)N��ascii�strict)�encoding�errors)r)�
isinstance�str�sys�version_infoZunicode)�objrrr�_to_unicodeOsr%cCstjt|�j��}||kS)z�Exact matching of IP addresses.

    RFC 6125 explicitly doesn't define an algorithm for this
    (section 1.7.2 - "Out of Scope").
    )�	ipaddress�
ip_addressr%�rstrip)Zipname�host_ipZiprrr�_ipaddress_matchTsr*cCs�|std��ytjt|��}WnPtk
r6d}Yn:tk
rLd}Yn$tk
rntdkrhd}n�YnXg}|jdf�}xb|D]Z\}}|dkr�|dkr�t||�r�dS|j|�q�|dkr�|dk	r�t	||�r�dS|j|�q�W|�s8xL|jdf�D]<}x6|D].\}}|dk�rt||��r$dS|j|��qWq�Wt
|�dk�rdtd	|d
jt
t|��f��n,t
|�dk�r�td||df��ntd
��dS)a)Verify that *cert* (in decoded format as returned by
    SSLSocket.getpeercert()) matches the *hostname*.  RFC 2818 and RFC 6125
    rules are followed, but IP addresses are not accepted for *hostname*.

    CertificateError is raised on failure. On success, the function
    returns nothing.
    ztempty or no certificate, match_hostname needs a SSL socket or SSL context with either CERT_OPTIONAL or CERT_REQUIREDNZsubjectAltNameZDNSz
IP AddressZsubjectZ
commonNamerz&hostname %r doesn't match either of %sz, zhostname %r doesn't match %rrz=no appropriate commonName or subjectAltName fields were found)�
ValueErrorr&r'r%�UnicodeError�AttributeError�getrrr*�lenrr�mapr
)Zcertrr)ZdnsnamesZsan�key�value�subrrr�match_hostname`sJ
r4)r)�__doc__rr"r&�ImportError�__version__r+rrr%r*r4rrrr�<module>s

5pip/_vendor/urllib3/packages/ssl_match_hostname/__pycache__/__init__.cpython-36.opt-1.pyc000064400000000740147511334600027777 0ustar00site-packages3

���e��@s�ddlZy&ejd	kred��ddlmZmZWnNek
r|yddlmZmZWn$ek
rvddlmZmZYnXYnXd
ZdS)�N��zFallback to vendored code)�CertificateError�match_hostname�rr)rr)rr)	�sys�version_info�ImportErrorZsslrrZbackports.ssl_match_hostnameZ_implementation�__all__�rr�/usr/lib/python3.6/__init__.py�<module>s
site-packages/pip/_vendor/urllib3/packages/ssl_match_hostname/_implementation.py
"""The match_hostname() function from Python 3.3.3, essential when using SSL."""

# Note: This file is under the PSF license as the code comes from the python
# stdlib.   http://docs.python.org/3/license.html

import re
import sys

# ipaddress has been backported to 2.6+ in pypi.  If it is installed on the
# system, use it to handle IPAddress ServerAltnames (this was added in
# python-3.5) otherwise only do DNS matching.  This allows
# backports.ssl_match_hostname to continue to be used all the way back to
# python-2.4.
try:
    import ipaddress
except ImportError:
    ipaddress = None

__version__ = '3.5.0.1'


class CertificateError(ValueError):
    pass


def _dnsname_match(dn, hostname, max_wildcards=1):
    """Matching according to RFC 6125, section 6.4.3

    http://tools.ietf.org/html/rfc6125#section-6.4.3
    """
    pats = []
    if not dn:
        return False

    # Ported from python3-syntax:
    # leftmost, *remainder = dn.split(r'.')
    parts = dn.split(r'.')
    leftmost = parts[0]
    remainder = parts[1:]

    wildcards = leftmost.count('*')
    if wildcards > max_wildcards:
        # Issue #17980: avoid denials of service by refusing more
        # than one wildcard per fragment.  A survey of established
        # policy among SSL implementations showed it to be a
        # reasonable choice.
        raise CertificateError(
            "too many wildcards in certificate DNS name: " + repr(dn))

    # speed up common case w/o wildcards
    if not wildcards:
        return dn.lower() == hostname.lower()

    # RFC 6125, section 6.4.3, subitem 1.
    # The client SHOULD NOT attempt to match a presented identifier in which
    # the wildcard character comprises a label other than the left-most label.
    if leftmost == '*':
        # When '*' is a fragment by itself, it matches a non-empty dotless
        # fragment.
        pats.append('[^.]+')
    elif leftmost.startswith('xn--') or hostname.startswith('xn--'):
        # RFC 6125, section 6.4.3, subitem 3.
        # The client SHOULD NOT attempt to match a presented identifier
        # where the wildcard character is embedded within an A-label or
        # U-label of an internationalized domain name.
        pats.append(re.escape(leftmost))
    else:
        # Otherwise, '*' matches any dotless string, e.g. www*
        pats.append(re.escape(leftmost).replace(r'\*', '[^.]*'))

    # add the remaining fragments, ignore any wildcards
    for frag in remainder:
        pats.append(re.escape(frag))

    pat = re.compile(r'\A' + r'\.'.join(pats) + r'\Z', re.IGNORECASE)
    return pat.match(hostname)
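# Illustrative behaviour of the wildcard rules above -- a hedged sketch, not part
# of the upstream module; the hostnames are invented for the example:
#
#   _dnsname_match('example.com',      'EXAMPLE.com')       -> True   (no wildcard: case-insensitive equality)
#   _dnsname_match('*.example.com',    'www.example.com')   -> match  ('*' covers exactly one non-empty label)
#   _dnsname_match('*.example.com',    'a.b.example.com')   -> None   ('*' never spans a dot)
#   _dnsname_match('www*.example.com', 'www2.example.com')  -> match  ('*' expands only within its label)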


def _to_unicode(obj):
    if isinstance(obj, str) and sys.version_info < (3,):
        obj = unicode(obj, encoding='ascii', errors='strict')
    return obj

def _ipaddress_match(ipname, host_ip):
    """Exact matching of IP addresses.

    RFC 6125 explicitly doesn't define an algorithm for this
    (section 1.7.2 - "Out of Scope").
    """
    # OpenSSL may add a trailing newline to a subjectAltName's IP address
    # Divergence from upstream: ipaddress can't handle byte str
    ip = ipaddress.ip_address(_to_unicode(ipname).rstrip())
    return ip == host_ip
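# For instance (a hedged sketch; the address is invented): because of the
# rstrip() above, an OpenSSL-style value with a trailing newline still matches:
#
#   _ipaddress_match('93.184.216.34\n', ipaddress.ip_address(u'93.184.216.34'))  -> True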


def match_hostname(cert, hostname):
    """Verify that *cert* (in decoded format as returned by
    SSLSocket.getpeercert()) matches the *hostname*.  RFC 2818 and RFC 6125
    rules are followed, but IP addresses are not accepted for *hostname*.

    CertificateError is raised on failure. On success, the function
    returns nothing.
    """
    if not cert:
        raise ValueError("empty or no certificate, match_hostname needs a "
                         "SSL socket or SSL context with either "
                         "CERT_OPTIONAL or CERT_REQUIRED")
    try:
        # Divergence from upstream: ipaddress can't handle byte str
        host_ip = ipaddress.ip_address(_to_unicode(hostname))
    except ValueError:
        # Not an IP address (common case)
        host_ip = None
    except UnicodeError:
        # Divergence from upstream: Have to deal with ipaddress not taking
        # byte strings.  addresses should be all ascii, so we consider it not
        # an ipaddress in this case
        host_ip = None
    except AttributeError:
        # Divergence from upstream: Make ipaddress library optional
        if ipaddress is None:
            host_ip = None
        else:
            raise
    dnsnames = []
    san = cert.get('subjectAltName', ())
    for key, value in san:
        if key == 'DNS':
            if host_ip is None and _dnsname_match(value, hostname):
                return
            dnsnames.append(value)
        elif key == 'IP Address':
            if host_ip is not None and _ipaddress_match(value, host_ip):
                return
            dnsnames.append(value)
    if not dnsnames:
        # The subject is only checked when there is no dNSName entry
        # in subjectAltName
        for sub in cert.get('subject', ()):
            for key, value in sub:
                # XXX according to RFC 2818, the most specific Common Name
                # must be used.
                if key == 'commonName':
                    if _dnsname_match(value, hostname):
                        return
                    dnsnames.append(value)
    if len(dnsnames) > 1:
        raise CertificateError("hostname %r "
            "doesn't match either of %s"
            % (hostname, ', '.join(map(repr, dnsnames))))
    elif len(dnsnames) == 1:
        raise CertificateError("hostname %r "
            "doesn't match %r"
            % (hostname, dnsnames[0]))
    else:
        raise CertificateError("no appropriate commonName or "
            "subjectAltName fields were found")
site-packages/pip/_vendor/urllib3/packages/ssl_match_hostname/__init__.py
import sys

try:
    # Our match_hostname function is the same as 3.5's, so we only want to
    # import the match_hostname function if it's at least that good.
    if sys.version_info < (3, 5):
        raise ImportError("Fallback to vendored code")

    from ssl import CertificateError, match_hostname
except ImportError:
    try:
        # Backport of the function from a pypi module
        from backports.ssl_match_hostname import CertificateError, match_hostname
    except ImportError:
        # Our vendored copy
        from ._implementation import CertificateError, match_hostname

# Not needed, but documenting what we provide.
__all__ = ('CertificateError', 'match_hostname')
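# A minimal usage sketch (not part of the upstream package; the certificate
# dict is hypothetical, in the decoded form returned by SSLSocket.getpeercert()):
#
#   cert = {'subject': ((('commonName', 'example.com'),),),
#           'subjectAltName': (('DNS', 'example.com'), ('DNS', '*.example.com'))}
#   match_hostname(cert, 'www.example.com')   # returns None on success
#   match_hostname(cert, 'attacker.invalid')  # raises CertificateError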
site-packages/pip/_vendor/urllib3/connectionpool.py
from __future__ import absolute_import
import errno
import logging
import sys
import warnings

from socket import error as SocketError, timeout as SocketTimeout
import socket


from .exceptions import (
    ClosedPoolError,
    ProtocolError,
    EmptyPoolError,
    HeaderParsingError,
    HostChangedError,
    LocationValueError,
    MaxRetryError,
    ProxyError,
    ReadTimeoutError,
    SSLError,
    TimeoutError,
    InsecureRequestWarning,
    NewConnectionError,
)
from .packages.ssl_match_hostname import CertificateError
from .packages import six
from .packages.six.moves import queue
from .connection import (
    port_by_scheme,
    DummyConnection,
    HTTPConnection, HTTPSConnection, VerifiedHTTPSConnection,
    HTTPException, BaseSSLError,
)
from .request import RequestMethods
from .response import HTTPResponse

from .util.connection import is_connection_dropped
from .util.request import set_file_position
from .util.response import assert_header_parsing
from .util.retry import Retry
from .util.timeout import Timeout
from .util.url import get_host, Url


if six.PY2:
    # Queue is imported for side effects on MS Windows
    import Queue as _unused_module_Queue  # noqa: F401

xrange = six.moves.xrange

log = logging.getLogger(__name__)

_Default = object()


# Pool objects
class ConnectionPool(object):
    """
    Base class for all connection pools, such as
    :class:`.HTTPConnectionPool` and :class:`.HTTPSConnectionPool`.
    """

    scheme = None
    QueueCls = queue.LifoQueue

    def __init__(self, host, port=None):
        if not host:
            raise LocationValueError("No host specified.")

        self.host = _ipv6_host(host).lower()
        self._proxy_host = host.lower()
        self.port = port

    def __str__(self):
        return '%s(host=%r, port=%r)' % (type(self).__name__,
                                         self.host, self.port)

    def __enter__(self):
        return self

    def __exit__(self, exc_type, exc_val, exc_tb):
        self.close()
        # Return False to re-raise any potential exceptions
        return False

    def close(self):
        """
        Close all pooled connections and disable the pool.
        """
        pass


# This is taken from http://hg.python.org/cpython/file/7aaba721ebc0/Lib/socket.py#l252
_blocking_errnos = set([errno.EAGAIN, errno.EWOULDBLOCK])


class HTTPConnectionPool(ConnectionPool, RequestMethods):
    """
    Thread-safe connection pool for one host.

    :param host:
        Host used for this HTTP Connection (e.g. "localhost"), passed into
        :class:`httplib.HTTPConnection`.

    :param port:
        Port used for this HTTP Connection (None is equivalent to 80), passed
        into :class:`httplib.HTTPConnection`.

    :param strict:
        Causes BadStatusLine to be raised if the status line can't be parsed
        as a valid HTTP/1.0 or 1.1 status line, passed into
        :class:`httplib.HTTPConnection`.

        .. note::
           Only works in Python 2. This parameter is ignored in Python 3.

    :param timeout:
        Socket timeout in seconds for each individual connection. This can
        be a float or integer, which sets the timeout for the HTTP request,
        or an instance of :class:`urllib3.util.Timeout` which gives you more
        fine-grained control over request timeouts. After the constructor has
        been run, this is always a `urllib3.util.Timeout` object.

    :param maxsize:
        Number of connections to save that can be reused. More than 1 is useful
        in multithreaded situations. If ``block`` is set to False, more
        connections will be created but they will not be saved once they've
        been used.

    :param block:
        If set to True, no more than ``maxsize`` connections will be used at
        a time. When no free connections are available, the call will block
        until a connection has been released. This is a useful side effect for
        particular multithreaded situations where one does not want to use more
        than maxsize connections per host to prevent flooding.

    :param headers:
        Headers to include with all requests, unless other headers are given
        explicitly.

    :param retries:
        Retry configuration to use by default with requests in this pool.

    :param _proxy:
        Parsed proxy URL, should not be used directly, instead, see
        :class:`urllib3.connectionpool.ProxyManager`

    :param _proxy_headers:
        A dictionary with proxy headers, should not be used directly,
        instead, see :class:`urllib3.connectionpool.ProxyManager`

    :param \\**conn_kw:
        Additional parameters are used to create fresh :class:`urllib3.connection.HTTPConnection`,
        :class:`urllib3.connection.HTTPSConnection` instances.
    """

    scheme = 'http'
    ConnectionCls = HTTPConnection
    ResponseCls = HTTPResponse
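    # A minimal usage sketch of this pool -- a hedged example, not upstream code;
    # the host and path are invented. Timeout and Retry are imported at the top
    # of this module, and request() comes from the RequestMethods base class:
    #
    #   pool = HTTPConnectionPool('example.com', port=80, maxsize=2, block=True,
    #                             timeout=Timeout(connect=2.0, read=7.0),
    #                             retries=Retry(total=3))
    #   response = pool.request('GET', '/index.html')
    #   print(response.status, len(response.data))
    #
    #   # ConnectionPool also acts as a context manager, closing the pool on exit:
    #   with HTTPConnectionPool('example.com') as pool:
    #       pool.request('GET', '/')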

    def __init__(self, host, port=None, strict=False,
                 timeout=Timeout.DEFAULT_TIMEOUT, maxsize=1, block=False,
                 headers=None, retries=None,
                 _proxy=None, _proxy_headers=None,
                 **conn_kw):
        ConnectionPool.__init__(self, host, port)
        RequestMethods.__init__(self, headers)

        self.strict = strict

        if not isinstance(timeout, Timeout):
            timeout = Timeout.from_float(timeout)

        if retries is None:
            retries = Retry.DEFAULT

        self.timeout = timeout
        self.retries = retries

        self.pool = self.QueueCls(maxsize)
        self.block = block

        self.proxy = _proxy
        self.proxy_headers = _proxy_headers or {}

        # Fill the queue up so that doing get() on it will block properly
        for _ in xrange(maxsize):
            self.pool.put(None)

        # These are mostly for testing and debugging purposes.
        self.num_connections = 0
        self.num_requests = 0
        self.conn_kw = conn_kw

        if self.proxy:
            # Enable Nagle's algorithm for proxies, to avoid packet fragmentation.
            # We cannot know if the user has added default socket options, so we cannot replace the
            # list.
            self.conn_kw.setdefault('socket_options', [])

    def _new_conn(self):
        """
        Return a fresh :class:`HTTPConnection`.
        """
        self.num_connections += 1
        log.debug("Starting new HTTP connection (%d): %s",
                  self.num_connections, self.host)

        conn = self.ConnectionCls(host=self.host, port=self.port,
                                  timeout=self.timeout.connect_timeout,
                                  strict=self.strict, **self.conn_kw)
        return conn

    def _get_conn(self, timeout=None):
        """
        Get a connection. Will return a pooled connection if one is available.

        If no connections are available and :prop:`.block` is ``False``, then a
        fresh connection is returned.

        :param timeout:
            Seconds to wait before giving up and raising
            :class:`urllib3.exceptions.EmptyPoolError` if the pool is empty and
            :prop:`.block` is ``True``.
        """
        conn = None
        try:
            conn = self.pool.get(block=self.block, timeout=timeout)

        except AttributeError:  # self.pool is None
            raise ClosedPoolError(self, "Pool is closed.")

        except queue.Empty:
            if self.block:
                raise EmptyPoolError(self,
                                     "Pool reached maximum size and no more "
                                     "connections are allowed.")
            pass  # Oh well, we'll create a new connection then

        # If this is a persistent connection, check if it got disconnected
        if conn and is_connection_dropped(conn):
            log.debug("Resetting dropped connection: %s", self.host)
            conn.close()
            if getattr(conn, 'auto_open', 1) == 0:
                # This is a proxied connection that has been mutated by
                # httplib._tunnel() and cannot be reused (since it would
                # attempt to bypass the proxy)
                conn = None

        return conn or self._new_conn()

    def _put_conn(self, conn):
        """
        Put a connection back into the pool.

        :param conn:
            Connection object for the current host and port as returned by
            :meth:`._new_conn` or :meth:`._get_conn`.

        If the pool is already full, the connection is closed and discarded
        because we exceeded maxsize. If connections are discarded frequently,
        then maxsize should be increased.

        If the pool is closed, then the connection will be closed and discarded.
        """
        try:
            self.pool.put(conn, block=False)
            return  # Everything is dandy, done.
        except AttributeError:
            # self.pool is None.
            pass
        except queue.Full:
            # This should never happen if self.block == True
            log.warning(
                "Connection pool is full, discarding connection: %s",
                self.host)

        # Connection never got put back into the pool, close it.
        if conn:
            conn.close()

    def _validate_conn(self, conn):
        """
        Called right before a request is made, after the socket is created.
        """
        pass

    def _prepare_proxy(self, conn):
        # Nothing to do for HTTP connections.
        pass

    def _get_timeout(self, timeout):
        """ Helper that always returns a :class:`urllib3.util.Timeout` """
        if timeout is _Default:
            return self.timeout.clone()

        if isinstance(timeout, Timeout):
            return timeout.clone()
        else:
            # User passed us an int/float. This is for backwards compatibility,
            # can be removed later
            return Timeout.from_float(timeout)

    def _raise_timeout(self, err, url, timeout_value):
        """Is the error actually a timeout? Will raise a ReadTimeout or pass"""

        if isinstance(err, SocketTimeout):
            raise ReadTimeoutError(self, url, "Read timed out. (read timeout=%s)" % timeout_value)

        # See the above comment about EAGAIN in Python 3. In Python 2 we have
        # to specifically catch it and throw the timeout error
        if hasattr(err, 'errno') and err.errno in _blocking_errnos:
            raise ReadTimeoutError(self, url, "Read timed out. (read timeout=%s)" % timeout_value)

        # Catch possible read timeouts thrown as SSL errors. If not the
        # case, rethrow the original. We need to do this because of:
        # http://bugs.python.org/issue10272
        if 'timed out' in str(err) or 'did not complete (read)' in str(err):  # Python 2.6
            raise ReadTimeoutError(self, url, "Read timed out. (read timeout=%s)" % timeout_value)

    def _make_request(self, conn, method, url, timeout=_Default, chunked=False,
                      **httplib_request_kw):
        """
        Perform a request on a given urllib connection object taken from our
        pool.

        :param conn:
            a connection from one of our connection pools

        :param timeout:
            Socket timeout in seconds for the request. This can be a
            float or integer, which will set the same timeout value for
            the socket connect and the socket read, or an instance of
            :class:`urllib3.util.Timeout`, which gives you more fine-grained
            control over your timeouts.
        """
        self.num_requests += 1

        timeout_obj = self._get_timeout(timeout)
        timeout_obj.start_connect()
        conn.timeout = timeout_obj.connect_timeout

        # Trigger any extra validation we need to do.
        try:
            self._validate_conn(conn)
        except (SocketTimeout, BaseSSLError) as e:
            # Py2 raises this as a BaseSSLError, Py3 raises it as socket timeout.
            self._raise_timeout(err=e, url=url, timeout_value=conn.timeout)
            raise

        # conn.request() calls httplib.*.request, not the method in
        # urllib3.request. It also calls makefile (recv) on the socket.
        if chunked:
            conn.request_chunked(method, url, **httplib_request_kw)
        else:
            conn.request(method, url, **httplib_request_kw)

        # Reset the timeout for the recv() on the socket
        read_timeout = timeout_obj.read_timeout

        # App Engine doesn't have a sock attr
        if getattr(conn, 'sock', None):
            # In Python 3 socket.py will catch EAGAIN and return None when you
            # try and read into the file pointer created by http.client, which
            # instead raises a BadStatusLine exception. Instead of catching
            # the exception and assuming all BadStatusLine exceptions are read
            # timeouts, check for a zero timeout before making the request.
            if read_timeout == 0:
                raise ReadTimeoutError(
                    self, url, "Read timed out. (read timeout=%s)" % read_timeout)
            if read_timeout is Timeout.DEFAULT_TIMEOUT:
                conn.sock.settimeout(socket.getdefaulttimeout())
            else:  # None or a value
                conn.sock.settimeout(read_timeout)

        # Receive the response from the server
        try:
            try:  # Python 2.7, use buffering of HTTP responses
                httplib_response = conn.getresponse(buffering=True)
            except TypeError:  # Python 2.6 and older, Python 3
                try:
                    httplib_response = conn.getresponse()
                except Exception as e:
                    # Remove the TypeError from the exception chain in Python 3;
                    # otherwise it looks like a programming error was the cause.
                    six.raise_from(e, None)
        except (SocketTimeout, BaseSSLError, SocketError) as e:
            self._raise_timeout(err=e, url=url, timeout_value=read_timeout)
            raise

        # AppEngine doesn't have a version attr.
        http_version = getattr(conn, '_http_vsn_str', 'HTTP/?')
        log.debug("%s://%s:%s \"%s %s %s\" %s %s", self.scheme, self.host, self.port,
                  method, url, http_version, httplib_response.status,
                  httplib_response.length)

        try:
            assert_header_parsing(httplib_response.msg)
        except (HeaderParsingError, TypeError) as hpe:  # Platform-specific: Python 3
            log.warning(
                'Failed to parse headers (url=%s): %s',
                self._absolute_url(url), hpe, exc_info=True)

        return httplib_response

    def _absolute_url(self, path):
        return Url(scheme=self.scheme, host=self.host, port=self.port, path=path).url

    def close(self):
        """
        Close all pooled connections and disable the pool.
        """
        # Disable access to the pool
        old_pool, self.pool = self.pool, None

        try:
            while True:
                conn = old_pool.get(block=False)
                if conn:
                    conn.close()

        except queue.Empty:
            pass  # Done.

    def is_same_host(self, url):
        """
        Check if the given ``url`` is a member of the same host as this
        connection pool.
        """
        if url.startswith('/'):
            return True

        # TODO: Add optional support for socket.gethostbyname checking.
        scheme, host, port = get_host(url)

        host = _ipv6_host(host).lower()

        # Use explicit default port for comparison when none is given
        if self.port and not port:
            port = port_by_scheme.get(scheme)
        elif not self.port and port == port_by_scheme.get(scheme):
            port = None

        return (scheme, host, port) == (self.scheme, self.host, self.port)

    def urlopen(self, method, url, body=None, headers=None, retries=None,
                redirect=True, assert_same_host=True, timeout=_Default,
                pool_timeout=None, release_conn=None, chunked=False,
                body_pos=None, **response_kw):
        """
        Get a connection from the pool and perform an HTTP request. This is the
        lowest level call for making a request, so you'll need to specify all
        the raw details.

        .. note::

           More commonly, it's appropriate to use a convenience method provided
           by :class:`.RequestMethods`, such as :meth:`request`.

        .. note::

           `release_conn` will only behave as expected if
           `preload_content=False` because we want to make
           `preload_content=False` the default behaviour someday soon without
           breaking backwards compatibility.

        :param method:
            HTTP request method (such as GET, POST, PUT, etc.)

        :param body:
            Data to send in the request body (useful for creating
            POST requests, see HTTPConnectionPool.post_url for
            more convenience).

        :param headers:
            Dictionary of custom headers to send, such as User-Agent,
            If-None-Match, etc. If None, pool headers are used. If provided,
            these headers completely replace any pool-specific headers.

        :param retries:
            Configure the number of retries to allow before raising a
            :class:`~urllib3.exceptions.MaxRetryError` exception.

            Pass ``None`` to retry until you receive a response. Pass a
            :class:`~urllib3.util.retry.Retry` object for fine-grained control
            over different types of retries.
            Pass an integer number to retry connection errors that many times,
            but no other types of errors. Pass zero to never retry.

            If ``False``, then retries are disabled and any exception is raised
            immediately. Also, instead of raising a MaxRetryError on redirects,
            the redirect response will be returned.

        :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.

        :param redirect:
            If True, automatically handle redirects (status codes 301, 302,
            303, 307, 308). Each redirect counts as a retry. Disabling retries
            will disable redirect, too.

        :param assert_same_host:
            If ``True``, will make sure that the host of the pool requests is
            consistent; otherwise a HostChangedError is raised. When ``False``,
            you can use the pool on an HTTP proxy and request foreign hosts.

        :param timeout:
            If specified, overrides the default timeout for this one
            request. It may be a float (in seconds) or an instance of
            :class:`urllib3.util.Timeout`.

        :param pool_timeout:
            If set and the pool is set to block=True, then this method will
            block for ``pool_timeout`` seconds and raise EmptyPoolError if no
            connection is available within the time period.

        :param release_conn:
            If False, then the urlopen call will not release the connection
            back into the pool once a response is received (but will release if
            you read the entire contents of the response such as when
            `preload_content=True`). This is useful if you're not preloading
            the response's content immediately. You will need to call
            ``r.release_conn()`` on the response ``r`` to return the connection
            back into the pool. If None, it takes the value of
            ``response_kw.get('preload_content', True)``.

        :param chunked:
            If True, urllib3 will send the body using chunked transfer
            encoding. Otherwise, urllib3 will send the body using the standard
            content-length form. Defaults to False.

        :param int body_pos:
            Position to seek to in file-like body in the event of a retry or
            redirect. Typically this won't need to be set because urllib3 will
            auto-populate the value when needed.

        :param \\**response_kw:
            Additional parameters are passed to
            :meth:`urllib3.response.HTTPResponse.from_httplib`
        """
        if headers is None:
            headers = self.headers

        if not isinstance(retries, Retry):
            retries = Retry.from_int(retries, redirect=redirect, default=self.retries)

        if release_conn is None:
            release_conn = response_kw.get('preload_content', True)

        # Check host
        if assert_same_host and not self.is_same_host(url):
            raise HostChangedError(self, url, retries)

        conn = None

        # Track whether `conn` needs to be released before
        # returning/raising/recursing. Update this variable if necessary, and
        # leave `release_conn` constant throughout the function. That way, if
        # the function recurses, the original value of `release_conn` will be
        # passed down into the recursive call, and its value will be respected.
        #
        # See issue #651 [1] for details.
        #
        # [1] <https://github.com/shazow/urllib3/issues/651>
        release_this_conn = release_conn

        # Merge the proxy headers. Only do this in HTTP. We have to copy the
        # headers dict so we can safely change it without those changes being
        # reflected in anyone else's copy.
        if self.scheme == 'http':
            headers = headers.copy()
            headers.update(self.proxy_headers)

        # Must keep the exception bound to a separate variable or else Python 3
        # complains about UnboundLocalError.
        err = None

        # Keep track of whether we cleanly exited the except block. This
        # ensures we do proper cleanup in finally.
        clean_exit = False

        # Rewind body position, if needed. Record current position
        # for future rewinds in the event of a redirect/retry.
        body_pos = set_file_position(body, body_pos)

        try:
            # Request a connection from the queue.
            timeout_obj = self._get_timeout(timeout)
            conn = self._get_conn(timeout=pool_timeout)

            conn.timeout = timeout_obj.connect_timeout

            is_new_proxy_conn = self.proxy is not None and not getattr(conn, 'sock', None)
            if is_new_proxy_conn:
                self._prepare_proxy(conn)

            # Make the request on the httplib connection object.
            httplib_response = self._make_request(conn, method, url,
                                                  timeout=timeout_obj,
                                                  body=body, headers=headers,
                                                  chunked=chunked)

            # If we're going to release the connection in ``finally:``, then
            # the response doesn't need to know about the connection. Otherwise
            # it will also try to release it and we'll have a double-release
            # mess.
            response_conn = conn if not release_conn else None

            # Pass method to Response for length checking
            response_kw['request_method'] = method

            # Import httplib's response into our own wrapper object
            response = self.ResponseCls.from_httplib(httplib_response,
                                                     pool=self,
                                                     connection=response_conn,
                                                     retries=retries,
                                                     **response_kw)

            # Everything went great!
            clean_exit = True

        except queue.Empty:
            # Timed out by queue.
            raise EmptyPoolError(self, "No pool connections are available.")

        except (TimeoutError, HTTPException, SocketError, ProtocolError,
                BaseSSLError, SSLError, CertificateError) as e:
            # Discard the connection for these exceptions. It will be
            # replaced during the next _get_conn() call.
            clean_exit = False
            if isinstance(e, (BaseSSLError, CertificateError)):
                e = SSLError(e)
            elif isinstance(e, (SocketError, NewConnectionError)) and self.proxy:
                e = ProxyError('Cannot connect to proxy.', e)
            elif isinstance(e, (SocketError, HTTPException)):
                e = ProtocolError('Connection aborted.', e)

            retries = retries.increment(method, url, error=e, _pool=self,
                                        _stacktrace=sys.exc_info()[2])
            retries.sleep()

            # Keep track of the error for the retry warning.
            err = e

        finally:
            if not clean_exit:
                # We hit some kind of exception, handled or otherwise. We need
                # to throw the connection away unless explicitly told not to.
                # Close the connection, set the variable to None, and make sure
                # we put the None back in the pool to avoid leaking it.
                conn = conn and conn.close()
                release_this_conn = True

            if release_this_conn:
                # Put the connection back to be reused. If the connection is
                # expired then it will be None, which will get replaced with a
                # fresh connection during _get_conn.
                self._put_conn(conn)

        if not conn:
            # Try again
            log.warning("Retrying (%r) after connection "
                        "broken by '%r': %s", retries, err, url)
            return self.urlopen(method, url, body, headers, retries,
                                redirect, assert_same_host,
                                timeout=timeout, pool_timeout=pool_timeout,
                                release_conn=release_conn, body_pos=body_pos,
                                **response_kw)

        def drain_and_release_conn(response):
            try:
                # discard any remaining response body, the connection will be
                # released back to the pool once the entire response is read
                response.read()
            except (TimeoutError, HTTPException, SocketError, ProtocolError,
                    BaseSSLError, SSLError) as e:
                pass

        # Handle redirect?
        redirect_location = redirect and response.get_redirect_location()
        if redirect_location:
            if response.status == 303:
                method = 'GET'

            try:
                retries = retries.increment(method, url, response=response, _pool=self)
            except MaxRetryError:
                if retries.raise_on_redirect:
                    # Drain and release the connection for this response, since
                    # we're not returning it to be released manually.
                    drain_and_release_conn(response)
                    raise
                return response

            # drain and return the connection to the pool before recursing
            drain_and_release_conn(response)

            retries.sleep_for_retry(response)
            log.debug("Redirecting %s -> %s", url, redirect_location)
            return self.urlopen(
                method, redirect_location, body, headers,
                retries=retries, redirect=redirect,
                assert_same_host=assert_same_host,
                timeout=timeout, pool_timeout=pool_timeout,
                release_conn=release_conn, body_pos=body_pos,
                **response_kw)

        # Check if we should retry the HTTP response.
        has_retry_after = bool(response.getheader('Retry-After'))
        if retries.is_retry(method, response.status, has_retry_after):
            try:
                retries = retries.increment(method, url, response=response, _pool=self)
            except MaxRetryError:
                if retries.raise_on_status:
                    # Drain and release the connection for this response, since
                    # we're not returning it to be released manually.
                    drain_and_release_conn(response)
                    raise
                return response

            # drain and return the connection to the pool before recursing
            drain_and_release_conn(response)

            retries.sleep(response)
            log.debug("Retry: %s", url)
            return self.urlopen(
                method, url, body, headers,
                retries=retries, redirect=redirect,
                assert_same_host=assert_same_host,
                timeout=timeout, pool_timeout=pool_timeout,
                release_conn=release_conn,
                body_pos=body_pos, **response_kw)

        return response


class HTTPSConnectionPool(HTTPConnectionPool):
    """
    Same as :class:`.HTTPConnectionPool`, but HTTPS.

    When Python is compiled with the :mod:`ssl` module, then
    :class:`.VerifiedHTTPSConnection` is used, which *can* verify certificates,
    instead of :class:`.HTTPSConnection`.

    :class:`.VerifiedHTTPSConnection` uses one of ``assert_fingerprint``,
    ``assert_hostname`` and ``host`` in this order to verify connections.
    If ``assert_hostname`` is False, no verification is done.

    The ``key_file``, ``cert_file``, ``cert_reqs``, ``ca_certs``,
    ``ca_cert_dir``, and ``ssl_version`` are only used if :mod:`ssl` is
    available and are fed into :meth:`urllib3.util.ssl_wrap_socket` to upgrade
    the connection socket into an SSL socket.
    """

    scheme = 'https'
    ConnectionCls = HTTPSConnection

    def __init__(self, host, port=None,
                 strict=False, timeout=Timeout.DEFAULT_TIMEOUT, maxsize=1,
                 block=False, headers=None, retries=None,
                 _proxy=None, _proxy_headers=None,
                 key_file=None, cert_file=None, cert_reqs=None,
                 ca_certs=None, ssl_version=None,
                 assert_hostname=None, assert_fingerprint=None,
                 ca_cert_dir=None, **conn_kw):

        HTTPConnectionPool.__init__(self, host, port, strict, timeout, maxsize,
                                    block, headers, retries, _proxy, _proxy_headers,
                                    **conn_kw)

        if ca_certs and cert_reqs is None:
            cert_reqs = 'CERT_REQUIRED'

        self.key_file = key_file
        self.cert_file = cert_file
        self.cert_reqs = cert_reqs
        self.ca_certs = ca_certs
        self.ca_cert_dir = ca_cert_dir
        self.ssl_version = ssl_version
        self.assert_hostname = assert_hostname
        self.assert_fingerprint = assert_fingerprint

    def _prepare_conn(self, conn):
        """
        Prepare the ``connection`` for :meth:`urllib3.util.ssl_wrap_socket`
        and establish the tunnel if proxy is used.
        """

        if isinstance(conn, VerifiedHTTPSConnection):
            conn.set_cert(key_file=self.key_file,
                          cert_file=self.cert_file,
                          cert_reqs=self.cert_reqs,
                          ca_certs=self.ca_certs,
                          ca_cert_dir=self.ca_cert_dir,
                          assert_hostname=self.assert_hostname,
                          assert_fingerprint=self.assert_fingerprint)
            conn.ssl_version = self.ssl_version
        return conn

    def _prepare_proxy(self, conn):
        """
        Establish tunnel connection early, because otherwise httplib
        would improperly set Host: header to proxy's IP:port.
        """
        # Python 2.7+
        try:
            set_tunnel = conn.set_tunnel
        except AttributeError:  # Platform-specific: Python 2.6
            set_tunnel = conn._set_tunnel

        if sys.version_info <= (2, 6, 4) and not self.proxy_headers:  # Python 2.6.4 and older
            set_tunnel(self._proxy_host, self.port)
        else:
            set_tunnel(self._proxy_host, self.port, self.proxy_headers)

        conn.connect()

    def _new_conn(self):
        """
        Return a fresh :class:`httplib.HTTPSConnection`.
        """
        self.num_connections += 1
        log.debug("Starting new HTTPS connection (%d): %s",
                  self.num_connections, self.host)

        if not self.ConnectionCls or self.ConnectionCls is DummyConnection:
            raise SSLError("Can't connect to HTTPS URL because the SSL "
                           "module is not available.")

        actual_host = self.host
        actual_port = self.port
        if self.proxy is not None:
            actual_host = self.proxy.host
            actual_port = self.proxy.port

        conn = self.ConnectionCls(host=actual_host, port=actual_port,
                                  timeout=self.timeout.connect_timeout,
                                  strict=self.strict, **self.conn_kw)

        return self._prepare_conn(conn)

    def _validate_conn(self, conn):
        """
        Called right before a request is made, after the socket is created.
        """
        super(HTTPSConnectionPool, self)._validate_conn(conn)

        # Force connect early to allow us to validate the connection.
        if not getattr(conn, 'sock', None):  # AppEngine might not have `.sock`
            conn.connect()

        if not conn.is_verified:
            warnings.warn((
                'Unverified HTTPS request is being made. '
                'Adding certificate verification is strongly advised. See: '
                'https://urllib3.readthedocs.io/en/latest/advanced-usage.html'
                '#ssl-warnings'),
                InsecureRequestWarning)


def connection_from_url(url, **kw):
    """
    Given a url, return a :class:`.ConnectionPool` instance of its host.

    This is a shortcut for not having to parse out the scheme, host, and port
    of the url before creating a :class:`.ConnectionPool` instance.

    :param url:
        Absolute URL string that must include the scheme. Port is optional.

    :param \\**kw:
        Passes additional parameters to the constructor of the appropriate
        :class:`.ConnectionPool`. Useful for specifying things like
        timeout, maxsize, headers, etc.

    Example::

        >>> conn = connection_from_url('http://google.com/')
        >>> r = conn.request('GET', '/')
    """
    scheme, host, port = get_host(url)
    port = port or port_by_scheme.get(scheme, 80)
    if scheme == 'https':
        return HTTPSConnectionPool(host, port=port, **kw)
    else:
        return HTTPConnectionPool(host, port=port, **kw)


def _ipv6_host(host):
    """
    Process IPv6 address literals
    """

    # httplib doesn't like it when we include brackets in IPv6 addresses
    # Specifically, if we include brackets but also pass the port then
    # httplib crazily doubles up the square brackets on the Host header.
    # Instead, we need to make sure we never pass ``None`` as the port.
    # However, for backward compatibility reasons we can't actually
    # *assert* that.  See http://bugs.python.org/issue28539
    #
    # Also if an IPv6 address literal has a zone identifier, the
    # percent sign might be URI-encoded; convert it back into ASCII.
    if host.startswith('[') and host.endswith(']'):
        host = host.replace('%25', '%').strip('[]')
    return host
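

# ---------------------------------------------------------------------------
# Usage sketch (editorial addition, not part of urllib3 itself). A minimal,
# hedged example of driving HTTPConnectionPool directly with an explicit
# Timeout and Retry configuration; the host and path are placeholders, and
# the names used (HTTPConnectionPool, Timeout, Retry) are the ones defined
# or imported earlier in this module.
def _example_pool_usage():
    pool = HTTPConnectionPool('example.com', port=80, maxsize=4, block=True,
                              timeout=Timeout(connect=2.0, read=5.0),
                              retries=Retry(total=3, backoff_factor=0.5))
    # urlopen() is the low-level entry point; request() (inherited from
    # RequestMethods) is the usual convenience wrapper.
    response = pool.urlopen('GET', '/')
    data = response.data   # body bytes; the connection goes back to the pool
    pool.close()            # close all pooled connections when finished
    return data
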
site-packages/pip/_vendor/urllib3/filepost.py
from __future__ import absolute_import
import codecs

from uuid import uuid4
from io import BytesIO

from .packages import six
from .packages.six import b
from .fields import RequestField

writer = codecs.lookup('utf-8')[3]


def choose_boundary():
    """
    Our embarrassingly-simple replacement for mimetools.choose_boundary.
    """
    return uuid4().hex


def iter_field_objects(fields):
    """
    Iterate over fields.

    Supports list of (k, v) tuples and dicts, and lists of
    :class:`~urllib3.fields.RequestField`.

    """
    if isinstance(fields, dict):
        i = six.iteritems(fields)
    else:
        i = iter(fields)

    for field in i:
        if isinstance(field, RequestField):
            yield field
        else:
            yield RequestField.from_tuples(*field)


def iter_fields(fields):
    """
    .. deprecated:: 1.6

    Iterate over fields.

    The addition of :class:`~urllib3.fields.RequestField` makes this function
    obsolete. Instead, use :func:`iter_field_objects`, which returns
    :class:`~urllib3.fields.RequestField` objects.

    Supports list of (k, v) tuples and dicts.
    """
    if isinstance(fields, dict):
        return ((k, v) for k, v in six.iteritems(fields))

    return ((k, v) for k, v in fields)


def encode_multipart_formdata(fields, boundary=None):
    """
    Encode a dictionary of ``fields`` using the multipart/form-data MIME format.

    :param fields:
        Dictionary of fields or list of (key, :class:`~urllib3.fields.RequestField`).

    :param boundary:
        If not specified, then a random boundary will be generated using
        :func:`urllib3.filepost.choose_boundary`.
    """
    body = BytesIO()
    if boundary is None:
        boundary = choose_boundary()

    for field in iter_field_objects(fields):
        body.write(b('--%s\r\n' % (boundary)))

        writer(body).write(field.render_headers())
        data = field.data

        if isinstance(data, int):
            data = str(data)  # Backwards compatibility

        if isinstance(data, six.text_type):
            writer(body).write(data)
        else:
            body.write(data)

        body.write(b'\r\n')

    body.write(b('--%s--\r\n' % (boundary)))

    content_type = str('multipart/form-data; boundary=%s' % boundary)

    return body.getvalue(), content_type
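

# ---------------------------------------------------------------------------
# Usage sketch (editorial addition, not part of urllib3 itself). Shows how
# encode_multipart_formdata() consumes a plain dict of fields: simple values
# become regular form fields, while (filename, data[, mime_type]) tuples
# become file uploads via RequestField.from_tuples(). The field names and
# contents below are placeholders.
def _example_multipart_encoding():
    fields = {
        'comment': 'hello world',
        'attachment': ('notes.txt', b'file contents here', 'text/plain'),
    }
    body, content_type = encode_multipart_formdata(fields)
    # body is a bytes object suitable for use as an HTTP request body;
    # content_type carries the generated boundary, e.g.
    # 'multipart/form-data; boundary=<hex digits>'.
    return body, content_type
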
site-packages/pip/_vendor/urllib3/contrib/ntlmpool.py
"""
NTLM authenticating pool, contributed by erikcederstran

Issue #10, see: http://code.google.com/p/urllib3/issues/detail?id=10
"""
from __future__ import absolute_import

from logging import getLogger
from ntlm import ntlm

from .. import HTTPSConnectionPool
from ..packages.six.moves.http_client import HTTPSConnection


log = getLogger(__name__)


class NTLMConnectionPool(HTTPSConnectionPool):
    """
    Implements an NTLM authentication version of an urllib3 connection pool
    """

    scheme = 'https'

    def __init__(self, user, pw, authurl, *args, **kwargs):
        """
        authurl is a random URL on the server that is protected by NTLM.
        user is the Windows user, probably in the DOMAIN\\username format.
        pw is the password for the user.
        """
        super(NTLMConnectionPool, self).__init__(*args, **kwargs)
        self.authurl = authurl
        self.rawuser = user
        user_parts = user.split('\\', 1)
        self.domain = user_parts[0].upper()
        self.user = user_parts[1]
        self.pw = pw

    def _new_conn(self):
        # Performs the NTLM handshake that secures the connection. The socket
        # must be kept open while requests are performed.
        self.num_connections += 1
        log.debug('Starting NTLM HTTPS connection no. %d: https://%s%s',
                  self.num_connections, self.host, self.authurl)

        headers = {}
        headers['Connection'] = 'Keep-Alive'
        req_header = 'Authorization'
        resp_header = 'www-authenticate'

        conn = HTTPSConnection(host=self.host, port=self.port)

        # Send negotiation message
        headers[req_header] = (
            'NTLM %s' % ntlm.create_NTLM_NEGOTIATE_MESSAGE(self.rawuser))
        log.debug('Request headers: %s', headers)
        conn.request('GET', self.authurl, None, headers)
        res = conn.getresponse()
        reshdr = dict(res.getheaders())
        log.debug('Response status: %s %s', res.status, res.reason)
        log.debug('Response headers: %s', reshdr)
        log.debug('Response data: %s [...]', res.read(100))

        # Remove the reference to the socket, so that it can not be closed by
        # the response object (we want to keep the socket open)
        res.fp = None

        # Server should respond with a challenge message
        auth_header_values = reshdr[resp_header].split(', ')
        auth_header_value = None
        for s in auth_header_values:
            if s[:5] == 'NTLM ':
                auth_header_value = s[5:]
        if auth_header_value is None:
            raise Exception('Unexpected %s response header: %s' %
                            (resp_header, reshdr[resp_header]))

        # Send authentication message
        ServerChallenge, NegotiateFlags = \
            ntlm.parse_NTLM_CHALLENGE_MESSAGE(auth_header_value)
        auth_msg = ntlm.create_NTLM_AUTHENTICATE_MESSAGE(ServerChallenge,
                                                         self.user,
                                                         self.domain,
                                                         self.pw,
                                                         NegotiateFlags)
        headers[req_header] = 'NTLM %s' % auth_msg
        log.debug('Request headers: %s', headers)
        conn.request('GET', self.authurl, None, headers)
        res = conn.getresponse()
        log.debug('Response status: %s %s', res.status, res.reason)
        log.debug('Response headers: %s', dict(res.getheaders()))
        log.debug('Response data: %s [...]', res.read()[:100])
        if res.status != 200:
            if res.status == 401:
                raise Exception('Server rejected request: wrong '
                                'username or password')
            raise Exception('Wrong server response: %s %s' %
                            (res.status, res.reason))

        res.fp = None
        log.debug('Connection established')
        return conn

    def urlopen(self, method, url, body=None, headers=None, retries=3,
                redirect=True, assert_same_host=True):
        if headers is None:
            headers = {}
        headers['Connection'] = 'Keep-Alive'
        return super(NTLMConnectionPool, self).urlopen(method, url, body,
                                                       headers, retries,
                                                       redirect,
                                                       assert_same_host)
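

# ---------------------------------------------------------------------------
# Usage sketch (editorial addition, not part of urllib3 itself). The NTLM
# credentials come first, followed by the usual HTTPSConnectionPool arguments
# (host, port, ...). Server name, auth URL and credentials below are
# placeholders, and the 'ntlm' package imported at the top of this module
# must be installed for the handshake to succeed.
def _example_ntlm_pool():
    pool = NTLMConnectionPool(user='MYDOMAIN\\alice', pw='s3cret',
                              authurl='/protected/',
                              host='intranet.example.com', port=443)
    # Each new connection performs the NTLM handshake in _new_conn() and is
    # then reused with 'Connection: Keep-Alive' for subsequent requests.
    response = pool.urlopen('GET', '/protected/')
    return response.status
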
site-packages/pip/_vendor/urllib3/contrib/appengine.py
"""
This module provides a pool manager that uses Google App Engine's
`URLFetch Service <https://cloud.google.com/appengine/docs/python/urlfetch>`_.

Example usage::

    from urllib3 import PoolManager
    from urllib3.contrib.appengine import AppEngineManager, is_appengine_sandbox

    if is_appengine_sandbox():
        # AppEngineManager uses AppEngine's URLFetch API behind the scenes
        http = AppEngineManager()
    else:
        # PoolManager uses a socket-level API behind the scenes
        http = PoolManager()

    r = http.request('GET', 'https://google.com/')

There are `limitations <https://cloud.google.com/appengine/docs/python/\
urlfetch/#Python_Quotas_and_limits>`_ to the URLFetch service and it may not be
the best choice for your application. There are three options for using
urllib3 on Google App Engine:

1. You can use :class:`AppEngineManager` with URLFetch. URLFetch is
   cost-effective in many circumstances as long as your usage is within the
   limitations.
2. You can use a normal :class:`~urllib3.PoolManager` by enabling sockets.
   Sockets also have `limitations and restrictions
   <https://cloud.google.com/appengine/docs/python/sockets/\
   #limitations-and-restrictions>`_ and have a lower free quota than URLFetch.
   To use sockets, be sure to specify the following in your ``app.yaml``::

        env_variables:
            GAE_USE_SOCKETS_HTTPLIB : 'true'

3. If you are using `App Engine Flexible
<https://cloud.google.com/appengine/docs/flexible/>`_, you can use the standard
:class:`PoolManager` without any configuration or special environment variables.
"""

from __future__ import absolute_import
import logging
import os
import warnings
from ..packages.six.moves.urllib.parse import urljoin

from ..exceptions import (
    HTTPError,
    HTTPWarning,
    MaxRetryError,
    ProtocolError,
    TimeoutError,
    SSLError
)

from ..packages.six import BytesIO
from ..request import RequestMethods
from ..response import HTTPResponse
from ..util.timeout import Timeout
from ..util.retry import Retry

try:
    from google.appengine.api import urlfetch
except ImportError:
    urlfetch = None


log = logging.getLogger(__name__)


class AppEnginePlatformWarning(HTTPWarning):
    pass


class AppEnginePlatformError(HTTPError):
    pass


class AppEngineManager(RequestMethods):
    """
    Connection manager for Google App Engine sandbox applications.

    This manager uses the URLFetch service directly instead of using the
    emulated httplib, and is subject to URLFetch limitations as described in
    the App Engine documentation `here
    <https://cloud.google.com/appengine/docs/python/urlfetch>`_.

    Notably it will raise an :class:`AppEnginePlatformError` if:
        * URLFetch is not available.
        * You attempt to use this on App Engine Flexible, as full socket
          support is available there.
        * A request size is more than 10 megabytes.
        * A response size is more than 32 megabytes.
        * You use an unsupported request method such as OPTIONS.

    Beyond those cases, it will raise normal urllib3 errors.
    """

    def __init__(self, headers=None, retries=None, validate_certificate=True,
                 urlfetch_retries=True):
        if not urlfetch:
            raise AppEnginePlatformError(
                "URLFetch is not available in this environment.")

        if is_prod_appengine_mvms():
            raise AppEnginePlatformError(
                "Use normal urllib3.PoolManager instead of AppEngineManager"
                "on Managed VMs, as using URLFetch is not necessary in "
                "this environment.")

        warnings.warn(
            "urllib3 is using URLFetch on Google App Engine sandbox instead "
            "of sockets. To use sockets directly instead of URLFetch see "
            "https://urllib3.readthedocs.io/en/latest/reference/urllib3.contrib.html.",
            AppEnginePlatformWarning)

        RequestMethods.__init__(self, headers)
        self.validate_certificate = validate_certificate
        self.urlfetch_retries = urlfetch_retries

        self.retries = retries or Retry.DEFAULT

    def __enter__(self):
        return self

    def __exit__(self, exc_type, exc_val, exc_tb):
        # Return False to re-raise any potential exceptions
        return False

    def urlopen(self, method, url, body=None, headers=None,
                retries=None, redirect=True, timeout=Timeout.DEFAULT_TIMEOUT,
                **response_kw):

        retries = self._get_retries(retries, redirect)

        try:
            follow_redirects = (
                    redirect and
                    retries.redirect != 0 and
                    retries.total)
            response = urlfetch.fetch(
                url,
                payload=body,
                method=method,
                headers=headers or {},
                allow_truncated=False,
                follow_redirects=self.urlfetch_retries and follow_redirects,
                deadline=self._get_absolute_timeout(timeout),
                validate_certificate=self.validate_certificate,
            )
        except urlfetch.DeadlineExceededError as e:
            raise TimeoutError(self, e)

        except urlfetch.InvalidURLError as e:
            if 'too large' in str(e):
                raise AppEnginePlatformError(
                    "URLFetch request too large, URLFetch only "
                    "supports requests up to 10mb in size.", e)
            raise ProtocolError(e)

        except urlfetch.DownloadError as e:
            if 'Too many redirects' in str(e):
                raise MaxRetryError(self, url, reason=e)
            raise ProtocolError(e)

        except urlfetch.ResponseTooLargeError as e:
            raise AppEnginePlatformError(
                "URLFetch response too large, URLFetch only supports"
                "responses up to 32mb in size.", e)

        except urlfetch.SSLCertificateError as e:
            raise SSLError(e)

        except urlfetch.InvalidMethodError as e:
            raise AppEnginePlatformError(
                "URLFetch does not support method: %s" % method, e)

        http_response = self._urlfetch_response_to_http_response(
            response, retries=retries, **response_kw)

        # Handle redirect?
        redirect_location = redirect and http_response.get_redirect_location()
        if redirect_location:
            # Check for redirect response
            if (self.urlfetch_retries and retries.raise_on_redirect):
                raise MaxRetryError(self, url, "too many redirects")
            else:
                if http_response.status == 303:
                    method = 'GET'

                try:
                    retries = retries.increment(method, url, response=http_response, _pool=self)
                except MaxRetryError:
                    if retries.raise_on_redirect:
                        raise MaxRetryError(self, url, "too many redirects")
                    return http_response

                retries.sleep_for_retry(http_response)
                log.debug("Redirecting %s -> %s", url, redirect_location)
                redirect_url = urljoin(url, redirect_location)
                return self.urlopen(
                    method, redirect_url, body, headers,
                    retries=retries, redirect=redirect,
                    timeout=timeout, **response_kw)

        # Check if we should retry the HTTP response.
        has_retry_after = bool(http_response.getheader('Retry-After'))
        if retries.is_retry(method, http_response.status, has_retry_after):
            retries = retries.increment(
                method, url, response=http_response, _pool=self)
            log.debug("Retry: %s", url)
            retries.sleep(http_response)
            return self.urlopen(
                method, url,
                body=body, headers=headers,
                retries=retries, redirect=redirect,
                timeout=timeout, **response_kw)

        return http_response

    def _urlfetch_response_to_http_response(self, urlfetch_resp, **response_kw):

        if is_prod_appengine():
            # Production GAE handles deflate encoding automatically, but does
            # not remove the encoding header.
            content_encoding = urlfetch_resp.headers.get('content-encoding')

            if content_encoding == 'deflate':
                del urlfetch_resp.headers['content-encoding']

        transfer_encoding = urlfetch_resp.headers.get('transfer-encoding')
        # We have a full response's content,
        # so let's make sure we don't report ourselves as chunked data.
        if transfer_encoding == 'chunked':
            encodings = transfer_encoding.split(",")
            encodings.remove('chunked')
            urlfetch_resp.headers['transfer-encoding'] = ','.join(encodings)

        return HTTPResponse(
            # In order for decoding to work, we must present the content as
            # a file-like object.
            body=BytesIO(urlfetch_resp.content),
            headers=urlfetch_resp.headers,
            status=urlfetch_resp.status_code,
            **response_kw
        )

    def _get_absolute_timeout(self, timeout):
        if timeout is Timeout.DEFAULT_TIMEOUT:
            return None  # Defer to URLFetch's default.
        if isinstance(timeout, Timeout):
            if timeout._read is not None or timeout._connect is not None:
                warnings.warn(
                    "URLFetch does not support granular timeout settings, "
                    "reverting to total or default URLFetch timeout.",
                    AppEnginePlatformWarning)
            return timeout.total
        return timeout

    def _get_retries(self, retries, redirect):
        if not isinstance(retries, Retry):
            retries = Retry.from_int(
                retries, redirect=redirect, default=self.retries)

        if retries.connect or retries.read or retries.redirect:
            warnings.warn(
                "URLFetch only supports total retries and does not "
                "recognize connect, read, or redirect retry parameters.",
                AppEnginePlatformWarning)

        return retries


def is_appengine():
    return (is_local_appengine() or
            is_prod_appengine() or
            is_prod_appengine_mvms())


def is_appengine_sandbox():
    return is_appengine() and not is_prod_appengine_mvms()


def is_local_appengine():
    return ('APPENGINE_RUNTIME' in os.environ and
            'Development/' in os.environ['SERVER_SOFTWARE'])


def is_prod_appengine():
    return ('APPENGINE_RUNTIME' in os.environ and
            'Google App Engine/' in os.environ['SERVER_SOFTWARE'] and
            not is_prod_appengine_mvms())


def is_prod_appengine_mvms():
    return os.environ.get('GAE_VM', False) == 'true'
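

# ---------------------------------------------------------------------------
# Usage sketch (editorial addition, not part of urllib3 itself). Configures
# AppEngineManager with a total-only Retry, since URLFetch ignores the
# connect/read/redirect-specific counts (see _get_retries above);
# urlfetch_retries=False makes urlopen() handle redirects itself instead of
# delegating them to URLFetch. The URL is a placeholder, and this only runs
# inside an App Engine sandbox where the urlfetch API is importable.
def _example_appengine_manager():
    if not is_appengine_sandbox():
        return None
    http = AppEngineManager(retries=Retry(total=3), urlfetch_retries=False)
    r = http.request('GET', 'https://www.example.com/')
    return r.status
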
site-packages/pip/_vendor/urllib3/contrib/securetransport.py
"""
SecureTransport support for urllib3 via ctypes.

This makes platform-native TLS available to urllib3 users on macOS without the
use of a compiler. This is an important feature because the Python Package
Index is moving to become a TLSv1.2-or-higher server, and the default OpenSSL
that ships with macOS is not capable of doing TLSv1.2. The only way to resolve
this is to give macOS users an alternative solution to the problem, and that
solution is to use SecureTransport.

We use ctypes here because this solution must not require a compiler. That's
because pip is not allowed to require a compiler either.

This is not intended to be a seriously long-term solution to this problem.
The hope is that PEP 543 will eventually solve this issue for us, at which
point we can retire this contrib module. But in the short term, we need to
solve the impending tire fire that is Python on Mac without this kind of
contrib module. So...here we are.

To use this module, simply import and inject it::

    import urllib3.contrib.securetransport
    urllib3.contrib.securetransport.inject_into_urllib3()

Happy TLSing!
"""
from __future__ import absolute_import

import contextlib
import ctypes
import errno
import os.path
import shutil
import socket
import ssl
import threading
import weakref

from .. import util
from ._securetransport.bindings import (
    Security, SecurityConst, CoreFoundation
)
from ._securetransport.low_level import (
    _assert_no_error, _cert_array_from_pem, _temporary_keychain,
    _load_client_cert_chain
)

try:  # Platform-specific: Python 2
    from socket import _fileobject
except ImportError:  # Platform-specific: Python 3
    _fileobject = None
    from ..packages.backports.makefile import backport_makefile

try:
    memoryview(b'')
except NameError:
    raise ImportError("SecureTransport only works on Pythons with memoryview")

__all__ = ['inject_into_urllib3', 'extract_from_urllib3']

# SNI always works
HAS_SNI = True

orig_util_HAS_SNI = util.HAS_SNI
orig_util_SSLContext = util.ssl_.SSLContext

# This dictionary is used by the read callback to obtain a handle to the
# calling wrapped socket. This is a pretty silly approach, but for now it'll
# do. I feel like I should be able to smuggle a handle to the wrapped socket
# directly in the SSLConnectionRef, but for now this approach will work I
# guess.
#
# We need to lock around this structure for inserts, but we don't do it for
# reads/writes in the callbacks. The reasoning here goes as follows:
#
#    1. It is not possible to call into the callbacks before the dictionary is
#       populated, so once in the callback the id must be in the dictionary.
#    2. The callbacks don't mutate the dictionary, they only read from it, and
#       so cannot conflict with any of the insertions.
#
# This is good: if we had to lock in the callbacks we'd drastically slow down
# the performance of this code.
_connection_refs = weakref.WeakValueDictionary()
_connection_ref_lock = threading.Lock()

# Limit writes to 16kB. This is OpenSSL's limit, but we'll cargo-cult it over
# for no better reason than we need *a* limit, and this one is right there.
SSL_WRITE_BLOCKSIZE = 16384

# This is our equivalent of util.ssl_.DEFAULT_CIPHERS, but expanded out to
# individual cipher suites. We need to do this because this is how
# SecureTransport wants them.
CIPHER_SUITES = [
    SecurityConst.TLS_AES_256_GCM_SHA384,
    SecurityConst.TLS_CHACHA20_POLY1305_SHA256,
    SecurityConst.TLS_AES_128_GCM_SHA256,
    SecurityConst.TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,
    SecurityConst.TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,
    SecurityConst.TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,
    SecurityConst.TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,
    SecurityConst.TLS_DHE_DSS_WITH_AES_256_GCM_SHA384,
    SecurityConst.TLS_DHE_RSA_WITH_AES_256_GCM_SHA384,
    SecurityConst.TLS_DHE_DSS_WITH_AES_128_GCM_SHA256,
    SecurityConst.TLS_DHE_RSA_WITH_AES_128_GCM_SHA256,
    SecurityConst.TLS_ECDHE_ECDSA_WITH_AES_256_CBC_SHA384,
    SecurityConst.TLS_ECDHE_RSA_WITH_AES_256_CBC_SHA384,
    SecurityConst.TLS_ECDHE_ECDSA_WITH_AES_256_CBC_SHA,
    SecurityConst.TLS_ECDHE_RSA_WITH_AES_256_CBC_SHA,
    SecurityConst.TLS_DHE_RSA_WITH_AES_256_CBC_SHA256,
    SecurityConst.TLS_DHE_DSS_WITH_AES_256_CBC_SHA256,
    SecurityConst.TLS_DHE_RSA_WITH_AES_256_CBC_SHA,
    SecurityConst.TLS_DHE_DSS_WITH_AES_256_CBC_SHA,
    SecurityConst.TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256,
    SecurityConst.TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256,
    SecurityConst.TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA,
    SecurityConst.TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA,
    SecurityConst.TLS_DHE_RSA_WITH_AES_128_CBC_SHA256,
    SecurityConst.TLS_DHE_DSS_WITH_AES_128_CBC_SHA256,
    SecurityConst.TLS_DHE_RSA_WITH_AES_128_CBC_SHA,
    SecurityConst.TLS_DHE_DSS_WITH_AES_128_CBC_SHA,
    SecurityConst.TLS_RSA_WITH_AES_256_GCM_SHA384,
    SecurityConst.TLS_RSA_WITH_AES_128_GCM_SHA256,
    SecurityConst.TLS_RSA_WITH_AES_256_CBC_SHA256,
    SecurityConst.TLS_RSA_WITH_AES_128_CBC_SHA256,
    SecurityConst.TLS_RSA_WITH_AES_256_CBC_SHA,
    SecurityConst.TLS_RSA_WITH_AES_128_CBC_SHA,
]

# Basically this is simple: for PROTOCOL_SSLv23 we turn it into a low of
# TLSv1 and a high of TLSv1.2. For everything else, we pin to that version.
_protocol_to_min_max = {
    ssl.PROTOCOL_SSLv23: (SecurityConst.kTLSProtocol1, SecurityConst.kTLSProtocol12),
}

if hasattr(ssl, "PROTOCOL_SSLv2"):
    _protocol_to_min_max[ssl.PROTOCOL_SSLv2] = (
        SecurityConst.kSSLProtocol2, SecurityConst.kSSLProtocol2
    )
if hasattr(ssl, "PROTOCOL_SSLv3"):
    _protocol_to_min_max[ssl.PROTOCOL_SSLv3] = (
        SecurityConst.kSSLProtocol3, SecurityConst.kSSLProtocol3
    )
if hasattr(ssl, "PROTOCOL_TLSv1"):
    _protocol_to_min_max[ssl.PROTOCOL_TLSv1] = (
        SecurityConst.kTLSProtocol1, SecurityConst.kTLSProtocol1
    )
if hasattr(ssl, "PROTOCOL_TLSv1_1"):
    _protocol_to_min_max[ssl.PROTOCOL_TLSv1_1] = (
        SecurityConst.kTLSProtocol11, SecurityConst.kTLSProtocol11
    )
if hasattr(ssl, "PROTOCOL_TLSv1_2"):
    _protocol_to_min_max[ssl.PROTOCOL_TLSv1_2] = (
        SecurityConst.kTLSProtocol12, SecurityConst.kTLSProtocol12
    )
if hasattr(ssl, "PROTOCOL_TLS"):
    _protocol_to_min_max[ssl.PROTOCOL_TLS] = _protocol_to_min_max[ssl.PROTOCOL_SSLv23]


def inject_into_urllib3():
    """
    Monkey-patch urllib3 with SecureTransport-backed SSL-support.
    """
    util.ssl_.SSLContext = SecureTransportContext
    util.HAS_SNI = HAS_SNI
    util.ssl_.HAS_SNI = HAS_SNI
    util.IS_SECURETRANSPORT = True
    util.ssl_.IS_SECURETRANSPORT = True


def extract_from_urllib3():
    """
    Undo monkey-patching by :func:`inject_into_urllib3`.
    """
    util.ssl_.SSLContext = orig_util_SSLContext
    util.HAS_SNI = orig_util_HAS_SNI
    util.ssl_.HAS_SNI = orig_util_HAS_SNI
    util.IS_SECURETRANSPORT = False
    util.ssl_.IS_SECURETRANSPORT = False
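

# ---------------------------------------------------------------------------
# Usage sketch (editorial addition, not part of urllib3 itself). The
# inject/extract pair above is a reversible monkey-patch; wrapping it in
# try/finally restores the stock OpenSSL-backed SSLContext afterwards. The
# URL is a placeholder, and this naturally only does anything useful on
# macOS, where SecureTransport is available.
def _example_injection():
    import urllib3

    inject_into_urllib3()
    try:
        http = urllib3.PoolManager()
        return http.request('GET', 'https://www.example.com/').status
    finally:
        extract_from_urllib3()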


def _read_callback(connection_id, data_buffer, data_length_pointer):
    """
    SecureTransport read callback. This is called by ST to request that data
    be returned from the socket.
    """
    wrapped_socket = None
    try:
        wrapped_socket = _connection_refs.get(connection_id)
        if wrapped_socket is None:
            return SecurityConst.errSSLInternal
        base_socket = wrapped_socket.socket

        requested_length = data_length_pointer[0]

        timeout = wrapped_socket.gettimeout()
        error = None
        read_count = 0
        buffer = (ctypes.c_char * requested_length).from_address(data_buffer)
        buffer_view = memoryview(buffer)

        try:
            while read_count < requested_length:
                if timeout is None or timeout >= 0:
                    readables = util.wait_for_read([base_socket], timeout)
                    if not readables:
                        raise socket.error(errno.EAGAIN, 'timed out')

                # We need to tell ctypes that we have a buffer that can be
                # written to. Upsettingly, we do that like this:
                chunk_size = base_socket.recv_into(
                    buffer_view[read_count:requested_length]
                )
                read_count += chunk_size
                if not chunk_size:
                    if not read_count:
                        return SecurityConst.errSSLClosedGraceful
                    break
        except (socket.error) as e:
            error = e.errno

            if error is not None and error != errno.EAGAIN:
                if error == errno.ECONNRESET:
                    return SecurityConst.errSSLClosedAbort
                raise

        data_length_pointer[0] = read_count

        if read_count != requested_length:
            return SecurityConst.errSSLWouldBlock

        return 0
    except Exception as e:
        if wrapped_socket is not None:
            wrapped_socket._exception = e
        return SecurityConst.errSSLInternal


def _write_callback(connection_id, data_buffer, data_length_pointer):
    """
    SecureTransport write callback. This is called by ST to request that data
    actually be sent on the network.
    """
    wrapped_socket = None
    try:
        wrapped_socket = _connection_refs.get(connection_id)
        if wrapped_socket is None:
            return SecurityConst.errSSLInternal
        base_socket = wrapped_socket.socket

        bytes_to_write = data_length_pointer[0]
        data = ctypes.string_at(data_buffer, bytes_to_write)

        timeout = wrapped_socket.gettimeout()
        error = None
        sent = 0

        try:
            while sent < bytes_to_write:
                if timeout is None or timeout >= 0:
                    writables = util.wait_for_write([base_socket], timeout)
                    if not writables:
                        raise socket.error(errno.EAGAIN, 'timed out')
                chunk_sent = base_socket.send(data)
                sent += chunk_sent

                # This has some needless copying here, but I'm not sure there's
                # much value in optimising this data path.
                data = data[chunk_sent:]
        except (socket.error) as e:
            error = e.errno

            if error is not None and error != errno.EAGAIN:
                if error == errno.ECONNRESET:
                    return SecurityConst.errSSLClosedAbort
                raise

        data_length_pointer[0] = sent
        if sent != bytes_to_write:
            return SecurityConst.errSSLWouldBlock

        return 0
    except Exception as e:
        if wrapped_socket is not None:
            wrapped_socket._exception = e
        return SecurityConst.errSSLInternal


# We need to keep these two object references alive: if they get GC'd while
# in use then SecureTransport could attempt to call a function that is in freed
# memory. That would be...uh...bad. Yeah, that's the word. Bad.
_read_callback_pointer = Security.SSLReadFunc(_read_callback)
_write_callback_pointer = Security.SSLWriteFunc(_write_callback)
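

# A hedged sketch (editorial addition, not part of urllib3 itself) of the
# registration pattern described in the comment block near _connection_refs
# above: pick an unused handle under the lock, store the wrapped socket in
# the weak dictionary, then hand the callbacks and the handle to
# SecureTransport. The real wiring lives later in this module, in
# WrappedSocket.handshake(); the function below is only illustrative.
def _example_register_wrapped_socket(wrapped_socket, context):
    with _connection_ref_lock:
        handle = id(wrapped_socket) % 2147483647
        while handle in _connection_refs:
            handle = (handle + 1) % 2147483647
        _connection_refs[handle] = wrapped_socket

    # Register the module-level read/write callbacks and associate this
    # SSL context with the handle the callbacks will use for lookups.
    _assert_no_error(Security.SSLSetIOFuncs(
        context, _read_callback_pointer, _write_callback_pointer))
    _assert_no_error(Security.SSLSetConnection(context, handle))
    return handle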


class WrappedSocket(object):
    """
    API-compatibility wrapper for Python's OpenSSL wrapped socket object.

    Note: _makefile_refs, _drop(), and _reuse() are needed for the garbage
    collector of PyPy.
    """
    def __init__(self, socket):
        self.socket = socket
        self.context = None
        self._makefile_refs = 0
        self._closed = False
        self._exception = None
        self._keychain = None
        self._keychain_dir = None
        self._client_cert_chain = None

        # We save off the previously-configured timeout and then set it to
        # zero. This is done because we use select and friends to handle the
        # timeouts, but if we leave the timeout set on the lower socket then
        # Python will "kindly" call select on that socket again for us. Avoid
        # that by forcing the timeout to zero.
        self._timeout = self.socket.gettimeout()
        self.socket.settimeout(0)

    @contextlib.contextmanager
    def _raise_on_error(self):
        """
        A context manager that can be used to wrap calls that do I/O from
        SecureTransport. If any of the I/O callbacks hit an exception, this
        context manager will correctly propagate the exception after the fact.
        This avoids silently swallowing those exceptions.

        It also correctly forces the socket closed.
        """
        self._exception = None

        # We explicitly don't catch around this yield because in the unlikely
        # event that an exception was hit in the block we don't want to swallow
        # it.
        yield
        if self._exception is not None:
            exception, self._exception = self._exception, None
            self.close()
            raise exception
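        # Hedged usage sketch (this mirrors how the methods below use it):
        # every SecureTransport call that can drive the I/O callbacks is
        # wrapped so that an exception stashed by _read_callback or
        # _write_callback resurfaces here instead of being lost inside the
        # C calling convention:
        #
        #     with self._raise_on_error():
        #         result = Security.SSLRead(
        #             self.context, buf, nbytes, ctypes.byref(processed)
        #         )
        #     _assert_no_error(result)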

    def _set_ciphers(self):
        """
        Sets up the allowed ciphers. By default this matches the set in
        util.ssl_.DEFAULT_CIPHERS, at least as supported by macOS. The list is
        hard-coded and can't be changed at this time, mostly because parsing
        OpenSSL cipher strings is going to be a freaking nightmare.
        """
        ciphers = (Security.SSLCipherSuite * len(CIPHER_SUITES))(*CIPHER_SUITES)
        result = Security.SSLSetEnabledCiphers(
            self.context, ciphers, len(CIPHER_SUITES)
        )
        _assert_no_error(result)

    def _custom_validate(self, verify, trust_bundle):
        """
        Called when we have set custom validation. We do this in two cases:
        first, when cert validation is entirely disabled; and second, when
        using a custom trust DB.
        """
        # If we disabled cert validation, just say: cool.
        if not verify:
            return

        # We want data in memory, so load it up.
        if os.path.isfile(trust_bundle):
            with open(trust_bundle, 'rb') as f:
                trust_bundle = f.read()

        cert_array = None
        trust = Security.SecTrustRef()

        try:
            # Get a CFArray that contains the certs we want.
            cert_array = _cert_array_from_pem(trust_bundle)

            # Ok, now the hard part. We want to get the SecTrustRef that ST has
            # created for this connection, shove our CAs into it, tell ST to
            # ignore everything else it knows, and then ask if it can build a
            # chain. This is a buuuunch of code.
            result = Security.SSLCopyPeerTrust(
                self.context, ctypes.byref(trust)
            )
            _assert_no_error(result)
            if not trust:
                raise ssl.SSLError("Failed to copy trust reference")

            result = Security.SecTrustSetAnchorCertificates(trust, cert_array)
            _assert_no_error(result)

            result = Security.SecTrustSetAnchorCertificatesOnly(trust, True)
            _assert_no_error(result)

            trust_result = Security.SecTrustResultType()
            result = Security.SecTrustEvaluate(
                trust, ctypes.byref(trust_result)
            )
            _assert_no_error(result)
        finally:
            if trust:
                CoreFoundation.CFRelease(trust)

            if cert_array is not None:
                CoreFoundation.CFRelease(cert_array)

        # Ok, now we can look at what the result was.
        successes = (
            SecurityConst.kSecTrustResultUnspecified,
            SecurityConst.kSecTrustResultProceed
        )
        if trust_result.value not in successes:
            raise ssl.SSLError(
                "certificate verify failed, error code: %d" %
                trust_result.value
            )

    def handshake(self,
                  server_hostname,
                  verify,
                  trust_bundle,
                  min_version,
                  max_version,
                  client_cert,
                  client_key,
                  client_key_passphrase):
        """
        Actually performs the TLS handshake. This is run automatically by
        wrapped socket, and shouldn't be needed in user code.
        """
        # First, we do the initial bits of connection setup. We need to create
        # a context, set its I/O funcs, and set the connection reference.
        self.context = Security.SSLCreateContext(
            None, SecurityConst.kSSLClientSide, SecurityConst.kSSLStreamType
        )
        result = Security.SSLSetIOFuncs(
            self.context, _read_callback_pointer, _write_callback_pointer
        )
        _assert_no_error(result)

        # Here we need to compute the handle to use. We do this by taking the
        # id of self modulo 2**31 - 1. If this is already in the dictionary, we
        # just keep incrementing by one until we find a free space.
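        # Purely illustrative numbers (not taken from a real run): if id(self)
        # were 2147483650, the first candidate handle would be 3; if 3 were
        # already taken we would try 4, then 5, and so on.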
        with _connection_ref_lock:
            handle = id(self) % 2147483647
            while handle in _connection_refs:
                handle = (handle + 1) % 2147483647
            _connection_refs[handle] = self

        result = Security.SSLSetConnection(self.context, handle)
        _assert_no_error(result)

        # If we have a server hostname, we should set that too.
        if server_hostname:
            if not isinstance(server_hostname, bytes):
                server_hostname = server_hostname.encode('utf-8')

            result = Security.SSLSetPeerDomainName(
                self.context, server_hostname, len(server_hostname)
            )
            _assert_no_error(result)

        # Setup the ciphers.
        self._set_ciphers()

        # Set the minimum and maximum TLS versions.
        result = Security.SSLSetProtocolVersionMin(self.context, min_version)
        _assert_no_error(result)
        result = Security.SSLSetProtocolVersionMax(self.context, max_version)
        _assert_no_error(result)

        # If there's a trust DB, we need to use it. We do that by telling
        # SecureTransport to break on server auth. We also do that if we don't
        # want to validate the certs at all: we just won't actually do any
        # authing in that case.
        if not verify or trust_bundle is not None:
            result = Security.SSLSetSessionOption(
                self.context,
                SecurityConst.kSSLSessionOptionBreakOnServerAuth,
                True
            )
            _assert_no_error(result)

        # If there's a client cert, we need to use it.
        if client_cert:
            self._keychain, self._keychain_dir = _temporary_keychain()
            self._client_cert_chain = _load_client_cert_chain(
                self._keychain, client_cert, client_key
            )
            result = Security.SSLSetCertificate(
                self.context, self._client_cert_chain
            )
            _assert_no_error(result)

        while True:
            with self._raise_on_error():
                result = Security.SSLHandshake(self.context)

                if result == SecurityConst.errSSLWouldBlock:
                    raise socket.timeout("handshake timed out")
                elif result == SecurityConst.errSSLServerAuthCompleted:
                    self._custom_validate(verify, trust_bundle)
                    continue
                else:
                    _assert_no_error(result)
                    break

    def fileno(self):
        return self.socket.fileno()

    # Copy-pasted from Python 3.5 source code
    def _decref_socketios(self):
        if self._makefile_refs > 0:
            self._makefile_refs -= 1
        if self._closed:
            self.close()

    def recv(self, bufsiz):
        buffer = ctypes.create_string_buffer(bufsiz)
        bytes_read = self.recv_into(buffer, bufsiz)
        data = buffer[:bytes_read]
        return data

    def recv_into(self, buffer, nbytes=None):
        # Read short on EOF.
        if self._closed:
            return 0

        if nbytes is None:
            nbytes = len(buffer)

        buffer = (ctypes.c_char * nbytes).from_buffer(buffer)
        processed_bytes = ctypes.c_size_t(0)

        with self._raise_on_error():
            result = Security.SSLRead(
                self.context, buffer, nbytes, ctypes.byref(processed_bytes)
            )

        # There are some result codes that we want to treat as "not always
        # errors". Specifically, those are errSSLWouldBlock,
        # errSSLClosedGraceful, and errSSLClosedNoNotify.
        if result == SecurityConst.errSSLWouldBlock:
            # If we didn't process any bytes, then this was just a time out.
            # However, we can get errSSLWouldBlock in situations when we *did*
            # read some data, and in those cases we should just read "short"
            # and return.
            if processed_bytes.value == 0:
                # Timed out, no data read.
                raise socket.timeout("recv timed out")
        elif result in (SecurityConst.errSSLClosedGraceful, SecurityConst.errSSLClosedNoNotify):
            # The remote peer has closed this connection. We should do so as
            # well. Note that we don't actually return here because in
            # principle this could actually be fired along with return data.
            # It's unlikely though.
            self.close()
        else:
            _assert_no_error(result)

        # Ok, we read and probably succeeded. We should return whatever data
        # was actually read.
        return processed_bytes.value

    def settimeout(self, timeout):
        self._timeout = timeout

    def gettimeout(self):
        return self._timeout

    def send(self, data):
        processed_bytes = ctypes.c_size_t(0)

        with self._raise_on_error():
            result = Security.SSLWrite(
                self.context, data, len(data), ctypes.byref(processed_bytes)
            )

        if result == SecurityConst.errSSLWouldBlock and processed_bytes.value == 0:
            # Timed out
            raise socket.timeout("send timed out")
        else:
            _assert_no_error(result)

        # We sent, and probably succeeded. Tell them how much we sent.
        return processed_bytes.value

    def sendall(self, data):
        total_sent = 0
        while total_sent < len(data):
            sent = self.send(data[total_sent:total_sent + SSL_WRITE_BLOCKSIZE])
            total_sent += sent

    def shutdown(self):
        with self._raise_on_error():
            Security.SSLClose(self.context)

    def close(self):
        # TODO: should I do clean shutdown here? Do I have to?
        if self._makefile_refs < 1:
            self._closed = True
            if self.context:
                CoreFoundation.CFRelease(self.context)
                self.context = None
            if self._client_cert_chain:
                CoreFoundation.CFRelease(self._client_cert_chain)
                self._client_cert_chain = None
            if self._keychain:
                Security.SecKeychainDelete(self._keychain)
                CoreFoundation.CFRelease(self._keychain)
                shutil.rmtree(self._keychain_dir)
                self._keychain = self._keychain_dir = None
            return self.socket.close()
        else:
            self._makefile_refs -= 1

    def getpeercert(self, binary_form=False):
        # Urgh, annoying.
        #
        # Here's how we do this:
        #
        # 1. Call SSLCopyPeerTrust to get hold of the trust object for this
        #    connection.
        # 2. Call SecTrustGetCertificateAtIndex for index 0 to get the leaf.
        # 3. To get the CN, call SecCertificateCopyCommonName and process that
        #    string so that it's of the appropriate type.
        # 4. To get the SAN, we need to do something a bit more complex:
        #    a. Call SecCertificateCopyValues to get the data, requesting
        #       kSecOIDSubjectAltName.
        #    b. Mess about with this dictionary to try to get the SANs out.
        #
        # This is gross. Really gross. It's going to be a few hundred LoC extra
        # just to repeat something that SecureTransport can *already do*. So my
        # operating assumption at this time is that what we want to do is
        # instead to just flag to urllib3 that it shouldn't do its own hostname
        # validation when using SecureTransport.
        if not binary_form:
            raise ValueError(
                "SecureTransport only supports dumping binary certs"
            )
        trust = Security.SecTrustRef()
        certdata = None
        der_bytes = None

        try:
            # Grab the trust store.
            result = Security.SSLCopyPeerTrust(
                self.context, ctypes.byref(trust)
            )
            _assert_no_error(result)
            if not trust:
                # Probably we haven't done the handshake yet. No biggie.
                return None

            cert_count = Security.SecTrustGetCertificateCount(trust)
            if not cert_count:
                # Also a case that might happen if we haven't handshaked.
                # Handshook? Handshaken?
                return None

            leaf = Security.SecTrustGetCertificateAtIndex(trust, 0)
            assert leaf

            # Ok, now we want the DER bytes.
            certdata = Security.SecCertificateCopyData(leaf)
            assert certdata

            data_length = CoreFoundation.CFDataGetLength(certdata)
            data_buffer = CoreFoundation.CFDataGetBytePtr(certdata)
            der_bytes = ctypes.string_at(data_buffer, data_length)
        finally:
            if certdata:
                CoreFoundation.CFRelease(certdata)
            if trust:
                CoreFoundation.CFRelease(trust)

        return der_bytes
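        # Hedged note for callers: the DER bytes returned above can be turned
        # into PEM with the standard library if a parsed form is needed, e.g.
        #
        #     pem = ssl.DER_cert_to_PEM_cert(sock.getpeercert(binary_form=True))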

    def _reuse(self):
        self._makefile_refs += 1

    def _drop(self):
        if self._makefile_refs < 1:
            self.close()
        else:
            self._makefile_refs -= 1


if _fileobject:  # Platform-specific: Python 2
    def makefile(self, mode, bufsize=-1):
        self._makefile_refs += 1
        return _fileobject(self, mode, bufsize, close=True)
else:  # Platform-specific: Python 3
    def makefile(self, mode="r", buffering=None, *args, **kwargs):
        # We disable buffering with SecureTransport because it conflicts with
        # the buffering that ST does internally (see issue #1153 for more).
        buffering = 0
        return backport_makefile(self, mode, buffering, *args, **kwargs)

WrappedSocket.makefile = makefile


class SecureTransportContext(object):
    """
    I am a wrapper class for the SecureTransport library, to translate the
    interface of the standard library ``SSLContext`` object to calls into
    SecureTransport.
    """
    def __init__(self, protocol):
        self._min_version, self._max_version = _protocol_to_min_max[protocol]
        self._options = 0
        self._verify = False
        self._trust_bundle = None
        self._client_cert = None
        self._client_key = None
        self._client_key_passphrase = None

    @property
    def check_hostname(self):
        """
        SecureTransport cannot have its hostname checking disabled. For more,
        see the comment on getpeercert() in this file.
        """
        return True

    @check_hostname.setter
    def check_hostname(self, value):
        """
        SecureTransport cannot have its hostname checking disabled. For more,
        see the comment on getpeercert() in this file.
        """
        pass

    @property
    def options(self):
        # TODO: Well, crap.
        #
        # So this is the bit of the code that is the most likely to cause us
        # trouble. Essentially we need to enumerate all of the SSL options that
        # users might want to use and try to see if we can sensibly translate
        # them, or whether we should just ignore them.
        return self._options

    @options.setter
    def options(self, value):
        # TODO: Update in line with above.
        self._options = value

    @property
    def verify_mode(self):
        return ssl.CERT_REQUIRED if self._verify else ssl.CERT_NONE

    @verify_mode.setter
    def verify_mode(self, value):
        self._verify = True if value == ssl.CERT_REQUIRED else False

    def set_default_verify_paths(self):
        # So, this has to do something a bit weird. Specifically, what it does
        # is nothing.
        #
        # This means that, if we had previously had load_verify_locations
        # called, this does not undo that. We need to do that because it turns
        # out that the rest of the urllib3 code will attempt to load the
        # default verify paths if it hasn't been told about any paths, even if
        # the context itself was given paths sometime earlier. We resolve that
        # by just ignoring it.
        pass

    def load_default_certs(self):
        return self.set_default_verify_paths()

    def set_ciphers(self, ciphers):
        # For now, we just require the default cipher string.
        if ciphers != util.ssl_.DEFAULT_CIPHERS:
            raise ValueError(
                "SecureTransport doesn't support custom cipher strings"
            )

    def load_verify_locations(self, cafile=None, capath=None, cadata=None):
        # OK, we only really support cadata and cafile.
        if capath is not None:
            raise ValueError(
                "SecureTransport does not support cert directories"
            )

        self._trust_bundle = cafile or cadata

    def load_cert_chain(self, certfile, keyfile=None, password=None):
        self._client_cert = certfile
        self._client_key = keyfile
        self._client_key_passphrase = password

    def wrap_socket(self, sock, server_side=False,
                    do_handshake_on_connect=True, suppress_ragged_eofs=True,
                    server_hostname=None):
        # So, what do we do here? Firstly, we assert some properties. This is a
        # stripped down shim, so there is some functionality we don't support.
        # See PEP 543 for the real deal.
        assert not server_side
        assert do_handshake_on_connect
        assert suppress_ragged_eofs

        # Ok, we're good to go. Now we want to create the wrapped socket object
        # and store it in the appropriate place.
        wrapped_socket = WrappedSocket(sock)

        # Now we can handshake
        wrapped_socket.handshake(
            server_hostname, self._verify, self._trust_bundle,
            self._min_version, self._max_version, self._client_cert,
            self._client_key, self._client_key_passphrase
        )
        return wrapped_socket
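

# Hedged usage sketch: this mirrors the module docstring, and the import path
# below assumes the copy vendored inside pip rather than a standalone urllib3.
#
#     from pip._vendor.urllib3.contrib import securetransport
#     securetransport.inject_into_urllib3()
#
# After injection, urllib3 builds its TLS connections through
# SecureTransportContext.wrap_socket() above instead of OpenSSL.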
site-packages/pip/_vendor/urllib3/contrib/__pycache__/ (compiled bytecode entries; binary content omitted):
    securetransport.cpython-36.opt-1.pyc
    ntlmpool.cpython-36.pyc
    appengine.cpython-36.pyc
    pyopenssl.cpython-36.pyc
    __init__.cpython-36.pyc
    __init__.cpython-36.opt-1.pyc
    securetransport.cpython-36.pyc
||_dS)N)r�)rJr^rrrr��scCs|jrtjStjS)N)r�r\�
CERT_REQUIREDZ	CERT_NONE)rJrrr�verify_mode�sz"SecureTransportContext.verify_modecCs|tjkrdnd|_dS)NTF)r\r�r�)rJr^rrrr��scCsdS)Nr)rJrrr�set_default_verify_paths�s
z/SecureTransportContext.set_default_verify_pathscCs|j�S)N)r�)rJrrr�load_default_certs�sz)SecureTransportContext.load_default_certscCs|tjjkrtd��dS)Nz5SecureTransport doesn't support custom cipher strings)rrZDEFAULT_CIPHERSrw)rJrPrrr�set_cipherssz"SecureTransportContext.set_ciphersNcCs|dk	rtd��|p||_dS)Nz1SecureTransport does not support cert directories)rwr�)rJZcafileZcapathZcadatarrr�load_verify_locationssz,SecureTransportContext.load_verify_locationscCs||_||_||_dS)N)r�r�Z_client_cert_passphrase)rJZcertfileZkeyfileZpasswordrrr�load_cert_chainsz&SecureTransportContext.load_cert_chainFTc	CsL|s
t�|st�|st�t|�}|j||j|j|j|j|j|j|j	�|S)N)
rxrArlr�r�r�r�r�r�r�)rJZsockZserver_sideZdo_handshake_on_connectZsuppress_ragged_eofsrkr6rrr�wrap_sockets

z"SecureTransportContext.wrap_socket)NNN)NN)FTTN)r|r}r~rrK�propertyr��setterr�r�r�r�r�r�r�r�rrrrr�s 	

	
r���)r�)r�N)erZ
__future__rr�r&r*Zos.pathrTrvr$r\Z	threading�weakref�rZ_securetransport.bindingsrrrZ_securetransport.low_levelr	r
rrr
�ImportErrorZpackages.backports.makefilerr(�	NameError�__all__rr rrr�WeakValueDictionaryr!ZLockrfrsZTLS_AES_256_GCM_SHA384ZTLS_CHACHA20_POLY1305_SHA256ZTLS_AES_128_GCM_SHA256Z'TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384Z%TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384Z'TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256Z%TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256Z#TLS_DHE_DSS_WITH_AES_256_GCM_SHA384Z#TLS_DHE_RSA_WITH_AES_256_GCM_SHA384Z#TLS_DHE_DSS_WITH_AES_128_GCM_SHA256Z#TLS_DHE_RSA_WITH_AES_128_GCM_SHA256Z'TLS_ECDHE_ECDSA_WITH_AES_256_CBC_SHA384Z%TLS_ECDHE_RSA_WITH_AES_256_CBC_SHA384Z$TLS_ECDHE_ECDSA_WITH_AES_256_CBC_SHAZ"TLS_ECDHE_RSA_WITH_AES_256_CBC_SHAZ#TLS_DHE_RSA_WITH_AES_256_CBC_SHA256Z#TLS_DHE_DSS_WITH_AES_256_CBC_SHA256Z TLS_DHE_RSA_WITH_AES_256_CBC_SHAZ TLS_DHE_DSS_WITH_AES_256_CBC_SHAZ'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256Z%TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256Z$TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHAZ"TLS_ECDHE_RSA_WITH_AES_128_CBC_SHAZ#TLS_DHE_RSA_WITH_AES_128_CBC_SHA256Z#TLS_DHE_DSS_WITH_AES_128_CBC_SHA256Z TLS_DHE_RSA_WITH_AES_128_CBC_SHAZ TLS_DHE_DSS_WITH_AES_128_CBC_SHAZTLS_RSA_WITH_AES_256_GCM_SHA384ZTLS_RSA_WITH_AES_128_GCM_SHA256ZTLS_RSA_WITH_AES_256_CBC_SHA256ZTLS_RSA_WITH_AES_128_CBC_SHA256ZTLS_RSA_WITH_AES_256_CBC_SHAZTLS_RSA_WITH_AES_128_CBC_SHArOZPROTOCOL_SSLv23Z
kTLSProtocol1ZkTLSProtocol12r��hasattrZ
kSSLProtocol2rZ
kSSLProtocol3rrZkTLSProtocol11rrrrrr;r@ZSSLReadFuncrdZSSLWriteFuncre�objectrAr�rrrrr�<module>s�95



site-packages/pip/_vendor/urllib3/contrib/__pycache__/appengine.cpython-36.opt-1.pyc000064400000021027147511334600024275 0ustar003

���eq*�@s dZddlmZddlZddlZddlZddlmZddlm	Z	m
Z
mZmZm
Z
mZddlmZddlmZdd	lmZdd
lmZddlmZyddlmZWnek
r�dZYnXeje�ZGd
d�de
�ZGdd�de	�Z Gdd�de�Z!dd�Z"dd�Z#dd�Z$dd�Z%dd�Z&dS)aC
This module provides a pool manager that uses Google App Engine's
`URLFetch Service <https://cloud.google.com/appengine/docs/python/urlfetch>`_.

Example usage::

    from urllib3 import PoolManager
    from urllib3.contrib.appengine import AppEngineManager, is_appengine_sandbox

    if is_appengine_sandbox():
        # AppEngineManager uses AppEngine's URLFetch API behind the scenes
        http = AppEngineManager()
    else:
        # PoolManager uses a socket-level API behind the scenes
        http = PoolManager()

    r = http.request('GET', 'https://google.com/')

There are `limitations <https://cloud.google.com/appengine/docs/python/urlfetch/#Python_Quotas_and_limits>`_ to the URLFetch service and it may not be
the best choice for your application. There are three options for using
urllib3 on Google App Engine:

1. You can use :class:`AppEngineManager` with URLFetch. URLFetch is
   cost-effective in many circumstances as long as your usage is within the
   limitations.
2. You can use a normal :class:`~urllib3.PoolManager` by enabling sockets.
   Sockets also have `limitations and restrictions
   <https://cloud.google.com/appengine/docs/python/sockets/   #limitations-and-restrictions>`_ and have a lower free quota than URLFetch.
   To use sockets, be sure to specify the following in your ``app.yaml``::

        env_variables:
            GAE_USE_SOCKETS_HTTPLIB : 'true'

3. If you are using `App Engine Flexible
<https://cloud.google.com/appengine/docs/flexible/>`_, you can use the standard
:class:`PoolManager` without any configuration or special environment variables.
�)�absolute_importN�)�urljoin)�	HTTPError�HTTPWarning�
MaxRetryError�
ProtocolError�TimeoutError�SSLError)�BytesIO)�RequestMethods)�HTTPResponse)�Timeout)�Retry)�urlfetchc@seZdZdS)�AppEnginePlatformWarningN)�__name__�
__module__�__qualname__�rr�/usr/lib/python3.6/appengine.pyrGsrc@seZdZdS)�AppEnginePlatformErrorN)rrrrrrrrKsrc@sXeZdZdZddd�Zdd�Zdd	�Zddddejfd
d�Z	dd
�Z
dd�Zdd�ZdS)�AppEngineManagera
    Connection manager for Google App Engine sandbox applications.

    This manager uses the URLFetch service directly instead of using the
    emulated httplib, and is subject to URLFetch limitations as described in
    the App Engine documentation `here
    <https://cloud.google.com/appengine/docs/python/urlfetch>`_.

    Notably it will raise an :class:`AppEnginePlatformError` if:
        * URLFetch is not available.
        * If you attempt to use this on App Engine Flexible, as full socket
          support is available.
        * If a request size is more than 10 megabytes.
        * If a response size is more than 32 megabtyes.
        * If you use an unsupported request method such as OPTIONS.

    Beyond those cases, it will raise normal urllib3 errors.
    NTcCsNtstd��t�rtd��tjdt�tj||�||_||_	|pFt
j|_dS)Nz.URLFetch is not available in this environment.z�Use normal urllib3.PoolManager instead of AppEngineManageron Managed VMs, as using URLFetch is not necessary in this environment.z�urllib3 is using URLFetch on Google App Engine sandbox instead of sockets. To use sockets directly instead of URLFetch see https://urllib3.readthedocs.io/en/latest/reference/urllib3.contrib.html.)
rr�is_prod_appengine_mvms�warnings�warnrr�__init__�validate_certificate�urlfetch_retriesrZDEFAULT�retries)�self�headersrrrrrrrcszAppEngineManager.__init__cCs|S)Nr)r rrr�	__enter__{szAppEngineManager.__enter__cCsdS)NFr)r �exc_typeZexc_valZexc_tbrrr�__exit__~szAppEngineManager.__exit__cKs�|j||�}yF|o |jdko |j}	tj||||p2id|jo<|	|j|�|jd�}
W�nBtjk
r�}zt	||��WYdd}~X�ntj
k
r�}z$dt|�kr�td|��t
|��WYdd}~Xn�tjk
�r}z(dt|�kr�t|||d��t
|��WYdd}~Xn�tjk
�r6}ztd|��WYdd}~Xn`tjk
�rb}zt|��WYdd}~Xn4tjk
�r�}ztd	||��WYdd}~XnX|j|
fd
|i|��}|�o�|j�}
|
�rr|j�r�|j�r�t||d��n�|jdk�r�d
}y|j||||d�}Wn*tk
�r.|j�r*t||d��|SX|j|�tjd||
�t||
�}|j||||f|||d�|��St|jd��}|j ||j|��r�|j||||d�}tjd|�|j!|�|j||f|||||d�|��S|S)NrF)Zpayload�methodr!Zallow_truncated�follow_redirectsZdeadlinerz	too largezOURLFetch request too large, URLFetch only supports requests up to 10mb in size.zToo many redirects)�reasonzPURLFetch response too large, URLFetch only supportsresponses up to 32mb in size.z$URLFetch does not support method: %srztoo many redirectsi/ZGET)�responseZ_poolzRedirecting %s -> %s)r�redirect�timeoutzRetry-Afterz	Retry: %s)�bodyr!rr)r*)"�_get_retriesr)�totalrZfetchr�_get_absolute_timeoutrZDeadlineExceededErrorr	ZInvalidURLError�strrrZ
DownloadErrorrZResponseTooLargeErrorZSSLCertificateErrorr
ZInvalidMethodError�#_urlfetch_response_to_http_responseZget_redirect_locationZraise_on_redirect�statusZ	incrementZsleep_for_retry�log�debugr�urlopen�boolZ	getheaderZis_retryZsleep)r r%Zurlr+r!rr)r*�response_kwr&r(�eZ
http_responseZredirect_locationZredirect_urlZhas_retry_afterrrrr4�s�




zAppEngineManager.urlopencKszt�r"|jjd�}|dkr"|jd=|jjd�}|dkrZ|jd�}|jd�dj|�|jd<tft|j�|j|j	d�|��S)Nzcontent-encodingZdeflateztransfer-encodingZchunked�,)r+r!r1)
�is_prod_appenginer!�get�split�remove�joinr
rZcontentZstatus_code)r Z
urlfetch_respr6Zcontent_encodingZtransfer_encodingZ	encodingsrrrr0�s

z4AppEngineManager._urlfetch_response_to_http_responsecCsB|tjkrdSt|t�r>|jdk	s,|jdk	r8tjdt�|jS|S)NzdURLFetch does not support granular timeout settings, reverting to total or default URLFetch timeout.)	r�DEFAULT_TIMEOUT�
isinstanceZ_readZ_connectrrrr-)r r*rrrr.�s

z&AppEngineManager._get_absolute_timeoutcCs>t|t�stj|||jd�}|js.|js.|jr:tjdt	�|S)N)r)�defaultzhURLFetch only supports total retries and does not recognize connect, read, or redirect retry parameters.)
r?rZfrom_intrZconnect�readr)rrr)r rr)rrrr,s
zAppEngineManager._get_retries)NNTT)
rrr�__doc__rr"r$rr>r4r0r.r,rrrrrOs
ZrcCst�pt�pt�S)N)�is_local_appenginer9rrrrr�is_appenginesrDcCst�ot�S)N)rDrrrrr�is_appengine_sandboxsrEcCsdtjkodtjdkS)N�APPENGINE_RUNTIMEzDevelopment/�SERVER_SOFTWARE)�os�environrrrrrCs
rCcCs dtjkodtjdkot�S)NrFzGoogle App Engine/rG)rHrIrrrrrr9!s
r9cCstjjdd�dkS)NZGAE_VMF�true)rHrIr:rrrrr'sr)'rBZ
__future__rZloggingrHrZpackages.six.moves.urllib.parser�
exceptionsrrrrr	r
Zpackages.sixrZrequestrr(r
Zutil.timeoutrZ
util.retryrZgoogle.appengine.apir�ImportErrorZ	getLoggerrr2rrrrDrErCr9rrrrr�<module>'s2 	

Dsite-packages/pip/_vendor/urllib3/contrib/__pycache__/ntlmpool.cpython-36.opt-1.pyc000064400000006131147511334600024172 0ustar003

���e~�@s\dZddlmZddlmZddlmZddlmZddlm	Z	ee
�ZGdd	�d	e�Zd
S)z
NTLM authenticating pool, contributed by erikcederstran

Issue #10, see: http://code.google.com/p/urllib3/issues/detail?id=10
�)�absolute_import)�	getLogger)�ntlm�)�HTTPSConnectionPool)�HTTPSConnectioncs:eZdZdZdZ�fdd�Zdd�Zd�fd
d�	Z�ZS)
�NTLMConnectionPoolzQ
    Implements an NTLM authentication version of an urllib3 connection pool
    ZhttpscsLtt|�j||�||_||_|jdd�}|dj�|_|d|_||_	dS)z�
        authurl is a random URL on the server that is protected by NTLM.
        user is the Windows user, probably in the DOMAIN\username format.
        pw is the password for the user.
        �\�rN)
�superr�__init__�authurl�rawuser�split�upper�domain�user�pw)�selfrrr
�args�kwargsZ
user_parts)�	__class__��/usr/lib/python3.6/ntlmpool.pyrs
zNTLMConnectionPool.__init__c
Cs�|jd7_tjd|j|j|j�i}d|d<d}d}t|j|jd�}dtj|j	�||<tjd	|�|j
d
|jd|�|j�}t|j
��}tjd|j|j�tjd|�tjd
|jd��d|_||jd�}d}x(|D] }	|	dd�dkr�|	dd�}q�W|dk�rtd|||f��tj|�\}
}tj|
|j|j|j|�}d|||<tjd	|�|j
d
|jd|�|j�}tjd|j|j�tjdt|j
���tjd
|j�dd��|jdk�r�|jdk�r�td��td|j|jf��d|_tjd�|S)Nr
z3Starting NTLM HTTPS connection no. %d: https://%s%sz
Keep-Alive�
ConnectionZ
Authorizationzwww-authenticate)�host�portzNTLM %szRequest headers: %sZGETzResponse status: %s %szResponse headers: %szResponse data: %s [...]�dz, �zNTLM z!Unexpected %s response header: %s��i�z3Server rejected request: wrong username or passwordzWrong server response: %s %szConnection established)Znum_connections�log�debugrr
rrrZcreate_NTLM_NEGOTIATE_MESSAGErZrequestZgetresponse�dictZ
getheadersZstatus�reason�read�fpr�	ExceptionZparse_NTLM_CHALLENGE_MESSAGEZ create_NTLM_AUTHENTICATE_MESSAGErrr)
r�headersZ
req_headerZresp_headerZconn�resZreshdrZauth_header_valuesZauth_header_value�sZServerChallengeZNegotiateFlagsZauth_msgrrr�	_new_conn's\


zNTLMConnectionPool._new_connN�Tcs0|dkri}d|d<tt|�j|||||||�S)Nz
Keep-Aliver)rr�urlopen)r�methodZurlZbodyr'ZretriesZredirectZassert_same_host)rrrr,hszNTLMConnectionPool.urlopen)NNr+TT)	�__name__�
__module__�__qualname__�__doc__�schemerr*r,�
__classcell__rr)rrrsArN)
r1Z
__future__rZloggingrr�rZpackages.six.moves.http_clientrr.r rrrrr�<module>ssite-packages/pip/_vendor/urllib3/contrib/__pycache__/pyopenssl.cpython-36.opt-1.pyc000064400000033337147511334600024372 0ustar003

���e�;�@s,dZddlmZddlZddlmZddlmZ	ddl
mZddlm
Z
mZddlmZydd	lmZWn$ek
r�dZd
dlmZYnXddlZddlZd
dlmZddlZd
d
lmZddgZdZejej j!ej"ej j#iZ$e%ed�o�e%ej d��rej j&e$ej'<e%ed��r0e%ej d��r0ej j(e$ej)<ye$j*ej+ej j,i�Wne-k
�r^YnXej.ej j/ej0ej j1ej2ej j1ej j3iZ4e5dd�e4j6�D��Z7dZ8ejZ9ej:j;Z<ej=e>�Z?dd�Z@dd�ZAdd�ZBdd�ZCdd�ZDGd d!�d!eE�ZFe�rd*d#d$�ZGneZGeGeF_GGd%d&�d&eE�ZHd'd(�ZIdS)+ab
SSL with SNI_-support for Python 2. Follow these instructions if you would
like to verify SSL certificates in Python 2. Note, the default libraries do
*not* do certificate checking; you need to do additional work to validate
certificates yourself.

This needs the following packages installed:

* pyOpenSSL (tested with 16.0.0)
* cryptography (minimum 1.3.4, from pyopenssl)
* idna (minimum 2.0, from cryptography)

However, pyopenssl depends on cryptography, which depends on idna, so while we
use all three directly here we end up having relatively few packages required.

You can install them with the following command:

    pip install pyopenssl cryptography idna

To activate certificate checking, call
:func:`~urllib3.contrib.pyopenssl.inject_into_urllib3` from your Python code
before you begin making HTTP requests. This can be done in a ``sitecustomize``
module, or at any other time before your application begins using ``urllib3``,
like this::

    try:
        import urllib3.contrib.pyopenssl
        urllib3.contrib.pyopenssl.inject_into_urllib3()
    except ImportError:
        pass

Now you can use :mod:`urllib3` as you normally would, and it will support SNI
when the required modules are installed.

Activating this module also has the positive side effect of disabling SSL/TLS
compression in Python 2 (see `CRIME attack`_).

If you want to configure the default list of supported cipher suites, you can
set the ``urllib3.contrib.pyopenssl.DEFAULT_SSL_CIPHER_LIST`` variable.

.. _sni: https://en.wikipedia.org/wiki/Server_Name_Indication
.. _crime attack: https://en.wikipedia.org/wiki/CRIME_(security_exploit)
�)�absolute_importN)�x509)�backend)�_Certificate)�timeout�error)�BytesIO)�_fileobject�)�backport_makefile)�six)�util�inject_into_urllib3�extract_from_urllib3T�PROTOCOL_TLSv1_1�TLSv1_1_METHOD�PROTOCOL_TLSv1_2�TLSv1_2_METHODccs|]\}}||fVqdS)N�)�.0�k�vrr�/usr/lib/python3.6/pyopenssl.py�	<genexpr>`sri@cCs.t�ttj_tt_ttj_dt_dtj_dS)z7Monkey-patch urllib3 with PyOpenSSL-backed SSL-support.TN)�_validate_dependencies_met�PyOpenSSLContextr
�ssl_�
SSLContext�HAS_SNI�IS_PYOPENSSLrrrrrmscCs(ttj_tt_ttj_dt_dtj_dS)z4Undo monkey-patching by :func:`inject_into_urllib3`.FN)�orig_util_SSLContextr
rr�orig_util_HAS_SNIrrrrrrrys
cCsRddlm}t|dd�dkr$td��ddlm}|�}t|dd�dkrNtd��dS)	z{
    Verifies that PyOpenSSL's package-level dependencies have been met.
    Throws `ImportError` if they are not met.
    r)�
Extensions�get_extension_for_classNzX'cryptography' module missing required functionality.  Try upgrading to v1.3.4 or newer.)�X509�_x509zS'pyOpenSSL' module missing required functionality. Try upgrading to v0.14 or newer.)Zcryptography.x509.extensionsr"�getattr�ImportErrorZOpenSSL.cryptor$)r"r$rrrrr�srcCs(dd�}||�}tjdkr$|jd�}|S)a�
    Converts a dNSName SubjectAlternativeName field to the form used by the
    standard library on the given Python version.

    Cryptography produces a dNSName as a unicode string that was idna-decoded
    from ASCII bytes. We need to idna-encode that string to get it back, and
    then on Python 3 we also need to convert to unicode via UTF-8 (the stdlib
    uses PyUnicode_FromStringAndSize on it, which decodes via UTF-8).
    cSsNddl}x:dD]2}|j|�r|t|�d�}|jd�|j|�SqW|j|�S)z�
        Borrowed wholesale from the Python Cryptography Project. It turns out
        that we can't just safely call `idna.encode`: it can explode for
        wildcard names. This avoids that problem.
        rN�*.�.�ascii)r(r))�idna�
startswith�len�encode)�namer+�prefixrrr�idna_encode�s

z'_dnsname_to_stdlib.<locals>.idna_encode�rzutf-8)r2r)�sys�version_info�decode)r/r1rrr�_dnsname_to_stdlib�s



r6cCs�t|d�r|j�}ntt|j�}y|jjtj�j	}WnNtj
k
rJgStjtjtj
tfk
r�}ztjd|�gSd}~XnXdd�|jtj�D�}|jdd�|jtj�D��|S)zU
    Given an PyOpenSSL certificate, provides all the subject alternative names.
    �to_cryptographyz�A problem was encountered with the certificate that prevented urllib3 from finding the SubjectAlternativeName field. This can affect certificate validation. The error was %sNcSsg|]}dt|�f�qS)ZDNS)r6)rr/rrr�
<listcomp>�sz%get_subj_alt_name.<locals>.<listcomp>css|]}dt|�fVqdS)z
IP AddressN)�str)rr/rrrr�sz$get_subj_alt_name.<locals>.<genexpr>)�hasattrr7r�openssl_backendr%�
extensionsr#rZSubjectAlternativeName�valueZExtensionNotFoundZDuplicateExtensionZUnsupportedExtensionZUnsupportedGeneralNameType�UnicodeError�logZwarningZget_values_for_typeZDNSName�extendZ	IPAddress)Z	peer_certZcertZext�e�namesrrr�get_subj_alt_name�s(


	rCc@s|eZdZdZddd�Zdd�Zdd�Zd	d
�Zdd�Zd
d�Z	dd�Z
dd�Zdd�Zdd�Z
d dd�Zdd�Zdd�ZdS)!�
WrappedSocketz�API-compatibility wrapper for Python OpenSSL's Connection-class.

    Note: _makefile_refs, _drop() and _reuse() are needed for the garbage
    collector of pypy.
    TcCs"||_||_||_d|_d|_dS)NrF)�
connection�socket�suppress_ragged_eofs�_makefile_refs�_closed)�selfrErFrGrrr�__init__�s
zWrappedSocket.__init__cCs
|jj�S)N)rF�fileno)rJrrrrL�szWrappedSocket.filenocCs*|jdkr|jd8_|jr&|j�dS)Nr�)rHrI�close)rJrrr�_decref_socketios�s
zWrappedSocket._decref_socketioscOs�y|jj||�}Wn�tjjk
rX}z&|jr<|jdkr<dStt|���WYdd}~Xn�tjj	k
r�}z|jj
�tjjkr�dS�WYdd}~XnJtjjk
r�t
j|j|jj��}|s�td��n|j||�SYnX|SdS)NrM�Unexpected EOF�zThe read operation timed out���)rRrP)rE�recv�OpenSSL�SSL�SysCallErrorrG�args�SocketErrorr9�ZeroReturnError�get_shutdown�RECEIVED_SHUTDOWN�
WantReadErrorr
�
wait_for_readrF�
gettimeoutr)rJrW�kwargs�datarA�rdrrrrSs 
zWrappedSocket.recvcOs�y|jj||�Stjjk
rT}z&|jr8|jdkr8dStt|���WYdd}~Xn�tjj	k
r�}z|jj
�tjjkr~dS�WYdd}~XnFtjjk
r�t
j|j|jj��}|s�td��n|j||�SYnXdS)NrM�Unexpected EOFrzThe read operation timed outrR)rRrb)rE�	recv_intorTrUrVrGrWrXr9rYrZr[r\r
r]rFr^r)rJrWr_rArarrrrcs
zWrappedSocket.recv_intocCs|jj|�S)N)rF�
settimeout)rJrrrrrd*szWrappedSocket.settimeoutcCs�xzy|jj|�Stjjk
rFtj|j|jj��}|s@t	��wYqtjj
k
rv}ztt|���WYdd}~XqXqWdS)N)
rE�sendrTrUZWantWriteErrorr
Zwait_for_writerFr^rrVrXr9)rJr`�wrrArrr�_send_until_done-szWrappedSocket._send_until_donecCs8d}x.|t|�kr2|j|||t��}||7}qWdS)Nr)r-rg�SSL_WRITE_BLOCKSIZE)rJr`Z
total_sentZsentrrr�sendall9szWrappedSocket.sendallcCs|jj�dS)N)rE�shutdown)rJrrrrj?szWrappedSocket.shutdowncCsH|jdkr6yd|_|jj�Stjjk
r2dSXn|jd8_dS)NrMT)rHrIrErNrTrU�Error)rJrrrrNCs

zWrappedSocket.closeFcCsD|jj�}|s|S|r(tjjtjj|�Sd|j�jffft|�d�S)NZ
commonName)ZsubjectZsubjectAltName)	rEZget_peer_certificaterTZcryptoZdump_certificateZ
FILETYPE_ASN1Zget_subjectZCNrC)rJZbinary_formrrrr�getpeercertMs
zWrappedSocket.getpeercertcCs|jd7_dS)NrM)rH)rJrrr�_reuse_szWrappedSocket._reusecCs&|jdkr|j�n|jd8_dS)NrM)rHrN)rJrrr�_dropbs

zWrappedSocket._dropN)T)F)�__name__�
__module__�__qualname__�__doc__rKrLrOrSrcrdrgrirjrNrlrmrnrrrrrD�s


rDrMcCs|jd7_t|||dd�S)NrMT)rN)rHr	)rJ�mode�bufsizerrr�makefilejsruc@szeZdZdZdd�Zedd��Zejdd��Zedd��Zejd	d��Zd
d�Z	dd
�Z
ddd�Zddd�Zddd�Z
dS)rz�
    I am a wrapper class for the PyOpenSSL ``Context`` object. I am responsible
    for translating the interface of the standard library ``SSLContext`` object
    to calls into PyOpenSSL.
    cCs*t||_tjj|j�|_d|_d|_dS)NrF)�_openssl_versions�protocolrTrUZContext�_ctx�_optionsZcheck_hostname)rJrwrrrrKys
zPyOpenSSLContext.__init__cCs|jS)N)ry)rJrrr�optionsszPyOpenSSLContext.optionscCs||_|jj|�dS)N)ryrxZset_options)rJr=rrrrz�scCst|jj�S)N)�_openssl_to_stdlib_verifyrxZget_verify_mode)rJrrr�verify_mode�szPyOpenSSLContext.verify_modecCs|jjt|t�dS)N)rxZ
set_verify�_stdlib_to_openssl_verify�_verify_callback)rJr=rrrr|�scCs|jj�dS)N)rx�set_default_verify_paths)rJrrrr�sz)PyOpenSSLContext.set_default_verify_pathscCs&t|tj�r|jd�}|jj|�dS)Nzutf-8)�
isinstancer�	text_typer.rxZset_cipher_list)rJZciphersrrr�set_ciphers�s
zPyOpenSSLContext.set_ciphersNcCsN|dk	r|jd�}|dk	r$|jd�}|jj||�|dk	rJ|jjt|��dS)Nzutf-8)r.rx�load_verify_locationsr)rJZcafileZcapathZcadatarrrr��s

z&PyOpenSSLContext.load_verify_locationscs<|jj|��dk	r(|jj�fdd��|jj|p4|�dS)Ncs�S)Nr)Z
max_lengthZprompt_twiceZuserdata)�passwordrr�<lambda>�sz2PyOpenSSLContext.load_cert_chain.<locals>.<lambda>)rxZuse_certificate_fileZ
set_passwd_cbZuse_privatekey_file)rJZcertfileZkeyfiler�r)r�r�load_cert_chain�sz PyOpenSSLContext.load_cert_chainFTc	Cs�tjj|j|�}t|tj�r&|jd�}|dk	r8|j|�|j	�x|y|j
�Wnhtjjk
r�tj
||j��}|s~td��wBYn4tjjk
r�}ztjd|��WYdd}~XnXPqBWt||�S)Nzutf-8zselect timed outzbad handshake: %r)rTrUZ
Connectionrxr�rr�r.Zset_tlsext_host_nameZset_connect_stateZdo_handshaker\r
r]r^rrk�sslZSSLErrorrD)	rJZsockZserver_sideZdo_handshake_on_connectrGZserver_hostname�cnxrarArrr�wrap_socket�s$

 zPyOpenSSLContext.wrap_socket)NNN)NN)FTTN)rorprqrrrK�propertyrz�setterr|rr�r�r�r�rrrrrss
	
rcCs|dkS)Nrr)r�rZerr_noZ	err_depthZreturn_coderrrr~�sr~rR)rR)JrrZ
__future__rZOpenSSL.SSLrTZcryptographyrZ$cryptography.hazmat.backends.opensslrr;Z)cryptography.hazmat.backends.openssl.x509rrFrrrX�iorr	r'Zpackages.backports.makefilerZloggingr�Zpackagesrr3�r
�__all__rZPROTOCOL_SSLv23rUZ
SSLv23_METHODZPROTOCOL_TLSv1ZTLSv1_METHODrvr:rrrr�updateZPROTOCOL_SSLv3ZSSLv3_METHOD�AttributeErrorZ	CERT_NONEZVERIFY_NONEZ
CERT_OPTIONALZVERIFY_PEERZ
CERT_REQUIREDZVERIFY_FAIL_IF_NO_PEER_CERTr}�dict�itemsr{rhr!rrr Z	getLoggerror?rrrr6rC�objectrDrurr~rrrr�<module>+sh




3Ssite-packages/pip/_vendor/urllib3/contrib/__pycache__/socks.cpython-36.opt-1.pyc000064400000011162147511334600023450 0ustar003

���e3�@s(dZddlmZyddlZWn6ek
rRddlZddlmZejde��YnXddl	m
ZmZ
ddlmZmZdd	lmZmZdd
lmZmZddlmZddlmZyddlZWnek
r�dZYnXGd
d�de�ZGdd�dee�ZGdd�de�ZGdd�de�ZGdd�de�ZdS)a�
This module contains provisional support for SOCKS proxies from within
urllib3. This module supports SOCKS4 (specifically the SOCKS4A variant) and
SOCKS5. To enable its functionality, either install PySocks or install this
module with the ``socks`` extra.

The SOCKS implementation supports the full range of urllib3 features. It also
supports the following SOCKS features:

- SOCKS4
- SOCKS4a
- SOCKS5
- Usernames and passwords for the SOCKS proxy

Known Limitations:

- Currently PySocks does not support contacting remote websites via literal
  IPv6 addresses. Any such connection attempt will fail. You must use a domain
  name.
- Currently PySocks does not support IPv6 connections to the SOCKS proxy. Any
  such connection attempt will fail.
�)�absolute_importN�)�DependencyWarningz�SOCKS support in urllib3 requires the installation of optional dependencies: specifically, PySocks.  For more information, see https://urllib3.readthedocs.io/en/latest/contrib.html#socks-proxies)�error�timeout)�HTTPConnection�HTTPSConnection)�HTTPConnectionPool�HTTPSConnectionPool)�ConnectTimeoutError�NewConnectionError)�PoolManager)�	parse_urlcs(eZdZdZ�fdd�Zdd�Z�ZS)�SOCKSConnectionzG
    A plain-text HTTP connection that connects via a SOCKS proxy.
    cs"|jd�|_tt|�j||�dS)N�_socks_options)�popr�superr�__init__)�self�args�kwargs)�	__class__��/usr/lib/python3.6/socks.pyr?szSOCKSConnection.__init__cCsXi}|jr|j|d<|jr$|j|d<yTtj|j|jff|jd|jd|jd|jd|jd|jd|jd	�|��}Wn�tk
r�}zt	|d
|j|jf��WYdd}~Xn�tj
k
�r"}zT|j�r|j}t|t�r�t	|d
|j|jf��nt
|d|��nt
|d|��WYdd}~Xn2tk
�rR}zt
|d|��WYdd}~XnX|S)
zA
        Establish a new connection via the SOCKS proxy.
        �source_address�socket_options�
socks_version�
proxy_host�
proxy_port�username�password�rdns)Z
proxy_typeZ
proxy_addrrZproxy_usernameZproxy_passwordZ
proxy_rdnsrz0Connection to %s timed out. (connect timeout=%s)Nz(Failed to establish a new connection: %s)rr�socksZcreate_connection�host�portrr�
SocketTimeoutrZ
ProxyErrorZ
socket_err�
isinstancer�SocketError)rZextra_kwZconn�errrr�	_new_connCsL

 
zSOCKSConnection._new_conn)�__name__�
__module__�__qualname__�__doc__rr)�
__classcell__rr)rrr;src@seZdZdS)�SOCKSHTTPSConnectionN)r*r+r,rrrrr/�sr/c@seZdZeZdS)�SOCKSHTTPConnectionPoolN)r*r+r,r�
ConnectionClsrrrrr0�sr0c@seZdZeZdS)�SOCKSHTTPSConnectionPoolN)r*r+r,r/r1rrrrr2�sr2cs,eZdZdZeed�Zd�fdd�	Z�ZS)�SOCKSProxyManagerzh
    A version of the urllib3 ProxyManager that routes connections via the
    defined SOCKS proxy.
    )ZhttpZhttpsN�
cs�t|�}|jdkrtj}d}	nN|jdkr4tj}d}	n8|jdkrJtj}d}	n"|jdkr`tj}d}	ntd|��||_||j|j|||	d�}
|
|d	<t	t
|�j||f|�t
j|_dS)
NZsocks5FZsocks5hTZsocks4Zsocks4az)Unable to determine SOCKS version from %s)rrrrr r!r)
r�schemer"ZPROXY_TYPE_SOCKS5ZPROXY_TYPE_SOCKS4�
ValueError�	proxy_urlr#r$rr3r�pool_classes_by_scheme)rr7rr Z	num_poolsZheadersZconnection_pool_kwZparsedrr!Z
socks_options)rrrr�s4





zSOCKSProxyManager.__init__)NNr4N)	r*r+r,r-r0r2r8rr.rr)rrr3�s
r3) r-Z
__future__rr"�ImportError�warnings�
exceptionsr�warnZsocketrr'rr%Z
connectionrrZconnectionpoolr	r
rrZpoolmanagerr
Zutil.urlrZsslrr/r0r2r3rrrr�<module>s2
Fsite-packages/pip/_vendor/urllib3/contrib/__pycache__/socks.cpython-36.pyc000064400000011162147511334600022511 0ustar003

���e3�@s(dZddlmZyddlZWn6ek
rRddlZddlmZejde��YnXddl	m
ZmZ
ddlmZmZdd	lmZmZdd
lmZmZddlmZddlmZyddlZWnek
r�dZYnXGd
d�de�ZGdd�dee�ZGdd�de�ZGdd�de�ZGdd�de�ZdS)a�
This module contains provisional support for SOCKS proxies from within
urllib3. This module supports SOCKS4 (specifically the SOCKS4A variant) and
SOCKS5. To enable its functionality, either install PySocks or install this
module with the ``socks`` extra.

The SOCKS implementation supports the full range of urllib3 features. It also
supports the following SOCKS features:

- SOCKS4
- SOCKS4a
- SOCKS5
- Usernames and passwords for the SOCKS proxy

Known Limitations:

- Currently PySocks does not support contacting remote websites via literal
  IPv6 addresses. Any such connection attempt will fail. You must use a domain
  name.
- Currently PySocks does not support IPv6 connections to the SOCKS proxy. Any
  such connection attempt will fail.
�)�absolute_importN�)�DependencyWarningz�SOCKS support in urllib3 requires the installation of optional dependencies: specifically, PySocks.  For more information, see https://urllib3.readthedocs.io/en/latest/contrib.html#socks-proxies)�error�timeout)�HTTPConnection�HTTPSConnection)�HTTPConnectionPool�HTTPSConnectionPool)�ConnectTimeoutError�NewConnectionError)�PoolManager)�	parse_urlcs(eZdZdZ�fdd�Zdd�Z�ZS)�SOCKSConnectionzG
    A plain-text HTTP connection that connects via a SOCKS proxy.
    cs"|jd�|_tt|�j||�dS)N�_socks_options)�popr�superr�__init__)�self�args�kwargs)�	__class__��/usr/lib/python3.6/socks.pyr?szSOCKSConnection.__init__cCsXi}|jr|j|d<|jr$|j|d<yTtj|j|jff|jd|jd|jd|jd|jd|jd|jd	�|��}Wn�tk
r�}zt	|d
|j|jf��WYdd}~Xn�tj
k
�r"}zT|j�r|j}t|t�r�t	|d
|j|jf��nt
|d|��nt
|d|��WYdd}~Xn2tk
�rR}zt
|d|��WYdd}~XnX|S)
zA
        Establish a new connection via the SOCKS proxy.
        �source_address�socket_options�
socks_version�
proxy_host�
proxy_port�username�password�rdns)Z
proxy_typeZ
proxy_addrrZproxy_usernameZproxy_passwordZ
proxy_rdnsrz0Connection to %s timed out. (connect timeout=%s)Nz(Failed to establish a new connection: %s)rr�socksZcreate_connection�host�portrr�
SocketTimeoutrZ
ProxyErrorZ
socket_err�
isinstancer�SocketError)rZextra_kwZconn�errrr�	_new_connCsL

 
zSOCKSConnection._new_conn)�__name__�
__module__�__qualname__�__doc__rr)�
__classcell__rr)rrr;src@seZdZdS)�SOCKSHTTPSConnectionN)r*r+r,rrrrr/�sr/c@seZdZeZdS)�SOCKSHTTPConnectionPoolN)r*r+r,r�
ConnectionClsrrrrr0�sr0c@seZdZeZdS)�SOCKSHTTPSConnectionPoolN)r*r+r,r/r1rrrrr2�sr2cs,eZdZdZeed�Zd�fdd�	Z�ZS)�SOCKSProxyManagerzh
    A version of the urllib3 ProxyManager that routes connections via the
    defined SOCKS proxy.
    )ZhttpZhttpsN�
cs�t|�}|jdkrtj}d}	nN|jdkr4tj}d}	n8|jdkrJtj}d}	n"|jdkr`tj}d}	ntd|��||_||j|j|||	d�}
|
|d	<t	t
|�j||f|�t
j|_dS)
NZsocks5FZsocks5hTZsocks4Zsocks4az)Unable to determine SOCKS version from %s)rrrrr r!r)
r�schemer"ZPROXY_TYPE_SOCKS5ZPROXY_TYPE_SOCKS4�
ValueError�	proxy_urlr#r$rr3r�pool_classes_by_scheme)rr7rr Z	num_poolsZheadersZconnection_pool_kwZparsedrr!Z
socks_options)rrrr�s4





zSOCKSProxyManager.__init__)NNr4N)	r*r+r,r-r0r2r8rr.rr)rrr3�s
r3) r-Z
__future__rr"�ImportError�warnings�
exceptionsr�warnZsocketrr'rr%Z
connectionrrZconnectionpoolr	r
rrZpoolmanagerr
Zutil.urlrZsslrr/r0r2r3rrrr�<module>s2
Fsite-packages/pip/_vendor/urllib3/contrib/socks.py000064400000014063147511334600016230 0ustar00# -*- coding: utf-8 -*-
"""
This module contains provisional support for SOCKS proxies from within
urllib3. This module supports SOCKS4 (specifically the SOCKS4A variant) and
SOCKS5. To enable its functionality, either install PySocks or install this
module with the ``socks`` extra.

The SOCKS implementation supports the full range of urllib3 features. It also
supports the following SOCKS features:

- SOCKS4
- SOCKS4a
- SOCKS5
- Usernames and passwords for the SOCKS proxy

Known Limitations:

- Currently PySocks does not support contacting remote websites via literal
  IPv6 addresses. Any such connection attempt will fail. You must use a domain
  name.
- Currently PySocks does not support IPv6 connections to the SOCKS proxy. Any
  such connection attempt will fail.
"""
from __future__ import absolute_import

try:
    import socks
except ImportError:
    import warnings
    from ..exceptions import DependencyWarning

    warnings.warn((
        'SOCKS support in urllib3 requires the installation of optional '
        'dependencies: specifically, PySocks.  For more information, see '
        'https://urllib3.readthedocs.io/en/latest/contrib.html#socks-proxies'
        ),
        DependencyWarning
    )
    raise

from socket import error as SocketError, timeout as SocketTimeout

from ..connection import (
    HTTPConnection, HTTPSConnection
)
from ..connectionpool import (
    HTTPConnectionPool, HTTPSConnectionPool
)
from ..exceptions import ConnectTimeoutError, NewConnectionError
from ..poolmanager import PoolManager
from ..util.url import parse_url

try:
    import ssl
except ImportError:
    ssl = None


class SOCKSConnection(HTTPConnection):
    """
    A plain-text HTTP connection that connects via a SOCKS proxy.
    """
    def __init__(self, *args, **kwargs):
        self._socks_options = kwargs.pop('_socks_options')
        super(SOCKSConnection, self).__init__(*args, **kwargs)

    def _new_conn(self):
        """
        Establish a new connection via the SOCKS proxy.
        """
        extra_kw = {}
        if self.source_address:
            extra_kw['source_address'] = self.source_address

        if self.socket_options:
            extra_kw['socket_options'] = self.socket_options

        try:
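            # socks.create_connection mirrors socket.create_connection but
            # tunnels the TCP stream through the configured proxy; the proxy
            # settings were injected by SOCKSProxyManager via
            # connection_pool_kw and popped off in __init__ as
            # self._socks_options.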
            conn = socks.create_connection(
                (self.host, self.port),
                proxy_type=self._socks_options['socks_version'],
                proxy_addr=self._socks_options['proxy_host'],
                proxy_port=self._socks_options['proxy_port'],
                proxy_username=self._socks_options['username'],
                proxy_password=self._socks_options['password'],
                proxy_rdns=self._socks_options['rdns'],
                timeout=self.timeout,
                **extra_kw
            )

        except SocketTimeout as e:
            raise ConnectTimeoutError(
                self, "Connection to %s timed out. (connect timeout=%s)" %
                (self.host, self.timeout))

        except socks.ProxyError as e:
            # This is fragile as hell, but it seems to be the only way to raise
            # useful errors here.
            if e.socket_err:
                error = e.socket_err
                if isinstance(error, SocketTimeout):
                    raise ConnectTimeoutError(
                        self,
                        "Connection to %s timed out. (connect timeout=%s)" %
                        (self.host, self.timeout)
                    )
                else:
                    raise NewConnectionError(
                        self,
                        "Failed to establish a new connection: %s" % error
                    )
            else:
                raise NewConnectionError(
                    self,
                    "Failed to establish a new connection: %s" % e
                )

        except SocketError as e:  # Defensive: PySocks should catch all these.
            raise NewConnectionError(
                self, "Failed to establish a new connection: %s" % e)

        return conn


# We don't need to duplicate the Verified/Unverified distinction from
# urllib3/connection.py here because the HTTPSConnection will already have been
# correctly set to either the Verified or Unverified form by that module. This
# means the SOCKSHTTPSConnection will automatically be the correct type.
class SOCKSHTTPSConnection(SOCKSConnection, HTTPSConnection):
    pass


class SOCKSHTTPConnectionPool(HTTPConnectionPool):
    ConnectionCls = SOCKSConnection


class SOCKSHTTPSConnectionPool(HTTPSConnectionPool):
    ConnectionCls = SOCKSHTTPSConnection


class SOCKSProxyManager(PoolManager):
    """
    A version of the urllib3 ProxyManager that routes connections via the
    defined SOCKS proxy.
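
    Usage sketch (illustrative only; the proxy URLs are placeholders)::

        proxy = SOCKSProxyManager('socks5://localhost:8888/')   # client-side DNS
        proxy = SOCKSProxyManager('socks5h://localhost:8888/')  # proxy-side DNS
        r = proxy.request('GET', 'http://example.org/')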
    """
    pool_classes_by_scheme = {
        'http': SOCKSHTTPConnectionPool,
        'https': SOCKSHTTPSConnectionPool,
    }

    def __init__(self, proxy_url, username=None, password=None,
                 num_pools=10, headers=None, **connection_pool_kw):
        parsed = parse_url(proxy_url)

        if parsed.scheme == 'socks5':
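        # Scheme convention (borrowed from curl/PySocks): a trailing 'h' or 'a'
        # (socks5h, socks4a) asks the proxy to resolve hostnames remotely
        # (rdns=True); plain socks5/socks4 resolve DNS on the client first.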
            socks_version = socks.PROXY_TYPE_SOCKS5
            rdns = False
        elif parsed.scheme == 'socks5h':
            socks_version = socks.PROXY_TYPE_SOCKS5
            rdns = True
        elif parsed.scheme == 'socks4':
            socks_version = socks.PROXY_TYPE_SOCKS4
            rdns = False
        elif parsed.scheme == 'socks4a':
            socks_version = socks.PROXY_TYPE_SOCKS4
            rdns = True
        else:
            raise ValueError(
                "Unable to determine SOCKS version from %s" % proxy_url
            )

        self.proxy_url = proxy_url

        socks_options = {
            'socks_version': socks_version,
            'proxy_host': parsed.host,
            'proxy_port': parsed.port,
            'username': username,
            'password': password,
            'rdns': rdns
        }
        connection_pool_kw['_socks_options'] = socks_options

        super(SOCKSProxyManager, self).__init__(
            num_pools, headers, **connection_pool_kw
        )

        self.pool_classes_by_scheme = SOCKSProxyManager.pool_classes_by_scheme
site-packages/pip/_vendor/urllib3/contrib/_securetransport/__pycache__/low_level.cpython-36.pyc000064400000016271147511334600026767 0ustar003

���e/�@s�dZddlZddlZddlZddlZddlZddlZddlZddlm	Z	m
Z
mZejdej
�Zdd�Zdd	�Zd
d�Zddd
�Zdd�Zdd�Zdd�Zdd�Zdd�Zdd�ZdS)a�
Low-level helpers for the SecureTransport bindings.

These are Python functions that are not directly related to the high-level APIs
but are necessary to get them to work. They include a whole bunch of low-level
CoreFoundation messing about and memory management. The concerns in this module
are almost entirely about trying to avoid memory leaks and providing
appropriate and useful assistance to the higher-level code.
�N�)�Security�CoreFoundation�CFConsts;-----BEGIN CERTIFICATE-----
(.*?)
-----END CERTIFICATE-----cCstjtj|t|��S)zv
    Given a bytestring, create a CFData object from it. This CFData object must
    be CFReleased by the caller.
    )r�CFDataCreate�kCFAllocatorDefault�len)Z
bytestring�r	�/usr/lib/python3.6/low_level.py�_cf_data_from_bytessrcCsZt|�}dd�|D�}dd�|D�}tj||�}tj||�}tjtj|||tjtj�S)zK
    Given a list of Python tuples, create an associated CFDictionary.
    css|]}|dVqdS)rNr	)�.0�tr	r	r
�	<genexpr>,sz-_cf_dictionary_from_tuples.<locals>.<genexpr>css|]}|dVqdS)rNr	)rr
r	r	r
r-s)rr�	CFTypeRefZCFDictionaryCreaterZkCFTypeDictionaryKeyCallBacksZkCFTypeDictionaryValueCallBacks)ZtuplesZdictionary_size�keys�valuesZcf_keysZ	cf_valuesr	r	r
�_cf_dictionary_from_tuples%srcCsntj|tjtj��}tj|tj�}|dkrXtjd�}tj	||dtj�}|sRt
d��|j}|dk	rj|jd�}|S)z�
    Creates a Unicode string from a CFString object. Used entirely for error
    reporting.

    Yes, it annoys me quite a lot that this function is this complex.
    Niz'Error copying C string from CFStringRefzutf-8)
�ctypes�castZPOINTERZc_void_prZCFStringGetCStringPtrrZkCFStringEncodingUTF8Zcreate_string_bufferZCFStringGetCString�OSError�value�decode)rZvalue_as_void_p�string�buffer�resultr	r	r
�_cf_string_to_unicode;s"

rcCs\|dkrdStj|d�}t|�}tj|�|dks:|dkrBd|}|dkrPtj}||��dS)z[
    Checks the return code and throws an exception if there is an error to
    report
    rN�zOSStatus %s)rZSecCopyErrorMessageStringrr�	CFRelease�ssl�SSLError)�errorZexception_classZcf_error_string�outputr	r	r
�_assert_no_errorXs
r"cCs�dd�tj|�D�}|s"tjd��tjtjdtjtj	��}|sHtjd��ydx^|D]V}t
|�}|sjtjd��tjtj|�}tj
|�|s�tjd��tj||�tj
|�qPWWntk
r�tj
|�YnX|S)z�
    Given a bundle of certs in PEM format, turns them into a CFArray of certs
    that can be used to validate a cert chain.
    cSsg|]}tj|jd���qS)r)�base64Z	b64decode�group)r�matchr	r	r
�
<listcomp>ssz(_cert_array_from_pem.<locals>.<listcomp>zNo root certificates specifiedrzUnable to allocate memory!zUnable to build cert object!)�
_PEM_CERTS_RE�finditerrrr�CFArrayCreateMutablerr�byref�kCFTypeArrayCallBacksrrZSecCertificateCreateWithDatar�CFArrayAppendValue�	Exception)Z
pem_bundleZ	der_certsZ
cert_arrayZ	der_bytesZcertdataZcertr	r	r
�_cert_array_from_pemms2






r.cCstj�}tj|�|kS)z=
    Returns True if a given CFTypeRef is a certificate.
    )rZSecCertificateGetTypeIDr�CFGetTypeID)�item�expectedr	r	r
�_is_cert�sr2cCstj�}tj|�|kS)z;
    Returns True if a given CFTypeRef is an identity.
    )rZSecIdentityGetTypeIDrr/)r0r1r	r	r
�_is_identity�sr3cCs�tjd�}tj|dd��jd�}tj|dd��}tj�}tjj||�j	d�}t
j�}t
j|t
|�|ddtj|��}t|�||fS)a�
    This function creates a temporary Mac keychain that we can use to work with
    credentials. This keychain uses a one-time password and a temporary file to
    store the data. We expect to have one keychain per socket. The returned
    SecKeychainRef must be freed by the caller, including calling
    SecKeychainDelete.

    Returns a tuple of the SecKeychainRef and the path to the temporary
    directory that contains it.
    �(N�zutf-8F)�os�urandomr#Z	b64encoder�tempfileZmkdtemp�path�join�encoderZSecKeychainRefZSecKeychainCreaterrr*r")Zrandom_bytes�filenameZpasswordZ
tempdirectoryZ
keychain_path�keychain�statusr	r	r
�_temporary_keychain�s
r?cCsg}g}d}t|d��}|j�}WdQRXz�tjtj|t|��}tj�}tj|ddddd|t	j
|��}t|�tj|�}	xdt
|	�D]X}
tj||
�}t	j|tj�}t|�r�tj|�|j|�q�t|�r�tj|�|j|�q�WWd|r�tj|�tj|�X||fS)z�
    Given a single file, loads all the trust objects from it into arrays and
    the keychain.
    Returns a tuple of lists: the first list is a list of identities, the
    second a list of certs.
    N�rbr)�open�readrrrrZ
CFArrayRefrZ
SecItemImportrr*r"ZCFArrayGetCount�rangeZCFArrayGetValueAtIndexrrr2ZCFRetain�appendr3r)r=r9�certificates�
identitiesZresult_array�fZraw_filedataZfiledatarZresult_count�indexr0r	r	r
�_load_items_from_file�sH




rIcGs�g}g}dd�|D�}z�x.|D]&}t||�\}}|j|�|j|�qW|s�tj�}tj||dtj|��}t|�|j|�t	j
|jd��t	jt	j
dtjt	j��}	x tj||�D]}
t	j|	|
�q�W|	Sxtj||�D]}t	j
|�q�WXdS)z�
    Load certificates and maybe keys from a number of files. Has the end goal
    of returning a CFArray containing one SecIdentityRef, and then zero or more
    SecCertificateRef objects, suitable for use as a client certificate trust
    chain.
    css|]}|r|VqdS)Nr	)rr9r	r	r
r/sz*_load_client_cert_chain.<locals>.<genexpr>rN)rI�extendrZSecIdentityRefZ SecIdentityCreateWithCertificaterr*r"rDrr�popr)rr+�	itertools�chainr,)r=�pathsrErFZ	file_pathZnew_identitiesZ	new_certsZnew_identityr>Ztrust_chainr0�objr	r	r
�_load_client_cert_chains6 


rP)N)�__doc__r#rrL�rer6rr8Zbindingsrrr�compile�DOTALLr'rrrr"r.r2r3r?rIrPr	r	r	r
�<module>	s(


+(;site-packages/pip/_vendor/urllib3/contrib/_securetransport/__pycache__/__init__.cpython-36.pyc000064400000000161147511334600026525 0ustar003

���e�@sdS)N�rrr�/usr/lib/python3.6/__init__.py�<module>ssite-packages/pip/_vendor/urllib3/contrib/_securetransport/__pycache__/__init__.cpython-36.opt-1.pyc000064400000000161147511334600027464 0ustar003

���e�@sdS)N�rrr�/usr/lib/python3.6/__init__.py�<module>spip/_vendor/urllib3/contrib/_securetransport/__pycache__/low_level.cpython-36.opt-1.pyc000064400000016271147511334600027647 0ustar00site-packages3

���e/�@s�dZddlZddlZddlZddlZddlZddlZddlZddlm	Z	m
Z
mZejdej
�Zdd�Zdd	�Zd
d�Zddd
�Zdd�Zdd�Zdd�Zdd�Zdd�Zdd�ZdS)a�
Low-level helpers for the SecureTransport bindings.

These are Python functions that are not directly related to the high-level APIs
but are necessary to get them to work. They include a whole bunch of low-level
CoreFoundation messing about and memory management. The concerns in this module
are almost entirely about trying to avoid memory leaks and providing
appropriate and useful assistance to the higher-level code.
�N�)�Security�CoreFoundation�CFConsts;-----BEGIN CERTIFICATE-----
(.*?)
-----END CERTIFICATE-----cCstjtj|t|��S)zv
    Given a bytestring, create a CFData object from it. This CFData object must
    be CFReleased by the caller.
    )r�CFDataCreate�kCFAllocatorDefault�len)Z
bytestring�r	�/usr/lib/python3.6/low_level.py�_cf_data_from_bytessrcCsZt|�}dd�|D�}dd�|D�}tj||�}tj||�}tjtj|||tjtj�S)zK
    Given a list of Python tuples, create an associated CFDictionary.
    css|]}|dVqdS)rNr	)�.0�tr	r	r
�	<genexpr>,sz-_cf_dictionary_from_tuples.<locals>.<genexpr>css|]}|dVqdS)rNr	)rr
r	r	r
r-s)rr�	CFTypeRefZCFDictionaryCreaterZkCFTypeDictionaryKeyCallBacksZkCFTypeDictionaryValueCallBacks)ZtuplesZdictionary_size�keys�valuesZcf_keysZ	cf_valuesr	r	r
�_cf_dictionary_from_tuples%srcCsntj|tjtj��}tj|tj�}|dkrXtjd�}tj	||dtj�}|sRt
d��|j}|dk	rj|jd�}|S)z�
    Creates a Unicode string from a CFString object. Used entirely for error
    reporting.

    Yes, it annoys me quite a lot that this function is this complex.
    Niz'Error copying C string from CFStringRefzutf-8)
�ctypes�castZPOINTERZc_void_prZCFStringGetCStringPtrrZkCFStringEncodingUTF8Zcreate_string_bufferZCFStringGetCString�OSError�value�decode)rZvalue_as_void_p�string�buffer�resultr	r	r
�_cf_string_to_unicode;s"

rcCs\|dkrdStj|d�}t|�}tj|�|dks:|dkrBd|}|dkrPtj}||��dS)z[
    Checks the return code and throws an exception if there is an error to
    report
    rN�zOSStatus %s)rZSecCopyErrorMessageStringrr�	CFRelease�ssl�SSLError)�errorZexception_classZcf_error_string�outputr	r	r
�_assert_no_errorXs
r"cCs�dd�tj|�D�}|s"tjd��tjtjdtjtj	��}|sHtjd��ydx^|D]V}t
|�}|sjtjd��tjtj|�}tj
|�|s�tjd��tj||�tj
|�qPWWntk
r�tj
|�YnX|S)z�
    Given a bundle of certs in PEM format, turns them into a CFArray of certs
    that can be used to validate a cert chain.
    cSsg|]}tj|jd���qS)r)�base64Z	b64decode�group)r�matchr	r	r
�
<listcomp>ssz(_cert_array_from_pem.<locals>.<listcomp>zNo root certificates specifiedrzUnable to allocate memory!zUnable to build cert object!)�
_PEM_CERTS_RE�finditerrrr�CFArrayCreateMutablerr�byref�kCFTypeArrayCallBacksrrZSecCertificateCreateWithDatar�CFArrayAppendValue�	Exception)Z
pem_bundleZ	der_certsZ
cert_arrayZ	der_bytesZcertdataZcertr	r	r
�_cert_array_from_pemms2






r.cCstj�}tj|�|kS)z=
    Returns True if a given CFTypeRef is a certificate.
    )rZSecCertificateGetTypeIDr�CFGetTypeID)�item�expectedr	r	r
�_is_cert�sr2cCstj�}tj|�|kS)z;
    Returns True if a given CFTypeRef is an identity.
    )rZSecIdentityGetTypeIDrr/)r0r1r	r	r
�_is_identity�sr3cCs�tjd�}tj|dd��jd�}tj|dd��}tj�}tjj||�j	d�}t
j�}t
j|t
|�|ddtj|��}t|�||fS)a�
    This function creates a temporary Mac keychain that we can use to work with
    credentials. This keychain uses a one-time password and a temporary file to
    store the data. We expect to have one keychain per socket. The returned
    SecKeychainRef must be freed by the caller, including calling
    SecKeychainDelete.

    Returns a tuple of the SecKeychainRef and the path to the temporary
    directory that contains it.
    �(N�zutf-8F)�os�urandomr#Z	b64encoder�tempfileZmkdtemp�path�join�encoderZSecKeychainRefZSecKeychainCreaterrr*r")Zrandom_bytes�filenameZpasswordZ
tempdirectoryZ
keychain_path�keychain�statusr	r	r
�_temporary_keychain�s
r?cCsg}g}d}t|d��}|j�}WdQRXz�tjtj|t|��}tj�}tj|ddddd|t	j
|��}t|�tj|�}	xdt
|	�D]X}
tj||
�}t	j|tj�}t|�r�tj|�|j|�q�t|�r�tj|�|j|�q�WWd|r�tj|�tj|�X||fS)z�
    Given a single file, loads all the trust objects from it into arrays and
    the keychain.
    Returns a tuple of lists: the first list is a list of identities, the
    second a list of certs.
    N�rbr)�open�readrrrrZ
CFArrayRefrZ
SecItemImportrr*r"ZCFArrayGetCount�rangeZCFArrayGetValueAtIndexrrr2ZCFRetain�appendr3r)r=r9�certificates�
identitiesZresult_array�fZraw_filedataZfiledatarZresult_count�indexr0r	r	r
�_load_items_from_file�sH




rIcGs�g}g}dd�|D�}z�x.|D]&}t||�\}}|j|�|j|�qW|s�tj�}tj||dtj|��}t|�|j|�t	j
|jd��t	jt	j
dtjt	j��}	x tj||�D]}
t	j|	|
�q�W|	Sxtj||�D]}t	j
|�q�WXdS)z�
    Load certificates and maybe keys from a number of files. Has the end goal
    of returning a CFArray containing one SecIdentityRef, and then zero or more
    SecCertificateRef objects, suitable for use as a client certificate trust
    chain.
    css|]}|r|VqdS)Nr	)rr9r	r	r
r/sz*_load_client_cert_chain.<locals>.<genexpr>rN)rI�extendrZSecIdentityRefZ SecIdentityCreateWithCertificaterr*r"rDrr�popr)rr+�	itertools�chainr,)r=�pathsrErFZ	file_pathZnew_identitiesZ	new_certsZnew_identityr>Ztrust_chainr0�objr	r	r
�_load_client_cert_chains6 


rP)N)�__doc__r#rrL�rer6rr8Zbindingsrrr�compile�DOTALLr'rrrr"r.r2r3r?rIrPr	r	r	r
�<module>	s(


+(;site-packages/pip/_vendor/urllib3/contrib/_securetransport/__pycache__/bindings.cpython-36.opt-1.pyc000064400000024130147511334600027524 0ustar003

���e�D�@s�dZddlmZddlZddlmZddlmZmZm	Z	m
Z
mZmZm
Z
mZmZddlmZmZmZed�Zesxed��ed	�Zes�ed
��ej�dZeeeejd���Zedkr�edededf��eedd�Zeedd�ZeZ eZ!eZ"eZ#eZ$eZ%eZ&eZ'eZ(eZ)e
Z*ee)�Z+eZ,eZ-ee#�Z.ee$�Z/ee%�Z0ee&�Z1ee'�Z2eZ3eZ4eZ5ee�Z6eZ7eZ8ee�Z9eZ:eZ;ee�Z<eZ=eZ>ee�Z?ee�Z@eZAeZBeZCeZDeZEeZF�y�e.e/ee7�ee8�e:ee;�e<ee0�gejG_He-ejG_IgejJ_He*ejJ_IgejK_He*ejK_IgejL_He*ejL_Ie,e.gejM_He6ejM_Ie6gejN_He.ejN_Ie-egejO_He/ejO_Ie+e6ee9�gejP_He-ejP_Ie	eee eee<�gejQ_He-ejQ_Ie<gejR_He-ejR_Ie.e2ee0�gejS_He-ejS_Iee-eAeee
��ZTee-eAee�ee
��ZUe?eTeUgejV_He-ejV_Ie?e	e
gejW_He-ejW_Ie?e0gejX_He-ejX_Ie?e+e gejY_He-ejY_Ie?eAgejZ_He-ejZ_Ie?e	e
gej[_He-ej[_Ie?gej\_He-ej\_Ie?e	e
ee
�gej]_He-ej]_Ie?e	e
ee
�gej^_He-ej^_Ie?gej__He-ej__Ie?ee
�gej`_He-ej`_Ie?ee>�ee
�geja_He-eja_Ie?ee>�e
gejb_He-ejb_Ie?ee
�gejc_de-ejc_Ie?ee>�ee
�geje_He-eje_Ie?ee>�gejf_He-ejf_Ie?ee=�gejg_He-ejg_Ie?ee@�gejh_He-ejh_Ie@e0geji_He-eji_Ie@e gejj_ke-ejj_Ie@eeB�gejl_He-ejl_Ie@gejm_He!ejm_Ie@e!gejn_He6ejn_Ie,eDeEgejo_He?ejo_Ie?eFe gejp_He-ejp_Ie?e=gejq_He-ejq_Ie?e=gejr_He-ejr_Ie-egejO_He/ejO_IeTe_TeUe_Ue?e_?e=e_=e>e_>e9e_9e<e_<e@e_@eBe_Be7e_7e-e_-e/jsed�e_te/jsed�e_ue+gejv_He+ejv_Ie+gejw_Hdejw_Ie+gejx_He*ejx_Ie,e	e"gejy_He/ejy_Ie/e"gejz_He	ejz_Ie/e	e!e"gej{_Heej{_Ie,e	e!gej|_He.ej|_Ie.gej}_He!ej}_Ie.gej~_Heej~_Ie,ee+�ee+�e!e4e5gej_He2ej_Ie2e+gej�_He+ej�_Ie,ee+�e!e3gej�_He0ej�_Ie,e!e3gej�_He1ej�_Ie1egej�_Hdej�_Ie0gej�_He!ej�_Ie0e!gej�_Heej�_Ie,jsed�e_�ejsed�e_�ejsed�e_�ejsed�e_�e+e_+e0e_0e/e_/e2e_2Wne�k
�rted��YnXGdd�de��Z�Gdd�de��Z�dS)ay
This module uses ctypes to bind a whole bunch of functions and constants from
SecureTransport. The goal here is to provide the low-level API to
SecureTransport. These are essentially the C-level functions and constants, and
they're pretty gross to work with.

This code is a bastardised version of the code found in Will Bond's oscrypto
library. An enormous debt is owed to him for blazing this trail for us. For
that reason, this code should be considered to be covered both by urllib3's
license and by oscrypto's:

    Copyright (c) 2015-2016 Will Bond <will@wbond.net>

    Permission is hereby granted, free of charge, to any person obtaining a
    copy of this software and associated documentation files (the "Software"),
    to deal in the Software without restriction, including without limitation
    the rights to use, copy, modify, merge, publish, distribute, sublicense,
    and/or sell copies of the Software, and to permit persons to whom the
    Software is furnished to do so, subject to the following conditions:

    The above copyright notice and this permission notice shall be included in
    all copies or substantial portions of the Software.

    THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
    IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
    FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
    AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
    LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
    FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER
    DEALINGS IN THE SOFTWARE.
�)�absolute_importN)�find_library)	�c_void_p�c_int32�c_char_p�c_size_t�c_byte�c_uint32�c_ulong�c_long�c_bool)�CDLL�POINTER�	CFUNCTYPE�Securityz'The library Security could not be found�CoreFoundationz-The library CoreFoundation could not be found�.�
�z1Only OS X 10.8 and newer are supported, not %s.%s�T)Z	use_errno�kSecImportExportPassphrase�kSecImportItemIdentity�kCFAllocatorDefault�kCFTypeArrayCallBacks�kCFTypeDictionaryKeyCallBacks�kCFTypeDictionaryValueCallBackszError initializing ctypesc@seZdZdZed�ZdS)�CFConstz_
    A class object that acts as essentially a namespace for CoreFoundation
    constants.
    iN)�__name__�
__module__�__qualname__�__doc__�CFStringEncodingZkCFStringEncodingUTF8�r"r"�/usr/lib/python3.6/bindings.pyr�src@s,eZdZdZdZdZdZdZdZdZ	dZ
dZdZdZ
dZd	ZdZd
ZdZdZdDZdEZdFZdGZdHZdIZdJZdKZdLZdMZdNZdOZdPZ dQZ!dRZ"dSZ#dTZ$dUZ%dVZ&dWZ'dXZ(dYZ)d"Z*d#Z+d$Z,d%Z-d&Z.d'Z/d(Z0d)Z1d*Z2d+Z3d,Z4d-Z5d.Z6d/Z7d0Z8d1Z9d2Z:d3Z;d4Z<d5Z=d6Z>d7Z?d8Z@d9ZAd:ZBd;ZCd<ZDd=ZEd>ZFd?ZGd@ZHdAZIdBZJdCS)Z�
SecurityConstzU
    A class object that acts as essentially a namespace for Security constants.
    rr���rr���iH&iK&iM&iX&iN&iO&iQ&iR&iV&iW&iT&iU&is&i`&io&iz&iq&iw&i�i�bi�bi�bi,�i0�i+�i/�����i$�i(�i
�i��k�j�9�8i#�i'�i	�i��g�@�3�2���=�<�5�/iiiNi���i���i���i���i���i���i���i���i���i���i���i���i���i���i���i���i���i���i ���iQ���i,���iR���)Krrrr Z"kSSLSessionOptionBreakOnServerAuthZ
kSSLProtocol2Z
kSSLProtocol3Z
site-packages/pip/_vendor/urllib3/contrib/_securetransport/__pycache__/bindings.cpython-36.pyc000064400000024130147511334600026565 0ustar003

[binary compiled bytecode (bindings.cpython-36.pyc) omitted; its readable strings duplicate the docstring and constants of bindings.py below]
site-packages/pip/_vendor/urllib3/contrib/_securetransport/low_level.py000064400000027436147511334600022510 0ustar00"""
Low-level helpers for the SecureTransport bindings.

These are Python functions that are not directly related to the high-level APIs
but are necessary to get them to work. They include a whole bunch of low-level
CoreFoundation messing about and memory management. The concerns in this module
are almost entirely about trying to avoid memory leaks and providing
appropriate and useful assistance to the higher-level code.
"""
import base64
import ctypes
import itertools
import re
import os
import ssl
import tempfile

from .bindings import Security, CoreFoundation, CFConst


# This regular expression is used to grab PEM data out of a PEM bundle.
_PEM_CERTS_RE = re.compile(
    b"-----BEGIN CERTIFICATE-----\n(.*?)\n-----END CERTIFICATE-----", re.DOTALL
)


def _cf_data_from_bytes(bytestring):
    """
    Given a bytestring, create a CFData object from it. This CFData object must
    be CFReleased by the caller.
    """
    return CoreFoundation.CFDataCreate(
        CoreFoundation.kCFAllocatorDefault, bytestring, len(bytestring)
    )


def _cf_dictionary_from_tuples(tuples):
    """
    Given a list of Python tuples, create an associated CFDictionary.
    """
    dictionary_size = len(tuples)

    # We need to get the dictionary keys and values out in the same order.
    keys = (t[0] for t in tuples)
    values = (t[1] for t in tuples)
    cf_keys = (CoreFoundation.CFTypeRef * dictionary_size)(*keys)
    cf_values = (CoreFoundation.CFTypeRef * dictionary_size)(*values)

    return CoreFoundation.CFDictionaryCreate(
        CoreFoundation.kCFAllocatorDefault,
        cf_keys,
        cf_values,
        dictionary_size,
        CoreFoundation.kCFTypeDictionaryKeyCallBacks,
        CoreFoundation.kCFTypeDictionaryValueCallBacks,
    )
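

# Illustrative sketch, not part of the original module: the kind of dictionary
# one might build with the helper above, e.g. import options keyed by
# Security.kSecImportExportPassphrase. ``cf_password`` is a hypothetical
# CFStringRef created elsewhere (for instance with
# CoreFoundation.CFStringCreateWithCString); the caller must CFRelease the
# returned dictionary when done.
def _example_passphrase_options(cf_password):
    return _cf_dictionary_from_tuples(
        [(Security.kSecImportExportPassphrase, cf_password)]
    )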


def _cf_string_to_unicode(value):
    """
    Creates a Unicode string from a CFString object. Used entirely for error
    reporting.

    Yes, it annoys me quite a lot that this function is this complex.
    """
    value_as_void_p = ctypes.cast(value, ctypes.POINTER(ctypes.c_void_p))

    string = CoreFoundation.CFStringGetCStringPtr(
        value_as_void_p,
        CFConst.kCFStringEncodingUTF8
    )
    if string is None:
        buffer = ctypes.create_string_buffer(1024)
        result = CoreFoundation.CFStringGetCString(
            value_as_void_p,
            buffer,
            1024,
            CFConst.kCFStringEncodingUTF8
        )
        if not result:
            raise OSError('Error copying C string from CFStringRef')
        string = buffer.value
    if string is not None:
        string = string.decode('utf-8')
    return string


def _assert_no_error(error, exception_class=None):
    """
    Checks the return code and throws an exception if there is an error to
    report
    """
    if error == 0:
        return

    cf_error_string = Security.SecCopyErrorMessageString(error, None)
    output = _cf_string_to_unicode(cf_error_string)
    CoreFoundation.CFRelease(cf_error_string)

    if output is None or output == u'':
        output = u'OSStatus %s' % error

    if exception_class is None:
        exception_class = ssl.SSLError

    raise exception_class(output)


def _cert_array_from_pem(pem_bundle):
    """
    Given a bundle of certs in PEM format, turns them into a CFArray of certs
    that can be used to validate a cert chain.
    """
    der_certs = [
        base64.b64decode(match.group(1))
        for match in _PEM_CERTS_RE.finditer(pem_bundle)
    ]
    if not der_certs:
        raise ssl.SSLError("No root certificates specified")

    cert_array = CoreFoundation.CFArrayCreateMutable(
        CoreFoundation.kCFAllocatorDefault,
        0,
        ctypes.byref(CoreFoundation.kCFTypeArrayCallBacks)
    )
    if not cert_array:
        raise ssl.SSLError("Unable to allocate memory!")

    try:
        for der_bytes in der_certs:
            certdata = _cf_data_from_bytes(der_bytes)
            if not certdata:
                raise ssl.SSLError("Unable to allocate memory!")
            cert = Security.SecCertificateCreateWithData(
                CoreFoundation.kCFAllocatorDefault, certdata
            )
            CoreFoundation.CFRelease(certdata)
            if not cert:
                raise ssl.SSLError("Unable to build cert object!")

            CoreFoundation.CFArrayAppendValue(cert_array, cert)
            CoreFoundation.CFRelease(cert)
    except Exception:
        # We need to free the array before the exception bubbles further.
        # We only want to do that if an error occurs: otherwise, the caller
        # should free.
        CoreFoundation.CFRelease(cert_array)
        raise

    return cert_array
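

# Illustrative sketch, not part of the original module: how the helper above
# is typically combined with the SecTrust APIs to pin trust evaluation to a
# custom CA bundle. ``trust`` is assumed to be a SecTrustRef obtained
# elsewhere (e.g. via Security.SSLCopyPeerTrust).
def _example_pin_trust_anchors(trust, pem_bundle):
    cert_array = _cert_array_from_pem(pem_bundle)
    try:
        _assert_no_error(
            Security.SecTrustSetAnchorCertificates(trust, cert_array)
        )
        _assert_no_error(
            Security.SecTrustSetAnchorCertificatesOnly(trust, True)
        )
    finally:
        # SecTrustSetAnchorCertificates retains the array, so this reference
        # can be released whether or not the calls above succeeded.
        CoreFoundation.CFRelease(cert_array)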


def _is_cert(item):
    """
    Returns True if a given CFTypeRef is a certificate.
    """
    expected = Security.SecCertificateGetTypeID()
    return CoreFoundation.CFGetTypeID(item) == expected


def _is_identity(item):
    """
    Returns True if a given CFTypeRef is an identity.
    """
    expected = Security.SecIdentityGetTypeID()
    return CoreFoundation.CFGetTypeID(item) == expected


def _temporary_keychain():
    """
    This function creates a temporary Mac keychain that we can use to work with
    credentials. This keychain uses a one-time password and a temporary file to
    store the data. We expect to have one keychain per socket. The returned
    SecKeychainRef must be freed by the caller, including calling
    SecKeychainDelete.

    Returns a tuple of the SecKeychainRef and the path to the temporary
    directory that contains it.
    """
    # Unfortunately, SecKeychainCreate requires a path to a keychain. This
    # means we cannot use mkstemp to use a generic temporary file. Instead,
    # we're going to create a temporary directory and a filename to use there.
    # This filename will be 8 random bytes expanded into base64. We also need
    # some random bytes to password-protect the keychain we're creating, so we
    # ask for 40 random bytes.
    random_bytes = os.urandom(40)
    filename = base64.b64encode(random_bytes[:8]).decode('utf-8')
    password = base64.b64encode(random_bytes[8:])  # Must be valid UTF-8
    tempdirectory = tempfile.mkdtemp()

    keychain_path = os.path.join(tempdirectory, filename).encode('utf-8')

    # We now want to create the keychain itself.
    keychain = Security.SecKeychainRef()
    status = Security.SecKeychainCreate(
        keychain_path,
        len(password),
        password,
        False,
        None,
        ctypes.byref(keychain)
    )
    _assert_no_error(status)

    # Having created the keychain, we want to pass it off to the caller.
    return keychain, tempdirectory
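

# Illustrative sketch, not part of the original module: the create/tear-down
# cycle a caller of the helper above is responsible for.
def _example_keychain_lifecycle():
    import shutil

    keychain, tempdir = _temporary_keychain()
    try:
        pass  # e.g. hand ``keychain`` to _load_client_cert_chain() below
    finally:
        Security.SecKeychainDelete(keychain)
        CoreFoundation.CFRelease(keychain)
        shutil.rmtree(tempdir, ignore_errors=True)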


def _load_items_from_file(keychain, path):
    """
    Given a single file, loads all the trust objects from it into arrays and
    the keychain.
    Returns a tuple of lists: the first list is a list of identities, the
    second a list of certs.
    """
    certificates = []
    identities = []
    result_array = None

    with open(path, 'rb') as f:
        raw_filedata = f.read()

    try:
        filedata = CoreFoundation.CFDataCreate(
            CoreFoundation.kCFAllocatorDefault,
            raw_filedata,
            len(raw_filedata)
        )
        result_array = CoreFoundation.CFArrayRef()
        result = Security.SecItemImport(
            filedata,  # cert data
            None,  # Filename, leaving it out for now
            None,  # What the type of the file is, we don't care
            None,  # what's in the file, we don't care
            0,  # import flags
            None,  # key params, can include passphrase in the future
            keychain,  # The keychain to insert into
            ctypes.byref(result_array)  # Results
        )
        _assert_no_error(result)

        # A CFArray is not very useful to us as an intermediary
        # representation, so we are going to extract the objects we want
        # and then free the array. We don't need to keep hold of keys: the
        # keychain already has them!
        result_count = CoreFoundation.CFArrayGetCount(result_array)
        for index in range(result_count):
            item = CoreFoundation.CFArrayGetValueAtIndex(
                result_array, index
            )
            item = ctypes.cast(item, CoreFoundation.CFTypeRef)

            if _is_cert(item):
                CoreFoundation.CFRetain(item)
                certificates.append(item)
            elif _is_identity(item):
                CoreFoundation.CFRetain(item)
                identities.append(item)
    finally:
        if result_array:
            CoreFoundation.CFRelease(result_array)

        CoreFoundation.CFRelease(filedata)

    return (identities, certificates)


def _load_client_cert_chain(keychain, *paths):
    """
    Load certificates and maybe keys from a number of files. Has the end goal
    of returning a CFArray containing one SecIdentityRef, and then zero or more
    SecCertificateRef objects, suitable for use as a client certificate trust
    chain.
    """
    # Ok, the strategy.
    #
    # This relies on knowing that macOS will not give you a SecIdentityRef
    # unless you have imported a key into a keychain. This is a somewhat
    # artificial limitation of macOS (for example, it doesn't necessarily
    # affect iOS), but there is nothing inside Security.framework that lets you
    # get a SecIdentityRef without having a key in a keychain.
    #
    # So the policy here is we take all the files and iterate them in order.
    # Each one will use SecItemImport to have one or more objects loaded from
    # it. We will also point at a keychain that macOS can use to work with the
    # private key.
    #
    # Once we have all the objects, we'll check what we actually have. If we
    # already have a SecIdentityRef in hand, fab: we'll use that. Otherwise,
    # we'll take the first certificate (which we assume to be our leaf) and
    # ask the keychain to give us a SecIdentityRef with that cert's associated
    # key.
    #
    # We'll then return a CFArray containing the trust chain: one
    # SecIdentityRef and then zero-or-more SecCertificateRef objects. The
    # responsibility for freeing this CFArray will be with the caller. This
    # CFArray must remain alive for the entire connection, so in practice it
    # will be stored with a single SSLSocket, along with the reference to the
    # keychain.
    certificates = []
    identities = []

    # Filter out bad paths.
    paths = (path for path in paths if path)

    try:
        for file_path in paths:
            new_identities, new_certs = _load_items_from_file(
                keychain, file_path
            )
            identities.extend(new_identities)
            certificates.extend(new_certs)

        # Ok, we have everything. The question is: do we have an identity? If
        # not, we want to grab one from the first cert we have.
        if not identities:
            new_identity = Security.SecIdentityRef()
            status = Security.SecIdentityCreateWithCertificate(
                keychain,
                certificates[0],
                ctypes.byref(new_identity)
            )
            _assert_no_error(status)
            identities.append(new_identity)

            # We now want to release the original certificate, as we no longer
            # need it.
            CoreFoundation.CFRelease(certificates.pop(0))

        # We now need to build a new CFArray that holds the trust chain.
        trust_chain = CoreFoundation.CFArrayCreateMutable(
            CoreFoundation.kCFAllocatorDefault,
            0,
            ctypes.byref(CoreFoundation.kCFTypeArrayCallBacks),
        )
        for item in itertools.chain(identities, certificates):
            # ArrayAppendValue does a CFRetain on the item. That's fine,
            # because the finally block will release our other refs to them.
            CoreFoundation.CFArrayAppendValue(trust_chain, item)

        return trust_chain
    finally:
        for obj in itertools.chain(identities, certificates):
            CoreFoundation.CFRelease(obj)
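

# Illustrative sketch, not part of the original module: how the helpers above
# fit together to attach a client certificate to a SecureTransport context.
# ``context`` is assumed to be an SSLContextRef created elsewhere with
# Security.SSLCreateContext; the path arguments are hypothetical.
def _example_configure_client_cert(context, cert_path, key_path=None):
    keychain, tempdir = _temporary_keychain()
    trust_chain = _load_client_cert_chain(keychain, cert_path, key_path)
    _assert_no_error(Security.SSLSetCertificate(context, trust_chain))
    # The keychain, its directory and the trust chain must stay alive for the
    # whole connection; the caller cleans them up afterwards (SecKeychainDelete,
    # CFRelease and shutil.rmtree, as in _example_keychain_lifecycle above).
    return keychain, tempdir, trust_chain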
site-packages/pip/_vendor/urllib3/contrib/_securetransport/bindings.py000064400000042230147511334600022302 0ustar00"""
This module uses ctypes to bind a whole bunch of functions and constants from
SecureTransport. The goal here is to provide the low-level API to
SecureTransport. These are essentially the C-level functions and constants, and
they're pretty gross to work with.

This code is a bastardised version of the code found in Will Bond's oscrypto
library. An enormous debt is owed to him for blazing this trail for us. For
that reason, this code should be considered to be covered both by urllib3's
license and by oscrypto's:

    Copyright (c) 2015-2016 Will Bond <will@wbond.net>

    Permission is hereby granted, free of charge, to any person obtaining a
    copy of this software and associated documentation files (the "Software"),
    to deal in the Software without restriction, including without limitation
    the rights to use, copy, modify, merge, publish, distribute, sublicense,
    and/or sell copies of the Software, and to permit persons to whom the
    Software is furnished to do so, subject to the following conditions:

    The above copyright notice and this permission notice shall be included in
    all copies or substantial portions of the Software.

    THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
    IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
    FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
    AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
    LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
    FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER
    DEALINGS IN THE SOFTWARE.
"""
from __future__ import absolute_import

import platform
from ctypes.util import find_library
from ctypes import (
    c_void_p, c_int32, c_char_p, c_size_t, c_byte, c_uint32, c_ulong, c_long,
    c_bool
)
from ctypes import CDLL, POINTER, CFUNCTYPE


security_path = find_library('Security')
if not security_path:
    raise ImportError('The library Security could not be found')


core_foundation_path = find_library('CoreFoundation')
if not core_foundation_path:
    raise ImportError('The library CoreFoundation could not be found')


version = platform.mac_ver()[0]
version_info = tuple(map(int, version.split('.')))
if version_info < (10, 8):
    raise OSError(
        'Only OS X 10.8 and newer are supported, not %s.%s' % (
            version_info[0], version_info[1]
        )
    )

Security = CDLL(security_path, use_errno=True)
CoreFoundation = CDLL(core_foundation_path, use_errno=True)

Boolean = c_bool
CFIndex = c_long
CFStringEncoding = c_uint32
CFData = c_void_p
CFString = c_void_p
CFArray = c_void_p
CFMutableArray = c_void_p
CFDictionary = c_void_p
CFError = c_void_p
CFType = c_void_p
CFTypeID = c_ulong

CFTypeRef = POINTER(CFType)
CFAllocatorRef = c_void_p

OSStatus = c_int32

CFDataRef = POINTER(CFData)
CFStringRef = POINTER(CFString)
CFArrayRef = POINTER(CFArray)
CFMutableArrayRef = POINTER(CFMutableArray)
CFDictionaryRef = POINTER(CFDictionary)
CFArrayCallBacks = c_void_p
CFDictionaryKeyCallBacks = c_void_p
CFDictionaryValueCallBacks = c_void_p

SecCertificateRef = POINTER(c_void_p)
SecExternalFormat = c_uint32
SecExternalItemType = c_uint32
SecIdentityRef = POINTER(c_void_p)
SecItemImportExportFlags = c_uint32
SecItemImportExportKeyParameters = c_void_p
SecKeychainRef = POINTER(c_void_p)
SSLProtocol = c_uint32
SSLCipherSuite = c_uint32
SSLContextRef = POINTER(c_void_p)
SecTrustRef = POINTER(c_void_p)
SSLConnectionRef = c_uint32
SecTrustResultType = c_uint32
SecTrustOptionFlags = c_uint32
SSLProtocolSide = c_uint32
SSLConnectionType = c_uint32
SSLSessionOption = c_uint32


try:
    Security.SecItemImport.argtypes = [
        CFDataRef,
        CFStringRef,
        POINTER(SecExternalFormat),
        POINTER(SecExternalItemType),
        SecItemImportExportFlags,
        POINTER(SecItemImportExportKeyParameters),
        SecKeychainRef,
        POINTER(CFArrayRef),
    ]
    Security.SecItemImport.restype = OSStatus

    Security.SecCertificateGetTypeID.argtypes = []
    Security.SecCertificateGetTypeID.restype = CFTypeID

    Security.SecIdentityGetTypeID.argtypes = []
    Security.SecIdentityGetTypeID.restype = CFTypeID

    Security.SecKeyGetTypeID.argtypes = []
    Security.SecKeyGetTypeID.restype = CFTypeID

    Security.SecCertificateCreateWithData.argtypes = [
        CFAllocatorRef,
        CFDataRef
    ]
    Security.SecCertificateCreateWithData.restype = SecCertificateRef

    Security.SecCertificateCopyData.argtypes = [
        SecCertificateRef
    ]
    Security.SecCertificateCopyData.restype = CFDataRef

    Security.SecCopyErrorMessageString.argtypes = [
        OSStatus,
        c_void_p
    ]
    Security.SecCopyErrorMessageString.restype = CFStringRef

    Security.SecIdentityCreateWithCertificate.argtypes = [
        CFTypeRef,
        SecCertificateRef,
        POINTER(SecIdentityRef)
    ]
    Security.SecIdentityCreateWithCertificate.restype = OSStatus

    Security.SecKeychainCreate.argtypes = [
        c_char_p,
        c_uint32,
        c_void_p,
        Boolean,
        c_void_p,
        POINTER(SecKeychainRef)
    ]
    Security.SecKeychainCreate.restype = OSStatus

    Security.SecKeychainDelete.argtypes = [
        SecKeychainRef
    ]
    Security.SecKeychainDelete.restype = OSStatus

    Security.SecPKCS12Import.argtypes = [
        CFDataRef,
        CFDictionaryRef,
        POINTER(CFArrayRef)
    ]
    Security.SecPKCS12Import.restype = OSStatus

    SSLReadFunc = CFUNCTYPE(OSStatus, SSLConnectionRef, c_void_p, POINTER(c_size_t))
    SSLWriteFunc = CFUNCTYPE(OSStatus, SSLConnectionRef, POINTER(c_byte), POINTER(c_size_t))

    Security.SSLSetIOFuncs.argtypes = [
        SSLContextRef,
        SSLReadFunc,
        SSLWriteFunc
    ]
    Security.SSLSetIOFuncs.restype = OSStatus

    Security.SSLSetPeerID.argtypes = [
        SSLContextRef,
        c_char_p,
        c_size_t
    ]
    Security.SSLSetPeerID.restype = OSStatus

    Security.SSLSetCertificate.argtypes = [
        SSLContextRef,
        CFArrayRef
    ]
    Security.SSLSetCertificate.restype = OSStatus

    Security.SSLSetCertificateAuthorities.argtypes = [
        SSLContextRef,
        CFTypeRef,
        Boolean
    ]
    Security.SSLSetCertificateAuthorities.restype = OSStatus

    Security.SSLSetConnection.argtypes = [
        SSLContextRef,
        SSLConnectionRef
    ]
    Security.SSLSetConnection.restype = OSStatus

    Security.SSLSetPeerDomainName.argtypes = [
        SSLContextRef,
        c_char_p,
        c_size_t
    ]
    Security.SSLSetPeerDomainName.restype = OSStatus

    Security.SSLHandshake.argtypes = [
        SSLContextRef
    ]
    Security.SSLHandshake.restype = OSStatus

    Security.SSLRead.argtypes = [
        SSLContextRef,
        c_char_p,
        c_size_t,
        POINTER(c_size_t)
    ]
    Security.SSLRead.restype = OSStatus

    Security.SSLWrite.argtypes = [
        SSLContextRef,
        c_char_p,
        c_size_t,
        POINTER(c_size_t)
    ]
    Security.SSLWrite.restype = OSStatus

    Security.SSLClose.argtypes = [
        SSLContextRef
    ]
    Security.SSLClose.restype = OSStatus

    Security.SSLGetNumberSupportedCiphers.argtypes = [
        SSLContextRef,
        POINTER(c_size_t)
    ]
    Security.SSLGetNumberSupportedCiphers.restype = OSStatus

    Security.SSLGetSupportedCiphers.argtypes = [
        SSLContextRef,
        POINTER(SSLCipherSuite),
        POINTER(c_size_t)
    ]
    Security.SSLGetSupportedCiphers.restype = OSStatus

    Security.SSLSetEnabledCiphers.argtypes = [
        SSLContextRef,
        POINTER(SSLCipherSuite),
        c_size_t
    ]
    Security.SSLSetEnabledCiphers.restype = OSStatus

    Security.SSLGetNumberEnabledCiphers.argtypes = [
        SSLContextRef,
        POINTER(c_size_t)
    ]
    Security.SSLGetNumberEnabledCiphers.restype = OSStatus

    Security.SSLGetEnabledCiphers.argtypes = [
        SSLContextRef,
        POINTER(SSLCipherSuite),
        POINTER(c_size_t)
    ]
    Security.SSLGetEnabledCiphers.restype = OSStatus

    Security.SSLGetNegotiatedCipher.argtypes = [
        SSLContextRef,
        POINTER(SSLCipherSuite)
    ]
    Security.SSLGetNegotiatedCipher.restype = OSStatus

    Security.SSLGetNegotiatedProtocolVersion.argtypes = [
        SSLContextRef,
        POINTER(SSLProtocol)
    ]
    Security.SSLGetNegotiatedProtocolVersion.restype = OSStatus

    Security.SSLCopyPeerTrust.argtypes = [
        SSLContextRef,
        POINTER(SecTrustRef)
    ]
    Security.SSLCopyPeerTrust.restype = OSStatus

    Security.SecTrustSetAnchorCertificates.argtypes = [
        SecTrustRef,
        CFArrayRef
    ]
    Security.SecTrustSetAnchorCertificates.restype = OSStatus

    Security.SecTrustSetAnchorCertificatesOnly.argtypes = [
        SecTrustRef,
        Boolean
    ]
    Security.SecTrustSetAnchorCertificatesOnly.restype = OSStatus

    Security.SecTrustEvaluate.argtypes = [
        SecTrustRef,
        POINTER(SecTrustResultType)
    ]
    Security.SecTrustEvaluate.restype = OSStatus

    Security.SecTrustGetCertificateCount.argtypes = [
        SecTrustRef
    ]
    Security.SecTrustGetCertificateCount.restype = CFIndex

    Security.SecTrustGetCertificateAtIndex.argtypes = [
        SecTrustRef,
        CFIndex
    ]
    Security.SecTrustGetCertificateAtIndex.restype = SecCertificateRef

    Security.SSLCreateContext.argtypes = [
        CFAllocatorRef,
        SSLProtocolSide,
        SSLConnectionType
    ]
    Security.SSLCreateContext.restype = SSLContextRef

    Security.SSLSetSessionOption.argtypes = [
        SSLContextRef,
        SSLSessionOption,
        Boolean
    ]
    Security.SSLSetSessionOption.restype = OSStatus

    Security.SSLSetProtocolVersionMin.argtypes = [
        SSLContextRef,
        SSLProtocol
    ]
    Security.SSLSetProtocolVersionMin.restype = OSStatus

    Security.SSLSetProtocolVersionMax.argtypes = [
        SSLContextRef,
        SSLProtocol
    ]
    Security.SSLSetProtocolVersionMax.restype = OSStatus

    Security.SecCopyErrorMessageString.argtypes = [
        OSStatus,
        c_void_p
    ]
    Security.SecCopyErrorMessageString.restype = CFStringRef

    Security.SSLReadFunc = SSLReadFunc
    Security.SSLWriteFunc = SSLWriteFunc
    Security.SSLContextRef = SSLContextRef
    Security.SSLProtocol = SSLProtocol
    Security.SSLCipherSuite = SSLCipherSuite
    Security.SecIdentityRef = SecIdentityRef
    Security.SecKeychainRef = SecKeychainRef
    Security.SecTrustRef = SecTrustRef
    Security.SecTrustResultType = SecTrustResultType
    Security.SecExternalFormat = SecExternalFormat
    Security.OSStatus = OSStatus

    Security.kSecImportExportPassphrase = CFStringRef.in_dll(
        Security, 'kSecImportExportPassphrase'
    )
    Security.kSecImportItemIdentity = CFStringRef.in_dll(
        Security, 'kSecImportItemIdentity'
    )

    # CoreFoundation time!
    CoreFoundation.CFRetain.argtypes = [
        CFTypeRef
    ]
    CoreFoundation.CFRetain.restype = CFTypeRef

    CoreFoundation.CFRelease.argtypes = [
        CFTypeRef
    ]
    CoreFoundation.CFRelease.restype = None

    CoreFoundation.CFGetTypeID.argtypes = [
        CFTypeRef
    ]
    CoreFoundation.CFGetTypeID.restype = CFTypeID

    CoreFoundation.CFStringCreateWithCString.argtypes = [
        CFAllocatorRef,
        c_char_p,
        CFStringEncoding
    ]
    CoreFoundation.CFStringCreateWithCString.restype = CFStringRef

    CoreFoundation.CFStringGetCStringPtr.argtypes = [
        CFStringRef,
        CFStringEncoding
    ]
    CoreFoundation.CFStringGetCStringPtr.restype = c_char_p

    CoreFoundation.CFStringGetCString.argtypes = [
        CFStringRef,
        c_char_p,
        CFIndex,
        CFStringEncoding
    ]
    CoreFoundation.CFStringGetCString.restype = c_bool

    CoreFoundation.CFDataCreate.argtypes = [
        CFAllocatorRef,
        c_char_p,
        CFIndex
    ]
    CoreFoundation.CFDataCreate.restype = CFDataRef

    CoreFoundation.CFDataGetLength.argtypes = [
        CFDataRef
    ]
    CoreFoundation.CFDataGetLength.restype = CFIndex

    CoreFoundation.CFDataGetBytePtr.argtypes = [
        CFDataRef
    ]
    CoreFoundation.CFDataGetBytePtr.restype = c_void_p

    CoreFoundation.CFDictionaryCreate.argtypes = [
        CFAllocatorRef,
        POINTER(CFTypeRef),
        POINTER(CFTypeRef),
        CFIndex,
        CFDictionaryKeyCallBacks,
        CFDictionaryValueCallBacks
    ]
    CoreFoundation.CFDictionaryCreate.restype = CFDictionaryRef

    CoreFoundation.CFDictionaryGetValue.argtypes = [
        CFDictionaryRef,
        CFTypeRef
    ]
    CoreFoundation.CFDictionaryGetValue.restype = CFTypeRef

    CoreFoundation.CFArrayCreate.argtypes = [
        CFAllocatorRef,
        POINTER(CFTypeRef),
        CFIndex,
        CFArrayCallBacks,
    ]
    CoreFoundation.CFArrayCreate.restype = CFArrayRef

    CoreFoundation.CFArrayCreateMutable.argtypes = [
        CFAllocatorRef,
        CFIndex,
        CFArrayCallBacks
    ]
    CoreFoundation.CFArrayCreateMutable.restype = CFMutableArrayRef

    CoreFoundation.CFArrayAppendValue.argtypes = [
        CFMutableArrayRef,
        c_void_p
    ]
    CoreFoundation.CFArrayAppendValue.restype = None

    CoreFoundation.CFArrayGetCount.argtypes = [
        CFArrayRef
    ]
    CoreFoundation.CFArrayGetCount.restype = CFIndex

    CoreFoundation.CFArrayGetValueAtIndex.argtypes = [
        CFArrayRef,
        CFIndex
    ]
    CoreFoundation.CFArrayGetValueAtIndex.restype = c_void_p

    CoreFoundation.kCFAllocatorDefault = CFAllocatorRef.in_dll(
        CoreFoundation, 'kCFAllocatorDefault'
    )
    CoreFoundation.kCFTypeArrayCallBacks = c_void_p.in_dll(CoreFoundation, 'kCFTypeArrayCallBacks')
    CoreFoundation.kCFTypeDictionaryKeyCallBacks = c_void_p.in_dll(
        CoreFoundation, 'kCFTypeDictionaryKeyCallBacks'
    )
    CoreFoundation.kCFTypeDictionaryValueCallBacks = c_void_p.in_dll(
        CoreFoundation, 'kCFTypeDictionaryValueCallBacks'
    )

    CoreFoundation.CFTypeRef = CFTypeRef
    CoreFoundation.CFArrayRef = CFArrayRef
    CoreFoundation.CFStringRef = CFStringRef
    CoreFoundation.CFDictionaryRef = CFDictionaryRef

except AttributeError:
    raise ImportError('Error initializing ctypes')


class CFConst(object):
    """
    A class object that acts as essentially a namespace for CoreFoundation
    constants.
    """
    kCFStringEncodingUTF8 = CFStringEncoding(0x08000100)
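

# Illustrative sketch, not part of the original module: building a CFString
# from a Python byte string with the UTF-8 encoding constant above. The
# caller owns the returned CFStringRef and must CFRelease it.
def _example_cf_string_from_bytes(py_bytes=b"example"):
    return CoreFoundation.CFStringCreateWithCString(
        CoreFoundation.kCFAllocatorDefault,
        py_bytes,
        CFConst.kCFStringEncodingUTF8,
    )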


class SecurityConst(object):
    """
    A class object that acts as essentially a namespace for Security constants.
    """
    kSSLSessionOptionBreakOnServerAuth = 0

    kSSLProtocol2 = 1
    kSSLProtocol3 = 2
    kTLSProtocol1 = 4
    kTLSProtocol11 = 7
    kTLSProtocol12 = 8

    kSSLClientSide = 1
    kSSLStreamType = 0

    kSecFormatPEMSequence = 10

    kSecTrustResultInvalid = 0
    kSecTrustResultProceed = 1
    # This gap is present on purpose: this was kSecTrustResultConfirm, which
    # is deprecated.
    kSecTrustResultDeny = 3
    kSecTrustResultUnspecified = 4
    kSecTrustResultRecoverableTrustFailure = 5
    kSecTrustResultFatalTrustFailure = 6
    kSecTrustResultOtherError = 7

    errSSLProtocol = -9800
    errSSLWouldBlock = -9803
    errSSLClosedGraceful = -9805
    errSSLClosedNoNotify = -9816
    errSSLClosedAbort = -9806

    errSSLXCertChainInvalid = -9807
    errSSLCrypto = -9809
    errSSLInternal = -9810
    errSSLCertExpired = -9814
    errSSLCertNotYetValid = -9815
    errSSLUnknownRootCert = -9812
    errSSLNoRootCert = -9813
    errSSLHostNameMismatch = -9843
    errSSLPeerHandshakeFail = -9824
    errSSLPeerUserCancelled = -9839
    errSSLWeakPeerEphemeralDHKey = -9850
    errSSLServerAuthCompleted = -9841
    errSSLRecordOverflow = -9847

    errSecVerifyFailed = -67808
    errSecNoTrustSettings = -25263
    errSecItemNotFound = -25300
    errSecInvalidTrustSettings = -25262

    # Cipher suites. We only pick the ones our default cipher string allows.
    TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384 = 0xC02C
    TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384 = 0xC030
    TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256 = 0xC02B
    TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256 = 0xC02F
    TLS_DHE_DSS_WITH_AES_256_GCM_SHA384 = 0x00A3
    TLS_DHE_RSA_WITH_AES_256_GCM_SHA384 = 0x009F
    TLS_DHE_DSS_WITH_AES_128_GCM_SHA256 = 0x00A2
    TLS_DHE_RSA_WITH_AES_128_GCM_SHA256 = 0x009E
    TLS_ECDHE_ECDSA_WITH_AES_256_CBC_SHA384 = 0xC024
    TLS_ECDHE_RSA_WITH_AES_256_CBC_SHA384 = 0xC028
    TLS_ECDHE_ECDSA_WITH_AES_256_CBC_SHA = 0xC00A
    TLS_ECDHE_RSA_WITH_AES_256_CBC_SHA = 0xC014
    TLS_DHE_RSA_WITH_AES_256_CBC_SHA256 = 0x006B
    TLS_DHE_DSS_WITH_AES_256_CBC_SHA256 = 0x006A
    TLS_DHE_RSA_WITH_AES_256_CBC_SHA = 0x0039
    TLS_DHE_DSS_WITH_AES_256_CBC_SHA = 0x0038
    TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256 = 0xC023
    TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256 = 0xC027
    TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA = 0xC009
    TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA = 0xC013
    TLS_DHE_RSA_WITH_AES_128_CBC_SHA256 = 0x0067
    TLS_DHE_DSS_WITH_AES_128_CBC_SHA256 = 0x0040
    TLS_DHE_RSA_WITH_AES_128_CBC_SHA = 0x0033
    TLS_DHE_DSS_WITH_AES_128_CBC_SHA = 0x0032
    TLS_RSA_WITH_AES_256_GCM_SHA384 = 0x009D
    TLS_RSA_WITH_AES_128_GCM_SHA256 = 0x009C
    TLS_RSA_WITH_AES_256_CBC_SHA256 = 0x003D
    TLS_RSA_WITH_AES_128_CBC_SHA256 = 0x003C
    TLS_RSA_WITH_AES_256_CBC_SHA = 0x0035
    TLS_RSA_WITH_AES_128_CBC_SHA = 0x002F
    TLS_AES_128_GCM_SHA256 = 0x1301
    TLS_AES_256_GCM_SHA384 = 0x1302
    TLS_CHACHA20_POLY1305_SHA256 = 0x1303
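

# Illustrative sketch, not part of the original module: roughly the sequence
# the higher-level SecureTransport code follows to obtain a client-side
# stream context restricted to TLS 1.0 through TLS 1.2. Real callers check
# every returned OSStatus (see _assert_no_error in low_level.py).
def _example_create_client_context():
    context = Security.SSLCreateContext(
        None, SecurityConst.kSSLClientSide, SecurityConst.kSSLStreamType
    )
    Security.SSLSetProtocolVersionMin(context, SecurityConst.kTLSProtocol1)
    Security.SSLSetProtocolVersionMax(context, SecurityConst.kTLSProtocol12)
    return context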
site-packages/pip/_vendor/urllib3/contrib/_securetransport/__init__.py000064400000000000147511334600022231 0ustar00site-packages/pip/_vendor/urllib3/contrib/pyopenssl.py000064400000035772147511334600017154 0ustar00"""
SSL with SNI_-support for Python 2. Follow these instructions if you would
like to verify SSL certificates in Python 2. Note, the default libraries do
*not* do certificate checking; you need to do additional work to validate
certificates yourself.

This needs the following packages installed:

* pyOpenSSL (tested with 16.0.0)
* cryptography (minimum 1.3.4, from pyopenssl)
* idna (minimum 2.0, from cryptography)

However, pyopenssl depends on cryptography, which depends on idna, so while we
use all three directly here we end up having relatively few packages required.

You can install them with the following command:

    pip install pyopenssl cryptography idna

To activate certificate checking, call
:func:`~urllib3.contrib.pyopenssl.inject_into_urllib3` from your Python code
before you begin making HTTP requests. This can be done in a ``sitecustomize``
module, or at any other time before your application begins using ``urllib3``,
like this::

    try:
        import urllib3.contrib.pyopenssl
        urllib3.contrib.pyopenssl.inject_into_urllib3()
    except ImportError:
        pass

Now you can use :mod:`urllib3` as you normally would, and it will support SNI
when the required modules are installed.

Activating this module also has the positive side effect of disabling SSL/TLS
compression in Python 2 (see `CRIME attack`_).

If you want to configure the default list of supported cipher suites, you can
set the ``urllib3.contrib.pyopenssl.DEFAULT_SSL_CIPHER_LIST`` variable.

.. _sni: https://en.wikipedia.org/wiki/Server_Name_Indication
.. _crime attack: https://en.wikipedia.org/wiki/CRIME_(security_exploit)
"""
from __future__ import absolute_import

import OpenSSL.SSL
from cryptography import x509
from cryptography.hazmat.backends.openssl import backend as openssl_backend
from cryptography.hazmat.backends.openssl.x509 import _Certificate

from socket import timeout, error as SocketError
from io import BytesIO

try:  # Platform-specific: Python 2
    from socket import _fileobject
except ImportError:  # Platform-specific: Python 3
    _fileobject = None
    from ..packages.backports.makefile import backport_makefile

import logging
import ssl
from ..packages import six
import sys

from .. import util

__all__ = ['inject_into_urllib3', 'extract_from_urllib3']

# SNI always works.
HAS_SNI = True

# Map from urllib3 to PyOpenSSL compatible parameter-values.
_openssl_versions = {
    ssl.PROTOCOL_SSLv23: OpenSSL.SSL.SSLv23_METHOD,
    ssl.PROTOCOL_TLSv1: OpenSSL.SSL.TLSv1_METHOD,
}

if hasattr(ssl, 'PROTOCOL_TLSv1_1') and hasattr(OpenSSL.SSL, 'TLSv1_1_METHOD'):
    _openssl_versions[ssl.PROTOCOL_TLSv1_1] = OpenSSL.SSL.TLSv1_1_METHOD

if hasattr(ssl, 'PROTOCOL_TLSv1_2') and hasattr(OpenSSL.SSL, 'TLSv1_2_METHOD'):
    _openssl_versions[ssl.PROTOCOL_TLSv1_2] = OpenSSL.SSL.TLSv1_2_METHOD

try:
    _openssl_versions.update({ssl.PROTOCOL_SSLv3: OpenSSL.SSL.SSLv3_METHOD})
except AttributeError:
    pass

_stdlib_to_openssl_verify = {
    ssl.CERT_NONE: OpenSSL.SSL.VERIFY_NONE,
    ssl.CERT_OPTIONAL: OpenSSL.SSL.VERIFY_PEER,
    ssl.CERT_REQUIRED:
        OpenSSL.SSL.VERIFY_PEER + OpenSSL.SSL.VERIFY_FAIL_IF_NO_PEER_CERT,
}
_openssl_to_stdlib_verify = dict(
    (v, k) for k, v in _stdlib_to_openssl_verify.items()
)

# OpenSSL will only write 16K at a time
SSL_WRITE_BLOCKSIZE = 16384

orig_util_HAS_SNI = util.HAS_SNI
orig_util_SSLContext = util.ssl_.SSLContext


log = logging.getLogger(__name__)


def inject_into_urllib3():
    'Monkey-patch urllib3 with PyOpenSSL-backed SSL-support.'

    _validate_dependencies_met()

    util.ssl_.SSLContext = PyOpenSSLContext
    util.HAS_SNI = HAS_SNI
    util.ssl_.HAS_SNI = HAS_SNI
    util.IS_PYOPENSSL = True
    util.ssl_.IS_PYOPENSSL = True


def extract_from_urllib3():
    'Undo monkey-patching by :func:`inject_into_urllib3`.'

    util.ssl_.SSLContext = orig_util_SSLContext
    util.HAS_SNI = orig_util_HAS_SNI
    util.ssl_.HAS_SNI = orig_util_HAS_SNI
    util.IS_PYOPENSSL = False
    util.ssl_.IS_PYOPENSSL = False


def _validate_dependencies_met():
    """
    Verifies that PyOpenSSL's package-level dependencies have been met.
    Throws `ImportError` if they are not met.
    """
    # Method added in `cryptography==1.1`; not available in older versions
    from cryptography.x509.extensions import Extensions
    if getattr(Extensions, "get_extension_for_class", None) is None:
        raise ImportError("'cryptography' module missing required functionality.  "
                          "Try upgrading to v1.3.4 or newer.")

    # pyOpenSSL 0.14 and above use cryptography for OpenSSL bindings. The _x509
    # attribute is only present on those versions.
    from OpenSSL.crypto import X509
    x509 = X509()
    if getattr(x509, "_x509", None) is None:
        raise ImportError("'pyOpenSSL' module missing required functionality. "
                          "Try upgrading to v0.14 or newer.")


def _dnsname_to_stdlib(name):
    """
    Converts a dNSName SubjectAlternativeName field to the form used by the
    standard library on the given Python version.

    Cryptography produces a dNSName as a unicode string that was idna-decoded
    from ASCII bytes. We need to idna-encode that string to get it back, and
    then on Python 3 we also need to convert to unicode via UTF-8 (the stdlib
    uses PyUnicode_FromStringAndSize on it, which decodes via UTF-8).
    """
    def idna_encode(name):
        """
        Borrowed wholesale from the Python Cryptography Project. It turns out
        that we can't just safely call `idna.encode`: it can explode for
        wildcard names. This avoids that problem.
        """
        import idna

        for prefix in [u'*.', u'.']:
            if name.startswith(prefix):
                name = name[len(prefix):]
                return prefix.encode('ascii') + idna.encode(name)
        return idna.encode(name)

    name = idna_encode(name)
    if sys.version_info >= (3, 0):
        name = name.decode('utf-8')
    return name
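
# Illustrative examples, not part of the original module: what the helper
# above yields on Python 3 (on Python 2 the results remain byte strings).
# The host names are hypothetical:
#
#   _dnsname_to_stdlib(u'example.com')        -> 'example.com'
#   _dnsname_to_stdlib(u'*.example.com')      -> '*.example.com'  (wildcard kept)
#   _dnsname_to_stdlib(u'b\xfccher.example')  -> 'xn--bcher-kva.example'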


def get_subj_alt_name(peer_cert):
    """
    Given a PyOpenSSL certificate, provides all the subject alternative names.
    """
    # Pass the cert to cryptography, which has much better APIs for this.
    if hasattr(peer_cert, "to_cryptography"):
        cert = peer_cert.to_cryptography()
    else:
        # This is technically using private APIs, but should work across all
        # relevant versions before PyOpenSSL got a proper API for this.
        cert = _Certificate(openssl_backend, peer_cert._x509)

    # We want to find the SAN extension. Ask Cryptography to locate it (it's
    # faster than looping in Python)
    try:
        ext = cert.extensions.get_extension_for_class(
            x509.SubjectAlternativeName
        ).value
    except x509.ExtensionNotFound:
        # No such extension, return the empty list.
        return []
    except (x509.DuplicateExtension, x509.UnsupportedExtension,
            x509.UnsupportedGeneralNameType, UnicodeError) as e:
        # A problem has been found with the quality of the certificate. Assume
        # no SAN field is present.
        log.warning(
            "A problem was encountered with the certificate that prevented "
            "urllib3 from finding the SubjectAlternativeName field. This can "
            "affect certificate validation. The error was %s",
            e,
        )
        return []

    # We want to return dNSName and iPAddress fields. We need to cast the IPs
    # back to strings because the match_hostname function wants them as
    # strings.
    # Sadly the DNS names need to be idna encoded and then, on Python 3, UTF-8
    # decoded. This is pretty frustrating, but that's what the standard library
    # does with certificates, and so we need to attempt to do the same.
    names = [
        ('DNS', _dnsname_to_stdlib(name))
        for name in ext.get_values_for_type(x509.DNSName)
    ]
    names.extend(
        ('IP Address', str(name))
        for name in ext.get_values_for_type(x509.IPAddress)
    )

    return names


class WrappedSocket(object):
    '''API-compatibility wrapper for PyOpenSSL's Connection class.

    Note: _makefile_refs, _drop() and _reuse() are needed for the garbage
    collector of pypy.
    '''

    def __init__(self, connection, socket, suppress_ragged_eofs=True):
        self.connection = connection
        self.socket = socket
        self.suppress_ragged_eofs = suppress_ragged_eofs
        self._makefile_refs = 0
        self._closed = False

    def fileno(self):
        return self.socket.fileno()

    # Copy-pasted from Python 3.5 source code
    def _decref_socketios(self):
        if self._makefile_refs > 0:
            self._makefile_refs -= 1
        if self._closed:
            self.close()

    def recv(self, *args, **kwargs):
        try:
            data = self.connection.recv(*args, **kwargs)
        except OpenSSL.SSL.SysCallError as e:
            if self.suppress_ragged_eofs and e.args == (-1, 'Unexpected EOF'):
                return b''
            else:
                raise SocketError(str(e))
        except OpenSSL.SSL.ZeroReturnError as e:
            if self.connection.get_shutdown() == OpenSSL.SSL.RECEIVED_SHUTDOWN:
                return b''
            else:
                raise
        except OpenSSL.SSL.WantReadError:
            rd = util.wait_for_read(self.socket, self.socket.gettimeout())
            if not rd:
                raise timeout('The read operation timed out')
            else:
                return self.recv(*args, **kwargs)
        else:
            return data

    def recv_into(self, *args, **kwargs):
        try:
            return self.connection.recv_into(*args, **kwargs)
        except OpenSSL.SSL.SysCallError as e:
            if self.suppress_ragged_eofs and e.args == (-1, 'Unexpected EOF'):
                return 0
            else:
                raise SocketError(str(e))
        except OpenSSL.SSL.ZeroReturnError as e:
            if self.connection.get_shutdown() == OpenSSL.SSL.RECEIVED_SHUTDOWN:
                return 0
            else:
                raise
        except OpenSSL.SSL.WantReadError:
            rd = util.wait_for_read(self.socket, self.socket.gettimeout())
            if not rd:
                raise timeout('The read operation timed out')
            else:
                return self.recv_into(*args, **kwargs)

    def settimeout(self, timeout):
        return self.socket.settimeout(timeout)

    def _send_until_done(self, data):
        while True:
            try:
                return self.connection.send(data)
            except OpenSSL.SSL.WantWriteError:
                wr = util.wait_for_write(self.socket, self.socket.gettimeout())
                if not wr:
                    raise timeout()
                continue
            except OpenSSL.SSL.SysCallError as e:
                raise SocketError(str(e))

    def sendall(self, data):
        total_sent = 0
        while total_sent < len(data):
            sent = self._send_until_done(data[total_sent:total_sent + SSL_WRITE_BLOCKSIZE])
            total_sent += sent

    def shutdown(self):
        # FIXME rethrow compatible exceptions should we ever use this
        self.connection.shutdown()

    def close(self):
        if self._makefile_refs < 1:
            try:
                self._closed = True
                return self.connection.close()
            except OpenSSL.SSL.Error:
                return
        else:
            self._makefile_refs -= 1

    def getpeercert(self, binary_form=False):
        x509 = self.connection.get_peer_certificate()

        if not x509:
            return x509

        if binary_form:
            return OpenSSL.crypto.dump_certificate(
                OpenSSL.crypto.FILETYPE_ASN1,
                x509)

        return {
            'subject': (
                (('commonName', x509.get_subject().CN),),
            ),
            'subjectAltName': get_subj_alt_name(x509)
        }

    def _reuse(self):
        self._makefile_refs += 1

    def _drop(self):
        if self._makefile_refs < 1:
            self.close()
        else:
            self._makefile_refs -= 1


if _fileobject:  # Platform-specific: Python 2
    def makefile(self, mode, bufsize=-1):
        self._makefile_refs += 1
        return _fileobject(self, mode, bufsize, close=True)
else:  # Platform-specific: Python 3
    makefile = backport_makefile

WrappedSocket.makefile = makefile


class PyOpenSSLContext(object):
    """
    I am a wrapper class for the PyOpenSSL ``Context`` object. I am responsible
    for translating the interface of the standard library ``SSLContext`` object
    to calls into PyOpenSSL.
    """
    def __init__(self, protocol):
        self.protocol = _openssl_versions[protocol]
        self._ctx = OpenSSL.SSL.Context(self.protocol)
        self._options = 0
        self.check_hostname = False

    @property
    def options(self):
        return self._options

    @options.setter
    def options(self, value):
        self._options = value
        self._ctx.set_options(value)

    @property
    def verify_mode(self):
        return _openssl_to_stdlib_verify[self._ctx.get_verify_mode()]

    @verify_mode.setter
    def verify_mode(self, value):
        self._ctx.set_verify(
            _stdlib_to_openssl_verify[value],
            _verify_callback
        )

    def set_default_verify_paths(self):
        self._ctx.set_default_verify_paths()

    def set_ciphers(self, ciphers):
        if isinstance(ciphers, six.text_type):
            ciphers = ciphers.encode('utf-8')
        self._ctx.set_cipher_list(ciphers)

    def load_verify_locations(self, cafile=None, capath=None, cadata=None):
        if cafile is not None:
            cafile = cafile.encode('utf-8')
        if capath is not None:
            capath = capath.encode('utf-8')
        self._ctx.load_verify_locations(cafile, capath)
        if cadata is not None:
            self._ctx.load_verify_locations(BytesIO(cadata))

    def load_cert_chain(self, certfile, keyfile=None, password=None):
        self._ctx.use_certificate_file(certfile)
        if password is not None:
            self._ctx.set_passwd_cb(lambda max_length, prompt_twice, userdata: password)
        self._ctx.use_privatekey_file(keyfile or certfile)

    def wrap_socket(self, sock, server_side=False,
                    do_handshake_on_connect=True, suppress_ragged_eofs=True,
                    server_hostname=None):
        cnx = OpenSSL.SSL.Connection(self._ctx, sock)

        if isinstance(server_hostname, six.text_type):  # Platform-specific: Python 3
            server_hostname = server_hostname.encode('utf-8')

        if server_hostname is not None:
            cnx.set_tlsext_host_name(server_hostname)

        cnx.set_connect_state()

        while True:
            try:
                cnx.do_handshake()
            except OpenSSL.SSL.WantReadError:
                rd = util.wait_for_read(sock, sock.gettimeout())
                if not rd:
                    raise timeout('select timed out')
                continue
            except OpenSSL.SSL.Error as e:
                raise ssl.SSLError('bad handshake: %r' % e)
            break

        return WrappedSocket(cnx, sock)


def _verify_callback(cnx, x509, err_no, err_depth, return_code):
    return err_no == 0
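

# Illustrative sketch, not part of the original module: using PyOpenSSLContext
# directly on an already-connected TCP socket. The host name and CA-bundle
# path are hypothetical; urllib3 itself performs hostname matching separately,
# after the handshake.
def _example_wrap(sock, hostname=u'example.com', ca_bundle='/path/to/ca.pem'):
    ctx = PyOpenSSLContext(ssl.PROTOCOL_SSLv23)
    ctx.verify_mode = ssl.CERT_REQUIRED
    ctx.load_verify_locations(cafile=ca_bundle)
    wrapped = ctx.wrap_socket(sock, server_hostname=hostname)
    # getpeercert() returns a dict shaped like ssl's, including the
    # 'subjectAltName' list produced by get_subj_alt_name() above.
    return wrapped, wrapped.getpeercert()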
site-packages/pip/_vendor/urllib3/contrib/__init__.py000064400000000000147511334600016627 0ustar00site-packages/pip/_vendor/urllib3/util/__init__.py000064400000002024147511334600016154 0ustar00from __future__ import absolute_import
# For backwards compatibility, provide imports that used to be here.
from .connection import is_connection_dropped
from .request import make_headers
from .response import is_fp_closed
from .ssl_ import (
    SSLContext,
    HAS_SNI,
    IS_PYOPENSSL,
    IS_SECURETRANSPORT,
    assert_fingerprint,
    resolve_cert_reqs,
    resolve_ssl_version,
    ssl_wrap_socket,
)
from .timeout import (
    current_time,
    Timeout,
)

from .retry import Retry
from .url import (
    get_host,
    parse_url,
    split_first,
    Url,
)
from .wait import (
    wait_for_read,
    wait_for_write
)

__all__ = (
    'HAS_SNI',
    'IS_PYOPENSSL',
    'IS_SECURETRANSPORT',
    'SSLContext',
    'Retry',
    'Timeout',
    'Url',
    'assert_fingerprint',
    'current_time',
    'is_connection_dropped',
    'is_fp_closed',
    'get_host',
    'parse_url',
    'make_headers',
    'resolve_cert_reqs',
    'resolve_ssl_version',
    'split_first',
    'ssl_wrap_socket',
    'wait_for_read',
    'wait_for_write'
)
site-packages/pip/_vendor/urllib3/util/response.py000064400000004447147511334600016266 0ustar00from __future__ import absolute_import
from ..packages.six.moves import http_client as httplib

from ..exceptions import HeaderParsingError


def is_fp_closed(obj):
    """
    Checks whether a given file-like object is closed.

    :param obj:
        The file-like object to check.
    """

    try:
        # Check `isclosed()` first, in case Python3 doesn't set `closed`.
        # GH Issue #928
        return obj.isclosed()
    except AttributeError:
        pass

    try:
        # Check via the official file-like-object way.
        return obj.closed
    except AttributeError:
        pass

    try:
        # Check if the object is a container for another file-like object that
        # gets released on exhaustion (e.g. HTTPResponse).
        return obj.fp is None
    except AttributeError:
        pass

    raise ValueError("Unable to determine whether fp is closed.")
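
# Illustrative example, not part of the original module: a plain BytesIO is
# handled by the second probe above (the ``closed`` attribute).
#
#   >>> from io import BytesIO
#   >>> buf = BytesIO(b"data")
#   >>> is_fp_closed(buf)
#   False
#   >>> buf.close()
#   >>> is_fp_closed(buf)
#   True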


def assert_header_parsing(headers):
    """
    Asserts whether all headers have been successfully parsed.
    Extracts encountered errors from the result of parsing headers.

    Only works on Python 3.

    :param headers: Headers to verify.
    :type headers: `httplib.HTTPMessage`.

    :raises urllib3.exceptions.HeaderParsingError:
        If parsing errors are found.
    """

    # This will fail silently if we pass in the wrong kind of parameter.
    # To make debugging easier add an explicit check.
    if not isinstance(headers, httplib.HTTPMessage):
        raise TypeError('expected httplib.Message, got {0}.'.format(
            type(headers)))

    defects = getattr(headers, 'defects', None)
    get_payload = getattr(headers, 'get_payload', None)

    unparsed_data = None
    if get_payload:  # Platform-specific: Python 3.
        unparsed_data = get_payload()

    if defects or unparsed_data:
        raise HeaderParsingError(defects=defects, unparsed_data=unparsed_data)
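
# Illustrative example: urllib3's connection pool calls this with the parsed
# message of a raw httplib response (``httplib_response`` is a hypothetical name):
#
#   assert_header_parsing(httplib_response.msg)   # raises HeaderParsingError on defects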


def is_response_to_head(response):
    """
    Checks whether the request of a response has been a HEAD-request.
    Handles the quirks of AppEngine.

    :param response:
    :type response: :class:`httplib.HTTPResponse`
    """
    # FIXME: Can we do this somehow without accessing private httplib _method?
    method = response._method
    if isinstance(method, int):  # Platform-specific: Appengine
        return method == 3
    return method.upper() == 'HEAD'
site-packages/pip/_vendor/urllib3/util/wait.py000064400000002653147511334600015371 0ustar00from .selectors import (
    HAS_SELECT,
    DefaultSelector,
    EVENT_READ,
    EVENT_WRITE
)


def _wait_for_io_events(socks, events, timeout=None):
    """ Waits for IO events to be available from a list of sockets
    or optionally a single socket if passed in. Returns a list of
    sockets that can be interacted with immediately. """
    if not HAS_SELECT:
        raise ValueError('Platform does not have a selector')
    if not isinstance(socks, list):
        # Probably just a single socket.
        if hasattr(socks, "fileno"):
            socks = [socks]
        # Otherwise it might be a non-list iterable.
        else:
            socks = list(socks)
    with DefaultSelector() as selector:
        for sock in socks:
            selector.register(sock, events)
        return [key[0].fileobj for key in
                selector.select(timeout) if key[1] & events]


def wait_for_read(socks, timeout=None):
    """ Waits for reading to be available from a list of sockets
    or optionally a single socket if passed in. Returns a list of
    sockets that can be read from immediately. """
    return _wait_for_io_events(socks, EVENT_READ, timeout)


def wait_for_write(socks, timeout=None):
    """ Waits for writing to be available from a list of sockets
    or optionally a single socket if passed in. Returns a list of
    sockets that can be written to immediately. """
    return _wait_for_io_events(socks, EVENT_WRITE, timeout)
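

if __name__ == "__main__":
    # Illustrative sketch (not part of the vendored module): exercise the helpers
    # with a local socket pair (POSIX; on Windows real sockets would be needed).
    import socket

    a, b = socket.socketpair()
    a.sendall(b"ping")                          # make `b` readable
    print(wait_for_read(b, timeout=1.0))        # expected: [b]
    print(wait_for_write([a, b], timeout=0))    # expected: both sockets, writable
    a.close()
    b.close()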
site-packages/pip/_vendor/urllib3/util/selectors.py000064400000051233147511334600016426 0ustar00# Backport of selectors.py from Python 3.5+ to support Python < 3.4
# Also has the behavior specified in PEP 475 which is to retry syscalls
# in the case of an EINTR error. This module is required because selectors34
# does not follow this behavior and instead returns that no file descriptor
# events have occurred rather than retry the syscall. The decision to drop
# support for select.devpoll is made to maintain 100% test coverage.

import errno
import math
import select
import socket
import sys
import time
from collections import namedtuple, Mapping

try:
    monotonic = time.monotonic
except (AttributeError, ImportError):  # Python < 3.3
    monotonic = time.time

EVENT_READ = (1 << 0)
EVENT_WRITE = (1 << 1)

HAS_SELECT = True  # Variable that shows whether the platform has a selector.
_SYSCALL_SENTINEL = object()  # Sentinel in case a system call returns None.
_DEFAULT_SELECTOR = None


class SelectorError(Exception):
    def __init__(self, errcode):
        super(SelectorError, self).__init__()
        self.errno = errcode

    def __repr__(self):
        return "<SelectorError errno={0}>".format(self.errno)

    def __str__(self):
        return self.__repr__()


def _fileobj_to_fd(fileobj):
    """ Return a file descriptor from a file object. If
    given an integer will simply return that integer back. """
    if isinstance(fileobj, int):
        fd = fileobj
    else:
        try:
            fd = int(fileobj.fileno())
        except (AttributeError, TypeError, ValueError):
            raise ValueError("Invalid file object: {0!r}".format(fileobj))
    if fd < 0:
        raise ValueError("Invalid file descriptor: {0}".format(fd))
    return fd


# Determine which function to use to wrap system calls because Python 3.5+
# already handles the case when system calls are interrupted.
if sys.version_info >= (3, 5):
    def _syscall_wrapper(func, _, *args, **kwargs):
        """ This is the short-circuit version of the below logic
        because in Python 3.5+ all system calls automatically restart
        and recalculate their timeouts. """
        try:
            return func(*args, **kwargs)
        except (OSError, IOError, select.error) as e:
            errcode = None
            if hasattr(e, "errno"):
                errcode = e.errno
            raise SelectorError(errcode)
else:
    def _syscall_wrapper(func, recalc_timeout, *args, **kwargs):
        """ Wrapper function for syscalls that could fail due to EINTR.
        All functions should be retried if there is time left in the timeout
        in accordance with PEP 475. """
        timeout = kwargs.get("timeout", None)
        if timeout is None:
            expires = None
            recalc_timeout = False
        else:
            timeout = float(timeout)
            if timeout < 0.0:  # Timeout less than 0 treated as no timeout.
                expires = None
            else:
                expires = monotonic() + timeout

        args = list(args)
        if recalc_timeout and "timeout" not in kwargs:
            raise ValueError(
                "Timeout must be in args or kwargs to be recalculated")

        result = _SYSCALL_SENTINEL
        while result is _SYSCALL_SENTINEL:
            try:
                result = func(*args, **kwargs)
            # OSError is thrown by select.select
            # IOError is thrown by select.epoll.poll
            # select.error is thrown by select.poll.poll
            # Aren't we thankful for Python 3.x rework for exceptions?
            except (OSError, IOError, select.error) as e:
                # select.error wasn't a subclass of OSError in the past.
                errcode = None
                if hasattr(e, "errno"):
                    errcode = e.errno
                elif hasattr(e, "args"):
                    errcode = e.args[0]

                # Also test for the Windows equivalent of EINTR.
                is_interrupt = (errcode == errno.EINTR or (hasattr(errno, "WSAEINTR") and
                                                           errcode == errno.WSAEINTR))

                if is_interrupt:
                    if expires is not None:
                        current_time = monotonic()
                        if current_time > expires:
                            raise OSError(errno.ETIMEDOUT, "Connection timed out")
                        if recalc_timeout:
                            if "timeout" in kwargs:
                                kwargs["timeout"] = expires - current_time
                    continue
                if errcode:
                    raise SelectorError(errcode)
                else:
                    raise
        return result
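
# Illustrative examples of how the selectors below use this wrapper.  The second
# argument says whether the remaining timeout should be recomputed after an EINTR
# retry; that only happens when the timeout is passed as a keyword:
#
#   _syscall_wrapper(epoll.poll, True, timeout=2.5, maxevents=8)
#   _syscall_wrapper(epoll.register, False, fd, select.EPOLLIN)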


SelectorKey = namedtuple('SelectorKey', ['fileobj', 'fd', 'events', 'data'])


class _SelectorMapping(Mapping):
    """ Mapping of file objects to selector keys """

    def __init__(self, selector):
        self._selector = selector

    def __len__(self):
        return len(self._selector._fd_to_key)

    def __getitem__(self, fileobj):
        try:
            fd = self._selector._fileobj_lookup(fileobj)
            return self._selector._fd_to_key[fd]
        except KeyError:
            raise KeyError("{0!r} is not registered.".format(fileobj))

    def __iter__(self):
        return iter(self._selector._fd_to_key)


class BaseSelector(object):
    """ Abstract Selector class

    A selector supports registering file objects to be monitored
    for specific I/O events.

    A file object is a file descriptor or any object with a
    `fileno()` method. An arbitrary object can be attached to the
    file object which can be used for example to store context info,
    a callback, etc.

    A selector can use various implementations (select(), poll(), epoll(),
    and kqueue()) depending on the platform. The 'DefaultSelector' class uses
    the most efficient implementation for the current platform.
    """
    def __init__(self):
        # Maps file descriptors to keys.
        self._fd_to_key = {}

        # Read-only mapping returned by get_map()
        self._map = _SelectorMapping(self)

    def _fileobj_lookup(self, fileobj):
        """ Return a file descriptor from a file object.
        This wraps _fileobj_to_fd() to do an exhaustive
        search in case the object is invalid but we still
        have it in our map. Used by unregister() so we can
        unregister an object that was previously registered
        even if it is closed. It is also used by _SelectorMapping.
        """
        try:
            return _fileobj_to_fd(fileobj)
        except ValueError:

            # Search through all our mapped keys.
            for key in self._fd_to_key.values():
                if key.fileobj is fileobj:
                    return key.fd

            # Raise ValueError after all.
            raise

    def register(self, fileobj, events, data=None):
        """ Register a file object for a set of events to monitor. """
        if (not events) or (events & ~(EVENT_READ | EVENT_WRITE)):
            raise ValueError("Invalid events: {0!r}".format(events))

        key = SelectorKey(fileobj, self._fileobj_lookup(fileobj), events, data)

        if key.fd in self._fd_to_key:
            raise KeyError("{0!r} (FD {1}) is already registered"
                           .format(fileobj, key.fd))

        self._fd_to_key[key.fd] = key
        return key

    def unregister(self, fileobj):
        """ Unregister a file object from being monitored. """
        try:
            key = self._fd_to_key.pop(self._fileobj_lookup(fileobj))
        except KeyError:
            raise KeyError("{0!r} is not registered".format(fileobj))

        # Getting the fileno of a closed socket on Windows errors with EBADF.
        except socket.error as e:  # Platform-specific: Windows.
            if e.errno != errno.EBADF:
                raise
            else:
                for key in self._fd_to_key.values():
                    if key.fileobj is fileobj:
                        self._fd_to_key.pop(key.fd)
                        break
                else:
                    raise KeyError("{0!r} is not registered".format(fileobj))
        return key

    def modify(self, fileobj, events, data=None):
        """ Change a registered file object monitored events and data. """
        # NOTE: Some subclasses optimize this operation even further.
        try:
            key = self._fd_to_key[self._fileobj_lookup(fileobj)]
        except KeyError:
            raise KeyError("{0!r} is not registered".format(fileobj))

        if events != key.events:
            self.unregister(fileobj)
            key = self.register(fileobj, events, data)

        elif data != key.data:
            # Use a shortcut to update the data.
            key = key._replace(data=data)
            self._fd_to_key[key.fd] = key

        return key

    def select(self, timeout=None):
        """ Perform the actual selection until some monitored file objects
        are ready or the timeout expires. """
        raise NotImplementedError()

    def close(self):
        """ Close the selector. This must be called to ensure that all
        underlying resources are freed. """
        self._fd_to_key.clear()
        self._map = None

    def get_key(self, fileobj):
        """ Return the key associated with a registered file object. """
        mapping = self.get_map()
        if mapping is None:
            raise RuntimeError("Selector is closed")
        try:
            return mapping[fileobj]
        except KeyError:
            raise KeyError("{0!r} is not registered".format(fileobj))

    def get_map(self):
        """ Return a mapping of file objects to selector keys """
        return self._map

    def _key_from_fd(self, fd):
        """ Return the key associated to a given file descriptor
         Return None if it is not found. """
        try:
            return self._fd_to_key[fd]
        except KeyError:
            return None

    def __enter__(self):
        return self

    def __exit__(self, *args):
        self.close()


# Almost all platforms have select.select()
if hasattr(select, "select"):
    class SelectSelector(BaseSelector):
        """ Select-based selector. """
        def __init__(self):
            super(SelectSelector, self).__init__()
            self._readers = set()
            self._writers = set()

        def register(self, fileobj, events, data=None):
            key = super(SelectSelector, self).register(fileobj, events, data)
            if events & EVENT_READ:
                self._readers.add(key.fd)
            if events & EVENT_WRITE:
                self._writers.add(key.fd)
            return key

        def unregister(self, fileobj):
            key = super(SelectSelector, self).unregister(fileobj)
            self._readers.discard(key.fd)
            self._writers.discard(key.fd)
            return key

        def _select(self, r, w, timeout=None):
            """ Wrapper for select.select because timeout is a positional arg """
            return select.select(r, w, [], timeout)

        def select(self, timeout=None):
            # Selecting on empty lists on Windows errors out.
            if not len(self._readers) and not len(self._writers):
                return []

            timeout = None if timeout is None else max(timeout, 0.0)
            ready = []
            r, w, _ = _syscall_wrapper(self._select, True, self._readers,
                                       self._writers, timeout)
            r = set(r)
            w = set(w)
            for fd in r | w:
                events = 0
                if fd in r:
                    events |= EVENT_READ
                if fd in w:
                    events |= EVENT_WRITE

                key = self._key_from_fd(fd)
                if key:
                    ready.append((key, events & key.events))
            return ready


if hasattr(select, "poll"):
    class PollSelector(BaseSelector):
        """ Poll-based selector """
        def __init__(self):
            super(PollSelector, self).__init__()
            self._poll = select.poll()

        def register(self, fileobj, events, data=None):
            key = super(PollSelector, self).register(fileobj, events, data)
            event_mask = 0
            if events & EVENT_READ:
                event_mask |= select.POLLIN
            if events & EVENT_WRITE:
                event_mask |= select.POLLOUT
            self._poll.register(key.fd, event_mask)
            return key

        def unregister(self, fileobj):
            key = super(PollSelector, self).unregister(fileobj)
            self._poll.unregister(key.fd)
            return key

        def _wrap_poll(self, timeout=None):
            """ Wrapper function for select.poll.poll() so that
            _syscall_wrapper can work with only seconds. """
            if timeout is not None:
                if timeout <= 0:
                    timeout = 0
                else:
                    # select.poll.poll() has a resolution of 1 millisecond,
                    # round away from zero to wait *at least* timeout seconds.
                    timeout = math.ceil(timeout * 1e3)

            result = self._poll.poll(timeout)
            return result

        def select(self, timeout=None):
            ready = []
            fd_events = _syscall_wrapper(self._wrap_poll, True, timeout=timeout)
            for fd, event_mask in fd_events:
                events = 0
                if event_mask & ~select.POLLIN:
                    events |= EVENT_WRITE
                if event_mask & ~select.POLLOUT:
                    events |= EVENT_READ

                key = self._key_from_fd(fd)
                if key:
                    ready.append((key, events & key.events))

            return ready


if hasattr(select, "epoll"):
    class EpollSelector(BaseSelector):
        """ Epoll-based selector """
        def __init__(self):
            super(EpollSelector, self).__init__()
            self._epoll = select.epoll()

        def fileno(self):
            return self._epoll.fileno()

        def register(self, fileobj, events, data=None):
            key = super(EpollSelector, self).register(fileobj, events, data)
            events_mask = 0
            if events & EVENT_READ:
                events_mask |= select.EPOLLIN
            if events & EVENT_WRITE:
                events_mask |= select.EPOLLOUT
            _syscall_wrapper(self._epoll.register, False, key.fd, events_mask)
            return key

        def unregister(self, fileobj):
            key = super(EpollSelector, self).unregister(fileobj)
            try:
                _syscall_wrapper(self._epoll.unregister, False, key.fd)
            except SelectorError:
                # This can occur if the fd was closed after it was registered.
                pass
            return key

        def select(self, timeout=None):
            if timeout is not None:
                if timeout <= 0:
                    timeout = 0.0
                else:
                    # select.epoll.poll() has a resolution of 1 millisecond
                    # but luckily takes seconds so we don't need a wrapper
                    # like PollSelector. Just for better rounding.
                    timeout = math.ceil(timeout * 1e3) * 1e-3
                timeout = float(timeout)
            else:
                timeout = -1.0  # epoll.poll() must have a float.

            # We always want at least 1 to ensure that select can be called
            # with no file descriptors registered. Otherwise it will fail.
            max_events = max(len(self._fd_to_key), 1)

            ready = []
            fd_events = _syscall_wrapper(self._epoll.poll, True,
                                         timeout=timeout,
                                         maxevents=max_events)
            for fd, event_mask in fd_events:
                events = 0
                if event_mask & ~select.EPOLLIN:
                    events |= EVENT_WRITE
                if event_mask & ~select.EPOLLOUT:
                    events |= EVENT_READ

                key = self._key_from_fd(fd)
                if key:
                    ready.append((key, events & key.events))
            return ready

        def close(self):
            self._epoll.close()
            super(EpollSelector, self).close()


if hasattr(select, "kqueue"):
    class KqueueSelector(BaseSelector):
        """ Kqueue / Kevent-based selector """
        def __init__(self):
            super(KqueueSelector, self).__init__()
            self._kqueue = select.kqueue()

        def fileno(self):
            return self._kqueue.fileno()

        def register(self, fileobj, events, data=None):
            key = super(KqueueSelector, self).register(fileobj, events, data)
            if events & EVENT_READ:
                kevent = select.kevent(key.fd,
                                       select.KQ_FILTER_READ,
                                       select.KQ_EV_ADD)

                _syscall_wrapper(self._kqueue.control, False, [kevent], 0, 0)

            if events & EVENT_WRITE:
                kevent = select.kevent(key.fd,
                                       select.KQ_FILTER_WRITE,
                                       select.KQ_EV_ADD)

                _syscall_wrapper(self._kqueue.control, False, [kevent], 0, 0)

            return key

        def unregister(self, fileobj):
            key = super(KqueueSelector, self).unregister(fileobj)
            if key.events & EVENT_READ:
                kevent = select.kevent(key.fd,
                                       select.KQ_FILTER_READ,
                                       select.KQ_EV_DELETE)
                try:
                    _syscall_wrapper(self._kqueue.control, False, [kevent], 0, 0)
                except SelectorError:
                    pass
            if key.events & EVENT_WRITE:
                kevent = select.kevent(key.fd,
                                       select.KQ_FILTER_WRITE,
                                       select.KQ_EV_DELETE)
                try:
                    _syscall_wrapper(self._kqueue.control, False, [kevent], 0, 0)
                except SelectorError:
                    pass

            return key

        def select(self, timeout=None):
            if timeout is not None:
                timeout = max(timeout, 0)

            max_events = len(self._fd_to_key) * 2
            ready_fds = {}

            kevent_list = _syscall_wrapper(self._kqueue.control, True,
                                           None, max_events, timeout)

            for kevent in kevent_list:
                fd = kevent.ident
                event_mask = kevent.filter
                events = 0
                if event_mask == select.KQ_FILTER_READ:
                    events |= EVENT_READ
                if event_mask == select.KQ_FILTER_WRITE:
                    events |= EVENT_WRITE

                key = self._key_from_fd(fd)
                if key:
                    if key.fd not in ready_fds:
                        ready_fds[key.fd] = (key, events & key.events)
                    else:
                        old_events = ready_fds[key.fd][1]
                        ready_fds[key.fd] = (key, (events | old_events) & key.events)

            return list(ready_fds.values())

        def close(self):
            self._kqueue.close()
            super(KqueueSelector, self).close()


if not hasattr(select, 'select'):  # Platform-specific: AppEngine
    HAS_SELECT = False


def _can_allocate(struct):
    """ Checks that select structs can be allocated by the underlying
    operating system, not just advertised by the select module. We don't
    check select() because we expect that most platforms which don't have
    it available will not advertise it (e.g. Google App Engine). """
    try:
        # select.poll() objects won't fail until used.
        if struct == 'poll':
            p = select.poll()
            p.poll(0)

        # All others will fail on allocation.
        else:
            getattr(select, struct)().close()
        return True
    except (OSError, AttributeError) as e:
        return False


# Choose the best implementation, roughly:
# kqueue == epoll > poll > select. Devpoll not supported. (See above)
# select() also can't accept a FD > FD_SETSIZE (usually around 1024)
def DefaultSelector():
    """ This function serves as a first call for DefaultSelector to
    detect if the select module is being monkey-patched incorrectly
    by eventlet, greenlet, and preserve proper behavior. """
    global _DEFAULT_SELECTOR
    if _DEFAULT_SELECTOR is None:
        if _can_allocate('kqueue'):
            _DEFAULT_SELECTOR = KqueueSelector
        elif _can_allocate('epoll'):
            _DEFAULT_SELECTOR = EpollSelector
        elif _can_allocate('poll'):
            _DEFAULT_SELECTOR = PollSelector
        elif hasattr(select, 'select'):
            _DEFAULT_SELECTOR = SelectSelector
        else:  # Platform-specific: AppEngine
            raise ValueError('Platform does not have a selector')
    return _DEFAULT_SELECTOR()
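

if __name__ == "__main__":
    # Illustrative sketch (not part of the vendored module): minimal use of the
    # backported selector API with a local socket pair (POSIX).
    left, right = socket.socketpair()
    left.sendall(b"hello")
    with DefaultSelector() as sel:
        sel.register(right, EVENT_READ, data="demo")
        for key, events in sel.select(timeout=1.0):
            # key.fileobj is `right`; key.data is whatever was attached above.
            print(key.fileobj.recv(16), key.data, events == EVENT_READ)
    left.close()
    right.close()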
site-packages/pip/_vendor/urllib3/util/connection.py000064400000010215147511334600016555 0ustar00from __future__ import absolute_import
import socket
from .wait import wait_for_read
from .selectors import HAS_SELECT, SelectorError


def is_connection_dropped(conn):  # Platform-specific
    """
    Returns True if the connection is dropped and should be closed.

    :param conn:
        :class:`httplib.HTTPConnection` object.

    Note: For platforms like AppEngine, this will always return ``False`` to
    let the platform handle connection recycling transparently for us.
    """
    sock = getattr(conn, 'sock', False)
    if sock is False:  # Platform-specific: AppEngine
        return False
    if sock is None:  # Connection already closed (such as by httplib).
        return True

    if not HAS_SELECT:
        return False

    try:
        return bool(wait_for_read(sock, timeout=0.0))
    except SelectorError:
        return True
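
# Illustrative example: urllib3's connection pool uses this check before reusing
# a pooled connection (``conn`` is a hypothetical HTTPConnection):
#
#   if is_connection_dropped(conn):
#       conn.close()    # discard it and open a fresh connection instead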


# This function is copied from socket.py in the Python 2.7 standard
# library test suite. Added to its signature is only `socket_options`.
# One additional modification is that we avoid binding to IPv6 servers
# discovered in DNS if the system doesn't have IPv6 functionality.
def create_connection(address, timeout=socket._GLOBAL_DEFAULT_TIMEOUT,
                      source_address=None, socket_options=None):
    """Connect to *address* and return the socket object.

    Convenience function.  Connect to *address* (a 2-tuple ``(host,
    port)``) and return the socket object.  Passing the optional
    *timeout* parameter will set the timeout on the socket instance
    before attempting to connect.  If no *timeout* is supplied, the
    global default timeout setting returned by :func:`getdefaulttimeout`
    is used.  If *source_address* is set it must be a tuple of (host, port)
    for the socket to bind as a source address before making the connection.
    A host of '' or port 0 tells the OS to use the default.
    """

    host, port = address
    if host.startswith('['):
        host = host.strip('[]')
    err = None

    # Using the value from allowed_gai_family() in the context of getaddrinfo lets
    # us select whether to work with IPv4 DNS records, IPv6 records, or both.
    # The original create_connection function always returns all records.
    family = allowed_gai_family()

    for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
        af, socktype, proto, canonname, sa = res
        sock = None
        try:
            sock = socket.socket(af, socktype, proto)

            # If provided, set socket level options before connecting.
            _set_socket_options(sock, socket_options)

            if timeout is not socket._GLOBAL_DEFAULT_TIMEOUT:
                sock.settimeout(timeout)
            if source_address:
                sock.bind(source_address)
            sock.connect(sa)
            return sock

        except socket.error as e:
            err = e
            if sock is not None:
                sock.close()
                sock = None

    if err is not None:
        raise err

    raise socket.error("getaddrinfo returns an empty list")
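
# Illustrative example: urllib3 passes socket options here so they are applied
# before connect(), e.g. disabling Nagle's algorithm:
#
#   sock = create_connection(("example.com", 443), timeout=3.0,
#                            socket_options=[(socket.IPPROTO_TCP, socket.TCP_NODELAY, 1)])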


def _set_socket_options(sock, options):
    if options is None:
        return

    for opt in options:
        sock.setsockopt(*opt)


def allowed_gai_family():
    """This function is designed to work in the context of
    getaddrinfo, where family=socket.AF_UNSPEC is the default and
    will perform a DNS search for both IPv6 and IPv4 records."""

    family = socket.AF_INET
    if HAS_IPV6:
        family = socket.AF_UNSPEC
    return family


def _has_ipv6(host):
    """ Returns True if the system can bind an IPv6 address. """
    sock = None
    has_ipv6 = False

    if socket.has_ipv6:
        # has_ipv6 returns true if cPython was compiled with IPv6 support.
        # It does not tell us if the system has IPv6 support enabled. To
        # determine that we must bind to an IPv6 address.
        # https://github.com/shazow/urllib3/pull/611
        # https://bugs.python.org/issue658327
        try:
            sock = socket.socket(socket.AF_INET6)
            sock.bind((host, 0))
            has_ipv6 = True
        except Exception:
            pass

    if sock:
        sock.close()
    return has_ipv6


HAS_IPV6 = _has_ipv6('::1')
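
# Illustrative example: on a host where the IPv6 probe above fails,
# allowed_gai_family() returns AF_INET, so getaddrinfo() only yields A records:
#
#   socket.getaddrinfo("example.com", 443, allowed_gai_family(), socket.SOCK_STREAM)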
site-packages/pip/_vendor/urllib3/util/url.py000064400000015216147511334600015226 0ustar00from __future__ import absolute_import
from collections import namedtuple
import re

from ..exceptions import LocationParseError


url_attrs = ['scheme', 'auth', 'host', 'port', 'path', 'query', 'fragment']

# We only want to normalize urls with an HTTP(S) scheme.
# urllib3 infers URLs without a scheme (None) to be http.
NORMALIZABLE_SCHEMES = ('http', 'https', None)

from ..packages.six.moves.urllib.parse import quote

_contains_disallowed_url_pchar_re = re.compile('[\x00-\x20\x7f]')


class Url(namedtuple('Url', url_attrs)):
    """
    Datastructure for representing an HTTP URL. Used as a return value for
    :func:`parse_url`. Both the scheme and host are normalized as they are
    both case-insensitive according to RFC 3986.
    """
    __slots__ = ()

    def __new__(cls, scheme=None, auth=None, host=None, port=None, path=None,
                query=None, fragment=None):
        if path and not path.startswith('/'):
            path = '/' + path
        if scheme:
            scheme = scheme.lower()
        if host and scheme in NORMALIZABLE_SCHEMES:
            host = host.lower()
        return super(Url, cls).__new__(cls, scheme, auth, host, port, path,
                                       query, fragment)

    @property
    def hostname(self):
        """For backwards-compatibility with urlparse. We're nice like that."""
        return self.host

    @property
    def request_uri(self):
        """Absolute path including the query string."""
        uri = self.path or '/'

        if self.query is not None:
            uri += '?' + self.query

        return uri

    @property
    def netloc(self):
        """Network location including host and port"""
        if self.port:
            return '%s:%d' % (self.host, self.port)
        return self.host

    @property
    def url(self):
        """
        Convert self into a url

        This function should more or less round-trip with :func:`.parse_url`. The
        returned url may not be exactly the same as the url inputted to
        :func:`.parse_url`, but it should be equivalent by the RFC (e.g., urls
        with a blank port will have : removed).

        Example: ::

            >>> U = parse_url('http://google.com/mail/')
            >>> U.url
            'http://google.com/mail/'
            >>> Url('http', 'username:password', 'host.com', 80,
            ... '/path', 'query', 'fragment').url
            'http://username:password@host.com:80/path?query#fragment'
        """
        scheme, auth, host, port, path, query, fragment = self
        url = ''

        # We use "is not None" we want things to happen with empty strings (or 0 port)
        if scheme is not None:
            url += scheme + '://'
        if auth is not None:
            url += auth + '@'
        if host is not None:
            url += host
        if port is not None:
            url += ':' + str(port)
        if path is not None:
            url += path
        if query is not None:
            url += '?' + query
        if fragment is not None:
            url += '#' + fragment

        return url

    def __str__(self):
        return self.url


def split_first(s, delims):
    """
    Given a string and an iterable of delimiters, split on the first found
    delimiter. Return two split parts and the matched delimiter.

    If not found, then the first part is the full input string.

    Example::

        >>> split_first('foo/bar?baz', '?/=')
        ('foo', 'bar?baz', '/')
        >>> split_first('foo/bar?baz', '123')
        ('foo/bar?baz', '', None)

    Scales linearly with number of delims. Not ideal for large number of delims.
    """
    min_idx = None
    min_delim = None
    for d in delims:
        idx = s.find(d)
        if idx < 0:
            continue

        if min_idx is None or idx < min_idx:
            min_idx = idx
            min_delim = d

    if min_idx is None or min_idx < 0:
        return s, '', None

    return s[:min_idx], s[min_idx + 1:], min_delim


def parse_url(url):
    """
    Given a url, return a parsed :class:`.Url` namedtuple. Best-effort is
    performed to parse incomplete urls. Fields not provided will be None.

    Partly backwards-compatible with :mod:`urlparse`.

    Example::

        >>> parse_url('http://google.com/mail/')
        Url(scheme='http', host='google.com', port=None, path='/mail/', ...)
        >>> parse_url('google.com:80')
        Url(scheme=None, host='google.com', port=80, path=None, ...)
        >>> parse_url('/foo?bar')
        Url(scheme=None, host=None, port=None, path='/foo', query='bar', ...)
    """

    # While this code has overlap with stdlib's urlparse, it is much
    # simplified for our needs and less annoying.
    # Additionally, this implementation does silly things to be optimal
    # on CPython.

    if not url:
        # Empty
        return Url()

    # Prevent CVE-2019-9740.
    # adapted from https://github.com/python/cpython/pull/12755
    url = _contains_disallowed_url_pchar_re.sub(lambda match: quote(match.group()), url)

    scheme = None
    auth = None
    host = None
    port = None
    path = None
    fragment = None
    query = None

    # Scheme
    if '://' in url:
        scheme, url = url.split('://', 1)

    # Find the earliest Authority Terminator
    # (http://tools.ietf.org/html/rfc3986#section-3.2)
    url, path_, delim = split_first(url, ['/', '?', '#'])

    if delim:
        # Reassemble the path
        path = delim + path_

    # Auth
    if '@' in url:
        # Last '@' denotes end of auth part
        auth, url = url.rsplit('@', 1)

    # IPv6
    if url and url[0] == '[':
        host, url = url.split(']', 1)
        host += ']'

    # Port
    if ':' in url:
        _host, port = url.split(':', 1)

        if not host:
            host = _host

        if port:
            # If given, ports must be integers. No whitespace, no plus or
            # minus prefixes, no non-integer digits such as ^2 (superscript).
            if not port.isdigit():
                raise LocationParseError(url)
            try:
                port = int(port)
            except ValueError:
                raise LocationParseError(url)
        else:
            # Blank ports are cool, too. (rfc3986#section-3.2.3)
            port = None

    elif not host and url:
        host = url

    if not path:
        return Url(scheme, auth, host, port, path, query, fragment)

    # Fragment
    if '#' in path:
        path, fragment = path.split('#', 1)

    # Query
    if '?' in path:
        path, query = path.split('?', 1)

    return Url(scheme, auth, host, port, path, query, fragment)
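
# Illustrative example: the pre-parse quoting above neutralises header-injection
# attempts via control characters (CVE-2019-9740); output shown is approximate:
#
#   >>> parse_url('http://h/ HTTP/1.1\r\nX: y').url
#   'http://h/%20HTTP/1.1%0D%0AX:%20y'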


def get_host(url):
    """
    Deprecated. Use :func:`parse_url` instead.
    """
    p = parse_url(url)
    return p.scheme or 'http', p.hostname, p.port
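

if __name__ == "__main__":
    # Illustrative sketch (not part of the vendored module): round-trip a URL.
    u = parse_url("https://user:pw@example.com:8443/a/b?x=1#frag")
    print(u.scheme, u.auth, u.host, u.port)            # https user:pw example.com 8443
    print(u.path, u.query, u.fragment)                 # /a/b x=1 frag
    print(u.url)                                       # reassembled form of the input
    print(get_host("http://example.com/index.html"))   # ('http', 'example.com', None)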
site-packages/pip/_vendor/urllib3/util/__pycache__/url.cpython-36.opt-1.pyc000064400000012430147511334600022444 0ustar003

���e��@s�ddlmZddlmZddlZddlmZdddd	d
ddgZdZej	d�Z
ddlmZGdd�dede��Z
dd�Zdd�Zdd�ZdS)�)�absolute_import)�
namedtupleN�)�LocationParseError�scheme�auth�host�port�path�query�fragment�http�httpsz[- ])�quotecs^eZdZdZfZd�fdd�	Zedd��Zedd��Zed	d
��Z	edd��Z
d
d�Z�ZS)�Urlz�
    Datastructure for representing an HTTP URL. Used as a return value for
    :func:`parse_url`. Both the scheme and host are normalized as they are
    both case-insensitive according to RFC 3986.
    Nc	sV|r|jd�rd|}|r$|j�}|r8|tkr8|j�}tt|�j||||||||�S)N�/)�
startswith�lower�NORMALIZABLE_SCHEMES�superr�__new__)�clsrrrr	r
rr)�	__class__��/usr/lib/python3.6/url.pyrszUrl.__new__cCs|jS)z@For backwards-compatibility with urlparse. We're nice like that.)r)�selfrrr�hostname$szUrl.hostnamecCs&|jpd}|jdk	r"|d|j7}|S)z)Absolute path including the query string.rN�?)r
r)rZurirrr�request_uri)s

zUrl.request_uricCs|jrd|j|jfS|jS)z(Network location including host and portz%s:%d)r	r)rrrr�netloc3sz
Url.netlocc	Cs�|\}}}}}}}d}|dk	r*||d7}|dk	r>||d7}|dk	rN||7}|dk	rf|dt|�7}|dk	rv||7}|dk	r�|d|7}|dk	r�|d|7}|S)a�
        Convert self into a url

        This function should more or less round-trip with :func:`.parse_url`. The
        returned url may not be exactly the same as the url inputted to
        :func:`.parse_url`, but it should be equivalent by the RFC (e.g., urls
        with a blank port will have : removed).

        Example: ::

            >>> U = parse_url('http://google.com/mail/')
            >>> U.url
            'http://google.com/mail/'
            >>> Url('http', 'username:password', 'host.com', 80,
            ... '/path', 'query', 'fragment').url
            'http://username:password@host.com:80/path?query#fragment'
        �Nz://�@�:r�#)�str)	rrrrr	r
rr�urlrrrr%:s"zUrl.urlcCs|jS)N)r%)rrrr�__str__bszUrl.__str__)NNNNNNN)
�__name__�
__module__�__qualname__�__doc__�	__slots__r�propertyrrrr%r&�
__classcell__rr)rrrs

(rcCszd}d}x8|D]0}|j|�}|dkr&q|dks6||kr|}|}qW|dksR|dkr\|ddfS|d|�||dd�|fS)a�
    Given a string and an iterable of delimiters, split on the first found
    delimiter. Return two split parts and the matched delimiter.

    If not found, then the first part is the full input string.

    Example::

        >>> split_first('foo/bar?baz', '?/=')
        ('foo', 'bar?baz', '/')
        >>> split_first('foo/bar?baz', '123')
        ('foo/bar?baz', '', None)

    Scales linearly with number of delims. Not ideal for large number of delims.
    Nrr �)�find)�sZdelimsZmin_idxZ	min_delim�d�idxrrr�split_firstfs


r3cCs�|s
t�Stjdd�|�}d}d}d}d}d}d}d}d|krN|jdd�\}}t|dddg�\}}}	|	rp|	|}d	|kr�|jd	d�\}}|r�|d
dkr�|jdd�\}}|d7}d
|k�r|jd
d�\}
}|s�|
}|�r|j�s�t|��yt|�}Wnt	k
�rt|��YnXnd}n|�r.|�r.|}|�sHt|||||||�Sd|k�rb|jdd�\}}d|k�r||jdd�\}}t|||||||�S)a:
    Given a url, return a parsed :class:`.Url` namedtuple. Best-effort is
    performed to parse incomplete urls. Fields not provided will be None.

    Partly backwards-compatible with :mod:`urlparse`.

    Example::

        >>> parse_url('http://google.com/mail/')
        Url(scheme='http', host='google.com', port=None, path='/mail/', ...)
        >>> parse_url('google.com:80')
        Url(scheme=None, host='google.com', port=80, path=None, ...)
        >>> parse_url('/foo?bar')
        Url(scheme=None, host=None, port=None, path='/foo', query='bar', ...)
    cSst|j��S)N)r�group)�matchrrr�<lambda>�szparse_url.<locals>.<lambda>Nz://r.rrr#r!r�[�]r")
r�!_contains_disallowed_url_pchar_re�sub�splitr3�rsplit�isdigitr�int�
ValueError)r%rrrr	r
rrZpath_ZdelimZ_hostrrr�	parse_url�sR


r@cCst|�}|jpd|j|jfS)z4
    Deprecated. Use :func:`parse_url` instead.
    r
)r@rrr	)r%�prrr�get_host�srB)r
rN)Z
__future__r�collectionsr�re�
exceptionsrZ	url_attrsr�compiler9Zpackages.six.moves.urllib.parserrr3r@rBrrrr�<module>s
U!asite-packages/pip/_vendor/urllib3/util/__pycache__/response.cpython-36.opt-1.pyc000064400000003433147511334600023503 0ustar003

���e'	�@s@ddlmZddlmZddlmZdd�Zdd�Zd	d
�Z	dS)�)�absolute_import�)�http_client)�HeaderParsingErrorcCsfy|j�Stk
rYnXy|jStk
r8YnXy
|jdkStk
rXYnXtd��dS)zt
    Checks whether a given file-like object is closed.

    :param obj:
        The file-like object to check.
    Nz)Unable to determine whether fp is closed.)Zisclosed�AttributeError�closed�fp�
ValueError)�obj�r�/usr/lib/python3.6/response.py�is_fp_closeds
r
cCs\t|tj�stdjt|����t|dd�}t|dd�}d}|rD|�}|sL|rXt||d��dS)aP
    Asserts whether all headers have been successfully parsed.
    Extracts encountered errors from the result of parsing headers.

    Only works on Python 3.

    :param headers: Headers to verify.
    :type headers: `httplib.HTTPMessage`.

    :raises urllib3.exceptions.HeaderParsingError:
        If parsing errors are found.
    z"expected httplib.Message, got {0}.�defectsN�get_payload)r�
unparsed_data)�
isinstance�httplibZHTTPMessage�	TypeError�format�type�getattrr)Zheadersrrrrrr�assert_header_parsing&srcCs$|j}t|t�r|dkS|j�dkS)z�
    Checks whether the request of a response has been a HEAD-request.
    Handles the quirks of AppEngine.

    :param conn:
    :type conn: :class:`httplib.HTTPResponse`
    �ZHEAD)�_methodr�int�upper)Zresponse�methodrrr�is_response_to_headEs	
rN)
Z
__future__rZpackages.six.movesrr�
exceptionsrr
rrrrrr�<module>s
site-packages/pip/_vendor/urllib3/util/__pycache__/ssl_.cpython-36.opt-1.pyc000064400000021265147511334600022610 0ustar003

���e�/�!@s�ddlmZddlZddlZddlZddlmZmZddlm	Z	m
Z
mZddlm
Z
mZmZdZdZdZdZe	e
ed�Zd	d
�Zeede�Zy,ddlZddlmZmZmZdd
lmZWnek
r�YnXyddlmZmZmZWn"ek
�rd0\ZZdZYnXdj dddddddddddddd d!d"g�Z!ydd#lmZWn.ek
�rrddl"Z"Gd$d%�d%e#�ZYnXd&d'�Z$d(d)�Z%d*d+�Z&d1d,d-�Z'd2d.d/�Z(dS)3�)�absolute_importN)�hexlify�	unhexlify)�md5�sha1�sha256�)�SSLError�InsecurePlatformWarning�SNIMissingWarningF)� �(�@cCsHtt|�t|��}x*tt|�t|��D]\}}|||AO}q(W|dkS)z�
    Compare two digests of equal length in constant time.

    The digests must be of type str/bytes.
    Returns True if the digests match, and False otherwise.
    r)�abs�len�zip�	bytearray)�a�b�result�l�r�r�/usr/lib/python3.6/ssl_.py�_const_compare_digest_backportsrZcompare_digest)�wrap_socket�	CERT_NONE�PROTOCOL_SSLv23)�HAS_SNI)�OP_NO_SSLv2�OP_NO_SSLv3�OP_NO_COMPRESSION��i�:zTLS13-AES-256-GCM-SHA384zTLS13-CHACHA20-POLY1305-SHA256zTLS13-AES-128-GCM-SHA256zECDH+AESGCMz
ECDH+CHACHA20z	DH+AESGCMzDH+CHACHA20zECDH+AES256z	DH+AES256zECDH+AES128zDH+AESz
RSA+AESGCMzRSA+AESz!aNULLz!eNULLz!MD5)�
SSLContextc@s\eZdZdejkodknp*dejkZdd�Zdd�Zdd	d
�Zdd�Z	ddd�Z
dS)r%r��cCs6||_d|_tj|_d|_d|_d|_d|_d|_	dS)NFr)
�protocol�check_hostname�sslr�verify_mode�ca_certs�options�certfile�keyfile�ciphers)�selfZprotocol_versionrrr�__init__cszSSLContext.__init__cCs||_||_dS)N)r.r/)r1r.r/rrr�load_cert_chainnszSSLContext.load_cert_chainNcCs||_|dk	rtd��dS)Nz-CA directories not supported in older Pythons)r,r	)r1ZcafileZcapathrrr�load_verify_locationsrsz SSLContext.load_verify_locationscCs|jstd��||_dS)Nz�Your version of Python does not support setting a custom cipher suite. Please upgrade to Python 2.7, 3.2, or later if you need this functionality.)�supports_set_ciphers�	TypeErrorr0)r1Zcipher_suiterrr�set_ciphersxszSSLContext.set_ciphersFcCsTtjdt�|j|j|j|j|j|d�}|jrDt	|fd|j
i|��St	|f|�SdS)Na2A true SSLContext object is not available. This prevents urllib3 from configuring SSL appropriately and may cause certain SSL connections to fail. You can upgrade to a newer version of Python to solve this. For more information, see https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings)r/r.r,�	cert_reqs�ssl_version�server_sider0)�warnings�warnr
r/r.r,r+r(r5rr0)r1Zsocket�server_hostnamer:�kwargsrrrr�szSSLContext.wrap_socket)rr&)r')r'r)NN)NF)�__name__�
__module__�__qualname__�sys�version_infor5r2r3r4r7rrrrrr%_s

	r%cCsn|jdd�j�}t|�}tj|�}|s4tdj|���t|j��}||�j	�}t
||�sjtdj|t|����dS)z�
    Checks if given fingerprint matches the supplied certificate.

    :param cert:
        Certificate as bytes object.
    :param fingerprint:
        Fingerprint as string of hexdigits, can be interspersed by colons.
    r$�z"Fingerprint of invalid length: {0}z6Fingerprints did not match. Expected "{0}", got "{1}".N)�replace�lowerr�HASHFUNC_MAP�getr	�formatr�encodeZdigest�_const_compare_digestr)ZcertZfingerprintZ
digest_lengthZhashfuncZfingerprint_bytesZcert_digestrrr�assert_fingerprint�s


rLcCs@|dkrtSt|t�r<tt|d�}|dkr8ttd|�}|S|S)a�
    Resolves the argument to a numeric constant, which can be passed to
    the wrap_socket function/method from the ssl module.
    Defaults to :data:`ssl.CERT_NONE`.
    If given a string it is assumed to be the name of the constant in the
    :mod:`ssl` module or its abbrevation.
    (So you can specify `REQUIRED` instead of `CERT_REQUIRED`.
    If it's neither `None` nor a string we assume it is already the numeric
    constant which can directly be passed to wrap_socket.
    NZCERT_)r�
isinstance�str�getattrr*)�	candidate�resrrr�resolve_cert_reqs�s
rRcCs@|dkrtSt|t�r<tt|d�}|dkr8ttd|�}|S|S)z 
    like resolve_cert_reqs
    NZ	PROTOCOL_)rrMrNrOr*)rPrQrrr�resolve_ssl_version�s
rScCs�t|p
tj�}|dkrtjn|}|dkrDd}|tO}|tO}|tO}|j|O_t|dd�rl|j	|pht
�||_t|dd�dk	r�d|_|S)a�All arguments have the same meaning as ``ssl_wrap_socket``.

    By default, this function does a lot of the same work that
    ``ssl.create_default_context`` does on Python 3.4+. It:

    - Disables SSLv2, SSLv3, and compression
    - Sets a restricted set of server ciphers

    If you wish to enable SSLv3, you can do::

        from urllib3.util import ssl_
        context = ssl_.create_urllib3_context()
        context.options &= ~ssl_.OP_NO_SSLv3

    You can do the same to enable compression (substituting ``COMPRESSION``
    for ``SSLv3`` in the last line above).

    :param ssl_version:
        The desired protocol version to use. This will default to
        PROTOCOL_SSLv23 which will negotiate the highest protocol that both
        the server and your installation of OpenSSL support.
    :param cert_reqs:
        Whether to require the certificate verification. This defaults to
        ``ssl.CERT_REQUIRED``.
    :param options:
        Specific OpenSSL options. These default to ``ssl.OP_NO_SSLv2``,
        ``ssl.OP_NO_SSLv3``, ``ssl.OP_NO_COMPRESSION``.
    :param ciphers:
        Which cipher suites to allow the server to select.
    :returns:
        Constructed SSLContext object with specified options
    :rtype: SSLContext
    Nrr5Tr)F)
r%r*rZ
CERT_REQUIREDrr r!r-rOr7�DEFAULT_CIPHERSr+r))r9r8r-r0�contextrrr�create_urllib3_context�s#rVc
Cs�|}
|
dkrt|||d�}
|s"|	r�y|
j||	�Wq�tk
r\}zt|��WYdd}~Xq�tk
r�}z|jtjkr�t|���WYdd}~Xq�Xn|dkr�t|
d�r�|
j�|r�|
j	||�t
r�|
j||d�Stj
dt�|
j|�S)a
    All arguments except for server_hostname, ssl_context, and ca_cert_dir have
    the same meaning as they do when using :func:`ssl.wrap_socket`.

    :param server_hostname:
        When SNI is supported, the expected hostname of the certificate
    :param ssl_context:
        A pre-made :class:`SSLContext` object. If none is provided, one will
        be created using :func:`create_urllib3_context`.
    :param ciphers:
        A string of ciphers we wish the client to support. This is not
        supported on Python 2.6 as the ssl module does not support it.
    :param ca_cert_dir:
        A directory containing CA certificates in multiple separate files, as
        supported by OpenSSL's -CApath flag or the capath argument to
        SSLContext.load_verify_locations().
    N)r0�load_default_certs)r=a�An HTTPS request has been made, but the SNI (Subject Name Indication) extension to TLS is not available on this platform. This may cause the server to present an incorrect TLS certificate, which can cause validation failures. You can upgrade to a newer version of Python to solve this. For more information, see https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings)rVr4�IOErrorr	�OSError�errno�ENOENT�hasattrrWr3rrr;r<r)Zsockr/r.r8r,r=r9r0Zssl_contextZca_cert_dirrU�errr�ssl_wrap_sockets.r^)r"r#)NNNN)	NNNNNNNNN))Z
__future__rrZr;ZhmacZbinasciirrZhashlibrrr�
exceptionsr	r
rr%rZIS_PYOPENSSLZIS_SECURETRANSPORTrGrrOrKr*rrr�ImportErrorrr r!�joinrTrB�objectrLrRrSrVr^rrrr�<module>st

:
>site-packages/pip/_vendor/urllib3/util/__pycache__/retry.cpython-36.pyc000064400000030433147511334600022053 0ustar003

���e;�@s�ddlmZddlZddlZddlmZddlmZddlZddl	Z	ddl
mZmZm
Z
mZmZmZddlmZeje�Zedd	d
ddd
g�ZGdd�de�Zed�e_dS)�)�absolute_importN)�
namedtuple)�	takewhile�)�ConnectTimeoutError�
MaxRetryError�
ProtocolError�ReadTimeoutError�
ResponseError�
InvalidHeader)�six�RequestHistory�method�url�error�status�redirect_locationc
@s�eZdZdZeddddddg�Zedg�Zed	d
dg�ZdZd
ddddeddddddef
dd�Z	dd�Z
ed2dd��Zdd�Z
dd�Zdd�Zd3dd�Zdd �Zd4d!d"�Zd#d$�Zd%d&�Zd'd(�Zd5d*d+�Zd,d-�Zd6d.d/�Zd0d1�ZdS)7�Retrya2 Retry configuration.

    Each retry attempt will create a new Retry object with updated values, so
    they can be safely reused.

    Retries can be defined as a default for a pool::

        retries = Retry(connect=5, read=2, redirect=5)
        http = PoolManager(retries=retries)
        response = http.request('GET', 'http://example.com/')

    Or per-request (which overrides the default for the pool)::

        response = http.request('GET', 'http://example.com/', retries=Retry(10))

    Retries can be disabled by passing ``False``::

        response = http.request('GET', 'http://example.com/', retries=False)

    Errors will be wrapped in :class:`~urllib3.exceptions.MaxRetryError` unless
    retries are disabled, in which case the causing exception will be raised.

    :param int total:
        Total number of retries to allow. Takes precedence over other counts.

        Set to ``None`` to remove this constraint and fall back on other
        counts. It's a good idea to set this to some sensibly-high value to
        account for unexpected edge cases and avoid infinite retry loops.

        Set to ``0`` to fail on the first retry.

        Set to ``False`` to disable and imply ``raise_on_redirect=False``.

    :param int connect:
        How many connection-related errors to retry on.

        These are errors raised before the request is sent to the remote server,
        which we assume has not triggered the server to process the request.

        Set to ``0`` to fail on the first retry of this type.

    :param int read:
        How many times to retry on read errors.

        These errors are raised after the request was sent to the server, so the
        request may have side-effects.

        Set to ``0`` to fail on the first retry of this type.

    :param int redirect:
        How many redirects to perform. Limit this to avoid infinite redirect
        loops.

        A redirect is a HTTP response with a status code 301, 302, 303, 307 or
        308.

        Set to ``0`` to fail on the first retry of this type.

        Set to ``False`` to disable and imply ``raise_on_redirect=False``.

    :param int status:
        How many times to retry on bad status codes.

        These are retries made on responses, where status code matches
        ``status_forcelist``.

        Set to ``0`` to fail on the first retry of this type.

    :param iterable method_whitelist:
        Set of uppercased HTTP method verbs that we should retry on.

        By default, we only retry on methods which are considered to be
        idempotent (multiple requests with the same parameters end with the
        same state). See :attr:`Retry.DEFAULT_METHOD_WHITELIST`.

        Set to a ``False`` value to retry on any verb.

    :param iterable status_forcelist:
        A set of integer HTTP status codes that we should force a retry on.
        A retry is initiated if the request method is in ``method_whitelist``
        and the response status code is in ``status_forcelist``.

        By default, this is disabled with ``None``.

    :param float backoff_factor:
        A backoff factor to apply between attempts after the second try
        (most errors are resolved immediately by a second try without a
        delay). urllib3 will sleep for::

            {backoff factor} * (2 ^ ({number of total retries} - 1))

        seconds. If the backoff_factor is 0.1, then :func:`.sleep` will sleep
        for [0.0s, 0.2s, 0.4s, ...] between retries. It will never be longer
        than :attr:`Retry.BACKOFF_MAX`.

        By default, backoff is disabled (set to 0).

    :param bool raise_on_redirect: Whether, if the number of redirects is
        exhausted, to raise a MaxRetryError, or to return a response with a
        response code in the 3xx range.

    :param iterable remove_headers_on_redirect:
        Sequence of headers to remove from the request when a response
        indicating a redirect is returned before firing off the redirected
        request

    :param bool raise_on_status: Similar meaning to ``raise_on_redirect``:
        whether we should raise an exception, or return a response,
        if status falls in ``status_forcelist`` range and retries have
        been exhausted.

    :param tuple history: The history of the request encountered during
        each call to :meth:`~Retry.increment`. The list is in the order
        the requests occurred. Each list item is of class :class:`RequestHistory`.

    :param bool respect_retry_after_header:
        Whether to respect Retry-After header on status codes defined as
        :attr:`Retry.RETRY_AFTER_STATUS_CODES` or not.

    ZHEADZGETZPUTZDELETEZOPTIONSZTRACEZ
Authorizationi�i�i��x�
NrTcCsv||_||_||_||_|dks(|dkr0d}d}	||_|p>t�|_||_||_|	|_	|
|_
|pbt�|_||_
|
|_dS)NFr)�total�connect�readr�redirect�set�status_forcelist�method_whitelist�backoff_factor�raise_on_redirect�raise_on_status�tuple�history�respect_retry_after_header�remove_headers_on_redirect)�selfrrrrrrrrrrr!r"r#�r%�/usr/lib/python3.6/retry.py�__init__�s zRetry.__init__cKsPt|j|j|j|j|j|j|j|j|j	|j
|j|jd�}|j
|�t|�f|�S)N)rrrrrrrrrrr!r#)�dictrrrrrrrrrrr!r#�update�type)r$�kwZparamsr%r%r&�new�s

z	Retry.newcCsR|dkr|dk	r|n|j}t|t�r(|St|�o2d}|||d�}tjd||�|S)z4 Backwards-compatibility for the old retries format.N)rz!Converted retries value: %r -> %r)�DEFAULT�
isinstancer�bool�log�debug)�clsZretriesr�defaultZnew_retriesr%r%r&�from_int�s
zRetry.from_intcCsFtttdd�t|j����}|dkr(dS|jd|d}t|j|�S)zJ Formula for computing the current backoff

        :rtype: float
        cSs
[remainder of compiled CPython 3.6 bytecode for urllib3's retry helper (retry.cpython-36.pyc) - binary payload omitted. The docstrings recoverable from this tail cover: get_backoff_time() and its BACKOFF_MAX cap; parse_retry_after(), which accepts either an integer number of seconds or an HTTP date from a Retry-After header; get_retry_after() and sleep_for_retry(); sleep(), which honours a server's Retry-After header and otherwise falls back to exponential backoff, returning immediately when the backoff factor is 0; the _is_connection_error / _is_read_error / _is_method_retryable / is_retry checks against the method whitelist and retryable status codes; is_exhausted(); increment(), which returns a new Retry object with the relevant counter decremented and raises once retries are exhausted; and __repr__, which reports the total, connect, read, redirect and status counters.]
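The docstrings above describe how the retry policy is driven. A minimal usage sketch, assuming a standard urllib3 install (pip vendors the same code under pip._vendor.urllib3); the counter values are illustrative:

    import urllib3
    from urllib3.util.retry import Retry

    # Retry up to 5 times overall, with exponential backoff between attempts;
    # the listed 5xx status codes are retried as well.
    retry = Retry(total=5, connect=3, read=3, backoff_factor=0.2,
                  status_forcelist=[500, 502, 503, 504])

    http = urllib3.PoolManager(retries=retry)
    response = http.request('GET', 'http://example.com/')
    print(response.status)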
site-packages/pip/_vendor/urllib3/util/__pycache__/selectors.cpython-36.pyc
[compiled CPython 3.6 bytecode - binary payload omitted. Recoverable docstrings describe a backport of the selectors module used by urllib3: SelectorError, _fileobj_to_fd() (map a file object to its descriptor), _syscall_wrapper() (retry system calls interrupted by EINTR and recalculate timeouts per PEP 475), the SelectorKey namedtuple and _SelectorMapping, the abstract BaseSelector (register / unregister / modify / select / close / get_key / get_map), the concrete SelectSelector, PollSelector, EpollSelector and KqueueSelector implementations, _can_allocate() (check that select structures can actually be allocated, e.g. not on GAE), and DefaultSelector(), which picks the most efficient implementation available and guards against a monkey-patched select module.]
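These docstrings mirror the standard library's selectors interface, which this vendored module backports for urllib3's wait helpers. A small sketch of that interface, using the stdlib selectors module (the vendored copy exposes the same DefaultSelector / EVENT_READ / EVENT_WRITE names):

    import selectors
    import socket

    sel = selectors.DefaultSelector()

    server = socket.socket()
    server.bind(('127.0.0.1', 0))
    server.listen()
    server.setblocking(False)

    # Register the listening socket and poll for readiness for up to 1 second.
    sel.register(server, selectors.EVENT_READ, data='accept')
    for key, events in sel.select(timeout=1.0):
        if events & selectors.EVENT_READ:
            print('ready to accept on', key.fileobj)

    sel.unregister(server)
    sel.close()
    server.close()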
site-packages/pip/_vendor/urllib3/util/__pycache__/url.cpython-36.pyc
[compiled CPython 3.6 bytecode - binary payload omitted. Recoverable docstrings describe the Url namedtuple (scheme, auth, host, port, path, query, fragment; scheme and host are normalised per RFC 3986 and the .url property round-trips back to a string), split_first() (split a string on the first of several delimiters), parse_url() (best-effort parsing of possibly incomplete URLs, e.g. parse_url('http://google.com/mail/'), parse_url('google.com:80'), parse_url('/foo?bar')), and the deprecated get_host() helper.]
site-packages/pip/_vendor/urllib3/util/__pycache__/selectors.cpython-36.opt-1.pyc
[optimized build of selectors.cpython-36.pyc above - duplicate binary payload omitted.]
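The parse_url docstring in url.cpython-36.pyc above gives concrete examples; restated as a runnable sketch (the vendored import path is assumed from the directory layout shown here):

    from pip._vendor.urllib3.util.url import parse_url, Url

    print(parse_url('http://google.com/mail/'))
    # Url(scheme='http', host='google.com', port=None, path='/mail/', ...)

    print(parse_url('google.com:80'))
    # Url(scheme=None, host='google.com', port=80, path=None, ...)

    print(parse_url('/foo?bar'))
    # Url(scheme=None, host=None, port=None, path='/foo', query='bar', ...)

    # Url fields round-trip back into a string via .url
    u = Url('http', 'username:password', 'host.com', 80, '/path', 'query', 'fragment')
    print(u.url)  # 'http://username:password@host.com:80/path?query#fragment'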
site-packages/pip/_vendor/urllib3/util/__pycache__/response.cpython-36.pyc
[compiled CPython 3.6 bytecode - binary payload omitted. Recoverable docstrings describe is_fp_closed() (check whether a file-like object is closed), assert_header_parsing() (raise HeaderParsingError if httplib recorded defects or unparsed data while reading headers; Python 3 only), and is_response_to_head() (detect responses to HEAD requests, handling AppEngine quirks).]
site-packages/pip/_vendor/urllib3/util/__pycache__/wait.cpython-36.opt-1.pyc
[compiled CPython 3.6 bytecode - binary payload omitted. Recoverable docstrings describe _wait_for_io_events(), wait_for_read() and wait_for_write(): each accepts a single socket or a list of sockets plus an optional timeout and returns the sockets that can be read from or written to immediately, using the selectors backport above.]
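A short sketch of the wait helpers just described (import path assumed from the directory layout shown here; socketpair() must be available on the platform):

    import socket
    from pip._vendor.urllib3.util.wait import wait_for_read

    a, b = socket.socketpair()
    b.sendall(b'ping')

    # Returns the subset of the given sockets that are readable right now.
    readable = wait_for_read([a], timeout=1.0)
    print(a in readable)   # True, because b has already written data

    a.close()
    b.close()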
site-packages/pip/_vendor/urllib3/util/__pycache__/connection.cpython-36.pyc
[compiled CPython 3.6 bytecode - binary payload omitted. Recoverable docstrings describe is_connection_dropped() (always False on platforms such as AppEngine so they can recycle connections themselves), create_connection() (connect to a (host, port) tuple with optional timeout, source address and socket options, mirroring socket.create_connection), _set_socket_options(), allowed_gai_family() (AF_INET unless IPv6 is usable, in which case AF_UNSPEC), and _has_ipv6() (probe whether an IPv6 address can be bound).]
site-packages/pip/_vendor/urllib3/util/__pycache__/connection.cpython-36.opt-1.pyc
[optimized build of connection.cpython-36.pyc above - duplicate binary payload omitted.]
site-packages/pip/_vendor/urllib3/util/__pycache__/wait.cpython-36.pyc
[non-optimized build of wait.cpython-36.opt-1.pyc above - duplicate binary payload omitted.]
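A brief sketch of create_connection() as described above (host and port are illustrative; import path assumed from the directory layout shown here):

    from pip._vendor.urllib3.util.connection import create_connection

    # Mirrors socket.create_connection(): resolve (host, port), honour the
    # optional timeout, and return a connected socket object.
    sock = create_connection(('example.com', 80), timeout=5.0)
    sock.sendall(b'HEAD / HTTP/1.0\r\nHost: example.com\r\n\r\n')
    print(sock.recv(64))
    sock.close()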
site-packages/pip/_vendor/urllib3/util/__pycache__/__init__.cpython-36.pyc
[compiled CPython 3.6 bytecode - binary payload omitted. The module re-exports the package's public helpers: is_connection_dropped, make_headers, is_fp_closed, SSLContext, HAS_SNI, IS_PYOPENSSL, IS_SECURETRANSPORT, assert_fingerprint, resolve_cert_reqs, resolve_ssl_version, ssl_wrap_socket, current_time, Timeout, Retry, get_host, parse_url, split_first, Url, wait_for_read and wait_for_write.]
site-packages/pip/_vendor/urllib3/util/__pycache__/ssl_.cpython-36.pyc
[compiled CPython 3.6 bytecode - binary payload omitted. Recoverable docstrings describe _const_compare_digest_backport() (constant-time digest comparison), a fallback SSLContext shim for old Pythons that warns via InsecurePlatformWarning, assert_fingerprint() (compare a certificate against an md5/sha1/sha256 hex fingerprint and raise SSLError on mismatch), resolve_cert_reqs() and resolve_ssl_version(), create_urllib3_context() (build an SSLContext with SSLv2, SSLv3 and compression disabled and a restricted cipher list), and ssl_wrap_socket() (wrap a socket, loading CA material and warning via SNIMissingWarning when SNI is unavailable).]
site-packages/pip/_vendor/urllib3/util/__pycache__/__init__.cpython-36.opt-1.pyc
[optimized build of __init__.cpython-36.pyc above - duplicate binary payload omitted.]
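The create_urllib3_context docstring above carries its own usage example; restated, together with a note on the fingerprint check defined in the same module (the fingerprint value would be a placeholder):

    from pip._vendor.urllib3.util import ssl_

    # SSLContext with urllib3's defaults: SSLv2/SSLv3 and TLS compression
    # disabled, restricted cipher suite list.
    context = ssl_.create_urllib3_context()

    # The docstring notes SSLv3 can be re-enabled by clearing its option bit:
    context.options &= ~ssl_.OP_NO_SSLv3

    # assert_fingerprint(cert_der_bytes, '49:FC:...') picks md5/sha1/sha256 from
    # the fingerprint length and raises SSLError when the digests do not match.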
site-packages/pip/_vendor/urllib3/util/__pycache__/request.cpython-36.pyc
[compiled CPython 3.6 bytecode - binary payload omitted. Recoverable docstrings describe make_headers() (shortcuts for keep-alive, accept-encoding, user-agent, basic and proxy basic auth, and cache-control headers, e.g. make_headers(keep_alive=True, user_agent="Batman/1.0")), plus set_file_position() and rewind_body() (record and restore a request body's file position for redirects and retries, raising UnrewindableBodyError when that is impossible).]
site-packages/pip/_vendor/urllib3/util/__pycache__/request.cpython-36.opt-1.pyc
[optimized build of request.cpython-36.pyc above - duplicate binary payload omitted.]
site-packages/pip/_vendor/urllib3/util/__pycache__/timeout.cpython-36.pyc
[compiled CPython 3.6 bytecode - binary payload omitted. Recoverable docstrings describe the Timeout class: separate connect, read and total budgets (e.g. Timeout(connect=2.0, read=7.0) as a PoolManager default, a per-request override such as Timeout(10), or Timeout(connect=None, read=None) to disable timeouts), _validate_timeout(), from_float(), clone(), start_connect() and get_connect_duration(), and the connect_timeout / read_timeout properties, with the caveat that DNS resolution and overall wall-clock time are not bounded by these values.]
site-packages/pip/_vendor/urllib3/util/__pycache__/timeout.cpython-36.opt-1.pyc
[optimized build of timeout.cpython-36.pyc above - binary payload omitted here.]
        Nr)r	rrr
�maxr$r#)rrrr�read_timeout�s



zTimeout.read_timeout)r�
__module__�__qualname__�__doc__rrrrr�classmethodr
rr r"r#�propertyr%r'rrrrrsF%
r)Z
__future__rZsocketrZtime�
exceptionsr�objectr�getattrr!rrrrr�<module>ssite-packages/pip/_vendor/urllib3/util/__pycache__/retry.cpython-36.opt-1.pyc000064400000030433147511334600023012 0ustar003

[binary content removed: compiled bytecode of urllib3's Retry module (util/retry.py); the corresponding source appears below]
site-packages/pip/_vendor/urllib3/util/request.py000064400000007171147511334600016115 0ustar00from __future__ import absolute_import
from base64 import b64encode

from ..packages.six import b, integer_types
from ..exceptions import UnrewindableBodyError

ACCEPT_ENCODING = 'gzip,deflate'
_FAILEDTELL = object()


def make_headers(keep_alive=None, accept_encoding=None, user_agent=None,
                 basic_auth=None, proxy_basic_auth=None, disable_cache=None):
    """
    Shortcuts for generating request headers.

    :param keep_alive:
        If ``True``, adds 'connection: keep-alive' header.

    :param accept_encoding:
        Can be a boolean, list, or string.
        ``True`` translates to 'gzip,deflate'.
        List will get joined by comma.
        String will be used as provided.

    :param user_agent:
        String representing the user-agent you want, such as
        "python-urllib3/0.6"

    :param basic_auth:
        Colon-separated username:password string for 'authorization: basic ...'
        auth header.

    :param proxy_basic_auth:
        Colon-separated username:password string for 'proxy-authorization: basic ...'
        auth header.

    :param disable_cache:
        If ``True``, adds 'cache-control: no-cache' header.

    Example::

        >>> make_headers(keep_alive=True, user_agent="Batman/1.0")
        {'connection': 'keep-alive', 'user-agent': 'Batman/1.0'}
        >>> make_headers(accept_encoding=True)
        {'accept-encoding': 'gzip,deflate'}
    """
    headers = {}
    if accept_encoding:
        if isinstance(accept_encoding, str):
            pass
        elif isinstance(accept_encoding, list):
            accept_encoding = ','.join(accept_encoding)
        else:
            accept_encoding = ACCEPT_ENCODING
        headers['accept-encoding'] = accept_encoding

    if user_agent:
        headers['user-agent'] = user_agent

    if keep_alive:
        headers['connection'] = 'keep-alive'

    if basic_auth:
        headers['authorization'] = 'Basic ' + \
            b64encode(b(basic_auth)).decode('utf-8')

    if proxy_basic_auth:
        headers['proxy-authorization'] = 'Basic ' + \
            b64encode(b(proxy_basic_auth)).decode('utf-8')

    if disable_cache:
        headers['cache-control'] = 'no-cache'

    return headers
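

# --- Illustrative sketch (not part of the vendored module) ---
# make_headers() turns "user:password" strings into base64-encoded auth
# headers and joins list-valued accept_encoding with commas. The helper name
# and the credentials below are made up purely for demonstration.
def _demo_auth_headers():
    return make_headers(basic_auth='user:secret', accept_encoding=['gzip', 'br'])
    # -> {'accept-encoding': 'gzip,br',
    #     'authorization': 'Basic dXNlcjpzZWNyZXQ='}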


def set_file_position(body, pos):
    """
    If a position is provided, move file to that point.
    Otherwise, we'll attempt to record a position for future use.
    """
    if pos is not None:
        rewind_body(body, pos)
    elif getattr(body, 'tell', None) is not None:
        try:
            pos = body.tell()
        except (IOError, OSError):
            # This differentiates from None, allowing us to catch
            # a failed `tell()` later when trying to rewind the body.
            pos = _FAILEDTELL

    return pos


def rewind_body(body, body_pos):
    """
    Attempt to rewind body to a certain position.
    Primarily used for request redirects and retries.

    :param body:
        File-like object that supports seek.

    :param int body_pos:
        Position to seek to in file.
    """
    body_seek = getattr(body, 'seek', None)
    if body_seek is not None and isinstance(body_pos, integer_types):
        try:
            body_seek(body_pos)
        except (IOError, OSError):
            raise UnrewindableBodyError("An error occurred when rewinding request "
                                        "body for redirect/retry.")
    elif body_pos is _FAILEDTELL:
        raise UnrewindableBodyError("Unable to record file position for rewinding "
                                    "request body during a redirect/retry.")
    else:
        raise ValueError("body_pos must be of type integer, "
                         "instead it was %s." % type(body_pos))
site-packages/pip/_vendor/urllib3/util/timeout.py000064400000023035147511334600016110 0ustar00from __future__ import absolute_import
# The default socket timeout, used by httplib to indicate that no timeout was
# specified by the user
from socket import _GLOBAL_DEFAULT_TIMEOUT
import time

from ..exceptions import TimeoutStateError

# A sentinel value to indicate that no timeout was specified by the user in
# urllib3
_Default = object()


# Use time.monotonic if available.
current_time = getattr(time, "monotonic", time.time)


class Timeout(object):
    """ Timeout configuration.

    Timeouts can be defined as a default for a pool::

        timeout = Timeout(connect=2.0, read=7.0)
        http = PoolManager(timeout=timeout)
        response = http.request('GET', 'http://example.com/')

    Or per-request (which overrides the default for the pool)::

        response = http.request('GET', 'http://example.com/', timeout=Timeout(10))

    Timeouts can be disabled by setting all the parameters to ``None``::

        no_timeout = Timeout(connect=None, read=None)
        response = http.request('GET', 'http://example.com/', timeout=no_timeout)


    :param total:
        This combines the connect and read timeouts into one; the read timeout
        will be set to the time leftover from the connect attempt. In the
        event that both a connect timeout and a total are specified, or a read
        timeout and a total are specified, the shorter timeout will be applied.

        Defaults to None.

    :type total: integer, float, or None

    :param connect:
        The maximum amount of time to wait for a connection attempt to a server
        to succeed. Omitting the parameter will default the connect timeout to
        the system default, probably `the global default timeout in socket.py
        <http://hg.python.org/cpython/file/603b4d593758/Lib/socket.py#l535>`_.
        None will set an infinite timeout for connection attempts.

    :type connect: integer, float, or None

    :param read:
        The maximum amount of time to wait between consecutive
        read operations for a response from the server. Omitting
        the parameter will default the read timeout to the system
        default, probably `the global default timeout in socket.py
        <http://hg.python.org/cpython/file/603b4d593758/Lib/socket.py#l535>`_.
        None will set an infinite timeout.

    :type read: integer, float, or None

    .. note::

        Many factors can affect the total amount of time for urllib3 to return
        an HTTP response.

        For example, Python's DNS resolver does not obey the timeout specified
        on the socket. Other factors that can affect total request time include
        high CPU load, high swap, the program running at a low priority level,
        or other behaviors.

        In addition, the read and total timeouts only measure the time between
        read operations on the socket connecting the client and the server,
        not the total amount of time for the request to return a complete
        response. For most requests, the timeout is raised because the server
        has not sent the first byte in the specified time. This is not always
        the case; if a server streams one byte every fifteen seconds, a timeout
        of 20 seconds will not trigger, even though the request will take
        several minutes to complete.

        If your goal is to cut off any request after a set amount of wall clock
        time, consider having a second "watcher" thread to cut off a slow
        request.
    """

    #: A sentinel object representing the default timeout value
    DEFAULT_TIMEOUT = _GLOBAL_DEFAULT_TIMEOUT

    def __init__(self, total=None, connect=_Default, read=_Default):
        self._connect = self._validate_timeout(connect, 'connect')
        self._read = self._validate_timeout(read, 'read')
        self.total = self._validate_timeout(total, 'total')
        self._start_connect = None

    def __str__(self):
        return '%s(connect=%r, read=%r, total=%r)' % (
            type(self).__name__, self._connect, self._read, self.total)

    @classmethod
    def _validate_timeout(cls, value, name):
        """ Check that a timeout attribute is valid.

        :param value: The timeout value to validate
        :param name: The name of the timeout attribute to validate. This is
            used to specify in error messages.
        :return: The validated and casted version of the given value.
        :raises ValueError: If it is a numeric value less than or equal to
            zero, or the type is not an integer, float, or None.
        """
        if value is _Default:
            return cls.DEFAULT_TIMEOUT

        if value is None or value is cls.DEFAULT_TIMEOUT:
            return value

        if isinstance(value, bool):
            raise ValueError("Timeout cannot be a boolean value. It must "
                             "be an int, float or None.")
        try:
            float(value)
        except (TypeError, ValueError):
            raise ValueError("Timeout value %s was %s, but it must be an "
                             "int, float or None." % (name, value))

        try:
            if value <= 0:
                raise ValueError("Attempted to set %s timeout to %s, but the "
                                 "timeout cannot be set to a value less "
                                 "than or equal to 0." % (name, value))
        except TypeError:  # Python 3
            raise ValueError("Timeout value %s was %s, but it must be an "
                             "int, float or None." % (name, value))

        return value

    @classmethod
    def from_float(cls, timeout):
        """ Create a new Timeout from a legacy timeout value.

        The timeout value used by httplib.py sets the same timeout on the
        connect(), and recv() socket requests. This creates a :class:`Timeout`
        object that sets the individual timeouts to the ``timeout`` value
        passed to this function.

        :param timeout: The legacy timeout value.
        :type timeout: integer, float, sentinel default object, or None
        :return: Timeout object
        :rtype: :class:`Timeout`
        """
        return Timeout(read=timeout, connect=timeout)

    def clone(self):
        """ Create a copy of the timeout object

        Timeout properties are stored per-pool but each request needs a fresh
        Timeout object to ensure each one has its own start/stop configured.

        :return: a copy of the timeout object
        :rtype: :class:`Timeout`
        """
        # We can't use copy.deepcopy because that will also create a new object
        # for _GLOBAL_DEFAULT_TIMEOUT, which socket.py uses as a sentinel to
        # detect the user default.
        return Timeout(connect=self._connect, read=self._read,
                       total=self.total)

    def start_connect(self):
        """ Start the timeout clock, used during a connect() attempt

        :raises urllib3.exceptions.TimeoutStateError: if you attempt
            to start a timer that has been started already.
        """
        if self._start_connect is not None:
            raise TimeoutStateError("Timeout timer has already been started.")
        self._start_connect = current_time()
        return self._start_connect

    def get_connect_duration(self):
        """ Gets the time elapsed since the call to :meth:`start_connect`.

        :return: Elapsed time.
        :rtype: float
        :raises urllib3.exceptions.TimeoutStateError: if you attempt
            to get duration for a timer that hasn't been started.
        """
        if self._start_connect is None:
            raise TimeoutStateError("Can't get connect duration for timer "
                                    "that has not started.")
        return current_time() - self._start_connect

    @property
    def connect_timeout(self):
        """ Get the value to use when setting a connection timeout.

        This will be a positive float or integer, the value None
        (never timeout), or the default system timeout.

        :return: Connect timeout.
        :rtype: int, float, :attr:`Timeout.DEFAULT_TIMEOUT` or None
        """
        if self.total is None:
            return self._connect

        if self._connect is None or self._connect is self.DEFAULT_TIMEOUT:
            return self.total

        return min(self._connect, self.total)

    @property
    def read_timeout(self):
        """ Get the value for the read timeout.

        This assumes some time has elapsed in the connection timeout and
        computes the read timeout appropriately.

        If self.total is set, the read timeout is dependent on the amount of
        time taken by the connect timeout. If the connection time has not been
        established, a :exc:`~urllib3.exceptions.TimeoutStateError` will be
        raised.

        :return: Value to use for the read timeout.
        :rtype: int, float, :attr:`Timeout.DEFAULT_TIMEOUT` or None
        :raises urllib3.exceptions.TimeoutStateError: If :meth:`start_connect`
            has not yet been called on this object.
        """
        if (self.total is not None and
                self.total is not self.DEFAULT_TIMEOUT and
                self._read is not None and
                self._read is not self.DEFAULT_TIMEOUT):
            # In case the connect timeout has not yet been established.
            if self._start_connect is None:
                return self._read
            return max(0, min(self.total - self.get_connect_duration(),
                              self._read))
        elif self.total is not None and self.total is not self.DEFAULT_TIMEOUT:
            return max(0, self.total - self.get_connect_duration())
        else:
            return self._read
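

# --- Illustrative usage sketch (not part of the vendored module) ---
# Shows how a total timeout caps both phases: connect_timeout is clamped to
# min(connect, total), and read_timeout shrinks by however long the connect
# phase took. The numbers are arbitrary demo values.
if __name__ == "__main__":
    t = Timeout(total=10.0, connect=3.0, read=7.0)
    print(t.connect_timeout)       # min(3.0, 10.0) -> 3.0
    t.start_connect()              # start the clock before connecting
    # ... the actual connect() call would happen here ...
    print(t.read_timeout)          # min(7.0, 10.0 - elapsed connect time)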
site-packages/pip/_vendor/urllib3/util/ssl_.py000064400000027666147511334600015400 0ustar00from __future__ import absolute_import
import errno
import warnings
import hmac

from binascii import hexlify, unhexlify
from hashlib import md5, sha1, sha256

from ..exceptions import SSLError, InsecurePlatformWarning, SNIMissingWarning


SSLContext = None
HAS_SNI = False
IS_PYOPENSSL = False
IS_SECURETRANSPORT = False

# Maps the length of a digest to a possible hash function producing this digest
HASHFUNC_MAP = {
    32: md5,
    40: sha1,
    64: sha256,
}


def _const_compare_digest_backport(a, b):
    """
    Compare two digests of equal length in constant time.

    The digests must be of type str/bytes.
    Returns True if the digests match, and False otherwise.
    """
    result = abs(len(a) - len(b))
    for l, r in zip(bytearray(a), bytearray(b)):
        result |= l ^ r
    return result == 0


_const_compare_digest = getattr(hmac, 'compare_digest',
                                _const_compare_digest_backport)
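

# --- Illustrative sketch (not part of the vendored module) ---
# How the constant-time comparison above is meant to be used: compare two
# equal-length digests without leaking timing information. `_demo_digest_check`
# is a hypothetical helper added only for illustration.
def _demo_digest_check():
    expected = sha256(b"payload").digest()
    received = sha256(b"payload").digest()
    return _const_compare_digest(expected, received)  # True: digests match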


try:  # Test for SSL features
    import ssl
    from ssl import wrap_socket, CERT_NONE, PROTOCOL_SSLv23
    from ssl import HAS_SNI  # Has SNI?
except ImportError:
    pass


try:
    from ssl import OP_NO_SSLv2, OP_NO_SSLv3, OP_NO_COMPRESSION
except ImportError:
    OP_NO_SSLv2, OP_NO_SSLv3 = 0x1000000, 0x2000000
    OP_NO_COMPRESSION = 0x20000

# A secure default.
# Sources for more information on TLS ciphers:
#
# - https://wiki.mozilla.org/Security/Server_Side_TLS
# - https://www.ssllabs.com/projects/best-practices/index.html
# - https://hynek.me/articles/hardening-your-web-servers-ssl-ciphers/
#
# The general intent is:
# - Prefer TLS 1.3 cipher suites
# - prefer cipher suites that offer perfect forward secrecy (DHE/ECDHE),
# - prefer ECDHE over DHE for better performance,
# - prefer any AES-GCM and ChaCha20 over any AES-CBC for better performance and
#   security,
# - prefer AES-GCM over ChaCha20 because hardware-accelerated AES is common,
# - disable NULL authentication, MD5 MACs and DSS for security reasons.
DEFAULT_CIPHERS = ':'.join([
    'TLS13-AES-256-GCM-SHA384',
    'TLS13-CHACHA20-POLY1305-SHA256',
    'TLS13-AES-128-GCM-SHA256',
    'ECDH+AESGCM',
    'ECDH+CHACHA20',
    'DH+AESGCM',
    'DH+CHACHA20',
    'ECDH+AES256',
    'DH+AES256',
    'ECDH+AES128',
    'DH+AES',
    'RSA+AESGCM',
    'RSA+AES',
    '!aNULL',
    '!eNULL',
    '!MD5',
])

try:
    from ssl import SSLContext  # Modern SSL?
except ImportError:
    import sys

    class SSLContext(object):  # Platform-specific: Python 2 & 3.1
        supports_set_ciphers = ((2, 7) <= sys.version_info < (3,) or
                                (3, 2) <= sys.version_info)

        def __init__(self, protocol_version):
            self.protocol = protocol_version
            # Use default values from a real SSLContext
            self.check_hostname = False
            self.verify_mode = ssl.CERT_NONE
            self.ca_certs = None
            self.options = 0
            self.certfile = None
            self.keyfile = None
            self.ciphers = None

        def load_cert_chain(self, certfile, keyfile):
            self.certfile = certfile
            self.keyfile = keyfile

        def load_verify_locations(self, cafile=None, capath=None):
            self.ca_certs = cafile

            if capath is not None:
                raise SSLError("CA directories not supported in older Pythons")

        def set_ciphers(self, cipher_suite):
            if not self.supports_set_ciphers:
                raise TypeError(
                    'Your version of Python does not support setting '
                    'a custom cipher suite. Please upgrade to Python '
                    '2.7, 3.2, or later if you need this functionality.'
                )
            self.ciphers = cipher_suite

        def wrap_socket(self, socket, server_hostname=None, server_side=False):
            warnings.warn(
                'A true SSLContext object is not available. This prevents '
                'urllib3 from configuring SSL appropriately and may cause '
                'certain SSL connections to fail. You can upgrade to a newer '
                'version of Python to solve this. For more information, see '
                'https://urllib3.readthedocs.io/en/latest/advanced-usage.html'
                '#ssl-warnings',
                InsecurePlatformWarning
            )
            kwargs = {
                'keyfile': self.keyfile,
                'certfile': self.certfile,
                'ca_certs': self.ca_certs,
                'cert_reqs': self.verify_mode,
                'ssl_version': self.protocol,
                'server_side': server_side,
            }
            if self.supports_set_ciphers:  # Platform-specific: Python 2.7+
                return wrap_socket(socket, ciphers=self.ciphers, **kwargs)
            else:  # Platform-specific: Python 2.6
                return wrap_socket(socket, **kwargs)


def assert_fingerprint(cert, fingerprint):
    """
    Checks if given fingerprint matches the supplied certificate.

    :param cert:
        Certificate as bytes object.
    :param fingerprint:
        Fingerprint as string of hexdigits, can be interspersed by colons.
    """

    fingerprint = fingerprint.replace(':', '').lower()
    digest_length = len(fingerprint)
    hashfunc = HASHFUNC_MAP.get(digest_length)
    if not hashfunc:
        raise SSLError(
            'Fingerprint of invalid length: {0}'.format(fingerprint))

    # We need encode() here for py32; works on py2 and py33.
    fingerprint_bytes = unhexlify(fingerprint.encode())

    cert_digest = hashfunc(cert).digest()

    if not _const_compare_digest(cert_digest, fingerprint_bytes):
        raise SSLError('Fingerprints did not match. Expected "{0}", got "{1}".'
                       .format(fingerprint, hexlify(cert_digest)))


def resolve_cert_reqs(candidate):
    """
    Resolves the argument to a numeric constant, which can be passed to
    the wrap_socket function/method from the ssl module.
    Defaults to :data:`ssl.CERT_NONE`.
    If given a string it is assumed to be the name of the constant in the
    :mod:`ssl` module or its abbreviation.
    (So you can specify `REQUIRED` instead of `CERT_REQUIRED`.)
    If it's neither `None` nor a string we assume it is already the numeric
    constant which can directly be passed to wrap_socket.
    """
    if candidate is None:
        return CERT_NONE

    if isinstance(candidate, str):
        res = getattr(ssl, candidate, None)
        if res is None:
            res = getattr(ssl, 'CERT_' + candidate)
        return res

    return candidate
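

# --- Illustrative sketch (not part of the vendored module) ---
# resolve_cert_reqs() accepts None, a full ssl constant name, or its
# abbreviation. `_demo_resolve_cert_reqs` is a hypothetical helper added only
# for illustration; it assumes the ssl module imported successfully above.
def _demo_resolve_cert_reqs():
    assert resolve_cert_reqs(None) == CERT_NONE
    assert resolve_cert_reqs('REQUIRED') == ssl.CERT_REQUIRED
    assert resolve_cert_reqs('CERT_REQUIRED') == ssl.CERT_REQUIRED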


def resolve_ssl_version(candidate):
    """
    like resolve_cert_reqs
    """
    if candidate is None:
        return PROTOCOL_SSLv23

    if isinstance(candidate, str):
        res = getattr(ssl, candidate, None)
        if res is None:
            res = getattr(ssl, 'PROTOCOL_' + candidate)
        return res

    return candidate


def create_urllib3_context(ssl_version=None, cert_reqs=None,
                           options=None, ciphers=None):
    """All arguments have the same meaning as ``ssl_wrap_socket``.

    By default, this function does a lot of the same work that
    ``ssl.create_default_context`` does on Python 3.4+. It:

    - Disables SSLv2, SSLv3, and compression
    - Sets a restricted set of server ciphers

    If you wish to enable SSLv3, you can do::

        from urllib3.util import ssl_
        context = ssl_.create_urllib3_context()
        context.options &= ~ssl_.OP_NO_SSLv3

    You can do the same to enable compression (substituting ``COMPRESSION``
    for ``SSLv3`` in the last line above).

    :param ssl_version:
        The desired protocol version to use. This will default to
        PROTOCOL_SSLv23 which will negotiate the highest protocol that both
        the server and your installation of OpenSSL support.
    :param cert_reqs:
        Whether to require the certificate verification. This defaults to
        ``ssl.CERT_REQUIRED``.
    :param options:
        Specific OpenSSL options. These default to ``ssl.OP_NO_SSLv2``,
        ``ssl.OP_NO_SSLv3``, ``ssl.OP_NO_COMPRESSION``.
    :param ciphers:
        Which cipher suites to allow the server to select.
    :returns:
        Constructed SSLContext object with specified options
    :rtype: SSLContext
    """
    context = SSLContext(ssl_version or ssl.PROTOCOL_SSLv23)

    # Setting the default here, as we may have no ssl module on import
    cert_reqs = ssl.CERT_REQUIRED if cert_reqs is None else cert_reqs

    if options is None:
        options = 0
        # SSLv2 is easily broken and is considered harmful and dangerous
        options |= OP_NO_SSLv2
        # SSLv3 has several problems and is now dangerous
        options |= OP_NO_SSLv3
        # Disable compression to prevent CRIME attacks for OpenSSL 1.0+
        # (issue #309)
        options |= OP_NO_COMPRESSION

    context.options |= options

    if getattr(context, 'supports_set_ciphers', True):  # Platform-specific: Python 2.6
        context.set_ciphers(ciphers or DEFAULT_CIPHERS)

    context.verify_mode = cert_reqs
    if getattr(context, 'check_hostname', None) is not None:  # Platform-specific: Python 3.2
        # We do our own verification, including fingerprints and alternative
        # hostnames. So disable it here
        context.check_hostname = False
    return context


def ssl_wrap_socket(sock, keyfile=None, certfile=None, cert_reqs=None,
                    ca_certs=None, server_hostname=None,
                    ssl_version=None, ciphers=None, ssl_context=None,
                    ca_cert_dir=None):
    """
    All arguments except for server_hostname, ssl_context, and ca_cert_dir have
    the same meaning as they do when using :func:`ssl.wrap_socket`.

    :param server_hostname:
        When SNI is supported, the expected hostname of the certificate
    :param ssl_context:
        A pre-made :class:`SSLContext` object. If none is provided, one will
        be created using :func:`create_urllib3_context`.
    :param ciphers:
        A string of ciphers we wish the client to support. This is not
        supported on Python 2.6 as the ssl module does not support it.
    :param ca_cert_dir:
        A directory containing CA certificates in multiple separate files, as
        supported by OpenSSL's -CApath flag or the capath argument to
        SSLContext.load_verify_locations().
    """
    context = ssl_context
    if context is None:
        # Note: This branch of code and all the variables in it are no longer
        # used by urllib3 itself. We should consider deprecating and removing
        # this code.
        context = create_urllib3_context(ssl_version, cert_reqs,
                                         ciphers=ciphers)

    if ca_certs or ca_cert_dir:
        try:
            context.load_verify_locations(ca_certs, ca_cert_dir)
        except IOError as e:  # Platform-specific: Python 2.6, 2.7, 3.2
            raise SSLError(e)
        # Py33 raises FileNotFoundError which subclasses OSError
        # These are not equivalent unless we check the errno attribute
        except OSError as e:  # Platform-specific: Python 3.3 and beyond
            if e.errno == errno.ENOENT:
                raise SSLError(e)
            raise
    elif ssl_context is None and hasattr(context, 'load_default_certs'):
        # try to load OS default certs; works well on Windows (requires Python 3.4+)
        context.load_default_certs()

    if certfile:
        context.load_cert_chain(certfile, keyfile)
    if HAS_SNI:  # Platform-specific: OpenSSL with enabled SNI
        return context.wrap_socket(sock, server_hostname=server_hostname)

    warnings.warn(
        'An HTTPS request has been made, but the SNI (Subject Name '
        'Indication) extension to TLS is not available on this platform. '
        'This may cause the server to present an incorrect TLS '
        'certificate, which can cause validation failures. You can upgrade to '
        'a newer version of Python to solve this. For more information, see '
        'https://urllib3.readthedocs.io/en/latest/advanced-usage.html'
        '#ssl-warnings',
        SNIMissingWarning
    )
    return context.wrap_socket(sock)
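

# --- Illustrative usage sketch (not part of the vendored module) ---
# Certificate pinning with assert_fingerprint(): the pinned digest below is
# computed on the spot from dummy bytes so the check passes; real code would
# pin the known SHA-256 fingerprint of the server's DER-encoded certificate.
if __name__ == "__main__":
    dummy_cert = b"not-a-real-DER-certificate"
    pinned = sha256(dummy_cert).hexdigest()
    assert_fingerprint(dummy_cert, pinned)   # raises SSLError on mismatch
    print("fingerprint check passed")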
site-packages/pip/_vendor/urllib3/util/retry.py000064400000035400147511334600015566 0ustar00from __future__ import absolute_import
import time
import logging
from collections import namedtuple
from itertools import takewhile
import email
import re

from ..exceptions import (
    ConnectTimeoutError,
    MaxRetryError,
    ProtocolError,
    ReadTimeoutError,
    ResponseError,
    InvalidHeader,
)
from ..packages import six


log = logging.getLogger(__name__)

# Data structure for representing the metadata of requests that result in a retry.
RequestHistory = namedtuple('RequestHistory', ["method", "url", "error",
                                               "status", "redirect_location"])


class Retry(object):
    """ Retry configuration.

    Each retry attempt will create a new Retry object with updated values, so
    they can be safely reused.

    Retries can be defined as a default for a pool::

        retries = Retry(connect=5, read=2, redirect=5)
        http = PoolManager(retries=retries)
        response = http.request('GET', 'http://example.com/')

    Or per-request (which overrides the default for the pool)::

        response = http.request('GET', 'http://example.com/', retries=Retry(10))

    Retries can be disabled by passing ``False``::

        response = http.request('GET', 'http://example.com/', retries=False)

    Errors will be wrapped in :class:`~urllib3.exceptions.MaxRetryError` unless
    retries are disabled, in which case the causing exception will be raised.

    :param int total:
        Total number of retries to allow. Takes precedence over other counts.

        Set to ``None`` to remove this constraint and fall back on other
        counts. It's a good idea to set this to some sensibly-high value to
        account for unexpected edge cases and avoid infinite retry loops.

        Set to ``0`` to fail on the first retry.

        Set to ``False`` to disable and imply ``raise_on_redirect=False``.

    :param int connect:
        How many connection-related errors to retry on.

        These are errors raised before the request is sent to the remote server,
        which we assume has not triggered the server to process the request.

        Set to ``0`` to fail on the first retry of this type.

    :param int read:
        How many times to retry on read errors.

        These errors are raised after the request was sent to the server, so the
        request may have side-effects.

        Set to ``0`` to fail on the first retry of this type.

    :param int redirect:
        How many redirects to perform. Limit this to avoid infinite redirect
        loops.

        A redirect is an HTTP response with a status code 301, 302, 303, 307 or
        308.

        Set to ``0`` to fail on the first retry of this type.

        Set to ``False`` to disable and imply ``raise_on_redirect=False``.

    :param int status:
        How many times to retry on bad status codes.

        These are retries made on responses, where status code matches
        ``status_forcelist``.

        Set to ``0`` to fail on the first retry of this type.

    :param iterable method_whitelist:
        Set of uppercased HTTP method verbs that we should retry on.

        By default, we only retry on methods which are considered to be
        idempotent (multiple requests with the same parameters end with the
        same state). See :attr:`Retry.DEFAULT_METHOD_WHITELIST`.

        Set to a ``False`` value to retry on any verb.

    :param iterable status_forcelist:
        A set of integer HTTP status codes that we should force a retry on.
        A retry is initiated if the request method is in ``method_whitelist``
        and the response status code is in ``status_forcelist``.

        By default, this is disabled with ``None``.

    :param float backoff_factor:
        A backoff factor to apply between attempts after the second try
        (most errors are resolved immediately by a second try without a
        delay). urllib3 will sleep for::

            {backoff factor} * (2 ^ ({number of total retries} - 1))

        seconds. If the backoff_factor is 0.1, then :func:`.sleep` will sleep
        for [0.0s, 0.2s, 0.4s, ...] between retries. It will never be longer
        than :attr:`Retry.BACKOFF_MAX`.

        By default, backoff is disabled (set to 0).

    :param bool raise_on_redirect: Whether, if the number of redirects is
        exhausted, to raise a MaxRetryError, or to return a response with a
        response code in the 3xx range.

    :param iterable remove_headers_on_redirect:
        Sequence of headers to remove from the request when a response
        indicating a redirect is returned before firing off the redirected
        request

    :param bool raise_on_status: Similar meaning to ``raise_on_redirect``:
        whether we should raise an exception, or return a response,
        if status falls in ``status_forcelist`` range and retries have
        been exhausted.

    :param tuple history: The history of the request encountered during
        each call to :meth:`~Retry.increment`. The list is in the order
        the requests occurred. Each list item is of class :class:`RequestHistory`.

    :param bool respect_retry_after_header:
        Whether to respect Retry-After header on status codes defined as
        :attr:`Retry.RETRY_AFTER_STATUS_CODES` or not.

    """

    DEFAULT_METHOD_WHITELIST = frozenset([
        'HEAD', 'GET', 'PUT', 'DELETE', 'OPTIONS', 'TRACE'])

    DEFAULT_REDIRECT_HEADERS_BLACKLIST = frozenset(['Authorization'])

    RETRY_AFTER_STATUS_CODES = frozenset([413, 429, 503])

    #: Maximum backoff time.
    BACKOFF_MAX = 120

    def __init__(self, total=10, connect=None, read=None, redirect=None, status=None,
                 method_whitelist=DEFAULT_METHOD_WHITELIST, status_forcelist=None,
                 backoff_factor=0, raise_on_redirect=True, raise_on_status=True,
                 history=None, respect_retry_after_header=True,
                 remove_headers_on_redirect=DEFAULT_REDIRECT_HEADERS_BLACKLIST):

        self.total = total
        self.connect = connect
        self.read = read
        self.status = status

        if redirect is False or total is False:
            redirect = 0
            raise_on_redirect = False

        self.redirect = redirect
        self.status_forcelist = status_forcelist or set()
        self.method_whitelist = method_whitelist
        self.backoff_factor = backoff_factor
        self.raise_on_redirect = raise_on_redirect
        self.raise_on_status = raise_on_status
        self.history = history or tuple()
        self.respect_retry_after_header = respect_retry_after_header
        self.remove_headers_on_redirect = remove_headers_on_redirect

    def new(self, **kw):
        params = dict(
            total=self.total,
            connect=self.connect, read=self.read, redirect=self.redirect, status=self.status,
            method_whitelist=self.method_whitelist,
            status_forcelist=self.status_forcelist,
            backoff_factor=self.backoff_factor,
            raise_on_redirect=self.raise_on_redirect,
            raise_on_status=self.raise_on_status,
            history=self.history,
            remove_headers_on_redirect=self.remove_headers_on_redirect,
        )
        params.update(kw)
        return type(self)(**params)

    @classmethod
    def from_int(cls, retries, redirect=True, default=None):
        """ Backwards-compatibility for the old retries format."""
        if retries is None:
            retries = default if default is not None else cls.DEFAULT

        if isinstance(retries, Retry):
            return retries

        redirect = bool(redirect) and None
        new_retries = cls(retries, redirect=redirect)
        log.debug("Converted retries value: %r -> %r", retries, new_retries)
        return new_retries

    def get_backoff_time(self):
        """ Formula for computing the current backoff

        :rtype: float
        """
        # We want to consider only the last consecutive errors sequence (Ignore redirects).
        consecutive_errors_len = len(list(takewhile(lambda x: x.redirect_location is None,
                                                    reversed(self.history))))
        if consecutive_errors_len <= 1:
            return 0

        backoff_value = self.backoff_factor * (2 ** (consecutive_errors_len - 1))
        return min(self.BACKOFF_MAX, backoff_value)

    def parse_retry_after(self, retry_after):
        # Whitespace: https://tools.ietf.org/html/rfc7230#section-3.2.4
        if re.match(r"^\s*[0-9]+\s*$", retry_after):
            seconds = int(retry_after)
        else:
            retry_date_tuple = email.utils.parsedate(retry_after)
            if retry_date_tuple is None:
                raise InvalidHeader("Invalid Retry-After header: %s" % retry_after)
            retry_date = time.mktime(retry_date_tuple)
            seconds = retry_date - time.time()

        if seconds < 0:
            seconds = 0

        return seconds

    def get_retry_after(self, response):
        """ Get the value of Retry-After in seconds. """

        retry_after = response.getheader("Retry-After")

        if retry_after is None:
            return None

        return self.parse_retry_after(retry_after)

    def sleep_for_retry(self, response=None):
        retry_after = self.get_retry_after(response)
        if retry_after:
            time.sleep(retry_after)
            return True

        return False

    def _sleep_backoff(self):
        backoff = self.get_backoff_time()
        if backoff <= 0:
            return
        time.sleep(backoff)

    def sleep(self, response=None):
        """ Sleep between retry attempts.

        This method will respect a server's ``Retry-After`` response header
        and sleep the duration of the time requested. If that is not present, it
        will use an exponential backoff. By default, the backoff factor is 0 and
        this method will return immediately.
        """

        if response:
            slept = self.sleep_for_retry(response)
            if slept:
                return

        self._sleep_backoff()

    def _is_connection_error(self, err):
        """ Errors when we're fairly sure that the server did not receive the
        request, so it should be safe to retry.
        """
        return isinstance(err, ConnectTimeoutError)

    def _is_read_error(self, err):
        """ Errors that occur after the request has been started, so we should
        assume that the server began processing it.
        """
        return isinstance(err, (ReadTimeoutError, ProtocolError))

    def _is_method_retryable(self, method):
        """ Checks if a given HTTP method should be retried upon, depending if
        it is included on the method whitelist.
        """
        if self.method_whitelist and method.upper() not in self.method_whitelist:
            return False

        return True

    def is_retry(self, method, status_code, has_retry_after=False):
        """ Is this method/status code retryable? (Based on whitelists and control
        variables such as the number of total retries to allow, whether to
        respect the Retry-After header, whether this header is present, and
        whether the returned status code is on the list of status codes to
        be retried upon in the presence of the aforementioned header)
        """
        if not self._is_method_retryable(method):
            return False

        if self.status_forcelist and status_code in self.status_forcelist:
            return True

        return (self.total and self.respect_retry_after_header and
                has_retry_after and (status_code in self.RETRY_AFTER_STATUS_CODES))

    def is_exhausted(self):
        """ Are we out of retries? """
        retry_counts = (self.total, self.connect, self.read, self.redirect, self.status)
        retry_counts = list(filter(None, retry_counts))
        if not retry_counts:
            return False

        return min(retry_counts) < 0

    def increment(self, method=None, url=None, response=None, error=None,
                  _pool=None, _stacktrace=None):
        """ Return a new Retry object with incremented retry counters.

        :param response: A response object, or None, if the server did not
            return a response.
        :type response: :class:`~urllib3.response.HTTPResponse`
        :param Exception error: An error encountered during the request, or
            None if the response was received successfully.

        :return: A new ``Retry`` object.
        """
        if self.total is False and error:
            # Disabled, indicate to re-raise the error.
            raise six.reraise(type(error), error, _stacktrace)

        total = self.total
        if total is not None:
            total -= 1

        connect = self.connect
        read = self.read
        redirect = self.redirect
        status_count = self.status
        cause = 'unknown'
        status = None
        redirect_location = None

        if error and self._is_connection_error(error):
            # Connect retry?
            if connect is False:
                raise six.reraise(type(error), error, _stacktrace)
            elif connect is not None:
                connect -= 1

        elif error and self._is_read_error(error):
            # Read retry?
            if read is False or not self._is_method_retryable(method):
                raise six.reraise(type(error), error, _stacktrace)
            elif read is not None:
                read -= 1

        elif response and response.get_redirect_location():
            # Redirect retry?
            if redirect is not None:
                redirect -= 1
            cause = 'too many redirects'
            redirect_location = response.get_redirect_location()
            status = response.status

        else:
            # Incrementing because of a server error like a 500 in
            # status_forcelist and the given method is in the whitelist
            cause = ResponseError.GENERIC_ERROR
            if response and response.status:
                if status_count is not None:
                    status_count -= 1
                cause = ResponseError.SPECIFIC_ERROR.format(
                    status_code=response.status)
                status = response.status

        history = self.history + (RequestHistory(method, url, error, status, redirect_location),)

        new_retry = self.new(
            total=total,
            connect=connect, read=read, redirect=redirect, status=status_count,
            history=history)

        if new_retry.is_exhausted():
            raise MaxRetryError(_pool, url, error or ResponseError(cause))

        log.debug("Incremented Retry for (url='%s'): %r", url, new_retry)

        return new_retry

    def __repr__(self):
        return ('{cls.__name__}(total={self.total}, connect={self.connect}, '
                'read={self.read}, redirect={self.redirect}, status={self.status})').format(
                    cls=type(self), self=self)


# For backwards compatibility (equivalent to pre-v1.9):
Retry.DEFAULT = Retry(3)
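

# --- Illustrative usage sketch (not part of the vendored module) ---
# Worked example of the backoff formula {backoff factor} * (2 ** (retries - 1)):
# with backoff_factor=0.1 the successive sleeps are 0.0s, 0.2s, 0.4s, 0.8s.
# The URL and the simulated ConnectTimeoutError are demo values only.
if __name__ == "__main__":
    retry = Retry(total=5, backoff_factor=0.1)
    for _ in range(4):
        retry = retry.increment(method='GET', url='http://example.com/',
                                error=ConnectTimeoutError())
        print(len(retry.history), retry.get_backoff_time())
    # prints: "1 0", "2 0.2", "3 0.4", "4 0.8"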
site-packages/pip/_vendor/urllib3/request.py000064400000013472147511334600015141 0ustar00from __future__ import absolute_import

from .filepost import encode_multipart_formdata
from .packages.six.moves.urllib.parse import urlencode


__all__ = ['RequestMethods']


class RequestMethods(object):
    """
    Convenience mixin for classes who implement a :meth:`urlopen` method, such
    as :class:`~urllib3.connectionpool.HTTPConnectionPool` and
    :class:`~urllib3.poolmanager.PoolManager`.

    Provides behavior for making common types of HTTP request methods and
    decides which type of request field encoding to use.

    Specifically,

    :meth:`.request_encode_url` is for sending requests whose fields are
    encoded in the URL (such as GET, HEAD, DELETE).

    :meth:`.request_encode_body` is for sending requests whose fields are
    encoded in the *body* of the request using multipart or www-form-urlencoded
    (such as for POST, PUT, PATCH).

    :meth:`.request` is for making any kind of request, it will look up the
    appropriate encoding format and use one of the above two methods to make
    the request.

    Initializer parameters:

    :param headers:
        Headers to include with all requests, unless other headers are given
        explicitly.
    """

    _encode_url_methods = set(['DELETE', 'GET', 'HEAD', 'OPTIONS'])

    def __init__(self, headers=None):
        self.headers = headers or {}

    def urlopen(self, method, url, body=None, headers=None,
                encode_multipart=True, multipart_boundary=None,
                **kw):  # Abstract
        raise NotImplemented("Classes extending RequestMethods must implement "
                             "their own ``urlopen`` method.")

    def request(self, method, url, fields=None, headers=None, **urlopen_kw):
        """
        Make a request using :meth:`urlopen` with the appropriate encoding of
        ``fields`` based on the ``method`` used.

        This is a convenience method that requires the least amount of manual
        effort. It can be used in most situations, while still having the
        option to drop down to more specific methods when necessary, such as
        :meth:`request_encode_url`, :meth:`request_encode_body`,
        or even the lowest level :meth:`urlopen`.
        """
        method = method.upper()

        if method in self._encode_url_methods:
            return self.request_encode_url(method, url, fields=fields,
                                           headers=headers,
                                           **urlopen_kw)
        else:
            return self.request_encode_body(method, url, fields=fields,
                                            headers=headers,
                                            **urlopen_kw)

    def request_encode_url(self, method, url, fields=None, headers=None,
                           **urlopen_kw):
        """
        Make a request using :meth:`urlopen` with the ``fields`` encoded in
        the url. This is useful for request methods like GET, HEAD, DELETE, etc.
        """
        if headers is None:
            headers = self.headers

        extra_kw = {'headers': headers}
        extra_kw.update(urlopen_kw)

        if fields:
            url += '?' + urlencode(fields)

        return self.urlopen(method, url, **extra_kw)

    def request_encode_body(self, method, url, fields=None, headers=None,
                            encode_multipart=True, multipart_boundary=None,
                            **urlopen_kw):
        """
        Make a request using :meth:`urlopen` with the ``fields`` encoded in
        the body. This is useful for request methods like POST, PUT, PATCH, etc.

        When ``encode_multipart=True`` (default), then
        :meth:`urllib3.filepost.encode_multipart_formdata` is used to encode
        the payload with the appropriate content type. Otherwise
        :meth:`urllib.urlencode` is used with the
        'application/x-www-form-urlencoded' content type.

        Multipart encoding must be used when posting files, and it's reasonably
        safe to use in other cases too. However, it may break request
        signing, such as with OAuth.

        Supports an optional ``fields`` parameter of key/value strings AND
        key/filetuple. A filetuple is a (filename, data, MIME type) tuple where
        the MIME type is optional. For example::

            fields = {
                'foo': 'bar',
                'fakefile': ('foofile.txt', 'contents of foofile'),
                'realfile': ('barfile.txt', open('realfile').read()),
                'typedfile': ('bazfile.bin', open('bazfile').read(),
                              'image/jpeg'),
                'nonamefile': 'contents of nonamefile field',
            }

        When uploading a file, providing a filename (the first parameter of the
        tuple) is optional but recommended to best mimic the behavior of browsers.

        Note that if ``headers`` are supplied, the 'Content-Type' header will
        be overwritten because it depends on the dynamic random boundary string
        which is used to compose the body of the request. The random boundary
        string can be explicitly set with the ``multipart_boundary`` parameter.
        """
        if headers is None:
            headers = self.headers

        extra_kw = {'headers': {}}

        if fields:
            if 'body' in urlopen_kw:
                raise TypeError(
                    "request got values for both 'fields' and 'body', can only specify one.")

            if encode_multipart:
                body, content_type = encode_multipart_formdata(fields, boundary=multipart_boundary)
            else:
                body, content_type = urlencode(fields), 'application/x-www-form-urlencoded'

            extra_kw['body'] = body
            extra_kw['headers'] = {'Content-Type': content_type}

        extra_kw['headers'].update(headers)
        extra_kw.update(urlopen_kw)

        return self.urlopen(method, url, **extra_kw)
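
# A hedged usage sketch, not part of the vendored module above: PoolManager
# mixes in RequestMethods, so its ``request()`` routes GET/HEAD/DELETE/OPTIONS
# through request_encode_url() (fields become the query string) and other verbs
# through request_encode_body() (fields become a multipart or urlencoded body).
# It assumes the standalone ``urllib3`` import path and uses httpbin.org purely
# as an example endpoint.
import urllib3

http = urllib3.PoolManager()

# GET: the fields dict is urlencoded into the URL.
r = http.request('GET', 'http://httpbin.org/get', fields={'q': 'search term'})

# POST: the fields dict is encoded into the body (multipart/form-data by default),
# including a filetuple of (filename, data, MIME type) as documented above.
r = http.request(
    'POST', 'http://httpbin.org/post',
    fields={
        'name': 'value',
        'attachment': ('notes.txt', 'contents of notes.txt', 'text/plain'),
    })
print(r.status)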
site-packages/pip/_vendor/urllib3/__pycache__/poolmanager.cpython-36.opt-1.pyc000064400000031074147511334600023176 0ustar003

���e�A�@sddlmZddlZddlZddlZddlmZddlmZm	Z	ddlm
Z
ddlmZm
Z
mZddlmZdd	lmZdd
lmZddlmZdd
dgZeje�Zd4Zd5Zejd-e�Zd.d/�Zej ee�ej ee�d0�Z!ee	d0�Z"Gd1d�de�Z#Gd2d
�d
e#�Z$d3d�Z%dS)6�)�absolute_importN�)�RecentlyUsedContainer)�HTTPConnectionPool�HTTPSConnectionPool)�port_by_scheme)�LocationValueError�
MaxRetryError�ProxySchemeUnknown)�urljoin)�RequestMethods)�	parse_url)�Retry�PoolManager�ProxyManager�proxy_from_url�key_file�	cert_file�	cert_reqs�ca_certs�ssl_version�ca_cert_dir�ssl_context�
key_scheme�key_host�key_port�key_timeout�key_retries�
key_strict�	key_block�key_source_address�key_key_file�
key_cert_file�
key_cert_reqs�key_ca_certs�key_ssl_version�key_ca_cert_dir�key_ssl_context�key_maxsize�key_headers�
key__proxy�key__proxy_headers�key_socket_options�key__socks_options�key_assert_hostname�key_assert_fingerprint�PoolKeycCs�|j�}|dj�|d<|dj�|d<x4d	D],}||kr.||dk	r.t||j��||<q.W|jd�}|dk	r|t|�|d<x&t|j��D]}|j|�|d|<q�Wx|j	D]}||kr�d||<q�W|f|�S)
a�
    Create a pool key out of a request context dictionary.

    According to RFC 3986, both the scheme and host are case-insensitive.
    Therefore, this function normalizes both before constructing the pool
    key for an HTTPS request. If you wish to change this behaviour, provide
    alternate callables to ``key_fn_by_scheme``.

    :param key_class:
        The class to use when constructing the key. This should be a namedtuple
        with the ``scheme`` and ``host`` keys at a minimum.
    :type  key_class: namedtuple
    :param request_context:
        A dictionary-like object that contain the context for a request.
    :type  request_context: dict

    :return: A namedtuple that can be used as a connection pool key.
    :rtype:  PoolKey
    �scheme�host�headers�_proxy_headers�_socks_optionsNZsocket_optionsZkey_)r3r4r5)
�copy�lower�	frozenset�items�get�tuple�list�keys�pop�_fields)Z	key_class�request_context�context�keyZsocket_optsZfield�rC�!/usr/lib/python3.6/poolmanager.py�_default_key_normalizer9s

rE)�http�httpsc@sxeZdZdZdZddd�Zdd�Zdd	�Zdd
d�Zdd
�Z	ddd�Z
dd�Zddd�Zd dd�Z
dd�Zd!dd�ZdS)"ra$
    Allows for arbitrary requests while transparently keeping track of
    necessary connection pools for you.

    :param num_pools:
        Number of connection pools to cache before discarding the least
        recently used pool.

    :param headers:
        Headers to include with all requests, unless other headers are given
        explicitly.

    :param \**connection_pool_kw:
        Additional parameters are used to create fresh
        :class:`urllib3.connectionpool.ConnectionPool` instances.

    Example::

        >>> manager = PoolManager(num_pools=2)
        >>> r = manager.request('GET', 'http://google.com/')
        >>> r = manager.request('GET', 'http://google.com/mail')
        >>> r = manager.request('GET', 'http://yahoo.com/')
        >>> len(manager.pools)
        2

    N�
cKs8tj||�||_t|dd�d�|_t|_tj�|_dS)NcSs|j�S)N)�close)�prCrCrD�<lambda>�sz&PoolManager.__init__.<locals>.<lambda>)Zdispose_func)r�__init__�connection_pool_kwr�pools�pool_classes_by_scheme�key_fn_by_schemer6)�self�	num_poolsr3rMrCrCrDrL�szPoolManager.__init__cCs|S)NrC)rQrCrCrD�	__enter__�szPoolManager.__enter__cCs|j�dS)NF)�clear)rQ�exc_typeZexc_valZexc_tbrCrCrD�__exit__�szPoolManager.__exit__cCsf|j|}|dkr|jj�}xdD]}|j|d�q"W|dkrXxtD]}|j|d�qDW|||f|�S)a�
        Create a new :class:`ConnectionPool` based on host, port, scheme, and
        any additional pool keyword arguments.

        If ``request_context`` is provided, it is provided as keyword arguments
        to the pool class used. This method is used to actually create the
        connection pools handed out by :meth:`connection_from_url` and
        companion methods. It is intended to be overridden for customization.
        Nr1r2�portrF)r1r2rW)rOrMr6r>�SSL_KEYWORDS)rQr1r2rWr@Zpool_clsrB�kwrCrCrD�	_new_pool�s




zPoolManager._new_poolcCs|jj�dS)z�
        Empty our store of pools and direct them all to close.

        This will not affect in-flight connections, but they will not be
        re-used after completion.
        N)rNrT)rQrCrCrDrT�szPoolManager.clearrFcCsT|std��|j|�}|pd|d<|s:tj|dj�d�}||d<||d<|j|�S)a�
        Get a :class:`ConnectionPool` based on the host, port, and scheme.

        If ``port`` isn't given, it will be derived from the ``scheme`` using
        ``urllib3.connectionpool.port_by_scheme``. If ``pool_kwargs`` is
        provided, it is merged with the instance's ``connection_pool_kw``
        variable and used to create the new connection pool, if one is
        needed.
        zNo host specified.rFr1�PrWr2)r�_merge_pool_kwargsrr:r7�connection_from_context)rQr2rWr1�pool_kwargsr@rCrCrD�connection_from_host�s
z PoolManager.connection_from_hostcCs,|dj�}|j|}||�}|j||d�S)z�
        Get a :class:`ConnectionPool` based on the request context.

        ``request_context`` must at least contain the ``scheme`` key and its
        value must be a key in ``key_fn_by_scheme`` instance variable.
        r1)r@)r7rP�connection_from_pool_key)rQr@r1Zpool_key_constructor�pool_keyrCrCrDr]�s
z#PoolManager.connection_from_contextc
Cs`|jj�N|jj|�}|r|S|d}|d}|d}|j||||d�}||j|<WdQRX|S)z�
        Get a :class:`ConnectionPool` based on the provided pool key.

        ``pool_key`` should be a namedtuple that only contains immutable
        objects. At a minimum it must have the ``scheme``, ``host``, and
        ``port`` fields.
        r1r2rW)r@N)rN�lockr:rZ)rQrar@Zpoolr1r2rWrCrCrDr`�s
z$PoolManager.connection_from_pool_keycCs t|�}|j|j|j|j|d�S)a�
        Similar to :func:`urllib3.connectionpool.connection_from_url`.

        If ``pool_kwargs`` is not provided and a new pool needs to be
        constructed, ``self.connection_pool_kw`` is used to initialize
        the :class:`urllib3.connectionpool.ConnectionPool`. If ``pool_kwargs``
        is provided, it is used instead. Note that if a new pool does not
        need to be created for the request, the provided ``pool_kwargs`` are
        not used.
        )rWr1r^)r
r_r2rWr1)rQ�urlr^�urCrCrD�connection_from_url
szPoolManager.connection_from_urlcCsZ|jj�}|rVxF|j�D]:\}}|dkrJy
||=WqRtk
rFYqRXq|||<qW|S)a
        Merge a dictionary of override values for self.connection_pool_kw.

        This does not modify self.connection_pool_kw and returns a new dict.
        Any keys in the override dictionary with a value of ``None`` are
        removed from the merged dictionary.
        N)rMr6r9�KeyError)rQ�overrideZbase_pool_kwargsrB�valuerCrCrDr\s

zPoolManager._merge_pool_kwargsTcKsdt|�}|j|j|j|jd�}d|d<d|d<d|krD|jj�|d<|jdk	rj|jdkrj|j||f|�}n|j||j	f|�}|o�|j
�}|s�|St||�}|jdkr�d	}|j
d
�}	t|	t�s�tj|	|d�}	|	jo�|j|��r�x|	jD]}
|dj|
d�q�Wy|	j||||d�}	Wn tk
�r4|	j�r0�|SX|	|d
<||d<tjd
||�|j||f|�S)a]
        Same as :meth:`urllib3.connectionpool.HTTPConnectionPool.urlopen`
        with custom cross-host redirect logic and only sends the request-uri
        portion of the ``url``.

        The given ``url`` parameter must be absolute, such that an appropriate
        :class:`urllib3.connectionpool.ConnectionPool` can be chosen for it.
        )rWr1FZassert_same_host�redirectr3NrFi/ZGET�retries)ri)�responseZ_poolzRedirecting %s -> %s)r
r_r2rWr1r3r6�proxy�urlopenZrequest_uriZget_redirect_locationrZstatusr:�
isinstancerZfrom_intZremove_headers_on_redirectZis_same_hostr>Z	incrementr	Zraise_on_redirect�log�info)rQ�methodrcrirYrdZconnrkZredirect_locationrj�headerrCrCrDrm-s@	



zPoolManager.urlopen)rHN)N)NrFN)N)N)T)�__name__�
__module__�__qualname__�__doc__rlrLrSrVrZrTr_r]r`rer\rmrCrCrCrDrys

	


csHeZdZdZd�fdd�	Zd�fdd�	Zdd	d
�Zd�fdd
�	Z�ZS)raw
    Behaves just like :class:`PoolManager`, but sends all requests through
    the defined proxy, using the CONNECT method for HTTPS URLs.

    :param proxy_url:
        The URL of the proxy to be used.

    :param proxy_headers:
        A dictionary contaning headers that will be sent to the proxy. In case
        of HTTP they are being sent with each request, while in the
        HTTPS/CONNECT case they are sent only once. Could be used for proxy
        authentication.

    Example:
        >>> proxy = urllib3.ProxyManager('http://localhost:3128/')
        >>> r1 = proxy.request('GET', 'http://google.com/')
        >>> r2 = proxy.request('GET', 'http://httpbin.org/')
        >>> len(proxy.pools)
        1
        >>> r3 = proxy.request('GET', 'https://httpbin.org/')
        >>> r4 = proxy.request('GET', 'https://twitter.com/')
        >>> len(proxy.pools)
        3

    rHNcs�t|t�rd|j|j|jf}t|�}|jsFtj|jd�}|j|d�}|jdkrZt	|j��||_
|pfi|_|j
|d<|j|d<tt
|�j||f|�dS)	Nz
%s://%s:%ir[)rWrFrG�_proxyr4)rFrG)rnrr1r2rWr
rr:�_replacer
rl�
proxy_headers�superrrL)rQ�	proxy_urlrRr3ryrMrlrW)�	__class__rCrDrL�s








zProxyManager.__init__rFcsD|dkr tt|�j||||d�Stt|�j|jj|jj|jj|d�S)NrG)r^)rzrr_rlr2rWr1)rQr2rWr1r^)r|rCrDr_�s


z!ProxyManager.connection_from_hostcCs0ddi}t|�j}|r||d<|r,|j|�|S)z�
        Sets headers needed by proxies: specifically, the Accept and Host
        headers. Only sets headers not provided by the user.
        ZAcceptz*/*ZHost)r
�netloc�update)rQrcr3Zheaders_r}rCrCrD�_set_proxy_headers�s

zProxyManager._set_proxy_headersTcsNt|�}|jdkr0|jd|j�}|j||�|d<tt|�j||fd|i|��S)z@Same as HTTP(S)ConnectionPool.urlopen, ``url`` must be absolute.rFr3ri)r
r1r:r3rrzrrm)rQrqrcrirYrdr3)r|rCrDrm�s

zProxyManager.urlopen)rHNN)NrFN)N)T)	rsrtrurvrLr_rrm�
__classcell__rCrC)r|rDris
cKstfd|i|��S)Nr{)r)rcrYrCrCrDr�s)rrrrrrr)rrrrrrrr r!r"r#r$r%r&r'r(r)r*r+r,r-r.r/)&Z
__future__r�collections�	functoolsZlogging�_collectionsrZconnectionpoolrrr�
exceptionsrr	r
Zpackages.six.moves.urllib.parserZrequestrZutil.urlr
Z
util.retryr�__all__Z	getLoggerrsrorXZ_key_fields�
namedtupler0rE�partialrPrOrrrrCrCrCrD�<module>s`

6
qWsite-packages/pip/_vendor/urllib3/__pycache__/response.cpython-36.pyc000064400000037433147511334600021576 0ustar003

���ewY�@sddlmZddlmZddlZddlZddlZddlmZ	ddlm
Zddlm
Z
ddlmZmZmZmZmZmZmZdd	lmZmZmZdd
lmZddlmZm Z ddl!m"Z"m#Z#ej$e%�Z&Gd
d�de'�Z(Gdd�de'�Z)dd�Z*Gdd�dej+�Z,dS)�)�absolute_import)�contextmanagerN)�timeout)�error�)�HTTPHeaderDict)�BodyNotHttplibCompatible�
ProtocolError�DecodeError�ReadTimeoutError�ResponseNotChunked�IncompleteRead�
InvalidHeader)�string_types�binary_type�PY3)�http_client)�
HTTPException�BaseSSLError)�is_fp_closed�is_response_to_headc@s$eZdZdd�Zdd�Zdd�ZdS)�DeflateDecodercCsd|_t�|_tj�|_dS)NT)�
_first_tryr�_data�zlib�
decompressobj�_obj)�self�r�/usr/lib/python3.6/response.py�__init__szDeflateDecoder.__init__cCst|j|�S)N)�getattrr)r�namerrr�__getattr__szDeflateDecoder.__getattr__cCs�|s|S|js|jj|�S|j|7_y |jj|�}|rFd|_d|_|Stjk
r�d|_tjtj�|_z|j|j�Sd|_XYnXdS)NF)rr�
decompressrrrr�	MAX_WBITS)r�dataZdecompressedrrrr$ s"zDeflateDecoder.decompressN)�__name__�
__module__�__qualname__r r#r$rrrrrsrc@s$eZdZdd�Zdd�Zdd�ZdS)�GzipDecodercCstjdtj�|_dS)N�)rrr%r)rrrrr 9szGzipDecoder.__init__cCst|j|�S)N)r!r)rr"rrrr#<szGzipDecoder.__getattr__cCs|s|S|jj|�S)N)rr$)rr&rrrr$?szGzipDecoder.decompressN)r'r(r)r r#r$rrrrr*7sr*cCs|dkrt�St�S)N�gzip)r*r)�moderrr�_get_decoderEsr.c@seZdZdZddgZdddddgZdFdd�Zdd�Zdd�Ze	dd��Z
e	dd��Zdd�Zdd�Z
dd�Zdd�Zd d!�Zed"d#��ZdGd$d%�ZdId(d)�Zed*d+��Zd,d-�ZdJd.d/�Zd0d1�Zd2d3�Ze	d4d5��Zd6d7�Zd8d9�Zd:d;�Zd<d=�Zd>d?�Z d@dA�Z!dBdC�Z"dKdDdE�Z#d
S)L�HTTPResponsea	
    HTTP Response container.

    Backwards-compatible to httplib's HTTPResponse but the response ``body`` is
    loaded and decoded on-demand when the ``data`` property is accessed.  This
    class is also compatible with the Python standard library's :mod:`io`
    module, and can hence be treated as a readable object in the context of that
    framework.

    Extra parameters for behaviour not present in httplib.HTTPResponse:

    :param preload_content:
        If True, the response's body will be preloaded during construction.

    :param decode_content:
        If True, attempts to decode specific content-encoding's based on headers
        (like 'gzip' and 'deflate') will be skipped and raw data will be used
        instead.

    :param original_response:
        When this HTTPResponse wrapper is generated from an httplib.HTTPResponse
        object, it's convenient to include the original for debug purposes. It's
        otherwise unused.

    :param retries:
        The retries contains the last :class:`~urllib3.util.retry.Retry` that
        was used during the request.

    :param enforce_content_length:
        Enforce content length checking. Body returned by server must match
        value of Content-Length header, if present. Otherwise, raise error.
    r,Zdeflatei-i.i/i3i4�NrTFcCst|t�r||_n
t|�|_||_||_||_||_||_||_|
|_	d|_
d|_d|_|	|_
d|_|r|t|ttf�r|||_|
|_||_t|d�r�||_d|_d|_|jjdd�j�}dd�|jd�D�}d	|kr�d
|_|j|�|_|r�|jr�|j|d�|_dS)Nr�readFztransfer-encodingr0css|]}|j�VqdS)N)�strip)�.0�encrrr�	<genexpr>�sz(HTTPResponse.__init__.<locals>.<genexpr>�,�chunkedT)�decode_content)�
isinstancer�headers�status�version�reason�strictr8�retries�enforce_content_length�_decoder�_body�_fp�_original_response�_fp_bytes_read�
basestringr�_pool�_connection�hasattrr7�
chunk_left�get�lower�split�_init_length�length_remainingr1)r�bodyr:r;r<r=r>Zpreload_contentr8�original_responseZpool�
connectionr?r@�request_methodZtr_encZ	encodingsrrrr qs<


zHTTPResponse.__init__cCs|j|jkr|jjd�SdS)a
        Should we redirect and where to?

        :returns: Truthy redirect location string if we got a redirect status
            code and valid location. ``None`` if redirect status and no
            location. ``False`` if not a redirect status code.
        �locationF)r;�REDIRECT_STATUSESr:rK)rrrr�get_redirect_location�sz"HTTPResponse.get_redirect_locationcCs,|js|jrdS|jj|j�d|_dS)N)rGrHZ	_put_conn)rrrr�release_conn�szHTTPResponse.release_conncCs"|jr|jS|jr|jdd�SdS)NT)�
cache_content)rBrCr1)rrrrr&�szHTTPResponse.datacCs|jS)N)rH)rrrrrR�szHTTPResponse.connectioncCs|jS)z�
        Obtain the number of bytes pulled over the wire so far. May differ from
        the amount of content returned by :meth:``HTTPResponse.read`` if bytes
        are encoded on the wire (e.g, compressed).
        )rE)rrrr�tell�szHTTPResponse.tellcCs�|jjd�}|dk	r(|jr(tjd�dS|dk	r�y<tdd�|jd�D��}t|�dkrbtd|��|j	�}Wnt
k
r�d}YnX|d	kr�d}yt|j�}Wnt
k
r�d	}YnX|dks�d|ko�d
kns�|dkr�d	}|S)zM
        Set initial length value for Response content if available.
        zcontent-lengthNz�Received response with both Content-Length and Transfer-Encoding set. This is expressly forbidden by RFC 7230 sec 3.3.2. Ignoring Content-Length and attempting to process response as Transfer-Encoding: chunked.cSsg|]}t|��qSr)�int)r3�valrrr�
<listcomp>�sz-HTTPResponse._init_length.<locals>.<listcomp>r6rz8Content-Length contained multiple unmatching values (%s)r���0�d��ZHEAD)r]r^)
r:rKr7�logZwarning�setrM�lenr�pop�
ValueErrorrZr;)rrSZlengthZlengthsr;rrrrN�s,


(zHTTPResponse._init_lengthcCs4|jjdd�j�}|jdkr0||jkr0t|�|_dS)z=
        Set-up the _decoder attribute if necessary.
        zcontent-encodingr0N)r:rKrLrA�CONTENT_DECODERSr.)r�content_encodingrrr�
_init_decoder�szHTTPResponse._init_decodercCs|y|r|jr|jj|�}WnHttjfk
rb}z&|jjdd�j�}td||��WYdd}~XnX|rx|rx||j	�7}|S)zN
        Decode the data passed in and potentially flush the decoder.
        zcontent-encodingr0zEReceived response with content-encoding: %s, but failed to decode it.N)
rAr$�IOErrorrrr:rKrLr
�_flush_decoder)rr&r8�
flush_decoder�ergrrr�_decodes
zHTTPResponse._decodecCs$|jr |jjd�}||jj�SdS)zk
        Flushes the decoder. Should only be called if the decoder is actually
        being used.
        �)rAr$�flush)rZbufrrrrjszHTTPResponse._flush_decoderccs�d}z�y
dVWn�tk
r2t|jdd��Ynptk
rn}z"dt|�krP�t|jdd��WYdd}~Xn4ttfk
r�}ztd||��WYdd}~XnXd}Wd|s�|jr�|jj	�|j
r�|j
j	�|jr�|jj�r�|j�XdS)z�
        Catch low-level python exceptions, instead re-raising urllib3
        variants, so that low-level exceptions are not leaked in the
        high-level api.

        On exit, release the connection back to the pool.
        FNzRead timed out.zread operation timed outzConnection broken: %rT)
�
SocketTimeoutrrGr�strr�SocketErrorr	rD�closerH�isclosedrW)rZ
clean_exitrlrrr�_error_catcher!s(	
 

zHTTPResponse._error_catchercCs�|j�|dkr|j}|jdkr$dSd}d}|j��h|dkrN|jj�}d}nJd}|jj|�}|dkr�|r�|jj�d}|jr�|jdkr�t|j	|j��WdQRX|r�|j	t
|�7_	|jdk	r�|jt
|�8_|j|||�}|r�||_|S)aP
        Similar to :meth:`httplib.HTTPResponse.read`, but with two additional
        parameters: ``decode_content`` and ``cache_content``.

        :param amt:
            How much of the content to read. If specified, caching is skipped
            because it doesn't make sense to cache partial content as the full
            response.

        :param decode_content:
            If True, will attempt to decode the body based on the
            'content-encoding' header.

        :param cache_content:
            If True, will save the returned data such that the same result is
            returned despite of the state of the underlying file object. This
            is useful if you want the ``.data`` property to continue working
            after having ``.read()`` the file object. (Overridden if ``amt`` is
            set.)
        NFTr)rN)
rhr8rCrur1rsr@rOr
rErcrmrB)r�amtr8rXrkr&rrrr1Zs4




zHTTPResponse.read�r+ccsZ|jr.|j�r.xF|j||d�D]
}|VqWn(x&t|j�sT|j||d�}|r0|Vq0WdS)a_
        A generator wrapper for the read() method. A call will block until
        ``amt`` bytes have been read from the connection or until the
        connection is closed.

        :param amt:
            How much of the content to read. The generator will return up to
            much data per iteration, but may return less. This is particularly
            likely when using compressed data. However, the empty string will
            never be returned.

        :param decode_content:
            If True, will attempt to decode the body based on the
            'content-encoding' header.
        )r8)rvr8N)r7�supports_chunked_reads�read_chunkedrrCr1)rrvr8�liner&rrr�stream�szHTTPResponse.streamc
Ks`|j}t|t�s,tr"t|j��}n
tj|�}t|dd�}|f|||j|j|j	||d�|��}|S)a
        Given an :class:`httplib.HTTPResponse` instance ``r``, return a
        corresponding :class:`urllib3.response.HTTPResponse` object.

        Remaining parameters are passed to the HTTPResponse constructor, along
        with ``original_response=r``.
        r>r)rPr:r;r<r=r>rQ)
�msgr9rr�items�from_httplibr!r;r<r=)ZResponseCls�rZresponse_kwr:r>Zresprrrr~�s	

zHTTPResponse.from_httplibcCs|jS)N)r:)rrrr�
getheaders�szHTTPResponse.getheaderscCs|jj||�S)N)r:rK)rr"�defaultrrr�	getheader�szHTTPResponse.getheadercCs|jS)N)r:)rrrr�info�szHTTPResponse.infocCs$|js|jj�|jr |jj�dS)N)�closedrCrsrH)rrrrrs�s
zHTTPResponse.closecCs@|jdkrdSt|jd�r$|jj�St|jd�r8|jjSdSdS)NTrtr�)rCrIrtr�)rrrrr��s

zHTTPResponse.closedcCs6|jdkrtd��nt|jd�r*|jj�Std��dS)Nz-HTTPResponse has no file to get a fileno from�filenozOThe file-like object this HTTPResponse is wrapped around has no file descriptor)rCrirIr�)rrrrr��s



zHTTPResponse.filenocCs$|jdk	r t|jd�r |jj�SdS)Nro)rCrIro)rrrrro�szHTTPResponse.flushcCsdS)NTr)rrrr�readableszHTTPResponse.readablecCs:|jt|��}t|�dkrdS||dt|��<t|�SdS)Nr)r1rc)r�bZtemprrr�readintos
zHTTPResponse.readintocCst|jd�S)z�
        Checks if the underlying file-like object looks like a
        httplib.HTTPResponse object. We do this by testing for the fp
        attribute. If it is present we assume it returns raw chunks as
        processed by read_chunked().
        �fp)rIrC)rrrrrxsz#HTTPResponse.supports_chunked_readscCsf|jdk	rdS|jjj�}|jdd�d}yt|d�|_Wn&tk
r`|j�tj	|��YnXdS)N�;rrr+)
rJrCr��readlinerMrZrers�httplibr
)rrzrrr�_update_chunk_lengths
z!HTTPResponse._update_chunk_lengthcCs�d}|dkr2|jj|j�}|}|jjd�d|_nv||jkrZ|jj|�}|j||_|}nN||jkr�|jj|�}|jjd�d|_|}n |jj|j�}|jjd�d|_|S)Nrw)rCZ
_safe_readrJ)rrvZreturned_chunk�chunk�valuerrr�
_handle_chunk%s&

zHTTPResponse._handle_chunkccs�|j�|jstd��|j�s&td��|jrDt|j�rD|jj�dS|j���x<|j	�|j
dkrdP|j|�}|j||dd�}|rP|VqPW|r�|j
�}|r�|Vx |jjj�}|s�P|dkr�Pq�W|jr�|jj�WdQRXdS)z�
        Similar to :meth:`HTTPResponse.read`, but with an additional
        parameter: ``decode_content``.

        :param decode_content:
            If True, will attempt to decode the body based on the
            'content-encoding' header.
        zHResponse is not chunked. Header 'transfer-encoding: chunked' is missing.zgBody should be httplib.HTTPResponse like. It should have have an fp attribute which returns raw chunks.NrF)r8rks
)rhr7rrxrrDrrsrur�rJr�rmrjrCr�r�)rrvr8r�Zdecodedrzrrrry;s@	




zHTTPResponse.read_chunked)r0NrrNrTTNNNNFN)NNF�)r�N)N)NN)$r'r(r)�__doc__rfrUr rVrW�propertyr&rRrYrNrhrmrjrrur1r{�classmethodr~r�r�r�rsr�r�ror�r�rxr�r�ryrrrrr/LsB 
-
	0
9
E

			r/)-Z
__future__r�
contextlibrr�ioZloggingZsocketrrprrr�_collectionsr�
exceptionsrr	r
rrr
rZpackages.sixrrFrrZpackages.six.movesrr�rRrrZ
util.responserrZ	getLoggerr'ra�objectrr*r.�IOBaser/rrrr�<module>s"$
!site-packages/pip/_vendor/urllib3/__pycache__/__init__.cpython-36.opt-1.pyc000064400000004576147511334600022440 0ustar003

���e%�@s`dZddlmZddlZddlmZmZmZddlm	Z	ddl
mZddlm
Z
mZmZdd	lmZdd
lmZddlmZddlmZdd
lmZddlZyddlmZWn&ek
r�Gdd�dej�ZYnXdZdZdZ d(Z!ej"e#�j$e��ej%fd"d�Z&[ej'd#e	j(d$d%�ej'd&e	j)d$d%�ej'd&e	j*d$d%�ej'd&e	j+d$d%�e	j,fd'd�Z-dS))z8
urllib3 - Thread-safe connection pooling and re-using.
�)�absolute_importN�)�HTTPConnectionPool�HTTPSConnectionPool�connection_from_url)�
exceptions)�encode_multipart_formdata)�PoolManager�ProxyManager�proxy_from_url)�HTTPResponse)�make_headers)�get_host)�Timeout)�Retry)�NullHandlerc@seZdZdd�ZdS)rcCsdS)N�)�self�recordrr�/usr/lib/python3.6/__init__.py�emitszNullHandler.emitN)�__name__�
__module__�__qualname__rrrrrrsrz(Andrey Petrov (andrey.petrov@shazow.net)ZMITz1.22rrr	r
rrr�add_stderr_loggerr�disable_warningsrrr
rcCsFtjt�}tj�}|jtjd��|j|�|j|�|jdt�|S)z�
    Helper for quickly adding a StreamHandler to the logger. Useful for
    debugging.

    Returns the handler after adding it.
    z%%(asctime)s %(levelname)s %(message)sz,Added a stderr logging handler to logger: %s)	�logging�	getLoggerrZ
StreamHandlerZsetFormatterZ	Formatter�
addHandlerZsetLevel�debug)�levelZloggerZhandlerrrrr9s	


�alwaysT)�append�defaultcCstjd|�dS)z<
    Helper for quickly disabling all urllib3 warnings.
    �ignoreN)�warnings�simplefilter)�categoryrrrr]s)rrr	r
rrrrrrrrr
r).�__doc__Z
__future__rr%Zconnectionpoolrrr�rZfilepostrZpoolmanagerr	r
rZresponserZutil.requestr
Zutil.urlrZutil.timeoutrZ
util.retryrrr�ImportErrorZHandler�
__author__Z__license__�__version__�__all__rrr�DEBUGrr&ZSecurityWarningZSubjectAltNameWarningZInsecurePlatformWarningZSNIMissingWarningZHTTPWarningrrrrr�<module>sT
site-packages/pip/_vendor/urllib3/__pycache__/connectionpool.cpython-36.opt-1.pyc000064400000056152147511334600023727 0ustar003

���e��@s�ddlmZddlZddlZddlZddlZddlmZm	Z
ddlZddlmZm
Z
mZmZmZmZmZmZmZmZmZmZmZddlmZddlmZddlmZdd	lm Z m!Z!m"Z"m#Z#m$Z$m%Z%m&Z&dd
l'm(Z(ddl)m*Z*ddl+m,Z,dd
l-m.Z.ddl/m0Z0ddl1m2Z2ddl3m4Z4ddl5m6Z6m7Z7ej8�r<ddl9Z:ej;j<Z<ej=e>�Z?e@�ZAGdd�de@�ZBeCejDejEg�ZFGdd�deBe(�ZGGdd�deG�ZHdd�ZIdd�ZJdS)�)�absolute_importN)�error�timeout�)
�ClosedPoolError�
ProtocolError�EmptyPoolError�HeaderParsingError�HostChangedError�LocationValueError�
MaxRetryError�
ProxyError�ReadTimeoutError�SSLError�TimeoutError�InsecureRequestWarning�NewConnectionError)�CertificateError)�six)�queue)�port_by_scheme�DummyConnection�HTTPConnection�HTTPSConnection�VerifiedHTTPSConnection�
HTTPException�BaseSSLError)�RequestMethods)�HTTPResponse)�is_connection_dropped)�set_file_position)�assert_header_parsing)�Retry)�Timeout)�get_host�Urlc@sDeZdZdZdZejZd
dd�Zdd�Z	dd�Z
d	d
�Zdd�ZdS)�ConnectionPoolzz
    Base class for all connection pools, such as
    :class:`.HTTPConnectionPool` and :class:`.HTTPSConnectionPool`.
    NcCs.|std��t|�j�|_|j�|_||_dS)NzNo host specified.)r�
_ipv6_host�lower�host�_proxy_host�port)�selfr)r+�r-�$/usr/lib/python3.6/connectionpool.py�__init__Cs

zConnectionPool.__init__cCsdt|�j|j|jfS)Nz%s(host=%r, port=%r))�type�__name__r)r+)r,r-r-r.�__str__Ks
zConnectionPool.__str__cCs|S)Nr-)r,r-r-r.�	__enter__OszConnectionPool.__enter__cCs|j�dS)NF)�close)r,�exc_typeZexc_valZexc_tbr-r-r.�__exit__RszConnectionPool.__exit__cCsdS)zD
        Close all pooled connections and disable the pool.
        Nr-)r,r-r-r.r4WszConnectionPool.close)N)
r1�
__module__�__qualname__�__doc__�schemerZ	LifoQueue�QueueClsr/r2r3r6r4r-r-r-r.r&:s
r&c
@s�eZdZdZdZeZeZdde	j
ddddddf	dd�Zdd	�Zd!d
d�Z
dd
�Zdd�Zdd�Zdd�Zdd�Zedfdd�Zdd�Zdd�Zdd�Zdddddeddddf
dd �ZdS)"�HTTPConnectionPoolaN	
    Thread-safe connection pool for one host.

    :param host:
        Host used for this HTTP Connection (e.g. "localhost"), passed into
        :class:`httplib.HTTPConnection`.

    :param port:
        Port used for this HTTP Connection (None is equivalent to 80), passed
        into :class:`httplib.HTTPConnection`.

    :param strict:
        Causes BadStatusLine to be raised if the status line can't be parsed
        as a valid HTTP/1.0 or 1.1 status line, passed into
        :class:`httplib.HTTPConnection`.

        .. note::
           Only works in Python 2. This parameter is ignored in Python 3.

    :param timeout:
        Socket timeout in seconds for each individual connection. This can
        be a float or integer, which sets the timeout for the HTTP request,
        or an instance of :class:`urllib3.util.Timeout` which gives you more
        fine-grained control over request timeouts. After the constructor has
        been parsed, this is always a `urllib3.util.Timeout` object.

    :param maxsize:
        Number of connections to save that can be reused. More than 1 is useful
        in multithreaded situations. If ``block`` is set to False, more
        connections will be created but they will not be saved once they've
        been used.

    :param block:
        If set to True, no more than ``maxsize`` connections will be used at
        a time. When no free connections are available, the call will block
        until a connection has been released. This is a useful side effect for
        particular multithreaded situations where one does not want to use more
        than maxsize connections per host to prevent flooding.

    :param headers:
        Headers to include with all requests, unless other headers are given
        explicitly.

    :param retries:
        Retry configuration to use by default with requests in this pool.

    :param _proxy:
        Parsed proxy URL, should not be used directly, instead, see
        :class:`urllib3.connectionpool.ProxyManager`"

    :param _proxy_headers:
        A dictionary with proxy headers, should not be used directly,
        instead, see :class:`urllib3.connectionpool.ProxyManager`"

    :param \**conn_kw:
        Additional parameters are used to create fresh :class:`urllib3.connection.HTTPConnection`,
        :class:`urllib3.connection.HTTPSConnection` instances.
    �httpNFrc
Ks�tj|||�tj||�||_t|t�s4tj|�}|dkrBtj}||_	||_
|j|�|_||_
|	|_|
pli|_xt|�D]}|jjd�qzWd|_d|_||_|jr�|jjdg�dS)NrZsocket_options)r&r/r�strict�
isinstancer#�
from_floatr"ZDEFAULTr�retriesr;�pool�block�proxy�
proxy_headers�xrange�put�num_connections�num_requests�conn_kw�
setdefault)
r,r)r+r>r�maxsizerC�headersrA�_proxy�_proxy_headersrJ�_r-r-r.r/�s(


zHTTPConnectionPool.__init__cCsJ|jd7_tjd|j|j�|jf|j|j|jj|jd�|j	��}|S)z9
        Return a fresh :class:`HTTPConnection`.
        rz%Starting new HTTP connection (%d): %s)r)r+rr>)
rH�log�debugr)�
ConnectionClsr+r�connect_timeoutr>rJ)r,�connr-r-r.�	_new_conn�szHTTPConnectionPool._new_conncCs�d}y|jj|j|d�}WnBtk
r8t|d��Yn&tjk
r\|jrXt|d��YnX|r�t|�r�t	j
d|j�|j�t
|dd�dkr�d}|p�|j�S)	a�
        Get a connection. Will return a pooled connection if one is available.

        If no connections are available and :prop:`.block` is ``False``, then a
        fresh connection is returned.

        :param timeout:
            Seconds to wait before giving up and raising
            :class:`urllib3.exceptions.EmptyPoolError` if the pool is empty and
            :prop:`.block` is ``True``.
        N)rCrzPool is closed.z>Pool reached maximum size and no more connections are allowed.z Resetting dropped connection: %sZ	auto_openrr)rB�getrC�AttributeErrorrr�EmptyrrrQrRr)r4�getattrrV)r,rrUr-r-r.�	_get_conn�s zHTTPConnectionPool._get_conncCs\y|jj|dd�dStk
r(Yn$tjk
rJtjd|j�YnX|rX|j�dS)a�
        Put a connection back into the pool.

        :param conn:
            Connection object for the current host and port as returned by
            :meth:`._new_conn` or :meth:`._get_conn`.

        If the pool is already full, the connection is closed and discarded
        because we exceeded maxsize. If connections are discarded frequently,
        then maxsize should be increased.

        If the pool is closed, then the connection will be closed and discarded.
        F)rCNz2Connection pool is full, discarding connection: %s)	rBrGrXrZFullrQ�warningr)r4)r,rUr-r-r.�	_put_conn�szHTTPConnectionPool._put_conncCsdS)zU
        Called right before a request is made, after the socket is created.
        Nr-)r,rUr-r-r.�_validate_connsz!HTTPConnectionPool._validate_conncCsdS)Nr-)r,rUr-r-r.�_prepare_proxy!sz!HTTPConnectionPool._prepare_proxycCs2|tkr|jj�St|t�r$|j�Stj|�SdS)z< Helper that always returns a :class:`urllib3.util.Timeout` N)�_DefaultrZcloner?r#r@)r,rr-r-r.�_get_timeout%s


zHTTPConnectionPool._get_timeoutcCsjt|t�rt||d|��t|d�r>|jtkr>t||d|��dt|�ksVdt|�krft||d|��dS)zAIs the error actually a timeout? Will raise a ReadTimeout or passz!Read timed out. (read timeout=%s)�errnoz	timed outzdid not complete (read)N)r?�
SocketTimeoutr�hasattrrb�_blocking_errnos�str)r,�err�url�
timeout_valuer-r-r.�_raise_timeout1s
z!HTTPConnectionPool._raise_timeoutc
:Ks|jd7_|j|�}|j�|j|_y|j|�Wn:ttfk
rp}z|j|||jd��WYdd}~XnX|r�|j	||f|�n|j
||f|�|j}	t|dd�r�|	dkr�t
||d|	��|	tjkr�|jjtj��n|jj|	�yjy|jdd�}
WnTtk
�rPy|j�}
Wn0tk
�rJ}ztj|d�WYdd}~XnXYnXWn<tttfk
�r�}z|j|||	d��WYdd}~XnXt|d	d
�}tjd|j|j|j||||
j|
j �	yt!|
j"�Wn@t#tfk
�r}ztj$d|j%|�|dd
�WYdd}~XnX|
S)a
        Perform a request on a given urllib connection object taken from our
        pool.

        :param conn:
            a connection from one of our connection pools

        :param timeout:
            Socket timeout in seconds for the request. This can be a
            float or integer, which will set the same timeout value for
            the socket connect and the socket read, or an instance of
            :class:`urllib3.util.Timeout`, which gives you more fine-grained
            control over your timeouts.
        r)rgrhriN�sockrz!Read timed out. (read timeout=%s)T)�	bufferingZ
_http_vsn_strzHTTP/?z%s://%s:%s "%s %s %s" %s %sz$Failed to parse headers (url=%s): %s)�exc_info)&rIraZ
start_connectrTrr^rcrrjZrequest_chunked�request�read_timeoutrZrr#�DEFAULT_TIMEOUTrkZ
settimeout�socketZgetdefaulttimeoutZgetresponse�	TypeError�	ExceptionrZ
raise_from�SocketErrorrQrRr:r)r+�statusZlengthr!�msgr	r\�
_absolute_url)
r,rU�methodrhr�chunkedZhttplib_request_kw�timeout_obj�ero�httplib_responseZhttp_versionZhper-r-r.�
_make_requestBsT

(
$z HTTPConnectionPool._make_requestcCst|j|j|j|d�jS)N)r:r)r+�path)r%r:r)r+rh)r,r~r-r-r.rw�sz HTTPConnectionPool._absolute_urlcCsL|jd}|_y"x|jdd�}|r|j�qWWntjk
rFYnXdS)zD
        Close all pooled connections and disable the pool.
        NF)rC)rBrWr4rrY)r,Zold_poolrUr-r-r.r4�szHTTPConnectionPool.closecCst|jd�rdSt|�\}}}t|�j�}|jr@|r@tj|�}n|jrZ|tj|�krZd}|||f|j|j|jfkS)zj
        Check if the given ``url`` is a member of the same host as this
        connection pool.
        �/TN)	�
startswithr$r'r(r+rrWr:r))r,rhr:r)r+r-r-r.�is_same_host�s
zHTTPConnectionPool.is_same_hostTc
.Ks�|dkr|j}t|t�s*tj|||jd�}|
dkr>|
jdd�}
|rZ|j|�rZt|||��d}|
}|jdkr�|j	�}|j
|j�d}d}t||�}�zry�|j
|�}|j|	d�}|j|_|jdk	o�t|dd�}|r�|j|�|j|||||||d	�}|
�s�|nd}||
d
<|jj|f|||d�|
��}d}Wn�tjk
�rNt|d��Yn�ttttttt fk
�r}z�d}t|tt f��r�t|�}n>t|tt!f��r�|j�r�t"d
|�}nt|ttf��r�td|�}|j#||||t$j%�dd�}|j&�|}WYdd}~XnXWd|�s |�o|j'�}d}|�r0|j(|�X|�spt)j*d|||�|j+|||||||f||	|
|d�|
��Sdd�}|�o�|j,�}|�r$|j-dk�r�d}y|j#||||d�}Wn(t.k
�r�|j/�r�||��|SX||�|j0|�t)j1d||�|j+||||f|||||	|
|d�|
��St2|j3d��}|j4||j-|��r�y|j#||||d�}Wn(t.k
�r�|j5�r~||��|SX||�|j&|�t)j1d|�|j+||||f|||||	|
|d�|
��S|S)a�
        Get a connection from the pool and perform an HTTP request. This is the
        lowest level call for making a request, so you'll need to specify all
        the raw details.

        .. note::

           More commonly, it's appropriate to use a convenience method provided
           by :class:`.RequestMethods`, such as :meth:`request`.

        .. note::

           `release_conn` will only behave as expected if
           `preload_content=False` because we want to make
           `preload_content=False` the default behaviour someday soon without
           breaking backwards compatibility.

        :param method:
            HTTP request method (such as GET, POST, PUT, etc.)

        :param body:
            Data to send in the request body (useful for creating
            POST requests, see HTTPConnectionPool.post_url for
            more convenience).

        :param headers:
            Dictionary of custom headers to send, such as User-Agent,
            If-None-Match, etc. If None, pool headers are used. If provided,
            these headers completely replace any pool-specific headers.

        :param retries:
            Configure the number of retries to allow before raising a
            :class:`~urllib3.exceptions.MaxRetryError` exception.

            Pass ``None`` to retry until you receive a response. Pass a
            :class:`~urllib3.util.retry.Retry` object for fine-grained control
            over different types of retries.
            Pass an integer number to retry connection errors that many times,
            but no other types of errors. Pass zero to never retry.

            If ``False``, then retries are disabled and any exception is raised
            immediately. Also, instead of raising a MaxRetryError on redirects,
            the redirect response will be returned.

        :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.

        :param redirect:
            If True, automatically handle redirects (status codes 301, 302,
            303, 307, 308). Each redirect counts as a retry. Disabling retries
            will disable redirect, too.

        :param assert_same_host:
            If ``True``, will make sure that the host of the pool requests is
            consistent else will raise HostChangedError. When False, you can
            use the pool on an HTTP proxy and request foreign hosts.

        :param timeout:
            If specified, overrides the default timeout for this one
            request. It may be a float (in seconds) or an instance of
            :class:`urllib3.util.Timeout`.

        :param pool_timeout:
            If set and the pool is set to block=True, then this method will
            block for ``pool_timeout`` seconds and raise EmptyPoolError if no
            connection is available within the time period.

        :param release_conn:
            If False, then the urlopen call will not release the connection
            back into the pool once a response is received (but will release if
            you read the entire contents of the response such as when
            `preload_content=True`). This is useful if you're not preloading
            the response's content immediately. You will need to call
            ``r.release_conn()`` on the response ``r`` to return the connection
            back into the pool. If None, it takes the value of
            ``response_kw.get('preload_content', True)``.

        :param chunked:
            If True, urllib3 will send the body using chunked transfer
            encoding. Otherwise, urllib3 will send the body using the standard
            content-length form. Defaults to False.

        :param int body_pos:
            Position to seek to in file-like body in the event of a retry or
            redirect. Typically this won't need to be set because urllib3 will
            auto-populate the value when needed.

        :param \**response_kw:
            Additional parameters are passed to
            :meth:`urllib3.response.HTTPResponse.from_httplib`
        N)�redirect�defaultZpreload_contentTr=F)rrk)r�bodyrMryZrequest_method)rB�
connectionrAz"No pool connections are available.zCannot connect to proxy.zConnection aborted.�)r�_poolZ_stacktracez1Retrying (%r) after connection broken by '%r': %s)r�pool_timeout�release_conn�body_poscSs@y|j�Wn.ttttttfk
r:}zWYdd}~XnXdS)N)�readrrrtrrr)�responser{r-r-r.�drain_and_release_conn�s

z:HTTPConnectionPool.urlopen.<locals>.drain_and_release_conni/ZGET)r�r�zRedirecting %s -> %s)rAr��assert_same_hostrr�r�r�zRetry-Afterz	Retry: %s)6rMr?r"Zfrom_intrArWr�r
r:�copy�updaterEr rar[rTrrDrZr_r}�ResponseClsZfrom_httplibrrYrrrrtrrrrrr
Z	increment�sysrmZsleepr4r]rQr\�urlopenZget_redirect_locationrurZraise_on_redirectZsleep_for_retryrR�boolZ	getheaderZis_retryZraise_on_status)r,rxrhr�rMrAr�r�rr�r�ryr�Zresponse_kwrUZrelease_this_connrgZ
clean_exitrzZis_new_proxy_connr|Z
response_connr�r{r�Zredirect_locationZhas_retry_afterr-r-r.r��s�^















zHTTPConnectionPool.urlopen)N)r1r7r8r9r:rrSrr�r#rpr/rVr[r]r^r_rarjr`r}rwr4r�r�r-r-r-r.r<bs.:%
&Ur<csneZdZdZdZeZddejddddddddddddddfdd�Z	dd	�Z
d
d�Zdd
�Z�fdd�Z
�ZS)�HTTPSConnectionPoola�
    Same as :class:`.HTTPConnectionPool`, but HTTPS.

    When Python is compiled with the :mod:`ssl` module, then
    :class:`.VerifiedHTTPSConnection` is used, which *can* verify certificates,
    instead of :class:`.HTTPSConnection`.

    :class:`.VerifiedHTTPSConnection` uses one of ``assert_fingerprint``,
    ``assert_hostname`` and ``host`` in this order to verify connections.
    If ``assert_hostname`` is False, no verification is done.

    The ``key_file``, ``cert_file``, ``cert_reqs``, ``ca_certs``,
    ``ca_cert_dir``, and ``ssl_version`` are only used if :mod:`ssl` is
    available and are fed into :meth:`urllib3.util.ssl_wrap_socket` to upgrade
    the connection socket into an SSL socket.
    �httpsNFrcKsftj||||||||||	|
f|�|r2|
dkr2d}
||_||_|
|_||_||_||_||_||_	dS)NZ
CERT_REQUIRED)
r<r/�key_file�	cert_file�	cert_reqs�ca_certs�ca_cert_dir�ssl_version�assert_hostname�assert_fingerprint)r,r)r+r>rrLrCrMrArNrOr�r�r�r�r�r�r�r�rJr-r-r.r/�s	zHTTPSConnectionPool.__init__c	Cs<t|t�r8|j|j|j|j|j|j|j|j	d�|j
|_
|S)z�
        Prepare the ``connection`` for :meth:`urllib3.util.ssl_wrap_socket`
        and establish the tunnel if proxy is used.
        )r�r�r�r�r�r�r�)r?rZset_certr�r�r�r�r�r�r�r�)r,rUr-r-r.�
_prepare_conns

z!HTTPSConnectionPool._prepare_conncCsfy
|j}Wntk
r$|j}YnXtjdkrH|jrH||j|j�n||j|j|j�|j�dS)z�
        Establish tunnel connection early, because otherwise httplib
        would improperly set Host: header to proxy's IP:port.
        r���N)r�r�r�)	�
set_tunnelrXZ_set_tunnelr��version_inforEr*r+�connect)r,rUr�r-r-r.r_ s
z"HTTPSConnectionPool._prepare_proxycCs�|jd7_tjd|j|j�|js2|jtkr:td��|j}|j}|jdk	r`|jj}|jj}|jf|||j	j
|jd�|j��}|j
|�S)zB
        Return a fresh :class:`httplib.HTTPSConnection`.
        rz&Starting new HTTPS connection (%d): %szCCan't connect to HTTPS URL because the SSL module is not available.N)r)r+rr>)rHrQrRr)rSrrr+rDrrTr>rJr�)r,Zactual_hostZactual_portrUr-r-r.rV2s

zHTTPSConnectionPool._new_conncs:tt|�j|�t|dd�s$|j�|js6tjdt�dS)zU
        Called right before a request is made, after the socket is created.
        rkNz�Unverified HTTPS request is being made. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings)	�superr�r^rZr�Zis_verified�warnings�warnr)r,rU)�	__class__r-r.r^Jsz"HTTPSConnectionPool._validate_conn)r1r7r8r9r:rrSr#rpr/r�r_rVr^�
__classcell__r-r-)r�r.r��sr�cKsRt|�\}}}|ptj|d�}|dkr:t|fd|i|��St|fd|i|��SdS)a�
    Given a url, return an :class:`.ConnectionPool` instance of its host.

    This is a shortcut for not having to parse out the scheme, host, and port
    of the url before creating an :class:`.ConnectionPool` instance.

    :param url:
        Absolute URL string that must include the scheme. Port is optional.

    :param \**kw:
        Passes additional parameters to the constructor of the appropriate
        :class:`.ConnectionPool`. Useful for specifying things like
        timeout, maxsize, headers, etc.

    Example::

        >>> conn = connection_from_url('http://google.com/')
        >>> r = conn.request('GET', '/')
    �Pr�r+N)r$rrWr�r<)rh�kwr:r)r+r-r-r.�connection_from_url]s
r�cCs*|jd�r&|jd�r&|jdd�jd�}|S)z'
    Process IPv6 address literals
    �[�]z%25�%z[])r��endswith�replace�strip)r)r-r-r.r'ysr')KZ
__future__rrbZloggingr�r�rqrrtrrc�
exceptionsrrrr	r
rrr
rrrrrZpackages.ssl_match_hostnamerZpackagesrZpackages.six.movesrr�rrrrrrrrnrr�rZutil.connectionrZutil.requestr Z
util.responser!Z
util.retryr"Zutil.timeoutr#Zutil.urlr$r%ZPY2ZQueueZ_unused_module_QueueZmovesrFZ	getLoggerr1rQ�objectr`r&�setZEAGAINZEWOULDBLOCKrer<r�r�r'r-r-r-r.�<module>sF<$
%|site-packages/pip/_vendor/urllib3/__pycache__/poolmanager.cpython-36.pyc000064400000031074147511334600022237 0ustar003

���e�A�@sddlmZddlZddlZddlZddlmZddlmZm	Z	ddlm
Z
ddlmZm
Z
mZddlmZdd	lmZdd
lmZddlmZdd
dgZeje�Zd4Zd5Zejd-e�Zd.d/�Zej ee�ej ee�d0�Z!ee	d0�Z"Gd1d�de�Z#Gd2d
�d
e#�Z$d3d�Z%dS)6�)�absolute_importN�)�RecentlyUsedContainer)�HTTPConnectionPool�HTTPSConnectionPool)�port_by_scheme)�LocationValueError�
MaxRetryError�ProxySchemeUnknown)�urljoin)�RequestMethods)�	parse_url)�Retry�PoolManager�ProxyManager�proxy_from_url�key_file�	cert_file�	cert_reqs�ca_certs�ssl_version�ca_cert_dir�ssl_context�
key_scheme�key_host�key_port�key_timeout�key_retries�
key_strict�	key_block�key_source_address�key_key_file�
key_cert_file�
key_cert_reqs�key_ca_certs�key_ssl_version�key_ca_cert_dir�key_ssl_context�key_maxsize�key_headers�
key__proxy�key__proxy_headers�key_socket_options�key__socks_options�key_assert_hostname�key_assert_fingerprint�PoolKeycCs�|j�}|dj�|d<|dj�|d<x4d	D],}||kr.||dk	r.t||j��||<q.W|jd�}|dk	r|t|�|d<x&t|j��D]}|j|�|d|<q�Wx|j	D]}||kr�d||<q�W|f|�S)
a�
    Create a pool key out of a request context dictionary.

    According to RFC 3986, both the scheme and host are case-insensitive.
    Therefore, this function normalizes both before constructing the pool
    key for an HTTPS request. If you wish to change this behaviour, provide
    alternate callables to ``key_fn_by_scheme``.

    :param key_class:
        The class to use when constructing the key. This should be a namedtuple
        with the ``scheme`` and ``host`` keys at a minimum.
    :type  key_class: namedtuple
    :param request_context:
        A dictionary-like object that contain the context for a request.
    :type  request_context: dict

    :return: A namedtuple that can be used as a connection pool key.
    :rtype:  PoolKey
    �scheme�host�headers�_proxy_headers�_socks_optionsNZsocket_optionsZkey_)r3r4r5)
�copy�lower�	frozenset�items�get�tuple�list�keys�pop�_fields)Z	key_class�request_context�context�keyZsocket_optsZfield�rC�!/usr/lib/python3.6/poolmanager.py�_default_key_normalizer9s

rE)�http�httpsc@sxeZdZdZdZddd�Zdd�Zdd	�Zdd
d�Zdd
�Z	ddd�Z
dd�Zddd�Zd dd�Z
dd�Zd!dd�ZdS)"ra$
    Allows for arbitrary requests while transparently keeping track of
    necessary connection pools for you.

    :param num_pools:
        Number of connection pools to cache before discarding the least
        recently used pool.

    :param headers:
        Headers to include with all requests, unless other headers are given
        explicitly.

    :param \**connection_pool_kw:
        Additional parameters are used to create fresh
        :class:`urllib3.connectionpool.ConnectionPool` instances.

    Example::

        >>> manager = PoolManager(num_pools=2)
        >>> r = manager.request('GET', 'http://google.com/')
        >>> r = manager.request('GET', 'http://google.com/mail')
        >>> r = manager.request('GET', 'http://yahoo.com/')
        >>> len(manager.pools)
        2

    N�
cKs8tj||�||_t|dd�d�|_t|_tj�|_dS)NcSs|j�S)N)�close)�prCrCrD�<lambda>�sz&PoolManager.__init__.<locals>.<lambda>)Zdispose_func)r�__init__�connection_pool_kwr�pools�pool_classes_by_scheme�key_fn_by_schemer6)�self�	num_poolsr3rMrCrCrDrL�szPoolManager.__init__cCs|S)NrC)rQrCrCrD�	__enter__�szPoolManager.__enter__cCs|j�dS)NF)�clear)rQ�exc_typeZexc_valZexc_tbrCrCrD�__exit__�szPoolManager.__exit__cCsf|j|}|dkr|jj�}xdD]}|j|d�q"W|dkrXxtD]}|j|d�qDW|||f|�S)a�
        Create a new :class:`ConnectionPool` based on host, port, scheme, and
        any additional pool keyword arguments.

        If ``request_context`` is provided, it is provided as keyword arguments
        to the pool class used. This method is used to actually create the
        connection pools handed out by :meth:`connection_from_url` and
        companion methods. It is intended to be overridden for customization.
        Nr1r2�portrF)r1r2rW)rOrMr6r>�SSL_KEYWORDS)rQr1r2rWr@Zpool_clsrB�kwrCrCrD�	_new_pool�s




zPoolManager._new_poolcCs|jj�dS)z�
        Empty our store of pools and direct them all to close.

        This will not affect in-flight connections, but they will not be
        re-used after completion.
        N)rNrT)rQrCrCrDrT�szPoolManager.clearrFcCsT|std��|j|�}|pd|d<|s:tj|dj�d�}||d<||d<|j|�S)a�
        Get a :class:`ConnectionPool` based on the host, port, and scheme.

        If ``port`` isn't given, it will be derived from the ``scheme`` using
        ``urllib3.connectionpool.port_by_scheme``. If ``pool_kwargs`` is
        provided, it is merged with the instance's ``connection_pool_kw``
        variable and used to create the new connection pool, if one is
        needed.
        zNo host specified.rFr1�PrWr2)r�_merge_pool_kwargsrr:r7�connection_from_context)rQr2rWr1�pool_kwargsr@rCrCrD�connection_from_host�s
z PoolManager.connection_from_hostcCs,|dj�}|j|}||�}|j||d�S)z�
        Get a :class:`ConnectionPool` based on the request context.

        ``request_context`` must at least contain the ``scheme`` key and its
        value must be a key in ``key_fn_by_scheme`` instance variable.
        r1)r@)r7rP�connection_from_pool_key)rQr@r1Zpool_key_constructor�pool_keyrCrCrDr]�s
z#PoolManager.connection_from_contextc
Cs`|jj�N|jj|�}|r|S|d}|d}|d}|j||||d�}||j|<WdQRX|S)z�
        Get a :class:`ConnectionPool` based on the provided pool key.

        ``pool_key`` should be a namedtuple that only contains immutable
        objects. At a minimum it must have the ``scheme``, ``host``, and
        ``port`` fields.
        r1r2rW)r@N)rN�lockr:rZ)rQrar@Zpoolr1r2rWrCrCrDr`�s
z$PoolManager.connection_from_pool_keycCs t|�}|j|j|j|j|d�S)a�
        Similar to :func:`urllib3.connectionpool.connection_from_url`.

        If ``pool_kwargs`` is not provided and a new pool needs to be
        constructed, ``self.connection_pool_kw`` is used to initialize
        the :class:`urllib3.connectionpool.ConnectionPool`. If ``pool_kwargs``
        is provided, it is used instead. Note that if a new pool does not
        need to be created for the request, the provided ``pool_kwargs`` are
        not used.
        )rWr1r^)r
r_r2rWr1)rQ�urlr^�urCrCrD�connection_from_url
szPoolManager.connection_from_urlcCsZ|jj�}|rVxF|j�D]:\}}|dkrJy
||=WqRtk
rFYqRXq|||<qW|S)a
        Merge a dictionary of override values for self.connection_pool_kw.

        This does not modify self.connection_pool_kw and returns a new dict.
        Any keys in the override dictionary with a value of ``None`` are
        removed from the merged dictionary.
        N)rMr6r9�KeyError)rQ�overrideZbase_pool_kwargsrB�valuerCrCrDr\s

zPoolManager._merge_pool_kwargsTcKsdt|�}|j|j|j|jd�}d|d<d|d<d|krD|jj�|d<|jdk	rj|jdkrj|j||f|�}n|j||j	f|�}|o�|j
�}|s�|St||�}|jdkr�d	}|j
d
�}	t|	t�s�tj|	|d�}	|	jo�|j|��r�x|	jD]}
|dj|
d�q�Wy|	j||||d�}	Wn tk
�r4|	j�r0�|SX|	|d
<||d<tjd
||�|j||f|�S)a]
        Same as :meth:`urllib3.connectionpool.HTTPConnectionPool.urlopen`
        with custom cross-host redirect logic and only sends the request-uri
        portion of the ``url``.

        The given ``url`` parameter must be absolute, such that an appropriate
        :class:`urllib3.connectionpool.ConnectionPool` can be chosen for it.
        )rWr1FZassert_same_host�redirectr3NrFi/ZGET�retries)ri)�responseZ_poolzRedirecting %s -> %s)r
r_r2rWr1r3r6�proxy�urlopenZrequest_uriZget_redirect_locationrZstatusr:�
isinstancerZfrom_intZremove_headers_on_redirectZis_same_hostr>Z	incrementr	Zraise_on_redirect�log�info)rQ�methodrcrirYrdZconnrkZredirect_locationrj�headerrCrCrDrm-s@	



zPoolManager.urlopen)rHN)N)NrFN)N)N)T)�__name__�
__module__�__qualname__�__doc__rlrLrSrVrZrTr_r]r`rer\rmrCrCrCrDrys

	


csHeZdZdZd�fdd�	Zd�fdd�	Zdd	d
�Zd�fdd
�	Z�ZS)raw
    Behaves just like :class:`PoolManager`, but sends all requests through
    the defined proxy, using the CONNECT method for HTTPS URLs.

    :param proxy_url:
        The URL of the proxy to be used.

    :param proxy_headers:
        A dictionary contaning headers that will be sent to the proxy. In case
        of HTTP they are being sent with each request, while in the
        HTTPS/CONNECT case they are sent only once. Could be used for proxy
        authentication.

    Example:
        >>> proxy = urllib3.ProxyManager('http://localhost:3128/')
        >>> r1 = proxy.request('GET', 'http://google.com/')
        >>> r2 = proxy.request('GET', 'http://httpbin.org/')
        >>> len(proxy.pools)
        1
        >>> r3 = proxy.request('GET', 'https://httpbin.org/')
        >>> r4 = proxy.request('GET', 'https://twitter.com/')
        >>> len(proxy.pools)
        3

    rHNcs�t|t�rd|j|j|jf}t|�}|jsFtj|jd�}|j|d�}|jdkrZt	|j��||_
|pfi|_|j
|d<|j|d<tt
|�j||f|�dS)	Nz
%s://%s:%ir[)rWrFrG�_proxyr4)rFrG)rnrr1r2rWr
rr:�_replacer
rl�
proxy_headers�superrrL)rQ�	proxy_urlrRr3ryrMrlrW)�	__class__rCrDrL�s








zProxyManager.__init__rFcsD|dkr tt|�j||||d�Stt|�j|jj|jj|jj|d�S)NrG)r^)rzrr_rlr2rWr1)rQr2rWr1r^)r|rCrDr_�s


z!ProxyManager.connection_from_hostcCs0ddi}t|�j}|r||d<|r,|j|�|S)z�
        Sets headers needed by proxies: specifically, the Accept and Host
        headers. Only sets headers not provided by the user.
        ZAcceptz*/*ZHost)r
�netloc�update)rQrcr3Zheaders_r}rCrCrD�_set_proxy_headers�s

zProxyManager._set_proxy_headersTcsNt|�}|jdkr0|jd|j�}|j||�|d<tt|�j||fd|i|��S)z@Same as HTTP(S)ConnectionPool.urlopen, ``url`` must be absolute.rFr3ri)r
r1r:r3rrzrrm)rQrqrcrirYrdr3)r|rCrDrm�s

zProxyManager.urlopen)rHNN)NrFN)N)T)	rsrtrurvrLr_rrm�
__classcell__rCrC)r|rDris
cKstfd|i|��S)Nr{)r)rcrYrCrCrDr�s)rrrrrrr)rrrrrrrr r!r"r#r$r%r&r'r(r)r*r+r,r-r.r/)&Z
__future__r�collections�	functoolsZlogging�_collectionsrZconnectionpoolrrr�
exceptionsrr	r
Zpackages.six.moves.urllib.parserZrequestrZutil.urlr
Z
util.retryr�__all__Z	getLoggerrsrorXZ_key_fields�
namedtupler0rE�partialrPrOrrrrCrCrCrD�<module>s`

6
qWsite-packages/pip/_vendor/urllib3/__pycache__/exceptions.cpython-36.opt-1.pyc000064400000024121147511334600023046 0ustar003

[compiled CPython 3.6 bytecode for pip/_vendor/urllib3/exceptions.py -- binary
data omitted. The readable docstrings identify the module's exception and
warning hierarchy: the HTTPError/HTTPWarning base classes, PoolError,
RequestError, SSLError, ProxyError, DecodeError, ProtocolError, MaxRetryError,
HostChangedError, TimeoutStateError, TimeoutError, ReadTimeoutError,
ConnectTimeoutError, NewConnectionError, EmptyPoolError, ClosedPoolError,
LocationValueError, LocationParseError, ResponseError, SecurityWarning,
SubjectAltNameWarning, InsecureRequestWarning, SystemTimeWarning,
InsecurePlatformWarning, SNIMissingWarning, DependencyWarning,
ResponseNotChunked, BodyNotHttplibCompatible, IncompleteRead, InvalidHeader,
ProxySchemeUnknown, HeaderParsingError and UnrewindableBodyError.]
site-packages/pip/_vendor/urllib3/__pycache__/connection.cpython-36.pyc000064400000021073147511334600022070 0ustar00
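
A short sketch of the socket_options usage this module's docstring describes,
enabling TCP keep-alive on top of the defaults (the host name is an
illustrative placeholder):

    import socket
    from urllib3 import HTTPConnectionPool
    from urllib3.connection import HTTPConnection

    pool = HTTPConnectionPool(
        'example.org',
        socket_options=HTTPConnection.default_socket_options + [
            (socket.SOL_SOCKET, socket.SO_KEEPALIVE, 1),
        ],
    )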

[compiled CPython 3.6 bytecode for pip/_vendor/urllib3/connection.py -- binary
data omitted. Readable docstrings describe HTTPConnection (a
backwards-compatibility layer over httplib.HTTPConnection that accepts the
strict, source_address and socket_options keyword parameters and adds a
request_chunked() helper), HTTPSConnection, VerifiedHTTPSConnection (which
wraps the socket with SSL certificate verification via set_cert() and
connect()) and the DummyConnection placeholder used to detect a failed import.]
site-packages/pip/_vendor/urllib3/__pycache__/request.cpython-36.opt-1.pyc000064400000012556147511334600022366 0ustar00
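
A brief sketch of the two encoding paths the RequestMethods docstrings
(summarized below) describe; the httpbin.org URLs and field values are only
illustrative:

    import urllib3

    http = urllib3.PoolManager()
    # GET/HEAD/DELETE: fields are urlencoded into the URL
    r1 = http.request('GET', 'http://httpbin.org/get', fields={'q': 'urllib3'})
    # POST/PUT/PATCH: fields are multipart-encoded into the request body
    r2 = http.request('POST', 'http://httpbin.org/post',
                      fields={'upload': ('report.txt', 'contents', 'text/plain')})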

[compiled CPython 3.6 bytecode for pip/_vendor/urllib3/request.py -- binary
data omitted. The readable docstrings describe the RequestMethods mixin:
request() dispatches to request_encode_url() for GET, HEAD, DELETE and OPTIONS
(fields are encoded into the URL) and to request_encode_body() for POST, PUT
and PATCH (fields are encoded into the body as multipart/form-data by default,
or as application/x-www-form-urlencoded when encode_multipart=False).]
�<module>ssite-packages/pip/_vendor/urllib3/__pycache__/fields.cpython-36.opt-1.pyc000064400000013240147511334600022133 0ustar003

[compiled CPython 3.6 bytecode for pip/_vendor/urllib3/fields.py -- binary data
omitted. Readable docstrings cover guess_content_type(), format_header_param()
(RFC 2231-style quoting of header parameters that may contain non-ASCII
values) and the RequestField container, including RequestField.from_tuples(),
render_headers() and make_multipart().]
site-packages/pip/_vendor/urllib3/__pycache__/_collections.cpython-36.pyc000064400000024340147511334600022406 0ustar003
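
The HTTPHeaderDict doctest embedded in this module, restated as a plain script
for reference:

    from urllib3._collections import HTTPHeaderDict

    headers = HTTPHeaderDict()
    headers.add('Set-Cookie', 'foo=bar')
    headers.add('set-cookie', 'baz=quxx')   # merged case-insensitively with the line above
    headers['content-length'] = '7'
    assert headers['SET-cookie'] == 'foo=bar, baz=quxx'
    assert headers['Content-Length'] == '7'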

[compiled CPython 3.6 bytecode for pip/_vendor/urllib3/_collections.py --
binary data omitted. Readable docstrings describe RecentlyUsedContainer (a
thread-safe, dict-like container that keeps at most maxsize keys and calls an
optional dispose_func on evicted values) and HTTPHeaderDict (a dict-like
container that stores and compares header field names case-insensitively per
RFC 7230 and can hold multiple values per field via add()).]
site-packages/pip/_vendor/urllib3/__pycache__/_collections.cpython-36.opt-1.pyc000064400000024340147511334600023345 0ustar00

[compiled CPython 3.6 bytecode -- binary data omitted. Same content as the
_collections.cpython-36.pyc entry above (the .opt-1 optimized variant of the
same module).]
site-packages/pip/_vendor/urllib3/__pycache__/exceptions.cpython-36.pyc000064400000024121147511334600022107 0ustar00

[compiled CPython 3.6 bytecode -- binary data omitted. Same content as the
exceptions.cpython-36.opt-1.pyc entry above: urllib3's exception and warning
hierarchy (HTTPError, PoolError, MaxRetryError, TimeoutError and the related
classes listed there).]
site-packages/pip/_vendor/urllib3/__pycache__/filepost.cpython-36.pyc000064400000005037147511334600021560 0ustar00
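
A minimal sketch of encode_multipart_formdata(), the main helper this module
provides (field names and file contents are illustrative):

    from urllib3.filepost import encode_multipart_formdata

    fields = {
        'field': 'value',
        'attachment': ('hello.txt', 'hello world', 'text/plain'),
    }
    body, content_type = encode_multipart_formdata(fields)
    # body is ready to send; content_type carries the generated boundary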

[compiled CPython 3.6 bytecode for pip/_vendor/urllib3/filepost.py -- binary
data omitted. Readable docstrings cover choose_boundary(),
iter_field_objects(), the deprecated iter_fields() and
encode_multipart_formdata(), which encodes a dictionary of fields using the
multipart/form-data MIME format and returns the body together with its
Content-Type.]
site-packages/pip/_vendor/urllib3/__pycache__/response.cpython-36.opt-1.pyc000064400000037433147511334600022535 0ustar00
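
A short sketch of streaming a response without preloading it, per the
HTTPResponse docstrings summarized below (the httpbin.org URL is illustrative):

    import urllib3

    http = urllib3.PoolManager()
    r = http.request('GET', 'http://httpbin.org/bytes/1024', preload_content=False)
    try:
        for chunk in r.stream(256):     # up to roughly 256 bytes per iteration
            pass                        # handle each chunk here
    finally:
        r.release_conn()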

[compiled CPython 3.6 bytecode for pip/_vendor/urllib3/response.py -- binary
data omitted. Readable docstrings describe the DeflateDecoder/GzipDecoder
helpers and HTTPResponse, an httplib-compatible container whose body is loaded
and decoded on demand: preload_content and decode_content control buffering
and automatic gzip/deflate decoding, read()/stream() expose the body,
read_chunked() handles Transfer-Encoding: chunked, and enforce_content_length
validates the Content-Length header.]
site-packages/pip/_vendor/urllib3/__pycache__/filepost.cpython-36.opt-1.pyc000064400000005037147511334600022517 0ustar00

[compiled CPython 3.6 bytecode -- binary data omitted. Same content as the
filepost.cpython-36.pyc entry above (the .opt-1 optimized variant of the same
module).]
site-packages/pip/_vendor/urllib3/__pycache__/__init__.cpython-36.pyc000064400000004576147511334600021475 0ustar00
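
A tiny sketch of the two package-level helpers whose docstrings are readable
in this entry:

    import logging
    import urllib3

    urllib3.add_stderr_logger(logging.DEBUG)   # attach a stderr handler to the 'urllib3' logger
    urllib3.disable_warnings()                 # suppress urllib3's HTTPWarning categories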

[compiled CPython 3.6 bytecode for pip/_vendor/urllib3/__init__.py -- binary
data omitted. Readable strings show the package metadata (author Andrey
Petrov, MIT license, version 1.22), the public exports (HTTPConnectionPool,
HTTPSConnectionPool, PoolManager, ProxyManager, HTTPResponse, Retry, Timeout,
connection_from_url, encode_multipart_formdata, get_host, make_headers,
proxy_from_url, add_stderr_logger, disable_warnings) and the docstrings of
add_stderr_logger() and disable_warnings().]
site-packages/pip/_vendor/urllib3/__pycache__/connection.cpython-36.opt-1.pyc000064400000021073147511334600023027 0ustar00

[compiled CPython 3.6 bytecode -- binary data omitted. Same content as the
connection.cpython-36.pyc entry above (the .opt-1 optimized variant of
urllib3's connection module).]
CERT_REQUIRED)rT�verify_moderRrSrV�assert_hostnamer�os�path�
expanduser�ca_certs�ca_cert_dir)r.rRrSrVr`r\rrar
r
r�set_certs
z VerifiedHTTPSConnection.set_certc	CsL|j�}|j}t|dd�r4||_|j�d|_|j}tjj	�t
k}|rXtjdj
t
�t�|jdkr|tt|j�t|j�d�|_|j}t|j�|_t||j|j|j|j||d�|_|jr�t|jjdd�|j�nb|jtjko�t|dd	�o�|jd	k	�r.|jj�}|j d
f��stjdj
|�t!�t"||j�p*|�|jtj#k�pD|jdk	|_$dS)Nr8rzWSystem time is way off (before {0}). This will probably lead to SSL verification errors)rUrV)r9rWrXr`raZserver_hostnamerTT)Zbinary_formZcheck_hostnameFZsubjectAltNamez�Certificate for {0} has no `subjectAltName`, falling back to check for a `commonName` for now. This feature is being removed by major browsers and deprecated by RFC 2818. (See https://github.com/shazow/urllib3/issues/497 for details.))%r7r1r:r9r;r<r8�datetime�dateZtoday�RECENT_DATE�warnings�warn�formatrrTrrrUrrVr[rrRrSr`rarZgetpeercert�sslZ	CERT_NONEr\r(r�_match_hostnamerZrP)r.r5ZhostnameZis_time_off�context�certr
r
rr>sT



zVerifiedHTTPSConnection.connect)NNNNNNN)r
rrr!rVr`rarUrrbr>r
r
r
rrY�s
rYcCsLyt||�Wn8tk
rF}ztjd||�||_�WYdd}~XnXdS)Nz@Certificate did not match expected hostname: %s. Certificate: %s)rr�logrZ
_peer_cert)rlZasserted_hostnamer6r
r
rrjbsrj)9Z
__future__rrcZloggingr]r)rOrr4rr3rfZpackagesrZpackages.six.moves.http_clientrr,rriZSSLErrorr	�ImportError�AttributeError�
BaseExceptionr�	NameError�	Exception�
exceptionsrrrrZpackages.ssl_match_hostnamerrZ	util.ssl_rrrrr�utilr�_collectionsrZ	getLoggerr
rmrMrdre�objectr rQrYrjZUnverifiedHTTPSConnectionr
r
r
r�<module>sN
	
&lsite-packages/pip/_vendor/urllib3/__pycache__/connectionpool.cpython-36.pyc000064400000056152147511334600022770 0ustar003

���e��@s�ddlmZddlZddlZddlZddlZddlmZm	Z
ddlZddlmZm
Z
mZmZmZmZmZmZmZmZmZmZmZddlmZddlmZddlmZdd	lm Z m!Z!m"Z"m#Z#m$Z$m%Z%m&Z&dd
l'm(Z(ddl)m*Z*ddl+m,Z,dd
l-m.Z.ddl/m0Z0ddl1m2Z2ddl3m4Z4ddl5m6Z6m7Z7ej8�r<ddl9Z:ej;j<Z<ej=e>�Z?e@�ZAGdd�de@�ZBeCejDejEg�ZFGdd�deBe(�ZGGdd�deG�ZHdd�ZIdd�ZJdS)�)�absolute_importN)�error�timeout�)
�ClosedPoolError�
ProtocolError�EmptyPoolError�HeaderParsingError�HostChangedError�LocationValueError�
MaxRetryError�
ProxyError�ReadTimeoutError�SSLError�TimeoutError�InsecureRequestWarning�NewConnectionError)�CertificateError)�six)�queue)�port_by_scheme�DummyConnection�HTTPConnection�HTTPSConnection�VerifiedHTTPSConnection�
HTTPException�BaseSSLError)�RequestMethods)�HTTPResponse)�is_connection_dropped)�set_file_position)�assert_header_parsing)�Retry)�Timeout)�get_host�Urlc@sDeZdZdZdZejZd
dd�Zdd�Z	dd�Z
d	d
�Zdd�ZdS)�ConnectionPoolzz
    Base class for all connection pools, such as
    :class:`.HTTPConnectionPool` and :class:`.HTTPSConnectionPool`.
    NcCs.|std��t|�j�|_|j�|_||_dS)NzNo host specified.)r�
_ipv6_host�lower�host�_proxy_host�port)�selfr)r+�r-�$/usr/lib/python3.6/connectionpool.py�__init__Cs

zConnectionPool.__init__cCsdt|�j|j|jfS)Nz%s(host=%r, port=%r))�type�__name__r)r+)r,r-r-r.�__str__Ks
zConnectionPool.__str__cCs|S)Nr-)r,r-r-r.�	__enter__OszConnectionPool.__enter__cCs|j�dS)NF)�close)r,�exc_typeZexc_valZexc_tbr-r-r.�__exit__RszConnectionPool.__exit__cCsdS)zD
        Close all pooled connections and disable the pool.
        Nr-)r,r-r-r.r4WszConnectionPool.close)N)
r1�
__module__�__qualname__�__doc__�schemerZ	LifoQueue�QueueClsr/r2r3r6r4r-r-r-r.r&:s
r&c
@s�eZdZdZdZeZeZdde	j
ddddddf	dd�Zdd	�Zd!d
d�Z
dd
�Zdd�Zdd�Zdd�Zdd�Zedfdd�Zdd�Zdd�Zdd�Zdddddeddddf
dd �ZdS)"�HTTPConnectionPoolaN	
    Thread-safe connection pool for one host.

    :param host:
        Host used for this HTTP Connection (e.g. "localhost"), passed into
        :class:`httplib.HTTPConnection`.

    :param port:
        Port used for this HTTP Connection (None is equivalent to 80), passed
        into :class:`httplib.HTTPConnection`.

    :param strict:
        Causes BadStatusLine to be raised if the status line can't be parsed
        as a valid HTTP/1.0 or 1.1 status line, passed into
        :class:`httplib.HTTPConnection`.

        .. note::
           Only works in Python 2. This parameter is ignored in Python 3.

    :param timeout:
        Socket timeout in seconds for each individual connection. This can
        be a float or integer, which sets the timeout for the HTTP request,
        or an instance of :class:`urllib3.util.Timeout` which gives you more
        fine-grained control over request timeouts. After the constructor has
        been parsed, this is always a `urllib3.util.Timeout` object.

    :param maxsize:
        Number of connections to save that can be reused. More than 1 is useful
        in multithreaded situations. If ``block`` is set to False, more
        connections will be created but they will not be saved once they've
        been used.

    :param block:
        If set to True, no more than ``maxsize`` connections will be used at
        a time. When no free connections are available, the call will block
        until a connection has been released. This is a useful side effect for
        particular multithreaded situations where one does not want to use more
        than maxsize connections per host to prevent flooding.

    :param headers:
        Headers to include with all requests, unless other headers are given
        explicitly.

    :param retries:
        Retry configuration to use by default with requests in this pool.

    :param _proxy:
        Parsed proxy URL, should not be used directly, instead, see
        :class:`urllib3.connectionpool.ProxyManager`"

    :param _proxy_headers:
        A dictionary with proxy headers, should not be used directly,
        instead, see :class:`urllib3.connectionpool.ProxyManager`"

    :param \**conn_kw:
        Additional parameters are used to create fresh :class:`urllib3.connection.HTTPConnection`,
        :class:`urllib3.connection.HTTPSConnection` instances.
    �httpNFrc
Ks�tj|||�tj||�||_t|t�s4tj|�}|dkrBtj}||_	||_
|j|�|_||_
|	|_|
pli|_xt|�D]}|jjd�qzWd|_d|_||_|jr�|jjdg�dS)NrZsocket_options)r&r/r�strict�
isinstancer#�
from_floatr"ZDEFAULTr�retriesr;�pool�block�proxy�
proxy_headers�xrange�put�num_connections�num_requests�conn_kw�
setdefault)
r,r)r+r>r�maxsizerC�headersrA�_proxy�_proxy_headersrJ�_r-r-r.r/�s(


zHTTPConnectionPool.__init__cCsJ|jd7_tjd|j|j�|jf|j|j|jj|jd�|j	��}|S)z9
        Return a fresh :class:`HTTPConnection`.
        rz%Starting new HTTP connection (%d): %s)r)r+rr>)
rH�log�debugr)�
ConnectionClsr+r�connect_timeoutr>rJ)r,�connr-r-r.�	_new_conn�szHTTPConnectionPool._new_conncCs�d}y|jj|j|d�}WnBtk
r8t|d��Yn&tjk
r\|jrXt|d��YnX|r�t|�r�t	j
d|j�|j�t
|dd�dkr�d}|p�|j�S)	a�
        Get a connection. Will return a pooled connection if one is available.

        If no connections are available and :prop:`.block` is ``False``, then a
        fresh connection is returned.

        :param timeout:
            Seconds to wait before giving up and raising
            :class:`urllib3.exceptions.EmptyPoolError` if the pool is empty and
            :prop:`.block` is ``True``.
        N)rCrzPool is closed.z>Pool reached maximum size and no more connections are allowed.z Resetting dropped connection: %sZ	auto_openrr)rB�getrC�AttributeErrorrr�EmptyrrrQrRr)r4�getattrrV)r,rrUr-r-r.�	_get_conn�s zHTTPConnectionPool._get_conncCs\y|jj|dd�dStk
r(Yn$tjk
rJtjd|j�YnX|rX|j�dS)a�
        Put a connection back into the pool.

        :param conn:
            Connection object for the current host and port as returned by
            :meth:`._new_conn` or :meth:`._get_conn`.

        If the pool is already full, the connection is closed and discarded
        because we exceeded maxsize. If connections are discarded frequently,
        then maxsize should be increased.

        If the pool is closed, then the connection will be closed and discarded.
        F)rCNz2Connection pool is full, discarding connection: %s)	rBrGrXrZFullrQ�warningr)r4)r,rUr-r-r.�	_put_conn�szHTTPConnectionPool._put_conncCsdS)zU
        Called right before a request is made, after the socket is created.
        Nr-)r,rUr-r-r.�_validate_connsz!HTTPConnectionPool._validate_conncCsdS)Nr-)r,rUr-r-r.�_prepare_proxy!sz!HTTPConnectionPool._prepare_proxycCs2|tkr|jj�St|t�r$|j�Stj|�SdS)z< Helper that always returns a :class:`urllib3.util.Timeout` N)�_DefaultrZcloner?r#r@)r,rr-r-r.�_get_timeout%s


zHTTPConnectionPool._get_timeoutcCsjt|t�rt||d|��t|d�r>|jtkr>t||d|��dt|�ksVdt|�krft||d|��dS)zAIs the error actually a timeout? Will raise a ReadTimeout or passz!Read timed out. (read timeout=%s)�errnoz	timed outzdid not complete (read)N)r?�
SocketTimeoutr�hasattrrb�_blocking_errnos�str)r,�err�url�
timeout_valuer-r-r.�_raise_timeout1s
z!HTTPConnectionPool._raise_timeoutc
:Ks|jd7_|j|�}|j�|j|_y|j|�Wn:ttfk
rp}z|j|||jd��WYdd}~XnX|r�|j	||f|�n|j
||f|�|j}	t|dd�r�|	dkr�t
||d|	��|	tjkr�|jjtj��n|jj|	�yjy|jdd�}
WnTtk
�rPy|j�}
Wn0tk
�rJ}ztj|d�WYdd}~XnXYnXWn<tttfk
�r�}z|j|||	d��WYdd}~XnXt|d	d
�}tjd|j|j|j||||
j|
j �	yt!|
j"�Wn@t#tfk
�r}ztj$d|j%|�|dd
�WYdd}~XnX|
S)a
        Perform a request on a given urllib connection object taken from our
        pool.

        :param conn:
            a connection from one of our connection pools

        :param timeout:
            Socket timeout in seconds for the request. This can be a
            float or integer, which will set the same timeout value for
            the socket connect and the socket read, or an instance of
            :class:`urllib3.util.Timeout`, which gives you more fine-grained
            control over your timeouts.
        r)rgrhriN�sockrz!Read timed out. (read timeout=%s)T)�	bufferingZ
_http_vsn_strzHTTP/?z%s://%s:%s "%s %s %s" %s %sz$Failed to parse headers (url=%s): %s)�exc_info)&rIraZ
start_connectrTrr^rcrrjZrequest_chunked�request�read_timeoutrZrr#�DEFAULT_TIMEOUTrkZ
settimeout�socketZgetdefaulttimeoutZgetresponse�	TypeError�	ExceptionrZ
raise_from�SocketErrorrQrRr:r)r+�statusZlengthr!�msgr	r\�
_absolute_url)
r,rU�methodrhr�chunkedZhttplib_request_kw�timeout_obj�ero�httplib_responseZhttp_versionZhper-r-r.�
_make_requestBsT

(
$z HTTPConnectionPool._make_requestcCst|j|j|j|d�jS)N)r:r)r+�path)r%r:r)r+rh)r,r~r-r-r.rw�sz HTTPConnectionPool._absolute_urlcCsL|jd}|_y"x|jdd�}|r|j�qWWntjk
rFYnXdS)zD
        Close all pooled connections and disable the pool.
        NF)rC)rBrWr4rrY)r,Zold_poolrUr-r-r.r4�szHTTPConnectionPool.closecCst|jd�rdSt|�\}}}t|�j�}|jr@|r@tj|�}n|jrZ|tj|�krZd}|||f|j|j|jfkS)zj
        Check if the given ``url`` is a member of the same host as this
        connection pool.
        �/TN)	�
startswithr$r'r(r+rrWr:r))r,rhr:r)r+r-r-r.�is_same_host�s
zHTTPConnectionPool.is_same_hostTc
.Ks�|dkr|j}t|t�s*tj|||jd�}|
dkr>|
jdd�}
|rZ|j|�rZt|||��d}|
}|jdkr�|j	�}|j
|j�d}d}t||�}�zry�|j
|�}|j|	d�}|j|_|jdk	o�t|dd�}|r�|j|�|j|||||||d	�}|
�s�|nd}||
d
<|jj|f|||d�|
��}d}Wn�tjk
�rNt|d��Yn�ttttttt fk
�r}z�d}t|tt f��r�t|�}n>t|tt!f��r�|j�r�t"d
|�}nt|ttf��r�td|�}|j#||||t$j%�dd�}|j&�|}WYdd}~XnXWd|�s |�o|j'�}d}|�r0|j(|�X|�spt)j*d|||�|j+|||||||f||	|
|d�|
��Sdd�}|�o�|j,�}|�r$|j-dk�r�d}y|j#||||d�}Wn(t.k
�r�|j/�r�||��|SX||�|j0|�t)j1d||�|j+||||f|||||	|
|d�|
��St2|j3d��}|j4||j-|��r�y|j#||||d�}Wn(t.k
�r�|j5�r~||��|SX||�|j&|�t)j1d|�|j+||||f|||||	|
|d�|
��S|S)a�
        Get a connection from the pool and perform an HTTP request. This is the
        lowest level call for making a request, so you'll need to specify all
        the raw details.

        .. note::

           More commonly, it's appropriate to use a convenience method provided
           by :class:`.RequestMethods`, such as :meth:`request`.

        .. note::

           `release_conn` will only behave as expected if
           `preload_content=False` because we want to make
           `preload_content=False` the default behaviour someday soon without
           breaking backwards compatibility.

        :param method:
            HTTP request method (such as GET, POST, PUT, etc.)

        :param body:
            Data to send in the request body (useful for creating
            POST requests, see HTTPConnectionPool.post_url for
            more convenience).

        :param headers:
            Dictionary of custom headers to send, such as User-Agent,
            If-None-Match, etc. If None, pool headers are used. If provided,
            these headers completely replace any pool-specific headers.

        :param retries:
            Configure the number of retries to allow before raising a
            :class:`~urllib3.exceptions.MaxRetryError` exception.

            Pass ``None`` to retry until you receive a response. Pass a
            :class:`~urllib3.util.retry.Retry` object for fine-grained control
            over different types of retries.
            Pass an integer number to retry connection errors that many times,
            but no other types of errors. Pass zero to never retry.

            If ``False``, then retries are disabled and any exception is raised
            immediately. Also, instead of raising a MaxRetryError on redirects,
            the redirect response will be returned.

        :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.

        :param redirect:
            If True, automatically handle redirects (status codes 301, 302,
            303, 307, 308). Each redirect counts as a retry. Disabling retries
            will disable redirect, too.

        :param assert_same_host:
            If ``True``, will make sure that the host of the pool requests is
            consistent else will raise HostChangedError. When False, you can
            use the pool on an HTTP proxy and request foreign hosts.

        :param timeout:
            If specified, overrides the default timeout for this one
            request. It may be a float (in seconds) or an instance of
            :class:`urllib3.util.Timeout`.

        :param pool_timeout:
            If set and the pool is set to block=True, then this method will
            block for ``pool_timeout`` seconds and raise EmptyPoolError if no
            connection is available within the time period.

        :param release_conn:
            If False, then the urlopen call will not release the connection
            back into the pool once a response is received (but will release if
            you read the entire contents of the response such as when
            `preload_content=True`). This is useful if you're not preloading
            the response's content immediately. You will need to call
            ``r.release_conn()`` on the response ``r`` to return the connection
            back into the pool. If None, it takes the value of
            ``response_kw.get('preload_content', True)``.

        :param chunked:
            If True, urllib3 will send the body using chunked transfer
            encoding. Otherwise, urllib3 will send the body using the standard
            content-length form. Defaults to False.

        :param int body_pos:
            Position to seek to in file-like body in the event of a retry or
            redirect. Typically this won't need to be set because urllib3 will
            auto-populate the value when needed.

        :param \**response_kw:
            Additional parameters are passed to
            :meth:`urllib3.response.HTTPResponse.from_httplib`
        N)�redirect�defaultZpreload_contentTr=F)rrk)r�bodyrMryZrequest_method)rB�
connectionrAz"No pool connections are available.zCannot connect to proxy.zConnection aborted.�)r�_poolZ_stacktracez1Retrying (%r) after connection broken by '%r': %s)r�pool_timeout�release_conn�body_poscSs@y|j�Wn.ttttttfk
r:}zWYdd}~XnXdS)N)�readrrrtrrr)�responser{r-r-r.�drain_and_release_conn�s

z:HTTPConnectionPool.urlopen.<locals>.drain_and_release_conni/ZGET)r�r�zRedirecting %s -> %s)rAr��assert_same_hostrr�r�r�zRetry-Afterz	Retry: %s)6rMr?r"Zfrom_intrArWr�r
r:�copy�updaterEr rar[rTrrDrZr_r}�ResponseClsZfrom_httplibrrYrrrrtrrrrrr
Z	increment�sysrmZsleepr4r]rQr\�urlopenZget_redirect_locationrurZraise_on_redirectZsleep_for_retryrR�boolZ	getheaderZis_retryZraise_on_status)r,rxrhr�rMrAr�r�rr�r�ryr�Zresponse_kwrUZrelease_this_connrgZ
clean_exitrzZis_new_proxy_connr|Z
response_connr�r{r�Zredirect_locationZhas_retry_afterr-r-r.r��s�^















zHTTPConnectionPool.urlopen)N)r1r7r8r9r:rrSrr�r#rpr/rVr[r]r^r_rarjr`r}rwr4r�r�r-r-r-r.r<bs.:%
&Ur<csneZdZdZdZeZddejddddddddddddddfdd�Z	dd	�Z
d
d�Zdd
�Z�fdd�Z
�ZS)�HTTPSConnectionPoola�
    Same as :class:`.HTTPConnectionPool`, but HTTPS.

    When Python is compiled with the :mod:`ssl` module, then
    :class:`.VerifiedHTTPSConnection` is used, which *can* verify certificates,
    instead of :class:`.HTTPSConnection`.

    :class:`.VerifiedHTTPSConnection` uses one of ``assert_fingerprint``,
    ``assert_hostname`` and ``host`` in this order to verify connections.
    If ``assert_hostname`` is False, no verification is done.

    The ``key_file``, ``cert_file``, ``cert_reqs``, ``ca_certs``,
    ``ca_cert_dir``, and ``ssl_version`` are only used if :mod:`ssl` is
    available and are fed into :meth:`urllib3.util.ssl_wrap_socket` to upgrade
    the connection socket into an SSL socket.
    �httpsNFrcKsftj||||||||||	|
f|�|r2|
dkr2d}
||_||_|
|_||_||_||_||_||_	dS)NZ
CERT_REQUIRED)
r<r/�key_file�	cert_file�	cert_reqs�ca_certs�ca_cert_dir�ssl_version�assert_hostname�assert_fingerprint)r,r)r+r>rrLrCrMrArNrOr�r�r�r�r�r�r�r�rJr-r-r.r/�s	zHTTPSConnectionPool.__init__c	Cs<t|t�r8|j|j|j|j|j|j|j|j	d�|j
|_
|S)z�
        Prepare the ``connection`` for :meth:`urllib3.util.ssl_wrap_socket`
        and establish the tunnel if proxy is used.
        )r�r�r�r�r�r�r�)r?rZset_certr�r�r�r�r�r�r�r�)r,rUr-r-r.�
_prepare_conns

z!HTTPSConnectionPool._prepare_conncCsfy
|j}Wntk
r$|j}YnXtjdkrH|jrH||j|j�n||j|j|j�|j�dS)z�
        Establish tunnel connection early, because otherwise httplib
        would improperly set Host: header to proxy's IP:port.
        r���N)r�r�r�)	�
set_tunnelrXZ_set_tunnelr��version_inforEr*r+�connect)r,rUr�r-r-r.r_ s
z"HTTPSConnectionPool._prepare_proxycCs�|jd7_tjd|j|j�|js2|jtkr:td��|j}|j}|jdk	r`|jj}|jj}|jf|||j	j
|jd�|j��}|j
|�S)zB
        Return a fresh :class:`httplib.HTTPSConnection`.
        rz&Starting new HTTPS connection (%d): %szCCan't connect to HTTPS URL because the SSL module is not available.N)r)r+rr>)rHrQrRr)rSrrr+rDrrTr>rJr�)r,Zactual_hostZactual_portrUr-r-r.rV2s

zHTTPSConnectionPool._new_conncs:tt|�j|�t|dd�s$|j�|js6tjdt�dS)zU
        Called right before a request is made, after the socket is created.
        rkNz�Unverified HTTPS request is being made. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings)	�superr�r^rZr�Zis_verified�warnings�warnr)r,rU)�	__class__r-r.r^Jsz"HTTPSConnectionPool._validate_conn)r1r7r8r9r:rrSr#rpr/r�r_rVr^�
__classcell__r-r-)r�r.r��sr�cKsRt|�\}}}|ptj|d�}|dkr:t|fd|i|��St|fd|i|��SdS)a�
    Given a url, return an :class:`.ConnectionPool` instance of its host.

    This is a shortcut for not having to parse out the scheme, host, and port
    of the url before creating an :class:`.ConnectionPool` instance.

    :param url:
        Absolute URL string that must include the scheme. Port is optional.

    :param \**kw:
        Passes additional parameters to the constructor of the appropriate
        :class:`.ConnectionPool`. Useful for specifying things like
        timeout, maxsize, headers, etc.

    Example::

        >>> conn = connection_from_url('http://google.com/')
        >>> r = conn.request('GET', '/')
    �Pr�r+N)r$rrWr�r<)rh�kwr:r)r+r-r-r.�connection_from_url]s
r�cCs*|jd�r&|jd�r&|jdd�jd�}|S)z'
    Process IPv6 address literals
    �[�]z%25�%z[])r��endswith�replace�strip)r)r-r-r.r'ysr')KZ
__future__rrbZloggingr�r�rqrrtrrc�
exceptionsrrrr	r
rrr
rrrrrZpackages.ssl_match_hostnamerZpackagesrZpackages.six.movesrr�rrrrrrrrnrr�rZutil.connectionrZutil.requestr Z
util.responser!Z
util.retryr"Zutil.timeoutr#Zutil.urlr$r%ZPY2ZQueueZ_unused_module_QueueZmovesrFZ	getLoggerr1rQ�objectr`r&�setZEAGAINZEWOULDBLOCKrer<r�r�r'r-r-r-r.�<module>sF<$
%|site-packages/pip/_vendor/urllib3/__pycache__/fields.cpython-36.pyc000064400000013240147511334600021174 0ustar003

���e7�@sNddlmZddlZddlZddlmZddd�Zdd	�ZGd
d�de	�Z
dS)
�)�absolute_importN�)�six�application/octet-streamcCs|rtj|�dp|S|S)z�
    Guess the "Content-Type" of a file.

    :param filename:
        The filename to guess the "Content-Type" of using :mod:`mimetypes`.
    :param default:
        If no "Content-Type" can be guessed, default to `default`.
    r)�	mimetypesZ
guess_type)�filename�default�r	�/usr/lib/python3.6/fields.py�guess_content_types	rcs�t�fdd�dD��sNd|�f}y|jd�Wnttfk
rHYnX|Stjrlt�tj�rl�jd��tj	j
�d��d|�f��S)a�
    Helper function to format and quote a single header parameter.

    Particularly useful for header parameters which might contain
    non-ASCII values, like file names. This follows RFC 2231, as
    suggested by RFC 2388 Section 4.4.

    :param name:
        The name of the parameter, a string expected to be ASCII only.
    :param value:
        The value of the parameter, provided as a unicode string.
    c3s|]}|�kVqdS)Nr	)�.0Zch)�valuer	r
�	<genexpr>#sz&format_header_param.<locals>.<genexpr>z"\
z%s="%s"�asciizutf-8z%s*=%s)�any�encode�UnicodeEncodeError�UnicodeDecodeErrorrZPY3�
isinstanceZ	text_type�emailZutilsZencode_rfc2231)�namer
�resultr	)r
r
�format_header_params

rc@sHeZdZdZddd�Zedd��Zdd�Zd	d
�Zdd�Z	dd
d�Z
dS)�RequestFieldaK
    A data container for request body parameters.

    :param name:
        The name of this request field.
    :param data:
        The data/value body.
    :param filename:
        An optional filename of the request field.
    :param headers:
        An optional dict-like object of headers to initially use for the field.
    NcCs*||_||_||_i|_|r&t|�|_dS)N)�_name�	_filename�data�headers�dict)�selfrrrrr	r	r
�__init__?szRequestField.__init__cCs^t|t�r4t|�dkr"|\}}}q@|\}}t|�}nd}d}|}||||d�}|j|d�|S)a�
        A :class:`~urllib3.fields.RequestField` factory from old-style tuple parameters.

        Supports constructing :class:`~urllib3.fields.RequestField` from
        parameter of key/value strings AND key/filetuple. A filetuple is a
        (filename, data, MIME type) tuple where the MIME type is optional.
        For example::

            'foo': 'bar',
            'fakefile': ('foofile.txt', 'contents of foofile'),
            'realfile': ('barfile.txt', open('realfile').read()),
            'typedfile': ('bazfile.bin', open('bazfile').read(), 'image/jpeg'),
            'nonamefile': 'contents of nonamefile field',

        Field names and filenames must be unicode.
        �N)r)�content_type)r�tuple�lenr�make_multipart)�clsZ	fieldnamer
rrr"Z
request_paramr	r	r
�from_tuplesGs

zRequestField.from_tuplescCs
t||�S)a
        Overridable helper function to format a single header parameter.

        :param name:
            The name of the parameter, a string expected to be ASCII only.
        :param value:
            The value of the parameter, provided as a unicode string.
        )r)rrr
r	r	r
�_render_partis	zRequestField._render_partcCsPg}|}t|t�r|j�}x*|D]"\}}|dk	r |j|j||��q Wdj|�S)aO
        Helper function to format and quote a single header.

        Useful for single headers that are composed of multiple items. E.g.,
        'Content-Disposition' fields.

        :param header_parts:
            A sequence of (k, v) typles or a :class:`dict` of (k, v) to format
            as `k1="v1"; k2="v2"; ...`.
        Nz; )rr�items�appendr(�join)rZheader_parts�parts�iterablerr
r	r	r
�
_render_partsts
zRequestField._render_partscCs�g}dddg}x2|D]*}|jj|d�r|jd||j|f�qWx4|jj�D]&\}}||krN|rN|jd||f�qNW|jd�dj|�S)z=
        Renders the headers for this request field.
        zContent-DispositionzContent-TypezContent-LocationFz%s: %sz
)r�getr*r)r+)r�linesZ	sort_keysZsort_keyZheader_nameZheader_valuer	r	r
�render_headers�s


zRequestField.render_headersc	CsX|pd|jd<|jddjd|jd|jfd|jff�g�7<||jd<||jd<d	S)
a|
        Makes this request field into a multipart request field.

        This method overrides "Content-Disposition", "Content-Type" and
        "Content-Location" headers to the request parameter.

        :param content_type:
            The 'Content-Type' of the request body.
        :param content_location:
            The 'Content-Location' of the request body.

        z	form-datazContent-Dispositionz; �rrzContent-TypezContent-LocationN)rr+r.rr)rZcontent_dispositionr"Zcontent_locationr	r	r
r%�s
zRequestField.make_multipart)NN)NNN)�__name__�
__module__�__qualname__�__doc__r �classmethodr'r(r.r1r%r	r	r	r
r2s
"r)r)Z
__future__rZemail.utilsrrZpackagesrrr�objectrr	r	r	r
�<module>s
site-packages/pip/_vendor/urllib3/__pycache__/request.cpython-36.pyc000064400000012556147511334600021427 0ustar003

���e:�@s>ddlmZddlmZddlmZdgZGdd�de�ZdS)�)�absolute_import�)�encode_multipart_formdata)�	urlencode�RequestMethodsc@sReZdZdZeddddg�Zddd�Zdd
d�Zddd
�Zddd�Z	ddd�Z
dS)ra�
    Convenience mixin for classes who implement a :meth:`urlopen` method, such
    as :class:`~urllib3.connectionpool.HTTPConnectionPool` and
    :class:`~urllib3.poolmanager.PoolManager`.

    Provides behavior for making common types of HTTP request methods and
    decides which type of request field encoding to use.

    Specifically,

    :meth:`.request_encode_url` is for sending requests whose fields are
    encoded in the URL (such as GET, HEAD, DELETE).

    :meth:`.request_encode_body` is for sending requests whose fields are
    encoded in the *body* of the request using multipart or www-form-urlencoded
    (such as for POST, PUT, PATCH).

    :meth:`.request` is for making any kind of request, it will look up the
    appropriate encoding format and use one of the above two methods to make
    the request.

    Initializer parameters:

    :param headers:
        Headers to include with all requests, unless other headers are given
        explicitly.
    ZDELETEZGETZHEADZOPTIONSNcCs|pi|_dS)N)�headers)�selfr�r	�/usr/lib/python3.6/request.py�__init__)szRequestMethods.__init__TcKstd��dS)NzMClasses extending RequestMethods must implement their own ``urlopen`` method.)�NotImplemented)r�method�url�bodyr�encode_multipart�multipart_boundary�kwr	r	r
�urlopen,szRequestMethods.urlopencKsJ|j�}||jkr,|j||f||d�|��S|j||f||d�|��SdS)a�
        Make a request using :meth:`urlopen` with the appropriate encoding of
        ``fields`` based on the ``method`` used.

        This is a convenience method that requires the least amount of manual
        effort. It can be used in most situations, while still having the
        option to drop down to more specific methods when necessary, such as
        :meth:`request_encode_url`, :meth:`request_encode_body`,
        or even the lowest level :meth:`urlopen`.
        )�fieldsrN)�upper�_encode_url_methods�request_encode_url�request_encode_body)rr
rrr�
urlopen_kwr	r	r
�request2s
zRequestMethods.requestcKsD|dkr|j}d|i}|j|�|r4|dt|�7}|j||f|�S)z�
        Make a request using :meth:`urlopen` with the ``fields`` encoded in
        the url. This is useful for request methods like GET, HEAD, DELETE, etc.
        Nr�?)r�updaterr)rr
rrrr�extra_kwr	r	r
rHs
z!RequestMethods.request_encode_urlcKs�|dkr|j}dii}|rbd|kr*td��|r@t||d�\}	}
nt|�d}	}
|	|d<d|
i|d<|dj|�|j|�|j||f|�S)a�
        Make a request using :meth:`urlopen` with the ``fields`` encoded in
        the body. This is useful for request methods like POST, PUT, PATCH, etc.

        When ``encode_multipart=True`` (default), then
        :meth:`urllib3.filepost.encode_multipart_formdata` is used to encode
        the payload with the appropriate content type. Otherwise
        :meth:`urllib.urlencode` is used with the
        'application/x-www-form-urlencoded' content type.

        Multipart encoding must be used when posting files, and it's reasonably
        safe to use it in other times too. However, it may break request
        signing, such as with OAuth.

        Supports an optional ``fields`` parameter of key/value strings AND
        key/filetuple. A filetuple is a (filename, data, MIME type) tuple where
        the MIME type is optional. For example::

            fields = {
                'foo': 'bar',
                'fakefile': ('foofile.txt', 'contents of foofile'),
                'realfile': ('barfile.txt', open('realfile').read()),
                'typedfile': ('bazfile.bin', open('bazfile').read(),
                              'image/jpeg'),
                'nonamefile': 'contents of nonamefile field',
            }

        When uploading a file, providing a filename (the first parameter of the
        tuple) is optional but recommended to best mimick behavior of browsers.

        Note that if ``headers`` are supplied, the 'Content-Type' header will
        be overwritten because it depends on the dynamic random boundary string
        which is used to compose the body of the request. The random boundary
        string can be explicitly set with the ``multipart_boundary`` parameter.
        NrrzFrequest got values for both 'fields' and 'body', can only specify one.)�boundaryz!application/x-www-form-urlencodedzContent-Type)r�	TypeErrorrrrr)rr
rrrrrrrrZcontent_typer	r	r
rYs&
z"RequestMethods.request_encode_body)N)NNTN)NN)NN)NNTN)�__name__�
__module__�__qualname__�__doc__�setrrrrrrr	r	r	r
r
s



N)	Z
__future__rZfilepostrZpackages.six.moves.urllib.parser�__all__�objectrr	r	r	r
�<module>ssite-packages/pip/_vendor/urllib3/exceptions.py000064400000014713147511334600015631 0ustar00from __future__ import absolute_import
from .packages.six.moves.http_client import (
    IncompleteRead as httplib_IncompleteRead
)
# Base Exceptions


class HTTPError(Exception):
    "Base exception used by this module."
    pass


class HTTPWarning(Warning):
    "Base warning used by this module."
    pass


class PoolError(HTTPError):
    "Base exception for errors caused within a pool."
    def __init__(self, pool, message):
        self.pool = pool
        HTTPError.__init__(self, "%s: %s" % (pool, message))

    def __reduce__(self):
        # For pickling purposes.
        return self.__class__, (None, None)


class RequestError(PoolError):
    "Base exception for PoolErrors that have associated URLs."
    def __init__(self, pool, url, message):
        self.url = url
        PoolError.__init__(self, pool, message)

    def __reduce__(self):
        # For pickling purposes.
        return self.__class__, (None, self.url, None)


class SSLError(HTTPError):
    "Raised when SSL certificate fails in an HTTPS connection."
    pass


class ProxyError(HTTPError):
    "Raised when the connection to a proxy fails."
    pass


class DecodeError(HTTPError):
    "Raised when automatic decoding based on Content-Type fails."
    pass


class ProtocolError(HTTPError):
    "Raised when something unexpected happens mid-request/response."
    pass


#: Renamed to ProtocolError but aliased for backwards compatibility.
ConnectionError = ProtocolError


# Leaf Exceptions

class MaxRetryError(RequestError):
    """Raised when the maximum number of retries is exceeded.

    :param pool: The connection pool
    :type pool: :class:`~urllib3.connectionpool.HTTPConnectionPool`
    :param string url: The requested URL
    :param exceptions.Exception reason: The underlying error

    """

    def __init__(self, pool, url, reason=None):
        self.reason = reason

        message = "Max retries exceeded with url: %s (Caused by %r)" % (
            url, reason)

        RequestError.__init__(self, pool, url, message)


class HostChangedError(RequestError):
    "Raised when an existing pool gets a request for a foreign host."

    def __init__(self, pool, url, retries=3):
        message = "Tried to open a foreign host with url: %s" % url
        RequestError.__init__(self, pool, url, message)
        self.retries = retries


class TimeoutStateError(HTTPError):
    """ Raised when passing an invalid state to a timeout """
    pass


class TimeoutError(HTTPError):
    """ Raised when a socket timeout error occurs.

    Catching this error will catch both :exc:`ReadTimeoutErrors
    <ReadTimeoutError>` and :exc:`ConnectTimeoutErrors <ConnectTimeoutError>`.
    """
    pass


class ReadTimeoutError(TimeoutError, RequestError):
    "Raised when a socket timeout occurs while receiving data from a server"
    pass


# This timeout error does not have a URL attached and needs to inherit from the
# base HTTPError
class ConnectTimeoutError(TimeoutError):
    "Raised when a socket timeout occurs while connecting to a server"
    pass


class NewConnectionError(ConnectTimeoutError, PoolError):
    "Raised when we fail to establish a new connection. Usually ECONNREFUSED."
    pass


class EmptyPoolError(PoolError):
    "Raised when a pool runs out of connections and no more are allowed."
    pass


class ClosedPoolError(PoolError):
    "Raised when a request enters a pool after the pool has been closed."
    pass


class LocationValueError(ValueError, HTTPError):
    "Raised when there is something wrong with a given URL input."
    pass


class LocationParseError(LocationValueError):
    "Raised when get_host or similar fails to parse the URL input."

    def __init__(self, location):
        message = "Failed to parse: %s" % location
        HTTPError.__init__(self, message)

        self.location = location


class ResponseError(HTTPError):
    "Used as a container for an error reason supplied in a MaxRetryError."
    GENERIC_ERROR = 'too many error responses'
    SPECIFIC_ERROR = 'too many {status_code} error responses'


class SecurityWarning(HTTPWarning):
    "Warned when perfoming security reducing actions"
    pass


class SubjectAltNameWarning(SecurityWarning):
    "Warned when connecting to a host with a certificate missing a SAN."
    pass


class InsecureRequestWarning(SecurityWarning):
    "Warned when making an unverified HTTPS request."
    pass


class SystemTimeWarning(SecurityWarning):
    "Warned when system time is suspected to be wrong"
    pass


class InsecurePlatformWarning(SecurityWarning):
    "Warned when certain SSL configuration is not available on a platform."
    pass


class SNIMissingWarning(HTTPWarning):
    "Warned when making a HTTPS request without SNI available."
    pass


class DependencyWarning(HTTPWarning):
    """
    Warned when an attempt is made to import a module with missing optional
    dependencies.
    """
    pass


class ResponseNotChunked(ProtocolError, ValueError):
    "Response needs to be chunked in order to read it as chunks."
    pass


class BodyNotHttplibCompatible(HTTPError):
    """
    Body should be httplib.HTTPResponse like (have an fp attribute which
    returns raw chunks) for read_chunked().
    """
    pass


class IncompleteRead(HTTPError, httplib_IncompleteRead):
    """
    Response length doesn't match expected Content-Length

    Subclass of http_client.IncompleteRead to allow int value
    for `partial` to avoid creating large objects on streamed
    reads.
    """
    def __init__(self, partial, expected):
        super(IncompleteRead, self).__init__(partial, expected)

    def __repr__(self):
        return ('IncompleteRead(%i bytes read, '
                '%i more expected)' % (self.partial, self.expected))


class InvalidHeader(HTTPError):
    "The header provided was somehow invalid."
    pass


class ProxySchemeUnknown(AssertionError, ValueError):
    "ProxyManager does not support the supplied scheme"
    # TODO(t-8ch): Stop inheriting from AssertionError in v2.0.

    def __init__(self, scheme):
        message = "Not supported proxy scheme %s" % scheme
        super(ProxySchemeUnknown, self).__init__(message)


class HeaderParsingError(HTTPError):
    "Raised by assert_header_parsing, but we convert it to a log.warning statement."
    def __init__(self, defects, unparsed_data):
        message = '%s, unparsed data: %r' % (defects or 'Unknown', unparsed_data)
        super(HeaderParsingError, self).__init__(message)


class UnrewindableBodyError(HTTPError):
    "urllib3 encountered an error when trying to rewind a body"
    pass
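
# Illustrative usage sketch (not part of the original module). It shows how
# calling code typically distinguishes these exceptions; the PoolManager,
# URL and logging calls below are hypothetical examples, and the vendored
# import path is assumed to be importable from within pip.
#
#     import logging
#     from pip._vendor import urllib3
#
#     http = urllib3.PoolManager()
#     try:
#         response = http.request("GET", "https://example.invalid/")
#     except urllib3.exceptions.MaxRetryError as exc:
#         # exc.reason carries the underlying error, e.g. a NewConnectionError.
#         logging.warning("gave up after retries: %r", exc.reason)
#     except urllib3.exceptions.TimeoutError:
#         # Catches both ReadTimeoutError and ConnectTimeoutError.
#         logging.warning("socket timed out")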
[compiled bytecode omitted: the following .pyc archive entries are binary data and cannot be reproduced as text; only the archive paths are kept]
site-packages/pip/_vendor/html5lib/treeadapters/__pycache__/__init__.cpython-36.pyc
site-packages/pip/_vendor/html5lib/treeadapters/__pycache__/genshi.cpython-36.opt-1.pyc
site-packages/pip/_vendor/html5lib/treeadapters/__pycache__/__init__.cpython-36.opt-1.pyc
site-packages/pip/_vendor/html5lib/treeadapters/__pycache__/genshi.cpython-36.pyc
site-packages/pip/_vendor/html5lib/treeadapters/__pycache__/sax.cpython-36.opt-1.pyc
site-packages/pip/_vendor/html5lib/treeadapters/__pycache__/sax.cpython-36.pyc

site-packages/pip/_vendor/html5lib/treeadapters/genshi.py
from __future__ import absolute_import, division, unicode_literals

from genshi.core import QName, Attrs
from genshi.core import START, END, TEXT, COMMENT, DOCTYPE


def to_genshi(walker):
    text = []
    for token in walker:
        type = token["type"]
        if type in ("Characters", "SpaceCharacters"):
            text.append(token["data"])
        elif text:
            yield TEXT, "".join(text), (None, -1, -1)
            text = []

        if type in ("StartTag", "EmptyTag"):
            if token["namespace"]:
                name = "{%s}%s" % (token["namespace"], token["name"])
            else:
                name = token["name"]
            attrs = Attrs([(QName("{%s}%s" % attr if attr[0] is not None else attr[1]), value)
                           for attr, value in token["data"].items()])
            yield (START, (QName(name), attrs), (None, -1, -1))
            if type == "EmptyTag":
                type = "EndTag"

        if type == "EndTag":
            if token["namespace"]:
                name = "{%s}%s" % (token["namespace"], token["name"])
            else:
                name = token["name"]

            yield END, QName(name), (None, -1, -1)

        elif type == "Comment":
            yield COMMENT, token["data"], (None, -1, -1)

        elif type == "Doctype":
            yield DOCTYPE, (token["name"], token["publicId"],
                            token["systemId"]), (None, -1, -1)

        else:
            pass  # FIXME: What to do?

    if text:
        yield TEXT, "".join(text), (None, -1, -1)
site-packages/pip/_vendor/html5lib/treeadapters/__init__.py000064400000000320147511334600020023 0ustar00from __future__ import absolute_import, division, unicode_literals

from . import sax

__all__ = ["sax"]

try:
    from . import genshi  # noqa
except ImportError:
    pass
else:
    __all__.append("genshi")
site-packages/pip/_vendor/html5lib/treeadapters/sax.py000064400000003175147511334600017072 0ustar00from __future__ import absolute_import, division, unicode_literals

from xml.sax.xmlreader import AttributesNSImpl

from ..constants import adjustForeignAttributes, unadjustForeignAttributes

prefix_mapping = {}
for prefix, localName, namespace in adjustForeignAttributes.values():
    if prefix is not None:
        prefix_mapping[prefix] = namespace


def to_sax(walker, handler):
    """Call SAX-like content handler based on treewalker walker"""
    handler.startDocument()
    for prefix, namespace in prefix_mapping.items():
        handler.startPrefixMapping(prefix, namespace)

    for token in walker:
        type = token["type"]
        if type == "Doctype":
            continue
        elif type in ("StartTag", "EmptyTag"):
            attrs = AttributesNSImpl(token["data"],
                                     unadjustForeignAttributes)
            handler.startElementNS((token["namespace"], token["name"]),
                                   token["name"],
                                   attrs)
            if type == "EmptyTag":
                handler.endElementNS((token["namespace"], token["name"]),
                                     token["name"])
        elif type == "EndTag":
            handler.endElementNS((token["namespace"], token["name"]),
                                 token["name"])
        elif type in ("Characters", "SpaceCharacters"):
            handler.characters(token["data"])
        elif type == "Comment":
            pass
        else:
            assert False, "Unknown token type"

    for prefix, namespace in prefix_mapping.items():
        handler.endPrefixMapping(prefix)
    handler.endDocument()
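
# Illustrative usage sketch (not part of the original module): driving a
# standard-library SAX handler from an html5lib tree walker. Assumes the
# top-level html5lib package is importable; variable names are hypothetical.
#
#     import html5lib
#     from io import StringIO
#     from xml.sax.saxutils import XMLGenerator
#
#     tree = html5lib.parse("<p>hello</p>")
#     walker = html5lib.getTreeWalker("etree")(tree)
#     out = StringIO()
#     to_sax(walker, XMLGenerator(out))
#     print(out.getvalue())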
site-packages/pip/_vendor/html5lib/treebuilders/etree_lxml.py000064400000033521147511334600020443 0ustar00"""Module for supporting the lxml.etree library. The idea here is to use as much
of the native library as possible, without using fragile hacks like custom element
names that break between releases. The downside of this is that we cannot represent
all possible trees; specifically the following are known to cause problems:

Text or comments as siblings of the root element
Doctypes with no name

When any of these things occur, we emit a DataLossWarning
"""

from __future__ import absolute_import, division, unicode_literals
# pylint:disable=protected-access

import warnings
import re
import sys

from . import base
from ..constants import DataLossWarning
from .. import constants
from . import etree as etree_builders
from .. import _ihatexml

import lxml.etree as etree


fullTree = True
tag_regexp = re.compile("{([^}]*)}(.*)")

comment_type = etree.Comment("asd").tag


class DocumentType(object):
    def __init__(self, name, publicId, systemId):
        self.name = name
        self.publicId = publicId
        self.systemId = systemId


class Document(object):
    def __init__(self):
        self._elementTree = None
        self._childNodes = []

    def appendChild(self, element):
        self._elementTree.getroot().addnext(element._element)

    def _getChildNodes(self):
        return self._childNodes

    childNodes = property(_getChildNodes)


def testSerializer(element):
    rv = []
    infosetFilter = _ihatexml.InfosetFilter(preventDoubleDashComments=True)

    def serializeElement(element, indent=0):
        if not hasattr(element, "tag"):
            if hasattr(element, "getroot"):
                # Full tree case
                rv.append("#document")
                if element.docinfo.internalDTD:
                    if not (element.docinfo.public_id or
                            element.docinfo.system_url):
                        dtd_str = "<!DOCTYPE %s>" % element.docinfo.root_name
                    else:
                        dtd_str = """<!DOCTYPE %s "%s" "%s">""" % (
                            element.docinfo.root_name,
                            element.docinfo.public_id,
                            element.docinfo.system_url)
                    rv.append("|%s%s" % (' ' * (indent + 2), dtd_str))
                next_element = element.getroot()
                while next_element.getprevious() is not None:
                    next_element = next_element.getprevious()
                while next_element is not None:
                    serializeElement(next_element, indent + 2)
                    next_element = next_element.getnext()
            elif isinstance(element, str) or isinstance(element, bytes):
                # Text in a fragment
                assert isinstance(element, str) or sys.version_info[0] == 2
                rv.append("|%s\"%s\"" % (' ' * indent, element))
            else:
                # Fragment case
                rv.append("#document-fragment")
                for next_element in element:
                    serializeElement(next_element, indent + 2)
        elif element.tag == comment_type:
            rv.append("|%s<!-- %s -->" % (' ' * indent, element.text))
            if hasattr(element, "tail") and element.tail:
                rv.append("|%s\"%s\"" % (' ' * indent, element.tail))
        else:
            assert isinstance(element, etree._Element)
            nsmatch = etree_builders.tag_regexp.match(element.tag)
            if nsmatch is not None:
                ns = nsmatch.group(1)
                tag = nsmatch.group(2)
                prefix = constants.prefixes[ns]
                rv.append("|%s<%s %s>" % (' ' * indent, prefix,
                                          infosetFilter.fromXmlName(tag)))
            else:
                rv.append("|%s<%s>" % (' ' * indent,
                                       infosetFilter.fromXmlName(element.tag)))

            if hasattr(element, "attrib"):
                attributes = []
                for name, value in element.attrib.items():
                    nsmatch = tag_regexp.match(name)
                    if nsmatch is not None:
                        ns, name = nsmatch.groups()
                        name = infosetFilter.fromXmlName(name)
                        prefix = constants.prefixes[ns]
                        attr_string = "%s %s" % (prefix, name)
                    else:
                        attr_string = infosetFilter.fromXmlName(name)
                    attributes.append((attr_string, value))

                for name, value in sorted(attributes):
                    rv.append('|%s%s="%s"' % (' ' * (indent + 2), name, value))

            if element.text:
                rv.append("|%s\"%s\"" % (' ' * (indent + 2), element.text))
            indent += 2
            for child in element:
                serializeElement(child, indent)
            if hasattr(element, "tail") and element.tail:
                rv.append("|%s\"%s\"" % (' ' * (indent - 2), element.tail))
    serializeElement(element, 0)

    return "\n".join(rv)


def tostring(element):
    """Serialize an element and its child nodes to a string"""
    rv = []

    def serializeElement(element):
        if not hasattr(element, "tag"):
            if element.docinfo.internalDTD:
                if element.docinfo.doctype:
                    dtd_str = element.docinfo.doctype
                else:
                    dtd_str = "<!DOCTYPE %s>" % element.docinfo.root_name
                rv.append(dtd_str)
            serializeElement(element.getroot())

        elif element.tag == comment_type:
            rv.append("<!--%s-->" % (element.text,))

        else:
            # This is assumed to be an ordinary element
            if not element.attrib:
                rv.append("<%s>" % (element.tag,))
            else:
                attr = " ".join(["%s=\"%s\"" % (name, value)
                                 for name, value in element.attrib.items()])
                rv.append("<%s %s>" % (element.tag, attr))
            if element.text:
                rv.append(element.text)

            for child in element:
                serializeElement(child)

            rv.append("</%s>" % (element.tag,))

        if hasattr(element, "tail") and element.tail:
            rv.append(element.tail)

    serializeElement(element)

    return "".join(rv)


class TreeBuilder(base.TreeBuilder):
    documentClass = Document
    doctypeClass = DocumentType
    elementClass = None
    commentClass = None
    fragmentClass = Document
    implementation = etree

    def __init__(self, namespaceHTMLElements, fullTree=False):
        builder = etree_builders.getETreeModule(etree, fullTree=fullTree)
        infosetFilter = self.infosetFilter = _ihatexml.InfosetFilter(preventDoubleDashComments=True)
        self.namespaceHTMLElements = namespaceHTMLElements

        class Attributes(dict):
            def __init__(self, element, value=None):
                if value is None:
                    value = {}
                self._element = element
                dict.__init__(self, value)  # pylint:disable=non-parent-init-called
                for key, value in self.items():
                    if isinstance(key, tuple):
                        name = "{%s}%s" % (key[2], infosetFilter.coerceAttribute(key[1]))
                    else:
                        name = infosetFilter.coerceAttribute(key)
                    self._element._element.attrib[name] = value

            def __setitem__(self, key, value):
                dict.__setitem__(self, key, value)
                if isinstance(key, tuple):
                    name = "{%s}%s" % (key[2], infosetFilter.coerceAttribute(key[1]))
                else:
                    name = infosetFilter.coerceAttribute(key)
                self._element._element.attrib[name] = value

        class Element(builder.Element):
            def __init__(self, name, namespace):
                name = infosetFilter.coerceElement(name)
                builder.Element.__init__(self, name, namespace=namespace)
                self._attributes = Attributes(self)

            def _setName(self, name):
                self._name = infosetFilter.coerceElement(name)
                self._element.tag = self._getETreeTag(
                    self._name, self._namespace)

            def _getName(self):
                return infosetFilter.fromXmlName(self._name)

            name = property(_getName, _setName)

            def _getAttributes(self):
                return self._attributes

            def _setAttributes(self, attributes):
                self._attributes = Attributes(self, attributes)

            attributes = property(_getAttributes, _setAttributes)

            def insertText(self, data, insertBefore=None):
                data = infosetFilter.coerceCharacters(data)
                builder.Element.insertText(self, data, insertBefore)

            def appendChild(self, child):
                builder.Element.appendChild(self, child)

        class Comment(builder.Comment):
            def __init__(self, data):
                data = infosetFilter.coerceComment(data)
                builder.Comment.__init__(self, data)

            def _setData(self, data):
                data = infosetFilter.coerceComment(data)
                self._element.text = data

            def _getData(self):
                return self._element.text

            data = property(_getData, _setData)

        self.elementClass = Element
        self.commentClass = Comment
        # self.fragmentClass = builder.DocumentFragment
        base.TreeBuilder.__init__(self, namespaceHTMLElements)

    def reset(self):
        base.TreeBuilder.reset(self)
        self.insertComment = self.insertCommentInitial
        self.initial_comments = []
        self.doctype = None

    def testSerializer(self, element):
        return testSerializer(element)

    def getDocument(self):
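        # Note: ``fullTree`` here is the module-level flag set earlier in this
        # module, not the ``fullTree`` argument accepted by ``__init__`` above.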
        if fullTree:
            return self.document._elementTree
        else:
            return self.document._elementTree.getroot()

    def getFragment(self):
        fragment = []
        element = self.openElements[0]._element
        if element.text:
            fragment.append(element.text)
        fragment.extend(list(element))
        if element.tail:
            fragment.append(element.tail)
        return fragment

    def insertDoctype(self, token):
        name = token["name"]
        publicId = token["publicId"]
        systemId = token["systemId"]

        if not name:
            warnings.warn("lxml cannot represent empty doctype", DataLossWarning)
            self.doctype = None
        else:
            coercedName = self.infosetFilter.coerceElement(name)
            if coercedName != name:
                warnings.warn("lxml cannot represent non-xml doctype", DataLossWarning)

            doctype = self.doctypeClass(coercedName, publicId, systemId)
            self.doctype = doctype

    def insertCommentInitial(self, data, parent=None):
        assert parent is None or parent is self.document
        assert self.document._elementTree is None
        self.initial_comments.append(data)

    def insertCommentMain(self, data, parent=None):
        if (parent == self.document and
                self.document._elementTree.getroot()[-1].tag == comment_type):
            warnings.warn("lxml cannot represent adjacent comments beyond the root elements", DataLossWarning)
        super(TreeBuilder, self).insertComment(data, parent)

    def insertRoot(self, token):
        """Create the document root"""
        # Because of the way libxml2 works, it doesn't seem to be possible to
        # alter information like the doctype after the tree has been parsed.
        # Therefore we need to use the built-in parser to create our initial
        # tree, after which we can add elements like normal
        docStr = ""
        if self.doctype:
            assert self.doctype.name
            docStr += "<!DOCTYPE %s" % self.doctype.name
            if (self.doctype.publicId is not None or
                    self.doctype.systemId is not None):
                docStr += (' PUBLIC "%s" ' %
                           (self.infosetFilter.coercePubid(self.doctype.publicId or "")))
                if self.doctype.systemId:
                    sysid = self.doctype.systemId
                    if sysid.find("'") >= 0 and sysid.find('"') >= 0:
                        warnings.warn("DOCTYPE system cannot contain single and double quotes", DataLossWarning)
                        sysid = sysid.replace("'", 'U00027')
                    if sysid.find("'") >= 0:
                        docStr += '"%s"' % sysid
                    else:
                        docStr += "'%s'" % sysid
                else:
                    docStr += "''"
            docStr += ">"
            if self.doctype.name != token["name"]:
                warnings.warn("lxml cannot represent doctype with a different name to the root element", DataLossWarning)
        docStr += "<THIS_SHOULD_NEVER_APPEAR_PUBLICLY/>"
        root = etree.fromstring(docStr)

        # Append the initial comments:
        for comment_token in self.initial_comments:
            comment = self.commentClass(comment_token["data"])
            root.addprevious(comment._element)

        # Create the root document and add the ElementTree to it
        self.document = self.documentClass()
        self.document._elementTree = root.getroottree()

        # Give the root element the right name
        name = token["name"]
        namespace = token.get("namespace", self.defaultNamespace)
        if namespace is None:
            etree_tag = name
        else:
            etree_tag = "{%s}%s" % (namespace, name)
        root.tag = etree_tag

        # Add the root element to the internal child/open data structures
        root_element = self.elementClass(name, namespace)
        root_element._element = root
        self.document._childNodes.append(root_element)
        self.openElements.append(root_element)

        # Reset to the default insert comment function
        self.insertComment = self.insertCommentMain
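
# Usage sketch (illustrative only, assuming the lxml package is installed):
# parsing with this builder yields an ordinary lxml tree, and the degenerate
# cases handled above (empty doctypes, comments adjacent to the root element,
# doctype/root-name mismatches) surface as DataLossWarning rather than errors.
#
#     from pip._vendor import html5lib
#
#     parser = html5lib.HTMLParser(tree=html5lib.getTreeBuilder("lxml"))
#     tree = parser.parse("<title>demo</title><p>one<p>two")
#     root = tree.getroot()  # "{http://www.w3.org/1999/xhtml}html" element
#     print(len(root.findall(".//{http://www.w3.org/1999/xhtml}p")))  # 2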
site-packages/pip/_vendor/html5lib/treebuilders/__init__.py000064400000006516147511334600020046 0ustar00"""A collection of modules for building different kinds of tree from
HTML documents.

To create a treebuilder for a new type of tree, you need to
implement several things:

1) A set of classes for various types of elements: Document, Doctype,
Comment, Element. These must implement the interface of
base.treebuilders.Node (although comment nodes have a different
signature for their constructor, see treebuilders.etree.Comment).
Textual content may also be implemented as another node type, or not, as
your tree implementation requires.

2) A treebuilder object (called TreeBuilder by convention) that
inherits from treebuilders.base.TreeBuilder. This has 4 required attributes:
documentClass - the class to use for the bottommost node of a document
elementClass - the class to use for HTML Elements
commentClass - the class to use for comments
doctypeClass - the class to use for doctypes
It also has one required method:
getDocument - Returns the root node of the complete document tree

3) If you wish to run the unit tests, you must also create a
testSerializer method on your treebuilder which accepts a node and
returns a string containing the node and its children serialized according
to the format used in the unit tests.
"""

from __future__ import absolute_import, division, unicode_literals

from .._utils import default_etree

treeBuilderCache = {}


def getTreeBuilder(treeType, implementation=None, **kwargs):
    """Get a TreeBuilder class for various types of tree with built-in support

    treeType - the name of the tree type required (case-insensitive). Supported
               values are:

               "dom" - A generic builder for DOM implementations, defaulting to
                       an xml.dom.minidom-based implementation.
               "etree" - A generic builder for tree implementations exposing an
                         ElementTree-like interface, defaulting to
                         xml.etree.cElementTree if available and
                         xml.etree.ElementTree if not.
               "lxml" - A etree-based builder for lxml.etree, handling
                        limitations of lxml's implementation.

    implementation - (Currently applies to the "etree" and "dom" tree types). A
                      module implementing the tree type, e.g.
                      xml.etree.ElementTree or xml.etree.cElementTree."""

    treeType = treeType.lower()
    if treeType not in treeBuilderCache:
        if treeType == "dom":
            from . import dom
            # Come up with a sane default (pref. from the stdlib)
            if implementation is None:
                from xml.dom import minidom
                implementation = minidom
            # NEVER cache here, caching is done in the dom submodule
            return dom.getDomModule(implementation, **kwargs).TreeBuilder
        elif treeType == "lxml":
            from . import etree_lxml
            treeBuilderCache[treeType] = etree_lxml.TreeBuilder
        elif treeType == "etree":
            from . import etree
            if implementation is None:
                implementation = default_etree
            # NEVER cache here, caching is done in the etree submodule
            return etree.getETreeModule(implementation, **kwargs).TreeBuilder
        else:
            raise ValueError("""Unrecognised treebuilder "%s" """ % treeType)
    return treeBuilderCache.get(treeType)
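
# Usage sketch (illustrative only): "etree" and "dom" builders work with the
# standard library alone, while "lxml" requires the external lxml package.
#
#     from pip._vendor import html5lib
#     from pip._vendor.html5lib.treebuilders import getTreeBuilder
#
#     TreeBuilderClass = getTreeBuilder("etree")
#     parser = html5lib.HTMLParser(tree=TreeBuilderClass)
#     document = parser.parse("<p>Hello</p>")  # root element of the parsed tree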
[binary data omitted: the archive listing continued with the package's compiled
CPython 3.6 bytecode, which is not human-readable]
site-packages/pip/_vendor/html5lib/treebuilders/__pycache__/__init__.cpython-36.pyc
site-packages/pip/_vendor/html5lib/treebuilders/__pycache__/dom.cpython-36.pyc
site-packages/pip/_vendor/html5lib/treebuilders/__pycache__/etree.cpython-36.pyc
site-packages/pip/_vendor/html5lib/treebuilders/__pycache__/etree.cpython-36.opt-1.pyc
site-packages/pip/_vendor/html5lib/treebuilders/__pycache__/dom.cpython-36.opt-1.pyc
site-packages/pip/_vendor/html5lib/treebuilders/__pycache__/etree_lxml.cpython-36.pyc
site-packages/pip/_vendor/html5lib/treebuilders/__pycache__/__init__.cpython-36.opt-1.pyc
site-packages/pip/_vendor/html5lib/treebuilders/__pycache__/base.cpython-36.opt-1.pyc
site-packages/pip/_vendor/html5lib/treebuilders/__pycache__/etree_lxml.cpython-36.opt-1.pyc
site-packages/pip/_vendor/html5lib/treebuilders/__pycache__/base.cpython-36.pyc
Node.__init__cCs:djdd�|jj�D��}|r,d|j|fSd|jSdS)N� cSsg|]\}}d||f�qS)z%s="%s"r)�.0rrrrr�
<listcomp>+sz Node.__str__.<locals>.<listcomp>z<%s %s>z<%s>)�joinr�itemsr)rZ
attributesStrrrr�__str__*s

zNode.__str__cCs
d|jS)Nz<%s>)r)rrrr�__repr__3sz
Node.__repr__cCst�dS)z3Insert node as a child of the current node
        N)�NotImplementedError)r�noderrr�appendChild6szNode.appendChildNcCst�dS)z�Insert data as text in the current node, positioned before the
        start of node insertBefore or to the end of the node's text.
        N)r!)r�data�insertBeforerrr�
insertText;szNode.insertTextcCst�dS)z�Insert node as a child of the current node, before refNode in the
        list of child nodes. Raises ValueError if refNode is not a child of
        the current nodeN)r!)rr"ZrefNoderrrr%AszNode.insertBeforecCst�dS)z:Remove node from the children of the current node
        N)r!)rr"rrr�removeChildGszNode.removeChildcCs$x|jD]}|j|�qWg|_dS)z�Move all the children of the current node to newParent.
        This is needed so that trees that don't store text as nodes move the
        text in the correct way
        N)rr#)rZ	newParentZchildrrr�reparentChildrenLszNode.reparentChildrencCst�dS)z�Return a shallow copy of the current node i.e. a node with the same
        name and attributes but with no parent or child nodes
        N)r!)rrrr�	cloneNodeVszNode.cloneNodecCst�dS)zFReturn true if the node has children or text, false otherwise
        N)r!)rrrr�
hasContent\szNode.hasContent)N)
�__name__�
__module__�__qualname__rrr r#r&r%r'r(r)r*rrrrrs	

rc@seZdZdd�Zdd�ZdS)�ActiveFormattingElementscCsfd}|tkrVxH|ddd�D]6}|tkr*P|j||�r>|d7}|dkr|j|�PqWtj||�dS)Nr�����)�Marker�
nodesEqual�remover�append)rr"Z
equalCount�elementrrrr5cs
zActiveFormattingElements.appendcCs$|j|jksdS|j|jks dSdS)NFT)�	nameTupler)rZnode1Znode2rrrr3ps
z#ActiveFormattingElements.nodesEqualN)r+r,r-r5r3rrrrr.bs
r.c@s�eZdZdZdZdZdZdZdZdd�Z	dd�Z
d+dd�Zd	d
�Zdd�Z
d
d�Zdd�Zdd�Zd,dd�Zdd�Zdd�Zdd�Zeee�Zdd�Zdd�Zd-dd �Zd!d"�Zd.d#d$�Zd%d&�Zd'd(�Zd)d*�ZdS)/�TreeBuilderaBase treebuilder implementation
    documentClass - the class to use for the bottommost node of a document
    elementClass - the class to use for HTML Elements
    commentClass - the class to use for comments
    doctypeClass - the class to use for doctypes
    NcCs|rd|_nd|_|j�dS)Nzhttp://www.w3.org/1999/xhtml)�defaultNamespace�reset)rZnamespaceHTMLElementsrrrr�szTreeBuilder.__init__cCs.g|_t�|_d|_d|_d|_|j�|_dS)NF)�openElementsr.�activeFormattingElementsZheadPointerZformPointer�insertFromTable�
documentClass�document)rrrrr:�szTreeBuilder.resetcCs�t|d�}|s2t|t�r$td|f}t|t�s2t�t|\}}xHt|j�D]:}|r^||kr^dS|rr|j	|krrdS||j	|kArJdSqJWds�t�dS)Nr7r
TF)
�hasattr�
isinstancerr	�tuple�AssertionError�listElementsMap�reversedr;r7)r�targetZvariantZ	exactNodeZlistElements�invertr"rrr�elementInScope�s

zTreeBuilder.elementInScopecCs�|js
dSt|j�d}|j|}|tks4||jkr8dSx6|tkrn||jkrn|dkrZd}P|d8}|j|}q:WxR|d7}|j|}|j�}|jd|j|j|jd��}||j|<||jdkrrPqrWdS)Nr/rZStartTag)�typer�	namespacer$r1r1)	r<�lenr2r;r)�
insertElementrrJr)r�i�entryZcloner6rrr�#reconstructActiveFormattingElements�s.


z/TreeBuilder.reconstructActiveFormattingElementscCs,|jj�}x|jr&|tkr&|jj�}qWdS)N)r<�popr2)rrNrrr�clearActiveFormattingElements�s
z)TreeBuilder.clearActiveFormattingElementscCs8x2|jddd�D]}|tkr"Pq|j|kr|SqWdS)z�Check if an element exists between the end of the active
        formatting elements and the last marker. If it does, return it, else
        return falseNr/Fr1)r<r2r)rr�itemrrr�!elementInActiveFormattingElements�s
z-TreeBuilder.elementInActiveFormattingElementscCs&|j|�}|jj|�|jj|�dS)N)�
createElementr;r5r?r#)r�tokenr6rrr�
insertRoot�s
zTreeBuilder.insertRootcCs6|d}|d}|d}|j|||�}|jj|�dS)Nr�publicId�systemId)�doctypeClassr?r#)rrUrrWrXZdoctyperrr�
insertDoctypes
zTreeBuilder.insertDoctypecCs*|dkr|jd}|j|j|d��dS)Nr/r$r1)r;r#�commentClass)rrUrrrr�
insertComment	s
zTreeBuilder.insertCommentcCs0|d}|jd|j�}|j||�}|d|_|S)z.Create an element but don't insert it anywhererrJr$)�getr9�elementClassr)rrUrrJr6rrrrTs

zTreeBuilder.createElementcCs|jS)N)�_insertFromTable)rrrr�_getInsertFromTableszTreeBuilder._getInsertFromTablecCs ||_|r|j|_n|j|_dS)zsSwitch the function used to insert an element from the
        normal one to the misnested table one and back againN)r_�insertElementTablerL�insertElementNormal)rrrrr�_setInsertFromTables
zTreeBuilder._setInsertFromTablecCsb|d}t|t�std|��|jd|j�}|j||�}|d|_|jdj|�|jj	|�|S)NrzElement %s not unicoderJr$r/r1)
rArrCr]r9r^rr;r#r5)rrUrrJr6rrrrb$s
zTreeBuilder.insertElementNormalcCs`|j|�}|jdjtkr$|j|�S|j�\}}|dkrD|j|�n|j||�|jj|�|S)z-Create an element and insert it into the treer/Nr1)	rTr;rrrb�getTableMisnestedNodePositionr#r%r5)rrUr6rr%rrrra.s

zTreeBuilder.insertElementTablecCsX|dkr|jd}|js0|jr<|jdjtkr<|j|�n|j�\}}|j||�dS)zInsert text data.Nr/r1r1)r;r=rrr&rd)rr$rr%rrrr&>s

zTreeBuilder.insertTextcCsvd}d}d}x(|jddd�D]}|jdkr|}PqW|rd|jrL|j}|}qn|j|jj|�d}n
|jd}||fS)zsGet the foster parent element, and sibling to insert before
        (or None) when inserting a misnested table nodeNr/rrr1)r;rr�index)rZ	lastTableZfosterParentr%ZelmrrrrdMs

z)TreeBuilder.getTableMisnestedNodePositionc
Cs8|jd
j}|td�kr4||kr4|jj�|j|�dS)Nr/�dd�dt�lirr
�p�rp�rtr1)rfrgrhrr
rirjrk)r;r�	frozensetrP�generateImpliedEndTags)r�excluderrrrrmgs

z"TreeBuilder.generateImpliedEndTagscCs|jS)zReturn the final tree)r?)rrrr�getDocumentqszTreeBuilder.getDocumentcCs|j�}|jdj|�|S)zReturn the final fragmentr)�
fragmentClassr;r()rZfragmentrrr�getFragmentuszTreeBuilder.getFragmentcCst�dS)zzSerialize the subtree of node in the format required by unit tests
        node - the node from which to start serializingN)r!)rr"rrr�testSerializer|szTreeBuilder.testSerializer)N)N)N)N)r+r,r-�__doc__r>r^r[rYrprr:rHrOrQrSrVrZr\rTr`rc�propertyr=rbrar&rdrmrorqrrrrrrr8zs6
.
	




r8)Z
__future__rrrZpip._vendor.sixrZ	constantsrrr	r2rl�setrD�objectrrr.r8rrrr�<module>s
Ksite-packages/pip/_vendor/html5lib/treebuilders/dom.py000064400000021203147511334600017054 0ustar00from __future__ import absolute_import, division, unicode_literals


from collections import MutableMapping
from xml.dom import minidom, Node
import weakref

from . import base
from .. import constants
from ..constants import namespaces
from .._utils import moduleFactoryFactory


def getDomBuilder(DomImplementation):
    Dom = DomImplementation

    class AttrList(MutableMapping):
        def __init__(self, element):
            self.element = element

        def __iter__(self):
            return iter(self.element.attributes.keys())

        def __setitem__(self, name, value):
            if isinstance(name, tuple):
                raise NotImplementedError
            else:
                attr = self.element.ownerDocument.createAttribute(name)
                attr.value = value
                self.element.attributes[name] = attr

        def __len__(self):
            return len(self.element.attributes)

        def items(self):
            return list(self.element.attributes.items())

        def values(self):
            return list(self.element.attributes.values())

        def __getitem__(self, name):
            if isinstance(name, tuple):
                raise NotImplementedError
            else:
                return self.element.attributes[name].value

        def __delitem__(self, name):
            if isinstance(name, tuple):
                raise NotImplementedError
            else:
                del self.element.attributes[name]

    class NodeBuilder(base.Node):
        def __init__(self, element):
            base.Node.__init__(self, element.nodeName)
            self.element = element

        namespace = property(lambda self: hasattr(self.element, "namespaceURI") and
                             self.element.namespaceURI or None)

        def appendChild(self, node):
            node.parent = self
            self.element.appendChild(node.element)

        def insertText(self, data, insertBefore=None):
            text = self.element.ownerDocument.createTextNode(data)
            if insertBefore:
                self.element.insertBefore(text, insertBefore.element)
            else:
                self.element.appendChild(text)

        def insertBefore(self, node, refNode):
            self.element.insertBefore(node.element, refNode.element)
            node.parent = self

        def removeChild(self, node):
            if node.element.parentNode == self.element:
                self.element.removeChild(node.element)
            node.parent = None

        def reparentChildren(self, newParent):
            while self.element.hasChildNodes():
                child = self.element.firstChild
                self.element.removeChild(child)
                newParent.element.appendChild(child)
            self.childNodes = []

        def getAttributes(self):
            return AttrList(self.element)

        def setAttributes(self, attributes):
            if attributes:
                for name, value in list(attributes.items()):
                    if isinstance(name, tuple):
                        if name[0] is not None:
                            qualifiedName = (name[0] + ":" + name[1])
                        else:
                            qualifiedName = name[1]
                        self.element.setAttributeNS(name[2], qualifiedName,
                                                    value)
                    else:
                        self.element.setAttribute(
                            name, value)
        attributes = property(getAttributes, setAttributes)

        def cloneNode(self):
            return NodeBuilder(self.element.cloneNode(False))

        def hasContent(self):
            return self.element.hasChildNodes()

        def getNameTuple(self):
            if self.namespace is None:
                return namespaces["html"], self.name
            else:
                return self.namespace, self.name

        nameTuple = property(getNameTuple)

    class TreeBuilder(base.TreeBuilder):  # pylint:disable=unused-variable
        def documentClass(self):
            self.dom = Dom.getDOMImplementation().createDocument(None, None, None)
            return weakref.proxy(self)

        def insertDoctype(self, token):
            name = token["name"]
            publicId = token["publicId"]
            systemId = token["systemId"]

            domimpl = Dom.getDOMImplementation()
            doctype = domimpl.createDocumentType(name, publicId, systemId)
            self.document.appendChild(NodeBuilder(doctype))
            if Dom == minidom:
                doctype.ownerDocument = self.dom

        def elementClass(self, name, namespace=None):
            if namespace is None and self.defaultNamespace is None:
                node = self.dom.createElement(name)
            else:
                node = self.dom.createElementNS(namespace, name)

            return NodeBuilder(node)

        def commentClass(self, data):
            return NodeBuilder(self.dom.createComment(data))

        def fragmentClass(self):
            return NodeBuilder(self.dom.createDocumentFragment())

        def appendChild(self, node):
            self.dom.appendChild(node.element)

        def testSerializer(self, element):
            return testSerializer(element)

        def getDocument(self):
            return self.dom

        def getFragment(self):
            return base.TreeBuilder.getFragment(self).element

        def insertText(self, data, parent=None):
            data = data
            if parent != self:
                base.TreeBuilder.insertText(self, data, parent)
            else:
                # HACK: allow text nodes as children of the document node
                if hasattr(self.dom, '_child_node_types'):
                    # pylint:disable=protected-access
                    if Node.TEXT_NODE not in self.dom._child_node_types:
                        self.dom._child_node_types = list(self.dom._child_node_types)
                        self.dom._child_node_types.append(Node.TEXT_NODE)
                self.dom.appendChild(self.dom.createTextNode(data))

        implementation = DomImplementation
        name = None

    def testSerializer(element):
        element.normalize()
        rv = []

        def serializeElement(element, indent=0):
            if element.nodeType == Node.DOCUMENT_TYPE_NODE:
                if element.name:
                    if element.publicId or element.systemId:
                        publicId = element.publicId or ""
                        systemId = element.systemId or ""
                        rv.append("""|%s<!DOCTYPE %s "%s" "%s">""" %
                                  (' ' * indent, element.name, publicId, systemId))
                    else:
                        rv.append("|%s<!DOCTYPE %s>" % (' ' * indent, element.name))
                else:
                    rv.append("|%s<!DOCTYPE >" % (' ' * indent,))
            elif element.nodeType == Node.DOCUMENT_NODE:
                rv.append("#document")
            elif element.nodeType == Node.DOCUMENT_FRAGMENT_NODE:
                rv.append("#document-fragment")
            elif element.nodeType == Node.COMMENT_NODE:
                rv.append("|%s<!-- %s -->" % (' ' * indent, element.nodeValue))
            elif element.nodeType == Node.TEXT_NODE:
                rv.append("|%s\"%s\"" % (' ' * indent, element.nodeValue))
            else:
                if (hasattr(element, "namespaceURI") and
                        element.namespaceURI is not None):
                    name = "%s %s" % (constants.prefixes[element.namespaceURI],
                                      element.nodeName)
                else:
                    name = element.nodeName
                rv.append("|%s<%s>" % (' ' * indent, name))
                if element.hasAttributes():
                    attributes = []
                    for i in range(len(element.attributes)):
                        attr = element.attributes.item(i)
                        name = attr.nodeName
                        value = attr.value
                        ns = attr.namespaceURI
                        if ns:
                            name = "%s %s" % (constants.prefixes[ns], attr.localName)
                        else:
                            name = attr.nodeName
                        attributes.append((name, value))

                    for name, value in sorted(attributes):
                        rv.append('|%s%s="%s"' % (' ' * (indent + 2), name, value))
            indent += 2
            for child in element.childNodes:
                serializeElement(child, indent)
        serializeElement(element, 0)

        return "\n".join(rv)

    return locals()


# The actual means to get a module!
getDomModule = moduleFactoryFactory(getDomBuilder)
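
# Hedged usage sketch (assumption, not part of the vendored file): getDomModule
# wraps getDomBuilder() via moduleFactoryFactory, so the returned module-like
# object exposes the TreeBuilder and testSerializer defined above for whatever
# DOM implementation is passed in (the stdlib minidom here).
if __name__ == "__main__":
    from xml.dom import minidom

    dom_module = getDomModule(minidom)
    builder = dom_module.TreeBuilder(namespaceHTMLElements=True)
    # In practice html5lib's parser drives the builder (for example through
    # html5lib.parse(markup, treebuilder="dom")); constructing it directly just
    # shows the factory produced a working TreeBuilder and an empty document.
    print(dom_module.testSerializer(builder.getDocument()))
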
site-packages/pip/_vendor/html5lib/treebuilders/etree.py
from __future__ import absolute_import, division, unicode_literals
# pylint:disable=protected-access

from pip._vendor.six import text_type

import re

from . import base
from .. import _ihatexml
from .. import constants
from ..constants import namespaces
from .._utils import moduleFactoryFactory

tag_regexp = re.compile("{([^}]*)}(.*)")


def getETreeBuilder(ElementTreeImplementation, fullTree=False):
    ElementTree = ElementTreeImplementation
    ElementTreeCommentType = ElementTree.Comment("asd").tag

    class Element(base.Node):
        def __init__(self, name, namespace=None):
            self._name = name
            self._namespace = namespace
            self._element = ElementTree.Element(self._getETreeTag(name,
                                                                  namespace))
            if namespace is None:
                self.nameTuple = namespaces["html"], self._name
            else:
                self.nameTuple = self._namespace, self._name
            self.parent = None
            self._childNodes = []
            self._flags = []

        def _getETreeTag(self, name, namespace):
            if namespace is None:
                etree_tag = name
            else:
                etree_tag = "{%s}%s" % (namespace, name)
            return etree_tag

        def _setName(self, name):
            self._name = name
            self._element.tag = self._getETreeTag(self._name, self._namespace)

        def _getName(self):
            return self._name

        name = property(_getName, _setName)

        def _setNamespace(self, namespace):
            self._namespace = namespace
            self._element.tag = self._getETreeTag(self._name, self._namespace)

        def _getNamespace(self):
            return self._namespace

        namespace = property(_getNamespace, _setNamespace)

        def _getAttributes(self):
            return self._element.attrib

        def _setAttributes(self, attributes):
            # Delete existing attributes first
            # XXX - there may be a better way to do this...
            for key in list(self._element.attrib.keys()):
                del self._element.attrib[key]
            for key, value in attributes.items():
                if isinstance(key, tuple):
                    name = "{%s}%s" % (key[2], key[1])
                else:
                    name = key
                self._element.set(name, value)

        attributes = property(_getAttributes, _setAttributes)

        def _getChildNodes(self):
            return self._childNodes

        def _setChildNodes(self, value):
            del self._element[:]
            self._childNodes = []
            for element in value:
                self.insertChild(element)

        childNodes = property(_getChildNodes, _setChildNodes)

        def hasContent(self):
            """Return true if the node has children or text"""
            return bool(self._element.text or len(self._element))

        def appendChild(self, node):
            self._childNodes.append(node)
            self._element.append(node._element)
            node.parent = self

        def insertBefore(self, node, refNode):
            index = list(self._element).index(refNode._element)
            self._element.insert(index, node._element)
            node.parent = self

        def removeChild(self, node):
            self._childNodes.remove(node)
            self._element.remove(node._element)
            node.parent = None

        def insertText(self, data, insertBefore=None):
            if not(len(self._element)):
                if not self._element.text:
                    self._element.text = ""
                self._element.text += data
            elif insertBefore is None:
                # Insert the text as the tail of the last child element
                if not self._element[-1].tail:
                    self._element[-1].tail = ""
                self._element[-1].tail += data
            else:
                # Insert the text before the specified node
                children = list(self._element)
                index = children.index(insertBefore._element)
                if index > 0:
                    if not self._element[index - 1].tail:
                        self._element[index - 1].tail = ""
                    self._element[index - 1].tail += data
                else:
                    if not self._element.text:
                        self._element.text = ""
                    self._element.text += data

        def cloneNode(self):
            element = type(self)(self.name, self.namespace)
            for name, value in self.attributes.items():
                element.attributes[name] = value
            return element

        def reparentChildren(self, newParent):
            if newParent.childNodes:
                newParent.childNodes[-1]._element.tail += self._element.text
            else:
                if not newParent._element.text:
                    newParent._element.text = ""
                if self._element.text is not None:
                    newParent._element.text += self._element.text
            self._element.text = ""
            base.Node.reparentChildren(self, newParent)

    class Comment(Element):
        def __init__(self, data):
            # Use the superclass constructor to set all properties on the
            # wrapper element
            self._element = ElementTree.Comment(data)
            self.parent = None
            self._childNodes = []
            self._flags = []

        def _getData(self):
            return self._element.text

        def _setData(self, value):
            self._element.text = value

        data = property(_getData, _setData)

    class DocumentType(Element):
        def __init__(self, name, publicId, systemId):
            Element.__init__(self, "<!DOCTYPE>")
            self._element.text = name
            self.publicId = publicId
            self.systemId = systemId

        def _getPublicId(self):
            return self._element.get("publicId", "")

        def _setPublicId(self, value):
            if value is not None:
                self._element.set("publicId", value)

        publicId = property(_getPublicId, _setPublicId)

        def _getSystemId(self):
            return self._element.get("systemId", "")

        def _setSystemId(self, value):
            if value is not None:
                self._element.set("systemId", value)

        systemId = property(_getSystemId, _setSystemId)

    class Document(Element):
        def __init__(self):
            Element.__init__(self, "DOCUMENT_ROOT")

    class DocumentFragment(Element):
        def __init__(self):
            Element.__init__(self, "DOCUMENT_FRAGMENT")

    def testSerializer(element):
        rv = []

        def serializeElement(element, indent=0):
            if not(hasattr(element, "tag")):
                element = element.getroot()
            if element.tag == "<!DOCTYPE>":
                if element.get("publicId") or element.get("systemId"):
                    publicId = element.get("publicId") or ""
                    systemId = element.get("systemId") or ""
                    rv.append("""<!DOCTYPE %s "%s" "%s">""" %
                              (element.text, publicId, systemId))
                else:
                    rv.append("<!DOCTYPE %s>" % (element.text,))
            elif element.tag == "DOCUMENT_ROOT":
                rv.append("#document")
                if element.text is not None:
                    rv.append("|%s\"%s\"" % (' ' * (indent + 2), element.text))
                if element.tail is not None:
                    raise TypeError("Document node cannot have tail")
                if hasattr(element, "attrib") and len(element.attrib):
                    raise TypeError("Document node cannot have attributes")
            elif element.tag == ElementTreeCommentType:
                rv.append("|%s<!-- %s -->" % (' ' * indent, element.text))
            else:
                assert isinstance(element.tag, text_type), \
                    "Expected unicode, got %s, %s" % (type(element.tag), element.tag)
                nsmatch = tag_regexp.match(element.tag)

                if nsmatch is None:
                    name = element.tag
                else:
                    ns, name = nsmatch.groups()
                    prefix = constants.prefixes[ns]
                    name = "%s %s" % (prefix, name)
                rv.append("|%s<%s>" % (' ' * indent, name))

                if hasattr(element, "attrib"):
                    attributes = []
                    for name, value in element.attrib.items():
                        nsmatch = tag_regexp.match(name)
                        if nsmatch is not None:
                            ns, name = nsmatch.groups()
                            prefix = constants.prefixes[ns]
                            attr_string = "%s %s" % (prefix, name)
                        else:
                            attr_string = name
                        attributes.append((attr_string, value))

                    for name, value in sorted(attributes):
                        rv.append('|%s%s="%s"' % (' ' * (indent + 2), name, value))
                if element.text:
                    rv.append("|%s\"%s\"" % (' ' * (indent + 2), element.text))
            indent += 2
            for child in element:
                serializeElement(child, indent)
            if element.tail:
                rv.append("|%s\"%s\"" % (' ' * (indent - 2), element.tail))
        serializeElement(element, 0)

        return "\n".join(rv)

    def tostring(element):  # pylint:disable=unused-variable
        """Serialize an element and its child nodes to a string"""
        rv = []
        filter = _ihatexml.InfosetFilter()

        def serializeElement(element):
            if isinstance(element, ElementTree.ElementTree):
                element = element.getroot()

            if element.tag == "<!DOCTYPE>":
                if element.get("publicId") or element.get("systemId"):
                    publicId = element.get("publicId") or ""
                    systemId = element.get("systemId") or ""
                    rv.append("""<!DOCTYPE %s PUBLIC "%s" "%s">""" %
                              (element.text, publicId, systemId))
                else:
                    rv.append("<!DOCTYPE %s>" % (element.text,))
            elif element.tag == "DOCUMENT_ROOT":
                if element.text is not None:
                    rv.append(element.text)
                if element.tail is not None:
                    raise TypeError("Document node cannot have tail")
                if hasattr(element, "attrib") and len(element.attrib):
                    raise TypeError("Document node cannot have attributes")

                for child in element:
                    serializeElement(child)

            elif element.tag == ElementTreeCommentType:
                rv.append("<!--%s-->" % (element.text,))
            else:
                # This is assumed to be an ordinary element
                if not element.attrib:
                    rv.append("<%s>" % (filter.fromXmlName(element.tag),))
                else:
                    attr = " ".join(["%s=\"%s\"" % (
                        filter.fromXmlName(name), value)
                        for name, value in element.attrib.items()])
                    rv.append("<%s %s>" % (element.tag, attr))
                if element.text:
                    rv.append(element.text)

                for child in element:
                    serializeElement(child)

                rv.append("</%s>" % (element.tag,))

            if element.tail:
                rv.append(element.tail)

        serializeElement(element)

        return "".join(rv)

    class TreeBuilder(base.TreeBuilder):  # pylint:disable=unused-variable
        documentClass = Document
        doctypeClass = DocumentType
        elementClass = Element
        commentClass = Comment
        fragmentClass = DocumentFragment
        implementation = ElementTreeImplementation

        def testSerializer(self, element):
            return testSerializer(element)

        def getDocument(self):
            if fullTree:
                return self.document._element
            else:
                if self.defaultNamespace is not None:
                    return self.document._element.find(
                        "{%s}html" % self.defaultNamespace)
                else:
                    return self.document._element.find("html")

        def getFragment(self):
            return base.TreeBuilder.getFragment(self)._element

    return locals()


getETreeModule = moduleFactoryFactory(getETreeBuilder)
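
# Hedged usage sketch (assumption, not part of the vendored file): getETreeModule
# binds getETreeBuilder() to a concrete ElementTree implementation, typically
# the standard library's xml.etree.ElementTree, and exposes the generated
# Element/Comment/Document classes plus TreeBuilder on the returned module.
if __name__ == "__main__":
    from xml.etree import ElementTree

    etree_module = getETreeModule(ElementTree, fullTree=False)
    builder = etree_module.TreeBuilder(namespaceHTMLElements=True)
    # The empty document root serializes to "#document"; a real parse would be
    # driven by html5lib's parser rather than by hand.
    print(etree_module.testSerializer(builder.document._element))
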
site-packages/pip/_vendor/html5lib/treebuilders/base.py
from __future__ import absolute_import, division, unicode_literals
from pip._vendor.six import text_type

from ..constants import scopingElements, tableInsertModeElements, namespaces

# The scope markers are inserted when entering object elements,
# marquees, table cells, and table captions, and are used to prevent formatting
# from "leaking" into tables, object elements, and marquees.
Marker = None

listElementsMap = {
    None: (frozenset(scopingElements), False),
    "button": (frozenset(scopingElements | set([(namespaces["html"], "button")])), False),
    "list": (frozenset(scopingElements | set([(namespaces["html"], "ol"),
                                              (namespaces["html"], "ul")])), False),
    "table": (frozenset([(namespaces["html"], "html"),
                         (namespaces["html"], "table")]), False),
    "select": (frozenset([(namespaces["html"], "optgroup"),
                          (namespaces["html"], "option")]), True)
}


class Node(object):
    def __init__(self, name):
        """Node representing an item in the tree.
        name - The tag name associated with the node
        parent - The parent of the current node (or None for the document node)
        value - The value of the current node (applies to text nodes and
        comments
        attributes - a dict holding name, value pairs for attributes of the node
        childNodes - a list of child nodes of the current node. This must
        include all elements but not necessarily other node types
        _flags - A list of miscellaneous flags that can be set on the node
        """
        self.name = name
        self.parent = None
        self.value = None
        self.attributes = {}
        self.childNodes = []
        self._flags = []

    def __str__(self):
        attributesStr = " ".join(["%s=\"%s\"" % (name, value)
                                  for name, value in
                                  self.attributes.items()])
        if attributesStr:
            return "<%s %s>" % (self.name, attributesStr)
        else:
            return "<%s>" % (self.name)

    def __repr__(self):
        return "<%s>" % (self.name)

    def appendChild(self, node):
        """Insert node as a child of the current node
        """
        raise NotImplementedError

    def insertText(self, data, insertBefore=None):
        """Insert data as text in the current node, positioned before the
        start of node insertBefore or to the end of the node's text.
        """
        raise NotImplementedError

    def insertBefore(self, node, refNode):
        """Insert node as a child of the current node, before refNode in the
        list of child nodes. Raises ValueError if refNode is not a child of
        the current node"""
        raise NotImplementedError

    def removeChild(self, node):
        """Remove node from the children of the current node
        """
        raise NotImplementedError

    def reparentChildren(self, newParent):
        """Move all the children of the current node to newParent.
        This is needed so that trees that don't store text as nodes move the
        text in the correct way
        """
        # XXX - should this method be made more general?
        for child in self.childNodes:
            newParent.appendChild(child)
        self.childNodes = []

    def cloneNode(self):
        """Return a shallow copy of the current node i.e. a node with the same
        name and attributes but with no parent or child nodes
        """
        raise NotImplementedError

    def hasContent(self):
        """Return true if the node has children or text, false otherwise
        """
        raise NotImplementedError


class ActiveFormattingElements(list):
    def append(self, node):
        equalCount = 0
        if node != Marker:
            for element in self[::-1]:
                if element == Marker:
                    break
                if self.nodesEqual(element, node):
                    equalCount += 1
                if equalCount == 3:
                    self.remove(element)
                    break
        list.append(self, node)

    def nodesEqual(self, node1, node2):
        if not node1.nameTuple == node2.nameTuple:
            return False

        if not node1.attributes == node2.attributes:
            return False

        return True


class TreeBuilder(object):
    """Base treebuilder implementation
    documentClass - the class to use for the bottommost node of a document
    elementClass - the class to use for HTML Elements
    commentClass - the class to use for comments
    doctypeClass - the class to use for doctypes
    """
    # pylint:disable=not-callable

    # Document class
    documentClass = None

    # The class to use for creating a node
    elementClass = None

    # The class to use for creating comments
    commentClass = None

    # The class to use for creating doctypes
    doctypeClass = None

    # Fragment class
    fragmentClass = None

    def __init__(self, namespaceHTMLElements):
        if namespaceHTMLElements:
            self.defaultNamespace = "http://www.w3.org/1999/xhtml"
        else:
            self.defaultNamespace = None
        self.reset()

    def reset(self):
        self.openElements = []
        self.activeFormattingElements = ActiveFormattingElements()

        # XXX - rename these to headElement, formElement
        self.headPointer = None
        self.formPointer = None

        self.insertFromTable = False

        self.document = self.documentClass()

    def elementInScope(self, target, variant=None):

        # If we pass a node in we match that. if we pass a string
        # match any node with that name
        exactNode = hasattr(target, "nameTuple")
        if not exactNode:
            if isinstance(target, text_type):
                target = (namespaces["html"], target)
            assert isinstance(target, tuple)

        listElements, invert = listElementsMap[variant]

        for node in reversed(self.openElements):
            if exactNode and node == target:
                return True
            elif not exactNode and node.nameTuple == target:
                return True
            elif (invert ^ (node.nameTuple in listElements)):
                return False

        assert False  # We should never reach this point
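
    # Hedged example (assumption, mirroring how html5lib's parser calls this):
    # when handling an end tag such as </p>, the parser asks
    #
    #     tree.elementInScope("p", variant="button")
    #
    # The bare string "p" is normalised to (namespaces["html"], "p") above, and
    # the walk over reversed(self.openElements) returns True only if a <p> is
    # found before any element listed for the "button" variant in
    # listElementsMap.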

    def reconstructActiveFormattingElements(self):
        # Within this algorithm the order of steps described in the
        # specification is not quite the same as the order of steps in the
        # code. It should still do the same though.

        # Step 1: stop the algorithm when there's nothing to do.
        if not self.activeFormattingElements:
            return

        # Step 2 and step 3: we start with the last element. So i is -1.
        i = len(self.activeFormattingElements) - 1
        entry = self.activeFormattingElements[i]
        if entry == Marker or entry in self.openElements:
            return

        # Step 6
        while entry != Marker and entry not in self.openElements:
            if i == 0:
                # This will be reset to 0 below
                i = -1
                break
            i -= 1
            # Step 5: let entry be one earlier in the list.
            entry = self.activeFormattingElements[i]

        while True:
            # Step 7
            i += 1

            # Step 8
            entry = self.activeFormattingElements[i]
            clone = entry.cloneNode()  # Mainly to get a new copy of the attributes

            # Step 9
            element = self.insertElement({"type": "StartTag",
                                          "name": clone.name,
                                          "namespace": clone.namespace,
                                          "data": clone.attributes})

            # Step 10
            self.activeFormattingElements[i] = element

            # Step 11
            if element == self.activeFormattingElements[-1]:
                break

    def clearActiveFormattingElements(self):
        entry = self.activeFormattingElements.pop()
        while self.activeFormattingElements and entry != Marker:
            entry = self.activeFormattingElements.pop()

    def elementInActiveFormattingElements(self, name):
        """Check if an element exists between the end of the active
        formatting elements and the last marker. If it does, return it, else
        return false"""

        for item in self.activeFormattingElements[::-1]:
            # Check for Marker first because if it's a Marker it doesn't have a
            # name attribute.
            if item == Marker:
                break
            elif item.name == name:
                return item
        return False

    def insertRoot(self, token):
        element = self.createElement(token)
        self.openElements.append(element)
        self.document.appendChild(element)

    def insertDoctype(self, token):
        name = token["name"]
        publicId = token["publicId"]
        systemId = token["systemId"]

        doctype = self.doctypeClass(name, publicId, systemId)
        self.document.appendChild(doctype)

    def insertComment(self, token, parent=None):
        if parent is None:
            parent = self.openElements[-1]
        parent.appendChild(self.commentClass(token["data"]))

    def createElement(self, token):
        """Create an element but don't insert it anywhere"""
        name = token["name"]
        namespace = token.get("namespace", self.defaultNamespace)
        element = self.elementClass(name, namespace)
        element.attributes = token["data"]
        return element

    def _getInsertFromTable(self):
        return self._insertFromTable

    def _setInsertFromTable(self, value):
        """Switch the function used to insert an element from the
        normal one to the misnested table one and back again"""
        self._insertFromTable = value
        if value:
            self.insertElement = self.insertElementTable
        else:
            self.insertElement = self.insertElementNormal

    insertFromTable = property(_getInsertFromTable, _setInsertFromTable)

    def insertElementNormal(self, token):
        name = token["name"]
        assert isinstance(name, text_type), "Element %s not unicode" % name
        namespace = token.get("namespace", self.defaultNamespace)
        element = self.elementClass(name, namespace)
        element.attributes = token["data"]
        self.openElements[-1].appendChild(element)
        self.openElements.append(element)
        return element

    def insertElementTable(self, token):
        """Create an element and insert it into the tree"""
        element = self.createElement(token)
        if self.openElements[-1].name not in tableInsertModeElements:
            return self.insertElementNormal(token)
        else:
            # We should be in the InTable mode. This means we want to do
            # special magic element rearranging
            parent, insertBefore = self.getTableMisnestedNodePosition()
            if insertBefore is None:
                parent.appendChild(element)
            else:
                parent.insertBefore(element, insertBefore)
            self.openElements.append(element)
        return element

    def insertText(self, data, parent=None):
        """Insert text data."""
        if parent is None:
            parent = self.openElements[-1]

        if (not self.insertFromTable or (self.insertFromTable and
                                         self.openElements[-1].name
                                         not in tableInsertModeElements)):
            parent.insertText(data)
        else:
            # We should be in the InTable mode. This means we want to do
            # special magic element rearranging
            parent, insertBefore = self.getTableMisnestedNodePosition()
            parent.insertText(data, insertBefore)

    def getTableMisnestedNodePosition(self):
        """Get the foster parent element, and sibling to insert before
        (or None) when inserting a misnested table node"""
        # The foster parent element is the one which comes before the most
        # recently opened table element
        # XXX - this is really inelegant
        lastTable = None
        fosterParent = None
        insertBefore = None
        for elm in self.openElements[::-1]:
            if elm.name == "table":
                lastTable = elm
                break
        if lastTable:
            # XXX - we should really check that this parent is actually a
            # node here
            if lastTable.parent:
                fosterParent = lastTable.parent
                insertBefore = lastTable
            else:
                fosterParent = self.openElements[
                    self.openElements.index(lastTable) - 1]
        else:
            fosterParent = self.openElements[0]
        return fosterParent, insertBefore
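
    # Hedged illustration (assumption): for markup like "<table><b>bold", the
    # <b> start tag arrives while a <table> is still open, so insertElementTable()
    # asks this method where to foster-parent it.  With openElements roughly
    # [html, body, table] and the table attached to <body>, it returns
    # (body, table), i.e. insert the misnested node into <body> immediately
    # before the <table> element.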

    def generateImpliedEndTags(self, exclude=None):
        name = self.openElements[-1].name
        # XXX td, th and tr are not actually needed
        if (name in frozenset(("dd", "dt", "li", "option", "optgroup", "p", "rp", "rt")) and
                name != exclude):
            self.openElements.pop()
            # XXX This is not entirely what the specification says. We should
            # investigate it more closely.
            self.generateImpliedEndTags(exclude)

    def getDocument(self):
        "Return the final tree"
        return self.document

    def getFragment(self):
        "Return the final fragment"
        # assert self.innerHTML
        fragment = self.fragmentClass()
        self.openElements[0].reparentChildren(fragment)
        return fragment

    def testSerializer(self, node):
        """Serialize the subtree of node in the format required by unit tests
        node - the node from which to start serializing"""
        raise NotImplementedError
site-packages/pip/_vendor/html5lib/_tokenizer.py
from __future__ import absolute_import, division, unicode_literals

from pip._vendor.six import unichr as chr

from collections import deque

from .constants import spaceCharacters
from .constants import entities
from .constants import asciiLetters, asciiUpper2Lower
from .constants import digits, hexDigits, EOF
from .constants import tokenTypes, tagTokenTypes
from .constants import replacementCharacters

from ._inputstream import HTMLInputStream

from ._trie import Trie

entitiesTrie = Trie(entities)


class HTMLTokenizer(object):
    """ This class takes care of tokenizing HTML.

    * self.currentToken
      Holds the token that is currently being processed.

    * self.state
      Holds a reference to the method to be invoked... XXX

    * self.stream
      Points to HTMLInputStream object.
    """

    def __init__(self, stream, parser=None, **kwargs):

        self.stream = HTMLInputStream(stream, **kwargs)
        self.parser = parser

        # Setup the initial tokenizer state
        self.escapeFlag = False
        self.lastFourChars = []
        self.state = self.dataState
        self.escape = False

        # The current token being created
        self.currentToken = None
        super(HTMLTokenizer, self).__init__()

    def __iter__(self):
        """ This is where the magic happens.

        We do our usually processing through the states and when we have a token
        to return we yield the token which pauses processing until the next token
        is requested.
        """
        self.tokenQueue = deque([])
        # Start processing. When EOF is reached self.state will return False
        # instead of True and the loop will terminate.
        while self.state():
            while self.stream.errors:
                yield {"type": tokenTypes["ParseError"], "data": self.stream.errors.pop(0)}
            while self.tokenQueue:
                yield self.tokenQueue.popleft()
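
    # Hedged usage sketch (assumption, not part of the vendored file): the
    # tokenizer is normally consumed by iterating over it, e.g.
    #
    #     for token in HTMLTokenizer('<p class="x">hi</p>'):
    #         print(token["type"], token.get("name"), token.get("data"))
    #
    # Each yielded token is a dict whose "type" value comes from the tokenTypes
    # map (StartTag, EndTag, Characters, SpaceCharacters, ParseError, ...).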

    def consumeNumberEntity(self, isHex):
        """This function returns either U+FFFD or the character based on the
        decimal or hexadecimal representation. It also discards ";" if present.
        If not present self.tokenQueue.append({"type": tokenTypes["ParseError"]}) is invoked.
        """

        allowed = digits
        radix = 10
        if isHex:
            allowed = hexDigits
            radix = 16

        charStack = []

        # Consume all the characters that are in range while making sure we
        # don't hit an EOF.
        c = self.stream.char()
        while c in allowed and c is not EOF:
            charStack.append(c)
            c = self.stream.char()

        # Convert the set of characters consumed to an int.
        charAsInt = int("".join(charStack), radix)

        # Certain characters get replaced with others
        if charAsInt in replacementCharacters:
            char = replacementCharacters[charAsInt]
            self.tokenQueue.append({"type": tokenTypes["ParseError"], "data":
                                    "illegal-codepoint-for-numeric-entity",
                                    "datavars": {"charAsInt": charAsInt}})
        elif ((0xD800 <= charAsInt <= 0xDFFF) or
              (charAsInt > 0x10FFFF)):
            char = "\uFFFD"
            self.tokenQueue.append({"type": tokenTypes["ParseError"], "data":
                                    "illegal-codepoint-for-numeric-entity",
                                    "datavars": {"charAsInt": charAsInt}})
        else:
            # Should speed up this check somehow (e.g. move the set to a constant)
            if ((0x0001 <= charAsInt <= 0x0008) or
                (0x000E <= charAsInt <= 0x001F) or
                (0x007F <= charAsInt <= 0x009F) or
                (0xFDD0 <= charAsInt <= 0xFDEF) or
                charAsInt in frozenset([0x000B, 0xFFFE, 0xFFFF, 0x1FFFE,
                                        0x1FFFF, 0x2FFFE, 0x2FFFF, 0x3FFFE,
                                        0x3FFFF, 0x4FFFE, 0x4FFFF, 0x5FFFE,
                                        0x5FFFF, 0x6FFFE, 0x6FFFF, 0x7FFFE,
                                        0x7FFFF, 0x8FFFE, 0x8FFFF, 0x9FFFE,
                                        0x9FFFF, 0xAFFFE, 0xAFFFF, 0xBFFFE,
                                        0xBFFFF, 0xCFFFE, 0xCFFFF, 0xDFFFE,
                                        0xDFFFF, 0xEFFFE, 0xEFFFF, 0xFFFFE,
                                        0xFFFFF, 0x10FFFE, 0x10FFFF])):
                self.tokenQueue.append({"type": tokenTypes["ParseError"],
                                        "data":
                                        "illegal-codepoint-for-numeric-entity",
                                        "datavars": {"charAsInt": charAsInt}})
            try:
                # Try/except needed as UCS-2 Python builds' unichar only works
                # within the BMP.
                char = chr(charAsInt)
            except ValueError:
                v = charAsInt - 0x10000
                char = chr(0xD800 | (v >> 10)) + chr(0xDC00 | (v & 0x3FF))

        # Discard the ; if present. Otherwise, put it back on the queue and
        # invoke parseError on parser.
        if c != ";":
            self.tokenQueue.append({"type": tokenTypes["ParseError"], "data":
                                    "numeric-entity-without-semicolon"})
            self.stream.unget(c)

        return char
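
    # Hedged examples (assumption): with the remaining stream "65;",
    # consumeNumberEntity(False) consumes the decimal digits and returns "A";
    # with "41;" and isHex=True it parses the hex value 0x41 and also returns
    # "A".  Code points in replacementCharacters, surrogates, and values above
    # 0x10FFFF come back as replacements with a ParseError token queued.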

    def consumeEntity(self, allowedChar=None, fromAttribute=False):
        # Initialise to the default output for when no entity is matched
        output = "&"

        charStack = [self.stream.char()]
        if (charStack[0] in spaceCharacters or charStack[0] in (EOF, "<", "&") or
                (allowedChar is not None and allowedChar == charStack[0])):
            self.stream.unget(charStack[0])

        elif charStack[0] == "#":
            # Read the next character to see if it's hex or decimal
            hex = False
            charStack.append(self.stream.char())
            if charStack[-1] in ("x", "X"):
                hex = True
                charStack.append(self.stream.char())

            # charStack[-1] should be the first digit
            if (hex and charStack[-1] in hexDigits) \
                    or (not hex and charStack[-1] in digits):
                # At least one digit found, so consume the whole number
                self.stream.unget(charStack[-1])
                output = self.consumeNumberEntity(hex)
            else:
                # No digits found
                self.tokenQueue.append({"type": tokenTypes["ParseError"],
                                        "data": "expected-numeric-entity"})
                self.stream.unget(charStack.pop())
                output = "&" + "".join(charStack)

        else:
            # At this point in the process might have named entity. Entities
            # are stored in the global variable "entities".
            #
            # Consume characters and compare to these to a substring of the
            # entity names in the list until the substring no longer matches.
            while (charStack[-1] is not EOF):
                if not entitiesTrie.has_keys_with_prefix("".join(charStack)):
                    break
                charStack.append(self.stream.char())

            # At this point we have a string that starts with some characters
            # that may match an entity
            # Try to find the longest entity the string will match to take care
            # of &noti for instance.
            try:
                entityName = entitiesTrie.longest_prefix("".join(charStack[:-1]))
                entityLength = len(entityName)
            except KeyError:
                entityName = None

            if entityName is not None:
                if entityName[-1] != ";":
                    self.tokenQueue.append({"type": tokenTypes["ParseError"], "data":
                                            "named-entity-without-semicolon"})
                if (entityName[-1] != ";" and fromAttribute and
                    (charStack[entityLength] in asciiLetters or
                     charStack[entityLength] in digits or
                     charStack[entityLength] == "=")):
                    self.stream.unget(charStack.pop())
                    output = "&" + "".join(charStack)
                else:
                    output = entities[entityName]
                    self.stream.unget(charStack.pop())
                    output += "".join(charStack[entityLength:])
            else:
                self.tokenQueue.append({"type": tokenTypes["ParseError"], "data":
                                        "expected-named-entity"})
                self.stream.unget(charStack.pop())
                output = "&" + "".join(charStack)

        if fromAttribute:
            self.currentToken["data"][-1][1] += output
        else:
            if output in spaceCharacters:
                tokenType = "SpaceCharacters"
            else:
                tokenType = "Characters"
            self.tokenQueue.append({"type": tokenTypes[tokenType], "data": output})
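
    # Hedged example (assumption): for the input "&notin;" the trie walk above
    # matches the full entity name and emits its single character, whereas for
    # "&notx" the longest matching prefix is "not", so the character for "not"
    # is emitted, the trailing "x" is pushed back onto the stream, and a
    # named-entity-without-semicolon parse error is queued.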

    def processEntityInAttribute(self, allowedChar):
        """This method replaces the need for "entityInAttributeValueState".
        """
        self.consumeEntity(allowedChar=allowedChar, fromAttribute=True)

    def emitCurrentToken(self):
        """This method is a generic handler for emitting the tags. It also sets
        the state to "data" because that's what's needed after a token has been
        emitted.
        """
        token = self.currentToken
        # Add token to the queue to be yielded
        if (token["type"] in tagTokenTypes):
            token["name"] = token["name"].translate(asciiUpper2Lower)
            if token["type"] == tokenTypes["EndTag"]:
                if token["data"]:
                    self.tokenQueue.append({"type": tokenTypes["ParseError"],
                                            "data": "attributes-in-end-tag"})
                if token["selfClosing"]:
                    self.tokenQueue.append({"type": tokenTypes["ParseError"],
                                            "data": "self-closing-flag-on-end-tag"})
        self.tokenQueue.append(token)
        self.state = self.dataState

    # Below are the various tokenizer states worked out.
    def dataState(self):
        data = self.stream.char()
        if data == "&":
            self.state = self.entityDataState
        elif data == "<":
            self.state = self.tagOpenState
        elif data == "\u0000":
            self.tokenQueue.append({"type": tokenTypes["ParseError"],
                                    "data": "invalid-codepoint"})
            self.tokenQueue.append({"type": tokenTypes["Characters"],
                                    "data": "\u0000"})
        elif data is EOF:
            # Tokenization ends.
            return False
        elif data in spaceCharacters:
            # Directly after emitting a token you switch back to the "data
            # state". At that point spaceCharacters are important so they are
            # emitted separately.
            self.tokenQueue.append({"type": tokenTypes["SpaceCharacters"], "data":
                                    data + self.stream.charsUntil(spaceCharacters, True)})
            # No need to update lastFourChars here, since the first space will
            # have already been appended to lastFourChars and will have broken
            # any <!-- or --> sequences
        else:
            chars = self.stream.charsUntil(("&", "<", "\u0000"))
            self.tokenQueue.append({"type": tokenTypes["Characters"], "data":
                                    data + chars})
        return True
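        # Note: runs of ordinary characters are batched into a single
        # Characters token via charsUntil; "&" and "<" hand off to the
        # entity-data and tag-open states respectively.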

    def entityDataState(self):
        self.consumeEntity()
        self.state = self.dataState
        return True

    def rcdataState(self):
        data = self.stream.char()
        if data == "&":
            self.state = self.characterReferenceInRcdata
        elif data == "<":
            self.state = self.rcdataLessThanSignState
        elif data is EOF:
            # Tokenization ends.
            return False
        elif data == "\u0000":
            self.tokenQueue.append({"type": tokenTypes["ParseError"],
                                    "data": "invalid-codepoint"})
            self.tokenQueue.append({"type": tokenTypes["Characters"],
                                    "data": "\uFFFD"})
        elif data in spaceCharacters:
            # Directly after emitting a token you switch back to the "data
            # state". At that point spaceCharacters are important so they are
            # emitted separately.
            self.tokenQueue.append({"type": tokenTypes["SpaceCharacters"], "data":
                                    data + self.stream.charsUntil(spaceCharacters, True)})
            # No need to update lastFourChars here, since the first space will
            # have already been appended to lastFourChars and will have broken
            # any <!-- or --> sequences
        else:
            chars = self.stream.charsUntil(("&", "<", "\u0000"))
            self.tokenQueue.append({"type": tokenTypes["Characters"], "data":
                                    data + chars})
        return True

    def characterReferenceInRcdata(self):
        self.consumeEntity()
        self.state = self.rcdataState
        return True

    def rawtextState(self):
        data = self.stream.char()
        if data == "<":
            self.state = self.rawtextLessThanSignState
        elif data == "\u0000":
            self.tokenQueue.append({"type": tokenTypes["ParseError"],
                                    "data": "invalid-codepoint"})
            self.tokenQueue.append({"type": tokenTypes["Characters"],
                                    "data": "\uFFFD"})
        elif data is EOF:
            # Tokenization ends.
            return False
        else:
            chars = self.stream.charsUntil(("<", "\u0000"))
            self.tokenQueue.append({"type": tokenTypes["Characters"], "data":
                                    data + chars})
        return True

    def scriptDataState(self):
        data = self.stream.char()
        if data == "<":
            self.state = self.scriptDataLessThanSignState
        elif data == "\u0000":
            self.tokenQueue.append({"type": tokenTypes["ParseError"],
                                    "data": "invalid-codepoint"})
            self.tokenQueue.append({"type": tokenTypes["Characters"],
                                    "data": "\uFFFD"})
        elif data is EOF:
            # Tokenization ends.
            return False
        else:
            chars = self.stream.charsUntil(("<", "\u0000"))
            self.tokenQueue.append({"type": tokenTypes["Characters"], "data":
                                    data + chars})
        return True

    def plaintextState(self):
        data = self.stream.char()
        if data == EOF:
            # Tokenization ends.
            return False
        elif data == "\u0000":
            self.tokenQueue.append({"type": tokenTypes["ParseError"],
                                    "data": "invalid-codepoint"})
            self.tokenQueue.append({"type": tokenTypes["Characters"],
                                    "data": "\uFFFD"})
        else:
            self.tokenQueue.append({"type": tokenTypes["Characters"], "data":
                                    data + self.stream.charsUntil("\u0000")})
        return True

    def tagOpenState(self):
        data = self.stream.char()
        if data == "!":
            self.state = self.markupDeclarationOpenState
        elif data == "/":
            self.state = self.closeTagOpenState
        elif data in asciiLetters:
            self.currentToken = {"type": tokenTypes["StartTag"],
                                 "name": data, "data": [],
                                 "selfClosing": False,
                                 "selfClosingAcknowledged": False}
            self.state = self.tagNameState
        elif data == ">":
            # XXX In theory it could be something besides a tag name. But
            # do we really care?
            self.tokenQueue.append({"type": tokenTypes["ParseError"], "data":
                                    "expected-tag-name-but-got-right-bracket"})
            self.tokenQueue.append({"type": tokenTypes["Characters"], "data": "<>"})
            self.state = self.dataState
        elif data == "?":
            # XXX In theory it could be something besides a tag name. But
            # do we really care?
            self.tokenQueue.append({"type": tokenTypes["ParseError"], "data":
                                    "expected-tag-name-but-got-question-mark"})
            self.stream.unget(data)
            self.state = self.bogusCommentState
        else:
            # Anything else: "<" did not start a tag. Emit it as a character
            # and reprocess the current character in the data state.
            self.tokenQueue.append({"type": tokenTypes["ParseError"], "data":
                                    "expected-tag-name"})
            self.tokenQueue.append({"type": tokenTypes["Characters"], "data": "<"})
            self.stream.unget(data)
            self.state = self.dataState
        return True

    def closeTagOpenState(self):
        data = self.stream.char()
        if data in asciiLetters:
            self.currentToken = {"type": tokenTypes["EndTag"], "name": data,
                                 "data": [], "selfClosing": False}
            self.state = self.tagNameState
        elif data == ">":
            self.tokenQueue.append({"type": tokenTypes["ParseError"], "data":
                                    "expected-closing-tag-but-got-right-bracket"})
            self.state = self.dataState
        elif data is EOF:
            self.tokenQueue.append({"type": tokenTypes["ParseError"], "data":
                                    "expected-closing-tag-but-got-eof"})
            self.tokenQueue.append({"type": tokenTypes["Characters"], "data": "</"})
            self.state = self.dataState
        else:
            # data can be any other character here, e.g. "'"; report the error
            # and reprocess it in the bogus comment state.
            self.tokenQueue.append({"type": tokenTypes["ParseError"], "data":
                                    "expected-closing-tag-but-got-char",
                                    "datavars": {"data": data}})
            self.stream.unget(data)
            self.state = self.bogusCommentState
        return True

    def tagNameState(self):
        data = self.stream.char()
        if data in spaceCharacters:
            self.state = self.beforeAttributeNameState
        elif data == ">":
            self.emitCurrentToken()
        elif data is EOF:
            self.tokenQueue.append({"type": tokenTypes["ParseError"], "data":
                                    "eof-in-tag-name"})
            self.state = self.dataState
        elif data == "/":
            self.state = self.selfClosingStartTagState
        elif data == "\u0000":
            self.tokenQueue.append({"type": tokenTypes["ParseError"],
                                    "data": "invalid-codepoint"})
            self.currentToken["name"] += "\uFFFD"
        else:
            self.currentToken["name"] += data
            # (Don't use charsUntil here, because tag names are
            # very short and it's faster to not do anything fancy)
        return True

    def rcdataLessThanSignState(self):
        data = self.stream.char()
        if data == "/":
            self.temporaryBuffer = ""
            self.state = self.rcdataEndTagOpenState
        else:
            self.tokenQueue.append({"type": tokenTypes["Characters"], "data": "<"})
            self.stream.unget(data)
            self.state = self.rcdataState
        return True

    def rcdataEndTagOpenState(self):
        data = self.stream.char()
        if data in asciiLetters:
            self.temporaryBuffer += data
            self.state = self.rcdataEndTagNameState
        else:
            self.tokenQueue.append({"type": tokenTypes["Characters"], "data": "</"})
            self.stream.unget(data)
            self.state = self.rcdataState
        return True

    def rcdataEndTagNameState(self):
        appropriate = self.currentToken and self.currentToken["name"].lower() == self.temporaryBuffer.lower()
        data = self.stream.char()
        if data in spaceCharacters and appropriate:
            self.currentToken = {"type": tokenTypes["EndTag"],
                                 "name": self.temporaryBuffer,
                                 "data": [], "selfClosing": False}
            self.state = self.beforeAttributeNameState
        elif data == "/" and appropriate:
            self.currentToken = {"type": tokenTypes["EndTag"],
                                 "name": self.temporaryBuffer,
                                 "data": [], "selfClosing": False}
            self.state = self.selfClosingStartTagState
        elif data == ">" and appropriate:
            self.currentToken = {"type": tokenTypes["EndTag"],
                                 "name": self.temporaryBuffer,
                                 "data": [], "selfClosing": False}
            self.emitCurrentToken()
            self.state = self.dataState
        elif data in asciiLetters:
            self.temporaryBuffer += data
        else:
            self.tokenQueue.append({"type": tokenTypes["Characters"],
                                    "data": "</" + self.temporaryBuffer})
            self.stream.unget(data)
            self.state = self.rcdataState
        return True
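        # Note: an "appropriate" end tag is one whose buffered name matches the
        # last start tag seen (self.currentToken), e.g. "</title>" inside a
        # <title> element; anything else is emitted back as literal
        # "</..." characters and RCDATA processing continues.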

    def rawtextLessThanSignState(self):
        data = self.stream.char()
        if data == "/":
            self.temporaryBuffer = ""
            self.state = self.rawtextEndTagOpenState
        else:
            self.tokenQueue.append({"type": tokenTypes["Characters"], "data": "<"})
            self.stream.unget(data)
            self.state = self.rawtextState
        return True

    def rawtextEndTagOpenState(self):
        data = self.stream.char()
        if data in asciiLetters:
            self.temporaryBuffer += data
            self.state = self.rawtextEndTagNameState
        else:
            self.tokenQueue.append({"type": tokenTypes["Characters"], "data": "</"})
            self.stream.unget(data)
            self.state = self.rawtextState
        return True

    def rawtextEndTagNameState(self):
        appropriate = self.currentToken and self.currentToken["name"].lower() == self.temporaryBuffer.lower()
        data = self.stream.char()
        if data in spaceCharacters and appropriate:
            self.currentToken = {"type": tokenTypes["EndTag"],
                                 "name": self.temporaryBuffer,
                                 "data": [], "selfClosing": False}
            self.state = self.beforeAttributeNameState
        elif data == "/" and appropriate:
            self.currentToken = {"type": tokenTypes["EndTag"],
                                 "name": self.temporaryBuffer,
                                 "data": [], "selfClosing": False}
            self.state = self.selfClosingStartTagState
        elif data == ">" and appropriate:
            self.currentToken = {"type": tokenTypes["EndTag"],
                                 "name": self.temporaryBuffer,
                                 "data": [], "selfClosing": False}
            self.emitCurrentToken()
            self.state = self.dataState
        elif data in asciiLetters:
            self.temporaryBuffer += data
        else:
            self.tokenQueue.append({"type": tokenTypes["Characters"],
                                    "data": "</" + self.temporaryBuffer})
            self.stream.unget(data)
            self.state = self.rawtextState
        return True

    def scriptDataLessThanSignState(self):
        data = self.stream.char()
        if data == "/":
            self.temporaryBuffer = ""
            self.state = self.scriptDataEndTagOpenState
        elif data == "!":
            self.tokenQueue.append({"type": tokenTypes["Characters"], "data": "<!"})
            self.state = self.scriptDataEscapeStartState
        else:
            self.tokenQueue.append({"type": tokenTypes["Characters"], "data": "<"})
            self.stream.unget(data)
            self.state = self.scriptDataState
        return True

    def scriptDataEndTagOpenState(self):
        data = self.stream.char()
        if data in asciiLetters:
            self.temporaryBuffer += data
            self.state = self.scriptDataEndTagNameState
        else:
            self.tokenQueue.append({"type": tokenTypes["Characters"], "data": "</"})
            self.stream.unget(data)
            self.state = self.scriptDataState
        return True

    def scriptDataEndTagNameState(self):
        appropriate = self.currentToken and self.currentToken["name"].lower() == self.temporaryBuffer.lower()
        data = self.stream.char()
        if data in spaceCharacters and appropriate:
            self.currentToken = {"type": tokenTypes["EndTag"],
                                 "name": self.temporaryBuffer,
                                 "data": [], "selfClosing": False}
            self.state = self.beforeAttributeNameState
        elif data == "/" and appropriate:
            self.currentToken = {"type": tokenTypes["EndTag"],
                                 "name": self.temporaryBuffer,
                                 "data": [], "selfClosing": False}
            self.state = self.selfClosingStartTagState
        elif data == ">" and appropriate:
            self.currentToken = {"type": tokenTypes["EndTag"],
                                 "name": self.temporaryBuffer,
                                 "data": [], "selfClosing": False}
            self.emitCurrentToken()
            self.state = self.dataState
        elif data in asciiLetters:
            self.temporaryBuffer += data
        else:
            self.tokenQueue.append({"type": tokenTypes["Characters"],
                                    "data": "</" + self.temporaryBuffer})
            self.stream.unget(data)
            self.state = self.scriptDataState
        return True

    def scriptDataEscapeStartState(self):
        data = self.stream.char()
        if data == "-":
            self.tokenQueue.append({"type": tokenTypes["Characters"], "data": "-"})
            self.state = self.scriptDataEscapeStartDashState
        else:
            self.stream.unget(data)
            self.state = self.scriptDataState
        return True

    def scriptDataEscapeStartDashState(self):
        data = self.stream.char()
        if data == "-":
            self.tokenQueue.append({"type": tokenTypes["Characters"], "data": "-"})
            self.state = self.scriptDataEscapedDashDashState
        else:
            self.stream.unget(data)
            self.state = self.scriptDataState
        return True

    def scriptDataEscapedState(self):
        data = self.stream.char()
        if data == "-":
            self.tokenQueue.append({"type": tokenTypes["Characters"], "data": "-"})
            self.state = self.scriptDataEscapedDashState
        elif data == "<":
            self.state = self.scriptDataEscapedLessThanSignState
        elif data == "\u0000":
            self.tokenQueue.append({"type": tokenTypes["ParseError"],
                                    "data": "invalid-codepoint"})
            self.tokenQueue.append({"type": tokenTypes["Characters"],
                                    "data": "\uFFFD"})
        elif data is EOF:
            self.state = self.dataState
        else:
            chars = self.stream.charsUntil(("<", "-", "\u0000"))
            self.tokenQueue.append({"type": tokenTypes["Characters"], "data":
                                    data + chars})
        return True

    def scriptDataEscapedDashState(self):
        data = self.stream.char()
        if data == "-":
            self.tokenQueue.append({"type": tokenTypes["Characters"], "data": "-"})
            self.state = self.scriptDataEscapedDashDashState
        elif data == "<":
            self.state = self.scriptDataEscapedLessThanSignState
        elif data == "\u0000":
            self.tokenQueue.append({"type": tokenTypes["ParseError"],
                                    "data": "invalid-codepoint"})
            self.tokenQueue.append({"type": tokenTypes["Characters"],
                                    "data": "\uFFFD"})
            self.state = self.scriptDataEscapedState
        elif data is EOF:
            self.state = self.dataState
        else:
            self.tokenQueue.append({"type": tokenTypes["Characters"], "data": data})
            self.state = self.scriptDataEscapedState
        return True

    def scriptDataEscapedDashDashState(self):
        data = self.stream.char()
        if data == "-":
            self.tokenQueue.append({"type": tokenTypes["Characters"], "data": "-"})
        elif data == "<":
            self.state = self.scriptDataEscapedLessThanSignState
        elif data == ">":
            self.tokenQueue.append({"type": tokenTypes["Characters"], "data": ">"})
            self.state = self.scriptDataState
        elif data == "\u0000":
            self.tokenQueue.append({"type": tokenTypes["ParseError"],
                                    "data": "invalid-codepoint"})
            self.tokenQueue.append({"type": tokenTypes["Characters"],
                                    "data": "\uFFFD"})
            self.state = self.scriptDataEscapedState
        elif data is EOF:
            self.state = self.dataState
        else:
            self.tokenQueue.append({"type": tokenTypes["Characters"], "data": data})
            self.state = self.scriptDataEscapedState
        return True

    def scriptDataEscapedLessThanSignState(self):
        data = self.stream.char()
        if data == "/":
            self.temporaryBuffer = ""
            self.state = self.scriptDataEscapedEndTagOpenState
        elif data in asciiLetters:
            self.tokenQueue.append({"type": tokenTypes["Characters"], "data": "<" + data})
            self.temporaryBuffer = data
            self.state = self.scriptDataDoubleEscapeStartState
        else:
            self.tokenQueue.append({"type": tokenTypes["Characters"], "data": "<"})
            self.stream.unget(data)
            self.state = self.scriptDataEscapedState
        return True

    def scriptDataEscapedEndTagOpenState(self):
        data = self.stream.char()
        if data in asciiLetters:
            self.temporaryBuffer = data
            self.state = self.scriptDataEscapedEndTagNameState
        else:
            self.tokenQueue.append({"type": tokenTypes["Characters"], "data": "</"})
            self.stream.unget(data)
            self.state = self.scriptDataEscapedState
        return True

    def scriptDataEscapedEndTagNameState(self):
        appropriate = self.currentToken and self.currentToken["name"].lower() == self.temporaryBuffer.lower()
        data = self.stream.char()
        if data in spaceCharacters and appropriate:
            self.currentToken = {"type": tokenTypes["EndTag"],
                                 "name": self.temporaryBuffer,
                                 "data": [], "selfClosing": False}
            self.state = self.beforeAttributeNameState
        elif data == "/" and appropriate:
            self.currentToken = {"type": tokenTypes["EndTag"],
                                 "name": self.temporaryBuffer,
                                 "data": [], "selfClosing": False}
            self.state = self.selfClosingStartTagState
        elif data == ">" and appropriate:
            self.currentToken = {"type": tokenTypes["EndTag"],
                                 "name": self.temporaryBuffer,
                                 "data": [], "selfClosing": False}
            self.emitCurrentToken()
            self.state = self.dataState
        elif data in asciiLetters:
            self.temporaryBuffer += data
        else:
            self.tokenQueue.append({"type": tokenTypes["Characters"],
                                    "data": "</" + self.temporaryBuffer})
            self.stream.unget(data)
            self.state = self.scriptDataEscapedState
        return True

    def scriptDataDoubleEscapeStartState(self):
        data = self.stream.char()
        if data in (spaceCharacters | frozenset(("/", ">"))):
            self.tokenQueue.append({"type": tokenTypes["Characters"], "data": data})
            if self.temporaryBuffer.lower() == "script":
                self.state = self.scriptDataDoubleEscapedState
            else:
                self.state = self.scriptDataEscapedState
        elif data in asciiLetters:
            self.tokenQueue.append({"type": tokenTypes["Characters"], "data": data})
            self.temporaryBuffer += data
        else:
            self.stream.unget(data)
            self.state = self.scriptDataEscapedState
        return True
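        # Note: the "double escaped" states below cover script content such as
        # <!-- <script> ... </script> ... -->.  After the inner <script>, a
        # following </script> is treated as plain characters (dropping back to
        # the escaped state) rather than being emitted as an end tag token.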

    def scriptDataDoubleEscapedState(self):
        data = self.stream.char()
        if data == "-":
            self.tokenQueue.append({"type": tokenTypes["Characters"], "data": "-"})
            self.state = self.scriptDataDoubleEscapedDashState
        elif data == "<":
            self.tokenQueue.append({"type": tokenTypes["Characters"], "data": "<"})
            self.state = self.scriptDataDoubleEscapedLessThanSignState
        elif data == "\u0000":
            self.tokenQueue.append({"type": tokenTypes["ParseError"],
                                    "data": "invalid-codepoint"})
            self.tokenQueue.append({"type": tokenTypes["Characters"],
                                    "data": "\uFFFD"})
        elif data is EOF:
            self.tokenQueue.append({"type": tokenTypes["ParseError"], "data":
                                    "eof-in-script-in-script"})
            self.state = self.dataState
        else:
            self.tokenQueue.append({"type": tokenTypes["Characters"], "data": data})
        return True

    def scriptDataDoubleEscapedDashState(self):
        data = self.stream.char()
        if data == "-":
            self.tokenQueue.append({"type": tokenTypes["Characters"], "data": "-"})
            self.state = self.scriptDataDoubleEscapedDashDashState
        elif data == "<":
            self.tokenQueue.append({"type": tokenTypes["Characters"], "data": "<"})
            self.state = self.scriptDataDoubleEscapedLessThanSignState
        elif data == "\u0000":
            self.tokenQueue.append({"type": tokenTypes["ParseError"],
                                    "data": "invalid-codepoint"})
            self.tokenQueue.append({"type": tokenTypes["Characters"],
                                    "data": "\uFFFD"})
            self.state = self.scriptDataDoubleEscapedState
        elif data is EOF:
            self.tokenQueue.append({"type": tokenTypes["ParseError"], "data":
                                    "eof-in-script-in-script"})
            self.state = self.dataState
        else:
            self.tokenQueue.append({"type": tokenTypes["Characters"], "data": data})
            self.state = self.scriptDataDoubleEscapedState
        return True

    def scriptDataDoubleEscapedDashDashState(self):
        data = self.stream.char()
        if data == "-":
            self.tokenQueue.append({"type": tokenTypes["Characters"], "data": "-"})
        elif data == "<":
            self.tokenQueue.append({"type": tokenTypes["Characters"], "data": "<"})
            self.state = self.scriptDataDoubleEscapedLessThanSignState
        elif data == ">":
            self.tokenQueue.append({"type": tokenTypes["Characters"], "data": ">"})
            self.state = self.scriptDataState
        elif data == "\u0000":
            self.tokenQueue.append({"type": tokenTypes["ParseError"],
                                    "data": "invalid-codepoint"})
            self.tokenQueue.append({"type": tokenTypes["Characters"],
                                    "data": "\uFFFD"})
            self.state = self.scriptDataDoubleEscapedState
        elif data is EOF:
            self.tokenQueue.append({"type": tokenTypes["ParseError"], "data":
                                    "eof-in-script-in-script"})
            self.state = self.dataState
        else:
            self.tokenQueue.append({"type": tokenTypes["Characters"], "data": data})
            self.state = self.scriptDataDoubleEscapedState
        return True

    def scriptDataDoubleEscapedLessThanSignState(self):
        data = self.stream.char()
        if data == "/":
            self.tokenQueue.append({"type": tokenTypes["Characters"], "data": "/"})
            self.temporaryBuffer = ""
            self.state = self.scriptDataDoubleEscapeEndState
        else:
            self.stream.unget(data)
            self.state = self.scriptDataDoubleEscapedState
        return True

    def scriptDataDoubleEscapeEndState(self):
        data = self.stream.char()
        if data in (spaceCharacters | frozenset(("/", ">"))):
            self.tokenQueue.append({"type": tokenTypes["Characters"], "data": data})
            if self.temporaryBuffer.lower() == "script":
                self.state = self.scriptDataEscapedState
            else:
                self.state = self.scriptDataDoubleEscapedState
        elif data in asciiLetters:
            self.tokenQueue.append({"type": tokenTypes["Characters"], "data": data})
            self.temporaryBuffer += data
        else:
            self.stream.unget(data)
            self.state = self.scriptDataDoubleEscapedState
        return True

    def beforeAttributeNameState(self):
        data = self.stream.char()
        if data in spaceCharacters:
            self.stream.charsUntil(spaceCharacters, True)
        elif data in asciiLetters:
            self.currentToken["data"].append([data, ""])
            self.state = self.attributeNameState
        elif data == ">":
            self.emitCurrentToken()
        elif data == "/":
            self.state = self.selfClosingStartTagState
        elif data in ("'", '"', "=", "<"):
            self.tokenQueue.append({"type": tokenTypes["ParseError"], "data":
                                    "invalid-character-in-attribute-name"})
            self.currentToken["data"].append([data, ""])
            self.state = self.attributeNameState
        elif data == "\u0000":
            self.tokenQueue.append({"type": tokenTypes["ParseError"],
                                    "data": "invalid-codepoint"})
            self.currentToken["data"].append(["\uFFFD", ""])
            self.state = self.attributeNameState
        elif data is EOF:
            self.tokenQueue.append({"type": tokenTypes["ParseError"], "data":
                                    "expected-attribute-name-but-got-eof"})
            self.state = self.dataState
        else:
            self.currentToken["data"].append([data, ""])
            self.state = self.attributeNameState
        return True

    def attributeNameState(self):
        data = self.stream.char()
        leavingThisState = True
        emitToken = False
        if data == "=":
            self.state = self.beforeAttributeValueState
        elif data in asciiLetters:
            self.currentToken["data"][-1][0] += data +\
                self.stream.charsUntil(asciiLetters, True)
            leavingThisState = False
        elif data == ">":
            # XXX If we emitted the token here, the attributes would already
            # have been converted to a dict without being checked, and the code
            # below would fail because "data" would be a dict rather than a list.
            emitToken = True
        elif data in spaceCharacters:
            self.state = self.afterAttributeNameState
        elif data == "/":
            self.state = self.selfClosingStartTagState
        elif data == "\u0000":
            self.tokenQueue.append({"type": tokenTypes["ParseError"],
                                    "data": "invalid-codepoint"})
            self.currentToken["data"][-1][0] += "\uFFFD"
            leavingThisState = False
        elif data in ("'", '"', "<"):
            self.tokenQueue.append({"type": tokenTypes["ParseError"],
                                    "data":
                                    "invalid-character-in-attribute-name"})
            self.currentToken["data"][-1][0] += data
            leavingThisState = False
        elif data is EOF:
            self.tokenQueue.append({"type": tokenTypes["ParseError"],
                                    "data": "eof-in-attribute-name"})
            self.state = self.dataState
        else:
            self.currentToken["data"][-1][0] += data
            leavingThisState = False

        if leavingThisState:
            # Attributes are not dropped at this stage. That happens only when
            # the start tag token is emitted, so values can still be safely
            # appended to attributes, but we do want to report the
            # duplicate-attribute parse error in time.
            self.currentToken["data"][-1][0] = (
                self.currentToken["data"][-1][0].translate(asciiUpper2Lower))
            for name, _ in self.currentToken["data"][:-1]:
                if self.currentToken["data"][-1][0] == name:
                    self.tokenQueue.append({"type": tokenTypes["ParseError"], "data":
                                            "duplicate-attribute"})
                    break
            # Emit only now, after the duplicate check, to work around the
            # issue described in the XXX comment above.
            if emitToken:
                self.emitCurrentToken()
        return True
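        # Example: for '<p CLASS=a class=b>' the second attribute name triggers
        # the "duplicate-attribute" parse error above; the duplicate itself is
        # only dropped later, when the attribute list is turned into a dict as
        # the start tag token is emitted.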

    def afterAttributeNameState(self):
        data = self.stream.char()
        if data in spaceCharacters:
            self.stream.charsUntil(spaceCharacters, True)
        elif data == "=":
            self.state = self.beforeAttributeValueState
        elif data == ">":
            self.emitCurrentToken()
        elif data in asciiLetters:
            self.currentToken["data"].append([data, ""])
            self.state = self.attributeNameState
        elif data == "/":
            self.state = self.selfClosingStartTagState
        elif data == "\u0000":
            self.tokenQueue.append({"type": tokenTypes["ParseError"],
                                    "data": "invalid-codepoint"})
            self.currentToken["data"].append(["\uFFFD", ""])
            self.state = self.attributeNameState
        elif data in ("'", '"', "<"):
            self.tokenQueue.append({"type": tokenTypes["ParseError"], "data":
                                    "invalid-character-after-attribute-name"})
            self.currentToken["data"].append([data, ""])
            self.state = self.attributeNameState
        elif data is EOF:
            self.tokenQueue.append({"type": tokenTypes["ParseError"], "data":
                                    "expected-end-of-tag-but-got-eof"})
            self.state = self.dataState
        else:
            self.currentToken["data"].append([data, ""])
            self.state = self.attributeNameState
        return True

    def beforeAttributeValueState(self):
        data = self.stream.char()
        if data in spaceCharacters:
            self.stream.charsUntil(spaceCharacters, True)
        elif data == "\"":
            self.state = self.attributeValueDoubleQuotedState
        elif data == "&":
            self.state = self.attributeValueUnQuotedState
            self.stream.unget(data)
        elif data == "'":
            self.state = self.attributeValueSingleQuotedState
        elif data == ">":
            self.tokenQueue.append({"type": tokenTypes["ParseError"], "data":
                                    "expected-attribute-value-but-got-right-bracket"})
            self.emitCurrentToken()
        elif data == "\u0000":
            self.tokenQueue.append({"type": tokenTypes["ParseError"],
                                    "data": "invalid-codepoint"})
            self.currentToken["data"][-1][1] += "\uFFFD"
            self.state = self.attributeValueUnQuotedState
        elif data in ("=", "<", "`"):
            self.tokenQueue.append({"type": tokenTypes["ParseError"], "data":
                                    "equals-in-unquoted-attribute-value"})
            self.currentToken["data"][-1][1] += data
            self.state = self.attributeValueUnQuotedState
        elif data is EOF:
            self.tokenQueue.append({"type": tokenTypes["ParseError"], "data":
                                    "expected-attribute-value-but-got-eof"})
            self.state = self.dataState
        else:
            self.currentToken["data"][-1][1] += data
            self.state = self.attributeValueUnQuotedState
        return True

    def attributeValueDoubleQuotedState(self):
        data = self.stream.char()
        if data == "\"":
            self.state = self.afterAttributeValueState
        elif data == "&":
            self.processEntityInAttribute('"')
        elif data == "\u0000":
            self.tokenQueue.append({"type": tokenTypes["ParseError"],
                                    "data": "invalid-codepoint"})
            self.currentToken["data"][-1][1] += "\uFFFD"
        elif data is EOF:
            self.tokenQueue.append({"type": tokenTypes["ParseError"], "data":
                                    "eof-in-attribute-value-double-quote"})
            self.state = self.dataState
        else:
            self.currentToken["data"][-1][1] += data +\
                self.stream.charsUntil(("\"", "&", "\u0000"))
        return True

    def attributeValueSingleQuotedState(self):
        data = self.stream.char()
        if data == "'":
            self.state = self.afterAttributeValueState
        elif data == "&":
            self.processEntityInAttribute("'")
        elif data == "\u0000":
            self.tokenQueue.append({"type": tokenTypes["ParseError"],
                                    "data": "invalid-codepoint"})
            self.currentToken["data"][-1][1] += "\uFFFD"
        elif data is EOF:
            self.tokenQueue.append({"type": tokenTypes["ParseError"], "data":
                                    "eof-in-attribute-value-single-quote"})
            self.state = self.dataState
        else:
            self.currentToken["data"][-1][1] += data +\
                self.stream.charsUntil(("'", "&", "\u0000"))
        return True

    def attributeValueUnQuotedState(self):
        data = self.stream.char()
        if data in spaceCharacters:
            self.state = self.beforeAttributeNameState
        elif data == "&":
            self.processEntityInAttribute(">")
        elif data == ">":
            self.emitCurrentToken()
        elif data in ('"', "'", "=", "<", "`"):
            self.tokenQueue.append({"type": tokenTypes["ParseError"], "data":
                                    "unexpected-character-in-unquoted-attribute-value"})
            self.currentToken["data"][-1][1] += data
        elif data == "\u0000":
            self.tokenQueue.append({"type": tokenTypes["ParseError"],
                                    "data": "invalid-codepoint"})
            self.currentToken["data"][-1][1] += "\uFFFD"
        elif data is EOF:
            self.tokenQueue.append({"type": tokenTypes["ParseError"], "data":
                                    "eof-in-attribute-value-no-quotes"})
            self.state = self.dataState
        else:
            self.currentToken["data"][-1][1] += data + self.stream.charsUntil(
                frozenset(("&", ">", '"', "'", "=", "<", "`", "\u0000")) | spaceCharacters)
        return True

    def afterAttributeValueState(self):
        data = self.stream.char()
        if data in spaceCharacters:
            self.state = self.beforeAttributeNameState
        elif data == ">":
            self.emitCurrentToken()
        elif data == "/":
            self.state = self.selfClosingStartTagState
        elif data is EOF:
            self.tokenQueue.append({"type": tokenTypes["ParseError"], "data":
                                    "unexpected-EOF-after-attribute-value"})
            self.stream.unget(data)
            self.state = self.dataState
        else:
            self.tokenQueue.append({"type": tokenTypes["ParseError"], "data":
                                    "unexpected-character-after-attribute-value"})
            self.stream.unget(data)
            self.state = self.beforeAttributeNameState
        return True

    def selfClosingStartTagState(self):
        data = self.stream.char()
        if data == ">":
            self.currentToken["selfClosing"] = True
            self.emitCurrentToken()
        elif data is EOF:
            self.tokenQueue.append({"type": tokenTypes["ParseError"],
                                    "data":
                                    "unexpected-EOF-after-solidus-in-tag"})
            self.stream.unget(data)
            self.state = self.dataState
        else:
            self.tokenQueue.append({"type": tokenTypes["ParseError"], "data":
                                    "unexpected-character-after-solidus-in-tag"})
            self.stream.unget(data)
            self.state = self.beforeAttributeNameState
        return True

    def bogusCommentState(self):
        # Make a new comment token and give it as value all the characters
        # until the first > or EOF (charsUntil checks for EOF automatically)
        # and emit it.
        data = self.stream.charsUntil(">")
        data = data.replace("\u0000", "\uFFFD")
        self.tokenQueue.append(
            {"type": tokenTypes["Comment"], "data": data})

        # Eat the character directly after the bogus comment, which is either
        # a ">" or an EOF.
        self.stream.char()
        self.state = self.dataState
        return True
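        # Example: "<?php echo 1 ?>" reaches this state via tagOpenState (the
        # "?" is ungot first), so the emitted token is a Comment whose data is
        # "?php echo 1 ?".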

    def markupDeclarationOpenState(self):
        charStack = [self.stream.char()]
        if charStack[-1] == "-":
            charStack.append(self.stream.char())
            if charStack[-1] == "-":
                self.currentToken = {"type": tokenTypes["Comment"], "data": ""}
                self.state = self.commentStartState
                return True
        elif charStack[-1] in ('d', 'D'):
            matched = True
            for expected in (('o', 'O'), ('c', 'C'), ('t', 'T'),
                             ('y', 'Y'), ('p', 'P'), ('e', 'E')):
                charStack.append(self.stream.char())
                if charStack[-1] not in expected:
                    matched = False
                    break
            if matched:
                self.currentToken = {"type": tokenTypes["Doctype"],
                                     "name": "",
                                     "publicId": None, "systemId": None,
                                     "correct": True}
                self.state = self.doctypeState
                return True
        elif (charStack[-1] == "[" and
              self.parser is not None and
              self.parser.tree.openElements and
              self.parser.tree.openElements[-1].namespace != self.parser.tree.defaultNamespace):
            matched = True
            for expected in ["C", "D", "A", "T", "A", "["]:
                charStack.append(self.stream.char())
                if charStack[-1] != expected:
                    matched = False
                    break
            if matched:
                self.state = self.cdataSectionState
                return True

        self.tokenQueue.append({"type": tokenTypes["ParseError"], "data":
                                "expected-dashes-or-doctype"})

        while charStack:
            self.stream.unget(charStack.pop())
        self.state = self.bogusCommentState
        return True
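        # Note: "<!" may be followed by "--" (comment), a case-insensitive
        # "DOCTYPE", or "[CDATA[" (the latter only when the current node is
        # outside the default HTML namespace); anything else is a parse error
        # and the buffered characters are pushed back for the bogus comment
        # state.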

    def commentStartState(self):
        data = self.stream.char()
        if data == "-":
            self.state = self.commentStartDashState
        elif data == "\u0000":
            self.tokenQueue.append({"type": tokenTypes["ParseError"],
                                    "data": "invalid-codepoint"})
            self.currentToken["data"] += "\uFFFD"
        elif data == ">":
            self.tokenQueue.append({"type": tokenTypes["ParseError"], "data":
                                    "incorrect-comment"})
            self.tokenQueue.append(self.currentToken)
            self.state = self.dataState
        elif data is EOF:
            self.tokenQueue.append({"type": tokenTypes["ParseError"], "data":
                                    "eof-in-comment"})
            self.tokenQueue.append(self.currentToken)
            self.state = self.dataState
        else:
            self.currentToken["data"] += data
            self.state = self.commentState
        return True

    def commentStartDashState(self):
        data = self.stream.char()
        if data == "-":
            self.state = self.commentEndState
        elif data == "\u0000":
            self.tokenQueue.append({"type": tokenTypes["ParseError"],
                                    "data": "invalid-codepoint"})
            self.currentToken["data"] += "-\uFFFD"
        elif data == ">":
            self.tokenQueue.append({"type": tokenTypes["ParseError"], "data":
                                    "incorrect-comment"})
            self.tokenQueue.append(self.currentToken)
            self.state = self.dataState
        elif data is EOF:
            self.tokenQueue.append({"type": tokenTypes["ParseError"], "data":
                                    "eof-in-comment"})
            self.tokenQueue.append(self.currentToken)
            self.state = self.dataState
        else:
            self.currentToken["data"] += "-" + data
            self.state = self.commentState
        return True

    def commentState(self):
        data = self.stream.char()
        if data == "-":
            self.state = self.commentEndDashState
        elif data == "\u0000":
            self.tokenQueue.append({"type": tokenTypes["ParseError"],
                                    "data": "invalid-codepoint"})
            self.currentToken["data"] += "\uFFFD"
        elif data is EOF:
            self.tokenQueue.append({"type": tokenTypes["ParseError"],
                                    "data": "eof-in-comment"})
            self.tokenQueue.append(self.currentToken)
            self.state = self.dataState
        else:
            self.currentToken["data"] += data + \
                self.stream.charsUntil(("-", "\u0000"))
        return True

    def commentEndDashState(self):
        data = self.stream.char()
        if data == "-":
            self.state = self.commentEndState
        elif data == "\u0000":
            self.tokenQueue.append({"type": tokenTypes["ParseError"],
                                    "data": "invalid-codepoint"})
            self.currentToken["data"] += "-\uFFFD"
            self.state = self.commentState
        elif data is EOF:
            self.tokenQueue.append({"type": tokenTypes["ParseError"], "data":
                                    "eof-in-comment-end-dash"})
            self.tokenQueue.append(self.currentToken)
            self.state = self.dataState
        else:
            self.currentToken["data"] += "-" + data
            self.state = self.commentState
        return True

    def commentEndState(self):
        data = self.stream.char()
        if data == ">":
            self.tokenQueue.append(self.currentToken)
            self.state = self.dataState
        elif data == "\u0000":
            self.tokenQueue.append({"type": tokenTypes["ParseError"],
                                    "data": "invalid-codepoint"})
            self.currentToken["data"] += "--\uFFFD"
            self.state = self.commentState
        elif data == "!":
            self.tokenQueue.append({"type": tokenTypes["ParseError"], "data":
                                    "unexpected-bang-after-double-dash-in-comment"})
            self.state = self.commentEndBangState
        elif data == "-":
            self.tokenQueue.append({"type": tokenTypes["ParseError"], "data":
                                    "unexpected-dash-after-double-dash-in-comment"})
            self.currentToken["data"] += data
        elif data is EOF:
            self.tokenQueue.append({"type": tokenTypes["ParseError"], "data":
                                    "eof-in-comment-double-dash"})
            self.tokenQueue.append(self.currentToken)
            self.state = self.dataState
        else:
            # Anything else: the "--" did not close the comment; keep it as
            # comment text and continue in the comment state.
            self.tokenQueue.append({"type": tokenTypes["ParseError"], "data":
                                    "unexpected-char-in-comment"})
            self.currentToken["data"] += "--" + data
            self.state = self.commentState
        return True

    def commentEndBangState(self):
        data = self.stream.char()
        if data == ">":
            self.tokenQueue.append(self.currentToken)
            self.state = self.dataState
        elif data == "-":
            self.currentToken["data"] += "--!"
            self.state = self.commentEndDashState
        elif data == "\u0000":
            self.tokenQueue.append({"type": tokenTypes["ParseError"],
                                    "data": "invalid-codepoint"})
            self.currentToken["data"] += "--!\uFFFD"
            self.state = self.commentState
        elif data is EOF:
            self.tokenQueue.append({"type": tokenTypes["ParseError"], "data":
                                    "eof-in-comment-end-bang-state"})
            self.tokenQueue.append(self.currentToken)
            self.state = self.dataState
        else:
            self.currentToken["data"] += "--!" + data
            self.state = self.commentState
        return True

    def doctypeState(self):
        data = self.stream.char()
        if data in spaceCharacters:
            self.state = self.beforeDoctypeNameState
        elif data is EOF:
            self.tokenQueue.append({"type": tokenTypes["ParseError"], "data":
                                    "expected-doctype-name-but-got-eof"})
            self.currentToken["correct"] = False
            self.tokenQueue.append(self.currentToken)
            self.state = self.dataState
        else:
            self.tokenQueue.append({"type": tokenTypes["ParseError"], "data":
                                    "need-space-after-doctype"})
            self.stream.unget(data)
            self.state = self.beforeDoctypeNameState
        return True

    def beforeDoctypeNameState(self):
        data = self.stream.char()
        if data in spaceCharacters:
            pass
        elif data == ">":
            self.tokenQueue.append({"type": tokenTypes["ParseError"], "data":
                                    "expected-doctype-name-but-got-right-bracket"})
            self.currentToken["correct"] = False
            self.tokenQueue.append(self.currentToken)
            self.state = self.dataState
        elif data == "\u0000":
            self.tokenQueue.append({"type": tokenTypes["ParseError"],
                                    "data": "invalid-codepoint"})
            self.currentToken["name"] = "\uFFFD"
            self.state = self.doctypeNameState
        elif data is EOF:
            self.tokenQueue.append({"type": tokenTypes["ParseError"], "data":
                                    "expected-doctype-name-but-got-eof"})
            self.currentToken["correct"] = False
            self.tokenQueue.append(self.currentToken)
            self.state = self.dataState
        else:
            self.currentToken["name"] = data
            self.state = self.doctypeNameState
        return True

    def doctypeNameState(self):
        data = self.stream.char()
        if data in spaceCharacters:
            self.currentToken["name"] = self.currentToken["name"].translate(asciiUpper2Lower)
            self.state = self.afterDoctypeNameState
        elif data == ">":
            self.currentToken["name"] = self.currentToken["name"].translate(asciiUpper2Lower)
            self.tokenQueue.append(self.currentToken)
            self.state = self.dataState
        elif data == "\u0000":
            self.tokenQueue.append({"type": tokenTypes["ParseError"],
                                    "data": "invalid-codepoint"})
            self.currentToken["name"] += "\uFFFD"
            self.state = self.doctypeNameState
        elif data is EOF:
            self.tokenQueue.append({"type": tokenTypes["ParseError"], "data":
                                    "eof-in-doctype-name"})
            self.currentToken["correct"] = False
            self.currentToken["name"] = self.currentToken["name"].translate(asciiUpper2Lower)
            self.tokenQueue.append(self.currentToken)
            self.state = self.dataState
        else:
            self.currentToken["name"] += data
        return True

    def afterDoctypeNameState(self):
        data = self.stream.char()
        if data in spaceCharacters:
            pass
        elif data == ">":
            self.tokenQueue.append(self.currentToken)
            self.state = self.dataState
        elif data is EOF:
            self.currentToken["correct"] = False
            self.stream.unget(data)
            self.tokenQueue.append({"type": tokenTypes["ParseError"], "data":
                                    "eof-in-doctype"})
            self.tokenQueue.append(self.currentToken)
            self.state = self.dataState
        else:
            if data in ("p", "P"):
                matched = True
                for expected in (("u", "U"), ("b", "B"), ("l", "L"),
                                 ("i", "I"), ("c", "C")):
                    data = self.stream.char()
                    if data not in expected:
                        matched = False
                        break
                if matched:
                    self.state = self.afterDoctypePublicKeywordState
                    return True
            elif data in ("s", "S"):
                matched = True
                for expected in (("y", "Y"), ("s", "S"), ("t", "T"),
                                 ("e", "E"), ("m", "M")):
                    data = self.stream.char()
                    if data not in expected:
                        matched = False
                        break
                if matched:
                    self.state = self.afterDoctypeSystemKeywordState
                    return True

            # All the characters read before the current 'data' will be
            # [a-zA-Z], so they're garbage in the bogus doctype and can be
            # discarded; only the latest character might be '>' or EOF
            # and needs to be ungetted
            self.stream.unget(data)
            self.tokenQueue.append({"type": tokenTypes["ParseError"], "data":
                                    "expected-space-or-right-bracket-in-doctype", "datavars":
                                    {"data": data}})
            self.currentToken["correct"] = False
            self.state = self.bogusDoctypeState

        return True
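        # Example: for '<!DOCTYPE html PUBLIC "-//W3C//DTD HTML 4.01//EN">'
        # the letters "PUBLIC" are matched case-insensitively above and
        # processing moves on to afterDoctypePublicKeywordState; "SYSTEM" is
        # handled the same way.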

    def afterDoctypePublicKeywordState(self):
        data = self.stream.char()
        if data in spaceCharacters:
            self.state = self.beforeDoctypePublicIdentifierState
        elif data in ("'", '"'):
            self.tokenQueue.append({"type": tokenTypes["ParseError"], "data":
                                    "unexpected-char-in-doctype"})
            self.stream.unget(data)
            self.state = self.beforeDoctypePublicIdentifierState
        elif data is EOF:
            self.tokenQueue.append({"type": tokenTypes["ParseError"], "data":
                                    "eof-in-doctype"})
            self.currentToken["correct"] = False
            self.tokenQueue.append(self.currentToken)
            self.state = self.dataState
        else:
            self.stream.unget(data)
            self.state = self.beforeDoctypePublicIdentifierState
        return True

    def beforeDoctypePublicIdentifierState(self):
        data = self.stream.char()
        if data in spaceCharacters:
            pass
        elif data == "\"":
            self.currentToken["publicId"] = ""
            self.state = self.doctypePublicIdentifierDoubleQuotedState
        elif data == "'":
            self.currentToken["publicId"] = ""
            self.state = self.doctypePublicIdentifierSingleQuotedState
        elif data == ">":
            self.tokenQueue.append({"type": tokenTypes["ParseError"], "data":
                                    "unexpected-end-of-doctype"})
            self.currentToken["correct"] = False
            self.tokenQueue.append(self.currentToken)
            self.state = self.dataState
        elif data is EOF:
            self.tokenQueue.append({"type": tokenTypes["ParseError"], "data":
                                    "eof-in-doctype"})
            self.currentToken["correct"] = False
            self.tokenQueue.append(self.currentToken)
            self.state = self.dataState
        else:
            self.tokenQueue.append({"type": tokenTypes["ParseError"], "data":
                                    "unexpected-char-in-doctype"})
            self.currentToken["correct"] = False
            self.state = self.bogusDoctypeState
        return True

    def doctypePublicIdentifierDoubleQuotedState(self):
        data = self.stream.char()
        if data == "\"":
            self.state = self.afterDoctypePublicIdentifierState
        elif data == "\u0000":
            self.tokenQueue.append({"type": tokenTypes["ParseError"],
                                    "data": "invalid-codepoint"})
            self.currentToken["publicId"] += "\uFFFD"
        elif data == ">":
            self.tokenQueue.append({"type": tokenTypes["ParseError"], "data":
                                    "unexpected-end-of-doctype"})
            self.currentToken["correct"] = False
            self.tokenQueue.append(self.currentToken)
            self.state = self.dataState
        elif data is EOF:
            self.tokenQueue.append({"type": tokenTypes["ParseError"], "data":
                                    "eof-in-doctype"})
            self.currentToken["correct"] = False
            self.tokenQueue.append(self.currentToken)
            self.state = self.dataState
        else:
            self.currentToken["publicId"] += data
        return True

    def doctypePublicIdentifierSingleQuotedState(self):
        data = self.stream.char()
        if data == "'":
            self.state = self.afterDoctypePublicIdentifierState
        elif data == "\u0000":
            self.tokenQueue.append({"type": tokenTypes["ParseError"],
                                    "data": "invalid-codepoint"})
            self.currentToken["publicId"] += "\uFFFD"
        elif data == ">":
            self.tokenQueue.append({"type": tokenTypes["ParseError"], "data":
                                    "unexpected-end-of-doctype"})
            self.currentToken["correct"] = False
            self.tokenQueue.append(self.currentToken)
            self.state = self.dataState
        elif data is EOF:
            self.tokenQueue.append({"type": tokenTypes["ParseError"], "data":
                                    "eof-in-doctype"})
            self.currentToken["correct"] = False
            self.tokenQueue.append(self.currentToken)
            self.state = self.dataState
        else:
            self.currentToken["publicId"] += data
        return True

    def afterDoctypePublicIdentifierState(self):
        data = self.stream.char()
        if data in spaceCharacters:
            self.state = self.betweenDoctypePublicAndSystemIdentifiersState
        elif data == ">":
            self.tokenQueue.append(self.currentToken)
            self.state = self.dataState
        elif data == '"':
            self.tokenQueue.append({"type": tokenTypes["ParseError"], "data":
                                    "unexpected-char-in-doctype"})
            self.currentToken["systemId"] = ""
            self.state = self.doctypeSystemIdentifierDoubleQuotedState
        elif data == "'":
            self.tokenQueue.append({"type": tokenTypes["ParseError"], "data":
                                    "unexpected-char-in-doctype"})
            self.currentToken["systemId"] = ""
            self.state = self.doctypeSystemIdentifierSingleQuotedState
        elif data is EOF:
            self.tokenQueue.append({"type": tokenTypes["ParseError"], "data":
                                    "eof-in-doctype"})
            self.currentToken["correct"] = False
            self.tokenQueue.append(self.currentToken)
            self.state = self.dataState
        else:
            self.tokenQueue.append({"type": tokenTypes["ParseError"], "data":
                                    "unexpected-char-in-doctype"})
            self.currentToken["correct"] = False
            self.state = self.bogusDoctypeState
        return True

    def betweenDoctypePublicAndSystemIdentifiersState(self):
        data = self.stream.char()
        if data in spaceCharacters:
            pass
        elif data == ">":
            self.tokenQueue.append(self.currentToken)
            self.state = self.dataState
        elif data == '"':
            self.currentToken["systemId"] = ""
            self.state = self.doctypeSystemIdentifierDoubleQuotedState
        elif data == "'":
            self.currentToken["systemId"] = ""
            self.state = self.doctypeSystemIdentifierSingleQuotedState
        elif data is EOF:
            self.tokenQueue.append({"type": tokenTypes["ParseError"], "data":
                                    "eof-in-doctype"})
            self.currentToken["correct"] = False
            self.tokenQueue.append(self.currentToken)
            self.state = self.dataState
        else:
            self.tokenQueue.append({"type": tokenTypes["ParseError"], "data":
                                    "unexpected-char-in-doctype"})
            self.currentToken["correct"] = False
            self.state = self.bogusDoctypeState
        return True

    def afterDoctypeSystemKeywordState(self):
        data = self.stream.char()
        if data in spaceCharacters:
            self.state = self.beforeDoctypeSystemIdentifierState
        elif data in ("'", '"'):
            self.tokenQueue.append({"type": tokenTypes["ParseError"], "data":
                                    "unexpected-char-in-doctype"})
            self.stream.unget(data)
            self.state = self.beforeDoctypeSystemIdentifierState
        elif data is EOF:
            self.tokenQueue.append({"type": tokenTypes["ParseError"], "data":
                                    "eof-in-doctype"})
            self.currentToken["correct"] = False
            self.tokenQueue.append(self.currentToken)
            self.state = self.dataState
        else:
            self.stream.unget(data)
            self.state = self.beforeDoctypeSystemIdentifierState
        return True

    def beforeDoctypeSystemIdentifierState(self):
        data = self.stream.char()
        if data in spaceCharacters:
            pass
        elif data == "\"":
            self.currentToken["systemId"] = ""
            self.state = self.doctypeSystemIdentifierDoubleQuotedState
        elif data == "'":
            self.currentToken["systemId"] = ""
            self.state = self.doctypeSystemIdentifierSingleQuotedState
        elif data == ">":
            self.tokenQueue.append({"type": tokenTypes["ParseError"], "data":
                                    "unexpected-char-in-doctype"})
            self.currentToken["correct"] = False
            self.tokenQueue.append(self.currentToken)
            self.state = self.dataState
        elif data is EOF:
            self.tokenQueue.append({"type": tokenTypes["ParseError"], "data":
                                    "eof-in-doctype"})
            self.currentToken["correct"] = False
            self.tokenQueue.append(self.currentToken)
            self.state = self.dataState
        else:
            self.tokenQueue.append({"type": tokenTypes["ParseError"], "data":
                                    "unexpected-char-in-doctype"})
            self.currentToken["correct"] = False
            self.state = self.bogusDoctypeState
        return True

    def doctypeSystemIdentifierDoubleQuotedState(self):
        data = self.stream.char()
        if data == "\"":
            self.state = self.afterDoctypeSystemIdentifierState
        elif data == "\u0000":
            self.tokenQueue.append({"type": tokenTypes["ParseError"],
                                    "data": "invalid-codepoint"})
            self.currentToken["systemId"] += "\uFFFD"
        elif data == ">":
            self.tokenQueue.append({"type": tokenTypes["ParseError"], "data":
                                    "unexpected-end-of-doctype"})
            self.currentToken["correct"] = False
            self.tokenQueue.append(self.currentToken)
            self.state = self.dataState
        elif data is EOF:
            self.tokenQueue.append({"type": tokenTypes["ParseError"], "data":
                                    "eof-in-doctype"})
            self.currentToken["correct"] = False
            self.tokenQueue.append(self.currentToken)
            self.state = self.dataState
        else:
            self.currentToken["systemId"] += data
        return True

    def doctypeSystemIdentifierSingleQuotedState(self):
        data = self.stream.char()
        if data == "'":
            self.state = self.afterDoctypeSystemIdentifierState
        elif data == "\u0000":
            self.tokenQueue.append({"type": tokenTypes["ParseError"],
                                    "data": "invalid-codepoint"})
            self.currentToken["systemId"] += "\uFFFD"
        elif data == ">":
            self.tokenQueue.append({"type": tokenTypes["ParseError"], "data":
                                    "unexpected-end-of-doctype"})
            self.currentToken["correct"] = False
            self.tokenQueue.append(self.currentToken)
            self.state = self.dataState
        elif data is EOF:
            self.tokenQueue.append({"type": tokenTypes["ParseError"], "data":
                                    "eof-in-doctype"})
            self.currentToken["correct"] = False
            self.tokenQueue.append(self.currentToken)
            self.state = self.dataState
        else:
            self.currentToken["systemId"] += data
        return True

    def afterDoctypeSystemIdentifierState(self):
        data = self.stream.char()
        if data in spaceCharacters:
            pass
        elif data == ">":
            self.tokenQueue.append(self.currentToken)
            self.state = self.dataState
        elif data is EOF:
            self.tokenQueue.append({"type": tokenTypes["ParseError"], "data":
                                    "eof-in-doctype"})
            self.currentToken["correct"] = False
            self.tokenQueue.append(self.currentToken)
            self.state = self.dataState
        else:
            self.tokenQueue.append({"type": tokenTypes["ParseError"], "data":
                                    "unexpected-char-in-doctype"})
            self.state = self.bogusDoctypeState
        return True

    def bogusDoctypeState(self):
        data = self.stream.char()
        if data == ">":
            self.tokenQueue.append(self.currentToken)
            self.state = self.dataState
        elif data is EOF:
            # XXX EMIT
            self.stream.unget(data)
            self.tokenQueue.append(self.currentToken)
            self.state = self.dataState
        else:
            pass
        return True

    def cdataSectionState(self):
        data = []
        while True:
            data.append(self.stream.charsUntil("]"))
            data.append(self.stream.charsUntil(">"))
            char = self.stream.char()
            if char == EOF:
                break
            else:
                assert char == ">"
                if data[-1][-2:] == "]]":
                    data[-1] = data[-1][:-2]
                    break
                else:
                    data.append(char)

        data = "".join(data)  # pylint:disable=redefined-variable-type
        # Deal with null here rather than in the parser
        nullCount = data.count("\u0000")
        if nullCount > 0:
            for _ in range(nullCount):
                self.tokenQueue.append({"type": tokenTypes["ParseError"],
                                        "data": "invalid-codepoint"})
            data = data.replace("\u0000", "\uFFFD")
        if data:
            self.tokenQueue.append({"type": tokenTypes["Characters"],
                                    "data": data})
        self.state = self.dataState
        return True

# site-packages/pip/_vendor/html5lib/html5parser.py
from __future__ import absolute_import, division, unicode_literals
from pip._vendor.six import with_metaclass, viewkeys, PY3

import types

try:
    from collections import OrderedDict
except ImportError:
    from pip._vendor.ordereddict import OrderedDict

from . import _inputstream
from . import _tokenizer

from . import treebuilders
from .treebuilders.base import Marker

from . import _utils
from .constants import (
    spaceCharacters, asciiUpper2Lower,
    specialElements, headingElements, cdataElements, rcdataElements,
    tokenTypes, tagTokenTypes,
    namespaces,
    htmlIntegrationPointElements, mathmlTextIntegrationPointElements,
    adjustForeignAttributes as adjustForeignAttributesMap,
    adjustMathMLAttributes, adjustSVGAttributes,
    E,
    ReparseException
)


def parse(doc, treebuilder="etree", namespaceHTMLElements=True, **kwargs):
    """Parse a string or file-like object into a tree"""
    tb = treebuilders.getTreeBuilder(treebuilder)
    p = HTMLParser(tb, namespaceHTMLElements=namespaceHTMLElements)
    return p.parse(doc, **kwargs)


def parseFragment(doc, container="div", treebuilder="etree", namespaceHTMLElements=True, **kwargs):
    """Parse a string or file-like object into a tree fragment, using
    ``container`` as the name of the element the fragment is nested in"""
    tb = treebuilders.getTreeBuilder(treebuilder)
    p = HTMLParser(tb, namespaceHTMLElements=namespaceHTMLElements)
    return p.parseFragment(doc, container=container, **kwargs)
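
# Usage sketch for the two convenience functions above (assumes this vendored
# copy is importable as ``pip._vendor.html5lib``, which re-exports parse and
# parseFragment):
#
#     from pip._vendor import html5lib
#     document = html5lib.parse("<p>Hello world</p>")              # full document tree
#     fragment = html5lib.parseFragment("<b>Hi</b>", container="div")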


def method_decorator_metaclass(function):
    class Decorated(type):
        def __new__(meta, classname, bases, classDict):
            for attributeName, attribute in classDict.items():
                if isinstance(attribute, types.FunctionType):
                    attribute = function(attribute)

                classDict[attributeName] = attribute
            return type.__new__(meta, classname, bases, classDict)
    return Decorated
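
# Minimal sketch of what method_decorator_metaclass does (the ``trace``
# decorator below is hypothetical; ``with_metaclass`` is the six helper
# imported above):
#
#     def trace(fn):
#         def wrapped(self, *args, **kwargs):
#             print("calling", fn.__name__)
#             return fn(self, *args, **kwargs)
#         return wrapped
#
#     class Example(with_metaclass(method_decorator_metaclass(trace))):
#         def processStartTag(self, token):
#             return token
#
# Every plain function defined on Example is replaced by trace(<function>) at
# class-creation time; getPhases() below applies its ``log`` decorator this
# way when debug=True.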


class HTMLParser(object):
    """HTML parser. Generates a tree structure from a stream of (possibly
        malformed) HTML"""

    def __init__(self, tree=None, strict=False, namespaceHTMLElements=True, debug=False):
        """
        strict - raise an exception when a parse error is encountered

        tree - a treebuilder class controlling the type of tree that will be
        returned. Built in treebuilders can be accessed through
        html5lib.treebuilders.getTreeBuilder(treeType)
        """

        # Raise an exception on the first error encountered
        self.strict = strict

        if tree is None:
            tree = treebuilders.getTreeBuilder("etree")
        self.tree = tree(namespaceHTMLElements)
        self.errors = []

        self.phases = dict([(name, cls(self, self.tree)) for name, cls in
                            getPhases(debug).items()])
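
    # Construction sketch (not part of the original source; "dom" is one of
    # the built-in treebuilder names alongside the default "etree"):
    #     parser = HTMLParser(tree=treebuilders.getTreeBuilder("dom"), strict=True)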

    def _parse(self, stream, innerHTML=False, container="div", scripting=False, **kwargs):

        self.innerHTMLMode = innerHTML
        self.container = container
        self.scripting = scripting
        self.tokenizer = _tokenizer.HTMLTokenizer(stream, parser=self, **kwargs)
        self.reset()

        try:
            self.mainLoop()
        except ReparseException:
            self.reset()
            self.mainLoop()

    def reset(self):
        self.tree.reset()
        self.firstStartTag = False
        self.errors = []
        self.log = []  # only used with debug mode
        # "quirks" / "limited quirks" / "no quirks"
        self.compatMode = "no quirks"

        if self.innerHTMLMode:
            self.innerHTML = self.container.lower()

            if self.innerHTML in cdataElements:
                self.tokenizer.state = self.tokenizer.rcdataState
            elif self.innerHTML in rcdataElements:
                self.tokenizer.state = self.tokenizer.rawtextState
            elif self.innerHTML == 'plaintext':
                self.tokenizer.state = self.tokenizer.plaintextState
            else:
                # state already is data state
                # self.tokenizer.state = self.tokenizer.dataState
                pass
            self.phase = self.phases["beforeHtml"]
            self.phase.insertHtmlElement()
            self.resetInsertionMode()
        else:
            self.innerHTML = False  # pylint:disable=redefined-variable-type
            self.phase = self.phases["initial"]

        self.lastPhase = None

        self.beforeRCDataPhase = None

        self.framesetOK = True

    @property
    def documentEncoding(self):
        """The name of the character encoding
        that was used to decode the input stream,
        or :obj:`None` if that is not determined yet.

        """
        if not hasattr(self, 'tokenizer'):
            return None
        return self.tokenizer.stream.charEncoding[0].name

    def isHTMLIntegrationPoint(self, element):
        if (element.name == "annotation-xml" and
                element.namespace == namespaces["mathml"]):
            return ("encoding" in element.attributes and
                    element.attributes["encoding"].translate(
                        asciiUpper2Lower) in
                    ("text/html", "application/xhtml+xml"))
        else:
            return (element.namespace, element.name) in htmlIntegrationPointElements

    def isMathMLTextIntegrationPoint(self, element):
        return (element.namespace, element.name) in mathmlTextIntegrationPointElements

    def mainLoop(self):
        CharactersToken = tokenTypes["Characters"]
        SpaceCharactersToken = tokenTypes["SpaceCharacters"]
        StartTagToken = tokenTypes["StartTag"]
        EndTagToken = tokenTypes["EndTag"]
        CommentToken = tokenTypes["Comment"]
        DoctypeToken = tokenTypes["Doctype"]
        ParseErrorToken = tokenTypes["ParseError"]

        for token in self.normalizedTokens():
            prev_token = None
            new_token = token
            while new_token is not None:
                prev_token = new_token
                currentNode = self.tree.openElements[-1] if self.tree.openElements else None
                currentNodeNamespace = currentNode.namespace if currentNode else None
                currentNodeName = currentNode.name if currentNode else None

                type = new_token["type"]

                if type == ParseErrorToken:
                    self.parseError(new_token["data"], new_token.get("datavars", {}))
                    new_token = None
                else:
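                    # Tree-construction dispatcher: tokens in HTML content (or
                    # at an integration point) go to the current phase; all
                    # other tokens are handled by the "inForeignContent" phase.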
                    if (len(self.tree.openElements) == 0 or
                        currentNodeNamespace == self.tree.defaultNamespace or
                        (self.isMathMLTextIntegrationPoint(currentNode) and
                         ((type == StartTagToken and
                           token["name"] not in frozenset(["mglyph", "malignmark"])) or
                          type in (CharactersToken, SpaceCharactersToken))) or
                        (currentNodeNamespace == namespaces["mathml"] and
                         currentNodeName == "annotation-xml" and
                         type == StartTagToken and
                         token["name"] == "svg") or
                        (self.isHTMLIntegrationPoint(currentNode) and
                         type in (StartTagToken, CharactersToken, SpaceCharactersToken))):
                        phase = self.phase
                    else:
                        phase = self.phases["inForeignContent"]

                    if type == CharactersToken:
                        new_token = phase.processCharacters(new_token)
                    elif type == SpaceCharactersToken:
                        new_token = phase.processSpaceCharacters(new_token)
                    elif type == StartTagToken:
                        new_token = phase.processStartTag(new_token)
                    elif type == EndTagToken:
                        new_token = phase.processEndTag(new_token)
                    elif type == CommentToken:
                        new_token = phase.processComment(new_token)
                    elif type == DoctypeToken:
                        new_token = phase.processDoctype(new_token)

            if (type == StartTagToken and prev_token["selfClosing"] and
                    not prev_token["selfClosingAcknowledged"]):
                self.parseError("non-void-element-with-trailing-solidus",
                                {"name": prev_token["name"]})

        # When the loop finishes it's EOF
        reprocess = True
        phases = []
        while reprocess:
            phases.append(self.phase)
            reprocess = self.phase.processEOF()
            if reprocess:
                assert self.phase not in phases

    def normalizedTokens(self):
        for token in self.tokenizer:
            yield self.normalizeToken(token)

    def parse(self, stream, *args, **kwargs):
        """Parse a HTML document into a well-formed tree

        stream - a filelike object or string containing the HTML to be parsed

        The optional encoding parameter must be a string that indicates
        the encoding.  If specified, that encoding will be used,
        regardless of any BOM or later declaration (such as in a meta
        element)

        scripting - treat noscript elements as if javascript was turned on
        """
        self._parse(stream, False, None, *args, **kwargs)
        return self.tree.getDocument()

    def parseFragment(self, stream, *args, **kwargs):
        """Parse a HTML fragment into a well-formed tree fragment

        container - name of the element we're setting the innerHTML property
        if set to None, default to 'div'

        stream - a filelike object or string containing the HTML to be parsed

        The optional encoding parameter must be a string that indicates
        the encoding.  If specified, that encoding will be used,
        regardless of any BOM or later declaration (such as in a meta
        element)

        scripting - treat noscript elements as if javascript was turned on
        """
        self._parse(stream, True, *args, **kwargs)
        return self.tree.getFragment()

    def parseError(self, errorcode="XXX-undefined-error", datavars=None):
        # XXX The idea is to make errorcode mandatory.
        if datavars is None:
            datavars = {}
        self.errors.append((self.tokenizer.stream.position(), errorcode, datavars))
        if self.strict:
            raise ParseError(E[errorcode] % datavars)

    def normalizeToken(self, token):
        """ HTML5 specific normalizations to the token stream """

        if token["type"] == tokenTypes["StartTag"]:
            raw = token["data"]
            token["data"] = OrderedDict(raw)
            if len(raw) > len(token["data"]):
                # there was at least one duplicated attribute; fix so the first one wins
                token["data"].update(raw[::-1])

        return token

    def adjustMathMLAttributes(self, token):
        adjust_attributes(token, adjustMathMLAttributes)

    def adjustSVGAttributes(self, token):
        adjust_attributes(token, adjustSVGAttributes)

    def adjustForeignAttributes(self, token):
        adjust_attributes(token, adjustForeignAttributesMap)

    def reparseTokenNormal(self, token):
        # pylint:disable=unused-argument
        self.parser.phase()

    def resetInsertionMode(self):
        # The name of this method is mostly historical. (It's also used in the
        # specification.)
        last = False
        newModes = {
            "select": "inSelect",
            "td": "inCell",
            "th": "inCell",
            "tr": "inRow",
            "tbody": "inTableBody",
            "thead": "inTableBody",
            "tfoot": "inTableBody",
            "caption": "inCaption",
            "colgroup": "inColumnGroup",
            "table": "inTable",
            "head": "inBody",
            "body": "inBody",
            "frameset": "inFrameset",
            "html": "beforeHead"
        }
        for node in self.tree.openElements[::-1]:
            nodeName = node.name
            new_phase = None
            if node == self.tree.openElements[0]:
                assert self.innerHTML
                last = True
                nodeName = self.innerHTML
            # Check for conditions that should only happen in the innerHTML
            # case
            if nodeName in ("select", "colgroup", "head", "html"):
                assert self.innerHTML

            if not last and node.namespace != self.tree.defaultNamespace:
                continue

            if nodeName in newModes:
                new_phase = self.phases[newModes[nodeName]]
                break
            elif last:
                new_phase = self.phases["inBody"]
                break

        self.phase = new_phase

    def parseRCDataRawtext(self, token, contentType):
        """Generic RCDATA/RAWTEXT Parsing algorithm
        contentType - RCDATA or RAWTEXT
        """
        assert contentType in ("RAWTEXT", "RCDATA")

        self.tree.insertElement(token)

        if contentType == "RAWTEXT":
            self.tokenizer.state = self.tokenizer.rawtextState
        else:
            self.tokenizer.state = self.tokenizer.rcdataState

        self.originalPhase = self.phase

        self.phase = self.phases["text"]


@_utils.memoize
def getPhases(debug):
    def log(function):
        """Logger that records which phase processes each token"""
        type_names = dict((value, key) for key, value in
                          tokenTypes.items())

        def wrapped(self, *args, **kwargs):
            if function.__name__.startswith("process") and len(args) > 0:
                token = args[0]
                try:
                    info = {"type": type_names[token['type']]}
                except:
                    raise
                if token['type'] in tagTokenTypes:
                    info["name"] = token['name']

                self.parser.log.append((self.parser.tokenizer.state.__name__,
                                        self.parser.phase.__class__.__name__,
                                        self.__class__.__name__,
                                        function.__name__,
                                        info))
                return function(self, *args, **kwargs)
            else:
                return function(self, *args, **kwargs)
        return wrapped
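
    # Each tuple appended to parser.log above has the shape:
    #   (tokenizer state name, phase the parser is in, phase class that
    #    handled the token, handler function name, token info dict)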

    def getMetaclass(use_metaclass, metaclass_func):
        if use_metaclass:
            return method_decorator_metaclass(metaclass_func)
        else:
            return type

    # pylint:disable=unused-argument
    class Phase(with_metaclass(getMetaclass(debug, log))):
        """Base class for helper object that implements each phase of processing
        """

        def __init__(self, parser, tree):
            self.parser = parser
            self.tree = tree

        def processEOF(self):
            raise NotImplementedError

        def processComment(self, token):
            # For most phases the following is correct. Where it's not it will be
            # overridden.
            self.tree.insertComment(token, self.tree.openElements[-1])

        def processDoctype(self, token):
            self.parser.parseError("unexpected-doctype")

        def processCharacters(self, token):
            self.tree.insertText(token["data"])

        def processSpaceCharacters(self, token):
            self.tree.insertText(token["data"])

        def processStartTag(self, token):
            return self.startTagHandler[token["name"]](token)

        def startTagHtml(self, token):
            if not self.parser.firstStartTag and token["name"] == "html":
                self.parser.parseError("non-html-root")
            # XXX Need a check here to see if the first start tag token emitted is
            # this token... If it's not, invoke self.parser.parseError().
            for attr, value in token["data"].items():
                if attr not in self.tree.openElements[0].attributes:
                    self.tree.openElements[0].attributes[attr] = value
            self.parser.firstStartTag = False

        def processEndTag(self, token):
            return self.endTagHandler[token["name"]](token)

    class InitialPhase(Phase):
        def processSpaceCharacters(self, token):
            pass

        def processComment(self, token):
            self.tree.insertComment(token, self.tree.document)

        def processDoctype(self, token):
            name = token["name"]
            publicId = token["publicId"]
            systemId = token["systemId"]
            correct = token["correct"]

            if (name != "html" or publicId is not None or
                    systemId is not None and systemId != "about:legacy-compat"):
                self.parser.parseError("unknown-doctype")

            if publicId is None:
                publicId = ""

            self.tree.insertDoctype(token)

            if publicId != "":
                publicId = publicId.translate(asciiUpper2Lower)
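            # The long lists of legacy public/system identifiers below select
            # "quirks" or "limited quirks" compatibility mode, per the
            # doctype-sniffing rules in the HTML specification.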

            if (not correct or token["name"] != "html" or
                    publicId.startswith(
                        ("+//silmaril//dtd html pro v0r11 19970101//",
                         "-//advasoft ltd//dtd html 3.0 aswedit + extensions//",
                         "-//as//dtd html 3.0 aswedit + extensions//",
                         "-//ietf//dtd html 2.0 level 1//",
                         "-//ietf//dtd html 2.0 level 2//",
                         "-//ietf//dtd html 2.0 strict level 1//",
                         "-//ietf//dtd html 2.0 strict level 2//",
                         "-//ietf//dtd html 2.0 strict//",
                         "-//ietf//dtd html 2.0//",
                         "-//ietf//dtd html 2.1e//",
                         "-//ietf//dtd html 3.0//",
                         "-//ietf//dtd html 3.2 final//",
                         "-//ietf//dtd html 3.2//",
                         "-//ietf//dtd html 3//",
                         "-//ietf//dtd html level 0//",
                         "-//ietf//dtd html level 1//",
                         "-//ietf//dtd html level 2//",
                         "-//ietf//dtd html level 3//",
                         "-//ietf//dtd html strict level 0//",
                         "-//ietf//dtd html strict level 1//",
                         "-//ietf//dtd html strict level 2//",
                         "-//ietf//dtd html strict level 3//",
                         "-//ietf//dtd html strict//",
                         "-//ietf//dtd html//",
                         "-//metrius//dtd metrius presentational//",
                         "-//microsoft//dtd internet explorer 2.0 html strict//",
                         "-//microsoft//dtd internet explorer 2.0 html//",
                         "-//microsoft//dtd internet explorer 2.0 tables//",
                         "-//microsoft//dtd internet explorer 3.0 html strict//",
                         "-//microsoft//dtd internet explorer 3.0 html//",
                         "-//microsoft//dtd internet explorer 3.0 tables//",
                         "-//netscape comm. corp.//dtd html//",
                         "-//netscape comm. corp.//dtd strict html//",
                         "-//o'reilly and associates//dtd html 2.0//",
                         "-//o'reilly and associates//dtd html extended 1.0//",
                         "-//o'reilly and associates//dtd html extended relaxed 1.0//",
                         "-//softquad software//dtd hotmetal pro 6.0::19990601::extensions to html 4.0//",
                         "-//softquad//dtd hotmetal pro 4.0::19971010::extensions to html 4.0//",
                         "-//spyglass//dtd html 2.0 extended//",
                         "-//sq//dtd html 2.0 hotmetal + extensions//",
                         "-//sun microsystems corp.//dtd hotjava html//",
                         "-//sun microsystems corp.//dtd hotjava strict html//",
                         "-//w3c//dtd html 3 1995-03-24//",
                         "-//w3c//dtd html 3.2 draft//",
                         "-//w3c//dtd html 3.2 final//",
                         "-//w3c//dtd html 3.2//",
                         "-//w3c//dtd html 3.2s draft//",
                         "-//w3c//dtd html 4.0 frameset//",
                         "-//w3c//dtd html 4.0 transitional//",
                         "-//w3c//dtd html experimental 19960712//",
                         "-//w3c//dtd html experimental 970421//",
                         "-//w3c//dtd w3 html//",
                         "-//w3o//dtd w3 html 3.0//",
                         "-//webtechs//dtd mozilla html 2.0//",
                         "-//webtechs//dtd mozilla html//")) or
                    publicId in ("-//w3o//dtd w3 html strict 3.0//en//",
                                 "-/w3c/dtd html 4.0 transitional/en",
                                 "html") or
                    publicId.startswith(
                        ("-//w3c//dtd html 4.01 frameset//",
                         "-//w3c//dtd html 4.01 transitional//")) and
                    systemId is None or
                    systemId and systemId.lower() == "http://www.ibm.com/data/dtd/v11/ibmxhtml1-transitional.dtd"):
                self.parser.compatMode = "quirks"
            elif (publicId.startswith(
                    ("-//w3c//dtd xhtml 1.0 frameset//",
                     "-//w3c//dtd xhtml 1.0 transitional//")) or
                  publicId.startswith(
                      ("-//w3c//dtd html 4.01 frameset//",
                       "-//w3c//dtd html 4.01 transitional//")) and
                  systemId is not None):
                self.parser.compatMode = "limited quirks"

            self.parser.phase = self.parser.phases["beforeHtml"]

        def anythingElse(self):
            self.parser.compatMode = "quirks"
            self.parser.phase = self.parser.phases["beforeHtml"]

        def processCharacters(self, token):
            self.parser.parseError("expected-doctype-but-got-chars")
            self.anythingElse()
            return token

        def processStartTag(self, token):
            self.parser.parseError("expected-doctype-but-got-start-tag",
                                   {"name": token["name"]})
            self.anythingElse()
            return token

        def processEndTag(self, token):
            self.parser.parseError("expected-doctype-but-got-end-tag",
                                   {"name": token["name"]})
            self.anythingElse()
            return token

        def processEOF(self):
            self.parser.parseError("expected-doctype-but-got-eof")
            self.anythingElse()
            return True

    class BeforeHtmlPhase(Phase):
        # helper methods
        def insertHtmlElement(self):
            self.tree.insertRoot(impliedTagToken("html", "StartTag"))
            self.parser.phase = self.parser.phases["beforeHead"]

        # other
        def processEOF(self):
            self.insertHtmlElement()
            return True

        def processComment(self, token):
            self.tree.insertComment(token, self.tree.document)

        def processSpaceCharacters(self, token):
            pass

        def processCharacters(self, token):
            self.insertHtmlElement()
            return token

        def processStartTag(self, token):
            if token["name"] == "html":
                self.parser.firstStartTag = True
            self.insertHtmlElement()
            return token

        def processEndTag(self, token):
            if token["name"] not in ("head", "body", "html", "br"):
                self.parser.parseError("unexpected-end-tag-before-html",
                                       {"name": token["name"]})
            else:
                self.insertHtmlElement()
                return token

    class BeforeHeadPhase(Phase):
        def __init__(self, parser, tree):
            Phase.__init__(self, parser, tree)

            self.startTagHandler = _utils.MethodDispatcher([
                ("html", self.startTagHtml),
                ("head", self.startTagHead)
            ])
            self.startTagHandler.default = self.startTagOther

            self.endTagHandler = _utils.MethodDispatcher([
                (("head", "body", "html", "br"), self.endTagImplyHead)
            ])
            self.endTagHandler.default = self.endTagOther

        def processEOF(self):
            self.startTagHead(impliedTagToken("head", "StartTag"))
            return True

        def processSpaceCharacters(self, token):
            pass

        def processCharacters(self, token):
            self.startTagHead(impliedTagToken("head", "StartTag"))
            return token

        def startTagHtml(self, token):
            return self.parser.phases["inBody"].processStartTag(token)

        def startTagHead(self, token):
            self.tree.insertElement(token)
            self.tree.headPointer = self.tree.openElements[-1]
            self.parser.phase = self.parser.phases["inHead"]

        def startTagOther(self, token):
            self.startTagHead(impliedTagToken("head", "StartTag"))
            return token

        def endTagImplyHead(self, token):
            self.startTagHead(impliedTagToken("head", "StartTag"))
            return token

        def endTagOther(self, token):
            self.parser.parseError("end-tag-after-implied-root",
                                   {"name": token["name"]})

    class InHeadPhase(Phase):
        def __init__(self, parser, tree):
            Phase.__init__(self, parser, tree)

            self.startTagHandler = _utils.MethodDispatcher([
                ("html", self.startTagHtml),
                ("title", self.startTagTitle),
                (("noframes", "style"), self.startTagNoFramesStyle),
                ("noscript", self.startTagNoscript),
                ("script", self.startTagScript),
                (("base", "basefont", "bgsound", "command", "link"),
                 self.startTagBaseLinkCommand),
                ("meta", self.startTagMeta),
                ("head", self.startTagHead)
            ])
            self.startTagHandler.default = self.startTagOther

            self.endTagHandler = _utils.MethodDispatcher([
                ("head", self.endTagHead),
                (("br", "html", "body"), self.endTagHtmlBodyBr)
            ])
            self.endTagHandler.default = self.endTagOther

        # the real thing
        def processEOF(self):
            self.anythingElse()
            return True

        def processCharacters(self, token):
            self.anythingElse()
            return token

        def startTagHtml(self, token):
            return self.parser.phases["inBody"].processStartTag(token)

        def startTagHead(self, token):
            self.parser.parseError("two-heads-are-not-better-than-one")

        def startTagBaseLinkCommand(self, token):
            self.tree.insertElement(token)
            self.tree.openElements.pop()
            token["selfClosingAcknowledged"] = True

        def startTagMeta(self, token):
            self.tree.insertElement(token)
            self.tree.openElements.pop()
            token["selfClosingAcknowledged"] = True

            attributes = token["data"]
            if self.parser.tokenizer.stream.charEncoding[1] == "tentative":
                if "charset" in attributes:
                    self.parser.tokenizer.stream.changeEncoding(attributes["charset"])
                elif ("content" in attributes and
                      "http-equiv" in attributes and
                      attributes["http-equiv"].lower() == "content-type"):
                    # Encoding it as UTF-8 here is a hack, as really we should pass
                    # the abstract Unicode string, and just use the
                    # ContentAttrParser on that, but using UTF-8 allows all chars
                    # to be encoded and, as an ASCII superset, works.
                    data = _inputstream.EncodingBytes(attributes["content"].encode("utf-8"))
                    parser = _inputstream.ContentAttrParser(data)
                    codec = parser.parse()
                    self.parser.tokenizer.stream.changeEncoding(codec)
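            # e.g. both <meta charset="utf-8"> and
            # <meta http-equiv="content-type" content="text/html; charset=utf-8">
            # can trigger a changeEncoding() call while the encoding is still
            # tentative.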

        def startTagTitle(self, token):
            self.parser.parseRCDataRawtext(token, "RCDATA")

        def startTagNoFramesStyle(self, token):
            # Need to decide whether to implement the scripting-disabled case
            self.parser.parseRCDataRawtext(token, "RAWTEXT")

        def startTagNoscript(self, token):
            if self.parser.scripting:
                self.parser.parseRCDataRawtext(token, "RAWTEXT")
            else:
                self.tree.insertElement(token)
                self.parser.phase = self.parser.phases["inHeadNoscript"]

        def startTagScript(self, token):
            self.tree.insertElement(token)
            self.parser.tokenizer.state = self.parser.tokenizer.scriptDataState
            self.parser.originalPhase = self.parser.phase
            self.parser.phase = self.parser.phases["text"]

        def startTagOther(self, token):
            self.anythingElse()
            return token

        def endTagHead(self, token):
            node = self.parser.tree.openElements.pop()
            assert node.name == "head", "Expected head got %s" % node.name
            self.parser.phase = self.parser.phases["afterHead"]

        def endTagHtmlBodyBr(self, token):
            self.anythingElse()
            return token

        def endTagOther(self, token):
            self.parser.parseError("unexpected-end-tag", {"name": token["name"]})

        def anythingElse(self):
            self.endTagHead(impliedTagToken("head"))

    class InHeadNoscriptPhase(Phase):
        def __init__(self, parser, tree):
            Phase.__init__(self, parser, tree)

            self.startTagHandler = _utils.MethodDispatcher([
                ("html", self.startTagHtml),
                (("basefont", "bgsound", "link", "meta", "noframes", "style"), self.startTagBaseLinkCommand),
                (("head", "noscript"), self.startTagHeadNoscript),
            ])
            self.startTagHandler.default = self.startTagOther

            self.endTagHandler = _utils.MethodDispatcher([
                ("noscript", self.endTagNoscript),
                ("br", self.endTagBr),
            ])
            self.endTagHandler.default = self.endTagOther

        def processEOF(self):
            self.parser.parseError("eof-in-head-noscript")
            self.anythingElse()
            return True

        def processComment(self, token):
            return self.parser.phases["inHead"].processComment(token)

        def processCharacters(self, token):
            self.parser.parseError("char-in-head-noscript")
            self.anythingElse()
            return token

        def processSpaceCharacters(self, token):
            return self.parser.phases["inHead"].processSpaceCharacters(token)

        def startTagHtml(self, token):
            return self.parser.phases["inBody"].processStartTag(token)

        def startTagBaseLinkCommand(self, token):
            return self.parser.phases["inHead"].processStartTag(token)

        def startTagHeadNoscript(self, token):
            self.parser.parseError("unexpected-start-tag", {"name": token["name"]})

        def startTagOther(self, token):
            self.parser.parseError("unexpected-inhead-noscript-tag", {"name": token["name"]})
            self.anythingElse()
            return token

        def endTagNoscript(self, token):
            node = self.parser.tree.openElements.pop()
            assert node.name == "noscript", "Expected noscript got %s" % node.name
            self.parser.phase = self.parser.phases["inHead"]

        def endTagBr(self, token):
            self.parser.parseError("unexpected-inhead-noscript-tag", {"name": token["name"]})
            self.anythingElse()
            return token

        def endTagOther(self, token):
            self.parser.parseError("unexpected-end-tag", {"name": token["name"]})

        def anythingElse(self):
            # Caller must raise parse error first!
            self.endTagNoscript(impliedTagToken("noscript"))

    class AfterHeadPhase(Phase):
        def __init__(self, parser, tree):
            Phase.__init__(self, parser, tree)

            self.startTagHandler = _utils.MethodDispatcher([
                ("html", self.startTagHtml),
                ("body", self.startTagBody),
                ("frameset", self.startTagFrameset),
                (("base", "basefont", "bgsound", "link", "meta", "noframes", "script",
                  "style", "title"),
                 self.startTagFromHead),
                ("head", self.startTagHead)
            ])
            self.startTagHandler.default = self.startTagOther
            self.endTagHandler = _utils.MethodDispatcher([(("body", "html", "br"),
                                                           self.endTagHtmlBodyBr)])
            self.endTagHandler.default = self.endTagOther

        def processEOF(self):
            self.anythingElse()
            return True

        def processCharacters(self, token):
            self.anythingElse()
            return token

        def startTagHtml(self, token):
            return self.parser.phases["inBody"].processStartTag(token)

        def startTagBody(self, token):
            self.parser.framesetOK = False
            self.tree.insertElement(token)
            self.parser.phase = self.parser.phases["inBody"]

        def startTagFrameset(self, token):
            self.tree.insertElement(token)
            self.parser.phase = self.parser.phases["inFrameset"]

        def startTagFromHead(self, token):
            self.parser.parseError("unexpected-start-tag-out-of-my-head",
                                   {"name": token["name"]})
            self.tree.openElements.append(self.tree.headPointer)
            self.parser.phases["inHead"].processStartTag(token)
            for node in self.tree.openElements[::-1]:
                if node.name == "head":
                    self.tree.openElements.remove(node)
                    break

        def startTagHead(self, token):
            self.parser.parseError("unexpected-start-tag", {"name": token["name"]})

        def startTagOther(self, token):
            self.anythingElse()
            return token

        def endTagHtmlBodyBr(self, token):
            self.anythingElse()
            return token

        def endTagOther(self, token):
            self.parser.parseError("unexpected-end-tag", {"name": token["name"]})

        def anythingElse(self):
            self.tree.insertElement(impliedTagToken("body", "StartTag"))
            self.parser.phase = self.parser.phases["inBody"]
            self.parser.framesetOK = True

    class InBodyPhase(Phase):
        # http://www.whatwg.org/specs/web-apps/current-work/#parsing-main-inbody
        # the really-really-really-very crazy mode
        def __init__(self, parser, tree):
            Phase.__init__(self, parser, tree)

            # Set this to the default handler
            self.processSpaceCharacters = self.processSpaceCharactersNonPre

            self.startTagHandler = _utils.MethodDispatcher([
                ("html", self.startTagHtml),
                (("base", "basefont", "bgsound", "command", "link", "meta",
                  "script", "style", "title"),
                 self.startTagProcessInHead),
                ("body", self.startTagBody),
                ("frameset", self.startTagFrameset),
                (("address", "article", "aside", "blockquote", "center", "details",
                  "dir", "div", "dl", "fieldset", "figcaption", "figure",
                  "footer", "header", "hgroup", "main", "menu", "nav", "ol", "p",
                  "section", "summary", "ul"),
                 self.startTagCloseP),
                (headingElements, self.startTagHeading),
                (("pre", "listing"), self.startTagPreListing),
                ("form", self.startTagForm),
                (("li", "dd", "dt"), self.startTagListItem),
                ("plaintext", self.startTagPlaintext),
                ("a", self.startTagA),
                (("b", "big", "code", "em", "font", "i", "s", "small", "strike",
                  "strong", "tt", "u"), self.startTagFormatting),
                ("nobr", self.startTagNobr),
                ("button", self.startTagButton),
                (("applet", "marquee", "object"), self.startTagAppletMarqueeObject),
                ("xmp", self.startTagXmp),
                ("table", self.startTagTable),
                (("area", "br", "embed", "img", "keygen", "wbr"),
                 self.startTagVoidFormatting),
                (("param", "source", "track"), self.startTagParamSource),
                ("input", self.startTagInput),
                ("hr", self.startTagHr),
                ("image", self.startTagImage),
                ("isindex", self.startTagIsIndex),
                ("textarea", self.startTagTextarea),
                ("iframe", self.startTagIFrame),
                ("noscript", self.startTagNoscript),
                (("noembed", "noframes"), self.startTagRawtext),
                ("select", self.startTagSelect),
                (("rp", "rt"), self.startTagRpRt),
                (("option", "optgroup"), self.startTagOpt),
                (("math"), self.startTagMath),
                (("svg"), self.startTagSvg),
                (("caption", "col", "colgroup", "frame", "head",
                  "tbody", "td", "tfoot", "th", "thead",
                  "tr"), self.startTagMisplaced)
            ])
            self.startTagHandler.default = self.startTagOther

            self.endTagHandler = _utils.MethodDispatcher([
                ("body", self.endTagBody),
                ("html", self.endTagHtml),
                (("address", "article", "aside", "blockquote", "button", "center",
                  "details", "dialog", "dir", "div", "dl", "fieldset", "figcaption", "figure",
                  "footer", "header", "hgroup", "listing", "main", "menu", "nav", "ol", "pre",
                  "section", "summary", "ul"), self.endTagBlock),
                ("form", self.endTagForm),
                ("p", self.endTagP),
                (("dd", "dt", "li"), self.endTagListItem),
                (headingElements, self.endTagHeading),
                (("a", "b", "big", "code", "em", "font", "i", "nobr", "s", "small",
                  "strike", "strong", "tt", "u"), self.endTagFormatting),
                (("applet", "marquee", "object"), self.endTagAppletMarqueeObject),
                ("br", self.endTagBr),
            ])
            self.endTagHandler.default = self.endTagOther

        def isMatchingFormattingElement(self, node1, node2):
            return (node1.name == node2.name and
                    node1.namespace == node2.namespace and
                    node1.attributes == node2.attributes)

        # helper
        def addFormattingElement(self, token):
            self.tree.insertElement(token)
            element = self.tree.openElements[-1]

            matchingElements = []
            for node in self.tree.activeFormattingElements[::-1]:
                if node is Marker:
                    break
                elif self.isMatchingFormattingElement(node, element):
                    matchingElements.append(node)

            assert len(matchingElements) <= 3
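            # "Noah's Ark" clause: keep at most three identical entries on the
            # list of active formatting elements.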
            if len(matchingElements) == 3:
                self.tree.activeFormattingElements.remove(matchingElements[-1])
            self.tree.activeFormattingElements.append(element)

        # the real deal
        def processEOF(self):
            allowed_elements = frozenset(("dd", "dt", "li", "p", "tbody", "td",
                                          "tfoot", "th", "thead", "tr", "body",
                                          "html"))
            for node in self.tree.openElements[::-1]:
                if node.name not in allowed_elements:
                    self.parser.parseError("expected-closing-tag-but-got-eof")
                    break
            # Stop parsing

        def processSpaceCharactersDropNewline(self, token):
            # Sometimes (start of <pre>, <listing>, and <textarea> blocks) we
            # want to drop leading newlines
            data = token["data"]
            self.processSpaceCharacters = self.processSpaceCharactersNonPre
            if (data.startswith("\n") and
                self.tree.openElements[-1].name in ("pre", "listing", "textarea") and
                    not self.tree.openElements[-1].hasContent()):
                data = data[1:]
            if data:
                self.tree.reconstructActiveFormattingElements()
                self.tree.insertText(data)
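            # e.g. for "<pre>\nfoo" the leading newline token is dropped, so
            # the <pre> element's text content starts at "foo".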

        def processCharacters(self, token):
            if token["data"] == "\u0000":
                # The tokenizer should always emit null on its own
                return
            self.tree.reconstructActiveFormattingElements()
            self.tree.insertText(token["data"])
            # This must be bad for performance
            if (self.parser.framesetOK and
                any([char not in spaceCharacters
                     for char in token["data"]])):
                self.parser.framesetOK = False

        def processSpaceCharactersNonPre(self, token):
            self.tree.reconstructActiveFormattingElements()
            self.tree.insertText(token["data"])

        def startTagProcessInHead(self, token):
            return self.parser.phases["inHead"].processStartTag(token)

        def startTagBody(self, token):
            self.parser.parseError("unexpected-start-tag", {"name": "body"})
            if (len(self.tree.openElements) == 1 or
                    self.tree.openElements[1].name != "body"):
                assert self.parser.innerHTML
            else:
                self.parser.framesetOK = False
                for attr, value in token["data"].items():
                    if attr not in self.tree.openElements[1].attributes:
                        self.tree.openElements[1].attributes[attr] = value

        def startTagFrameset(self, token):
            self.parser.parseError("unexpected-start-tag", {"name": "frameset"})
            if (len(self.tree.openElements) == 1 or self.tree.openElements[1].name != "body"):
                assert self.parser.innerHTML
            elif not self.parser.framesetOK:
                pass
            else:
                if self.tree.openElements[1].parent:
                    self.tree.openElements[1].parent.removeChild(self.tree.openElements[1])
                while self.tree.openElements[-1].name != "html":
                    self.tree.openElements.pop()
                self.tree.insertElement(token)
                self.parser.phase = self.parser.phases["inFrameset"]

        def startTagCloseP(self, token):
            if self.tree.elementInScope("p", variant="button"):
                self.endTagP(impliedTagToken("p"))
            self.tree.insertElement(token)

        def startTagPreListing(self, token):
            if self.tree.elementInScope("p", variant="button"):
                self.endTagP(impliedTagToken("p"))
            self.tree.insertElement(token)
            self.parser.framesetOK = False
            self.processSpaceCharacters = self.processSpaceCharactersDropNewline

        def startTagForm(self, token):
            if self.tree.formPointer:
                self.parser.parseError("unexpected-start-tag", {"name": "form"})
            else:
                if self.tree.elementInScope("p", variant="button"):
                    self.endTagP(impliedTagToken("p"))
                self.tree.insertElement(token)
                self.tree.formPointer = self.tree.openElements[-1]

        def startTagListItem(self, token):
            self.parser.framesetOK = False

            stopNamesMap = {"li": ["li"],
                            "dt": ["dt", "dd"],
                            "dd": ["dt", "dd"]}
            stopNames = stopNamesMap[token["name"]]
            for node in reversed(self.tree.openElements):
                if node.name in stopNames:
                    self.parser.phase.processEndTag(
                        impliedTagToken(node.name, "EndTag"))
                    break
                if (node.nameTuple in specialElements and
                        node.name not in ("address", "div", "p")):
                    break

            if self.tree.elementInScope("p", variant="button"):
                self.parser.phase.processEndTag(
                    impliedTagToken("p", "EndTag"))

            self.tree.insertElement(token)
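            # e.g. "<li>a<li>b": the second <li> start tag implicitly closes
            # the first one via the implied end tag emitted in the loop above.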

        def startTagPlaintext(self, token):
            if self.tree.elementInScope("p", variant="button"):
                self.endTagP(impliedTagToken("p"))
            self.tree.insertElement(token)
            self.parser.tokenizer.state = self.parser.tokenizer.plaintextState

        def startTagHeading(self, token):
            if self.tree.elementInScope("p", variant="button"):
                self.endTagP(impliedTagToken("p"))
            if self.tree.openElements[-1].name in headingElements:
                self.parser.parseError("unexpected-start-tag", {"name": token["name"]})
                self.tree.openElements.pop()
            self.tree.insertElement(token)

        def startTagA(self, token):
            afeAElement = self.tree.elementInActiveFormattingElements("a")
            if afeAElement:
                self.parser.parseError("unexpected-start-tag-implies-end-tag",
                                       {"startName": "a", "endName": "a"})
                self.endTagFormatting(impliedTagToken("a"))
                if afeAElement in self.tree.openElements:
                    self.tree.openElements.remove(afeAElement)
                if afeAElement in self.tree.activeFormattingElements:
                    self.tree.activeFormattingElements.remove(afeAElement)
            self.tree.reconstructActiveFormattingElements()
            self.addFormattingElement(token)

        def startTagFormatting(self, token):
            self.tree.reconstructActiveFormattingElements()
            self.addFormattingElement(token)

        def startTagNobr(self, token):
            self.tree.reconstructActiveFormattingElements()
            if self.tree.elementInScope("nobr"):
                self.parser.parseError("unexpected-start-tag-implies-end-tag",
                                       {"startName": "nobr", "endName": "nobr"})
                self.processEndTag(impliedTagToken("nobr"))
                # XXX Need tests that trigger the following
                self.tree.reconstructActiveFormattingElements()
            self.addFormattingElement(token)

        def startTagButton(self, token):
            if self.tree.elementInScope("button"):
                self.parser.parseError("unexpected-start-tag-implies-end-tag",
                                       {"startName": "button", "endName": "button"})
                self.processEndTag(impliedTagToken("button"))
                return token
            else:
                self.tree.reconstructActiveFormattingElements()
                self.tree.insertElement(token)
                self.parser.framesetOK = False

        def startTagAppletMarqueeObject(self, token):
            self.tree.reconstructActiveFormattingElements()
            self.tree.insertElement(token)
            self.tree.activeFormattingElements.append(Marker)
            self.parser.framesetOK = False

        def startTagXmp(self, token):
            if self.tree.elementInScope("p", variant="button"):
                self.endTagP(impliedTagToken("p"))
            self.tree.reconstructActiveFormattingElements()
            self.parser.framesetOK = False
            self.parser.parseRCDataRawtext(token, "RAWTEXT")

        def startTagTable(self, token):
            if self.parser.compatMode != "quirks":
                if self.tree.elementInScope("p", variant="button"):
                    self.processEndTag(impliedTagToken("p"))
            self.tree.insertElement(token)
            self.parser.framesetOK = False
            self.parser.phase = self.parser.phases["inTable"]

        def startTagVoidFormatting(self, token):
            self.tree.reconstructActiveFormattingElements()
            self.tree.insertElement(token)
            self.tree.openElements.pop()
            token["selfClosingAcknowledged"] = True
            self.parser.framesetOK = False

        def startTagInput(self, token):
            framesetOK = self.parser.framesetOK
            self.startTagVoidFormatting(token)
            if ("type" in token["data"] and
                    token["data"]["type"].translate(asciiUpper2Lower) == "hidden"):
                # input type=hidden doesn't change framesetOK
                self.parser.framesetOK = framesetOK

        def startTagParamSource(self, token):
            self.tree.insertElement(token)
            self.tree.openElements.pop()
            token["selfClosingAcknowledged"] = True

        def startTagHr(self, token):
            if self.tree.elementInScope("p", variant="button"):
                self.endTagP(impliedTagToken("p"))
            self.tree.insertElement(token)
            self.tree.openElements.pop()
            token["selfClosingAcknowledged"] = True
            self.parser.framesetOK = False

        def startTagImage(self, token):
            # The spec really does require treating <image> as <img>
            self.parser.parseError("unexpected-start-tag-treated-as",
                                   {"originalName": "image", "newName": "img"})
            self.processStartTag(impliedTagToken("img", "StartTag",
                                                 attributes=token["data"],
                                                 selfClosing=token["selfClosing"]))

        def startTagIsIndex(self, token):
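            # Legacy <isindex> handling: the tag is expanded into an
            # equivalent form, roughly
            #   <form action=...><hr><label>prompt
            #     <input name="isindex" ...></label><hr></form>
            # built from the synthesised tokens below.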
            self.parser.parseError("deprecated-tag", {"name": "isindex"})
            if self.tree.formPointer:
                return
            form_attrs = {}
            if "action" in token["data"]:
                form_attrs["action"] = token["data"]["action"]
            self.processStartTag(impliedTagToken("form", "StartTag",
                                                 attributes=form_attrs))
            self.processStartTag(impliedTagToken("hr", "StartTag"))
            self.processStartTag(impliedTagToken("label", "StartTag"))
            # XXX Localization ...
            if "prompt" in token["data"]:
                prompt = token["data"]["prompt"]
            else:
                prompt = "This is a searchable index. Enter search keywords: "
            self.processCharacters(
                {"type": tokenTypes["Characters"], "data": prompt})
            attributes = token["data"].copy()
            if "action" in attributes:
                del attributes["action"]
            if "prompt" in attributes:
                del attributes["prompt"]
            attributes["name"] = "isindex"
            self.processStartTag(impliedTagToken("input", "StartTag",
                                                 attributes=attributes,
                                                 selfClosing=token["selfClosing"]))
            self.processEndTag(impliedTagToken("label"))
            self.processStartTag(impliedTagToken("hr", "StartTag"))
            self.processEndTag(impliedTagToken("form"))

        def startTagTextarea(self, token):
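            # Switch the tokenizer to RCDATA; a newline immediately
            # following the start tag is dropped by
            # processSpaceCharactersDropNewline.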
            self.tree.insertElement(token)
            self.parser.tokenizer.state = self.parser.tokenizer.rcdataState
            self.processSpaceCharacters = self.processSpaceCharactersDropNewline
            self.parser.framesetOK = False

        def startTagIFrame(self, token):
            self.parser.framesetOK = False
            self.startTagRawtext(token)

        def startTagNoscript(self, token):
            if self.parser.scripting:
                self.startTagRawtext(token)
            else:
                self.startTagOther(token)

        def startTagRawtext(self, token):
            """iframe, noembed noframes, noscript(if scripting enabled)"""
            self.parser.parseRCDataRawtext(token, "RAWTEXT")

        def startTagOpt(self, token):
            if self.tree.openElements[-1].name == "option":
                self.parser.phase.processEndTag(impliedTagToken("option"))
            self.tree.reconstructActiveFormattingElements()
            self.parser.tree.insertElement(token)

        def startTagSelect(self, token):
            self.tree.reconstructActiveFormattingElements()
            self.tree.insertElement(token)
            self.parser.framesetOK = False
            if self.parser.phase in (self.parser.phases["inTable"],
                                     self.parser.phases["inCaption"],
                                     self.parser.phases["inColumnGroup"],
                                     self.parser.phases["inTableBody"],
                                     self.parser.phases["inRow"],
                                     self.parser.phases["inCell"]):
                self.parser.phase = self.parser.phases["inSelectInTable"]
            else:
                self.parser.phase = self.parser.phases["inSelect"]

        def startTagRpRt(self, token):
            if self.tree.elementInScope("ruby"):
                self.tree.generateImpliedEndTags()
                if self.tree.openElements[-1].name != "ruby":
                    self.parser.parseError()
            self.tree.insertElement(token)

        def startTagMath(self, token):
            self.tree.reconstructActiveFormattingElements()
            self.parser.adjustMathMLAttributes(token)
            self.parser.adjustForeignAttributes(token)
            token["namespace"] = namespaces["mathml"]
            self.tree.insertElement(token)
            # Need to get the parse error right for the case where the token
            # has a namespace not equal to the xmlns attribute
            if token["selfClosing"]:
                self.tree.openElements.pop()
                token["selfClosingAcknowledged"] = True

        def startTagSvg(self, token):
            self.tree.reconstructActiveFormattingElements()
            self.parser.adjustSVGAttributes(token)
            self.parser.adjustForeignAttributes(token)
            token["namespace"] = namespaces["svg"]
            self.tree.insertElement(token)
            # Need to get the parse error right for the case where the token
            # has a namespace not equal to the xmlns attribute
            if token["selfClosing"]:
                self.tree.openElements.pop()
                token["selfClosingAcknowledged"] = True

        def startTagMisplaced(self, token):
            """ Elements that should be children of other elements that have a
            different insertion mode; here they are ignored
            "caption", "col", "colgroup", "frame", "frameset", "head",
            "option", "optgroup", "tbody", "td", "tfoot", "th", "thead",
            "tr", "noscript"
            """
            self.parser.parseError("unexpected-start-tag-ignored", {"name": token["name"]})

        def startTagOther(self, token):
            self.tree.reconstructActiveFormattingElements()
            self.tree.insertElement(token)

        def endTagP(self, token):
            if not self.tree.elementInScope("p", variant="button"):
                self.startTagCloseP(impliedTagToken("p", "StartTag"))
                self.parser.parseError("unexpected-end-tag", {"name": "p"})
                self.endTagP(impliedTagToken("p", "EndTag"))
            else:
                self.tree.generateImpliedEndTags("p")
                if self.tree.openElements[-1].name != "p":
                    self.parser.parseError("unexpected-end-tag", {"name": "p"})
                node = self.tree.openElements.pop()
                while node.name != "p":
                    node = self.tree.openElements.pop()

        def endTagBody(self, token):
            if not self.tree.elementInScope("body"):
                self.parser.parseError()
                return
            elif self.tree.openElements[-1].name != "body":
                for node in self.tree.openElements[2:]:
                    if node.name not in frozenset(("dd", "dt", "li", "optgroup",
                                                   "option", "p", "rp", "rt",
                                                   "tbody", "td", "tfoot",
                                                   "th", "thead", "tr", "body",
                                                   "html")):
                        # Not sure this is the correct name for the parse error
                        self.parser.parseError(
                            "expected-one-end-tag-but-got-another",
                            {"gotName": "body", "expectedName": node.name})
                        break
            self.parser.phase = self.parser.phases["afterBody"]

        def endTagHtml(self, token):
            # We repeat the test for the body end tag token being ignored here
            if self.tree.elementInScope("body"):
                self.endTagBody(impliedTagToken("body"))
                return token

        def endTagBlock(self, token):
            # Put us back in the right whitespace handling mode
            if token["name"] == "pre":
                self.processSpaceCharacters = self.processSpaceCharactersNonPre
            inScope = self.tree.elementInScope(token["name"])
            if inScope:
                self.tree.generateImpliedEndTags()
            if self.tree.openElements[-1].name != token["name"]:
                self.parser.parseError("end-tag-too-early", {"name": token["name"]})
            if inScope:
                node = self.tree.openElements.pop()
                while node.name != token["name"]:
                    node = self.tree.openElements.pop()

        def endTagForm(self, token):
            node = self.tree.formPointer
            self.tree.formPointer = None
            if node is None or not self.tree.elementInScope(node):
                self.parser.parseError("unexpected-end-tag",
                                       {"name": "form"})
            else:
                self.tree.generateImpliedEndTags()
                if self.tree.openElements[-1] != node:
                    self.parser.parseError("end-tag-too-early-ignored",
                                           {"name": "form"})
                self.tree.openElements.remove(node)

        def endTagListItem(self, token):
            if token["name"] == "li":
                variant = "list"
            else:
                variant = None
            if not self.tree.elementInScope(token["name"], variant=variant):
                self.parser.parseError("unexpected-end-tag", {"name": token["name"]})
            else:
                self.tree.generateImpliedEndTags(exclude=token["name"])
                if self.tree.openElements[-1].name != token["name"]:
                    self.parser.parseError(
                        "end-tag-too-early",
                        {"name": token["name"]})
                node = self.tree.openElements.pop()
                while node.name != token["name"]:
                    node = self.tree.openElements.pop()

        def endTagHeading(self, token):
            for item in headingElements:
                if self.tree.elementInScope(item):
                    self.tree.generateImpliedEndTags()
                    break
            if self.tree.openElements[-1].name != token["name"]:
                self.parser.parseError("end-tag-too-early", {"name": token["name"]})

            for item in headingElements:
                if self.tree.elementInScope(item):
                    item = self.tree.openElements.pop()
                    while item.name not in headingElements:
                        item = self.tree.openElements.pop()
                    break

        def endTagFormatting(self, token):
            """The much-feared adoption agency algorithm"""
            # http://svn.whatwg.org/webapps/complete.html#adoptionAgency revision 7867
            # XXX Better parseError messages appreciated.
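            #
            # Rough sketch of the effect: mis-nested markup such as
            #   <b>1<p>2</b>3</p>
            # is repaired to (approximately)
            #   <b>1</b><p><b>2</b>3</p>
            # by cloning the formatting element into the block that
            # holds its stray content.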

            # Step 1
            outerLoopCounter = 0

            # Step 2
            while outerLoopCounter < 8:

                # Step 3
                outerLoopCounter += 1

                # Step 4:

                # Let the formatting element be the last element in
                # the list of active formatting elements that:
                # - is between the end of the list and the last scope
                # marker in the list, if any, or the start of the list
                # otherwise, and
                # - has the same tag name as the token.
                formattingElement = self.tree.elementInActiveFormattingElements(
                    token["name"])
                if (not formattingElement or
                    (formattingElement in self.tree.openElements and
                     not self.tree.elementInScope(formattingElement.name))):
                    # If there is no such node, then abort these steps
                    # and instead act as described in the "any other
                    # end tag" entry below.
                    self.endTagOther(token)
                    return

                # Otherwise, if there is such a node, but that node is
                # not in the stack of open elements, then this is a
                # parse error; remove the element from the list, and
                # abort these steps.
                elif formattingElement not in self.tree.openElements:
                    self.parser.parseError("adoption-agency-1.2", {"name": token["name"]})
                    self.tree.activeFormattingElements.remove(formattingElement)
                    return

                # Otherwise, if there is such a node, and that node is
                # also in the stack of open elements, but the element
                # is not in scope, then this is a parse error; ignore
                # the token, and abort these steps.
                elif not self.tree.elementInScope(formattingElement.name):
                    self.parser.parseError("adoption-agency-4.4", {"name": token["name"]})
                    return

                # Otherwise, there is a formatting element and that
                # element is in the stack and is in scope. If the
                # element is not the current node, this is a parse
                # error. In any case, proceed with the algorithm as
                # written in the following steps.
                else:
                    if formattingElement != self.tree.openElements[-1]:
                        self.parser.parseError("adoption-agency-1.3", {"name": token["name"]})

                # Step 5:

                # Let the furthest block be the topmost node in the
                # stack of open elements that is lower in the stack
                # than the formatting element, and is an element in
                # the special category. There might not be one.
                afeIndex = self.tree.openElements.index(formattingElement)
                furthestBlock = None
                for element in self.tree.openElements[afeIndex:]:
                    if element.nameTuple in specialElements:
                        furthestBlock = element
                        break

                # Step 6:

                # If there is no furthest block, then the UA must
                # first pop all the nodes from the bottom of the stack
                # of open elements, from the current node up to and
                # including the formatting element, then remove the
                # formatting element from the list of active
                # formatting elements, and finally abort these steps.
                if furthestBlock is None:
                    element = self.tree.openElements.pop()
                    while element != formattingElement:
                        element = self.tree.openElements.pop()
                    self.tree.activeFormattingElements.remove(element)
                    return

                # Step 7
                commonAncestor = self.tree.openElements[afeIndex - 1]

                # Step 8:
                # The bookmark is supposed to help us identify where to reinsert
                # nodes in step 15. We have to ensure that we reinsert nodes after
                # the node before the active formatting element. Note the bookmark
                # can move in step 9.7
                bookmark = self.tree.activeFormattingElements.index(formattingElement)

                # Step 9
                lastNode = node = furthestBlock
                innerLoopCounter = 0

                index = self.tree.openElements.index(node)
                while innerLoopCounter < 3:
                    innerLoopCounter += 1
                    # Node is element before node in open elements
                    index -= 1
                    node = self.tree.openElements[index]
                    if node not in self.tree.activeFormattingElements:
                        self.tree.openElements.remove(node)
                        continue
                    # Step 9.6
                    if node == formattingElement:
                        break
                    # Step 9.7
                    if lastNode == furthestBlock:
                        bookmark = self.tree.activeFormattingElements.index(node) + 1
                    # Step 9.8
                    clone = node.cloneNode()
                    # Replace node with clone
                    self.tree.activeFormattingElements[
                        self.tree.activeFormattingElements.index(node)] = clone
                    self.tree.openElements[
                        self.tree.openElements.index(node)] = clone
                    node = clone
                    # Step 9.9
                    # Remove lastNode from its parents, if any
                    if lastNode.parent:
                        lastNode.parent.removeChild(lastNode)
                    node.appendChild(lastNode)
                    # Step 9.10
                    lastNode = node

                # Step 10
                # If commonAncestor is a table, tbody, tfoot, thead or
                # tr, lastNode must be foster parented; otherwise it is
                # appended to commonAncestor
                if lastNode.parent:
                    lastNode.parent.removeChild(lastNode)

                if commonAncestor.name in frozenset(("table", "tbody", "tfoot", "thead", "tr")):
                    parent, insertBefore = self.tree.getTableMisnestedNodePosition()
                    parent.insertBefore(lastNode, insertBefore)
                else:
                    commonAncestor.appendChild(lastNode)

                # Step 11
                clone = formattingElement.cloneNode()

                # Step 12
                furthestBlock.reparentChildren(clone)

                # Step 13
                furthestBlock.appendChild(clone)

                # Step 14
                self.tree.activeFormattingElements.remove(formattingElement)
                self.tree.activeFormattingElements.insert(bookmark, clone)

                # Step 15
                self.tree.openElements.remove(formattingElement)
                self.tree.openElements.insert(
                    self.tree.openElements.index(furthestBlock) + 1, clone)

        def endTagAppletMarqueeObject(self, token):
            if self.tree.elementInScope(token["name"]):
                self.tree.generateImpliedEndTags()
            if self.tree.openElements[-1].name != token["name"]:
                self.parser.parseError("end-tag-too-early", {"name": token["name"]})

            if self.tree.elementInScope(token["name"]):
                element = self.tree.openElements.pop()
                while element.name != token["name"]:
                    element = self.tree.openElements.pop()
                self.tree.clearActiveFormattingElements()

        def endTagBr(self, token):
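            # </br> is treated as if it were a <br> start tag.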
            self.parser.parseError("unexpected-end-tag-treated-as",
                                   {"originalName": "br", "newName": "br element"})
            self.tree.reconstructActiveFormattingElements()
            self.tree.insertElement(impliedTagToken("br", "StartTag"))
            self.tree.openElements.pop()

        def endTagOther(self, token):
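            # "Any other end tag": walk the open-element stack from the
            # top; close up to a matching element, but stop (and ignore
            # the token) if a special element is reached first.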
            for node in self.tree.openElements[::-1]:
                if node.name == token["name"]:
                    self.tree.generateImpliedEndTags(exclude=token["name"])
                    if self.tree.openElements[-1].name != token["name"]:
                        self.parser.parseError("unexpected-end-tag", {"name": token["name"]})
                    while self.tree.openElements.pop() != node:
                        pass
                    break
                else:
                    if node.nameTuple in specialElements:
                        self.parser.parseError("unexpected-end-tag", {"name": token["name"]})
                        break

    class TextPhase(Phase):
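        # Handles RAWTEXT/RCDATA element content (e.g. <script>,
        # <style>, <title>): only character data and the matching end
        # tag are expected while this phase is active.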
        def __init__(self, parser, tree):
            Phase.__init__(self, parser, tree)
            self.startTagHandler = _utils.MethodDispatcher([])
            self.startTagHandler.default = self.startTagOther
            self.endTagHandler = _utils.MethodDispatcher([
                ("script", self.endTagScript)])
            self.endTagHandler.default = self.endTagOther

        def processCharacters(self, token):
            self.tree.insertText(token["data"])

        def processEOF(self):
            self.parser.parseError("expected-named-closing-tag-but-got-eof",
                                   {"name": self.tree.openElements[-1].name})
            self.tree.openElements.pop()
            self.parser.phase = self.parser.originalPhase
            return True

        def startTagOther(self, token):
            assert False, "Tried to process start tag %s in RCDATA/RAWTEXT mode" % token['name']

        def endTagScript(self, token):
            node = self.tree.openElements.pop()
            assert node.name == "script"
            self.parser.phase = self.parser.originalPhase
            # The rest of this method is all stuff that only happens if
            # document.write works

        def endTagOther(self, token):
            self.tree.openElements.pop()
            self.parser.phase = self.parser.originalPhase

    class InTablePhase(Phase):
        # http://www.whatwg.org/specs/web-apps/current-work/#in-table
        def __init__(self, parser, tree):
            Phase.__init__(self, parser, tree)
            self.startTagHandler = _utils.MethodDispatcher([
                ("html", self.startTagHtml),
                ("caption", self.startTagCaption),
                ("colgroup", self.startTagColgroup),
                ("col", self.startTagCol),
                (("tbody", "tfoot", "thead"), self.startTagRowGroup),
                (("td", "th", "tr"), self.startTagImplyTbody),
                ("table", self.startTagTable),
                (("style", "script"), self.startTagStyleScript),
                ("input", self.startTagInput),
                ("form", self.startTagForm)
            ])
            self.startTagHandler.default = self.startTagOther

            self.endTagHandler = _utils.MethodDispatcher([
                ("table", self.endTagTable),
                (("body", "caption", "col", "colgroup", "html", "tbody", "td",
                  "tfoot", "th", "thead", "tr"), self.endTagIgnore)
            ])
            self.endTagHandler.default = self.endTagOther

        # helper methods
        def clearStackToTableContext(self):
            # "clear the stack back to a table context"
            while self.tree.openElements[-1].name not in ("table", "html"):
                # self.parser.parseError("unexpected-implied-end-tag-in-table",
                #  {"name":  self.tree.openElements[-1].name})
                self.tree.openElements.pop()
            # When the current node is <html> it's an innerHTML case

        # processing methods
        def processEOF(self):
            if self.tree.openElements[-1].name != "html":
                self.parser.parseError("eof-in-table")
            else:
                assert self.parser.innerHTML
            # Stop parsing

        def processSpaceCharacters(self, token):
            originalPhase = self.parser.phase
            self.parser.phase = self.parser.phases["inTableText"]
            self.parser.phase.originalPhase = originalPhase
            self.parser.phase.processSpaceCharacters(token)

        def processCharacters(self, token):
            originalPhase = self.parser.phase
            self.parser.phase = self.parser.phases["inTableText"]
            self.parser.phase.originalPhase = originalPhase
            self.parser.phase.processCharacters(token)

        def insertText(self, token):
            # If we get here there must be at least one non-whitespace character
            # Do the table magic!
            self.tree.insertFromTable = True
            self.parser.phases["inBody"].processCharacters(token)
            self.tree.insertFromTable = False

        def startTagCaption(self, token):
            self.clearStackToTableContext()
            self.tree.activeFormattingElements.append(Marker)
            self.tree.insertElement(token)
            self.parser.phase = self.parser.phases["inCaption"]

        def startTagColgroup(self, token):
            self.clearStackToTableContext()
            self.tree.insertElement(token)
            self.parser.phase = self.parser.phases["inColumnGroup"]

        def startTagCol(self, token):
            self.startTagColgroup(impliedTagToken("colgroup", "StartTag"))
            return token

        def startTagRowGroup(self, token):
            self.clearStackToTableContext()
            self.tree.insertElement(token)
            self.parser.phase = self.parser.phases["inTableBody"]

        def startTagImplyTbody(self, token):
            self.startTagRowGroup(impliedTagToken("tbody", "StartTag"))
            return token

        def startTagTable(self, token):
            self.parser.parseError("unexpected-start-tag-implies-end-tag",
                                   {"startName": "table", "endName": "table"})
            self.parser.phase.processEndTag(impliedTagToken("table"))
            if not self.parser.innerHTML:
                return token

        def startTagStyleScript(self, token):
            return self.parser.phases["inHead"].processStartTag(token)

        def startTagInput(self, token):
            if ("type" in token["data"] and
                    token["data"]["type"].translate(asciiUpper2Lower) == "hidden"):
                self.parser.parseError("unexpected-hidden-input-in-table")
                self.tree.insertElement(token)
                # XXX associate with form
                self.tree.openElements.pop()
            else:
                self.startTagOther(token)

        def startTagForm(self, token):
            self.parser.parseError("unexpected-form-in-table")
            if self.tree.formPointer is None:
                self.tree.insertElement(token)
                self.tree.formPointer = self.tree.openElements[-1]
                self.tree.openElements.pop()

        def startTagOther(self, token):
            self.parser.parseError("unexpected-start-tag-implies-table-voodoo", {"name": token["name"]})
            # Do the table magic!
            self.tree.insertFromTable = True
            self.parser.phases["inBody"].processStartTag(token)
            self.tree.insertFromTable = False

        def endTagTable(self, token):
            if self.tree.elementInScope("table", variant="table"):
                self.tree.generateImpliedEndTags()
                if self.tree.openElements[-1].name != "table":
                    self.parser.parseError("end-tag-too-early-named",
                                           {"gotName": "table",
                                            "expectedName": self.tree.openElements[-1].name})
                while self.tree.openElements[-1].name != "table":
                    self.tree.openElements.pop()
                self.tree.openElements.pop()
                self.parser.resetInsertionMode()
            else:
                # innerHTML case
                assert self.parser.innerHTML
                self.parser.parseError()

        def endTagIgnore(self, token):
            self.parser.parseError("unexpected-end-tag", {"name": token["name"]})

        def endTagOther(self, token):
            self.parser.parseError("unexpected-end-tag-implies-table-voodoo", {"name": token["name"]})
            # Do the table magic!
            self.tree.insertFromTable = True
            self.parser.phases["inBody"].processEndTag(token)
            self.tree.insertFromTable = False

    class InTableTextPhase(Phase):
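        # Buffers character tokens seen while "in table" so that a run
        # containing any non-whitespace can be foster-parented as a
        # whole via the inBody phase (see flushCharacters).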
        def __init__(self, parser, tree):
            Phase.__init__(self, parser, tree)
            self.originalPhase = None
            self.characterTokens = []

        def flushCharacters(self):
            data = "".join([item["data"] for item in self.characterTokens])
            if any([item not in spaceCharacters for item in data]):
                token = {"type": tokenTypes["Characters"], "data": data}
                self.parser.phases["inTable"].insertText(token)
            elif data:
                self.tree.insertText(data)
            self.characterTokens = []

        def processComment(self, token):
            self.flushCharacters()
            self.parser.phase = self.originalPhase
            return token

        def processEOF(self):
            self.flushCharacters()
            self.parser.phase = self.originalPhase
            return True

        def processCharacters(self, token):
            if token["data"] == "\u0000":
                return
            self.characterTokens.append(token)

        def processSpaceCharacters(self, token):
            # pretty sure we should never reach here
            self.characterTokens.append(token)
            # assert False

        def processStartTag(self, token):
            self.flushCharacters()
            self.parser.phase = self.originalPhase
            return token

        def processEndTag(self, token):
            self.flushCharacters()
            self.parser.phase = self.originalPhase
            return token

    class InCaptionPhase(Phase):
        # http://www.whatwg.org/specs/web-apps/current-work/#in-caption
        def __init__(self, parser, tree):
            Phase.__init__(self, parser, tree)

            self.startTagHandler = _utils.MethodDispatcher([
                ("html", self.startTagHtml),
                (("caption", "col", "colgroup", "tbody", "td", "tfoot", "th",
                  "thead", "tr"), self.startTagTableElement)
            ])
            self.startTagHandler.default = self.startTagOther

            self.endTagHandler = _utils.MethodDispatcher([
                ("caption", self.endTagCaption),
                ("table", self.endTagTable),
                (("body", "col", "colgroup", "html", "tbody", "td", "tfoot", "th",
                  "thead", "tr"), self.endTagIgnore)
            ])
            self.endTagHandler.default = self.endTagOther

        def ignoreEndTagCaption(self):
            return not self.tree.elementInScope("caption", variant="table")

        def processEOF(self):
            self.parser.phases["inBody"].processEOF()

        def processCharacters(self, token):
            return self.parser.phases["inBody"].processCharacters(token)

        def startTagTableElement(self, token):
            self.parser.parseError()
            # XXX Have to duplicate logic here to find out if the tag is ignored
            ignoreEndTag = self.ignoreEndTagCaption()
            self.parser.phase.processEndTag(impliedTagToken("caption"))
            if not ignoreEndTag:
                return token

        def startTagOther(self, token):
            return self.parser.phases["inBody"].processStartTag(token)

        def endTagCaption(self, token):
            if not self.ignoreEndTagCaption():
                # AT this code is quite similar to endTagTable in "InTable"
                self.tree.generateImpliedEndTags()
                if self.tree.openElements[-1].name != "caption":
                    self.parser.parseError("expected-one-end-tag-but-got-another",
                                           {"gotName": "caption",
                                            "expectedName": self.tree.openElements[-1].name})
                while self.tree.openElements[-1].name != "caption":
                    self.tree.openElements.pop()
                self.tree.openElements.pop()
                self.tree.clearActiveFormattingElements()
                self.parser.phase = self.parser.phases["inTable"]
            else:
                # innerHTML case
                assert self.parser.innerHTML
                self.parser.parseError()

        def endTagTable(self, token):
            self.parser.parseError()
            ignoreEndTag = self.ignoreEndTagCaption()
            self.parser.phase.processEndTag(impliedTagToken("caption"))
            if not ignoreEndTag:
                return token

        def endTagIgnore(self, token):
            self.parser.parseError("unexpected-end-tag", {"name": token["name"]})

        def endTagOther(self, token):
            return self.parser.phases["inBody"].processEndTag(token)

    class InColumnGroupPhase(Phase):
        # http://www.whatwg.org/specs/web-apps/current-work/#in-column

        def __init__(self, parser, tree):
            Phase.__init__(self, parser, tree)

            self.startTagHandler = _utils.MethodDispatcher([
                ("html", self.startTagHtml),
                ("col", self.startTagCol)
            ])
            self.startTagHandler.default = self.startTagOther

            self.endTagHandler = _utils.MethodDispatcher([
                ("colgroup", self.endTagColgroup),
                ("col", self.endTagCol)
            ])
            self.endTagHandler.default = self.endTagOther

        def ignoreEndTagColgroup(self):
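            # The current node is <html> only in the fragment
            # (innerHTML) case, where </colgroup> must be ignored.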
            return self.tree.openElements[-1].name == "html"

        def processEOF(self):
            if self.tree.openElements[-1].name == "html":
                assert self.parser.innerHTML
                return
            else:
                ignoreEndTag = self.ignoreEndTagColgroup()
                self.endTagColgroup(impliedTagToken("colgroup"))
                if not ignoreEndTag:
                    return True

        def processCharacters(self, token):
            ignoreEndTag = self.ignoreEndTagColgroup()
            self.endTagColgroup(impliedTagToken("colgroup"))
            if not ignoreEndTag:
                return token

        def startTagCol(self, token):
            self.tree.insertElement(token)
            self.tree.openElements.pop()
            token["selfClosingAcknowledged"] = True

        def startTagOther(self, token):
            ignoreEndTag = self.ignoreEndTagColgroup()
            self.endTagColgroup(impliedTagToken("colgroup"))
            if not ignoreEndTag:
                return token

        def endTagColgroup(self, token):
            if self.ignoreEndTagColgroup():
                # innerHTML case
                assert self.parser.innerHTML
                self.parser.parseError()
            else:
                self.tree.openElements.pop()
                self.parser.phase = self.parser.phases["inTable"]

        def endTagCol(self, token):
            self.parser.parseError("no-end-tag", {"name": "col"})

        def endTagOther(self, token):
            ignoreEndTag = self.ignoreEndTagColgroup()
            self.endTagColgroup(impliedTagToken("colgroup"))
            if not ignoreEndTag:
                return token

    class InTableBodyPhase(Phase):
        # http://www.whatwg.org/specs/web-apps/current-work/#in-table0
        def __init__(self, parser, tree):
            Phase.__init__(self, parser, tree)
            self.startTagHandler = _utils.MethodDispatcher([
                ("html", self.startTagHtml),
                ("tr", self.startTagTr),
                (("td", "th"), self.startTagTableCell),
                (("caption", "col", "colgroup", "tbody", "tfoot", "thead"),
                 self.startTagTableOther)
            ])
            self.startTagHandler.default = self.startTagOther

            self.endTagHandler = _utils.MethodDispatcher([
                (("tbody", "tfoot", "thead"), self.endTagTableRowGroup),
                ("table", self.endTagTable),
                (("body", "caption", "col", "colgroup", "html", "td", "th",
                  "tr"), self.endTagIgnore)
            ])
            self.endTagHandler.default = self.endTagOther

        # helper methods
        def clearStackToTableBodyContext(self):
            while self.tree.openElements[-1].name not in ("tbody", "tfoot",
                                                          "thead", "html"):
                # self.parser.parseError("unexpected-implied-end-tag-in-table",
                #  {"name": self.tree.openElements[-1].name})
                self.tree.openElements.pop()
            if self.tree.openElements[-1].name == "html":
                assert self.parser.innerHTML

        # the rest
        def processEOF(self):
            self.parser.phases["inTable"].processEOF()

        def processSpaceCharacters(self, token):
            return self.parser.phases["inTable"].processSpaceCharacters(token)

        def processCharacters(self, token):
            return self.parser.phases["inTable"].processCharacters(token)

        def startTagTr(self, token):
            self.clearStackToTableBodyContext()
            self.tree.insertElement(token)
            self.parser.phase = self.parser.phases["inRow"]

        def startTagTableCell(self, token):
            self.parser.parseError("unexpected-cell-in-table-body",
                                   {"name": token["name"]})
            self.startTagTr(impliedTagToken("tr", "StartTag"))
            return token

        def startTagTableOther(self, token):
            # XXX AT Any ideas on how to share this with endTagTable?
            if (self.tree.elementInScope("tbody", variant="table") or
                self.tree.elementInScope("thead", variant="table") or
                    self.tree.elementInScope("tfoot", variant="table")):
                self.clearStackToTableBodyContext()
                self.endTagTableRowGroup(
                    impliedTagToken(self.tree.openElements[-1].name))
                return token
            else:
                # innerHTML case
                assert self.parser.innerHTML
                self.parser.parseError()

        def startTagOther(self, token):
            return self.parser.phases["inTable"].processStartTag(token)

        def endTagTableRowGroup(self, token):
            if self.tree.elementInScope(token["name"], variant="table"):
                self.clearStackToTableBodyContext()
                self.tree.openElements.pop()
                self.parser.phase = self.parser.phases["inTable"]
            else:
                self.parser.parseError("unexpected-end-tag-in-table-body",
                                       {"name": token["name"]})

        def endTagTable(self, token):
            if (self.tree.elementInScope("tbody", variant="table") or
                self.tree.elementInScope("thead", variant="table") or
                    self.tree.elementInScope("tfoot", variant="table")):
                self.clearStackToTableBodyContext()
                self.endTagTableRowGroup(
                    impliedTagToken(self.tree.openElements[-1].name))
                return token
            else:
                # innerHTML case
                assert self.parser.innerHTML
                self.parser.parseError()

        def endTagIgnore(self, token):
            self.parser.parseError("unexpected-end-tag-in-table-body",
                                   {"name": token["name"]})

        def endTagOther(self, token):
            return self.parser.phases["inTable"].processEndTag(token)

    class InRowPhase(Phase):
        # http://www.whatwg.org/specs/web-apps/current-work/#in-row
        def __init__(self, parser, tree):
            Phase.__init__(self, parser, tree)
            self.startTagHandler = _utils.MethodDispatcher([
                ("html", self.startTagHtml),
                (("td", "th"), self.startTagTableCell),
                (("caption", "col", "colgroup", "tbody", "tfoot", "thead",
                  "tr"), self.startTagTableOther)
            ])
            self.startTagHandler.default = self.startTagOther

            self.endTagHandler = _utils.MethodDispatcher([
                ("tr", self.endTagTr),
                ("table", self.endTagTable),
                (("tbody", "tfoot", "thead"), self.endTagTableRowGroup),
                (("body", "caption", "col", "colgroup", "html", "td", "th"),
                 self.endTagIgnore)
            ])
            self.endTagHandler.default = self.endTagOther

        # helper methods (XXX unify this with other table helper methods)
        def clearStackToTableRowContext(self):
            while self.tree.openElements[-1].name not in ("tr", "html"):
                self.parser.parseError("unexpected-implied-end-tag-in-table-row",
                                       {"name": self.tree.openElements[-1].name})
                self.tree.openElements.pop()

        def ignoreEndTagTr(self):
            return not self.tree.elementInScope("tr", variant="table")

        # the rest
        def processEOF(self):
            self.parser.phases["inTable"].processEOF()

        def processSpaceCharacters(self, token):
            return self.parser.phases["inTable"].processSpaceCharacters(token)

        def processCharacters(self, token):
            return self.parser.phases["inTable"].processCharacters(token)

        def startTagTableCell(self, token):
            self.clearStackToTableRowContext()
            self.tree.insertElement(token)
            self.parser.phase = self.parser.phases["inCell"]
            self.tree.activeFormattingElements.append(Marker)

        def startTagTableOther(self, token):
            ignoreEndTag = self.ignoreEndTagTr()
            self.endTagTr(impliedTagToken("tr"))
            # XXX how are we sure it's always ignored in the innerHTML case?
            if not ignoreEndTag:
                return token

        def startTagOther(self, token):
            return self.parser.phases["inTable"].processStartTag(token)

        def endTagTr(self, token):
            if not self.ignoreEndTagTr():
                self.clearStackToTableRowContext()
                self.tree.openElements.pop()
                self.parser.phase = self.parser.phases["inTableBody"]
            else:
                # innerHTML case
                assert self.parser.innerHTML
                self.parser.parseError()

        def endTagTable(self, token):
            ignoreEndTag = self.ignoreEndTagTr()
            self.endTagTr(impliedTagToken("tr"))
            # Reprocess the current tag if the tr end tag was not ignored
            # XXX how are we sure it's always ignored in the innerHTML case?
            if not ignoreEndTag:
                return token

        def endTagTableRowGroup(self, token):
            if self.tree.elementInScope(token["name"], variant="table"):
                self.endTagTr(impliedTagToken("tr"))
                return token
            else:
                self.parser.parseError()

        def endTagIgnore(self, token):
            self.parser.parseError("unexpected-end-tag-in-table-row",
                                   {"name": token["name"]})

        def endTagOther(self, token):
            return self.parser.phases["inTable"].processEndTag(token)

    class InCellPhase(Phase):
        # http://www.whatwg.org/specs/web-apps/current-work/#in-cell
        def __init__(self, parser, tree):
            Phase.__init__(self, parser, tree)
            self.startTagHandler = _utils.MethodDispatcher([
                ("html", self.startTagHtml),
                (("caption", "col", "colgroup", "tbody", "td", "tfoot", "th",
                  "thead", "tr"), self.startTagTableOther)
            ])
            self.startTagHandler.default = self.startTagOther

            self.endTagHandler = _utils.MethodDispatcher([
                (("td", "th"), self.endTagTableCell),
                (("body", "caption", "col", "colgroup", "html"), self.endTagIgnore),
                (("table", "tbody", "tfoot", "thead", "tr"), self.endTagImply)
            ])
            self.endTagHandler.default = self.endTagOther

        # helper
        def closeCell(self):
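            # Close any open <td> or <th> so a new cell or row can start.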
            if self.tree.elementInScope("td", variant="table"):
                self.endTagTableCell(impliedTagToken("td"))
            elif self.tree.elementInScope("th", variant="table"):
                self.endTagTableCell(impliedTagToken("th"))

        # the rest
        def processEOF(self):
            self.parser.phases["inBody"].processEOF()

        def processCharacters(self, token):
            return self.parser.phases["inBody"].processCharacters(token)

        def startTagTableOther(self, token):
            if (self.tree.elementInScope("td", variant="table") or
                    self.tree.elementInScope("th", variant="table")):
                self.closeCell()
                return token
            else:
                # innerHTML case
                assert self.parser.innerHTML
                self.parser.parseError()

        def startTagOther(self, token):
            return self.parser.phases["inBody"].processStartTag(token)

        def endTagTableCell(self, token):
            if self.tree.elementInScope(token["name"], variant="table"):
                self.tree.generateImpliedEndTags(token["name"])
                if self.tree.openElements[-1].name != token["name"]:
                    self.parser.parseError("unexpected-cell-end-tag",
                                           {"name": token["name"]})
                    while True:
                        node = self.tree.openElements.pop()
                        if node.name == token["name"]:
                            break
                else:
                    self.tree.openElements.pop()
                self.tree.clearActiveFormattingElements()
                self.parser.phase = self.parser.phases["inRow"]
            else:
                self.parser.parseError("unexpected-end-tag", {"name": token["name"]})

        def endTagIgnore(self, token):
            self.parser.parseError("unexpected-end-tag", {"name": token["name"]})

        def endTagImply(self, token):
            if self.tree.elementInScope(token["name"], variant="table"):
                self.closeCell()
                return token
            else:
                # sometimes innerHTML case
                self.parser.parseError()

        def endTagOther(self, token):
            return self.parser.phases["inBody"].processEndTag(token)

    class InSelectPhase(Phase):
        def __init__(self, parser, tree):
            Phase.__init__(self, parser, tree)

            self.startTagHandler = _utils.MethodDispatcher([
                ("html", self.startTagHtml),
                ("option", self.startTagOption),
                ("optgroup", self.startTagOptgroup),
                ("select", self.startTagSelect),
                (("input", "keygen", "textarea"), self.startTagInput),
                ("script", self.startTagScript)
            ])
            self.startTagHandler.default = self.startTagOther

            self.endTagHandler = _utils.MethodDispatcher([
                ("option", self.endTagOption),
                ("optgroup", self.endTagOptgroup),
                ("select", self.endTagSelect)
            ])
            self.endTagHandler.default = self.endTagOther

        # http://www.whatwg.org/specs/web-apps/current-work/#in-select
        def processEOF(self):
            if self.tree.openElements[-1].name != "html":
                self.parser.parseError("eof-in-select")
            else:
                assert self.parser.innerHTML

        def processCharacters(self, token):
            if token["data"] == "\u0000":
                return
            self.tree.insertText(token["data"])

        def startTagOption(self, token):
            # We need to imply </option> if <option> is the current node.
            if self.tree.openElements[-1].name == "option":
                self.tree.openElements.pop()
            self.tree.insertElement(token)

        def startTagOptgroup(self, token):
            if self.tree.openElements[-1].name == "option":
                self.tree.openElements.pop()
            if self.tree.openElements[-1].name == "optgroup":
                self.tree.openElements.pop()
            self.tree.insertElement(token)

        def startTagSelect(self, token):
            self.parser.parseError("unexpected-select-in-select")
            self.endTagSelect(impliedTagToken("select"))

        def startTagInput(self, token):
            self.parser.parseError("unexpected-input-in-select")
            if self.tree.elementInScope("select", variant="select"):
                self.endTagSelect(impliedTagToken("select"))
                return token
            else:
                assert self.parser.innerHTML

        def startTagScript(self, token):
            return self.parser.phases["inHead"].processStartTag(token)

        def startTagOther(self, token):
            self.parser.parseError("unexpected-start-tag-in-select",
                                   {"name": token["name"]})

        def endTagOption(self, token):
            if self.tree.openElements[-1].name == "option":
                self.tree.openElements.pop()
            else:
                self.parser.parseError("unexpected-end-tag-in-select",
                                       {"name": "option"})

        def endTagOptgroup(self, token):
            # </optgroup> implicitly closes <option>
            if (self.tree.openElements[-1].name == "option" and
                    self.tree.openElements[-2].name == "optgroup"):
                self.tree.openElements.pop()
            # It also closes an open <optgroup>
            if self.tree.openElements[-1].name == "optgroup":
                self.tree.openElements.pop()
            # But nothing else
            else:
                self.parser.parseError("unexpected-end-tag-in-select",
                                       {"name": "optgroup"})

        def endTagSelect(self, token):
            if self.tree.elementInScope("select", variant="select"):
                node = self.tree.openElements.pop()
                while node.name != "select":
                    node = self.tree.openElements.pop()
                self.parser.resetInsertionMode()
            else:
                # innerHTML case
                assert self.parser.innerHTML
                self.parser.parseError()

        def endTagOther(self, token):
            self.parser.parseError("unexpected-end-tag-in-select",
                                   {"name": token["name"]})

    class InSelectInTablePhase(Phase):
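        # Like InSelectPhase, but table-related tags force the open
        # <select> closed so the surrounding table can be processed.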
        def __init__(self, parser, tree):
            Phase.__init__(self, parser, tree)

            self.startTagHandler = _utils.MethodDispatcher([
                (("caption", "table", "tbody", "tfoot", "thead", "tr", "td", "th"),
                 self.startTagTable)
            ])
            self.startTagHandler.default = self.startTagOther

            self.endTagHandler = _utils.MethodDispatcher([
                (("caption", "table", "tbody", "tfoot", "thead", "tr", "td", "th"),
                 self.endTagTable)
            ])
            self.endTagHandler.default = self.endTagOther

        def processEOF(self):
            self.parser.phases["inSelect"].processEOF()

        def processCharacters(self, token):
            return self.parser.phases["inSelect"].processCharacters(token)

        def startTagTable(self, token):
            self.parser.parseError("unexpected-table-element-start-tag-in-select-in-table", {"name": token["name"]})
            self.endTagOther(impliedTagToken("select"))
            return token

        def startTagOther(self, token):
            return self.parser.phases["inSelect"].processStartTag(token)

        def endTagTable(self, token):
            self.parser.parseError("unexpected-table-element-end-tag-in-select-in-table", {"name": token["name"]})
            if self.tree.elementInScope(token["name"], variant="table"):
                self.endTagOther(impliedTagToken("select"))
                return token

        def endTagOther(self, token):
            return self.parser.phases["inSelect"].processEndTag(token)

    class InForeignContentPhase(Phase):
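        # HTML start tags that break out of SVG/MathML (foreign)
        # content back into regular HTML parsing; see processStartTag.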
        breakoutElements = frozenset(["b", "big", "blockquote", "body", "br",
                                      "center", "code", "dd", "div", "dl", "dt",
                                      "em", "embed", "h1", "h2", "h3",
                                      "h4", "h5", "h6", "head", "hr", "i", "img",
                                      "li", "listing", "menu", "meta", "nobr",
                                      "ol", "p", "pre", "ruby", "s", "small",
                                      "span", "strong", "strike", "sub", "sup",
                                      "table", "tt", "u", "ul", "var"])

        def __init__(self, parser, tree):
            Phase.__init__(self, parser, tree)

        def adjustSVGTagNames(self, token):
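            # The tokenizer lowercases tag names, but SVG element names
            # are case-sensitive; restore the canonical mixed-case forms.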
            replacements = {"altglyph": "altGlyph",
                            "altglyphdef": "altGlyphDef",
                            "altglyphitem": "altGlyphItem",
                            "animatecolor": "animateColor",
                            "animatemotion": "animateMotion",
                            "animatetransform": "animateTransform",
                            "clippath": "clipPath",
                            "feblend": "feBlend",
                            "fecolormatrix": "feColorMatrix",
                            "fecomponenttransfer": "feComponentTransfer",
                            "fecomposite": "feComposite",
                            "feconvolvematrix": "feConvolveMatrix",
                            "fediffuselighting": "feDiffuseLighting",
                            "fedisplacementmap": "feDisplacementMap",
                            "fedistantlight": "feDistantLight",
                            "feflood": "feFlood",
                            "fefunca": "feFuncA",
                            "fefuncb": "feFuncB",
                            "fefuncg": "feFuncG",
                            "fefuncr": "feFuncR",
                            "fegaussianblur": "feGaussianBlur",
                            "feimage": "feImage",
                            "femerge": "feMerge",
                            "femergenode": "feMergeNode",
                            "femorphology": "feMorphology",
                            "feoffset": "feOffset",
                            "fepointlight": "fePointLight",
                            "fespecularlighting": "feSpecularLighting",
                            "fespotlight": "feSpotLight",
                            "fetile": "feTile",
                            "feturbulence": "feTurbulence",
                            "foreignobject": "foreignObject",
                            "glyphref": "glyphRef",
                            "lineargradient": "linearGradient",
                            "radialgradient": "radialGradient",
                            "textpath": "textPath"}

            if token["name"] in replacements:
                token["name"] = replacements[token["name"]]

        def processCharacters(self, token):
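            # U+0000 becomes U+FFFD; any other non-space character
            # clears framesetOK.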
            if token["data"] == "\u0000":
                token["data"] = "\uFFFD"
            elif (self.parser.framesetOK and
                  any(char not in spaceCharacters for char in token["data"])):
                self.parser.framesetOK = False
            Phase.processCharacters(self, token)

        def processStartTag(self, token):
            currentNode = self.tree.openElements[-1]
            if (token["name"] in self.breakoutElements or
                (token["name"] == "font" and
                 set(token["data"].keys()) & set(["color", "face", "size"]))):
                self.parser.parseError("unexpected-html-element-in-foreign-content",
                                       {"name": token["name"]})
                while (self.tree.openElements[-1].namespace !=
                       self.tree.defaultNamespace and
                       not self.parser.isHTMLIntegrationPoint(self.tree.openElements[-1]) and
                       not self.parser.isMathMLTextIntegrationPoint(self.tree.openElements[-1])):
                    self.tree.openElements.pop()
                return token

            else:
                if currentNode.namespace == namespaces["mathml"]:
                    self.parser.adjustMathMLAttributes(token)
                elif currentNode.namespace == namespaces["svg"]:
                    self.adjustSVGTagNames(token)
                    self.parser.adjustSVGAttributes(token)
                self.parser.adjustForeignAttributes(token)
                token["namespace"] = currentNode.namespace
                self.tree.insertElement(token)
                if token["selfClosing"]:
                    self.tree.openElements.pop()
                    token["selfClosingAcknowledged"] = True

        def processEndTag(self, token):
            nodeIndex = len(self.tree.openElements) - 1
            node = self.tree.openElements[-1]
            if node.name.translate(asciiUpper2Lower) != token["name"]:
                self.parser.parseError("unexpected-end-tag", {"name": token["name"]})

            while True:
                if node.name.translate(asciiUpper2Lower) == token["name"]:
                    # XXX this isn't in the spec but it seems necessary
                    if self.parser.phase == self.parser.phases["inTableText"]:
                        self.parser.phase.flushCharacters()
                        self.parser.phase = self.parser.phase.originalPhase
                    while self.tree.openElements.pop() != node:
                        assert self.tree.openElements
                    new_token = None
                    break
                nodeIndex -= 1

                node = self.tree.openElements[nodeIndex]
                if node.namespace != self.tree.defaultNamespace:
                    continue
                else:
                    new_token = self.parser.phase.processEndTag(token)
                    break
            return new_token

    class AfterBodyPhase(Phase):
        def __init__(self, parser, tree):
            Phase.__init__(self, parser, tree)

            self.startTagHandler = _utils.MethodDispatcher([
                ("html", self.startTagHtml)
            ])
            self.startTagHandler.default = self.startTagOther

            self.endTagHandler = _utils.MethodDispatcher([("html", self.endTagHtml)])
            self.endTagHandler.default = self.endTagOther

        def processEOF(self):
            # Stop parsing
            pass

        def processComment(self, token):
            # This is needed because data is to be appended to the <html> element
            # here and not to whatever is currently open.
            self.tree.insertComment(token, self.tree.openElements[0])

        def processCharacters(self, token):
            self.parser.parseError("unexpected-char-after-body")
            self.parser.phase = self.parser.phases["inBody"]
            return token

        def startTagHtml(self, token):
            return self.parser.phases["inBody"].processStartTag(token)

        def startTagOther(self, token):
            self.parser.parseError("unexpected-start-tag-after-body",
                                   {"name": token["name"]})
            self.parser.phase = self.parser.phases["inBody"]
            return token

        def endTagHtml(self, token):
            if self.parser.innerHTML:
                self.parser.parseError("unexpected-end-tag-after-body-innerhtml")
            else:
                self.parser.phase = self.parser.phases["afterAfterBody"]

        def endTagOther(self, token):
            self.parser.parseError("unexpected-end-tag-after-body",
                                   {"name": token["name"]})
            self.parser.phase = self.parser.phases["inBody"]
            return token

    class InFramesetPhase(Phase):
        # http://www.whatwg.org/specs/web-apps/current-work/#in-frameset
        def __init__(self, parser, tree):
            Phase.__init__(self, parser, tree)

            self.startTagHandler = _utils.MethodDispatcher([
                ("html", self.startTagHtml),
                ("frameset", self.startTagFrameset),
                ("frame", self.startTagFrame),
                ("noframes", self.startTagNoframes)
            ])
            self.startTagHandler.default = self.startTagOther

            self.endTagHandler = _utils.MethodDispatcher([
                ("frameset", self.endTagFrameset)
            ])
            self.endTagHandler.default = self.endTagOther

        def processEOF(self):
            if self.tree.openElements[-1].name != "html":
                self.parser.parseError("eof-in-frameset")
            else:
                assert self.parser.innerHTML

        def processCharacters(self, token):
            self.parser.parseError("unexpected-char-in-frameset")

        def startTagFrameset(self, token):
            self.tree.insertElement(token)

        def startTagFrame(self, token):
            self.tree.insertElement(token)
            self.tree.openElements.pop()

        def startTagNoframes(self, token):
            return self.parser.phases["inBody"].processStartTag(token)

        def startTagOther(self, token):
            self.parser.parseError("unexpected-start-tag-in-frameset",
                                   {"name": token["name"]})

        def endTagFrameset(self, token):
            if self.tree.openElements[-1].name == "html":
                # innerHTML case
                self.parser.parseError("unexpected-frameset-in-frameset-innerhtml")
            else:
                self.tree.openElements.pop()
            if (not self.parser.innerHTML and
                    self.tree.openElements[-1].name != "frameset"):
                # If we're not in innerHTML mode and the current node is not a
                # "frameset" element (anymore) then switch.
                self.parser.phase = self.parser.phases["afterFrameset"]

        def endTagOther(self, token):
            self.parser.parseError("unexpected-end-tag-in-frameset",
                                   {"name": token["name"]})

    class AfterFramesetPhase(Phase):
        # http://www.whatwg.org/specs/web-apps/current-work/#after3
        def __init__(self, parser, tree):
            Phase.__init__(self, parser, tree)

            self.startTagHandler = _utils.MethodDispatcher([
                ("html", self.startTagHtml),
                ("noframes", self.startTagNoframes)
            ])
            self.startTagHandler.default = self.startTagOther

            self.endTagHandler = _utils.MethodDispatcher([
                ("html", self.endTagHtml)
            ])
            self.endTagHandler.default = self.endTagOther

        def processEOF(self):
            # Stop parsing
            pass

        def processCharacters(self, token):
            self.parser.parseError("unexpected-char-after-frameset")

        def startTagNoframes(self, token):
            return self.parser.phases["inHead"].processStartTag(token)

        def startTagOther(self, token):
            self.parser.parseError("unexpected-start-tag-after-frameset",
                                   {"name": token["name"]})

        def endTagHtml(self, token):
            self.parser.phase = self.parser.phases["afterAfterFrameset"]

        def endTagOther(self, token):
            self.parser.parseError("unexpected-end-tag-after-frameset",
                                   {"name": token["name"]})

    class AfterAfterBodyPhase(Phase):
        def __init__(self, parser, tree):
            Phase.__init__(self, parser, tree)

            self.startTagHandler = _utils.MethodDispatcher([
                ("html", self.startTagHtml)
            ])
            self.startTagHandler.default = self.startTagOther

        def processEOF(self):
            pass

        def processComment(self, token):
            self.tree.insertComment(token, self.tree.document)

        def processSpaceCharacters(self, token):
            return self.parser.phases["inBody"].processSpaceCharacters(token)

        def processCharacters(self, token):
            self.parser.parseError("expected-eof-but-got-char")
            self.parser.phase = self.parser.phases["inBody"]
            return token

        def startTagHtml(self, token):
            return self.parser.phases["inBody"].processStartTag(token)

        def startTagOther(self, token):
            self.parser.parseError("expected-eof-but-got-start-tag",
                                   {"name": token["name"]})
            self.parser.phase = self.parser.phases["inBody"]
            return token

        def processEndTag(self, token):
            self.parser.parseError("expected-eof-but-got-end-tag",
                                   {"name": token["name"]})
            self.parser.phase = self.parser.phases["inBody"]
            return token

    class AfterAfterFramesetPhase(Phase):
        def __init__(self, parser, tree):
            Phase.__init__(self, parser, tree)

            self.startTagHandler = _utils.MethodDispatcher([
                ("html", self.startTagHtml),
                ("noframes", self.startTagNoFrames)
            ])
            self.startTagHandler.default = self.startTagOther

        def processEOF(self):
            pass

        def processComment(self, token):
            self.tree.insertComment(token, self.tree.document)

        def processSpaceCharacters(self, token):
            return self.parser.phases["inBody"].processSpaceCharacters(token)

        def processCharacters(self, token):
            self.parser.parseError("expected-eof-but-got-char")

        def startTagHtml(self, token):
            return self.parser.phases["inBody"].processStartTag(token)

        def startTagNoFrames(self, token):
            return self.parser.phases["inHead"].processStartTag(token)

        def startTagOther(self, token):
            self.parser.parseError("expected-eof-but-got-start-tag",
                                   {"name": token["name"]})

        def processEndTag(self, token):
            self.parser.parseError("expected-eof-but-got-end-tag",
                                   {"name": token["name"]})
    # pylint:enable=unused-argument

    return {
        "initial": InitialPhase,
        "beforeHtml": BeforeHtmlPhase,
        "beforeHead": BeforeHeadPhase,
        "inHead": InHeadPhase,
        "inHeadNoscript": InHeadNoscriptPhase,
        "afterHead": AfterHeadPhase,
        "inBody": InBodyPhase,
        "text": TextPhase,
        "inTable": InTablePhase,
        "inTableText": InTableTextPhase,
        "inCaption": InCaptionPhase,
        "inColumnGroup": InColumnGroupPhase,
        "inTableBody": InTableBodyPhase,
        "inRow": InRowPhase,
        "inCell": InCellPhase,
        "inSelect": InSelectPhase,
        "inSelectInTable": InSelectInTablePhase,
        "inForeignContent": InForeignContentPhase,
        "afterBody": AfterBodyPhase,
        "inFrameset": InFramesetPhase,
        "afterFrameset": AfterFramesetPhase,
        "afterAfterBody": AfterAfterBodyPhase,
        "afterAfterFrameset": AfterAfterFramesetPhase,
        # XXX after after frameset
    }


def adjust_attributes(token, replacements):
    if PY3 or _utils.PY27:
        needs_adjustment = viewkeys(token['data']) & viewkeys(replacements)
    else:
        needs_adjustment = frozenset(token['data']) & frozenset(replacements)
    if needs_adjustment:
        token['data'] = OrderedDict((replacements.get(k, k), v)
                                    for k, v in token['data'].items())
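
# Editor's note -- illustrative sketch, not part of the vendored module: the
# replacement map restores the canonical mixed-case spelling of foreign
# (SVG/MathML) attribute names that the tokenizer has lower-cased, e.g.
#
#     >>> token = {"data": OrderedDict([("viewbox", "0 0 10 10"), ("id", "a")])}
#     >>> adjust_attributes(token, {"viewbox": "viewBox"})
#     >>> list(token["data"].items())
#     [('viewBox', '0 0 10 10'), ('id', 'a')]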


def impliedTagToken(name, type="EndTag", attributes=None,
                    selfClosing=False):
    if attributes is None:
        attributes = {}
    return {"type": tokenTypes[type], "name": name, "data": attributes,
            "selfClosing": selfClosing}


class ParseError(Exception):
    """Error in parsed document"""
    pass
site-packages/pip/_vendor/html5lib/_ihatexml.py
from __future__ import absolute_import, division, unicode_literals

import re
import warnings

from .constants import DataLossWarning

baseChar = """
[#x0041-#x005A] | [#x0061-#x007A] | [#x00C0-#x00D6] | [#x00D8-#x00F6] |
[#x00F8-#x00FF] | [#x0100-#x0131] | [#x0134-#x013E] | [#x0141-#x0148] |
[#x014A-#x017E] | [#x0180-#x01C3] | [#x01CD-#x01F0] | [#x01F4-#x01F5] |
[#x01FA-#x0217] | [#x0250-#x02A8] | [#x02BB-#x02C1] | #x0386 |
[#x0388-#x038A] | #x038C | [#x038E-#x03A1] | [#x03A3-#x03CE] |
[#x03D0-#x03D6] | #x03DA | #x03DC | #x03DE | #x03E0 | [#x03E2-#x03F3] |
[#x0401-#x040C] | [#x040E-#x044F] | [#x0451-#x045C] | [#x045E-#x0481] |
[#x0490-#x04C4] | [#x04C7-#x04C8] | [#x04CB-#x04CC] | [#x04D0-#x04EB] |
[#x04EE-#x04F5] | [#x04F8-#x04F9] | [#x0531-#x0556] | #x0559 |
[#x0561-#x0586] | [#x05D0-#x05EA] | [#x05F0-#x05F2] | [#x0621-#x063A] |
[#x0641-#x064A] | [#x0671-#x06B7] | [#x06BA-#x06BE] | [#x06C0-#x06CE] |
[#x06D0-#x06D3] | #x06D5 | [#x06E5-#x06E6] | [#x0905-#x0939] | #x093D |
[#x0958-#x0961] | [#x0985-#x098C] | [#x098F-#x0990] | [#x0993-#x09A8] |
[#x09AA-#x09B0] | #x09B2 | [#x09B6-#x09B9] | [#x09DC-#x09DD] |
[#x09DF-#x09E1] | [#x09F0-#x09F1] | [#x0A05-#x0A0A] | [#x0A0F-#x0A10] |
[#x0A13-#x0A28] | [#x0A2A-#x0A30] | [#x0A32-#x0A33] | [#x0A35-#x0A36] |
[#x0A38-#x0A39] | [#x0A59-#x0A5C] | #x0A5E | [#x0A72-#x0A74] |
[#x0A85-#x0A8B] | #x0A8D | [#x0A8F-#x0A91] | [#x0A93-#x0AA8] |
[#x0AAA-#x0AB0] | [#x0AB2-#x0AB3] | [#x0AB5-#x0AB9] | #x0ABD | #x0AE0 |
[#x0B05-#x0B0C] | [#x0B0F-#x0B10] | [#x0B13-#x0B28] | [#x0B2A-#x0B30] |
[#x0B32-#x0B33] | [#x0B36-#x0B39] | #x0B3D | [#x0B5C-#x0B5D] |
[#x0B5F-#x0B61] | [#x0B85-#x0B8A] | [#x0B8E-#x0B90] | [#x0B92-#x0B95] |
[#x0B99-#x0B9A] | #x0B9C | [#x0B9E-#x0B9F] | [#x0BA3-#x0BA4] |
[#x0BA8-#x0BAA] | [#x0BAE-#x0BB5] | [#x0BB7-#x0BB9] | [#x0C05-#x0C0C] |
[#x0C0E-#x0C10] | [#x0C12-#x0C28] | [#x0C2A-#x0C33] | [#x0C35-#x0C39] |
[#x0C60-#x0C61] | [#x0C85-#x0C8C] | [#x0C8E-#x0C90] | [#x0C92-#x0CA8] |
[#x0CAA-#x0CB3] | [#x0CB5-#x0CB9] | #x0CDE | [#x0CE0-#x0CE1] |
[#x0D05-#x0D0C] | [#x0D0E-#x0D10] | [#x0D12-#x0D28] | [#x0D2A-#x0D39] |
[#x0D60-#x0D61] | [#x0E01-#x0E2E] | #x0E30 | [#x0E32-#x0E33] |
[#x0E40-#x0E45] | [#x0E81-#x0E82] | #x0E84 | [#x0E87-#x0E88] | #x0E8A |
#x0E8D | [#x0E94-#x0E97] | [#x0E99-#x0E9F] | [#x0EA1-#x0EA3] | #x0EA5 |
#x0EA7 | [#x0EAA-#x0EAB] | [#x0EAD-#x0EAE] | #x0EB0 | [#x0EB2-#x0EB3] |
#x0EBD | [#x0EC0-#x0EC4] | [#x0F40-#x0F47] | [#x0F49-#x0F69] |
[#x10A0-#x10C5] | [#x10D0-#x10F6] | #x1100 | [#x1102-#x1103] |
[#x1105-#x1107] | #x1109 | [#x110B-#x110C] | [#x110E-#x1112] | #x113C |
#x113E | #x1140 | #x114C | #x114E | #x1150 | [#x1154-#x1155] | #x1159 |
[#x115F-#x1161] | #x1163 | #x1165 | #x1167 | #x1169 | [#x116D-#x116E] |
[#x1172-#x1173] | #x1175 | #x119E | #x11A8 | #x11AB | [#x11AE-#x11AF] |
[#x11B7-#x11B8] | #x11BA | [#x11BC-#x11C2] | #x11EB | #x11F0 | #x11F9 |
[#x1E00-#x1E9B] | [#x1EA0-#x1EF9] | [#x1F00-#x1F15] | [#x1F18-#x1F1D] |
[#x1F20-#x1F45] | [#x1F48-#x1F4D] | [#x1F50-#x1F57] | #x1F59 | #x1F5B |
#x1F5D | [#x1F5F-#x1F7D] | [#x1F80-#x1FB4] | [#x1FB6-#x1FBC] | #x1FBE |
[#x1FC2-#x1FC4] | [#x1FC6-#x1FCC] | [#x1FD0-#x1FD3] | [#x1FD6-#x1FDB] |
[#x1FE0-#x1FEC] | [#x1FF2-#x1FF4] | [#x1FF6-#x1FFC] | #x2126 |
[#x212A-#x212B] | #x212E | [#x2180-#x2182] | [#x3041-#x3094] |
[#x30A1-#x30FA] | [#x3105-#x312C] | [#xAC00-#xD7A3]"""

ideographic = """[#x4E00-#x9FA5] | #x3007 | [#x3021-#x3029]"""

combiningCharacter = """
[#x0300-#x0345] | [#x0360-#x0361] | [#x0483-#x0486] | [#x0591-#x05A1] |
[#x05A3-#x05B9] | [#x05BB-#x05BD] | #x05BF | [#x05C1-#x05C2] | #x05C4 |
[#x064B-#x0652] | #x0670 | [#x06D6-#x06DC] | [#x06DD-#x06DF] |
[#x06E0-#x06E4] | [#x06E7-#x06E8] | [#x06EA-#x06ED] | [#x0901-#x0903] |
#x093C | [#x093E-#x094C] | #x094D | [#x0951-#x0954] | [#x0962-#x0963] |
[#x0981-#x0983] | #x09BC | #x09BE | #x09BF | [#x09C0-#x09C4] |
[#x09C7-#x09C8] | [#x09CB-#x09CD] | #x09D7 | [#x09E2-#x09E3] | #x0A02 |
#x0A3C | #x0A3E | #x0A3F | [#x0A40-#x0A42] | [#x0A47-#x0A48] |
[#x0A4B-#x0A4D] | [#x0A70-#x0A71] | [#x0A81-#x0A83] | #x0ABC |
[#x0ABE-#x0AC5] | [#x0AC7-#x0AC9] | [#x0ACB-#x0ACD] | [#x0B01-#x0B03] |
#x0B3C | [#x0B3E-#x0B43] | [#x0B47-#x0B48] | [#x0B4B-#x0B4D] |
[#x0B56-#x0B57] | [#x0B82-#x0B83] | [#x0BBE-#x0BC2] | [#x0BC6-#x0BC8] |
[#x0BCA-#x0BCD] | #x0BD7 | [#x0C01-#x0C03] | [#x0C3E-#x0C44] |
[#x0C46-#x0C48] | [#x0C4A-#x0C4D] | [#x0C55-#x0C56] | [#x0C82-#x0C83] |
[#x0CBE-#x0CC4] | [#x0CC6-#x0CC8] | [#x0CCA-#x0CCD] | [#x0CD5-#x0CD6] |
[#x0D02-#x0D03] | [#x0D3E-#x0D43] | [#x0D46-#x0D48] | [#x0D4A-#x0D4D] |
#x0D57 | #x0E31 | [#x0E34-#x0E3A] | [#x0E47-#x0E4E] | #x0EB1 |
[#x0EB4-#x0EB9] | [#x0EBB-#x0EBC] | [#x0EC8-#x0ECD] | [#x0F18-#x0F19] |
#x0F35 | #x0F37 | #x0F39 | #x0F3E | #x0F3F | [#x0F71-#x0F84] |
[#x0F86-#x0F8B] | [#x0F90-#x0F95] | #x0F97 | [#x0F99-#x0FAD] |
[#x0FB1-#x0FB7] | #x0FB9 | [#x20D0-#x20DC] | #x20E1 | [#x302A-#x302F] |
#x3099 | #x309A"""

digit = """
[#x0030-#x0039] | [#x0660-#x0669] | [#x06F0-#x06F9] | [#x0966-#x096F] |
[#x09E6-#x09EF] | [#x0A66-#x0A6F] | [#x0AE6-#x0AEF] | [#x0B66-#x0B6F] |
[#x0BE7-#x0BEF] | [#x0C66-#x0C6F] | [#x0CE6-#x0CEF] | [#x0D66-#x0D6F] |
[#x0E50-#x0E59] | [#x0ED0-#x0ED9] | [#x0F20-#x0F29]"""

extender = """
#x00B7 | #x02D0 | #x02D1 | #x0387 | #x0640 | #x0E46 | #x0EC6 | #x3005 |
[#x3031-#x3035] | [#x309D-#x309E] | [#x30FC-#x30FE]"""

letter = " | ".join([baseChar, ideographic])

# Without the ":" that the XML Name productions would otherwise allow
name = " | ".join([letter, digit, ".", "-", "_", combiningCharacter,
                   extender])
nameFirst = " | ".join([letter, "_"])

reChar = re.compile(r"#x([\dA-F]{4,4})")
reCharRange = re.compile(r"\[#x([\dA-F]{4,4})-#x([\dA-F]{4,4})\]")


def charStringToList(chars):
    charRanges = [item.strip() for item in chars.split(" | ")]
    rv = []
    for item in charRanges:
        foundMatch = False
        for regexp in (reChar, reCharRange):
            match = regexp.match(item)
            if match is not None:
                rv.append([hexToInt(item) for item in match.groups()])
                if len(rv[-1]) == 1:
                    rv[-1] = rv[-1] * 2
                foundMatch = True
                break
        if not foundMatch:
            assert len(item) == 1

            rv.append([ord(item)] * 2)
    rv = normaliseCharList(rv)
    return rv
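
# Editor's note -- illustrative sketch, not part of the vendored module:
# charStringToList turns the spec-style character-class strings above into
# inclusive [start, end] codepoint ranges, e.g.
#
#     >>> charStringToList("#x0041 | [#x0061-#x007A]")
#     [[65, 65], [97, 122]]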


def normaliseCharList(charList):
    charList = sorted(charList)
    for item in charList:
        assert item[1] >= item[0]
    rv = []
    i = 0
    while i < len(charList):
        j = 1
        rv.append(charList[i])
        while i + j < len(charList) and charList[i + j][0] <= rv[-1][1] + 1:
            rv[-1][1] = charList[i + j][1]
            j += 1
        i += j
    return rv
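
# Editor's note -- illustrative sketch, not part of the vendored module:
# normaliseCharList sorts the ranges and merges any that touch or overlap, e.g.
#
#     >>> normaliseCharList([[10, 12], [1, 3], [2, 5]])
#     [[1, 5], [10, 12]]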

# We don't really support characters above the BMP :(
max_unicode = int("FFFF", 16)


def missingRanges(charList):
    rv = []
    if charList[0][0] != 0:
        rv.append([0, charList[0][0] - 1])
    for i, item in enumerate(charList[:-1]):
        rv.append([item[1] + 1, charList[i + 1][0] - 1])
    if charList[-1][1] != max_unicode:
        rv.append([charList[-1][1] + 1, max_unicode])
    return rv


def listToRegexpStr(charList):
    rv = []
    for item in charList:
        if item[0] == item[1]:
            rv.append(escapeRegexp(chr(item[0])))
        else:
            rv.append(escapeRegexp(chr(item[0])) + "-" +
                      escapeRegexp(chr(item[1])))
    return "[%s]" % "".join(rv)


def hexToInt(hex_str):
    return int(hex_str, 16)


def escapeRegexp(string):
    specialCharacters = (".", "^", "$", "*", "+", "?", "{", "}",
                         "[", "]", "|", "(", ")", "-")
    for char in specialCharacters:
        string = string.replace(char, "\\" + char)

    return string

# output from the above
nonXmlNameBMPRegexp = re.compile('[\x00-,/:-@\\[-\\^`\\{-\xb6\xb8-\xbf\xd7\xf7\u0132-\u0133\u013f-\u0140\u0149\u017f\u01c4-\u01cc\u01f1-\u01f3\u01f6-\u01f9\u0218-\u024f\u02a9-\u02ba\u02c2-\u02cf\u02d2-\u02ff\u0346-\u035f\u0362-\u0385\u038b\u038d\u03a2\u03cf\u03d7-\u03d9\u03db\u03dd\u03df\u03e1\u03f4-\u0400\u040d\u0450\u045d\u0482\u0487-\u048f\u04c5-\u04c6\u04c9-\u04ca\u04cd-\u04cf\u04ec-\u04ed\u04f6-\u04f7\u04fa-\u0530\u0557-\u0558\u055a-\u0560\u0587-\u0590\u05a2\u05ba\u05be\u05c0\u05c3\u05c5-\u05cf\u05eb-\u05ef\u05f3-\u0620\u063b-\u063f\u0653-\u065f\u066a-\u066f\u06b8-\u06b9\u06bf\u06cf\u06d4\u06e9\u06ee-\u06ef\u06fa-\u0900\u0904\u093a-\u093b\u094e-\u0950\u0955-\u0957\u0964-\u0965\u0970-\u0980\u0984\u098d-\u098e\u0991-\u0992\u09a9\u09b1\u09b3-\u09b5\u09ba-\u09bb\u09bd\u09c5-\u09c6\u09c9-\u09ca\u09ce-\u09d6\u09d8-\u09db\u09de\u09e4-\u09e5\u09f2-\u0a01\u0a03-\u0a04\u0a0b-\u0a0e\u0a11-\u0a12\u0a29\u0a31\u0a34\u0a37\u0a3a-\u0a3b\u0a3d\u0a43-\u0a46\u0a49-\u0a4a\u0a4e-\u0a58\u0a5d\u0a5f-\u0a65\u0a75-\u0a80\u0a84\u0a8c\u0a8e\u0a92\u0aa9\u0ab1\u0ab4\u0aba-\u0abb\u0ac6\u0aca\u0ace-\u0adf\u0ae1-\u0ae5\u0af0-\u0b00\u0b04\u0b0d-\u0b0e\u0b11-\u0b12\u0b29\u0b31\u0b34-\u0b35\u0b3a-\u0b3b\u0b44-\u0b46\u0b49-\u0b4a\u0b4e-\u0b55\u0b58-\u0b5b\u0b5e\u0b62-\u0b65\u0b70-\u0b81\u0b84\u0b8b-\u0b8d\u0b91\u0b96-\u0b98\u0b9b\u0b9d\u0ba0-\u0ba2\u0ba5-\u0ba7\u0bab-\u0bad\u0bb6\u0bba-\u0bbd\u0bc3-\u0bc5\u0bc9\u0bce-\u0bd6\u0bd8-\u0be6\u0bf0-\u0c00\u0c04\u0c0d\u0c11\u0c29\u0c34\u0c3a-\u0c3d\u0c45\u0c49\u0c4e-\u0c54\u0c57-\u0c5f\u0c62-\u0c65\u0c70-\u0c81\u0c84\u0c8d\u0c91\u0ca9\u0cb4\u0cba-\u0cbd\u0cc5\u0cc9\u0cce-\u0cd4\u0cd7-\u0cdd\u0cdf\u0ce2-\u0ce5\u0cf0-\u0d01\u0d04\u0d0d\u0d11\u0d29\u0d3a-\u0d3d\u0d44-\u0d45\u0d49\u0d4e-\u0d56\u0d58-\u0d5f\u0d62-\u0d65\u0d70-\u0e00\u0e2f\u0e3b-\u0e3f\u0e4f\u0e5a-\u0e80\u0e83\u0e85-\u0e86\u0e89\u0e8b-\u0e8c\u0e8e-\u0e93\u0e98\u0ea0\u0ea4\u0ea6\u0ea8-\u0ea9\u0eac\u0eaf\u0eba\u0ebe-\u0ebf\u0ec5\u0ec7\u0ece-\u0ecf\u0eda-\u0f17\u0f1a-\u0f1f\u0f2a-\u0f34\u0f36\u0f38\u0f3a-\u0f3d\u0f48\u0f6a-\u0f70\u0f85\u0f8c-\u0f8f\u0f96\u0f98\u0fae-\u0fb0\u0fb8\u0fba-\u109f\u10c6-\u10cf\u10f7-\u10ff\u1101\u1104\u1108\u110a\u110d\u1113-\u113b\u113d\u113f\u1141-\u114b\u114d\u114f\u1151-\u1153\u1156-\u1158\u115a-\u115e\u1162\u1164\u1166\u1168\u116a-\u116c\u116f-\u1171\u1174\u1176-\u119d\u119f-\u11a7\u11a9-\u11aa\u11ac-\u11ad\u11b0-\u11b6\u11b9\u11bb\u11c3-\u11ea\u11ec-\u11ef\u11f1-\u11f8\u11fa-\u1dff\u1e9c-\u1e9f\u1efa-\u1eff\u1f16-\u1f17\u1f1e-\u1f1f\u1f46-\u1f47\u1f4e-\u1f4f\u1f58\u1f5a\u1f5c\u1f5e\u1f7e-\u1f7f\u1fb5\u1fbd\u1fbf-\u1fc1\u1fc5\u1fcd-\u1fcf\u1fd4-\u1fd5\u1fdc-\u1fdf\u1fed-\u1ff1\u1ff5\u1ffd-\u20cf\u20dd-\u20e0\u20e2-\u2125\u2127-\u2129\u212c-\u212d\u212f-\u217f\u2183-\u3004\u3006\u3008-\u3020\u3030\u3036-\u3040\u3095-\u3098\u309b-\u309c\u309f-\u30a0\u30fb\u30ff-\u3104\u312d-\u4dff\u9fa6-\uabff\ud7a4-\uffff]')  # noqa

nonXmlNameFirstBMPRegexp = re.compile('[\x00-@\\[-\\^`\\{-\xbf\xd7\xf7\u0132-\u0133\u013f-\u0140\u0149\u017f\u01c4-\u01cc\u01f1-\u01f3\u01f6-\u01f9\u0218-\u024f\u02a9-\u02ba\u02c2-\u0385\u0387\u038b\u038d\u03a2\u03cf\u03d7-\u03d9\u03db\u03dd\u03df\u03e1\u03f4-\u0400\u040d\u0450\u045d\u0482-\u048f\u04c5-\u04c6\u04c9-\u04ca\u04cd-\u04cf\u04ec-\u04ed\u04f6-\u04f7\u04fa-\u0530\u0557-\u0558\u055a-\u0560\u0587-\u05cf\u05eb-\u05ef\u05f3-\u0620\u063b-\u0640\u064b-\u0670\u06b8-\u06b9\u06bf\u06cf\u06d4\u06d6-\u06e4\u06e7-\u0904\u093a-\u093c\u093e-\u0957\u0962-\u0984\u098d-\u098e\u0991-\u0992\u09a9\u09b1\u09b3-\u09b5\u09ba-\u09db\u09de\u09e2-\u09ef\u09f2-\u0a04\u0a0b-\u0a0e\u0a11-\u0a12\u0a29\u0a31\u0a34\u0a37\u0a3a-\u0a58\u0a5d\u0a5f-\u0a71\u0a75-\u0a84\u0a8c\u0a8e\u0a92\u0aa9\u0ab1\u0ab4\u0aba-\u0abc\u0abe-\u0adf\u0ae1-\u0b04\u0b0d-\u0b0e\u0b11-\u0b12\u0b29\u0b31\u0b34-\u0b35\u0b3a-\u0b3c\u0b3e-\u0b5b\u0b5e\u0b62-\u0b84\u0b8b-\u0b8d\u0b91\u0b96-\u0b98\u0b9b\u0b9d\u0ba0-\u0ba2\u0ba5-\u0ba7\u0bab-\u0bad\u0bb6\u0bba-\u0c04\u0c0d\u0c11\u0c29\u0c34\u0c3a-\u0c5f\u0c62-\u0c84\u0c8d\u0c91\u0ca9\u0cb4\u0cba-\u0cdd\u0cdf\u0ce2-\u0d04\u0d0d\u0d11\u0d29\u0d3a-\u0d5f\u0d62-\u0e00\u0e2f\u0e31\u0e34-\u0e3f\u0e46-\u0e80\u0e83\u0e85-\u0e86\u0e89\u0e8b-\u0e8c\u0e8e-\u0e93\u0e98\u0ea0\u0ea4\u0ea6\u0ea8-\u0ea9\u0eac\u0eaf\u0eb1\u0eb4-\u0ebc\u0ebe-\u0ebf\u0ec5-\u0f3f\u0f48\u0f6a-\u109f\u10c6-\u10cf\u10f7-\u10ff\u1101\u1104\u1108\u110a\u110d\u1113-\u113b\u113d\u113f\u1141-\u114b\u114d\u114f\u1151-\u1153\u1156-\u1158\u115a-\u115e\u1162\u1164\u1166\u1168\u116a-\u116c\u116f-\u1171\u1174\u1176-\u119d\u119f-\u11a7\u11a9-\u11aa\u11ac-\u11ad\u11b0-\u11b6\u11b9\u11bb\u11c3-\u11ea\u11ec-\u11ef\u11f1-\u11f8\u11fa-\u1dff\u1e9c-\u1e9f\u1efa-\u1eff\u1f16-\u1f17\u1f1e-\u1f1f\u1f46-\u1f47\u1f4e-\u1f4f\u1f58\u1f5a\u1f5c\u1f5e\u1f7e-\u1f7f\u1fb5\u1fbd\u1fbf-\u1fc1\u1fc5\u1fcd-\u1fcf\u1fd4-\u1fd5\u1fdc-\u1fdf\u1fed-\u1ff1\u1ff5\u1ffd-\u2125\u2127-\u2129\u212c-\u212d\u212f-\u217f\u2183-\u3006\u3008-\u3020\u302a-\u3040\u3095-\u30a0\u30fb-\u3104\u312d-\u4dff\u9fa6-\uabff\ud7a4-\uffff]')  # noqa

# Simpler things
nonPubidCharRegexp = re.compile(r"[^\x20\x0D\x0Aa-zA-Z0-9\-'()+,./:=?;!*#@$_%]")


class InfosetFilter(object):
    replacementRegexp = re.compile(r"U[\dA-F]{5,5}")

    def __init__(self,
                 dropXmlnsLocalName=False,
                 dropXmlnsAttrNs=False,
                 preventDoubleDashComments=False,
                 preventDashAtCommentEnd=False,
                 replaceFormFeedCharacters=True,
                 preventSingleQuotePubid=False):

        self.dropXmlnsLocalName = dropXmlnsLocalName
        self.dropXmlnsAttrNs = dropXmlnsAttrNs

        self.preventDoubleDashComments = preventDoubleDashComments
        self.preventDashAtCommentEnd = preventDashAtCommentEnd

        self.replaceFormFeedCharacters = replaceFormFeedCharacters

        self.preventSingleQuotePubid = preventSingleQuotePubid

        self.replaceCache = {}

    def coerceAttribute(self, name, namespace=None):
        if self.dropXmlnsLocalName and name.startswith("xmlns:"):
            warnings.warn("Attributes cannot begin with xmlns", DataLossWarning)
            return None
        elif (self.dropXmlnsAttrNs and
              namespace == "http://www.w3.org/2000/xmlns/"):
            warnings.warn("Attributes cannot be in the xml namespace", DataLossWarning)
            return None
        else:
            return self.toXmlName(name)

    def coerceElement(self, name):
        return self.toXmlName(name)

    def coerceComment(self, data):
        if self.preventDoubleDashComments:
            while "--" in data:
                warnings.warn("Comments cannot contain adjacent dashes", DataLossWarning)
                data = data.replace("--", "- -")
            if data.endswith("-"):
                warnings.warn("Comments cannot end in a dash", DataLossWarning)
                data += " "
        return data

    def coerceCharacters(self, data):
        if self.replaceFormFeedCharacters:
            for _ in range(data.count("\x0C")):
                warnings.warn("Text cannot contain U+000C", DataLossWarning)
            data = data.replace("\x0C", " ")
        # Other non-xml characters
        return data

    def coercePubid(self, data):
        dataOutput = data
        for char in nonPubidCharRegexp.findall(data):
            warnings.warn("Coercing non-XML pubid", DataLossWarning)
            replacement = self.getReplacementCharacter(char)
            dataOutput = dataOutput.replace(char, replacement)
        if self.preventSingleQuotePubid and dataOutput.find("'") >= 0:
            warnings.warn("Pubid cannot contain single quote", DataLossWarning)
            dataOutput = dataOutput.replace("'", self.getReplacementCharacter("'"))
        return dataOutput

    def toXmlName(self, name):
        nameFirst = name[0]
        nameRest = name[1:]
        m = nonXmlNameFirstBMPRegexp.match(nameFirst)
        if m:
            warnings.warn("Coercing non-XML name", DataLossWarning)
            nameFirstOutput = self.getReplacementCharacter(nameFirst)
        else:
            nameFirstOutput = nameFirst

        nameRestOutput = nameRest
        replaceChars = set(nonXmlNameBMPRegexp.findall(nameRest))
        for char in replaceChars:
            warnings.warn("Coercing non-XML name", DataLossWarning)
            replacement = self.getReplacementCharacter(char)
            nameRestOutput = nameRestOutput.replace(char, replacement)
        return nameFirstOutput + nameRestOutput

    def getReplacementCharacter(self, char):
        if char in self.replaceCache:
            replacement = self.replaceCache[char]
        else:
            replacement = self.escapeChar(char)
        return replacement

    def fromXmlName(self, name):
        for item in set(self.replacementRegexp.findall(name)):
            name = name.replace(item, self.unescapeChar(item))
        return name

    def escapeChar(self, char):
        replacement = "U%05X" % ord(char)
        self.replaceCache[char] = replacement
        return replacement

    def unescapeChar(self, charcode):
        return chr(int(charcode[1:], 16))
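
# Editor's note -- illustrative sketch, not part of the vendored module:
# toXmlName/fromXmlName form a reversible escaping scheme for names that are
# legal in HTML but not in XML (each offending character becomes "U" plus five
# hex digits, and a DataLossWarning is issued), e.g.
#
#     >>> f = InfosetFilter()
#     >>> f.toXmlName("1name")      # a leading digit is not a valid XML name start
#     'U00031name'
#     >>> f.fromXmlName("U00031name")
#     '1name'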
site-packages/pip/_vendor/html5lib/_inputstream.py
from __future__ import absolute_import, division, unicode_literals

from pip._vendor.six import text_type, binary_type
from pip._vendor.six.moves import http_client, urllib

import codecs
import re

from pip._vendor import webencodings

from .constants import EOF, spaceCharacters, asciiLetters, asciiUppercase
from .constants import ReparseException
from . import _utils

from io import StringIO

try:
    from io import BytesIO
except ImportError:
    BytesIO = StringIO

# Non-unicode versions of constants for use in the pre-parser
spaceCharactersBytes = frozenset([item.encode("ascii") for item in spaceCharacters])
asciiLettersBytes = frozenset([item.encode("ascii") for item in asciiLetters])
asciiUppercaseBytes = frozenset([item.encode("ascii") for item in asciiUppercase])
spacesAngleBrackets = spaceCharactersBytes | frozenset([b">", b"<"])


invalid_unicode_no_surrogate = "[\u0001-\u0008\u000B\u000E-\u001F\u007F-\u009F\uFDD0-\uFDEF\uFFFE\uFFFF\U0001FFFE\U0001FFFF\U0002FFFE\U0002FFFF\U0003FFFE\U0003FFFF\U0004FFFE\U0004FFFF\U0005FFFE\U0005FFFF\U0006FFFE\U0006FFFF\U0007FFFE\U0007FFFF\U0008FFFE\U0008FFFF\U0009FFFE\U0009FFFF\U000AFFFE\U000AFFFF\U000BFFFE\U000BFFFF\U000CFFFE\U000CFFFF\U000DFFFE\U000DFFFF\U000EFFFE\U000EFFFF\U000FFFFE\U000FFFFF\U0010FFFE\U0010FFFF]"  # noqa

if _utils.supports_lone_surrogates:
    # Use one extra step of indirection and create surrogates with
    # eval. Not using this indirection would introduce an illegal
    # unicode literal on platforms not supporting such lone
    # surrogates.
    assert invalid_unicode_no_surrogate[-1] == "]" and invalid_unicode_no_surrogate.count("]") == 1
    invalid_unicode_re = re.compile(invalid_unicode_no_surrogate[:-1] +
                                    eval('"\\uD800-\\uDFFF"') +  # pylint:disable=eval-used
                                    "]")
else:
    invalid_unicode_re = re.compile(invalid_unicode_no_surrogate)

non_bmp_invalid_codepoints = set([0x1FFFE, 0x1FFFF, 0x2FFFE, 0x2FFFF, 0x3FFFE,
                                  0x3FFFF, 0x4FFFE, 0x4FFFF, 0x5FFFE, 0x5FFFF,
                                  0x6FFFE, 0x6FFFF, 0x7FFFE, 0x7FFFF, 0x8FFFE,
                                  0x8FFFF, 0x9FFFE, 0x9FFFF, 0xAFFFE, 0xAFFFF,
                                  0xBFFFE, 0xBFFFF, 0xCFFFE, 0xCFFFF, 0xDFFFE,
                                  0xDFFFF, 0xEFFFE, 0xEFFFF, 0xFFFFE, 0xFFFFF,
                                  0x10FFFE, 0x10FFFF])

ascii_punctuation_re = re.compile("[\u0009-\u000D\u0020-\u002F\u003A-\u0040\u005B-\u0060\u007B-\u007E]")

# Cache for charsUntil()
charsUntilRegEx = {}


class BufferedStream(object):
    """Buffering for streams that do not have buffering of their own

    The buffer is implemented as a list of chunks on the assumption that
    joining many strings will be slow since it is O(n**2)
    """

    def __init__(self, stream):
        self.stream = stream
        self.buffer = []
        self.position = [-1, 0]  # chunk number, offset

    def tell(self):
        pos = 0
        for chunk in self.buffer[:self.position[0]]:
            pos += len(chunk)
        pos += self.position[1]
        return pos

    def seek(self, pos):
        assert pos <= self._bufferedBytes()
        offset = pos
        i = 0
        while len(self.buffer[i]) < offset:
            offset -= len(self.buffer[i])
            i += 1
        self.position = [i, offset]

    def read(self, bytes):
        if not self.buffer:
            return self._readStream(bytes)
        elif (self.position[0] == len(self.buffer) and
              self.position[1] == len(self.buffer[-1])):
            return self._readStream(bytes)
        else:
            return self._readFromBuffer(bytes)

    def _bufferedBytes(self):
        return sum([len(item) for item in self.buffer])

    def _readStream(self, bytes):
        data = self.stream.read(bytes)
        self.buffer.append(data)
        self.position[0] += 1
        self.position[1] = len(data)
        return data

    def _readFromBuffer(self, bytes):
        remainingBytes = bytes
        rv = []
        bufferIndex = self.position[0]
        bufferOffset = self.position[1]
        while bufferIndex < len(self.buffer) and remainingBytes != 0:
            assert remainingBytes > 0
            bufferedData = self.buffer[bufferIndex]

            if remainingBytes <= len(bufferedData) - bufferOffset:
                bytesToRead = remainingBytes
                self.position = [bufferIndex, bufferOffset + bytesToRead]
            else:
                bytesToRead = len(bufferedData) - bufferOffset
                self.position = [bufferIndex, len(bufferedData)]
                bufferIndex += 1
            rv.append(bufferedData[bufferOffset:bufferOffset + bytesToRead])
            remainingBytes -= bytesToRead

            bufferOffset = 0

        if remainingBytes:
            rv.append(self._readStream(remainingBytes))

        return b"".join(rv)


def HTMLInputStream(source, **kwargs):
    # Work around Python bug #20007: read(0) closes the connection.
    # http://bugs.python.org/issue20007
    if (isinstance(source, http_client.HTTPResponse) or
        # Also check for addinfourl wrapping HTTPResponse
        (isinstance(source, urllib.response.addbase) and
         isinstance(source.fp, http_client.HTTPResponse))):
        isUnicode = False
    elif hasattr(source, "read"):
        isUnicode = isinstance(source.read(0), text_type)
    else:
        isUnicode = isinstance(source, text_type)

    if isUnicode:
        encodings = [x for x in kwargs if x.endswith("_encoding")]
        if encodings:
            raise TypeError("Cannot set an encoding with a unicode input, set %r" % encodings)

        return HTMLUnicodeInputStream(source, **kwargs)
    else:
        return HTMLBinaryInputStream(source, **kwargs)
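
# Editor's note -- illustrative sketch, not part of the vendored module: the
# factory picks the stream class from the input type -- text (or a file object
# returning text) gets HTMLUnicodeInputStream, while bytes get
# HTMLBinaryInputStream with its BOM/meta/chardet encoding sniffing, e.g.
#
#     >>> HTMLInputStream("<p>hi").char()
#     '<'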


class HTMLUnicodeInputStream(object):
    """Provides a unicode stream of characters to the HTMLTokenizer.

    This class takes care of character encoding and removing or replacing
    incorrect byte-sequences and also provides column and line tracking.

    """

    _defaultChunkSize = 10240

    def __init__(self, source):
        """Initialises the HTMLInputStream.

        HTMLInputStream(source, [encoding]) -> Normalized stream from source
        for use by html5lib.

        source can be either a file-object, local filename or a string.

        The optional encoding parameter must be a string that indicates
        the encoding.  If specified, that encoding will be used,
        regardless of any BOM or later declaration (such as in a meta
        element)

        """

        if not _utils.supports_lone_surrogates:
            # Such platforms will have already checked for such
            # surrogate errors, so no need to do this checking.
            self.reportCharacterErrors = None
        elif len("\U0010FFFF") == 1:
            self.reportCharacterErrors = self.characterErrorsUCS4
        else:
            self.reportCharacterErrors = self.characterErrorsUCS2

        # List of where new lines occur
        self.newLines = [0]

        self.charEncoding = (lookupEncoding("utf-8"), "certain")
        self.dataStream = self.openStream(source)

        self.reset()

    def reset(self):
        self.chunk = ""
        self.chunkSize = 0
        self.chunkOffset = 0
        self.errors = []

        # number of (complete) lines in previous chunks
        self.prevNumLines = 0
        # number of columns in the last line of the previous chunk
        self.prevNumCols = 0

        # Deal with CR LF and surrogates split over chunk boundaries
        self._bufferedCharacter = None

    def openStream(self, source):
        """Produces a file object from source.

        source can be either a file object, local filename or a string.

        """
        # Already a file object
        if hasattr(source, 'read'):
            stream = source
        else:
            stream = StringIO(source)

        return stream

    def _position(self, offset):
        chunk = self.chunk
        nLines = chunk.count('\n', 0, offset)
        positionLine = self.prevNumLines + nLines
        lastLinePos = chunk.rfind('\n', 0, offset)
        if lastLinePos == -1:
            positionColumn = self.prevNumCols + offset
        else:
            positionColumn = offset - (lastLinePos + 1)
        return (positionLine, positionColumn)

    def position(self):
        """Returns (line, col) of the current position in the stream."""
        line, col = self._position(self.chunkOffset)
        return (line + 1, col)

    def char(self):
        """ Read one character from the stream or queue if available. Return
            EOF when EOF is reached.
        """
        # Read a new chunk from the input stream if necessary
        if self.chunkOffset >= self.chunkSize:
            if not self.readChunk():
                return EOF

        chunkOffset = self.chunkOffset
        char = self.chunk[chunkOffset]
        self.chunkOffset = chunkOffset + 1

        return char

    def readChunk(self, chunkSize=None):
        if chunkSize is None:
            chunkSize = self._defaultChunkSize

        self.prevNumLines, self.prevNumCols = self._position(self.chunkSize)

        self.chunk = ""
        self.chunkSize = 0
        self.chunkOffset = 0

        data = self.dataStream.read(chunkSize)

        # Deal with CR LF and surrogates broken across chunks
        if self._bufferedCharacter:
            data = self._bufferedCharacter + data
            self._bufferedCharacter = None
        elif not data:
            # We have no more data, bye-bye stream
            return False

        if len(data) > 1:
            lastv = ord(data[-1])
            if lastv == 0x0D or 0xD800 <= lastv <= 0xDBFF:
                self._bufferedCharacter = data[-1]
                data = data[:-1]

        if self.reportCharacterErrors:
            self.reportCharacterErrors(data)

        # Replace invalid characters
        data = data.replace("\r\n", "\n")
        data = data.replace("\r", "\n")

        self.chunk = data
        self.chunkSize = len(data)

        return True

    def characterErrorsUCS4(self, data):
        for _ in range(len(invalid_unicode_re.findall(data))):
            self.errors.append("invalid-codepoint")

    def characterErrorsUCS2(self, data):
        # Someone picked the wrong compile option
        # You lose
        skip = False
        for match in invalid_unicode_re.finditer(data):
            if skip:
                # only skip the single low-surrogate match belonging to the
                # pair handled on the previous iteration
                skip = False
                continue
            codepoint = ord(match.group())
            pos = match.start()
            # Pretty sure there should be endianness issues here
            if _utils.isSurrogatePair(data[pos:pos + 2]):
                # We have a surrogate pair!
                char_val = _utils.surrogatePairToCodepoint(data[pos:pos + 2])
                if char_val in non_bmp_invalid_codepoints:
                    self.errors.append("invalid-codepoint")
                skip = True
            elif (codepoint >= 0xD800 and codepoint <= 0xDFFF and
                  pos == len(data) - 1):
                self.errors.append("invalid-codepoint")
            else:
                skip = False
                self.errors.append("invalid-codepoint")

    def charsUntil(self, characters, opposite=False):
        """ Returns a string of characters from the stream up to but not
        including any character in 'characters' or EOF. 'characters' must be
        a container that supports the 'in' method and iteration over its
        characters.
        """

        # Use a cache of regexps to find the required characters
        try:
            chars = charsUntilRegEx[(characters, opposite)]
        except KeyError:
            if __debug__:
                for c in characters:
                    assert(ord(c) < 128)
            regex = "".join(["\\x%02x" % ord(c) for c in characters])
            if not opposite:
                regex = "^%s" % regex
            chars = charsUntilRegEx[(characters, opposite)] = re.compile("[%s]+" % regex)

        rv = []

        while True:
            # Find the longest matching prefix
            m = chars.match(self.chunk, self.chunkOffset)
            if m is None:
                # If nothing matched, and it wasn't because we ran out of chunk,
                # then stop
                if self.chunkOffset != self.chunkSize:
                    break
            else:
                end = m.end()
                # If not the whole chunk matched, return everything
                # up to the part that didn't match
                if end != self.chunkSize:
                    rv.append(self.chunk[self.chunkOffset:end])
                    self.chunkOffset = end
                    break
            # If the whole remainder of the chunk matched,
            # use it all and read the next chunk
            rv.append(self.chunk[self.chunkOffset:])
            if not self.readChunk():
                # Reached EOF
                break

        r = "".join(rv)
        return r
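
    # Editor's note -- illustrative sketch, not part of the vendored module:
    # charsUntil is the tokenizer's bulk reader, e.g.
    #
    #     >>> s = HTMLUnicodeInputStream("hello<b>")
    #     >>> s.charsUntil("<")
    #     'hello'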

    def unget(self, char):
        # Only one character is allowed to be ungotten at once - it must
        # be consumed again before any further call to unget
        if char is not None:
            if self.chunkOffset == 0:
                # unget is called quite rarely, so it's a good idea to do
                # more work here if it saves a bit of work in the frequently
                # called char and charsUntil.
                # So, just prepend the ungotten character onto the current
                # chunk:
                self.chunk = char + self.chunk
                self.chunkSize += 1
            else:
                self.chunkOffset -= 1
                assert self.chunk[self.chunkOffset] == char


class HTMLBinaryInputStream(HTMLUnicodeInputStream):
    """Provides a unicode stream of characters to the HTMLTokenizer.

    This class takes care of character encoding and removing or replacing
    incorrect byte-sequences and also provides column and line tracking.

    """

    def __init__(self, source, override_encoding=None, transport_encoding=None,
                 same_origin_parent_encoding=None, likely_encoding=None,
                 default_encoding="windows-1252", useChardet=True):
        """Initialises the HTMLInputStream.

        HTMLInputStream(source, [encoding]) -> Normalized stream from source
        for use by html5lib.

        source can be either a file-object, local filename or a string.

        The optional encoding parameter must be a string that indicates
        the encoding.  If specified, that encoding will be used,
        regardless of any BOM or later declaration (such as in a meta
        element)

        """
        # Raw Stream - for unicode objects this will encode to utf-8 and set
        #              self.charEncoding as appropriate
        self.rawStream = self.openStream(source)

        HTMLUnicodeInputStream.__init__(self, self.rawStream)

        # Encoding Information
        # Number of bytes to use when looking for a meta element with
        # encoding information
        self.numBytesMeta = 1024
        # Number of bytes to use when detecting the encoding with chardet
        self.numBytesChardet = 100
        # Things from args
        self.override_encoding = override_encoding
        self.transport_encoding = transport_encoding
        self.same_origin_parent_encoding = same_origin_parent_encoding
        self.likely_encoding = likely_encoding
        self.default_encoding = default_encoding

        # Determine encoding
        self.charEncoding = self.determineEncoding(useChardet)
        assert self.charEncoding[0] is not None

        # Set up the decoded data stream via reset() (overridden below)
        self.reset()

    def reset(self):
        self.dataStream = self.charEncoding[0].codec_info.streamreader(self.rawStream, 'replace')
        HTMLUnicodeInputStream.reset(self)

    def openStream(self, source):
        """Produces a file object from source.

        source can be either a file object, local filename or a string.

        """
        # Already a file object
        if hasattr(source, 'read'):
            stream = source
        else:
            stream = BytesIO(source)

        try:
            stream.seek(stream.tell())
        except:  # pylint:disable=bare-except
            stream = BufferedStream(stream)

        return stream

    def determineEncoding(self, chardet=True):
        # BOMs take precedence over everything
        # This will also read past the BOM if present
        charEncoding = self.detectBOM(), "certain"
        if charEncoding[0] is not None:
            return charEncoding

        # If we've been overridden, we've been overridden
        charEncoding = lookupEncoding(self.override_encoding), "certain"
        if charEncoding[0] is not None:
            return charEncoding

        # Now check the transport layer
        charEncoding = lookupEncoding(self.transport_encoding), "certain"
        if charEncoding[0] is not None:
            return charEncoding

        # Look for meta elements with encoding information
        charEncoding = self.detectEncodingMeta(), "tentative"
        if charEncoding[0] is not None:
            return charEncoding

        # Parent document encoding
        charEncoding = lookupEncoding(self.same_origin_parent_encoding), "tentative"
        if charEncoding[0] is not None and not charEncoding[0].name.startswith("utf-16"):
            return charEncoding

        # "likely" encoding
        charEncoding = lookupEncoding(self.likely_encoding), "tentative"
        if charEncoding[0] is not None:
            return charEncoding

        # Guess with chardet, if available
        if chardet:
            try:
                from chardet.universaldetector import UniversalDetector
            except ImportError:
                pass
            else:
                buffers = []
                detector = UniversalDetector()
                while not detector.done:
                    buffer = self.rawStream.read(self.numBytesChardet)
                    assert isinstance(buffer, bytes)
                    if not buffer:
                        break
                    buffers.append(buffer)
                    detector.feed(buffer)
                detector.close()
                encoding = lookupEncoding(detector.result['encoding'])
                self.rawStream.seek(0)
                if encoding is not None:
                    return encoding, "tentative"

        # Try the default encoding
        charEncoding = lookupEncoding(self.default_encoding), "tentative"
        if charEncoding[0] is not None:
            return charEncoding

        # Fallback to html5lib's default if even that hasn't worked
        return lookupEncoding("windows-1252"), "tentative"

    def changeEncoding(self, newEncoding):
        assert self.charEncoding[1] != "certain"
        newEncoding = lookupEncoding(newEncoding)
        if newEncoding is None:
            return
        if newEncoding.name in ("utf-16be", "utf-16le"):
            newEncoding = lookupEncoding("utf-8")
            assert newEncoding is not None
        elif newEncoding == self.charEncoding[0]:
            self.charEncoding = (self.charEncoding[0], "certain")
        else:
            self.rawStream.seek(0)
            self.charEncoding = (newEncoding, "certain")
            self.reset()
            raise ReparseException("Encoding changed from %s to %s" % (self.charEncoding[0], newEncoding))

    def detectBOM(self):
        """Attempts to detect at BOM at the start of the stream. If
        an encoding can be determined from the BOM return the name of the
        encoding otherwise return None"""
        bomDict = {
            codecs.BOM_UTF8: 'utf-8',
            codecs.BOM_UTF16_LE: 'utf-16le', codecs.BOM_UTF16_BE: 'utf-16be',
            codecs.BOM_UTF32_LE: 'utf-32le', codecs.BOM_UTF32_BE: 'utf-32be'
        }

        # Go to beginning of file and read in 4 bytes
        string = self.rawStream.read(4)
        assert isinstance(string, bytes)

        # Try detecting the BOM using bytes from the string
        encoding = bomDict.get(string[:3])         # UTF-8
        seek = 3
        if not encoding:
            # Need to detect UTF-32 before UTF-16
            encoding = bomDict.get(string)         # UTF-32
            seek = 4
            if not encoding:
                encoding = bomDict.get(string[:2])  # UTF-16
                seek = 2

        # Set the read position past the BOM if one was found, otherwise
        # set it to the start of the stream
        if encoding:
            self.rawStream.seek(seek)
            return lookupEncoding(encoding)
        else:
            self.rawStream.seek(0)
            return None
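
    # Editor's note -- illustrative sketch, not part of the vendored module:
    # with a BOM present the encoding is settled immediately, e.g.
    #
    #     >>> s = HTMLBinaryInputStream(codecs.BOM_UTF8 + b"<p>hi")
    #     >>> s.charEncoding[0].name, s.charEncoding[1]
    #     ('utf-8', 'certain')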

    def detectEncodingMeta(self):
        """Report the encoding declared by the meta element
        """
        buffer = self.rawStream.read(self.numBytesMeta)
        assert isinstance(buffer, bytes)
        parser = EncodingParser(buffer)
        self.rawStream.seek(0)
        encoding = parser.getEncoding()

        if encoding is not None and encoding.name in ("utf-16be", "utf-16le"):
            encoding = lookupEncoding("utf-8")

        return encoding


class EncodingBytes(bytes):
    """String-like object with an associated position and various extra methods
    If the position is ever greater than the string length then an exception is
    raised"""
    def __new__(self, value):
        assert isinstance(value, bytes)
        return bytes.__new__(self, value.lower())

    def __init__(self, value):
        # pylint:disable=unused-argument
        self._position = -1

    def __iter__(self):
        return self

    def __next__(self):
        p = self._position = self._position + 1
        if p >= len(self):
            raise StopIteration
        elif p < 0:
            raise TypeError
        return self[p:p + 1]

    def next(self):
        # Py2 compat
        return self.__next__()

    def previous(self):
        p = self._position
        if p >= len(self):
            raise StopIteration
        elif p < 0:
            raise TypeError
        self._position = p = p - 1
        return self[p:p + 1]

    def setPosition(self, position):
        if self._position >= len(self):
            raise StopIteration
        self._position = position

    def getPosition(self):
        if self._position >= len(self):
            raise StopIteration
        if self._position >= 0:
            return self._position
        else:
            return None

    position = property(getPosition, setPosition)

    def getCurrentByte(self):
        return self[self.position:self.position + 1]

    currentByte = property(getCurrentByte)

    def skip(self, chars=spaceCharactersBytes):
        """Skip past a list of characters"""
        p = self.position               # use property for the error-checking
        while p < len(self):
            c = self[p:p + 1]
            if c not in chars:
                self._position = p
                return c
            p += 1
        self._position = p
        return None

    def skipUntil(self, chars):
        p = self.position
        while p < len(self):
            c = self[p:p + 1]
            if c in chars:
                self._position = p
                return c
            p += 1
        self._position = p
        return None

    def matchBytes(self, bytes):
        """Look for a sequence of bytes at the start of a string. If the bytes
        are found return True and advance the position to the byte after the
        match. Otherwise return False and leave the position alone"""
        p = self.position
        data = self[p:p + len(bytes)]
        rv = data.startswith(bytes)
        if rv:
            self.position += len(bytes)
        return rv

    def jumpTo(self, bytes):
        """Look for the next sequence of bytes matching a given sequence. If
        a match is found advance the position to the last byte of the match"""
        newPosition = self[self.position:].find(bytes)
        if newPosition > -1:
            # XXX: This is ugly, but I can't see a nicer way to fix this.
            if self._position == -1:
                self._position = 0
            self._position += (newPosition + len(bytes) - 1)
            return True
        else:
            raise StopIteration
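
# Editor's note -- illustrative sketch, not part of the vendored module:
# EncodingBytes is a lower-cased bytes subclass with a movable cursor, e.g.
#
#     >>> eb = EncodingBytes(b"<META x>")
#     >>> next(eb)
#     b'<'
#     >>> eb.matchBytes(b"<meta")
#     True
#     >>> eb.currentByte
#     b' '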


class EncodingParser(object):
    """Mini parser for detecting character encoding from meta elements"""

    def __init__(self, data):
        """string - the data to work on for encoding detection"""
        self.data = EncodingBytes(data)
        self.encoding = None

    def getEncoding(self):
        methodDispatch = (
            (b"<!--", self.handleComment),
            (b"<meta", self.handleMeta),
            (b"</", self.handlePossibleEndTag),
            (b"<!", self.handleOther),
            (b"<?", self.handleOther),
            (b"<", self.handlePossibleStartTag))
        for _ in self.data:
            keepParsing = True
            for key, method in methodDispatch:
                if self.data.matchBytes(key):
                    try:
                        keepParsing = method()
                        break
                    except StopIteration:
                        keepParsing = False
                        break
            if not keepParsing:
                break

        return self.encoding

    def handleComment(self):
        """Skip over comments"""
        return self.data.jumpTo(b"-->")

    def handleMeta(self):
        if self.data.currentByte not in spaceCharactersBytes:
            # <meta was not followed by a space, so just keep going
            return True
        # We have a valid meta element we want to search for attributes
        hasPragma = False
        pendingEncoding = None
        while True:
            # Try to find the next attribute after the current position
            attr = self.getAttribute()
            if attr is None:
                return True
            else:
                if attr[0] == b"http-equiv":
                    hasPragma = attr[1] == b"content-type"
                    if hasPragma and pendingEncoding is not None:
                        self.encoding = pendingEncoding
                        return False
                elif attr[0] == b"charset":
                    tentativeEncoding = attr[1]
                    codec = lookupEncoding(tentativeEncoding)
                    if codec is not None:
                        self.encoding = codec
                        return False
                elif attr[0] == b"content":
                    contentParser = ContentAttrParser(EncodingBytes(attr[1]))
                    tentativeEncoding = contentParser.parse()
                    if tentativeEncoding is not None:
                        codec = lookupEncoding(tentativeEncoding)
                        if codec is not None:
                            if hasPragma:
                                self.encoding = codec
                                return False
                            else:
                                pendingEncoding = codec

    def handlePossibleStartTag(self):
        return self.handlePossibleTag(False)

    def handlePossibleEndTag(self):
        next(self.data)
        return self.handlePossibleTag(True)

    def handlePossibleTag(self, endTag):
        data = self.data
        if data.currentByte not in asciiLettersBytes:
            # If the next byte is not an ascii letter either ignore this
            # fragment (possible start tag case) or treat it according to
            # handleOther
            if endTag:
                data.previous()
                self.handleOther()
            return True

        c = data.skipUntil(spacesAngleBrackets)
        if c == b"<":
            # return to the first step in the overall "two step" algorithm
            # reprocessing the < byte
            data.previous()
        else:
            # Read all attributes
            attr = self.getAttribute()
            while attr is not None:
                attr = self.getAttribute()
        return True

    def handleOther(self):
        return self.data.jumpTo(b">")

    def getAttribute(self):
        """Return a name,value pair for the next attribute in the stream,
        if one is found, or None"""
        data = self.data
        # Step 1 (skip chars)
        c = data.skip(spaceCharactersBytes | frozenset([b"/"]))
        assert c is None or len(c) == 1
        # Step 2
        if c in (b">", None):
            return None
        # Step 3
        attrName = []
        attrValue = []
        # Step 4 attribute name
        while True:
            if c == b"=" and attrName:
                break
            elif c in spaceCharactersBytes:
                # Step 6!
                c = data.skip()
                break
            elif c in (b"/", b">"):
                return b"".join(attrName), b""
            elif c in asciiUppercaseBytes:
                attrName.append(c.lower())
            elif c is None:
                return None
            else:
                attrName.append(c)
            # Step 5
            c = next(data)
        # Step 7
        if c != b"=":
            data.previous()
            return b"".join(attrName), b""
        # Step 8
        next(data)
        # Step 9
        c = data.skip()
        # Step 10
        if c in (b"'", b'"'):
            # 10.1
            quoteChar = c
            while True:
                # 10.2
                c = next(data)
                # 10.3
                if c == quoteChar:
                    next(data)
                    return b"".join(attrName), b"".join(attrValue)
                # 10.4
                elif c in asciiUppercaseBytes:
                    attrValue.append(c.lower())
                # 10.5
                else:
                    attrValue.append(c)
        elif c == b">":
            return b"".join(attrName), b""
        elif c in asciiUppercaseBytes:
            attrValue.append(c.lower())
        elif c is None:
            return None
        else:
            attrValue.append(c)
        # Step 11
        while True:
            c = next(data)
            if c in spacesAngleBrackets:
                return b"".join(attrName), b"".join(attrValue)
            elif c in asciiUppercaseBytes:
                attrValue.append(c.lower())
            elif c is None:
                return None
            else:
                attrValue.append(c)


class ContentAttrParser(object):
    def __init__(self, data):
        assert isinstance(data, bytes)
        self.data = data

    def parse(self):
        try:
            # Check if the attr name is charset
            # otherwise return
            self.data.jumpTo(b"charset")
            self.data.position += 1
            self.data.skip()
            if self.data.currentByte != b"=":
                # If there is no = sign keep looking for attrs
                return None
            self.data.position += 1
            self.data.skip()
            # Look for an encoding between matching quote marks
            if self.data.currentByte in (b'"', b"'"):
                quoteMark = self.data.currentByte
                self.data.position += 1
                oldPosition = self.data.position
                if self.data.jumpTo(quoteMark):
                    return self.data[oldPosition:self.data.position]
                else:
                    return None
            else:
                # Unquoted value
                oldPosition = self.data.position
                try:
                    self.data.skipUntil(spaceCharactersBytes)
                    return self.data[oldPosition:self.data.position]
                except StopIteration:
                    # Return the whole remaining value
                    return self.data[oldPosition:]
        except StopIteration:
            return None


def lookupEncoding(encoding):
    """Return the python codec name corresponding to an encoding or None if the
    string doesn't correspond to a valid encoding."""
    if isinstance(encoding, binary_type):
        try:
            encoding = encoding.decode("ascii")
        except UnicodeDecodeError:
            return None

    if encoding is not None:
        try:
            return webencodings.lookup(encoding)
        except AttributeError:
            return None
    else:
        return None
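

# Illustrative sketch (not part of the vendored module): how ContentAttrParser
# and lookupEncoding combine to sniff a charset from a <meta content="...">
# value.  Assumes EncodingBytes (defined earlier in this file) behaves as used
# by handleMeta above.
def _example_sniff_content_attr(value=b"text/html; charset=utf-8"):
    raw = ContentAttrParser(EncodingBytes(value)).parse()  # expected: b"utf-8"
    return lookupEncoding(raw)  # a webencodings codec, or None if unknown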
site-packages/pip/_vendor/html5lib/treewalkers/etree.py000064400000011114147511334600017240 0ustar00from __future__ import absolute_import, division, unicode_literals

try:
    from collections import OrderedDict
except ImportError:
    try:
        from ordereddict import OrderedDict
    except ImportError:
        OrderedDict = dict

import re

from pip._vendor.six import string_types

from . import base
from .._utils import moduleFactoryFactory

tag_regexp = re.compile("{([^}]*)}(.*)")


def getETreeBuilder(ElementTreeImplementation):
    ElementTree = ElementTreeImplementation
    ElementTreeCommentType = ElementTree.Comment("asd").tag

    class TreeWalker(base.NonRecursiveTreeWalker):  # pylint:disable=unused-variable
        """Given the particular ElementTree representation, this implementation,
        to avoid using recursion, returns "nodes" as tuples with the following
        content:

        1. The current element

        2. The index of the element relative to its parent

        3. A stack of ancestor elements

        4. A flag "text", "tail" or None to indicate if the current node is a
           text node; either the text or tail of the current element (1)
        """
        def getNodeDetails(self, node):
            if isinstance(node, tuple):  # It might be the root Element
                elt, _, _, flag = node
                if flag in ("text", "tail"):
                    return base.TEXT, getattr(elt, flag)
                else:
                    node = elt

            if not hasattr(node, "tag"):
                node = node.getroot()

            if node.tag in ("DOCUMENT_ROOT", "DOCUMENT_FRAGMENT"):
                return (base.DOCUMENT,)

            elif node.tag == "<!DOCTYPE>":
                return (base.DOCTYPE, node.text,
                        node.get("publicId"), node.get("systemId"))

            elif node.tag == ElementTreeCommentType:
                return base.COMMENT, node.text

            else:
                assert isinstance(node.tag, string_types), type(node.tag)
                # This is assumed to be an ordinary element
                match = tag_regexp.match(node.tag)
                if match:
                    namespace, tag = match.groups()
                else:
                    namespace = None
                    tag = node.tag
                attrs = OrderedDict()
                for name, value in list(node.attrib.items()):
                    match = tag_regexp.match(name)
                    if match:
                        attrs[(match.group(1), match.group(2))] = value
                    else:
                        attrs[(None, name)] = value
                return (base.ELEMENT, namespace, tag,
                        attrs, len(node) or node.text)

        def getFirstChild(self, node):
            if isinstance(node, tuple):
                element, key, parents, flag = node
            else:
                element, key, parents, flag = node, None, [], None

            if flag in ("text", "tail"):
                return None
            else:
                if element.text:
                    return element, key, parents, "text"
                elif len(element):
                    parents.append(element)
                    return element[0], 0, parents, None
                else:
                    return None

        def getNextSibling(self, node):
            if isinstance(node, tuple):
                element, key, parents, flag = node
            else:
                return None

            if flag == "text":
                if len(element):
                    parents.append(element)
                    return element[0], 0, parents, None
                else:
                    return None
            else:
                if element.tail and flag != "tail":
                    return element, key, parents, "tail"
                elif key < len(parents[-1]) - 1:
                    return parents[-1][key + 1], key + 1, parents, None
                else:
                    return None

        def getParentNode(self, node):
            if isinstance(node, tuple):
                element, key, parents, flag = node
            else:
                return None

            if flag == "text":
                if not parents:
                    return element
                else:
                    return element, key, parents, None
            else:
                parent = parents.pop()
                if not parents:
                    return parent
                else:
                    assert list(parents[-1]).count(parent) == 1
                    return parent, list(parents[-1]).index(parent), parents, None

    return locals()

getETreeModule = moduleFactoryFactory(getETreeBuilder)
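

# Illustrative sketch (not part of the vendored module): obtaining the walker
# class for the standard-library ElementTree implementation via the module
# factory, mirroring how treewalkers.getTreeWalker("etree") resolves it.
def _example_walk_stdlib_etree(markup="<p>hi<br/>there</p>"):
    import xml.etree.ElementTree as ElementTree
    TreeWalkerClass = getETreeModule(ElementTree).TreeWalker
    return list(TreeWalkerClass(ElementTree.fromstring(markup)))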
site-packages/pip/_vendor/html5lib/treewalkers/base.py000064400000011513147511334600017051 0ustar00from __future__ import absolute_import, division, unicode_literals

from xml.dom import Node
from ..constants import namespaces, voidElements, spaceCharacters

__all__ = ["DOCUMENT", "DOCTYPE", "TEXT", "ELEMENT", "COMMENT", "ENTITY", "UNKNOWN",
           "TreeWalker", "NonRecursiveTreeWalker"]

DOCUMENT = Node.DOCUMENT_NODE
DOCTYPE = Node.DOCUMENT_TYPE_NODE
TEXT = Node.TEXT_NODE
ELEMENT = Node.ELEMENT_NODE
COMMENT = Node.COMMENT_NODE
ENTITY = Node.ENTITY_NODE
UNKNOWN = "<#UNKNOWN#>"

spaceCharacters = "".join(spaceCharacters)


class TreeWalker(object):
    def __init__(self, tree):
        self.tree = tree

    def __iter__(self):
        raise NotImplementedError

    def error(self, msg):
        return {"type": "SerializeError", "data": msg}

    def emptyTag(self, namespace, name, attrs, hasChildren=False):
        yield {"type": "EmptyTag", "name": name,
               "namespace": namespace,
               "data": attrs}
        if hasChildren:
            yield self.error("Void element has children")

    def startTag(self, namespace, name, attrs):
        return {"type": "StartTag",
                "name": name,
                "namespace": namespace,
                "data": attrs}

    def endTag(self, namespace, name):
        return {"type": "EndTag",
                "name": name,
                "namespace": namespace}

    def text(self, data):
        # Partition the text into leading whitespace, content, and trailing
        # whitespace so each run can be emitted as its own token.
        middle = data.lstrip(spaceCharacters)
        left = data[:len(data) - len(middle)]
        if left:
            yield {"type": "SpaceCharacters", "data": left}
        data = middle
        middle = data.rstrip(spaceCharacters)
        right = data[len(middle):]
        if middle:
            yield {"type": "Characters", "data": middle}
        if right:
            yield {"type": "SpaceCharacters", "data": right}

    def comment(self, data):
        return {"type": "Comment", "data": data}

    def doctype(self, name, publicId=None, systemId=None):
        return {"type": "Doctype",
                "name": name,
                "publicId": publicId,
                "systemId": systemId}

    def entity(self, name):
        return {"type": "Entity", "name": name}

    def unknown(self, nodeType):
        return self.error("Unknown node type: " + nodeType)


class NonRecursiveTreeWalker(TreeWalker):
    def getNodeDetails(self, node):
        raise NotImplementedError

    def getFirstChild(self, node):
        raise NotImplementedError

    def getNextSibling(self, node):
        raise NotImplementedError

    def getParentNode(self, node):
        raise NotImplementedError

    def __iter__(self):
        currentNode = self.tree
        while currentNode is not None:
            details = self.getNodeDetails(currentNode)
            type, details = details[0], details[1:]
            hasChildren = False

            if type == DOCTYPE:
                yield self.doctype(*details)

            elif type == TEXT:
                for token in self.text(*details):
                    yield token

            elif type == ELEMENT:
                namespace, name, attributes, hasChildren = details
                if (not namespace or namespace == namespaces["html"]) and name in voidElements:
                    for token in self.emptyTag(namespace, name, attributes,
                                               hasChildren):
                        yield token
                    hasChildren = False
                else:
                    yield self.startTag(namespace, name, attributes)

            elif type == COMMENT:
                yield self.comment(details[0])

            elif type == ENTITY:
                yield self.entity(details[0])

            elif type == DOCUMENT:
                hasChildren = True

            else:
                yield self.unknown(details[0])

            if hasChildren:
                firstChild = self.getFirstChild(currentNode)
            else:
                firstChild = None

            if firstChild is not None:
                currentNode = firstChild
            else:
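                # No (more) children: walk back up the tree, emitting EndTag
                # tokens for elements that are not HTML void elements, until a
                # next sibling is found or we arrive back at the root node.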
                while currentNode is not None:
                    details = self.getNodeDetails(currentNode)
                    type, details = details[0], details[1:]
                    if type == ELEMENT:
                        namespace, name, attributes, hasChildren = details
                        if (namespace and namespace != namespaces["html"]) or name not in voidElements:
                            yield self.endTag(namespace, name)
                    if self.tree is currentNode:
                        currentNode = None
                        break
                    nextSibling = self.getNextSibling(currentNode)
                    if nextSibling is not None:
                        currentNode = nextSibling
                        break
                    else:
                        currentNode = self.getParentNode(currentNode)
site-packages/pip/_vendor/html5lib/treewalkers/__init__.py000064400000012650147511334600017701 0ustar00"""A collection of modules for iterating through different kinds of
tree, generating tokens identical to those produced by the tokenizer
module.

To create a tree walker for a new type of tree, you need to
implement a tree walker object (called TreeWalker by convention) that
implements a 'serialize' method taking a tree as its sole argument and
returning an iterator that generates tokens.
"""

from __future__ import absolute_import, division, unicode_literals

from .. import constants
from .._utils import default_etree

__all__ = ["getTreeWalker", "pprint", "dom", "etree", "genshi", "etree_lxml"]

treeWalkerCache = {}


def getTreeWalker(treeType, implementation=None, **kwargs):
    """Get a TreeWalker class for various types of tree with built-in support

    Args:
        treeType (str): the name of the tree type required (case-insensitive).
            Supported values are:

            - "dom": The xml.dom.minidom DOM implementation
            - "etree": A generic walker for tree implementations exposing an
                       elementtree-like interface (known to work with
                       ElementTree, cElementTree and lxml.etree).
            - "lxml": Optimized walker for lxml.etree
            - "genshi": a Genshi stream

        implementation: A module implementing the tree type, e.g.
            xml.etree.ElementTree or cElementTree (currently applies to the
            "etree" tree type only).
    """

    treeType = treeType.lower()
    if treeType not in treeWalkerCache:
        if treeType == "dom":
            from . import dom
            treeWalkerCache[treeType] = dom.TreeWalker
        elif treeType == "genshi":
            from . import genshi
            treeWalkerCache[treeType] = genshi.TreeWalker
        elif treeType == "lxml":
            from . import etree_lxml
            treeWalkerCache[treeType] = etree_lxml.TreeWalker
        elif treeType == "etree":
            from . import etree
            if implementation is None:
                implementation = default_etree
            # XXX: NEVER cache here, caching is done in the etree submodule
            return etree.getETreeModule(implementation, **kwargs).TreeWalker
    return treeWalkerCache.get(treeType)
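

# Illustrative sketch (not part of the vendored module): the "etree" tree type
# accepts the stdlib ElementTree module as the implementation argument
# described in the docstring above.
def _example_token_stream(markup="<p>hi<br/>there</p>"):
    import xml.etree.ElementTree as ElementTree
    walker_cls = getTreeWalker("etree", ElementTree)
    return list(walker_cls(ElementTree.fromstring(markup)))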


def concatenateCharacterTokens(tokens):
    pendingCharacters = []
    for token in tokens:
        type = token["type"]
        if type in ("Characters", "SpaceCharacters"):
            pendingCharacters.append(token["data"])
        else:
            if pendingCharacters:
                yield {"type": "Characters", "data": "".join(pendingCharacters)}
                pendingCharacters = []
            yield token
    if pendingCharacters:
        yield {"type": "Characters", "data": "".join(pendingCharacters)}


def pprint(walker):
    """Pretty printer for tree walkers"""
    output = []
    indent = 0
    for token in concatenateCharacterTokens(walker):
        type = token["type"]
        if type in ("StartTag", "EmptyTag"):
            # tag name
            if token["namespace"] and token["namespace"] != constants.namespaces["html"]:
                if token["namespace"] in constants.prefixes:
                    ns = constants.prefixes[token["namespace"]]
                else:
                    ns = token["namespace"]
                name = "%s %s" % (ns, token["name"])
            else:
                name = token["name"]
            output.append("%s<%s>" % (" " * indent, name))
            indent += 2
            # attributes (sorted for consistent ordering)
            attrs = token["data"]
            for (namespace, localname), value in sorted(attrs.items()):
                if namespace:
                    if namespace in constants.prefixes:
                        ns = constants.prefixes[namespace]
                    else:
                        ns = namespace
                    name = "%s %s" % (ns, localname)
                else:
                    name = localname
                output.append("%s%s=\"%s\"" % (" " * indent, name, value))
            # self-closing
            if type == "EmptyTag":
                indent -= 2

        elif type == "EndTag":
            indent -= 2

        elif type == "Comment":
            output.append("%s<!-- %s -->" % (" " * indent, token["data"]))

        elif type == "Doctype":
            if token["name"]:
                if token["publicId"]:
                    output.append("""%s<!DOCTYPE %s "%s" "%s">""" %
                                  (" " * indent,
                                   token["name"],
                                   token["publicId"],
                                   token["systemId"] if token["systemId"] else ""))
                elif token["systemId"]:
                    output.append("""%s<!DOCTYPE %s "" "%s">""" %
                                  (" " * indent,
                                   token["name"],
                                   token["systemId"]))
                else:
                    output.append("%s<!DOCTYPE %s>" % (" " * indent,
                                                       token["name"]))
            else:
                output.append("%s<!DOCTYPE >" % (" " * indent,))

        elif type == "Characters":
            output.append("%s\"%s\"" % (" " * indent, token["data"]))

        elif type == "SpaceCharacters":
            assert False, "concatenateCharacterTokens should have got rid of all Space tokens"

        else:
            raise ValueError("Unknown token type, %s" % type)

    return "\n".join(output)
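

# Illustrative sketch (not part of the vendored module): combining
# getTreeWalker and pprint on a stdlib minidom tree.
def _example_pprint_dom(markup="<p id='x'>hi<br/>there</p>"):
    from xml.dom.minidom import parseString
    walker_cls = getTreeWalker("dom")
    return pprint(walker_cls(parseString(markup)))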
site-packages/pip/_vendor/html5lib/treewalkers/etree_lxml.py000064400000014245147511334600020304 0ustar00from __future__ import absolute_import, division, unicode_literals
from pip._vendor.six import text_type

from lxml import etree
from ..treebuilders.etree import tag_regexp

from . import base

from .. import _ihatexml


def ensure_str(s):
    if s is None:
        return None
    elif isinstance(s, text_type):
        return s
    else:
        return s.decode("ascii", "strict")


class Root(object):
    def __init__(self, et):
        self.elementtree = et
        self.children = []

        try:
            if et.docinfo.internalDTD:
                self.children.append(Doctype(self,
                                             ensure_str(et.docinfo.root_name),
                                             ensure_str(et.docinfo.public_id),
                                             ensure_str(et.docinfo.system_url)))
        except AttributeError:
            pass

        try:
            node = et.getroot()
        except AttributeError:
            node = et

        while node.getprevious() is not None:
            node = node.getprevious()
        while node is not None:
            self.children.append(node)
            node = node.getnext()

        self.text = None
        self.tail = None

    def __getitem__(self, key):
        return self.children[key]

    def getnext(self):
        return None

    def __len__(self):
        return 1


class Doctype(object):
    def __init__(self, root_node, name, public_id, system_id):
        self.root_node = root_node
        self.name = name
        self.public_id = public_id
        self.system_id = system_id

        self.text = None
        self.tail = None

    def getnext(self):
        return self.root_node.children[1]


class FragmentRoot(Root):
    def __init__(self, children):
        self.children = [FragmentWrapper(self, child) for child in children]
        self.text = self.tail = None

    def getnext(self):
        return None


class FragmentWrapper(object):
    def __init__(self, fragment_root, obj):
        self.root_node = fragment_root
        self.obj = obj
        if hasattr(self.obj, 'text'):
            self.text = ensure_str(self.obj.text)
        else:
            self.text = None
        if hasattr(self.obj, 'tail'):
            self.tail = ensure_str(self.obj.tail)
        else:
            self.tail = None

    def __getattr__(self, name):
        return getattr(self.obj, name)

    def getnext(self):
        siblings = self.root_node.children
        idx = siblings.index(self)
        if idx < len(siblings) - 1:
            return siblings[idx + 1]
        else:
            return None

    def __getitem__(self, key):
        return self.obj[key]

    def __bool__(self):
        return bool(self.obj)

    def getparent(self):
        return None

    def __str__(self):
        return str(self.obj)

    def __unicode__(self):
        return str(self.obj)

    def __len__(self):
        return len(self.obj)


class TreeWalker(base.NonRecursiveTreeWalker):
    def __init__(self, tree):
        # pylint:disable=redefined-variable-type
        if isinstance(tree, list):
            self.fragmentChildren = set(tree)
            tree = FragmentRoot(tree)
        else:
            self.fragmentChildren = set()
            tree = Root(tree)
        base.NonRecursiveTreeWalker.__init__(self, tree)
        self.filter = _ihatexml.InfosetFilter()

    def getNodeDetails(self, node):
        if isinstance(node, tuple):  # Text node
            node, key = node
            assert key in ("text", "tail"), "Text nodes are text or tail, found %s" % key
            return base.TEXT, ensure_str(getattr(node, key))

        elif isinstance(node, Root):
            return (base.DOCUMENT,)

        elif isinstance(node, Doctype):
            return base.DOCTYPE, node.name, node.public_id, node.system_id

        elif isinstance(node, FragmentWrapper) and not hasattr(node, "tag"):
            return base.TEXT, ensure_str(node.obj)

        elif node.tag == etree.Comment:
            return base.COMMENT, ensure_str(node.text)

        elif node.tag == etree.Entity:
            return base.ENTITY, ensure_str(node.text)[1:-1]  # strip &;

        else:
            # This is assumed to be an ordinary element
            match = tag_regexp.match(ensure_str(node.tag))
            if match:
                namespace, tag = match.groups()
            else:
                namespace = None
                tag = ensure_str(node.tag)
            attrs = {}
            for name, value in list(node.attrib.items()):
                name = ensure_str(name)
                value = ensure_str(value)
                match = tag_regexp.match(name)
                if match:
                    attrs[(match.group(1), match.group(2))] = value
                else:
                    attrs[(None, name)] = value
            return (base.ELEMENT, namespace, self.filter.fromXmlName(tag),
                    attrs, len(node) > 0 or node.text)

    def getFirstChild(self, node):
        assert not isinstance(node, tuple), "Text nodes have no children"

        assert len(node) or node.text, "Node has no children"
        if node.text:
            return (node, "text")
        else:
            return node[0]

    def getNextSibling(self, node):
        if isinstance(node, tuple):  # Text node
            node, key = node
            assert key in ("text", "tail"), "Text nodes are text or tail, found %s" % key
            if key == "text":
                # XXX: we cannot use a "bool(node) and node[0] or None" construct here
                # because node[0] might evaluate to False if it has no child element
                if len(node):
                    return node[0]
                else:
                    return None
            else:  # tail
                return node.getnext()

        return (node, "tail") if node.tail else node.getnext()

    def getParentNode(self, node):
        if isinstance(node, tuple):  # Text node
            node, key = node
            assert key in ("text", "tail"), "Text nodes are text or tail, found %s" % key
            if key == "text":
                return node
            # else: fallback to "normal" processing
        elif node in self.fragmentChildren:
            return None

        return node.getparent()
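

# Illustrative sketch (not part of the vendored module): walking an lxml tree
# with the walker above.  Assumes the optional lxml dependency is installed.
def _example_walk_lxml(markup="<p>hi<br/>there</p>"):
    return list(TreeWalker(etree.fromstring(markup)))  # lxml.etree imported above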
site-packages/pip/_vendor/html5lib/treewalkers/dom.py000064400000002605147511334600016720 0ustar00from __future__ import absolute_import, division, unicode_literals

from xml.dom import Node

from . import base


class TreeWalker(base.NonRecursiveTreeWalker):
    def getNodeDetails(self, node):
        if node.nodeType == Node.DOCUMENT_TYPE_NODE:
            return base.DOCTYPE, node.name, node.publicId, node.systemId

        elif node.nodeType in (Node.TEXT_NODE, Node.CDATA_SECTION_NODE):
            return base.TEXT, node.nodeValue

        elif node.nodeType == Node.ELEMENT_NODE:
            attrs = {}
            for attr in list(node.attributes.keys()):
                attr = node.getAttributeNode(attr)
                if attr.namespaceURI:
                    attrs[(attr.namespaceURI, attr.localName)] = attr.value
                else:
                    attrs[(None, attr.name)] = attr.value
            return (base.ELEMENT, node.namespaceURI, node.nodeName,
                    attrs, node.hasChildNodes())

        elif node.nodeType == Node.COMMENT_NODE:
            return base.COMMENT, node.nodeValue

        elif node.nodeType in (Node.DOCUMENT_NODE, Node.DOCUMENT_FRAGMENT_NODE):
            return (base.DOCUMENT,)

        else:
            return base.UNKNOWN, node.nodeType

    def getFirstChild(self, node):
        return node.firstChild

    def getNextSibling(self, node):
        return node.nextSibling

    def getParentNode(self, node):
        return node.parentNode
site-packages/pip/_vendor/html5lib/treewalkers/genshi.py000064400000004405147511334600017416 0ustar00from __future__ import absolute_import, division, unicode_literals

from genshi.core import QName
from genshi.core import START, END, XML_NAMESPACE, DOCTYPE, TEXT
from genshi.core import START_NS, END_NS, START_CDATA, END_CDATA, PI, COMMENT

from . import base

from ..constants import voidElements, namespaces


class TreeWalker(base.TreeWalker):
    def __iter__(self):
        # Buffer the events so we can pass in the following one
        previous = None
        for event in self.tree:
            if previous is not None:
                for token in self.tokens(previous, event):
                    yield token
            previous = event

        # Don't forget the final event!
        if previous is not None:
            for token in self.tokens(previous, None):
                yield token

    def tokens(self, event, next):
        kind, data, _ = event
        if kind == START:
            tag, attribs = data
            name = tag.localname
            namespace = tag.namespace
            converted_attribs = {}
            for k, v in attribs:
                if isinstance(k, QName):
                    converted_attribs[(k.namespace, k.localname)] = v
                else:
                    converted_attribs[(None, k)] = v

            if namespace == namespaces["html"] and name in voidElements:
                for token in self.emptyTag(namespace, name, converted_attribs,
                                           not next or next[0] != END or
                                           next[1] != tag):
                    yield token
            else:
                yield self.startTag(namespace, name, converted_attribs)

        elif kind == END:
            name = data.localname
            namespace = data.namespace
            if namespace != namespaces["html"] or name not in voidElements:
                yield self.endTag(namespace, name)

        elif kind == COMMENT:
            yield self.comment(data)

        elif kind == TEXT:
            for token in self.text(data):
                yield token

        elif kind == DOCTYPE:
            yield self.doctype(*data)

        elif kind in (XML_NAMESPACE, DOCTYPE, START_NS, END_NS,
                      START_CDATA, END_CDATA, PI):
            pass

        else:
            yield self.unknown(kind)
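

# Illustrative sketch (not part of the vendored module): walking a Genshi
# markup stream.  Assumes the optional genshi package is installed and that
# genshi.input.XML parses a string into an event stream, as in Genshi's docs.
def _example_walk_genshi(markup="<p>hi there</p>"):
    from genshi.input import XML
    return list(TreeWalker(XML(markup)))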
site-packages/pip/_vendor/html5lib/treewalkers/__pycache__/dom.cpython-36.opt-1.pyc000064400000003137147511334600024144 0ustar003

���e��@sBddlmZmZmZddlmZddlmZGdd�dej�Z	dS)�)�absolute_import�division�unicode_literals)�Node�)�basec@s,eZdZdd�Zdd�Zdd�Zdd�Zd	S)
�
TreeWalkercCs�|jtjkr tj|j|j|jfS|jtjtj	fkr>tj
|jfS|jtjkr�i}xJt
|jj��D]8}|j|�}|jr�|j||j|jf<q^|j|d|jf<q^Wtj|j|j||j�fS|jtjkr�tj|jfS|jtjtjfkr�tjfStj|jfSdS)N)ZnodeTyperZDOCUMENT_TYPE_NODErZDOCTYPE�nameZpublicIdZsystemIdZ	TEXT_NODEZCDATA_SECTION_NODEZTEXTZ	nodeValueZELEMENT_NODE�listZ
attributes�keysZgetAttributeNodeZnamespaceURI�valueZ	localNameZELEMENTZnodeNameZ
hasChildNodesZCOMMENT_NODE�COMMENTZ
DOCUMENT_NODEZDOCUMENT_FRAGMENT_NODEZDOCUMENTZUNKNOWN)�self�nodeZattrs�attr�r�/usr/lib/python3.6/dom.py�getNodeDetails	s$
zTreeWalker.getNodeDetailscCs|jS)N)Z
firstChild)rrrrr�
getFirstChild$szTreeWalker.getFirstChildcCs|jS)N)ZnextSibling)rrrrr�getNextSibling'szTreeWalker.getNextSiblingcCs|jS)N)Z
parentNode)rrrrr�
getParentNode*szTreeWalker.getParentNodeN)�__name__�
__module__�__qualname__rrrrrrrrrsrN)
Z
__future__rrrZxml.domr�rZNonRecursiveTreeWalkerrrrrr�<module>ssite-packages/pip/_vendor/html5lib/treewalkers/__pycache__/base.cpython-36.pyc000064400000010612147511334600023334 0ustar003

���eK�	@s�ddlmZmZmZddlmZddlmZmZm	Z	ddddd	d
ddd
g	Z
ejZej
ZejZejZejZejZdZdje	�Z	Gdd�de�ZGdd
�d
e�ZdS)�)�absolute_import�division�unicode_literals)�Node�)�
namespaces�voidElements�spaceCharacters�DOCUMENT�DOCTYPE�TEXT�ELEMENT�COMMENT�ENTITY�UNKNOWN�
TreeWalker�NonRecursiveTreeWalkerz<#UNKNOWN#>�c@sheZdZdd�Zdd�Zdd�Zddd	�Zd
d�Zdd
�Zdd�Z	dd�Z
ddd�Zdd�Zdd�Z
dS)rcCs
||_dS)N)�tree)�selfr�r�/usr/lib/python3.6/base.py�__init__szTreeWalker.__init__cCst�dS)N)�NotImplementedError)rrrr�__iter__szTreeWalker.__iter__cCs
d|d�S)NZSerializeError)�type�datar)r�msgrrr�errorszTreeWalker.errorFccs$d|||d�V|r |jd�VdS)NZEmptyTag)r�name�	namespacerzVoid element has children)r)rr r�attrs�hasChildrenrrr�emptyTags

zTreeWalker.emptyTagcCsd|||d�S)NZStartTag)rrr rr)rr rr!rrr�startTag%szTreeWalker.startTagcCsd||d�S)NZEndTag)rrr r)rr rrrr�endTag+szTreeWalker.endTagccsx|}|jt�}|dt|�t|��}|r6d|d�V|}|jt�}|t|�d�}|rdd|d�V|rtd|d�VdS)NZSpaceCharacters)rrZ
Characters)�lstripr	�len�rstrip)rrZmiddle�left�rightrrr�text0s

zTreeWalker.textcCs
d|d�S)N�Comment)rrr)rrrrr�comment>szTreeWalker.commentNcCsd|||d�S)NZDoctype)rr�publicId�systemIdr)rrr.r/rrr�doctypeAszTreeWalker.doctypecCs
d|d�S)NZEntity)rrr)rrrrr�entityGszTreeWalker.entitycCs|jd|�S)NzUnknown node type: )r)rZnodeTyperrr�unknownJszTreeWalker.unknown)F)NN)�__name__�
__module__�__qualname__rrrr#r$r%r+r-r0r1r2rrrrrs

c@s4eZdZdd�Zdd�Zdd�Zdd�Zd	d
�ZdS)rcCst�dS)N)r)r�noderrr�getNodeDetailsOsz%NonRecursiveTreeWalker.getNodeDetailscCst�dS)N)r)rr6rrr�
getFirstChildRsz$NonRecursiveTreeWalker.getFirstChildcCst�dS)N)r)rr6rrr�getNextSiblingUsz%NonRecursiveTreeWalker.getNextSiblingcCst�dS)N)r)rr6rrr�
getParentNodeXsz$NonRecursiveTreeWalker.getParentNodeccs|j}�x�|dk	�r|j|�}|d|dd�}}d}|tkrN|j|�Vn�|tkrrx�|j|�D]
}|VqbWn�|tkr�|\}}}}|s�|tdkr�|tkr�x|j	||||�D]
}|Vq�Wd}n|j
|||�VnV|tkr�|j|d�Vn<|t
k�r|j|d�Vn |tk�rd}n|j|d�V|�r@|j|�}	nd}	|	dk	�rT|	}q
x�|dk	�r�|j|�}|d|dd�}}|tk�r�|\}}}}|�r�|tdk�s�|tk�r�|j||�V|j|k�r�d}P|j|�}
|
dk	�r�|
}Pn
|j|�}�qVWq
WdS)Nr�FZhtmlT)rr7rr0rr+r
rrr#r$rr-rr1r
r2r8r%r9r:)rZcurrentNodeZdetailsrr"�tokenr rZ
attributesZ
firstChildZnextSiblingrrrr[sZ









zNonRecursiveTreeWalker.__iter__N)r3r4r5r7r8r9r:rrrrrrNs
N)Z
__future__rrrZxml.domrZ	constantsrrr	�__all__Z
DOCUMENT_NODEr
ZDOCUMENT_TYPE_NODErZ	TEXT_NODErZELEMENT_NODEr
ZCOMMENT_NODErZENTITY_NODErr�join�objectrrrrrr�<module>s
:site-packages/pip/_vendor/html5lib/treewalkers/__pycache__/etree.cpython-36.pyc000064400000007001147511334600023524 0ustar003

���eL�@s�ddlmZmZmZyddlmZWn>ek
rbyddlmZWnek
r\eZYnXYnXddl	Z	ddl
mZddlm
Z
ddlmZe	jd	�Zd
d�Zee�ZdS)�)�absolute_import�division�unicode_literals)�OrderedDictN)�string_types�)�base�)�moduleFactoryFactoryz
{([^}]*)}(.*)cs,|}|jd�j�G�fdd�dtj�}t�S)NZasdcs4eZdZdZ�fdd�Zdd�Zdd�Zdd	�Zd
S)z#getETreeBuilder.<locals>.TreeWalkera�Given the particular ElementTree representation, this implementation,
        to avoid using recursion, returns "nodes" as tuples with the following
        content:

        1. The current element

        2. The index of the element relative to its parent

        3. A stack of ancestor elements

        4. A flag "text", "tail" or None to indicate if the current node is a
           text node; either the text or tail of the current element (1)
        csLt|t�r2|\}}}}|dkr.tjt||�fS|}t|d�sD|j�}|jdkrVtjfS|jdkr|tj	|j
|jd�|jd�fS|j�kr�tj|j
fSt|jt
�s�tt|j���tj|j�}|r�|j�\}}n
d}|j}t�}xPt|jj��D]>\}	}
tj|	�}|�r|
||jd	�|jd
�f<q�|
|d|	f<q�Wtj|||t|��pD|j
fSdS)
N�text�tail�tag�
DOCUMENT_ROOT�DOCUMENT_FRAGMENTz
<!DOCTYPE>ZpublicIdZsystemIdrr	)rr)rr)�
isinstance�tuplerZTEXT�getattr�hasattrZgetrootr
ZDOCUMENTZDOCTYPEr�get�COMMENTr�AssertionError�type�
tag_regexp�match�groupsr�listZattrib�items�groupZELEMENT�len)�self�nodeZelt�_�flagr�	namespacer
Zattrs�name�value)�ElementTreeCommentType��/usr/lib/python3.6/etree.py�getNodeDetails's8





z2getETreeBuilder.<locals>.TreeWalker.getNodeDetailscSstt|t�r|\}}}}n|dgdf\}}}}|dkr8dS|jrJ|||dfSt|�rl|j|�|dd|dfSdSdS)Nrrr)rr)rrrr�append)rr �element�key�parentsr"r'r'r(�
getFirstChildOs

z1getETreeBuilder.<locals>.TreeWalker.getFirstChildcSs�t|t�r|\}}}}ndS|dkrLt|�rF|j|�|dd|dfSdSnN|jrf|dkrf|||dfS|t|d�dkr�|d|d|d|dfSdSdS)Nrrrr���r/)rrrr*r)rr r+r,r-r"r'r'r(�getNextSibling`s

z2getETreeBuilder.<locals>.TreeWalker.getNextSiblingcSs�t|t�r|\}}}}ndS|dkr:|s,|S|||dfSnD|j�}|sJ|St|d�j|�dksdt�|t|d�j|�|dfSdS)Nrrr/r/)rr�popr�countr�index)rr r+r,r-r"�parentr'r'r(�
getParentNodets
z1getETreeBuilder.<locals>.TreeWalker.getParentNodeN)�__name__�
__module__�__qualname__�__doc__r)r.r0r5r')r&r'r(�
TreeWalkers

(r:)�Commentr
rZNonRecursiveTreeWalker�locals)ZElementTreeImplementationZElementTreer:r')r&r(�getETreeBuildersnr=)Z
__future__rrr�collectionsr�ImportErrorZordereddict�dict�reZpip._vendor.sixr�rZ_utilsr
�compilerr=ZgetETreeModuler'r'r'r(�<module>s
tsite-packages/pip/_vendor/html5lib/treewalkers/__pycache__/etree_lxml.cpython-36.pyc000064400000014646147511334600024575 0ustar003

���e��@s�ddlmZmZmZddlmZddlmZddlm	Z	ddl
mZddl
mZd	d
�Z
Gdd�de�ZGd
d�de�ZGdd�de�ZGdd�de�ZGdd�dej�ZdS)�)�absolute_import�division�unicode_literals)�	text_type)�etree�)�
tag_regexp�)�base)�	_ihatexmlcCs*|dkrdSt|t�r|S|jdd�SdS)N�ascii�strict)�
isinstancer�decode)�s�r� /usr/lib/python3.6/etree_lxml.py�
ensure_strs

rc@s,eZdZdd�Zdd�Zdd�Zdd�Zd	S)
�RootcCs�||_g|_y:|jjrD|jjt|t|jj�t|jj�t|jj	���Wnt
k
rZYnXy|j�}Wnt
k
r�|}YnXx|j�dk	r�|j�}q�Wx |dk	r�|jj|�|j
�}q�Wd|_d|_dS)N)Zelementtree�childrenZdocinfoZinternalDTD�append�DoctyperZ	root_name�	public_idZ
system_url�AttributeErrorZgetrootZgetprevious�getnext�text�tail)�selfZet�noderrr�__init__s*




z
Root.__init__cCs
|j|S)N)r)r�keyrrr�__getitem__1szRoot.__getitem__cCsdS)Nr)rrrrr4szRoot.getnextcCsdS)Nr	r)rrrr�__len__7szRoot.__len__N)�__name__�
__module__�__qualname__rr!rr"rrrrrsrc@seZdZdd�Zdd�ZdS)rcCs(||_||_||_||_d|_d|_dS)N)�	root_node�namer�	system_idrr)rr&r'rr(rrrr<szDoctype.__init__cCs|jjdS)Nr	)r&r)rrrrrEszDoctype.getnextN)r#r$r%rrrrrrr;s	rc@seZdZdd�Zdd�ZdS)�FragmentRootcs$�fdd�|D��_d�_�_dS)Ncsg|]}t�|��qSr)�FragmentWrapper)�.0Zchild)rrr�
<listcomp>Ksz)FragmentRoot.__init__.<locals>.<listcomp>)rrr)rrr)rrrJszFragmentRoot.__init__cCsdS)Nr)rrrrrNszFragmentRoot.getnextN)r#r$r%rrrrrrr)Isr)c@sTeZdZdd�Zdd�Zdd�Zdd�Zd	d
�Zdd�Zd
d�Z	dd�Z
dd�ZdS)r*cCsT||_||_t|jd�r(t|jj�|_nd|_t|jd�rJt|jj�|_nd|_dS)Nrr)r&�obj�hasattrrrr)rZ
fragment_rootr-rrrrSszFragmentWrapper.__init__cCst|j|�S)N)�getattrr-)rr'rrr�__getattr___szFragmentWrapper.__getattr__cCs6|jj}|j|�}|t|�dkr.||dSdSdS)Nr	)r&r�index�len)rZsiblings�idxrrrrbs

zFragmentWrapper.getnextcCs
|j|S)N)r-)rr rrrr!jszFragmentWrapper.__getitem__cCs
t|j�S)N)�boolr-)rrrr�__bool__mszFragmentWrapper.__bool__cCsdS)Nr)rrrr�	getparentpszFragmentWrapper.getparentcCs
t|j�S)N)�strr-)rrrr�__str__sszFragmentWrapper.__str__cCs
t|j�S)N)r7r-)rrrr�__unicode__vszFragmentWrapper.__unicode__cCs
t|j�S)N)r2r-)rrrrr"yszFragmentWrapper.__len__N)r#r$r%rr0rr!r5r6r8r9r"rrrrr*Rsr*c@s4eZdZdd�Zdd�Zdd�Zdd�Zd	d
�ZdS)�
TreeWalkercCsJt|t�rt|�|_t|�}nt�|_t|�}tjj||�t	j
�|_dS)N)r�list�set�fragmentChildrenr)rr
�NonRecursiveTreeWalkerrrZ
InfosetFilter�filter)rZtreerrrr~s


zTreeWalker.__init__c	Cs�t|t�r:|\}}|dks&td|��tjtt||��fSt|t�rLtjfSt|t	�rjtj
|j|j|j
fSt|t�r�t|d�r�tjt|j�fS|jtjkr�tjt|j�fS|jtjkr�tjt|j�dd	�fStjt|j��}|�r�|j�\}}nd}t|j�}i}xbt|jj��D]P\}}t|�}t|�}tj|�}|�rX|||jd�|jd�f<n||d|f<�qWtj||j j!|�|t"|�dk�p�|jfSdS)
Nrrz%Text nodes are text or tail, found %s�tagr	rr)rr���)#r�tuple�AssertionErrorr
ZTEXTrr/rZDOCUMENTrZDOCTYPEr'rr(r*r.r-r@r�Comment�COMMENTrZEntityZENTITYr�match�groupsr;Zattrib�items�groupZELEMENTr?ZfromXmlNamer2)	rrr rF�	namespacer@Zattrsr'�valuerrr�getNodeDetails�s:




zTreeWalker.getNodeDetailscCsDt|t�std��t|�s*|js*td��|jr8|dfS|dSdS)NzText nodes have no childrenzNode has no childrenrr)rrBrCr2r)rrrrr�
getFirstChild�s
zTreeWalker.getFirstChildcCsbt|t�rL|\}}|dks&td|��|dkrDt|�r>|dSdSn|j�S|jrZ|dfS|j�S)Nrrz%Text nodes are text or tail, found %sr)rr)rrBrCr2rr)rrr rrr�getNextSibling�s
zTreeWalker.getNextSiblingcCsJt|t�r4|\}}|dks&td|��|dkrB|Sn||jkrBdS|j�S)Nrrz%Text nodes are text or tail, found %s)rr)rrBrCr=r6)rrr rrr�
getParentNode�s

zTreeWalker.getParentNodeN)r#r$r%rrLrMrNrOrrrrr:}s
)	r:N)Z
__future__rrrZpip._vendor.sixrZlxmlrZtreebuilders.etreer�r
rr�objectrrr)r*r>r:rrrr�<module>s	&	+site-packages/pip/_vendor/html5lib/treewalkers/__pycache__/genshi.cpython-36.pyc000064400000003415147511334600023702 0ustar003

���e	�@s�ddlmZmZmZddlmZddlmZmZmZm	Z	m
Z
ddlmZmZm
Z
mZmZmZddlmZddlmZmZGd	d
�d
ej�ZdS)�)�absolute_import�division�unicode_literals)�QName)�START�END�
XML_NAMESPACE�DOCTYPE�TEXT)�START_NS�END_NS�START_CDATA�	END_CDATA�PI�COMMENT�)�base�)�voidElements�
namespacesc@seZdZdd�Zdd�ZdS)�
TreeWalkerccsdd}x6|jD],}|dk	r4x|j||�D]
}|Vq&W|}qW|dk	r`x|j|d�D]
}|VqRWdS)N)Ztree�tokens)�selfZprevious�event�token�r�/usr/lib/python3.6/genshi.py�__iter__
s
zTreeWalker.__iter__ccs�|\}}}|tkr�|\}}|j}|j}	i}
x8|D]0\}}t|t�rT||
|j|jf<q0||
d|f<q0W|	tdkr�|tkr�xJ|j|	||
|p�|dtkp�|d|k�D]
}
|
Vq�Wn|j	|	||
�Vn�|tkr�|j}|j}	|	tdks�|tkr�|j
|	|�Vn~|tk�r|j|�Vnf|t
k�r>xZ|j|�D]}
|
V�q,Wn>|tk�rV|j|�Vn&|tttttttfk�rpn|j|�VdS)NZhtmlrr)rZ	localname�	namespace�
isinstancerrrZemptyTagrZstartTagZendTagrZcommentr
�textr	Zdoctyperrrr
rr�unknown)rr�nextZkind�data�_�tagZattribs�namerZconverted_attribs�k�vrrrrrs@





zTreeWalker.tokensN)�__name__�
__module__�__qualname__rrrrrrrsrN)Z
__future__rrrZgenshi.corerrrrr	r
rrr
rrr�rZ	constantsrrrrrrr�<module>s site-packages/pip/_vendor/html5lib/treewalkers/__pycache__/__init__.cpython-36.pyc000064400000007226147511334600024170 0ustar003

���e��@sbdZddlmZmZmZddlmZddlmZdddd	d
dgZ	iZ
dd
d�Zdd�Zdd�Z
dS)a�A collection of modules for iterating through different kinds of
tree, generating tokens identical to those produced by the tokenizer
module.

To create a tree walker for a new type of tree, you need to do
implement a tree walker object (called TreeWalker by convention) that
implements a 'serialize' method taking a tree as sole argument and
returning an iterator generating tokens.
�)�absolute_import�division�unicode_literals�)�	constants)�
default_etree�
getTreeWalker�pprint�dom�etree�genshi�
etree_lxmlNcKs�|j�}|tkr�|dkr0ddlm}|jt|<np|dkrPddlm}|jt|<nP|dkrpddlm}|jt|<n0|dkr�dd	lm}|d
kr�t}|j	|f|�jStj
|�S)a�Get a TreeWalker class for various types of tree with built-in support

    Args:
        treeType (str): the name of the tree type required (case-insensitive).
            Supported values are:

            - "dom": The xml.dom.minidom DOM implementation
            - "etree": A generic walker for tree implementations exposing an
                       elementtree-like interface (known to work with
                       ElementTree, cElementTree and lxml.etree).
            - "lxml": Optimized walker for lxml.etree
            - "genshi": a Genshi stream

        Implementation: A module implementing the tree type e.g.
            xml.etree.ElementTree or cElementTree (Currently applies to the
            "etree" tree type only).
    r
�)r
r)rZlxml)r
r)rN)�lower�treeWalkerCache�r
Z
TreeWalkerrr
rrZgetETreeModule�get)ZtreeType�implementation�kwargsr
rr
r�r�/usr/lib/python3.6/__init__.pyrs"ccslg}xL|D]D}|d}|dkr.|j|d�q
|rHddj|�d�Vg}|Vq
W|rhddj|�d�VdS)N�type�
Characters�SpaceCharacters�datar)rr)rr)�append�join)�tokensZpendingCharacters�tokenrrrr�concatenateCharacterTokens<s

rcCslg}d}�xVt|�D�]H}|d}|d k�r&|dr~|dtjdkr~|dtjkrdtj|d}n|d}d||df}n|d}|jd	d
||f�|d7}|d}xdt|j��D]T\\}}	}
|r�|tjkr�tj|}n|}d||	f}n|	}|jd
d
|||
f�q�W|dk�r^|d8}q|dk�r:|d8}q|dk�r`|jdd
||df�q|dk�r|d�r�|d�r�|jdd
||d|d|d�r�|dndf�nF|d�r�|jdd
||d|df�n|jdd
||df�n|jdd
|f�q|dk�r8|jdd
||df�q|dk�rRd�s^td��qtd|��qWdj	|�S)!zPretty printer for tree walkersrr�StartTag�EmptyTag�	namespaceZhtmlz%s %s�namez%s<%s>� rrz	%s%s="%s"ZEndTag�Commentz
%s<!-- %s -->ZDoctypeZpublicIdz%s<!DOCTYPE %s "%s" "%s">ZsystemIdrz%s<!DOCTYPE %s "" "%s">z%s<!DOCTYPE %s>z
%s<!DOCTYPE >rz%s"%s"rFzBconcatenateCharacterTokens should have got rid of all Space tokenszUnknown token type, %s�
)r r!)
rrZ
namespaces�prefixesr�sorted�items�AssertionError�
ValueErrorr)Zwalker�output�indentrr�nsr#Zattrsr"Z	localname�valuerrrr	Ksd












)N)�__doc__Z
__future__rrrrrZ_utilsr�__all__rrrr	rrrr�<module>	s
'site-packages/pip/_vendor/html5lib/treewalkers/__pycache__/etree.cpython-36.opt-1.pyc000064400000006635147511334600024477 0ustar003

���eL�@s�ddlmZmZmZyddlmZWn>ek
rbyddlmZWnek
r\eZYnXYnXddl	Z	ddl
mZddlm
Z
ddlmZe	jd	�Zd
d�Zee�ZdS)�)�absolute_import�division�unicode_literals)�OrderedDictN)�string_types�)�base�)�moduleFactoryFactoryz
{([^}]*)}(.*)cs,|}|jd�j�G�fdd�dtj�}t�S)NZasdcs4eZdZdZ�fdd�Zdd�Zdd�Zdd	�Zd
S)z#getETreeBuilder.<locals>.TreeWalkera�Given the particular ElementTree representation, this implementation,
        to avoid using recursion, returns "nodes" as tuples with the following
        content:

        1. The current element

        2. The index of the element relative to its parent

        3. A stack of ancestor elements

        4. A flag "text", "tail" or None to indicate if the current node is a
           text node; either the text or tail of the current element (1)
        cs2t|t�r2|\}}}}|dkr.tjt||�fS|}t|d�sD|j�}|jdkrVtjfS|jdkr|tj	|j
|jd�|jd�fS|j�kr�tj|j
fSt
j|j�}|r�|j�\}}n
d}|j}t�}xPt|jj��D]>\}	}
t
j|	�}|�r|
||jd	�|jd
�f<q�|
|d|	f<q�Wtj|||t|��p*|j
fSdS)
N�text�tail�tag�
DOCUMENT_ROOT�DOCUMENT_FRAGMENTz
<!DOCTYPE>ZpublicIdZsystemIdrr	)rr)rr)�
isinstance�tuplerZTEXT�getattr�hasattrZgetrootr
ZDOCUMENTZDOCTYPEr�get�COMMENT�
tag_regexp�match�groupsr�listZattrib�items�groupZELEMENT�len)�self�nodeZelt�_�flagr�	namespacer
Zattrs�name�value)�ElementTreeCommentType��/usr/lib/python3.6/etree.py�getNodeDetails's6





z2getETreeBuilder.<locals>.TreeWalker.getNodeDetailscSstt|t�r|\}}}}n|dgdf\}}}}|dkr8dS|jrJ|||dfSt|�rl|j|�|dd|dfSdSdS)Nrrr)rr)rrrr�append)rr�element�key�parentsr r%r%r&�
getFirstChildOs

z1getETreeBuilder.<locals>.TreeWalker.getFirstChildcSs�t|t�r|\}}}}ndS|dkrLt|�rF|j|�|dd|dfSdSnN|jrf|dkrf|||dfS|t|d�dkr�|d|d|d|dfSdSdS)Nrrrr���r-)rrrr(r)rrr)r*r+r r%r%r&�getNextSibling`s

z2getETreeBuilder.<locals>.TreeWalker.getNextSiblingcSsht|t�r|\}}}}ndS|dkr:|s,|S|||dfSn*|j�}|sJ|S|t|d�j|�|dfSdS)Nrrr-)rr�popr�index)rrr)r*r+r �parentr%r%r&�
getParentNodets
z1getETreeBuilder.<locals>.TreeWalker.getParentNodeN)�__name__�
__module__�__qualname__�__doc__r'r,r.r2r%)r$r%r&�
TreeWalkers

(r7)�Commentr
rZNonRecursiveTreeWalker�locals)ZElementTreeImplementationZElementTreer7r%)r$r&�getETreeBuildersnr:)Z
__future__rrr�collectionsr�ImportErrorZordereddict�dict�reZpip._vendor.sixr�rZ_utilsr
�compilerr:ZgetETreeModuler%r%r%r&�<module>s
tsite-packages/pip/_vendor/html5lib/treewalkers/__pycache__/dom.cpython-36.pyc000064400000003137147511334600023205 0ustar003

���e��@sBddlmZmZmZddlmZddlmZGdd�dej�Z	dS)�)�absolute_import�division�unicode_literals)�Node�)�basec@s,eZdZdd�Zdd�Zdd�Zdd�Zd	S)
�
TreeWalkercCs�|jtjkr tj|j|j|jfS|jtjtj	fkr>tj
|jfS|jtjkr�i}xJt
|jj��D]8}|j|�}|jr�|j||j|jf<q^|j|d|jf<q^Wtj|j|j||j�fS|jtjkr�tj|jfS|jtjtjfkr�tjfStj|jfSdS)N)ZnodeTyperZDOCUMENT_TYPE_NODErZDOCTYPE�nameZpublicIdZsystemIdZ	TEXT_NODEZCDATA_SECTION_NODEZTEXTZ	nodeValueZELEMENT_NODE�listZ
attributes�keysZgetAttributeNodeZnamespaceURI�valueZ	localNameZELEMENTZnodeNameZ
hasChildNodesZCOMMENT_NODE�COMMENTZ
DOCUMENT_NODEZDOCUMENT_FRAGMENT_NODEZDOCUMENTZUNKNOWN)�self�nodeZattrs�attr�r�/usr/lib/python3.6/dom.py�getNodeDetails	s$
zTreeWalker.getNodeDetailscCs|jS)N)Z
firstChild)rrrrr�
getFirstChild$szTreeWalker.getFirstChildcCs|jS)N)ZnextSibling)rrrrr�getNextSibling'szTreeWalker.getNextSiblingcCs|jS)N)Z
parentNode)rrrrr�
getParentNode*szTreeWalker.getParentNodeN)�__name__�
__module__�__qualname__rrrrrrrrrsrN)
Z
__future__rrrZxml.domr�rZNonRecursiveTreeWalkerrrrrr�<module>ssite-packages/pip/_vendor/html5lib/treewalkers/__pycache__/genshi.cpython-36.opt-1.pyc000064400000003415147511334600024641 0ustar003

���e	�@s�ddlmZmZmZddlmZddlmZmZmZm	Z	m
Z
ddlmZmZm
Z
mZmZmZddlmZddlmZmZGd	d
�d
ej�ZdS)�)�absolute_import�division�unicode_literals)�QName)�START�END�
XML_NAMESPACE�DOCTYPE�TEXT)�START_NS�END_NS�START_CDATA�	END_CDATA�PI�COMMENT�)�base�)�voidElements�
namespacesc@seZdZdd�Zdd�ZdS)�
TreeWalkerccsdd}x6|jD],}|dk	r4x|j||�D]
}|Vq&W|}qW|dk	r`x|j|d�D]
}|VqRWdS)N)Ztree�tokens)�selfZprevious�event�token�r�/usr/lib/python3.6/genshi.py�__iter__
s
zTreeWalker.__iter__ccs�|\}}}|tkr�|\}}|j}|j}	i}
x8|D]0\}}t|t�rT||
|j|jf<q0||
d|f<q0W|	tdkr�|tkr�xJ|j|	||
|p�|dtkp�|d|k�D]
}
|
Vq�Wn|j	|	||
�Vn�|tkr�|j}|j}	|	tdks�|tkr�|j
|	|�Vn~|tk�r|j|�Vnf|t
k�r>xZ|j|�D]}
|
V�q,Wn>|tk�rV|j|�Vn&|tttttttfk�rpn|j|�VdS)NZhtmlrr)rZ	localname�	namespace�
isinstancerrrZemptyTagrZstartTagZendTagrZcommentr
�textr	Zdoctyperrrr
rr�unknown)rr�nextZkind�data�_�tagZattribs�namerZconverted_attribs�k�vrrrrrs@





zTreeWalker.tokensN)�__name__�
__module__�__qualname__rrrrrrrsrN)Z
__future__rrrZgenshi.corerrrrr	r
rrr
rrr�rZ	constantsrrrrrrr�<module>s site-packages/pip/_vendor/html5lib/treewalkers/__pycache__/etree_lxml.cpython-36.opt-1.pyc000064400000014053147511334600025524 0ustar003

���e��@s�ddlmZmZmZddlmZddlmZddlm	Z	ddl
mZddl
mZd	d
�Z
Gdd�de�ZGd
d�de�ZGdd�de�ZGdd�de�ZGdd�dej�ZdS)�)�absolute_import�division�unicode_literals)�	text_type)�etree�)�
tag_regexp�)�base)�	_ihatexmlcCs*|dkrdSt|t�r|S|jdd�SdS)N�ascii�strict)�
isinstancer�decode)�s�r� /usr/lib/python3.6/etree_lxml.py�
ensure_strs

rc@s,eZdZdd�Zdd�Zdd�Zdd�Zd	S)
�RootcCs�||_g|_y:|jjrD|jjt|t|jj�t|jj�t|jj	���Wnt
k
rZYnXy|j�}Wnt
k
r�|}YnXx|j�dk	r�|j�}q�Wx |dk	r�|jj|�|j
�}q�Wd|_d|_dS)N)Zelementtree�childrenZdocinfoZinternalDTD�append�DoctyperZ	root_name�	public_idZ
system_url�AttributeErrorZgetrootZgetprevious�getnext�text�tail)�selfZet�noderrr�__init__s*




z
Root.__init__cCs
|j|S)N)r)r�keyrrr�__getitem__1szRoot.__getitem__cCsdS)Nr)rrrrr4szRoot.getnextcCsdS)Nr	r)rrrr�__len__7szRoot.__len__N)�__name__�
__module__�__qualname__rr!rr"rrrrrsrc@seZdZdd�Zdd�ZdS)rcCs(||_||_||_||_d|_d|_dS)N)�	root_node�namer�	system_idrr)rr&r'rr(rrrr<szDoctype.__init__cCs|jjdS)Nr	)r&r)rrrrrEszDoctype.getnextN)r#r$r%rrrrrrr;s	rc@seZdZdd�Zdd�ZdS)�FragmentRootcs$�fdd�|D��_d�_�_dS)Ncsg|]}t�|��qSr)�FragmentWrapper)�.0Zchild)rrr�
<listcomp>Ksz)FragmentRoot.__init__.<locals>.<listcomp>)rrr)rrr)rrrJszFragmentRoot.__init__cCsdS)Nr)rrrrrNszFragmentRoot.getnextN)r#r$r%rrrrrrr)Isr)c@sTeZdZdd�Zdd�Zdd�Zdd�Zd	d
�Zdd�Zd
d�Z	dd�Z
dd�ZdS)r*cCsT||_||_t|jd�r(t|jj�|_nd|_t|jd�rJt|jj�|_nd|_dS)Nrr)r&�obj�hasattrrrr)rZ
fragment_rootr-rrrrSszFragmentWrapper.__init__cCst|j|�S)N)�getattrr-)rr'rrr�__getattr___szFragmentWrapper.__getattr__cCs6|jj}|j|�}|t|�dkr.||dSdSdS)Nr	)r&r�index�len)rZsiblings�idxrrrrbs

zFragmentWrapper.getnextcCs
|j|S)N)r-)rr rrrr!jszFragmentWrapper.__getitem__cCs
t|j�S)N)�boolr-)rrrr�__bool__mszFragmentWrapper.__bool__cCsdS)Nr)rrrr�	getparentpszFragmentWrapper.getparentcCs
t|j�S)N)�strr-)rrrr�__str__sszFragmentWrapper.__str__cCs
t|j�S)N)r7r-)rrrr�__unicode__vszFragmentWrapper.__unicode__cCs
t|j�S)N)r2r-)rrrrr"yszFragmentWrapper.__len__N)r#r$r%rr0rr!r5r6r8r9r"rrrrr*Rsr*c@s4eZdZdd�Zdd�Zdd�Zdd�Zd	d
�ZdS)�
TreeWalkercCsJt|t�rt|�|_t|�}nt�|_t|�}tjj||�t	j
�|_dS)N)r�list�set�fragmentChildrenr)rr
�NonRecursiveTreeWalkerrrZ
InfosetFilter�filter)rZtreerrrr~s


zTreeWalker.__init__c	Cs�t|t�r&|\}}tjtt||��fSt|t�r8tjfSt|t�rVtj	|j
|j|jfSt|t
�r|t|d�r|tjt|j�fS|jtjkr�tjt|j�fS|jtjkr�tjt|j�dd�fStjt|j��}|r�|j�\}}nd}t|j�}i}xbt|jj��D]P\}}t|�}t|�}tj|�}|�rB|||jd�|jd�f<n||d|f<�qWtj||jj |�|t!|�dk�px|jfSdS)N�tagr	rr���)"r�tupler
ZTEXTrr/rZDOCUMENTrZDOCTYPEr'rr(r*r.r-r@r�Comment�COMMENTrZEntityZENTITYr�match�groupsr;Zattrib�items�groupZELEMENTr?ZfromXmlNamer2)	rrr rE�	namespacer@Zattrsr'�valuerrr�getNodeDetails�s8




zTreeWalker.getNodeDetailscCs|jr|dfS|dSdS)Nrr)r)rrrrr�
getFirstChild�szTreeWalker.getFirstChildcCsNt|t�r8|\}}|dkr0t|�r*|dSdSn|j�S|jrF|dfS|j�S)Nrrr)rrBr2rr)rrr rrr�getNextSibling�s
zTreeWalker.getNextSiblingcCs6t|t�r |\}}|dkr.|Sn||jkr.dS|j�S)Nr)rrBr=r6)rrr rrr�
getParentNode�s

zTreeWalker.getParentNodeN)r#r$r%rrKrLrMrNrrrrr:}s
)	r:N)Z
__future__rrrZpip._vendor.sixrZlxmlrZtreebuilders.etreer�r
rr�objectrrr)r*r>r:rrrr�<module>s	&	+site-packages/pip/_vendor/html5lib/treewalkers/__pycache__/base.cpython-36.opt-1.pyc000064400000010612147511334600024273 0ustar003

���eK�	@s�ddlmZmZmZddlmZddlmZmZm	Z	ddddd	d
ddd
g	Z
ejZej
ZejZejZejZejZdZdje	�Z	Gdd�de�ZGdd
�d
e�ZdS)�)�absolute_import�division�unicode_literals)�Node�)�
namespaces�voidElements�spaceCharacters�DOCUMENT�DOCTYPE�TEXT�ELEMENT�COMMENT�ENTITY�UNKNOWN�
TreeWalker�NonRecursiveTreeWalkerz<#UNKNOWN#>�c@sheZdZdd�Zdd�Zdd�Zddd	�Zd
d�Zdd
�Zdd�Z	dd�Z
ddd�Zdd�Zdd�Z
dS)rcCs
||_dS)N)�tree)�selfr�r�/usr/lib/python3.6/base.py�__init__szTreeWalker.__init__cCst�dS)N)�NotImplementedError)rrrr�__iter__szTreeWalker.__iter__cCs
d|d�S)NZSerializeError)�type�datar)r�msgrrr�errorszTreeWalker.errorFccs$d|||d�V|r |jd�VdS)NZEmptyTag)r�name�	namespacerzVoid element has children)r)rr r�attrs�hasChildrenrrr�emptyTags

zTreeWalker.emptyTagcCsd|||d�S)NZStartTag)rrr rr)rr rr!rrr�startTag%szTreeWalker.startTagcCsd||d�S)NZEndTag)rrr r)rr rrrr�endTag+szTreeWalker.endTagccsx|}|jt�}|dt|�t|��}|r6d|d�V|}|jt�}|t|�d�}|rdd|d�V|rtd|d�VdS)NZSpaceCharacters)rrZ
Characters)�lstripr	�len�rstrip)rrZmiddle�left�rightrrr�text0s

zTreeWalker.textcCs
d|d�S)N�Comment)rrr)rrrrr�comment>szTreeWalker.commentNcCsd|||d�S)NZDoctype)rr�publicId�systemIdr)rrr.r/rrr�doctypeAszTreeWalker.doctypecCs
d|d�S)NZEntity)rrr)rrrrr�entityGszTreeWalker.entitycCs|jd|�S)NzUnknown node type: )r)rZnodeTyperrr�unknownJszTreeWalker.unknown)F)NN)�__name__�
__module__�__qualname__rrrr#r$r%r+r-r0r1r2rrrrrs

c@s4eZdZdd�Zdd�Zdd�Zdd�Zd	d
�ZdS)rcCst�dS)N)r)r�noderrr�getNodeDetailsOsz%NonRecursiveTreeWalker.getNodeDetailscCst�dS)N)r)rr6rrr�
getFirstChildRsz$NonRecursiveTreeWalker.getFirstChildcCst�dS)N)r)rr6rrr�getNextSiblingUsz%NonRecursiveTreeWalker.getNextSiblingcCst�dS)N)r)rr6rrr�
getParentNodeXsz$NonRecursiveTreeWalker.getParentNodeccs|j}�x�|dk	�r|j|�}|d|dd�}}d}|tkrN|j|�Vn�|tkrrx�|j|�D]
}|VqbWn�|tkr�|\}}}}|s�|tdkr�|tkr�x|j	||||�D]
}|Vq�Wd}n|j
|||�VnV|tkr�|j|d�Vn<|t
k�r|j|d�Vn |tk�rd}n|j|d�V|�r@|j|�}	nd}	|	dk	�rT|	}q
x�|dk	�r�|j|�}|d|dd�}}|tk�r�|\}}}}|�r�|tdk�s�|tk�r�|j||�V|j|k�r�d}P|j|�}
|
dk	�r�|
}Pn
|j|�}�qVWq
WdS)Nr�FZhtmlT)rr7rr0rr+r
rrr#r$rr-rr1r
r2r8r%r9r:)rZcurrentNodeZdetailsrr"�tokenr rZ
attributesZ
firstChildZnextSiblingrrrr[sZ









zNonRecursiveTreeWalker.__iter__N)r3r4r5r7r8r9r:rrrrrrNs
N)Z
__future__rrrZxml.domrZ	constantsrrr	�__all__Z
DOCUMENT_NODEr
ZDOCUMENT_TYPE_NODErZ	TEXT_NODErZELEMENT_NODEr
ZCOMMENT_NODErZENTITY_NODErr�join�objectrrrrrr�<module>s
:site-packages/pip/_vendor/html5lib/treewalkers/__pycache__/__init__.cpython-36.opt-1.pyc000064400000007063147511334600025126 0ustar003

���e��@sbdZddlmZmZmZddlmZddlmZdddd	d
dgZ	iZ
dd
d�Zdd�Zdd�Z
dS)a�A collection of modules for iterating through different kinds of
tree, generating tokens identical to those produced by the tokenizer
module.

To create a tree walker for a new type of tree, you need to do
implement a tree walker object (called TreeWalker by convention) that
implements a 'serialize' method taking a tree as sole argument and
returning an iterator generating tokens.
�)�absolute_import�division�unicode_literals�)�	constants)�
default_etree�
getTreeWalker�pprint�dom�etree�genshi�
etree_lxmlNcKs�|j�}|tkr�|dkr0ddlm}|jt|<np|dkrPddlm}|jt|<nP|dkrpddlm}|jt|<n0|dkr�dd	lm}|d
kr�t}|j	|f|�jStj
|�S)a�Get a TreeWalker class for various types of tree with built-in support

    Args:
        treeType (str): the name of the tree type required (case-insensitive).
            Supported values are:

            - "dom": The xml.dom.minidom DOM implementation
            - "etree": A generic walker for tree implementations exposing an
                       elementtree-like interface (known to work with
                       ElementTree, cElementTree and lxml.etree).
            - "lxml": Optimized walker for lxml.etree
            - "genshi": a Genshi stream

        Implementation: A module implementing the tree type e.g.
            xml.etree.ElementTree or cElementTree (Currently applies to the
            "etree" tree type only).
    r
�)r
r)rZlxml)r
r)rN)�lower�treeWalkerCache�r
Z
TreeWalkerrr
rrZgetETreeModule�get)ZtreeType�implementation�kwargsr
rr
r�r�/usr/lib/python3.6/__init__.pyrs"ccslg}xL|D]D}|d}|dkr.|j|d�q
|rHddj|�d�Vg}|Vq
W|rhddj|�d�VdS)N�type�
Characters�SpaceCharacters�datar)rr)rr)�append�join)�tokensZpendingCharacters�tokenrrrr�concatenateCharacterTokens<s

rcCs^g}d}�xHt|�D�]:}|d}|dk�r&|dr~|dtjdkr~|dtjkrdtj|d}n|d}d||df}n|d}|jd	d
||f�|d7}|d}xdt|j��D]T\\}}	}
|r�|tjkr�tj|}n|}d||	f}n|	}|jd
d
|||
f�q�W|dk�rP|d8}q|dk�r:|d8}q|dk�r`|jdd
||df�q|dk�r|d�r�|d�r�|jdd
||d|d|d�r�|dndf�nF|d�r�|jdd
||d|df�n|jdd
||df�n|jdd
|f�q|dk�r8|jdd
||df�q|dk�rDqtd|��qWdj|�S)zPretty printer for tree walkersrr�StartTag�EmptyTag�	namespaceZhtmlz%s %s�namez%s<%s>� rrz	%s%s="%s"ZEndTag�Commentz
%s<!-- %s -->ZDoctypeZpublicIdz%s<!DOCTYPE %s "%s" "%s">ZsystemIdrz%s<!DOCTYPE %s "" "%s">z%s<!DOCTYPE %s>z
%s<!DOCTYPE >rz%s"%s"rzUnknown token type, %s�
)r r!)	rrZ
namespaces�prefixesr�sorted�items�
ValueErrorr)Zwalker�output�indentrr�nsr#Zattrsr"Z	localname�valuerrrr	Ksd












)N)�__doc__Z
__future__rrrrrZ_utilsr�__all__rrrr	rrrr�<module>	s
'site-packages/pip/_vendor/html5lib/__pycache__/constants.cpython-36.opt-1.pyc000064400000201300147511334600023041 0ustar003

[compiled bytecode for html5lib/constants.py; the marshalled tables are not printable as text.
Names and strings recoverable from the dump:]

    - EOF and E: E maps short parse-error codes to message templates, e.g.
      "expected-doctype-but-got-start-tag" ->
      "Unexpected start tag (%(name)s). Expected DOCTYPE.", ending with the
      catch-all "XXX-undefined-error".
    - namespaces: prefix-to-URI map for html, mathml, svg, xlink, xml and xmlns.
    - frozensets of element categories: scopingElements, formattingElements,
      specialElements, htmlIntegrationPointElements,
      mathmlTextIntegrationPointElements, voidElements, cdataElements,
      rcdataElements, tableInsertModeElements, headingElements and the
      booleanAttributes table.
    - adjustSVGAttributes, adjustMathMLAttributes, adjustForeignAttributes and
      its inverse unadjustForeignAttributes, used to fix attribute casing in
      foreign (SVG/MathML) content.
    - character classes: spaceCharacters, asciiLowercase, asciiUppercase,
      asciiLetters, digits, hexDigits and the asciiUpper2Lower mapping.
    - entity data: entitiesWindows1252, xmlEntities, the full HTML named-entity
      table (entities) and replacementCharacters.
    - tokenTypes / tagTokenTypes, the prefixes map, and the DataLossWarning
      (UserWarning) and ReparseException (Exception) classes.
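A short sketch of how these tables are typically consumed. It assumes the standalone html5lib
package; the specific keys used below are taken from strings visible in the dump above.

    # Sketch: look up data from html5lib.constants.
    from html5lib.constants import E, namespaces, voidElements, tokenTypes

    # E maps an error code to a %-style message template.
    message = E["expected-doctype-but-got-start-tag"] % {"name": "div"}
    print(message)                 # Unexpected start tag (div). Expected DOCTYPE.

    print(namespaces["svg"])       # http://www.w3.org/2000/svg
    print("br" in voidElements)    # True: <br> never takes an end tag
    print(tokenTypes["StartTag"])  # small integer used to tag tokenizer output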
site-packages/pip/_vendor/html5lib/__pycache__/_tokenizer.cpython-36.opt-1.pyc000064400000122103147511334600023201 0ustar00

[compiled bytecode for html5lib/_tokenizer.py; the marshalled code is not printable and the dump
breaks off partway through the file. Recoverable content:]

Imports visible in the dump: deque (collections), unichr (from the bundled six), and from the
package's own modules spaceCharacters, entities, asciiLetters, asciiUpper2Lower, digits,
hexDigits, EOF, tokenTypes, tagTokenTypes, replacementCharacters (constants), HTMLInputStream
(_inputstream) and Trie (_trie); the entities table is wrapped in a Trie (entitiesTrie).

class HTMLTokenizer docstring:

    This class takes care of tokenizing HTML.

    * self.currentToken
      Holds the token that is currently being processed.

    * self.state
      Holds a reference to the method to be invoked... XXX

    * self.stream
      Points to HTMLInputStream object.

The class is a state machine: each *State method consumes characters from self.stream, may queue
Characters, tag, comment or ParseError tokens on self.tokenQueue, and selects the next state.
Method names visible in this part of the dump: __init__, __iter__, consumeNumberEntity,
consumeEntity, processEntityInAttribute, emitCurrentToken, dataState, entityDataState,
rcdataState, characterReferenceInRcdata, rawtextState, scriptDataState, plaintextState,
tagOpenState, closeTagOpenState, tagNameState, the rcdata/rawtext/script-data "less-than sign"
and end-tag states, the script-data escaped and double-escaped states, the attribute name and
value states, afterAttributeValueState, selfClosingStartTagState, bogusCommentState,
markupDeclarationOpenState and commentStartState, after which the dump is cut off.

z(HTMLTokenizer.markupDeclarationOpenStatecCs�|jj�}|dkr|j|_n�|dkrN|jjtddd��|jdd7<n�|dkr�|jjtdd	d��|jj|j�|j|_nP|t	kr�|jjtdd
d��|jj|j�|j|_n|jd|7<|j
|_dS)Nr|r[r"zinvalid-codepoint)r#r$r$u�rjzincorrect-commentzeof-in-commentT)rr6�commentStartDashStaterr%r7rrrr�commentState)rr$r r r!r��s(






zHTMLTokenizer.commentStartStatecCs�|jj�}|dkr|j|_n�|dkrN|jjtddd��|jdd7<n�|dkr�|jjtdd	d��|jj|j�|j|_nT|t	kr�|jjtdd
d��|jj|j�|j|_n|jdd|7<|j
|_dS)Nr|r[r"zinvalid-codepoint)r#r$r$u-�rjzincorrect-commentzeof-in-commentT)rr6�commentEndStaterr%r7rrrrr�)rr$r r r!r��s(






z#HTMLTokenizer.commentStartDashStatecCs�|jj�}|dkr|j|_n�|dkrN|jjtddd��|jdd7<nT|tkr�|jjtddd��|jj|j�|j	|_n|jd||jj
d
�7<d	S)Nr|r[r"zinvalid-codepoint)r#r$r$u�zeof-in-commentT)r|r[)rr6�commentEndDashStaterr%r7rrrrr^)rr$r r r!r��s




zHTMLTokenizer.commentStatecCs�|jj�}|dkr|j|_n�|dkrV|jjtddd��|jdd7<|j|_nT|t	kr�|jjtddd��|jj|j�|j
|_n|jdd|7<|j|_d	S)
Nr|r[r"zinvalid-codepoint)r#r$r$u-�zeof-in-comment-end-dashT)rr6r�rr%r7rrr�rr)rr$r r r!r��s 





z!HTMLTokenizer.commentEndDashStatecCs,|jj�}|dkr*|jj|j�|j|_n�|dkrd|jjtddd��|jdd7<|j|_n�|dkr�|jjtdd	d��|j	|_n�|d
kr�|jjtddd��|jd|7<nj|t
kr�|jjtddd��|jj|j�|j|_n4|jjtdd
d��|jdd|7<|j|_dS)Nrjr[r"zinvalid-codepoint)r#r$r$u--�rhz,unexpected-bang-after-double-dash-in-commentr|z,unexpected-dash-after-double-dash-in-commentzeof-in-comment-double-dashzunexpected-char-in-commentz--T)rr6r%r7rrrrr��commentEndBangStater)rr$r r r!r��s6









zHTMLTokenizer.commentEndStatecCs�|jj�}|dkr*|jj|j�|j|_n�|dkrN|jdd7<|j|_n�|dkr�|jjtddd��|jdd	7<|j	|_nT|t
kr�|jjtdd
d��|jj|j�|j|_n|jdd|7<|j	|_dS)Nrjr|r$z--!r[r"zinvalid-codepoint)r#r$u--!�zeof-in-comment-end-bang-stateT)rr6r%r7rrrr�rr�r)rr$r r r!r��s(






z!HTMLTokenizer.commentEndBangStatecCs�|jj�}|tkr|j|_nj|tkr\|jjtddd��d|j	d<|jj|j	�|j
|_n*|jjtddd��|jj|�|j|_dS)Nr"z!expected-doctype-name-but-got-eof)r#r$Fr�zneed-space-after-doctypeT)rr6r�beforeDoctypeNameStaterrr%r7rrrr=)rr$r r r!r�s





zHTMLTokenizer.doctypeStatecCs�|jj�}|tkrn�|dkrT|jjtddd��d|jd<|jj|j�|j|_n�|dkr�|jjtddd��d	|jd
<|j	|_nR|t
kr�|jjtddd��d|jd<|jj|j�|j|_n||jd
<|j	|_dS)
Nrjr"z+expected-doctype-name-but-got-right-bracket)r#r$Fr�r[zinvalid-codepointu�rUz!expected-doctype-name-but-got-eofT)rr6rr%r7rrrr�doctypeNameStater)rr$r r r!r�s.










z$HTMLTokenizer.beforeDoctypeNameStatecCs|jj�}|tkr2|jdjt�|jd<|j|_n�|dkrh|jdjt�|jd<|jj	|j�|j
|_n�|dkr�|jj	tddd��|jdd7<|j|_nh|t
kr�|jj	tddd��d	|jd
<|jdjt�|jd<|jj	|j�|j
|_n|jd|7<dS)NrUrjr[r"zinvalid-codepoint)r#r$u�zeof-in-doctype-nameFr�T)rr6rrrXr�afterDoctypeNameStaterr%r7rrr�r)rr$r r r!r�6s,







zHTMLTokenizer.doctypeNameStatecCsR|jj�}|tkr�n8|dkr8|jj|j�|j|_�n|tkr�d|jd<|jj	|�|jjt
ddd��|jj|j�|j|_�n�|d!kr�d	}x$d'D]}|jj�}||kr�d}Pq�W|r�|j|_d	SnJ|d(k�rd	}x(d.D] }|jj�}||k�r�d}P�q�W|�r|j|_d	S|jj	|�|jjt
ddd|id ��d|jd<|j
|_d	S)/NrjFr�r"zeof-in-doctype)r#r$r�r�T�u�U�b�B�l�L�i�Ir@r��s�Sr�r�r�r�r�r��m�Mz*expected-space-or-right-bracket-in-doctyper$)r#r$r.)r�r��r�r��r�r��r�r��r�r��r@r�)r�r�r�r�r�)r�r��r�r��r�r��r�r��r�r��r�r�)r�r�r�r�r�)rr6rr%r7rrrrr=r�afterDoctypePublicKeywordState�afterDoctypeSystemKeywordState�bogusDoctypeState)rr$r�r�r r r!r�OsT







z#HTMLTokenizer.afterDoctypeNameStatecCs�|jj�}|tkr|j|_n�|d
krP|jjtddd��|jj|�|j|_nT|t	kr�|jjtddd��d|j
d<|jj|j
�|j|_n|jj|�|j|_d	S)Nr�r�r"zunexpected-char-in-doctype)r#r$zeof-in-doctypeFr�T)r�r�)rr6r�"beforeDoctypePublicIdentifierStaterr%r7rr=rrr)rr$r r r!r��s"






z,HTMLTokenizer.afterDoctypePublicKeywordStatecCs�|jj�}|tkrn�|dkr0d|jd<|j|_n�|dkrLd|jd<|j|_n�|dkr�|jjt	ddd��d	|jd
<|jj|j�|j
|_nh|tkr�|jjt	ddd��d	|jd
<|jj|j�|j
|_n(|jjt	ddd��d	|jd
<|j|_d
S)Nr�r,r�r�rjr"zunexpected-end-of-doctype)r#r$Fr�zeof-in-doctypezunexpected-char-in-doctypeT)
rr6rr�(doctypePublicIdentifierDoubleQuotedStater�(doctypePublicIdentifierSingleQuotedStater%r7rrrr�)rr$r r r!r��s4












z0HTMLTokenizer.beforeDoctypePublicIdentifierStatecCs�|jj�}|dkr|j|_n�|dkrN|jjtddd��|jdd7<n�|dkr�|jjtdd	d��d
|jd<|jj|j�|j|_nR|t	kr�|jjtddd��d
|jd<|jj|j�|j|_n|jd|7<d
S)Nr�r[r"zinvalid-codepoint)r#r$r�u�rjzunexpected-end-of-doctypeFr�zeof-in-doctypeT)
rr6�!afterDoctypePublicIdentifierStaterr%r7rrrr)rr$r r r!r��s*








z6HTMLTokenizer.doctypePublicIdentifierDoubleQuotedStatecCs�|jj�}|dkr|j|_n�|dkrN|jjtddd��|jdd7<n�|dkr�|jjtdd	d��d
|jd<|jj|j�|j|_nR|t	kr�|jjtddd��d
|jd<|jj|j�|j|_n|jd|7<d
S)Nr�r[r"zinvalid-codepoint)r#r$r�u�rjzunexpected-end-of-doctypeFr�zeof-in-doctypeT)
rr6r�rr%r7rrrr)rr$r r r!r��s*








z6HTMLTokenizer.doctypePublicIdentifierSingleQuotedStatecCs|jj�}|tkr|j|_n�|dkr<|jj|j�|j|_n�|dkrn|jjt	ddd��d|jd<|j
|_n�|dkr�|jjt	ddd��d|jd<|j|_nh|tkr�|jjt	dd	d��d
|jd<|jj|j�|j|_n(|jjt	ddd��d
|jd<|j
|_dS)
Nrjr�r"zunexpected-char-in-doctype)r#r$r,r�r�zeof-in-doctypeFr�T)rr6r�-betweenDoctypePublicAndSystemIdentifiersStaterr%r7rrr�(doctypeSystemIdentifierDoubleQuotedState�(doctypeSystemIdentifierSingleQuotedStaterr�)rr$r r r!r��s6













z/HTMLTokenizer.afterDoctypePublicIdentifierStatecCs�|jj�}|tkrn�|dkr4|jj|j�|j|_n�|dkrPd|jd<|j|_n�|dkrld|jd<|j	|_nh|t
kr�|jjtddd��d	|jd
<|jj|j�|j|_n(|jjtddd��d	|jd
<|j|_dS)
Nrjr�r,r�r�r"zeof-in-doctype)r#r$Fr�zunexpected-char-in-doctypeT)
rr6rr%r7rrrr�r�rrr�)rr$r r r!r�s.










z;HTMLTokenizer.betweenDoctypePublicAndSystemIdentifiersStatecCs�|jj�}|tkr|j|_n�|d
krP|jjtddd��|jj|�|j|_nT|t	kr�|jjtddd��d|j
d<|jj|j
�|j|_n|jj|�|j|_d	S)Nr�r�r"zunexpected-char-in-doctype)r#r$zeof-in-doctypeFr�T)r�r�)rr6r�"beforeDoctypeSystemIdentifierStaterr%r7rr=rrr)rr$r r r!r�s"






z,HTMLTokenizer.afterDoctypeSystemKeywordStatecCs�|jj�}|tkrn�|dkr0d|jd<|j|_n�|dkrLd|jd<|j|_n�|dkr�|jjt	ddd��d	|jd
<|jj|j�|j
|_nh|tkr�|jjt	ddd��d	|jd
<|jj|j�|j
|_n(|jjt	ddd��d	|jd
<|j|_dS)
Nr�r,r�r�rjr"zunexpected-char-in-doctype)r#r$Fr�zeof-in-doctypeT)
rr6rrr�rr�r%r7rrrr�)rr$r r r!r�/s4












z0HTMLTokenizer.beforeDoctypeSystemIdentifierStatecCs�|jj�}|dkr|j|_n�|dkrN|jjtddd��|jdd7<n�|dkr�|jjtdd	d��d
|jd<|jj|j�|j|_nR|t	kr�|jjtddd��d
|jd<|jj|j�|j|_n|jd|7<d
S)Nr�r[r"zinvalid-codepoint)r#r$r�u�rjzunexpected-end-of-doctypeFr�zeof-in-doctypeT)
rr6�!afterDoctypeSystemIdentifierStaterr%r7rrrr)rr$r r r!r�Ls*








z6HTMLTokenizer.doctypeSystemIdentifierDoubleQuotedStatecCs�|jj�}|dkr|j|_n�|dkrN|jjtddd��|jdd7<n�|dkr�|jjtdd	d��d
|jd<|jj|j�|j|_nR|t	kr�|jjtddd��d
|jd<|jj|j�|j|_n|jd|7<d
S)Nr�r[r"zinvalid-codepoint)r#r$r�u�rjzunexpected-end-of-doctypeFr�zeof-in-doctypeT)
rr6r�rr%r7rrrr)rr$r r r!r�ds*








z6HTMLTokenizer.doctypeSystemIdentifierSingleQuotedStatecCs�|jj�}|tkrn~|dkr4|jj|j�|j|_n^|tkrt|jjt	ddd��d|jd<|jj|j�|j|_n|jjt	ddd��|j
|_dS)	Nrjr"zeof-in-doctype)r#r$Fr�zunexpected-char-in-doctypeT)rr6rr%r7rrrrrr�)rr$r r r!r�|s 





z/HTMLTokenizer.afterDoctypeSystemIdentifierStatecCsZ|jj�}|dkr*|jj|j�|j|_n,|tkrV|jj|�|jj|j�|j|_ndS)NrjT)	rr6r%r7rrrrr=)rr$r r r!r��s


zHTMLTokenizer.bogusDoctypeStatecCs�g}xt|j|jjd��|j|jjd��|jj�}|tkr@Pq|ddd�dkrl|ddd�|d<Pq|j|�qWdj|�}|jd�}|dkr�x&t|�D]}|jjt	d	d
d��q�W|j
dd�}|r�|jjt	d
|d��|j|_dS)N�]rjr�z]]r,r[rr"zinvalid-codepoint)r#r$u�rJTrK���rKr�rK)
r7rr^r6rr9�count�ranger%rr�rr)rr$r6Z	nullCountr�r r r!r��s.



zHTMLTokenizer.cdataSectionState)N)NF)N�__name__�
__module__�__qualname__�__doc__rr)rBrSrTrZrr\rbr`rdrfrgr]rmrnrarsrtrcrwrxreryr{rzr}r�rr~r�r�r�r�r�r�r�r�r�rpr�r�r�r�r�r�r�rqrorlr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r��
__classcell__r r )rr!rs�H
P#

6 "-3rN)Z
__future__rrrZpip._vendor.sixrr;�collectionsrZ	constantsrr	r
rrr
rrrrZ_inputstreamrZ_trierrL�objectrr r r r!�<module>ssite-packages/pip/_vendor/html5lib/__pycache__/serializer.cpython-36.pyc000064400000022106147511334600022244 0ustar003

���ea7�@s�ddlmZmZmZddlmZddlZddlmZm	Z	ddl
mZmZm
Z
ddl
mZmZmZddlmZmZdd	lmZd
je
�dZejded
�Zejded�ZiZed�dkZx�eej��D]p\Z Z!er�ee!�dks�er�ee!�dkr�q�e!dkr�ee!�dk�rej"e!�Z!ne#e!�Z!e!ek�s4e j$�r�e ee!<q�Wdd�Z%ede%�ddd�Z&Gdd�de'�Z(Gdd�de)�Z*dS)�)�absolute_import�division�unicode_literals)�	text_typeN)�register_error�xmlcharrefreplace_errors�)�voidElements�booleanAttributes�spaceCharacters)�rcdataElements�entities�xmlEntities)�treewalkers�_utils)�escape�z"'=<>`�[�]u_	

 /`  ᠎᠏           

   ]u􏿿��&c
Cs"t|ttf��rg}g}d}x�t|j|j|j��D]n\}}|rFd}q4||j}tj|j|t	|j|dg���r�tj
|j||d��}d}nt|�}|j|�q4Wx^|D]V}t
j|�}	|	r�|jd�|j|	�|	jd�s�|jd�q�|jdt|�dd��q�Wdj|�|jfSt|�SdS)NFrTr�;z&#x%s;r)�
isinstance�UnicodeEncodeError�UnicodeTranslateError�	enumerate�object�start�endrZisSurrogatePair�min�surrogatePairToCodepoint�ord�append�_encode_entity_map�get�endswith�hex�joinr)
�exc�resZ
codepoints�skip�i�c�indexZ	codepointZcp�e�r/� /usr/lib/python3.6/serializer.py�htmlentityreplace_errors*s0 
"




r1�htmlentityreplace�etreecKs$tj|�}tf|�}|j||�|�S)N)rZ
getTreeWalker�HTMLSerializer�render)�inputZtree�encodingZserializer_optsZwalker�sr/r/r0�	serializeJs

r9c@s~eZdZdZdZdZdZdZdZdZ	dZ
dZdZdZ
dZdZdZd!Zdd�Zdd�Zdd�Zd"dd�Zd#dd�Zd$dd �ZdS)%r4�legacy�"TF�quote_attr_values�
quote_char�use_best_quote_char�omit_optional_tags�minimize_boolean_attributes�use_trailing_solidus�space_before_trailing_solidus�escape_lt_in_attrs�
escape_rcdata�resolve_entities�alphabetical_attributes�inject_meta_charset�strip_whitespace�sanitizec	Kszt|�t|j�}t|�dkr2tdtt|����d|kr@d|_x(|jD]}t|||j|t	||���qHWg|_
d|_dS)a6	Initialize HTMLSerializer.

        Keyword options (default given first unless specified) include:

        inject_meta_charset=True|False
          Whether it insert a meta element to define the character set of the
          document.
        quote_attr_values="legacy"|"spec"|"always"
          Whether to quote attribute values that don't require quoting
          per legacy browser behaviour, when required by the standard, or always.
        quote_char=u'"'|u"'"
          Use given quote character for attribute quoting. Default is to
          use double quote unless attribute value contains a double quote,
          in which case single quotes are used instead.
        escape_lt_in_attrs=False|True
          Whether to escape < in attribute values.
        escape_rcdata=False|True
          Whether to escape characters that need to be escaped within normal
          elements within rcdata elements such as style.
        resolve_entities=True|False
          Whether to resolve named character entities that appear in the
          source tree. The XML predefined entities &lt; &gt; &amp; &quot; &apos;
          are unaffected by this setting.
        strip_whitespace=False|True
          Whether to remove semantically meaningless whitespace. (This
          compresses all whitespace to a single space except within pre.)
        minimize_boolean_attributes=True|False
          Shortens boolean attributes to give just the attribute value,
          for example <input disabled="disabled"> becomes <input disabled>.
        use_trailing_solidus=False|True
          Includes a close-tag slash at the end of the start tag of void
          elements (empty elements whose end tag is forbidden). E.g. <hr/>.
        space_before_trailing_solidus=True|False
          Places a space immediately before the closing slash in a tag
          using a trailing solidus. E.g. <hr />. Requires use_trailing_solidus.
        sanitize=False|True
          Strip all unsafe or unknown constructs from output.
          See `html5lib user documentation`_
        omit_optional_tags=True|False
          Omit start/end tags that are optional.
        alphabetical_attributes=False|True
          Reorder attributes to be in alphabetical order.

        .. _html5lib user documentation: http://code.google.com/p/html5lib/wiki/UserDocumentation
        rz2__init__() got an unexpected keyword argument '%s'r=FN)�	frozenset�options�len�	TypeError�next�iterr>�setattrr$�getattr�errors�strict)�self�kwargsZunexpected_args�attrr/r/r0�__init__ps.zHTMLSerializer.__init__cCs*t|t�st�|jr"|j|jd�S|SdS)Nr2)rr�AssertionErrorr7�encode)rT�stringr/r/r0rY�szHTMLSerializer.encodecCs*t|t�st�|jr"|j|jd�S|SdS)NrS)rrrXr7rY)rTrZr/r/r0�encodeStrict�szHTMLSerializer.encodeStrictNccs�||_d}g|_|r0|jr0ddlm}|||�}|jrJddlm}||�}|jrdddlm}||�}|j	r~ddl
m}||�}|jr�ddlm}||�}�xR|D�]H}|d}|dk�r`d|d}|dr�|d	|d7}n|d
r�|d7}|d
�rJ|d
j
d�d
k�r0|d
j
d�d
k�r*|jd�d}nd}|d||d
|f7}|d7}|j|�Vq�|d5k�r�|dk�sz|�r�|�r�|dj
d�d
k�r�|jd�|j|d�Vn|jt|d��Vq�|d6k�r�|d}	|jd|	�V|	tk�r|j�rd}n|�r|jd��x�|dj�D�]�\\}
}}|}
|}|jd�V|j|
�V|j�s�|
tj|	t��k�r"|
tjdt��k�r"|jd�V|jdk�s�t|�d
k�r�d}n@|jd k�r�tj|�dk	}n$|jd!k�r�tj|�dk	}ntd"��|jd#d$�}|j �r|jd%d&�}|�r�|j!}|j"�rTd|k�r<d|k�r<d}nd|k�rTd|k�rTd}|dk�rl|jdd'�}n|jdd(�}|j|�V|j|�V|j|�Vn|j|�V�q"W|	t#k�r�|j$�r�|j%�r�|jd)�Vn|jd*�V|jd�Vq�|d+k�r6|d}	|	tk�rd}n|�r$|jd�|jd,|	�Vq�|d-k�rx|d}|j
d.�d
k�rb|jd/�|jd0|d�Vq�|d1k�r�|d}	|	d2}|t&k�r�|jd3|	�|j'�r�|t(k�r�t&|}nd4|	}|j|�Vq�|j|d�q�WdS)7NFr)�Filter�typeZDoctypez<!DOCTYPE %s�nameZpublicIdz PUBLIC "%s"ZsystemIdz SYSTEMr;r�'zASystem identifer contains both single and double quote charactersz %s%s%s�>�
Characters�SpaceCharacters�dataz</zUnexpected </ in CDATA�StartTag�EmptyTagz<%sTz+Unexpected child element of a CDATA element� r�=�always�specr:z?quote_attr_values must be one of: 'always', 'spec', or 'legacy'rz&amp;�<z&lt;z&#39;z&quot;z /�/ZEndTagz</%s>�Commentz--zComment contains --z	<!--%s-->ZEntityrzEntity %s not recognizedz&%s;)rarb)rdre))r7rRrGZfilters.inject_meta_charsetr\rFZfilters.alphabeticalattributesrHZfilters.whitespacerIZfilters.sanitizerr?Zfilters.optionaltags�find�serializeErrorr[rYrrrD�itemsr@r
r$�tupler<rL�_quoteAttributeSpec�search�_quoteAttributeLegacy�
ValueError�replacerCr=r>r	rArBr
rEr)rT�
treewalkerr7Zin_cdatar\�tokenr]Zdoctyper=r^�_Z	attr_nameZ
attr_value�k�vZ
quote_attrrc�keyr/r/r0r9�s�


















zHTMLSerializer.serializecCs2|rdjt|j||���Sdjt|j|���SdS)N�r)r'�listr9)rTrvr7r/r/r0r5?szHTMLSerializer.render�XXX ERROR MESSAGE NEEDEDcCs|jj|�|jrt�dS)N)rRr"rS�SerializeError)rTrcr/r/r0rnEszHTMLSerializer.serializeError)r<r=r>r?r@rArBrCrDrErFrGrHrI)N)N)r~)�__name__�
__module__�__qualname__r<r=r>r?r@rArBrCrDrErFrGrHrIrKrWrYr[r9r5rnr/r/r/r0r4Qs68


r4c@seZdZdZdS)rzError in serialized treeN)r�r�r��__doc__r/r/r/r0rLsr)r3N)+Z
__future__rrrZpip._vendor.sixr�re�codecsrrZ	constantsr	r
rrr
rrrrZxml.sax.saxutilsrr'Z_quoteAttributeSpecChars�compilerqrsr#rLZ_is_ucs4r}roryrzr r!�islowerr1r9rr4�	Exceptionrr/r/r/r0�<module>s:
	

|site-packages/pip/_vendor/html5lib/__pycache__/_tokenizer.cpython-36.pyc000064400000122141147511334600022244 0ustar003

���e$+�@s�ddlmZmZmZddlmZddlmZddl	m
Z
ddl	mZddl	mZm
Z
ddl	mZmZmZdd	l	mZmZdd
l	mZddlmZddlmZee�ZGd
d�de�ZdS)�)�absolute_import�division�unicode_literals)�unichr)�deque�)�spaceCharacters)�entities)�asciiLetters�asciiUpper2Lower)�digits�	hexDigits�EOF)�
tokenTypes�
tagTokenTypes)�replacementCharacters)�HTMLInputStream)�TriecsdeZdZdZd��fdd�	Zdd�Zdd�Zd�d
d�Zdd
�Zdd�Z	dd�Z
dd�Zdd�Zdd�Z
dd�Zdd�Zdd�Zdd�Zd d!�Zd"d#�Zd$d%�Zd&d'�Zd(d)�Zd*d+�Zd,d-�Zd.d/�Zd0d1�Zd2d3�Zd4d5�Zd6d7�Zd8d9�Zd:d;�Zd<d=�Z d>d?�Z!d@dA�Z"dBdC�Z#dDdE�Z$dFdG�Z%dHdI�Z&dJdK�Z'dLdM�Z(dNdO�Z)dPdQ�Z*dRdS�Z+dTdU�Z,dVdW�Z-dXdY�Z.dZd[�Z/d\d]�Z0d^d_�Z1d`da�Z2dbdc�Z3ddde�Z4dfdg�Z5dhdi�Z6djdk�Z7dldm�Z8dndo�Z9dpdq�Z:drds�Z;dtdu�Z<dvdw�Z=dxdy�Z>dzd{�Z?d|d}�Z@d~d�ZAd�d��ZBd�d��ZCd�d��ZDd�d��ZEd�d��ZFd�d��ZGd�d��ZHd�d��ZId�d��ZJd�d��ZKd�d��ZL�ZMS)��
HTMLTokenizera	 This class takes care of tokenizing HTML.

    * self.currentToken
      Holds the token that is currently being processed.

    * self.state
      Holds a reference to the method to be invoked... XXX

    * self.stream
      Points to HTMLInputStream object.
    NcsFt|f|�|_||_d|_g|_|j|_d|_d|_t	t
|�j�dS)NF)r�stream�parserZ
escapeFlagZ
lastFourChars�	dataState�state�escape�currentToken�superr�__init__)�selfrr�kwargs)�	__class__�� /usr/lib/python3.6/_tokenizer.pyr"szHTMLTokenizer.__init__ccs\tg�|_xL|j�rVx&|jjr:td|jjjd�d�VqWx|jrR|jj�Vq>WqWdS)z� This is where the magic happens.

        We do our usually processing through the states and when we have a token
        to return we yield the token which pauses processing until the next token
        is requested.
        �
ParseErrorr)�type�dataN)r�
tokenQueuerr�errorsr�pop�popleft)rr r r!�__iter__1s


zHTMLTokenizer.__iter__c	%Cs(t}d}|rt}d}g}|jj�}x(||krJ|tk	rJ|j|�|jj�}q$Wtdj|�|�}|tkr�t|}|j	jt
ddd|id���nld|ko�d	kns�|d
kr�d}|j	jt
ddd|id���n(d|ko�d
kn�s�d|ko�dkn�s�d|k�odkn�s�d|k�o4dkn�s�|tddddddddddddd d!d"d#d$d%d&d'd(d)d*d+d,d-d.d/d0d1d2d3d4d5d
g#�k�r�|j	jt
ddd|id��yt|�}Wn>t
k
�r�|d6}td|d?B�td7|d8@B�}YnX|d9k�r$|j	jt
dd:d;��|jj|�|S)<z�This function returns either U+FFFD or the character based on the
        decimal or hexadecimal representation. It also discards ";" if present.
        If not present self.tokenQueue.append({"type": tokenTypes["ParseError"]}) is invoked.
        �
��r"z$illegal-codepoint-for-numeric-entity�	charAsInt)r#r$�datavarsi�i��i��u�r�����i�i��i��i��i��i��i��i��i��i��i��i��i��i��i��i��i��i��i��i��i��	i��	i��
i��
i��i��i��i��i��
i��
i��i��i��i��i��ii�i��;z numeric-entity-without-semicolon)r#r$)rr
r�charr�append�int�joinrr%r�	frozenset�chr�
ValueError�unget)	rZisHexZallowed�radix�	charStack�cr-r6�vr r r!�consumeNumberEntityAs`

&

z!HTMLTokenizer.consumeNumberEntityFc	
Cs�d}|jj�g}|dtksB|dtddfksB|dk	rV||dkrV|jj|d��n"|ddk�rd}|j|jj��|ddkr�d	}|j|jj��|r�|dtks�|r�|dtkr�|jj|d�|j|�}n4|j	jt
d
dd��|jj|j��dd
j|�}�njx8|dtk	�rFt
jd
j|���s2P|j|jj���qWy$t
jd
j|dd���}t|�}Wntk
�r�d}YnX|dk	�rD|ddk�r�|j	jt
d
dd��|ddk�r|�r||tk�s�||tk�s�||dk�r|jj|j��dd
j|�}n.t|}|jj|j��|d
j||d��7}n4|j	jt
d
dd��|jj|j��dd
j|�}|�r�|jddd|7<n*|tk�r�d}nd}|j	jt
||d��dS)N�&r�<�#Fr�x�XTr"zexpected-numeric-entity)r#r$r,r5znamed-entity-without-semicolon�=zexpected-named-entityr$�SpaceCharacters�
Characters���)rFrGrKrKrKrKrKrKrKrK)rr6rrr=r7r
rrBr%rr'r9�entitiesTrieZhas_keys_with_prefixZlongest_prefix�len�KeyErrorr
r	r)	r�allowedChar�
fromAttribute�outputr?�hexZ
entityNameZentityLengthZ	tokenTyper r r!�
consumeEntity�sf





zHTMLTokenizer.consumeEntitycCs|j|dd�dS)zIThis method replaces the need for "entityInAttributeValueState".
        T)rOrPN)rS)rrOr r r!�processEntityInAttribute�sz&HTMLTokenizer.processEntityInAttributecCs�|j}|dtkrp|djt�|d<|dtdkrp|drR|jjtddd��|drp|jjtdd	d��|jj|�|j|_d
S)z�This method is a generic handler for emitting the tags. It also sets
        the state to "data" because that's what's needed after a token has been
        emitted.
        r#�name�EndTagr$r"zattributes-in-end-tag)r#r$�selfClosingzself-closing-flag-on-end-tagN)	rr�	translaterrr%r7rr)r�tokenr r r!�emitCurrentToken�s

zHTMLTokenizer.emitCurrentTokencCs�|jj�}|dkr|j|_n�|dkr.|j|_n�|dkrd|jjtddd��|jjtddd��n`|tkrpdS|t	kr�|jjtd	||jj
t	d
�d��n&|jj
d�}|jjtd||d��d
S)NrCrD�r"zinvalid-codepoint)r#r$rJFrIT)rCrDr[)rr6�entityDataStater�tagOpenStater%r7rrr�
charsUntil)rr$�charsr r r!r�s&



zHTMLTokenizer.dataStatecCs|j�|j|_dS)NT)rSrr)rr r r!r\szHTMLTokenizer.entityDataStatecCs�|jj�}|dkr|j|_n�|dkr.|j|_n�|tkr:dS|dkrp|jjtddd��|jjtdd	d��nT|t	kr�|jjtd
||jj
t	d�d��n&|jj
d�}|jjtd||d��dS)
NrCrDFr[r"zinvalid-codepoint)r#r$rJu�rIT)rCrDr[)rr6�characterReferenceInRcdatar�rcdataLessThanSignStaterr%r7rrr^)rr$r_r r r!�rcdataStates&



zHTMLTokenizer.rcdataStatecCs|j�|j|_dS)NT)rSrbr)rr r r!r`1sz(HTMLTokenizer.characterReferenceInRcdatacCs�|jj�}|dkr|j|_nh|dkrR|jjtddd��|jjtddd��n2|tkr^dS|jjd
�}|jjtd||d��d	S)NrDr[r"zinvalid-codepoint)r#r$rJu�FT)rDr[)	rr6�rawtextLessThanSignStaterr%r7rrr^)rr$r_r r r!�rawtextState6s


zHTMLTokenizer.rawtextStatecCs�|jj�}|dkr|j|_nh|dkrR|jjtddd��|jjtddd��n2|tkr^dS|jjd
�}|jjtd||d��d	S)NrDr[r"zinvalid-codepoint)r#r$rJu�FT)rDr[)	rr6�scriptDataLessThanSignStaterr%r7rrr^)rr$r_r r r!�scriptDataStateHs


zHTMLTokenizer.scriptDataStatecCsr|jj�}|tkrdS|dkrL|jjtddd��|jjtddd��n"|jjtd||jjd�d��dS)	NFr[r"zinvalid-codepoint)r#r$rJu�T)rr6rr%r7rr^)rr$r r r!�plaintextStateZs

zHTMLTokenizer.plaintextStatecCs|jj�}|dkr|j|_n�|dkr.|j|_n�|tkrVtd|gddd�|_|j|_n�|dkr�|j	j
tddd	��|j	j
td
dd	��|j|_nt|dkr�|j	j
tdd
d	��|jj|�|j
|_n@|j	j
tddd	��|j	j
td
dd	��|jj|�|j|_dS)N�!�/ZStartTagF)r#rUr$rWZselfClosingAcknowledged�>r"z'expected-tag-name-but-got-right-bracket)r#r$rJz<>�?z'expected-tag-name-but-got-question-markzexpected-tag-namerDT)rr6�markupDeclarationOpenStater�closeTagOpenStater
rr�tagNameStater%r7rr=�bogusCommentState)rr$r r r!r]is6









zHTMLTokenizer.tagOpenStatecCs�|jj�}|tkr0td|gdd�|_|j|_n�|dkrX|jjtddd��|j	|_nn|t
kr�|jjtddd��|jjtd	d
d��|j	|_n0|jjtddd|id
��|jj|�|j|_dS)NrVF)r#rUr$rWrjr"z*expected-closing-tag-but-got-right-bracket)r#r$z expected-closing-tag-but-got-eofrJz</z!expected-closing-tag-but-got-charr$)r#r$r.T)
rr6r
rrrnrr%r7rrr=ro)rr$r r r!rm�s(





zHTMLTokenizer.closeTagOpenStatecCs�|jj�}|tkr|j|_n�|dkr.|j�n~|tkrV|jjt	ddd��|j
|_nV|dkrh|j|_nD|dkr�|jjt	ddd��|jdd	7<n|jd|7<d
S)Nrjr"zeof-in-tag-name)r#r$rir[zinvalid-codepointrUu�T)
rr6r�beforeAttributeNameStaterrZrr%r7rr�selfClosingStartTagStater)rr$r r r!rn�s"






zHTMLTokenizer.tagNameStatecCsP|jj�}|dkr"d|_|j|_n*|jjtddd��|jj|�|j	|_dS)Nrir,rJrD)r#r$T)
rr6�temporaryBuffer�rcdataEndTagOpenStaterr%r7rr=rb)rr$r r r!ra�s

z%HTMLTokenizer.rcdataLessThanSignStatecCsX|jj�}|tkr*|j|7_|j|_n*|jjtddd��|jj	|�|j
|_dS)NrJz</)r#r$T)rr6r
rr�rcdataEndTagNameStaterr%r7rr=rb)rr$r r r!rs�s

z#HTMLTokenizer.rcdataEndTagOpenStatecCs|jo|jdj�|jj�k}|jj�}|tkrT|rTtd|jgdd�|_|j|_n�|dkr�|r�td|jgdd�|_|j	|_n||dkr�|r�td|jgdd�|_|j
�|j|_nH|tkr�|j|7_n0|j
jtdd|jd	��|jj|�|j|_d
S)NrUrVF)r#rUr$rWrirjrJz</)r#r$T)r�lowerrrrr6rrrprrqrZrr
r%r7r=rb)r�appropriater$r r r!rt�s2



z#HTMLTokenizer.rcdataEndTagNameStatecCsP|jj�}|dkr"d|_|j|_n*|jjtddd��|jj|�|j	|_dS)Nrir,rJrD)r#r$T)
rr6rr�rawtextEndTagOpenStaterr%r7rr=rd)rr$r r r!rc�s

z&HTMLTokenizer.rawtextLessThanSignStatecCsX|jj�}|tkr*|j|7_|j|_n*|jjtddd��|jj	|�|j
|_dS)NrJz</)r#r$T)rr6r
rr�rawtextEndTagNameStaterr%r7rr=rd)rr$r r r!rw�s

z$HTMLTokenizer.rawtextEndTagOpenStatecCs|jo|jdj�|jj�k}|jj�}|tkrT|rTtd|jgdd�|_|j|_n�|dkr�|r�td|jgdd�|_|j	|_n||dkr�|r�td|jgdd�|_|j
�|j|_nH|tkr�|j|7_n0|j
jtdd|jd	��|jj|�|j|_d
S)NrUrVF)r#rUr$rWrirjrJz</)r#r$T)rrurrrr6rrrprrqrZrr
r%r7r=rd)rrvr$r r r!rxs2



z$HTMLTokenizer.rawtextEndTagNameStatecCsx|jj�}|dkr"d|_|j|_nR|dkrJ|jjtddd��|j|_n*|jjtddd��|jj	|�|j
|_dS)	Nrir,rhrJz<!)r#r$rDT)rr6rr�scriptDataEndTagOpenStaterr%r7r�scriptDataEscapeStartStater=rf)rr$r r r!res


z)HTMLTokenizer.scriptDataLessThanSignStatecCsX|jj�}|tkr*|j|7_|j|_n*|jjtddd��|jj	|�|j
|_dS)NrJz</)r#r$T)rr6r
rr�scriptDataEndTagNameStaterr%r7rr=rf)rr$r r r!ry,s

z'HTMLTokenizer.scriptDataEndTagOpenStatecCs|jo|jdj�|jj�k}|jj�}|tkrT|rTtd|jgdd�|_|j|_n�|dkr�|r�td|jgdd�|_|j	|_n||dkr�|r�td|jgdd�|_|j
�|j|_nH|tkr�|j|7_n0|j
jtdd|jd	��|jj|�|j|_d
S)NrUrVF)r#rUr$rWrirjrJz</)r#r$T)rrurrrr6rrrprrqrZrr
r%r7r=rf)rrvr$r r r!r{7s2



z'HTMLTokenizer.scriptDataEndTagNameStatecCsJ|jj�}|dkr2|jjtddd��|j|_n|jj|�|j|_dS)N�-rJ)r#r$T)	rr6r%r7r�scriptDataEscapeStartDashStaterr=rf)rr$r r r!rzSs

z(HTMLTokenizer.scriptDataEscapeStartStatecCsJ|jj�}|dkr2|jjtddd��|j|_n|jj|�|j|_dS)Nr|rJ)r#r$T)	rr6r%r7r�scriptDataEscapedDashDashStaterr=rf)rr$r r r!r}]s

z,HTMLTokenizer.scriptDataEscapeStartDashStatecCs�|jj�}|dkr2|jjtddd��|j|_n�|dkrD|j|_nn|dkrz|jjtddd��|jjtddd��n8|tkr�|j	|_n&|jj
d
�}|jjtd||d��d	S)Nr|rJ)r#r$rDr[r"zinvalid-codepointu�T)rDr|r[)rr6r%r7r�scriptDataEscapedDashStater�"scriptDataEscapedLessThanSignStaterrr^)rr$r_r r r!�scriptDataEscapedStategs"




z$HTMLTokenizer.scriptDataEscapedStatecCs�|jj�}|dkr2|jjtddd��|j|_n�|dkrD|j|_nn|dkr�|jjtddd��|jjtddd��|j|_n0|t	kr�|j
|_n|jjtd|d��|j|_d	S)
Nr|rJ)r#r$rDr[r"zinvalid-codepointu�T)rr6r%r7rr~rr�r�rr)rr$r r r!r{s"






z(HTMLTokenizer.scriptDataEscapedDashStatecCs�|jj�}|dkr*|jjtddd��n�|dkr<|j|_n�|dkrd|jjtddd��|j|_nn|dkr�|jjtddd��|jjtdd	d��|j|_n0|t	kr�|j
|_n|jjtd|d��|j|_d
S)Nr|rJ)r#r$rDrjr[r"zinvalid-codepointu�T)rr6r%r7rr�rrfr�rr)rr$r r r!r~�s&






z,HTMLTokenizer.scriptDataEscapedDashDashStatecCs�|jj�}|dkr"d|_|j|_n\|tkrT|jjtdd|d��||_|j	|_n*|jjtddd��|jj
|�|j|_dS)Nrir,rJrD)r#r$T)rr6rr� scriptDataEscapedEndTagOpenStaterr
r%r7r� scriptDataDoubleEscapeStartStater=r�)rr$r r r!r��s


z0HTMLTokenizer.scriptDataEscapedLessThanSignStatecCsP|jj�}|tkr"||_|j|_n*|jjtddd��|jj	|�|j
|_dS)NrJz</)r#r$T)rr6r
rr� scriptDataEscapedEndTagNameStaterr%r7rr=r�)rr$r r r!r��s

z.HTMLTokenizer.scriptDataEscapedEndTagOpenStatecCs|jo|jdj�|jj�k}|jj�}|tkrT|rTtd|jgdd�|_|j|_n�|dkr�|r�td|jgdd�|_|j	|_n||dkr�|r�td|jgdd�|_|j
�|j|_nH|tkr�|j|7_n0|j
jtdd|jd	��|jj|�|j|_d
S)NrUrVF)r#rUr$rWrirjrJz</)r#r$T)rrurrrr6rrrprrqrZrr
r%r7r=r�)rrvr$r r r!r��s2



z.HTMLTokenizer.scriptDataEscapedEndTagNameStatecCs�|jj�}|ttd�BkrR|jjtd|d��|jj�dkrH|j	|_
q�|j|_
nB|tkr�|jjtd|d��|j|7_n|jj
|�|j|_
dS)NrirjrJ)r#r$�scriptT)rirj)rr6rr:r%r7rrrru�scriptDataDoubleEscapedStaterr�r
r=)rr$r r r!r��s


z.HTMLTokenizer.scriptDataDoubleEscapeStartStatecCs�|jj�}|dkr2|jjtddd��|j|_n�|dkrZ|jjtddd��|j|_nt|dkr�|jjtddd��|jjtddd��n>|tkr�|jjtdd	d��|j	|_n|jjtd|d��d
S)Nr|rJ)r#r$rDr[r"zinvalid-codepointu�zeof-in-script-in-scriptT)
rr6r%r7r� scriptDataDoubleEscapedDashStater�(scriptDataDoubleEscapedLessThanSignStaterr)rr$r r r!r��s$





z*HTMLTokenizer.scriptDataDoubleEscapedStatecCs�|jj�}|dkr2|jjtddd��|j|_n�|dkrZ|jjtddd��|j|_n�|dkr�|jjtddd��|jjtddd��|j|_nF|t	kr�|jjtdd	d��|j
|_n|jjtd|d��|j|_d
S)Nr|rJ)r#r$rDr[r"zinvalid-codepointu�zeof-in-script-in-scriptT)rr6r%r7r�$scriptDataDoubleEscapedDashDashStaterr�r�rr)rr$r r r!r�s(







z.HTMLTokenizer.scriptDataDoubleEscapedDashStatecCs|jj�}|dkr*|jjtddd��n�|dkrR|jjtddd��|j|_n�|dkrz|jjtddd��|j|_n�|dkr�|jjtddd��|jjtdd	d��|j|_nF|t	kr�|jjtdd
d��|j
|_n|jjtd|d��|j|_dS)Nr|rJ)r#r$rDrjr[r"zinvalid-codepointu�zeof-in-script-in-scriptT)rr6r%r7rr�rrfr�rr)rr$r r r!r�s,







z2HTMLTokenizer.scriptDataDoubleEscapedDashDashStatecCsP|jj�}|dkr8|jjtddd��d|_|j|_n|jj|�|j	|_dS)NrirJ)r#r$r,T)
rr6r%r7rrr�scriptDataDoubleEscapeEndStaterr=r�)rr$r r r!r�0s

z6HTMLTokenizer.scriptDataDoubleEscapedLessThanSignStatecCs�|jj�}|ttd�BkrR|jjtd|d��|jj�dkrH|j	|_
q�|j|_
nB|tkr�|jjtd|d��|j|7_n|jj
|�|j|_
dS)NrirjrJ)r#r$r�T)rirj)rr6rr:r%r7rrrrur�rr�r
r=)rr$r r r!r�;s


z,HTMLTokenizer.scriptDataDoubleEscapeEndStatecCs0|jj�}|tkr$|jjtd��n|tkrJ|jdj|dg�|j|_n�|dkr\|j	�n�|dkrn|j
|_n�|dkr�|jjtd
dd��|jdj|dg�|j|_n�|d
kr�|jjtd
dd��|jdjddg�|j|_nF|t
k�r|jjtd
dd��|j|_n|jdj|dg�|j|_dS)NTr$r,rjri�'�"rHrDr"z#invalid-character-in-attribute-name)r#r$r[zinvalid-codepointu�z#expected-attribute-name-but-got-eof)r�r�rHrD)rr6rr^r
rr7�attributeNameStaterrZrqr%rrr)rr$r r r!rpKs6










z&HTMLTokenizer.beforeAttributeNameStatecCs�|jj�}d}d}|dkr&|j|_�n0|tkr^|jddd||jjtd�7<d}�n�|dkrld}n�|tkr~|j|_n�|dkr�|j	|_n�|d	kr�|j
jtd
dd��|jdddd
7<d}n�|dk�r|j
jtd
dd��|jddd|7<d}nH|t
k�r8|j
jtd
dd��|j|_n|jddd|7<d}|�r�|jdddjt�|jddd<xP|jddd�D]:\}}|jddd|k�r�|j
jtd
dd��P�q�W|�r�|j�dS)NTFrHr$rrrjrir[r"zinvalid-codepoint)r#r$u�r�r�rDz#invalid-character-in-attribute-namezeof-in-attribute-namezduplicate-attributerKrK)r�r�rDrKrKrKrKrKrK)rr6�beforeAttributeValueStaterr
rr^r�afterAttributeNameStaterqr%r7rrrrXrrZ)rr$ZleavingThisStateZ	emitTokenrU�_r r r!r�isR








&
z HTMLTokenizer.attributeNameStatecCsF|jj�}|tkr$|jjtd��n|dkr8|j|_�n
|dkrJ|j�n�|tkrp|jdj	|dg�|j
|_n�|dkr�|j|_n�|dkr�|jj	t
dd	d
��|jdj	ddg�|j
|_n�|dk�r�|jj	t
ddd
��|jdj	|dg�|j
|_nF|tk�r&|jj	t
ddd
��|j|_n|jdj	|dg�|j
|_dS)NTrHrjr$r,rir[r"zinvalid-codepoint)r#r$u�r�r�rDz&invalid-character-after-attribute-namezexpected-end-of-tag-but-got-eof)r�r�rD)rr6rr^r�rrZr
rr7r�rqr%rrr)rr$r r r!r��s:











z%HTMLTokenizer.afterAttributeNameStatecCsj|jj�}|tkr$|jjtd��nB|dkr8|j|_�n.|dkrX|j|_|jj|��n|dkrl|j|_�n�|dkr�|j	j
tddd��|j�n�|d	kr�|j	j
tdd
d��|j
dddd
7<|j|_n�|dk�r|j	j
tddd��|j
ddd|7<|j|_nL|tk�rD|j	j
tddd��|j|_n"|j
ddd|7<|j|_dS)NTr�rCr�rjr"z.expected-attribute-value-but-got-right-bracket)r#r$r[zinvalid-codepointr$ru�rHrD�`z"equals-in-unquoted-attribute-valuez$expected-attribute-value-but-got-eofrK)rHrDr�rKrK)rr6rr^�attributeValueDoubleQuotedStater�attributeValueUnQuotedStater=�attributeValueSingleQuotedStater%r7rrZrrr)rr$r r r!r��s>










z'HTMLTokenizer.beforeAttributeValueStatecCs�|jj�}|dkr|j|_n�|dkr0|jd�n�|dkrj|jjtddd��|jdddd	7<nN|t	kr�|jjtdd
d��|j
|_n&|jdd
d||jjd�7<dS)Nr�rCr[r"zinvalid-codepoint)r#r$r$ru�z#eof-in-attribute-value-double-quoteTrKrK)r�rCr[)rr6�afterAttributeValueStaterrTr%r7rrrrr^)rr$r r r!r��s 




z-HTMLTokenizer.attributeValueDoubleQuotedStatecCs�|jj�}|dkr|j|_n�|dkr0|jd�n�|dkrj|jjtddd��|jdddd	7<nN|t	kr�|jjtdd
d��|j
|_n&|jdd
d||jjd�7<dS)Nr�rCr[r"zinvalid-codepoint)r#r$r$ru�z#eof-in-attribute-value-single-quoteTrKrK)r�rCr[)rr6r�rrTr%r7rrrrr^)rr$r r r!r��s 




z-HTMLTokenizer.attributeValueSingleQuotedStatecCs|jj�}|tkr|j|_�n�|dkr2|jd�n�|dkrD|j�n�|dkr~|jjt	dd	d
��|j
ddd|7<n�|d
kr�|jjt	ddd
��|j
dddd7<nV|tkr�|jjt	ddd
��|j|_n.|j
ddd||jj
td�tB�7<dS)NrCrjr�r�rHrDr�r"z0unexpected-character-in-unquoted-attribute-value)r#r$r$rr[zinvalid-codepointu�z eof-in-attribute-value-no-quotesT)r�r�rHrDr�rKrKrK)rCrjr�r�rHrDr�r[)rr6rrprrTrZr%r7rrrrr^r:)rr$r r r!r�s,





z)HTMLTokenizer.attributeValueUnQuotedStatecCs�|jj�}|tkr|j|_n�|dkr.|j�np|dkr@|j|_n^|tkrt|jj	t
ddd��|jj|�|j|_n*|jj	t
ddd��|jj|�|j|_dS)Nrjrir"z$unexpected-EOF-after-attribute-value)r#r$z*unexpected-character-after-attribute-valueT)
rr6rrprrZrqrr%r7rr=r)rr$r r r!r� s"






z&HTMLTokenizer.afterAttributeValueStatecCs�|jj�}|dkr&d|jd<|j�n^|tkrZ|jjtddd��|jj|�|j	|_
n*|jjtddd��|jj|�|j|_
dS)NrjTrWr"z#unexpected-EOF-after-solidus-in-tag)r#r$z)unexpected-character-after-solidus-in-tag)rr6rrZrr%r7rr=rrrp)rr$r r r!rq4s





z&HTMLTokenizer.selfClosingStartTagStatecCsD|jjd�}|jdd�}|jjtd|d��|jj�|j|_dS)Nrjr[u��Comment)r#r$T)	rr^�replacer%r7rr6rr)rr$r r r!roFs
zHTMLTokenizer.bogusCommentStatecCs�|jj�g}|ddkrT|j|jj��|ddkrPtddd�|_|j|_dS�n�|ddkr�d}x.d&D]&}|j|jj��|d'|krjd}PqjW|r�tdddddd�|_|j|_dSn�|d(dk�rH|jdk	�rH|jj	j
�rH|jj	j
d)j|jj	jk�rHd}x2d*D]*}|j|jj��|d+|k�rd}P�qW|�rH|j
|_dS|jjtddd��x|�rz|jj|j���q`W|j|_dS),Nrr|r�r,)r#r$T�d�D�o�Or@�C�t�T�y�Y�p�P�e�EFZDoctype)r#rU�publicId�systemId�correct�[�Ar"zexpected-dashes-or-doctyperKrKrK)r�r��r�r��r@r��r�r��r�r��r�r��r�r�)r�r�r�r�r�r�rKrKrK)r�r�r�r�r�r�rK)rr6r7rr�commentStartStater�doctypeStaterZtreeZopenElements�	namespaceZdefaultNamespace�cdataSectionStater%r=r'ro)rr?�matched�expectedr r r!rlUsR


z(HTMLTokenizer.markupDeclarationOpenStatecCs�|jj�}|dkr|j|_n�|dkrN|jjtddd��|jdd7<n�|dkr�|jjtdd	d��|jj|j�|j|_nP|t	kr�|jjtdd
d��|jj|j�|j|_n|jd|7<|j
|_dS)Nr|r[r"zinvalid-codepoint)r#r$r$u�rjzincorrect-commentzeof-in-commentT)rr6�commentStartDashStaterr%r7rrrr�commentState)rr$r r r!r��s(






zHTMLTokenizer.commentStartStatecCs�|jj�}|dkr|j|_n�|dkrN|jjtddd��|jdd7<n�|dkr�|jjtdd	d��|jj|j�|j|_nT|t	kr�|jjtdd
d��|jj|j�|j|_n|jdd|7<|j
|_dS)Nr|r[r"zinvalid-codepoint)r#r$r$u-�rjzincorrect-commentzeof-in-commentT)rr6�commentEndStaterr%r7rrrrr�)rr$r r r!r��s(






z#HTMLTokenizer.commentStartDashStatecCs�|jj�}|dkr|j|_n�|dkrN|jjtddd��|jdd7<nT|tkr�|jjtddd��|jj|j�|j	|_n|jd||jj
d
�7<d	S)Nr|r[r"zinvalid-codepoint)r#r$r$u�zeof-in-commentT)r|r[)rr6�commentEndDashStaterr%r7rrrrr^)rr$r r r!r��s




zHTMLTokenizer.commentStatecCs�|jj�}|dkr|j|_n�|dkrV|jjtddd��|jdd7<|j|_nT|t	kr�|jjtddd��|jj|j�|j
|_n|jdd|7<|j|_d	S)
Nr|r[r"zinvalid-codepoint)r#r$r$u-�zeof-in-comment-end-dashT)rr6r�rr%r7rrr�rr)rr$r r r!r��s 





z!HTMLTokenizer.commentEndDashStatecCs,|jj�}|dkr*|jj|j�|j|_n�|dkrd|jjtddd��|jdd7<|j|_n�|dkr�|jjtdd	d��|j	|_n�|d
kr�|jjtddd��|jd|7<nj|t
kr�|jjtddd��|jj|j�|j|_n4|jjtdd
d��|jdd|7<|j|_dS)Nrjr[r"zinvalid-codepoint)r#r$r$u--�rhz,unexpected-bang-after-double-dash-in-commentr|z,unexpected-dash-after-double-dash-in-commentzeof-in-comment-double-dashzunexpected-char-in-commentz--T)rr6r%r7rrrrr��commentEndBangStater)rr$r r r!r��s6









zHTMLTokenizer.commentEndStatecCs�|jj�}|dkr*|jj|j�|j|_n�|dkrN|jdd7<|j|_n�|dkr�|jjtddd��|jdd	7<|j	|_nT|t
kr�|jjtdd
d��|jj|j�|j|_n|jdd|7<|j	|_dS)Nrjr|r$z--!r[r"zinvalid-codepoint)r#r$u--!�zeof-in-comment-end-bang-stateT)rr6r%r7rrrr�rr�r)rr$r r r!r��s(






z!HTMLTokenizer.commentEndBangStatecCs�|jj�}|tkr|j|_nj|tkr\|jjtddd��d|j	d<|jj|j	�|j
|_n*|jjtddd��|jj|�|j|_dS)Nr"z!expected-doctype-name-but-got-eof)r#r$Fr�zneed-space-after-doctypeT)rr6r�beforeDoctypeNameStaterrr%r7rrrr=)rr$r r r!r�s





zHTMLTokenizer.doctypeStatecCs�|jj�}|tkrn�|dkrT|jjtddd��d|jd<|jj|j�|j|_n�|dkr�|jjtddd��d	|jd
<|j	|_nR|t
kr�|jjtddd��d|jd<|jj|j�|j|_n||jd
<|j	|_dS)
Nrjr"z+expected-doctype-name-but-got-right-bracket)r#r$Fr�r[zinvalid-codepointu�rUz!expected-doctype-name-but-got-eofT)rr6rr%r7rrrr�doctypeNameStater)rr$r r r!r�s.










z$HTMLTokenizer.beforeDoctypeNameStatecCs|jj�}|tkr2|jdjt�|jd<|j|_n�|dkrh|jdjt�|jd<|jj	|j�|j
|_n�|dkr�|jj	tddd��|jdd7<|j|_nh|t
kr�|jj	tddd��d	|jd
<|jdjt�|jd<|jj	|j�|j
|_n|jd|7<dS)NrUrjr[r"zinvalid-codepoint)r#r$u�zeof-in-doctype-nameFr�T)rr6rrrXr�afterDoctypeNameStaterr%r7rrr�r)rr$r r r!r�6s,







zHTMLTokenizer.doctypeNameStatecCsR|jj�}|tkr�n8|dkr8|jj|j�|j|_�n|tkr�d|jd<|jj	|�|jjt
ddd��|jj|j�|j|_�n�|d!kr�d	}x$d'D]}|jj�}||kr�d}Pq�W|r�|j|_d	SnJ|d(k�rd	}x(d.D] }|jj�}||k�r�d}P�q�W|�r|j|_d	S|jj	|�|jjt
ddd|id ��d|jd<|j
|_d	S)/NrjFr�r"zeof-in-doctype)r#r$r�r�T�u�U�b�B�l�L�i�Ir@r��s�Sr�r�r�r�r�r��m�Mz*expected-space-or-right-bracket-in-doctyper$)r#r$r.)r�r��r�r��r�r��r�r��r�r��r@r�)r�r�r�r�r�)r�r��r�r��r�r��r�r��r�r��r�r�)r�r�r�r�r�)rr6rr%r7rrrrr=r�afterDoctypePublicKeywordState�afterDoctypeSystemKeywordState�bogusDoctypeState)rr$r�r�r r r!r�OsT







z#HTMLTokenizer.afterDoctypeNameStatecCs�|jj�}|tkr|j|_n�|d
krP|jjtddd��|jj|�|j|_nT|t	kr�|jjtddd��d|j
d<|jj|j
�|j|_n|jj|�|j|_d	S)Nr�r�r"zunexpected-char-in-doctype)r#r$zeof-in-doctypeFr�T)r�r�)rr6r�"beforeDoctypePublicIdentifierStaterr%r7rr=rrr)rr$r r r!r��s"






z,HTMLTokenizer.afterDoctypePublicKeywordStatecCs�|jj�}|tkrn�|dkr0d|jd<|j|_n�|dkrLd|jd<|j|_n�|dkr�|jjt	ddd��d	|jd
<|jj|j�|j
|_nh|tkr�|jjt	ddd��d	|jd
<|jj|j�|j
|_n(|jjt	ddd��d	|jd
<|j|_d
S)Nr�r,r�r�rjr"zunexpected-end-of-doctype)r#r$Fr�zeof-in-doctypezunexpected-char-in-doctypeT)
rr6rr�(doctypePublicIdentifierDoubleQuotedStater�(doctypePublicIdentifierSingleQuotedStater%r7rrrr�)rr$r r r!r��s4












z0HTMLTokenizer.beforeDoctypePublicIdentifierStatecCs�|jj�}|dkr|j|_n�|dkrN|jjtddd��|jdd7<n�|dkr�|jjtdd	d��d
|jd<|jj|j�|j|_nR|t	kr�|jjtddd��d
|jd<|jj|j�|j|_n|jd|7<d
S)Nr�r[r"zinvalid-codepoint)r#r$r�u�rjzunexpected-end-of-doctypeFr�zeof-in-doctypeT)
rr6�!afterDoctypePublicIdentifierStaterr%r7rrrr)rr$r r r!r��s*








z6HTMLTokenizer.doctypePublicIdentifierDoubleQuotedStatecCs�|jj�}|dkr|j|_n�|dkrN|jjtddd��|jdd7<n�|dkr�|jjtdd	d��d
|jd<|jj|j�|j|_nR|t	kr�|jjtddd��d
|jd<|jj|j�|j|_n|jd|7<d
S)Nr�r[r"zinvalid-codepoint)r#r$r�u�rjzunexpected-end-of-doctypeFr�zeof-in-doctypeT)
rr6r�rr%r7rrrr)rr$r r r!r��s*








z6HTMLTokenizer.doctypePublicIdentifierSingleQuotedStatecCs|jj�}|tkr|j|_n�|dkr<|jj|j�|j|_n�|dkrn|jjt	ddd��d|jd<|j
|_n�|dkr�|jjt	ddd��d|jd<|j|_nh|tkr�|jjt	dd	d��d
|jd<|jj|j�|j|_n(|jjt	ddd��d
|jd<|j
|_dS)
Nrjr�r"zunexpected-char-in-doctype)r#r$r,r�r�zeof-in-doctypeFr�T)rr6r�-betweenDoctypePublicAndSystemIdentifiersStaterr%r7rrr�(doctypeSystemIdentifierDoubleQuotedState�(doctypeSystemIdentifierSingleQuotedStaterr�)rr$r r r!r��s6













z/HTMLTokenizer.afterDoctypePublicIdentifierStatecCs�|jj�}|tkrn�|dkr4|jj|j�|j|_n�|dkrPd|jd<|j|_n�|dkrld|jd<|j	|_nh|t
kr�|jjtddd��d	|jd
<|jj|j�|j|_n(|jjtddd��d	|jd
<|j|_dS)
Nrjr�r,r�r�r"zeof-in-doctype)r#r$Fr�zunexpected-char-in-doctypeT)
rr6rr%r7rrrr�r�rrr�)rr$r r r!r�s.










z;HTMLTokenizer.betweenDoctypePublicAndSystemIdentifiersStatecCs�|jj�}|tkr|j|_n�|d
krP|jjtddd��|jj|�|j|_nT|t	kr�|jjtddd��d|j
d<|jj|j
�|j|_n|jj|�|j|_d	S)Nr�r�r"zunexpected-char-in-doctype)r#r$zeof-in-doctypeFr�T)r�r�)rr6r�"beforeDoctypeSystemIdentifierStaterr%r7rr=rrr)rr$r r r!r�s"






z,HTMLTokenizer.afterDoctypeSystemKeywordStatecCs�|jj�}|tkrn�|dkr0d|jd<|j|_n�|dkrLd|jd<|j|_n�|dkr�|jjt	ddd��d	|jd
<|jj|j�|j
|_nh|tkr�|jjt	ddd��d	|jd
<|jj|j�|j
|_n(|jjt	ddd��d	|jd
<|j|_dS)
Nr�r,r�r�rjr"zunexpected-char-in-doctype)r#r$Fr�zeof-in-doctypeT)
rr6rrr�rr�r%r7rrrr�)rr$r r r!r�/s4












z0HTMLTokenizer.beforeDoctypeSystemIdentifierStatecCs�|jj�}|dkr|j|_n�|dkrN|jjtddd��|jdd7<n�|dkr�|jjtdd	d��d
|jd<|jj|j�|j|_nR|t	kr�|jjtddd��d
|jd<|jj|j�|j|_n|jd|7<d
S)Nr�r[r"zinvalid-codepoint)r#r$r�u�rjzunexpected-end-of-doctypeFr�zeof-in-doctypeT)
rr6�!afterDoctypeSystemIdentifierStaterr%r7rrrr)rr$r r r!r�Ls*








z6HTMLTokenizer.doctypeSystemIdentifierDoubleQuotedStatecCs�|jj�}|dkr|j|_n�|dkrN|jjtddd��|jdd7<n�|dkr�|jjtdd	d��d
|jd<|jj|j�|j|_nR|t	kr�|jjtddd��d
|jd<|jj|j�|j|_n|jd|7<d
S)Nr�r[r"zinvalid-codepoint)r#r$r�u�rjzunexpected-end-of-doctypeFr�zeof-in-doctypeT)
rr6r�rr%r7rrrr)rr$r r r!r�ds*








z6HTMLTokenizer.doctypeSystemIdentifierSingleQuotedStatecCs�|jj�}|tkrn~|dkr4|jj|j�|j|_n^|tkrt|jjt	ddd��d|jd<|jj|j�|j|_n|jjt	ddd��|j
|_dS)	Nrjr"zeof-in-doctype)r#r$Fr�zunexpected-char-in-doctypeT)rr6rr%r7rrrrrr�)rr$r r r!r�|s 





z/HTMLTokenizer.afterDoctypeSystemIdentifierStatecCsZ|jj�}|dkr*|jj|j�|j|_n,|tkrV|jj|�|jj|j�|j|_ndS)NrjT)	rr6r%r7rrrrr=)rr$r r r!r��s


zHTMLTokenizer.bogusDoctypeStatecCs�g}x�|j|jjd��|j|jjd��|jj�}|tkr@Pq|dksLt�|ddd�dkrx|ddd�|d<Pq|j|�qWdj|�}|jd�}|dkr�x&t|�D]}|j	jt
d	d
d��q�W|jdd�}|r�|j	jt
d
|d��|j|_
dS)N�]rjr�z]]r,r[rr"zinvalid-codepoint)r#r$u�rJTrK���rKr�rK)r7rr^r6r�AssertionErrorr9�count�ranger%rr�rr)rr$r6Z	nullCountr�r r r!r��s0



zHTMLTokenizer.cdataSectionState)N)NF)N�__name__�
__module__�__qualname__�__doc__rr)rBrSrTrZrr\rbr`rdrfrgr]rmrnrarsrtrcrwrxreryr{rzr}r�rr~r�r�r�r�r�r�r�r�r�rpr�r�r�r�r�r�r�rqrorlr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r��
__classcell__r r )rr!rs�H
P#

6 "-3rN)Z
__future__rrrZpip._vendor.sixrr;�collectionsrZ	constantsrr	r
rrr
rrrrZ_inputstreamrZ_trierrL�objectrr r r r!�<module>ssite-packages/pip/_vendor/html5lib/__pycache__/_inputstream.cpython-36.opt-1.pyc000064400000053240147511334600023547 0ustar003

���e�)@s�ddlmZmZmZddlmZmZddlmZm	Z	ddl
Z
ddlZddlm
Z
ddlmZmZmZmZddlmZdd	lmZdd
lmZyddlmZWnek
r�eZYnXedd
�eD��Zedd
�eD��Zedd
�eD��Zeeddg�BZdZej �r(ej!eddF�e"d�d�Z#n
ej!e�Z#e$dddddddddddd d!d"d#d$d%d&d'd(d)d*d+d,d-d.d/d0d1d2d3d4g �Z%ej!d5�Z&iZ'Gd6d7�d7e(�Z)d8d9�Z*Gd:d;�d;e(�Z+Gd<d=�d=e+�Z,Gd>d?�d?e-�Z.Gd@dA�dAe(�Z/GdBdC�dCe(�Z0dDdE�Z1dS)G�)�absolute_import�division�unicode_literals)�	text_type�binary_type)�http_client�urllibN)�webencodings�)�EOF�spaceCharacters�asciiLetters�asciiUppercase)�ReparseException)�_utils)�StringIO)�BytesIOcCsg|]}|jd��qS)�ascii)�encode)�.0�item�r�"/usr/lib/python3.6/_inputstream.py�
<listcomp>srcCsg|]}|jd��qS)r)r)rrrrrrscCsg|]}|jd��qS)r)r)rrrrrrs�>�<u�[---Ÿ﷐-﷯￾￿🿾🿿𯿾𯿿𿿾𿿿񏿾񏿿񟿾񟿿񯿾񯿿񿿾񿿿򏿾򏿿򟿾򟿿򯿾򯿿򿿾򿿿󏿾󏿿󟿾󟿿󯿾󯿿󿿾󿿿􏿾􏿿]z"\uD800-\uDFFF"�]i��i��i��i��i��i��i��i��i��i��i��i��i��i��i��i��i��	i��	i��
i��
i��i��i��i��i��
i��
i��i��i��i��i��i��z[	-
 -/:-@[-`{-~]c@sHeZdZdZdd�Zdd�Zdd�Zdd	�Zd
d�Zdd
�Z	dd�Z
dS)�BufferedStreamz�Buffering for streams that do not have buffering of their own

    The buffer is implemented as a list of chunks on the assumption that
    joining many strings will be slow since it is O(n**2)
    cCs||_g|_ddg|_dS)Nr
r���)�stream�buffer�position)�selfrrrr�__init__@szBufferedStream.__init__cCs@d}x(|jd|jd�D]}|t|�7}qW||jd7}|S)Nrr
)r r!�len)r"�pos�chunkrrr�tellEs
zBufferedStream.tellcCsH|}d}x0t|j|�|kr8|t|j|�8}|d7}q
W||g|_dS)Nrr
)r$r r!)r"r%�offset�irrr�seekLszBufferedStream.seekcCsT|js|j|�S|jdt|j�krF|jdt|jd�krF|j|�S|j|�SdS)Nrr
r)r �_readStreamr!r$�_readFromBuffer)r"�bytesrrr�readUs

zBufferedStream.readcCstdd�|jD��S)NcSsg|]}t|��qSr)r$)rrrrrr_sz1BufferedStream._bufferedBytes.<locals>.<listcomp>)�sumr )r"rrr�_bufferedBytes^szBufferedStream._bufferedBytescCs<|jj|�}|jj|�|jdd7<t|�|jd<|S)Nrr
)rr.r �appendr!r$)r"r-�datarrrr+as
zBufferedStream._readStreamcCs�|}g}|jd}|jd}x�|t|j�kr�|dkr�|j|}|t|�|krb|}|||g|_n"t|�|}|t|�g|_|d7}|j||||��||8}d}qW|r�|j|j|��dj|�S)Nrr
�)r!r$r r1r+�join)r"r-ZremainingBytes�rvZbufferIndexZbufferOffsetZbufferedDataZbytesToReadrrrr,hs$


zBufferedStream._readFromBufferN)�__name__�
__module__�__qualname__�__doc__r#r'r*r.r0r+r,rrrrr9s		rcKs�t|tj�s(t|tjj�r.t|jtj�r.d}n&t|d�rJt|jd�t	�}n
t|t	�}|r�dd�|D�}|rvt
d|��t|f|�St|f|�SdS)NFr.rcSsg|]}|jd�r|�qS)Z	_encoding)�endswith)r�xrrrr�sz#HTMLInputStream.<locals>.<listcomp>z3Cannot set an encoding with a unicode input, set %r)
�
isinstancerZHTTPResponserZresponseZaddbase�fp�hasattrr.r�	TypeError�HTMLUnicodeInputStream�HTMLBinaryInputStream)�source�kwargsZ	isUnicodeZ	encodingsrrr�HTMLInputStream�s

rDc@speZdZdZdZdd�Zdd�Zdd�Zd	d
�Zdd�Z	d
d�Z
ddd�Zdd�Zdd�Z
ddd�Zdd�ZdS)r@z�Provides a unicode stream of characters to the HTMLTokenizer.

    This class takes care of character encoding and removing or replacing
    incorrect byte-sequences and also provides column and line tracking.

    i(cCsZtjsd|_ntd�dkr$|j|_n|j|_dg|_td�df|_|j	|�|_
|j�dS)a�Initialises the HTMLInputStream.

        HTMLInputStream(source, [encoding]) -> Normalized stream from source
        for use by html5lib.

        source can be either a file-object, local filename or a string.

        The optional encoding parameter must be a string that indicates
        the encoding.  If specified, that encoding will be used,
        regardless of any BOM or later declaration (such as in a meta
        element)

        Nu􏿿r
rzutf-8�certain)r�supports_lone_surrogates�reportCharacterErrorsr$�characterErrorsUCS4�characterErrorsUCS2ZnewLines�lookupEncoding�charEncoding�
openStream�
dataStream�reset)r"rBrrrr#�s
zHTMLUnicodeInputStream.__init__cCs.d|_d|_d|_g|_d|_d|_d|_dS)N�r)r&�	chunkSize�chunkOffset�errors�prevNumLines�prevNumCols�_bufferedCharacter)r"rrrrN�szHTMLUnicodeInputStream.resetcCst|d�r|}nt|�}|S)zvProduces a file object from source.

        source can be either a file object, local filename or a string.

        r.)r>r)r"rBrrrrrL�s
z!HTMLUnicodeInputStream.openStreamcCsT|j}|jdd|�}|j|}|jdd|�}|dkr@|j|}n||d}||fS)N�
rr
r)r&�countrS�rfindrT)r"r(r&ZnLinesZpositionLineZlastLinePosZpositionColumnrrr�	_position�s
z HTMLUnicodeInputStream._positioncCs|j|j�\}}|d|fS)z:Returns (line, col) of the current position in the stream.r
)rYrQ)r"�line�colrrrr!�szHTMLUnicodeInputStream.positioncCs6|j|jkr|j�stS|j}|j|}|d|_|S)zo Read one character from the stream or queue if available. Return
            EOF when EOF is reached.
        r
)rQrP�	readChunkrr&)r"rQ�charrrrr]�s

zHTMLUnicodeInputStream.charNcCs�|dkr|j}|j|j�\|_|_d|_d|_d|_|jj|�}|j	rX|j	|}d|_	n|s`dSt
|�dkr�t|d�}|dks�d|ko�dknr�|d
|_	|dd�}|jr�|j|�|j
dd	�}|j
d
d	�}||_t
|�|_dS)NrOrFr
�
i�i��z
rV�
Trrr)�_defaultChunkSizerYrPrSrTr&rQrMr.rUr$�ordrG�replace)r"rPr2Zlastvrrrr\�s0
 


z HTMLUnicodeInputStream.readChunkcCs,x&tttj|���D]}|jjd�qWdS)Nzinvalid-codepoint)�ranger$�invalid_unicode_re�findallrRr1)r"r2�_rrrrH%sz*HTMLUnicodeInputStream.characterErrorsUCS4cCs�d}x�tj|�D]�}|rqt|j��}|j�}tj|||d��rttj|||d��}|tkrn|j	j
d�d}q|dkr�|dkr�|t|�dkr�|j	j
d�qd}|j	j
d�qWdS)NF�zinvalid-codepointTi�i��r
)rd�finditerra�group�startrZisSurrogatePairZsurrogatePairToCodepoint�non_bmp_invalid_codepointsrRr1r$)r"r2�skip�matchZ	codepointr%Zchar_valrrrrI)s z*HTMLUnicodeInputStream.characterErrorsUCS2Fc	Cs�yt||f}WnNtk
r^djdd�|D��}|s@d|}tjd|�}t||f<YnXg}x||j|j|j�}|dkr�|j|jkr�Pn0|j	�}||jkr�|j
|j|j|��||_P|j
|j|jd��|j�sfPqfWdj|�}|S)z� Returns a string of characters from the stream up to but not
        including any character in 'characters' or EOF. 'characters' must be
        a container that supports the 'in' method and iteration over its
        characters.
        rOcSsg|]}dt|��qS)z\x%02x)ra)r�crrrrNsz5HTMLUnicodeInputStream.charsUntil.<locals>.<listcomp>z^%sz[%s]+N)�charsUntilRegEx�KeyErrorr4�re�compilermr&rQrP�endr1r\)	r"Z
charactersZopposite�charsZregexr5�mrs�rrrr�
charsUntil@s. 

z!HTMLUnicodeInputStream.charsUntilcCs@|dk	r<|jdkr.||j|_|jd7_n|jd8_dS)Nrr
)rQr&rP)r"r]rrr�ungetos
zHTMLUnicodeInputStream.unget)N)F)r6r7r8r9r`r#rNrLrYr!r]r\rHrIrwrxrrrrr@�s 
&
/r@c@sLeZdZdZddd�Zdd�Zd	d
�Zddd�Zd
d�Zdd�Z	dd�Z
dS)rAz�Provides a unicode stream of characters to the HTMLTokenizer.

    This class takes care of character encoding and removing or replacing
    incorrect byte-sequences and also provides column and line tracking.

    N�windows-1252TcCs\|j|�|_tj||j�d|_d|_||_||_||_||_	||_
|j|�|_|j
�dS)a�Initialises the HTMLInputStream.

        HTMLInputStream(source, [encoding]) -> Normalized stream from source
        for use by html5lib.

        source can be either a file-object, local filename or a string.

        The optional encoding parameter must be a string that indicates
        the encoding.  If specified, that encoding will be used,
        regardless of any BOM or later declaration (such as in a meta
        element)

        i�dN)rL�	rawStreamr@r#�numBytesMeta�numBytesChardet�override_encoding�transport_encoding�same_origin_parent_encoding�likely_encoding�default_encoding�determineEncodingrKrN)r"rBr~rr�r�r�Z
useChardetrrrr#�szHTMLBinaryInputStream.__init__cCs&|jdjj|jd�|_tj|�dS)Nrrb)rKZ
codec_info�streamreaderr{rMr@rN)r"rrrrN�szHTMLBinaryInputStream.resetc	CsDt|d�r|}nt|�}y|j|j��Wnt|�}YnX|S)zvProduces a file object from source.

        source can be either a file object, local filename or a string.

        r.)r>rr*r'r)r"rBrrrrrL�s
z HTMLBinaryInputStream.openStreamcCs�|j�df}|ddk	r|St|j�df}|ddk	r:|St|j�df}|ddk	rX|S|j�df}|ddk	rt|St|j�df}|ddk	r�|djjd�r�|St|j�df}|ddk	r�|S|�rdyddl	m
}Wntk
r�YnxXg}|�}x6|j�s.|j
j|j�}|�sP|j|�|j|�q�W|j�t|jd�}|j
jd�|dk	�rd|dfSt|j�df}|ddk	�r�|Std�dfS)NrErZ	tentativezutf-16)�UniversalDetector�encodingzwindows-1252)�	detectBOMrJr~r�detectEncodingMetar��name�
startswithr�Zchardet.universaldetectorr��ImportError�doner{r.r}r1Zfeed�close�resultr*r�)r"ZchardetrKr�ZbuffersZdetectorr r�rrrr��sP


z'HTMLBinaryInputStream.determineEncodingcCs�t|�}|dkrdS|jdkr(td�}nT||jdkrH|jddf|_n4|jjd�|df|_|j�td|jd|f��dS)N�utf-16be�utf-16lezutf-8rrEzEncoding changed from %s to %s)r�r�)rJr�rKr{r*rNr)r"ZnewEncodingrrr�changeEncodings

z$HTMLBinaryInputStream.changeEncodingc
Cs�tjdtjdtjdtjdtjdi}|jjd�}|j|dd��}d}|sp|j|�}d}|sp|j|dd	��}d	}|r�|jj	|�t
|�S|jj	d
�dSdS)z�Attempts to detect at BOM at the start of the stream. If
        an encoding can be determined from the BOM return the name of the
        encoding otherwise return Nonezutf-8zutf-16lezutf-16bezutf-32lezutf-32be�N�rgr)�codecs�BOM_UTF8�BOM_UTF16_LE�BOM_UTF16_BE�BOM_UTF32_LE�BOM_UTF32_BEr{r.�getr*rJ)r"ZbomDict�stringr�r*rrrr�s"
zHTMLBinaryInputStream.detectBOMcCsH|jj|j�}t|�}|jjd�|j�}|dk	rD|jdkrDtd�}|S)z9Report the encoding declared by the meta element
        rN�utf-16be�utf-16lezutf-8)r�r�)r{r.r|�EncodingParserr*�getEncodingr�rJ)r"r �parserr�rrrr�9sz(HTMLBinaryInputStream.detectEncodingMeta)NNNNryT)T)r6r7r8r9r#rNrLr�r�r�r�rrrrrA�s
(
>"rAc@s�eZdZdZdd�Zdd�Zdd�Zdd	�Zd
d�Zdd
�Z	dd�Z
dd�Zeee
�Z
dd�Zee�Zefdd�Zdd�Zdd�Zdd�ZdS)�
EncodingBytesz�String-like object with an associated position and various extra methods
    If the position is ever greater than the string length then an exception is
    raisedcCstj||j��S)N)r-�__new__�lower)r"�valuerrrr�LszEncodingBytes.__new__cCs
d|_dS)Nr
r)rY)r"r�rrrr#PszEncodingBytes.__init__cCs|S)Nr)r"rrr�__iter__TszEncodingBytes.__iter__cCs>|jd}|_|t|�kr"t�n|dkr.t�|||d�S)Nr
r)rYr$�
StopIterationr?)r"�prrr�__next__WszEncodingBytes.__next__cCs|j�S)N)r�)r"rrr�next_szEncodingBytes.nextcCsB|j}|t|�krt�n|dkr$t�|d|_}|||d�S)Nrr
)rYr$r�r?)r"r�rrr�previouscszEncodingBytes.previouscCs|jt|�krt�||_dS)N)rYr$r�)r"r!rrr�setPositionlszEncodingBytes.setPositioncCs*|jt|�krt�|jdkr"|jSdSdS)Nr)rYr$r�)r"rrr�getPositionqs

zEncodingBytes.getPositioncCs||j|jd�S)Nr
)r!)r"rrr�getCurrentByte{szEncodingBytes.getCurrentBytecCsL|j}x:|t|�kr@|||d�}||kr6||_|S|d7}qW||_dS)zSkip past a list of charactersr
N)r!r$rY)r"rtr�rnrrrrl�szEncodingBytes.skipcCsL|j}x:|t|�kr@|||d�}||kr6||_|S|d7}qW||_dS)Nr
)r!r$rY)r"rtr�rnrrr�	skipUntil�szEncodingBytes.skipUntilcCs>|j}|||t|��}|j|�}|r:|jt|�7_|S)z�Look for a sequence of bytes at the start of a string. If the bytes
        are found return True and advance the position to the byte after the
        match. Otherwise return False and leave the position alone)r!r$r�)r"r-r�r2r5rrr�
matchBytes�s
zEncodingBytes.matchBytescCsR||jd�j|�}|dkrJ|jdkr,d|_|j|t|�d7_dSt�dS)z�Look for the next sequence of bytes matching a given sequence. If
        a match is found advance the position to the last byte of the matchNr
rTrr)r!�findrYr$r�)r"r-ZnewPositionrrr�jumpTo�s
zEncodingBytes.jumpToN)r6r7r8r9r�r#r�r�r�r�r�r��propertyr!r��currentByte�spaceCharactersBytesrlr�r�r�rrrrr�Hs 	
r�c@sXeZdZdZdd�Zdd�Zdd�Zdd	�Zd
d�Zdd
�Z	dd�Z
dd�Zdd�ZdS)r�z?Mini parser for detecting character encoding from meta elementscCst|�|_d|_dS)z3string - the data to work on for encoding detectionN)r�r2r�)r"r2rrrr#�s
zEncodingParser.__init__c
Cs�d|jfd|jfd|jfd|jfd|jfd|jff}x^|jD]T}d}xD|D]<\}}|jj|�rJy|�}PWqJtk
r�d}PYqJXqJW|s<Pq<W|jS)	Ns<!--s<metas</s<!s<?rTF)	�
handleComment�
handleMeta�handlePossibleEndTag�handleOther�handlePossibleStartTagr2r�r�r�)r"ZmethodDispatchrfZkeepParsing�key�methodrrrr��s&zEncodingParser.getEncodingcCs|jjd�S)zSkip over commentss-->)r2r�)r"rrrr��szEncodingParser.handleCommentcCs�|jjtkrdSd}d}x�|j�}|dkr.dS|ddkr^|ddk}|r�|dk	r�||_dSq|ddkr�|d}t|�}|dk	r�||_dSq|ddkrtt|d��}|j�}|dk	rt|�}|dk	r|r�||_dS|}qWdS)	NTFrs
http-equivr
scontent-typescharsetscontent)	r2r�r��getAttributer�rJ�ContentAttrParserr��parse)r"Z	hasPragmaZpendingEncoding�attrZtentativeEncoding�codecZ
contentParserrrrr��s:zEncodingParser.handleMetacCs
|jd�S)NF)�handlePossibleTag)r"rrrr��sz%EncodingParser.handlePossibleStartTagcCst|j�|jd�S)NT)r�r2r�)r"rrrr��s
z#EncodingParser.handlePossibleEndTagcCsf|j}|jtkr(|r$|j�|j�dS|jt�}|dkrD|j�n|j�}x|dk	r`|j�}qNWdS)NTr)r2r��asciiLettersBytesr�r�r��spacesAngleBracketsr�)r"ZendTagr2rnr�rrrr��s



z EncodingParser.handlePossibleTagcCs|jjd�S)Nr)r2r�)r"rrrr�szEncodingParser.handleOthercCs�|j}|jttdg�B�}|dkr&dSg}g}xt|dkr@|r@PnX|tkrT|j�}PnD|d	krjdj|�dfS|tkr�|j|j��n|dkr�dS|j|�t|�}q0W|dkr�|j	�dj|�dfSt|�|j�}|d
k�r:|}x�t|�}||k�rt|�dj|�dj|�fS|tk�r*|j|j��q�|j|�q�WnJ|dk�rRdj|�dfS|tk�rl|j|j��n|dk�rzdS|j|�x^t|�}|t
k�r�dj|�dj|�fS|tk�r�|j|j��n|dk�r�dS|j|��q�WdS)z_Return a name,value pair for the next attribute in the stream,
        if one is found, or None�/rN�=r3�'�")rN)r�r)r�r�)r2rlr��	frozensetr4�asciiUppercaseBytesr1r�r�r�r�)r"r2rnZattrNameZ	attrValueZ	quoteCharrrrr�sf










zEncodingParser.getAttributeN)
r6r7r8r9r#r�r�r�r�r�r�r�r�rrrrr��s$r�c@seZdZdd�Zdd�ZdS)r�cCs
||_dS)N)r2)r"r2rrrr#fszContentAttrParser.__init__cCsy�|jjd�|jjd7_|jj�|jjdks8dS|jjd7_|jj�|jjdkr�|jj}|jjd7_|jj}|jj|�r�|j||jj�SdSnF|jj}y|jjt�|j||jj�Stk
r�|j|d�SXWntk
�rdSXdS)Nscharsetr
r�r�r�)r�r�)r2r�r!rlr�r�r�r�)r"Z	quoteMarkZoldPositionrrrr�js.

zContentAttrParser.parseN)r6r7r8r#r�rrrrr�esr�cCs`t|t�r.y|jd�}Wntk
r,dSX|dk	rXy
tj|�Stk
rTdSXndSdS)z{Return the python codec name corresponding to an encoding or None if the
    string doesn't correspond to a valid encoding.rN)r<r�decode�UnicodeDecodeErrorr	�lookup�AttributeError)r�rrrrJ�s

rJr)2Z
__future__rrrZpip._vendor.sixrrZpip._vendor.six.movesrrr�rqZpip._vendorr	Z	constantsrrr
rrrOr�iorrr�r�r�r�r�r�Zinvalid_unicode_no_surrogaterFrr�evalrd�setrkZascii_punctuation_rero�objectrrDr@rAr-r�r�r�rJrrrr�<module>sV









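Not part of the tar dump: a minimal sketch of how the serializer options recovered from the serializer entry above are normally exercised. It assumes the vendored copy behaves like upstream html5lib; html5lib.parseFragment and html5lib.serialize are taken from the public API and do not themselves appear in this dump.

import html5lib

# Parse a fragment, then re-serialize it with a few of the options named in the
# recovered HTMLSerializer docstring.
fragment = html5lib.parseFragment("<p class=greeting>Hello<br>world")
html = html5lib.serialize(
    fragment,
    tree="etree",                # tree walker name, the default used by serialize()
    omit_optional_tags=False,    # keep optional start/end tags in the output
    quote_attr_values="always",  # quote every attribute value
)
print(html)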
site-packages/pip/_vendor/html5lib/__pycache__/constants.cpython-36.pyc
[compiled CPython 3.6 bytecode, binary content omitted. This is html5lib's constants module: the parse-error message table, the namespace map, the element-category sets and the large named character reference ("entities") dictionary used by the tokenizer.]
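Not part of the tar dump: a small sketch tying together the compiled modules listed above, assuming the vendored copy matches upstream html5lib. _tokenizer and constants are internal modules, so this mirrors their recovered docstrings rather than a supported public API.

from html5lib._tokenizer import HTMLTokenizer
from html5lib.constants import entities, tokenTypes

# HTMLTokenizer wraps its input in an HTMLInputStream (_inputstream above) and,
# when iterated, yields token dicts whose "type" values index constants.tokenTypes.
for token in HTMLTokenizer("<p class='x'>hi &amp; bye</p>"):
    if token["type"] == tokenTypes["Characters"]:
        print("text:", token["data"])

# constants also carries the named character reference table the tokenizer uses,
# e.g. the entity written in the markup above:
print(entities["amp;"])   # prints "&"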
ddd
ddddddddddddddddddd d!d"d#d$d%d&d'd(d)d*d+d,d-d.d/d,d,d0d1d2d3d4d5d6d7d8d9d:d;d<d=d>d?d@dAdBdCdDdEdFdGdHdIdJdKdLdMdNdOdPdQdRdSdTdUdVdWdXdYdZd[d\d]d^d_d`dadbdcdddedfdgdhdidjdkdldmdndodpdqdrdsdtdudvdwdxdydzd{d|d}d~dd�d�d�d���Zd�d�d�d�d�d�d��Zeed�d�fed�d�fed�d�fed�d�fed�d�fed�d�fed�d�fed�d�fed�d�fed�d�fed�d�fed�d�fed�d�fed�d�fed�d�fed�d�fed�d�fg�Z	eed�d�fed�d�fed�d�fed�d�fed�d�fed�d�fed�d�fed�d�fed�d�fed�d�fed�d�fed�d�fed�d�fed�d�fg�Z
eed�d�fed�d�fed�d�fed�d�fed�d�fed�d�fed�d�fed�d�fed�d�fed�d�fed�d�fed�d�fed�d�fed�d�fed�d�fed�d�fed�d�fed�d�fed�d�fed�d�fed�d�fed�d�fed�d�fed�d�fed�d�fed�d�fed�d�fed�d�fed�d�fed�d�fed�d�fed�d�fed�d�fed�d�fed�d�fed�d�fed�d�fed�d�fed�d�fed�d�fed�d�fed�d�fed�d�fed�d�fed�d�fed�d�fed�d�fed�d�fed�d�fed�d�fed�d�fed�d�fed�d�fed�d�fed�d�fed�d�fed�d�fed�d�fed�d�fed�d�fed�d�fed�d�fed�d�fed�d�fed�d�fed�d�fed�d�fed�d�fed�d�fed�d�fed�d�fed�d�fed�d�fed�d�fed�d�fed�d�fed�d�fed�d�fgN�Zeed�d�fed�d�fed�d�fed�d�fg�Zeed�d�fed�d�fed�d�fed�d�fed�d�fg�Z
d�d�d�d�d�d�d�d�d�d�d�d�d�d�d��d�d�d�d�d�d�d�d�d�d	�d
�d�d�d
�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d �d!�d"�d#�d$�d%�d&�d'�d(�d)�d*�d+�d,�d-�d.�d/�>Z�d0�d1iZ�d2�d3e�d2f�d2�d4e�d2f�d2�d5e�d2f�d2�d6e�d2f�d2�d7e�d2f�d2d�e�d2f�d2�d8e�d2f�d9d�e�d9f�d9�d:e�d9f�d9�d;e�d9fd�d<e�d<f�d<�d2e�d<f�d=�Ze�d>�d?�ej�D��Ze�d@�dA�dB�dC�dDg�Zed�d�d�d�d�g�Zeej�Zeej�Zeej�Zeej�Zeej�Ze�dE�d?�ejD��Z�d|Z ed�d��dFd�d�d�d�d�d�d�d�d�dԐdG�dHg�Z!ed�d�g�Z"ed�d�d�d�d�d�d�g�Z#e�dIg�e�dJg�e�dKg�e�dL�dMg�e�dL�dMg�e�dN�dOg�e�dPg�e�dQ�dRg�e�dS�dR�dT�dUg�e�dVg�e�dWg�e�dR�dXg�e�dR�dX�dYg�e�dR�dXg�e�dR�dZg�e�dR�dX�d[�dZ�dT�dKg�e�dR�dX�dZ�dQg�e�dR�dXg��d\�Z$�d}Z%e�dy�dz�d{�d|�d}g�Z&�d~�d~�d�d�d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��ddÐdĐdŐdƐdǐdȐdɐdʐdːd̐d͐dΐdϐdАdѐdҐdӐd��dѐdԐdՐd֐dÐdאdؐdِdڐdېdܐdݐdސdߐd�d�d�d�d�d�d�d�d�d�d�d�dԐd�d�d�d�d�d�d�d�d�d�d�d�d��d��d��d��d��d��d��d��d��d��d��d��d�d�d�d�d�d�d�d�d�d�d	�d
�d�d�d
�d
�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d �d!�d"�d#�d$�d%�d&�d'�d(�d)�d*�d(�d+�d��d,�d-�d.�d/�d0�d0�d1�d1�d2�d3�d4�d5�d5�d4�d6�d7�dڐd8�d9�d:�d;�d<�d=�d>�d?�d@�dA�dB�dC�dC�dD�dE�dF�dG�dH�dI�dJ�dK�dL�dM�dN�dO�dP�dQ�dR�dS�dT�dT�dU�dV�dW�dX�dY�dZ�d[�d\�d]�d^�d_�d`�da�db�dc�dd�de�df�dg�dh�di�dj�dk�dl�dm�dn�do�dp�dq�dr�ds�dt�dՐd֐du�dv�dw�dx�dy�dz�d{�d|�d}�d~�d�d��d��dאdؐdِd��d��d��dX�d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d"�d��dA�d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��ddÐdĐdŐdƐdǐdȐdɐdʐdːd̐d͐dΐdϐdϐdАdѐdҐdҐdӐdӐdԐdՐd֐dאdאdؐdِdڐdېdܐdݐdސdߐd�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d'�d�d�d�d�d�d�d�d��d��d��d��d��d��d��d��d��d��d��d��d��d��d�d�d�d�d�d�d�d�d�d	�d�d
�d�d�d
�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d �dڐd!�d"�d#�d$�d%�d&�d'�d(�d)�d*�d+�d,�d-�d.�d/�d0�dߐd^�d�d1�d2�d3�d4�d5�d6�d7�d8�d9�d:�d;�d<�d=�d>�d?�d?�d@�dA�dB�dC�dD�d�dE�dF�dG�dH�dF�dI�dI�dJ�dK�dL�d@�dM�dN�dO�dP�dQ�dR�dS�dT�dU�dV�dW�dX�dY�dZ�d[�d\�d]�d^�d^�d_�d`�da�db�dc�dc�dd�de�df�dg�dg�dh�di�dj�dk�dl�dm�dn�do�dp�d1�dq�dr�ds�dt�du�dv�dܐdݐdw�dx�dy�dz�d{�d|�d}�d~�d~�d�d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��dɐdɐd��d��d��d��d��d��d��d��d��d��d��d��d�d�d��d��d��d��d��d��d��d��d��d��ddÐdĐdŐdƐdǐdȐdɐdʐdːd̐d��d͐dΐdϐdY�dАdѐdҐdӐdԐdY�dҐdՐdՐd֐dאdY�d��dؐdؐdِdِd��dڐdېdܐdݐdސdߐd�d�d�d�dk�d�dܐd�d�d��d��d�dݐd��d�d�d�d�d:�d�dm�d�d�d�d�d�d�d�d�d��d��d�d��d
�d��d��d��d��d��d��d��d��d��d��d�d�d�du�du�d�d�d�d�d�d�d	�d
�d�d�d
�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d*�d�d�d�d �d!�d"�d#�d$�d%�d&�d'�d(�d)�d*�d+�d,�d-�d.�dސd��d/�d/�d0�d1�dߐd�d2�d3�d4�d5�d5�d��d6�d,�d,�d7�d8�d9�d:�d;�d<�d=�d>�d?�d$�d@�dA�dB�dB�dC�dD�dE�dF�d��d��dG�dH�dH�d��dI�dJ�dK�dK�dL�dM�dN�dO�dP�dQ�dR�d��dS�dT�dU�dV�dP�dW�dX�dY�dZ�dZ�d[�d��d��d\�d]�d^�d3�d^�d��dX�d_�d��d`�d��d��d��da�db�dc�dd�de�df�dg�dh�di�dj�dk�dl�dm�dn�do�dp�dq�dr�ds�dt�du�dv�dw�dx�dl�dm�dy�dz�d{�d{�dn�dw�dy�dz�d��d|�d}�dԐd~�d�d��dߐd��di�d��dːd��d��dϐd��d��d��d��d��d��d��d��d��dd�d�dΐdΐd��d��dѐd��d��d��d��d��d��d��d��d��d��d��d��dʐdӐd��d��d��d��d��dߐd��dd�d�d��d��d��d��d��d��d��d��d��d��d�d	�d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d�d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��ddÐdĐdĐdŐd��d��d��d��d��dƐdǐd��dȐdɐdʐdːd̐dӐd��d͐dΐdΐdϐdϐdАdѐd�d�d�d��dҐdӐdԐdՐd֐dאdؐdِdڐdېdܐdݐdސd�dߐd�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d��d��d��d��d��d��d��d��d�d�d�d�d�d�d��d��d��d��d�d�d�d"�d�d�d�d�d�d�d�d�d	�d	�d
�d
�d�d�d�d̐d
�d �d�d�d�d�d�d�d�d�d�d��d�d�d�d�d�d �d�d�d֐d��d�d(�d�dg�d�d�d�d�d�d�d�d�d�d �d!�d"�d#�d$�d%�d&�d'�d(�d�d)�d*�d��d+�d+�d;�d,�d,�d-�d.�d/�d/�d֐d0�d1�d1�d7�d2�d3�d4�d5�d6�d7�d4�d@�d4�d8�d9�d:�d��d;�d<�d=�d8�d9�d>�d��d>�d?�d@�dA�dB�dC�dD�d@�dE�dE�dF�d��dG�dH�dI�dJ�d��d<�dK�dL�dM�dM�dN�dO�dP�dQ�dR�dS�dT�dU�dV�dW�dX�dY�dZ�d[�d\�d]�d^�d_�d}�dՐd`�da�dv�db�dc�dd�de�dX�df�d]�dg�d]�dh�di�di�d^�d_�dj�dk�d$�dl�dm�dn�do�dp�dq�dr�ds�dt�du�dv�dw�dx�dy�dz�d{�d|�da�dv�d}�d~�dܐd�d�d��d��d��d^�do�d�ds�d��dg�d`�d�d�d��du�d��dv�dy�dy�d��d��d��d��d��d��dh�d��du�db�dw�dz�d��df�d��dw�d��d�ds�d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��db�d�d��d��d��dl�d��d��d��d��d��d��di�d��d��d��d��d��d`�d��d�d��d��d��d��d��d��dz�d��d��dw�dݐd��d��dT�dT�d��d��d��d��d��d��d��d��d��dn�d��d��d��d��d��d��d��d��d��d��d��d�d�d�dj�dv�d��d��d��d��d��d��ddÐdÐd��dאdĐd��d��dŐd!�d��dƐdǐd�d��dȐdɐd��dʐd��dːd̐d̐d͐dΐd��dϐdАdѐdҐd��dӐdԐdՐdƐd֐dאd̐dؐdِdڐd̐dېdېd��d��d��d��d��dܐdݐdސdːdߐd�d�d�d��d�d�dx�dx�d�d��d�d��d��d��d�d��d��d��d��d��d��d��d��d��dАd�d�d�d�d�d�d�dϐd�d�d�d��d�d�d��d�d��d��d��d��d��d��d��d͐d�d�d�d��d�d�d��d�d��d��d��d��d��d��d��d��d��d��d��d��d��d��d��dӐd��d��d��d��d��d��dÐdŐdĐd��d͐d��dɐdʐdʐd͐d��d��d��d��d�dd��dd�dÐdĐd�d�dȐdǐdȐd�d��d�d�d��d��d��d��d��d�d�d�d�d�d	�d
�d�d�d
�d�d�d�d�d�d�d�d�d�d�dw�dw�d�dS�d�d�dT�dU�d�d�d�dV�d�d�d��d�d�d �d!�d"�d#�d#�d$�d%�dِd��dQ�d&�d'�d�d(�d)�d*�d+�d,�d��d-�d.�d/�d��d0�dR�d1�d2�d2�d3�d3�d4�d4�d5�d6�d7�d8�d2�d9�d9�d:�d;�d;�d��d<�d=�d=�d>�dސd?�d?�dސd@�dA�d�dB�dC�dD�dE�du�dF�dG�dH�dI�d��dJ�dK�dߐdL�d�dM�d�dN�dO�d"�dP�d��dQ�dR�d�d�dS�dT�d�dU�dV�dW�dW�d�dX�dY�d�d�d�dY�d�d�dZ�d[�d\�d�d]�d�d[�dZ�d\�d��d^�d_�d`�d��d��d�da�db�dc�dd�de�d2�df�dg�dh�d)�di�dj�dǐd��d��d#�dڐdk�d��dl�dm�dn�d5�do�d�dp�dq�d�dr�dr�d�ds�d
�dt�du�dv�d%�d��dw�dx�dy�dz�d{�d|�d��d�d}�d~�d�d��d��d��d��d��d�d~�d��d��d��d��d��d��d�d$�d�d!�d��d��d��d��d�d��d�d�d��d��d��d�dy�d�d�d�d�d��dz�d��d��dʐd�d�d��d��d��d��d��d��d�d��d��d��d��d��d��d��d��d��d��d%�d�d��d��d��d��d��d�d��d��d��d��d��d��dA�d��d��d��dC�dB�d��d��d��d��d��d��dD�d��d��d��d��d��d �d��d��d��d��d��d��d��d��d��d��d�d��d��d��d��dސd��d��d��d��d��dV�d��dW�dW�d��d��d��ddÐdĐdŐd^�d��dƐdǐd��dȐdɐdʐdːd̐d͐dΐdϐdАdѐdѐdސd7�dҐd<�dӐd8�d9�d8�d9�d:�d;�d:�d;�d6�d6�d
�d
�d�dԐd��dȐd>�dՐd�dŐdI�d��d֐dאdؐd@�dِdڐdېdܐdݐdސd֐d@�dאdܐdېdߐd�d�dA�d��dC�dB�d��d��d��dD�dE�d�d�d�d�d�d�d�dG�d�d�d�dH�d�d�d�d�d�d�d�d�dG�dH�d�d�d�d�d�d�d�d!�d��d��d��d��d��d��d��d�d��d��d��d[�d��d��dR�dR�d��d��d��dY�dV�dU�dY�dV�d�d�d͐d�d�d#�d�d�d3�d�d�d�d�d�d�d��d�dJ�d	�d��d��dn�d
�d��d�d�d
�d�d
�d�d�d�d�d�d�d�d�dY�d�dܐd�d�d�d1�d�d�d�d�d�dr�d�dt�d�d�d�d�dq�d�d�d �d �d!�d"�d#�dѐdѐd$�d%�d1�ds�dq�d�dn�d&�dy�d&�d'�d(�d(�d)�d*�d+�d,�d-�d.�d	�d��d'�d/�d/�d0�dݐd1�d2�dېd3�dŐdW�d��dI�dL�d��ds�d��d��d4�d5�d6�d7�d��dl�d�d8�d�d0�d9�d:�d;�d��d��d<�dl�d��dǐd=�d��d�d>�d5�d4�d7�d6�d?�d@�dA�d��dB�dC�dD�dE�dC�d��d��dF�d:�d�dm�d�dG�dؐd��dH�dאd�d��dI�d�dJ�d�d�dِd��dK�d�d�d�d��d��dL�dL�dM�dN�dO�dP�dP�dQ�dR�dS�dT�dU�dV�dV�dW�dX�dY�dZ�d��d[�d\�d]�d^�d_�d`�da�db�dc���Z'�dd�dD�dАde�d��dݐd�d�d�d��dO�dE�d,�d��dѐdf�d��dg�dh�dݐd��dܐd��d5�d�d��d͐dJ�d��d��d�di�dX�d��dj�"Z(d�dk�dl�dm�dn�do�dp�dq�dr�Z)ee)�dse)�dte)�dug�Z*e�dv�d?�ej�D��Z+�dwe+d�<G�dx�dy��dye,�Z-G�dz�d{��d{e.�Z/dS(~�)�absolute_import�division�unicode_literalsNz5Null character in input stream, replaced with U+FFFD.zInvalid codepoint in stream.z&Solidus (/) incorrectly placed in tag.z.Incorrect CR newline entity, replaced with LF.z9Entity used with illegal number (windows-1252 reference).zPNumeric entity couldn't be converted to character (codepoint U+%(charAsInt)08x).zBNumeric entity represents an illegal codepoint: U+%(charAsInt)08x.z#Numeric entity didn't end with ';'.z1Numeric entity expected. Got end of file instead.z'Numeric entity expected but none found.z!Named entity didn't end with ';'.z Named entity expected. Got none.z'End tag contains unexpected attributes.z.End tag contains unexpected self-closing flag.z#Expected tag name. Got '>' instead.zSExpected tag name. Got '?' instead. (HTML doesn't support processing instructions.)z-Expected tag name. Got something else insteadz6Expected closing tag. Got '>' instead. Ignoring '</>'.z-Expected closing tag. Unexpected end of file.z<Expected closing tag. Unexpected character '%(data)s' found.z'Unexpected end of file in the tag name.z8Unexpected end of file. Expected attribute name instead.z)Unexpected end of file in attribute name.z#Invalid character in attribute namez#Dropped duplicate attribute on tag.z1Unexpected end of file. Expected = or end of tag.z1Unexpected end of file. Expected attribute value.z*Expected attribute value. Got '>' instead.z"Unexpected = in unquoted attributez*Unexpected character in unquoted attributez*Unexpected character after attribute name.z+Unexpected character after attribute value.z.Unexpected end of file in attribute value (").z.Unexpected end of file in attribute value (').z*Unexpected end of file in attribute value.z)Unexpected end of file in tag. Expected >z/Unexpected character after / in tag. Expected >z&Expected '--' or 'DOCTYPE'. Not found.z Unexpected ! after -- in commentz$Unexpected space after -- in commentzIncorrect comment.z"Unexpected end of file in comment.z%Unexpected end of file in comment (-)z+Unexpected '-' after '--' found in comment.z'Unexpected end of file in comment (--).z&Unexpected character in comment found.z(No space after literal string 'DOCTYPE'.z.Unexpected > character. Expected DOCTYPE name.z.Unexpected end of file. Expected DOCTYPE name.z'Unexpected end of file in DOCTYPE name.z"Unexpected end of file in DOCTYPE.z%Expected space or '>'. Got '%(data)s'zUnexpected end of DOCTYPE.z Unexpected character in DOCTYPE.zXXX innerHTML EOFzUnexpected DOCTYPE. Ignored.z%html needs to be the first start tag.z)Unexpected End of file. Expected DOCTYPE.zErroneous DOCTYPE.z2Unexpected non-space characters. Expected DOCTYPE.z2Unexpected start tag (%(name)s). Expected DOCTYPE.z0Unexpected end tag (%(name)s). Expected DOCTYPE.z?Unexpected end tag (%(name)s) after the (implied) root element.z4Unexpected end of file. 
Expected end tag (%(name)s).z4Unexpected start tag head in existing head. Ignored.z'Unexpected end tag (%(name)s). Ignored.z;Unexpected start tag (%(name)s) that can be in head. Moved.z Unexpected start tag (%(name)s).zMissing end tag (%(name)s).zMissing end tags (%(name)s).zCUnexpected start tag (%(startName)s) implies end tag (%(endName)s).z@Unexpected start tag (%(originalName)s). Treated as %(newName)s.z,Unexpected start tag %(name)s. Don't use it!z'Unexpected start tag %(name)s. Ignored.zEUnexpected end tag (%(gotName)s). Missing end tag (%(expectedName)s).z:End tag (%(name)s) seen too early. Expected other end tag.zFUnexpected end tag (%(gotName)s). Expected end tag (%(expectedName)s).z+End tag (%(name)s) seen too early. Ignored.zQEnd tag (%(name)s) violates step 1, paragraph 1 of the adoption agency algorithm.zQEnd tag (%(name)s) violates step 1, paragraph 2 of the adoption agency algorithm.zQEnd tag (%(name)s) violates step 1, paragraph 3 of the adoption agency algorithm.zQEnd tag (%(name)s) violates step 4, paragraph 4 of the adoption agency algorithm.z>Unexpected end tag (%(originalName)s). Treated as %(newName)s.z'This element (%(name)s) has no end tag.z9Unexpected implied end tag (%(name)s) in the table phase.z>Unexpected implied end tag (%(name)s) in the table body phase.zDUnexpected non-space characters in table context caused voodoo mode.z3Unexpected input with type hidden in table context.z!Unexpected form in table context.zDUnexpected start tag (%(name)s) in table context caused voodoo mode.zBUnexpected end tag (%(name)s) in table context caused voodoo mode.zCUnexpected table cell start tag (%(name)s) in the table body phase.zFGot table cell end tag (%(name)s) while required end tags are missing.z?Unexpected end tag (%(name)s) in the table body phase. Ignored.z=Unexpected implied end tag (%(name)s) in the table row phase.z>Unexpected end tag (%(name)s) in the table row phase. Ignored.zJUnexpected select start tag in the select phase treated as select end tag.z/Unexpected input start tag in the select phase.zBUnexpected start tag token (%(name)s in the select phase. Ignored.z;Unexpected end tag (%(name)s) in the select phase. Ignored.zKUnexpected table element start tag (%(name)s) in the select in table phase.zIUnexpected table element end tag (%(name)s) in the select in table phase.z8Unexpected non-space characters in the after body phase.z>Unexpected start tag token (%(name)s) in the after body phase.z<Unexpected end tag token (%(name)s) in the after body phase.z@Unexpected characters in the frameset phase. Characters ignored.zEUnexpected start tag token (%(name)s) in the frameset phase. Ignored.zFUnexpected end tag token (frameset) in the frameset phase (innerHTML).zCUnexpected end tag token (%(name)s) in the frameset phase. Ignored.zEUnexpected non-space characters in the after frameset phase. Ignored.zEUnexpected start tag (%(name)s) in the after frameset phase. Ignored.zCUnexpected end tag (%(name)s) in the after frameset phase. Ignored.z(Unexpected end tag after body(innerHtml)z6Unexpected non-space characters. Expected end of file.z6Unexpected start tag (%(name)s). Expected end of file.z4Unexpected end tag (%(name)s). Expected end of file.z/Unexpected end of file. Expected table content.z0Unexpected end of file. Expected select content.z2Unexpected end of file. Expected frameset content.z0Unexpected end of file. Expected script content.z0Unexpected end of file. 
Expected foreign contentz0Trailing solidus not allowed on element %(name)sz2Element %(name)s not allowed in a non-html contextz*Unexpected end tag (%(name)s) before html.z9Element %(name)s not allowed in a inhead-noscript contextz8Unexpected end of file. Expected inhead-noscript contentz@Unexpected non-space character. Expected inhead-noscript contentz0Undefined error (this sucks and should be fixed))�znull-characterzinvalid-codepointzincorrectly-placed-soliduszincorrect-cr-newline-entityzillegal-windows-1252-entityzcant-convert-numeric-entityz$illegal-codepoint-for-numeric-entityz numeric-entity-without-semicolonz#expected-numeric-entity-but-got-eofzexpected-numeric-entityznamed-entity-without-semicolonzexpected-named-entityzattributes-in-end-tagzself-closing-flag-on-end-tagz'expected-tag-name-but-got-right-bracketz'expected-tag-name-but-got-question-markzexpected-tag-namez*expected-closing-tag-but-got-right-bracketz expected-closing-tag-but-got-eofz!expected-closing-tag-but-got-charzeof-in-tag-namez#expected-attribute-name-but-got-eofzeof-in-attribute-namez#invalid-character-in-attribute-namezduplicate-attributez$expected-end-of-tag-name-but-got-eofz$expected-attribute-value-but-got-eofz.expected-attribute-value-but-got-right-bracketz"equals-in-unquoted-attribute-valuez0unexpected-character-in-unquoted-attribute-valuez&invalid-character-after-attribute-namez*unexpected-character-after-attribute-valuez#eof-in-attribute-value-double-quotez#eof-in-attribute-value-single-quotez eof-in-attribute-value-no-quotesz#unexpected-EOF-after-solidus-in-tagz)unexpected-character-after-solidus-in-tagzexpected-dashes-or-doctypez,unexpected-bang-after-double-dash-in-commentz-unexpected-space-after-double-dash-in-commentzincorrect-commentzeof-in-commentzeof-in-comment-end-dashz,unexpected-dash-after-double-dash-in-commentzeof-in-comment-double-dashzeof-in-comment-end-space-statezeof-in-comment-end-bang-statezunexpected-char-in-commentzneed-space-after-doctypez+expected-doctype-name-but-got-right-bracketz!expected-doctype-name-but-got-eofzeof-in-doctype-namezeof-in-doctypez*expected-space-or-right-bracket-in-doctypezunexpected-end-of-doctypezunexpected-char-in-doctypezeof-in-innerhtmlzunexpected-doctypez
non-html-rootzexpected-doctype-but-got-eofzunknown-doctypezexpected-doctype-but-got-charsz"expected-doctype-but-got-start-tagz expected-doctype-but-got-end-tagzend-tag-after-implied-rootz&expected-named-closing-tag-but-got-eofz!two-heads-are-not-better-than-onezunexpected-end-tagz#unexpected-start-tag-out-of-my-headzunexpected-start-tagzmissing-end-tagzmissing-end-tagsz$unexpected-start-tag-implies-end-tagzunexpected-start-tag-treated-aszdeprecated-tagzunexpected-start-tag-ignoredz$expected-one-end-tag-but-got-anotherzend-tag-too-earlyzend-tag-too-early-namedzend-tag-too-early-ignoredzadoption-agency-1.1zadoption-agency-1.2zadoption-agency-1.3zadoption-agency-4.4zunexpected-end-tag-treated-asz
no-end-tagz#unexpected-implied-end-tag-in-tablez(unexpected-implied-end-tag-in-table-bodyz$unexpected-char-implies-table-voodooz unexpected-hidden-input-in-tablezunexpected-form-in-tablez)unexpected-start-tag-implies-table-voodooz'unexpected-end-tag-implies-table-voodoozunexpected-cell-in-table-bodyzunexpected-cell-end-tagz unexpected-end-tag-in-table-bodyz'unexpected-implied-end-tag-in-table-rowzunexpected-end-tag-in-table-rowzunexpected-select-in-selectzunexpected-input-in-selectzunexpected-start-tag-in-selectzunexpected-end-tag-in-selectz5unexpected-table-element-start-tag-in-select-in-tablez3unexpected-table-element-end-tag-in-select-in-tablezunexpected-char-after-bodyzunexpected-start-tag-after-bodyzunexpected-end-tag-after-bodyzunexpected-char-in-framesetz unexpected-start-tag-in-framesetz)unexpected-frameset-in-frameset-innerhtmlzunexpected-end-tag-in-framesetzunexpected-char-after-framesetz#unexpected-start-tag-after-framesetz!unexpected-end-tag-after-framesetz'unexpected-end-tag-after-body-innerhtmlzexpected-eof-but-got-charzexpected-eof-but-got-start-tagzexpected-eof-but-got-end-tagzeof-in-tablez
eof-in-selectzeof-in-framesetzeof-in-script-in-scriptzeof-in-foreign-landsz&non-void-element-with-trailing-solidusz*unexpected-html-element-in-foreign-contentzunexpected-end-tag-before-htmlzunexpected-inhead-noscript-tagzeof-in-head-noscriptzchar-in-head-noscriptzXXX-undefined-errorzhttp://www.w3.org/1999/xhtmlz"http://www.w3.org/1998/Math/MathMLzhttp://www.w3.org/2000/svgzhttp://www.w3.org/1999/xlinkz$http://www.w3.org/XML/1998/namespacezhttp://www.w3.org/2000/xmlns/)�html�mathml�svg�xlink�xml�xmlnsrZappletZcaptionZmarquee�object�tableZtdZthrZmi�moZmnZmsZmtextzannotation-xmlrZ
foreignObjectZdesc�title�a�bZbig�codeZemZfont�iZnobr�sZsmallZstrikeZstrongZtt�uZaddressZareaZarticleZaside�baseZbasefontZbgsoundZ
blockquoteZbody�br�button�center�colZcolgroup�commandZdd�details�dirZdivZdlZdtZembed�fieldsetZfigureZfooterZform�frameZframeset�h1�h2�h3�h4�h5�h6�head�header�hrZiframeZimage�img�inputZisindexZli�linkZlisting�menu�metaZnavZnoembedZnoframesZnoscriptZol�pZparamZ	plaintextZpre�scriptZsection�select�styleZtbodyZtextareaZtfootZtheadZtrZulZwbrZxmpz
annotaion-xmlZ
attributeNameZ
attributeTypeZ
baseFrequencyZbaseProfileZcalcModeZ
clipPathUnitsZcontentScriptTypeZcontentStyleTypeZdiffuseConstantZedgeModeZexternalResourcesRequiredZ	filterResZfilterUnitsZglyphRefZgradientTransformZ
gradientUnitsZkernelMatrixZkernelUnitLengthZ	keyPointsZ
keySplinesZkeyTimesZlengthAdjustZlimitingConeAngleZmarkerHeightZmarkerUnitsZmarkerWidthZmaskContentUnitsZ	maskUnitsZ
numOctavesZ
pathLengthZpatternContentUnitsZpatternTransformZpatternUnitsZ	pointsAtXZ	pointsAtYZ	pointsAtZZ
preserveAlphaZpreserveAspectRatioZprimitiveUnitsZrefXZrefYZrepeatCountZ	repeatDurZrequiredExtensionsZrequiredFeaturesZspecularConstantZspecularExponentZspreadMethodZstartOffsetZstdDeviationZstitchTilesZsurfaceScaleZsystemLanguageZtableValuesZtargetXZtargetYZ
textLengthZviewBoxZ
viewTargetZxChannelSelectorZyChannelSelectorZ
zoomAndPan)>Z
attributenameZ
attributetypeZ
basefrequencyZbaseprofileZcalcmodeZ
clippathunitsZcontentscripttypeZcontentstyletypeZdiffuseconstantZedgemodeZexternalresourcesrequiredZ	filterresZfilterunitsZglyphrefZgradienttransformZ
gradientunitsZkernelmatrixZkernelunitlengthZ	keypointsZ
keysplinesZkeytimesZlengthadjustZlimitingconeangleZmarkerheightZmarkerunitsZmarkerwidthZmaskcontentunitsZ	maskunitsZ
numoctavesZ
pathlengthZpatterncontentunitsZpatterntransformZpatternunitsZ	pointsatxZ	pointsatyZ	pointsatzZ
preservealphaZpreserveaspectratioZprimitiveunitsZrefxZrefyZrepeatcountZ	repeatdurZrequiredextensionsZrequiredfeaturesZspecularconstantZspecularexponentZspreadmethodZstartoffsetZstddeviationZstitchtilesZsurfacescaleZsystemlanguageZtablevaluesZtargetxZtargetyZ
textlengthZviewboxZ
viewtargetZxchannelselectorZychannelselectorZ
zoomandpanZ
definitionurlZ
definitionURLrZactuateZarcroleZhrefZroleZshow�typer	ZlangZspacer
)z
xlink:actuatez
xlink:arcrolez
xlink:hrefz
xlink:rolez
xlink:showzxlink:titlez
xlink:typezxml:basezxml:langz	xml:spacer
zxmlns:xlinkcCs"g|]\}\}}}||f|f�qS�r2)�.0Zqname�prefixZlocal�nsr2r2�/usr/lib/python3.6/constants.py�
<listcomp>
sr7�	�
�� �
cCs g|]}t|�t|j��f�qSr2)�ord�lower)r3�cr2r2r6r7#szevent-source�sourceZtrackZ
irrelevantZscopedZismapZautoplayZcontrolsZdefer�async�openZmultipleZdisabledZhiddenZchecked�defaultZnoshadeZ
autosubmit�readonlyZselectedZ	autofocusZrequired)�r0r(ZaudioZvideor.rZdatagridrr'r+rZoptionZoptgrouprr)r/�output� �� �� �& �  �! ���0 �`�9 �R�}� � � � �" � � ���"!�a�: �S�~�xzlt;zgt;zamp;zapos;zquot;�Æ�&�ÁuĂ�ÂuАu𝔄�ÀuΑuĀu⩓uĄu𝔸u⁡�Åu𝒜u≔�Ã�Äu∖u⫧u⌆uБu∵uℬuΒu𝔅u𝔹u˘u≎uЧ�©uĆu⋒uⅅuℭuČ�ÇuĈu∰uĊ�¸�·uΧu⊙u⊖u⊕u⊗u∲u”u’u∷u⩴u≡u∯u∮uℂu∐u∳u⨯u𝒞u⋓u≍u⤑uЂuЅuЏu‡u↡u⫤uĎuДu∇uΔu𝔇�´u˙u˝�`u˜u⋄uⅆu𝔻�¨u⃜u≐u⇓u⇐u⇔u⟸u⟺u⟹u⇒u⊨u⇑u⇕u∥u↓u⤓u⇵ȗu⥐u⥞u↽u⥖u⥟u⇁u⥗u⊤u↧u𝒟uĐuŊ�Ð�ÉuĚ�ÊuЭuĖu𝔈�Èu∈uĒu◻u▫uĘu𝔼uΕu⩵u≂u⇌uℰu⩳uΗ�Ëu∃uⅇuФu𝔉u◼u▪u𝔽u∀uℱuЃ�>uΓuϜuĞuĢuĜuГuĠu𝔊u⋙u𝔾u≥u⋛u≧u⪢u≷u⩾u≳u𝒢u≫uЪuˇ�^uĤuℌuℋuℍu─uĦu≏uЕuIJuЁ�Í�ÎuИuİuℑ�ÌuĪuⅈu∬u∫u⋂u⁣u⁢uĮu𝕀uΙuℐuĨuІ�ÏuĴuЙu𝔍u𝕁u𝒥uЈuЄuХuЌuΚuĶuКu𝔎u𝕂u𝒦uЉ�<uĹuΛu⟪uℒu↞uĽuĻuЛu⟨u←u⇤u⇆u⌈u⟦u⥡u⇃u⥙u⌊u↔u⥎u⊣u↤u⥚u⊲u⧏u⊴u⥑u⥠u↿u⥘u↼u⥒u⋚u≦u≶u⪡u⩽u≲u𝔏u⋘u⇚uĿu⟵u⟷u⟶u𝕃u↙u↘u↰uŁu≪u⤅uМu uℳu𝔐u∓u𝕄uΜuЊuŃuŇuŅuНu​u𝔑u⁠� uℕu⫬u≢u≭u∦u∉u≠u≂̸u∄u≯u≱u≧̸u≫̸u≹u⩾̸u≵u≎̸u≏̸u⋪u⧏̸u⋬u≮u≰u≸u≪̸u⩽̸u≴u⪢̸u⪡̸u⊀u⪯̸u⋠u∌u⋫u⧐̸u⋭u⊏̸u⋢u⊐̸u⋣u⊂⃒u⊈u⊁u⪰̸u⋡u≿̸u⊃⃒u⊉u≁u≄u≇u≉u∤u𝒩�ÑuΝuŒ�Ó�ÔuОuŐu𝔒�ÒuŌuΩuΟu𝕆u“u‘u⩔u𝒪�Ø�Õu⨷�Öu‾u⏞u⎴u⏜u∂uПu𝔓uΦuΠ�±uℙu⪻u≺u⪯u≼u≾u″u∏u∝u𝒫uΨ�"u𝔔uℚu𝒬u⤐�®uŔu⟫u↠u⤖uŘuŖuРuℜu∋u⇋u⥯uΡu⟩u→u⇥u⇄u⌉u⟧u⥝u⇂u⥕u⌋u⊢u↦u⥛u⊳u⧐u⊵u⥏u⥜u↾u⥔u⇀u⥓uℝu⥰u⇛uℛu↱u⧴uЩuШuЬuŚu⪼uŠuŞuŜuСu𝔖u↑uΣu∘u𝕊u√u□u⊓u⊏u⊑u⊐u⊒u⊔u𝒮u⋆u⋐u⊆u≻u⪰u≽u≿u∑u⋑u⊃u⊇�Þu™uЋuЦuΤuŤuŢuТu𝔗u∴uΘu  u u∼u≃u≅u≈u𝕋u⃛u𝒯uŦ�Úu↟u⥉uЎuŬ�ÛuУuŰu𝔘�ÙuŪ�_u⏟u⎵u⏝u⋃u⊎uŲu𝕌u⤒u⇅u↕u⥮u⊥u↥u↖u↗uϒuΥuŮu𝒰uŨ�Üu⊫u⫫uВu⊩u⫦u⋁u‖u∣�|u❘u≀u u𝔙u𝕍u𝒱u⊪uŴu⋀u𝔚u𝕎u𝒲u𝔛uΞu𝕏u𝒳uЯuЇuЮ�ÝuŶuЫu𝔜u𝕐u𝒴uŸuЖuŹuŽuЗuŻuΖuℨuℤu𝒵�áuău∾u∾̳u∿�âuа�æu𝔞�àuℵuαuāu⨿u∧u⩕u⩜u⩘u⩚u∠u⦤u∡u⦨u⦩u⦪u⦫u⦬u⦭u⦮u⦯u∟u⊾u⦝u∢u⍼uąu𝕒u⩰u⩯u≊u≋�'�åu𝒶�*�ã�äu⨑u⫭u≌u϶u‵u∽u⋍u⊽u⌅u⎶uбu„u⦰uβuℶu≬u𝔟u◯u⨀u⨁u⨂u⨆u★u▽u△u⨄u⤍u⧫u▴u▾u◂u▸u␣u▒u░u▓u█u=⃥u≡⃥u⌐u𝕓u⋈u╗u╔u╖u╓u═u╦u╩u╤u╧u╝u╚u╜u╙u║u╬u╣u╠u╫u╢u╟u⧉u╕u╒u┐u┌u╥u╨u┬u┴u⊟u⊞u⊠u╛u╘u┘u└u│u╪u╡u╞u┼u┤u├�¦u𝒷u⁏�\u⧅u⟈u•u⪮uću∩u⩄u⩉u⩋u⩇u⩀u∩︀u⁁u⩍uč�çuĉu⩌u⩐uċu⦲�¢u𝔠uчu✓uχu○u⧃uˆu≗u↺u↻uⓈu⊛u⊚u⊝u⨐u⫯u⧂u♣�:�,�@u∁u⩭u𝕔u℗u↵u✗u𝒸u⫏u⫑u⫐u⫒u⋯u⤸u⤵u⋞u⋟u↶u⤽u∪u⩈u⩆u⩊u⊍u⩅u∪︀u↷u⤼u⋎u⋏�¤u∱u⌭u⥥u†uℸu‐u⤏uďuдu⇊u⩷�°uδu⦱u⥿u𝔡u♦uϝu⋲�÷u⋇uђu⌞u⌍�$u𝕕u≑u∸u∔u⊡u⌟u⌌u𝒹uѕu⧶uđu⋱u▿u⦦uџu⟿�éu⩮uěu≖�êu≕uэuėu≒u𝔢u⪚�èu⪖u⪘u⪙u⏧uℓu⪕u⪗uēu∅u u u uŋu uęu𝕖u⋕u⧣u⩱uεuϵ�=u≟u⩸u⧥u≓u⥱uℯuη�ð�ëu€�!uфu♀uffiuffufflu𝔣ufiZfju♭uflu▱uƒu𝕗u⋔u⫙u⨍�½u⅓�¼u⅕u⅙u⅛u⅔u⅖�¾u⅗u⅜u⅘u⅚u⅝u⅞u⁄u⌢u𝒻u⪌uǵuγu⪆uğuĝuгuġu⪩u⪀u⪂u⪄u⋛︀u⪔u𝔤uℷuѓu⪒u⪥u⪤u≩u⪊u⪈u⋧u𝕘uℊu⪎u⪐u⪧u⩺u⋗u⦕u⩼u⥸u≩︀uъu⥈u↭uℏuĥu♥u…u⊹u𝔥u⤥u⤦u⇿u∻u↩u↪u𝕙u―u𝒽uħu⁃�í�îuиuе�¡u𝔦�ìu⨌u∭u⧜u℩uijuīuıu⊷uƵu℅u∞u⧝u⊺u⨗u⨼uёuįu𝕚uι�¿u𝒾u⋹u⋵u⋴u⋳uĩuі�ïuĵuйu𝔧uȷu𝕛u𝒿uјuєuκuϰuķuкu𝔨uĸuхuќu𝕜u𝓀u⤛u⤎u⪋u⥢uĺu⦴uλu⦑u⪅�«u⤟u⤝u↫u⤹u⥳u↢u⪫u⤙u⪭u⪭︀u⤌u❲�{�[u⦋u⦏u⦍uľuļuлu⤶u⥧u⥋u↲u≤u⇇u⋋u⪨u⩿u⪁u⪃u⋚︀u⪓u⋖u⥼u𝔩u⪑u⥪u▄uљu⥫u◺uŀu⎰u≨u⪉u⪇u⋦u⟬u⇽u⟼u↬u⦅u𝕝u⨭u⨴u∗u◊�(u⦓u⥭u‎u⊿u‹u𝓁u⪍u⪏u‚ułu⪦u⩹u⋉u⥶u⩻u⦖u◃u⥊u⥦u≨︀u∺�¯u♂u✠u▮u⨩uмu—u𝔪u℧�µu⫰u−u⨪u⫛u⊧u𝕞u𝓂uμu⊸u⋙̸u≫⃒u⇍u⇎u⋘̸u≪⃒u⇏u⊯u⊮uńu∠⃒u⩰̸u≋̸uʼnu♮u⩃uňuņu⩭̸u⩂uнu–u⇗u⤤u≐̸u⤨u𝔫u↮u⫲u⋼u⋺uњu≦̸u↚u‥u𝕟�¬u⋹̸u⋵̸u⋷u⋶u⋾u⋽u⫽⃥u∂̸u⨔u↛u⤳̸u↝̸u𝓃u⊄u⫅̸u⊅u⫆̸�ñuν�#u№u u⊭u⤄u≍⃒u⊬u≥⃒u>⃒u⧞u⤂u≤⃒u<⃒u⊴⃒u⤃u⊵⃒u∼⃒u⇖u⤣u⤧�ó�ôuоuőu⨸u⦼uœu⦿u𝔬u˛�òu⧁u⦵u⦾u⦻u⧀uōuωuοu⦶u𝕠u⦷u⦹u∨u⩝uℴ�ª�ºu⊶u⩖u⩗u⩛�øu⊘�õu⨶�öu⌽�¶u⫳u⫽uп�%�.u‰u‱u𝔭uφuϕu☎uπuϖuℎ�+u⨣u⨢u⨥u⩲u⨦u⨧u⨕u𝕡�£u⪳u⪷u⪹u⪵u⋨u′u⌮u⌒u⌓u⊰u𝓅uψu u𝔮u𝕢u⁗u𝓆u⨖�?u⤜u⥤u∽̱uŕu⦳u⦒u⦥�»u⥵u⤠u⤳u⤞u⥅u⥴u↣u↝u⤚u∶u❳�}�]u⦌u⦎u⦐uřuŗuрu⤷u⥩u↳u▭u⥽u𝔯u⥬uρuϱu⇉u⋌u˚u‏u⎱u⫮u⟭u⇾u⦆u𝕣u⨮u⨵�)u⦔u⨒u›u𝓇u⋊u▹u⧎u⥨u℞uśu⪴u⪸ušuşuŝu⪶u⪺u⋩u⨓uсu⋅u⩦u⇘�§�;u⤩u✶u𝔰u♯uщuш�­uσuςu⩪u⪞u⪠u⪝u⪟u≆u⨤u⥲u⨳u⧤u⌣u⪪u⪬u⪬︀uь�/u⧄u⌿u𝕤u♠u⊓︀u⊔︀u𝓈u☆u⊂u⫅u⪽u⫃u⫁u⫋u⊊u⪿u⥹u⫇u⫕u⫓u♪�¹�²�³u⫆u⪾u⫘u⫄u⟉u⫗u⥻u⫂u⫌u⊋u⫀u⫈u⫔u⫖u⇙u⤪�ßu⌖uτuťuţuтu⌕u𝔱uθuϑ�þ�×u⨱u⨰u⌶u⫱u𝕥u⫚u‴u▵u≜u◬u⨺u⨹u⧍u⨻u⏢u𝓉uцuћuŧu⥣�úuўuŭ�ûuуuűu⥾u𝔲�ùu▀u⌜u⌏u◸uūuųu𝕦uυu⇈u⌝u⌎uůu◹u𝓊u⋰uũ�üu⦧u⫨u⫩u⦜u⊊︀u⫋︀u⊋︀u⫌︀uвu⊻u≚u⋮u𝔳u𝕧u𝓋u⦚uŵu⩟u≙u℘u𝔴u𝕨u𝓌u𝔵uξu⋻u𝕩u𝓍�ýuяuŷuы�¥u𝔶uїu𝕪u𝓎uю�ÿuźužuзużuζu𝔷uжu⇝u𝕫u𝓏u‍u‌(�ZAEligzAElig;ZAMPzAMP;ZAacutezAacute;zAbreve;ZAcirczAcirc;zAcy;zAfr;ZAgravezAgrave;zAlpha;zAmacr;zAnd;zAogon;zAopf;zApplyFunction;ZAringzAring;zAscr;zAssign;ZAtildezAtilde;ZAumlzAuml;z
Backslash;zBarv;zBarwed;zBcy;zBecause;zBernoullis;zBeta;zBfr;zBopf;zBreve;zBscr;zBumpeq;zCHcy;ZCOPYzCOPY;zCacute;zCap;zCapitalDifferentialD;zCayleys;zCcaron;ZCcedilzCcedil;zCcirc;zCconint;zCdot;zCedilla;z
CenterDot;zCfr;zChi;z
CircleDot;zCircleMinus;zCirclePlus;zCircleTimes;zClockwiseContourIntegral;zCloseCurlyDoubleQuote;zCloseCurlyQuote;zColon;zColone;z
Congruent;zConint;zContourIntegral;zCopf;z
Coproduct;z CounterClockwiseContourIntegral;zCross;zCscr;zCup;zCupCap;zDD;z	DDotrahd;zDJcy;zDScy;zDZcy;zDagger;zDarr;zDashv;zDcaron;zDcy;zDel;zDelta;zDfr;zDiacriticalAcute;zDiacriticalDot;zDiacriticalDoubleAcute;zDiacriticalGrave;zDiacriticalTilde;zDiamond;zDifferentialD;zDopf;zDot;zDotDot;z	DotEqual;zDoubleContourIntegral;z
DoubleDot;zDoubleDownArrow;zDoubleLeftArrow;zDoubleLeftRightArrow;zDoubleLeftTee;zDoubleLongLeftArrow;zDoubleLongLeftRightArrow;zDoubleLongRightArrow;zDoubleRightArrow;zDoubleRightTee;zDoubleUpArrow;zDoubleUpDownArrow;zDoubleVerticalBar;z
DownArrow;z
DownArrowBar;zDownArrowUpArrow;z
DownBreve;zDownLeftRightVector;zDownLeftTeeVector;zDownLeftVector;zDownLeftVectorBar;zDownRightTeeVector;zDownRightVector;zDownRightVectorBar;zDownTee;z
DownTeeArrow;z
Downarrow;zDscr;zDstrok;zENG;ZETHzETH;ZEacutezEacute;zEcaron;ZEcirczEcirc;zEcy;zEdot;zEfr;ZEgravezEgrave;zElement;zEmacr;zEmptySmallSquare;zEmptyVerySmallSquare;zEogon;zEopf;zEpsilon;zEqual;zEqualTilde;zEquilibrium;zEscr;zEsim;zEta;ZEumlzEuml;zExists;z
ExponentialE;zFcy;zFfr;zFilledSmallSquare;zFilledVerySmallSquare;zFopf;zForAll;zFouriertrf;zFscr;zGJcy;ZGTzGT;zGamma;zGammad;zGbreve;zGcedil;zGcirc;zGcy;zGdot;zGfr;zGg;zGopf;z
GreaterEqual;zGreaterEqualLess;zGreaterFullEqual;zGreaterGreater;zGreaterLess;zGreaterSlantEqual;z
GreaterTilde;zGscr;zGt;zHARDcy;zHacek;zHat;zHcirc;zHfr;z
HilbertSpace;zHopf;zHorizontalLine;zHscr;zHstrok;z
HumpDownHump;z
HumpEqual;zIEcy;zIJlig;zIOcy;ZIacutezIacute;ZIcirczIcirc;zIcy;zIdot;zIfr;ZIgravezIgrave;zIm;zImacr;zImaginaryI;zImplies;zInt;z	Integral;z
Intersection;zInvisibleComma;zInvisibleTimes;zIogon;zIopf;zIota;zIscr;zItilde;zIukcy;ZIumlzIuml;zJcirc;zJcy;zJfr;zJopf;zJscr;zJsercy;zJukcy;zKHcy;zKJcy;zKappa;zKcedil;zKcy;zKfr;zKopf;zKscr;zLJcy;ZLTzLT;zLacute;zLambda;zLang;zLaplacetrf;zLarr;zLcaron;zLcedil;zLcy;zLeftAngleBracket;z
LeftArrow;z
LeftArrowBar;zLeftArrowRightArrow;zLeftCeiling;zLeftDoubleBracket;zLeftDownTeeVector;zLeftDownVector;zLeftDownVectorBar;z
LeftFloor;zLeftRightArrow;zLeftRightVector;zLeftTee;z
LeftTeeArrow;zLeftTeeVector;z
LeftTriangle;zLeftTriangleBar;zLeftTriangleEqual;zLeftUpDownVector;zLeftUpTeeVector;z
LeftUpVector;zLeftUpVectorBar;zLeftVector;zLeftVectorBar;z
Leftarrow;zLeftrightarrow;zLessEqualGreater;zLessFullEqual;zLessGreater;z	LessLess;zLessSlantEqual;z
LessTilde;zLfr;zLl;zLleftarrow;zLmidot;zLongLeftArrow;zLongLeftRightArrow;zLongRightArrow;zLongleftarrow;zLongleftrightarrow;zLongrightarrow;zLopf;zLowerLeftArrow;zLowerRightArrow;zLscr;zLsh;zLstrok;zLt;zMap;zMcy;zMediumSpace;z
Mellintrf;zMfr;z
MinusPlus;zMopf;zMscr;zMu;zNJcy;zNacute;zNcaron;zNcedil;zNcy;zNegativeMediumSpace;zNegativeThickSpace;zNegativeThinSpace;zNegativeVeryThinSpace;zNestedGreaterGreater;zNestedLessLess;zNewLine;zNfr;zNoBreak;zNonBreakingSpace;zNopf;zNot;z
NotCongruent;z
NotCupCap;zNotDoubleVerticalBar;zNotElement;z	NotEqual;zNotEqualTilde;z
NotExists;zNotGreater;zNotGreaterEqual;zNotGreaterFullEqual;zNotGreaterGreater;zNotGreaterLess;zNotGreaterSlantEqual;zNotGreaterTilde;zNotHumpDownHump;z
NotHumpEqual;zNotLeftTriangle;zNotLeftTriangleBar;zNotLeftTriangleEqual;zNotLess;z
NotLessEqual;zNotLessGreater;zNotLessLess;zNotLessSlantEqual;z
NotLessTilde;zNotNestedGreaterGreater;zNotNestedLessLess;zNotPrecedes;zNotPrecedesEqual;zNotPrecedesSlantEqual;zNotReverseElement;zNotRightTriangle;zNotRightTriangleBar;zNotRightTriangleEqual;zNotSquareSubset;zNotSquareSubsetEqual;zNotSquareSuperset;zNotSquareSupersetEqual;z
NotSubset;zNotSubsetEqual;zNotSucceeds;zNotSucceedsEqual;zNotSucceedsSlantEqual;zNotSucceedsTilde;zNotSuperset;zNotSupersetEqual;z	NotTilde;zNotTildeEqual;zNotTildeFullEqual;zNotTildeTilde;zNotVerticalBar;zNscr;ZNtildezNtilde;zNu;zOElig;ZOacutezOacute;ZOcirczOcirc;zOcy;zOdblac;zOfr;ZOgravezOgrave;zOmacr;zOmega;zOmicron;zOopf;zOpenCurlyDoubleQuote;zOpenCurlyQuote;zOr;zOscr;ZOslashzOslash;ZOtildezOtilde;zOtimes;ZOumlzOuml;zOverBar;z
OverBrace;zOverBracket;zOverParenthesis;z	PartialD;zPcy;zPfr;zPhi;zPi;z
PlusMinus;zPoincareplane;zPopf;zPr;z	Precedes;zPrecedesEqual;zPrecedesSlantEqual;zPrecedesTilde;zPrime;zProduct;zProportion;z
Proportional;zPscr;zPsi;ZQUOTzQUOT;zQfr;zQopf;zQscr;zRBarr;ZREGzREG;zRacute;zRang;zRarr;zRarrtl;zRcaron;zRcedil;zRcy;zRe;zReverseElement;zReverseEquilibrium;zReverseUpEquilibrium;zRfr;zRho;zRightAngleBracket;zRightArrow;zRightArrowBar;zRightArrowLeftArrow;z
RightCeiling;zRightDoubleBracket;zRightDownTeeVector;zRightDownVector;zRightDownVectorBar;zRightFloor;z	RightTee;zRightTeeArrow;zRightTeeVector;zRightTriangle;zRightTriangleBar;zRightTriangleEqual;zRightUpDownVector;zRightUpTeeVector;zRightUpVector;zRightUpVectorBar;zRightVector;zRightVectorBar;zRightarrow;zRopf;z
RoundImplies;zRrightarrow;zRscr;zRsh;zRuleDelayed;zSHCHcy;zSHcy;zSOFTcy;zSacute;zSc;zScaron;zScedil;zScirc;zScy;zSfr;zShortDownArrow;zShortLeftArrow;zShortRightArrow;z
ShortUpArrow;zSigma;zSmallCircle;zSopf;zSqrt;zSquare;zSquareIntersection;z
SquareSubset;zSquareSubsetEqual;zSquareSuperset;zSquareSupersetEqual;zSquareUnion;zSscr;zStar;zSub;zSubset;zSubsetEqual;z	Succeeds;zSucceedsEqual;zSucceedsSlantEqual;zSucceedsTilde;z	SuchThat;zSum;zSup;z	Superset;zSupersetEqual;zSupset;ZTHORNzTHORN;zTRADE;zTSHcy;zTScy;zTab;zTau;zTcaron;zTcedil;zTcy;zTfr;z
Therefore;zTheta;zThickSpace;z
ThinSpace;zTilde;zTildeEqual;zTildeFullEqual;zTildeTilde;zTopf;z
TripleDot;zTscr;zTstrok;ZUacutezUacute;zUarr;z	Uarrocir;zUbrcy;zUbreve;ZUcirczUcirc;zUcy;zUdblac;zUfr;ZUgravezUgrave;zUmacr;z	UnderBar;zUnderBrace;z
UnderBracket;zUnderParenthesis;zUnion;z
UnionPlus;zUogon;zUopf;zUpArrow;zUpArrowBar;zUpArrowDownArrow;zUpDownArrow;zUpEquilibrium;zUpTee;zUpTeeArrow;zUparrow;zUpdownarrow;zUpperLeftArrow;zUpperRightArrow;zUpsi;zUpsilon;zUring;zUscr;zUtilde;ZUumlzUuml;zVDash;zVbar;zVcy;zVdash;zVdashl;zVee;zVerbar;zVert;zVerticalBar;z
VerticalLine;zVerticalSeparator;zVerticalTilde;zVeryThinSpace;zVfr;zVopf;zVscr;zVvdash;zWcirc;zWedge;zWfr;zWopf;zWscr;zXfr;zXi;zXopf;zXscr;zYAcy;zYIcy;zYUcy;ZYacutezYacute;zYcirc;zYcy;zYfr;zYopf;zYscr;zYuml;zZHcy;zZacute;zZcaron;zZcy;zZdot;zZeroWidthSpace;zZeta;zZfr;zZopf;zZscr;Zaacutezaacute;zabreve;zac;zacE;zacd;Zacirczacirc;Zacutezacute;zacy;Zaeligzaelig;zaf;zafr;Zagravezagrave;zalefsym;zaleph;zalpha;zamacr;zamalg;Zampzamp;zand;zandand;zandd;z	andslope;zandv;zang;zange;zangle;zangmsd;z	angmsdaa;z	angmsdab;z	angmsdac;z	angmsdad;z	angmsdae;z	angmsdaf;z	angmsdag;z	angmsdah;zangrt;zangrtvb;z	angrtvbd;zangsph;zangst;zangzarr;zaogon;zaopf;zap;zapE;zapacir;zape;zapid;zapos;zapprox;z	approxeq;Zaringzaring;zascr;zast;zasymp;zasympeq;Zatildezatilde;Zaumlzauml;z	awconint;zawint;zbNot;z	backcong;zbackepsilon;z
backprime;zbacksim;z
backsimeq;zbarvee;zbarwed;z	barwedge;zbbrk;z	bbrktbrk;zbcong;zbcy;zbdquo;zbecaus;zbecause;zbemptyv;zbepsi;zbernou;zbeta;zbeth;zbetween;zbfr;zbigcap;zbigcirc;zbigcup;zbigodot;z	bigoplus;z
bigotimes;z	bigsqcup;zbigstar;zbigtriangledown;zbigtriangleup;z	biguplus;zbigvee;z	bigwedge;zbkarow;z
blacklozenge;zblacksquare;zblacktriangle;zblacktriangledown;zblacktriangleleft;zblacktriangleright;zblank;zblk12;zblk14;zblk34;zblock;zbne;zbnequiv;zbnot;zbopf;zbot;zbottom;zbowtie;zboxDL;zboxDR;zboxDl;zboxDr;zboxH;zboxHD;zboxHU;zboxHd;zboxHu;zboxUL;zboxUR;zboxUl;zboxUr;zboxV;zboxVH;zboxVL;zboxVR;zboxVh;zboxVl;zboxVr;zboxbox;zboxdL;zboxdR;zboxdl;zboxdr;zboxh;zboxhD;zboxhU;zboxhd;zboxhu;z	boxminus;zboxplus;z	boxtimes;zboxuL;zboxuR;zboxul;zboxur;zboxv;zboxvH;zboxvL;zboxvR;zboxvh;zboxvl;zboxvr;zbprime;zbreve;Zbrvbarzbrvbar;zbscr;zbsemi;zbsim;zbsime;zbsol;zbsolb;z	bsolhsub;zbull;zbullet;zbump;zbumpE;zbumpe;zbumpeq;zcacute;zcap;zcapand;z	capbrcup;zcapcap;zcapcup;zcapdot;zcaps;zcaret;zcaron;zccaps;zccaron;Zccedilzccedil;zccirc;zccups;zccupssm;zcdot;Zcedilzcedil;zcemptyv;Zcentzcent;z
centerdot;zcfr;zchcy;zcheck;z
checkmark;zchi;zcir;zcirE;zcirc;zcirceq;zcirclearrowleft;zcirclearrowright;z	circledR;z	circledS;zcircledast;zcircledcirc;zcircleddash;zcire;z	cirfnint;zcirmid;zcirscir;zclubs;z	clubsuit;zcolon;zcolone;zcoloneq;zcomma;zcommat;zcomp;zcompfn;zcomplement;z
complexes;zcong;zcongdot;zconint;zcopf;zcoprod;�copyzcopy;zcopysr;zcrarr;zcross;zcscr;zcsub;zcsube;zcsup;zcsupe;zctdot;zcudarrl;zcudarrr;zcuepr;zcuesc;zcularr;zcularrp;zcup;z	cupbrcap;zcupcap;zcupcup;zcupdot;zcupor;zcups;zcurarr;zcurarrm;zcurlyeqprec;zcurlyeqsucc;z	curlyvee;zcurlywedge;Zcurrenzcurren;zcurvearrowleft;zcurvearrowright;zcuvee;zcuwed;z	cwconint;zcwint;zcylcty;zdArr;zdHar;zdagger;zdaleth;zdarr;zdash;zdashv;zdbkarow;zdblac;zdcaron;zdcy;zdd;zddagger;zddarr;zddotseq;Zdegzdeg;zdelta;zdemptyv;zdfisht;zdfr;zdharl;zdharr;zdiam;zdiamond;zdiamondsuit;zdiams;zdie;zdigamma;zdisin;zdiv;Zdividezdivide;zdivideontimes;zdivonx;zdjcy;zdlcorn;zdlcrop;zdollar;zdopf;zdot;zdoteq;z	doteqdot;z	dotminus;zdotplus;z
dotsquare;zdoublebarwedge;z
downarrow;zdowndownarrows;zdownharpoonleft;zdownharpoonright;z	drbkarow;zdrcorn;zdrcrop;zdscr;zdscy;zdsol;zdstrok;zdtdot;zdtri;zdtrif;zduarr;zduhar;zdwangle;zdzcy;z	dzigrarr;zeDDot;zeDot;Zeacutezeacute;zeaster;zecaron;zecir;Zecirczecirc;zecolon;zecy;zedot;zee;zefDot;zefr;zeg;Zegravezegrave;zegs;zegsdot;zel;z	elinters;zell;zels;zelsdot;zemacr;zempty;z	emptyset;zemptyv;zemsp13;zemsp14;zemsp;zeng;zensp;zeogon;zeopf;zepar;zeparsl;zeplus;zepsi;zepsilon;zepsiv;zeqcirc;zeqcolon;zeqsim;zeqslantgtr;zeqslantless;zequals;zequest;zequiv;zequivDD;z	eqvparsl;zerDot;zerarr;zescr;zesdot;zesim;zeta;Zethzeth;Zeumlzeuml;zeuro;zexcl;zexist;zexpectation;z
exponentiale;zfallingdotseq;zfcy;zfemale;zffilig;zfflig;zffllig;zffr;zfilig;zfjlig;zflat;zfllig;zfltns;zfnof;zfopf;zforall;zfork;zforkv;z	fpartint;Zfrac12zfrac12;zfrac13;Zfrac14zfrac14;zfrac15;zfrac16;zfrac18;zfrac23;zfrac25;Zfrac34zfrac34;zfrac35;zfrac38;zfrac45;zfrac56;zfrac58;zfrac78;zfrasl;zfrown;zfscr;zgE;zgEl;zgacute;zgamma;zgammad;zgap;zgbreve;zgcirc;zgcy;zgdot;zge;zgel;zgeq;zgeqq;z	geqslant;zges;zgescc;zgesdot;zgesdoto;z	gesdotol;zgesl;zgesles;zgfr;zgg;zggg;zgimel;zgjcy;zgl;zglE;zgla;zglj;zgnE;zgnap;z	gnapprox;zgne;zgneq;zgneqq;zgnsim;zgopf;zgrave;zgscr;zgsim;zgsime;zgsiml;�gtzgt;zgtcc;zgtcir;zgtdot;zgtlPar;zgtquest;z
gtrapprox;zgtrarr;zgtrdot;z
gtreqless;zgtreqqless;zgtrless;zgtrsim;z
gvertneqq;zgvnE;zhArr;zhairsp;zhalf;zhamilt;zhardcy;zharr;zharrcir;zharrw;zhbar;zhcirc;zhearts;z
heartsuit;zhellip;zhercon;zhfr;z	hksearow;z	hkswarow;zhoarr;zhomtht;zhookleftarrow;zhookrightarrow;zhopf;zhorbar;zhscr;zhslash;zhstrok;zhybull;zhyphen;Ziacuteziacute;zic;Zicirczicirc;zicy;ziecy;Ziexclziexcl;ziff;zifr;Zigravezigrave;zii;ziiiint;ziiint;ziinfin;ziiota;zijlig;zimacr;zimage;z	imagline;z	imagpart;zimath;zimof;zimped;zin;zincare;zinfin;z	infintie;zinodot;zint;zintcal;z	integers;z	intercal;z	intlarhk;zintprod;ziocy;ziogon;ziopf;ziota;ziprod;Ziquestziquest;ziscr;zisin;zisinE;zisindot;zisins;zisinsv;zisinv;zit;zitilde;ziukcy;Ziumlziuml;zjcirc;zjcy;zjfr;zjmath;zjopf;zjscr;zjsercy;zjukcy;zkappa;zkappav;zkcedil;zkcy;zkfr;zkgreen;zkhcy;zkjcy;zkopf;zkscr;zlAarr;zlArr;zlAtail;zlBarr;zlE;zlEg;zlHar;zlacute;z	laemptyv;zlagran;zlambda;zlang;zlangd;zlangle;zlap;Zlaquozlaquo;zlarr;zlarrb;zlarrbfs;zlarrfs;zlarrhk;zlarrlp;zlarrpl;zlarrsim;zlarrtl;zlat;zlatail;zlate;zlates;zlbarr;zlbbrk;zlbrace;zlbrack;zlbrke;zlbrksld;zlbrkslu;zlcaron;zlcedil;zlceil;zlcub;zlcy;zldca;zldquo;zldquor;zldrdhar;z	ldrushar;zldsh;zle;z
leftarrow;zleftarrowtail;zleftharpoondown;zleftharpoonup;zleftleftarrows;zleftrightarrow;zleftrightarrows;zleftrightharpoons;zleftrightsquigarrow;zleftthreetimes;zleg;zleq;zleqq;z	leqslant;zles;zlescc;zlesdot;zlesdoto;z	lesdotor;zlesg;zlesges;zlessapprox;zlessdot;z
lesseqgtr;zlesseqqgtr;zlessgtr;zlesssim;zlfisht;zlfloor;zlfr;zlg;zlgE;zlhard;zlharu;zlharul;zlhblk;zljcy;zll;zllarr;z	llcorner;zllhard;zlltri;zlmidot;zlmoust;zlmoustache;zlnE;zlnap;z	lnapprox;zlne;zlneq;zlneqq;zlnsim;zloang;zloarr;zlobrk;zlongleftarrow;zlongleftrightarrow;zlongmapsto;zlongrightarrow;zlooparrowleft;zlooparrowright;zlopar;zlopf;zloplus;zlotimes;zlowast;zlowbar;zloz;zlozenge;zlozf;zlpar;zlparlt;zlrarr;z	lrcorner;zlrhar;zlrhard;zlrm;zlrtri;zlsaquo;zlscr;zlsh;zlsim;zlsime;zlsimg;zlsqb;zlsquo;zlsquor;zlstrok;�ltzlt;zltcc;zltcir;zltdot;zlthree;zltimes;zltlarr;zltquest;zltrPar;zltri;zltrie;zltrif;z	lurdshar;zluruhar;z
lvertneqq;zlvnE;zmDDot;Zmacrzmacr;zmale;zmalt;zmaltese;zmap;zmapsto;zmapstodown;zmapstoleft;z	mapstoup;zmarker;zmcomma;zmcy;zmdash;zmeasuredangle;zmfr;zmho;�microzmicro;zmid;zmidast;zmidcir;Zmiddotzmiddot;zminus;zminusb;zminusd;zminusdu;zmlcp;zmldr;zmnplus;zmodels;zmopf;zmp;zmscr;zmstpos;zmu;z	multimap;zmumap;znGg;znGt;znGtv;znLeftarrow;znLeftrightarrow;znLl;znLt;znLtv;znRightarrow;znVDash;znVdash;znabla;znacute;znang;znap;znapE;znapid;znapos;znapprox;znatur;znatural;z	naturals;Znbspznbsp;znbump;znbumpe;zncap;zncaron;zncedil;zncong;z	ncongdot;zncup;zncy;zndash;zne;zneArr;znearhk;znearr;znearrow;znedot;znequiv;znesear;znesim;znexist;znexists;znfr;zngE;znge;zngeq;zngeqq;z
ngeqslant;znges;zngsim;zngt;zngtr;znhArr;znharr;znhpar;zni;znis;znisd;zniv;znjcy;znlArr;znlE;znlarr;znldr;znle;znleftarrow;znleftrightarrow;znleq;znleqq;z
nleqslant;znles;znless;znlsim;znlt;znltri;znltrie;znmid;znopf;�notznot;znotin;znotinE;z	notindot;znotinva;znotinvb;znotinvc;znotni;znotniva;znotnivb;znotnivc;znpar;z
nparallel;znparsl;znpart;znpolint;znpr;znprcue;znpre;znprec;znpreceq;znrArr;znrarr;znrarrc;znrarrw;znrightarrow;znrtri;znrtrie;znsc;znsccue;znsce;znscr;z
nshortmid;znshortparallel;znsim;znsime;znsimeq;znsmid;znspar;znsqsube;znsqsupe;znsub;znsubE;znsube;znsubset;z
nsubseteq;znsubseteqq;znsucc;znsucceq;znsup;znsupE;znsupe;znsupset;z
nsupseteq;znsupseteqq;zntgl;Zntildezntilde;zntlg;zntriangleleft;zntrianglelefteq;zntriangleright;zntrianglerighteq;znu;znum;znumero;znumsp;znvDash;znvHarr;znvap;znvdash;znvge;znvgt;znvinfin;znvlArr;znvle;znvlt;znvltrie;znvrArr;znvrtrie;znvsim;znwArr;znwarhk;znwarr;znwarrow;znwnear;zoS;Zoacutezoacute;zoast;zocir;Zocirczocirc;zocy;zodash;zodblac;zodiv;zodot;zodsold;zoelig;zofcir;zofr;zogon;Zogravezograve;zogt;zohbar;zohm;zoint;zolarr;zolcir;zolcross;zoline;zolt;zomacr;zomega;zomicron;zomid;zominus;zoopf;zopar;zoperp;zoplus;zor;zorarr;zord;zorder;zorderof;Zordfzordf;Zordmzordm;zorigof;zoror;zorslope;zorv;zoscr;Zoslashzoslash;zosol;Zotildezotilde;zotimes;z	otimesas;Zoumlzouml;zovbar;zpar;Zparazpara;z	parallel;zparsim;zparsl;zpart;zpcy;zpercnt;zperiod;zpermil;zperp;zpertenk;zpfr;zphi;zphiv;zphmmat;zphone;zpi;z
pitchfork;zpiv;zplanck;zplanckh;zplankv;zplus;z	plusacir;zplusb;zpluscir;zplusdo;zplusdu;zpluse;Zplusmnzplusmn;zplussim;zplustwo;zpm;z	pointint;zpopf;Zpoundzpound;zpr;zprE;zprap;zprcue;zpre;zprec;zprecapprox;zpreccurlyeq;zpreceq;zprecnapprox;z	precneqq;z	precnsim;zprecsim;zprime;zprimes;zprnE;zprnap;zprnsim;zprod;z	profalar;z	profline;z	profsurf;zprop;zpropto;zprsim;zprurel;zpscr;zpsi;zpuncsp;zqfr;zqint;zqopf;zqprime;zqscr;zquaternions;zquatint;zquest;zquesteq;Zquotzquot;zrAarr;zrArr;zrAtail;zrBarr;zrHar;zrace;zracute;zradic;z	raemptyv;zrang;zrangd;zrange;zrangle;Zraquozraquo;zrarr;zrarrap;zrarrb;zrarrbfs;zrarrc;zrarrfs;zrarrhk;zrarrlp;zrarrpl;zrarrsim;zrarrtl;zrarrw;zratail;zratio;z
rationals;zrbarr;zrbbrk;zrbrace;zrbrack;zrbrke;zrbrksld;zrbrkslu;zrcaron;zrcedil;zrceil;zrcub;zrcy;zrdca;zrdldhar;zrdquo;zrdquor;zrdsh;zreal;zrealine;z	realpart;zreals;zrect;Zregzreg;zrfisht;zrfloor;zrfr;zrhard;zrharu;zrharul;zrho;zrhov;zrightarrow;zrightarrowtail;zrightharpoondown;zrightharpoonup;zrightleftarrows;zrightleftharpoons;zrightrightarrows;zrightsquigarrow;zrightthreetimes;zring;z
risingdotseq;zrlarr;zrlhar;zrlm;zrmoust;zrmoustache;zrnmid;zroang;zroarr;zrobrk;zropar;zropf;zroplus;zrotimes;zrpar;zrpargt;z	rppolint;zrrarr;zrsaquo;zrscr;zrsh;zrsqb;zrsquo;zrsquor;zrthree;zrtimes;zrtri;zrtrie;zrtrif;z	rtriltri;zruluhar;zrx;zsacute;zsbquo;zsc;zscE;zscap;zscaron;zsccue;zsce;zscedil;zscirc;zscnE;zscnap;zscnsim;z	scpolint;zscsim;zscy;zsdot;zsdotb;zsdote;zseArr;zsearhk;zsearr;zsearrow;Zsectzsect;zsemi;zseswar;z	setminus;zsetmn;zsext;zsfr;zsfrown;zsharp;zshchcy;zshcy;z	shortmid;zshortparallel;Zshyzshy;zsigma;zsigmaf;zsigmav;zsim;zsimdot;zsime;zsimeq;zsimg;zsimgE;zsiml;zsimlE;zsimne;zsimplus;zsimrarr;zslarr;zsmallsetminus;zsmashp;z	smeparsl;zsmid;zsmile;zsmt;zsmte;zsmtes;zsoftcy;zsol;zsolb;zsolbar;zsopf;zspades;z
spadesuit;zspar;zsqcap;zsqcaps;zsqcup;zsqcups;zsqsub;zsqsube;z	sqsubset;zsqsubseteq;zsqsup;zsqsupe;z	sqsupset;zsqsupseteq;zsqu;zsquare;zsquarf;zsquf;zsrarr;zsscr;zssetmn;zssmile;zsstarf;zstar;zstarf;zstraightepsilon;zstraightphi;zstrns;zsub;zsubE;zsubdot;zsube;zsubedot;zsubmult;zsubnE;zsubne;zsubplus;zsubrarr;zsubset;z	subseteq;z
subseteqq;z
subsetneq;zsubsetneqq;zsubsim;zsubsub;zsubsup;zsucc;zsuccapprox;zsucccurlyeq;zsucceq;zsuccnapprox;z	succneqq;z	succnsim;zsuccsim;zsum;zsung;Zsup1zsup1;Zsup2zsup2;Zsup3zsup3;zsup;zsupE;zsupdot;zsupdsub;zsupe;zsupedot;zsuphsol;zsuphsub;zsuplarr;zsupmult;zsupnE;zsupne;zsupplus;zsupset;z	supseteq;z
supseteqq;z
supsetneq;zsupsetneqq;zsupsim;zsupsub;zsupsup;zswArr;zswarhk;zswarr;zswarrow;zswnwar;Zszligzszlig;ztarget;ztau;ztbrk;ztcaron;ztcedil;ztcy;ztdot;ztelrec;ztfr;zthere4;z
therefore;ztheta;z	thetasym;zthetav;zthickapprox;z	thicksim;zthinsp;zthkap;zthksim;Zthornzthorn;ztilde;�timesztimes;ztimesb;z	timesbar;ztimesd;ztint;ztoea;ztop;ztopbot;ztopcir;ztopf;ztopfork;ztosa;ztprime;ztrade;z	triangle;z
triangledown;z
triangleleft;ztrianglelefteq;z
triangleq;ztriangleright;ztrianglerighteq;ztridot;ztrie;z	triminus;ztriplus;ztrisb;ztritime;z	trpezium;ztscr;ztscy;ztshcy;ztstrok;ztwixt;ztwoheadleftarrow;ztwoheadrightarrow;zuArr;zuHar;Zuacutezuacute;zuarr;zubrcy;zubreve;Zucirczucirc;zucy;zudarr;zudblac;zudhar;zufisht;zufr;Zugravezugrave;zuharl;zuharr;zuhblk;zulcorn;z	ulcorner;zulcrop;zultri;zumacr;Zumlzuml;zuogon;zuopf;zuparrow;zupdownarrow;zupharpoonleft;zupharpoonright;zuplus;zupsi;zupsih;zupsilon;zupuparrows;zurcorn;z	urcorner;zurcrop;zuring;zurtri;zuscr;zutdot;zutilde;zutri;zutrif;zuuarr;Zuumlzuuml;zuwangle;zvArr;zvBar;zvBarv;zvDash;zvangrt;zvarepsilon;z	varkappa;zvarnothing;zvarphi;zvarpi;z
varpropto;zvarr;zvarrho;z	varsigma;z
varsubsetneq;zvarsubsetneqq;z
varsupsetneq;zvarsupsetneqq;z	vartheta;zvartriangleleft;zvartriangleright;zvcy;zvdash;zvee;zveebar;zveeeq;zvellip;zverbar;zvert;zvfr;zvltri;zvnsub;zvnsup;zvopf;zvprop;zvrtri;zvscr;zvsubnE;zvsubne;zvsupnE;zvsupne;zvzigzag;zwcirc;zwedbar;zwedge;zwedgeq;zweierp;zwfr;zwopf;zwp;zwr;zwreath;zwscr;zxcap;zxcirc;zxcup;zxdtri;zxfr;zxhArr;zxharr;zxi;zxlArr;zxlarr;zxmap;zxnis;zxodot;zxopf;zxoplus;zxotime;zxrArr;zxrarr;zxscr;zxsqcup;zxuplus;zxutri;zxvee;zxwedge;Zyacutezyacute;zyacy;zycirc;zycy;Zyenzyen;zyfr;zyicy;zyopf;zyscr;zyucy;Zyumlzyuml;zzacute;zzcaron;zzcy;zzdot;zzeetrf;zzeta;zzfr;zzhcy;zzigrarr;zzopf;zzscr;zzwj;zzwnj;u������)"r�
���������������������������������������)ZDoctypeZ
CharactersZSpaceCharacters�StartTag�EndTag�EmptyTag�CommentZ
ParseErrorrrrcCsg|]\}}||f�qSr2r2)r3�k�vr2r2r6r7xsZmathc@seZdZdS)�DataLossWarningN)�__name__�
__module__�__qualname__r2r2r2r6r|src@seZdZdS)�ReparseExceptionN)rrrr2r2r2r6r�sr)rr r!r"r#r$) rGrHrIrJrKrLrMrNrOrPrQrRrSrHrTrHrHrUrVrWrXrYrZr[r\r]r^r_r`rHrarb)0Z
__future__rrr�stringZEOF�EZ
namespaces�	frozensetZscopingElementsZformattingElementsZspecialElementsZhtmlIntegrationPointElementsZ"mathmlTextIntegrationPointElementsZadjustSVGAttributesZadjustMathMLAttributesZadjustForeignAttributes�dict�itemsZunadjustForeignAttributesZspaceCharactersZtableInsertModeElementsZascii_lowercaseZasciiLowercaseZascii_uppercaseZasciiUppercaseZ
ascii_lettersZasciiLettersZdigitsZ	hexdigitsZ	hexDigitsZasciiUpper2LowerZheadingElementsZvoidElementsZ
cdataElementsZrcdataElementsZbooleanAttributesZentitiesWindows1252ZxmlEntitiesZentitiesZreplacementCharactersZ
tokenTypesZ
tagTokenTypes�prefixes�UserWarningr�	Exceptionrr2r2r2r6�<module>s<






























































































































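A short sketch of how these tables are typically consulted. It targets the standalone html5lib package that this vendored bytecode was built from (an assumption; pip's internal copy is not meant to be imported directly), and the printed values are indicative:

# Illustrative lookups against a few of the tables listed above.
from html5lib.constants import namespaces, entities, tokenTypes, E

print(namespaces["html"])       # 'http://www.w3.org/1999/xhtml'
print(entities["amp;"])         # '&'
print(tokenTypes["StartTag"])   # integer token-type code used by the tokenizer
print(E["eof-in-tag-name"])     # 'Unexpected end of file in the tag name.'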
site-packages/pip/_vendor/html5lib/__pycache__/_inputstream.cpython-36.pyc  [tar entry: compiled CPython 3.6 bytecode, not readable as text]
[Recoverable docstrings and identifiers describe the module's contents: BufferedStream buffers an unbuffered stream as a list of chunks, on the assumption that repeatedly joining strings would be O(n**2); HTMLInputStream() hands text input to HTMLUnicodeInputStream and byte input to HTMLBinaryInputStream; HTMLUnicodeInputStream feeds the tokenizer a normalised character stream, replacing invalid code points and tracking line and column positions; HTMLBinaryInputStream additionally determines the character encoding from a BOM, a transport or override encoding, a prescan of the document head for a meta declaration, or optionally chardet, and raises ReparseException via changeEncoding() when a later declaration disagrees with the initial guess; EncodingBytes, EncodingParser and ContentAttrParser implement the byte-level mini parser used for that meta-charset prescan; lookupEncoding() maps an encoding label to a Python codec via webencodings, returning None for unknown labels. The marshalled bytecode itself is omitted here.]
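The effect of that detection chain is easiest to see through the public parser, which reports the chosen encoding as documentEncoding. A minimal sketch, again using the standalone html5lib package rather than pip's vendored copy; the printed names assume the detection order described above:

# Sketch: encoding detection only happens for byte input.
import html5lib

parser = html5lib.HTMLParser()
parser.parse(b'<meta charset="windows-1252"><p>caf\xe9</p>')
print(parser.documentEncoding)   # 'windows-1252', found by the meta prescan

parser.parse(b'\xef\xbb\xbf<p>BOM wins</p>')
print(parser.documentEncoding)   # 'utf-8', a byte-order mark beats later declarations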
site-packages/pip/_vendor/html5lib/__pycache__/__init__.cpython-36.pyc  [tar entry: compiled CPython 3.6 bytecode]
[Besides the marshalled code, the entry preserves the package's exports (HTMLParser, parse, parseFragment, getTreeBuilder, getTreeWalker, serialize; __version__ is "1.0b10") and its module docstring, which survives intact:]

    HTML parsing library based on the WHATWG "HTML5"
    specification. The parser is designed to be compatible with existing
    HTML found in the wild and implements well-defined error recovery that
    is largely compatible with modern desktop web browsers.

    Example usage:

    import html5lib
    f = open("my_document.html")
    tree = html5lib.parse(f)

site-packages/pip/_vendor/html5lib/__pycache__/html5parser.cpython-36.pyc  [next tar entry: compiled bytecode for the html5lib parser itself; its dump continues below]
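Building on the example in the package docstring recovered above (and before the html5parser bytecode resumes), a slightly fuller sketch of the package-level entry points; standalone html5lib assumed, and the serialised output shown is indicative:

# Sketch: parse a document, parse a fragment, serialise the fragment back out.
import html5lib

with open("my_document.html", "rb") as f:
    document = html5lib.parse(f)          # ElementTree-based tree by default

fragment = html5lib.parseFragment("<b>bold</b> text", container="div")
print(html5lib.serialize(fragment))       # typically '<b>bold</b> text'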
Z
Wn ek
r`ddlm
Z
YnXddl
mZddl
mZddl
mZdd	lmZdd
l
mZddlmZmZmZmZmZmZmZmZmZmZmZm Z!m"Z"m#Z#m$Z$m%Z%d!dd�Z&d"dd�Z'dd�Z(Gdd�de)�Z*ej+dd��Z,dd�Z-d#dd�Z.Gdd �d e/�Z0dS)$�)�absolute_import�division�unicode_literals)�with_metaclass�viewkeys�PY3N)�OrderedDict�)�_inputstream)�
_tokenizer)�treebuilders)�Marker)�_utils)�spaceCharacters�asciiUpper2Lower�specialElements�headingElements�
cdataElements�rcdataElements�
tokenTypes�
tagTokenTypes�
namespaces�htmlIntegrationPointElements�"mathmlTextIntegrationPointElements�adjustForeignAttributes�adjustMathMLAttributes�adjustSVGAttributes�E�ReparseException�etreeTcKs$tj|�}t||d�}|j|f|�S)z.Parse a string or file-like object into a tree)�namespaceHTMLElements)r�getTreeBuilder�
HTMLParser�parse)�doc�treebuilderr �kwargs�tb�p�r)�!/usr/lib/python3.6/html5parser.pyr#s
r#�divcKs,tj|�}t||d�}|j|fd|i|��S)N)r �	container)rr!r"�
parseFragment)r$r,r%r r&r'r(r)r)r*r-&s
r-csG�fdd�dt�}|S)NcseZdZ�fdd�ZdS)z-method_decorator_metaclass.<locals>.DecoratedcsBx0|j�D]$\}}t|tj�r&�|�}|||<q
Wtj||||�S)N)�items�
isinstance�types�FunctionType�type�__new__)�metaZ	classname�basesZ	classDictZ
attributeNameZ	attribute)�functionr)r*r3.s
z5method_decorator_metaclass.<locals>.Decorated.__new__N)�__name__�
__module__�__qualname__r3r))r6r)r*�	Decorated-sr:)r2)r6r:r))r6r*�method_decorator_metaclass,sr;c@s�eZdZdZd+dd�Zd,dd	�Zd
d�Zedd
��Zdd�Z	dd�Z
dd�Zdd�Zdd�Z
dd�Zd-dd�Zdd�Zdd �Zd!d"�Zd#d$�Zd%d&�Zd'd(�Zd)d*�ZdS).r"zZHTML parser. Generates a tree structure from a stream of (possibly
        malformed) HTMLNFTcsL|�_|dkrtjd�}||��_g�_t�fdd�t|�j�D���_dS)a
        strict - raise an exception when a parse error is encountered

        tree - a treebuilder class controlling the type of tree that will be
        returned. Built in treebuilders can be accessed through
        html5lib.treebuilders.getTreeBuilder(treeType)
        Nrcs g|]\}}||��j�f�qSr))�tree)�.0�name�cls)�selfr)r*�
<listcomp>Msz'HTMLParser.__init__.<locals>.<listcomp>)	�strictrr!r<�errors�dict�	getPhasesr.�phases)r@r<rBr �debugr))r@r*�__init__<s


zHTMLParser.__init__r+cKsh||_||_||_tj|fd|i|��|_|j�y|j�Wn$tk
rb|j�|j�YnXdS)N�parser)	�
innerHTMLModer,�	scriptingrZ
HTMLTokenizer�	tokenizer�reset�mainLoopr)r@�stream�	innerHTMLr,rKr&r)r)r*�_parsePszHTMLParser._parsecCs�|jj�d|_g|_g|_d|_|jr�|jj�|_	|j	t
krL|jj|j_
n0|j	tkrd|jj|j_
n|j	dkr||jj|j_
n|jd|_|jj�|j�nd|_	|jd|_d|_d|_d|_dS)NFz	no quirks�	plaintext�
beforeHtml�initialT)r<rM�
firstStartTagrC�log�
compatModerJr,�lowerrPrrL�rcdataState�stater�rawtextState�plaintextStaterF�phase�insertHtmlElement�resetInsertionModeZ	lastPhaseZbeforeRCDataPhase�
framesetOK)r@r)r)r*rM^s*





zHTMLParser.resetcCst|d�sdS|jjjdjS)z�The name of the character encoding
        that was used to decode the input stream,
        or :obj:`None` if that is not determined yet.

        rLNr)�hasattrrLrO�charEncodingr>)r@r)r)r*�documentEncoding�s
zHTMLParser.documentEncodingcCsJ|jdkr6|jtdkr6d|jko4|jdjt�dkS|j|jftkSdS)Nzannotation-xml�mathml�encoding�	text/html�application/xhtml+xml)rfrg)r>�	namespacer�
attributes�	translaterr)r@�elementr)r)r*�isHTMLIntegrationPoint�s


z!HTMLParser.isHTMLIntegrationPointcCs|j|jftkS)N)rhr>r)r@rkr)r)r*�isMathMLTextIntegrationPoint�sz'HTMLParser.isMathMLTextIntegrationPointcCsztd}td}td}td}td}td}td}�x�|j�D�]�}d}	|}
�x�|
dk	�r|
}	|jjrx|jjdnd}|r�|jnd}|r�|jnd}
|
d	}||kr�|j|
d
|
jdi��d}
qVt|jj�dk�sl||jj	k�sl|j
|��r ||k�r|d
tddg�k�sl|||fk�sl|tdk�rP|
dk�rP||k�rP|d
dk�sl|j
|��rt||||fk�rt|j}n
|jd}||k�r�|j|
�}
qV||k�r�|j|
�}
qV||k�r�|j|
�}
qV||k�r�|j|
�}
qV||k�r�|j|
�}
qV||krV|j|
�}
qVW||krD|	drD|	drD|jdd
|	d
i�qDWd}g}x8|�rt|j|j�|jj�}|�r>|j|k�s>t��q>WdS)N�
CharactersZSpaceCharacters�StartTag�EndTag�CommentZDoctype�
ParseErrorr	r2�data�datavarsrr>ZmglyphZ
malignmarkrdzannotation-xml�svg�inForeignContent�selfClosing�selfClosingAcknowledgedz&non-void-element-with-trailing-solidusT���)r�normalizedTokensr<�openElementsrhr>�
parseError�get�len�defaultNamespacerm�	frozensetrrlr]rF�processCharacters�processSpaceCharacters�processStartTag�
processEndTag�processComment�processDoctype�append�
processEOF�AssertionError)r@ZCharactersTokenZSpaceCharactersTokenZ
StartTagTokenZEndTagTokenZCommentTokenZDoctypeTokenZParseErrorToken�tokenZ
prev_token�	new_token�currentNodeZcurrentNodeNamespaceZcurrentNodeNamer2r]Z	reprocessrFr)r)r*rN�sp










zHTMLParser.mainLoopccs x|jD]}|j|�VqWdS)N)rL�normalizeToken)r@r�r)r)r*rz�szHTMLParser.normalizedTokenscOs |j|ddf|�|�|jj�S)a�Parse a HTML document into a well-formed tree

        stream - a filelike object or string containing the HTML to be parsed

        The optional encoding parameter must be a string that indicates
        the encoding.  If specified, that encoding will be used,
        regardless of any BOM or later declaration (such as in a meta
        element)

        scripting - treat noscript elements as if javascript was turned on
        FN)rQr<ZgetDocument)r@rO�argsr&r)r)r*r#�szHTMLParser.parsecOs|j|df|�|�|jj�S)a2Parse a HTML fragment into a well-formed tree fragment

        container - name of the element we're setting the innerHTML property
        if set to None, default to 'div'

        stream - a filelike object or string containing the HTML to be parsed

        The optional encoding parameter must be a string that indicates
        the encoding.  If specified, that encoding will be used,
        regardless of any BOM or later declaration (such as in a meta
        element)

        scripting - treat noscript elements as if javascript was turned on
        T)rQr<ZgetFragment)r@rOr�r&r)r)r*r-�szHTMLParser.parseFragment�XXX-undefined-errorcCs@|dkri}|jj|jjj�||f�|jr<tt||��dS)N)rCr�rLrOZpositionrBrrr)r@�	errorcodertr)r)r*r|s
zHTMLParser.parseErrorcCsT|dtdkrP|d}t|�|d<t|�t|d�krP|dj|ddd��|S)z3 HTML5 specific normalizations to the token stream r2rorsNr	ry)rrr~�update)r@r��rawr)r)r*r�szHTMLParser.normalizeTokencCst|t�dS)N)�adjust_attributesr)r@r�r)r)r*rsz!HTMLParser.adjustMathMLAttributescCst|t�dS)N)r�r)r@r�r)r)r*rszHTMLParser.adjustSVGAttributescCst|t�dS)N)r��adjustForeignAttributesMap)r@r�r)r)r*rsz"HTMLParser.adjustForeignAttributescCs|jj�dS)N)rIr])r@r�r)r)r*�reparseTokenNormalszHTMLParser.reparseTokenNormalcCs�d}ddddddddddd	d	d
dd�}x�|jjddd�D]�}|j}d}||jjdkrl|jsbt�d}|j}|dkr~|js~t�|r�|j|jjkr�q:||kr�|j||}Pq:|r:|jd	}Pq:W||_dS)NF�inSelect�inCell�inRow�inTableBody�	inCaption�
inColumnGroup�inTable�inBody�
inFrameset�
beforeHead)�select�td�th�tr�tbody�thead�tfoot�caption�colgroup�table�head�body�frameset�htmlr	rTr�r�r�r�ry)r�r�r�r�)	r<r{r>rPr�rhrrFr])r@ZlastZnewModes�nodeZnodeNameZ	new_phaser)r)r*r_!sB


zHTMLParser.resetInsertionModecCsR|dkst�|jj|�|dkr.|jj|j_n|jj|j_|j|_|j	d|_dS)zYGeneric RCDATA/RAWTEXT Parsing algorithm
        contentType - RCDATA or RAWTEXT
        �RAWTEXT�RCDATA�textN)r�r�)
r�r<�
insertElementrLr[rZrYr]�
originalPhaserF)r@r�ZcontentTyper)r)r*�parseRCDataRawtextMszHTMLParser.parseRCDataRawtext)NFTF)Fr+F)r�N)r7r8r9�__doc__rHrQrM�propertyrcrlrmrNrzr#r-r|r�rrrr�r_r�r)r)r)r*r"8s&

"
C
,r"cs"dd�}dd�}Gdd�dt|||����Gdd�d��}Gd	d
�d
��}G�fdd�d��}G�fd
d�d��}G�fdd�d��}G�fdd�d��}G�fdd�d��}	G�fdd�d��}
G�fdd�d��}G�fdd�d��}G�fdd�d��}
G�fdd�d��}G�fdd �d ��}G�fd!d"�d"��}G�fd#d$�d$��}G�fd%d&�d&��}G�fd'd(�d(��}G�fd)d*�d*��}G�fd+d,�d,��}G�fd-d.�d.��}G�fd/d0�d0��}G�fd1d2�d2��}G�fd3d4�d4��}|||||||	|
|||
||||||||||||d5�S)6Ncs(tdd�tj�D�����fdd�}|S)z4Logger that records which phase processes each tokencss|]\}}||fVqdS)Nr))r=�key�valuer)r)r*�	<genexpr>csz)getPhases.<locals>.log.<locals>.<genexpr>cs��jjd�r�t|�dkr�|d}yd�|di}Wn�YnX|dtkr\|d|d<|jjj|jjjj|jj	j
j|j
j�j|f��|f|�|�S�|f|�|�SdS)NZprocessrr2r>)r7�
startswithr~rrIrVr�rLrZr]�	__class__)r@r�r&r��info)r6�
type_namesr)r*�wrappedfs
z'getPhases.<locals>.log.<locals>.wrapped)rDrr.)r6r�r))r6r�r*rVaszgetPhases.<locals>.logcSs|rt|�StSdS)N)r;r2)Z
use_metaclassZmetaclass_funcr)r)r*�getMetaclasszszgetPhases.<locals>.getMetaclassc@sXeZdZdZdd�Zdd�Zdd�Zdd	�Zd
d�Zdd
�Z	dd�Z
dd�Zdd�ZdS)zgetPhases.<locals>.PhasezNBase class for helper object that implements each phase of processing
        cSs||_||_dS)N)rIr<)r@rIr<r)r)r*rH�sz!getPhases.<locals>.Phase.__init__cSst�dS)N)�NotImplementedError)r@r)r)r*r��sz#getPhases.<locals>.Phase.processEOFcSs|jj||jjd�dS)Nr	ry)r<�
insertCommentr{)r@r�r)r)r*r��sz'getPhases.<locals>.Phase.processCommentcSs|jjd�dS)Nzunexpected-doctype)rIr|)r@r�r)r)r*r��sz'getPhases.<locals>.Phase.processDoctypecSs|jj|d�dS)Nrs)r<�
insertText)r@r�r)r)r*r��sz*getPhases.<locals>.Phase.processCharacterscSs|jj|d�dS)Nrs)r<r�)r@r�r)r)r*r��sz/getPhases.<locals>.Phase.processSpaceCharacterscSs|j|d|�S)Nr>)�startTagHandler)r@r�r)r)r*r��sz(getPhases.<locals>.Phase.processStartTagcSsl|jjr"|ddkr"|jjd�x<|dj�D],\}}||jjdjkr0||jjdj|<q0Wd|j_dS)Nr>r�z
non-html-rootrsrF)rIrUr|r.r<r{ri)r@r��attrr�r)r)r*�startTagHtml�sz%getPhases.<locals>.Phase.startTagHtmlcSs|j|d|�S)Nr>)�
endTagHandler)r@r�r)r)r*r��sz&getPhases.<locals>.Phase.processEndTagN)
r7r8r9r�rHr�r�r�r�r�r�r�r�r)r)r)r*�Phase�s
r�c@sLeZdZdd�Zdd�Zdd�Zdd�Zd	d
�Zdd�Zd
d�Z	dd�Z
dS)zgetPhases.<locals>.InitialPhasecSsdS)Nr))r@r�r)r)r*r��sz6getPhases.<locals>.InitialPhase.processSpaceCharacterscSs|jj||jj�dS)N)r<r��document)r@r�r)r)r*r��sz.getPhases.<locals>.InitialPhase.processCommentc8Ss|d}|d}|d}|d}|dks@|dk	s@|dk	rL|dkrL|jjd�|dkrXd}|jj|�|dkrv|jt�}|�s�|ddk�s�|jdJ��s�|dKk�s�|jdL��r�|dk�s�|�r�|j�dDk�r�dE|j_n*|jdM��s�|jdN��r|dk	�rdH|j_|jj	dI|j_
dS)ONr>�publicId�systemId�correctr�zabout:legacy-compatzunknown-doctype��*+//silmaril//dtd html pro v0r11 19970101//�4-//advasoft ltd//dtd html 3.0 aswedit + extensions//�*-//as//dtd html 3.0 aswedit + extensions//�-//ietf//dtd html 2.0 level 1//�-//ietf//dtd html 2.0 level 2//�&-//ietf//dtd html 2.0 strict level 1//�&-//ietf//dtd html 2.0 strict level 2//�-//ietf//dtd html 2.0 strict//�-//ietf//dtd html 2.0//�-//ietf//dtd html 2.1e//�-//ietf//dtd html 3.0//�-//ietf//dtd html 3.2 final//�-//ietf//dtd html 3.2//�-//ietf//dtd html 3//�-//ietf//dtd html level 0//�-//ietf//dtd html level 1//�-//ietf//dtd html level 2//�-//ietf//dtd html level 3//�"-//ietf//dtd html strict level 0//�"-//ietf//dtd html strict level 1//�"-//ietf//dtd html strict level 2//�"-//ietf//dtd html strict level 3//�-//ietf//dtd html strict//�-//ietf//dtd html//�(-//metrius//dtd metrius presentational//�5-//microsoft//dtd internet explorer 2.0 html strict//�.-//microsoft//dtd internet explorer 2.0 html//�0-//microsoft//dtd internet explorer 2.0 tables//�5-//microsoft//dtd internet explorer 3.0 html strict//�.-//microsoft//dtd internet explorer 3.0 html//�0-//microsoft//dtd internet explorer 3.0 tables//�#-//netscape comm. corp.//dtd html//�*-//netscape comm. corp.//dtd strict html//�*-//o'reilly and associates//dtd html 2.0//�3-//o'reilly and associates//dtd html extended 1.0//�;-//o'reilly and associates//dtd html extended relaxed 1.0//�N-//softquad software//dtd hotmetal pro 6.0::19990601::extensions to html 4.0//�E-//softquad//dtd hotmetal pro 4.0::19971010::extensions to html 4.0//�$-//spyglass//dtd html 2.0 extended//�+-//sq//dtd html 2.0 hotmetal + extensions//�--//sun microsystems corp.//dtd hotjava html//�4-//sun microsystems corp.//dtd hotjava strict html//�-//w3c//dtd html 3 1995-03-24//�-//w3c//dtd html 3.2 draft//�-//w3c//dtd html 3.2 final//�-//w3c//dtd html 3.2//�-//w3c//dtd html 3.2s draft//�-//w3c//dtd html 4.0 frameset//�#-//w3c//dtd html 4.0 transitional//�(-//w3c//dtd html experimental 19960712//�&-//w3c//dtd html experimental 970421//�-//w3c//dtd w3 html//�-//w3o//dtd w3 html 3.0//�#-//webtechs//dtd mozilla html 2.0//�-//webtechs//dtd mozilla html//�$-//w3o//dtd w3 html strict 3.0//en//�"-/w3c/dtd html 4.0 transitional/en� -//w3c//dtd html 4.01 frameset//�$-//w3c//dtd html 4.01 transitional//z:http://www.ibm.com/data/dtd/v11/ibmxhtml1-transitional.dtd�quirks� -//w3c//dtd xhtml 1.0 frameset//�$-//w3c//dtd xhtml 1.0 transitional//zlimited quirksrS)7r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrrr)rrr�)rr)r	r
)rr)rIr|r<Z
insertDoctyperjrr�rXrWrFr])r@r�r>r�r�r�r)r)r*r��s�



z.getPhases.<locals>.InitialPhase.processDoctypecSsd|j_|jjd|j_dS)NrrS)rIrWrFr])r@r)r)r*�anythingElsesz,getPhases.<locals>.InitialPhase.anythingElsecSs|jjd�|j�|S)Nzexpected-doctype-but-got-chars)rIr|r)r@r�r)r)r*r�sz1getPhases.<locals>.InitialPhase.processCharacterscSs"|jjdd|di�|j�|S)Nz"expected-doctype-but-got-start-tagr>)rIr|r)r@r�r)r)r*r�sz/getPhases.<locals>.InitialPhase.processStartTagcSs"|jjdd|di�|j�|S)Nz expected-doctype-but-got-end-tagr>)rIr|r)r@r�r)r)r*r�sz-getPhases.<locals>.InitialPhase.processEndTagcSs|jjd�|j�dS)Nzexpected-doctype-but-got-eofT)rIr|r)r@r)r)r*r�%sz*getPhases.<locals>.InitialPhase.processEOFN)r7r8r9r�r�r�rr�r�r�r�r)r)r)r*�InitialPhase�s_rc@sDeZdZdd�Zdd�Zdd�Zdd�Zd	d
�Zdd�Zd
d�Z	dS)z"getPhases.<locals>.BeforeHtmlPhasecSs&|jjtdd��|jjd|j_dS)Nr�ror�)r<Z
insertRoot�impliedTagTokenrIrFr])r@r)r)r*r^,sz4getPhases.<locals>.BeforeHtmlPhase.insertHtmlElementcSs|j�dS)NT)r^)r@r)r)r*r�1sz-getPhases.<locals>.BeforeHtmlPhase.processEOFcSs|jj||jj�dS)N)r<r�r�)r@r�r)r)r*r�5sz1getPhases.<locals>.BeforeHtmlPhase.processCommentcSsdS)Nr))r@r�r)r)r*r�8sz9getPhases.<locals>.BeforeHtmlPhase.processSpaceCharacterscSs|j�|S)N)r^)r@r�r)r)r*r�;sz4getPhases.<locals>.BeforeHtmlPhase.processCharacterscSs |ddkrd|j_|j�|S)Nr>r�T)rIrUr^)r@r�r)r)r*r�?sz2getPhases.<locals>.BeforeHtmlPhase.processStartTagcSs4|ddkr$|jjdd|di�n|j�|SdS)Nr>r�r�r��brzunexpected-end-tag-before-html)r�r�r�r)rIr|r^)r@r�r)r)r*r�Es
z0getPhases.<locals>.BeforeHtmlPhase.processEndTagN)
r7r8r9r^r�r�r�r�r�r�r)r)r)r*�BeforeHtmlPhase*srcsXeZdZ�fdd�Zdd�Zdd�Zdd�Zd	d
�Zdd�Zd
d�Z	dd�Z
dd�ZdS)z"getPhases.<locals>.BeforeHeadPhasecsV�j|||�tjd|jfd|jfg�|_|j|j_tjd|jfg�|_	|j
|j	_dS)Nr�r�r�r)r�r�r�r)rHr�MethodDispatcherr��startTagHeadr��
startTagOther�default�endTagImplyHeadr��endTagOther)r@rIr<)r�r)r*rHNs
z+getPhases.<locals>.BeforeHeadPhase.__init__cSs|jtdd��dS)Nr�roT)rr
)r@r)r)r*r�\sz-getPhases.<locals>.BeforeHeadPhase.processEOFcSsdS)Nr))r@r�r)r)r*r�`sz9getPhases.<locals>.BeforeHeadPhase.processSpaceCharacterscSs|jtdd��|S)Nr�ro)rr
)r@r�r)r)r*r�csz4getPhases.<locals>.BeforeHeadPhase.processCharacterscSs|jjdj|�S)Nr�)rIrFr�)r@r�r)r)r*r�gsz/getPhases.<locals>.BeforeHeadPhase.startTagHtmlcSs0|jj|�|jjd|j_|jjd|j_dS)Nr	�inHeadry)r<r�r{�headPointerrIrFr])r@r�r)r)r*rjsz/getPhases.<locals>.BeforeHeadPhase.startTagHeadcSs|jtdd��|S)Nr�ro)rr
)r@r�r)r)r*rosz0getPhases.<locals>.BeforeHeadPhase.startTagOthercSs|jtdd��|S)Nr�ro)rr
)r@r�r)r)r*rssz2getPhases.<locals>.BeforeHeadPhase.endTagImplyHeadcSs|jjdd|di�dS)Nzend-tag-after-implied-rootr>)rIr|)r@r�r)r)r*rwsz.getPhases.<locals>.BeforeHeadPhase.endTagOtherN)r7r8r9rHr�r�r�r�rrrrr))r�r)r*�BeforeHeadPhaseMsrcs�eZdZ�fdd�Zdd�Zdd�Zdd�Zd	d
�Zdd�Zd
d�Z	dd�Z
dd�Zdd�Zdd�Z
dd�Zdd�Zdd�Zdd�Zdd �Zd!S)"zgetPhases.<locals>.InHeadPhasecs��j|||�tjd|jfd|jfd|jfd|jfd|jfd|jfd|j	fd
|j
fg�|_|j|j_
tjd
|jfd|jfg�|_|j|j_
dS)Nr��title�noframes�style�noscript�script�base�basefont�bgsound�command�linkr4r�rr�)rr)rrr r!r")rr�r�)rHrrr��
startTagTitle�startTagNoFramesStyle�startTagNoscript�startTagScript�startTagBaseLinkCommand�startTagMetarr�rr�
endTagHead�endTagHtmlBodyBrr�r)r@rIr<)r�r)r*rH|s 
z'getPhases.<locals>.InHeadPhase.__init__cSs|j�dS)NT)r)r@r)r)r*r��sz)getPhases.<locals>.InHeadPhase.processEOFcSs|j�|S)N)r)r@r�r)r)r*r��sz0getPhases.<locals>.InHeadPhase.processCharacterscSs|jjdj|�S)Nr�)rIrFr�)r@r�r)r)r*r��sz+getPhases.<locals>.InHeadPhase.startTagHtmlcSs|jjd�dS)Nz!two-heads-are-not-better-than-one)rIr|)r@r�r)r)r*r�sz+getPhases.<locals>.InHeadPhase.startTagHeadcSs$|jj|�|jjj�d|d<dS)NTrx)r<r�r{�pop)r@r�r)r)r*r'�sz6getPhases.<locals>.InHeadPhase.startTagBaseLinkCommandcSs�|jj|�|jjj�d|d<|d}|jjjjddkr�d|krZ|jjjj|d�nVd|kr�d|kr�|dj	�d	kr�t
j|djd
��}t
j
|�}|j�}|jjjj|�dS)NTrxrsr	Z	tentative�charsetZcontentz
http-equivzcontent-typezutf-8)r<r�r{r+rIrLrOrbZchangeEncodingrXr
Z
EncodingBytes�encodeZContentAttrParserr#)r@r�rirsrI�codecr)r)r*r(�s
z+getPhases.<locals>.InHeadPhase.startTagMetacSs|jj|d�dS)Nr�)rIr�)r@r�r)r)r*r#�sz,getPhases.<locals>.InHeadPhase.startTagTitlecSs|jj|d�dS)Nr�)rIr�)r@r�r)r)r*r$�sz4getPhases.<locals>.InHeadPhase.startTagNoFramesStylecSs8|jjr|jj|d�n|jj|�|jjd|j_dS)Nr��inHeadNoscript)rIrKr�r<r�rFr])r@r�r)r)r*r%�sz/getPhases.<locals>.InHeadPhase.startTagNoscriptcSs<|jj|�|jjj|jj_|jj|j_|jjd|j_dS)Nr�)	r<r�rIrLZscriptDataStaterZr]r�rF)r@r�r)r)r*r&�sz-getPhases.<locals>.InHeadPhase.startTagScriptcSs|j�|S)N)r)r@r�r)r)r*r�sz,getPhases.<locals>.InHeadPhase.startTagOthercSs:|jjjj�}|jdks&td|j��|jjd|j_dS)Nr�zExpected head got %s�	afterHead)rIr<r{r+r>r�rFr])r@r�r�r)r)r*r)�sz)getPhases.<locals>.InHeadPhase.endTagHeadcSs|j�|S)N)r)r@r�r)r)r*r*�sz/getPhases.<locals>.InHeadPhase.endTagHtmlBodyBrcSs|jjdd|di�dS)Nzunexpected-end-tagr>)rIr|)r@r�r)r)r*r�sz*getPhases.<locals>.InHeadPhase.endTagOthercSs|jtd��dS)Nr�)r)r
)r@r)r)r*r�sz+getPhases.<locals>.InHeadPhase.anythingElseN)r7r8r9rHr�r�r�rr'r(r#r$r%r&rr)r*rrr))r�r)r*�InHeadPhase{s r1csxeZdZ�fdd�Zdd�Zdd�Zdd�Zd	d
�Zdd�Zd
d�Z	dd�Z
dd�Zdd�Zdd�Z
dd�Zdd�ZdS)z&getPhases.<locals>.InHeadNoscriptPhasecsf�j|||�tjd|jfd|jfd|jfg�|_|j|j_tjd	|j	fd
|j
fg�|_|j|j_dS)
Nr�rr r"r4rrr�rr)rr r"r4rr)r�r)
rHrrr�r'�startTagHeadNoscriptr�rr�endTagNoscript�endTagBrr�r)r@rIr<)r�r)r*rH�s
z/getPhases.<locals>.InHeadNoscriptPhase.__init__cSs|jjd�|j�dS)Nzeof-in-head-noscriptT)rIr|r)r@r)r)r*r��sz1getPhases.<locals>.InHeadNoscriptPhase.processEOFcSs|jjdj|�S)Nr)rIrFr�)r@r�r)r)r*r��sz5getPhases.<locals>.InHeadNoscriptPhase.processCommentcSs|jjd�|j�|S)Nzchar-in-head-noscript)rIr|r)r@r�r)r)r*r��sz8getPhases.<locals>.InHeadNoscriptPhase.processCharacterscSs|jjdj|�S)Nr)rIrFr�)r@r�r)r)r*r�sz=getPhases.<locals>.InHeadNoscriptPhase.processSpaceCharacterscSs|jjdj|�S)Nr�)rIrFr�)r@r�r)r)r*r�sz3getPhases.<locals>.InHeadNoscriptPhase.startTagHtmlcSs|jjdj|�S)Nr)rIrFr�)r@r�r)r)r*r'sz>getPhases.<locals>.InHeadNoscriptPhase.startTagBaseLinkCommandcSs|jjdd|di�dS)Nzunexpected-start-tagr>)rIr|)r@r�r)r)r*r2	sz;getPhases.<locals>.InHeadNoscriptPhase.startTagHeadNoscriptcSs"|jjdd|di�|j�|S)Nzunexpected-inhead-noscript-tagr>)rIr|r)r@r�r)r)r*rsz4getPhases.<locals>.InHeadNoscriptPhase.startTagOthercSs:|jjjj�}|jdks&td|j��|jjd|j_dS)NrzExpected noscript got %sr)rIr<r{r+r>r�rFr])r@r�r�r)r)r*r3sz5getPhases.<locals>.InHeadNoscriptPhase.endTagNoscriptcSs"|jjdd|di�|j�|S)Nzunexpected-inhead-noscript-tagr>)rIr|r)r@r�r)r)r*r4sz/getPhases.<locals>.InHeadNoscriptPhase.endTagBrcSs|jjdd|di�dS)Nzunexpected-end-tagr>)rIr|)r@r�r)r)r*rsz2getPhases.<locals>.InHeadNoscriptPhase.endTagOthercSs|jtd��dS)Nr)r3r
)r@r)r)r*rsz3getPhases.<locals>.InHeadNoscriptPhase.anythingElseN)r7r8r9rHr�r�r�r�r�r'r2rr3r4rrr))r�r)r*�InHeadNoscriptPhase�sr5cspeZdZ�fdd�Zdd�Zdd�Zdd�Zd	d
�Zdd�Zd
d�Z	dd�Z
dd�Zdd�Zdd�Z
dd�ZdS)z!getPhases.<locals>.AfterHeadPhasec
sn�j|||�tjd|jfd|jfd|jfd|jfd
|jfg�|_|j	|j_
tjd|jfg�|_|j
|j_
dS)Nr�r�r�rrr r"r4rrrrr�r)	rrr r"r4rrrr)r�r�r)rHrrr��startTagBody�startTagFrameset�startTagFromHeadrr�rrr*r�r)r@rIr<)r�r)r*rH#s
z*getPhases.<locals>.AfterHeadPhase.__init__cSs|j�dS)NT)r)r@r)r)r*r�4sz,getPhases.<locals>.AfterHeadPhase.processEOFcSs|j�|S)N)r)r@r�r)r)r*r�8sz3getPhases.<locals>.AfterHeadPhase.processCharacterscSs|jjdj|�S)Nr�)rIrFr�)r@r�r)r)r*r�<sz.getPhases.<locals>.AfterHeadPhase.startTagHtmlcSs(d|j_|jj|�|jjd|j_dS)NFr�)rIr`r<r�rFr])r@r�r)r)r*r6?sz.getPhases.<locals>.AfterHeadPhase.startTagBodycSs |jj|�|jjd|j_dS)Nr�)r<r�rIrFr])r@r�r)r)r*r7Dsz2getPhases.<locals>.AfterHeadPhase.startTagFramesetcSst|jjdd|di�|jjj|jj�|jjdj|�x4|jjddd�D]}|jdkrN|jjj	|�PqNWdS)Nz#unexpected-start-tag-out-of-my-headr>rr	r�ry)
rIr|r<r{r�rrFr�r>�remove)r@r�r�r)r)r*r8Hs
z2getPhases.<locals>.AfterHeadPhase.startTagFromHeadcSs|jjdd|di�dS)Nzunexpected-start-tagr>)rIr|)r@r�r)r)r*rRsz.getPhases.<locals>.AfterHeadPhase.startTagHeadcSs|j�|S)N)r)r@r�r)r)r*rUsz/getPhases.<locals>.AfterHeadPhase.startTagOthercSs|j�|S)N)r)r@r�r)r)r*r*Ysz2getPhases.<locals>.AfterHeadPhase.endTagHtmlBodyBrcSs|jjdd|di�dS)Nzunexpected-end-tagr>)rIr|)r@r�r)r)r*r]sz-getPhases.<locals>.AfterHeadPhase.endTagOthercSs.|jjtdd��|jjd|j_d|j_dS)Nr�ror�T)r<r�r
rIrFr]r`)r@r)r)r*r`sz.getPhases.<locals>.AfterHeadPhase.anythingElseN)r7r8r9rHr�r�r�r6r7r8rrr*rrr))r�r)r*�AfterHeadPhase"s
r:cs�eZdZ�fdd�Zdd�Zdd�Zdd�Zd	d
�Zdd�Zd
d�Z	dd�Z
dd�Zdd�Zdd�Z
dd�Zdd�Zdd�Zdd�Zdd �Zd!d"�Zd#d$�Zd%d&�Zd'd(�Zd)d*�Zd+d,�Zd-d.�Zd/d0�Zd1d2�Zd3d4�Zd5d6�Zd7d8�Zd9d:�Zd;d<�Z d=d>�Z!d?d@�Z"dAdB�Z#dCdD�Z$dEdF�Z%dGdH�Z&dIdJ�Z'dKdL�Z(dMdN�Z)dOdP�Z*dQdR�Z+dSdT�Z,dUdV�Z-dWdX�Z.dYdZ�Z/d[d\�Z0d]d^�Z1d_d`�Z2dadb�Z3dcdd�Z4dedf�Z5dgS)hzgetPhases.<locals>.InBodyPhasec,s��j|||�|j|_tjd|jfdd|jfd|jfd|jfde|j	ft
|jfdf|jfd&|j
fdg|jfd*|jfd+|jfdh|jfd8|jfd9|jfdi|jfd=|jfd>|jfdj|jfdk|jfdH|jfdI|jfdJ|jfdK|jfdL|jfdM|jfdN|jfdl|j fdQ|j!fdm|j"fdn|j#fdV|j$fdW|j%fdo|j&fg!�|_'|j(|j'_)tjd|j*fd|j+fdp|j,fd&|j-fd |j.fdq|j/ft
|j0fdr|j1fds|j2fd@|j3fg
�|_4|j5|j4_)dS)tNr�rrr r!r"r4rrrr�r��address�article�aside�
blockquote�center�details�dirr+�dl�fieldset�
figcaption�figure�footer�header�hgroup�main�menu�nav�olr(�section�summary�ul�pre�listing�form�li�dd�dtrR�a�b�big�code�em�font�i�s�small�strike�strong�tt�u�nobr�button�applet�marquee�objectZxmpr��arear�embed�img�keygen�wbr�param�source�track�input�hr�image�isindex�textareaZiframer�noembedrr��rp�rt�option�optgroupZmathrur��colr��framer�r�r�r�r�r�r��dialog)	rrr r!r"r4rrr)r;r<r=r>r?r@rAr+rBrCrDrErFrGrHrIrJrKrLr(rMrNrO)rPrQ)rSrTrU)rWrXrYrZr[r\r]r^r_r`rarb)rerfrg)rhrrirjrkrl)rmrnro)rur)rvrw)rxry)r�rzr�r{r�r�r�r�r�r�r�)r;r<r=r>rdr?r@r|rAr+rBrCrDrErFrGrHrQrIrJrKrLrPrMrNrO)rTrUrS)rVrWrXrYrZr[r\rcr]r^r_r`rarb)rerfrg)6rH�processSpaceCharactersNonPrer�rrr��startTagProcessInHeadr6r7�startTagClosePr�startTagHeading�startTagPreListing�startTagForm�startTagListItem�startTagPlaintext�	startTagA�startTagFormatting�startTagNobr�startTagButton�startTagAppletMarqueeObject�startTagXmp�
startTagTable�startTagVoidFormatting�startTagParamSource�
startTagInput�
startTagHr�
startTagImage�startTagIsIndex�startTagTextarea�startTagIFramer%�startTagRawtext�startTagSelect�startTagRpRt�startTagOpt�startTagMath�startTagSvg�startTagMisplacedr�rr�
endTagBody�
endTagHtml�endTagBlock�
endTagForm�endTagP�endTagListItem�
endTagHeading�endTagFormatting�endTagAppletMarqueeObjectr4r�r)r@rIr<)r�r)r*rHhs~
z'getPhases.<locals>.InBodyPhase.__init__cSs$|j|jko"|j|jko"|j|jkS)N)r>rhri)r@Znode1Znode2r)r)r*�isMatchingFormattingElement�sz:getPhases.<locals>.InBodyPhase.isMatchingFormattingElementcSs�|jj|�|jjd}g}x<|jjddd�D]&}|tkr@Pq0|j||�r0|j|�q0Wt|�dksjt�t|�dkr�|jjj	|d�|jjj|�dS)Nr	�ryryry)
r<r�r{�activeFormattingElementsr
r�r�r~r�r9)r@r�rkZmatchingElementsr�r)r)r*�addFormattingElement�sz3getPhases.<locals>.InBodyPhase.addFormattingElementc
Ss@td�}x2|jjddd�D]}|j|kr|jjd�PqWdS)NrTrUrSr(r�r�r�r�r�r�r�r�r	z expected-closing-tag-but-got-eof)rTrUrSr(r�r�r�r�r�r�r�r�ry)r�r<r{r>rIr|)r@Zallowed_elementsr�r)r)r*r��s
z)getPhases.<locals>.InBodyPhase.processEOFcSsh|d}|j|_|jd�rJ|jjdjdkrJ|jjd	j�rJ|dd�}|rd|jj�|jj|�dS)
Nrs�
r	rPrQrtry)rPrQrtry)	r}r�r�r<r{r>Z
hasContent�#reconstructActiveFormattingElementsr�)r@r�rsr)r)r*�!processSpaceCharactersDropNewline�s

z@getPhases.<locals>.InBodyPhase.processSpaceCharactersDropNewlinecSsT|ddkrdS|jj�|jj|d�|jjrPtdd�|dD��rPd|j_dS)Nrs�cSsg|]}|tk�qSr))r)r=�charr)r)r*rA�szDgetPhases.<locals>.InBodyPhase.processCharacters.<locals>.<listcomp>F)r<r�r�rIr`�any)r@r�r)r)r*r��s
z0getPhases.<locals>.InBodyPhase.processCharacterscSs|jj�|jj|d�dS)Nrs)r<r�r�)r@r�r)r)r*r}�s
z;getPhases.<locals>.InBodyPhase.processSpaceCharactersNonPrecSs|jjdj|�S)Nr)rIrFr�)r@r�r)r)r*r~�sz4getPhases.<locals>.InBodyPhase.startTagProcessInHeadcSs�|jjdddi�t|jj�dks4|jjdjdkrB|jjs�t�nFd|j_x<|dj	�D],\}}||jjdj
krX||jjdj
|<qXWdS)Nzunexpected-start-tagr>r�r	Frs)rIr|r~r<r{r>rPr�r`r.ri)r@r�r�r�r)r)r*r6�sz+getPhases.<locals>.InBodyPhase.startTagBodycSs�|jjdddi�t|jj�dks4|jjdjdkrB|jjs�t�nt|jjsLnj|jjdj	rv|jjdj	j
|jjd�x"|jjdjdkr�|jjj�qxW|jj|�|jj
d|j_dS)	Nzunexpected-start-tagr>r�r	r�r�r�ry)rIr|r~r<r{r>rPr�r`�parent�removeChildr+r�rFr])r@r�r)r)r*r7�s"z/getPhases.<locals>.InBodyPhase.startTagFramesetcSs.|jjddd�r|jtd��|jj|�dS)Nr(rd)�variant)r<�elementInScoper�r
r�)r@r�r)r)r*r	sz-getPhases.<locals>.InBodyPhase.startTagClosePcSs>|jjddd�r|jtd��|jj|�d|j_|j|_dS)Nr(rd)r�F)	r<r�r�r
r�rIr`r�r�)r@r�r)r)r*r�s
z1getPhases.<locals>.InBodyPhase.startTagPreListingcSsZ|jjr|jjdddi�n:|jjddd�r:|jtd��|jj|�|jjd|j_dS)	Nzunexpected-start-tagr>rRr(rd)r�r	ry)	r<�formPointerrIr|r�r�r
r�r{)r@r�r)r)r*r�sz+getPhases.<locals>.InBodyPhase.startTagFormcSs�d|j_dgddgddgd�}||d}xLt|jj�D]<}|j|kr^|jjjt|jd��P|j	t
kr8|jd
kr8Pq8W|jjd
dd�r�|jjjtd
d��|jj|�dS)NFrSrUrT)rSrUrTr>rpr;r+r(rd)r�)r;r+r()
rIr`�reversedr<r{r>r]r�r
�	nameTuplerr�r�)r@r�ZstopNamesMapZ	stopNamesr�r)r)r*r�s"


z/getPhases.<locals>.InBodyPhase.startTagListItemcSs>|jjddd�r|jtd��|jj|�|jjj|jj_dS)Nr(rd)r�)	r<r�r�r
r�rIrLr\rZ)r@r�r)r)r*r�4sz0getPhases.<locals>.InBodyPhase.startTagPlaintextcSsb|jjddd�r|jtd��|jjdjtkrR|jjdd|di�|jjj	�|jj
|�dS)Nr(rd)r�r	zunexpected-start-tagr>ry)r<r�r�r
r{r>rrIr|r+r�)r@r�r)r)r*r�:sz.getPhases.<locals>.InBodyPhase.startTagHeadingcSs~|jjd�}|rf|jjdddd��|jtd��||jjkrL|jjj|�||jjkrf|jjj|�|jj	�|j
|�dS)NrVz$unexpected-start-tag-implies-end-tag)�	startName�endName)r<�!elementInActiveFormattingElementsrIr|r�r
r{r9r�r�r�)r@r�ZafeAElementr)r)r*r�Bs
z(getPhases.<locals>.InBodyPhase.startTagAcSs|jj�|j|�dS)N)r<r�r�)r@r�r)r)r*r�Os
z1getPhases.<locals>.InBodyPhase.startTagFormattingcSsP|jj�|jjd�rB|jjdddd��|jtd��|jj�|j|�dS)Nrcz$unexpected-start-tag-implies-end-tag)r�r�)r<r�r�rIr|r�r
r�)r@r�r)r)r*r�Ss

z+getPhases.<locals>.InBodyPhase.startTagNobrcSsT|jjd�r2|jjdddd��|jtd��|S|jj�|jj|�d|j_dS)Nrdz$unexpected-start-tag-implies-end-tag)r�r�F)	r<r�rIr|r�r
r�r�r`)r@r�r)r)r*r�]s
z-getPhases.<locals>.InBodyPhase.startTagButtoncSs0|jj�|jj|�|jjjt�d|j_dS)NF)r<r�r�r�r�r
rIr`)r@r�r)r)r*r�hs
z:getPhases.<locals>.InBodyPhase.startTagAppletMarqueeObjectcSsB|jjddd�r|jtd��|jj�d|j_|jj|d�dS)Nr(rd)r�Fr�)r<r�r�r
r�rIr`r�)r@r�r)r)r*r�ns

z*getPhases.<locals>.InBodyPhase.startTagXmpcSsR|jjdkr*|jjddd�r*|jtd��|jj|�d|j_|jjd|j_	dS)Nrr(rd)r�Fr�)
rIrWr<r�r�r
r�r`rFr])r@r�r)r)r*r�usz,getPhases.<locals>.InBodyPhase.startTagTablecSs6|jj�|jj|�|jjj�d|d<d|j_dS)NTrxF)r<r�r�r{r+rIr`)r@r�r)r)r*r�}s

z5getPhases.<locals>.InBodyPhase.startTagVoidFormattingcSs@|jj}|j|�d|dkr<|ddjt�dkr<||j_dS)Nr2rs�hidden)rIr`r�rjr)r@r�r`r)r)r*r��s

z,getPhases.<locals>.InBodyPhase.startTagInputcSs$|jj|�|jjj�d|d<dS)NTrx)r<r�r{r+)r@r�r)r)r*r��sz2getPhases.<locals>.InBodyPhase.startTagParamSourcecSsJ|jjddd�r|jtd��|jj|�|jjj�d|d<d|j_dS)Nr(rd)r�TrxF)	r<r�r�r
r�r{r+rIr`)r@r�r)r)r*r��sz)getPhases.<locals>.InBodyPhase.startTagHrcSs6|jjdddd��|jtdd|d|dd��dS)	Nzunexpected-start-tag-treated-asrrrj)�originalName�newNamerorsrw)rirw)rIr|r�r
)r@r�r)r)r*r��s

z,getPhases.<locals>.InBodyPhase.startTagImagecSs|jjdddi�|jjrdSi}d|dkr>|dd|d<|jtdd|d��|jtd	d��|jtd
d��d|dkr�|dd}nd}|jtd
|d��|dj�}d|kr�|d=d|kr�|d=d|d<|jtdd||dd��|j	td
��|jtd	d��|j	td��dS)Nzdeprecated-tagr>rs�actionrsrRro)rirqZlabel�promptz3This is a searchable index. Enter search keywords: rn)r2rsrprw)rirw)
rIr|r<r�r�r
r�r�copyr�)r@r�Z
form_attrsr�rir)r)r*r��s6


z.getPhases.<locals>.InBodyPhase.startTagIsIndexcSs0|jj|�|jjj|jj_|j|_d|j_dS)NF)	r<r�rIrLrYrZr�r�r`)r@r�r)r)r*r��sz/getPhases.<locals>.InBodyPhase.startTagTextareacSsd|j_|j|�dS)NF)rIr`r�)r@r�r)r)r*r��sz-getPhases.<locals>.InBodyPhase.startTagIFramecSs"|jjr|j|�n
|j|�dS)N)rIrKr�r)r@r�r)r)r*r%�sz/getPhases.<locals>.InBodyPhase.startTagNoscriptcSs|jj|d�dS)z8iframe, noembed noframes, noscript(if scripting enabled)r�N)rIr�)r@r�r)r)r*r��sz.getPhases.<locals>.InBodyPhase.startTagRawtextcSs@|jjdjdkr$|jjjtd��|jj�|jjj|�dS)Nr	rxry)	r<r{r>rIr]r�r
r�r�)r@r�r)r)r*r��s
z*getPhases.<locals>.InBodyPhase.startTagOptcSs�|jj�|jj|�d|j_|jj|jjd|jjd|jjd|jjd|jjd|jjdfkrx|jjd|j_n|jjd	|j_dS)
NFr�r�r�r�r�r��inSelectInTabler�)r<r�r�rIr`r]rF)r@r�r)r)r*r��s




z-getPhases.<locals>.InBodyPhase.startTagSelectcSsB|jjd�r2|jj�|jjdjdkr2|jj�|jj|�dS)N�rubyr	ry)r<r��generateImpliedEndTagsr{r>rIr|r�)r@r�r)r)r*r��s


z+getPhases.<locals>.InBodyPhase.startTagRpRtcSsZ|jj�|jj|�|jj|�td|d<|jj|�|drV|jjj�d|d<dS)NrdrhrwTrx)	r<r�rIrrrr�r{r+)r@r�r)r)r*r��s
z+getPhases.<locals>.InBodyPhase.startTagMathcSsZ|jj�|jj|�|jj|�td|d<|jj|�|drV|jjj�d|d<dS)NrurhrwTrx)	r<r�rIrrrr�r{r+)r@r�r)r)r*r��s
z*getPhases.<locals>.InBodyPhase.startTagSvgcSs|jjdd|di�dS)a5 Elements that should be children of other elements that have a
            different insertion mode; here they are ignored
            "caption", "col", "colgroup", "frame", "frameset", "head",
            "option", "optgroup", "tbody", "td", "tfoot", "th", "thead",
            "tr", "noscript"
            zunexpected-start-tag-ignoredr>N)rIr|)r@r�r)r)r*r�sz0getPhases.<locals>.InBodyPhase.startTagMisplacedcSs|jj�|jj|�dS)N)r<r�r�)r@r�r)r)r*rs
z,getPhases.<locals>.InBodyPhase.startTagOthercSs�|jjddd�sD|jtdd��|jjdddi�|jtdd��nX|jjd�|jjd	j	dkrt|jjdddi�|jjj
�}x|j	dkr�|jjj
�}q�WdS)
Nr(rd)r�rozunexpected-end-tagr>rpr	ry)r<r�rr
rIr|r�r�r{r>r+)r@r�r�r)r)r*r�sz&getPhases.<locals>.InBodyPhase.endTagPcSs�|jjd�s|jj�dS|jjdjdkrlx>|jjdd�D]*}|jtd�kr>|jjdd|jd��Pq>W|jjd|j_dS)Nr�r	�rTrUrSryrxr(rvrwr�r�r�r�r�r�r�z$expected-one-end-tag-but-got-another)�gotName�expectedName�	afterBodyry)rTrUrSryrxr(rvrwr�r�r�r�r�r�r�r�)	r<r�rIr|r{r>r�rFr])r@r�r�r)r)r*r�!s
z)getPhases.<locals>.InBodyPhase.endTagBodycSs"|jjd�r|jtd��|SdS)Nr�)r<r�r�r
)r@r�r)r)r*r�3sz)getPhases.<locals>.InBodyPhase.endTagHtmlcSs�|ddkr|j|_|jj|d�}|r2|jj�|jjdj|dkr^|jjdd|di�|r�|jjj	�}x|j|dkr�|jjj	�}qpWdS)Nr>rPr	zend-tag-too-earlyry)
r}r�r<r�r�r{r>rIr|r+)r@r�ZinScoper�r)r)r*r�9s
z*getPhases.<locals>.InBodyPhase.endTagBlockcSsx|jj}d|j_|dks&|jj|�r:|jjdddi�n:|jj�|jjd|krf|jjdddi�|jjj|�dS)Nzunexpected-end-tagr>rRr	zend-tag-too-early-ignoredry)r<r�r�rIr|r�r{r9)r@r�r�r)r)r*r�Gs

z)getPhases.<locals>.InBodyPhase.endTagFormcSs�|ddkrd}nd}|jj|d|d�sB|jjdd|di�nj|jj|dd�|jjd	j|dkr�|jjdd|di�|jjj�}x|j|dkr�|jjj�}q�WdS)
Nr>rS�list)r�zunexpected-end-tag)�excluder	zend-tag-too-earlyry)r<r�rIr|r�r{r>r+)r@r�r�r�r)r)r*r�Tsz-getPhases.<locals>.InBodyPhase.endTagListItemcSs�x$tD]}|jj|�r|jj�PqW|jjdj|dkrR|jjdd|di�xBtD]:}|jj|�rX|jjj�}x|jtkr�|jjj�}qvWPqXWdS)Nr	r>zend-tag-too-earlyry)	rr<r�r�r{r>rIr|r+)r@r��itemr)r)r*r�es


z,getPhases.<locals>.InBodyPhase.endTagHeadingcSs"d}�x|dk�r|d7}|jj|d�}|sL||jjkrZ|jj|j�rZ|j|�dS||jjkr�|jjdd|di�|jjj	|�dS|jj|j�s�|jjdd|di�dS||jjdkr�|jjdd|di�|jjj
|�}d}x,|jj|d�D]}|jtk�r|}P�qW|dk�rb|jjj
�}x||k�rN|jjj
�}�q4W|jjj	|�dS|jj|d}|jjj
|�}|}	}
d}|jjj
|
�}x�|d	k�rh|d7}|d8}|jj|}
|
|jjk�r�|jjj	|
��q�|
|k�r�P|	|k�r
|jjj
|
�d}|
j�}
|
|jj|jjj
|
�<|
|jj|jjj
|
�<|
}
|	j�rV|	jj|	�|
j|	�|
}	�q�W|	j�r~|	jj|	�|jtd�k�r�|jj�\}}|j|	|�n
|j|	�|j�}
|j|
�|j|
�|jjj	|�|jjj||
�|jjj	|�|jjj|jjj
|�d|
�qWdS)z)The much-feared adoption agency algorithmr�r	r>Nzadoption-agency-1.2zadoption-agency-4.4zadoption-agency-1.3r�r�r�r�r�r�ry)r�r�r�r�r�)r<r�r{r�r>rrIr|r�r9�indexr�rr+Z	cloneNoder�r�ZappendChildr�ZgetTableMisnestedNodePosition�insertBeforeZreparentChildren�insert)r@r�ZouterLoopCounterZformattingElementZafeIndexZ
furthestBlockrkZcommonAncestorZbookmarkZlastNoder�ZinnerLoopCounterr�Zcloner�r�r)r)r*r�ts�











z/getPhases.<locals>.InBodyPhase.endTagFormattingcSs�|jj|d�r|jj�|jjdj|dkrF|jjdd|di�|jj|d�r�|jjj�}x|j|dkr�|jjj�}qdW|jj�dS)Nr>r	zend-tag-too-earlyry)	r<r�r�r{r>rIr|r+�clearActiveFormattingElements)r@r�rkr)r)r*r�s
z8getPhases.<locals>.InBodyPhase.endTagAppletMarqueeObjectcSs@|jjdddd��|jj�|jjtdd��|jjj�dS)Nzunexpected-end-tag-treated-asrz
br element)r�r�ro)rIr|r<r�r�r
r{r+)r@r�r)r)r*r4#s

z'getPhases.<locals>.InBodyPhase.endTagBrcSs�x�|jjddd�D]�}|j|dkr~|jj|dd�|jjdj|dkrd|jjdd|di�x|jjj�|krxqfWPq|jtkr|jjdd|di�PqWdS)Nr	r>)r�zunexpected-end-tagryry)	r<r{r>r�rIr|r+r�r)r@r�r�r)r)r*r*s
z*getPhases.<locals>.InBodyPhase.endTagOtherN)6r7r8r9rHr�r�r�r�r�r}r~r6r7rr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r%r�r�r�r�r�r�r�rr�r�r�r�r�r�r�r�r�r4rr))r�r)r*�InBodyPhaseeshG

	

	

$r�cs@eZdZ�fdd�Zdd�Zdd�Zdd�Zd	d
�Zdd�Zd
S)zgetPhases.<locals>.TextPhasecsF�j|||�tjg�|_|j|j_tjd|jfg�|_|j|j_dS)Nr)	rHrrr�rr�endTagScriptr�r)r@rIr<)r�r)r*rH9s
z%getPhases.<locals>.TextPhase.__init__cSs|jj|d�dS)Nrs)r<r�)r@r�r)r)r*r�Asz.getPhases.<locals>.TextPhase.processCharacterscSs8|jjdd|jjdji�|jjj�|jj|j_dS)Nz&expected-named-closing-tag-but-got-eofr>r	Try)rIr|r<r{r>r+r�r])r@r)r)r*r�Ds
z'getPhases.<locals>.TextPhase.processEOFcSsdstd|d��dS)NFz4Tried to process start tag %s in RCDATA/RAWTEXT moder>)r�)r@r�r)r)r*rKsz*getPhases.<locals>.TextPhase.startTagOthercSs*|jjj�}|jdkst�|jj|j_dS)Nr)r<r{r+r>r�rIr�r])r@r�r�r)r)r*r�Nsz)getPhases.<locals>.TextPhase.endTagScriptcSs|jjj�|jj|j_dS)N)r<r{r+rIr�r])r@r�r)r)r*rUsz(getPhases.<locals>.TextPhase.endTagOtherN)	r7r8r9rHr�r�rr�rr))r�r)r*�	TextPhase8sr�cs�eZdZ�fdd�Zdd�Zdd�Zdd�Zd	d
�Zdd�Zd
d�Z	dd�Z
dd�Zdd�Zdd�Z
dd�Zdd�Zdd�Zdd�Zdd �Zd!d"�Zd#d$�Zd%d&�Zd'S)(zgetPhases.<locals>.InTablePhasec
s��j|||�tjd|jfd|jfd|jfd|jfd|jfd|jfd|j	fd|j
fd|jfd|jfg
�|_
|j|j
_tjd|jfd|jfg�|_|j|j_dS)Nr�r�r�rzr�r�r�r�r�r�r�rrrprRr�)r�r�r�)r�r�r�)rr)r�r�rzr�r�r�r�r�r�r�r�)rHrrr��startTagCaption�startTagColgroup�startTagCol�startTagRowGroup�startTagImplyTbodyr��startTagStyleScriptr�r�r�rr�endTagTable�endTagIgnorer�r)r@rIr<)r�r)r*rH[s$
z(getPhases.<locals>.InTablePhase.__init__cSs(x"|jjdjdkr"|jjj�qWdS)Nr	r�r�ry)r�r�)r<r{r>r+)r@r)r)r*�clearStackToTableContextssz8getPhases.<locals>.InTablePhase.clearStackToTableContextcSs0|jjdjdkr |jjd�n|jjs,t�dS)Nr	r�zeof-in-tablery)r<r{r>rIr|rPr�)r@r)r)r*r�|sz*getPhases.<locals>.InTablePhase.processEOFcSs4|jj}|jjd|j_||jj_|jjj|�dS)N�inTableText)rIr]rFr�r�)r@r�r�r)r)r*r��s
z6getPhases.<locals>.InTablePhase.processSpaceCharacterscSs4|jj}|jjd|j_||jj_|jjj|�dS)Nr�)rIr]rFr�r�)r@r�r�r)r)r*r��s
z1getPhases.<locals>.InTablePhase.processCharacterscSs&d|j_|jjdj|�d|j_dS)NTr�F)r<�insertFromTablerIrFr�)r@r�r)r)r*r��sz*getPhases.<locals>.InTablePhase.insertTextcSs6|j�|jjjt�|jj|�|jjd|j_dS)Nr�)	r�r<r�r�r
r�rIrFr])r@r�r)r)r*r��sz/getPhases.<locals>.InTablePhase.startTagCaptioncSs(|j�|jj|�|jjd|j_dS)Nr�)r�r<r�rIrFr])r@r�r)r)r*r��sz0getPhases.<locals>.InTablePhase.startTagColgroupcSs|jtdd��|S)Nr�ro)r�r
)r@r�r)r)r*r��sz+getPhases.<locals>.InTablePhase.startTagColcSs(|j�|jj|�|jjd|j_dS)Nr�)r�r<r�rIrFr])r@r�r)r)r*r��sz0getPhases.<locals>.InTablePhase.startTagRowGroupcSs|jtdd��|S)Nr�ro)r�r
)r@r�r)r)r*r��sz2getPhases.<locals>.InTablePhase.startTagImplyTbodycSs6|jjdddd��|jjjtd��|jjs2|SdS)Nz$unexpected-start-tag-implies-end-tagr�)r�r�)rIr|r]r�r
rP)r@r�r)r)r*r��s
z-getPhases.<locals>.InTablePhase.startTagTablecSs|jjdj|�S)Nr)rIrFr�)r@r�r)r)r*r��sz3getPhases.<locals>.InTablePhase.startTagStyleScriptcSsVd|dkrH|ddjt�dkrH|jjd�|jj|�|jjj�n
|j|�dS)Nr2rsr�z unexpected-hidden-input-in-table)	rjrrIr|r<r�r{r+r)r@r�r)r)r*r��sz-getPhases.<locals>.InTablePhase.startTagInputcSsD|jjd�|jjdkr@|jj|�|jjd|j_|jjj�dS)Nzunexpected-form-in-tabler	ry)rIr|r<r�r�r{r+)r@r�r)r)r*r��s
z,getPhases.<locals>.InTablePhase.startTagFormcSs<|jjdd|di�d|j_|jjdj|�d|j_dS)Nz)unexpected-start-tag-implies-table-voodoor>Tr�F)rIr|r<r�rFr�)r@r�r)r)r*r�sz-getPhases.<locals>.InTablePhase.startTagOthercSs�|jjddd�r�|jj�|jjdjdkrJ|jjdd|jjdjd��x"|jjdjdkrl|jjj�qLW|jjj�|jj�n|jj	s�t
�|jj�dS)	Nr�)r�r	zend-tag-too-early-named)r�r�ryryry)r<r�r�r{r>rIr|r+r_rPr�)r@r�r)r)r*r��s
z+getPhases.<locals>.InTablePhase.endTagTablecSs|jjdd|di�dS)Nzunexpected-end-tagr>)rIr|)r@r�r)r)r*r��sz,getPhases.<locals>.InTablePhase.endTagIgnorecSs<|jjdd|di�d|j_|jjdj|�d|j_dS)Nz'unexpected-end-tag-implies-table-voodoor>Tr�F)rIr|r<r�rFr�)r@r�r)r)r*r�sz+getPhases.<locals>.InTablePhase.endTagOtherN)r7r8r9rHr�r�r�r�r�r�r�r�r�r�r�r�r�r�rr�r�rr))r�r)r*�InTablePhaseYs&	
r�csPeZdZ�fdd�Zdd�Zdd�Zdd�Zd	d
�Zdd�Zd
d�Z	dd�Z
dS)z#getPhases.<locals>.InTableTextPhasecs�j|||�d|_g|_dS)N)rHr��characterTokens)r@rIr<)r�r)r*rH�sz,getPhases.<locals>.InTableTextPhase.__init__cSsddjdd�|jD��}tdd�|D��rJtd|d�}|jjdj|�n|rZ|jj|�g|_dS)Nr�cSsg|]}|d�qS)rsr))r=r�r)r)r*rA�szGgetPhases.<locals>.InTableTextPhase.flushCharacters.<locals>.<listcomp>cSsg|]}|tk�qSr))r)r=r�r)r)r*rA�srn)r2rsr�)�joinr�r�rrIrFr�r<)r@rsr�r)r)r*�flushCharacters�sz3getPhases.<locals>.InTableTextPhase.flushCharacterscSs|j�|j|j_|S)N)r�r�rIr])r@r�r)r)r*r��s
z2getPhases.<locals>.InTableTextPhase.processCommentcSs|j�|j|j_dS)NT)r�r�rIr])r@r)r)r*r��s
z.getPhases.<locals>.InTableTextPhase.processEOFcSs |ddkrdS|jj|�dS)Nrsr�)r�r�)r@r�r)r)r*r�sz5getPhases.<locals>.InTableTextPhase.processCharacterscSs|jj|�dS)N)r�r�)r@r�r)r)r*r�sz:getPhases.<locals>.InTableTextPhase.processSpaceCharacterscSs|j�|j|j_|S)N)r�r�rIr])r@r�r)r)r*r�
s
z3getPhases.<locals>.InTableTextPhase.processStartTagcSs|j�|j|j_|S)N)r�r�rIr])r@r�r)r)r*r�s
z1getPhases.<locals>.InTableTextPhase.processEndTagN)r7r8r9rHr�r�r�r�r�r�r�r))r�r)r*�InTableTextPhase�s	r�cs`eZdZ�fdd�Zdd�Zdd�Zdd�Zd	d
�Zdd�Zd
d�Z	dd�Z
dd�Zdd�ZdS)z!getPhases.<locals>.InCaptionPhasec
sf�j|||�tjd|jfd
|jfg�|_|j|j_tjd|jfd|j	fd|j
fg�|_|j|j_dS)Nr�r�rzr�r�r�r�r�r�r�r�r�)	r�rzr�r�r�r�r�r�r�)
r�rzr�r�r�r�r�r�r�r�)
rHrrr��startTagTableElementr�rr�
endTagCaptionr�r�r�r)r@rIr<)r�r)r*rHs
z*getPhases.<locals>.InCaptionPhase.__init__cSs|jjddd�S)Nr�r�)r�)r<r�)r@r)r)r*�ignoreEndTagCaption+sz5getPhases.<locals>.InCaptionPhase.ignoreEndTagCaptioncSs|jjdj�dS)Nr�)rIrFr�)r@r)r)r*r�.sz,getPhases.<locals>.InCaptionPhase.processEOFcSs|jjdj|�S)Nr�)rIrFr�)r@r�r)r)r*r�1sz3getPhases.<locals>.InCaptionPhase.processCharacterscSs0|jj�|j�}|jjjtd��|s,|SdS)Nr�)rIr|r�r]r�r
)r@r��ignoreEndTagr)r)r*r�4s

z6getPhases.<locals>.InCaptionPhase.startTagTableElementcSs|jjdj|�S)Nr�)rIrFr�)r@r�r)r)r*r<sz/getPhases.<locals>.InCaptionPhase.startTagOthercSs�|j�s�|jj�|jjdjdkrB|jjdd|jjdjd��x"|jjdjdkrd|jjj�qDW|jjj�|jj�|jj	d|j_
n|jjs�t�|jj�dS)	Nr	r�z$expected-one-end-tag-but-got-another)r�r�r�ryryry)
r�r<r�r{r>rIr|r+r�rFr]rPr�)r@r�r)r)r*r�?s

z/getPhases.<locals>.InCaptionPhase.endTagCaptioncSs0|jj�|j�}|jjjtd��|s,|SdS)Nr�)rIr|r�r]r�r
)r@r�r�r)r)r*r�Qs

z-getPhases.<locals>.InCaptionPhase.endTagTablecSs|jjdd|di�dS)Nzunexpected-end-tagr>)rIr|)r@r�r)r)r*r�Xsz.getPhases.<locals>.InCaptionPhase.endTagIgnorecSs|jjdj|�S)Nr�)rIrFr�)r@r�r)r)r*r[sz-getPhases.<locals>.InCaptionPhase.endTagOtherN)
r7r8r9rHr�r�r�r�rr�r�r�rr))r�r)r*�InCaptionPhasesr�csXeZdZ�fdd�Zdd�Zdd�Zdd�Zd	d
�Zdd�Zd
d�Z	dd�Z
dd�ZdS)z%getPhases.<locals>.InColumnGroupPhasecs^�j|||�tjd|jfd|jfg�|_|j|j_tjd|jfd|j	fg�|_
|j|j
_dS)Nr�rzr�)rHrrr�r�r�rr�endTagColgroup�	endTagColr�r)r@rIr<)r�r)r*rHas
z.getPhases.<locals>.InColumnGroupPhase.__init__cSs|jjdjdkS)Nr	r�ry)r<r{r>)r@r)r)r*�ignoreEndTagColgrouppsz:getPhases.<locals>.InColumnGroupPhase.ignoreEndTagColgroupcSsD|jjdjdkr"|jjst�dS|j�}|jtd��|s@dSdS)Nr	r�r�Try)	r<r{r>rIrPr�r�r�r
)r@r�r)r)r*r�ssz0getPhases.<locals>.InColumnGroupPhase.processEOFcSs"|j�}|jtd��|s|SdS)Nr�)r�r�r
)r@r�r�r)r)r*r�}sz7getPhases.<locals>.InColumnGroupPhase.processCharacterscSs$|jj|�|jjj�d|d<dS)NTrx)r<r�r{r+)r@r�r)r)r*r��sz1getPhases.<locals>.InColumnGroupPhase.startTagColcSs"|j�}|jtd��|s|SdS)Nr�)r�r�r
)r@r�r�r)r)r*r�sz3getPhases.<locals>.InColumnGroupPhase.startTagOthercSs@|j�r |jjst�|jj�n|jjj�|jjd|j_	dS)Nr�)
r�rIrPr�r|r<r{r+rFr])r@r�r)r)r*r��s
z4getPhases.<locals>.InColumnGroupPhase.endTagColgroupcSs|jjdddi�dS)Nz
no-end-tagr>rz)rIr|)r@r�r)r)r*r��sz/getPhases.<locals>.InColumnGroupPhase.endTagColcSs"|j�}|jtd��|s|SdS)Nr�)r�r�r
)r@r�r�r)r)r*r�sz1getPhases.<locals>.InColumnGroupPhase.endTagOtherN)r7r8r9rHr�r�r�r�rr�r�rr))r�r)r*�InColumnGroupPhase^s
	r�csxeZdZ�fdd�Zdd�Zdd�Zdd�Zd	d
�Zdd�Zd
d�Z	dd�Z
dd�Zdd�Zdd�Z
dd�Zdd�ZdS)z#getPhases.<locals>.InTableBodyPhasecsv�j|||�tjd|jfd|jfd
|jfd|jfg�|_|j|j_	tjd|j
fd|jfd|jfg�|_
|j|j
_	dS)Nr�r�r�r�r�rzr�r�r�r�r�r�)r�r�)r�rzr�r�r�r�)r�r�r�)r�r�rzr�r�r�r�r�)rHrrr��
startTagTr�startTagTableCell�startTagTableOtherr�rr�endTagTableRowGroupr�r�r�r)r@rIr<)r�r)r*rH�s
z,getPhases.<locals>.InTableBodyPhase.__init__cSsFx"|jjdjdkr"|jjj�qW|jjdjdkrB|jjsBt�dS)	Nr	r�r�r�r�ry)r�r�r�r�ry)r<r{r>r+rIrPr�)r@r)r)r*�clearStackToTableBodyContext�s
z@getPhases.<locals>.InTableBodyPhase.clearStackToTableBodyContextcSs|jjdj�dS)Nr�)rIrFr�)r@r)r)r*r��sz.getPhases.<locals>.InTableBodyPhase.processEOFcSs|jjdj|�S)Nr�)rIrFr�)r@r�r)r)r*r��sz:getPhases.<locals>.InTableBodyPhase.processSpaceCharacterscSs|jjdj|�S)Nr�)rIrFr�)r@r�r)r)r*r��sz5getPhases.<locals>.InTableBodyPhase.processCharacterscSs(|j�|jj|�|jjd|j_dS)Nr�)r�r<r�rIrFr])r@r�r)r)r*r��sz.getPhases.<locals>.InTableBodyPhase.startTagTrcSs*|jjdd|di�|jtdd��|S)Nzunexpected-cell-in-table-bodyr>r�ro)rIr|r�r
)r@r�r)r)r*r��sz5getPhases.<locals>.InTableBodyPhase.startTagTableCellcSsn|jjddd�s0|jjddd�s0|jjddd�rT|j�|jt|jjdj��|S|jjs`t	�|jj
�dS)Nr�r�)r�r�r�r	ry)r<r�r�r�r
r{r>rIrPr�r|)r@r�r)r)r*r��sz6getPhases.<locals>.InTableBodyPhase.startTagTableOthercSs|jjdj|�S)Nr�)rIrFr�)r@r�r)r)r*r�sz1getPhases.<locals>.InTableBodyPhase.startTagOthercSsT|jj|ddd�r:|j�|jjj�|jjd|j_n|jjdd|di�dS)Nr>r�)r�r�z unexpected-end-tag-in-table-body)	r<r�r�r{r+rIrFr]r|)r@r�r)r)r*r��sz7getPhases.<locals>.InTableBodyPhase.endTagTableRowGroupcSsn|jjddd�s0|jjddd�s0|jjddd�rT|j�|jt|jjdj��|S|jjs`t	�|jj
�dS)Nr�r�)r�r�r�r	ry)r<r�r�r�r
r{r>rIrPr�r|)r@r�r)r)r*r��sz/getPhases.<locals>.InTableBodyPhase.endTagTablecSs|jjdd|di�dS)Nz unexpected-end-tag-in-table-bodyr>)rIr|)r@r�r)r)r*r��sz0getPhases.<locals>.InTableBodyPhase.endTagIgnorecSs|jjdj|�S)Nr�)rIrFr�)r@r�r)r)r*r�sz/getPhases.<locals>.InTableBodyPhase.endTagOtherN)r7r8r9rHr�r�r�r�r�r�r�rr�r�r�rr))r�r)r*�InTableBodyPhase�s
	
r�cs�eZdZ�fdd�Zdd�Zdd�Zdd�Zd	d
�Zdd�Zd
d�Z	dd�Z
dd�Zdd�Zdd�Z
dd�Zdd�Zdd�ZdS)zgetPhases.<locals>.InRowPhasecsv�j|||�tjd|jfd
|jfd|jfg�|_|j|j_tjd
|j	fd|j
fd|jfd|jfg�|_
|j|j
_dS)Nr�r�r�r�rzr�r�r�r�r�r�r�)r�r�)r�rzr�r�r�r�r�)r�r�r�)r�r�rzr�r�r�r�)rHrrr�r�r�r�rr�endTagTrr�r�r�r�r)r@rIr<)r�r)r*rHs
z&getPhases.<locals>.InRowPhase.__init__cSsDx>|jjdjdkr>|jjdd|jjdji�|jjj�qWdS)	Nr	r�r�z'unexpected-implied-end-tag-in-table-rowr>ry)r�r�ry)r<r{r>rIr|r+)r@r)r)r*�clearStackToTableRowContextsz9getPhases.<locals>.InRowPhase.clearStackToTableRowContextcSs|jjddd�S)Nr�r�)r�)r<r�)r@r)r)r*�ignoreEndTagTrsz,getPhases.<locals>.InRowPhase.ignoreEndTagTrcSs|jjdj�dS)Nr�)rIrFr�)r@r)r)r*r�"sz(getPhases.<locals>.InRowPhase.processEOFcSs|jjdj|�S)Nr�)rIrFr�)r@r�r)r)r*r�%sz4getPhases.<locals>.InRowPhase.processSpaceCharacterscSs|jjdj|�S)Nr�)rIrFr�)r@r�r)r)r*r�(sz/getPhases.<locals>.InRowPhase.processCharacterscSs6|j�|jj|�|jjd|j_|jjjt�dS)Nr�)	r�r<r�rIrFr]r�r�r
)r@r�r)r)r*r�+sz/getPhases.<locals>.InRowPhase.startTagTableCellcSs"|j�}|jtd��|s|SdS)Nr�)r�r�r
)r@r�r�r)r)r*r�1sz0getPhases.<locals>.InRowPhase.startTagTableOthercSs|jjdj|�S)Nr�)rIrFr�)r@r�r)r)r*r8sz+getPhases.<locals>.InRowPhase.startTagOthercSsH|j�s.|j�|jjj�|jjd|j_n|jjs:t	�|jj
�dS)Nr�)r�r�r<r{r+rIrFr]rPr�r|)r@r�r)r)r*r�;sz&getPhases.<locals>.InRowPhase.endTagTrcSs"|j�}|jtd��|s|SdS)Nr�)r�r�r
)r@r�r�r)r)r*r�Esz)getPhases.<locals>.InRowPhase.endTagTablecSs4|jj|ddd�r&|jtd��|S|jj�dS)Nr>r�)r�r�)r<r�r�r
rIr|)r@r�r)r)r*r�Msz1getPhases.<locals>.InRowPhase.endTagTableRowGroupcSs|jjdd|di�dS)Nzunexpected-end-tag-in-table-rowr>)rIr|)r@r�r)r)r*r�Tsz*getPhases.<locals>.InRowPhase.endTagIgnorecSs|jjdj|�S)Nr�)rIrFr�)r@r�r)r)r*rXsz)getPhases.<locals>.InRowPhase.endTagOtherN)r7r8r9rHr�r�r�r�r�r�r�rr�r�r�r�rr))r�r)r*�
InRowPhases
r�cs`eZdZ�fdd�Zdd�Zdd�Zdd�Zd	d
�Zdd�Zd
d�Z	dd�Z
dd�Zdd�ZdS)zgetPhases.<locals>.InCellPhasecsf�j|||�tjd|jfd
|jfg�|_|j|j_tjd|jfd|j	fd|j
fg�|_|j|j_dS)Nr�r�rzr�r�r�r�r�r�r�r�r�)	r�rzr�r�r�r�r�r�r�)r�r�)r�r�rzr�r�)r�r�r�r�r�)
rHrrr�r�r�rr�endTagTableCellr��endTagImplyr�r)r@rIr<)r�r)r*rH]s
z'getPhases.<locals>.InCellPhase.__init__cSsB|jjddd�r |jtd��n|jjddd�r>|jtd��dS)Nr�r�)r�r�)r<r�r�r
)r@r)r)r*�	closeCellnsz(getPhases.<locals>.InCellPhase.closeCellcSs|jjdj�dS)Nr�)rIrFr�)r@r)r)r*r�usz)getPhases.<locals>.InCellPhase.processEOFcSs|jjdj|�S)Nr�)rIrFr�)r@r�r)r)r*r�xsz0getPhases.<locals>.InCellPhase.processCharacterscSsF|jjddd�s |jjddd�r,|j�|S|jjs8t�|jj�dS)Nr�r�)r�r�)r<r�r�rIrPr�r|)r@r�r)r)r*r�{sz1getPhases.<locals>.InCellPhase.startTagTableOthercSs|jjdj|�S)Nr�)rIrFr�)r@r�r)r)r*r�sz,getPhases.<locals>.InCellPhase.startTagOthercSs�|jj|ddd�r�|jj|d�|jjdj|dkrt|jjdd|di�x.|jjj�}|j|dkrRPqRWn|jjj�|jj�|jj	d|j_
n|jjdd|di�dS)	Nr>r�)r�r	zunexpected-cell-end-tagr�zunexpected-end-tagry)r<r�r�r{r>rIr|r+r�rFr])r@r�r�r)r)r*r��s
z.getPhases.<locals>.InCellPhase.endTagTableCellcSs|jjdd|di�dS)Nzunexpected-end-tagr>)rIr|)r@r�r)r)r*r��sz+getPhases.<locals>.InCellPhase.endTagIgnorecSs.|jj|ddd�r |j�|S|jj�dS)Nr>r�)r�)r<r�r�rIr|)r@r�r)r)r*r��sz*getPhases.<locals>.InCellPhase.endTagImplycSs|jjdj|�S)Nr�)rIrFr�)r@r�r)r)r*r�sz*getPhases.<locals>.InCellPhase.endTagOtherN)
r7r8r9rHr�r�r�r�rr�r�r�rr))r�r)r*�InCellPhase[s
r�csxeZdZ�fdd�Zdd�Zdd�Zdd�Zd	d
�Zdd�Zd
d�Z	dd�Z
dd�Zdd�Zdd�Z
dd�Zdd�ZdS)z getPhases.<locals>.InSelectPhasecs��j|||�tjd|jfd|jfd|jfd|jfd	|jfd|jfg�|_	|j
|j	_tjd|jfd|j
fd|jfg�|_|j|j_dS)
Nr�rxryr�rprkrtr)rprkrt)rHrrr��startTagOption�startTagOptgroupr�r�r&r�rr�endTagOption�endTagOptgroup�endTagSelectr�r)r@rIr<)r�r)r*rH�s
z)getPhases.<locals>.InSelectPhase.__init__cSs0|jjdjdkr |jjd�n|jjs,t�dS)Nr	r�z
eof-in-selectry)r<r{r>rIr|rPr�)r@r)r)r*r��sz+getPhases.<locals>.InSelectPhase.processEOFcSs$|ddkrdS|jj|d�dS)Nrsr�)r<r�)r@r�r)r)r*r��sz2getPhases.<locals>.InSelectPhase.processCharacterscSs.|jjdjdkr|jjj�|jj|�dS)Nr	rxry)r<r{r>r+r�)r@r�r)r)r*r��sz/getPhases.<locals>.InSelectPhase.startTagOptioncSsL|jjdjdkr|jjj�|jjdjdkr<|jjj�|jj|�dS)Nr	rxryryry)r<r{r>r+r�)r@r�r)r)r*r��s
z1getPhases.<locals>.InSelectPhase.startTagOptgroupcSs|jjd�|jtd��dS)Nzunexpected-select-in-selectr�)rIr|r�r
)r@r�r)r)r*r��sz/getPhases.<locals>.InSelectPhase.startTagSelectcSs>|jjd�|jjddd�r.|jtd��|S|jjs:t�dS)Nzunexpected-input-in-selectr�)r�)rIr|r<r�r�r
rPr�)r@r�r)r)r*r��s
z.getPhases.<locals>.InSelectPhase.startTagInputcSs|jjdj|�S)Nr)rIrFr�)r@r�r)r)r*r&�sz/getPhases.<locals>.InSelectPhase.startTagScriptcSs|jjdd|di�dS)Nzunexpected-start-tag-in-selectr>)rIr|)r@r�r)r)r*r�sz.getPhases.<locals>.InSelectPhase.startTagOthercSs6|jjdjdkr |jjj�n|jjdddi�dS)Nr	rxzunexpected-end-tag-in-selectr>ry)r<r{r>r+rIr|)r@r�r)r)r*r��sz-getPhases.<locals>.InSelectPhase.endTagOptioncSsf|jjdjdkr0|jjdjdkr0|jjj�|jjd	jdkrP|jjj�n|jjdddi�dS)
Nr	rxr�ryzunexpected-end-tag-in-selectr>ry���ry)r<r{r>r+rIr|)r@r�r)r)r*r��sz/getPhases.<locals>.InSelectPhase.endTagOptgroupcSs^|jjddd�rD|jjj�}x|jdkr6|jjj�}qW|jj�n|jjsPt�|jj	�dS)Nr�)r�)
r<r�r{r+r>rIr_rPr�r|)r@r�r�r)r)r*r��sz-getPhases.<locals>.InSelectPhase.endTagSelectcSs|jjdd|di�dS)Nzunexpected-end-tag-in-selectr>)rIr|)r@r�r)r)r*r	sz,getPhases.<locals>.InSelectPhase.endTagOtherN)r7r8r9rHr�r�r�r�r�r�r&rr�r�r�rr))r�r)r*�
InSelectPhase�s
r�csHeZdZ�fdd�Zdd�Zdd�Zdd�Zd	d
�Zdd�Zd
d�Z	dS)z'getPhases.<locals>.InSelectInTablePhasec	sN�j|||�tjd	|jfg�|_|j|j_tjd
|jfg�|_|j	|j_dS)Nr�r�r�r�r�r�r�r�)r�r�r�r�r�r�r�r�)r�r�r�r�r�r�r�r�)
rHrrr�r�rrr�r�r)r@rIr<)r�r)r*rH	s
z0getPhases.<locals>.InSelectInTablePhase.__init__cSs|jjdj�dS)Nr�)rIrFr�)r@r)r)r*r�	sz2getPhases.<locals>.InSelectInTablePhase.processEOFcSs|jjdj|�S)Nr�)rIrFr�)r@r�r)r)r*r�	sz9getPhases.<locals>.InSelectInTablePhase.processCharacterscSs(|jjdd|di�|jtd��|S)Nz5unexpected-table-element-start-tag-in-select-in-tabler>r�)rIr|rr
)r@r�r)r)r*r�!	sz5getPhases.<locals>.InSelectInTablePhase.startTagTablecSs|jjdj|�S)Nr�)rIrFr�)r@r�r)r)r*r&	sz5getPhases.<locals>.InSelectInTablePhase.startTagOthercSs@|jjdd|di�|jj|ddd�r<|jtd��|SdS)Nz3unexpected-table-element-end-tag-in-select-in-tabler>r�)r�r�)rIr|r<r�rr
)r@r�r)r)r*r�)	sz3getPhases.<locals>.InSelectInTablePhase.endTagTablecSs|jjdj|�S)Nr�)rIrFr�)r@r�r)r)r*r/	sz3getPhases.<locals>.InSelectInTablePhase.endTagOtherN)
r7r8r9rHr�r�r�rr�rr))r�r)r*�InSelectInTablePhase	sr�c-s�eZdZeddddddddd	d
ddd
ddddddddddddddddddd d!d"d#d$d%d&d'd(d)d*d+d,g,�Z�fd-d.�Zd/d0�Z�fd1d2�Zd3d4�Zd5d6�Z	d7S)8z(getPhases.<locals>.InForeignContentPhaserWrXr>r�rr?rYrTr+rBrUrZriZh1Zh2Zh3Zh4Zh5Zh6r�rqr\rjrSrQrJr4rcrLr(rPr�r]r^�spanr`r_�subZsupr�rarbrO�varcs�j|||�dS)N)rH)r@rIr<)r�r)r*rH<	sz1getPhases.<locals>.InForeignContentPhase.__init__c%Ssnddddddddd	d
ddd
ddddddddddddddddddd d!d"d#d$d%�$}|d&|krj||d&|d&<dS)'NZaltGlyphZaltGlyphDefZaltGlyphItemZanimateColorZ
animateMotionZanimateTransformZclipPathZfeBlendZ
feColorMatrixZfeComponentTransferZfeCompositeZfeConvolveMatrixZfeDiffuseLightingZfeDisplacementMapZfeDistantLightZfeFloodZfeFuncAZfeFuncBZfeFuncGZfeFuncRZfeGaussianBlurZfeImageZfeMergeZfeMergeNodeZfeMorphologyZfeOffsetZfePointLightZfeSpecularLightingZfeSpotLightZfeTileZfeTurbulenceZ
foreignObjectZglyphRefZlinearGradientZradialGradientZtextPath)$ZaltglyphZaltglyphdefZaltglyphitemZanimatecolorZ
animatemotionZanimatetransformZclippathZfeblendZ
fecolormatrixZfecomponenttransferZfecompositeZfeconvolvematrixZfediffuselightingZfedisplacementmapZfedistantlightZfefloodZfefuncaZfefuncbZfefuncgZfefuncrZfegaussianblurZfeimageZfemergeZfemergenodeZfemorphologyZfeoffsetZfepointlightZfespecularlightingZfespotlightZfetileZfeturbulenceZ
foreignobjectZglyphrefZlineargradientZradialgradientZtextpathr>r))r@r��replacementsr)r)r*�adjustSVGTagNames?	sLz:getPhases.<locals>.InForeignContentPhase.adjustSVGTagNamescsL|ddkrd|d<n&|jjr<tdd�|dD��r<d|j_�j||�dS)Nrsr�u�css|]}|tkVqdS)N)r)r=r�r)r)r*r�l	szMgetPhases.<locals>.InForeignContentPhase.processCharacters.<locals>.<genexpr>F)rIr`r�r�)r@r�)r�r)r*r�h	s
z:getPhases.<locals>.InForeignContentPhase.processCharacterscSs6|jjd}|d|jksD|ddkr�t|dj��tdddg�@r�|jjdd|di�xR|jjdj|jjkr�|jj	|jjd�r�|jj
|jjd�r�|jjj�q\W|S|jtd	kr�|jj
|�n$|jtd
kr�|j|�|jj|�|jj|�|j|d<|jj|�|d�r2|jjj�d
|d<dS)Nr	r>r[rsZcolorZface�sizez*unexpected-html-element-in-foreign-contentrdrurhrwTrxryryryry)r<r{�breakoutElements�set�keysrIr|rhrrlrmr+rrrrrr�)r@r�r�r)r)r*r�p	s.



z8getPhases.<locals>.InForeignContentPhase.processStartTagcSs�t|jj�d}|jjd}|jjt�|dkrF|jjdd|di�x�|jjt�|dkr�|jj|jj	dkr�|jjj
�|jjj|j_x |jjj�|kr�|jjs�t
�q�Wd}P|d8}|jj|}|j|jjkr�qHqH|jjj|�}PqHW|S)Nr	r>zunexpected-end-tagr�ry)r~r<r{r>rjrrIr|r]rFr�r�r+r�rhrr�)r@r�Z	nodeIndexr�r�r)r)r*r��	s(z6getPhases.<locals>.InForeignContentPhase.processEndTagN)
r7r8r9r�rrHrr�r�r�r))r�r)r*�InForeignContentPhase2	s


)rcsPeZdZ�fdd�Zdd�Zdd�Zdd�Zd	d
�Zdd�Zd
d�Z	dd�Z
dS)z!getPhases.<locals>.AfterBodyPhasecsN�j|||�tjd|jfg�|_|j|j_tjd|jfg�|_|j	|j_dS)Nr�)
rHrrr�r�rrr�r�r)r@rIr<)r�r)r*rH�	s
z*getPhases.<locals>.AfterBodyPhase.__init__cSsdS)Nr))r@r)r)r*r��	sz,getPhases.<locals>.AfterBodyPhase.processEOFcSs|jj||jjd�dS)Nr)r<r�r{)r@r�r)r)r*r��	sz0getPhases.<locals>.AfterBodyPhase.processCommentcSs |jjd�|jjd|j_|S)Nzunexpected-char-after-bodyr�)rIr|rFr])r@r�r)r)r*r��	sz3getPhases.<locals>.AfterBodyPhase.processCharacterscSs|jjdj|�S)Nr�)rIrFr�)r@r�r)r)r*r��	sz.getPhases.<locals>.AfterBodyPhase.startTagHtmlcSs*|jjdd|di�|jjd|j_|S)Nzunexpected-start-tag-after-bodyr>r�)rIr|rFr])r@r�r)r)r*r�	sz/getPhases.<locals>.AfterBodyPhase.startTagOthercSs*|jjr|jjd�n|jjd|j_dS)Nz'unexpected-end-tag-after-body-innerhtml�afterAfterBody)rIrPr|rFr])r@r>r)r)r*r��	sz,getPhases.<locals>.AfterBodyPhase.endTagHtmlcSs*|jjdd|di�|jjd|j_|S)Nzunexpected-end-tag-after-bodyr>r�)rIr|rFr])r@r�r)r)r*r�	sz-getPhases.<locals>.AfterBodyPhase.endTagOtherN)r7r8r9rHr�r�r�r�rr�rr))r�r)r*�AfterBodyPhase�	sr
csXeZdZ�fdd�Zdd�Zdd�Zdd�Zd	d
�Zdd�Zd
d�Z	dd�Z
dd�ZdS)z"getPhases.<locals>.InFramesetPhasecsf�j|||�tjd|jfd|jfd|jfd|jfg�|_|j|j_	tjd|j
fg�|_|j|j_	dS)Nr�r�r{r)
rHrrr�r7�
startTagFrame�startTagNoframesr�rr�endTagFramesetr�r)r@rIr<)r�r)r*rH�	s
z+getPhases.<locals>.InFramesetPhase.__init__cSs0|jjdjdkr |jjd�n|jjs,t�dS)Nr	r�zeof-in-framesetry)r<r{r>rIr|rPr�)r@r)r)r*r��	sz-getPhases.<locals>.InFramesetPhase.processEOFcSs|jjd�dS)Nzunexpected-char-in-frameset)rIr|)r@r�r)r)r*r��	sz4getPhases.<locals>.InFramesetPhase.processCharacterscSs|jj|�dS)N)r<r�)r@r�r)r)r*r7�	sz3getPhases.<locals>.InFramesetPhase.startTagFramesetcSs|jj|�|jjj�dS)N)r<r�r{r+)r@r�r)r)r*r�	sz0getPhases.<locals>.InFramesetPhase.startTagFramecSs|jjdj|�S)Nr�)rIrFr�)r@r�r)r)r*r�	sz3getPhases.<locals>.InFramesetPhase.startTagNoframescSs|jjdd|di�dS)Nz unexpected-start-tag-in-framesetr>)rIr|)r@r�r)r)r*r�	sz0getPhases.<locals>.InFramesetPhase.startTagOthercSs\|jjdjdkr |jjd�n|jjj�|jjrX|jjdjdkrX|jjd|j_dS)Nr	r�z)unexpected-frameset-in-frameset-innerhtmlr��
afterFramesetryry)	r<r{r>rIr|r+rPrFr])r@r�r)r)r*r
�	s
z1getPhases.<locals>.InFramesetPhase.endTagFramesetcSs|jjdd|di�dS)Nzunexpected-end-tag-in-framesetr>)rIr|)r@r�r)r)r*r	
sz.getPhases.<locals>.InFramesetPhase.endTagOtherN)r7r8r9rHr�r�r7rrrr
rr))r�r)r*�InFramesetPhase�	srcsHeZdZ�fdd�Zdd�Zdd�Zdd�Zd	d
�Zdd�Zd
d�Z	dS)z%getPhases.<locals>.AfterFramesetPhasecsV�j|||�tjd|jfd|jfg�|_|j|j_tjd|jfg�|_	|j
|j	_dS)Nr�r)rHrrr�rr�rrr�r�r)r@rIr<)r�r)r*rH
s
z.getPhases.<locals>.AfterFramesetPhase.__init__cSsdS)Nr))r@r)r)r*r�
sz0getPhases.<locals>.AfterFramesetPhase.processEOFcSs|jjd�dS)Nzunexpected-char-after-frameset)rIr|)r@r�r)r)r*r�!
sz7getPhases.<locals>.AfterFramesetPhase.processCharacterscSs|jjdj|�S)Nr)rIrFr�)r@r�r)r)r*r$
sz6getPhases.<locals>.AfterFramesetPhase.startTagNoframescSs|jjdd|di�dS)Nz#unexpected-start-tag-after-framesetr>)rIr|)r@r�r)r)r*r'
sz3getPhases.<locals>.AfterFramesetPhase.startTagOthercSs|jjd|j_dS)N�afterAfterFrameset)rIrFr])r@r�r)r)r*r�+
sz0getPhases.<locals>.AfterFramesetPhase.endTagHtmlcSs|jjdd|di�dS)Nz!unexpected-end-tag-after-framesetr>)rIr|)r@r�r)r)r*r.
sz1getPhases.<locals>.AfterFramesetPhase.endTagOtherN)
r7r8r9rHr�r�rrr�rr))r�r)r*�AfterFramesetPhase
srcsPeZdZ�fdd�Zdd�Zdd�Zdd�Zd	d
�Zdd�Zd
d�Z	dd�Z
dS)z&getPhases.<locals>.AfterAfterBodyPhasecs0�j|||�tjd|jfg�|_|j|j_dS)Nr�)rHrrr�r�rr)r@rIr<)r�r)r*rH3
sz/getPhases.<locals>.AfterAfterBodyPhase.__init__cSsdS)Nr))r@r)r)r*r�;
sz1getPhases.<locals>.AfterAfterBodyPhase.processEOFcSs|jj||jj�dS)N)r<r�r�)r@r�r)r)r*r�>
sz5getPhases.<locals>.AfterAfterBodyPhase.processCommentcSs|jjdj|�S)Nr�)rIrFr�)r@r�r)r)r*r�A
sz=getPhases.<locals>.AfterAfterBodyPhase.processSpaceCharacterscSs |jjd�|jjd|j_|S)Nzexpected-eof-but-got-charr�)rIr|rFr])r@r�r)r)r*r�D
sz8getPhases.<locals>.AfterAfterBodyPhase.processCharacterscSs|jjdj|�S)Nr�)rIrFr�)r@r�r)r)r*r�I
sz3getPhases.<locals>.AfterAfterBodyPhase.startTagHtmlcSs*|jjdd|di�|jjd|j_|S)Nzexpected-eof-but-got-start-tagr>r�)rIr|rFr])r@r�r)r)r*rL
sz4getPhases.<locals>.AfterAfterBodyPhase.startTagOthercSs*|jjdd|di�|jjd|j_|S)Nzexpected-eof-but-got-end-tagr>r�)rIr|rFr])r@r�r)r)r*r�R
sz4getPhases.<locals>.AfterAfterBodyPhase.processEndTagN)r7r8r9rHr�r�r�r�r�rr�r))r�r)r*�AfterAfterBodyPhase2
srcsXeZdZ�fdd�Zdd�Zdd�Zdd�Zd	d
�Zdd�Zd
d�Z	dd�Z
dd�ZdS)z*getPhases.<locals>.AfterAfterFramesetPhasecs8�j|||�tjd|jfd|jfg�|_|j|j_dS)Nr�r)rHrrr��startTagNoFramesr�rr)r@rIr<)r�r)r*rHY
s
z3getPhases.<locals>.AfterAfterFramesetPhase.__init__cSsdS)Nr))r@r)r)r*r�b
sz5getPhases.<locals>.AfterAfterFramesetPhase.processEOFcSs|jj||jj�dS)N)r<r�r�)r@r�r)r)r*r�e
sz9getPhases.<locals>.AfterAfterFramesetPhase.processCommentcSs|jjdj|�S)Nr�)rIrFr�)r@r�r)r)r*r�h
szAgetPhases.<locals>.AfterAfterFramesetPhase.processSpaceCharacterscSs|jjd�dS)Nzexpected-eof-but-got-char)rIr|)r@r�r)r)r*r�k
sz<getPhases.<locals>.AfterAfterFramesetPhase.processCharacterscSs|jjdj|�S)Nr�)rIrFr�)r@r�r)r)r*r�n
sz7getPhases.<locals>.AfterAfterFramesetPhase.startTagHtmlcSs|jjdj|�S)Nr)rIrFr�)r@r�r)r)r*rq
sz;getPhases.<locals>.AfterAfterFramesetPhase.startTagNoFramescSs|jjdd|di�dS)Nzexpected-eof-but-got-start-tagr>)rIr|)r@r�r)r)r*rt
sz8getPhases.<locals>.AfterAfterFramesetPhase.startTagOthercSs|jjdd|di�dS)Nzexpected-eof-but-got-end-tagr>)rIr|)r@r�r)r)r*r�x
sz8getPhases.<locals>.AfterAfterFramesetPhase.processEndTagN)r7r8r9rHr�r�r�r�r�rrr�r))r�r)r*�AfterAfterFramesetPhaseX
s	r)rTrSr�rr/r0r�r�r�r�r�r�r�r�r�r�r�rvr�r�rr	r)r)rGrVr�rrrr1r5r:r�r�r�r�r�r�r�r�r�r�r�rr
rrrrr))r�r*rE_sp)#.g@CX!-GBbYLd's/9%&&rEcs^ts
tjr t|d�t��@}nt|d�t��@}|rZt�fdd�|dj�D��|d<dS)Nrsc3s"|]\}}�j||�|fVqdS)N)r})r=�k�v)rr)r*r��
sz$adjust_attributes.<locals>.<genexpr>)rrZPY27rr�rr.)r�rZneeds_adjustmentr))rr*r��
s
r�rpFcCs|dkri}t||||d�S)N)r2r>rsrw)r)r>r2rirwr)r)r*r
�
s
r
c@seZdZdZdS)rrzError in parsed documentN)r7r8r9r�r)r)r)r*rr�
srr)rT)r+rT)rpNF)1Z
__future__rrrZpip._vendor.sixrrrr0�collectionsr�ImportErrorZpip._vendor.ordereddictr�r
rrZtreebuilders.baser
rZ	constantsrrrrrrrrrrrrr�rrrrr#r-r;rgr"ZmemoizerEr�r
�	Exceptionrrr)r)r)r*�<module>sRH

)L

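Going by the recovered parse() and parseFragment() docstrings, direct use of HTMLParser with a chosen tree builder would look roughly like this sketch (illustrative only; the markup and the choice of the "dom" builder are assumptions, not taken from the bytecode):

# Sketch of the HTMLParser API described by the recovered docstrings;
# illustrative only, not extracted from the compiled module above.
import html5lib
from html5lib import treebuilders

parser = html5lib.HTMLParser(tree=treebuilders.getTreeBuilder("dom"))
document = parser.parse("<p>Hello <i>world", scripting=True)    # lenient, browser-style error recovery
fragment = parser.parseFragment("<li>one<li>two", container="ul")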
site-packages/pip/_vendor/html5lib/__pycache__/serializer.cpython-36.opt-1.pyc000064400000021775147511334600023216 0ustar00

[binary data: compiled CPython 3.6 bytecode (.pyc) for pip._vendor.html5lib.serializer (source path /usr/lib/python3.6/serializer.py). The readable fragments identify an htmlentityreplace_errors codec error handler, the module-level serialize(input, tree="etree", encoding=None, **serializer_opts) helper, and the HTMLSerializer class, whose __init__ docstring documents the keyword options inject_meta_charset, quote_attr_values ("legacy"|"spec"|"always"), quote_char, use_best_quote_char, escape_lt_in_attrs, escape_rcdata, resolve_entities, strip_whitespace, minimize_boolean_attributes, use_trailing_solidus, space_before_trailing_solidus, sanitize, omit_optional_tags and alphabetical_attributes, and points to the html5lib user documentation at http://code.google.com/p/html5lib/wiki/UserDocumentation. The capture is cut off partway through this module; any garbled lines remaining at the end of the dump are the rest of that bytecode.]
|f7}|d7}|j|�Vq�|d5k�r�|dk�sz|�r�|�r�|dj
d�d
k�r�|jd�|j|d�Vn|jt|d��Vq�|d6k�r�|d}	|jd|	�V|	tk�r|j�rd}n|�r|jd��x�|dj�D�]�\\}
}}|}
|}|jd�V|j|
�V|j�s�|
tj|	t��k�r"|
tjdt��k�r"|jd�V|jdk�s�t|�d
k�r�d}n@|jd k�r�tj|�dk	}n$|jd!k�r�tj|�dk	}ntd"��|jd#d$�}|j �r|jd%d&�}|�r�|j!}|j"�rTd|k�r<d|k�r<d}nd|k�rTd|k�rTd}|dk�rl|jdd'�}n|jdd(�}|j|�V|j|�V|j|�Vn|j|�V�q"W|	t#k�r�|j$�r�|j%�r�|jd)�Vn|jd*�V|jd�Vq�|d+k�r6|d}	|	tk�rd}n|�r$|jd�|jd,|	�Vq�|d-k�rx|d}|j
d.�d
k�rb|jd/�|jd0|d�Vq�|d1k�r�|d}	|	d2}|t&k�r�|jd3|	�|j'�r�|t(k�r�t&|}nd4|	}|j|�Vq�|j|d�q�WdS)7NFr)�Filter�typeZDoctypez<!DOCTYPE %s�nameZpublicIdz PUBLIC "%s"ZsystemIdz SYSTEMr;r�'zASystem identifer contains both single and double quote charactersz %s%s%s�>�
Characters�SpaceCharacters�dataz</zUnexpected </ in CDATA�StartTag�EmptyTagz<%sTz+Unexpected child element of a CDATA element� r�=�always�specr:z?quote_attr_values must be one of: 'always', 'spec', or 'legacy'rz&amp;�<z&lt;z&#39;z&quot;z /�/ZEndTagz</%s>�Commentz--zComment contains --z	<!--%s-->ZEntityrzEntity %s not recognizedz&%s;)r`ra)rcrd))r7rRrGZfilters.inject_meta_charsetr[rFZfilters.alphabeticalattributesrHZfilters.whitespacerIZfilters.sanitizerr?Zfilters.optionaltags�find�serializeErrorrZrXrrrD�itemsr@r
r$�tupler<rL�_quoteAttributeSpec�search�_quoteAttributeLegacy�
ValueError�replacerCr=r>r	rArBr
rEr)rT�
treewalkerr7Zin_cdatar[�tokenr\Zdoctyper=r]�_Z	attr_nameZ
attr_value�k�vZ
quote_attrrb�keyr/r/r0r9�s�


















zHTMLSerializer.serializecCs2|rdjt|j||���Sdjt|j|���SdS)N�r)r'�listr9)rTrur7r/r/r0r5?szHTMLSerializer.render�XXX ERROR MESSAGE NEEDEDcCs|jj|�|jrt�dS)N)rRr"rS�SerializeError)rTrbr/r/r0rmEszHTMLSerializer.serializeError)r<r=r>r?r@rArBrCrDrErFrGrHrI)N)N)r})�__name__�
__module__�__qualname__r<r=r>r?r@rArBrCrDrErFrGrHrIrKrWrXrZr9r5rmr/r/r/r0r4Qs68


r4c@seZdZdZdS)r~zError in serialized treeN)rr�r��__doc__r/r/r/r0r~Lsr~)r3N)+Z
__future__rrrZpip._vendor.sixr�re�codecsrrZ	constantsr	r
rrr
rrrrZxml.sax.saxutilsrr'Z_quoteAttributeSpecChars�compilerprrr#rLZ_is_ucs4r|rnrxryr r!�islowerr1r9rr4�	Exceptionr~r/r/r/r0�<module>s:
	
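A minimal usage sketch for the options documented above; it assumes only the public html5lib API (parseFragment, treewalkers.getTreeWalker, serializer.HTMLSerializer), and the sample markup and printed output are illustrative rather than taken from this archive.

# Hedged sketch: re-serialize a parsed fragment with a few HTMLSerializer options.
import html5lib
from html5lib import serializer, treewalkers

fragment = html5lib.parseFragment('<p CLASS=intro>Tea &amp; cake<br>')
walker = treewalkers.getTreeWalker("etree")      # matches the default etree tree builder
ser = serializer.HTMLSerializer(quote_attr_values="always",
                                omit_optional_tags=False,
                                alphabetical_attributes=True)
print(ser.render(walker(fragment)))              # e.g. <p class="intro">Tea &amp; cake<br></p>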

site-packages/pip/_vendor/html5lib/__pycache__/_ihatexml.cpython-36.opt-1.pyc000064400000032626147511334600023014 0ustar003
[Compiled CPython 3.6 bytecode for html5lib's _ihatexml module; the binary stream is not reproducible as text. Its readable strings are the XML 1.0 character-class productions (BaseChar, Ideographic, CombiningChar, Digit, Extender) used to build name-validity regexes, and the InfosetFilter helper that coerces non-XML element and attribute names, comments, pubids and text, escaping offending characters with a U%05X scheme and emitting DataLossWarning.]


site-packages/pip/_vendor/html5lib/__pycache__/_ihatexml.cpython-36.pyc000064400000032721147511334610022052 0ustar003
[Compiled CPython 3.6 bytecode for the same _ihatexml module, this time from the non-optimized build, which additionally retains the module's assert statements; otherwise it embeds the same character-class tables and InfosetFilter code as the .opt-1 variant above.]

	


site-packages/pip/_vendor/html5lib/__pycache__/_utils.cpython-36.pyc000064400000006332147511334610021376 0ustar003

���e�@s
ddlmZmZmZddlZddlmZddlmZyddl	j
[Compiled CPython 3.6 bytecode for html5lib's _utils module; binary contents are not reproducible as text. Its readable strings name the module's helpers: default_etree, MethodDispatcher (a dict whose list/tuple keys fan out to multiple entries and which falls back to a .default attribute), isSurrogatePair / surrogatePairToCodepoint, moduleFactoryFactory, memoize, and the supports_lone_surrogates feature check.]
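A small, hedged illustration of the MethodDispatcher behaviour described in its docstring; the import path follows the file listing above, and the keys and values are made up for the example.

# Hedged sketch of MethodDispatcher: tuple keys expand, unknown keys fall back to .default.
from pip._vendor.html5lib._utils import MethodDispatcher

md = MethodDispatcher([(("foo", "bar"), "baz"), ("single", 1)])
assert md["foo"] == "baz" and md["bar"] == "baz"   # list-like keys fan out to each item
assert md["single"] == 1
md.default = "fallback"
print(md["missing"])                               # -> "fallback" (via dict.get with .default)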
rr:r!r!r!r"�<module>s0

#	site-packages/pip/_vendor/html5lib/__pycache__/html5parser.cpython-36.opt-1.pyc000064400000275615147511334610023320 0ustar003

���e���@sFddlmZmZmZddlmZmZmZddlZyddl	m
Z
Wn ek
r`ddlm
Z
YnXddl
mZddl
mZddl
mZdd	lmZdd
l
mZddlmZmZmZmZmZmZmZmZmZmZmZm Z!m"Z"m#Z#m$Z$m%Z%d!dd�Z&d"dd�Z'dd�Z(Gdd�de)�Z*ej+dd��Z,dd�Z-d#dd�Z.Gdd �d e/�Z0dS)$�)�absolute_import�division�unicode_literals)�with_metaclass�viewkeys�PY3N)�OrderedDict�)�_inputstream)�
_tokenizer)�treebuilders)�Marker)�_utils)�spaceCharacters�asciiUpper2Lower�specialElements�headingElements�
cdataElements�rcdataElements�
tokenTypes�
tagTokenTypes�
namespaces�htmlIntegrationPointElements�"mathmlTextIntegrationPointElements�adjustForeignAttributes�adjustMathMLAttributes�adjustSVGAttributes�E�ReparseException�etreeTcKs$tj|�}t||d�}|j|f|�S)z.Parse a string or file-like object into a tree)�namespaceHTMLElements)r�getTreeBuilder�
HTMLParser�parse)�doc�treebuilderr �kwargs�tb�p�r)�!/usr/lib/python3.6/html5parser.pyr#s
r#�divcKs,tj|�}t||d�}|j|fd|i|��S)N)r �	container)rr!r"�
parseFragment)r$r,r%r r&r'r(r)r)r*r-&s
r-csG�fdd�dt�}|S)NcseZdZ�fdd�ZdS)z-method_decorator_metaclass.<locals>.DecoratedcsBx0|j�D]$\}}t|tj�r&�|�}|||<q
Wtj||||�S)N)�items�
isinstance�types�FunctionType�type�__new__)�metaZ	classname�basesZ	classDictZ
attributeNameZ	attribute)�functionr)r*r3.s
z5method_decorator_metaclass.<locals>.Decorated.__new__N)�__name__�
__module__�__qualname__r3r))r6r)r*�	Decorated-sr:)r2)r6r:r))r6r*�method_decorator_metaclass,sr;c@s�eZdZdZd+dd�Zd,dd	�Zd
d�Zedd
��Zdd�Z	dd�Z
dd�Zdd�Zdd�Z
dd�Zd-dd�Zdd�Zdd �Zd!d"�Zd#d$�Zd%d&�Zd'd(�Zd)d*�ZdS).r"zZHTML parser. Generates a tree structure from a stream of (possibly
        malformed) HTMLNFTcsL|�_|dkrtjd�}||��_g�_t�fdd�t|�j�D���_dS)a
        strict - raise an exception when a parse error is encountered

        tree - a treebuilder class controlling the type of tree that will be
        returned. Built in treebuilders can be accessed through
        html5lib.treebuilders.getTreeBuilder(treeType)
        Nrcs g|]\}}||��j�f�qSr))�tree)�.0�name�cls)�selfr)r*�
<listcomp>Msz'HTMLParser.__init__.<locals>.<listcomp>)	�strictrr!r<�errors�dict�	getPhasesr.�phases)r@r<rBr �debugr))r@r*�__init__<s


zHTMLParser.__init__r+cKsh||_||_||_tj|fd|i|��|_|j�y|j�Wn$tk
rb|j�|j�YnXdS)N�parser)	�
innerHTMLModer,�	scriptingrZ
HTMLTokenizer�	tokenizer�reset�mainLoopr)r@�stream�	innerHTMLr,rKr&r)r)r*�_parsePszHTMLParser._parsecCs�|jj�d|_g|_g|_d|_|jr�|jj�|_	|j	t
krL|jj|j_
n0|j	tkrd|jj|j_
n|j	dkr||jj|j_
n|jd|_|jj�|j�nd|_	|jd|_d|_d|_d|_dS)NFz	no quirks�	plaintext�
beforeHtml�initialT)r<rM�
firstStartTagrC�log�
compatModerJr,�lowerrPrrL�rcdataState�stater�rawtextState�plaintextStaterF�phase�insertHtmlElement�resetInsertionModeZ	lastPhaseZbeforeRCDataPhase�
framesetOK)r@r)r)r*rM^s*





zHTMLParser.resetcCst|d�sdS|jjjdjS)z�The name of the character encoding
        that was used to decode the input stream,
        or :obj:`None` if that is not determined yet.

        rLNr)�hasattrrLrO�charEncodingr>)r@r)r)r*�documentEncoding�s
zHTMLParser.documentEncodingcCsJ|jdkr6|jtdkr6d|jko4|jdjt�dkS|j|jftkSdS)Nzannotation-xml�mathml�encoding�	text/html�application/xhtml+xml)rfrg)r>�	namespacer�
attributes�	translaterr)r@�elementr)r)r*�isHTMLIntegrationPoint�s


z!HTMLParser.isHTMLIntegrationPointcCs|j|jftkS)N)rhr>r)r@rkr)r)r*�isMathMLTextIntegrationPoint�sz'HTMLParser.isMathMLTextIntegrationPointcCsjtd}td}td}td}td}td}td}�x�|j�D�]�}d}	|}
�x�|
dk	�r|
}	|jjrx|jjdnd}|r�|jnd}|r�|jnd}
|
d	}||kr�|j|
d
|
jdi��d}
qVt|jj�dk�sl||jj	k�sl|j
|��r ||k�r|d
tddg�k�sl|||fk�sl|tdk�rP|
dk�rP||k�rP|d
dk�sl|j
|��rt||||fk�rt|j}n
|jd}||k�r�|j|
�}
qV||k�r�|j|
�}
qV||k�r�|j|
�}
qV||k�r�|j|
�}
qV||k�r�|j|
�}
qV||krV|j|
�}
qVW||krD|	drD|	drD|jdd
|	d
i�qDWd}g}x(|�rd|j|j�|jj�}|�r>�q>WdS)N�
CharactersZSpaceCharacters�StartTag�EndTag�CommentZDoctype�
ParseErrorr	r2�data�datavarsrr>ZmglyphZ
malignmarkrdzannotation-xml�svg�inForeignContent�selfClosing�selfClosingAcknowledgedz&non-void-element-with-trailing-solidusT���)r�normalizedTokensr<�openElementsrhr>�
parseError�get�len�defaultNamespacerm�	frozensetrrlr]rF�processCharacters�processSpaceCharacters�processStartTag�
processEndTag�processComment�processDoctype�append�
processEOF)r@ZCharactersTokenZSpaceCharactersTokenZ
StartTagTokenZEndTagTokenZCommentTokenZDoctypeTokenZParseErrorToken�tokenZ
prev_token�	new_token�currentNodeZcurrentNodeNamespaceZcurrentNodeNamer2r]Z	reprocessrFr)r)r*rN�sp










zHTMLParser.mainLoopccs x|jD]}|j|�VqWdS)N)rL�normalizeToken)r@r�r)r)r*rz�szHTMLParser.normalizedTokenscOs |j|ddf|�|�|jj�S)a�Parse a HTML document into a well-formed tree

        stream - a filelike object or string containing the HTML to be parsed

        The optional encoding parameter must be a string that indicates
        the encoding.  If specified, that encoding will be used,
        regardless of any BOM or later declaration (such as in a meta
        element)

        scripting - treat noscript elements as if javascript was turned on
        FN)rQr<ZgetDocument)r@rO�argsr&r)r)r*r#�szHTMLParser.parsecOs|j|df|�|�|jj�S)a2Parse a HTML fragment into a well-formed tree fragment

        container - name of the element we're setting the innerHTML property
        if set to None, default to 'div'

        stream - a filelike object or string containing the HTML to be parsed

        The optional encoding parameter must be a string that indicates
        the encoding.  If specified, that encoding will be used,
        regardless of any BOM or later declaration (such as in a meta
        element)

        scripting - treat noscript elements as if javascript was turned on
        T)rQr<ZgetFragment)r@rOr�r&r)r)r*r-�szHTMLParser.parseFragment�XXX-undefined-errorcCs@|dkri}|jj|jjj�||f�|jr<tt||��dS)N)rCr�rLrOZpositionrBrrr)r@�	errorcodertr)r)r*r|s
zHTMLParser.parseErrorcCsT|dtdkrP|d}t|�|d<t|�t|d�krP|dj|ddd��|S)z3 HTML5 specific normalizations to the token stream r2rorsNr	ry)rrr~�update)r@r��rawr)r)r*r�szHTMLParser.normalizeTokencCst|t�dS)N)�adjust_attributesr)r@r�r)r)r*rsz!HTMLParser.adjustMathMLAttributescCst|t�dS)N)r�r)r@r�r)r)r*rszHTMLParser.adjustSVGAttributescCst|t�dS)N)r��adjustForeignAttributesMap)r@r�r)r)r*rsz"HTMLParser.adjustForeignAttributescCs|jj�dS)N)rIr])r@r�r)r)r*�reparseTokenNormalszHTMLParser.reparseTokenNormalcCs�d}ddddddddddd	d	d
dd�}x�|jjddd�D]p}|j}d}||jjdkrbd}|j}|dkrj|r�|j|jjkr�q:||kr�|j||}Pq:|r:|jd	}Pq:W||_dS)NF�inSelect�inCell�inRow�inTableBody�	inCaption�
inColumnGroup�inTable�inBody�
inFrameset�
beforeHead)�select�td�th�tr�tbody�thead�tfoot�caption�colgroup�table�head�body�frameset�htmlr	rTr�r�r�r�ry)r�r�r�r�)r<r{r>rPrhrrFr])r@ZlastZnewModes�nodeZnodeNameZ	new_phaser)r)r*r_!s>
zHTMLParser.resetInsertionModecCsF|jj|�|dkr"|jj|j_n|jj|j_|j|_|jd|_dS)zYGeneric RCDATA/RAWTEXT Parsing algorithm
        contentType - RCDATA or RAWTEXT
        �RAWTEXT�textN)	r<�
insertElementrLr[rZrYr]�
originalPhaserF)r@r�ZcontentTyper)r)r*�parseRCDataRawtextMszHTMLParser.parseRCDataRawtext)NFTF)Fr+F)r�N)r7r8r9�__doc__rHrQrM�propertyrcrlrmrNrzr#r-r|r�rrrr�r_r�r)r)r)r*r"8s&

"
C
,r"cs"dd�}dd�}Gdd�dt|||����Gdd�d��}Gd	d
�d
��}G�fdd�d��}G�fd
d�d��}G�fdd�d��}G�fdd�d��}G�fdd�d��}	G�fdd�d��}
G�fdd�d��}G�fdd�d��}G�fdd�d��}
G�fdd�d��}G�fdd �d ��}G�fd!d"�d"��}G�fd#d$�d$��}G�fd%d&�d&��}G�fd'd(�d(��}G�fd)d*�d*��}G�fd+d,�d,��}G�fd-d.�d.��}G�fd/d0�d0��}G�fd1d2�d2��}G�fd3d4�d4��}|||||||	|
|||
||||||||||||d5�S)6Ncs(tdd�tj�D�����fdd�}|S)z4Logger that records which phase processes each tokencss|]\}}||fVqdS)Nr))r=�key�valuer)r)r*�	<genexpr>csz)getPhases.<locals>.log.<locals>.<genexpr>cs��jjd�r�t|�dkr�|d}yd�|di}Wn�YnX|dtkr\|d|d<|jjj|jjjj|jj	j
j|j
j�j|f��|f|�|�S�|f|�|�SdS)NZprocessrr2r>)r7�
startswithr~rrIrVr�rLrZr]�	__class__)r@r�r&r��info)r6�
type_namesr)r*�wrappedfs
z'getPhases.<locals>.log.<locals>.wrapped)rDrr.)r6r�r))r6r�r*rVaszgetPhases.<locals>.logcSs|rt|�StSdS)N)r;r2)Z
use_metaclassZmetaclass_funcr)r)r*�getMetaclasszszgetPhases.<locals>.getMetaclassc@sXeZdZdZdd�Zdd�Zdd�Zdd	�Zd
d�Zdd
�Z	dd�Z
dd�Zdd�ZdS)zgetPhases.<locals>.PhasezNBase class for helper object that implements each phase of processing
        cSs||_||_dS)N)rIr<)r@rIr<r)r)r*rH�sz!getPhases.<locals>.Phase.__init__cSst�dS)N)�NotImplementedError)r@r)r)r*r��sz#getPhases.<locals>.Phase.processEOFcSs|jj||jjd�dS)Nr	ry)r<�
insertCommentr{)r@r�r)r)r*r��sz'getPhases.<locals>.Phase.processCommentcSs|jjd�dS)Nzunexpected-doctype)rIr|)r@r�r)r)r*r��sz'getPhases.<locals>.Phase.processDoctypecSs|jj|d�dS)Nrs)r<�
insertText)r@r�r)r)r*r��sz*getPhases.<locals>.Phase.processCharacterscSs|jj|d�dS)Nrs)r<r�)r@r�r)r)r*r��sz/getPhases.<locals>.Phase.processSpaceCharacterscSs|j|d|�S)Nr>)�startTagHandler)r@r�r)r)r*r��sz(getPhases.<locals>.Phase.processStartTagcSsl|jjr"|ddkr"|jjd�x<|dj�D],\}}||jjdjkr0||jjdj|<q0Wd|j_dS)Nr>r�z
non-html-rootrsrF)rIrUr|r.r<r{ri)r@r��attrr�r)r)r*�startTagHtml�sz%getPhases.<locals>.Phase.startTagHtmlcSs|j|d|�S)Nr>)�
endTagHandler)r@r�r)r)r*r��sz&getPhases.<locals>.Phase.processEndTagN)
r7r8r9r�rHr�r�r�r�r�r�r�r�r)r)r)r*�Phase�s
r�c@sLeZdZdd�Zdd�Zdd�Zdd�Zd	d
�Zdd�Zd
d�Z	dd�Z
dS)zgetPhases.<locals>.InitialPhasecSsdS)Nr))r@r�r)r)r*r��sz6getPhases.<locals>.InitialPhase.processSpaceCharacterscSs|jj||jj�dS)N)r<r��document)r@r�r)r)r*r��sz.getPhases.<locals>.InitialPhase.processCommentc8Ss|d}|d}|d}|d}|dks@|dk	s@|dk	rL|dkrL|jjd�|dkrXd}|jj|�|dkrv|jt�}|�s�|ddk�s�|jdJ��s�|dKk�s�|jdL��r�|dk�s�|�r�|j�dDk�r�dE|j_n*|jdM��s�|jdN��r|dk	�rdH|j_|jj	dI|j_
dS)ONr>�publicId�systemId�correctr�zabout:legacy-compatzunknown-doctype��*+//silmaril//dtd html pro v0r11 19970101//�4-//advasoft ltd//dtd html 3.0 aswedit + extensions//�*-//as//dtd html 3.0 aswedit + extensions//�-//ietf//dtd html 2.0 level 1//�-//ietf//dtd html 2.0 level 2//�&-//ietf//dtd html 2.0 strict level 1//�&-//ietf//dtd html 2.0 strict level 2//�-//ietf//dtd html 2.0 strict//�-//ietf//dtd html 2.0//�-//ietf//dtd html 2.1e//�-//ietf//dtd html 3.0//�-//ietf//dtd html 3.2 final//�-//ietf//dtd html 3.2//�-//ietf//dtd html 3//�-//ietf//dtd html level 0//�-//ietf//dtd html level 1//�-//ietf//dtd html level 2//�-//ietf//dtd html level 3//�"-//ietf//dtd html strict level 0//�"-//ietf//dtd html strict level 1//�"-//ietf//dtd html strict level 2//�"-//ietf//dtd html strict level 3//�-//ietf//dtd html strict//�-//ietf//dtd html//�(-//metrius//dtd metrius presentational//�5-//microsoft//dtd internet explorer 2.0 html strict//�.-//microsoft//dtd internet explorer 2.0 html//�0-//microsoft//dtd internet explorer 2.0 tables//�5-//microsoft//dtd internet explorer 3.0 html strict//�.-//microsoft//dtd internet explorer 3.0 html//�0-//microsoft//dtd internet explorer 3.0 tables//�#-//netscape comm. corp.//dtd html//�*-//netscape comm. corp.//dtd strict html//�*-//o'reilly and associates//dtd html 2.0//�3-//o'reilly and associates//dtd html extended 1.0//�;-//o'reilly and associates//dtd html extended relaxed 1.0//�N-//softquad software//dtd hotmetal pro 6.0::19990601::extensions to html 4.0//�E-//softquad//dtd hotmetal pro 4.0::19971010::extensions to html 4.0//�$-//spyglass//dtd html 2.0 extended//�+-//sq//dtd html 2.0 hotmetal + extensions//�--//sun microsystems corp.//dtd hotjava html//�4-//sun microsystems corp.//dtd hotjava strict html//�-//w3c//dtd html 3 1995-03-24//�-//w3c//dtd html 3.2 draft//�-//w3c//dtd html 3.2 final//�-//w3c//dtd html 3.2//�-//w3c//dtd html 3.2s draft//�-//w3c//dtd html 4.0 frameset//�#-//w3c//dtd html 4.0 transitional//�(-//w3c//dtd html experimental 19960712//�&-//w3c//dtd html experimental 970421//�-//w3c//dtd w3 html//�-//w3o//dtd w3 html 3.0//�#-//webtechs//dtd mozilla html 2.0//�-//webtechs//dtd mozilla html//�$-//w3o//dtd w3 html strict 3.0//en//�"-/w3c/dtd html 4.0 transitional/en� -//w3c//dtd html 4.01 frameset//�$-//w3c//dtd html 4.01 transitional//z:http://www.ibm.com/data/dtd/v11/ibmxhtml1-transitional.dtd�quirks� -//w3c//dtd xhtml 1.0 frameset//�$-//w3c//dtd xhtml 1.0 transitional//zlimited quirksrS)7r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rr)rrr�)rr)rr)rr)rIr|r<Z
insertDoctyperjrr�rXrWrFr])r@r�r>r�r�r�r)r)r*r��s�



z.getPhases.<locals>.InitialPhase.processDoctypecSsd|j_|jjd|j_dS)NrrS)rIrWrFr])r@r)r)r*�anythingElsesz,getPhases.<locals>.InitialPhase.anythingElsecSs|jjd�|j�|S)Nzexpected-doctype-but-got-chars)rIr|r	)r@r�r)r)r*r�sz1getPhases.<locals>.InitialPhase.processCharacterscSs"|jjdd|di�|j�|S)Nz"expected-doctype-but-got-start-tagr>)rIr|r	)r@r�r)r)r*r�sz/getPhases.<locals>.InitialPhase.processStartTagcSs"|jjdd|di�|j�|S)Nz expected-doctype-but-got-end-tagr>)rIr|r	)r@r�r)r)r*r�sz-getPhases.<locals>.InitialPhase.processEndTagcSs|jjd�|j�dS)Nzexpected-doctype-but-got-eofT)rIr|r	)r@r)r)r*r�%sz*getPhases.<locals>.InitialPhase.processEOFN)r7r8r9r�r�r�r	r�r�r�r�r)r)r)r*�InitialPhase�s_r
c@sDeZdZdd�Zdd�Zdd�Zdd�Zd	d
�Zdd�Zd
d�Z	dS)z"getPhases.<locals>.BeforeHtmlPhasecSs&|jjtdd��|jjd|j_dS)Nr�ror�)r<Z
insertRoot�impliedTagTokenrIrFr])r@r)r)r*r^,sz4getPhases.<locals>.BeforeHtmlPhase.insertHtmlElementcSs|j�dS)NT)r^)r@r)r)r*r�1sz-getPhases.<locals>.BeforeHtmlPhase.processEOFcSs|jj||jj�dS)N)r<r�r�)r@r�r)r)r*r�5sz1getPhases.<locals>.BeforeHtmlPhase.processCommentcSsdS)Nr))r@r�r)r)r*r�8sz9getPhases.<locals>.BeforeHtmlPhase.processSpaceCharacterscSs|j�|S)N)r^)r@r�r)r)r*r�;sz4getPhases.<locals>.BeforeHtmlPhase.processCharacterscSs |ddkrd|j_|j�|S)Nr>r�T)rIrUr^)r@r�r)r)r*r�?sz2getPhases.<locals>.BeforeHtmlPhase.processStartTagcSs4|ddkr$|jjdd|di�n|j�|SdS)Nr>r�r�r��brzunexpected-end-tag-before-html)r�r�r�r)rIr|r^)r@r�r)r)r*r�Es
z0getPhases.<locals>.BeforeHtmlPhase.processEndTagN)
r7r8r9r^r�r�r�r�r�r�r)r)r)r*�BeforeHtmlPhase*sr
csXeZdZ�fdd�Zdd�Zdd�Zdd�Zd	d
�Zdd�Zd
d�Z	dd�Z
dd�ZdS)z"getPhases.<locals>.BeforeHeadPhasecsV�j|||�tjd|jfd|jfg�|_|j|j_tjd|jfg�|_	|j
|j	_dS)Nr�r�r�r)r�r�r�r)rHr�MethodDispatcherr��startTagHeadr��
startTagOther�default�endTagImplyHeadr��endTagOther)r@rIr<)r�r)r*rHNs
z+getPhases.<locals>.BeforeHeadPhase.__init__cSs|jtdd��dS)Nr�roT)rr)r@r)r)r*r�\sz-getPhases.<locals>.BeforeHeadPhase.processEOFcSsdS)Nr))r@r�r)r)r*r�`sz9getPhases.<locals>.BeforeHeadPhase.processSpaceCharacterscSs|jtdd��|S)Nr�ro)rr)r@r�r)r)r*r�csz4getPhases.<locals>.BeforeHeadPhase.processCharacterscSs|jjdj|�S)Nr�)rIrFr�)r@r�r)r)r*r�gsz/getPhases.<locals>.BeforeHeadPhase.startTagHtmlcSs0|jj|�|jjd|j_|jjd|j_dS)Nr	�inHeadry)r<r�r{�headPointerrIrFr])r@r�r)r)r*rjsz/getPhases.<locals>.BeforeHeadPhase.startTagHeadcSs|jtdd��|S)Nr�ro)rr)r@r�r)r)r*rosz0getPhases.<locals>.BeforeHeadPhase.startTagOthercSs|jtdd��|S)Nr�ro)rr)r@r�r)r)r*rssz2getPhases.<locals>.BeforeHeadPhase.endTagImplyHeadcSs|jjdd|di�dS)Nzend-tag-after-implied-rootr>)rIr|)r@r�r)r)r*rwsz.getPhases.<locals>.BeforeHeadPhase.endTagOtherN)r7r8r9rHr�r�r�r�rrrrr))r�r)r*�BeforeHeadPhaseMsrcs�eZdZ�fdd�Zdd�Zdd�Zdd�Zd	d
�Zdd�Zd
d�Z	dd�Z
dd�Zdd�Zdd�Z
dd�Zdd�Zdd�Zdd�Zdd �Zd!S)"zgetPhases.<locals>.InHeadPhasecs��j|||�tjd|jfd|jfd|jfd|jfd|jfd|jfd|j	fd
|j
fg�|_|j|j_
tjd
|jfd|jfg�|_|j|j_
dS)Nr��title�noframes�style�noscript�script�base�basefont�bgsound�command�linkr4r�rr�)rr)rrrrr )rr�r�)rHrrr��
startTagTitle�startTagNoFramesStyle�startTagNoscript�startTagScript�startTagBaseLinkCommand�startTagMetarr�rr�
endTagHead�endTagHtmlBodyBrr�r)r@rIr<)r�r)r*rH|s 
z'getPhases.<locals>.InHeadPhase.__init__cSs|j�dS)NT)r	)r@r)r)r*r��sz)getPhases.<locals>.InHeadPhase.processEOFcSs|j�|S)N)r	)r@r�r)r)r*r��sz0getPhases.<locals>.InHeadPhase.processCharacterscSs|jjdj|�S)Nr�)rIrFr�)r@r�r)r)r*r��sz+getPhases.<locals>.InHeadPhase.startTagHtmlcSs|jjd�dS)Nz!two-heads-are-not-better-than-one)rIr|)r@r�r)r)r*r�sz+getPhases.<locals>.InHeadPhase.startTagHeadcSs$|jj|�|jjj�d|d<dS)NTrx)r<r�r{�pop)r@r�r)r)r*r%�sz6getPhases.<locals>.InHeadPhase.startTagBaseLinkCommandcSs�|jj|�|jjj�d|d<|d}|jjjjddkr�d|krZ|jjjj|d�nVd|kr�d|kr�|dj	�d	kr�t
j|djd
��}t
j
|�}|j�}|jjjj|�dS)NTrxrsr	Z	tentative�charsetZcontentz
http-equivzcontent-typezutf-8)r<r�r{r)rIrLrOrbZchangeEncodingrXr
Z
EncodingBytes�encodeZContentAttrParserr#)r@r�rirsrI�codecr)r)r*r&�s
z+getPhases.<locals>.InHeadPhase.startTagMetacSs|jj|d�dS)NZRCDATA)rIr�)r@r�r)r)r*r!�sz,getPhases.<locals>.InHeadPhase.startTagTitlecSs|jj|d�dS)Nr�)rIr�)r@r�r)r)r*r"�sz4getPhases.<locals>.InHeadPhase.startTagNoFramesStylecSs8|jjr|jj|d�n|jj|�|jjd|j_dS)Nr��inHeadNoscript)rIrKr�r<r�rFr])r@r�r)r)r*r#�sz/getPhases.<locals>.InHeadPhase.startTagNoscriptcSs<|jj|�|jjj|jj_|jj|j_|jjd|j_dS)Nr�)	r<r�rIrLZscriptDataStaterZr]r�rF)r@r�r)r)r*r$�sz-getPhases.<locals>.InHeadPhase.startTagScriptcSs|j�|S)N)r	)r@r�r)r)r*r�sz,getPhases.<locals>.InHeadPhase.startTagOthercSs"|jjjj�}|jjd|j_dS)N�	afterHead)rIr<r{r)rFr])r@r�r�r)r)r*r'�sz)getPhases.<locals>.InHeadPhase.endTagHeadcSs|j�|S)N)r	)r@r�r)r)r*r(�sz/getPhases.<locals>.InHeadPhase.endTagHtmlBodyBrcSs|jjdd|di�dS)Nzunexpected-end-tagr>)rIr|)r@r�r)r)r*r�sz*getPhases.<locals>.InHeadPhase.endTagOthercSs|jtd��dS)Nr�)r'r)r@r)r)r*r	�sz+getPhases.<locals>.InHeadPhase.anythingElseN)r7r8r9rHr�r�r�rr%r&r!r"r#r$rr'r(rr	r))r�r)r*�InHeadPhase{s r/csxeZdZ�fdd�Zdd�Zdd�Zdd�Zd	d
�Zdd�Zd
d�Z	dd�Z
dd�Zdd�Zdd�Z
dd�Zdd�ZdS)z&getPhases.<locals>.InHeadNoscriptPhasecsf�j|||�tjd|jfd|jfd|jfg�|_|j|j_tjd	|j	fd
|j
fg�|_|j|j_dS)
Nr�rrr r4rrr�rr)rrr r4rr)r�r)
rHrrr�r%�startTagHeadNoscriptr�rr�endTagNoscript�endTagBrr�r)r@rIr<)r�r)r*rH�s
z/getPhases.<locals>.InHeadNoscriptPhase.__init__cSs|jjd�|j�dS)Nzeof-in-head-noscriptT)rIr|r	)r@r)r)r*r��sz1getPhases.<locals>.InHeadNoscriptPhase.processEOFcSs|jjdj|�S)Nr)rIrFr�)r@r�r)r)r*r��sz5getPhases.<locals>.InHeadNoscriptPhase.processCommentcSs|jjd�|j�|S)Nzchar-in-head-noscript)rIr|r	)r@r�r)r)r*r��sz8getPhases.<locals>.InHeadNoscriptPhase.processCharacterscSs|jjdj|�S)Nr)rIrFr�)r@r�r)r)r*r�sz=getPhases.<locals>.InHeadNoscriptPhase.processSpaceCharacterscSs|jjdj|�S)Nr�)rIrFr�)r@r�r)r)r*r�sz3getPhases.<locals>.InHeadNoscriptPhase.startTagHtmlcSs|jjdj|�S)Nr)rIrFr�)r@r�r)r)r*r%sz>getPhases.<locals>.InHeadNoscriptPhase.startTagBaseLinkCommandcSs|jjdd|di�dS)Nzunexpected-start-tagr>)rIr|)r@r�r)r)r*r0	sz;getPhases.<locals>.InHeadNoscriptPhase.startTagHeadNoscriptcSs"|jjdd|di�|j�|S)Nzunexpected-inhead-noscript-tagr>)rIr|r	)r@r�r)r)r*rsz4getPhases.<locals>.InHeadNoscriptPhase.startTagOthercSs"|jjjj�}|jjd|j_dS)Nr)rIr<r{r)rFr])r@r�r�r)r)r*r1sz5getPhases.<locals>.InHeadNoscriptPhase.endTagNoscriptcSs"|jjdd|di�|j�|S)Nzunexpected-inhead-noscript-tagr>)rIr|r	)r@r�r)r)r*r2sz/getPhases.<locals>.InHeadNoscriptPhase.endTagBrcSs|jjdd|di�dS)Nzunexpected-end-tagr>)rIr|)r@r�r)r)r*rsz2getPhases.<locals>.InHeadNoscriptPhase.endTagOthercSs|jtd��dS)Nr)r1r)r@r)r)r*r	sz3getPhases.<locals>.InHeadNoscriptPhase.anythingElseN)r7r8r9rHr�r�r�r�r�r%r0rr1r2rr	r))r�r)r*�InHeadNoscriptPhase�sr3cspeZdZ�fdd�Zdd�Zdd�Zdd�Zd	d
�Zdd�Zd
d�Z	dd�Z
dd�Zdd�Zdd�Z
dd�ZdS)z!getPhases.<locals>.AfterHeadPhasec
sn�j|||�tjd|jfd|jfd|jfd|jfd
|jfg�|_|j	|j_
tjd|jfg�|_|j
|j_
dS)Nr�r�r�rrrr r4rrrrr�r)	rrrr r4rrrr)r�r�r)rHrrr��startTagBody�startTagFrameset�startTagFromHeadrr�rrr(r�r)r@rIr<)r�r)r*rH#s
z*getPhases.<locals>.AfterHeadPhase.__init__cSs|j�dS)NT)r	)r@r)r)r*r�4sz,getPhases.<locals>.AfterHeadPhase.processEOFcSs|j�|S)N)r	)r@r�r)r)r*r�8sz3getPhases.<locals>.AfterHeadPhase.processCharacterscSs|jjdj|�S)Nr�)rIrFr�)r@r�r)r)r*r�<sz.getPhases.<locals>.AfterHeadPhase.startTagHtmlcSs(d|j_|jj|�|jjd|j_dS)NFr�)rIr`r<r�rFr])r@r�r)r)r*r4?sz.getPhases.<locals>.AfterHeadPhase.startTagBodycSs |jj|�|jjd|j_dS)Nr�)r<r�rIrFr])r@r�r)r)r*r5Dsz2getPhases.<locals>.AfterHeadPhase.startTagFramesetcSst|jjdd|di�|jjj|jj�|jjdj|�x4|jjddd�D]}|jdkrN|jjj	|�PqNWdS)Nz#unexpected-start-tag-out-of-my-headr>rr	r�ry)
rIr|r<r{r�rrFr�r>�remove)r@r�r�r)r)r*r6Hs
z2getPhases.<locals>.AfterHeadPhase.startTagFromHeadcSs|jjdd|di�dS)Nzunexpected-start-tagr>)rIr|)r@r�r)r)r*rRsz.getPhases.<locals>.AfterHeadPhase.startTagHeadcSs|j�|S)N)r	)r@r�r)r)r*rUsz/getPhases.<locals>.AfterHeadPhase.startTagOthercSs|j�|S)N)r	)r@r�r)r)r*r(Ysz2getPhases.<locals>.AfterHeadPhase.endTagHtmlBodyBrcSs|jjdd|di�dS)Nzunexpected-end-tagr>)rIr|)r@r�r)r)r*r]sz-getPhases.<locals>.AfterHeadPhase.endTagOthercSs.|jjtdd��|jjd|j_d|j_dS)Nr�ror�T)r<r�rrIrFr]r`)r@r)r)r*r	`sz.getPhases.<locals>.AfterHeadPhase.anythingElseN)r7r8r9rHr�r�r�r4r5r6rrr(rr	r))r�r)r*�AfterHeadPhase"s
r8cs�eZdZ�fdd�Zdd�Zdd�Zdd�Zd	d
�Zdd�Zd
d�Z	dd�Z
dd�Zdd�Zdd�Z
dd�Zdd�Zdd�Zdd�Zdd �Zd!d"�Zd#d$�Zd%d&�Zd'd(�Zd)d*�Zd+d,�Zd-d.�Zd/d0�Zd1d2�Zd3d4�Zd5d6�Zd7d8�Zd9d:�Zd;d<�Z d=d>�Z!d?d@�Z"dAdB�Z#dCdD�Z$dEdF�Z%dGdH�Z&dIdJ�Z'dKdL�Z(dMdN�Z)dOdP�Z*dQdR�Z+dSdT�Z,dUdV�Z-dWdX�Z.dYdZ�Z/d[d\�Z0d]d^�Z1d_d`�Z2dadb�Z3dcdd�Z4dedf�Z5dgS)hzgetPhases.<locals>.InBodyPhasec,s��j|||�|j|_tjd|jfdd|jfd|jfd|jfde|j	ft
|jfdf|jfd&|j
fdg|jfd*|jfd+|jfdh|jfd8|jfd9|jfdi|jfd=|jfd>|jfdj|jfdk|jfdH|jfdI|jfdJ|jfdK|jfdL|jfdM|jfdN|jfdl|j fdQ|j!fdm|j"fdn|j#fdV|j$fdW|j%fdo|j&fg!�|_'|j(|j'_)tjd|j*fd|j+fdp|j,fd&|j-fd |j.fdq|j/ft
|j0fdr|j1fds|j2fd@|j3fg
�|_4|j5|j4_)dS)tNr�rrrrr r4rrrr�r��address�article�aside�
blockquote�center�details�dirr+�dl�fieldset�
figcaption�figure�footer�header�hgroup�main�menu�nav�olr(�section�summary�ul�pre�listing�form�li�dd�dtrR�a�b�big�code�em�font�i�s�small�strike�strong�tt�u�nobr�button�applet�marquee�objectZxmpr��arear�embed�img�keygen�wbr�param�source�track�input�hr�image�isindex�textareaZiframer�noembedrr��rp�rt�option�optgroupZmathrur��colr��framer�r�r�r�r�r�r��dialog)	rrrrr r4rrr)r9r:r;r<r=r>r?r+r@rArBrCrDrErFrGrHrIrJr(rKrLrM)rNrO)rQrRrS)rUrVrWrXrYrZr[r\r]r^r_r`)rcrdre)rfrrgrhrirj)rkrlrm)rsr)rtru)rvrw)r�rxr�ryr�r�r�r�r�r�r�)r9r:r;r<rbr=r>rzr?r+r@rArBrCrDrErFrOrGrHrIrJrNrKrLrM)rRrSrQ)rTrUrVrWrXrYrZrar[r\r]r^r_r`)rcrdre)6rH�processSpaceCharactersNonPrer�rrr��startTagProcessInHeadr4r5�startTagClosePr�startTagHeading�startTagPreListing�startTagForm�startTagListItem�startTagPlaintext�	startTagA�startTagFormatting�startTagNobr�startTagButton�startTagAppletMarqueeObject�startTagXmp�
startTagTable�startTagVoidFormatting�startTagParamSource�
startTagInput�
startTagHr�
startTagImage�startTagIsIndex�startTagTextarea�startTagIFramer#�startTagRawtext�startTagSelect�startTagRpRt�startTagOpt�startTagMath�startTagSvg�startTagMisplacedr�rr�
endTagBody�
endTagHtml�endTagBlock�
endTagForm�endTagP�endTagListItem�
endTagHeading�endTagFormatting�endTagAppletMarqueeObjectr2r�r)r@rIr<)r�r)r*rHhs~
z'getPhases.<locals>.InBodyPhase.__init__cSs$|j|jko"|j|jko"|j|jkS)N)r>rhri)r@Znode1Znode2r)r)r*�isMatchingFormattingElement�sz:getPhases.<locals>.InBodyPhase.isMatchingFormattingElementcSs�|jj|�|jjd}g}x<|jjddd�D]&}|tkr@Pq0|j||�r0|j|�q0Wt|�dkrx|jjj|d�|jjj|�dS)Nr	�ryryry)	r<r�r{�activeFormattingElementsr
r�r�r~r7)r@r�rkZmatchingElementsr�r)r)r*�addFormattingElement�sz3getPhases.<locals>.InBodyPhase.addFormattingElementc
Ss@td�}x2|jjddd�D]}|j|kr|jjd�PqWdS)NrRrSrQr(r�r�r�r�r�r�r�r�r	z expected-closing-tag-but-got-eof)rRrSrQr(r�r�r�r�r�r�r�r�ry)r�r<r{r>rIr|)r@Zallowed_elementsr�r)r)r*r��s
z)getPhases.<locals>.InBodyPhase.processEOFcSsh|d}|j|_|jd�rJ|jjdjdkrJ|jjd	j�rJ|dd�}|rd|jj�|jj|�dS)
Nrs�
r	rNrOrrry)rNrOrrry)	r{r�r�r<r{r>Z
hasContent�#reconstructActiveFormattingElementsr�)r@r�rsr)r)r*�!processSpaceCharactersDropNewline�s

z@getPhases.<locals>.InBodyPhase.processSpaceCharactersDropNewlinecSsT|ddkrdS|jj�|jj|d�|jjrPtdd�|dD��rPd|j_dS)Nrs�cSsg|]}|tk�qSr))r)r=�charr)r)r*rA�szDgetPhases.<locals>.InBodyPhase.processCharacters.<locals>.<listcomp>F)r<r�r�rIr`�any)r@r�r)r)r*r��s
z0getPhases.<locals>.InBodyPhase.processCharacterscSs|jj�|jj|d�dS)Nrs)r<r�r�)r@r�r)r)r*r{�s
z;getPhases.<locals>.InBodyPhase.processSpaceCharactersNonPrecSs|jjdj|�S)Nr)rIrFr�)r@r�r)r)r*r|�sz4getPhases.<locals>.InBodyPhase.startTagProcessInHeadcSs�|jjdddi�t|jj�dks||jjdjdkr6nFd|j_x<|dj�D],\}}||jjdjkrL||jjdj|<qLWdS)Nzunexpected-start-tagr>r�r	Frs)	rIr|r~r<r{r>r`r.ri)r@r�r�r�r)r)r*r4�sz+getPhases.<locals>.InBodyPhase.startTagBodycSs�|jjdddi�t|jj�dks�|jjdjdkr6nt|jjs@nj|jjdjrj|jjdjj|jjd�x"|jjdjdkr�|jjj	�qlW|jj
|�|jjd|j_dS)	Nzunexpected-start-tagr>r�r	r�r�r�ry)
rIr|r~r<r{r>r`�parent�removeChildr)r�rFr])r@r�r)r)r*r5�s"z/getPhases.<locals>.InBodyPhase.startTagFramesetcSs.|jjddd�r|jtd��|jj|�dS)Nr(rb)�variant)r<�elementInScoper�rr�)r@r�r)r)r*r}	sz-getPhases.<locals>.InBodyPhase.startTagClosePcSs>|jjddd�r|jtd��|jj|�d|j_|j|_dS)Nr(rb)r�F)	r<r�r�rr�rIr`r�r�)r@r�r)r)r*rs
z1getPhases.<locals>.InBodyPhase.startTagPreListingcSsZ|jjr|jjdddi�n:|jjddd�r:|jtd��|jj|�|jjd|j_dS)	Nzunexpected-start-tagr>rPr(rb)r�r	ry)	r<�formPointerrIr|r�r�rr�r{)r@r�r)r)r*r�sz+getPhases.<locals>.InBodyPhase.startTagFormcSs�d|j_dgddgddgd�}||d}xLt|jj�D]<}|j|kr^|jjjt|jd��P|j	t
kr8|jd
kr8Pq8W|jjd
dd�r�|jjjtd
d��|jj|�dS)NFrQrSrR)rQrSrRr>rpr9r+r(rb)r�)r9r+r()
rIr`�reversedr<r{r>r]r�r�	nameTuplerr�r�)r@r�ZstopNamesMapZ	stopNamesr�r)r)r*r�s"


z/getPhases.<locals>.InBodyPhase.startTagListItemcSs>|jjddd�r|jtd��|jj|�|jjj|jj_dS)Nr(rb)r�)	r<r�r�rr�rIrLr\rZ)r@r�r)r)r*r�4sz0getPhases.<locals>.InBodyPhase.startTagPlaintextcSsb|jjddd�r|jtd��|jjdjtkrR|jjdd|di�|jjj	�|jj
|�dS)Nr(rb)r�r	zunexpected-start-tagr>ry)r<r�r�rr{r>rrIr|r)r�)r@r�r)r)r*r~:sz.getPhases.<locals>.InBodyPhase.startTagHeadingcSs~|jjd�}|rf|jjdddd��|jtd��||jjkrL|jjj|�||jjkrf|jjj|�|jj	�|j
|�dS)NrTz$unexpected-start-tag-implies-end-tag)�	startName�endName)r<�!elementInActiveFormattingElementsrIr|r�rr{r7r�r�r�)r@r�ZafeAElementr)r)r*r�Bs
z(getPhases.<locals>.InBodyPhase.startTagAcSs|jj�|j|�dS)N)r<r�r�)r@r�r)r)r*r�Os
z1getPhases.<locals>.InBodyPhase.startTagFormattingcSsP|jj�|jjd�rB|jjdddd��|jtd��|jj�|j|�dS)Nraz$unexpected-start-tag-implies-end-tag)r�r�)r<r�r�rIr|r�rr�)r@r�r)r)r*r�Ss

z+getPhases.<locals>.InBodyPhase.startTagNobrcSsT|jjd�r2|jjdddd��|jtd��|S|jj�|jj|�d|j_dS)Nrbz$unexpected-start-tag-implies-end-tag)r�r�F)	r<r�rIr|r�rr�r�r`)r@r�r)r)r*r�]s
z-getPhases.<locals>.InBodyPhase.startTagButtoncSs0|jj�|jj|�|jjjt�d|j_dS)NF)r<r�r�r�r�r
rIr`)r@r�r)r)r*r�hs
z:getPhases.<locals>.InBodyPhase.startTagAppletMarqueeObjectcSsB|jjddd�r|jtd��|jj�d|j_|jj|d�dS)Nr(rb)r�Fr�)r<r�r�rr�rIr`r�)r@r�r)r)r*r�ns

z*getPhases.<locals>.InBodyPhase.startTagXmpcSsR|jjdkr*|jjddd�r*|jtd��|jj|�d|j_|jjd|j_	dS)Nrr(rb)r�Fr�)
rIrWr<r�r�rr�r`rFr])r@r�r)r)r*r�usz,getPhases.<locals>.InBodyPhase.startTagTablecSs6|jj�|jj|�|jjj�d|d<d|j_dS)NTrxF)r<r�r�r{r)rIr`)r@r�r)r)r*r�}s

z5getPhases.<locals>.InBodyPhase.startTagVoidFormattingcSs@|jj}|j|�d|dkr<|ddjt�dkr<||j_dS)Nr2rs�hidden)rIr`r�rjr)r@r�r`r)r)r*r��s

[Compiled CPython 3.6 bytecode continues here; the binary payload is omitted.
The only legible fragments are embedded symbol names (getPhases.<locals>.InBodyPhase,
InTablePhase, InCellPhase, InSelectPhase, InForeignContentPhase, ...) and parse-error
strings, apparently belonging to html5lib's html5parser module.]
site-packages/pip/_vendor/html5lib/__pycache__/_utils.cpython-36.opt-1.pyc
[Compiled CPython 3.6 bytecode; binary payload omitted. Legible fragments include the
module's exported names (default_etree, MethodDispatcher, isSurrogatePair,
surrogatePairToCodepoint, moduleFactoryFactory, supports_lone_surrogates, PY27) and the
MethodDispatcher docstring ("Dict with 2 special properties ...").]

site-packages/pip/_vendor/html5lib/__pycache__/__init__.cpython-36.opt-1.pyc
[Compiled CPython 3.6 bytecode; binary payload omitted. Legible fragments are the
package docstring and the version string "1.0b10":

    HTML parsing library based on the WHATWG "HTML5"
    specification. The parser is designed to be compatible with existing
    HTML found in the wild and implements well-defined error recovery that
    is largely compatible with modern desktop web browsers.

    Example usage:

    import html5lib
    f = open("my_document.html")
    tree = html5lib.parse(f)

__all__ exposes HTMLParser, parse, parseFragment, getTreeBuilder, getTreeWalker and
serialize.]

site-packages/pip/_vendor/html5lib/constants.py
from __future__ import absolute_import, division, unicode_literals

import string

EOF = None

E = {
    "null-character":
        "Null character in input stream, replaced with U+FFFD.",
    "invalid-codepoint":
        "Invalid codepoint in stream.",
    "incorrectly-placed-solidus":
        "Solidus (/) incorrectly placed in tag.",
    "incorrect-cr-newline-entity":
        "Incorrect CR newline entity, replaced with LF.",
    "illegal-windows-1252-entity":
        "Entity used with illegal number (windows-1252 reference).",
    "cant-convert-numeric-entity":
        "Numeric entity couldn't be converted to character "
        "(codepoint U+%(charAsInt)08x).",
    "illegal-codepoint-for-numeric-entity":
        "Numeric entity represents an illegal codepoint: "
        "U+%(charAsInt)08x.",
    "numeric-entity-without-semicolon":
        "Numeric entity didn't end with ';'.",
    "expected-numeric-entity-but-got-eof":
        "Numeric entity expected. Got end of file instead.",
    "expected-numeric-entity":
        "Numeric entity expected but none found.",
    "named-entity-without-semicolon":
        "Named entity didn't end with ';'.",
    "expected-named-entity":
        "Named entity expected. Got none.",
    "attributes-in-end-tag":
        "End tag contains unexpected attributes.",
    'self-closing-flag-on-end-tag':
        "End tag contains unexpected self-closing flag.",
    "expected-tag-name-but-got-right-bracket":
        "Expected tag name. Got '>' instead.",
    "expected-tag-name-but-got-question-mark":
        "Expected tag name. Got '?' instead. (HTML doesn't "
        "support processing instructions.)",
    "expected-tag-name":
        "Expected tag name. Got something else instead",
    "expected-closing-tag-but-got-right-bracket":
        "Expected closing tag. Got '>' instead. Ignoring '</>'.",
    "expected-closing-tag-but-got-eof":
        "Expected closing tag. Unexpected end of file.",
    "expected-closing-tag-but-got-char":
        "Expected closing tag. Unexpected character '%(data)s' found.",
    "eof-in-tag-name":
        "Unexpected end of file in the tag name.",
    "expected-attribute-name-but-got-eof":
        "Unexpected end of file. Expected attribute name instead.",
    "eof-in-attribute-name":
        "Unexpected end of file in attribute name.",
    "invalid-character-in-attribute-name":
        "Invalid character in attribute name",
    "duplicate-attribute":
        "Dropped duplicate attribute on tag.",
    "expected-end-of-tag-name-but-got-eof":
        "Unexpected end of file. Expected = or end of tag.",
    "expected-attribute-value-but-got-eof":
        "Unexpected end of file. Expected attribute value.",
    "expected-attribute-value-but-got-right-bracket":
        "Expected attribute value. Got '>' instead.",
    'equals-in-unquoted-attribute-value':
        "Unexpected = in unquoted attribute",
    'unexpected-character-in-unquoted-attribute-value':
        "Unexpected character in unquoted attribute",
    "invalid-character-after-attribute-name":
        "Unexpected character after attribute name.",
    "unexpected-character-after-attribute-value":
        "Unexpected character after attribute value.",
    "eof-in-attribute-value-double-quote":
        "Unexpected end of file in attribute value (\").",
    "eof-in-attribute-value-single-quote":
        "Unexpected end of file in attribute value (').",
    "eof-in-attribute-value-no-quotes":
        "Unexpected end of file in attribute value.",
    "unexpected-EOF-after-solidus-in-tag":
        "Unexpected end of file in tag. Expected >",
    "unexpected-character-after-solidus-in-tag":
        "Unexpected character after / in tag. Expected >",
    "expected-dashes-or-doctype":
        "Expected '--' or 'DOCTYPE'. Not found.",
    "unexpected-bang-after-double-dash-in-comment":
        "Unexpected ! after -- in comment",
    "unexpected-space-after-double-dash-in-comment":
        "Unexpected space after -- in comment",
    "incorrect-comment":
        "Incorrect comment.",
    "eof-in-comment":
        "Unexpected end of file in comment.",
    "eof-in-comment-end-dash":
        "Unexpected end of file in comment (-)",
    "unexpected-dash-after-double-dash-in-comment":
        "Unexpected '-' after '--' found in comment.",
    "eof-in-comment-double-dash":
        "Unexpected end of file in comment (--).",
    "eof-in-comment-end-space-state":
        "Unexpected end of file in comment.",
    "eof-in-comment-end-bang-state":
        "Unexpected end of file in comment.",
    "unexpected-char-in-comment":
        "Unexpected character in comment found.",
    "need-space-after-doctype":
        "No space after literal string 'DOCTYPE'.",
    "expected-doctype-name-but-got-right-bracket":
        "Unexpected > character. Expected DOCTYPE name.",
    "expected-doctype-name-but-got-eof":
        "Unexpected end of file. Expected DOCTYPE name.",
    "eof-in-doctype-name":
        "Unexpected end of file in DOCTYPE name.",
    "eof-in-doctype":
        "Unexpected end of file in DOCTYPE.",
    "expected-space-or-right-bracket-in-doctype":
        "Expected space or '>'. Got '%(data)s'",
    "unexpected-end-of-doctype":
        "Unexpected end of DOCTYPE.",
    "unexpected-char-in-doctype":
        "Unexpected character in DOCTYPE.",
    "eof-in-innerhtml":
        "XXX innerHTML EOF",
    "unexpected-doctype":
        "Unexpected DOCTYPE. Ignored.",
    "non-html-root":
        "html needs to be the first start tag.",
    "expected-doctype-but-got-eof":
        "Unexpected End of file. Expected DOCTYPE.",
    "unknown-doctype":
        "Erroneous DOCTYPE.",
    "expected-doctype-but-got-chars":
        "Unexpected non-space characters. Expected DOCTYPE.",
    "expected-doctype-but-got-start-tag":
        "Unexpected start tag (%(name)s). Expected DOCTYPE.",
    "expected-doctype-but-got-end-tag":
        "Unexpected end tag (%(name)s). Expected DOCTYPE.",
    "end-tag-after-implied-root":
        "Unexpected end tag (%(name)s) after the (implied) root element.",
    "expected-named-closing-tag-but-got-eof":
        "Unexpected end of file. Expected end tag (%(name)s).",
    "two-heads-are-not-better-than-one":
        "Unexpected start tag head in existing head. Ignored.",
    "unexpected-end-tag":
        "Unexpected end tag (%(name)s). Ignored.",
    "unexpected-start-tag-out-of-my-head":
        "Unexpected start tag (%(name)s) that can be in head. Moved.",
    "unexpected-start-tag":
        "Unexpected start tag (%(name)s).",
    "missing-end-tag":
        "Missing end tag (%(name)s).",
    "missing-end-tags":
        "Missing end tags (%(name)s).",
    "unexpected-start-tag-implies-end-tag":
        "Unexpected start tag (%(startName)s) "
        "implies end tag (%(endName)s).",
    "unexpected-start-tag-treated-as":
        "Unexpected start tag (%(originalName)s). Treated as %(newName)s.",
    "deprecated-tag":
        "Unexpected start tag %(name)s. Don't use it!",
    "unexpected-start-tag-ignored":
        "Unexpected start tag %(name)s. Ignored.",
    "expected-one-end-tag-but-got-another":
        "Unexpected end tag (%(gotName)s). "
        "Missing end tag (%(expectedName)s).",
    "end-tag-too-early":
        "End tag (%(name)s) seen too early. Expected other end tag.",
    "end-tag-too-early-named":
        "Unexpected end tag (%(gotName)s). Expected end tag (%(expectedName)s).",
    "end-tag-too-early-ignored":
        "End tag (%(name)s) seen too early. Ignored.",
    "adoption-agency-1.1":
        "End tag (%(name)s) violates step 1, "
        "paragraph 1 of the adoption agency algorithm.",
    "adoption-agency-1.2":
        "End tag (%(name)s) violates step 1, "
        "paragraph 2 of the adoption agency algorithm.",
    "adoption-agency-1.3":
        "End tag (%(name)s) violates step 1, "
        "paragraph 3 of the adoption agency algorithm.",
    "adoption-agency-4.4":
        "End tag (%(name)s) violates step 4, "
        "paragraph 4 of the adoption agency algorithm.",
    "unexpected-end-tag-treated-as":
        "Unexpected end tag (%(originalName)s). Treated as %(newName)s.",
    "no-end-tag":
        "This element (%(name)s) has no end tag.",
    "unexpected-implied-end-tag-in-table":
        "Unexpected implied end tag (%(name)s) in the table phase.",
    "unexpected-implied-end-tag-in-table-body":
        "Unexpected implied end tag (%(name)s) in the table body phase.",
    "unexpected-char-implies-table-voodoo":
        "Unexpected non-space characters in "
        "table context caused voodoo mode.",
    "unexpected-hidden-input-in-table":
        "Unexpected input with type hidden in table context.",
    "unexpected-form-in-table":
        "Unexpected form in table context.",
    "unexpected-start-tag-implies-table-voodoo":
        "Unexpected start tag (%(name)s) in "
        "table context caused voodoo mode.",
    "unexpected-end-tag-implies-table-voodoo":
        "Unexpected end tag (%(name)s) in "
        "table context caused voodoo mode.",
    "unexpected-cell-in-table-body":
        "Unexpected table cell start tag (%(name)s) "
        "in the table body phase.",
    "unexpected-cell-end-tag":
        "Got table cell end tag (%(name)s) "
        "while required end tags are missing.",
    "unexpected-end-tag-in-table-body":
        "Unexpected end tag (%(name)s) in the table body phase. Ignored.",
    "unexpected-implied-end-tag-in-table-row":
        "Unexpected implied end tag (%(name)s) in the table row phase.",
    "unexpected-end-tag-in-table-row":
        "Unexpected end tag (%(name)s) in the table row phase. Ignored.",
    "unexpected-select-in-select":
        "Unexpected select start tag in the select phase "
        "treated as select end tag.",
    "unexpected-input-in-select":
        "Unexpected input start tag in the select phase.",
    "unexpected-start-tag-in-select":
        "Unexpected start tag token (%(name)s in the select phase. "
        "Ignored.",
    "unexpected-end-tag-in-select":
        "Unexpected end tag (%(name)s) in the select phase. Ignored.",
    "unexpected-table-element-start-tag-in-select-in-table":
        "Unexpected table element start tag (%(name)s) in the select in table phase.",
    "unexpected-table-element-end-tag-in-select-in-table":
        "Unexpected table element end tag (%(name)s) in the select in table phase.",
    "unexpected-char-after-body":
        "Unexpected non-space characters in the after body phase.",
    "unexpected-start-tag-after-body":
        "Unexpected start tag token (%(name)s)"
        " in the after body phase.",
    "unexpected-end-tag-after-body":
        "Unexpected end tag token (%(name)s)"
        " in the after body phase.",
    "unexpected-char-in-frameset":
        "Unexpected characters in the frameset phase. Characters ignored.",
    "unexpected-start-tag-in-frameset":
        "Unexpected start tag token (%(name)s)"
        " in the frameset phase. Ignored.",
    "unexpected-frameset-in-frameset-innerhtml":
        "Unexpected end tag token (frameset) "
        "in the frameset phase (innerHTML).",
    "unexpected-end-tag-in-frameset":
        "Unexpected end tag token (%(name)s)"
        " in the frameset phase. Ignored.",
    "unexpected-char-after-frameset":
        "Unexpected non-space characters in the "
        "after frameset phase. Ignored.",
    "unexpected-start-tag-after-frameset":
        "Unexpected start tag (%(name)s)"
        " in the after frameset phase. Ignored.",
    "unexpected-end-tag-after-frameset":
        "Unexpected end tag (%(name)s)"
        " in the after frameset phase. Ignored.",
    "unexpected-end-tag-after-body-innerhtml":
        "Unexpected end tag after body(innerHtml)",
    "expected-eof-but-got-char":
        "Unexpected non-space characters. Expected end of file.",
    "expected-eof-but-got-start-tag":
        "Unexpected start tag (%(name)s)"
        ". Expected end of file.",
    "expected-eof-but-got-end-tag":
        "Unexpected end tag (%(name)s)"
        ". Expected end of file.",
    "eof-in-table":
        "Unexpected end of file. Expected table content.",
    "eof-in-select":
        "Unexpected end of file. Expected select content.",
    "eof-in-frameset":
        "Unexpected end of file. Expected frameset content.",
    "eof-in-script-in-script":
        "Unexpected end of file. Expected script content.",
    "eof-in-foreign-lands":
        "Unexpected end of file. Expected foreign content",
    "non-void-element-with-trailing-solidus":
        "Trailing solidus not allowed on element %(name)s",
    "unexpected-html-element-in-foreign-content":
        "Element %(name)s not allowed in a non-html context",
    "unexpected-end-tag-before-html":
        "Unexpected end tag (%(name)s) before html.",
    "unexpected-inhead-noscript-tag":
        "Element %(name)s not allowed in a inhead-noscript context",
    "eof-in-head-noscript":
        "Unexpected end of file. Expected inhead-noscript content",
    "char-in-head-noscript":
        "Unexpected non-space character. Expected inhead-noscript content",
    "XXX-undefined-error":
        "Undefined error (this sucks and should be fixed)",
}
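
# --- Editor's illustration (not part of the upstream file) -------------------
# The messages above use %-style named placeholders; the parser interpolates
# them with a dict of "datavars" when it reports a parse error, roughly:
assert E["unexpected-start-tag"] % {"name": "div"} == "Unexpected start tag (div)."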

namespaces = {
    "html": "http://www.w3.org/1999/xhtml",
    "mathml": "http://www.w3.org/1998/Math/MathML",
    "svg": "http://www.w3.org/2000/svg",
    "xlink": "http://www.w3.org/1999/xlink",
    "xml": "http://www.w3.org/XML/1998/namespace",
    "xmlns": "http://www.w3.org/2000/xmlns/"
}

scopingElements = frozenset([
    (namespaces["html"], "applet"),
    (namespaces["html"], "caption"),
    (namespaces["html"], "html"),
    (namespaces["html"], "marquee"),
    (namespaces["html"], "object"),
    (namespaces["html"], "table"),
    (namespaces["html"], "td"),
    (namespaces["html"], "th"),
    (namespaces["mathml"], "mi"),
    (namespaces["mathml"], "mo"),
    (namespaces["mathml"], "mn"),
    (namespaces["mathml"], "ms"),
    (namespaces["mathml"], "mtext"),
    (namespaces["mathml"], "annotation-xml"),
    (namespaces["svg"], "foreignObject"),
    (namespaces["svg"], "desc"),
    (namespaces["svg"], "title"),
])
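
# --- Editor's illustration (not part of the upstream file) -------------------
# The element sets in this module are keyed by (namespace URI, tag name) pairs,
# so scope checks can distinguish, for example, SVG <title> from HTML <title>:
assert (namespaces["html"], "table") in scopingElements
assert (namespaces["svg"], "title") in scopingElements
assert (namespaces["html"], "title") not in scopingElements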

formattingElements = frozenset([
    (namespaces["html"], "a"),
    (namespaces["html"], "b"),
    (namespaces["html"], "big"),
    (namespaces["html"], "code"),
    (namespaces["html"], "em"),
    (namespaces["html"], "font"),
    (namespaces["html"], "i"),
    (namespaces["html"], "nobr"),
    (namespaces["html"], "s"),
    (namespaces["html"], "small"),
    (namespaces["html"], "strike"),
    (namespaces["html"], "strong"),
    (namespaces["html"], "tt"),
    (namespaces["html"], "u")
])

specialElements = frozenset([
    (namespaces["html"], "address"),
    (namespaces["html"], "applet"),
    (namespaces["html"], "area"),
    (namespaces["html"], "article"),
    (namespaces["html"], "aside"),
    (namespaces["html"], "base"),
    (namespaces["html"], "basefont"),
    (namespaces["html"], "bgsound"),
    (namespaces["html"], "blockquote"),
    (namespaces["html"], "body"),
    (namespaces["html"], "br"),
    (namespaces["html"], "button"),
    (namespaces["html"], "caption"),
    (namespaces["html"], "center"),
    (namespaces["html"], "col"),
    (namespaces["html"], "colgroup"),
    (namespaces["html"], "command"),
    (namespaces["html"], "dd"),
    (namespaces["html"], "details"),
    (namespaces["html"], "dir"),
    (namespaces["html"], "div"),
    (namespaces["html"], "dl"),
    (namespaces["html"], "dt"),
    (namespaces["html"], "embed"),
    (namespaces["html"], "fieldset"),
    (namespaces["html"], "figure"),
    (namespaces["html"], "footer"),
    (namespaces["html"], "form"),
    (namespaces["html"], "frame"),
    (namespaces["html"], "frameset"),
    (namespaces["html"], "h1"),
    (namespaces["html"], "h2"),
    (namespaces["html"], "h3"),
    (namespaces["html"], "h4"),
    (namespaces["html"], "h5"),
    (namespaces["html"], "h6"),
    (namespaces["html"], "head"),
    (namespaces["html"], "header"),
    (namespaces["html"], "hr"),
    (namespaces["html"], "html"),
    (namespaces["html"], "iframe"),
    # Note that image is commented out in the spec as "this isn't an
    # element that can end up on the stack, so it doesn't matter,"
    (namespaces["html"], "image"),
    (namespaces["html"], "img"),
    (namespaces["html"], "input"),
    (namespaces["html"], "isindex"),
    (namespaces["html"], "li"),
    (namespaces["html"], "link"),
    (namespaces["html"], "listing"),
    (namespaces["html"], "marquee"),
    (namespaces["html"], "menu"),
    (namespaces["html"], "meta"),
    (namespaces["html"], "nav"),
    (namespaces["html"], "noembed"),
    (namespaces["html"], "noframes"),
    (namespaces["html"], "noscript"),
    (namespaces["html"], "object"),
    (namespaces["html"], "ol"),
    (namespaces["html"], "p"),
    (namespaces["html"], "param"),
    (namespaces["html"], "plaintext"),
    (namespaces["html"], "pre"),
    (namespaces["html"], "script"),
    (namespaces["html"], "section"),
    (namespaces["html"], "select"),
    (namespaces["html"], "style"),
    (namespaces["html"], "table"),
    (namespaces["html"], "tbody"),
    (namespaces["html"], "td"),
    (namespaces["html"], "textarea"),
    (namespaces["html"], "tfoot"),
    (namespaces["html"], "th"),
    (namespaces["html"], "thead"),
    (namespaces["html"], "title"),
    (namespaces["html"], "tr"),
    (namespaces["html"], "ul"),
    (namespaces["html"], "wbr"),
    (namespaces["html"], "xmp"),
    (namespaces["svg"], "foreignObject")
])

htmlIntegrationPointElements = frozenset([
    (namespaces["mathml"], "annotaion-xml"),
    (namespaces["svg"], "foreignObject"),
    (namespaces["svg"], "desc"),
    (namespaces["svg"], "title")
])

mathmlTextIntegrationPointElements = frozenset([
    (namespaces["mathml"], "mi"),
    (namespaces["mathml"], "mo"),
    (namespaces["mathml"], "mn"),
    (namespaces["mathml"], "ms"),
    (namespaces["mathml"], "mtext")
])

adjustSVGAttributes = {
    "attributename": "attributeName",
    "attributetype": "attributeType",
    "basefrequency": "baseFrequency",
    "baseprofile": "baseProfile",
    "calcmode": "calcMode",
    "clippathunits": "clipPathUnits",
    "contentscripttype": "contentScriptType",
    "contentstyletype": "contentStyleType",
    "diffuseconstant": "diffuseConstant",
    "edgemode": "edgeMode",
    "externalresourcesrequired": "externalResourcesRequired",
    "filterres": "filterRes",
    "filterunits": "filterUnits",
    "glyphref": "glyphRef",
    "gradienttransform": "gradientTransform",
    "gradientunits": "gradientUnits",
    "kernelmatrix": "kernelMatrix",
    "kernelunitlength": "kernelUnitLength",
    "keypoints": "keyPoints",
    "keysplines": "keySplines",
    "keytimes": "keyTimes",
    "lengthadjust": "lengthAdjust",
    "limitingconeangle": "limitingConeAngle",
    "markerheight": "markerHeight",
    "markerunits": "markerUnits",
    "markerwidth": "markerWidth",
    "maskcontentunits": "maskContentUnits",
    "maskunits": "maskUnits",
    "numoctaves": "numOctaves",
    "pathlength": "pathLength",
    "patterncontentunits": "patternContentUnits",
    "patterntransform": "patternTransform",
    "patternunits": "patternUnits",
    "pointsatx": "pointsAtX",
    "pointsaty": "pointsAtY",
    "pointsatz": "pointsAtZ",
    "preservealpha": "preserveAlpha",
    "preserveaspectratio": "preserveAspectRatio",
    "primitiveunits": "primitiveUnits",
    "refx": "refX",
    "refy": "refY",
    "repeatcount": "repeatCount",
    "repeatdur": "repeatDur",
    "requiredextensions": "requiredExtensions",
    "requiredfeatures": "requiredFeatures",
    "specularconstant": "specularConstant",
    "specularexponent": "specularExponent",
    "spreadmethod": "spreadMethod",
    "startoffset": "startOffset",
    "stddeviation": "stdDeviation",
    "stitchtiles": "stitchTiles",
    "surfacescale": "surfaceScale",
    "systemlanguage": "systemLanguage",
    "tablevalues": "tableValues",
    "targetx": "targetX",
    "targety": "targetY",
    "textlength": "textLength",
    "viewbox": "viewBox",
    "viewtarget": "viewTarget",
    "xchannelselector": "xChannelSelector",
    "ychannelselector": "yChannelSelector",
    "zoomandpan": "zoomAndPan"
}
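
# --- Editor's illustration (not part of the upstream file) -------------------
# The tokenizer lower-cases attribute names, so this table is used to restore
# the mixed-case spellings SVG expects when a foreign start tag is adjusted;
# a rough sketch of that adjustment:
assert {adjustSVGAttributes.get(k, k): v
        for k, v in {"viewbox": "0 0 10 10", "fill": "red"}.items()} == \
    {"viewBox": "0 0 10 10", "fill": "red"}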

adjustMathMLAttributes = {"definitionurl": "definitionURL"}

adjustForeignAttributes = {
    "xlink:actuate": ("xlink", "actuate", namespaces["xlink"]),
    "xlink:arcrole": ("xlink", "arcrole", namespaces["xlink"]),
    "xlink:href": ("xlink", "href", namespaces["xlink"]),
    "xlink:role": ("xlink", "role", namespaces["xlink"]),
    "xlink:show": ("xlink", "show", namespaces["xlink"]),
    "xlink:title": ("xlink", "title", namespaces["xlink"]),
    "xlink:type": ("xlink", "type", namespaces["xlink"]),
    "xml:base": ("xml", "base", namespaces["xml"]),
    "xml:lang": ("xml", "lang", namespaces["xml"]),
    "xml:space": ("xml", "space", namespaces["xml"]),
    "xmlns": (None, "xmlns", namespaces["xmlns"]),
    "xmlns:xlink": ("xmlns", "xlink", namespaces["xmlns"])
}

unadjustForeignAttributes = dict([((ns, local), qname) for qname, (prefix, local, ns) in
                                  adjustForeignAttributes.items()])
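
# --- Editor's illustration (not part of the upstream file) -------------------
# adjustForeignAttributes maps a source attribute name to a (prefix, local
# name, namespace) triple; unadjustForeignAttributes is the reverse lookup,
# keyed by (namespace, local name):
assert adjustForeignAttributes["xlink:href"] == ("xlink", "href", namespaces["xlink"])
assert unadjustForeignAttributes[(namespaces["xlink"], "href")] == "xlink:href"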

spaceCharacters = frozenset([
    "\t",
    "\n",
    "\u000C",
    " ",
    "\r"
])

tableInsertModeElements = frozenset([
    "table",
    "tbody",
    "tfoot",
    "thead",
    "tr"
])

asciiLowercase = frozenset(string.ascii_lowercase)
asciiUppercase = frozenset(string.ascii_uppercase)
asciiLetters = frozenset(string.ascii_letters)
digits = frozenset(string.digits)
hexDigits = frozenset(string.hexdigits)

asciiUpper2Lower = dict([(ord(c), ord(c.lower()))
                         for c in string.ascii_uppercase])
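
# --- Editor's illustration (not part of the upstream file) -------------------
# Mapping ordinal -> ordinal lets this table be passed directly to
# str.translate() for locale-independent, ASCII-only lowercasing:
assert "SELECT".translate(asciiUpper2Lower) == "select"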

# Heading elements need to be ordered
headingElements = (
    "h1",
    "h2",
    "h3",
    "h4",
    "h5",
    "h6"
)

voidElements = frozenset([
    "base",
    "command",
    "event-source",
    "link",
    "meta",
    "hr",
    "br",
    "img",
    "embed",
    "param",
    "area",
    "col",
    "input",
    "source",
    "track"
])

cdataElements = frozenset(['title', 'textarea'])

rcdataElements = frozenset([
    'style',
    'script',
    'xmp',
    'iframe',
    'noembed',
    'noframes',
    'noscript'
])

booleanAttributes = {
    "": frozenset(["irrelevant"]),
    "style": frozenset(["scoped"]),
    "img": frozenset(["ismap"]),
    "audio": frozenset(["autoplay", "controls"]),
    "video": frozenset(["autoplay", "controls"]),
    "script": frozenset(["defer", "async"]),
    "details": frozenset(["open"]),
    "datagrid": frozenset(["multiple", "disabled"]),
    "command": frozenset(["hidden", "disabled", "checked", "default"]),
    "hr": frozenset(["noshade"]),
    "menu": frozenset(["autosubmit"]),
    "fieldset": frozenset(["disabled", "readonly"]),
    "option": frozenset(["disabled", "readonly", "selected"]),
    "optgroup": frozenset(["disabled", "readonly"]),
    "button": frozenset(["disabled", "autofocus"]),
    "input": frozenset(["disabled", "readonly", "required", "autofocus", "checked", "ismap"]),
    "select": frozenset(["disabled", "readonly", "autofocus", "multiple"]),
    "output": frozenset(["disabled", "readonly"]),
}
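
# --- Editor's illustration (not part of the upstream file) -------------------
# This table is consulted (for example by the serializer's
# minimize_boolean_attributes option) to decide which attributes may be
# emitted without a value; the "" key lists attributes treated as boolean on
# every element:
assert "ismap" in booleanAttributes["img"]
assert "irrelevant" in booleanAttributes[""]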

# entitiesWindows1252 has to be _ordered_ and needs to have an index. It
# therefore can't be a frozenset.
entitiesWindows1252 = (
    8364,   # 0x80  0x20AC  EURO SIGN
    65533,  # 0x81          UNDEFINED
    8218,   # 0x82  0x201A  SINGLE LOW-9 QUOTATION MARK
    402,    # 0x83  0x0192  LATIN SMALL LETTER F WITH HOOK
    8222,   # 0x84  0x201E  DOUBLE LOW-9 QUOTATION MARK
    8230,   # 0x85  0x2026  HORIZONTAL ELLIPSIS
    8224,   # 0x86  0x2020  DAGGER
    8225,   # 0x87  0x2021  DOUBLE DAGGER
    710,    # 0x88  0x02C6  MODIFIER LETTER CIRCUMFLEX ACCENT
    8240,   # 0x89  0x2030  PER MILLE SIGN
    352,    # 0x8A  0x0160  LATIN CAPITAL LETTER S WITH CARON
    8249,   # 0x8B  0x2039  SINGLE LEFT-POINTING ANGLE QUOTATION MARK
    338,    # 0x8C  0x0152  LATIN CAPITAL LIGATURE OE
    65533,  # 0x8D          UNDEFINED
    381,    # 0x8E  0x017D  LATIN CAPITAL LETTER Z WITH CARON
    65533,  # 0x8F          UNDEFINED
    65533,  # 0x90          UNDEFINED
    8216,   # 0x91  0x2018  LEFT SINGLE QUOTATION MARK
    8217,   # 0x92  0x2019  RIGHT SINGLE QUOTATION MARK
    8220,   # 0x93  0x201C  LEFT DOUBLE QUOTATION MARK
    8221,   # 0x94  0x201D  RIGHT DOUBLE QUOTATION MARK
    8226,   # 0x95  0x2022  BULLET
    8211,   # 0x96  0x2013  EN DASH
    8212,   # 0x97  0x2014  EM DASH
    732,    # 0x98  0x02DC  SMALL TILDE
    8482,   # 0x99  0x2122  TRADE MARK SIGN
    353,    # 0x9A  0x0161  LATIN SMALL LETTER S WITH CARON
    8250,   # 0x9B  0x203A  SINGLE RIGHT-POINTING ANGLE QUOTATION MARK
    339,    # 0x9C  0x0153  LATIN SMALL LIGATURE OE
    65533,  # 0x9D          UNDEFINED
    382,    # 0x9E  0x017E  LATIN SMALL LETTER Z WITH CARON
    376     # 0x9F  0x0178  LATIN CAPITAL LETTER Y WITH DIAERESIS
)
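
# --- Editor's illustration (not part of the upstream file) -------------------
# Being a tuple, it can be indexed by (codepoint - 0x80): a numeric character
# reference in the 0x80-0x9F range is remapped to the Windows-1252 character
# it was almost certainly meant to be, e.g. &#150; becomes an EN DASH:
assert chr(entitiesWindows1252[0x96 - 0x80]) == "\u2013"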

xmlEntities = frozenset(['lt;', 'gt;', 'amp;', 'apos;', 'quot;'])

entities = {
    "AElig": "\xc6",
    "AElig;": "\xc6",
    "AMP": "&",
    "AMP;": "&",
    "Aacute": "\xc1",
    "Aacute;": "\xc1",
    "Abreve;": "\u0102",
    "Acirc": "\xc2",
    "Acirc;": "\xc2",
    "Acy;": "\u0410",
    "Afr;": "\U0001d504",
    "Agrave": "\xc0",
    "Agrave;": "\xc0",
    "Alpha;": "\u0391",
    "Amacr;": "\u0100",
    "And;": "\u2a53",
    "Aogon;": "\u0104",
    "Aopf;": "\U0001d538",
    "ApplyFunction;": "\u2061",
    "Aring": "\xc5",
    "Aring;": "\xc5",
    "Ascr;": "\U0001d49c",
    "Assign;": "\u2254",
    "Atilde": "\xc3",
    "Atilde;": "\xc3",
    "Auml": "\xc4",
    "Auml;": "\xc4",
    "Backslash;": "\u2216",
    "Barv;": "\u2ae7",
    "Barwed;": "\u2306",
    "Bcy;": "\u0411",
    "Because;": "\u2235",
    "Bernoullis;": "\u212c",
    "Beta;": "\u0392",
    "Bfr;": "\U0001d505",
    "Bopf;": "\U0001d539",
    "Breve;": "\u02d8",
    "Bscr;": "\u212c",
    "Bumpeq;": "\u224e",
    "CHcy;": "\u0427",
    "COPY": "\xa9",
    "COPY;": "\xa9",
    "Cacute;": "\u0106",
    "Cap;": "\u22d2",
    "CapitalDifferentialD;": "\u2145",
    "Cayleys;": "\u212d",
    "Ccaron;": "\u010c",
    "Ccedil": "\xc7",
    "Ccedil;": "\xc7",
    "Ccirc;": "\u0108",
    "Cconint;": "\u2230",
    "Cdot;": "\u010a",
    "Cedilla;": "\xb8",
    "CenterDot;": "\xb7",
    "Cfr;": "\u212d",
    "Chi;": "\u03a7",
    "CircleDot;": "\u2299",
    "CircleMinus;": "\u2296",
    "CirclePlus;": "\u2295",
    "CircleTimes;": "\u2297",
    "ClockwiseContourIntegral;": "\u2232",
    "CloseCurlyDoubleQuote;": "\u201d",
    "CloseCurlyQuote;": "\u2019",
    "Colon;": "\u2237",
    "Colone;": "\u2a74",
    "Congruent;": "\u2261",
    "Conint;": "\u222f",
    "ContourIntegral;": "\u222e",
    "Copf;": "\u2102",
    "Coproduct;": "\u2210",
    "CounterClockwiseContourIntegral;": "\u2233",
    "Cross;": "\u2a2f",
    "Cscr;": "\U0001d49e",
    "Cup;": "\u22d3",
    "CupCap;": "\u224d",
    "DD;": "\u2145",
    "DDotrahd;": "\u2911",
    "DJcy;": "\u0402",
    "DScy;": "\u0405",
    "DZcy;": "\u040f",
    "Dagger;": "\u2021",
    "Darr;": "\u21a1",
    "Dashv;": "\u2ae4",
    "Dcaron;": "\u010e",
    "Dcy;": "\u0414",
    "Del;": "\u2207",
    "Delta;": "\u0394",
    "Dfr;": "\U0001d507",
    "DiacriticalAcute;": "\xb4",
    "DiacriticalDot;": "\u02d9",
    "DiacriticalDoubleAcute;": "\u02dd",
    "DiacriticalGrave;": "`",
    "DiacriticalTilde;": "\u02dc",
    "Diamond;": "\u22c4",
    "DifferentialD;": "\u2146",
    "Dopf;": "\U0001d53b",
    "Dot;": "\xa8",
    "DotDot;": "\u20dc",
    "DotEqual;": "\u2250",
    "DoubleContourIntegral;": "\u222f",
    "DoubleDot;": "\xa8",
    "DoubleDownArrow;": "\u21d3",
    "DoubleLeftArrow;": "\u21d0",
    "DoubleLeftRightArrow;": "\u21d4",
    "DoubleLeftTee;": "\u2ae4",
    "DoubleLongLeftArrow;": "\u27f8",
    "DoubleLongLeftRightArrow;": "\u27fa",
    "DoubleLongRightArrow;": "\u27f9",
    "DoubleRightArrow;": "\u21d2",
    "DoubleRightTee;": "\u22a8",
    "DoubleUpArrow;": "\u21d1",
    "DoubleUpDownArrow;": "\u21d5",
    "DoubleVerticalBar;": "\u2225",
    "DownArrow;": "\u2193",
    "DownArrowBar;": "\u2913",
    "DownArrowUpArrow;": "\u21f5",
    "DownBreve;": "\u0311",
    "DownLeftRightVector;": "\u2950",
    "DownLeftTeeVector;": "\u295e",
    "DownLeftVector;": "\u21bd",
    "DownLeftVectorBar;": "\u2956",
    "DownRightTeeVector;": "\u295f",
    "DownRightVector;": "\u21c1",
    "DownRightVectorBar;": "\u2957",
    "DownTee;": "\u22a4",
    "DownTeeArrow;": "\u21a7",
    "Downarrow;": "\u21d3",
    "Dscr;": "\U0001d49f",
    "Dstrok;": "\u0110",
    "ENG;": "\u014a",
    "ETH": "\xd0",
    "ETH;": "\xd0",
    "Eacute": "\xc9",
    "Eacute;": "\xc9",
    "Ecaron;": "\u011a",
    "Ecirc": "\xca",
    "Ecirc;": "\xca",
    "Ecy;": "\u042d",
    "Edot;": "\u0116",
    "Efr;": "\U0001d508",
    "Egrave": "\xc8",
    "Egrave;": "\xc8",
    "Element;": "\u2208",
    "Emacr;": "\u0112",
    "EmptySmallSquare;": "\u25fb",
    "EmptyVerySmallSquare;": "\u25ab",
    "Eogon;": "\u0118",
    "Eopf;": "\U0001d53c",
    "Epsilon;": "\u0395",
    "Equal;": "\u2a75",
    "EqualTilde;": "\u2242",
    "Equilibrium;": "\u21cc",
    "Escr;": "\u2130",
    "Esim;": "\u2a73",
    "Eta;": "\u0397",
    "Euml": "\xcb",
    "Euml;": "\xcb",
    "Exists;": "\u2203",
    "ExponentialE;": "\u2147",
    "Fcy;": "\u0424",
    "Ffr;": "\U0001d509",
    "FilledSmallSquare;": "\u25fc",
    "FilledVerySmallSquare;": "\u25aa",
    "Fopf;": "\U0001d53d",
    "ForAll;": "\u2200",
    "Fouriertrf;": "\u2131",
    "Fscr;": "\u2131",
    "GJcy;": "\u0403",
    "GT": ">",
    "GT;": ">",
    "Gamma;": "\u0393",
    "Gammad;": "\u03dc",
    "Gbreve;": "\u011e",
    "Gcedil;": "\u0122",
    "Gcirc;": "\u011c",
    "Gcy;": "\u0413",
    "Gdot;": "\u0120",
    "Gfr;": "\U0001d50a",
    "Gg;": "\u22d9",
    "Gopf;": "\U0001d53e",
    "GreaterEqual;": "\u2265",
    "GreaterEqualLess;": "\u22db",
    "GreaterFullEqual;": "\u2267",
    "GreaterGreater;": "\u2aa2",
    "GreaterLess;": "\u2277",
    "GreaterSlantEqual;": "\u2a7e",
    "GreaterTilde;": "\u2273",
    "Gscr;": "\U0001d4a2",
    "Gt;": "\u226b",
    "HARDcy;": "\u042a",
    "Hacek;": "\u02c7",
    "Hat;": "^",
    "Hcirc;": "\u0124",
    "Hfr;": "\u210c",
    "HilbertSpace;": "\u210b",
    "Hopf;": "\u210d",
    "HorizontalLine;": "\u2500",
    "Hscr;": "\u210b",
    "Hstrok;": "\u0126",
    "HumpDownHump;": "\u224e",
    "HumpEqual;": "\u224f",
    "IEcy;": "\u0415",
    "IJlig;": "\u0132",
    "IOcy;": "\u0401",
    "Iacute": "\xcd",
    "Iacute;": "\xcd",
    "Icirc": "\xce",
    "Icirc;": "\xce",
    "Icy;": "\u0418",
    "Idot;": "\u0130",
    "Ifr;": "\u2111",
    "Igrave": "\xcc",
    "Igrave;": "\xcc",
    "Im;": "\u2111",
    "Imacr;": "\u012a",
    "ImaginaryI;": "\u2148",
    "Implies;": "\u21d2",
    "Int;": "\u222c",
    "Integral;": "\u222b",
    "Intersection;": "\u22c2",
    "InvisibleComma;": "\u2063",
    "InvisibleTimes;": "\u2062",
    "Iogon;": "\u012e",
    "Iopf;": "\U0001d540",
    "Iota;": "\u0399",
    "Iscr;": "\u2110",
    "Itilde;": "\u0128",
    "Iukcy;": "\u0406",
    "Iuml": "\xcf",
    "Iuml;": "\xcf",
    "Jcirc;": "\u0134",
    "Jcy;": "\u0419",
    "Jfr;": "\U0001d50d",
    "Jopf;": "\U0001d541",
    "Jscr;": "\U0001d4a5",
    "Jsercy;": "\u0408",
    "Jukcy;": "\u0404",
    "KHcy;": "\u0425",
    "KJcy;": "\u040c",
    "Kappa;": "\u039a",
    "Kcedil;": "\u0136",
    "Kcy;": "\u041a",
    "Kfr;": "\U0001d50e",
    "Kopf;": "\U0001d542",
    "Kscr;": "\U0001d4a6",
    "LJcy;": "\u0409",
    "LT": "<",
    "LT;": "<",
    "Lacute;": "\u0139",
    "Lambda;": "\u039b",
    "Lang;": "\u27ea",
    "Laplacetrf;": "\u2112",
    "Larr;": "\u219e",
    "Lcaron;": "\u013d",
    "Lcedil;": "\u013b",
    "Lcy;": "\u041b",
    "LeftAngleBracket;": "\u27e8",
    "LeftArrow;": "\u2190",
    "LeftArrowBar;": "\u21e4",
    "LeftArrowRightArrow;": "\u21c6",
    "LeftCeiling;": "\u2308",
    "LeftDoubleBracket;": "\u27e6",
    "LeftDownTeeVector;": "\u2961",
    "LeftDownVector;": "\u21c3",
    "LeftDownVectorBar;": "\u2959",
    "LeftFloor;": "\u230a",
    "LeftRightArrow;": "\u2194",
    "LeftRightVector;": "\u294e",
    "LeftTee;": "\u22a3",
    "LeftTeeArrow;": "\u21a4",
    "LeftTeeVector;": "\u295a",
    "LeftTriangle;": "\u22b2",
    "LeftTriangleBar;": "\u29cf",
    "LeftTriangleEqual;": "\u22b4",
    "LeftUpDownVector;": "\u2951",
    "LeftUpTeeVector;": "\u2960",
    "LeftUpVector;": "\u21bf",
    "LeftUpVectorBar;": "\u2958",
    "LeftVector;": "\u21bc",
    "LeftVectorBar;": "\u2952",
    "Leftarrow;": "\u21d0",
    "Leftrightarrow;": "\u21d4",
    "LessEqualGreater;": "\u22da",
    "LessFullEqual;": "\u2266",
    "LessGreater;": "\u2276",
    "LessLess;": "\u2aa1",
    "LessSlantEqual;": "\u2a7d",
    "LessTilde;": "\u2272",
    "Lfr;": "\U0001d50f",
    "Ll;": "\u22d8",
    "Lleftarrow;": "\u21da",
    "Lmidot;": "\u013f",
    "LongLeftArrow;": "\u27f5",
    "LongLeftRightArrow;": "\u27f7",
    "LongRightArrow;": "\u27f6",
    "Longleftarrow;": "\u27f8",
    "Longleftrightarrow;": "\u27fa",
    "Longrightarrow;": "\u27f9",
    "Lopf;": "\U0001d543",
    "LowerLeftArrow;": "\u2199",
    "LowerRightArrow;": "\u2198",
    "Lscr;": "\u2112",
    "Lsh;": "\u21b0",
    "Lstrok;": "\u0141",
    "Lt;": "\u226a",
    "Map;": "\u2905",
    "Mcy;": "\u041c",
    "MediumSpace;": "\u205f",
    "Mellintrf;": "\u2133",
    "Mfr;": "\U0001d510",
    "MinusPlus;": "\u2213",
    "Mopf;": "\U0001d544",
    "Mscr;": "\u2133",
    "Mu;": "\u039c",
    "NJcy;": "\u040a",
    "Nacute;": "\u0143",
    "Ncaron;": "\u0147",
    "Ncedil;": "\u0145",
    "Ncy;": "\u041d",
    "NegativeMediumSpace;": "\u200b",
    "NegativeThickSpace;": "\u200b",
    "NegativeThinSpace;": "\u200b",
    "NegativeVeryThinSpace;": "\u200b",
    "NestedGreaterGreater;": "\u226b",
    "NestedLessLess;": "\u226a",
    "NewLine;": "\n",
    "Nfr;": "\U0001d511",
    "NoBreak;": "\u2060",
    "NonBreakingSpace;": "\xa0",
    "Nopf;": "\u2115",
    "Not;": "\u2aec",
    "NotCongruent;": "\u2262",
    "NotCupCap;": "\u226d",
    "NotDoubleVerticalBar;": "\u2226",
    "NotElement;": "\u2209",
    "NotEqual;": "\u2260",
    "NotEqualTilde;": "\u2242\u0338",
    "NotExists;": "\u2204",
    "NotGreater;": "\u226f",
    "NotGreaterEqual;": "\u2271",
    "NotGreaterFullEqual;": "\u2267\u0338",
    "NotGreaterGreater;": "\u226b\u0338",
    "NotGreaterLess;": "\u2279",
    "NotGreaterSlantEqual;": "\u2a7e\u0338",
    "NotGreaterTilde;": "\u2275",
    "NotHumpDownHump;": "\u224e\u0338",
    "NotHumpEqual;": "\u224f\u0338",
    "NotLeftTriangle;": "\u22ea",
    "NotLeftTriangleBar;": "\u29cf\u0338",
    "NotLeftTriangleEqual;": "\u22ec",
    "NotLess;": "\u226e",
    "NotLessEqual;": "\u2270",
    "NotLessGreater;": "\u2278",
    "NotLessLess;": "\u226a\u0338",
    "NotLessSlantEqual;": "\u2a7d\u0338",
    "NotLessTilde;": "\u2274",
    "NotNestedGreaterGreater;": "\u2aa2\u0338",
    "NotNestedLessLess;": "\u2aa1\u0338",
    "NotPrecedes;": "\u2280",
    "NotPrecedesEqual;": "\u2aaf\u0338",
    "NotPrecedesSlantEqual;": "\u22e0",
    "NotReverseElement;": "\u220c",
    "NotRightTriangle;": "\u22eb",
    "NotRightTriangleBar;": "\u29d0\u0338",
    "NotRightTriangleEqual;": "\u22ed",
    "NotSquareSubset;": "\u228f\u0338",
    "NotSquareSubsetEqual;": "\u22e2",
    "NotSquareSuperset;": "\u2290\u0338",
    "NotSquareSupersetEqual;": "\u22e3",
    "NotSubset;": "\u2282\u20d2",
    "NotSubsetEqual;": "\u2288",
    "NotSucceeds;": "\u2281",
    "NotSucceedsEqual;": "\u2ab0\u0338",
    "NotSucceedsSlantEqual;": "\u22e1",
    "NotSucceedsTilde;": "\u227f\u0338",
    "NotSuperset;": "\u2283\u20d2",
    "NotSupersetEqual;": "\u2289",
    "NotTilde;": "\u2241",
    "NotTildeEqual;": "\u2244",
    "NotTildeFullEqual;": "\u2247",
    "NotTildeTilde;": "\u2249",
    "NotVerticalBar;": "\u2224",
    "Nscr;": "\U0001d4a9",
    "Ntilde": "\xd1",
    "Ntilde;": "\xd1",
    "Nu;": "\u039d",
    "OElig;": "\u0152",
    "Oacute": "\xd3",
    "Oacute;": "\xd3",
    "Ocirc": "\xd4",
    "Ocirc;": "\xd4",
    "Ocy;": "\u041e",
    "Odblac;": "\u0150",
    "Ofr;": "\U0001d512",
    "Ograve": "\xd2",
    "Ograve;": "\xd2",
    "Omacr;": "\u014c",
    "Omega;": "\u03a9",
    "Omicron;": "\u039f",
    "Oopf;": "\U0001d546",
    "OpenCurlyDoubleQuote;": "\u201c",
    "OpenCurlyQuote;": "\u2018",
    "Or;": "\u2a54",
    "Oscr;": "\U0001d4aa",
    "Oslash": "\xd8",
    "Oslash;": "\xd8",
    "Otilde": "\xd5",
    "Otilde;": "\xd5",
    "Otimes;": "\u2a37",
    "Ouml": "\xd6",
    "Ouml;": "\xd6",
    "OverBar;": "\u203e",
    "OverBrace;": "\u23de",
    "OverBracket;": "\u23b4",
    "OverParenthesis;": "\u23dc",
    "PartialD;": "\u2202",
    "Pcy;": "\u041f",
    "Pfr;": "\U0001d513",
    "Phi;": "\u03a6",
    "Pi;": "\u03a0",
    "PlusMinus;": "\xb1",
    "Poincareplane;": "\u210c",
    "Popf;": "\u2119",
    "Pr;": "\u2abb",
    "Precedes;": "\u227a",
    "PrecedesEqual;": "\u2aaf",
    "PrecedesSlantEqual;": "\u227c",
    "PrecedesTilde;": "\u227e",
    "Prime;": "\u2033",
    "Product;": "\u220f",
    "Proportion;": "\u2237",
    "Proportional;": "\u221d",
    "Pscr;": "\U0001d4ab",
    "Psi;": "\u03a8",
    "QUOT": "\"",
    "QUOT;": "\"",
    "Qfr;": "\U0001d514",
    "Qopf;": "\u211a",
    "Qscr;": "\U0001d4ac",
    "RBarr;": "\u2910",
    "REG": "\xae",
    "REG;": "\xae",
    "Racute;": "\u0154",
    "Rang;": "\u27eb",
    "Rarr;": "\u21a0",
    "Rarrtl;": "\u2916",
    "Rcaron;": "\u0158",
    "Rcedil;": "\u0156",
    "Rcy;": "\u0420",
    "Re;": "\u211c",
    "ReverseElement;": "\u220b",
    "ReverseEquilibrium;": "\u21cb",
    "ReverseUpEquilibrium;": "\u296f",
    "Rfr;": "\u211c",
    "Rho;": "\u03a1",
    "RightAngleBracket;": "\u27e9",
    "RightArrow;": "\u2192",
    "RightArrowBar;": "\u21e5",
    "RightArrowLeftArrow;": "\u21c4",
    "RightCeiling;": "\u2309",
    "RightDoubleBracket;": "\u27e7",
    "RightDownTeeVector;": "\u295d",
    "RightDownVector;": "\u21c2",
    "RightDownVectorBar;": "\u2955",
    "RightFloor;": "\u230b",
    "RightTee;": "\u22a2",
    "RightTeeArrow;": "\u21a6",
    "RightTeeVector;": "\u295b",
    "RightTriangle;": "\u22b3",
    "RightTriangleBar;": "\u29d0",
    "RightTriangleEqual;": "\u22b5",
    "RightUpDownVector;": "\u294f",
    "RightUpTeeVector;": "\u295c",
    "RightUpVector;": "\u21be",
    "RightUpVectorBar;": "\u2954",
    "RightVector;": "\u21c0",
    "RightVectorBar;": "\u2953",
    "Rightarrow;": "\u21d2",
    "Ropf;": "\u211d",
    "RoundImplies;": "\u2970",
    "Rrightarrow;": "\u21db",
    "Rscr;": "\u211b",
    "Rsh;": "\u21b1",
    "RuleDelayed;": "\u29f4",
    "SHCHcy;": "\u0429",
    "SHcy;": "\u0428",
    "SOFTcy;": "\u042c",
    "Sacute;": "\u015a",
    "Sc;": "\u2abc",
    "Scaron;": "\u0160",
    "Scedil;": "\u015e",
    "Scirc;": "\u015c",
    "Scy;": "\u0421",
    "Sfr;": "\U0001d516",
    "ShortDownArrow;": "\u2193",
    "ShortLeftArrow;": "\u2190",
    "ShortRightArrow;": "\u2192",
    "ShortUpArrow;": "\u2191",
    "Sigma;": "\u03a3",
    "SmallCircle;": "\u2218",
    "Sopf;": "\U0001d54a",
    "Sqrt;": "\u221a",
    "Square;": "\u25a1",
    "SquareIntersection;": "\u2293",
    "SquareSubset;": "\u228f",
    "SquareSubsetEqual;": "\u2291",
    "SquareSuperset;": "\u2290",
    "SquareSupersetEqual;": "\u2292",
    "SquareUnion;": "\u2294",
    "Sscr;": "\U0001d4ae",
    "Star;": "\u22c6",
    "Sub;": "\u22d0",
    "Subset;": "\u22d0",
    "SubsetEqual;": "\u2286",
    "Succeeds;": "\u227b",
    "SucceedsEqual;": "\u2ab0",
    "SucceedsSlantEqual;": "\u227d",
    "SucceedsTilde;": "\u227f",
    "SuchThat;": "\u220b",
    "Sum;": "\u2211",
    "Sup;": "\u22d1",
    "Superset;": "\u2283",
    "SupersetEqual;": "\u2287",
    "Supset;": "\u22d1",
    "THORN": "\xde",
    "THORN;": "\xde",
    "TRADE;": "\u2122",
    "TSHcy;": "\u040b",
    "TScy;": "\u0426",
    "Tab;": "\t",
    "Tau;": "\u03a4",
    "Tcaron;": "\u0164",
    "Tcedil;": "\u0162",
    "Tcy;": "\u0422",
    "Tfr;": "\U0001d517",
    "Therefore;": "\u2234",
    "Theta;": "\u0398",
    "ThickSpace;": "\u205f\u200a",
    "ThinSpace;": "\u2009",
    "Tilde;": "\u223c",
    "TildeEqual;": "\u2243",
    "TildeFullEqual;": "\u2245",
    "TildeTilde;": "\u2248",
    "Topf;": "\U0001d54b",
    "TripleDot;": "\u20db",
    "Tscr;": "\U0001d4af",
    "Tstrok;": "\u0166",
    "Uacute": "\xda",
    "Uacute;": "\xda",
    "Uarr;": "\u219f",
    "Uarrocir;": "\u2949",
    "Ubrcy;": "\u040e",
    "Ubreve;": "\u016c",
    "Ucirc": "\xdb",
    "Ucirc;": "\xdb",
    "Ucy;": "\u0423",
    "Udblac;": "\u0170",
    "Ufr;": "\U0001d518",
    "Ugrave": "\xd9",
    "Ugrave;": "\xd9",
    "Umacr;": "\u016a",
    "UnderBar;": "_",
    "UnderBrace;": "\u23df",
    "UnderBracket;": "\u23b5",
    "UnderParenthesis;": "\u23dd",
    "Union;": "\u22c3",
    "UnionPlus;": "\u228e",
    "Uogon;": "\u0172",
    "Uopf;": "\U0001d54c",
    "UpArrow;": "\u2191",
    "UpArrowBar;": "\u2912",
    "UpArrowDownArrow;": "\u21c5",
    "UpDownArrow;": "\u2195",
    "UpEquilibrium;": "\u296e",
    "UpTee;": "\u22a5",
    "UpTeeArrow;": "\u21a5",
    "Uparrow;": "\u21d1",
    "Updownarrow;": "\u21d5",
    "UpperLeftArrow;": "\u2196",
    "UpperRightArrow;": "\u2197",
    "Upsi;": "\u03d2",
    "Upsilon;": "\u03a5",
    "Uring;": "\u016e",
    "Uscr;": "\U0001d4b0",
    "Utilde;": "\u0168",
    "Uuml": "\xdc",
    "Uuml;": "\xdc",
    "VDash;": "\u22ab",
    "Vbar;": "\u2aeb",
    "Vcy;": "\u0412",
    "Vdash;": "\u22a9",
    "Vdashl;": "\u2ae6",
    "Vee;": "\u22c1",
    "Verbar;": "\u2016",
    "Vert;": "\u2016",
    "VerticalBar;": "\u2223",
    "VerticalLine;": "|",
    "VerticalSeparator;": "\u2758",
    "VerticalTilde;": "\u2240",
    "VeryThinSpace;": "\u200a",
    "Vfr;": "\U0001d519",
    "Vopf;": "\U0001d54d",
    "Vscr;": "\U0001d4b1",
    "Vvdash;": "\u22aa",
    "Wcirc;": "\u0174",
    "Wedge;": "\u22c0",
    "Wfr;": "\U0001d51a",
    "Wopf;": "\U0001d54e",
    "Wscr;": "\U0001d4b2",
    "Xfr;": "\U0001d51b",
    "Xi;": "\u039e",
    "Xopf;": "\U0001d54f",
    "Xscr;": "\U0001d4b3",
    "YAcy;": "\u042f",
    "YIcy;": "\u0407",
    "YUcy;": "\u042e",
    "Yacute": "\xdd",
    "Yacute;": "\xdd",
    "Ycirc;": "\u0176",
    "Ycy;": "\u042b",
    "Yfr;": "\U0001d51c",
    "Yopf;": "\U0001d550",
    "Yscr;": "\U0001d4b4",
    "Yuml;": "\u0178",
    "ZHcy;": "\u0416",
    "Zacute;": "\u0179",
    "Zcaron;": "\u017d",
    "Zcy;": "\u0417",
    "Zdot;": "\u017b",
    "ZeroWidthSpace;": "\u200b",
    "Zeta;": "\u0396",
    "Zfr;": "\u2128",
    "Zopf;": "\u2124",
    "Zscr;": "\U0001d4b5",
    "aacute": "\xe1",
    "aacute;": "\xe1",
    "abreve;": "\u0103",
    "ac;": "\u223e",
    "acE;": "\u223e\u0333",
    "acd;": "\u223f",
    "acirc": "\xe2",
    "acirc;": "\xe2",
    "acute": "\xb4",
    "acute;": "\xb4",
    "acy;": "\u0430",
    "aelig": "\xe6",
    "aelig;": "\xe6",
    "af;": "\u2061",
    "afr;": "\U0001d51e",
    "agrave": "\xe0",
    "agrave;": "\xe0",
    "alefsym;": "\u2135",
    "aleph;": "\u2135",
    "alpha;": "\u03b1",
    "amacr;": "\u0101",
    "amalg;": "\u2a3f",
    "amp": "&",
    "amp;": "&",
    "and;": "\u2227",
    "andand;": "\u2a55",
    "andd;": "\u2a5c",
    "andslope;": "\u2a58",
    "andv;": "\u2a5a",
    "ang;": "\u2220",
    "ange;": "\u29a4",
    "angle;": "\u2220",
    "angmsd;": "\u2221",
    "angmsdaa;": "\u29a8",
    "angmsdab;": "\u29a9",
    "angmsdac;": "\u29aa",
    "angmsdad;": "\u29ab",
    "angmsdae;": "\u29ac",
    "angmsdaf;": "\u29ad",
    "angmsdag;": "\u29ae",
    "angmsdah;": "\u29af",
    "angrt;": "\u221f",
    "angrtvb;": "\u22be",
    "angrtvbd;": "\u299d",
    "angsph;": "\u2222",
    "angst;": "\xc5",
    "angzarr;": "\u237c",
    "aogon;": "\u0105",
    "aopf;": "\U0001d552",
    "ap;": "\u2248",
    "apE;": "\u2a70",
    "apacir;": "\u2a6f",
    "ape;": "\u224a",
    "apid;": "\u224b",
    "apos;": "'",
    "approx;": "\u2248",
    "approxeq;": "\u224a",
    "aring": "\xe5",
    "aring;": "\xe5",
    "ascr;": "\U0001d4b6",
    "ast;": "*",
    "asymp;": "\u2248",
    "asympeq;": "\u224d",
    "atilde": "\xe3",
    "atilde;": "\xe3",
    "auml": "\xe4",
    "auml;": "\xe4",
    "awconint;": "\u2233",
    "awint;": "\u2a11",
    "bNot;": "\u2aed",
    "backcong;": "\u224c",
    "backepsilon;": "\u03f6",
    "backprime;": "\u2035",
    "backsim;": "\u223d",
    "backsimeq;": "\u22cd",
    "barvee;": "\u22bd",
    "barwed;": "\u2305",
    "barwedge;": "\u2305",
    "bbrk;": "\u23b5",
    "bbrktbrk;": "\u23b6",
    "bcong;": "\u224c",
    "bcy;": "\u0431",
    "bdquo;": "\u201e",
    "becaus;": "\u2235",
    "because;": "\u2235",
    "bemptyv;": "\u29b0",
    "bepsi;": "\u03f6",
    "bernou;": "\u212c",
    "beta;": "\u03b2",
    "beth;": "\u2136",
    "between;": "\u226c",
    "bfr;": "\U0001d51f",
    "bigcap;": "\u22c2",
    "bigcirc;": "\u25ef",
    "bigcup;": "\u22c3",
    "bigodot;": "\u2a00",
    "bigoplus;": "\u2a01",
    "bigotimes;": "\u2a02",
    "bigsqcup;": "\u2a06",
    "bigstar;": "\u2605",
    "bigtriangledown;": "\u25bd",
    "bigtriangleup;": "\u25b3",
    "biguplus;": "\u2a04",
    "bigvee;": "\u22c1",
    "bigwedge;": "\u22c0",
    "bkarow;": "\u290d",
    "blacklozenge;": "\u29eb",
    "blacksquare;": "\u25aa",
    "blacktriangle;": "\u25b4",
    "blacktriangledown;": "\u25be",
    "blacktriangleleft;": "\u25c2",
    "blacktriangleright;": "\u25b8",
    "blank;": "\u2423",
    "blk12;": "\u2592",
    "blk14;": "\u2591",
    "blk34;": "\u2593",
    "block;": "\u2588",
    "bne;": "=\u20e5",
    "bnequiv;": "\u2261\u20e5",
    "bnot;": "\u2310",
    "bopf;": "\U0001d553",
    "bot;": "\u22a5",
    "bottom;": "\u22a5",
    "bowtie;": "\u22c8",
    "boxDL;": "\u2557",
    "boxDR;": "\u2554",
    "boxDl;": "\u2556",
    "boxDr;": "\u2553",
    "boxH;": "\u2550",
    "boxHD;": "\u2566",
    "boxHU;": "\u2569",
    "boxHd;": "\u2564",
    "boxHu;": "\u2567",
    "boxUL;": "\u255d",
    "boxUR;": "\u255a",
    "boxUl;": "\u255c",
    "boxUr;": "\u2559",
    "boxV;": "\u2551",
    "boxVH;": "\u256c",
    "boxVL;": "\u2563",
    "boxVR;": "\u2560",
    "boxVh;": "\u256b",
    "boxVl;": "\u2562",
    "boxVr;": "\u255f",
    "boxbox;": "\u29c9",
    "boxdL;": "\u2555",
    "boxdR;": "\u2552",
    "boxdl;": "\u2510",
    "boxdr;": "\u250c",
    "boxh;": "\u2500",
    "boxhD;": "\u2565",
    "boxhU;": "\u2568",
    "boxhd;": "\u252c",
    "boxhu;": "\u2534",
    "boxminus;": "\u229f",
    "boxplus;": "\u229e",
    "boxtimes;": "\u22a0",
    "boxuL;": "\u255b",
    "boxuR;": "\u2558",
    "boxul;": "\u2518",
    "boxur;": "\u2514",
    "boxv;": "\u2502",
    "boxvH;": "\u256a",
    "boxvL;": "\u2561",
    "boxvR;": "\u255e",
    "boxvh;": "\u253c",
    "boxvl;": "\u2524",
    "boxvr;": "\u251c",
    "bprime;": "\u2035",
    "breve;": "\u02d8",
    "brvbar": "\xa6",
    "brvbar;": "\xa6",
    "bscr;": "\U0001d4b7",
    "bsemi;": "\u204f",
    "bsim;": "\u223d",
    "bsime;": "\u22cd",
    "bsol;": "\\",
    "bsolb;": "\u29c5",
    "bsolhsub;": "\u27c8",
    "bull;": "\u2022",
    "bullet;": "\u2022",
    "bump;": "\u224e",
    "bumpE;": "\u2aae",
    "bumpe;": "\u224f",
    "bumpeq;": "\u224f",
    "cacute;": "\u0107",
    "cap;": "\u2229",
    "capand;": "\u2a44",
    "capbrcup;": "\u2a49",
    "capcap;": "\u2a4b",
    "capcup;": "\u2a47",
    "capdot;": "\u2a40",
    "caps;": "\u2229\ufe00",
    "caret;": "\u2041",
    "caron;": "\u02c7",
    "ccaps;": "\u2a4d",
    "ccaron;": "\u010d",
    "ccedil": "\xe7",
    "ccedil;": "\xe7",
    "ccirc;": "\u0109",
    "ccups;": "\u2a4c",
    "ccupssm;": "\u2a50",
    "cdot;": "\u010b",
    "cedil": "\xb8",
    "cedil;": "\xb8",
    "cemptyv;": "\u29b2",
    "cent": "\xa2",
    "cent;": "\xa2",
    "centerdot;": "\xb7",
    "cfr;": "\U0001d520",
    "chcy;": "\u0447",
    "check;": "\u2713",
    "checkmark;": "\u2713",
    "chi;": "\u03c7",
    "cir;": "\u25cb",
    "cirE;": "\u29c3",
    "circ;": "\u02c6",
    "circeq;": "\u2257",
    "circlearrowleft;": "\u21ba",
    "circlearrowright;": "\u21bb",
    "circledR;": "\xae",
    "circledS;": "\u24c8",
    "circledast;": "\u229b",
    "circledcirc;": "\u229a",
    "circleddash;": "\u229d",
    "cire;": "\u2257",
    "cirfnint;": "\u2a10",
    "cirmid;": "\u2aef",
    "cirscir;": "\u29c2",
    "clubs;": "\u2663",
    "clubsuit;": "\u2663",
    "colon;": ":",
    "colone;": "\u2254",
    "coloneq;": "\u2254",
    "comma;": ",",
    "commat;": "@",
    "comp;": "\u2201",
    "compfn;": "\u2218",
    "complement;": "\u2201",
    "complexes;": "\u2102",
    "cong;": "\u2245",
    "congdot;": "\u2a6d",
    "conint;": "\u222e",
    "copf;": "\U0001d554",
    "coprod;": "\u2210",
    "copy": "\xa9",
    "copy;": "\xa9",
    "copysr;": "\u2117",
    "crarr;": "\u21b5",
    "cross;": "\u2717",
    "cscr;": "\U0001d4b8",
    "csub;": "\u2acf",
    "csube;": "\u2ad1",
    "csup;": "\u2ad0",
    "csupe;": "\u2ad2",
    "ctdot;": "\u22ef",
    "cudarrl;": "\u2938",
    "cudarrr;": "\u2935",
    "cuepr;": "\u22de",
    "cuesc;": "\u22df",
    "cularr;": "\u21b6",
    "cularrp;": "\u293d",
    "cup;": "\u222a",
    "cupbrcap;": "\u2a48",
    "cupcap;": "\u2a46",
    "cupcup;": "\u2a4a",
    "cupdot;": "\u228d",
    "cupor;": "\u2a45",
    "cups;": "\u222a\ufe00",
    "curarr;": "\u21b7",
    "curarrm;": "\u293c",
    "curlyeqprec;": "\u22de",
    "curlyeqsucc;": "\u22df",
    "curlyvee;": "\u22ce",
    "curlywedge;": "\u22cf",
    "curren": "\xa4",
    "curren;": "\xa4",
    "curvearrowleft;": "\u21b6",
    "curvearrowright;": "\u21b7",
    "cuvee;": "\u22ce",
    "cuwed;": "\u22cf",
    "cwconint;": "\u2232",
    "cwint;": "\u2231",
    "cylcty;": "\u232d",
    "dArr;": "\u21d3",
    "dHar;": "\u2965",
    "dagger;": "\u2020",
    "daleth;": "\u2138",
    "darr;": "\u2193",
    "dash;": "\u2010",
    "dashv;": "\u22a3",
    "dbkarow;": "\u290f",
    "dblac;": "\u02dd",
    "dcaron;": "\u010f",
    "dcy;": "\u0434",
    "dd;": "\u2146",
    "ddagger;": "\u2021",
    "ddarr;": "\u21ca",
    "ddotseq;": "\u2a77",
    "deg": "\xb0",
    "deg;": "\xb0",
    "delta;": "\u03b4",
    "demptyv;": "\u29b1",
    "dfisht;": "\u297f",
    "dfr;": "\U0001d521",
    "dharl;": "\u21c3",
    "dharr;": "\u21c2",
    "diam;": "\u22c4",
    "diamond;": "\u22c4",
    "diamondsuit;": "\u2666",
    "diams;": "\u2666",
    "die;": "\xa8",
    "digamma;": "\u03dd",
    "disin;": "\u22f2",
    "div;": "\xf7",
    "divide": "\xf7",
    "divide;": "\xf7",
    "divideontimes;": "\u22c7",
    "divonx;": "\u22c7",
    "djcy;": "\u0452",
    "dlcorn;": "\u231e",
    "dlcrop;": "\u230d",
    "dollar;": "$",
    "dopf;": "\U0001d555",
    "dot;": "\u02d9",
    "doteq;": "\u2250",
    "doteqdot;": "\u2251",
    "dotminus;": "\u2238",
    "dotplus;": "\u2214",
    "dotsquare;": "\u22a1",
    "doublebarwedge;": "\u2306",
    "downarrow;": "\u2193",
    "downdownarrows;": "\u21ca",
    "downharpoonleft;": "\u21c3",
    "downharpoonright;": "\u21c2",
    "drbkarow;": "\u2910",
    "drcorn;": "\u231f",
    "drcrop;": "\u230c",
    "dscr;": "\U0001d4b9",
    "dscy;": "\u0455",
    "dsol;": "\u29f6",
    "dstrok;": "\u0111",
    "dtdot;": "\u22f1",
    "dtri;": "\u25bf",
    "dtrif;": "\u25be",
    "duarr;": "\u21f5",
    "duhar;": "\u296f",
    "dwangle;": "\u29a6",
    "dzcy;": "\u045f",
    "dzigrarr;": "\u27ff",
    "eDDot;": "\u2a77",
    "eDot;": "\u2251",
    "eacute": "\xe9",
    "eacute;": "\xe9",
    "easter;": "\u2a6e",
    "ecaron;": "\u011b",
    "ecir;": "\u2256",
    "ecirc": "\xea",
    "ecirc;": "\xea",
    "ecolon;": "\u2255",
    "ecy;": "\u044d",
    "edot;": "\u0117",
    "ee;": "\u2147",
    "efDot;": "\u2252",
    "efr;": "\U0001d522",
    "eg;": "\u2a9a",
    "egrave": "\xe8",
    "egrave;": "\xe8",
    "egs;": "\u2a96",
    "egsdot;": "\u2a98",
    "el;": "\u2a99",
    "elinters;": "\u23e7",
    "ell;": "\u2113",
    "els;": "\u2a95",
    "elsdot;": "\u2a97",
    "emacr;": "\u0113",
    "empty;": "\u2205",
    "emptyset;": "\u2205",
    "emptyv;": "\u2205",
    "emsp13;": "\u2004",
    "emsp14;": "\u2005",
    "emsp;": "\u2003",
    "eng;": "\u014b",
    "ensp;": "\u2002",
    "eogon;": "\u0119",
    "eopf;": "\U0001d556",
    "epar;": "\u22d5",
    "eparsl;": "\u29e3",
    "eplus;": "\u2a71",
    "epsi;": "\u03b5",
    "epsilon;": "\u03b5",
    "epsiv;": "\u03f5",
    "eqcirc;": "\u2256",
    "eqcolon;": "\u2255",
    "eqsim;": "\u2242",
    "eqslantgtr;": "\u2a96",
    "eqslantless;": "\u2a95",
    "equals;": "=",
    "equest;": "\u225f",
    "equiv;": "\u2261",
    "equivDD;": "\u2a78",
    "eqvparsl;": "\u29e5",
    "erDot;": "\u2253",
    "erarr;": "\u2971",
    "escr;": "\u212f",
    "esdot;": "\u2250",
    "esim;": "\u2242",
    "eta;": "\u03b7",
    "eth": "\xf0",
    "eth;": "\xf0",
    "euml": "\xeb",
    "euml;": "\xeb",
    "euro;": "\u20ac",
    "excl;": "!",
    "exist;": "\u2203",
    "expectation;": "\u2130",
    "exponentiale;": "\u2147",
    "fallingdotseq;": "\u2252",
    "fcy;": "\u0444",
    "female;": "\u2640",
    "ffilig;": "\ufb03",
    "fflig;": "\ufb00",
    "ffllig;": "\ufb04",
    "ffr;": "\U0001d523",
    "filig;": "\ufb01",
    "fjlig;": "fj",
    "flat;": "\u266d",
    "fllig;": "\ufb02",
    "fltns;": "\u25b1",
    "fnof;": "\u0192",
    "fopf;": "\U0001d557",
    "forall;": "\u2200",
    "fork;": "\u22d4",
    "forkv;": "\u2ad9",
    "fpartint;": "\u2a0d",
    "frac12": "\xbd",
    "frac12;": "\xbd",
    "frac13;": "\u2153",
    "frac14": "\xbc",
    "frac14;": "\xbc",
    "frac15;": "\u2155",
    "frac16;": "\u2159",
    "frac18;": "\u215b",
    "frac23;": "\u2154",
    "frac25;": "\u2156",
    "frac34": "\xbe",
    "frac34;": "\xbe",
    "frac35;": "\u2157",
    "frac38;": "\u215c",
    "frac45;": "\u2158",
    "frac56;": "\u215a",
    "frac58;": "\u215d",
    "frac78;": "\u215e",
    "frasl;": "\u2044",
    "frown;": "\u2322",
    "fscr;": "\U0001d4bb",
    "gE;": "\u2267",
    "gEl;": "\u2a8c",
    "gacute;": "\u01f5",
    "gamma;": "\u03b3",
    "gammad;": "\u03dd",
    "gap;": "\u2a86",
    "gbreve;": "\u011f",
    "gcirc;": "\u011d",
    "gcy;": "\u0433",
    "gdot;": "\u0121",
    "ge;": "\u2265",
    "gel;": "\u22db",
    "geq;": "\u2265",
    "geqq;": "\u2267",
    "geqslant;": "\u2a7e",
    "ges;": "\u2a7e",
    "gescc;": "\u2aa9",
    "gesdot;": "\u2a80",
    "gesdoto;": "\u2a82",
    "gesdotol;": "\u2a84",
    "gesl;": "\u22db\ufe00",
    "gesles;": "\u2a94",
    "gfr;": "\U0001d524",
    "gg;": "\u226b",
    "ggg;": "\u22d9",
    "gimel;": "\u2137",
    "gjcy;": "\u0453",
    "gl;": "\u2277",
    "glE;": "\u2a92",
    "gla;": "\u2aa5",
    "glj;": "\u2aa4",
    "gnE;": "\u2269",
    "gnap;": "\u2a8a",
    "gnapprox;": "\u2a8a",
    "gne;": "\u2a88",
    "gneq;": "\u2a88",
    "gneqq;": "\u2269",
    "gnsim;": "\u22e7",
    "gopf;": "\U0001d558",
    "grave;": "`",
    "gscr;": "\u210a",
    "gsim;": "\u2273",
    "gsime;": "\u2a8e",
    "gsiml;": "\u2a90",
    "gt": ">",
    "gt;": ">",
    "gtcc;": "\u2aa7",
    "gtcir;": "\u2a7a",
    "gtdot;": "\u22d7",
    "gtlPar;": "\u2995",
    "gtquest;": "\u2a7c",
    "gtrapprox;": "\u2a86",
    "gtrarr;": "\u2978",
    "gtrdot;": "\u22d7",
    "gtreqless;": "\u22db",
    "gtreqqless;": "\u2a8c",
    "gtrless;": "\u2277",
    "gtrsim;": "\u2273",
    "gvertneqq;": "\u2269\ufe00",
    "gvnE;": "\u2269\ufe00",
    "hArr;": "\u21d4",
    "hairsp;": "\u200a",
    "half;": "\xbd",
    "hamilt;": "\u210b",
    "hardcy;": "\u044a",
    "harr;": "\u2194",
    "harrcir;": "\u2948",
    "harrw;": "\u21ad",
    "hbar;": "\u210f",
    "hcirc;": "\u0125",
    "hearts;": "\u2665",
    "heartsuit;": "\u2665",
    "hellip;": "\u2026",
    "hercon;": "\u22b9",
    "hfr;": "\U0001d525",
    "hksearow;": "\u2925",
    "hkswarow;": "\u2926",
    "hoarr;": "\u21ff",
    "homtht;": "\u223b",
    "hookleftarrow;": "\u21a9",
    "hookrightarrow;": "\u21aa",
    "hopf;": "\U0001d559",
    "horbar;": "\u2015",
    "hscr;": "\U0001d4bd",
    "hslash;": "\u210f",
    "hstrok;": "\u0127",
    "hybull;": "\u2043",
    "hyphen;": "\u2010",
    "iacute": "\xed",
    "iacute;": "\xed",
    "ic;": "\u2063",
    "icirc": "\xee",
    "icirc;": "\xee",
    "icy;": "\u0438",
    "iecy;": "\u0435",
    "iexcl": "\xa1",
    "iexcl;": "\xa1",
    "iff;": "\u21d4",
    "ifr;": "\U0001d526",
    "igrave": "\xec",
    "igrave;": "\xec",
    "ii;": "\u2148",
    "iiiint;": "\u2a0c",
    "iiint;": "\u222d",
    "iinfin;": "\u29dc",
    "iiota;": "\u2129",
    "ijlig;": "\u0133",
    "imacr;": "\u012b",
    "image;": "\u2111",
    "imagline;": "\u2110",
    "imagpart;": "\u2111",
    "imath;": "\u0131",
    "imof;": "\u22b7",
    "imped;": "\u01b5",
    "in;": "\u2208",
    "incare;": "\u2105",
    "infin;": "\u221e",
    "infintie;": "\u29dd",
    "inodot;": "\u0131",
    "int;": "\u222b",
    "intcal;": "\u22ba",
    "integers;": "\u2124",
    "intercal;": "\u22ba",
    "intlarhk;": "\u2a17",
    "intprod;": "\u2a3c",
    "iocy;": "\u0451",
    "iogon;": "\u012f",
    "iopf;": "\U0001d55a",
    "iota;": "\u03b9",
    "iprod;": "\u2a3c",
    "iquest": "\xbf",
    "iquest;": "\xbf",
    "iscr;": "\U0001d4be",
    "isin;": "\u2208",
    "isinE;": "\u22f9",
    "isindot;": "\u22f5",
    "isins;": "\u22f4",
    "isinsv;": "\u22f3",
    "isinv;": "\u2208",
    "it;": "\u2062",
    "itilde;": "\u0129",
    "iukcy;": "\u0456",
    "iuml": "\xef",
    "iuml;": "\xef",
    "jcirc;": "\u0135",
    "jcy;": "\u0439",
    "jfr;": "\U0001d527",
    "jmath;": "\u0237",
    "jopf;": "\U0001d55b",
    "jscr;": "\U0001d4bf",
    "jsercy;": "\u0458",
    "jukcy;": "\u0454",
    "kappa;": "\u03ba",
    "kappav;": "\u03f0",
    "kcedil;": "\u0137",
    "kcy;": "\u043a",
    "kfr;": "\U0001d528",
    "kgreen;": "\u0138",
    "khcy;": "\u0445",
    "kjcy;": "\u045c",
    "kopf;": "\U0001d55c",
    "kscr;": "\U0001d4c0",
    "lAarr;": "\u21da",
    "lArr;": "\u21d0",
    "lAtail;": "\u291b",
    "lBarr;": "\u290e",
    "lE;": "\u2266",
    "lEg;": "\u2a8b",
    "lHar;": "\u2962",
    "lacute;": "\u013a",
    "laemptyv;": "\u29b4",
    "lagran;": "\u2112",
    "lambda;": "\u03bb",
    "lang;": "\u27e8",
    "langd;": "\u2991",
    "langle;": "\u27e8",
    "lap;": "\u2a85",
    "laquo": "\xab",
    "laquo;": "\xab",
    "larr;": "\u2190",
    "larrb;": "\u21e4",
    "larrbfs;": "\u291f",
    "larrfs;": "\u291d",
    "larrhk;": "\u21a9",
    "larrlp;": "\u21ab",
    "larrpl;": "\u2939",
    "larrsim;": "\u2973",
    "larrtl;": "\u21a2",
    "lat;": "\u2aab",
    "latail;": "\u2919",
    "late;": "\u2aad",
    "lates;": "\u2aad\ufe00",
    "lbarr;": "\u290c",
    "lbbrk;": "\u2772",
    "lbrace;": "{",
    "lbrack;": "[",
    "lbrke;": "\u298b",
    "lbrksld;": "\u298f",
    "lbrkslu;": "\u298d",
    "lcaron;": "\u013e",
    "lcedil;": "\u013c",
    "lceil;": "\u2308",
    "lcub;": "{",
    "lcy;": "\u043b",
    "ldca;": "\u2936",
    "ldquo;": "\u201c",
    "ldquor;": "\u201e",
    "ldrdhar;": "\u2967",
    "ldrushar;": "\u294b",
    "ldsh;": "\u21b2",
    "le;": "\u2264",
    "leftarrow;": "\u2190",
    "leftarrowtail;": "\u21a2",
    "leftharpoondown;": "\u21bd",
    "leftharpoonup;": "\u21bc",
    "leftleftarrows;": "\u21c7",
    "leftrightarrow;": "\u2194",
    "leftrightarrows;": "\u21c6",
    "leftrightharpoons;": "\u21cb",
    "leftrightsquigarrow;": "\u21ad",
    "leftthreetimes;": "\u22cb",
    "leg;": "\u22da",
    "leq;": "\u2264",
    "leqq;": "\u2266",
    "leqslant;": "\u2a7d",
    "les;": "\u2a7d",
    "lescc;": "\u2aa8",
    "lesdot;": "\u2a7f",
    "lesdoto;": "\u2a81",
    "lesdotor;": "\u2a83",
    "lesg;": "\u22da\ufe00",
    "lesges;": "\u2a93",
    "lessapprox;": "\u2a85",
    "lessdot;": "\u22d6",
    "lesseqgtr;": "\u22da",
    "lesseqqgtr;": "\u2a8b",
    "lessgtr;": "\u2276",
    "lesssim;": "\u2272",
    "lfisht;": "\u297c",
    "lfloor;": "\u230a",
    "lfr;": "\U0001d529",
    "lg;": "\u2276",
    "lgE;": "\u2a91",
    "lhard;": "\u21bd",
    "lharu;": "\u21bc",
    "lharul;": "\u296a",
    "lhblk;": "\u2584",
    "ljcy;": "\u0459",
    "ll;": "\u226a",
    "llarr;": "\u21c7",
    "llcorner;": "\u231e",
    "llhard;": "\u296b",
    "lltri;": "\u25fa",
    "lmidot;": "\u0140",
    "lmoust;": "\u23b0",
    "lmoustache;": "\u23b0",
    "lnE;": "\u2268",
    "lnap;": "\u2a89",
    "lnapprox;": "\u2a89",
    "lne;": "\u2a87",
    "lneq;": "\u2a87",
    "lneqq;": "\u2268",
    "lnsim;": "\u22e6",
    "loang;": "\u27ec",
    "loarr;": "\u21fd",
    "lobrk;": "\u27e6",
    "longleftarrow;": "\u27f5",
    "longleftrightarrow;": "\u27f7",
    "longmapsto;": "\u27fc",
    "longrightarrow;": "\u27f6",
    "looparrowleft;": "\u21ab",
    "looparrowright;": "\u21ac",
    "lopar;": "\u2985",
    "lopf;": "\U0001d55d",
    "loplus;": "\u2a2d",
    "lotimes;": "\u2a34",
    "lowast;": "\u2217",
    "lowbar;": "_",
    "loz;": "\u25ca",
    "lozenge;": "\u25ca",
    "lozf;": "\u29eb",
    "lpar;": "(",
    "lparlt;": "\u2993",
    "lrarr;": "\u21c6",
    "lrcorner;": "\u231f",
    "lrhar;": "\u21cb",
    "lrhard;": "\u296d",
    "lrm;": "\u200e",
    "lrtri;": "\u22bf",
    "lsaquo;": "\u2039",
    "lscr;": "\U0001d4c1",
    "lsh;": "\u21b0",
    "lsim;": "\u2272",
    "lsime;": "\u2a8d",
    "lsimg;": "\u2a8f",
    "lsqb;": "[",
    "lsquo;": "\u2018",
    "lsquor;": "\u201a",
    "lstrok;": "\u0142",
    "lt": "<",
    "lt;": "<",
    "ltcc;": "\u2aa6",
    "ltcir;": "\u2a79",
    "ltdot;": "\u22d6",
    "lthree;": "\u22cb",
    "ltimes;": "\u22c9",
    "ltlarr;": "\u2976",
    "ltquest;": "\u2a7b",
    "ltrPar;": "\u2996",
    "ltri;": "\u25c3",
    "ltrie;": "\u22b4",
    "ltrif;": "\u25c2",
    "lurdshar;": "\u294a",
    "luruhar;": "\u2966",
    "lvertneqq;": "\u2268\ufe00",
    "lvnE;": "\u2268\ufe00",
    "mDDot;": "\u223a",
    "macr": "\xaf",
    "macr;": "\xaf",
    "male;": "\u2642",
    "malt;": "\u2720",
    "maltese;": "\u2720",
    "map;": "\u21a6",
    "mapsto;": "\u21a6",
    "mapstodown;": "\u21a7",
    "mapstoleft;": "\u21a4",
    "mapstoup;": "\u21a5",
    "marker;": "\u25ae",
    "mcomma;": "\u2a29",
    "mcy;": "\u043c",
    "mdash;": "\u2014",
    "measuredangle;": "\u2221",
    "mfr;": "\U0001d52a",
    "mho;": "\u2127",
    "micro": "\xb5",
    "micro;": "\xb5",
    "mid;": "\u2223",
    "midast;": "*",
    "midcir;": "\u2af0",
    "middot": "\xb7",
    "middot;": "\xb7",
    "minus;": "\u2212",
    "minusb;": "\u229f",
    "minusd;": "\u2238",
    "minusdu;": "\u2a2a",
    "mlcp;": "\u2adb",
    "mldr;": "\u2026",
    "mnplus;": "\u2213",
    "models;": "\u22a7",
    "mopf;": "\U0001d55e",
    "mp;": "\u2213",
    "mscr;": "\U0001d4c2",
    "mstpos;": "\u223e",
    "mu;": "\u03bc",
    "multimap;": "\u22b8",
    "mumap;": "\u22b8",
    "nGg;": "\u22d9\u0338",
    "nGt;": "\u226b\u20d2",
    "nGtv;": "\u226b\u0338",
    "nLeftarrow;": "\u21cd",
    "nLeftrightarrow;": "\u21ce",
    "nLl;": "\u22d8\u0338",
    "nLt;": "\u226a\u20d2",
    "nLtv;": "\u226a\u0338",
    "nRightarrow;": "\u21cf",
    "nVDash;": "\u22af",
    "nVdash;": "\u22ae",
    "nabla;": "\u2207",
    "nacute;": "\u0144",
    "nang;": "\u2220\u20d2",
    "nap;": "\u2249",
    "napE;": "\u2a70\u0338",
    "napid;": "\u224b\u0338",
    "napos;": "\u0149",
    "napprox;": "\u2249",
    "natur;": "\u266e",
    "natural;": "\u266e",
    "naturals;": "\u2115",
    "nbsp": "\xa0",
    "nbsp;": "\xa0",
    "nbump;": "\u224e\u0338",
    "nbumpe;": "\u224f\u0338",
    "ncap;": "\u2a43",
    "ncaron;": "\u0148",
    "ncedil;": "\u0146",
    "ncong;": "\u2247",
    "ncongdot;": "\u2a6d\u0338",
    "ncup;": "\u2a42",
    "ncy;": "\u043d",
    "ndash;": "\u2013",
    "ne;": "\u2260",
    "neArr;": "\u21d7",
    "nearhk;": "\u2924",
    "nearr;": "\u2197",
    "nearrow;": "\u2197",
    "nedot;": "\u2250\u0338",
    "nequiv;": "\u2262",
    "nesear;": "\u2928",
    "nesim;": "\u2242\u0338",
    "nexist;": "\u2204",
    "nexists;": "\u2204",
    "nfr;": "\U0001d52b",
    "ngE;": "\u2267\u0338",
    "nge;": "\u2271",
    "ngeq;": "\u2271",
    "ngeqq;": "\u2267\u0338",
    "ngeqslant;": "\u2a7e\u0338",
    "nges;": "\u2a7e\u0338",
    "ngsim;": "\u2275",
    "ngt;": "\u226f",
    "ngtr;": "\u226f",
    "nhArr;": "\u21ce",
    "nharr;": "\u21ae",
    "nhpar;": "\u2af2",
    "ni;": "\u220b",
    "nis;": "\u22fc",
    "nisd;": "\u22fa",
    "niv;": "\u220b",
    "njcy;": "\u045a",
    "nlArr;": "\u21cd",
    "nlE;": "\u2266\u0338",
    "nlarr;": "\u219a",
    "nldr;": "\u2025",
    "nle;": "\u2270",
    "nleftarrow;": "\u219a",
    "nleftrightarrow;": "\u21ae",
    "nleq;": "\u2270",
    "nleqq;": "\u2266\u0338",
    "nleqslant;": "\u2a7d\u0338",
    "nles;": "\u2a7d\u0338",
    "nless;": "\u226e",
    "nlsim;": "\u2274",
    "nlt;": "\u226e",
    "nltri;": "\u22ea",
    "nltrie;": "\u22ec",
    "nmid;": "\u2224",
    "nopf;": "\U0001d55f",
    "not": "\xac",
    "not;": "\xac",
    "notin;": "\u2209",
    "notinE;": "\u22f9\u0338",
    "notindot;": "\u22f5\u0338",
    "notinva;": "\u2209",
    "notinvb;": "\u22f7",
    "notinvc;": "\u22f6",
    "notni;": "\u220c",
    "notniva;": "\u220c",
    "notnivb;": "\u22fe",
    "notnivc;": "\u22fd",
    "npar;": "\u2226",
    "nparallel;": "\u2226",
    "nparsl;": "\u2afd\u20e5",
    "npart;": "\u2202\u0338",
    "npolint;": "\u2a14",
    "npr;": "\u2280",
    "nprcue;": "\u22e0",
    "npre;": "\u2aaf\u0338",
    "nprec;": "\u2280",
    "npreceq;": "\u2aaf\u0338",
    "nrArr;": "\u21cf",
    "nrarr;": "\u219b",
    "nrarrc;": "\u2933\u0338",
    "nrarrw;": "\u219d\u0338",
    "nrightarrow;": "\u219b",
    "nrtri;": "\u22eb",
    "nrtrie;": "\u22ed",
    "nsc;": "\u2281",
    "nsccue;": "\u22e1",
    "nsce;": "\u2ab0\u0338",
    "nscr;": "\U0001d4c3",
    "nshortmid;": "\u2224",
    "nshortparallel;": "\u2226",
    "nsim;": "\u2241",
    "nsime;": "\u2244",
    "nsimeq;": "\u2244",
    "nsmid;": "\u2224",
    "nspar;": "\u2226",
    "nsqsube;": "\u22e2",
    "nsqsupe;": "\u22e3",
    "nsub;": "\u2284",
    "nsubE;": "\u2ac5\u0338",
    "nsube;": "\u2288",
    "nsubset;": "\u2282\u20d2",
    "nsubseteq;": "\u2288",
    "nsubseteqq;": "\u2ac5\u0338",
    "nsucc;": "\u2281",
    "nsucceq;": "\u2ab0\u0338",
    "nsup;": "\u2285",
    "nsupE;": "\u2ac6\u0338",
    "nsupe;": "\u2289",
    "nsupset;": "\u2283\u20d2",
    "nsupseteq;": "\u2289",
    "nsupseteqq;": "\u2ac6\u0338",
    "ntgl;": "\u2279",
    "ntilde": "\xf1",
    "ntilde;": "\xf1",
    "ntlg;": "\u2278",
    "ntriangleleft;": "\u22ea",
    "ntrianglelefteq;": "\u22ec",
    "ntriangleright;": "\u22eb",
    "ntrianglerighteq;": "\u22ed",
    "nu;": "\u03bd",
    "num;": "#",
    "numero;": "\u2116",
    "numsp;": "\u2007",
    "nvDash;": "\u22ad",
    "nvHarr;": "\u2904",
    "nvap;": "\u224d\u20d2",
    "nvdash;": "\u22ac",
    "nvge;": "\u2265\u20d2",
    "nvgt;": ">\u20d2",
    "nvinfin;": "\u29de",
    "nvlArr;": "\u2902",
    "nvle;": "\u2264\u20d2",
    "nvlt;": "<\u20d2",
    "nvltrie;": "\u22b4\u20d2",
    "nvrArr;": "\u2903",
    "nvrtrie;": "\u22b5\u20d2",
    "nvsim;": "\u223c\u20d2",
    "nwArr;": "\u21d6",
    "nwarhk;": "\u2923",
    "nwarr;": "\u2196",
    "nwarrow;": "\u2196",
    "nwnear;": "\u2927",
    "oS;": "\u24c8",
    "oacute": "\xf3",
    "oacute;": "\xf3",
    "oast;": "\u229b",
    "ocir;": "\u229a",
    "ocirc": "\xf4",
    "ocirc;": "\xf4",
    "ocy;": "\u043e",
    "odash;": "\u229d",
    "odblac;": "\u0151",
    "odiv;": "\u2a38",
    "odot;": "\u2299",
    "odsold;": "\u29bc",
    "oelig;": "\u0153",
    "ofcir;": "\u29bf",
    "ofr;": "\U0001d52c",
    "ogon;": "\u02db",
    "ograve": "\xf2",
    "ograve;": "\xf2",
    "ogt;": "\u29c1",
    "ohbar;": "\u29b5",
    "ohm;": "\u03a9",
    "oint;": "\u222e",
    "olarr;": "\u21ba",
    "olcir;": "\u29be",
    "olcross;": "\u29bb",
    "oline;": "\u203e",
    "olt;": "\u29c0",
    "omacr;": "\u014d",
    "omega;": "\u03c9",
    "omicron;": "\u03bf",
    "omid;": "\u29b6",
    "ominus;": "\u2296",
    "oopf;": "\U0001d560",
    "opar;": "\u29b7",
    "operp;": "\u29b9",
    "oplus;": "\u2295",
    "or;": "\u2228",
    "orarr;": "\u21bb",
    "ord;": "\u2a5d",
    "order;": "\u2134",
    "orderof;": "\u2134",
    "ordf": "\xaa",
    "ordf;": "\xaa",
    "ordm": "\xba",
    "ordm;": "\xba",
    "origof;": "\u22b6",
    "oror;": "\u2a56",
    "orslope;": "\u2a57",
    "orv;": "\u2a5b",
    "oscr;": "\u2134",
    "oslash": "\xf8",
    "oslash;": "\xf8",
    "osol;": "\u2298",
    "otilde": "\xf5",
    "otilde;": "\xf5",
    "otimes;": "\u2297",
    "otimesas;": "\u2a36",
    "ouml": "\xf6",
    "ouml;": "\xf6",
    "ovbar;": "\u233d",
    "par;": "\u2225",
    "para": "\xb6",
    "para;": "\xb6",
    "parallel;": "\u2225",
    "parsim;": "\u2af3",
    "parsl;": "\u2afd",
    "part;": "\u2202",
    "pcy;": "\u043f",
    "percnt;": "%",
    "period;": ".",
    "permil;": "\u2030",
    "perp;": "\u22a5",
    "pertenk;": "\u2031",
    "pfr;": "\U0001d52d",
    "phi;": "\u03c6",
    "phiv;": "\u03d5",
    "phmmat;": "\u2133",
    "phone;": "\u260e",
    "pi;": "\u03c0",
    "pitchfork;": "\u22d4",
    "piv;": "\u03d6",
    "planck;": "\u210f",
    "planckh;": "\u210e",
    "plankv;": "\u210f",
    "plus;": "+",
    "plusacir;": "\u2a23",
    "plusb;": "\u229e",
    "pluscir;": "\u2a22",
    "plusdo;": "\u2214",
    "plusdu;": "\u2a25",
    "pluse;": "\u2a72",
    "plusmn": "\xb1",
    "plusmn;": "\xb1",
    "plussim;": "\u2a26",
    "plustwo;": "\u2a27",
    "pm;": "\xb1",
    "pointint;": "\u2a15",
    "popf;": "\U0001d561",
    "pound": "\xa3",
    "pound;": "\xa3",
    "pr;": "\u227a",
    "prE;": "\u2ab3",
    "prap;": "\u2ab7",
    "prcue;": "\u227c",
    "pre;": "\u2aaf",
    "prec;": "\u227a",
    "precapprox;": "\u2ab7",
    "preccurlyeq;": "\u227c",
    "preceq;": "\u2aaf",
    "precnapprox;": "\u2ab9",
    "precneqq;": "\u2ab5",
    "precnsim;": "\u22e8",
    "precsim;": "\u227e",
    "prime;": "\u2032",
    "primes;": "\u2119",
    "prnE;": "\u2ab5",
    "prnap;": "\u2ab9",
    "prnsim;": "\u22e8",
    "prod;": "\u220f",
    "profalar;": "\u232e",
    "profline;": "\u2312",
    "profsurf;": "\u2313",
    "prop;": "\u221d",
    "propto;": "\u221d",
    "prsim;": "\u227e",
    "prurel;": "\u22b0",
    "pscr;": "\U0001d4c5",
    "psi;": "\u03c8",
    "puncsp;": "\u2008",
    "qfr;": "\U0001d52e",
    "qint;": "\u2a0c",
    "qopf;": "\U0001d562",
    "qprime;": "\u2057",
    "qscr;": "\U0001d4c6",
    "quaternions;": "\u210d",
    "quatint;": "\u2a16",
    "quest;": "?",
    "questeq;": "\u225f",
    "quot": "\"",
    "quot;": "\"",
    "rAarr;": "\u21db",
    "rArr;": "\u21d2",
    "rAtail;": "\u291c",
    "rBarr;": "\u290f",
    "rHar;": "\u2964",
    "race;": "\u223d\u0331",
    "racute;": "\u0155",
    "radic;": "\u221a",
    "raemptyv;": "\u29b3",
    "rang;": "\u27e9",
    "rangd;": "\u2992",
    "range;": "\u29a5",
    "rangle;": "\u27e9",
    "raquo": "\xbb",
    "raquo;": "\xbb",
    "rarr;": "\u2192",
    "rarrap;": "\u2975",
    "rarrb;": "\u21e5",
    "rarrbfs;": "\u2920",
    "rarrc;": "\u2933",
    "rarrfs;": "\u291e",
    "rarrhk;": "\u21aa",
    "rarrlp;": "\u21ac",
    "rarrpl;": "\u2945",
    "rarrsim;": "\u2974",
    "rarrtl;": "\u21a3",
    "rarrw;": "\u219d",
    "ratail;": "\u291a",
    "ratio;": "\u2236",
    "rationals;": "\u211a",
    "rbarr;": "\u290d",
    "rbbrk;": "\u2773",
    "rbrace;": "}",
    "rbrack;": "]",
    "rbrke;": "\u298c",
    "rbrksld;": "\u298e",
    "rbrkslu;": "\u2990",
    "rcaron;": "\u0159",
    "rcedil;": "\u0157",
    "rceil;": "\u2309",
    "rcub;": "}",
    "rcy;": "\u0440",
    "rdca;": "\u2937",
    "rdldhar;": "\u2969",
    "rdquo;": "\u201d",
    "rdquor;": "\u201d",
    "rdsh;": "\u21b3",
    "real;": "\u211c",
    "realine;": "\u211b",
    "realpart;": "\u211c",
    "reals;": "\u211d",
    "rect;": "\u25ad",
    "reg": "\xae",
    "reg;": "\xae",
    "rfisht;": "\u297d",
    "rfloor;": "\u230b",
    "rfr;": "\U0001d52f",
    "rhard;": "\u21c1",
    "rharu;": "\u21c0",
    "rharul;": "\u296c",
    "rho;": "\u03c1",
    "rhov;": "\u03f1",
    "rightarrow;": "\u2192",
    "rightarrowtail;": "\u21a3",
    "rightharpoondown;": "\u21c1",
    "rightharpoonup;": "\u21c0",
    "rightleftarrows;": "\u21c4",
    "rightleftharpoons;": "\u21cc",
    "rightrightarrows;": "\u21c9",
    "rightsquigarrow;": "\u219d",
    "rightthreetimes;": "\u22cc",
    "ring;": "\u02da",
    "risingdotseq;": "\u2253",
    "rlarr;": "\u21c4",
    "rlhar;": "\u21cc",
    "rlm;": "\u200f",
    "rmoust;": "\u23b1",
    "rmoustache;": "\u23b1",
    "rnmid;": "\u2aee",
    "roang;": "\u27ed",
    "roarr;": "\u21fe",
    "robrk;": "\u27e7",
    "ropar;": "\u2986",
    "ropf;": "\U0001d563",
    "roplus;": "\u2a2e",
    "rotimes;": "\u2a35",
    "rpar;": ")",
    "rpargt;": "\u2994",
    "rppolint;": "\u2a12",
    "rrarr;": "\u21c9",
    "rsaquo;": "\u203a",
    "rscr;": "\U0001d4c7",
    "rsh;": "\u21b1",
    "rsqb;": "]",
    "rsquo;": "\u2019",
    "rsquor;": "\u2019",
    "rthree;": "\u22cc",
    "rtimes;": "\u22ca",
    "rtri;": "\u25b9",
    "rtrie;": "\u22b5",
    "rtrif;": "\u25b8",
    "rtriltri;": "\u29ce",
    "ruluhar;": "\u2968",
    "rx;": "\u211e",
    "sacute;": "\u015b",
    "sbquo;": "\u201a",
    "sc;": "\u227b",
    "scE;": "\u2ab4",
    "scap;": "\u2ab8",
    "scaron;": "\u0161",
    "sccue;": "\u227d",
    "sce;": "\u2ab0",
    "scedil;": "\u015f",
    "scirc;": "\u015d",
    "scnE;": "\u2ab6",
    "scnap;": "\u2aba",
    "scnsim;": "\u22e9",
    "scpolint;": "\u2a13",
    "scsim;": "\u227f",
    "scy;": "\u0441",
    "sdot;": "\u22c5",
    "sdotb;": "\u22a1",
    "sdote;": "\u2a66",
    "seArr;": "\u21d8",
    "searhk;": "\u2925",
    "searr;": "\u2198",
    "searrow;": "\u2198",
    "sect": "\xa7",
    "sect;": "\xa7",
    "semi;": ";",
    "seswar;": "\u2929",
    "setminus;": "\u2216",
    "setmn;": "\u2216",
    "sext;": "\u2736",
    "sfr;": "\U0001d530",
    "sfrown;": "\u2322",
    "sharp;": "\u266f",
    "shchcy;": "\u0449",
    "shcy;": "\u0448",
    "shortmid;": "\u2223",
    "shortparallel;": "\u2225",
    "shy": "\xad",
    "shy;": "\xad",
    "sigma;": "\u03c3",
    "sigmaf;": "\u03c2",
    "sigmav;": "\u03c2",
    "sim;": "\u223c",
    "simdot;": "\u2a6a",
    "sime;": "\u2243",
    "simeq;": "\u2243",
    "simg;": "\u2a9e",
    "simgE;": "\u2aa0",
    "siml;": "\u2a9d",
    "simlE;": "\u2a9f",
    "simne;": "\u2246",
    "simplus;": "\u2a24",
    "simrarr;": "\u2972",
    "slarr;": "\u2190",
    "smallsetminus;": "\u2216",
    "smashp;": "\u2a33",
    "smeparsl;": "\u29e4",
    "smid;": "\u2223",
    "smile;": "\u2323",
    "smt;": "\u2aaa",
    "smte;": "\u2aac",
    "smtes;": "\u2aac\ufe00",
    "softcy;": "\u044c",
    "sol;": "/",
    "solb;": "\u29c4",
    "solbar;": "\u233f",
    "sopf;": "\U0001d564",
    "spades;": "\u2660",
    "spadesuit;": "\u2660",
    "spar;": "\u2225",
    "sqcap;": "\u2293",
    "sqcaps;": "\u2293\ufe00",
    "sqcup;": "\u2294",
    "sqcups;": "\u2294\ufe00",
    "sqsub;": "\u228f",
    "sqsube;": "\u2291",
    "sqsubset;": "\u228f",
    "sqsubseteq;": "\u2291",
    "sqsup;": "\u2290",
    "sqsupe;": "\u2292",
    "sqsupset;": "\u2290",
    "sqsupseteq;": "\u2292",
    "squ;": "\u25a1",
    "square;": "\u25a1",
    "squarf;": "\u25aa",
    "squf;": "\u25aa",
    "srarr;": "\u2192",
    "sscr;": "\U0001d4c8",
    "ssetmn;": "\u2216",
    "ssmile;": "\u2323",
    "sstarf;": "\u22c6",
    "star;": "\u2606",
    "starf;": "\u2605",
    "straightepsilon;": "\u03f5",
    "straightphi;": "\u03d5",
    "strns;": "\xaf",
    "sub;": "\u2282",
    "subE;": "\u2ac5",
    "subdot;": "\u2abd",
    "sube;": "\u2286",
    "subedot;": "\u2ac3",
    "submult;": "\u2ac1",
    "subnE;": "\u2acb",
    "subne;": "\u228a",
    "subplus;": "\u2abf",
    "subrarr;": "\u2979",
    "subset;": "\u2282",
    "subseteq;": "\u2286",
    "subseteqq;": "\u2ac5",
    "subsetneq;": "\u228a",
    "subsetneqq;": "\u2acb",
    "subsim;": "\u2ac7",
    "subsub;": "\u2ad5",
    "subsup;": "\u2ad3",
    "succ;": "\u227b",
    "succapprox;": "\u2ab8",
    "succcurlyeq;": "\u227d",
    "succeq;": "\u2ab0",
    "succnapprox;": "\u2aba",
    "succneqq;": "\u2ab6",
    "succnsim;": "\u22e9",
    "succsim;": "\u227f",
    "sum;": "\u2211",
    "sung;": "\u266a",
    "sup1": "\xb9",
    "sup1;": "\xb9",
    "sup2": "\xb2",
    "sup2;": "\xb2",
    "sup3": "\xb3",
    "sup3;": "\xb3",
    "sup;": "\u2283",
    "supE;": "\u2ac6",
    "supdot;": "\u2abe",
    "supdsub;": "\u2ad8",
    "supe;": "\u2287",
    "supedot;": "\u2ac4",
    "suphsol;": "\u27c9",
    "suphsub;": "\u2ad7",
    "suplarr;": "\u297b",
    "supmult;": "\u2ac2",
    "supnE;": "\u2acc",
    "supne;": "\u228b",
    "supplus;": "\u2ac0",
    "supset;": "\u2283",
    "supseteq;": "\u2287",
    "supseteqq;": "\u2ac6",
    "supsetneq;": "\u228b",
    "supsetneqq;": "\u2acc",
    "supsim;": "\u2ac8",
    "supsub;": "\u2ad4",
    "supsup;": "\u2ad6",
    "swArr;": "\u21d9",
    "swarhk;": "\u2926",
    "swarr;": "\u2199",
    "swarrow;": "\u2199",
    "swnwar;": "\u292a",
    "szlig": "\xdf",
    "szlig;": "\xdf",
    "target;": "\u2316",
    "tau;": "\u03c4",
    "tbrk;": "\u23b4",
    "tcaron;": "\u0165",
    "tcedil;": "\u0163",
    "tcy;": "\u0442",
    "tdot;": "\u20db",
    "telrec;": "\u2315",
    "tfr;": "\U0001d531",
    "there4;": "\u2234",
    "therefore;": "\u2234",
    "theta;": "\u03b8",
    "thetasym;": "\u03d1",
    "thetav;": "\u03d1",
    "thickapprox;": "\u2248",
    "thicksim;": "\u223c",
    "thinsp;": "\u2009",
    "thkap;": "\u2248",
    "thksim;": "\u223c",
    "thorn": "\xfe",
    "thorn;": "\xfe",
    "tilde;": "\u02dc",
    "times": "\xd7",
    "times;": "\xd7",
    "timesb;": "\u22a0",
    "timesbar;": "\u2a31",
    "timesd;": "\u2a30",
    "tint;": "\u222d",
    "toea;": "\u2928",
    "top;": "\u22a4",
    "topbot;": "\u2336",
    "topcir;": "\u2af1",
    "topf;": "\U0001d565",
    "topfork;": "\u2ada",
    "tosa;": "\u2929",
    "tprime;": "\u2034",
    "trade;": "\u2122",
    "triangle;": "\u25b5",
    "triangledown;": "\u25bf",
    "triangleleft;": "\u25c3",
    "trianglelefteq;": "\u22b4",
    "triangleq;": "\u225c",
    "triangleright;": "\u25b9",
    "trianglerighteq;": "\u22b5",
    "tridot;": "\u25ec",
    "trie;": "\u225c",
    "triminus;": "\u2a3a",
    "triplus;": "\u2a39",
    "trisb;": "\u29cd",
    "tritime;": "\u2a3b",
    "trpezium;": "\u23e2",
    "tscr;": "\U0001d4c9",
    "tscy;": "\u0446",
    "tshcy;": "\u045b",
    "tstrok;": "\u0167",
    "twixt;": "\u226c",
    "twoheadleftarrow;": "\u219e",
    "twoheadrightarrow;": "\u21a0",
    "uArr;": "\u21d1",
    "uHar;": "\u2963",
    "uacute": "\xfa",
    "uacute;": "\xfa",
    "uarr;": "\u2191",
    "ubrcy;": "\u045e",
    "ubreve;": "\u016d",
    "ucirc": "\xfb",
    "ucirc;": "\xfb",
    "ucy;": "\u0443",
    "udarr;": "\u21c5",
    "udblac;": "\u0171",
    "udhar;": "\u296e",
    "ufisht;": "\u297e",
    "ufr;": "\U0001d532",
    "ugrave": "\xf9",
    "ugrave;": "\xf9",
    "uharl;": "\u21bf",
    "uharr;": "\u21be",
    "uhblk;": "\u2580",
    "ulcorn;": "\u231c",
    "ulcorner;": "\u231c",
    "ulcrop;": "\u230f",
    "ultri;": "\u25f8",
    "umacr;": "\u016b",
    "uml": "\xa8",
    "uml;": "\xa8",
    "uogon;": "\u0173",
    "uopf;": "\U0001d566",
    "uparrow;": "\u2191",
    "updownarrow;": "\u2195",
    "upharpoonleft;": "\u21bf",
    "upharpoonright;": "\u21be",
    "uplus;": "\u228e",
    "upsi;": "\u03c5",
    "upsih;": "\u03d2",
    "upsilon;": "\u03c5",
    "upuparrows;": "\u21c8",
    "urcorn;": "\u231d",
    "urcorner;": "\u231d",
    "urcrop;": "\u230e",
    "uring;": "\u016f",
    "urtri;": "\u25f9",
    "uscr;": "\U0001d4ca",
    "utdot;": "\u22f0",
    "utilde;": "\u0169",
    "utri;": "\u25b5",
    "utrif;": "\u25b4",
    "uuarr;": "\u21c8",
    "uuml": "\xfc",
    "uuml;": "\xfc",
    "uwangle;": "\u29a7",
    "vArr;": "\u21d5",
    "vBar;": "\u2ae8",
    "vBarv;": "\u2ae9",
    "vDash;": "\u22a8",
    "vangrt;": "\u299c",
    "varepsilon;": "\u03f5",
    "varkappa;": "\u03f0",
    "varnothing;": "\u2205",
    "varphi;": "\u03d5",
    "varpi;": "\u03d6",
    "varpropto;": "\u221d",
    "varr;": "\u2195",
    "varrho;": "\u03f1",
    "varsigma;": "\u03c2",
    "varsubsetneq;": "\u228a\ufe00",
    "varsubsetneqq;": "\u2acb\ufe00",
    "varsupsetneq;": "\u228b\ufe00",
    "varsupsetneqq;": "\u2acc\ufe00",
    "vartheta;": "\u03d1",
    "vartriangleleft;": "\u22b2",
    "vartriangleright;": "\u22b3",
    "vcy;": "\u0432",
    "vdash;": "\u22a2",
    "vee;": "\u2228",
    "veebar;": "\u22bb",
    "veeeq;": "\u225a",
    "vellip;": "\u22ee",
    "verbar;": "|",
    "vert;": "|",
    "vfr;": "\U0001d533",
    "vltri;": "\u22b2",
    "vnsub;": "\u2282\u20d2",
    "vnsup;": "\u2283\u20d2",
    "vopf;": "\U0001d567",
    "vprop;": "\u221d",
    "vrtri;": "\u22b3",
    "vscr;": "\U0001d4cb",
    "vsubnE;": "\u2acb\ufe00",
    "vsubne;": "\u228a\ufe00",
    "vsupnE;": "\u2acc\ufe00",
    "vsupne;": "\u228b\ufe00",
    "vzigzag;": "\u299a",
    "wcirc;": "\u0175",
    "wedbar;": "\u2a5f",
    "wedge;": "\u2227",
    "wedgeq;": "\u2259",
    "weierp;": "\u2118",
    "wfr;": "\U0001d534",
    "wopf;": "\U0001d568",
    "wp;": "\u2118",
    "wr;": "\u2240",
    "wreath;": "\u2240",
    "wscr;": "\U0001d4cc",
    "xcap;": "\u22c2",
    "xcirc;": "\u25ef",
    "xcup;": "\u22c3",
    "xdtri;": "\u25bd",
    "xfr;": "\U0001d535",
    "xhArr;": "\u27fa",
    "xharr;": "\u27f7",
    "xi;": "\u03be",
    "xlArr;": "\u27f8",
    "xlarr;": "\u27f5",
    "xmap;": "\u27fc",
    "xnis;": "\u22fb",
    "xodot;": "\u2a00",
    "xopf;": "\U0001d569",
    "xoplus;": "\u2a01",
    "xotime;": "\u2a02",
    "xrArr;": "\u27f9",
    "xrarr;": "\u27f6",
    "xscr;": "\U0001d4cd",
    "xsqcup;": "\u2a06",
    "xuplus;": "\u2a04",
    "xutri;": "\u25b3",
    "xvee;": "\u22c1",
    "xwedge;": "\u22c0",
    "yacute": "\xfd",
    "yacute;": "\xfd",
    "yacy;": "\u044f",
    "ycirc;": "\u0177",
    "ycy;": "\u044b",
    "yen": "\xa5",
    "yen;": "\xa5",
    "yfr;": "\U0001d536",
    "yicy;": "\u0457",
    "yopf;": "\U0001d56a",
    "yscr;": "\U0001d4ce",
    "yucy;": "\u044e",
    "yuml": "\xff",
    "yuml;": "\xff",
    "zacute;": "\u017a",
    "zcaron;": "\u017e",
    "zcy;": "\u0437",
    "zdot;": "\u017c",
    "zeetrf;": "\u2128",
    "zeta;": "\u03b6",
    "zfr;": "\U0001d537",
    "zhcy;": "\u0436",
    "zigrarr;": "\u21dd",
    "zopf;": "\U0001d56b",
    "zscr;": "\U0001d4cf",
    "zwj;": "\u200d",
    "zwnj;": "\u200c",
}

replacementCharacters = {
    0x0: "\uFFFD",
    0x0d: "\u000D",
    0x80: "\u20AC",
    0x81: "\u0081",
    0x82: "\u201A",
    0x83: "\u0192",
    0x84: "\u201E",
    0x85: "\u2026",
    0x86: "\u2020",
    0x87: "\u2021",
    0x88: "\u02C6",
    0x89: "\u2030",
    0x8A: "\u0160",
    0x8B: "\u2039",
    0x8C: "\u0152",
    0x8D: "\u008D",
    0x8E: "\u017D",
    0x8F: "\u008F",
    0x90: "\u0090",
    0x91: "\u2018",
    0x92: "\u2019",
    0x93: "\u201C",
    0x94: "\u201D",
    0x95: "\u2022",
    0x96: "\u2013",
    0x97: "\u2014",
    0x98: "\u02DC",
    0x99: "\u2122",
    0x9A: "\u0161",
    0x9B: "\u203A",
    0x9C: "\u0153",
    0x9D: "\u009D",
    0x9E: "\u017E",
    0x9F: "\u0178",
}

tokenTypes = {
    "Doctype": 0,
    "Characters": 1,
    "SpaceCharacters": 2,
    "StartTag": 3,
    "EndTag": 4,
    "EmptyTag": 5,
    "Comment": 6,
    "ParseError": 7
}

tagTokenTypes = frozenset([tokenTypes["StartTag"], tokenTypes["EndTag"],
                           tokenTypes["EmptyTag"]])


prefixes = dict([(v, k) for k, v in namespaces.items()])
prefixes["http://www.w3.org/1998/Math/MathML"] = "math"


class DataLossWarning(UserWarning):
    pass


class ReparseException(Exception):
    pass
site-packages/pip/_vendor/html5lib/_utils.py
from __future__ import absolute_import, division, unicode_literals

import sys
from types import ModuleType

from pip._vendor.six import text_type

try:
    import xml.etree.cElementTree as default_etree
except ImportError:
    import xml.etree.ElementTree as default_etree


__all__ = ["default_etree", "MethodDispatcher", "isSurrogatePair",
           "surrogatePairToCodepoint", "moduleFactoryFactory",
           "supports_lone_surrogates", "PY27"]


PY27 = sys.version_info[0] == 2 and sys.version_info[1] >= 7

# Platforms not supporting lone surrogates (\uD800-\uDFFF) should be
# caught by the below test. In general this would be any platform
# using UTF-16 as its encoding of unicode strings, such as
# Jython. This is because UTF-16 itself is based on the use of such
# surrogates, and there is no mechanism to escape them any further.
try:
    _x = eval('"\\uD800"')  # pylint:disable=eval-used
    if not isinstance(_x, text_type):
        # We need this with u"" because of http://bugs.jython.org/issue2039
        _x = eval('u"\\uD800"')  # pylint:disable=eval-used
        assert isinstance(_x, text_type)
except:  # pylint:disable=bare-except
    supports_lone_surrogates = False
else:
    supports_lone_surrogates = True
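
# Example (illustrative, not part of html5lib): callers can branch on this
# flag before emitting lone surrogates, e.g.
#
#     if supports_lone_surrogates:
#         data = "\uD800"   # kept as-is on builds that allow lone surrogates
#     else:
#         data = "\uFFFD"   # substituted where lone surrogates cannot exist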


class MethodDispatcher(dict):
    """Dict with 2 special properties:

    On instantiation, keys that are lists, sets or tuples are expanded into
    multiple keys, so accessing any one of the items in the original
    list-like object returns the matching value.

    md = MethodDispatcher({("foo", "bar"):"baz"})
    md["foo"] == "baz"

    A default value can be set through the default attribute.
    """

    def __init__(self, items=()):
        # Using _dictEntries instead of directly assigning to self is about
        # twice as fast. Please do careful performance testing before changing
        # anything here.
        _dictEntries = []
        for name, value in items:
            if isinstance(name, (list, tuple, frozenset, set)):
                for item in name:
                    _dictEntries.append((item, value))
            else:
                _dictEntries.append((name, value))
        dict.__init__(self, _dictEntries)
        assert len(self) == len(_dictEntries)
        self.default = None

    def __getitem__(self, key):
        return dict.get(self, key, self.default)
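

# Minimal usage sketch (not part of the vendored module), using hypothetical
# handler values: list/tuple/set keys are expanded at construction time and
# lookups of unknown keys fall back to ``default``.
def _example_method_dispatcher():
    dispatcher = MethodDispatcher([
        (("img", "br", "hr"), "void-element"),
        ("p", "paragraph"),
    ])
    dispatcher.default = "unknown"
    assert dispatcher["br"] == "void-element"
    assert dispatcher["p"] == "paragraph"
    assert dispatcher["video"] == "unknown"   # missing key -> default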


# Some utility functions to deal with weirdness around UCS2 vs UCS4
# python builds

def isSurrogatePair(data):
    return (len(data) == 2 and
            ord(data[0]) >= 0xD800 and ord(data[0]) <= 0xDBFF and
            ord(data[1]) >= 0xDC00 and ord(data[1]) <= 0xDFFF)


def surrogatePairToCodepoint(data):
    char_val = (0x10000 + (ord(data[0]) - 0xD800) * 0x400 +
                (ord(data[1]) - 0xDC00))
    return char_val
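
# Brief sketch (not part of the vendored module): on narrow (UCS-2) builds a
# supplementary character arrives as two code units; these helpers detect the
# pair and recombine it into a single code point.
def _example_surrogates():
    pair = "\ud83d\ude00"            # the two halves of U+1F600
    if isSurrogatePair(pair):
        assert surrogatePairToCodepoint(pair) == 0x1F600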

# Module Factory Factory (no, this isn't Java, I know)
# Here to stop this being duplicated all over the place.


def moduleFactoryFactory(factory):
    moduleCache = {}

    def moduleFactory(baseModule, *args, **kwargs):
        if isinstance(ModuleType.__name__, type("")):
            name = "_%s_factory" % baseModule.__name__
        else:
            name = b"_%s_factory" % baseModule.__name__

        kwargs_tuple = tuple(kwargs.items())

        try:
            return moduleCache[name][args][kwargs_tuple]
        except KeyError:
            mod = ModuleType(name)
            objs = factory(baseModule, *args, **kwargs)
            mod.__dict__.update(objs)
            if "name" not in moduleCache:
                moduleCache[name] = {}
            if "args" not in moduleCache[name]:
                moduleCache[name][args] = {}
            if "kwargs" not in moduleCache[name][args]:
                moduleCache[name][args][kwargs_tuple] = {}
            moduleCache[name][args][kwargs_tuple] = mod
            return mod

    return moduleFactory


def memoize(func):
    cache = {}

    def wrapped(*args, **kwargs):
        key = (tuple(args), tuple(kwargs.items()))
        if key not in cache:
            cache[key] = func(*args, **kwargs)
        return cache[key]

    return wrapped
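

# Usage sketch (not part of the vendored module): ``memoize`` caches results
# keyed on the positional and keyword arguments, so repeated calls with the
# same (hashable) arguments skip the wrapped function.
def _example_memoize():
    calls = []

    @memoize
    def square(x):
        calls.append(x)              # record real invocations
        return x * x

    assert square(3) == 9 and square(3) == 9
    assert calls == [3]              # second call served from the cache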
site-packages/pip/_vendor/html5lib/serializer.py000064400000033541147511334610015766 0ustar00from __future__ import absolute_import, division, unicode_literals
from pip._vendor.six import text_type

import re

from codecs import register_error, xmlcharrefreplace_errors

from .constants import voidElements, booleanAttributes, spaceCharacters
from .constants import rcdataElements, entities, xmlEntities
from . import treewalkers, _utils
from xml.sax.saxutils import escape

_quoteAttributeSpecChars = "".join(spaceCharacters) + "\"'=<>`"
_quoteAttributeSpec = re.compile("[" + _quoteAttributeSpecChars + "]")
_quoteAttributeLegacy = re.compile("[" + _quoteAttributeSpecChars +
                                   "\x00\x01\x02\x03\x04\x05\x06\x07\x08\t\n"
                                   "\x0b\x0c\r\x0e\x0f\x10\x11\x12\x13\x14\x15"
                                   "\x16\x17\x18\x19\x1a\x1b\x1c\x1d\x1e\x1f"
                                   "\x20\x2f\x60\xa0\u1680\u180e\u180f\u2000"
                                   "\u2001\u2002\u2003\u2004\u2005\u2006\u2007"
                                   "\u2008\u2009\u200a\u2028\u2029\u202f\u205f"
                                   "\u3000]")


_encode_entity_map = {}
_is_ucs4 = len("\U0010FFFF") == 1
for k, v in list(entities.items()):
    # skip multi-character entities
    if ((_is_ucs4 and len(v) > 1) or
            (not _is_ucs4 and len(v) > 2)):
        continue
    if v != "&":
        if len(v) == 2:
            v = _utils.surrogatePairToCodepoint(v)
        else:
            v = ord(v)
        if v not in _encode_entity_map or k.islower():
            # prefer &lt; over &LT; and similarly for &amp;, &gt;, etc.
            _encode_entity_map[v] = k


def htmlentityreplace_errors(exc):
    if isinstance(exc, (UnicodeEncodeError, UnicodeTranslateError)):
        res = []
        codepoints = []
        skip = False
        for i, c in enumerate(exc.object[exc.start:exc.end]):
            if skip:
                skip = False
                continue
            index = i + exc.start
            if _utils.isSurrogatePair(exc.object[index:min([exc.end, index + 2])]):
                codepoint = _utils.surrogatePairToCodepoint(exc.object[index:index + 2])
                skip = True
            else:
                codepoint = ord(c)
            codepoints.append(codepoint)
        for cp in codepoints:
            e = _encode_entity_map.get(cp)
            if e:
                res.append("&")
                res.append(e)
                if not e.endswith(";"):
                    res.append(";")
            else:
                res.append("&#x%s;" % (hex(cp)[2:]))
        return ("".join(res), exc.end)
    else:
        return xmlcharrefreplace_errors(exc)

register_error("htmlentityreplace", htmlentityreplace_errors)


def serialize(input, tree="etree", encoding=None, **serializer_opts):
    # XXX: Should we cache this?
    walker = treewalkers.getTreeWalker(tree)
    s = HTMLSerializer(**serializer_opts)
    return s.render(walker(input), encoding)
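

# Usage sketch (not part of the vendored module): the convenience wrapper
# walks an already-parsed tree and re-serializes it, so serializer options
# such as ``quote_attr_values`` can be applied in one call. Output details
# depend on the tree builder and options chosen.
def _example_serialize():
    from pip._vendor import html5lib
    dom = html5lib.parse("<p class=x>hi there")
    return html5lib.serialize(dom, tree="etree", quote_attr_values="always")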


class HTMLSerializer(object):

    # attribute quoting options
    quote_attr_values = "legacy"  # be secure by default
    quote_char = '"'
    use_best_quote_char = True

    # tag syntax options
    omit_optional_tags = True
    minimize_boolean_attributes = True
    use_trailing_solidus = False
    space_before_trailing_solidus = True

    # escaping options
    escape_lt_in_attrs = False
    escape_rcdata = False
    resolve_entities = True

    # miscellaneous options
    alphabetical_attributes = False
    inject_meta_charset = True
    strip_whitespace = False
    sanitize = False

    options = ("quote_attr_values", "quote_char", "use_best_quote_char",
               "omit_optional_tags", "minimize_boolean_attributes",
               "use_trailing_solidus", "space_before_trailing_solidus",
               "escape_lt_in_attrs", "escape_rcdata", "resolve_entities",
               "alphabetical_attributes", "inject_meta_charset",
               "strip_whitespace", "sanitize")

    def __init__(self, **kwargs):
        """Initialize HTMLSerializer.

        Keyword options (default given first unless specified) include:

        inject_meta_charset=True|False
          Whether or not to inject a meta element that declares the character
          set of the document.
        quote_attr_values="legacy"|"spec"|"always"
          Whether to quote attribute values that don't require quoting
          per legacy browser behaviour, when required by the standard, or always.
        quote_char=u'"'|u"'"
          Use given quote character for attribute quoting. Default is to
          use double quote unless attribute value contains a double quote,
          in which case single quotes are used instead.
        escape_lt_in_attrs=False|True
          Whether to escape < in attribute values.
        escape_rcdata=False|True
          Whether to escape characters that need to be escaped within normal
          elements within rcdata elements such as style.
        resolve_entities=True|False
          Whether to resolve named character entities that appear in the
          source tree. The XML predefined entities &lt; &gt; &amp; &quot; &apos;
          are unaffected by this setting.
        strip_whitespace=False|True
          Whether to remove semantically meaningless whitespace. (This
          compresses all whitespace to a single space except within pre.)
        minimize_boolean_attributes=True|False
          Shortens boolean attributes to give just the attribute value,
          for example <input disabled="disabled"> becomes <input disabled>.
        use_trailing_solidus=False|True
          Includes a close-tag slash at the end of the start tag of void
          elements (empty elements whose end tag is forbidden). E.g. <hr/>.
        space_before_trailing_solidus=True|False
          Places a space immediately before the closing slash in a tag
          using a trailing solidus. E.g. <hr />. Requires use_trailing_solidus.
        sanitize=False|True
          Strip all unsafe or unknown constructs from output.
          See `html5lib user documentation`_
        omit_optional_tags=True|False
          Omit start/end tags that are optional.
        alphabetical_attributes=False|True
          Reorder attributes to be in alphabetical order.

        .. _html5lib user documentation: http://code.google.com/p/html5lib/wiki/UserDocumentation
        """
        unexpected_args = frozenset(kwargs) - frozenset(self.options)
        if len(unexpected_args) > 0:
            raise TypeError("__init__() got an unexpected keyword argument '%s'" % next(iter(unexpected_args)))
        if 'quote_char' in kwargs:
            self.use_best_quote_char = False
        for attr in self.options:
            setattr(self, attr, kwargs.get(attr, getattr(self, attr)))
        self.errors = []
        self.strict = False

    def encode(self, string):
        assert(isinstance(string, text_type))
        if self.encoding:
            return string.encode(self.encoding, "htmlentityreplace")
        else:
            return string

    def encodeStrict(self, string):
        assert(isinstance(string, text_type))
        if self.encoding:
            return string.encode(self.encoding, "strict")
        else:
            return string

    def serialize(self, treewalker, encoding=None):
        # pylint:disable=too-many-nested-blocks
        self.encoding = encoding
        in_cdata = False
        self.errors = []

        if encoding and self.inject_meta_charset:
            from .filters.inject_meta_charset import Filter
            treewalker = Filter(treewalker, encoding)
        # The alphabetical-attributes filter runs here on the assumption that
        # none of the later filters adds attributes or changes their order; it
        # needs to run before the sanitizer so escaped elements come out
        # correctly.
        if self.alphabetical_attributes:
            from .filters.alphabeticalattributes import Filter
            treewalker = Filter(treewalker)
        # WhitespaceFilter should be used before OptionalTagFilter
        # for maximum efficiency of the latter filter
        if self.strip_whitespace:
            from .filters.whitespace import Filter
            treewalker = Filter(treewalker)
        if self.sanitize:
            from .filters.sanitizer import Filter
            treewalker = Filter(treewalker)
        if self.omit_optional_tags:
            from .filters.optionaltags import Filter
            treewalker = Filter(treewalker)

        for token in treewalker:
            type = token["type"]
            if type == "Doctype":
                doctype = "<!DOCTYPE %s" % token["name"]

                if token["publicId"]:
                    doctype += ' PUBLIC "%s"' % token["publicId"]
                elif token["systemId"]:
                    doctype += " SYSTEM"
                if token["systemId"]:
                    if token["systemId"].find('"') >= 0:
                        if token["systemId"].find("'") >= 0:
                            self.serializeError("System identifer contains both single and double quote characters")
                        quote_char = "'"
                    else:
                        quote_char = '"'
                    doctype += " %s%s%s" % (quote_char, token["systemId"], quote_char)

                doctype += ">"
                yield self.encodeStrict(doctype)

            elif type in ("Characters", "SpaceCharacters"):
                if type == "SpaceCharacters" or in_cdata:
                    if in_cdata and token["data"].find("</") >= 0:
                        self.serializeError("Unexpected </ in CDATA")
                    yield self.encode(token["data"])
                else:
                    yield self.encode(escape(token["data"]))

            elif type in ("StartTag", "EmptyTag"):
                name = token["name"]
                yield self.encodeStrict("<%s" % name)
                if name in rcdataElements and not self.escape_rcdata:
                    in_cdata = True
                elif in_cdata:
                    self.serializeError("Unexpected child element of a CDATA element")
                for (_, attr_name), attr_value in token["data"].items():
                    # TODO: Add namespace support here
                    k = attr_name
                    v = attr_value
                    yield self.encodeStrict(' ')

                    yield self.encodeStrict(k)
                    if not self.minimize_boolean_attributes or \
                        (k not in booleanAttributes.get(name, tuple()) and
                         k not in booleanAttributes.get("", tuple())):
                        yield self.encodeStrict("=")
                        if self.quote_attr_values == "always" or len(v) == 0:
                            quote_attr = True
                        elif self.quote_attr_values == "spec":
                            quote_attr = _quoteAttributeSpec.search(v) is not None
                        elif self.quote_attr_values == "legacy":
                            quote_attr = _quoteAttributeLegacy.search(v) is not None
                        else:
                            raise ValueError("quote_attr_values must be one of: "
                                             "'always', 'spec', or 'legacy'")
                        v = v.replace("&", "&amp;")
                        if self.escape_lt_in_attrs:
                            v = v.replace("<", "&lt;")
                        if quote_attr:
                            quote_char = self.quote_char
                            if self.use_best_quote_char:
                                if "'" in v and '"' not in v:
                                    quote_char = '"'
                                elif '"' in v and "'" not in v:
                                    quote_char = "'"
                            if quote_char == "'":
                                v = v.replace("'", "&#39;")
                            else:
                                v = v.replace('"', "&quot;")
                            yield self.encodeStrict(quote_char)
                            yield self.encode(v)
                            yield self.encodeStrict(quote_char)
                        else:
                            yield self.encode(v)
                if name in voidElements and self.use_trailing_solidus:
                    if self.space_before_trailing_solidus:
                        yield self.encodeStrict(" /")
                    else:
                        yield self.encodeStrict("/")
                yield self.encode(">")

            elif type == "EndTag":
                name = token["name"]
                if name in rcdataElements:
                    in_cdata = False
                elif in_cdata:
                    self.serializeError("Unexpected child element of a CDATA element")
                yield self.encodeStrict("</%s>" % name)

            elif type == "Comment":
                data = token["data"]
                if data.find("--") >= 0:
                    self.serializeError("Comment contains --")
                yield self.encodeStrict("<!--%s-->" % token["data"])

            elif type == "Entity":
                name = token["name"]
                key = name + ";"
                if key not in entities:
                    self.serializeError("Entity %s not recognized" % name)
                if self.resolve_entities and key not in xmlEntities:
                    data = entities[key]
                else:
                    data = "&%s;" % name
                yield self.encodeStrict(data)

            else:
                self.serializeError(token["data"])

    def render(self, treewalker, encoding=None):
        if encoding:
            return b"".join(list(self.serialize(treewalker, encoding)))
        else:
            return "".join(list(self.serialize(treewalker)))

    def serializeError(self, data="XXX ERROR MESSAGE NEEDED"):
        # XXX The idea is to make data mandatory.
        self.errors.append(data)
        if self.strict:
            raise SerializeError


class SerializeError(Exception):
    """Error in serialized tree"""
    pass
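

# Small sketch (not part of the vendored module): serializer options are
# validated at construction time, so a misspelled keyword fails fast, and
# passing ``quote_char`` disables automatic quote-character selection.
def _example_serializer_options():
    s = HTMLSerializer(omit_optional_tags=False, quote_char="'")
    assert s.use_best_quote_char is False
    try:
        HTMLSerializer(omit_optional_tag=False)   # typo: missing "s"
    except TypeError:
        pass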
site-packages/pip/_vendor/html5lib/_trie/datrie.py000064400000002232147511334610016160 0ustar00from __future__ import absolute_import, division, unicode_literals

from datrie import Trie as DATrie
from pip._vendor.six import text_type

from ._base import Trie as ABCTrie


class Trie(ABCTrie):
    def __init__(self, data):
        chars = set()
        for key in data.keys():
            if not isinstance(key, text_type):
                raise TypeError("All keys must be strings")
            for char in key:
                chars.add(char)

        self._data = DATrie("".join(chars))
        for key, value in data.items():
            self._data[key] = value

    def __contains__(self, key):
        return key in self._data

    def __len__(self):
        return len(self._data)

    def __iter__(self):
        raise NotImplementedError()

    def __getitem__(self, key):
        return self._data[key]

    def keys(self, prefix=None):
        return self._data.keys(prefix)

    def has_keys_with_prefix(self, prefix):
        return self._data.has_keys_with_prefix(prefix)

    def longest_prefix(self, prefix):
        return self._data.longest_prefix(prefix)

    def longest_prefix_item(self, prefix):
        return self._data.longest_prefix_item(prefix)
site-packages/pip/_vendor/html5lib/_trie/__init__.py000064400000000441147511334610016447 0ustar00from __future__ import absolute_import, division, unicode_literals

from .py import Trie as PyTrie

Trie = PyTrie

# pylint:disable=wrong-import-position
try:
    from .datrie import Trie as DATrie
except ImportError:
    pass
else:
    Trie = DATrie
# pylint:enable=wrong-import-position
site-packages/pip/_vendor/html5lib/_trie/_base.py000064400000001723147511334610015765 0ustar00from __future__ import absolute_import, division, unicode_literals

try:
    from collections.abc import Mapping
except ImportError:  # Python 2
    from collections import Mapping


class Trie(Mapping):
    """Abstract base class for tries"""

    def keys(self, prefix=None):
        # pylint:disable=arguments-differ
        keys = super(Trie, self).keys()

        if prefix is None:
            return set(keys)

        # Python 2.6: no set comprehensions
        return set([x for x in keys if x.startswith(prefix)])

    def has_keys_with_prefix(self, prefix):
        for key in self.keys():
            if key.startswith(prefix):
                return True

        return False

    def longest_prefix(self, prefix):
        if prefix in self:
            return prefix

        for i in range(1, len(prefix) + 1):
            if prefix[:-i] in self:
                return prefix[:-i]

        raise KeyError(prefix)

    def longest_prefix_item(self, prefix):
        lprefix = self.longest_prefix(prefix)
        return (lprefix, self[lprefix])
site-packages/pip/_vendor/html5lib/_trie/py.py000064400000003357147511334610015351 0ustar00from __future__ import absolute_import, division, unicode_literals
from pip._vendor.six import text_type

from bisect import bisect_left

from ._base import Trie as ABCTrie


class Trie(ABCTrie):
    def __init__(self, data):
        if not all(isinstance(x, text_type) for x in data.keys()):
            raise TypeError("All keys must be strings")

        self._data = data
        self._keys = sorted(data.keys())
        self._cachestr = ""
        self._cachepoints = (0, len(data))

    def __contains__(self, key):
        return key in self._data

    def __len__(self):
        return len(self._data)

    def __iter__(self):
        return iter(self._data)

    def __getitem__(self, key):
        return self._data[key]

    def keys(self, prefix=None):
        if prefix is None or prefix == "" or not self._keys:
            return set(self._keys)

        if prefix.startswith(self._cachestr):
            lo, hi = self._cachepoints
            start = i = bisect_left(self._keys, prefix, lo, hi)
        else:
            start = i = bisect_left(self._keys, prefix)

        keys = set()
        if start == len(self._keys):
            return keys

        # Guard the index: every remaining key may share the prefix.
        while i < len(self._keys) and self._keys[i].startswith(prefix):
            keys.add(self._keys[i])
            i += 1

        self._cachestr = prefix
        self._cachepoints = (start, i)

        return keys

    def has_keys_with_prefix(self, prefix):
        if prefix in self._data:
            return True

        if prefix.startswith(self._cachestr):
            lo, hi = self._cachepoints
            i = bisect_left(self._keys, prefix, lo, hi)
        else:
            i = bisect_left(self._keys, prefix)

        if i == len(self._keys):
            return False

        return self._keys[i].startswith(prefix)
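
# Usage sketch (not part of the vendored module): the pure-Python Trie backs
# entity lookup; ``keys(prefix)``, ``has_keys_with_prefix`` and
# ``longest_prefix`` are the operations the tokenizer relies on.
def _example_trie():
    t = Trie({"amp": "&", "ampere": "A", "angle": "\u2220"})
    assert t.keys("amp") == {"amp", "ampere"}
    assert t.longest_prefix("ampersand") == "amp"
    assert t.has_keys_with_prefix("ang") is True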
site-packages/pip/_vendor/html5lib/_trie/__pycache__/ [compiled .cpython-36.pyc bytecode for __init__, _base, datrie and py (and their .opt-1 variants) omitted]
site-packages/pip/_vendor/html5lib/__init__.py000064400000001414147511334610015346 0ustar00"""
HTML parsing library based on the WHATWG "HTML5"
specification. The parser is designed to be compatible with existing
HTML found in the wild and implements well-defined error recovery that
is largely compatible with modern desktop web browsers.

Example usage:

import html5lib
f = open("my_document.html")
tree = html5lib.parse(f)
"""

from __future__ import absolute_import, division, unicode_literals

from .html5parser import HTMLParser, parse, parseFragment
from .treebuilders import getTreeBuilder
from .treewalkers import getTreeWalker
from .serializer import serialize

__all__ = ["HTMLParser", "parse", "parseFragment", "getTreeBuilder",
           "getTreeWalker", "serialize"]

# this has to be at the top level, see how setup.py parses this
__version__ = "1.0b10"
site-packages/pip/_vendor/html5lib/filters/sanitizer.py000064400000061030147511334610017267 0ustar00from __future__ import absolute_import, division, unicode_literals

import re
from xml.sax.saxutils import escape, unescape

from pip._vendor.six.moves import urllib_parse as urlparse

from . import base
from ..constants import namespaces, prefixes

__all__ = ["Filter"]


allowed_elements = frozenset((
    (namespaces['html'], 'a'),
    (namespaces['html'], 'abbr'),
    (namespaces['html'], 'acronym'),
    (namespaces['html'], 'address'),
    (namespaces['html'], 'area'),
    (namespaces['html'], 'article'),
    (namespaces['html'], 'aside'),
    (namespaces['html'], 'audio'),
    (namespaces['html'], 'b'),
    (namespaces['html'], 'big'),
    (namespaces['html'], 'blockquote'),
    (namespaces['html'], 'br'),
    (namespaces['html'], 'button'),
    (namespaces['html'], 'canvas'),
    (namespaces['html'], 'caption'),
    (namespaces['html'], 'center'),
    (namespaces['html'], 'cite'),
    (namespaces['html'], 'code'),
    (namespaces['html'], 'col'),
    (namespaces['html'], 'colgroup'),
    (namespaces['html'], 'command'),
    (namespaces['html'], 'datagrid'),
    (namespaces['html'], 'datalist'),
    (namespaces['html'], 'dd'),
    (namespaces['html'], 'del'),
    (namespaces['html'], 'details'),
    (namespaces['html'], 'dfn'),
    (namespaces['html'], 'dialog'),
    (namespaces['html'], 'dir'),
    (namespaces['html'], 'div'),
    (namespaces['html'], 'dl'),
    (namespaces['html'], 'dt'),
    (namespaces['html'], 'em'),
    (namespaces['html'], 'event-source'),
    (namespaces['html'], 'fieldset'),
    (namespaces['html'], 'figcaption'),
    (namespaces['html'], 'figure'),
    (namespaces['html'], 'footer'),
    (namespaces['html'], 'font'),
    (namespaces['html'], 'form'),
    (namespaces['html'], 'header'),
    (namespaces['html'], 'h1'),
    (namespaces['html'], 'h2'),
    (namespaces['html'], 'h3'),
    (namespaces['html'], 'h4'),
    (namespaces['html'], 'h5'),
    (namespaces['html'], 'h6'),
    (namespaces['html'], 'hr'),
    (namespaces['html'], 'i'),
    (namespaces['html'], 'img'),
    (namespaces['html'], 'input'),
    (namespaces['html'], 'ins'),
    (namespaces['html'], 'keygen'),
    (namespaces['html'], 'kbd'),
    (namespaces['html'], 'label'),
    (namespaces['html'], 'legend'),
    (namespaces['html'], 'li'),
    (namespaces['html'], 'm'),
    (namespaces['html'], 'map'),
    (namespaces['html'], 'menu'),
    (namespaces['html'], 'meter'),
    (namespaces['html'], 'multicol'),
    (namespaces['html'], 'nav'),
    (namespaces['html'], 'nextid'),
    (namespaces['html'], 'ol'),
    (namespaces['html'], 'output'),
    (namespaces['html'], 'optgroup'),
    (namespaces['html'], 'option'),
    (namespaces['html'], 'p'),
    (namespaces['html'], 'pre'),
    (namespaces['html'], 'progress'),
    (namespaces['html'], 'q'),
    (namespaces['html'], 's'),
    (namespaces['html'], 'samp'),
    (namespaces['html'], 'section'),
    (namespaces['html'], 'select'),
    (namespaces['html'], 'small'),
    (namespaces['html'], 'sound'),
    (namespaces['html'], 'source'),
    (namespaces['html'], 'spacer'),
    (namespaces['html'], 'span'),
    (namespaces['html'], 'strike'),
    (namespaces['html'], 'strong'),
    (namespaces['html'], 'sub'),
    (namespaces['html'], 'sup'),
    (namespaces['html'], 'table'),
    (namespaces['html'], 'tbody'),
    (namespaces['html'], 'td'),
    (namespaces['html'], 'textarea'),
    (namespaces['html'], 'time'),
    (namespaces['html'], 'tfoot'),
    (namespaces['html'], 'th'),
    (namespaces['html'], 'thead'),
    (namespaces['html'], 'tr'),
    (namespaces['html'], 'tt'),
    (namespaces['html'], 'u'),
    (namespaces['html'], 'ul'),
    (namespaces['html'], 'var'),
    (namespaces['html'], 'video'),
    (namespaces['mathml'], 'maction'),
    (namespaces['mathml'], 'math'),
    (namespaces['mathml'], 'merror'),
    (namespaces['mathml'], 'mfrac'),
    (namespaces['mathml'], 'mi'),
    (namespaces['mathml'], 'mmultiscripts'),
    (namespaces['mathml'], 'mn'),
    (namespaces['mathml'], 'mo'),
    (namespaces['mathml'], 'mover'),
    (namespaces['mathml'], 'mpadded'),
    (namespaces['mathml'], 'mphantom'),
    (namespaces['mathml'], 'mprescripts'),
    (namespaces['mathml'], 'mroot'),
    (namespaces['mathml'], 'mrow'),
    (namespaces['mathml'], 'mspace'),
    (namespaces['mathml'], 'msqrt'),
    (namespaces['mathml'], 'mstyle'),
    (namespaces['mathml'], 'msub'),
    (namespaces['mathml'], 'msubsup'),
    (namespaces['mathml'], 'msup'),
    (namespaces['mathml'], 'mtable'),
    (namespaces['mathml'], 'mtd'),
    (namespaces['mathml'], 'mtext'),
    (namespaces['mathml'], 'mtr'),
    (namespaces['mathml'], 'munder'),
    (namespaces['mathml'], 'munderover'),
    (namespaces['mathml'], 'none'),
    (namespaces['svg'], 'a'),
    (namespaces['svg'], 'animate'),
    (namespaces['svg'], 'animateColor'),
    (namespaces['svg'], 'animateMotion'),
    (namespaces['svg'], 'animateTransform'),
    (namespaces['svg'], 'clipPath'),
    (namespaces['svg'], 'circle'),
    (namespaces['svg'], 'defs'),
    (namespaces['svg'], 'desc'),
    (namespaces['svg'], 'ellipse'),
    (namespaces['svg'], 'font-face'),
    (namespaces['svg'], 'font-face-name'),
    (namespaces['svg'], 'font-face-src'),
    (namespaces['svg'], 'g'),
    (namespaces['svg'], 'glyph'),
    (namespaces['svg'], 'hkern'),
    (namespaces['svg'], 'linearGradient'),
    (namespaces['svg'], 'line'),
    (namespaces['svg'], 'marker'),
    (namespaces['svg'], 'metadata'),
    (namespaces['svg'], 'missing-glyph'),
    (namespaces['svg'], 'mpath'),
    (namespaces['svg'], 'path'),
    (namespaces['svg'], 'polygon'),
    (namespaces['svg'], 'polyline'),
    (namespaces['svg'], 'radialGradient'),
    (namespaces['svg'], 'rect'),
    (namespaces['svg'], 'set'),
    (namespaces['svg'], 'stop'),
    (namespaces['svg'], 'svg'),
    (namespaces['svg'], 'switch'),
    (namespaces['svg'], 'text'),
    (namespaces['svg'], 'title'),
    (namespaces['svg'], 'tspan'),
    (namespaces['svg'], 'use'),
))

allowed_attributes = frozenset((
    # HTML attributes
    (None, 'abbr'),
    (None, 'accept'),
    (None, 'accept-charset'),
    (None, 'accesskey'),
    (None, 'action'),
    (None, 'align'),
    (None, 'alt'),
    (None, 'autocomplete'),
    (None, 'autofocus'),
    (None, 'axis'),
    (None, 'background'),
    (None, 'balance'),
    (None, 'bgcolor'),
    (None, 'bgproperties'),
    (None, 'border'),
    (None, 'bordercolor'),
    (None, 'bordercolordark'),
    (None, 'bordercolorlight'),
    (None, 'bottompadding'),
    (None, 'cellpadding'),
    (None, 'cellspacing'),
    (None, 'ch'),
    (None, 'challenge'),
    (None, 'char'),
    (None, 'charoff'),
    (None, 'choff'),
    (None, 'charset'),
    (None, 'checked'),
    (None, 'cite'),
    (None, 'class'),
    (None, 'clear'),
    (None, 'color'),
    (None, 'cols'),
    (None, 'colspan'),
    (None, 'compact'),
    (None, 'contenteditable'),
    (None, 'controls'),
    (None, 'coords'),
    (None, 'data'),
    (None, 'datafld'),
    (None, 'datapagesize'),
    (None, 'datasrc'),
    (None, 'datetime'),
    (None, 'default'),
    (None, 'delay'),
    (None, 'dir'),
    (None, 'disabled'),
    (None, 'draggable'),
    (None, 'dynsrc'),
    (None, 'enctype'),
    (None, 'end'),
    (None, 'face'),
    (None, 'for'),
    (None, 'form'),
    (None, 'frame'),
    (None, 'galleryimg'),
    (None, 'gutter'),
    (None, 'headers'),
    (None, 'height'),
    (None, 'hidefocus'),
    (None, 'hidden'),
    (None, 'high'),
    (None, 'href'),
    (None, 'hreflang'),
    (None, 'hspace'),
    (None, 'icon'),
    (None, 'id'),
    (None, 'inputmode'),
    (None, 'ismap'),
    (None, 'keytype'),
    (None, 'label'),
    (None, 'leftspacing'),
    (None, 'lang'),
    (None, 'list'),
    (None, 'longdesc'),
    (None, 'loop'),
    (None, 'loopcount'),
    (None, 'loopend'),
    (None, 'loopstart'),
    (None, 'low'),
    (None, 'lowsrc'),
    (None, 'max'),
    (None, 'maxlength'),
    (None, 'media'),
    (None, 'method'),
    (None, 'min'),
    (None, 'multiple'),
    (None, 'name'),
    (None, 'nohref'),
    (None, 'noshade'),
    (None, 'nowrap'),
    (None, 'open'),
    (None, 'optimum'),
    (None, 'pattern'),
    (None, 'ping'),
    (None, 'point-size'),
    (None, 'poster'),
    (None, 'pqg'),
    (None, 'preload'),
    (None, 'prompt'),
    (None, 'radiogroup'),
    (None, 'readonly'),
    (None, 'rel'),
    (None, 'repeat-max'),
    (None, 'repeat-min'),
    (None, 'replace'),
    (None, 'required'),
    (None, 'rev'),
    (None, 'rightspacing'),
    (None, 'rows'),
    (None, 'rowspan'),
    (None, 'rules'),
    (None, 'scope'),
    (None, 'selected'),
    (None, 'shape'),
    (None, 'size'),
    (None, 'span'),
    (None, 'src'),
    (None, 'start'),
    (None, 'step'),
    (None, 'style'),
    (None, 'summary'),
    (None, 'suppress'),
    (None, 'tabindex'),
    (None, 'target'),
    (None, 'template'),
    (None, 'title'),
    (None, 'toppadding'),
    (None, 'type'),
    (None, 'unselectable'),
    (None, 'usemap'),
    (None, 'urn'),
    (None, 'valign'),
    (None, 'value'),
    (None, 'variable'),
    (None, 'volume'),
    (None, 'vspace'),
    (None, 'vrml'),
    (None, 'width'),
    (None, 'wrap'),
    (namespaces['xml'], 'lang'),
    # MathML attributes
    (None, 'actiontype'),
    (None, 'align'),
    (None, 'columnalign'),
    (None, 'columnalign'),
    (None, 'columnalign'),
    (None, 'columnlines'),
    (None, 'columnspacing'),
    (None, 'columnspan'),
    (None, 'depth'),
    (None, 'display'),
    (None, 'displaystyle'),
    (None, 'equalcolumns'),
    (None, 'equalrows'),
    (None, 'fence'),
    (None, 'fontstyle'),
    (None, 'fontweight'),
    (None, 'frame'),
    (None, 'height'),
    (None, 'linethickness'),
    (None, 'lspace'),
    (None, 'mathbackground'),
    (None, 'mathcolor'),
    (None, 'mathvariant'),
    (None, 'mathvariant'),
    (None, 'maxsize'),
    (None, 'minsize'),
    (None, 'other'),
    (None, 'rowalign'),
    (None, 'rowalign'),
    (None, 'rowalign'),
    (None, 'rowlines'),
    (None, 'rowspacing'),
    (None, 'rowspan'),
    (None, 'rspace'),
    (None, 'scriptlevel'),
    (None, 'selection'),
    (None, 'separator'),
    (None, 'stretchy'),
    (None, 'width'),
    (None, 'width'),
    (namespaces['xlink'], 'href'),
    (namespaces['xlink'], 'show'),
    (namespaces['xlink'], 'type'),
    # SVG attributes
    (None, 'accent-height'),
    (None, 'accumulate'),
    (None, 'additive'),
    (None, 'alphabetic'),
    (None, 'arabic-form'),
    (None, 'ascent'),
    (None, 'attributeName'),
    (None, 'attributeType'),
    (None, 'baseProfile'),
    (None, 'bbox'),
    (None, 'begin'),
    (None, 'by'),
    (None, 'calcMode'),
    (None, 'cap-height'),
    (None, 'class'),
    (None, 'clip-path'),
    (None, 'color'),
    (None, 'color-rendering'),
    (None, 'content'),
    (None, 'cx'),
    (None, 'cy'),
    (None, 'd'),
    (None, 'dx'),
    (None, 'dy'),
    (None, 'descent'),
    (None, 'display'),
    (None, 'dur'),
    (None, 'end'),
    (None, 'fill'),
    (None, 'fill-opacity'),
    (None, 'fill-rule'),
    (None, 'font-family'),
    (None, 'font-size'),
    (None, 'font-stretch'),
    (None, 'font-style'),
    (None, 'font-variant'),
    (None, 'font-weight'),
    (None, 'from'),
    (None, 'fx'),
    (None, 'fy'),
    (None, 'g1'),
    (None, 'g2'),
    (None, 'glyph-name'),
    (None, 'gradientUnits'),
    (None, 'hanging'),
    (None, 'height'),
    (None, 'horiz-adv-x'),
    (None, 'horiz-origin-x'),
    (None, 'id'),
    (None, 'ideographic'),
    (None, 'k'),
    (None, 'keyPoints'),
    (None, 'keySplines'),
    (None, 'keyTimes'),
    (None, 'lang'),
    (None, 'marker-end'),
    (None, 'marker-mid'),
    (None, 'marker-start'),
    (None, 'markerHeight'),
    (None, 'markerUnits'),
    (None, 'markerWidth'),
    (None, 'mathematical'),
    (None, 'max'),
    (None, 'min'),
    (None, 'name'),
    (None, 'offset'),
    (None, 'opacity'),
    (None, 'orient'),
    (None, 'origin'),
    (None, 'overline-position'),
    (None, 'overline-thickness'),
    (None, 'panose-1'),
    (None, 'path'),
    (None, 'pathLength'),
    (None, 'points'),
    (None, 'preserveAspectRatio'),
    (None, 'r'),
    (None, 'refX'),
    (None, 'refY'),
    (None, 'repeatCount'),
    (None, 'repeatDur'),
    (None, 'requiredExtensions'),
    (None, 'requiredFeatures'),
    (None, 'restart'),
    (None, 'rotate'),
    (None, 'rx'),
    (None, 'ry'),
    (None, 'slope'),
    (None, 'stemh'),
    (None, 'stemv'),
    (None, 'stop-color'),
    (None, 'stop-opacity'),
    (None, 'strikethrough-position'),
    (None, 'strikethrough-thickness'),
    (None, 'stroke'),
    (None, 'stroke-dasharray'),
    (None, 'stroke-dashoffset'),
    (None, 'stroke-linecap'),
    (None, 'stroke-linejoin'),
    (None, 'stroke-miterlimit'),
    (None, 'stroke-opacity'),
    (None, 'stroke-width'),
    (None, 'systemLanguage'),
    (None, 'target'),
    (None, 'text-anchor'),
    (None, 'to'),
    (None, 'transform'),
    (None, 'type'),
    (None, 'u1'),
    (None, 'u2'),
    (None, 'underline-position'),
    (None, 'underline-thickness'),
    (None, 'unicode'),
    (None, 'unicode-range'),
    (None, 'units-per-em'),
    (None, 'values'),
    (None, 'version'),
    (None, 'viewBox'),
    (None, 'visibility'),
    (None, 'width'),
    (None, 'widths'),
    (None, 'x'),
    (None, 'x-height'),
    (None, 'x1'),
    (None, 'x2'),
    (namespaces['xlink'], 'actuate'),
    (namespaces['xlink'], 'arcrole'),
    (namespaces['xlink'], 'href'),
    (namespaces['xlink'], 'role'),
    (namespaces['xlink'], 'show'),
    (namespaces['xlink'], 'title'),
    (namespaces['xlink'], 'type'),
    (namespaces['xml'], 'base'),
    (namespaces['xml'], 'lang'),
    (namespaces['xml'], 'space'),
    (None, 'y'),
    (None, 'y1'),
    (None, 'y2'),
    (None, 'zoomAndPan'),
))

attr_val_is_uri = frozenset((
    (None, 'href'),
    (None, 'src'),
    (None, 'cite'),
    (None, 'action'),
    (None, 'longdesc'),
    (None, 'poster'),
    (None, 'background'),
    (None, 'datasrc'),
    (None, 'dynsrc'),
    (None, 'lowsrc'),
    (None, 'ping'),
    (namespaces['xlink'], 'href'),
    (namespaces['xml'], 'base'),
))

svg_attr_val_allows_ref = frozenset((
    (None, 'clip-path'),
    (None, 'color-profile'),
    (None, 'cursor'),
    (None, 'fill'),
    (None, 'filter'),
    (None, 'marker'),
    (None, 'marker-start'),
    (None, 'marker-mid'),
    (None, 'marker-end'),
    (None, 'mask'),
    (None, 'stroke'),
))

svg_allow_local_href = frozenset((
    (None, 'altGlyph'),
    (None, 'animate'),
    (None, 'animateColor'),
    (None, 'animateMotion'),
    (None, 'animateTransform'),
    (None, 'cursor'),
    (None, 'feImage'),
    (None, 'filter'),
    (None, 'linearGradient'),
    (None, 'pattern'),
    (None, 'radialGradient'),
    (None, 'textpath'),
    (None, 'tref'),
    (None, 'set'),
    (None, 'use')
))

allowed_css_properties = frozenset((
    'azimuth',
    'background-color',
    'border-bottom-color',
    'border-collapse',
    'border-color',
    'border-left-color',
    'border-right-color',
    'border-top-color',
    'clear',
    'color',
    'cursor',
    'direction',
    'display',
    'elevation',
    'float',
    'font',
    'font-family',
    'font-size',
    'font-style',
    'font-variant',
    'font-weight',
    'height',
    'letter-spacing',
    'line-height',
    'overflow',
    'pause',
    'pause-after',
    'pause-before',
    'pitch',
    'pitch-range',
    'richness',
    'speak',
    'speak-header',
    'speak-numeral',
    'speak-punctuation',
    'speech-rate',
    'stress',
    'text-align',
    'text-decoration',
    'text-indent',
    'unicode-bidi',
    'vertical-align',
    'voice-family',
    'volume',
    'white-space',
    'width',
))

allowed_css_keywords = frozenset((
    'auto',
    'aqua',
    'black',
    'block',
    'blue',
    'bold',
    'both',
    'bottom',
    'brown',
    'center',
    'collapse',
    'dashed',
    'dotted',
    'fuchsia',
    'gray',
    'green',
    '!important',
    'italic',
    'left',
    'lime',
    'maroon',
    'medium',
    'none',
    'navy',
    'normal',
    'nowrap',
    'olive',
    'pointer',
    'purple',
    'red',
    'right',
    'solid',
    'silver',
    'teal',
    'top',
    'transparent',
    'underline',
    'white',
    'yellow',
))

allowed_svg_properties = frozenset((
    'fill',
    'fill-opacity',
    'fill-rule',
    'stroke',
    'stroke-width',
    'stroke-linecap',
    'stroke-linejoin',
    'stroke-opacity',
))

allowed_protocols = frozenset((
    'ed2k',
    'ftp',
    'http',
    'https',
    'irc',
    'mailto',
    'news',
    'gopher',
    'nntp',
    'telnet',
    'webcal',
    'xmpp',
    'callto',
    'feed',
    'urn',
    'aim',
    'rsync',
    'tag',
    'ssh',
    'sftp',
    'rtsp',
    'afs',
    'data',
))

allowed_content_types = frozenset((
    'image/png',
    'image/jpeg',
    'image/gif',
    'image/webp',
    'image/bmp',
    'text/plain',
))


data_content_type = re.compile(r'''
                                ^
                                # Match a content type <application>/<type>
                                (?P<content_type>[-a-zA-Z0-9.]+/[-a-zA-Z0-9.]+)
                                # Match any character set and encoding
                                (?:(?:;charset=(?:[-a-zA-Z0-9]+)(?:;(?:base64))?)
                                  |(?:;(?:base64))?(?:;charset=(?:[-a-zA-Z0-9]+))?)
                                # Assume the rest is data
                                ,.*
                                $
                                ''',
                               re.VERBOSE)


class Filter(base.Filter):
    """ sanitization of XHTML+MathML+SVG and of inline style attributes."""
    def __init__(self,
                 source,
                 allowed_elements=allowed_elements,
                 allowed_attributes=allowed_attributes,
                 allowed_css_properties=allowed_css_properties,
                 allowed_css_keywords=allowed_css_keywords,
                 allowed_svg_properties=allowed_svg_properties,
                 allowed_protocols=allowed_protocols,
                 allowed_content_types=allowed_content_types,
                 attr_val_is_uri=attr_val_is_uri,
                 svg_attr_val_allows_ref=svg_attr_val_allows_ref,
                 svg_allow_local_href=svg_allow_local_href):
        super(Filter, self).__init__(source)
        self.allowed_elements = allowed_elements
        self.allowed_attributes = allowed_attributes
        self.allowed_css_properties = allowed_css_properties
        self.allowed_css_keywords = allowed_css_keywords
        self.allowed_svg_properties = allowed_svg_properties
        self.allowed_protocols = allowed_protocols
        self.allowed_content_types = allowed_content_types
        self.attr_val_is_uri = attr_val_is_uri
        self.svg_attr_val_allows_ref = svg_attr_val_allows_ref
        self.svg_allow_local_href = svg_allow_local_href

    def __iter__(self):
        for token in base.Filter.__iter__(self):
            token = self.sanitize_token(token)
            if token:
                yield token

    # Sanitize the html, escaping all elements not in ALLOWED_ELEMENTS and
    # stripping out all attributes not in ALLOWED_ATTRIBUTES. Style attributes
    # are parsed, and a restricted set, specified by ALLOWED_CSS_PROPERTIES
    # and ALLOWED_CSS_KEYWORDS, is allowed through. Attributes in
    # ATTR_VAL_IS_URI are scanned, and only URI schemes specified in
    # ALLOWED_PROTOCOLS are allowed.
    #
    #   sanitize_html('<script> do_nasty_stuff() </script>')
    #    => &lt;script> do_nasty_stuff() &lt;/script>
    #   sanitize_html('<a href="javascript: sucker();">Click here for $100</a>')
    #    => <a>Click here for $100</a>
    def sanitize_token(self, token):

        # accommodate filters which use token_type differently
        token_type = token["type"]
        if token_type in ("StartTag", "EndTag", "EmptyTag"):
            name = token["name"]
            namespace = token["namespace"]
            if ((namespace, name) in self.allowed_elements or
                (namespace is None and
                 (namespaces["html"], name) in self.allowed_elements)):
                return self.allowed_token(token)
            else:
                return self.disallowed_token(token)
        elif token_type == "Comment":
            pass
        else:
            return token

    def allowed_token(self, token):
        if "data" in token:
            attrs = token["data"]
            attr_names = set(attrs.keys())

            # Remove forbidden attributes
            for to_remove in (attr_names - self.allowed_attributes):
                del token["data"][to_remove]
                attr_names.remove(to_remove)

            # Remove attributes with disallowed URL values
            for attr in (attr_names & self.attr_val_is_uri):
                assert attr in attrs
                # I don't have a clue where this regexp comes from or why it matches those
                # characters, nor why we call unescape. I just know it's always been here.
                # Should you be worried by this comment in a sanitizer? Yes. On the other hand, all
                # this will do is remove *more* than it otherwise would.
                val_unescaped = re.sub("[`\x00-\x20\x7f-\xa0\s]+", '',
                                       unescape(attrs[attr])).lower()
                # remove replacement characters from unescaped characters
                val_unescaped = val_unescaped.replace("\ufffd", "")
                try:
                    uri = urlparse.urlparse(val_unescaped)
                except ValueError:
                    uri = None
                    del attrs[attr]
                if uri and uri.scheme:
                    if uri.scheme not in self.allowed_protocols:
                        del attrs[attr]
                    if uri.scheme == 'data':
                        m = data_content_type.match(uri.path)
                        if not m:
                            del attrs[attr]
                        elif m.group('content_type') not in self.allowed_content_types:
                            del attrs[attr]

            for attr in self.svg_attr_val_allows_ref:
                if attr in attrs:
                    attrs[attr] = re.sub(r'url\s*\(\s*[^#\s][^)]+?\)',
                                         ' ',
                                         unescape(attrs[attr]))
            if (token["name"] in self.svg_allow_local_href and
                (namespaces['xlink'], 'href') in attrs and re.search('^\s*[^#\s].*',
                                                                     attrs[(namespaces['xlink'], 'href')])):
                del attrs[(namespaces['xlink'], 'href')]
            if (None, 'style') in attrs:
                attrs[(None, 'style')] = self.sanitize_css(attrs[(None, 'style')])
            token["data"] = attrs
        return token

    def disallowed_token(self, token):
        token_type = token["type"]
        if token_type == "EndTag":
            token["data"] = "</%s>" % token["name"]
        elif token["data"]:
            assert token_type in ("StartTag", "EmptyTag")
            attrs = []
            for (ns, name), v in token["data"].items():
                attrs.append(' %s="%s"' % (name if ns is None else "%s:%s" % (prefixes[ns], name), escape(v)))
            token["data"] = "<%s%s>" % (token["name"], ''.join(attrs))
        else:
            token["data"] = "<%s>" % token["name"]
        if token.get("selfClosing"):
            token["data"] = token["data"][:-1] + "/>"

        token["type"] = "Characters"

        del token["name"]
        return token

    def sanitize_css(self, style):
        # disallow urls
        style = re.compile('url\s*\(\s*[^\s)]+?\s*\)\s*').sub(' ', style)

        # gauntlet
        if not re.match("""^([:,;#%.\sa-zA-Z0-9!]|\w-\w|'[\s\w]+'|"[\s\w]+"|\([\d,\s]+\))*$""", style):
            return ''
        if not re.match("^\s*([-\w]+\s*:[^:;]*(;\s*|$))*$", style):
            return ''

        clean = []
        for prop, value in re.findall("([-\w]+)\s*:\s*([^:;]*)", style):
            if not value:
                continue
            if prop.lower() in self.allowed_css_properties:
                clean.append(prop + ': ' + value + ';')
            elif prop.split('-')[0].lower() in ['background', 'border', 'margin',
                                                'padding']:
                for keyword in value.split():
                    if keyword not in self.allowed_css_keywords and \
                            not re.match("^(#[0-9a-f]+|rgb\(\d+%?,\d*%?,?\d*%?\)?|\d{0,2}\.?\d{0,2}(cm|em|ex|in|mm|pc|pt|px|%|,|\))?)$", keyword):  # noqa
                        break
                else:
                    clean.append(prop + ': ' + value + ';')
            elif prop.lower() in self.allowed_svg_properties:
                clean.append(prop + ': ' + value + ';')

        return ' '.join(clean)
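
# Usage sketch (not part of the vendored module): the filter is normally
# applied through the serializer (``sanitize=True``); disallowed elements are
# escaped to text and attribute values with disallowed URL schemes are
# removed.
def _example_sanitize():
    from pip._vendor import html5lib
    dom = html5lib.parseFragment('<a href="javascript:alert(1)" title="x">hi</a><script>bad()</script>')
    return html5lib.serialize(dom, sanitize=True)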
site-packages/pip/_vendor/html5lib/filters/whitespace.py000064400000002163147511334610017415 0ustar00from __future__ import absolute_import, division, unicode_literals

import re

from . import base
from ..constants import rcdataElements, spaceCharacters
spaceCharacters = "".join(spaceCharacters)

SPACES_REGEX = re.compile("[%s]+" % spaceCharacters)


class Filter(base.Filter):

    spacePreserveElements = frozenset(["pre", "textarea"] + list(rcdataElements))

    def __iter__(self):
        preserve = 0
        for token in base.Filter.__iter__(self):
            type = token["type"]
            if type == "StartTag" \
                    and (preserve or token["name"] in self.spacePreserveElements):
                preserve += 1

            elif type == "EndTag" and preserve:
                preserve -= 1

            elif not preserve and type == "SpaceCharacters" and token["data"]:
                # Test on token["data"] above to not introduce spaces where there were not
                token["data"] = " "

            elif not preserve and type == "Characters":
                token["data"] = collapse_spaces(token["data"])

            yield token


def collapse_spaces(text):
    return SPACES_REGEX.sub(' ', text)
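
# Small sketch (not part of the vendored module): ``collapse_spaces`` is the
# helper the filter applies to Characters tokens outside space-preserving
# elements such as <pre> and <textarea>.
def _example_collapse():
    assert collapse_spaces("a \t\n  b") == "a b"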
site-packages/pip/_vendor/html5lib/filters/__pycache__/ [compiled CPython 3.6 bytecode caches: sanitizer, inject_meta_charset, alphabeticalattributes, optionaltags, base, whitespace, __init__ and lint, each present as .cpython-36.pyc and .cpython-36.opt-1.pyc; the binary contents are not human-readable, and the corresponding sources appear elsewhere in this archive]
site-packages/pip/_vendor/html5lib/filters/optionaltags.py000064400000024446147511334610017775 0ustar00from __future__ import absolute_import, division, unicode_literals

from . import base


class Filter(base.Filter):
    def slider(self):
        previous1 = previous2 = None
        for token in self.source:
            if previous1 is not None:
                yield previous2, previous1, token
            previous2 = previous1
            previous1 = token
        if previous1 is not None:
            yield previous2, previous1, None

    def __iter__(self):
        for previous, token, next in self.slider():
            type = token["type"]
            if type == "StartTag":
                if (token["data"] or
                        not self.is_optional_start(token["name"], previous, next)):
                    yield token
            elif type == "EndTag":
                if not self.is_optional_end(token["name"], next):
                    yield token
            else:
                yield token

    def is_optional_start(self, tagname, previous, next):
        type = next and next["type"] or None
        if tagname == 'html':
            # An html element's start tag may be omitted if the first thing
            # inside the html element is not a space character or a comment.
            return type not in ("Comment", "SpaceCharacters")
        elif tagname == 'head':
            # A head element's start tag may be omitted if the first thing
            # inside the head element is an element.
            # XXX: we also omit the start tag if the head element is empty
            if type in ("StartTag", "EmptyTag"):
                return True
            elif type == "EndTag":
                return next["name"] == "head"
        elif tagname == 'body':
            # A body element's start tag may be omitted if the first thing
            # inside the body element is not a space character or a comment,
            # except if the first thing inside the body element is a script
            # or style element and the node immediately preceding the body
            # element is a head element whose end tag has been omitted.
            if type in ("Comment", "SpaceCharacters"):
                return False
            elif type == "StartTag":
                # XXX: we do not look at the preceding event, so we never omit
                # the body element's start tag if it's followed by a script or
                # a style element.
                return next["name"] not in ('script', 'style')
            else:
                return True
        elif tagname == 'colgroup':
            # A colgroup element's start tag may be omitted if the first thing
            # inside the colgroup element is a col element, and if the element
            # is not immediately preceded by another colgroup element whose
            # end tag has been omitted.
            if type in ("StartTag", "EmptyTag"):
                # XXX: we do not look at the preceding event, so instead we never
                # omit the colgroup element's end tag when it is immediately
                # followed by another colgroup element. See is_optional_end.
                return next["name"] == "col"
            else:
                return False
        elif tagname == 'tbody':
            # A tbody element's start tag may be omitted if the first thing
            # inside the tbody element is a tr element, and if the element is
            # not immediately preceded by a tbody, thead, or tfoot element
            # whose end tag has been omitted.
            if type == "StartTag":
                # omit the thead and tfoot elements' end tag when they are
                # immediately followed by a tbody element. See is_optional_end.
                if previous and previous['type'] == 'EndTag' and \
                        previous['name'] in ('tbody', 'thead', 'tfoot'):
                    return False
                return next["name"] == 'tr'
            else:
                return False
        return False

    def is_optional_end(self, tagname, next):
        type = next and next["type"] or None
        if tagname in ('html', 'head', 'body'):
            # An html element's end tag may be omitted if the html element
            # is not immediately followed by a space character or a comment.
            return type not in ("Comment", "SpaceCharacters")
        elif tagname in ('li', 'optgroup', 'tr'):
            # A li element's end tag may be omitted if the li element is
            # immediately followed by another li element or if there is
            # no more content in the parent element.
            # An optgroup element's end tag may be omitted if the optgroup
            # element is immediately followed by another optgroup element,
            # or if there is no more content in the parent element.
            # A tr element's end tag may be omitted if the tr element is
            # immediately followed by another tr element, or if there is
            # no more content in the parent element.
            if type == "StartTag":
                return next["name"] == tagname
            else:
                return type == "EndTag" or type is None
        elif tagname in ('dt', 'dd'):
            # A dt element's end tag may be omitted if the dt element is
            # immediately followed by another dt element or a dd element.
            # A dd element's end tag may be omitted if the dd element is
            # immediately followed by another dd element or a dt element,
            # or if there is no more content in the parent element.
            if type == "StartTag":
                return next["name"] in ('dt', 'dd')
            elif tagname == 'dd':
                return type == "EndTag" or type is None
            else:
                return False
        elif tagname == 'p':
            # A p element's end tag may be omitted if the p element is
            # immediately followed by an address, article, aside,
            # blockquote, datagrid, dialog, dir, div, dl, fieldset,
            # footer, form, h1, h2, h3, h4, h5, h6, header, hr, menu,
            # nav, ol, p, pre, section, table, or ul, element, or if
            # there is no more content in the parent element.
            if type in ("StartTag", "EmptyTag"):
                return next["name"] in ('address', 'article', 'aside',
                                        'blockquote', 'datagrid', 'dialog',
                                        'dir', 'div', 'dl', 'fieldset', 'footer',
                                        'form', 'h1', 'h2', 'h3', 'h4', 'h5', 'h6',
                                        'header', 'hr', 'menu', 'nav', 'ol',
                                        'p', 'pre', 'section', 'table', 'ul')
            else:
                return type == "EndTag" or type is None
        elif tagname == 'option':
            # An option element's end tag may be omitted if the option
            # element is immediately followed by another option element,
            # or if it is immediately followed by an <code>optgroup</code>
            # element, or if there is no more content in the parent
            # element.
            if type == "StartTag":
                return next["name"] in ('option', 'optgroup')
            else:
                return type == "EndTag" or type is None
        elif tagname in ('rt', 'rp'):
            # An rt element's end tag may be omitted if the rt element is
            # immediately followed by an rt or rp element, or if there is
            # no more content in the parent element.
            # An rp element's end tag may be omitted if the rp element is
            # immediately followed by an rt or rp element, or if there is
            # no more content in the parent element.
            if type == "StartTag":
                return next["name"] in ('rt', 'rp')
            else:
                return type == "EndTag" or type is None
        elif tagname == 'colgroup':
            # A colgroup element's end tag may be omitted if the colgroup
            # element is not immediately followed by a space character or
            # a comment.
            if type in ("Comment", "SpaceCharacters"):
                return False
            elif type == "StartTag":
                # XXX: we also look for an immediately following colgroup
                # element. See is_optional_start.
                return next["name"] != 'colgroup'
            else:
                return True
        elif tagname in ('thead', 'tbody'):
            # A thead element's end tag may be omitted if the thead element
            # is immediately followed by a tbody or tfoot element.
            # A tbody element's end tag may be omitted if the tbody element
            # is immediately followed by a tbody or tfoot element, or if
            # there is no more content in the parent element.
            # A tfoot element's end tag may be omitted if the tfoot element
            # is immediately followed by a tbody element, or if there is no
            # more content in the parent element.
            # XXX: we never omit the end tag when the following element is
            # a tbody. See is_optional_start.
            if type == "StartTag":
                return next["name"] in ['tbody', 'tfoot']
            elif tagname == 'tbody':
                return type == "EndTag" or type is None
            else:
                return False
        elif tagname == 'tfoot':
            # A tfoot element's end tag may be omitted if the tfoot element
            # is immediately followed by a tbody element, or if there is no
            # more content in the parent element.
            # XXX: we never omit the end tag when the following element is
            # a tbody. See is_optional_start.
            if type == "StartTag":
                return next["name"] == 'tbody'
            else:
                return type == "EndTag" or type is None
        elif tagname in ('td', 'th'):
            # A td element's end tag may be omitted if the td element is
            # immediately followed by a td or th element, or if there is
            # no more content in the parent element.
            # A th element's end tag may be omitted if the th element is
            # immediately followed by a td or th element, or if there is
            # no more content in the parent element.
            if type == "StartTag":
                return next["name"] in ('td', 'th')
            else:
                return type == "EndTag" or type is None
        return False
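
The rules above are normally exercised through the serializer rather than by hand; a hedged sketch (assuming the vendored html5lib package) is:

# Hypothetical example: omit optional end tags while serializing.
from pip._vendor import html5lib
from pip._vendor.html5lib.serializer import HTMLSerializer

doc = html5lib.parse("<ul><li>one</li><li>two</li></ul>", treebuilder="etree")
walker = html5lib.getTreeWalker("etree")
print(HTMLSerializer(omit_optional_tags=True).render(walker(doc)))
# e.g. the closing </li> tags are dropped, since a following <li> implies them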
site-packages/pip/_vendor/html5lib/filters/base.py000064400000000436147511334610016174 0ustar00from __future__ import absolute_import, division, unicode_literals


class Filter(object):
    def __init__(self, source):
        self.source = source

    def __iter__(self):
        return iter(self.source)

    def __getattr__(self, name):
        return getattr(self.source, name)
site-packages/pip/_vendor/html5lib/filters/alphabeticalattributes.py000064400000001155147511334610022001 0ustar00from __future__ import absolute_import, division, unicode_literals

from . import base

try:
    from collections import OrderedDict
except ImportError:
    from ordereddict import OrderedDict


class Filter(base.Filter):
    def __iter__(self):
        for token in base.Filter.__iter__(self):
            if token["type"] in ("StartTag", "EmptyTag"):
                attrs = OrderedDict()
                for name, value in sorted(token["data"].items(),
                                          key=lambda x: x[0]):
                    attrs[name] = value
                token["data"] = attrs
            yield token
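
A hedged sketch of the effect, using a hand-built token list as the source (token shape assumed from the other filters in this package):

# Hypothetical example: attribute order becomes alphabetical by (namespace, name).
from pip._vendor.html5lib.filters import alphabeticalattributes

tokens = [{"type": "StartTag", "name": "img",
           "data": {(None, "src"): "x.png", (None, "alt"): "x"}}]
out = list(alphabeticalattributes.Filter(tokens))
print(list(out[0]["data"].items()))
# [((None, 'alt'), 'x'), ((None, 'src'), 'x.png')]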
site-packages/pip/_vendor/html5lib/filters/inject_meta_charset.py000064400000005266147511334610021263 0ustar00from __future__ import absolute_import, division, unicode_literals

from . import base


class Filter(base.Filter):
    def __init__(self, source, encoding):
        base.Filter.__init__(self, source)
        self.encoding = encoding

    def __iter__(self):
        state = "pre_head"
        meta_found = (self.encoding is None)
        pending = []

        for token in base.Filter.__iter__(self):
            type = token["type"]
            if type == "StartTag":
                if token["name"].lower() == "head":
                    state = "in_head"

            elif type == "EmptyTag":
                if token["name"].lower() == "meta":
                    # replace charset with actual encoding
                    has_http_equiv_content_type = False
                    for (namespace, name), value in token["data"].items():
                        if namespace is not None:
                            continue
                        elif name.lower() == 'charset':
                            token["data"][(namespace, name)] = self.encoding
                            meta_found = True
                            break
                        elif name == 'http-equiv' and value.lower() == 'content-type':
                            has_http_equiv_content_type = True
                    else:
                        if has_http_equiv_content_type and (None, "content") in token["data"]:
                            token["data"][(None, "content")] = 'text/html; charset=%s' % self.encoding
                            meta_found = True

                elif token["name"].lower() == "head" and not meta_found:
                    # insert meta into empty head
                    yield {"type": "StartTag", "name": "head",
                           "data": token["data"]}
                    yield {"type": "EmptyTag", "name": "meta",
                           "data": {(None, "charset"): self.encoding}}
                    yield {"type": "EndTag", "name": "head"}
                    meta_found = True
                    continue

            elif type == "EndTag":
                if token["name"].lower() == "head" and pending:
                    # insert meta into head (if necessary) and flush pending queue
                    yield pending.pop(0)
                    if not meta_found:
                        yield {"type": "EmptyTag", "name": "meta",
                               "data": {(None, "charset"): self.encoding}}
                    while pending:
                        yield pending.pop(0)
                    meta_found = True
                    state = "post_head"

            if state == "in_head":
                pending.append(token)
            else:
                yield token
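
A hedged sketch (assuming the vendored html5lib package) of wiring this filter between a tree walker and the serializer:

# Hypothetical example: advertise the output encoding inside <head>.
from pip._vendor import html5lib
from pip._vendor.html5lib.filters import inject_meta_charset
from pip._vendor.html5lib.serializer import HTMLSerializer

doc = html5lib.parse("<html><head><title>t</title></head><body>hi</body></html>",
                     treebuilder="etree")
walker = html5lib.getTreeWalker("etree")
tokens = inject_meta_charset.Filter(walker(doc), "utf-8")
print(HTMLSerializer().render(tokens))
# a <meta charset=utf-8> element should appear in <head> if none was present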
site-packages/pip/_vendor/html5lib/filters/__init__.py000064400000000000147511334610017004 0ustar00site-packages/pip/_vendor/html5lib/filters/lint.py000064400000006445147511334610016236 0ustar00from __future__ import absolute_import, division, unicode_literals

from pip._vendor.six import text_type

from . import base
from ..constants import namespaces, voidElements

from ..constants import spaceCharacters
spaceCharacters = "".join(spaceCharacters)


class Filter(base.Filter):
    def __init__(self, source, require_matching_tags=True):
        super(Filter, self).__init__(source)
        self.require_matching_tags = require_matching_tags

    def __iter__(self):
        open_elements = []
        for token in base.Filter.__iter__(self):
            type = token["type"]
            if type in ("StartTag", "EmptyTag"):
                namespace = token["namespace"]
                name = token["name"]
                assert namespace is None or isinstance(namespace, text_type)
                assert namespace != ""
                assert isinstance(name, text_type)
                assert name != ""
                assert isinstance(token["data"], dict)
                if (not namespace or namespace == namespaces["html"]) and name in voidElements:
                    assert type == "EmptyTag"
                else:
                    assert type == "StartTag"
                if type == "StartTag" and self.require_matching_tags:
                    open_elements.append((namespace, name))
                for (namespace, name), value in token["data"].items():
                    assert namespace is None or isinstance(namespace, text_type)
                    assert namespace != ""
                    assert isinstance(name, text_type)
                    assert name != ""
                    assert isinstance(value, text_type)

            elif type == "EndTag":
                namespace = token["namespace"]
                name = token["name"]
                assert namespace is None or isinstance(namespace, text_type)
                assert namespace != ""
                assert isinstance(name, text_type)
                assert name != ""
                if (not namespace or namespace == namespaces["html"]) and name in voidElements:
                    assert False, "Void element reported as EndTag token: %(tag)s" % {"tag": name}
                elif self.require_matching_tags:
                    start = open_elements.pop()
                    assert start == (namespace, name)

            elif type == "Comment":
                data = token["data"]
                assert isinstance(data, text_type)

            elif type in ("Characters", "SpaceCharacters"):
                data = token["data"]
                assert isinstance(data, text_type)
                assert data != ""
                if type == "SpaceCharacters":
                    assert data.strip(spaceCharacters) == ""

            elif type == "Doctype":
                name = token["name"]
                assert name is None or isinstance(name, text_type)
                assert token["publicId"] is None or isinstance(name, text_type)
                assert token["systemId"] is None or isinstance(name, text_type)

            elif type == "Entity":
                assert isinstance(token["name"], text_type)

            elif type == "SerializerError":
                assert isinstance(token["data"], text_type)

            else:
                assert False, "Unknown token type: %(type)s" % {"type": type}

            yield token
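
This filter only re-yields tokens, raising AssertionError on malformed ones, so it can be slotted into a pipeline while debugging. A hedged sketch with hand-built tokens:

# Hypothetical example: a valid EmptyTag passes; a void element as EndTag is rejected.
from pip._vendor.html5lib.filters import lint

good = [{"type": "EmptyTag", "namespace": None, "name": "br", "data": {}}]
print(list(lint.Filter(good)))   # passes through unchanged

bad = [{"type": "EndTag", "namespace": None, "name": "br"}]
try:
    list(lint.Filter(bad))
except AssertionError as exc:
    print("rejected:", exc)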
site-packages/pip/_vendor/progress/__init__.py000064400000005717147511334610015504 0ustar00# Copyright (c) 2012 Giorgos Verigakis <verigak@gmail.com>
#
# Permission to use, copy, modify, and distribute this software for any
# purpose with or without fee is hereby granted, provided that the above
# copyright notice and this permission notice appear in all copies.
#
# THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES
# WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF
# MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR
# ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES
# WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN
# ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF
# OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.

from __future__ import division

from collections import deque
from datetime import timedelta
from math import ceil
from sys import stderr
from time import time


__version__ = '1.2'


class Infinite(object):
    file = stderr
    sma_window = 10

    def __init__(self, *args, **kwargs):
        self.index = 0
        self.start_ts = time()
        self._ts = self.start_ts
        self._dt = deque(maxlen=self.sma_window)
        for key, val in kwargs.items():
            setattr(self, key, val)

    def __getitem__(self, key):
        if key.startswith('_'):
            return None
        return getattr(self, key, None)

    @property
    def avg(self):
        return sum(self._dt) / len(self._dt) if self._dt else 0

    @property
    def elapsed(self):
        return int(time() - self.start_ts)

    @property
    def elapsed_td(self):
        return timedelta(seconds=self.elapsed)

    def update(self):
        pass

    def start(self):
        pass

    def finish(self):
        pass

    def next(self, n=1):
        if n > 0:
            now = time()
            dt = (now - self._ts) / n
            self._dt.append(dt)
            self._ts = now

        self.index = self.index + n
        self.update()

    def iter(self, it):
        for x in it:
            yield x
            self.next()
        self.finish()


class Progress(Infinite):
    def __init__(self, *args, **kwargs):
        super(Progress, self).__init__(*args, **kwargs)
        self.max = kwargs.get('max', 100)

    @property
    def eta(self):
        return int(ceil(self.avg * self.remaining))

    @property
    def eta_td(self):
        return timedelta(seconds=self.eta)

    @property
    def percent(self):
        return self.progress * 100

    @property
    def progress(self):
        return min(1, self.index / self.max)

    @property
    def remaining(self):
        return max(self.max - self.index, 0)

    def start(self):
        self.update()

    def goto(self, index):
        incr = index - self.index
        self.next(incr)

    def iter(self, it):
        try:
            self.max = len(it)
        except TypeError:
            pass

        for x in it:
            yield x
            self.next()
        self.finish()
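
A hedged sketch of driving Progress without any terminal output (update() is a no-op on this base class), showing the derived counters:

# Hypothetical example: iterate a sized iterable and read back the counters.
from pip._vendor.progress import Progress

p = Progress(max=4)
for _ in p.iter(range(4)):
    pass
print(p.index, p.percent, p.remaining)   # 4 100.0 0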
site-packages/pip/_vendor/progress/bar.py000064400000005175147511334610014507 0ustar00# -*- coding: utf-8 -*-

# Copyright (c) 2012 Giorgos Verigakis <verigak@gmail.com>
#
# Permission to use, copy, modify, and distribute this software for any
# purpose with or without fee is hereby granted, provided that the above
# copyright notice and this permission notice appear in all copies.
#
# THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES
# WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF
# MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR
# ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES
# WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN
# ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF
# OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.

from . import Progress
from .helpers import WritelnMixin


class Bar(WritelnMixin, Progress):
    width = 32
    message = ''
    suffix = '%(index)d/%(max)d'
    bar_prefix = ' |'
    bar_suffix = '| '
    empty_fill = ' '
    fill = '#'
    hide_cursor = True

    def update(self):
        filled_length = int(self.width * self.progress)
        empty_length = self.width - filled_length

        message = self.message % self
        bar = self.fill * filled_length
        empty = self.empty_fill * empty_length
        suffix = self.suffix % self
        line = ''.join([message, self.bar_prefix, bar, empty, self.bar_suffix,
                        suffix])
        self.writeln(line)


class ChargingBar(Bar):
    suffix = '%(percent)d%%'
    bar_prefix = ' '
    bar_suffix = ' '
    empty_fill = u'∙'
    fill = u'█'


class FillingSquaresBar(ChargingBar):
    empty_fill = u'▢'
    fill = u'▣'


class FillingCirclesBar(ChargingBar):
    empty_fill = u'◯'
    fill = u'◉'


class IncrementalBar(Bar):
    phases = (u' ', u'▏', u'▎', u'▍', u'▌', u'▋', u'▊', u'▉', u'█')

    def update(self):
        nphases = len(self.phases)
        # Measure progress in sub-character steps: nphases steps per cell.
        expanded_length = int(nphases * self.width * self.progress)
        filled_length = int(self.width * self.progress)
        empty_length = self.width - filled_length
        # Index of the partial-block glyph to draw in the current cell.
        phase = expanded_length - (filled_length * nphases)

        message = self.message % self
        bar = self.phases[-1] * filled_length
        current = self.phases[phase] if phase > 0 else ''
        empty = self.empty_fill * max(0, empty_length - len(current))
        suffix = self.suffix % self
        line = ''.join([message, self.bar_prefix, bar, current, empty,
                        self.bar_suffix, suffix])
        self.writeln(line)


class ShadyBar(IncrementalBar):
    phases = (u' ', u'░', u'▒', u'▓', u'█')
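

# Illustrative usage sketch only: the bars above are normally driven through
# Progress.iter() or explicit next() calls; the message and item count here
# are arbitrary.
def _example_usage():
    bar = IncrementalBar('Processing', max=20)
    for _ in bar.iter(range(20)):
        pass                         # per-item work goes here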
site-packages/pip/_vendor/progress/helpers.py
# Copyright (c) 2012 Giorgos Verigakis <verigak@gmail.com>
#
# Permission to use, copy, modify, and distribute this software for any
# purpose with or without fee is hereby granted, provided that the above
# copyright notice and this permission notice appear in all copies.
#
# THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES
# WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF
# MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR
# ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES
# WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN
# ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF
# OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.

from __future__ import print_function


HIDE_CURSOR = '\x1b[?25l'
SHOW_CURSOR = '\x1b[?25h'


class WriteMixin(object):
    hide_cursor = False

    def __init__(self, message=None, **kwargs):
        super(WriteMixin, self).__init__(**kwargs)
        self._width = 0
        if message:
            self.message = message

        if self.file.isatty():
            if self.hide_cursor:
                print(HIDE_CURSOR, end='', file=self.file)
            print(self.message, end='', file=self.file)
            self.file.flush()

    def write(self, s):
        if self.file.isatty():
            b = '\b' * self._width
            c = s.ljust(self._width)
            print(b + c, end='', file=self.file)
            self._width = max(self._width, len(s))
            self.file.flush()

    def finish(self):
        if self.file.isatty() and self.hide_cursor:
            print(SHOW_CURSOR, end='', file=self.file)


class WritelnMixin(object):
    hide_cursor = False

    def __init__(self, message=None, **kwargs):
        super(WritelnMixin, self).__init__(**kwargs)
        if message:
            self.message = message

        if self.file.isatty() and self.hide_cursor:
            print(HIDE_CURSOR, end='', file=self.file)

    def clearln(self):
        if self.file.isatty():
            print('\r\x1b[K', end='', file=self.file)

    def writeln(self, line):
        if self.file.isatty():
            self.clearln()
            print(line, end='', file=self.file)
            self.file.flush()

    def finish(self):
        if self.file.isatty():
            print(file=self.file)
            if self.hide_cursor:
                print(SHOW_CURSOR, end='', file=self.file)


from signal import signal, SIGINT
from sys import exit


class SigIntMixin(object):
    """Registers a signal handler that calls finish on SIGINT"""

    def __init__(self, *args, **kwargs):
        super(SigIntMixin, self).__init__(*args, **kwargs)
        signal(SIGINT, self._sigint_handler)

    def _sigint_handler(self, signum, frame):
        self.finish()
        exit(0)
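

# Illustrative usage sketch only: WriteMixin redraws a single field in place
# with backspaces, while WritelnMixin clears and rewrites the whole line with
# '\r\x1b[K'.  The widget below mirrors how counter.py combines WriteMixin
# with Infinite; the class name and message are arbitrary.
def _example_usage():
    from . import Infinite

    class Ticks(WriteMixin, Infinite):
        message = 'ticks: '
        hide_cursor = True

        def update(self):
            self.write(str(self.index))

    ticks = Ticks()
    for _ in range(3):
        ticks.next()
    ticks.finish()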
site-packages/pip/_vendor/progress/__pycache__/helpers.cpython-36.pyc
site-packages/pip/_vendor/progress/__pycache__/helpers.cpython-36.opt-1.pyc
site-packages/pip/_vendor/progress/__pycache__/bar.cpython-36.pyc
site-packages/pip/_vendor/progress/__pycache__/spinner.cpython-36.opt-1.pyc
site-packages/pip/_vendor/progress/__pycache__/counter.cpython-36.opt-1.pyc
site-packages/pip/_vendor/progress/__pycache__/__init__.cpython-36.opt-1.pyc
site-packages/pip/_vendor/progress/__pycache__/spinner.cpython-36.pyc
site-packages/pip/_vendor/progress/__pycache__/__init__.cpython-36.pyc
site-packages/pip/_vendor/progress/__pycache__/bar.cpython-36.opt-1.pyc
site-packages/pip/_vendor/progress/__pycache__/counter.cpython-36.pyc
site-packages/pip/_vendor/progress/counter.py
# -*- coding: utf-8 -*-

# Copyright (c) 2012 Giorgos Verigakis <verigak@gmail.com>
#
# Permission to use, copy, modify, and distribute this software for any
# purpose with or without fee is hereby granted, provided that the above
# copyright notice and this permission notice appear in all copies.
#
# THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES
# WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF
# MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR
# ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES
# WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN
# ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF
# OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.

from . import Infinite, Progress
from .helpers import WriteMixin


class Counter(WriteMixin, Infinite):
    message = ''
    hide_cursor = True

    def update(self):
        self.write(str(self.index))


class Countdown(WriteMixin, Progress):
    hide_cursor = True

    def update(self):
        self.write(str(self.remaining))


class Stack(WriteMixin, Progress):
    phases = (u' ', u'▁', u'▂', u'▃', u'▄', u'▅', u'▆', u'▇', u'█')
    hide_cursor = True

    def update(self):
        nphases = len(self.phases)
        i = min(nphases - 1, int(self.progress * nphases))
        self.write(self.phases[i])


class Pie(Stack):
    phases = (u'○', u'◔', u'◑', u'◕', u'●')
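

# Illustrative usage sketch only: Counter prints a running count in place and
# Pie shows a filled fraction; the messages and totals here are arbitrary.
def _example_usage():
    counter = Counter('downloaded: ')
    for _ in range(10):
        counter.next()
    counter.finish()

    pie = Pie('portion ', max=4)
    for _ in pie.iter(range(4)):
        pass                         # per-item work goes here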
site-packages/pip/_vendor/progress/spinner.py
# -*- coding: utf-8 -*-

# Copyright (c) 2012 Giorgos Verigakis <verigak@gmail.com>
#
# Permission to use, copy, modify, and distribute this software for any
# purpose with or without fee is hereby granted, provided that the above
# copyright notice and this permission notice appear in all copies.
#
# THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES
# WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF
# MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR
# ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES
# WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN
# ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF
# OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.

from . import Infinite
from .helpers import WriteMixin


class Spinner(WriteMixin, Infinite):
    message = ''
    phases = ('-', '\\', '|', '/')
    hide_cursor = True

    def update(self):
        i = self.index % len(self.phases)
        self.write(self.phases[i])


class PieSpinner(Spinner):
    phases = [u'◷', u'◶', u'◵', u'◴']


class MoonSpinner(Spinner):
    phases = [u'◑', u'◒', u'◐', u'◓']


class LineSpinner(Spinner):
    phases = [u'⎺', u'⎻', u'⎼', u'⎽', u'⎼', u'⎻']
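

# Illustrative usage sketch only: a spinner is advanced once per unit of
# work; it has no known end, so finish() just restores the cursor.
def _example_usage():
    spinner = Spinner('loading ')
    for _ in range(10):
        spinner.next()               # cycles through the phases above
    spinner.finish()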
site-packages/pip/_vendor/certifi/__init__.py
from .core import where, old_where

__version__ = "2018.01.18"
site-packages/pip/_vendor/certifi/__pycache__/__main__.cpython-36.pyc
site-packages/pip/_vendor/certifi/__pycache__/core.cpython-36.pyc
site-packages/pip/_vendor/certifi/__pycache__/core.cpython-36.opt-1.pyc
site-packages/pip/_vendor/certifi/__pycache__/__main__.cpython-36.opt-1.pyc
site-packages/pip/_vendor/certifi/__pycache__/__init__.cpython-36.pyc
site-packages/pip/_vendor/certifi/__pycache__/__init__.cpython-36.opt-1.pyc
site-packages/pip/_vendor/certifi/__main__.py
from certifi import where
print(where())
site-packages/pip/_vendor/certifi/core.py
#!/usr/bin/env python
# -*- coding: utf-8 -*-

"""
certifi.py
~~~~~~~~~~

This module returns the installation location of cacert.pem.
"""
import os
import warnings


class DeprecatedBundleWarning(DeprecationWarning):
    """
    The weak security bundle is being deprecated. Please bother your service
    provider to get them to stop using cross-signed roots.
    """


def where():
    return '/etc/pki/tls/certs/ca-bundle.crt'


def old_where():
    warnings.warn(
        "The weak security bundle has been removed. certifi.old_where() is now an alias "
        "of certifi.where(). Please update your code to use certifi.where() instead. "
        "certifi.old_where() will be removed in 2018.",
        DeprecatedBundleWarning
    )
    return where()

if __name__ == '__main__':
    print(where())
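

# Illustrative usage sketch only: callers typically hand where() to whatever
# TLS layer they use as the CA bundle path.  ssl is from the standard
# library; building a default context is just one possible consumer.
def _example_usage():
    import ssl
    return ssl.create_default_context(cafile=where())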
site-packages/pip/_vendor/lockfile/linklockfile.py
from __future__ import absolute_import

import time
import os

from . import (LockBase, LockFailed, NotLocked, NotMyLock, LockTimeout,
               AlreadyLocked)


class LinkLockFile(LockBase):
    """Lock access to a file using atomic property of link(2).

    >>> lock = LinkLockFile('somefile')
    >>> lock = LinkLockFile('somefile', threaded=False)
    """

    def acquire(self, timeout=None):
        try:
            open(self.unique_name, "wb").close()
        except IOError:
            raise LockFailed("failed to create %s" % self.unique_name)

        timeout = timeout if timeout is not None else self.timeout
        end_time = time.time()
        if timeout is not None and timeout > 0:
            end_time += timeout

        while True:
            # Try and create a hard link to it.
            try:
                os.link(self.unique_name, self.lock_file)
            except OSError:
                # Link creation failed.  Maybe we've double-locked?
                nlinks = os.stat(self.unique_name).st_nlink
                if nlinks == 2:
                    # The original link plus the one I created == 2.  We're
                    # good to go.
                    return
                else:
                    # Otherwise the lock creation failed.
                    if timeout is not None and time.time() > end_time:
                        os.unlink(self.unique_name)
                        if timeout > 0:
                            raise LockTimeout("Timeout waiting to acquire"
                                              " lock for %s" %
                                              self.path)
                        else:
                            raise AlreadyLocked("%s is already locked" %
                                                self.path)
                    time.sleep(timeout is not None and timeout / 10 or 0.1)
            else:
                # Link creation succeeded.  We're good to go.
                return

    def release(self):
        if not self.is_locked():
            raise NotLocked("%s is not locked" % self.path)
        elif not os.path.exists(self.unique_name):
            raise NotMyLock("%s is locked, but not by me" % self.path)
        os.unlink(self.unique_name)
        os.unlink(self.lock_file)

    def is_locked(self):
        return os.path.exists(self.lock_file)

    def i_am_locking(self):
        return (self.is_locked() and
                os.path.exists(self.unique_name) and
                os.stat(self.unique_name).st_nlink == 2)

    def break_lock(self):
        if os.path.exists(self.lock_file):
            os.unlink(self.lock_file)
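

# Illustrative usage sketch only: LinkLockFile is normally used through the
# context-manager protocol inherited from LockBase; the path and timeout
# below are arbitrary.
def _example_usage():
    lock = LinkLockFile('/tmp/shared.dat', timeout=5)
    with lock:                       # acquire() on enter, release() on exit
        pass                         # exclusive access to /tmp/shared.dat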
site-packages/pip/_vendor/lockfile/__init__.py
# -*- coding: utf-8 -*-

"""
lockfile.py - Platform-independent advisory file locks.

Requires Python 2.5 unless you apply 2.4.diff
Locking is done on a per-thread basis instead of a per-process basis.

Usage:

>>> lock = LockFile('somefile')
>>> try:
...     lock.acquire()
... except AlreadyLocked:
...     print 'somefile', 'is locked already.'
... except LockFailed:
...     print 'somefile', 'can\\'t be locked.'
... else:
...     print 'got lock'
got lock
>>> print lock.is_locked()
True
>>> lock.release()

>>> lock = LockFile('somefile')
>>> print lock.is_locked()
False
>>> with lock:
...    print lock.is_locked()
True
>>> print lock.is_locked()
False

>>> lock = LockFile('somefile')
>>> # It is okay to lock twice from the same thread...
>>> with lock:
...     lock.acquire()
...
>>> # Though no counter is kept, so you can't unlock multiple times...
>>> print lock.is_locked()
False

Exceptions:

    Error - base class for other exceptions
        LockError - base class for all locking exceptions
            AlreadyLocked - Another thread or process already holds the lock
            LockFailed - Lock failed for some other reason
        UnlockError - base class for all unlocking exceptions
            AlreadyUnlocked - File was not locked.
            NotMyLock - File was locked but not by the current thread/process
"""

from __future__ import absolute_import

import functools
import os
import socket
import threading
import warnings

# Work with PEP8 and non-PEP8 versions of threading module.
if not hasattr(threading, "current_thread"):
    threading.current_thread = threading.currentThread
if not hasattr(threading.Thread, "get_name"):
    threading.Thread.get_name = threading.Thread.getName

__all__ = ['Error', 'LockError', 'LockTimeout', 'AlreadyLocked',
           'LockFailed', 'UnlockError', 'NotLocked', 'NotMyLock',
           'LinkFileLock', 'MkdirFileLock', 'SQLiteFileLock',
           'LockBase', 'locked']


class Error(Exception):
    """
    Base class for other exceptions.

    >>> try:
    ...   raise Error
    ... except Exception:
    ...   pass
    """
    pass


class LockError(Error):
    """
    Base class for error arising from attempts to acquire the lock.

    >>> try:
    ...   raise LockError
    ... except Error:
    ...   pass
    """
    pass


class LockTimeout(LockError):
    """Raised when lock creation fails within a user-defined period of time.

    >>> try:
    ...   raise LockTimeout
    ... except LockError:
    ...   pass
    """
    pass


class AlreadyLocked(LockError):
    """Some other thread/process is locking the file.

    >>> try:
    ...   raise AlreadyLocked
    ... except LockError:
    ...   pass
    """
    pass


class LockFailed(LockError):
    """Lock file creation failed for some other reason.

    >>> try:
    ...   raise LockFailed
    ... except LockError:
    ...   pass
    """
    pass


class UnlockError(Error):
    """
    Base class for errors arising from attempts to release the lock.

    >>> try:
    ...   raise UnlockError
    ... except Error:
    ...   pass
    """
    pass


class NotLocked(UnlockError):
    """Raised when an attempt is made to unlock an unlocked file.

    >>> try:
    ...   raise NotLocked
    ... except UnlockError:
    ...   pass
    """
    pass


class NotMyLock(UnlockError):
    """Raised when an attempt is made to unlock a file someone else locked.

    >>> try:
    ...   raise NotMyLock
    ... except UnlockError:
    ...   pass
    """
    pass


class _SharedBase(object):
    def __init__(self, path):
        self.path = path

    def acquire(self, timeout=None):
        """
        Acquire the lock.

        * If timeout is omitted (or None), wait forever trying to lock the
          file.

        * If timeout > 0, try to acquire the lock for that many seconds.  If
          the lock period expires and the file is still locked, raise
          LockTimeout.

        * If timeout <= 0, raise AlreadyLocked immediately if the file is
          already locked.
        """
        raise NotImplementedError("implement in subclass")

    def release(self):
        """
        Release the lock.

        If the file is not locked, raise NotLocked.
        """
        raise NotImplementedError("implement in subclass")

    def __enter__(self):
        """
        Context manager support.
        """
        self.acquire()
        return self

    def __exit__(self, *_exc):
        """
        Context manager support.
        """
        self.release()

    def __repr__(self):
        return "<%s: %r>" % (self.__class__.__name__, self.path)


class LockBase(_SharedBase):
    """Base class for platform-specific lock classes."""
    def __init__(self, path, threaded=True, timeout=None):
        """
        >>> lock = LockBase('somefile')
        >>> lock = LockBase('somefile', threaded=False)
        """
        super(LockBase, self).__init__(path)
        self.lock_file = os.path.abspath(path) + ".lock"
        self.hostname = socket.gethostname()
        self.pid = os.getpid()
        if threaded:
            t = threading.current_thread()
            # Thread objects in Python 2.4 and earlier do not have ident
            # attrs.  Work around that.
            ident = getattr(t, "ident", hash(t))
            self.tname = "-%x" % (ident & 0xffffffff)
        else:
            self.tname = ""
        dirname = os.path.dirname(self.lock_file)

        # unique name is mostly about the current process, but must
        # also contain the path -- otherwise, two adjacent locked
        # files conflict (one file gets locked, creating lock-file and
        # unique file, the other one gets locked, creating lock-file
        # and overwriting the already existing lock-file, then one
        # gets unlocked, deleting both lock-file and unique file,
        # finally the last lock errors out upon releasing.
        self.unique_name = os.path.join(dirname,
                                        "%s%s.%s%s" % (self.hostname,
                                                       self.tname,
                                                       self.pid,
                                                       hash(self.path)))
        self.timeout = timeout

    def is_locked(self):
        """
        Tell whether or not the file is locked.
        """
        raise NotImplementedError("implement in subclass")

    def i_am_locking(self):
        """
        Return True if this object is locking the file.
        """
        raise NotImplementedError("implement in subclass")

    def break_lock(self):
        """
        Remove a lock.  Useful if a locking thread failed to unlock.
        """
        raise NotImplementedError("implement in subclass")

    def __repr__(self):
        return "<%s: %r -- %r>" % (self.__class__.__name__, self.unique_name,
                                   self.path)


def _fl_helper(cls, mod, *args, **kwds):
    warnings.warn("Import from %s module instead of lockfile package" % mod,
                  DeprecationWarning, stacklevel=2)
    # This is a bit funky, but it's only for awhile.  The way the unit tests
    # are constructed this function winds up as an unbound method, so it
    # actually takes three args, not two.  We want to toss out self.
    if not isinstance(args[0], str):
        # We are testing, avoid the first arg
        args = args[1:]
    if len(args) == 1 and not kwds:
        kwds["threaded"] = True
    return cls(*args, **kwds)


def LinkFileLock(*args, **kwds):
    """Factory function provided for backwards compatibility.

    Do not use in new code.  Instead, import LinkLockFile from the
    lockfile.linklockfile module.
    """
    from . import linklockfile
    return _fl_helper(linklockfile.LinkLockFile, "lockfile.linklockfile",
                      *args, **kwds)


def MkdirFileLock(*args, **kwds):
    """Factory function provided for backwards compatibility.

    Do not use in new code.  Instead, import MkdirLockFile from the
    lockfile.mkdirlockfile module.
    """
    from . import mkdirlockfile
    return _fl_helper(mkdirlockfile.MkdirLockFile, "lockfile.mkdirlockfile",
                      *args, **kwds)


def SQLiteFileLock(*args, **kwds):
    """Factory function provided for backwards compatibility.

    Do not use in new code.  Instead, import SQLiteLockFile from the
    lockfile.sqlitelockfile module.
    """
    from . import sqlitelockfile
    return _fl_helper(sqlitelockfile.SQLiteLockFile, "lockfile.sqlitelockfile",
                      *args, **kwds)


def locked(path, timeout=None):
    """Decorator which enables locks for decorated function.

    Arguments:
     - path: path for lockfile.
     - timeout (optional): Timeout for acquiring lock.

     Usage:
         @locked('/var/run/myname', timeout=0)
         def myname(...):
             ...
    """
    def decor(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            lock = FileLock(path, timeout=timeout)
            lock.acquire()
            try:
                return func(*args, **kwargs)
            finally:
                lock.release()
        return wrapper
    return decor


if hasattr(os, "link"):
    from . import linklockfile as _llf
    LockFile = _llf.LinkLockFile
else:
    from . import mkdirlockfile as _mlf
    LockFile = _mlf.MkdirLockFile

FileLock = LockFile
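

# Illustrative usage sketch only: FileLock resolves to LinkLockFile or
# MkdirLockFile above depending on os.link support; the path and timeout
# below are arbitrary.
def _example_usage():
    lock = FileLock('/tmp/example.txt', timeout=2)
    try:
        with lock:
            pass                     # exclusive access to /tmp/example.txt
    except LockTimeout:
        pass                         # someone else held the lock for > 2s
    except LockFailed:
        pass                         # the lock file could not be created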
site-packages/pip/_vendor/lockfile/sqlitelockfile.py
from __future__ import absolute_import, division

import time
import os

try:
    unicode
except NameError:
    unicode = str

from . import LockBase, NotLocked, NotMyLock, LockTimeout, AlreadyLocked


class SQLiteLockFile(LockBase):
    "Demonstrate SQL-based locking."

    testdb = None

    def __init__(self, path, threaded=True, timeout=None):
        """
        >>> lock = SQLiteLockFile('somefile')
        >>> lock = SQLiteLockFile('somefile', threaded=False)
        """
        LockBase.__init__(self, path, threaded, timeout)
        self.lock_file = unicode(self.lock_file)
        self.unique_name = unicode(self.unique_name)

        if SQLiteLockFile.testdb is None:
            import tempfile
            _fd, testdb = tempfile.mkstemp()
            os.close(_fd)
            os.unlink(testdb)
            del _fd, tempfile
            SQLiteLockFile.testdb = testdb

        import sqlite3
        self.connection = sqlite3.connect(SQLiteLockFile.testdb)

        c = self.connection.cursor()
        try:
            c.execute("create table locks"
                      "("
                      "   lock_file varchar(32),"
                      "   unique_name varchar(32)"
                      ")")
        except sqlite3.OperationalError:
            pass
        else:
            self.connection.commit()
            import atexit
            atexit.register(os.unlink, SQLiteLockFile.testdb)

    def acquire(self, timeout=None):
        timeout = timeout if timeout is not None else self.timeout
        end_time = time.time()
        if timeout is not None and timeout > 0:
            end_time += timeout

        if timeout is None:
            wait = 0.1
        elif timeout <= 0:
            wait = 0
        else:
            wait = timeout / 10

        cursor = self.connection.cursor()

        while True:
            if not self.is_locked():
                # Not locked.  Try to lock it.
                cursor.execute("insert into locks"
                               "  (lock_file, unique_name)"
                               "  values"
                               "  (?, ?)",
                               (self.lock_file, self.unique_name))
                self.connection.commit()

                # Check to see if we are the only lock holder.
                cursor.execute("select * from locks"
                               "  where unique_name = ?",
                               (self.unique_name,))
                rows = cursor.fetchall()
                if len(rows) > 1:
                    # Nope.  Someone else got there.  Remove our lock.
                    cursor.execute("delete from locks"
                                   "  where unique_name = ?",
                                   (self.unique_name,))
                    self.connection.commit()
                else:
                    # Yup.  We're done, so go home.
                    return
            else:
                # Check to see if we are the only lock holder.
                cursor.execute("select * from locks"
                               "  where unique_name = ?",
                               (self.unique_name,))
                rows = cursor.fetchall()
                if len(rows) == 1:
                    # We're the locker, so go home.
                    return

            # Maybe we should wait a bit longer.
            if timeout is not None and time.time() > end_time:
                if timeout > 0:
                    # No more waiting.
                    raise LockTimeout("Timeout waiting to acquire"
                                      " lock for %s" %
                                      self.path)
                else:
                    # Someone else has the lock and we are impatient.
                    raise AlreadyLocked("%s is already locked" % self.path)

            # Well, okay.  We'll give it a bit longer.
            time.sleep(wait)

    def release(self):
        if not self.is_locked():
            raise NotLocked("%s is not locked" % self.path)
        if not self.i_am_locking():
            raise NotMyLock("%s is locked, but not by me (by %s)" %
                            (self.unique_name, self._who_is_locking()))
        cursor = self.connection.cursor()
        cursor.execute("delete from locks"
                       "  where unique_name = ?",
                       (self.unique_name,))
        self.connection.commit()

    def _who_is_locking(self):
        cursor = self.connection.cursor()
        cursor.execute("select unique_name from locks"
                       "  where lock_file = ?",
                       (self.lock_file,))
        return cursor.fetchone()[0]

    def is_locked(self):
        cursor = self.connection.cursor()
        cursor.execute("select * from locks"
                       "  where lock_file = ?",
                       (self.lock_file,))
        rows = cursor.fetchall()
        return not not rows

    def i_am_locking(self):
        cursor = self.connection.cursor()
        cursor.execute("select * from locks"
                       "  where lock_file = ?"
                       "    and unique_name = ?",
                       (self.lock_file, self.unique_name))
        return not not cursor.fetchall()

    def break_lock(self):
        cursor = self.connection.cursor()
        cursor.execute("delete from locks"
                       "  where lock_file = ?",
                       (self.lock_file,))
        self.connection.commit()
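

# Illustrative usage sketch only: the SQLite-backed lock follows the same
# acquire/release protocol; timeout=0 asks for the lock without waiting, and
# the path below is arbitrary.
def _example_usage():
    lock = SQLiteLockFile('/tmp/example.db', timeout=0)
    try:
        lock.acquire()
    except AlreadyLocked:
        return                       # somebody else holds the lock
    try:
        pass                         # critical section
    finally:
        lock.release()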
site-packages/pip/_vendor/lockfile/pidlockfile.py
# -*- coding: utf-8 -*-

# pidlockfile.py
#
# Copyright © 2008–2009 Ben Finney <ben+python@benfinney.id.au>
#
# This is free software: you may copy, modify, and/or distribute this work
# under the terms of the Python Software Foundation License, version 2 or
# later as published by the Python Software Foundation.
# No warranty expressed or implied. See the file LICENSE.PSF-2 for details.

""" Lockfile behaviour implemented via Unix PID files.
    """

from __future__ import absolute_import

import errno
import os
import time

from . import (LockBase, AlreadyLocked, LockFailed, NotLocked, NotMyLock,
               LockTimeout)


class PIDLockFile(LockBase):
    """ Lockfile implemented as a Unix PID file.

    The lock file is a normal file named by the attribute `path`.
    A lock's PID file contains a single line of text, containing
    the process ID (PID) of the process that acquired the lock.

    >>> lock = PIDLockFile('somefile')
    >>> lock = PIDLockFile('somefile')
    """

    def __init__(self, path, threaded=False, timeout=None):
        # pid lockfiles don't support threaded operation, so always force
        # False as the threaded arg.
        LockBase.__init__(self, path, False, timeout)
        self.unique_name = self.path

    def read_pid(self):
        """ Get the PID from the lock file.
            """
        return read_pid_from_pidfile(self.path)

    def is_locked(self):
        """ Test if the lock is currently held.

            The lock is held if the PID file for this lock exists.

            """
        return os.path.exists(self.path)

    def i_am_locking(self):
        """ Test if the lock is held by the current process.

        Returns ``True`` if the current process ID matches the
        number stored in the PID file.
        """
        return self.is_locked() and os.getpid() == self.read_pid()

    def acquire(self, timeout=None):
        """ Acquire the lock.

        Creates the PID file for this lock, or raises an error if
        the lock could not be acquired.
        """

        timeout = timeout if timeout is not None else self.timeout
        end_time = time.time()
        if timeout is not None and timeout > 0:
            end_time += timeout

        while True:
            try:
                write_pid_to_pidfile(self.path)
            except OSError as exc:
                if exc.errno == errno.EEXIST:
                    # The lock creation failed.  Maybe sleep a bit.
                    if timeout is not None and time.time() > end_time:
                        if timeout is not None and timeout > 0:
                            raise LockTimeout("Timeout waiting to acquire"
                                              " lock for %s" %
                                              self.path)
                        else:
                            raise AlreadyLocked("%s is already locked" %
                                                self.path)
                    time.sleep(timeout is not None and timeout / 10 or 0.1)
                else:
                    raise LockFailed("failed to create %s" % self.path)
            else:
                return

    def release(self):
        """ Release the lock.

            Removes the PID file to release the lock, or raises an
            error if the current process does not hold the lock.

            """
        if not self.is_locked():
            raise NotLocked("%s is not locked" % self.path)
        if not self.i_am_locking():
            raise NotMyLock("%s is locked, but not by me" % self.path)
        remove_existing_pidfile(self.path)

    def break_lock(self):
        """ Break an existing lock.

            Removes the PID file if it already exists, otherwise does
            nothing.

            """
        remove_existing_pidfile(self.path)


def read_pid_from_pidfile(pidfile_path):
    """ Read the PID recorded in the named PID file.

        Read and return the numeric PID recorded as text in the named
        PID file. If the PID file cannot be read, or if the content is
        not a valid PID, return ``None``.

        """
    pid = None
    try:
        pidfile = open(pidfile_path, 'r')
    except IOError:
        pass
    else:
        # According to the FHS 2.3 section on PID files in /var/run:
        #
        #   The file must consist of the process identifier in
        #   ASCII-encoded decimal, followed by a newline character.
        #
        #   Programs that read PID files should be somewhat flexible
        #   in what they accept; i.e., they should ignore extra
        #   whitespace, leading zeroes, absence of the trailing
        #   newline, or additional lines in the PID file.

        line = pidfile.readline().strip()
        try:
            pid = int(line)
        except ValueError:
            pass
        pidfile.close()

    return pid


def write_pid_to_pidfile(pidfile_path):
    """ Write the PID in the named PID file.

        Get the numeric process ID (“PID”) of the current process
        and write it to the named file as a line of text.

        """
    open_flags = (os.O_CREAT | os.O_EXCL | os.O_WRONLY)
    open_mode = 0o644
    pidfile_fd = os.open(pidfile_path, open_flags, open_mode)
    pidfile = os.fdopen(pidfile_fd, 'w')

    # According to the FHS 2.3 section on PID files in /var/run:
    #
    #   The file must consist of the process identifier in
    #   ASCII-encoded decimal, followed by a newline character. For
    #   example, if crond was process number 25, /var/run/crond.pid
    #   would contain three characters: two, five, and newline.

    pid = os.getpid()
    pidfile.write("%s\n" % pid)
    pidfile.close()


def remove_existing_pidfile(pidfile_path):
    """ Remove the named PID file if it exists.

        Removing a PID file that doesn't already exist puts us in the
        desired state, so we ignore the condition if the file does not
        exist.

        """
    try:
        os.remove(pidfile_path)
    except OSError as exc:
        if exc.errno == errno.ENOENT:
            pass
        else:
            raise
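

# Illustrative usage sketch only: a PID file lock guarding a single-instance
# script; the path is arbitrary and must be writable by the process.
def _example_usage():
    lock = PIDLockFile('/tmp/example.pid', timeout=0)
    try:
        lock.acquire()
    except (AlreadyLocked, LockFailed):
        return                       # another instance owns the PID file
    try:
        pass                         # read_pid_from_pidfile() names the owner
    finally:
        lock.release()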
site-packages/pip/_vendor/lockfile/mkdirlockfile.py
from __future__ import absolute_import, division

import time
import os
import sys
import errno

from . import (LockBase, LockFailed, NotLocked, NotMyLock, LockTimeout,
               AlreadyLocked)


class MkdirLockFile(LockBase):
    """Lock file by creating a directory."""
    def __init__(self, path, threaded=True, timeout=None):
        """
        >>> lock = MkdirLockFile('somefile')
        >>> lock = MkdirLockFile('somefile', threaded=False)
        """
        LockBase.__init__(self, path, threaded, timeout)
        # Lock file itself is a directory.  Place the unique file name into
        # it.
        self.unique_name = os.path.join(self.lock_file,
                                        "%s.%s%s" % (self.hostname,
                                                     self.tname,
                                                     self.pid))

    def acquire(self, timeout=None):
        timeout = timeout if timeout is not None else self.timeout
        end_time = time.time()
        if timeout is not None and timeout > 0:
            end_time += timeout

        if timeout is None:
            wait = 0.1
        else:
            wait = max(0, timeout / 10)

        while True:
            try:
                os.mkdir(self.lock_file)
            except OSError:
                err = sys.exc_info()[1]
                if err.errno == errno.EEXIST:
                    # Already locked.
                    if os.path.exists(self.unique_name):
                        # Already locked by me.
                        return
                    if timeout is not None and time.time() > end_time:
                        if timeout > 0:
                            raise LockTimeout("Timeout waiting to acquire"
                                              " lock for %s" %
                                              self.path)
                        else:
                            # Someone else has the lock.
                            raise AlreadyLocked("%s is already locked" %
                                                self.path)
                    time.sleep(wait)
                else:
                    # Couldn't create the lock for some other reason
                    raise LockFailed("failed to create %s" % self.lock_file)
            else:
                open(self.unique_name, "wb").close()
                return

    def release(self):
        if not self.is_locked():
            raise NotLocked("%s is not locked" % self.path)
        elif not os.path.exists(self.unique_name):
            raise NotMyLock("%s is locked, but not by me" % self.path)
        os.unlink(self.unique_name)
        os.rmdir(self.lock_file)

    def is_locked(self):
        return os.path.exists(self.lock_file)

    def i_am_locking(self):
        return (self.is_locked() and
                os.path.exists(self.unique_name))

    def break_lock(self):
        if os.path.exists(self.lock_file):
            for name in os.listdir(self.lock_file):
                os.unlink(os.path.join(self.lock_file, name))
            os.rmdir(self.lock_file)
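

# Illustrative usage sketch only: mkdir(2) is atomic, so the directory named
# <path>.lock acts as the lock; the path and timeout below are arbitrary.
def _example_usage():
    lock = MkdirLockFile('/tmp/example.txt', timeout=3)
    with lock:
        pass                         # /tmp/example.txt.lock exists here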
site-packages/pip/_vendor/lockfile/symlinklockfile.py
from __future__ import absolute_import

import os
import time

from . import (LockBase, NotLocked, NotMyLock, LockTimeout,
               AlreadyLocked)


class SymlinkLockFile(LockBase):
    """Lock access to a file using symlink(2)."""

    def __init__(self, path, threaded=True, timeout=None):
        # super(SymlinkLockFile).__init(...)
        LockBase.__init__(self, path, threaded, timeout)
        # split it back!
        self.unique_name = os.path.split(self.unique_name)[1]

    def acquire(self, timeout=None):
        # Hopefully unnecessary for symlink.
        # try:
        #     open(self.unique_name, "wb").close()
        # except IOError:
        #     raise LockFailed("failed to create %s" % self.unique_name)
        timeout = timeout if timeout is not None else self.timeout
        end_time = time.time()
        if timeout is not None and timeout > 0:
            end_time += timeout

        while True:
            # Try and create a symbolic link to it.
            try:
                os.symlink(self.unique_name, self.lock_file)
            except OSError:
                # Link creation failed.  Maybe we've double-locked?
                if self.i_am_locking():
                    # Linked to our unique name. Proceed.
                    return
                else:
                    # Otherwise the lock creation failed.
                    if timeout is not None and time.time() > end_time:
                        if timeout > 0:
                            raise LockTimeout("Timeout waiting to acquire"
                                              " lock for %s" %
                                              self.path)
                        else:
                            raise AlreadyLocked("%s is already locked" %
                                                self.path)
                    time.sleep(timeout / 10 if timeout is not None else 0.1)
            else:
                # Link creation succeeded.  We're good to go.
                return

    def release(self):
        if not self.is_locked():
            raise NotLocked("%s is not locked" % self.path)
        elif not self.i_am_locking():
            raise NotMyLock("%s is locked, but not by me" % self.path)
        os.unlink(self.lock_file)

    def is_locked(self):
        return os.path.islink(self.lock_file)

    def i_am_locking(self):
        return (os.path.islink(self.lock_file)
                and os.readlink(self.lock_file) == self.unique_name)

    def break_lock(self):
        if os.path.islink(self.lock_file):  # exists && link
            os.unlink(self.lock_file)
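

# Illustrative usage sketch only: the symlink target records which process
# and thread own the lock; the path and timeout below are arbitrary.
def _example_usage():
    lock = SymlinkLockFile('/tmp/example.txt', timeout=3)
    lock.acquire()
    try:
        pass                         # os.readlink() on the .lock names us
    finally:
        lock.release()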
site-packages/pip/_vendor/lockfile/__pycache__/mkdirlockfile.cpython-36.opt-1.pyc
site-packages/pip/_vendor/lockfile/__pycache__/__init__.cpython-36.pyc
    lockfile.mkdirlockfile module.
    r:)�sqlitelockfilezlockfile.sqlitelockfile)r%rJrEZSQLiteLockFile)rCrDrJrrrr0s
cs��fdd�}|S)aDecorator which enables locks for decorated function.

    Arguments:
     - path: path for lockfile.
     - timeout (optional): Timeout for acquiring lock.

     Usage:
         @locked('/var/run/myname', timeout=0)
         def myname(...):
             ...
    cstj�����fdd��}|S)Nc
s.t��d�}|j�z
�||�S|j�XdS)N)r)�FileLockrr)rC�kwargs�lock)�funcrrrr�wrapperHs

z&locked.<locals>.decor.<locals>.wrapper)�	functools�wraps)rNrO)rr)rNr�decorGszlocked.<locals>.decorr)rrrRr)rrrr;s
�linkr:)rF)rH)N))rZ
__future__rrPr'r)r,r;�hasattrZ
currentThreadrZThreadZgetNamer�__all__�	Exceptionrrrrr	r
rr�objectrrrEr
rrrr%rFZ_llfrGZLockFilerHZ_mlfrIrKrrrr�<module>4sF
-:
site-packages/pip/_vendor/lockfile/__pycache__/pidlockfile.cpython-36.pyc000064400000011243147511334610022431 0ustar003

���e��@stdZddlmZddlZddlZddlZddlmZmZm	Z	m
Z
mZmZGdd�de�Z
dd	�Zd
d�Zdd
�ZdS)z8 Lockfile behaviour implemented via Unix PID files.
    �)�absolute_importN�)�LockBase�
AlreadyLocked�
LockFailed�	NotLocked�	NotMyLock�LockTimeoutc@sLeZdZdZddd�Zdd�Zdd	�Zd
d�Zddd
�Zdd�Z	dd�Z
dS)�PIDLockFileaA Lockfile implemented as a Unix PID file.

    The lock file is a normal file named by the attribute `path`.
    A lock's PID file contains a single line of text, containing
    the process ID (PID) of the process that acquired the lock.

    >>> lock = PIDLockFile('somefile')
    >>> lock = PIDLockFile('somefile')
    FNcCstj||d|�|j|_dS)NF)r�__init__�pathZunique_name)�selfrZthreaded�timeout�r�!/usr/lib/python3.6/pidlockfile.pyr$szPIDLockFile.__init__cCs
t|j�S)z- Get the PID from the lock file.
            )�read_pid_from_pidfiler)r
rrr�read_pid*szPIDLockFile.read_pidcCstjj|j�S)zv Test if the lock is currently held.

            The lock is held if the PID file for this lock exists.

            )�osr�exists)r
rrr�	is_locked/szPIDLockFile.is_lockedcCs|j�otj�|j�kS)z� Test if the lock is held by the current process.

        Returns ``True`` if the current process ID matches the
        number stored in the PID file.
        )rr�getpidr)r
rrr�i_am_locking7szPIDLockFile.i_am_lockingcCs�|dk	r|n|j}tj�}|dk	r2|dkr2||7}x�yt|j�Wn�tk
r�}zv|jtjkr�tj�|kr�|dk	r�|dkr�td|j��ntd|j��tj	|dk	r�|dp�d�nt
d|j��WYdd}~Xq4XdSq4WdS)z� Acquire the lock.

        Creates the PID file for this lock, or raises an error if
        the lock could not be acquired.
        Nrz&Timeout waiting to acquire lock for %sz%s is already locked�
g�������?zfailed to create %s)r�time�write_pid_to_pidfiler�OSError�errnoZEEXISTr	rZsleepr)r
rZend_time�excrrr�acquire?s$
 zPIDLockFile.acquirecCs:|j�std|j��|j�s,td|j��t|j�dS)z� Release the lock.

            Removes the PID file to release the lock, or raises an
            error if the current process does not hold the lock.

            z%s is not lockedz%s is locked, but not by meN)rrrrr�remove_existing_pidfile)r
rrr�release_s
zPIDLockFile.releasecCst|j�dS)z� Break an existing lock.

            Removes the PID file if it already exists, otherwise does
            nothing.

            N)rr)r
rrr�
break_locklszPIDLockFile.break_lock)FN)N)�__name__�
__module__�__qualname__�__doc__rrrrrr r!rrrrr
s	

 
r
cCsbd}yt|d�}Wntk
r&Yn8X|j�j�}yt|�}Wntk
rTYnX|j�|S)z� Read the PID recorded in the named PID file.

        Read and return the numeric PID recorded as text in the named
        PID file. If the PID file cannot be read, or if the content is
        not a valid PID, return ``None``.

        N�r)�open�IOError�readline�strip�int�
ValueError�close)�pidfile_path�pid�pidfile�linerrrrvsrcCsRtjtjBtjB}d}tj|||�}tj|d�}tj�}|jd|�|j�dS)u� Write the PID in the named PID file.

        Get the numeric process ID (“PID”) of the current process
        and write it to the named file as a line of text.

        i��wz%s
N)	r�O_CREAT�O_EXCL�O_WRONLYr'�fdopenr�writer-)r.Z
open_flagsZ	open_modeZ
pidfile_fdr0r/rrrr�s	rcCsFytj|�Wn2tk
r@}z|jtjkr.n�WYdd}~XnXdS)z� Remove the named PID file if it exists.

        Removing a PID file that doesn't already exist puts us in the
        desired state, so we ignore the condition if the file does not
        exist.

        N)r�removerr�ENOENT)r.rrrrr�sr)r%Z
__future__rrrr�rrrrrr	r
rrrrrrr�<module>
s ]"site-packages/pip/_vendor/lockfile/__pycache__/mkdirlockfile.cpython-36.pyc000064400000005013147511334610022761 0ustar003

���e�@sdddlmZmZddlZddlZddlZddlZddlmZm	Z	m
Z
mZmZm
Z
Gdd�de�ZdS)�)�absolute_import�divisionN�)�LockBase�
LockFailed�	NotLocked�	NotMyLock�LockTimeout�
AlreadyLockedc@sDeZdZdZddd�Zddd�Zdd	�Zd
d�Zdd
�Zdd�Z	dS)�
MkdirLockFilez"Lock file by creating a directory.TNcCs6tj||||�tjj|jd|j|j|jf�|_	dS)zs
        >>> lock = MkdirLockFile('somefile')
        >>> lock = MkdirLockFile('somefile', threaded=False)
        z%s.%s%sN)
r�__init__�os�path�join�	lock_fileZhostnameZtname�pid�unique_name)�selfrZthreaded�timeout�r�#/usr/lib/python3.6/mkdirlockfile.pyrs

zMkdirLockFile.__init__cCs|dk	r|n|j}tj�}|dk	r2|dkr2||7}|dkr@d}ntd|d�}x�ytj|j�Wn�tk
r�tj�d}|j	t	j
kr�tjj|j
�r�dS|dk	r�tj�|kr�|dkr�td|j��ntd|j��tj|�ntd|j��YqPXt|j
d�j�dSqPWdS)	Nrg�������?�
rz&Timeout waiting to acquire lock for %sz%s is already lockedzfailed to create %s�wb)r�time�maxr
�mkdirr�OSError�sys�exc_info�errnoZEEXISTr�existsrr	r
Zsleepr�open�close)rrZend_time�wait�errrrr�acquires2
zMkdirLockFile.acquirecCsP|j�std|j��ntjj|j�s4td|j��tj|j�tj|j	�dS)Nz%s is not lockedz%s is locked, but not by me)
�	is_lockedrrr
r rr�unlink�rmdirr)rrrr�releaseAszMkdirLockFile.releasecCstjj|j�S)N)r
rr r)rrrrr&IszMkdirLockFile.is_lockedcCs|j�otjj|j�S)N)r&r
rr r)rrrr�i_am_lockingLszMkdirLockFile.i_am_lockingcCsJtjj|j�rFx*tj|j�D]}tjtjj|j|��qWtj|j�dS)N)r
rr r�listdirr'rr()r�namerrr�
break_lockPszMkdirLockFile.break_lock)TN)N)
�__name__�
__module__�__qualname__�__doc__rr%r)r&r*r-rrrrrs

&r)Z
__future__rrrr
rr�rrrrr	r
rrrrr�<module>s site-packages/pip/_vendor/lockfile/__pycache__/sqlitelockfile.cpython-36.pyc000064400000007126147511334610023163 0ustar003

���e��@srddlmZmZddlZddlZyeWnek
r@eZYnXddlm	Z	m
Z
mZmZm
Z
Gdd�de	�ZdS)�)�absolute_import�divisionN�)�LockBase�	NotLocked�	NotMyLock�LockTimeout�
AlreadyLockedc@sPeZdZdZdZddd�Zddd�Zdd	�Zd
d�Zdd
�Z	dd�Z
dd�ZdS)�SQLiteLockFilezDemonstrate SQL-based locking.NTc
Cs�tj||||�t|j�|_t|j�|_tjdkrdddl}|j�\}}t	j
|�t	j|�~~|t_ddl}|j
tj�|_|jj�}y|jd�Wn|jk
r�Yn$X|jj�ddl}	|	jt	jtj�dS)zu
        >>> lock = SQLiteLockFile('somefile')
        >>> lock = SQLiteLockFile('somefile', threaded=False)
        NrzGcreate table locks(   lock_file varchar(32),   unique_name varchar(32)))r�__init__�unicode�	lock_file�unique_namer
�testdb�tempfileZmkstemp�os�close�unlink�sqlite3Zconnect�
connection�cursor�executeZOperationalError�commit�atexit�register)
�self�pathZthreaded�timeoutrZ_fdrr�cr�r�$/usr/lib/python3.6/sqlitelockfile.pyrs(




zSQLiteLockFile.__init__cCsH|dk	r|n|j}tj�}|dk	r2|dkr2||7}|dkr@d}n|dkrNd}n|d}|jj�}x�|j�s�|jd|j|jf�|jj�|jd|jf�|j	�}t
|�dkr�|jd|jf�|jj�q�dSn(|jd|jf�|j	�}t
|�dkr�dS|dk	�r6tj�|k�r6|dk�r(td|j��nt
d	|j��tj|�qbWdS)
Nrg�������?�
z;insert into locks  (lock_file, unique_name)  values  (?, ?)z*select * from locks  where unique_name = ?rz(delete from locks  where unique_name = ?z&Timeout waiting to acquire lock for %sz%s is already locked)r�timerr�	is_lockedrr
rr�fetchall�lenrrr	Zsleep)rrZend_time�waitr�rowsrrr �acquire5sD





zSQLiteLockFile.acquirecCs\|j�std|j��|j�s4td|j|j�f��|jj�}|j	d|jf�|jj
�dS)Nz%s is not lockedz#%s is locked, but not by me (by %s)z(delete from locks  where unique_name = ?)r#rr�i_am_lockingrr�_who_is_lockingrrrr)rrrrr �releasets

zSQLiteLockFile.releasecCs&|jj�}|jd|jf�|j�dS)Nz2select unique_name from locks  where lock_file = ?r)rrrr
Zfetchone)rrrrr r*�s

zSQLiteLockFile._who_is_lockingcCs*|jj�}|jd|jf�|j�}|S)Nz(select * from locks  where lock_file = ?)rrrr
r$)rrr'rrr r#�s


zSQLiteLockFile.is_lockedcCs*|jj�}|jd|j|jf�|j�S)Nz?select * from locks  where lock_file = ?    and unique_name = ?)rrrr
rr$)rrrrr r)�s
zSQLiteLockFile.i_am_lockingcCs(|jj�}|jd|jf�|jj�dS)Nz&delete from locks  where lock_file = ?)rrrr
r)rrrrr �
break_lock�s

zSQLiteLockFile.break_lock)TN)N)�__name__�
__module__�__qualname__�__doc__rrr(r+r*r#r)r,rrrr r
s
"
?r
)Z
__future__rrr"rr�	NameError�str�rrrrr	r
rrrr �<module>s
site-packages/pip/_vendor/lockfile/__pycache__/linklockfile.cpython-36.pyc000064400000004241147511334610022612 0ustar003

���e\
�@sPddlmZddlZddlZddlmZmZmZmZm	Z	m
Z
Gdd�de�ZdS)�)�absolute_importN�)�LockBase�
LockFailed�	NotLocked�	NotMyLock�LockTimeout�
AlreadyLockedc@s:eZdZdZd
dd�Zdd�Zdd�Zd	d
�Zdd�ZdS)�LinkLockFilez�Lock access to a file using atomic property of link(2).

    >>> lock = LinkLockFile('somefile')
    >>> lock = LinkLockFile('somefile', threaded=False)
    NcCs"yt|jd�j�Wn"tk
r6td|j��YnX|dk	rD|n|j}tj�}|dk	rj|dkrj||7}x�ytj|j|j	�Wn�t
k
�rtj|j�j}|dkr�dS|dk	r�tj�|kr�tj
|j�|dkr�td|j��ntd|j��tj|dk	�r
|d�pd�YqlXdSqlWdS)	N�wbzfailed to create %sr�z&Timeout waiting to acquire lock for %sz%s is already locked�
g�������?)�open�unique_name�close�IOErrorr�timeout�time�os�link�	lock_file�OSError�stat�st_nlink�unlinkr�pathr	Zsleep)�selfrZend_timeZnlinks�r�"/usr/lib/python3.6/linklockfile.py�acquires0
$zLinkLockFile.acquirecCsP|j�std|j��ntjj|j�s4td|j��tj|j�tj|j�dS)Nz%s is not lockedz%s is locked, but not by me)	�	is_lockedrrr�existsrrrr)rrrr�release7szLinkLockFile.releasecCstjj|j�S)N)rrr!r)rrrrr ?szLinkLockFile.is_lockedcCs(|j�o&tjj|j�o&tj|j�jdkS)Nr)r rrr!rrr)rrrr�i_am_lockingBszLinkLockFile.i_am_lockingcCstjj|j�rtj|j�dS)N)rrr!rr)rrrr�
break_lockGszLinkLockFile.break_lock)N)	�__name__�
__module__�__qualname__�__doc__rr"r r#r$rrrrr

s
&r
)Z
__future__rrr�rrrrrr	r
rrrr�<module>s site-packages/pip/_vendor/lockfile/__pycache__/__init__.cpython-36.opt-1.pyc000064400000023146147511334610022647 0ustar003

���e�$�
@s�dZddlmZddlZddlZddlZddlZddlZeed�sJej	e_
eejd�sbejjej_
dddd	d
ddd
dddddg
ZGdd�de�ZGdd�de�ZGdd�de�ZGdd	�d	e�ZGdd
�d
e�ZGdd�de�ZGdd�de�ZGdd
�d
e�ZGdd�de�ZGdd�de�Zdd�Zd d�Zd!d�Zd"d�Zd(d#d�Zeed$��rjd%d&l m!Z"e"j#Z$nd%d'l m%Z&e&j'Z$e$Z(dS))a
lockfile.py - Platform-independent advisory file locks.

Requires Python 2.5 unless you apply 2.4.diff
Locking is done on a per-thread basis instead of a per-process basis.

Usage:

>>> lock = LockFile('somefile')
>>> try:
...     lock.acquire()
... except AlreadyLocked:
...     print 'somefile', 'is locked already.'
... except LockFailed:
...     print 'somefile', 'can\'t be locked.'
... else:
...     print 'got lock'
got lock
>>> print lock.is_locked()
True
>>> lock.release()

>>> lock = LockFile('somefile')
>>> print lock.is_locked()
False
>>> with lock:
...    print lock.is_locked()
True
>>> print lock.is_locked()
False

>>> lock = LockFile('somefile')
>>> # It is okay to lock twice from the same thread...
>>> with lock:
...     lock.acquire()
...
>>> # Though no counter is kept, so you can't unlock multiple times...
>>> print lock.is_locked()
False

Exceptions:

    Error - base class for other exceptions
        LockError - base class for all locking exceptions
            AlreadyLocked - Another thread or process already holds the lock
            LockFailed - Lock failed for some other reason
        UnlockError - base class for all unlocking exceptions
            AlreadyUnlocked - File was not locked.
            NotMyLock - File was locked but not by the current thread/process
�)�absolute_importN�current_thread�get_name�Error�	LockError�LockTimeout�
AlreadyLocked�
LockFailed�UnlockError�	NotLocked�	NotMyLock�LinkFileLock�
MkdirFileLock�SQLiteFileLock�LockBase�lockedc@seZdZdZdS)rzw
    Base class for other exceptions.

    >>> try:
    ...   raise Error
    ... except Exception:
    ...   pass
    N)�__name__�
__module__�__qualname__�__doc__�rr�/usr/lib/python3.6/__init__.pyrJsc@seZdZdZdS)rz�
    Base class for error arising from attempts to acquire the lock.

    >>> try:
    ...   raise LockError
    ... except Error:
    ...   pass
    N)rrrrrrrrrVsc@seZdZdZdS)rz�Raised when lock creation fails within a user-defined period of time.

    >>> try:
    ...   raise LockTimeout
    ... except LockError:
    ...   pass
    N)rrrrrrrrrbsc@seZdZdZdS)rz�Some other thread/process is locking the file.

    >>> try:
    ...   raise AlreadyLocked
    ... except LockError:
    ...   pass
    N)rrrrrrrrrmsc@seZdZdZdS)r	z�Lock file creation failed for some other reason.

    >>> try:
    ...   raise LockFailed
    ... except LockError:
    ...   pass
    N)rrrrrrrrr	xsc@seZdZdZdS)r
z�
    Base class for errors arising from attempts to release the lock.

    >>> try:
    ...   raise UnlockError
    ... except Error:
    ...   pass
    N)rrrrrrrrr
�sc@seZdZdZdS)rz�Raised when an attempt is made to unlock an unlocked file.

    >>> try:
    ...   raise NotLocked
    ... except UnlockError:
    ...   pass
    N)rrrrrrrrr�sc@seZdZdZdS)rz�Raised when an attempt is made to unlock a file someone else locked.

    >>> try:
    ...   raise NotMyLock
    ... except UnlockError:
    ...   pass
    N)rrrrrrrrr�sc@s>eZdZdd�Zddd�Zdd�Zdd	�Zd
d�Zdd
�ZdS)�_SharedBasecCs
||_dS)N)�path)�selfrrrr�__init__�sz_SharedBase.__init__NcCstd��dS)a�
        Acquire the lock.

        * If timeout is omitted (or None), wait forever trying to lock the
          file.

        * If timeout > 0, try to acquire the lock for that many seconds.  If
          the lock period expires and the file is still locked, raise
          LockTimeout.

        * If timeout <= 0, raise AlreadyLocked immediately if the file is
          already locked.
        zimplement in subclassN)�NotImplemented)r�timeoutrrr�acquire�sz_SharedBase.acquirecCstd��dS)zX
        Release the lock.

        If the file is not locked, raise NotLocked.
        zimplement in subclassN)r)rrrr�release�sz_SharedBase.releasecCs|j�|S)z*
        Context manager support.
        )r)rrrr�	__enter__�sz_SharedBase.__enter__cGs|j�dS)z*
        Context manager support.
        N)r)rZ_excrrr�__exit__�sz_SharedBase.__exit__cCsd|jj|jfS)Nz<%s: %r>)�	__class__rr)rrrr�__repr__�sz_SharedBase.__repr__)N)	rrrrrrr r!r#rrrrr�s
rcsBeZdZdZd�fdd�	Zdd�Zdd	�Zd
d�Zdd
�Z�Z	S)rz.Base class for platform-specific lock classes.TNcs�tt|�j|�tjj|�d|_tj�|_	tj
�|_|rbtj
�}t|dt|��}d|d@|_nd|_tjj|j�}tjj|d|j	|j|jt|j�f�|_||_dS)zi
        >>> lock = LockBase('somefile')
        >>> lock = LockBase('somefile', threaded=False)
        z.lock�identz-%xl���z	%s%s.%s%sN)�superrr�osr�abspathZ	lock_file�socketZgethostnameZhostname�getpid�pid�	threadingr�getattr�hashZtname�dirname�join�unique_namer)rr�threadedr�tr$r/)r"rrr�s 

	zLockBase.__init__cCstd��dS)z9
        Tell whether or not the file is locked.
        zimplement in subclassN)r)rrrr�	is_locked�szLockBase.is_lockedcCstd��dS)zA
        Return True if this object is locking the file.
        zimplement in subclassN)r)rrrr�i_am_locking�szLockBase.i_am_lockingcCstd��dS)zN
        Remove a lock.  Useful if a locking thread failed to unlock.
        zimplement in subclassN)r)rrrr�
break_lockszLockBase.break_lockcCsd|jj|j|jfS)Nz<%s: %r -- %r>)r"rr1r)rrrrr#szLockBase.__repr__)TN)
rrrrrr4r5r6r#�
__classcell__rr)r"rr�s!cOsRtjd|tdd�t|dt�s.|dd�}t|�dkrH|rHd|d<|||�S)Nz1Import from %s module instead of lockfile package�)�
stacklevelr�Tr2)�warnings�warn�DeprecationWarning�
isinstance�str�len)�cls�mod�args�kwdsrrr�
_fl_helpers

rEcOs ddlm}t|jdf|�|�S)z�Factory function provided for backwards compatibility.

    Do not use in new code.  Instead, import LinkLockFile from the
    lockfile.linklockfile module.
    r:)�linklockfilezlockfile.linklockfile)r%rFrE�LinkLockFile)rCrDrFrrrr
s
cOs ddlm}t|jdf|�|�S)z�Factory function provided for backwards compatibility.

    Do not use in new code.  Instead, import MkdirLockFile from the
    lockfile.mkdirlockfile module.
    r:)�
mkdirlockfilezlockfile.mkdirlockfile)r%rHrE�
MkdirLockFile)rCrDrHrrrr%s
cOs ddlm}t|jdf|�|�S)z�Factory function provided for backwards compatibility.

    Do not use in new code.  Instead, import SQLiteLockFile from the
    lockfile.mkdirlockfile module.
    r:)�sqlitelockfilezlockfile.sqlitelockfile)r%rJrEZSQLiteLockFile)rCrDrJrrrr0s
cs��fdd�}|S)aDecorator which enables locks for decorated function.

    Arguments:
     - path: path for lockfile.
     - timeout (optional): Timeout for acquiring lock.

     Usage:
         @locked('/var/run/myname', timeout=0)
         def myname(...):
             ...
    cstj�����fdd��}|S)Nc
s.t��d�}|j�z
�||�S|j�XdS)N)r)�FileLockrr)rC�kwargs�lock)�funcrrrr�wrapperHs

z&locked.<locals>.decor.<locals>.wrapper)�	functools�wraps)rNrO)rr)rNr�decorGszlocked.<locals>.decorr)rrrRr)rrrr;s
�linkr:)rF)rH)N))rZ
__future__rrPr'r)r,r;�hasattrZ
currentThreadrZThreadZgetNamer�__all__�	Exceptionrrrrr	r
rr�objectrrrEr
rrrr%rFZ_llfrGZLockFilerHZ_mlfrIrKrrrr�<module>4sF
-:
site-packages/pip/_vendor/lockfile/__pycache__/sqlitelockfile.cpython-36.opt-1.pyc000064400000007126147511334610024122 0ustar003

���e��@srddlmZmZddlZddlZyeWnek
r@eZYnXddlm	Z	m
Z
mZmZm
Z
Gdd�de	�ZdS)�)�absolute_import�divisionN�)�LockBase�	NotLocked�	NotMyLock�LockTimeout�
AlreadyLockedc@sPeZdZdZdZddd�Zddd�Zdd	�Zd
d�Zdd
�Z	dd�Z
dd�ZdS)�SQLiteLockFilezDemonstrate SQL-based locking.NTc
Cs�tj||||�t|j�|_t|j�|_tjdkrdddl}|j�\}}t	j
|�t	j|�~~|t_ddl}|j
tj�|_|jj�}y|jd�Wn|jk
r�Yn$X|jj�ddl}	|	jt	jtj�dS)zu
        >>> lock = SQLiteLockFile('somefile')
        >>> lock = SQLiteLockFile('somefile', threaded=False)
        NrzGcreate table locks(   lock_file varchar(32),   unique_name varchar(32)))r�__init__�unicode�	lock_file�unique_namer
�testdb�tempfileZmkstemp�os�close�unlink�sqlite3Zconnect�
connection�cursor�executeZOperationalError�commit�atexit�register)
�self�pathZthreaded�timeoutrZ_fdrr�cr�r�$/usr/lib/python3.6/sqlitelockfile.pyrs(




zSQLiteLockFile.__init__cCsH|dk	r|n|j}tj�}|dk	r2|dkr2||7}|dkr@d}n|dkrNd}n|d}|jj�}x�|j�s�|jd|j|jf�|jj�|jd|jf�|j	�}t
|�dkr�|jd|jf�|jj�q�dSn(|jd|jf�|j	�}t
|�dkr�dS|dk	�r6tj�|k�r6|dk�r(td|j��nt
d	|j��tj|�qbWdS)
Nrg�������?�
z;insert into locks  (lock_file, unique_name)  values  (?, ?)z*select * from locks  where unique_name = ?rz(delete from locks  where unique_name = ?z&Timeout waiting to acquire lock for %sz%s is already locked)r�timerr�	is_lockedrr
rr�fetchall�lenrrr	Zsleep)rrZend_time�waitr�rowsrrr �acquire5sD





zSQLiteLockFile.acquirecCs\|j�std|j��|j�s4td|j|j�f��|jj�}|j	d|jf�|jj
�dS)Nz%s is not lockedz#%s is locked, but not by me (by %s)z(delete from locks  where unique_name = ?)r#rr�i_am_lockingrr�_who_is_lockingrrrr)rrrrr �releasets

zSQLiteLockFile.releasecCs&|jj�}|jd|jf�|j�dS)Nz2select unique_name from locks  where lock_file = ?r)rrrr
Zfetchone)rrrrr r*�s

zSQLiteLockFile._who_is_lockingcCs*|jj�}|jd|jf�|j�}|S)Nz(select * from locks  where lock_file = ?)rrrr
r$)rrr'rrr r#�s


zSQLiteLockFile.is_lockedcCs*|jj�}|jd|j|jf�|j�S)Nz?select * from locks  where lock_file = ?    and unique_name = ?)rrrr
rr$)rrrrr r)�s
zSQLiteLockFile.i_am_lockingcCs(|jj�}|jd|jf�|jj�dS)Nz&delete from locks  where lock_file = ?)rrrr
r)rrrrr �
break_lock�s

zSQLiteLockFile.break_lock)TN)N)�__name__�
__module__�__qualname__�__doc__rrr(r+r*r#r)r,rrrr r
s
"
?r
)Z
__future__rrr"rr�	NameError�str�rrrrr	r
rrrr �<module>s
site-packages/pip/_vendor/lockfile/__pycache__/symlinklockfile.cpython-36.opt-1.pyc000064400000004056147511334610024306 0ustar003

���e8
�@sLddlmZddlZddlZddlmZmZmZmZm	Z	Gdd�de�Z
dS)�)�absolute_importN�)�LockBase�	NotLocked�	NotMyLock�LockTimeout�
AlreadyLockedc@sDeZdZdZddd�Zddd�Zdd	�Zd
d�Zdd
�Zdd�Z	dS)�SymlinkLockFilez'Lock access to a file using symlink(2).TNcCs(tj||||�tjj|j�d|_dS)Nr)r�__init__�os�path�split�unique_name)�selfrZthreaded�timeout�r�%/usr/lib/python3.6/symlinklockfile.pyr

szSymlinkLockFile.__init__cCs�|dk	r|n|j}tj�}|dk	r2|dkr2||7}x�ytj|j|j�Wnttk
r�|j�rddS|dk	r�tj�|kr�|dkr�td|j	��nt
d|j	��tj|dk	r�|dnd�Yq4XdSq4WdS)Nrz&Timeout waiting to acquire lock for %sz%s is already locked�
g�������?)r�timer�symlinkr�	lock_file�OSError�i_am_lockingrrrZsleep)rrZend_timerrr�acquires$
 zSymlinkLockFile.acquirecCs>|j�std|j��n|j�s.td|j��tj|j�dS)Nz%s is not lockedz%s is locked, but not by me)�	is_lockedrrrrr�unlinkr)rrrr�release6s
zSymlinkLockFile.releasecCstjj|j�S)N)rr�islinkr)rrrrr=szSymlinkLockFile.is_lockedcCs tjj|j�otj|j�|jkS)N)rrrr�readlinkr)rrrrr@szSymlinkLockFile.i_am_lockingcCstjj|j�rtj|j�dS)N)rrrrr)rrrr�
break_lockDszSymlinkLockFile.break_lock)TN)N)
�__name__�
__module__�__qualname__�__doc__r
rrrrrrrrrr	
s

#r	)Z
__future__rrr�rrrrrr	rrrr�<module>ssite-packages/pip/_vendor/lockfile/__pycache__/symlinklockfile.cpython-36.pyc000064400000004056147511334610023347 0ustar003

���e8
�@sLddlmZddlZddlZddlmZmZmZmZm	Z	Gdd�de�Z
dS)�)�absolute_importN�)�LockBase�	NotLocked�	NotMyLock�LockTimeout�
AlreadyLockedc@sDeZdZdZddd�Zddd�Zdd	�Zd
d�Zdd
�Zdd�Z	dS)�SymlinkLockFilez'Lock access to a file using symlink(2).TNcCs(tj||||�tjj|j�d|_dS)Nr)r�__init__�os�path�split�unique_name)�selfrZthreaded�timeout�r�%/usr/lib/python3.6/symlinklockfile.pyr

szSymlinkLockFile.__init__cCs�|dk	r|n|j}tj�}|dk	r2|dkr2||7}x�ytj|j|j�Wnttk
r�|j�rddS|dk	r�tj�|kr�|dkr�td|j	��nt
d|j	��tj|dk	r�|dnd�Yq4XdSq4WdS)Nrz&Timeout waiting to acquire lock for %sz%s is already locked�
g�������?)r�timer�symlinkr�	lock_file�OSError�i_am_lockingrrrZsleep)rrZend_timerrr�acquires$
 zSymlinkLockFile.acquirecCs>|j�std|j��n|j�s.td|j��tj|j�dS)Nz%s is not lockedz%s is locked, but not by me)�	is_lockedrrrrr�unlinkr)rrrr�release6s
zSymlinkLockFile.releasecCstjj|j�S)N)rr�islinkr)rrrrr=szSymlinkLockFile.is_lockedcCs tjj|j�otj|j�|jkS)N)rrrr�readlinkr)rrrrr@szSymlinkLockFile.i_am_lockingcCstjj|j�rtj|j�dS)N)rrrrr)rrrr�
break_lockDszSymlinkLockFile.break_lock)TN)N)
�__name__�
__module__�__qualname__�__doc__r
rrrrrrrrrr	
s

#r	)Z
__future__rrr�rrrrrr	rrrr�<module>ssite-packages/pip/_vendor/lockfile/__pycache__/linklockfile.cpython-36.opt-1.pyc000064400000004241147511334610023551 0ustar003

���e\
�@sPddlmZddlZddlZddlmZmZmZmZm	Z	m
Z
Gdd�de�ZdS)�)�absolute_importN�)�LockBase�
LockFailed�	NotLocked�	NotMyLock�LockTimeout�
AlreadyLockedc@s:eZdZdZd
dd�Zdd�Zdd�Zd	d
�Zdd�ZdS)�LinkLockFilez�Lock access to a file using atomic property of link(2).

    >>> lock = LinkLockFile('somefile')
    >>> lock = LinkLockFile('somefile', threaded=False)
    NcCs"yt|jd�j�Wn"tk
r6td|j��YnX|dk	rD|n|j}tj�}|dk	rj|dkrj||7}x�ytj|j|j	�Wn�t
k
�rtj|j�j}|dkr�dS|dk	r�tj�|kr�tj
|j�|dkr�td|j��ntd|j��tj|dk	�r
|d�pd�YqlXdSqlWdS)	N�wbzfailed to create %sr�z&Timeout waiting to acquire lock for %sz%s is already locked�
g�������?)�open�unique_name�close�IOErrorr�timeout�time�os�link�	lock_file�OSError�stat�st_nlink�unlinkr�pathr	Zsleep)�selfrZend_timeZnlinks�r�"/usr/lib/python3.6/linklockfile.py�acquires0
$zLinkLockFile.acquirecCsP|j�std|j��ntjj|j�s4td|j��tj|j�tj|j�dS)Nz%s is not lockedz%s is locked, but not by me)	�	is_lockedrrr�existsrrrr)rrrr�release7szLinkLockFile.releasecCstjj|j�S)N)rrr!r)rrrrr ?szLinkLockFile.is_lockedcCs(|j�o&tjj|j�o&tj|j�jdkS)Nr)r rrr!rrr)rrrr�i_am_lockingBszLinkLockFile.i_am_lockingcCstjj|j�rtj|j�dS)N)rrr!rr)rrrr�
break_lockGszLinkLockFile.break_lock)N)	�__name__�
__module__�__qualname__�__doc__rr"r r#r$rrrrr

s
&r
)Z
__future__rrr�rrrrrr	r
rrrr�<module>s site-packages/pip/_vendor/lockfile/__pycache__/pidlockfile.cpython-36.opt-1.pyc000064400000011243147511334610023370 0ustar003

���e��@stdZddlmZddlZddlZddlZddlmZmZm	Z	m
Z
mZmZGdd�de�Z
dd	�Zd
d�Zdd
�ZdS)z8 Lockfile behaviour implemented via Unix PID files.
    �)�absolute_importN�)�LockBase�
AlreadyLocked�
LockFailed�	NotLocked�	NotMyLock�LockTimeoutc@sLeZdZdZddd�Zdd�Zdd	�Zd
d�Zddd
�Zdd�Z	dd�Z
dS)�PIDLockFileaA Lockfile implemented as a Unix PID file.

    The lock file is a normal file named by the attribute `path`.
    A lock's PID file contains a single line of text, containing
    the process ID (PID) of the process that acquired the lock.

    >>> lock = PIDLockFile('somefile')
    >>> lock = PIDLockFile('somefile')
    FNcCstj||d|�|j|_dS)NF)r�__init__�pathZunique_name)�selfrZthreaded�timeout�r�!/usr/lib/python3.6/pidlockfile.pyr$szPIDLockFile.__init__cCs
t|j�S)z- Get the PID from the lock file.
            )�read_pid_from_pidfiler)r
rrr�read_pid*szPIDLockFile.read_pidcCstjj|j�S)zv Test if the lock is currently held.

            The lock is held if the PID file for this lock exists.

            )�osr�exists)r
rrr�	is_locked/szPIDLockFile.is_lockedcCs|j�otj�|j�kS)z� Test if the lock is held by the current process.

        Returns ``True`` if the current process ID matches the
        number stored in the PID file.
        )rr�getpidr)r
rrr�i_am_locking7szPIDLockFile.i_am_lockingcCs�|dk	r|n|j}tj�}|dk	r2|dkr2||7}x�yt|j�Wn�tk
r�}zv|jtjkr�tj�|kr�|dk	r�|dkr�td|j��ntd|j��tj	|dk	r�|dp�d�nt
d|j��WYdd}~Xq4XdSq4WdS)z� Acquire the lock.

        Creates the PID file for this lock, or raises an error if
        the lock could not be acquired.
        Nrz&Timeout waiting to acquire lock for %sz%s is already locked�
g�������?zfailed to create %s)r�time�write_pid_to_pidfiler�OSError�errnoZEEXISTr	rZsleepr)r
rZend_time�excrrr�acquire?s$
 zPIDLockFile.acquirecCs:|j�std|j��|j�s,td|j��t|j�dS)z� Release the lock.

            Removes the PID file to release the lock, or raises an
            error if the current process does not hold the lock.

            z%s is not lockedz%s is locked, but not by meN)rrrrr�remove_existing_pidfile)r
rrr�release_s
zPIDLockFile.releasecCst|j�dS)z� Break an existing lock.

            Removes the PID file if it already exists, otherwise does
            nothing.

            N)rr)r
rrr�
break_locklszPIDLockFile.break_lock)FN)N)�__name__�
__module__�__qualname__�__doc__rrrrrr r!rrrrr
s	

 
r
cCsbd}yt|d�}Wntk
r&Yn8X|j�j�}yt|�}Wntk
rTYnX|j�|S)z� Read the PID recorded in the named PID file.

        Read and return the numeric PID recorded as text in the named
        PID file. If the PID file cannot be read, or if the content is
        not a valid PID, return ``None``.

        N�r)�open�IOError�readline�strip�int�
ValueError�close)�pidfile_path�pid�pidfile�linerrrrvsrcCsRtjtjBtjB}d}tj|||�}tj|d�}tj�}|jd|�|j�dS)u� Write the PID in the named PID file.

        Get the numeric process ID (“PID”) of the current process
        and write it to the named file as a line of text.

        i��wz%s
N)	r�O_CREAT�O_EXCL�O_WRONLYr'�fdopenr�writer-)r.Z
open_flagsZ	open_modeZ
pidfile_fdr0r/rrrr�s	rcCsFytj|�Wn2tk
r@}z|jtjkr.n�WYdd}~XnXdS)z� Remove the named PID file if it exists.

        Removing a PID file that doesn't already exist puts us in the
        desired state, so we ignore the condition if the file does not
        exist.

        N)r�removerr�ENOENT)r.rrrrr�sr)r%Z
__future__rrrr�rrrrrr	r
rrrrrrr�<module>
s ]"site-packages/pip/_vendor/pyparsing.py000064400000665653147511334610014130 0ustar00# module pyparsing.py
#
# Copyright (c) 2003-2016  Paul T. McGuire
#
# Permission is hereby granted, free of charge, to any person obtaining
# a copy of this software and associated documentation files (the
# "Software"), to deal in the Software without restriction, including
# without limitation the rights to use, copy, modify, merge, publish,
# distribute, sublicense, and/or sell copies of the Software, and to
# permit persons to whom the Software is furnished to do so, subject to
# the following conditions:
#
# The above copyright notice and this permission notice shall be
# included in all copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
# EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
# MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
# IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY
# CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT,
# TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE
# SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
#

__doc__ = \
"""
pyparsing module - Classes and methods to define and execute parsing grammars

The pyparsing module is an alternative approach to creating and executing simple grammars,
vs. the traditional lex/yacc approach, or the use of regular expressions.  With pyparsing, you
don't need to learn a new syntax for defining grammars or matching expressions - the parsing module
provides a library of classes that you use to construct the grammar directly in Python.

Here is a program to parse "Hello, World!" (or any greeting of the form 
C{"<salutation>, <addressee>!"}), built up using L{Word}, L{Literal}, and L{And} elements 
(L{'+'<ParserElement.__add__>} operator gives L{And} expressions, strings are auto-converted to
L{Literal} expressions)::

    from pyparsing import Word, alphas

    # define grammar of a greeting
    greet = Word(alphas) + "," + Word(alphas) + "!"

    hello = "Hello, World!"
    print (hello, "->", greet.parseString(hello))

The program outputs the following::

    Hello, World! -> ['Hello', ',', 'World', '!']

The Python representation of the grammar is quite readable, owing to the self-explanatory
class names, and the use of '+', '|' and '^' operators.

The L{ParseResults} object returned from L{ParserElement.parseString<ParserElement.parseString>} can be accessed as a nested list, a dictionary, or an
object with named attributes.

The pyparsing module handles some of the problems that are typically vexing when writing text parsers:
 - extra or missing whitespace (the above program will also handle "Hello,World!", "Hello  ,  World  !", etc.)
 - quoted strings
 - embedded comments
"""

__version__ = "2.1.10"
__versionTime__ = "07 Oct 2016 01:31 UTC"
__author__ = "Paul McGuire <ptmcg@users.sourceforge.net>"

import string
from weakref import ref as wkref
import copy
import sys
import warnings
import re
import sre_constants
import collections
import pprint
import traceback
import types
from datetime import datetime

try:
    from _thread import RLock
except ImportError:
    from threading import RLock

try:
    from collections import OrderedDict as _OrderedDict
except ImportError:
    try:
        from ordereddict import OrderedDict as _OrderedDict
    except ImportError:
        _OrderedDict = None

#~ sys.stderr.write( "testing pyparsing module, version %s, %s\n" % (__version__,__versionTime__ ) )

__all__ = [
'And', 'CaselessKeyword', 'CaselessLiteral', 'CharsNotIn', 'Combine', 'Dict', 'Each', 'Empty',
'FollowedBy', 'Forward', 'GoToColumn', 'Group', 'Keyword', 'LineEnd', 'LineStart', 'Literal',
'MatchFirst', 'NoMatch', 'NotAny', 'OneOrMore', 'OnlyOnce', 'Optional', 'Or',
'ParseBaseException', 'ParseElementEnhance', 'ParseException', 'ParseExpression', 'ParseFatalException',
'ParseResults', 'ParseSyntaxException', 'ParserElement', 'QuotedString', 'RecursiveGrammarException',
'Regex', 'SkipTo', 'StringEnd', 'StringStart', 'Suppress', 'Token', 'TokenConverter', 
'White', 'Word', 'WordEnd', 'WordStart', 'ZeroOrMore',
'alphanums', 'alphas', 'alphas8bit', 'anyCloseTag', 'anyOpenTag', 'cStyleComment', 'col',
'commaSeparatedList', 'commonHTMLEntity', 'countedArray', 'cppStyleComment', 'dblQuotedString',
'dblSlashComment', 'delimitedList', 'dictOf', 'downcaseTokens', 'empty', 'hexnums',
'htmlComment', 'javaStyleComment', 'line', 'lineEnd', 'lineStart', 'lineno',
'makeHTMLTags', 'makeXMLTags', 'matchOnlyAtCol', 'matchPreviousExpr', 'matchPreviousLiteral',
'nestedExpr', 'nullDebugAction', 'nums', 'oneOf', 'opAssoc', 'operatorPrecedence', 'printables',
'punc8bit', 'pythonStyleComment', 'quotedString', 'removeQuotes', 'replaceHTMLEntity', 
'replaceWith', 'restOfLine', 'sglQuotedString', 'srange', 'stringEnd',
'stringStart', 'traceParseAction', 'unicodeString', 'upcaseTokens', 'withAttribute',
'indentedBlock', 'originalTextFor', 'ungroup', 'infixNotation','locatedExpr', 'withClass',
'CloseMatch', 'tokenMap', 'pyparsing_common',
]

system_version = tuple(sys.version_info)[:3]
PY_3 = system_version[0] == 3
if PY_3:
    _MAX_INT = sys.maxsize
    basestring = str
    unichr = chr
    _ustr = str

    # build list of single arg builtins, that can be used as parse actions
    singleArgBuiltins = [sum, len, sorted, reversed, list, tuple, set, any, all, min, max]

else:
    _MAX_INT = sys.maxint
    range = xrange

    def _ustr(obj):
        """Drop-in replacement for str(obj) that tries to be Unicode friendly. It first tries
           str(obj). If that fails with a UnicodeEncodeError, then it tries unicode(obj). It
           then < returns the unicode object | encodes it with the default encoding | ... >.
        """
        if isinstance(obj,unicode):
            return obj

        try:
            # If this works, then _ustr(obj) has the same behaviour as str(obj), so
            # it won't break any existing code.
            return str(obj)

        except UnicodeEncodeError:
            # Else encode it
            ret = unicode(obj).encode(sys.getdefaultencoding(), 'xmlcharrefreplace')
            xmlcharref = Regex('&#\d+;')
            xmlcharref.setParseAction(lambda t: '\\u' + hex(int(t[0][2:-1]))[2:])
            return xmlcharref.transformString(ret)

    # build list of single arg builtins, tolerant of Python version, that can be used as parse actions
    singleArgBuiltins = []
    import __builtin__
    for fname in "sum len sorted reversed list tuple set any all min max".split():
        try:
            singleArgBuiltins.append(getattr(__builtin__,fname))
        except AttributeError:
            continue
            
_generatorType = type((y for y in range(1)))
 
def _xml_escape(data):
    """Escape &, <, >, ", ', etc. in a string of data."""

    # ampersand must be replaced first
    from_symbols = '&><"\''
    to_symbols = ('&'+s+';' for s in "amp gt lt quot apos".split())
    for from_,to_ in zip(from_symbols, to_symbols):
        data = data.replace(from_, to_)
    return data
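# Quick illustration (added; not in the original source) of why '&' is escaped first above:
# the later replacements introduce '&' characters of their own, which must not be re-escaped.
#
#   _xml_escape('a < b & "c"')   # -> 'a &lt; b &amp; &quot;c&quot;'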

class _Constants(object):
    pass

alphas     = string.ascii_uppercase + string.ascii_lowercase
nums       = "0123456789"
hexnums    = nums + "ABCDEFabcdef"
alphanums  = alphas + nums
_bslash    = chr(92)
printables = "".join(c for c in string.printable if c not in string.whitespace)
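# These character-set strings are the usual building blocks handed to Word() and friends;
# a couple of hedged examples (not part of the original source):
#
#   Word(alphas)           # runs of ASCII letters, e.g. "Hello"
#   Word(nums)             # runs of digits, e.g. "12345"
#   Word(alphanums + "_")  # identifier-like tokens such as "foo_42"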

class ParseBaseException(Exception):
    """base exception class for all parsing runtime exceptions"""
    # Performance tuning: we construct a *lot* of these, so keep this
    # constructor as small and fast as possible
    def __init__( self, pstr, loc=0, msg=None, elem=None ):
        self.loc = loc
        if msg is None:
            self.msg = pstr
            self.pstr = ""
        else:
            self.msg = msg
            self.pstr = pstr
        self.parserElement = elem
        self.args = (pstr, loc, msg)

    @classmethod
    def _from_exception(cls, pe):
        """
        internal factory method to simplify creating one type of ParseException 
        from another - avoids having __init__ signature conflicts among subclasses
        """
        return cls(pe.pstr, pe.loc, pe.msg, pe.parserElement)

    def __getattr__( self, aname ):
        """supported attributes by name are:
            - lineno - returns the line number of the exception text
            - col - returns the column number of the exception text
            - line - returns the line containing the exception text
        """
        if( aname == "lineno" ):
            return lineno( self.loc, self.pstr )
        elif( aname in ("col", "column") ):
            return col( self.loc, self.pstr )
        elif( aname == "line" ):
            return line( self.loc, self.pstr )
        else:
            raise AttributeError(aname)

    def __str__( self ):
        return "%s (at char %d), (line:%d, col:%d)" % \
                ( self.msg, self.loc, self.lineno, self.column )
    def __repr__( self ):
        return _ustr(self)
    def markInputline( self, markerString = ">!<" ):
        """Extracts the exception line from the input string, and marks
           the location of the exception with a special symbol.
        """
        line_str = self.line
        line_column = self.column - 1
        if markerString:
            line_str = "".join((line_str[:line_column],
                                markerString, line_str[line_column:]))
        return line_str.strip()
    def __dir__(self):
        return "lineno col line".split() + dir(type(self))

class ParseException(ParseBaseException):
    """
    Exception thrown when parse expressions don't match class;
    supported attributes by name are:
     - lineno - returns the line number of the exception text
     - col - returns the column number of the exception text
     - line - returns the line containing the exception text
        
    Example::
        try:
            Word(nums).setName("integer").parseString("ABC")
        except ParseException as pe:
            print(pe)
            print("column: {}".format(pe.col))
            
    prints::
       Expected integer (at char 0), (line:1, col:1)
        column: 1
    """
    pass

class ParseFatalException(ParseBaseException):
    """user-throwable exception thrown when inconsistent parse content
       is found; stops all parsing immediately"""
    pass

class ParseSyntaxException(ParseFatalException):
    """just like L{ParseFatalException}, but thrown internally when an
       L{ErrorStop<And._ErrorStop>} ('-' operator) indicates that parsing is to stop 
       immediately because an unbacktrackable syntax error has been found"""
    pass

#~ class ReparseException(ParseBaseException):
    #~ """Experimental class - parse actions can raise this exception to cause
       #~ pyparsing to reparse the input string:
        #~ - with a modified input string, and/or
        #~ - with a modified start location
       #~ Set the values of the ReparseException in the constructor, and raise the
       #~ exception in a parse action to cause pyparsing to use the new string/location.
       #~ Setting the values as None causes no change to be made.
       #~ """
    #~ def __init_( self, newstring, restartLoc ):
        #~ self.newParseText = newstring
        #~ self.reparseLoc = restartLoc

class RecursiveGrammarException(Exception):
    """exception thrown by L{ParserElement.validate} if the grammar could be improperly recursive"""
    def __init__( self, parseElementList ):
        self.parseElementTrace = parseElementList

    def __str__( self ):
        return "RecursiveGrammarException: %s" % self.parseElementTrace

class _ParseResultsWithOffset(object):
    def __init__(self,p1,p2):
        self.tup = (p1,p2)
    def __getitem__(self,i):
        return self.tup[i]
    def __repr__(self):
        return repr(self.tup[0])
    def setOffset(self,i):
        self.tup = (self.tup[0],i)

class ParseResults(object):
    """
    Structured parse results, to provide multiple means of access to the parsed data:
       - as a list (C{len(results)})
       - by list index (C{results[0], results[1]}, etc.)
       - by attribute (C{results.<resultsName>} - see L{ParserElement.setResultsName})

    Example::
        integer = Word(nums)
        date_str = (integer.setResultsName("year") + '/' 
                        + integer.setResultsName("month") + '/' 
                        + integer.setResultsName("day"))
        # equivalent form:
        # date_str = integer("year") + '/' + integer("month") + '/' + integer("day")

        # parseString returns a ParseResults object
        result = date_str.parseString("1999/12/31")

        def test(s, fn=repr):
            print("%s -> %s" % (s, fn(eval(s))))
        test("list(result)")
        test("result[0]")
        test("result['month']")
        test("result.day")
        test("'month' in result")
        test("'minutes' in result")
        test("result.dump()", str)
    prints::
        list(result) -> ['1999', '/', '12', '/', '31']
        result[0] -> '1999'
        result['month'] -> '12'
        result.day -> '31'
        'month' in result -> True
        'minutes' in result -> False
        result.dump() -> ['1999', '/', '12', '/', '31']
        - day: 31
        - month: 12
        - year: 1999
    """
    def __new__(cls, toklist=None, name=None, asList=True, modal=True ):
        if isinstance(toklist, cls):
            return toklist
        retobj = object.__new__(cls)
        retobj.__doinit = True
        return retobj

    # Performance tuning: we construct a *lot* of these, so keep this
    # constructor as small and fast as possible
    def __init__( self, toklist=None, name=None, asList=True, modal=True, isinstance=isinstance ):
        if self.__doinit:
            self.__doinit = False
            self.__name = None
            self.__parent = None
            self.__accumNames = {}
            self.__asList = asList
            self.__modal = modal
            if toklist is None:
                toklist = []
            if isinstance(toklist, list):
                self.__toklist = toklist[:]
            elif isinstance(toklist, _generatorType):
                self.__toklist = list(toklist)
            else:
                self.__toklist = [toklist]
            self.__tokdict = dict()

        if name is not None and name:
            if not modal:
                self.__accumNames[name] = 0
            if isinstance(name,int):
                name = _ustr(name) # will always return a str, but use _ustr for consistency
            self.__name = name
            if not (isinstance(toklist, (type(None), basestring, list)) and toklist in (None,'',[])):
                if isinstance(toklist,basestring):
                    toklist = [ toklist ]
                if asList:
                    if isinstance(toklist,ParseResults):
                        self[name] = _ParseResultsWithOffset(toklist.copy(),0)
                    else:
                        self[name] = _ParseResultsWithOffset(ParseResults(toklist[0]),0)
                    self[name].__name = name
                else:
                    try:
                        self[name] = toklist[0]
                    except (KeyError,TypeError,IndexError):
                        self[name] = toklist

    def __getitem__( self, i ):
        if isinstance( i, (int,slice) ):
            return self.__toklist[i]
        else:
            if i not in self.__accumNames:
                return self.__tokdict[i][-1][0]
            else:
                return ParseResults([ v[0] for v in self.__tokdict[i] ])

    def __setitem__( self, k, v, isinstance=isinstance ):
        if isinstance(v,_ParseResultsWithOffset):
            self.__tokdict[k] = self.__tokdict.get(k,list()) + [v]
            sub = v[0]
        elif isinstance(k,(int,slice)):
            self.__toklist[k] = v
            sub = v
        else:
            self.__tokdict[k] = self.__tokdict.get(k,list()) + [_ParseResultsWithOffset(v,0)]
            sub = v
        if isinstance(sub,ParseResults):
            sub.__parent = wkref(self)

    def __delitem__( self, i ):
        if isinstance(i,(int,slice)):
            mylen = len( self.__toklist )
            del self.__toklist[i]

            # convert int to slice
            if isinstance(i, int):
                if i < 0:
                    i += mylen
                i = slice(i, i+1)
            # get removed indices
            removed = list(range(*i.indices(mylen)))
            removed.reverse()
            # fixup indices in token dictionary
            for name,occurrences in self.__tokdict.items():
                for j in removed:
                    for k, (value, position) in enumerate(occurrences):
                        occurrences[k] = _ParseResultsWithOffset(value, position - (position > j))
        else:
            del self.__tokdict[i]

    def __contains__( self, k ):
        return k in self.__tokdict

    def __len__( self ): return len( self.__toklist )
    def __bool__(self): return ( not not self.__toklist )
    __nonzero__ = __bool__
    def __iter__( self ): return iter( self.__toklist )
    def __reversed__( self ): return iter( self.__toklist[::-1] )
    def _iterkeys( self ):
        if hasattr(self.__tokdict, "iterkeys"):
            return self.__tokdict.iterkeys()
        else:
            return iter(self.__tokdict)

    def _itervalues( self ):
        return (self[k] for k in self._iterkeys())
            
    def _iteritems( self ):
        return ((k, self[k]) for k in self._iterkeys())

    if PY_3:
        keys = _iterkeys       
        """Returns an iterator of all named result keys (Python 3.x only)."""

        values = _itervalues
        """Returns an iterator of all named result values (Python 3.x only)."""

        items = _iteritems
        """Returns an iterator of all named result key-value tuples (Python 3.x only)."""

    else:
        iterkeys = _iterkeys
        """Returns an iterator of all named result keys (Python 2.x only)."""

        itervalues = _itervalues
        """Returns an iterator of all named result values (Python 2.x only)."""

        iteritems = _iteritems
        """Returns an iterator of all named result key-value tuples (Python 2.x only)."""

        def keys( self ):
            """Returns all named result keys (as a list in Python 2.x, as an iterator in Python 3.x)."""
            return list(self.iterkeys())

        def values( self ):
            """Returns all named result values (as a list in Python 2.x, as an iterator in Python 3.x)."""
            return list(self.itervalues())
                
        def items( self ):
            """Returns all named result key-values (as a list of tuples in Python 2.x, as an iterator in Python 3.x)."""
            return list(self.iteritems())

    def haskeys( self ):
        """Since keys() returns an iterator, this method is helpful in bypassing
           code that looks for the existence of any defined results names."""
        return bool(self.__tokdict)
        
    def pop( self, *args, **kwargs):
        """
        Removes and returns item at specified index (default=C{last}).
        Supports both C{list} and C{dict} semantics for C{pop()}. If passed no
        argument or an integer argument, it will use C{list} semantics
        and pop tokens from the list of parsed tokens. If passed a 
        non-integer argument (most likely a string), it will use C{dict}
        semantics and pop the corresponding value from any defined 
        results names. A second default return value argument is 
        supported, just as in C{dict.pop()}.

        Example::
            def remove_first(tokens):
                tokens.pop(0)
            print(OneOrMore(Word(nums)).parseString("0 123 321")) # -> ['0', '123', '321']
            print(OneOrMore(Word(nums)).addParseAction(remove_first).parseString("0 123 321")) # -> ['123', '321']

            label = Word(alphas)
            patt = label("LABEL") + OneOrMore(Word(nums))
            print(patt.parseString("AAB 123 321").dump())

            # Use pop() in a parse action to remove named result (note that corresponding value is not
            # removed from list form of results)
            def remove_LABEL(tokens):
                tokens.pop("LABEL")
                return tokens
            patt.addParseAction(remove_LABEL)
            print(patt.parseString("AAB 123 321").dump())
        prints::
            ['AAB', '123', '321']
            - LABEL: AAB

            ['AAB', '123', '321']
        """
        if not args:
            args = [-1]
        for k,v in kwargs.items():
            if k == 'default':
                args = (args[0], v)
            else:
                raise TypeError("pop() got an unexpected keyword argument '%s'" % k)
        if (isinstance(args[0], int) or 
                        len(args) == 1 or 
                        args[0] in self):
            index = args[0]
            ret = self[index]
            del self[index]
            return ret
        else:
            defaultvalue = args[1]
            return defaultvalue

    def get(self, key, defaultValue=None):
        """
        Returns named result matching the given key, or if there is no
        such name, then returns the given C{defaultValue} or C{None} if no
        C{defaultValue} is specified.

        Similar to C{dict.get()}.
        
        Example::
            integer = Word(nums)
            date_str = integer("year") + '/' + integer("month") + '/' + integer("day")           

            result = date_str.parseString("1999/12/31")
            print(result.get("year")) # -> '1999'
            print(result.get("hour", "not specified")) # -> 'not specified'
            print(result.get("hour")) # -> None
        """
        if key in self:
            return self[key]
        else:
            return defaultValue

    def insert( self, index, insStr ):
        """
        Inserts new element at location index in the list of parsed tokens.
        
        Similar to C{list.insert()}.

        Example::
            print(OneOrMore(Word(nums)).parseString("0 123 321")) # -> ['0', '123', '321']

            # use a parse action to insert the parse location in the front of the parsed results
            def insert_locn(locn, tokens):
                tokens.insert(0, locn)
            print(OneOrMore(Word(nums)).addParseAction(insert_locn).parseString("0 123 321")) # -> [0, '0', '123', '321']
        """
        self.__toklist.insert(index, insStr)
        # fixup indices in token dictionary
        for name,occurrences in self.__tokdict.items():
            for k, (value, position) in enumerate(occurrences):
                occurrences[k] = _ParseResultsWithOffset(value, position + (position > index))

    def append( self, item ):
        """
        Add single element to end of ParseResults list of elements.

        Example::
            print(OneOrMore(Word(nums)).parseString("0 123 321")) # -> ['0', '123', '321']
            
            # use a parse action to compute the sum of the parsed integers, and add it to the end
            def append_sum(tokens):
                tokens.append(sum(map(int, tokens)))
            print(OneOrMore(Word(nums)).addParseAction(append_sum).parseString("0 123 321")) # -> ['0', '123', '321', 444]
        """
        self.__toklist.append(item)

    def extend( self, itemseq ):
        """
        Add sequence of elements to end of ParseResults list of elements.

        Example::
            patt = OneOrMore(Word(alphas))
            
            # use a parse action to append the reverse of the matched strings, to make a palindrome
            def make_palindrome(tokens):
                tokens.extend(reversed([t[::-1] for t in tokens]))
                return ''.join(tokens)
            print(patt.addParseAction(make_palindrome).parseString("lskdj sdlkjf lksd")) # -> 'lskdjsdlkjflksddsklfjkldsjdksl'
        """
        if isinstance(itemseq, ParseResults):
            self += itemseq
        else:
            self.__toklist.extend(itemseq)

    def clear( self ):
        """
        Clear all elements and results names.
        """
        del self.__toklist[:]
        self.__tokdict.clear()

    def __getattr__( self, name ):
        try:
            return self[name]
        except KeyError:
            return ""
            
        if name in self.__tokdict:
            if name not in self.__accumNames:
                return self.__tokdict[name][-1][0]
            else:
                return ParseResults([ v[0] for v in self.__tokdict[name] ])
        else:
            return ""

    def __add__( self, other ):
        ret = self.copy()
        ret += other
        return ret

    def __iadd__( self, other ):
        if other.__tokdict:
            offset = len(self.__toklist)
            addoffset = lambda a: offset if a<0 else a+offset
            otheritems = other.__tokdict.items()
            otherdictitems = [(k, _ParseResultsWithOffset(v[0],addoffset(v[1])) )
                                for (k,vlist) in otheritems for v in vlist]
            for k,v in otherdictitems:
                self[k] = v
                if isinstance(v[0],ParseResults):
                    v[0].__parent = wkref(self)
            
        self.__toklist += other.__toklist
        self.__accumNames.update( other.__accumNames )
        return self

    def __radd__(self, other):
        if isinstance(other,int) and other == 0:
            # useful for merging many ParseResults using sum() builtin
            return self.copy()
        else:
            # this may raise a TypeError - so be it
            return other + self
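    # Added example (not from the original source): because 0 + ParseResults returns a copy,
    # a sequence of ParseResults can be merged with the builtin sum(), e.g.
    #   total = sum([res_a, res_b, res_c])   # res_* are hypothetical ParseResults instances
    # 'total' then carries the concatenated tokens and the union of their results names.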
        
    def __repr__( self ):
        return "(%s, %s)" % ( repr( self.__toklist ), repr( self.__tokdict ) )

    def __str__( self ):
        return '[' + ', '.join(_ustr(i) if isinstance(i, ParseResults) else repr(i) for i in self.__toklist) + ']'

    def _asStringList( self, sep='' ):
        out = []
        for item in self.__toklist:
            if out and sep:
                out.append(sep)
            if isinstance( item, ParseResults ):
                out += item._asStringList()
            else:
                out.append( _ustr(item) )
        return out

    def asList( self ):
        """
        Returns the parse results as a nested list of matching tokens, all converted to strings.

        Example::
            patt = OneOrMore(Word(alphas))
            result = patt.parseString("sldkj lsdkj sldkj")
            # even though the result prints in string-like form, it is actually a pyparsing ParseResults
            print(type(result), result) # -> <class 'pyparsing.ParseResults'> ['sldkj', 'lsdkj', 'sldkj']
            
            # Use asList() to create an actual list
            result_list = result.asList()
            print(type(result_list), result_list) # -> <class 'list'> ['sldkj', 'lsdkj', 'sldkj']
        """
        return [res.asList() if isinstance(res,ParseResults) else res for res in self.__toklist]

    def asDict( self ):
        """
        Returns the named parse results as a nested dictionary.

        Example::
            integer = Word(nums)
            date_str = integer("year") + '/' + integer("month") + '/' + integer("day")
            
            result = date_str.parseString('12/31/1999')
            print(type(result), repr(result)) # -> <class 'pyparsing.ParseResults'> (['12', '/', '31', '/', '1999'], {'day': [('1999', 4)], 'year': [('12', 0)], 'month': [('31', 2)]})
            
            result_dict = result.asDict()
            print(type(result_dict), repr(result_dict)) # -> <class 'dict'> {'day': '1999', 'year': '12', 'month': '31'}

            # even though a ParseResults supports dict-like access, sometimes you just need to have a dict
            import json
            print(json.dumps(result)) # -> Exception: TypeError: ... is not JSON serializable
            print(json.dumps(result.asDict())) # -> {"month": "31", "day": "1999", "year": "12"}
        """
        if PY_3:
            item_fn = self.items
        else:
            item_fn = self.iteritems
            
        def toItem(obj):
            if isinstance(obj, ParseResults):
                if obj.haskeys():
                    return obj.asDict()
                else:
                    return [toItem(v) for v in obj]
            else:
                return obj
                
        return dict((k,toItem(v)) for k,v in item_fn())

    def copy( self ):
        """
        Returns a new copy of a C{ParseResults} object.
        """
        ret = ParseResults( self.__toklist )
        ret.__tokdict = self.__tokdict.copy()
        ret.__parent = self.__parent
        ret.__accumNames.update( self.__accumNames )
        ret.__name = self.__name
        return ret

    def asXML( self, doctag=None, namedItemsOnly=False, indent="", formatted=True ):
        """
        (Deprecated) Returns the parse results as XML. Tags are created for tokens and lists that have defined results names.
        """
        nl = "\n"
        out = []
        namedItems = dict((v[1],k) for (k,vlist) in self.__tokdict.items()
                                                            for v in vlist)
        nextLevelIndent = indent + "  "

        # collapse out indents if formatting is not desired
        if not formatted:
            indent = ""
            nextLevelIndent = ""
            nl = ""

        selfTag = None
        if doctag is not None:
            selfTag = doctag
        else:
            if self.__name:
                selfTag = self.__name

        if not selfTag:
            if namedItemsOnly:
                return ""
            else:
                selfTag = "ITEM"

        out += [ nl, indent, "<", selfTag, ">" ]

        for i,res in enumerate(self.__toklist):
            if isinstance(res,ParseResults):
                if i in namedItems:
                    out += [ res.asXML(namedItems[i],
                                        namedItemsOnly and doctag is None,
                                        nextLevelIndent,
                                        formatted)]
                else:
                    out += [ res.asXML(None,
                                        namedItemsOnly and doctag is None,
                                        nextLevelIndent,
                                        formatted)]
            else:
                # individual token, see if there is a name for it
                resTag = None
                if i in namedItems:
                    resTag = namedItems[i]
                if not resTag:
                    if namedItemsOnly:
                        continue
                    else:
                        resTag = "ITEM"
                xmlBodyText = _xml_escape(_ustr(res))
                out += [ nl, nextLevelIndent, "<", resTag, ">",
                                                xmlBodyText,
                                                "</", resTag, ">" ]

        out += [ nl, indent, "</", selfTag, ">" ]
        return "".join(out)

    def __lookup(self,sub):
        for k,vlist in self.__tokdict.items():
            for v,loc in vlist:
                if sub is v:
                    return k
        return None

    def getName(self):
        """
        Returns the results name for this token expression. Useful when several 
        different expressions might match at a particular location.

        Example::
            integer = Word(nums)
            ssn_expr = Regex(r"\d\d\d-\d\d-\d\d\d\d")
            house_number_expr = Suppress('#') + Word(nums, alphanums)
            user_data = (Group(house_number_expr)("house_number") 
                        | Group(ssn_expr)("ssn")
                        | Group(integer)("age"))
            user_info = OneOrMore(user_data)
            
            result = user_info.parseString("22 111-22-3333 #221B")
            for item in result:
                print(item.getName(), ':', item[0])
        prints::
            age : 22
            ssn : 111-22-3333
            house_number : 221B
        """
        if self.__name:
            return self.__name
        elif self.__parent:
            par = self.__parent()
            if par:
                return par.__lookup(self)
            else:
                return None
        elif (len(self) == 1 and
               len(self.__tokdict) == 1 and
               next(iter(self.__tokdict.values()))[0][1] in (0,-1)):
            return next(iter(self.__tokdict.keys()))
        else:
            return None

    def dump(self, indent='', depth=0, full=True):
        """
        Diagnostic method for listing out the contents of a C{ParseResults}.
        Accepts an optional C{indent} argument so that this string can be embedded
        in a nested display of other data.

        Example::
            integer = Word(nums)
            date_str = integer("year") + '/' + integer("month") + '/' + integer("day")
            
            result = date_str.parseString('12/31/1999')
            print(result.dump())
        prints::
            ['12', '/', '31', '/', '1999']
            - day: 1999
            - month: 31
            - year: 12
        """
        out = []
        NL = '\n'
        out.append( indent+_ustr(self.asList()) )
        if full:
            if self.haskeys():
                items = sorted((str(k), v) for k,v in self.items())
                for k,v in items:
                    if out:
                        out.append(NL)
                    out.append( "%s%s- %s: " % (indent,('  '*depth), k) )
                    if isinstance(v,ParseResults):
                        if v:
                            out.append( v.dump(indent,depth+1) )
                        else:
                            out.append(_ustr(v))
                    else:
                        out.append(repr(v))
            elif any(isinstance(vv,ParseResults) for vv in self):
                v = self
                for i,vv in enumerate(v):
                    if isinstance(vv,ParseResults):
                        out.append("\n%s%s[%d]:\n%s%s%s" % (indent,('  '*(depth)),i,indent,('  '*(depth+1)),vv.dump(indent,depth+1) ))
                    else:
                        out.append("\n%s%s[%d]:\n%s%s%s" % (indent,('  '*(depth)),i,indent,('  '*(depth+1)),_ustr(vv)))
            
        return "".join(out)

    def pprint(self, *args, **kwargs):
        """
        Pretty-printer for parsed results as a list, using the C{pprint} module.
        Accepts additional positional or keyword args as defined for the 
        C{pprint.pprint} method. (U{http://docs.python.org/3/library/pprint.html#pprint.pprint})

        Example::
            ident = Word(alphas, alphanums)
            num = Word(nums)
            func = Forward()
            term = ident | num | Group('(' + func + ')')
            func <<= ident + Group(Optional(delimitedList(term)))
            result = func.parseString("fna a,b,(fnb c,d,200),100")
            result.pprint(width=40)
        prints::
            ['fna',
             ['a',
              'b',
              ['(', 'fnb', ['c', 'd', '200'], ')'],
              '100']]
        """
        pprint.pprint(self.asList(), *args, **kwargs)

    # add support for pickle protocol
    def __getstate__(self):
        return ( self.__toklist,
                 ( self.__tokdict.copy(),
                   self.__parent is not None and self.__parent() or None,
                   self.__accumNames,
                   self.__name ) )

    def __setstate__(self,state):
        self.__toklist = state[0]
        (self.__tokdict,
         par,
         inAccumNames,
         self.__name) = state[1]
        self.__accumNames = {}
        self.__accumNames.update(inAccumNames)
        if par is not None:
            self.__parent = wkref(par)
        else:
            self.__parent = None

    def __getnewargs__(self):
        return self.__toklist, self.__name, self.__asList, self.__modal

    def __dir__(self):
        return (dir(type(self)) + list(self.keys()))

collections.MutableMapping.register(ParseResults)

def col (loc,strg):
    """Returns current column within a string, counting newlines as line separators.
   The first column is number 1.

   Note: the default parsing behavior is to expand tabs in the input string
   before starting the parsing process.  See L{I{ParserElement.parseString}<ParserElement.parseString>} for more information
   on parsing strings containing C{<TAB>}s, and suggested methods to maintain a
   consistent view of the parsed string, the parse location, and line and column
   positions within the parsed string.
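
   Example (illustrative)::
       data = "abc" + chr(10) + "def"   # two lines: 'abc' and 'def'
       col(5, data)   # -> 2   (position 5 is the 'e' in 'def')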
   """
    s = strg
    return 1 if 0<loc<len(s) and s[loc-1] == '\n' else loc - s.rfind("\n", 0, loc)

def lineno(loc,strg):
    """Returns current line number within a string, counting newlines as line separators.
   The first line is number 1.

   Note: the default parsing behavior is to expand tabs in the input string
   before starting the parsing process.  See L{I{ParserElement.parseString}<ParserElement.parseString>} for more information
   on parsing strings containing C{<TAB>}s, and suggested methods to maintain a
   consistent view of the parsed string, the parse location, and line and column
   positions within the parsed string.
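
   Example (illustrative)::
       data = "abc" + chr(10) + "def"   # two lines: 'abc' and 'def'
       lineno(5, data)   # -> 2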
   """
    return strg.count("\n",0,loc) + 1

def line( loc, strg ):
    """Returns the line of text containing loc within a string, counting newlines as line separators.
       """
    lastCR = strg.rfind("\n", 0, loc)
    nextCR = strg.find("\n", loc)
    if nextCR >= 0:
        return strg[lastCR+1:nextCR]
    else:
        return strg[lastCR+1:]

def _defaultStartDebugAction( instring, loc, expr ):
    print (("Match " + _ustr(expr) + " at loc " + _ustr(loc) + "(%d,%d)" % ( lineno(loc,instring), col(loc,instring) )))

def _defaultSuccessDebugAction( instring, startloc, endloc, expr, toks ):
    print ("Matched " + _ustr(expr) + " -> " + str(toks.asList()))

def _defaultExceptionDebugAction( instring, loc, expr, exc ):
    print ("Exception raised:" + _ustr(exc))

def nullDebugAction(*args):
    """'Do-nothing' debug action, to suppress debugging output during parsing."""
    pass

# Only works on Python 3.x - nonlocal is toxic to Python 2 installs
#~ 'decorator to trim function calls to match the arity of the target'
#~ def _trim_arity(func, maxargs=3):
    #~ if func in singleArgBuiltins:
        #~ return lambda s,l,t: func(t)
    #~ limit = 0
    #~ foundArity = False
    #~ def wrapper(*args):
        #~ nonlocal limit,foundArity
        #~ while 1:
            #~ try:
                #~ ret = func(*args[limit:])
                #~ foundArity = True
                #~ return ret
            #~ except TypeError:
                #~ if limit == maxargs or foundArity:
                    #~ raise
                #~ limit += 1
                #~ continue
    #~ return wrapper

# this version is Python 2.x-3.x cross-compatible
'decorator to trim function calls to match the arity of the target'
def _trim_arity(func, maxargs=2):
    if func in singleArgBuiltins:
        return lambda s,l,t: func(t)
    limit = [0]
    foundArity = [False]
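    # limit/foundArity are one-element lists (not plain values) so that the nested wrapper()
    # below can rebind their contents without 'nonlocal', keeping Python 2 compatibility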
    
    # traceback return data structure changed in Py3.5 - normalize back to plain tuples
    if system_version[:2] >= (3,5):
        def extract_stack(limit=0):
            # special handling for Python 3.5.0 - extra deep call stack by 1
            offset = -3 if system_version == (3,5,0) else -2
            frame_summary = traceback.extract_stack(limit=-offset+limit-1)[offset]
            return [(frame_summary.filename, frame_summary.lineno)]
        def extract_tb(tb, limit=0):
            frames = traceback.extract_tb(tb, limit=limit)
            frame_summary = frames[-1]
            return [(frame_summary.filename, frame_summary.lineno)]
    else:
        extract_stack = traceback.extract_stack
        extract_tb = traceback.extract_tb
    
    # synthesize what would be returned by traceback.extract_stack at the call to 
    # user's parse action 'func', so that we don't incur call penalty at parse time
    
    LINE_DIFF = 6
    # IF ANY CODE CHANGES, EVEN JUST COMMENTS OR BLANK LINES, BETWEEN THE NEXT LINE AND 
    # THE CALL TO FUNC INSIDE WRAPPER, LINE_DIFF MUST BE MODIFIED!!!!
    this_line = extract_stack(limit=2)[-1]
    pa_call_line_synth = (this_line[0], this_line[1]+LINE_DIFF)

    def wrapper(*args):
        while 1:
            try:
                ret = func(*args[limit[0]:])
                foundArity[0] = True
                return ret
            except TypeError:
                # re-raise TypeErrors if they did not come from our arity testing
                if foundArity[0]:
                    raise
                else:
                    try:
                        tb = sys.exc_info()[-1]
                        if not extract_tb(tb, limit=2)[-1][:2] == pa_call_line_synth:
                            raise
                    finally:
                        del tb

                if limit[0] <= maxargs:
                    limit[0] += 1
                    continue
                raise

    # copy func name to wrapper for sensible debug output
    func_name = "<parse action>"
    try:
        func_name = getattr(func, '__name__', 
                            getattr(func, '__class__').__name__)
    except Exception:
        func_name = str(func)
    wrapper.__name__ = func_name

    return wrapper

class ParserElement(object):
    """Abstract base level parser element class."""
    DEFAULT_WHITE_CHARS = " \n\t\r"
    verbose_stacktrace = False

    @staticmethod
    def setDefaultWhitespaceChars( chars ):
        r"""
        Overrides the default whitespace chars

        Example::
            # default whitespace chars are space, <TAB> and newline
            OneOrMore(Word(alphas)).parseString("abc def\nghi jkl")  # -> ['abc', 'def', 'ghi', 'jkl']
            
            # change to just treat newline as significant
            ParserElement.setDefaultWhitespaceChars(" \t")
            OneOrMore(Word(alphas)).parseString("abc def\nghi jkl")  # -> ['abc', 'def']
        """
        ParserElement.DEFAULT_WHITE_CHARS = chars

    @staticmethod
    def inlineLiteralsUsing(cls):
        """
        Set class to be used for inclusion of string literals into a parser.
        
        Example::
            # default literal class used is Literal
            integer = Word(nums)
            date_str = integer("year") + '/' + integer("month") + '/' + integer("day")           

            date_str.parseString("1999/12/31")  # -> ['1999', '/', '12', '/', '31']


            # change to Suppress
            ParserElement.inlineLiteralsUsing(Suppress)
            date_str = integer("year") + '/' + integer("month") + '/' + integer("day")           

            date_str.parseString("1999/12/31")  # -> ['1999', '12', '31']
        """
        ParserElement._literalStringClass = cls

    def __init__( self, savelist=False ):
        self.parseAction = list()
        self.failAction = None
        #~ self.name = "<unknown>"  # don't define self.name, let subclasses try/except upcall
        self.strRepr = None
        self.resultsName = None
        self.saveAsList = savelist
        self.skipWhitespace = True
        self.whiteChars = ParserElement.DEFAULT_WHITE_CHARS
        self.copyDefaultWhiteChars = True
        self.mayReturnEmpty = False # used when checking for left-recursion
        self.keepTabs = False
        self.ignoreExprs = list()
        self.debug = False
        self.streamlined = False
        self.mayIndexError = True # used to optimize exception handling for subclasses that don't advance parse index
        self.errmsg = ""
        self.modalResults = True # used to mark results names as modal (report only last) or cumulative (list all)
        self.debugActions = ( None, None, None ) #custom debug actions
        self.re = None
        self.callPreparse = True # used to avoid redundant calls to preParse
        self.callDuringTry = False

    def copy( self ):
        """
        Make a copy of this C{ParserElement}.  Useful for defining different parse actions
        for the same parsing pattern, using copies of the original parse element.
        
        Example::
            integer = Word(nums).setParseAction(lambda toks: int(toks[0]))
            integerK = integer.copy().addParseAction(lambda toks: toks[0]*1024) + Suppress("K")
            integerM = integer.copy().addParseAction(lambda toks: toks[0]*1024*1024) + Suppress("M")
            
            print(OneOrMore(integerK | integerM | integer).parseString("5K 100 640K 256M"))
        prints::
            [5120, 100, 655360, 268435456]
        Equivalent form of C{expr.copy()} is just C{expr()}::
            integerM = integer().addParseAction(lambda toks: toks[0]*1024*1024) + Suppress("M")
        """
        cpy = copy.copy( self )
        cpy.parseAction = self.parseAction[:]
        cpy.ignoreExprs = self.ignoreExprs[:]
        if self.copyDefaultWhiteChars:
            cpy.whiteChars = ParserElement.DEFAULT_WHITE_CHARS
        return cpy

    def setName( self, name ):
        """
        Define a name for this expression, to make debugging and exception messages clearer.
        
        Example::
            Word(nums).parseString("ABC")  # -> Exception: Expected W:(0123...) (at char 0), (line:1, col:1)
            Word(nums).setName("integer").parseString("ABC")  # -> Exception: Expected integer (at char 0), (line:1, col:1)
        """
        self.name = name
        self.errmsg = "Expected " + self.name
        if hasattr(self,"exception"):
            self.exception.msg = self.errmsg
        return self

    def setResultsName( self, name, listAllMatches=False ):
        """
        Define name for referencing matching tokens as a nested attribute
        of the returned parse results.
        NOTE: this returns a *copy* of the original C{ParserElement} object;
        this is so that the client can define a basic element, such as an
        integer, and reference it in multiple places with different names.

        You can also set results names using the abbreviated syntax,
        C{expr("name")} in place of C{expr.setResultsName("name")} - 
        see L{I{__call__}<__call__>}.

        Example::
            date_str = (integer.setResultsName("year") + '/' 
                        + integer.setResultsName("month") + '/' 
                        + integer.setResultsName("day"))

            # equivalent form:
            date_str = integer("year") + '/' + integer("month") + '/' + integer("day")
        """
        newself = self.copy()
        if name.endswith("*"):
            name = name[:-1]
            listAllMatches=True
        newself.resultsName = name
        newself.modalResults = not listAllMatches
        return newself

    def setBreak(self,breakFlag = True):
        """Method to invoke the Python pdb debugger when this element is
           about to be parsed. Set C{breakFlag} to True to enable, False to
           disable.
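
           Example (illustrative)::
               # drop into pdb just before 'integer' is attempted during parsing
               integer = Word(nums).setBreak()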
        """
        if breakFlag:
            _parseMethod = self._parse
            def breaker(instring, loc, doActions=True, callPreParse=True):
                import pdb
                pdb.set_trace()
                return _parseMethod( instring, loc, doActions, callPreParse )
            breaker._originalParseMethod = _parseMethod
            self._parse = breaker
        else:
            if hasattr(self._parse,"_originalParseMethod"):
                self._parse = self._parse._originalParseMethod
        return self

    def setParseAction( self, *fns, **kwargs ):
        """
        Define action to perform when successfully matching parse element definition.
        Parse action fn is a callable method with 0-3 arguments, called as C{fn(s,loc,toks)},
        C{fn(loc,toks)}, C{fn(toks)}, or just C{fn()}, where:
         - s   = the original string being parsed (see note below)
         - loc = the location of the matching substring
         - toks = a list of the matched tokens, packaged as a C{L{ParseResults}} object
        If the functions in fns modify the tokens, they can return them as the return
        value from fn, and the modified list of tokens will replace the original.
        Otherwise, fn does not need to return any value.

        Optional keyword arguments:
         - callDuringTry = (default=C{False}) indicate if parse action should be run during lookaheads and alternate testing

        Note: the default parsing behavior is to expand tabs in the input string
        before starting the parsing process.  See L{I{parseString}<parseString>} for more information
        on parsing strings containing C{<TAB>}s, and suggested methods to maintain a
        consistent view of the parsed string, the parse location, and line and column
        positions within the parsed string.
        
        Example::
            integer = Word(nums)
            date_str = integer + '/' + integer + '/' + integer

            date_str.parseString("1999/12/31")  # -> ['1999', '/', '12', '/', '31']

            # use parse action to convert to ints at parse time
            integer = Word(nums).setParseAction(lambda toks: int(toks[0]))
            date_str = integer + '/' + integer + '/' + integer

            # note that integer fields are now ints, not strings
            date_str.parseString("1999/12/31")  # -> [1999, '/', 12, '/', 31]
        """
        self.parseAction = list(map(_trim_arity, list(fns)))
        self.callDuringTry = kwargs.get("callDuringTry", False)
        return self

    def addParseAction( self, *fns, **kwargs ):
        """
        Add parse action to expression's list of parse actions. See L{I{setParseAction}<setParseAction>}.
        
        See examples in L{I{copy}<copy>}.
        """
        self.parseAction += list(map(_trim_arity, list(fns)))
        self.callDuringTry = self.callDuringTry or kwargs.get("callDuringTry", False)
        return self

    def addCondition(self, *fns, **kwargs):
        """Add a boolean predicate function to expression's list of parse actions. See 
        L{I{setParseAction}<setParseAction>} for function call signatures. Unlike C{setParseAction}, 
        functions passed to C{addCondition} need to return boolean success/fail of the condition.

        Optional keyword arguments:
         - message = define a custom message to be used in the raised exception
         - fatal   = if True, will raise ParseFatalException to stop parsing immediately; otherwise will raise ParseException
         
        Example::
            integer = Word(nums).setParseAction(lambda toks: int(toks[0]))
            year_int = integer.copy()
            year_int.addCondition(lambda toks: toks[0] >= 2000, message="Only support years 2000 and later")
            date_str = year_int + '/' + integer + '/' + integer

            result = date_str.parseString("1999/12/31")  # -> Exception: Only support years 2000 and later (at char 0), (line:1, col:1)
        """
        msg = kwargs.get("message", "failed user-defined condition")
        exc_type = ParseFatalException if kwargs.get("fatal", False) else ParseException
        for fn in fns:
            def pa(s,l,t):
                if not bool(_trim_arity(fn)(s,l,t)):
                    raise exc_type(s,l,msg)
            self.parseAction.append(pa)
        self.callDuringTry = self.callDuringTry or kwargs.get("callDuringTry", False)
        return self

    def setFailAction( self, fn ):
        """Define action to perform if parsing fails at this expression.
           Fail action fn is a callable function that takes the arguments
           C{fn(s,loc,expr,err)} where:
            - s = string being parsed
            - loc = location where expression match was attempted and failed
            - expr = the parse expression that failed
            - err = the exception thrown
           The function returns no value.  It may throw C{L{ParseFatalException}}
           if it is desired to stop parsing immediately."""
        self.failAction = fn
        return self

    def _skipIgnorables( self, instring, loc ):
        exprsFound = True
        while exprsFound:
            exprsFound = False
            for e in self.ignoreExprs:
                try:
                    while 1:
                        loc,dummy = e._parse( instring, loc )
                        exprsFound = True
                except ParseException:
                    pass
        return loc

    def preParse( self, instring, loc ):
        if self.ignoreExprs:
            loc = self._skipIgnorables( instring, loc )

        if self.skipWhitespace:
            wt = self.whiteChars
            instrlen = len(instring)
            while loc < instrlen and instring[loc] in wt:
                loc += 1

        return loc

    def parseImpl( self, instring, loc, doActions=True ):
        return loc, []

    def postParse( self, instring, loc, tokenlist ):
        return tokenlist

    #~ @profile
    def _parseNoCache( self, instring, loc, doActions=True, callPreParse=True ):
        debugging = ( self.debug ) #and doActions )

        if debugging or self.failAction:
            #~ print ("Match",self,"at loc",loc,"(%d,%d)" % ( lineno(loc,instring), col(loc,instring) ))
            if (self.debugActions[0] ):
                self.debugActions[0]( instring, loc, self )
            if callPreParse and self.callPreparse:
                preloc = self.preParse( instring, loc )
            else:
                preloc = loc
            tokensStart = preloc
            try:
                try:
                    loc,tokens = self.parseImpl( instring, preloc, doActions )
                except IndexError:
                    raise ParseException( instring, len(instring), self.errmsg, self )
            except ParseBaseException as err:
                #~ print ("Exception raised:", err)
                if self.debugActions[2]:
                    self.debugActions[2]( instring, tokensStart, self, err )
                if self.failAction:
                    self.failAction( instring, tokensStart, self, err )
                raise
        else:
            if callPreParse and self.callPreparse:
                preloc = self.preParse( instring, loc )
            else:
                preloc = loc
            tokensStart = preloc
            if self.mayIndexError or loc >= len(instring):
                try:
                    loc,tokens = self.parseImpl( instring, preloc, doActions )
                except IndexError:
                    raise ParseException( instring, len(instring), self.errmsg, self )
            else:
                loc,tokens = self.parseImpl( instring, preloc, doActions )

        tokens = self.postParse( instring, loc, tokens )

        retTokens = ParseResults( tokens, self.resultsName, asList=self.saveAsList, modal=self.modalResults )
        if self.parseAction and (doActions or self.callDuringTry):
            if debugging:
                try:
                    for fn in self.parseAction:
                        tokens = fn( instring, tokensStart, retTokens )
                        if tokens is not None:
                            retTokens = ParseResults( tokens,
                                                      self.resultsName,
                                                      asList=self.saveAsList and isinstance(tokens,(ParseResults,list)),
                                                      modal=self.modalResults )
                except ParseBaseException as err:
                    #~ print "Exception raised in user parse action:", err
                    if (self.debugActions[2] ):
                        self.debugActions[2]( instring, tokensStart, self, err )
                    raise
            else:
                for fn in self.parseAction:
                    tokens = fn( instring, tokensStart, retTokens )
                    if tokens is not None:
                        retTokens = ParseResults( tokens,
                                                  self.resultsName,
                                                  asList=self.saveAsList and isinstance(tokens,(ParseResults,list)),
                                                  modal=self.modalResults )

        if debugging:
            #~ print ("Matched",self,"->",retTokens.asList())
            if (self.debugActions[1] ):
                self.debugActions[1]( instring, tokensStart, loc, self, retTokens )

        return loc, retTokens

    def tryParse( self, instring, loc ):
        try:
            return self._parse( instring, loc, doActions=False )[0]
        except ParseFatalException:
            raise ParseException( instring, loc, self.errmsg, self)
    
    def canParseNext(self, instring, loc):
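        """
        Quick test of whether this expression would match at location C{loc} in C{instring};
        returns C{True} or C{False} instead of raising an exception.
        """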
        try:
            self.tryParse(instring, loc)
        except (ParseException, IndexError):
            return False
        else:
            return True

    class _UnboundedCache(object):
        def __init__(self):
            cache = {}
            self.not_in_cache = not_in_cache = object()
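            # get/set/clear below are closures over the local 'cache' dict; they are bound
            # to this instance via types.MethodType at the end of __init__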

            def get(self, key):
                return cache.get(key, not_in_cache)

            def set(self, key, value):
                cache[key] = value

            def clear(self):
                cache.clear()

            self.get = types.MethodType(get, self)
            self.set = types.MethodType(set, self)
            self.clear = types.MethodType(clear, self)

    if _OrderedDict is not None:
        class _FifoCache(object):
            def __init__(self, size):
                self.not_in_cache = not_in_cache = object()

                cache = _OrderedDict()

                def get(self, key):
                    return cache.get(key, not_in_cache)

                def set(self, key, value):
                    cache[key] = value
                    if len(cache) > size:
                        cache.popitem(False)

                def clear(self):
                    cache.clear()

                self.get = types.MethodType(get, self)
                self.set = types.MethodType(set, self)
                self.clear = types.MethodType(clear, self)

    else:
        class _FifoCache(object):
            def __init__(self, size):
                self.not_in_cache = not_in_cache = object()

                cache = {}
                key_fifo = collections.deque([], size)

                def get(self, key):
                    return cache.get(key, not_in_cache)

                def set(self, key, value):
                    cache[key] = value
                    if len(cache) > size:
                        cache.pop(key_fifo.popleft(), None)
                    key_fifo.append(key)

                def clear(self):
                    cache.clear()
                    key_fifo.clear()

                self.get = types.MethodType(get, self)
                self.set = types.MethodType(set, self)
                self.clear = types.MethodType(clear, self)

    # argument cache for optimizing repeated calls when backtracking through recursive expressions
    packrat_cache = {} # this is set later by enablePackrat(); this is here so that resetCache() doesn't fail
    packrat_cache_lock = RLock()
    packrat_cache_stats = [0, 0]

    # this method gets repeatedly called during backtracking with the same arguments -
    # we can cache these arguments and save ourselves the trouble of re-parsing the contained expression
    def _parseCache( self, instring, loc, doActions=True, callPreParse=True ):
        HIT, MISS = 0, 1
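        # the cache key includes every argument that can affect the parse outcome at this location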
        lookup = (self, instring, loc, callPreParse, doActions)
        with ParserElement.packrat_cache_lock:
            cache = ParserElement.packrat_cache
            value = cache.get(lookup)
            if value is cache.not_in_cache:
                ParserElement.packrat_cache_stats[MISS] += 1
                try:
                    value = self._parseNoCache(instring, loc, doActions, callPreParse)
                except ParseBaseException as pe:
                    # cache a copy of the exception, without the traceback
                    cache.set(lookup, pe.__class__(*pe.args))
                    raise
                else:
                    cache.set(lookup, (value[0], value[1].copy()))
                    return value
            else:
                ParserElement.packrat_cache_stats[HIT] += 1
                if isinstance(value, Exception):
                    raise value
                return (value[0], value[1].copy())

    _parse = _parseNoCache

    @staticmethod
    def resetCache():
        ParserElement.packrat_cache.clear()
        ParserElement.packrat_cache_stats[:] = [0] * len(ParserElement.packrat_cache_stats)

    _packratEnabled = False
    @staticmethod
    def enablePackrat(cache_size_limit=128):
        """Enables "packrat" parsing, which adds memoizing to the parsing logic.
           Repeated parse attempts at the same string location (which happens
           often in many complex grammars) can immediately return a cached value,
           instead of re-executing parsing/validating code.  Memoizing is done for
           both valid results and parsing exceptions.
           
           Parameters:
            - cache_size_limit - (default=C{128}) - if an integer value is provided
              will limit the size of the packrat cache; if None is passed, then
              the cache size will be unbounded; if 0 is passed, the cache will
              be effectively disabled.
            
           This speedup may break existing programs that use parse actions that
           have side-effects.  For this reason, packrat parsing is disabled when
           you first import pyparsing.  To activate the packrat feature, your
           program must call the class method C{ParserElement.enablePackrat()}.  If
           your program uses C{psyco} to "compile as you go", you must call
           C{enablePackrat} before calling C{psyco.full()}.  If you do not do this,
           Python will crash.  For best results, call C{enablePackrat()} immediately
           after importing pyparsing.
           
           Example::
               import pyparsing
               pyparsing.ParserElement.enablePackrat()
        """
        if not ParserElement._packratEnabled:
            ParserElement._packratEnabled = True
            if cache_size_limit is None:
                ParserElement.packrat_cache = ParserElement._UnboundedCache()
            else:
                ParserElement.packrat_cache = ParserElement._FifoCache(cache_size_limit)
            ParserElement._parse = ParserElement._parseCache

    def parseString( self, instring, parseAll=False ):
        """
        Execute the parse expression with the given string.
        This is the main interface to the client code, once the complete
        expression has been built.

        If you want the grammar to require that the entire input string be
        successfully parsed, then set C{parseAll} to True (equivalent to ending
        the grammar with C{L{StringEnd()}}).

        Note: C{parseString} implicitly calls C{expandtabs()} on the input string,
        in order to report proper column numbers in parse actions.
        If the input string contains tabs and
        the grammar uses parse actions that use the C{loc} argument to index into the
        string being parsed, you can ensure you have a consistent view of the input
        string by:
         - calling C{parseWithTabs} on your grammar before calling C{parseString}
           (see L{I{parseWithTabs}<parseWithTabs>})
         - define your parse action using the full C{(s,loc,toks)} signature, and
           reference the input string using the parse action's C{s} argument
         - explicitly expand the tabs in your input string before calling
           C{parseString}
        
        Example::
            Word('a').parseString('aaaaabaaa')  # -> ['aaaaa']
            Word('a').parseString('aaaaabaaa', parseAll=True)  # -> Exception: Expected end of text
        """
        ParserElement.resetCache()
        if not self.streamlined:
            self.streamline()
            #~ self.saveAsList = True
        for e in self.ignoreExprs:
            e.streamline()
        if not self.keepTabs:
            instring = instring.expandtabs()
        try:
            loc, tokens = self._parse( instring, 0 )
            if parseAll:
                loc = self.preParse( instring, loc )
                se = Empty() + StringEnd()
                se._parse( instring, loc )
        except ParseBaseException as exc:
            if ParserElement.verbose_stacktrace:
                raise
            else:
                # catch and re-raise exception from here, clears out pyparsing internal stack trace
                raise exc
        else:
            return tokens

    def scanString( self, instring, maxMatches=_MAX_INT, overlap=False ):
        """
        Scan the input string for expression matches.  Each match will return the
        matching tokens, start location, and end location.  May be called with optional
        C{maxMatches} argument, to clip scanning after 'n' matches are found.  If
        C{overlap} is specified, then overlapping matches will be reported.

        Note that the start and end locations are reported relative to the string
        being parsed.  See L{I{parseString}<parseString>} for more information on parsing
        strings with embedded tabs.

        Example::
            source = "sldjf123lsdjjkf345sldkjf879lkjsfd987"
            print(source)
            for tokens,start,end in Word(alphas).scanString(source):
                print(' '*start + '^'*(end-start))
                print(' '*start + tokens[0])
        
        prints::
        
            sldjf123lsdjjkf345sldkjf879lkjsfd987
            ^^^^^
            sldjf
                    ^^^^^^^
                    lsdjjkf
                              ^^^^^^
                              sldkjf
                                       ^^^^^^
                                       lkjsfd
        """
        if not self.streamlined:
            self.streamline()
        for e in self.ignoreExprs:
            e.streamline()

        if not self.keepTabs:
            instring = _ustr(instring).expandtabs()
        instrlen = len(instring)
        loc = 0
        preparseFn = self.preParse
        parseFn = self._parse
        ParserElement.resetCache()
        matches = 0
        try:
            while loc <= instrlen and matches < maxMatches:
                try:
                    preloc = preparseFn( instring, loc )
                    nextLoc,tokens = parseFn( instring, preloc, callPreParse=False )
                except ParseException:
                    loc = preloc+1
                else:
                    if nextLoc > loc:
                        matches += 1
                        yield tokens, preloc, nextLoc
                        if overlap:
                            nextloc = preparseFn( instring, loc )
                            if nextloc > loc:
                                loc = nextLoc
                            else:
                                loc += 1
                        else:
                            loc = nextLoc
                    else:
                        loc = preloc+1
        except ParseBaseException as exc:
            if ParserElement.verbose_stacktrace:
                raise
            else:
                # catch and re-raise exception from here, clears out pyparsing internal stack trace
                raise exc

    def transformString( self, instring ):
        """
        Extension to C{L{scanString}}, to modify matching text with modified tokens that may
        be returned from a parse action.  To use C{transformString}, define a grammar and
        attach a parse action to it that modifies the returned token list.
        Invoking C{transformString()} on a target string will then scan for matches,
        and replace the matched text patterns according to the logic in the parse
        action.  C{transformString()} returns the resulting transformed string.
        
        Example::
            wd = Word(alphas)
            wd.setParseAction(lambda toks: toks[0].title())
            
            print(wd.transformString("now is the winter of our discontent made glorious summer by this sun of york."))
        Prints::
            Now Is The Winter Of Our Discontent Made Glorious Summer By This Sun Of York.
        """
        out = []
        lastE = 0
        # force preservation of <TAB>s, to minimize unwanted transformation of string, and to
        # keep string locs straight between transformString and scanString
        self.keepTabs = True
        try:
            for t,s,e in self.scanString( instring ):
                out.append( instring[lastE:s] )
                if t:
                    if isinstance(t,ParseResults):
                        out += t.asList()
                    elif isinstance(t,list):
                        out += t
                    else:
                        out.append(t)
                lastE = e
            out.append(instring[lastE:])
            out = [o for o in out if o]
            return "".join(map(_ustr,_flatten(out)))
        except ParseBaseException as exc:
            if ParserElement.verbose_stacktrace:
                raise
            else:
                # catch and re-raise exception from here, clears out pyparsing internal stack trace
                raise exc

    def searchString( self, instring, maxMatches=_MAX_INT ):
        """
        Another extension to C{L{scanString}}, simplifying the access to the tokens found
        to match the given parse expression.  May be called with optional
        C{maxMatches} argument, to clip searching after 'n' matches are found.
        
        Example::
            # a capitalized word starts with an uppercase letter, followed by zero or more lowercase letters
            cap_word = Word(alphas.upper(), alphas.lower())
            
            print(cap_word.searchString("More than Iron, more than Lead, more than Gold I need Electricity"))
        prints::
            ['More', 'Iron', 'Lead', 'Gold', 'I']
        """
        try:
            return ParseResults([ t for t,s,e in self.scanString( instring, maxMatches ) ])
        except ParseBaseException as exc:
            if ParserElement.verbose_stacktrace:
                raise
            else:
                # catch and re-raise exception from here, clears out pyparsing internal stack trace
                raise exc

    def split(self, instring, maxsplit=_MAX_INT, includeSeparators=False):
        """
        Generator method to split a string using the given expression as a separator.
        May be called with optional C{maxsplit} argument, to limit the number of splits;
        and the optional C{includeSeparators} argument (default=C{False}), to indicate whether
        the matched separator text should be included in the split results.
        
        Example::        
            punc = oneOf(list(".,;:/-!?"))
            print(list(punc.split("This, this?, this sentence, is badly punctuated!")))
        prints::
            ['This', ' this', '', ' this sentence', ' is badly punctuated', '']
        """
        splits = 0
        last = 0
        for t,s,e in self.scanString(instring, maxMatches=maxsplit):
            yield instring[last:s]
            if includeSeparators:
                yield t[0]
            last = e
        yield instring[last:]

    def __add__(self, other ):
        """
        Implementation of + operator - returns C{L{And}}. Adding strings to a ParserElement
        converts them to L{Literal}s by default.
        
        Example::
            greet = Word(alphas) + "," + Word(alphas) + "!"
            hello = "Hello, World!"
            print (hello, "->", greet.parseString(hello))
        Prints::
            Hello, World! -> ['Hello', ',', 'World', '!']
        """
        if isinstance( other, basestring ):
            other = ParserElement._literalStringClass( other )
        if not isinstance( other, ParserElement ):
            warnings.warn("Cannot combine element of type %s with ParserElement" % type(other),
                    SyntaxWarning, stacklevel=2)
            return None
        return And( [ self, other ] )

    def __radd__(self, other ):
        """
        Implementation of + operator when left operand is not a C{L{ParserElement}}
        """
        if isinstance( other, basestring ):
            other = ParserElement._literalStringClass( other )
        if not isinstance( other, ParserElement ):
            warnings.warn("Cannot combine element of type %s with ParserElement" % type(other),
                    SyntaxWarning, stacklevel=2)
            return None
        return other + self

    def __sub__(self, other):
        """
        Implementation of - operator, returns C{L{And}} with error stop
        """
        if isinstance( other, basestring ):
            other = ParserElement._literalStringClass( other )
        if not isinstance( other, ParserElement ):
            warnings.warn("Cannot combine element of type %s with ParserElement" % type(other),
                    SyntaxWarning, stacklevel=2)
            return None
        return And( [ self, And._ErrorStop(), other ] )

    def __rsub__(self, other ):
        """
        Implementation of - operator when left operand is not a C{L{ParserElement}}
        """
        if isinstance( other, basestring ):
            other = ParserElement._literalStringClass( other )
        if not isinstance( other, ParserElement ):
            warnings.warn("Cannot combine element of type %s with ParserElement" % type(other),
                    SyntaxWarning, stacklevel=2)
            return None
        return other - self

    def __mul__(self,other):
        """
        Implementation of * operator, allows use of C{expr * 3} in place of
        C{expr + expr + expr}.  Expressions may also be multiplied by a 2-integer
        tuple, similar to C{{min,max}} multipliers in regular expressions.  Tuples
        may also include C{None} as in:
         - C{expr*(n,None)} or C{expr*(n,)} is equivalent
              to C{expr*n + L{ZeroOrMore}(expr)}
              (read as "at least n instances of C{expr}")
         - C{expr*(None,n)} is equivalent to C{expr*(0,n)}
              (read as "0 to n instances of C{expr}")
         - C{expr*(None,None)} is equivalent to C{L{ZeroOrMore}(expr)}
         - C{expr*(1,None)} is equivalent to C{L{OneOrMore}(expr)}

        Note that C{expr*(None,n)} does not raise an exception if
        more than n exprs exist in the input stream; that is,
        C{expr*(None,n)} does not enforce a maximum number of expr
        occurrences.  If this behavior is desired, then write
        C{expr*(None,n) + ~expr}
        """
        if isinstance(other,int):
            minElements, optElements = other,0
        elif isinstance(other,tuple):
            other = (other + (None, None))[:2]
            if other[0] is None:
                other = (0, other[1])
            if isinstance(other[0],int) and other[1] is None:
                if other[0] == 0:
                    return ZeroOrMore(self)
                if other[0] == 1:
                    return OneOrMore(self)
                else:
                    return self*other[0] + ZeroOrMore(self)
            elif isinstance(other[0],int) and isinstance(other[1],int):
                minElements, optElements = other
                optElements -= minElements
            else:
                raise TypeError("cannot multiply 'ParserElement' and ('%s','%s') objects", type(other[0]),type(other[1]))
        else:
            raise TypeError("cannot multiply 'ParserElement' and '%s' objects", type(other))

        if minElements < 0:
            raise ValueError("cannot multiply ParserElement by negative value")
        if optElements < 0:
            raise ValueError("second tuple value must be greater or equal to first tuple value")
        if minElements == optElements == 0:
            raise ValueError("cannot multiply ParserElement by 0 or (0,0)")

        if (optElements):
            def makeOptionalList(n):
                if n>1:
                    return Optional(self + makeOptionalList(n-1))
                else:
                    return Optional(self)
            if minElements:
                if minElements == 1:
                    ret = self + makeOptionalList(optElements)
                else:
                    ret = And([self]*minElements) + makeOptionalList(optElements)
            else:
                ret = makeOptionalList(optElements)
        else:
            if minElements == 1:
                ret = self
            else:
                ret = And([self]*minElements)
        return ret

    def __rmul__(self, other):
        return self.__mul__(other)

    def __or__(self, other ):
        """
        Implementation of | operator - returns C{L{MatchFirst}}
        """
        if isinstance( other, basestring ):
            other = ParserElement._literalStringClass( other )
        if not isinstance( other, ParserElement ):
            warnings.warn("Cannot combine element of type %s with ParserElement" % type(other),
                    SyntaxWarning, stacklevel=2)
            return None
        return MatchFirst( [ self, other ] )

    def __ror__(self, other ):
        """
        Implementation of | operator when left operand is not a C{L{ParserElement}}
        """
        if isinstance( other, basestring ):
            other = ParserElement._literalStringClass( other )
        if not isinstance( other, ParserElement ):
            warnings.warn("Cannot combine element of type %s with ParserElement" % type(other),
                    SyntaxWarning, stacklevel=2)
            return None
        return other | self

    def __xor__(self, other ):
        """
        Implementation of ^ operator - returns C{L{Or}}
        """
        if isinstance( other, basestring ):
            other = ParserElement._literalStringClass( other )
        if not isinstance( other, ParserElement ):
            warnings.warn("Cannot combine element of type %s with ParserElement" % type(other),
                    SyntaxWarning, stacklevel=2)
            return None
        return Or( [ self, other ] )

    def __rxor__(self, other ):
        """
        Implementation of ^ operator when left operand is not a C{L{ParserElement}}
        """
        if isinstance( other, basestring ):
            other = ParserElement._literalStringClass( other )
        if not isinstance( other, ParserElement ):
            warnings.warn("Cannot combine element of type %s with ParserElement" % type(other),
                    SyntaxWarning, stacklevel=2)
            return None
        return other ^ self

    def __and__(self, other ):
        """
        Implementation of & operator - returns C{L{Each}}
        """
        if isinstance( other, basestring ):
            other = ParserElement._literalStringClass( other )
        if not isinstance( other, ParserElement ):
            warnings.warn("Cannot combine element of type %s with ParserElement" % type(other),
                    SyntaxWarning, stacklevel=2)
            return None
        return Each( [ self, other ] )

    def __rand__(self, other ):
        """
        Implementation of & operator when left operand is not a C{L{ParserElement}}
        """
        if isinstance( other, basestring ):
            other = ParserElement._literalStringClass( other )
        if not isinstance( other, ParserElement ):
            warnings.warn("Cannot combine element of type %s with ParserElement" % type(other),
                    SyntaxWarning, stacklevel=2)
            return None
        return other & self

    def __invert__( self ):
        """
        Implementation of ~ operator - returns C{L{NotAny}}
        """
        return NotAny( self )

    def __call__(self, name=None):
        """
        Shortcut for C{L{setResultsName}}, with C{listAllMatches=False}.
        
        If C{name} is given with a trailing C{'*'} character, then C{listAllMatches} will be
        passed as C{True}.
           
        If C{name} is omitted, same as calling C{L{copy}}.

        Example::
            # these are equivalent
            userdata = Word(alphas).setResultsName("name") + Word(nums+"-").setResultsName("socsecno")
            userdata = Word(alphas)("name") + Word(nums+"-")("socsecno")             
        """
        if name is not None:
            return self.setResultsName(name)
        else:
            return self.copy()

    def suppress( self ):
        """
        Suppresses the output of this C{ParserElement}; useful to keep punctuation from
        cluttering up returned output.
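
        Example (illustrative)::
            integer = Word(nums)
            slash = Literal('/').suppress()
            date_str = integer + slash + integer + slash + integer

            date_str.parseString("1999/12/31")  # -> ['1999', '12', '31']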
        """
        return Suppress( self )

    def leaveWhitespace( self ):
        """
        Disables the skipping of whitespace before matching the characters in the
        C{ParserElement}'s defined pattern.  This is normally only used internally by
        the pyparsing module, but may be needed in some whitespace-sensitive grammars.
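
        Example (illustrative)::
            # require ':' to immediately follow the word, with no intervening whitespace
            tagged = Word(alphas) + Literal(':').leaveWhitespace()
            tagged.parseString("key:")   # -> ['key', ':']
            tagged.parseString("key :")  # -> raises ParseException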
        """
        self.skipWhitespace = False
        return self

    def setWhitespaceChars( self, chars ):
        """
        Overrides the default whitespace chars
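
        Example (illustrative)::
            # skip only spaces before matching, so a newline is significant
            word = Word(alphas).setWhitespaceChars(" ")
            text = "abc def" + chr(10) + "ghi jkl"
            OneOrMore(word).parseString(text)  # -> ['abc', 'def']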
        """
        self.skipWhitespace = True
        self.whiteChars = chars
        self.copyDefaultWhiteChars = False
        return self

    def parseWithTabs( self ):
        """
        Overrides the default behavior of expanding C{<TAB>}s to spaces before parsing the input string; tabs are left in place.
        Must be called before C{parseString} when the input grammar contains elements that
        match C{<TAB>} characters.
        """
        self.keepTabs = True
        return self

    def ignore( self, other ):
        """
        Define expression to be ignored (e.g., comments) while doing pattern
        matching; may be called repeatedly, to define multiple comment or other
        ignorable patterns.
        
        Example::
            patt = OneOrMore(Word(alphas))
            patt.parseString('ablaj /* comment */ lskjd') # -> ['ablaj']
            
            patt.ignore(cStyleComment)
            patt.parseString('ablaj /* comment */ lskjd') # -> ['ablaj', 'lskjd']
        """
        if isinstance(other, basestring):
            other = Suppress(other)

        if isinstance( other, Suppress ):
            if other not in self.ignoreExprs:
                self.ignoreExprs.append(other)
        else:
            self.ignoreExprs.append( Suppress( other.copy() ) )
        return self

    def setDebugActions( self, startAction, successAction, exceptionAction ):
        """
        Enable display of debugging messages while doing pattern matching.
        """
        self.debugActions = (startAction or _defaultStartDebugAction,
                             successAction or _defaultSuccessDebugAction,
                             exceptionAction or _defaultExceptionDebugAction)
        self.debug = True
        return self

    def setDebug( self, flag=True ):
        """
        Enable display of debugging messages while doing pattern matching.
        Set C{flag} to True to enable, False to disable.

        Example::
            wd = Word(alphas).setName("alphaword")
            integer = Word(nums).setName("numword")
            term = wd | integer
            
            # turn on debugging for wd
            wd.setDebug()

            OneOrMore(term).parseString("abc 123 xyz 890")
        
        prints::
            Match alphaword at loc 0(1,1)
            Matched alphaword -> ['abc']
            Match alphaword at loc 3(1,4)
            Exception raised:Expected alphaword (at char 4), (line:1, col:5)
            Match alphaword at loc 7(1,8)
            Matched alphaword -> ['xyz']
            Match alphaword at loc 11(1,12)
            Exception raised:Expected alphaword (at char 12), (line:1, col:13)
            Match alphaword at loc 15(1,16)
            Exception raised:Expected alphaword (at char 15), (line:1, col:16)

        The output shown is that produced by the default debug actions - custom debug actions can be
        specified using L{setDebugActions}. Prior to attempting
        to match the C{wd} expression, the debugging message C{"Match <exprname> at loc <n>(<line>,<col>)"}
        is shown. Then, if the parse succeeds, a C{"Matched"} message is shown; if it fails, an C{"Exception raised"}
        message is shown. Also note the use of L{setName} to assign a human-readable name to the expression,
        which makes debugging and exception messages easier to understand - for instance, the default
        name created for the C{Word} expression without calling C{setName} is C{"W:(ABCD...)"}.
        """
        if flag:
            self.setDebugActions( _defaultStartDebugAction, _defaultSuccessDebugAction, _defaultExceptionDebugAction )
        else:
            self.debug = False
        return self

    def __str__( self ):
        return self.name

    def __repr__( self ):
        return _ustr(self)

    def streamline( self ):
        self.streamlined = True
        self.strRepr = None
        return self

    def checkRecursion( self, parseElementList ):
        pass

    def validate( self, validateTrace=[] ):
        """
        Check defined expressions for valid structure, check for infinite recursive definitions.
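
        Illustrative sketch (not from the original pyparsing docs)::

            a = Forward()
            a <<= a + Literal("x")   # unbounded left recursion
            a.validate()             # raises RecursiveGrammarException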
        """
        self.checkRecursion( [] )

    def parseFile( self, file_or_filename, parseAll=False ):
        """
        Execute the parse expression on the given file or filename.
        If a filename is specified (instead of a file object),
        the entire file is opened, read, and closed before parsing.
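
        Illustrative sketch (not from the original pyparsing docs; the file name is
        hypothetical)::

            csv_line = delimitedList(Word(nums))
            results = csv_line.parseFile("numbers.txt")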
        """
        try:
            file_contents = file_or_filename.read()
        except AttributeError:
            with open(file_or_filename, "r") as f:
                file_contents = f.read()
        try:
            return self.parseString(file_contents, parseAll)
        except ParseBaseException as exc:
            if ParserElement.verbose_stacktrace:
                raise
            else:
                # catch and re-raise exception from here, clears out pyparsing internal stack trace
                raise exc

    def __eq__(self,other):
        if isinstance(other, ParserElement):
            return self is other or vars(self) == vars(other)
        elif isinstance(other, basestring):
            return self.matches(other)
        else:
            return super(ParserElement, self).__eq__(other)

    def __ne__(self,other):
        return not (self == other)

    def __hash__(self):
        return hash(id(self))

    def __req__(self,other):
        return self == other

    def __rne__(self,other):
        return not (self == other)

    def matches(self, testString, parseAll=True):
        """
        Method for quick testing of a parser against a test string. Good for simple 
        inline microtests of sub expressions while building up larger parser.
           
        Parameters:
         - testString - to test against this expression for a match
         - parseAll - (default=C{True}) - flag to pass to C{L{parseString}} when running tests
            
        Example::
            expr = Word(nums)
            assert expr.matches("100")
        """
        try:
            self.parseString(_ustr(testString), parseAll=parseAll)
            return True
        except ParseBaseException:
            return False
                
    def runTests(self, tests, parseAll=True, comment='#', fullDump=True, printResults=True, failureTests=False):
        """
        Execute the parse expression on a series of test strings, showing each
        test, the parsed results or where the parse failed. Quick and easy way to
        run a parse expression against a list of sample strings.
           
        Parameters:
         - tests - a list of separate test strings, or a multiline string of test strings
         - parseAll - (default=C{True}) - flag to pass to C{L{parseString}} when running tests           
         - comment - (default=C{'#'}) - expression for indicating embedded comments in the test 
              string; pass None to disable comment filtering
         - fullDump - (default=C{True}) - dump results as list followed by results names in nested outline;
              if False, only dump nested list
         - printResults - (default=C{True}) prints test output to stdout
         - failureTests - (default=C{False}) indicates if these tests are expected to fail parsing

        Returns: a (success, results) tuple, where success indicates that all tests succeeded
        (or all failed, if C{failureTests} is True), and results is a list of (test string, result)
        pairs, where result is the C{ParseResults} from a successful parse or the exception raised on failure
        
        Example::
            number_expr = pyparsing_common.number.copy()

            result = number_expr.runTests('''
                # unsigned integer
                100
                # negative integer
                -100
                # float with scientific notation
                6.02e23
                # integer with scientific notation
                1e-12
                ''')
            print("Success" if result[0] else "Failed!")

            result = number_expr.runTests('''
                # stray character
                100Z
                # missing leading digit before '.'
                -.100
                # too many '.'
                3.14.159
                ''', failureTests=True)
            print("Success" if result[0] else "Failed!")
        prints::
            # unsigned integer
            100
            [100]

            # negative integer
            -100
            [-100]

            # float with scientific notation
            6.02e23
            [6.02e+23]

            # integer with scientific notation
            1e-12
            [1e-12]

            Success
            
            # stray character
            100Z
               ^
            FAIL: Expected end of text (at char 3), (line:1, col:4)

            # missing leading digit before '.'
            -.100
            ^
            FAIL: Expected {real number with scientific notation | real number | signed integer} (at char 0), (line:1, col:1)

            # too many '.'
            3.14.159
                ^
            FAIL: Expected end of text (at char 4), (line:1, col:5)

            Success

        Each test string must be on a single line. If you want to test a string that spans multiple
        lines, create a test like this::

            expr.runTests(r"this is a test\\n of strings that spans \\n 3 lines")
        
        (Note that this is a raw string literal, you must include the leading 'r'.)
        """
        if isinstance(tests, basestring):
            tests = list(map(str.strip, tests.rstrip().splitlines()))
        if isinstance(comment, basestring):
            comment = Literal(comment)
        allResults = []
        comments = []
        success = True
        for t in tests:
            if (comment is not None and comment.matches(t, False)) or (comments and not t):
                comments.append(t)
                continue
            if not t:
                continue
            out = ['\n'.join(comments), t]
            comments = []
            try:
                t = t.replace(r'\n','\n')
                result = self.parseString(t, parseAll=parseAll)
                out.append(result.dump(full=fullDump))
                success = success and not failureTests
            except ParseBaseException as pe:
                fatal = "(FATAL)" if isinstance(pe, ParseFatalException) else ""
                if '\n' in t:
                    out.append(line(pe.loc, t))
                    out.append(' '*(col(pe.loc,t)-1) + '^' + fatal)
                else:
                    out.append(' '*pe.loc + '^' + fatal)
                out.append("FAIL: " + str(pe))
                success = success and failureTests
                result = pe
            except Exception as exc:
                out.append("FAIL-EXCEPTION: " + str(exc))
                success = success and failureTests
                result = exc

            if printResults:
                if fullDump:
                    out.append('')
                print('\n'.join(out))

            allResults.append((t, result))
        
        return success, allResults

        
class Token(ParserElement):
    """
    Abstract C{ParserElement} subclass, for defining atomic matching patterns.
    """
    def __init__( self ):
        super(Token,self).__init__( savelist=False )


class Empty(Token):
    """
    An empty token, will always match.
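
    Illustrative sketch (not from the original pyparsing docs)::

        # Empty matches at any position, consuming no input and adding no tokens
        (Empty() + Word(nums)).parseString("123")   # -> ['123']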
    """
    def __init__( self ):
        super(Empty,self).__init__()
        self.name = "Empty"
        self.mayReturnEmpty = True
        self.mayIndexError = False


class NoMatch(Token):
    """
    A token that will never match.
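
    Illustrative sketch (not from the original pyparsing docs)::

        # NoMatch always fails; it can stand in as a placeholder alternative
        expr = Word(alphas) | NoMatch()
        expr.parseString("abc")        # -> ['abc']
        NoMatch().parseString("abc")   # -> Exception: Unmatchable token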
    """
    def __init__( self ):
        super(NoMatch,self).__init__()
        self.name = "NoMatch"
        self.mayReturnEmpty = True
        self.mayIndexError = False
        self.errmsg = "Unmatchable token"

    def parseImpl( self, instring, loc, doActions=True ):
        raise ParseException(instring, loc, self.errmsg, self)


class Literal(Token):
    """
    Token to exactly match a specified string.
    
    Example::
        Literal('blah').parseString('blah')  # -> ['blah']
        Literal('blah').parseString('blahfooblah')  # -> ['blah']
        Literal('blah').parseString('bla')  # -> Exception: Expected "blah"
    
    For case-insensitive matching, use L{CaselessLiteral}.
    
    For keyword matching (force word break before and after the matched string),
    use L{Keyword} or L{CaselessKeyword}.
    """
    def __init__( self, matchString ):
        super(Literal,self).__init__()
        self.match = matchString
        self.matchLen = len(matchString)
        try:
            self.firstMatchChar = matchString[0]
        except IndexError:
            warnings.warn("null string passed to Literal; use Empty() instead",
                            SyntaxWarning, stacklevel=2)
            self.__class__ = Empty
        self.name = '"%s"' % _ustr(self.match)
        self.errmsg = "Expected " + self.name
        self.mayReturnEmpty = False
        self.mayIndexError = False

    # Performance tuning: this routine gets called a *lot*
    # if this is a single character match string  and the first character matches,
    # short-circuit as quickly as possible, and avoid calling startswith
    #~ @profile
    def parseImpl( self, instring, loc, doActions=True ):
        if (instring[loc] == self.firstMatchChar and
            (self.matchLen==1 or instring.startswith(self.match,loc)) ):
            return loc+self.matchLen, self.match
        raise ParseException(instring, loc, self.errmsg, self)
_L = Literal
ParserElement._literalStringClass = Literal

class Keyword(Token):
    """
    Token to exactly match a specified string as a keyword, that is, it must be
    immediately followed by a non-keyword character.  Compare with C{L{Literal}}:
     - C{Literal("if")} will match the leading C{'if'} in C{'ifAndOnlyIf'}.
     - C{Keyword("if")} will not; it will only match the leading C{'if'} in C{'if x=1'}, or C{'if(y==2)'}
    Accepts two optional constructor arguments in addition to the keyword string:
     - C{identChars} is a string of characters that would be valid identifier characters,
          defaulting to all alphanumerics + "_" and "$"
     - C{caseless} allows case-insensitive matching, default is C{False}.
       
    Example::
        Keyword("start").parseString("start")  # -> ['start']
        Keyword("start").parseString("starting")  # -> Exception

    For case-insensitive matching, use L{CaselessKeyword}.
    """
    DEFAULT_KEYWORD_CHARS = alphanums+"_$"

    def __init__( self, matchString, identChars=None, caseless=False ):
        super(Keyword,self).__init__()
        if identChars is None:
            identChars = Keyword.DEFAULT_KEYWORD_CHARS
        self.match = matchString
        self.matchLen = len(matchString)
        try:
            self.firstMatchChar = matchString[0]
        except IndexError:
            warnings.warn("null string passed to Keyword; use Empty() instead",
                            SyntaxWarning, stacklevel=2)
        self.name = '"%s"' % self.match
        self.errmsg = "Expected " + self.name
        self.mayReturnEmpty = False
        self.mayIndexError = False
        self.caseless = caseless
        if caseless:
            self.caselessmatch = matchString.upper()
            identChars = identChars.upper()
        self.identChars = set(identChars)

    def parseImpl( self, instring, loc, doActions=True ):
        if self.caseless:
            if ( (instring[ loc:loc+self.matchLen ].upper() == self.caselessmatch) and
                 (loc >= len(instring)-self.matchLen or instring[loc+self.matchLen].upper() not in self.identChars) and
                 (loc == 0 or instring[loc-1].upper() not in self.identChars) ):
                return loc+self.matchLen, self.match
        else:
            if (instring[loc] == self.firstMatchChar and
                (self.matchLen==1 or instring.startswith(self.match,loc)) and
                (loc >= len(instring)-self.matchLen or instring[loc+self.matchLen] not in self.identChars) and
                (loc == 0 or instring[loc-1] not in self.identChars) ):
                return loc+self.matchLen, self.match
        raise ParseException(instring, loc, self.errmsg, self)

    def copy(self):
        c = super(Keyword,self).copy()
        c.identChars = Keyword.DEFAULT_KEYWORD_CHARS
        return c

    @staticmethod
    def setDefaultKeywordChars( chars ):
        """Overrides the default Keyword chars
        """
        Keyword.DEFAULT_KEYWORD_CHARS = chars

class CaselessLiteral(Literal):
    """
    Token to match a specified string, ignoring case of letters.
    Note: the matched results will always be in the case of the given
    match string, NOT the case of the input text.

    Example::
        OneOrMore(CaselessLiteral("CMD")).parseString("cmd CMD Cmd10") # -> ['CMD', 'CMD', 'CMD']
        
    (Contrast with example for L{CaselessKeyword}.)
    """
    def __init__( self, matchString ):
        super(CaselessLiteral,self).__init__( matchString.upper() )
        # Preserve the defining literal.
        self.returnString = matchString
        self.name = "'%s'" % self.returnString
        self.errmsg = "Expected " + self.name

    def parseImpl( self, instring, loc, doActions=True ):
        if instring[ loc:loc+self.matchLen ].upper() == self.match:
            return loc+self.matchLen, self.returnString
        raise ParseException(instring, loc, self.errmsg, self)

class CaselessKeyword(Keyword):
    """
    Caseless version of L{Keyword}.

    Example::
        OneOrMore(CaselessKeyword("CMD")).parseString("cmd CMD Cmd10") # -> ['CMD', 'CMD']
        
    (Contrast with example for L{CaselessLiteral}.)
    """
    def __init__( self, matchString, identChars=None ):
        super(CaselessKeyword,self).__init__( matchString, identChars, caseless=True )

    def parseImpl( self, instring, loc, doActions=True ):
        if ( (instring[ loc:loc+self.matchLen ].upper() == self.caselessmatch) and
             (loc >= len(instring)-self.matchLen or instring[loc+self.matchLen].upper() not in self.identChars) ):
            return loc+self.matchLen, self.match
        raise ParseException(instring, loc, self.errmsg, self)

class CloseMatch(Token):
    """
    A variation on L{Literal} which matches "close" matches, that is, 
    strings with at most 'n' mismatching characters. C{CloseMatch} takes parameters:
     - C{match_string} - string to be matched
     - C{maxMismatches} - (C{default=1}) maximum number of mismatches allowed to count as a match
    
    The results from a successful parse will contain the matched text from the input string and the following named results:
     - C{mismatches} - a list of the positions within the match_string where mismatches were found
     - C{original} - the original match_string used to compare against the input string
    
    If C{mismatches} is an empty list, then the match was an exact match.
    
    Example::
        patt = CloseMatch("ATCATCGAATGGA")
        patt.parseString("ATCATCGAAXGGA") # -> (['ATCATCGAAXGGA'], {'mismatches': [[9]], 'original': ['ATCATCGAATGGA']})
        patt.parseString("ATCAXCGAAXGGA") # -> Exception: Expected 'ATCATCGAATGGA' (with up to 1 mismatches) (at char 0), (line:1, col:1)

        # exact match
        patt.parseString("ATCATCGAATGGA") # -> (['ATCATCGAATGGA'], {'mismatches': [[]], 'original': ['ATCATCGAATGGA']})

        # close match allowing up to 2 mismatches
        patt = CloseMatch("ATCATCGAATGGA", maxMismatches=2)
        patt.parseString("ATCAXCGAAXGGA") # -> (['ATCAXCGAAXGGA'], {'mismatches': [[4, 9]], 'original': ['ATCATCGAATGGA']})
    """
    def __init__(self, match_string, maxMismatches=1):
        super(CloseMatch,self).__init__()
        self.name = match_string
        self.match_string = match_string
        self.maxMismatches = maxMismatches
        self.errmsg = "Expected %r (with up to %d mismatches)" % (self.match_string, self.maxMismatches)
        self.mayIndexError = False
        self.mayReturnEmpty = False

    def parseImpl( self, instring, loc, doActions=True ):
        start = loc
        instrlen = len(instring)
        maxloc = start + len(self.match_string)

        if maxloc <= instrlen:
            match_string = self.match_string
            match_stringloc = 0
            mismatches = []
            maxMismatches = self.maxMismatches

            for match_stringloc,s_m in enumerate(zip(instring[loc:maxloc], self.match_string)):
                src,mat = s_m
                if src != mat:
                    mismatches.append(match_stringloc)
                    if len(mismatches) > maxMismatches:
                        break
            else:
                # for/else: the loop completed within the mismatch budget
                loc = start + match_stringloc + 1
                results = ParseResults([instring[start:loc]])
                results['original'] = self.match_string
                results['mismatches'] = mismatches
                return loc, results

        raise ParseException(instring, loc, self.errmsg, self)


class Word(Token):
    """
    Token for matching words composed of allowed character sets.
    Defined with a string containing all allowed initial characters,
    an optional string containing allowed body characters (if omitted,
    defaults to the initial character set), and an optional minimum,
    maximum, and/or exact length.  The default value for C{min} is 1 (a
    minimum value < 1 is not valid); the default values for C{max} and C{exact}
    are 0, meaning no maximum or exact length restriction. An optional
    C{excludeChars} parameter can list characters that might be found in 
    the input C{bodyChars} string; useful to define a word of all printables
    except for one or two characters, for instance.
    
    L{srange} is useful for defining custom character set strings for defining 
    C{Word} expressions, using range notation from regular expression character sets.
    
    A common mistake is to use C{Word} to match a specific literal string, as in 
    C{Word("Address")}. Remember that C{Word} uses the string argument to define
    I{sets} of matchable characters. This expression would match "Add", "AAA",
    "dAred", or any other word made up of the characters 'A', 'd', 'r', 'e', and 's'.
    To match an exact literal string, use L{Literal} or L{Keyword}.

    pyparsing includes helper strings for building Words:
     - L{alphas}
     - L{nums}
     - L{alphanums}
     - L{hexnums}
     - L{alphas8bit} (alphabetic characters in the Latin-1 range 128-255 - accented, tilded, umlauted, etc.)
     - L{punc8bit} (non-alphabetic characters in the Latin-1 range 128-255 - currency, symbols, superscripts, diacriticals, etc.)
     - L{printables} (any non-whitespace character)

    Example::
        # a word composed of digits
        integer = Word(nums) # equivalent to Word("0123456789") or Word(srange("0-9"))
        
        # a word with a leading capital, and zero or more lowercase
        capital_word = Word(alphas.upper(), alphas.lower())

        # hostnames are alphanumeric, with leading alpha, and '-'
        hostname = Word(alphas, alphanums+'-')
        
        # roman numeral (not a strict parser, accepts invalid mix of characters)
        roman = Word("IVXLCDM")
        
        # any string of non-whitespace characters, except for ','
        csv_value = Word(printables, excludeChars=",")
    """
    def __init__( self, initChars, bodyChars=None, min=1, max=0, exact=0, asKeyword=False, excludeChars=None ):
        super(Word,self).__init__()
        if excludeChars:
            initChars = ''.join(c for c in initChars if c not in excludeChars)
            if bodyChars:
                bodyChars = ''.join(c for c in bodyChars if c not in excludeChars)
        self.initCharsOrig = initChars
        self.initChars = set(initChars)
        if bodyChars :
            self.bodyCharsOrig = bodyChars
            self.bodyChars = set(bodyChars)
        else:
            self.bodyCharsOrig = initChars
            self.bodyChars = set(initChars)

        self.maxSpecified = max > 0

        if min < 1:
            raise ValueError("cannot specify a minimum length < 1; use Optional(Word()) if zero-length word is permitted")

        self.minLen = min

        if max > 0:
            self.maxLen = max
        else:
            self.maxLen = _MAX_INT

        if exact > 0:
            self.maxLen = exact
            self.minLen = exact

        self.name = _ustr(self)
        self.errmsg = "Expected " + self.name
        self.mayIndexError = False
        self.asKeyword = asKeyword

        if ' ' not in self.initCharsOrig+self.bodyCharsOrig and (min==1 and max==0 and exact==0):
            if self.bodyCharsOrig == self.initCharsOrig:
                self.reString = "[%s]+" % _escapeRegexRangeChars(self.initCharsOrig)
            elif len(self.initCharsOrig) == 1:
                self.reString = "%s[%s]*" % \
                                      (re.escape(self.initCharsOrig),
                                      _escapeRegexRangeChars(self.bodyCharsOrig),)
            else:
                self.reString = "[%s][%s]*" % \
                                      (_escapeRegexRangeChars(self.initCharsOrig),
                                      _escapeRegexRangeChars(self.bodyCharsOrig),)
            if self.asKeyword:
                self.reString = r"\b"+self.reString+r"\b"
            try:
                self.re = re.compile( self.reString )
            except Exception:
                self.re = None

    def parseImpl( self, instring, loc, doActions=True ):
        if self.re:
            result = self.re.match(instring,loc)
            if not result:
                raise ParseException(instring, loc, self.errmsg, self)

            loc = result.end()
            return loc, result.group()

        if instring[loc] not in self.initChars:
            raise ParseException(instring, loc, self.errmsg, self)

        start = loc
        loc += 1
        instrlen = len(instring)
        bodychars = self.bodyChars
        maxloc = start + self.maxLen
        maxloc = min( maxloc, instrlen )
        while loc < maxloc and instring[loc] in bodychars:
            loc += 1

        throwException = False
        if loc - start < self.minLen:
            throwException = True
        if self.maxSpecified and loc < instrlen and instring[loc] in bodychars:
            throwException = True
        if self.asKeyword:
            if (start>0 and instring[start-1] in bodychars) or (loc<instrlen and instring[loc] in bodychars):
                throwException = True

        if throwException:
            raise ParseException(instring, loc, self.errmsg, self)

        return loc, instring[start:loc]

    def __str__( self ):
        try:
            return super(Word,self).__str__()
        except Exception:
            pass


        if self.strRepr is None:

            def charsAsStr(s):
                if len(s)>4:
                    return s[:4]+"..."
                else:
                    return s

            if ( self.initCharsOrig != self.bodyCharsOrig ):
                self.strRepr = "W:(%s,%s)" % ( charsAsStr(self.initCharsOrig), charsAsStr(self.bodyCharsOrig) )
            else:
                self.strRepr = "W:(%s)" % charsAsStr(self.initCharsOrig)

        return self.strRepr


class Regex(Token):
    """
    Token for matching strings that match a given regular expression.
    Defined with a string specifying the regular expression, in a form recognized by the built-in Python re module.
    If the given regex contains named groups (defined using C{(?P<name>...)}), these will be preserved as 
    named parse results.

    Example::
        realnum = Regex(r"[+-]?\d+\.\d*")
        date = Regex(r'(?P<year>\d{4})-(?P<month>\d\d?)-(?P<day>\d\d?)')
        # ref: http://stackoverflow.com/questions/267399/how-do-you-match-only-valid-roman-numerals-with-a-regular-expression
        roman = Regex(r"M{0,4}(CM|CD|D?C{0,3})(XC|XL|L?X{0,3})(IX|IV|V?I{0,3})")
    """
    compiledREtype = type(re.compile("[A-Z]"))
    def __init__( self, pattern, flags=0):
        """The parameters C{pattern} and C{flags} are passed to the C{re.compile()} function as-is. See the Python C{re} module for an explanation of the acceptable patterns and flags."""
        super(Regex,self).__init__()

        if isinstance(pattern, basestring):
            if not pattern:
                warnings.warn("null string passed to Regex; use Empty() instead",
                        SyntaxWarning, stacklevel=2)

            self.pattern = pattern
            self.flags = flags

            try:
                self.re = re.compile(self.pattern, self.flags)
                self.reString = self.pattern
            except sre_constants.error:
                warnings.warn("invalid pattern (%s) passed to Regex" % pattern,
                    SyntaxWarning, stacklevel=2)
                raise

        elif isinstance(pattern, Regex.compiledREtype):
            self.re = pattern
            self.pattern = self.reString = str(pattern)
            self.flags = flags
            
        else:
            raise ValueError("Regex may only be constructed with a string or a compiled RE object")

        self.name = _ustr(self)
        self.errmsg = "Expected " + self.name
        self.mayIndexError = False
        self.mayReturnEmpty = True

    def parseImpl( self, instring, loc, doActions=True ):
        result = self.re.match(instring,loc)
        if not result:
            raise ParseException(instring, loc, self.errmsg, self)

        loc = result.end()
        d = result.groupdict()
        ret = ParseResults(result.group())
        if d:
            for k in d:
                ret[k] = d[k]
        return loc,ret

    def __str__( self ):
        try:
            return super(Regex,self).__str__()
        except Exception:
            pass

        if self.strRepr is None:
            self.strRepr = "Re:(%s)" % repr(self.pattern)

        return self.strRepr


class QuotedString(Token):
    r"""
    Token for matching strings that are delimited by quoting characters.
    
    Defined with the following parameters:
        - quoteChar - string of one or more characters defining the quote delimiting string
        - escChar - character to escape quotes, typically backslash (default=C{None})
        - escQuote - special quote sequence to escape an embedded quote string (such as SQL's "" to escape an embedded ") (default=C{None})
        - multiline - boolean indicating whether quotes can span multiple lines (default=C{False})
        - unquoteResults - boolean indicating whether the matched text should be unquoted (default=C{True})
        - endQuoteChar - string of one or more characters defining the end of the quote delimited string (default=C{None} => same as quoteChar)
        - convertWhitespaceEscapes - convert escaped whitespace (C{'\t'}, C{'\n'}, etc.) to actual whitespace (default=C{True})

    Example::
        qs = QuotedString('"')
        print(qs.searchString('lsjdf "This is the quote" sldjf'))
        complex_qs = QuotedString('{{', endQuoteChar='}}')
        print(complex_qs.searchString('lsjdf {{This is the "quote"}} sldjf'))
        sql_qs = QuotedString('"', escQuote='""')
        print(sql_qs.searchString('lsjdf "This is the quote with ""embedded"" quotes" sldjf'))
    prints::
        [['This is the quote']]
        [['This is the "quote"']]
        [['This is the quote with "embedded" quotes']]
    """
    def __init__( self, quoteChar, escChar=None, escQuote=None, multiline=False, unquoteResults=True, endQuoteChar=None, convertWhitespaceEscapes=True):
        super(QuotedString,self).__init__()

        # remove white space from quote chars - won't work anyway
        quoteChar = quoteChar.strip()
        if not quoteChar:
            warnings.warn("quoteChar cannot be the empty string",SyntaxWarning,stacklevel=2)
            raise SyntaxError()

        if endQuoteChar is None:
            endQuoteChar = quoteChar
        else:
            endQuoteChar = endQuoteChar.strip()
            if not endQuoteChar:
                warnings.warn("endQuoteChar cannot be the empty string",SyntaxWarning,stacklevel=2)
                raise SyntaxError()

        self.quoteChar = quoteChar
        self.quoteCharLen = len(quoteChar)
        self.firstQuoteChar = quoteChar[0]
        self.endQuoteChar = endQuoteChar
        self.endQuoteCharLen = len(endQuoteChar)
        self.escChar = escChar
        self.escQuote = escQuote
        self.unquoteResults = unquoteResults
        self.convertWhitespaceEscapes = convertWhitespaceEscapes

        if multiline:
            self.flags = re.MULTILINE | re.DOTALL
            self.pattern = r'%s(?:[^%s%s]' % \
                ( re.escape(self.quoteChar),
                  _escapeRegexRangeChars(self.endQuoteChar[0]),
                  (escChar is not None and _escapeRegexRangeChars(escChar) or '') )
        else:
            self.flags = 0
            self.pattern = r'%s(?:[^%s\n\r%s]' % \
                ( re.escape(self.quoteChar),
                  _escapeRegexRangeChars(self.endQuoteChar[0]),
                  (escChar is not None and _escapeRegexRangeChars(escChar) or '') )
        if len(self.endQuoteChar) > 1:
            self.pattern += (
                '|(?:' + ')|(?:'.join("%s[^%s]" % (re.escape(self.endQuoteChar[:i]),
                                               _escapeRegexRangeChars(self.endQuoteChar[i]))
                                    for i in range(len(self.endQuoteChar)-1,0,-1)) + ')'
                )
        if escQuote:
            self.pattern += (r'|(?:%s)' % re.escape(escQuote))
        if escChar:
            self.pattern += (r'|(?:%s.)' % re.escape(escChar))
            self.escCharReplacePattern = re.escape(self.escChar)+"(.)"
        self.pattern += (r')*%s' % re.escape(self.endQuoteChar))

        try:
            self.re = re.compile(self.pattern, self.flags)
            self.reString = self.pattern
        except sre_constants.error:
            warnings.warn("invalid pattern (%s) passed to Regex" % self.pattern,
                SyntaxWarning, stacklevel=2)
            raise

        self.name = _ustr(self)
        self.errmsg = "Expected " + self.name
        self.mayIndexError = False
        self.mayReturnEmpty = True

    def parseImpl( self, instring, loc, doActions=True ):
        result = instring[loc] == self.firstQuoteChar and self.re.match(instring,loc) or None
        if not result:
            raise ParseException(instring, loc, self.errmsg, self)

        loc = result.end()
        ret = result.group()

        if self.unquoteResults:

            # strip off quotes
            ret = ret[self.quoteCharLen:-self.endQuoteCharLen]

            if isinstance(ret,basestring):
                # replace escaped whitespace
                if '\\' in ret and self.convertWhitespaceEscapes:
                    ws_map = {
                        r'\t' : '\t',
                        r'\n' : '\n',
                        r'\f' : '\f',
                        r'\r' : '\r',
                    }
                    for wslit,wschar in ws_map.items():
                        ret = ret.replace(wslit, wschar)

                # replace escaped characters
                if self.escChar:
                    ret = re.sub(self.escCharReplacePattern, r"\g<1>", ret)

                # replace escaped quotes
                if self.escQuote:
                    ret = ret.replace(self.escQuote, self.endQuoteChar)

        return loc, ret

    def __str__( self ):
        try:
            return super(QuotedString,self).__str__()
        except Exception:
            pass

        if self.strRepr is None:
            self.strRepr = "quoted string, starting with %s ending with %s" % (self.quoteChar, self.endQuoteChar)

        return self.strRepr


class CharsNotIn(Token):
    """
    Token for matching words composed of characters I{not} in a given set (will
    include whitespace in matched characters if not listed in the provided exclusion set - see example).
    Defined with a string containing all disallowed characters, and an optional
    minimum, maximum, and/or exact length.  The default value for C{min} is 1 (a
    minimum value < 1 is not valid); the default values for C{max} and C{exact}
    are 0, meaning no maximum or exact length restriction.

    Example::
        # define a comma-separated-value as anything that is not a ','
        csv_value = CharsNotIn(',')
        print(delimitedList(csv_value).parseString("dkls,lsdkjf,s12 34,@!#,213"))
    prints::
        ['dkls', 'lsdkjf', 's12 34', '@!#', '213']
    """
    def __init__( self, notChars, min=1, max=0, exact=0 ):
        super(CharsNotIn,self).__init__()
        self.skipWhitespace = False
        self.notChars = notChars

        if min < 1:
            raise ValueError("cannot specify a minimum length < 1; use Optional(CharsNotIn()) if zero-length char group is permitted")

        self.minLen = min

        if max > 0:
            self.maxLen = max
        else:
            self.maxLen = _MAX_INT

        if exact > 0:
            self.maxLen = exact
            self.minLen = exact

        self.name = _ustr(self)
        self.errmsg = "Expected " + self.name
        self.mayReturnEmpty = ( self.minLen == 0 )
        self.mayIndexError = False

    def parseImpl( self, instring, loc, doActions=True ):
        if instring[loc] in self.notChars:
            raise ParseException(instring, loc, self.errmsg, self)

        start = loc
        loc += 1
        notchars = self.notChars
        maxlen = min( start+self.maxLen, len(instring) )
        while loc < maxlen and \
              (instring[loc] not in notchars):
            loc += 1

        if loc - start < self.minLen:
            raise ParseException(instring, loc, self.errmsg, self)

        return loc, instring[start:loc]

    def __str__( self ):
        try:
            return super(CharsNotIn, self).__str__()
        except Exception:
            pass

        if self.strRepr is None:
            if len(self.notChars) > 4:
                self.strRepr = "!W:(%s...)" % self.notChars[:4]
            else:
                self.strRepr = "!W:(%s)" % self.notChars

        return self.strRepr

class White(Token):
    """
    Special matching class for matching whitespace.  Normally, whitespace is ignored
    by pyparsing grammars.  This class is included when some whitespace structures
    are significant.  Define with a string containing the whitespace characters to be
    matched; default is C{" \\t\\r\\n"}.  Also takes optional C{min}, C{max}, and C{exact} arguments,
    as defined for the C{L{Word}} class.
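
    Illustrative sketch (not from the original pyparsing docs), making the
    whitespace between two words part of the results::

        expr = Word(alphas) + White(" ") + Word(alphas)
        expr.parseString("hello   world")   # -> ['hello', '   ', 'world']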
    """
    whiteStrs = {
        " " : "<SPC>",
        "\t": "<TAB>",
        "\n": "<LF>",
        "\r": "<CR>",
        "\f": "<FF>",
        }
    def __init__(self, ws=" \t\r\n", min=1, max=0, exact=0):
        super(White,self).__init__()
        self.matchWhite = ws
        self.setWhitespaceChars( "".join(c for c in self.whiteChars if c not in self.matchWhite) )
        #~ self.leaveWhitespace()
        self.name = ("".join(White.whiteStrs[c] for c in self.matchWhite))
        self.mayReturnEmpty = True
        self.errmsg = "Expected " + self.name

        self.minLen = min

        if max > 0:
            self.maxLen = max
        else:
            self.maxLen = _MAX_INT

        if exact > 0:
            self.maxLen = exact
            self.minLen = exact

    def parseImpl( self, instring, loc, doActions=True ):
        if instring[loc] not in self.matchWhite:
            raise ParseException(instring, loc, self.errmsg, self)
        start = loc
        loc += 1
        maxloc = start + self.maxLen
        maxloc = min( maxloc, len(instring) )
        while loc < maxloc and instring[loc] in self.matchWhite:
            loc += 1

        if loc - start < self.minLen:
            raise ParseException(instring, loc, self.errmsg, self)

        return loc, instring[start:loc]


class _PositionToken(Token):
    def __init__( self ):
        super(_PositionToken,self).__init__()
        self.name=self.__class__.__name__
        self.mayReturnEmpty = True
        self.mayIndexError = False

class GoToColumn(_PositionToken):
    """
    Token to advance to a specific column of input text; useful for tabular report scraping.
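
    Illustrative sketch (not from the original pyparsing docs; the fixed-width
    layout is hypothetical, with the second field starting in column 11)::

        row = Word(alphas) + GoToColumn(11).suppress() + Word(alphas)
        row.parseString("NAME      VALUE")   # -> ['NAME', 'VALUE']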
    """
    def __init__( self, colno ):
        super(GoToColumn,self).__init__()
        self.col = colno

    def preParse( self, instring, loc ):
        if col(loc,instring) != self.col:
            instrlen = len(instring)
            if self.ignoreExprs:
                loc = self._skipIgnorables( instring, loc )
            while loc < instrlen and instring[loc].isspace() and col( loc, instring ) != self.col :
                loc += 1
        return loc

    def parseImpl( self, instring, loc, doActions=True ):
        thiscol = col( loc, instring )
        if thiscol > self.col:
            raise ParseException( instring, loc, "Text not in expected column", self )
        newloc = loc + self.col - thiscol
        ret = instring[ loc: newloc ]
        return newloc, ret


class LineStart(_PositionToken):
    """
    Matches if current position is at the beginning of a line within the parse string
    
    Example::
    
        test = '''\
        AAA this line
        AAA and this line
          AAA but not this one
        B AAA and definitely not this one
        '''

        for t in (LineStart() + 'AAA' + restOfLine).searchString(test):
            print(t)
    
    Prints::
        ['AAA', ' this line']
        ['AAA', ' and this line']    

    """
    def __init__( self ):
        super(LineStart,self).__init__()
        self.errmsg = "Expected start of line"

    def parseImpl( self, instring, loc, doActions=True ):
        if col(loc, instring) == 1:
            return loc, []
        raise ParseException(instring, loc, self.errmsg, self)

class LineEnd(_PositionToken):
    """
    Matches if current position is at the end of a line within the parse string
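
    Illustrative sketch (not from the original pyparsing docs), consuming an
    explicit end-of-line after each item::

        line_item = Word(alphas) + LineEnd().suppress()
        OneOrMore(line_item).parseString("abc\\ndef\\n")   # -> ['abc', 'def']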
    """
    def __init__( self ):
        super(LineEnd,self).__init__()
        self.setWhitespaceChars( ParserElement.DEFAULT_WHITE_CHARS.replace("\n","") )
        self.errmsg = "Expected end of line"

    def parseImpl( self, instring, loc, doActions=True ):
        if loc<len(instring):
            if instring[loc] == "\n":
                return loc+1, "\n"
            else:
                raise ParseException(instring, loc, self.errmsg, self)
        elif loc == len(instring):
            return loc+1, []
        else:
            raise ParseException(instring, loc, self.errmsg, self)

class StringStart(_PositionToken):
    """
    Matches if current position is at the beginning of the parse string
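
    Illustrative sketch (not from the original pyparsing docs)::

        expr = StringStart() + Word(nums)
        expr.parseString("123 abc")    # -> ['123']
        expr.searchString("abc 123")   # -> [] - the number is not at the start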
    """
    def __init__( self ):
        super(StringStart,self).__init__()
        self.errmsg = "Expected start of text"

    def parseImpl( self, instring, loc, doActions=True ):
        if loc != 0:
            # see if entire string up to here is just whitespace and ignoreables
            if loc != self.preParse( instring, 0 ):
                raise ParseException(instring, loc, self.errmsg, self)
        return loc, []

class StringEnd(_PositionToken):
    """
    Matches if current position is at the end of the parse string
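
    Illustrative sketch (not from the original pyparsing docs)::

        expr = Word(nums) + StringEnd()
        expr.parseString("123")       # -> ['123']
        expr.parseString("123 abc")   # -> Exception: Expected end of text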
    """
    def __init__( self ):
        super(StringEnd,self).__init__()
        self.errmsg = "Expected end of text"

    def parseImpl( self, instring, loc, doActions=True ):
        if loc < len(instring):
            raise ParseException(instring, loc, self.errmsg, self)
        elif loc == len(instring):
            return loc+1, []
        elif loc > len(instring):
            return loc, []
        else:
            raise ParseException(instring, loc, self.errmsg, self)

class WordStart(_PositionToken):
    """
    Matches if the current position is at the beginning of a Word, and
    is not preceded by any character in a given set of C{wordChars}
    (default=C{printables}). To emulate the C{\\b} behavior of regular expressions,
    use C{WordStart(alphanums)}. C{WordStart} will also match at the beginning of
    the string being parsed, or at the beginning of a line.
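
    Illustrative sketch (not from the original pyparsing docs), matching 'cap'
    only where it begins a word::

        expr = WordStart(alphas) + Literal("cap")
        print(expr.searchString("cap handicap capital"))
        # -> [['cap'], ['cap']] - the embedded 'cap' in 'handicap' is skipped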
    """
    def __init__(self, wordChars = printables):
        super(WordStart,self).__init__()
        self.wordChars = set(wordChars)
        self.errmsg = "Not at the start of a word"

    def parseImpl(self, instring, loc, doActions=True ):
        if loc != 0:
            if (instring[loc-1] in self.wordChars or
                instring[loc] not in self.wordChars):
                raise ParseException(instring, loc, self.errmsg, self)
        return loc, []

class WordEnd(_PositionToken):
    """
    Matches if the current position is at the end of a Word, and
    is not followed by any character in a given set of C{wordChars}
    (default=C{printables}). To emulate the C{\\b} behavior of regular expressions,
    use C{WordEnd(alphanums)}. C{WordEnd} will also match at the end of
    the string being parsed, or at the end of a line.
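
    Illustrative sketch (not from the original pyparsing docs), matching 'cap'
    only where it ends a word::

        expr = Literal("cap") + WordEnd(alphas)
        print(expr.searchString("cap handicap capital"))
        # -> [['cap'], ['cap']] - the leading 'cap' of 'capital' is skipped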
    """
    def __init__(self, wordChars = printables):
        super(WordEnd,self).__init__()
        self.wordChars = set(wordChars)
        self.skipWhitespace = False
        self.errmsg = "Not at the end of a word"

    def parseImpl(self, instring, loc, doActions=True ):
        instrlen = len(instring)
        if instrlen>0 and loc<instrlen:
            if (instring[loc] in self.wordChars or
                instring[loc-1] not in self.wordChars):
                raise ParseException(instring, loc, self.errmsg, self)
        return loc, []


class ParseExpression(ParserElement):
    """
    Abstract subclass of ParserElement, for combining and post-processing parsed tokens.
    """
    def __init__( self, exprs, savelist = False ):
        super(ParseExpression,self).__init__(savelist)
        if isinstance( exprs, _generatorType ):
            exprs = list(exprs)

        if isinstance( exprs, basestring ):
            self.exprs = [ ParserElement._literalStringClass( exprs ) ]
        elif isinstance( exprs, collections.Iterable ):
            exprs = list(exprs)
            # if sequence of strings provided, wrap with Literal
            if all(isinstance(expr, basestring) for expr in exprs):
                exprs = map(ParserElement._literalStringClass, exprs)
            self.exprs = list(exprs)
        else:
            try:
                self.exprs = list( exprs )
            except TypeError:
                self.exprs = [ exprs ]
        self.callPreparse = False

    def __getitem__( self, i ):
        return self.exprs[i]

    def append( self, other ):
        self.exprs.append( other )
        self.strRepr = None
        return self

    def leaveWhitespace( self ):
        """Extends C{leaveWhitespace} defined in base class, and also invokes C{leaveWhitespace} on
           all contained expressions."""
        self.skipWhitespace = False
        self.exprs = [ e.copy() for e in self.exprs ]
        for e in self.exprs:
            e.leaveWhitespace()
        return self

    def ignore( self, other ):
        if isinstance( other, Suppress ):
            if other not in self.ignoreExprs:
                super( ParseExpression, self).ignore( other )
                for e in self.exprs:
                    e.ignore( self.ignoreExprs[-1] )
        else:
            super( ParseExpression, self).ignore( other )
            for e in self.exprs:
                e.ignore( self.ignoreExprs[-1] )
        return self

    def __str__( self ):
        try:
            return super(ParseExpression,self).__str__()
        except Exception:
            pass

        if self.strRepr is None:
            self.strRepr = "%s:(%s)" % ( self.__class__.__name__, _ustr(self.exprs) )
        return self.strRepr

    def streamline( self ):
        super(ParseExpression,self).streamline()

        for e in self.exprs:
            e.streamline()

        # collapse nested And's of the form And( And( And( a,b), c), d) to And( a,b,c,d )
        # but only if there are no parse actions or resultsNames on the nested And's
        # (likewise for Or's and MatchFirst's)
        if ( len(self.exprs) == 2 ):
            other = self.exprs[0]
            if ( isinstance( other, self.__class__ ) and
                  not(other.parseAction) and
                  other.resultsName is None and
                  not other.debug ):
                self.exprs = other.exprs[:] + [ self.exprs[1] ]
                self.strRepr = None
                self.mayReturnEmpty |= other.mayReturnEmpty
                self.mayIndexError  |= other.mayIndexError

            other = self.exprs[-1]
            if ( isinstance( other, self.__class__ ) and
                  not(other.parseAction) and
                  other.resultsName is None and
                  not other.debug ):
                self.exprs = self.exprs[:-1] + other.exprs[:]
                self.strRepr = None
                self.mayReturnEmpty |= other.mayReturnEmpty
                self.mayIndexError  |= other.mayIndexError

        self.errmsg = "Expected " + _ustr(self)
        
        return self

    def setResultsName( self, name, listAllMatches=False ):
        ret = super(ParseExpression,self).setResultsName(name,listAllMatches)
        return ret

    def validate( self, validateTrace=[] ):
        tmp = validateTrace[:]+[self]
        for e in self.exprs:
            e.validate(tmp)
        self.checkRecursion( [] )
        
    def copy(self):
        ret = super(ParseExpression,self).copy()
        ret.exprs = [e.copy() for e in self.exprs]
        return ret

class And(ParseExpression):
    """
    Requires all given C{ParseExpression}s to be found in the given order.
    Expressions may be separated by whitespace.
    May be constructed using the C{'+'} operator.
    May also be constructed using the C{'-'} operator, which will suppress backtracking.

    Example::
        integer = Word(nums)
        name_expr = OneOrMore(Word(alphas))

        expr = And([integer("id"),name_expr("name"),integer("age")])
        # more easily written as:
        expr = integer("id") + name_expr("name") + integer("age")
    """

    class _ErrorStop(Empty):
        def __init__(self, *args, **kwargs):
            super(And._ErrorStop,self).__init__(*args, **kwargs)
            self.name = '-'
            self.leaveWhitespace()

    def __init__( self, exprs, savelist = True ):
        super(And,self).__init__(exprs, savelist)
        self.mayReturnEmpty = all(e.mayReturnEmpty for e in self.exprs)
        self.setWhitespaceChars( self.exprs[0].whiteChars )
        self.skipWhitespace = self.exprs[0].skipWhitespace
        self.callPreparse = True

    def parseImpl( self, instring, loc, doActions=True ):
        # pass False as last arg to _parse for first element, since we already
        # pre-parsed the string as part of our And pre-parsing
        loc, resultlist = self.exprs[0]._parse( instring, loc, doActions, callPreParse=False )
        errorStop = False
        for e in self.exprs[1:]:
            if isinstance(e, And._ErrorStop):
                errorStop = True
                continue
            if errorStop:
                try:
                    loc, exprtokens = e._parse( instring, loc, doActions )
                except ParseSyntaxException:
                    raise
                except ParseBaseException as pe:
                    pe.__traceback__ = None
                    raise ParseSyntaxException._from_exception(pe)
                except IndexError:
                    raise ParseSyntaxException(instring, len(instring), self.errmsg, self)
            else:
                loc, exprtokens = e._parse( instring, loc, doActions )
            if exprtokens or exprtokens.haskeys():
                resultlist += exprtokens
        return loc, resultlist

    def __iadd__(self, other ):
        if isinstance( other, basestring ):
            other = ParserElement._literalStringClass( other )
        return self.append( other ) #And( [ self, other ] )

    def checkRecursion( self, parseElementList ):
        subRecCheckList = parseElementList[:] + [ self ]
        for e in self.exprs:
            e.checkRecursion( subRecCheckList )
            if not e.mayReturnEmpty:
                break

    def __str__( self ):
        if hasattr(self,"name"):
            return self.name

        if self.strRepr is None:
            self.strRepr = "{" + " ".join(_ustr(e) for e in self.exprs) + "}"

        return self.strRepr


class Or(ParseExpression):
    """
    Requires that at least one C{ParseExpression} is found.
    If two expressions match, the expression that matches the longest string will be used.
    May be constructed using the C{'^'} operator.

    Example::
        # construct Or using '^' operator
        
        number = Word(nums) ^ Combine(Word(nums) + '.' + Word(nums))
        print(number.searchString("123 3.1416 789"))
    prints::
        [['123'], ['3.1416'], ['789']]
    """
    def __init__( self, exprs, savelist = False ):
        super(Or,self).__init__(exprs, savelist)
        if self.exprs:
            self.mayReturnEmpty = any(e.mayReturnEmpty for e in self.exprs)
        else:
            self.mayReturnEmpty = True

    def parseImpl( self, instring, loc, doActions=True ):
        maxExcLoc = -1
        maxException = None
        matches = []
        for e in self.exprs:
            try:
                loc2 = e.tryParse( instring, loc )
            except ParseException as err:
                err.__traceback__ = None
                if err.loc > maxExcLoc:
                    maxException = err
                    maxExcLoc = err.loc
            except IndexError:
                if len(instring) > maxExcLoc:
                    maxException = ParseException(instring,len(instring),e.errmsg,self)
                    maxExcLoc = len(instring)
            else:
                # save match among all matches, to retry longest to shortest
                matches.append((loc2, e))

        if matches:
            matches.sort(key=lambda x: -x[0])
            for _,e in matches:
                try:
                    return e._parse( instring, loc, doActions )
                except ParseException as err:
                    err.__traceback__ = None
                    if err.loc > maxExcLoc:
                        maxException = err
                        maxExcLoc = err.loc

        if maxException is not None:
            maxException.msg = self.errmsg
            raise maxException
        else:
            raise ParseException(instring, loc, "no defined alternatives to match", self)


    def __ixor__(self, other ):
        if isinstance( other, basestring ):
            other = ParserElement._literalStringClass( other )
        return self.append( other ) #Or( [ self, other ] )

    def __str__( self ):
        if hasattr(self,"name"):
            return self.name

        if self.strRepr is None:
            self.strRepr = "{" + " ^ ".join(_ustr(e) for e in self.exprs) + "}"

        return self.strRepr

    def checkRecursion( self, parseElementList ):
        subRecCheckList = parseElementList[:] + [ self ]
        for e in self.exprs:
            e.checkRecursion( subRecCheckList )


class MatchFirst(ParseExpression):
    """
    Requires that at least one C{ParseExpression} is found.
    If two expressions match, the first one listed is the one that will match.
    May be constructed using the C{'|'} operator.

    Example::
        # construct MatchFirst using '|' operator
        
        # watch the order of expressions to match
        number = Word(nums) | Combine(Word(nums) + '.' + Word(nums))
        print(number.searchString("123 3.1416 789")) #  Fail! -> [['123'], ['3'], ['1416'], ['789']]

        # put more selective expression first
        number = Combine(Word(nums) + '.' + Word(nums)) | Word(nums)
        print(number.searchString("123 3.1416 789")) #  Better -> [['123'], ['3.1416'], ['789']]
    """
    def __init__( self, exprs, savelist = False ):
        super(MatchFirst,self).__init__(exprs, savelist)
        if self.exprs:
            self.mayReturnEmpty = any(e.mayReturnEmpty for e in self.exprs)
        else:
            self.mayReturnEmpty = True

    def parseImpl( self, instring, loc, doActions=True ):
        maxExcLoc = -1
        maxException = None
        for e in self.exprs:
            try:
                ret = e._parse( instring, loc, doActions )
                return ret
            except ParseException as err:
                if err.loc > maxExcLoc:
                    maxException = err
                    maxExcLoc = err.loc
            except IndexError:
                if len(instring) > maxExcLoc:
                    maxException = ParseException(instring,len(instring),e.errmsg,self)
                    maxExcLoc = len(instring)

        # only got here if no expression matched, raise exception for match that made it the furthest
        else:
            if maxException is not None:
                maxException.msg = self.errmsg
                raise maxException
            else:
                raise ParseException(instring, loc, "no defined alternatives to match", self)

    def __ior__(self, other ):
        if isinstance( other, basestring ):
            other = ParserElement._literalStringClass( other )
        return self.append( other ) #MatchFirst( [ self, other ] )

    def __str__( self ):
        if hasattr(self,"name"):
            return self.name

        if self.strRepr is None:
            self.strRepr = "{" + " | ".join(_ustr(e) for e in self.exprs) + "}"

        return self.strRepr

    def checkRecursion( self, parseElementList ):
        subRecCheckList = parseElementList[:] + [ self ]
        for e in self.exprs:
            e.checkRecursion( subRecCheckList )


class Each(ParseExpression):
    """
    Requires all given C{ParseExpression}s to be found, but in any order.
    Expressions may be separated by whitespace.
    May be constructed using the C{'&'} operator.

    Example::
        color = oneOf("RED ORANGE YELLOW GREEN BLUE PURPLE BLACK WHITE BROWN")
        shape_type = oneOf("SQUARE CIRCLE TRIANGLE STAR HEXAGON OCTAGON")
        integer = Word(nums)
        shape_attr = "shape:" + shape_type("shape")
        posn_attr = "posn:" + Group(integer("x") + ',' + integer("y"))("posn")
        color_attr = "color:" + color("color")
        size_attr = "size:" + integer("size")

        # use Each (using operator '&') to accept attributes in any order 
        # (shape and posn are required, color and size are optional)
        shape_spec = shape_attr & posn_attr & Optional(color_attr) & Optional(size_attr)

        shape_spec.runTests('''
            shape: SQUARE color: BLACK posn: 100, 120
            shape: CIRCLE size: 50 color: BLUE posn: 50,80
            color:GREEN size:20 shape:TRIANGLE posn:20,40
            '''
            )
    prints::
        shape: SQUARE color: BLACK posn: 100, 120
        ['shape:', 'SQUARE', 'color:', 'BLACK', 'posn:', ['100', ',', '120']]
        - color: BLACK
        - posn: ['100', ',', '120']
          - x: 100
          - y: 120
        - shape: SQUARE


        shape: CIRCLE size: 50 color: BLUE posn: 50,80
        ['shape:', 'CIRCLE', 'size:', '50', 'color:', 'BLUE', 'posn:', ['50', ',', '80']]
        - color: BLUE
        - posn: ['50', ',', '80']
          - x: 50
          - y: 80
        - shape: CIRCLE
        - size: 50


        color: GREEN size: 20 shape: TRIANGLE posn: 20,40
        ['color:', 'GREEN', 'size:', '20', 'shape:', 'TRIANGLE', 'posn:', ['20', ',', '40']]
        - color: GREEN
        - posn: ['20', ',', '40']
          - x: 20
          - y: 40
        - shape: TRIANGLE
        - size: 20
    """
    def __init__( self, exprs, savelist = True ):
        super(Each,self).__init__(exprs, savelist)
        self.mayReturnEmpty = all(e.mayReturnEmpty for e in self.exprs)
        self.skipWhitespace = True
        self.initExprGroups = True

    def parseImpl( self, instring, loc, doActions=True ):
        if self.initExprGroups:
            self.opt1map = dict((id(e.expr),e) for e in self.exprs if isinstance(e,Optional))
            opt1 = [ e.expr for e in self.exprs if isinstance(e,Optional) ]
            opt2 = [ e for e in self.exprs if e.mayReturnEmpty and not isinstance(e,Optional)]
            self.optionals = opt1 + opt2
            self.multioptionals = [ e.expr for e in self.exprs if isinstance(e,ZeroOrMore) ]
            self.multirequired = [ e.expr for e in self.exprs if isinstance(e,OneOrMore) ]
            self.required = [ e for e in self.exprs if not isinstance(e,(Optional,ZeroOrMore,OneOrMore)) ]
            self.required += self.multirequired
            self.initExprGroups = False
        tmpLoc = loc
        tmpReqd = self.required[:]
        tmpOpt  = self.optionals[:]
        matchOrder = []

        keepMatching = True
        while keepMatching:
            tmpExprs = tmpReqd + tmpOpt + self.multioptionals + self.multirequired
            failed = []
            for e in tmpExprs:
                try:
                    tmpLoc = e.tryParse( instring, tmpLoc )
                except ParseException:
                    failed.append(e)
                else:
                    matchOrder.append(self.opt1map.get(id(e),e))
                    if e in tmpReqd:
                        tmpReqd.remove(e)
                    elif e in tmpOpt:
                        tmpOpt.remove(e)
            if len(failed) == len(tmpExprs):
                keepMatching = False

        if tmpReqd:
            missing = ", ".join(_ustr(e) for e in tmpReqd)
            raise ParseException(instring,loc,"Missing one or more required elements (%s)" % missing )

        # add any unmatched Optionals, in case they have default values defined
        matchOrder += [e for e in self.exprs if isinstance(e,Optional) and e.expr in tmpOpt]

        resultlist = []
        for e in matchOrder:
            loc,results = e._parse(instring,loc,doActions)
            resultlist.append(results)

        finalResults = sum(resultlist, ParseResults([]))
        return loc, finalResults

    def __str__( self ):
        if hasattr(self,"name"):
            return self.name

        if self.strRepr is None:
            self.strRepr = "{" + " & ".join(_ustr(e) for e in self.exprs) + "}"

        return self.strRepr

    def checkRecursion( self, parseElementList ):
        subRecCheckList = parseElementList[:] + [ self ]
        for e in self.exprs:
            e.checkRecursion( subRecCheckList )


class ParseElementEnhance(ParserElement):
    """
    Abstract subclass of C{ParserElement}, for combining and post-processing parsed tokens.
    """
    def __init__( self, expr, savelist=False ):
        super(ParseElementEnhance,self).__init__(savelist)
        if isinstance( expr, basestring ):
            if issubclass(ParserElement._literalStringClass, Token):
                expr = ParserElement._literalStringClass(expr)
            else:
                expr = ParserElement._literalStringClass(Literal(expr))
        self.expr = expr
        self.strRepr = None
        if expr is not None:
            self.mayIndexError = expr.mayIndexError
            self.mayReturnEmpty = expr.mayReturnEmpty
            self.setWhitespaceChars( expr.whiteChars )
            self.skipWhitespace = expr.skipWhitespace
            self.saveAsList = expr.saveAsList
            self.callPreparse = expr.callPreparse
            self.ignoreExprs.extend(expr.ignoreExprs)

    def parseImpl( self, instring, loc, doActions=True ):
        if self.expr is not None:
            return self.expr._parse( instring, loc, doActions, callPreParse=False )
        else:
            raise ParseException("",loc,self.errmsg,self)

    def leaveWhitespace( self ):
        self.skipWhitespace = False
        if self.expr is not None:
            # copy the contained expression so that shared instances are not modified
            self.expr = self.expr.copy()
            self.expr.leaveWhitespace()
        return self

    def ignore( self, other ):
        if isinstance( other, Suppress ):
            if other not in self.ignoreExprs:
                super( ParseElementEnhance, self).ignore( other )
                if self.expr is not None:
                    self.expr.ignore( self.ignoreExprs[-1] )
        else:
            super( ParseElementEnhance, self).ignore( other )
            if self.expr is not None:
                self.expr.ignore( self.ignoreExprs[-1] )
        return self

    def streamline( self ):
        super(ParseElementEnhance,self).streamline()
        if self.expr is not None:
            self.expr.streamline()
        return self

    def checkRecursion( self, parseElementList ):
        if self in parseElementList:
            raise RecursiveGrammarException( parseElementList+[self] )
        subRecCheckList = parseElementList[:] + [ self ]
        if self.expr is not None:
            self.expr.checkRecursion( subRecCheckList )

    def validate( self, validateTrace=[] ):
        tmp = validateTrace[:]+[self]
        if self.expr is not None:
            self.expr.validate(tmp)
        self.checkRecursion( [] )

    def __str__( self ):
        try:
            return super(ParseElementEnhance,self).__str__()
        except Exception:
            pass

        if self.strRepr is None and self.expr is not None:
            self.strRepr = "%s:(%s)" % ( self.__class__.__name__, _ustr(self.expr) )
        return self.strRepr


class FollowedBy(ParseElementEnhance):
    """
    Lookahead matching of the given parse expression.  C{FollowedBy}
    does I{not} advance the parsing position within the input string, it only
    verifies that the specified parse expression matches at the current
    position.  C{FollowedBy} always returns a null token list.

    Example::
        # use FollowedBy to match a label only if it is followed by a ':'
        data_word = Word(alphas)
        label = data_word + FollowedBy(':')
        attr_expr = Group(label + Suppress(':') + OneOrMore(data_word, stopOn=label).setParseAction(' '.join))
        
        OneOrMore(attr_expr).parseString("shape: SQUARE color: BLACK posn: upper left").pprint()
    prints::
        [['shape', 'SQUARE'], ['color', 'BLACK'], ['posn', 'upper left']]
    """
    def __init__( self, expr ):
        super(FollowedBy,self).__init__(expr)
        self.mayReturnEmpty = True

    def parseImpl( self, instring, loc, doActions=True ):
        self.expr.tryParse( instring, loc )
        return loc, []


class NotAny(ParseElementEnhance):
    """
    Lookahead to disallow matching with the given parse expression.  C{NotAny}
    does I{not} advance the parsing position within the input string, it only
    verifies that the specified parse expression does I{not} match at the current
    position.  Also, C{NotAny} does I{not} skip over leading whitespace. C{NotAny}
    always returns a null token list.  May be constructed using the '~' operator.

    Example::
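        # a minimal illustrative sketch - reject reserved keywords when matching identifiers
        AND, OR, NOT = map(Keyword, "AND OR NOT".split())
        keyword = AND | OR | NOT
        ident = ~keyword + Word(alphas)

        print(ident.parseString("color"))   # -> ['color']
        # ident.parseString("AND") raises ParseException ("Found unwanted token, ...")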
        
    """
    def __init__( self, expr ):
        super(NotAny,self).__init__(expr)
        #~ self.leaveWhitespace()
        self.skipWhitespace = False  # do NOT use self.leaveWhitespace(), don't want to propagate to exprs
        self.mayReturnEmpty = True
        self.errmsg = "Found unwanted token, "+_ustr(self.expr)

    def parseImpl( self, instring, loc, doActions=True ):
        if self.expr.canParseNext(instring, loc):
            raise ParseException(instring, loc, self.errmsg, self)
        return loc, []

    def __str__( self ):
        if hasattr(self,"name"):
            return self.name

        if self.strRepr is None:
            self.strRepr = "~{" + _ustr(self.expr) + "}"

        return self.strRepr

class _MultipleMatch(ParseElementEnhance):
    def __init__( self, expr, stopOn=None):
        super(_MultipleMatch, self).__init__(expr)
        self.saveAsList = True
        ender = stopOn
        if isinstance(ender, basestring):
            ender = ParserElement._literalStringClass(ender)
        self.not_ender = ~ender if ender is not None else None

    def parseImpl( self, instring, loc, doActions=True ):
        self_expr_parse = self.expr._parse
        self_skip_ignorables = self._skipIgnorables
        check_ender = self.not_ender is not None
        if check_ender:
            try_not_ender = self.not_ender.tryParse
        
        # must be at least one (but first see if we are the stopOn sentinel;
        # if so, fail)
        if check_ender:
            try_not_ender(instring, loc)
        loc, tokens = self_expr_parse( instring, loc, doActions, callPreParse=False )
        try:
            hasIgnoreExprs = bool(self.ignoreExprs)
            while 1:
                if check_ender:
                    try_not_ender(instring, loc)
                if hasIgnoreExprs:
                    preloc = self_skip_ignorables( instring, loc )
                else:
                    preloc = loc
                loc, tmptokens = self_expr_parse( instring, preloc, doActions )
                if tmptokens or tmptokens.haskeys():
                    tokens += tmptokens
        except (ParseException,IndexError):
            pass

        return loc, tokens
        
class OneOrMore(_MultipleMatch):
    """
    Repetition of one or more of the given expression.
    
    Parameters:
     - expr - expression that must match one or more times
     - stopOn - (default=C{None}) - expression for a terminating sentinel
          (only required if the sentinel would ordinarily match the repetition 
          expression)          

    Example::
        data_word = Word(alphas)
        label = data_word + FollowedBy(':')
        attr_expr = Group(label + Suppress(':') + OneOrMore(data_word).setParseAction(' '.join))

        text = "shape: SQUARE posn: upper left color: BLACK"
        OneOrMore(attr_expr).parseString(text).pprint()  # Fail! read 'color' as data instead of next label -> [['shape', 'SQUARE color']]

        # use stopOn attribute for OneOrMore to avoid reading label string as part of the data
        attr_expr = Group(label + Suppress(':') + OneOrMore(data_word, stopOn=label).setParseAction(' '.join))
        OneOrMore(attr_expr).parseString(text).pprint() # Better -> [['shape', 'SQUARE'], ['posn', 'upper left'], ['color', 'BLACK']]
        
        # could also be written as
        (attr_expr * (1,)).parseString(text).pprint()
    """

    def __str__( self ):
        if hasattr(self,"name"):
            return self.name

        if self.strRepr is None:
            self.strRepr = "{" + _ustr(self.expr) + "}..."

        return self.strRepr

class ZeroOrMore(_MultipleMatch):
    """
    Optional repetition of zero or more of the given expression.
    
    Parameters:
     - expr - expression that must match zero or more times
     - stopOn - (default=C{None}) - expression for a terminating sentinel
          (only required if the sentinel would ordinarily match the repetition 
          expression)          

    Example: similar to L{OneOrMore}
    """
    def __init__( self, expr, stopOn=None):
        super(ZeroOrMore,self).__init__(expr, stopOn=stopOn)
        self.mayReturnEmpty = True
        
    def parseImpl( self, instring, loc, doActions=True ):
        try:
            return super(ZeroOrMore, self).parseImpl(instring, loc, doActions)
        except (ParseException,IndexError):
            return loc, []

    def __str__( self ):
        if hasattr(self,"name"):
            return self.name

        if self.strRepr is None:
            self.strRepr = "[" + _ustr(self.expr) + "]..."

        return self.strRepr

class _NullToken(object):
    def __bool__(self):
        return False
    __nonzero__ = __bool__
    def __str__(self):
        return ""

_optionalNotMatched = _NullToken()
class Optional(ParseElementEnhance):
    """
    Optional matching of the given expression.

    Parameters:
     - expr - expression to be matched; may be absent from the input (matches at most once)
     - default (optional) - value to be returned if the optional expression is not found.

    Example::
        # US postal code can be a 5-digit zip, plus optional 4-digit qualifier
        zip = Combine(Word(nums, exact=5) + Optional('-' + Word(nums, exact=4)))
        zip.runTests('''
            # traditional ZIP code
            12345
            
            # ZIP+4 form
            12101-0001
            
            # invalid ZIP
            98765-
            ''')
    prints::
        # traditional ZIP code
        12345
        ['12345']

        # ZIP+4 form
        12101-0001
        ['12101-0001']

        # invalid ZIP
        98765-
             ^
        FAIL: Expected end of text (at char 5), (line:1, col:6)
    """
    def __init__( self, expr, default=_optionalNotMatched ):
        super(Optional,self).__init__( expr, savelist=False )
        self.saveAsList = self.expr.saveAsList
        self.defaultValue = default
        self.mayReturnEmpty = True

    def parseImpl( self, instring, loc, doActions=True ):
        try:
            loc, tokens = self.expr._parse( instring, loc, doActions, callPreParse=False )
        except (ParseException,IndexError):
            if self.defaultValue is not _optionalNotMatched:
                if self.expr.resultsName:
                    tokens = ParseResults([ self.defaultValue ])
                    tokens[self.expr.resultsName] = self.defaultValue
                else:
                    tokens = [ self.defaultValue ]
            else:
                tokens = []
        return loc, tokens

    def __str__( self ):
        if hasattr(self,"name"):
            return self.name

        if self.strRepr is None:
            self.strRepr = "[" + _ustr(self.expr) + "]"

        return self.strRepr

class SkipTo(ParseElementEnhance):
    """
    Token for skipping over all undefined text until the matched expression is found.

    Parameters:
     - expr - target expression marking the end of the data to be skipped
     - include - (default=C{False}) if True, the target expression is also parsed 
          (the skipped text and target expression are returned as a 2-element list).
     - ignore - (default=C{None}) used to define grammars (typically quoted strings and 
          comments) that might contain false matches to the target expression
     - failOn - (default=C{None}) define expressions that are not allowed to be 
          included in the skipped text; if found before the target expression is found, 
          the SkipTo is not a match

    Example::
        report = '''
            Outstanding Issues Report - 1 Jan 2000

               # | Severity | Description                               |  Days Open
            -----+----------+-------------------------------------------+-----------
             101 | Critical | Intermittent system crash                 |          6
              94 | Cosmetic | Spelling error on Login ('log|n')         |         14
              79 | Minor    | System slow when running too many reports |         47
            '''
        integer = Word(nums)
        SEP = Suppress('|')
        # use SkipTo to simply match everything up until the next SEP
        # - ignore quoted strings, so that a '|' character inside a quoted string does not match
        # - parse action will call token.strip() for each matched token, i.e., the description body
        string_data = SkipTo(SEP, ignore=quotedString)
        string_data.setParseAction(tokenMap(str.strip))
        ticket_expr = (integer("issue_num") + SEP 
                      + string_data("sev") + SEP 
                      + string_data("desc") + SEP 
                      + integer("days_open"))
        
        for tkt in ticket_expr.searchString(report):
            print(tkt.dump())
    prints::
        ['101', 'Critical', 'Intermittent system crash', '6']
        - days_open: 6
        - desc: Intermittent system crash
        - issue_num: 101
        - sev: Critical
        ['94', 'Cosmetic', "Spelling error on Login ('log|n')", '14']
        - days_open: 14
        - desc: Spelling error on Login ('log|n')
        - issue_num: 94
        - sev: Cosmetic
        ['79', 'Minor', 'System slow when running too many reports', '47']
        - days_open: 47
        - desc: System slow when running too many reports
        - issue_num: 79
        - sev: Minor
    """
    def __init__( self, other, include=False, ignore=None, failOn=None ):
        super( SkipTo, self ).__init__( other )
        self.ignoreExpr = ignore
        self.mayReturnEmpty = True
        self.mayIndexError = False
        self.includeMatch = include
        self.asList = False
        if isinstance(failOn, basestring):
            self.failOn = ParserElement._literalStringClass(failOn)
        else:
            self.failOn = failOn
        self.errmsg = "No match found for "+_ustr(self.expr)

    def parseImpl( self, instring, loc, doActions=True ):
        startloc = loc
        instrlen = len(instring)
        expr = self.expr
        expr_parse = self.expr._parse
        self_failOn_canParseNext = self.failOn.canParseNext if self.failOn is not None else None
        self_ignoreExpr_tryParse = self.ignoreExpr.tryParse if self.ignoreExpr is not None else None
        
        tmploc = loc
        while tmploc <= instrlen:
            if self_failOn_canParseNext is not None:
                # break if failOn expression matches
                if self_failOn_canParseNext(instring, tmploc):
                    break
                    
            if self_ignoreExpr_tryParse is not None:
                # advance past ignore expressions
                while 1:
                    try:
                        tmploc = self_ignoreExpr_tryParse(instring, tmploc)
                    except ParseBaseException:
                        break
            
            try:
                expr_parse(instring, tmploc, doActions=False, callPreParse=False)
            except (ParseException, IndexError):
                # no match, advance loc in string
                tmploc += 1
            else:
                # matched skipto expr, done
                break

        else:
            # ran off the end of the input string without matching skipto expr, fail
            raise ParseException(instring, loc, self.errmsg, self)

        # build up return values
        loc = tmploc
        skiptext = instring[startloc:loc]
        skipresult = ParseResults(skiptext)
        
        if self.includeMatch:
            loc, mat = expr_parse(instring,loc,doActions,callPreParse=False)
            skipresult += mat

        return loc, skipresult

class Forward(ParseElementEnhance):
    """
    Forward declaration of an expression to be defined later -
    used for recursive grammars, such as algebraic infix notation.
    When the expression is known, it is assigned to the C{Forward} variable using the '<<' operator.

    Note: take care when assigning to C{Forward} not to overlook precedence of operators.
    Specifically, '|' has a lower precedence than '<<', so that::
        fwdExpr << a | b | c
    will actually be evaluated as::
        (fwdExpr << a) | b | c
    thereby leaving b and c out as parseable alternatives.  It is recommended that you
    explicitly group the values inserted into the C{Forward}::
        fwdExpr << (a | b | c)
    Converting to use the '<<=' operator instead will avoid this problem.

    See L{ParseResults.pprint} for an example of a recursive parser created using
    C{Forward}.
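
    Example (an illustrative sketch of a recursive grammar; names are arbitrary)::
        # matches nested parenthesized groups of words
        LPAR, RPAR = map(Suppress, "()")
        nested = Forward()
        nested <<= Group(LPAR + ZeroOrMore(Word(alphas) | nested) + RPAR)
        print(nested.parseString("(a (b c) d)"))  # -> [['a', ['b', 'c'], 'd']]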
    """
    def __init__( self, other=None ):
        super(Forward,self).__init__( other, savelist=False )

    def __lshift__( self, other ):
        if isinstance( other, basestring ):
            other = ParserElement._literalStringClass(other)
        self.expr = other
        self.strRepr = None
        self.mayIndexError = self.expr.mayIndexError
        self.mayReturnEmpty = self.expr.mayReturnEmpty
        self.setWhitespaceChars( self.expr.whiteChars )
        self.skipWhitespace = self.expr.skipWhitespace
        self.saveAsList = self.expr.saveAsList
        self.ignoreExprs.extend(self.expr.ignoreExprs)
        return self
        
    def __ilshift__(self, other):
        return self << other
    
    def leaveWhitespace( self ):
        self.skipWhitespace = False
        return self

    def streamline( self ):
        if not self.streamlined:
            self.streamlined = True
            if self.expr is not None:
                self.expr.streamline()
        return self

    def validate( self, validateTrace=[] ):
        if self not in validateTrace:
            tmp = validateTrace[:]+[self]
            if self.expr is not None:
                self.expr.validate(tmp)
        self.checkRecursion([])

    def __str__( self ):
        if hasattr(self,"name"):
            return self.name
        return self.__class__.__name__ + ": ..."

        # stubbed out for now - creates awful memory and perf issues
        self._revertClass = self.__class__
        self.__class__ = _ForwardNoRecurse
        try:
            if self.expr is not None:
                retString = _ustr(self.expr)
            else:
                retString = "None"
        finally:
            self.__class__ = self._revertClass
        return self.__class__.__name__ + ": " + retString

    def copy(self):
        if self.expr is not None:
            return super(Forward,self).copy()
        else:
            ret = Forward()
            ret <<= self
            return ret

class _ForwardNoRecurse(Forward):
    def __str__( self ):
        return "..."

class TokenConverter(ParseElementEnhance):
    """
    Abstract subclass of C{ParseElementEnhance}, for converting parsed results.
    """
    def __init__( self, expr, savelist=False ):
        super(TokenConverter,self).__init__( expr )#, savelist )
        self.saveAsList = False

class Combine(TokenConverter):
    """
    Converter to concatenate all matching tokens to a single string.
    By default, the matching patterns must also be contiguous in the input string;
    this can be disabled by specifying C{'adjacent=False'} in the constructor.

    Example::
        real = Word(nums) + '.' + Word(nums)
        print(real.parseString('3.1416')) # -> ['3', '.', '1416']
        # will also erroneously match the following
        print(real.parseString('3. 1416')) # -> ['3', '.', '1416']

        real = Combine(Word(nums) + '.' + Word(nums))
        print(real.parseString('3.1416')) # -> ['3.1416']
        # no match when there are internal spaces
        print(real.parseString('3. 1416')) # -> Exception: Expected W:(0123...)
    """
    def __init__( self, expr, joinString="", adjacent=True ):
        super(Combine,self).__init__( expr )
        # suppress whitespace-stripping in contained parse expressions, but re-enable it on the Combine itself
        if adjacent:
            self.leaveWhitespace()
        self.adjacent = adjacent
        self.skipWhitespace = True
        self.joinString = joinString
        self.callPreparse = True

    def ignore( self, other ):
        if self.adjacent:
            ParserElement.ignore(self, other)
        else:
            super( Combine, self).ignore( other )
        return self

    def postParse( self, instring, loc, tokenlist ):
        retToks = tokenlist.copy()
        del retToks[:]
        retToks += ParseResults([ "".join(tokenlist._asStringList(self.joinString)) ], modal=self.modalResults)

        if self.resultsName and retToks.haskeys():
            return [ retToks ]
        else:
            return retToks

class Group(TokenConverter):
    """
    Converter to return the matched tokens as a list - useful for returning tokens of C{L{ZeroOrMore}} and C{L{OneOrMore}} expressions.

    Example::
        ident = Word(alphas)
        num = Word(nums)
        term = ident | num
        func = ident + Optional(delimitedList(term))
        print(func.parseString("fn a,b,100"))  # -> ['fn', 'a', 'b', '100']

        func = ident + Group(Optional(delimitedList(term)))
        print(func.parseString("fn a,b,100"))  # -> ['fn', ['a', 'b', '100']]
    """
    def __init__( self, expr ):
        super(Group,self).__init__( expr )
        self.saveAsList = True

    def postParse( self, instring, loc, tokenlist ):
        return [ tokenlist ]

class Dict(TokenConverter):
    """
    Converter to return a repetitive expression as a list, but also as a dictionary.
    Each element can also be referenced using the first token in the expression as its key.
    Useful for tabular report scraping when the first column can be used as an item key.

    Example::
        data_word = Word(alphas)
        label = data_word + FollowedBy(':')
        attr_expr = Group(label + Suppress(':') + OneOrMore(data_word).setParseAction(' '.join))

        text = "shape: SQUARE posn: upper left color: light blue texture: burlap"
        attr_expr = (label + Suppress(':') + OneOrMore(data_word, stopOn=label).setParseAction(' '.join))
        
        # print attributes as plain groups
        print(OneOrMore(attr_expr).parseString(text).dump())
        
        # instead of OneOrMore(expr), parse using Dict(OneOrMore(Group(expr))) - Dict will auto-assign names
        result = Dict(OneOrMore(Group(attr_expr))).parseString(text)
        print(result.dump())
        
        # access named fields as dict entries, or output as dict
        print(result['shape'])        
        print(result.asDict())
    prints::
        ['shape', 'SQUARE', 'posn', 'upper left', 'color', 'light blue', 'texture', 'burlap']

        [['shape', 'SQUARE'], ['posn', 'upper left'], ['color', 'light blue'], ['texture', 'burlap']]
        - color: light blue
        - posn: upper left
        - shape: SQUARE
        - texture: burlap
        SQUARE
        {'color': 'light blue', 'posn': 'upper left', 'texture': 'burlap', 'shape': 'SQUARE'}
    See L{ParseResults} for more examples of accessing fields by results name.
    """
    def __init__( self, expr ):
        super(Dict,self).__init__( expr )
        self.saveAsList = True

    def postParse( self, instring, loc, tokenlist ):
        for i,tok in enumerate(tokenlist):
            if len(tok) == 0:
                continue
            ikey = tok[0]
            if isinstance(ikey,int):
                ikey = _ustr(tok[0]).strip()
            if len(tok)==1:
                tokenlist[ikey] = _ParseResultsWithOffset("",i)
            elif len(tok)==2 and not isinstance(tok[1],ParseResults):
                tokenlist[ikey] = _ParseResultsWithOffset(tok[1],i)
            else:
                dictvalue = tok.copy() #ParseResults(i)
                del dictvalue[0]
                if len(dictvalue)!= 1 or (isinstance(dictvalue,ParseResults) and dictvalue.haskeys()):
                    tokenlist[ikey] = _ParseResultsWithOffset(dictvalue,i)
                else:
                    tokenlist[ikey] = _ParseResultsWithOffset(dictvalue[0],i)

        if self.resultsName:
            return [ tokenlist ]
        else:
            return tokenlist


class Suppress(TokenConverter):
    """
    Converter for ignoring the results of a parsed expression.

    Example::
        source = "a, b, c,d"
        wd = Word(alphas)
        wd_list1 = wd + ZeroOrMore(',' + wd)
        print(wd_list1.parseString(source))

        # often, delimiters that are useful during parsing are just in the
        # way afterward - use Suppress to keep them out of the parsed output
        wd_list2 = wd + ZeroOrMore(Suppress(',') + wd)
        print(wd_list2.parseString(source))
    prints::
        ['a', ',', 'b', ',', 'c', ',', 'd']
        ['a', 'b', 'c', 'd']
    (See also L{delimitedList}.)
    """
    def postParse( self, instring, loc, tokenlist ):
        return []

    def suppress( self ):
        return self


class OnlyOnce(object):
    """
    Wrapper for parse actions, to ensure they are only called once.
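
    Example (a minimal sketch; the parse action shown is arbitrary)::
        def announce(s, l, t):
            print("matched at char %d" % l)

        once = OnlyOnce(announce)
        wd = Word(alphas).setParseAction(once)
        wd.parseString("first")    # prints the message
        # a second parse raises ParseException until once.reset() is called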
    """
    def __init__(self, methodCall):
        self.callable = _trim_arity(methodCall)
        self.called = False
    def __call__(self,s,l,t):
        if not self.called:
            results = self.callable(s,l,t)
            self.called = True
            return results
        raise ParseException(s,l,"")
    def reset(self):
        self.called = False

def traceParseAction(f):
    """
    Decorator for debugging parse actions. 
    
    When the parse action is called, this decorator will print C{">> entering I{method-name}(line:I{current_source_line}, I{parse_location}, I{matched_tokens})".}
    When the parse action completes, the decorator will print C{"<<"} followed by the returned value, or any exception that the parse action raised.

    Example::
        wd = Word(alphas)

        @traceParseAction
        def remove_duplicate_chars(tokens):
            return ''.join(sorted(set(''.join(tokens))))

        wds = OneOrMore(wd).setParseAction(remove_duplicate_chars)
        print(wds.parseString("slkdjs sld sldd sdlf sdljf"))
    prints::
        >>entering remove_duplicate_chars(line: 'slkdjs sld sldd sdlf sdljf', 0, (['slkdjs', 'sld', 'sldd', 'sdlf', 'sdljf'], {}))
        <<leaving remove_duplicate_chars (ret: 'dfjkls')
        ['dfjkls']
    """
    f = _trim_arity(f)
    def z(*paArgs):
        thisFunc = f.__name__
        s,l,t = paArgs[-3:]
        if len(paArgs)>3:
            thisFunc = paArgs[0].__class__.__name__ + '.' + thisFunc
        sys.stderr.write( ">>entering %s(line: '%s', %d, %r)\n" % (thisFunc,line(l,s),l,t) )
        try:
            ret = f(*paArgs)
        except Exception as exc:
            sys.stderr.write( "<<leaving %s (exception: %s)\n" % (thisFunc,exc) )
            raise
        sys.stderr.write( "<<leaving %s (ret: %r)\n" % (thisFunc,ret) )
        return ret
    try:
        z.__name__ = f.__name__
    except AttributeError:
        pass
    return z

#
# global helpers
#
def delimitedList( expr, delim=",", combine=False ):
    """
    Helper to define a delimited list of expressions - the delimiter defaults to ','.
    By default, the list elements and delimiters can have intervening whitespace, and
    comments, but this can be overridden by passing C{combine=True} in the constructor.
    If C{combine} is set to C{True}, the matching tokens are returned as a single token
    string, with the delimiters included; otherwise, the matching tokens are returned
    as a list of tokens, with the delimiters suppressed.

    Example::
        delimitedList(Word(alphas)).parseString("aa,bb,cc") # -> ['aa', 'bb', 'cc']
        delimitedList(Word(hexnums), delim=':', combine=True).parseString("AA:BB:CC:DD:EE") # -> ['AA:BB:CC:DD:EE']
    """
    dlName = _ustr(expr)+" ["+_ustr(delim)+" "+_ustr(expr)+"]..."
    if combine:
        return Combine( expr + ZeroOrMore( delim + expr ) ).setName(dlName)
    else:
        return ( expr + ZeroOrMore( Suppress( delim ) + expr ) ).setName(dlName)

def countedArray( expr, intExpr=None ):
    """
    Helper to define a counted list of expressions.
    This helper defines a pattern of the form::
        integer expr expr expr...
    where the leading integer tells how many expr expressions follow.
    The matched tokens are returned as a list of the expr tokens - the leading count token is suppressed.
    
    If C{intExpr} is specified, it should be a pyparsing expression that produces an integer value.

    Example::
        countedArray(Word(alphas)).parseString('2 ab cd ef')  # -> ['ab', 'cd']

        # in this parser, the leading integer value is given in binary,
        # '10' indicating that 2 values are in the array
        binaryConstant = Word('01').setParseAction(lambda t: int(t[0], 2))
        countedArray(Word(alphas), intExpr=binaryConstant).parseString('10 ab cd ef')  # -> ['ab', 'cd']
    """
    arrayExpr = Forward()
    def countFieldParseAction(s,l,t):
        n = t[0]
        arrayExpr << (n and Group(And([expr]*n)) or Group(empty))
        return []
    if intExpr is None:
        intExpr = Word(nums).setParseAction(lambda t:int(t[0]))
    else:
        intExpr = intExpr.copy()
    intExpr.setName("arrayLen")
    intExpr.addParseAction(countFieldParseAction, callDuringTry=True)
    return ( intExpr + arrayExpr ).setName('(len) ' + _ustr(expr) + '...')

def _flatten(L):
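    """Internal helper - recursively flatten arbitrarily nested lists into a single flat list."""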
    ret = []
    for i in L:
        if isinstance(i,list):
            ret.extend(_flatten(i))
        else:
            ret.append(i)
    return ret

def matchPreviousLiteral(expr):
    """
    Helper to define an expression that is indirectly defined from
    the tokens matched in a previous expression, that is, it looks
    for a 'repeat' of a previous expression.  For example::
        first = Word(nums)
        second = matchPreviousLiteral(first)
        matchExpr = first + ":" + second
    will match C{"1:1"}, but not C{"1:2"}.  Because this matches a
    previous literal, it will also match the leading C{"1:1"} in C{"1:10"}.
    If this is not desired, use C{matchPreviousExpr}.
    Do I{not} use with packrat parsing enabled.
    """
    rep = Forward()
    def copyTokenToRepeater(s,l,t):
        if t:
            if len(t) == 1:
                rep << t[0]
            else:
                # flatten t tokens
                tflat = _flatten(t.asList())
                rep << And(Literal(tt) for tt in tflat)
        else:
            rep << Empty()
    expr.addParseAction(copyTokenToRepeater, callDuringTry=True)
    rep.setName('(prev) ' + _ustr(expr))
    return rep

def matchPreviousExpr(expr):
    """
    Helper to define an expression that is indirectly defined from
    the tokens matched in a previous expression, that is, it looks
    for a 'repeat' of a previous expression.  For example::
        first = Word(nums)
        second = matchPreviousExpr(first)
        matchExpr = first + ":" + second
    will match C{"1:1"}, but not C{"1:2"}.  Because this matches by
    expressions, it will I{not} match the leading C{"1:1"} in C{"1:10"};
    the expressions are evaluated first, and then compared, so
    C{"1"} is compared with C{"10"}.
    Do I{not} use with packrat parsing enabled.
    """
    rep = Forward()
    e2 = expr.copy()
    rep <<= e2
    def copyTokenToRepeater(s,l,t):
        matchTokens = _flatten(t.asList())
        def mustMatchTheseTokens(s,l,t):
            theseTokens = _flatten(t.asList())
            if  theseTokens != matchTokens:
                raise ParseException("",0,"")
        rep.setParseAction( mustMatchTheseTokens, callDuringTry=True )
    expr.addParseAction(copyTokenToRepeater, callDuringTry=True)
    rep.setName('(prev) ' + _ustr(expr))
    return rep

def _escapeRegexRangeChars(s):
    #~  escape these chars: ^-]
    for c in r"\^-]":
        s = s.replace(c,_bslash+c)
    s = s.replace("\n",r"\n")
    s = s.replace("\t",r"\t")
    return _ustr(s)

def oneOf( strs, caseless=False, useRegex=True ):
    """
    Helper to quickly define a set of alternative Literals.  Longest-first testing is
    performed when there is a conflict, regardless of the input order, but a
    C{L{MatchFirst}} is returned for best performance.

    Parameters:
     - strs - a string of space-delimited literals, or a collection of string literals
     - caseless - (default=C{False}) - treat all literals as caseless
     - useRegex - (default=C{True}) - as an optimization, will generate a Regex
          object; otherwise, will generate a C{MatchFirst} object (if C{caseless=True}, or
          if creating a C{Regex} raises an exception)

    Example::
        comp_oper = oneOf("< = > <= >= !=")
        var = Word(alphas)
        number = Word(nums)
        term = var | number
        comparison_expr = term + comp_oper + term
        print(comparison_expr.searchString("B = 12  AA=23 B<=AA AA>12"))
    prints::
        [['B', '=', '12'], ['AA', '=', '23'], ['B', '<=', 'AA'], ['AA', '>', '12']]
    """
    if caseless:
        isequal = ( lambda a,b: a.upper() == b.upper() )
        masks = ( lambda a,b: b.upper().startswith(a.upper()) )
        parseElementClass = CaselessLiteral
    else:
        isequal = ( lambda a,b: a == b )
        masks = ( lambda a,b: b.startswith(a) )
        parseElementClass = Literal

    symbols = []
    if isinstance(strs,basestring):
        symbols = strs.split()
    elif isinstance(strs, collections.Iterable):
        symbols = list(strs)
    else:
        warnings.warn("Invalid argument to oneOf, expected string or iterable",
                SyntaxWarning, stacklevel=2)
    if not symbols:
        return NoMatch()

    i = 0
    while i < len(symbols)-1:
        cur = symbols[i]
        for j,other in enumerate(symbols[i+1:]):
            if ( isequal(other, cur) ):
                del symbols[i+j+1]
                break
            elif ( masks(cur, other) ):
                del symbols[i+j+1]
                symbols.insert(i,other)
                cur = other
                break
        else:
            i += 1

    if not caseless and useRegex:
        #~ print (strs,"->", "|".join( [ _escapeRegexChars(sym) for sym in symbols] ))
        try:
            if len(symbols)==len("".join(symbols)):
                return Regex( "[%s]" % "".join(_escapeRegexRangeChars(sym) for sym in symbols) ).setName(' | '.join(symbols))
            else:
                return Regex( "|".join(re.escape(sym) for sym in symbols) ).setName(' | '.join(symbols))
        except Exception:
            warnings.warn("Exception creating Regex for oneOf, building MatchFirst",
                    SyntaxWarning, stacklevel=2)


    # last resort, just use MatchFirst
    return MatchFirst(parseElementClass(sym) for sym in symbols).setName(' | '.join(symbols))

def dictOf( key, value ):
    """
    Helper to easily and clearly define a dictionary by specifying the respective patterns
    for the key and value.  Takes care of defining the C{L{Dict}}, C{L{ZeroOrMore}}, and C{L{Group}} tokens
    in the proper order.  The key pattern can include delimiting markers or punctuation,
    as long as they are suppressed, thereby leaving the significant key text.  The value
    pattern can include named results, so that the C{Dict} results can include named token
    fields.

    Example::
        text = "shape: SQUARE posn: upper left color: light blue texture: burlap"
        attr_expr = (label + Suppress(':') + OneOrMore(data_word, stopOn=label).setParseAction(' '.join))
        print(OneOrMore(attr_expr).parseString(text).dump())
        
        attr_label = label
        attr_value = Suppress(':') + OneOrMore(data_word, stopOn=label).setParseAction(' '.join)

        # similar to Dict, but simpler call format
        result = dictOf(attr_label, attr_value).parseString(text)
        print(result.dump())
        print(result['shape'])
        print(result.shape)  # object attribute access works too
        print(result.asDict())
    prints::
        [['shape', 'SQUARE'], ['posn', 'upper left'], ['color', 'light blue'], ['texture', 'burlap']]
        - color: light blue
        - posn: upper left
        - shape: SQUARE
        - texture: burlap
        SQUARE
        SQUARE
        {'color': 'light blue', 'shape': 'SQUARE', 'posn': 'upper left', 'texture': 'burlap'}
    """
    return Dict( ZeroOrMore( Group ( key + value ) ) )

def originalTextFor(expr, asString=True):
    """
    Helper to return the original, untokenized text for a given expression.  Useful to
    restore the parsed fields of an HTML start tag into the raw tag text itself, or to
    revert separate tokens with intervening whitespace back to the original matching
    input text. By default, returns a string containing the original parsed text.
       
    If the optional C{asString} argument is passed as C{False}, then the return value is a 
    C{L{ParseResults}} containing any results names that were originally matched, and a 
    single token containing the original matched text from the input string.  So if 
    the expression passed to C{L{originalTextFor}} contains expressions with defined
    results names, you must set C{asString} to C{False} if you want to preserve those
    results name values.

    Example::
        src = "this is test <b> bold <i>text</i> </b> normal text "
        for tag in ("b","i"):
            opener,closer = makeHTMLTags(tag)
            patt = originalTextFor(opener + SkipTo(closer) + closer)
            print(patt.searchString(src)[0])
    prints::
        ['<b> bold <i>text</i> </b>']
        ['<i>text</i>']
    """
    locMarker = Empty().setParseAction(lambda s,loc,t: loc)
    endlocMarker = locMarker.copy()
    endlocMarker.callPreparse = False
    matchExpr = locMarker("_original_start") + expr + endlocMarker("_original_end")
    if asString:
        extractText = lambda s,l,t: s[t._original_start:t._original_end]
    else:
        def extractText(s,l,t):
            t[:] = [s[t.pop('_original_start'):t.pop('_original_end')]]
    matchExpr.setParseAction(extractText)
    matchExpr.ignoreExprs = expr.ignoreExprs
    return matchExpr

def ungroup(expr): 
    """
    Helper to undo pyparsing's default grouping of And expressions, even
    if all but one are non-empty.
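
    Example (an illustrative sketch)::
        grouped = Group(Word(alphas))
        print(grouped.parseString("abc"))           # -> [['abc']]
        print(ungroup(grouped).parseString("abc"))  # -> ['abc']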
    """
    return TokenConverter(expr).setParseAction(lambda t:t[0])

def locatedExpr(expr):
    """
    Helper to decorate a returned token with its starting and ending locations in the input string.
    This helper adds the following results names:
     - locn_start = location where matched expression begins
     - locn_end = location where matched expression ends
     - value = the actual parsed results

    Be careful if the input text contains C{<TAB>} characters, you may want to call
    C{L{ParserElement.parseWithTabs}}

    Example::
        wd = Word(alphas)
        for match in locatedExpr(wd).searchString("ljsdf123lksdjjf123lkkjj1222"):
            print(match)
    prints::
        [[0, 'ljsdf', 5]]
        [[8, 'lksdjjf', 15]]
        [[18, 'lkkjj', 23]]
    """
    locator = Empty().setParseAction(lambda s,l,t: l)
    return Group(locator("locn_start") + expr("value") + locator.copy().leaveWhitespace()("locn_end"))


# convenience constants for positional expressions
empty       = Empty().setName("empty")
lineStart   = LineStart().setName("lineStart")
lineEnd     = LineEnd().setName("lineEnd")
stringStart = StringStart().setName("stringStart")
stringEnd   = StringEnd().setName("stringEnd")

_escapedPunc = Word( _bslash, r"\[]-*.$+^?()~ ", exact=2 ).setParseAction(lambda s,l,t:t[0][1])
_escapedHexChar = Regex(r"\\0?[xX][0-9a-fA-F]+").setParseAction(lambda s,l,t:unichr(int(t[0].lstrip(r'\0x'),16)))
_escapedOctChar = Regex(r"\\0[0-7]+").setParseAction(lambda s,l,t:unichr(int(t[0][1:],8)))
_singleChar = _escapedPunc | _escapedHexChar | _escapedOctChar | Word(printables, excludeChars=r'\]', exact=1) | Regex(r"\w", re.UNICODE)
_charRange = Group(_singleChar + Suppress("-") + _singleChar)
_reBracketExpr = Literal("[") + Optional("^").setResultsName("negate") + Group( OneOrMore( _charRange | _singleChar ) ).setResultsName("body") + "]"

def srange(s):
    r"""
    Helper to easily define string ranges for use in Word construction.  Borrows
    syntax from regexp '[]' string range definitions::
        srange("[0-9]")   -> "0123456789"
        srange("[a-z]")   -> "abcdefghijklmnopqrstuvwxyz"
        srange("[a-z$_]") -> "abcdefghijklmnopqrstuvwxyz$_"
    The input string must be enclosed in []'s, and the returned string is the expanded
    character set joined into a single string.
    The values enclosed in the []'s may be:
     - a single character
     - an escaped character with a leading backslash (such as C{\-} or C{\]})
     - an escaped hex character with a leading C{'\x'} (C{\x21}, which is a C{'!'} character) 
         (C{\0x##} is also supported for backwards compatibility) 
     - an escaped octal character with a leading C{'\0'} (C{\041}, which is a C{'!'} character)
     - a range of any of the above, separated by a dash (C{'a-z'}, etc.)
     - any combination of the above (C{'aeiouy'}, C{'a-zA-Z0-9_$'}, etc.)
    """
    _expanded = lambda p: p if not isinstance(p,ParseResults) else ''.join(unichr(c) for c in range(ord(p[0]),ord(p[1])+1))
    try:
        return "".join(_expanded(part) for part in _reBracketExpr.parseString(s).body)
    except Exception:
        return ""

def matchOnlyAtCol(n):
    """
    Helper method for defining parse actions that require matching at a specific
    column in the input text.
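
    Example (an illustrative sketch; the column number is arbitrary)::
        # only accept an integer that begins in column 5 of its line
        col5_int = Word(nums).setParseAction(matchOnlyAtCol(5))
        print(col5_int.parseString("    1234"))  # -> ['1234']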
    """
    def verifyCol(strg,locn,toks):
        if col(locn,strg) != n:
            raise ParseException(strg,locn,"matched token not at column %d" % n)
    return verifyCol

def replaceWith(replStr):
    """
    Helper method for common parse actions that simply return a literal value.  Especially
    useful when used with C{L{transformString<ParserElement.transformString>}()}.

    Example::
        num = Word(nums).setParseAction(lambda toks: int(toks[0]))
        na = oneOf("N/A NA").setParseAction(replaceWith(math.nan))
        term = na | num
        
        OneOrMore(term).parseString("324 234 N/A 234") # -> [324, 234, nan, 234]
    """
    return lambda s,l,t: [replStr]

def removeQuotes(s,l,t):
    """
    Helper parse action for removing quotation marks from parsed quoted strings.

    Example::
        # by default, quotation marks are included in parsed results
        quotedString.parseString("'Now is the Winter of our Discontent'") # -> ["'Now is the Winter of our Discontent'"]

        # use removeQuotes to strip quotation marks from parsed results
        quotedString.setParseAction(removeQuotes)
        quotedString.parseString("'Now is the Winter of our Discontent'") # -> ["Now is the Winter of our Discontent"]
    """
    return t[0][1:-1]

def tokenMap(func, *args):
    """
    Helper to define a parse action by mapping a function to all elements of a ParseResults list. If any additional 
    args are passed, they are forwarded to the given function as additional arguments after
    the token, as in C{hex_integer = Word(hexnums).setParseAction(tokenMap(int, 16))}, which will convert the
    parsed data to an integer using base 16.

    Example (compare the last example to that in L{ParserElement.transformString})::
        hex_ints = OneOrMore(Word(hexnums)).setParseAction(tokenMap(int, 16))
        hex_ints.runTests('''
            00 11 22 aa FF 0a 0d 1a
            ''')
        
        upperword = Word(alphas).setParseAction(tokenMap(str.upper))
        OneOrMore(upperword).runTests('''
            my kingdom for a horse
            ''')

        wd = Word(alphas).setParseAction(tokenMap(str.title))
        OneOrMore(wd).setParseAction(' '.join).runTests('''
            now is the winter of our discontent made glorious summer by this sun of york
            ''')
    prints::
        00 11 22 aa FF 0a 0d 1a
        [0, 17, 34, 170, 255, 10, 13, 26]

        my kingdom for a horse
        ['MY', 'KINGDOM', 'FOR', 'A', 'HORSE']

        now is the winter of our discontent made glorious summer by this sun of york
        ['Now Is The Winter Of Our Discontent Made Glorious Summer By This Sun Of York']
    """
    def pa(s,l,t):
        return [func(tokn, *args) for tokn in t]

    try:
        func_name = getattr(func, '__name__', 
                            getattr(func, '__class__').__name__)
    except Exception:
        func_name = str(func)
    pa.__name__ = func_name

    return pa

upcaseTokens = tokenMap(lambda t: _ustr(t).upper())
"""(Deprecated) Helper parse action to convert tokens to upper case. Deprecated in favor of L{pyparsing_common.upcaseTokens}"""

downcaseTokens = tokenMap(lambda t: _ustr(t).lower())
"""(Deprecated) Helper parse action to convert tokens to lower case. Deprecated in favor of L{pyparsing_common.downcaseTokens}"""
    
def _makeTags(tagStr, xml):
    """Internal helper to construct opening and closing tag expressions, given a tag name"""
    if isinstance(tagStr,basestring):
        resname = tagStr
        tagStr = Keyword(tagStr, caseless=not xml)
    else:
        resname = tagStr.name

    tagAttrName = Word(alphas,alphanums+"_-:")
    if (xml):
        tagAttrValue = dblQuotedString.copy().setParseAction( removeQuotes )
        openTag = Suppress("<") + tagStr("tag") + \
                Dict(ZeroOrMore(Group( tagAttrName + Suppress("=") + tagAttrValue ))) + \
                Optional("/",default=[False]).setResultsName("empty").setParseAction(lambda s,l,t:t[0]=='/') + Suppress(">")
    else:
        printablesLessRAbrack = "".join(c for c in printables if c not in ">")
        tagAttrValue = quotedString.copy().setParseAction( removeQuotes ) | Word(printablesLessRAbrack)
        openTag = Suppress("<") + tagStr("tag") + \
                Dict(ZeroOrMore(Group( tagAttrName.setParseAction(downcaseTokens) + \
                Optional( Suppress("=") + tagAttrValue ) ))) + \
                Optional("/",default=[False]).setResultsName("empty").setParseAction(lambda s,l,t:t[0]=='/') + Suppress(">")
    closeTag = Combine(_L("</") + tagStr + ">")

    openTag = openTag.setResultsName("start"+"".join(resname.replace(":"," ").title().split())).setName("<%s>" % resname)
    closeTag = closeTag.setResultsName("end"+"".join(resname.replace(":"," ").title().split())).setName("</%s>" % resname)
    openTag.tag = resname
    closeTag.tag = resname
    return openTag, closeTag

def makeHTMLTags(tagStr):
    """
    Helper to construct opening and closing tag expressions for HTML, given a tag name. Matches
    tags in either upper or lower case, attributes with namespaces and with quoted or unquoted values.

    Example::
        text = '<td>More info at the <a href="http://pyparsing.wikispaces.com">pyparsing</a> wiki page</td>'
        # makeHTMLTags returns pyparsing expressions for the opening and closing tags as a 2-tuple
        a,a_end = makeHTMLTags("A")
        link_expr = a + SkipTo(a_end)("link_text") + a_end
        
        for link in link_expr.searchString(text):
            # attributes in the <A> tag (like "href" shown here) are also accessible as named results
            print(link.link_text, '->', link.href)
    prints::
        pyparsing -> http://pyparsing.wikispaces.com
    """
    return _makeTags( tagStr, False )

def makeXMLTags(tagStr):
    """
    Helper to construct opening and closing tag expressions for XML, given a tag name. Matches
    tags only in the given upper/lower case.

    Example: similar to L{makeHTMLTags}
    """
    return _makeTags( tagStr, True )

def withAttribute(*args,**attrDict):
    """
    Helper to create a validating parse action to be used with start tags created
    with C{L{makeXMLTags}} or C{L{makeHTMLTags}}. Use C{withAttribute} to qualify a starting tag
    with a required attribute value, to avoid false matches on common tags such as
    C{<TD>} or C{<DIV>}.

    Call C{withAttribute} with a series of attribute names and values. Specify the list
    of filter attributes names and values as:
     - keyword arguments, as in C{(align="right")}, or
     - as an explicit dict with C{**} operator, when an attribute name is also a Python
          reserved word, as in C{**{"class":"Customer", "align":"right"}}
     - a list of name-value tuples, as in ( ("ns1:class", "Customer"), ("ns2:align","right") )
    For attribute names with a namespace prefix, you must use the second form.  Attribute
    names are matched insensitive to upper/lower case.
       
    If just testing for C{class} (with or without a namespace), use C{L{withClass}}.

    To verify that the attribute exists, but without specifying a value, pass
    C{withAttribute.ANY_VALUE} as the value.

    Example::
        html = '''
            <div>
            Some text
            <div type="grid">1 4 0 1 0</div>
            <div type="graph">1,3 2,3 1,1</div>
            <div>this has no type</div>
            </div>
                
        '''
        div,div_end = makeHTMLTags("div")

        # only match div tag having a type attribute with value "grid"
        div_grid = div().setParseAction(withAttribute(type="grid"))
        grid_expr = div_grid + SkipTo(div | div_end)("body")
        for grid_header in grid_expr.searchString(html):
            print(grid_header.body)
        
        # construct a match with any div tag having a type attribute, regardless of the value
        div_any_type = div().setParseAction(withAttribute(type=withAttribute.ANY_VALUE))
        div_expr = div_any_type + SkipTo(div | div_end)("body")
        for div_header in div_expr.searchString(html):
            print(div_header.body)
    prints::
        1 4 0 1 0

        1 4 0 1 0
        1,3 2,3 1,1
    """
    if args:
        attrs = args[:]
    else:
        attrs = attrDict.items()
    attrs = [(k,v) for k,v in attrs]
    def pa(s,l,tokens):
        for attrName,attrValue in attrs:
            if attrName not in tokens:
                raise ParseException(s,l,"no matching attribute " + attrName)
            if attrValue != withAttribute.ANY_VALUE and tokens[attrName] != attrValue:
                raise ParseException(s,l,"attribute '%s' has value '%s', must be '%s'" %
                                            (attrName, tokens[attrName], attrValue))
    return pa
withAttribute.ANY_VALUE = object()

def withClass(classname, namespace=''):
    """
    Simplified version of C{L{withAttribute}} when matching on a div class - made
    difficult because C{class} is a reserved word in Python.

    Example::
        html = '''
            <div>
            Some text
            <div class="grid">1 4 0 1 0</div>
            <div class="graph">1,3 2,3 1,1</div>
            <div>this &lt;div&gt; has no class</div>
            </div>
                
        '''
        div,div_end = makeHTMLTags("div")
        div_grid = div().setParseAction(withClass("grid"))
        
        grid_expr = div_grid + SkipTo(div | div_end)("body")
        for grid_header in grid_expr.searchString(html):
            print(grid_header.body)
        
        div_any_type = div().setParseAction(withClass(withAttribute.ANY_VALUE))
        div_expr = div_any_type + SkipTo(div | div_end)("body")
        for div_header in div_expr.searchString(html):
            print(div_header.body)
    prints::
        1 4 0 1 0

        1 4 0 1 0
        1,3 2,3 1,1
    """
    classattr = "%s:class" % namespace if namespace else "class"
    return withAttribute(**{classattr : classname})        

opAssoc = _Constants()
opAssoc.LEFT = object()
opAssoc.RIGHT = object()

def infixNotation( baseExpr, opList, lpar=Suppress('('), rpar=Suppress(')') ):
    """
    Helper method for constructing grammars of expressions made up of
    operators working in a precedence hierarchy.  Operators may be unary or
    binary, left- or right-associative.  Parse actions can also be attached
    to operator expressions. The generated parser will also recognize the use 
    of parentheses to override operator precedences (see example below).
    
    Note: if you define a deep operator list, you may see performance issues
    when using infixNotation. See L{ParserElement.enablePackrat} for a
    mechanism to potentially improve your parser performance.

    Parameters:
     - baseExpr - expression representing the most basic element of the nested expression grammar
     - opList - list of tuples, one for each operator precedence level in the
      expression grammar; each tuple is of the form
      (opExpr, numTerms, rightLeftAssoc, parseAction), where:
       - opExpr is the pyparsing expression for the operator;
          may also be a string, which will be converted to a Literal;
          if numTerms is 3, opExpr is a tuple of two expressions, for the
          two operators separating the 3 terms
       - numTerms is the number of terms for this operator (must
          be 1, 2, or 3)
       - rightLeftAssoc is the indicator whether the operator is
          right or left associative, using the pyparsing-defined
          constants C{opAssoc.RIGHT} and C{opAssoc.LEFT}.
       - parseAction is the parse action to be associated with
          expressions matching this operator expression (the
          parse action tuple member may be omitted)
     - lpar - expression for matching left-parentheses (default=C{Suppress('(')})
     - rpar - expression for matching right-parentheses (default=C{Suppress(')')})

    Example::
        # simple example of four-function arithmetic with ints and variable names
        integer = pyparsing_common.signed_integer
        varname = pyparsing_common.identifier 
        
        arith_expr = infixNotation(integer | varname,
            [
            ('-', 1, opAssoc.RIGHT),
            (oneOf('* /'), 2, opAssoc.LEFT),
            (oneOf('+ -'), 2, opAssoc.LEFT),
            ])
        
        arith_expr.runTests('''
            5+3*6
            (5+3)*6
            -2--11
            ''', fullDump=False)
    prints::
        5+3*6
        [[5, '+', [3, '*', 6]]]

        (5+3)*6
        [[[5, '+', 3], '*', 6]]

        -2--11
        [[['-', 2], '-', ['-', 11]]]
    """
    ret = Forward()
    lastExpr = baseExpr | ( lpar + ret + rpar )
    for i,operDef in enumerate(opList):
        opExpr,arity,rightLeftAssoc,pa = (operDef + (None,))[:4]
        termName = "%s term" % opExpr if arity < 3 else "%s%s term" % opExpr
        if arity == 3:
            if opExpr is None or len(opExpr) != 2:
                raise ValueError("if numterms=3, opExpr must be a tuple or list of two expressions")
            opExpr1, opExpr2 = opExpr
        thisExpr = Forward().setName(termName)
        if rightLeftAssoc == opAssoc.LEFT:
            if arity == 1:
                matchExpr = FollowedBy(lastExpr + opExpr) + Group( lastExpr + OneOrMore( opExpr ) )
            elif arity == 2:
                if opExpr is not None:
                    matchExpr = FollowedBy(lastExpr + opExpr + lastExpr) + Group( lastExpr + OneOrMore( opExpr + lastExpr ) )
                else:
                    matchExpr = FollowedBy(lastExpr+lastExpr) + Group( lastExpr + OneOrMore(lastExpr) )
            elif arity == 3:
                matchExpr = FollowedBy(lastExpr + opExpr1 + lastExpr + opExpr2 + lastExpr) + \
                            Group( lastExpr + opExpr1 + lastExpr + opExpr2 + lastExpr )
            else:
                raise ValueError("operator must be unary (1), binary (2), or ternary (3)")
        elif rightLeftAssoc == opAssoc.RIGHT:
            if arity == 1:
                # try to avoid LR with this extra test
                if not isinstance(opExpr, Optional):
                    opExpr = Optional(opExpr)
                matchExpr = FollowedBy(opExpr.expr + thisExpr) + Group( opExpr + thisExpr )
            elif arity == 2:
                if opExpr is not None:
                    matchExpr = FollowedBy(lastExpr + opExpr + thisExpr) + Group( lastExpr + OneOrMore( opExpr + thisExpr ) )
                else:
                    matchExpr = FollowedBy(lastExpr + thisExpr) + Group( lastExpr + OneOrMore( thisExpr ) )
            elif arity == 3:
                matchExpr = FollowedBy(lastExpr + opExpr1 + thisExpr + opExpr2 + thisExpr) + \
                            Group( lastExpr + opExpr1 + thisExpr + opExpr2 + thisExpr )
            else:
                raise ValueError("operator must be unary (1), binary (2), or ternary (3)")
        else:
            raise ValueError("operator must indicate right or left associativity")
        if pa:
            matchExpr.setParseAction( pa )
        thisExpr <<= ( matchExpr.setName(termName) | lastExpr )
        lastExpr = thisExpr
    ret <<= lastExpr
    return ret

operatorPrecedence = infixNotation
"""(Deprecated) Former name of C{L{infixNotation}}, will be dropped in a future release."""

dblQuotedString = Combine(Regex(r'"(?:[^"\n\r\\]|(?:"")|(?:\\(?:[^x]|x[0-9a-fA-F]+)))*')+'"').setName("string enclosed in double quotes")
sglQuotedString = Combine(Regex(r"'(?:[^'\n\r\\]|(?:'')|(?:\\(?:[^x]|x[0-9a-fA-F]+)))*")+"'").setName("string enclosed in single quotes")
quotedString = Combine(Regex(r'"(?:[^"\n\r\\]|(?:"")|(?:\\(?:[^x]|x[0-9a-fA-F]+)))*')+'"'|
                       Regex(r"'(?:[^'\n\r\\]|(?:'')|(?:\\(?:[^x]|x[0-9a-fA-F]+)))*")+"'").setName("quotedString using single or double quotes")
unicodeString = Combine(_L('u') + quotedString.copy()).setName("unicode string literal")
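
# Hedged usage sketch (not part of the original source; the helper name is
# illustrative): quotedString matches either quoting style and keeps the
# surrounding quotes in the matched token.
def _quoted_string_example():
    return quotedString.searchString('she said "hi" and he said \'bye\'')
    # -> [['"hi"'], ["'bye'"]]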

def nestedExpr(opener="(", closer=")", content=None, ignoreExpr=quotedString.copy()):
    """
    Helper method for defining nested lists enclosed in opening and closing
    delimiters ("(" and ")" are the default).

    Parameters:
     - opener - opening character for a nested list (default=C{"("}); can also be a pyparsing expression
     - closer - closing character for a nested list (default=C{")"}); can also be a pyparsing expression
     - content - expression for items within the nested lists (default=C{None})
     - ignoreExpr - expression for ignoring opening and closing delimiters (default=C{quotedString})

    If an expression is not provided for the content argument, the nested
    expression will capture all whitespace-delimited content between delimiters
    as a list of separate values.

    Use the C{ignoreExpr} argument to define expressions that may contain
    opening or closing characters that should not be treated as opening
    or closing characters for nesting, such as quotedString or a comment
    expression.  Specify multiple expressions using an C{L{Or}} or C{L{MatchFirst}}.
    The default is L{quotedString}, but if no expressions are to be ignored,
    then pass C{None} for this argument.

    Example::
        data_type = oneOf("void int short long char float double")
        decl_data_type = Combine(data_type + Optional(Word('*')))
        ident = Word(alphas+'_', alphanums+'_')
        number = pyparsing_common.number
        arg = Group(decl_data_type + ident)
        LPAR,RPAR = map(Suppress, "()")

        code_body = nestedExpr('{', '}', ignoreExpr=(quotedString | cStyleComment))

        c_function = (decl_data_type("type") 
                      + ident("name")
                      + LPAR + Optional(delimitedList(arg), [])("args") + RPAR 
                      + code_body("body"))
        c_function.ignore(cStyleComment)
        
        source_code = '''
            int is_odd(int x) { 
                return (x%2); 
            }
                
            int dec_to_hex(char hchar) { 
                if (hchar >= '0' && hchar <= '9') { 
                    return (ord(hchar)-ord('0')); 
                } else { 
                    return (10+ord(hchar)-ord('A'));
                } 
            }
        '''
        for func in c_function.searchString(source_code):
            print("%(name)s (%(type)s) args: %(args)s" % func)

    prints::
        is_odd (int) args: [['int', 'x']]
        dec_to_hex (int) args: [['char', 'hchar']]
    """
    if opener == closer:
        raise ValueError("opening and closing strings cannot be the same")
    if content is None:
        if isinstance(opener,basestring) and isinstance(closer,basestring):
            if len(opener) == 1 and len(closer)==1:
                if ignoreExpr is not None:
                    content = (Combine(OneOrMore(~ignoreExpr +
                                    CharsNotIn(opener+closer+ParserElement.DEFAULT_WHITE_CHARS,exact=1))
                                ).setParseAction(lambda t:t[0].strip()))
                else:
                    content = (empty.copy()+CharsNotIn(opener+closer+ParserElement.DEFAULT_WHITE_CHARS
                                ).setParseAction(lambda t:t[0].strip()))
            else:
                if ignoreExpr is not None:
                    content = (Combine(OneOrMore(~ignoreExpr + 
                                    ~Literal(opener) + ~Literal(closer) +
                                    CharsNotIn(ParserElement.DEFAULT_WHITE_CHARS,exact=1))
                                ).setParseAction(lambda t:t[0].strip()))
                else:
                    content = (Combine(OneOrMore(~Literal(opener) + ~Literal(closer) +
                                    CharsNotIn(ParserElement.DEFAULT_WHITE_CHARS,exact=1))
                                ).setParseAction(lambda t:t[0].strip()))
        else:
            raise ValueError("opening and closing arguments must be strings if no content expression is given")
    ret = Forward()
    if ignoreExpr is not None:
        ret <<= Group( Suppress(opener) + ZeroOrMore( ignoreExpr | ret | content ) + Suppress(closer) )
    else:
        ret <<= Group( Suppress(opener) + ZeroOrMore( ret | content )  + Suppress(closer) )
    ret.setName('nested %s%s expression' % (opener,closer))
    return ret

def indentedBlock(blockStatementExpr, indentStack, indent=True):
    """
    Helper method for defining space-delimited indentation blocks, such as
    those used to define block statements in Python source code.

    Parameters:
     - blockStatementExpr - expression defining syntax of statement that
            is repeated within the indented block
     - indentStack - list created by caller to manage indentation stack
            (multiple statementWithIndentedBlock expressions within a single grammar
            should share a common indentStack)
     - indent - boolean indicating whether block must be indented beyond the
            current level; set to False for block of left-most statements
            (default=C{True})

    A valid block must contain at least one C{blockStatement}.

    Example::
        data = '''
        def A(z):
          A1
          B = 100
          G = A2
          A2
          A3
        B
        def BB(a,b,c):
          BB1
          def BBA():
            bba1
            bba2
            bba3
        C
        D
        def spam(x,y):
             def eggs(z):
                 pass
        '''


        indentStack = [1]
        stmt = Forward()

        identifier = Word(alphas, alphanums)
        funcDecl = ("def" + identifier + Group( "(" + Optional( delimitedList(identifier) ) + ")" ) + ":")
        func_body = indentedBlock(stmt, indentStack)
        funcDef = Group( funcDecl + func_body )

        rvalue = Forward()
        funcCall = Group(identifier + "(" + Optional(delimitedList(rvalue)) + ")")
        rvalue << (funcCall | identifier | Word(nums))
        assignment = Group(identifier + "=" + rvalue)
        stmt << ( funcDef | assignment | identifier )

        module_body = OneOrMore(stmt)

        parseTree = module_body.parseString(data)
        parseTree.pprint()
    prints::
        [['def',
          'A',
          ['(', 'z', ')'],
          ':',
          [['A1'], [['B', '=', '100']], [['G', '=', 'A2']], ['A2'], ['A3']]],
         'B',
         ['def',
          'BB',
          ['(', 'a', 'b', 'c', ')'],
          ':',
          [['BB1'], [['def', 'BBA', ['(', ')'], ':', [['bba1'], ['bba2'], ['bba3']]]]]],
         'C',
         'D',
         ['def',
          'spam',
          ['(', 'x', 'y', ')'],
          ':',
          [[['def', 'eggs', ['(', 'z', ')'], ':', [['pass']]]]]]] 
    """
    def checkPeerIndent(s,l,t):
        if l >= len(s): return
        curCol = col(l,s)
        if curCol != indentStack[-1]:
            if curCol > indentStack[-1]:
                raise ParseFatalException(s,l,"illegal nesting")
            raise ParseException(s,l,"not a peer entry")

    def checkSubIndent(s,l,t):
        curCol = col(l,s)
        if curCol > indentStack[-1]:
            indentStack.append( curCol )
        else:
            raise ParseException(s,l,"not a subentry")

    def checkUnindent(s,l,t):
        if l >= len(s): return
        curCol = col(l,s)
        if not(indentStack and curCol < indentStack[-1] and curCol <= indentStack[-2]):
            raise ParseException(s,l,"not an unindent")
        indentStack.pop()

    NL = OneOrMore(LineEnd().setWhitespaceChars("\t ").suppress())
    INDENT = (Empty() + Empty().setParseAction(checkSubIndent)).setName('INDENT')
    PEER   = Empty().setParseAction(checkPeerIndent).setName('')
    UNDENT = Empty().setParseAction(checkUnindent).setName('UNINDENT')
    if indent:
        smExpr = Group( Optional(NL) +
            #~ FollowedBy(blockStatementExpr) +
            INDENT + (OneOrMore( PEER + Group(blockStatementExpr) + Optional(NL) )) + UNDENT)
    else:
        smExpr = Group( Optional(NL) +
            (OneOrMore( PEER + Group(blockStatementExpr) + Optional(NL) )) )
    blockStatementExpr.ignore(_bslash + LineEnd())
    return smExpr.setName('indented block')

alphas8bit = srange(r"[\0xc0-\0xd6\0xd8-\0xf6\0xf8-\0xff]")
punc8bit = srange(r"[\0xa1-\0xbf\0xd7\0xf7]")

anyOpenTag,anyCloseTag = makeHTMLTags(Word(alphas,alphanums+"_:").setName('any tag'))
_htmlEntityMap = dict(zip("gt lt amp nbsp quot apos".split(),'><& "\''))
commonHTMLEntity = Regex('&(?P<entity>' + '|'.join(_htmlEntityMap.keys()) +");").setName("common HTML entity")
def replaceHTMLEntity(t):
    """Helper parser action to replace common HTML entities with their special characters"""
    return _htmlEntityMap.get(t.entity)
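
# Hedged usage sketch (not part of the original source; the helper name is
# illustrative): pair commonHTMLEntity with replaceHTMLEntity via
# transformString to decode entities in place.
def _replace_html_entity_example(s):
    expr = commonHTMLEntity.copy().setParseAction(replaceHTMLEntity)
    return expr.transformString(s)  # "x &lt; y &amp;&amp; z" -> "x < y && z"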

# it's easy to get these comment structures wrong - they're very common, so may as well make them available
cStyleComment = Combine(Regex(r"/\*(?:[^*]|\*(?!/))*") + '*/').setName("C style comment")
"Comment of the form C{/* ... */}"

htmlComment = Regex(r"<!--[\s\S]*?-->").setName("HTML comment")
"Comment of the form C{<!-- ... -->}"

restOfLine = Regex(r".*").leaveWhitespace().setName("rest of line")
dblSlashComment = Regex(r"//(?:\\\n|[^\n])*").setName("// comment")
"Comment of the form C{// ... (to end of line)}"

cppStyleComment = Combine(Regex(r"/\*(?:[^*]|\*(?!/))*") + '*/'| dblSlashComment).setName("C++ style comment")
"Comment of either form C{L{cStyleComment}} or C{L{dblSlashComment}}"

javaStyleComment = cppStyleComment
"Same as C{L{cppStyleComment}}"

pythonStyleComment = Regex(r"#.*").setName("Python style comment")
"Comment of the form C{# ... (to end of line)}"

_commasepitem = Combine(OneOrMore(Word(printables, excludeChars=',') +
                                  Optional( Word(" \t") +
                                            ~Literal(",") + ~LineEnd() ) ) ).streamline().setName("commaItem")
commaSeparatedList = delimitedList( Optional( quotedString.copy() | _commasepitem, default="") ).setName("commaSeparatedList")
"""(Deprecated) Predefined expression of 1 or more printable words or quoted strings, separated by commas.
   This expression is deprecated in favor of L{pyparsing_common.comma_separated_list}."""

# some other useful expressions - using lower-case class name since we are really using this as a namespace
class pyparsing_common:
    """
    Here are some common low-level expressions that may be useful in jump-starting parser development:
     - numeric forms (L{integers<integer>}, L{reals<real>}, L{scientific notation<sci_real>})
     - common L{programming identifiers<identifier>}
     - network addresses (L{MAC<mac_address>}, L{IPv4<ipv4_address>}, L{IPv6<ipv6_address>})
     - ISO8601 L{dates<iso8601_date>} and L{datetime<iso8601_datetime>}
     - L{UUID<uuid>}
     - L{comma-separated list<comma_separated_list>}
    Parse actions:
     - C{L{convertToInteger}}
     - C{L{convertToFloat}}
     - C{L{convertToDate}}
     - C{L{convertToDatetime}}
     - C{L{stripHTMLTags}}
     - C{L{upcaseTokens}}
     - C{L{downcaseTokens}}

    Example::
        pyparsing_common.number.runTests('''
            # any int or real number, returned as the appropriate type
            100
            -100
            +100
            3.14159
            6.02e23
            1e-12
            ''')

        pyparsing_common.fnumber.runTests('''
            # any int or real number, returned as float
            100
            -100
            +100
            3.14159
            6.02e23
            1e-12
            ''')

        pyparsing_common.hex_integer.runTests('''
            # hex numbers
            100
            FF
            ''')

        pyparsing_common.fraction.runTests('''
            # fractions
            1/2
            -3/4
            ''')

        pyparsing_common.mixed_integer.runTests('''
            # mixed fractions
            1
            1/2
            -3/4
            1-3/4
            ''')

        import uuid
        pyparsing_common.uuid.setParseAction(tokenMap(uuid.UUID))
        pyparsing_common.uuid.runTests('''
            # uuid
            12345678-1234-5678-1234-567812345678
            ''')
    prints::
        # any int or real number, returned as the appropriate type
        100
        [100]

        -100
        [-100]

        +100
        [100]

        3.14159
        [3.14159]

        6.02e23
        [6.02e+23]

        1e-12
        [1e-12]

        # any int or real number, returned as float
        100
        [100.0]

        -100
        [-100.0]

        +100
        [100.0]

        3.14159
        [3.14159]

        6.02e23
        [6.02e+23]

        1e-12
        [1e-12]

        # hex numbers
        100
        [256]

        FF
        [255]

        # fractions
        1/2
        [0.5]

        -3/4
        [-0.75]

        # mixed fractions
        1
        [1]

        1/2
        [0.5]

        -3/4
        [-0.75]

        1-3/4
        [1.75]

        # uuid
        12345678-1234-5678-1234-567812345678
        [UUID('12345678-1234-5678-1234-567812345678')]
    """

    convertToInteger = tokenMap(int)
    """
    Parse action for converting parsed integers to Python int
    """

    convertToFloat = tokenMap(float)
    """
    Parse action for converting parsed numbers to Python float
    """

    integer = Word(nums).setName("integer").setParseAction(convertToInteger)
    """expression that parses an unsigned integer, returns an int"""

    hex_integer = Word(hexnums).setName("hex integer").setParseAction(tokenMap(int,16))
    """expression that parses a hexadecimal integer, returns an int"""

    signed_integer = Regex(r'[+-]?\d+').setName("signed integer").setParseAction(convertToInteger)
    """expression that parses an integer with optional leading sign, returns an int"""

    fraction = (signed_integer().setParseAction(convertToFloat) + '/' + signed_integer().setParseAction(convertToFloat)).setName("fraction")
    """fractional expression of an integer divided by an integer, returns a float"""
    fraction.addParseAction(lambda t: t[0]/t[-1])

    mixed_integer = (fraction | signed_integer + Optional(Optional('-').suppress() + fraction)).setName("fraction or mixed integer-fraction")
    """mixed integer of the form 'integer - fraction', with optional leading integer, returns float"""
    mixed_integer.addParseAction(sum)

    real = Regex(r'[+-]?\d+\.\d*').setName("real number").setParseAction(convertToFloat)
    """expression that parses a floating point number and returns a float"""

    sci_real = Regex(r'[+-]?\d+([eE][+-]?\d+|\.\d*([eE][+-]?\d+)?)').setName("real number with scientific notation").setParseAction(convertToFloat)
    """expression that parses a floating point number with optional scientific notation and returns a float"""

    # streamlining this expression makes the docs nicer-looking
    number = (sci_real | real | signed_integer).streamline()
    """any numeric expression, returns the corresponding Python type"""

    fnumber = Regex(r'[+-]?\d+\.?\d*([eE][+-]?\d+)?').setName("fnumber").setParseAction(convertToFloat)
    """any int or real number, returned as float"""
    
    identifier = Word(alphas+'_', alphanums+'_').setName("identifier")
    """typical code identifier (leading alpha or '_', followed by 0 or more alphas, nums, or '_')"""
    
    ipv4_address = Regex(r'(25[0-5]|2[0-4][0-9]|1?[0-9]{1,2})(\.(25[0-5]|2[0-4][0-9]|1?[0-9]{1,2})){3}').setName("IPv4 address")
    "IPv4 address (C{0.0.0.0 - 255.255.255.255})"

    _ipv6_part = Regex(r'[0-9a-fA-F]{1,4}').setName("hex_integer")
    _full_ipv6_address = (_ipv6_part + (':' + _ipv6_part)*7).setName("full IPv6 address")
    _short_ipv6_address = (Optional(_ipv6_part + (':' + _ipv6_part)*(0,6)) + "::" + Optional(_ipv6_part + (':' + _ipv6_part)*(0,6))).setName("short IPv6 address")
    _short_ipv6_address.addCondition(lambda t: sum(1 for tt in t if pyparsing_common._ipv6_part.matches(tt)) < 8)
    _mixed_ipv6_address = ("::ffff:" + ipv4_address).setName("mixed IPv6 address")
    ipv6_address = Combine((_full_ipv6_address | _mixed_ipv6_address | _short_ipv6_address).setName("IPv6 address")).setName("IPv6 address")
    "IPv6 address (long, short, or mixed form)"
    
    mac_address = Regex(r'[0-9a-fA-F]{2}([:.-])[0-9a-fA-F]{2}(?:\1[0-9a-fA-F]{2}){4}').setName("MAC address")
    "MAC address xx:xx:xx:xx:xx (may also have '-' or '.' delimiters)"

    @staticmethod
    def convertToDate(fmt="%Y-%m-%d"):
        """
        Helper to create a parse action for converting parsed date string to Python datetime.date

        Params -
         - fmt - format to be passed to datetime.strptime (default=C{"%Y-%m-%d"})

        Example::
            date_expr = pyparsing_common.iso8601_date.copy()
            date_expr.setParseAction(pyparsing_common.convertToDate())
            print(date_expr.parseString("1999-12-31"))
        prints::
            [datetime.date(1999, 12, 31)]
        """
        def cvt_fn(s,l,t):
            try:
                return datetime.strptime(t[0], fmt).date()
            except ValueError as ve:
                raise ParseException(s, l, str(ve))
        return cvt_fn

    @staticmethod
    def convertToDatetime(fmt="%Y-%m-%dT%H:%M:%S.%f"):
        """
        Helper to create a parse action for converting parsed datetime string to Python datetime.datetime

        Params -
         - fmt - format to be passed to datetime.strptime (default=C{"%Y-%m-%dT%H:%M:%S.%f"})

        Example::
            dt_expr = pyparsing_common.iso8601_datetime.copy()
            dt_expr.setParseAction(pyparsing_common.convertToDatetime())
            print(dt_expr.parseString("1999-12-31T23:59:59.999"))
        prints::
            [datetime.datetime(1999, 12, 31, 23, 59, 59, 999000)]
        """
        def cvt_fn(s,l,t):
            try:
                return datetime.strptime(t[0], fmt)
            except ValueError as ve:
                raise ParseException(s, l, str(ve))
        return cvt_fn

    iso8601_date = Regex(r'(?P<year>\d{4})(?:-(?P<month>\d\d)(?:-(?P<day>\d\d))?)?').setName("ISO8601 date")
    "ISO8601 date (C{yyyy-mm-dd})"

    iso8601_datetime = Regex(r'(?P<year>\d{4})-(?P<month>\d\d)-(?P<day>\d\d)[T ](?P<hour>\d\d):(?P<minute>\d\d)(:(?P<second>\d\d(\.\d*)?)?)?(?P<tz>Z|[+-]\d\d:?\d\d)?').setName("ISO8601 datetime")
    "ISO8601 datetime (C{yyyy-mm-ddThh:mm:ss.s(Z|+-00:00)}) - trailing seconds, milliseconds, and timezone optional; accepts separating C{'T'} or C{' '}"

    uuid = Regex(r'[0-9a-fA-F]{8}(-[0-9a-fA-F]{4}){3}-[0-9a-fA-F]{12}').setName("UUID")
    "UUID (C{xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx})"

    _html_stripper = anyOpenTag.suppress() | anyCloseTag.suppress()
    @staticmethod
    def stripHTMLTags(s, l, tokens):
        """
        Parse action to remove HTML tags from web page HTML source

        Example::
            # strip HTML links from normal text 
            text = '<td>More info at the <a href="http://pyparsing.wikispaces.com">pyparsing</a> wiki page</td>'
            td,td_end = makeHTMLTags("TD")
            table_text = td + SkipTo(td_end).setParseAction(pyparsing_common.stripHTMLTags)("body") + td_end
            
            print(table_text.parseString(text).body) # -> 'More info at the pyparsing wiki page'
        """
        return pyparsing_common._html_stripper.transformString(tokens[0])

    _commasepitem = Combine(OneOrMore(~Literal(",") + ~LineEnd() + Word(printables, excludeChars=',') 
                                        + Optional( White(" \t") ) ) ).streamline().setName("commaItem")
    comma_separated_list = delimitedList( Optional( quotedString.copy() | _commasepitem, default="") ).setName("comma separated list")
    """Predefined expression of 1 or more printable words or quoted strings, separated by commas."""

    upcaseTokens = staticmethod(tokenMap(lambda t: _ustr(t).upper()))
    """Parse action to convert tokens to upper case."""

    downcaseTokens = staticmethod(tokenMap(lambda t: _ustr(t).lower()))
    """Parse action to convert tokens to lower case."""


if __name__ == "__main__":

    selectToken    = CaselessLiteral("select")
    fromToken      = CaselessLiteral("from")

    ident          = Word(alphas, alphanums + "_$")

    columnName     = delimitedList(ident, ".", combine=True).setParseAction(upcaseTokens)
    columnNameList = Group(delimitedList(columnName)).setName("columns")
    columnSpec     = ('*' | columnNameList)

    tableName      = delimitedList(ident, ".", combine=True).setParseAction(upcaseTokens)
    tableNameList  = Group(delimitedList(tableName)).setName("tables")
    
    simpleSQL      = selectToken("command") + columnSpec("columns") + fromToken + tableNameList("tables")

    # demo runTests method, including embedded comments in test string
    simpleSQL.runTests("""
        # '*' as column list and dotted table name
        select * from SYS.XYZZY

        # caseless match on "SELECT", and casts back to "select"
        SELECT * from XYZZY, ABC

        # list of column names, and mixed case SELECT keyword
        Select AA,BB,CC from Sys.dual

        # multiple tables
        Select A, B, C from Sys.dual, Table2

        # invalid SELECT keyword - should fail
        Xelect A, B, C from Sys.dual

        # incomplete command - should fail
        Select

        # invalid column name - should fail
        Select ^^^ frox Sys.dual

        """)

    pyparsing_common.number.runTests("""
        100
        -100
        +100
        3.14159
        6.02e23
        1e-12
        """)

    # any int or real number, returned as float
    pyparsing_common.fnumber.runTests("""
        100
        -100
        +100
        3.14159
        6.02e23
        1e-12
        """)

    pyparsing_common.hex_integer.runTests("""
        100
        FF
        """)

    import uuid
    pyparsing_common.uuid.setParseAction(tokenMap(uuid.UUID))
    pyparsing_common.uuid.runTests("""
        12345678-1234-5678-1234-567812345678
        """)
site-packages/pip/_vendor/ipaddress.py000064400000234460147511334610014056 0ustar00# Copyright 2007 Google Inc.
#  Licensed to PSF under a Contributor Agreement.

"""A fast, lightweight IPv4/IPv6 manipulation library in Python.

This library is used to create/poke/manipulate IPv4 and IPv6 addresses
and networks.

"""

from __future__ import unicode_literals


import itertools
import struct

__version__ = '1.0.17'

# Compatibility functions
_compat_int_types = (int,)
try:
    _compat_int_types = (int, long)
except NameError:
    pass
try:
    _compat_str = unicode
except NameError:
    _compat_str = str
    assert bytes != str
if b'\0'[0] == 0:  # Python 3 semantics
    def _compat_bytes_to_byte_vals(byt):
        return byt
else:
    def _compat_bytes_to_byte_vals(byt):
        return [struct.unpack(b'!B', b)[0] for b in byt]
try:
    _compat_int_from_byte_vals = int.from_bytes
except AttributeError:
    def _compat_int_from_byte_vals(bytvals, endianess):
        assert endianess == 'big'
        res = 0
        for bv in bytvals:
            assert isinstance(bv, _compat_int_types)
            res = (res << 8) + bv
        return res


def _compat_to_bytes(intval, length, endianess):
    assert isinstance(intval, _compat_int_types)
    assert endianess == 'big'
    if length == 4:
        if intval < 0 or intval >= 2 ** 32:
            raise struct.error("integer out of range for 'I' format code")
        return struct.pack(b'!I', intval)
    elif length == 16:
        if intval < 0 or intval >= 2 ** 128:
            raise struct.error("integer out of range for 'QQ' format code")
        return struct.pack(b'!QQ', intval >> 64, intval & 0xffffffffffffffff)
    else:
        raise NotImplementedError()
if hasattr(int, 'bit_length'):
    # Not int.bit_length, since that won't work in 2.7 where long exists
    def _compat_bit_length(i):
        return i.bit_length()
else:
    def _compat_bit_length(i):
        for res in itertools.count():
            if i >> res == 0:
                return res


def _compat_range(start, end, step=1):
    assert step > 0
    i = start
    while i < end:
        yield i
        i += step


class _TotalOrderingMixin(object):
    __slots__ = ()

    # Helper that derives the other comparison operations from
    # __lt__ and __eq__
    # We avoid functools.total_ordering because it doesn't handle
    # NotImplemented correctly yet (http://bugs.python.org/issue10042)
    def __eq__(self, other):
        raise NotImplementedError

    def __ne__(self, other):
        equal = self.__eq__(other)
        if equal is NotImplemented:
            return NotImplemented
        return not equal

    def __lt__(self, other):
        raise NotImplementedError

    def __le__(self, other):
        less = self.__lt__(other)
        if less is NotImplemented or not less:
            return self.__eq__(other)
        return less

    def __gt__(self, other):
        less = self.__lt__(other)
        if less is NotImplemented:
            return NotImplemented
        equal = self.__eq__(other)
        if equal is NotImplemented:
            return NotImplemented
        return not (less or equal)

    def __ge__(self, other):
        less = self.__lt__(other)
        if less is NotImplemented:
            return NotImplemented
        return not less


IPV4LENGTH = 32
IPV6LENGTH = 128


class AddressValueError(ValueError):
    """A Value Error related to the address."""


class NetmaskValueError(ValueError):
    """A Value Error related to the netmask."""


def ip_address(address):
    """Take an IP string/int and return an object of the correct type.

    Args:
        address: A string or integer, the IP address.  Either IPv4 or
          IPv6 addresses may be supplied; integers less than 2**32 will
          be considered to be IPv4 by default.

    Returns:
        An IPv4Address or IPv6Address object.

    Raises:
        ValueError: if the *address* passed isn't either a v4 or a v6
          address

    """
    try:
        return IPv4Address(address)
    except (AddressValueError, NetmaskValueError):
        pass

    try:
        return IPv6Address(address)
    except (AddressValueError, NetmaskValueError):
        pass

    if isinstance(address, bytes):
        raise AddressValueError(
            '%r does not appear to be an IPv4 or IPv6 address. '
            'Did you pass in a bytes (str in Python 2) instead of'
            ' a unicode object?' % address)

    raise ValueError('%r does not appear to be an IPv4 or IPv6 address' %
                     address)
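
# Hedged usage sketch (not part of the original module; the helper name is
# illustrative): ip_address() picks the address class from the value given.
def _ip_address_example():
    return ip_address(u'192.0.2.1'), ip_address(u'2001:db8::1')
    # -> (IPv4Address('192.0.2.1'), IPv6Address('2001:db8::1'))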


def ip_network(address, strict=True):
    """Take an IP string/int and return an object of the correct type.

    Args:
        address: A string or integer, the IP network.  Either IPv4 or
          IPv6 networks may be supplied; integers less than 2**32 will
          be considered to be IPv4 by default.

    Returns:
        An IPv4Network or IPv6Network object.

    Raises:
        ValueError: if the string passed isn't either a v4 or a v6
          address. Or if the network has host bits set.

    """
    try:
        return IPv4Network(address, strict)
    except (AddressValueError, NetmaskValueError):
        pass

    try:
        return IPv6Network(address, strict)
    except (AddressValueError, NetmaskValueError):
        pass

    if isinstance(address, bytes):
        raise AddressValueError(
            '%r does not appear to be an IPv4 or IPv6 network. '
            'Did you pass in a bytes (str in Python 2) instead of'
            ' a unicode object?' % address)

    raise ValueError('%r does not appear to be an IPv4 or IPv6 network' %
                     address)
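
# Hedged usage sketch (not part of the original module; the helper name is
# illustrative): strict=True (the default) rejects host bits set in the
# address, while strict=False masks them away.
def _ip_network_example():
    return ip_network(u'192.0.2.1/24', strict=False)
    # -> IPv4Network('192.0.2.0/24'); with strict=True this raises ValueError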


def ip_interface(address):
    """Take an IP string/int and return an object of the correct type.

    Args:
        address: A string or integer, the IP address.  Either IPv4 or
          IPv6 addresses may be supplied; integers less than 2**32 will
          be considered to be IPv4 by default.

    Returns:
        An IPv4Interface or IPv6Interface object.

    Raises:
        ValueError: if the string passed isn't either a v4 or a v6
          address.

    Notes:
        The IPv?Interface classes describe an Address on a particular
        Network, so they're basically a combination of both the Address
        and Network classes.

    """
    try:
        return IPv4Interface(address)
    except (AddressValueError, NetmaskValueError):
        pass

    try:
        return IPv6Interface(address)
    except (AddressValueError, NetmaskValueError):
        pass

    raise ValueError('%r does not appear to be an IPv4 or IPv6 interface' %
                     address)
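
# Hedged usage sketch (not part of the original module; the helper name is
# illustrative): an interface couples a host address with its network.
def _ip_interface_example():
    iface = ip_interface(u'192.0.2.5/24')
    return iface.ip, iface.network
    # -> (IPv4Address('192.0.2.5'), IPv4Network('192.0.2.0/24'))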


def v4_int_to_packed(address):
    """Represent an address as 4 packed bytes in network (big-endian) order.

    Args:
        address: An integer representation of an IPv4 IP address.

    Returns:
        The integer address packed as 4 bytes in network (big-endian) order.

    Raises:
        ValueError: If the integer is negative or too large to be an
          IPv4 IP address.

    """
    try:
        return _compat_to_bytes(address, 4, 'big')
    except (struct.error, OverflowError):
        raise ValueError("Address negative or too large for IPv4")


def v6_int_to_packed(address):
    """Represent an address as 16 packed bytes in network (big-endian) order.

    Args:
        address: An integer representation of an IPv6 IP address.

    Returns:
        The integer address packed as 16 bytes in network (big-endian) order.

    """
    try:
        return _compat_to_bytes(address, 16, 'big')
    except (struct.error, OverflowError):
        raise ValueError("Address negative or too large for IPv6")


def _split_optional_netmask(address):
    """Helper to split the netmask and raise AddressValueError if needed"""
    addr = _compat_str(address).split('/')
    if len(addr) > 2:
        raise AddressValueError("Only one '/' permitted in %r" % address)
    return addr


def _find_address_range(addresses):
    """Find a sequence of sorted deduplicated IPv#Address.

    Args:
        addresses: a list of IPv#Address objects.

    Yields:
        A tuple of the first and last address in each run of consecutive addresses.

    """
    it = iter(addresses)
    first = last = next(it)
    for ip in it:
        if ip._ip != last._ip + 1:
            yield first, last
            first = ip
        last = ip
    yield first, last
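
# Hedged sketch (not part of the original module): given the sorted,
# deduplicated addresses .1, .2, .3, .10 this yields the runs
# (.1, .3) and (.10, .10).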


def _count_righthand_zero_bits(number, bits):
    """Count the number of zero bits on the right hand side.

    Args:
        number: an integer.
        bits: maximum number of bits to count.

    Returns:
        The number of zero bits on the right hand side of the number.

    """
    if number == 0:
        return bits
    return min(bits, _compat_bit_length(~number & (number - 1)))
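
# Hedged usage sketch (not part of the original module; the helper name is
# illustrative): 0b10100000 ends in five zero bits, which bounds the size of
# an aligned block that can start at that address.
def _count_zero_bits_example():
    return _count_righthand_zero_bits(0b10100000, 32)  # -> 5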


def summarize_address_range(first, last):
    """Summarize a network range given the first and last IP addresses.

    Example:
        >>> list(summarize_address_range(IPv4Address('192.0.2.0'),
        ...                              IPv4Address('192.0.2.130')))
        ...                                #doctest: +NORMALIZE_WHITESPACE
        [IPv4Network('192.0.2.0/25'), IPv4Network('192.0.2.128/31'),
         IPv4Network('192.0.2.130/32')]

    Args:
        first: the first IPv4Address or IPv6Address in the range.
        last: the last IPv4Address or IPv6Address in the range.

    Returns:
        An iterator of the summarized IPv(4|6) network objects.

    Raises:
        TypeError:
            If the first and last objects are not IP addresses.
            If the first and last objects are not the same version.
        ValueError:
            If the last object is not greater than the first.
            If the version of the first address is not 4 or 6.

    """
    if (not (isinstance(first, _BaseAddress) and
             isinstance(last, _BaseAddress))):
        raise TypeError('first and last must be IP addresses, not networks')
    if first.version != last.version:
        raise TypeError("%s and %s are not of the same version" % (
                        first, last))
    if first > last:
        raise ValueError('last IP address must be greater than first')

    if first.version == 4:
        ip = IPv4Network
    elif first.version == 6:
        ip = IPv6Network
    else:
        raise ValueError('unknown IP version')

    ip_bits = first._max_prefixlen
    first_int = first._ip
    last_int = last._ip
    while first_int <= last_int:
        nbits = min(_count_righthand_zero_bits(first_int, ip_bits),
                    _compat_bit_length(last_int - first_int + 1) - 1)
        net = ip((first_int, ip_bits - nbits))
        yield net
        first_int += 1 << nbits
        if first_int - 1 == ip._ALL_ONES:
            break


def _collapse_addresses_internal(addresses):
    """Loops through the addresses, collapsing concurrent netblocks.

    Example:

        ip1 = IPv4Network('192.0.2.0/26')
        ip2 = IPv4Network('192.0.2.64/26')
        ip3 = IPv4Network('192.0.2.128/26')
        ip4 = IPv4Network('192.0.2.192/26')

        _collapse_addresses_internal([ip1, ip2, ip3, ip4]) ->
          [IPv4Network('192.0.2.0/24')]

        This shouldn't be called directly; it is called via
          collapse_addresses([]).

    Args:
        addresses: A list of IPv4Network's or IPv6Network's

    Returns:
        A list of IPv4Network's or IPv6Network's depending on what we were
        passed.

    """
    # First merge
    to_merge = list(addresses)
    subnets = {}
    while to_merge:
        net = to_merge.pop()
        supernet = net.supernet()
        existing = subnets.get(supernet)
        if existing is None:
            subnets[supernet] = net
        elif existing != net:
            # Merge consecutive subnets
            del subnets[supernet]
            to_merge.append(supernet)
    # Then iterate over resulting networks, skipping subsumed subnets
    last = None
    for net in sorted(subnets.values()):
        if last is not None:
            # Since they are sorted,
            # last.network_address <= net.network_address is a given.
            if last.broadcast_address >= net.broadcast_address:
                continue
        yield net
        last = net


def collapse_addresses(addresses):
    """Collapse a list of IP objects.

    Example:
        collapse_addresses([IPv4Network('192.0.2.0/25'),
                            IPv4Network('192.0.2.128/25')]) ->
                           [IPv4Network('192.0.2.0/24')]

    Args:
        addresses: An iterator of IPv4Network or IPv6Network objects.

    Returns:
        An iterator of the collapsed IPv(4|6)Network objects.

    Raises:
        TypeError: If passed a list of mixed version objects.

    """
    addrs = []
    ips = []
    nets = []

    # split IP addresses and networks
    for ip in addresses:
        if isinstance(ip, _BaseAddress):
            if ips and ips[-1]._version != ip._version:
                raise TypeError("%s and %s are not of the same version" % (
                                ip, ips[-1]))
            ips.append(ip)
        elif ip._prefixlen == ip._max_prefixlen:
            if ips and ips[-1]._version != ip._version:
                raise TypeError("%s and %s are not of the same version" % (
                                ip, ips[-1]))
            try:
                ips.append(ip.ip)
            except AttributeError:
                ips.append(ip.network_address)
        else:
            if nets and nets[-1]._version != ip._version:
                raise TypeError("%s and %s are not of the same version" % (
                                ip, nets[-1]))
            nets.append(ip)

    # sort and dedup
    ips = sorted(set(ips))

    # find consecutive address ranges in the sorted sequence and summarize them
    if ips:
        for first, last in _find_address_range(ips):
            addrs.extend(summarize_address_range(first, last))

    return _collapse_addresses_internal(addrs + nets)


def get_mixed_type_key(obj):
    """Return a key suitable for sorting between networks and addresses.

    Address and Network objects are not sortable by default; they're
    fundamentally different so the expression

        IPv4Address('192.0.2.0') <= IPv4Network('192.0.2.0/24')

    doesn't make any sense.  There are times, however, when you may wish
    to have ipaddress sort these for you anyway. If you need to do this, you
    can use this function as the key= argument to sorted().

    Args:
      obj: either a Network or Address object.
    Returns:
      appropriate key.

    """
    if isinstance(obj, _BaseNetwork):
        return obj._get_networks_key()
    elif isinstance(obj, _BaseAddress):
        return obj._get_address_key()
    return NotImplemented
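
# Hedged usage sketch (not part of the original module; the helper name is
# illustrative): the key lets sorted() interleave address and network objects
# without raising TypeError.
def _mixed_sort_example():
    items = [ip_address(u'192.0.2.9'), ip_network(u'192.0.2.0/24')]
    return sorted(items, key=get_mixed_type_key)
    # -> [IPv4Network('192.0.2.0/24'), IPv4Address('192.0.2.9')]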


class _IPAddressBase(_TotalOrderingMixin):

    """The mother class."""

    __slots__ = ()

    @property
    def exploded(self):
        """Return the longhand version of the IP address as a string."""
        return self._explode_shorthand_ip_string()

    @property
    def compressed(self):
        """Return the shorthand version of the IP address as a string."""
        return _compat_str(self)

    @property
    def reverse_pointer(self):
        """The name of the reverse DNS pointer for the IP address, e.g.:
            >>> ipaddress.ip_address("127.0.0.1").reverse_pointer
            '1.0.0.127.in-addr.arpa'
            >>> ipaddress.ip_address("2001:db8::1").reverse_pointer
            '1.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.8.b.d.0.1.0.0.2.ip6.arpa'

        """
        return self._reverse_pointer()

    @property
    def version(self):
        msg = '%200s has no version specified' % (type(self),)
        raise NotImplementedError(msg)

    def _check_int_address(self, address):
        if address < 0:
            msg = "%d (< 0) is not permitted as an IPv%d address"
            raise AddressValueError(msg % (address, self._version))
        if address > self._ALL_ONES:
            msg = "%d (>= 2**%d) is not permitted as an IPv%d address"
            raise AddressValueError(msg % (address, self._max_prefixlen,
                                           self._version))

    def _check_packed_address(self, address, expected_len):
        address_len = len(address)
        if address_len != expected_len:
            msg = (
                '%r (len %d != %d) is not permitted as an IPv%d address. '
                'Did you pass in a bytes (str in Python 2) instead of'
                ' a unicode object?'
            )
            raise AddressValueError(msg % (address, address_len,
                                           expected_len, self._version))

    @classmethod
    def _ip_int_from_prefix(cls, prefixlen):
        """Turn the prefix length into a bitwise netmask

        Args:
            prefixlen: An integer, the prefix length.

        Returns:
            An integer.

        """
        return cls._ALL_ONES ^ (cls._ALL_ONES >> prefixlen)

    @classmethod
    def _prefix_from_ip_int(cls, ip_int):
        """Return prefix length from the bitwise netmask.

        Args:
            ip_int: An integer, the netmask in expanded bitwise format

        Returns:
            An integer, the prefix length.

        Raises:
            ValueError: If the input intermingles zeroes & ones
        """
        trailing_zeroes = _count_righthand_zero_bits(ip_int,
                                                     cls._max_prefixlen)
        prefixlen = cls._max_prefixlen - trailing_zeroes
        leading_ones = ip_int >> trailing_zeroes
        all_ones = (1 << prefixlen) - 1
        if leading_ones != all_ones:
            byteslen = cls._max_prefixlen // 8
            details = _compat_to_bytes(ip_int, byteslen, 'big')
            msg = 'Netmask pattern %r mixes zeroes & ones'
            raise ValueError(msg % details)
        return prefixlen

    @classmethod
    def _report_invalid_netmask(cls, netmask_str):
        msg = '%r is not a valid netmask' % netmask_str
        raise NetmaskValueError(msg)

    @classmethod
    def _prefix_from_prefix_string(cls, prefixlen_str):
        """Return prefix length from a numeric string

        Args:
            prefixlen_str: The string to be converted

        Returns:
            An integer, the prefix length.

        Raises:
            NetmaskValueError: If the input is not a valid netmask
        """
        # int allows a leading +/- as well as surrounding whitespace,
        # so we ensure that isn't the case
        if not _BaseV4._DECIMAL_DIGITS.issuperset(prefixlen_str):
            cls._report_invalid_netmask(prefixlen_str)
        try:
            prefixlen = int(prefixlen_str)
        except ValueError:
            cls._report_invalid_netmask(prefixlen_str)
        if not (0 <= prefixlen <= cls._max_prefixlen):
            cls._report_invalid_netmask(prefixlen_str)
        return prefixlen

    @classmethod
    def _prefix_from_ip_string(cls, ip_str):
        """Turn a netmask/hostmask string into a prefix length

        Args:
            ip_str: The netmask/hostmask to be converted

        Returns:
            An integer, the prefix length.

        Raises:
            NetmaskValueError: If the input is not a valid netmask/hostmask
        """
        # Parse the netmask/hostmask like an IP address.
        try:
            ip_int = cls._ip_int_from_string(ip_str)
        except AddressValueError:
            cls._report_invalid_netmask(ip_str)

        # Try matching a netmask (this would be /1*0*/ as a bitwise regexp).
        # Note that the two ambiguous cases (all-ones and all-zeroes) are
        # treated as netmasks.
        try:
            return cls._prefix_from_ip_int(ip_int)
        except ValueError:
            pass

        # Invert the bits, and try matching a /0+1+/ hostmask instead.
        ip_int ^= cls._ALL_ONES
        try:
            return cls._prefix_from_ip_int(ip_int)
        except ValueError:
            cls._report_invalid_netmask(ip_str)
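    # Hedged sketch (not part of the original module): both notations map to
    # the same prefix length, e.g. '255.255.255.0' -> 24 and '0.0.0.255' -> 24.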

    def __reduce__(self):
        return self.__class__, (_compat_str(self),)


class _BaseAddress(_IPAddressBase):

    """A generic IP object.

    This IP class contains the version independent methods which are
    used by single IP addresses.
    """

    __slots__ = ()

    def __int__(self):
        return self._ip

    def __eq__(self, other):
        try:
            return (self._ip == other._ip and
                    self._version == other._version)
        except AttributeError:
            return NotImplemented

    def __lt__(self, other):
        if not isinstance(other, _IPAddressBase):
            return NotImplemented
        if not isinstance(other, _BaseAddress):
            raise TypeError('%s and %s are not of the same type' % (
                self, other))
        if self._version != other._version:
            raise TypeError('%s and %s are not of the same version' % (
                self, other))
        if self._ip != other._ip:
            return self._ip < other._ip
        return False

    # Shorthand for Integer addition and subtraction. This is not
    # meant to ever support addition/subtraction of addresses.
    def __add__(self, other):
        if not isinstance(other, _compat_int_types):
            return NotImplemented
        return self.__class__(int(self) + other)

    def __sub__(self, other):
        if not isinstance(other, _compat_int_types):
            return NotImplemented
        return self.__class__(int(self) - other)

    def __repr__(self):
        return '%s(%r)' % (self.__class__.__name__, _compat_str(self))

    def __str__(self):
        return _compat_str(self._string_from_ip_int(self._ip))

    def __hash__(self):
        return hash(hex(int(self._ip)))

    def _get_address_key(self):
        return (self._version, self)

    def __reduce__(self):
        return self.__class__, (self._ip,)


class _BaseNetwork(_IPAddressBase):

    """A generic IP network object.

    This IP class contains the version independent methods which are
    used by networks.

    """
    def __init__(self, address):
        self._cache = {}

    def __repr__(self):
        return '%s(%r)' % (self.__class__.__name__, _compat_str(self))

    def __str__(self):
        return '%s/%d' % (self.network_address, self.prefixlen)

    def hosts(self):
        """Generate Iterator over usable hosts in a network.

        This is like __iter__ except it doesn't return the network
        or broadcast addresses.

        """
        network = int(self.network_address)
        broadcast = int(self.broadcast_address)
        for x in _compat_range(network + 1, broadcast):
            yield self._address_class(x)
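    # Hedged sketch (not part of the original module): for 192.0.2.0/29 this
    # yields the six usable addresses 192.0.2.1 through 192.0.2.6.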

    def __iter__(self):
        network = int(self.network_address)
        broadcast = int(self.broadcast_address)
        for x in _compat_range(network, broadcast + 1):
            yield self._address_class(x)

    def __getitem__(self, n):
        network = int(self.network_address)
        broadcast = int(self.broadcast_address)
        if n >= 0:
            if network + n > broadcast:
                raise IndexError('address out of range')
            return self._address_class(network + n)
        else:
            n += 1
            if broadcast + n < network:
                raise IndexError('address out of range')
            return self._address_class(broadcast + n)

    def __lt__(self, other):
        if not isinstance(other, _IPAddressBase):
            return NotImplemented
        if not isinstance(other, _BaseNetwork):
            raise TypeError('%s and %s are not of the same type' % (
                            self, other))
        if self._version != other._version:
            raise TypeError('%s and %s are not of the same version' % (
                            self, other))
        if self.network_address != other.network_address:
            return self.network_address < other.network_address
        if self.netmask != other.netmask:
            return self.netmask < other.netmask
        return False

    def __eq__(self, other):
        try:
            return (self._version == other._version and
                    self.network_address == other.network_address and
                    int(self.netmask) == int(other.netmask))
        except AttributeError:
            return NotImplemented

    def __hash__(self):
        return hash(int(self.network_address) ^ int(self.netmask))

    def __contains__(self, other):
        # always false if one is v4 and the other is v6.
        if self._version != other._version:
            return False
        # dealing with another network.
        if isinstance(other, _BaseNetwork):
            return False
        # dealing with another address
        else:
            # address
            return (int(self.network_address) <= int(other._ip) <=
                    int(self.broadcast_address))

    def overlaps(self, other):
        """Tell if self is partly contained in other."""
        return self.network_address in other or (
            self.broadcast_address in other or (
                other.network_address in self or (
                    other.broadcast_address in self)))

    @property
    def broadcast_address(self):
        x = self._cache.get('broadcast_address')
        if x is None:
            x = self._address_class(int(self.network_address) |
                                    int(self.hostmask))
            self._cache['broadcast_address'] = x
        return x

    @property
    def hostmask(self):
        x = self._cache.get('hostmask')
        if x is None:
            x = self._address_class(int(self.netmask) ^ self._ALL_ONES)
            self._cache['hostmask'] = x
        return x

    @property
    def with_prefixlen(self):
        return '%s/%d' % (self.network_address, self._prefixlen)

    @property
    def with_netmask(self):
        return '%s/%s' % (self.network_address, self.netmask)

    @property
    def with_hostmask(self):
        return '%s/%s' % (self.network_address, self.hostmask)

    @property
    def num_addresses(self):
        """Number of hosts in the current subnet."""
        return int(self.broadcast_address) - int(self.network_address) + 1

    @property
    def _address_class(self):
        # Returning bare address objects (rather than interfaces) allows for
        # more consistent behaviour across the network address, broadcast
        # address and individual host addresses.
        msg = '%200s has no associated address class' % (type(self),)
        raise NotImplementedError(msg)

    @property
    def prefixlen(self):
        return self._prefixlen

    def address_exclude(self, other):
        """Remove an address from a larger block.

        For example:

            addr1 = ip_network('192.0.2.0/28')
            addr2 = ip_network('192.0.2.1/32')
            list(addr1.address_exclude(addr2)) =
                [IPv4Network('192.0.2.0/32'), IPv4Network('192.0.2.2/31'),
                 IPv4Network('192.0.2.4/30'), IPv4Network('192.0.2.8/29')]

        or IPv6:

            addr1 = ip_network('2001:db8::1/32')
            addr2 = ip_network('2001:db8::1/128')
            list(addr1.address_exclude(addr2)) =
                [ip_network('2001:db8::1/128'),
                 ip_network('2001:db8::2/127'),
                 ip_network('2001:db8::4/126'),
                 ip_network('2001:db8::8/125'),
                 ...
                 ip_network('2001:db8:8000::/33')]

        Args:
            other: An IPv4Network or IPv6Network object of the same type.

        Returns:
            An iterator of the IPv(4|6)Network objects which is self
            minus other.

        Raises:
            TypeError: If self and other are of differing address
              versions, or if other is not a network object.
            ValueError: If other is not completely contained by self.

        """
        if not self._version == other._version:
            raise TypeError("%s and %s are not of the same version" % (
                            self, other))

        if not isinstance(other, _BaseNetwork):
            raise TypeError("%s is not a network object" % other)

        if not other.subnet_of(self):
            raise ValueError('%s not contained in %s' % (other, self))
        if other == self:
            return

        # Make sure we're comparing the network of other.
        other = other.__class__('%s/%s' % (other.network_address,
                                           other.prefixlen))

        s1, s2 = self.subnets()
        while s1 != other and s2 != other:
            if other.subnet_of(s1):
                yield s2
                s1, s2 = s1.subnets()
            elif other.subnet_of(s2):
                yield s1
                s1, s2 = s2.subnets()
            else:
                # If we got here, there's a bug somewhere.
                raise AssertionError('Error performing exclusion: '
                                     's1: %s s2: %s other: %s' %
                                     (s1, s2, other))
        if s1 == other:
            yield s2
        elif s2 == other:
            yield s1
        else:
            # If we got here, there's a bug somewhere.
            raise AssertionError('Error performing exclusion: '
                                 's1: %s s2: %s other: %s' %
                                 (s1, s2, other))

    def compare_networks(self, other):
        """Compare two IP objects.

        This is only concerned about the comparison of the integer
        representation of the network addresses.  This means that the
        host bits aren't considered at all in this method.  If you want
        to compare host bits, you can easily enough do a
        'HostA._ip < HostB._ip'

        Args:
            other: An IP object.

        Returns:
            If the IP versions of self and other are the same, returns:

            -1 if self < other:
              eg: IPv4Network('192.0.2.0/25') < IPv4Network('192.0.2.128/25')
              IPv6Network('2001:db8::1000/124') <
                  IPv6Network('2001:db8::2000/124')
            0 if self == other
              eg: IPv4Network('192.0.2.0/24') == IPv4Network('192.0.2.0/24')
              IPv6Network('2001:db8::1000/124') ==
                  IPv6Network('2001:db8::1000/124')
            1 if self > other
              eg: IPv4Network('192.0.2.128/25') > IPv4Network('192.0.2.0/25')
                  IPv6Network('2001:db8::2000/124') >
                      IPv6Network('2001:db8::1000/124')

        Raises:
            TypeError if the IP versions are different.

        """
        # does this need to raise a ValueError?
        if self._version != other._version:
            raise TypeError('%s and %s are not of the same type' % (
                            self, other))
        # self._version == other._version below here:
        if self.network_address < other.network_address:
            return -1
        if self.network_address > other.network_address:
            return 1
        # self.network_address == other.network_address below here:
        if self.netmask < other.netmask:
            return -1
        if self.netmask > other.netmask:
            return 1
        return 0
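
    # Usage sketch: ordering is by network address first, then by netmask, so
    # for example (using the module-level ip_network() helper):
    #
    #   >>> ip_network('192.0.2.0/25').compare_networks(
    #   ...     ip_network('192.0.2.128/25'))
    #   -1
    #   >>> ip_network('192.0.2.0/24').compare_networks(
    #   ...     ip_network('192.0.2.0/25'))
    #   -1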

    def _get_networks_key(self):
        """Network-only key function.

        Returns an object that identifies this address' network and
        netmask. This function is a suitable "key" argument for sorted()
        and list.sort().

        """
        return (self._version, self.network_address, self.netmask)

    def subnets(self, prefixlen_diff=1, new_prefix=None):
        """The subnets which join to make the current subnet.

        In the case that self contains only one IP
        (self._prefixlen == 32 for IPv4 or self._prefixlen == 128
        for IPv6), return an iterator that yields just this network itself.

        Args:
            prefixlen_diff: An integer, the amount the prefix length
              should be increased by. This should not be set if
              new_prefix is also set.
            new_prefix: The desired new prefix length. This must be a
              larger number (smaller network) than the existing prefix.
              This should not be set if prefixlen_diff is also set.

        Returns:
            An iterator of IPv(4|6) objects.

        Raises:
            ValueError: If prefixlen_diff is too small or too large, if
              prefixlen_diff and new_prefix are both set, or if new_prefix is
              a smaller number than the current prefix (a smaller number
              means a larger network).

        """
        if self._prefixlen == self._max_prefixlen:
            yield self
            return

        if new_prefix is not None:
            if new_prefix < self._prefixlen:
                raise ValueError('new prefix must be longer')
            if prefixlen_diff != 1:
                raise ValueError('cannot set prefixlen_diff and new_prefix')
            prefixlen_diff = new_prefix - self._prefixlen

        if prefixlen_diff < 0:
            raise ValueError('prefix length diff must be > 0')
        new_prefixlen = self._prefixlen + prefixlen_diff

        if new_prefixlen > self._max_prefixlen:
            raise ValueError(
                'prefix length diff %d is invalid for netblock %s' % (
                    new_prefixlen, self))

        start = int(self.network_address)
        end = int(self.broadcast_address) + 1
        step = (int(self.hostmask) + 1) >> prefixlen_diff
        for new_addr in _compat_range(start, end, step):
            current = self.__class__((new_addr, new_prefixlen))
            yield current
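
    # Usage sketch: prefixlen_diff and new_prefix are two ways of requesting
    # the same split; both calls below should yield the same four /26 blocks:
    #
    #   >>> list(ip_network('192.0.2.0/24').subnets(prefixlen_diff=2))
    #   [IPv4Network('192.0.2.0/26'), IPv4Network('192.0.2.64/26'),
    #    IPv4Network('192.0.2.128/26'), IPv4Network('192.0.2.192/26')]
    #   >>> list(ip_network('192.0.2.0/24').subnets(new_prefix=26))
    #   [IPv4Network('192.0.2.0/26'), IPv4Network('192.0.2.64/26'),
    #    IPv4Network('192.0.2.128/26'), IPv4Network('192.0.2.192/26')]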

    def supernet(self, prefixlen_diff=1, new_prefix=None):
        """The supernet containing the current network.

        Args:
            prefixlen_diff: An integer, the amount the prefix length of
              the network should be decreased by.  For example, given a
              /24 network and a prefixlen_diff of 3, a supernet with a
              /21 netmask is returned.
            new_prefix: The desired new prefix length. This must be a
              smaller number (larger network) than the existing prefix.
              This should not be set if prefixlen_diff is also set.

        Returns:
            An IPv(4|6) network object.

        Raises:
            ValueError: If self.prefixlen - prefixlen_diff < 0 (i.e. you have
              a negative prefix length), if prefixlen_diff and new_prefix are
              both set, or if new_prefix is a larger number than the current
              prefix (a larger number means a smaller network).

        """
        if self._prefixlen == 0:
            return self

        if new_prefix is not None:
            if new_prefix > self._prefixlen:
                raise ValueError('new prefix must be shorter')
            if prefixlen_diff != 1:
                raise ValueError('cannot set prefixlen_diff and new_prefix')
            prefixlen_diff = self._prefixlen - new_prefix

        new_prefixlen = self.prefixlen - prefixlen_diff
        if new_prefixlen < 0:
            raise ValueError(
                'current prefixlen is %d, cannot have a prefixlen_diff of %d' %
                (self.prefixlen, prefixlen_diff))
        return self.__class__((
            int(self.network_address) & (int(self.netmask) << prefixlen_diff),
            new_prefixlen
        ))
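
    # Usage sketch: the supernet's network address is simply the current
    # network address masked down to the shorter prefix, so a /24 lifted by
    # three bits lands on the enclosing /21 boundary:
    #
    #   >>> ip_network('192.0.2.0/24').supernet(prefixlen_diff=3)
    #   IPv4Network('192.0.0.0/21')
    #   >>> ip_network('192.0.2.0/24').supernet(new_prefix=20)
    #   IPv4Network('192.0.0.0/20')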

    @property
    def is_multicast(self):
        """Test if the address is reserved for multicast use.

        Returns:
            A boolean, True if the address is a multicast address.
            See RFC 2373 2.7 for details.

        """
        return (self.network_address.is_multicast and
                self.broadcast_address.is_multicast)

    def subnet_of(self, other):
        # always false if one is v4 and the other is v6.
        if self._version != other._version:
            return False
        # dealing with another network.
        if (hasattr(other, 'network_address') and
                hasattr(other, 'broadcast_address')):
            return (other.network_address <= self.network_address and
                    other.broadcast_address >= self.broadcast_address)
        # dealing with another address
        else:
            raise TypeError('Unable to test subnet containment with element '
                            'of type %s' % type(other))

    def supernet_of(self, other):
        # always false if one is v4 and the other is v6.
        if self._version != other._version:
            return False
        # dealing with another network.
        if (hasattr(other, 'network_address') and
                hasattr(other, 'broadcast_address')):
            return (other.network_address >= self.network_address and
                    other.broadcast_address <= self.broadcast_address)
        # dealing with another address
        else:
            raise TypeError('Unable to test subnet containment with element '
                            'of type %s' % type(other))

    @property
    def is_reserved(self):
        """Test if the address is otherwise IETF reserved.

        Returns:
            A boolean, True if the network falls within one of the
            reserved IPv(4|6) network ranges.

        """
        return (self.network_address.is_reserved and
                self.broadcast_address.is_reserved)

    @property
    def is_link_local(self):
        """Test if the address is reserved for link-local.

        Returns:
            A boolean, True if the address is reserved per RFC 4291.

        """
        return (self.network_address.is_link_local and
                self.broadcast_address.is_link_local)

    @property
    def is_private(self):
        """Test if this address is allocated for private networks.

        Returns:
            A boolean, True if the address is reserved per
            iana-ipv4-special-registry or iana-ipv6-special-registry.

        """
        return (self.network_address.is_private and
                self.broadcast_address.is_private)

    @property
    def is_global(self):
        """Test if this address is allocated for public networks.

        Returns:
            A boolean, True if the address is not reserved per
            iana-ipv4-special-registry or iana-ipv6-special-registry.

        """
        return not self.is_private

    @property
    def is_unspecified(self):
        """Test if the address is unspecified.

        Returns:
            A boolean, True if this is the unspecified address as defined in
            RFC 2373 2.5.2.

        """
        return (self.network_address.is_unspecified and
                self.broadcast_address.is_unspecified)

    @property
    def is_loopback(self):
        """Test if the address is a loopback address.

        Returns:
            A boolean, True if the address is a loopback address as defined in
            RFC 2373 2.5.3.

        """
        return (self.network_address.is_loopback and
                self.broadcast_address.is_loopback)


class _BaseV4(object):

    """Base IPv4 object.

    The following methods are used by IPv4 objects in both single IP
    addresses and networks.

    """

    __slots__ = ()
    _version = 4
    # Equivalent to 255.255.255.255 or 32 bits of 1's.
    _ALL_ONES = (2 ** IPV4LENGTH) - 1
    _DECIMAL_DIGITS = frozenset('0123456789')

    # the valid octets for host and netmasks. only useful for IPv4.
    _valid_mask_octets = frozenset([255, 254, 252, 248, 240, 224, 192, 128, 0])

    _max_prefixlen = IPV4LENGTH
    # There are only a handful of valid v4 netmasks, so we cache them all
    # when constructed (see _make_netmask()).
    _netmask_cache = {}

    def _explode_shorthand_ip_string(self):
        return _compat_str(self)

    @classmethod
    def _make_netmask(cls, arg):
        """Make a (netmask, prefix_len) tuple from the given argument.

        Argument can be:
        - an integer (the prefix length)
        - a string representing the prefix length (e.g. "24")
        - a string representing the prefix netmask (e.g. "255.255.255.0")
        """
        if arg not in cls._netmask_cache:
            if isinstance(arg, _compat_int_types):
                prefixlen = arg
            else:
                try:
                    # Check for a netmask in prefix length form
                    prefixlen = cls._prefix_from_prefix_string(arg)
                except NetmaskValueError:
                    # Check for a netmask or hostmask in dotted-quad form.
                    # This may raise NetmaskValueError.
                    prefixlen = cls._prefix_from_ip_string(arg)
            netmask = IPv4Address(cls._ip_int_from_prefix(prefixlen))
            cls._netmask_cache[arg] = netmask, prefixlen
        return cls._netmask_cache[arg]

    @classmethod
    def _ip_int_from_string(cls, ip_str):
        """Turn the given IP string into an integer for comparison.

        Args:
            ip_str: A string, the IP ip_str.

        Returns:
            The IP ip_str as an integer.

        Raises:
            AddressValueError: if ip_str isn't a valid IPv4 Address.

        """
        if not ip_str:
            raise AddressValueError('Address cannot be empty')

        octets = ip_str.split('.')
        if len(octets) != 4:
            raise AddressValueError("Expected 4 octets in %r" % ip_str)

        try:
            return _compat_int_from_byte_vals(
                map(cls._parse_octet, octets), 'big')
        except ValueError as exc:
            raise AddressValueError("%s in %r" % (exc, ip_str))

    @classmethod
    def _parse_octet(cls, octet_str):
        """Convert a decimal octet into an integer.

        Args:
            octet_str: A string, the number to parse.

        Returns:
            The octet as an integer.

        Raises:
            ValueError: if the octet isn't strictly a decimal from [0..255].

        """
        if not octet_str:
            raise ValueError("Empty octet not permitted")
        # Whitelist the characters, since int() allows a lot of bizarre stuff.
        if not cls._DECIMAL_DIGITS.issuperset(octet_str):
            msg = "Only decimal digits permitted in %r"
            raise ValueError(msg % octet_str)
        # We do the length check second, since the invalid character error
        # is likely to be more informative for the user
        if len(octet_str) > 3:
            msg = "At most 3 characters permitted in %r"
            raise ValueError(msg % octet_str)
        # Convert to integer (we know digits are legal)
        octet_int = int(octet_str, 10)
        # Any octets that look like they *might* be written in octal,
        # and which don't look exactly the same in both octal and
        # decimal are rejected as ambiguous
        if octet_int > 7 and octet_str[0] == '0':
            msg = "Ambiguous (octal/decimal) value in %r not permitted"
            raise ValueError(msg % octet_str)
        if octet_int > 255:
            raise ValueError("Octet %d (> 255) not permitted" % octet_int)
        return octet_int
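
    # Illustrative consequence of the octal/decimal check above: a leading
    # zero is only accepted when the octet reads the same in octal and in
    # decimal, so '007' parses as 7 while '010' and '08' are rejected as
    # ambiguous.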

    @classmethod
    def _string_from_ip_int(cls, ip_int):
        """Turns a 32-bit integer into dotted decimal notation.

        Args:
            ip_int: An integer, the IP address.

        Returns:
            The IP address as a string in dotted decimal notation.

        """
        return '.'.join(_compat_str(struct.unpack(b'!B', b)[0]
                                    if isinstance(b, bytes)
                                    else b)
                        for b in _compat_to_bytes(ip_int, 4, 'big'))

    def _is_hostmask(self, ip_str):
        """Test if the IP string is a hostmask (rather than a netmask).

        Args:
            ip_str: A string, the potential hostmask.

        Returns:
            A boolean, True if the IP string is a hostmask.

        """
        bits = ip_str.split('.')
        try:
            parts = [x for x in map(int, bits) if x in self._valid_mask_octets]
        except ValueError:
            return False
        if len(parts) != len(bits):
            return False
        if parts[0] < parts[-1]:
            return True
        return False

    def _reverse_pointer(self):
        """Return the reverse DNS pointer name for the IPv4 address.

        This implements the method described in RFC1035 3.5.

        """
        reverse_octets = _compat_str(self).split('.')[::-1]
        return '.'.join(reverse_octets) + '.in-addr.arpa'

    @property
    def max_prefixlen(self):
        return self._max_prefixlen

    @property
    def version(self):
        return self._version


class IPv4Address(_BaseV4, _BaseAddress):

    """Represent and manipulate single IPv4 Addresses."""

    __slots__ = ('_ip', '__weakref__')

    def __init__(self, address):

        """
        Args:
            address: A string or integer representing the IP

              Additionally, an integer can be passed, so
              IPv4Address('192.0.2.1') == IPv4Address(3221225985).
              or, more generally
              IPv4Address(int(IPv4Address('192.0.2.1'))) ==
                IPv4Address('192.0.2.1')

        Raises:
            AddressValueError: If ipaddress isn't a valid IPv4 address.

        """
        # Efficient constructor from integer.
        if isinstance(address, _compat_int_types):
            self._check_int_address(address)
            self._ip = address
            return

        # Constructing from a packed address
        if isinstance(address, bytes):
            self._check_packed_address(address, 4)
            bvs = _compat_bytes_to_byte_vals(address)
            self._ip = _compat_int_from_byte_vals(bvs, 'big')
            return

        # Assume input argument to be string or any object representation
        # which converts into a formatted IP string.
        addr_str = _compat_str(address)
        if '/' in addr_str:
            raise AddressValueError("Unexpected '/' in %r" % address)
        self._ip = self._ip_int_from_string(addr_str)

    @property
    def packed(self):
        """The binary representation of this address."""
        return v4_int_to_packed(self._ip)

    @property
    def is_reserved(self):
        """Test if the address is otherwise IETF reserved.

         Returns:
             A boolean, True if the address is within the
             reserved IPv4 Network range.

        """
        return self in self._constants._reserved_network

    @property
    def is_private(self):
        """Test if this address is allocated for private networks.

        Returns:
            A boolean, True if the address is reserved per
            iana-ipv4-special-registry.

        """
        return any(self in net for net in self._constants._private_networks)

    @property
    def is_global(self):
        return (
            self not in self._constants._public_network and
            not self.is_private)
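
    # Usage sketch: the shared address space 100.64.0.0/10 (RFC 6598) is the
    # one range that this pair of properties treats as neither private nor
    # global:
    #
    #   >>> IPv4Address('100.64.1.1').is_private
    #   False
    #   >>> IPv4Address('100.64.1.1').is_global
    #   False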

    @property
    def is_multicast(self):
        """Test if the address is reserved for multicast use.

        Returns:
            A boolean, True if the address is multicast.
            See RFC 3171 for details.

        """
        return self in self._constants._multicast_network

    @property
    def is_unspecified(self):
        """Test if the address is unspecified.

        Returns:
            A boolean, True if this is the unspecified address as defined in
            RFC 5735 3.

        """
        return self == self._constants._unspecified_address

    @property
    def is_loopback(self):
        """Test if the address is a loopback address.

        Returns:
            A boolean, True if the address is a loopback per RFC 3330.

        """
        return self in self._constants._loopback_network

    @property
    def is_link_local(self):
        """Test if the address is reserved for link-local.

        Returns:
            A boolean, True if the address is link-local per RFC 3927.

        """
        return self in self._constants._linklocal_network


class IPv4Interface(IPv4Address):

    def __init__(self, address):
        if isinstance(address, (bytes, _compat_int_types)):
            IPv4Address.__init__(self, address)
            self.network = IPv4Network(self._ip)
            self._prefixlen = self._max_prefixlen
            return

        if isinstance(address, tuple):
            IPv4Address.__init__(self, address[0])
            if len(address) > 1:
                self._prefixlen = int(address[1])
            else:
                self._prefixlen = self._max_prefixlen

            self.network = IPv4Network(address, strict=False)
            self.netmask = self.network.netmask
            self.hostmask = self.network.hostmask
            return

        addr = _split_optional_netmask(address)
        IPv4Address.__init__(self, addr[0])

        self.network = IPv4Network(address, strict=False)
        self._prefixlen = self.network._prefixlen

        self.netmask = self.network.netmask
        self.hostmask = self.network.hostmask

    def __str__(self):
        return '%s/%d' % (self._string_from_ip_int(self._ip),
                          self.network.prefixlen)

    def __eq__(self, other):
        address_equal = IPv4Address.__eq__(self, other)
        if not address_equal or address_equal is NotImplemented:
            return address_equal
        try:
            return self.network == other.network
        except AttributeError:
            # An interface with an associated network is NOT the
            # same as an unassociated address. That's why the hash
            # takes the extra info into account.
            return False

    def __lt__(self, other):
        address_less = IPv4Address.__lt__(self, other)
        if address_less is NotImplemented:
            return NotImplemented
        try:
            return self.network < other.network
        except AttributeError:
            # We *do* allow addresses and interfaces to be sorted. The
            # unassociated address is considered less than all interfaces.
            return False

    def __hash__(self):
        return self._ip ^ self._prefixlen ^ int(self.network.network_address)

    __reduce__ = _IPAddressBase.__reduce__

    @property
    def ip(self):
        return IPv4Address(self._ip)

    @property
    def with_prefixlen(self):
        return '%s/%s' % (self._string_from_ip_int(self._ip),
                          self._prefixlen)

    @property
    def with_netmask(self):
        return '%s/%s' % (self._string_from_ip_int(self._ip),
                          self.netmask)

    @property
    def with_hostmask(self):
        return '%s/%s' % (self._string_from_ip_int(self._ip),
                          self.hostmask)
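
    # Usage sketch: an interface couples a host address with its containing
    # network, and the with_* properties render the three common textual
    # forms of that pairing:
    #
    #   >>> iface = IPv4Interface('192.0.2.5/24')
    #   >>> iface.ip, iface.network
    #   (IPv4Address('192.0.2.5'), IPv4Network('192.0.2.0/24'))
    #   >>> iface.with_prefixlen, iface.with_netmask, iface.with_hostmask
    #   ('192.0.2.5/24', '192.0.2.5/255.255.255.0', '192.0.2.5/0.0.0.255')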


class IPv4Network(_BaseV4, _BaseNetwork):

    """This class represents and manipulates 32-bit IPv4 network + addresses..

    Attributes: [examples for IPv4Network('192.0.2.0/27')]
        .network_address: IPv4Address('192.0.2.0')
        .hostmask: IPv4Address('0.0.0.31')
        .broadcast_address: IPv4Address('192.0.2.31')
        .netmask: IPv4Address('255.255.255.224')
        .prefixlen: 27

    """
    # Class to use when creating address objects
    _address_class = IPv4Address

    def __init__(self, address, strict=True):

        """Instantiate a new IPv4 network object.

        Args:
            address: A string or integer representing the IP [& network].
              '192.0.2.0/24'
              '192.0.2.0/255.255.255.0'
              '192.0.0.2/0.0.0.255'
              are all functionally the same in IPv4. Similarly,
              '192.0.2.1'
              '192.0.2.1/255.255.255.255'
              '192.0.2.1/32'
              are also functionally equivalent. That is to say, failing to
              provide a subnetmask will create an object with a mask of /32.

              If the mask (portion after the / in the argument) is given in
              dotted quad form, it is treated as a netmask if it starts with a
              non-zero field (e.g. /255.0.0.0 == /8) and as a hostmask if it
              starts with a zero field (e.g. 0.255.255.255 == /8), with the
              single exception of an all-zero mask which is treated as a
              netmask == /0. If no mask is given, a default of /32 is used.

              Additionally, an integer can be passed, so
              IPv4Network('192.0.2.1') == IPv4Network(3221225985)
              or, more generally
              IPv4Interface(int(IPv4Interface('192.0.2.1'))) ==
                IPv4Interface('192.0.2.1')

        Raises:
            AddressValueError: If ipaddress isn't a valid IPv4 address.
            NetmaskValueError: If the netmask isn't valid for
              an IPv4 address.
            ValueError: If strict is True and a network address is not
              supplied.

        """
        _BaseNetwork.__init__(self, address)

        # Constructing from a packed address or integer
        if isinstance(address, (_compat_int_types, bytes)):
            self.network_address = IPv4Address(address)
            self.netmask, self._prefixlen = self._make_netmask(
                self._max_prefixlen)
            # fixme: address/network test here.
            return

        if isinstance(address, tuple):
            if len(address) > 1:
                arg = address[1]
            else:
                # We weren't given an address[1]
                arg = self._max_prefixlen
            self.network_address = IPv4Address(address[0])
            self.netmask, self._prefixlen = self._make_netmask(arg)
            packed = int(self.network_address)
            if packed & int(self.netmask) != packed:
                if strict:
                    raise ValueError('%s has host bits set' % self)
                else:
                    self.network_address = IPv4Address(packed &
                                                       int(self.netmask))
            return

        # Assume input argument to be string or any object representation
        # which converts into a formatted IP prefix string.
        addr = _split_optional_netmask(address)
        self.network_address = IPv4Address(self._ip_int_from_string(addr[0]))

        if len(addr) == 2:
            arg = addr[1]
        else:
            arg = self._max_prefixlen
        self.netmask, self._prefixlen = self._make_netmask(arg)

        if strict:
            if (IPv4Address(int(self.network_address) & int(self.netmask)) !=
                    self.network_address):
                raise ValueError('%s has host bits set' % self)
        self.network_address = IPv4Address(int(self.network_address) &
                                           int(self.netmask))

        if self._prefixlen == (self._max_prefixlen - 1):
            self.hosts = self.__iter__
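
    # Usage sketch: with the default strict=True, a value with host bits set
    # is rejected, while strict=False masks them away; a hostmask suffix is
    # accepted as described in the docstring above:
    #
    #   >>> IPv4Network('192.0.2.1/24', strict=False)
    #   IPv4Network('192.0.2.0/24')
    #   >>> IPv4Network('192.0.2.0/0.0.0.255')
    #   IPv4Network('192.0.2.0/24')
    #
    # whereas IPv4Network('192.0.2.1/24') raises
    # ValueError('192.0.2.1/24 has host bits set').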

    @property
    def is_global(self):
        """Test if this address is allocated for public networks.

        Returns:
            A boolean, True if the address is not reserved per
            iana-ipv4-special-registry.

        """
        return (not (self.network_address in IPv4Network('100.64.0.0/10') and
                self.broadcast_address in IPv4Network('100.64.0.0/10')) and
                not self.is_private)


class _IPv4Constants(object):

    _linklocal_network = IPv4Network('169.254.0.0/16')

    _loopback_network = IPv4Network('127.0.0.0/8')

    _multicast_network = IPv4Network('224.0.0.0/4')

    _public_network = IPv4Network('100.64.0.0/10')

    _private_networks = [
        IPv4Network('0.0.0.0/8'),
        IPv4Network('10.0.0.0/8'),
        IPv4Network('127.0.0.0/8'),
        IPv4Network('169.254.0.0/16'),
        IPv4Network('172.16.0.0/12'),
        IPv4Network('192.0.0.0/29'),
        IPv4Network('192.0.0.170/31'),
        IPv4Network('192.0.2.0/24'),
        IPv4Network('192.168.0.0/16'),
        IPv4Network('198.18.0.0/15'),
        IPv4Network('198.51.100.0/24'),
        IPv4Network('203.0.113.0/24'),
        IPv4Network('240.0.0.0/4'),
        IPv4Network('255.255.255.255/32'),
    ]

    _reserved_network = IPv4Network('240.0.0.0/4')

    _unspecified_address = IPv4Address('0.0.0.0')


IPv4Address._constants = _IPv4Constants


class _BaseV6(object):

    """Base IPv6 object.

    The following methods are used by IPv6 objects in both single IP
    addresses and networks.

    """

    __slots__ = ()
    _version = 6
    _ALL_ONES = (2 ** IPV6LENGTH) - 1
    _HEXTET_COUNT = 8
    _HEX_DIGITS = frozenset('0123456789ABCDEFabcdef')
    _max_prefixlen = IPV6LENGTH

    # There are only a bunch of valid v6 netmasks, so we cache them all
    # when constructed (see _make_netmask()).
    _netmask_cache = {}

    @classmethod
    def _make_netmask(cls, arg):
        """Make a (netmask, prefix_len) tuple from the given argument.

        Argument can be:
        - an integer (the prefix length)
        - a string representing the prefix length (e.g. "24")
        - a string representing the prefix netmask (e.g. "255.255.255.0")
        """
        if arg not in cls._netmask_cache:
            if isinstance(arg, _compat_int_types):
                prefixlen = arg
            else:
                prefixlen = cls._prefix_from_prefix_string(arg)
            netmask = IPv6Address(cls._ip_int_from_prefix(prefixlen))
            cls._netmask_cache[arg] = netmask, prefixlen
        return cls._netmask_cache[arg]

    @classmethod
    def _ip_int_from_string(cls, ip_str):
        """Turn an IPv6 ip_str into an integer.

        Args:
            ip_str: A string, the IPv6 ip_str.

        Returns:
            An int, the IPv6 address

        Raises:
            AddressValueError: if ip_str isn't a valid IPv6 Address.

        """
        if not ip_str:
            raise AddressValueError('Address cannot be empty')

        parts = ip_str.split(':')

        # An IPv6 address needs at least 2 colons (3 parts).
        _min_parts = 3
        if len(parts) < _min_parts:
            msg = "At least %d parts expected in %r" % (_min_parts, ip_str)
            raise AddressValueError(msg)

        # If the address has an IPv4-style suffix, convert it to hexadecimal.
        if '.' in parts[-1]:
            try:
                ipv4_int = IPv4Address(parts.pop())._ip
            except AddressValueError as exc:
                raise AddressValueError("%s in %r" % (exc, ip_str))
            parts.append('%x' % ((ipv4_int >> 16) & 0xFFFF))
            parts.append('%x' % (ipv4_int & 0xFFFF))

        # An IPv6 address can't have more than 8 colons (9 parts).
        # The extra colon comes from using the "::" notation for a single
        # leading or trailing zero part.
        _max_parts = cls._HEXTET_COUNT + 1
        if len(parts) > _max_parts:
            msg = "At most %d colons permitted in %r" % (
                _max_parts - 1, ip_str)
            raise AddressValueError(msg)

        # Disregarding the endpoints, find '::' with nothing in between.
        # This indicates that a run of zeroes has been skipped.
        skip_index = None
        for i in _compat_range(1, len(parts) - 1):
            if not parts[i]:
                if skip_index is not None:
                    # Can't have more than one '::'
                    msg = "At most one '::' permitted in %r" % ip_str
                    raise AddressValueError(msg)
                skip_index = i

        # parts_hi is the number of parts to copy from above/before the '::'
        # parts_lo is the number of parts to copy from below/after the '::'
        if skip_index is not None:
            # If we found a '::', then check if it also covers the endpoints.
            parts_hi = skip_index
            parts_lo = len(parts) - skip_index - 1
            if not parts[0]:
                parts_hi -= 1
                if parts_hi:
                    msg = "Leading ':' only permitted as part of '::' in %r"
                    raise AddressValueError(msg % ip_str)  # ^: requires ^::
            if not parts[-1]:
                parts_lo -= 1
                if parts_lo:
                    msg = "Trailing ':' only permitted as part of '::' in %r"
                    raise AddressValueError(msg % ip_str)  # :$ requires ::$
            parts_skipped = cls._HEXTET_COUNT - (parts_hi + parts_lo)
            if parts_skipped < 1:
                msg = "Expected at most %d other parts with '::' in %r"
                raise AddressValueError(msg % (cls._HEXTET_COUNT - 1, ip_str))
        else:
            # Otherwise, allocate the entire address to parts_hi.  The
            # endpoints could still be empty, but _parse_hextet() will check
            # for that.
            if len(parts) != cls._HEXTET_COUNT:
                msg = "Exactly %d parts expected without '::' in %r"
                raise AddressValueError(msg % (cls._HEXTET_COUNT, ip_str))
            if not parts[0]:
                msg = "Leading ':' only permitted as part of '::' in %r"
                raise AddressValueError(msg % ip_str)  # ^: requires ^::
            if not parts[-1]:
                msg = "Trailing ':' only permitted as part of '::' in %r"
                raise AddressValueError(msg % ip_str)  # :$ requires ::$
            parts_hi = len(parts)
            parts_lo = 0
            parts_skipped = 0

        try:
            # Now, parse the hextets into a 128-bit integer.
            ip_int = 0
            for i in range(parts_hi):
                ip_int <<= 16
                ip_int |= cls._parse_hextet(parts[i])
            ip_int <<= 16 * parts_skipped
            for i in range(-parts_lo, 0):
                ip_int <<= 16
                ip_int |= cls._parse_hextet(parts[i])
            return ip_int
        except ValueError as exc:
            raise AddressValueError("%s in %r" % (exc, ip_str))
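
    # Illustrative trace of the '::' handling above: for '2001:db8::8' the
    # split gives ['2001', 'db8', '', '8'], so skip_index is 2, parts_hi is 2,
    # parts_lo is 1 and parts_skipped is 5, i.e. five zero hextets are
    # inserted between 'db8' and '8' when the 128-bit integer is assembled.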

    @classmethod
    def _parse_hextet(cls, hextet_str):
        """Convert an IPv6 hextet string into an integer.

        Args:
            hextet_str: A string, the number to parse.

        Returns:
            The hextet as an integer.

        Raises:
            ValueError: if the input isn't strictly a hex number from
              [0..FFFF].

        """
        # Whitelist the characters, since int() allows a lot of bizarre stuff.
        if not cls._HEX_DIGITS.issuperset(hextet_str):
            raise ValueError("Only hex digits permitted in %r" % hextet_str)
        # We do the length check second, since the invalid character error
        # is likely to be more informative for the user
        if len(hextet_str) > 4:
            msg = "At most 4 characters permitted in %r"
            raise ValueError(msg % hextet_str)
        # Length check means we can skip checking the integer value
        return int(hextet_str, 16)

    @classmethod
    def _compress_hextets(cls, hextets):
        """Compresses a list of hextets.

        Compresses a list of strings, replacing the longest continuous
        sequence of "0" in the list with "" and adding empty strings at
        the beginning or at the end of the string such that subsequently
        calling ":".join(hextets) will produce the compressed version of
        the IPv6 address.

        Args:
            hextets: A list of strings, the hextets to compress.

        Returns:
            A list of strings.

        """
        best_doublecolon_start = -1
        best_doublecolon_len = 0
        doublecolon_start = -1
        doublecolon_len = 0
        for index, hextet in enumerate(hextets):
            if hextet == '0':
                doublecolon_len += 1
                if doublecolon_start == -1:
                    # Start of a sequence of zeros.
                    doublecolon_start = index
                if doublecolon_len > best_doublecolon_len:
                    # This is the longest sequence of zeros so far.
                    best_doublecolon_len = doublecolon_len
                    best_doublecolon_start = doublecolon_start
            else:
                doublecolon_len = 0
                doublecolon_start = -1

        if best_doublecolon_len > 1:
            best_doublecolon_end = (best_doublecolon_start +
                                    best_doublecolon_len)
            # For zeros at the end of the address.
            if best_doublecolon_end == len(hextets):
                hextets += ['']
            hextets[best_doublecolon_start:best_doublecolon_end] = ['']
            # For zeros at the beginning of the address.
            if best_doublecolon_start == 0:
                hextets = [''] + hextets

        return hextets
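
    # Illustrative trace: for 2001:db8::1 the uncompressed hextet list has a
    # five-zero run, which is collapsed to a single empty string so that
    # ':'.join() produces the '::' shorthand:
    #
    #   >>> _BaseV6._compress_hextets(
    #   ...     ['2001', 'db8', '0', '0', '0', '0', '0', '1'])
    #   ['2001', 'db8', '', '1']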

    @classmethod
    def _string_from_ip_int(cls, ip_int=None):
        """Turns a 128-bit integer into hexadecimal notation.

        Args:
            ip_int: An integer, the IP address.

        Returns:
            A string, the hexadecimal representation of the address.

        Raises:
            ValueError: The address is bigger than 128 bits of all ones.

        """
        if ip_int is None:
            ip_int = int(cls._ip)

        if ip_int > cls._ALL_ONES:
            raise ValueError('IPv6 address is too large')

        hex_str = '%032x' % ip_int
        hextets = ['%x' % int(hex_str[x:x + 4], 16) for x in range(0, 32, 4)]

        hextets = cls._compress_hextets(hextets)
        return ':'.join(hextets)

    def _explode_shorthand_ip_string(self):
        """Expand a shortened IPv6 address.

        Args:
            ip_str: A string, the IPv6 address.

        Returns:
            A string, the expanded IPv6 address.

        """
        if isinstance(self, IPv6Network):
            ip_str = _compat_str(self.network_address)
        elif isinstance(self, IPv6Interface):
            ip_str = _compat_str(self.ip)
        else:
            ip_str = _compat_str(self)

        ip_int = self._ip_int_from_string(ip_str)
        hex_str = '%032x' % ip_int
        parts = [hex_str[x:x + 4] for x in range(0, 32, 4)]
        if isinstance(self, (_BaseNetwork, IPv6Interface)):
            return '%s/%d' % (':'.join(parts), self._prefixlen)
        return ':'.join(parts)

    def _reverse_pointer(self):
        """Return the reverse DNS pointer name for the IPv6 address.

        This implements the method described in RFC3596 2.5.

        """
        reverse_chars = self.exploded[::-1].replace(':', '')
        return '.'.join(reverse_chars) + '.ip6.arpa'

    @property
    def max_prefixlen(self):
        return self._max_prefixlen

    @property
    def version(self):
        return self._version


class IPv6Address(_BaseV6, _BaseAddress):

    """Represent and manipulate single IPv6 Addresses."""

    __slots__ = ('_ip', '__weakref__')

    def __init__(self, address):
        """Instantiate a new IPv6 address object.

        Args:
            address: A string or integer representing the IP

              Additionally, an integer can be passed, so
              IPv6Address('2001:db8::') ==
                IPv6Address(42540766411282592856903984951653826560)
              or, more generally
              IPv6Address(int(IPv6Address('2001:db8::'))) ==
                IPv6Address('2001:db8::')

        Raises:
            AddressValueError: If address isn't a valid IPv6 address.

        """
        # Efficient constructor from integer.
        if isinstance(address, _compat_int_types):
            self._check_int_address(address)
            self._ip = address
            return

        # Constructing from a packed address
        if isinstance(address, bytes):
            self._check_packed_address(address, 16)
            bvs = _compat_bytes_to_byte_vals(address)
            self._ip = _compat_int_from_byte_vals(bvs, 'big')
            return

        # Assume input argument to be string or any object representation
        # which converts into a formatted IP string.
        addr_str = _compat_str(address)
        if '/' in addr_str:
            raise AddressValueError("Unexpected '/' in %r" % address)
        self._ip = self._ip_int_from_string(addr_str)

    @property
    def packed(self):
        """The binary representation of this address."""
        return v6_int_to_packed(self._ip)

    @property
    def is_multicast(self):
        """Test if the address is reserved for multicast use.

        Returns:
            A boolean, True if the address is a multicast address.
            See RFC 2373 2.7 for details.

        """
        return self in self._constants._multicast_network

    @property
    def is_reserved(self):
        """Test if the address is otherwise IETF reserved.

        Returns:
            A boolean, True if the address is within one of the
            reserved IPv6 Network ranges.

        """
        return any(self in x for x in self._constants._reserved_networks)

    @property
    def is_link_local(self):
        """Test if the address is reserved for link-local.

        Returns:
            A boolean, True if the address is reserved per RFC 4291.

        """
        return self in self._constants._linklocal_network

    @property
    def is_site_local(self):
        """Test if the address is reserved for site-local.

        Note that the site-local address space has been deprecated by RFC 3879.
        Use is_private to test if this address is in the space of unique local
        addresses as defined by RFC 4193.

        Returns:
            A boolean, True if the address is reserved per RFC 3513 2.5.6.

        """
        return self in self._constants._sitelocal_network

    @property
    def is_private(self):
        """Test if this address is allocated for private networks.

        Returns:
            A boolean, True if the address is reserved per
            iana-ipv6-special-registry.

        """
        return any(self in net for net in self._constants._private_networks)

    @property
    def is_global(self):
        """Test if this address is allocated for public networks.

        Returns:
            A boolean, true if the address is not reserved per
            iana-ipv6-special-registry.

        """
        return not self.is_private

    @property
    def is_unspecified(self):
        """Test if the address is unspecified.

        Returns:
            A boolean, True if this is the unspecified address as defined in
            RFC 2373 2.5.2.

        """
        return self._ip == 0

    @property
    def is_loopback(self):
        """Test if the address is a loopback address.

        Returns:
            A boolean, True if the address is a loopback address as defined in
            RFC 2373 2.5.3.

        """
        return self._ip == 1

    @property
    def ipv4_mapped(self):
        """Return the IPv4 mapped address.

        Returns:
            If the IPv6 address is a v4 mapped address, return the
            IPv4 mapped address. Return None otherwise.

        """
        if (self._ip >> 32) != 0xFFFF:
            return None
        return IPv4Address(self._ip & 0xFFFFFFFF)

    @property
    def teredo(self):
        """Tuple of embedded teredo IPs.

        Returns:
            Tuple of the (server, client) IPs or None if the address
            doesn't appear to be a teredo address (doesn't start with
            2001::/32)

        """
        if (self._ip >> 96) != 0x20010000:
            return None
        return (IPv4Address((self._ip >> 64) & 0xFFFFFFFF),
                IPv4Address(~self._ip & 0xFFFFFFFF))

    @property
    def sixtofour(self):
        """Return the IPv4 6to4 embedded address.

        Returns:
            The IPv4 6to4-embedded address if present or None if the
            address doesn't appear to contain a 6to4 embedded address.

        """
        if (self._ip >> 112) != 0x2002:
            return None
        return IPv4Address((self._ip >> 80) & 0xFFFFFFFF)
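
    # Usage sketch for the three "embedded IPv4" views above; the values
    # follow directly from the bit layouts each property checks:
    #
    #   >>> IPv6Address('::ffff:192.0.2.1').ipv4_mapped
    #   IPv4Address('192.0.2.1')
    #   >>> IPv6Address('2002:c000:204::').sixtofour
    #   IPv4Address('192.0.2.4')
    #   >>> IPv6Address('2001:db8::1').teredo is None
    #   True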


class IPv6Interface(IPv6Address):

    def __init__(self, address):
        if isinstance(address, (bytes, _compat_int_types)):
            IPv6Address.__init__(self, address)
            self.network = IPv6Network(self._ip)
            self._prefixlen = self._max_prefixlen
            return
        if isinstance(address, tuple):
            IPv6Address.__init__(self, address[0])
            if len(address) > 1:
                self._prefixlen = int(address[1])
            else:
                self._prefixlen = self._max_prefixlen
            self.network = IPv6Network(address, strict=False)
            self.netmask = self.network.netmask
            self.hostmask = self.network.hostmask
            return

        addr = _split_optional_netmask(address)
        IPv6Address.__init__(self, addr[0])
        self.network = IPv6Network(address, strict=False)
        self.netmask = self.network.netmask
        self._prefixlen = self.network._prefixlen
        self.hostmask = self.network.hostmask

    def __str__(self):
        return '%s/%d' % (self._string_from_ip_int(self._ip),
                          self.network.prefixlen)

    def __eq__(self, other):
        address_equal = IPv6Address.__eq__(self, other)
        if not address_equal or address_equal is NotImplemented:
            return address_equal
        try:
            return self.network == other.network
        except AttributeError:
            # An interface with an associated network is NOT the
            # same as an unassociated address. That's why the hash
            # takes the extra info into account.
            return False

    def __lt__(self, other):
        address_less = IPv6Address.__lt__(self, other)
        if address_less is NotImplemented:
            return NotImplemented
        try:
            return self.network < other.network
        except AttributeError:
            # We *do* allow addresses and interfaces to be sorted. The
            # unassociated address is considered less than all interfaces.
            return False

    def __hash__(self):
        return self._ip ^ self._prefixlen ^ int(self.network.network_address)

    __reduce__ = _IPAddressBase.__reduce__

    @property
    def ip(self):
        return IPv6Address(self._ip)

    @property
    def with_prefixlen(self):
        return '%s/%s' % (self._string_from_ip_int(self._ip),
                          self._prefixlen)

    @property
    def with_netmask(self):
        return '%s/%s' % (self._string_from_ip_int(self._ip),
                          self.netmask)

    @property
    def with_hostmask(self):
        return '%s/%s' % (self._string_from_ip_int(self._ip),
                          self.hostmask)

    @property
    def is_unspecified(self):
        return self._ip == 0 and self.network.is_unspecified

    @property
    def is_loopback(self):
        return self._ip == 1 and self.network.is_loopback


class IPv6Network(_BaseV6, _BaseNetwork):

    """This class represents and manipulates 128-bit IPv6 networks.

    Attributes: [examples for IPv6Network('2001:db8::1000/124')]
        .network_address: IPv6Address('2001:db8::1000')
        .hostmask: IPv6Address('::f')
        .broadcast_address: IPv6Address('2001:db8::100f')
        .netmask: IPv6Address('ffff:ffff:ffff:ffff:ffff:ffff:ffff:fff0')
        .prefixlen: 124

    """

    # Class to use when creating address objects
    _address_class = IPv6Address

    def __init__(self, address, strict=True):
        """Instantiate a new IPv6 Network object.

        Args:
            address: A string or integer representing the IPv6 network or the
              IP and prefix/netmask.
              '2001:db8::/128'
              '2001:db8:0000:0000:0000:0000:0000:0000/128'
              '2001:db8::'
              are all functionally the same in IPv6.  That is to say,
              failing to provide a subnetmask will create an object with
              a mask of /128.

              Additionally, an integer can be passed, so
              IPv6Network('2001:db8::') ==
                IPv6Network(42540766411282592856903984951653826560)
              or, more generally
              IPv6Network(int(IPv6Network('2001:db8::'))) ==
                IPv6Network('2001:db8::')

            strict: A boolean. If true, ensure that we have been passed
              a true network address, e.g. 2001:db8::1000/124 and not an
              IP address on a network, e.g. 2001:db8::1/124.

        Raises:
            AddressValueError: If address isn't a valid IPv6 address.
            NetmaskValueError: If the netmask isn't valid for
              an IPv6 address.
            ValueError: If strict was True and a network address was not
              supplied.

        """
        _BaseNetwork.__init__(self, address)

        # Efficient constructor from integer or packed address
        if isinstance(address, (bytes, _compat_int_types)):
            self.network_address = IPv6Address(address)
            self.netmask, self._prefixlen = self._make_netmask(
                self._max_prefixlen)
            return

        if isinstance(address, tuple):
            if len(address) > 1:
                arg = address[1]
            else:
                arg = self._max_prefixlen
            self.netmask, self._prefixlen = self._make_netmask(arg)
            self.network_address = IPv6Address(address[0])
            packed = int(self.network_address)
            if packed & int(self.netmask) != packed:
                if strict:
                    raise ValueError('%s has host bits set' % self)
                else:
                    self.network_address = IPv6Address(packed &
                                                       int(self.netmask))
            return

        # Assume input argument to be string or any object representation
        # which converts into a formatted IP prefix string.
        addr = _split_optional_netmask(address)

        self.network_address = IPv6Address(self._ip_int_from_string(addr[0]))

        if len(addr) == 2:
            arg = addr[1]
        else:
            arg = self._max_prefixlen
        self.netmask, self._prefixlen = self._make_netmask(arg)

        if strict:
            if (IPv6Address(int(self.network_address) & int(self.netmask)) !=
                    self.network_address):
                raise ValueError('%s has host bits set' % self)
        self.network_address = IPv6Address(int(self.network_address) &
                                           int(self.netmask))

        if self._prefixlen == (self._max_prefixlen - 1):
            self.hosts = self.__iter__

    def hosts(self):
        """Generate Iterator over usable hosts in a network.

          This is like __iter__ except it doesn't return the
          Subnet-Router anycast address.

        """
        network = int(self.network_address)
        broadcast = int(self.broadcast_address)
        for x in _compat_range(network + 1, broadcast + 1):
            yield self._address_class(x)
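
    # Usage sketch (assuming the module-level ip_network() helper): unlike
    # plain iteration, hosts() skips the all-zeroes Subnet-Router anycast
    # address (the network address itself):
    #
    #   >>> list(ip_network('2001:db8::/126').hosts())
    #   [IPv6Address('2001:db8::1'), IPv6Address('2001:db8::2'),
    #    IPv6Address('2001:db8::3')]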

    @property
    def is_site_local(self):
        """Test if the address is reserved for site-local.

        Note that the site-local address space has been deprecated by RFC 3879.
        Use is_private to test if this address is in the space of unique local
        addresses as defined by RFC 4193.

        Returns:
            A boolean, True if the address is reserved per RFC 3513 2.5.6.

        """
        return (self.network_address.is_site_local and
                self.broadcast_address.is_site_local)


class _IPv6Constants(object):

    _linklocal_network = IPv6Network('fe80::/10')

    _multicast_network = IPv6Network('ff00::/8')

    _private_networks = [
        IPv6Network('::1/128'),
        IPv6Network('::/128'),
        IPv6Network('::ffff:0:0/96'),
        IPv6Network('100::/64'),
        IPv6Network('2001::/23'),
        IPv6Network('2001:2::/48'),
        IPv6Network('2001:db8::/32'),
        IPv6Network('2001:10::/28'),
        IPv6Network('fc00::/7'),
        IPv6Network('fe80::/10'),
    ]

    _reserved_networks = [
        IPv6Network('::/8'), IPv6Network('100::/8'),
        IPv6Network('200::/7'), IPv6Network('400::/6'),
        IPv6Network('800::/5'), IPv6Network('1000::/4'),
        IPv6Network('4000::/3'), IPv6Network('6000::/3'),
        IPv6Network('8000::/3'), IPv6Network('A000::/3'),
        IPv6Network('C000::/3'), IPv6Network('E000::/4'),
        IPv6Network('F000::/5'), IPv6Network('F800::/6'),
        IPv6Network('FE00::/9'),
    ]

    _sitelocal_network = IPv6Network('fec0::/10')


IPv6Address._constants = _IPv6Constants
site-packages/pip/_vendor/__pycache__/retrying.cpython-36.pyc
[Binary archive member: CPython 3.6 bytecode compiled from pip/_vendor/retrying.py -- the retry() decorator plus the Retrying, Attempt and RetryError classes. The compiled payload is not representable as text.]

site-packages/pip/_vendor/__pycache__/distro.cpython-36.pyc
[Binary archive member: CPython 3.6 bytecode compiled from pip/_vendor/distro.py. Only the human-readable strings embedded in the bytecode survive below, beginning with the module docstring.]
The ``distro`` package (``distro`` stands for Linux Distribution) provides
information about the Linux distribution it runs on, such as a reliable
machine-readable distro ID, or version information.

It is a renewed alternative implementation for Python's original
:py:func:`platform.linux_distribution` function, but it provides much more
functionality. An alternative implementation became necessary because Python
3.5 deprecated this function, and Python 3.7 is expected to remove it
altogether. Its predecessor function :py:func:`platform.dist` was already
deprecated since Python 2.6 and is also expected to be removed in Python 3.7.
Still, there are many cases in which access to Linux distribution information
is needed. See `Python issue 1322 <https://bugs.python.org/issue1322>`_ for
more information.
�N�linuxzUnsupported platform: {0}z/etcz
os-releaseZoracleZrhel)ZenterpriseenterpriseZredhatenterpriseworkstationZredhatzA(?:[^)]*\)(.*)\()? *(?:STL )?([\d.+\-a-z]*\d) *(?:esaeler *)?(.+)z(\w+)[-_](release|version)$Zdebian_versionzlsb-releasezoem-releasezsystem-releaseTcCs
tj|�S)a$
    Return information about the current Linux distribution as a tuple
    ``(id_name, version, codename)`` with items as follows:

    * ``id_name``:  If *full_distribution_name* is false, the result of
      :func:`distro.id`. Otherwise, the result of :func:`distro.name`.

    * ``version``:  The result of :func:`distro.version`.

    * ``codename``:  The result of :func:`distro.codename`.

    The interface of this function is compatible with the original
    :py:func:`platform.linux_distribution` function, supporting a subset of
    its parameters.

    The data it returns may not exactly be the same, because it uses more data
    sources than the original function, and that may lead to different data if
    the Linux distribution is not consistent across multiple data sources it
    provides (there are indeed such distributions ...).

    Another reason for differences is the fact that the :func:`distro.id`
    method normalizes the distro ID string to a reliable machine-readable value
    for a number of popular Linux distributions.
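    A minimal usage sketch of the accessor functions described above
    (illustrative only; it assumes the ``distro`` package, vendored here as
    ``pip._vendor.distro``, is importable on a Linux host, and the printed
    values are examples rather than guaranteed output)::

        from pip._vendor import distro   # or simply: import distro

        # Tuple compatible with the deprecated platform.linux_distribution()
        print(distro.linux_distribution(full_distribution_name=False))
        # e.g. ('centos', '7.1.1503', 'Core')

        print(distro.id())                 # e.g. 'centos'
        print(distro.name(pretty=True))    # e.g. 'CentOS Linux 7.1.1503 (Core)'
        print(distro.version(best=True))   # e.g. '7.1.1503'
        print(distro.info())               # dict with id, version, version_parts, like, codename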
    )�_distro�linux_distribution)�full_distribution_name�r�/usr/lib/python3.6/distro.pyr`srcCstj�S)a�

    Return the distro ID of the current Linux distribution, as a
    machine-readable string.

    For a number of Linux distributions, the returned distro ID value is
    *reliable*, in the sense that it is documented and that it does not change
    across releases of the distribution.

    This package maintains the following reliable distro ID values:

    ==============  =========================================
    Distro ID       Distribution
    ==============  =========================================
    "ubuntu"        Ubuntu
    "debian"        Debian
    "rhel"          RedHat Enterprise Linux
    "centos"        CentOS
    "rocky"         Rocky Linux
    "fedora"        Fedora
    "sles"          SUSE Linux Enterprise Server
    "opensuse"      openSUSE
    "amazon"        Amazon Linux
    "arch"          Arch Linux
    "cloudlinux"    CloudLinux OS
    "exherbo"       Exherbo Linux
    "gentoo"        GenToo Linux
    "ibm_powerkvm"  IBM PowerKVM
    "kvmibm"        KVM for IBM z Systems
    "linuxmint"     Linux Mint
    "mageia"        Mageia
    "mandriva"      Mandriva Linux
    "parallels"     Parallels
    "pidora"        Pidora
    "raspbian"      Raspbian
    "oracle"        Oracle Linux (and Oracle Enterprise Linux)
    "scientific"    Scientific Linux
    "slackware"     Slackware
    "xenserver"     XenServer
    ==============  =========================================

    If you have a need to get distros for reliable IDs added into this set,
    or if you find that the :func:`distro.id` function returns a different
    distro ID for one of the listed distros, please create an issue in the
    `distro issue tracker`_.

    **Lookup hierarchy and transformations:**

    First, the ID is obtained from the following sources, in the specified
    order. The first available and non-empty value is used:

    * the value of the "ID" attribute of the os-release file,

    * the value of the "Distributor ID" attribute returned by the lsb_release
      command,

    * the first part of the file name of the distro release file,

    The so determined ID value then passes the following transformations,
    before it is returned by this method:

    * it is translated to lower case,

    * blanks (which should not be there anyway) are translated to underscores,

    * a normalization of the ID is performed, based upon
      `normalization tables`_. The purpose of this normalization is to ensure
      that the ID is as reliable as possible, even across incompatible changes
      in the Linux distributions. A common reason for an incompatible change is
      the addition of an os-release file, or the addition of the lsb_release
      command, with ID values that differ from what was previously determined
      from the distro release file name.
    )r�idrrrrr|sIrFcCs
tj|�S)an
    Return the name of the current Linux distribution, as a human-readable
    string.

    If *pretty* is false, the name is returned without version or codename.
    (e.g. "CentOS Linux")

    If *pretty* is true, the version and codename are appended.
    (e.g. "CentOS Linux 7.1.1503 (Core)")

    **Lookup hierarchy:**

    The name is obtained from the following sources, in the specified order.
    The first available and non-empty value is used:

    * If *pretty* is false:

      - the value of the "NAME" attribute of the os-release file,

      - the value of the "Distributor ID" attribute returned by the lsb_release
        command,

      - the value of the "<name>" field of the distro release file.

    * If *pretty* is true:

      - the value of the "PRETTY_NAME" attribute of the os-release file,

      - the value of the "Description" attribute returned by the lsb_release
        command,

      - the value of the "<name>" field of the distro release file, appended
        with the value of the pretty version ("<version_id>" and "<codename>"
        fields) of the distro release file, if available.
    )r�name)�prettyrrrr	�s$r	cCstj||�S)ay
    Return the version of the current Linux distribution, as a human-readable
    string.

    If *pretty* is false, the version is returned without codename (e.g.
    "7.0").

    If *pretty* is true, the codename in parenthesis is appended, if the
    codename is non-empty (e.g. "7.0 (Maipo)").

    Some distributions provide version numbers with different precisions in
    the different sources of distribution information. Examining the different
    sources in a fixed priority order does not always yield the most precise
    version (e.g. for Debian 8.2, or CentOS 7.1).

    The *best* parameter can be used to control the approach for the returned
    version:

    If *best* is false, the first non-empty version number in priority order of
    the examined sources is returned.

    If *best* is true, the most precise version number out of all examined
    sources is returned.

    **Lookup hierarchy:**

    In all cases, the version number is obtained from the following sources.
    If *best* is false, this order represents the priority order:

    * the value of the "VERSION_ID" attribute of the os-release file,
    * the value of the "Release" attribute returned by the lsb_release
      command,
    * the version number parsed from the "<version_id>" field of the first line
      of the distro release file,
    * the version number parsed from the "PRETTY_NAME" attribute of the
      os-release file, if it follows the format of the distro release files.
    * the version number parsed from the "Description" attribute returned by
      the lsb_release command, if it follows the format of the distro release
      files.
    )r�version)r
�bestrrrr�s)rcCs
tj|�S)a�
    Return the version of the current Linux distribution as a tuple
    ``(major, minor, build_number)`` with items as follows:

    * ``major``:  The result of :func:`distro.major_version`.

    * ``minor``:  The result of :func:`distro.minor_version`.

    * ``build_number``:  The result of :func:`distro.build_number`.

    For a description of the *best* parameter, see the :func:`distro.version`
    method.
    )r�
version_parts)rrrrr
sr
cCs
tj|�S)a8
    Return the major version of the current Linux distribution, as a string,
    if provided.
    Otherwise, the empty string is returned. The major version is the first
    part of the dot-separated version string.

    For a description of the *best* parameter, see the :func:`distro.version`
    method.
    )r�
major_version)rrrrr,s
rcCs
tj|�S)a9
    Return the minor version of the current Linux distribution, as a string,
    if provided.
    Otherwise, the empty string is returned. The minor version is the second
    part of the dot-separated version string.

    For a description of the *best* parameter, see the :func:`distro.version`
    method.
    )r�
minor_version)rrrrr9s
rcCs
tj|�S)a6
    Return the build number of the current Linux distribution, as a string,
    if provided.
    Otherwise, the empty string is returned. The build number is the third part
    of the dot-separated version string.

    For a description of the *best* parameter, see the :func:`distro.version`
    method.
    )r�build_number)rrrrrFs
rcCstj�S)a
    Return a space-separated list of distro IDs of distributions that are
    closely related to the current Linux distribution in regards to packaging
    and programming interfaces, for example distributions the current
    distribution is a derivative from.

    **Lookup hierarchy:**

    This information item is only provided by the os-release file.
    For details, see the description of the "ID_LIKE" attribute in the
    `os-release man page
    <http://www.freedesktop.org/software/systemd/man/os-release.html>`_.
    )r�likerrrrrSsrcCstj�S)a�
    Return the codename for the release of the current Linux distribution,
    as a string.

    If the distribution does not have a codename, an empty string is returned.

    Note that the returned codename is not always really a codename. For
    example, openSUSE returns "x86_64". This function does not handle such
    cases in any special way and just returns the string it finds, if any.

    **Lookup hierarchy:**

    * the codename within the "VERSION" attribute of the os-release file, if
      provided,

    * the value of the "Codename" attribute returned by the lsb_release
      command,

    * the value of the "<codename>" field of the distro release file.
    )r�codenamerrrrrdsrcCstj||�S)a�
    Return certain machine-readable information items about the current Linux
    distribution in a dictionary, as shown in the following example:

    .. sourcecode:: python

        {
            'id': 'rhel',
            'version': '7.0',
            'version_parts': {
                'major': '7',
                'minor': '0',
                'build_number': ''
            },
            'like': 'fedora',
            'codename': 'Maipo'
        }

    The dictionary structure and keys are always the same, regardless of which
    information items are available in the underlying data sources. The values
    for the various keys are as follows:

    * ``id``:  The result of :func:`distro.id`.

    * ``version``:  The result of :func:`distro.version`.

    * ``version_parts -> major``:  The result of :func:`distro.major_version`.

    * ``version_parts -> minor``:  The result of :func:`distro.minor_version`.

    * ``version_parts -> build_number``:  The result of
      :func:`distro.build_number`.

    * ``like``:  The result of :func:`distro.like`.

    * ``codename``:  The result of :func:`distro.codename`.

    For a description of the *pretty* and *best* parameters, see the
    :func:`distro.version` method.
    )r�info)r
rrrrr|s)rcCstj�S)z�
    Return a dictionary containing key-value pairs for the information items
    from the os-release file data source of the current Linux distribution.

    See `os-release file`_ for details about these information items.
    )r�os_release_inforrrrr�srcCstj�S)z�
    Return a dictionary containing key-value pairs for the information items
    from the lsb_release command data source of the current Linux distribution.

    See `lsb_release command output`_ for details about these information
    items.
    )r�lsb_release_inforrrrr�srcCstj�S)z�
    Return a dictionary containing key-value pairs for the information items
    from the distro release file data source of the current Linux distribution.

    See `distro release file`_ for details about these information items.
    )r�distro_release_inforrrrr�srcCs
tj|�S)a�
    Return a single named information item from the os-release file data source
    of the current Linux distribution.

    Parameters:

    * ``attribute`` (string): Key of the information item.

    Returns:

    * (string): Value of the information item, if the item exists.
      The empty string, if the item does not exist.

    See `os-release file`_ for details about these information items.
    )r�os_release_attr)�	attributerrrr�srcCs
tj|�S)a�
    Return a single named information item from the lsb_release command output
    data source of the current Linux distribution.

    Parameters:

    * ``attribute`` (string): Key of the information item.

    Returns:

    * (string): Value of the information item, if the item exists.
      The empty string, if the item does not exist.

    See `lsb_release command output`_ for details about these information
    items.
    )r�lsb_release_attr)rrrrr�srcCs
tj|�S)a�
    Return a single named information item from the distro release file
    data source of the current Linux distribution.

    Parameters:

    * ``attribute`` (string): Key of the information item.

    Returns:

    * (string): Value of the information item, if the item exists.
      The empty string, if the item does not exist.

    See `distro release file`_ for details about these information items.
    )r�distro_release_attr)rrrrr�src@s�eZdZdZd:dd�Zdd�Zd;dd	�Zd
d�Zd<d
d�Zd=dd�Z	d>dd�Z
d?dd�Zd@dd�ZdAdd�Z
dd�Zdd�ZdBdd�Zdd �Zd!d"�Zd#d$�Zd%d&�Zd'd(�Zd)d*�Zd+d,�Zed-d.��Zd/d0�Zed1d2��Zd3d4�Zd5d6�Zed7d8��Zd9S)C�LinuxDistributiona
    Provides information about a Linux distribution.

    This package creates a private module-global instance of this class with
    default initialization arguments, that is used by the
    `consolidated accessor functions`_ and `single source accessor functions`_.
    By using default initialization arguments, that module-global instance
    returns data about the current Linux distribution (i.e. the distro this
    package runs on).

    Normally, it is not necessary to create additional instances of this class.
    However, in situations where control is needed over the exact data sources
    that are used, instances of this class can be created with a specific
    distro release file, or a specific os-release file, or without invoking the
    lsb_release command.
    T�cCsH|ptjjtt�|_|pd|_|j�|_|r4|j	�ni|_
|j�|_dS)a8	
        The initialization method of this class gathers information from the
        available data sources, and stores that in private instance attributes.
        Subsequent access to the information items uses these private instance
        attributes, so that the data sources are read only once.

        Parameters:

        * ``include_lsb`` (bool): Controls whether the
          `lsb_release command output`_ is included as a data source.

          If the lsb_release command is not available in the program execution
          path, the data source for the lsb_release command will be empty.

        * ``os_release_file`` (string): The path name of the
          `os-release file`_ that is to be used as a data source.

          An empty string (the default) will cause the default path name to
          be used (see `os-release file`_ for details).

          If the specified or defaulted os-release file does not exist, the
          data source for the os-release file will be empty.

        * ``distro_release_file`` (string): The path name of the
          `distro release file`_ that is to be used as a data source.

          An empty string (the default) will cause a default search algorithm
          to be used (see `distro release file`_ for details).

          If the specified distro release file does not exist, or if no default
          distro release file can be found, the data source for the distro
          release file will be empty.

        Public instance attributes:

        * ``os_release_file`` (string): The path name of the
          `os-release file`_ that is actually used as a data source. The
          empty string if no os-release file is used as a data source.

        * ``distro_release_file`` (string): The path name of the
          `distro release file`_ that is actually used as a data source. The
          empty string if no distro release file is used as a data source.

        Raises:

        * :py:exc:`IOError`: Some I/O issue with an os-release file or distro
          release file.

        * :py:exc:`subprocess.CalledProcessError`: The lsb_release command had
          some issue (other than not being available in the program execution
          path).

        * :py:exc:`UnicodeError`: A data source has unexpected characters or
          uses an unexpected encoding.
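        A rough construction sketch (illustrative only; the keyword values
        shown are assumptions chosen for demonstration, not requirements)::

            from pip._vendor.distro import LinuxDistribution

            ld = LinuxDistribution(
                include_lsb=False,                  # skip the lsb_release command
                os_release_file='/etc/os-release',  # explicit os-release data source
                distro_release_file='',             # use the default search algorithm
            )
            print(ld.id(), ld.version(best=True), ld.codename())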
        rN)
�os�path�join�_UNIXCONFDIR�_OS_RELEASE_BASENAME�os_release_file�distro_release_file�_get_os_release_info�_os_release_info�_get_lsb_release_info�_lsb_release_info�_get_distro_release_info�_distro_release_info)�selfZinclude_lsbr"r#rrr�__init__s;

zLinuxDistribution.__init__cCsdj|j|j|j|j|j�S)z Return repr of all info
        z�LinuxDistribution(os_release_file={0!r}, distro_release_file={1!r}, _os_release_info={2!r}, _lsb_release_info={3!r}, _distro_release_info={4!r}))�formatr"r#r%r'r))r*rrr�__repr__VszLinuxDistribution.__repr__cCs"|r|j�n|j�|j�|j�fS)z�
        Return information about the Linux distribution that is compatible
        with Python's :func:`platform.linux_distribution`, supporting a subset
        of its parameters.

        For details, see :func:`distro.linux_distribution`.
        )r	rrr)r*rrrrrfs	z$LinuxDistribution.linux_distributioncCsTdd�}|jd�}|r ||t�S|jd�}|r8||t�S|jd�}|rP||t�SdS)zrReturn the distro ID of the Linux distribution, as a string.

        For details, see :func:`distro.id`.
        cSs|j�jdd�}|j||�S)N� �_)�lower�replace�get)�	distro_id�tablerrr�	normalizeysz'LinuxDistribution.id.<locals>.normalizer�distributor_idr)r�NORMALIZED_OS_IDr�NORMALIZED_LSB_IDr�NORMALIZED_DISTRO_ID)r*r5r3rrrrts





zLinuxDistribution.idFcCsh|jd�p|jd�p|jd�}|r`|jd�p4|jd�}|s`|jd�}|jdd�}|r`|d|}|pfdS)	zx
        Return the name of the Linux distribution, as a string.

        For details, see :func:`distro.name`.
        r	r6�pretty_name�descriptionT)r
r.r)rrrr)r*r
r	rrrrr	�s





zLinuxDistribution.namecCs�|jd�|jd�|jd�|j|jd��jdd�|j|jd��jdd�g}d}|r�xJ|D]$}|jd�|jd�ksv|dkrV|}qVWnx|D]}|dkr�|}Pq�W|r�|r�|j�r�dj||j��}|S)z~
        Return the version of the Linux distribution, as a string.

        For details, see :func:`distro.version`.
        �
version_id�releaser:rr;�.z	{0} ({1}))rrr�_parse_distro_release_contentr2�countrr,)r*r
rZversionsr�vrrrr�s&


zLinuxDistribution.versioncCsL|j|d�}|rHtjd�}|j|�}|rH|j�\}}}||p>d|pDdfSdS)z�
        Return the version of the Linux distribution, as a tuple of version
        numbers.

        For details, see :func:`distro.version_parts`.
        )rz(\d+)\.?(\d+)?\.?(\d+)?r)rrr)r�re�compile�match�groups)r*rZversion_strZ
version_regex�matches�major�minorrrrrr
�s

zLinuxDistribution.version_partscCs|j|�dS)z�
        Return the major version number of the current distribution.

        For details, see :func:`distro.major_version`.
        r)r
)r*rrrrr�szLinuxDistribution.major_versioncCs|j|�dS)z�
        Return the minor version number of the Linux distribution.

        For details, see :func:`distro.minor_version`.
        �)r
)r*rrrrr�szLinuxDistribution.minor_versioncCs|j|�dS)z{
        Return the build number of the Linux distribution.

        For details, see :func:`distro.build_number`.
        �)r
)r*rrrrr�szLinuxDistribution.build_numbercCs|jd�pdS)z�
        Return the IDs of distributions that are like the Linux distribution.

        For details, see :func:`distro.like`.
        Zid_liker)r)r*rrrr�szLinuxDistribution.likecCs"|jd�p |jd�p |jd�p dS)zs
        Return the codename of the Linux distribution.

        For details, see :func:`distro.codename`.
        rr)rrr)r*rrrr�s


zLinuxDistribution.codenamecCsBt|j�|j||�t|j|�|j|�|j|�d�|j�|j�d�S)z�
        Return certain machine-readable information about the Linux
        distribution.

        For details, see :func:`distro.info`.
        )rGrHr)rrr
rr)�dictrrrrrrr)r*r
rrrrr�s
zLinuxDistribution.infocCs|jS)z�
        Return a dictionary containing key-value pairs for the information
        items from the os-release file data source of the Linux distribution.

        For details, see :func:`distro.os_release_info`.
        )r%)r*rrrr
sz!LinuxDistribution.os_release_infocCs|jS)z�
        Return a dictionary containing key-value pairs for the information
        items from the lsb_release command data source of the Linux
        distribution.

        For details, see :func:`distro.lsb_release_info`.
        )r')r*rrrrsz"LinuxDistribution.lsb_release_infocCs|jS)z�
        Return a dictionary containing key-value pairs for the information
        items from the distro release file data source of the Linux
        distribution.

        For details, see :func:`distro.distro_release_info`.
        )r))r*rrrr sz%LinuxDistribution.distro_release_infocCs|jj|d�S)z�
        Return a single named information item from the os-release file data
        source of the Linux distribution.

        For details, see :func:`distro.os_release_attr`.
        r)r%r2)r*rrrrr*sz!LinuxDistribution.os_release_attrcCs|jj|d�S)z�
        Return a single named information item from the lsb_release command
        output data source of the Linux distribution.

        For details, see :func:`distro.lsb_release_attr`.
        r)r'r2)r*rrrrr3sz"LinuxDistribution.lsb_release_attrcCs|jj|d�S)z�
        Return a single named information item from the distro release file
        data source of the Linux distribution.

        For details, see :func:`distro.distro_release_attr`.
        r)r)r2)r*rrrrr<sz%LinuxDistribution.distro_release_attrc	Cs.tjj|j�r*t|j��}|j|�SQRXiS)z�
        Get the information items from the specified os-release file.

        Returns:
            A dictionary containing all information items.
        N)rr�isfiler"�open�_parse_os_release_content)r*Zrelease_filerrrr$Esz&LinuxDistribution._get_os_release_infocCs�i}tj|dd�}d|_tjddkr@t|jt�r@|jjd�|_t|�}x�|D]�}d|krN|j	dd�\}}t|t�r~|jd�}|||j
�<|d	kr�tjd
|�}|r�|j
�}|jd�}|jd�}|j�}||d
<q�d|d
<qNqNW|S)aD
        Parse the lines of an os-release file.

        Parameters:

        * lines: Iterable through the lines in the os-release file.
                 Each line must be a unicode string or a UTF-8 encoded byte
                 string.

        Returns:
            A dictionary containing all information items.
        T)�posixrrJz
iso-8859-1�=rIzutf-8�VERSIONz(\(\D+\))|,(\s+)?\D+z()�,rr)�shlexZwhitespace_split�sys�version_info�
isinstanceZ	wordchars�bytes�decode�list�splitr0rB�search�group�strip)�lines�propsZlexer�tokens�token�krArrrrrNQs.	






z+LinuxDistribution._parse_os_release_contentcCs�d}tj|dtjtjd�}|j�\}}|jd�|jd�}}|j}|dkr\|j�}|j|�S|dkrhiStj	dd�d
kr�tj
||||��n@tj	dd�dkr�tj
|||��ntj	dd�dkr�tj
||��dS)z�
        Get the information items from the lsb_release command output.

        Returns:
            A dictionary containing all information items.
        zlsb_release -aT)�shell�stdout�stderrzutf-8r�NrJ����)rgrh)rJri)rJrj)�
subprocess�Popen�PIPEZcommunicaterX�
returncode�
splitlines�_parse_lsb_release_contentrTrUZCalledProcessError)r*�cmdZprocessrdre�codeZcontentrrrr&�s(

z'LinuxDistribution._get_lsb_release_infocCsti}xj|D]b}t|t�r"|jd�n|}|jd�jdd�}t|�dkrFq
|\}}|j|jdd�j�|j�i�q
W|S)aM
        Parse the output of the lsb_release command.

        Parameters:

        * lines: Iterable through the lines of the lsb_release output.
                 Each line must be a unicode string or a UTF-8 encoded byte
                 string.

        Returns:
            A dictionary containing all information items.
        zutf-8�
�:rIrJr.r/)	rVrWrXr]rZ�len�updater1r0)r^r_�lineZkvrbrArrrrp�s
"z,LinuxDistribution._parse_lsb_release_contentcCs�|jr@|j|j�}tjj|j�}tj|�}|r<|jd�|d<|Stjt	�}|j
�x\|D]T}|tkrfqXtj|�}|rXtjjt	|�}|j|�}d|krX||_|jd�|d<|SqXWiSdS)z�
        Get the information items from the specified distro release file.

        Returns:
            A dictionary containing all information items.
        rIrr	N)
r#�_parse_distro_release_filerr�basename� _DISTRO_RELEASE_BASENAME_PATTERNrDr\�listdirr �sort� _DISTRO_RELEASE_IGNORE_BASENAMESr)r*�distro_inforyrDZ	basenames�filepathrrrr(�s,




z*LinuxDistribution._get_distro_release_infoc	Cs.tjj|�r*t|��}|j|j��SQRXiS)z�
        Parse a distro release file.

        Parameters:

        * filepath: Path name of the distro release file.

        Returns:
            A dictionary containing all information items.
        N)rrrLrMr?�readline)r*r�fprrrrx�s
z,LinuxDistribution._parse_distro_release_filecCs�t|t�r|jd�}tj|j�ddd	��}i}|r�|jd�ddd
�|d<|jd�rn|jd�ddd�|d<|jd�r�|jd�ddd�|d<n|r�|j�|d<|S)
a
        Parse a line from a distro release file.

        Parameters:
        * line: Line from the distro release file. Must be a unicode string
                or a UTF-8 encoded byte string.

        Returns:
            A dictionary containing all information items.
        zutf-8NrIrgr	rJr<r���r�r�r�)rVrWrX�(_DISTRO_RELEASE_CONTENT_REVERSED_PATTERNrDr]r\)rwrFr~rrrr?�s



z/LinuxDistribution._parse_distro_release_contentN)Trr)T)F)FF)F)F)F)F)FF)�__name__�
__module__�__qualname__�__doc__r+r-rrr	rr
rrrrrrrrrrrrr$�staticmethodrNr&rpr(rxr?rrrrrs:
@


!




	

			<)rcCs�ddl}tjt�}|jtj�|jtjtj	��|j
dd�}|jddddd�|j�}|j
rv|jt
jt�d	d
d��nB|jdtd
d
��td
d
�}|r�|jd|�t�}|r�|jd|�dS)NrzLinux distro info tool)r;z--jsonz-jz!Output in machine readable format�
store_true)�help�action�T)�indentZ	sort_keyszName: %s)r
zVersion: %szCodename: %s)�argparse�loggingZ	getLoggerr�ZsetLevel�DEBUGZ
addHandlerZ
StreamHandlerrTrd�ArgumentParser�add_argument�
parse_args�jsonr�dumpsr	rr)r�Zlogger�parser�argsZdistribution_versionZdistribution_codenamerrr�mains(

r��__main__)T)F)FF)F)F)F)F)FF)+r�rrBrTr�rSr�rk�platform�
startswith�ImportErrorr,r r!r7r8r9rCr�rzr}rrr	rr
rrrrrrrrrrrr�objectrrr�r�rrrr�<module>sd	

L
'
,





,


site-packages/pip/_vendor/__pycache__/re-vendor.cpython-36.pyc000064400000002010147511334610020245 0ustar003

���e�@s�ddlZddlZddlZddlZddlZejjejje��Z	dd�Z
dd�Zdd�Ze
dkr�eej�d	krpe
�ejd
dkr�e�nejd
dkr�e�ne
�dS)�NcCstd�tjd�dS)Nz"Usage: re-vendor.py [clean|vendor]�)�print�sys�exit�rr�/usr/lib/python3.6/re-vendor.py�usage	srcCsPx6tjt�D](}tjjt|�}tjj|�rtj|�qWtjtjjtd��dS)Nzsix.py)	�os�listdir�here�path�join�isdir�shutil�rmtree�unlink)�fn�dirnamerrr�clean
s
rcCs6tjddtddg�xtjd�D]}tj|�q WdS)NZinstallz-tz-rz
vendor.txtz
*.egg-info)�pip�mainr�globrr)rrrr�vendorsr�__main__�r)r	rrrrr�abspathr�__file__rrrr�__name__�len�argvrrrr�<module>s site-packages/pip/_vendor/__pycache__/six.cpython-36.opt-1.pyc000064400000057532147511334610020131 0ustar003

���e�u�I@srdZddlmZddlZddlZddlZddlZddlZdZdZ	ej
ddkZej
ddkZej
dd��dzkZ
er�efZefZefZeZeZejZn�efZeefZeejfZeZeZejjd	�r�e�d|�ZnLGdd
�d
e�Z ye!e ��Wn e"k
�re�d~�ZYnXe�d��Z[ dd�Z#dd�Z$Gdd�de�Z%Gdd�de%�Z&Gdd�dej'�Z(Gdd�de%�Z)Gdd�de�Z*e*e+�Z,Gdd�de(�Z-e)ddd d!�e)d"d#d$d%d"�e)d&d#d#d'd&�e)d(d)d$d*d(�e)d+d)d,�e)d-d#d$d.d-�e)d/d0d0d1d/�e)d2d0d0d/d2�e)d3d)d$d4d3�e)d5d)e
�rd6nd7d8�e)d9d)d:�e)d;d<d=d>�e)d!d!d �e)d?d?d@�e)dAdAd@�e)dBdBd@�e)d4d)d$d4d3�e)dCd#d$dDdC�e)dEd#d#dFdE�e&d$d)�e&dGdH�e&dIdJ�e&dKdLdM�e&dNdOdN�e&dPdQdR�e&dSdTdU�e&dVdWdX�e&dYdZd[�e&d\d]d^�e&d_d`da�e&dbdcdd�e&dedfdg�e&dhdidj�e&dkdkdl�e&dmdmdl�e&dndndl�e&dododp�e&dqdr�e&dsdt�e&dudv�e&dwdxdw�e&dydz�e&d{d|d}�e&d~dd��e&d�d�d��e&d�d�d��e&d�d�d��e&d�d�d��e&d�d�d��e&d�d�d��e&d�d�d��e&d�d�d��e&d�d�d��e&d�d�d��e&d�d�d��e&d�d�d��e&d�e+d�d��e&d�e+d�d��e&d�e+d�e+d��e&d�d�d��e&d�d�d��e&d�d�d��g>Z.ejd�k�rZe.e&d�d��g7Z.x:e.D]2Z/e0e-e/j1e/�e2e/e&��r`e,j3e/d�e/j1��q`W[/e.e-_.e-e+d��Z4e,j3e4d��Gd�d��d�e(�Z5e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d>d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��gZ6xe6D]Z/e0e5e/j1e/��q�W[/e6e5_.e,j3e5e+d��d�dӃGd�dՄd�e(�Z7e)d�d�d��e)d�d�d��e)d�d�d��gZ8xe8D]Z/e0e7e/j1e/��q$W[/e8e7_.e,j3e7e+d��d�d܃Gd�dބd�e(�Z9e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)�dd�d�g!Z:xe:D]Z/e0e9e/j1e/��q�W[/e:e9_.e,j3e9e+�d��d�d�G�d�d��de(�Z;e)�dd��d�e)�dd��d�e)�d	d��d�e)�d
d��d�gZ<xe<D]Z/e0e;e/j1e/��qTW[/e<e;_.e,j3e;e+�d��d�d
�G�d�d��de(�Z=e)�dd�d��gZ>xe>D]Z/e0e=e/j1e/��q�W[/e>e=_.e,j3e=e+�d��d�d�G�d�d��dej'�Z?e,j3e?e+d���d��d�d�Z@�d�d�ZAe�	rj�dZB�dZC�dZD�dZE�dZF�d ZGn$�d!ZB�d"ZC�d#ZD�d$ZE�d%ZF�d&ZGyeHZIWn"eJk
�	r��d'�d(�ZIYnXeIZHyeKZKWn"eJk
�	r��d)�d*�ZKYnXe�
r�d+�d,�ZLejMZN�d-�d.�ZOeZPn>�d/�d,�ZL�d0�d1�ZN�d2�d.�ZOG�d3�d4��d4e�ZPeKZKe#eL�d5�ejQeB�ZRejQeC�ZSejQeD�ZTejQeE�ZUejQeF�ZVejQeG�ZWe�
r��d6�d7�ZX�d8�d9�ZY�d:�d;�ZZ�d<�d=�Z[ej\�d>�Z]ej\�d?�Z^ej\�d@�Z_nT�dA�d7�ZX�dB�d9�ZY�dC�d;�ZZ�dD�d=�Z[ej\�dE�Z]ej\�dF�Z^ej\�dG�Z_e#eX�dH�e#eY�dI�e#eZ�dJ�e#e[�dK�e�r�dL�dM�Z`�dN�dO�ZaebZcddldZdedje�dP�jfZg[dejhd�ZiejjZkelZmddlnZnenjoZoenjpZp�dQZqej
d
d
k�r�dRZr�dSZsn�dTZr�dUZsnj�dV�dM�Z`�dW�dO�ZaecZcebZg�dX�dY�Zi�dZ�d[�Zkejtejuev�ZmddloZoeojoZoZp�d\Zq�dRZr�dSZse#e`�d]�e#ea�d^��d_�dQ�Zw�d`�dT�Zx�da�dU�Zye�r�eze4j{�db�Z|�d��dc�dd�Z}n�d��de�df�Z|e|�dg�ej
dd��d�k�
re|�dh�n.ej
dd��d�k�
r8e|�di�n�dj�dk�Z~eze4j{�dld�Zedk�
rj�dm�dn�Zej
dd��d�k�
r�eZ��do�dn�Ze#e}�dp�ej
dd��d�k�
r�ej�ej�f�dq�dr�Z�nej�Z��ds�dt�Z��du�dv�Z��dw�dx�Z�gZ�e+Z�e��j��dy�dk	�rge�_�ej��rbx>e�ej��D]0\Z�Z�ee��j+dk�r*e�j1e+k�r*ej�e�=P�q*W[�[�ej�j�e,�dS(�z6Utilities for writing code that runs on Python 2 and 3�)�absolute_importNz'Benjamin Peterson <benjamin@python.org>z1.10.0����java��c@seZdZdd�ZdS)�XcCsdS)Nrrl�)�selfr
r
�/usr/lib/python3.6/six.py�__len__>sz	X.__len__N)�__name__�
__module__�__qualname__r
r
r
r
rr	<sr	�?cCs
||_dS)z Add documentation to a function.N)�__doc__)�func�docr
r
r�_add_docKsrcCst|�tj|S)z7Import module, returning the module after the last dot.)�
__import__�sys�modules)�namer
r
r�_import_modulePsrc@seZdZdd�Zdd�ZdS)�
_LazyDescrcCs
||_dS)N)r)rrr
r
r�__init__Xsz_LazyDescr.__init__cCsB|j�}t||j|�yt|j|j�Wntk
r<YnX|S)N)�_resolve�setattrr�delattr�	__class__�AttributeError)r�obj�tp�resultr
r
r�__get__[sz_LazyDescr.__get__N)rrrrr%r
r
r
rrVsrcs.eZdZd�fdd�	Zdd�Zdd�Z�ZS)	�MovedModuleNcs2tt|�j|�tr(|dkr |}||_n||_dS)N)�superr&r�PY3�mod)rr�old�new)r r
rriszMovedModule.__init__cCs
t|j�S)N)rr))rr
r
rrrszMovedModule._resolvecCs"|j�}t||�}t|||�|S)N)r�getattrr)r�attr�_module�valuer
r
r�__getattr__us
zMovedModule.__getattr__)N)rrrrrr0�
__classcell__r
r
)r rr&gs	r&cs(eZdZ�fdd�Zdd�ZgZ�ZS)�_LazyModulecstt|�j|�|jj|_dS)N)r'r2rr r)rr)r r
rr~sz_LazyModule.__init__cCs ddg}|dd�|jD�7}|S)NrrcSsg|]
}|j�qSr
)r)�.0r-r
r
r�
<listcomp>�sz'_LazyModule.__dir__.<locals>.<listcomp>)�_moved_attributes)rZattrsr
r
r�__dir__�sz_LazyModule.__dir__)rrrrr6r5r1r
r
)r rr2|sr2cs&eZdZd�fdd�	Zdd�Z�ZS)�MovedAttributeNcsdtt|�j|�trH|dkr |}||_|dkr@|dkr<|}n|}||_n||_|dkrZ|}||_dS)N)r'r7rr(r)r-)rrZold_modZnew_modZold_attrZnew_attr)r r
rr�szMovedAttribute.__init__cCst|j�}t||j�S)N)rr)r,r-)r�moduler
r
rr�s
zMovedAttribute._resolve)NN)rrrrrr1r
r
)r rr7�sr7c@sVeZdZdZdd�Zdd�Zdd�Zdd	d
�Zdd�Zd
d�Z	dd�Z
dd�ZeZdS)�_SixMetaPathImporterz�
    A meta path importer to import six.moves and its submodules.

    This class implements a PEP302 finder and loader. It should be compatible
    with Python 2.5 and all existing versions of Python3
    cCs||_i|_dS)N)r�
known_modules)rZsix_module_namer
r
rr�sz_SixMetaPathImporter.__init__cGs&x |D]}||j|jd|<qWdS)N�.)r:r)rr)Z	fullnames�fullnamer
r
r�_add_module�s
z _SixMetaPathImporter._add_modulecCs|j|jd|S)Nr;)r:r)rr<r
r
r�_get_module�sz _SixMetaPathImporter._get_moduleNcCs||jkr|SdS)N)r:)rr<�pathr
r
r�find_module�s
z _SixMetaPathImporter.find_modulecCs0y
|j|Stk
r*td|��YnXdS)Nz!This loader does not know module )r:�KeyError�ImportError)rr<r
r
rZ__get_module�s
z!_SixMetaPathImporter.__get_modulecCsRy
tj|Stk
rYnX|j|�}t|t�r>|j�}n||_|tj|<|S)N)rrrA� _SixMetaPathImporter__get_module�
isinstancer&r�
__loader__)rr<r)r
r
r�load_module�s




z _SixMetaPathImporter.load_modulecCst|j|�d�S)z�
        Return true, if the named module is a package.

        We need this method to get correct spec objects with
        Python 3.4 (see PEP451)
        �__path__)�hasattrrC)rr<r
r
r�
is_package�sz_SixMetaPathImporter.is_packagecCs|j|�dS)z;Return None

        Required, if is_package is implementedN)rC)rr<r
r
r�get_code�s
z_SixMetaPathImporter.get_code)N)
rrrrrr=r>r@rCrFrIrJ�
get_sourcer
r
r
rr9�s
	r9c@seZdZdZgZdS)�_MovedItemszLazy loading of moved objectsN)rrrrrGr
r
r
rrL�srLZ	cStringIO�io�StringIO�filter�	itertools�builtinsZifilter�filterfalseZifilterfalse�inputZ__builtin__Z	raw_input�internr�map�imap�getcwd�osZgetcwdu�getcwdb�rangeZxrangeZ
reload_module�	importlibZimp�reload�reduce�	functoolsZshlex_quoteZpipesZshlexZquote�UserDict�collections�UserList�
UserString�zipZizip�zip_longestZizip_longestZconfigparserZConfigParser�copyregZcopy_regZdbm_gnuZgdbmzdbm.gnuZ
_dummy_threadZdummy_threadZhttp_cookiejarZ	cookielibzhttp.cookiejarZhttp_cookiesZCookiezhttp.cookiesZ
html_entitiesZhtmlentitydefsz
html.entitiesZhtml_parserZ
HTMLParserzhtml.parserZhttp_clientZhttplibzhttp.clientZemail_mime_multipartzemail.MIMEMultipartzemail.mime.multipartZemail_mime_nonmultipartzemail.MIMENonMultipartzemail.mime.nonmultipartZemail_mime_textzemail.MIMETextzemail.mime.textZemail_mime_basezemail.MIMEBasezemail.mime.baseZBaseHTTPServerzhttp.serverZ
CGIHTTPServerZSimpleHTTPServerZcPickle�pickleZqueueZQueue�reprlib�reprZsocketserverZSocketServer�_threadZthreadZtkinterZTkinterZtkinter_dialogZDialogztkinter.dialogZtkinter_filedialogZ
FileDialogztkinter.filedialogZtkinter_scrolledtextZScrolledTextztkinter.scrolledtextZtkinter_simpledialogZSimpleDialogztkinter.simpledialogZtkinter_tixZTixztkinter.tixZtkinter_ttkZttkztkinter.ttkZtkinter_constantsZTkconstantsztkinter.constantsZtkinter_dndZTkdndztkinter.dndZtkinter_colorchooserZtkColorChooserztkinter.colorchooserZtkinter_commondialogZtkCommonDialogztkinter.commondialogZtkinter_tkfiledialogZtkFileDialogZtkinter_fontZtkFontztkinter.fontZtkinter_messageboxZtkMessageBoxztkinter.messageboxZtkinter_tksimpledialogZtkSimpleDialogZurllib_parsez.moves.urllib_parsezurllib.parseZurllib_errorz.moves.urllib_errorzurllib.errorZurllibz
.moves.urllibZurllib_robotparser�robotparserzurllib.robotparserZ
xmlrpc_clientZ	xmlrpclibz
xmlrpc.clientZ
xmlrpc_serverZSimpleXMLRPCServerz
xmlrpc.serverZwin32�winreg�_winregzmoves.z.moves�movesc@seZdZdZdS)�Module_six_moves_urllib_parsez7Lazy loading of moved objects in six.moves.urllib_parseN)rrrrr
r
r
rrn@srnZParseResultZurlparseZSplitResultZparse_qsZ	parse_qslZ	urldefragZurljoinZurlsplitZ
urlunparseZ
urlunsplitZ
quote_plusZunquoteZunquote_plusZ	urlencodeZ
splitqueryZsplittagZ	splituserZ
uses_fragmentZuses_netlocZuses_paramsZ
uses_queryZ
uses_relativezmoves.urllib_parsezmoves.urllib.parsec@seZdZdZdS)�Module_six_moves_urllib_errorz7Lazy loading of moved objects in six.moves.urllib_errorN)rrrrr
r
r
rrohsroZURLErrorZurllib2Z	HTTPErrorZContentTooShortErrorz.moves.urllib.errorzmoves.urllib_errorzmoves.urllib.errorc@seZdZdZdS)�Module_six_moves_urllib_requestz9Lazy loading of moved objects in six.moves.urllib_requestN)rrrrr
r
r
rrp|srpZurlopenzurllib.requestZinstall_openerZbuild_openerZpathname2urlZurl2pathnameZ
getproxiesZRequestZOpenerDirectorZHTTPDefaultErrorHandlerZHTTPRedirectHandlerZHTTPCookieProcessorZProxyHandlerZBaseHandlerZHTTPPasswordMgrZHTTPPasswordMgrWithDefaultRealmZAbstractBasicAuthHandlerZHTTPBasicAuthHandlerZProxyBasicAuthHandlerZAbstractDigestAuthHandlerZHTTPDigestAuthHandlerZProxyDigestAuthHandlerZHTTPHandlerZHTTPSHandlerZFileHandlerZ
FTPHandlerZCacheFTPHandlerZUnknownHandlerZHTTPErrorProcessorZurlretrieveZ
urlcleanupZ	URLopenerZFancyURLopenerZproxy_bypassz.moves.urllib.requestzmoves.urllib_requestzmoves.urllib.requestc@seZdZdZdS)� Module_six_moves_urllib_responsez:Lazy loading of moved objects in six.moves.urllib_responseN)rrrrr
r
r
rrq�srqZaddbasezurllib.responseZaddclosehookZaddinfoZ
addinfourlz.moves.urllib.responsezmoves.urllib_responsezmoves.urllib.responsec@seZdZdZdS)�#Module_six_moves_urllib_robotparserz=Lazy loading of moved objects in six.moves.urllib_robotparserN)rrrrr
r
r
rrr�srrZRobotFileParserz.moves.urllib.robotparserzmoves.urllib_robotparserzmoves.urllib.robotparserc@sNeZdZdZgZejd�Zejd�Zejd�Z	ejd�Z
ejd�Zdd�Zd	S)
�Module_six_moves_urllibzICreate a six.moves.urllib namespace that resembles the Python 3 namespacezmoves.urllib_parsezmoves.urllib_errorzmoves.urllib_requestzmoves.urllib_responsezmoves.urllib_robotparsercCsdddddgS)N�parse�error�request�responserjr
)rr
r
rr6�szModule_six_moves_urllib.__dir__N)
rrrrrG�	_importerr>rtrurvrwrjr6r
r
r
rrs�s




rszmoves.urllibcCstt|j|�dS)zAdd an item to six.moves.N)rrLr)Zmover
r
r�add_move�srycCsXytt|�WnDtk
rRytj|=Wn"tk
rLtd|f��YnXYnXdS)zRemove item from six.moves.zno such move, %rN)rrLr!rm�__dict__rA)rr
r
r�remove_move�sr{�__func__�__self__�__closure__�__code__�__defaults__�__globals__�im_funcZim_selfZfunc_closureZ	func_codeZ
func_defaultsZfunc_globalscCs|j�S)N)�next)�itr
r
r�advance_iteratorsr�cCstdd�t|�jD��S)Ncss|]}d|jkVqdS)�__call__N)rz)r3�klassr
r
r�	<genexpr>szcallable.<locals>.<genexpr>)�any�type�__mro__)r"r
r
r�callablesr�cCs|S)Nr
)�unboundr
r
r�get_unbound_functionsr�cCs|S)Nr
)r�clsr
r
r�create_unbound_methodsr�cCs|jS)N)r�)r�r
r
rr�"scCstj|||j�S)N)�types�
MethodTyper )rr"r
r
r�create_bound_method%sr�cCstj|d|�S)N)r�r�)rr�r
r
rr�(sc@seZdZdd�ZdS)�IteratorcCst|�j|�S)N)r��__next__)rr
r
rr�-sz
Iterator.nextN)rrrr�r
r
r
rr�+sr�z3Get the function out of a possibly unbound functioncKst|jf|��S)N)�iter�keys)�d�kwr
r
r�iterkeys>sr�cKst|jf|��S)N)r��values)r�r�r
r
r�
itervaluesAsr�cKst|jf|��S)N)r��items)r�r�r
r
r�	iteritemsDsr�cKst|jf|��S)N)r�Zlists)r�r�r
r
r�	iterlistsGsr�r�r�r�cKs|jf|�S)N)r�)r�r�r
r
rr�PscKs|jf|�S)N)r�)r�r�r
r
rr�SscKs|jf|�S)N)r�)r�r�r
r
rr�VscKs|jf|�S)N)r�)r�r�r
r
rr�Ys�viewkeys�
viewvalues�	viewitemsz1Return an iterator over the keys of a dictionary.z3Return an iterator over the values of a dictionary.z?Return an iterator over the (key, value) pairs of a dictionary.zBReturn an iterator over the (key, [values]) pairs of a dictionary.cCs
|jd�S)Nzlatin-1)�encode)�sr
r
r�bksr�cCs|S)Nr
)r�r
r
r�unsr�z>B�assertCountEqualZassertRaisesRegexpZassertRegexpMatches�assertRaisesRegex�assertRegexcCs|S)Nr
)r�r
r
rr��scCst|jdd�d�S)Nz\\z\\\\Zunicode_escape)�unicode�replace)r�r
r
rr��scCst|d�S)Nr)�ord)Zbsr
r
r�byte2int�sr�cCst||�S)N)r�)Zbuf�ir
r
r�
indexbytes�sr�ZassertItemsEqualzByte literalzText literalcOst|t�||�S)N)r,�_assertCountEqual)r�args�kwargsr
r
rr��scOst|t�||�S)N)r,�_assertRaisesRegex)rr�r�r
r
rr��scOst|t�||�S)N)r,�_assertRegex)rr�r�r
r
rr��s�execcCs*|dkr|�}|j|k	r"|j|��|�dS)N)�
__traceback__�with_traceback)r#r/�tbr
r
r�reraise�s


r�cCsB|dkr*tjd�}|j}|dkr&|j}~n|dkr6|}td�dS)zExecute code in a namespace.Nrzexec _code_ in _globs_, _locs_)r�	_getframe�	f_globals�f_localsr�)Z_code_Z_globs_Z_locs_�framer
r
r�exec_�s
r�z9def reraise(tp, value, tb=None):
    raise tp, value, tb
zrdef raise_from(value, from_value):
    if from_value is None:
        raise value
    raise value from from_value
zCdef raise_from(value, from_value):
    raise value from from_value
cCs|�dS)Nr
)r/Z
from_valuer
r
r�
raise_from�sr��printc
s6|jdtj���dkrdS�fdd�}d}|jdd�}|dk	r`t|t�rNd}nt|t�s`td��|jd	d�}|dk	r�t|t�r�d}nt|t�s�td
��|r�td��|s�x|D]}t|t�r�d}Pq�W|r�td�}td
�}nd}d
}|dkr�|}|dk�r�|}x,t|�D] \}	}|	�r||�||��qW||�dS)z4The new-style print function for Python 2.4 and 2.5.�fileNcsdt|t�st|�}t�t�rVt|t�rV�jdk	rVt�dd�}|dkrHd}|j�j|�}�j|�dS)N�errors�strict)	rD�
basestring�strr�r��encodingr,r��write)�datar�)�fpr
rr��s



zprint_.<locals>.writeF�sepTzsep must be None or a string�endzend must be None or a stringz$invalid keyword arguments to print()�
� )�popr�stdoutrDr�r��	TypeError�	enumerate)
r�r�r�Zwant_unicoder�r��arg�newlineZspacer�r
)r�r�print_�sL







r�cOs<|jdtj�}|jdd�}t||�|r8|dk	r8|j�dS)Nr��flushF)�getrr�r��_printr�)r�r�r�r�r
r
rr�s

zReraise an exception.cs���fdd�}|S)Ncstj����|�}�|_|S)N)r^�wraps�__wrapped__)�f)�assigned�updated�wrappedr
r�wrapperszwraps.<locals>.wrapperr
)r�r�r�r�r
)r�r�r�rr�sr�cs&G��fdd�d��}tj|dfi�S)z%Create a base class with a metaclass.cseZdZ��fdd�ZdS)z!with_metaclass.<locals>.metaclasscs�|�|�S)Nr
)r�rZ
this_basesr�)�bases�metar
r�__new__'sz)with_metaclass.<locals>.metaclass.__new__N)rrrr�r
)r�r�r
r�	metaclass%sr�Ztemporary_class)r�r�)r�r�r�r
)r�r�r�with_metaclass sr�cs�fdd�}|S)z6Class decorator for creating a class with a metaclass.csl|jj�}|jd�}|dk	rDt|t�r,|g}x|D]}|j|�q2W|jdd�|jdd��|j|j|�S)N�	__slots__rz�__weakref__)rz�copyr�rDr�r�r�	__bases__)r�Z	orig_vars�slotsZ	slots_var)r�r
rr�.s



zadd_metaclass.<locals>.wrapperr
)r�r�r
)r�r�
add_metaclass,sr�cCs2tr.d|jkrtd|j��|j|_dd�|_|S)a
    A decorator that defines __unicode__ and __str__ methods under Python 2.
    Under Python 3 it does nothing.

    To support Python 2 and 3 with a single code base, define a __str__ method
    returning text and apply this decorator to the class.
    �__str__zY@python_2_unicode_compatible cannot be applied to %s because it doesn't define __str__().cSs|j�jd�S)Nzutf-8)�__unicode__r�)rr
r
r�<lambda>Jsz-python_2_unicode_compatible.<locals>.<lambda>)�PY2rz�
ValueErrorrr�r�)r�r
r
r�python_2_unicode_compatible<s


r��__spec__)rrli���li���ll����)N)NN)rr)rr)rr)rr)�rZ
__future__rr^rP�operatorrr��
__author__�__version__�version_infor�r(ZPY34r�Zstring_types�intZ
integer_typesr�Zclass_typesZ	text_type�bytesZbinary_type�maxsizeZMAXSIZEr�ZlongZ	ClassTyper��platform�
startswith�objectr	�len�
OverflowErrorrrrr&�
ModuleTyper2r7r9rrxrLr5r-rrrDr=rmrnZ_urllib_parse_moved_attributesroZ_urllib_error_moved_attributesrpZ _urllib_request_moved_attributesrqZ!_urllib_response_moved_attributesrrZ$_urllib_robotparser_moved_attributesrsryr{Z
_meth_funcZ
_meth_selfZ
_func_closureZ
_func_codeZ_func_defaultsZ
_func_globalsr�r��	NameErrorr�r�r�r�r�r��
attrgetterZget_method_functionZget_method_selfZget_function_closureZget_function_codeZget_function_defaultsZget_function_globalsr�r�r�r��methodcallerr�r�r�r�r��chrZunichr�struct�Struct�packZint2byte�
itemgetterr��getitemr�r�Z	iterbytesrMrN�BytesIOr�r�r��partialrVr�r�r�r�r,rQr�r�r�r�r��WRAPPER_ASSIGNMENTS�WRAPPER_UPDATESr�r�r�r�rG�__package__�globalsr�r��submodule_search_locations�	meta_pathr�r�Zimporter�appendr
r
r
r�<module>s�

>












































































































5site-packages/pip/_vendor/__pycache__/pyparsing.cpython-36.opt-1.pyc000064400000610513147511334610021334 0ustar003

���e�k�@s�dZdZdZdZddlZddlmZddlZddl	Z	ddl
Z
ddlZddlZddl
Z
ddlZddlZddlZddlmZyddlmZWn ek
r�ddlmZYnXydd	l
mZWn>ek
r�ydd	lmZWnek
r�dZYnXYnXd
ddd
ddddddddddddddddddd d!d"d#d$d%d&d'd(d)d*d+d,d-d.d/d0d1d2d3d4d5d6d7d8d9d:d;d<d=d>d?d@dAdBdCdDdEdFdGdHdIdJdKdLdMdNdOdPdQdRdSdTdUdVdWdXdYdZd[d\d]d^d_d`dadbdcdddedfdgdhdidjdkdldmdndodpdqdrgiZee	j�dds�ZeddskZe�r"e	jZe Z!e"Z#e Z$e%e&e'e(e)ee*e+e,e-e.gZ/nbe	j0Ze1Z2dtdu�Z$gZ/ddl3Z3xBdvj4�D]6Z5ye/j6e7e3e5��Wne8k
�r|�wJYnX�qJWe9dwdx�e2dy�D��Z:dzd{�Z;Gd|d}�d}e<�Z=ej>ej?Z@d~ZAeAdZBe@eAZCe"d��ZDd�jEd�dx�ejFD��ZGGd�d!�d!eH�ZIGd�d#�d#eI�ZJGd�d%�d%eI�ZKGd�d'�d'eK�ZLGd�d*�d*eH�ZMGd�d��d�e<�ZNGd�d&�d&e<�ZOe
jPjQeO�d�d=�ZRd�dN�ZSd�dK�ZTd�d��ZUd�d��ZVd�d��ZWd�dU�ZX�d/d�d��ZYGd�d(�d(e<�ZZGd�d0�d0eZ�Z[Gd�d�de[�Z\Gd�d�de[�Z]Gd�d�de[�Z^e^Z_e^eZ_`Gd�d�de[�ZaGd�d�de^�ZbGd�d�dea�ZcGd�dp�dpe[�ZdGd�d3�d3e[�ZeGd�d+�d+e[�ZfGd�d)�d)e[�ZgGd�d
�d
e[�ZhGd�d2�d2e[�ZiGd�d��d�e[�ZjGd�d�dej�ZkGd�d�dej�ZlGd�d�dej�ZmGd�d.�d.ej�ZnGd�d-�d-ej�ZoGd�d5�d5ej�ZpGd�d4�d4ej�ZqGd�d$�d$eZ�ZrGd�d
�d
er�ZsGd�d �d er�ZtGd�d�der�ZuGd�d�der�ZvGd�d"�d"eZ�ZwGd�d�dew�ZxGd�d�dew�ZyGd�d��d�ew�ZzGd�d�dez�Z{Gd�d6�d6ez�Z|Gd�d��d�e<�Z}e}�Z~Gd�d�dew�ZGd�d,�d,ew�Z�Gd�d�dew�Z�Gd�d��d�e��Z�Gd�d1�d1ew�Z�Gd�d�de��Z�Gd�d�de��Z�Gd�d�de��Z�Gd�d/�d/e��Z�Gd�d�de<�Z�d�df�Z��d0d�dD�Z��d1d�d@�Z�d�d΄Z�d�dS�Z�d�dR�Z�d�d҄Z��d2d�dW�Z�d�dE�Z��d3d�dk�Z�d�dl�Z�d�dn�Z�e\�j�dG�Z�el�j�dM�Z�em�j�dL�Z�en�j�de�Z�eo�j�dd�Z�eeeDd�d�dڍj�d�d܄�Z�efd݃j�d�d܄�Z�efd߃j�d�d܄�Z�e�e�Be�BeeeGd�dyd�Befd�ej��BZ�e�e�e�d�e��Z�e^d�ed�j�d�e�e{e�e�B��j�d�d�Z�d�dc�Z�d�dQ�Z�d�d`�Z�d�d^�Z�d�dq�Z�e�d�d܄�Z�e�d�d܄�Z�d�d�Z�d�dO�Z�d�dP�Z�d�di�Z�e<�e�_��d4d�do�Z�e=�Z�e<�e�_�e<�e�_�e�d��e�d��fd�dm�Z�e�Z�e�efd��d��j�d��Z�e�efd��d��j�d��Z�e�efd��d�efd��d�B�j��d�Z�e�e_�d�e�j��j��d�Z�d�d�de�j�f�ddT�Z��d5�ddj�Z�e��d�Z�e��d�Z�e�eee@eC�d�j��d��\Z�Z�e�e��d	j4��d
��Z�ef�d�djEe�j��d
�j��d�ZĐdd_�Z�e�ef�d��d�j��d�Z�ef�d�j��d�Z�ef�d�jȃj��d�Z�ef�d�j��d�Z�e�ef�d��de�B�j��d�Z�e�Z�ef�d�j��d�Z�e�e{eeeGdɐd�eee�d�e^dɃem����j΃j��d�Z�e�ee�j�e�Bd��d��j�d>�Z�G�d dr�dr�Z�eҐd!k�r�eb�d"�Z�eb�d#�Z�eee@eC�d$�Z�e�eՐd%dӐd&�j�e��Z�e�e�eփ�j��d'�Zאd(e�BZ�e�eՐd%dӐd&�j�e��Z�e�e�eك�j��d)�Z�eӐd*�eؐd'�e�eڐd)�Z�e�jܐd+�e�j�jܐd,�e�j�jܐd,�e�j�jܐd-�ddl�Z�e�j�j�e�e�j��e�j�jܐd.�dS(6aS
pyparsing module - Classes and methods to define and execute parsing grammars

The pyparsing module is an alternative approach to creating and executing simple grammars,
vs. the traditional lex/yacc approach, or the use of regular expressions.  With pyparsing, you
don't need to learn a new syntax for defining grammars or matching expressions - the parsing module
provides a library of classes that you use to construct the grammar directly in Python.

Here is a program to parse "Hello, World!" (or any greeting of the form 
C{"<salutation>, <addressee>!"}), built up using L{Word}, L{Literal}, and L{And} elements 
(L{'+'<ParserElement.__add__>} operator gives L{And} expressions, strings are auto-converted to
L{Literal} expressions)::

    from pyparsing import Word, alphas

    # define grammar of a greeting
    greet = Word(alphas) + "," + Word(alphas) + "!"

    hello = "Hello, World!"
    print (hello, "->", greet.parseString(hello))

The program outputs the following::

    Hello, World! -> ['Hello', ',', 'World', '!']

The Python representation of the grammar is quite readable, owing to the self-explanatory
class names, and the use of '+', '|' and '^' operators.

The L{ParseResults} object returned from L{ParserElement.parseString<ParserElement.parseString>} can be accessed as a nested list, a dictionary, or an
object with named attributes.

The pyparsing module handles some of the problems that are typically vexing when writing text parsers:
 - extra or missing whitespace (the above program will also handle "Hello,World!", "Hello  ,  World  !", etc.)
 - quoted strings
 - embedded comments
z2.1.10z07 Oct 2016 01:31 UTCz*Paul McGuire <ptmcg@users.sourceforge.net>�N)�ref)�datetime)�RLock)�OrderedDict�And�CaselessKeyword�CaselessLiteral�
CharsNotIn�Combine�Dict�Each�Empty�
FollowedBy�Forward�
GoToColumn�Group�Keyword�LineEnd�	LineStart�Literal�
MatchFirst�NoMatch�NotAny�	OneOrMore�OnlyOnce�Optional�Or�ParseBaseException�ParseElementEnhance�ParseException�ParseExpression�ParseFatalException�ParseResults�ParseSyntaxException�
ParserElement�QuotedString�RecursiveGrammarException�Regex�SkipTo�	StringEnd�StringStart�Suppress�Token�TokenConverter�White�Word�WordEnd�	WordStart�
ZeroOrMore�	alphanums�alphas�
alphas8bit�anyCloseTag�
anyOpenTag�
cStyleComment�col�commaSeparatedList�commonHTMLEntity�countedArray�cppStyleComment�dblQuotedString�dblSlashComment�
delimitedList�dictOf�downcaseTokens�empty�hexnums�htmlComment�javaStyleComment�line�lineEnd�	lineStart�lineno�makeHTMLTags�makeXMLTags�matchOnlyAtCol�matchPreviousExpr�matchPreviousLiteral�
nestedExpr�nullDebugAction�nums�oneOf�opAssoc�operatorPrecedence�
printables�punc8bit�pythonStyleComment�quotedString�removeQuotes�replaceHTMLEntity�replaceWith�
restOfLine�sglQuotedString�srange�	stringEnd�stringStart�traceParseAction�
unicodeString�upcaseTokens�
withAttribute�
indentedBlock�originalTextFor�ungroup�
infixNotation�locatedExpr�	withClass�
CloseMatch�tokenMap�pyparsing_common�cCs`t|t�r|Syt|�Stk
rZt|�jtj�d�}td�}|jdd��|j	|�SXdS)aDrop-in replacement for str(obj) that tries to be Unicode friendly. It first tries
           str(obj). If that fails with a UnicodeEncodeError, then it tries unicode(obj). It
           then < returns the unicode object | encodes it with the default encoding | ... >.
        �xmlcharrefreplacez&#\d+;cSs$dtt|ddd���dd�S)Nz\ur�����)�hex�int)�t�rw�/usr/lib/python3.6/pyparsing.py�<lambda>�sz_ustr.<locals>.<lambda>N)
�
isinstanceZunicode�str�UnicodeEncodeError�encode�sys�getdefaultencodingr'�setParseAction�transformString)�obj�retZ
xmlcharrefrwrwrx�_ustr�s
r�z6sum len sorted reversed list tuple set any all min maxccs|]
}|VqdS)Nrw)�.0�yrwrwrx�	<genexpr>�sr�rrcCs>d}dd�dj�D�}x"t||�D]\}}|j||�}q"W|S)z/Escape &, <, >, ", ', etc. in a string of data.z&><"'css|]}d|dVqdS)�&�;Nrw)r��srwrwrxr��sz_xml_escape.<locals>.<genexpr>zamp gt lt quot apos)�split�zip�replace)�dataZfrom_symbolsZ
to_symbolsZfrom_Zto_rwrwrx�_xml_escape�s
r�c@seZdZdS)�
_ConstantsN)�__name__�
__module__�__qualname__rwrwrwrxr��sr��
0123456789ZABCDEFabcdef�\�ccs|]}|tjkr|VqdS)N)�stringZ
whitespace)r��crwrwrxr��sc@sPeZdZdZddd�Zedd��Zdd	�Zd
d�Zdd
�Z	ddd�Z
dd�ZdS)rz7base exception class for all parsing runtime exceptionsrNcCs>||_|dkr||_d|_n||_||_||_|||f|_dS)Nr�)�loc�msg�pstr�
parserElement�args)�selfr�r�r��elemrwrwrx�__init__�szParseBaseException.__init__cCs||j|j|j|j�S)z�
        internal factory method to simplify creating one type of ParseException 
        from another - avoids having __init__ signature conflicts among subclasses
        )r�r�r�r�)�cls�perwrwrx�_from_exception�sz"ParseBaseException._from_exceptioncCsN|dkrt|j|j�S|dkr,t|j|j�S|dkrBt|j|j�St|��dS)z�supported attributes by name are:
            - lineno - returns the line number of the exception text
            - col - returns the column number of the exception text
            - line - returns the line containing the exception text
        rJr9�columnrGN)r9r�)rJr�r�r9rG�AttributeError)r�Zanamerwrwrx�__getattr__�szParseBaseException.__getattr__cCsd|j|j|j|jfS)Nz"%s (at char %d), (line:%d, col:%d))r�r�rJr�)r�rwrwrx�__str__�szParseBaseException.__str__cCst|�S)N)r�)r�rwrwrx�__repr__�szParseBaseException.__repr__�>!<cCs<|j}|jd}|r4dj|d|�|||d�f�}|j�S)z�Extracts the exception line from the input string, and marks
           the location of the exception with a special symbol.
        rrr�N)rGr��join�strip)r�ZmarkerStringZline_strZline_columnrwrwrx�
markInputline�s
z ParseBaseException.markInputlinecCsdj�tt|��S)Nzlineno col line)r��dir�type)r�rwrwrx�__dir__�szParseBaseException.__dir__)rNN)r�)r�r�r��__doc__r��classmethodr�r�r�r�r�r�rwrwrwrxr�s


c@seZdZdZdS)raN
    Exception thrown when parse expressions don't match class;
    supported attributes by name are:
     - lineno - returns the line number of the exception text
     - col - returns the column number of the exception text
     - line - returns the line containing the exception text
        
    Example::
        try:
            Word(nums).setName("integer").parseString("ABC")
        except ParseException as pe:
            print(pe)
            print("column: {}".format(pe.col))
            
    prints::
       Expected integer (at char 0), (line:1, col:1)
        column: 1
    N)r�r�r�r�rwrwrwrxr�sc@seZdZdZdS)r!znuser-throwable exception thrown when inconsistent parse content
       is found; stops all parsing immediatelyN)r�r�r�r�rwrwrwrxr!sc@seZdZdZdS)r#z�just like L{ParseFatalException}, but thrown internally when an
       L{ErrorStop<And._ErrorStop>} ('-' operator) indicates that parsing is to stop 
       immediately because an unbacktrackable syntax error has been foundN)r�r�r�r�rwrwrwrxr#sc@s eZdZdZdd�Zdd�ZdS)r&zZexception thrown by L{ParserElement.validate} if the grammar could be improperly recursivecCs
||_dS)N)�parseElementTrace)r��parseElementListrwrwrxr�sz"RecursiveGrammarException.__init__cCs
d|jS)NzRecursiveGrammarException: %s)r�)r�rwrwrxr� sz!RecursiveGrammarException.__str__N)r�r�r�r�r�r�rwrwrwrxr&sc@s,eZdZdd�Zdd�Zdd�Zdd�Zd	S)
�_ParseResultsWithOffsetcCs||f|_dS)N)�tup)r�Zp1Zp2rwrwrxr�$sz _ParseResultsWithOffset.__init__cCs
|j|S)N)r�)r��irwrwrx�__getitem__&sz#_ParseResultsWithOffset.__getitem__cCst|jd�S)Nr)�reprr�)r�rwrwrxr�(sz _ParseResultsWithOffset.__repr__cCs|jd|f|_dS)Nr)r�)r�r�rwrwrx�	setOffset*sz!_ParseResultsWithOffset.setOffsetN)r�r�r�r�r�r�r�rwrwrwrxr�#sr�c@s�eZdZdZd[dd�Zddddefdd�Zdd	�Zefd
d�Zdd
�Z	dd�Z
dd�Zdd�ZeZ
dd�Zdd�Zdd�Zdd�Zdd�Zer�eZeZeZn$eZeZeZdd�Zd d!�Zd"d#�Zd$d%�Zd&d'�Zd\d(d)�Zd*d+�Zd,d-�Zd.d/�Zd0d1�Z d2d3�Z!d4d5�Z"d6d7�Z#d8d9�Z$d:d;�Z%d<d=�Z&d]d?d@�Z'dAdB�Z(dCdD�Z)dEdF�Z*d^dHdI�Z+dJdK�Z,dLdM�Z-d_dOdP�Z.dQdR�Z/dSdT�Z0dUdV�Z1dWdX�Z2dYdZ�Z3dS)`r"aI
    Structured parse results, to provide multiple means of access to the parsed data:
       - as a list (C{len(results)})
       - by list index (C{results[0], results[1]}, etc.)
       - by attribute (C{results.<resultsName>} - see L{ParserElement.setResultsName})

    Example::
        integer = Word(nums)
        date_str = (integer.setResultsName("year") + '/' 
                        + integer.setResultsName("month") + '/' 
                        + integer.setResultsName("day"))
        # equivalent form:
        # date_str = integer("year") + '/' + integer("month") + '/' + integer("day")

        # parseString returns a ParseResults object
        result = date_str.parseString("1999/12/31")

        def test(s, fn=repr):
            print("%s -> %s" % (s, fn(eval(s))))
        test("list(result)")
        test("result[0]")
        test("result['month']")
        test("result.day")
        test("'month' in result")
        test("'minutes' in result")
        test("result.dump()", str)
    prints::
        list(result) -> ['1999', '/', '12', '/', '31']
        result[0] -> '1999'
        result['month'] -> '12'
        result.day -> '31'
        'month' in result -> True
        'minutes' in result -> False
        result.dump() -> ['1999', '/', '12', '/', '31']
        - day: 31
        - month: 12
        - year: 1999
    NTcCs"t||�r|Stj|�}d|_|S)NT)rz�object�__new__�_ParseResults__doinit)r��toklist�name�asList�modalZretobjrwrwrxr�Ts


zParseResults.__new__c
Cs`|jrvd|_d|_d|_i|_||_||_|dkr6g}||t�rP|dd�|_n||t�rft|�|_n|g|_t	�|_
|dk	o�|�r\|s�d|j|<||t�r�t|�}||_||t
d�ttf�o�|ddgfk�s\||t�r�|g}|�r&||t��rt|j�d�||<ntt|d�d�||<|||_n6y|d||<Wn$tttfk
�rZ|||<YnXdS)NFrr�)r��_ParseResults__name�_ParseResults__parent�_ParseResults__accumNames�_ParseResults__asList�_ParseResults__modal�list�_ParseResults__toklist�_generatorType�dict�_ParseResults__tokdictrur�r��
basestringr"r��copy�KeyError�	TypeError�
IndexError)r�r�r�r�r�rzrwrwrxr�]sB



$
zParseResults.__init__cCsPt|ttf�r|j|S||jkr4|j|ddStdd�|j|D��SdS)NrrrcSsg|]}|d�qS)rrw)r��vrwrwrx�
<listcomp>�sz,ParseResults.__getitem__.<locals>.<listcomp>rs)rzru�slicer�r�r�r")r�r�rwrwrxr��s


zParseResults.__getitem__cCs�||t�r0|jj|t��|g|j|<|d}nD||ttf�rN||j|<|}n&|jj|t��t|d�g|j|<|}||t�r�t|�|_	dS)Nr)
r�r��getr�rur�r�r"�wkrefr�)r��kr�rz�subrwrwrx�__setitem__�s


"
zParseResults.__setitem__c
Cs�t|ttf�r�t|j�}|j|=t|t�rH|dkr:||7}t||d�}tt|j|���}|j�x^|j	j
�D]F\}}x<|D]4}x.t|�D]"\}\}}	t||	|	|k�||<q�Wq|WqnWn|j	|=dS)Nrrr)
rzrur��lenr�r��range�indices�reverser��items�	enumerater�)
r�r�ZmylenZremovedr��occurrences�jr��value�positionrwrwrx�__delitem__�s


$zParseResults.__delitem__cCs
||jkS)N)r�)r�r�rwrwrx�__contains__�szParseResults.__contains__cCs
t|j�S)N)r�r�)r�rwrwrx�__len__�szParseResults.__len__cCs
|jS)N)r�)r�rwrwrx�__bool__�szParseResults.__bool__cCs
t|j�S)N)�iterr�)r�rwrwrx�__iter__�szParseResults.__iter__cCst|jddd��S)Nrrrs)r�r�)r�rwrwrx�__reversed__�szParseResults.__reversed__cCs$t|jd�r|jj�St|j�SdS)N�iterkeys)�hasattrr�r�r�)r�rwrwrx�	_iterkeys�s
zParseResults._iterkeyscs�fdd��j�D�S)Nc3s|]}�|VqdS)Nrw)r�r�)r�rwrxr��sz+ParseResults._itervalues.<locals>.<genexpr>)r�)r�rw)r�rx�_itervalues�szParseResults._itervaluescs�fdd��j�D�S)Nc3s|]}|�|fVqdS)Nrw)r�r�)r�rwrxr��sz*ParseResults._iteritems.<locals>.<genexpr>)r�)r�rw)r�rx�
_iteritems�szParseResults._iteritemscCst|j��S)zVReturns all named result keys (as a list in Python 2.x, as an iterator in Python 3.x).)r�r�)r�rwrwrx�keys�szParseResults.keyscCst|j��S)zXReturns all named result values (as a list in Python 2.x, as an iterator in Python 3.x).)r��
itervalues)r�rwrwrx�values�szParseResults.valuescCst|j��S)zfReturns all named result key-values (as a list of tuples in Python 2.x, as an iterator in Python 3.x).)r��	iteritems)r�rwrwrxr��szParseResults.itemscCs
t|j�S)z�Since keys() returns an iterator, this method is helpful in bypassing
           code that looks for the existence of any defined results names.)�boolr�)r�rwrwrx�haskeys�szParseResults.haskeyscOs�|s
dg}x6|j�D]*\}}|dkr2|d|f}qtd|��qWt|dt�sht|�dksh|d|kr�|d}||}||=|S|d}|SdS)a�
        Removes and returns item at specified index (default=C{last}).
        Supports both C{list} and C{dict} semantics for C{pop()}. If passed no
        argument or an integer argument, it will use C{list} semantics
        and pop tokens from the list of parsed tokens. If passed a 
        non-integer argument (most likely a string), it will use C{dict}
        semantics and pop the corresponding value from any defined 
        results names. A second default return value argument is 
        supported, just as in C{dict.pop()}.

        Example::
            def remove_first(tokens):
                tokens.pop(0)
            print(OneOrMore(Word(nums)).parseString("0 123 321")) # -> ['0', '123', '321']
            print(OneOrMore(Word(nums)).addParseAction(remove_first).parseString("0 123 321")) # -> ['123', '321']

            label = Word(alphas)
            patt = label("LABEL") + OneOrMore(Word(nums))
            print(patt.parseString("AAB 123 321").dump())

            # Use pop() in a parse action to remove named result (note that corresponding value is not
            # removed from list form of results)
            def remove_LABEL(tokens):
                tokens.pop("LABEL")
                return tokens
            patt.addParseAction(remove_LABEL)
            print(patt.parseString("AAB 123 321").dump())
        prints::
            ['AAB', '123', '321']
            - LABEL: AAB

            ['AAB', '123', '321']
        rr�defaultrz-pop() got an unexpected keyword argument '%s'Nrs)r�r�rzrur�)r�r��kwargsr�r��indexr�Zdefaultvaluerwrwrx�pop�s"zParseResults.popcCs||kr||S|SdS)ai
        Returns named result matching the given key, or if there is no
        such name, then returns the given C{defaultValue} or C{None} if no
        C{defaultValue} is specified.

        Similar to C{dict.get()}.
        
        Example::
            integer = Word(nums)
            date_str = integer("year") + '/' + integer("month") + '/' + integer("day")           

            result = date_str.parseString("1999/12/31")
            print(result.get("year")) # -> '1999'
            print(result.get("hour", "not specified")) # -> 'not specified'
            print(result.get("hour")) # -> None
        Nrw)r��key�defaultValuerwrwrxr�szParseResults.getcCsZ|jj||�xF|jj�D]8\}}x.t|�D]"\}\}}t||||k�||<q,WqWdS)a
        Inserts new element at location index in the list of parsed tokens.
        
        Similar to C{list.insert()}.

        Example::
            print(OneOrMore(Word(nums)).parseString("0 123 321")) # -> ['0', '123', '321']

            # use a parse action to insert the parse location in the front of the parsed results
            def insert_locn(locn, tokens):
                tokens.insert(0, locn)
            print(OneOrMore(Word(nums)).addParseAction(insert_locn).parseString("0 123 321")) # -> [0, '0', '123', '321']
        N)r��insertr�r�r�r�)r�r�ZinsStrr�r�r�r�r�rwrwrxr�2szParseResults.insertcCs|jj|�dS)a�
        Add single element to end of ParseResults list of elements.

        Example::
            print(OneOrMore(Word(nums)).parseString("0 123 321")) # -> ['0', '123', '321']
            
            # use a parse action to compute the sum of the parsed integers, and add it to the end
            def append_sum(tokens):
                tokens.append(sum(map(int, tokens)))
            print(OneOrMore(Word(nums)).addParseAction(append_sum).parseString("0 123 321")) # -> ['0', '123', '321', 444]
        N)r��append)r��itemrwrwrxr�FszParseResults.appendcCs$t|t�r||7}n|jj|�dS)a
        Add sequence of elements to end of ParseResults list of elements.

        Example::
            patt = OneOrMore(Word(alphas))
            
            # use a parse action to append the reverse of the matched strings, to make a palindrome
            def make_palindrome(tokens):
                tokens.extend(reversed([t[::-1] for t in tokens]))
                return ''.join(tokens)
            print(patt.addParseAction(make_palindrome).parseString("lskdj sdlkjf lksd")) # -> 'lskdjsdlkjflksddsklfjkldsjdksl'
        N)rzr"r��extend)r�Zitemseqrwrwrxr�Ts

zParseResults.extendcCs|jdd�=|jj�dS)z7
        Clear all elements and results names.
        N)r�r��clear)r�rwrwrxr�fszParseResults.clearcCsfy||Stk
rdSX||jkr^||jkrD|j|ddStdd�|j|D��SndSdS)Nr�rrrcSsg|]}|d�qS)rrw)r�r�rwrwrxr�wsz,ParseResults.__getattr__.<locals>.<listcomp>rs)r�r�r�r")r�r�rwrwrxr�ms

zParseResults.__getattr__cCs|j�}||7}|S)N)r�)r��otherr�rwrwrx�__add__{szParseResults.__add__cs�|jrnt|j���fdd��|jj�}�fdd�|D�}x4|D],\}}|||<t|dt�r>t|�|d_q>W|j|j7_|jj	|j�|S)Ncs|dkr�S|�S)Nrrw)�a)�offsetrwrxry�sz'ParseResults.__iadd__.<locals>.<lambda>c	s4g|],\}}|D]}|t|d�|d��f�qqS)rrr)r�)r�r��vlistr�)�	addoffsetrwrxr��sz)ParseResults.__iadd__.<locals>.<listcomp>r)
r�r�r�r�rzr"r�r�r��update)r�r�Z
otheritemsZotherdictitemsr�r�rw)rrrx�__iadd__�s


zParseResults.__iadd__cCs&t|t�r|dkr|j�S||SdS)Nr)rzrur�)r�r�rwrwrx�__radd__�szParseResults.__radd__cCsdt|j�t|j�fS)Nz(%s, %s))r�r�r�)r�rwrwrxr��szParseResults.__repr__cCsddjdd�|jD��dS)N�[z, css(|] }t|t�rt|�nt|�VqdS)N)rzr"r�r�)r�r�rwrwrxr��sz'ParseResults.__str__.<locals>.<genexpr>�])r�r�)r�rwrwrxr��szParseResults.__str__r�cCsPg}xF|jD]<}|r"|r"|j|�t|t�r:||j�7}q|jt|��qW|S)N)r�r�rzr"�
_asStringListr�)r��sep�outr�rwrwrxr
�s

zParseResults._asStringListcCsdd�|jD�S)a�
        Returns the parse results as a nested list of matching tokens, all converted to strings.

        Example::
            patt = OneOrMore(Word(alphas))
            result = patt.parseString("sldkj lsdkj sldkj")
            # even though the result prints in string-like form, it is actually a pyparsing ParseResults
            print(type(result), result) # -> <class 'pyparsing.ParseResults'> ['sldkj', 'lsdkj', 'sldkj']
            
            # Use asList() to create an actual list
            result_list = result.asList()
            print(type(result_list), result_list) # -> <class 'list'> ['sldkj', 'lsdkj', 'sldkj']
        cSs"g|]}t|t�r|j�n|�qSrw)rzr"r�)r��resrwrwrxr��sz'ParseResults.asList.<locals>.<listcomp>)r�)r�rwrwrxr��szParseResults.asListcs6tr|j}n|j}�fdd��t�fdd�|�D��S)a�
        Returns the named parse results as a nested dictionary.

        Example::
            integer = Word(nums)
            date_str = integer("year") + '/' + integer("month") + '/' + integer("day")
            
            result = date_str.parseString('12/31/1999')
            print(type(result), repr(result)) # -> <class 'pyparsing.ParseResults'> (['12', '/', '31', '/', '1999'], {'day': [('1999', 4)], 'year': [('12', 0)], 'month': [('31', 2)]})
            
            result_dict = result.asDict()
            print(type(result_dict), repr(result_dict)) # -> <class 'dict'> {'day': '1999', 'year': '12', 'month': '31'}

            # even though a ParseResults supports dict-like access, sometimes you just need to have a dict
            import json
            print(json.dumps(result)) # -> Exception: TypeError: ... is not JSON serializable
            print(json.dumps(result.asDict())) # -> {"month": "31", "day": "1999", "year": "12"}
        cs6t|t�r.|j�r|j�S�fdd�|D�Sn|SdS)Ncsg|]}�|��qSrwrw)r�r�)�toItemrwrxr��sz7ParseResults.asDict.<locals>.toItem.<locals>.<listcomp>)rzr"r��asDict)r�)rrwrxr�s

z#ParseResults.asDict.<locals>.toItemc3s|]\}}|�|�fVqdS)Nrw)r�r�r�)rrwrxr��sz&ParseResults.asDict.<locals>.<genexpr>)�PY_3r�r�r�)r�Zitem_fnrw)rrxr�s
	zParseResults.asDictcCs8t|j�}|jj�|_|j|_|jj|j�|j|_|S)zA
        Returns a new copy of a C{ParseResults} object.
        )r"r�r�r�r�r�rr�)r�r�rwrwrxr��s
zParseResults.copyFcCsPd}g}tdd�|jj�D��}|d}|s8d}d}d}d}	|dk	rJ|}	n|jrV|j}	|	sf|rbdSd}	|||d|	d	g7}x�t|j�D]�\}
}t|t�r�|
|kr�||j||
|o�|dk||�g7}n||jd|o�|dk||�g7}q�d}|
|kr�||
}|�s
|�rq�nd}t	t
|��}
|||d|d	|
d
|d	g	7}q�W|||d
|	d	g7}dj|�S)z�
        (Deprecated) Returns the parse results as XML. Tags are created for tokens and lists that have defined results names.
        �
css(|] \}}|D]}|d|fVqqdS)rrNrw)r�r�rr�rwrwrxr��sz%ParseResults.asXML.<locals>.<genexpr>z  r�NZITEM�<�>z</)r�r�r�r�r�r�rzr"�asXMLr�r�r�)r�ZdoctagZnamedItemsOnly�indentZ	formatted�nlrZ
namedItemsZnextLevelIndentZselfTagr�r
ZresTagZxmlBodyTextrwrwrxr�sT


zParseResults.asXMLcCs:x4|jj�D]&\}}x|D]\}}||kr|SqWqWdS)N)r�r�)r�r�r�rr�r�rwrwrxZ__lookup$s
zParseResults.__lookupcCs�|jr|jS|jr.|j�}|r(|j|�SdSnNt|�dkrxt|j�dkrxtt|jj���dddkrxtt|jj���SdSdS)a(
        Returns the results name for this token expression. Useful when several 
        different expressions might match at a particular location.

        Example::
            integer = Word(nums)
            ssn_expr = Regex(r"\d\d\d-\d\d-\d\d\d\d")
            house_number_expr = Suppress('#') + Word(nums, alphanums)
            user_data = (Group(house_number_expr)("house_number") 
                        | Group(ssn_expr)("ssn")
                        | Group(integer)("age"))
            user_info = OneOrMore(user_data)
            
            result = user_info.parseString("22 111-22-3333 #221B")
            for item in result:
                print(item.getName(), ':', item[0])
        prints::
            age : 22
            ssn : 111-22-3333
            house_number : 221B
        Nrrrrs)rrs)	r�r��_ParseResults__lookupr�r��nextr�r�r�)r��parrwrwrx�getName+s
zParseResults.getNamercCsbg}d}|j|t|j���|�rX|j�r�tdd�|j�D��}xz|D]r\}}|r^|j|�|jd|d||f�t|t�r�|r�|j|j||d��q�|jt|��qH|jt	|��qHWn�t
dd�|D���rX|}x~t|�D]r\}	}
t|
t��r*|jd|d||	|d|d|
j||d�f�q�|jd|d||	|d|dt|
�f�q�Wd	j|�S)
aH
        Diagnostic method for listing out the contents of a C{ParseResults}.
        Accepts an optional C{indent} argument so that this string can be embedded
        in a nested display of other data.

        Example::
            integer = Word(nums)
            date_str = integer("year") + '/' + integer("month") + '/' + integer("day")
            
            result = date_str.parseString('12/31/1999')
            print(result.dump())
        prints::
            ['12', '/', '31', '/', '1999']
            - day: 1999
            - month: 31
            - year: 12
        rcss|]\}}t|�|fVqdS)N)r{)r�r�r�rwrwrxr�gsz$ParseResults.dump.<locals>.<genexpr>z
%s%s- %s: z  rrcss|]}t|t�VqdS)N)rzr")r��vvrwrwrxr�ssz
%s%s[%d]:
%s%s%sr�)
r�r�r�r��sortedr�rzr"�dumpr��anyr�r�)r�r�depth�fullr�NLr�r�r�r�rrwrwrxrPs,

4.zParseResults.dumpcOstj|j�f|�|�dS)a�
        Pretty-printer for parsed results as a list, using the C{pprint} module.
        Accepts additional positional or keyword args as defined for the 
        C{pprint.pprint} method. (U{http://docs.python.org/3/library/pprint.html#pprint.pprint})

        Example::
            ident = Word(alphas, alphanums)
            num = Word(nums)
            func = Forward()
            term = ident | num | Group('(' + func + ')')
            func <<= ident + Group(Optional(delimitedList(term)))
            result = func.parseString("fna a,b,(fnb c,d,200),100")
            result.pprint(width=40)
        prints::
            ['fna',
             ['a',
              'b',
              ['(', 'fnb', ['c', 'd', '200'], ')'],
              '100']]
        N)�pprintr�)r�r�r�rwrwrxr"}szParseResults.pprintcCs.|j|jj�|jdk	r|j�p d|j|jffS)N)r�r�r�r�r�r�)r�rwrwrx�__getstate__�s
zParseResults.__getstate__cCsN|d|_|d\|_}}|_i|_|jj|�|dk	rDt|�|_nd|_dS)Nrrr)r�r�r�r�rr�r�)r��staterZinAccumNamesrwrwrx�__setstate__�s
zParseResults.__setstate__cCs|j|j|j|jfS)N)r�r�r�r�)r�rwrwrx�__getnewargs__�szParseResults.__getnewargs__cCstt|��t|j��S)N)r�r�r�r�)r�rwrwrxr��szParseResults.__dir__)NNTT)N)r�)NFr�T)r�rT)4r�r�r�r�r�rzr�r�r�r�r�r�r��__nonzero__r�r�r�r�r�rr�r�r�r�r�r�r�r�r�r�r�r�r�r�rrrr�r�r
r�rr�rrrrr"r#r%r&r�rwrwrwrxr"-sh&
	'	
4

#
=%
-
def col (loc, strg):
    """Returns current column within a string, counting newlines as line separators.
   The first column is number 1.

   Note: the default parsing behavior is to expand tabs in the input string
   before starting the parsing process.  See L{I{ParserElement.parseString}<ParserElement.parseString>} for more information
   on parsing strings containing C{<TAB>}s, and suggested methods to maintain a
   consistent view of the parsed string, the parse location, and line and column
   positions within the parsed string.
   """
    s = strg
    return 1 if 0 < loc < len(s) and s[loc-1] == '\n' else loc - s.rfind("\n", 0, loc)
def lineno(loc, strg):
    """Returns current line number within a string, counting newlines as line separators.
   The first line is number 1.

   Note: the default parsing behavior is to expand tabs in the input string
   before starting the parsing process.  See L{I{ParserElement.parseString}<ParserElement.parseString>} for more information
   on parsing strings containing C{<TAB>}s, and suggested methods to maintain a
   consistent view of the parsed string, the parse location, and line and column
   positions within the parsed string.
   """
    return strg.count("\n", 0, loc) + 1

def line(loc, strg):
    """Returns the line of text containing loc within a string, counting newlines as line separators."""
    lastCR = strg.rfind("\n", 0, loc)
    nextCR = strg.find("\n", loc)
    return strg[lastCR + 1:nextCR] if nextCR >= 0 else strg[lastCR + 1:]
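# Usage sketch for the location helpers above (illustrative input):
data = "abc\ndef"
loc = 5                                     # index of the 'e' in "def"
print(lineno(loc, data), col(loc, data))    # -> 2 2
print(line(loc, data))                      # -> 'def'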
cCs8tdt|�dt|�dt||�t||�f�dS)NzMatch z at loc z(%d,%d))�printr�rJr9)�instringr��exprrwrwrx�_defaultStartDebugAction�sr/cCs$tdt|�dt|j���dS)NzMatched z -> )r,r�r{r�)r-�startlocZendlocr.�toksrwrwrx�_defaultSuccessDebugAction�sr2cCstdt|��dS)NzException raised:)r,r�)r-r�r.�excrwrwrx�_defaultExceptionDebugAction�sr4cGsdS)zG'Do-nothing' debug action, to suppress debugging output during parsing.Nrw)r�rwrwrxrQ�srqcs��tkr�fdd�Sdg�dg�tdd�dkrFddd	�}dd
d��ntj}tj�d}|dd
�d}|d|d|f�������fdd�}d}yt�dt�d�j�}Wntk
r�t��}YnX||_|S)Ncs�|�S)Nrw)r��lrv)�funcrwrxry�sz_trim_arity.<locals>.<lambda>rFrqro�cSs8tdkrdnd	}tj||dd�|}|j|jfgS)
Nror7rrqrr)�limit)ror7r������)�system_version�	traceback�
extract_stack�filenamerJ)r8r�
frame_summaryrwrwrxr=sz"_trim_arity.<locals>.extract_stackcSs$tj||d�}|d}|j|jfgS)N)r8rrrs)r<�
extract_tbr>rJ)�tbr8Zframesr?rwrwrxr@sz_trim_arity.<locals>.extract_tb�)r8rrcs�x�y �|�dd��}d�d<|Stk
r��dr>�n4z.tj�d}�|dd�ddd��ksj�Wd~X�d�kr��dd7<w�YqXqWdS)NrTrrrq)r8rsrs)r�r~�exc_info)r�r�rA)r@�
foundArityr6r8�maxargs�pa_call_line_synthrwrx�wrappers"z_trim_arity.<locals>.wrapperz<parse action>r��	__class__)ror7)r)rrs)	�singleArgBuiltinsr;r<r=r@�getattrr��	Exceptionr{)r6rEr=Z	LINE_DIFFZ	this_linerG�	func_namerw)r@rDr6r8rErFrx�_trim_arity�s*
rMcs�eZdZdZdZdZedd��Zedd��Zd�dd	�Z	d
d�Z
dd
�Zd�dd�Zd�dd�Z
dd�Zdd�Zdd�Zdd�Zdd�Zdd�Zd�dd �Zd!d"�Zd�d#d$�Zd%d&�Zd'd(�ZGd)d*�d*e�Zed+k	r�Gd,d-�d-e�ZnGd.d-�d-e�ZiZe�Zd/d/gZ d�d0d1�Z!eZ"ed2d3��Z#dZ$ed�d5d6��Z%d�d7d8�Z&e'dfd9d:�Z(d;d<�Z)e'fd=d>�Z*e'dfd?d@�Z+dAdB�Z,dCdD�Z-dEdF�Z.dGdH�Z/dIdJ�Z0dKdL�Z1dMdN�Z2dOdP�Z3dQdR�Z4dSdT�Z5dUdV�Z6dWdX�Z7dYdZ�Z8d�d[d\�Z9d]d^�Z:d_d`�Z;dadb�Z<dcdd�Z=dedf�Z>dgdh�Z?d�didj�Z@dkdl�ZAdmdn�ZBdodp�ZCdqdr�ZDgfdsdt�ZEd�dudv�ZF�fdwdx�ZGdydz�ZHd{d|�ZId}d~�ZJdd��ZKd�d�d��ZLd�d�d��ZM�ZNS)�r$z)Abstract base level parser element class.z 
	
FcCs
|t_dS)a�
        Overrides the default whitespace chars

        Example::
            # default whitespace chars are space, <TAB> and newline
            OneOrMore(Word(alphas)).parseString("abc def\nghi jkl")  # -> ['abc', 'def', 'ghi', 'jkl']
            
            # change to just treat newline as significant
            ParserElement.setDefaultWhitespaceChars(" \t")
            OneOrMore(Word(alphas)).parseString("abc def\nghi jkl")  # -> ['abc', 'def']
        N)r$�DEFAULT_WHITE_CHARS)�charsrwrwrx�setDefaultWhitespaceChars=s
z'ParserElement.setDefaultWhitespaceCharscCs
|t_dS)a�
        Set class to be used for inclusion of string literals into a parser.
        
        Example::
            # default literal class used is Literal
            integer = Word(nums)
            date_str = integer("year") + '/' + integer("month") + '/' + integer("day")           

            date_str.parseString("1999/12/31")  # -> ['1999', '/', '12', '/', '31']


            # change to Suppress
            ParserElement.inlineLiteralsUsing(Suppress)
            date_str = integer("year") + '/' + integer("month") + '/' + integer("day")           

            date_str.parseString("1999/12/31")  # -> ['1999', '12', '31']
        N)r$�_literalStringClass)r�rwrwrx�inlineLiteralsUsingLsz!ParserElement.inlineLiteralsUsingcCs�t�|_d|_d|_d|_||_d|_tj|_	d|_
d|_d|_t�|_
d|_d|_d|_d|_d|_d|_d|_d|_d|_dS)NTFr�)NNN)r��parseAction�
failAction�strRepr�resultsName�
saveAsList�skipWhitespacer$rN�
whiteChars�copyDefaultWhiteChars�mayReturnEmpty�keepTabs�ignoreExprs�debug�streamlined�
mayIndexError�errmsg�modalResults�debugActions�re�callPreparse�
callDuringTry)r��savelistrwrwrxr�as(zParserElement.__init__cCs<tj|�}|jdd�|_|jdd�|_|jr8tj|_|S)a$
        Make a copy of this C{ParserElement}.  Useful for defining different parse actions
        for the same parsing pattern, using copies of the original parse element.
        
        Example::
            integer = Word(nums).setParseAction(lambda toks: int(toks[0]))
            integerK = integer.copy().addParseAction(lambda toks: toks[0]*1024) + Suppress("K")
            integerM = integer.copy().addParseAction(lambda toks: toks[0]*1024*1024) + Suppress("M")
            
            print(OneOrMore(integerK | integerM | integer).parseString("5K 100 640K 256M"))
        prints::
            [5120, 100, 655360, 268435456]
        Equivalent form of C{expr.copy()} is just C{expr()}::
            integerM = integer().addParseAction(lambda toks: toks[0]*1024*1024) + Suppress("M")
        N)r�rSr]rZr$rNrY)r�Zcpyrwrwrxr�xs
zParserElement.copycCs*||_d|j|_t|d�r&|j|j_|S)af
        Define name for this expression, makes debugging and exception messages clearer.
        
        Example::
            Word(nums).parseString("ABC")  # -> Exception: Expected W:(0123...) (at char 0), (line:1, col:1)
            Word(nums).setName("integer").parseString("ABC")  # -> Exception: Expected integer (at char 0), (line:1, col:1)
        z	Expected �	exception)r�rar�rhr�)r�r�rwrwrx�setName�s


zParserElement.setNamecCs4|j�}|jd�r"|dd�}d}||_||_|S)aP
        Define name for referencing matching tokens as a nested attribute
        of the returned parse results.
        NOTE: this returns a *copy* of the original C{ParserElement} object;
        this is so that the client can define a basic element, such as an
        integer, and reference it in multiple places with different names.

        You can also set results names using the abbreviated syntax,
        C{expr("name")} in place of C{expr.setResultsName("name")} - 
        see L{I{__call__}<__call__>}.

        Example::
            date_str = (integer.setResultsName("year") + '/' 
                        + integer.setResultsName("month") + '/' 
                        + integer.setResultsName("day"))

            # equivalent form:
            date_str = integer("year") + '/' + integer("month") + '/' + integer("day")
        �*NrrTrs)r��endswithrVrb)r�r��listAllMatchesZnewselfrwrwrx�setResultsName�s
zParserElement.setResultsNameTcs@|r&|j�d�fdd�	}�|_||_nt|jd�r<|jj|_|S)z�Method to invoke the Python pdb debugger when this element is
           about to be parsed. Set C{breakFlag} to True to enable, False to
           disable.
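           For example (a minimal sketch)::
               Word(nums).setBreak().parseString("100")   # drops into pdb just before the match is attempted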
        Tcsddl}|j��||||�S)Nr)�pdbZ	set_trace)r-r��	doActions�callPreParsern)�_parseMethodrwrx�breaker�sz'ParserElement.setBreak.<locals>.breaker�_originalParseMethod)TT)�_parsersr�)r�Z	breakFlagrrrw)rqrx�setBreak�s
zParserElement.setBreakcOs&tttt|���|_|jdd�|_|S)a
        Define action to perform when successfully matching parse element definition.
        Parse action fn is a callable method with 0-3 arguments, called as C{fn(s,loc,toks)},
        C{fn(loc,toks)}, C{fn(toks)}, or just C{fn()}, where:
         - s   = the original string being parsed (see note below)
         - loc = the location of the matching substring
         - toks = a list of the matched tokens, packaged as a C{L{ParseResults}} object
        If the functions in fns modify the tokens, they can return them as the return
        value from fn, and the modified list of tokens will replace the original.
        Otherwise, fn does not need to return any value.

        Optional keyword arguments:
         - callDuringTry = (default=C{False}) indicate if parse action should be run during lookaheads and alternate testing

        Note: the default parsing behavior is to expand tabs in the input string
        before starting the parsing process.  See L{I{parseString}<parseString>} for more information
        on parsing strings containing C{<TAB>}s, and suggested methods to maintain a
        consistent view of the parsed string, the parse location, and line and column
        positions within the parsed string.
        
        Example::
            integer = Word(nums)
            date_str = integer + '/' + integer + '/' + integer

            date_str.parseString("1999/12/31")  # -> ['1999', '/', '12', '/', '31']

            # use parse action to convert to ints at parse time
            integer = Word(nums).setParseAction(lambda toks: int(toks[0]))
            date_str = integer + '/' + integer + '/' + integer

            # note that integer fields are now ints, not strings
            date_str.parseString("1999/12/31")  # -> [1999, '/', 12, '/', 31]
        rfF)r��maprMrSr�rf)r��fnsr�rwrwrxr��s"zParserElement.setParseActioncOs4|jtttt|���7_|jp,|jdd�|_|S)z�
        Add parse action to expression's list of parse actions. See L{I{setParseAction}<setParseAction>}.
        
        See examples in L{I{copy}<copy>}.
        rfF)rSr�rvrMrfr�)r�rwr�rwrwrx�addParseAction�szParserElement.addParseActioncsb|jdd��|jdd�rtnt�x(|D] ����fdd�}|jj|�q&W|jpZ|jdd�|_|S)a�Add a boolean predicate function to expression's list of parse actions. See 
        L{I{setParseAction}<setParseAction>} for function call signatures. Unlike C{setParseAction}, 
        functions passed to C{addCondition} need to return boolean success/fail of the condition.

        Optional keyword arguments:
         - message = define a custom message to be used in the raised exception
         - fatal   = if True, will raise ParseFatalException to stop parsing immediately; otherwise will raise ParseException
         
        Example::
            integer = Word(nums).setParseAction(lambda toks: int(toks[0]))
            year_int = integer.copy()
            year_int.addCondition(lambda toks: toks[0] >= 2000, message="Only support years 2000 and later")
            date_str = year_int + '/' + integer + '/' + integer

            result = date_str.parseString("1999/12/31")  # -> Exception: Only support years 2000 and later (at char 0), (line:1, col:1)
        �messagezfailed user-defined condition�fatalFcs$tt��|||��s �||���dS)N)r�rM)r�r5rv)�exc_type�fnr�rwrx�pasz&ParserElement.addCondition.<locals>.parf)r�r!rrSr�rf)r�rwr�r}rw)r{r|r�rx�addCondition�s
zParserElement.addConditioncCs
||_|S)aDefine action to perform if parsing fails at this expression.
           Fail action fn is a callable function that takes the arguments
           C{fn(s,loc,expr,err)} where:
            - s = string being parsed
            - loc = location where expression match was attempted and failed
            - expr = the parse expression that failed
            - err = the exception thrown
           The function returns no value.  It may throw C{L{ParseFatalException}}
           if it is desired to stop parsing immediately.)rT)r�r|rwrwrx�
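# Usage sketch of a fail action, using the documented fn(s, loc, expr, err) signature
# (the name report_failure is illustrative):
from pyparsing import Literal, ParseException

def report_failure(s, loc, expr, err):
    print("no match for %s at column %d" % (expr, err.col))

closing = Literal(")").setFailAction(report_failure)
try:
    closing.parseString("]")
except ParseException:
    pass    # the fail action has already reported the miss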
setFailActions
zParserElement.setFailActioncCsZd}xP|rTd}xB|jD]8}yx|j||�\}}d}qWWqtk
rLYqXqWqW|S)NTF)r]rtr)r�r-r�Z
exprsFound�eZdummyrwrwrx�_skipIgnorables#szParserElement._skipIgnorablescCsL|jr|j||�}|jrH|j}t|�}x ||krF|||krF|d7}q(W|S)Nrr)r]r�rXrYr�)r�r-r�Zwt�instrlenrwrwrx�preParse0szParserElement.preParsecCs|gfS)Nrw)r�r-r�rorwrwrx�	parseImpl<szParserElement.parseImplcCs|S)Nrw)r�r-r��	tokenlistrwrwrx�	postParse?szParserElement.postParsec"Cs�|j}|s|jr�|jdr,|jd|||�|rD|jrD|j||�}n|}|}yDy|j|||�\}}Wn(tk
r�t|t|�|j	|��YnXWnXt
k
r�}	z<|jdr�|jd||||	�|jr�|j||||	��WYdd}	~	XnXn�|o�|j�r|j||�}n|}|}|j�s$|t|�k�rhy|j|||�\}}Wn*tk
�rdt|t|�|j	|��YnXn|j|||�\}}|j|||�}t
||j|j|jd�}
|j�r�|�s�|j�r�|�rVyRxL|jD]B}||||
�}|dk	�r�t
||j|j�o�t|t
tf�|jd�}
�q�WWnFt
k
�rR}	z(|jd�r@|jd||||	��WYdd}	~	XnXnNxL|jD]B}||||
�}|dk	�r^t
||j|j�o�t|t
tf�|jd�}
�q^W|�r�|jd�r�|jd|||||
�||
fS)Nrrq)r�r�rr)r^rTrcrer�r�r�rr�rarr`r�r"rVrWrbrSrfrzr�)r�r-r�rorpZ	debugging�prelocZtokensStart�tokens�errZ	retTokensr|rwrwrx�
_parseNoCacheCsp





zParserElement._parseNoCachecCs>y|j||dd�dStk
r8t|||j|��YnXdS)NF)ror)rtr!rra)r�r-r�rwrwrx�tryParse�szParserElement.tryParsecCs2y|j||�Wnttfk
r(dSXdSdS)NFT)r�rr�)r�r-r�rwrwrx�canParseNext�s
zParserElement.canParseNextc@seZdZdd�ZdS)zParserElement._UnboundedCachecsdi�t�|_���fdd�}�fdd�}�fdd�}tj||�|_tj||�|_tj||�|_dS)Ncs�j|��S)N)r�)r�r�)�cache�not_in_cacherwrxr��sz3ParserElement._UnboundedCache.__init__.<locals>.getcs|�|<dS)Nrw)r�r�r�)r�rwrx�set�sz3ParserElement._UnboundedCache.__init__.<locals>.setcs�j�dS)N)r�)r�)r�rwrxr��sz5ParserElement._UnboundedCache.__init__.<locals>.clear)r�r��types�
MethodTyper�r�r�)r�r�r�r�rw)r�r�rxr��sz&ParserElement._UnboundedCache.__init__N)r�r�r�r�rwrwrwrx�_UnboundedCache�sr�Nc@seZdZdd�ZdS)zParserElement._FifoCachecsht�|_�t����fdd�}��fdd�}�fdd�}tj||�|_tj||�|_tj||�|_dS)Ncs�j|��S)N)r�)r�r�)r�r�rwrxr��sz.ParserElement._FifoCache.__init__.<locals>.getcs"|�|<t���kr�jd�dS)NF)r��popitem)r�r�r�)r��sizerwrxr��sz.ParserElement._FifoCache.__init__.<locals>.setcs�j�dS)N)r�)r�)r�rwrxr��sz0ParserElement._FifoCache.__init__.<locals>.clear)r�r��_OrderedDictr�r�r�r�r�)r�r�r�r�r�rw)r�r�r�rxr��sz!ParserElement._FifoCache.__init__N)r�r�r�r�rwrwrwrx�
_FifoCache�sr�c@seZdZdd�ZdS)zParserElement._FifoCachecsvt�|_�i�tjg�����fdd�}���fdd�}��fdd�}tj||�|_tj||�|_tj||�|_dS)Ncs�j|��S)N)r�)r�r�)r�r�rwrxr��sz.ParserElement._FifoCache.__init__.<locals>.getcs2|�|<t���kr$�j�j�d��j|�dS)N)r�r��popleftr�)r�r�r�)r��key_fifor�rwrxr��sz.ParserElement._FifoCache.__init__.<locals>.setcs�j��j�dS)N)r�)r�)r�r�rwrxr��sz0ParserElement._FifoCache.__init__.<locals>.clear)	r�r��collections�dequer�r�r�r�r�)r�r�r�r�r�rw)r�r�r�r�rxr��sz!ParserElement._FifoCache.__init__N)r�r�r�r�rwrwrwrxr��srcCs�d\}}|||||f}tj��tj}|j|�}	|	|jkr�tj|d7<y|j||||�}	Wn8tk
r�}
z|j||
j	|
j
���WYdd}
~
Xq�X|j||	d|	dj�f�|	Sn4tj|d7<t|	t
�r�|	�|	d|	dj�fSWdQRXdS)Nrrr)rrr)r$�packrat_cache_lock�
packrat_cacher�r��packrat_cache_statsr�rr�rHr�r�rzrK)r�r-r�rorpZHITZMISS�lookupr�r�r�rwrwrx�_parseCache�s$


zParserElement._parseCachecCs(tjj�dgttj�tjdd�<dS)Nr)r$r�r�r�r�rwrwrwrx�
resetCache�s
zParserElement.resetCache�cCs8tjs4dt_|dkr tj�t_ntj|�t_tjt_dS)a�Enables "packrat" parsing, which adds memoizing to the parsing logic.
           Repeated parse attempts at the same string location (which happens
           often in many complex grammars) can immediately return a cached value,
           instead of re-executing parsing/validating code.  Memoizing is done for
           both valid results and parsing exceptions.
           
           Parameters:
            - cache_size_limit - (default=C{128}) - if an integer value is provided
              will limit the size of the packrat cache; if None is passed, then
              the cache size will be unbounded; if 0 is passed, the cache will
              be effectively disabled.
            
           This speedup may break existing programs that use parse actions that
           have side-effects.  For this reason, packrat parsing is disabled when
           you first import pyparsing.  To activate the packrat feature, your
           program must call the class method C{ParserElement.enablePackrat()}.  If
           your program uses C{psyco} to "compile as you go", you must call
           C{enablePackrat} before calling C{psyco.full()}.  If you do not do this,
           Python will crash.  For best results, call C{enablePackrat()} immediately
           after importing pyparsing.
           
           Example::
               import pyparsing
               pyparsing.ParserElement.enablePackrat()
        TN)r$�_packratEnabledr�r�r�r�rt)Zcache_size_limitrwrwrx�
enablePackratszParserElement.enablePackratcCs�tj�|js|j�x|jD]}|j�qW|js<|j�}y<|j|d�\}}|rv|j||�}t	�t
�}|j||�Wn0tk
r�}ztjr��n|�WYdd}~XnX|SdS)aB
        Execute the parse expression with the given string.
        This is the main interface to the client code, once the complete
        expression has been built.

        If you want the grammar to require that the entire input string be
        successfully parsed, then set C{parseAll} to True (equivalent to ending
        the grammar with C{L{StringEnd()}}).

        Note: C{parseString} implicitly calls C{expandtabs()} on the input string,
        in order to report proper column numbers in parse actions.
        If the input string contains tabs and
        the grammar uses parse actions that use the C{loc} argument to index into the
        string being parsed, you can ensure you have a consistent view of the input
        string by:
         - calling C{parseWithTabs} on your grammar before calling C{parseString}
           (see L{I{parseWithTabs}<parseWithTabs>})
         - define your parse action using the full C{(s,loc,toks)} signature, and
           reference the input string using the parse action's C{s} argument
          - explicitly expand the tabs in your input string before calling
           C{parseString}
        
        Example::
            Word('a').parseString('aaaaabaaa')  # -> ['aaaaa']
            Word('a').parseString('aaaaabaaa', parseAll=True)  # -> Exception: Expected end of text
        rN)
r$r�r_�
streamliner]r\�
expandtabsrtr�r
r)r�verbose_stacktrace)r�r-�parseAllr�r�r�Zser3rwrwrx�parseString#s$zParserElement.parseStringccs@|js|j�x|jD]}|j�qW|js8t|�j�}t|�}d}|j}|j}t	j
�d}	y�x�||kon|	|k�ry |||�}
|||
dd�\}}Wntk
r�|
d}Yq`X||kr�|	d7}	||
|fV|r�|||�}
|
|kr�|}q�|d7}n|}q`|
d}q`WWn4tk
�r:}zt	j
�r&�n|�WYdd}~XnXdS)a�
        Scan the input string for expression matches.  Each match will return the
        matching tokens, start location, and end location.  May be called with optional
        C{maxMatches} argument, to clip scanning after 'n' matches are found.  If
        C{overlap} is specified, then overlapping matches will be reported.

        Note that the start and end locations are reported relative to the string
        being parsed.  See L{I{parseString}<parseString>} for more information on parsing
        strings with embedded tabs.

        Example::
            source = "sldjf123lsdjjkf345sldkjf879lkjsfd987"
            print(source)
            for tokens,start,end in Word(alphas).scanString(source):
                print(' '*start + '^'*(end-start))
                print(' '*start + tokens[0])
        
        prints::
        
            sldjf123lsdjjkf345sldkjf879lkjsfd987
            ^^^^^
            sldjf
                    ^^^^^^^
                    lsdjjkf
                              ^^^^^^
                              sldkjf
                                       ^^^^^^
                                       lkjsfd
        rF)rprrN)r_r�r]r\r�r�r�r�rtr$r�rrr�)r�r-�
maxMatchesZoverlapr�r�r�Z
preparseFnZparseFn�matchesr�ZnextLocr�Znextlocr3rwrwrx�
scanStringUsB


zParserElement.scanStringcCs�g}d}d|_y�xh|j|�D]Z\}}}|j|||��|rrt|t�rT||j�7}nt|t�rh||7}n
|j|�|}qW|j||d��dd�|D�}djtt	t
|���Stk
r�}ztj
rȂn|�WYdd}~XnXdS)af
        Extension to C{L{scanString}}, to modify matching text with modified tokens that may
        be returned from a parse action.  To use C{transformString}, define a grammar and
        attach a parse action to it that modifies the returned token list.
        Invoking C{transformString()} on a target string will then scan for matches,
        and replace the matched text patterns according to the logic in the parse
        action.  C{transformString()} returns the resulting transformed string.
        
        Example::
            wd = Word(alphas)
            wd.setParseAction(lambda toks: toks[0].title())
            
            print(wd.transformString("now is the winter of our discontent made glorious summer by this sun of york."))
        Prints::
            Now Is The Winter Of Our Discontent Made Glorious Summer By This Sun Of York.
        rTNcSsg|]}|r|�qSrwrw)r��orwrwrxr��sz1ParserElement.transformString.<locals>.<listcomp>r�)r\r�r�rzr"r�r�r�rvr��_flattenrr$r�)r�r-rZlastErvr�r�r3rwrwrxr��s(



zParserElement.transformStringcCsPytdd�|j||�D��Stk
rJ}ztjr6�n|�WYdd}~XnXdS)a~
        Another extension to C{L{scanString}}, simplifying the access to the tokens found
        to match the given parse expression.  May be called with optional
        C{maxMatches} argument, to clip searching after 'n' matches are found.
        
        Example::
            # a capitalized word starts with an uppercase letter, followed by zero or more lowercase letters
            cap_word = Word(alphas.upper(), alphas.lower())
            
            print(cap_word.searchString("More than Iron, more than Lead, more than Gold I need Electricity"))
        prints::
            ['More', 'Iron', 'Lead', 'Gold', 'I']
        cSsg|]\}}}|�qSrwrw)r�rvr�r�rwrwrxr��sz.ParserElement.searchString.<locals>.<listcomp>N)r"r�rr$r�)r�r-r�r3rwrwrx�searchString�szParserElement.searchStringc	csXd}d}x<|j||d�D]*\}}}|||�V|r>|dV|}qW||d�VdS)a[
        Generator method to split a string using the given expression as a separator.
        May be called with optional C{maxsplit} argument, to limit the number of splits;
        and the optional C{includeSeparators} argument (default=C{False}), if the separating
        matching text should be included in the split results.
        
        Example::        
            punc = oneOf(list(".,;:/-!?"))
            print(list(punc.split("This, this?, this sentence, is badly punctuated!")))
        prints::
            ['This', ' this', '', ' this sentence', ' is badly punctuated', '']
        r)r�N)r�)	r�r-�maxsplitZincludeSeparatorsZsplitsZlastrvr�r�rwrwrxr��s

zParserElement.splitcCsFt|t�rtj|�}t|t�s:tjdt|�tdd�dSt||g�S)a�
        Implementation of + operator - returns C{L{And}}. Adding strings to a ParserElement
        converts them to L{Literal}s by default.
        
        Example::
            greet = Word(alphas) + "," + Word(alphas) + "!"
            hello = "Hello, World!"
            print (hello, "->", greet.parseString(hello))
        Prints::
            Hello, World! -> ['Hello', ',', 'World', '!']
        z4Cannot combine element of type %s with ParserElementrq)�
stacklevelN)	rzr�r$rQ�warnings�warnr��
SyntaxWarningr)r�r�rwrwrxr�s



zParserElement.__add__cCsBt|t�rtj|�}t|t�s:tjdt|�tdd�dS||S)z]
        Implementation of + operator when left operand is not a C{L{ParserElement}}
        z4Cannot combine element of type %s with ParserElementrq)r�N)rzr�r$rQr�r�r�r�)r�r�rwrwrxrs



zParserElement.__radd__cCsLt|t�rtj|�}t|t�s:tjdt|�tdd�dSt|tj	�|g�S)zQ
        Implementation of - operator, returns C{L{And}} with error stop
        z4Cannot combine element of type %s with ParserElementrq)r�N)
rzr�r$rQr�r�r�r�r�
_ErrorStop)r�r�rwrwrx�__sub__s



zParserElement.__sub__cCsBt|t�rtj|�}t|t�s:tjdt|�tdd�dS||S)z]
        Implementation of - operator when left operand is not a C{L{ParserElement}}
        z4Cannot combine element of type %s with ParserElementrq)r�N)rzr�r$rQr�r�r�r�)r�r�rwrwrx�__rsub__ s



zParserElement.__rsub__cs�t|t�r|d}}n�t|t�r�|ddd�}|ddkrHd|df}t|dt�r�|ddkr�|ddkrvt��S|ddkr�t��S�|dt��SnJt|dt�r�t|dt�r�|\}}||8}ntdt|d�t|d���ntdt|���|dk�rtd��|dk�rtd��||k�o2dkn�rBtd	��|�r���fd
d��|�r�|dk�rt��|�}nt�g|��|�}n�|�}n|dk�r��}nt�g|�}|S)
a�
        Implementation of * operator, allows use of C{expr * 3} in place of
        C{expr + expr + expr}.  Expressions may also be multiplied by a 2-integer
        tuple, similar to C{{min,max}} multipliers in regular expressions.  Tuples
        may also include C{None} as in:
         - C{expr*(n,None)} or C{expr*(n,)} is equivalent
              to C{expr*n + L{ZeroOrMore}(expr)}
              (read as "at least n instances of C{expr}")
         - C{expr*(None,n)} is equivalent to C{expr*(0,n)}
              (read as "0 to n instances of C{expr}")
         - C{expr*(None,None)} is equivalent to C{L{ZeroOrMore}(expr)}
         - C{expr*(1,None)} is equivalent to C{L{OneOrMore}(expr)}

        Note that C{expr*(None,n)} does not raise an exception if
        more than n exprs exist in the input stream; that is,
        C{expr*(None,n)} does not enforce a maximum number of expr
        occurrences.  If this behavior is desired, then write
        C{expr*(None,n) + ~expr}
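        For example (a minimal sketch, assuming the standard C{Word} and C{Suppress} helpers)::
            octet = Word(nums)
            ip_ish = octet + (Suppress('.') + octet) * 3      # exactly three more octets
            ip_ish.parseString("10.0.0.1")                    # -> ['10', '0', '0', '1']
            counted = octet * (2, 4)                          # from 2 to 4 octets
            counted.parseString("1 2 3")                      # -> ['1', '2', '3']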
        rNrqrrz7cannot multiply 'ParserElement' and ('%s','%s') objectsz0cannot multiply 'ParserElement' and '%s' objectsz/cannot multiply ParserElement by negative valuez@second tuple value must be greater or equal to first tuple valuez+cannot multiply ParserElement by 0 or (0,0)cs(|dkrt��|d��St��SdS)Nrr)r)�n)�makeOptionalListr�rwrxr�]sz/ParserElement.__mul__.<locals>.makeOptionalList)NN)	rzru�tupler2rr�r��
ValueErrorr)r�r�ZminElementsZoptElementsr�rw)r�r�rx�__mul__,sD







zParserElement.__mul__cCs
|j|�S)N)r�)r�r�rwrwrx�__rmul__pszParserElement.__rmul__cCsFt|t�rtj|�}t|t�s:tjdt|�tdd�dSt||g�S)zI
        Implementation of | operator - returns C{L{MatchFirst}}
        z4Cannot combine element of type %s with ParserElementrq)r�N)	rzr�r$rQr�r�r�r�r)r�r�rwrwrx�__or__ss



zParserElement.__or__cCsBt|t�rtj|�}t|t�s:tjdt|�tdd�dS||BS)z]
        Implementation of | operator when left operand is not a C{L{ParserElement}}
        z4Cannot combine element of type %s with ParserElementrq)r�N)rzr�r$rQr�r�r�r�)r�r�rwrwrx�__ror__s



zParserElement.__ror__cCsFt|t�rtj|�}t|t�s:tjdt|�tdd�dSt||g�S)zA
        Implementation of ^ operator - returns C{L{Or}}
        z4Cannot combine element of type %s with ParserElementrq)r�N)	rzr�r$rQr�r�r�r�r)r�r�rwrwrx�__xor__�s



zParserElement.__xor__cCsBt|t�rtj|�}t|t�s:tjdt|�tdd�dS||AS)z]
        Implementation of ^ operator when left operand is not a C{L{ParserElement}}
        z4Cannot combine element of type %s with ParserElementrq)r�N)rzr�r$rQr�r�r�r�)r�r�rwrwrx�__rxor__�s



zParserElement.__rxor__cCsFt|t�rtj|�}t|t�s:tjdt|�tdd�dSt||g�S)zC
        Implementation of & operator - returns C{L{Each}}
        z4Cannot combine element of type %s with ParserElementrq)r�N)	rzr�r$rQr�r�r�r�r)r�r�rwrwrx�__and__�s



zParserElement.__and__cCsBt|t�rtj|�}t|t�s:tjdt|�tdd�dS||@S)z]
        Implementation of & operator when left operand is not a C{L{ParserElement}}
        z4Cannot combine element of type %s with ParserElementrq)r�N)rzr�r$rQr�r�r�r�)r�r�rwrwrx�__rand__�s



zParserElement.__rand__cCst|�S)zE
        Implementation of ~ operator - returns C{L{NotAny}}
        )r)r�rwrwrx�
__invert__�szParserElement.__invert__cCs|dk	r|j|�S|j�SdS)a

        Shortcut for C{L{setResultsName}}, with C{listAllMatches=False}.
        
        If C{name} is given with a trailing C{'*'} character, then C{listAllMatches} will be
        passed as C{True}.
           
        If C{name} is omitted, same as calling C{L{copy}}.

        Example::
            # these are equivalent
            userdata = Word(alphas).setResultsName("name") + Word(nums+"-").setResultsName("socsecno")
            userdata = Word(alphas)("name") + Word(nums+"-")("socsecno")             
        N)rmr�)r�r�rwrwrx�__call__�s
zParserElement.__call__cCst|�S)z�
        Suppresses the output of this C{ParserElement}; useful to keep punctuation from
        cluttering up returned output.
        )r+)r�rwrwrx�suppress�szParserElement.suppresscCs
d|_|S)a
        Disables the skipping of whitespace before matching the characters in the
        C{ParserElement}'s defined pattern.  This is normally only used internally by
        the pyparsing module, but may be needed in some whitespace-sensitive grammars.
        F)rX)r�rwrwrx�leaveWhitespace�szParserElement.leaveWhitespacecCsd|_||_d|_|S)z8
        Overrides the default whitespace chars
        TF)rXrYrZ)r�rOrwrwrx�setWhitespaceChars�sz ParserElement.setWhitespaceCharscCs
d|_|S)z�
        Overrides default behavior to expand C{<TAB>}s to spaces before parsing the input string.
        Must be called before C{parseString} when the input grammar contains elements that
        match C{<TAB>} characters.
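        Example (a minimal sketch, matching the tab explicitly with C{White})::
            row = Word(printables) + White("\t").suppress() + Word(printables)
            row.parseWithTabs()
            row.parseString("key\tvalue")   # -> ['key', 'value']; without parseWithTabs() the tab is expanded and the match fails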
        T)r\)r�rwrwrx�
parseWithTabs�szParserElement.parseWithTabscCsLt|t�rt|�}t|t�r4||jkrH|jj|�n|jjt|j���|S)a�
        Define expression to be ignored (e.g., comments) while doing pattern
        matching; may be called repeatedly, to define multiple comment or other
        ignorable patterns.
        
        Example::
            patt = OneOrMore(Word(alphas))
            patt.parseString('ablaj /* comment */ lskjd') # -> ['ablaj']
            
            patt.ignore(cStyleComment)
            patt.parseString('ablaj /* comment */ lskjd') # -> ['ablaj', 'lskjd']
        )rzr�r+r]r�r�)r�r�rwrwrx�ignore�s


zParserElement.ignorecCs"|pt|pt|ptf|_d|_|S)zT
        Enable display of debugging messages while doing pattern matching.
        T)r/r2r4rcr^)r�ZstartActionZ
successActionZexceptionActionrwrwrx�setDebugActions
s
zParserElement.setDebugActionscCs|r|jttt�nd|_|S)a�
        Enable display of debugging messages while doing pattern matching.
        Set C{flag} to True to enable, False to disable.

        Example::
            wd = Word(alphas).setName("alphaword")
            integer = Word(nums).setName("numword")
            term = wd | integer
            
            # turn on debugging for wd
            wd.setDebug()

            OneOrMore(term).parseString("abc 123 xyz 890")
        
        prints::
            Match alphaword at loc 0(1,1)
            Matched alphaword -> ['abc']
            Match alphaword at loc 3(1,4)
            Exception raised:Expected alphaword (at char 4), (line:1, col:5)
            Match alphaword at loc 7(1,8)
            Matched alphaword -> ['xyz']
            Match alphaword at loc 11(1,12)
            Exception raised:Expected alphaword (at char 12), (line:1, col:13)
            Match alphaword at loc 15(1,16)
            Exception raised:Expected alphaword (at char 15), (line:1, col:16)

        The output shown is that produced by the default debug actions - custom debug actions can be
        specified using L{setDebugActions}. Prior to attempting
        to match the C{wd} expression, the debugging message C{"Match <exprname> at loc <n>(<line>,<col>)"}
        is shown. Then if the parse succeeds, a C{"Matched"} message is shown, or an C{"Exception raised"}
        message is shown. Also note the use of L{setName} to assign a human-readable name to the expression,
        which makes debugging and exception messages easier to understand - for instance, the default
        name created for the C{Word} expression without calling C{setName} is C{"W:(ABCD...)"}.
        F)r�r/r2r4r^)r��flagrwrwrx�setDebugs#zParserElement.setDebugcCs|jS)N)r�)r�rwrwrxr�@szParserElement.__str__cCst|�S)N)r�)r�rwrwrxr�CszParserElement.__repr__cCsd|_d|_|S)NT)r_rU)r�rwrwrxr�FszParserElement.streamlinecCsdS)Nrw)r�r�rwrwrx�checkRecursionKszParserElement.checkRecursioncCs|jg�dS)zj
        Check defined expressions for valid structure, check for infinite recursive definitions.
        N)r�)r��
validateTracerwrwrx�validateNszParserElement.validatecCs�y|j�}Wn2tk
r>t|d��}|j�}WdQRXYnXy|j||�Stk
r|}ztjrh�n|�WYdd}~XnXdS)z�
        Execute the parse expression on the given file or filename.
        If a filename is specified (instead of a file object),
        the entire file is opened, read, and closed before parsing.
        �rN)�readr��openr�rr$r�)r�Zfile_or_filenamer�Z
file_contents�fr3rwrwrx�	parseFileTszParserElement.parseFilecsHt|t�r"||kp t|�t|�kSt|t�r6|j|�Stt|�|kSdS)N)rzr$�varsr�r��super)r�r�)rHrwrx�__eq__hs



zParserElement.__eq__cCs
||kS)Nrw)r�r�rwrwrx�__ne__pszParserElement.__ne__cCstt|��S)N)�hash�id)r�rwrwrx�__hash__sszParserElement.__hash__cCs||kS)Nrw)r�r�rwrwrx�__req__vszParserElement.__req__cCs
||kS)Nrw)r�r�rwrwrx�__rne__yszParserElement.__rne__cCs0y|jt|�|d�dStk
r*dSXdS)a�
        Method for quick testing of a parser against a test string. Good for simple 
        inline microtests of sub expressions while building up larger parser.
           
        Parameters:
         - testString - to test against this expression for a match
         - parseAll - (default=C{True}) - flag to pass to C{L{parseString}} when running tests
            
        Example::
            expr = Word(nums)
            assert expr.matches("100")
        )r�TFN)r�r�r)r�Z
testStringr�rwrwrxr�|s

zParserElement.matches�#cCs�t|t�r"tttj|j�j���}t|t�r4t|�}g}g}d}	�x�|D�]�}
|dk	rb|j	|
d�sl|rx|
rx|j
|
�qH|
s~qHdj|�|
g}g}y:|
jdd�}
|j
|
|d�}|j
|j|d��|	o�|}	Wn�tk
�rx}
z�t|
t�r�dnd	}d|
k�r0|j
t|
j|
��|j
d
t|
j|
�dd|�n|j
d
|
jd|�|j
d
t|
��|	�ob|}	|
}WYdd}
~
XnDtk
�r�}z&|j
dt|��|	�o�|}	|}WYdd}~XnX|�r�|�r�|j
d	�tdj|��|j
|
|f�qHW|	|fS)a3
        Execute the parse expression on a series of test strings, showing each
        test, the parsed results or where the parse failed. Quick and easy way to
        run a parse expression against a list of sample strings.
           
        Parameters:
         - tests - a list of separate test strings, or a multiline string of test strings
         - parseAll - (default=C{True}) - flag to pass to C{L{parseString}} when running tests           
         - comment - (default=C{'#'}) - expression for indicating embedded comments in the test 
              string; pass None to disable comment filtering
         - fullDump - (default=C{True}) - dump results as list followed by results names in nested outline;
              if False, only dump nested list
         - printResults - (default=C{True}) prints test output to stdout
         - failureTests - (default=C{False}) indicates if these tests are expected to fail parsing

        Returns: a (success, results) tuple, where success indicates that all tests succeeded
        (or failed if C{failureTests} is True), and the results contain a list of lines of each 
        test's output
        
        Example::
            number_expr = pyparsing_common.number.copy()

            result = number_expr.runTests('''
                # unsigned integer
                100
                # negative integer
                -100
                # float with scientific notation
                6.02e23
                # integer with scientific notation
                1e-12
                ''')
            print("Success" if result[0] else "Failed!")

            result = number_expr.runTests('''
                # stray character
                100Z
                # missing leading digit before '.'
                -.100
                # too many '.'
                3.14.159
                ''', failureTests=True)
            print("Success" if result[0] else "Failed!")
        prints::
            # unsigned integer
            100
            [100]

            # negative integer
            -100
            [-100]

            # float with scientific notation
            6.02e23
            [6.02e+23]

            # integer with scientific notation
            1e-12
            [1e-12]

            Success
            
            # stray character
            100Z
               ^
            FAIL: Expected end of text (at char 3), (line:1, col:4)

            # missing leading digit before '.'
            -.100
            ^
            FAIL: Expected {real number with scientific notation | real number | signed integer} (at char 0), (line:1, col:1)

            # too many '.'
            3.14.159
                ^
            FAIL: Expected end of text (at char 4), (line:1, col:5)

            Success

        Each test string must be on a single line. If you want to test a string that spans multiple
        lines, create a test like this::

            expr.runTests(r"this is a test\n of strings that span \n 3 lines")
        
        (Note that this is a raw string literal, you must include the leading 'r'.)
        TNFrz\n)r�)r z(FATAL)r�� rr�^zFAIL: zFAIL-EXCEPTION: )rzr�r�rvr{r��rstrip�
splitlinesrr�r�r�r�r�rrr!rGr�r9rKr,)r�Ztestsr�ZcommentZfullDumpZprintResultsZfailureTestsZ
allResultsZcomments�successrvr�resultr�rzr3rwrwrx�runTests�sNW



$


zParserElement.runTests)F)F)T)T)TT)TT)r�)F)N)T)F)T)Tr�TTF)Or�r�r�r�rNr��staticmethodrPrRr�r�rirmrur�rxr~rr�r�r�r�r�r�r�r�r�r�r�r�rr�r�r�rtr�r�r�r��_MAX_INTr�r�r�r�rrr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r��
__classcell__rwrw)rHrxr$8s�


&




H
"
2G+D
			

)

cs eZdZdZ�fdd�Z�ZS)r,zT
    Abstract C{ParserElement} subclass, for defining atomic matching patterns.
    cstt|�jdd�dS)NF)rg)r�r,r�)r�)rHrwrxr�	szToken.__init__)r�r�r�r�r�r�rwrw)rHrxr,	scs eZdZdZ�fdd�Z�ZS)r
z,
    An empty token, will always match.
    cs$tt|�j�d|_d|_d|_dS)Nr
TF)r�r
r�r�r[r`)r�)rHrwrxr�	szEmpty.__init__)r�r�r�r�r�r�rwrw)rHrxr
	scs*eZdZdZ�fdd�Zddd�Z�ZS)rz(
    A token that will never match.
    cs*tt|�j�d|_d|_d|_d|_dS)NrTFzUnmatchable token)r�rr�r�r[r`ra)r�)rHrwrxr�*	s
zNoMatch.__init__TcCst|||j|��dS)N)rra)r�r-r�rorwrwrxr�1	szNoMatch.parseImpl)T)r�r�r�r�r�r�r�rwrw)rHrxr&	scs*eZdZdZ�fdd�Zddd�Z�ZS)ra�
    Token to exactly match a specified string.
    
    Example::
        Literal('blah').parseString('blah')  # -> ['blah']
        Literal('blah').parseString('blahfooblah')  # -> ['blah']
        Literal('blah').parseString('bla')  # -> Exception: Expected "blah"
    
    For case-insensitive matching, use L{CaselessLiteral}.
    
    For keyword matching (force word break before and after the matched string),
    use L{Keyword} or L{CaselessKeyword}.
    cs�tt|�j�||_t|�|_y|d|_Wn*tk
rVtj	dt
dd�t|_YnXdt
|j�|_d|j|_d|_d|_dS)Nrz2null string passed to Literal; use Empty() insteadrq)r�z"%s"z	Expected F)r�rr��matchr��matchLen�firstMatchCharr�r�r�r�r
rHr�r�rar[r`)r��matchString)rHrwrxr�C	s

zLiteral.__init__TcCsJ|||jkr6|jdks&|j|j|�r6||j|jfSt|||j|��dS)Nrr)r�r��
startswithr�rra)r�r-r�rorwrwrxr�V	szLiteral.parseImpl)T)r�r�r�r�r�r�r�rwrw)rHrxr5	s
csLeZdZdZedZd�fdd�	Zddd	�Z�fd
d�Ze	dd
��Z
�ZS)ra\
    Token to exactly match a specified string as a keyword, that is, it must be
    immediately followed by a non-keyword character.  Compare with C{L{Literal}}:
     - C{Literal("if")} will match the leading C{'if'} in C{'ifAndOnlyIf'}.
     - C{Keyword("if")} will not; it will only match the leading C{'if'} in C{'if x=1'}, or C{'if(y==2)'}
    Accepts two optional constructor arguments in addition to the keyword string:
     - C{identChars} is a string of characters that would be valid identifier characters,
          defaulting to all alphanumerics + "_" and "$"
     - C{caseless} allows case-insensitive matching, default is C{False}.
       
    Example::
        Keyword("start").parseString("start")  # -> ['start']
        Keyword("start").parseString("starting")  # -> Exception

    For case-insensitive matching, use L{CaselessKeyword}.
    z_$NFcs�tt|�j�|dkrtj}||_t|�|_y|d|_Wn$tk
r^t	j
dtdd�YnXd|j|_d|j|_
d|_d|_||_|r�|j�|_|j�}t|�|_dS)Nrz2null string passed to Keyword; use Empty() insteadrq)r�z"%s"z	Expected F)r�rr��DEFAULT_KEYWORD_CHARSr�r�r�r�r�r�r�r�r�rar[r`�caseless�upper�
caselessmatchr��
identChars)r�r�r�r�)rHrwrxr�q	s&

zKeyword.__init__TcCs|jr|||||j�j�|jkr�|t|�|jksL|||jj�|jkr�|dksj||dj�|jkr�||j|jfSnv|||jkr�|jdks�|j|j|�r�|t|�|jks�|||j|jkr�|dks�||d|jkr�||j|jfSt	|||j
|��dS)Nrrr)r�r�r�r�r�r�r�r�r�rra)r�r-r�rorwrwrxr��	s*&zKeyword.parseImplcstt|�j�}tj|_|S)N)r�rr�r�r�)r�r�)rHrwrxr��	szKeyword.copycCs
|t_dS)z,Overrides the default Keyword chars
        N)rr�)rOrwrwrx�setDefaultKeywordChars�	szKeyword.setDefaultKeywordChars)NF)T)r�r�r�r�r3r�r�r�r�r�r�r�rwrw)rHrxr^	s
cs*eZdZdZ�fdd�Zddd�Z�ZS)ral
    Token to match a specified string, ignoring case of letters.
    Note: the matched results will always be in the case of the given
    match string, NOT the case of the input text.

    Example::
        OneOrMore(CaselessLiteral("CMD")).parseString("cmd CMD Cmd10") # -> ['CMD', 'CMD', 'CMD']
        
    (Contrast with example for L{CaselessKeyword}.)
    cs6tt|�j|j��||_d|j|_d|j|_dS)Nz'%s'z	Expected )r�rr�r��returnStringr�ra)r�r�)rHrwrxr��	szCaselessLiteral.__init__TcCs@||||j�j�|jkr,||j|jfSt|||j|��dS)N)r�r�r�r�rra)r�r-r�rorwrwrxr��	szCaselessLiteral.parseImpl)T)r�r�r�r�r�r�r�rwrw)rHrxr�	s
cs,eZdZdZd�fdd�	Zd	dd�Z�ZS)
rz�
    Caseless version of L{Keyword}.

    Example::
        OneOrMore(CaselessKeyword("CMD")).parseString("cmd CMD Cmd10") # -> ['CMD', 'CMD']
        
    (Contrast with example for L{CaselessLiteral}.)
    Ncstt|�j||dd�dS)NT)r�)r�rr�)r�r�r�)rHrwrxr��	szCaselessKeyword.__init__TcCsj||||j�j�|jkrV|t|�|jksF|||jj�|jkrV||j|jfSt|||j|��dS)N)r�r�r�r�r�r�rra)r�r-r�rorwrwrxr��	s*zCaselessKeyword.parseImpl)N)T)r�r�r�r�r�r�r�rwrw)rHrxr�	scs,eZdZdZd�fdd�	Zd	dd�Z�ZS)
rlax
    A variation on L{Literal} which matches "close" matches, that is, 
    strings with at most 'n' mismatching characters. C{CloseMatch} takes parameters:
     - C{match_string} - string to be matched
     - C{maxMismatches} - (C{default=1}) maximum number of mismatches allowed to count as a match
    
    The results from a successful parse will contain the matched text from the input string and the following named results:
     - C{mismatches} - a list of the positions within the match_string where mismatches were found
     - C{original} - the original match_string used to compare against the input string
    
    If C{mismatches} is an empty list, then the match was an exact match.
    
    Example::
        patt = CloseMatch("ATCATCGAATGGA")
        patt.parseString("ATCATCGAAXGGA") # -> (['ATCATCGAAXGGA'], {'mismatches': [[9]], 'original': ['ATCATCGAATGGA']})
        patt.parseString("ATCAXCGAAXGGA") # -> Exception: Expected 'ATCATCGAATGGA' (with up to 1 mismatches) (at char 0), (line:1, col:1)

        # exact match
        patt.parseString("ATCATCGAATGGA") # -> (['ATCATCGAATGGA'], {'mismatches': [[]], 'original': ['ATCATCGAATGGA']})

        # close match allowing up to 2 mismatches
        patt = CloseMatch("ATCATCGAATGGA", maxMismatches=2)
        patt.parseString("ATCAXCGAAXGGA") # -> (['ATCAXCGAAXGGA'], {'mismatches': [[4, 9]], 'original': ['ATCATCGAATGGA']})
    rrcsBtt|�j�||_||_||_d|j|jf|_d|_d|_dS)Nz&Expected %r (with up to %d mismatches)F)	r�rlr�r��match_string�
maxMismatchesrar`r[)r�r�r�)rHrwrxr��	szCloseMatch.__init__TcCs�|}t|�}|t|j�}||kr�|j}d}g}	|j}
x�tt|||�|j��D]0\}}|\}}
||
krP|	j|�t|	�|
krPPqPW|d}t|||�g�}|j|d<|	|d<||fSt|||j|��dS)Nrrr�original�
mismatches)	r�r�r�r�r�r�r"rra)r�r-r�ro�startr��maxlocr�Zmatch_stringlocr�r�Zs_m�src�mat�resultsrwrwrxr��	s("

zCloseMatch.parseImpl)rr)T)r�r�r�r�r�r�r�rwrw)rHrxrl�	s	cs8eZdZdZd
�fdd�	Zdd	d
�Z�fdd�Z�ZS)r/a	
    Token for matching words composed of allowed character sets.
    Defined with string containing all allowed initial characters,
    an optional string containing allowed body characters (if omitted,
    defaults to the initial character set), and an optional minimum,
    maximum, and/or exact length.  The default value for C{min} is 1 (a
    minimum value < 1 is not valid); the default values for C{max} and C{exact}
    are 0, meaning no maximum or exact length restriction. An optional
    C{excludeChars} parameter can list characters that might be found in 
    the input C{bodyChars} string; useful to define a word of all printables
    except for one or two characters, for instance.
    
    L{srange} is useful for defining custom character set strings for defining 
    C{Word} expressions, using range notation from regular expression character sets.
    
    A common mistake is to use C{Word} to match a specific literal string, as in 
    C{Word("Address")}. Remember that C{Word} uses the string argument to define
    I{sets} of matchable characters. This expression would match "Add", "AAA",
    "dAred", or any other word made up of the characters 'A', 'd', 'r', 'e', and 's'.
    To match an exact literal string, use L{Literal} or L{Keyword}.

    pyparsing includes helper strings for building Words:
     - L{alphas}
     - L{nums}
     - L{alphanums}
     - L{hexnums}
     - L{alphas8bit} (alphabetic characters in ASCII range 128-255 - accented, tilded, umlauted, etc.)
     - L{punc8bit} (non-alphabetic characters in ASCII range 128-255 - currency, symbols, superscripts, diacriticals, etc.)
     - L{printables} (any non-whitespace character)

    Example::
        # a word composed of digits
        integer = Word(nums) # equivalent to Word("0123456789") or Word(srange("0-9"))
        
        # a word with a leading capital, and zero or more lowercase
        capital_word = Word(alphas.upper(), alphas.lower())

        # hostnames are alphanumeric, with leading alpha, and '-'
        hostname = Word(alphas, alphanums+'-')
        
        # roman numeral (not a strict parser, accepts invalid mix of characters)
        roman = Word("IVXLCDM")
        
        # any string of non-whitespace characters, except for ','
        csv_value = Word(printables, excludeChars=",")
    NrrrFcs�tt|�j��rFdj�fdd�|D��}|rFdj�fdd�|D��}||_t|�|_|rl||_t|�|_n||_t|�|_|dk|_	|dkr�t
d��||_|dkr�||_nt
|_|dkr�||_||_t|�|_d|j|_d	|_||_d
|j|jk�r�|dk�r�|dk�r�|dk�r�|j|jk�r8dt|j�|_nHt|j�dk�rfdtj|j�t|j�f|_nd
t|j�t|j�f|_|j�r�d|jd|_ytj|j�|_Wntk
�r�d|_YnXdS)Nr�c3s|]}|�kr|VqdS)Nrw)r�r�)�excludeCharsrwrxr�7
sz Word.__init__.<locals>.<genexpr>c3s|]}|�kr|VqdS)Nrw)r�r�)r�rwrxr�9
srrrzZcannot specify a minimum length < 1; use Optional(Word()) if zero-length word is permittedz	Expected Fr�z[%s]+z%s[%s]*z	[%s][%s]*z\b)r�r/r�r��
initCharsOrigr��	initChars�
bodyCharsOrig�	bodyChars�maxSpecifiedr��minLen�maxLenr�r�r�rar`�	asKeyword�_escapeRegexRangeChars�reStringr�rd�escape�compilerK)r�rr�min�max�exactrr�)rH)r�rxr�4
sT



0
z
Word.__init__Tc
CsD|jr<|jj||�}|s(t|||j|��|j�}||j�fS|||jkrZt|||j|��|}|d7}t|�}|j}||j	}t
||�}x ||kr�|||kr�|d7}q�Wd}	|||jkr�d}	|jr�||kr�|||kr�d}	|j
�r|dk�r||d|k�s||k�r|||k�rd}	|	�r4t|||j|��||||�fS)NrrFTr)rdr�rra�end�grouprr�rrrrrr)
r�r-r�ror�r�r�Z	bodycharsr�ZthrowExceptionrwrwrxr�j
s6

4zWord.parseImplcstytt|�j�Stk
r"YnX|jdkrndd�}|j|jkr^d||j�||j�f|_nd||j�|_|jS)NcSs$t|�dkr|dd�dS|SdS)N�z...)r�)r�rwrwrx�
charsAsStr�
sz Word.__str__.<locals>.charsAsStrz	W:(%s,%s)zW:(%s))r�r/r�rKrUr�r)r�r)rHrwrxr��
s
zWord.__str__)NrrrrFN)T)r�r�r�r�r�r�r�r�rwrw)rHrxr/
s.6
#csFeZdZdZeejd��Zd�fdd�	Zddd�Z	�fd	d
�Z
�ZS)
r'a�
    Token for matching strings that match a given regular expression.
    Defined with string specifying the regular expression in a form recognized by the inbuilt Python re module.
    If the given regex contains named groups (defined using C{(?P<name>...)}), these will be preserved as 
    named parse results.

    Example::
        realnum = Regex(r"[+-]?\d+\.\d*")
        date = Regex(r'(?P<year>\d{4})-(?P<month>\d\d?)-(?P<day>\d\d?)')
        # ref: http://stackoverflow.com/questions/267399/how-do-you-match-only-valid-roman-numerals-with-a-regular-expression
        roman = Regex(r"M{0,4}(CM|CD|D?C{0,3})(XC|XL|L?X{0,3})(IX|IV|V?I{0,3})")
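
        # named groups become named parse results (illustrative)
        result = date.parseString("1999-12-31")
        print(result.year, result.month, result.day)  # -> 1999 12 31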
    z[A-Z]rcs�tt|�j�t|t�r�|s,tjdtdd�||_||_	yt
j|j|j	�|_
|j|_Wq�t
jk
r�tjd|tdd��Yq�Xn2t|tj�r�||_
t|�|_|_||_	ntd��t|�|_d|j|_d|_d|_d	S)
z�The parameters C{pattern} and C{flags} are passed to the C{re.compile()} function as-is. See the Python C{re} module for an explanation of the acceptable patterns and flags.z0null string passed to Regex; use Empty() insteadrq)r�z$invalid pattern (%s) passed to RegexzCRegex may only be constructed with a string or a compiled RE objectz	Expected FTN)r�r'r�rzr�r�r�r��pattern�flagsrdr
r�
sre_constants�error�compiledREtyper{r�r�r�rar`r[)r�rr)rHrwrxr��
s.





zRegex.__init__TcCsd|jj||�}|s"t|||j|��|j�}|j�}t|j��}|r\x|D]}||||<qHW||fS)N)rdr�rrar�	groupdictr"r)r�r-r�ror��dr�r�rwrwrxr��
s
zRegex.parseImplcsDytt|�j�Stk
r"YnX|jdkr>dt|j�|_|jS)NzRe:(%s))r�r'r�rKrUr�r)r�)rHrwrxr��
s
z
Regex.__str__)r)T)r�r�r�r�r�rdr
rr�r�r�r�rwrw)rHrxr'�
s
"

cs8eZdZdZd�fdd�	Zddd�Z�fd	d
�Z�ZS)
r%a�
    Token for matching strings that are delimited by quoting characters.
    
    Defined with the following parameters:
        - quoteChar - string of one or more characters defining the quote delimiting string
        - escChar - character to escape quotes, typically backslash (default=C{None})
        - escQuote - special quote sequence to escape an embedded quote string (such as SQL's "" to escape an embedded ") (default=C{None})
        - multiline - boolean indicating whether quotes can span multiple lines (default=C{False})
        - unquoteResults - boolean indicating whether the matched text should be unquoted (default=C{True})
        - endQuoteChar - string of one or more characters defining the end of the quote delimited string (default=C{None} => same as quoteChar)
        - convertWhitespaceEscapes - convert escaped whitespace (C{'\t'}, C{'\n'}, etc.) to actual whitespace (default=C{True})

    Example::
        qs = QuotedString('"')
        print(qs.searchString('lsjdf "This is the quote" sldjf'))
        complex_qs = QuotedString('{{', endQuoteChar='}}')
        print(complex_qs.searchString('lsjdf {{This is the "quote"}} sldjf'))
        sql_qs = QuotedString('"', escQuote='""')
        print(sql_qs.searchString('lsjdf "This is the quote with ""embedded"" quotes" sldjf'))
    prints::
        [['This is the quote']]
        [['This is the "quote"']]
        [['This is the quote with "embedded" quotes']]
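
    A further example (illustrative), using a backslash escape character::
        esc_qs = QuotedString('"', escChar='\\')
        print(esc_qs.searchString(r'lsjdf "an \"embedded\" quote" sldjf'))
    prints::
        [['an "embedded" quote']]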
    NFTcsNtt��j�|j�}|s0tjdtdd�t��|dkr>|}n"|j�}|s`tjdtdd�t��|�_t	|��_
|d�_|�_t	|��_
|�_|�_|�_|�_|r�tjtjB�_dtj�j�t�jd�|dk	r�t|�p�df�_n<d�_dtj�j�t�jd�|dk	�rt|��pdf�_t	�j�d	k�rp�jd
dj�fdd
�tt	�j�d	dd�D��d7_|�r��jdtj|�7_|�r��jdtj|�7_tj�j�d�_�jdtj�j�7_ytj�j�j��_�j�_Wn0tjk
�r&tjd�jtdd��YnXt ���_!d�j!�_"d�_#d�_$dS)Nz$quoteChar cannot be the empty stringrq)r�z'endQuoteChar cannot be the empty stringrz%s(?:[^%s%s]r�z%s(?:[^%s\n\r%s]rrz|(?:z)|(?:c3s4|],}dtj�jd|��t�j|�fVqdS)z%s[^%s]N)rdr	�endQuoteCharr)r�r�)r�rwrxr�/sz(QuotedString.__init__.<locals>.<genexpr>�)z|(?:%s)z|(?:%s.)z(.)z)*%sz$invalid pattern (%s) passed to Regexz	Expected FTrs)%r�r%r�r�r�r�r��SyntaxError�	quoteCharr��quoteCharLen�firstQuoteCharr�endQuoteCharLen�escChar�escQuote�unquoteResults�convertWhitespaceEscapesrd�	MULTILINE�DOTALLrr	rrr�r��escCharReplacePatternr
rrrr�r�rar`r[)r�rr r!Z	multiliner"rr#)rH)r�rxr�sf




6

zQuotedString.__init__c	Cs�|||jkr|jj||�pd}|s4t|||j|��|j�}|j�}|jr�||j|j	�}t
|t�r�d|kr�|jr�ddddd�}x |j
�D]\}}|j||�}q�W|jr�tj|jd|�}|jr�|j|j|j�}||fS)N�\�	r��
)z\tz\nz\fz\rz\g<1>)rrdr�rrarrr"rrrzr�r#r�r�r r�r&r!r)	r�r-r�ror�r�Zws_mapZwslitZwscharrwrwrxr�Gs( 
zQuotedString.parseImplcsFytt|�j�Stk
r"YnX|jdkr@d|j|jf|_|jS)Nz.quoted string, starting with %s ending with %s)r�r%r�rKrUrr)r�)rHrwrxr�js
zQuotedString.__str__)NNFTNT)T)r�r�r�r�r�r�r�r�rwrw)rHrxr%�
sA
#cs8eZdZdZd�fdd�	Zddd�Z�fd	d
�Z�ZS)
r	a�
    Token for matching words composed of characters I{not} in a given set (will
    include whitespace in matched characters if not listed in the provided exclusion set - see example).
    Defined with string containing all disallowed characters, and an optional
    minimum, maximum, and/or exact length.  The default value for C{min} is 1 (a
    minimum value < 1 is not valid); the default values for C{max} and C{exact}
    are 0, meaning no maximum or exact length restriction.

    Example::
        # define a comma-separated-value as anything that is not a ','
        csv_value = CharsNotIn(',')
        print(delimitedList(csv_value).parseString("dkls,lsdkjf,s12 34,@!#,213"))
    prints::
        ['dkls', 'lsdkjf', 's12 34', '@!#', '213']
    rrrcs�tt|�j�d|_||_|dkr*td��||_|dkr@||_nt|_|dkrZ||_||_t	|�|_
d|j
|_|jdk|_d|_
dS)NFrrzfcannot specify a minimum length < 1; use Optional(CharsNotIn()) if zero-length char group is permittedrz	Expected )r�r	r�rX�notCharsr�rrr�r�r�rar[r`)r�r+rrr
)rHrwrxr��s 
zCharsNotIn.__init__TcCs�|||jkrt|||j|��|}|d7}|j}t||jt|��}x ||krd|||krd|d7}qFW|||jkr�t|||j|��||||�fS)Nrr)r+rrarrr�r)r�r-r�ror�Znotchars�maxlenrwrwrxr��s
zCharsNotIn.parseImplcsdytt|�j�Stk
r"YnX|jdkr^t|j�dkrRd|jdd�|_nd|j|_|jS)Nrz
!W:(%s...)z!W:(%s))r�r	r�rKrUr�r+)r�)rHrwrxr��s
zCharsNotIn.__str__)rrrr)T)r�r�r�r�r�r�r�r�rwrw)rHrxr	vs
cs<eZdZdZdddddd�Zd�fdd�	Zddd�Z�ZS)r.a�
    Special matching class for matching whitespace.  Normally, whitespace is ignored
    by pyparsing grammars.  This class is included when some whitespace structures
    are significant.  Define with a string containing the whitespace characters to be
    matched; default is C{" \t\r\n"}.  Also takes optional C{min}, C{max}, and C{exact} arguments,
    as defined for the C{L{Word}} class.
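
    Example (illustrative)::
        # leading whitespace is significant here, so match it explicitly
        indented_line = White(" \t") + Word(alphas)
        print(indented_line.parseString("   hello"))
    prints::
        ['   ', 'hello']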
    z<SPC>z<TAB>z<LF>z<CR>z<FF>)r�r(rr*r)� 	
rrrcs�tt��j�|�_�jdj�fdd��jD���djdd��jD���_d�_d�j�_	|�_
|dkrt|�_nt�_|dkr�|�_|�_
dS)Nr�c3s|]}|�jkr|VqdS)N)�
matchWhite)r�r�)r�rwrxr��sz!White.__init__.<locals>.<genexpr>css|]}tj|VqdS)N)r.�	whiteStrs)r�r�rwrwrxr��sTz	Expected r)
r�r.r�r.r�r�rYr�r[rarrr�)r�Zwsrrr
)rH)r�rxr��s zWhite.__init__TcCs�|||jkrt|||j|��|}|d7}||j}t|t|��}x"||krd|||jkrd|d7}qDW|||jkr�t|||j|��||||�fS)Nrr)r.rrarrr�r)r�r-r�ror�r�rwrwrxr��s
zWhite.parseImpl)r-rrrr)T)r�r�r�r�r/r�r�r�rwrw)rHrxr.�scseZdZ�fdd�Z�ZS)�_PositionTokencs(tt|�j�|jj|_d|_d|_dS)NTF)r�r0r�rHr�r�r[r`)r�)rHrwrxr��s
z_PositionToken.__init__)r�r�r�r�r�rwrw)rHrxr0�sr0cs2eZdZdZ�fdd�Zdd�Zd	dd�Z�ZS)
rzb
    Token to advance to a specific column of input text; useful for tabular report scraping.
    cstt|�j�||_dS)N)r�rr�r9)r��colno)rHrwrxr��szGoToColumn.__init__cCs`t||�|jkr\t|�}|jr*|j||�}x0||krZ||j�rZt||�|jkrZ|d7}q,W|S)Nrr)r9r�r]r��isspace)r�r-r�r�rwrwrxr��s&zGoToColumn.preParseTcCsDt||�}||jkr"t||d|��||j|}|||�}||fS)NzText not in expected column)r9r)r�r-r�roZthiscolZnewlocr�rwrwrxr�s

zGoToColumn.parseImpl)T)r�r�r�r�r�r�r�r�rwrw)rHrxr�s	cs*eZdZdZ�fdd�Zddd�Z�ZS)ra�
    Matches if current position is at the beginning of a line within the parse string
    
    Example::
    
        test = '''        AAA this line
        AAA and this line
          AAA but not this one
        B AAA and definitely not this one
        '''

        for t in (LineStart() + 'AAA' + restOfLine).searchString(test):
            print(t)
    
    Prints::
        ['AAA', ' this line']
        ['AAA', ' and this line']    

    cstt|�j�d|_dS)NzExpected start of line)r�rr�ra)r�)rHrwrxr�&szLineStart.__init__TcCs*t||�dkr|gfSt|||j|��dS)Nrr)r9rra)r�r-r�rorwrwrxr�*szLineStart.parseImpl)T)r�r�r�r�r�r�r�rwrw)rHrxrscs*eZdZdZ�fdd�Zddd�Z�ZS)rzU
    Matches if current position is at the end of a line within the parse string
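
    Example (illustrative)::
        # require that nothing else follows the word on its line
        line = Word(alphas) + LineEnd()
        print(line.parseString("hello\n"))
    prints::
        ['hello', '\n']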
    cs,tt|�j�|jtjjdd��d|_dS)Nrr�zExpected end of line)r�rr�r�r$rNr�ra)r�)rHrwrxr�3szLineEnd.__init__TcCsb|t|�kr6||dkr$|ddfSt|||j|��n(|t|�krN|dgfSt|||j|��dS)Nrrr)r�rra)r�r-r�rorwrwrxr�8szLineEnd.parseImpl)T)r�r�r�r�r�r�r�rwrw)rHrxr/scs*eZdZdZ�fdd�Zddd�Z�ZS)r*zM
    Matches if current position is at the beginning of the parse string
    cstt|�j�d|_dS)NzExpected start of text)r�r*r�ra)r�)rHrwrxr�GszStringStart.__init__TcCs0|dkr(||j|d�kr(t|||j|��|gfS)Nr)r�rra)r�r-r�rorwrwrxr�KszStringStart.parseImpl)T)r�r�r�r�r�r�r�rwrw)rHrxr*Cscs*eZdZdZ�fdd�Zddd�Z�ZS)r)zG
    Matches if current position is at the end of the parse string
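
    Example (illustrative)::
        # roughly what parseString(..., parseAll=True) adds internally
        whole = Word(alphas) + StringEnd()
        whole.parseString("hello")      # succeeds
        whole.parseString("hello !")    # raises ParseException: Expected end of text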
    cstt|�j�d|_dS)NzExpected end of text)r�r)r�ra)r�)rHrwrxr�VszStringEnd.__init__TcCs^|t|�krt|||j|��n<|t|�kr6|dgfS|t|�krJ|gfSt|||j|��dS)Nrr)r�rra)r�r-r�rorwrwrxr�ZszStringEnd.parseImpl)T)r�r�r�r�r�r�r�rwrw)rHrxr)Rscs.eZdZdZef�fdd�	Zddd�Z�ZS)r1ap
    Matches if the current position is at the beginning of a Word, and
    is not preceded by any character in a given set of C{wordChars}
    (default=C{printables}). To emulate the C{\b} behavior of regular expressions,
    use C{WordStart(alphanums)}. C{WordStart} will also match at the beginning of
    the string being parsed, or at the beginning of a line.
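
    Example (illustrative)::
        # only match 'key' where it begins a word (not inside 'monkey')
        key = WordStart(alphas) + Literal('key')
        print(key.searchString("key monkey keyboard"))
    prints::
        [['key'], ['key']]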
    cs"tt|�j�t|�|_d|_dS)NzNot at the start of a word)r�r1r�r��	wordCharsra)r�r3)rHrwrxr�ls
zWordStart.__init__TcCs@|dkr8||d|jks(|||jkr8t|||j|��|gfS)Nrrr)r3rra)r�r-r�rorwrwrxr�qs
zWordStart.parseImpl)T)r�r�r�r�rVr�r�r�rwrw)rHrxr1dscs.eZdZdZef�fdd�	Zddd�Z�ZS)r0aZ
    Matches if the current position is at the end of a Word, and
    is not followed by any character in a given set of C{wordChars}
    (default=C{printables}). To emulate the C{\b} behavior of regular expressions,
    use C{WordEnd(alphanums)}. C{WordEnd} will also match at the end of
    the string being parsed, or at the end of a line.
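
    Example (illustrative)::
        # match 'cat' only as a whole word, not as part of 'catalog' or 'bobcat'
        cat = WordStart(alphas) + Literal('cat') + WordEnd(alphas)
        print(cat.searchString("cat catalog bobcat the cat"))
    prints::
        [['cat'], ['cat']]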
    cs(tt|�j�t|�|_d|_d|_dS)NFzNot at the end of a word)r�r0r�r�r3rXra)r�r3)rHrwrxr��s
zWordEnd.__init__TcCsPt|�}|dkrH||krH|||jks8||d|jkrHt|||j|��|gfS)Nrrr)r�r3rra)r�r-r�ror�rwrwrxr��szWordEnd.parseImpl)T)r�r�r�r�rVr�r�r�rwrw)rHrxr0xscs�eZdZdZd�fdd�	Zdd�Zdd�Zd	d
�Z�fdd�Z�fd
d�Z	�fdd�Z
d�fdd�	Zgfdd�Z�fdd�Z
�ZS)r z^
    Abstract subclass of ParserElement, for combining and post-processing parsed tokens.
    Fcs�tt|�j|�t|t�r"t|�}t|t�r<tj|�g|_	njt|t
j�rzt|�}tdd�|D��rnt
tj|�}t|�|_	n,yt|�|_	Wntk
r�|g|_	YnXd|_dS)Ncss|]}t|t�VqdS)N)rzr�)r�r.rwrwrxr��sz+ParseExpression.__init__.<locals>.<genexpr>F)r�r r�rzr�r�r�r$rQ�exprsr��Iterable�allrvr�re)r�r4rg)rHrwrxr��s

zParseExpression.__init__cCs
|j|S)N)r4)r�r�rwrwrxr��szParseExpression.__getitem__cCs|jj|�d|_|S)N)r4r�rU)r�r�rwrwrxr��szParseExpression.appendcCs4d|_dd�|jD�|_x|jD]}|j�q W|S)z~Extends C{leaveWhitespace} defined in base class, and also invokes C{leaveWhitespace} on
           all contained expressions.FcSsg|]}|j��qSrw)r�)r�r�rwrwrxr��sz3ParseExpression.leaveWhitespace.<locals>.<listcomp>)rXr4r�)r�r�rwrwrxr��s
zParseExpression.leaveWhitespacecszt|t�rF||jkrvtt|�j|�xP|jD]}|j|jd�q,Wn0tt|�j|�x|jD]}|j|jd�q^W|S)Nrrrsrs)rzr+r]r�r r�r4)r�r�r�)rHrwrxr��s

zParseExpression.ignorecsLytt|�j�Stk
r"YnX|jdkrFd|jjt|j�f|_|jS)Nz%s:(%s))	r�r r�rKrUrHr�r�r4)r�)rHrwrxr��s
zParseExpression.__str__cs0tt|�j�x|jD]}|j�qWt|j�dk�r|jd}t||j�r�|jr�|jdkr�|j	r�|jdd�|jdg|_d|_
|j|jO_|j|jO_|jd}t||j�o�|jo�|jdko�|j	�r|jdd�|jdd�|_d|_
|j|jO_|j|jO_dt
|�|_|S)Nrqrrrz	Expected rsrs)r�r r�r4r�rzrHrSrVr^rUr[r`r�ra)r�r�r�)rHrwrxr��s0




zParseExpression.streamlinecstt|�j||�}|S)N)r�r rm)r�r�rlr�)rHrwrxrm�szParseExpression.setResultsNamecCs:|dd�|g}x|jD]}|j|�qW|jg�dS)N)r4r�r�)r�r��tmpr�rwrwrxr��szParseExpression.validatecs$tt|�j�}dd�|jD�|_|S)NcSsg|]}|j��qSrw)r�)r�r�rwrwrxr��sz(ParseExpression.copy.<locals>.<listcomp>)r�r r�r4)r�r�)rHrwrxr��szParseExpression.copy)F)F)r�r�r�r�r�r�r�r�r�r�r�rmr�r�r�rwrw)rHrxr �s	
"csTeZdZdZGdd�de�Zd�fdd�	Zddd�Zd	d
�Zdd�Z	d
d�Z
�ZS)ra

    Requires all given C{ParseExpression}s to be found in the given order.
    Expressions may be separated by whitespace.
    May be constructed using the C{'+'} operator.
    May also be constructed using the C{'-'} operator, which will suppress backtracking.

    Example::
        integer = Word(nums)
        name_expr = OneOrMore(Word(alphas))

        expr = And([integer("id"),name_expr("name"),integer("age")])
        # more easily written as:
        expr = integer("id") + name_expr("name") + integer("age")
    cseZdZ�fdd�Z�ZS)zAnd._ErrorStopcs&ttj|�j||�d|_|j�dS)N�-)r�rr�r�r�r�)r�r�r�)rHrwrxr�
szAnd._ErrorStop.__init__)r�r�r�r�r�rwrw)rHrxr�
sr�TcsRtt|�j||�tdd�|jD��|_|j|jdj�|jdj|_d|_	dS)Ncss|]}|jVqdS)N)r[)r�r�rwrwrxr�
szAnd.__init__.<locals>.<genexpr>rT)
r�rr�r6r4r[r�rYrXre)r�r4rg)rHrwrxr�
s
zAnd.__init__c	Cs|jdj|||dd�\}}d}x�|jdd�D]�}t|tj�rFd}q0|r�y|j|||�\}}Wq�tk
rv�Yq�tk
r�}zd|_tj|��WYdd}~Xq�t	k
r�t|t
|�|j|��Yq�Xn|j|||�\}}|s�|j�r0||7}q0W||fS)NrF)rprrT)
r4rtrzrr�r#r�
__traceback__r�r�r�rar�)	r�r-r�ro�
resultlistZ	errorStopr�Z
exprtokensr�rwrwrxr�
s(z
And.parseImplcCst|t�rtj|�}|j|�S)N)rzr�r$rQr�)r�r�rwrwrxr5
s

zAnd.__iadd__cCs8|dd�|g}x |jD]}|j|�|jsPqWdS)N)r4r�r[)r�r��subRecCheckListr�rwrwrxr�:
s

zAnd.checkRecursioncCs@t|d�r|jS|jdkr:ddjdd�|jD��d|_|jS)Nr��{r�css|]}t|�VqdS)N)r�)r�r�rwrwrxr�F
szAnd.__str__.<locals>.<genexpr>�})r�r�rUr�r4)r�rwrwrxr�A
s


 zAnd.__str__)T)T)r�r�r�r�r
r�r�r�rr�r�r�rwrw)rHrxr�s
csDeZdZdZd�fdd�	Zddd�Zdd	�Zd
d�Zdd
�Z�Z	S)ra�
    Requires that at least one C{ParseExpression} is found.
    If two expressions match, the expression that matches the longest string will be used.
    May be constructed using the C{'^'} operator.

    Example::
        # construct Or using '^' operator
        
        number = Word(nums) ^ Combine(Word(nums) + '.' + Word(nums))
        print(number.searchString("123 3.1416 789"))
    prints::
        [['123'], ['3.1416'], ['789']]
    Fcs:tt|�j||�|jr0tdd�|jD��|_nd|_dS)Ncss|]}|jVqdS)N)r[)r�r�rwrwrxr�\
szOr.__init__.<locals>.<genexpr>T)r�rr�r4rr[)r�r4rg)rHrwrxr�Y
szOr.__init__TcCsTd}d}g}x�|jD]�}y|j||�}Wnvtk
rd}	z d|	_|	j|krT|	}|	j}WYdd}	~	Xqtk
r�t|�|kr�t|t|�|j|�}t|�}YqX|j||f�qW|�r*|j	dd�d�x`|D]X\}
}y|j
|||�Stk
�r$}	z"d|	_|	j|k�r|	}|	j}WYdd}	~	Xq�Xq�W|dk	�rB|j|_|�nt||d|��dS)NrrcSs
|dS)Nrrw)�xrwrwrxryu
szOr.parseImpl.<locals>.<lambda>)r�z no defined alternatives to matchrs)r4r�rr9r�r�r�rar��sortrtr�)r�r-r�ro�	maxExcLoc�maxExceptionr�r�Zloc2r��_rwrwrxr�`
s<

zOr.parseImplcCst|t�rtj|�}|j|�S)N)rzr�r$rQr�)r�r�rwrwrx�__ixor__�
s

zOr.__ixor__cCs@t|d�r|jS|jdkr:ddjdd�|jD��d|_|jS)Nr�r<z ^ css|]}t|�VqdS)N)r�)r�r�rwrwrxr��
szOr.__str__.<locals>.<genexpr>r=)r�r�rUr�r4)r�rwrwrxr��
s


 z
Or.__str__cCs0|dd�|g}x|jD]}|j|�qWdS)N)r4r�)r�r�r;r�rwrwrxr��
szOr.checkRecursion)F)T)
r�r�r�r�r�r�rCr�r�r�rwrw)rHrxrK
s

&	csDeZdZdZd�fdd�	Zddd�Zdd	�Zd
d�Zdd
�Z�Z	S)ra�
    Requires that at least one C{ParseExpression} is found.
    If two expressions match, the first one listed is the one that will match.
    May be constructed using the C{'|'} operator.

    Example::
        # construct MatchFirst using '|' operator
        
        # watch the order of expressions to match
        number = Word(nums) | Combine(Word(nums) + '.' + Word(nums))
        print(number.searchString("123 3.1416 789")) #  Fail! -> [['123'], ['3'], ['1416'], ['789']]

        # put more selective expression first
        number = Combine(Word(nums) + '.' + Word(nums)) | Word(nums)
        print(number.searchString("123 3.1416 789")) #  Better -> [['123'], ['3.1416'], ['789']]
    Fcs:tt|�j||�|jr0tdd�|jD��|_nd|_dS)Ncss|]}|jVqdS)N)r[)r�r�rwrwrxr��
sz&MatchFirst.__init__.<locals>.<genexpr>T)r�rr�r4rr[)r�r4rg)rHrwrxr��
szMatchFirst.__init__Tc	Cs�d}d}x�|jD]�}y|j|||�}|Stk
r\}z|j|krL|}|j}WYdd}~Xqtk
r�t|�|kr�t|t|�|j|�}t|�}YqXqW|dk	r�|j|_|�nt||d|��dS)Nrrz no defined alternatives to matchrs)r4rtrr�r�r�rar�)	r�r-r�ror@rAr�r�r�rwrwrxr��
s$
zMatchFirst.parseImplcCst|t�rtj|�}|j|�S)N)rzr�r$rQr�)r�r�rwrwrx�__ior__�
s

zMatchFirst.__ior__cCs@t|d�r|jS|jdkr:ddjdd�|jD��d|_|jS)Nr�r<z | css|]}t|�VqdS)N)r�)r�r�rwrwrxr��
sz%MatchFirst.__str__.<locals>.<genexpr>r=)r�r�rUr�r4)r�rwrwrxr��
s


 zMatchFirst.__str__cCs0|dd�|g}x|jD]}|j|�qWdS)N)r4r�)r�r�r;r�rwrwrxr��
szMatchFirst.checkRecursion)F)T)
r�r�r�r�r�r�rDr�r�r�rwrw)rHrxr�
s
	cs<eZdZdZd�fdd�	Zddd�Zdd�Zd	d
�Z�ZS)
ram
    Requires all given C{ParseExpression}s to be found, but in any order.
    Expressions may be separated by whitespace.
    May be constructed using the C{'&'} operator.

    Example::
        color = oneOf("RED ORANGE YELLOW GREEN BLUE PURPLE BLACK WHITE BROWN")
        shape_type = oneOf("SQUARE CIRCLE TRIANGLE STAR HEXAGON OCTAGON")
        integer = Word(nums)
        shape_attr = "shape:" + shape_type("shape")
        posn_attr = "posn:" + Group(integer("x") + ',' + integer("y"))("posn")
        color_attr = "color:" + color("color")
        size_attr = "size:" + integer("size")

        # use Each (using operator '&') to accept attributes in any order 
        # (shape and posn are required, color and size are optional)
        shape_spec = shape_attr & posn_attr & Optional(color_attr) & Optional(size_attr)

        shape_spec.runTests('''
            shape: SQUARE color: BLACK posn: 100, 120
            shape: CIRCLE size: 50 color: BLUE posn: 50,80
            color:GREEN size:20 shape:TRIANGLE posn:20,40
            '''
            )
    prints::
        shape: SQUARE color: BLACK posn: 100, 120
        ['shape:', 'SQUARE', 'color:', 'BLACK', 'posn:', ['100', ',', '120']]
        - color: BLACK
        - posn: ['100', ',', '120']
          - x: 100
          - y: 120
        - shape: SQUARE


        shape: CIRCLE size: 50 color: BLUE posn: 50,80
        ['shape:', 'CIRCLE', 'size:', '50', 'color:', 'BLUE', 'posn:', ['50', ',', '80']]
        - color: BLUE
        - posn: ['50', ',', '80']
          - x: 50
          - y: 80
        - shape: CIRCLE
        - size: 50


        color: GREEN size: 20 shape: TRIANGLE posn: 20,40
        ['color:', 'GREEN', 'size:', '20', 'shape:', 'TRIANGLE', 'posn:', ['20', ',', '40']]
        - color: GREEN
        - posn: ['20', ',', '40']
          - x: 20
          - y: 40
        - shape: TRIANGLE
        - size: 20
    Tcs8tt|�j||�tdd�|jD��|_d|_d|_dS)Ncss|]}|jVqdS)N)r[)r�r�rwrwrxr�sz Each.__init__.<locals>.<genexpr>T)r�rr�r6r4r[rX�initExprGroups)r�r4rg)rHrwrxr�sz
Each.__init__c
s�|jr�tdd�|jD��|_dd�|jD�}dd�|jD�}|||_dd�|jD�|_dd�|jD�|_dd�|jD�|_|j|j7_d	|_|}|jdd�}|jdd��g}d
}	x�|	�rp|�|j|j}
g}x~|
D]v}y|j||�}Wn t	k
�r|j
|�Yq�X|j
|jjt|�|��||k�rD|j
|�q�|�kr�j
|�q�Wt|�t|
�kr�d	}	q�W|�r�djdd�|D��}
t	||d
|
��|�fdd�|jD�7}g}x*|D]"}|j|||�\}}|j
|��q�Wt|tg��}||fS)Ncss&|]}t|t�rt|j�|fVqdS)N)rzrr�r.)r�r�rwrwrxr�sz!Each.parseImpl.<locals>.<genexpr>cSsg|]}t|t�r|j�qSrw)rzrr.)r�r�rwrwrxr�sz"Each.parseImpl.<locals>.<listcomp>cSs"g|]}|jrt|t�r|�qSrw)r[rzr)r�r�rwrwrxr�scSsg|]}t|t�r|j�qSrw)rzr2r.)r�r�rwrwrxr� scSsg|]}t|t�r|j�qSrw)rzrr.)r�r�rwrwrxr�!scSs g|]}t|tttf�s|�qSrw)rzrr2r)r�r�rwrwrxr�"sFTz, css|]}t|�VqdS)N)r�)r�r�rwrwrxr�=sz*Missing one or more required elements (%s)cs$g|]}t|t�r|j�kr|�qSrw)rzrr.)r�r�)�tmpOptrwrxr�As)rEr�r4Zopt1mapZ	optionalsZmultioptionalsZ
multirequiredZrequiredr�rr�r�r��remover�r�rt�sumr")r�r-r�roZopt1Zopt2ZtmpLocZtmpReqdZ
matchOrderZkeepMatchingZtmpExprsZfailedr�Zmissingr:r�ZfinalResultsrw)rFrxr�sP



zEach.parseImplcCs@t|d�r|jS|jdkr:ddjdd�|jD��d|_|jS)Nr�r<z & css|]}t|�VqdS)N)r�)r�r�rwrwrxr�PszEach.__str__.<locals>.<genexpr>r=)r�r�rUr�r4)r�rwrwrxr�Ks


 zEach.__str__cCs0|dd�|g}x|jD]}|j|�qWdS)N)r4r�)r�r�r;r�rwrwrxr�TszEach.checkRecursion)T)T)	r�r�r�r�r�r�r�r�r�rwrw)rHrxr�
s
5
1	csleZdZdZd�fdd�	Zddd�Zdd	�Z�fd
d�Z�fdd
�Zdd�Z	gfdd�Z
�fdd�Z�ZS)rza
    Abstract subclass of C{ParserElement}, for combining and post-processing parsed tokens.
    Fcs�tt|�j|�t|t�r@ttjt�r2tj|�}ntjt	|��}||_
d|_|dk	r�|j|_|j
|_
|j|j�|j|_|j|_|j|_|jj|j�dS)N)r�rr�rzr��
issubclassr$rQr,rr.rUr`r[r�rYrXrWrer]r�)r�r.rg)rHrwrxr�^s
zParseElementEnhance.__init__TcCs2|jdk	r|jj|||dd�Std||j|��dS)NF)rpr�)r.rtrra)r�r-r�rorwrwrxr�ps
zParseElementEnhance.parseImplcCs*d|_|jj�|_|jdk	r&|jj�|S)NF)rXr.r�r�)r�rwrwrxr�vs


z#ParseElementEnhance.leaveWhitespacecsrt|t�rB||jkrntt|�j|�|jdk	rn|jj|jd�n,tt|�j|�|jdk	rn|jj|jd�|S)Nrrrsrs)rzr+r]r�rr�r.)r�r�)rHrwrxr�}s



zParseElementEnhance.ignorecs&tt|�j�|jdk	r"|jj�|S)N)r�rr�r.)r�)rHrwrxr��s

zParseElementEnhance.streamlinecCsB||krt||g��|dd�|g}|jdk	r>|jj|�dS)N)r&r.r�)r�r�r;rwrwrxr��s

z"ParseElementEnhance.checkRecursioncCs6|dd�|g}|jdk	r(|jj|�|jg�dS)N)r.r�r�)r�r�r7rwrwrxr��s
zParseElementEnhance.validatecsVytt|�j�Stk
r"YnX|jdkrP|jdk	rPd|jjt|j�f|_|jS)Nz%s:(%s))	r�rr�rKrUr.rHr�r�)r�)rHrwrxr��szParseElementEnhance.__str__)F)T)
r�r�r�r�r�r�r�r�r�r�r�r�r�rwrw)rHrxrZs
cs*eZdZdZ�fdd�Zddd�Z�ZS)ra�
    Lookahead matching of the given parse expression.  C{FollowedBy}
    does I{not} advance the parsing position within the input string, it only
    verifies that the specified parse expression matches at the current
    position.  C{FollowedBy} always returns a null token list.

    Example::
        # use FollowedBy to match a label only if it is followed by a ':'
        data_word = Word(alphas)
        label = data_word + FollowedBy(':')
        attr_expr = Group(label + Suppress(':') + OneOrMore(data_word, stopOn=label).setParseAction(' '.join))
        
        OneOrMore(attr_expr).parseString("shape: SQUARE color: BLACK posn: upper left").pprint()
    prints::
        [['shape', 'SQUARE'], ['color', 'BLACK'], ['posn', 'upper left']]
    cstt|�j|�d|_dS)NT)r�rr�r[)r�r.)rHrwrxr��szFollowedBy.__init__TcCs|jj||�|gfS)N)r.r�)r�r-r�rorwrwrxr��szFollowedBy.parseImpl)T)r�r�r�r�r�r�r�rwrw)rHrxr�scs2eZdZdZ�fdd�Zd	dd�Zdd�Z�ZS)
ra�
    Lookahead to disallow matching with the given parse expression.  C{NotAny}
    does I{not} advance the parsing position within the input string, it only
    verifies that the specified parse expression does I{not} match at the current
    position.  Also, C{NotAny} does I{not} skip over leading whitespace. C{NotAny}
    always returns a null token list.  May be constructed using the '~' operator.

    Example::
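        # use NotAny to keep reserved words from matching as identifiers (illustrative)
        keyword = Keyword("if") | Keyword("else") | Keyword("while")
        identifier = ~keyword + Word(alphas, alphanums + "_")
        print(identifier.parseString("total"))   # -> ['total']
        # identifier.parseString("while") raises ParseException: Found unwanted token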
        
    cs0tt|�j|�d|_d|_dt|j�|_dS)NFTzFound unwanted token, )r�rr�rXr[r�r.ra)r�r.)rHrwrxr��szNotAny.__init__TcCs&|jj||�rt|||j|��|gfS)N)r.r�rra)r�r-r�rorwrwrxr��szNotAny.parseImplcCs4t|d�r|jS|jdkr.dt|j�d|_|jS)Nr�z~{r=)r�r�rUr�r.)r�rwrwrxr��s


zNotAny.__str__)T)r�r�r�r�r�r�r�r�rwrw)rHrxr�s

cs(eZdZd�fdd�	Zddd�Z�ZS)	�_MultipleMatchNcsFtt|�j|�d|_|}t|t�r.tj|�}|dk	r<|nd|_dS)NT)	r�rJr�rWrzr�r$rQ�	not_ender)r�r.�stopOnZender)rHrwrxr��s

z_MultipleMatch.__init__TcCs�|jj}|j}|jdk	}|r$|jj}|r2|||�||||dd�\}}yZ|j}	xJ|rb|||�|	rr|||�}
n|}
|||
|�\}}|s�|j�rT||7}qTWWnttfk
r�YnX||fS)NF)rp)	r.rtr�rKr�r]r�rr�)r�r-r�roZself_expr_parseZself_skip_ignorablesZcheck_enderZ
try_not_enderr�ZhasIgnoreExprsr�Z	tmptokensrwrwrxr��s,



z_MultipleMatch.parseImpl)N)T)r�r�r�r�r�r�rwrw)rHrxrJ�srJc@seZdZdZdd�ZdS)ra�
    Repetition of one or more of the given expression.
    
    Parameters:
     - expr - expression that must match one or more times
     - stopOn - (default=C{None}) - expression for a terminating sentinel
          (only required if the sentinel would ordinarily match the repetition 
          expression)          

    Example::
        data_word = Word(alphas)
        label = data_word + FollowedBy(':')
        attr_expr = Group(label + Suppress(':') + OneOrMore(data_word).setParseAction(' '.join))

        text = "shape: SQUARE posn: upper left color: BLACK"
        OneOrMore(attr_expr).parseString(text).pprint()  # Fail! read 'color' as data instead of next label -> [['shape', 'SQUARE color']]

        # use stopOn attribute for OneOrMore to avoid reading label string as part of the data
        attr_expr = Group(label + Suppress(':') + OneOrMore(data_word, stopOn=label).setParseAction(' '.join))
        OneOrMore(attr_expr).parseString(text).pprint() # Better -> [['shape', 'SQUARE'], ['posn', 'upper left'], ['color', 'BLACK']]
        
        # could also be written as
        (attr_expr * (1,)).parseString(text).pprint()
    cCs4t|d�r|jS|jdkr.dt|j�d|_|jS)Nr�r<z}...)r�r�rUr�r.)r�rwrwrxr�!s


zOneOrMore.__str__N)r�r�r�r�r�rwrwrwrxrscs8eZdZdZd
�fdd�	Zd�fdd�	Zdd	�Z�ZS)r2aw
    Optional repetition of zero or more of the given expression.
    
    Parameters:
     - expr - expression that must match zero or more times
     - stopOn - (default=C{None}) - expression for a terminating sentinel
          (only required if the sentinel would ordinarily match the repetition 
          expression)          

    Example: similar to L{OneOrMore}
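
    An illustrative example::
        label = Word(alphas) + Suppress(':')
        numbers = Group(ZeroOrMore(Word(nums)))
        print((label + numbers).parseString("primes: 2 3 5 7"))
        print((label + numbers).parseString("empty:"))
    prints::
        ['primes', ['2', '3', '5', '7']]
        ['empty', []]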
    Ncstt|�j||d�d|_dS)N)rLT)r�r2r�r[)r�r.rL)rHrwrxr�6szZeroOrMore.__init__Tcs6ytt|�j|||�Sttfk
r0|gfSXdS)N)r�r2r�rr�)r�r-r�ro)rHrwrxr�:szZeroOrMore.parseImplcCs4t|d�r|jS|jdkr.dt|j�d|_|jS)Nr�rz]...)r�r�rUr�r.)r�rwrwrxr�@s


zZeroOrMore.__str__)N)T)r�r�r�r�r�r�r�r�rwrw)rHrxr2*sc@s eZdZdd�ZeZdd�ZdS)�
_NullTokencCsdS)NFrw)r�rwrwrxr�Jsz_NullToken.__bool__cCsdS)Nr�rw)r�rwrwrxr�Msz_NullToken.__str__N)r�r�r�r�r'r�rwrwrwrxrMIsrMcs6eZdZdZef�fdd�	Zd	dd�Zdd�Z�ZS)
raa
    Optional matching of the given expression.

    Parameters:
     - expr - expression that may match zero or one time
     - default (optional) - value to be returned if the optional expression is not found.

    Example::
        # US postal code can be a 5-digit zip, plus optional 4-digit qualifier
        zip = Combine(Word(nums, exact=5) + Optional('-' + Word(nums, exact=4)))
        zip.runTests('''
            # traditional ZIP code
            12345
            
            # ZIP+4 form
            12101-0001
            
            # invalid ZIP
            98765-
            ''')
    prints::
        # traditional ZIP code
        12345
        ['12345']

        # ZIP+4 form
        12101-0001
        ['12101-0001']

        # invalid ZIP
        98765-
             ^
        FAIL: Expected end of text (at char 5), (line:1, col:6)
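
    A further example (illustrative), showing the C{default} argument::
        port = Optional(Suppress(':') + Word(nums), default='80')
        print(('localhost' + port).parseString("localhost:8080"))
        print(('localhost' + port).parseString("localhost"))
    prints::
        ['localhost', '8080']
        ['localhost', '80']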
    cs.tt|�j|dd�|jj|_||_d|_dS)NF)rgT)r�rr�r.rWr�r[)r�r.r�)rHrwrxr�ts
zOptional.__init__TcCszy|jj|||dd�\}}WnTttfk
rp|jtk	rh|jjr^t|jg�}|j||jj<ql|jg}ng}YnX||fS)NF)rp)r.rtrr�r��_optionalNotMatchedrVr")r�r-r�ror�rwrwrxr�zs


zOptional.parseImplcCs4t|d�r|jS|jdkr.dt|j�d|_|jS)Nr�rr	)r�r�rUr�r.)r�rwrwrxr��s


zOptional.__str__)T)	r�r�r�r�rNr�r�r�r�rwrw)rHrxrQs"
cs,eZdZdZd	�fdd�	Zd
dd�Z�ZS)r(a�	
    Token for skipping over all undefined text until the matched expression is found.

    Parameters:
     - expr - target expression marking the end of the data to be skipped
     - include - (default=C{False}) if True, the target expression is also parsed 
          (the skipped text and target expression are returned as a 2-element list).
     - ignore - (default=C{None}) used to define grammars (typically quoted strings and 
          comments) that might contain false matches to the target expression
     - failOn - (default=C{None}) define expressions that are not allowed to be 
          included in the skipped text; if found before the target expression is found, 
          the SkipTo is not a match

    Example::
        report = '''
            Outstanding Issues Report - 1 Jan 2000

               # | Severity | Description                               |  Days Open
            -----+----------+-------------------------------------------+-----------
             101 | Critical | Intermittent system crash                 |          6
              94 | Cosmetic | Spelling error on Login ('log|n')         |         14
              79 | Minor    | System slow when running too many reports |         47
            '''
        integer = Word(nums)
        SEP = Suppress('|')
        # use SkipTo to simply match everything up until the next SEP
        # - ignore quoted strings, so that a '|' character inside a quoted string does not match
        # - parse action will call token.strip() for each matched token, i.e., the description body
        string_data = SkipTo(SEP, ignore=quotedString)
        string_data.setParseAction(tokenMap(str.strip))
        ticket_expr = (integer("issue_num") + SEP 
                      + string_data("sev") + SEP 
                      + string_data("desc") + SEP 
                      + integer("days_open"))
        
        for tkt in ticket_expr.searchString(report):
            print tkt.dump()
    prints::
        ['101', 'Critical', 'Intermittent system crash', '6']
        - days_open: 6
        - desc: Intermittent system crash
        - issue_num: 101
        - sev: Critical
        ['94', 'Cosmetic', "Spelling error on Login ('log|n')", '14']
        - days_open: 14
        - desc: Spelling error on Login ('log|n')
        - issue_num: 94
        - sev: Cosmetic
        ['79', 'Minor', 'System slow when running too many reports', '47']
        - days_open: 47
        - desc: System slow when running too many reports
        - issue_num: 79
        - sev: Minor
    FNcs`tt|�j|�||_d|_d|_||_d|_t|t	�rFt
j|�|_n||_dt
|j�|_dS)NTFzNo match found for )r�r(r��
ignoreExprr[r`�includeMatchr�rzr�r$rQ�failOnr�r.ra)r�r��includer�rQ)rHrwrxr��s
zSkipTo.__init__TcCs,|}t|�}|j}|jj}|jdk	r,|jjnd}|jdk	rB|jjnd}	|}
x�|
|kr�|dk	rh|||
�rhP|	dk	r�x*y|	||
�}
Wqrtk
r�PYqrXqrWy|||
ddd�Wn tt	fk
r�|
d7}
YqLXPqLWt|||j
|��|
}|||�}t|�}|j�r$||||dd�\}}
||
7}||fS)NF)rorprr)rp)
r�r.rtrQr�rOr�rrr�rar"rP)r�r-r�ror0r�r.Z
expr_parseZself_failOn_canParseNextZself_ignoreExpr_tryParseZtmplocZskiptextZ
skipresultr�rwrwrxr��s<

zSkipTo.parseImpl)FNN)T)r�r�r�r�r�r�r�rwrw)rHrxr(�s6
csbeZdZdZd�fdd�	Zdd�Zdd�Zd	d
�Zdd�Zgfd
d�Z	dd�Z
�fdd�Z�ZS)raK
    Forward declaration of an expression to be defined later -
    used for recursive grammars, such as algebraic infix notation.
    When the expression is known, it is assigned to the C{Forward} variable using the '<<' operator.

    Note: take care when assigning to C{Forward} not to overlook precedence of operators.
    Specifically, '|' has a lower precedence than '<<', so that::
        fwdExpr << a | b | c
    will actually be evaluated as::
        (fwdExpr << a) | b | c
    thereby leaving b and c out as parseable alternatives.  It is recommended that you
    explicitly group the values inserted into the C{Forward}::
        fwdExpr << (a | b | c)
    Converting to use the '<<=' operator instead will avoid this problem.

    See L{ParseResults.pprint} for an example of a recursive parser created using
    C{Forward}.
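
    Example (illustrative), a recursive grammar for nested lists of integers::
        LPAR, RPAR = map(Suppress, "()")
        nested = Forward()
        nested <<= Word(nums) | Group(LPAR + ZeroOrMore(nested) + RPAR)
        print(nested.parseString("(1 (2 3) 4)"))
    prints::
        [['1', ['2', '3'], '4']]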
    Ncstt|�j|dd�dS)NF)rg)r�rr�)r�r�)rHrwrxr�szForward.__init__cCsjt|t�rtj|�}||_d|_|jj|_|jj|_|j|jj	�|jj
|_
|jj|_|jj
|jj�|S)N)rzr�r$rQr.rUr`r[r�rYrXrWr]r�)r�r�rwrwrx�
__lshift__s





zForward.__lshift__cCs||>S)Nrw)r�r�rwrwrx�__ilshift__'szForward.__ilshift__cCs
d|_|S)NF)rX)r�rwrwrxr�*szForward.leaveWhitespacecCs$|js d|_|jdk	r |jj�|S)NT)r_r.r�)r�rwrwrxr�.s


zForward.streamlinecCs>||kr0|dd�|g}|jdk	r0|jj|�|jg�dS)N)r.r�r�)r�r�r7rwrwrxr�5s

zForward.validatecCs>t|d�r|jS|jjdSd}Wd|j|_X|jjd|S)Nr�z: ...�Nonez: )r�r�rHr�Z_revertClass�_ForwardNoRecurser.r�)r�Z	retStringrwrwrxr�<s

zForward.__str__cs.|jdk	rtt|�j�St�}||K}|SdS)N)r.r�rr�)r�r�)rHrwrxr�Ms

zForward.copy)N)
r�r�r�r�r�rSrTr�r�r�r�r�r�rwrw)rHrxrs
c@seZdZdd�ZdS)rVcCsdS)Nz...rw)r�rwrwrxr�Vsz_ForwardNoRecurse.__str__N)r�r�r�r�rwrwrwrxrVUsrVcs"eZdZdZd�fdd�	Z�ZS)r-zQ
    Abstract subclass of C{ParseExpression}, for converting parsed results.
    Fcstt|�j|�d|_dS)NF)r�r-r�rW)r�r.rg)rHrwrxr�]szTokenConverter.__init__)F)r�r�r�r�r�r�rwrw)rHrxr-Yscs6eZdZdZd
�fdd�	Z�fdd�Zdd	�Z�ZS)r
a�
    Converter to concatenate all matching tokens to a single string.
    By default, the matching patterns must also be contiguous in the input string;
    this can be disabled by specifying C{'adjacent=False'} in the constructor.

    Example::
        real = Word(nums) + '.' + Word(nums)
        print(real.parseString('3.1416')) # -> ['3', '.', '1416']
        # will also erroneously match the following
        print(real.parseString('3. 1416')) # -> ['3', '.', '1416']

        real = Combine(Word(nums) + '.' + Word(nums))
        print(real.parseString('3.1416')) # -> ['3.1416']
        # no match when there are internal spaces
        print(real.parseString('3. 1416')) # -> Exception: Expected W:(0123...)
    r�Tcs8tt|�j|�|r|j�||_d|_||_d|_dS)NT)r�r
r�r��adjacentrX�
joinStringre)r�r.rXrW)rHrwrxr�rszCombine.__init__cs(|jrtj||�ntt|�j|�|S)N)rWr$r�r�r
)r�r�)rHrwrxr�|szCombine.ignorecCsP|j�}|dd�=|tdj|j|j��g|jd�7}|jrH|j�rH|gS|SdS)Nr�)r�)r�r"r�r
rXrbrVr�)r�r-r�r�ZretToksrwrwrxr��s
"zCombine.postParse)r�T)r�r�r�r�r�r�r�r�rwrw)rHrxr
as
cs(eZdZdZ�fdd�Zdd�Z�ZS)ra�
    Converter to return the matched tokens as a list - useful for returning tokens of C{L{ZeroOrMore}} and C{L{OneOrMore}} expressions.

    Example::
        ident = Word(alphas)
        num = Word(nums)
        term = ident | num
        func = ident + Optional(delimitedList(term))
        print(func.parseString("fn a,b,100"))  # -> ['fn', 'a', 'b', '100']

        func = ident + Group(Optional(delimitedList(term)))
        print(func.parseString("fn a,b,100"))  # -> ['fn', ['a', 'b', '100']]
    cstt|�j|�d|_dS)NT)r�rr�rW)r�r.)rHrwrxr��szGroup.__init__cCs|gS)Nrw)r�r-r�r�rwrwrxr��szGroup.postParse)r�r�r�r�r�r�r�rwrw)rHrxr�s
cs(eZdZdZ�fdd�Zdd�Z�ZS)raW
    Converter to return a repetitive expression as a list, but also as a dictionary.
    Each element can also be referenced using the first token in the expression as its key.
    Useful for tabular report scraping when the first column can be used as an item key.

    Example::
        data_word = Word(alphas)
        label = data_word + FollowedBy(':')
        attr_expr = Group(label + Suppress(':') + OneOrMore(data_word).setParseAction(' '.join))

        text = "shape: SQUARE posn: upper left color: light blue texture: burlap"
        attr_expr = (label + Suppress(':') + OneOrMore(data_word, stopOn=label).setParseAction(' '.join))
        
        # print attributes as plain groups
        print(OneOrMore(attr_expr).parseString(text).dump())
        
        # instead of OneOrMore(expr), parse using Dict(OneOrMore(Group(expr))) - Dict will auto-assign names
        result = Dict(OneOrMore(Group(attr_expr))).parseString(text)
        print(result.dump())
        
        # access named fields as dict entries, or output as dict
        print(result['shape'])        
        print(result.asDict())
    prints::
        ['shape', 'SQUARE', 'posn', 'upper left', 'color', 'light blue', 'texture', 'burlap']

        [['shape', 'SQUARE'], ['posn', 'upper left'], ['color', 'light blue'], ['texture', 'burlap']]
        - color: light blue
        - posn: upper left
        - shape: SQUARE
        - texture: burlap
        SQUARE
        {'color': 'light blue', 'posn': 'upper left', 'texture': 'burlap', 'shape': 'SQUARE'}
    See more examples at L{ParseResults} of accessing fields by results name.
    cstt|�j|�d|_dS)NT)r�rr�rW)r�r.)rHrwrxr��sz
Dict.__init__cCs�x�t|�D]�\}}t|�dkr q
|d}t|t�rBt|d�j�}t|�dkr^td|�||<q
t|�dkr�t|dt�r�t|d|�||<q
|j�}|d=t|�dks�t|t�r�|j	�r�t||�||<q
t|d|�||<q
W|j
r�|gS|SdS)Nrrrr�rq)r�r�rzrur�r�r�r"r�r�rV)r�r-r�r�r��tokZikeyZ	dictvaluerwrwrxr��s$
zDict.postParse)r�r�r�r�r�r�r�rwrw)rHrxr�s#c@s eZdZdZdd�Zdd�ZdS)r+aV
    Converter for ignoring the results of a parsed expression.

    Example::
        source = "a, b, c,d"
        wd = Word(alphas)
        wd_list1 = wd + ZeroOrMore(',' + wd)
        print(wd_list1.parseString(source))

        # often, delimiters that are useful during parsing are just in the
        # way afterward - use Suppress to keep them out of the parsed output
        wd_list2 = wd + ZeroOrMore(Suppress(',') + wd)
        print(wd_list2.parseString(source))
    prints::
        ['a', ',', 'b', ',', 'c', ',', 'd']
        ['a', 'b', 'c', 'd']
    (See also L{delimitedList}.)
    cCsgS)Nrw)r�r-r�r�rwrwrxr��szSuppress.postParsecCs|S)Nrw)r�rwrwrxr��szSuppress.suppressN)r�r�r�r�r�r�rwrwrwrxr+�sc@s(eZdZdZdd�Zdd�Zdd�ZdS)	rzI
    Wrapper for parse actions, to ensure they are only called once.
    cCst|�|_d|_dS)NF)rM�callable�called)r�Z
methodCallrwrwrxr�s
zOnlyOnce.__init__cCs.|js|j|||�}d|_|St||d��dS)NTr�)r[rZr)r�r�r5rvr�rwrwrxr�s
zOnlyOnce.__call__cCs
d|_dS)NF)r[)r�rwrwrx�reset
szOnlyOnce.resetN)r�r�r�r�r�r�r\rwrwrwrxr�scs:t����fdd�}y�j|_Wntk
r4YnX|S)as
    Decorator for debugging parse actions. 
    
    When the parse action is called, this decorator will print C{">> entering I{method-name}(line:I{current_source_line}, I{parse_location}, I{matched_tokens})".}
    When the parse action completes, the decorator will print C{"<<"} followed by the returned value, or any exception that the parse action raised.

    Example::
        wd = Word(alphas)

        @traceParseAction
        def remove_duplicate_chars(tokens):
            return ''.join(sorted(set(''.join(tokens))))

        wds = OneOrMore(wd).setParseAction(remove_duplicate_chars)
        print(wds.parseString("slkdjs sld sldd sdlf sdljf"))
    prints::
        >>entering remove_duplicate_chars(line: 'slkdjs sld sldd sdlf sdljf', 0, (['slkdjs', 'sld', 'sldd', 'sdlf', 'sdljf'], {}))
        <<leaving remove_duplicate_chars (ret: 'dfjkls')
        ['dfjkls']
    cs��j}|dd�\}}}t|�dkr8|djjd|}tjjd|t||�||f�y�|�}Wn8tk
r�}ztjjd||f��WYdd}~XnXtjjd||f�|S)Nror�.z">>entering %s(line: '%s', %d, %r)
z<<leaving %s (exception: %s)
z<<leaving %s (ret: %r)
r9)r�r�rHr~�stderr�writerGrK)ZpaArgsZthisFuncr�r5rvr�r3)r�rwrx�z#sztraceParseAction.<locals>.z)rMr�r�)r�r`rw)r�rxrb
s
�,FcCs`t|�dt|�dt|�d}|rBt|t||��j|�S|tt|�|�j|�SdS)a�
    Helper to define a delimited list of expressions - the delimiter defaults to ','.
    By default, the list elements and delimiters can have intervening whitespace, and
    comments, but this can be overridden by passing C{combine=True} in the constructor.
    If C{combine} is set to C{True}, the matching tokens are returned as a single token
    string, with the delimiters included; otherwise, the matching tokens are returned
    as a list of tokens, with the delimiters suppressed.

    Example::
        delimitedList(Word(alphas)).parseString("aa,bb,cc") # -> ['aa', 'bb', 'cc']
        delimitedList(Word(hexnums), delim=':', combine=True).parseString("AA:BB:CC:DD:EE") # -> ['AA:BB:CC:DD:EE']
    z [r�z]...N)r�r
r2rir+)r.Zdelim�combineZdlNamerwrwrxr@9s
$csjt����fdd�}|dkr0tt�jdd��}n|j�}|jd�|j|dd�|�jd	t��d
�S)a:
    Helper to define a counted list of expressions.
    This helper defines a pattern of the form::
        integer expr expr expr...
    where the leading integer tells how many expr expressions follow.
    The matched tokens are returned as a list of the expr tokens - the leading count token is suppressed.
    
    If C{intExpr} is specified, it should be a pyparsing expression that produces an integer value.

    Example::
        countedArray(Word(alphas)).parseString('2 ab cd ef')  # -> ['ab', 'cd']

        # in this parser, the leading integer value is given in binary,
        # '10' indicating that 2 values are in the array
        binaryConstant = Word('01').setParseAction(lambda t: int(t[0], 2))
        countedArray(Word(alphas), intExpr=binaryConstant).parseString('10 ab cd ef')  # -> ['ab', 'cd']
    cs.|d}�|r tt�g|��p&tt�>gS)Nr)rrrC)r�r5rvr�)�	arrayExprr.rwrx�countFieldParseAction_s"z+countedArray.<locals>.countFieldParseActionNcSst|d�S)Nr)ru)rvrwrwrxrydszcountedArray.<locals>.<lambda>ZarrayLenT)rfz(len) z...)rr/rRr�r�rirxr�)r.ZintExprrdrw)rcr.rxr<Ls
cCs:g}x0|D](}t|t�r(|jt|��q
|j|�q
W|S)N)rzr�r�r�r�)�Lr�r�rwrwrxr�ks

r�cs6t���fdd�}|j|dd��jdt|���S)a*
    Helper to define an expression that is indirectly defined from
    the tokens matched in a previous expression, that is, it looks
    for a 'repeat' of a previous expression.  For example::
        first = Word(nums)
        second = matchPreviousLiteral(first)
        matchExpr = first + ":" + second
    will match C{"1:1"}, but not C{"1:2"}.  Because this matches a
    previous literal, it will also match the leading C{"1:1"} in C{"1:10"}.
    If this is not desired, use C{matchPreviousExpr}.
    Do I{not} use with packrat parsing enabled.
    csP|rBt|�dkr�|d>qLt|j��}�tdd�|D��>n
�t�>dS)Nrrrcss|]}t|�VqdS)N)r)r��ttrwrwrxr��szDmatchPreviousLiteral.<locals>.copyTokenToRepeater.<locals>.<genexpr>)r�r�r�rr
)r�r5rvZtflat)�reprwrx�copyTokenToRepeater�sz1matchPreviousLiteral.<locals>.copyTokenToRepeaterT)rfz(prev) )rrxrir�)r.rhrw)rgrxrOts


csFt��|j�}�|K��fdd�}|j|dd��jdt|���S)aS
    Helper to define an expression that is indirectly defined from
    the tokens matched in a previous expression, that is, it looks
    for a 'repeat' of a previous expression.  For example::
        first = Word(nums)
        second = matchPreviousExpr(first)
        matchExpr = first + ":" + second
    will match C{"1:1"}, but not C{"1:2"}.  Because this matches by
    expressions, it will I{not} match the leading C{"1:1"} in C{"1:10"};
    the expressions are evaluated first, and then compared, so
    C{"1"} is compared with C{"10"}.
    Do I{not} use with packrat parsing enabled.
    cs*t|j����fdd�}�j|dd�dS)Ncs$t|j��}|�kr tddd��dS)Nr�r)r�r�r)r�r5rvZtheseTokens)�matchTokensrwrx�mustMatchTheseTokens�szLmatchPreviousExpr.<locals>.copyTokenToRepeater.<locals>.mustMatchTheseTokensT)rf)r�r�r�)r�r5rvrj)rg)rirxrh�sz.matchPreviousExpr.<locals>.copyTokenToRepeaterT)rfz(prev) )rr�rxrir�)r.Ze2rhrw)rgrxrN�scCs>xdD]}|j|t|�}qW|jdd�}|jdd�}t|�S)Nz\^-]rz\nr(z\t)r��_bslashr�)r�r�rwrwrxr�s

rTc
s�|rdd�}dd�}t�ndd�}dd�}t�g}t|t�rF|j�}n&t|tj�r\t|�}ntj	dt
dd�|svt�Sd	}x�|t|�d
k�r||}xnt
||d
d��D]N\}}	||	|�r�|||d
=Pq�|||	�r�|||d
=|j||	�|	}Pq�W|d
7}q|W|�r�|�r�yht|�tdj|��k�rZtd
djdd�|D���jdj|��Stdjdd�|D���jdj|��SWn&tk
�r�tj	dt
dd�YnXt�fdd�|D��jdj|��S)a�
    Helper to quickly define a set of alternative Literals, and makes sure to do
    longest-first testing when there is a conflict, regardless of the input order,
    but returns a C{L{MatchFirst}} for best performance.

    Parameters:
     - strs - a string of space-delimited literals, or a collection of string literals
     - caseless - (default=C{False}) - treat all literals as caseless
     - useRegex - (default=C{True}) - as an optimization, will generate a Regex
          object; otherwise, will generate a C{MatchFirst} object (if C{caseless=True}, or
          if creating a C{Regex} raises an exception)

    Example::
        comp_oper = oneOf("< = > <= >= !=")
        var = Word(alphas)
        number = Word(nums)
        term = var | number
        comparison_expr = term + comp_oper + term
        print(comparison_expr.searchString("B = 12  AA=23 B<=AA AA>12"))
    prints::
        [['B', '=', '12'], ['AA', '=', '23'], ['B', '<=', 'AA'], ['AA', '>', '12']]
    cSs|j�|j�kS)N)r�)r�brwrwrxry�szoneOf.<locals>.<lambda>cSs|j�j|j��S)N)r�r�)rrlrwrwrxry�scSs||kS)Nrw)rrlrwrwrxry�scSs
|j|�S)N)r�)rrlrwrwrxry�sz6Invalid argument to oneOf, expected string or iterablerq)r�rrrNr�z[%s]css|]}t|�VqdS)N)r)r��symrwrwrxr��szoneOf.<locals>.<genexpr>z | �|css|]}tj|�VqdS)N)rdr	)r�rmrwrwrxr��sz7Exception creating Regex for oneOf, building MatchFirstc3s|]}�|�VqdS)Nrw)r�rm)�parseElementClassrwrxr��s)rrrzr�r�r�r5r�r�r�r�rr�r�r�r�r'rirKr)
Zstrsr�ZuseRegexZisequalZmasksZsymbolsr�Zcurr�r�rw)rorxrS�sL





((cCsttt||���S)a�
    Helper to easily and clearly define a dictionary by specifying the respective patterns
    for the key and value.  Takes care of defining the C{L{Dict}}, C{L{ZeroOrMore}}, and C{L{Group}} tokens
    in the proper order.  The key pattern can include delimiting markers or punctuation,
    as long as they are suppressed, thereby leaving the significant key text.  The value
    pattern can include named results, so that the C{Dict} results can include named token
    fields.

    Example::
        text = "shape: SQUARE posn: upper left color: light blue texture: burlap"
        attr_expr = (label + Suppress(':') + OneOrMore(data_word, stopOn=label).setParseAction(' '.join))
        print(OneOrMore(attr_expr).parseString(text).dump())
        
        attr_label = label
        attr_value = Suppress(':') + OneOrMore(data_word, stopOn=label).setParseAction(' '.join)

        # similar to Dict, but simpler call format
        result = dictOf(attr_label, attr_value).parseString(text)
        print(result.dump())
        print(result['shape'])
        print(result.shape)  # object attribute access works too
        print(result.asDict())
    prints::
        [['shape', 'SQUARE'], ['posn', 'upper left'], ['color', 'light blue'], ['texture', 'burlap']]
        - color: light blue
        - posn: upper left
        - shape: SQUARE
        - texture: burlap
        SQUARE
        SQUARE
        {'color': 'light blue', 'shape': 'SQUARE', 'posn': 'upper left', 'texture': 'burlap'}
    )rr2r)r�r�rwrwrxrA�s!cCs^t�jdd��}|j�}d|_|d�||d�}|r@dd�}ndd�}|j|�|j|_|S)	a�
    Helper to return the original, untokenized text for a given expression.  Useful to
    restore the parsed fields of an HTML start tag into the raw tag text itself, or to
    revert separate tokens with intervening whitespace back to the original matching
    input text. By default, returns a string containing the original parsed text.
       
    If the optional C{asString} argument is passed as C{False}, then the return value is a 
    C{L{ParseResults}} containing any results names that were originally matched, and a 
    single token containing the original matched text from the input string.  So if 
    the expression passed to C{L{originalTextFor}} contains expressions with defined
    results names, you must set C{asString} to C{False} if you want to preserve those
    results name values.

    Example::
        src = "this is test <b> bold <i>text</i> </b> normal text "
        for tag in ("b","i"):
            opener,closer = makeHTMLTags(tag)
            patt = originalTextFor(opener + SkipTo(closer) + closer)
            print(patt.searchString(src)[0])
    prints::
        ['<b> bold <i>text</i> </b>']
        ['<i>text</i>']
    cSs|S)Nrw)r�r�rvrwrwrxry8sz!originalTextFor.<locals>.<lambda>F�_original_start�
_original_endcSs||j|j�S)N)rprq)r�r5rvrwrwrxry=scSs&||jd�|jd��g|dd�<dS)Nrprq)r�)r�r5rvrwrwrx�extractText?sz$originalTextFor.<locals>.extractText)r
r�r�rer])r.ZasStringZ	locMarkerZendlocMarker�	matchExprrrrwrwrxrg s

cCst|�jdd��S)zp
    Helper to undo pyparsing's default grouping of And expressions, even
    if all but one are non-empty.
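
    Example (illustrative)::
        grouped = Group(Word(alphas))
        print(grouped.parseString("abc"))           # -> [['abc']]
        print(ungroup(grouped).parseString("abc"))  # -> ['abc']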
    cSs|dS)Nrrw)rvrwrwrxryJszungroup.<locals>.<lambda>)r-r�)r.rwrwrxrhEscCs4t�jdd��}t|d�|d�|j�j�d��S)a�
    Helper to decorate a returned token with its starting and ending locations in the input string.
    This helper adds the following results names:
     - locn_start = location where matched expression begins
     - locn_end = location where matched expression ends
     - value = the actual parsed results

    Be careful if the input text contains C{<TAB>} characters, you may want to call
    C{L{ParserElement.parseWithTabs}}

    Example::
        wd = Word(alphas)
        for match in locatedExpr(wd).searchString("ljsdf123lksdjjf123lkkjj1222"):
            print(match)
    prints::
        [[0, 'ljsdf', 5]]
        [[8, 'lksdjjf', 15]]
        [[18, 'lkkjj', 23]]
    cSs|S)Nrw)r�r5rvrwrwrxry`szlocatedExpr.<locals>.<lambda>Z
locn_startr�Zlocn_end)r
r�rr�r�)r.ZlocatorrwrwrxrjLsz\[]-*.$+^?()~ )r
cCs|ddS)Nrrrrw)r�r5rvrwrwrxryksryz\\0?[xX][0-9a-fA-F]+cCstt|djd�d��S)Nrz\0x�)�unichrru�lstrip)r�r5rvrwrwrxrylsz	\\0[0-7]+cCstt|ddd�d��S)Nrrr�)ruru)r�r5rvrwrwrxrymsz\])r�r
z\wr8rr�Znegate�bodyr	csBdd��y dj�fdd�tj|�jD��Stk
r<dSXdS)a�
    Helper to easily define string ranges for use in Word construction.  Borrows
    syntax from regexp '[]' string range definitions::
        srange("[0-9]")   -> "0123456789"
        srange("[a-z]")   -> "abcdefghijklmnopqrstuvwxyz"
        srange("[a-z$_]") -> "abcdefghijklmnopqrstuvwxyz$_"
    The input string must be enclosed in []'s, and the returned string is the expanded
    character set joined into a single string.
    The values enclosed in the []'s may be:
     - a single character
     - an escaped character with a leading backslash (such as C{\-} or C{\]})
     - an escaped hex character with a leading C{'\x'} (C{\x21}, which is a C{'!'} character) 
         (C{\0x##} is also supported for backwards compatibility) 
     - an escaped octal character with a leading C{'\0'} (C{\041}, which is a C{'!'} character)
     - a range of any of the above, separated by a dash (C{'a-z'}, etc.)
     - any combination of the above (C{'aeiouy'}, C{'a-zA-Z0-9_$'}, etc.)
    cSs<t|t�s|Sdjdd�tt|d�t|d�d�D��S)Nr�css|]}t|�VqdS)N)ru)r�r�rwrwrxr��sz+srange.<locals>.<lambda>.<locals>.<genexpr>rrr)rzr"r�r��ord)�prwrwrxry�szsrange.<locals>.<lambda>r�c3s|]}�|�VqdS)Nrw)r��part)�	_expandedrwrxr��szsrange.<locals>.<genexpr>N)r��_reBracketExprr�rxrK)r�rw)r|rxr_rs
 cs�fdd�}|S)zt
    Helper method for defining parse actions that require matching at a specific
    column in the input text.
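
    Example (illustrative sketch)::
        # accept a number only when it begins in column 7 of its line
        value = Word(nums).setParseAction(matchOnlyAtCol(7))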
    cs"t||��krt||d���dS)Nzmatched token not at column %d)r9r)r)Zlocnr1)r�rwrx�	verifyCol�sz!matchOnlyAtCol.<locals>.verifyColrw)r�r~rw)r�rxrM�scs�fdd�S)a�
    Helper method for common parse actions that simply return a literal value.  Especially
    useful when used with C{L{transformString<ParserElement.transformString>}()}.

    Example::
        num = Word(nums).setParseAction(lambda toks: int(toks[0]))
        na = oneOf("N/A NA").setParseAction(replaceWith(math.nan))
        term = na | num
        
        OneOrMore(term).parseString("324 234 N/A 234") # -> [324, 234, nan, 234]
    cs�gS)Nrw)r�r5rv)�replStrrwrxry�szreplaceWith.<locals>.<lambda>rw)rrw)rrxr\�scCs|ddd�S)a
    Helper parse action for removing quotation marks from parsed quoted strings.

    Example::
        # by default, quotation marks are included in parsed results
        quotedString.parseString("'Now is the Winter of our Discontent'") # -> ["'Now is the Winter of our Discontent'"]

        # use removeQuotes to strip quotation marks from parsed results
        quotedString.setParseAction(removeQuotes)
        quotedString.parseString("'Now is the Winter of our Discontent'") # -> ["Now is the Winter of our Discontent"]
    rrrrsrw)r�r5rvrwrwrxrZ�scsN��fdd�}yt�dt�d�j�}Wntk
rBt��}YnX||_|S)aG
    Helper to define a parse action by mapping a function to all elements of a ParseResults list. If any additional
    args are passed, they are forwarded to the given function as additional arguments after
    the token, as in C{hex_integer = Word(hexnums).setParseAction(tokenMap(int, 16))}, which will convert the
    parsed data to an integer using base 16.

    Example (compare the last example to the one in L{ParserElement.transformString})::
        hex_ints = OneOrMore(Word(hexnums)).setParseAction(tokenMap(int, 16))
        hex_ints.runTests('''
            00 11 22 aa FF 0a 0d 1a
            ''')
        
        upperword = Word(alphas).setParseAction(tokenMap(str.upper))
        OneOrMore(upperword).runTests('''
            my kingdom for a horse
            ''')

        wd = Word(alphas).setParseAction(tokenMap(str.title))
        OneOrMore(wd).setParseAction(' '.join).runTests('''
            now is the winter of our discontent made glorious summer by this sun of york
            ''')
    prints::
        00 11 22 aa FF 0a 0d 1a
        [0, 17, 34, 170, 255, 10, 13, 26]

        my kingdom for a horse
        ['MY', 'KINGDOM', 'FOR', 'A', 'HORSE']

        now is the winter of our discontent made glorious summer by this sun of york
        ['Now Is The Winter Of Our Discontent Made Glorious Summer By This Sun Of York']
    cs��fdd�|D�S)Ncsg|]}�|f����qSrwrw)r�Ztokn)r�r6rwrxr��sz(tokenMap.<locals>.pa.<locals>.<listcomp>rw)r�r5rv)r�r6rwrxr}�sztokenMap.<locals>.par�rH)rJr�rKr{)r6r�r}rLrw)r�r6rxrm�s cCst|�j�S)N)r�r�)rvrwrwrxry�scCst|�j�S)N)r��lower)rvrwrwrxry�scCs�t|t�r|}t||d�}n|j}tttd�}|r�tj�j	t
�}td�|d�tt
t|td�|���tddgd�jd	�j	d
d��td�}n�d
jdd�tD��}tj�j	t
�t|�B}td�|d�tt
t|j	t�ttd�|����tddgd�jd	�j	dd��td�}ttd�|d�}|jdd
j|jdd�j�j���jd|�}|jdd
j|jdd�j�j���jd|�}||_||_||fS)zRInternal helper to construct opening and closing tag expressions, given a tag name)r�z_-:r�tag�=�/F)r�rCcSs|ddkS)Nrr�rw)r�r5rvrwrwrxry�sz_makeTags.<locals>.<lambda>rr�css|]}|dkr|VqdS)rNrw)r�r�rwrwrxr��sz_makeTags.<locals>.<genexpr>cSs|ddkS)Nrr�rw)r�r5rvrwrwrxry�sz</r��:r�z<%s>rz</%s>)rzr�rr�r/r4r3r>r�r�rZr+rr2rrrmr�rVrYrBr
�_Lr��titler�rir�)�tagStrZxmlZresnameZtagAttrNameZtagAttrValueZopenTagZprintablesLessRAbrackZcloseTagrwrwrx�	_makeTags�s"
T\..r�cCs
t|d�S)a 
    Helper to construct opening and closing tag expressions for HTML, given a tag name. Matches
    tags in either upper or lower case, attributes with namespaces and with quoted or unquoted values.

    Example::
        text = '<td>More info at the <a href="http://pyparsing.wikispaces.com">pyparsing</a> wiki page</td>'
        # makeHTMLTags returns pyparsing expressions for the opening and closing tags as a 2-tuple
        a,a_end = makeHTMLTags("A")
        link_expr = a + SkipTo(a_end)("link_text") + a_end
        
        for link in link_expr.searchString(text):
            # attributes in the <A> tag (like "href" shown here) are also accessible as named results
            print(link.link_text, '->', link.href)
    prints::
        pyparsing -> http://pyparsing.wikispaces.com
    F)r�)r�rwrwrxrK�scCs
t|d�S)z�
    Helper to construct opening and closing tag expressions for XML, given a tag name. Matches
    tags only in the given upper/lower case.

    Example: similar to L{makeHTMLTags}
    T)r�)r�rwrwrxrLscs8|r|dd��n|j��dd��D���fdd�}|S)a<
    Helper to create a validating parse action to be used with start tags created
    with C{L{makeXMLTags}} or C{L{makeHTMLTags}}. Use C{withAttribute} to qualify a starting tag
    with a required attribute value, to avoid false matches on common tags such as
    C{<TD>} or C{<DIV>}.

    Call C{withAttribute} with a series of attribute names and values. Specify the list
    of filter attribute names and values as:
     - keyword arguments, as in C{(align="right")}, or
     - as an explicit dict with C{**} operator, when an attribute name is also a Python
          reserved word, as in C{**{"class":"Customer", "align":"right"}}
     - a list of name-value tuples, as in ( ("ns1:class", "Customer"), ("ns2:align","right") )
    For attribute names with a namespace prefix, you must use the second form.  Attribute
    names are matched insensitive to upper/lower case.
       
    If just testing for C{class} (with or without a namespace), use C{L{withClass}}.

    To verify that the attribute exists, but without specifying a value, pass
    C{withAttribute.ANY_VALUE} as the value.

    Example::
        html = '''
            <div>
            Some text
            <div type="grid">1 4 0 1 0</div>
            <div type="graph">1,3 2,3 1,1</div>
            <div>this has no type</div>
            </div>
                
        '''
        div,div_end = makeHTMLTags("div")

        # only match div tag having a type attribute with value "grid"
        div_grid = div().setParseAction(withAttribute(type="grid"))
        grid_expr = div_grid + SkipTo(div | div_end)("body")
        for grid_header in grid_expr.searchString(html):
            print(grid_header.body)
        
        # construct a match with any div tag having a type attribute, regardless of the value
        div_any_type = div().setParseAction(withAttribute(type=withAttribute.ANY_VALUE))
        div_expr = div_any_type + SkipTo(div | div_end)("body")
        for div_header in div_expr.searchString(html):
            print(div_header.body)
    prints::
        1 4 0 1 0

        1 4 0 1 0
        1,3 2,3 1,1
    NcSsg|]\}}||f�qSrwrw)r�r�r�rwrwrxr�Qsz!withAttribute.<locals>.<listcomp>cs^xX�D]P\}}||kr&t||d|��|tjkr|||krt||d||||f��qWdS)Nzno matching attribute z+attribute '%s' has value '%s', must be '%s')rre�	ANY_VALUE)r�r5r�ZattrNameZ	attrValue)�attrsrwrxr}RszwithAttribute.<locals>.pa)r�)r�ZattrDictr}rw)r�rxres2cCs|rd|nd}tf||i�S)a�
    Simplified version of C{L{withAttribute}} when matching on a div class - made
    difficult because C{class} is a reserved word in Python.

    Example::
        html = '''
            <div>
            Some text
            <div class="grid">1 4 0 1 0</div>
            <div class="graph">1,3 2,3 1,1</div>
            <div>this &lt;div&gt; has no class</div>
            </div>
                
        '''
        div,div_end = makeHTMLTags("div")
        div_grid = div().setParseAction(withClass("grid"))
        
        grid_expr = div_grid + SkipTo(div | div_end)("body")
        for grid_header in grid_expr.searchString(html):
            print(grid_header.body)
        
        div_any_type = div().setParseAction(withClass(withAttribute.ANY_VALUE))
        div_expr = div_any_type + SkipTo(div | div_end)("body")
        for div_header in div_expr.searchString(html):
            print(div_header.body)
    prints::
        1 4 0 1 0

        1 4 0 1 0
        1,3 2,3 1,1

infixNotation:

    Helper method for constructing grammars of expressions made up of
    operators working in a precedence hierarchy.  Operators may be unary or
    binary, left- or right-associative.  Parse actions can also be attached
    to operator expressions. The generated parser will also recognize the use 
    of parentheses to override operator precedences (see example below).
    
    Note: if you define a deep operator list, you may see performance issues
    when using infixNotation. See L{ParserElement.enablePackrat} for a
    mechanism to potentially improve your parser performance.

    Parameters:
     - baseExpr - expression representing the most basic operand element of the nested grammar
     - opList - list of tuples, one for each operator precedence level in the
      expression grammar; each tuple is of the form
      (opExpr, numTerms, rightLeftAssoc, parseAction), where:
       - opExpr is the pyparsing expression for the operator;
          may also be a string, which will be converted to a Literal;
          if numTerms is 3, opExpr is a tuple of two expressions, for the
          two operators separating the 3 terms
       - numTerms is the number of terms for this operator (must
          be 1, 2, or 3)
       - rightLeftAssoc indicates whether the operator is
          right or left associative, using the pyparsing-defined
          constants C{opAssoc.RIGHT} and C{opAssoc.LEFT}.
       - parseAction is the parse action to be associated with
          expressions matching this operator expression (the
          parse action tuple member may be omitted)
     - lpar - expression for matching left-parentheses (default=C{Suppress('(')})
     - rpar - expression for matching right-parentheses (default=C{Suppress(')')})

    Example::
        # simple example of four-function arithmetic with ints and variable names
        integer = pyparsing_common.signed_integer
        varname = pyparsing_common.identifier 
        
        arith_expr = infixNotation(integer | varname,
            [
            ('-', 1, opAssoc.RIGHT),
            (oneOf('* /'), 2, opAssoc.LEFT),
            (oneOf('+ -'), 2, opAssoc.LEFT),
            ])
        
        arith_expr.runTests('''
            5+3*6
            (5+3)*6
            -2--11
            ''', fullDump=False)
    prints::
        5+3*6
        [[5, '+', [3, '*', 6]]]

        (5+3)*6
        [[[5, '+', 3], '*', 6]]

        -2--11
        [[['-', 2], '-', ['-', 11]]]
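    A parse action supplied as the optional fourth tuple member is applied to every
    match at that precedence level; a small sketch (assumed, not part of the original
    test run) that evaluates addition as it is parsed::

        def eval_add(tokens):
            t = tokens[0]              # e.g. [1, '+', 2, '+', 3]
            return sum(t[0::2])        # sum the operands, skipping the '+' tokens

        calc = infixNotation(pyparsing_common.integer,
            [
            ('+', 2, opAssoc.LEFT, eval_add),
            ])
        print(calc.parseString("1+2+3"))   # -> [6]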

nestedExpr:

    Helper method for defining nested lists enclosed in opening and closing
    delimiters ("(" and ")" are the default).

    Parameters:
     - opener - opening character for a nested list (default=C{"("}); can also be a pyparsing expression
     - closer - closing character for a nested list (default=C{")"}); can also be a pyparsing expression
     - content - expression for items within the nested lists (default=C{None})
     - ignoreExpr - expression for ignoring opening and closing delimiters (default=C{quotedString})

    If an expression is not provided for the content argument, the nested
    expression will capture all whitespace-delimited content between delimiters
    as a list of separate values.

    Use the C{ignoreExpr} argument to define expressions that may contain
    opening or closing characters that should not be treated as opening
    or closing characters for nesting, such as quotedString or a comment
    expression.  Specify multiple expressions using an C{L{Or}} or C{L{MatchFirst}}.
    The default is L{quotedString}, but if no expressions are to be ignored,
    then pass C{None} for this argument.

    Example::
        data_type = oneOf("void int short long char float double")
        decl_data_type = Combine(data_type + Optional(Word('*')))
        ident = Word(alphas+'_', alphanums+'_')
        number = pyparsing_common.number
        arg = Group(decl_data_type + ident)
        LPAR,RPAR = map(Suppress, "()")

        code_body = nestedExpr('{', '}', ignoreExpr=(quotedString | cStyleComment))

        c_function = (decl_data_type("type") 
                      + ident("name")
                      + LPAR + Optional(delimitedList(arg), [])("args") + RPAR 
                      + code_body("body"))
        c_function.ignore(cStyleComment)
        
        source_code = '''
            int is_odd(int x) { 
                return (x%2); 
            }
                
            int dec_to_hex(char hchar) { 
                if (hchar >= '0' && hchar <= '9') { 
                    return (ord(hchar)-ord('0')); 
                } else { 
                    return (10+ord(hchar)-ord('A'));
                } 
            }
        '''
        for func in c_function.searchString(source_code):
            print("%(name)s (%(type)s) args: %(args)s" % func)

    prints::
        is_odd (int) args: [['int', 'x']]
        dec_to_hex (int) args: [['char', 'hchar']]
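    Called with no arguments, nestedExpr() matches "("..")" groups and returns the
    whitespace-delimited items between them; a short sketch (the input string is
    illustrative only)::

        expr = nestedExpr()
        print(expr.parseString("(a b (c d) e)").asList())
        # -> [['a', 'b', ['c', 'd'], 'e']]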

indentedBlock:

    Helper method for defining space-delimited indentation blocks, such as
    those used to define block statements in Python source code.

    Parameters:
     - blockStatementExpr - expression defining syntax of statement that
            is repeated within the indented block
     - indentStack - list created by caller to manage indentation stack
            (multiple statementWithIndentedBlock expressions within a single grammar
            should share a common indentStack)
     - indent - boolean indicating whether block must be indented beyond the
            current level; set to False for a block of left-most statements
            (default=C{True})

    A valid block must contain at least one C{blockStatement}.

    Example::
        data = '''
        def A(z):
          A1
          B = 100
          G = A2
          A2
          A3
        B
        def BB(a,b,c):
          BB1
          def BBA():
            bba1
            bba2
            bba3
        C
        D
        def spam(x,y):
             def eggs(z):
                 pass
        '''


        indentStack = [1]
        stmt = Forward()

        identifier = Word(alphas, alphanums)
        funcDecl = ("def" + identifier + Group( "(" + Optional( delimitedList(identifier) ) + ")" ) + ":")
        func_body = indentedBlock(stmt, indentStack)
        funcDef = Group( funcDecl + func_body )

        rvalue = Forward()
        funcCall = Group(identifier + "(" + Optional(delimitedList(rvalue)) + ")")
        rvalue << (funcCall | identifier | Word(nums))
        assignment = Group(identifier + "=" + rvalue)
        stmt << ( funcDef | assignment | identifier )

        module_body = OneOrMore(stmt)

        parseTree = module_body.parseString(data)
        parseTree.pprint()
    prints::
        [['def',
          'A',
          ['(', 'z', ')'],
          ':',
          [['A1'], [['B', '=', '100']], [['G', '=', 'A2']], ['A2'], ['A3']]],
         'B',
         ['def',
          'BB',
          ['(', 'a', 'b', 'c', ')'],
          ':',
          [['BB1'], [['def', 'BBA', ['(', ')'], ':', [['bba1'], ['bba2'], ['bba3']]]]]],
         'C',
         'D',
         ['def',
          'spam',
          ['(', 'x', 'y', ')'],
          ':',
          [[['def', 'eggs', ['(', 'z', ')'], ':', [['pass']]]]]]] 

replaceHTMLEntity:
Helper parser action to replace common HTML entities with their special characters.

pyparsing_common:

    Here are some common low-level expressions that may be useful in jump-starting parser development:
     - numeric forms (L{integers<integer>}, L{reals<real>}, L{scientific notation<sci_real>})
     - common L{programming identifiers<identifier>}
     - network addresses (L{MAC<mac_address>}, L{IPv4<ipv4_address>}, L{IPv6<ipv6_address>})
     - ISO8601 L{dates<iso8601_date>} and L{datetime<iso8601_datetime>}
     - L{UUID<uuid>}
     - L{comma-separated list<comma_separated_list>}
    Parse actions:
     - C{L{convertToInteger}}
     - C{L{convertToFloat}}
     - C{L{convertToDate}}
     - C{L{convertToDatetime}}
     - C{L{stripHTMLTags}}
     - C{L{upcaseTokens}}
     - C{L{downcaseTokens}}

    Example::
        pyparsing_common.number.runTests('''
            # any int or real number, returned as the appropriate type
            100
            -100
            +100
            3.14159
            6.02e23
            1e-12
            ''')

        pyparsing_common.fnumber.runTests('''
            # any int or real number, returned as float
            100
            -100
            +100
            3.14159
            6.02e23
            1e-12
            ''')

        pyparsing_common.hex_integer.runTests('''
            # hex numbers
            100
            FF
            ''')

        pyparsing_common.fraction.runTests('''
            # fractions
            1/2
            -3/4
            ''')

        pyparsing_common.mixed_integer.runTests('''
            # mixed fractions
            1
            1/2
            -3/4
            1-3/4
            ''')

        import uuid
        pyparsing_common.uuid.setParseAction(tokenMap(uuid.UUID))
        pyparsing_common.uuid.runTests('''
            # uuid
            12345678-1234-5678-1234-567812345678
            ''')
    prints::
        # any int or real number, returned as the appropriate type
        100
        [100]

        -100
        [-100]

        +100
        [100]

        3.14159
        [3.14159]

        6.02e23
        [6.02e+23]

        1e-12
        [1e-12]

        # any int or real number, returned as float
        100
        [100.0]

        -100
        [-100.0]

        +100
        [100.0]

        3.14159
        [3.14159]

        6.02e23
        [6.02e+23]

        1e-12
        [1e-12]

        # hex numbers
        100
        [256]

        FF
        [255]

        # fractions
        1/2
        [0.5]

        -3/4
        [-0.75]

        # mixed fractions
        1
        [1]

        1/2
        [0.5]

        -3/4
        [-0.75]

        1-3/4
        [1.75]

        # uuid
        12345678-1234-5678-1234-567812345678
        [UUID('12345678-1234-5678-1234-567812345678')]
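    The remaining attributes are used the same way; a brief sketch (the inputs are
    illustrative, not part of the original test run)::

        print(pyparsing_common.comma_separated_list.parseString("a, b, 100").asList())
        # -> ['a', 'b', '100']
        print(pyparsing_common.ipv4_address.parseString("192.168.0.1"))
        # -> ['192.168.0.1']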

pyparsing_common.convertToDate():

        Helper to create a parse action for converting parsed date string to Python datetime.date

        Params -
         - fmt - format to be passed to datetime.strptime (default=C{"%Y-%m-%d"})

        Example::
            date_expr = pyparsing_common.iso8601_date.copy()
            date_expr.setParseAction(pyparsing_common.convertToDate())
            print(date_expr.parseString("1999-12-31"))
        prints::
            [datetime.date(1999, 12, 31)]
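        The fmt argument accepts any strptime format string; a small assumed
        variation on the example above::

            date_expr = Regex(r"\d{2}/\d{2}/\d{4}")
            date_expr.setParseAction(pyparsing_common.convertToDate("%d/%m/%Y"))
            print(date_expr.parseString("31/12/1999"))
        prints::
            [datetime.date(1999, 12, 31)]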

pyparsing_common.convertToDatetime():

        Helper to create a parse action for converting parsed datetime string to Python datetime.datetime

        Params -
         - fmt - format to be passed to datetime.strptime (default=C{"%Y-%m-%dT%H:%M:%S.%f"})

        Example::
            dt_expr = pyparsing_common.iso8601_datetime.copy()
            dt_expr.setParseAction(pyparsing_common.convertToDatetime())
            print(dt_expr.parseString("1999-12-31T23:59:59.999"))
        prints::
            [datetime.datetime(1999, 12, 31, 23, 59, 59, 999000)]

pyparsing_common.stripHTMLTags():

        Parse action to remove HTML tags from web page HTML source

        Example::
            # strip HTML links from normal text 
            text = '<td>More info at the <a href="http://pyparsing.wikispaces.com">pyparsing</a> wiki page</td>'
            td,td_end = makeHTMLTags("TD")
            table_text = td + SkipTo(td_end).setParseAction(pyparsing_common.stripHTMLTags)("body") + td_end
            
            print(table_text.parseString(text).body) # -> 'More info at the pyparsing wiki page'

runTests cases from the module's __main__ self-test (simpleSQL demo and pyparsing_common expressions):

        # '*' as column list and dotted table name
        select * from SYS.XYZZY

        # caseless match on "SELECT", and casts back to "select"
        SELECT * from XYZZY, ABC

        # list of column names, and mixed case SELECT keyword
        Select AA,BB,CC from Sys.dual

        # multiple tables
        Select A, B, C from Sys.dual, Table2

        # invalid SELECT keyword - should fail
        Xelect A, B, C from Sys.dual

        # incomplete command - should fail
        Select

        # invalid column name - should fail
        Select ^^^ frox Sys.dual
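        A rough sketch of the simpleSQL grammar these cases exercise (an assumption
        for illustration, not the exact code in this file)::

            selectToken = CaselessKeyword("select")
            fromToken = CaselessKeyword("from")
            ident = Word(alphas, alphanums + "_$")
            columnName = delimitedList(ident, ".", combine=True)
            columnNameList = Group(delimitedList(columnName))
            tableName = delimitedList(ident, ".", combine=True)
            tableNameList = Group(delimitedList(tableName))

            simpleSQL = (selectToken("command")
                         + ('*' | columnNameList)("columns")
                         + fromToken
                         + tableNameList("tables"))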


        pyparsing_common.number / fnumber test cases:
        100
        -100
        +100
        3.14159
        6.02e23
        1e-12

        pyparsing_common.hex_integer test cases:
        100
        FF

        pyparsing_common.uuid test cases:
        12345678-1234-5678-1234-567812345678
"site-packages/pip/_vendor/__pycache__/distro.cpython-36.opt-1.pyc000064400000077437147511334610020640 0ustar003

���e��@sbdZddlZddlZddlZddlZddlZddlZddlZejj	d�sXe
djej���dZdZ
iZddd	�Zd
diZejd�Zejd�Zd
dde
dfZd:dd�Zdd�Zd;dd�Zd<dd�Zd=dd�Zd>dd�Zd?dd �Zd@d!d"�Zd#d$�Zd%d&�ZdAd'd(�Zd)d*�Z d+d,�Z!d-d.�Z"d/d0�Z#d1d2�Z$d3d4�Z%Gd5d6�d6e&�Z'e'�Z(d7d8�Z)e*d9k�r^e)�dS)Ba,
The ``distro`` package (``distro`` stands for Linux Distribution) provides
information about the Linux distribution it runs on, such as a reliable
machine-readable distro ID, or version information.

It is a renewed alternative implementation for Python's original
:py:func:`platform.linux_distribution` function, but it provides much more
functionality. An alternative implementation became necessary because Python
3.5 deprecated this function, and Python 3.7 is expected to remove it
altogether. Its predecessor function :py:func:`platform.dist` was already
deprecated since Python 2.6 and is also expected to be removed in Python 3.7.
Still, there are many cases in which access to Linux distribution information
is needed. See `Python issue 1322 <https://bugs.python.org/issue1322>`_ for
more information.
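A short usage sketch of the consolidated accessor functions (an illustration only;
the printed values depend on the host system)::

    import distro

    print(distro.id())               # e.g. 'centos'
    print(distro.name(pretty=True))  # e.g. 'CentOS Linux 7.1.1503 (Core)'
    print(distro.version(best=True)) # e.g. '7.1.1503'
    print(distro.codename())         # e.g. 'Core'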

distro.linux_distribution():

    Return information about the current Linux distribution as a tuple
    ``(id_name, version, codename)`` with items as follows:

    * ``id_name``:  If *full_distribution_name* is false, the result of
      :func:`distro.id`. Otherwise, the result of :func:`distro.name`.

    * ``version``:  The result of :func:`distro.version`.

    * ``codename``:  The result of :func:`distro.codename`.

    The interface of this function is compatible with the original
    :py:func:`platform.linux_distribution` function, supporting a subset of
    its parameters.

    The data it returns may not exactly be the same, because it uses more data
    sources than the original function, and that may lead to different data if
    the Linux distribution is not consistent across multiple data sources it
    provides (there are indeed such distributions ...).

    Another reason for differences is the fact that the :func:`distro.id`
    method normalizes the distro ID string to a reliable machine-readable value
    for a number of popular Linux distributions.
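    For example (a hedged sketch; the exact values depend on the platform)::

        >>> distro.linux_distribution()
        ('CentOS Linux', '7.1.1503', 'Core')
        >>> distro.linux_distribution(full_distribution_name=False)
        ('centos', '7.1.1503', 'Core')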
    )�_distro�linux_distribution)�full_distribution_name�r�/usr/lib/python3.6/distro.pyr`srcCstj�S)a�

    Return the distro ID of the current Linux distribution, as a
    machine-readable string.

    For a number of Linux distributions, the returned distro ID value is
    *reliable*, in the sense that it is documented and that it does not change
    across releases of the distribution.

    This package maintains the following reliable distro ID values:

    ==============  =========================================
    Distro ID       Distribution
    ==============  =========================================
    "ubuntu"        Ubuntu
    "debian"        Debian
    "rhel"          RedHat Enterprise Linux
    "centos"        CentOS
    "rocky"         Rocky Linux
    "fedora"        Fedora
    "sles"          SUSE Linux Enterprise Server
    "opensuse"      openSUSE
    "amazon"        Amazon Linux
    "arch"          Arch Linux
    "cloudlinux"    CloudLinux OS
    "exherbo"       Exherbo Linux
    "gentoo"        GenToo Linux
    "ibm_powerkvm"  IBM PowerKVM
    "kvmibm"        KVM for IBM z Systems
    "linuxmint"     Linux Mint
    "mageia"        Mageia
    "mandriva"      Mandriva Linux
    "parallels"     Parallels
    "pidora"        Pidora
    "raspbian"      Raspbian
    "oracle"        Oracle Linux (and Oracle Enterprise Linux)
    "scientific"    Scientific Linux
    "slackware"     Slackware
    "xenserver"     XenServer
    ==============  =========================================

    If you have a need to get distros for reliable IDs added into this set,
    or if you find that the :func:`distro.id` function returns a different
    distro ID for one of the listed distros, please create an issue in the
    `distro issue tracker`_.

    **Lookup hierarchy and transformations:**

    First, the ID is obtained from the following sources, in the specified
    order. The first available and non-empty value is used:

    * the value of the "ID" attribute of the os-release file,

    * the value of the "Distributor ID" attribute returned by the lsb_release
      command,

    * the first part of the file name of the distro release file,

    The so determined ID value then passes the following transformations,
    before it is returned by this method:

    * it is translated to lower case,

    * blanks (which should not be there anyway) are translated to underscores,

    * a normalization of the ID is performed, based upon
      `normalization tables`_. The purpose of this normalization is to ensure
      that the ID is as reliable as possible, even across incompatible changes
      in the Linux distributions. A common reason for an incompatible change is
      the addition of an os-release file, or the addition of the lsb_release
      command, with ID values that differ from what was previously determined
      from the distro release file name.

distro.name():

tj|�S)an
    Return the name of the current Linux distribution, as a human-readable
    string.

    If *pretty* is false, the name is returned without version or codename.
    (e.g. "CentOS Linux")

    If *pretty* is true, the version and codename are appended.
    (e.g. "CentOS Linux 7.1.1503 (Core)")

    **Lookup hierarchy:**

    The name is obtained from the following sources, in the specified order.
    The first available and non-empty value is used:

    * If *pretty* is false:

      - the value of the "NAME" attribute of the os-release file,

      - the value of the "Distributor ID" attribute returned by the lsb_release
        command,

      - the value of the "<name>" field of the distro release file.

    * If *pretty* is true:

      - the value of the "PRETTY_NAME" attribute of the os-release file,

      - the value of the "Description" attribute returned by the lsb_release
        command,

      - the value of the "<name>" field of the distro release file, appended
        with the value of the pretty version ("<version_id>" and "<codename>"
        fields) of the distro release file, if available.
    )r�name)�prettyrrrr	�s$r	cCstj||�S)ay
    Return the version of the current Linux distribution, as a human-readable
    string.

    If *pretty* is false, the version is returned without codename (e.g.
    "7.0").

    If *pretty* is true, the codename in parenthesis is appended, if the
    codename is non-empty (e.g. "7.0 (Maipo)").

    Some distributions provide version numbers with different precisions in
    the different sources of distribution information. Examining the different
    sources in a fixed priority order does not always yield the most precise
    version (e.g. for Debian 8.2, or CentOS 7.1).

    The *best* parameter can be used to control the approach for the returned
    version:

    If *best* is false, the first non-empty version number in priority order of
    the examined sources is returned.

    If *best* is true, the most precise version number out of all examined
    sources is returned.

    **Lookup hierarchy:**

    In all cases, the version number is obtained from the following sources.
    If *best* is false, this order represents the priority order:

    * the value of the "VERSION_ID" attribute of the os-release file,
    * the value of the "Release" attribute returned by the lsb_release
      command,
    * the version number parsed from the "<version_id>" field of the first line
      of the distro release file,
    * the version number parsed from the "PRETTY_NAME" attribute of the
      os-release file, if it follows the format of the distro release files.
    * the version number parsed from the "Description" attribute returned by
      the lsb_release command, if it follows the format of the distro release
      files.
    )r�version)r
�bestrrrr�s)rcCs
tj|�S)a�
    Return the version of the current Linux distribution as a tuple
    ``(major, minor, build_number)`` with items as follows:

    * ``major``:  The result of :func:`distro.major_version`.

    * ``minor``:  The result of :func:`distro.minor_version`.

    * ``build_number``:  The result of :func:`distro.build_number`.

    For a description of the *best* parameter, see the :func:`distro.version`
    method.
    )r�
version_parts)rrrrr
sr
cCs
tj|�S)a8
    Return the major version of the current Linux distribution, as a string,
    if provided.
    Otherwise, the empty string is returned. The major version is the first
    part of the dot-separated version string.

    For a description of the *best* parameter, see the :func:`distro.version`
    method.
    )r�
major_version)rrrrr,s
rcCs
tj|�S)a9
    Return the minor version of the current Linux distribution, as a string,
    if provided.
    Otherwise, the empty string is returned. The minor version is the second
    part of the dot-separated version string.

    For a description of the *best* parameter, see the :func:`distro.version`
    method.
    )r�
minor_version)rrrrr9s
rcCs
tj|�S)a6
    Return the build number of the current Linux distribution, as a string,
    if provided.
    Otherwise, the empty string is returned. The build number is the third part
    of the dot-separated version string.

    For a description of the *best* parameter, see the :func:`distro.version`
    method.
    )r�build_number)rrrrrFs
rcCstj�S)a
    Return a space-separated list of distro IDs of distributions that are
    closely related to the current Linux distribution in regards to packaging
    and programming interfaces, for example distributions the current
    distribution is a derivative from.

    **Lookup hierarchy:**

    This information item is only provided by the os-release file.
    For details, see the description of the "ID_LIKE" attribute in the
    `os-release man page
    <http://www.freedesktop.org/software/systemd/man/os-release.html>`_.
    )r�likerrrrrSsrcCstj�S)a�
    Return the codename for the release of the current Linux distribution,
    as a string.

    If the distribution does not have a codename, an empty string is returned.

    Note that the returned codename is not always really a codename. For
    example, openSUSE returns "x86_64". This function does not handle such
    cases in any special way and just returns the string it finds, if any.

    **Lookup hierarchy:**

    * the codename within the "VERSION" attribute of the os-release file, if
      provided,

    * the value of the "Codename" attribute returned by the lsb_release
      command,

    * the value of the "<codename>" field of the distro release file.
    )r�codenamerrrrrdsrcCstj||�S)a�
    Return certain machine-readable information items about the current Linux
    distribution in a dictionary, as shown in the following example:

    .. sourcecode:: python

        {
            'id': 'rhel',
            'version': '7.0',
            'version_parts': {
                'major': '7',
                'minor': '0',
                'build_number': ''
            },
            'like': 'fedora',
            'codename': 'Maipo'
        }

    The dictionary structure and keys are always the same, regardless of which
    information items are available in the underlying data sources. The values
    for the various keys are as follows:

    * ``id``:  The result of :func:`distro.id`.

    * ``version``:  The result of :func:`distro.version`.

    * ``version_parts -> major``:  The result of :func:`distro.major_version`.

    * ``version_parts -> minor``:  The result of :func:`distro.minor_version`.

    * ``version_parts -> build_number``:  The result of
      :func:`distro.build_number`.

    * ``like``:  The result of :func:`distro.like`.

    * ``codename``:  The result of :func:`distro.codename`.

    For a description of the *pretty* and *best* parameters, see the
    :func:`distro.version` method.
    )r�info)r
rrrrr|s)rcCstj�S)z�
    Return a dictionary containing key-value pairs for the information items
    from the os-release file data source of the current Linux distribution.

    See `os-release file`_ for details about these information items.
    )r�os_release_inforrrrr�srcCstj�S)z�
    Return a dictionary containing key-value pairs for the information items
    from the lsb_release command data source of the current Linux distribution.

    See `lsb_release command output`_ for details about these information
    items.
    )r�lsb_release_inforrrrr�srcCstj�S)z�
    Return a dictionary containing key-value pairs for the information items
    from the distro release file data source of the current Linux distribution.

    See `distro release file`_ for details about these information items.
    )r�distro_release_inforrrrr�srcCs
tj|�S)a�
    Return a single named information item from the os-release file data source
    of the current Linux distribution.

    Parameters:

    * ``attribute`` (string): Key of the information item.

    Returns:

    * (string): Value of the information item, if the item exists.
      The empty string, if the item does not exist.

    See `os-release file`_ for details about these information items.
    )r�os_release_attr)�	attributerrrr�srcCs
tj|�S)a�
    Return a single named information item from the lsb_release command output
    data source of the current Linux distribution.

    Parameters:

    * ``attribute`` (string): Key of the information item.

    Returns:

    * (string): Value of the information item, if the item exists.
      The empty string, if the item does not exist.

    See `lsb_release command output`_ for details about these information
    items.
    )r�lsb_release_attr)rrrrr�srcCs
tj|�S)a�
    Return a single named information item from the distro release file
    data source of the current Linux distribution.

    Parameters:

    * ``attribute`` (string): Key of the information item.

    Returns:

    * (string): Value of the information item, if the item exists.
      The empty string, if the item does not exist.

    See `distro release file`_ for details about these information items.
    )r�distro_release_attr)rrrrr�src@s�eZdZdZd:dd�Zdd�Zd;dd	�Zd
d�Zd<d
d�Zd=dd�Z	d>dd�Z
d?dd�Zd@dd�ZdAdd�Z
dd�Zdd�ZdBdd�Zdd �Zd!d"�Zd#d$�Zd%d&�Zd'd(�Zd)d*�Zd+d,�Zed-d.��Zd/d0�Zed1d2��Zd3d4�Zd5d6�Zed7d8��Zd9S)C�LinuxDistributiona
    Provides information about a Linux distribution.

    This package creates a private module-global instance of this class with
    default initialization arguments, that is used by the
    `consolidated accessor functions`_ and `single source accessor functions`_.
    By using default initialization arguments, that module-global instance
    returns data about the current Linux distribution (i.e. the distro this
    package runs on).

    Normally, it is not necessary to create additional instances of this class.
    However, in situations where control is needed over the exact data sources
    that are used, instances of this class can be created with a specific
    distro release file, or a specific os-release file, or without invoking the
    lsb_release command.
    T�cCsH|ptjjtt�|_|pd|_|j�|_|r4|j	�ni|_
|j�|_dS)a8	
        The initialization method of this class gathers information from the
        available data sources, and stores that in private instance attributes.
        Subsequent access to the information items uses these private instance
        attributes, so that the data sources are read only once.

        Parameters:

        * ``include_lsb`` (bool): Controls whether the
          `lsb_release command output`_ is included as a data source.

          If the lsb_release command is not available in the program execution
          path, the data source for the lsb_release command will be empty.

        * ``os_release_file`` (string): The path name of the
          `os-release file`_ that is to be used as a data source.

          An empty string (the default) will cause the default path name to
          be used (see `os-release file`_ for details).

          If the specified or defaulted os-release file does not exist, the
          data source for the os-release file will be empty.

        * ``distro_release_file`` (string): The path name of the
          `distro release file`_ that is to be used as a data source.

          An empty string (the default) will cause a default search algorithm
          to be used (see `distro release file`_ for details).

          If the specified distro release file does not exist, or if no default
          distro release file can be found, the data source for the distro
          release file will be empty.

        Public instance attributes:

        * ``os_release_file`` (string): The path name of the
          `os-release file`_ that is actually used as a data source. The
          empty string if no distro release file is used as a data source.

        * ``distro_release_file`` (string): The path name of the
          `distro release file`_ that is actually used as a data source. The
          empty string if no distro release file is used as a data source.

        Raises:

        * :py:exc:`IOError`: Some I/O issue with an os-release file or distro
          release file.

        * :py:exc:`subprocess.CalledProcessError`: The lsb_release command had
          some issue (other than not being available in the program execution
          path).

        * :py:exc:`UnicodeError`: A data source has unexpected characters or
          uses an unexpected encoding.
        rN)
�os�path�join�_UNIXCONFDIR�_OS_RELEASE_BASENAME�os_release_file�distro_release_file�_get_os_release_info�_os_release_info�_get_lsb_release_info�_lsb_release_info�_get_distro_release_info�_distro_release_info)�selfZinclude_lsbr"r#rrr�__init__s;

zLinuxDistribution.__init__cCsdj|j|j|j|j|j�S)z Return repr of all info
        z�LinuxDistribution(os_release_file={0!r}, distro_release_file={1!r}, _os_release_info={2!r}, _lsb_release_info={3!r}, _distro_release_info={4!r}))�formatr"r#r%r'r))r*rrr�__repr__VszLinuxDistribution.__repr__cCs"|r|j�n|j�|j�|j�fS)z�
        Return information about the Linux distribution that is compatible
        with Python's :func:`platform.linux_distribution`, supporting a subset
        of its parameters.

        For details, see :func:`distro.linux_distribution`.
        )r	rrr)r*rrrrrfs	z$LinuxDistribution.linux_distributioncCsTdd�}|jd�}|r ||t�S|jd�}|r8||t�S|jd�}|rP||t�SdS)zrReturn the distro ID of the Linux distribution, as a string.

        For details, see :func:`distro.id`.
        cSs|j�jdd�}|j||�S)N� �_)�lower�replace�get)�	distro_id�tablerrr�	normalizeysz'LinuxDistribution.id.<locals>.normalizer�distributor_idr)r�NORMALIZED_OS_IDr�NORMALIZED_LSB_IDr�NORMALIZED_DISTRO_ID)r*r5r3rrrrts





zLinuxDistribution.idFcCsh|jd�p|jd�p|jd�}|r`|jd�p4|jd�}|s`|jd�}|jdd�}|r`|d|}|pfdS)	zx
        Return the name of the Linux distribution, as a string.

        For details, see :func:`distro.name`.
        r	r6�pretty_name�descriptionT)r
r.r)rrrr)r*r
r	rrrrr	�s





zLinuxDistribution.namecCs�|jd�|jd�|jd�|j|jd��jdd�|j|jd��jdd�g}d}|r�xJ|D]$}|jd�|jd�ksv|dkrV|}qVWnx|D]}|dkr�|}Pq�W|r�|r�|j�r�dj||j��}|S)z~
        Return the version of the Linux distribution, as a string.

        For details, see :func:`distro.version`.
        �
version_id�releaser:rr;�.z	{0} ({1}))rrr�_parse_distro_release_contentr2�countrr,)r*r
rZversionsr�vrrrr�s&


zLinuxDistribution.versioncCsL|j|d�}|rHtjd�}|j|�}|rH|j�\}}}||p>d|pDdfSdS)z�
        Return the version of the Linux distribution, as a tuple of version
        numbers.

        For details, see :func:`distro.version_parts`.
        )rz(\d+)\.?(\d+)?\.?(\d+)?r)rrr)r�re�compile�match�groups)r*rZversion_strZ
version_regex�matches�major�minorrrrrr
�s

zLinuxDistribution.version_partscCs|j|�dS)z�
        Return the major version number of the current distribution.

        For details, see :func:`distro.major_version`.
        r)r
)r*rrrrr�szLinuxDistribution.major_versioncCs|j|�dS)z�
        Return the minor version number of the Linux distribution.

        For details, see :func:`distro.minor_version`.
        �)r
)r*rrrrr�szLinuxDistribution.minor_versioncCs|j|�dS)z{
        Return the build number of the Linux distribution.

        For details, see :func:`distro.build_number`.
        �)r
)r*rrrrr�szLinuxDistribution.build_numbercCs|jd�pdS)z�
        Return the IDs of distributions that are like the Linux distribution.

        For details, see :func:`distro.like`.
        Zid_liker)r)r*rrrr�szLinuxDistribution.likecCs"|jd�p |jd�p |jd�p dS)zs
        Return the codename of the Linux distribution.

        For details, see :func:`distro.codename`.
        rr)rrr)r*rrrr�s


zLinuxDistribution.codenamecCsBt|j�|j||�t|j|�|j|�|j|�d�|j�|j�d�S)z�
        Return certain machine-readable information about the Linux
        distribution.

        For details, see :func:`distro.info`.
        )rGrHr)rrr
rr)�dictrrrrrrr)r*r
rrrrr�s
zLinuxDistribution.infocCs|jS)z�
        Return a dictionary containing key-value pairs for the information
        items from the os-release file data source of the Linux distribution.

        For details, see :func:`distro.os_release_info`.
        )r%)r*rrrr
sz!LinuxDistribution.os_release_infocCs|jS)z�
        Return a dictionary containing key-value pairs for the information
        items from the lsb_release command data source of the Linux
        distribution.

        For details, see :func:`distro.lsb_release_info`.
        )r')r*rrrrsz"LinuxDistribution.lsb_release_infocCs|jS)z�
        Return a dictionary containing key-value pairs for the information
        items from the distro release file data source of the Linux
        distribution.

        For details, see :func:`distro.distro_release_info`.
        )r))r*rrrr sz%LinuxDistribution.distro_release_infocCs|jj|d�S)z�
        Return a single named information item from the os-release file data
        source of the Linux distribution.

        For details, see :func:`distro.os_release_attr`.
        r)r%r2)r*rrrrr*sz!LinuxDistribution.os_release_attrcCs|jj|d�S)z�
        Return a single named information item from the lsb_release command
        output data source of the Linux distribution.

        For details, see :func:`distro.lsb_release_attr`.
        r)r'r2)r*rrrrr3sz"LinuxDistribution.lsb_release_attrcCs|jj|d�S)z�
        Return a single named information item from the distro release file
        data source of the Linux distribution.

        For details, see :func:`distro.distro_release_attr`.
        r)r)r2)r*rrrrr<sz%LinuxDistribution.distro_release_attrc	Cs.tjj|j�r*t|j��}|j|�SQRXiS)z�
        Get the information items from the specified os-release file.

        Returns:
            A dictionary containing all information items.
        N)rr�isfiler"�open�_parse_os_release_content)r*Zrelease_filerrrr$Esz&LinuxDistribution._get_os_release_infocCs�i}tj|dd�}d|_tjddkr@t|jt�r@|jjd�|_t|�}x�|D]�}d|krN|j	dd�\}}t|t�r~|jd�}|||j
�<|d	kr�tjd
|�}|r�|j
�}|jd�}|jd�}|j�}||d
<q�d|d
<qNqNW|S)aD
        Parse the lines of an os-release file.

        Parameters:

        * lines: Iterable through the lines in the os-release file.
                 Each line must be a unicode string or a UTF-8 encoded byte
                 string.

        Returns:
            A dictionary containing all information items.
        T)�posixrrJz
iso-8859-1�=rIzutf-8�VERSIONz(\(\D+\))|,(\s+)?\D+z()�,rr)�shlexZwhitespace_split�sys�version_info�
isinstanceZ	wordchars�bytes�decode�list�splitr0rB�search�group�strip)�lines�propsZlexer�tokens�token�krArrrrrNQs.	






z+LinuxDistribution._parse_os_release_contentcCs�d}tj|dtjtjd�}|j�\}}|jd�|jd�}}|j}|dkr\|j�}|j|�S|dkrhiStj	dd�d
kr�tj
||||��n@tj	dd�dkr�tj
|||��ntj	dd�dkr�tj
||��dS)z�
        Get the information items from the lsb_release command output.

        Returns:
            A dictionary containing all information items.
        zlsb_release -aT)�shell�stdout�stderrzutf-8r�NrJ����)rgrh)rJri)rJrj)�
subprocess�Popen�PIPEZcommunicaterX�
returncode�
splitlines�_parse_lsb_release_contentrTrUZCalledProcessError)r*�cmdZprocessrdre�codeZcontentrrrr&�s(

z'LinuxDistribution._get_lsb_release_infocCsti}xj|D]b}t|t�r"|jd�n|}|jd�jdd�}t|�dkrFq
|\}}|j|jdd�j�|j�i�q
W|S)aM
        Parse the output of the lsb_release command.

        Parameters:

        * lines: Iterable through the lines of the lsb_release output.
                 Each line must be a unicode string or a UTF-8 encoded byte
                 string.

        Returns:
            A dictionary containing all information items.
        zutf-8�
�:rIrJr.r/)	rVrWrXr]rZ�len�updater1r0)r^r_�lineZkvrbrArrrrp�s
"z,LinuxDistribution._parse_lsb_release_contentcCs�|jr@|j|j�}tjj|j�}tj|�}|r<|jd�|d<|Stjt	�}|j
�x\|D]T}|tkrfqXtj|�}|rXtjjt	|�}|j|�}d|krX||_|jd�|d<|SqXWiSdS)z�
        Get the information items from the specified distro release file.

        Returns:
            A dictionary containing all information items.
        rIrr	N)
r#�_parse_distro_release_filerr�basename� _DISTRO_RELEASE_BASENAME_PATTERNrDr\�listdirr �sort� _DISTRO_RELEASE_IGNORE_BASENAMESr)r*�distro_inforyrDZ	basenames�filepathrrrr(�s,




z*LinuxDistribution._get_distro_release_infoc	Cs.tjj|�r*t|��}|j|j��SQRXiS)z�
        Parse a distro release file.

        Parameters:

        * filepath: Path name of the distro release file.

        Returns:
            A dictionary containing all information items.
        N)rrrLrMr?�readline)r*r�fprrrrx�s
z,LinuxDistribution._parse_distro_release_filecCs�t|t�r|jd�}tj|j�ddd	��}i}|r�|jd�ddd
�|d<|jd�rn|jd�ddd�|d<|jd�r�|jd�ddd�|d<n|r�|j�|d<|S)
a
        Parse a line from a distro release file.

        Parameters:
        * line: Line from the distro release file. Must be a unicode string
                or a UTF-8 encoded byte string.

        Returns:
            A dictionary containing all information items.
        zutf-8NrIrgr	rJr<r���r�r�r�)rVrWrX�(_DISTRO_RELEASE_CONTENT_REVERSED_PATTERNrDr]r\)rwrFr~rrrr?�s



z/LinuxDistribution._parse_distro_release_contentN)Trr)T)F)FF)F)F)F)F)FF)�__name__�
__module__�__qualname__�__doc__r+r-rrr	rr
rrrrrrrrrrrrr$�staticmethodrNr&rpr(rxr?rrrrrs:
@


!




	

			<)rcCs�ddl}tjt�}|jtj�|jtjtj	��|j
dd�}|jddddd�|j�}|j
rv|jt
jt�d	d
d��nB|jdtd
d
��td
d
�}|r�|jd|�t�}|r�|jd|�dS)NrzLinux distro info tool)r;z--jsonz-jz!Output in machine readable format�
store_true)�help�action�T)�indentZ	sort_keyszName: %s)r
zVersion: %szCodename: %s)�argparse�loggingZ	getLoggerr�ZsetLevel�DEBUGZ
addHandlerZ
StreamHandlerrTrd�ArgumentParser�add_argument�
parse_args�jsonr�dumpsr	rr)r�Zlogger�parser�argsZdistribution_versionZdistribution_codenamerrr�mains(

r��__main__)T)F)FF)F)F)F)F)FF)+r�rrBrTr�rSr�rk�platform�
startswith�ImportErrorr,r r!r7r8r9rCr�rzr}rrr	rr
rrrrrrrrrrrr�objectrrr�r�rrrr�<module>sd	

L
'
,





,



site-packages/pip/_vendor/__pycache__/ipaddress.cpython-36.pyc

A fast, lightweight IPv4/IPv6 manipulation library in Python.

This library is used to create/poke/manipulate IPv4 and IPv6 addresses
and networks.
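A few typical calls (an assumed sketch of basic usage; the addresses are
illustrative, and text arguments must be unicode strings on Python 2)::

    import ipaddress

    addr = ipaddress.ip_address(u'192.0.2.1')        # IPv4Address('192.0.2.1')
    net = ipaddress.ip_network(u'192.0.2.0/24')      # IPv4Network('192.0.2.0/24')
    iface = ipaddress.ip_interface(u'192.0.2.1/24')  # IPv4Interface('192.0.2.1/24')

    print(addr in net)        # True
    print(net.num_addresses)  # 256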


AddressValueError: A Value Error related to the address.
NetmaskValueError: A Value Error related to the netmask.


    Args:
        address: A string or integer, the IP address.  Either IPv4 or
          IPv6 addresses may be supplied; integers less than 2**32 will
          be considered to be IPv4 by default.

    Returns:
        An IPv4Address or IPv6Address object.

    Raises:
        ValueError: if the *address* passed isn't either a v4 or a v6
          address

    zx%r does not appear to be an IPv4 or IPv6 address. Did you pass in a bytes (str in Python 2) instead of a unicode object?z0%r does not appear to be an IPv4 or IPv6 addressN)�IPv4Addressr9r;�IPv6Addressr�bytes�
ValueError)�addressrrr�
ip_address�s
rATcCsny
t||�Sttfk
r"YnXy
t||�Sttfk
rFYnXt|t�r^td|��td|��dS)a�Take an IP string/int and return an object of the correct type.

    Args:
        address: A string or integer, the IP network.  Either IPv4 or
          IPv6 networks may be supplied; integers less than 2**32 will
          be considered to be IPv4 by default.

    Returns:
        An IPv4Network or IPv6Network object.

    Raises:
        ValueError: if the string passed isn't either a v4 or a v6
          address. Or if the network has host bits set.

    zx%r does not appear to be an IPv4 or IPv6 network. Did you pass in a bytes (str in Python 2) instead of a unicode object?z0%r does not appear to be an IPv4 or IPv6 networkN)�IPv4Networkr9r;�IPv6Networkrr>r?)r@�strictrrr�
ip_network�s


rEcCsTyt|�Sttfk
r YnXyt|�Sttfk
rBYnXtd|��dS)agTake an IP string/int and return an object of the correct type.

    Args:
        address: A string or integer, the IP address.  Either IPv4 or
          IPv6 addresses may be supplied; integers less than 2**32 will
          be considered to be IPv4 by default.

    Returns:
        An IPv4Interface or IPv6Interface object.

    Raises:
        ValueError: if the string passed isn't either a v4 or a v6
          address.

    Notes:
        The IPv?Interface classes describe an Address on a particular
        Network, so they're basically a combination of both the Address
        and Network classes.

    z2%r does not appear to be an IPv4 or IPv6 interfaceN)�
IPv4Interfacer9r;�
IPv6Interfacer?)r@rrr�ip_interface�srHcCs4yt|dd�Stjtfk
r.td��YnXdS)a`Represent an address as 4 packed bytes in network (big-endian) order.

    Args:
        address: An integer representation of an IPv4 IP address.

    Returns:
        The integer address packed as 4 bytes in network (big-endian) order.

    Raises:
        ValueError: If the integer is negative or too large to be an
          IPv4 IP address.

    rr
z&Address negative or too large for IPv4N)rrr�
OverflowErrorr?)r@rrr�v4_int_to_packed�srJcCs4yt|dd�Stjtfk
r.td��YnXdS)z�Represent an address as 16 packed bytes in network (big-endian) order.

    Args:
        address: An integer representation of an IPv6 IP address.

    Returns:
        The integer address packed as 16 bytes in network (big-endian) order.

    rr
z&Address negative or too large for IPv6N)rrrrIr?)r@rrr�v6_int_to_packeds
rKcCs*t|�jd�}t|�dkr&td|��|S)zAHelper to split the netmask and raise AddressValueError if needed�/rzOnly one '/' permitted in %r)�_compat_str�split�lenr9)r@�addrrrr�_split_optional_netmasksrQccsRt|�}t|�}}x.|D]&}|j|jdkr<||fV|}|}qW||fVdS)z�Find a sequence of sorted deduplicated IPv#Address.

    Args:
        addresses: a list of IPv#Address objects.

    Yields:
        A tuple containing the first and last IP addresses in the sequence.

    r$N)�iter�next�_ip)�	addresses�it�first�last�iprrr�_find_address_ranges


rZcCs$|dkr|St|t||d@��S)z�Count the number of zero bits on the right hand side.

    Args:
        number: an integer.
        bits: maximum number of bits to count.

    Returns:
        The number of zero bits on the right hand side of the number.

    rr$)�minr!)Znumber�bitsrrr�_count_righthand_zero_bits0sr]ccs�t|t�ot|t�std��|j|jkr8td||f��||krHtd��|jdkrXt}n|jdkrht}ntd��|j}|j}|j}x^||kr�t	t
||�t||d�d�}||||f�}|V|d|>7}|d|jkr�Pq�WdS)	a�Summarize a network range given the first and last IP addresses.

    Example:
        >>> list(summarize_address_range(IPv4Address('192.0.2.0'),
        ...                              IPv4Address('192.0.2.130')))
        ...                                #doctest: +NORMALIZE_WHITESPACE
        [IPv4Network('192.0.2.0/25'), IPv4Network('192.0.2.128/31'),
         IPv4Network('192.0.2.130/32')]

    Args:
        first: the first IPv4Address or IPv6Address in the range.
        last: the last IPv4Address or IPv6Address in the range.

    Returns:
        An iterator of the summarized IPv(4|6) network objects.

    Raise:
        TypeError:
            If the first and last objects are not IP addresses.
            If the first and last objects are not the same version.
        ValueError:
            If the last object is not greater than the first.
            If the version of the first address is not 4 or 6.

    z1first and last must be IP addresses, not networksz%%s and %s are not of the same versionz*last IP address must be greater than firstr�zunknown IP versionr$N)
r�_BaseAddress�	TypeError�versionr?rBrC�_max_prefixlenrTr[r]r!�	_ALL_ONES)rWrXrYZip_bitsZ	first_intZlast_intZnbits�netrrr�summarize_address_range@s0





reccs�t|�}i}xL|rX|j�}|j�}|j|�}|dkr>|||<q||kr||=|j|�qWd}x4t|j��D]$}|dk	r�|j|jkr�ql|V|}qlWdS)auLoops through the addresses, collapsing concurrent netblocks.

    Example:

        ip1 = IPv4Network('192.0.2.0/26')
        ip2 = IPv4Network('192.0.2.64/26')
        ip3 = IPv4Network('192.0.2.128/26')
        ip4 = IPv4Network('192.0.2.192/26')

        _collapse_addresses_internal([ip1, ip2, ip3, ip4]) ->
          [IPv4Network('192.0.2.0/24')]

        This shouldn't be called directly; it is called via
          collapse_addresses([]).

    Args:
        addresses: A list of IPv4Network's or IPv6Network's

    Returns:
        A list of IPv4Network's or IPv6Network's depending on what we were
        passed.

    N)�list�pop�supernet�get�append�sorted�values�broadcast_address)rUZto_merge�subnetsrdrhZexistingrXrrr�_collapse_addresses_internalws$

rocCs8g}g}g}x�|D]�}t|t�rT|rH|dj|jkrHtd||df��|j|�q|j|jkr�|r�|dj|jkr�td||df��y|j|j�Wq�tk
r�|j|j	�Yq�Xq|r�|dj|jkr�td||df��|j|�qWt
t|��}|�r,x&t|�D]\}}|j
t||���qWt||�S)	a�Collapse a list of IP objects.

    Example:
        collapse_addresses([IPv4Network('192.0.2.0/25'),
                            IPv4Network('192.0.2.128/25')]) ->
                           [IPv4Network('192.0.2.0/24')]

    Args:
        addresses: An iterator of IPv4Network or IPv6Network objects.

    Returns:
        An iterator of the collapsed IPv(4|6)Network objects.

    Raises:
        TypeError: If passed a list of mixed version objects.

    r$z%%s and %s are not of the same version���rprprprprp)rr_�_versionr`rj�
_prefixlenrbrY�AttributeError�network_addressrk�setrZ�extendrero)rUZaddrsZipsZnetsrYrWrXrrr�collapse_addresses�s4

rwcCs(t|t�r|j�St|t�r$|j�StS)a2Return a key suitable for sorting between networks and addresses.

    Address and Network objects are not sortable by default; they're
    fundamentally different so the expression

        IPv4Address('192.0.2.0') <= IPv4Network('192.0.2.0/24')

    doesn't make any sense.  There are some times however, where you may wish
    to have ipaddress sort these for you anyway. If you need to do this, you
    can use this function as the key= argument to sorted().

    Args:
      obj: either a Network or Address object.
    Returns:
      appropriate key.

    )r�_BaseNetwork�_get_networks_keyr_�_get_address_keyr-)�objrrr�get_mixed_type_key�s


r|c@s�eZdZdZfZedd��Zedd��Zedd��Zedd	��Z	d
d�Z
dd
�Zedd��Z
edd��Zedd��Zedd��Zedd��Zdd�ZdS)�_IPAddressBasezThe mother class.cCs|j�S)z:Return the longhand version of the IP address as a string.)�_explode_shorthand_ip_string)r*rrr�exploded�sz_IPAddressBase.explodedcCst|�S)z;Return the shorthand version of the IP address as a string.)rM)r*rrr�
compressedsz_IPAddressBase.compressedcCs|j�S)aIThe name of the reverse DNS pointer for the IP address, e.g.:
            >>> ipaddress.ip_address("127.0.0.1").reverse_pointer
            '1.0.0.127.in-addr.arpa'
            >>> ipaddress.ip_address("2001:db8::1").reverse_pointer
            '1.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.8.b.d.0.1.0.0.2.ip6.arpa'

        )�_reverse_pointer)r*rrr�reverse_pointers	z_IPAddressBase.reverse_pointercCsdt|�f}t|��dS)Nz%200s has no version specified)�typer)r*�msgrrrrasz_IPAddressBase.versioncCsF|dkrd}t|||jf��||jkrBd}t|||j|jf��dS)Nrz-%d (< 0) is not permitted as an IPv%d addressz2%d (>= 2**%d) is not permitted as an IPv%d address)r9rqrcrb)r*r@r�rrr�_check_int_addresss

z!_IPAddressBase._check_int_addresscCs.t|�}||kr*d}t|||||jf��dS)Nz~%r (len %d != %d) is not permitted as an IPv%d address. Did you pass in a bytes (str in Python 2) instead of a unicode object?)rOr9rq)r*r@Zexpected_lenZaddress_lenr�rrr�_check_packed_address s
z$_IPAddressBase._check_packed_addresscCs|j|j|?AS)z�Turn the prefix length into a bitwise netmask

        Args:
            prefixlen: An integer, the prefix length.

        Returns:
            An integer.

        )rc)�cls�	prefixlenrrr�_ip_int_from_prefix+sz"_IPAddressBase._ip_int_from_prefixc	Cs\t||j�}|j|}||?}d|>d}||krX|jd}t||d�}d}t||��|S)aReturn prefix length from the bitwise netmask.

        Args:
            ip_int: An integer, the netmask in expanded bitwise format

        Returns:
            An integer, the prefix length.

        Raises:
            ValueError: If the input intermingles zeroes & ones
        r$rr
z&Netmask pattern %r mixes zeroes & ones)r]rbrr?)	r��ip_intZtrailing_zeroesr�Zleading_onesZall_onesZbyteslenZdetailsr�rrr�_prefix_from_ip_int8s


z"_IPAddressBase._prefix_from_ip_intcCsd|}t|��dS)Nz%r is not a valid netmask)r;)r�Znetmask_strr�rrr�_report_invalid_netmaskQsz&_IPAddressBase._report_invalid_netmaskcCsjtjj|�s|j|�yt|�}Wntk
r@|j|�YnXd|koV|jknsf|j|�|S)a	Return prefix length from a numeric string

        Args:
            prefixlen_str: The string to be converted

        Returns:
            An integer, the prefix length.

        Raises:
            NetmaskValueError: If the input is not a valid netmask
        r)�_BaseV4�_DECIMAL_DIGITS�
issupersetr��intr?rb)r�Z
prefixlen_strr�rrr�_prefix_from_prefix_stringVs

z)_IPAddressBase._prefix_from_prefix_stringcCs�y|j|�}Wntk
r,|j|�YnXy
|j|�Stk
rLYnX||jN}y
|j|�Stk
r�|j|�YnXdS)aTurn a netmask/hostmask string into a prefix length

        Args:
            ip_str: The netmask/hostmask to be converted

        Returns:
            An integer, the prefix length.

        Raises:
            NetmaskValueError: If the input is not a valid netmask/hostmask
        N)�_ip_int_from_stringr9r�r�r?rc)r��ip_strr�rrr�_prefix_from_ip_stringos


z%_IPAddressBase._prefix_from_ip_stringcCs|jt|�ffS)N)�	__class__rM)r*rrr�
    def __reduce__(self):
        return self.__class__, (str(self),)


class _BaseAddress(_IPAddressBase):
    """A generic IP object.

    This IP class contains the version independent methods which are
    used by single IP addresses.
    """
    # (The compiled bodies of __int__, __eq__, __lt__, __add__, __sub__,
    #  __repr__, __str__, __hash__, _get_address_key and __reduce__ are not
    #  recoverable from this bytecode dump.)


class _BaseNetwork(_IPAddressBase):
    """A generic IP network object.

    This IP class contains the version independent methods which are
    used by networks.

    """
    # (__init__, __repr__ and __str__ bodies are not recoverable from the dump.)

    def hosts(self):
        """Generate Iterator over usable hosts in a network.

        This is like __iter__ except it doesn't return the network
        or broadcast addresses.

        """
    # (The compiled bodies of hosts(), __iter__(), __getitem__(), __lt__(),
    #  __eq__(), __hash__(), __contains__(), overlaps() -- "Tell if self is
    #  partly contained in other." -- and the broadcast_address, hostmask,
    #  with_prefixlen, with_netmask, with_hostmask, num_addresses ("Number of
    #  hosts in the current subnet."), _address_class and prefixlen properties
    #  are not recoverable from this bytecode dump.)

    def address_exclude(self, other):
        """Remove an address from a larger block.

        For example:

            addr1 = ip_network('192.0.2.0/28')
            addr2 = ip_network('192.0.2.1/32')
            list(addr1.address_exclude(addr2)) =
                [IPv4Network('192.0.2.0/32'), IPv4Network('192.0.2.2/31'),
                 IPv4Network('192.0.2.4/30'), IPv4Network('192.0.2.8/29')]

        or IPv6:

            addr1 = ip_network('2001:db8::1/32')
            addr2 = ip_network('2001:db8::1/128')
            list(addr1.address_exclude(addr2)) =
                [ip_network('2001:db8::1/128'),
                 ip_network('2001:db8::2/127'),
                 ip_network('2001:db8::4/126'),
                 ip_network('2001:db8::8/125'),
                 ...
                 ip_network('2001:db8:8000::/33')]

        Args:
            other: An IPv4Network or IPv6Network object of the same type.

        Returns:
            An iterator of the IPv(4|6)Network objects which is self
            minus other.

        Raises:
            TypeError: If self and other are of differing address
              versions, or if other is not a network object.
            ValueError: If other is not completely contained by self.

        """
        # (compiled body not recoverable from the bytecode dump)
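
    # Editor's note: hedged sketch, not part of the original module; it mirrors
    # the docstring example above using the public ipaddress API.
    #     >>> addr1 = ipaddress.ip_network('192.0.2.0/28')
    #     >>> addr2 = ipaddress.ip_network('192.0.2.1/32')
    #     >>> list(addr1.address_exclude(addr2))
    #     [IPv4Network('192.0.2.0/32'), IPv4Network('192.0.2.2/31'),
    #      IPv4Network('192.0.2.4/30'), IPv4Network('192.0.2.8/29')]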





    def compare_networks(self, other):
        """Compare two IP objects.

        This is only concerned about the comparison of the integer
        representation of the network addresses.  This means that the
        host bits aren't considered at all in this method.  If you want
        to compare host bits, you can easily enough do a
        'HostA._ip < HostB._ip'

        Args:
            other: An IP object.

        Returns:
            If the IP versions of self and other are the same, returns:

            -1 if self < other:
              eg: IPv4Network('192.0.2.0/25') < IPv4Network('192.0.2.128/25')
              IPv6Network('2001:db8::1000/124') <
                  IPv6Network('2001:db8::2000/124')
            0 if self == other
              eg: IPv4Network('192.0.2.0/24') == IPv4Network('192.0.2.0/24')
              IPv6Network('2001:db8::1000/124') ==
                  IPv6Network('2001:db8::1000/124')
            1 if self > other
              eg: IPv4Network('192.0.2.128/25') > IPv4Network('192.0.2.0/25')
                  IPv6Network('2001:db8::2000/124') >
                      IPv6Network('2001:db8::1000/124')

          Raises:
              TypeError if the IP versions are different.

        """
        # (compiled body not recoverable from the bytecode dump)

    def _get_networks_key(self):
        """Network-only key function.

        Returns an object that identifies this address' network and
        netmask. This function is a suitable "key" argument for sorted()
        and list.sort().

        """
        return (self._version, self.network_address, self.netmask)

    def subnets(self, prefixlen_diff=1, new_prefix=None):
        """The subnets which join to make the current subnet.

        In the case that self contains only one IP
        (self._prefixlen == 32 for IPv4 or self._prefixlen == 128
        for IPv6), yield an iterator with just ourself.

        Args:
            prefixlen_diff: An integer, the amount the prefix length
              should be increased by. This should not be set if
              new_prefix is also set.
            new_prefix: The desired new prefix length. This must be a
              larger number (smaller prefix) than the existing prefix.
              This should not be set if prefixlen_diff is also set.

        Returns:
            An iterator of IPv(4|6) objects.

        Raises:
            ValueError: The prefixlen_diff is too small or too large.
                OR
            prefixlen_diff and new_prefix are both set or new_prefix
              is a smaller number than the current prefix (smaller
              number means a larger network)

        """
        # (compiled body not recoverable; the embedded error messages are
        #  "new prefix must be longer", "cannot set prefixlen_diff and
        #  new_prefix", "prefix length diff must be > 0" and
        #  "prefix length diff %d is invalid for netblock %s")
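
    # Editor's note: hedged sketch, not part of the original module; it
    # illustrates subnets() via the public API:
    #     >>> net = ipaddress.ip_network('192.0.2.0/24')
    #     >>> list(net.subnets(prefixlen_diff=2))
    #     [IPv4Network('192.0.2.0/26'), IPv4Network('192.0.2.64/26'),
    #      IPv4Network('192.0.2.128/26'), IPv4Network('192.0.2.192/26')]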




    def supernet(self, prefixlen_diff=1, new_prefix=None):
        """The supernet containing the current network.

        Args:
            prefixlen_diff: An integer, the amount the prefix length of
              the network should be decreased by.  For example, given a
              /24 network and a prefixlen_diff of 3, a supernet with a
              /21 netmask is returned.

        Returns:
            An IPv4 network object.

        Raises:
            ValueError: If self.prefixlen - prefixlen_diff < 0. I.e., you have
              a negative prefix length.
                OR
            If prefixlen_diff and new_prefix are both set or new_prefix is a
              larger number than the current prefix (larger number means a
              smaller network)

        """
        # (compiled body not recoverable; the embedded error messages are
        #  "new prefix must be shorter", "cannot set prefixlen_diff and
        #  new_prefix" and "current prefixlen is %d, cannot have a
        #  prefixlen_diff of %d")
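
    # Editor's note: hedged sketch, not part of the original module; it
    # illustrates supernet() via the public API:
    #     >>> ipaddress.ip_network('192.0.2.0/24').supernet(prefixlen_diff=3)
    #     IPv4Network('192.0.0.0/21')
    #     >>> ipaddress.ip_network('192.0.2.0/24').supernet(new_prefix=16)
    #     IPv4Network('192.0.0.0/16')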



    @property
    def is_multicast(self):
        """Test if the address is reserved for multicast use.

        Returns:
            A boolean, True if the address is a multicast address.
            See RFC 2373 2.7 for details.

        """
        return (self.network_address.is_multicast and
                self.broadcast_address.is_multicast)

    def subnet_of(self, other):
        # (compiled body not recoverable; per the embedded strings it returns
        #  False for differing versions and raises TypeError("Unable to test
        #  subnet containment with element of type %s") for non-networks)
        ...

    def supernet_of(self, other):
        # (compiled body not recoverable; mirror of subnet_of with the roles of
        #  self and other swapped)
        ...

    @property
    def is_reserved(self):
        """Test if the address is otherwise IETF reserved.

        Returns:
            A boolean, True if the address is within one of the
            reserved IPv6 Network ranges.

        """
        return (self.network_address.is_reserved and
                self.broadcast_address.is_reserved)

    @property
    def is_link_local(self):
        """Test if the address is reserved for link-local.

        Returns:
            A boolean, True if the address is reserved per RFC 4291.

        """
        return (self.network_address.is_link_local and
                self.broadcast_address.is_link_local)

    @property
    def is_private(self):
        """Test if this address is allocated for private networks.

        Returns:
            A boolean, True if the address is reserved per
            iana-ipv4-special-registry or iana-ipv6-special-registry.

        """
        return (self.network_address.is_private and
                self.broadcast_address.is_private)

    @property
    def is_global(self):
        """Test if this address is allocated for public networks.

        Returns:
            A boolean, True if the address is not reserved per
            iana-ipv4-special-registry or iana-ipv6-special-registry.

        """
        return not self.is_private

    @property
    def is_unspecified(self):
        """Test if the address is unspecified.

        Returns:
            A boolean, True if this is the unspecified address as defined in
            RFC 2373 2.5.2.

        """
        return (self.network_address.is_unspecified and
                self.broadcast_address.is_unspecified)

    @property
    def is_loopback(self):
        """Test if the address is a loopback address.

        Returns:
            A boolean, True if the address is a loopback address as defined in
            RFC 2373 2.5.3.

        """
        return (self.network_address.is_loopback and
                self.broadcast_address.is_loopback)


class _BaseV4(object):
    """Base IPv4 object.

    The following methods are used by IPv4 objects in both single IP
    addresses and networks.

    """
    # (Class constants -- the version number, _ALL_ONES, _DECIMAL_DIGITS and
    #  _valid_mask_octets -- and _explode_shorthand_ip_string() are compiled
    #  bytecode and are not recoverable from this dump.)

    @classmethod
    def _make_netmask(cls, arg):
        """Make a (netmask, prefix_len) tuple from the given argument.

        Argument can be:
        - an integer (the prefix length)
        - a string representing the prefix length (e.g. "24")
        - a string representing the prefix netmask (e.g. "255.255.255.0")
        )�_netmask_cacherrr�r;r�r<r�)r��argr�r�rrr�
_make_netmask�s	

z_BaseV4._make_netmaskcCsx|std��|jd�}t|�dkr.td|��ytt|j|�d�Stk
rr}ztd||f��WYdd}~XnXdS)aTurn the given IP string into an integer for comparison.

        Args:
            ip_str: A string, the IP ip_str.

        Returns:
            The IP ip_str as an integer.

        Raises:
            AddressValueError: if ip_str isn't a valid IPv4 Address.

        zAddress cannot be empty�.rzExpected 4 octets in %rr
z%s in %rN)r9rNrOr�map�_parse_octetr?)r�r�Zoctets�excrrrr��s
z_BaseV4._ip_int_from_stringcCs�|std��|jj|�s(d}t||��t|�dkrDd}t||��t|d�}|dkrr|ddkrrd	}t||��|d
kr�td|��|S)aConvert a decimal octet into an integer.

        Args:
            octet_str: A string, the number to parse.

        Returns:
            The octet as an integer.

        Raises:
            ValueError: if the octet isn't strictly a decimal from [0..255].

        zEmpty octet not permittedz#Only decimal digits permitted in %r�z$At most 3 characters permitted in %r�
�r�0z3Ambiguous (octal/decimal) value in %r not permittedr�zOctet %d (> 255) not permitted)r?r�r�rOr�)r�Z	octet_strr�Z	octet_intrrrr��s
z_BaseV4._parse_octetcCsdjdd�t|dd�D��S)z�Turns a 32-bit integer into dotted decimal notation.

        Args:
            ip_int: An integer, the IP address.

        Returns:
            The IP address as a string in dotted decimal notation.

        r�css0|](}tt|t�r"tjd|�dn|�VqdS)s!BrN)rMrr>rr	)r
rrrr�	<genexpr>-sz._BaseV4._string_from_ip_int.<locals>.<genexpr>rr
)�joinr)r�r�rrrr�"s
z_BaseV4._string_from_ip_intcsh|jd�}y�fdd�tt|�D�}Wntk
r:dSXt|�t|�krPdS|d|dkrddSdS)	z�Test if the IP string is a hostmask (rather than a netmask).

        Args:
            ip_str: A string, the potential hostmask.

        Returns:
            A boolean, True if the IP string is a hostmask.

        r�csg|]}|�jkr|�qSr)�_valid_mask_octets)r
r�)r*rrr>sz(_BaseV4._is_hostmask.<locals>.<listcomp>Frr$Trp)rNr�r�r?rO)r*r�r\�partsr)r*r�_is_hostmask2s

z_BaseV4._is_hostmaskcCs&t|�jd�ddd�}dj|�dS)z�Return the reverse DNS pointer name for the IPv4 address.

        This implements the method described in RFC1035 3.5.

        r�Nr$z
.in-addr.arparp)rMrNr�)r*Zreverse_octetsrrrr�Gsz_BaseV4._reverse_pointercCs|jS)N)rb)r*rrr�
max_prefixlenPsz_BaseV4.max_prefixlencCs|jS)N)rq)r*rrrraTsz_BaseV4.versionN)r5r6r7r:r8rq�
IPV4LENGTHrc�	frozensetr�r�rbr�r~r�r�r�r�r�r�r�r�r�rarrrrr��s"%	r�c@s|eZdZdZdZdd�Zedd��Zedd	��Zed
d��Z	edd
��Z
edd��Zedd��Zedd��Z
edd��ZdS)r<z/Represent and manipulate single IPv4 Addresses.rT�__weakref__cCsxt|t�r|j|�||_dSt|t�rL|j|d�t|�}t|d�|_dSt|�}d|krht	d|��|j
|�|_dS)a�
        Args:
            address: A string or integer representing the IP

              Additionally, an integer can be passed, so
              IPv4Address('192.0.2.1') == IPv4Address(3221225985).
              or, more generally
              IPv4Address(int(IPv4Address('192.0.2.1'))) ==
                IPv4Address('192.0.2.1')

        Raises:
            AddressValueError: If ipaddress isn't a valid IPv4 address.

        Nrr
rLzUnexpected '/' in %r)rrr�rTr>r�rrrMr9r�)r*r@�bvs�addr_strrrrr�_s


zIPv4Address.__init__cCs
t|j�S)z*The binary representation of this address.)rJrT)r*rrr�packed�szIPv4Address.packedcCs||jjkS)z�Test if the address is otherwise IETF reserved.

         Returns:
             A boolean, True if the address is within the
             reserved IPv4 Network range.

        )�
_constants�_reserved_network)r*rrrr��s	zIPv4Address.is_reservedcst�fdd��jjD��S)z�Test if this address is allocated for private networks.

        Returns:
            A boolean, True if the address is reserved per
            iana-ipv4-special-registry.

        c3s|]}�|kVqdS)Nr)r
rd)r*rrr��sz)IPv4Address.is_private.<locals>.<genexpr>)�anyr��_private_networks)r*r)r*rr��s	zIPv4Address.is_privatecCs||jjko|jS)N)r��_public_networkr�)r*rrrr��szIPv4Address.is_globalcCs||jjkS)z�Test if the address is reserved for multicast use.

        Returns:
            A boolean, True if the address is multicast.
            See RFC 3171 for details.

        )r��_multicast_network)r*rrrr��s	zIPv4Address.is_multicastcCs||jjkS)z�Test if the address is unspecified.

        Returns:
            A boolean, True if this is the unspecified address as defined in
            RFC 5735 3.

        )r��_unspecified_address)r*rrrr��s	zIPv4Address.is_unspecifiedcCs||jjkS)z�Test if the address is a loopback address.

        Returns:
            A boolean, True if the address is a loopback per RFC 3330.

        )r��_loopback_network)r*rrrr��szIPv4Address.is_loopbackcCs||jjkS)z�Test if the address is reserved for link-local.

        Returns:
            A boolean, True if the address is link-local per RFC 3927.

        )r��_linklocal_network)r*rrrr��szIPv4Address.is_link_localN)rTr�)r5r6r7r:r8r�r�r�r�r�r�r�r�r�r�rrrrr<Ys$
r<c@sjeZdZdd�Zdd�Zdd�Zdd�Zd	d
�Zej	Z	e
dd��Ze
d
d��Ze
dd��Z
e
dd��ZdS)rFcCs�t|ttf�r2tj||�t|j�|_|j|_	dSt|t
�r�tj||d�t|�dkrht|d�|_	n|j|_	t|dd�|_|jj
|_
|jj|_dSt|�}tj||d�t|dd�|_|jj	|_	|jj
|_
|jj|_dS)Nrr$F)rD)rr>rr<r�rBrTr�rbrr�tuplerOr�r�r�rQ)r*r@rPrrrr��s(




zIPv4Interface.__init__cCsd|j|j�|jjfS)Nz%s/%d)r�rTr�r�)r*rrrr��szIPv4Interface.__str__cCsDtj||�}|s|tkr|Sy|j|jkStk
r>dSXdS)NF)r<r,r-r�rs)r*r+�
address_equalrrrr,�szIPv4Interface.__eq__cCs>tj||�}|tkrtSy|j|jkStk
r8dSXdS)NF)r<r0r-r�rs)r*r+�address_lessrrrr0�szIPv4Interface.__lt__cCs|j|jAt|jj�AS)N)rTrrr�r�rt)r*rrrr�szIPv4Interface.__hash__cCs
t|j�S)N)r<rT)r*rrrrY
szIPv4Interface.ipcCsd|j|j�|jfS)Nz%s/%s)r�rTrr)r*rrrr�szIPv4Interface.with_prefixlencCsd|j|j�|jfS)Nz%s/%s)r�rTr�)r*rrrr�szIPv4Interface.with_netmaskcCsd|j|j�|jfS)Nz%s/%s)r�rTr�)r*rrrr�szIPv4Interface.with_hostmaskN)r5r6r7r�r�r,r0r�r}r�r�rYr�r�r�rrrrrF�srFc@s*eZdZdZeZddd�Zedd��ZdS)	rBaeThis class represents and manipulates 32-bit IPv4 network + addresses..

    Attributes: [examples for IPv4Network('192.0.2.0/27')]
        .network_address: IPv4Address('192.0.2.0')
        .hostmask: IPv4Address('0.0.0.31')
        .broadcast_address: IPv4Address('192.0.2.32')
        .netmask: IPv4Address('255.255.255.224')
        .prefixlen: 27

    TcCs|tj||�t|ttf�r<t|�|_|j|j�\|_	|_
dSt|t�r�t|�dkr\|d}n|j}t|d�|_|j|�\|_	|_
t
|j�}|t
|j	�@|kr�|r�td|��nt|t
|j	�@�|_dSt|�}t|j|d��|_t|�dkr�|d}n|j}|j|�\|_	|_
|�rDtt
|j�t
|j	�@�|jk�rDtd|��tt
|j�t
|j	�@�|_|j
|jdk�rx|j|_dS)aInstantiate a new IPv4 network object.

        Args:
            address: A string or integer representing the IP [& network].
              '192.0.2.0/24'
              '192.0.2.0/255.255.255.0'
              '192.0.0.2/0.0.0.255'
              are all functionally the same in IPv4. Similarly,
              '192.0.2.1'
              '192.0.2.1/255.255.255.255'
              '192.0.2.1/32'
              are also functionally equivalent. That is to say, failing to
              provide a subnetmask will create an object with a mask of /32.

              If the mask (portion after the / in the argument) is given in
              dotted quad form, it is treated as a netmask if it starts with a
              non-zero field (e.g. /255.0.0.0 == /8) and as a hostmask if it
              starts with a zero field (e.g. 0.255.255.255 == /8), with the
              single exception of an all-zero mask which is treated as a
              netmask == /0. If no mask is given, a default of /32 is used.

              Additionally, an integer can be passed, so
              IPv4Network('192.0.2.1') == IPv4Network(3221225985)
              or, more generally
              IPv4Interface(int(IPv4Interface('192.0.2.1'))) ==
                IPv4Interface('192.0.2.1')

        Raises:
            AddressValueError: If ipaddress isn't a valid IPv4 address.
            NetmaskValueError: If the netmask isn't valid for
              an IPv4 address.
            ValueError: If strict is True and a network address is not
              supplied.

        Nr$rz%s has host bits setr)rxr�rrr>r<rtr�rbr�rrr�rOr�r?rQr�r�r�)r*r@rDr�r�rPrrrr�0sB%






zIPv4Network.__init__cCs&|jtd�ko|jtd�ko$|jS)z�Test if this address is allocated for public networks.

        Returns:
            A boolean, True if the address is not reserved per
            iana-ipv4-special-registry.

        z
100.64.0.0/10)rtrBrmr�)r*rrrr��s	zIPv4Network.is_globalN)T)	r5r6r7r:r<r�r�r�r�rrrrrB!s
UrBc@s�eZdZed�Zed�Zed�Zed�Zed�ed�ed�ed�ed�ed�ed	�ed
�ed�ed�ed
�ed�ed�ed�gZed�Z	e
d�ZdS)�_IPv4Constantsz169.254.0.0/16z127.0.0.0/8z224.0.0.0/4z
100.64.0.0/10z	0.0.0.0/8z
10.0.0.0/8z
172.16.0.0/12z192.0.0.0/29z192.0.0.170/31z192.0.2.0/24z192.168.0.0/16z
198.18.0.0/15z198.51.100.0/24z203.0.113.0/24z240.0.0.0/4z255.255.255.255/32z0.0.0.0N)r5r6r7rBr�r�r�r�r�r�r<r�rrrrr��s(
r�c@s�eZdZdZfZdZdedZdZe	d�Z
eZiZe
dd��Ze
d	d
��Ze
dd��Ze
d
d��Ze
ddd��Zdd�Zdd�Zedd��Zedd��ZdS)�_BaseV6zyBase IPv6 object.

    The following methods are used by IPv6 objects in both single IP
    addresses and networks.

    r^rr$rZ0123456789ABCDEFabcdefcCsJ||jkr@t|t�r|}n
|j|�}t|j|��}||f|j|<|j|S)aMake a (netmask, prefix_len) tuple from the given argument.

        Argument can be:
        - an integer (the prefix length)
        - a string representing the prefix length (e.g. "24")
        - a string representing the prefix netmask (e.g. "255.255.255.0")
        )r�rrr�r=r�)r�r�r�r�rrrr��s	


z_BaseV6._make_netmaskcCs�|std��|jd�}d}t|�|kr:d||f}t|��d|dkr�yt|j��j}Wn2tk
r�}ztd||f��WYdd}~XnX|jd	|d
?d@�|jd	|d@�|jd}t|�|kr�d|d|f}t|��d}x@tdt|�d�D]*}	||	s�|dk	�r d
|}t|��|	}q�W|dk	�r�|}
t|�|d}|d�sn|
d8}
|
�rnd}t||��|d�s�|d8}|�r�d}t||��|j|
|}|dk�r4d}t||jd|f��njt|�|jk�r�d}t||j|f��|d�s
d}t||��|d�s$d}t||��t|�}
d}d}ytd}
x,t	|
�D] }	|
d
K}
|
|j
||	�O}
�qDW|
d
|K}
x0t	|d�D] }	|
d
K}
|
|j
||	�O}
�q�W|
Stk
�r�}ztd||f��WYdd}~XnXdS)z�Turn an IPv6 ip_str into an integer.

        Args:
            ip_str: A string, the IPv6 ip_str.

        Returns:
            An int, the IPv6 address

        Raises:
            AddressValueError: if ip_str isn't a valid IPv6 Address.

        zAddress cannot be empty�:r�z At least %d parts expected in %rr�r$z%s in %rNz%xri��z!At most %d colons permitted in %rz At most one '::' permitted in %rrz0Leading ':' only permitted as part of '::' in %rz1Trailing ':' only permitted as part of '::' in %rz/Expected at most %d other parts with '::' in %rz,Exactly %d parts expected without '::' in %rrprprp)r9rNrOr<rgrTrj�
_HEXTET_COUNTr(�range�
_parse_hextetr?)r�r�r�Z
_min_partsr�Zipv4_intr�Z
_max_partsZ
skip_indexr Zparts_hiZparts_loZ
parts_skippedr�rrrr��s�
"







z_BaseV6._ip_int_from_stringcCs>|jj|�std|��t|�dkr4d}t||��t|d�S)a&Convert an IPv6 hextet string into an integer.

        Args:
            hextet_str: A string, the number to parse.

        Returns:
            The hextet as an integer.

        Raises:
            ValueError: if the input isn't strictly a hex number from
              [0..FFFF].

        zOnly hex digits permitted in %rrz$At most 4 characters permitted in %rr)�_HEX_DIGITSr�r?rOr�)r�Z
hextet_strr�rrrr�Esz_BaseV6._parse_hextetc	Cs�d}d}d}d}xJt|�D]>\}}|dkrP|d7}|dkr>|}||krX|}|}qd}d}qW|dkr�||}|t|�kr�|dg7}dg|||�<|dkr�dg|}|S)	a�Compresses a list of hextets.

        Compresses a list of strings, replacing the longest continuous
        sequence of "0" in the list with "" and adding empty strings at
        the beginning or at the end of the string such that subsequently
        calling ":".join(hextets) will produce the compressed version of
        the IPv6 address.

        Args:
            hextets: A list of strings, the hextets to compress.

        Returns:
            A list of strings.

        r$rr��rprprprp)�	enumeraterO)	r��hextetsZbest_doublecolon_startZbest_doublecolon_lenZdoublecolon_startZdoublecolon_len�indexZhextetZbest_doublecolon_endrrr�_compress_hextets_s.

z_BaseV6._compress_hextetsNcsZ|dkrt|j�}||jkr$td��d|��fdd�tddd�D�}|j|�}d	j|�S)
a,Turns a 128-bit integer into hexadecimal notation.

        Args:
            ip_int: An integer, the IP address.

        Returns:
            A string, the hexadecimal representation of the address.

        Raises:
            ValueError: The address is bigger than 128 bits of all ones.

        NzIPv6 address is too largez%032xcs&g|]}dt�||d�d��qS)z%xrr)r�)r
r�)�hex_strrrr�sz/_BaseV6._string_from_ip_int.<locals>.<listcomp>rrrr�)r�rTrcr?r�r�r�)r�r�r�r)r�rr��s


z_BaseV6._string_from_ip_intcs�t|t�rt|j�}nt|t�r,t|j�}nt|�}|j|�}d|��fdd�tddd�D�}t|ttf�r�ddj	|�|j
fSdj	|�S)	z�Expand a shortened IPv6 address.

        Args:
            ip_str: A string, the IPv6 address.

        Returns:
            A string, the expanded IPv6 address.

        z%032xcsg|]}�||d��qS)rr)r
r�)r�rrr�sz8_BaseV6._explode_shorthand_ip_string.<locals>.<listcomp>rrrz%s/%dr�)rrCrMrtrGrYr�r�rxr�rr)r*r�r�r�r)r�rr~�s



z$_BaseV6._explode_shorthand_ip_stringcCs&|jddd�jdd�}dj|�dS)z�Return the reverse DNS pointer name for the IPv6 address.

        This implements the method described in RFC3596 2.5.

        Nr$r�r�r�z	.ip6.arparp)r�replacer�)r*Z
reverse_charsrrrr��sz_BaseV6._reverse_pointercCs|jS)N)rb)r*rrrr��sz_BaseV6.max_prefixlencCs|jS)N)rq)r*rrrra�sz_BaseV6.version)N)r5r6r7r:r8rq�
IPV6LENGTHrcr�r�r�rbr�r�r�r�r�r�r�r~r�r�r�rarrrrr��s$i0	r�c@s�eZdZdZdZdd�Zedd��Zedd	��Zed
d��Z	edd
��Z
edd��Zedd��Zedd��Z
edd��Zedd��Zedd��Zedd��Zedd��ZdS) r=z/Represent and manipulate single IPv6 Addresses.rTr�cCsxt|t�r|j|�||_dSt|t�rL|j|d�t|�}t|d�|_dSt|�}d|krht	d|��|j
|�|_dS)aInstantiate a new IPv6 address object.

        Args:
            address: A string or integer representing the IP

              Additionally, an integer can be passed, so
              IPv6Address('2001:db8::') ==
                IPv6Address(42540766411282592856903984951653826560)
              or, more generally
              IPv6Address(int(IPv6Address('2001:db8::'))) ==
                IPv6Address('2001:db8::')

        Raises:
            AddressValueError: If address isn't a valid IPv6 address.

        Nrr
rLzUnexpected '/' in %r)rrr�rTr>r�rrrMr9r�)r*r@r�r�rrrr��s


zIPv6Address.__init__cCs
t|j�S)z*The binary representation of this address.)rKrT)r*rrrr��szIPv6Address.packedcCs||jjkS)z�Test if the address is reserved for multicast use.

        Returns:
            A boolean, True if the address is a multicast address.
            See RFC 2373 2.7 for details.

        )r�r�)r*rrrr�s	zIPv6Address.is_multicastcst�fdd��jjD��S)z�Test if the address is otherwise IETF reserved.

        Returns:
            A boolean, True if the address is within one of the
            reserved IPv6 Network ranges.

        c3s|]}�|kVqdS)Nr)r
r�)r*rrr�sz*IPv6Address.is_reserved.<locals>.<genexpr>)r�r��_reserved_networks)r*r)r*rr�s	zIPv6Address.is_reservedcCs||jjkS)z�Test if the address is reserved for link-local.

        Returns:
            A boolean, True if the address is reserved per RFC 4291.

        )r�r�)r*rrrr�szIPv6Address.is_link_localcCs||jjkS)a`Test if the address is reserved for site-local.

        Note that the site-local address space has been deprecated by RFC 3879.
        Use is_private to test if this address is in the space of unique local
        addresses as defined by RFC 4193.

        Returns:
            A boolean, True if the address is reserved per RFC 3513 2.5.6.

        )r��_sitelocal_network)r*rrr�
is_site_local#szIPv6Address.is_site_localcst�fdd��jjD��S)z�Test if this address is allocated for private networks.

        Returns:
            A boolean, True if the address is reserved per
            iana-ipv6-special-registry.

        c3s|]}�|kVqdS)Nr)r
rd)r*rrr�:sz)IPv6Address.is_private.<locals>.<genexpr>)r�r�r�)r*r)r*rr�1s	zIPv6Address.is_privatecCs|jS)z�Test if this address is allocated for public networks.

        Returns:
            A boolean, true if the address is not reserved per
            iana-ipv6-special-registry.

        )r�)r*rrrr�<s	zIPv6Address.is_globalcCs
|jdkS)z�Test if the address is unspecified.

        Returns:
            A boolean, True if this is the unspecified address as defined in
            RFC 2373 2.5.2.

        r)rT)r*rrrr�Gs	zIPv6Address.is_unspecifiedcCs
|jdkS)z�Test if the address is a loopback address.

        Returns:
            A boolean, True if the address is a loopback address as defined in
            RFC 2373 2.5.3.

        r$)rT)r*rrrr�Rs	zIPv6Address.is_loopbackcCs |jd?dkrdSt|jd@�S)z�Return the IPv4 mapped address.

        Returns:
            If the IPv6 address is a v4 mapped address, return the
            IPv4 mapped address. Return None otherwise.

        ri��Nl��)rTr<)r*rrr�ipv4_mapped]s	zIPv6Address.ipv4_mappedcCs4|jd?dkrdSt|jd?d@�t|jd@�fS)z�Tuple of embedded teredo IPs.

        Returns:
            Tuple of the (server, client) IPs or None if the address
            doesn't appear to be a teredo address (doesn't start with
            2001::/32)

        �`i Nrl��)rTr<)r*rrr�teredojs
zIPv6Address.teredocCs$|jd?dkrdSt|jd?d@�S)z�Return the IPv4 6to4 embedded address.

        Returns:
            The IPv4 6to4-embedded address if present or None if the
            address doesn't appear to contain a 6to4 embedded address.

        �pi N�Pl��)rTr<)r*rrr�	sixtofourys	zIPv6Address.sixtofourN)rTr�)r5r6r7r:r8r�r�r�r�r�r�rr�r�r�r�rrrrrrrr=�s%

r=c@s�eZdZdd�Zdd�Zdd�Zdd�Zd	d
�Zej	Z	e
dd��Ze
d
d��Ze
dd��Z
e
dd��Ze
dd��Ze
dd��ZdS)rGcCs�t|ttf�r2tj||�t|j�|_|j|_	dSt|t
�r�tj||d�t|�dkrht|d�|_	n|j|_	t|dd�|_|jj
|_
|jj|_dSt|�}tj||d�t|dd�|_|jj
|_
|jj	|_	|jj|_dS)Nrr$F)rD)rr>rr=r�rCrTr�rbrrr�rOr�r�r�rQ)r*r@rPrrrr��s(




zIPv6Interface.__init__cCsd|j|j�|jjfS)Nz%s/%d)r�rTr�r�)r*rrrr��szIPv6Interface.__str__cCsDtj||�}|s|tkr|Sy|j|jkStk
r>dSXdS)NF)r=r,r-r�rs)r*r+r�rrrr,�szIPv6Interface.__eq__cCs>tj||�}|tkrtSy|j|jkStk
r8dSXdS)NF)r=r0r-r�rs)r*r+r�rrrr0�szIPv6Interface.__lt__cCs|j|jAt|jj�AS)N)rTrrr�r�rt)r*rrrr��szIPv6Interface.__hash__cCs
t|j�S)N)r=rT)r*rrrrY�szIPv6Interface.ipcCsd|j|j�|jfS)Nz%s/%s)r�rTrr)r*rrrr��szIPv6Interface.with_prefixlencCsd|j|j�|jfS)Nz%s/%s)r�rTr�)r*rrrr��szIPv6Interface.with_netmaskcCsd|j|j�|jfS)Nz%s/%s)r�rTr�)r*rrrr��szIPv6Interface.with_hostmaskcCs|jdko|jjS)Nr)rTr�r�)r*rrrr��szIPv6Interface.is_unspecifiedcCs|jdko|jjS)Nr$)rTr�r�)r*rrrr��szIPv6Interface.is_loopbackN)r5r6r7r�r�r,r0r�r}r�r�rYr�r�r�r�r�rrrrrG�srGc@s2eZdZdZeZd
dd�Zdd�Zedd��Z	d	S)rCavThis class represents and manipulates 128-bit IPv6 networks.

    Attributes: [examples for IPv6('2001:db8::1000/124')]
        .network_address: IPv6Address('2001:db8::1000')
        .hostmask: IPv6Address('::f')
        .broadcast_address: IPv6Address('2001:db8::100f')
        .netmask: IPv6Address('ffff:ffff:ffff:ffff:ffff:ffff:ffff:fff0')
        .prefixlen: 124

    TcCs|tj||�t|ttf�r<t|�|_|j|j�\|_	|_
dSt|t�r�t|�dkr\|d}n|j}|j|�\|_	|_
t|d�|_t
|j�}|t
|j	�@|kr�|r�td|��nt|t
|j	�@�|_dSt|�}t|j|d��|_t|�dkr�|d}n|j}|j|�\|_	|_
|�rDtt
|j�t
|j	�@�|jk�rDtd|��tt
|j�t
|j	�@�|_|j
|jdk�rx|j|_dS)a�Instantiate a new IPv6 Network object.

        Args:
            address: A string or integer representing the IPv6 network or the
              IP and prefix/netmask.
              '2001:db8::/128'
              '2001:db8:0000:0000:0000:0000:0000:0000/128'
              '2001:db8::'
              are all functionally the same in IPv6.  That is to say,
              failing to provide a subnetmask will create an object with
              a mask of /128.

              Additionally, an integer can be passed, so
              IPv6Network('2001:db8::') ==
                IPv6Network(42540766411282592856903984951653826560)
              or, more generally
              IPv6Network(int(IPv6Network('2001:db8::'))) ==
                IPv6Network('2001:db8::')

            strict: A boolean. If true, ensure that we have been passed
              A true network address, eg, 2001:db8::1000/124 and not an
              IP address on a network, eg, 2001:db8::1/124.

        Raises:
            AddressValueError: If address isn't a valid IPv6 address.
            NetmaskValueError: If the netmask isn't valid for
              an IPv6 address.
            ValueError: If strict was True and a network address was not
              supplied.

        Nr$rz%s has host bits setr)rxr�rr>rr=rtr�rbr�rrr�rOr�r?rQr�r�r�)r*r@rDr�r�rPrrrr��sB 






zIPv6Network.__init__ccs@t|j�}t|j�}x&t|d|d�D]}|j|�Vq(WdS)z�Generate Iterator over usable hosts in a network.

          This is like __iter__ except it doesn't return the
          Subnet-Router anycast address.

        r$N)r�rtrmr(r�)r*r�r�r�rrrr�<	s

zIPv6Network.hostscCs|jjo|jjS)a`Test if the address is reserved for site-local.

        Note that the site-local address space has been deprecated by RFC 3879.
        Use is_private to test if this address is in the space of unique local
        addresses as defined by RFC 4193.

        Returns:
            A boolean, True if the address is reserved per RFC 3513 2.5.6.

        )rtrrm)r*rrrrH	szIPv6Network.is_site_localN)T)
r5r6r7r:r=r�r�r�r�rrrrrrC�s

OrCc@s�eZdZed�Zed�Zed�ed�ed�ed�ed�ed�ed	�ed
�ed�ed�g
Zed�ed
�ed�ed�ed�ed�ed�ed�ed�ed�ed�ed�ed�ed�ed�gZed�ZdS)�_IPv6Constantsz	fe80::/10zff00::/8z::1/128z::/128z
::ffff:0:0/96z100::/64z	2001::/23z2001:2::/48z
2001:db8::/32z2001:10::/28zfc00::/7z::/8z100::/8z200::/7z400::/6z800::/5z1000::/4z4000::/3z6000::/3z8000::/3zA000::/3zC000::/3zE000::/4zF000::/5zF800::/6zFE00::/9z	fec0::/10N)	r5r6r7rCr�r�r�rrrrrrr	X	s*

r	r)r$)T)8r:Z
__future__rr"r�__version__r�rZlong�	NameErrorZunicoderM�strr>rr�
from_bytesrrsrr�r!r(�objectr)r�r�r?r9r;rArErHrJrKrQrZr]rerorwr|r}r_rxr�r<rFrBr�r�r�r=rGrCr	rrrr�<module>	s�

	


)$
$#716=a*vRr 5V{!site-packages/pip/_vendor/__pycache__/appdirs.cpython-36.pyc000064400000044153147511334610020024 0ustar003

���e`W�@s�dZd1Zdjeee��ZddlZddlZejddkZ	e	r>eZ
ejjd�r�ddlZej
�ddZejd�rrd	Zq�ejd
�r�dZq�dZnejZd2dd�Zd3dd�Zd4dd�Zd5dd�Zd6dd�Zd7dd�ZGdd�de�Zdd�Zdd �Zd!d"�Zd#d$�Zed	k�r�yddlZeZWnnek
�r�ydd%l m!Z!eZWnBek
�r|yddl"Z#eZWnek
�rveZYnXYnXYnXe$d&k�r~d'Z%d(Z&d8Z'e(d)�ee%e&d*d+�Z)x$e'D]Z*e(d,e*e+e)e*�f��q�We(d-�ee%e&�Z)x$e'D]Z*e(d,e*e+e)e*�f��q�We(d.�ee%�Z)x$e'D]Z*e(d,e*e+e)e*�f��q$We(d/�ee%d
d0�Z)x$e'D]Z*e(d,e*e+e)e*�f��q^WdS)9zyUtilities for determining application-specific dirs.

See <http://github.com/ActiveState/appdirs> for details and usage.
����.N��javaZWindows�win32ZMac�darwinZlinux2FcCs�tdkr^|dkr|}|rdpd}tjjt|��}|r�|dk	rNtjj|||�}q�tjj||�}nNtdkr�tjjd�}|r�tjj||�}n&tjdtjjd	��}|r�tjj||�}|r�|r�tjj||�}|S)
aJReturn full path to the user-specific data dir for this application.

        "appname" is the name of application.
            If None, just the system directory is returned.
        "appauthor" (only used on Windows) is the name of the
            appauthor or distributing body for this application. Typically
            it is the owning company name. This falls back to appname. You may
            pass False to disable it.
        "version" is an optional version path element to append to the
            path. You might want to use this if you want multiple versions
            of your app to be able to run independently. If used, this
            would typically be "<major>.<minor>".
            Only applied when appname is present.
        "roaming" (boolean, default False) can be set True to use the Windows
            roaming appdata directory. That means that for users on a Windows
            network setup for roaming profiles, this user data will be
            sync'd on login. See
            <http://technet.microsoft.com/en-us/library/cc766489(WS.10).aspx>
            for a discussion of issues.

    Typical user data directories are:
        macOS:                  ~/Library/Application Support/<AppName>
        Unix:                   ~/.local/share/<AppName>    # or in $XDG_DATA_HOME, if defined
        Win XP (not roaming):   C:\Documents and Settings\<username>\Application Data\<AppAuthor>\<AppName>
        Win XP (roaming):       C:\Documents and Settings\<username>\Local Settings\Application Data\<AppAuthor>\<AppName>
        Win 7  (not roaming):   C:\Users\<username>\AppData\Local\<AppAuthor>\<AppName>
        Win 7  (roaming):       C:\Users\<username>\AppData\Roaming\<AppAuthor>\<AppName>

    For Unix, we follow the XDG spec and support $XDG_DATA_HOME.
    That means, by default "~/.local/share/<AppName>".
    rN�
CSIDL_APPDATA�CSIDL_LOCAL_APPDATAFrz~/Library/Application Support/Z
XDG_DATA_HOMEz~/.local/share)�system�os�path�normpath�_get_win_folder�join�
expanduser�getenv)�appname�	appauthor�version�roaming�constr
�r�/usr/lib/python3.6/appdirs.py�
user_data_dir-s& rcs
tdkrR|dkr�}tjjtd��}�r�|dk	rBtjj||��}q�tjj|��}n�tdkrztjjd�}�r�tjj|��}nttjdtjjdd	g��}d
d�|j	tj�D�}�r�|r�tjj�|���fdd�|D�}|r�tjj|�}n|d
}|S�o�|�rtjj||�}|S)aiReturn full path to the user-shared data dir for this application.

        "appname" is the name of application.
            If None, just the system directory is returned.
        "appauthor" (only used on Windows) is the name of the
            appauthor or distributing body for this application. Typically
            it is the owning company name. This falls back to appname. You may
            pass False to disable it.
        "version" is an optional version path element to append to the
            path. You might want to use this if you want multiple versions
            of your app to be able to run independently. If used, this
            would typically be "<major>.<minor>".
            Only applied when appname is present.
        "multipath" is an optional parameter only applicable to *nix
            which indicates that the entire list of data dirs should be
            returned. By default, the first item from XDG_DATA_DIRS is
            returned, or '/usr/local/share/<AppName>',
            if XDG_DATA_DIRS is not set

    Typical user data directories are:
        macOS:      /Library/Application Support/<AppName>
        Unix:       /usr/local/share/<AppName> or /usr/share/<AppName>
        Win XP:     C:\Documents and Settings\All Users\Application Data\<AppAuthor>\<AppName>
        Vista:      (Fail! "C:\ProgramData" is a hidden *system* directory on Vista.)
        Win 7:      C:\ProgramData\<AppAuthor>\<AppName>   # Hidden, but writeable on Win 7.

    For Unix, this is using the $XDG_DATA_DIRS[0] default.

    WARNING: Do not use this on Windows. See the Vista-Fail note above for why.
    rN�CSIDL_COMMON_APPDATAFrz/Library/Application SupportZ
XDG_DATA_DIRSz/usr/local/sharez
/usr/sharecSs g|]}tjj|jtj���qSr)rr
r�rstrip�sep)�.0�xrrr�
<listcomp>�sz!site_data_dir.<locals>.<listcomp>csg|]}tjj|�g��qSr)rrr)rr)rrrr �sr)
rrr
rrrrr�pathsep�split)rrr�	multipathr
�pathlistr)rr�
site_data_dirds4
r%cCsXtdkrt||d|�}n&tjdtjjd��}|r>tjj||�}|rT|rTtjj||�}|S)a�Return full path to the user-specific config dir for this application.

        "appname" is the name of application.
            If None, just the system directory is returned.
        "appauthor" (only used on Windows) is the name of the
            appauthor or distributing body for this application. Typically
            it is the owning company name. This falls back to appname. You may
            pass False to disable it.
        "version" is an optional version path element to append to the
            path. You might want to use this if you want multiple versions
            of your app to be able to run independently. If used, this
            would typically be "<major>.<minor>".
            Only applied when appname is present.
        "roaming" (boolean, default False) can be set True to use the Windows
            roaming appdata directory. That means that for users on a Windows
            network setup for roaming profiles, this user data will be
            sync'd on login. See
            <http://technet.microsoft.com/en-us/library/cc766489(WS.10).aspx>
            for a discussion of issues.

    Typical user data directories are:
        macOS:                  same as user_data_dir
        Unix:                   ~/.config/<AppName>     # or in $XDG_CONFIG_HOME, if defined
        Win *:                  same as user_data_dir

    For Unix, we follow the XDG spec and support $XDG_CONFIG_HOME.
    That means, by deafult "~/.config/<AppName>".
    rrNZXDG_CONFIG_HOMEz	~/.config)rr)rrrrr
rr)rrrrr
rrr�user_config_dir�sr&cs�td	kr*t�|�}�r�|r�tjj||�}ndtjdd�}dd�|jtj�D�}�rt|rbtjj�|���fdd�|D�}|r�tjj|�}n|d}|S)
aReturn full path to the user-shared data dir for this application.

        "appname" is the name of application.
            If None, just the system directory is returned.
        "appauthor" (only used on Windows) is the name of the
            appauthor or distributing body for this application. Typically
            it is the owning company name. This falls back to appname. You may
            pass False to disable it.
        "version" is an optional version path element to append to the
            path. You might want to use this if you want multiple versions
            of your app to be able to run independently. If used, this
            would typically be "<major>.<minor>".
            Only applied when appname is present.
        "multipath" is an optional parameter only applicable to *nix
            which indicates that the entire list of config dirs should be
            returned. By default, the first item from XDG_CONFIG_DIRS is
            returned, or '/etc/xdg/<AppName>', if XDG_CONFIG_DIRS is not set

    Typical user data directories are:
        macOS:      same as site_data_dir
        Unix:       /etc/xdg/<AppName> or $XDG_CONFIG_DIRS[i]/<AppName> for each value in
                    $XDG_CONFIG_DIRS
        Win *:      same as site_data_dir
        Vista:      (Fail! "C:\ProgramData" is a hidden *system* directory on Vista.)

    For Unix, this is using the $XDG_CONFIG_DIRS[0] default, if multipath=False

    WARNING: Do not use this on Windows. See the Vista-Fail note above for why.
    rrZXDG_CONFIG_DIRSz/etc/xdgcSs g|]}tjj|jtj���qSr)rr
rrr)rrrrrr �sz#site_config_dir.<locals>.<listcomp>csg|]}tjj|�g��qSr)rrr)rr)rrrr �sr)rr)rr%rr
rrr"r!)rrrr#r
r$r)rr�site_config_dir�s
r'TcCs�tdkrd|dkr|}tjjtd��}|r�|dk	rBtjj|||�}ntjj||�}|r�tjj|d�}nNtdkr�tjjd�}|r�tjj||�}n&tjdtjjd	��}|r�tjj||�}|r�|r�tjj||�}|S)
aReturn full path to the user-specific cache dir for this application.

        "appname" is the name of application.
            If None, just the system directory is returned.
        "appauthor" (only used on Windows) is the name of the
            appauthor or distributing body for this application. Typically
            it is the owning company name. This falls back to appname. You may
            pass False to disable it.
        "version" is an optional version path element to append to the
            path. You might want to use this if you want multiple versions
            of your app to be able to run independently. If used, this
            would typically be "<major>.<minor>".
            Only applied when appname is present.
        "opinion" (boolean) can be False to disable the appending of
            "Cache" to the base app data dir for Windows. See
            discussion below.

    Typical user cache directories are:
        macOS:      ~/Library/Caches/<AppName>
        Unix:       ~/.cache/<AppName> (XDG default)
        Win XP:     C:\Documents and Settings\<username>\Local Settings\Application Data\<AppAuthor>\<AppName>\Cache
        Vista:      C:\Users\<username>\AppData\Local\<AppAuthor>\<AppName>\Cache

    On Windows the only suggestion in the MSDN docs is that local settings go in
    the `CSIDL_LOCAL_APPDATA` directory. This is identical to the non-roaming
    app data dir (the default returned by `user_data_dir` above). Apps typically
    put cache data somewhere *under* the given dir here. Some examples:
        ...\Mozilla\Firefox\Profiles\<ProfileName>\Cache
        ...\Acme\SuperApp\Cache\1.0
    OPINION: This function appends "Cache" to the `CSIDL_LOCAL_APPDATA` value.
    This can be disabled with the `opinion=False` option.
    rNr
FZCacherz~/Library/CachesZXDG_CACHE_HOMEz~/.cache)rrr
rrrrr)rrr�opinionr
rrr�user_cache_dirs(!r)cCs�tdkr tjjtjjd�|�}nNtdkrLt|||�}d}|rntjj|d�}n"t|||�}d}|rntjj|d�}|r�|r�tjj||�}|S)a�Return full path to the user-specific log dir for this application.

        "appname" is the name of application.
            If None, just the system directory is returned.
        "appauthor" (only used on Windows) is the name of the
            appauthor or distributing body for this application. Typically
            it is the owning company name. This falls back to appname. You may
            pass False to disable it.
        "version" is an optional version path element to append to the
            path. You might want to use this if you want multiple versions
            of your app to be able to run independently. If used, this
            would typically be "<major>.<minor>".
            Only applied when appname is present.
        "opinion" (boolean) can be False to disable the appending of
            "Logs" to the base app data dir for Windows, and "log" to the
            base cache dir for Unix. See discussion below.

    Typical user cache directories are:
        macOS:      ~/Library/Logs/<AppName>
        Unix:       ~/.cache/<AppName>/log  # or under $XDG_CACHE_HOME if defined
        Win XP:     C:\Documents and Settings\<username>\Local Settings\Application Data\<AppAuthor>\<AppName>\Logs
        Vista:      C:\Users\<username>\AppData\Local\<AppAuthor>\<AppName>\Logs

    On Windows the only suggestion in the MSDN docs is that local settings
    go in the `CSIDL_LOCAL_APPDATA` directory. (Note: I'm interested in
    examples of what some windows apps use for a logs dir.)

    OPINION: This function appends "Logs" to the `CSIDL_LOCAL_APPDATA`
    value for Windows and appends "log" to the user cache dir for Unix.
    This can be disabled with the `opinion=False` option.
    rz~/Library/LogsrFZLogs�log)rrr
rrrr))rrrr(r
rrr�user_log_dir:s  
r+c@sbeZdZdZddd�Zedd��Zedd	��Zed
d��Zedd
��Z	edd��Z
edd��ZdS)�AppDirsz1Convenience wrapper for getting application dirs.NFcCs"||_||_||_||_||_dS)N)rrrrr#)�selfrrrrr#rrr�__init__os
zAppDirs.__init__cCst|j|j|j|jd�S)N)rr)rrrrr)r-rrrrws
zAppDirs.user_data_dircCst|j|j|j|jd�S)N)rr#)r%rrrr#)r-rrrr%|s
zAppDirs.site_data_dircCst|j|j|j|jd�S)N)rr)r&rrrr)r-rrrr&�s
zAppDirs.user_config_dircCst|j|j|j|jd�S)N)rr#)r'rrrr#)r-rrrr'�s
zAppDirs.site_config_dircCst|j|j|jd�S)N)r)r)rrr)r-rrrr)�s
zAppDirs.user_cache_dircCst|j|j|jd�S)N)r)r+rrr)r-rrrr+�s
zAppDirs.user_log_dir)NNFF)�__name__�
__module__�__qualname__�__doc__r.�propertyrr%r&r'r)r+rrrrr,ms
r,cCs:ddl}dddd�|}|j|jd�}|j||�\}}|S)z�This is a fallback technique at best. I'm not sure if using the
    registry for this guarantees us the correct answer for all CSIDL_*
    names.
    rNZAppDatazCommon AppDataz
Local AppData)r	rr
z@Software\Microsoft\Windows\CurrentVersion\Explorer\Shell Folders)�_winreg�OpenKey�HKEY_CURRENT_USERZQueryValueEx)�
csidl_namer4Zshell_folder_name�key�dir�typerrr�_get_win_folder_from_registry�sr;cCs�ddlm}m}|jdt||�dd�}y`t|�}d}x|D]}t|�dkr:d}Pq:W|r�yddl}|j|�}Wnt	k
r�YnXWnt
k
r�YnX|S)Nr)�shellcon�shellF�T)�win32com.shellr<r=�SHGetFolderPath�getattr�unicode�ord�win32api�GetShortPathName�ImportError�UnicodeError)r7r<r=r9�
has_high_char�crDrrr�_get_win_folder_with_pywin32�s$

rJcCs�ddl}dddd�|}|jd�}|jjjd|dd|�d}x|D]}t|�dkrBd	}PqBW|r�|jd�}|jjj|j|d�r�|}|jS)
Nr��#�)r	rr
iFr>T)	�ctypesZcreate_unicode_buffer�windllZshell32ZSHGetFolderPathWrCZkernel32ZGetShortPathNameW�value)r7rNZcsidl_const�bufrHrIZbuf2rrr�_get_win_folder_with_ctypes�s"


rRcCs�ddl}ddlm}ddlm}|jjd}|jd|�}|jj	}|j
dt|j|�d|jj
|�|jj|j��jd�}d}x|D]}	t|	�dkr~d	}Pq~W|r�|jd|�}|jj	}
tj|||�r�|jj|j��jd�}|S)
Nr)�jna)r�rI�Fr>T)�arrayZcom.sunrSZcom.sun.jna.platformrZWinDefZMAX_PATHZzerosZShell32ZINSTANCEr@rAZShlObjZSHGFP_TYPE_CURRENTZNativeZtoStringZtostringrrCZKernel32ZkernalrE)r7rVrSrZbuf_sizerQr=r9rHrIZkernelrrr�_get_win_folder_with_jna�s&
rW)rO�__main__ZMyAppZ	MyCompanyz%-- app dirs (with optional 'version')z1.0)rz%s: %sz)
-- app dirs (without optional 'version')z+
-- app dirs (without optional 'appauthor')z(
-- app dirs (with disabled 'appauthor'))r)rrr)NNNF)NNNF)NNNF)NNNF)NNNT)NNNT)rr%r&r'r)r+),r2Z__version_info__r�map�str�__version__�sysr�version_infoZPY3rB�platform�
startswithZjava_verZos_namerrr%r&r'r)r+�objectr,r;rJrRrWr?Zwin32comrrFrNrOZcom.sun.jnaZcomr/rrZprops�print�dirsZproprArrrr�<module>	s~


7
B
(
3
9
3+






site-packages/pip/_vendor/__pycache__/pyparsing.cpython-36.pyc000064400000610513147511334610020375 0ustar003

���e�k�@s�dZdZdZdZddlZddlmZddlZddl	Z	ddl
Z
ddlZddlZddl
Z
ddlZddlZddlZddlmZyddlmZWn ek
r�ddlmZYnXydd	l
mZWn>ek
r�ydd	lmZWnek
r�dZYnXYnXd
ddd
ddddddddddddddddddd d!d"d#d$d%d&d'd(d)d*d+d,d-d.d/d0d1d2d3d4d5d6d7d8d9d:d;d<d=d>d?d@dAdBdCdDdEdFdGdHdIdJdKdLdMdNdOdPdQdRdSdTdUdVdWdXdYdZd[d\d]d^d_d`dadbdcdddedfdgdhdidjdkdldmdndodpdqdrgiZee	j�dds�ZeddskZe�r"e	jZe Z!e"Z#e Z$e%e&e'e(e)ee*e+e,e-e.gZ/nbe	j0Ze1Z2dtdu�Z$gZ/ddl3Z3xBdvj4�D]6Z5ye/j6e7e3e5��Wne8k
�r|�wJYnX�qJWe9dwdx�e2dy�D��Z:dzd{�Z;Gd|d}�d}e<�Z=ej>ej?Z@d~ZAeAdZBe@eAZCe"d��ZDd�jEd�dx�ejFD��ZGGd�d!�d!eH�ZIGd�d#�d#eI�ZJGd�d%�d%eI�ZKGd�d'�d'eK�ZLGd�d*�d*eH�ZMGd�d��d�e<�ZNGd�d&�d&e<�ZOe
jPjQeO�d�d=�ZRd�dN�ZSd�dK�ZTd�d��ZUd�d��ZVd�d��ZWd�dU�ZX�d/d�d��ZYGd�d(�d(e<�ZZGd�d0�d0eZ�Z[Gd�d�de[�Z\Gd�d�de[�Z]Gd�d�de[�Z^e^Z_e^eZ_`Gd�d�de[�ZaGd�d�de^�ZbGd�d�dea�ZcGd�dp�dpe[�ZdGd�d3�d3e[�ZeGd�d+�d+e[�ZfGd�d)�d)e[�ZgGd�d
�d
e[�ZhGd�d2�d2e[�ZiGd�d��d�e[�ZjGd�d�dej�ZkGd�d�dej�ZlGd�d�dej�ZmGd�d.�d.ej�ZnGd�d-�d-ej�ZoGd�d5�d5ej�ZpGd�d4�d4ej�ZqGd�d$�d$eZ�ZrGd�d
�d
er�ZsGd�d �d er�ZtGd�d�der�ZuGd�d�der�ZvGd�d"�d"eZ�ZwGd�d�dew�ZxGd�d�dew�ZyGd�d��d�ew�ZzGd�d�dez�Z{Gd�d6�d6ez�Z|Gd�d��d�e<�Z}e}�Z~Gd�d�dew�ZGd�d,�d,ew�Z�Gd�d�dew�Z�Gd�d��d�e��Z�Gd�d1�d1ew�Z�Gd�d�de��Z�Gd�d�de��Z�Gd�d�de��Z�Gd�d/�d/e��Z�Gd�d�de<�Z�d�df�Z��d0d�dD�Z��d1d�d@�Z�d�d΄Z�d�dS�Z�d�dR�Z�d�d҄Z��d2d�dW�Z�d�dE�Z��d3d�dk�Z�d�dl�Z�d�dn�Z�e\�j�dG�Z�el�j�dM�Z�em�j�dL�Z�en�j�de�Z�eo�j�dd�Z�eeeDd�d�dڍj�d�d܄�Z�efd݃j�d�d܄�Z�efd߃j�d�d܄�Z�e�e�Be�BeeeGd�dyd�Befd�ej��BZ�e�e�e�d�e��Z�e^d�ed�j�d�e�e{e�e�B��j�d�d�Z�d�dc�Z�d�dQ�Z�d�d`�Z�d�d^�Z�d�dq�Z�e�d�d܄�Z�e�d�d܄�Z�d�d�Z�d�dO�Z�d�dP�Z�d�di�Z�e<�e�_��d4d�do�Z�e=�Z�e<�e�_�e<�e�_�e�d��e�d��fd�dm�Z�e�Z�e�efd��d��j�d��Z�e�efd��d��j�d��Z�e�efd��d�efd��d�B�j��d�Z�e�e_�d�e�j��j��d�Z�d�d�de�j�f�ddT�Z��d5�ddj�Z�e��d�Z�e��d�Z�e�eee@eC�d�j��d��\Z�Z�e�e��d	j4��d
��Z�ef�d�djEe�j��d
�j��d�ZĐdd_�Z�e�ef�d��d�j��d�Z�ef�d�j��d�Z�ef�d�jȃj��d�Z�ef�d�j��d�Z�e�ef�d��de�B�j��d�Z�e�Z�ef�d�j��d�Z�e�e{eeeGdɐd�eee�d�e^dɃem����j΃j��d�Z�e�ee�j�e�Bd��d��j�d>�Z�G�d dr�dr�Z�eҐd!k�r�eb�d"�Z�eb�d#�Z�eee@eC�d$�Z�e�eՐd%dӐd&�j�e��Z�e�e�eփ�j��d'�Zאd(e�BZ�e�eՐd%dӐd&�j�e��Z�e�e�eك�j��d)�Z�eӐd*�eؐd'�e�eڐd)�Z�e�jܐd+�e�j�jܐd,�e�j�jܐd,�e�j�jܐd-�ddl�Z�e�j�j�e�e�j��e�j�jܐd.�dS(6aS
pyparsing module - Classes and methods to define and execute parsing grammars

The pyparsing module is an alternative approach to creating and executing simple grammars,
vs. the traditional lex/yacc approach, or the use of regular expressions.  With pyparsing, you
don't need to learn a new syntax for defining grammars or matching expressions - the parsing module
provides a library of classes that you use to construct the grammar directly in Python.

Here is a program to parse "Hello, World!" (or any greeting of the form 
C{"<salutation>, <addressee>!"}), built up using L{Word}, L{Literal}, and L{And} elements 
(L{'+'<ParserElement.__add__>} operator gives L{And} expressions, strings are auto-converted to
L{Literal} expressions)::

    from pyparsing import Word, alphas

    # define grammar of a greeting
    greet = Word(alphas) + "," + Word(alphas) + "!"

    hello = "Hello, World!"
    print (hello, "->", greet.parseString(hello))

The program outputs the following::

    Hello, World! -> ['Hello', ',', 'World', '!']

The Python representation of the grammar is quite readable, owing to the self-explanatory
class names, and the use of '+', '|' and '^' operators.

The L{ParseResults} object returned from L{ParserElement.parseString<ParserElement.parseString>} can be accessed as a nested list, a dictionary, or an
object with named attributes.

The pyparsing module handles some of the problems that are typically vexing when writing text parsers:
 - extra or missing whitespace (the above program will also handle "Hello,World!", "Hello  ,  World  !", etc.)
 - quoted strings
 - embedded comments
z2.1.10z07 Oct 2016 01:31 UTCz*Paul McGuire <ptmcg@users.sourceforge.net>�N)�ref)�datetime)�RLock)�OrderedDict�And�CaselessKeyword�CaselessLiteral�
CharsNotIn�Combine�Dict�Each�Empty�
FollowedBy�Forward�
GoToColumn�Group�Keyword�LineEnd�	LineStart�Literal�
MatchFirst�NoMatch�NotAny�	OneOrMore�OnlyOnce�Optional�Or�ParseBaseException�ParseElementEnhance�ParseException�ParseExpression�ParseFatalException�ParseResults�ParseSyntaxException�
ParserElement�QuotedString�RecursiveGrammarException�Regex�SkipTo�	StringEnd�StringStart�Suppress�Token�TokenConverter�White�Word�WordEnd�	WordStart�
ZeroOrMore�	alphanums�alphas�
alphas8bit�anyCloseTag�
anyOpenTag�
cStyleComment�col�commaSeparatedList�commonHTMLEntity�countedArray�cppStyleComment�dblQuotedString�dblSlashComment�
delimitedList�dictOf�downcaseTokens�empty�hexnums�htmlComment�javaStyleComment�line�lineEnd�	lineStart�lineno�makeHTMLTags�makeXMLTags�matchOnlyAtCol�matchPreviousExpr�matchPreviousLiteral�
nestedExpr�nullDebugAction�nums�oneOf�opAssoc�operatorPrecedence�
printables�punc8bit�pythonStyleComment�quotedString�removeQuotes�replaceHTMLEntity�replaceWith�
restOfLine�sglQuotedString�srange�	stringEnd�stringStart�traceParseAction�
unicodeString�upcaseTokens�
withAttribute�
indentedBlock�originalTextFor�ungroup�
infixNotation�locatedExpr�	withClass�
CloseMatch�tokenMap�pyparsing_common�cCs`t|t�r|Syt|�Stk
[The remainder of this entry is the marshalled byte-code of the compiled pyparsing 2.1.10 module rather than readable source text. It embeds the implementations and docstrings of the helper functions (_ustr, _xml_escape, col, lineno, line, the default debug actions, _trim_arity), the exception classes (ParseBaseException, ParseException, ParseFatalException, ParseSyntaxException, RecursiveGrammarException), the ParseResults container, the ParserElement base class with its packrat-caching machinery, and the Token subclasses (Token, Empty, NoMatch, Literal, Keyword, CaselessLiteral, CaselessKeyword, ...). The byte-code is not recoverable as source here; the readable definitions are in the pyparsing 2.1.10 source distribution.]
        
    (Contrast with example for L{CaselessLiteral}.)
    Ncstt|�j||dd�dS)NT)r�)r�rr�)r�r�r�)rHrwrxr��	szCaselessKeyword.__init__TcCsj||||j�j�|jkrV|t|�|jksF|||jj�|jkrV||j|jfSt|||j|��dS)N)r�r�r�r�r�r�rra)r�r-r�rorwrwrxr��	s*zCaselessKeyword.parseImpl)N)T)r�r�r�r�r�r�r�rwrw)rHrxr�	scs,eZdZdZd�fdd�	Zd	dd�Z�ZS)
rlax
    A variation on L{Literal} which matches "close" matches, that is, 
    strings with at most 'n' mismatching characters. C{CloseMatch} takes parameters:
     - C{match_string} - string to be matched
     - C{maxMismatches} - (C{default=1}) maximum number of mismatches allowed to count as a match
    
    The results from a successful parse will contain the matched text from the input string and the following named results:
     - C{mismatches} - a list of the positions within the match_string where mismatches were found
     - C{original} - the original match_string used to compare against the input string
    
    If C{mismatches} is an empty list, then the match was an exact match.
    
    Example::
        patt = CloseMatch("ATCATCGAATGGA")
        patt.parseString("ATCATCGAAXGGA") # -> (['ATCATCGAAXGGA'], {'mismatches': [[9]], 'original': ['ATCATCGAATGGA']})
        patt.parseString("ATCAXCGAAXGGA") # -> Exception: Expected 'ATCATCGAATGGA' (with up to 1 mismatches) (at char 0), (line:1, col:1)

        # exact match
        patt.parseString("ATCATCGAATGGA") # -> (['ATCATCGAATGGA'], {'mismatches': [[]], 'original': ['ATCATCGAATGGA']})

        # close match allowing up to 2 mismatches
        patt = CloseMatch("ATCATCGAATGGA", maxMismatches=2)
        patt.parseString("ATCAXCGAAXGGA") # -> (['ATCAXCGAAXGGA'], {'mismatches': [[4, 9]], 'original': ['ATCATCGAATGGA']})
    rrcsBtt|�j�||_||_||_d|j|jf|_d|_d|_dS)Nz&Expected %r (with up to %d mismatches)F)	r�rlr�r��match_string�
maxMismatchesrar`r[)r�r�r�)rHrwrxr��	szCloseMatch.__init__TcCs�|}t|�}|t|j�}||kr�|j}d}g}	|j}
x�tt|||�|j��D]0\}}|\}}
||
krP|	j|�t|	�|
krPPqPW|d}t|||�g�}|j|d<|	|d<||fSt|||j|��dS)Nrrr�original�
mismatches)	r�r�r�r�r�r�r"rra)r�r-r�ro�startr��maxlocr�Zmatch_stringlocr�r�Zs_m�src�mat�resultsrwrwrxr��	s("

zCloseMatch.parseImpl)rr)T)r�r�r�r�r�r�r�rwrw)rHrxrl�	s	cs8eZdZdZd
�fdd�	Zdd	d
�Z�fdd�Z�ZS)r/a	
    Token for matching words composed of allowed character sets.
    Defined with string containing all allowed initial characters,
    an optional string containing allowed body characters (if omitted,
    defaults to the initial character set), and an optional minimum,
    maximum, and/or exact length.  The default value for C{min} is 1 (a
    minimum value < 1 is not valid); the default values for C{max} and C{exact}
    are 0, meaning no maximum or exact length restriction. An optional
    C{excludeChars} parameter can list characters that might be found in 
    the input C{bodyChars} string; useful to define a word of all printables
    except for one or two characters, for instance.
    
    L{srange} is useful for defining custom character set strings for defining 
    C{Word} expressions, using range notation from regular expression character sets.
    
    A common mistake is to use C{Word} to match a specific literal string, as in 
    C{Word("Address")}. Remember that C{Word} uses the string argument to define
    I{sets} of matchable characters. This expression would match "Add", "AAA",
    "dAred", or any other word made up of the characters 'A', 'd', 'r', 'e', and 's'.
    To match an exact literal string, use L{Literal} or L{Keyword}.

    pyparsing includes helper strings for building Words:
     - L{alphas}
     - L{nums}
     - L{alphanums}
     - L{hexnums}
     - L{alphas8bit} (alphabetic characters in ASCII range 128-255 - accented, tilded, umlauted, etc.)
     - L{punc8bit} (non-alphabetic characters in ASCII range 128-255 - currency, symbols, superscripts, diacriticals, etc.)
     - L{printables} (any non-whitespace character)

    Example::
        # a word composed of digits
        integer = Word(nums) # equivalent to Word("0123456789") or Word(srange("0-9"))
        
        # a word with a leading capital, and zero or more lowercase
        capital_word = Word(alphas.upper(), alphas.lower())

        # hostnames are alphanumeric, with leading alpha, and '-'
        hostname = Word(alphas, alphanums+'-')
        
        # roman numeral (not a strict parser, accepts invalid mix of characters)
        roman = Word("IVXLCDM")
        
        # any string of non-whitespace characters, except for ','
        csv_value = Word(printables, excludeChars=",")
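
        # added sketches (not from the original examples): length-constrained words
        # using the optional min/max/exact keyword arguments described above
        year = Word(nums, exact=4)               # exactly four digits
        short_code = Word(alphas, min=2, max=5)  # between two and five letters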
    NrrrFcs�tt|�j��rFdj�fdd�|D��}|rFdj�fdd�|D��}||_t|�|_|rl||_t|�|_n||_t|�|_|dk|_	|dkr�t
d��||_|dkr�||_nt
|_|dkr�||_||_t|�|_d|j|_d	|_||_d
|j|jk�r�|dk�r�|dk�r�|dk�r�|j|jk�r8dt|j�|_nHt|j�dk�rfdtj|j�t|j�f|_nd
t|j�t|j�f|_|j�r�d|jd|_ytj|j�|_Wntk
�r�d|_YnXdS)Nr�c3s|]}|�kr|VqdS)Nrw)r�r�)�excludeCharsrwrxr�7
sz Word.__init__.<locals>.<genexpr>c3s|]}|�kr|VqdS)Nrw)r�r�)r�rwrxr�9
srrrzZcannot specify a minimum length < 1; use Optional(Word()) if zero-length word is permittedz	Expected Fr�z[%s]+z%s[%s]*z	[%s][%s]*z\b)r�r/r�r��
initCharsOrigr��	initChars�
bodyCharsOrig�	bodyChars�maxSpecifiedr��minLen�maxLenr�r�r�rar`�	asKeyword�_escapeRegexRangeChars�reStringr�rd�escape�compilerK)r�rr�min�max�exactrr�)rH)r�rxr�4
sT



0
z
Word.__init__Tc
CsD|jr<|jj||�}|s(t|||j|��|j�}||j�fS|||jkrZt|||j|��|}|d7}t|�}|j}||j	}t
||�}x ||kr�|||kr�|d7}q�Wd}	|||jkr�d}	|jr�||kr�|||kr�d}	|j
�r|dk�r||d|k�s||k�r|||k�rd}	|	�r4t|||j|��||||�fS)NrrFTr)rdr�rra�end�grouprr�rrrrrr)
r�r-r�ror�r�r�Z	bodycharsr�ZthrowExceptionrwrwrxr�j
s6

4zWord.parseImplcstytt|�j�Stk
r"YnX|jdkrndd�}|j|jkr^d||j�||j�f|_nd||j�|_|jS)NcSs$t|�dkr|dd�dS|SdS)N�z...)r�)r�rwrwrx�
charsAsStr�
sz Word.__str__.<locals>.charsAsStrz	W:(%s,%s)zW:(%s))r�r/r�rKrUr�r)r�r)rHrwrxr��
s
zWord.__str__)NrrrrFN)T)r�r�r�r�r�r�r�r�rwrw)rHrxr/
s.6
#csFeZdZdZeejd��Zd�fdd�	Zddd�Z	�fd	d
�Z
�ZS)
r'a�
    Token for matching strings that match a given regular expression.
    Defined with string specifying the regular expression in a form recognized by the inbuilt Python re module.
    If the given regex contains named groups (defined using C{(?P<name>...)}), these will be preserved as 
    named parse results.

    Example::
        realnum = Regex(r"[+-]?\d+\.\d*")
        date = Regex(r'(?P<year>\d{4})-(?P<month>\d\d?)-(?P<day>\d\d?)')
        # ref: http://stackoverflow.com/questions/267399/how-do-you-match-only-valid-roman-numerals-with-a-regular-expression
        roman = Regex(r"M{0,4}(CM|CD|D?C{0,3})(XC|XL|L?X{0,3})(IX|IV|V?I{0,3})")
    z[A-Z]rcs�tt|�j�t|t�r�|s,tjdtdd�||_||_	yt
j|j|j	�|_
|j|_Wq�t
jk
r�tjd|tdd��Yq�Xn2t|tj�r�||_
t|�|_|_||_	ntd��t|�|_d|j|_d|_d|_d	S)
z�The parameters C{pattern} and C{flags} are passed to the C{re.compile()} function as-is. See the Python C{re} module for an explanation of the acceptable patterns and flags.z0null string passed to Regex; use Empty() insteadrq)r�z$invalid pattern (%s) passed to RegexzCRegex may only be constructed with a string or a compiled RE objectz	Expected FTN)r�r'r�rzr�r�r�r��pattern�flagsrdr
r�
sre_constants�error�compiledREtyper{r�r�r�rar`r[)r�rr)rHrwrxr��
s.





zRegex.__init__TcCsd|jj||�}|s"t|||j|��|j�}|j�}t|j��}|r\x|D]}||||<qHW||fS)N)rdr�rrar�	groupdictr"r)r�r-r�ror��dr�r�rwrwrxr��
s
zRegex.parseImplcsDytt|�j�Stk
r"YnX|jdkr>dt|j�|_|jS)NzRe:(%s))r�r'r�rKrUr�r)r�)rHrwrxr��
s
z
Regex.__str__)r)T)r�r�r�r�r�rdr
rr�r�r�r�rwrw)rHrxr'�
s
"

cs8eZdZdZd�fdd�	Zddd�Z�fd	d
�Z�ZS)
r%a�
    Token for matching strings that are delimited by quoting characters.
    
    Defined with the following parameters:
        - quoteChar - string of one or more characters defining the quote delimiting string
        - escChar - character to escape quotes, typically backslash (default=C{None})
        - escQuote - special quote sequence to escape an embedded quote string (such as SQL's "" to escape an embedded ") (default=C{None})
        - multiline - boolean indicating whether quotes can span multiple lines (default=C{False})
        - unquoteResults - boolean indicating whether the matched text should be unquoted (default=C{True})
        - endQuoteChar - string of one or more characters defining the end of the quote delimited string (default=C{None} => same as quoteChar)
        - convertWhitespaceEscapes - convert escaped whitespace (C{'\t'}, C{'\n'}, etc.) to actual whitespace (default=C{True})

    Example::
        qs = QuotedString('"')
        print(qs.searchString('lsjdf "This is the quote" sldjf'))
        complex_qs = QuotedString('{{', endQuoteChar='}}')
        print(complex_qs.searchString('lsjdf {{This is the "quote"}} sldjf'))
        sql_qs = QuotedString('"', escQuote='""')
        print(sql_qs.searchString('lsjdf "This is the quote with ""embedded"" quotes" sldjf'))
    prints::
        [['This is the quote']]
        [['This is the "quote"']]
        [['This is the quote with "embedded" quotes']]
    NFTcsNtt��j�|j�}|s0tjdtdd�t��|dkr>|}n"|j�}|s`tjdtdd�t��|�_t	|��_
|d�_|�_t	|��_
|�_|�_|�_|�_|r�tjtjB�_dtj�j�t�jd�|dk	r�t|�p�df�_n<d�_dtj�j�t�jd�|dk	�rt|��pdf�_t	�j�d	k�rp�jd
dj�fdd
�tt	�j�d	dd�D��d7_|�r��jdtj|�7_|�r��jdtj|�7_tj�j�d�_�jdtj�j�7_ytj�j�j��_�j�_Wn0tjk
�r&tjd�jtdd��YnXt ���_!d�j!�_"d�_#d�_$dS)Nz$quoteChar cannot be the empty stringrq)r�z'endQuoteChar cannot be the empty stringrz%s(?:[^%s%s]r�z%s(?:[^%s\n\r%s]rrz|(?:z)|(?:c3s4|],}dtj�jd|��t�j|�fVqdS)z%s[^%s]N)rdr	�endQuoteCharr)r�r�)r�rwrxr�/sz(QuotedString.__init__.<locals>.<genexpr>�)z|(?:%s)z|(?:%s.)z(.)z)*%sz$invalid pattern (%s) passed to Regexz	Expected FTrs)%r�r%r�r�r�r�r��SyntaxError�	quoteCharr��quoteCharLen�firstQuoteCharr�endQuoteCharLen�escChar�escQuote�unquoteResults�convertWhitespaceEscapesrd�	MULTILINE�DOTALLrr	rrr�r��escCharReplacePatternr
rrrr�r�rar`r[)r�rr r!Z	multiliner"rr#)rH)r�rxr�sf




6

zQuotedString.__init__c	Cs�|||jkr|jj||�pd}|s4t|||j|��|j�}|j�}|jr�||j|j	�}t
|t�r�d|kr�|jr�ddddd�}x |j
�D]\}}|j||�}q�W|jr�tj|jd|�}|jr�|j|j|j�}||fS)N�\�	r��
)z\tz\nz\fz\rz\g<1>)rrdr�rrarrr"rrrzr�r#r�r�r r�r&r!r)	r�r-r�ror�r�Zws_mapZwslitZwscharrwrwrxr�Gs( 
zQuotedString.parseImplcsFytt|�j�Stk
r"YnX|jdkr@d|j|jf|_|jS)Nz.quoted string, starting with %s ending with %s)r�r%r�rKrUrr)r�)rHrwrxr�js
zQuotedString.__str__)NNFTNT)T)r�r�r�r�r�r�r�r�rwrw)rHrxr%�
sA
#cs8eZdZdZd�fdd�	Zddd�Z�fd	d
�Z�ZS)
r	a�
    Token for matching words composed of characters I{not} in a given set (will
    include whitespace in matched characters if not listed in the provided exclusion set - see example).
    Defined with string containing all disallowed characters, and an optional
    minimum, maximum, and/or exact length.  The default value for C{min} is 1 (a
    minimum value < 1 is not valid); the default values for C{max} and C{exact}
    are 0, meaning no maximum or exact length restriction.

    Example::
        # define a comma-separated-value as anything that is not a ','
        csv_value = CharsNotIn(',')
        print(delimitedList(csv_value).parseString("dkls,lsdkjf,s12 34,@!#,213"))
    prints::
        ['dkls', 'lsdkjf', 's12 34', '@!#', '213']
    rrrcs�tt|�j�d|_||_|dkr*td��||_|dkr@||_nt|_|dkrZ||_||_t	|�|_
d|j
|_|jdk|_d|_
dS)NFrrzfcannot specify a minimum length < 1; use Optional(CharsNotIn()) if zero-length char group is permittedrz	Expected )r�r	r�rX�notCharsr�rrr�r�r�rar[r`)r�r+rrr
)rHrwrxr��s 
zCharsNotIn.__init__TcCs�|||jkrt|||j|��|}|d7}|j}t||jt|��}x ||krd|||krd|d7}qFW|||jkr�t|||j|��||||�fS)Nrr)r+rrarrr�r)r�r-r�ror�Znotchars�maxlenrwrwrxr��s
zCharsNotIn.parseImplcsdytt|�j�Stk
r"YnX|jdkr^t|j�dkrRd|jdd�|_nd|j|_|jS)Nrz
!W:(%s...)z!W:(%s))r�r	r�rKrUr�r+)r�)rHrwrxr��s
zCharsNotIn.__str__)rrrr)T)r�r�r�r�r�r�r�r�rwrw)rHrxr	vs
cs<eZdZdZdddddd�Zd�fdd�	Zddd�Z�ZS)r.a�
    Special matching class for matching whitespace.  Normally, whitespace is ignored
    by pyparsing grammars.  This class is included when some whitespace structures
    are significant.  Define with a string containing the whitespace characters to be
    matched; default is C{" \t\r\n"}.  Also takes optional C{min}, C{max}, and C{exact} arguments,
    as defined for the C{L{Word}} class.
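
    A minimal added sketch (names are illustrative) of making an embedded tab significant::
        key_value = Word(alphas) + White("\t") + Word(alphas)
        print(key_value.parseString("name\tvalue"))  # -> ['name', '\t', 'value']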
    z<SPC>z<TAB>z<LF>z<CR>z<FF>)r�r(rr*r)� 	
rrrcs�tt��j�|�_�jdj�fdd��jD���djdd��jD���_d�_d�j�_	|�_
|dkrt|�_nt�_|dkr�|�_|�_
dS)Nr�c3s|]}|�jkr|VqdS)N)�
matchWhite)r�r�)r�rwrxr��sz!White.__init__.<locals>.<genexpr>css|]}tj|VqdS)N)r.�	whiteStrs)r�r�rwrwrxr��sTz	Expected r)
r�r.r�r.r�r�rYr�r[rarrr�)r�Zwsrrr
)rH)r�rxr��s zWhite.__init__TcCs�|||jkrt|||j|��|}|d7}||j}t|t|��}x"||krd|||jkrd|d7}qDW|||jkr�t|||j|��||||�fS)Nrr)r.rrarrr�r)r�r-r�ror�r�rwrwrxr��s
zWhite.parseImpl)r-rrrr)T)r�r�r�r�r/r�r�r�rwrw)rHrxr.�scseZdZ�fdd�Z�ZS)�_PositionTokencs(tt|�j�|jj|_d|_d|_dS)NTF)r�r0r�rHr�r�r[r`)r�)rHrwrxr��s
z_PositionToken.__init__)r�r�r�r�r�rwrw)rHrxr0�sr0cs2eZdZdZ�fdd�Zdd�Zd	dd�Z�ZS)
rzb
    Token to advance to a specific column of input text; useful for tabular report scraping.
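
    A minimal added sketch (the column number and field names are illustrative); any text
    between the current position and the target column is returned as a token::
        row = Word(alphas)("name") + GoToColumn(20) + Word(nums)("qty")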
    cstt|�j�||_dS)N)r�rr�r9)r��colno)rHrwrxr��szGoToColumn.__init__cCs`t||�|jkr\t|�}|jr*|j||�}x0||krZ||j�rZt||�|jkrZ|d7}q,W|S)Nrr)r9r�r]r��isspace)r�r-r�r�rwrwrxr��s&zGoToColumn.preParseTcCsDt||�}||jkr"t||d|��||j|}|||�}||fS)NzText not in expected column)r9r)r�r-r�roZthiscolZnewlocr�rwrwrxr�s

zGoToColumn.parseImpl)T)r�r�r�r�r�r�r�r�rwrw)rHrxr�s	cs*eZdZdZ�fdd�Zddd�Z�ZS)ra�
    Matches if current position is at the beginning of a line within the parse string
    
    Example::
    
        test = '''        AAA this line
        AAA and this line
          AAA but not this one
        B AAA and definitely not this one
        '''

        for t in (LineStart() + 'AAA' + restOfLine).searchString(test):
            print(t)
    
    Prints::
        ['AAA', ' this line']
        ['AAA', ' and this line']    

    cstt|�j�d|_dS)NzExpected start of line)r�rr�ra)r�)rHrwrxr�&szLineStart.__init__TcCs*t||�dkr|gfSt|||j|��dS)Nrr)r9rra)r�r-r�rorwrwrxr�*szLineStart.parseImpl)T)r�r�r�r�r�r�r�rwrw)rHrxrscs*eZdZdZ�fdd�Zddd�Z�ZS)rzU
    Matches if current position is at the end of a line within the parse string
    cs,tt|�j�|jtjjdd��d|_dS)Nrr�zExpected end of line)r�rr�r�r$rNr�ra)r�)rHrwrxr�3szLineEnd.__init__TcCsb|t|�kr6||dkr$|ddfSt|||j|��n(|t|�krN|dgfSt|||j|��dS)Nrrr)r�rra)r�r-r�rorwrwrxr�8szLineEnd.parseImpl)T)r�r�r�r�r�r�r�rwrw)rHrxr/scs*eZdZdZ�fdd�Zddd�Z�ZS)r*zM
    Matches if current position is at the beginning of the parse string
    cstt|�j�d|_dS)NzExpected start of text)r�r*r�ra)r�)rHrwrxr�GszStringStart.__init__TcCs0|dkr(||j|d�kr(t|||j|��|gfS)Nr)r�rra)r�r-r�rorwrwrxr�KszStringStart.parseImpl)T)r�r�r�r�r�r�r�rwrw)rHrxr*Cscs*eZdZdZ�fdd�Zddd�Z�ZS)r)zG
    Matches if current position is at the end of the parse string
    cstt|�j�d|_dS)NzExpected end of text)r�r)r�ra)r�)rHrwrxr�VszStringEnd.__init__TcCs^|t|�krt|||j|��n<|t|�kr6|dgfS|t|�krJ|gfSt|||j|��dS)Nrr)r�rra)r�r-r�rorwrwrxr�ZszStringEnd.parseImpl)T)r�r�r�r�r�r�r�rwrw)rHrxr)Rscs.eZdZdZef�fdd�	Zddd�Z�ZS)r1ap
    Matches if the current position is at the beginning of a Word, and
    is not preceded by any character in a given set of C{wordChars}
    (default=C{printables}). To emulate the C{\b} behavior of regular expressions,
    use C{WordStart(alphanums)}. C{WordStart} will also match at the beginning of
    the string being parsed, or at the beginning of a line.
    cs"tt|�j�t|�|_d|_dS)NzNot at the start of a word)r�r1r�r��	wordCharsra)r�r3)rHrwrxr�ls
zWordStart.__init__TcCs@|dkr8||d|jks(|||jkr8t|||j|��|gfS)Nrrr)r3rra)r�r-r�rorwrwrxr�qs
zWordStart.parseImpl)T)r�r�r�r�rVr�r�r�rwrw)rHrxr1dscs.eZdZdZef�fdd�	Zddd�Z�ZS)r0aZ
    Matches if the current position is at the end of a Word, and
    is not followed by any character in a given set of C{wordChars}
    (default=C{printables}). To emulate the C{\b} behavior of regular expressions,
    use C{WordEnd(alphanums)}. C{WordEnd} will also match at the end of
    the string being parsed, or at the end of a line.
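
    A minimal added sketch (the test string is illustrative) of whole-word matching::
        whole_end = WordStart() + Literal("end") + WordEnd()
        print(whole_end.searchString("bend the trend to end"))  # -> [['end']]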
    cs(tt|�j�t|�|_d|_d|_dS)NFzNot at the end of a word)r�r0r�r�r3rXra)r�r3)rHrwrxr��s
zWordEnd.__init__TcCsPt|�}|dkrH||krH|||jks8||d|jkrHt|||j|��|gfS)Nrrr)r�r3rra)r�r-r�ror�rwrwrxr��szWordEnd.parseImpl)T)r�r�r�r�rVr�r�r�rwrw)rHrxr0xscs�eZdZdZd�fdd�	Zdd�Zdd�Zd	d
�Z�fdd�Z�fd
d�Z	�fdd�Z
d�fdd�	Zgfdd�Z�fdd�Z
�ZS)r z^
    Abstract subclass of ParserElement, for combining and post-processing parsed tokens.
    Fcs�tt|�j|�t|t�r"t|�}t|t�r<tj|�g|_	njt|t
j�rzt|�}tdd�|D��rnt
tj|�}t|�|_	n,yt|�|_	Wntk
r�|g|_	YnXd|_dS)Ncss|]}t|t�VqdS)N)rzr�)r�r.rwrwrxr��sz+ParseExpression.__init__.<locals>.<genexpr>F)r�r r�rzr�r�r�r$rQ�exprsr��Iterable�allrvr�re)r�r4rg)rHrwrxr��s

zParseExpression.__init__cCs
|j|S)N)r4)r�r�rwrwrxr��szParseExpression.__getitem__cCs|jj|�d|_|S)N)r4r�rU)r�r�rwrwrxr��szParseExpression.appendcCs4d|_dd�|jD�|_x|jD]}|j�q W|S)z~Extends C{leaveWhitespace} defined in base class, and also invokes C{leaveWhitespace} on
           all contained expressions.FcSsg|]}|j��qSrw)r�)r�r�rwrwrxr��sz3ParseExpression.leaveWhitespace.<locals>.<listcomp>)rXr4r�)r�r�rwrwrxr��s
zParseExpression.leaveWhitespacecszt|t�rF||jkrvtt|�j|�xP|jD]}|j|jd�q,Wn0tt|�j|�x|jD]}|j|jd�q^W|S)Nrrrsrs)rzr+r]r�r r�r4)r�r�r�)rHrwrxr��s

zParseExpression.ignorecsLytt|�j�Stk
r"YnX|jdkrFd|jjt|j�f|_|jS)Nz%s:(%s))	r�r r�rKrUrHr�r�r4)r�)rHrwrxr��s
zParseExpression.__str__cs0tt|�j�x|jD]}|j�qWt|j�dk�r|jd}t||j�r�|jr�|jdkr�|j	r�|jdd�|jdg|_d|_
|j|jO_|j|jO_|jd}t||j�o�|jo�|jdko�|j	�r|jdd�|jdd�|_d|_
|j|jO_|j|jO_dt
|�|_|S)Nrqrrrz	Expected rsrs)r�r r�r4r�rzrHrSrVr^rUr[r`r�ra)r�r�r�)rHrwrxr��s0




zParseExpression.streamlinecstt|�j||�}|S)N)r�r rm)r�r�rlr�)rHrwrxrm�szParseExpression.setResultsNamecCs:|dd�|g}x|jD]}|j|�qW|jg�dS)N)r4r�r�)r�r��tmpr�rwrwrxr��szParseExpression.validatecs$tt|�j�}dd�|jD�|_|S)NcSsg|]}|j��qSrw)r�)r�r�rwrwrxr��sz(ParseExpression.copy.<locals>.<listcomp>)r�r r�r4)r�r�)rHrwrxr��szParseExpression.copy)F)F)r�r�r�r�r�r�r�r�r�r�r�rmr�r�r�rwrw)rHrxr �s	
"csTeZdZdZGdd�de�Zd�fdd�	Zddd�Zd	d
�Zdd�Z	d
d�Z
�ZS)ra

    Requires all given C{ParseExpression}s to be found in the given order.
    Expressions may be separated by whitespace.
    May be constructed using the C{'+'} operator.
    May also be constructed using the C{'-'} operator, which will suppress backtracking.

    Example::
        integer = Word(nums)
        name_expr = OneOrMore(Word(alphas))

        expr = And([integer("id"),name_expr("name"),integer("age")])
        # more easily written as:
        expr = integer("id") + name_expr("name") + integer("age")
    cseZdZ�fdd�Z�ZS)zAnd._ErrorStopcs&ttj|�j||�d|_|j�dS)N�-)r�rr�r�r�r�)r�r�r�)rHrwrxr�
szAnd._ErrorStop.__init__)r�r�r�r�r�rwrw)rHrxr�
sr�TcsRtt|�j||�tdd�|jD��|_|j|jdj�|jdj|_d|_	dS)Ncss|]}|jVqdS)N)r[)r�r�rwrwrxr�
szAnd.__init__.<locals>.<genexpr>rT)
r�rr�r6r4r[r�rYrXre)r�r4rg)rHrwrxr�
s
zAnd.__init__c	Cs|jdj|||dd�\}}d}x�|jdd�D]�}t|tj�rFd}q0|r�y|j|||�\}}Wq�tk
rv�Yq�tk
r�}zd|_tj|��WYdd}~Xq�t	k
r�t|t
|�|j|��Yq�Xn|j|||�\}}|s�|j�r0||7}q0W||fS)NrF)rprrT)
r4rtrzrr�r#r�
__traceback__r�r�r�rar�)	r�r-r�ro�
resultlistZ	errorStopr�Z
exprtokensr�rwrwrxr�
s(z
And.parseImplcCst|t�rtj|�}|j|�S)N)rzr�r$rQr�)r�r�rwrwrxr5
s

zAnd.__iadd__cCs8|dd�|g}x |jD]}|j|�|jsPqWdS)N)r4r�r[)r�r��subRecCheckListr�rwrwrxr�:
s

zAnd.checkRecursioncCs@t|d�r|jS|jdkr:ddjdd�|jD��d|_|jS)Nr��{r�css|]}t|�VqdS)N)r�)r�r�rwrwrxr�F
szAnd.__str__.<locals>.<genexpr>�})r�r�rUr�r4)r�rwrwrxr�A
s


 zAnd.__str__)T)T)r�r�r�r�r
r�r�r�rr�r�r�rwrw)rHrxr�s
csDeZdZdZd�fdd�	Zddd�Zdd	�Zd
d�Zdd
�Z�Z	S)ra�
    Requires that at least one C{ParseExpression} is found.
    If two expressions match, the expression that matches the longest string will be used.
    May be constructed using the C{'^'} operator.

    Example::
        # construct Or using '^' operator
        
        number = Word(nums) ^ Combine(Word(nums) + '.' + Word(nums))
        print(number.searchString("123 3.1416 789"))
    prints::
        [['123'], ['3.1416'], ['789']]
    Fcs:tt|�j||�|jr0tdd�|jD��|_nd|_dS)Ncss|]}|jVqdS)N)r[)r�r�rwrwrxr�\
szOr.__init__.<locals>.<genexpr>T)r�rr�r4rr[)r�r4rg)rHrwrxr�Y
szOr.__init__TcCsTd}d}g}x�|jD]�}y|j||�}Wnvtk
rd}	z d|	_|	j|krT|	}|	j}WYdd}	~	Xqtk
r�t|�|kr�t|t|�|j|�}t|�}YqX|j||f�qW|�r*|j	dd�d�x`|D]X\}
}y|j
|||�Stk
�r$}	z"d|	_|	j|k�r|	}|	j}WYdd}	~	Xq�Xq�W|dk	�rB|j|_|�nt||d|��dS)NrrcSs
|dS)Nrrw)�xrwrwrxryu
szOr.parseImpl.<locals>.<lambda>)r�z no defined alternatives to matchrs)r4r�rr9r�r�r�rar��sortrtr�)r�r-r�ro�	maxExcLoc�maxExceptionr�r�Zloc2r��_rwrwrxr�`
s<

zOr.parseImplcCst|t�rtj|�}|j|�S)N)rzr�r$rQr�)r�r�rwrwrx�__ixor__�
s

zOr.__ixor__cCs@t|d�r|jS|jdkr:ddjdd�|jD��d|_|jS)Nr�r<z ^ css|]}t|�VqdS)N)r�)r�r�rwrwrxr��
szOr.__str__.<locals>.<genexpr>r=)r�r�rUr�r4)r�rwrwrxr��
s


 z
Or.__str__cCs0|dd�|g}x|jD]}|j|�qWdS)N)r4r�)r�r�r;r�rwrwrxr��
szOr.checkRecursion)F)T)
r�r�r�r�r�r�rCr�r�r�rwrw)rHrxrK
s

&	csDeZdZdZd�fdd�	Zddd�Zdd	�Zd
d�Zdd
�Z�Z	S)ra�
    Requires that at least one C{ParseExpression} is found.
    If two expressions match, the first one listed is the one that will match.
    May be constructed using the C{'|'} operator.

    Example::
        # construct MatchFirst using '|' operator
        
        # watch the order of expressions to match
        number = Word(nums) | Combine(Word(nums) + '.' + Word(nums))
        print(number.searchString("123 3.1416 789")) #  Fail! -> [['123'], ['3'], ['1416'], ['789']]

        # put more selective expression first
        number = Combine(Word(nums) + '.' + Word(nums)) | Word(nums)
        print(number.searchString("123 3.1416 789")) #  Better -> [['123'], ['3.1416'], ['789']]
    Fcs:tt|�j||�|jr0tdd�|jD��|_nd|_dS)Ncss|]}|jVqdS)N)r[)r�r�rwrwrxr��
sz&MatchFirst.__init__.<locals>.<genexpr>T)r�rr�r4rr[)r�r4rg)rHrwrxr��
szMatchFirst.__init__Tc	Cs�d}d}x�|jD]�}y|j|||�}|Stk
r\}z|j|krL|}|j}WYdd}~Xqtk
r�t|�|kr�t|t|�|j|�}t|�}YqXqW|dk	r�|j|_|�nt||d|��dS)Nrrz no defined alternatives to matchrs)r4rtrr�r�r�rar�)	r�r-r�ror@rAr�r�r�rwrwrxr��
s$
zMatchFirst.parseImplcCst|t�rtj|�}|j|�S)N)rzr�r$rQr�)r�r�rwrwrx�__ior__�
s

zMatchFirst.__ior__cCs@t|d�r|jS|jdkr:ddjdd�|jD��d|_|jS)Nr�r<z | css|]}t|�VqdS)N)r�)r�r�rwrwrxr��
sz%MatchFirst.__str__.<locals>.<genexpr>r=)r�r�rUr�r4)r�rwrwrxr��
s


 zMatchFirst.__str__cCs0|dd�|g}x|jD]}|j|�qWdS)N)r4r�)r�r�r;r�rwrwrxr��
szMatchFirst.checkRecursion)F)T)
r�r�r�r�r�r�rDr�r�r�rwrw)rHrxr�
s
	cs<eZdZdZd�fdd�	Zddd�Zdd�Zd	d
�Z�ZS)
ram
    Requires all given C{ParseExpression}s to be found, but in any order.
    Expressions may be separated by whitespace.
    May be constructed using the C{'&'} operator.

    Example::
        color = oneOf("RED ORANGE YELLOW GREEN BLUE PURPLE BLACK WHITE BROWN")
        shape_type = oneOf("SQUARE CIRCLE TRIANGLE STAR HEXAGON OCTAGON")
        integer = Word(nums)
        shape_attr = "shape:" + shape_type("shape")
        posn_attr = "posn:" + Group(integer("x") + ',' + integer("y"))("posn")
        color_attr = "color:" + color("color")
        size_attr = "size:" + integer("size")

        # use Each (using operator '&') to accept attributes in any order 
        # (shape and posn are required, color and size are optional)
        shape_spec = shape_attr & posn_attr & Optional(color_attr) & Optional(size_attr)

        shape_spec.runTests('''
            shape: SQUARE color: BLACK posn: 100, 120
            shape: CIRCLE size: 50 color: BLUE posn: 50,80
            color:GREEN size:20 shape:TRIANGLE posn:20,40
            '''
            )
    prints::
        shape: SQUARE color: BLACK posn: 100, 120
        ['shape:', 'SQUARE', 'color:', 'BLACK', 'posn:', ['100', ',', '120']]
        - color: BLACK
        - posn: ['100', ',', '120']
          - x: 100
          - y: 120
        - shape: SQUARE


        shape: CIRCLE size: 50 color: BLUE posn: 50,80
        ['shape:', 'CIRCLE', 'size:', '50', 'color:', 'BLUE', 'posn:', ['50', ',', '80']]
        - color: BLUE
        - posn: ['50', ',', '80']
          - x: 50
          - y: 80
        - shape: CIRCLE
        - size: 50


        color: GREEN size: 20 shape: TRIANGLE posn: 20,40
        ['color:', 'GREEN', 'size:', '20', 'shape:', 'TRIANGLE', 'posn:', ['20', ',', '40']]
        - color: GREEN
        - posn: ['20', ',', '40']
          - x: 20
          - y: 40
        - shape: TRIANGLE
        - size: 20
    Tcs8tt|�j||�tdd�|jD��|_d|_d|_dS)Ncss|]}|jVqdS)N)r[)r�r�rwrwrxr�sz Each.__init__.<locals>.<genexpr>T)r�rr�r6r4r[rX�initExprGroups)r�r4rg)rHrwrxr�sz
Each.__init__c
s�|jr�tdd�|jD��|_dd�|jD�}dd�|jD�}|||_dd�|jD�|_dd�|jD�|_dd�|jD�|_|j|j7_d	|_|}|jdd�}|jdd��g}d
}	x�|	�rp|�|j|j}
g}x~|
D]v}y|j||�}Wn t	k
�r|j
|�Yq�X|j
|jjt|�|��||k�rD|j
|�q�|�kr�j
|�q�Wt|�t|
�kr�d	}	q�W|�r�djdd�|D��}
t	||d
|
��|�fdd�|jD�7}g}x*|D]"}|j|||�\}}|j
|��q�Wt|tg��}||fS)Ncss&|]}t|t�rt|j�|fVqdS)N)rzrr�r.)r�r�rwrwrxr�sz!Each.parseImpl.<locals>.<genexpr>cSsg|]}t|t�r|j�qSrw)rzrr.)r�r�rwrwrxr�sz"Each.parseImpl.<locals>.<listcomp>cSs"g|]}|jrt|t�r|�qSrw)r[rzr)r�r�rwrwrxr�scSsg|]}t|t�r|j�qSrw)rzr2r.)r�r�rwrwrxr� scSsg|]}t|t�r|j�qSrw)rzrr.)r�r�rwrwrxr�!scSs g|]}t|tttf�s|�qSrw)rzrr2r)r�r�rwrwrxr�"sFTz, css|]}t|�VqdS)N)r�)r�r�rwrwrxr�=sz*Missing one or more required elements (%s)cs$g|]}t|t�r|j�kr|�qSrw)rzrr.)r�r�)�tmpOptrwrxr�As)rEr�r4Zopt1mapZ	optionalsZmultioptionalsZ
multirequiredZrequiredr�rr�r�r��remover�r�rt�sumr")r�r-r�roZopt1Zopt2ZtmpLocZtmpReqdZ
matchOrderZkeepMatchingZtmpExprsZfailedr�Zmissingr:r�ZfinalResultsrw)rFrxr�sP



zEach.parseImplcCs@t|d�r|jS|jdkr:ddjdd�|jD��d|_|jS)Nr�r<z & css|]}t|�VqdS)N)r�)r�r�rwrwrxr�PszEach.__str__.<locals>.<genexpr>r=)r�r�rUr�r4)r�rwrwrxr�Ks


 zEach.__str__cCs0|dd�|g}x|jD]}|j|�qWdS)N)r4r�)r�r�r;r�rwrwrxr�TszEach.checkRecursion)T)T)	r�r�r�r�r�r�r�r�r�rwrw)rHrxr�
s
5
1	csleZdZdZd�fdd�	Zddd�Zdd	�Z�fd
d�Z�fdd
�Zdd�Z	gfdd�Z
�fdd�Z�ZS)rza
    Abstract subclass of C{ParserElement}, for combining and post-processing parsed tokens.
    Fcs�tt|�j|�t|t�r@ttjt�r2tj|�}ntjt	|��}||_
d|_|dk	r�|j|_|j
|_
|j|j�|j|_|j|_|j|_|jj|j�dS)N)r�rr�rzr��
issubclassr$rQr,rr.rUr`r[r�rYrXrWrer]r�)r�r.rg)rHrwrxr�^s
zParseElementEnhance.__init__TcCs2|jdk	r|jj|||dd�Std||j|��dS)NF)rpr�)r.rtrra)r�r-r�rorwrwrxr�ps
zParseElementEnhance.parseImplcCs*d|_|jj�|_|jdk	r&|jj�|S)NF)rXr.r�r�)r�rwrwrxr�vs


z#ParseElementEnhance.leaveWhitespacecsrt|t�rB||jkrntt|�j|�|jdk	rn|jj|jd�n,tt|�j|�|jdk	rn|jj|jd�|S)Nrrrsrs)rzr+r]r�rr�r.)r�r�)rHrwrxr�}s



zParseElementEnhance.ignorecs&tt|�j�|jdk	r"|jj�|S)N)r�rr�r.)r�)rHrwrxr��s

zParseElementEnhance.streamlinecCsB||krt||g��|dd�|g}|jdk	r>|jj|�dS)N)r&r.r�)r�r�r;rwrwrxr��s

z"ParseElementEnhance.checkRecursioncCs6|dd�|g}|jdk	r(|jj|�|jg�dS)N)r.r�r�)r�r�r7rwrwrxr��s
zParseElementEnhance.validatecsVytt|�j�Stk
r"YnX|jdkrP|jdk	rPd|jjt|j�f|_|jS)Nz%s:(%s))	r�rr�rKrUr.rHr�r�)r�)rHrwrxr��szParseElementEnhance.__str__)F)T)
r�r�r�r�r�r�r�r�r�r�r�r�r�rwrw)rHrxrZs
cs*eZdZdZ�fdd�Zddd�Z�ZS)ra�
    Lookahead matching of the given parse expression.  C{FollowedBy}
    does I{not} advance the parsing position within the input string, it only
    verifies that the specified parse expression matches at the current
    position.  C{FollowedBy} always returns a null token list.

    Example::
        # use FollowedBy to match a label only if it is followed by a ':'
        data_word = Word(alphas)
        label = data_word + FollowedBy(':')
        attr_expr = Group(label + Suppress(':') + OneOrMore(data_word, stopOn=label).setParseAction(' '.join))
        
        OneOrMore(attr_expr).parseString("shape: SQUARE color: BLACK posn: upper left").pprint()
    prints::
        [['shape', 'SQUARE'], ['color', 'BLACK'], ['posn', 'upper left']]
    cstt|�j|�d|_dS)NT)r�rr�r[)r�r.)rHrwrxr��szFollowedBy.__init__TcCs|jj||�|gfS)N)r.r�)r�r-r�rorwrwrxr��szFollowedBy.parseImpl)T)r�r�r�r�r�r�r�rwrw)rHrxr�scs2eZdZdZ�fdd�Zd	dd�Zdd�Z�ZS)
ra�
    Lookahead to disallow matching with the given parse expression.  C{NotAny}
    does I{not} advance the parsing position within the input string, it only
    verifies that the specified parse expression does I{not} match at the current
    position.  Also, C{NotAny} does I{not} skip over leading whitespace. C{NotAny}
    always returns a null token list.  May be constructed using the '~' operator.

    Example::
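        # added sketch (names are illustrative): use '~' to keep an identifier
        # from matching a reserved keyword
        keyword = Keyword("if") | Keyword("else")
        ident = ~keyword + Word(alphas, alphanums + "_")
        ident.parseString("count")   # -> ['count']
        ident.parseString("if")      # -> ParseException (keyword is not a valid identifier)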
        
    cs0tt|�j|�d|_d|_dt|j�|_dS)NFTzFound unwanted token, )r�rr�rXr[r�r.ra)r�r.)rHrwrxr��szNotAny.__init__TcCs&|jj||�rt|||j|��|gfS)N)r.r�rra)r�r-r�rorwrwrxr��szNotAny.parseImplcCs4t|d�r|jS|jdkr.dt|j�d|_|jS)Nr�z~{r=)r�r�rUr�r.)r�rwrwrxr��s


zNotAny.__str__)T)r�r�r�r�r�r�r�r�rwrw)rHrxr�s

cs(eZdZd�fdd�	Zddd�Z�ZS)	�_MultipleMatchNcsFtt|�j|�d|_|}t|t�r.tj|�}|dk	r<|nd|_dS)NT)	r�rJr�rWrzr�r$rQ�	not_ender)r�r.�stopOnZender)rHrwrxr��s

z_MultipleMatch.__init__TcCs�|jj}|j}|jdk	}|r$|jj}|r2|||�||||dd�\}}yZ|j}	xJ|rb|||�|	rr|||�}
n|}
|||
|�\}}|s�|j�rT||7}qTWWnttfk
r�YnX||fS)NF)rp)	r.rtr�rKr�r]r�rr�)r�r-r�roZself_expr_parseZself_skip_ignorablesZcheck_enderZ
try_not_enderr�ZhasIgnoreExprsr�Z	tmptokensrwrwrxr��s,



z_MultipleMatch.parseImpl)N)T)r�r�r�r�r�r�rwrw)rHrxrJ�srJc@seZdZdZdd�ZdS)ra�
    Repetition of one or more of the given expression.
    
    Parameters:
     - expr - expression that must match one or more times
     - stopOn - (default=C{None}) - expression for a terminating sentinel
          (only required if the sentinel would ordinarily match the repetition 
          expression)          

    Example::
        data_word = Word(alphas)
        label = data_word + FollowedBy(':')
        attr_expr = Group(label + Suppress(':') + OneOrMore(data_word).setParseAction(' '.join))

        text = "shape: SQUARE posn: upper left color: BLACK"
        OneOrMore(attr_expr).parseString(text).pprint()  # Fail! read 'color' as data instead of next label -> [['shape', 'SQUARE color']]

        # use stopOn attribute for OneOrMore to avoid reading label string as part of the data
        attr_expr = Group(label + Suppress(':') + OneOrMore(data_word, stopOn=label).setParseAction(' '.join))
        OneOrMore(attr_expr).parseString(text).pprint() # Better -> [['shape', 'SQUARE'], ['posn', 'upper left'], ['color', 'BLACK']]
        
        # could also be written as
        (attr_expr * (1,)).parseString(text).pprint()
    cCs4t|d�r|jS|jdkr.dt|j�d|_|jS)Nr�r<z}...)r�r�rUr�r.)r�rwrwrxr�!s


zOneOrMore.__str__N)r�r�r�r�r�rwrwrwrxrscs8eZdZdZd
�fdd�	Zd�fdd�	Zdd	�Z�ZS)r2aw
    Optional repetition of zero or more of the given expression.
    
    Parameters:
     - expr - expression that must match zero or more times
     - stopOn - (default=C{None}) - expression for a terminating sentinel
          (only required if the sentinel would ordinarily match the repetition 
          expression)          

    Example: similar to L{OneOrMore}
    Ncstt|�j||d�d|_dS)N)rLT)r�r2r�r[)r�r.rL)rHrwrxr�6szZeroOrMore.__init__Tcs6ytt|�j|||�Sttfk
r0|gfSXdS)N)r�r2r�rr�)r�r-r�ro)rHrwrxr�:szZeroOrMore.parseImplcCs4t|d�r|jS|jdkr.dt|j�d|_|jS)Nr�rz]...)r�r�rUr�r.)r�rwrwrxr�@s


zZeroOrMore.__str__)N)T)r�r�r�r�r�r�r�r�rwrw)rHrxr2*sc@s eZdZdd�ZeZdd�ZdS)�
_NullTokencCsdS)NFrw)r�rwrwrxr�Jsz_NullToken.__bool__cCsdS)Nr�rw)r�rwrwrxr�Msz_NullToken.__str__N)r�r�r�r�r'r�rwrwrwrxrMIsrMcs6eZdZdZef�fdd�	Zd	dd�Zdd�Z�ZS)
raa
    Optional matching of the given expression.

    Parameters:
     - expr - expression that may match zero or one time
     - default (optional) - value to be returned if the optional expression is not found.

    Example::
        # US postal code can be a 5-digit zip, plus optional 4-digit qualifier
        zip = Combine(Word(nums, exact=5) + Optional('-' + Word(nums, exact=4)))
        zip.runTests('''
            # traditional ZIP code
            12345
            
            # ZIP+4 form
            12101-0001
            
            # invalid ZIP
            98765-
            ''')
    prints::
        # traditional ZIP code
        12345
        ['12345']

        # ZIP+4 form
        12101-0001
        ['12101-0001']

        # invalid ZIP
        98765-
             ^
        FAIL: Expected end of text (at char 5), (line:1, col:6)
    cs.tt|�j|dd�|jj|_||_d|_dS)NF)rgT)r�rr�r.rWr�r[)r�r.r�)rHrwrxr�ts
zOptional.__init__TcCszy|jj|||dd�\}}WnTttfk
rp|jtk	rh|jjr^t|jg�}|j||jj<ql|jg}ng}YnX||fS)NF)rp)r.rtrr�r��_optionalNotMatchedrVr")r�r-r�ror�rwrwrxr�zs


zOptional.parseImplcCs4t|d�r|jS|jdkr.dt|j�d|_|jS)Nr�rr	)r�r�rUr�r.)r�rwrwrxr��s


zOptional.__str__)T)	r�r�r�r�rNr�r�r�r�rwrw)rHrxrQs"
cs,eZdZdZd	�fdd�	Zd
dd�Z�ZS)r(a�	
    Token for skipping over all undefined text until the matched expression is found.

    Parameters:
     - expr - target expression marking the end of the data to be skipped
     - include - (default=C{False}) if True, the target expression is also parsed 
          (the skipped text and target expression are returned as a 2-element list).
     - ignore - (default=C{None}) used to define grammars (typically quoted strings and 
          comments) that might contain false matches to the target expression
     - failOn - (default=C{None}) define expressions that are not allowed to be 
          included in the skipped test; if found before the target expression is found, 
          the SkipTo is not a match

    Example::
        report = '''
            Outstanding Issues Report - 1 Jan 2000

               # | Severity | Description                               |  Days Open
            -----+----------+-------------------------------------------+-----------
             101 | Critical | Intermittent system crash                 |          6
              94 | Cosmetic | Spelling error on Login ('log|n')         |         14
              79 | Minor    | System slow when running too many reports |         47
            '''
        integer = Word(nums)
        SEP = Suppress('|')
        # use SkipTo to simply match everything up until the next SEP
        # - ignore quoted strings, so that a '|' character inside a quoted string does not match
        # - parse action will call token.strip() for each matched token, i.e., the description body
        string_data = SkipTo(SEP, ignore=quotedString)
        string_data.setParseAction(tokenMap(str.strip))
        ticket_expr = (integer("issue_num") + SEP 
                      + string_data("sev") + SEP 
                      + string_data("desc") + SEP 
                      + integer("days_open"))
        
        for tkt in ticket_expr.searchString(report):
            print tkt.dump()
    prints::
        ['101', 'Critical', 'Intermittent system crash', '6']
        - days_open: 6
        - desc: Intermittent system crash
        - issue_num: 101
        - sev: Critical
        ['94', 'Cosmetic', "Spelling error on Login ('log|n')", '14']
        - days_open: 14
        - desc: Spelling error on Login ('log|n')
        - issue_num: 94
        - sev: Cosmetic
        ['79', 'Minor', 'System slow when running too many reports', '47']
        - days_open: 47
        - desc: System slow when running too many reports
        - issue_num: 79
        - sev: Minor
    FNcs`tt|�j|�||_d|_d|_||_d|_t|t	�rFt
j|�|_n||_dt
|j�|_dS)NTFzNo match found for )r�r(r��
ignoreExprr[r`�includeMatchr�rzr�r$rQ�failOnr�r.ra)r�r��includer�rQ)rHrwrxr��s
zSkipTo.__init__TcCs,|}t|�}|j}|jj}|jdk	r,|jjnd}|jdk	rB|jjnd}	|}
x�|
|kr�|dk	rh|||
�rhP|	dk	r�x*y|	||
�}
Wqrtk
r�PYqrXqrWy|||
ddd�Wn tt	fk
r�|
d7}
YqLXPqLWt|||j
|��|
}|||�}t|�}|j�r$||||dd�\}}
||
7}||fS)NF)rorprr)rp)
r�r.rtrQr�rOr�rrr�rar"rP)r�r-r�ror0r�r.Z
expr_parseZself_failOn_canParseNextZself_ignoreExpr_tryParseZtmplocZskiptextZ
skipresultr�rwrwrxr��s<

zSkipTo.parseImpl)FNN)T)r�r�r�r�r�r�r�rwrw)rHrxr(�s6
csbeZdZdZd�fdd�	Zdd�Zdd�Zd	d
�Zdd�Zgfd
d�Z	dd�Z
�fdd�Z�ZS)raK
    Forward declaration of an expression to be defined later -
    used for recursive grammars, such as algebraic infix notation.
    When the expression is known, it is assigned to the C{Forward} variable using the '<<' operator.

    Note: take care when assigning to C{Forward} not to overlook precedence of operators.
    Specifically, '|' has a lower precedence than '<<', so that::
        fwdExpr << a | b | c
    will actually be evaluated as::
        (fwdExpr << a) | b | c
    thereby leaving b and c out as parseable alternatives.  It is recommended that you
    explicitly group the values inserted into the C{Forward}::
        fwdExpr << (a | b | c)
    Converting to use the '<<=' operator instead will avoid this problem.

    See L{ParseResults.pprint} for an example of a recursive parser created using
    C{Forward}.
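
    A minimal added sketch (names are illustrative) of a recursive grammar for nested
    parenthesized lists of integers::
        LPAR, RPAR = map(Suppress, "()")
        item = Word(nums)
        nested = Forward()
        nested <<= Group(LPAR + ZeroOrMore(item | nested) + RPAR)
        print(nested.parseString("(1 (2 3) 4)"))  # -> [['1', ['2', '3'], '4']]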
    Ncstt|�j|dd�dS)NF)rg)r�rr�)r�r�)rHrwrxr�szForward.__init__cCsjt|t�rtj|�}||_d|_|jj|_|jj|_|j|jj	�|jj
|_
|jj|_|jj
|jj�|S)N)rzr�r$rQr.rUr`r[r�rYrXrWr]r�)r�r�rwrwrx�
__lshift__s





zForward.__lshift__cCs||>S)Nrw)r�r�rwrwrx�__ilshift__'szForward.__ilshift__cCs
d|_|S)NF)rX)r�rwrwrxr�*szForward.leaveWhitespacecCs$|js d|_|jdk	r |jj�|S)NT)r_r.r�)r�rwrwrxr�.s


zForward.streamlinecCs>||kr0|dd�|g}|jdk	r0|jj|�|jg�dS)N)r.r�r�)r�r�r7rwrwrxr�5s

zForward.validatecCs>t|d�r|jS|jjdSd}Wd|j|_X|jjd|S)Nr�z: ...�Nonez: )r�r�rHr�Z_revertClass�_ForwardNoRecurser.r�)r�Z	retStringrwrwrxr�<s

zForward.__str__cs.|jdk	rtt|�j�St�}||K}|SdS)N)r.r�rr�)r�r�)rHrwrxr�Ms

zForward.copy)N)
r�r�r�r�r�rSrTr�r�r�r�r�r�rwrw)rHrxrs
c@seZdZdd�ZdS)rVcCsdS)Nz...rw)r�rwrwrxr�Vsz_ForwardNoRecurse.__str__N)r�r�r�r�rwrwrwrxrVUsrVcs"eZdZdZd�fdd�	Z�ZS)r-zQ
    Abstract subclass of C{ParseExpression}, for converting parsed results.
    Fcstt|�j|�d|_dS)NF)r�r-r�rW)r�r.rg)rHrwrxr�]szTokenConverter.__init__)F)r�r�r�r�r�r�rwrw)rHrxr-Yscs6eZdZdZd
�fdd�	Z�fdd�Zdd	�Z�ZS)r
a�
    Converter to concatenate all matching tokens to a single string.
    By default, the matching patterns must also be contiguous in the input string;
    this can be disabled by specifying C{'adjacent=False'} in the constructor.

    Example::
        real = Word(nums) + '.' + Word(nums)
        print(real.parseString('3.1416')) # -> ['3', '.', '1416']
        # will also erroneously match the following
        print(real.parseString('3. 1416')) # -> ['3', '.', '1416']

        real = Combine(Word(nums) + '.' + Word(nums))
        print(real.parseString('3.1416')) # -> ['3.1416']
        # no match when there are internal spaces
        print(real.parseString('3. 1416')) # -> Exception: Expected W:(0123...)
    r�Tcs8tt|�j|�|r|j�||_d|_||_d|_dS)NT)r�r
r�r��adjacentrX�
joinStringre)r�r.rXrW)rHrwrxr�rszCombine.__init__cs(|jrtj||�ntt|�j|�|S)N)rWr$r�r�r
)r�r�)rHrwrxr�|szCombine.ignorecCsP|j�}|dd�=|tdj|j|j��g|jd�7}|jrH|j�rH|gS|SdS)Nr�)r�)r�r"r�r
rXrbrVr�)r�r-r�r�ZretToksrwrwrxr��s
"zCombine.postParse)r�T)r�r�r�r�r�r�r�r�rwrw)rHrxr
as
cs(eZdZdZ�fdd�Zdd�Z�ZS)ra�
    Converter to return the matched tokens as a list - useful for returning tokens of C{L{ZeroOrMore}} and C{L{OneOrMore}} expressions.

    Example::
        ident = Word(alphas)
        num = Word(nums)
        term = ident | num
        func = ident + Optional(delimitedList(term))
        print(func.parseString("fn a,b,100"))  # -> ['fn', 'a', 'b', '100']

        func = ident + Group(Optional(delimitedList(term)))
        print(func.parseString("fn a,b,100"))  # -> ['fn', ['a', 'b', '100']]
    cstt|�j|�d|_dS)NT)r�rr�rW)r�r.)rHrwrxr��szGroup.__init__cCs|gS)Nrw)r�r-r�r�rwrwrxr��szGroup.postParse)r�r�r�r�r�r�r�rwrw)rHrxr�s
cs(eZdZdZ�fdd�Zdd�Z�ZS)raW
    Converter to return a repetitive expression as a list, but also as a dictionary.
    Each element can also be referenced using the first token in the expression as its key.
    Useful for tabular report scraping when the first column can be used as an item key.

    Example::
        data_word = Word(alphas)
        label = data_word + FollowedBy(':')
        attr_expr = Group(label + Suppress(':') + OneOrMore(data_word).setParseAction(' '.join))

        text = "shape: SQUARE posn: upper left color: light blue texture: burlap"
        attr_expr = (label + Suppress(':') + OneOrMore(data_word, stopOn=label).setParseAction(' '.join))
        
        # print attributes as plain groups
        print(OneOrMore(attr_expr).parseString(text).dump())
        
        # instead of OneOrMore(expr), parse using Dict(OneOrMore(Group(expr))) - Dict will auto-assign names
        result = Dict(OneOrMore(Group(attr_expr))).parseString(text)
        print(result.dump())
        
        # access named fields as dict entries, or output as dict
        print(result['shape'])        
        print(result.asDict())
    prints::
        ['shape', 'SQUARE', 'posn', 'upper left', 'color', 'light blue', 'texture', 'burlap']

        [['shape', 'SQUARE'], ['posn', 'upper left'], ['color', 'light blue'], ['texture', 'burlap']]
        - color: light blue
        - posn: upper left
        - shape: SQUARE
        - texture: burlap
        SQUARE
        {'color': 'light blue', 'posn': 'upper left', 'texture': 'burlap', 'shape': 'SQUARE'}
    See more examples at L{ParseResults} of accessing fields by results name.
    cstt|�j|�d|_dS)NT)r�rr�rW)r�r.)rHrwrxr��sz
Dict.__init__cCs�x�t|�D]�\}}t|�dkr q
|d}t|t�rBt|d�j�}t|�dkr^td|�||<q
t|�dkr�t|dt�r�t|d|�||<q
|j�}|d=t|�dks�t|t�r�|j	�r�t||�||<q
t|d|�||<q
W|j
r�|gS|SdS)Nrrrr�rq)r�r�rzrur�r�r�r"r�r�rV)r�r-r�r�r��tokZikeyZ	dictvaluerwrwrxr��s$
zDict.postParse)r�r�r�r�r�r�r�rwrw)rHrxr�s#c@s eZdZdZdd�Zdd�ZdS)r+aV
    Converter for ignoring the results of a parsed expression.

    Example::
        source = "a, b, c,d"
        wd = Word(alphas)
        wd_list1 = wd + ZeroOrMore(',' + wd)
        print(wd_list1.parseString(source))

        # often, delimiters that are useful during parsing are just in the
        # way afterward - use Suppress to keep them out of the parsed output
        wd_list2 = wd + ZeroOrMore(Suppress(',') + wd)
        print(wd_list2.parseString(source))
    prints::
        ['a', ',', 'b', ',', 'c', ',', 'd']
        ['a', 'b', 'c', 'd']
    (See also L{delimitedList}.)
    cCsgS)Nrw)r�r-r�r�rwrwrxr��szSuppress.postParsecCs|S)Nrw)r�rwrwrxr��szSuppress.suppressN)r�r�r�r�r�r�rwrwrwrxr+�sc@s(eZdZdZdd�Zdd�Zdd�ZdS)	rzI
    Wrapper for parse actions, to ensure they are only called once.
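
    A minimal added sketch (the action and expression names are illustrative); the wrapper
    raises a C{ParseException} if the action fires a second time, and can be re-armed by
    calling its C{reset()} method::
        def note_header(s, loc, toks):
            print("header found at", loc)
        header_once = OnlyOnce(note_header)
        header = Keyword("HEADER").setParseAction(header_once)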
    cCst|�|_d|_dS)NF)rM�callable�called)r�Z
methodCallrwrwrxr�s
zOnlyOnce.__init__cCs.|js|j|||�}d|_|St||d��dS)NTr�)r[rZr)r�r�r5rvr�rwrwrxr�s
zOnlyOnce.__call__cCs
d|_dS)NF)r[)r�rwrwrx�reset
szOnlyOnce.resetN)r�r�r�r�r�r�r\rwrwrwrxr�scs:t����fdd�}y�j|_Wntk
r4YnX|S)as
    Decorator for debugging parse actions. 
    
    When the parse action is called, this decorator will print C{">> entering I{method-name}(line:I{current_source_line}, I{parse_location}, I{matched_tokens})".}
    When the parse action completes, the decorator will print C{"<<"} followed by the returned value, or any exception that the parse action raised.

    Example::
        wd = Word(alphas)

        @traceParseAction
        def remove_duplicate_chars(tokens):
            return ''.join(sorted(set(''.join(tokens))))

        wds = OneOrMore(wd).setParseAction(remove_duplicate_chars)
        print(wds.parseString("slkdjs sld sldd sdlf sdljf"))
    prints::
        >>entering remove_duplicate_chars(line: 'slkdjs sld sldd sdlf sdljf', 0, (['slkdjs', 'sld', 'sldd', 'sdlf', 'sdljf'], {}))
        <<leaving remove_duplicate_chars (ret: 'dfjkls')
        ['dfjkls']
    cs��j}|dd�\}}}t|�dkr8|djjd|}tjjd|t||�||f�y�|�}Wn8tk
r�}ztjjd||f��WYdd}~XnXtjjd||f�|S)Nror�.z">>entering %s(line: '%s', %d, %r)
z<<leaving %s (exception: %s)
z<<leaving %s (ret: %r)
r9)r�r�rHr~�stderr�writerGrK)ZpaArgsZthisFuncr�r5rvr�r3)r�rwrx�z#sztraceParseAction.<locals>.z)rMr�r�)r�r`rw)r�rxrb
s
�,FcCs`t|�dt|�dt|�d}|rBt|t||��j|�S|tt|�|�j|�SdS)a�
    Helper to define a delimited list of expressions - the delimiter defaults to ','.
    By default, the list elements and delimiters can have intervening whitespace, and
    comments, but this can be overridden by passing C{combine=True} in the constructor.
    If C{combine} is set to C{True}, the matching tokens are returned as a single token
    string, with the delimiters included; otherwise, the matching tokens are returned
    as a list of tokens, with the delimiters suppressed.

    Example::
        delimitedList(Word(alphas)).parseString("aa,bb,cc") # -> ['aa', 'bb', 'cc']
        delimitedList(Word(hexnums), delim=':', combine=True).parseString("AA:BB:CC:DD:EE") # -> ['AA:BB:CC:DD:EE']
    z [r�z]...N)r�r
r2rir+)r.Zdelim�combineZdlNamerwrwrxr@9s
$csjt����fdd�}|dkr0tt�jdd��}n|j�}|jd�|j|dd�|�jd	t��d
�S)a:
    Helper to define a counted list of expressions.
    This helper defines a pattern of the form::
        integer expr expr expr...
    where the leading integer tells how many expr expressions follow.
    The matched tokens returns the array of expr tokens as a list - the leading count token is suppressed.
    
    If C{intExpr} is specified, it should be a pyparsing expression that produces an integer value.

    Example::
        countedArray(Word(alphas)).parseString('2 ab cd ef')  # -> ['ab', 'cd']

        # in this parser, the leading integer value is given in binary,
        # '10' indicating that 2 values are in the array
        binaryConstant = Word('01').setParseAction(lambda t: int(t[0], 2))
        countedArray(Word(alphas), intExpr=binaryConstant).parseString('10 ab cd ef')  # -> ['ab', 'cd']
    cs.|d}�|r tt�g|��p&tt�>gS)Nr)rrrC)r�r5rvr�)�	arrayExprr.rwrx�countFieldParseAction_s"z+countedArray.<locals>.countFieldParseActionNcSst|d�S)Nr)ru)rvrwrwrxrydszcountedArray.<locals>.<lambda>ZarrayLenT)rfz(len) z...)rr/rRr�r�rirxr�)r.ZintExprrdrw)rcr.rxr<Ls
cCs:g}x0|D](}t|t�r(|jt|��q
|j|�q
W|S)N)rzr�r�r�r�)�Lr�r�rwrwrxr�ks

r�cs6t���fdd�}|j|dd��jdt|���S)a*
    Helper to define an expression that is indirectly defined from
    the tokens matched in a previous expression, that is, it looks
    for a 'repeat' of a previous expression.  For example::
        first = Word(nums)
        second = matchPreviousLiteral(first)
        matchExpr = first + ":" + second
    will match C{"1:1"}, but not C{"1:2"}.  Because this matches a
    previous literal, will also match the leading C{"1:1"} in C{"1:10"}.
    If this is not desired, use C{matchPreviousExpr}.
    Do I{not} use with packrat parsing enabled.
    csP|rBt|�dkr�|d>qLt|j��}�tdd�|D��>n
�t�>dS)Nrrrcss|]}t|�VqdS)N)r)r��ttrwrwrxr��szDmatchPreviousLiteral.<locals>.copyTokenToRepeater.<locals>.<genexpr>)r�r�r�rr
)r�r5rvZtflat)�reprwrx�copyTokenToRepeater�sz1matchPreviousLiteral.<locals>.copyTokenToRepeaterT)rfz(prev) )rrxrir�)r.rhrw)rgrxrOts


csFt��|j�}�|K��fdd�}|j|dd��jdt|���S)aS
    Helper to define an expression that is indirectly defined from
    the tokens matched in a previous expression, that is, it looks
    for a 'repeat' of a previous expression.  For example::
        first = Word(nums)
        second = matchPreviousExpr(first)
        matchExpr = first + ":" + second
    will match C{"1:1"}, but not C{"1:2"}.  Because this matches by
    expressions, will I{not} match the leading C{"1:1"} in C{"1:10"};
    the expressions are evaluated first, and then compared, so
    C{"1"} is compared with C{"10"}.
    Do I{not} use with packrat parsing enabled.
    cs*t|j����fdd�}�j|dd�dS)Ncs$t|j��}|�kr tddd��dS)Nr�r)r�r�r)r�r5rvZtheseTokens)�matchTokensrwrx�mustMatchTheseTokens�szLmatchPreviousExpr.<locals>.copyTokenToRepeater.<locals>.mustMatchTheseTokensT)rf)r�r�r�)r�r5rvrj)rg)rirxrh�sz.matchPreviousExpr.<locals>.copyTokenToRepeaterT)rfz(prev) )rr�rxrir�)r.Ze2rhrw)rgrxrN�scCs>xdD]}|j|t|�}qW|jdd�}|jdd�}t|�S)Nz\^-]rz\nr(z\t)r��_bslashr�)r�r�rwrwrxr�s

rTc
s�|rdd�}dd�}t�ndd�}dd�}t�g}t|t�rF|j�}n&t|tj�r\t|�}ntj	dt
dd�|svt�Sd	}x�|t|�d
k�r||}xnt
||d
d��D]N\}}	||	|�r�|||d
=Pq�|||	�r�|||d
=|j||	�|	}Pq�W|d
7}q|W|�r�|�r�yht|�tdj|��k�rZtd
djdd�|D���jdj|��Stdjdd�|D���jdj|��SWn&tk
�r�tj	dt
dd�YnXt�fdd�|D��jdj|��S)a�
    Helper to quickly define a set of alternative Literals, and makes sure to do
    longest-first testing when there is a conflict, regardless of the input order,
    but returns a C{L{MatchFirst}} for best performance.

    Parameters:
     - strs - a string of space-delimited literals, or a collection of string literals
     - caseless - (default=C{False}) - treat all literals as caseless
     - useRegex - (default=C{True}) - as an optimization, will generate a Regex
          object; otherwise, will generate a C{MatchFirst} object (if C{caseless=True}, or
          if creating a C{Regex} raises an exception)

    Example::
        comp_oper = oneOf("< = > <= >= !=")
        var = Word(alphas)
        number = Word(nums)
        term = var | number
        comparison_expr = term + comp_oper + term
        print(comparison_expr.searchString("B = 12  AA=23 B<=AA AA>12"))
    prints::
        [['B', '=', '12'], ['AA', '=', '23'], ['B', '<=', 'AA'], ['AA', '>', '12']]
    cSs|j�|j�kS)N)r�)r�brwrwrxry�szoneOf.<locals>.<lambda>cSs|j�j|j��S)N)r�r�)rrlrwrwrxry�scSs||kS)Nrw)rrlrwrwrxry�scSs
|j|�S)N)r�)rrlrwrwrxry�sz6Invalid argument to oneOf, expected string or iterablerq)r�rrrNr�z[%s]css|]}t|�VqdS)N)r)r��symrwrwrxr��szoneOf.<locals>.<genexpr>z | �|css|]}tj|�VqdS)N)rdr	)r�rmrwrwrxr��sz7Exception creating Regex for oneOf, building MatchFirstc3s|]}�|�VqdS)Nrw)r�rm)�parseElementClassrwrxr��s)rrrzr�r�r�r5r�r�r�r�rr�r�r�r�r'rirKr)
Zstrsr�ZuseRegexZisequalZmasksZsymbolsr�Zcurr�r�rw)rorxrS�sL





((cCsttt||���S)a�
    Helper to easily and clearly define a dictionary by specifying the respective patterns
    for the key and value.  Takes care of defining the C{L{Dict}}, C{L{ZeroOrMore}}, and C{L{Group}} tokens
    in the proper order.  The key pattern can include delimiting markers or punctuation,
    as long as they are suppressed, thereby leaving the significant key text.  The value
    pattern can include named results, so that the C{Dict} results can include named token
    fields.

    Example::
        text = "shape: SQUARE posn: upper left color: light blue texture: burlap"
        attr_expr = (label + Suppress(':') + OneOrMore(data_word, stopOn=label).setParseAction(' '.join))
        print(OneOrMore(attr_expr).parseString(text).dump())
        
        attr_label = label
        attr_value = Suppress(':') + OneOrMore(data_word, stopOn=label).setParseAction(' '.join)

        # similar to Dict, but simpler call format
        result = dictOf(attr_label, attr_value).parseString(text)
        print(result.dump())
        print(result['shape'])
        print(result.shape)  # object attribute access works too
        print(result.asDict())
    prints::
        [['shape', 'SQUARE'], ['posn', 'upper left'], ['color', 'light blue'], ['texture', 'burlap']]
        - color: light blue
        - posn: upper left
        - shape: SQUARE
        - texture: burlap
        SQUARE
        SQUARE
        {'color': 'light blue', 'shape': 'SQUARE', 'posn': 'upper left', 'texture': 'burlap'}
    )rr2r)r�r�rwrwrxrA�s!cCs^t�jdd��}|j�}d|_|d�||d�}|r@dd�}ndd�}|j|�|j|_|S)	a�
    Helper to return the original, untokenized text for a given expression.  Useful to
    restore the parsed fields of an HTML start tag into the raw tag text itself, or to
    revert separate tokens with intervening whitespace back to the original matching
    input text. By default, returns a string containing the original parsed text.  
       
    If the optional C{asString} argument is passed as C{False}, then the return value is a 
    C{L{ParseResults}} containing any results names that were originally matched, and a 
    single token containing the original matched text from the input string.  So if 
    the expression passed to C{L{originalTextFor}} contains expressions with defined
    results names, you must set C{asString} to C{False} if you want to preserve those
    results name values.

    Example::
        src = "this is test <b> bold <i>text</i> </b> normal text "
        for tag in ("b","i"):
            opener,closer = makeHTMLTags(tag)
            patt = originalTextFor(opener + SkipTo(closer) + closer)
            print(patt.searchString(src)[0])
    prints::
        ['<b> bold <i>text</i> </b>']
        ['<i>text</i>']
    cSs|S)Nrw)r�r�rvrwrwrxry8sz!originalTextFor.<locals>.<lambda>F�_original_start�
_original_endcSs||j|j�S)N)rprq)r�r5rvrwrwrxry=scSs&||jd�|jd��g|dd�<dS)Nrprq)r�)r�r5rvrwrwrx�extractText?sz$originalTextFor.<locals>.extractText)r
r�r�rer])r.ZasStringZ	locMarkerZendlocMarker�	matchExprrrrwrwrxrg s

cCst|�jdd��S)zp
    Helper to undo pyparsing's default grouping of And expressions, even
    if all but one are non-empty.
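
    A minimal added sketch (values are illustrative)::
        grouped = Group(Word(alphas) + Word(nums))
        print(grouped.parseString("abc 123"))           # -> [['abc', '123']]
        print(ungroup(grouped).parseString("abc 123"))  # -> ['abc', '123']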
    cSs|dS)Nrrw)rvrwrwrxryJszungroup.<locals>.<lambda>)r-r�)r.rwrwrxrhEscCs4t�jdd��}t|d�|d�|j�j�d��S)a�
    Helper to decorate a returned token with its starting and ending locations in the input string.
    This helper adds the following results names:
     - locn_start = location where matched expression begins
     - locn_end = location where matched expression ends
     - value = the actual parsed results

    Be careful if the input text contains C{<TAB>} characters, you may want to call
    C{L{ParserElement.parseWithTabs}}

    Example::
        wd = Word(alphas)
        for match in locatedExpr(wd).searchString("ljsdf123lksdjjf123lkkjj1222"):
            print(match)
    prints::
        [[0, 'ljsdf', 5]]
        [[8, 'lksdjjf', 15]]
        [[18, 'lkkjj', 23]]
    cSs|S)Nrw)r�r5rvrwrwrxry`szlocatedExpr.<locals>.<lambda>Z
locn_startr�Zlocn_end)r
r�rr�r�)r.ZlocatorrwrwrxrjLsz\[]-*.$+^?()~ )r
cCs|ddS)Nrrrrw)r�r5rvrwrwrxryksryz\\0?[xX][0-9a-fA-F]+cCstt|djd�d��S)Nrz\0x�)�unichrru�lstrip)r�r5rvrwrwrxrylsz	\\0[0-7]+cCstt|ddd�d��S)Nrrr�)ruru)r�r5rvrwrwrxrymsz\])r�r
z\wr8rr�Znegate�bodyr	csBdd��y dj�fdd�tj|�jD��Stk
r<dSXdS)a�
    Helper to easily define string ranges for use in Word construction.  Borrows
    syntax from regexp '[]' string range definitions::
        srange("[0-9]")   -> "0123456789"
        srange("[a-z]")   -> "abcdefghijklmnopqrstuvwxyz"
        srange("[a-z$_]") -> "abcdefghijklmnopqrstuvwxyz$_"
    The input string must be enclosed in []'s, and the returned string is the expanded
    character set joined into a single string.
    The values enclosed in the []'s may be:
     - a single character
     - an escaped character with a leading backslash (such as C{\-} or C{\]})
     - an escaped hex character with a leading C{'\x'} (C{\x21}, which is a C{'!'} character) 
         (C{\0x##} is also supported for backwards compatibility) 
     - an escaped octal character with a leading C{'\0'} (C{\041}, which is a C{'!'} character)
     - a range of any of the above, separated by a dash (C{'a-z'}, etc.)
     - any combination of the above (C{'aeiouy'}, C{'a-zA-Z0-9_$'}, etc.)
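    A brief sketch (assuming pyparsing is importable; the identifier grammar is illustrative) of feeding C{srange} results into C{Word}::
        from pyparsing import Word, srange

        # first argument: allowed initial characters; second: allowed body characters
        identifier = Word(srange("[a-zA-Z_]"), srange("[a-zA-Z0-9_$]"))
        print(identifier.parseString("_price2$"))   # -> ['_price2$']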
    cSs<t|t�s|Sdjdd�tt|d�t|d�d�D��S)Nr�css|]}t|�VqdS)N)ru)r�r�rwrwrxr��sz+srange.<locals>.<lambda>.<locals>.<genexpr>rrr)rzr"r�r��ord)�prwrwrxry�szsrange.<locals>.<lambda>r�c3s|]}�|�VqdS)Nrw)r��part)�	_expandedrwrxr��szsrange.<locals>.<genexpr>N)r��_reBracketExprr�rxrK)r�rw)r|rxr_rs
 cs�fdd�}|S)zt
    Helper method for defining parse actions that require matching at a specific
    column in the input text.
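    A short illustrative sketch (assuming pyparsing is importable; the column number is arbitrary)::
        from pyparsing import Word, nums, matchOnlyAtCol

        # accept a number only if it starts in column 4 (columns are 1-based)
        col4_number = Word(nums).setParseAction(matchOnlyAtCol(4))
        print(col4_number.searchString("ab 123 456"))   # -> [['123']]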
    cs"t||��krt||d���dS)Nzmatched token not at column %d)r9r)r)Zlocnr1)r�rwrx�	verifyCol�sz!matchOnlyAtCol.<locals>.verifyColrw)r�r~rw)r�rxrM�scs�fdd�S)a�
    Helper method for common parse actions that simply return a literal value.  Especially
    useful when used with C{L{transformString<ParserElement.transformString>}()}.

    Example::
        num = Word(nums).setParseAction(lambda toks: int(toks[0]))
        na = oneOf("N/A NA").setParseAction(replaceWith(math.nan))
        term = na | num
        
        OneOrMore(term).parseString("324 234 N/A 234") # -> [324, 234, nan, 234]
    cs�gS)Nrw)r�r5rv)�replStrrwrxry�szreplaceWith.<locals>.<lambda>rw)rrw)rrxr\�scCs|ddd�S)a
    Helper parse action for removing quotation marks from parsed quoted strings.

    Example::
        # by default, quotation marks are included in parsed results
        quotedString.parseString("'Now is the Winter of our Discontent'") # -> ["'Now is the Winter of our Discontent'"]

        # use removeQuotes to strip quotation marks from parsed results
        quotedString.setParseAction(removeQuotes)
        quotedString.parseString("'Now is the Winter of our Discontent'") # -> ["Now is the Winter of our Discontent"]
    rrrrsrw)r�r5rvrwrwrxrZ�scsN��fdd�}yt�dt�d�j�}Wntk
rBt��}YnX||_|S)aG
    Helper to define a parse action by mapping a function to all elements of a ParseResults list. If any additional
    args are passed, they are forwarded to the given function as additional arguments after
    the token, as in C{hex_integer = Word(hexnums).setParseAction(tokenMap(int, 16))}, which will convert the
    parsed data to an integer using base 16.

    Example (compare the last example to the one in L{ParserElement.transformString})::
        hex_ints = OneOrMore(Word(hexnums)).setParseAction(tokenMap(int, 16))
        hex_ints.runTests('''
            00 11 22 aa FF 0a 0d 1a
            ''')
        
        upperword = Word(alphas).setParseAction(tokenMap(str.upper))
        OneOrMore(upperword).runTests('''
            my kingdom for a horse
            ''')

        wd = Word(alphas).setParseAction(tokenMap(str.title))
        OneOrMore(wd).setParseAction(' '.join).runTests('''
            now is the winter of our discontent made glorious summer by this sun of york
            ''')
    prints::
        00 11 22 aa FF 0a 0d 1a
        [0, 17, 34, 170, 255, 10, 13, 26]

        my kingdom for a horse
        ['MY', 'KINGDOM', 'FOR', 'A', 'HORSE']

        now is the winter of our discontent made glorious summer by this sun of york
        ['Now Is The Winter Of Our Discontent Made Glorious Summer By This Sun Of York']
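    As a compact follow-on (assuming pyparsing is importable), the extra-argument forwarding mentioned above, using parseString directly::
        from pyparsing import Word, hexnums, tokenMap

        # the extra argument (16) is forwarded to int() after each token
        hex_integer = Word(hexnums).setParseAction(tokenMap(int, 16))
        print(hex_integer.parseString("ff"))   # -> [255]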
    cs��fdd�|D�S)Ncsg|]}�|f����qSrwrw)r�Ztokn)r�r6rwrxr��sz(tokenMap.<locals>.pa.<locals>.<listcomp>rw)r�r5rv)r�r6rwrxr}�sztokenMap.<locals>.par�rH)rJr�rKr{)r6r�r}rLrw)r�r6rxrm�s cCst|�j�S)N)r�r�)rvrwrwrxry�scCst|�j�S)N)r��lower)rvrwrwrxry�scCs�t|t�r|}t||d�}n|j}tttd�}|r�tj�j	t
�}td�|d�tt
t|td�|���tddgd�jd	�j	d
d��td�}n�d
jdd�tD��}tj�j	t
�t|�B}td�|d�tt
t|j	t�ttd�|����tddgd�jd	�j	dd��td�}ttd�|d�}|jdd
j|jdd�j�j���jd|�}|jdd
j|jdd�j�j���jd|�}||_||_||fS)zRInternal helper to construct opening and closing tag expressions, given a tag name)r�z_-:r�tag�=�/F)r�rCcSs|ddkS)Nrr�rw)r�r5rvrwrwrxry�sz_makeTags.<locals>.<lambda>rr�css|]}|dkr|VqdS)rNrw)r�r�rwrwrxr��sz_makeTags.<locals>.<genexpr>cSs|ddkS)Nrr�rw)r�r5rvrwrwrxry�sz</r��:r�z<%s>rz</%s>)rzr�rr�r/r4r3r>r�r�rZr+rr2rrrmr�rVrYrBr
�_Lr��titler�rir�)�tagStrZxmlZresnameZtagAttrNameZtagAttrValueZopenTagZprintablesLessRAbrackZcloseTagrwrwrx�	_makeTags�s"
T\..r�cCs
t|d�S)a 
    Helper to construct opening and closing tag expressions for HTML, given a tag name. Matches
    tags in either upper or lower case, attributes with namespaces and with quoted or unquoted values.

    Example::
        text = '<td>More info at the <a href="http://pyparsing.wikispaces.com">pyparsing</a> wiki page</td>'
        # makeHTMLTags returns pyparsing expressions for the opening and closing tags as a 2-tuple
        a,a_end = makeHTMLTags("A")
        link_expr = a + SkipTo(a_end)("link_text") + a_end
        
        for link in link_expr.searchString(text):
            # attributes in the <A> tag (like "href" shown here) are also accessible as named results
            print(link.link_text, '->', link.href)
    prints::
        pyparsing -> http://pyparsing.wikispaces.com
    F)r�)r�rwrwrxrK�scCs
t|d�S)z�
    Helper to construct opening and closing tag expressions for XML, given a tag name. Matches
    tags only in the given upper/lower case.

    Example: similar to L{makeHTMLTags}
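    A minimal concrete sketch (assuming pyparsing is importable; the XML snippet and names are illustrative)::
        from pyparsing import makeXMLTags, SkipTo

        text = '<book isbn="123">Parsing Techniques</book>'
        book, book_end = makeXMLTags("book")
        title_expr = book + SkipTo(book_end)("title") + book_end

        for hit in title_expr.searchString(text):
            # attributes of the start tag are available as named results
            print(hit.title, '|', hit.isbn)   # -> Parsing Techniques | 123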
    T)r�)r�rwrwrxrLscs8|r|dd��n|j��dd��D���fdd�}|S)a<
    Helper to create a validating parse action to be used with start tags created
    with C{L{makeXMLTags}} or C{L{makeHTMLTags}}. Use C{withAttribute} to qualify a starting tag
    with a required attribute value, to avoid false matches on common tags such as
    C{<TD>} or C{<DIV>}.

    Call C{withAttribute} with a series of attribute names and values. Specify the list
    of filter attribute names and values as:
     - keyword arguments, as in C{(align="right")}, or
     - as an explicit dict with C{**} operator, when an attribute name is also a Python
          reserved word, as in C{**{"class":"Customer", "align":"right"}}
     - a list of name-value tuples, as in ( ("ns1:class", "Customer"), ("ns2:align","right") )
    For attribute names with a namespace prefix, you must use the second form.  Attribute
    names are matched insensitive to upper/lower case.
       
    If just testing for C{class} (with or without a namespace), use C{L{withClass}}.

    To verify that the attribute exists, but without specifying a value, pass
    C{withAttribute.ANY_VALUE} as the value.

    Example::
        html = '''
            <div>
            Some text
            <div type="grid">1 4 0 1 0</div>
            <div type="graph">1,3 2,3 1,1</div>
            <div>this has no type</div>
            </div>
                
        '''
        div,div_end = makeHTMLTags("div")

        # only match div tag having a type attribute with value "grid"
        div_grid = div().setParseAction(withAttribute(type="grid"))
        grid_expr = div_grid + SkipTo(div | div_end)("body")
        for grid_header in grid_expr.searchString(html):
            print(grid_header.body)
        
        # construct a match with any div tag having a type attribute, regardless of the value
        div_any_type = div().setParseAction(withAttribute(type=withAttribute.ANY_VALUE))
        div_expr = div_any_type + SkipTo(div | div_end)("body")
        for div_header in div_expr.searchString(html):
            print(div_header.body)
    prints::
        1 4 0 1 0

        1 4 0 1 0
        1,3 2,3 1,1
    NcSsg|]\}}||f�qSrwrw)r�r�r�rwrwrxr�Qsz!withAttribute.<locals>.<listcomp>cs^xX�D]P\}}||kr&t||d|��|tjkr|||krt||d||||f��qWdS)Nzno matching attribute z+attribute '%s' has value '%s', must be '%s')rre�	ANY_VALUE)r�r5r�ZattrNameZ	attrValue)�attrsrwrxr}RszwithAttribute.<locals>.pa)r�)r�ZattrDictr}rw)r�rxres2cCs|rd|nd}tf||i�S)a�
    Simplified version of C{L{withAttribute}} when matching on a div class - made
    difficult because C{class} is a reserved word in Python.

    Example::
        html = '''
            <div>
            Some text
            <div class="grid">1 4 0 1 0</div>
            <div class="graph">1,3 2,3 1,1</div>
            <div>this &lt;div&gt; has no class</div>
            </div>
                
        '''
        div,div_end = makeHTMLTags("div")
        div_grid = div().setParseAction(withClass("grid"))
        
        grid_expr = div_grid + SkipTo(div | div_end)("body")
        for grid_header in grid_expr.searchString(html):
            print(grid_header.body)
        
        div_any_type = div().setParseAction(withClass(withAttribute.ANY_VALUE))
        div_expr = div_any_type + SkipTo(div | div_end)("body")
        for div_header in div_expr.searchString(html):
            print(div_header.body)
    prints::
        1 4 0 1 0

        1 4 0 1 0
        1,3 2,3 1,1
    z%s:class�class)re)Z	classname�	namespaceZ	classattrrwrwrxrk\s �(rcCs�t�}||||B}�x`t|�D�]R\}}|ddd�\}}	}
}|	dkrTd|nd|}|	dkr�|dksxt|�dkr�td��|\}
}t�j|�}|
tjk�rd|	dkr�t||�t|t	|��}n�|	dk�r|dk	�rt|||�t|t	||��}nt||�t|t	|��}nD|	dk�rZt||
|||�t||
|||�}ntd	��n�|
tj
k�rH|	dk�r�t|t��s�t|�}t|j
|�t||�}n�|	dk�r|dk	�r�t|||�t|t	||��}nt||�t|t	|��}nD|	dk�r>t||
|||�t||
|||�}ntd	��ntd
��|�r`|j|�||j|�|BK}|}q"W||K}|S)a�	
    Helper method for constructing grammars of expressions made up of
    operators working in a precedence hierarchy.  Operators may be unary or
    binary, left- or right-associative.  Parse actions can also be attached
    to operator expressions. The generated parser will also recognize the use 
    of parentheses to override operator precedences (see example below).
    
    Note: if you define a deep operator list, you may see performance issues
    when using infixNotation. See L{ParserElement.enablePackrat} for a
    mechanism to potentially improve your parser performance.

    Parameters:
     - baseExpr - expression representing the most basic element for the nested expression grammar
     - opList - list of tuples, one for each operator precedence level in the
      expression grammar; each tuple is of the form
      (opExpr, numTerms, rightLeftAssoc, parseAction), where:
       - opExpr is the pyparsing expression for the operator;
          may also be a string, which will be converted to a Literal;
          if numTerms is 3, opExpr is a tuple of two expressions, for the
          two operators separating the 3 terms
       - numTerms is the number of terms for this operator (must
          be 1, 2, or 3)
       - rightLeftAssoc indicates whether the operator is
          right- or left-associative, using the pyparsing-defined
          constants C{opAssoc.RIGHT} and C{opAssoc.LEFT}.
       - parseAction is the parse action to be associated with
          expressions matching this operator expression (the
          parse action tuple member may be omitted)
     - lpar - expression for matching left-parentheses (default=C{Suppress('(')})
     - rpar - expression for matching right-parentheses (default=C{Suppress(')')})

    Example::
        # simple example of four-function arithmetic with ints and variable names
        integer = pyparsing_common.signed_integer
        varname = pyparsing_common.identifier 
        
        arith_expr = infixNotation(integer | varname,
            [
            ('-', 1, opAssoc.RIGHT),
            (oneOf('* /'), 2, opAssoc.LEFT),
            (oneOf('+ -'), 2, opAssoc.LEFT),
            ])
        
        arith_expr.runTests('''
            5+3*6
            (5+3)*6
            -2--11
            ''', fullDump=False)
    prints::
        5+3*6
        [[5, '+', [3, '*', 6]]]

        (5+3)*6
        [[[5, '+', 3], '*', 6]]

        -2--11
        [[['-', 2], '-', ['-', 11]]]
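    A follow-on sketch (assuming pyparsing is importable) of the optional parse-action tuple member mentioned above; C{fold_left} is an illustrative helper, not part of pyparsing::
        from pyparsing import infixNotation, opAssoc, oneOf, pyparsing_common

        integer = pyparsing_common.integer

        def fold_left(tokens):
            # evaluate a left-associative chain such as [8, '/', 2, '*', 3]
            t = tokens[0]
            value = t[0]
            for op, rhs in zip(t[1::2], t[2::2]):
                value = value * rhs if op == '*' else value / rhs
            return value

        arith = infixNotation(integer, [
            (oneOf('* /'), 2, opAssoc.LEFT, fold_left),
        ])
        print(arith.parseString("8/2*3"))   # -> [12.0]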
    Nrroz%s termz	%s%s termrqz@if numterms=3, opExpr must be a tuple or list of two expressionsrrz6operator must be unary (1), binary (2), or ternary (3)z2operator must indicate right or left associativity)N)rr�r�r�rirT�LEFTrrr�RIGHTrzrr.r�)ZbaseExprZopListZlparZrparr�ZlastExprr�ZoperDefZopExprZarityZrightLeftAssocr}ZtermNameZopExpr1ZopExpr2ZthisExprrsrwrwrxri�sR;

&




&


z4"(?:[^"\n\r\\]|(?:"")|(?:\\(?:[^x]|x[0-9a-fA-F]+)))*�"z string enclosed in double quotesz4'(?:[^'\n\r\\]|(?:'')|(?:\\(?:[^x]|x[0-9a-fA-F]+)))*�'z string enclosed in single quotesz*quotedString using single or double quotes�uzunicode string literalcCs�||krtd��|dk�r(t|t�o,t|t��r t|�dkr�t|�dkr�|dk	r�tt|t||tjdd���j	dd��}n$t
j�t||tj�j	dd��}nx|dk	r�tt|t|�t|�ttjdd���j	dd��}n4ttt|�t|�ttjdd���j	d	d��}ntd
��t
�}|dk	�rb|tt|�t||B|B�t|��K}n$|tt|�t||B�t|��K}|jd||f�|S)a~	
    Helper method for defining nested lists enclosed in opening and closing
    delimiters ("(" and ")" are the default).

    Parameters:
     - opener - opening character for a nested list (default=C{"("}); can also be a pyparsing expression
     - closer - closing character for a nested list (default=C{")"}); can also be a pyparsing expression
     - content - expression for items within the nested lists (default=C{None})
     - ignoreExpr - expression for ignoring opening and closing delimiters (default=C{quotedString})

    If an expression is not provided for the content argument, the nested
    expression will capture all whitespace-delimited content between delimiters
    as a list of separate values.

    Use the C{ignoreExpr} argument to define expressions that may contain
    opening or closing characters that should not be treated as opening
    or closing characters for nesting, such as quotedString or a comment
    expression.  Specify multiple expressions using an C{L{Or}} or C{L{MatchFirst}}.
    The default is L{quotedString}, but if no expressions are to be ignored,
    then pass C{None} for this argument.

    Example::
        data_type = oneOf("void int short long char float double")
        decl_data_type = Combine(data_type + Optional(Word('*')))
        ident = Word(alphas+'_', alphanums+'_')
        number = pyparsing_common.number
        arg = Group(decl_data_type + ident)
        LPAR,RPAR = map(Suppress, "()")

        code_body = nestedExpr('{', '}', ignoreExpr=(quotedString | cStyleComment))

        c_function = (decl_data_type("type") 
                      + ident("name")
                      + LPAR + Optional(delimitedList(arg), [])("args") + RPAR 
                      + code_body("body"))
        c_function.ignore(cStyleComment)
        
        source_code = '''
            int is_odd(int x) { 
                return (x%2); 
            }
                
            int dec_to_hex(char hchar) { 
                if (hchar >= '0' && hchar <= '9') { 
                    return (ord(hchar)-ord('0')); 
                } else { 
                    return (10+ord(hchar)-ord('A'));
                } 
            }
        '''
        for func in c_function.searchString(source_code):
            print("%(name)s (%(type)s) args: %(args)s" % func)

    prints::
        is_odd (int) args: [['int', 'x']]
        dec_to_hex (int) args: [['char', 'hchar']]
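    A minimal default-arguments sketch (assuming pyparsing is importable)::
        from pyparsing import nestedExpr

        # with no content expression, items between the delimiters are captured
        # as whitespace-separated strings, nested according to the parentheses
        print(nestedExpr().parseString("(a (b c) d)"))
        # -> [['a', ['b', 'c'], 'd']]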
    z.opening and closing strings cannot be the sameNrr)r
cSs|dj�S)Nr)r�)rvrwrwrxry9sznestedExpr.<locals>.<lambda>cSs|dj�S)Nr)r�)rvrwrwrxry<scSs|dj�S)Nr)r�)rvrwrwrxryBscSs|dj�S)Nr)r�)rvrwrwrxryFszOopening and closing arguments must be strings if no content expression is givenznested %s%s expression)r�rzr�r�r
rr	r$rNr�rCr�rrrr+r2ri)�openerZcloserZcontentrOr�rwrwrxrP�s4:

*$cs��fdd�}�fdd�}�fdd�}tt�jd�j��}t�t�j|�jd�}t�j|�jd	�}t�j|�jd
�}	|r�tt|�|t|t|�t|��|	�}
n$tt|�t|t|�t|���}
|j	t
t��|
jd�S)a
	
    Helper method for defining space-delimited indentation blocks, such as
    those used to define block statements in Python source code.

    Parameters:
     - blockStatementExpr - expression defining syntax of statement that
            is repeated within the indented block
     - indentStack - list created by caller to manage indentation stack
            (multiple statementWithIndentedBlock expressions within a single grammar
            should share a common indentStack)
     - indent - boolean indicating whether block must be indented beyond
            the current level; set to False for block of left-most statements
            (default=C{True})

    A valid block must contain at least one C{blockStatement}.

    Example::
        data = '''
        def A(z):
          A1
          B = 100
          G = A2
          A2
          A3
        B
        def BB(a,b,c):
          BB1
          def BBA():
            bba1
            bba2
            bba3
        C
        D
        def spam(x,y):
             def eggs(z):
                 pass
        '''


        indentStack = [1]
        stmt = Forward()

        identifier = Word(alphas, alphanums)
        funcDecl = ("def" + identifier + Group( "(" + Optional( delimitedList(identifier) ) + ")" ) + ":")
        func_body = indentedBlock(stmt, indentStack)
        funcDef = Group( funcDecl + func_body )

        rvalue = Forward()
        funcCall = Group(identifier + "(" + Optional(delimitedList(rvalue)) + ")")
        rvalue << (funcCall | identifier | Word(nums))
        assignment = Group(identifier + "=" + rvalue)
        stmt << ( funcDef | assignment | identifier )

        module_body = OneOrMore(stmt)

        parseTree = module_body.parseString(data)
        parseTree.pprint()
    prints::
        [['def',
          'A',
          ['(', 'z', ')'],
          ':',
          [['A1'], [['B', '=', '100']], [['G', '=', 'A2']], ['A2'], ['A3']]],
         'B',
         ['def',
          'BB',
          ['(', 'a', 'b', 'c', ')'],
          ':',
          [['BB1'], [['def', 'BBA', ['(', ')'], ':', [['bba1'], ['bba2'], ['bba3']]]]]],
         'C',
         'D',
         ['def',
          'spam',
          ['(', 'x', 'y', ')'],
          ':',
          [[['def', 'eggs', ['(', 'z', ')'], ':', [['pass']]]]]]] 
    csN|t|�krdSt||�}|�dkrJ|�dkr>t||d��t||d��dS)Nrrzillegal nestingznot a peer entryrsrs)r�r9r!r)r�r5rv�curCol)�indentStackrwrx�checkPeerIndent�s
z&indentedBlock.<locals>.checkPeerIndentcs2t||�}|�dkr"�j|�nt||d��dS)Nrrznot a subentryrs)r9r�r)r�r5rvr�)r�rwrx�checkSubIndent�s
z%indentedBlock.<locals>.checkSubIndentcsN|t|�krdSt||�}�o4|�dko4|�dksBt||d���j�dS)Nrrrqznot an unindentrsr:)r�r9rr�)r�r5rvr�)r�rwrx�
checkUnindent�s
z$indentedBlock.<locals>.checkUnindentz	 �INDENTr�ZUNINDENTzindented block)rrr�r�r
r�rirrr�rk)ZblockStatementExprr�rr�r�r�r!r�ZPEERZUNDENTZsmExprrw)r�rxrfQsN,z#[\0xc0-\0xd6\0xd8-\0xf6\0xf8-\0xff]z[\0xa1-\0xbf\0xd7\0xf7]z_:zany tagzgt lt amp nbsp quot aposz><& "'z&(?P<entity>rnz);zcommon HTML entitycCstj|j�S)zRHelper parser action to replace common HTML entities with their special characters)�_htmlEntityMapr�Zentity)rvrwrwrxr[�sz/\*(?:[^*]|\*(?!/))*z*/zC style commentz<!--[\s\S]*?-->zHTML commentz.*zrest of linez//(?:\\\n|[^\n])*z
// commentzC++ style commentz#.*zPython style comment)r�z 	�	commaItem)r�c@s�eZdZdZee�Zee�Ze	e
�jd�je�Z
e	e�jd�jeed��Zed�jd�je�Ze�je�de�je�jd�Zejd	d
��eeeed�j�e�Bjd�Zeje�ed
�jd�je�Zed�jd�je�ZeeBeBj�Zed�jd�je�Ze	eded�jd�Zed�jd�Z ed�jd�Z!e!de!djd�Z"ee!de!d>�dee!de!d?�jd�Z#e#j$d d
��d!e jd"�Z%e&e"e%Be#Bjd#��jd#�Z'ed$�jd%�Z(e)d@d'd(��Z*e)dAd*d+��Z+ed,�jd-�Z,ed.�jd/�Z-ed0�jd1�Z.e/j�e0j�BZ1e)d2d3��Z2e&e3e4d4�e5�e	e6d4d5�ee7d6����j�jd7�Z8e9ee:j;�e8Bd8d9��jd:�Z<e)ed;d
���Z=e)ed<d
���Z>d=S)Brna�

    Here are some common low-level expressions that may be useful in jump-starting parser development:
     - numeric forms (L{integers<integer>}, L{reals<real>}, L{scientific notation<sci_real>})
     - common L{programming identifiers<identifier>}
     - network addresses (L{MAC<mac_address>}, L{IPv4<ipv4_address>}, L{IPv6<ipv6_address>})
     - ISO8601 L{dates<iso8601_date>} and L{datetime<iso8601_datetime>}
     - L{UUID<uuid>}
     - L{comma-separated list<comma_separated_list>}
    Parse actions:
     - C{L{convertToInteger}}
     - C{L{convertToFloat}}
     - C{L{convertToDate}}
     - C{L{convertToDatetime}}
     - C{L{stripHTMLTags}}
     - C{L{upcaseTokens}}
     - C{L{downcaseTokens}}

    Example::
        pyparsing_common.number.runTests('''
            # any int or real number, returned as the appropriate type
            100
            -100
            +100
            3.14159
            6.02e23
            1e-12
            ''')

        pyparsing_common.fnumber.runTests('''
            # any int or real number, returned as float
            100
            -100
            +100
            3.14159
            6.02e23
            1e-12
            ''')

        pyparsing_common.hex_integer.runTests('''
            # hex numbers
            100
            FF
            ''')

        pyparsing_common.fraction.runTests('''
            # fractions
            1/2
            -3/4
            ''')

        pyparsing_common.mixed_integer.runTests('''
            # mixed fractions
            1
            1/2
            -3/4
            1-3/4
            ''')

        import uuid
        pyparsing_common.uuid.setParseAction(tokenMap(uuid.UUID))
        pyparsing_common.uuid.runTests('''
            # uuid
            12345678-1234-5678-1234-567812345678
            ''')
    prints::
        # any int or real number, returned as the appropriate type
        100
        [100]

        -100
        [-100]

        +100
        [100]

        3.14159
        [3.14159]

        6.02e23
        [6.02e+23]

        1e-12
        [1e-12]

        # any int or real number, returned as float
        100
        [100.0]

        -100
        [-100.0]

        +100
        [100.0]

        3.14159
        [3.14159]

        6.02e23
        [6.02e+23]

        1e-12
        [1e-12]

        # hex numbers
        100
        [256]

        FF
        [255]

        # fractions
        1/2
        [0.5]

        -3/4
        [-0.75]

        # mixed fractions
        1
        [1]

        1/2
        [0.5]

        -3/4
        [-0.75]

        1-3/4
        [1.75]

        # uuid
        12345678-1234-5678-1234-567812345678
        [UUID('12345678-1234-5678-1234-567812345678')]
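    A compact parseString-based sketch of a few of these expressions (assuming pyparsing is importable)::
        from pyparsing import pyparsing_common as ppc

        print(ppc.number.parseString("6.02e23")[0])          # -> 6.02e+23 (float)
        print(ppc.integer.parseString("100")[0])             # -> 100 (int)
        print(ppc.ipv4_address.parseString("127.0.0.1")[0])  # -> 127.0.0.1
        print(ppc.iso8601_date.parseString("1999-12-31")[0]) # -> 1999-12-31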
    �integerzhex integerrtz[+-]?\d+zsigned integerr��fractioncCs|d|dS)Nrrrrsrw)rvrwrwrxry�szpyparsing_common.<lambda>r8z"fraction or mixed integer-fractionz
[+-]?\d+\.\d*zreal numberz+[+-]?\d+([eE][+-]?\d+|\.\d*([eE][+-]?\d+)?)z$real number with scientific notationz[+-]?\d+\.?\d*([eE][+-]?\d+)?�fnumberrB�
identifierzK(25[0-5]|2[0-4][0-9]|1?[0-9]{1,2})(\.(25[0-5]|2[0-4][0-9]|1?[0-9]{1,2})){3}zIPv4 addressz[0-9a-fA-F]{1,4}�hex_integerr��zfull IPv6 addressrrBz::zshort IPv6 addresscCstdd�|D��dkS)Ncss|]}tjj|�rdVqdS)rrN)rn�
_ipv6_partr�)r�rfrwrwrxr��sz,pyparsing_common.<lambda>.<locals>.<genexpr>rw)rH)rvrwrwrxry�sz::ffff:zmixed IPv6 addresszIPv6 addressz:[0-9a-fA-F]{2}([:.-])[0-9a-fA-F]{2}(?:\1[0-9a-fA-F]{2}){4}zMAC address�%Y-%m-%dcs�fdd�}|S)a�
        Helper to create a parse action for converting parsed date string to Python datetime.date

        Params -
         - fmt - format to be passed to datetime.strptime (default=C{"%Y-%m-%d"})

        Example::
            date_expr = pyparsing_common.iso8601_date.copy()
            date_expr.setParseAction(pyparsing_common.convertToDate())
            print(date_expr.parseString("1999-12-31"))
        prints::
            [datetime.date(1999, 12, 31)]
        csLytj|d��j�Stk
rF}zt||t|���WYdd}~XnXdS)Nr)r�strptimeZdater�rr{)r�r5rv�ve)�fmtrwrx�cvt_fn�sz.pyparsing_common.convertToDate.<locals>.cvt_fnrw)r�r�rw)r�rx�
convertToDate�szpyparsing_common.convertToDate�%Y-%m-%dT%H:%M:%S.%fcs�fdd�}|S)a
        Helper to create a parse action for converting parsed datetime string to Python datetime.datetime

        Params -
         - fmt - format to be passed to datetime.strptime (default=C{"%Y-%m-%dT%H:%M:%S.%f"})

        Example::
            dt_expr = pyparsing_common.iso8601_datetime.copy()
            dt_expr.setParseAction(pyparsing_common.convertToDatetime())
            print(dt_expr.parseString("1999-12-31T23:59:59.999"))
        prints::
            [datetime.datetime(1999, 12, 31, 23, 59, 59, 999000)]
        csHytj|d��Stk
rB}zt||t|���WYdd}~XnXdS)Nr)rr�r�rr{)r�r5rvr�)r�rwrxr��sz2pyparsing_common.convertToDatetime.<locals>.cvt_fnrw)r�r�rw)r�rx�convertToDatetime�sz"pyparsing_common.convertToDatetimez7(?P<year>\d{4})(?:-(?P<month>\d\d)(?:-(?P<day>\d\d))?)?zISO8601 datez�(?P<year>\d{4})-(?P<month>\d\d)-(?P<day>\d\d)[T ](?P<hour>\d\d):(?P<minute>\d\d)(:(?P<second>\d\d(\.\d*)?)?)?(?P<tz>Z|[+-]\d\d:?\d\d)?zISO8601 datetimez2[0-9a-fA-F]{8}(-[0-9a-fA-F]{4}){3}-[0-9a-fA-F]{12}�UUIDcCstjj|d�S)a
        Parse action to remove HTML tags from web page HTML source

        Example::
            # strip HTML links from normal text 
            text = '<td>More info at the <a href="http://pyparsing.wikispaces.com">pyparsing</a> wiki page</td>'
            td,td_end = makeHTMLTags("TD")
            table_text = td + SkipTo(td_end).setParseAction(pyparsing_common.stripHTMLTags)("body") + td_end
            
            print(table_text.parseString(text).body) # -> 'More info at the pyparsing wiki page'
        r)rn�_html_stripperr�)r�r5r�rwrwrx�
stripHTMLTags�s
zpyparsing_common.stripHTMLTagsra)r�z 	r�r�)r�zcomma separated listcCst|�j�S)N)r�r�)rvrwrwrxry�scCst|�j�S)N)r�r�)rvrwrwrxry�sN)rrB)rrB)r�)r�)?r�r�r�r�rmruZconvertToInteger�floatZconvertToFloatr/rRrir�r�rDr�r'Zsigned_integerr�rxrr�Z
mixed_integerrH�realZsci_realr��numberr�r4r3r�Zipv4_addressr�Z_full_ipv6_addressZ_short_ipv6_addressr~Z_mixed_ipv6_addressr
Zipv6_addressZmac_addressr�r�r�Ziso8601_dateZiso8601_datetime�uuidr7r6r�r�rrrrVr.�
_commasepitemr@rYr�Zcomma_separated_listrdrBrwrwrwrxrn�sN""
28�__main__Zselect�fromz_$r])rb�columnsrjZtablesZcommandaK
        # '*' as column list and dotted table name
        select * from SYS.XYZZY

        # caseless match on "SELECT", and casts back to "select"
        SELECT * from XYZZY, ABC

        # list of column names, and mixed case SELECT keyword
        Select AA,BB,CC from Sys.dual

        # multiple tables
        Select A, B, C from Sys.dual, Table2

        # invalid SELECT keyword - should fail
        Xelect A, B, C from Sys.dual

        # incomplete command - should fail
        Select

        # invalid column name - should fail
        Select ^^^ frox Sys.dual

        z]
        100
        -100
        +100
        3.14159
        6.02e23
        1e-12
        z 
        100
        FF
        z6
        12345678-1234-5678-1234-567812345678
        )rq)raF)N)FT)T)r�)T)�r��__version__Z__versionTime__�
__author__r��weakrefrr�r�r~r�rdrr�r"r<r�r�_threadr�ImportErrorZ	threadingrr�Zordereddict�__all__r��version_infor;r�maxsizer�r{r��chrrur�rHr�r�reversedr�r�rr6rrrIZmaxintZxranger�Z__builtin__r�Zfnamer�rJr�r�r�r�r�r�Zascii_uppercaseZascii_lowercaser4rRrDr3rkr�Z	printablerVrKrrr!r#r&r�r"�MutableMapping�registerr9rJrGr/r2r4rQrMr$r,r
rrr�rQrrrrlr/r'r%r	r.r0rrrr*r)r1r0r rrrrrrrrJrr2rMrNrr(rrVr-r
rrr+rrbr@r<r�rOrNrrSrArgrhrjrirCrIrHrar`r�Z_escapedPuncZ_escapedHexCharZ_escapedOctChar�UNICODEZ_singleCharZ
_charRangermr}r_rMr\rZrmrdrBr�rKrLrer�rkrTr�r�rirUr>r^rYrcrPrfr5rWr7r6r�r�r�r�r;r[r8rEr�r]r?r=rFrXr�r�r:rnr�ZselectTokenZ	fromTokenZidentZ
columnNameZcolumnNameListZ
columnSpecZ	tableNameZ
tableNameListZ	simpleSQLr�r�r�r�r�r�rwrwrwrx�<module>=s�









8


@d&A= I
G3pLOD|M &#@sQ,A,	I#%&0
,	?#kZr

 (
 0 


"site-packages/pip/_vendor/__pycache__/ipaddress.cpython-36.opt-1.pyc000064400000201213147511334610021267 0ustar003

���e09�@s`dZddlmZddlZddlZdZefZyeefZWne	k
rJYnXye
ZWne	k
rleZYnXdOdkr�dd�Z
ndd�Z
y
ejZWnek
r�d	d
�ZYnXdd�Zeed
�r�dd�Zndd�ZdPdd�ZGdd�de�ZdZdZGdd�de�ZGdd�de�Zdd�ZdQdd �Zd!d"�Zd#d$�Zd%d&�Z d'd(�Z!d)d*�Z"d+d,�Z#d-d.�Z$d/d0�Z%d1d2�Z&d3d4�Z'Gd5d6�d6e�Z(Gd7d8�d8e(�Z)Gd9d:�d:e(�Z*Gd;d<�d<e�Z+Gd=d>�d>e+e)�Z,Gd?d@�d@e,�Z-GdAdB�dBe+e*�Z.GdCdD�dDe�Z/e/e,_0GdEdF�dFe�Z1GdGdH�dHe1e)�Z2GdIdJ�dJe2�Z3GdKdL�dLe1e*�Z4GdMdN�dNe�Z5e5e2_0dS)Rz�A fast, lightweight IPv4/IPv6 manipulation library in Python.

This library is used to create/poke/manipulate IPv4 and IPv6 addresses
and networks.

�)�unicode_literalsNz1.0.17�cCs|S)N�)�bytrr�/usr/lib/python3.6/ipaddress.py�_compat_bytes_to_byte_valssrcCsdd�|D�S)NcSsg|]}tjd|�d�qS)s!Br)�struct�unpack)�.0�brrr�
<listcomp>#sz._compat_bytes_to_byte_vals.<locals>.<listcomp>r)rrrrr"scCs"d}x|D]}|d>|}q
W|S)Nr�r)Zbytvals�	endianess�resZbvrrr�_compat_int_from_byte_vals's
rcCst|dkr.|dks|d
kr"tjd��tjd|�S|dkrj|dksJ|ddkrTtjd	��tjd
|d?|d@�St��dS)N�r�� z(integer out of range for 'I' format codes!I��z)integer out of range for 'QQ' format codes!QQ�@l����l)r�error�pack�NotImplementedError)ZintvalZlengthrrrr�_compat_to_bytes0s

r�
bit_lengthcCs|j�S)N)r)�irrr�_compat_bit_length?srcCs&x tj�D]}||?dkr
|Sq
WdS)Nr)�	itertools�count)rrrrrrBs�ccs$|}x||kr|V||7}qWdS)Nr)�start�end�steprrrr�
_compat_rangeHs
r$c@s@eZdZfZdd�Zdd�Zdd�Zdd�Zd	d
�Zdd�Z	d
S)�_TotalOrderingMixincCst�dS)N)r)�self�otherrrr�__eq__Wsz_TotalOrderingMixin.__eq__cCs|j|�}|tkrtS|S)N)r(�NotImplemented)r&r'�equalrrr�__ne__Zs
z_TotalOrderingMixin.__ne__cCst�dS)N)r)r&r'rrr�__lt__`sz_TotalOrderingMixin.__lt__cCs&|j|�}|tks|r"|j|�S|S)N)r,r)r()r&r'�lessrrr�__le__cs

z_TotalOrderingMixin.__le__cCs6|j|�}|tkrtS|j|�}|tkr,tS|p2|S)N)r,r)r()r&r'r-r*rrr�__gt__is

z_TotalOrderingMixin.__gt__cCs|j|�}|tkrtS|S)N)r,r))r&r'r-rrr�__ge__rs
z_TotalOrderingMixin.__ge__N)
�__name__�
__module__�__qualname__�	__slots__r(r+r,r.r/r0rrrrr%Ps	r%rrc@seZdZdZdS)�AddressValueErrorz%A Value Error related to the address.N)r1r2r3�__doc__rrrrr5}sr5c@seZdZdZdS)�NetmaskValueErrorz%A Value Error related to the netmask.N)r1r2r3r6rrrrr7�sr7cCsjyt|�Sttfk
r YnXyt|�Sttfk
rBYnXt|t�rZtd|��td|��dS)a�Take an IP string/int and return an object of the correct type.

    Args:
        address: A string or integer, the IP address.  Either IPv4 or
          IPv6 addresses may be supplied; integers less than 2**32 will
          be considered to be IPv4 by default.

    Returns:
        An IPv4Address or IPv6Address object.

    Raises:
        ValueError: if the *address* passed isn't either a v4 or a v6
          address

    zx%r does not appear to be an IPv4 or IPv6 address. Did you pass in a bytes (str in Python 2) instead of a unicode object?z0%r does not appear to be an IPv4 or IPv6 addressN)�IPv4Addressr5r7�IPv6Address�
isinstance�bytes�
ValueError)�addressrrr�
ip_address�s
r>TcCsny
t||�Sttfk
r"YnXy
t||�Sttfk
rFYnXt|t�r^td|��td|��dS)a�Take an IP string/int and return an object of the correct type.

    Args:
        address: A string or integer, the IP network.  Either IPv4 or
          IPv6 networks may be supplied; integers less than 2**32 will
          be considered to be IPv4 by default.

    Returns:
        An IPv4Network or IPv6Network object.

    Raises:
        ValueError: if the string passed isn't either a v4 or a v6
          address. Or if the network has host bits set.

    zx%r does not appear to be an IPv4 or IPv6 network. Did you pass in a bytes (str in Python 2) instead of a unicode object?z0%r does not appear to be an IPv4 or IPv6 networkN)�IPv4Networkr5r7�IPv6Networkr:r;r<)r=�strictrrr�
ip_network�s


rBcCsTyt|�Sttfk
r YnXyt|�Sttfk
rBYnXtd|��dS)agTake an IP string/int and return an object of the correct type.

    Args:
        address: A string or integer, the IP address.  Either IPv4 or
          IPv6 addresses may be supplied; integers less than 2**32 will
          be considered to be IPv4 by default.

    Returns:
        An IPv4Interface or IPv6Interface object.

    Raises:
        ValueError: if the string passed isn't either a v4 or a v6
          address.

    Notes:
        The IPv?Interface classes describe an Address on a particular
        Network, so they're basically a combination of both the Address
        and Network classes.

    z2%r does not appear to be an IPv4 or IPv6 interfaceN)�
IPv4Interfacer5r7�
IPv6Interfacer<)r=rrr�ip_interface�srEcCs4yt|dd�Stjtfk
r.td��YnXdS)a`Represent an address as 4 packed bytes in network (big-endian) order.

    Args:
        address: An integer representation of an IPv4 IP address.

    Returns:
        The integer address packed as 4 bytes in network (big-endian) order.

    Raises:
        ValueError: If the integer is negative or too large to be an
          IPv4 IP address.

    r�bigz&Address negative or too large for IPv4N)rrr�
OverflowErrorr<)r=rrr�v4_int_to_packed�srHcCs4yt|dd�Stjtfk
r.td��YnXdS)z�Represent an address as 16 packed bytes in network (big-endian) order.

    Args:
        address: An integer representation of an IPv6 IP address.

    Returns:
        The integer address packed as 16 bytes in network (big-endian) order.

    rrFz&Address negative or too large for IPv6N)rrrrGr<)r=rrr�v6_int_to_packeds
rIcCs*t|�jd�}t|�dkr&td|��|S)zAHelper to split the netmask and raise AddressValueError if needed�/rzOnly one '/' permitted in %r)�_compat_str�split�lenr5)r=�addrrrr�_split_optional_netmasksrOccsRt|�}t|�}}x.|D]&}|j|jdkr<||fV|}|}qW||fVdS)z�Find a sequence of sorted deduplicated IPv#Address.

    Args:
        addresses: a list of IPv#Address objects.

    Yields:
        A tuple containing the first and last IP addresses in the sequence.

    r N)�iter�next�_ip)�	addresses�it�first�last�iprrr�_find_address_ranges


rXcCs$|dkr|St|t||d@��S)z�Count the number of zero bits on the right hand side.

    Args:
        number: an integer.
        bits: maximum number of bits to count.

    Returns:
        The number of zero bits on the right hand side of the number.

    rr )�minr)Znumber�bitsrrr�_count_righthand_zero_bits0sr[ccs�t|t�ot|t�std��|j|jkr8td||f��||krHtd��|jdkrXt}n|jdkrht}ntd��|j}|j}|j}x^||kr�t	t
||�t||d�d�}||||f�}|V|d|>7}|d|jkr�Pq�WdS)	a�Summarize a network range given the first and last IP addresses.

    Example:
        >>> list(summarize_address_range(IPv4Address('192.0.2.0'),
        ...                              IPv4Address('192.0.2.130')))
        ...                                #doctest: +NORMALIZE_WHITESPACE
        [IPv4Network('192.0.2.0/25'), IPv4Network('192.0.2.128/31'),
         IPv4Network('192.0.2.130/32')]

    Args:
        first: the first IPv4Address or IPv6Address in the range.
        last: the last IPv4Address or IPv6Address in the range.

    Returns:
        An iterator of the summarized IPv(4|6) network objects.

    Raise:
        TypeError:
            If the first and last objects are not IP addresses.
            If the first and last objects are not the same version.
        ValueError:
            If the last object is not greater than the first.
            If the version of the first address is not 4 or 6.

    z1first and last must be IP addresses, not networksz%%s and %s are not of the same versionz*last IP address must be greater than firstr�zunknown IP versionr N)
r:�_BaseAddress�	TypeError�versionr<r?r@�_max_prefixlenrRrYr[r�	_ALL_ONES)rUrVrWZip_bitsZ	first_intZlast_intZnbits�netrrr�summarize_address_range@s0





rcccs�t|�}i}xL|rX|j�}|j�}|j|�}|dkr>|||<q||kr||=|j|�qWd}x4t|j��D]$}|dk	r�|j|jkr�ql|V|}qlWdS)auLoops through the addresses, collapsing concurrent netblocks.

    Example:

        ip1 = IPv4Network('192.0.2.0/26')
        ip2 = IPv4Network('192.0.2.64/26')
        ip3 = IPv4Network('192.0.2.128/26')
        ip4 = IPv4Network('192.0.2.192/26')

        _collapse_addresses_internal([ip1, ip2, ip3, ip4]) ->
          [IPv4Network('192.0.2.0/24')]

        This shouldn't be called directly; it is called via
          collapse_addresses([]).

    Args:
        addresses: A list of IPv4Network's or IPv6Network's

    Returns:
        A list of IPv4Network's or IPv6Network's depending on what we were
        passed.

    N)�list�pop�supernet�get�append�sorted�values�broadcast_address)rSZto_merge�subnetsrbrfZexistingrVrrr�_collapse_addresses_internalws$

rmcCs8g}g}g}x�|D]�}t|t�rT|rH|dj|jkrHtd||df��|j|�q|j|jkr�|r�|dj|jkr�td||df��y|j|j�Wq�tk
r�|j|j	�Yq�Xq|r�|dj|jkr�td||df��|j|�qWt
t|��}|�r,x&t|�D]\}}|j
t||���qWt||�S)	a�Collapse a list of IP objects.

    Example:
        collapse_addresses([IPv4Network('192.0.2.0/25'),
                            IPv4Network('192.0.2.128/25')]) ->
                           [IPv4Network('192.0.2.0/24')]

    Args:
        addresses: An iterator of IPv4Network or IPv6Network objects.

    Returns:
        An iterator of the collapsed IPv(4|6)Network objects.

    Raises:
        TypeError: If passed a list of mixed version objects.

    r z%%s and %s are not of the same version���rnrnrnrnrn)r:r]�_versionr^rh�
_prefixlenr`rW�AttributeError�network_addressri�setrX�extendrcrm)rSZaddrsZipsZnetsrWrUrVrrr�collapse_addresses�s4

rucCs(t|t�r|j�St|t�r$|j�StS)a2Return a key suitable for sorting between networks and addresses.

    Address and Network objects are not sortable by default; they're
    fundamentally different so the expression

        IPv4Address('192.0.2.0') <= IPv4Network('192.0.2.0/24')

    doesn't make any sense.  There are some times however, where you may wish
    to have ipaddress sort these for you anyway. If you need to do this, you
    can use this function as the key= argument to sorted().

    Args:
      obj: either a Network or Address object.
    Returns:
      appropriate key.

    )r:�_BaseNetwork�_get_networks_keyr]�_get_address_keyr))�objrrr�get_mixed_type_key�s


rzc@s�eZdZdZfZedd��Zedd��Zedd��Zedd	��Z	d
d�Z
dd
�Zedd��Z
edd��Zedd��Zedd��Zedd��Zdd�ZdS)�_IPAddressBasezThe mother class.cCs|j�S)z:Return the longhand version of the IP address as a string.)�_explode_shorthand_ip_string)r&rrr�exploded�sz_IPAddressBase.explodedcCst|�S)z;Return the shorthand version of the IP address as a string.)rK)r&rrr�
compressedsz_IPAddressBase.compressedcCs|j�S)aIThe name of the reverse DNS pointer for the IP address, e.g.:
            >>> ipaddress.ip_address("127.0.0.1").reverse_pointer
            '1.0.0.127.in-addr.arpa'
            >>> ipaddress.ip_address("2001:db8::1").reverse_pointer
            '1.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.8.b.d.0.1.0.0.2.ip6.arpa'

        )�_reverse_pointer)r&rrr�reverse_pointers	z_IPAddressBase.reverse_pointercCsdt|�f}t|��dS)Nz%200s has no version specified)�typer)r&�msgrrrr_sz_IPAddressBase.versioncCsF|dkrd}t|||jf��||jkrBd}t|||j|jf��dS)Nrz-%d (< 0) is not permitted as an IPv%d addressz2%d (>= 2**%d) is not permitted as an IPv%d address)r5rorar`)r&r=r�rrr�_check_int_addresss

z!_IPAddressBase._check_int_addresscCs.t|�}||kr*d}t|||||jf��dS)Nz~%r (len %d != %d) is not permitted as an IPv%d address. Did you pass in a bytes (str in Python 2) instead of a unicode object?)rMr5ro)r&r=Zexpected_lenZaddress_lenr�rrr�_check_packed_address s
z$_IPAddressBase._check_packed_addresscCs|j|j|?AS)z�Turn the prefix length into a bitwise netmask

        Args:
            prefixlen: An integer, the prefix length.

        Returns:
            An integer.

        )ra)�cls�	prefixlenrrr�_ip_int_from_prefix+sz"_IPAddressBase._ip_int_from_prefixc	Cs\t||j�}|j|}||?}d|>d}||krX|jd}t||d�}d}t||��|S)aReturn prefix length from the bitwise netmask.

        Args:
            ip_int: An integer, the netmask in expanded bitwise format

        Returns:
            An integer, the prefix length.

        Raises:
            ValueError: If the input intermingles zeroes & ones
        r r
rFz&Netmask pattern %r mixes zeroes & ones)r[r`rr<)	r��ip_intZtrailing_zeroesr�Zleading_onesZall_onesZbyteslenZdetailsr�rrr�_prefix_from_ip_int8s


z"_IPAddressBase._prefix_from_ip_intcCsd|}t|��dS)Nz%r is not a valid netmask)r7)r�Znetmask_strr�rrr�_report_invalid_netmaskQsz&_IPAddressBase._report_invalid_netmaskcCsjtjj|�s|j|�yt|�}Wntk
r@|j|�YnXd|koV|jknsf|j|�|S)a	Return prefix length from a numeric string

        Args:
            prefixlen_str: The string to be converted

        Returns:
            An integer, the prefix length.

        Raises:
            NetmaskValueError: If the input is not a valid netmask
        r)�_BaseV4�_DECIMAL_DIGITS�
issupersetr��intr<r`)r�Z
prefixlen_strr�rrr�_prefix_from_prefix_stringVs

z)_IPAddressBase._prefix_from_prefix_stringcCs�y|j|�}Wntk
r,|j|�YnXy
|j|�Stk
rLYnX||jN}y
|j|�Stk
r�|j|�YnXdS)aTurn a netmask/hostmask string into a prefix length

        Args:
            ip_str: The netmask/hostmask to be converted

        Returns:
            An integer, the prefix length.

        Raises:
            NetmaskValueError: If the input is not a valid netmask/hostmask
        N)�_ip_int_from_stringr5r�r�r<ra)r��ip_strr�rrr�_prefix_from_ip_stringos


z%_IPAddressBase._prefix_from_ip_stringcCs|jt|�ffS)N)�	__class__rK)r&rrr�
__reduce__�sz_IPAddressBase.__reduce__N)r1r2r3r6r4�propertyr}r~r�r_r�r��classmethodr�r�r�r�r�r�rrrrr{�s	
"r{c@sdeZdZdZfZdd�Zdd�Zdd�Zdd	�Zd
d�Z	dd
�Z
dd�Zdd�Zdd�Z
dd�ZdS)r]z�A generic IP object.

    This IP class contains the version independent methods which are
    used by single IP addresses.
    cCs|jS)N)rR)r&rrr�__int__�sz_BaseAddress.__int__cCs2y|j|jko|j|jkStk
r,tSXdS)N)rRrorqr))r&r'rrrr(�s
z_BaseAddress.__eq__cCs`t|t�stSt|t�s(td||f��|j|jkrDtd||f��|j|jkr\|j|jkSdS)Nz"%s and %s are not of the same typez%%s and %s are not of the same versionF)r:r{r)r]r^rorR)r&r'rrrr,�s

z_BaseAddress.__lt__cCs t|t�stS|jt|�|�S)N)r:�_compat_int_typesr)r�r�)r&r'rrr�__add__�s
z_BaseAddress.__add__cCs t|t�stS|jt|�|�S)N)r:r�r)r�r�)r&r'rrr�__sub__�s
z_BaseAddress.__sub__cCsd|jjt|�fS)Nz%s(%r))r�r1rK)r&rrr�__repr__�sz_BaseAddress.__repr__cCst|j|j��S)N)rK�_string_from_ip_intrR)r&rrr�__str__�sz_BaseAddress.__str__cCsttt|j���S)N)�hash�hexr�rR)r&rrr�__hash__�sz_BaseAddress.__hash__cCs
|j|fS)N)ro)r&rrrrx�sz_BaseAddress._get_address_keycCs|j|jffS)N)r�rR)r&rrrr��sz_BaseAddress.__reduce__N)r1r2r3r6r4r�r(r,r�r�r�r�r�rxr�rrrrr]�sr]c@sXeZdZdZdd�Zdd�Zdd�Zdd	�Zd
d�Zdd
�Z	dd�Z
dd�Zdd�Zdd�Z
dd�Zedd��Zedd��Zedd��Zedd��Zed d!��Zed"d#��Zed$d%��Zed&d'��Zd(d)�Zd*d+�Zd,d-�ZdFd0d1�ZdGd2d3�Zed4d5��Zd6d7�Zd8d9�Zed:d;��Z ed<d=��Z!ed>d?��Z"ed@dA��Z#edBdC��Z$edDdE��Z%d/S)Hrvz~A generic IP network object.

    This IP class contains the version independent methods which are
    used by networks.

    cCs
i|_dS)N)�_cache)r&r=rrr�__init__�sz_BaseNetwork.__init__cCsd|jjt|�fS)Nz%s(%r))r�r1rK)r&rrrr��sz_BaseNetwork.__repr__cCsd|j|jfS)Nz%s/%d)rrr�)r&rrrr��sz_BaseNetwork.__str__ccs<t|j�}t|j�}x"t|d|�D]}|j|�Vq$WdS)z�Generate Iterator over usable hosts in a network.

        This is like __iter__ except it doesn't return the network
        or broadcast addresses.

        r N)r�rrrkr$�_address_class)r&�network�	broadcast�xrrr�hosts�s

z_BaseNetwork.hostsccs<t|j�}t|j�}x"t||d�D]}|j|�Vq$WdS)Nr )r�rrrkr$r�)r&r�r�r�rrr�__iter__�s

z_BaseNetwork.__iter__cCslt|j�}t|j�}|dkr>|||kr0td��|j||�S|d7}|||krZtd��|j||�SdS)Nrzaddress out of ranger )r�rrrk�
IndexErrorr�)r&�nr�r�rrr�__getitem__�s

z_BaseNetwork.__getitem__cCsxt|t�stSt|t�s(td||f��|j|jkrDtd||f��|j|jkr\|j|jkS|j|jkrt|j|jkSdS)Nz"%s and %s are not of the same typez%%s and %s are not of the same versionF)r:r{r)rvr^rorr�netmask)r&r'rrrr,s

z_BaseNetwork.__lt__cCsFy,|j|jko,|j|jko,t|j�t|j�kStk
r@tSXdS)N)rorrr�r�rqr))r&r'rrrr(sz_BaseNetwork.__eq__cCstt|j�t|j�A�S)N)r�r�rrr�)r&rrrr�sz_BaseNetwork.__hash__cCsL|j|jkrdSt|t�rdSt|j�t|j�koBt|j�kSSdS)NF)ror:rvr�rrrRrk)r&r'rrr�__contains__s
z_BaseNetwork.__contains__cCs(|j|kp&|j|kp&|j|kp&|j|kS)z*Tell if self is partly contained in other.)rrrk)r&r'rrr�overlaps)s


z_BaseNetwork.overlapscCs<|jjd�}|dkr8|jt|j�t|j�B�}||jd<|S)Nrk)r�rgr�r�rr�hostmask)r&r�rrrrk0s
z_BaseNetwork.broadcast_addresscCs8|jjd�}|dkr4|jt|j�|jA�}||jd<|S)Nr�)r�rgr�r�r�ra)r&r�rrrr�9s

z_BaseNetwork.hostmaskcCsd|j|jfS)Nz%s/%d)rrrp)r&rrr�with_prefixlenAsz_BaseNetwork.with_prefixlencCsd|j|jfS)Nz%s/%s)rrr�)r&rrr�with_netmaskEsz_BaseNetwork.with_netmaskcCsd|j|jfS)Nz%s/%s)rrr�)r&rrr�
with_hostmaskIsz_BaseNetwork.with_hostmaskcCst|j�t|j�dS)z&Number of hosts in the current subnet.r )r�rkrr)r&rrr�
num_addressesMsz_BaseNetwork.num_addressescCsdt|�f}t|��dS)Nz%%200s has no associated address class)r�r)r&r�rrrr�Rsz_BaseNetwork._address_classcCs|jS)N)rp)r&rrrr�Zsz_BaseNetwork.prefixlenccs|j|jkstd||f��t|t�s2td|��|j|�sLtd||f��||krXdS|jd|j|jf�}|j	�\}}xb||kr�||kr�|j|�r�|V|j	�\}}q||j|�r�|V|j	�\}}q|t
d|||f��q|W||kr�|Vn$||k�r|Vnt
d|||f��dS)a�Remove an address from a larger block.

        For example:

            addr1 = ip_network('192.0.2.0/28')
            addr2 = ip_network('192.0.2.1/32')
            list(addr1.address_exclude(addr2)) =
                [IPv4Network('192.0.2.0/32'), IPv4Network('192.0.2.2/31'),
                 IPv4Network('192.0.2.4/30'), IPv4Network('192.0.2.8/29')]

        or IPv6:

            addr1 = ip_network('2001:db8::1/32')
            addr2 = ip_network('2001:db8::1/128')
            list(addr1.address_exclude(addr2)) =
                [ip_network('2001:db8::1/128'),
                 ip_network('2001:db8::2/127'),
                 ip_network('2001:db8::4/126'),
                 ip_network('2001:db8::8/125'),
                 ...
                 ip_network('2001:db8:8000::/33')]

        Args:
            other: An IPv4Network or IPv6Network object of the same type.

        Returns:
            An iterator of the IPv(4|6)Network objects which is self
            minus other.

        Raises:
            TypeError: If self and other are of differing address
              versions, or if other is not a network object.
            ValueError: If other is not completely contained by self.

        z%%s and %s are not of the same versionz%s is not a network objectz%s not contained in %sNz%s/%sz3Error performing exclusion: s1: %s s2: %s other: %s)ror^r:rv�	subnet_ofr<r�rrr�rl�AssertionError)r&r'�s1�s2rrr�address_exclude^s6$





z_BaseNetwork.address_excludecCs`|j|jkrtd||f��|j|jkr,dS|j|jkr<dS|j|jkrLdS|j|jkr\dSdS)a�Compare two IP objects.

        This is only concerned about the comparison of the integer
        representation of the network addresses.  This means that the
        host bits aren't considered at all in this method.  If you want
        to compare host bits, you can easily enough do a
        'HostA._ip < HostB._ip'

        Args:
            other: An IP object.

        Returns:
            If the IP versions of self and other are the same, returns:

            -1 if self < other:
              eg: IPv4Network('192.0.2.0/25') < IPv4Network('192.0.2.128/25')
              IPv6Network('2001:db8::1000/124') <
                  IPv6Network('2001:db8::2000/124')
            0 if self == other
              eg: IPv4Network('192.0.2.0/24') == IPv4Network('192.0.2.0/24')
              IPv6Network('2001:db8::1000/124') ==
                  IPv6Network('2001:db8::1000/124')
            1 if self > other
              eg: IPv4Network('192.0.2.128/25') > IPv4Network('192.0.2.0/25')
                  IPv6Network('2001:db8::2000/124') >
                      IPv6Network('2001:db8::1000/124')

          Raises:
              TypeError if the IP versions are different.

        z"%s and %s are not of the same typer rrnrn)ror^rrr�)r&r'rrr�compare_networks�s!z_BaseNetwork.compare_networkscCs|j|j|jfS)z�Network-only key function.

        Returns an object that identifies this address' network and
        netmask. This function is a suitable "key" argument for sorted()
        and list.sort().

        )rorrr�)r&rrrrw�sz_BaseNetwork._get_networks_keyr Nc	cs�|j|jkr|VdS|dk	rJ||jkr0td��|dkr@td��||j}|dkrZtd��|j|}||jkr~td||f��t|j�}t|j�d}t|j�d|?}x(t|||�D]}|j||f�}|Vq�WdS)a�The subnets which join to make the current subnet.

        In the case that self contains only one IP
        (self._prefixlen == 32 for IPv4 or self._prefixlen == 128
        for IPv6), yield an iterator with just ourself.

        Args:
            prefixlen_diff: An integer, the amount the prefix length
              should be increased by. This should not be set if
              new_prefix is also set.
            new_prefix: The desired new prefix length. This must be a
              larger number (smaller prefix) than the existing prefix.
              This should not be set if prefixlen_diff is also set.

        Returns:
            An iterator of IPv(4|6) objects.

        Raises:
            ValueError: The prefixlen_diff is too small or too large.
                OR
            prefixlen_diff and new_prefix are both set or new_prefix
              is a smaller number than the current prefix (smaller
              number means a larger network)

        Nznew prefix must be longerr z(cannot set prefixlen_diff and new_prefixrzprefix length diff must be > 0z0prefix length diff %d is invalid for netblock %s)	rpr`r<r�rrrkr�r$r�)	r&�prefixlen_diff�
new_prefix�
new_prefixlenr!r"r#Znew_addrZcurrentrrrrl�s,




z_BaseNetwork.subnetscCs�|jdkr|S|dk	rB||jkr(td��|dkr8td��|j|}|j|}|dkrftd|j|f��|jt|j�t|j�|>@|f�S)a�The supernet containing the current network.

        Args:
            prefixlen_diff: An integer, the amount the prefix length of
              the network should be decreased by.  For example, given a
              /24 network and a prefixlen_diff of 3, a supernet with a
              /21 netmask is returned.

        Returns:
            An IPv4 network object.

        Raises:
            ValueError: If self.prefixlen - prefixlen_diff < 0. I.e., you have
              a negative prefix length.
                OR
            If prefixlen_diff and new_prefix are both set or new_prefix is a
              larger number than the current prefix (larger number means a
              smaller network)

        rNznew prefix must be shorterr z(cannot set prefixlen_diff and new_prefixz;current prefixlen is %d, cannot have a prefixlen_diff of %d)rpr<r�r�r�rrr�)r&r�r�r�rrrrfs 



z_BaseNetwork.supernetcCs|jjo|jjS)z�Test if the address is reserved for multicast use.

        Returns:
            A boolean, True if the address is a multicast address.
            See RFC 2373 2.7 for details.

        )rr�is_multicastrk)r&rrrr�As	z_BaseNetwork.is_multicastcCsP|j|jkrdSt|d�r<t|d�r<|j|jko:|j|jkStdt|���dS)NFrrrkz9Unable to test subnet containment with element of type %s)ro�hasattrrrrkr^r�)r&r'rrrr�Ms

z_BaseNetwork.subnet_ofcCsP|j|jkrdSt|d�r<t|d�r<|j|jko:|j|jkStdt|���dS)NFrrrkz9Unable to test subnet containment with element of type %s)ror�rrrkr^r�)r&r'rrr�supernet_of[s

z_BaseNetwork.supernet_ofcCs|jjo|jjS)z�Test if the address is otherwise IETF reserved.

        Returns:
            A boolean, True if the address is within one of the
            reserved IPv6 Network ranges.

        )rr�is_reservedrk)r&rrrr�is	z_BaseNetwork.is_reservedcCs|jjo|jjS)z�Test if the address is reserved for link-local.

        Returns:
            A boolean, True if the address is reserved per RFC 4291.

        )rr�
is_link_localrk)r&rrrr�usz_BaseNetwork.is_link_localcCs|jjo|jjS)z�Test if this address is allocated for private networks.

        Returns:
            A boolean, True if the address is reserved per
            iana-ipv4-special-registry or iana-ipv6-special-registry.

        )rr�
is_privaterk)r&rrrr��s	z_BaseNetwork.is_privatecCs|jS)z�Test if this address is allocated for public networks.

        Returns:
            A boolean, True if the address is not reserved per
            iana-ipv4-special-registry or iana-ipv6-special-registry.

        )r�)r&rrr�	is_global�s	z_BaseNetwork.is_globalcCs|jjo|jjS)z�Test if the address is unspecified.

        Returns:
            A boolean, True if this is the unspecified address as defined in
            RFC 2373 2.5.2.

        )rr�is_unspecifiedrk)r&rrrr��s	z_BaseNetwork.is_unspecifiedcCs|jjo|jjS)z�Test if the address is a loopback address.

        Returns:
            A boolean, True if the address is a loopback address as defined in
            RFC 2373 2.5.3.

        )rr�is_loopbackrk)r&rrrr��s	z_BaseNetwork.is_loopback)r N)r N)&r1r2r3r6r�r�r�r�r�r�r,r(r�r�r�r�rkr�r�r�r�r�r�r�r�r�rwrlrfr�r�r�r�r�r�r�r�r�rrrrrv�sD

	K0

5
)rvc
@s�eZdZdZfZdZdedZed�Z	edddd	d
ddd
dg	�Z
eZiZdd�Z
edd��Zedd��Zedd��Zedd��Zdd�Zdd�Zedd��Zedd ��Zd!S)"r�zyBase IPv4 object.

    The following methods are used by IPv4 objects in both single IP
    addresses and networks.

    rrr �
0123456789���������rrcCst|�S)N)rK)r&rrrr|�sz$_BaseV4._explode_shorthand_ip_stringcCsn||jkrdt|t�r|}n.y|j|�}Wntk
rF|j|�}YnXt|j|��}||f|j|<|j|S)aMake a (netmask, prefix_len) tuple from the given argument.

        Argument can be:
        - an integer (the prefix length)
        - a string representing the prefix length (e.g. "24")
        - a string representing the prefix netmask (e.g. "255.255.255.0")
        )�_netmask_cacher:r�r�r7r�r8r�)r��argr�r�rrr�
_make_netmask�s	

z_BaseV4._make_netmaskcCsx|std��|jd�}t|�dkr.td|��ytt|j|�d�Stk
rr}ztd||f��WYdd}~XnXdS)aTurn the given IP string into an integer for comparison.

        Args:
            ip_str: A string, the IP ip_str.

        Returns:
            The IP ip_str as an integer.

        Raises:
            AddressValueError: if ip_str isn't a valid IPv4 Address.

        zAddress cannot be empty�.rzExpected 4 octets in %rrFz%s in %rN)r5rLrMr�map�_parse_octetr<)r�r�Zoctets�excrrrr��s
z_BaseV4._ip_int_from_stringcCs�|std��|jj|�s(d}t||��t|�dkrDd}t||��t|d�}|dkrr|ddkrrd	}t||��|d
kr�td|��|S)aConvert a decimal octet into an integer.

        Args:
            octet_str: A string, the number to parse.

        Returns:
            The octet as an integer.

        Raises:
            ValueError: if the octet isn't strictly a decimal from [0..255].

        zEmpty octet not permittedz#Only decimal digits permitted in %r�z$At most 3 characters permitted in %r�
�r�0z3Ambiguous (octal/decimal) value in %r not permittedr�zOctet %d (> 255) not permitted)r<r�r�rMr�)r�Z	octet_strr�Z	octet_intrrrr��s
z_BaseV4._parse_octetcCsdjdd�t|dd�D��S)z�Turns a 32-bit integer into dotted decimal notation.

        Args:
            ip_int: An integer, the IP address.

        Returns:
            The IP address as a string in dotted decimal notation.

        r�css0|](}tt|t�r"tjd|�dn|�VqdS)s!BrN)rKr:r;rr	)r
rrrr�	<genexpr>-sz._BaseV4._string_from_ip_int.<locals>.<genexpr>rrF)�joinr)r�r�rrrr�"s
z_BaseV4._string_from_ip_intcsh|jd�}y�fdd�tt|�D�}Wntk
r:dSXt|�t|�krPdS|d|dkrddSdS)	z�Test if the IP string is a hostmask (rather than a netmask).

        Args:
            ip_str: A string, the potential hostmask.

        Returns:
            A boolean, True if the IP string is a hostmask.

        r�csg|]}|�jkr|�qSr)�_valid_mask_octets)r
r�)r&rrr>sz(_BaseV4._is_hostmask.<locals>.<listcomp>Frr Trn)rLr�r�r<rM)r&r�rZ�partsr)r&r�_is_hostmask2s

z_BaseV4._is_hostmaskcCs&t|�jd�ddd�}dj|�dS)z�Return the reverse DNS pointer name for the IPv4 address.

        This implements the method described in RFC1035 3.5.

        r�Nr z
.in-addr.arparn)rKrLr�)r&Zreverse_octetsrrrrGsz_BaseV4._reverse_pointercCs|jS)N)r`)r&rrr�
max_prefixlenPsz_BaseV4.max_prefixlencCs|jS)N)ro)r&rrrr_Tsz_BaseV4.versionN)r1r2r3r6r4ro�
IPV4LENGTHra�	frozensetr�r�r`r�r|r�r�r�r�r�r�rr�r�r_rrrrr��s"%	r�c@s|eZdZdZdZdd�Zedd��Zedd	��Zed
d��Z	edd
��Z
edd��Zedd��Zedd��Z
edd��ZdS)r8z/Represent and manipulate single IPv4 Addresses.rR�__weakref__cCsxt|t�r|j|�||_dSt|t�rL|j|d�t|�}t|d�|_dSt|�}d|krht	d|��|j
|�|_dS)a�
        Args:
            address: A string or integer representing the IP

              Additionally, an integer can be passed, so
              IPv4Address('192.0.2.1') == IPv4Address(3221225985).
              or, more generally
              IPv4Address(int(IPv4Address('192.0.2.1'))) ==
                IPv4Address('192.0.2.1')

        Raises:
            AddressValueError: If ipaddress isn't a valid IPv4 address.

        NrrFrJzUnexpected '/' in %r)r:r�r�rRr;r�rrrKr5r�)r&r=�bvs�addr_strrrrr�_s


zIPv4Address.__init__cCs
t|j�S)z*The binary representation of this address.)rHrR)r&rrr�packed�szIPv4Address.packedcCs||jjkS)z�Test if the address is otherwise IETF reserved.

         Returns:
             A boolean, True if the address is within the
             reserved IPv4 Network range.

        )�
_constants�_reserved_network)r&rrrr��s	zIPv4Address.is_reservedcst�fdd��jjD��S)z�Test if this address is allocated for private networks.

        Returns:
            A boolean, True if the address is reserved per
            iana-ipv4-special-registry.

        c3s|]}�|kVqdS)Nr)r
rb)r&rrr��sz)IPv4Address.is_private.<locals>.<genexpr>)�anyr��_private_networks)r&r)r&rr��s	zIPv4Address.is_privatecCs||jjko|jS)N)r��_public_networkr�)r&rrrr��szIPv4Address.is_globalcCs||jjkS)z�Test if the address is reserved for multicast use.

        Returns:
            A boolean, True if the address is multicast.
            See RFC 3171 for details.

        )r��_multicast_network)r&rrrr��s	zIPv4Address.is_multicastcCs||jjkS)z�Test if the address is unspecified.

        Returns:
            A boolean, True if this is the unspecified address as defined in
            RFC 5735 3.

        )r��_unspecified_address)r&rrrr��s	zIPv4Address.is_unspecifiedcCs||jjkS)z�Test if the address is a loopback address.

        Returns:
            A boolean, True if the address is a loopback per RFC 3330.

        )r��_loopback_network)r&rrrr��szIPv4Address.is_loopbackcCs||jjkS)z�Test if the address is reserved for link-local.

        Returns:
            A boolean, True if the address is link-local per RFC 3927.

        )r��_linklocal_network)r&rrrr��szIPv4Address.is_link_localN)rRr�)r1r2r3r6r4r�r�r�r�r�r�r�r�r�r�rrrrr8Ys$
r8c@sjeZdZdd�Zdd�Zdd�Zdd�Zd	d
�Zej	Z	e
dd��Ze
d
d��Ze
dd��Z
e
dd��ZdS)rCcCs�t|ttf�r2tj||�t|j�|_|j|_	dSt|t
�r�tj||d�t|�dkrht|d�|_	n|j|_	t|dd�|_|jj
|_
|jj|_dSt|�}tj||d�t|dd�|_|jj	|_	|jj
|_
|jj|_dS)Nrr F)rA)r:r;r�r8r�r?rRr�r`rp�tuplerMr�r�r�rO)r&r=rNrrrr��s(




zIPv4Interface.__init__cCsd|j|j�|jjfS)Nz%s/%d)r�rRr�r�)r&rrrr��szIPv4Interface.__str__cCsDtj||�}|s|tkr|Sy|j|jkStk
r>dSXdS)NF)r8r(r)r�rq)r&r'�
address_equalrrrr(�szIPv4Interface.__eq__cCs>tj||�}|tkrtSy|j|jkStk
r8dSXdS)NF)r8r,r)r�rq)r&r'�address_lessrrrr,�szIPv4Interface.__lt__cCs|j|jAt|jj�AS)N)rRrpr�r�rr)r&rrrr�szIPv4Interface.__hash__cCs
t|j�S)N)r8rR)r&rrrrW
szIPv4Interface.ipcCsd|j|j�|jfS)Nz%s/%s)r�rRrp)r&rrrr�szIPv4Interface.with_prefixlencCsd|j|j�|jfS)Nz%s/%s)r�rRr�)r&rrrr�szIPv4Interface.with_netmaskcCsd|j|j�|jfS)Nz%s/%s)r�rRr�)r&rrrr�szIPv4Interface.with_hostmaskN)r1r2r3r�r�r(r,r�r{r�r�rWr�r�r�rrrrrC�srCc@s*eZdZdZeZddd�Zedd��ZdS)	r?aeThis class represents and manipulates 32-bit IPv4 network + addresses..

    Attributes: [examples for IPv4Network('192.0.2.0/27')]
        .network_address: IPv4Address('192.0.2.0')
        .hostmask: IPv4Address('0.0.0.31')
        .broadcast_address: IPv4Address('192.0.2.32')
        .netmask: IPv4Address('255.255.255.224')
        .prefixlen: 27

    TcCs|tj||�t|ttf�r<t|�|_|j|j�\|_	|_
dSt|t�r�t|�dkr\|d}n|j}t|d�|_|j|�\|_	|_
t
|j�}|t
|j	�@|kr�|r�td|��nt|t
|j	�@�|_dSt|�}t|j|d��|_t|�dkr�|d}n|j}|j|�\|_	|_
|�rDtt
|j�t
|j	�@�|jk�rDtd|��tt
|j�t
|j	�@�|_|j
|jdk�rx|j|_dS)aInstantiate a new IPv4 network object.

        Args:
            address: A string or integer representing the IP [& network].
              '192.0.2.0/24'
              '192.0.2.0/255.255.255.0'
              '192.0.0.2/0.0.0.255'
              are all functionally the same in IPv4. Similarly,
              '192.0.2.1'
              '192.0.2.1/255.255.255.255'
              '192.0.2.1/32'
              are also functionally equivalent. That is to say, failing to
              provide a subnetmask will create an object with a mask of /32.

              If the mask (portion after the / in the argument) is given in
              dotted quad form, it is treated as a netmask if it starts with a
              non-zero field (e.g. /255.0.0.0 == /8) and as a hostmask if it
              starts with a zero field (e.g. 0.255.255.255 == /8), with the
              single exception of an all-zero mask which is treated as a
              netmask == /0. If no mask is given, a default of /32 is used.

              Additionally, an integer can be passed, so
              IPv4Network('192.0.2.1') == IPv4Network(3221225985)
              or, more generally
              IPv4Interface(int(IPv4Interface('192.0.2.1'))) ==
                IPv4Interface('192.0.2.1')

        Raises:
            AddressValueError: If ipaddress isn't a valid IPv4 address.
            NetmaskValueError: If the netmask isn't valid for
              an IPv4 address.
            ValueError: If strict is True and a network address is not
              supplied.

        Nr rz%s has host bits setr)rvr�r:r�r;r8rrr�r`r�rpr�rMr�r<rOr�r�r�)r&r=rAr�r�rNrrrr�0sB%






zIPv4Network.__init__cCs&|jtd�ko|jtd�ko$|jS)z�Test if this address is allocated for public networks.

        Returns:
            A boolean, True if the address is not reserved per
            iana-ipv4-special-registry.

        z
100.64.0.0/10)rrr?rkr�)r&rrrr��s	zIPv4Network.is_globalN)T)	r1r2r3r6r8r�r�r�r�rrrrr?!s
Ur?c@s�eZdZed�Zed�Zed�Zed�Zed�ed�ed�ed�ed�ed�ed	�ed
�ed�ed�ed
�ed�ed�ed�gZed�Z	e
d�ZdS)�_IPv4Constantsz169.254.0.0/16z127.0.0.0/8z224.0.0.0/4z
100.64.0.0/10z	0.0.0.0/8z
10.0.0.0/8z
172.16.0.0/12z192.0.0.0/29z192.0.0.170/31z192.0.2.0/24z192.168.0.0/16z
198.18.0.0/15z198.51.100.0/24z203.0.113.0/24z240.0.0.0/4z255.255.255.255/32z0.0.0.0N)r1r2r3r?r�r�r�r�r�r�r8r�rrrrr��s(
r�c@s�eZdZdZfZdZdedZdZe	d�Z
eZiZe
dd��Ze
d	d
��Ze
dd��Ze
d
d��Ze
ddd��Zdd�Zdd�Zedd��Zedd��ZdS)�_BaseV6zyBase IPv6 object.

    The following methods are used by IPv6 objects in both single IP
    addresses and networks.

    r\rr r
Z0123456789ABCDEFabcdefcCsJ||jkr@t|t�r|}n
|j|�}t|j|��}||f|j|<|j|S)aMake a (netmask, prefix_len) tuple from the given argument.

        Argument can be:
        - an integer (the prefix length)
        - a string representing the prefix length (e.g. "24")
        - a string representing the prefix netmask (e.g. "255.255.255.0")
        )r�r:r�r�r9r�)r�r�r�r�rrrr��s	


z_BaseV6._make_netmaskcCs�|std��|jd�}d}t|�|kr:d||f}t|��d|dkr�yt|j��j}Wn2tk
r�}ztd||f��WYdd}~XnX|jd	|d
?d@�|jd	|d@�|jd}t|�|kr�d|d|f}t|��d}x@tdt|�d�D]*}	||	s�|dk	�r d
|}t|��|	}q�W|dk	�r�|}
t|�|d}|d�sn|
d8}
|
�rnd}t||��|d�s�|d8}|�r�d}t||��|j|
|}|dk�r4d}t||jd|f��njt|�|jk�r�d}t||j|f��|d�s
d}t||��|d�s$d}t||��t|�}
d}d}ytd}
x,t	|
�D] }	|
d
K}
|
|j
||	�O}
�qDW|
d
|K}
x0t	|d�D] }	|
d
K}
|
|j
||	�O}
�q�W|
Stk
�r�}ztd||f��WYdd}~XnXdS)z�Turn an IPv6 ip_str into an integer.

        Args:
            ip_str: A string, the IPv6 ip_str.

        Returns:
            An int, the IPv6 address

        Raises:
            AddressValueError: if ip_str isn't a valid IPv6 Address.

        zAddress cannot be empty�:r�z At least %d parts expected in %rr�r z%s in %rNz%xri��z!At most %d colons permitted in %rz At most one '::' permitted in %rrz0Leading ':' only permitted as part of '::' in %rz1Trailing ':' only permitted as part of '::' in %rz/Expected at most %d other parts with '::' in %rz,Exactly %d parts expected without '::' in %rrnrnrn)r5rLrMr8rerRrh�
_HEXTET_COUNTr$�range�
_parse_hextetr<)r�r�r�Z
_min_partsr�Zipv4_intr�Z
_max_partsZ
skip_indexrZparts_hiZparts_loZ
parts_skippedr�rrrr��s�
"







z_BaseV6._ip_int_from_stringcCs>|jj|�std|��t|�dkr4d}t||��t|d�S)a&Convert an IPv6 hextet string into an integer.

        Args:
            hextet_str: A string, the number to parse.

        Returns:
            The hextet as an integer.

        Raises:
            ValueError: if the input isn't strictly a hex number from
              [0..FFFF].

        zOnly hex digits permitted in %rrz$At most 4 characters permitted in %rr)�_HEX_DIGITSr�r<rMr�)r�Z
hextet_strr�rrrr�Esz_BaseV6._parse_hextetc	Cs�d}d}d}d}xJt|�D]>\}}|dkrP|d7}|dkr>|}||krX|}|}qd}d}qW|dkr�||}|t|�kr�|dg7}dg|||�<|dkr�dg|}|S)	a�Compresses a list of hextets.

        Compresses a list of strings, replacing the longest continuous
        sequence of "0" in the list with "" and adding empty strings at
        the beginning or at the end of the string such that subsequently
        calling ":".join(hextets) will produce the compressed version of
        the IPv6 address.

        Args:
            hextets: A list of strings, the hextets to compress.

        Returns:
            A list of strings.

        r rr��rnrnrnrn)�	enumeraterM)	r��hextetsZbest_doublecolon_startZbest_doublecolon_lenZdoublecolon_startZdoublecolon_len�indexZhextetZbest_doublecolon_endrrr�_compress_hextets_s.

z_BaseV6._compress_hextetsNcsZ|dkrt|j�}||jkr$td��d|��fdd�tddd�D�}|j|�}d	j|�S)
a,Turns a 128-bit integer into hexadecimal notation.

        Args:
            ip_int: An integer, the IP address.

        Returns:
            A string, the hexadecimal representation of the address.

        Raises:
            ValueError: The address is bigger than 128 bits of all ones.

        NzIPv6 address is too largez%032xcs&g|]}dt�||d�d��qS)z%xrr)r�)r
r�)�hex_strrrr�sz/_BaseV6._string_from_ip_int.<locals>.<listcomp>rrrr�)r�rRrar<r�r�r�)r�r�r�r)r�rr��s


z_BaseV6._string_from_ip_intcs�t|t�rt|j�}nt|t�r,t|j�}nt|�}|j|�}d|��fdd�tddd�D�}t|ttf�r�ddj	|�|j
fSdj	|�S)	z�Expand a shortened IPv6 address.

        Args:
            ip_str: A string, the IPv6 address.

        Returns:
            A string, the expanded IPv6 address.

        z%032xcsg|]}�||d��qS)rr)r
r�)r�rrr�sz8_BaseV6._explode_shorthand_ip_string.<locals>.<listcomp>rrrz%s/%dr�)r:r@rKrrrDrWr�r�rvr�rp)r&r�r�r�r)r�rr|�s



z$_BaseV6._explode_shorthand_ip_stringcCs&|jddd�jdd�}dj|�dS)z�Return the reverse DNS pointer name for the IPv6 address.

        This implements the method described in RFC3596 2.5.

        Nr r�r�r�z	.ip6.arparn)r}�replacer�)r&Z
reverse_charsrrrr�sz_BaseV6._reverse_pointercCs|jS)N)r`)r&rrrr��sz_BaseV6.max_prefixlencCs|jS)N)ro)r&rrrr_�sz_BaseV6.version)N)r1r2r3r6r4ro�
IPV6LENGTHrar�r�r�r`r�r�r�r�r�r�r�r|rr�r�r_rrrrr��s$i0	r�c@s�eZdZdZdZdd�Zedd��Zedd	��Zed
d��Z	edd
��Z
edd��Zedd��Zedd��Z
edd��Zedd��Zedd��Zedd��Zedd��ZdS) r9z/Represent and manipulate single IPv6 Addresses.rRr�cCsxt|t�r|j|�||_dSt|t�rL|j|d�t|�}t|d�|_dSt|�}d|krht	d|��|j
|�|_dS)aInstantiate a new IPv6 address object.

        Args:
            address: A string or integer representing the IP

              Additionally, an integer can be passed, so
              IPv6Address('2001:db8::') ==
                IPv6Address(42540766411282592856903984951653826560)
              or, more generally
              IPv6Address(int(IPv6Address('2001:db8::'))) ==
                IPv6Address('2001:db8::')

        Raises:
            AddressValueError: If address isn't a valid IPv6 address.

        NrrFrJzUnexpected '/' in %r)r:r�r�rRr;r�rrrKr5r�)r&r=r�r�rrrr��s


zIPv6Address.__init__cCs
t|j�S)z*The binary representation of this address.)rIrR)r&rrrr��szIPv6Address.packedcCs||jjkS)z�Test if the address is reserved for multicast use.

        Returns:
            A boolean, True if the address is a multicast address.
            See RFC 2373 2.7 for details.

        )r�r�)r&rrrr�s	zIPv6Address.is_multicastcst�fdd��jjD��S)z�Test if the address is otherwise IETF reserved.

        Returns:
            A boolean, True if the address is within one of the
            reserved IPv6 Network ranges.

        c3s|]}�|kVqdS)Nr)r
r�)r&rrr�sz*IPv6Address.is_reserved.<locals>.<genexpr>)r�r��_reserved_networks)r&r)r&rr�s	zIPv6Address.is_reservedcCs||jjkS)z�Test if the address is reserved for link-local.

        Returns:
            A boolean, True if the address is reserved per RFC 4291.

        )r�r�)r&rrrr�szIPv6Address.is_link_localcCs||jjkS)a`Test if the address is reserved for site-local.

        Note that the site-local address space has been deprecated by RFC 3879.
        Use is_private to test if this address is in the space of unique local
        addresses as defined by RFC 4193.

        Returns:
            A boolean, True if the address is reserved per RFC 3513 2.5.6.

        )r��_sitelocal_network)r&rrr�
is_site_local#szIPv6Address.is_site_localcst�fdd��jjD��S)z�Test if this address is allocated for private networks.

        Returns:
            A boolean, True if the address is reserved per
            iana-ipv6-special-registry.

        c3s|]}�|kVqdS)Nr)r
rb)r&rrr�:sz)IPv6Address.is_private.<locals>.<genexpr>)r�r�r�)r&r)r&rr�1s	zIPv6Address.is_privatecCs|jS)z�Test if this address is allocated for public networks.

        Returns:
            A boolean, true if the address is not reserved per
            iana-ipv6-special-registry.

        )r�)r&rrrr�<s	zIPv6Address.is_globalcCs
|jdkS)z�Test if the address is unspecified.

        Returns:
            A boolean, True if this is the unspecified address as defined in
            RFC 2373 2.5.2.

        r)rR)r&rrrr�Gs	zIPv6Address.is_unspecifiedcCs
|jdkS)z�Test if the address is a loopback address.

        Returns:
            A boolean, True if the address is a loopback address as defined in
            RFC 2373 2.5.3.

        r )rR)r&rrrr�Rs	zIPv6Address.is_loopbackcCs |jd?dkrdSt|jd@�S)z�Return the IPv4 mapped address.

        Returns:
            If the IPv6 address is a v4 mapped address, return the
            IPv4 mapped address. Return None otherwise.

        ri��Nl��)rRr8)r&rrr�ipv4_mapped]s	zIPv6Address.ipv4_mappedcCs4|jd?dkrdSt|jd?d@�t|jd@�fS)z�Tuple of embedded teredo IPs.

        Returns:
            Tuple of the (server, client) IPs or None if the address
            doesn't appear to be a teredo address (doesn't start with
            2001::/32)

        �`i Nrl��)rRr8)r&rrr�teredojs
zIPv6Address.teredocCs$|jd?dkrdSt|jd?d@�S)z�Return the IPv4 6to4 embedded address.

        Returns:
            The IPv4 6to4-embedded address if present or None if the
            address doesn't appear to contain a 6to4 embedded address.

        �pi N�Pl��)rRr8)r&rrr�	sixtofourys	zIPv6Address.sixtofourN)rRr�)r1r2r3r6r4r�r�r�r�r�r�rr�r�r�r�rrrrrrrr9�s%

r9c@s�eZdZdd�Zdd�Zdd�Zdd�Zd	d
�Zej	Z	e
dd��Ze
d
d��Ze
dd��Z
e
dd��Ze
dd��Ze
dd��ZdS)rDcCs�t|ttf�r2tj||�t|j�|_|j|_	dSt|t
�r�tj||d�t|�dkrht|d�|_	n|j|_	t|dd�|_|jj
|_
|jj|_dSt|�}tj||d�t|dd�|_|jj
|_
|jj	|_	|jj|_dS)Nrr F)rA)r:r;r�r9r�r@rRr�r`rpr�rMr�r�r�rO)r&r=rNrrrr��s(




zIPv6Interface.__init__cCsd|j|j�|jjfS)Nz%s/%d)r�rRr�r�)r&rrrr��szIPv6Interface.__str__cCsDtj||�}|s|tkr|Sy|j|jkStk
r>dSXdS)NF)r9r(r)r�rq)r&r'r�rrrr(�szIPv6Interface.__eq__cCs>tj||�}|tkrtSy|j|jkStk
r8dSXdS)NF)r9r,r)r�rq)r&r'r�rrrr,�szIPv6Interface.__lt__cCs|j|jAt|jj�AS)N)rRrpr�r�rr)r&rrrr��szIPv6Interface.__hash__cCs
t|j�S)N)r9rR)r&rrrrW�szIPv6Interface.ipcCsd|j|j�|jfS)Nz%s/%s)r�rRrp)r&rrrr��szIPv6Interface.with_prefixlencCsd|j|j�|jfS)Nz%s/%s)r�rRr�)r&rrrr��szIPv6Interface.with_netmaskcCsd|j|j�|jfS)Nz%s/%s)r�rRr�)r&rrrr��szIPv6Interface.with_hostmaskcCs|jdko|jjS)Nr)rRr�r�)r&rrrr��szIPv6Interface.is_unspecifiedcCs|jdko|jjS)Nr )rRr�r�)r&rrrr��szIPv6Interface.is_loopbackN)r1r2r3r�r�r(r,r�r{r�r�rWr�r�r�r�r�rrrrrD�srDc@s2eZdZdZeZd
dd�Zdd�Zedd��Z	d	S)r@avThis class represents and manipulates 128-bit IPv6 networks.

    Attributes: [examples for IPv6('2001:db8::1000/124')]
        .network_address: IPv6Address('2001:db8::1000')
        .hostmask: IPv6Address('::f')
        .broadcast_address: IPv6Address('2001:db8::100f')
        .netmask: IPv6Address('ffff:ffff:ffff:ffff:ffff:ffff:ffff:fff0')
        .prefixlen: 124

    TcCs|tj||�t|ttf�r<t|�|_|j|j�\|_	|_
dSt|t�r�t|�dkr\|d}n|j}|j|�\|_	|_
t|d�|_t
|j�}|t
|j	�@|kr�|r�td|��nt|t
|j	�@�|_dSt|�}t|j|d��|_t|�dkr�|d}n|j}|j|�\|_	|_
|�rDtt
|j�t
|j	�@�|jk�rDtd|��tt
|j�t
|j	�@�|_|j
|jdk�rx|j|_dS)a�Instantiate a new IPv6 Network object.

        Args:
            address: A string or integer representing the IPv6 network or the
              IP and prefix/netmask.
              '2001:db8::/128'
              '2001:db8:0000:0000:0000:0000:0000:0000/128'
              '2001:db8::'
              are all functionally the same in IPv6.  That is to say,
              failing to provide a subnetmask will create an object with
              a mask of /128.

              Additionally, an integer can be passed, so
              IPv6Network('2001:db8::') ==
                IPv6Network(42540766411282592856903984951653826560)
              or, more generally
              IPv6Network(int(IPv6Network('2001:db8::'))) ==
                IPv6Network('2001:db8::')

            strict: A boolean. If true, ensure that we have been passed
              A true network address, eg, 2001:db8::1000/124 and not an
              IP address on a network, eg, 2001:db8::1/124.

        Raises:
            AddressValueError: If address isn't a valid IPv6 address.
            NetmaskValueError: If the netmask isn't valid for
              an IPv6 address.
            ValueError: If strict was True and a network address was not
              supplied.

        Nr rz%s has host bits setr)rvr�r:r;r�r9rrr�r`r�rpr�rMr�r<rOr�r�r�)r&r=rAr�r�rNrrrr��sB 






zIPv6Network.__init__ccs@t|j�}t|j�}x&t|d|d�D]}|j|�Vq(WdS)z�Generate Iterator over usable hosts in a network.

          This is like __iter__ except it doesn't return the
          Subnet-Router anycast address.

        r N)r�rrrkr$r�)r&r�r�r�rrrr�<	s

zIPv6Network.hostscCs|jjo|jjS)a`Test if the address is reserved for site-local.

        Note that the site-local address space has been deprecated by RFC 3879.
        Use is_private to test if this address is in the space of unique local
        addresses as defined by RFC 4193.

        Returns:
            A boolean, True if the address is reserved per RFC 3513 2.5.6.

        )rrrrk)r&rrrrH	szIPv6Network.is_site_localN)T)
r1r2r3r6r9r�r�r�r�rrrrrr@�s

Or@c@s�eZdZed�Zed�Zed�ed�ed�ed�ed�ed�ed	�ed
�ed�ed�g
Zed�ed
�ed�ed�ed�ed�ed�ed�ed�ed�ed�ed�ed�ed�ed�gZed�ZdS)�_IPv6Constantsz	fe80::/10zff00::/8z::1/128z::/128z
::ffff:0:0/96z100::/64z	2001::/23z2001:2::/48z
2001:db8::/32z2001:10::/28zfc00::/7z::/8z100::/8z200::/7z400::/6z800::/5z1000::/4z4000::/3z6000::/3z8000::/3zA000::/3zC000::/3zE000::/4zF000::/5zF800::/6zFE00::/9z	fec0::/10N)	r1r2r3r@r�r�r�rrrrrrr	X	s*

r	r)r )T)6r6Z
__future__rrr�__version__r�r�Zlong�	NameErrorZunicoderK�strr�
from_bytesrrqrr�rr$�objectr%r�r�r<r5r7r>rBrErHrIrOrXr[rcrmrurzr{r]rvr�r8rCr?r�r�r�r9rDr@r	rrrr�<module>	s�

	


)$
$#716=a*vRr 5V{!site-packages/pip/_vendor/__pycache__/__init__.cpython-36.pyc000064400000005240147511334610020113 0ustar003

���e>�@s�dZddlmZddlZddlZddlZdZejj	ejj
e��Zdd�Z
e�r�ejejjed��ejejdd�<e
d�e
d	�e
d
�e
d�e
d�e
d
�e
d�e
d�e
d�e
d�e
d�e
d�e
d�e
d�e
d�e
d�e
d�e
d�e
d�e
d�e
d�e
d�e
d�e
d�e
d �e
d!�e
d"�e
d#�e
d$�e
d%�e
d&�e
d'�e
d(�e
d)�e
d*�e
d+�e
d,�e
d-�e
d.�e
d/�e
d0�e
d1�e
d2�dS)3z�
pip._vendor is for vendoring dependencies of pip to prevent needing pip to
depend on something external.

Files inside of pip._vendor should be considered immutable and should only be
updated to versions from upstream.
�)�absolute_importNFcCs�djt|�}yt|t�t�dd�Wnztk
r�yt|t�t�dd�Wntk
r`Yn:Xtj|tj|<|jdd�\}}t	tj||tj|�YnXdS)Nz{0}.{1}r)�level�.�)
�format�__name__�
__import__�globals�locals�ImportError�sys�modules�rsplit�setattr)Z
modulenameZ
vendored_name�base�head�r�/usr/lib/python3.6/__init__.py�vendoreds	rz*.whlZcachecontrolZcoloramaZdistlibZdistroZhtml5libZlockfileZsixz	six.moveszsix.moves.urllibZ	packagingzpackaging.versionzpackaging.specifiersZ
pkg_resourcesZprogressZretryingZrequestszrequests.packageszrequests.packages.urllib3z&requests.packages.urllib3._collectionsz$requests.packages.urllib3.connectionz(requests.packages.urllib3.connectionpoolz!requests.packages.urllib3.contribz*requests.packages.urllib3.contrib.ntlmpoolz+requests.packages.urllib3.contrib.pyopensslz$requests.packages.urllib3.exceptionsz requests.packages.urllib3.fieldsz"requests.packages.urllib3.filepostz"requests.packages.urllib3.packagesz/requests.packages.urllib3.packages.ordered_dictz&requests.packages.urllib3.packages.sixz5requests.packages.urllib3.packages.ssl_match_hostnamezErequests.packages.urllib3.packages.ssl_match_hostname._implementationz%requests.packages.urllib3.poolmanagerz!requests.packages.urllib3.requestz"requests.packages.urllib3.responsezrequests.packages.urllib3.utilz)requests.packages.urllib3.util.connectionz&requests.packages.urllib3.util.requestz'requests.packages.urllib3.util.responsez$requests.packages.urllib3.util.retryz#requests.packages.urllib3.util.ssl_z&requests.packages.urllib3.util.timeoutz"requests.packages.urllib3.util.url)�__doc__Z
__future__rZglobZos.path�osrZ	DEBUNDLED�path�abspath�dirname�__file__Z	WHEEL_DIRr�joinrrrr�<module>sh$site-packages/pip/_vendor/__pycache__/retrying.cpython-36.opt-1.pyc000064400000017525147511334610021167 0ustar003

���e�&�@slddlZddlmZddlZddlZddlZdZdd�ZGdd�de�Z	Gdd	�d	e�Z
Gd
d�de�ZdS)�N)�sixi���?csBt��dkr,t�d�r,dd�}|�d�S��fdd�}|SdS)z�
    Decorator function that instantiates the Retrying object
    @param *dargs: positional arguments passed to Retrying object
    @param **dkw: keyword arguments passed to the Retrying object
    �rcstj���fdd��}|S)Ncst�j�f|�|�S)N)�Retrying�call)�args�kw)�f��/usr/lib/python3.6/retrying.py�	wrapped_f$sz-retry.<locals>.wrap_simple.<locals>.wrapped_f)r�wraps)rrr	)rr
�wrap_simple"szretry.<locals>.wrap_simplecstj�����fdd��}|S)Ncst���j�f|�|�S)N)rr)rr)�dargs�dkwrr	r
r/sz&retry.<locals>.wrap.<locals>.wrapped_f)rr)rr)rr)rr
�wrap-szretry.<locals>.wrapN)�len�callable)rrr
rr	)rrr
�retrys
rc@sneZdZddd�Zdd�Zdd�Zd	d
�Zdd�Zd
d�Zdd�Z	dd�Z
dd�Zdd�Zdd�Z
dd�ZdS)rNFcs|dkrdn||_|dkrdn||_|dkr0dn||_|dkrBdn||_|dkrTdn||_|dkrfdn||_|	dkrxdn|	|_|
dkr�dn|
|_|dkr�tn||_	|dkr�dn||_
g�|dk	r̈j|j�|dk	r�j|j
�|dk	r�||_n&|dk�r
�fdd�|_nt||�|_dd�g�|dk	�r6�j|j�|dk	�sJ|dk	�rV�j|j�|dk	�sj|	dk	�rv�j|j�|
dk	�s�|dk	�r��j|j�|dk	�r�||_n&|dk�r‡fd	d�|_nt||�|_|dk�r�|j|_n||_|
dk�r�|j|_n|
|_||_dS)
N��di�rrcst��fdd��D��S)Nc3s|]}|���VqdS)Nr	)�.0r)�attempts�delayr	r
�	<genexpr>asz6Retrying.__init__.<locals>.<lambda>.<locals>.<genexpr>)�any)rr)�
stop_funcs)rrr
�<lambda>asz#Retrying.__init__.<locals>.<lambda>c_sdS)Nrr	)r�kwargsr	r	r
rhscst��fdd��D��S)Nc3s|]}|���VqdS)Nr	)rr)rrr	r
rysz6Retrying.__init__.<locals>.<lambda>.<locals>.<genexpr>)�max)rr)�
wait_funcs)rrr
rys)�_stop_max_attempt_number�_stop_max_delay�_wait_fixed�_wait_random_min�_wait_random_max�_wait_incrementing_start�_wait_incrementing_increment�_wait_exponential_multiplier�MAX_WAIT�_wait_exponential_max�_wait_jitter_max�append�stop_after_attempt�stop_after_delay�stop�getattr�fixed_sleep�random_sleep�incrementing_sleep�exponential_sleep�wait�
always_reject�_retry_on_exception�never_reject�_retry_on_result�_wrap_exception)�selfr.r4Zstop_max_attempt_numberZstop_max_delayZ
wait_fixedZwait_random_minZwait_random_maxZwait_incrementing_startZwait_incrementing_incrementZwait_exponential_multiplierZwait_exponential_maxZretry_on_exceptionZretry_on_result�wrap_exceptionZ	stop_funcZ	wait_funcZwait_jitter_maxr	)rrr
�__init__:sR








zRetrying.__init__cCs
||jkS)z;Stop after the previous attempt >= stop_max_attempt_number.)r )r:�previous_attempt_number�delay_since_first_attempt_msr	r	r
r,�szRetrying.stop_after_attemptcCs
||jkS)z=Stop after the time from the first attempt >= stop_max_delay.)r!)r:r=r>r	r	r
r-�szRetrying.stop_after_delaycCsdS)z#Don't sleep at all before retrying.rr	)r:r=r>r	r	r
�no_sleep�szRetrying.no_sleepcCs|jS)z0Sleep a fixed amount of time between each retry.)r")r:r=r>r	r	r
r0�szRetrying.fixed_sleepcCstj|j|j�S)zISleep a random amount of time between wait_random_min and wait_random_max)�randomZrandintr#r$)r:r=r>r	r	r
r1�szRetrying.random_sleepcCs$|j|j|d}|dkr d}|S)z�
        Sleep an incremental amount of time after each attempt, starting at
        wait_incrementing_start and incrementing by wait_incrementing_increment
        rr)r%r&)r:r=r>�resultr	r	r
r2�szRetrying.incrementing_sleepcCs2d|}|j|}||jkr"|j}|dkr.d}|S)N�r)r'r))r:r=r>ZexprAr	r	r
r3�s

zRetrying.exponential_sleepcCsdS)NFr	)r:rAr	r	r
r7�szRetrying.never_rejectcCsdS)NTr	)r:rAr	r	r
r5�szRetrying.always_rejectcCs4d}|jr ||j|jd�O}n||j|j�O}|S)NFr)�
has_exceptionr6�valuer8)r:�attemptZrejectr	r	r
�
should_reject�s
zRetrying.should_rejectc
Os�tttj�d��}d}x�yt|||�|d�}Wn tj�}t||d�}YnX|j|�sh|j|j�Stttj�d��|}|j	||�r�|jr�|j
r�|j��q�t|��n<|j||�}	|j
r�tj�|j
}
|	td|
�}	tj|	d�|d7}qWdS)Ni�rFTrg@�@)�int�round�time�Attempt�sys�exc_inforF�getr9r.rC�
RetryErrorr4r*r@r�sleep)r:�fnrrZ
start_time�attempt_numberrE�tbr>rOZjitterr	r	r
r�s*


z
Retrying.call)NNNNNNNNNNNNNFNNN)�__name__�
__module__�__qualname__r<r,r-r?r0r1r2r3r7r5rFrr	r	r	r
r8s0
F
		rc@s*eZdZdZdd�Zd
dd�Zdd�Zd	S)rJz�
    An Attempt encapsulates a call to a target function that may end as a
    normal return value from the function or an Exception depending on what
    occurred during the execution.
    cCs||_||_||_dS)N)rDrQrC)r:rDrQrCr	r	r
r<�szAttempt.__init__FcCs@|jr6|rt|��q<tj|jd|jd|jd�n|jSdS)z�
        Return the return value of this Attempt instance or raise an Exception.
        If wrap_exception is true, this Attempt is wrapped inside of a
        RetryError before being raised.
        rrrBN)rCrNrZreraiserD)r:r;r	r	r
rM�s

"zAttempt.getcCs:|jr&dj|jdjtj|jd���Sdj|j|j�SdS)NzAttempts: {0}, Error:
{1}�rBzAttempts: {0}, Value: {1})rC�formatrQ�join�	traceback�	format_tbrD)r:r	r	r
�__repr__�s zAttempt.__repr__N)F)rSrTrU�__doc__r<rMr[r	r	r	r
rJ�s
rJc@s eZdZdZdd�Zdd�ZdS)rNzU
    A RetryError encapsulates the last Attempt instance right before giving up.
    cCs
||_dS)N)�last_attempt)r:r]r	r	r
r<szRetryError.__init__cCsdj|j�S)NzRetryError[{0}])rWr])r:r	r	r
�__str__
szRetryError.__str__N)rSrTrUr\r<r^r	r	r	r
rNsrN)
r@Zpip._vendorrrKrIrYr(r�objectrrJ�	ExceptionrNr	r	r	r
�<module>s*!site-packages/pip/_vendor/__pycache__/appdirs.cpython-36.opt-1.pyc000064400000044153147511334610020763 0ustar003

���e`W�@s�dZd1Zdjeee��ZddlZddlZejddkZ	e	r>eZ
ejjd�r�ddlZej
�ddZejd�rrd	Zq�ejd
�r�dZq�dZnejZd2dd�Zd3dd�Zd4dd�Zd5dd�Zd6dd�Zd7dd�ZGdd�de�Zdd�Zdd �Zd!d"�Zd#d$�Zed	k�r�yddlZeZWnnek
�r�ydd%l m!Z!eZWnBek
�r|yddl"Z#eZWnek
�rveZYnXYnXYnXe$d&k�r~d'Z%d(Z&d8Z'e(d)�ee%e&d*d+�Z)x$e'D]Z*e(d,e*e+e)e*�f��q�We(d-�ee%e&�Z)x$e'D]Z*e(d,e*e+e)e*�f��q�We(d.�ee%�Z)x$e'D]Z*e(d,e*e+e)e*�f��q$We(d/�ee%d
d0�Z)x$e'D]Z*e(d,e*e+e)e*�f��q^WdS)9zyUtilities for determining application-specific dirs.

See <http://github.com/ActiveState/appdirs> for details and usage.
����.N��javaZWindows�win32ZMac�darwinZlinux2FcCs�tdkr^|dkr|}|rdpd}tjjt|��}|r�|dk	rNtjj|||�}q�tjj||�}nNtdkr�tjjd�}|r�tjj||�}n&tjdtjjd	��}|r�tjj||�}|r�|r�tjj||�}|S)
aJReturn full path to the user-specific data dir for this application.

        "appname" is the name of application.
            If None, just the system directory is returned.
        "appauthor" (only used on Windows) is the name of the
            appauthor or distributing body for this application. Typically
            it is the owning company name. This falls back to appname. You may
            pass False to disable it.
        "version" is an optional version path element to append to the
            path. You might want to use this if you want multiple versions
            of your app to be able to run independently. If used, this
            would typically be "<major>.<minor>".
            Only applied when appname is present.
        "roaming" (boolean, default False) can be set True to use the Windows
            roaming appdata directory. That means that for users on a Windows
            network setup for roaming profiles, this user data will be
            sync'd on login. See
            <http://technet.microsoft.com/en-us/library/cc766489(WS.10).aspx>
            for a discussion of issues.

    Typical user data directories are:
        macOS:                  ~/Library/Application Support/<AppName>
        Unix:                   ~/.local/share/<AppName>    # or in $XDG_DATA_HOME, if defined
        Win XP (not roaming):   C:\Documents and Settings\<username>\Application Data\<AppAuthor>\<AppName>
        Win XP (roaming):       C:\Documents and Settings\<username>\Local Settings\Application Data\<AppAuthor>\<AppName>
        Win 7  (not roaming):   C:\Users\<username>\AppData\Local\<AppAuthor>\<AppName>
        Win 7  (roaming):       C:\Users\<username>\AppData\Roaming\<AppAuthor>\<AppName>

    For Unix, we follow the XDG spec and support $XDG_DATA_HOME.
    That means, by default "~/.local/share/<AppName>".
    rN�
CSIDL_APPDATA�CSIDL_LOCAL_APPDATAFrz~/Library/Application Support/Z
XDG_DATA_HOMEz~/.local/share)�system�os�path�normpath�_get_win_folder�join�
expanduser�getenv)�appname�	appauthor�version�roaming�constr
�r�/usr/lib/python3.6/appdirs.py�
user_data_dir-s& rcs
tdkrR|dkr�}tjjtd��}�r�|dk	rBtjj||��}q�tjj|��}n�tdkrztjjd�}�r�tjj|��}nttjdtjjdd	g��}d
d�|j	tj�D�}�r�|r�tjj�|���fdd�|D�}|r�tjj|�}n|d
}|S�o�|�rtjj||�}|S)aiReturn full path to the user-shared data dir for this application.

        "appname" is the name of application.
            If None, just the system directory is returned.
        "appauthor" (only used on Windows) is the name of the
            appauthor or distributing body for this application. Typically
            it is the owning company name. This falls back to appname. You may
            pass False to disable it.
        "version" is an optional version path element to append to the
            path. You might want to use this if you want multiple versions
            of your app to be able to run independently. If used, this
            would typically be "<major>.<minor>".
            Only applied when appname is present.
        "multipath" is an optional parameter only applicable to *nix
            which indicates that the entire list of data dirs should be
            returned. By default, the first item from XDG_DATA_DIRS is
            returned, or '/usr/local/share/<AppName>',
            if XDG_DATA_DIRS is not set

    Typical user data directories are:
        macOS:      /Library/Application Support/<AppName>
        Unix:       /usr/local/share/<AppName> or /usr/share/<AppName>
        Win XP:     C:\Documents and Settings\All Users\Application Data\<AppAuthor>\<AppName>
        Vista:      (Fail! "C:\ProgramData" is a hidden *system* directory on Vista.)
        Win 7:      C:\ProgramData\<AppAuthor>\<AppName>   # Hidden, but writeable on Win 7.

    For Unix, this is using the $XDG_DATA_DIRS[0] default.

    WARNING: Do not use this on Windows. See the Vista-Fail note above for why.
    rN�CSIDL_COMMON_APPDATAFrz/Library/Application SupportZ
XDG_DATA_DIRSz/usr/local/sharez
/usr/sharecSs g|]}tjj|jtj���qSr)rr
r�rstrip�sep)�.0�xrrr�
<listcomp>�sz!site_data_dir.<locals>.<listcomp>csg|]}tjj|�g��qSr)rrr)rr)rrrr �sr)
rrr
rrrrr�pathsep�split)rrr�	multipathr
�pathlistr)rr�
site_data_dirds4
r%cCsXtdkrt||d|�}n&tjdtjjd��}|r>tjj||�}|rT|rTtjj||�}|S)a�Return full path to the user-specific config dir for this application.

        "appname" is the name of application.
            If None, just the system directory is returned.
        "appauthor" (only used on Windows) is the name of the
            appauthor or distributing body for this application. Typically
            it is the owning company name. This falls back to appname. You may
            pass False to disable it.
        "version" is an optional version path element to append to the
            path. You might want to use this if you want multiple versions
            of your app to be able to run independently. If used, this
            would typically be "<major>.<minor>".
            Only applied when appname is present.
        "roaming" (boolean, default False) can be set True to use the Windows
            roaming appdata directory. That means that for users on a Windows
            network setup for roaming profiles, this user data will be
            sync'd on login. See
            <http://technet.microsoft.com/en-us/library/cc766489(WS.10).aspx>
            for a discussion of issues.

    Typical user data directories are:
        macOS:                  same as user_data_dir
        Unix:                   ~/.config/<AppName>     # or in $XDG_CONFIG_HOME, if defined
        Win *:                  same as user_data_dir

    For Unix, we follow the XDG spec and support $XDG_CONFIG_HOME.
    That means, by deafult "~/.config/<AppName>".
    rrNZXDG_CONFIG_HOMEz	~/.config)rr)rrrrr
rr)rrrrr
rrr�user_config_dir�sr&cs�td	kr*t�|�}�r�|r�tjj||�}ndtjdd�}dd�|jtj�D�}�rt|rbtjj�|���fdd�|D�}|r�tjj|�}n|d}|S)
aReturn full path to the user-shared data dir for this application.

        "appname" is the name of application.
            If None, just the system directory is returned.
        "appauthor" (only used on Windows) is the name of the
            appauthor or distributing body for this application. Typically
            it is the owning company name. This falls back to appname. You may
            pass False to disable it.
        "version" is an optional version path element to append to the
            path. You might want to use this if you want multiple versions
            of your app to be able to run independently. If used, this
            would typically be "<major>.<minor>".
            Only applied when appname is present.
        "multipath" is an optional parameter only applicable to *nix
            which indicates that the entire list of config dirs should be
            returned. By default, the first item from XDG_CONFIG_DIRS is
            returned, or '/etc/xdg/<AppName>', if XDG_CONFIG_DIRS is not set

    Typical user data directories are:
        macOS:      same as site_data_dir
        Unix:       /etc/xdg/<AppName> or $XDG_CONFIG_DIRS[i]/<AppName> for each value in
                    $XDG_CONFIG_DIRS
        Win *:      same as site_data_dir
        Vista:      (Fail! "C:\ProgramData" is a hidden *system* directory on Vista.)

    For Unix, this is using the $XDG_CONFIG_DIRS[0] default, if multipath=False

    WARNING: Do not use this on Windows. See the Vista-Fail note above for why.
    rrZXDG_CONFIG_DIRSz/etc/xdgcSs g|]}tjj|jtj���qSr)rr
rrr)rrrrrr �sz#site_config_dir.<locals>.<listcomp>csg|]}tjj|�g��qSr)rrr)rr)rrrr �sr)rr)rr%rr
rrr"r!)rrrr#r
r$r)rr�site_config_dir�s
r'TcCs�tdkrd|dkr|}tjjtd��}|r�|dk	rBtjj|||�}ntjj||�}|r�tjj|d�}nNtdkr�tjjd�}|r�tjj||�}n&tjdtjjd	��}|r�tjj||�}|r�|r�tjj||�}|S)
aReturn full path to the user-specific cache dir for this application.

        "appname" is the name of application.
            If None, just the system directory is returned.
        "appauthor" (only used on Windows) is the name of the
            appauthor or distributing body for this application. Typically
            it is the owning company name. This falls back to appname. You may
            pass False to disable it.
        "version" is an optional version path element to append to the
            path. You might want to use this if you want multiple versions
            of your app to be able to run independently. If used, this
            would typically be "<major>.<minor>".
            Only applied when appname is present.
        "opinion" (boolean) can be False to disable the appending of
            "Cache" to the base app data dir for Windows. See
            discussion below.

    Typical user cache directories are:
        macOS:      ~/Library/Caches/<AppName>
        Unix:       ~/.cache/<AppName> (XDG default)
        Win XP:     C:\Documents and Settings\<username>\Local Settings\Application Data\<AppAuthor>\<AppName>\Cache
        Vista:      C:\Users\<username>\AppData\Local\<AppAuthor>\<AppName>\Cache

    On Windows the only suggestion in the MSDN docs is that local settings go in
    the `CSIDL_LOCAL_APPDATA` directory. This is identical to the non-roaming
    app data dir (the default returned by `user_data_dir` above). Apps typically
    put cache data somewhere *under* the given dir here. Some examples:
        ...\Mozilla\Firefox\Profiles\<ProfileName>\Cache
        ...\Acme\SuperApp\Cache\1.0
    OPINION: This function appends "Cache" to the `CSIDL_LOCAL_APPDATA` value.
    This can be disabled with the `opinion=False` option.
    rNr
FZCacherz~/Library/CachesZXDG_CACHE_HOMEz~/.cache)rrr
rrrrr)rrr�opinionr
rrr�user_cache_dirs(!r)cCs�tdkr tjjtjjd�|�}nNtdkrLt|||�}d}|rntjj|d�}n"t|||�}d}|rntjj|d�}|r�|r�tjj||�}|S)a�Return full path to the user-specific log dir for this application.

        "appname" is the name of application.
            If None, just the system directory is returned.
        "appauthor" (only used on Windows) is the name of the
            appauthor or distributing body for this application. Typically
            it is the owning company name. This falls back to appname. You may
            pass False to disable it.
        "version" is an optional version path element to append to the
            path. You might want to use this if you want multiple versions
            of your app to be able to run independently. If used, this
            would typically be "<major>.<minor>".
            Only applied when appname is present.
        "opinion" (boolean) can be False to disable the appending of
            "Logs" to the base app data dir for Windows, and "log" to the
            base cache dir for Unix. See discussion below.

    Typical user cache directories are:
        macOS:      ~/Library/Logs/<AppName>
        Unix:       ~/.cache/<AppName>/log  # or under $XDG_CACHE_HOME if defined
        Win XP:     C:\Documents and Settings\<username>\Local Settings\Application Data\<AppAuthor>\<AppName>\Logs
        Vista:      C:\Users\<username>\AppData\Local\<AppAuthor>\<AppName>\Logs

    On Windows the only suggestion in the MSDN docs is that local settings
    go in the `CSIDL_LOCAL_APPDATA` directory. (Note: I'm interested in
    examples of what some windows apps use for a logs dir.)

    OPINION: This function appends "Logs" to the `CSIDL_LOCAL_APPDATA`
    value for Windows and appends "log" to the user cache dir for Unix.
    This can be disabled with the `opinion=False` option.
    rz~/Library/LogsrFZLogs�log)rrr
rrrr))rrrr(r
rrr�user_log_dir:s  
r+c@sbeZdZdZddd�Zedd��Zedd	��Zed
d��Zedd
��Z	edd��Z
edd��ZdS)�AppDirsz1Convenience wrapper for getting application dirs.NFcCs"||_||_||_||_||_dS)N)rrrrr#)�selfrrrrr#rrr�__init__os
zAppDirs.__init__cCst|j|j|j|jd�S)N)rr)rrrrr)r-rrrrws
zAppDirs.user_data_dircCst|j|j|j|jd�S)N)rr#)r%rrrr#)r-rrrr%|s
zAppDirs.site_data_dircCst|j|j|j|jd�S)N)rr)r&rrrr)r-rrrr&�s
zAppDirs.user_config_dircCst|j|j|j|jd�S)N)rr#)r'rrrr#)r-rrrr'�s
zAppDirs.site_config_dircCst|j|j|jd�S)N)r)r)rrr)r-rrrr)�s
zAppDirs.user_cache_dircCst|j|j|jd�S)N)r)r+rrr)r-rrrr+�s
zAppDirs.user_log_dir)NNFF)�__name__�
__module__�__qualname__�__doc__r.�propertyrr%r&r'r)r+rrrrr,ms
r,cCs:ddl}dddd�|}|j|jd�}|j||�\}}|S)z�This is a fallback technique at best. I'm not sure if using the
    registry for this guarantees us the correct answer for all CSIDL_*
    names.
    rNZAppDatazCommon AppDataz
Local AppData)r	rr
z@Software\Microsoft\Windows\CurrentVersion\Explorer\Shell Folders)�_winreg�OpenKey�HKEY_CURRENT_USERZQueryValueEx)�
csidl_namer4Zshell_folder_name�key�dir�typerrr�_get_win_folder_from_registry�sr;cCs�ddlm}m}|jdt||�dd�}y`t|�}d}x|D]}t|�dkr:d}Pq:W|r�yddl}|j|�}Wnt	k
r�YnXWnt
k
r�YnX|S)Nr)�shellcon�shellF�T)�win32com.shellr<r=�SHGetFolderPath�getattr�unicode�ord�win32api�GetShortPathName�ImportError�UnicodeError)r7r<r=r9�
has_high_char�crDrrr�_get_win_folder_with_pywin32�s$

rJcCs�ddl}dddd�|}|jd�}|jjjd|dd|�d}x|D]}t|�dkrBd	}PqBW|r�|jd�}|jjj|j|d�r�|}|jS)
Nr��#�)r	rr
iFr>T)	�ctypesZcreate_unicode_buffer�windllZshell32ZSHGetFolderPathWrCZkernel32ZGetShortPathNameW�value)r7rNZcsidl_const�bufrHrIZbuf2rrr�_get_win_folder_with_ctypes�s"


rRcCs�ddl}ddlm}ddlm}|jjd}|jd|�}|jj	}|j
dt|j|�d|jj
|�|jj|j��jd�}d}x|D]}	t|	�dkr~d	}Pq~W|r�|jd|�}|jj	}
tj|||�r�|jj|j��jd�}|S)
Nr)�jna)r�rI�Fr>T)�arrayZcom.sunrSZcom.sun.jna.platformrZWinDefZMAX_PATHZzerosZShell32ZINSTANCEr@rAZShlObjZSHGFP_TYPE_CURRENTZNativeZtoStringZtostringrrCZKernel32ZkernalrE)r7rVrSrZbuf_sizerQr=r9rHrIZkernelrrr�_get_win_folder_with_jna�s&
rW)rO�__main__ZMyAppZ	MyCompanyz%-- app dirs (with optional 'version')z1.0)rz%s: %sz)
-- app dirs (without optional 'version')z+
-- app dirs (without optional 'appauthor')z(
-- app dirs (with disabled 'appauthor'))r)rrr)NNNF)NNNF)NNNF)NNNF)NNNT)NNNT)rr%r&r'r)r+),r2Z__version_info__r�map�str�__version__�sysr�version_infoZPY3rB�platform�
startswithZjava_verZos_namerrr%r&r'r)r+�objectr,r;rJrRrWr?Zwin32comrrFrNrOZcom.sun.jnaZcomr/rrZprops�print�dirsZproprArrrr�<module>	s~


7
B
(
3
9
3+






site-packages/pip/_vendor/__pycache__/six.cpython-36.pyc000064400000057532147511334610017172 0ustar003

���e�u�I@srdZddlmZddlZddlZddlZddlZddlZdZdZ	ej
ddkZej
ddkZej
dd��dzkZ
er�efZefZefZeZeZejZn�efZeefZeejfZeZeZejjd	�r�e�d|�ZnLGdd
�d
e�Z ye!e ��Wn e"k
�re�d~�ZYnXe�d��Z[ dd�Z#dd�Z$Gdd�de�Z%Gdd�de%�Z&Gdd�dej'�Z(Gdd�de%�Z)Gdd�de�Z*e*e+�Z,Gdd�de(�Z-e)ddd d!�e)d"d#d$d%d"�e)d&d#d#d'd&�e)d(d)d$d*d(�e)d+d)d,�e)d-d#d$d.d-�e)d/d0d0d1d/�e)d2d0d0d/d2�e)d3d)d$d4d3�e)d5d)e
�rd6nd7d8�e)d9d)d:�e)d;d<d=d>�e)d!d!d �e)d?d?d@�e)dAdAd@�e)dBdBd@�e)d4d)d$d4d3�e)dCd#d$dDdC�e)dEd#d#dFdE�e&d$d)�e&dGdH�e&dIdJ�e&dKdLdM�e&dNdOdN�e&dPdQdR�e&dSdTdU�e&dVdWdX�e&dYdZd[�e&d\d]d^�e&d_d`da�e&dbdcdd�e&dedfdg�e&dhdidj�e&dkdkdl�e&dmdmdl�e&dndndl�e&dododp�e&dqdr�e&dsdt�e&dudv�e&dwdxdw�e&dydz�e&d{d|d}�e&d~dd��e&d�d�d��e&d�d�d��e&d�d�d��e&d�d�d��e&d�d�d��e&d�d�d��e&d�d�d��e&d�d�d��e&d�d�d��e&d�d�d��e&d�d�d��e&d�d�d��e&d�e+d�d��e&d�e+d�d��e&d�e+d�e+d��e&d�d�d��e&d�d�d��e&d�d�d��g>Z.ejd�k�rZe.e&d�d��g7Z.x:e.D]2Z/e0e-e/j1e/�e2e/e&��r`e,j3e/d�e/j1��q`W[/e.e-_.e-e+d��Z4e,j3e4d��Gd�d��d�e(�Z5e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d>d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��gZ6xe6D]Z/e0e5e/j1e/��q�W[/e6e5_.e,j3e5e+d��d�dӃGd�dՄd�e(�Z7e)d�d�d��e)d�d�d��e)d�d�d��gZ8xe8D]Z/e0e7e/j1e/��q$W[/e8e7_.e,j3e7e+d��d�d܃Gd�dބd�e(�Z9e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)�dd�d�g!Z:xe:D]Z/e0e9e/j1e/��q�W[/e:e9_.e,j3e9e+�d��d�d�G�d�d��de(�Z;e)�dd��d�e)�dd��d�e)�d	d��d�e)�d
d��d�gZ<xe<D]Z/e0e;e/j1e/��qTW[/e<e;_.e,j3e;e+�d��d�d
�G�d�d��de(�Z=e)�dd�d��gZ>xe>D]Z/e0e=e/j1e/��q�W[/e>e=_.e,j3e=e+�d��d�d�G�d�d��dej'�Z?e,j3e?e+d���d��d�d�Z@�d�d�ZAe�	rj�dZB�dZC�dZD�dZE�dZF�d ZGn$�d!ZB�d"ZC�d#ZD�d$ZE�d%ZF�d&ZGyeHZIWn"eJk
�	r��d'�d(�ZIYnXeIZHyeKZKWn"eJk
�	r��d)�d*�ZKYnXe�
r�d+�d,�ZLejMZN�d-�d.�ZOeZPn>�d/�d,�ZL�d0�d1�ZN�d2�d.�ZOG�d3�d4��d4e�ZPeKZKe#eL�d5�ejQeB�ZRejQeC�ZSejQeD�ZTejQeE�ZUejQeF�ZVejQeG�ZWe�
r��d6�d7�ZX�d8�d9�ZY�d:�d;�ZZ�d<�d=�Z[ej\�d>�Z]ej\�d?�Z^ej\�d@�Z_nT�dA�d7�ZX�dB�d9�ZY�dC�d;�ZZ�dD�d=�Z[ej\�dE�Z]ej\�dF�Z^ej\�dG�Z_e#eX�dH�e#eY�dI�e#eZ�dJ�e#e[�dK�e�r�dL�dM�Z`�dN�dO�ZaebZcddldZdedje�dP�jfZg[dejhd�ZiejjZkelZmddlnZnenjoZoenjpZp�dQZqej
d
d
k�r�dRZr�dSZsn�dTZr�dUZsnj�dV�dM�Z`�dW�dO�ZaecZcebZg�dX�dY�Zi�dZ�d[�Zkejtejuev�ZmddloZoeojoZoZp�d\Zq�dRZr�dSZse#e`�d]�e#ea�d^��d_�dQ�Zw�d`�dT�Zx�da�dU�Zye�r�eze4j{�db�Z|�d��dc�dd�Z}n�d��de�df�Z|e|�dg�ej
dd��d�k�
re|�dh�n.ej
dd��d�k�
r8e|�di�n�dj�dk�Z~eze4j{�dld�Zedk�
rj�dm�dn�Zej
dd��d�k�
r�eZ��do�dn�Ze#e}�dp�ej
dd��d�k�
r�ej�ej�f�dq�dr�Z�nej�Z��ds�dt�Z��du�dv�Z��dw�dx�Z�gZ�e+Z�e��j��dy�dk	�rge�_�ej��rbx>e�ej��D]0\Z�Z�ee��j+dk�r*e�j1e+k�r*ej�e�=P�q*W[�[�ej�j�e,�dS(�z6Utilities for writing code that runs on Python 2 and 3�)�absolute_importNz'Benjamin Peterson <benjamin@python.org>z1.10.0����java��c@seZdZdd�ZdS)�XcCsdS)Nrrl�)�selfr
r
�/usr/lib/python3.6/six.py�__len__>sz	X.__len__N)�__name__�
__module__�__qualname__r
r
r
r
rr	<sr	�?cCs
||_dS)z Add documentation to a function.N)�__doc__)�func�docr
r
r�_add_docKsrcCst|�tj|S)z7Import module, returning the module after the last dot.)�
__import__�sys�modules)�namer
r
r�_import_modulePsrc@seZdZdd�Zdd�ZdS)�
_LazyDescrcCs
||_dS)N)r)rrr
r
r�__init__Xsz_LazyDescr.__init__cCsB|j�}t||j|�yt|j|j�Wntk
r<YnX|S)N)�_resolve�setattrr�delattr�	__class__�AttributeError)r�obj�tp�resultr
r
r�__get__[sz_LazyDescr.__get__N)rrrrr%r
r
r
rrVsrcs.eZdZd�fdd�	Zdd�Zdd�Z�ZS)	�MovedModuleNcs2tt|�j|�tr(|dkr |}||_n||_dS)N)�superr&r�PY3�mod)rr�old�new)r r
rriszMovedModule.__init__cCs
t|j�S)N)rr))rr
r
rrrszMovedModule._resolvecCs"|j�}t||�}t|||�|S)N)r�getattrr)r�attr�_module�valuer
r
r�__getattr__us
zMovedModule.__getattr__)N)rrrrrr0�
__classcell__r
r
)r rr&gs	r&cs(eZdZ�fdd�Zdd�ZgZ�ZS)�_LazyModulecstt|�j|�|jj|_dS)N)r'r2rr r)rr)r r
rr~sz_LazyModule.__init__cCs ddg}|dd�|jD�7}|S)NrrcSsg|]
}|j�qSr
)r)�.0r-r
r
r�
<listcomp>�sz'_LazyModule.__dir__.<locals>.<listcomp>)�_moved_attributes)rZattrsr
r
r�__dir__�sz_LazyModule.__dir__)rrrrr6r5r1r
r
)r rr2|sr2cs&eZdZd�fdd�	Zdd�Z�ZS)�MovedAttributeNcsdtt|�j|�trH|dkr |}||_|dkr@|dkr<|}n|}||_n||_|dkrZ|}||_dS)N)r'r7rr(r)r-)rrZold_modZnew_modZold_attrZnew_attr)r r
rr�szMovedAttribute.__init__cCst|j�}t||j�S)N)rr)r,r-)r�moduler
r
rr�s
zMovedAttribute._resolve)NN)rrrrrr1r
r
)r rr7�sr7c@sVeZdZdZdd�Zdd�Zdd�Zdd	d
�Zdd�Zd
d�Z	dd�Z
dd�ZeZdS)�_SixMetaPathImporterz�
    A meta path importer to import six.moves and its submodules.

    This class implements a PEP302 finder and loader. It should be compatible
    with Python 2.5 and all existing versions of Python3
    cCs||_i|_dS)N)r�
known_modules)rZsix_module_namer
r
rr�sz_SixMetaPathImporter.__init__cGs&x |D]}||j|jd|<qWdS)N�.)r:r)rr)Z	fullnames�fullnamer
r
r�_add_module�s
z _SixMetaPathImporter._add_modulecCs|j|jd|S)Nr;)r:r)rr<r
r
r�_get_module�sz _SixMetaPathImporter._get_moduleNcCs||jkr|SdS)N)r:)rr<�pathr
r
r�find_module�s
z _SixMetaPathImporter.find_modulecCs0y
|j|Stk
r*td|��YnXdS)Nz!This loader does not know module )r:�KeyError�ImportError)rr<r
r
rZ__get_module�s
z!_SixMetaPathImporter.__get_modulecCsRy
tj|Stk
rYnX|j|�}t|t�r>|j�}n||_|tj|<|S)N)rrrA� _SixMetaPathImporter__get_module�
isinstancer&r�
__loader__)rr<r)r
r
r�load_module�s




z _SixMetaPathImporter.load_modulecCst|j|�d�S)z�
        Return true, if the named module is a package.

        We need this method to get correct spec objects with
        Python 3.4 (see PEP451)
        �__path__)�hasattrrC)rr<r
r
r�
is_package�sz_SixMetaPathImporter.is_packagecCs|j|�dS)z;Return None

        Required, if is_package is implementedN)rC)rr<r
r
r�get_code�s
z_SixMetaPathImporter.get_code)N)
rrrrrr=r>r@rCrFrIrJ�
get_sourcer
r
r
rr9�s
	r9c@seZdZdZgZdS)�_MovedItemszLazy loading of moved objectsN)rrrrrGr
r
r
rrL�srLZ	cStringIO�io�StringIO�filter�	itertools�builtinsZifilter�filterfalseZifilterfalse�inputZ__builtin__Z	raw_input�internr�map�imap�getcwd�osZgetcwdu�getcwdb�rangeZxrangeZ
reload_module�	importlibZimp�reload�reduce�	functoolsZshlex_quoteZpipesZshlexZquote�UserDict�collections�UserList�
UserString�zipZizip�zip_longestZizip_longestZconfigparserZConfigParser�copyregZcopy_regZdbm_gnuZgdbmzdbm.gnuZ
_dummy_threadZdummy_threadZhttp_cookiejarZ	cookielibzhttp.cookiejarZhttp_cookiesZCookiezhttp.cookiesZ
html_entitiesZhtmlentitydefsz
html.entitiesZhtml_parserZ
HTMLParserzhtml.parserZhttp_clientZhttplibzhttp.clientZemail_mime_multipartzemail.MIMEMultipartzemail.mime.multipartZemail_mime_nonmultipartzemail.MIMENonMultipartzemail.mime.nonmultipartZemail_mime_textzemail.MIMETextzemail.mime.textZemail_mime_basezemail.MIMEBasezemail.mime.baseZBaseHTTPServerzhttp.serverZ
CGIHTTPServerZSimpleHTTPServerZcPickle�pickleZqueueZQueue�reprlib�reprZsocketserverZSocketServer�_threadZthreadZtkinterZTkinterZtkinter_dialogZDialogztkinter.dialogZtkinter_filedialogZ
FileDialogztkinter.filedialogZtkinter_scrolledtextZScrolledTextztkinter.scrolledtextZtkinter_simpledialogZSimpleDialogztkinter.simpledialogZtkinter_tixZTixztkinter.tixZtkinter_ttkZttkztkinter.ttkZtkinter_constantsZTkconstantsztkinter.constantsZtkinter_dndZTkdndztkinter.dndZtkinter_colorchooserZtkColorChooserztkinter.colorchooserZtkinter_commondialogZtkCommonDialogztkinter.commondialogZtkinter_tkfiledialogZtkFileDialogZtkinter_fontZtkFontztkinter.fontZtkinter_messageboxZtkMessageBoxztkinter.messageboxZtkinter_tksimpledialogZtkSimpleDialogZurllib_parsez.moves.urllib_parsezurllib.parseZurllib_errorz.moves.urllib_errorzurllib.errorZurllibz
.moves.urllibZurllib_robotparser�robotparserzurllib.robotparserZ
xmlrpc_clientZ	xmlrpclibz
xmlrpc.clientZ
xmlrpc_serverZSimpleXMLRPCServerz
xmlrpc.serverZwin32�winreg�_winregzmoves.z.moves�movesc@seZdZdZdS)�Module_six_moves_urllib_parsez7Lazy loading of moved objects in six.moves.urllib_parseN)rrrrr
r
r
rrn@srnZParseResultZurlparseZSplitResultZparse_qsZ	parse_qslZ	urldefragZurljoinZurlsplitZ
urlunparseZ
urlunsplitZ
quote_plusZunquoteZunquote_plusZ	urlencodeZ
splitqueryZsplittagZ	splituserZ
uses_fragmentZuses_netlocZuses_paramsZ
uses_queryZ
uses_relativezmoves.urllib_parsezmoves.urllib.parsec@seZdZdZdS)�Module_six_moves_urllib_errorz7Lazy loading of moved objects in six.moves.urllib_errorN)rrrrr
r
r
rrohsroZURLErrorZurllib2Z	HTTPErrorZContentTooShortErrorz.moves.urllib.errorzmoves.urllib_errorzmoves.urllib.errorc@seZdZdZdS)�Module_six_moves_urllib_requestz9Lazy loading of moved objects in six.moves.urllib_requestN)rrrrr
r
r
rrp|srpZurlopenzurllib.requestZinstall_openerZbuild_openerZpathname2urlZurl2pathnameZ
getproxiesZRequestZOpenerDirectorZHTTPDefaultErrorHandlerZHTTPRedirectHandlerZHTTPCookieProcessorZProxyHandlerZBaseHandlerZHTTPPasswordMgrZHTTPPasswordMgrWithDefaultRealmZAbstractBasicAuthHandlerZHTTPBasicAuthHandlerZProxyBasicAuthHandlerZAbstractDigestAuthHandlerZHTTPDigestAuthHandlerZProxyDigestAuthHandlerZHTTPHandlerZHTTPSHandlerZFileHandlerZ
FTPHandlerZCacheFTPHandlerZUnknownHandlerZHTTPErrorProcessorZurlretrieveZ
urlcleanupZ	URLopenerZFancyURLopenerZproxy_bypassz.moves.urllib.requestzmoves.urllib_requestzmoves.urllib.requestc@seZdZdZdS)� Module_six_moves_urllib_responsez:Lazy loading of moved objects in six.moves.urllib_responseN)rrrrr
r
r
rrq�srqZaddbasezurllib.responseZaddclosehookZaddinfoZ
addinfourlz.moves.urllib.responsezmoves.urllib_responsezmoves.urllib.responsec@seZdZdZdS)�#Module_six_moves_urllib_robotparserz=Lazy loading of moved objects in six.moves.urllib_robotparserN)rrrrr
r
r
rrr�srrZRobotFileParserz.moves.urllib.robotparserzmoves.urllib_robotparserzmoves.urllib.robotparserc@sNeZdZdZgZejd�Zejd�Zejd�Z	ejd�Z
ejd�Zdd�Zd	S)
�Module_six_moves_urllibzICreate a six.moves.urllib namespace that resembles the Python 3 namespacezmoves.urllib_parsezmoves.urllib_errorzmoves.urllib_requestzmoves.urllib_responsezmoves.urllib_robotparsercCsdddddgS)N�parse�error�request�responserjr
)rr
r
rr6�szModule_six_moves_urllib.__dir__N)
rrrrrG�	_importerr>rtrurvrwrjr6r
r
r
rrs�s




rszmoves.urllibcCstt|j|�dS)zAdd an item to six.moves.N)rrLr)Zmover
r
r�add_move�srycCsXytt|�WnDtk
rRytj|=Wn"tk
rLtd|f��YnXYnXdS)zRemove item from six.moves.zno such move, %rN)rrLr!rm�__dict__rA)rr
r
r�remove_move�sr{�__func__�__self__�__closure__�__code__�__defaults__�__globals__�im_funcZim_selfZfunc_closureZ	func_codeZ
func_defaultsZfunc_globalscCs|j�S)N)�next)�itr
r
r�advance_iteratorsr�cCstdd�t|�jD��S)Ncss|]}d|jkVqdS)�__call__N)rz)r3�klassr
r
r�	<genexpr>szcallable.<locals>.<genexpr>)�any�type�__mro__)r"r
r
r�callablesr�cCs|S)Nr
)�unboundr
r
r�get_unbound_functionsr�cCs|S)Nr
)r�clsr
r
r�create_unbound_methodsr�cCs|jS)N)r�)r�r
r
rr�"scCstj|||j�S)N)�types�
MethodTyper )rr"r
r
r�create_bound_method%sr�cCstj|d|�S)N)r�r�)rr�r
r
rr�(sc@seZdZdd�ZdS)�IteratorcCst|�j|�S)N)r��__next__)rr
r
rr�-sz
Iterator.nextN)rrrr�r
r
r
rr�+sr�z3Get the function out of a possibly unbound functioncKst|jf|��S)N)�iter�keys)�d�kwr
r
r�iterkeys>sr�cKst|jf|��S)N)r��values)r�r�r
r
r�
itervaluesAsr�cKst|jf|��S)N)r��items)r�r�r
r
r�	iteritemsDsr�cKst|jf|��S)N)r�Zlists)r�r�r
r
r�	iterlistsGsr�r�r�r�cKs|jf|�S)N)r�)r�r�r
r
rr�PscKs|jf|�S)N)r�)r�r�r
r
rr�SscKs|jf|�S)N)r�)r�r�r
r
rr�VscKs|jf|�S)N)r�)r�r�r
r
rr�Ys�viewkeys�
viewvalues�	viewitemsz1Return an iterator over the keys of a dictionary.z3Return an iterator over the values of a dictionary.z?Return an iterator over the (key, value) pairs of a dictionary.zBReturn an iterator over the (key, [values]) pairs of a dictionary.cCs
|jd�S)Nzlatin-1)�encode)�sr
r
r�bksr�cCs|S)Nr
)r�r
r
r�unsr�z>B�assertCountEqualZassertRaisesRegexpZassertRegexpMatches�assertRaisesRegex�assertRegexcCs|S)Nr
)r�r
r
rr��scCst|jdd�d�S)Nz\\z\\\\Zunicode_escape)�unicode�replace)r�r
r
rr��scCst|d�S)Nr)�ord)Zbsr
r
r�byte2int�sr�cCst||�S)N)r�)Zbuf�ir
r
r�
indexbytes�sr�ZassertItemsEqualzByte literalzText literalcOst|t�||�S)N)r,�_assertCountEqual)r�args�kwargsr
r
rr��scOst|t�||�S)N)r,�_assertRaisesRegex)rr�r�r
r
rr��scOst|t�||�S)N)r,�_assertRegex)rr�r�r
r
rr��s�execcCs*|dkr|�}|j|k	r"|j|��|�dS)N)�
__traceback__�with_traceback)r#r/�tbr
r
r�reraise�s


r�cCsB|dkr*tjd�}|j}|dkr&|j}~n|dkr6|}td�dS)zExecute code in a namespace.Nrzexec _code_ in _globs_, _locs_)r�	_getframe�	f_globals�f_localsr�)Z_code_Z_globs_Z_locs_�framer
r
r�exec_�s
r�z9def reraise(tp, value, tb=None):
    raise tp, value, tb
zrdef raise_from(value, from_value):
    if from_value is None:
        raise value
    raise value from from_value
zCdef raise_from(value, from_value):
    raise value from from_value
cCs|�dS)Nr
)r/Z
from_valuer
r
r�
raise_from�sr��printc
s6|jdtj���dkrdS�fdd�}d}|jdd�}|dk	r`t|t�rNd}nt|t�s`td��|jd	d�}|dk	r�t|t�r�d}nt|t�s�td
��|r�td��|s�x|D]}t|t�r�d}Pq�W|r�td�}td
�}nd}d
}|dkr�|}|dk�r�|}x,t|�D] \}	}|	�r||�||��qW||�dS)z4The new-style print function for Python 2.4 and 2.5.�fileNcsdt|t�st|�}t�t�rVt|t�rV�jdk	rVt�dd�}|dkrHd}|j�j|�}�j|�dS)N�errors�strict)	rD�
basestring�strr�r��encodingr,r��write)�datar�)�fpr
rr��s



zprint_.<locals>.writeF�sepTzsep must be None or a string�endzend must be None or a stringz$invalid keyword arguments to print()�
� )�popr�stdoutrDr�r��	TypeError�	enumerate)
r�r�r�Zwant_unicoder�r��arg�newlineZspacer�r
)r�r�print_�sL







r�cOs<|jdtj�}|jdd�}t||�|r8|dk	r8|j�dS)Nr��flushF)�getrr�r��_printr�)r�r�r�r�r
r
rr�s

zReraise an exception.cs���fdd�}|S)Ncstj����|�}�|_|S)N)r^�wraps�__wrapped__)�f)�assigned�updated�wrappedr
r�wrapperszwraps.<locals>.wrapperr
)r�r�r�r�r
)r�r�r�rr�sr�cs&G��fdd�d��}tj|dfi�S)z%Create a base class with a metaclass.cseZdZ��fdd�ZdS)z!with_metaclass.<locals>.metaclasscs�|�|�S)Nr
)r�rZ
this_basesr�)�bases�metar
r�__new__'sz)with_metaclass.<locals>.metaclass.__new__N)rrrr�r
)r�r�r
r�	metaclass%sr�Ztemporary_class)r�r�)r�r�r�r
)r�r�r�with_metaclass sr�cs�fdd�}|S)z6Class decorator for creating a class with a metaclass.csl|jj�}|jd�}|dk	rDt|t�r,|g}x|D]}|j|�q2W|jdd�|jdd��|j|j|�S)N�	__slots__rz�__weakref__)rz�copyr�rDr�r�r�	__bases__)r�Z	orig_vars�slotsZ	slots_var)r�r
rr�.s



zadd_metaclass.<locals>.wrapperr
)r�r�r
)r�r�
add_metaclass,sr�cCs2tr.d|jkrtd|j��|j|_dd�|_|S)a
    A decorator that defines __unicode__ and __str__ methods under Python 2.
    Under Python 3 it does nothing.

    To support Python 2 and 3 with a single code base, define a __str__ method
    returning text and apply this decorator to the class.
    �__str__zY@python_2_unicode_compatible cannot be applied to %s because it doesn't define __str__().cSs|j�jd�S)Nzutf-8)�__unicode__r�)rr
r
r�<lambda>Jsz-python_2_unicode_compatible.<locals>.<lambda>)�PY2rz�
ValueErrorrr�r�)r�r
r
r�python_2_unicode_compatible<s


r��__spec__)rrli���li���ll����)N)NN)rr)rr)rr)rr)�rZ
__future__rr^rP�operatorrr��
__author__�__version__�version_infor�r(ZPY34r�Zstring_types�intZ
integer_typesr�Zclass_typesZ	text_type�bytesZbinary_type�maxsizeZMAXSIZEr�ZlongZ	ClassTyper��platform�
startswith�objectr	�len�
OverflowErrorrrrr&�
ModuleTyper2r7r9rrxrLr5r-rrrDr=rmrnZ_urllib_parse_moved_attributesroZ_urllib_error_moved_attributesrpZ _urllib_request_moved_attributesrqZ!_urllib_response_moved_attributesrrZ$_urllib_robotparser_moved_attributesrsryr{Z
_meth_funcZ
_meth_selfZ
_func_closureZ
_func_codeZ_func_defaultsZ
_func_globalsr�r��	NameErrorr�r�r�r�r�r��
attrgetterZget_method_functionZget_method_selfZget_function_closureZget_function_codeZget_function_defaultsZget_function_globalsr�r�r�r��methodcallerr�r�r�r�r��chrZunichr�struct�Struct�packZint2byte�
itemgetterr��getitemr�r�Z	iterbytesrMrN�BytesIOr�r�r��partialrVr�r�r�r�r,rQr�r�r�r�r��WRAPPER_ASSIGNMENTS�WRAPPER_UPDATESr�r�r�r�rG�__package__�globalsr�r��submodule_search_locations�	meta_pathr�r�Zimporter�appendr
r
r
r�<module>s�

>












































































































site-packages/pip/_vendor/__pycache__/__init__.cpython-36.opt-1.pyc
    [compiled CPython 3.6 bytecode of pip/_vendor/__init__.py; the only readable fragment
    is its docstring: "pip._vendor is for vendoring dependencies of pip to prevent needing
    pip to depend on something external. Files inside of pip._vendor should be considered
    immutable and should only be updated to versions from upstream."]

site-packages/pip/_vendor/__pycache__/re-vendor.cpython-36.opt-1.pyc
    [compiled CPython 3.6 bytecode of pip/_vendor/re-vendor.py; no readable content beyond
    the usage string "Usage: re-vendor.py [clean|vendor]".]

site-packages/pip/_vendor/colorama/initialise.py
# Copyright Jonathan Hartley 2013. BSD 3-Clause license, see LICENSE file.
import atexit
import contextlib
import sys

from .ansitowin32 import AnsiToWin32


orig_stdout = None
orig_stderr = None

wrapped_stdout = None
wrapped_stderr = None

atexit_done = False


def reset_all():
    if AnsiToWin32 is not None:    # Issue #74: objects might become None at exit
        AnsiToWin32(orig_stdout).reset_all()


def init(autoreset=False, convert=None, strip=None, wrap=True):

    if not wrap and any([autoreset, convert, strip]):
        raise ValueError('wrap=False conflicts with any other arg=True')

    global wrapped_stdout, wrapped_stderr
    global orig_stdout, orig_stderr

    orig_stdout = sys.stdout
    orig_stderr = sys.stderr

    if sys.stdout is None:
        wrapped_stdout = None
    else:
        sys.stdout = wrapped_stdout = \
            wrap_stream(orig_stdout, convert, strip, autoreset, wrap)
    if sys.stderr is None:
        wrapped_stderr = None
    else:
        sys.stderr = wrapped_stderr = \
            wrap_stream(orig_stderr, convert, strip, autoreset, wrap)

    global atexit_done
    if not atexit_done:
        atexit.register(reset_all)
        atexit_done = True


def deinit():
    if orig_stdout is not None:
        sys.stdout = orig_stdout
    if orig_stderr is not None:
        sys.stderr = orig_stderr


@contextlib.contextmanager
def colorama_text(*args, **kwargs):
    init(*args, **kwargs)
    try:
        yield
    finally:
        deinit()


def reinit():
    if wrapped_stdout is not None:
        sys.stdout = wrapped_stdout
    if wrapped_stderr is not None:
        sys.stderr = wrapped_stderr


def wrap_stream(stream, convert, strip, autoreset, wrap):
    if wrap:
        wrapper = AnsiToWin32(stream,
            convert=convert, strip=strip, autoreset=autoreset)
        if wrapper.should_wrap():
            stream = wrapper.stream
    return stream
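
A minimal usage sketch of the API above, added for illustration; it is not part of the
vendored module, it assumes the top-level colorama distribution is importable (rather than
pip's private pip._vendor copy), and the printed strings are placeholders.

from colorama import init, deinit, colorama_text, Fore, Style

# Wrap sys.stdout / sys.stderr once at start-up. autoreset=True makes the proxy
# reset colours (Style.RESET_ALL) after every write, so they never leak.
init(autoreset=True)
print(Fore.GREEN + 'status: ok')
deinit()                      # put the original streams back

# Context-manager form: init() on entry, deinit() on exit.
with colorama_text():
    print(Fore.RED + 'something went wrong' + Style.RESET_ALL)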


site-packages/pip/_vendor/colorama/ansitowin32.py000064400000022704147511334610016051 0ustar00# Copyright Jonathan Hartley 2013. BSD 3-Clause license, see LICENSE file.
import re
import sys
import os

from .ansi import AnsiFore, AnsiBack, AnsiStyle, Style
from .winterm import WinTerm, WinColor, WinStyle
from .win32 import windll, winapi_test


winterm = None
if windll is not None:
    winterm = WinTerm()


def is_stream_closed(stream):
    return not hasattr(stream, 'closed') or stream.closed


def is_a_tty(stream):
    return hasattr(stream, 'isatty') and stream.isatty()


class StreamWrapper(object):
    '''
    Wraps a stream (such as stdout), acting as a transparent proxy for all
    attribute access apart from method 'write()', which is delegated to our
    Converter instance.
    '''
    def __init__(self, wrapped, converter):
        # double-underscore everything to prevent clashes with names of
        # attributes on the wrapped stream object.
        self.__wrapped = wrapped
        self.__convertor = converter

    def __getattr__(self, name):
        return getattr(self.__wrapped, name)

    def write(self, text):
        self.__convertor.write(text)


class AnsiToWin32(object):
    '''
    Implements a 'write()' method which, on Windows, will strip ANSI character
    sequences from the text, and if outputting to a tty, will convert them into
    win32 function calls.
    '''
    ANSI_CSI_RE = re.compile('\001?\033\[((?:\d|;)*)([a-zA-Z])\002?')     # Control Sequence Introducer
    ANSI_OSC_RE = re.compile('\001?\033\]((?:.|;)*?)(\x07)\002?')         # Operating System Command

    def __init__(self, wrapped, convert=None, strip=None, autoreset=False):
        # The wrapped stream (normally sys.stdout or sys.stderr)
        self.wrapped = wrapped

        # should we reset colors to defaults after every .write()
        self.autoreset = autoreset

        # create the proxy wrapping our output stream
        self.stream = StreamWrapper(wrapped, self)

        on_windows = os.name == 'nt'
        # We test if the WinAPI works, because even if we are on Windows
        # we may be using a terminal that doesn't support the WinAPI
        # (e.g. Cygwin Terminal). In this case it's up to the terminal
        # to support the ANSI codes.
        conversion_supported = on_windows and winapi_test()

        # should we strip ANSI sequences from our output?
        if strip is None:
            strip = conversion_supported or (not is_stream_closed(wrapped) and not is_a_tty(wrapped))
        self.strip = strip

        # should we convert ANSI sequences into win32 calls?
        if convert is None:
            convert = conversion_supported and not is_stream_closed(wrapped) and is_a_tty(wrapped)
        self.convert = convert

        # dict of ansi codes to win32 functions and parameters
        self.win32_calls = self.get_win32_calls()

        # are we wrapping stderr?
        self.on_stderr = self.wrapped is sys.stderr

    def should_wrap(self):
        '''
        True if this class is actually needed. If false, then the output
        stream will not be affected, nor will win32 calls be issued, so
        wrapping stdout is not actually required. This will generally be
        False on non-Windows platforms, unless optional functionality like
        autoreset has been requested using kwargs to init()
        '''
        return self.convert or self.strip or self.autoreset

    def get_win32_calls(self):
        if self.convert and winterm:
            return {
                AnsiStyle.RESET_ALL: (winterm.reset_all, ),
                AnsiStyle.BRIGHT: (winterm.style, WinStyle.BRIGHT),
                AnsiStyle.DIM: (winterm.style, WinStyle.NORMAL),
                AnsiStyle.NORMAL: (winterm.style, WinStyle.NORMAL),
                AnsiFore.BLACK: (winterm.fore, WinColor.BLACK),
                AnsiFore.RED: (winterm.fore, WinColor.RED),
                AnsiFore.GREEN: (winterm.fore, WinColor.GREEN),
                AnsiFore.YELLOW: (winterm.fore, WinColor.YELLOW),
                AnsiFore.BLUE: (winterm.fore, WinColor.BLUE),
                AnsiFore.MAGENTA: (winterm.fore, WinColor.MAGENTA),
                AnsiFore.CYAN: (winterm.fore, WinColor.CYAN),
                AnsiFore.WHITE: (winterm.fore, WinColor.GREY),
                AnsiFore.RESET: (winterm.fore, ),
                AnsiFore.LIGHTBLACK_EX: (winterm.fore, WinColor.BLACK, True),
                AnsiFore.LIGHTRED_EX: (winterm.fore, WinColor.RED, True),
                AnsiFore.LIGHTGREEN_EX: (winterm.fore, WinColor.GREEN, True),
                AnsiFore.LIGHTYELLOW_EX: (winterm.fore, WinColor.YELLOW, True),
                AnsiFore.LIGHTBLUE_EX: (winterm.fore, WinColor.BLUE, True),
                AnsiFore.LIGHTMAGENTA_EX: (winterm.fore, WinColor.MAGENTA, True),
                AnsiFore.LIGHTCYAN_EX: (winterm.fore, WinColor.CYAN, True),
                AnsiFore.LIGHTWHITE_EX: (winterm.fore, WinColor.GREY, True),
                AnsiBack.BLACK: (winterm.back, WinColor.BLACK),
                AnsiBack.RED: (winterm.back, WinColor.RED),
                AnsiBack.GREEN: (winterm.back, WinColor.GREEN),
                AnsiBack.YELLOW: (winterm.back, WinColor.YELLOW),
                AnsiBack.BLUE: (winterm.back, WinColor.BLUE),
                AnsiBack.MAGENTA: (winterm.back, WinColor.MAGENTA),
                AnsiBack.CYAN: (winterm.back, WinColor.CYAN),
                AnsiBack.WHITE: (winterm.back, WinColor.GREY),
                AnsiBack.RESET: (winterm.back, ),
                AnsiBack.LIGHTBLACK_EX: (winterm.back, WinColor.BLACK, True),
                AnsiBack.LIGHTRED_EX: (winterm.back, WinColor.RED, True),
                AnsiBack.LIGHTGREEN_EX: (winterm.back, WinColor.GREEN, True),
                AnsiBack.LIGHTYELLOW_EX: (winterm.back, WinColor.YELLOW, True),
                AnsiBack.LIGHTBLUE_EX: (winterm.back, WinColor.BLUE, True),
                AnsiBack.LIGHTMAGENTA_EX: (winterm.back, WinColor.MAGENTA, True),
                AnsiBack.LIGHTCYAN_EX: (winterm.back, WinColor.CYAN, True),
                AnsiBack.LIGHTWHITE_EX: (winterm.back, WinColor.GREY, True),
            }
        return dict()

    def write(self, text):
        if self.strip or self.convert:
            self.write_and_convert(text)
        else:
            self.wrapped.write(text)
            self.wrapped.flush()
        if self.autoreset:
            self.reset_all()


    def reset_all(self):
        if self.convert:
            self.call_win32('m', (0,))
        elif not self.strip and not is_stream_closed(self.wrapped):
            self.wrapped.write(Style.RESET_ALL)


    def write_and_convert(self, text):
        '''
        Write the given text to our wrapped stream, stripping any ANSI
        sequences from the text, and optionally converting them into win32
        calls.
        '''
        cursor = 0
        text = self.convert_osc(text)
        for match in self.ANSI_CSI_RE.finditer(text):
            start, end = match.span()
            self.write_plain_text(text, cursor, start)
            self.convert_ansi(*match.groups())
            cursor = end
        self.write_plain_text(text, cursor, len(text))


    def write_plain_text(self, text, start, end):
        if start < end:
            self.wrapped.write(text[start:end])
            self.wrapped.flush()


    def convert_ansi(self, paramstring, command):
        if self.convert:
            params = self.extract_params(command, paramstring)
            self.call_win32(command, params)


    def extract_params(self, command, paramstring):
        if command in 'Hf':
            params = tuple(int(p) if len(p) != 0 else 1 for p in paramstring.split(';'))
            while len(params) < 2:
                # defaults:
                params = params + (1,)
        else:
            params = tuple(int(p) for p in paramstring.split(';') if len(p) != 0)
            if len(params) == 0:
                # defaults:
                if command in 'JKm':
                    params = (0,)
                elif command in 'ABCD':
                    params = (1,)

        return params


    def call_win32(self, command, params):
        if command == 'm':
            for param in params:
                if param in self.win32_calls:
                    func_args = self.win32_calls[param]
                    func = func_args[0]
                    args = func_args[1:]
                    kwargs = dict(on_stderr=self.on_stderr)
                    func(*args, **kwargs)
        elif command in 'J':
            winterm.erase_screen(params[0], on_stderr=self.on_stderr)
        elif command in 'K':
            winterm.erase_line(params[0], on_stderr=self.on_stderr)
        elif command in 'Hf':     # cursor position - absolute
            winterm.set_cursor_position(params, on_stderr=self.on_stderr)
        elif command in 'ABCD':   # cursor position - relative
            n = params[0]
            # A - up, B - down, C - forward, D - back
            x, y = {'A': (0, -n), 'B': (0, n), 'C': (n, 0), 'D': (-n, 0)}[command]
            winterm.cursor_adjust(x, y, on_stderr=self.on_stderr)


    def convert_osc(self, text):
        for match in self.ANSI_OSC_RE.finditer(text):
            start, end = match.span()
            text = text[:start] + text[end:]
            paramstring, command = match.groups()
            if command in '\x07':       # \x07 = BEL
                params = paramstring.split(";")
                # 0 - change title and icon (we will only change title)
                # 1 - change icon (we don't support this)
                # 2 - change title
                if params[0] in '02':
                    winterm.set_title(params[1])
        return text
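
For completeness, a sketch of driving AnsiToWin32 directly instead of through
colorama.init(); an illustration only, assuming the top-level colorama package, with the
wrapped stream and sample text chosen arbitrarily.

import sys

from colorama import AnsiToWin32

# Wrap stderr explicitly. wrapper.stream is the StreamWrapper proxy whose write()
# strips or converts ANSI sequences according to the decisions made in __init__ above.
wrapper = AnsiToWin32(sys.stderr, autoreset=True)
stream = wrapper.stream if wrapper.should_wrap() else sys.stderr
stream.write('\033[31mred text\033[0m\n')
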
site-packages/pip/_vendor/colorama/ansi.py000064400000004734147511334610014626 0ustar00# Copyright Jonathan Hartley 2013. BSD 3-Clause license, see LICENSE file.
'''
This module generates ANSI character codes for printing colors to terminals.
See: http://en.wikipedia.org/wiki/ANSI_escape_code
'''

CSI = '\033['
OSC = '\033]'
BEL = '\007'


def code_to_chars(code):
    return CSI + str(code) + 'm'

def set_title(title):
    return OSC + '2;' + title + BEL

def clear_screen(mode=2):
    return CSI + str(mode) + 'J'

def clear_line(mode=2):
    return CSI + str(mode) + 'K'


class AnsiCodes(object):
    def __init__(self):
        # the subclasses declare class attributes which are numbers.
        # Upon instantiation we define instance attributes, which are the same
        # as the class attributes but wrapped with the ANSI escape sequence
        for name in dir(self):
            if not name.startswith('_'):
                value = getattr(self, name)
                setattr(self, name, code_to_chars(value))


class AnsiCursor(object):
    def UP(self, n=1):
        return CSI + str(n) + 'A'
    def DOWN(self, n=1):
        return CSI + str(n) + 'B'
    def FORWARD(self, n=1):
        return CSI + str(n) + 'C'
    def BACK(self, n=1):
        return CSI + str(n) + 'D'
    def POS(self, x=1, y=1):
        return CSI + str(y) + ';' + str(x) + 'H'


class AnsiFore(AnsiCodes):
    BLACK           = 30
    RED             = 31
    GREEN           = 32
    YELLOW          = 33
    BLUE            = 34
    MAGENTA         = 35
    CYAN            = 36
    WHITE           = 37
    RESET           = 39

    # These are fairly well supported, but not part of the standard.
    LIGHTBLACK_EX   = 90
    LIGHTRED_EX     = 91
    LIGHTGREEN_EX   = 92
    LIGHTYELLOW_EX  = 93
    LIGHTBLUE_EX    = 94
    LIGHTMAGENTA_EX = 95
    LIGHTCYAN_EX    = 96
    LIGHTWHITE_EX   = 97


class AnsiBack(AnsiCodes):
    BLACK           = 40
    RED             = 41
    GREEN           = 42
    YELLOW          = 43
    BLUE            = 44
    MAGENTA         = 45
    CYAN            = 46
    WHITE           = 47
    RESET           = 49

    # These are fairly well supported, but not part of the standard.
    LIGHTBLACK_EX   = 100
    LIGHTRED_EX     = 101
    LIGHTGREEN_EX   = 102
    LIGHTYELLOW_EX  = 103
    LIGHTBLUE_EX    = 104
    LIGHTMAGENTA_EX = 105
    LIGHTCYAN_EX    = 106
    LIGHTWHITE_EX   = 107


class AnsiStyle(AnsiCodes):
    BRIGHT    = 1
    DIM       = 2
    NORMAL    = 22
    RESET_ALL = 0

Fore   = AnsiFore()
Back   = AnsiBack()
Style  = AnsiStyle()
Cursor = AnsiCursor()
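
Once instantiated, the four objects above expose plain string attributes, each already a
complete escape sequence, so they can simply be concatenated into output. A brief
illustrative sketch, assuming the top-level colorama package; the messages are arbitrary.

from colorama import Fore, Back, Style, Cursor

# Fore.RED is the string '\033[31m', Back.WHITE is '\033[47m', and so on.
print(Fore.RED + Back.WHITE + 'alert' + Style.RESET_ALL)
print(Style.BRIGHT + 'emphasised' + Style.NORMAL + ' back to normal')

# Cursor helpers are methods because they take a count or coordinates.
print(Cursor.UP(2) + Cursor.FORWARD(10) + 'moved', end='')
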
site-packages/pip/_vendor/colorama/__pycache__/__init__.cpython-36.opt-1.pyc
site-packages/pip/_vendor/colorama/__pycache__/win32.cpython-36.opt-1.pyc
site-packages/pip/_vendor/colorama/__pycache__/win32.cpython-36.pyc
site-packages/pip/_vendor/colorama/__pycache__/ansitowin32.cpython-36.opt-1.pyc
site-packages/pip/_vendor/colorama/__pycache__/ansi.cpython-36.pyc
site-packages/pip/_vendor/colorama/__pycache__/__init__.cpython-36.pyc
site-packages/pip/_vendor/colorama/__pycache__/initialise.cpython-36.pyc
site-packages/pip/_vendor/colorama/__pycache__/ansi.cpython-36.opt-1.pyc
site-packages/pip/_vendor/colorama/__pycache__/winterm.cpython-36.pyc
site-packages/pip/_vendor/colorama/__pycache__/initialise.cpython-36.opt-1.pyc
site-packages/pip/_vendor/colorama/__pycache__/ansitowin32.cpython-36.pyc
site-packages/pip/_vendor/colorama/__pycache__/winterm.cpython-36.opt-1.pyc
    [compiled CPython 3.6 bytecode for the colorama modules whose sources appear elsewhere
    in this archive; the .pyc entries carry no information beyond those sources.]

site-packages/pip/_vendor/colorama/__init__.py
# Copyright Jonathan Hartley 2013. BSD 3-Clause license, see LICENSE file.
from .initialise import init, deinit, reinit, colorama_text
from .ansi import Fore, Back, Style, Cursor
from .ansitowin32 import AnsiToWin32

__version__ = '0.3.7'

site-packages/pip/_vendor/colorama/win32.py000064400000012365147511334610014635 0ustar00# Copyright Jonathan Hartley 2013. BSD 3-Clause license, see LICENSE file.

# from winbase.h
STDOUT = -11
STDERR = -12

try:
    import ctypes
    from ctypes import LibraryLoader
    windll = LibraryLoader(ctypes.WinDLL)
    from ctypes import wintypes
except (AttributeError, ImportError):
    windll = None
    SetConsoleTextAttribute = lambda *_: None
    winapi_test = lambda *_: None
else:
    from ctypes import byref, Structure, c_char, POINTER

    COORD = wintypes._COORD

    class CONSOLE_SCREEN_BUFFER_INFO(Structure):
        """struct in wincon.h."""
        _fields_ = [
            ("dwSize", COORD),
            ("dwCursorPosition", COORD),
            ("wAttributes", wintypes.WORD),
            ("srWindow", wintypes.SMALL_RECT),
            ("dwMaximumWindowSize", COORD),
        ]
        def __str__(self):
            return '(%d,%d,%d,%d,%d,%d,%d,%d,%d,%d,%d)' % (
                self.dwSize.Y, self.dwSize.X
                , self.dwCursorPosition.Y, self.dwCursorPosition.X
                , self.wAttributes
                , self.srWindow.Top, self.srWindow.Left, self.srWindow.Bottom, self.srWindow.Right
                , self.dwMaximumWindowSize.Y, self.dwMaximumWindowSize.X
            )

    _GetStdHandle = windll.kernel32.GetStdHandle
    _GetStdHandle.argtypes = [
        wintypes.DWORD,
    ]
    _GetStdHandle.restype = wintypes.HANDLE

    _GetConsoleScreenBufferInfo = windll.kernel32.GetConsoleScreenBufferInfo
    _GetConsoleScreenBufferInfo.argtypes = [
        wintypes.HANDLE,
        POINTER(CONSOLE_SCREEN_BUFFER_INFO),
    ]
    _GetConsoleScreenBufferInfo.restype = wintypes.BOOL

    _SetConsoleTextAttribute = windll.kernel32.SetConsoleTextAttribute
    _SetConsoleTextAttribute.argtypes = [
        wintypes.HANDLE,
        wintypes.WORD,
    ]
    _SetConsoleTextAttribute.restype = wintypes.BOOL

    _SetConsoleCursorPosition = windll.kernel32.SetConsoleCursorPosition
    _SetConsoleCursorPosition.argtypes = [
        wintypes.HANDLE,
        COORD,
    ]
    _SetConsoleCursorPosition.restype = wintypes.BOOL

    _FillConsoleOutputCharacterA = windll.kernel32.FillConsoleOutputCharacterA
    _FillConsoleOutputCharacterA.argtypes = [
        wintypes.HANDLE,
        c_char,
        wintypes.DWORD,
        COORD,
        POINTER(wintypes.DWORD),
    ]
    _FillConsoleOutputCharacterA.restype = wintypes.BOOL

    _FillConsoleOutputAttribute = windll.kernel32.FillConsoleOutputAttribute
    _FillConsoleOutputAttribute.argtypes = [
        wintypes.HANDLE,
        wintypes.WORD,
        wintypes.DWORD,
        COORD,
        POINTER(wintypes.DWORD),
    ]
    _FillConsoleOutputAttribute.restype = wintypes.BOOL

    _SetConsoleTitleW = windll.kernel32.SetConsoleTitleA
    _SetConsoleTitleW.argtypes = [
        wintypes.LPCSTR
    ]
    _SetConsoleTitleW.restype = wintypes.BOOL

    handles = {
        STDOUT: _GetStdHandle(STDOUT),
        STDERR: _GetStdHandle(STDERR),
    }

    def winapi_test():
        handle = handles[STDOUT]
        csbi = CONSOLE_SCREEN_BUFFER_INFO()
        success = _GetConsoleScreenBufferInfo(
            handle, byref(csbi))
        return bool(success)

    def GetConsoleScreenBufferInfo(stream_id=STDOUT):
        handle = handles[stream_id]
        csbi = CONSOLE_SCREEN_BUFFER_INFO()
        success = _GetConsoleScreenBufferInfo(
            handle, byref(csbi))
        return csbi

    def SetConsoleTextAttribute(stream_id, attrs):
        handle = handles[stream_id]
        return _SetConsoleTextAttribute(handle, attrs)

    def SetConsoleCursorPosition(stream_id, position, adjust=True):
        position = COORD(*position)
        # If the position is out of range, do nothing.
        if position.Y <= 0 or position.X <= 0:
            return
        # Adjust for Windows' SetConsoleCursorPosition:
        #    1. being 0-based, while ANSI is 1-based.
        #    2. expecting (x,y), while ANSI uses (y,x).
        adjusted_position = COORD(position.Y - 1, position.X - 1)
        if adjust:
            # Adjust for viewport's scroll position
            sr = GetConsoleScreenBufferInfo(STDOUT).srWindow
            adjusted_position.Y += sr.Top
            adjusted_position.X += sr.Left
        # Resume normal processing
        handle = handles[stream_id]
        return _SetConsoleCursorPosition(handle, adjusted_position)

    def FillConsoleOutputCharacter(stream_id, char, length, start):
        handle = handles[stream_id]
        char = c_char(char.encode())
        length = wintypes.DWORD(length)
        num_written = wintypes.DWORD(0)
        # Note that this is hard-coded for ANSI (vs wide) bytes.
        success = _FillConsoleOutputCharacterA(
            handle, char, length, start, byref(num_written))
        return num_written.value

    def FillConsoleOutputAttribute(stream_id, attr, length, start):
        ''' FillConsoleOutputAttribute( hConsole, csbi.wAttributes, dwConSize, coordScreen, &cCharsWritten )'''
        handle = handles[stream_id]
        attribute = wintypes.WORD(attr)
        length = wintypes.DWORD(length)
        num_written = wintypes.DWORD(0)
        # Note that this is hard-coded for ANSI (vs wide) bytes.
        return _FillConsoleOutputAttribute(
            handle, attribute, length, start, byref(num_written))

    def SetConsoleTitle(title):
        return _SetConsoleTitleW(title)
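
The wrappers above only do real work on Windows; elsewhere windll is None and the module
degrades to no-op stubs. A hedged sketch of calling them directly follows; the guard and
the 0x0C attribute (red combined with intensity) are illustrative choices, not how pip or
colorama themselves use this module.

from colorama import win32

if win32.windll is not None and win32.winapi_test():
    # Snapshot the current attributes, switch stdout to bright red text, then restore.
    saved = win32.GetConsoleScreenBufferInfo(win32.STDOUT).wAttributes
    win32.SetConsoleTextAttribute(win32.STDOUT, 0x0C)
    print('bright red on a classic Windows console')
    win32.SetConsoleTextAttribute(win32.STDOUT, saved)
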
site-packages/pip/_vendor/colorama/winterm.py000064400000014222147511334610015352 0ustar00# Copyright Jonathan Hartley 2013. BSD 3-Clause license, see LICENSE file.
from . import win32


# from wincon.h
class WinColor(object):
    BLACK   = 0
    BLUE    = 1
    GREEN   = 2
    CYAN    = 3
    RED     = 4
    MAGENTA = 5
    YELLOW  = 6
    GREY    = 7

# from wincon.h
class WinStyle(object):
    NORMAL              = 0x00 # dim text, dim background
    BRIGHT              = 0x08 # bright text, dim background
    BRIGHT_BACKGROUND   = 0x80 # dim text, bright background

class WinTerm(object):

    def __init__(self):
        self._default = win32.GetConsoleScreenBufferInfo(win32.STDOUT).wAttributes
        self.set_attrs(self._default)
        self._default_fore = self._fore
        self._default_back = self._back
        self._default_style = self._style
        # In order to emulate LIGHT_EX in windows, we borrow the BRIGHT style.
        # So that LIGHT_EX colors and BRIGHT style do not clobber each other,
        # we track them separately, since LIGHT_EX is overwritten by Fore/Back
        # and BRIGHT is overwritten by Style codes.
        self._light = 0

    def get_attrs(self):
        return self._fore + self._back * 16 + (self._style | self._light)

    def set_attrs(self, value):
        self._fore = value & 7
        self._back = (value >> 4) & 7
        self._style = value & (WinStyle.BRIGHT | WinStyle.BRIGHT_BACKGROUND)

    def reset_all(self, on_stderr=None):
        self.set_attrs(self._default)
        self.set_console(attrs=self._default)

    def fore(self, fore=None, light=False, on_stderr=False):
        if fore is None:
            fore = self._default_fore
        self._fore = fore
        # Emulate LIGHT_EX with BRIGHT Style
        if light:
            self._light |= WinStyle.BRIGHT
        else:
            self._light &= ~WinStyle.BRIGHT
        self.set_console(on_stderr=on_stderr)

    def back(self, back=None, light=False, on_stderr=False):
        if back is None:
            back = self._default_back
        self._back = back
        # Emulate LIGHT_EX with BRIGHT_BACKGROUND Style
        if light:
            self._light |= WinStyle.BRIGHT_BACKGROUND
        else:
            self._light &= ~WinStyle.BRIGHT_BACKGROUND
        self.set_console(on_stderr=on_stderr)

    def style(self, style=None, on_stderr=False):
        if style is None:
            style = self._default_style
        self._style = style
        self.set_console(on_stderr=on_stderr)

    def set_console(self, attrs=None, on_stderr=False):
        if attrs is None:
            attrs = self.get_attrs()
        handle = win32.STDOUT
        if on_stderr:
            handle = win32.STDERR
        win32.SetConsoleTextAttribute(handle, attrs)

    def get_position(self, handle):
        position = win32.GetConsoleScreenBufferInfo(handle).dwCursorPosition
        # Because Windows coordinates are 0-based,
        # and win32.SetConsoleCursorPosition expects 1-based.
        position.X += 1
        position.Y += 1
        return position

    def set_cursor_position(self, position=None, on_stderr=False):
        if position is None:
            # I'm not currently tracking the position, so there is no default.
            # position = self.get_position()
            return
        handle = win32.STDOUT
        if on_stderr:
            handle = win32.STDERR
        win32.SetConsoleCursorPosition(handle, position)

    def cursor_adjust(self, x, y, on_stderr=False):
        handle = win32.STDOUT
        if on_stderr:
            handle = win32.STDERR
        position = self.get_position(handle)
        adjusted_position = (position.Y + y, position.X + x)
        win32.SetConsoleCursorPosition(handle, adjusted_position, adjust=False)

    def erase_screen(self, mode=0, on_stderr=False):
        # 0 should clear from the cursor to the end of the screen.
        # 1 should clear from the cursor to the beginning of the screen.
        # 2 should clear the entire screen, and move cursor to (1,1)
        handle = win32.STDOUT
        if on_stderr:
            handle = win32.STDERR
        csbi = win32.GetConsoleScreenBufferInfo(handle)
        # get the number of character cells in the current buffer
        cells_in_screen = csbi.dwSize.X * csbi.dwSize.Y
        # get number of character cells before current cursor position
        cells_before_cursor = csbi.dwSize.X * csbi.dwCursorPosition.Y + csbi.dwCursorPosition.X
        if mode == 0:
            from_coord = csbi.dwCursorPosition
            cells_to_erase = cells_in_screen - cells_before_cursor
        if mode == 1:
            from_coord = win32.COORD(0, 0)
            cells_to_erase = cells_before_cursor
        elif mode == 2:
            from_coord = win32.COORD(0, 0)
            cells_to_erase = cells_in_screen
        # fill the entire screen with blanks
        win32.FillConsoleOutputCharacter(handle, ' ', cells_to_erase, from_coord)
        # now set the buffer's attributes accordingly
        win32.FillConsoleOutputAttribute(handle, self.get_attrs(), cells_to_erase, from_coord)
        if mode == 2:
            # put the cursor where needed
            win32.SetConsoleCursorPosition(handle, (1, 1))

    def erase_line(self, mode=0, on_stderr=False):
        # 0 should clear from the cursor to the end of the line.
        # 1 should clear from the cursor to the beginning of the line.
        # 2 should clear the entire line.
        handle = win32.STDOUT
        if on_stderr:
            handle = win32.STDERR
        csbi = win32.GetConsoleScreenBufferInfo(handle)
        if mode == 0:
            from_coord = csbi.dwCursorPosition
            cells_to_erase = csbi.dwSize.X - csbi.dwCursorPosition.X
        if mode == 1:
            from_coord = win32.COORD(0, csbi.dwCursorPosition.Y)
            cells_to_erase = csbi.dwCursorPosition.X
        elif mode == 2:
            from_coord = win32.COORD(0, csbi.dwCursorPosition.Y)
            cells_to_erase = csbi.dwSize.X
        # fill the entire screen with blanks
        win32.FillConsoleOutputCharacter(handle, ' ', cells_to_erase, from_coord)
        # now set the buffer's attributes accordingly
        win32.FillConsoleOutputAttribute(handle, self.get_attrs(), cells_to_erase, from_coord)

    def set_title(self, title):
        win32.SetConsoleTitle(title)
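
WinTerm is the stateful layer that AnsiToWin32 drives on Windows, and it can be used on
its own. A short Windows-only sketch for illustration; the colour and style values are
arbitrary, and the guard is similar to the one AnsiToWin32 applies.

from colorama import win32
from colorama.winterm import WinTerm, WinColor, WinStyle

if win32.windll is not None and win32.winapi_test():
    term = WinTerm()                        # snapshots the console's current attributes
    term.fore(WinColor.YELLOW, light=True)
    term.style(WinStyle.BRIGHT)
    print('bright yellow via the Win32 console API')
    term.reset_all()                        # restore the attributes captured above
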
site-packages/pip/_vendor/webencodings/__pycache__/tests.cpython-36.pyc
site-packages/pip/_vendor/webencodings/__pycache__/mklabels.cpython-36.opt-1.pyc
site-packages/pip/_vendor/webencodings/__pycache__/tests.cpython-36.opt-1.pyc
site-packages/pip/_vendor/webencodings/__pycache__/labels.cpython-36.pyc
site-packages/pip/_vendor/webencodings/__pycache__/labels.cpython-36.opt-1.pyc
    [compiled CPython 3.6 bytecode for pip's vendored webencodings package (copyright 2012
    Simon Sapin, BSD license). The readable fragments are the module docstrings:
    webencodings.tests is "A basic test suite for Encoding", webencodings.mklabels
    "Regenerate the webencodings.labels module", and webencodings.labels "Map encoding
    labels to their name".]
dddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddd d!d!d!d!d!d"d"d"d#d#d$d$d$d$d$d$d$d%d%d%d%d%d%d%d%d%d%d&d&d'd(d(d)d*��Zd+S),z�

    webencodings.labels
    ~~~~~~~~~~~~~~~~~~~

    Map encoding labels to their name.

    :copyright: Copyright 2012 by Simon Sapin
    :license: BSD, see LICENSE for details.

zutf-8�ibm866z
iso-8859-2z
iso-8859-3z
iso-8859-4z
iso-8859-5z
iso-8859-6z
iso-8859-7z
iso-8859-8ziso-8859-8-iziso-8859-10ziso-8859-13ziso-8859-14ziso-8859-15ziso-8859-16zkoi8-rzkoi8-u�	macintoshzwindows-874zwindows-1250zwindows-1251zwindows-1252zwindows-1253zwindows-1254zwindows-1255zwindows-1256zwindows-1257zwindows-1258zx-mac-cyrillic�gbk�gb18030z
hz-gb-2312�big5zeuc-jpziso-2022-jp�	shift_jiszeuc-krziso-2022-krzutf-16bezutf-16lezx-user-defined)�zunicode-1-1-utf-8zutf-8�utf8�866�cp866�csibm866r�csisolatin2z
iso-8859-2z
iso-ir-101z	iso8859-2Ziso88592z
iso_8859-2ziso_8859-2:1987�l2�latin2�csisolatin3z
iso-8859-3z
iso-ir-109z	iso8859-3Ziso88593z
iso_8859-3ziso_8859-3:1988�l3�latin3�csisolatin4z
iso-8859-4z
iso-ir-110z	iso8859-4Ziso88594z
iso_8859-4ziso_8859-4:1988�l4�latin4�csisolatincyrillic�cyrillicz
iso-8859-5z
iso-ir-144z	iso8859-5Ziso88595z
iso_8859-5ziso_8859-5:1988�arabiczasmo-708Zcsiso88596eZcsiso88596i�csisolatinarabiczecma-114z
iso-8859-6ziso-8859-6-eziso-8859-6-iz
iso-ir-127z	iso8859-6Ziso88596z
iso_8859-6ziso_8859-6:1987�csisolatingreekzecma-118�elot_928�greek�greek8z
iso-8859-7z
iso-ir-126z	iso8859-7Ziso88597z
iso_8859-7ziso_8859-7:1987Zsun_eu_greekZcsiso88598e�csisolatinhebrew�hebrewz
iso-8859-8ziso-8859-8-ez
iso-ir-138z	iso8859-8Ziso88598z
iso_8859-8ziso_8859-8:1988ZvisualZcsiso88598iziso-8859-8-iZlogical�csisolatin6ziso-8859-10z
iso-ir-157z
iso8859-10Z	iso885910�l6�latin6ziso-8859-13z
iso8859-13Z	iso885913ziso-8859-14z
iso8859-14Z	iso885914Zcsisolatin9ziso-8859-15z
iso8859-15Z	iso885915ziso_8859-15�l9ziso-8859-16�cskoi8rZkoiZkoi8zkoi8-r�koi8_rzkoi8-uZcsmacintoshZmacrzx-mac-romanzdos-874ziso-8859-11z
iso8859-11Z	iso885911ztis-620zwindows-874�cp1250zwindows-1250zx-cp1250�cp1251zwindows-1251zx-cp1251zansi_x3.4-1968�ascii�cp1252�cp819�csisolatin1�ibm819z
iso-8859-1z
iso-ir-100z	iso8859-1Ziso88591z
iso_8859-1ziso_8859-1:1987�l1�latin1zus-asciizwindows-1252zx-cp1252�cp1253zwindows-1253zx-cp1253�cp1254�csisolatin5z
iso-8859-9z
iso-ir-148z	iso8859-9Ziso88599z
iso_8859-9ziso_8859-9:1989�l5�latin5zwindows-1254zx-cp1254�cp1255zwindows-1255zx-cp1255�cp1256zwindows-1256zx-cp1256�cp1257zwindows-1257zx-cp1257�cp1258zwindows-1258zx-cp1258zx-mac-cyrilliczx-mac-ukrainian�chineseZcsgb2312�csiso58gb231280�gb2312Zgb_2312z
gb_2312-80rz	iso-ir-58zx-gbkrz
hz-gb-2312rz
big5-hkscszcn-big5�csbig5zx-x-big5Zcseucpkdfmtjapanesezeuc-jpzx-euc-jp�csiso2022jpziso-2022-jp�
csshiftjis�ms_kanjiz	shift-jisr�sjiszwindows-31jzx-sjisZcseuckrZ
csksc56011987zeuc-krz
iso-ir-149�koreanzks_c_5601-1987zks_c_5601-1989�ksc5601Zksc_5601zwindows-949�csiso2022krziso-2022-krzutf-16bezutf-16zutf-16lezx-user-definedN)�__doc__ZLABELS�rBrB�/usr/lib/python3.6/labels.py�<module>s�site-packages/pip/_vendor/webencodings/__pycache__/__init__.cpython-36.pyc000064400000022570147511334610022567 0ustar003

���eP)�@s�dZddlmZddlZddlmZdZddd	d
d�ZiZdd
�Z	dd�Z
dd�ZGdd�de�Z
e
d�Ze
d�Ze
d�Zd+dd�Zdd�Zedfdd�Zd,dd �Zd!d"�Zedfd#d$�Zd%d&�ZGd'd(�d(e�ZGd)d*�d*e�ZdS)-a

    webencodings
    ~~~~~~~~~~~~

    This is a Python implementation of the `WHATWG Encoding standard
    <http://encoding.spec.whatwg.org/>`. See README for details.

    :copyright: Copyright 2012 by Simon Sapin
    :license: BSD, see LICENSE for details.

�)�unicode_literalsN�)�LABELSz0.5z
iso-8859-8zmac-cyrillicz	mac-romanZcp874)ziso-8859-8-izx-mac-cyrillic�	macintoshzwindows-874cCs|jd�j�jd�S)a9Transform (only) ASCII letters to lower case: A-Z is mapped to a-z.

    :param string: An Unicode string.
    :returns: A new Unicode string.

    This is used for `ASCII case-insensitive
    <http://encoding.spec.whatwg.org/#ascii-case-insensitive>`_
    matching of encoding labels.
    The same matching is also used, among other things,
    for `CSS keywords <http://dev.w3.org/csswg/css-values/#keywords>`_.

    This is different from the :meth:`~py:str.lower` method of Unicode strings
    which also affect non-ASCII characters,
    sometimes mapping them into the ASCII range:

        >>> keyword = u'Bac\N{KELVIN SIGN}ground'
        >>> assert keyword.lower() == u'background'
        >>> assert ascii_lower(keyword) != keyword.lower()
        >>> assert ascii_lower(keyword) == u'bac\N{KELVIN SIGN}ground'

    �utf8)�encode�lower�decode)�string�r�/usr/lib/python3.6/__init__.py�ascii_lower#sr
cCsxt|jd��}tj|�}|dkr$dStj|�}|dkrt|dkrLddlm}ntj||�}tj	|�}t
||�}|t|<|S)u<
    Look for an encoding by its label.
    This is the spec’s `get an encoding
    <http://encoding.spec.whatwg.org/#concept-encoding-get>`_ algorithm.
    Supported labels are listed there.

    :param label: A string.
    :returns:
        An :class:`Encoding` object, or :obj:`None` for an unknown label.

    z	

 Nzx-user-definedr)�
codec_info)r
�stripr�get�CACHEZx_user_definedr�PYTHON_NAMES�codecs�lookup�Encoding)Zlabel�name�encodingrZpython_namerrrr=s




rcCs.t|d�r|St|�}|dkr*td|��|S)z�
    Accept either an encoding object or label.

    :param encoding: An :class:`Encoding` object or a label string.
    :returns: An :class:`Encoding` object.
    :raises: :exc:`~exceptions.LookupError` for an unknown label.

    rNzUnknown encoding label: %r)�hasattrr�LookupError)Zencoding_or_labelrrrr�
_get_encoding[s	
rc@s eZdZdZdd�Zdd�ZdS)raOReresents a character encoding such as UTF-8,
    that can be used for decoding or encoding.

    .. attribute:: name

        Canonical name of the encoding

    .. attribute:: codec_info

        The actual implementation of the encoding,
        a stdlib :class:`~codecs.CodecInfo` object.
        See :func:`codecs.register`.

    cCs||_||_dS)N)rr)�selfrrrrr�__init__|szEncoding.__init__cCs
d|jS)Nz
<Encoding %s>)r)rrrr�__repr__�szEncoding.__repr__N)�__name__�
__module__�__qualname__�__doc__rrrrrrrmsrzutf-8zutf-16lezutf-16be�replacecCs2t|�}t|�\}}|p|}|jj||�d|fS)a�
    Decode a single string.

    :param input: A byte string
    :param fallback_encoding:
        An :class:`Encoding` object or a label string.
        The encoding to use if :obj:`input` does note have a BOM.
    :param errors: Type of error handling. See :func:`codecs.register`.
    :raises: :exc:`~exceptions.LookupError` for an unknown encoding label.
    :return:
        A ``(output, encoding)`` tuple of an Unicode string
        and an :obj:`Encoding`.

    r)r�_detect_bomrr	)�input�fallback_encoding�errorsZbom_encodingrrrrr	�sr	cCsV|jd�rt|dd�fS|jd�r4t|dd�fS|jd�rNt|dd�fSd|fS)zBReturn (bom_encoding, input), with any BOM removed from the input.s���Ns��s�)�
startswith�_UTF16LE�_UTF16BE�UTF8)r$rrrr#�s


r#�strictcCst|�jj||�dS)a;
    Encode a single string.

    :param input: An Unicode string.
    :param encoding: An :class:`Encoding` object or a label string.
    :param errors: Type of error handling. See :func:`codecs.register`.
    :raises: :exc:`~exceptions.LookupError` for an unknown encoding label.
    :return: A byte string.

    r)rrr)r$rr&rrrr�srcCs$t||�}t||�}t|�}||fS)a�
    "Pull"-based decoder.

    :param input:
        An iterable of byte strings.

        The input is first consumed just enough to determine the encoding
        based on the precense of a BOM,
        then consumed on demand when the return value is.
    :param fallback_encoding:
        An :class:`Encoding` object or a label string.
        The encoding to use if :obj:`input` does note have a BOM.
    :param errors: Type of error handling. See :func:`codecs.register`.
    :raises: :exc:`~exceptions.LookupError` for an unknown encoding label.
    :returns:
        An ``(output, encoding)`` tuple.
        :obj:`output` is an iterable of Unicode strings,
        :obj:`encoding` is the :obj:`Encoding` that is being used.

    )�IncrementalDecoder�_iter_decode_generator�next)r$r%r&�decoder�	generatorrrrr�iter_decode�s

r3ccs�|j}t|�}xf|D].}||�}|r|jdk	s2t�|jV|VPqW|ddd�}|jdk	s`t�|jV|rr|VdSx|D]}||�}|r||Vq|W|ddd�}|r�|VdS)zqReturn a generator that first yields the :obj:`Encoding`,
    then yields output chukns as Unicode strings.

    N�T)�final)r	�iterr�AssertionError)r$r1r	�chunck�outputrrrr/�s,


r/cCst||�j}t||�S)uY
    “Pull”-based encoder.

    :param input: An iterable of Unicode strings.
    :param encoding: An :class:`Encoding` object or a label string.
    :param errors: Type of error handling. See :func:`codecs.register`.
    :raises: :exc:`~exceptions.LookupError` for an unknown encoding label.
    :returns: An iterable of byte strings.

    )�IncrementalEncoderr�_iter_encode_generator)r$rr&rrrr�iter_encode�sr<ccs:x|D]}||�}|r|VqW|ddd�}|r6|VdS)N�T)r5r)r$rr8r9rrrr;s

r;c@s$eZdZdZd	dd�Zd
dd�ZdS)r.uO
    “Push”-based decoder.

    :param fallback_encoding:
        An :class:`Encoding` object or a label string.
        The encoding to use if :obj:`input` does note have a BOM.
    :param errors: Type of error handling. See :func:`codecs.register`.
    :raises: :exc:`~exceptions.LookupError` for an unknown encoding label.

    r"cCs&t|�|_||_d|_d|_d|_dS)Nr4)r�_fallback_encoding�_errors�_buffer�_decoderr)rr%r&rrrrs

zIncrementalDecoder.__init__FcCs~|j}|dk	r|||�S|j|}t|�\}}|dkrXt|�dkrR|rR||_dS|j}|jj|j�j}||_||_	|||�S)z�Decode one chunk of the input.

        :param input: A byte string.
        :param final:
            Indicate that no more input is available.
            Must be :obj:`True` if this is the last call.
        :returns: An Unicode string.

        Nr(r=)
rAr@r#�lenr>r�incrementaldecoderr?r	r)rr$r5r1rrrrr	's


zIncrementalDecoder.decodeN)r")F)rrr r!rr	rrrrr.s

r.c@seZdZdZedfdd�ZdS)r:u�
    “Push”-based encoder.

    :param encoding: An :class:`Encoding` object or a label string.
    :param errors: Type of error handling. See :func:`codecs.register`.
    :raises: :exc:`~exceptions.LookupError` for an unknown encoding label.

    .. method:: encode(input, final=False)

        :param input: An Unicode string.
        :param final:
            Indicate that no more input is available.
            Must be :obj:`True` if this is the last call.
        :returns: A byte string.

    r-cCst|�}|jj|�j|_dS)N)rr�incrementalencoderr)rrr&rrrrTszIncrementalEncoder.__init__N)rrr r!r,rrrrrr:Csr:)r")r")r!Z
__future__rrZlabelsr�VERSIONrrr
rr�objectrr,r*r+r	r#rr3r/r<r;r.r:rrrr�<module>
s2

 
3site-packages/pip/_vendor/webencodings/__pycache__/x_user_defined.cpython-36.opt-1.pyc000064400000005025147511334610024746 0ustar003

���e��	@s�dZddlmZddlZGdd�dej�ZGdd�dej�ZGdd	�d	ej�ZGd
d�deej�ZGdd
�d
eej�Zej	de�j
e�jeeeed�ZdZ
eje
�ZdS)z�

    webencodings.x_user_defined
    ~~~~~~~~~~~~~~~~~~~~~~~~~~~

    An implementation of the x-user-defined encoding.

    :copyright: Copyright 2012 by Simon Sapin
    :license: BSD, see LICENSE for details.

�)�unicode_literalsNc@s eZdZddd�Zddd�ZdS)	�Codec�strictcCstj||t�S)N)�codecs�charmap_encode�encoding_table)�self�input�errors�r�$/usr/lib/python3.6/x_user_defined.py�encodeszCodec.encodecCstj||t�S)N)r�charmap_decode�decoding_table)rr	r
rrr�decodeszCodec.decodeN)r)r)�__name__�
__module__�__qualname__r
rrrrrrs
rc@seZdZddd�ZdS)�IncrementalEncoderFcCstj||jt�dS)Nr)rrr
r)rr	�finalrrrr
szIncrementalEncoder.encodeN)F)rrrr
rrrrrsrc@seZdZddd�ZdS)�IncrementalDecoderFcCstj||jt�dS)Nr)rrr
r)rr	rrrrr$szIncrementalDecoder.decodeN)F)rrrrrrrrr#src@seZdZdS)�StreamWriterN)rrrrrrrr(src@seZdZdS)�StreamReaderN)rrrrrrrr,srzx-user-defined)�namer
r�incrementalencoder�incrementaldecoder�streamreader�streamwriteru	

 !"#$%&'()*+,-./0123456789:;<=>?@ABCDEFGHIJKLMNOPQRSTUVWXYZ[\]^_`abcdefghijklmnopqrstuvwxyz{|}~)�__doc__Z
__future__rrrrrrr�	CodecInfor
rZ
codec_infor�
charmap_buildrrrrr�<module>s&		site-packages/pip/_vendor/webencodings/__pycache__/__init__.cpython-36.opt-1.pyc000064400000022510147511334610023520 0ustar003

���eP)�@s�dZddlmZddlZddlmZdZddd	d
d�ZiZdd
�Z	dd�Z
dd�ZGdd�de�Z
e
d�Ze
d�Ze
d�Zd+dd�Zdd�Zedfdd�Zd,dd �Zd!d"�Zedfd#d$�Zd%d&�ZGd'd(�d(e�ZGd)d*�d*e�ZdS)-a

    webencodings
    ~~~~~~~~~~~~

    This is a Python implementation of the `WHATWG Encoding standard
    <http://encoding.spec.whatwg.org/>`. See README for details.

    :copyright: Copyright 2012 by Simon Sapin
    :license: BSD, see LICENSE for details.

�)�unicode_literalsN�)�LABELSz0.5z
iso-8859-8zmac-cyrillicz	mac-romanZcp874)ziso-8859-8-izx-mac-cyrillic�	macintoshzwindows-874cCs|jd�j�jd�S)a9Transform (only) ASCII letters to lower case: A-Z is mapped to a-z.

    :param string: An Unicode string.
    :returns: A new Unicode string.

    This is used for `ASCII case-insensitive
    <http://encoding.spec.whatwg.org/#ascii-case-insensitive>`_
    matching of encoding labels.
    The same matching is also used, among other things,
    for `CSS keywords <http://dev.w3.org/csswg/css-values/#keywords>`_.

    This is different from the :meth:`~py:str.lower` method of Unicode strings
    which also affect non-ASCII characters,
    sometimes mapping them into the ASCII range:

        >>> keyword = u'Bac\N{KELVIN SIGN}ground'
        >>> assert keyword.lower() == u'background'
        >>> assert ascii_lower(keyword) != keyword.lower()
        >>> assert ascii_lower(keyword) == u'bac\N{KELVIN SIGN}ground'

    �utf8)�encode�lower�decode)�string�r�/usr/lib/python3.6/__init__.py�ascii_lower#sr
cCsxt|jd��}tj|�}|dkr$dStj|�}|dkrt|dkrLddlm}ntj||�}tj	|�}t
||�}|t|<|S)u<
    Look for an encoding by its label.
    This is the spec’s `get an encoding
    <http://encoding.spec.whatwg.org/#concept-encoding-get>`_ algorithm.
    Supported labels are listed there.

    :param label: A string.
    :returns:
        An :class:`Encoding` object, or :obj:`None` for an unknown label.

    z	

 Nzx-user-definedr)�
codec_info)r
�stripr�get�CACHEZx_user_definedr�PYTHON_NAMES�codecs�lookup�Encoding)Zlabel�name�encodingrZpython_namerrrr=s




rcCs.t|d�r|St|�}|dkr*td|��|S)z�
    Accept either an encoding object or label.

    :param encoding: An :class:`Encoding` object or a label string.
    :returns: An :class:`Encoding` object.
    :raises: :exc:`~exceptions.LookupError` for an unknown label.

    rNzUnknown encoding label: %r)�hasattrr�LookupError)Zencoding_or_labelrrrr�
_get_encoding[s	
rc@s eZdZdZdd�Zdd�ZdS)raOReresents a character encoding such as UTF-8,
    that can be used for decoding or encoding.

    .. attribute:: name

        Canonical name of the encoding

    .. attribute:: codec_info

        The actual implementation of the encoding,
        a stdlib :class:`~codecs.CodecInfo` object.
        See :func:`codecs.register`.

    cCs||_||_dS)N)rr)�selfrrrrr�__init__|szEncoding.__init__cCs
d|jS)Nz
<Encoding %s>)r)rrrr�__repr__�szEncoding.__repr__N)�__name__�
__module__�__qualname__�__doc__rrrrrrrmsrzutf-8zutf-16lezutf-16be�replacecCs2t|�}t|�\}}|p|}|jj||�d|fS)a�
    Decode a single string.

    :param input: A byte string
    :param fallback_encoding:
        An :class:`Encoding` object or a label string.
        The encoding to use if :obj:`input` does note have a BOM.
    :param errors: Type of error handling. See :func:`codecs.register`.
    :raises: :exc:`~exceptions.LookupError` for an unknown encoding label.
    :return:
        A ``(output, encoding)`` tuple of an Unicode string
        and an :obj:`Encoding`.

    r)r�_detect_bomrr	)�input�fallback_encoding�errorsZbom_encodingrrrrr	�sr	cCsV|jd�rt|dd�fS|jd�r4t|dd�fS|jd�rNt|dd�fSd|fS)zBReturn (bom_encoding, input), with any BOM removed from the input.s���Ns��s�)�
startswith�_UTF16LE�_UTF16BE�UTF8)r$rrrr#�s


r#�strictcCst|�jj||�dS)a;
    Encode a single string.

    :param input: An Unicode string.
    :param encoding: An :class:`Encoding` object or a label string.
    :param errors: Type of error handling. See :func:`codecs.register`.
    :raises: :exc:`~exceptions.LookupError` for an unknown encoding label.
    :return: A byte string.

    r)rrr)r$rr&rrrr�srcCs$t||�}t||�}t|�}||fS)a�
    "Pull"-based decoder.

    :param input:
        An iterable of byte strings.

        The input is first consumed just enough to determine the encoding
        based on the precense of a BOM,
        then consumed on demand when the return value is.
    :param fallback_encoding:
        An :class:`Encoding` object or a label string.
        The encoding to use if :obj:`input` does note have a BOM.
    :param errors: Type of error handling. See :func:`codecs.register`.
    :raises: :exc:`~exceptions.LookupError` for an unknown encoding label.
    :returns:
        An ``(output, encoding)`` tuple.
        :obj:`output` is an iterable of Unicode strings,
        :obj:`encoding` is the :obj:`Encoding` that is being used.

    )�IncrementalDecoder�_iter_decode_generator�next)r$r%r&�decoder�	generatorrrrr�iter_decode�s

r3ccs�|j}t|�}xJ|D] }||�}|r|jV|VPqW|ddd�}|jV|rV|VdSx|D]}||�}|r`|Vq`W|ddd�}|r�|VdS)zqReturn a generator that first yields the :obj:`Encoding`,
    then yields output chukns as Unicode strings.

    �T)�finalN)r	�iterr)r$r1r	�chunck�outputrrrr/�s(


r/cCst||�j}t||�S)uY
    “Pull”-based encoder.

    :param input: An iterable of Unicode strings.
    :param encoding: An :class:`Encoding` object or a label string.
    :param errors: Type of error handling. See :func:`codecs.register`.
    :raises: :exc:`~exceptions.LookupError` for an unknown encoding label.
    :returns: An iterable of byte strings.

    )�IncrementalEncoderr�_iter_encode_generator)r$rr&rrrr�iter_encode�sr;ccs:x|D]}||�}|r|VqW|ddd�}|r6|VdS)N�T)r5r)r$rr7r8rrrr:s

r:c@s$eZdZdZd	dd�Zd
dd�ZdS)r.uO
    “Push”-based decoder.

    :param fallback_encoding:
        An :class:`Encoding` object or a label string.
        The encoding to use if :obj:`input` does note have a BOM.
    :param errors: Type of error handling. See :func:`codecs.register`.
    :raises: :exc:`~exceptions.LookupError` for an unknown encoding label.

    r"cCs&t|�|_||_d|_d|_d|_dS)Nr4)r�_fallback_encoding�_errors�_buffer�_decoderr)rr%r&rrrrs

zIncrementalDecoder.__init__FcCs~|j}|dk	r|||�S|j|}t|�\}}|dkrXt|�dkrR|rR||_dS|j}|jj|j�j}||_||_	|||�S)z�Decode one chunk of the input.

        :param input: A byte string.
        :param final:
            Indicate that no more input is available.
            Must be :obj:`True` if this is the last call.
        :returns: An Unicode string.

        Nr(r<)
r@r?r#�lenr=r�incrementaldecoderr>r	r)rr$r5r1rrrrr	's


zIncrementalDecoder.decodeN)r")F)rrr r!rr	rrrrr.s

r.c@seZdZdZedfdd�ZdS)r9u�
    “Push”-based encoder.

    :param encoding: An :class:`Encoding` object or a label string.
    :param errors: Type of error handling. See :func:`codecs.register`.
    :raises: :exc:`~exceptions.LookupError` for an unknown encoding label.

    .. method:: encode(input, final=False)

        :param input: An Unicode string.
        :param final:
            Indicate that no more input is available.
            Must be :obj:`True` if this is the last call.
        :returns: A byte string.

    r-cCst|�}|jj|�j|_dS)N)rr�incrementalencoderr)rrr&rrrrTszIncrementalEncoder.__init__N)rrr r!r,rrrrrr9Csr9)r")r")r!Z
__future__rrZlabelsr�VERSIONrrr
rr�objectrr,r*r+r	r#rr3r/r;r:r.r9rrrr�<module>
s2

 
3site-packages/pip/_vendor/webencodings/__pycache__/mklabels.cpython-36.pyc000064400000003444147511334610022621 0ustar003

���e�@sfdZddlZyddlmZWn ek
r<ddlmZYnXdd�Zdd�Zedkrbe	ed	��dS)
z�

    webencodings.mklabels
    ~~~~~~~~~~~~~~~~~~~~~

    Regenarate the webencodings.labels module.

    :copyright: Copyright 2012 by Simon Sapin
    :license: BSD, see LICENSE for details.

�N)�urlopencCs||j�kst�|S)N)�lower�AssertionError)�string�r�/usr/lib/python3.6/mklabels.py�assert_lowersrcsfdg}dd�tjt|�j�jd��D�}tdd�|D���|j�fdd�|D��|jd�d	j|�S)
Na"""

    webencodings.labels
    ~~~~~~~~~~~~~~~~~~~

    Map encoding labels to their name.

    :copyright: Copyright 2012 by Simon Sapin
    :license: BSD, see LICENSE for details.

"""

# XXX Do not edit!
# This file is automatically generated by mklabels.py

LABELS = {
cSsLg|]D}|dD]6}|dD](}tt|��jd�t|d�jd�f�qqqS)Z	encodings�labels�u�name)�reprr�lstrip)�.0�category�encoding�labelrrr�
<listcomp>-szgenerate.<locals>.<listcomp>�asciicss|]\}}t|�VqdS)N)�len)rrrrrr�	<genexpr>2szgenerate.<locals>.<genexpr>c3s,|]$\}}d|d�t|�|fVqdS)z    %s:%s %s,
� N)r)rrr)�max_lenrrr4s�}�)	�json�loadsr�read�decode�max�extend�append�join)Zurl�partsr	r)rr�generates


r#�__main__z.http://encoding.spec.whatwg.org/encodings.json)
�__doc__rZurllibr�ImportErrorZurllib.requestrr#�__name__�printrrrr�<module>s!site-packages/pip/_vendor/webencodings/__pycache__/x_user_defined.cpython-36.pyc000064400000005025147511334610024007 0ustar003

���e��	@s�dZddlmZddlZGdd�dej�ZGdd�dej�ZGdd	�d	ej�ZGd
d�deej�ZGdd
�d
eej�Zej	de�j
e�jeeeed�ZdZ
eje
�ZdS)z�

    webencodings.x_user_defined
    ~~~~~~~~~~~~~~~~~~~~~~~~~~~

    An implementation of the x-user-defined encoding.

    :copyright: Copyright 2012 by Simon Sapin
    :license: BSD, see LICENSE for details.

�)�unicode_literalsNc@s eZdZddd�Zddd�ZdS)	�Codec�strictcCstj||t�S)N)�codecs�charmap_encode�encoding_table)�self�input�errors�r�$/usr/lib/python3.6/x_user_defined.py�encodeszCodec.encodecCstj||t�S)N)r�charmap_decode�decoding_table)rr	r
rrr�decodeszCodec.decodeN)r)r)�__name__�
__module__�__qualname__r
rrrrrrs
rc@seZdZddd�ZdS)�IncrementalEncoderFcCstj||jt�dS)Nr)rrr
r)rr	�finalrrrr
szIncrementalEncoder.encodeN)F)rrrr
rrrrrsrc@seZdZddd�ZdS)�IncrementalDecoderFcCstj||jt�dS)Nr)rrr
r)rr	rrrrr$szIncrementalDecoder.decodeN)F)rrrrrrrrr#src@seZdZdS)�StreamWriterN)rrrrrrrr(src@seZdZdS)�StreamReaderN)rrrrrrrr,srzx-user-defined)�namer
r�incrementalencoder�incrementaldecoder�streamreader�streamwriteru	

 !"#$%&'()*+,-./0123456789:;<=>?@ABCDEFGHIJKLMNOPQRSTUVWXYZ[\]^_`abcdefghijklmnopqrstuvwxyz{|}~)�__doc__Z
__future__rrrrrrr�	CodecInfor
rZ
codec_infor�
charmap_buildrrrrr�<module>s&		site-packages/pip/_vendor/webencodings/labels.py000064400000021423147511334610016002 0ustar00"""

    webencodings.labels
    ~~~~~~~~~~~~~~~~~~~

    Map encoding labels to their name.

    :copyright: Copyright 2012 by Simon Sapin
    :license: BSD, see LICENSE for details.

"""

# XXX Do not edit!
# This file is automatically generated by mklabels.py

LABELS = {
    'unicode-1-1-utf-8':   'utf-8',
    'utf-8':               'utf-8',
    'utf8':                'utf-8',
    '866':                 'ibm866',
    'cp866':               'ibm866',
    'csibm866':            'ibm866',
    'ibm866':              'ibm866',
    'csisolatin2':         'iso-8859-2',
    'iso-8859-2':          'iso-8859-2',
    'iso-ir-101':          'iso-8859-2',
    'iso8859-2':           'iso-8859-2',
    'iso88592':            'iso-8859-2',
    'iso_8859-2':          'iso-8859-2',
    'iso_8859-2:1987':     'iso-8859-2',
    'l2':                  'iso-8859-2',
    'latin2':              'iso-8859-2',
    'csisolatin3':         'iso-8859-3',
    'iso-8859-3':          'iso-8859-3',
    'iso-ir-109':          'iso-8859-3',
    'iso8859-3':           'iso-8859-3',
    'iso88593':            'iso-8859-3',
    'iso_8859-3':          'iso-8859-3',
    'iso_8859-3:1988':     'iso-8859-3',
    'l3':                  'iso-8859-3',
    'latin3':              'iso-8859-3',
    'csisolatin4':         'iso-8859-4',
    'iso-8859-4':          'iso-8859-4',
    'iso-ir-110':          'iso-8859-4',
    'iso8859-4':           'iso-8859-4',
    'iso88594':            'iso-8859-4',
    'iso_8859-4':          'iso-8859-4',
    'iso_8859-4:1988':     'iso-8859-4',
    'l4':                  'iso-8859-4',
    'latin4':              'iso-8859-4',
    'csisolatincyrillic':  'iso-8859-5',
    'cyrillic':            'iso-8859-5',
    'iso-8859-5':          'iso-8859-5',
    'iso-ir-144':          'iso-8859-5',
    'iso8859-5':           'iso-8859-5',
    'iso88595':            'iso-8859-5',
    'iso_8859-5':          'iso-8859-5',
    'iso_8859-5:1988':     'iso-8859-5',
    'arabic':              'iso-8859-6',
    'asmo-708':            'iso-8859-6',
    'csiso88596e':         'iso-8859-6',
    'csiso88596i':         'iso-8859-6',
    'csisolatinarabic':    'iso-8859-6',
    'ecma-114':            'iso-8859-6',
    'iso-8859-6':          'iso-8859-6',
    'iso-8859-6-e':        'iso-8859-6',
    'iso-8859-6-i':        'iso-8859-6',
    'iso-ir-127':          'iso-8859-6',
    'iso8859-6':           'iso-8859-6',
    'iso88596':            'iso-8859-6',
    'iso_8859-6':          'iso-8859-6',
    'iso_8859-6:1987':     'iso-8859-6',
    'csisolatingreek':     'iso-8859-7',
    'ecma-118':            'iso-8859-7',
    'elot_928':            'iso-8859-7',
    'greek':               'iso-8859-7',
    'greek8':              'iso-8859-7',
    'iso-8859-7':          'iso-8859-7',
    'iso-ir-126':          'iso-8859-7',
    'iso8859-7':           'iso-8859-7',
    'iso88597':            'iso-8859-7',
    'iso_8859-7':          'iso-8859-7',
    'iso_8859-7:1987':     'iso-8859-7',
    'sun_eu_greek':        'iso-8859-7',
    'csiso88598e':         'iso-8859-8',
    'csisolatinhebrew':    'iso-8859-8',
    'hebrew':              'iso-8859-8',
    'iso-8859-8':          'iso-8859-8',
    'iso-8859-8-e':        'iso-8859-8',
    'iso-ir-138':          'iso-8859-8',
    'iso8859-8':           'iso-8859-8',
    'iso88598':            'iso-8859-8',
    'iso_8859-8':          'iso-8859-8',
    'iso_8859-8:1988':     'iso-8859-8',
    'visual':              'iso-8859-8',
    'csiso88598i':         'iso-8859-8-i',
    'iso-8859-8-i':        'iso-8859-8-i',
    'logical':             'iso-8859-8-i',
    'csisolatin6':         'iso-8859-10',
    'iso-8859-10':         'iso-8859-10',
    'iso-ir-157':          'iso-8859-10',
    'iso8859-10':          'iso-8859-10',
    'iso885910':           'iso-8859-10',
    'l6':                  'iso-8859-10',
    'latin6':              'iso-8859-10',
    'iso-8859-13':         'iso-8859-13',
    'iso8859-13':          'iso-8859-13',
    'iso885913':           'iso-8859-13',
    'iso-8859-14':         'iso-8859-14',
    'iso8859-14':          'iso-8859-14',
    'iso885914':           'iso-8859-14',
    'csisolatin9':         'iso-8859-15',
    'iso-8859-15':         'iso-8859-15',
    'iso8859-15':          'iso-8859-15',
    'iso885915':           'iso-8859-15',
    'iso_8859-15':         'iso-8859-15',
    'l9':                  'iso-8859-15',
    'iso-8859-16':         'iso-8859-16',
    'cskoi8r':             'koi8-r',
    'koi':                 'koi8-r',
    'koi8':                'koi8-r',
    'koi8-r':              'koi8-r',
    'koi8_r':              'koi8-r',
    'koi8-u':              'koi8-u',
    'csmacintosh':         'macintosh',
    'mac':                 'macintosh',
    'macintosh':           'macintosh',
    'x-mac-roman':         'macintosh',
    'dos-874':             'windows-874',
    'iso-8859-11':         'windows-874',
    'iso8859-11':          'windows-874',
    'iso885911':           'windows-874',
    'tis-620':             'windows-874',
    'windows-874':         'windows-874',
    'cp1250':              'windows-1250',
    'windows-1250':        'windows-1250',
    'x-cp1250':            'windows-1250',
    'cp1251':              'windows-1251',
    'windows-1251':        'windows-1251',
    'x-cp1251':            'windows-1251',
    'ansi_x3.4-1968':      'windows-1252',
    'ascii':               'windows-1252',
    'cp1252':              'windows-1252',
    'cp819':               'windows-1252',
    'csisolatin1':         'windows-1252',
    'ibm819':              'windows-1252',
    'iso-8859-1':          'windows-1252',
    'iso-ir-100':          'windows-1252',
    'iso8859-1':           'windows-1252',
    'iso88591':            'windows-1252',
    'iso_8859-1':          'windows-1252',
    'iso_8859-1:1987':     'windows-1252',
    'l1':                  'windows-1252',
    'latin1':              'windows-1252',
    'us-ascii':            'windows-1252',
    'windows-1252':        'windows-1252',
    'x-cp1252':            'windows-1252',
    'cp1253':              'windows-1253',
    'windows-1253':        'windows-1253',
    'x-cp1253':            'windows-1253',
    'cp1254':              'windows-1254',
    'csisolatin5':         'windows-1254',
    'iso-8859-9':          'windows-1254',
    'iso-ir-148':          'windows-1254',
    'iso8859-9':           'windows-1254',
    'iso88599':            'windows-1254',
    'iso_8859-9':          'windows-1254',
    'iso_8859-9:1989':     'windows-1254',
    'l5':                  'windows-1254',
    'latin5':              'windows-1254',
    'windows-1254':        'windows-1254',
    'x-cp1254':            'windows-1254',
    'cp1255':              'windows-1255',
    'windows-1255':        'windows-1255',
    'x-cp1255':            'windows-1255',
    'cp1256':              'windows-1256',
    'windows-1256':        'windows-1256',
    'x-cp1256':            'windows-1256',
    'cp1257':              'windows-1257',
    'windows-1257':        'windows-1257',
    'x-cp1257':            'windows-1257',
    'cp1258':              'windows-1258',
    'windows-1258':        'windows-1258',
    'x-cp1258':            'windows-1258',
    'x-mac-cyrillic':      'x-mac-cyrillic',
    'x-mac-ukrainian':     'x-mac-cyrillic',
    'chinese':             'gbk',
    'csgb2312':            'gbk',
    'csiso58gb231280':     'gbk',
    'gb2312':              'gbk',
    'gb_2312':             'gbk',
    'gb_2312-80':          'gbk',
    'gbk':                 'gbk',
    'iso-ir-58':           'gbk',
    'x-gbk':               'gbk',
    'gb18030':             'gb18030',
    'hz-gb-2312':          'hz-gb-2312',
    'big5':                'big5',
    'big5-hkscs':          'big5',
    'cn-big5':             'big5',
    'csbig5':              'big5',
    'x-x-big5':            'big5',
    'cseucpkdfmtjapanese': 'euc-jp',
    'euc-jp':              'euc-jp',
    'x-euc-jp':            'euc-jp',
    'csiso2022jp':         'iso-2022-jp',
    'iso-2022-jp':         'iso-2022-jp',
    'csshiftjis':          'shift_jis',
    'ms_kanji':            'shift_jis',
    'shift-jis':           'shift_jis',
    'shift_jis':           'shift_jis',
    'sjis':                'shift_jis',
    'windows-31j':         'shift_jis',
    'x-sjis':              'shift_jis',
    'cseuckr':             'euc-kr',
    'csksc56011987':       'euc-kr',
    'euc-kr':              'euc-kr',
    'iso-ir-149':          'euc-kr',
    'korean':              'euc-kr',
    'ks_c_5601-1987':      'euc-kr',
    'ks_c_5601-1989':      'euc-kr',
    'ksc5601':             'euc-kr',
    'ksc_5601':            'euc-kr',
    'windows-949':         'euc-kr',
    'csiso2022kr':         'iso-2022-kr',
    'iso-2022-kr':         'iso-2022-kr',
    'utf-16be':            'utf-16be',
    'utf-16':              'utf-16le',
    'utf-16le':            'utf-16le',
    'x-user-defined':      'x-user-defined',
}
site-packages/pip/_vendor/webencodings/mklabels.py000064400000002431147511334610016330 0ustar00"""

    webencodings.mklabels
    ~~~~~~~~~~~~~~~~~~~~~

    Regenerate the webencodings.labels module.

    :copyright: Copyright 2012 by Simon Sapin
    :license: BSD, see LICENSE for details.

"""

import json
try:
    from urllib import urlopen
except ImportError:
    from urllib.request import urlopen


def assert_lower(string):
    assert string == string.lower()
    return string


def generate(url):
    parts = ['''\
"""

    webencodings.labels
    ~~~~~~~~~~~~~~~~~~~

    Map encoding labels to their name.

    :copyright: Copyright 2012 by Simon Sapin
    :license: BSD, see LICENSE for details.

"""

# XXX Do not edit!
# This file is automatically generated by mklabels.py

LABELS = {
''']
    labels = [
        (repr(assert_lower(label)).lstrip('u'),
         repr(encoding['name']).lstrip('u'))
        for category in json.loads(urlopen(url).read().decode('ascii'))
        for encoding in category['encodings']
        for label in encoding['labels']]
    max_len = max(len(label) for label, name in labels)
    parts.extend(
        '    %s:%s %s,\n' % (label, ' ' * (max_len - len(label)), name)
        for label, name in labels)
    parts.append('}')
    return ''.join(parts)
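
# Illustratively, each generated entry is one aligned line of the LABELS
# dict, e.g. (as in the generated labels.py elsewhere in this archive):
#
#     'utf8':                'utf-8',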


if __name__ == '__main__':
    print(generate('http://encoding.spec.whatwg.org/encodings.json'))
site-packages/pip/_vendor/webencodings/x_user_defined.py000064400000010322147511334610017517 0ustar00# coding: utf8
"""

    webencodings.x_user_defined
    ~~~~~~~~~~~~~~~~~~~~~~~~~~~

    An implementation of the x-user-defined encoding.

    :copyright: Copyright 2012 by Simon Sapin
    :license: BSD, see LICENSE for details.

"""

from __future__ import unicode_literals

import codecs


### Codec APIs

class Codec(codecs.Codec):

    def encode(self, input, errors='strict'):
        return codecs.charmap_encode(input, errors, encoding_table)

    def decode(self, input, errors='strict'):
        return codecs.charmap_decode(input, errors, decoding_table)


class IncrementalEncoder(codecs.IncrementalEncoder):
    def encode(self, input, final=False):
        return codecs.charmap_encode(input, self.errors, encoding_table)[0]


class IncrementalDecoder(codecs.IncrementalDecoder):
    def decode(self, input, final=False):
        return codecs.charmap_decode(input, self.errors, decoding_table)[0]


class StreamWriter(Codec, codecs.StreamWriter):
    pass


class StreamReader(Codec, codecs.StreamReader):
    pass


### encodings module API

codec_info = codecs.CodecInfo(
    name='x-user-defined',
    encode=Codec().encode,
    decode=Codec().decode,
    incrementalencoder=IncrementalEncoder,
    incrementaldecoder=IncrementalDecoder,
    streamreader=StreamReader,
    streamwriter=StreamWriter,
)
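
# A quick sanity sketch of the mapping defined by the tables below: bytes
# 0x00-0x7F decode to themselves, bytes 0x80-0xFF decode to the private-use
# code points U+F780-U+F7FF, and encoding reverses the mapping:
#
#     >>> codec_info.decode(b'a\x80\xff')[0]
#     'a\uf780\uf7ff'
#     >>> codec_info.encode('a\uf780\uf7ff')[0]
#     b'a\x80\xff'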


### Decoding Table

# Python 3:
# for c in range(256): print('    %r' % chr(c if c < 128 else c + 0xF700))
decoding_table = (
    '\x00'
    '\x01'
    '\x02'
    '\x03'
    '\x04'
    '\x05'
    '\x06'
    '\x07'
    '\x08'
    '\t'
    '\n'
    '\x0b'
    '\x0c'
    '\r'
    '\x0e'
    '\x0f'
    '\x10'
    '\x11'
    '\x12'
    '\x13'
    '\x14'
    '\x15'
    '\x16'
    '\x17'
    '\x18'
    '\x19'
    '\x1a'
    '\x1b'
    '\x1c'
    '\x1d'
    '\x1e'
    '\x1f'
    ' '
    '!'
    '"'
    '#'
    '$'
    '%'
    '&'
    "'"
    '('
    ')'
    '*'
    '+'
    ','
    '-'
    '.'
    '/'
    '0'
    '1'
    '2'
    '3'
    '4'
    '5'
    '6'
    '7'
    '8'
    '9'
    ':'
    ';'
    '<'
    '='
    '>'
    '?'
    '@'
    'A'
    'B'
    'C'
    'D'
    'E'
    'F'
    'G'
    'H'
    'I'
    'J'
    'K'
    'L'
    'M'
    'N'
    'O'
    'P'
    'Q'
    'R'
    'S'
    'T'
    'U'
    'V'
    'W'
    'X'
    'Y'
    'Z'
    '['
    '\\'
    ']'
    '^'
    '_'
    '`'
    'a'
    'b'
    'c'
    'd'
    'e'
    'f'
    'g'
    'h'
    'i'
    'j'
    'k'
    'l'
    'm'
    'n'
    'o'
    'p'
    'q'
    'r'
    's'
    't'
    'u'
    'v'
    'w'
    'x'
    'y'
    'z'
    '{'
    '|'
    '}'
    '~'
    '\x7f'
    '\uf780'
    '\uf781'
    '\uf782'
    '\uf783'
    '\uf784'
    '\uf785'
    '\uf786'
    '\uf787'
    '\uf788'
    '\uf789'
    '\uf78a'
    '\uf78b'
    '\uf78c'
    '\uf78d'
    '\uf78e'
    '\uf78f'
    '\uf790'
    '\uf791'
    '\uf792'
    '\uf793'
    '\uf794'
    '\uf795'
    '\uf796'
    '\uf797'
    '\uf798'
    '\uf799'
    '\uf79a'
    '\uf79b'
    '\uf79c'
    '\uf79d'
    '\uf79e'
    '\uf79f'
    '\uf7a0'
    '\uf7a1'
    '\uf7a2'
    '\uf7a3'
    '\uf7a4'
    '\uf7a5'
    '\uf7a6'
    '\uf7a7'
    '\uf7a8'
    '\uf7a9'
    '\uf7aa'
    '\uf7ab'
    '\uf7ac'
    '\uf7ad'
    '\uf7ae'
    '\uf7af'
    '\uf7b0'
    '\uf7b1'
    '\uf7b2'
    '\uf7b3'
    '\uf7b4'
    '\uf7b5'
    '\uf7b6'
    '\uf7b7'
    '\uf7b8'
    '\uf7b9'
    '\uf7ba'
    '\uf7bb'
    '\uf7bc'
    '\uf7bd'
    '\uf7be'
    '\uf7bf'
    '\uf7c0'
    '\uf7c1'
    '\uf7c2'
    '\uf7c3'
    '\uf7c4'
    '\uf7c5'
    '\uf7c6'
    '\uf7c7'
    '\uf7c8'
    '\uf7c9'
    '\uf7ca'
    '\uf7cb'
    '\uf7cc'
    '\uf7cd'
    '\uf7ce'
    '\uf7cf'
    '\uf7d0'
    '\uf7d1'
    '\uf7d2'
    '\uf7d3'
    '\uf7d4'
    '\uf7d5'
    '\uf7d6'
    '\uf7d7'
    '\uf7d8'
    '\uf7d9'
    '\uf7da'
    '\uf7db'
    '\uf7dc'
    '\uf7dd'
    '\uf7de'
    '\uf7df'
    '\uf7e0'
    '\uf7e1'
    '\uf7e2'
    '\uf7e3'
    '\uf7e4'
    '\uf7e5'
    '\uf7e6'
    '\uf7e7'
    '\uf7e8'
    '\uf7e9'
    '\uf7ea'
    '\uf7eb'
    '\uf7ec'
    '\uf7ed'
    '\uf7ee'
    '\uf7ef'
    '\uf7f0'
    '\uf7f1'
    '\uf7f2'
    '\uf7f3'
    '\uf7f4'
    '\uf7f5'
    '\uf7f6'
    '\uf7f7'
    '\uf7f8'
    '\uf7f9'
    '\uf7fa'
    '\uf7fb'
    '\uf7fc'
    '\uf7fd'
    '\uf7fe'
    '\uf7ff'
)

### Encoding table
encoding_table = codecs.charmap_build(decoding_table)
site-packages/pip/_vendor/webencodings/__init__.py000064400000024520147511334610016300 0ustar00# coding: utf8
"""

    webencodings
    ~~~~~~~~~~~~

    This is a Python implementation of the `WHATWG Encoding standard
    <http://encoding.spec.whatwg.org/>`_. See README for details.

    :copyright: Copyright 2012 by Simon Sapin
    :license: BSD, see LICENSE for details.

"""

from __future__ import unicode_literals

import codecs

from .labels import LABELS


VERSION = '0.5'


# Some names in Encoding are not valid Python aliases. Remap these.
PYTHON_NAMES = {
    'iso-8859-8-i': 'iso-8859-8',
    'x-mac-cyrillic': 'mac-cyrillic',
    'macintosh': 'mac-roman',
    'windows-874': 'cp874'}

CACHE = {}


def ascii_lower(string):
    r"""Transform (only) ASCII letters to lower case: A-Z is mapped to a-z.

    :param string: A Unicode string.
    :returns: A new Unicode string.

    This is used for `ASCII case-insensitive
    <http://encoding.spec.whatwg.org/#ascii-case-insensitive>`_
    matching of encoding labels.
    The same matching is also used, among other things,
    for `CSS keywords <http://dev.w3.org/csswg/css-values/#keywords>`_.

    This is different from the :meth:`~py:str.lower` method of Unicode strings
    which also affects non-ASCII characters,
    sometimes mapping them into the ASCII range:

        >>> keyword = u'Bac\N{KELVIN SIGN}ground'
        >>> assert keyword.lower() == u'background'
        >>> assert ascii_lower(keyword) != keyword.lower()
        >>> assert ascii_lower(keyword) == u'bac\N{KELVIN SIGN}ground'

    """
    # This turns out to be faster than unicode.translate()
    return string.encode('utf8').lower().decode('utf8')


def lookup(label):
    """
    Look for an encoding by its label.
    This is the spec’s `get an encoding
    <http://encoding.spec.whatwg.org/#concept-encoding-get>`_ algorithm.
    Supported labels are listed there.

    :param label: A string.
    :returns:
        An :class:`Encoding` object, or :obj:`None` for an unknown label.

    """
    # Only strip ASCII whitespace: U+0009, U+000A, U+000C, U+000D, and U+0020.
    label = ascii_lower(label.strip('\t\n\f\r '))
    name = LABELS.get(label)
    if name is None:
        return None
    encoding = CACHE.get(name)
    if encoding is None:
        if name == 'x-user-defined':
            from .x_user_defined import codec_info
        else:
            python_name = PYTHON_NAMES.get(name, name)
            # Any python_name value that gets to here should be valid.
            codec_info = codecs.lookup(python_name)
        encoding = Encoding(name, codec_info)
        CACHE[name] = encoding
    return encoding
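
# An illustrative sketch (values mirror tests.py later in this archive):
# ``lookup`` strips ASCII whitespace, lower-cases ASCII letters and then
# consults LABELS, so case and surrounding spaces do not matter, while an
# unknown label simply yields None:
#
#     >>> lookup(' Latin1\t').name      # 'latin1' is an alias of windows-1252
#     'windows-1252'
#     >>> lookup('UTF8').name
#     'utf-8'
#     >>> lookup('no-such-label') is None
#     True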


def _get_encoding(encoding_or_label):
    """
    Accept either an encoding object or label.

    :param encoding: An :class:`Encoding` object or a label string.
    :returns: An :class:`Encoding` object.
    :raises: :exc:`~exceptions.LookupError` for an unknown label.

    """
    if hasattr(encoding_or_label, 'codec_info'):
        return encoding_or_label

    encoding = lookup(encoding_or_label)
    if encoding is None:
        raise LookupError('Unknown encoding label: %r' % encoding_or_label)
    return encoding


class Encoding(object):
    """Reresents a character encoding such as UTF-8,
    that can be used for decoding or encoding.

    .. attribute:: name

        Canonical name of the encoding

    .. attribute:: codec_info

        The actual implementation of the encoding,
        a stdlib :class:`~codecs.CodecInfo` object.
        See :func:`codecs.register`.

    """
    def __init__(self, name, codec_info):
        self.name = name
        self.codec_info = codec_info

    def __repr__(self):
        return '<Encoding %s>' % self.name


#: The UTF-8 encoding. Should be used for new content and formats.
UTF8 = lookup('utf-8')

_UTF16LE = lookup('utf-16le')
_UTF16BE = lookup('utf-16be')


def decode(input, fallback_encoding, errors='replace'):
    """
    Decode a single string.

    :param input: A byte string
    :param fallback_encoding:
        An :class:`Encoding` object or a label string.
        The encoding to use if :obj:`input` does not have a BOM.
    :param errors: Type of error handling. See :func:`codecs.register`.
    :raises: :exc:`~exceptions.LookupError` for an unknown encoding label.
    :return:
        A ``(output, encoding)`` tuple of a Unicode string
        and an :obj:`Encoding`.

    """
    # Fail early if `encoding` is an invalid label.
    fallback_encoding = _get_encoding(fallback_encoding)
    bom_encoding, input = _detect_bom(input)
    encoding = bom_encoding or fallback_encoding
    return encoding.codec_info.decode(input, errors)[0], encoding
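
# For example (mirroring tests.py): a BOM wins over the fallback label,
# otherwise the fallback is used:
#
#     >>> decode(b'\x80', 'latin1')                 # no BOM, fallback applies
#     ('€', <Encoding windows-1252>)
#     >>> decode(b'\xEF\xBB\xBF\xc3\xa9', 'ascii')  # UTF-8 BOM detected and stripped
#     ('é', <Encoding utf-8>)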


def _detect_bom(input):
    """Return (bom_encoding, input), with any BOM removed from the input."""
    if input.startswith(b'\xFF\xFE'):
        return _UTF16LE, input[2:]
    if input.startswith(b'\xFE\xFF'):
        return _UTF16BE, input[2:]
    if input.startswith(b'\xEF\xBB\xBF'):
        return UTF8, input[3:]
    return None, input


def encode(input, encoding=UTF8, errors='strict'):
    """
    Encode a single string.

    :param input: A Unicode string.
    :param encoding: An :class:`Encoding` object or a label string.
    :param errors: Type of error handling. See :func:`codecs.register`.
    :raises: :exc:`~exceptions.LookupError` for an unknown encoding label.
    :return: A byte string.

    """
    return _get_encoding(encoding).codec_info.encode(input, errors)[0]
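
# For example (mirroring tests.py):
#
#     >>> encode('é')                # UTF-8 is the default
#     b'\xc3\xa9'
#     >>> encode('é', 'utf-16le')
#     b'\xe9\x00'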


def iter_decode(input, fallback_encoding, errors='replace'):
    """
    "Pull"-based decoder.

    :param input:
        An iterable of byte strings.

        The input is first consumed just enough to determine the encoding
        based on the presence of a BOM,
        then consumed on demand as the returned iterable is itself consumed.
    :param fallback_encoding:
        An :class:`Encoding` object or a label string.
        The encoding to use if :obj:`input` does not have a BOM.
    :param errors: Type of error handling. See :func:`codecs.register`.
    :raises: :exc:`~exceptions.LookupError` for an unknown encoding label.
    :returns:
        An ``(output, encoding)`` tuple.
        :obj:`output` is an iterable of Unicode strings,
        :obj:`encoding` is the :obj:`Encoding` that is being used.

    """

    decoder = IncrementalDecoder(fallback_encoding, errors)
    generator = _iter_decode_generator(input, decoder)
    encoding = next(generator)
    return generator, encoding
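
# A pull-based sketch (chunk boundaries may split the BOM and multi-byte
# sequences arbitrarily; values mirror tests.py):
#
#     >>> output, encoding = iter_decode([b'\xEF\xBB\xBF', b'\xc3', b'\xa9'], 'latin1')
#     >>> encoding
#     <Encoding utf-8>
#     >>> ''.join(output)
#     'é'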


def _iter_decode_generator(input, decoder):
    """Return a generator that first yields the :obj:`Encoding`,
    then yields output chunks as Unicode strings.

    """
    decode = decoder.decode
    input = iter(input)
    for chunck in input:
        output = decode(chunck)
        if output:
            assert decoder.encoding is not None
            yield decoder.encoding
            yield output
            break
    else:
        # Input exhausted without determining the encoding
        output = decode(b'', final=True)
        assert decoder.encoding is not None
        yield decoder.encoding
        if output:
            yield output
        return

    for chunck in input:
        output = decode(chunck)
        if output:
            yield output
    output = decode(b'', final=True)
    if output:
        yield output


def iter_encode(input, encoding=UTF8, errors='strict'):
    """
    “Pull”-based encoder.

    :param input: An iterable of Unicode strings.
    :param encoding: An :class:`Encoding` object or a label string.
    :param errors: Type of error handling. See :func:`codecs.register`.
    :raises: :exc:`~exceptions.LookupError` for an unknown encoding label.
    :returns: An iterable of byte strings.

    """
    # Fail early if `encoding` is an invalid label.
    encode = IncrementalEncoder(encoding, errors).encode
    return _iter_encode_generator(input, encode)
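
# For example (mirroring tests.py), empty chunks are simply skipped:
#
#     >>> b''.join(iter_encode(['', 'é', '', ''], 'utf-16le'))
#     b'\xe9\x00'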


def _iter_encode_generator(input, encode):
    for chunck in input:
        output = encode(chunck)
        if output:
            yield output
    output = encode('', final=True)
    if output:
        yield output


class IncrementalDecoder(object):
    """
    “Push”-based decoder.

    :param fallback_encoding:
        An :class:`Encoding` object or a label string.
        The encoding to use if :obj:`input` does not have a BOM.
    :param errors: Type of error handling. See :func:`codecs.register`.
    :raises: :exc:`~exceptions.LookupError` for an unknown encoding label.

    """
    def __init__(self, fallback_encoding, errors='replace'):
        # Fail early if `encoding` is an invalid label.
        self._fallback_encoding = _get_encoding(fallback_encoding)
        self._errors = errors
        self._buffer = b''
        self._decoder = None
        #: The actual :class:`Encoding` that is being used,
        #: or :obj:`None` if that is not determined yet.
        #: (Ie. if there is not enough input yet to determine
        #: if there is a BOM.)
        self.encoding = None  # Not known yet.

    def decode(self, input, final=False):
        """Decode one chunk of the input.

        :param input: A byte string.
        :param final:
            Indicate that no more input is available.
            Must be :obj:`True` if this is the last call.
        :returns: A Unicode string.

        """
        decoder = self._decoder
        if decoder is not None:
            return decoder(input, final)

        input = self._buffer + input
        encoding, input = _detect_bom(input)
        if encoding is None:
            if len(input) < 3 and not final:  # Not enough data yet.
                self._buffer = input
                return ''
            else:  # No BOM
                encoding = self._fallback_encoding
        decoder = encoding.codec_info.incrementaldecoder(self._errors).decode
        self._decoder = decoder
        self.encoding = encoding
        return decoder(input, final)
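
# A push-based sketch: chunks are fed as they arrive, and ``encoding`` stays
# None until a BOM has been seen or ruled out:
#
#     >>> d = IncrementalDecoder('latin1')
#     >>> d.decode(b'\xEF\xBB')            # could still be a UTF-8 BOM
#     ''
#     >>> d.encoding is None
#     True
#     >>> d.decode(b'\xBF\xc3\xa9', final=True)
#     'é'
#     >>> d.encoding
#     <Encoding utf-8>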


class IncrementalEncoder(object):
    """
    “Push”-based encoder.

    :param encoding: An :class:`Encoding` object or a label string.
    :param errors: Type of error handling. See :func:`codecs.register`.
    :raises: :exc:`~exceptions.LookupError` for an unknown encoding label.

    .. method:: encode(input, final=False)

        :param input: A Unicode string.
        :param final:
            Indicate that no more input is available.
            Must be :obj:`True` if this is the last call.
        :returns: A byte string.

    """
    def __init__(self, encoding=UTF8, errors='strict'):
        encoding = _get_encoding(encoding)
        self.encode = encoding.codec_info.incrementalencoder(errors).encode
site-packages/pip/_vendor/webencodings/tests.py000064400000014642147511334610015707 0ustar00# coding: utf8
"""

    webencodings.tests
    ~~~~~~~~~~~~~~~~~~

    A basic test suite for Encoding.

    :copyright: Copyright 2012 by Simon Sapin
    :license: BSD, see LICENSE for details.

"""

from __future__ import unicode_literals

from . import (lookup, LABELS, decode, encode, iter_decode, iter_encode,
               IncrementalDecoder, IncrementalEncoder, UTF8)


def assert_raises(exception, function, *args, **kwargs):
    try:
        function(*args, **kwargs)
    except exception:
        return
    else:  # pragma: no cover
        raise AssertionError('Did not raise %s.' % exception)


def test_labels():
    assert lookup('utf-8').name == 'utf-8'
    assert lookup('Utf-8').name == 'utf-8'
    assert lookup('UTF-8').name == 'utf-8'
    assert lookup('utf8').name == 'utf-8'
    assert lookup('utf8').name == 'utf-8'
    assert lookup('utf8 ').name == 'utf-8'
    assert lookup(' \r\nutf8\t').name == 'utf-8'
    assert lookup('u8') is None  # Python label.
    assert lookup('utf-8 ') is None  # Non-ASCII white space.

    assert lookup('US-ASCII').name == 'windows-1252'
    assert lookup('iso-8859-1').name == 'windows-1252'
    assert lookup('latin1').name == 'windows-1252'
    assert lookup('LATIN1').name == 'windows-1252'
    assert lookup('latin-1') is None
    assert lookup('LATİN1') is None  # ASCII-only case insensitivity.


def test_all_labels():
    for label in LABELS:
        assert decode(b'', label) == ('', lookup(label))
        assert encode('', label) == b''
        for repeat in [0, 1, 12]:
            output, _ = iter_decode([b''] * repeat, label)
            assert list(output) == []
            assert list(iter_encode([''] * repeat, label)) == []
        decoder = IncrementalDecoder(label)
        assert decoder.decode(b'') == ''
        assert decoder.decode(b'', final=True) == ''
        encoder = IncrementalEncoder(label)
        assert encoder.encode('') == b''
        assert encoder.encode('', final=True) == b''
    # All encoding names are valid labels too:
    for name in set(LABELS.values()):
        assert lookup(name).name == name


def test_invalid_label():
    assert_raises(LookupError, decode, b'\xEF\xBB\xBF\xc3\xa9', 'invalid')
    assert_raises(LookupError, encode, 'é', 'invalid')
    assert_raises(LookupError, iter_decode, [], 'invalid')
    assert_raises(LookupError, iter_encode, [], 'invalid')
    assert_raises(LookupError, IncrementalDecoder, 'invalid')
    assert_raises(LookupError, IncrementalEncoder, 'invalid')


def test_decode():
    assert decode(b'\x80', 'latin1') == ('€', lookup('latin1'))
    assert decode(b'\x80', lookup('latin1')) == ('€', lookup('latin1'))
    assert decode(b'\xc3\xa9', 'utf8') == ('é', lookup('utf8'))
    assert decode(b'\xc3\xa9', UTF8) == ('é', lookup('utf8'))
    assert decode(b'\xc3\xa9', 'ascii') == ('é', lookup('ascii'))
    assert decode(b'\xEF\xBB\xBF\xc3\xa9', 'ascii') == ('é', lookup('utf8'))  # UTF-8 with BOM

    assert decode(b'\xFE\xFF\x00\xe9', 'ascii') == ('é', lookup('utf-16be'))  # UTF-16-BE with BOM
    assert decode(b'\xFF\xFE\xe9\x00', 'ascii') == ('é', lookup('utf-16le'))  # UTF-16-LE with BOM
    assert decode(b'\xFE\xFF\xe9\x00', 'ascii') == ('\ue900', lookup('utf-16be'))
    assert decode(b'\xFF\xFE\x00\xe9', 'ascii') == ('\ue900', lookup('utf-16le'))

    assert decode(b'\x00\xe9', 'UTF-16BE') == ('é', lookup('utf-16be'))
    assert decode(b'\xe9\x00', 'UTF-16LE') == ('é', lookup('utf-16le'))
    assert decode(b'\xe9\x00', 'UTF-16') == ('é', lookup('utf-16le'))

    assert decode(b'\xe9\x00', 'UTF-16BE') == ('\ue900', lookup('utf-16be'))
    assert decode(b'\x00\xe9', 'UTF-16LE') == ('\ue900', lookup('utf-16le'))
    assert decode(b'\x00\xe9', 'UTF-16') == ('\ue900', lookup('utf-16le'))


def test_encode():
    assert encode('é', 'latin1') == b'\xe9'
    assert encode('é', 'utf8') == b'\xc3\xa9'
    assert encode('é', 'utf8') == b'\xc3\xa9'
    assert encode('é', 'utf-16') == b'\xe9\x00'
    assert encode('é', 'utf-16le') == b'\xe9\x00'
    assert encode('é', 'utf-16be') == b'\x00\xe9'


def test_iter_decode():
    def iter_decode_to_string(input, fallback_encoding):
        output, _encoding = iter_decode(input, fallback_encoding)
        return ''.join(output)
    assert iter_decode_to_string([], 'latin1') == ''
    assert iter_decode_to_string([b''], 'latin1') == ''
    assert iter_decode_to_string([b'\xe9'], 'latin1') == 'é'
    assert iter_decode_to_string([b'hello'], 'latin1') == 'hello'
    assert iter_decode_to_string([b'he', b'llo'], 'latin1') == 'hello'
    assert iter_decode_to_string([b'hell', b'o'], 'latin1') == 'hello'
    assert iter_decode_to_string([b'\xc3\xa9'], 'latin1') == 'é'
    assert iter_decode_to_string([b'\xEF\xBB\xBF\xc3\xa9'], 'latin1') == 'é'
    assert iter_decode_to_string([
        b'\xEF\xBB\xBF', b'\xc3', b'\xa9'], 'latin1') == 'é'
    assert iter_decode_to_string([
        b'\xEF\xBB\xBF', b'a', b'\xc3'], 'latin1') == 'a\uFFFD'
    assert iter_decode_to_string([
        b'', b'\xEF', b'', b'', b'\xBB\xBF\xc3', b'\xa9'], 'latin1') == 'é'
    assert iter_decode_to_string([b'\xEF\xBB\xBF'], 'latin1') == ''
    assert iter_decode_to_string([b'\xEF\xBB'], 'latin1') == 'ï»'
    assert iter_decode_to_string([b'\xFE\xFF\x00\xe9'], 'latin1') == 'é'
    assert iter_decode_to_string([b'\xFF\xFE\xe9\x00'], 'latin1') == 'é'
    assert iter_decode_to_string([
        b'', b'\xFF', b'', b'', b'\xFE\xe9', b'\x00'], 'latin1') == 'é'
    assert iter_decode_to_string([
        b'', b'h\xe9', b'llo'], 'x-user-defined') == 'h\uF7E9llo'


def test_iter_encode():
    assert b''.join(iter_encode([], 'latin1')) == b''
    assert b''.join(iter_encode([''], 'latin1')) == b''
    assert b''.join(iter_encode(['é'], 'latin1')) == b'\xe9'
    assert b''.join(iter_encode(['', 'é', '', ''], 'latin1')) == b'\xe9'
    assert b''.join(iter_encode(['', 'é', '', ''], 'utf-16')) == b'\xe9\x00'
    assert b''.join(iter_encode(['', 'é', '', ''], 'utf-16le')) == b'\xe9\x00'
    assert b''.join(iter_encode(['', 'é', '', ''], 'utf-16be')) == b'\x00\xe9'
    assert b''.join(iter_encode([
        '', 'h\uF7E9', '', 'llo'], 'x-user-defined')) == b'h\xe9llo'


def test_x_user_defined():
    encoded = b'2,\x0c\x0b\x1aO\xd9#\xcb\x0f\xc9\xbbt\xcf\xa8\xca'
    decoded = '2,\x0c\x0b\x1aO\uf7d9#\uf7cb\x0f\uf7c9\uf7bbt\uf7cf\uf7a8\uf7ca'
    assert decode(encoded, 'x-user-defined') == (decoded, lookup('x-user-defined'))
    assert encode(decoded, 'x-user-defined') == encoded
    # Pure-ASCII input passes through unchanged.
    encoded = b'aa'
    decoded = 'aa'
    assert decode(encoded, 'x-user-defined') == (decoded, lookup('x-user-defined'))
    assert encode(decoded, 'x-user-defined') == encoded
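

# A minimal usage sketch (not part of the upstream test suite): decode() returns a
# (text, encoding) pair, preferring a BOM over the fallback label, while encode()
# takes text plus an encoding label and returns bytes. The helper name is
# hypothetical.
def _example_roundtrip():
    text, enc = decode(b'\xEF\xBB\xBF\xc3\xa9', 'latin1')  # BOM overrides latin1
    assert (text, enc) == ('é', lookup('utf-8'))
    assert encode(text, 'utf-8') == b'\xc3\xa9'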
site-packages/pip/_vendor/distlib/scripts.py
# -*- coding: utf-8 -*-
#
# Copyright (C) 2013-2015 Vinay Sajip.
# Licensed to the Python Software Foundation under a contributor agreement.
# See LICENSE.txt and CONTRIBUTORS.txt.
#
from io import BytesIO
import logging
import os
import re
import struct
import sys

from .compat import sysconfig, detect_encoding, ZipFile
from .resources import finder
from .util import (FileOperator, get_export_entry, convert_path,
                   get_executable, in_venv)

logger = logging.getLogger(__name__)

_DEFAULT_MANIFEST = '''
<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<assembly xmlns="urn:schemas-microsoft-com:asm.v1" manifestVersion="1.0">
 <assemblyIdentity version="1.0.0.0"
 processorArchitecture="X86"
 name="%s"
 type="win32"/>

 <!-- Identify the application security requirements. -->
 <trustInfo xmlns="urn:schemas-microsoft-com:asm.v3">
 <security>
 <requestedPrivileges>
 <requestedExecutionLevel level="asInvoker" uiAccess="false"/>
 </requestedPrivileges>
 </security>
 </trustInfo>
</assembly>'''.strip()

# check if Python is called on the first line with this expression
FIRST_LINE_RE = re.compile(b'^#!.*pythonw?[0-9.]*([ \t].*)?$')
SCRIPT_TEMPLATE = '''# -*- coding: utf-8 -*-
if __name__ == '__main__':
    import sys, re

    def _resolve(module, func):
        __import__(module)
        mod = sys.modules[module]
        parts = func.split('.')
        result = getattr(mod, parts.pop(0))
        for p in parts:
            result = getattr(result, p)
        return result

    try:
        sys.argv[0] = re.sub(r'(-script\.pyw?|\.exe)?$', '', sys.argv[0])

        func = _resolve('%(module)s', '%(func)s')
        rc = func() # None interpreted as 0
    except Exception as e:  # only supporting Python >= 2.6
        sys.stderr.write('%%s\\n' %% e)
        rc = 1
    sys.exit(rc)
'''
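
# For reference (illustrative only): with module='foo.bar' and func='main', the
# template above expands to a script whose __main__ block imports foo.bar,
# resolves 'main' via _resolve(), calls it, and exits with its return code
# (a None return is treated as exit status 0).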


def _enquote_executable(executable):
    if ' ' in executable:
        # make sure we quote only the executable in case of env
        # for example /usr/bin/env "/dir with spaces/bin/jython"
        # instead of "/usr/bin/env /dir with spaces/bin/jython"
        # otherwise the whole command line, including env, would end up quoted
        if executable.startswith('/usr/bin/env '):
            env, _executable = executable.split(' ', 1)
            if ' ' in _executable and not _executable.startswith('"'):
                executable = '%s "%s"' % (env, _executable)
        else:
            if not executable.startswith('"'):
                executable = '"%s"' % executable
    return executable
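
# Behaviour sketch for _enquote_executable (the paths below are hypothetical):
#   _enquote_executable('/opt/py dir/python')
#       -> '"/opt/py dir/python"'
#   _enquote_executable('/usr/bin/env /opt/py dir/python')
#       -> '/usr/bin/env "/opt/py dir/python"'   # only the interpreter is quoted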


class ScriptMaker(object):
    """
    A class to copy or create scripts from source scripts or callable
    specifications.
    """
    script_template = SCRIPT_TEMPLATE

    executable = None  # for shebangs

    def __init__(self, source_dir, target_dir, add_launchers=True,
                 dry_run=False, fileop=None):
        self.source_dir = source_dir
        self.target_dir = target_dir
        self.add_launchers = add_launchers
        self.force = False
        self.clobber = False
        # It only makes sense to set mode bits on POSIX.
        self.set_mode = (os.name == 'posix') or (os.name == 'java' and
                                                 os._name == 'posix')
        self.variants = set(('', 'X.Y'))
        self._fileop = fileop or FileOperator(dry_run)

        self._is_nt = os.name == 'nt' or (
            os.name == 'java' and os._name == 'nt')

    def _get_alternate_executable(self, executable, options):
        if options.get('gui', False) and self._is_nt:  # pragma: no cover
            dn, fn = os.path.split(executable)
            fn = fn.replace('python', 'pythonw')
            executable = os.path.join(dn, fn)
        return executable

    if sys.platform.startswith('java'):  # pragma: no cover
        def _is_shell(self, executable):
            """
            Determine if the specified executable is a script
            (contains a #! line)
            """
            try:
                with open(executable) as fp:
                    return fp.read(2) == '#!'
            except (OSError, IOError):
                logger.warning('Failed to open %s', executable)
                return False

        def _fix_jython_executable(self, executable):
            if self._is_shell(executable):
                # Workaround for Jython is not needed on Linux systems.
                import java

                if java.lang.System.getProperty('os.name') == 'Linux':
                    return executable
            elif executable.lower().endswith('jython.exe'):
                # Use wrapper exe for Jython on Windows
                return executable
            return '/usr/bin/env %s' % executable

    def _get_shebang(self, encoding, post_interp=b'', options=None):
        enquote = True
        if self.executable:
            executable = self.executable
            enquote = False     # assume this will be taken care of
        elif not sysconfig.is_python_build():
            executable = get_executable()
        elif in_venv():  # pragma: no cover
            executable = os.path.join(sysconfig.get_path('scripts'),
                            'python%s' % sysconfig.get_config_var('EXE'))
        else:  # pragma: no cover
            executable = os.path.join(
                sysconfig.get_config_var('BINDIR'),
               'python%s%s' % (sysconfig.get_config_var('VERSION'),
                               sysconfig.get_config_var('EXE')))
        if options:
            executable = self._get_alternate_executable(executable, options)

        if sys.platform.startswith('java'):  # pragma: no cover
            executable = self._fix_jython_executable(executable)
        # Normalise case for Windows
        executable = os.path.normcase(executable)
        # If the user didn't specify an executable, it may be necessary to
        # cater for executable paths with spaces (not uncommon on Windows)
        if enquote:
            executable = _enquote_executable(executable)
        # Issue #51: don't use fsencode, since we later try to
        # check that the shebang is decodable using utf-8.
        executable = executable.encode('utf-8')
        # in case of IronPython, play safe and enable frames support
        if (sys.platform == 'cli' and '-X:Frames' not in post_interp
            and '-X:FullFrames' not in post_interp):  # pragma: no cover
            post_interp += b' -X:Frames'
        shebang = b'#!' + executable + post_interp + b'\n'
        # The Python parser reads a script as UTF-8 until it encounters a
        # #coding:xxx cookie. The shebang has to be the first line of the
        # file, so the cookie cannot appear before it; hence the shebang
        # must be decodable from UTF-8.
        try:
            shebang.decode('utf-8')
        except UnicodeDecodeError:  # pragma: no cover
            raise ValueError(
                'The shebang (%r) is not decodable from utf-8' % shebang)
        # If the script is encoded to a custom encoding (use a
        # #coding:xxx cookie), the shebang has to be decodable from
        # the script encoding too.
        if encoding != 'utf-8':
            try:
                shebang.decode(encoding)
            except UnicodeDecodeError:  # pragma: no cover
                raise ValueError(
                    'The shebang (%r) is not decodable '
                    'from the script encoding (%r)' % (shebang, encoding))
        return shebang

    def _get_script_text(self, entry):
        return self.script_template % dict(module=entry.prefix,
                                           func=entry.suffix)

    manifest = _DEFAULT_MANIFEST

    def get_manifest(self, exename):
        base = os.path.basename(exename)
        return self.manifest % base

    def _write_script(self, names, shebang, script_bytes, filenames, ext):
        use_launcher = self.add_launchers and self._is_nt
        linesep = os.linesep.encode('utf-8')
        if not use_launcher:
            script_bytes = shebang + linesep + script_bytes
        else:  # pragma: no cover
            if ext == 'py':
                launcher = self._get_launcher('t')
            else:
                launcher = self._get_launcher('w')
            stream = BytesIO()
            with ZipFile(stream, 'w') as zf:
                zf.writestr('__main__.py', script_bytes)
            zip_data = stream.getvalue()
            script_bytes = launcher + shebang + linesep + zip_data
        for name in names:
            outname = os.path.join(self.target_dir, name)
            if use_launcher:  # pragma: no cover
                n, e = os.path.splitext(outname)
                if e.startswith('.py'):
                    outname = n
                outname = '%s.exe' % outname
                try:
                    self._fileop.write_binary_file(outname, script_bytes)
                except Exception:
                    # Failed writing an executable - it might be in use.
                    logger.warning('Failed to write executable - trying to '
                                   'use .deleteme logic')
                    dfname = '%s.deleteme' % outname
                    if os.path.exists(dfname):
                        os.remove(dfname)       # Not allowed to fail here
                    os.rename(outname, dfname)  # nor here
                    self._fileop.write_binary_file(outname, script_bytes)
                    logger.debug('Able to replace executable using '
                                 '.deleteme logic')
                    try:
                        os.remove(dfname)
                    except Exception:
                        pass    # still in use - ignore error
            else:
                if self._is_nt and not outname.endswith('.' + ext):  # pragma: no cover
                    outname = '%s.%s' % (outname, ext)
                if os.path.exists(outname) and not self.clobber:
                    logger.warning('Skipping existing file %s', outname)
                    continue
                self._fileop.write_binary_file(outname, script_bytes)
                if self.set_mode:
                    self._fileop.set_executable_mode([outname])
            filenames.append(outname)

    def _make_script(self, entry, filenames, options=None):
        post_interp = b''
        if options:
            args = options.get('interpreter_args', [])
            if args:
                args = ' %s' % ' '.join(args)
                post_interp = args.encode('utf-8')
        shebang = self._get_shebang('utf-8', post_interp, options=options)
        script = self._get_script_text(entry).encode('utf-8')
        name = entry.name
        scriptnames = set()
        if '' in self.variants:
            scriptnames.add(name)
        if 'X' in self.variants:
            scriptnames.add('%s%s' % (name, sys.version[0]))
        if 'X.Y' in self.variants:
            scriptnames.add('%s-%s' % (name, sys.version[:3]))
        if options and options.get('gui', False):
            ext = 'pyw'
        else:
            ext = 'py'
        self._write_script(scriptnames, shebang, script, filenames, ext)

    def _copy_script(self, script, filenames):
        adjust = False
        script = os.path.join(self.source_dir, convert_path(script))
        outname = os.path.join(self.target_dir, os.path.basename(script))
        if not self.force and not self._fileop.newer(script, outname):
            logger.debug('not copying %s (up-to-date)', script)
            return

        # Always open the file, but ignore failures in dry-run mode --
        # that way, we'll get accurate feedback if we can read the
        # script.
        try:
            f = open(script, 'rb')
        except IOError:  # pragma: no cover
            if not self.dry_run:
                raise
            f = None
        else:
            first_line = f.readline()
            if not first_line:  # pragma: no cover
                logger.warning('%s is an empty file (skipping)', script)
                return

            match = FIRST_LINE_RE.match(first_line.replace(b'\r\n', b'\n'))
            if match:
                adjust = True
                post_interp = match.group(1) or b''

        if not adjust:
            if f:
                f.close()
            self._fileop.copy_file(script, outname)
            if self.set_mode:
                self._fileop.set_executable_mode([outname])
            filenames.append(outname)
        else:
            logger.info('copying and adjusting %s -> %s', script,
                        self.target_dir)
            if not self._fileop.dry_run:
                encoding, lines = detect_encoding(f.readline)
                f.seek(0)
                shebang = self._get_shebang(encoding, post_interp)
                if b'pythonw' in first_line:  # pragma: no cover
                    ext = 'pyw'
                else:
                    ext = 'py'
                n = os.path.basename(outname)
                self._write_script([n], shebang, f.read(), filenames, ext)
            if f:
                f.close()

    @property
    def dry_run(self):
        return self._fileop.dry_run

    @dry_run.setter
    def dry_run(self, value):
        self._fileop.dry_run = value

    if os.name == 'nt' or (os.name == 'java' and os._name == 'nt'):  # pragma: no cover
        # Executable launcher support.
        # Launchers are from https://bitbucket.org/vinay.sajip/simple_launcher/

        def _get_launcher(self, kind):
            if struct.calcsize('P') == 8:   # 64-bit
                bits = '64'
            else:
                bits = '32'
            name = '%s%s.exe' % (kind, bits)
            # Issue 31: don't hardcode an absolute package name, but
            # determine it relative to the current package
            distlib_package = __name__.rsplit('.', 1)[0]
            result = finder(distlib_package).find(name).bytes
            return result

    # Public API follows

    def make(self, specification, options=None):
        """
        Make a script.

        :param specification: The specification, which is either a valid export
                              entry specification (to make a script from a
                              callable) or a filename (to make a script by
                              copying from a source location).
        :param options: A dictionary of options controlling script generation.
        :return: A list of all absolute pathnames written to.
        """
        filenames = []
        entry = get_export_entry(specification)
        if entry is None:
            self._copy_script(specification, filenames)
        else:
            self._make_script(entry, filenames, options=options)
        return filenames

    def make_multiple(self, specifications, options=None):
        """
        Take a list of specifications and make scripts from them.

        :param specifications: A list of specifications.
        :param options: A dictionary of options controlling script generation.
        :return: A list of all absolute pathnames written to.
        """
        filenames = []
        for specification in specifications:
            filenames.extend(self.make(specification, options))
        return filenames
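
    # Usage sketch (directories and entry specifications below are hypothetical):
    #   maker = ScriptMaker(source_dir='scripts', target_dir='/tmp/bin')
    #   maker.executable = '/usr/bin/python3'      # optional shebang override
    #   maker.make('foo = foo.cli:main')           # wrapper script from an export entry
    #   maker.make('legacy-tool')                  # copies (and adjusts) scripts/legacy-tool
    #   maker.make_multiple(['foo = foo.cli:main', 'legacy-tool'])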
site-packages/pip/_vendor/distlib/database.py
# -*- coding: utf-8 -*-
#
# Copyright (C) 2012-2016 The Python Software Foundation.
# See LICENSE.txt and CONTRIBUTORS.txt.
#
"""PEP 376 implementation."""

from __future__ import unicode_literals

import base64
import codecs
import contextlib
import hashlib
import logging
import os
import posixpath
import sys
import zipimport

from . import DistlibException, resources
from .compat import StringIO
from .version import get_scheme, UnsupportedVersionError
from .metadata import Metadata, METADATA_FILENAME, WHEEL_METADATA_FILENAME
from .util import (parse_requirement, cached_property, parse_name_and_version,
                   read_exports, write_exports, CSVReader, CSVWriter)


__all__ = ['Distribution', 'BaseInstalledDistribution',
           'InstalledDistribution', 'EggInfoDistribution',
           'DistributionPath']


logger = logging.getLogger(__name__)

EXPORTS_FILENAME = 'pydist-exports.json'
COMMANDS_FILENAME = 'pydist-commands.json'

DIST_FILES = ('INSTALLER', METADATA_FILENAME, 'RECORD', 'REQUESTED',
              'RESOURCES', EXPORTS_FILENAME, 'SHARED')

DISTINFO_EXT = '.dist-info'


class _Cache(object):
    """
    A simple cache mapping names and .dist-info paths to distributions
    """
    def __init__(self):
        """
        Initialise an instance. There is normally one for each DistributionPath.
        """
        self.name = {}
        self.path = {}
        self.generated = False

    def clear(self):
        """
        Clear the cache, setting it to its initial state.
        """
        self.name.clear()
        self.path.clear()
        self.generated = False

    def add(self, dist):
        """
        Add a distribution to the cache.
        :param dist: The distribution to add.
        """
        if dist.path not in self.path:
            self.path[dist.path] = dist
            self.name.setdefault(dist.key, []).append(dist)


class DistributionPath(object):
    """
    Represents a set of distributions installed on a path (typically sys.path).
    """
    def __init__(self, path=None, include_egg=False):
        """
        Create an instance from a path, optionally including legacy (distutils/
        setuptools/distribute) distributions.
        :param path: The path to use, as a list of directories. If not specified,
                     sys.path is used.
        :param include_egg: If True, this instance will look for and return legacy
                            distributions as well as those based on PEP 376.
        """
        if path is None:
            path = sys.path
        self.path = path
        self._include_dist = True
        self._include_egg = include_egg

        self._cache = _Cache()
        self._cache_egg = _Cache()
        self._cache_enabled = True
        self._scheme = get_scheme('default')

    def _get_cache_enabled(self):
        return self._cache_enabled

    def _set_cache_enabled(self, value):
        self._cache_enabled = value

    cache_enabled = property(_get_cache_enabled, _set_cache_enabled)

    def clear_cache(self):
        """
        Clears the internal cache.
        """
        self._cache.clear()
        self._cache_egg.clear()


    def _yield_distributions(self):
        """
        Yield .dist-info and/or .egg(-info) distributions.
        """
        # We need to check if we've seen some resources already, because on
        # some Linux systems (e.g. some Debian/Ubuntu variants) there are
        # symlinks which alias other files in the environment.
        seen = set()
        for path in self.path:
            finder = resources.finder_for_path(path)
            if finder is None:
                continue
            r = finder.find('')
            if not r or not r.is_container:
                continue
            rset = sorted(r.resources)
            for entry in rset:
                r = finder.find(entry)
                if not r or r.path in seen:
                    continue
                if self._include_dist and entry.endswith(DISTINFO_EXT):
                    possible_filenames = [METADATA_FILENAME, WHEEL_METADATA_FILENAME]
                    for metadata_filename in possible_filenames:
                        metadata_path = posixpath.join(entry, metadata_filename)
                        pydist = finder.find(metadata_path)
                        if pydist:
                            break
                    else:
                        continue

                    with contextlib.closing(pydist.as_stream()) as stream:
                        metadata = Metadata(fileobj=stream, scheme='legacy')
                    logger.debug('Found %s', r.path)
                    seen.add(r.path)
                    yield new_dist_class(r.path, metadata=metadata,
                                         env=self)
                elif self._include_egg and entry.endswith(('.egg-info',
                                                          '.egg')):
                    logger.debug('Found %s', r.path)
                    seen.add(r.path)
                    yield old_dist_class(r.path, self)

    def _generate_cache(self):
        """
        Scan the path for distributions and populate the cache with
        those that are found.
        """
        gen_dist = not self._cache.generated
        gen_egg = self._include_egg and not self._cache_egg.generated
        if gen_dist or gen_egg:
            for dist in self._yield_distributions():
                if isinstance(dist, InstalledDistribution):
                    self._cache.add(dist)
                else:
                    self._cache_egg.add(dist)

            if gen_dist:
                self._cache.generated = True
            if gen_egg:
                self._cache_egg.generated = True

    @classmethod
    def distinfo_dirname(cls, name, version):
        """
        The *name* and *version* parameters are converted into their
        filename-escaped form, i.e. any ``'-'`` characters are replaced
        with ``'_'`` other than the one in ``'dist-info'`` and the one
        separating the name from the version number.

        :parameter name: is converted to a standard distribution name by replacing
                         any runs of non-alphanumeric characters with a single
                         ``'-'``.
        :type name: string
        :parameter version: is converted to a standard version string. Spaces
                            become dots, and all other non-alphanumeric characters
                            (except dots) become dashes, with runs of multiple
                            dashes condensed to a single dash.
        :type version: string
        :returns: directory name
        :rtype: string"""
        name = name.replace('-', '_')
        return '-'.join([name, version]) + DISTINFO_EXT
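
    # Worked example (hypothetical inputs): distinfo_dirname('python-ldap', '2.5')
    # replaces '-' in the name with '_' and appends DISTINFO_EXT, giving
    # 'python_ldap-2.5.dist-info'.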

    def get_distributions(self):
        """
        Provides an iterator that looks for distributions and returns
        :class:`InstalledDistribution` or
        :class:`EggInfoDistribution` instances for each one of them.

        :rtype: iterator of :class:`InstalledDistribution` and
                :class:`EggInfoDistribution` instances
        """
        if not self._cache_enabled:
            for dist in self._yield_distributions():
                yield dist
        else:
            self._generate_cache()

            for dist in self._cache.path.values():
                yield dist

            if self._include_egg:
                for dist in self._cache_egg.path.values():
                    yield dist

    def get_distribution(self, name):
        """
        Looks for a named distribution on the path.

        This function only returns the first result found, as no more than one
        value is expected. If nothing is found, ``None`` is returned.

        :rtype: :class:`InstalledDistribution`, :class:`EggInfoDistribution`
                or ``None``
        """
        result = None
        name = name.lower()
        if not self._cache_enabled:
            for dist in self._yield_distributions():
                if dist.key == name:
                    result = dist
                    break
        else:
            self._generate_cache()

            if name in self._cache.name:
                result = self._cache.name[name][0]
            elif self._include_egg and name in self._cache_egg.name:
                result = self._cache_egg.name[name][0]
        return result
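
    # Usage sketch (the distribution name is hypothetical):
    #   dp = DistributionPath(include_egg=True)    # scans sys.path by default
    #   dist = dp.get_distribution('pip')
    #   if dist is not None:
    #       print(dist.name_and_version)
    #   for d in dp.provides_distribution('pip'):
    #       print(d)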

    def provides_distribution(self, name, version=None):
        """
        Iterates over all distributions to find which distributions provide *name*.
        If a *version* is provided, it will be used to filter the results.

        This is a generator: it yields every distribution whose ``provides``
        metadata matches *name* (and *version*, when one is given).

        :parameter version: a version specifier that indicates the version
                            required, conforming to the format in ``PEP-345``

        :type name: string
        :type version: string
        """
        matcher = None
        if version is not None:
            try:
                matcher = self._scheme.matcher('%s (%s)' % (name, version))
            except ValueError:
                raise DistlibException('invalid name or version: %r, %r' %
                                      (name, version))

        for dist in self.get_distributions():
            provided = dist.provides

            for p in provided:
                p_name, p_ver = parse_name_and_version(p)
                if matcher is None:
                    if p_name == name:
                        yield dist
                        break
                else:
                    if p_name == name and matcher.match(p_ver):
                        yield dist
                        break

    def get_file_path(self, name, relative_path):
        """
        Return the path to a resource file.
        """
        dist = self.get_distribution(name)
        if dist is None:
            raise LookupError('no distribution named %r found' % name)
        return dist.get_resource_path(relative_path)

    def get_exported_entries(self, category, name=None):
        """
        Return all of the exported entries in a particular category.

        :param category: The category to search for entries.
        :param name: If specified, only entries with that name are returned.
        """
        for dist in self.get_distributions():
            r = dist.exports
            if category in r:
                d = r[category]
                if name is not None:
                    if name in d:
                        yield d[name]
                else:
                    for v in d.values():
                        yield v


class Distribution(object):
    """
    A base class for distributions, whether installed or from indexes.
    Either way, it must have some metadata, so that's all that's needed
    for construction.
    """

    build_time_dependency = False
    """
    Set to True if it's known to be only a build-time dependency (i.e.
    not needed after installation).
    """

    requested = False
    """A boolean that indicates whether the ``REQUESTED`` metadata file is
    present (in other words, whether the package was installed by user
    request or it was installed as a dependency)."""

    def __init__(self, metadata):
        """
        Initialise an instance.
        :param metadata: The instance of :class:`Metadata` describing this
        distribution.
        """
        self.metadata = metadata
        self.name = metadata.name
        self.key = self.name.lower()    # for case-insensitive comparisons
        self.version = metadata.version
        self.locator = None
        self.digest = None
        self.extras = None      # additional features requested
        self.context = None     # environment marker overrides
        self.download_urls = set()
        self.digests = {}

    @property
    def source_url(self):
        """
        The source archive download URL for this distribution.
        """
        return self.metadata.source_url

    download_url = source_url   # Backward compatibility

    @property
    def name_and_version(self):
        """
        A utility property which displays the name and version in parentheses.
        """
        return '%s (%s)' % (self.name, self.version)

    @property
    def provides(self):
        """
        A set of distribution names and versions provided by this distribution.
        :return: A set of "name (version)" strings.
        """
        plist = self.metadata.provides
        s = '%s (%s)' % (self.name, self.version)
        if s not in plist:
            plist.append(s)
        return plist

    def _get_requirements(self, req_attr):
        md = self.metadata
        logger.debug('Getting requirements from metadata %r', md.todict())
        reqts = getattr(md, req_attr)
        return set(md.get_requirements(reqts, extras=self.extras,
                                       env=self.context))

    @property
    def run_requires(self):
        return self._get_requirements('run_requires')

    @property
    def meta_requires(self):
        return self._get_requirements('meta_requires')

    @property
    def build_requires(self):
        return self._get_requirements('build_requires')

    @property
    def test_requires(self):
        return self._get_requirements('test_requires')

    @property
    def dev_requires(self):
        return self._get_requirements('dev_requires')

    def matches_requirement(self, req):
        """
        Say if this instance matches (fulfills) a requirement.
        :param req: The requirement to match.
        :rtype req: str
        :return: True if it matches, else False.
        """
        # Requirement may contain extras - parse to lose those
        # from what's passed to the matcher
        r = parse_requirement(req)
        scheme = get_scheme(self.metadata.scheme)
        try:
            matcher = scheme.matcher(r.requirement)
        except UnsupportedVersionError:
            # XXX compat-mode if cannot read the version
            logger.warning('could not read version %r - using name only',
                           req)
            name = req.split()[0]
            matcher = scheme.matcher(name)

        name = matcher.key   # case-insensitive

        result = False
        for p in self.provides:
            p_name, p_ver = parse_name_and_version(p)
            if p_name != name:
                continue
            try:
                result = matcher.match(p_ver)
                break
            except UnsupportedVersionError:
                pass
        return result

    def __repr__(self):
        """
        Return a textual representation of this instance.
        """
        if self.source_url:
            suffix = ' [%s]' % self.source_url
        else:
            suffix = ''
        return '<Distribution %s (%s)%s>' % (self.name, self.version, suffix)

    def __eq__(self, other):
        """
        See if this distribution is the same as another.
        :param other: The distribution to compare with. To be equal to one
                      another, distributions must have the same type, name,
                      version and source_url.
        :return: True if it is the same, else False.
        """
        if type(other) is not type(self):
            result = False
        else:
            result = (self.name == other.name and
                      self.version == other.version and
                      self.source_url == other.source_url)
        return result

    def __hash__(self):
        """
        Compute hash in a way which matches the equality test.
        """
        return hash(self.name) + hash(self.version) + hash(self.source_url)


class BaseInstalledDistribution(Distribution):
    """
    This is the base class for installed distributions (whether PEP 376 or
    legacy).
    """

    hasher = None

    def __init__(self, metadata, path, env=None):
        """
        Initialise an instance.
        :param metadata: An instance of :class:`Metadata` which describes the
                         distribution. This will normally have been initialised
                         from a metadata file in the ``path``.
        :param path:     The path of the ``.dist-info`` or ``.egg-info``
                         directory for the distribution.
        :param env:      This is normally the :class:`DistributionPath`
                         instance where this distribution was found.
        """
        super(BaseInstalledDistribution, self).__init__(metadata)
        self.path = path
        self.dist_path = env

    def get_hash(self, data, hasher=None):
        """
        Get the hash of some data, using a particular hash algorithm, if
        specified.

        :param data: The data to be hashed.
        :type data: bytes
        :param hasher: The name of a hash implementation, supported by hashlib,
                       or ``None``. Examples of valid values are ``'sha1'``,
                       ``'sha224'``, ``'sha384'``, '``sha256'``, ``'md5'`` and
                       ``'sha512'``. If no hasher is specified, the ``hasher``
                       attribute of the :class:`InstalledDistribution` instance
                       is used. If the hasher is determined to be ``None``, MD5
                       is used as the hashing algorithm.
        :returns: The hash of the data. If a hasher was explicitly specified,
                  the returned hash will be prefixed with the specified hasher
                  followed by '='.
        :rtype: str
        """
        if hasher is None:
            hasher = self.hasher
        if hasher is None:
            hasher = hashlib.md5
            prefix = ''
        else:
            hasher = getattr(hashlib, hasher)
            prefix = '%s=' % self.hasher
        digest = hasher(data).digest()
        digest = base64.urlsafe_b64encode(digest).rstrip(b'=').decode('ascii')
        return '%s%s' % (prefix, digest)
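
    # Result format sketch: with no hasher set anywhere, the value is an unprefixed
    # urlsafe-base64 MD5 digest; for a subclass with ``hasher = 'sha256'`` (e.g.
    # InstalledDistribution) it looks like
    # 'sha256=<urlsafe-base64-digest-without-padding>', matching RECORD entries.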


class InstalledDistribution(BaseInstalledDistribution):
    """
    Created with the *path* of the ``.dist-info`` directory provided to the
    constructor. It reads the metadata contained in ``pydist.json`` when it is
    instantiated, or uses a passed-in Metadata instance (useful for when
    dry-run mode is being used).
    """

    hasher = 'sha256'

    def __init__(self, path, metadata=None, env=None):
        self.finder = finder = resources.finder_for_path(path)
        if finder is None:
            raise ValueError('finder unavailable for %s' % path)
        if env and env._cache_enabled and path in env._cache.path:
            metadata = env._cache.path[path].metadata
        elif metadata is None:
            r = finder.find(METADATA_FILENAME)
            # Temporary - for Wheel 0.23 support
            if r is None:
                r = finder.find(WHEEL_METADATA_FILENAME)
            # Temporary - for legacy support
            if r is None:
                r = finder.find('METADATA')
            if r is None:
                raise ValueError('no %s found in %s' % (METADATA_FILENAME,
                                                        path))
            with contextlib.closing(r.as_stream()) as stream:
                metadata = Metadata(fileobj=stream, scheme='legacy')

        super(InstalledDistribution, self).__init__(metadata, path, env)

        if env and env._cache_enabled:
            env._cache.add(self)

        r = finder.find('REQUESTED')
        self.requested = r is not None

    def __repr__(self):
        return '<InstalledDistribution %r %s at %r>' % (
            self.name, self.version, self.path)

    def __str__(self):
        return "%s %s" % (self.name, self.version)

    def _get_records(self):
        """
        Get the list of installed files for the distribution
        :return: A list of tuples of path, hash and size. Note that hash and
                 size might be ``None`` for some entries. The path is exactly
                 as stored in the file (which is as in PEP 376).
        """
        results = []
        r = self.get_distinfo_resource('RECORD')
        with contextlib.closing(r.as_stream()) as stream:
            with CSVReader(stream=stream) as record_reader:
                # Base location is parent dir of .dist-info dir
                #base_location = os.path.dirname(self.path)
                #base_location = os.path.abspath(base_location)
                for row in record_reader:
                    missing = [None for i in range(len(row), 3)]
                    path, checksum, size = row + missing
                    #if not os.path.isabs(path):
                    #    path = path.replace('/', os.sep)
                    #    path = os.path.join(base_location, path)
                    results.append((path, checksum, size))
        return results

    @cached_property
    def exports(self):
        """
        Return the information exported by this distribution.
        :return: A dictionary of exports, mapping an export category to a dict
                 of :class:`ExportEntry` instances describing the individual
                 export entries, and keyed by name.
        """
        result = {}
        r = self.get_distinfo_resource(EXPORTS_FILENAME)
        if r:
            result = self.read_exports()
        return result

    def read_exports(self):
        """
        Read exports data from a file in .ini format.

        :return: A dictionary of exports, mapping an export category to a list
                 of :class:`ExportEntry` instances describing the individual
                 export entries.
        """
        result = {}
        r = self.get_distinfo_resource(EXPORTS_FILENAME)
        if r:
            with contextlib.closing(r.as_stream()) as stream:
                result = read_exports(stream)
        return result

    def write_exports(self, exports):
        """
        Write a dictionary of exports to a file in .ini format.
        :param exports: A dictionary of exports, mapping an export category to
                        a list of :class:`ExportEntry` instances describing the
                        individual export entries.
        """
        rf = self.get_distinfo_file(EXPORTS_FILENAME)
        with open(rf, 'w') as f:
            write_exports(exports, f)

    def get_resource_path(self, relative_path):
        """
        NOTE: This API may change in the future.

        Return the absolute path to a resource file with the given relative
        path.

        :param relative_path: The path, relative to .dist-info, of the resource
                              of interest.
        :return: The absolute path where the resource is to be found.
        """
        r = self.get_distinfo_resource('RESOURCES')
        with contextlib.closing(r.as_stream()) as stream:
            with CSVReader(stream=stream) as resources_reader:
                for relative, destination in resources_reader:
                    if relative == relative_path:
                        return destination
        raise KeyError('no resource file with relative path %r '
                       'is installed' % relative_path)

    def list_installed_files(self):
        """
        Iterates over the ``RECORD`` entries and returns a tuple
        ``(path, hash, size)`` for each line.

        :returns: iterator of (path, hash, size)
        """
        for result in self._get_records():
            yield result

    def write_installed_files(self, paths, prefix, dry_run=False):
        """
        Writes the ``RECORD`` file, using the ``paths`` iterable passed in. Any
        existing ``RECORD`` file is silently overwritten.

        prefix is used to determine when to write absolute paths.
        """
        prefix = os.path.join(prefix, '')
        base = os.path.dirname(self.path)
        base_under_prefix = base.startswith(prefix)
        base = os.path.join(base, '')
        record_path = self.get_distinfo_file('RECORD')
        logger.info('creating %s', record_path)
        if dry_run:
            return None
        with CSVWriter(record_path) as writer:
            for path in paths:
                if os.path.isdir(path) or path.endswith(('.pyc', '.pyo')):
                    # do not put size and hash, as in PEP-376
                    hash_value = size = ''
                else:
                    size = '%d' % os.path.getsize(path)
                    with open(path, 'rb') as fp:
                        hash_value = self.get_hash(fp.read())
                if path.startswith(base) or (base_under_prefix and
                                             path.startswith(prefix)):
                    path = os.path.relpath(path, base)
                writer.writerow((path, hash_value, size))

            # add the RECORD file itself
            if record_path.startswith(base):
                record_path = os.path.relpath(record_path, base)
            writer.writerow((record_path, '', ''))
        return record_path

    def check_installed_files(self):
        """
        Checks that the hashes and sizes of the files in ``RECORD`` are
        matched by the files themselves. Returns a (possibly empty) list of
        mismatches. Each entry in the mismatch list will be a tuple consisting
        of the path, 'exists', 'size' or 'hash' according to what didn't match
        (existence is checked first, then size, then hash), the expected
        value and the actual value.
        """
        mismatches = []
        base = os.path.dirname(self.path)
        record_path = self.get_distinfo_file('RECORD')
        for path, hash_value, size in self.list_installed_files():
            if not os.path.isabs(path):
                path = os.path.join(base, path)
            if path == record_path:
                continue
            if not os.path.exists(path):
                mismatches.append((path, 'exists', True, False))
            elif os.path.isfile(path):
                actual_size = str(os.path.getsize(path))
                if size and actual_size != size:
                    mismatches.append((path, 'size', size, actual_size))
                elif hash_value:
                    if '=' in hash_value:
                        hasher = hash_value.split('=', 1)[0]
                    else:
                        hasher = None

                    with open(path, 'rb') as f:
                        actual_hash = self.get_hash(f.read(), hasher)
                        if actual_hash != hash_value:
                            mismatches.append((path, 'hash', hash_value, actual_hash))
        return mismatches

    @cached_property
    def shared_locations(self):
        """
        A dictionary of shared locations whose keys are in the set 'prefix',
        'purelib', 'platlib', 'scripts', 'headers', 'data' and 'namespace'.
        The corresponding value is the absolute path of that category for
        this distribution, and takes into account any paths selected by the
        user at installation time (e.g. via command-line arguments). In the
        case of the 'namespace' key, this would be a list of absolute paths
        for the roots of namespace packages in this distribution.

        The first time this property is accessed, the relevant information is
        read from the SHARED file in the .dist-info directory.
        """
        result = {}
        shared_path = os.path.join(self.path, 'SHARED')
        if os.path.isfile(shared_path):
            with codecs.open(shared_path, 'r', encoding='utf-8') as f:
                lines = f.read().splitlines()
            for line in lines:
                key, value = line.split('=', 1)
                if key == 'namespace':
                    result.setdefault(key, []).append(value)
                else:
                    result[key] = value
        return result

    def write_shared_locations(self, paths, dry_run=False):
        """
        Write shared location information to the SHARED file in .dist-info.
        :param paths: A dictionary as described in the documentation for
        :meth:`shared_locations`.
        :param dry_run: If True, the action is logged but no file is actually
                        written.
        :return: The path of the file written to.
        """
        shared_path = os.path.join(self.path, 'SHARED')
        logger.info('creating %s', shared_path)
        if dry_run:
            return None
        lines = []
        for key in ('prefix', 'lib', 'headers', 'scripts', 'data'):
            path = paths[key]
            if os.path.isdir(paths[key]):
                lines.append('%s=%s' % (key,  path))
        for ns in paths.get('namespace', ()):
            lines.append('namespace=%s' % ns)

        with codecs.open(shared_path, 'w', encoding='utf-8') as f:
            f.write('\n'.join(lines))
        return shared_path

    def get_distinfo_resource(self, path):
        if path not in DIST_FILES:
            raise DistlibException('invalid path for a dist-info file: '
                                   '%r at %r' % (path, self.path))
        finder = resources.finder_for_path(self.path)
        if finder is None:
            raise DistlibException('Unable to get a finder for %s' % self.path)
        return finder.find(path)

    def get_distinfo_file(self, path):
        """
        Returns a path located under the ``.dist-info`` directory. Returns a
        string representing the path.

        :parameter path: a ``'/'``-separated path relative to the
                         ``.dist-info`` directory or an absolute path;
                         If *path* is an absolute path and doesn't start
                         with the ``.dist-info`` directory path,
                         a :class:`DistlibException` is raised
        :type path: str
        :rtype: str
        """
        # Check if it is an absolute path  # XXX use relpath, add tests
        if path.find(os.sep) >= 0:
            # it's an absolute path?
            distinfo_dirname, path = path.split(os.sep)[-2:]
            if distinfo_dirname != self.path.split(os.sep)[-1]:
                raise DistlibException(
                    'dist-info file %r does not belong to the %r %s '
                    'distribution' % (path, self.name, self.version))

        # The file must be relative
        if path not in DIST_FILES:
            raise DistlibException('invalid path for a dist-info file: '
                                   '%r at %r' % (path, self.path))

        return os.path.join(self.path, path)

    def list_distinfo_files(self):
        """
        Iterates over the ``RECORD`` entries and returns paths for each line if
        the path is pointing to a file located in the ``.dist-info`` directory
        or one of its subdirectories.

        :returns: iterator of paths
        """
        base = os.path.dirname(self.path)
        for path, checksum, size in self._get_records():
            # XXX add separator or use real relpath algo
            if not os.path.isabs(path):
                path = os.path.join(base, path)
            if path.startswith(self.path):
                yield path

    def __eq__(self, other):
        return (isinstance(other, InstalledDistribution) and
                self.path == other.path)

    # See http://docs.python.org/reference/datamodel#object.__hash__
    __hash__ = object.__hash__


class EggInfoDistribution(BaseInstalledDistribution):
    """Created with the *path* of the ``.egg-info`` directory or file provided
    to the constructor. It reads the metadata contained in the file itself, or
    if the given path happens to be a directory, the metadata is read from the
    file ``PKG-INFO`` under that directory."""

    requested = True    # as we have no way of knowing, assume it was
    shared_locations = {}

    def __init__(self, path, env=None):
        def set_name_and_version(s, n, v):
            s.name = n
            s.key = n.lower()   # for case-insensitive comparisons
            s.version = v

        self.path = path
        self.dist_path = env
        if env and env._cache_enabled and path in env._cache_egg.path:
            metadata = env._cache_egg.path[path].metadata
            set_name_and_version(self, metadata.name, metadata.version)
        else:
            metadata = self._get_metadata(path)

            # Need to be set before caching
            set_name_and_version(self, metadata.name, metadata.version)

            if env and env._cache_enabled:
                env._cache_egg.add(self)
        super(EggInfoDistribution, self).__init__(metadata, path, env)

    def _get_metadata(self, path):
        requires = None

        def parse_requires_data(data):
            """Create a list of dependencies from a requires.txt file.

            *data*: the contents of a setuptools-produced requires.txt file.
            """
            reqs = []
            lines = data.splitlines()
            for line in lines:
                line = line.strip()
                if line.startswith('['):
                    logger.warning('Unexpected line: quitting requirement scan: %r',
                                   line)
                    break
                r = parse_requirement(line)
                if not r:
                    logger.warning('Not recognised as a requirement: %r', line)
                    continue
                if r.extras:
                    logger.warning('extra requirements in requires.txt are '
                                   'not supported')
                if not r.constraints:
                    reqs.append(r.name)
                else:
                    cons = ', '.join('%s%s' % c for c in r.constraints)
                    reqs.append('%s (%s)' % (r.name, cons))
            return reqs

        def parse_requires_path(req_path):
            """Create a list of dependencies from a requires.txt file.

            *req_path*: the path to a setuptools-produced requires.txt file.
            """

            reqs = []
            try:
                with codecs.open(req_path, 'r', 'utf-8') as fp:
                    reqs = parse_requires_data(fp.read())
            except IOError:
                pass
            return reqs

        if path.endswith('.egg'):
            if os.path.isdir(path):
                meta_path = os.path.join(path, 'EGG-INFO', 'PKG-INFO')
                metadata = Metadata(path=meta_path, scheme='legacy')
                req_path = os.path.join(path, 'EGG-INFO', 'requires.txt')
                requires = parse_requires_path(req_path)
            else:
                # FIXME handle the case where zipfile is not available
                zipf = zipimport.zipimporter(path)
                fileobj = StringIO(
                    zipf.get_data('EGG-INFO/PKG-INFO').decode('utf8'))
                metadata = Metadata(fileobj=fileobj, scheme='legacy')
                try:
                    data = zipf.get_data('EGG-INFO/requires.txt')
                    requires = parse_requires_data(data.decode('utf-8'))
                except IOError:
                    requires = None
        elif path.endswith('.egg-info'):
            if os.path.isdir(path):
                req_path = os.path.join(path, 'requires.txt')
                requires = parse_requires_path(req_path)
                path = os.path.join(path, 'PKG-INFO')
            metadata = Metadata(path=path, scheme='legacy')
        else:
            raise DistlibException('path must end with .egg-info or .egg, '
                                   'got %r' % path)

        if requires:
            metadata.add_requirements(requires)
        return metadata
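
    # Sketch of the requires.txt handling above (file contents are hypothetical):
    # a requires.txt containing
    #     requests >= 2.0
    #     six
    #     [security]
    #     pyOpenSSL
    # yields roughly ['requests (>=2.0)', 'six']; scanning stops at the first
    # '[extra]' section header, and the list is attached via add_requirements().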

    def __repr__(self):
        return '<EggInfoDistribution %r %s at %r>' % (
            self.name, self.version, self.path)

    def __str__(self):
        return "%s %s" % (self.name, self.version)

    def check_installed_files(self):
        """
        Checks that the hashes and sizes of the files in ``RECORD`` are
        matched by the files themselves. Returns a (possibly empty) list of
        mismatches. Each entry in the mismatch list will be a tuple consisting
        of the path, 'exists', 'size' or 'hash' according to what didn't match
        (existence is checked first, then size, then hash), the expected
        value and the actual value.
        """
        mismatches = []
        record_path = os.path.join(self.path, 'installed-files.txt')
        if os.path.exists(record_path):
            for path, _, _ in self.list_installed_files():
                if path == record_path:
                    continue
                if not os.path.exists(path):
                    mismatches.append((path, 'exists', True, False))
        return mismatches

    def list_installed_files(self):
        """
        Iterates over the ``installed-files.txt`` entries and returns a tuple
        ``(path, hash, size)`` for each line.

        :returns: a list of (path, hash, size)
        """

        def _md5(path):
            f = open(path, 'rb')
            try:
                content = f.read()
            finally:
                f.close()
            return hashlib.md5(content).hexdigest()

        def _size(path):
            return os.stat(path).st_size

        record_path = os.path.join(self.path, 'installed-files.txt')
        result = []
        if os.path.exists(record_path):
            with codecs.open(record_path, 'r', encoding='utf-8') as f:
                for line in f:
                    line = line.strip()
                    p = os.path.normpath(os.path.join(self.path, line))
                    # "./" is present as a marker between installed files
                    # and installation metadata files
                    if not os.path.exists(p):
                        logger.warning('Non-existent file: %s', p)
                        if p.endswith(('.pyc', '.pyo')):
                            continue
                        #otherwise fall through and fail
                    if not os.path.isdir(p):
                        result.append((p, _md5(p), _size(p)))
            result.append((record_path, None, None))
        return result

    def list_distinfo_files(self, absolute=False):
        """
        Iterates over the ``installed-files.txt`` entries and returns paths for
        each line if the path is pointing to a file located in the
        ``.egg-info`` directory or one of its subdirectories.

        :parameter absolute: If *absolute* is ``True``, each returned path is
                          transformed into a local absolute path. Otherwise the
                          raw value from ``installed-files.txt`` is returned.
        :type absolute: boolean
        :returns: iterator of paths
        """
        record_path = os.path.join(self.path, 'installed-files.txt')
        skip = True
        with codecs.open(record_path, 'r', encoding='utf-8') as f:
            for line in f:
                line = line.strip()
                if line == './':
                    skip = False
                    continue
                if not skip:
                    p = os.path.normpath(os.path.join(self.path, line))
                    if p.startswith(self.path):
                        if absolute:
                            yield p
                        else:
                            yield line

    def __eq__(self, other):
        return (isinstance(other, EggInfoDistribution) and
                self.path == other.path)

    # See http://docs.python.org/reference/datamodel#object.__hash__
    __hash__ = object.__hash__

new_dist_class = InstalledDistribution
old_dist_class = EggInfoDistribution


class DependencyGraph(object):
    """
    Represents a dependency graph between distributions.

    The dependency relationships are stored in an ``adjacency_list`` that maps
    distributions to a list of ``(other, label)`` tuples where  ``other``
    is a distribution and the edge is labeled with ``label`` (i.e. the version
    specifier, if such was provided). Also, for more efficient traversal, for
    every distribution ``x``, a list of predecessors is kept in
    ``reverse_list[x]``. An edge from distribution ``a`` to
    distribution ``b`` means that ``a`` depends on ``b``. If any missing
    dependencies are found, they are stored in ``missing``, which is a
    dictionary that maps distributions to a list of requirements that were not
    provided by any other distributions.
    """

    def __init__(self):
        self.adjacency_list = {}
        self.reverse_list = {}
        self.missing = {}

    def add_distribution(self, distribution):
        """Add the *distribution* to the graph.

        :type distribution: :class:`distutils2.database.InstalledDistribution`
                            or :class:`distutils2.database.EggInfoDistribution`
        """
        self.adjacency_list[distribution] = []
        self.reverse_list[distribution] = []
        #self.missing[distribution] = []

    def add_edge(self, x, y, label=None):
        """Add an edge from distribution *x* to distribution *y* with the given
        *label*.

        :type x: :class:`distutils2.database.InstalledDistribution` or
                 :class:`distutils2.database.EggInfoDistribution`
        :type y: :class:`distutils2.database.InstalledDistribution` or
                 :class:`distutils2.database.EggInfoDistribution`
        :type label: ``str`` or ``None``
        """
        self.adjacency_list[x].append((y, label))
        # multiple edges are allowed, so be careful
        if x not in self.reverse_list[y]:
            self.reverse_list[y].append(x)

    def add_missing(self, distribution, requirement):
        """
        Add a missing *requirement* for the given *distribution*.

        :type distribution: :class:`distutils2.database.InstalledDistribution`
                            or :class:`distutils2.database.EggInfoDistribution`
        :type requirement: ``str``
        """
        logger.debug('%s missing %r', distribution, requirement)
        self.missing.setdefault(distribution, []).append(requirement)

    def _repr_dist(self, dist):
        return '%s %s' % (dist.name, dist.version)

    def repr_node(self, dist, level=1):
        """Prints only a subgraph"""
        output = [self._repr_dist(dist)]
        for other, label in self.adjacency_list[dist]:
            dist = self._repr_dist(other)
            if label is not None:
                dist = '%s [%s]' % (dist, label)
            output.append('    ' * level + str(dist))
            suboutput = self.repr_node(other, level + 1)
            subs = suboutput.split('\n')
            output.extend(subs[1:])
        return '\n'.join(output)

    def to_dot(self, f, skip_disconnected=True):
        """Writes a DOT output for the graph to the provided file *f*.

        If *skip_disconnected* is set to ``True``, then all distributions
        that are not dependent on any other distribution are skipped.

        :type f: has to support ``file``-like operations
        :type skip_disconnected: ``bool``
        """
        disconnected = []

        f.write("digraph dependencies {\n")
        for dist, adjs in self.adjacency_list.items():
            if len(adjs) == 0 and not skip_disconnected:
                disconnected.append(dist)
            for other, label in adjs:
                if label is not None:
                    f.write('"%s" -> "%s" [label="%s"]\n' %
                            (dist.name, other.name, label))
                else:
                    f.write('"%s" -> "%s"\n' % (dist.name, other.name))
        if not skip_disconnected and len(disconnected) > 0:
            f.write('subgraph disconnected {\n')
            f.write('label = "Disconnected"\n')
            f.write('bgcolor = red\n')

            for dist in disconnected:
                f.write('"%s"' % dist.name)
                f.write('\n')
            f.write('}\n')
        f.write('}\n')

    def topological_sort(self):
        """
        Perform a topological sort of the graph.
        :return: A tuple, the first element of which is a topologically sorted
                 list of distributions, and the second element of which is a
                 list of distributions that cannot be sorted because they have
                 circular dependencies and so form a cycle.
        """
        result = []
        # Make a shallow copy of the adjacency list
        alist = {}
        for k, v in self.adjacency_list.items():
            alist[k] = v[:]
        while True:
            # See what we can remove in this run
            to_remove = []
            for k, v in list(alist.items()):
                if not v:
                    to_remove.append(k)
                    del alist[k]
            if not to_remove:
                # What's left in alist (if anything) is a cycle.
                break
            # Remove from the adjacency list of others
            for k, v in alist.items():
                alist[k] = [(d, r) for d, r in v if d not in to_remove]
            logger.debug('Moving to result: %s',
                         ['%s (%s)' % (d.name, d.version) for d in to_remove])
            result.extend(to_remove)
        return result, list(alist.keys())

    def __repr__(self):
        """Representation of the graph"""
        output = []
        for dist, adjs in self.adjacency_list.items():
            output.append(self.repr_node(dist))
        return '\n'.join(output)
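

# A minimal illustrative sketch (not part of the distlib API): build a tiny
# DependencyGraph by hand and topologically sort it. The distribution names
# and versions below are made up, and make_dist() from later in this module
# is used to create placeholder Distribution objects.
def _example_dependency_graph():  # pragma: no cover
    a = make_dist('A', '1.0')
    b = make_dist('B', '2.0')
    graph = DependencyGraph()
    graph.add_distribution(a)
    graph.add_distribution(b)
    graph.add_edge(a, b, 'B (>= 2.0)')    # A depends on B
    ordered, cyclic = graph.topological_sort()
    # B has no outgoing edges, so it sorts before A; nothing is cyclic here
    assert [d.name for d in ordered] == ['B', 'A'] and not cyclic
    return ordered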


def make_graph(dists, scheme='default'):
    """Makes a dependency graph from the given distributions.

    :parameter dists: a list of distributions
    :type dists: list of :class:`distutils2.database.InstalledDistribution` and
                 :class:`distutils2.database.EggInfoDistribution` instances
    :rtype: a :class:`DependencyGraph` instance
    """
    scheme = get_scheme(scheme)
    graph = DependencyGraph()
    provided = {}  # maps names to lists of (version, dist) tuples

    # first, build the graph and find out what's provided
    for dist in dists:
        graph.add_distribution(dist)

        for p in dist.provides:
            name, version = parse_name_and_version(p)
            logger.debug('Add to provided: %s, %s, %s', name, version, dist)
            provided.setdefault(name, []).append((version, dist))

    # now make the edges
    for dist in dists:
        requires = (dist.run_requires | dist.meta_requires |
                    dist.build_requires | dist.dev_requires)
        for req in requires:
            try:
                matcher = scheme.matcher(req)
            except UnsupportedVersionError:
                # XXX compat-mode if cannot read the version
                logger.warning('could not read version %r - using name only',
                               req)
                name = req.split()[0]
                matcher = scheme.matcher(name)

            name = matcher.key   # case-insensitive

            matched = False
            if name in provided:
                for version, provider in provided[name]:
                    try:
                        match = matcher.match(version)
                    except UnsupportedVersionError:
                        match = False

                    if match:
                        graph.add_edge(dist, provider, req)
                        matched = True
                        break
            if not matched:
                graph.add_missing(dist, req)
    return graph
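

# Illustrative sketch only (assumes DistributionPath, defined earlier in this
# module, is available): build a dependency graph for everything currently
# installed and log any requirements that no installed distribution provides.
def _example_make_graph():  # pragma: no cover
    dists = list(DistributionPath().get_distributions())
    graph = make_graph(dists)
    for dist, reqs in graph.missing.items():
        logger.warning('%s has unsatisfied requirements: %s',
                       dist.name, ', '.join(reqs))
    return graph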


def get_dependent_dists(dists, dist):
    """Recursively generate a list of distributions from *dists* that are
    dependent on *dist*.

    :param dists: a list of distributions
    :param dist: a distribution, member of *dists* for which we are interested
    """
    if dist not in dists:
        raise DistlibException('given distribution %r is not a member '
                               'of the list' % dist.name)
    graph = make_graph(dists)

    dep = [dist]  # dependent distributions
    todo = graph.reverse_list[dist]  # list of nodes we should inspect

    while todo:
        d = todo.pop()
        dep.append(d)
        for succ in graph.reverse_list[d]:
            if succ not in dep:
                todo.append(succ)

    dep.pop(0)  # remove dist from dep, was there to prevent infinite loops
    return dep


def get_required_dists(dists, dist):
    """Recursively generate a list of distributions from *dists* that are
    required by *dist*.

    :param dists: a list of distributions
    :param dist: a distribution, member of *dists* for which we are interested
    """
    if dist not in dists:
        raise DistlibException('given distribution %r is not a member '
                               'of the list' % dist.name)
    graph = make_graph(dists)

    req = []  # required distributions
    todo = graph.adjacency_list[dist]  # list of nodes we should inspect

    while todo:
        d = todo.pop()[0]
        if d in req:
            # already collected (can happen with shared or circular deps)
            continue
        req.append(d)
        for pred in graph.adjacency_list[d]:
            # adjacency_list entries are (dist, label) tuples, so compare the
            # distribution itself with what has already been collected
            if pred[0] not in req:
                todo.append(pred)

    return req


def make_dist(name, version, **kwargs):
    """
    A convenience function for making a dist given just a name and version.
    """
    summary = kwargs.pop('summary', 'Placeholder for summary')
    md = Metadata(**kwargs)
    md.name = name
    md.version = version
    md.summary = summary or 'Placeholder for summary'
    return Distribution(md)
site-packages/pip/_vendor/distlib/util.py
#
# Copyright (C) 2012-2016 The Python Software Foundation.
# See LICENSE.txt and CONTRIBUTORS.txt.
#
import codecs
from collections import deque
import contextlib
import csv
from glob import iglob as std_iglob
import io
import json
import logging
import os
import py_compile
import re
import shutil
import socket
try:
    import ssl
except ImportError:  # pragma: no cover
    ssl = None
import subprocess
import sys
import tarfile
import tempfile
import textwrap

try:
    import threading
except ImportError:  # pragma: no cover
    import dummy_threading as threading
import time

from . import DistlibException
from .compat import (string_types, text_type, shutil, raw_input, StringIO,
                     cache_from_source, urlopen, urljoin, httplib, xmlrpclib,
                     splittype, HTTPHandler, BaseConfigurator, valid_ident,
                     Container, configparser, URLError, ZipFile, fsdecode,
                     unquote)

logger = logging.getLogger(__name__)

#
# Requirement parsing code for name + optional constraints + optional extras
#
# e.g. 'foo >= 1.2, < 2.0 [bar, baz]'
#
# The regex can seem a bit hairy, so we build it up out of smaller pieces
# which are manageable.
#

COMMA = r'\s*,\s*'
COMMA_RE = re.compile(COMMA)

IDENT = r'(\w|[.-])+'
EXTRA_IDENT = r'(\*|:(\*|\w+):|' + IDENT + ')'
VERSPEC = IDENT + r'\*?'

RELOP = '([<>=!~]=)|[<>]'

#
# The first relop is optional - if absent, will be taken as '~='
#
BARE_CONSTRAINTS = ('(' + RELOP + r')?\s*(' + VERSPEC + ')(' + COMMA + '(' +
                    RELOP + r')\s*(' + VERSPEC + '))*')

DIRECT_REF = r'(from\s+(?P<diref>.*))'

#
# Either the bare constraints or the bare constraints in parentheses
#
CONSTRAINTS = (r'\(\s*(?P<c1>' + BARE_CONSTRAINTS + '|' + DIRECT_REF +
               r')\s*\)|(?P<c2>' + BARE_CONSTRAINTS + r'\s*)')

EXTRA_LIST = EXTRA_IDENT + '(' + COMMA + EXTRA_IDENT + ')*'
EXTRAS = r'\[\s*(?P<ex>' + EXTRA_LIST + r')?\s*\]'
REQUIREMENT = ('(?P<dn>'  + IDENT + r')\s*(' + EXTRAS + r'\s*)?(\s*' +
               CONSTRAINTS + ')?$')
REQUIREMENT_RE = re.compile(REQUIREMENT)

#
# Used to scan through the constraints
#
RELOP_IDENT = '(?P<op>' + RELOP + r')\s*(?P<vn>' + VERSPEC + ')'
RELOP_IDENT_RE = re.compile(RELOP_IDENT)

def parse_requirement(s):

    def get_constraint(m):
        d = m.groupdict()
        return d['op'], d['vn']

    result = None
    m = REQUIREMENT_RE.match(s)
    if m:
        d = m.groupdict()
        name = d['dn']
        cons = d['c1'] or d['c2']
        if not d['diref']:
            url = None
        else:
            # direct reference
            cons = None
            url = d['diref'].strip()
        if not cons:
            cons = None
            constr = ''
            rs = d['dn']
        else:
            if cons[0] not in '<>!=':
                cons = '~=' + cons
            iterator = RELOP_IDENT_RE.finditer(cons)
            cons = [get_constraint(m) for m in iterator]
            rs = '%s (%s)' % (name, ', '.join(['%s %s' % con for con in cons]))
        if not d['ex']:
            extras = None
        else:
            extras = COMMA_RE.split(d['ex'])
        result = Container(name=name, constraints=cons, extras=extras,
                           requirement=rs, source=s, url=url)
    return result
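

# Illustrative sketch (not part of the public API): parse a requirement
# string and inspect the resulting Container attributes.
def _example_parse_requirement():  # pragma: no cover
    r = parse_requirement('foo >= 1.2, < 2.0')
    # r.name == 'foo'
    # r.constraints == [('>=', '1.2'), ('<', '2.0')]
    # r.requirement == 'foo (>= 1.2, < 2.0)'
    # r.extras is None and r.url is None for this input
    return r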


def get_resources_dests(resources_root, rules):
    """Find destinations for resources files"""

    def get_rel_path(base, path):
        # normalize and return a '/'-separated path relative to base,
        # with any leading '/' stripped
        base = base.replace(os.path.sep, '/')
        path = path.replace(os.path.sep, '/')
        assert path.startswith(base)
        return path[len(base):].lstrip('/')


    destinations = {}
    for base, suffix, dest in rules:
        prefix = os.path.join(resources_root, base)
        for abs_base in iglob(prefix):
            abs_glob = os.path.join(abs_base, suffix)
            for abs_path in iglob(abs_glob):
                resource_file = get_rel_path(resources_root, abs_path)
                if dest is None:  # remove the entry if it was here
                    destinations.pop(resource_file, None)
                else:
                    rel_path = get_rel_path(abs_base, abs_path)
                    rel_dest = dest.replace(os.path.sep, '/').rstrip('/')
                    destinations[resource_file] = rel_dest + '/' + rel_path
    return destinations


def in_venv():
    if hasattr(sys, 'real_prefix'):
        # virtualenv venvs
        result = True
    else:
        # PEP 405 venvs
        result = sys.prefix != getattr(sys, 'base_prefix', sys.prefix)
    return result


def get_executable():
# The __PYVENV_LAUNCHER__ dance is apparently no longer needed, as
# changes to the stub launcher mean that sys.executable always points
# to the stub on macOS
#    if sys.platform == 'darwin' and ('__PYVENV_LAUNCHER__'
#                                     in os.environ):
#        result =  os.environ['__PYVENV_LAUNCHER__']
#    else:
#        result = sys.executable
#    return result
    result = os.path.normcase(sys.executable)
    if not isinstance(result, text_type):
        result = fsdecode(result)
    return result


def proceed(prompt, allowed_chars, error_prompt=None, default=None):
    p = prompt
    while True:
        s = raw_input(p)
        p = prompt
        if not s and default:
            s = default
        if s:
            c = s[0].lower()
            if c in allowed_chars:
                break
            if error_prompt:
                p = '%c: %s\n%s' % (c, error_prompt, prompt)
    return c


def extract_by_key(d, keys):
    if isinstance(keys, string_types):
        keys = keys.split()
    result = {}
    for key in keys:
        if key in d:
            result[key] = d[key]
    return result
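
# For example, extract_by_key({'a': 1, 'b': 2, 'c': 3}, 'a c') returns
# {'a': 1, 'c': 3}; ``keys`` may be a whitespace-separated string or any
# iterable of keys.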

def read_exports(stream):
    if sys.version_info[0] >= 3:
        # needs to be a text stream
        stream = codecs.getreader('utf-8')(stream)
    # Try to load as JSON, falling back on legacy format
    data = stream.read()
    stream = StringIO(data)
    try:
        jdata = json.load(stream)
        result = jdata['extensions']['python.exports']['exports']
        for group, entries in result.items():
            for k, v in entries.items():
                s = '%s = %s' % (k, v)
                entry = get_export_entry(s)
                assert entry is not None
                entries[k] = entry
        return result
    except Exception:
        stream.seek(0, 0)

    def read_stream(cp, stream):
        if hasattr(cp, 'read_file'):
            cp.read_file(stream)
        else:
            cp.readfp(stream)

    cp = configparser.ConfigParser()
    try:
        read_stream(cp, stream)
    except configparser.MissingSectionHeaderError:
        stream.close()
        data = textwrap.dedent(data)
        stream = StringIO(data)
        read_stream(cp, stream)

    result = {}
    for key in cp.sections():
        result[key] = entries = {}
        for name, value in cp.items(key):
            s = '%s = %s' % (name, value)
            entry = get_export_entry(s)
            assert entry is not None
            #entry.dist = self
            entries[name] = entry
    return result


def write_exports(exports, stream):
    if sys.version_info[0] >= 3:
        # needs to be a text stream
        stream = codecs.getwriter('utf-8')(stream)
    cp = configparser.ConfigParser()
    for k, v in exports.items():
        # TODO check k, v for valid values
        cp.add_section(k)
        for entry in v.values():
            if entry.suffix is None:
                s = entry.prefix
            else:
                s = '%s:%s' % (entry.prefix, entry.suffix)
            if entry.flags:
                s = '%s [%s]' % (s, ', '.join(entry.flags))
            cp.set(k, entry.name, s)
    cp.write(stream)
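

# Illustrative sketch (the group and entry names are made up): build an
# exports mapping with get_export_entry() and serialise it with
# write_exports(). A bytes buffer is used because write_exports() wraps the
# stream in a UTF-8 writer on Python 3.
def _example_write_exports():  # pragma: no cover
    entry = get_export_entry('hello = mypkg.cli:main')
    exports = {'console_scripts': {entry.name: entry}}
    buf = io.BytesIO()
    write_exports(exports, buf)
    return buf.getvalue()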


@contextlib.contextmanager
def tempdir():
    td = tempfile.mkdtemp()
    try:
        yield td
    finally:
        shutil.rmtree(td)

@contextlib.contextmanager
def chdir(d):
    cwd = os.getcwd()
    try:
        os.chdir(d)
        yield
    finally:
        os.chdir(cwd)


@contextlib.contextmanager
def socket_timeout(seconds=15):
    cto = socket.getdefaulttimeout()
    try:
        socket.setdefaulttimeout(seconds)
        yield
    finally:
        socket.setdefaulttimeout(cto)


class cached_property(object):
    def __init__(self, func):
        self.func = func
        #for attr in ('__name__', '__module__', '__doc__'):
        #    setattr(self, attr, getattr(func, attr, None))

    def __get__(self, obj, cls=None):
        if obj is None:
            return self
        value = self.func(obj)
        object.__setattr__(obj, self.func.__name__, value)
        #obj.__dict__[self.func.__name__] = value = self.func(obj)
        return value
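

# A minimal sketch of how cached_property behaves: the decorated method runs
# once per instance, after which the computed value replaces the (non-data)
# descriptor on the instance, so later accesses are plain attribute lookups.
# The class below is purely illustrative.
class _CachedPropertyExample(object):  # pragma: no cover
    @cached_property
    def expensive(self):
        # imagine a costly computation here; it only ever runs once
        return sum(range(1000))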

def convert_path(pathname):
    """Return 'pathname' as a name that will work on the native filesystem.

    The path is split on '/' and put back together again using the current
    directory separator.  Needed because filenames in the setup script are
    always supplied in Unix style, and have to be converted to the local
    convention before we can actually use them in the filesystem.  Raises
    ValueError on non-Unix-ish systems if 'pathname' either starts or
    ends with a slash.
    """
    if os.sep == '/':
        return pathname
    if not pathname:
        return pathname
    if pathname[0] == '/':
        raise ValueError("path '%s' cannot be absolute" % pathname)
    if pathname[-1] == '/':
        raise ValueError("path '%s' cannot end with '/'" % pathname)

    paths = pathname.split('/')
    while os.curdir in paths:
        paths.remove(os.curdir)
    if not paths:
        return os.curdir
    return os.path.join(*paths)


class FileOperator(object):
    def __init__(self, dry_run=False):
        self.dry_run = dry_run
        self.ensured = set()
        self._init_record()

    def _init_record(self):
        self.record = False
        self.files_written = set()
        self.dirs_created = set()

    def record_as_written(self, path):
        if self.record:
            self.files_written.add(path)

    def newer(self, source, target):
        """Tell if the target is newer than the source.

        Returns true if 'source' exists and is more recently modified than
        'target', or if 'source' exists and 'target' doesn't.

        Returns false if both exist and 'target' is the same age or younger
        than 'source'. Raises DistlibException if 'source' does not exist.

        Note that this test is not very accurate: files created in the same
        second will have the same "age".
        """
        if not os.path.exists(source):
            raise DistlibException("file '%r' does not exist" %
                                   os.path.abspath(source))
        if not os.path.exists(target):
            return True

        return os.stat(source).st_mtime > os.stat(target).st_mtime

    def copy_file(self, infile, outfile, check=True):
        """Copy a file respecting dry-run and force flags.
        """
        self.ensure_dir(os.path.dirname(outfile))
        logger.info('Copying %s to %s', infile, outfile)
        if not self.dry_run:
            msg = None
            if check:
                if os.path.islink(outfile):
                    msg = '%s is a symlink' % outfile
                elif os.path.exists(outfile) and not os.path.isfile(outfile):
                    msg = '%s is a non-regular file' % outfile
            if msg:
                raise ValueError(msg + ' which would be overwritten')
            shutil.copyfile(infile, outfile)
        self.record_as_written(outfile)

    def copy_stream(self, instream, outfile, encoding=None):
        assert not os.path.isdir(outfile)
        self.ensure_dir(os.path.dirname(outfile))
        logger.info('Copying stream %s to %s', instream, outfile)
        if not self.dry_run:
            if encoding is None:
                outstream = open(outfile, 'wb')
            else:
                outstream = codecs.open(outfile, 'w', encoding=encoding)
            try:
                shutil.copyfileobj(instream, outstream)
            finally:
                outstream.close()
        self.record_as_written(outfile)

    def write_binary_file(self, path, data):
        self.ensure_dir(os.path.dirname(path))
        if not self.dry_run:
            with open(path, 'wb') as f:
                f.write(data)
        self.record_as_written(path)

    def write_text_file(self, path, data, encoding):
        self.ensure_dir(os.path.dirname(path))
        if not self.dry_run:
            with open(path, 'wb') as f:
                f.write(data.encode(encoding))
        self.record_as_written(path)

    def set_mode(self, bits, mask, files):
        if os.name == 'posix' or (os.name == 'java' and os._name == 'posix'):
            # Set the executable bits (owner, group, and world) on
            # all the files specified.
            for f in files:
                if self.dry_run:
                    logger.info("changing mode of %s", f)
                else:
                    mode = (os.stat(f).st_mode | bits) & mask
                    logger.info("changing mode of %s to %o", f, mode)
                    os.chmod(f, mode)

    set_executable_mode = lambda s, f: s.set_mode(0o555, 0o7777, f)

    def ensure_dir(self, path):
        path = os.path.abspath(path)
        if path not in self.ensured and not os.path.exists(path):
            self.ensured.add(path)
            d, f = os.path.split(path)
            self.ensure_dir(d)
            logger.info('Creating %s' % path)
            if not self.dry_run:
                os.mkdir(path)
            if self.record:
                self.dirs_created.add(path)

    def byte_compile(self, path, optimize=False, force=False, prefix=None):
        dpath = cache_from_source(path, not optimize)
        logger.info('Byte-compiling %s to %s', path, dpath)
        if not self.dry_run:
            if force or self.newer(path, dpath):
                if not prefix:
                    diagpath = None
                else:
                    assert path.startswith(prefix)
                    diagpath = path[len(prefix):]
                # compile only when needed; diagpath is only defined above
                py_compile.compile(path, dpath, diagpath, True)     # raise error
        self.record_as_written(dpath)
        return dpath

    def ensure_removed(self, path):
        if os.path.exists(path):
            if os.path.isdir(path) and not os.path.islink(path):
                logger.debug('Removing directory tree at %s', path)
                if not self.dry_run:
                    shutil.rmtree(path)
                if self.record:
                    if path in self.dirs_created:
                        self.dirs_created.remove(path)
            else:
                if os.path.islink(path):
                    s = 'link'
                else:
                    s = 'file'
                logger.debug('Removing %s %s', s, path)
                if not self.dry_run:
                    os.remove(path)
                if self.record:
                    if path in self.files_written:
                        self.files_written.remove(path)

    def is_writable(self, path):
        result = False
        while not result:
            if os.path.exists(path):
                result = os.access(path, os.W_OK)
                break
            parent = os.path.dirname(path)
            if parent == path:
                break
            path = parent
        return result

    def commit(self):
        """
        Commit recorded changes, turn off recording, return
        changes.
        """
        assert self.record
        result = self.files_written, self.dirs_created
        self._init_record()
        return result

    def rollback(self):
        if not self.dry_run:
            for f in list(self.files_written):
                if os.path.exists(f):
                    os.remove(f)
            # dirs should all be empty now, except perhaps for
            # __pycache__ subdirs
            # reverse so that subdirs appear before their parents
            dirs = sorted(self.dirs_created, reverse=True)
            for d in dirs:
                flist = os.listdir(d)
                if flist:
                    assert flist == ['__pycache__']
                    sd = os.path.join(d, flist[0])
                    os.rmdir(sd)
                os.rmdir(d)     # should fail if non-empty
        self._init_record()
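

# Illustrative sketch: record the changes made through a FileOperator and
# retrieve them with commit(). The file name is made up and everything is
# written to a throwaway temporary directory.
def _example_file_operator():  # pragma: no cover
    fo = FileOperator()
    fo.record = True
    with tempdir() as td:
        fo.write_text_file(os.path.join(td, 'hello.txt'), u'hi', 'utf-8')
        files_written, dirs_created = fo.commit()
    return files_written, dirs_created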

def resolve(module_name, dotted_path):
    if module_name in sys.modules:
        mod = sys.modules[module_name]
    else:
        mod = __import__(module_name)
    if dotted_path is None:
        result = mod
    else:
        parts = dotted_path.split('.')
        result = getattr(mod, parts.pop(0))
        for p in parts:
            result = getattr(result, p)
    return result


class ExportEntry(object):
    def __init__(self, name, prefix, suffix, flags):
        self.name = name
        self.prefix = prefix
        self.suffix = suffix
        self.flags = flags

    @cached_property
    def value(self):
        return resolve(self.prefix, self.suffix)

    def __repr__(self):  # pragma: no cover
        return '<ExportEntry %s = %s:%s %s>' % (self.name, self.prefix,
                                                self.suffix, self.flags)

    def __eq__(self, other):
        if not isinstance(other, ExportEntry):
            result = False
        else:
            result = (self.name == other.name and
                      self.prefix == other.prefix and
                      self.suffix == other.suffix and
                      self.flags == other.flags)
        return result

    __hash__ = object.__hash__


ENTRY_RE = re.compile(r'''(?P<name>(\w|[-.+])+)
                      \s*=\s*(?P<callable>(\w+)([:\.]\w+)*)
                      \s*(\[\s*(?P<flags>\w+(=\w+)?(,\s*\w+(=\w+)?)*)\s*\])?
                      ''', re.VERBOSE)

def get_export_entry(specification):
    m = ENTRY_RE.search(specification)
    if not m:
        result = None
        if '[' in specification or ']' in specification:
            raise DistlibException("Invalid specification "
                                   "'%s'" % specification)
    else:
        d = m.groupdict()
        name = d['name']
        path = d['callable']
        colons = path.count(':')
        if colons == 0:
            prefix, suffix = path, None
        else:
            if colons != 1:
                raise DistlibException("Invalid specification "
                                       "'%s'" % specification)
            prefix, suffix = path.split(':')
        flags = d['flags']
        if flags is None:
            if '[' in specification or ']' in specification:
                raise DistlibException("Invalid specification "
                                       "'%s'" % specification)
            flags = []
        else:
            flags = [f.strip() for f in flags.split(',')]
        result = ExportEntry(name, prefix, suffix, flags)
    return result


def get_cache_base(suffix=None):
    """
    Return the default base location for distlib caches. If the directory does
    not exist, it is created. Use the suffix provided for the base directory,
    and default to '.distlib' if it isn't provided.

    On Windows, if LOCALAPPDATA is defined in the environment, then it is
    assumed to be a directory, and will be the parent directory of the result.
    On POSIX, and on Windows if LOCALAPPDATA is not defined, the user's home
    directory - using os.expanduser('~') - will be the parent directory of
    the result.

    The result is just the directory '.distlib' in the parent directory as
    determined above, or with the name specified with ``suffix``.
    """
    if suffix is None:
        suffix = '.distlib'
    if os.name == 'nt' and 'LOCALAPPDATA' in os.environ:
        result = os.path.expandvars('$localappdata')
    else:
        # Assume posix, or old Windows
        result = os.path.expanduser('~')
    # we use 'isdir' instead of 'exists', because we want to
    # fail if there's a file with that name
    if os.path.isdir(result):
        usable = os.access(result, os.W_OK)
        if not usable:
            logger.warning('Directory exists but is not writable: %s', result)
    else:
        try:
            os.makedirs(result)
            usable = True
        except OSError:
            logger.warning('Unable to create %s', result, exc_info=True)
            usable = False
    if not usable:
        result = tempfile.mkdtemp()
        logger.warning('Default location unusable, using %s', result)
    return os.path.join(result, suffix)


def path_to_cache_dir(path):
    """
    Convert an absolute path to a directory name for use in a cache.

    The algorithm used is:

    #. On Windows, any ``':'`` in the drive is replaced with ``'---'``.
    #. Any occurrence of ``os.sep`` is replaced with ``'--'``.
    #. ``'.cache'`` is appended.
    """
    d, p = os.path.splitdrive(os.path.abspath(path))
    if d:
        d = d.replace(':', '---')
    p = p.replace(os.sep, '--')
    return d + p + '.cache'
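
# For example, on POSIX path_to_cache_dir('/home/user/lib') returns
# '--home--user--lib.cache'.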


def ensure_slash(s):
    if not s.endswith('/'):
        return s + '/'
    return s


def parse_credentials(netloc):
    username = password = None
    if '@' in netloc:
        prefix, netloc = netloc.split('@', 1)
        if ':' not in prefix:
            username = prefix
        else:
            username, password = prefix.split(':', 1)
    return username, password, netloc
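
# For example, parse_credentials('user:secret@example.com') returns
# ('user', 'secret', 'example.com'), while parse_credentials('example.com')
# returns (None, None, 'example.com').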


def get_process_umask():
    result = os.umask(0o22)
    os.umask(result)
    return result

def is_string_sequence(seq):
    result = True
    i = None
    for i, s in enumerate(seq):
        if not isinstance(s, string_types):
            result = False
            break
    assert i is not None
    return result

PROJECT_NAME_AND_VERSION = re.compile('([a-z0-9_]+([.-][a-z_][a-z0-9_]*)*)-'
                                      '([a-z0-9_.+-]+)', re.I)
PYTHON_VERSION = re.compile(r'-py(\d\.?\d?)')


def split_filename(filename, project_name=None):
    """
    Extract name, version, python version from a filename (no extension)

    Return name, version, pyver or None
    """
    result = None
    pyver = None
    filename = unquote(filename).replace(' ', '-')
    m = PYTHON_VERSION.search(filename)
    if m:
        pyver = m.group(1)
        filename = filename[:m.start()]
    if project_name and len(filename) > len(project_name) + 1:
        m = re.match(re.escape(project_name) + r'\b', filename)
        if m:
            n = m.end()
            result = filename[:n], filename[n + 1:], pyver
    if result is None:
        m = PROJECT_NAME_AND_VERSION.match(filename)
        if m:
            result = m.group(1), m.group(3), pyver
    return result
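
# For example, split_filename('foo_bar-1.0-py2.7') returns
# ('foo_bar', '1.0', '2.7') and split_filename('foo_bar-1.0') returns
# ('foo_bar', '1.0', None).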

# Allow spaces in name because of legacy dists like "Twisted Core"
NAME_VERSION_RE = re.compile(r'(?P<name>[\w .-]+)\s*'
                             r'\(\s*(?P<ver>[^\s)]+)\)$')

def parse_name_and_version(p):
    """
    A utility function to get a name and version from a string,
    e.g. a Provides-Dist value.

    :param p: A value in the form 'foo (1.0)'
    :return: The name and version as a tuple.
    """
    m = NAME_VERSION_RE.match(p)
    if not m:
        raise DistlibException('Ill-formed name/version string: \'%s\'' % p)
    d = m.groupdict()
    return d['name'].strip().lower(), d['ver']
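
# For example, parse_name_and_version('foo (1.0)') returns ('foo', '1.0');
# a value without a parenthesised version raises DistlibException.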

def get_extras(requested, available):
    result = set()
    requested = set(requested or [])
    available = set(available or [])
    if '*' in requested:
        requested.remove('*')
        result |= available
    for r in requested:
        if r == '-':
            result.add(r)
        elif r.startswith('-'):
            unwanted = r[1:]
            if unwanted not in available:
                logger.warning('undeclared extra: %s' % unwanted)
            if unwanted in result:
                result.remove(unwanted)
        else:
            if r not in available:
                logger.warning('undeclared extra: %s' % r)
            result.add(r)
    return result
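
# For example, get_extras(['*', '-bar'], ['foo', 'bar']) returns {'foo'}:
# '*' selects every available extra and '-bar' then removes 'bar'.
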
#
# Extended metadata functionality
#

def _get_external_data(url):
    result = {}
    try:
        # urlopen might fail if it runs into redirections,
        # because of Python issue #13696. Fixed in locators
        # using a custom redirect handler.
        resp = urlopen(url)
        headers = resp.info()
        ct = headers.get('Content-Type')
        if not ct.startswith('application/json'):
            logger.debug('Unexpected response for JSON request: %s', ct)
        else:
            reader = codecs.getreader('utf-8')(resp)
            #data = reader.read().decode('utf-8')
            #result = json.loads(data)
            result = json.load(reader)
    except Exception as e:
        logger.exception('Failed to get external data for %s: %s', url, e)
    return result

_external_data_base_url = 'https://www.red-dove.com/pypi/projects/'

def get_project_data(name):
    url = '%s/%s/project.json' % (name[0].upper(), name)
    url = urljoin(_external_data_base_url, url)
    result = _get_external_data(url)
    return result

def get_package_data(name, version):
    url = '%s/%s/package-%s.json' % (name[0].upper(), name, version)
    url = urljoin(_external_data_base_url, url)
    return _get_external_data(url)


class Cache(object):
    """
    A class implementing a cache for resources that need to live in the file
    system, e.g. shared libraries. This class was moved here from the
    resources module because it could be used by other modules, e.g. the
    wheel module.
    """

    def __init__(self, base):
        """
        Initialise an instance.

        :param base: The base directory where the cache should be located.
        """
        # we use 'isdir' instead of 'exists', because we want to
        # fail if there's a file with that name
        if not os.path.isdir(base):  # pragma: no cover
            os.makedirs(base)
        if (os.stat(base).st_mode & 0o77) != 0:
            logger.warning('Directory \'%s\' is not private', base)
        self.base = os.path.abspath(os.path.normpath(base))

    def prefix_to_dir(self, prefix):
        """
        Converts a resource prefix to a directory name in the cache.
        """
        return path_to_cache_dir(prefix)

    def clear(self):
        """
        Clear the cache.
        """
        not_removed = []
        for fn in os.listdir(self.base):
            fn = os.path.join(self.base, fn)
            try:
                if os.path.islink(fn) or os.path.isfile(fn):
                    os.remove(fn)
                elif os.path.isdir(fn):
                    shutil.rmtree(fn)
            except Exception:
                not_removed.append(fn)
        return not_removed


class EventMixin(object):
    """
    A very simple publish/subscribe system.
    """
    def __init__(self):
        self._subscribers = {}

    def add(self, event, subscriber, append=True):
        """
        Add a subscriber for an event.

        :param event: The name of an event.
        :param subscriber: The subscriber to be added (and called when the
                           event is published).
        :param append: Whether to append or prepend the subscriber to an
                       existing subscriber list for the event.
        """
        subs = self._subscribers
        if event not in subs:
            subs[event] = deque([subscriber])
        else:
            sq = subs[event]
            if append:
                sq.append(subscriber)
            else:
                sq.appendleft(subscriber)

    def remove(self, event, subscriber):
        """
        Remove a subscriber for an event.

        :param event: The name of an event.
        :param subscriber: The subscriber to be removed.
        """
        subs = self._subscribers
        if event not in subs:
            raise ValueError('No subscribers: %r' % event)
        subs[event].remove(subscriber)

    def get_subscribers(self, event):
        """
        Return an iterator for the subscribers for an event.
        :param event: The event to return subscribers for.
        """
        return iter(self._subscribers.get(event, ()))

    def publish(self, event, *args, **kwargs):
        """
        Publish an event and return a list of values returned by its
        subscribers.

        :param event: The event to publish.
        :param args: The positional arguments to pass to the event's
                     subscribers.
        :param kwargs: The keyword arguments to pass to the event's
                       subscribers.
        """
        result = []
        for subscriber in self.get_subscribers(event):
            try:
                value = subscriber(event, *args, **kwargs)
            except Exception:
                logger.exception('Exception during event publication')
                value = None
            result.append(value)
        logger.debug('publish %s: args = %s, kwargs = %s, result = %s',
                     event, args, kwargs, result)
        return result
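

# Illustrative sketch: EventMixin is normally mixed into another class, but
# it works standalone too. The event name and subscriber below are made up.
def _example_event_mixin():  # pragma: no cover
    def on_ping(event, value):
        return value * 2

    bus = EventMixin()
    bus.add('ping', on_ping)
    return bus.publish('ping', 21)    # [42]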

#
# Simple sequencing
#
class Sequencer(object):
    def __init__(self):
        self._preds = {}
        self._succs = {}
        self._nodes = set()     # nodes with no preds/succs

    def add_node(self, node):
        self._nodes.add(node)

    def remove_node(self, node, edges=False):
        if node in self._nodes:
            self._nodes.remove(node)
        if edges:
            for p in set(self._preds.get(node, ())):
                self.remove(p, node)
            for s in set(self._succs.get(node, ())):
                self.remove(node, s)
            # Remove empties
            for k, v in list(self._preds.items()):
                if not v:
                    del self._preds[k]
            for k, v in list(self._succs.items()):
                if not v:
                    del self._succs[k]

    def add(self, pred, succ):
        assert pred != succ
        self._preds.setdefault(succ, set()).add(pred)
        self._succs.setdefault(pred, set()).add(succ)

    def remove(self, pred, succ):
        assert pred != succ
        try:
            preds = self._preds[succ]
            succs = self._succs[pred]
        except KeyError:  # pragma: no cover
            raise ValueError('%r not a successor of anything' % succ)
        try:
            preds.remove(pred)
            succs.remove(succ)
        except KeyError:  # pragma: no cover
            raise ValueError('%r not a successor of %r' % (succ, pred))

    def is_step(self, step):
        return (step in self._preds or step in self._succs or
                step in self._nodes)

    def get_steps(self, final):
        if not self.is_step(final):
            raise ValueError('Unknown: %r' % final)
        result = []
        todo = []
        seen = set()
        todo.append(final)
        while todo:
            step = todo.pop(0)
            if step in seen:
                # if a step was already seen,
                # move it to the end (so it will appear earlier
                # when reversed on return) ... but not for the
                # final step, as that would be confusing for
                # users
                if step != final:
                    result.remove(step)
                    result.append(step)
            else:
                seen.add(step)
                result.append(step)
                preds = self._preds.get(step, ())
                todo.extend(preds)
        return reversed(result)

    @property
    def strong_connections(self):
        #http://en.wikipedia.org/wiki/Tarjan%27s_strongly_connected_components_algorithm
        index_counter = [0]
        stack = []
        lowlinks = {}
        index = {}
        result = []

        graph = self._succs

        def strongconnect(node):
            # set the depth index for this node to the smallest unused index
            index[node] = index_counter[0]
            lowlinks[node] = index_counter[0]
            index_counter[0] += 1
            stack.append(node)

            # Consider successors
            try:
                successors = graph[node]
            except Exception:
                successors = []
            for successor in successors:
                if successor not in lowlinks:
                    # Successor has not yet been visited
                    strongconnect(successor)
                    lowlinks[node] = min(lowlinks[node],lowlinks[successor])
                elif successor in stack:
                    # the successor is in the stack and hence in the current
                    # strongly connected component (SCC)
                    lowlinks[node] = min(lowlinks[node],index[successor])

            # If `node` is a root node, pop the stack and generate an SCC
            if lowlinks[node] == index[node]:
                connected_component = []

                while True:
                    successor = stack.pop()
                    connected_component.append(successor)
                    if successor == node: break
                component = tuple(connected_component)
                # storing the result
                result.append(component)

        for node in graph:
            if node not in lowlinks:
                strongconnect(node)

        return result

    @property
    def dot(self):
        result = ['digraph G {']
        for succ in self._preds:
            preds = self._preds[succ]
            for pred in preds:
                result.append('  %s -> %s;' % (pred, succ))
        for node in self._nodes:
            result.append('  %s;' % node)
        result.append('}')
        return '\n'.join(result)
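

# Illustrative sketch: 'build' must precede 'test', and 'test' must precede
# 'release'; get_steps() then yields the steps needed to reach 'release' in
# dependency order. The step names are made up.
def _example_sequencer():  # pragma: no cover
    seq = Sequencer()
    seq.add('build', 'test')
    seq.add('test', 'release')
    return list(seq.get_steps('release'))    # ['build', 'test', 'release']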

#
# Unarchiving functionality for zip, tar, tgz, tbz, whl
#

ARCHIVE_EXTENSIONS = ('.tar.gz', '.tar.bz2', '.tar', '.zip',
                      '.tgz', '.tbz', '.whl')

def unarchive(archive_filename, dest_dir, format=None, check=True):

    def check_path(path):
        if not isinstance(path, text_type):
            path = path.decode('utf-8')
        p = os.path.abspath(os.path.join(dest_dir, path))
        if not p.startswith(dest_dir) or p[plen] != os.sep:
            raise ValueError('path outside destination: %r' % p)

    dest_dir = os.path.abspath(dest_dir)
    plen = len(dest_dir)
    archive = None
    if format is None:
        if archive_filename.endswith(('.zip', '.whl')):
            format = 'zip'
        elif archive_filename.endswith(('.tar.gz', '.tgz')):
            format = 'tgz'
            mode = 'r:gz'
        elif archive_filename.endswith(('.tar.bz2', '.tbz')):
            format = 'tbz'
            mode = 'r:bz2'
        elif archive_filename.endswith('.tar'):
            format = 'tar'
            mode = 'r'
        else:  # pragma: no cover
            raise ValueError('Unknown format for %r' % archive_filename)
    try:
        if format == 'zip':
            archive = ZipFile(archive_filename, 'r')
            if check:
                names = archive.namelist()
                for name in names:
                    check_path(name)
        else:
            archive = tarfile.open(archive_filename, mode)
            if check:
                names = archive.getnames()
                for name in names:
                    check_path(name)
        if format != 'zip' and sys.version_info[0] < 3:
            # See Python issue 17153. If the dest path contains Unicode,
            # tarfile extraction fails on Python 2.x if a member path name
            # contains non-ASCII characters - it leads to an implicit
            # bytes -> unicode conversion using ASCII to decode.
            for tarinfo in archive.getmembers():
                if not isinstance(tarinfo.name, text_type):
                    tarinfo.name = tarinfo.name.decode('utf-8')

        # Limit extraction of dangerous items, if this Python
        # allows it easily. If not, just trust the input.
        # See: https://docs.python.org/3/library/tarfile.html#extraction-filters
        def extraction_filter(member, path):
            """Run tarfile.tar_fillter, but raise the expected ValueError"""
            # This is only called if the current Python has tarfile filters
            try:
                return tarfile.tar_filter(member, path)
            except tarfile.FilterError as exc:
                raise ValueError(str(exc))
        archive.extraction_filter = extraction_filter

        archive.extractall(dest_dir)

    finally:
        if archive:
            archive.close()


def zip_dir(directory):
    """zip a directory tree into a BytesIO object"""
    result = io.BytesIO()
    dlen = len(directory)
    with ZipFile(result, "w") as zf:
        for root, dirs, files in os.walk(directory):
            for name in files:
                full = os.path.join(root, name)
                rel = root[dlen:]
                dest = os.path.join(rel, name)
                zf.write(full, dest)
    return result
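

# Illustrative sketch: zip a scratch directory tree into memory with
# zip_dir(), write it out as a .zip file and unpack it again with
# unarchive(). All paths are made up and live in temporary directories.
def _example_zip_roundtrip():  # pragma: no cover
    with tempdir() as src, tempdir() as dst:
        with open(os.path.join(src, 'hello.txt'), 'w') as f:
            f.write('hi')
        archive_path = os.path.join(dst, 'example.zip')
        with open(archive_path, 'wb') as f:
            f.write(zip_dir(src).getvalue())
        unarchive(archive_path, os.path.join(dst, 'out'), format='zip')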

#
# Simple progress bar
#

UNITS = ('', 'K', 'M', 'G','T','P')


class Progress(object):
    unknown = 'UNKNOWN'

    def __init__(self, minval=0, maxval=100):
        assert maxval is None or maxval >= minval
        self.min = self.cur = minval
        self.max = maxval
        self.started = None
        self.elapsed = 0
        self.done = False

    def update(self, curval):
        assert self.min <= curval
        assert self.max is None or curval <= self.max
        self.cur = curval
        now = time.time()
        if self.started is None:
            self.started = now
        else:
            self.elapsed = now - self.started

    def increment(self, incr):
        assert incr >= 0
        self.update(self.cur + incr)

    def start(self):
        self.update(self.min)
        return self

    def stop(self):
        if self.max is not None:
            self.update(self.max)
        self.done = True

    @property
    def maximum(self):
        return self.unknown if self.max is None else self.max

    @property
    def percentage(self):
        if self.done:
            result = '100 %'
        elif self.max is None:
            result = ' ?? %'
        else:
            v = 100.0 * (self.cur - self.min) / (self.max - self.min)
            result = '%3d %%' % v
        return result

    def format_duration(self, duration):
        if (duration <= 0) and self.max is None or self.cur == self.min:
            result = '??:??:??'
        #elif duration < 1:
        #    result = '--:--:--'
        else:
            result = time.strftime('%H:%M:%S', time.gmtime(duration))
        return result

    @property
    def ETA(self):
        if self.done:
            prefix = 'Done'
            t = self.elapsed
            #import pdb; pdb.set_trace()
        else:
            prefix = 'ETA '
            if self.max is None:
                t = -1
            elif self.elapsed == 0 or (self.cur == self.min):
                t = 0
            else:
                #import pdb; pdb.set_trace()
                t = float(self.max - self.min)
                t /= self.cur - self.min
                t = (t - 1) * self.elapsed
        return '%s: %s' % (prefix, self.format_duration(t))

    @property
    def speed(self):
        if self.elapsed == 0:
            result = 0.0
        else:
            result = (self.cur - self.min) / self.elapsed
        for unit in UNITS:
            if result < 1000:
                break
            result /= 1000.0
        return '%d %sB/s' % (result, unit)
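

# Illustrative sketch: drive a Progress instance by hand and read back its
# formatted percentage. The numbers are arbitrary.
def _example_progress():  # pragma: no cover
    p = Progress(maxval=200).start()
    p.increment(50)
    # p.percentage == ' 25 %' at this point
    p.stop()
    return p.percentage    # '100 %' once stopped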

#
# Glob functionality
#

RICH_GLOB = re.compile(r'\{([^}]*)\}')
_CHECK_RECURSIVE_GLOB = re.compile(r'[^/\\,{]\*\*|\*\*[^/\\,}]')
_CHECK_MISMATCH_SET = re.compile(r'^[^{]*\}|\{[^}]*$')


def iglob(path_glob):
    """Extended globbing function that supports ** and {opt1,opt2,opt3}."""
    if _CHECK_RECURSIVE_GLOB.search(path_glob):
        msg = """invalid glob %r: recursive glob "**" must be used alone"""
        raise ValueError(msg % path_glob)
    if _CHECK_MISMATCH_SET.search(path_glob):
        msg = """invalid glob %r: mismatching set marker '{' or '}'"""
        raise ValueError(msg % path_glob)
    return _iglob(path_glob)


def _iglob(path_glob):
    rich_path_glob = RICH_GLOB.split(path_glob, 1)
    if len(rich_path_glob) > 1:
        assert len(rich_path_glob) == 3, rich_path_glob
        prefix, set, suffix = rich_path_glob
        for item in set.split(','):
            for path in _iglob(''.join((prefix, item, suffix))):
                yield path
    else:
        if '**' not in path_glob:
            for item in std_iglob(path_glob):
                yield item
        else:
            prefix, radical = path_glob.split('**', 1)
            if prefix == '':
                prefix = '.'
            if radical == '':
                radical = '*'
            else:
                # we support both
                radical = radical.lstrip('/')
                radical = radical.lstrip('\\')
            for path, dir, files in os.walk(prefix):
                path = os.path.normpath(path)
                for fn in _iglob(os.path.join(path, radical)):
                    yield fn
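
# For example, iglob('src/**/*.py') walks every directory under 'src' looking
# for '*.py' files, and iglob('{docs,tests}/*.rst') expands the braces into
# 'docs/*.rst' and 'tests/*.rst'.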

if ssl:
    from .compat import (HTTPSHandler as BaseHTTPSHandler, match_hostname,
                         CertificateError)


#
# HTTPSConnection which verifies certificates/matches domains
#

    class HTTPSConnection(httplib.HTTPSConnection):
        ca_certs = None # set this to the path to the certs file (.pem)
        check_domain = True # only used if ca_certs is not None

        # noinspection PyPropertyAccess
        def connect(self):
            sock = socket.create_connection((self.host, self.port), self.timeout)
            if getattr(self, '_tunnel_host', False):
                self.sock = sock
                self._tunnel()

            if not hasattr(ssl, 'SSLContext'):
                # For 2.x
                if self.ca_certs:
                    cert_reqs = ssl.CERT_REQUIRED
                else:
                    cert_reqs = ssl.CERT_NONE
                self.sock = ssl.wrap_socket(sock, self.key_file, self.cert_file,
                                            cert_reqs=cert_reqs,
                                            ssl_version=ssl.PROTOCOL_SSLv23,
                                            ca_certs=self.ca_certs)
            else:  # pragma: no cover
                context = ssl.SSLContext(ssl.PROTOCOL_SSLv23)
                context.options |= ssl.OP_NO_SSLv2
                if self.cert_file:
                    context.load_cert_chain(self.cert_file, self.key_file)
                kwargs = {}
                if self.ca_certs:
                    context.verify_mode = ssl.CERT_REQUIRED
                    context.load_verify_locations(cafile=self.ca_certs)
                    if getattr(ssl, 'HAS_SNI', False):
                        kwargs['server_hostname'] = self.host
                self.sock = context.wrap_socket(sock, **kwargs)
            if self.ca_certs and self.check_domain:
                try:
                    match_hostname(self.sock.getpeercert(), self.host)
                    logger.debug('Host verified: %s', self.host)
                except CertificateError:  # pragma: no cover
                    self.sock.shutdown(socket.SHUT_RDWR)
                    self.sock.close()
                    raise

    class HTTPSHandler(BaseHTTPSHandler):
        def __init__(self, ca_certs, check_domain=True):
            BaseHTTPSHandler.__init__(self)
            self.ca_certs = ca_certs
            self.check_domain = check_domain

        def _conn_maker(self, *args, **kwargs):
            """
            This is called to create a connection instance. Normally you'd
            pass a connection class to do_open, but it doesn't actually check for
            a class, and just expects a callable. As long as we behave just as a
            constructor would have, we should be OK. If it ever changes so that
            we *must* pass a class, we'll create an UnsafeHTTPSConnection class
            which just sets check_domain to False in the class definition, and
            choose which one to pass to do_open.
            """
            result = HTTPSConnection(*args, **kwargs)
            if self.ca_certs:
                result.ca_certs = self.ca_certs
                result.check_domain = self.check_domain
            return result

        def https_open(self, req):
            try:
                return self.do_open(self._conn_maker, req)
            except URLError as e:
                if 'certificate verify failed' in str(e.reason):
                    raise CertificateError('Unable to verify server certificate '
                                           'for %s' % req.host)
                else:
                    raise

    #
    # To guard against mixing HTTP traffic with HTTPS (examples: A Man-In-The-
    # Middle proxy using HTTP listens on port 443, or an index mistakenly serves
    # HTML containing a http://xyz link when it should be https://xyz),
    # you can use the following handler class, which does not allow HTTP traffic.
    #
    # It works by inheriting from HTTPHandler - so build_opener won't add a
    # handler for HTTP itself.
    #
    class HTTPSOnlyHandler(HTTPSHandler, HTTPHandler):
        def http_open(self, req):
            raise URLError('Unexpected HTTP request on what should be a secure '
                           'connection: %s' % req)

#
# XML-RPC with timeouts
#

_ver_info = sys.version_info[:2]

if _ver_info == (2, 6):
    class HTTP(httplib.HTTP):
        def __init__(self, host='', port=None, **kwargs):
            if port == 0:   # 0 means use port 0, not the default port
                port = None
            self._setup(self._connection_class(host, port, **kwargs))


    if ssl:
        class HTTPS(httplib.HTTPS):
            def __init__(self, host='', port=None, **kwargs):
                if port == 0:   # 0 means use port 0, not the default port
                    port = None
                self._setup(self._connection_class(host, port, **kwargs))


class Transport(xmlrpclib.Transport):
    def __init__(self, timeout, use_datetime=0):
        self.timeout = timeout
        xmlrpclib.Transport.__init__(self, use_datetime)

    def make_connection(self, host):
        h, eh, x509 = self.get_host_info(host)
        if _ver_info == (2, 6):
            result = HTTP(h, timeout=self.timeout)
        else:
            if not self._connection or host != self._connection[0]:
                self._extra_headers = eh
                self._connection = host, httplib.HTTPConnection(h)
            result = self._connection[1]
        return result

if ssl:
    class SafeTransport(xmlrpclib.SafeTransport):
        def __init__(self, timeout, use_datetime=0):
            self.timeout = timeout
            xmlrpclib.SafeTransport.__init__(self, use_datetime)

        def make_connection(self, host):
            h, eh, kwargs = self.get_host_info(host)
            if not kwargs:
                kwargs = {}
            kwargs['timeout'] = self.timeout
            if _ver_info == (2, 6):
                result = HTTPS(host, None, **kwargs)
            else:
                if not self._connection or host != self._connection[0]:
                    self._extra_headers = eh
                    self._connection = host, httplib.HTTPSConnection(h, None,
                                                                     **kwargs)
                result = self._connection[1]
            return result


class ServerProxy(xmlrpclib.ServerProxy):
    def __init__(self, uri, **kwargs):
        self.timeout = timeout = kwargs.pop('timeout', None)
        # The above classes only come into play if a timeout
        # is specified
        if timeout is not None:
            scheme, _ = splittype(uri)
            use_datetime = kwargs.get('use_datetime', 0)
            if scheme == 'https':
                tcls = SafeTransport
            else:
                tcls = Transport
            kwargs['transport'] = t = tcls(timeout, use_datetime=use_datetime)
            self.transport = t
        xmlrpclib.ServerProxy.__init__(self, uri, **kwargs)

#
# CSV functionality. This is provided because on 2.x, the csv module can't
# handle Unicode. However, we need to deal with Unicode in e.g. RECORD files.
#

def _csv_open(fn, mode, **kwargs):
    if sys.version_info[0] < 3:
        mode += 'b'
    else:
        kwargs['newline'] = ''
    return open(fn, mode, **kwargs)


class CSVBase(object):
    defaults = {
        'delimiter': str(','),      # The strs are used because we need native
        'quotechar': str('"'),      # str in the csv API (2.x won't take
        'lineterminator': str('\n') # Unicode)
    }

    def __enter__(self):
        return self

    def __exit__(self, *exc_info):
        self.stream.close()


class CSVReader(CSVBase):
    def __init__(self, **kwargs):
        if 'stream' in kwargs:
            stream = kwargs['stream']
            if sys.version_info[0] >= 3:
                # needs to be a text stream
                stream = codecs.getreader('utf-8')(stream)
            self.stream = stream
        else:
            self.stream = _csv_open(kwargs['path'], 'r')
        self.reader = csv.reader(self.stream, **self.defaults)

    def __iter__(self):
        return self

    def next(self):
        result = next(self.reader)
        if sys.version_info[0] < 3:
            for i, item in enumerate(result):
                if not isinstance(item, text_type):
                    result[i] = item.decode('utf-8')
        return result

    __next__ = next

class CSVWriter(CSVBase):
    def __init__(self, fn, **kwargs):
        self.stream = _csv_open(fn, 'w')
        self.writer = csv.writer(self.stream, **self.defaults)

    def writerow(self, row):
        if sys.version_info[0] < 3:
            r = []
            for item in row:
                if isinstance(item, text_type):
                    item = item.encode('utf-8')
                r.append(item)
            row = r
        self.writer.writerow(row)
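

# Illustrative sketch (not part of the original module): round-trip a small
# RECORD-style row through CSVWriter and CSVReader. The file name and hash
# are placeholders.
def _csv_roundtrip_example(path='RECORD.example'):
    with CSVWriter(path) as writer:
        writer.writerow(('distlib/__init__.py', 'sha256=0000', '1105'))
    rows = []
    with CSVReader(path=path) as reader:
        for row in reader:
            rows.append(row)
    return rows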

#
#   Configurator functionality
#

class Configurator(BaseConfigurator):

    value_converters = dict(BaseConfigurator.value_converters)
    value_converters['inc'] = 'inc_convert'

    def __init__(self, config, base=None):
        super(Configurator, self).__init__(config)
        self.base = base or os.getcwd()

    def configure_custom(self, config):
        def convert(o):
            if isinstance(o, (list, tuple)):
                result = type(o)([convert(i) for i in o])
            elif isinstance(o, dict):
                if '()' in o:
                    result = self.configure_custom(o)
                else:
                    result = {}
                    for k in o:
                        result[k] = convert(o[k])
            else:
                result = self.convert(o)
            return result

        c = config.pop('()')
        if not callable(c):
            c = self.resolve(c)
        props = config.pop('.', None)
        # Check for valid identifiers
        args = config.pop('[]', ())
        if args:
            args = tuple([convert(o) for o in args])
        items = [(k, convert(config[k])) for k in config if valid_ident(k)]
        kwargs = dict(items)
        result = c(*args, **kwargs)
        if props:
            for n, v in props.items():
                setattr(result, n, convert(v))
        return result

    def __getitem__(self, key):
        result = self.config[key]
        if isinstance(result, dict) and '()' in result:
            self.config[key] = result = self.configure_custom(result)
        return result

    def inc_convert(self, value):
        """Default converter for the inc:// protocol."""
        if not os.path.isabs(value):
            value = os.path.join(self.base, value)
        with codecs.open(value, 'r', encoding='utf-8') as f:
            result = json.load(f)
        return result
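

# Illustrative sketch (not part of the original module): Configurator resolves
# dict values that use the '()' convention into constructed objects on first
# access; the dotted path below is just an example target.
def _configurator_example():
    cfg = Configurator({'pool': {'()': 'collections.OrderedDict'}})
    return cfg['pool']   # instantiated via configure_custom() on access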

#
# Mixin for running subprocesses and capturing their output
#

class SubprocessMixin(object):
    def __init__(self, verbose=False, progress=None):
        self.verbose = verbose
        self.progress = progress

    def reader(self, stream, context):
        """
        Read lines from a subprocess' output stream and either pass them to a
        progress callable (if specified) or write progress information to sys.stderr.
        """
        progress = self.progress
        verbose = self.verbose
        while True:
            s = stream.readline()
            if not s:
                break
            if progress is not None:
                progress(s, context)
            else:
                if not verbose:
                    sys.stderr.write('.')
                else:
                    sys.stderr.write(s.decode('utf-8'))
                sys.stderr.flush()
        stream.close()

    def run_command(self, cmd, **kwargs):
        p = subprocess.Popen(cmd, stdout=subprocess.PIPE,
                             stderr=subprocess.PIPE, **kwargs)
        t1 = threading.Thread(target=self.reader, args=(p.stdout, 'stdout'))
        t1.start()
        t2 = threading.Thread(target=self.reader, args=(p.stderr, 'stderr'))
        t2.start()
        p.wait()
        t1.join()
        t2.join()
        if self.progress is not None:
            self.progress('done.', 'main')
        elif self.verbose:
            sys.stderr.write('done.\n')
        return p
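

# Illustrative sketch (not part of the original module): a minimal consumer of
# SubprocessMixin.run_command; the command run here is a placeholder.
def _subprocess_mixin_example():
    class Runner(SubprocessMixin):
        pass
    runner = Runner(verbose=True)
    return runner.run_command(['python', '--version'])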


def normalize_name(name):
    """Normalize a python package name a la PEP 503"""
    # https://www.python.org/dev/peps/pep-0503/#normalized-names
    return re.sub('[-_.]+', '-', name).lower()
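

# Illustrative sketch (not part of the original module): PEP 503 normalization
# folds runs of '-', '_' and '.' to a single '-' and lower-cases the result.
def _normalize_name_example():
    assert normalize_name('Foo.Bar_baz') == 'foo-bar-baz'
    return normalize_name('Pillow')   # -> 'pillow'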
site-packages/pip/_vendor/distlib/__init__.py
# -*- coding: utf-8 -*-
#
# Copyright (C) 2012-2016 Vinay Sajip.
# Licensed to the Python Software Foundation under a contributor agreement.
# See LICENSE.txt and CONTRIBUTORS.txt.
#
import logging

__version__ = '0.2.4'

class DistlibException(Exception):
    pass

try:
    from logging import NullHandler
except ImportError: # pragma: no cover
    class NullHandler(logging.Handler):
        def handle(self, record): pass
        def emit(self, record): pass
        def createLock(self): self.lock = None

logger = logging.getLogger(__name__)
logger.addHandler(NullHandler())
site-packages/pip/_vendor/distlib/metadata.py
# -*- coding: utf-8 -*-
#
# Copyright (C) 2012 The Python Software Foundation.
# See LICENSE.txt and CONTRIBUTORS.txt.
#
"""Implementation of the Metadata for Python packages PEPs.

Supports all metadata formats (1.0, 1.1, 1.2, and 2.0 experimental).
"""
from __future__ import unicode_literals

import codecs
from email import message_from_file
import json
import logging
import re


from . import DistlibException, __version__
from .compat import StringIO, string_types, text_type
from .markers import interpret
from .util import extract_by_key, get_extras
from .version import get_scheme, PEP440_VERSION_RE

logger = logging.getLogger(__name__)


class MetadataMissingError(DistlibException):
    """A required metadata is missing"""


class MetadataConflictError(DistlibException):
    """Attempt to read or write metadata fields that are conflictual."""


class MetadataUnrecognizedVersionError(DistlibException):
    """Unknown metadata version number."""


class MetadataInvalidError(DistlibException):
    """A metadata value is invalid"""

# public API of this module
__all__ = ['Metadata', 'PKG_INFO_ENCODING', 'PKG_INFO_PREFERRED_VERSION']

# Encoding used for the PKG-INFO files
PKG_INFO_ENCODING = 'utf-8'

# preferred version. Hopefully will be changed
# to 1.2 once PEP 345 is supported everywhere
PKG_INFO_PREFERRED_VERSION = '1.1'

_LINE_PREFIX_1_2 = re.compile('\n       \|')
_LINE_PREFIX_PRE_1_2 = re.compile('\n        ')
_241_FIELDS = ('Metadata-Version', 'Name', 'Version', 'Platform',
               'Summary', 'Description',
               'Keywords', 'Home-page', 'Author', 'Author-email',
               'License')

_314_FIELDS = ('Metadata-Version', 'Name', 'Version', 'Platform',
               'Supported-Platform', 'Summary', 'Description',
               'Keywords', 'Home-page', 'Author', 'Author-email',
               'License', 'Classifier', 'Download-URL', 'Obsoletes',
               'Provides', 'Requires')

_314_MARKERS = ('Obsoletes', 'Provides', 'Requires', 'Classifier',
                'Download-URL')

_345_FIELDS = ('Metadata-Version', 'Name', 'Version', 'Platform',
               'Supported-Platform', 'Summary', 'Description',
               'Keywords', 'Home-page', 'Author', 'Author-email',
               'Maintainer', 'Maintainer-email', 'License',
               'Classifier', 'Download-URL', 'Obsoletes-Dist',
               'Project-URL', 'Provides-Dist', 'Requires-Dist',
               'Requires-Python', 'Requires-External')

_345_MARKERS = ('Provides-Dist', 'Requires-Dist', 'Requires-Python',
                'Obsoletes-Dist', 'Requires-External', 'Maintainer',
                'Maintainer-email', 'Project-URL')

_426_FIELDS = ('Metadata-Version', 'Name', 'Version', 'Platform',
               'Supported-Platform', 'Summary', 'Description',
               'Keywords', 'Home-page', 'Author', 'Author-email',
               'Maintainer', 'Maintainer-email', 'License',
               'Classifier', 'Download-URL', 'Obsoletes-Dist',
               'Project-URL', 'Provides-Dist', 'Requires-Dist',
               'Requires-Python', 'Requires-External', 'Private-Version',
               'Obsoleted-By', 'Setup-Requires-Dist', 'Extension',
               'Provides-Extra')

_426_MARKERS = ('Private-Version', 'Provides-Extra', 'Obsoleted-By',
                'Setup-Requires-Dist', 'Extension')

_ALL_FIELDS = set()
_ALL_FIELDS.update(_241_FIELDS)
_ALL_FIELDS.update(_314_FIELDS)
_ALL_FIELDS.update(_345_FIELDS)
_ALL_FIELDS.update(_426_FIELDS)

EXTRA_RE = re.compile(r'''extra\s*==\s*("([^"]+)"|'([^']+)')''')


def _version2fieldlist(version):
    if version == '1.0':
        return _241_FIELDS
    elif version == '1.1':
        return _314_FIELDS
    elif version == '1.2':
        return _345_FIELDS
    elif version == '2.0':
        return _426_FIELDS
    raise MetadataUnrecognizedVersionError(version)


def _best_version(fields):
    """Detect the best version depending on the fields used."""
    def _has_marker(keys, markers):
        for marker in markers:
            if marker in keys:
                return True
        return False

    keys = []
    for key, value in fields.items():
        if value in ([], 'UNKNOWN', None):
            continue
        keys.append(key)

    possible_versions = ['1.0', '1.1', '1.2', '2.0']

    # first, see if any field is not part of one of the versions
    for key in keys:
        if key not in _241_FIELDS and '1.0' in possible_versions:
            possible_versions.remove('1.0')
        if key not in _314_FIELDS and '1.1' in possible_versions:
            possible_versions.remove('1.1')
        if key not in _345_FIELDS and '1.2' in possible_versions:
            possible_versions.remove('1.2')
        if key not in _426_FIELDS and '2.0' in possible_versions:
            possible_versions.remove('2.0')

    # possible_versions contains the qualifying versions
    if len(possible_versions) == 1:
        return possible_versions[0]   # found !
    elif len(possible_versions) == 0:
        raise MetadataConflictError('Unknown metadata set')

    # let's see if one unique marker is found
    is_1_1 = '1.1' in possible_versions and _has_marker(keys, _314_MARKERS)
    is_1_2 = '1.2' in possible_versions and _has_marker(keys, _345_MARKERS)
    is_2_0 = '2.0' in possible_versions and _has_marker(keys, _426_MARKERS)
    if int(is_1_1) + int(is_1_2) + int(is_2_0) > 1:
        raise MetadataConflictError('You used incompatible 1.1/1.2/2.0 fields')

    # we have the choice, 1.0, or 1.2, or 2.0
    #   - 1.0 has a broken Summary field but works with all tools
    #   - 1.1 is to avoid
    #   - 1.2 fixes Summary but has little adoption
    #   - 2.0 adds more features and is very new
    if not is_1_1 and not is_1_2 and not is_2_0:
        # we couldn't find any specific marker
        if PKG_INFO_PREFERRED_VERSION in possible_versions:
            return PKG_INFO_PREFERRED_VERSION
    if is_1_1:
        return '1.1'
    if is_1_2:
        return '1.2'

    return '2.0'
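

# Illustrative sketch (not part of the original module): _best_version picks
# the highest metadata version consistent with the fields present.
def _best_version_example():
    fields = {'Name': 'demo', 'Version': '1.0', 'Requires-Dist': ['requests']}
    return _best_version(fields)   # -> '1.2'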

_ATTR2FIELD = {
    'metadata_version': 'Metadata-Version',
    'name': 'Name',
    'version': 'Version',
    'platform': 'Platform',
    'supported_platform': 'Supported-Platform',
    'summary': 'Summary',
    'description': 'Description',
    'keywords': 'Keywords',
    'home_page': 'Home-page',
    'author': 'Author',
    'author_email': 'Author-email',
    'maintainer': 'Maintainer',
    'maintainer_email': 'Maintainer-email',
    'license': 'License',
    'classifier': 'Classifier',
    'download_url': 'Download-URL',
    'obsoletes_dist': 'Obsoletes-Dist',
    'provides_dist': 'Provides-Dist',
    'requires_dist': 'Requires-Dist',
    'setup_requires_dist': 'Setup-Requires-Dist',
    'requires_python': 'Requires-Python',
    'requires_external': 'Requires-External',
    'requires': 'Requires',
    'provides': 'Provides',
    'obsoletes': 'Obsoletes',
    'project_url': 'Project-URL',
    'private_version': 'Private-Version',
    'obsoleted_by': 'Obsoleted-By',
    'extension': 'Extension',
    'provides_extra': 'Provides-Extra',
}

_PREDICATE_FIELDS = ('Requires-Dist', 'Obsoletes-Dist', 'Provides-Dist')
_VERSIONS_FIELDS = ('Requires-Python',)
_VERSION_FIELDS = ('Version',)
_LISTFIELDS = ('Platform', 'Classifier', 'Obsoletes',
               'Requires', 'Provides', 'Obsoletes-Dist',
               'Provides-Dist', 'Requires-Dist', 'Requires-External',
               'Project-URL', 'Supported-Platform', 'Setup-Requires-Dist',
               'Provides-Extra', 'Extension')
_LISTTUPLEFIELDS = ('Project-URL',)

_ELEMENTSFIELD = ('Keywords',)

_UNICODEFIELDS = ('Author', 'Maintainer', 'Summary', 'Description')

_MISSING = object()

_FILESAFE = re.compile('[^A-Za-z0-9.]+')


def _get_name_and_version(name, version, for_filename=False):
    """Return the distribution name with version.

    If for_filename is true, return a filename-escaped form."""
    if for_filename:
        # For both name and version any runs of non-alphanumeric or '.'
        # characters are replaced with a single '-'.  Additionally any
        # spaces in the version string become '.'
        name = _FILESAFE.sub('-', name)
        version = _FILESAFE.sub('-', version.replace(' ', '.'))
    return '%s-%s' % (name, version)


class LegacyMetadata(object):
    """The legacy metadata of a release.

    Supports versions 1.0, 1.1 and 1.2 (auto-detected). You can
    instantiate the class with one of these arguments (or none):
    - *path*, the path to a metadata file
    - *fileobj* is a file-like object with metadata as content
    - *mapping* is a dict-like object
    - *scheme* is a version scheme name
    """
    # TODO document the mapping API and UNKNOWN default key

    def __init__(self, path=None, fileobj=None, mapping=None,
                 scheme='default'):
        if [path, fileobj, mapping].count(None) < 2:
            raise TypeError('path, fileobj and mapping are exclusive')
        self._fields = {}
        self.requires_files = []
        self._dependencies = None
        self.scheme = scheme
        if path is not None:
            self.read(path)
        elif fileobj is not None:
            self.read_file(fileobj)
        elif mapping is not None:
            self.update(mapping)
            self.set_metadata_version()

    def set_metadata_version(self):
        self._fields['Metadata-Version'] = _best_version(self._fields)

    def _write_field(self, fileobj, name, value):
        fileobj.write('%s: %s\n' % (name, value))

    def __getitem__(self, name):
        return self.get(name)

    def __setitem__(self, name, value):
        return self.set(name, value)

    def __delitem__(self, name):
        field_name = self._convert_name(name)
        try:
            del self._fields[field_name]
        except KeyError:
            raise KeyError(name)

    def __contains__(self, name):
        return (name in self._fields or
                self._convert_name(name) in self._fields)

    def _convert_name(self, name):
        if name in _ALL_FIELDS:
            return name
        name = name.replace('-', '_').lower()
        return _ATTR2FIELD.get(name, name)

    def _default_value(self, name):
        if name in _LISTFIELDS or name in _ELEMENTSFIELD:
            return []
        return 'UNKNOWN'

    def _remove_line_prefix(self, value):
        if self.metadata_version in ('1.0', '1.1'):
            return _LINE_PREFIX_PRE_1_2.sub('\n', value)
        else:
            return _LINE_PREFIX_1_2.sub('\n', value)

    def __getattr__(self, name):
        if name in _ATTR2FIELD:
            return self[name]
        raise AttributeError(name)

    #
    # Public API
    #

#    dependencies = property(_get_dependencies, _set_dependencies)

    def get_fullname(self, filesafe=False):
        """Return the distribution name with version.

        If filesafe is true, return a filename-escaped form."""
        return _get_name_and_version(self['Name'], self['Version'], filesafe)

    def is_field(self, name):
        """return True if name is a valid metadata key"""
        name = self._convert_name(name)
        return name in _ALL_FIELDS

    def is_multi_field(self, name):
        name = self._convert_name(name)
        return name in _LISTFIELDS

    def read(self, filepath):
        """Read the metadata values from a file path."""
        fp = codecs.open(filepath, 'r', encoding='utf-8')
        try:
            self.read_file(fp)
        finally:
            fp.close()

    def read_file(self, fileob):
        """Read the metadata values from a file object."""
        msg = message_from_file(fileob)
        self._fields['Metadata-Version'] = msg['metadata-version']

        # When reading, get all the fields we can
        for field in _ALL_FIELDS:
            if field not in msg:
                continue
            if field in _LISTFIELDS:
                # we can have multiple lines
                values = msg.get_all(field)
                if field in _LISTTUPLEFIELDS and values is not None:
                    values = [tuple(value.split(',')) for value in values]
                self.set(field, values)
            else:
                # single line
                value = msg[field]
                if value is not None and value != 'UNKNOWN':
                    self.set(field, value)
        self.set_metadata_version()

    def write(self, filepath, skip_unknown=False):
        """Write the metadata fields to filepath."""
        fp = codecs.open(filepath, 'w', encoding='utf-8')
        try:
            self.write_file(fp, skip_unknown)
        finally:
            fp.close()

    def write_file(self, fileobject, skip_unknown=False):
        """Write the PKG-INFO format data to a file object."""
        self.set_metadata_version()

        for field in _version2fieldlist(self['Metadata-Version']):
            values = self.get(field)
            if skip_unknown and values in ('UNKNOWN', [], ['UNKNOWN']):
                continue
            if field in _ELEMENTSFIELD:
                self._write_field(fileobject, field, ','.join(values))
                continue
            if field not in _LISTFIELDS:
                if field == 'Description':
                    if self.metadata_version in ('1.0', '1.1'):
                        values = values.replace('\n', '\n        ')
                    else:
                        values = values.replace('\n', '\n       |')
                values = [values]

            if field in _LISTTUPLEFIELDS:
                values = [','.join(value) for value in values]

            for value in values:
                self._write_field(fileobject, field, value)

    def update(self, other=None, **kwargs):
        """Set metadata values from the given iterable `other` and kwargs.

        Behavior is like ``dict.update``: if ``other`` has a ``keys`` method,
        its keys are iterated over and ``self[key]`` is assigned ``other[key]``.
        Otherwise, ``other`` must be an iterable of ``(key, value)`` pairs.

        Keys that don't match a metadata field or that have an empty value are
        dropped.
        """
        def _set(key, value):
            if key in _ATTR2FIELD and value:
                self.set(self._convert_name(key), value)

        if not other:
            # other is None or empty container
            pass
        elif hasattr(other, 'keys'):
            for k in other.keys():
                _set(k, other[k])
        else:
            for k, v in other:
                _set(k, v)

        if kwargs:
            for k, v in kwargs.items():
                _set(k, v)

    def set(self, name, value):
        """Control then set a metadata field."""
        name = self._convert_name(name)

        if ((name in _ELEMENTSFIELD or name == 'Platform') and
            not isinstance(value, (list, tuple))):
            if isinstance(value, string_types):
                value = [v.strip() for v in value.split(',')]
            else:
                value = []
        elif (name in _LISTFIELDS and
              not isinstance(value, (list, tuple))):
            if isinstance(value, string_types):
                value = [value]
            else:
                value = []

        if logger.isEnabledFor(logging.WARNING):
            project_name = self['Name']

            scheme = get_scheme(self.scheme)
            if name in _PREDICATE_FIELDS and value is not None:
                for v in value:
                    # check that the values are valid
                    if not scheme.is_valid_matcher(v.split(';')[0]):
                        logger.warning(
                            "'%s': '%s' is not valid (field '%s')",
                            project_name, v, name)
            # FIXME this rejects UNKNOWN, is that right?
            elif name in _VERSIONS_FIELDS and value is not None:
                if not scheme.is_valid_constraint_list(value):
                    logger.warning("'%s': '%s' is not a valid version (field '%s')",
                                   project_name, value, name)
            elif name in _VERSION_FIELDS and value is not None:
                if not scheme.is_valid_version(value):
                    logger.warning("'%s': '%s' is not a valid version (field '%s')",
                                   project_name, value, name)

        if name in _UNICODEFIELDS:
            if name == 'Description':
                value = self._remove_line_prefix(value)

        self._fields[name] = value

    def get(self, name, default=_MISSING):
        """Get a metadata field."""
        name = self._convert_name(name)
        if name not in self._fields:
            if default is _MISSING:
                default = self._default_value(name)
            return default
        if name in _UNICODEFIELDS:
            value = self._fields[name]
            return value
        elif name in _LISTFIELDS:
            value = self._fields[name]
            if value is None:
                return []
            res = []
            for val in value:
                if name not in _LISTTUPLEFIELDS:
                    res.append(val)
                else:
                    # That's for Project-URL
                    res.append((val[0], val[1]))
            return res

        elif name in _ELEMENTSFIELD:
            value = self._fields[name]
            if isinstance(value, string_types):
                return value.split(',')
        return self._fields[name]

    def check(self, strict=False):
        """Check if the metadata is compliant. If strict is True then raise if
        no Name or Version are provided"""
        self.set_metadata_version()

        # XXX should check the versions (if the file was loaded)
        missing, warnings = [], []

        for attr in ('Name', 'Version'):  # required by PEP 345
            if attr not in self:
                missing.append(attr)

        if strict and missing != []:
            msg = 'missing required metadata: %s' % ', '.join(missing)
            raise MetadataMissingError(msg)

        for attr in ('Home-page', 'Author'):
            if attr not in self:
                missing.append(attr)

        # checking metadata 1.2 (XXX needs to check 1.1, 1.0)
        if self['Metadata-Version'] != '1.2':
            return missing, warnings

        scheme = get_scheme(self.scheme)

        def are_valid_constraints(value):
            for v in value:
                if not scheme.is_valid_matcher(v.split(';')[0]):
                    return False
            return True

        for fields, controller in ((_PREDICATE_FIELDS, are_valid_constraints),
                                   (_VERSIONS_FIELDS,
                                    scheme.is_valid_constraint_list),
                                   (_VERSION_FIELDS,
                                    scheme.is_valid_version)):
            for field in fields:
                value = self.get(field, None)
                if value is not None and not controller(value):
                    warnings.append("Wrong value for '%s': %s" % (field, value))

        return missing, warnings

    def todict(self, skip_missing=False):
        """Return fields as a dict.

        Field names will be converted to use the underscore-lowercase style
        instead of hyphen-mixed case (i.e. home_page instead of Home-page).
        """
        self.set_metadata_version()

        mapping_1_0 = (
            ('metadata_version', 'Metadata-Version'),
            ('name', 'Name'),
            ('version', 'Version'),
            ('summary', 'Summary'),
            ('home_page', 'Home-page'),
            ('author', 'Author'),
            ('author_email', 'Author-email'),
            ('license', 'License'),
            ('description', 'Description'),
            ('keywords', 'Keywords'),
            ('platform', 'Platform'),
            ('classifiers', 'Classifier'),
            ('download_url', 'Download-URL'),
        )

        data = {}
        for key, field_name in mapping_1_0:
            if not skip_missing or field_name in self._fields:
                data[key] = self[field_name]

        if self['Metadata-Version'] == '1.2':
            mapping_1_2 = (
                ('requires_dist', 'Requires-Dist'),
                ('requires_python', 'Requires-Python'),
                ('requires_external', 'Requires-External'),
                ('provides_dist', 'Provides-Dist'),
                ('obsoletes_dist', 'Obsoletes-Dist'),
                ('project_url', 'Project-URL'),
                ('maintainer', 'Maintainer'),
                ('maintainer_email', 'Maintainer-email'),
            )
            for key, field_name in mapping_1_2:
                if not skip_missing or field_name in self._fields:
                    if key != 'project_url':
                        data[key] = self[field_name]
                    else:
                        data[key] = [','.join(u) for u in self[field_name]]

        elif self['Metadata-Version'] == '1.1':
            mapping_1_1 = (
                ('provides', 'Provides'),
                ('requires', 'Requires'),
                ('obsoletes', 'Obsoletes'),
            )
            for key, field_name in mapping_1_1:
                if not skip_missing or field_name in self._fields:
                    data[key] = self[field_name]

        return data

    def add_requirements(self, requirements):
        if self['Metadata-Version'] == '1.1':
            # we can't have 1.1 metadata *and* Setuptools requires
            for field in ('Obsoletes', 'Requires', 'Provides'):
                if field in self:
                    del self[field]
        self['Requires-Dist'] += requirements

    # Mapping API
    # TODO could add iter* variants

    def keys(self):
        return list(_version2fieldlist(self['Metadata-Version']))

    def __iter__(self):
        for key in self.keys():
            yield key

    def values(self):
        return [self[key] for key in self.keys()]

    def items(self):
        return [(key, self[key]) for key in self.keys()]

    def __repr__(self):
        return '<%s %s %s>' % (self.__class__.__name__, self.name,
                               self.version)
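

# Illustrative sketch (not part of the original module): build a LegacyMetadata
# instance from a mapping and inspect the auto-detected metadata version.
def _legacy_metadata_example():
    md = LegacyMetadata(mapping={'name': 'demo', 'version': '1.0',
                                 'summary': 'A demo distribution'})
    return md['Metadata-Version'], md.todict()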


METADATA_FILENAME = 'pydist.json'
WHEEL_METADATA_FILENAME = 'metadata.json'


class Metadata(object):
    """
    The metadata of a release. This implementation uses 2.0 (JSON)
    metadata where possible. If not possible, it wraps a LegacyMetadata
    instance which handles the key-value metadata format.
    """

    METADATA_VERSION_MATCHER = re.compile('^\d+(\.\d+)*$')

    NAME_MATCHER = re.compile('^[0-9A-Z]([0-9A-Z_.-]*[0-9A-Z])?$', re.I)

    VERSION_MATCHER = PEP440_VERSION_RE

    SUMMARY_MATCHER = re.compile('.{1,2047}')

    METADATA_VERSION = '2.0'

    GENERATOR = 'distlib (%s)' % __version__

    MANDATORY_KEYS = {
        'name': (),
        'version': (),
        'summary': ('legacy',),
    }

    INDEX_KEYS = ('name version license summary description author '
                  'author_email keywords platform home_page classifiers '
                  'download_url')

    DEPENDENCY_KEYS = ('extras run_requires test_requires build_requires '
                       'dev_requires provides meta_requires obsoleted_by '
                       'supports_environments')

    SYNTAX_VALIDATORS = {
        'metadata_version': (METADATA_VERSION_MATCHER, ()),
        'name': (NAME_MATCHER, ('legacy',)),
        'version': (VERSION_MATCHER, ('legacy',)),
        'summary': (SUMMARY_MATCHER, ('legacy',)),
    }

    __slots__ = ('_legacy', '_data', 'scheme')

    def __init__(self, path=None, fileobj=None, mapping=None,
                 scheme='default'):
        if [path, fileobj, mapping].count(None) < 2:
            raise TypeError('path, fileobj and mapping are exclusive')
        self._legacy = None
        self._data = None
        self.scheme = scheme
        #import pdb; pdb.set_trace()
        if mapping is not None:
            try:
                self._validate_mapping(mapping, scheme)
                self._data = mapping
            except MetadataUnrecognizedVersionError:
                self._legacy = LegacyMetadata(mapping=mapping, scheme=scheme)
                self.validate()
        else:
            data = None
            if path:
                with open(path, 'rb') as f:
                    data = f.read()
            elif fileobj:
                data = fileobj.read()
            if data is None:
                # Initialised with no args - to be added
                self._data = {
                    'metadata_version': self.METADATA_VERSION,
                    'generator': self.GENERATOR,
                }
            else:
                if not isinstance(data, text_type):
                    data = data.decode('utf-8')
                try:
                    self._data = json.loads(data)
                    self._validate_mapping(self._data, scheme)
                except ValueError:
                    # Note: MetadataUnrecognizedVersionError does not
                    # inherit from ValueError (it's a DistlibException,
                    # which should not inherit from ValueError).
                    # The ValueError comes from the json.loads call - if that
                    # succeeds and we get a validation error, we want
                    # that to propagate
                    self._legacy = LegacyMetadata(fileobj=StringIO(data),
                                                  scheme=scheme)
                    self.validate()

    common_keys = set(('name', 'version', 'license', 'keywords', 'summary'))

    none_list = (None, list)
    none_dict = (None, dict)

    mapped_keys = {
        'run_requires': ('Requires-Dist', list),
        'build_requires': ('Setup-Requires-Dist', list),
        'dev_requires': none_list,
        'test_requires': none_list,
        'meta_requires': none_list,
        'extras': ('Provides-Extra', list),
        'modules': none_list,
        'namespaces': none_list,
        'exports': none_dict,
        'commands': none_dict,
        'classifiers': ('Classifier', list),
        'source_url': ('Download-URL', None),
        'metadata_version': ('Metadata-Version', None),
    }

    del none_list, none_dict

    def __getattribute__(self, key):
        common = object.__getattribute__(self, 'common_keys')
        mapped = object.__getattribute__(self, 'mapped_keys')
        if key in mapped:
            lk, maker = mapped[key]
            if self._legacy:
                if lk is None:
                    result = None if maker is None else maker()
                else:
                    result = self._legacy.get(lk)
            else:
                value = None if maker is None else maker()
                if key not in ('commands', 'exports', 'modules', 'namespaces',
                               'classifiers'):
                    result = self._data.get(key, value)
                else:
                    # special cases for PEP 459
                    sentinel = object()
                    result = sentinel
                    d = self._data.get('extensions')
                    if d:
                        if key == 'commands':
                            result = d.get('python.commands', value)
                        elif key == 'classifiers':
                            d = d.get('python.details')
                            if d:
                                result = d.get(key, value)
                        else:
                            d = d.get('python.exports')
                            if not d:
                                d = self._data.get('python.exports')
                            if d:
                                result = d.get(key, value)
                    if result is sentinel:
                        result = value
        elif key not in common:
            result = object.__getattribute__(self, key)
        elif self._legacy:
            result = self._legacy.get(key)
        else:
            result = self._data.get(key)
        return result

    def _validate_value(self, key, value, scheme=None):
        if key in self.SYNTAX_VALIDATORS:
            pattern, exclusions = self.SYNTAX_VALIDATORS[key]
            if (scheme or self.scheme) not in exclusions:
                m = pattern.match(value)
                if not m:
                    raise MetadataInvalidError("'%s' is an invalid value for "
                                               "the '%s' property" % (value,
                                                                    key))

    def __setattr__(self, key, value):
        self._validate_value(key, value)
        common = object.__getattribute__(self, 'common_keys')
        mapped = object.__getattribute__(self, 'mapped_keys')
        if key in mapped:
            lk, _ = mapped[key]
            if self._legacy:
                if lk is None:
                    raise NotImplementedError
                self._legacy[lk] = value
            elif key not in ('commands', 'exports', 'modules', 'namespaces',
                             'classifiers'):
                self._data[key] = value
            else:
                # special cases for PEP 459
                d = self._data.setdefault('extensions', {})
                if key == 'commands':
                    d['python.commands'] = value
                elif key == 'classifiers':
                    d = d.setdefault('python.details', {})
                    d[key] = value
                else:
                    d = d.setdefault('python.exports', {})
                    d[key] = value
        elif key not in common:
            object.__setattr__(self, key, value)
        else:
            if key == 'keywords':
                if isinstance(value, string_types):
                    value = value.strip()
                    if value:
                        value = value.split()
                    else:
                        value = []
            if self._legacy:
                self._legacy[key] = value
            else:
                self._data[key] = value

    @property
    def name_and_version(self):
        return _get_name_and_version(self.name, self.version, True)

    @property
    def provides(self):
        if self._legacy:
            result = self._legacy['Provides-Dist']
        else:
            result = self._data.setdefault('provides', [])
        s = '%s (%s)' % (self.name, self.version)
        if s not in result:
            result.append(s)
        return result

    @provides.setter
    def provides(self, value):
        if self._legacy:
            self._legacy['Provides-Dist'] = value
        else:
            self._data['provides'] = value

    def get_requirements(self, reqts, extras=None, env=None):
        """
        Base method to get dependencies, given a set of extras
        to satisfy and an optional environment context.
        :param reqts: A list of sometimes-wanted dependencies,
                      perhaps dependent on extras and environment.
        :param extras: A list of optional components being requested.
        :param env: An optional environment for marker evaluation.
        """
        if self._legacy:
            result = reqts
        else:
            result = []
            extras = get_extras(extras or [], self.extras)
            for d in reqts:
                if 'extra' not in d and 'environment' not in d:
                    # unconditional
                    include = True
                else:
                    if 'extra' not in d:
                        # Not extra-dependent - only environment-dependent
                        include = True
                    else:
                        include = d.get('extra') in extras
                    if include:
                        # Not excluded because of extras, check environment
                        marker = d.get('environment')
                        if marker:
                            include = interpret(marker, env)
                if include:
                    result.extend(d['requires'])
            for key in ('build', 'dev', 'test'):
                e = ':%s:' % key
                if e in extras:
                    extras.remove(e)
                    # A recursive call, but it should terminate since 'test'
                    # has been removed from the extras
                    reqts = self._data.get('%s_requires' % key, [])
                    result.extend(self.get_requirements(reqts, extras=extras,
                                                        env=env))
        return result

    @property
    def dictionary(self):
        if self._legacy:
            return self._from_legacy()
        return self._data

    @property
    def dependencies(self):
        if self._legacy:
            raise NotImplementedError
        else:
            return extract_by_key(self._data, self.DEPENDENCY_KEYS)

    @dependencies.setter
    def dependencies(self, value):
        if self._legacy:
            raise NotImplementedError
        else:
            self._data.update(value)

    def _validate_mapping(self, mapping, scheme):
        if mapping.get('metadata_version') != self.METADATA_VERSION:
            raise MetadataUnrecognizedVersionError()
        missing = []
        for key, exclusions in self.MANDATORY_KEYS.items():
            if key not in mapping:
                if scheme not in exclusions:
                    missing.append(key)
        if missing:
            msg = 'Missing metadata items: %s' % ', '.join(missing)
            raise MetadataMissingError(msg)
        for k, v in mapping.items():
            self._validate_value(k, v, scheme)

    def validate(self):
        if self._legacy:
            missing, warnings = self._legacy.check(True)
            if missing or warnings:
                logger.warning('Metadata: missing: %s, warnings: %s',
                               missing, warnings)
        else:
            self._validate_mapping(self._data, self.scheme)

    def todict(self):
        if self._legacy:
            return self._legacy.todict(True)
        else:
            result = extract_by_key(self._data, self.INDEX_KEYS)
            return result

    def _from_legacy(self):
        assert self._legacy and not self._data
        result = {
            'metadata_version': self.METADATA_VERSION,
            'generator': self.GENERATOR,
        }
        lmd = self._legacy.todict(True)     # skip missing ones
        for k in ('name', 'version', 'license', 'summary', 'description',
                  'classifier'):
            if k in lmd:
                if k == 'classifier':
                    nk = 'classifiers'
                else:
                    nk = k
                result[nk] = lmd[k]
        kw = lmd.get('Keywords', [])
        if kw == ['']:
            kw = []
        result['keywords'] = kw
        keys = (('requires_dist', 'run_requires'),
                ('setup_requires_dist', 'build_requires'))
        for ok, nk in keys:
            if ok in lmd and lmd[ok]:
                result[nk] = [{'requires': lmd[ok]}]
        result['provides'] = self.provides
        author = {}
        maintainer = {}
        return result

    LEGACY_MAPPING = {
        'name': 'Name',
        'version': 'Version',
        'license': 'License',
        'summary': 'Summary',
        'description': 'Description',
        'classifiers': 'Classifier',
    }

    def _to_legacy(self):
        def process_entries(entries):
            reqts = set()
            for e in entries:
                extra = e.get('extra')
                env = e.get('environment')
                rlist = e['requires']
                for r in rlist:
                    if not env and not extra:
                        reqts.add(r)
                    else:
                        marker = ''
                        if extra:
                            marker = 'extra == "%s"' % extra
                        if env:
                            if marker:
                                marker = '(%s) and %s' % (env, marker)
                            else:
                                marker = env
                        reqts.add(';'.join((r, marker)))
            return reqts

        assert self._data and not self._legacy
        result = LegacyMetadata()
        nmd = self._data
        for nk, ok in self.LEGACY_MAPPING.items():
            if nk in nmd:
                result[ok] = nmd[nk]
        r1 = process_entries(self.run_requires + self.meta_requires)
        r2 = process_entries(self.build_requires + self.dev_requires)
        if self.extras:
            result['Provides-Extra'] = sorted(self.extras)
        result['Requires-Dist'] = sorted(r1)
        result['Setup-Requires-Dist'] = sorted(r2)
        # TODO: other fields such as contacts
        return result

    def write(self, path=None, fileobj=None, legacy=False, skip_unknown=True):
        if [path, fileobj].count(None) != 1:
            raise ValueError('Exactly one of path and fileobj is needed')
        self.validate()
        if legacy:
            if self._legacy:
                legacy_md = self._legacy
            else:
                legacy_md = self._to_legacy()
            if path:
                legacy_md.write(path, skip_unknown=skip_unknown)
            else:
                legacy_md.write_file(fileobj, skip_unknown=skip_unknown)
        else:
            if self._legacy:
                d = self._from_legacy()
            else:
                d = self._data
            if fileobj:
                json.dump(d, fileobj, ensure_ascii=True, indent=2,
                          sort_keys=True)
            else:
                with codecs.open(path, 'w', 'utf-8') as f:
                    json.dump(d, f, ensure_ascii=True, indent=2,
                              sort_keys=True)

    def add_requirements(self, requirements):
        if self._legacy:
            self._legacy.add_requirements(requirements)
        else:
            run_requires = self._data.setdefault('run_requires', [])
            always = None
            for entry in run_requires:
                if 'environment' not in entry and 'extra' not in entry:
                    always = entry
                    break
            if always is None:
                always = { 'requires': requirements }
                run_requires.insert(0, always)
            else:
                rset = set(always['requires']) | set(requirements)
                always['requires'] = sorted(rset)

    def __repr__(self):
        name = self.name or '(no name)'
        version = self.version or 'no version'
        return '<%s %s %s (%s)>' % (self.__class__.__name__,
                                    self.metadata_version, name, version)
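

# Illustrative sketch (not part of the original module): create an in-memory
# 2.0-style Metadata instance and resolve its unconditional requirements.
def _metadata_example():
    md = Metadata(mapping={'metadata_version': '2.0',
                           'name': 'demo',
                           'version': '1.0',
                           'summary': 'A demo distribution',
                           'run_requires': [{'requires': ['requests']}]})
    return md.get_requirements(md.run_requires)   # -> ['requests']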
site-packages/pip/_vendor/distlib/_backport/__init__.py
"""Modules copied from Python 3 standard libraries, for internal use only.

Individual classes and functions are found in d2._backport.misc.  Intended
usage is to always import things missing from 3.1 from that module: the
built-in/stdlib objects will be used if found.
"""
site-packages/pip/_vendor/distlib/_backport/shutil.py
# -*- coding: utf-8 -*-
#
# Copyright (C) 2012 The Python Software Foundation.
# See LICENSE.txt and CONTRIBUTORS.txt.
#
"""Utility functions for copying and archiving files and directory trees.

XXX The functions here don't copy the resource fork or other metadata on Mac.

"""

import os
import sys
import stat
from os.path import abspath
import fnmatch
import collections
import errno
from . import tarfile

try:
    import bz2
    _BZ2_SUPPORTED = True
except ImportError:
    _BZ2_SUPPORTED = False

try:
    from pwd import getpwnam
except ImportError:
    getpwnam = None

try:
    from grp import getgrnam
except ImportError:
    getgrnam = None

__all__ = ["copyfileobj", "copyfile", "copymode", "copystat", "copy", "copy2",
           "copytree", "move", "rmtree", "Error", "SpecialFileError",
           "ExecError", "make_archive", "get_archive_formats",
           "register_archive_format", "unregister_archive_format",
           "get_unpack_formats", "register_unpack_format",
           "unregister_unpack_format", "unpack_archive", "ignore_patterns"]

class Error(EnvironmentError):
    pass

class SpecialFileError(EnvironmentError):
    """Raised when trying to do a kind of operation (e.g. copying) which is
    not supported on a special file (e.g. a named pipe)"""

class ExecError(EnvironmentError):
    """Raised when a command could not be executed"""

class ReadError(EnvironmentError):
    """Raised when an archive cannot be read"""

class RegistryError(Exception):
    """Raised when a registry operation with the archiving
    and unpacking registries fails"""


try:
    WindowsError
except NameError:
    WindowsError = None

def copyfileobj(fsrc, fdst, length=16*1024):
    """copy data from file-like object fsrc to file-like object fdst"""
    while 1:
        buf = fsrc.read(length)
        if not buf:
            break
        fdst.write(buf)

def _samefile(src, dst):
    # Macintosh, Unix.
    if hasattr(os.path, 'samefile'):
        try:
            return os.path.samefile(src, dst)
        except OSError:
            return False

    # All other platforms: check for same pathname.
    return (os.path.normcase(os.path.abspath(src)) ==
            os.path.normcase(os.path.abspath(dst)))

def copyfile(src, dst):
    """Copy data from src to dst"""
    if _samefile(src, dst):
        raise Error("`%s` and `%s` are the same file" % (src, dst))

    for fn in [src, dst]:
        try:
            st = os.stat(fn)
        except OSError:
            # File most likely does not exist
            pass
        else:
            # XXX What about other special files? (sockets, devices...)
            if stat.S_ISFIFO(st.st_mode):
                raise SpecialFileError("`%s` is a named pipe" % fn)

    with open(src, 'rb') as fsrc:
        with open(dst, 'wb') as fdst:
            copyfileobj(fsrc, fdst)

def copymode(src, dst):
    """Copy mode bits from src to dst"""
    if hasattr(os, 'chmod'):
        st = os.stat(src)
        mode = stat.S_IMODE(st.st_mode)
        os.chmod(dst, mode)

def copystat(src, dst):
    """Copy all stat info (mode bits, atime, mtime, flags) from src to dst"""
    st = os.stat(src)
    mode = stat.S_IMODE(st.st_mode)
    if hasattr(os, 'utime'):
        os.utime(dst, (st.st_atime, st.st_mtime))
    if hasattr(os, 'chmod'):
        os.chmod(dst, mode)
    if hasattr(os, 'chflags') and hasattr(st, 'st_flags'):
        try:
            os.chflags(dst, st.st_flags)
        except OSError as why:
            if (not hasattr(errno, 'EOPNOTSUPP') or
                why.errno != errno.EOPNOTSUPP):
                raise

def copy(src, dst):
    """Copy data and mode bits ("cp src dst").

    The destination may be a directory.

    """
    if os.path.isdir(dst):
        dst = os.path.join(dst, os.path.basename(src))
    copyfile(src, dst)
    copymode(src, dst)

def copy2(src, dst):
    """Copy data and all stat info ("cp -p src dst").

    The destination may be a directory.

    """
    if os.path.isdir(dst):
        dst = os.path.join(dst, os.path.basename(src))
    copyfile(src, dst)
    copystat(src, dst)

def ignore_patterns(*patterns):
    """Function that can be used as copytree() ignore parameter.

    Patterns is a sequence of glob-style patterns
    that are used to exclude files"""
    def _ignore_patterns(path, names):
        ignored_names = []
        for pattern in patterns:
            ignored_names.extend(fnmatch.filter(names, pattern))
        return set(ignored_names)
    return _ignore_patterns

def copytree(src, dst, symlinks=False, ignore=None, copy_function=copy2,
             ignore_dangling_symlinks=False):
    """Recursively copy a directory tree.

    The destination directory must not already exist.
    If exception(s) occur, an Error is raised with a list of reasons.

    If the optional symlinks flag is true, symbolic links in the
    source tree result in symbolic links in the destination tree; if
    it is false, the contents of the files pointed to by symbolic
    links are copied. If the file pointed to by the symlink doesn't
    exist, an exception will be added in the list of errors raised in
    an Error exception at the end of the copy process.

    You can set the optional ignore_dangling_symlinks flag to true if you
    want to silence this exception. Notice that this has no effect on
    platforms that don't support os.symlink.

    The optional ignore argument is a callable. If given, it
    is called with the `src` parameter, which is the directory
    being visited by copytree(), and `names` which is the list of
    `src` contents, as returned by os.listdir():

        callable(src, names) -> ignored_names

    Since copytree() is called recursively, the callable will be
    called once for each directory that is copied. It returns a
    list of names relative to the `src` directory that should
    not be copied.

    The optional copy_function argument is a callable that will be used
    to copy each file. It will be called with the source path and the
    destination path as arguments. By default, copy2() is used, but any
    function that supports the same signature (like copy()) can be used.

    """
    names = os.listdir(src)
    if ignore is not None:
        ignored_names = ignore(src, names)
    else:
        ignored_names = set()

    os.makedirs(dst)
    errors = []
    for name in names:
        if name in ignored_names:
            continue
        srcname = os.path.join(src, name)
        dstname = os.path.join(dst, name)
        try:
            if os.path.islink(srcname):
                linkto = os.readlink(srcname)
                if symlinks:
                    os.symlink(linkto, dstname)
                else:
                    # ignore dangling symlink if the flag is on
                    if not os.path.exists(linkto) and ignore_dangling_symlinks:
                        continue
                    # otherwise let the copy occur. copy2 will raise an error
                    copy_function(srcname, dstname)
            elif os.path.isdir(srcname):
                copytree(srcname, dstname, symlinks, ignore, copy_function)
            else:
                # Will raise a SpecialFileError for unsupported file types
                copy_function(srcname, dstname)
        # catch the Error from the recursive copytree so that we can
        # continue with other files
        except Error as err:
            errors.extend(err.args[0])
        except EnvironmentError as why:
            errors.append((srcname, dstname, str(why)))
    try:
        copystat(src, dst)
    except OSError as why:
        if WindowsError is not None and isinstance(why, WindowsError):
            # Copying file access times may fail on Windows
            pass
        else:
            errors.append((src, dst, str(why)))
    if errors:
        raise Error(errors)
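

# Illustrative sketch (not part of the original module): copy a tree while
# skipping editor backups and byte-compiled files; the paths are placeholders.
def _copytree_example(src='project', dst='project_copy'):
    copytree(src, dst, ignore=ignore_patterns('*.pyc', '*~'))
    return dst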

def rmtree(path, ignore_errors=False, onerror=None):
    """Recursively delete a directory tree.

    If ignore_errors is set, errors are ignored; otherwise, if onerror
    is set, it is called to handle the error with arguments (func,
    path, exc_info) where func is os.listdir, os.remove, or os.rmdir;
    path is the argument to that function that caused it to fail; and
    exc_info is a tuple returned by sys.exc_info().  If ignore_errors
    is false and onerror is None, an exception is raised.

    """
    if ignore_errors:
        def onerror(*args):
            pass
    elif onerror is None:
        def onerror(*args):
            raise
    try:
        if os.path.islink(path):
            # symlinks to directories are forbidden, see bug #1669
            raise OSError("Cannot call rmtree on a symbolic link")
    except OSError:
        onerror(os.path.islink, path, sys.exc_info())
        # can't continue even if onerror hook returns
        return
    names = []
    try:
        names = os.listdir(path)
    except os.error:
        onerror(os.listdir, path, sys.exc_info())
    for name in names:
        fullname = os.path.join(path, name)
        try:
            mode = os.lstat(fullname).st_mode
        except os.error:
            mode = 0
        if stat.S_ISDIR(mode):
            rmtree(fullname, ignore_errors, onerror)
        else:
            try:
                os.remove(fullname)
            except os.error:
                onerror(os.remove, fullname, sys.exc_info())
    try:
        os.rmdir(path)
    except os.error:
        onerror(os.rmdir, path, sys.exc_info())
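

# Illustrative sketch (not part of the original module): remove a tree but
# report failures instead of raising; the path is a placeholder.
def _rmtree_example(path='build'):
    def onerror(func, failed_path, exc_info):
        sys.stderr.write('%s failed on %s\n' % (func.__name__, failed_path))
    rmtree(path, onerror=onerror)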


def _basename(path):
    # A basename() variant which first strips the trailing slash, if present.
    # Thus we always get the last component of the path, even for directories.
    return os.path.basename(path.rstrip(os.path.sep))

def move(src, dst):
    """Recursively move a file or directory to another location. This is
    similar to the Unix "mv" command.

    If the destination is a directory or a symlink to a directory, the source
    is moved inside the directory. The destination path must not already
    exist.

    If the destination already exists but is not a directory, it may be
    overwritten depending on os.rename() semantics.

    If the destination is on our current filesystem, then rename() is used.
    Otherwise, src is copied to the destination and then removed.
    A lot more could be done here...  A look at a mv.c shows a lot of
    the issues this implementation glosses over.

    """
    real_dst = dst
    if os.path.isdir(dst):
        if _samefile(src, dst):
            # We might be on a case insensitive filesystem,
            # perform the rename anyway.
            os.rename(src, dst)
            return

        real_dst = os.path.join(dst, _basename(src))
        if os.path.exists(real_dst):
            raise Error("Destination path '%s' already exists" % real_dst)
    try:
        os.rename(src, real_dst)
    except OSError:
        if os.path.isdir(src):
            if _destinsrc(src, dst):
                raise Error("Cannot move a directory '%s' into itself '%s'." % (src, dst))
            copytree(src, real_dst, symlinks=True)
            rmtree(src)
        else:
            copy2(src, real_dst)
            os.unlink(src)

def _destinsrc(src, dst):
    src = abspath(src)
    dst = abspath(dst)
    if not src.endswith(os.path.sep):
        src += os.path.sep
    if not dst.endswith(os.path.sep):
        dst += os.path.sep
    return dst.startswith(src)

def _get_gid(name):
    """Returns a gid, given a group name."""
    if getgrnam is None or name is None:
        return None
    try:
        result = getgrnam(name)
    except KeyError:
        result = None
    if result is not None:
        return result[2]
    return None

def _get_uid(name):
    """Returns an uid, given a user name."""
    if getpwnam is None or name is None:
        return None
    try:
        result = getpwnam(name)
    except KeyError:
        result = None
    if result is not None:
        return result[2]
    return None

def _make_tarball(base_name, base_dir, compress="gzip", verbose=0, dry_run=0,
                  owner=None, group=None, logger=None):
    """Create a (possibly compressed) tar file from all the files under
    'base_dir'.

    'compress' must be "gzip" (the default), "bzip2", or None.

    'owner' and 'group' can be used to define an owner and a group for the
    archive that is being built. If not provided, the current owner and group
    will be used.

    The output tar file will be named 'base_name' +  ".tar", possibly plus
    the appropriate compression extension (".gz", or ".bz2").

    Returns the output filename.
    """
    tar_compression = {'gzip': 'gz', None: ''}
    compress_ext = {'gzip': '.gz'}

    if _BZ2_SUPPORTED:
        tar_compression['bzip2'] = 'bz2'
        compress_ext['bzip2'] = '.bz2'

    # flags for compression program, each element of list will be an argument
    if compress is not None and compress not in compress_ext:
        raise ValueError("bad value for 'compress', or compression format not "
                         "supported : {0}".format(compress))

    archive_name = base_name + '.tar' + compress_ext.get(compress, '')
    archive_dir = os.path.dirname(archive_name)

    if not os.path.exists(archive_dir):
        if logger is not None:
            logger.info("creating %s", archive_dir)
        if not dry_run:
            os.makedirs(archive_dir)

    # creating the tarball
    if logger is not None:
        logger.info('Creating tar archive')

    uid = _get_uid(owner)
    gid = _get_gid(group)

    def _set_uid_gid(tarinfo):
        if gid is not None:
            tarinfo.gid = gid
            tarinfo.gname = group
        if uid is not None:
            tarinfo.uid = uid
            tarinfo.uname = owner
        return tarinfo

    if not dry_run:
        tar = tarfile.open(archive_name, 'w|%s' % tar_compression[compress])
        try:
            tar.add(base_dir, filter=_set_uid_gid)
        finally:
            tar.close()

    return archive_name
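
# Note: the owner/group remapping above is applied through tarfile's 'filter'
# hook (see tar.add(..., filter=_set_uid_gid)).  A standalone sketch of the
# same mechanism, with illustrative names, using the stdlib tarfile module:
#
#   import tarfile
#
#   def as_root(tarinfo):
#       tarinfo.uid = tarinfo.gid = 0
#       tarinfo.uname = tarinfo.gname = 'root'
#       return tarinfo
#
#   with tarfile.open('example.tar.gz', 'w:gz') as tar:
#       tar.add('some_dir', filter=as_root)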

def _call_external_zip(base_dir, zip_filename, verbose=False, dry_run=False):
    # XXX see if we want to keep an external call here
    if verbose:
        zipoptions = "-r"
    else:
        zipoptions = "-rq"
    from distutils.errors import DistutilsExecError
    from distutils.spawn import spawn
    try:
        spawn(["zip", zipoptions, zip_filename, base_dir], dry_run=dry_run)
    except DistutilsExecError:
        # XXX really should distinguish between "couldn't find
        # external 'zip' command" and "zip failed".
        raise ExecError("unable to create zip file '%s': "
            "could neither import the 'zipfile' module nor "
            "find a standalone zip utility" % zip_filename)

def _make_zipfile(base_name, base_dir, verbose=0, dry_run=0, logger=None):
    """Create a zip file from all the files under 'base_dir'.

    The output zip file will be named 'base_name' + ".zip".  Uses either the
    "zipfile" Python module (if available) or the InfoZIP "zip" utility
    (if installed and found on the default search path).  If neither tool is
    available, raises ExecError.  Returns the name of the output zip
    file.
    """
    zip_filename = base_name + ".zip"
    archive_dir = os.path.dirname(base_name)

    if not os.path.exists(archive_dir):
        if logger is not None:
            logger.info("creating %s", archive_dir)
        if not dry_run:
            os.makedirs(archive_dir)

    # If zipfile module is not available, try spawning an external 'zip'
    # command.
    try:
        import zipfile
    except ImportError:
        zipfile = None

    if zipfile is None:
        _call_external_zip(base_dir, zip_filename, verbose, dry_run)
    else:
        if logger is not None:
            logger.info("creating '%s' and adding '%s' to it",
                        zip_filename, base_dir)

        if not dry_run:
            zip = zipfile.ZipFile(zip_filename, "w",
                                  compression=zipfile.ZIP_DEFLATED)

            for dirpath, dirnames, filenames in os.walk(base_dir):
                for name in filenames:
                    path = os.path.normpath(os.path.join(dirpath, name))
                    if os.path.isfile(path):
                        zip.write(path, path)
                        if logger is not None:
                            logger.info("adding '%s'", path)
            zip.close()

    return zip_filename
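
# Standalone sketch of the same os.walk()/ZipFile.write() loop used above
# (file names are illustrative):
#
#   import os
#   import zipfile
#
#   with zipfile.ZipFile('out.zip', 'w', zipfile.ZIP_DEFLATED) as zf:
#       for dirpath, dirnames, filenames in os.walk('some_dir'):
#           for fname in filenames:
#               fpath = os.path.join(dirpath, fname)
#               if os.path.isfile(fpath):
#                   zf.write(fpath, fpath)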

_ARCHIVE_FORMATS = {
    'gztar': (_make_tarball, [('compress', 'gzip')], "gzip'ed tar-file"),
    'tar':   (_make_tarball, [('compress', None)], "uncompressed tar file"),
    'zip':   (_make_zipfile, [], "ZIP file"),
    }

if _BZ2_SUPPORTED:
    _ARCHIVE_FORMATS['bztar'] = (_make_tarball, [('compress', 'bzip2')],
                                "bzip2'ed tar-file")

def get_archive_formats():
    """Returns a list of supported formats for archiving and unarchiving.

    Each element of the returned sequence is a tuple (name, description)
    """
    formats = [(name, registry[2]) for name, registry in
               _ARCHIVE_FORMATS.items()]
    formats.sort()
    return formats
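
# Usage sketch:
#
#   for name, description in get_archive_formats():
#       print('%-8s %s' % (name, description))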

def register_archive_format(name, function, extra_args=None, description=''):
    """Registers an archive format.

    name is the name of the format. function is the callable that will be
    used to create archives. If provided, extra_args is a sequence of
    (name, value) tuples that will be passed as arguments to the callable.
    description can be provided to describe the format, and will be returned
    by the get_archive_formats() function.
    """
    if extra_args is None:
        extra_args = []
    if not isinstance(function, collections.Callable):
        raise TypeError('The %s object is not callable' % function)
    if not isinstance(extra_args, (tuple, list)):
        raise TypeError('extra_args needs to be a sequence')
    for element in extra_args:
        if not isinstance(element, (tuple, list)) or len(element) != 2:
            raise TypeError('extra_args elements are : (arg_name, value)')

    _ARCHIVE_FORMATS[name] = (function, extra_args, description)
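
# Usage sketch: registering a custom archiver.  'make_txt_bundle' is a
# hypothetical callable; it only needs to accept (base_name, base_dir,
# **kwargs) and return the name of the file it created.
#
#   def make_txt_bundle(base_name, base_dir, dry_run=0, logger=None, **kwargs):
#       out = base_name + '.txt'
#       if not dry_run:
#           with open(out, 'w') as f:
#               f.write('\n'.join(sorted(os.listdir(base_dir))))
#       return out
#
#   register_archive_format('txtbundle', make_txt_bundle,
#                           description='newline-separated file listing')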

def unregister_archive_format(name):
    del _ARCHIVE_FORMATS[name]

def make_archive(base_name, format, root_dir=None, base_dir=None, verbose=0,
                 dry_run=0, owner=None, group=None, logger=None):
    """Create an archive file (eg. zip or tar).

    'base_name' is the name of the file to create, minus any format-specific
    extension; 'format' is the archive format: one of "zip", "tar", "bztar"
    or "gztar".

    'root_dir' is a directory that will be the root directory of the
    archive; ie. we typically chdir into 'root_dir' before creating the
    archive.  'base_dir' is the directory where we start archiving from;
    ie. 'base_dir' will be the common prefix of all files and
    directories in the archive.  'root_dir' and 'base_dir' both default
    to the current directory.  Returns the name of the archive file.

    'owner' and 'group' are used when creating a tar archive. By default,
    uses the current owner and group.
    """
    save_cwd = os.getcwd()
    if root_dir is not None:
        if logger is not None:
            logger.debug("changing into '%s'", root_dir)
        base_name = os.path.abspath(base_name)
        if not dry_run:
            os.chdir(root_dir)

    if base_dir is None:
        base_dir = os.curdir

    kwargs = {'dry_run': dry_run, 'logger': logger}

    try:
        format_info = _ARCHIVE_FORMATS[format]
    except KeyError:
        raise ValueError("unknown archive format '%s'" % format)

    func = format_info[0]
    for arg, val in format_info[1]:
        kwargs[arg] = val

    if format != 'zip':
        kwargs['owner'] = owner
        kwargs['group'] = group

    try:
        filename = func(base_name, base_dir, **kwargs)
    finally:
        if root_dir is not None:
            if logger is not None:
                logger.debug("changing back to '%s'", save_cwd)
            os.chdir(save_cwd)

    return filename
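
# Usage sketch (paths are illustrative):
#
#   # ./project-1.0.tar.gz with archive members rooted at 'project-1.0/'
#   make_archive('project-1.0', 'gztar', root_dir='.', base_dir='project-1.0')
#
#   # site.zip containing everything under /srv/site
#   make_archive('site', 'zip', root_dir='/srv/site')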


def get_unpack_formats():
    """Returns a list of supported formats for unpacking.

    Each element of the returned sequence is a tuple
    (name, extensions, description)
    """
    formats = [(name, info[0], info[3]) for name, info in
               _UNPACK_FORMATS.items()]
    formats.sort()
    return formats

def _check_unpack_options(extensions, function, extra_args):
    """Checks what gets registered as an unpacker."""
    # first make sure no other unpacker is registered for this extension
    existing_extensions = {}
    for name, info in _UNPACK_FORMATS.items():
        for ext in info[0]:
            existing_extensions[ext] = name

    for extension in extensions:
        if extension in existing_extensions:
            msg = '%s is already registered for "%s"'
            raise RegistryError(msg % (extension,
                                       existing_extensions[extension]))

    if not isinstance(function, collections.Callable):
        raise TypeError('The registered function must be a callable')


def register_unpack_format(name, extensions, function, extra_args=None,
                           description=''):
    """Registers an unpack format.

    `name` is the name of the format. `extensions` is a list of extensions
    corresponding to the format.

    `function` is the callable that will be
    used to unpack archives. The callable will receive archives to unpack.
    If it's unable to handle an archive, it needs to raise a ReadError
    exception.

    If provided, `extra_args` is a sequence of
    (name, value) tuples that will be passed as arguments to the callable.
    description can be provided to describe the format, and will be returned
    by the get_unpack_formats() function.
    """
    if extra_args is None:
        extra_args = []
    _check_unpack_options(extensions, function, extra_args)
    _UNPACK_FORMATS[name] = extensions, function, extra_args, description
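
# Usage sketch: registering a custom unpacker.  'unpack_txt_bundle' is a
# hypothetical callable; it is called as function(filename, extract_dir,
# **extra_args) and should raise ReadError for archives it cannot handle.
#
#   def unpack_txt_bundle(filename, extract_dir):
#       if not filename.endswith('.txt'):
#           raise ReadError('%s is not a txt bundle' % filename)
#       # ... create the listed files under extract_dir ...
#
#   register_unpack_format('txtbundle', ['.txt'], unpack_txt_bundle)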

def unregister_unpack_format(name):
    """Removes the unpack format from the registry."""
    del _UNPACK_FORMATS[name]

def _ensure_directory(path):
    """Ensure that the parent directory of `path` exists"""
    dirname = os.path.dirname(path)
    if not os.path.isdir(dirname):
        os.makedirs(dirname)

def _unpack_zipfile(filename, extract_dir):
    """Unpack zip `filename` to `extract_dir`
    """
    try:
        import zipfile
    except ImportError:
        raise ReadError('zipfile module not available, cannot unpack this archive.')

    if not zipfile.is_zipfile(filename):
        raise ReadError("%s is not a zip file" % filename)

    zip = zipfile.ZipFile(filename)
    try:
        for info in zip.infolist():
            name = info.filename

            # don't extract absolute paths or ones with .. in them
            if name.startswith('/') or '..' in name:
                continue

            target = os.path.join(extract_dir, *name.split('/'))
            if not target:
                continue

            _ensure_directory(target)
            if not name.endswith('/'):
                # file
                data = zip.read(info.filename)
                f = open(target, 'wb')
                try:
                    f.write(data)
                finally:
                    f.close()
                    del data
    finally:
        zip.close()

def _unpack_tarfile(filename, extract_dir):
    """Unpack tar/tar.gz/tar.bz2 `filename` to `extract_dir`
    """
    try:
        tarobj = tarfile.open(filename)
    except tarfile.TarError:
        raise ReadError(
            "%s is not a compressed or uncompressed tar file" % filename)
    try:
        tarobj.extractall(extract_dir)
    finally:
        tarobj.close()

_UNPACK_FORMATS = {
    'gztar': (['.tar.gz', '.tgz'], _unpack_tarfile, [], "gzip'ed tar-file"),
    'tar':   (['.tar'], _unpack_tarfile, [], "uncompressed tar file"),
    'zip':   (['.zip'], _unpack_zipfile, [], "ZIP file")
    }

if _BZ2_SUPPORTED:
    _UNPACK_FORMATS['bztar'] = (['.bz2'], _unpack_tarfile, [],
                                "bzip2'ed tar-file")

def _find_unpack_format(filename):
    for name, info in _UNPACK_FORMATS.items():
        for extension in info[0]:
            if filename.endswith(extension):
                return name
    return None

def unpack_archive(filename, extract_dir=None, format=None):
    """Unpack an archive.

    `filename` is the name of the archive.

    `extract_dir` is the name of the target directory, where the archive
    is unpacked. If not provided, the current working directory is used.

    `format` is the archive format: one of "zip", "tar", or "gztar". Or any
    other registered format. If not provided, unpack_archive will use the
    filename extension and see if an unpacker was registered for that
    extension.

    If no unpacker is registered for the extension, a ReadError is raised;
    an explicitly given but unregistered `format` raises ValueError.
    """
    if extract_dir is None:
        extract_dir = os.getcwd()

    if format is not None:
        try:
            format_info = _UNPACK_FORMATS[format]
        except KeyError:
            raise ValueError("Unknown unpack format '{0}'".format(format))

        func = format_info[1]
        func(filename, extract_dir, **dict(format_info[2]))
    else:
        # we need to look at the registered unpackers supported extensions
        format = _find_unpack_format(filename)
        if format is None:
            raise ReadError("Unknown archive format '{0}'".format(filename))

        func = _UNPACK_FORMATS[format][1]
        kwargs = dict(_UNPACK_FORMATS[format][2])
        func(filename, extract_dir, **kwargs)
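
# Usage sketch (paths are illustrative):
#
#   unpack_archive('project-1.0.tar.gz', extract_dir='/tmp/build')  # by extension
#   unpack_archive('bundle.zip', '/tmp/build', format='zip')        # explicit format
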
site-packages/pip/_vendor/distlib/_backport/__pycache__/misc.cpython-36.opt-1.pyc  [binary compiled-bytecode (.pyc) data omitted]
site-packages/pip/_vendor/distlib/_backport/__pycache__/shutil.cpython-36.opt-1.pyc  [binary compiled-bytecode (.pyc) data omitted]
site-packages/pip/_vendor/distlib/_backport/__pycache__/sysconfig.cpython-36.opt-1.pyc  [binary compiled-bytecode (.pyc) data omitted]
site-packages/pip/_vendor/distlib/_backport/__pycache__/shutil.cpython-36.pyc  [binary compiled-bytecode (.pyc) data omitted]
site-packages/pip/_vendor/distlib/_backport/__pycache__/tarfile.cpython-36.opt-1.pyc000064400000172757147511334610024361 0ustar003

�Ub�i�@sNddlmZdZdZdZdZdZdZddlZddl	Z	ddl
Z
ddlZddlZddl
Z
ddlZddlZyddlZddlZWnek
r�dZZYnXeefZyeef7ZWnek
r�YnXd	d
ddgZejdd
kr�ddlZnddlZejZdZdZedZ dZ!dZ"dZ#dZ$dZ%dZ&dZ'dZ(dZ)dZ*dZ+dZ,dZ-dZ.dZ/dZ0dZ1d Z2d!Z3d"Z4dZ5d#Z6d$Z7e6Z8e&e'e(e)e,e-e.e*e+e/e0e1fZ9e&e'e.e1fZ:e/e0e1fZ;d�Z<e=d��Z>e?e?e?e@e@e@d-�ZAd.ZBd/ZCd0ZDd1ZEd2ZFd3ZGd4ZHd5ZIdZJd6ZKd7ZLd8ZMd9ZNd:ZOd;ZPd<ZQd$ZRd#ZSe	jTd�k�rd?ZUnejV�ZUd@dA�ZWdBdC�ZXdDdE�ZYd;e8fdFdG�ZZdHdI�Z[d�dJdK�Z\eBdLfeCdMfeDdNfeEdOfeFdPfeGdQffeKdRffeLdSffeMeHBdTfeHdUfeMdVffeNdRffeOdSffePeIBdTfeIdUfePdVffeQdRffeRdSffeSeJBdWfeJdXfeSdVfff
Z]dYdZ�Z^Gd[d�de_�Z`Gd\d]�d]e`�ZaGd^d_�d_e`�ZbGd`da�dae`�ZcGdbdc�dce`�ZdGddde�dee`�ZeGdfdg�dgee�ZfGdhdi�diee�ZgGdjdk�dkee�ZhGdldm�dmee�ZiGdndo�doee�ZjGdpdq�dqek�ZlGdrds�dsek�ZmGdtdu�duek�ZnGdvdw�dwek�ZoGdxdy�dyek�ZpGdzd{�d{ek�ZqGd|d
�d
ek�ZrGd}d	�d	ek�ZsGd~d�dek�Ztd�d�ZueZvesjZdS)��)�print_functionz
$Revision$z0.9.0u"Lars Gustäbel (lars@gustaebel.de)z5$Date: 2011-02-25 17:42:01 +0200 (Fri, 25 Feb 2011) $z?$Id: tarfile.py 88586 2011-02-25 15:42:01Z marc-andre.lemburg $u4Gustavo Niemeyer, Niels Gustäbel, Richard Townsend.N�TarFile�TarInfo�
is_tarfile�TarError��i�sustar  sustar00�d��0�1�2�3�4�5�6�7�L�K�S�x�g�X���path�linkpath�size�mtime�uid�gid�uname�gname)ZatimeZctimerr r!ri�i�i`i@i iii���@� ����nt�cezutf-8cCs(|j||�}|d|�|t|�tS)z8Convert a string to a null-terminated bytes object.
    N)�encode�len�NUL)�s�length�encoding�errors�r4�/usr/lib/python3.6/tarfile.py�stn�sr6cCs*|jd�}|dkr|d|�}|j||�S)z8Convert a null-terminated bytes object to a string.
    rrN���)�find�decode)r0r2r3�pr4r4r5�nts�s
r;cCs�|dtd�krJytt|dd�p"dd�}Wq�tk
rFtd��Yq�Xn:d}x4tt|�d�D] }|dK}|t||d�7}q`W|S)	z/Convert a number field to a python number.
    rr%�ascii�strict�0r)zinvalid headerr)�chr�intr;�
ValueError�InvalidHeaderError�ranger.�ord)r0�n�ir4r4r5�nti�srGcCs�d|kod|dknr<d|d|fjd�t}n�|tksT|d|dkr\td��|dkr|tjdtjd	|��d}t�}x,t|d�D]}|j	d|d
@�|dL}q�W|j	dd�|S)z/Convert a python number to a number field.
    rr)rz%0*or<r$zoverflow in number field�L�l�r%)
r-r/�
GNU_FORMATrA�struct�unpack�pack�	bytearrayrC�insert)rE�digits�formatr0rFr4r4r5�itn�s	 rScCshdttjd|dd��tjd|dd���}dttjd|dd��tjd	|dd���}||fS)
a�Calculate the checksum for a member's header by summing up all
       characters except for the chksum field which is treated as if
       it was filled with spaces. According to the GNU tar sources,
       some tars (Sun and NeXT) calculate chksum with signed char,
       which will be different if there are chars in the buffer with
       the high bit set. So we calculate two checksums, unsigned and
       signed.
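An illustrative sketch of the two sums described above (a hypothetical helper, not the module's own calc_chksums): the 8-byte chksum field at offsets 148-155 of the 512-byte header is treated as if it held spaces.

    def checksums(header):
        # header: the raw 512-byte tar header block
        blanked = header[:148] + b" " * 8 + header[156:512]
        unsigned = sum(blanked)                                   # unsigned char sum
        signed = sum(b - 256 if b > 127 else b for b in blanked)  # signed char sum
        return unsigned, signed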
    r$Z148BN�Z356B�iZ148bZ356b)�sumrLrM)�bufZunsigned_chksumZ
signed_chksumr4r4r5�calc_chksums�s	00rXcCs�|dkrdS|dkr8x|jd�}|s&P|j|�qWdSd}t||�\}}x8t|�D],}|j|�}t|�|krvtd��|j|�qTW|dkr�|j|�}t|�|kr�td��|j|�dS)zjCopy length bytes from fileobj src to fileobj dst.
       If length is None, copy the entire content.
    rNr(izend of file reachedi@i@)�read�write�divmodrCr.�IOError)�src�dstr1rWZBUFSIZE�blocks�	remainder�br4r4r5�copyfileobjs,



rbrI�-ra�d�cr:�r�wr0�S�x�t�TcCsPg}x@tD]8}x2|D] \}}||@|kr|j|�PqW|jd�q
Wdj|�S)zcConvert a file's mode to a string of the form
       -rwxrwxrwx.
       Used by TarFile.list()
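For orientation, the standard library exposes the same conversion as stat.filemode(); the values below are illustrative of what such a helper produces for common permission bits.

    import stat

    print(stat.filemode(0o100644))   # '-rw-r--r--'  (regular file)
    print(stat.filemode(0o040755))   # 'drwxr-xr-x'  (directory)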
    rc�)�filemode_table�append�join)�modeZperm�table�bit�charr4r4r5�filemode8s

rtc@seZdZdZdS)rzBase exception.N)�__name__�
__module__�__qualname__�__doc__r4r4r4r5rGsc@seZdZdZdS)�ExtractErrorz%General exception for extract errors.N)rurvrwrxr4r4r4r5ryJsryc@seZdZdZdS)�	ReadErrorz&Exception for unreadable tar archives.N)rurvrwrxr4r4r4r5rzMsrzc@seZdZdZdS)�CompressionErrorz.Exception for unavailable compression methods.N)rurvrwrxr4r4r4r5r{Psr{c@seZdZdZdS)�StreamErrorz=Exception for unsupported operations on stream-like TarFiles.N)rurvrwrxr4r4r4r5r|Ssr|c@seZdZdZdS)�HeaderErrorz!Base exception for header errors.N)rurvrwrxr4r4r4r5r}Vsr}c@seZdZdZdS)�EmptyHeaderErrorzException for empty headers.N)rurvrwrxr4r4r4r5r~Ysr~c@seZdZdZdS)�TruncatedHeaderErrorz Exception for truncated headers.N)rurvrwrxr4r4r4r5r\src@seZdZdZdS)�EOFHeaderErrorz"Exception for end of file headers.N)rurvrwrxr4r4r4r5r�_sr�c@seZdZdZdS)rBzException for invalid headers.N)rurvrwrxr4r4r4r5rBbsrBc@seZdZdZdS)�SubsequentHeaderErrorz3Exception for missing and invalid extended headers.N)rurvrwrxr4r4r4r5r�esr�c@s0eZdZdZdd�Zdd�Zdd�Zdd	�Zd
S)�
_LowLevelFilez�Low-level file object. Supports reading and writing.
       It is used instead of a regular file object for streaming
       access.
    cCsFtjtjtjBtjBd�|}ttd�r2|tjO}tj||d�|_dS)N)rfrg�O_BINARYi�)	�os�O_RDONLY�O_WRONLY�O_CREAT�O_TRUNC�hasattrr��open�fd)�self�namerpr4r4r5�__init__rs

z_LowLevelFile.__init__cCstj|j�dS)N)r��closer�)r�r4r4r5r�{sz_LowLevelFile.closecCstj|j|�S)N)r�rYr�)r�rr4r4r5rY~sz_LowLevelFile.readcCstj|j|�dS)N)r�rZr�)r�r0r4r4r5rZ�sz_LowLevelFile.writeN)rurvrwrxr�r�rYrZr4r4r4r5r�ls
	r�c@steZdZdZdd�Zdd�Zdd�Zdd	�Zd
d�Zdd
�Z	dd�Z
dd�Zddd�Zddd�Z
dd�Zdd�ZdS)�_Streama�Class that serves as an adapter between TarFile and
       a stream-like object.  The stream-like object only
       needs to have a read() or write() method and is accessed
       blockwise.  Use of gzip or bzip2 compression is possible.
       A stream-like object could be for example: sys.stdin,
       sys.stdout, a socket, a tape device etc.

       _Stream is intended to be used only internally.
    cCsRd|_|dkrt||�}d|_|dkr6t|�}|j�}|p<d|_||_||_||_||_d|_	d|_
d|_y�|dkr�yddl}Wnt
k
r�td	��YnX||_|jd�|_|d
kr�|j�n|j�|dk�r$yddl}Wnt
k
r�td��YnX|d
k�rd|_|j�|_n
|j�|_Wn&|j�s@|jj�d|_�YnXdS)
z$Construct a _Stream object.
        TNF�*rl�r�gzzzlib module is not availablerf�bz2zbz2 module is not available)�_extfileobjr��_StreamProxy�getcomptyper�rp�comptype�fileobj�bufsizerW�pos�closed�zlib�ImportErrorr{�crc32�crc�
_init_read_gz�_init_write_gzr��dbuf�BZ2Decompressor�cmp�
BZ2Compressorr�)r�r�rpr�r�r�r�r�r4r4r5r��sP





z_Stream.__init__cCst|d�r|jr|j�dS)Nr�)r�r�r�)r�r4r4r5�__del__�sz_Stream.__del__cCs�|jjd|jj|jj|jjd�|_tjdtt	j	���}|j
d|d�|jjd�rf|jdd�|_|j
|jj
d	d
�t�dS)z6Initialize for writing with gzip compression.
        �	rz<Ls�s�z.gzNrz
iso-8859-1�replace���)r�ZcompressobjZDEFLATED�	MAX_WBITSZ
DEF_MEM_LEVELr�rLrNr@�time�_Stream__writer��endswithr-r/)r�Z	timestampr4r4r5r��sz_Stream._init_write_gzcCsR|jdkr|jj||j�|_|jt|�7_|jdkrD|jj|�}|j|�dS)z&Write string s to the stream.
        r��tarN)	r�r�r�r�r�r.r��compressr�)r�r0r4r4r5rZ�s

z
_Stream.writecCsR|j|7_x>t|j�|jkrL|jj|jd|j��|j|jd�|_qWdS)z]Write string s to the stream if a whole new block
           is ready to be written.
        N)rWr.r�r�rZ)r�r0r4r4r5Z__write�sz_Stream.__writecCs�|jr
dS|jdkr2|jdkr2|j|jj�7_|jdkr�|jr�|jj|j�d|_|jdkr�|jjtj	d|j
d@��|jjtj	d|jd@��|js�|jj
�d|_dS)	z[Close the _Stream object. No operation should be
           done on it afterwards.
        Nrgr�r�r�z<Ll��T)r�rpr�rWr��flushr�rZrLrNr�r�r�r�)r�r4r4r5r��s

z
_Stream.closecCs�|jj|jj�|_d|_|jd�dkr0td��|jd�dkrFtd��t|jd��}|jd�|d	@r�t|jd��d
t|jd��}|j	|�|d@r�x|jd�}|s�|t
kr�Pq�W|d@r�x|jd�}|s�|t
kr�Pq�W|d@r�|jd�d
S)z:Initialize for reading a gzip compressed fileobj.
        r�rs�znot a gzip filer�zunsupported compression method�r*r$r)r(N)r�Z
decompressobjr�r�r��
_Stream__readrzr{rDrYr/)r��flagZxlenr0r4r4r5r�s.
 


z_Stream._init_read_gzcCs|jS)z3Return the stream's file pointer position.
        )r�)r�r4r4r5�tell#sz_Stream.tellrcCs\||jdkrNt||j|j�\}}xt|�D]}|j|j�q.W|j|�ntd��|jS)zXSet the stream's file pointer to pos. Negative seeking
           is forbidden.
        rz seeking backwards is not allowed)r�r[r�rCrYr|)r�r�r_r`rFr4r4r5�seek(sz_Stream.seekNcCsZ|dkr:g}x |j|j�}|s P|j|�qWdj|�}n
|j|�}|jt|�7_|S)z�Return the next size number of bytes from the stream.
           If size is not defined, return all bytes of the stream
           up to EOF.
        Nrl)�_readr�rnror�r.)r�rrjrWr4r4r5rY5s
z_Stream.readcCs�|jdkr|j|�St|j�}xf||kr�|j|j�}|s:Py|jj|�}Wntk
rftd��YnX|j|7_|t|�7}q W|jd|�}|j|d�|_|S)z+Return size bytes from the stream.
        r�zinvalid compressed dataN)	r�r�r.r�r�r��
decompressr\rz)r�rrerWr4r4r5r�Gs 



z
_Stream._readcCsht|j�}x:||krD|jj|j�}|s(P|j|7_|t|�7}qW|jd|�}|j|d�|_|S)zsReturn size bytes from stream. If internal buffer is empty,
           read another block from the stream.
        N)r.rWr�rYr�)r�rrerWr4r4r5Z__read\s

z_Stream.__read)r)N)rurvrwrxr�r�r�rZr�r�r�r�r�rYr�r�r4r4r4r5r��s	4
	

r�c@s0eZdZdZdd�Zdd�Zdd�Zdd	�Zd
S)r�zsSmall proxy class that enables transparent compression
       detection for the Stream interface (mode 'r|*').
    cCs||_|jjt�|_dS)N)r�rY�	BLOCKSIZErW)r�r�r4r4r5r�qsz_StreamProxy.__init__cCs|jj|_|jS)N)r�rYrW)r�rr4r4r5rYus
z_StreamProxy.readcCs$|jjd�rdS|jjd�r dSdS)Ns�r�sBZh91r�r�)rW�
startswith)r�r4r4r5r�ys
z_StreamProxy.getcomptypecCs|jj�dS)N)r�r�)r�r4r4r5r��sz_StreamProxy.closeN)rurvrwrxr�rYr�r�r4r4r4r5r�ls
r�c@sLeZdZdZdZdd�Zdd�Zdd	�Zd
d�Zdd
�Z	dd�Z
dd�ZdS)�	_BZ2ProxyaSmall proxy class that enables external file object
       support for "r:bz2" and "w:bz2" modes. This is actually
       a workaround for a limitation in bz2 module's BZ2File
       class which (unlike gzip.GzipFile) has no support for
       a file object argument.
    r(icCs(||_||_t|jdd�|_|j�dS)Nr�)r�rp�getattrr��init)r�r�rpr4r4r5r��sz_BZ2Proxy.__init__cCsDddl}d|_|jdkr6|j�|_|jjd�d|_n
|j�|_dS)Nrrfr�)	r�r�rpr��bz2objr�r�rWr�)r�r�r4r4r5r��s

z_BZ2Proxy.initcCs�t|j�}xF||krP|jj|j�}|s(P|jj|�}|j|7_|t|�7}qW|jd|�}|j|d�|_|jt|�7_|S)N)r.rWr�rY�	blocksizer�r�r�)r�rri�raw�datarWr4r4r5rY�s

z_BZ2Proxy.readcCs&||jkr|j�|j||j�dS)N)r�r�rY)r�r�r4r4r5r��s
z_BZ2Proxy.seekcCs|jS)N)r�)r�r4r4r5r��sz_BZ2Proxy.tellcCs.|jt|�7_|jj|�}|jj|�dS)N)r�r.r�r�r�rZ)r�r�r�r4r4r5rZ�sz_BZ2Proxy.writecCs$|jdkr |jj�}|jj|�dS)Nrg)rpr�r�r�rZ)r�r�r4r4r5r��s

z_BZ2Proxy.closeNi@)rurvrwrxr�r�r�rYr�r�rZr�r4r4r4r5r��s
r�c@s<eZdZdZd
dd�Zdd�Zdd�Zd	d
�Zddd�ZdS)�_FileInFilezA thin wrapper around an existing file object that
       provides a part of its data as an individual file
       object.
    NcCs�||_||_||_d|_|dkr*d|fg}d|_g|_d}|j}xT|D]L\}}||krj|jjd||df�|jjd||||f�||7}||}qFW||jkr�|jjd||jdf�dS)NrFT)r��offsetr�position�	map_index�maprn)r�r�r�rZ	blockinfoZlastposZrealposr4r4r5r��s$

z_FileInFile.__init__cCst|jd�sdS|jj�S)N�seekableT)r�r�r�)r�r4r4r5r��sz_FileInFile.seekablecCs|jS)z*Return the current file position.
        )r�)r�r4r4r5r��sz_FileInFile.tellcCs
||_dS)z(Seek to a position in the file.
        N)r�)r�r�r4r4r5r��sz_FileInFile.seekcCs�|dkr|j|j}nt||j|j�}d}x�|dkr�xZ|j|j\}}}}||jko`|knrjPq8|jd7_|jt|j�kr8d|_q8Wt|||j�}|r�|jj||j|�||jj|�7}n|t	|7}||8}|j|7_q.W|S)z!Read data from the file.
        Nr�rr)
rr��minr�r�r.r�r�rYr/)r�rrWr��start�stopr�r1r4r4r5rY�s(

z_FileInFile.read)N)N)	rurvrwrxr�r�r�r�rYr4r4r4r5r��s
r�c@szeZdZdZdZdd�Zdd�Zdd�Zd	d
�Zddd
�Z	e	Z
ddd�Zdd�Zdd�Z
ejfdd�Zdd�Zdd�ZdS)�ExFileObjectzaFile-like object for reading an archive member.
       Is returned by TarFile.extractfile().
    icCsDt|j|j|j|j�|_|j|_d|_d|_|j|_d|_d|_	dS)NrfFrr�)
r�r��offset_datar�sparser�rpr�r��buffer)r��tarfile�tarinfor4r4r5r�s
zExFileObject.__init__cCsdS)NTr4)r�r4r4r5�readable!szExFileObject.readablecCsdS)NFr4)r�r4r4r5�writable$szExFileObject.writablecCs
|jj�S)N)r�r�)r�r4r4r5r�'szExFileObject.seekableNcCs�|jrtd��d}|jrL|dkr.|j}d|_n|jd|�}|j|d�|_|dkrd||jj�7}n||jj|t|��7}|jt|�7_|S)z~Read at most size bytes from the file. If size is not
           present or None, read all data until EOF is reached.
        zI/O operation on closed filer�N)r�rAr�r�rYr.r�)r�rrWr4r4r5rY*szExFileObject.readrcCs�|jrtd��|jjd�d}|dkrzxR|jj|j�}|j|7_|sRd|kr(|jjd�d}|dkrtt|j�}Pq(W|dkr�t||�}|jd|�}|j|d�|_|j	t|�7_	|S)z�Read one entire line from the file. If size is present
           and non-negative, return a string with at most that
           size, which may be an incomplete line.
        zI/O operation on closed file�
rrNr7)
r�rAr�r8r�rYr�r.r�r�)r�rr�rWr4r4r5�readlineEs$

zExFileObject.readlinecCs&g}x|j�}|sP|j|�qW|S)z0Return a list with all remaining lines.
        )r�rn)r��result�liner4r4r5�	readlinesbszExFileObject.readlinescCs|jrtd��|jS)z*Return the current file position.
        zI/O operation on closed file)r�rAr�)r�r4r4r5r�lszExFileObject.tellcCs�|jrtd��|tjkr.tt|d�|j�|_nj|tjkrj|dkrTt|j|d�|_q�t|j||j�|_n.|tj	kr�tt|j||j�d�|_ntd��d|_
|jj|j�dS)z(Seek to a position in the file.
        zI/O operation on closed filerzInvalid argumentr�N)
r�rAr��SEEK_SETr��maxrr��SEEK_CUR�SEEK_ENDr�r�r�)r�r��whencer4r4r5r�ts


zExFileObject.seekcCs
d|_dS)zClose the file object.
        TN)r�)r�r4r4r5r��szExFileObject.closeccsx|j�}|sP|VqWdS)z/Get an iterator over the file's lines.
        N)r�)r�r�r4r4r5�__iter__�s
zExFileObject.__iter__)Nr7)r7)rurvrwrxr�r�r�r�r�rY�read1r�r�r�r�r�r�r�r�r4r4r4r5r�s



r�c@s�eZdZdZdiZdjdd�Zdd�Zdd�Zeee�Z	dd�Z
dd �Zee
e�Zd!d"�Z
d#d$�Zeed%fd&d'�Zd(d)�Zd*d+�Zd,d-�Zed.d/��Zd0d1�Zed2d3��Zed4d5��Zed6d7��Zed8d9��Zed:d;��Zed<d=��Zd>d?�Zd@dA�Z dBdC�Z!dDdE�Z"dFdG�Z#dHdI�Z$dJdK�Z%dLdM�Z&dNdO�Z'dPdQ�Z(dRdS�Z)dTdU�Z*dVdW�Z+dXdY�Z,dZd[�Z-d\d]�Z.d^d_�Z/d`da�Z0dbdc�Z1ddde�Z2dfdg�Z3dhS)kraInformational class which holds the details about an
       archive member given by a tar header block.
       TarInfo objects are returned by TarFile.getmember(),
       TarFile.getmembers() and TarFile.gettarinfo() and are
       usually created internally.
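A short sketch of building a member by hand and adding in-memory data to an archive, assuming the stdlib-compatible TarInfo/addfile API (names and payload are illustrative):

    import io
    import tarfile
    import time

    info = tarfile.TarInfo(name="notes/readme.txt")
    payload = b"hello tar\n"
    info.size = len(payload)          # size must match the data written
    info.mtime = int(time.time())

    with tarfile.open("notes.tar", "w") as tf:
        tf.addfile(info, io.BytesIO(payload))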
    r�rpr r!rr�chksum�type�linknamer"r#�devmajor�devminorr�r��pax_headersr�r��_sparse_structs�_link_targetrlcCsj||_d|_d|_d|_d|_d|_d|_t|_d|_	d|_
d|_d|_d|_
d|_d|_d|_i|_dS)zXConstruct a TarInfo object. name is the optional name
           of the member.
        i�rrlN)r�rpr r!rrr��REGTYPEr�r�r"r#r�r�r�r�r�r�)r�r�r4r4r5r��s"zTarInfo.__init__cCs|jS)N)r�)r�r4r4r5�_getpath�szTarInfo._getpathcCs
||_dS)N)r�)r�r�r4r4r5�_setpath�szTarInfo._setpathcCs|jS)N)r�)r�r4r4r5�_getlinkpath�szTarInfo._getlinkpathcCs
||_dS)N)r�)r�r�r4r4r5�_setlinkpath�szTarInfo._setlinkpathcCsd|jj|jt|�fS)Nz<%s %r at %#x>)�	__class__rur��id)r�r4r4r5�__repr__�szTarInfo.__repr__cCsn|j|jd@|j|j|j|j|j|j|j|j	|j
|j|jd�
}|dt
krj|djd�rj|dd7<|S)z9Return the TarInfo's attributes as a dictionary.
        i�)
r�rpr r!rrr�r�r�r"r#r�r�r�r��/)r�rpr r!rrr�r�r�r"r#r�r��DIRTYPEr�)r��infor4r4r5�get_info�s 
zTarInfo.get_info�surrogateescapecCsT|j�}|tkr|j|||�S|tkr4|j|||�S|tkrH|j||�Std��dS)z<Return a tar header as a string of 512 byte blocks.
        zinvalid formatN)r��USTAR_FORMAT�create_ustar_headerrK�create_gnu_header�
PAX_FORMAT�create_pax_headerrA)r�rRr2r3r�r4r4r5�tobuf�sz
TarInfo.tobufcCsZt|d<t|d�tkr td��t|d�tkrJ|j|d�\|d<|d<|j|t||�S)z3Return the object as a ustar header block.
        �magicr�zlinkname is too longr��prefix)�POSIX_MAGICr.�LENGTH_LINKrA�LENGTH_NAME�_posix_split_name�_create_headerr�)r�r�r2r3r4r4r5r��szTarInfo.create_ustar_headercCspt|d<d}t|d�tkr4||j|dt||�7}t|d�tkr\||j|dt||�7}||j|t||�S)z:Return the object as a GNU header block sequence.
        r�r�r�r�)	�	GNU_MAGICr.r��_create_gnu_long_header�GNUTYPE_LONGLINKr��GNUTYPE_LONGNAMErrK)r�r�r2r3rWr4r4r5r�szTarInfo.create_gnu_headerc
Cs4t|d<|jj�}x�ddtfddtfddfD]h\}}}||kr@q,y||jd	d
�Wn"tk
rv||||<w,YnXt||�|kr,||||<q,WxldD]d\}}||kr�d||<q�||}d|ko�d|dkn�s�t|t	�r�t
|�||<d||<q�W|�r|j|t|�}	nd}	|	|j
|td	d�S)z�Return the object as a ustar header block. If it cannot be
           represented this way, prepend a pax extended header sequence
           with supplement information.
        r�r�rr�rr"r'r#r<r=r r)r!r�rrrr�r�)r"r"r')r#r#r'�r r)�r!r)�rr�rr)rrr	r
)r�r��copyr�r�r-�UnicodeEncodeErrorr.�
isinstance�float�str�_create_pax_generic_header�XHDTYPErr�)
r�r�r2r�r�Zhnamer1rQ�valrWr4r4r5r�s4
.zTarInfo.create_pax_headercCs|j|td�S)zAReturn the object as a pax global header block sequence.
        �utf8)r�XGLTYPE)�clsr�r4r4r5�create_pax_global_headerDsz TarInfo.create_pax_global_headercCsp|dtd�}x |r0|ddkr0|dd�}qW|t|�d�}|dd�}|s`t|�tkrhtd��||fS)zUSplit a name longer than 100 chars into a prefix
           and a name part.
        Nrr�zname is too longr7r7r7)�
LENGTH_PREFIXr.r�rA)r�r�r�r4r4r5rJszTarInfo._posix_split_namecCsVt|jdd�d||�t|jdd�d@d|�t|jdd�d|�t|jd	d�d|�t|jd
d�d|�t|jdd�d|�d
|jdt�t|jdd�d||�|jdt�t|jdd�d||�t|jdd�d||�t|jdd�d|�t|jdd�d|�t|jdd�d||�g}tjdtdj|��}t	|td��d}|dd�d|j
d�|d d�}|S)!z�Return a header block. info is a dictionary with file
           information, format must be one of the *_FORMAT constants.
        r�rlr
rpri�r)r r!rrrs        r�r�r�r"r'r#r�r�r�rz%dsr�Nilz%06or<iei����i����)r6�getrSr�r�rLrNr�rorXr-)r�rRr2r3�partsrWr�r4r4r5rYs&

&zTarInfo._create_headercCs.tt|�t�\}}|dkr*|t|t7}|S)zdReturn the string payload filled with zero bytes
           up to the next 512 byte border.
        r)r[r.r�r/)Zpayloadr_r`r4r4r5�_create_payloaduszTarInfo._create_payloadcCsR|j||�t}i}d|d<||d<t|�|d<t|d<|j|t||�|j|�S)zTReturn a GNUTYPE_LONGNAME or GNUTYPE_LONGLINK sequence
           for name.
        z
././@LongLinkr�r�rr�)r-r/r.rrr�r)rr�r�r2r3r�r4r4r5rszTarInfo._create_gnu_long_headercCs:d}x@|j�D]4\}}y|jdd�Wqtk
r@d}PYqXqWd}|rV|d7}x�|j�D]�\}}|jd�}|r�|j|d�}n
|jd�}t|�t|�d}d	}	}
x"|tt|
��}	|	|
kr�P|	}
q�W|tt|
�d
�d|d|d
7}q`Wi}d|d<||d<t|�|d<t|d<|j|td
d�|j	|�S)z�Return a POSIX.1-2008 extended or global header sequence
           that contains a list of keyword, value pairs. The values
           must be strings.
        Frr=Tr�s21 hdrcharset=BINARY
r�rrr<� �=r�z././@PaxHeaderr�r�rr�r�)
�itemsr-rr.r�bytesr�rr�r)rr�r�r2Zbinary�keyword�valueZrecordsrIrEr:r�r4r4r5r�s<

*z"TarInfo._create_pax_generic_headercCstt|�dkrtd��t|�tkr(td��|jt�tkr>td��t|dd��}|t|�krbt	d��|�}t
|dd�||�|_t|dd	��|_t|d	d
��|_
t|d
d��|_t|dd��|_t|dd��|_||_|dd
�|_t
|d
d�||�|_t
|dd�||�|_t
|dd�||�|_t|dd��|_t|dd��|_t
|dd�||�}|jtk�r�|jjd��r�t|_|jtk�r6d}g}xrtd�D]f}	y0t|||d��}
t||d|d��}Wntk
�r�PYnX|j|
|f�|d7}�q�Wt|d�}t|dd��}
|||
f|_ |j!��rN|jj"d�|_|�rp|jt#k�rp|d|j|_|S)zAConstruct a TarInfo object from a 512 byte bytes object.
        rzempty headerztruncated headerzend of file headerrTrUzbad checksumr
�l�t�|��ii	i)iIiQiYi�r�i�r*r�i�i�i�)$r.r~r�r�countr/r�rGrXrBr;r�rpr r!rrr�r�r�r"r#r�r��AREGTYPEr�r��GNUTYPE_SPARSErCrArn�boolr��isdir�rstrip�	GNU_TYPES)rrWr2r3r��objr�r��structsrFr��numbytes�
isextended�origsizer4r4r5�frombuf�sZ
zTarInfo.frombufcCs8|jjt�}|j||j|j�}|jj�t|_|j|�S)zOReturn the next TarInfo object from TarFile object
           tarfile.
        )	r�rYr�r3r2r3r�r��_proc_member)rr�rWr.r4r4r5�fromtarfileszTarInfo.fromtarfilecCsT|jttfkr|j|�S|jtkr,|j|�S|jtttfkrF|j	|�S|j
|�SdS)zYChoose the right processing method depending on
           the type and call it.
        N)r�rr�
_proc_gnulongr)�_proc_sparserr�SOLARIS_XHDTYPE�	_proc_pax�
_proc_builtin)r�r�r4r4r5r4s



zTarInfo._proc_membercCsR|jj�|_|j}|j�s$|jtkr4||j|j�7}||_|j	|j
|j|j�|S)zfProcess a builtin type or an unknown type which
           will be treated as a regular file.
        )
r�r�r��isregr��SUPPORTED_TYPES�_blockrr��_apply_pax_infor�r2r3)r�r�r�r4r4r5r:$szTarInfo._proc_builtincCs�|jj|j|j��}y|j|�}Wntk
r>td��YnX|j|_|jt	krft
||j|j�|_
n|jtkr�t
||j|j�|_|S)zSProcess the blocks that hold a GNU longname
           or longlink member.
        z missing or bad subsequent header)r�rYr=rr5r}r�r�r�rr;r2r3r�rr�)r�r�rW�nextr4r4r5r65s

zTarInfo._proc_gnulongc
Cs�|j\}}}|`x�|r�|jjt�}d}xvtd�D]j}y0t|||d��}t||d|d��}	Wntk
rzPYnX|r�|	r�|j||	f�|d7}q0Wt|d�}qW||_	|jj
�|_|j|j|j
�|_||_
|S)z8Process a GNU sparse header plus extra headers.
        r�rr&i�)r�r�rYr�rCrGrArnr*r�r�r�r=rr�)
r�r�r/r1r2rWr�rFr�r0r4r4r5r7Ks(zTarInfo._proc_sparsecCs |jj|j|j��}|jtkr&|j}n
|jj�}tj	d|�}|dk	rX|j
d�jd�|d<|jd�}|dkrr|j
}nd}tjd�}d}x�|j||�}|s�P|j�\}	}
t|	�}	||jd	�d|jd�|	d�}|j|
dd|j�}
|
tk�r|j|||j
|j�}n|j|dd|j�}|||
<||	7}q�Wy|j|�}Wntk
�rTtd
��YnXd|k�rn|j||�nHd|k�r�|j|||�n.|jd
�dk�r�|jd�dk�r�|j|||�|jttfk�r|j||j
|j�|j |_ d|k�r|j!}
|j"��s|jt#k�r|
|j|j�7}
|
|_ |S)zVProcess an extended or global header as described in
           POSIX.1-2008.
        s\d+ hdrcharset=([^\n]+)\nNrr�
hdrcharsetZBINARYs(\d+) ([^=]+)=rrz missing or bad subsequent headerzGNU.sparse.mapzGNU.sparse.sizezGNU.sparse.major�1zGNU.sparse.minorr>r)$r�rYr=rr�rr�r�re�search�groupr9rr2�compile�match�groupsr@�endr��_decode_pax_fieldr3�PAX_NAME_FIELDSr5r}r��_proc_gnusparse_01�_proc_gnusparse_00�_proc_gnusparse_10rr8r>r�r�r;r<)r�r�rWr�rGrAr2Zregexr�r1rr r?r�r4r4r5r9gs`



$	





 
zTarInfo._proc_paxcCspg}x(tjd|�D]}|jt|jd���qWg}x(tjd|�D]}|jt|jd���q@Wtt||��|_dS)z?Process a GNU tar extended sparse header, version 0.0.
        s\d+ GNU.sparse.offset=(\d+)\nrs\d+ GNU.sparse.numbytes=(\d+)\nN)rC�finditerrnr@rE�list�zipr�)r�r?r�rWZoffsetsrGr0r4r4r5rM�szTarInfo._proc_gnusparse_00cCs@dd�|djd�D�}tt|ddd�|ddd���|_dS)z?Process a GNU tar extended sparse header, version 0.1.
        cSsg|]}t|��qSr4)r@)�.0rir4r4r5�
<listcomp>�sz.TarInfo._proc_gnusparse_01.<locals>.<listcomp>zGNU.sparse.map�,Nrr)�splitrPrQr�)r�r?r�r�r4r4r5rL�szTarInfo._proc_gnusparse_01cCs�d}g}|jjt�}|jdd�\}}t|�}xJt|�|dkrvd|krV||jjt�7}|jdd�\}}|jt|��q.W|jj�|_t	t
|ddd�|ddd���|_dS)z?Process a GNU tar extended sparse header, version 1.0.
        Nr�rr)r�rYr�rUr@r.rnr�r�rPrQr�)r�r?r�r�Zfieldsr�rWZnumberr4r4r5rN�szTarInfo._proc_gnusparse_10c
Cs�x�|j�D]�\}}|dkr(t|d|�q
|dkrBt|dt|��q
|dkr\t|dt|��q
|tkr
|tkr�yt||�}Wntk
r�d}YnX|dkr�|jd�}t|||�q
W|j�|_dS)	zoReplace fields with supplemental information from a previous
           pax extended or global header.
        zGNU.sparse.namerzGNU.sparse.sizerzGNU.sparse.realsizerr�N)	r�setattrr@�
PAX_FIELDS�PAX_NUMBER_FIELDSrAr,rr�)r�r�r2r3rr r4r4r5r>�s"

zTarInfo._apply_pax_infocCs.y|j|d�Stk
r(|j||�SXdS)z1Decode a single field from a pax record.
        r=N)r9�UnicodeDecodeError)r�r r2Zfallback_encodingZfallback_errorsr4r4r5rJszTarInfo._decode_pax_fieldcCs"t|t�\}}|r|d7}|tS)z_Round up a byte count by BLOCKSIZE and return it,
           e.g. _block(834) => 1024.
        r)r[r�)r�r'r_r`r4r4r5r=
szTarInfo._blockcCs
|jtkS)N)r��
REGULAR_TYPES)r�r4r4r5r;sz
TarInfo.isregcCs|j�S)N)r;)r�r4r4r5�isfileszTarInfo.isfilecCs
|jtkS)N)r�r�)r�r4r4r5r+sz
TarInfo.isdircCs
|jtkS)N)r��SYMTYPE)r�r4r4r5�issymsz
TarInfo.issymcCs
|jtkS)N)r��LNKTYPE)r�r4r4r5�islnksz
TarInfo.islnkcCs
|jtkS)N)r��CHRTYPE)r�r4r4r5�ischr sz
TarInfo.ischrcCs
|jtkS)N)r��BLKTYPE)r�r4r4r5�isblk"sz
TarInfo.isblkcCs
|jtkS)N)r��FIFOTYPE)r�r4r4r5�isfifo$szTarInfo.isfifocCs
|jdk	S)N)r�)r�r4r4r5�issparse&szTarInfo.issparsecCs|jtttfkS)N)r�r`rbrd)r�r4r4r5�isdev(sz
TarInfo.isdevN)r�rpr r!rrr�r�r�r"r#r�r�r�r�r�r�r�r�r�)rl)4rurvrwrx�	__slots__r�r�r��propertyrr�r�rr�r��DEFAULT_FORMAT�ENCODINGr�r�r�r��classmethodrr�staticmethodrrrrr3r5r4r:r6r7r9rMrLrNr>rJr=r;r[r+r]r_rarcrerfrgr4r4r4r5r�s`



1
3?
f	c@s�eZdZdZdZdZdZdZeZ	e
ZdZe
ZeZdVdd	�Zedddefd
d��ZedWdd
��ZedXdd��ZedYdd��Zdddd�Zdd�Zdd�Zdd�Zdd�ZdZdd �Zd[d"d#�Zd\d$d%�Zd]d&d'�Z d^d)d*�Z!d_d,d-�Z"d.d/�Z#d`d0d1�Z$d2d3�Z%d4d5�Z&d6d7�Z'd8d9�Z(d:d;�Z)d<d=�Z*d>d?�Z+d@dA�Z,dBdC�Z-dDdE�Z.dadFdG�Z/dHdI�Z0dbdJdK�Z1dLdM�Z2dNdO�Z3dPdQ�Z4dRdS�Z5dTdU�Z6dS)crz=The TarFile Class provides an interface to tar archives.
    rFrNrfr�c
Cs�t|�dks|dkrtd��||_dddd�||_|sp|jdkr\tjj|�r\d	|_d|_t||j�}d
|_n0|dkr�t	|d�r�|j
}t	|d
�r�|j|_d|_|r�tjj|�nd|_
||_|dk	r�||_
|dk	r�||_|dk	r�||_|dk	r�||_|dk	�r||_|	|_|
dk	�r(|j
tk�r(|
|_ni|_|dk	�r>||_|dk	�rN||_d
|_g|_d
|_|jj�|_i|_y�|jdk�r�d|_|j�|_|jdk�r$x�|jj|j�y|jj |�}|jj!|�WnTt"k
�r�|jj|j�PYn0t#k
�r}
zt$t%|
���WYdd}
~
XnX�q�W|jdk�rnd|_|j�rn|jj&|jj'��}|jj(|�|jt|�7_Wn&|j�s�|jj)�d|_�YnXdS)a�Open an (uncompressed) tar archive `name'. `mode' is either 'r' to
           read from an existing archive, 'a' to append data to an existing
           file or 'w' to create a new file overwriting an existing one. `mode'
           defaults to 'r'.
           If `fileobj' is given, it is used for reading or writing data. If it
           can be determined, `mode' is overridden by `fileobj's mode.
           `fileobj' is not closed, when TarFile is closed.
        rr�zmode must be 'r', 'a' or 'w'�rbzr+b�wb)rf�argrprgFNr�rpTrf�aw)*r.rArp�_moder�r�exists�	bltn_openr�r�r��abspathr�rRr��dereference�ignore_zerosr2r3r�r��debug�
errorlevelr��members�_loadedr�r��inodes�firstmemberr?r�r5rnr�r}rzrrrrZr�)r�r�rpr�rRr�rvrwr2r3r�rxry�erWr4r4r5r�Fs�




$
zTarFile.__init__c
Ks�|r|rtd��|dkr�xz|jD]p}t||j|�}|dk	rH|j�}y||d|f|�Sttfk
r�}	z|dk	r�|j|�w$WYdd}	~	Xq$Xq$Wtd���nd|k�r|jdd�\}
}|
p�d}
|p�d}||jkr�t||j|�}ntd	|��|||
|f|�Sd
|k�r�|jd
d�\}
}|
�p(d}
|�p2d}|
dk�rFtd��t||
|||�}y|||
|f|�}Wn|j	��YnXd
|_
|S|dk�r�|j|||f|�Std��dS)a|Open a tar archive for reading, writing or appending. Return
           an appropriate TarFile class.

           mode:
           'r' or 'r:*' open for reading with transparent compression
           'r:'         open for reading exclusively uncompressed
           'r:gz'       open for reading with gzip compression
           'r:bz2'      open for reading with bzip2 compression
           'a' or 'a:'  open for appending, creating the file if necessary
           'w' or 'w:'  open for writing without compression
           'w:gz'       open for writing with gzip compression
           'w:bz2'      open for writing with bzip2 compression

           'r|*'        open a stream of tar blocks with transparent compression
           'r|'         open an uncompressed stream of tar blocks for reading
           'r|gz'       open a gzip compressed stream of tar blocks
           'r|bz2'      open a bzip2 compressed stream of tar blocks
           'w|'         open an uncompressed stream for writing
           'w|gz'       open a gzip compressed stream for writing
           'w|bz2'      open a bzip2 compressed stream for writing
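A usage sketch of the mode strings above, assuming the stdlib-compatible open(); file names are illustrative:

    import tarfile

    # Read with transparent compression detection.
    with tarfile.open("python3.6.tar", "r:*") as tf:
        names = tf.getnames()

    # Create a new gzip-compressed archive.
    with tarfile.open("site-packages.tar.gz", "w:gz") as tf:
        tf.add("site-packages", arcname="site-packages")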
        znothing to openrf�r:*Nz%file could not be opened successfully�:rr�zunknown compression type %r�|�rwzmode must be 'r' or 'w'Frqzundiscernible mode)rfr)rA�	OPEN_METHr�r�rzr{r�rUr�r�r��taropen)
rr�rpr�r��kwargsr��funcZ	saved_posr~rt�streamrjr4r4r5r��sN







zTarFile.opencKs,t|�dks|dkrtd��||||f|�S)zCOpen uncompressed tar archive name for reading or writing.
        rr�zmode must be 'r', 'a' or 'w')r.rA)rr�rpr�r�r4r4r5r��szTarFile.taropenr�c	Ks�t|�dks|dkrtd��yddl}|jWn ttfk
rNtd��YnX|dk	}y*|j||d||�}|j|||f|�}Wn^tk
r�|r�|dk	r�|j	�|dkr��t
d��Yn$|r�|dk	r�|j	��YnX||_|S)	zkOpen gzip compressed tar archive name for reading or writing.
           Appending is not allowed.
        rr�zmode must be 'r' or 'w'rNzgzip module is not availableraznot a gzip file)r.rA�gzipZGzipFiler��AttributeErrorr{r�r\r�rzr�)	rr�rpr��
compresslevelr�r�Z
extfileobjrjr4r4r5�gzopens.
zTarFile.gzopencKs�t|�dks|dkrtd��yddl}Wntk
rDtd��YnX|dk	rZt||�}n|j|||d�}y|j|||f|�}Wn(tt	fk
r�|j
�td��YnXd	|_|S)
zlOpen bzip2 compressed tar archive name for reading or writing.
           Appending is not allowed.
        rr�zmode must be 'r' or 'w'.rNzbz2 module is not available)r�znot a bzip2 fileF)
r.rAr�r�r{r�ZBZ2Filer�r\�EOFErrorr�rzr�)rr�rpr�r�r�r�rjr4r4r5�bz2open$s zTarFile.bz2openr�r�r�)r�r�r�cCs�|jr
dS|jdkrf|jjttd�|jtd7_t|jt�\}}|dkrf|jjtt|�|j	sv|jj
�d|_dS)zlClose the TarFile. In write-mode, two finishing zero blocks are
           appended to the archive.
        NrqrrT)r�rpr�rZr/r�r�r[�
RECORDSIZEr�r�)r�r_r`r4r4r5r�Hs

z
TarFile.closecCs"|j|�}|dkrtd|��|S)aReturn a TarInfo object for member `name'. If `name' can not be
           found in the archive, KeyError is raised. If a member occurs more
           than once in the archive, its last occurrence is assumed to be the
           most up-to-date version.
        Nzfilename %r not found)�
_getmember�KeyError)r�r�r�r4r4r5�	getmember\s
zTarFile.getmembercCs|j�|js|j�|jS)z�Return the members of the archive as a list of TarInfo objects. The
           list has the same order as the members in the archive.
        )�_checkr{�_loadrz)r�r4r4r5�
getmembersgszTarFile.getmemberscCsdd�|j�D�S)z�Return the members of the archive as a list of their names. It has
           the same order as the list returned by getmembers().
        cSsg|]
}|j�qSr4)r�)rRr�r4r4r5rSusz$TarFile.getnames.<locals>.<listcomp>)r�)r�r4r4r5�getnamesqszTarFile.getnamescCsp|jd�|dk	r|j}|dkr$|}tjj|�\}}|jtjd�}|jd�}|j�}||_	|dkr�t
td�r�|jr�tj|�}q�tj
|�}ntj|j��}d}|j}t
j|��r|j|jf}	|jr�|jdkr�|	|jkr�||j|	kr�t}
|j|	}nt}
|	d�rx||j|	<nht
j|��r"t}
nVt
j|��r4t}
nDt
j|��rPt}
tj|�}n(t
j|��rbt }
nt
j!|��rtt"}
ndS||_||_#|j$|_%|j&|_'|
tk�r�|j(|_)nd|_)|j*|_+|
|_,||_-t.�r�yt.j/|j%�d|_0Wnt1k
�r�YnXt2�r*yt2j3|j'�d|_4Wnt1k
�r(YnX|
t t"fk�rlt
td��rlt
td	��rltj5|j6�|_7tj8|j6�|_9|S)
aOCreate a TarInfo object for either the file `name' or the file
           object `fileobj' (using os.fstat on its file descriptor). You can
           modify some of the TarInfo's attributes before you add it using
           addfile(). If given, `arcname' specifies an alternative name for the
           file in the archive.
        rqNr��lstatrlrr�major�minor):r�r�r�r�
splitdriver��sep�lstripr�r�r�rvr��stat�fstat�fileno�st_mode�S_ISREG�st_ino�st_dev�st_nlinkr|r^r��S_ISDIRr��S_ISFIFOrd�S_ISLNKr\�readlink�S_ISCHRr`�S_ISBLKrbrp�st_uidr �st_gidr!�st_sizer�st_mtimerr�r��pwd�getpwuidr"r��grpZgetgrgidr#r��st_rdevr�r�r�)r�r��arcnamer�Zdrvr�Zstatresr�Zstmd�inoder�r4r4r5�
gettarinfows~




zTarFile.gettarinfoTcCs|j�x�|D]�}|r�tt|j�dd�td|jp6|j|jp@|jfdd�|j�s\|j	�rztdd|j
|jfdd�ntd|jdd�tdt
j|j�dd	�dd�t|j|j�r�d
nddd�|r�|j�r�td|jdd�|j�r�td
|jdd�t�qWdS)z�Print a table of contents to sys.stdout. If `verbose' is False, only
           the names of the members are printed. If it is True, an `ls -l'-like
           output is produced.
        � )rIz%s/%sz%10sz%d,%dz%10dz%d-%02d-%02d %02d:%02d:%02dNr�r�rlz->zlink to)r��printrtrpr"r r#r!rarcr�r�rr�Z	localtimerr�r+r]r�r_)r��verboser�r4r4r5rP�s&
zTarFile.listc	Csr|jd�|dkr|}|dk	rPddl}|jdtd�||�rP|jdd|�dS|jdk	r�tjj|�|jkr�|jdd|�dS|jd|�|j	||�}|dkr�|jdd	|�dS|dk	r�||�}|dkr�|jdd|�dS|j
��r
t|d
�}|j||�|j
�nd|j��rd|j|�|�rnxHtj|�D].}|jtjj||�tjj||�|||d��q0Wn
|j|�dS)a~Add the file `name' to the archive. `name' may be any type of file
           (directory, fifo, symbolic link, etc.). If given, `arcname'
           specifies an alternative name for the file in the archive.
           Directories are added recursively by default. This can be avoided by
           setting `recursive' to False. `exclude' is a function that should
           return True for each filename to be excluded. `filter' is a function
           that expects a TarInfo object argument and returns the changed
           TarInfo object, if it returns None the TarInfo object will be
           excluded from the archive.
        rqNrzuse the filter argument insteadrztarfile: Excluded %rztarfile: Skipped %rrztarfile: Unsupported type %rrn)�filter)r��warnings�warn�DeprecationWarning�_dbgr�r�rrur�r;rt�addfiler�r+�listdir�addro)	r�r�r��	recursive�excluder�r�r��fr4r4r5r��sD





zTarFile.addcCs�|jd�tj|�}|j|j|j|j�}|jj|�|jt	|�7_|dk	r�t
||j|j�t|jt
�\}}|dkr�|jjtt
|�|d7}|j|t
7_|jj|�dS)a]Add the TarInfo object `tarinfo' to the archive. If `fileobj' is
           given, tarinfo.size bytes are read from it and added to the archive.
           You can create TarInfo objects using gettarinfo().
           On Windows platforms, `fileobj' should always be opened with mode
           'rb' to avoid irritation about the file size.
        rqNrr)r�rr�rRr2r3r�rZr�r.rbrr[r�r/rzrn)r�r�r�rWr_r`r4r4r5r�4s

zTarFile.addfile�.cCs�g}|dkr|}xD|D]<}|j�r<|j|�tj|�}d|_|j|||j�d�qW|jdd�d�|j�x�|D]~}tjj	||j
�}y(|j||�|j||�|j
||�Wqttk
r�}z$|jdkrЂn|jdd|�WYdd}~XqtXqtWdS)	aMExtract all members from the archive to the current working
           directory and set owner, modification time and permissions on
           directories afterwards. `path' specifies a different directory
           to extract to. `members' is optional and must be a subset of the
           list returned by getmembers().
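A usage sketch, assuming the stdlib-compatible API: restrict extraction to a subset of members and a target directory (paths are illustrative).

    import tarfile

    with tarfile.open("python3.6.tar", "r:*") as tf:
        wanted = [m for m in tf.getmembers()
                  if m.name.startswith("site-packages/")]
        tf.extractall(path="restore", members=wanted)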
        Ni�)�	set_attrscSs|jS)N)r�)rpr4r4r5�<lambda>dsz$TarFile.extractall.<locals>.<lambda>)�keyrztarfile: %s)r+rnrrp�extract�sort�reverser�rror��chown�utime�chmodryryr�)r�rrzZdirectoriesr��dirpathr~r4r4r5�
extractallNs*




zTarFile.extractallrlcCs
|jd�t|t�r |j|�}n|}|j�r>tjj||j�|_	y |j
|tjj||j�|d�Wn�tk
r�}zJ|j
dkr~�n6|jdkr�|jdd|j�n|jdd|j|jf�WYdd}~XnBtk
�r}z$|j
dkr�n|jdd|�WYdd}~XnXdS)axExtract a member from the archive to the current working directory,
           using its full name. Its file information is extracted as accurately
           as possible. `member' may be a filename or a TarInfo object. You can
           specify a different directory using `path'. File attributes (owner,
           mtime, mode) are set unless `set_attrs' is False.
        rf)r�rNrztarfile: %sztarfile: %s %r)r�r
rr�r_r�rror�r��_extract_memberr��EnvironmentErrorry�filenamer��strerrorry)r��memberrr�r�r~r4r4r5r�ts&



(
zTarFile.extractcCs�|jd�t|t�r |j|�}n|}|j�r8|j||�S|jtkrN|j||�S|j�s^|j	�r�t|j
t�rttd��q�|j
|j|��SndSdS)a�Extract a member from the archive as a file object. `member' may be
           a filename or a TarInfo object. If `member' is a regular file, a
           file-like object is returned. If `member' is a link, a file-like
           object is constructed from the link's target. If `member' is none of
           the above, None is returned.
           The file-like object is read-only and provides the following
           methods: read(), readline(), readlines(), seek() and tell()
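A sketch of reading a member without writing it to disk, assuming the stdlib-compatible API; the member name is illustrative.

    import tarfile

    with tarfile.open("python3.6.tar") as tf:
        fh = tf.extractfile("site-packages/validate.py")
        if fh is not None:                       # None for links, dirs, devices
            first_line = fh.readline()
            print(first_line.decode("utf-8", "replace"))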
        rfz'cannot extract (sym)link as file objectN)r�r
rr�r;�
fileobjectr�r<r_r]r�r�r|�extractfile�_find_link_target)r�r�r�r4r4r5r��s	



zTarFile.extractfilecCsT|jd�}|jdtj�}tjj|�}|r@tjj|�r@tj|�|j�sP|j	�rj|j
dd|j|jf�n|j
d|j�|j
�r�|j||�n�|j�r�|j||�nx|j�r�|j||�nb|j�s�|j�r�|j||�nD|j�s�|j	�r�|j||�n&|jtk�r|j||�n|j||�|�rP|j||�|j	��sP|j||�|j||�dS)z\Extract the TarInfo object tarinfo to a physical
           file called targetpath.
        r�rz%s -> %sN)r,r�r�r�r�dirnamers�makedirsr_r]r�r�r�r;�makefiler+�makedirre�makefiforarc�makedev�makelinkr�r<�makeunknownr�r�r�)r�r��
targetpathr�Z	upperdirsr4r4r5r��s4


zTarFile._extract_membercCsFytj|d�Wn0tk
r@}z|jtjkr0�WYdd}~XnXdS)z,Make a directory called targetpath.
        i�N)r��mkdirr��errnoZEEXIST)r�r�r�r~r4r4r5r��s
zTarFile.makedircCs�|j}|j|j�t|d�}|jdk	rRx8|jD]\}}|j|�t|||�q.Wnt|||j�|j|j�|j�|j�dS)z'Make a file called targetpath.
        roN)	r�r�r�rtr�rbr�truncater�)r�r�r��source�targetr�rr4r4r5r��s


zTarFile.makefilecCs"|j||�|jdd|j�dS)zYMake a file from a TarInfo object with an unknown type
           at targetpath.
        rz9tarfile: Unknown file type %r, extracted as regular file.N)r�r�r�)r�r�r�r4r4r5r�	szTarFile.makeunknowncCs"ttd�rtj|�ntd��dS)z'Make a fifo called targetpath.
        �mkfifozfifo not supported by systemN)r�r�r�ry)r�r�r�r4r4r5r�	s
zTarFile.makefifocCsbttd�sttd�r td��|j}|j�r:|tjO}n
|tjO}tj||tj	|j
|j��dS)z<Make a character or block device called targetpath.
        �mknodr�z'special devices not supported by systemN)r�r�ryrprcr��S_IFBLK�S_IFCHRr�r�r�r�)r�r�r�rpr4r4r5r�	s
zTarFile.makedevcCs�yL|j�rtj|j|�n0tjj|j�r8tj|j|�n|j|j	|�|�Wn>t
k
r�|j�r�tjjtjj|j
�|j�}n|j}Yn6Xy|j|j	|�|�Wntk
r�td��YnXdS)z�Make a (symbolic) link called targetpath. If it cannot be created
          (platform limitation), we try to make a copy of the referenced file
          instead of a link.
        z%unable to resolve link inside archiveN)r]r��symlinkr�rrsr��linkr�r��symlink_exceptionror�r�r�ry)r�r�r�rr4r4r5r�'	s"


zTarFile.makelinkc Cs�tr�ttd�r�tj�dkr�ytj|j�d}Wntk
rH|j}YnXytj	|j
�d}Wntk
rx|j}YnXy>|j�r�ttd�r�tj
|||�ntjdkr�tj|||�Wn*tk
r�}ztd��WYdd}~XnXdS)z6Set owner of targetpath according to tarinfo.
        �geteuidrr�lchownZos2emxzcould not change ownerN)r�r�r�r�r�Zgetgrnamr#r�r!�getpwnamr"r r]r��sys�platformr�r�ry)r�r�r��g�ur~r4r4r5r�D	s 
z
TarFile.chowncCsLttd�rHytj||j�Wn*tk
rF}ztd��WYdd}~XnXdS)zASet file permissions of targetpath according to tarinfo.
        r�zcould not change modeN)r�r�r�rpr�ry)r�r�r�r~r4r4r5r�Z	s

z
TarFile.chmodcCsVttd�sdSytj||j|jf�Wn*tk
rP}ztd��WYdd}~XnXdS)zBSet modification time of targetpath according to tarinfo.
        r�Nz"could not change modification time)r�r�r�rr�ry)r�r�r�r~r4r4r5r�c	s
z
TarFile.utimecCs�|jd�|jdk	r$|j}d|_|S|jj|j�d}�x\y|jj|�}W�nBtk
r�}z2|jr�|j	dd|j|f�|jt
7_w:WYdd}~Xn�tk
r�}zJ|jr�|j	dd|j|f�|jt
7_w:n|jdkr�tt
|���WYdd}~Xn�tk
�r&|jdk�r"td��Ynjtk
�r`}z|jdk�rPtt
|���WYdd}~Xn0tk
�r�}ztt
|���WYdd}~XnXPq:W|dk	�r�|jj|�nd|_|S)z�Return the next member of the archive as a TarInfo object, when
           TarFile is opened for reading. Return None if there is no more
           available.
        ZraNrz0x%X: %srz
empty fileT)r�r}r�r�r�r�r5r�rwr�r�rBrzrr~rr�rzrnr{)r��mr�r~r4r4r5r?n	sF



zTarFile.nextcCsn|j�}|dk	r"|d|j|��}|r2tjj|�}x6t|�D]*}|rTtjj|j�}n|j}||kr<|Sq<WdS)z}Find an archive member by name from bottom to top.
           If tarinfo is given, it is used as the starting point.
        N)r��indexr�r�normpath�reversedr�)r�r�r��	normalizerzr��member_namer4r4r5r��	szTarFile._getmembercCs"x|j�}|dkrPqWd|_dS)zWRead through the entire archive file and look for readable
           members.
        NT)r?r{)r�r�r4r4r5r��	s
z
TarFile._loadcCs:|jrtd|jj��|dk	r6|j|kr6td|j��dS)znCheck if TarFile is still open, and if the operation's mode
           corresponds to TarFile's mode.
        z%s is closedNzbad operation for mode %r)r�r\r�rurp)r�rpr4r4r5r��	szTarFile._checkcCsX|j�r&tjj|j�d|j}d}n
|j}|}|j||dd�}|dkrTtd|��|S)zZFind the target member of a symlink or hardlink member in the
           archive.
        r�NT)r�r�zlinkname %r not found)r]r�rr�r�r�r�r�)r�r�r��limitr�r4r4r5r��	szTarFile._find_link_targetcCs|jrt|j�St|�SdS)z$Provide an iterator object.
        N)r{�iterrz�TarIter)r�r4r4r5r��	s
zTarFile.__iter__cCs||jkrt|tjd�dS)z.Write debugging output to sys.stderr.
        )�fileN)rxr�r��stderr)r��level�msgr4r4r5r��	s
zTarFile._dbgcCs|j�|S)N)r�)r�r4r4r5�	__enter__�	szTarFile.__enter__cCs,|dkr|j�n|js"|jj�d|_dS)NT)r�r�r�r�)r�r�r �	tracebackr4r4r5�__exit__�	s


zTarFile.__exit__)NrfNNNNNNr�NNN)rfN)rfNr�)rfNr�)NNN)T)NTNN)N)r�N)rlT)T)NF)N)7rurvrwrxrxrvrwryrjrRrkr2r3rr�r�r�r�rlr�r�r�r�r�r�r�r�r�r�r�rPr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r?r�r�r�r�r�r�rrr4r4r4r5r,sl
iK

b

>

&
#&
0	1


	c@s,eZdZdZdd�Zdd�Zdd�ZeZdS)	r�zMIterator Class.

       for tarinfo in TarFile(...):
           suite...
    cCs||_d|_dS)z$Construct a TarIter object.
        rN)r�r�)r�r�r4r4r5r�
szTarIter.__init__cCs|S)z Return iterator object.
        r4)r�r4r4r5r�
szTarIter.__iter__cCsb|jjs$|jj�}|sPd|j_t�n,y|jj|j}Wntk
rNt�YnX|jd7_|S)z�Return the next item using TarFile's next() method.
           When all members have been read, set TarFile as _loaded.
        Tr)r�r{r?�
StopIterationrzr��
IndexError)r�r�r4r4r5�__next__

s

zTarIter.__next__N)rurvrwrxr�r�rr?r4r4r4r5r��	s
r�cCs.yt|�}|j�dStk
r(dSXdS)zfReturn True if name points to a tar archive that we
       are able to handle, else return False.
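A usage sketch, assuming the stdlib-compatible helper; the path is illustrative.

    import tarfile

    if tarfile.is_tarfile("python3.6.tar"):
        print("looks like a tar archive we can handle")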
    TFN)r�r�r)r�rjr4r4r5r#
s)rrrrr r!r"r#)rrr"r#)r+r,)N)wZ
__future__r�__version__�version�
__author__Z__date__Z	__cvsid__�__credits__r�r�r�r�r�rLrrCr�r�r�r��NotImplementedErrorr�ZWindowsError�	NameError�__all__�version_infoZ__builtin__�builtinsr�Z_openr/r�r�rr�r�r�rr�r(r^r\r`rbr�rdZCONTTYPErrr)rrr8r�rKr�rjr<rZr-rW�setrKrr@rX�S_IFLNK�S_IFREGr��S_IFDIRr��S_IFIFOZTSUIDZTSGIDZTSVTXZTUREADZTUWRITEZTUEXECZTGREADZTGWRITEZTGEXECZTOREADZTOWRITEZTOEXECr�rk�getfilesystemencodingr6r;rGrSrXrbrmrt�	Exceptionrryrzr{r|r}r~rr�rBr��objectr�r�r�r�r�r�rrr�rrtr4r4r4r5�<module>s>




i?KT*site-packages/pip/_vendor/distlib/_backport/__pycache__/tarfile.cpython-36.pyc000064400000172757147511334610023422 0ustar003

�Ub�i�@sNddlmZdZdZdZdZdZdZddlZddl	Z	ddl
Z
ddlZddlZddl
Z
ddlZddlZyddlZddlZWnek
r�dZZYnXeefZyeef7ZWnek
r�YnXd	d
ddgZejdd
kr�ddlZnddlZejZdZdZedZ dZ!dZ"dZ#dZ$dZ%dZ&dZ'dZ(dZ)dZ*dZ+dZ,dZ-dZ.dZ/dZ0dZ1d Z2d!Z3d"Z4dZ5d#Z6d$Z7e6Z8e&e'e(e)e,e-e.e*e+e/e0e1fZ9e&e'e.e1fZ:e/e0e1fZ;d�Z<e=d��Z>e?e?e?e@e@e@d-�ZAd.ZBd/ZCd0ZDd1ZEd2ZFd3ZGd4ZHd5ZIdZJd6ZKd7ZLd8ZMd9ZNd:ZOd;ZPd<ZQd$ZRd#ZSe	jTd�k�rd?ZUnejV�ZUd@dA�ZWdBdC�ZXdDdE�ZYd;e8fdFdG�ZZdHdI�Z[d�dJdK�Z\eBdLfeCdMfeDdNfeEdOfeFdPfeGdQffeKdRffeLdSffeMeHBdTfeHdUfeMdVffeNdRffeOdSffePeIBdTfeIdUfePdVffeQdRffeRdSffeSeJBdWfeJdXfeSdVfff
Z]dYdZ�Z^Gd[d�de_�Z`Gd\d]�d]e`�ZaGd^d_�d_e`�ZbGd`da�dae`�ZcGdbdc�dce`�ZdGddde�dee`�ZeGdfdg�dgee�ZfGdhdi�diee�ZgGdjdk�dkee�ZhGdldm�dmee�ZiGdndo�doee�ZjGdpdq�dqek�ZlGdrds�dsek�ZmGdtdu�duek�ZnGdvdw�dwek�ZoGdxdy�dyek�ZpGdzd{�d{ek�ZqGd|d
�d
ek�ZrGd}d	�d	ek�ZsGd~d�dek�Ztd�d�ZueZvesjZdS)��)�print_functionz
$Revision$z0.9.0u"Lars Gustäbel (lars@gustaebel.de)z5$Date: 2011-02-25 17:42:01 +0200 (Fri, 25 Feb 2011) $z?$Id: tarfile.py 88586 2011-02-25 15:42:01Z marc-andre.lemburg $u4Gustavo Niemeyer, Niels Gustäbel, Richard Townsend.N�TarFile�TarInfo�
is_tarfile�TarError��i�sustar  sustar00�d��0�1�2�3�4�5�6�7�L�K�S�x�g�X���path�linkpath�size�mtime�uid�gid�uname�gname)ZatimeZctimerr r!ri�i�i`i@i iii���@� ����nt�cezutf-8cCs(|j||�}|d|�|t|�tS)z8Convert a string to a null-terminated bytes object.
    N)�encode�len�NUL)�s�length�encoding�errors�r4�/usr/lib/python3.6/tarfile.py�stn�sr6cCs*|jd�}|dkr|d|�}|j||�S)z8Convert a null-terminated bytes object to a string.
    rrN���)�find�decode)r0r2r3�pr4r4r5�nts�s
r;cCs�|dtd�krJytt|dd�p"dd�}Wq�tk
rFtd��Yq�Xn:d}x4tt|�d�D] }|dK}|t||d�7}q`W|S)	z/Convert a number field to a python number.
    rr%�ascii�strict�0r)zinvalid headerr)�chr�intr;�
ValueError�InvalidHeaderError�ranger.�ord)r0�n�ir4r4r5�nti�srGcCs�d|kod|dknr<d|d|fjd�t}n�|tksT|d|dkr\td��|dkr|tjdtjd	|��d}t�}x,t|d�D]}|j	d|d
@�|dL}q�W|j	dd�|S)z/Convert a python number to a number field.
    rr)rz%0*or<r$zoverflow in number field�L�l�r%)
r-r/�
GNU_FORMATrA�struct�unpack�pack�	bytearrayrC�insert)rE�digits�formatr0rFr4r4r5�itn�s	 rScCshdttjd|dd��tjd|dd���}dttjd|dd��tjd	|dd���}||fS)
a�Calculate the checksum for a member's header by summing up all
       characters except for the chksum field which is treated as if
       it was filled with spaces. According to the GNU tar sources,
       some tars (Sun and NeXT) calculate chksum with signed char,
       which will be different if there are chars in the buffer with
       the high bit set. So we calculate two checksums, unsigned and
       signed.
    r$Z148BN�Z356B�iZ148bZ356b)�sumrLrM)�bufZunsigned_chksumZ
signed_chksumr4r4r5�calc_chksums�s	00rXcCs�|dkrdS|dkr8x|jd�}|s&P|j|�qWdSd}t||�\}}x8t|�D],}|j|�}t|�|krvtd��|j|�qTW|dkr�|j|�}t|�|kr�td��|j|�dS)zjCopy length bytes from fileobj src to fileobj dst.
       If length is None, copy the entire content.
    rNr(izend of file reachedi@i@)�read�write�divmodrCr.�IOError)�src�dstr1rWZBUFSIZE�blocks�	remainder�br4r4r5�copyfileobjs,



rbrI�-ra�d�cr:�r�wr0�S�x�t�TcCsPg}x@tD]8}x2|D] \}}||@|kr|j|�PqW|jd�q
Wdj|�S)zcConvert a file's mode to a string of the form
       -rwxrwxrwx.
       Used by TarFile.list()
    rc�)�filemode_table�append�join)�modeZperm�table�bit�charr4r4r5�filemode8s

rtc@seZdZdZdS)rzBase exception.N)�__name__�
__module__�__qualname__�__doc__r4r4r4r5rGsc@seZdZdZdS)�ExtractErrorz%General exception for extract errors.N)rurvrwrxr4r4r4r5ryJsryc@seZdZdZdS)�	ReadErrorz&Exception for unreadable tar archives.N)rurvrwrxr4r4r4r5rzMsrzc@seZdZdZdS)�CompressionErrorz.Exception for unavailable compression methods.N)rurvrwrxr4r4r4r5r{Psr{c@seZdZdZdS)�StreamErrorz=Exception for unsupported operations on stream-like TarFiles.N)rurvrwrxr4r4r4r5r|Ssr|c@seZdZdZdS)�HeaderErrorz!Base exception for header errors.N)rurvrwrxr4r4r4r5r}Vsr}c@seZdZdZdS)�EmptyHeaderErrorzException for empty headers.N)rurvrwrxr4r4r4r5r~Ysr~c@seZdZdZdS)�TruncatedHeaderErrorz Exception for truncated headers.N)rurvrwrxr4r4r4r5r\src@seZdZdZdS)�EOFHeaderErrorz"Exception for end of file headers.N)rurvrwrxr4r4r4r5r�_sr�c@seZdZdZdS)rBzException for invalid headers.N)rurvrwrxr4r4r4r5rBbsrBc@seZdZdZdS)�SubsequentHeaderErrorz3Exception for missing and invalid extended headers.N)rurvrwrxr4r4r4r5r�esr�c@s0eZdZdZdd�Zdd�Zdd�Zdd	�Zd
S)�
_LowLevelFilez�Low-level file object. Supports reading and writing.
       It is used instead of a regular file object for streaming
       access.
    cCsFtjtjtjBtjBd�|}ttd�r2|tjO}tj||d�|_dS)N)rfrg�O_BINARYi�)	�os�O_RDONLY�O_WRONLY�O_CREAT�O_TRUNC�hasattrr��open�fd)�self�namerpr4r4r5�__init__rs

z_LowLevelFile.__init__cCstj|j�dS)N)r��closer�)r�r4r4r5r�{sz_LowLevelFile.closecCstj|j|�S)N)r�rYr�)r�rr4r4r5rY~sz_LowLevelFile.readcCstj|j|�dS)N)r�rZr�)r�r0r4r4r5rZ�sz_LowLevelFile.writeN)rurvrwrxr�r�rYrZr4r4r4r5r�ls
	r�c@steZdZdZdd�Zdd�Zdd�Zdd	�Zd
d�Zdd
�Z	dd�Z
dd�Zddd�Zddd�Z
dd�Zdd�ZdS)�_Streama�Class that serves as an adapter between TarFile and
       a stream-like object.  The stream-like object only
       needs to have a read() or write() method and is accessed
       blockwise.  Use of gzip or bzip2 compression is possible.
       A stream-like object could be for example: sys.stdin,
       sys.stdout, a socket, a tape device etc.

       _Stream is intended to be used only internally.
    cCsRd|_|dkrt||�}d|_|dkr6t|�}|j�}|p<d|_||_||_||_||_d|_	d|_
d|_y�|dkr�yddl}Wnt
k
r�td	��YnX||_|jd�|_|d
kr�|j�n|j�|dk�r$yddl}Wnt
k
r�td��YnX|d
k�rd|_|j�|_n
|j�|_Wn&|j�s@|jj�d|_�YnXdS)
z$Construct a _Stream object.
        TNF�*rl�r�gzzzlib module is not availablerf�bz2zbz2 module is not available)�_extfileobjr��_StreamProxy�getcomptyper�rp�comptype�fileobj�bufsizerW�pos�closed�zlib�ImportErrorr{�crc32�crc�
_init_read_gz�_init_write_gzr��dbuf�BZ2Decompressor�cmp�
BZ2Compressorr�)r�r�rpr�r�r�r�r�r4r4r5r��sP





z_Stream.__init__cCst|d�r|jr|j�dS)Nr�)r�r�r�)r�r4r4r5�__del__�sz_Stream.__del__cCs�|jjd|jj|jj|jjd�|_tjdtt	j	���}|j
d|d�|jjd�rf|jdd�|_|j
|jj
d	d
�t�dS)z6Initialize for writing with gzip compression.
        �	rz<Ls�s�z.gzNrz
iso-8859-1�replace���)r�ZcompressobjZDEFLATED�	MAX_WBITSZ
DEF_MEM_LEVELr�rLrNr@�time�_Stream__writer��endswithr-r/)r�Z	timestampr4r4r5r��sz_Stream._init_write_gzcCsR|jdkr|jj||j�|_|jt|�7_|jdkrD|jj|�}|j|�dS)z&Write string s to the stream.
        r��tarN)	r�r�r�r�r�r.r��compressr�)r�r0r4r4r5rZ�s

z
_Stream.writecCsR|j|7_x>t|j�|jkrL|jj|jd|j��|j|jd�|_qWdS)z]Write string s to the stream if a whole new block
           is ready to be written.
        N)rWr.r�r�rZ)r�r0r4r4r5Z__write�sz_Stream.__writecCs�|jr
dS|jdkr2|jdkr2|j|jj�7_|jdkr�|jr�|jj|j�d|_|jdkr�|jjtj	d|j
d@��|jjtj	d|jd@��|js�|jj
�d|_dS)	z[Close the _Stream object. No operation should be
           done on it afterwards.
        Nrgr�r�r�z<Ll��T)r�rpr�rWr��flushr�rZrLrNr�r�r�r�)r�r4r4r5r��s

z
_Stream.closecCs�|jj|jj�|_d|_|jd�dkr0td��|jd�dkrFtd��t|jd��}|jd�|d	@r�t|jd��d
t|jd��}|j	|�|d@r�x|jd�}|s�|t
kr�Pq�W|d@r�x|jd�}|s�|t
kr�Pq�W|d@r�|jd�d
S)z:Initialize for reading a gzip compressed fileobj.
        r�rs�znot a gzip filer�zunsupported compression method�r*r$r)r(N)r�Z
decompressobjr�r�r��
_Stream__readrzr{rDrYr/)r��flagZxlenr0r4r4r5r�s.
 


z_Stream._init_read_gzcCs|jS)z3Return the stream's file pointer position.
        )r�)r�r4r4r5�tell#sz_Stream.tellrcCs\||jdkrNt||j|j�\}}xt|�D]}|j|j�q.W|j|�ntd��|jS)zXSet the stream's file pointer to pos. Negative seeking
           is forbidden.
        rz seeking backwards is not allowed)r�r[r�rCrYr|)r�r�r_r`rFr4r4r5�seek(sz_Stream.seekNcCsZ|dkr:g}x |j|j�}|s P|j|�qWdj|�}n
|j|�}|jt|�7_|S)z�Return the next size number of bytes from the stream.
           If size is not defined, return all bytes of the stream
           up to EOF.
        Nrl)�_readr�rnror�r.)r�rrjrWr4r4r5rY5s
z_Stream.readcCs�|jdkr|j|�St|j�}xf||kr�|j|j�}|s:Py|jj|�}Wntk
rftd��YnX|j|7_|t|�7}q W|jd|�}|j|d�|_|S)z+Return size bytes from the stream.
        r�zinvalid compressed dataN)	r�r�r.r�r�r��
decompressr\rz)r�rrerWr4r4r5r�Gs 



z
_Stream._readcCsht|j�}x:||krD|jj|j�}|s(P|j|7_|t|�7}qW|jd|�}|j|d�|_|S)zsReturn size bytes from stream. If internal buffer is empty,
           read another block from the stream.
        N)r.rWr�rYr�)r�rrerWr4r4r5Z__read\s

z_Stream.__read)r)N)rurvrwrxr�r�r�rZr�r�r�r�r�rYr�r�r4r4r4r5r��s	4
	

r�c@s0eZdZdZdd�Zdd�Zdd�Zdd	�Zd
S)r�zsSmall proxy class that enables transparent compression
       detection for the Stream interface (mode 'r|*').
    cCs||_|jjt�|_dS)N)r�rY�	BLOCKSIZErW)r�r�r4r4r5r�qsz_StreamProxy.__init__cCs|jj|_|jS)N)r�rYrW)r�rr4r4r5rYus
z_StreamProxy.readcCs$|jjd�rdS|jjd�r dSdS)Ns�r�sBZh91r�r�)rW�
startswith)r�r4r4r5r�ys
z_StreamProxy.getcomptypecCs|jj�dS)N)r�r�)r�r4r4r5r��sz_StreamProxy.closeN)rurvrwrxr�rYr�r�r4r4r4r5r�ls
r�c@sLeZdZdZdZdd�Zdd�Zdd	�Zd
d�Zdd
�Z	dd�Z
dd�ZdS)�	_BZ2ProxyaSmall proxy class that enables external file object
       support for "r:bz2" and "w:bz2" modes. This is actually
       a workaround for a limitation in bz2 module's BZ2File
       class which (unlike gzip.GzipFile) has no support for
       a file object argument.
    r(icCs(||_||_t|jdd�|_|j�dS)Nr�)r�rp�getattrr��init)r�r�rpr4r4r5r��sz_BZ2Proxy.__init__cCsDddl}d|_|jdkr6|j�|_|jjd�d|_n
|j�|_dS)Nrrfr�)	r�r�rpr��bz2objr�r�rWr�)r�r�r4r4r5r��s

z_BZ2Proxy.initcCs�t|j�}xF||krP|jj|j�}|s(P|jj|�}|j|7_|t|�7}qW|jd|�}|j|d�|_|jt|�7_|S)N)r.rWr�rY�	blocksizer�r�r�)r�rri�raw�datarWr4r4r5rY�s

z_BZ2Proxy.readcCs&||jkr|j�|j||j�dS)N)r�r�rY)r�r�r4r4r5r��s
z_BZ2Proxy.seekcCs|jS)N)r�)r�r4r4r5r��sz_BZ2Proxy.tellcCs.|jt|�7_|jj|�}|jj|�dS)N)r�r.r�r�r�rZ)r�r�r�r4r4r5rZ�sz_BZ2Proxy.writecCs$|jdkr |jj�}|jj|�dS)Nrg)rpr�r�r�rZ)r�r�r4r4r5r��s

z_BZ2Proxy.closeNi@)rurvrwrxr�r�r�rYr�r�rZr�r4r4r4r5r��s
r�c@s<eZdZdZd
dd�Zdd�Zdd�Zd	d
�Zddd�ZdS)�_FileInFilezA thin wrapper around an existing file object that
       provides a part of its data as an individual file
       object.
    NcCs�||_||_||_d|_|dkr*d|fg}d|_g|_d}|j}xT|D]L\}}||krj|jjd||df�|jjd||||f�||7}||}qFW||jkr�|jjd||jdf�dS)NrFT)r��offsetr�position�	map_index�maprn)r�r�r�rZ	blockinfoZlastposZrealposr4r4r5r��s$

z_FileInFile.__init__cCst|jd�sdS|jj�S)N�seekableT)r�r�r�)r�r4r4r5r��sz_FileInFile.seekablecCs|jS)z*Return the current file position.
        )r�)r�r4r4r5r��sz_FileInFile.tellcCs
||_dS)z(Seek to a position in the file.
        N)r�)r�r�r4r4r5r��sz_FileInFile.seekcCs�|dkr|j|j}nt||j|j�}d}x�|dkr�xZ|j|j\}}}}||jko`|knrjPq8|jd7_|jt|j�kr8d|_q8Wt|||j�}|r�|jj||j|�||jj|�7}n|t	|7}||8}|j|7_q.W|S)z!Read data from the file.
        Nr�rr)
rr��minr�r�r.r�r�rYr/)r�rrWr��start�stopr�r1r4r4r5rY�s(

z_FileInFile.read)N)N)	rurvrwrxr�r�r�r�rYr4r4r4r5r��s
r�c@szeZdZdZdZdd�Zdd�Zdd�Zd	d
�Zddd
�Z	e	Z
ddd�Zdd�Zdd�Z
ejfdd�Zdd�Zdd�ZdS)�ExFileObjectzaFile-like object for reading an archive member.
       Is returned by TarFile.extractfile().
    icCsDt|j|j|j|j�|_|j|_d|_d|_|j|_d|_d|_	dS)NrfFrr�)
r�r��offset_datar�sparser�rpr�r��buffer)r��tarfile�tarinfor4r4r5r�s
zExFileObject.__init__cCsdS)NTr4)r�r4r4r5�readable!szExFileObject.readablecCsdS)NFr4)r�r4r4r5�writable$szExFileObject.writablecCs
|jj�S)N)r�r�)r�r4r4r5r�'szExFileObject.seekableNcCs�|jrtd��d}|jrL|dkr.|j}d|_n|jd|�}|j|d�|_|dkrd||jj�7}n||jj|t|��7}|jt|�7_|S)z~Read at most size bytes from the file. If size is not
           present or None, read all data until EOF is reached.
        zI/O operation on closed filer�N)r�rAr�r�rYr.r�)r�rrWr4r4r5rY*szExFileObject.readrcCs�|jrtd��|jjd�d}|dkrzxR|jj|j�}|j|7_|sRd|kr(|jjd�d}|dkrtt|j�}Pq(W|dkr�t||�}|jd|�}|j|d�|_|j	t|�7_	|S)z�Read one entire line from the file. If size is present
           and non-negative, return a string with at most that
           size, which may be an incomplete line.
        zI/O operation on closed file�
rrNr7)
r�rAr�r8r�rYr�r.r�r�)r�rr�rWr4r4r5�readlineEs$

zExFileObject.readlinecCs&g}x|j�}|sP|j|�qW|S)z0Return a list with all remaining lines.
        )r�rn)r��result�liner4r4r5�	readlinesbszExFileObject.readlinescCs|jrtd��|jS)z*Return the current file position.
        zI/O operation on closed file)r�rAr�)r�r4r4r5r�lszExFileObject.tellcCs�|jrtd��|tjkr.tt|d�|j�|_nj|tjkrj|dkrTt|j|d�|_q�t|j||j�|_n.|tj	kr�tt|j||j�d�|_ntd��d|_
|jj|j�dS)z(Seek to a position in the file.
        zI/O operation on closed filerzInvalid argumentr�N)
r�rAr��SEEK_SETr��maxrr��SEEK_CUR�SEEK_ENDr�r�r�)r�r��whencer4r4r5r�ts


zExFileObject.seekcCs
d|_dS)zClose the file object.
        TN)r�)r�r4r4r5r��szExFileObject.closeccsx|j�}|sP|VqWdS)z/Get an iterator over the file's lines.
        N)r�)r�r�r4r4r5�__iter__�s
zExFileObject.__iter__)Nr7)r7)rurvrwrxr�r�r�r�r�rY�read1r�r�r�r�r�r�r�r�r4r4r4r5r�s



r�c@s�eZdZdZdiZdjdd�Zdd�Zdd�Zeee�Z	dd�Z
dd �Zee
e�Zd!d"�Z
d#d$�Zeed%fd&d'�Zd(d)�Zd*d+�Zd,d-�Zed.d/��Zd0d1�Zed2d3��Zed4d5��Zed6d7��Zed8d9��Zed:d;��Zed<d=��Zd>d?�Zd@dA�Z dBdC�Z!dDdE�Z"dFdG�Z#dHdI�Z$dJdK�Z%dLdM�Z&dNdO�Z'dPdQ�Z(dRdS�Z)dTdU�Z*dVdW�Z+dXdY�Z,dZd[�Z-d\d]�Z.d^d_�Z/d`da�Z0dbdc�Z1ddde�Z2dfdg�Z3dhS)kraInformational class which holds the details about an
       archive member given by a tar header block.
       TarInfo objects are returned by TarFile.getmember(),
       TarFile.getmembers() and TarFile.gettarinfo() and are
       usually created internally.
    r�rpr r!rr�chksum�type�linknamer"r#�devmajor�devminorr�r��pax_headersr�r��_sparse_structs�_link_targetrlcCsj||_d|_d|_d|_d|_d|_d|_t|_d|_	d|_
d|_d|_d|_
d|_d|_d|_i|_dS)zXConstruct a TarInfo object. name is the optional name
           of the member.
        i�rrlN)r�rpr r!rrr��REGTYPEr�r�r"r#r�r�r�r�r�r�)r�r�r4r4r5r��s"zTarInfo.__init__cCs|jS)N)r�)r�r4r4r5�_getpath�szTarInfo._getpathcCs
||_dS)N)r�)r�r�r4r4r5�_setpath�szTarInfo._setpathcCs|jS)N)r�)r�r4r4r5�_getlinkpath�szTarInfo._getlinkpathcCs
||_dS)N)r�)r�r�r4r4r5�_setlinkpath�szTarInfo._setlinkpathcCsd|jj|jt|�fS)Nz<%s %r at %#x>)�	__class__rur��id)r�r4r4r5�__repr__�szTarInfo.__repr__cCsn|j|jd@|j|j|j|j|j|j|j|j	|j
|j|jd�
}|dt
krj|djd�rj|dd7<|S)z9Return the TarInfo's attributes as a dictionary.
        i�)
r�rpr r!rrr�r�r�r"r#r�r�r�r��/)r�rpr r!rrr�r�r�r"r#r�r��DIRTYPEr�)r��infor4r4r5�get_info�s 
zTarInfo.get_info�surrogateescapecCsT|j�}|tkr|j|||�S|tkr4|j|||�S|tkrH|j||�Std��dS)z<Return a tar header as a string of 512 byte blocks.
        zinvalid formatN)r��USTAR_FORMAT�create_ustar_headerrK�create_gnu_header�
PAX_FORMAT�create_pax_headerrA)r�rRr2r3r�r4r4r5�tobuf�sz
TarInfo.tobufcCsZt|d<t|d�tkr td��t|d�tkrJ|j|d�\|d<|d<|j|t||�S)z3Return the object as a ustar header block.
        �magicr�zlinkname is too longr��prefix)�POSIX_MAGICr.�LENGTH_LINKrA�LENGTH_NAME�_posix_split_name�_create_headerr�)r�r�r2r3r4r4r5r��szTarInfo.create_ustar_headercCspt|d<d}t|d�tkr4||j|dt||�7}t|d�tkr\||j|dt||�7}||j|t||�S)z:Return the object as a GNU header block sequence.
        r�r�r�r�)	�	GNU_MAGICr.r��_create_gnu_long_header�GNUTYPE_LONGLINKr��GNUTYPE_LONGNAMErrK)r�r�r2r3rWr4r4r5r�szTarInfo.create_gnu_headerc
Cs4t|d<|jj�}x�ddtfddtfddfD]h\}}}||kr@q,y||jd	d
�Wn"tk
rv||||<w,YnXt||�|kr,||||<q,WxldD]d\}}||kr�d||<q�||}d|ko�d|dkn�s�t|t	�r�t
|�||<d||<q�W|�r|j|t|�}	nd}	|	|j
|td	d�S)z�Return the object as a ustar header block. If it cannot be
           represented this way, prepend a pax extended header sequence
           with supplement information.
        r�r�rr�rr"r'r#r<r=r r)r!r�rrrr�r�)r"r"r')r#r#r'�r r)�r!r)�rr�rr)rrr	r
)r�r��copyr�r�r-�UnicodeEncodeErrorr.�
isinstance�float�str�_create_pax_generic_header�XHDTYPErr�)
r�r�r2r�r�Zhnamer1rQ�valrWr4r4r5r�s4
.zTarInfo.create_pax_headercCs|j|td�S)zAReturn the object as a pax global header block sequence.
        �utf8)r�XGLTYPE)�clsr�r4r4r5�create_pax_global_headerDsz TarInfo.create_pax_global_headercCsp|dtd�}x |r0|ddkr0|dd�}qW|t|�d�}|dd�}|s`t|�tkrhtd��||fS)zUSplit a name longer than 100 chars into a prefix
           and a name part.
        Nrr�zname is too longr7r7r7)�
LENGTH_PREFIXr.r�rA)r�r�r�r4r4r5rJszTarInfo._posix_split_namecCsVt|jdd�d||�t|jdd�d@d|�t|jdd�d|�t|jd	d�d|�t|jd
d�d|�t|jdd�d|�d
|jdt�t|jdd�d||�|jdt�t|jdd�d||�t|jdd�d||�t|jdd�d|�t|jdd�d|�t|jdd�d||�g}tjdtdj|��}t	|td��d}|dd�d|j
d�|d d�}|S)!z�Return a header block. info is a dictionary with file
           information, format must be one of the *_FORMAT constants.
        r�rlr
rpri�r)r r!rrrs        r�r�r�r"r'r#r�r�r�rz%dsr�Nilz%06or<iei����i����)r6�getrSr�r�rLrNr�rorXr-)r�rRr2r3�partsrWr�r4r4r5rYs&

&zTarInfo._create_headercCs.tt|�t�\}}|dkr*|t|t7}|S)zdReturn the string payload filled with zero bytes
           up to the next 512 byte border.
[remainder of a compiled CPython 3.6 tarfile module (.pyc); binary bytecode, not representable as text. Only the embedded docstrings of the TarInfo, TarFile and TarIter classes are legible.]
site-packages/pip/_vendor/distlib/_backport/__pycache__/sysconfig.cpython-36.pyc000064400000037115147511334610023764 0ustar00

[compiled CPython 3.6 bytecode of the _backport sysconfig module (.pyc); binary data, not representable as text.]
site-packages/pip/_vendor/distlib/_backport/__pycache__/__init__.cpython-36.opt-1.pyc000064400000000613147511334610024447 0ustar00
[compiled bytecode of the _backport package __init__ (optimized .pyc). Its module docstring reads: "Modules copied from Python 3 standard libraries, for internal use only. Individual classes and functions are found in d2._backport.misc. Intended usage is to always import things missing from 3.1 from that module: the built-in/stdlib objects will be used if found."]
site-packages/pip/_vendor/distlib/_backport/__pycache__/__init__.cpython-36.pyc000064400000000613147511334610023510 0ustar00
[compiled bytecode of the _backport package __init__ (.pyc); same module docstring as the optimized variant above.]
site-packages/pip/_vendor/distlib/_backport/__pycache__/misc.cpython-36.pyc000064400000001740147511334610022706 0ustar00
[compiled bytecode of the _backport misc module (.pyc); binary data, not representable as text. Its module docstring reads "Backports for individual classes and functions."]
site-packages/pip/_vendor/distlib/_backport/sysconfig.cfg000064400000005071147511334610017603 0ustar00[posix_prefix]
# Configuration directories.  Some of these come straight out of the
# configure script.  They are for implementing the other variables, not to
# be used directly in [resource_locations].
confdir = /etc
datadir = /usr/share
libdir = /usr/lib
statedir = /var
# User resource directory
local = ~/.local/{distribution.name}

stdlib = {base}/lib/python{py_version_short}
platstdlib = {platbase}/lib/python{py_version_short}
purelib = {base}/lib/python{py_version_short}/site-packages
platlib = {platbase}/lib/python{py_version_short}/site-packages
include = {base}/include/python{py_version_short}{abiflags}
platinclude = {platbase}/include/python{py_version_short}{abiflags}
data = {base}

[posix_home]
stdlib = {base}/lib/python
platstdlib = {base}/lib/python
purelib = {base}/lib/python
platlib = {base}/lib/python
include = {base}/include/python
platinclude = {base}/include/python
scripts = {base}/bin
data = {base}

[nt]
stdlib = {base}/Lib
platstdlib = {base}/Lib
purelib = {base}/Lib/site-packages
platlib = {base}/Lib/site-packages
include = {base}/Include
platinclude = {base}/Include
scripts = {base}/Scripts
data = {base}

[os2]
stdlib = {base}/Lib
platstdlib = {base}/Lib
purelib = {base}/Lib/site-packages
platlib = {base}/Lib/site-packages
include = {base}/Include
platinclude = {base}/Include
scripts = {base}/Scripts
data = {base}

[os2_home]
stdlib = {userbase}/lib/python{py_version_short}
platstdlib = {userbase}/lib/python{py_version_short}
purelib = {userbase}/lib/python{py_version_short}/site-packages
platlib = {userbase}/lib/python{py_version_short}/site-packages
include = {userbase}/include/python{py_version_short}
scripts = {userbase}/bin
data = {userbase}

[nt_user]
stdlib = {userbase}/Python{py_version_nodot}
platstdlib = {userbase}/Python{py_version_nodot}
purelib = {userbase}/Python{py_version_nodot}/site-packages
platlib = {userbase}/Python{py_version_nodot}/site-packages
include = {userbase}/Python{py_version_nodot}/Include
scripts = {userbase}/Scripts
data = {userbase}

[posix_user]
stdlib = {userbase}/lib/python{py_version_short}
platstdlib = {userbase}/lib/python{py_version_short}
purelib = {userbase}/lib/python{py_version_short}/site-packages
platlib = {userbase}/lib/python{py_version_short}/site-packages
include = {userbase}/include/python{py_version_short}
scripts = {userbase}/bin
data = {userbase}

[osx_framework_user]
stdlib = {userbase}/lib/python
platstdlib = {userbase}/lib/python
purelib = {userbase}/lib/python/site-packages
platlib = {userbase}/lib/python/site-packages
include = {userbase}/include
scripts = {userbase}/bin
data = {userbase}
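
The bracketed names above are template variables; the backported sysconfig substitutes them from the interpreter's configuration when a scheme is expanded. A minimal sketch of that expansion, assuming the standard-library sysconfig module (whose get_paths() mirrors the backport's API) and a POSIX interpreter; exact values vary by installation:

    import sysconfig

    # Expand the posix_prefix scheme defined above; 'purelib' typically
    # resolves to <prefix>/lib/pythonX.Y/site-packages.
    paths = sysconfig.get_paths(scheme="posix_prefix")
    print(paths["purelib"])
    print(sysconfig.get_config_var("py_version_short"))  # e.g. "3.6"
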
site-packages/pip/_vendor/distlib/_backport/tarfile.py000064400000264724147511334610017132 0ustar00#-------------------------------------------------------------------
# tarfile.py
#-------------------------------------------------------------------
# Copyright (C) 2002 Lars Gustaebel <lars@gustaebel.de>
# All rights reserved.
#
# Permission  is  hereby granted,  free  of charge,  to  any person
# obtaining a  copy of  this software  and associated documentation
# files  (the  "Software"),  to   deal  in  the  Software   without
# restriction,  including  without limitation  the  rights to  use,
# copy, modify, merge, publish, distribute, sublicense, and/or sell
# copies  of  the  Software,  and to  permit  persons  to  whom the
# Software  is  furnished  to  do  so,  subject  to  the  following
# conditions:
#
# The above copyright  notice and this  permission notice shall  be
# included in all copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS  IS", WITHOUT WARRANTY OF ANY  KIND,
# EXPRESS OR IMPLIED, INCLUDING  BUT NOT LIMITED TO  THE WARRANTIES
# OF  MERCHANTABILITY,  FITNESS   FOR  A  PARTICULAR   PURPOSE  AND
# NONINFRINGEMENT.  IN  NO  EVENT SHALL  THE  AUTHORS  OR COPYRIGHT
# HOLDERS  BE LIABLE  FOR ANY  CLAIM, DAMAGES  OR OTHER  LIABILITY,
# WHETHER  IN AN  ACTION OF  CONTRACT, TORT  OR OTHERWISE,  ARISING
# FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR
# OTHER DEALINGS IN THE SOFTWARE.
#
from __future__ import print_function

"""Read from and write to tar format archives.
"""

__version__ = "$Revision$"

version     = "0.9.0"
__author__  = "Lars Gust\u00e4bel (lars@gustaebel.de)"
__date__    = "$Date: 2011-02-25 17:42:01 +0200 (Fri, 25 Feb 2011) $"
__cvsid__   = "$Id: tarfile.py 88586 2011-02-25 15:42:01Z marc-andre.lemburg $"
__credits__ = "Gustavo Niemeyer, Niels Gust\u00e4bel, Richard Townsend."

#---------
# Imports
#---------
import sys
import os
import stat
import errno
import time
import struct
import copy
import re

try:
    import grp, pwd
except ImportError:
    grp = pwd = None

# os.symlink on Windows prior to 6.0 raises NotImplementedError
symlink_exception = (AttributeError, NotImplementedError)
try:
    # WindowsError (1314) will be raised if the caller does not hold the
    # SeCreateSymbolicLinkPrivilege privilege
    symlink_exception += (WindowsError,)
except NameError:
    pass

# from tarfile import *
__all__ = ["TarFile", "TarInfo", "is_tarfile", "TarError"]
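
# A minimal usage sketch (the public API mirrors the standard library's
# tarfile module; "example.tar.gz" is a hypothetical archive name):
#
#     tar = TarFile.open("example.tar.gz", "r:gz")
#     try:
#         tar.extractall()
#     finally:
#         tar.close()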

if sys.version_info[0] < 3:
    import __builtin__ as builtins
else:
    import builtins

_open = builtins.open   # Since 'open' is TarFile.open

#---------------------------------------------------------
# tar constants
#---------------------------------------------------------
NUL = b"\0"                     # the null character
BLOCKSIZE = 512                 # length of processing blocks
RECORDSIZE = BLOCKSIZE * 20     # length of records
GNU_MAGIC = b"ustar  \0"        # magic gnu tar string
POSIX_MAGIC = b"ustar\x0000"    # magic posix tar string

LENGTH_NAME = 100               # maximum length of a filename
LENGTH_LINK = 100               # maximum length of a linkname
LENGTH_PREFIX = 155             # maximum length of the prefix field

REGTYPE = b"0"                  # regular file
AREGTYPE = b"\0"                # regular file
LNKTYPE = b"1"                  # link (inside tarfile)
SYMTYPE = b"2"                  # symbolic link
CHRTYPE = b"3"                  # character special device
BLKTYPE = b"4"                  # block special device
DIRTYPE = b"5"                  # directory
FIFOTYPE = b"6"                 # fifo special device
CONTTYPE = b"7"                 # contiguous file

GNUTYPE_LONGNAME = b"L"         # GNU tar longname
GNUTYPE_LONGLINK = b"K"         # GNU tar longlink
GNUTYPE_SPARSE = b"S"           # GNU tar sparse file

XHDTYPE = b"x"                  # POSIX.1-2001 extended header
XGLTYPE = b"g"                  # POSIX.1-2001 global header
SOLARIS_XHDTYPE = b"X"          # Solaris extended header

USTAR_FORMAT = 0                # POSIX.1-1988 (ustar) format
GNU_FORMAT = 1                  # GNU tar format
PAX_FORMAT = 2                  # POSIX.1-2001 (pax) format
DEFAULT_FORMAT = GNU_FORMAT

#---------------------------------------------------------
# tarfile constants
#---------------------------------------------------------
# File types that tarfile supports:
SUPPORTED_TYPES = (REGTYPE, AREGTYPE, LNKTYPE,
                   SYMTYPE, DIRTYPE, FIFOTYPE,
                   CONTTYPE, CHRTYPE, BLKTYPE,
                   GNUTYPE_LONGNAME, GNUTYPE_LONGLINK,
                   GNUTYPE_SPARSE)

# File types that will be treated as a regular file.
REGULAR_TYPES = (REGTYPE, AREGTYPE,
                 CONTTYPE, GNUTYPE_SPARSE)

# File types that are part of the GNU tar format.
GNU_TYPES = (GNUTYPE_LONGNAME, GNUTYPE_LONGLINK,
             GNUTYPE_SPARSE)

# Fields from a pax header that override a TarInfo attribute.
PAX_FIELDS = ("path", "linkpath", "size", "mtime",
              "uid", "gid", "uname", "gname")

# Fields from a pax header that are affected by hdrcharset.
PAX_NAME_FIELDS = set(("path", "linkpath", "uname", "gname"))

# Fields in a pax header that are numbers, all other fields
# are treated as strings.
PAX_NUMBER_FIELDS = {
    "atime": float,
    "ctime": float,
    "mtime": float,
    "uid": int,
    "gid": int,
    "size": int
}

#---------------------------------------------------------
# Bits used in the mode field, values in octal.
#---------------------------------------------------------
S_IFLNK = 0o120000        # symbolic link
S_IFREG = 0o100000        # regular file
S_IFBLK = 0o060000        # block device
S_IFDIR = 0o040000        # directory
S_IFCHR = 0o020000        # character device
S_IFIFO = 0o010000        # fifo

TSUID   = 0o4000          # set UID on execution
TSGID   = 0o2000          # set GID on execution
TSVTX   = 0o1000          # reserved

TUREAD  = 0o400           # read by owner
TUWRITE = 0o200           # write by owner
TUEXEC  = 0o100           # execute/search by owner
TGREAD  = 0o040           # read by group
TGWRITE = 0o020           # write by group
TGEXEC  = 0o010           # execute/search by group
TOREAD  = 0o004           # read by other
TOWRITE = 0o002           # write by other
TOEXEC  = 0o001           # execute/search by other

#---------------------------------------------------------
# initialization
#---------------------------------------------------------
if os.name in ("nt", "ce"):
    ENCODING = "utf-8"
else:
    ENCODING = sys.getfilesystemencoding()

#---------------------------------------------------------
# Some useful functions
#---------------------------------------------------------

def stn(s, length, encoding, errors):
    """Convert a string to a null-terminated bytes object.
    """
    s = s.encode(encoding, errors)
    return s[:length] + (length - len(s)) * NUL

def nts(s, encoding, errors):
    """Convert a null-terminated bytes object to a string.
    """
    p = s.find(b"\0")
    if p != -1:
        s = s[:p]
    return s.decode(encoding, errors)
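
# Illustrative round trip for the two helpers above:
#
#     stn("foo", 6, "utf-8", "strict")            == b"foo\x00\x00\x00"
#     nts(b"foo\x00\x00\x00", "utf-8", "strict")  == "foo"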

def nti(s):
    """Convert a number field to a python number.
    """
    # There are two possible encodings for a number field, see
    # itn() below.
    # Compare a one-byte slice so this check also works on Python 3, where
    # indexing a bytes object yields an int rather than a 1-char string.
    if s[0:1] != b"\200":
        try:
            n = int(nts(s, "ascii", "strict") or "0", 8)
        except ValueError:
            raise InvalidHeaderError("invalid header")
    else:
        n = 0
        for i in range(len(s) - 1):
            n <<= 8
            n += ord(s[i + 1:i + 2])
    return n

def itn(n, digits=8, format=DEFAULT_FORMAT):
    """Convert a python number to a number field.
    """
    # POSIX 1003.1-1988 requires numbers to be encoded as a string of
    # octal digits followed by a null-byte, this allows values up to
    # (8**(digits-1))-1. GNU tar allows storing numbers greater than
    # that if necessary. A leading 0o200 byte indicates this particular
    # encoding, the following digits-1 bytes are a big-endian
    # representation. This allows values up to (256**(digits-1))-1.
    if 0 <= n < 8 ** (digits - 1):
        s = ("%0*o" % (digits - 1, n)).encode("ascii") + NUL
    else:
        if format != GNU_FORMAT or n >= 256 ** (digits - 1):
            raise ValueError("overflow in number field")

        if n < 0:
            # XXX We mimic GNU tar's behaviour with negative numbers,
            # this could raise OverflowError.
            n = struct.unpack("L", struct.pack("l", n))[0]

        s = bytearray()
        for i in range(digits - 1):
            s.insert(0, n & 0o377)
            n >>= 8
        s.insert(0, 0o200)
    return s
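
# Worked examples of the two encodings (illustrative only):
#
#     itn(511)             == b"0000777\x00"    # 7 octal digits + NUL
#     nti(b"0000777\x00")  == 511
#     itn(8 ** 7)          == bytearray(b"\x80\x00\x00\x00\x20\x00\x00\x00")
#                             # 0o200 marker + 7-byte big-endian value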

def calc_chksums(buf):
    """Calculate the checksum for a member's header by summing up all
       characters except for the chksum field which is treated as if
       it was filled with spaces. According to the GNU tar sources,
       some tars (Sun and NeXT) calculate chksum with signed char,
       which will be different if there are chars in the buffer with
       the high bit set. So we calculate two checksums, unsigned and
       signed.
    """
    unsigned_chksum = 256 + sum(struct.unpack("148B", buf[:148]) + struct.unpack("356B", buf[156:512]))
    signed_chksum = 256 + sum(struct.unpack("148b", buf[:148]) + struct.unpack("356b", buf[156:512]))
    return unsigned_chksum, signed_chksum
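
# In calc_chksums() the chksum field itself (bytes 148-156 of the header) is
# skipped and accounted for as eight ASCII spaces, which is where the leading
# constant 256 (8 * 0x20) in both sums comes from.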

def copyfileobj(src, dst, length=None):
    """Copy length bytes from fileobj src to fileobj dst.
       If length is None, copy the entire content.
    """
    if length == 0:
        return
    if length is None:
        while True:
            buf = src.read(16*1024)
            if not buf:
                break
            dst.write(buf)
        return

    BUFSIZE = 16 * 1024
    blocks, remainder = divmod(length, BUFSIZE)
    for b in range(blocks):
        buf = src.read(BUFSIZE)
        if len(buf) < BUFSIZE:
            raise IOError("end of file reached")
        dst.write(buf)

    if remainder != 0:
        buf = src.read(remainder)
        if len(buf) < remainder:
            raise IOError("end of file reached")
        dst.write(buf)
    return

filemode_table = (
    ((S_IFLNK,      "l"),
     (S_IFREG,      "-"),
     (S_IFBLK,      "b"),
     (S_IFDIR,      "d"),
     (S_IFCHR,      "c"),
     (S_IFIFO,      "p")),

    ((TUREAD,       "r"),),
    ((TUWRITE,      "w"),),
    ((TUEXEC|TSUID, "s"),
     (TSUID,        "S"),
     (TUEXEC,       "x")),

    ((TGREAD,       "r"),),
    ((TGWRITE,      "w"),),
    ((TGEXEC|TSGID, "s"),
     (TSGID,        "S"),
     (TGEXEC,       "x")),

    ((TOREAD,       "r"),),
    ((TOWRITE,      "w"),),
    ((TOEXEC|TSVTX, "t"),
     (TSVTX,        "T"),
     (TOEXEC,       "x"))
)

def filemode(mode):
    """Convert a file's mode to a string of the form
       -rwxrwxrwx.
       Used by TarFile.list()
    """
    perm = []
    for table in filemode_table:
        for bit, char in table:
            if mode & bit == bit:
                perm.append(char)
                break
        else:
            perm.append("-")
    return "".join(perm)
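
# For example (illustrative only):
#
#     filemode(0o100644) == "-rw-r--r--"   # regular file
#     filemode(0o040755) == "drwxr-xr-x"   # directory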

class TarError(Exception):
    """Base exception."""
    pass
class ExtractError(TarError):
    """General exception for extract errors."""
    pass
class ReadError(TarError):
    """Exception for unreadable tar archives."""
    pass
class CompressionError(TarError):
    """Exception for unavailable compression methods."""
    pass
class StreamError(TarError):
    """Exception for unsupported operations on stream-like TarFiles."""
    pass
class HeaderError(TarError):
    """Base exception for header errors."""
    pass
class EmptyHeaderError(HeaderError):
    """Exception for empty headers."""
    pass
class TruncatedHeaderError(HeaderError):
    """Exception for truncated headers."""
    pass
class EOFHeaderError(HeaderError):
    """Exception for end of file headers."""
    pass
class InvalidHeaderError(HeaderError):
    """Exception for invalid headers."""
    pass
class SubsequentHeaderError(HeaderError):
    """Exception for missing and invalid extended headers."""
    pass

#---------------------------
# internal stream interface
#---------------------------
class _LowLevelFile(object):
    """Low-level file object. Supports reading and writing.
       It is used instead of a regular file object for streaming
       access.
    """

    def __init__(self, name, mode):
        mode = {
            "r": os.O_RDONLY,
            "w": os.O_WRONLY | os.O_CREAT | os.O_TRUNC,
        }[mode]
        if hasattr(os, "O_BINARY"):
            mode |= os.O_BINARY
        self.fd = os.open(name, mode, 0o666)

    def close(self):
        os.close(self.fd)

    def read(self, size):
        return os.read(self.fd, size)

    def write(self, s):
        os.write(self.fd, s)

class _Stream(object):
    """Class that serves as an adapter between TarFile and
       a stream-like object.  The stream-like object only
       needs to have a read() or write() method and is accessed
       blockwise.  Use of gzip or bzip2 compression is possible.
       A stream-like object could be for example: sys.stdin,
       sys.stdout, a socket, a tape device etc.

       _Stream is intended to be used only internally.
    """

    def __init__(self, name, mode, comptype, fileobj, bufsize):
        """Construct a _Stream object.
        """
        self._extfileobj = True
        if fileobj is None:
            fileobj = _LowLevelFile(name, mode)
            self._extfileobj = False

        if comptype == '*':
            # Enable transparent compression detection for the
            # stream interface
            fileobj = _StreamProxy(fileobj)
            comptype = fileobj.getcomptype()

        self.name     = name or ""
        self.mode     = mode
        self.comptype = comptype
        self.fileobj  = fileobj
        self.bufsize  = bufsize
        self.buf      = b""
        self.pos      = 0
        self.closed   = False

        try:
            if comptype == "gz":
                try:
                    import zlib
                except ImportError:
                    raise CompressionError("zlib module is not available")
                self.zlib = zlib
                self.crc = zlib.crc32(b"")
                if mode == "r":
                    self._init_read_gz()
                else:
                    self._init_write_gz()

            if comptype == "bz2":
                try:
                    import bz2
                except ImportError:
                    raise CompressionError("bz2 module is not available")
                if mode == "r":
                    self.dbuf = b""
                    self.cmp = bz2.BZ2Decompressor()
                else:
                    self.cmp = bz2.BZ2Compressor()
        except:
            if not self._extfileobj:
                self.fileobj.close()
            self.closed = True
            raise

    def __del__(self):
        if hasattr(self, "closed") and not self.closed:
            self.close()

    def _init_write_gz(self):
        """Initialize for writing with gzip compression.
        """
        self.cmp = self.zlib.compressobj(9, self.zlib.DEFLATED,
                                            -self.zlib.MAX_WBITS,
                                            self.zlib.DEF_MEM_LEVEL,
                                            0)
        timestamp = struct.pack("<L", int(time.time()))
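        # Hand-written gzip member header (RFC 1952): magic bytes \037\213,
        # CM=\010 (deflate), FLG=\010 (only FNAME set), 4-byte little-endian
        # mtime, XFL=\002 (maximum compression) and OS=\377 (unknown).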
        self.__write(b"\037\213\010\010" + timestamp + b"\002\377")
        if self.name.endswith(".gz"):
            self.name = self.name[:-3]
        # RFC1952 says we must use ISO-8859-1 for the FNAME field.
        self.__write(self.name.encode("iso-8859-1", "replace") + NUL)

    def write(self, s):
        """Write string s to the stream.
        """
        if self.comptype == "gz":
            self.crc = self.zlib.crc32(s, self.crc)
        self.pos += len(s)
        if self.comptype != "tar":
            s = self.cmp.compress(s)
        self.__write(s)

    def __write(self, s):
        """Write string s to the stream if a whole new block
           is ready to be written.
        """
        self.buf += s
        while len(self.buf) > self.bufsize:
            self.fileobj.write(self.buf[:self.bufsize])
            self.buf = self.buf[self.bufsize:]

    def close(self):
        """Close the _Stream object. No operation should be
           done on it afterwards.
        """
        if self.closed:
            return

        if self.mode == "w" and self.comptype != "tar":
            self.buf += self.cmp.flush()

        if self.mode == "w" and self.buf:
            self.fileobj.write(self.buf)
            self.buf = b""
            if self.comptype == "gz":
                # The native zlib crc is an unsigned 32-bit integer, but
                # the Python wrapper implicitly casts that to a signed C
                # long.  So, on a 32-bit box self.crc may "look negative",
                # while the same crc on a 64-bit box may "look positive".
                # To avoid irksome warnings from the `struct` module, force
                # it to look positive on all boxes.
                self.fileobj.write(struct.pack("<L", self.crc & 0xffffffff))
                self.fileobj.write(struct.pack("<L", self.pos & 0xffffFFFF))

        if not self._extfileobj:
            self.fileobj.close()

        self.closed = True

    def _init_read_gz(self):
        """Initialize for reading a gzip compressed fileobj.
        """
        self.cmp = self.zlib.decompressobj(-self.zlib.MAX_WBITS)
        self.dbuf = b""

        # taken from gzip.GzipFile with some alterations
        if self.__read(2) != b"\037\213":
            raise ReadError("not a gzip file")
        if self.__read(1) != b"\010":
            raise CompressionError("unsupported compression method")

        flag = ord(self.__read(1))
        self.__read(6)

        if flag & 4:
            xlen = ord(self.__read(1)) + 256 * ord(self.__read(1))
            self.read(xlen)
        if flag & 8:
            while True:
                s = self.__read(1)
                if not s or s == NUL:
                    break
        if flag & 16:
            while True:
                s = self.__read(1)
                if not s or s == NUL:
                    break
        if flag & 2:
            self.__read(2)

    def tell(self):
        """Return the stream's file pointer position.
        """
        return self.pos

    def seek(self, pos=0):
        """Set the stream's file pointer to pos. Negative seeking
           is forbidden.
        """
        if pos - self.pos >= 0:
            blocks, remainder = divmod(pos - self.pos, self.bufsize)
            for i in range(blocks):
                self.read(self.bufsize)
            self.read(remainder)
        else:
            raise StreamError("seeking backwards is not allowed")
        return self.pos

    def read(self, size=None):
        """Return the next size number of bytes from the stream.
           If size is not defined, return all bytes of the stream
           up to EOF.
        """
        if size is None:
            t = []
            while True:
                buf = self._read(self.bufsize)
                if not buf:
                    break
                t.append(buf)
            buf = b"".join(t)   # the stream yields bytes, so join with b""
        else:
            buf = self._read(size)
        self.pos += len(buf)
        return buf

    def _read(self, size):
        """Return size bytes from the stream.
        """
        if self.comptype == "tar":
            return self.__read(size)

        c = len(self.dbuf)
        while c < size:
            buf = self.__read(self.bufsize)
            if not buf:
                break
            try:
                buf = self.cmp.decompress(buf)
            except IOError:
                raise ReadError("invalid compressed data")
            self.dbuf += buf
            c += len(buf)
        buf = self.dbuf[:size]
        self.dbuf = self.dbuf[size:]
        return buf

    def __read(self, size):
        """Return size bytes from stream. If internal buffer is empty,
           read another block from the stream.
        """
        c = len(self.buf)
        while c < size:
            buf = self.fileobj.read(self.bufsize)
            if not buf:
                break
            self.buf += buf
            c += len(buf)
        buf = self.buf[:size]
        self.buf = self.buf[size:]
        return buf
# class _Stream

class _StreamProxy(object):
    """Small proxy class that enables transparent compression
       detection for the Stream interface (mode 'r|*').
    """

    def __init__(self, fileobj):
        self.fileobj = fileobj
        self.buf = self.fileobj.read(BLOCKSIZE)

    def read(self, size):
        self.read = self.fileobj.read
        return self.buf

    def getcomptype(self):
        if self.buf.startswith(b"\037\213\010"):
            return "gz"
        if self.buf.startswith(b"BZh91"):
            return "bz2"
        return "tar"

    def close(self):
        self.fileobj.close()
# class StreamProxy

class _BZ2Proxy(object):
    """Small proxy class that enables external file object
       support for "r:bz2" and "w:bz2" modes. This is actually
       a workaround for a limitation in bz2 module's BZ2File
       class which (unlike gzip.GzipFile) has no support for
       a file object argument.
    """

    blocksize = 16 * 1024

    def __init__(self, fileobj, mode):
        self.fileobj = fileobj
        self.mode = mode
        self.name = getattr(self.fileobj, "name", None)
        self.init()

    def init(self):
        import bz2
        self.pos = 0
        if self.mode == "r":
            self.bz2obj = bz2.BZ2Decompressor()
            self.fileobj.seek(0)
            self.buf = b""
        else:
            self.bz2obj = bz2.BZ2Compressor()

    def read(self, size):
        x = len(self.buf)
        while x < size:
            raw = self.fileobj.read(self.blocksize)
            if not raw:
                break
            data = self.bz2obj.decompress(raw)
            self.buf += data
            x += len(data)

        buf = self.buf[:size]
        self.buf = self.buf[size:]
        self.pos += len(buf)
        return buf

    def seek(self, pos):
        if pos < self.pos:
            self.init()
        self.read(pos - self.pos)

    def tell(self):
        return self.pos

    def write(self, data):
        self.pos += len(data)
        raw = self.bz2obj.compress(data)
        self.fileobj.write(raw)

    def close(self):
        if self.mode == "w":
            raw = self.bz2obj.flush()
            self.fileobj.write(raw)
# class _BZ2Proxy

#------------------------
# Extraction file object
#------------------------
class _FileInFile(object):
    """A thin wrapper around an existing file object that
       provides a part of its data as an individual file
       object.
    """

    def __init__(self, fileobj, offset, size, blockinfo=None):
        self.fileobj = fileobj
        self.offset = offset
        self.size = size
        self.position = 0

        if blockinfo is None:
            blockinfo = [(0, size)]

        # Construct a map with data and zero blocks.
        self.map_index = 0
        self.map = []
        lastpos = 0
        realpos = self.offset
        for offset, size in blockinfo:
            if offset > lastpos:
                self.map.append((False, lastpos, offset, None))
            self.map.append((True, offset, offset + size, realpos))
            realpos += size
            lastpos = offset + size
        if lastpos < self.size:
            self.map.append((False, lastpos, self.size, None))

    def seekable(self):
        if not hasattr(self.fileobj, "seekable"):
            # XXX gzip.GzipFile and bz2.BZ2File
            return True
        return self.fileobj.seekable()

    def tell(self):
        """Return the current file position.
        """
        return self.position

    def seek(self, position):
        """Seek to a position in the file.
        """
        self.position = position

    def read(self, size=None):
        """Read data from the file.
        """
        if size is None:
            size = self.size - self.position
        else:
            size = min(size, self.size - self.position)

        buf = b""
        while size > 0:
            while True:
                data, start, stop, offset = self.map[self.map_index]
                if start <= self.position < stop:
                    break
                else:
                    self.map_index += 1
                    if self.map_index == len(self.map):
                        self.map_index = 0
            length = min(size, stop - self.position)
            if data:
                self.fileobj.seek(offset + (self.position - start))
                buf += self.fileobj.read(length)
            else:
                buf += NUL * length
            size -= length
            self.position += length
        return buf
# class _FileInFile
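# Worked example of the block map built above: for offset=1000, size=100 and
# blockinfo=[(10, 20), (50, 10)] the map is
#
#     [(False,  0,  10, None),     # hole, read back as NUL bytes
#      (True,  10,  30, 1000),     # data, read from the archive at 1000..1019
#      (False, 30,  50, None),
#      (True,  50,  60, 1020),
#      (False, 60, 100, None)]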


class ExFileObject(object):
    """File-like object for reading an archive member.
       Is returned by TarFile.extractfile().
    """
    blocksize = 1024

    def __init__(self, tarfile, tarinfo):
        self.fileobj = _FileInFile(tarfile.fileobj,
                                   tarinfo.offset_data,
                                   tarinfo.size,
                                   tarinfo.sparse)
        self.name = tarinfo.name
        self.mode = "r"
        self.closed = False
        self.size = tarinfo.size

        self.position = 0
        self.buffer = b""

    def readable(self):
        return True

    def writable(self):
        return False

    def seekable(self):
        return self.fileobj.seekable()

    def read(self, size=None):
        """Read at most size bytes from the file. If size is not
           present or None, read all data until EOF is reached.
        """
        if self.closed:
            raise ValueError("I/O operation on closed file")

        buf = b""
        if self.buffer:
            if size is None:
                buf = self.buffer
                self.buffer = b""
            else:
                buf = self.buffer[:size]
                self.buffer = self.buffer[size:]

        if size is None:
            buf += self.fileobj.read()
        else:
            buf += self.fileobj.read(size - len(buf))

        self.position += len(buf)
        return buf

    # XXX TextIOWrapper uses the read1() method.
    read1 = read

    def readline(self, size=-1):
        """Read one entire line from the file. If size is present
           and non-negative, return a string with at most that
           size, which may be an incomplete line.
        """
        if self.closed:
            raise ValueError("I/O operation on closed file")

        pos = self.buffer.find(b"\n") + 1
        if pos == 0:
            # no newline found.
            while True:
                buf = self.fileobj.read(self.blocksize)
                self.buffer += buf
                if not buf or b"\n" in buf:
                    pos = self.buffer.find(b"\n") + 1
                    if pos == 0:
                        # no newline found.
                        pos = len(self.buffer)
                    break

        if size != -1:
            pos = min(size, pos)

        buf = self.buffer[:pos]
        self.buffer = self.buffer[pos:]
        self.position += len(buf)
        return buf

    def readlines(self):
        """Return a list with all remaining lines.
        """
        result = []
        while True:
            line = self.readline()
            if not line: break
            result.append(line)
        return result

    def tell(self):
        """Return the current file position.
        """
        if self.closed:
            raise ValueError("I/O operation on closed file")

        return self.position

    def seek(self, pos, whence=os.SEEK_SET):
        """Seek to a position in the file.
        """
        if self.closed:
            raise ValueError("I/O operation on closed file")

        if whence == os.SEEK_SET:
            self.position = min(max(pos, 0), self.size)
        elif whence == os.SEEK_CUR:
            if pos < 0:
                self.position = max(self.position + pos, 0)
            else:
                self.position = min(self.position + pos, self.size)
        elif whence == os.SEEK_END:
            self.position = max(min(self.size + pos, self.size), 0)
        else:
            raise ValueError("Invalid argument")

        self.buffer = b""
        self.fileobj.seek(self.position)

    def close(self):
        """Close the file object.
        """
        self.closed = True

    def __iter__(self):
        """Get an iterator over the file's lines.
        """
        while True:
            line = self.readline()
            if not line:
                break
            yield line
# class ExFileObject
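# Typical use of ExFileObject (sketch; archive and member names are
# hypothetical):
#
#     # tf = TarFile.open("sample.tar")
#     # f = tf.extractfile("member.txt")   # returns an ExFileObject
#     # first_line = f.readline()
#     # f.seek(0)
#     # everything = f.read()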

#------------------
# Exported Classes
#------------------
class TarInfo(object):
    """Informational class which holds the details about an
       archive member given by a tar header block.
       TarInfo objects are returned by TarFile.getmember(),
       TarFile.getmembers() and TarFile.gettarinfo() and are
       usually created internally.
    """

    __slots__ = ("name", "mode", "uid", "gid", "size", "mtime",
                 "chksum", "type", "linkname", "uname", "gname",
                 "devmajor", "devminor",
                 "offset", "offset_data", "pax_headers", "sparse",
                 "tarfile", "_sparse_structs", "_link_target")

    def __init__(self, name=""):
        """Construct a TarInfo object. name is the optional name
           of the member.
        """
        self.name = name        # member name
        self.mode = 0o644       # file permissions
        self.uid = 0            # user id
        self.gid = 0            # group id
        self.size = 0           # file size
        self.mtime = 0          # modification time
        self.chksum = 0         # header checksum
        self.type = REGTYPE     # member type
        self.linkname = ""      # link name
        self.uname = ""         # user name
        self.gname = ""         # group name
        self.devmajor = 0       # device major number
        self.devminor = 0       # device minor number

        self.offset = 0         # the tar header starts here
        self.offset_data = 0    # the file's data starts here

        self.sparse = None      # sparse member information
        self.pax_headers = {}   # pax header information

    # In pax headers the "name" and "linkname" fields are called
    # "path" and "linkpath".
    def _getpath(self):
        return self.name
    def _setpath(self, name):
        self.name = name
    path = property(_getpath, _setpath)

    def _getlinkpath(self):
        return self.linkname
    def _setlinkpath(self, linkname):
        self.linkname = linkname
    linkpath = property(_getlinkpath, _setlinkpath)

    def __repr__(self):
        return "<%s %r at %#x>" % (self.__class__.__name__,self.name,id(self))

    def get_info(self):
        """Return the TarInfo's attributes as a dictionary.
        """
        info = {
            "name":     self.name,
            "mode":     self.mode & 0o7777,
            "uid":      self.uid,
            "gid":      self.gid,
            "size":     self.size,
            "mtime":    self.mtime,
            "chksum":   self.chksum,
            "type":     self.type,
            "linkname": self.linkname,
            "uname":    self.uname,
            "gname":    self.gname,
            "devmajor": self.devmajor,
            "devminor": self.devminor
        }

        if info["type"] == DIRTYPE and not info["name"].endswith("/"):
            info["name"] += "/"

        return info

    def tobuf(self, format=DEFAULT_FORMAT, encoding=ENCODING, errors="surrogateescape"):
        """Return a tar header as a string of 512 byte blocks.
        """
        info = self.get_info()

        if format == USTAR_FORMAT:
            return self.create_ustar_header(info, encoding, errors)
        elif format == GNU_FORMAT:
            return self.create_gnu_header(info, encoding, errors)
        elif format == PAX_FORMAT:
            return self.create_pax_header(info, encoding)
        else:
            raise ValueError("invalid format")

    def create_ustar_header(self, info, encoding, errors):
        """Return the object as a ustar header block.
        """
        info["magic"] = POSIX_MAGIC

        if len(info["linkname"]) > LENGTH_LINK:
            raise ValueError("linkname is too long")

        if len(info["name"]) > LENGTH_NAME:
            info["prefix"], info["name"] = self._posix_split_name(info["name"])

        return self._create_header(info, USTAR_FORMAT, encoding, errors)

    def create_gnu_header(self, info, encoding, errors):
        """Return the object as a GNU header block sequence.
        """
        info["magic"] = GNU_MAGIC

        buf = b""
        if len(info["linkname"]) > LENGTH_LINK:
            buf += self._create_gnu_long_header(info["linkname"], GNUTYPE_LONGLINK, encoding, errors)

        if len(info["name"]) > LENGTH_NAME:
            buf += self._create_gnu_long_header(info["name"], GNUTYPE_LONGNAME, encoding, errors)

        return buf + self._create_header(info, GNU_FORMAT, encoding, errors)

    def create_pax_header(self, info, encoding):
        """Return the object as a ustar header block. If it cannot be
           represented this way, prepend a pax extended header sequence
           with supplemental information.
        """
        info["magic"] = POSIX_MAGIC
        pax_headers = self.pax_headers.copy()

        # Test string fields for values that exceed the field length or cannot
        # be represented in ASCII encoding.
        for name, hname, length in (
                ("name", "path", LENGTH_NAME), ("linkname", "linkpath", LENGTH_LINK),
                ("uname", "uname", 32), ("gname", "gname", 32)):

            if hname in pax_headers:
                # The pax header has priority.
                continue

            # Try to encode the string as ASCII.
            try:
                info[name].encode("ascii", "strict")
            except UnicodeEncodeError:
                pax_headers[hname] = info[name]
                continue

            if len(info[name]) > length:
                pax_headers[hname] = info[name]

        # Test number fields for values that exceed the field limit or that
        # are stored as floats.
        for name, digits in (("uid", 8), ("gid", 8), ("size", 12), ("mtime", 12)):
            if name in pax_headers:
                # The pax header has priority. Avoid overflow.
                info[name] = 0
                continue

            val = info[name]
            if not 0 <= val < 8 ** (digits - 1) or isinstance(val, float):
                pax_headers[name] = str(val)
                info[name] = 0

        # Create a pax extended header if necessary.
        if pax_headers:
            buf = self._create_pax_generic_header(pax_headers, XHDTYPE, encoding)
        else:
            buf = b""

        return buf + self._create_header(info, USTAR_FORMAT, "ascii", "replace")
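    # Example of the tests above: a member whose name is 150 characters long,
    # or contains non-ASCII characters, does not fit into the plain ustar
    # "name" field, so "path" is added to pax_headers and an XHDTYPE extended
    # header is written in front of the regular ustar header.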

    @classmethod
    def create_pax_global_header(cls, pax_headers):
        """Return the object as a pax global header block sequence.
        """
        return cls._create_pax_generic_header(pax_headers, XGLTYPE, "utf8")

    def _posix_split_name(self, name):
        """Split a name longer than 100 chars into a prefix
           and a name part.
        """
        prefix = name[:LENGTH_PREFIX + 1]
        while prefix and prefix[-1] != "/":
            prefix = prefix[:-1]

        name = name[len(prefix):]
        prefix = prefix[:-1]

        if not prefix or len(name) > LENGTH_NAME:
            raise ValueError("name is too long")
        return prefix, name
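    # The split point is the last "/" that falls within the first
    # LENGTH_PREFIX + 1 characters: everything before it goes into the
    # 155-byte "prefix" field, everything after it into the 100-byte "name"
    # field.  If there is no such slash, or the remainder still exceeds
    # 100 characters, the name cannot be represented in ustar format.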

    @staticmethod
    def _create_header(info, format, encoding, errors):
        """Return a header block. info is a dictionary with file
           information, format must be one of the *_FORMAT constants.
        """
        parts = [
            stn(info.get("name", ""), 100, encoding, errors),
            itn(info.get("mode", 0) & 0o7777, 8, format),
            itn(info.get("uid", 0), 8, format),
            itn(info.get("gid", 0), 8, format),
            itn(info.get("size", 0), 12, format),
            itn(info.get("mtime", 0), 12, format),
            b"        ", # checksum field
            info.get("type", REGTYPE),
            stn(info.get("linkname", ""), 100, encoding, errors),
            info.get("magic", POSIX_MAGIC),
            stn(info.get("uname", ""), 32, encoding, errors),
            stn(info.get("gname", ""), 32, encoding, errors),
            itn(info.get("devmajor", 0), 8, format),
            itn(info.get("devminor", 0), 8, format),
            stn(info.get("prefix", ""), 155, encoding, errors)
        ]

        buf = struct.pack("%ds" % BLOCKSIZE, b"".join(parts))
        chksum = calc_chksums(buf[-BLOCKSIZE:])[0]
        buf = buf[:-364] + ("%06o\0" % chksum).encode("ascii") + buf[-357:]
        return buf
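    # The checksum field occupies bytes 148-155 of the 512-byte block.  It is
    # computed with the field set to eight spaces (see above), then spliced
    # back in as six octal digits followed by a NUL: buf[:-364] keeps bytes
    # 0-147, the 7-byte "%06o\0" string fills bytes 148-154, and buf[-357:]
    # restores byte 155 onwards, so the last byte of the field stays a space.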

    @staticmethod
    def _create_payload(payload):
        """Return the string payload filled with zero bytes
           up to the next 512 byte border.
        """
        blocks, remainder = divmod(len(payload), BLOCKSIZE)
        if remainder > 0:
            payload += (BLOCKSIZE - remainder) * NUL
        return payload

    @classmethod
    def _create_gnu_long_header(cls, name, type, encoding, errors):
        """Return a GNUTYPE_LONGNAME or GNUTYPE_LONGLINK sequence
           for name.
        """
        name = name.encode(encoding, errors) + NUL

        info = {}
        info["name"] = "././@LongLink"
        info["type"] = type
        info["size"] = len(name)
        info["magic"] = GNU_MAGIC

        # create extended header + name blocks.
        return cls._create_header(info, USTAR_FORMAT, encoding, errors) + \
                cls._create_payload(name)
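    # The long name or link target is stored as an extra pseudo-member named
    # "././@LongLink": a ustar header of the given GNU type followed by the
    # NUL-terminated name as payload, padded to a multiple of 512 bytes.  The
    # real member's header comes right after this sequence.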

    @classmethod
    def _create_pax_generic_header(cls, pax_headers, type, encoding):
        """Return a POSIX.1-2008 extended or global header sequence
           that contains a list of keyword, value pairs. The values
           must be strings.
        """
        # Check if one of the fields contains surrogate characters and thereby
        # forces hdrcharset=BINARY; see _proc_pax() for more information.
        binary = False
        for keyword, value in pax_headers.items():
            try:
                value.encode("utf8", "strict")
            except UnicodeEncodeError:
                binary = True
                break

        records = b""
        if binary:
            # Put the hdrcharset field at the beginning of the header.
            records += b"21 hdrcharset=BINARY\n"

        for keyword, value in pax_headers.items():
            keyword = keyword.encode("utf8")
            if binary:
                # Try to restore the original byte representation of `value'.
                # Needless to say, the encoding must match the string.
                value = value.encode(encoding, "surrogateescape")
            else:
                value = value.encode("utf8")

            l = len(keyword) + len(value) + 3   # ' ' + '=' + '\n'
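            # The decimal length field counts itself, so iterate until the
            # total stabilizes.  Worked example: keyword b"path", value b"foo"
            # gives l = 10; p converges to 12, producing the 12-byte record
            # b"12 path=foo\n".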
            n = p = 0
            while True:
                n = l + len(str(p))
                if n == p:
                    break
                p = n
            records += bytes(str(p), "ascii") + b" " + keyword + b"=" + value + b"\n"

        # We use a hardcoded "././@PaxHeader" name like star does
        # instead of the one that POSIX recommends.
        info = {}
        info["name"] = "././@PaxHeader"
        info["type"] = type
        info["size"] = len(records)
        info["magic"] = POSIX_MAGIC

        # Create pax header + record blocks.
        return cls._create_header(info, USTAR_FORMAT, "ascii", "replace") + \
                cls._create_payload(records)

    @classmethod
    def frombuf(cls, buf, encoding, errors):
        """Construct a TarInfo object from a 512 byte bytes object.
        """
        if len(buf) == 0:
            raise EmptyHeaderError("empty header")
        if len(buf) != BLOCKSIZE:
            raise TruncatedHeaderError("truncated header")
        if buf.count(NUL) == BLOCKSIZE:
            raise EOFHeaderError("end of file header")

        chksum = nti(buf[148:156])
        if chksum not in calc_chksums(buf):
            raise InvalidHeaderError("bad checksum")

        obj = cls()
        obj.name = nts(buf[0:100], encoding, errors)
        obj.mode = nti(buf[100:108])
        obj.uid = nti(buf[108:116])
        obj.gid = nti(buf[116:124])
        obj.size = nti(buf[124:136])
        obj.mtime = nti(buf[136:148])
        obj.chksum = chksum
        obj.type = buf[156:157]
        obj.linkname = nts(buf[157:257], encoding, errors)
        obj.uname = nts(buf[265:297], encoding, errors)
        obj.gname = nts(buf[297:329], encoding, errors)
        obj.devmajor = nti(buf[329:337])
        obj.devminor = nti(buf[337:345])
        prefix = nts(buf[345:500], encoding, errors)

        # Old V7 tar format represents a directory as a regular
        # file with a trailing slash.
        if obj.type == AREGTYPE and obj.name.endswith("/"):
            obj.type = DIRTYPE

        # The old GNU sparse format occupies some of the unused
        # space in the buffer for up to 4 sparse structures.
        # Save them for later processing in _proc_sparse().
        if obj.type == GNUTYPE_SPARSE:
            pos = 386
            structs = []
            for i in range(4):
                try:
                    offset = nti(buf[pos:pos + 12])
                    numbytes = nti(buf[pos + 12:pos + 24])
                except ValueError:
                    break
                structs.append((offset, numbytes))
                pos += 24
            isextended = bool(buf[482])
            origsize = nti(buf[483:495])
            obj._sparse_structs = (structs, isextended, origsize)

        # Remove redundant slashes from directories.
        if obj.isdir():
            obj.name = obj.name.rstrip("/")

        # Reconstruct a ustar longname.
        if prefix and obj.type not in GNU_TYPES:
            obj.name = prefix + "/" + obj.name
        return obj

    @classmethod
    def fromtarfile(cls, tarfile):
        """Return the next TarInfo object from TarFile object
           tarfile.
        """
        buf = tarfile.fileobj.read(BLOCKSIZE)
        obj = cls.frombuf(buf, tarfile.encoding, tarfile.errors)
        obj.offset = tarfile.fileobj.tell() - BLOCKSIZE
        return obj._proc_member(tarfile)

    #--------------------------------------------------------------------------
    # The following are methods that are called depending on the type of a
    # member. The entry point is _proc_member() which can be overridden in a
    # subclass to add custom _proc_*() methods. A _proc_*() method MUST
    # implement the following operations:
    # 1. Set self.offset_data to the position where the data blocks begin,
    #    if there is data that follows.
    # 2. Set tarfile.offset to the position where the next member's header will
    #    begin.
    # 3. Return self or another valid TarInfo object.
    def _proc_member(self, tarfile):
        """Choose the right processing method depending on
           the type and call it.
        """
        if self.type in (GNUTYPE_LONGNAME, GNUTYPE_LONGLINK):
            return self._proc_gnulong(tarfile)
        elif self.type == GNUTYPE_SPARSE:
            return self._proc_sparse(tarfile)
        elif self.type in (XHDTYPE, XGLTYPE, SOLARIS_XHDTYPE):
            return self._proc_pax(tarfile)
        else:
            return self._proc_builtin(tarfile)

    def _proc_builtin(self, tarfile):
        """Process a builtin type or an unknown type which
           will be treated as a regular file.
        """
        self.offset_data = tarfile.fileobj.tell()
        offset = self.offset_data
        if self.isreg() or self.type not in SUPPORTED_TYPES:
            # Skip the following data blocks.
            offset += self._block(self.size)
        tarfile.offset = offset

        # Patch the TarInfo object with saved global
        # header information.
        self._apply_pax_info(tarfile.pax_headers, tarfile.encoding, tarfile.errors)

        return self

    def _proc_gnulong(self, tarfile):
        """Process the blocks that hold a GNU longname
           or longlink member.
        """
        buf = tarfile.fileobj.read(self._block(self.size))

        # Fetch the next header and process it.
        try:
            next = self.fromtarfile(tarfile)
        except HeaderError:
            raise SubsequentHeaderError("missing or bad subsequent header")

        # Patch the TarInfo object from the next header with
        # the longname information.
        next.offset = self.offset
        if self.type == GNUTYPE_LONGNAME:
            next.name = nts(buf, tarfile.encoding, tarfile.errors)
        elif self.type == GNUTYPE_LONGLINK:
            next.linkname = nts(buf, tarfile.encoding, tarfile.errors)

        return next

    def _proc_sparse(self, tarfile):
        """Process a GNU sparse header plus extra headers.
        """
        # We already collected some sparse structures in frombuf().
        structs, isextended, origsize = self._sparse_structs
        del self._sparse_structs

        # Collect sparse structures from extended header blocks.
        while isextended:
            buf = tarfile.fileobj.read(BLOCKSIZE)
            pos = 0
            for i in range(21):
                try:
                    offset = nti(buf[pos:pos + 12])
                    numbytes = nti(buf[pos + 12:pos + 24])
                except ValueError:
                    break
                if offset and numbytes:
                    structs.append((offset, numbytes))
                pos += 24
            isextended = bool(buf[504])
        self.sparse = structs

        self.offset_data = tarfile.fileobj.tell()
        tarfile.offset = self.offset_data + self._block(self.size)
        self.size = origsize
        return self

    def _proc_pax(self, tarfile):
        """Process an extended or global header as described in
           POSIX.1-2008.
        """
        # Read the header information.
        buf = tarfile.fileobj.read(self._block(self.size))

        # A pax header stores supplemental information for either
        # the following file (extended) or all following files
        # (global).
        if self.type == XGLTYPE:
            pax_headers = tarfile.pax_headers
        else:
            pax_headers = tarfile.pax_headers.copy()

        # Check if the pax header contains a hdrcharset field. This tells us
        # the encoding of the path, linkpath, uname and gname fields. Normally,
        # these fields are UTF-8 encoded, but POSIX.1-2008 allows tar
        # implementations to store them as raw binary strings if the
        # translation to UTF-8 fails.
        match = re.search(br"\d+ hdrcharset=([^\n]+)\n", buf)
        if match is not None:
            pax_headers["hdrcharset"] = match.group(1).decode("utf8")

        # For the time being, we don't care about anything other than "BINARY".
        # The only other value that is currently allowed by the standard is
        # "ISO-IR 10646 2000 UTF-8" in other words UTF-8.
        hdrcharset = pax_headers.get("hdrcharset")
        if hdrcharset == "BINARY":
            encoding = tarfile.encoding
        else:
            encoding = "utf8"

        # Parse pax header information. A record looks like this:
        # "%d %s=%s\n" % (length, keyword, value). length is the size
        # of the complete record including the length field itself and
        # the newline. keyword and value are both UTF-8 encoded strings.
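        # For example, b"12 path=foo\n" is one complete 12-byte record.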
        regex = re.compile(br"(\d+) ([^=]+)=")
        pos = 0
        while True:
            match = regex.match(buf, pos)
            if not match:
                break

            length, keyword = match.groups()
            length = int(length)
            value = buf[match.end(2) + 1:match.start(1) + length - 1]

            # Normally, we could just use "utf8" as the encoding and "strict"
            # as the error handler, but we better not take the risk. For
            # example, GNU tar <= 1.23 is known to store filenames it cannot
            # translate to UTF-8 as raw strings (unfortunately without a
            # hdrcharset=BINARY header).
            # We first try the strict standard encoding, and if that fails we
            # fall back on the user's encoding and error handler.
            keyword = self._decode_pax_field(keyword, "utf8", "utf8",
                    tarfile.errors)
            if keyword in PAX_NAME_FIELDS:
                value = self._decode_pax_field(value, encoding, tarfile.encoding,
                        tarfile.errors)
            else:
                value = self._decode_pax_field(value, "utf8", "utf8",
                        tarfile.errors)

            pax_headers[keyword] = value
            pos += length

        # Fetch the next header.
        try:
            next = self.fromtarfile(tarfile)
        except HeaderError:
            raise SubsequentHeaderError("missing or bad subsequent header")

        # Process GNU sparse information.
        if "GNU.sparse.map" in pax_headers:
            # GNU extended sparse format version 0.1.
            self._proc_gnusparse_01(next, pax_headers)

        elif "GNU.sparse.size" in pax_headers:
            # GNU extended sparse format version 0.0.
            self._proc_gnusparse_00(next, pax_headers, buf)

        elif pax_headers.get("GNU.sparse.major") == "1" and pax_headers.get("GNU.sparse.minor") == "0":
            # GNU extended sparse format version 1.0.
            self._proc_gnusparse_10(next, pax_headers, tarfile)

        if self.type in (XHDTYPE, SOLARIS_XHDTYPE):
            # Patch the TarInfo object with the extended header info.
            next._apply_pax_info(pax_headers, tarfile.encoding, tarfile.errors)
            next.offset = self.offset

            if "size" in pax_headers:
                # If the extended header replaces the size field,
                # we need to recalculate the offset where the next
                # header starts.
                offset = next.offset_data
                if next.isreg() or next.type not in SUPPORTED_TYPES:
                    offset += next._block(next.size)
                tarfile.offset = offset

        return next

    def _proc_gnusparse_00(self, next, pax_headers, buf):
        """Process a GNU tar extended sparse header, version 0.0.
        """
        offsets = []
        for match in re.finditer(br"\d+ GNU.sparse.offset=(\d+)\n", buf):
            offsets.append(int(match.group(1)))
        numbytes = []
        for match in re.finditer(br"\d+ GNU.sparse.numbytes=(\d+)\n", buf):
            numbytes.append(int(match.group(1)))
        next.sparse = list(zip(offsets, numbytes))

    def _proc_gnusparse_01(self, next, pax_headers):
        """Process a GNU tar extended sparse header, version 0.1.
        """
        sparse = [int(x) for x in pax_headers["GNU.sparse.map"].split(",")]
        next.sparse = list(zip(sparse[::2], sparse[1::2]))

    def _proc_gnusparse_10(self, next, pax_headers, tarfile):
        """Process a GNU tar extended sparse header, version 1.0.
        """
        fields = None
        sparse = []
        buf = tarfile.fileobj.read(BLOCKSIZE)
        fields, buf = buf.split(b"\n", 1)
        fields = int(fields)
        while len(sparse) < fields * 2:
            if b"\n" not in buf:
                buf += tarfile.fileobj.read(BLOCKSIZE)
            number, buf = buf.split(b"\n", 1)
            sparse.append(int(number))
        next.offset_data = tarfile.fileobj.tell()
        next.sparse = list(zip(sparse[::2], sparse[1::2]))
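    # In the GNU sparse 1.0 layout the member's data area starts with the
    # sparse map: a decimal entry count terminated by a newline, then the
    # offset and size of every data chunk, each as a newline-terminated
    # decimal, padded up to a block boundary.  For instance, a map beginning
    # with b"2\n0\n1024\n2048\n512\n" yields sparse = [(0, 1024), (2048, 512)].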

    def _apply_pax_info(self, pax_headers, encoding, errors):
        """Replace fields with supplemental information from a previous
           pax extended or global header.
        """
        for keyword, value in pax_headers.items():
            if keyword == "GNU.sparse.name":
                setattr(self, "path", value)
            elif keyword == "GNU.sparse.size":
                setattr(self, "size", int(value))
            elif keyword == "GNU.sparse.realsize":
                setattr(self, "size", int(value))
            elif keyword in PAX_FIELDS:
                if keyword in PAX_NUMBER_FIELDS:
                    try:
                        value = PAX_NUMBER_FIELDS[keyword](value)
                    except ValueError:
                        value = 0
                if keyword == "path":
                    value = value.rstrip("/")
                setattr(self, keyword, value)

        self.pax_headers = pax_headers.copy()

    def _decode_pax_field(self, value, encoding, fallback_encoding, fallback_errors):
        """Decode a single field from a pax record.
        """
        try:
            return value.decode(encoding, "strict")
        except UnicodeDecodeError:
            return value.decode(fallback_encoding, fallback_errors)

    def _block(self, count):
        """Round up a byte count by BLOCKSIZE and return it,
           e.g. _block(834) => 1024.
        """
        blocks, remainder = divmod(count, BLOCKSIZE)
        if remainder:
            blocks += 1
        return blocks * BLOCKSIZE

    def isreg(self):
        return self.type in REGULAR_TYPES
    def isfile(self):
        return self.isreg()
    def isdir(self):
        return self.type == DIRTYPE
    def issym(self):
        return self.type == SYMTYPE
    def islnk(self):
        return self.type == LNKTYPE
    def ischr(self):
        return self.type == CHRTYPE
    def isblk(self):
        return self.type == BLKTYPE
    def isfifo(self):
        return self.type == FIFOTYPE
    def issparse(self):
        return self.sparse is not None
    def isdev(self):
        return self.type in (CHRTYPE, BLKTYPE, FIFOTYPE)
# class TarInfo

class TarFile(object):
    """The TarFile Class provides an interface to tar archives.
    """

    debug = 0                   # May be set from 0 (no msgs) to 3 (all msgs)

    dereference = False         # If true, add content of linked file to the
                                # tar file, else the link.

    ignore_zeros = False        # If true, skips empty or invalid blocks and
                                # continues processing.

    errorlevel = 1              # If 0, fatal errors only appear in debug
                                # messages (if debug >= 0). If > 0, errors
                                # are passed to the caller as exceptions.

    format = DEFAULT_FORMAT     # The format to use when creating an archive.

    encoding = ENCODING         # Encoding for 8-bit character strings.

    errors = None               # Error handler for unicode conversion.

    tarinfo = TarInfo           # The default TarInfo class to use.

    fileobject = ExFileObject   # The default ExFileObject class to use.

    def __init__(self, name=None, mode="r", fileobj=None, format=None,
            tarinfo=None, dereference=None, ignore_zeros=None, encoding=None,
            errors="surrogateescape", pax_headers=None, debug=None, errorlevel=None):
        """Open an (uncompressed) tar archive `name'. `mode' is either 'r' to
           read from an existing archive, 'a' to append data to an existing
           file or 'w' to create a new file overwriting an existing one. `mode'
           defaults to 'r'.
           If `fileobj' is given, it is used for reading or writing data. If it
           can be determined, `mode' is overridden by `fileobj's mode.
           `fileobj' is not closed when the TarFile is closed.
        """
        if len(mode) > 1 or mode not in "raw":
            raise ValueError("mode must be 'r', 'a' or 'w'")
        self.mode = mode
        self._mode = {"r": "rb", "a": "r+b", "w": "wb"}[mode]

        if not fileobj:
            if self.mode == "a" and not os.path.exists(name):
                # Create nonexistent files in append mode.
                self.mode = "w"
                self._mode = "wb"
            fileobj = bltn_open(name, self._mode)
            self._extfileobj = False
        else:
            if name is None and hasattr(fileobj, "name"):
                name = fileobj.name
            if hasattr(fileobj, "mode"):
                self._mode = fileobj.mode
            self._extfileobj = True
        self.name = os.path.abspath(name) if name else None
        self.fileobj = fileobj

        # Init attributes.
        if format is not None:
            self.format = format
        if tarinfo is not None:
            self.tarinfo = tarinfo
        if dereference is not None:
            self.dereference = dereference
        if ignore_zeros is not None:
            self.ignore_zeros = ignore_zeros
        if encoding is not None:
            self.encoding = encoding
        self.errors = errors

        if pax_headers is not None and self.format == PAX_FORMAT:
            self.pax_headers = pax_headers
        else:
            self.pax_headers = {}

        if debug is not None:
            self.debug = debug
        if errorlevel is not None:
            self.errorlevel = errorlevel

        # Init datastructures.
        self.closed = False
        self.members = []       # list of members as TarInfo objects
        self._loaded = False    # flag if all members have been read
        self.offset = self.fileobj.tell()
                                # current position in the archive file
        self.inodes = {}        # dictionary caching the inodes of
                                # archive members already added

        try:
            if self.mode == "r":
                self.firstmember = None
                self.firstmember = self.next()

            if self.mode == "a":
                # Move to the end of the archive,
                # before the first empty block.
                while True:
                    self.fileobj.seek(self.offset)
                    try:
                        tarinfo = self.tarinfo.fromtarfile(self)
                        self.members.append(tarinfo)
                    except EOFHeaderError:
                        self.fileobj.seek(self.offset)
                        break
                    except HeaderError as e:
                        raise ReadError(str(e))

            if self.mode in "aw":
                self._loaded = True

                if self.pax_headers:
                    buf = self.tarinfo.create_pax_global_header(self.pax_headers.copy())
                    self.fileobj.write(buf)
                    self.offset += len(buf)
        except:
            if not self._extfileobj:
                self.fileobj.close()
            self.closed = True
            raise

    #--------------------------------------------------------------------------
    # Below are the classmethods which act as alternate constructors to the
    # TarFile class. The open() method is the only one that is needed for
    # public use; it is the "super"-constructor and is able to select an
    # adequate "sub"-constructor for a particular compression using the mapping
    # from OPEN_METH.
    #
    # This concept allows one to subclass TarFile without losing the comfort of
    # the super-constructor. A sub-constructor is registered and made available
    # by adding it to the mapping in OPEN_METH.

    @classmethod
    def open(cls, name=None, mode="r", fileobj=None, bufsize=RECORDSIZE, **kwargs):
        """Open a tar archive for reading, writing or appending. Return
           an appropriate TarFile class.

           mode:
           'r' or 'r:*' open for reading with transparent compression
           'r:'         open for reading exclusively uncompressed
           'r:gz'       open for reading with gzip compression
           'r:bz2'      open for reading with bzip2 compression
           'a' or 'a:'  open for appending, creating the file if necessary
           'w' or 'w:'  open for writing without compression
           'w:gz'       open for writing with gzip compression
           'w:bz2'      open for writing with bzip2 compression

           'r|*'        open a stream of tar blocks with transparent compression
           'r|'         open an uncompressed stream of tar blocks for reading
           'r|gz'       open a gzip compressed stream of tar blocks
           'r|bz2'      open a bzip2 compressed stream of tar blocks
           'w|'         open an uncompressed stream for writing
           'w|gz'       open a gzip compressed stream for writing
           'w|bz2'      open a bzip2 compressed stream for writing
        """

        if not name and not fileobj:
            raise ValueError("nothing to open")

        if mode in ("r", "r:*"):
            # Find out which *open() is appropriate for opening the file.
            for comptype in cls.OPEN_METH:
                func = getattr(cls, cls.OPEN_METH[comptype])
                if fileobj is not None:
                    saved_pos = fileobj.tell()
                try:
                    return func(name, "r", fileobj, **kwargs)
                except (ReadError, CompressionError) as e:
                    if fileobj is not None:
                        fileobj.seek(saved_pos)
                    continue
            raise ReadError("file could not be opened successfully")

        elif ":" in mode:
            filemode, comptype = mode.split(":", 1)
            filemode = filemode or "r"
            comptype = comptype or "tar"

            # Select the *open() function according to
            # given compression.
            if comptype in cls.OPEN_METH:
                func = getattr(cls, cls.OPEN_METH[comptype])
            else:
                raise CompressionError("unknown compression type %r" % comptype)
            return func(name, filemode, fileobj, **kwargs)

        elif "|" in mode:
            filemode, comptype = mode.split("|", 1)
            filemode = filemode or "r"
            comptype = comptype or "tar"

            if filemode not in "rw":
                raise ValueError("mode must be 'r' or 'w'")

            stream = _Stream(name, filemode, comptype, fileobj, bufsize)
            try:
                t = cls(name, filemode, stream, **kwargs)
            except:
                stream.close()
                raise
            t._extfileobj = False
            return t

        elif mode in "aw":
            return cls.taropen(name, mode, fileobj, **kwargs)

        raise ValueError("undiscernible mode")

    @classmethod
    def taropen(cls, name, mode="r", fileobj=None, **kwargs):
        """Open uncompressed tar archive name for reading or writing.
        """
        if len(mode) > 1 or mode not in "raw":
            raise ValueError("mode must be 'r', 'a' or 'w'")
        return cls(name, mode, fileobj, **kwargs)

    @classmethod
    def gzopen(cls, name, mode="r", fileobj=None, compresslevel=9, **kwargs):
        """Open gzip compressed tar archive name for reading or writing.
           Appending is not allowed.
        """
        if len(mode) > 1 or mode not in "rw":
            raise ValueError("mode must be 'r' or 'w'")

        try:
            import gzip
            gzip.GzipFile
        except (ImportError, AttributeError):
            raise CompressionError("gzip module is not available")

        extfileobj = fileobj is not None
        try:
            fileobj = gzip.GzipFile(name, mode + "b", compresslevel, fileobj)
            t = cls.taropen(name, mode, fileobj, **kwargs)
        except IOError:
            if not extfileobj and fileobj is not None:
                fileobj.close()
            if fileobj is None:
                raise
            raise ReadError("not a gzip file")
        except:
            if not extfileobj and fileobj is not None:
                fileobj.close()
            raise
        t._extfileobj = extfileobj
        return t

    @classmethod
    def bz2open(cls, name, mode="r", fileobj=None, compresslevel=9, **kwargs):
        """Open bzip2 compressed tar archive name for reading or writing.
           Appending is not allowed.
        """
        if len(mode) > 1 or mode not in "rw":
            raise ValueError("mode must be 'r' or 'w'.")

        try:
            import bz2
        except ImportError:
            raise CompressionError("bz2 module is not available")

        if fileobj is not None:
            fileobj = _BZ2Proxy(fileobj, mode)
        else:
            fileobj = bz2.BZ2File(name, mode, compresslevel=compresslevel)

        try:
            t = cls.taropen(name, mode, fileobj, **kwargs)
        except (IOError, EOFError):
            fileobj.close()
            raise ReadError("not a bzip2 file")
        t._extfileobj = False
        return t

    # All *open() methods are registered here.
    OPEN_METH = {
        "tar": "taropen",   # uncompressed tar
        "gz":  "gzopen",    # gzip compressed tar
        "bz2": "bz2open"    # bzip2 compressed tar
    }

    #--------------------------------------------------------------------------
    # The public methods which TarFile provides:

    def close(self):
        """Close the TarFile. In write-mode, two finishing zero blocks are
           appended to the archive.
        """
        if self.closed:
            return

        if self.mode in "aw":
            self.fileobj.write(NUL * (BLOCKSIZE * 2))
            self.offset += (BLOCKSIZE * 2)
            # fill up the end with zero-blocks
            # (like option -b20 for tar does)
            blocks, remainder = divmod(self.offset, RECORDSIZE)
            if remainder > 0:
                self.fileobj.write(NUL * (RECORDSIZE - remainder))

        if not self._extfileobj:
            self.fileobj.close()
        self.closed = True

    def getmember(self, name):
        """Return a TarInfo object for member `name'. If `name' can not be
           found in the archive, KeyError is raised. If a member occurs more
           than once in the archive, its last occurrence is assumed to be the
           most up-to-date version.
        """
        tarinfo = self._getmember(name)
        if tarinfo is None:
            raise KeyError("filename %r not found" % name)
        return tarinfo

    def getmembers(self):
        """Return the members of the archive as a list of TarInfo objects. The
           list has the same order as the members in the archive.
        """
        self._check()
        if not self._loaded:    # if we want to obtain a list of
            self._load()        # all members, we first have to
                                # scan the whole archive.
        return self.members

    def getnames(self):
        """Return the members of the archive as a list of their names. It has
           the same order as the list returned by getmembers().
        """
        return [tarinfo.name for tarinfo in self.getmembers()]

    def gettarinfo(self, name=None, arcname=None, fileobj=None):
        """Create a TarInfo object for either the file `name' or the file
           object `fileobj' (using os.fstat on its file descriptor). You can
           modify some of the TarInfo's attributes before you add it using
           addfile(). If given, `arcname' specifies an alternative name for the
           file in the archive.
        """
        self._check("aw")

        # When fileobj is given, replace name by
        # fileobj's real name.
        if fileobj is not None:
            name = fileobj.name

        # Build the name of the member in the archive.
        # Backslashes are converted to forward slashes,
        # absolute paths are turned into relative paths.
        if arcname is None:
            arcname = name
        drv, arcname = os.path.splitdrive(arcname)
        arcname = arcname.replace(os.sep, "/")
        arcname = arcname.lstrip("/")

        # Now, fill the TarInfo object with
        # information specific for the file.
        tarinfo = self.tarinfo()
        tarinfo.tarfile = self

        # Use os.stat or os.lstat, depending on platform
        # and if symlinks shall be resolved.
        if fileobj is None:
            if hasattr(os, "lstat") and not self.dereference:
                statres = os.lstat(name)
            else:
                statres = os.stat(name)
        else:
            statres = os.fstat(fileobj.fileno())
        linkname = ""

        stmd = statres.st_mode
        if stat.S_ISREG(stmd):
            inode = (statres.st_ino, statres.st_dev)
            if not self.dereference and statres.st_nlink > 1 and \
                    inode in self.inodes and arcname != self.inodes[inode]:
                # Is it a hardlink to an already
                # archived file?
                type = LNKTYPE
                linkname = self.inodes[inode]
            else:
                # The inode is added only if it's valid.
                # For win32 it is always 0.
                type = REGTYPE
                if inode[0]:
                    self.inodes[inode] = arcname
        elif stat.S_ISDIR(stmd):
            type = DIRTYPE
        elif stat.S_ISFIFO(stmd):
            type = FIFOTYPE
        elif stat.S_ISLNK(stmd):
            type = SYMTYPE
            linkname = os.readlink(name)
        elif stat.S_ISCHR(stmd):
            type = CHRTYPE
        elif stat.S_ISBLK(stmd):
            type = BLKTYPE
        else:
            return None

        # Fill the TarInfo object with all
        # information we can get.
        tarinfo.name = arcname
        tarinfo.mode = stmd
        tarinfo.uid = statres.st_uid
        tarinfo.gid = statres.st_gid
        if type == REGTYPE:
            tarinfo.size = statres.st_size
        else:
            tarinfo.size = 0
        tarinfo.mtime = statres.st_mtime
        tarinfo.type = type
        tarinfo.linkname = linkname
        if pwd:
            try:
                tarinfo.uname = pwd.getpwuid(tarinfo.uid)[0]
            except KeyError:
                pass
        if grp:
            try:
                tarinfo.gname = grp.getgrgid(tarinfo.gid)[0]
            except KeyError:
                pass

        if type in (CHRTYPE, BLKTYPE):
            if hasattr(os, "major") and hasattr(os, "minor"):
                tarinfo.devmajor = os.major(statres.st_rdev)
                tarinfo.devminor = os.minor(statres.st_rdev)
        return tarinfo

    def list(self, verbose=True):
        """Print a table of contents to sys.stdout. If `verbose' is False, only
           the names of the members are printed. If it is True, an `ls -l'-like
           output is produced.
        """
        self._check()

        for tarinfo in self:
            if verbose:
                print(filemode(tarinfo.mode), end=' ')
                print("%s/%s" % (tarinfo.uname or tarinfo.uid,
                                 tarinfo.gname or tarinfo.gid), end=' ')
                if tarinfo.ischr() or tarinfo.isblk():
                    print("%10s" % ("%d,%d" \
                                    % (tarinfo.devmajor, tarinfo.devminor)), end=' ')
                else:
                    print("%10d" % tarinfo.size, end=' ')
                print("%d-%02d-%02d %02d:%02d:%02d" \
                      % time.localtime(tarinfo.mtime)[:6], end=' ')

            print(tarinfo.name + ("/" if tarinfo.isdir() else ""), end=' ')

            if verbose:
                if tarinfo.issym():
                    print("->", tarinfo.linkname, end=' ')
                if tarinfo.islnk():
                    print("link to", tarinfo.linkname, end=' ')
            print()

    def add(self, name, arcname=None, recursive=True, exclude=None, filter=None):
        """Add the file `name' to the archive. `name' may be any type of file
           (directory, fifo, symbolic link, etc.). If given, `arcname'
           specifies an alternative name for the file in the archive.
           Directories are added recursively by default. This can be avoided by
           setting `recursive' to False. `exclude' is a function that should
           return True for each filename to be excluded. `filter' is a function
           that expects a TarInfo object argument and returns the changed
           TarInfo object; if it returns None, the TarInfo object will be
           excluded from the archive.
        """
        self._check("aw")

        if arcname is None:
            arcname = name

        # Exclude pathnames.
        if exclude is not None:
            import warnings
            warnings.warn("use the filter argument instead",
                    DeprecationWarning, 2)
            if exclude(name):
                self._dbg(2, "tarfile: Excluded %r" % name)
                return

        # Skip if somebody tries to archive the archive...
        if self.name is not None and os.path.abspath(name) == self.name:
            self._dbg(2, "tarfile: Skipped %r" % name)
            return

        self._dbg(1, name)

        # Create a TarInfo object from the file.
        tarinfo = self.gettarinfo(name, arcname)

        if tarinfo is None:
            self._dbg(1, "tarfile: Unsupported type %r" % name)
            return

        # Change or exclude the TarInfo object.
        if filter is not None:
            tarinfo = filter(tarinfo)
            if tarinfo is None:
                self._dbg(2, "tarfile: Excluded %r" % name)
                return

        # Append the tar header and data to the archive.
        if tarinfo.isreg():
            f = bltn_open(name, "rb")
            self.addfile(tarinfo, f)
            f.close()

        elif tarinfo.isdir():
            self.addfile(tarinfo)
            if recursive:
                for f in os.listdir(name):
                    self.add(os.path.join(name, f), os.path.join(arcname, f),
                            recursive, exclude, filter=filter)

        else:
            self.addfile(tarinfo)

    def addfile(self, tarinfo, fileobj=None):
        """Add the TarInfo object `tarinfo' to the archive. If `fileobj' is
           given, tarinfo.size bytes are read from it and added to the archive.
           You can create TarInfo objects using gettarinfo().
           On Windows platforms, `fileobj' should always be opened with mode
           'rb' to avoid problems with the reported file size.
        """
        self._check("aw")

        tarinfo = copy.copy(tarinfo)

        buf = tarinfo.tobuf(self.format, self.encoding, self.errors)
        self.fileobj.write(buf)
        self.offset += len(buf)

        # If there's data to follow, append it.
        if fileobj is not None:
            copyfileobj(fileobj, self.fileobj, tarinfo.size)
            blocks, remainder = divmod(tarinfo.size, BLOCKSIZE)
            if remainder > 0:
                self.fileobj.write(NUL * (BLOCKSIZE - remainder))
                blocks += 1
            self.offset += blocks * BLOCKSIZE

        self.members.append(tarinfo)
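    # Writing sketch combining gettarinfo() and addfile() (paths are
    # hypothetical):
    #
    #     # tf = TarFile.open("out.tar", "w")
    #     # tf.add("data")                    # recurses into directories
    #     # ti = tf.gettarinfo("notes.txt", arcname="docs/notes.txt")
    #     # with open("notes.txt", "rb") as f:
    #     #     tf.addfile(ti, f)
    #     # tf.close()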

    def extractall(self, path=".", members=None):
        """Extract all members from the archive to the current working
           directory and set owner, modification time and permissions on
           directories afterwards. `path' specifies a different directory
           to extract to. `members' is optional and must be a subset of the
           list returned by getmembers().
        """
        directories = []

        if members is None:
            members = self

        for tarinfo in members:
            if tarinfo.isdir():
                # Extract directories with a safe mode.
                directories.append(tarinfo)
                tarinfo = copy.copy(tarinfo)
                tarinfo.mode = 0o700
            # Do not set attributes on directories here; that is done further down.
            self.extract(tarinfo, path, set_attrs=not tarinfo.isdir())

        # Reverse sort directories.
        directories.sort(key=lambda a: a.name)
        directories.reverse()

        # Set correct owner, mtime and filemode on directories.
        for tarinfo in directories:
            dirpath = os.path.join(path, tarinfo.name)
            try:
                self.chown(tarinfo, dirpath)
                self.utime(tarinfo, dirpath)
                self.chmod(tarinfo, dirpath)
            except ExtractError as e:
                if self.errorlevel > 1:
                    raise
                else:
                    self._dbg(1, "tarfile: %s" % e)

    def extract(self, member, path="", set_attrs=True):
        """Extract a member from the archive to the current working directory,
           using its full name. Its file information is extracted as accurately
           as possible. `member' may be a filename or a TarInfo object. You can
           specify a different directory using `path'. File attributes (owner,
           mtime, mode) are set unless `set_attrs' is False.
        """
        self._check("r")

        if isinstance(member, str):
            tarinfo = self.getmember(member)
        else:
            tarinfo = member

        # Prepare the link target for makelink().
        if tarinfo.islnk():
            tarinfo._link_target = os.path.join(path, tarinfo.linkname)

        try:
            self._extract_member(tarinfo, os.path.join(path, tarinfo.name),
                                 set_attrs=set_attrs)
        except EnvironmentError as e:
            if self.errorlevel > 0:
                raise
            else:
                if e.filename is None:
                    self._dbg(1, "tarfile: %s" % e.strerror)
                else:
                    self._dbg(1, "tarfile: %s %r" % (e.strerror, e.filename))
        except ExtractError as e:
            if self.errorlevel > 1:
                raise
            else:
                self._dbg(1, "tarfile: %s" % e)

    def extractfile(self, member):
        """Extract a member from the archive as a file object. `member' may be
           a filename or a TarInfo object. If `member' is a regular file, a
           file-like object is returned. If `member' is a link, a file-like
           object is constructed from the link's target. If `member' is none of
           the above, None is returned.
           The file-like object is read-only and provides the following
           methods: read(), readline(), readlines(), seek() and tell()
        """
        self._check("r")

        if isinstance(member, str):
            tarinfo = self.getmember(member)
        else:
            tarinfo = member

        if tarinfo.isreg():
            return self.fileobject(self, tarinfo)

        elif tarinfo.type not in SUPPORTED_TYPES:
            # If a member's type is unknown, it is treated as a
            # regular file.
            return self.fileobject(self, tarinfo)

        elif tarinfo.islnk() or tarinfo.issym():
            if isinstance(self.fileobj, _Stream):
                # A small but ugly workaround for the case that someone tries
                # to extract a (sym)link as a file-object from a non-seekable
                # stream of tar blocks.
                raise StreamError("cannot extract (sym)link as file object")
            else:
                # A (sym)link's file object is its target's file object.
                return self.extractfile(self._find_link_target(tarinfo))
        else:
            # If there's no data associated with the member (directory, chrdev,
            # blkdev, etc.), return None instead of a file object.
            return None
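    # Reading sketch combining extractall(), getmember() and extractfile()
    # (paths are hypothetical):
    #
    #     # tf = TarFile.open("backup.tar.gz")
    #     # tf.extractall(path="/tmp/restore")
    #     # data = tf.extractfile(tf.getmember("docs/notes.txt")).read()
    #     # tf.close()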

    def _extract_member(self, tarinfo, targetpath, set_attrs=True):
        """Extract the TarInfo object tarinfo to a physical
           file called targetpath.
        """
        # Fetch the TarInfo object for the given name
        # and build the destination pathname, replacing
        # forward slashes with platform-specific separators.
        targetpath = targetpath.rstrip("/")
        targetpath = targetpath.replace("/", os.sep)

        # Create all upper directories.
        upperdirs = os.path.dirname(targetpath)
        if upperdirs and not os.path.exists(upperdirs):
            # Create directories that are not part of the archive with
            # default permissions.
            os.makedirs(upperdirs)

        if tarinfo.islnk() or tarinfo.issym():
            self._dbg(1, "%s -> %s" % (tarinfo.name, tarinfo.linkname))
        else:
            self._dbg(1, tarinfo.name)

        if tarinfo.isreg():
            self.makefile(tarinfo, targetpath)
        elif tarinfo.isdir():
            self.makedir(tarinfo, targetpath)
        elif tarinfo.isfifo():
            self.makefifo(tarinfo, targetpath)
        elif tarinfo.ischr() or tarinfo.isblk():
            self.makedev(tarinfo, targetpath)
        elif tarinfo.islnk() or tarinfo.issym():
            self.makelink(tarinfo, targetpath)
        elif tarinfo.type not in SUPPORTED_TYPES:
            self.makeunknown(tarinfo, targetpath)
        else:
            self.makefile(tarinfo, targetpath)

        if set_attrs:
            self.chown(tarinfo, targetpath)
            if not tarinfo.issym():
                self.chmod(tarinfo, targetpath)
                self.utime(tarinfo, targetpath)

    #--------------------------------------------------------------------------
    # Below are the different file methods. They are called via
    # _extract_member() when extract() is called. They can be replaced in a
    # subclass to implement other functionality.

    def makedir(self, tarinfo, targetpath):
        """Make a directory called targetpath.
        """
        try:
            # Use a safe mode for the directory, the real mode is set
            # later in _extract_member().
            os.mkdir(targetpath, 0o700)
        except EnvironmentError as e:
            if e.errno != errno.EEXIST:
                raise

    def makefile(self, tarinfo, targetpath):
        """Make a file called targetpath.
        """
        source = self.fileobj
        source.seek(tarinfo.offset_data)
        target = bltn_open(targetpath, "wb")
        if tarinfo.sparse is not None:
            for offset, size in tarinfo.sparse:
                target.seek(offset)
                copyfileobj(source, target, size)
        else:
            copyfileobj(source, target, tarinfo.size)
        target.seek(tarinfo.size)
        target.truncate()
        target.close()

    def makeunknown(self, tarinfo, targetpath):
        """Make a file from a TarInfo object with an unknown type
           at targetpath.
        """
        self.makefile(tarinfo, targetpath)
        self._dbg(1, "tarfile: Unknown file type %r, " \
                     "extracted as regular file." % tarinfo.type)

    def makefifo(self, tarinfo, targetpath):
        """Make a fifo called targetpath.
        """
        if hasattr(os, "mkfifo"):
            os.mkfifo(targetpath)
        else:
            raise ExtractError("fifo not supported by system")

    def makedev(self, tarinfo, targetpath):
        """Make a character or block device called targetpath.
        """
        if not hasattr(os, "mknod") or not hasattr(os, "makedev"):
            raise ExtractError("special devices not supported by system")

        mode = tarinfo.mode
        if tarinfo.isblk():
            mode |= stat.S_IFBLK
        else:
            mode |= stat.S_IFCHR

        os.mknod(targetpath, mode,
                 os.makedev(tarinfo.devmajor, tarinfo.devminor))

    def makelink(self, tarinfo, targetpath):
        """Make a (symbolic) link called targetpath. If it cannot be created
          (platform limitation), we try to make a copy of the referenced file
          instead of a link.
        """
        try:
            # For systems that support symbolic and hard links.
            if tarinfo.issym():
                os.symlink(tarinfo.linkname, targetpath)
            else:
                # See extract().
                if os.path.exists(tarinfo._link_target):
                    os.link(tarinfo._link_target, targetpath)
                else:
                    self._extract_member(self._find_link_target(tarinfo),
                                         targetpath)
        except symlink_exception:
            # The platform cannot create the (sym)link; fall back to
            # extracting a copy of the referenced file, as documented above.
            try:
                self._extract_member(self._find_link_target(tarinfo),
                                     targetpath)
            except KeyError:
                raise ExtractError("unable to resolve link inside archive")

    def chown(self, tarinfo, targetpath):
        """Set owner of targetpath according to tarinfo.
        """
        if pwd and hasattr(os, "geteuid") and os.geteuid() == 0:
            # We have to be root to do so.
            try:
                g = grp.getgrnam(tarinfo.gname)[2]
            except KeyError:
                g = tarinfo.gid
            try:
                u = pwd.getpwnam(tarinfo.uname)[2]
            except KeyError:
                u = tarinfo.uid
            try:
                if tarinfo.issym() and hasattr(os, "lchown"):
                    os.lchown(targetpath, u, g)
                else:
                    if sys.platform != "os2emx":
                        os.chown(targetpath, u, g)
            except EnvironmentError as e:
                raise ExtractError("could not change owner")

    def chmod(self, tarinfo, targetpath):
        """Set file permissions of targetpath according to tarinfo.
        """
        if hasattr(os, 'chmod'):
            try:
                os.chmod(targetpath, tarinfo.mode)
            except EnvironmentError as e:
                raise ExtractError("could not change mode")

    def utime(self, tarinfo, targetpath):
        """Set modification time of targetpath according to tarinfo.
        """
        if not hasattr(os, 'utime'):
            return
        try:
            os.utime(targetpath, (tarinfo.mtime, tarinfo.mtime))
        except EnvironmentError as e:
            raise ExtractError("could not change modification time")

    #--------------------------------------------------------------------------
    def next(self):
        """Return the next member of the archive as a TarInfo object, when
           TarFile is opened for reading. Return None if there is no more
           available.
        """
        self._check("ra")
        if self.firstmember is not None:
            m = self.firstmember
            self.firstmember = None
            return m

        # Read the next block.
        self.fileobj.seek(self.offset)
        tarinfo = None
        while True:
            try:
                tarinfo = self.tarinfo.fromtarfile(self)
            except EOFHeaderError as e:
                if self.ignore_zeros:
                    self._dbg(2, "0x%X: %s" % (self.offset, e))
                    self.offset += BLOCKSIZE
                    continue
            except InvalidHeaderError as e:
                if self.ignore_zeros:
                    self._dbg(2, "0x%X: %s" % (self.offset, e))
                    self.offset += BLOCKSIZE
                    continue
                elif self.offset == 0:
                    raise ReadError(str(e))
            except EmptyHeaderError:
                if self.offset == 0:
                    raise ReadError("empty file")
            except TruncatedHeaderError as e:
                if self.offset == 0:
                    raise ReadError(str(e))
            except SubsequentHeaderError as e:
                raise ReadError(str(e))
            break

        if tarinfo is not None:
            self.members.append(tarinfo)
        else:
            self._loaded = True

        return tarinfo

    #--------------------------------------------------------------------------
    # Little helper methods:

    def _getmember(self, name, tarinfo=None, normalize=False):
        """Find an archive member by name from bottom to top.
           If tarinfo is given, it is used as the starting point.
        """
        # Ensure that all members have been loaded.
        members = self.getmembers()

        # Limit the member search list up to tarinfo.
        if tarinfo is not None:
            members = members[:members.index(tarinfo)]

        if normalize:
            name = os.path.normpath(name)

        for member in reversed(members):
            if normalize:
                member_name = os.path.normpath(member.name)
            else:
                member_name = member.name

            if name == member_name:
                return member

    def _load(self):
        """Read through the entire archive file and look for readable
           members.
        """
        while True:
            tarinfo = self.next()
            if tarinfo is None:
                break
        self._loaded = True

    def _check(self, mode=None):
        """Check if TarFile is still open, and if the operation's mode
           corresponds to TarFile's mode.
        """
        if self.closed:
            raise IOError("%s is closed" % self.__class__.__name__)
        if mode is not None and self.mode not in mode:
            raise IOError("bad operation for mode %r" % self.mode)

    def _find_link_target(self, tarinfo):
        """Find the target member of a symlink or hardlink member in the
           archive.
        """
        if tarinfo.issym():
            # Always search the entire archive.
            linkname = os.path.dirname(tarinfo.name) + "/" + tarinfo.linkname
            limit = None
        else:
            # Search the archive before the link, because a hard link is
            # just a reference to an already archived file.
            linkname = tarinfo.linkname
            limit = tarinfo

        member = self._getmember(linkname, tarinfo=limit, normalize=True)
        if member is None:
            raise KeyError("linkname %r not found" % linkname)
        return member

    def __iter__(self):
        """Provide an iterator object.
        """
        if self._loaded:
            return iter(self.members)
        else:
            return TarIter(self)

    def _dbg(self, level, msg):
        """Write debugging output to sys.stderr.
        """
        if level <= self.debug:
            print(msg, file=sys.stderr)

    def __enter__(self):
        self._check()
        return self

    def __exit__(self, type, value, traceback):
        if type is None:
            self.close()
        else:
            # An exception occurred. We must not call close() because
            # it would try to write end-of-archive blocks and padding.
            if not self._extfileobj:
                self.fileobj.close()
            self.closed = True
# class TarFile

class TarIter(object):
    """Iterator Class.

       for tarinfo in TarFile(...):
           suite...
    """

    def __init__(self, tarfile):
        """Construct a TarIter object.
        """
        self.tarfile = tarfile
        self.index = 0

    def __iter__(self):
        """Return iterator object.
        """
        return self

    def __next__(self):
        """Return the next item using TarFile's next() method.
           When all members have been read, set TarFile as _loaded.
        """
        # Fix for SF #1100429: Under rare circumstances it can
        # happen that getmembers() is called during iteration,
        # which will cause TarIter to stop prematurely.
        if not self.tarfile._loaded:
            tarinfo = self.tarfile.next()
            if not tarinfo:
                self.tarfile._loaded = True
                raise StopIteration
        else:
            try:
                tarinfo = self.tarfile.members[self.index]
            except IndexError:
                raise StopIteration
        self.index += 1
        return tarinfo

    next = __next__ # for Python 2.x
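
# Illustrative sketch (not part of the original module): iterating over a
# TarFile yields TarInfo objects lazily through TarIter until the archive has
# been fully scanned.  "example.tar" is a hypothetical archive name.
def _example_iterate(archive="example.tar"):
    with TarFile.open(archive) as tf:
        for tarinfo in tf:
            print("%s: %d bytes" % (tarinfo.name, tarinfo.size))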

#--------------------
# exported functions
#--------------------
def is_tarfile(name):
    """Return True if name points to a tar archive that we
       are able to handle, else return False.
    """
    try:
        t = open(name)
        t.close()
        return True
    except TarError:
        return False

bltn_open = open
open = TarFile.open
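
# Illustrative usage sketch (not part of the original module): check a path
# with is_tarfile() before opening it.  "example.tar" and "dest" are
# hypothetical names; the module-level open() is TarFile.open at this point.
def _example_extract_all(archive="example.tar", dest="dest"):
    if is_tarfile(archive):
        with open(archive) as tf:
            for member in tf.getmembers():
                tf.extract(member, path=dest)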

site-packages/pip/_vendor/distlib/_backport/sysconfig.py
# -*- coding: utf-8 -*-
#
# Copyright (C) 2012 The Python Software Foundation.
# See LICENSE.txt and CONTRIBUTORS.txt.
#
"""Access to Python's configuration information."""

import codecs
import os
import re
import sys
from os.path import pardir, realpath
try:
    import configparser
except ImportError:
    import ConfigParser as configparser


__all__ = [
    'get_config_h_filename',
    'get_config_var',
    'get_config_vars',
    'get_makefile_filename',
    'get_path',
    'get_path_names',
    'get_paths',
    'get_platform',
    'get_python_version',
    'get_scheme_names',
    'parse_config_h',
]


def _safe_realpath(path):
    try:
        return realpath(path)
    except OSError:
        return path


if sys.executable:
    _PROJECT_BASE = os.path.dirname(_safe_realpath(sys.executable))
else:
    # sys.executable can be empty if argv[0] has been changed and Python is
    # unable to retrieve the real program name
    _PROJECT_BASE = _safe_realpath(os.getcwd())

if os.name == "nt" and "pcbuild" in _PROJECT_BASE[-8:].lower():
    _PROJECT_BASE = _safe_realpath(os.path.join(_PROJECT_BASE, pardir))
# PC/VS7.1
if os.name == "nt" and "\\pc\\v" in _PROJECT_BASE[-10:].lower():
    _PROJECT_BASE = _safe_realpath(os.path.join(_PROJECT_BASE, pardir, pardir))
# PC/AMD64
if os.name == "nt" and "\\pcbuild\\amd64" in _PROJECT_BASE[-14:].lower():
    _PROJECT_BASE = _safe_realpath(os.path.join(_PROJECT_BASE, pardir, pardir))


def is_python_build():
    for fn in ("Setup.dist", "Setup.local"):
        if os.path.isfile(os.path.join(_PROJECT_BASE, "Modules", fn)):
            return True
    return False

_PYTHON_BUILD = is_python_build()

_cfg_read = False

def _ensure_cfg_read():
    global _cfg_read
    if not _cfg_read:
        from ..resources import finder
        backport_package = __name__.rsplit('.', 1)[0]
        _finder = finder(backport_package)
        _cfgfile = _finder.find('sysconfig.cfg')
        assert _cfgfile, 'sysconfig.cfg exists'
        with _cfgfile.as_stream() as s:
            _SCHEMES.readfp(s)
        if _PYTHON_BUILD:
            for scheme in ('posix_prefix', 'posix_home'):
                _SCHEMES.set(scheme, 'include', '{srcdir}/Include')
                _SCHEMES.set(scheme, 'platinclude', '{projectbase}/.')

        _cfg_read = True


_SCHEMES = configparser.RawConfigParser()
_VAR_REPL = re.compile(r'\{([^{]*?)\}')

def _expand_globals(config):
    _ensure_cfg_read()
    if config.has_section('globals'):
        globals = config.items('globals')
    else:
        globals = tuple()

    sections = config.sections()
    for section in sections:
        if section == 'globals':
            continue
        for option, value in globals:
            if config.has_option(section, option):
                continue
            config.set(section, option, value)
    config.remove_section('globals')

    # now expanding local variables defined in the cfg file
    #
    for section in config.sections():
        variables = dict(config.items(section))

        def _replacer(matchobj):
            name = matchobj.group(1)
            if name in variables:
                return variables[name]
            return matchobj.group(0)

        for option, value in config.items(section):
            config.set(section, option, _VAR_REPL.sub(_replacer, value))

#_expand_globals(_SCHEMES)

# FIXME don't rely on sys.version here, its format is an implementation detail
# of CPython, use sys.version_info or sys.hexversion
_PY_VERSION = sys.version.split()[0]
_PY_VERSION_SHORT = sys.version[:3]
_PY_VERSION_SHORT_NO_DOT = _PY_VERSION[0] + _PY_VERSION[2]
_PREFIX = os.path.normpath(sys.prefix)
_EXEC_PREFIX = os.path.normpath(sys.exec_prefix)
_CONFIG_VARS = None
_USER_BASE = None


def _subst_vars(path, local_vars):
    """In the string `path`, replace tokens like {some.thing} with the
    corresponding value from the map `local_vars`.

    If there is no corresponding value, leave the token unchanged.
    """
    def _replacer(matchobj):
        name = matchobj.group(1)
        if name in local_vars:
            return local_vars[name]
        elif name in os.environ:
            return os.environ[name]
        return matchobj.group(0)
    return _VAR_REPL.sub(_replacer, path)
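
# Illustrative sketch (not part of the original module): _subst_vars() expands
# {name} tokens from the supplied mapping, falls back to os.environ, and
# leaves unknown tokens untouched.  The values below are hypothetical.
def _example_subst_vars():
    local_vars = {'base': '/usr/local'}
    # Returns '/usr/local/lib/{unknown}', assuming 'unknown' is not in os.environ.
    return _subst_vars('{base}/lib/{unknown}', local_vars)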


def _extend_dict(target_dict, other_dict):
    target_keys = target_dict.keys()
    for key, value in other_dict.items():
        if key in target_keys:
            continue
        target_dict[key] = value


def _expand_vars(scheme, vars):
    res = {}
    if vars is None:
        vars = {}
    _extend_dict(vars, get_config_vars())

    for key, value in _SCHEMES.items(scheme):
        if os.name in ('posix', 'nt'):
            value = os.path.expanduser(value)
        res[key] = os.path.normpath(_subst_vars(value, vars))
    return res


def format_value(value, vars):
    def _replacer(matchobj):
        name = matchobj.group(1)
        if name in vars:
            return vars[name]
        return matchobj.group(0)
    return _VAR_REPL.sub(_replacer, value)


def _get_default_scheme():
    if os.name == 'posix':
        # the default scheme for posix is posix_prefix
        return 'posix_prefix'
    return os.name


def _getuserbase():
    env_base = os.environ.get("PYTHONUSERBASE", None)

    def joinuser(*args):
        return os.path.expanduser(os.path.join(*args))

    # what about 'os2emx', 'riscos' ?
    if os.name == "nt":
        base = os.environ.get("APPDATA") or "~"
        if env_base:
            return env_base
        else:
            return joinuser(base, "Python")

    if sys.platform == "darwin":
        framework = get_config_var("PYTHONFRAMEWORK")
        if framework:
            if env_base:
                return env_base
            else:
                return joinuser("~", "Library", framework, "%d.%d" %
                                sys.version_info[:2])

    if env_base:
        return env_base
    else:
        return joinuser("~", ".local")


def _parse_makefile(filename, vars=None):
    """Parse a Makefile-style file.

    A dictionary containing name/value pairs is returned.  If an
    optional dictionary is passed in as the second argument, it is
    used instead of a new dictionary.
    """
    # Regexes needed for parsing Makefile (and similar syntaxes,
    # like old-style Setup files).
    _variable_rx = re.compile(r"([a-zA-Z][a-zA-Z0-9_]+)\s*=\s*(.*)")
    _findvar1_rx = re.compile(r"\$\(([A-Za-z][A-Za-z0-9_]*)\)")
    _findvar2_rx = re.compile(r"\${([A-Za-z][A-Za-z0-9_]*)}")

    if vars is None:
        vars = {}
    done = {}
    notdone = {}

    with codecs.open(filename, encoding='utf-8', errors="surrogateescape") as f:
        lines = f.readlines()

    for line in lines:
        if line.startswith('#') or line.strip() == '':
            continue
        m = _variable_rx.match(line)
        if m:
            n, v = m.group(1, 2)
            v = v.strip()
            # `$$' is a literal `$' in make
            tmpv = v.replace('$$', '')

            if "$" in tmpv:
                notdone[n] = v
            else:
                try:
                    v = int(v)
                except ValueError:
                    # insert literal `$'
                    done[n] = v.replace('$$', '$')
                else:
                    done[n] = v

    # do variable interpolation here
    variables = list(notdone.keys())

    # Variables with a 'PY_' prefix in the makefile. These need to
    # be made available without that prefix through sysconfig.
    # Special care is needed to ensure that variable expansion works, even
    # if the expansion uses the name without a prefix.
    renamed_variables = ('CFLAGS', 'LDFLAGS', 'CPPFLAGS')

    while len(variables) > 0:
        for name in tuple(variables):
            value = notdone[name]
            m = _findvar1_rx.search(value) or _findvar2_rx.search(value)
            if m is not None:
                n = m.group(1)
                found = True
                if n in done:
                    item = str(done[n])
                elif n in notdone:
                    # get it on a subsequent round
                    found = False
                elif n in os.environ:
                    # do it like make: fall back to environment
                    item = os.environ[n]

                elif n in renamed_variables:
                    if (name.startswith('PY_') and
                        name[3:] in renamed_variables):
                        item = ""

                    elif 'PY_' + n in notdone:
                        found = False

                    else:
                        item = str(done['PY_' + n])

                else:
                    done[n] = item = ""

                if found:
                    after = value[m.end():]
                    value = value[:m.start()] + item + after
                    if "$" in after:
                        notdone[name] = value
                    else:
                        try:
                            value = int(value)
                        except ValueError:
                            done[name] = value.strip()
                        else:
                            done[name] = value
                        variables.remove(name)

                        if (name.startswith('PY_') and
                            name[3:] in renamed_variables):

                            name = name[3:]
                            if name not in done:
                                done[name] = value

            else:
                # bogus variable reference (e.g. "prefix=$/opt/python");
                # just drop it since we can't deal
                done[name] = value
                variables.remove(name)

    # strip spurious spaces
    for k, v in done.items():
        if isinstance(v, str):
            done[k] = v.strip()

    # save the results in the global dictionary
    vars.update(done)
    return vars
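
# Illustrative sketch (not part of the original module): _parse_makefile()
# returns a dict of Makefile variables with $(VAR)/${VAR} references expanded.
# The call below reads the interpreter's own Makefile via get_makefile_filename()
# and so only works on POSIX-style installations.
def _example_parse_makefile():
    makefile_vars = _parse_makefile(get_makefile_filename())
    return makefile_vars.get('CC'), makefile_vars.get('CFLAGS')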


def get_makefile_filename():
    """Return the path of the Makefile."""
    if _PYTHON_BUILD:
        return os.path.join(_PROJECT_BASE, "Makefile")
    if hasattr(sys, 'abiflags'):
        config_dir_name = 'config-%s%s' % (_PY_VERSION_SHORT, sys.abiflags)
    else:
        config_dir_name = 'config'
    return os.path.join(get_path('stdlib'), config_dir_name, 'Makefile')


def _init_posix(vars):
    """Initialize the module as appropriate for POSIX systems."""
    # load the installed Makefile:
    makefile = get_makefile_filename()
    try:
        _parse_makefile(makefile, vars)
    except IOError as e:
        msg = "invalid Python installation: unable to open %s" % makefile
        if hasattr(e, "strerror"):
            msg = msg + " (%s)" % e.strerror
        raise IOError(msg)
    # load the installed pyconfig.h:
    config_h = get_config_h_filename()
    try:
        with open(config_h) as f:
            parse_config_h(f, vars)
    except IOError as e:
        msg = "invalid Python installation: unable to open %s" % config_h
        if hasattr(e, "strerror"):
            msg = msg + " (%s)" % e.strerror
        raise IOError(msg)
    # On AIX, there are wrong paths to the linker scripts in the Makefile
    # -- these paths are relative to the Python source, but when installed
    # the scripts are in another directory.
    if _PYTHON_BUILD:
        vars['LDSHARED'] = vars['BLDSHARED']


def _init_non_posix(vars):
    """Initialize the module as appropriate for NT"""
    # set basic install directories
    vars['LIBDEST'] = get_path('stdlib')
    vars['BINLIBDEST'] = get_path('platstdlib')
    vars['INCLUDEPY'] = get_path('include')
    vars['SO'] = '.pyd'
    vars['EXE'] = '.exe'
    vars['VERSION'] = _PY_VERSION_SHORT_NO_DOT
    vars['BINDIR'] = os.path.dirname(_safe_realpath(sys.executable))

#
# public APIs
#


def parse_config_h(fp, vars=None):
    """Parse a config.h-style file.

    A dictionary containing name/value pairs is returned.  If an
    optional dictionary is passed in as the second argument, it is
    used instead of a new dictionary.
    """
    if vars is None:
        vars = {}
    define_rx = re.compile("#define ([A-Z][A-Za-z0-9_]+) (.*)\n")
    undef_rx = re.compile("/[*] #undef ([A-Z][A-Za-z0-9_]+) [*]/\n")

    while True:
        line = fp.readline()
        if not line:
            break
        m = define_rx.match(line)
        if m:
            n, v = m.group(1, 2)
            try:
                v = int(v)
            except ValueError:
                pass
            vars[n] = v
        else:
            m = undef_rx.match(line)
            if m:
                vars[m.group(1)] = 0
    return vars
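
# Illustrative sketch (not part of the original module): parse_config_h() maps
# "#define NAME value" lines to a dict, converting integer values where
# possible and recording commented-out "#undef NAME" lines as 0.
# 'SIZEOF_LONG' is just a typical pyconfig.h entry used as an example.
def _example_parse_config_h():
    with open(get_config_h_filename()) as f:
        config = parse_config_h(f)
    return config.get('SIZEOF_LONG')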


def get_config_h_filename():
    """Return the path of pyconfig.h."""
    if _PYTHON_BUILD:
        if os.name == "nt":
            inc_dir = os.path.join(_PROJECT_BASE, "PC")
        else:
            inc_dir = _PROJECT_BASE
    else:
        inc_dir = get_path('platinclude')
    return os.path.join(inc_dir, 'pyconfig.h')


def get_scheme_names():
    """Return a tuple containing the schemes names."""
    return tuple(sorted(_SCHEMES.sections()))


def get_path_names():
    """Return a tuple containing the paths names."""
    # xxx see if we want a static list
    return _SCHEMES.options('posix_prefix')


def get_paths(scheme=_get_default_scheme(), vars=None, expand=True):
    """Return a mapping containing an install scheme.

    ``scheme`` is the install scheme name. If not provided, it will
    return the default scheme for the current platform.
    """
    _ensure_cfg_read()
    if expand:
        return _expand_vars(scheme, vars)
    else:
        return dict(_SCHEMES.items(scheme))


def get_path(name, scheme=_get_default_scheme(), vars=None, expand=True):
    """Return a path corresponding to the scheme.

    ``scheme`` is the install scheme name.
    """
    return get_paths(scheme, vars, expand)[name]
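
# Illustrative sketch (not part of the original module): get_paths() returns a
# whole install scheme as a mapping, while get_path() looks up a single entry
# from the same scheme.
def _example_paths():
    scheme = _get_default_scheme()
    all_paths = get_paths(scheme)    # e.g. {'stdlib': ..., 'purelib': ..., ...}
    return all_paths['purelib'] == get_path('purelib', scheme)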


def get_config_vars(*args):
    """With no arguments, return a dictionary of all configuration
    variables relevant for the current platform.

    On Unix, this means every variable defined in Python's installed Makefile;
    On Windows and Mac OS it's a much smaller set.

    With arguments, return a list of values that result from looking up
    each argument in the configuration variable dictionary.
    """
    global _CONFIG_VARS
    if _CONFIG_VARS is None:
        _CONFIG_VARS = {}
        # Normalized versions of prefix and exec_prefix are handy to have;
        # in fact, these are the standard versions used most places in the
        # distutils2 module.
        _CONFIG_VARS['prefix'] = _PREFIX
        _CONFIG_VARS['exec_prefix'] = _EXEC_PREFIX
        _CONFIG_VARS['py_version'] = _PY_VERSION
        _CONFIG_VARS['py_version_short'] = _PY_VERSION_SHORT
        _CONFIG_VARS['py_version_nodot'] = _PY_VERSION[0] + _PY_VERSION[2]
        _CONFIG_VARS['base'] = _PREFIX
        _CONFIG_VARS['platbase'] = _EXEC_PREFIX
        _CONFIG_VARS['projectbase'] = _PROJECT_BASE
        try:
            _CONFIG_VARS['abiflags'] = sys.abiflags
        except AttributeError:
            # sys.abiflags may not be defined on all platforms.
            _CONFIG_VARS['abiflags'] = ''

        if os.name in ('nt', 'os2'):
            _init_non_posix(_CONFIG_VARS)
        if os.name == 'posix':
            _init_posix(_CONFIG_VARS)
        # Setting 'userbase' is done below the call to the
        # init function to enable using 'get_config_var' in
        # the init-function.
        if sys.version >= '2.6':
            _CONFIG_VARS['userbase'] = _getuserbase()

        if 'srcdir' not in _CONFIG_VARS:
            _CONFIG_VARS['srcdir'] = _PROJECT_BASE
        else:
            _CONFIG_VARS['srcdir'] = _safe_realpath(_CONFIG_VARS['srcdir'])

        # Convert srcdir into an absolute path if it appears necessary.
        # Normally it is relative to the build directory.  However, during
        # testing, for example, we might be running a non-installed python
        # from a different directory.
        if _PYTHON_BUILD and os.name == "posix":
            base = _PROJECT_BASE
            try:
                cwd = os.getcwd()
            except OSError:
                cwd = None
            if (not os.path.isabs(_CONFIG_VARS['srcdir']) and
                base != cwd):
                # srcdir is relative and we are not in the same directory
                # as the executable. Assume executable is in the build
                # directory and make srcdir absolute.
                srcdir = os.path.join(base, _CONFIG_VARS['srcdir'])
                _CONFIG_VARS['srcdir'] = os.path.normpath(srcdir)

        if sys.platform == 'darwin':
            kernel_version = os.uname()[2]  # Kernel version (8.4.3)
            major_version = int(kernel_version.split('.')[0])

            if major_version < 8:
                # On macOS before 10.4, check if -arch and -isysroot
                # are in CFLAGS or LDFLAGS and remove them if they are.
                # This is needed when building extensions on a 10.3 system
                # using a universal build of python.
                for key in ('LDFLAGS', 'BASECFLAGS',
                        # a number of derived variables. These need to be
                        # patched up as well.
                        'CFLAGS', 'PY_CFLAGS', 'BLDSHARED'):
                    flags = _CONFIG_VARS[key]
                    flags = re.sub(r'-arch\s+\w+\s', ' ', flags)
                    flags = re.sub(r'-isysroot [^ \t]*', ' ', flags)
                    _CONFIG_VARS[key] = flags
            else:
                # Allow the user to override the architecture flags using
                # an environment variable.
                # NOTE: This name was introduced by Apple in OSX 10.5 and
                # is used by several scripting languages distributed with
                # that OS release.
                if 'ARCHFLAGS' in os.environ:
                    arch = os.environ['ARCHFLAGS']
                    for key in ('LDFLAGS', 'BASECFLAGS',
                        # a number of derived variables. These need to be
                        # patched up as well.
                        'CFLAGS', 'PY_CFLAGS', 'BLDSHARED'):

                        flags = _CONFIG_VARS[key]
                        flags = re.sub(r'-arch\s+\w+\s', ' ', flags)
                        flags = flags + ' ' + arch
                        _CONFIG_VARS[key] = flags

                # If we're on OSX 10.5 or later and the user tries to
                # compile an extension using an SDK that is not present
                # on the current machine it is better to not use an SDK
                # than to fail.
                #
                # The major usecase for this is users using a Python.org
                # binary installer  on OSX 10.6: that installer uses
                # the 10.4u SDK, but that SDK is not installed by default
                # when you install Xcode.
                #
                CFLAGS = _CONFIG_VARS.get('CFLAGS', '')
                m = re.search(r'-isysroot\s+(\S+)', CFLAGS)
                if m is not None:
                    sdk = m.group(1)
                    if not os.path.exists(sdk):
                        for key in ('LDFLAGS', 'BASECFLAGS',
                             # a number of derived variables. These need to be
                             # patched up as well.
                            'CFLAGS', 'PY_CFLAGS', 'BLDSHARED'):

                            flags = _CONFIG_VARS[key]
                            flags = re.sub(r'-isysroot\s+\S+(\s|$)', ' ', flags)
                            _CONFIG_VARS[key] = flags

    if args:
        vals = []
        for name in args:
            vals.append(_CONFIG_VARS.get(name))
        return vals
    else:
        return _CONFIG_VARS


def get_config_var(name):
    """Return the value of a single variable using the dictionary returned by
    'get_config_vars()'.

    Equivalent to get_config_vars().get(name)
    """
    return get_config_vars().get(name)
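
# Illustrative sketch (not part of the original module): get_config_var(name)
# is shorthand for a single lookup in the get_config_vars() dictionary, and
# passing several names returns their values as a list.
def _example_config_vars():
    prefix, version = get_config_vars('prefix', 'py_version_short')
    return prefix == get_config_var('prefix') and version == get_python_version()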


def get_platform():
    """Return a string that identifies the current platform.

    This is used mainly to distinguish platform-specific build directories and
    platform-specific built distributions.  Typically includes the OS name
    and version and the architecture (as supplied by 'os.uname()'),
    although the exact information included depends on the OS; eg. for IRIX
    the architecture isn't particularly important (IRIX only runs on SGI
    hardware), but for Linux the kernel version isn't particularly
    important.

    Examples of returned values:
       linux-i586
       linux-alpha (?)
       solaris-2.6-sun4u
       irix-5.3
       irix64-6.2

    Windows will return one of:
       win-amd64 (64bit Windows on AMD64 (aka x86_64, Intel64, EM64T, etc))
       win-ia64 (64bit Windows on Itanium)
       win32 (all others - specifically, sys.platform is returned)

    For other non-POSIX platforms, currently just returns 'sys.platform'.
    """
    if os.name == 'nt':
        # sniff sys.version for architecture.
        prefix = " bit ("
        i = sys.version.find(prefix)
        if i == -1:
            return sys.platform
        j = sys.version.find(")", i)
        look = sys.version[i+len(prefix):j].lower()
        if look == 'amd64':
            return 'win-amd64'
        if look == 'itanium':
            return 'win-ia64'
        return sys.platform

    if os.name != "posix" or not hasattr(os, 'uname'):
        # XXX what about the architecture? NT is Intel or Alpha,
        # Mac OS is M68k or PPC, etc.
        return sys.platform

    # Try to distinguish various flavours of Unix
    osname, host, release, version, machine = os.uname()

    # Convert the OS name to lowercase, remove '/' characters
    # (to accommodate BSD/OS), and translate spaces (for "Power Macintosh")
    osname = osname.lower().replace('/', '')
    machine = machine.replace(' ', '_')
    machine = machine.replace('/', '-')

    if osname[:5] == "linux":
        # At least on Linux/Intel, 'machine' is the processor --
        # i386, etc.
        # XXX what about Alpha, SPARC, etc?
        return  "%s-%s" % (osname, machine)
    elif osname[:5] == "sunos":
        if release[0] >= "5":           # SunOS 5 == Solaris 2
            osname = "solaris"
            release = "%d.%s" % (int(release[0]) - 3, release[2:])
        # fall through to standard osname-release-machine representation
    elif osname[:4] == "irix":              # could be "irix64"!
        return "%s-%s" % (osname, release)
    elif osname[:3] == "aix":
        return "%s-%s.%s" % (osname, version, release)
    elif osname[:6] == "cygwin":
        osname = "cygwin"
        rel_re = re.compile(r'[\d.]+')
        m = rel_re.match(release)
        if m:
            release = m.group()
    elif osname[:6] == "darwin":
        #
        # For our purposes, we'll assume that the system version from
        # distutils' perspective is what MACOSX_DEPLOYMENT_TARGET is set
        # to. This makes the compatibility story a bit more sane because the
        # machine is going to compile and link as if it were
        # MACOSX_DEPLOYMENT_TARGET.
        cfgvars = get_config_vars()
        macver = cfgvars.get('MACOSX_DEPLOYMENT_TARGET')

        if True:
            # Always calculate the release of the running machine,
            # needed to determine if we can build fat binaries or not.

            macrelease = macver
            # Get the system version. Reading this plist is a documented
            # way to get the system version (see the documentation for
            # the Gestalt Manager)
            try:
                f = open('/System/Library/CoreServices/SystemVersion.plist')
            except IOError:
                # We're on a plain darwin box, fall back to the default
                # behaviour.
                pass
            else:
                try:
                    m = re.search(r'<key>ProductUserVisibleVersion</key>\s*'
                                  r'<string>(.*?)</string>', f.read())
                finally:
                    f.close()
                if m is not None:
                    macrelease = '.'.join(m.group(1).split('.')[:2])
                # else: fall back to the default behaviour

        if not macver:
            macver = macrelease

        if macver:
            release = macver
            osname = "macosx"

            if ((macrelease + '.') >= '10.4.' and
                '-arch' in get_config_vars().get('CFLAGS', '').strip()):
                # The universal build will build fat binaries, but not on
                # systems before 10.4
                #
                # Try to detect 4-way universal builds, those have machine-type
                # 'universal' instead of 'fat'.

                machine = 'fat'
                cflags = get_config_vars().get('CFLAGS')

                archs = re.findall(r'-arch\s+(\S+)', cflags)
                archs = tuple(sorted(set(archs)))

                if len(archs) == 1:
                    machine = archs[0]
                elif archs == ('i386', 'ppc'):
                    machine = 'fat'
                elif archs == ('i386', 'x86_64'):
                    machine = 'intel'
                elif archs == ('i386', 'ppc', 'x86_64'):
                    machine = 'fat3'
                elif archs == ('ppc64', 'x86_64'):
                    machine = 'fat64'
                elif archs == ('i386', 'ppc', 'ppc64', 'x86_64'):
                    machine = 'universal'
                else:
                    raise ValueError(
                       "Don't know machine value for archs=%r" % (archs,))

            elif machine == 'i386':
                # On OSX the machine type returned by uname is always the
                # 32-bit variant, even if the executable architecture is
                # the 64-bit variant
                if sys.maxsize >= 2**32:
                    machine = 'x86_64'

            elif machine in ('PowerPC', 'Power_Macintosh'):
                # Pick a sane name for the PPC architecture.
                # See 'i386' case
                if sys.maxsize >= 2**32:
                    machine = 'ppc64'
                else:
                    machine = 'ppc'

    return "%s-%s-%s" % (osname, release, machine)


def get_python_version():
    return _PY_VERSION_SHORT


def _print_dict(title, data):
    for index, (key, value) in enumerate(sorted(data.items())):
        if index == 0:
            print('%s: ' % (title))
        print('\t%s = "%s"' % (key, value))


def _main():
    """Display all information sysconfig detains."""
    print('Platform: "%s"' % get_platform())
    print('Python version: "%s"' % get_python_version())
    print('Current installation scheme: "%s"' % _get_default_scheme())
    print()
    _print_dict('Paths', get_paths())
    print()
    _print_dict('Variables', get_config_vars())


if __name__ == '__main__':
    _main()

site-packages/pip/_vendor/distlib/_backport/misc.py
# -*- coding: utf-8 -*-
#
# Copyright (C) 2012 The Python Software Foundation.
# See LICENSE.txt and CONTRIBUTORS.txt.
#
"""Backports for individual classes and functions."""

import os
import sys

__all__ = ['cache_from_source', 'callable', 'fsencode']


try:
    from imp import cache_from_source
except ImportError:
    def cache_from_source(py_file, debug=__debug__):
        ext = debug and 'c' or 'o'
        return py_file + ext


try:
    callable = callable
except NameError:
    from collections import Callable

    def callable(obj):
        return isinstance(obj, Callable)


try:
    fsencode = os.fsencode
except AttributeError:
    def fsencode(filename):
        if isinstance(filename, bytes):
            return filename
        elif isinstance(filename, str):
            return filename.encode(sys.getfilesystemencoding())
        else:
            raise TypeError("expect bytes or str, not %s" %
                            type(filename).__name__)
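
# Illustrative sketch (not part of the original module): fsencode() mirrors
# os.fsencode() on interpreters that lack it, passing bytes through unchanged
# and encoding str with the filesystem encoding.
def _example_fsencode():
    assert fsencode(b'data.txt') == b'data.txt'
    return fsencode('data.txt')   # bytes in the filesystem encoding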

site-packages/pip/_vendor/distlib/index.py
# -*- coding: utf-8 -*-
#
# Copyright (C) 2013 Vinay Sajip.
# Licensed to the Python Software Foundation under a contributor agreement.
# See LICENSE.txt and CONTRIBUTORS.txt.
#
import hashlib
import logging
import os
import shutil
import subprocess
import tempfile
try:
    from threading import Thread
except ImportError:
    from dummy_threading import Thread

from . import DistlibException
from .compat import (HTTPBasicAuthHandler, Request, HTTPPasswordMgr,
                     urlparse, build_opener, string_types)
from .util import cached_property, zip_dir, ServerProxy

logger = logging.getLogger(__name__)

DEFAULT_INDEX = 'https://pypi.python.org/pypi'
DEFAULT_REALM = 'pypi'

class PackageIndex(object):
    """
    This class represents a package index compatible with PyPI, the Python
    Package Index.
    """

    boundary = b'----------ThIs_Is_tHe_distlib_index_bouNdaRY_$'

    def __init__(self, url=None):
        """
        Initialise an instance.

        :param url: The URL of the index. If not specified, the URL for PyPI is
                    used.
        """
        self.url = url or DEFAULT_INDEX
        self.read_configuration()
        scheme, netloc, path, params, query, frag = urlparse(self.url)
        if params or query or frag or scheme not in ('http', 'https'):
            raise DistlibException('invalid repository: %s' % self.url)
        self.password_handler = None
        self.ssl_verifier = None
        self.gpg = None
        self.gpg_home = None
        self.rpc_proxy = None
        with open(os.devnull, 'w') as sink:
            # Use gpg by default rather than gpg2, as gpg2 insists on
            # prompting for passwords
            for s in ('gpg', 'gpg2'):
                try:
                    rc = subprocess.check_call([s, '--version'], stdout=sink,
                                               stderr=sink)
                    if rc == 0:
                        self.gpg = s
                        break
                except OSError:
                    pass

    def _get_pypirc_command(self):
        """
        Get the distutils command for interacting with PyPI configurations.
        :return: the command.
        """
        from distutils.core import Distribution
        from distutils.config import PyPIRCCommand
        d = Distribution()
        return PyPIRCCommand(d)

    def read_configuration(self):
        """
        Read the PyPI access configuration as supported by distutils, getting
        PyPI to do the actual work. This populates ``username``, ``password``,
        ``realm`` and ``url`` attributes from the configuration.
        """
        # get distutils to do the work
        c = self._get_pypirc_command()
        c.repository = self.url
        cfg = c._read_pypirc()
        self.username = cfg.get('username')
        self.password = cfg.get('password')
        self.realm = cfg.get('realm', 'pypi')
        self.url = cfg.get('repository', self.url)

    def save_configuration(self):
        """
        Save the PyPI access configuration. You must have set ``username`` and
        ``password`` attributes before calling this method.

        Again, distutils is used to do the actual work.
        """
        self.check_credentials()
        # get distutils to do the work
        c = self._get_pypirc_command()
        c._store_pypirc(self.username, self.password)

    def check_credentials(self):
        """
        Check that ``username`` and ``password`` have been set, and raise an
        exception if not.
        """
        if self.username is None or self.password is None:
            raise DistlibException('username and password must be set')
        pm = HTTPPasswordMgr()
        _, netloc, _, _, _, _ = urlparse(self.url)
        pm.add_password(self.realm, netloc, self.username, self.password)
        self.password_handler = HTTPBasicAuthHandler(pm)

    def register(self, metadata):
        """
        Register a distribution on PyPI, using the provided metadata.

        :param metadata: A :class:`Metadata` instance defining at least a name
                         and version number for the distribution to be
                         registered.
        :return: The HTTP response received from PyPI upon submission of the
                request.
        """
        self.check_credentials()
        metadata.validate()
        d = metadata.todict()
        d[':action'] = 'verify'
        request = self.encode_request(d.items(), [])
        response = self.send_request(request)
        d[':action'] = 'submit'
        request = self.encode_request(d.items(), [])
        return self.send_request(request)

    def _reader(self, name, stream, outbuf):
        """
        Thread runner for reading lines from a subprocess into a buffer.

        :param name: The logical name of the stream (used for logging only).
        :param stream: The stream to read from. This will typically be a pipe
                       connected to the output stream of a subprocess.
        :param outbuf: The list to append the read lines to.
        """
        while True:
            s = stream.readline()
            if not s:
                break
            s = s.decode('utf-8').rstrip()
            outbuf.append(s)
            logger.debug('%s: %s' % (name, s))
        stream.close()

    def get_sign_command(self, filename, signer, sign_password,
                         keystore=None):
        """
        Return a suitable command for signing a file.

        :param filename: The pathname to the file to be signed.
        :param signer: The identifier of the signer of the file.
        :param sign_password: The passphrase for the signer's
                              private key used for signing.
        :param keystore: The path to a directory which contains the keys
                         used in verification. If not specified, the
                         instance's ``gpg_home`` attribute is used instead.
        :return: The signing command as a list suitable to be
                 passed to :class:`subprocess.Popen`.
        """
        cmd = [self.gpg, '--status-fd', '2', '--no-tty']
        if keystore is None:
            keystore = self.gpg_home
        if keystore:
            cmd.extend(['--homedir', keystore])
        if sign_password is not None:
            cmd.extend(['--batch', '--passphrase-fd', '0'])
        td = tempfile.mkdtemp()
        sf = os.path.join(td, os.path.basename(filename) + '.asc')
        cmd.extend(['--detach-sign', '--armor', '--local-user',
                    signer, '--output', sf, filename])
        logger.debug('invoking: %s', ' '.join(cmd))
        return cmd, sf

    def run_command(self, cmd, input_data=None):
        """
        Run a command in a child process, passing it any input data specified.

        :param cmd: The command to run.
        :param input_data: If specified, this must be a byte string containing
                           data to be sent to the child process.
        :return: A tuple consisting of the subprocess' exit code, a list of
                 lines read from the subprocess' ``stdout``, and a list of
                 lines read from the subprocess' ``stderr``.
        """
        kwargs = {
            'stdout': subprocess.PIPE,
            'stderr': subprocess.PIPE,
        }
        if input_data is not None:
            kwargs['stdin'] = subprocess.PIPE
        stdout = []
        stderr = []
        p = subprocess.Popen(cmd, **kwargs)
        # We don't use communicate() here because we may need to
        # get clever with interacting with the command
        t1 = Thread(target=self._reader, args=('stdout', p.stdout, stdout))
        t1.start()
        t2 = Thread(target=self._reader, args=('stderr', p.stderr, stderr))
        t2.start()
        if input_data is not None:
            p.stdin.write(input_data)
            p.stdin.close()

        p.wait()
        t1.join()
        t2.join()
        return p.returncode, stdout, stderr
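
    # Illustrative usage sketch (not part of the original module): run an
    # external command and capture both output streams.  The gpg invocation
    # below is hypothetical.
    #
    #   index = PackageIndex()
    #   rc, out, err = index.run_command(['gpg', '--version'])
    #   if rc != 0:
    #       raise DistlibException('gpg not usable: %s' % err)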

    def sign_file(self, filename, signer, sign_password, keystore=None):
        """
        Sign a file.

        :param filename: The pathname to the file to be signed.
        :param signer: The identifier of the signer of the file.
        :param sign_password: The passphrase for the signer's
                              private key used for signing.
        :param keystore: The path to a directory which contains the keys
                         used in signing. If not specified, the instance's
                         ``gpg_home`` attribute is used instead.
        :return: The absolute pathname of the file where the signature is
                 stored.
        """
        cmd, sig_file = self.get_sign_command(filename, signer, sign_password,
                                              keystore)
        rc, stdout, stderr = self.run_command(cmd,
                                              sign_password.encode('utf-8'))
        if rc != 0:
            raise DistlibException('sign command failed with error '
                                   'code %s' % rc)
        return sig_file

    def upload_file(self, metadata, filename, signer=None, sign_password=None,
                    filetype='sdist', pyversion='source', keystore=None):
        """
        Upload a release file to the index.

        :param metadata: A :class:`Metadata` instance defining at least a name
                         and version number for the file to be uploaded.
        :param filename: The pathname of the file to be uploaded.
        :param signer: The identifier of the signer of the file.
        :param sign_password: The passphrase for the signer's
                              private key used for signing.
        :param filetype: The type of the file being uploaded. This is the
                        distutils command which produced that file, e.g.
                        ``sdist`` or ``bdist_wheel``.
        :param pyversion: The version of Python which the release relates
                          to. For code compatible with any Python, this would
                          be ``source``, otherwise it would be e.g. ``3.2``.
        :param keystore: The path to a directory which contains the keys
                         used in signing. If not specified, the instance's
                         ``gpg_home`` attribute is used instead.
        :return: The HTTP response received from PyPI upon submission of the
                request.
        """
        self.check_credentials()
        if not os.path.exists(filename):
            raise DistlibException('not found: %s' % filename)
        metadata.validate()
        d = metadata.todict()
        sig_file = None
        if signer:
            if not self.gpg:
                logger.warning('no signing program available - not signed')
            else:
                sig_file = self.sign_file(filename, signer, sign_password,
                                          keystore)
        with open(filename, 'rb') as f:
            file_data = f.read()
        md5_digest = hashlib.md5(file_data).hexdigest()
        sha256_digest = hashlib.sha256(file_data).hexdigest()
        d.update({
            ':action': 'file_upload',
            'protocol_version': '1',
            'filetype': filetype,
            'pyversion': pyversion,
            'md5_digest': md5_digest,
            'sha256_digest': sha256_digest,
        })
        files = [('content', os.path.basename(filename), file_data)]
        if sig_file:
            with open(sig_file, 'rb') as f:
                sig_data = f.read()
            files.append(('gpg_signature', os.path.basename(sig_file),
                         sig_data))
            shutil.rmtree(os.path.dirname(sig_file))
        request = self.encode_request(d.items(), files)
        return self.send_request(request)

    def upload_documentation(self, metadata, doc_dir):
        """
        Upload documentation to the index.

        :param metadata: A :class:`Metadata` instance defining at least a name
                         and version number for the documentation to be
                         uploaded.
        :param doc_dir: The pathname of the directory which contains the
                        documentation. This should be the directory that
                        contains the ``index.html`` for the documentation.
        :return: The HTTP response received from PyPI upon submission of the
                request.
        """
        self.check_credentials()
        if not os.path.isdir(doc_dir):
            raise DistlibException('not a directory: %r' % doc_dir)
        fn = os.path.join(doc_dir, 'index.html')
        if not os.path.exists(fn):
            raise DistlibException('not found: %r' % fn)
        metadata.validate()
        name, version = metadata.name, metadata.version
        zip_data = zip_dir(doc_dir).getvalue()
        fields = [(':action', 'doc_upload'),
                  ('name', name), ('version', version)]
        files = [('content', name, zip_data)]
        request = self.encode_request(fields, files)
        return self.send_request(request)

    def get_verify_command(self, signature_filename, data_filename,
                           keystore=None):
        """
        Return a suitable command for verifying a file.

        :param signature_filename: The pathname to the file containing the
                                   signature.
        :param data_filename: The pathname to the file containing the
                              signed data.
        :param keystore: The path to a directory which contains the keys
                         used in verification. If not specified, the
                         instance's ``gpg_home`` attribute is used instead.
        :return: The verifying command as a list suitable to be
                 passed to :class:`subprocess.Popen`.
        """
        cmd = [self.gpg, '--status-fd', '2', '--no-tty']
        if keystore is None:
            keystore = self.gpg_home
        if keystore:
            cmd.extend(['--homedir', keystore])
        cmd.extend(['--verify', signature_filename, data_filename])
        logger.debug('invoking: %s', ' '.join(cmd))
        return cmd

    def verify_signature(self, signature_filename, data_filename,
                         keystore=None):
        """
        Verify a signature for a file.

        :param signature_filename: The pathname to the file containing the
                                   signature.
        :param data_filename: The pathname to the file containing the
                              signed data.
        :param keystore: The path to a directory which contains the keys
                         used in verification. If not specified, the
                         instance's ``gpg_home`` attribute is used instead.
        :return: True if the signature was verified, else False.
        """
        if not self.gpg:
            raise DistlibException('verification unavailable because gpg '
                                   'unavailable')
        cmd = self.get_verify_command(signature_filename, data_filename,
                                      keystore)
        rc, stdout, stderr = self.run_command(cmd)
        if rc not in (0, 1):
            raise DistlibException('verify command failed with error '
                             'code %s' % rc)
        return rc == 0

    def download_file(self, url, destfile, digest=None, reporthook=None):
        """
        This is a convenience method for downloading a file from a URL.
        Normally, this will be a file from the index, though currently
        no check is made for this (i.e. a file can be downloaded from
        anywhere).

        The method is just like the :func:`urlretrieve` function in the
        standard library, except that it allows digest computation to be
        done during download, and checks that the downloaded data
        matches any expected value.

        :param url: The URL of the file to be downloaded (assumed to be
                    available via an HTTP GET request).
        :param destfile: The pathname where the downloaded file is to be
                         saved.
        :param digest: If specified, this must be a (hasher, value)
                       tuple, where hasher is the algorithm used (e.g.
                       ``'md5'``) and ``value`` is the expected value.
        :param reporthook: The same as for :func:`urlretrieve` in the
                           standard library.
        """
        if digest is None:
            digester = None
            logger.debug('No digest specified')
        else:
            if isinstance(digest, (list, tuple)):
                hasher, digest = digest
            else:
                hasher = 'md5'
            digester = getattr(hashlib, hasher)()
            logger.debug('Digest specified: %s', digest)
        # The following code is equivalent to urlretrieve.
        # We need to do it this way so that we can compute the
        # digest of the file as we go.
        with open(destfile, 'wb') as dfp:
            # addinfourl is not a context manager on 2.x
            # so we have to use try/finally
            sfp = self.send_request(Request(url))
            try:
                headers = sfp.info()
                blocksize = 8192
                size = -1
                read = 0
                blocknum = 0
                if "content-length" in headers:
                    size = int(headers["Content-Length"])
                if reporthook:
                    reporthook(blocknum, blocksize, size)
                while True:
                    block = sfp.read(blocksize)
                    if not block:
                        break
                    read += len(block)
                    dfp.write(block)
                    if digester:
                        digester.update(block)
                    blocknum += 1
                    if reporthook:
                        reporthook(blocknum, blocksize, size)
            finally:
                sfp.close()

        # check that we got the whole file, if we can
        if size >= 0 and read < size:
            raise DistlibException(
                'retrieval incomplete: got only %d out of %d bytes'
                % (read, size))
        # if we have a digest, it must match.
        if digester:
            actual = digester.hexdigest()
            if digest != actual:
                raise DistlibException('%s digest mismatch for %s: expected '
                                       '%s, got %s' % (hasher, destfile,
                                                       digest, actual))
            logger.debug('Digest verified: %s', digest)
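    # Illustrative usage sketch: downloading a release archive while checking a
    # digest computed on the fly. The URL and digest value are placeholders.
    #
    #   index = PackageIndex()
    #   index.download_file('https://example.com/pkg-1.0.tar.gz',
    #                       '/tmp/pkg-1.0.tar.gz',
    #                       digest=('sha256', '<expected hex digest>'))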

    def send_request(self, req):
        """
        Send a standard library :class:`Request` to PyPI and return its
        response.

        :param req: The request to send.
        :return: The HTTP response from PyPI (a standard library HTTPResponse).
        """
        handlers = []
        if self.password_handler:
            handlers.append(self.password_handler)
        if self.ssl_verifier:
            handlers.append(self.ssl_verifier)
        opener = build_opener(*handlers)
        return opener.open(req)

    def encode_request(self, fields, files):
        """
        Encode fields and files for posting to an HTTP server.

        :param fields: The fields to send as a list of (fieldname, value)
                       tuples.
        :param files: The files to send as a list of (fieldname, filename,
                      file_bytes) tuples.
        """
        # Adapted from packaging, which in turn was adapted from
        # http://code.activestate.com/recipes/146306

        parts = []
        boundary = self.boundary
        for k, values in fields:
            if not isinstance(values, (list, tuple)):
                values = [values]

            for v in values:
                parts.extend((
                    b'--' + boundary,
                    ('Content-Disposition: form-data; name="%s"' %
                     k).encode('utf-8'),
                    b'',
                    v.encode('utf-8')))
        for key, filename, value in files:
            parts.extend((
                b'--' + boundary,
                ('Content-Disposition: form-data; name="%s"; filename="%s"' %
                 (key, filename)).encode('utf-8'),
                b'',
                value))

        parts.extend((b'--' + boundary + b'--', b''))

        body = b'\r\n'.join(parts)
        ct = b'multipart/form-data; boundary=' + boundary
        headers = {
            'Content-type': ct,
            'Content-length': str(len(body))
        }
        return Request(self.url, body, headers)
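    # Illustrative sketch of the argument shapes encode_request expects: field
    # values are text and file payloads are raw bytes. The field names mirror
    # the documentation upload above; the payload is a placeholder.
    #
    #   fields = [(':action', 'doc_upload'), ('name', 'pkg'), ('version', '1.0')]
    #   files = [('content', 'pkg-docs.zip', b'...zip bytes...')]
    #   request = index.encode_request(fields, files)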

    def search(self, terms, operator=None):
        if isinstance(terms, string_types):
            terms = {'name': terms}
        if self.rpc_proxy is None:
            self.rpc_proxy = ServerProxy(self.url, timeout=3.0)
        return self.rpc_proxy.search(terms, operator or 'and')
site-packages/pip/_vendor/distlib/locators.py
# -*- coding: utf-8 -*-
#
# Copyright (C) 2012-2015 Vinay Sajip.
# Licensed to the Python Software Foundation under a contributor agreement.
# See LICENSE.txt and CONTRIBUTORS.txt.
#

import gzip
from io import BytesIO
import json
import logging
import os
import posixpath
import re
try:
    import threading
except ImportError:  # pragma: no cover
    import dummy_threading as threading
import zlib

from . import DistlibException
from .compat import (urljoin, urlparse, urlunparse, url2pathname, pathname2url,
                     queue, quote, unescape, string_types, build_opener,
                     HTTPRedirectHandler as BaseRedirectHandler, text_type,
                     Request, HTTPError, URLError)
from .database import Distribution, DistributionPath, make_dist
from .metadata import Metadata
from .util import (cached_property, parse_credentials, ensure_slash,
                   split_filename, get_project_data, parse_requirement,
                   parse_name_and_version, ServerProxy, normalize_name)
from .version import get_scheme, UnsupportedVersionError
from .wheel import Wheel, is_compatible

logger = logging.getLogger(__name__)

HASHER_HASH = re.compile(r'^(\w+)=([a-f0-9]+)')
CHARSET = re.compile(r';\s*charset\s*=\s*(.*)\s*$', re.I)
HTML_CONTENT_TYPE = re.compile('text/html|application/x(ht)?ml')
DEFAULT_INDEX = 'https://pypi.python.org/pypi'

def get_all_distribution_names(url=None):
    """
    Return all distribution names known by an index.
    :param url: The URL of the index.
    :return: A list of all known distribution names.
    """
    if url is None:
        url = DEFAULT_INDEX
    client = ServerProxy(url, timeout=3.0)
    return client.list_packages()
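# Illustrative usage sketch: listing every project name the default index
# knows about. This performs a network XML-RPC call and may take a while.
#
#   from distlib.locators import get_all_distribution_names
#   names = get_all_distribution_names()
#   print('%d projects known' % len(names))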

class RedirectHandler(BaseRedirectHandler):
    """
    A class to work around a bug in some Python 3.2.x releases.
    """
    # There's a bug in the base version for some 3.2.x
    # (e.g. 3.2.2 on Ubuntu Oneiric). If a Location header
    # returns e.g. /abc, it bails because it says the scheme ''
    # is bogus, when actually it should use the request's
    # URL for the scheme. See Python issue #13696.
    def http_error_302(self, req, fp, code, msg, headers):
        # Some servers (incorrectly) return multiple Location headers
        # (so probably same goes for URI).  Use first header.
        newurl = None
        for key in ('location', 'uri'):
            if key in headers:
                newurl = headers[key]
                break
        if newurl is None:
            return
        urlparts = urlparse(newurl)
        if urlparts.scheme == '':
            newurl = urljoin(req.get_full_url(), newurl)
            if hasattr(headers, 'replace_header'):
                headers.replace_header(key, newurl)
            else:
                headers[key] = newurl
        return BaseRedirectHandler.http_error_302(self, req, fp, code, msg,
                                                  headers)

    http_error_301 = http_error_303 = http_error_307 = http_error_302

class Locator(object):
    """
    A base class for locators - things that locate distributions.
    """
    source_extensions = ('.tar.gz', '.tar.bz2', '.tar', '.zip', '.tgz', '.tbz')
    binary_extensions = ('.egg', '.exe', '.whl')
    excluded_extensions = ('.pdf',)

    # A list of tags indicating which wheels you want to match. The default
    # value of None matches against the tags compatible with the running
    # Python. If you want to match other values, set wheel_tags on a locator
    # instance to a list of tuples (pyver, abi, arch) which you want to match.
    wheel_tags = None

    downloadable_extensions = source_extensions + ('.whl',)

    def __init__(self, scheme='default'):
        """
        Initialise an instance.
        :param scheme: Because locators look for most recent versions, they
                       need to know the version scheme to use. This specifies
                       the current PEP-recommended scheme - use ``'legacy'``
                       if you need to support existing distributions on PyPI.
        """
        self._cache = {}
        self.scheme = scheme
        # Because of bugs in some of the handlers on some of the platforms,
        # we use our own opener rather than just using urlopen.
        self.opener = build_opener(RedirectHandler())
        # If get_project() is called from locate(), the matcher instance
        # is set from the requirement passed to locate(). See issue #18 for
        # why this can be useful to know.
        self.matcher = None
        self.errors = queue.Queue()

    def get_errors(self):
        """
        Return any errors which have occurred.
        """
        result = []
        while not self.errors.empty():  # pragma: no cover
            try:
                e = self.errors.get(False)
                result.append(e)
            except queue.Empty:
                continue
            self.errors.task_done()
        return result

    def clear_errors(self):
        """
        Clear any errors which may have been logged.
        """
        # Just get the errors and throw them away
        self.get_errors()

    def clear_cache(self):
        self._cache.clear()

    def _get_scheme(self):
        return self._scheme

    def _set_scheme(self, value):
        self._scheme = value

    scheme = property(_get_scheme, _set_scheme)

    def _get_project(self, name):
        """
        For a given project, get a dictionary mapping available versions to Distribution
        instances.

        This should be implemented in subclasses.

        If called from a locate() request, self.matcher will be set to a
        matcher for the requirement to satisfy, otherwise it will be None.
        """
        raise NotImplementedError('Please implement in the subclass')

    def get_distribution_names(self):
        """
        Return all the distribution names known to this locator.
        """
        raise NotImplementedError('Please implement in the subclass')

    def get_project(self, name):
        """
        For a given project, get a dictionary mapping available versions to Distribution
        instances.

        This calls _get_project to do all the work, and just implements a caching layer on top.
        """
        if self._cache is None:
            result = self._get_project(name)
        elif name in self._cache:
            result = self._cache[name]
        else:
            self.clear_errors()
            result = self._get_project(name)
            self._cache[name] = result
        return result

    def score_url(self, url):
        """
        Give a URL a score which can be used to choose preferred URLs
        for a given project release.
        """
        t = urlparse(url)
        basename = posixpath.basename(t.path)
        compatible = True
        is_wheel = basename.endswith('.whl')
        if is_wheel:
            compatible = is_compatible(Wheel(basename), self.wheel_tags)
        return (t.scheme == 'https', 'pypi.python.org' in t.netloc,
                is_wheel, compatible, basename)

    def prefer_url(self, url1, url2):
        """
        Choose one of two URLs where both are candidates for distribution
        archives for the same version of a distribution (for example,
        .tar.gz vs. zip).

        The current implementation favours https:// URLs over http://, archives
        from PyPI over those from other locations, wheel compatibility (if a
        wheel) and then the archive name.
        """
        result = url2
        if url1:
            s1 = self.score_url(url1)
            s2 = self.score_url(url2)
            if s1 > s2:
                result = url1
            if result != url2:
                logger.debug('Not replacing %r with %r', url1, url2)
            else:
                logger.debug('Replacing %r with %r', url1, url2)
        return result
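    # Illustrative sketch of how scoring drives URL preference: an https archive
    # hosted on PyPI outscores a plain-http mirror copy of the same release.
    # The URLs are made up; any concrete Locator subclass (e.g. DirectoryLocator,
    # defined later in this file) can be used.
    #
    #   loc = DirectoryLocator('/tmp')
    #   best = loc.prefer_url(
    #       'http://mirror.example.com/packages/foo-1.0.zip',
    #       'https://pypi.python.org/packages/source/f/foo/foo-1.0.tar.gz')
    #   # best -> the https pypi.python.org URL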

    def split_filename(self, filename, project_name):
        """
        Attempt to split a filename into project name, version and Python version.
        """
        return split_filename(filename, project_name)

    def convert_url_to_download_info(self, url, project_name):
        """
        See if a URL is a candidate for a download URL for a project (the URL
        has typically been scraped from an HTML page).

        If it is, a dictionary is returned with keys "name", "version",
        "filename" and "url"; otherwise, None is returned.
        """
        def same_project(name1, name2):
            return normalize_name(name1) == normalize_name(name2)

        result = None
        scheme, netloc, path, params, query, frag = urlparse(url)
        if frag.lower().startswith('egg='):
            logger.debug('%s: version hint in fragment: %r',
                         project_name, frag)
        m = HASHER_HASH.match(frag)
        if m:
            algo, digest = m.groups()
        else:
            algo, digest = None, None
        origpath = path
        if path and path[-1] == '/':
            path = path[:-1]
        if path.endswith('.whl'):
            try:
                wheel = Wheel(path)
                if is_compatible(wheel, self.wheel_tags):
                    if project_name is None:
                        include = True
                    else:
                        include = same_project(wheel.name, project_name)
                    if include:
                        result = {
                            'name': wheel.name,
                            'version': wheel.version,
                            'filename': wheel.filename,
                            'url': urlunparse((scheme, netloc, origpath,
                                               params, query, '')),
                            'python-version': ', '.join(
                                ['.'.join(list(v[2:])) for v in wheel.pyver]),
                        }
            except Exception as e:  # pragma: no cover
                logger.warning('invalid path for wheel: %s', path)
        elif path.endswith(self.downloadable_extensions):
            path = filename = posixpath.basename(path)
            for ext in self.downloadable_extensions:
                if path.endswith(ext):
                    path = path[:-len(ext)]
                    t = self.split_filename(path, project_name)
                    if not t:
                        logger.debug('No match for project/version: %s', path)
                    else:
                        name, version, pyver = t
                        if not project_name or same_project(project_name, name):
                            result = {
                                'name': name,
                                'version': version,
                                'filename': filename,
                                'url': urlunparse((scheme, netloc, origpath,
                                                   params, query, '')),
                                #'packagetype': 'sdist',
                            }
                            if pyver:
                                result['python-version'] = pyver
                    break
        if result and algo:
            result['%s_digest' % algo] = digest
        return result
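    # Illustrative sketch: for a typical sdist URL the method returns a small
    # dictionary, while unrelated URLs yield None. The URL below is made up.
    #
    #   info = loc.convert_url_to_download_info(
    #       'https://pypi.python.org/packages/source/f/foo/foo-1.0.tar.gz#sha256=abc123',
    #       'foo')
    #   # info -> {'name': 'foo', 'version': '1.0', 'filename': 'foo-1.0.tar.gz',
    #   #          'url': '...', 'sha256_digest': 'abc123'}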

    def _get_digest(self, info):
        """
        Get a digest from a dictionary by looking at keys of the form
        'algo_digest'.

        Returns a 2-tuple (algo, digest) if found, else None. Currently
        looks only for SHA256, then MD5.
        """
        result = None
        for algo in ('sha256', 'md5'):
            key = '%s_digest' % algo
            if key in info:
                result = (algo, info[key])
                break
        return result

    def _update_version_data(self, result, info):
        """
        Update a result dictionary (the final result from _get_project) with a
        dictionary for a specific version, which typically holds information
        gleaned from a filename or URL for an archive for the distribution.
        """
        name = info.pop('name')
        version = info.pop('version')
        if version in result:
            dist = result[version]
            md = dist.metadata
        else:
            dist = make_dist(name, version, scheme=self.scheme)
            md = dist.metadata
        dist.digest = digest = self._get_digest(info)
        url = info['url']
        result['digests'][url] = digest
        if md.source_url != info['url']:
            md.source_url = self.prefer_url(md.source_url, url)
        result['urls'].setdefault(version, set()).add(url)
        dist.locator = self
        result[version] = dist

    def locate(self, requirement, prereleases=False):
        """
        Find the most recent distribution which matches the given
        requirement.

        :param requirement: A requirement of the form 'foo (1.0)' or perhaps
                            'foo (>= 1.0, < 2.0, != 1.3)'
        :param prereleases: If ``True``, allow pre-release versions
                            to be located. Otherwise, pre-release versions
                            are not returned.
        :return: A :class:`Distribution` instance, or ``None`` if no such
                 distribution could be located.
        """
        result = None
        r = parse_requirement(requirement)
        if r is None:
            raise DistlibException('Not a valid requirement: %r' % requirement)
        scheme = get_scheme(self.scheme)
        self.matcher = matcher = scheme.matcher(r.requirement)
        logger.debug('matcher: %s (%s)', matcher, type(matcher).__name__)
        versions = self.get_project(r.name)
        if len(versions) > 2:   # urls and digests keys are present
            # sometimes, versions are invalid
            slist = []
            vcls = matcher.version_class
            for k in versions:
                if k in ('urls', 'digests'):
                    continue
                try:
                    if not matcher.match(k):
                        logger.debug('%s did not match %r', matcher, k)
                    else:
                        if prereleases or not vcls(k).is_prerelease:
                            slist.append(k)
                        else:
                            logger.debug('skipping pre-release '
                                         'version %s of %s', k, matcher.name)
                except Exception:  # pragma: no cover
                    logger.warning('error matching %s with %r', matcher, k)
                    pass # slist.append(k)
            if len(slist) > 1:
                slist = sorted(slist, key=scheme.key)
            if slist:
                logger.debug('sorted list: %s', slist)
                version = slist[-1]
                result = versions[version]
        if result:
            if r.extras:
                result.extras = r.extras
            result.download_urls = versions.get('urls', {}).get(version, set())
            d = {}
            sd = versions.get('digests', {})
            for url in result.download_urls:
                if url in sd:
                    d[url] = sd[url]
            result.digests = d
        self.matcher = None
        return result
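    # Illustrative usage sketch: finding the newest release that satisfies a
    # requirement, via the module-level ``locate`` convenience defined near the
    # bottom of this file (which uses the default aggregating locator). The
    # requirement string is a placeholder and network access is assumed.
    #
    #   from distlib.locators import locate
    #   dist = locate('requests (>= 2.0)', prereleases=False)
    #   if dist is not None:
    #       print(dist.name_and_version, dist.source_url)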


class PyPIRPCLocator(Locator):
    """
    This locator uses XML-RPC to locate distributions. It therefore
    cannot be used with simple mirrors (that only mirror file content).
    """
    def __init__(self, url, **kwargs):
        """
        Initialise an instance.

        :param url: The URL to use for XML-RPC.
        :param kwargs: Passed to the superclass constructor.
        """
        super(PyPIRPCLocator, self).__init__(**kwargs)
        self.base_url = url
        self.client = ServerProxy(url, timeout=3.0)

    def get_distribution_names(self):
        """
        Return all the distribution names known to this locator.
        """
        return set(self.client.list_packages())

    def _get_project(self, name):
        result = {'urls': {}, 'digests': {}}
        versions = self.client.package_releases(name, True)
        for v in versions:
            urls = self.client.release_urls(name, v)
            data = self.client.release_data(name, v)
            metadata = Metadata(scheme=self.scheme)
            metadata.name = data['name']
            metadata.version = data['version']
            metadata.license = data.get('license')
            metadata.keywords = data.get('keywords', [])
            metadata.summary = data.get('summary')
            dist = Distribution(metadata)
            if urls:
                info = urls[0]
                metadata.source_url = info['url']
                dist.digest = self._get_digest(info)
                dist.locator = self
                result[v] = dist
                for info in urls:
                    url = info['url']
                    digest = self._get_digest(info)
                    result['urls'].setdefault(v, set()).add(url)
                    result['digests'][url] = digest
        return result
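    # Illustrative usage sketch: querying the XML-RPC interface directly. The
    # index URL must expose XML-RPC (plain file mirrors will not work), and
    # 'foo' is a placeholder project name.
    #
    #   rpc = PyPIRPCLocator('https://pypi.python.org/pypi')
    #   versions = rpc.get_project('foo')
    #   releases = [k for k in versions if k not in ('urls', 'digests')]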

class PyPIJSONLocator(Locator):
    """
    This locator uses PyPI's JSON interface. It's very limited in functionality
    and probably not worth using.
    """
    def __init__(self, url, **kwargs):
        super(PyPIJSONLocator, self).__init__(**kwargs)
        self.base_url = ensure_slash(url)

    def get_distribution_names(self):
        """
        Return all the distribution names known to this locator.
        """
        raise NotImplementedError('Not available from this locator')

    def _get_project(self, name):
        result = {'urls': {}, 'digests': {}}
        url = urljoin(self.base_url, '%s/json' % quote(name))
        try:
            resp = self.opener.open(url)
            data = resp.read().decode() # for now
            d = json.loads(data)
            md = Metadata(scheme=self.scheme)
            data = d['info']
            md.name = data['name']
            md.version = data['version']
            md.license = data.get('license')
            md.keywords = data.get('keywords', [])
            md.summary = data.get('summary')
            dist = Distribution(md)
            dist.locator = self
            urls = d['urls']
            result[md.version] = dist
            for info in d['urls']:
                url = info['url']
                dist.download_urls.add(url)
                dist.digests[url] = self._get_digest(info)
                result['urls'].setdefault(md.version, set()).add(url)
                result['digests'][url] = self._get_digest(info)
            # Now get other releases
            for version, infos in d['releases'].items():
                if version == md.version:
                    continue    # already done
                omd = Metadata(scheme=self.scheme)
                omd.name = md.name
                omd.version = version
                odist = Distribution(omd)
                odist.locator = self
                result[version] = odist
                for info in infos:
                    url = info['url']
                    odist.download_urls.add(url)
                    odist.digests[url] = self._get_digest(info)
                    result['urls'].setdefault(version, set()).add(url)
                    result['digests'][url] = self._get_digest(info)
#            for info in urls:
#                md.source_url = info['url']
#                dist.digest = self._get_digest(info)
#                dist.locator = self
#                for info in urls:
#                    url = info['url']
#                    result['urls'].setdefault(md.version, set()).add(url)
#                    result['digests'][url] = self._get_digest(info)
        except Exception as e:
            self.errors.put(text_type(e))
            logger.exception('JSON fetch failed: %s', e)
        return result


class Page(object):
    """
    This class represents a scraped HTML page.
    """
    # The following slightly hairy-looking regex just looks for the contents of
    # an anchor link, which has an attribute "href" either immediately preceded
    # or immediately followed by a "rel" attribute. The attribute values can be
    # declared with double quotes, single quotes or no quotes - which leads to
    # the length of the expression.
    _href = re.compile("""
(rel\s*=\s*(?:"(?P<rel1>[^"]*)"|'(?P<rel2>[^']*)'|(?P<rel3>[^>\s\n]*))\s+)?
href\s*=\s*(?:"(?P<url1>[^"]*)"|'(?P<url2>[^']*)'|(?P<url3>[^>\s\n]*))
(\s+rel\s*=\s*(?:"(?P<rel4>[^"]*)"|'(?P<rel5>[^']*)'|(?P<rel6>[^>\s\n]*)))?
""", re.I | re.S | re.X)
    _base = re.compile(r"""<base\s+href\s*=\s*['"]?([^'">]+)""", re.I | re.S)

    def __init__(self, data, url):
        """
        Initialise an instance with the Unicode page contents and the URL they
        came from.
        """
        self.data = data
        self.base_url = self.url = url
        m = self._base.search(self.data)
        if m:
            self.base_url = m.group(1)

    _clean_re = re.compile(r'[^a-z0-9$&+,/:;=?@.#%_\\|-]', re.I)

    @cached_property
    def links(self):
        """
        Return the URLs of all the links on a page together with information
        about their "rel" attribute, for determining which ones to treat as
        downloads and which ones to queue for further scraping.
        """
        def clean(url):
            "Tidy up an URL."
            scheme, netloc, path, params, query, frag = urlparse(url)
            return urlunparse((scheme, netloc, quote(path),
                               params, query, frag))

        result = set()
        for match in self._href.finditer(self.data):
            d = match.groupdict('')
            rel = (d['rel1'] or d['rel2'] or d['rel3'] or
                   d['rel4'] or d['rel5'] or d['rel6'])
            url = d['url1'] or d['url2'] or d['url3']
            url = urljoin(self.base_url, url)
            url = unescape(url)
            url = self._clean_re.sub(lambda m: '%%%2x' % ord(m.group(0)), url)
            result.add((url, rel))
        # We sort the result, hoping to bring the most recent versions
        # to the front
        result = sorted(result, key=lambda t: t[0], reverse=True)
        return result


class SimpleScrapingLocator(Locator):
    """
    A locator which scrapes HTML pages to locate downloads for a distribution.
    This runs multiple threads to do the I/O; performance is at least as good
    as pip's PackageFinder, which works in an analogous fashion.
    """

    # These are used to deal with various Content-Encoding schemes.
    decoders = {
        'deflate': zlib.decompress,
        'gzip': lambda b: gzip.GzipFile(fileobj=BytesIO(b)).read(),
        'none': lambda b: b,
    }

    def __init__(self, url, timeout=None, num_workers=10, **kwargs):
        """
        Initialise an instance.
        :param url: The root URL to use for scraping.
        :param timeout: The timeout, in seconds, to be applied to requests.
                        This defaults to ``None`` (no timeout specified).
        :param num_workers: The number of worker threads you want to do I/O.
                            This defaults to 10.
        :param kwargs: Passed to the superclass.
        """
        super(SimpleScrapingLocator, self).__init__(**kwargs)
        self.base_url = ensure_slash(url)
        self.timeout = timeout
        self._page_cache = {}
        self._seen = set()
        self._to_fetch = queue.Queue()
        self._bad_hosts = set()
        self.skip_externals = False
        self.num_workers = num_workers
        self._lock = threading.RLock()
        # See issue #45: we need to be resilient when the locator is used
        # in a thread, e.g. with concurrent.futures. We can't use self._lock
        # as it is for coordinating our internal threads - the ones created
        # in _prepare_threads.
        self._gplock = threading.RLock()
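    # Illustrative usage sketch: scraping a "simple"-style index. The worker
    # count and timeout are just example values; 'foo' is a placeholder name.
    #
    #   ssl_loc = SimpleScrapingLocator('https://pypi.python.org/simple/',
    #                                   timeout=3.0, num_workers=4)
    #   versions = ssl_loc.get_project('foo')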

    def _prepare_threads(self):
        """
        Threads are created only when get_project is called, and terminate
        before it returns. They are there primarily to parallelise I/O (i.e.
        fetching web pages).
        """
        self._threads = []
        for i in range(self.num_workers):
            t = threading.Thread(target=self._fetch)
            t.setDaemon(True)
            t.start()
            self._threads.append(t)

    def _wait_threads(self):
        """
        Tell all the threads to terminate (by sending a sentinel value) and
        wait for them to do so.
        """
        # Note that you need two loops, since you can't say which
        # thread will get each sentinel
        for t in self._threads:
            self._to_fetch.put(None)    # sentinel
        for t in self._threads:
            t.join()
        self._threads = []

    def _get_project(self, name):
        result = {'urls': {}, 'digests': {}}
        with self._gplock:
            self.result = result
            self.project_name = name
            url = urljoin(self.base_url, '%s/' % quote(name))
            self._seen.clear()
            self._page_cache.clear()
            self._prepare_threads()
            try:
                logger.debug('Queueing %s', url)
                self._to_fetch.put(url)
                self._to_fetch.join()
            finally:
                self._wait_threads()
            del self.result
        return result

    platform_dependent = re.compile(r'\b(linux-(i\d86|x86_64|arm\w+)|'
                                    r'win(32|-amd64)|macosx-?\d+)\b', re.I)

    def _is_platform_dependent(self, url):
        """
        Does an URL refer to a platform-specific download?
        """
        return self.platform_dependent.search(url)

    def _process_download(self, url):
        """
        See if an URL is a suitable download for a project.

        If it is, register information in the result dictionary (for
        _get_project) about the specific version it's for.

        Note that the return value isn't actually used other than as a boolean
        value.
        """
        if self._is_platform_dependent(url):
            info = None
        else:
            info = self.convert_url_to_download_info(url, self.project_name)
        logger.debug('process_download: %s -> %s', url, info)
        if info:
            with self._lock:    # needed because self.result is shared
                self._update_version_data(self.result, info)
        return info

    def _should_queue(self, link, referrer, rel):
        """
        Determine whether a link URL from a referring page and with a
        particular "rel" attribute should be queued for scraping.
        """
        scheme, netloc, path, _, _, _ = urlparse(link)
        if path.endswith(self.source_extensions + self.binary_extensions +
                         self.excluded_extensions):
            result = False
        elif self.skip_externals and not link.startswith(self.base_url):
            result = False
        elif not referrer.startswith(self.base_url):
            result = False
        elif rel not in ('homepage', 'download'):
            result = False
        elif scheme not in ('http', 'https', 'ftp'):
            result = False
        elif self._is_platform_dependent(link):
            result = False
        else:
            host = netloc.split(':', 1)[0]
            if host.lower() == 'localhost':
                result = False
            else:
                result = True
        logger.debug('should_queue: %s (%s) from %s -> %s', link, rel,
                     referrer, result)
        return result

    def _fetch(self):
        """
        Get a URL to fetch from the work queue, get the HTML page, examine its
        links for download candidates and candidates for further scraping.

        This is a handy method to run in a thread.
        """
        while True:
            url = self._to_fetch.get()
            try:
                if url:
                    page = self.get_page(url)
                    if page is None:    # e.g. after an error
                        continue
                    for link, rel in page.links:
                        if link not in self._seen:
                            self._seen.add(link)
                            if (not self._process_download(link) and
                                self._should_queue(link, url, rel)):
                                logger.debug('Queueing %s from %s', link, url)
                                self._to_fetch.put(link)
            except Exception as e:  # pragma: no cover
                self.errors.put(text_type(e))
            finally:
                # always do this, to avoid hangs :-)
                self._to_fetch.task_done()
            if not url:
                #logger.debug('Sentinel seen, quitting.')
                break

    def get_page(self, url):
        """
        Get the HTML for an URL, possibly from an in-memory cache.

        XXX TODO Note: this cache is never actually cleared. It's assumed that
        the data won't get stale over the lifetime of a locator instance (not
        necessarily true for the default_locator).
        """
        # http://peak.telecommunity.com/DevCenter/EasyInstall#package-index-api
        scheme, netloc, path, _, _, _ = urlparse(url)
        if scheme == 'file' and os.path.isdir(url2pathname(path)):
            url = urljoin(ensure_slash(url), 'index.html')

        if url in self._page_cache:
            result = self._page_cache[url]
            logger.debug('Returning %s from cache: %s', url, result)
        else:
            host = netloc.split(':', 1)[0]
            result = None
            if host in self._bad_hosts:
                logger.debug('Skipping %s due to bad host %s', url, host)
            else:
                req = Request(url, headers={'Accept-encoding': 'identity'})
                try:
                    logger.debug('Fetching %s', url)
                    resp = self.opener.open(req, timeout=self.timeout)
                    logger.debug('Fetched %s', url)
                    headers = resp.info()
                    content_type = headers.get('Content-Type', '')
                    if HTML_CONTENT_TYPE.match(content_type):
                        final_url = resp.geturl()
                        data = resp.read()
                        encoding = headers.get('Content-Encoding')
                        if encoding:
                            decoder = self.decoders[encoding]   # fail if not found
                            data = decoder(data)
                        encoding = 'utf-8'
                        m = CHARSET.search(content_type)
                        if m:
                            encoding = m.group(1)
                        try:
                            data = data.decode(encoding)
                        except UnicodeError:  # pragma: no cover
                            data = data.decode('latin-1')    # fallback
                        result = Page(data, final_url)
                        self._page_cache[final_url] = result
                except HTTPError as e:
                    if e.code != 404:
                        logger.exception('Fetch failed: %s: %s', url, e)
                except URLError as e:  # pragma: no cover
                    logger.exception('Fetch failed: %s: %s', url, e)
                    with self._lock:
                        self._bad_hosts.add(host)
                except Exception as e:  # pragma: no cover
                    logger.exception('Fetch failed: %s: %s', url, e)
                finally:
                    self._page_cache[url] = result   # even if None (failure)
        return result

    _distname_re = re.compile('<a href=[^>]*>([^<]+)<')

    def get_distribution_names(self):
        """
        Return all the distribution names known to this locator.
        """
        result = set()
        page = self.get_page(self.base_url)
        if not page:
            raise DistlibException('Unable to get %s' % self.base_url)
        for match in self._distname_re.finditer(page.data):
            result.add(match.group(1))
        return result

class DirectoryLocator(Locator):
    """
    This class locates distributions in a directory tree.
    """

    def __init__(self, path, **kwargs):
        """
        Initialise an instance.
        :param path: The root of the directory tree to search.
        :param kwargs: Passed to the superclass constructor,
                       except for:
                       * recursive - if True (the default), subdirectories are
                         recursed into. If False, only the top-level directory
                         is searched.
        """
        self.recursive = kwargs.pop('recursive', True)
        super(DirectoryLocator, self).__init__(**kwargs)
        path = os.path.abspath(path)
        if not os.path.isdir(path):  # pragma: no cover
            raise DistlibException('Not a directory: %r' % path)
        self.base_dir = path

    def should_include(self, filename, parent):
        """
        Should a filename be considered as a candidate for a distribution
        archive? As well as the filename, the directory which contains it
        is provided, though not used by the current implementation.
        """
        return filename.endswith(self.downloadable_extensions)

    def _get_project(self, name):
        result = {'urls': {}, 'digests': {}}
        for root, dirs, files in os.walk(self.base_dir):
            for fn in files:
                if self.should_include(fn, root):
                    fn = os.path.join(root, fn)
                    url = urlunparse(('file', '',
                                      pathname2url(os.path.abspath(fn)),
                                      '', '', ''))
                    info = self.convert_url_to_download_info(url, name)
                    if info:
                        self._update_version_data(result, info)
            if not self.recursive:
                break
        return result

    def get_distribution_names(self):
        """
        Return all the distribution names known to this locator.
        """
        result = set()
        for root, dirs, files in os.walk(self.base_dir):
            for fn in files:
                if self.should_include(fn, root):
                    fn = os.path.join(root, fn)
                    url = urlunparse(('file', '',
                                      pathname2url(os.path.abspath(fn)),
                                      '', '', ''))
                    info = self.convert_url_to_download_info(url, None)
                    if info:
                        result.add(info['name'])
            if not self.recursive:
                break
        return result
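    # Illustrative usage sketch: pointing a locator at a local directory of
    # previously downloaded archives. The path is a placeholder.
    #
    #   dir_loc = DirectoryLocator('/path/to/downloads', recursive=False)
    #   print(dir_loc.get_distribution_names())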

class JSONLocator(Locator):
    """
    This locator uses special extended metadata (not available on PyPI) and is
    the basis of performant dependency resolution in distlib. Other locators
    require archive downloads before dependencies can be determined! As you
    might imagine, that can be slow.
    """
    def get_distribution_names(self):
        """
        Return all the distribution names known to this locator.
        """
        raise NotImplementedError('Not available from this locator')

    def _get_project(self, name):
        result = {'urls': {}, 'digests': {}}
        data = get_project_data(name)
        if data:
            for info in data.get('files', []):
                if info['ptype'] != 'sdist' or info['pyversion'] != 'source':
                    continue
                # We don't store summary in project metadata as it makes
                # the data bigger for no benefit during dependency
                # resolution
                dist = make_dist(data['name'], info['version'],
                                 summary=data.get('summary',
                                                  'Placeholder for summary'),
                                 scheme=self.scheme)
                md = dist.metadata
                md.source_url = info['url']
                # TODO SHA256 digest
                if 'digest' in info and info['digest']:
                    dist.digest = ('md5', info['digest'])
                md.dependencies = info.get('requirements', {})
                dist.exports = info.get('exports', {})
                result[dist.version] = dist
                result['urls'].setdefault(dist.version, set()).add(info['url'])
        return result

class DistPathLocator(Locator):
    """
    This locator finds installed distributions in a path. It can be useful for
    adding to an :class:`AggregatingLocator`.
    """
    def __init__(self, distpath, **kwargs):
        """
        Initialise an instance.

        :param distpath: A :class:`DistributionPath` instance to search.
        """
        super(DistPathLocator, self).__init__(**kwargs)
        assert isinstance(distpath, DistributionPath)
        self.distpath = distpath

    def _get_project(self, name):
        dist = self.distpath.get_distribution(name)
        if dist is None:
            result = {'urls': {}, 'digests': {}}
        else:
            result = {
                dist.version: dist,
                'urls': {dist.version: set([dist.source_url])},
                'digests': {dist.version: set([None])}
            }
        return result


class AggregatingLocator(Locator):
    """
    This class allows you to chain and/or merge a list of locators.
    """
    def __init__(self, *locators, **kwargs):
        """
        Initialise an instance.

        :param locators: The list of locators to search.
        :param kwargs: Passed to the superclass constructor,
                       except for:
                       * merge - if False (the default), the first successful
                         search from any of the locators is returned. If True,
                         the results from all locators are merged (this can be
                         slow).
        """
        self.merge = kwargs.pop('merge', False)
        self.locators = locators
        super(AggregatingLocator, self).__init__(**kwargs)

    def clear_cache(self):
        super(AggregatingLocator, self).clear_cache()
        for locator in self.locators:
            locator.clear_cache()

    def _set_scheme(self, value):
        self._scheme = value
        for locator in self.locators:
            locator.scheme = value

    scheme = property(Locator.scheme.fget, _set_scheme)

    def _get_project(self, name):
        result = {}
        for locator in self.locators:
            d = locator.get_project(name)
            if d:
                if self.merge:
                    files = result.get('urls', {})
                    digests = result.get('digests', {})
                    # next line could overwrite result['urls'], result['digests']
                    result.update(d)
                    df = result.get('urls')
                    if files and df:
                        for k, v in files.items():
                            if k in df:
                                df[k] |= v
                            else:
                                df[k] = v
                    dd = result.get('digests')
                    if digests and dd:
                        dd.update(digests)
                else:
                    # See issue #18. If any dists are found and we're looking
                    # for specific constraints, we only return something if
                    # a match is found. For example, if a DirectoryLocator
                    # returns just foo (1.0) while we're looking for
                    # foo (>= 2.0), we'll pretend there was nothing there so
                    # that subsequent locators can be queried. Otherwise we
                    # would just return foo (1.0) which would then lead to a
                    # failure to find foo (>= 2.0), because other locators
                    # weren't searched. Note that this only matters when
                    # merge=False.
                    if self.matcher is None:
                        found = True
                    else:
                        found = False
                        for k in d:
                            if self.matcher.match(k):
                                found = True
                                break
                    if found:
                        result = d
                        break
        return result

    def get_distribution_names(self):
        """
        Return all the distribution names known to this locator.
        """
        result = set()
        for locator in self.locators:
            try:
                result |= locator.get_distribution_names()
            except NotImplementedError:
                pass
        return result
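    # Illustrative usage sketch: chaining a local directory locator with a
    # scraping locator, so local archives are used when they satisfy the
    # requirement and the network is only consulted otherwise. The path, URL
    # and requirement are placeholders.
    #
    #   agg = AggregatingLocator(
    #       DirectoryLocator('/path/to/downloads'),
    #       SimpleScrapingLocator('https://pypi.python.org/simple/', timeout=3.0),
    #       scheme='legacy')
    #   dist = agg.locate('foo (>= 1.0)')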


# We use a legacy scheme simply because most of the dists on PyPI use legacy
# versions which don't conform to PEP 426 / PEP 440.
default_locator = AggregatingLocator(
                    JSONLocator(),
                    SimpleScrapingLocator('https://pypi.python.org/simple/',
                                          timeout=3.0),
                    scheme='legacy')

locate = default_locator.locate

NAME_VERSION_RE = re.compile(r'(?P<name>[\w-]+)\s*'
                             r'\(\s*(==\s*)?(?P<ver>[^)]+)\)$')

class DependencyFinder(object):
    """
    Locate dependencies for distributions.
    """

    def __init__(self, locator=None):
        """
        Initialise an instance, using the specified locator
        to locate distributions.
        """
        self.locator = locator or default_locator
        self.scheme = get_scheme(self.locator.scheme)

    def add_distribution(self, dist):
        """
        Add a distribution to the finder. This will update internal information
        about who provides what.
        :param dist: The distribution to add.
        """
        logger.debug('adding distribution %s', dist)
        name = dist.key
        self.dists_by_name[name] = dist
        self.dists[(name, dist.version)] = dist
        for p in dist.provides:
            name, version = parse_name_and_version(p)
            logger.debug('Add to provided: %s, %s, %s', name, version, dist)
            self.provided.setdefault(name, set()).add((version, dist))

    def remove_distribution(self, dist):
        """
        Remove a distribution from the finder. This will update internal
        information about who provides what.
        :param dist: The distribution to remove.
        """
        logger.debug('removing distribution %s', dist)
        name = dist.key
        del self.dists_by_name[name]
        del self.dists[(name, dist.version)]
        for p in dist.provides:
            name, version = parse_name_and_version(p)
            logger.debug('Remove from provided: %s, %s, %s', name, version, dist)
            s = self.provided[name]
            s.remove((version, dist))
            if not s:
                del self.provided[name]

    def get_matcher(self, reqt):
        """
        Get a version matcher for a requirement.
        :param reqt: The requirement
        :type reqt: str
        :return: A version matcher (an instance of
                 :class:`distlib.version.Matcher`).
        """
        try:
            matcher = self.scheme.matcher(reqt)
        except UnsupportedVersionError:  # pragma: no cover
            # XXX compat-mode if cannot read the version
            name = reqt.split()[0]
            matcher = self.scheme.matcher(name)
        return matcher

    def find_providers(self, reqt):
        """
        Find the distributions which can fulfill a requirement.

        :param reqt: The requirement.
        :type reqt: str
        :return: A set of distributions which can fulfill the requirement.
        """
        matcher = self.get_matcher(reqt)
        name = matcher.key   # case-insensitive
        result = set()
        provided = self.provided
        if name in provided:
            for version, provider in provided[name]:
                try:
                    match = matcher.match(version)
                except UnsupportedVersionError:
                    match = False

                if match:
                    result.add(provider)
                    break
        return result

    def try_to_replace(self, provider, other, problems):
        """
        Attempt to replace one provider with another. This is typically used
        when resolving dependencies from multiple sources, e.g. A requires
        (B >= 1.0) while C requires (B >= 1.1).

        For successful replacement, ``provider`` must meet all the requirements
        which ``other`` fulfills.

        :param provider: The provider we are trying to replace with.
        :param other: The provider we're trying to replace.
        :param problems: If False is returned, this will contain what
                         problems prevented replacement. This is currently
                         a tuple of the literal string 'cantreplace',
                         ``provider``, ``other``  and the set of requirements
                         that ``provider`` couldn't fulfill.
        :return: True if we can replace ``other`` with ``provider``, else
                 False.
        """
        rlist = self.reqts[other]
        unmatched = set()
        for s in rlist:
            matcher = self.get_matcher(s)
            if not matcher.match(provider.version):
                unmatched.add(s)
        if unmatched:
            # can't replace other with provider
            problems.add(('cantreplace', provider, other,
                          frozenset(unmatched)))
            result = False
        else:
            # can replace other with provider
            self.remove_distribution(other)
            del self.reqts[other]
            for s in rlist:
                self.reqts.setdefault(provider, set()).add(s)
            self.add_distribution(provider)
            result = True
        return result

    def find(self, requirement, meta_extras=None, prereleases=False):
        """
        Find a distribution and all distributions it depends on.

        :param requirement: The requirement specifying the distribution to
                            find, or a Distribution instance.
        :param meta_extras: A list of meta extras such as :test:, :build: and
                            so on.
        :param prereleases: If ``True``, allow pre-release versions to be
                            returned - otherwise, don't return prereleases
                            unless they're all that's available.

        Return a set of :class:`Distribution` instances and a set of
        problems.

        The distributions returned should be such that they have the
        :attr:`required` attribute set to ``True`` if they were
        from the ``requirement`` passed to ``find()``, and they have the
        :attr:`build_time_dependency` attribute set to ``True`` unless they
        are post-installation dependencies of the ``requirement``.

        Each problem is a tuple consisting of the string
        ``'unsatisfied'`` and the requirement which couldn't be satisfied
        by any distribution known to the locator.
        """

        self.provided = {}
        self.dists = {}
        self.dists_by_name = {}
        self.reqts = {}

        meta_extras = set(meta_extras or [])
        if ':*:' in meta_extras:
            meta_extras.remove(':*:')
            # :meta: and :run: are implicitly included
            meta_extras |= set([':test:', ':build:', ':dev:'])

        if isinstance(requirement, Distribution):
            dist = odist = requirement
            logger.debug('passed %s as requirement', odist)
        else:
            dist = odist = self.locator.locate(requirement,
                                               prereleases=prereleases)
            if dist is None:
                raise DistlibException('Unable to locate %r' % requirement)
            logger.debug('located %s', odist)
        dist.requested = True
        problems = set()
        todo = set([dist])
        install_dists = set([odist])
        while todo:
            dist = todo.pop()
            name = dist.key     # case-insensitive
            if name not in self.dists_by_name:
                self.add_distribution(dist)
            else:
                #import pdb; pdb.set_trace()
                other = self.dists_by_name[name]
                if other != dist:
                    self.try_to_replace(dist, other, problems)

            ireqts = dist.run_requires | dist.meta_requires
            sreqts = dist.build_requires
            ereqts = set()
            if dist in install_dists:
                for key in ('test', 'build', 'dev'):
                    e = ':%s:' % key
                    if e in meta_extras:
                        ereqts |= getattr(dist, '%s_requires' % key)
            all_reqts = ireqts | sreqts | ereqts
            for r in all_reqts:
                providers = self.find_providers(r)
                if not providers:
                    logger.debug('No providers found for %r', r)
                    provider = self.locator.locate(r, prereleases=prereleases)
                    # If no provider is found and we didn't consider
                    # prereleases, consider them now.
                    if provider is None and not prereleases:
                        provider = self.locator.locate(r, prereleases=True)
                    if provider is None:
                        logger.debug('Cannot satisfy %r', r)
                        problems.add(('unsatisfied', r))
                    else:
                        n, v = provider.key, provider.version
                        if (n, v) not in self.dists:
                            todo.add(provider)
                        providers.add(provider)
                        if r in ireqts and dist in install_dists:
                            install_dists.add(provider)
                            logger.debug('Adding %s to install_dists',
                                         provider.name_and_version)
                for p in providers:
                    name = p.key
                    if name not in self.dists_by_name:
                        self.reqts.setdefault(p, set()).add(r)
                    else:
                        other = self.dists_by_name[name]
                        if other != p:
                            # see if other can be replaced by p
                            self.try_to_replace(p, other, problems)

        dists = set(self.dists.values())
        for dist in dists:
            dist.build_time_dependency = dist not in install_dists
            if dist.build_time_dependency:
                logger.debug('%s is a build-time dependency only.',
                             dist.name_and_version)
        logger.debug('find done for %s', odist)
        return dists, problems
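    # Illustrative usage sketch: resolving a requirement and its dependency
    # closure with the default locator. Network access is assumed and the
    # requirement string is a placeholder.
    #
    #   finder = DependencyFinder()
    #   dists, problems = finder.find('requests (>= 2.0)')
    #   for d in sorted(dists, key=lambda d: d.name_and_version):
    #       print(d.name_and_version,
    #             'build-time only' if d.build_time_dependency else '')
    #   if problems:
    #       print('unresolved:', problems)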
site-packages/pip/_vendor/distlib/wheel.py
# -*- coding: utf-8 -*-
#
# Copyright (C) 2013-2016 Vinay Sajip.
# Licensed to the Python Software Foundation under a contributor agreement.
# See LICENSE.txt and CONTRIBUTORS.txt.
#
from __future__ import unicode_literals

import base64
import codecs
import datetime
import distutils.util
from email import message_from_file
import hashlib
import imp
import json
import logging
import os
import posixpath
import re
import shutil
import sys
import tempfile
import zipfile

from . import __version__, DistlibException
from .compat import sysconfig, ZipFile, fsdecode, text_type, filter
from .database import InstalledDistribution
from .metadata import Metadata, METADATA_FILENAME
from .util import (FileOperator, convert_path, CSVReader, CSVWriter, Cache,
                   cached_property, get_cache_base, read_exports, tempdir)
from .version import NormalizedVersion, UnsupportedVersionError

logger = logging.getLogger(__name__)

cache = None    # created when needed

if hasattr(sys, 'pypy_version_info'):
    IMP_PREFIX = 'pp'
elif sys.platform.startswith('java'):
    IMP_PREFIX = 'jy'
elif sys.platform == 'cli':
    IMP_PREFIX = 'ip'
else:
    IMP_PREFIX = 'cp'

VER_SUFFIX = sysconfig.get_config_var('py_version_nodot')
if not VER_SUFFIX:   # pragma: no cover
    VER_SUFFIX = '%s%s' % sys.version_info[:2]
PYVER = 'py' + VER_SUFFIX
IMPVER = IMP_PREFIX + VER_SUFFIX

ARCH = distutils.util.get_platform().replace('-', '_').replace('.', '_')

ABI = sysconfig.get_config_var('SOABI')
if ABI and ABI.startswith('cpython-'):
    ABI = ABI.replace('cpython-', 'cp')
else:
    def _derive_abi():
        parts = ['cp', VER_SUFFIX]
        if sysconfig.get_config_var('Py_DEBUG'):
            parts.append('d')
        if sysconfig.get_config_var('WITH_PYMALLOC'):
            parts.append('m')
        if sysconfig.get_config_var('Py_UNICODE_SIZE') == 4:
            parts.append('u')
        return ''.join(parts)
    ABI = _derive_abi()
    del _derive_abi

FILENAME_RE = re.compile(r'''
(?P<nm>[^-]+)
-(?P<vn>\d+[^-]*)
(-(?P<bn>\d+[^-]*))?
-(?P<py>\w+\d+(\.\w+\d+)*)
-(?P<bi>\w+)
-(?P<ar>\w+(\.\w+)*)
\.whl$
''', re.IGNORECASE | re.VERBOSE)

NAME_VERSION_RE = re.compile(r'''
(?P<nm>[^-]+)
-(?P<vn>\d+[^-]*)
(-(?P<bn>\d+[^-]*))?$
''', re.IGNORECASE | re.VERBOSE)
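# Illustrative only (not in the original source): a PEP 427 filename such as
# 'foo-1.0-cp36-cp36m-linux_x86_64.whl' is parsed by FILENAME_RE into
# nm='foo', vn='1.0', py='cp36', bi='cp36m', ar='linux_x86_64', while
# NAME_VERSION_RE accepts the bare 'foo-1.0' form used when only a name,
# version and optional build number are supplied.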

SHEBANG_RE = re.compile(br'\s*#![^\r\n]*')
SHEBANG_DETAIL_RE = re.compile(br'^(\s*#!("[^"]+"|\S+))\s+(.*)$')
SHEBANG_PYTHON = b'#!python'
SHEBANG_PYTHONW = b'#!pythonw'

if os.sep == '/':
    to_posix = lambda o: o
else:
    to_posix = lambda o: o.replace(os.sep, '/')


class Mounter(object):
    def __init__(self):
        self.impure_wheels = {}
        self.libs = {}

    def add(self, pathname, extensions):
        self.impure_wheels[pathname] = extensions
        self.libs.update(extensions)

    def remove(self, pathname):
        extensions = self.impure_wheels.pop(pathname)
        for k, v in extensions:
            if k in self.libs:
                del self.libs[k]

    def find_module(self, fullname, path=None):
        if fullname in self.libs:
            result = self
        else:
            result = None
        return result

    def load_module(self, fullname):
        if fullname in sys.modules:
            result = sys.modules[fullname]
        else:
            if fullname not in self.libs:
                raise ImportError('unable to find extension for %s' % fullname)
            result = imp.load_dynamic(fullname, self.libs[fullname])
            result.__loader__ = self
            parts = fullname.rsplit('.', 1)
            if len(parts) > 1:
                result.__package__ = parts[0]
        return result

_hook = Mounter()


class Wheel(object):
    """
    Class to build and install from Wheel files (PEP 427).
    """

    wheel_version = (1, 1)
    hash_kind = 'sha256'

    def __init__(self, filename=None, sign=False, verify=False):
        """
        Initialise an instance using a (valid) filename.
        """
        self.sign = sign
        self.should_verify = verify
        self.buildver = ''
        self.pyver = [PYVER]
        self.abi = ['none']
        self.arch = ['any']
        self.dirname = os.getcwd()
        if filename is None:
            self.name = 'dummy'
            self.version = '0.1'
            self._filename = self.filename
        else:
            m = NAME_VERSION_RE.match(filename)
            if m:
                info = m.groupdict('')
                self.name = info['nm']
                # Reinstate the local version separator
                self.version = info['vn'].replace('_', '-')
                self.buildver = info['bn']
                self._filename = self.filename
            else:
                dirname, filename = os.path.split(filename)
                m = FILENAME_RE.match(filename)
                if not m:
                    raise DistlibException('Invalid name or '
                                           'filename: %r' % filename)
                if dirname:
                    self.dirname = os.path.abspath(dirname)
                self._filename = filename
                info = m.groupdict('')
                self.name = info['nm']
                self.version = info['vn']
                self.buildver = info['bn']
                self.pyver = info['py'].split('.')
                self.abi = info['bi'].split('.')
                self.arch = info['ar'].split('.')
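    # A hedged usage sketch (added for illustration, not in the original code):
    #
    #   w = Wheel('foo-1.0-cp36-cp36m-linux_x86_64.whl')   # hypothetical file
    #   w.name, w.version        # -> 'foo', '1.0'
    #   w.pyver, w.abi, w.arch   # -> ['cp36'], ['cp36m'], ['linux_x86_64']
    #   w.filename               # rebuilt from the components above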

    @property
    def filename(self):
        """
        Build and return a filename from the various components.
        """
        if self.buildver:
            buildver = '-' + self.buildver
        else:
            buildver = ''
        pyver = '.'.join(self.pyver)
        abi = '.'.join(self.abi)
        arch = '.'.join(self.arch)
        # replace - with _ as a local version separator
        version = self.version.replace('-', '_')
        return '%s-%s%s-%s-%s-%s.whl' % (self.name, version, buildver,
                                         pyver, abi, arch)

    @property
    def exists(self):
        path = os.path.join(self.dirname, self.filename)
        return os.path.isfile(path)

    @property
    def tags(self):
        for pyver in self.pyver:
            for abi in self.abi:
                for arch in self.arch:
                    yield pyver, abi, arch

    @cached_property
    def metadata(self):
        pathname = os.path.join(self.dirname, self.filename)
        name_ver = '%s-%s' % (self.name, self.version)
        info_dir = '%s.dist-info' % name_ver
        wrapper = codecs.getreader('utf-8')
        with ZipFile(pathname, 'r') as zf:
            wheel_metadata = self.get_wheel_metadata(zf)
            wv = wheel_metadata['Wheel-Version'].split('.', 1)
            file_version = tuple([int(i) for i in wv])
            if file_version < (1, 1):
                fn = 'METADATA'
            else:
                fn = METADATA_FILENAME
            try:
                metadata_filename = posixpath.join(info_dir, fn)
                with zf.open(metadata_filename) as bf:
                    wf = wrapper(bf)
                    result = Metadata(fileobj=wf)
            except KeyError:
                raise ValueError('Invalid wheel, because %s is '
                                 'missing' % fn)
        return result

    def get_wheel_metadata(self, zf):
        name_ver = '%s-%s' % (self.name, self.version)
        info_dir = '%s.dist-info' % name_ver
        metadata_filename = posixpath.join(info_dir, 'WHEEL')
        with zf.open(metadata_filename) as bf:
            wf = codecs.getreader('utf-8')(bf)
            message = message_from_file(wf)
        return dict(message)

    @cached_property
    def info(self):
        pathname = os.path.join(self.dirname, self.filename)
        with ZipFile(pathname, 'r') as zf:
            result = self.get_wheel_metadata(zf)
        return result

    def process_shebang(self, data):
        m = SHEBANG_RE.match(data)
        if m:
            end = m.end()
            shebang, data_after_shebang = data[:end], data[end:]
            # Preserve any arguments after the interpreter
            if b'pythonw' in shebang.lower():
                shebang_python = SHEBANG_PYTHONW
            else:
                shebang_python = SHEBANG_PYTHON
            m = SHEBANG_DETAIL_RE.match(shebang)
            if m:
                args = b' ' + m.groups()[-1]
            else:
                args = b''
            shebang = shebang_python + args
            data = shebang + data_after_shebang
        else:
            cr = data.find(b'\r')
            lf = data.find(b'\n')
            if cr < 0 or cr > lf:
                term = b'\n'
            else:
                if data[cr:cr + 2] == b'\r\n':
                    term = b'\r\n'
                else:
                    term = b'\r'
            data = SHEBANG_PYTHON + term + data
        return data
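    # For illustration (not part of the original module): process_shebang()
    # rewrites
    #   b'#!/usr/bin/python -O\nprint("hi")\n'
    # to
    #   b'#!python -O\nprint("hi")\n'
    # preserving any interpreter arguments, and prepends b'#!python' (plus the
    # detected line terminator) when the data has no shebang at all.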

    def get_hash(self, data, hash_kind=None):
        if hash_kind is None:
            hash_kind = self.hash_kind
        try:
            hasher = getattr(hashlib, hash_kind)
        except AttributeError:
            raise DistlibException('Unsupported hash algorithm: %r' % hash_kind)
        result = hasher(data).digest()
        result = base64.urlsafe_b64encode(result).rstrip(b'=').decode('ascii')
        return hash_kind, result
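    # Example (illustrative): get_hash(b'data') returns a pair such as
    # ('sha256', '<urlsafe-base64 digest without "=" padding>'), which is
    # joined as 'sha256=<digest>' when written into RECORD (see write_records).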

    def write_record(self, records, record_path, base):
        records = list(records) # make a copy for sorting
        p = to_posix(os.path.relpath(record_path, base))
        records.append((p, '', ''))
        records.sort()
        with CSVWriter(record_path) as writer:
            for row in records:
                writer.writerow(row)

    def write_records(self, info, libdir, archive_paths):
        records = []
        distinfo, info_dir = info
        hasher = getattr(hashlib, self.hash_kind)
        for ap, p in archive_paths:
            with open(p, 'rb') as f:
                data = f.read()
            digest = '%s=%s' % self.get_hash(data)
            size = os.path.getsize(p)
            records.append((ap, digest, size))

        p = os.path.join(distinfo, 'RECORD')
        self.write_record(records, p, libdir)
        ap = to_posix(os.path.join(info_dir, 'RECORD'))
        archive_paths.append((ap, p))

    def build_zip(self, pathname, archive_paths):
        with ZipFile(pathname, 'w', zipfile.ZIP_DEFLATED) as zf:
            for ap, p in archive_paths:
                logger.debug('Wrote %s to %s in wheel', p, ap)
                zf.write(p, ap)

    def build(self, paths, tags=None, wheel_version=None):
        """
        Build a wheel from files in specified paths, and use any specified tags
        when determining the name of the wheel.
        """
        if tags is None:
            tags = {}

        libkey = list(filter(lambda o: o in paths, ('purelib', 'platlib')))[0]
        if libkey == 'platlib':
            is_pure = 'false'
            default_pyver = [IMPVER]
            default_abi = [ABI]
            default_arch = [ARCH]
        else:
            is_pure = 'true'
            default_pyver = [PYVER]
            default_abi = ['none']
            default_arch = ['any']

        self.pyver = tags.get('pyver', default_pyver)
        self.abi = tags.get('abi', default_abi)
        self.arch = tags.get('arch', default_arch)

        libdir = paths[libkey]

        name_ver = '%s-%s' % (self.name, self.version)
        data_dir = '%s.data' % name_ver
        info_dir = '%s.dist-info' % name_ver

        archive_paths = []

        # First, stuff which is not in site-packages
        for key in ('data', 'headers', 'scripts'):
            if key not in paths:
                continue
            path = paths[key]
            if os.path.isdir(path):
                for root, dirs, files in os.walk(path):
                    for fn in files:
                        p = fsdecode(os.path.join(root, fn))
                        rp = os.path.relpath(p, path)
                        ap = to_posix(os.path.join(data_dir, key, rp))
                        archive_paths.append((ap, p))
                        if key == 'scripts' and not p.endswith('.exe'):
                            with open(p, 'rb') as f:
                                data = f.read()
                            data = self.process_shebang(data)
                            with open(p, 'wb') as f:
                                f.write(data)

        # Now, stuff which is in site-packages, other than the
        # distinfo stuff.
        path = libdir
        distinfo = None
        for root, dirs, files in os.walk(path):
            if root == path:
                # At the top level only, save distinfo for later
                # and skip it for now
                for i, dn in enumerate(dirs):
                    dn = fsdecode(dn)
                    if dn.endswith('.dist-info'):
                        distinfo = os.path.join(root, dn)
                        del dirs[i]
                        break
                assert distinfo, '.dist-info directory expected, not found'

            for fn in files:
                # comment out next suite to leave .pyc files in
                if fsdecode(fn).endswith(('.pyc', '.pyo')):
                    continue
                p = os.path.join(root, fn)
                rp = to_posix(os.path.relpath(p, path))
                archive_paths.append((rp, p))

        # Now distinfo. Assumed to be flat, i.e. os.listdir is enough.
        files = os.listdir(distinfo)
        for fn in files:
            if fn not in ('RECORD', 'INSTALLER', 'SHARED', 'WHEEL'):
                p = fsdecode(os.path.join(distinfo, fn))
                ap = to_posix(os.path.join(info_dir, fn))
                archive_paths.append((ap, p))

        wheel_metadata = [
            'Wheel-Version: %d.%d' % (wheel_version or self.wheel_version),
            'Generator: distlib %s' % __version__,
            'Root-Is-Purelib: %s' % is_pure,
        ]
        for pyver, abi, arch in self.tags:
            wheel_metadata.append('Tag: %s-%s-%s' % (pyver, abi, arch))
        p = os.path.join(distinfo, 'WHEEL')
        with open(p, 'w') as f:
            f.write('\n'.join(wheel_metadata))
        ap = to_posix(os.path.join(info_dir, 'WHEEL'))
        archive_paths.append((ap, p))

        # Now, at last, RECORD.
        # Paths in here are archive paths - nothing else makes sense.
        self.write_records((distinfo, info_dir), libdir, archive_paths)
        # Now, ready to build the zip file
        pathname = os.path.join(self.dirname, self.filename)
        self.build_zip(pathname, archive_paths)
        return pathname

    def install(self, paths, maker, **kwargs):
        """
        Install a wheel to the specified paths. If kwarg ``warner`` is
        specified, it should be a callable, which will be called with two
        tuples indicating the wheel version of this software and the wheel
        version in the file, if there is a discrepancy in the versions.
        This can be used to issue any warnings or raise any exceptions.
        If kwarg ``lib_only`` is True, only the purelib/platlib files are
        installed, and the headers, scripts, data and dist-info metadata are
        not written.

        The return value is an :class:`InstalledDistribution` instance unless
        ``lib_only`` is True, in which case the return value is ``None``.
        """

        dry_run = maker.dry_run
        warner = kwargs.get('warner')
        lib_only = kwargs.get('lib_only', False)

        pathname = os.path.join(self.dirname, self.filename)
        name_ver = '%s-%s' % (self.name, self.version)
        data_dir = '%s.data' % name_ver
        info_dir = '%s.dist-info' % name_ver

        metadata_name = posixpath.join(info_dir, METADATA_FILENAME)
        wheel_metadata_name = posixpath.join(info_dir, 'WHEEL')
        record_name = posixpath.join(info_dir, 'RECORD')

        wrapper = codecs.getreader('utf-8')

        with ZipFile(pathname, 'r') as zf:
            with zf.open(wheel_metadata_name) as bwf:
                wf = wrapper(bwf)
                message = message_from_file(wf)
            wv = message['Wheel-Version'].split('.', 1)
            file_version = tuple([int(i) for i in wv])
            if (file_version != self.wheel_version) and warner:
                warner(self.wheel_version, file_version)

            if message['Root-Is-Purelib'] == 'true':
                libdir = paths['purelib']
            else:
                libdir = paths['platlib']

            records = {}
            with zf.open(record_name) as bf:
                with CSVReader(stream=bf) as reader:
                    for row in reader:
                        p = row[0]
                        records[p] = row

            data_pfx = posixpath.join(data_dir, '')
            info_pfx = posixpath.join(info_dir, '')
            script_pfx = posixpath.join(data_dir, 'scripts', '')

            # make a new instance rather than a copy of maker's,
            # as we mutate it
            fileop = FileOperator(dry_run=dry_run)
            fileop.record = True    # so we can rollback if needed

            bc = not sys.dont_write_bytecode    # Double negatives. Lovely!

            outfiles = []   # for RECORD writing

            # for script copying/shebang processing
            workdir = tempfile.mkdtemp()
            # set target dir later
            # we default add_launchers to False, as the
            # Python Launcher should be used instead
            maker.source_dir = workdir
            maker.target_dir = None
            try:
                for zinfo in zf.infolist():
                    arcname = zinfo.filename
                    if isinstance(arcname, text_type):
                        u_arcname = arcname
                    else:
                        u_arcname = arcname.decode('utf-8')
                    # The signature file won't be in RECORD,
                    # and we don't currently do anything with it
                    if u_arcname.endswith('/RECORD.jws'):
                        continue
                    row = records[u_arcname]
                    if row[2] and str(zinfo.file_size) != row[2]:
                        raise DistlibException('size mismatch for '
                                               '%s' % u_arcname)
                    if row[1]:
                        kind, value = row[1].split('=', 1)
                        with zf.open(arcname) as bf:
                            data = bf.read()
                        _, digest = self.get_hash(data, kind)
                        if digest != value:
                            raise DistlibException('digest mismatch for '
                                                   '%s' % arcname)

                    if lib_only and u_arcname.startswith((info_pfx, data_pfx)):
                        logger.debug('lib_only: skipping %s', u_arcname)
                        continue
                    is_script = (u_arcname.startswith(script_pfx)
                                 and not u_arcname.endswith('.exe'))

                    if u_arcname.startswith(data_pfx):
                        _, where, rp = u_arcname.split('/', 2)
                        outfile = os.path.join(paths[where], convert_path(rp))
                    else:
                        # meant for site-packages.
                        if u_arcname in (wheel_metadata_name, record_name):
                            continue
                        outfile = os.path.join(libdir, convert_path(u_arcname))
                    if not is_script:
                        with zf.open(arcname) as bf:
                            fileop.copy_stream(bf, outfile)
                        outfiles.append(outfile)
                        # Double check the digest of the written file
                        if not dry_run and row[1]:
                            with open(outfile, 'rb') as bf:
                                data = bf.read()
                                _, newdigest = self.get_hash(data, kind)
                                if newdigest != digest:
                                    raise DistlibException('digest mismatch '
                                                           'on write for '
                                                           '%s' % outfile)
                        if bc and outfile.endswith('.py'):
                            try:
                                pyc = fileop.byte_compile(outfile)
                                outfiles.append(pyc)
                            except Exception:
                                # Don't give up if byte-compilation fails,
                                # but log it and perhaps warn the user
                                logger.warning('Byte-compilation failed',
                                               exc_info=True)
                    else:
                        fn = os.path.basename(convert_path(arcname))
                        workname = os.path.join(workdir, fn)
                        with zf.open(arcname) as bf:
                            fileop.copy_stream(bf, workname)

                        dn, fn = os.path.split(outfile)
                        maker.target_dir = dn
                        filenames = maker.make(fn)
                        fileop.set_executable_mode(filenames)
                        outfiles.extend(filenames)

                if lib_only:
                    logger.debug('lib_only: returning None')
                    dist = None
                else:
                    # Generate scripts

                    # Try to get pydist.json so we can see if there are
                    # any commands to generate. If this fails (e.g. because
                    # of a legacy wheel), log a warning but don't give up.
                    commands = None
                    file_version = self.info['Wheel-Version']
                    if file_version == '1.0':
                        # Use legacy info
                        ep = posixpath.join(info_dir, 'entry_points.txt')
                        try:
                            with zf.open(ep) as bwf:
                                epdata = read_exports(bwf)
                            commands = {}
                            for key in ('console', 'gui'):
                                k = '%s_scripts' % key
                                if k in epdata:
                                    commands['wrap_%s' % key] = d = {}
                                    for v in epdata[k].values():
                                        s = '%s:%s' % (v.prefix, v.suffix)
                                        if v.flags:
                                            s += ' %s' % v.flags
                                        d[v.name] = s
                        except Exception:
                            logger.warning('Unable to read legacy script '
                                           'metadata, so cannot generate '
                                           'scripts')
                    else:
                        try:
                            with zf.open(metadata_name) as bwf:
                                wf = wrapper(bwf)
                                commands = json.load(wf).get('extensions')
                                if commands:
                                    commands = commands.get('python.commands')
                        except Exception:
                            logger.warning('Unable to read JSON metadata, so '
                                           'cannot generate scripts')
                    if commands:
                        console_scripts = commands.get('wrap_console', {})
                        gui_scripts = commands.get('wrap_gui', {})
                        if console_scripts or gui_scripts:
                            script_dir = paths.get('scripts', '')
                            if not os.path.isdir(script_dir):
                                raise ValueError('Valid script path not '
                                                 'specified')
                            maker.target_dir = script_dir
                            for k, v in console_scripts.items():
                                script = '%s = %s' % (k, v)
                                filenames = maker.make(script)
                                fileop.set_executable_mode(filenames)

                            if gui_scripts:
                                options = {'gui': True}
                                for k, v in gui_scripts.items():
                                    script = '%s = %s' % (k, v)
                                    filenames = maker.make(script, options)
                                    fileop.set_executable_mode(filenames)

                    p = os.path.join(libdir, info_dir)
                    dist = InstalledDistribution(p)

                    # Write SHARED
                    paths = dict(paths)     # don't change passed in dict
                    del paths['purelib']
                    del paths['platlib']
                    paths['lib'] = libdir
                    p = dist.write_shared_locations(paths, dry_run)
                    if p:
                        outfiles.append(p)

                    # Write RECORD
                    dist.write_installed_files(outfiles, paths['prefix'],
                                               dry_run)
                return dist
            except Exception:  # pragma: no cover
                logger.exception('installation failed.')
                fileop.rollback()
                raise
            finally:
                shutil.rmtree(workdir)

    def _get_dylib_cache(self):
        global cache
        if cache is None:
            # Use native string to avoid issues on 2.x: see Python #20140.
            base = os.path.join(get_cache_base(), str('dylib-cache'),
                                sys.version[:3])
            cache = Cache(base)
        return cache

    def _get_extensions(self):
        pathname = os.path.join(self.dirname, self.filename)
        name_ver = '%s-%s' % (self.name, self.version)
        info_dir = '%s.dist-info' % name_ver
        arcname = posixpath.join(info_dir, 'EXTENSIONS')
        wrapper = codecs.getreader('utf-8')
        result = []
        with ZipFile(pathname, 'r') as zf:
            try:
                with zf.open(arcname) as bf:
                    wf = wrapper(bf)
                    extensions = json.load(wf)
                    cache = self._get_dylib_cache()
                    prefix = cache.prefix_to_dir(pathname)
                    cache_base = os.path.join(cache.base, prefix)
                    if not os.path.isdir(cache_base):
                        os.makedirs(cache_base)
                    for name, relpath in extensions.items():
                        dest = os.path.join(cache_base, convert_path(relpath))
                        if not os.path.exists(dest):
                            extract = True
                        else:
                            file_time = os.stat(dest).st_mtime
                            file_time = datetime.datetime.fromtimestamp(file_time)
                            info = zf.getinfo(relpath)
                            wheel_time = datetime.datetime(*info.date_time)
                            extract = wheel_time > file_time
                        if extract:
                            zf.extract(relpath, cache_base)
                        result.append((name, dest))
            except KeyError:
                pass
        return result

    def is_compatible(self):
        """
        Determine if a wheel is compatible with the running system.
        """
        return is_compatible(self)

    def is_mountable(self):
        """
        Determine if a wheel is asserted as mountable by its metadata.
        """
        return True # for now - metadata details TBD

    def mount(self, append=False):
        pathname = os.path.abspath(os.path.join(self.dirname, self.filename))
        if not self.is_compatible():
            msg = 'Wheel %s not compatible with this Python.' % pathname
            raise DistlibException(msg)
        if not self.is_mountable():
            msg = 'Wheel %s is marked as not mountable.' % pathname
            raise DistlibException(msg)
        if pathname in sys.path:
            logger.debug('%s already in path', pathname)
        else:
            if append:
                sys.path.append(pathname)
            else:
                sys.path.insert(0, pathname)
            extensions = self._get_extensions()
            if extensions:
                if _hook not in sys.meta_path:
                    sys.meta_path.append(_hook)
                _hook.add(pathname, extensions)

    def unmount(self):
        pathname = os.path.abspath(os.path.join(self.dirname, self.filename))
        if pathname not in sys.path:
            logger.debug('%s not in path', pathname)
        else:
            sys.path.remove(pathname)
            if pathname in _hook.impure_wheels:
                _hook.remove(pathname)
            if not _hook.impure_wheels:
                if _hook in sys.meta_path:
                    sys.meta_path.remove(_hook)
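    # Hedged usage sketch for mount()/unmount() (illustrative only; the wheel
    # name is hypothetical):
    #
    #   w = Wheel('foo-1.0-py3-none-any.whl')   # a compatible, mountable wheel
    #   w.mount()          # adds the wheel to sys.path (and registers _hook on
    #                      # sys.meta_path if the wheel carries C extensions)
    #   import foo         # now importable directly from the wheel
    #   w.unmount()        # removes it again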

    def verify(self):
        pathname = os.path.join(self.dirname, self.filename)
        name_ver = '%s-%s' % (self.name, self.version)
        data_dir = '%s.data' % name_ver
        info_dir = '%s.dist-info' % name_ver

        metadata_name = posixpath.join(info_dir, METADATA_FILENAME)
        wheel_metadata_name = posixpath.join(info_dir, 'WHEEL')
        record_name = posixpath.join(info_dir, 'RECORD')

        wrapper = codecs.getreader('utf-8')

        with ZipFile(pathname, 'r') as zf:
            with zf.open(wheel_metadata_name) as bwf:
                wf = wrapper(bwf)
                message = message_from_file(wf)
            wv = message['Wheel-Version'].split('.', 1)
            file_version = tuple([int(i) for i in wv])
            # TODO version verification

            records = {}
            with zf.open(record_name) as bf:
                with CSVReader(stream=bf) as reader:
                    for row in reader:
                        p = row[0]
                        records[p] = row

            for zinfo in zf.infolist():
                arcname = zinfo.filename
                if isinstance(arcname, text_type):
                    u_arcname = arcname
                else:
                    u_arcname = arcname.decode('utf-8')
                if '..' in u_arcname:
                    raise DistlibException('invalid entry in '
                                           'wheel: %r' % u_arcname)

                # The signature file won't be in RECORD,
                # and we don't currently do anything with it
                if u_arcname.endswith('/RECORD.jws'):
                    continue
                row = records[u_arcname]
                if row[2] and str(zinfo.file_size) != row[2]:
                    raise DistlibException('size mismatch for '
                                           '%s' % u_arcname)
                if row[1]:
                    kind, value = row[1].split('=', 1)
                    with zf.open(arcname) as bf:
                        data = bf.read()
                    _, digest = self.get_hash(data, kind)
                    if digest != value:
                        raise DistlibException('digest mismatch for '
                                               '%s' % arcname)

    def update(self, modifier, dest_dir=None, **kwargs):
        """
        Update the contents of a wheel in a generic way. The modifier should
        be a callable which expects a dictionary argument: its keys are
        archive-entry paths, and its values are absolute filesystem paths
        where the contents of the corresponding archive entries can be found. The
        modifier is free to change the contents of the files pointed to, add
        new entries and remove entries, before returning. This method will
        extract the entire contents of the wheel to a temporary location, call
        the modifier, and then use the passed (and possibly updated)
        dictionary to write a new wheel. If ``dest_dir`` is specified, the new
        wheel is written there -- otherwise, the original wheel is overwritten.

        The modifier should return True if it updated the wheel, else False.
        This method returns the same value the modifier returns.
        """

        def get_version(path_map, info_dir):
            version = path = None
            key = '%s/%s' % (info_dir, METADATA_FILENAME)
            if key not in path_map:
                key = '%s/PKG-INFO' % info_dir
            if key in path_map:
                path = path_map[key]
                version = Metadata(path=path).version
            return version, path

        def update_version(version, path):
            updated = None
            try:
                v = NormalizedVersion(version)
                i = version.find('-')
                if i < 0:
                    updated = '%s+1' % version
                else:
                    parts = [int(s) for s in version[i + 1:].split('.')]
                    parts[-1] += 1
                    updated = '%s+%s' % (version[:i],
                                         '.'.join(str(i) for i in parts))
            except UnsupportedVersionError:
                logger.debug('Cannot update non-compliant (PEP-440) '
                             'version %r', version)
            if updated:
                md = Metadata(path=path)
                md.version = updated
                legacy = not path.endswith(METADATA_FILENAME)
                md.write(path=path, legacy=legacy)
                logger.debug('Version updated from %r to %r', version,
                             updated)

        pathname = os.path.join(self.dirname, self.filename)
        name_ver = '%s-%s' % (self.name, self.version)
        info_dir = '%s.dist-info' % name_ver
        record_name = posixpath.join(info_dir, 'RECORD')
        with tempdir() as workdir:
            with ZipFile(pathname, 'r') as zf:
                path_map = {}
                for zinfo in zf.infolist():
                    arcname = zinfo.filename
                    if isinstance(arcname, text_type):
                        u_arcname = arcname
                    else:
                        u_arcname = arcname.decode('utf-8')
                    if u_arcname == record_name:
                        continue
                    if '..' in u_arcname:
                        raise DistlibException('invalid entry in '
                                               'wheel: %r' % u_arcname)
                    zf.extract(zinfo, workdir)
                    path = os.path.join(workdir, convert_path(u_arcname))
                    path_map[u_arcname] = path

            # Remember the version.
            original_version, _ = get_version(path_map, info_dir)
            # Files extracted. Call the modifier.
            modified = modifier(path_map, **kwargs)
            if modified:
                # Something changed - need to build a new wheel.
                current_version, path = get_version(path_map, info_dir)
                if current_version and (current_version == original_version):
                    # Add or update local version to signify changes.
                    update_version(current_version, path)
                # Decide where the new wheel goes.
                if dest_dir is None:
                    fd, newpath = tempfile.mkstemp(suffix='.whl',
                                                   prefix='wheel-update-',
                                                   dir=workdir)
                    os.close(fd)
                else:
                    if not os.path.isdir(dest_dir):
                        raise DistlibException('Not a directory: %r' % dest_dir)
                    newpath = os.path.join(dest_dir, self.filename)
                archive_paths = list(path_map.items())
                distinfo = os.path.join(workdir, info_dir)
                info = distinfo, info_dir
                self.write_records(info, workdir, archive_paths)
                self.build_zip(newpath, archive_paths)
                if dest_dir is None:
                    shutil.copyfile(newpath, pathname)
        return modified

def compatible_tags():
    """
    Return (pyver, abi, arch) tuples compatible with this Python.
    """
    versions = [VER_SUFFIX]
    major = VER_SUFFIX[0]
    for minor in range(sys.version_info[1] - 1, -1, -1):
        versions.append(''.join([major, str(minor)]))

    abis = []
    for suffix, _, _ in imp.get_suffixes():
        if suffix.startswith('.abi'):
            abis.append(suffix.split('.', 2)[1])
    abis.sort()
    if ABI != 'none':
        abis.insert(0, ABI)
    abis.append('none')
    result = []

    arches = [ARCH]
    if sys.platform == 'darwin':
        m = re.match(r'(\w+)_(\d+)_(\d+)_(\w+)$', ARCH)
        if m:
            name, major, minor, arch = m.groups()
            minor = int(minor)
            matches = [arch]
            if arch in ('i386', 'ppc'):
                matches.append('fat')
            if arch in ('i386', 'ppc', 'x86_64'):
                matches.append('fat3')
            if arch in ('ppc64', 'x86_64'):
                matches.append('fat64')
            if arch in ('i386', 'x86_64'):
                matches.append('intel')
            if arch in ('i386', 'x86_64', 'intel', 'ppc', 'ppc64'):
                matches.append('universal')
            while minor >= 0:
                for match in matches:
                    s = '%s_%s_%s_%s' % (name, major, minor, match)
                    if s != ARCH:   # already there
                        arches.append(s)
                minor -= 1

    # Most specific - our Python version, ABI and arch
    for abi in abis:
        for arch in arches:
            result.append((''.join((IMP_PREFIX, versions[0])), abi, arch))

    # where no ABI / arch dependency, but IMP_PREFIX dependency
    for i, version in enumerate(versions):
        result.append((''.join((IMP_PREFIX, version)), 'none', 'any'))
        if i == 0:
            result.append((''.join((IMP_PREFIX, version[0])), 'none', 'any'))

    # no IMP_PREFIX, ABI or arch dependency
    for i, version in enumerate(versions):
        result.append((''.join(('py', version)), 'none', 'any'))
        if i == 0:
            result.append((''.join(('py', version[0])), 'none', 'any'))
    return set(result)


COMPATIBLE_TAGS = compatible_tags()

del compatible_tags


def is_compatible(wheel, tags=None):
    if not isinstance(wheel, Wheel):
        wheel = Wheel(wheel)    # assume it's a filename
    result = False
    if tags is None:
        tags = COMPATIBLE_TAGS
    for ver, abi, arch in tags:
        if ver in wheel.pyver and abi in wheel.abi and arch in wheel.arch:
            result = True
            break
    return result
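# Illustrative use (not in the original module): is_compatible() accepts either
# a Wheel instance or a wheel filename, e.g.
#
#   is_compatible('foo-1.0-py2.py3-none-any.whl')   # hypothetical name
#
# which is True whenever any (pyver, abi, arch) triple in COMPATIBLE_TAGS also
# appears in the wheel's own tag lists.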
site-packages/pip/_vendor/distlib/compat.py
# -*- coding: utf-8 -*-
#
# Copyright (C) 2013-2016 Vinay Sajip.
# Licensed to the Python Software Foundation under a contributor agreement.
# See LICENSE.txt and CONTRIBUTORS.txt.
#
from __future__ import absolute_import

import os
import re
import sys

try:
    import ssl
except ImportError:
    ssl = None

if sys.version_info[0] < 3:  # pragma: no cover
    from StringIO import StringIO
    string_types = basestring,
    text_type = unicode
    from types import FileType as file_type
    import __builtin__ as builtins
    import ConfigParser as configparser
    from ._backport import shutil
    from urlparse import urlparse, urlunparse, urljoin, urlsplit, urlunsplit
    from urllib import (urlretrieve, quote as _quote, unquote, url2pathname,
                        pathname2url, ContentTooShortError, splittype)

    def quote(s):
        if isinstance(s, unicode):
            s = s.encode('utf-8')
        return _quote(s)

    import urllib2
    from urllib2 import (Request, urlopen, URLError, HTTPError,
                         HTTPBasicAuthHandler, HTTPPasswordMgr,
                         HTTPHandler, HTTPRedirectHandler,
                         build_opener)
    if ssl:
        from urllib2 import HTTPSHandler
    import httplib
    import xmlrpclib
    import Queue as queue
    from HTMLParser import HTMLParser
    import htmlentitydefs
    raw_input = raw_input
    from itertools import ifilter as filter
    from itertools import ifilterfalse as filterfalse

    _userprog = None
    def splituser(host):
        """splituser('user[:passwd]@host[:port]') --> 'user[:passwd]', 'host[:port]'."""
        global _userprog
        if _userprog is None:
            import re
            _userprog = re.compile('^(.*)@(.*)$')

        match = _userprog.match(host)
        if match: return match.group(1, 2)
        return None, host

else:  # pragma: no cover
    from io import StringIO
    string_types = str,
    text_type = str
    from io import TextIOWrapper as file_type
    import builtins
    import configparser
    import shutil
    from urllib.parse import (urlparse, urlunparse, urljoin, splituser, quote,
                              unquote, urlsplit, urlunsplit, splittype)
    from urllib.request import (urlopen, urlretrieve, Request, url2pathname,
                                pathname2url,
                                HTTPBasicAuthHandler, HTTPPasswordMgr,
                                HTTPHandler, HTTPRedirectHandler,
                                build_opener)
    if ssl:
        from urllib.request import HTTPSHandler
    from urllib.error import HTTPError, URLError, ContentTooShortError
    import http.client as httplib
    import urllib.request as urllib2
    import xmlrpc.client as xmlrpclib
    import queue
    from html.parser import HTMLParser
    import html.entities as htmlentitydefs
    raw_input = input
    from itertools import filterfalse
    filter = filter

try:
    from ssl import match_hostname, CertificateError
except ImportError: # pragma: no cover
    class CertificateError(ValueError):
        pass


    def _dnsname_match(dn, hostname, max_wildcards=1):
        """Matching according to RFC 6125, section 6.4.3

        http://tools.ietf.org/html/rfc6125#section-6.4.3
        """
        pats = []
        if not dn:
            return False

        parts = dn.split('.')
        leftmost, remainder = parts[0], parts[1:]

        wildcards = leftmost.count('*')
        if wildcards > max_wildcards:
            # Issue #17980: avoid denials of service by refusing more
            # than one wildcard per fragment.  A survey of established
            # policy among SSL implementations showed it to be a
            # reasonable choice.
            raise CertificateError(
                "too many wildcards in certificate DNS name: " + repr(dn))

        # speed up common case w/o wildcards
        if not wildcards:
            return dn.lower() == hostname.lower()

        # RFC 6125, section 6.4.3, subitem 1.
        # The client SHOULD NOT attempt to match a presented identifier in which
        # the wildcard character comprises a label other than the left-most label.
        if leftmost == '*':
            # When '*' is a fragment by itself, it matches a non-empty dotless
            # fragment.
            pats.append('[^.]+')
        elif leftmost.startswith('xn--') or hostname.startswith('xn--'):
            # RFC 6125, section 6.4.3, subitem 3.
            # The client SHOULD NOT attempt to match a presented identifier
            # where the wildcard character is embedded within an A-label or
            # U-label of an internationalized domain name.
            pats.append(re.escape(leftmost))
        else:
            # Otherwise, '*' matches any dotless string, e.g. www*
            pats.append(re.escape(leftmost).replace(r'\*', '[^.]*'))

        # add the remaining fragments, ignore any wildcards
        for frag in remainder:
            pats.append(re.escape(frag))

        pat = re.compile(r'\A' + r'\.'.join(pats) + r'\Z', re.IGNORECASE)
        return pat.match(hostname)


    def match_hostname(cert, hostname):
        """Verify that *cert* (in decoded format as returned by
        SSLSocket.getpeercert()) matches the *hostname*.  RFC 2818 and RFC 6125
        rules are followed, but IP addresses are not accepted for *hostname*.

        CertificateError is raised on failure. On success, the function
        returns nothing.
        """
        if not cert:
            raise ValueError("empty or no certificate, match_hostname needs a "
                             "SSL socket or SSL context with either "
                             "CERT_OPTIONAL or CERT_REQUIRED")
        dnsnames = []
        san = cert.get('subjectAltName', ())
        for key, value in san:
            if key == 'DNS':
                if _dnsname_match(value, hostname):
                    return
                dnsnames.append(value)
        if not dnsnames:
            # The subject is only checked when there is no dNSName entry
            # in subjectAltName
            for sub in cert.get('subject', ()):
                for key, value in sub:
                    # XXX according to RFC 2818, the most specific Common Name
                    # must be used.
                    if key == 'commonName':
                        if _dnsname_match(value, hostname):
                            return
                        dnsnames.append(value)
        if len(dnsnames) > 1:
            raise CertificateError("hostname %r "
                "doesn't match either of %s"
                % (hostname, ', '.join(map(repr, dnsnames))))
        elif len(dnsnames) == 1:
            raise CertificateError("hostname %r "
                "doesn't match %r"
                % (hostname, dnsnames[0]))
        else:
            raise CertificateError("no appropriate commonName or "
                "subjectAltName fields were found")


try:
    from types import SimpleNamespace as Container
except ImportError:  # pragma: no cover
    class Container(object):
        """
        A generic container for when multiple values need to be returned
        """
        def __init__(self, **kwargs):
            self.__dict__.update(kwargs)


try:
    from shutil import which
except ImportError:  # pragma: no cover
    # Implementation from Python 3.3
    def which(cmd, mode=os.F_OK | os.X_OK, path=None):
        """Given a command, mode, and a PATH string, return the path which
        conforms to the given mode on the PATH, or None if there is no such
        file.

        `mode` defaults to os.F_OK | os.X_OK. `path` defaults to the result
        of os.environ.get("PATH"), or can be overridden with a custom search
        path.

        """
        # Check that a given file can be accessed with the correct mode.
        # Additionally check that `file` is not a directory, as on Windows
        # directories pass the os.access check.
        def _access_check(fn, mode):
            return (os.path.exists(fn) and os.access(fn, mode)
                    and not os.path.isdir(fn))

        # If we're given a path with a directory part, look it up directly rather
        # than referring to PATH directories. This includes checking relative to the
        # current directory, e.g. ./script
        if os.path.dirname(cmd):
            if _access_check(cmd, mode):
                return cmd
            return None

        if path is None:
            path = os.environ.get("PATH", os.defpath)
        if not path:
            return None
        path = path.split(os.pathsep)

        if sys.platform == "win32":
            # The current directory takes precedence on Windows.
            if os.curdir not in path:
                path.insert(0, os.curdir)

            # PATHEXT is necessary to check on Windows.
            pathext = os.environ.get("PATHEXT", "").split(os.pathsep)
            # See if the given file matches any of the expected path extensions.
            # This will allow us to short circuit when given "python.exe".
            # If it does match, only test that one, otherwise we have to try
            # others.
            if any(cmd.lower().endswith(ext.lower()) for ext in pathext):
                files = [cmd]
            else:
                files = [cmd + ext for ext in pathext]
        else:
            # On other platforms you don't have things like PATHEXT to tell you
            # what file suffixes are executable, so just pass on cmd as-is.
            files = [cmd]

        seen = set()
        for dir in path:
            normdir = os.path.normcase(dir)
            if normdir not in seen:
                seen.add(normdir)
                for thefile in files:
                    name = os.path.join(dir, thefile)
                    if _access_check(name, mode):
                        return name
        return None
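    # Example (illustrative): which() walks PATH (plus PATHEXT on Windows) and
    # returns the first executable match, e.g.
    #
    #   which('python')        # -> '/usr/bin/python' or similar, else None
    #   which('./script.sh')   # a path with a directory part is checked
    #                          # directly, not via PATH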


# ZipFile is a context manager in 2.7, but not in 2.6

from zipfile import ZipFile as BaseZipFile

if hasattr(BaseZipFile, '__enter__'):  # pragma: no cover
    ZipFile = BaseZipFile
else:
    from zipfile import ZipExtFile as BaseZipExtFile

    class ZipExtFile(BaseZipExtFile):
        def __init__(self, base):
            self.__dict__.update(base.__dict__)

        def __enter__(self):
            return self

        def __exit__(self, *exc_info):
            self.close()
            # return None, so if an exception occurred, it will propagate

    class ZipFile(BaseZipFile):
        def __enter__(self):
            return self

        def __exit__(self, *exc_info):
            self.close()
            # return None, so if an exception occurred, it will propagate

        def open(self, *args, **kwargs):
            base = BaseZipFile.open(self, *args, **kwargs)
            return ZipExtFile(base)

try:
    from platform import python_implementation
except ImportError: # pragma: no cover
    def python_implementation():
        """Return a string identifying the Python implementation."""
        if 'PyPy' in sys.version:
            return 'PyPy'
        if os.name == 'java':
            return 'Jython'
        if sys.version.startswith('IronPython'):
            return 'IronPython'
        return 'CPython'

try:
    import sysconfig
except ImportError: # pragma: no cover
    from ._backport import sysconfig

try:
    callable = callable
except NameError:   # pragma: no cover
    from collections import Callable

    def callable(obj):
        return isinstance(obj, Callable)


try:
    fsencode = os.fsencode
    fsdecode = os.fsdecode
except AttributeError:  # pragma: no cover
    _fsencoding = sys.getfilesystemencoding()
    if _fsencoding == 'mbcs':
        _fserrors = 'strict'
    else:
        _fserrors = 'surrogateescape'

    def fsencode(filename):
        if isinstance(filename, bytes):
            return filename
        elif isinstance(filename, text_type):
            return filename.encode(_fsencoding, _fserrors)
        else:
            raise TypeError("expect bytes or str, not %s" %
                            type(filename).__name__)

    def fsdecode(filename):
        if isinstance(filename, text_type):
            return filename
        elif isinstance(filename, bytes):
            return filename.decode(_fsencoding, _fserrors)
        else:
            raise TypeError("expect bytes or str, not %s" %
                            type(filename).__name__)

try:
    from tokenize import detect_encoding
except ImportError: # pragma: no cover
    from codecs import BOM_UTF8, lookup
    import re

    cookie_re = re.compile("coding[:=]\s*([-\w.]+)")

    def _get_normal_name(orig_enc):
        """Imitates get_normal_name in tokenizer.c."""
        # Only care about the first 12 characters.
        enc = orig_enc[:12].lower().replace("_", "-")
        if enc == "utf-8" or enc.startswith("utf-8-"):
            return "utf-8"
        if enc in ("latin-1", "iso-8859-1", "iso-latin-1") or \
           enc.startswith(("latin-1-", "iso-8859-1-", "iso-latin-1-")):
            return "iso-8859-1"
        return orig_enc

    def detect_encoding(readline):
        """
        The detect_encoding() function is used to detect the encoding that should
        be used to decode a Python source file.  It requires one argument, readline,
        in the same way as the tokenize() generator.

        It will call readline a maximum of twice, and return the encoding used
        (as a string) and a list of any lines (left as bytes) it has read in.

        It detects the encoding from the presence of a utf-8 bom or an encoding
        cookie as specified in pep-0263.  If both a bom and a cookie are present,
        but disagree, a SyntaxError will be raised.  If the encoding cookie is an
        invalid charset, raise a SyntaxError.  Note that if a utf-8 bom is found,
        'utf-8-sig' is returned.

        If no encoding is specified, then the default of 'utf-8' will be returned.
        """
        try:
            filename = readline.__self__.name
        except AttributeError:
            filename = None
        bom_found = False
        encoding = None
        default = 'utf-8'
        def read_or_stop():
            try:
                return readline()
            except StopIteration:
                return b''

        def find_cookie(line):
            try:
                # Decode as UTF-8. Either the line is an encoding declaration,
                # in which case it should be pure ASCII, or it must be UTF-8
                # per default encoding.
                line_string = line.decode('utf-8')
            except UnicodeDecodeError:
                msg = "invalid or missing encoding declaration"
                if filename is not None:
                    msg = '{} for {!r}'.format(msg, filename)
                raise SyntaxError(msg)

            matches = cookie_re.findall(line_string)
            if not matches:
                return None
            encoding = _get_normal_name(matches[0])
            try:
                codec = lookup(encoding)
            except LookupError:
                # This behaviour mimics the Python interpreter
                if filename is None:
                    msg = "unknown encoding: " + encoding
                else:
                    msg = "unknown encoding for {!r}: {}".format(filename,
                            encoding)
                raise SyntaxError(msg)

            if bom_found:
                if codec.name != 'utf-8':
                    # This behaviour mimics the Python interpreter
                    if filename is None:
                        msg = 'encoding problem: utf-8'
                    else:
                        msg = 'encoding problem for {!r}: utf-8'.format(filename)
                    raise SyntaxError(msg)
                encoding += '-sig'
            return encoding

        first = read_or_stop()
        if first.startswith(BOM_UTF8):
            bom_found = True
            first = first[3:]
            default = 'utf-8-sig'
        if not first:
            return default, []

        encoding = find_cookie(first)
        if encoding:
            return encoding, [first]

        second = read_or_stop()
        if not second:
            return default, [first]

        encoding = find_cookie(second)
        if encoding:
            return encoding, [first, second]

        return default, [first, second]

# For converting & <-> &amp; etc.
try:
    from html import escape
except ImportError:
    from cgi import escape
if sys.version_info[:2] < (3, 4):
    unescape = HTMLParser().unescape
else:
    from html import unescape

try:
    from collections import ChainMap
except ImportError: # pragma: no cover
    from collections import MutableMapping

    try:
        from reprlib import recursive_repr as _recursive_repr
    except ImportError:
        def _recursive_repr(fillvalue='...'):
            '''
            Decorator to make a repr function return fillvalue for a recursive
            call
            '''

            def decorating_function(user_function):
                repr_running = set()

                def wrapper(self):
                    key = id(self), get_ident()
                    if key in repr_running:
                        return fillvalue
                    repr_running.add(key)
                    try:
                        result = user_function(self)
                    finally:
                        repr_running.discard(key)
                    return result

                # Can't use functools.wraps() here because of bootstrap issues
                wrapper.__module__ = getattr(user_function, '__module__')
                wrapper.__doc__ = getattr(user_function, '__doc__')
                wrapper.__name__ = getattr(user_function, '__name__')
                wrapper.__annotations__ = getattr(user_function, '__annotations__', {})
                return wrapper

            return decorating_function

    class ChainMap(MutableMapping):
        ''' A ChainMap groups multiple dicts (or other mappings) together
        to create a single, updateable view.

        The underlying mappings are stored in a list.  That list is public and can
        be accessed or updated using the *maps* attribute.  There is no other state.

        Lookups search the underlying mappings successively until a key is found.
        In contrast, writes, updates, and deletions only operate on the first
        mapping.

        '''

        def __init__(self, *maps):
            '''Initialize a ChainMap by setting *maps* to the given mappings.
            If no mappings are provided, a single empty dictionary is used.

            '''
            self.maps = list(maps) or [{}]          # always at least one map

        def __missing__(self, key):
            raise KeyError(key)

        def __getitem__(self, key):
            for mapping in self.maps:
                try:
                    return mapping[key]             # can't use 'key in mapping' with defaultdict
                except KeyError:
                    pass
            return self.__missing__(key)            # support subclasses that define __missing__

        def get(self, key, default=None):
            return self[key] if key in self else default

        def __len__(self):
            return len(set().union(*self.maps))     # reuses stored hash values if possible

        def __iter__(self):
            return iter(set().union(*self.maps))

        def __contains__(self, key):
            return any(key in m for m in self.maps)

        def __bool__(self):
            return any(self.maps)

        @_recursive_repr()
        def __repr__(self):
            return '{0.__class__.__name__}({1})'.format(
                self, ', '.join(map(repr, self.maps)))

        @classmethod
        def fromkeys(cls, iterable, *args):
            'Create a ChainMap with a single dict created from the iterable.'
            return cls(dict.fromkeys(iterable, *args))

        def copy(self):
            'New ChainMap or subclass with a new copy of maps[0] and refs to maps[1:]'
            return self.__class__(self.maps[0].copy(), *self.maps[1:])

        __copy__ = copy

        def new_child(self):                        # like Django's Context.push()
            'New ChainMap with a new dict followed by all previous maps.'
            return self.__class__({}, *self.maps)

        @property
        def parents(self):                          # like Django's Context.pop()
            'New ChainMap from maps[1:].'
            return self.__class__(*self.maps[1:])

        def __setitem__(self, key, value):
            self.maps[0][key] = value

        def __delitem__(self, key):
            try:
                del self.maps[0][key]
            except KeyError:
                raise KeyError('Key not found in the first mapping: {!r}'.format(key))

        def popitem(self):
            'Remove and return an item pair from maps[0]. Raise KeyError if maps[0] is empty.'
            try:
                return self.maps[0].popitem()
            except KeyError:
                raise KeyError('No keys found in the first mapping.')

        def pop(self, key, *args):
            'Remove *key* from maps[0] and return its value. Raise KeyError if *key* not in maps[0].'
            try:
                return self.maps[0].pop(key, *args)
            except KeyError:
                raise KeyError('Key not found in the first mapping: {!r}'.format(key))

        def clear(self):
            'Clear maps[0], leaving maps[1:] intact.'
            self.maps[0].clear()
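
    # Illustrative usage of the ChainMap backport above (a sketch, not part of
    # the original module): lookups fall through the chain of mappings, while
    # writes and deletes only touch maps[0].
    #
    #     defaults  = {'user': 'guest', 'colour': 'red'}
    #     overrides = {'user': 'admin'}
    #     cm = ChainMap(overrides, defaults)
    #     cm['user']              # -> 'admin'  (found in the first mapping)
    #     cm['colour']            # -> 'red'    (falls through to the second)
    #     cm['colour'] = 'blue'   # stored in overrides; defaults is untouched
    #     cm.new_child()          # -> ChainMap({}, overrides, defaults)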

try:
    from imp import cache_from_source
except ImportError: # pragma: no cover
    def cache_from_source(path, debug_override=None):
        assert path.endswith('.py')
        if debug_override is None:
            debug_override = __debug__
        if debug_override:
            suffix = 'c'
        else:
            suffix = 'o'
        return path + suffix
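
    # Illustrative behaviour of the cache_from_source() fallback above (used
    # only when imp.cache_from_source is unavailable; a sketch, not part of the
    # original module): it appends 'c' or 'o' to the source path.
    #
    #     cache_from_source('pkg/mod.py')                        # -> 'pkg/mod.pyc' when __debug__
    #     cache_from_source('pkg/mod.py', debug_override=False)  # -> 'pkg/mod.pyo'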

try:
    from collections import OrderedDict
except ImportError: # pragma: no cover
## {{{ http://code.activestate.com/recipes/576693/ (r9)
# Backport of OrderedDict() class that runs on Python 2.4, 2.5, 2.6, 2.7 and pypy.
# Passes Python2.7's test suite and incorporates all the latest updates.
    try:
        from thread import get_ident as _get_ident
    except ImportError:
        from dummy_thread import get_ident as _get_ident

    try:
        from _abcoll import KeysView, ValuesView, ItemsView
    except ImportError:
        pass


    class OrderedDict(dict):
        'Dictionary that remembers insertion order'
        # An inherited dict maps keys to values.
        # The inherited dict provides __getitem__, __len__, __contains__, and get.
        # The remaining methods are order-aware.
        # Big-O running times for all methods are the same as for regular dictionaries.

        # The internal self.__map dictionary maps keys to links in a doubly linked list.
        # The circular doubly linked list starts and ends with a sentinel element.
        # The sentinel element never gets deleted (this simplifies the algorithm).
        # Each link is stored as a list of length three:  [PREV, NEXT, KEY].

        def __init__(self, *args, **kwds):
            '''Initialize an ordered dictionary.  Signature is the same as for
            regular dictionaries, but keyword arguments are not recommended
            because their insertion order is arbitrary.

            '''
            if len(args) > 1:
                raise TypeError('expected at most 1 arguments, got %d' % len(args))
            try:
                self.__root
            except AttributeError:
                self.__root = root = []                     # sentinel node
                root[:] = [root, root, None]
                self.__map = {}
            self.__update(*args, **kwds)

        def __setitem__(self, key, value, dict_setitem=dict.__setitem__):
            'od.__setitem__(i, y) <==> od[i]=y'
            # Setting a new item creates a new link which goes at the end of the linked
            # list, and the inherited dictionary is updated with the new key/value pair.
            if key not in self:
                root = self.__root
                last = root[0]
                last[1] = root[0] = self.__map[key] = [last, root, key]
            dict_setitem(self, key, value)

        def __delitem__(self, key, dict_delitem=dict.__delitem__):
            'od.__delitem__(y) <==> del od[y]'
            # Deleting an existing item uses self.__map to find the link which is
            # then removed by updating the links in the predecessor and successor nodes.
            dict_delitem(self, key)
            link_prev, link_next, key = self.__map.pop(key)
            link_prev[1] = link_next
            link_next[0] = link_prev

        def __iter__(self):
            'od.__iter__() <==> iter(od)'
            root = self.__root
            curr = root[1]
            while curr is not root:
                yield curr[2]
                curr = curr[1]

        def __reversed__(self):
            'od.__reversed__() <==> reversed(od)'
            root = self.__root
            curr = root[0]
            while curr is not root:
                yield curr[2]
                curr = curr[0]

        def clear(self):
            'od.clear() -> None.  Remove all items from od.'
            try:
                for node in self.__map.itervalues():
                    del node[:]
                root = self.__root
                root[:] = [root, root, None]
                self.__map.clear()
            except AttributeError:
                pass
            dict.clear(self)

        def popitem(self, last=True):
            '''od.popitem() -> (k, v), return and remove a (key, value) pair.
            Pairs are returned in LIFO order if last is true or FIFO order if false.

            '''
            if not self:
                raise KeyError('dictionary is empty')
            root = self.__root
            if last:
                link = root[0]
                link_prev = link[0]
                link_prev[1] = root
                root[0] = link_prev
            else:
                link = root[1]
                link_next = link[1]
                root[1] = link_next
                link_next[0] = root
            key = link[2]
            del self.__map[key]
            value = dict.pop(self, key)
            return key, value

        # -- the following methods do not depend on the internal structure --

        def keys(self):
            'od.keys() -> list of keys in od'
            return list(self)

        def values(self):
            'od.values() -> list of values in od'
            return [self[key] for key in self]

        def items(self):
            'od.items() -> list of (key, value) pairs in od'
            return [(key, self[key]) for key in self]

        def iterkeys(self):
            'od.iterkeys() -> an iterator over the keys in od'
            return iter(self)

        def itervalues(self):
            'od.itervalues() -> an iterator over the values in od'
            for k in self:
                yield self[k]

        def iteritems(self):
            'od.iteritems() -> an iterator over the (key, value) items in od'
            for k in self:
                yield (k, self[k])

        def update(*args, **kwds):
            '''od.update(E, **F) -> None.  Update od from dict/iterable E and F.

            If E is a dict instance, does:           for k in E: od[k] = E[k]
            If E has a .keys() method, does:         for k in E.keys(): od[k] = E[k]
            Or if E is an iterable of items, does:   for k, v in E: od[k] = v
            In either case, this is followed by:     for k, v in F.items(): od[k] = v

            '''
            if len(args) > 2:
                raise TypeError('update() takes at most 2 positional '
                                'arguments (%d given)' % (len(args),))
            elif not args:
                raise TypeError('update() takes at least 1 argument (0 given)')
            self = args[0]
            # Make progressively weaker assumptions about "other"
            other = ()
            if len(args) == 2:
                other = args[1]
            if isinstance(other, dict):
                for key in other:
                    self[key] = other[key]
            elif hasattr(other, 'keys'):
                for key in other.keys():
                    self[key] = other[key]
            else:
                for key, value in other:
                    self[key] = value
            for key, value in kwds.items():
                self[key] = value

        __update = update  # let subclasses override update without breaking __init__

        __marker = object()

        def pop(self, key, default=__marker):
            '''od.pop(k[,d]) -> v, remove specified key and return the corresponding value.
            If key is not found, d is returned if given, otherwise KeyError is raised.

            '''
            if key in self:
                result = self[key]
                del self[key]
                return result
            if default is self.__marker:
                raise KeyError(key)
            return default

        def setdefault(self, key, default=None):
            'od.setdefault(k[,d]) -> od.get(k,d), also set od[k]=d if k not in od'
            if key in self:
                return self[key]
            self[key] = default
            return default

        def __repr__(self, _repr_running=None):
            'od.__repr__() <==> repr(od)'
            if not _repr_running: _repr_running = {}
            call_key = id(self), _get_ident()
            if call_key in _repr_running:
                return '...'
            _repr_running[call_key] = 1
            try:
                if not self:
                    return '%s()' % (self.__class__.__name__,)
                return '%s(%r)' % (self.__class__.__name__, self.items())
            finally:
                del _repr_running[call_key]

        def __reduce__(self):
            'Return state information for pickling'
            items = [[k, self[k]] for k in self]
            inst_dict = vars(self).copy()
            for k in vars(OrderedDict()):
                inst_dict.pop(k, None)
            if inst_dict:
                return (self.__class__, (items,), inst_dict)
            return self.__class__, (items,)

        def copy(self):
            'od.copy() -> a shallow copy of od'
            return self.__class__(self)

        @classmethod
        def fromkeys(cls, iterable, value=None):
            '''OD.fromkeys(S[, v]) -> New ordered dictionary with keys from S
            and values equal to v (which defaults to None).

            '''
            d = cls()
            for key in iterable:
                d[key] = value
            return d

        def __eq__(self, other):
            '''od.__eq__(y) <==> od==y.  Comparison to another OD is order-sensitive
            while comparison to a regular mapping is order-insensitive.

            '''
            if isinstance(other, OrderedDict):
                return len(self)==len(other) and self.items() == other.items()
            return dict.__eq__(self, other)

        def __ne__(self, other):
            return not self == other

        # -- the following methods are only used in Python 2.7 --

        def viewkeys(self):
            "od.viewkeys() -> a set-like object providing a view on od's keys"
            return KeysView(self)

        def viewvalues(self):
            "od.viewvalues() -> an object providing a view on od's values"
            return ValuesView(self)

        def viewitems(self):
            "od.viewitems() -> a set-like object providing a view on od's items"
            return ItemsView(self)
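
    # Illustrative usage of the OrderedDict backport above (a sketch, not part
    # of the original recipe): iteration follows insertion order, and popitem()
    # can pop from either end.
    #
    #     od = OrderedDict()
    #     od['b'] = 1
    #     od['a'] = 2
    #     list(od.keys())         # -> ['b', 'a']   (insertion order, not sorted)
    #     od.popitem(last=False)  # -> ('b', 1)     (FIFO end)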

try:
    from logging.config import BaseConfigurator, valid_ident
except ImportError: # pragma: no cover
    IDENTIFIER = re.compile('^[a-z_][a-z0-9_]*$', re.I)


    def valid_ident(s):
        m = IDENTIFIER.match(s)
        if not m:
            raise ValueError('Not a valid Python identifier: %r' % s)
        return True
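
    # Illustrative behaviour of the valid_ident() fallback above (a sketch):
    # it accepts simple ASCII identifiers and raises ValueError otherwise.
    #
    #     valid_ident('console')    # -> True
    #     valid_ident('2handlers')  # raises ValueError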


    # The ConvertingXXX classes are wrappers around standard Python containers,
    # and they serve to convert any suitable values in the container. The
    # conversion converts base dicts, lists and tuples to their wrapped
    # equivalents, whereas strings which match a conversion format are converted
    # appropriately.
    #
    # Each wrapper should have a configurator attribute holding the actual
    # configurator to use for conversion.

    class ConvertingDict(dict):
        """A converting dictionary wrapper."""

        def __getitem__(self, key):
            value = dict.__getitem__(self, key)
            result = self.configurator.convert(value)
            #If the converted value is different, save for next time
            if value is not result:
                self[key] = result
                if type(result) in (ConvertingDict, ConvertingList,
                                    ConvertingTuple):
                    result.parent = self
                    result.key = key
            return result

        def get(self, key, default=None):
            value = dict.get(self, key, default)
            result = self.configurator.convert(value)
            #If the converted value is different, save for next time
            if value is not result:
                self[key] = result
                if type(result) in (ConvertingDict, ConvertingList,
                                    ConvertingTuple):
                    result.parent = self
                    result.key = key
            return result

        def pop(self, key, default=None):
            value = dict.pop(self, key, default)
            result = self.configurator.convert(value)
            if value is not result:
                if type(result) in (ConvertingDict, ConvertingList,
                                    ConvertingTuple):
                    result.parent = self
                    result.key = key
            return result

    class ConvertingList(list):
        """A converting list wrapper."""
        def __getitem__(self, key):
            value = list.__getitem__(self, key)
            result = self.configurator.convert(value)
            #If the converted value is different, save for next time
            if value is not result:
                self[key] = result
                if type(result) in (ConvertingDict, ConvertingList,
                                    ConvertingTuple):
                    result.parent = self
                    result.key = key
            return result

        def pop(self, idx=-1):
            value = list.pop(self, idx)
            result = self.configurator.convert(value)
            if value is not result:
                if type(result) in (ConvertingDict, ConvertingList,
                                    ConvertingTuple):
                    result.parent = self
            return result

    class ConvertingTuple(tuple):
        """A converting tuple wrapper."""
        def __getitem__(self, key):
            value = tuple.__getitem__(self, key)
            result = self.configurator.convert(value)
            if value is not result:
                if type(result) in (ConvertingDict, ConvertingList,
                                    ConvertingTuple):
                    result.parent = self
                    result.key = key
            return result
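
    # Illustrative behaviour of the Converting* wrappers above (a sketch; it
    # assumes a configurator instance has been attached, as BaseConfigurator
    # below does for its own config):
    #
    #     d = ConvertingDict({'handlers': ['ext://sys.stderr']})
    #     d.configurator = configurator      # hypothetical BaseConfigurator
    #     d['handlers']           # -> a ConvertingList wrapping the original list
    #     d['handlers'].parent    # -> d
    #     d['handlers'].key       # -> 'handlers'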

    class BaseConfigurator(object):
        """
        The configurator base class which defines some useful defaults.
        """

        CONVERT_PATTERN = re.compile(r'^(?P<prefix>[a-z]+)://(?P<suffix>.*)$')

        WORD_PATTERN = re.compile(r'^\s*(\w+)\s*')
        DOT_PATTERN = re.compile(r'^\.\s*(\w+)\s*')
        INDEX_PATTERN = re.compile(r'^\[\s*(\w+)\s*\]\s*')
        DIGIT_PATTERN = re.compile(r'^\d+$')

        value_converters = {
            'ext' : 'ext_convert',
            'cfg' : 'cfg_convert',
        }

        # We might want to use a different one, e.g. importlib
        importer = staticmethod(__import__)

        def __init__(self, config):
            self.config = ConvertingDict(config)
            self.config.configurator = self

        def resolve(self, s):
            """
            Resolve strings to objects using standard import and attribute
            syntax.
            """
            name = s.split('.')
            used = name.pop(0)
            try:
                found = self.importer(used)
                for frag in name:
                    used += '.' + frag
                    try:
                        found = getattr(found, frag)
                    except AttributeError:
                        self.importer(used)
                        found = getattr(found, frag)
                return found
            except ImportError:
                e, tb = sys.exc_info()[1:]
                v = ValueError('Cannot resolve %r: %s' % (s, e))
                v.__cause__, v.__traceback__ = e, tb
                raise v

        def ext_convert(self, value):
            """Default converter for the ext:// protocol."""
            return self.resolve(value)

        def cfg_convert(self, value):
            """Default converter for the cfg:// protocol."""
            rest = value
            m = self.WORD_PATTERN.match(rest)
            if m is None:
                raise ValueError("Unable to convert %r" % value)
            else:
                rest = rest[m.end():]
                d = self.config[m.groups()[0]]
                #print d, rest
                while rest:
                    m = self.DOT_PATTERN.match(rest)
                    if m:
                        d = d[m.groups()[0]]
                    else:
                        m = self.INDEX_PATTERN.match(rest)
                        if m:
                            idx = m.groups()[0]
                            if not self.DIGIT_PATTERN.match(idx):
                                d = d[idx]
                            else:
                                try:
                                    n = int(idx) # try as number first (most likely)
                                    d = d[n]
                                except TypeError:
                                    d = d[idx]
                    if m:
                        rest = rest[m.end():]
                    else:
                        raise ValueError('Unable to convert '
                                         '%r at %r' % (value, rest))
            #rest should be empty
            return d

        def convert(self, value):
            """
            Convert values to an appropriate type. dicts, lists and tuples are
            replaced by their converting alternatives. Strings are checked to
            see if they have a conversion format and are converted if they do.
            """
            if not isinstance(value, ConvertingDict) and isinstance(value, dict):
                value = ConvertingDict(value)
                value.configurator = self
            elif not isinstance(value, ConvertingList) and isinstance(value, list):
                value = ConvertingList(value)
                value.configurator = self
            elif not isinstance(value, ConvertingTuple) and\
                     isinstance(value, tuple):
                value = ConvertingTuple(value)
                value.configurator = self
            elif isinstance(value, string_types):
                m = self.CONVERT_PATTERN.match(value)
                if m:
                    d = m.groupdict()
                    prefix = d['prefix']
                    converter = self.value_converters.get(prefix, None)
                    if converter:
                        suffix = d['suffix']
                        converter = getattr(self, converter)
                        value = converter(suffix)
            return value

        def configure_custom(self, config):
            """Configure an object with a user-supplied factory."""
            c = config.pop('()')
            if not callable(c):
                c = self.resolve(c)
            props = config.pop('.', None)
            # Check for valid identifiers
            kwargs = dict([(k, config[k]) for k in config if valid_ident(k)])
            result = c(**kwargs)
            if props:
                for name, value in props.items():
                    setattr(result, name, value)
            return result

        def as_tuple(self, value):
            """Utility function which converts lists to tuples."""
            if isinstance(value, list):
                value = tuple(value)
            return value
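
    # Illustrative usage of the BaseConfigurator fallback above (a trimmed-down
    # stand-in for logging.config.BaseConfigurator; a sketch with a hypothetical
    # config dictionary): 'ext://' resolves an importable object, and 'cfg://'
    # looks a value up inside the configuration itself.
    #
    #     cfg = BaseConfigurator({
    #         'version': 1,
    #         'formatter': 'ext://logging.Formatter',
    #     })
    #     cfg.convert('ext://logging.Formatter')   # -> the logging.Formatter class
    #     cfg.convert('cfg://version')             # -> 1
    #     cfg.config['formatter']                  # converted lazily on access
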
site-packages/pip/_vendor/distlib/__pycache__/wheel.cpython-36.opt-1.pyc
[compiled CPython 3.6 bytecode (.pyc): binary contents omitted, not representable as text]

site-packages/pip/_vendor/distlib/__pycache__/index.cpython-36.pyc
[compiled CPython 3.6 bytecode (.pyc): binary contents omitted, not representable as text]

site-packages/pip/_vendor/distlib/__pycache__/version.cpython-36.opt-1.pyc
[compiled CPython 3.6 bytecode (.pyc): binary contents omitted, not representable as text]
NcSsxg}xdtj|j��D]R}tj||�}|rd|dd�koBdknrT|jd�}nd|}|j|�qW|jd�|S)N�0r�9��*z*final)�
_VERSION_PARTrDrB�_VERSION_REPLACErO�zfillrH)rrYr0rrr�	get_partsIs 
z_legacy_key.<locals>.get_partsr�z*finalrz*final-Z00000000rkrk)r{�poprHrI)rr�rYr0rrr�_legacy_keyHs

r�c@s eZdZdd�Zedd��ZdS)rcCst|�S)N)r�)rrrrrrcszLegacyVersion.parsecCs:d}x0|jD]&}t|t�r|jd�r|dkrd}PqW|S)NFr�z*finalT)rrMrr{)rrYr|rrrr+fszLegacyVersion.is_prereleaseN)rr
rrr,r+rrrrrbsc@s4eZdZeZeej�Zded<ej	d�Z
dd�ZdS)rr�z~=z^(\d+(\.\d+)*)cCs`||krdS|jjt|��}|s2tjd||�dS|j�d}d|krV|jdd�d}t||�S)NFzACannot compute compatible match for version %s  and constraint %sTrr_r)�
numeric_rer?rz�loggerZwarningr@�rsplitr)rrQrSrLrJrrrrr�yszLegacyMatcher._match_compatibleN)rr
rrr<�dictr-rNr[r\r�r�rrrrrqs


zN^(\d+)\.(\d+)\.(\d+)(-[a-z0-9]+(\.[a-z0-9-]+)*)?(\+[a-z0-9]+(\.[a-z0-9-]+)*)?$cCs
tj|�S)N)�
_SEMVER_REr?)rrrrr��sr�c	Csndd�}t|�}|st|��|j�}dd�|dd�D�\}}}||dd�||dd�}}|||f||fS)	NcSs8|dkr|f}n$|dd�jd�}tdd�|D��}|S)Nrr_cSs"g|]}|j�r|jd�n|�qS)r�)rmr�)r6r0rrrr7�sz5_semantic_key.<locals>.make_tuple.<locals>.<listcomp>)rDrI)rZabsentrYrrrr�
make_tuple�s
z!_semantic_key.<locals>.make_tuplecSsg|]}t|��qSr)r])r6r�rrrr7�sz!_semantic_key.<locals>.<listcomp>r��|�r�)r�r
r@)	rr�rJr@�major�minorZpatchrnZbuildrrr�
_semantic_key�s
r�c@s eZdZdd�Zedd��ZdS)rcCst|�S)N)r�)rrrrrr�szSemanticVersion.parsecCs|jdddkS)Nrrr�)r)rrrrr+�szSemanticVersion.is_prereleaseN)rr
rrr,r+rrrrr�sc@seZdZeZdS)r	N)rr
rrr<rrrrr	�sc@s6eZdZddd�Zdd�Zdd�Zdd	�Zd
d�ZdS)
�
VersionSchemeNcCs||_||_||_dS)N)rC�matcher�	suggester)rrCr�r�rrrr�szVersionScheme.__init__cCs2y|jj|�d}Wntk
r,d}YnX|S)NTF)r�r<r
)rrrYrrr�is_valid_version�s
zVersionScheme.is_valid_versioncCs0y|j|�d}Wntk
r*d}YnX|S)NTF)r�r
)rrrYrrr�is_valid_matcher�s

zVersionScheme.is_valid_matchercCs|jd|�S)z:
        Used for processing some metadata fields
        zdummy_name (%s))r�)rrrrr�is_valid_constraint_list�sz&VersionScheme.is_valid_constraint_listcCs|jdkrd}n
|j|�}|S)N)r�)rrrYrrr�suggest�s

zVersionScheme.suggest)N)rr
rrr�r�r�r�rrrrr��s

r�cCs|S)Nr)rrrrrr1�sr1)�
normalized�legacyZsemanticr��defaultcCs|tkrtd|��t|S)Nzunknown scheme name: %r)�_SCHEMESr=)rArrrr�s)(rZloggingr[�compatr�__all__Z	getLoggerrr�r=r
�objectrr-r\rlrqrrrrrr�r�r�r�r��Ir�r�r�rrr�r�r�rr	r�r�rrrrr�<module>	sz
1k
=$W
.r	$
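A minimal sketch (not part of the archive) of how the version API summarized above is typically used; the class names come from the readable strings in the bytecode, and the import path assumes distlib (pip ships it vendored as pip._vendor.distlib):

# Hypothetical usage example -- illustrative only, not taken from the dump.
from distlib.version import NormalizedVersion, NormalizedMatcher

v_old = NormalizedVersion("1.2.0")
v_new = NormalizedVersion("1.2.1a1")
assert v_new > v_old          # PEP 440 style ordering
assert v_new.is_prerelease    # a/b/c/rc/dev releases count as pre-releases

# A matcher is built from "name (constraints)" and checks candidate versions.
matcher = NormalizedMatcher("requests (>= 2.0, < 3.0)")
print(matcher.match("2.18.4"))   # True
print(matcher.match("3.0.0"))    # False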
site-packages/pip/_vendor/distlib/__pycache__/markers.cpython-36.pyc000064400000013506147511334610021456 0ustar00
[compiled CPython 3.6 bytecode for distlib's markers module -- not renderable as text. Readable strings: module docstring "Parser for the environment markers micro-language defined in PEP 345"; an Evaluator class ("A limited evaluator for Python expressions") whose allowed names include sys_platform, python_version, python_full_version, os_name, platform_release, platform_version, platform_machine and platform_python_implementation; and an interpret(marker, execution_context=None) helper ("Interpret a marker and return a result depending on environment").]
site-packages/pip/_vendor/distlib/__pycache__/locators.cpython-36.pyc000064400000113170147511334610021636 0ustar00
[compiled CPython 3.6 bytecode for distlib's locators module -- not renderable as text. Readable strings: a Locator base class ("A base class for locators - things that locate distributions") whose locate() method finds "the most recent distribution which matches the given requirement"; the concrete locators PyPIRPCLocator (XML-RPC), PyPIJSONLocator (PyPI JSON interface), SimpleScrapingLocator ("scrapes HTML pages to locate downloads"), DirectoryLocator, JSONLocator, DistPathLocator and AggregatingLocator; a DependencyFinder ("Locate dependencies for distributions"); and the default index URL https://pypi.python.org/pypi.]
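A minimal sketch (not part of the archive) of how the two modules summarized above are commonly used; the function and class names are taken from the readable strings, everything else is an assumption about distlib's public API:

# Hypothetical usage example -- illustrative only, not taken from the dump.
from distlib.markers import interpret
from distlib.locators import locate, SimpleScrapingLocator

# Evaluate a PEP 345 style environment marker against this interpreter,
# optionally overriding names through an execution context.
print(interpret('python_version >= "2.7"'))
print(interpret("os_name == 'posix'", {"os_name": "nt"}))   # False: the context wins

# Ask the default (PyPI-backed) locator for the newest matching release.
dist = locate("requests (>= 2.0)")
if dist is not None:
    print(dist.name, dist.version, dist.source_url)

# Locators can also point at an alternative "simple" index.
alt = SimpleScrapingLocator("https://pypi.org/simple/", timeout=5.0)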
site-packages/pip/_vendor/distlib/__pycache__/manifest.cpython-36.pyc000064400000024052147511334610021616 0ustar00
[compiled CPython 3.6 bytecode for distlib's manifest module -- not renderable as text. Readable strings: module docstring "Class representing the list of files in a distribution. Equivalent to distutils.filelist, but fixes some problems"; a Manifest class ("A list of files built by on exploring the filesystem and filtered by applying various patterns") with findall(), add(), add_many(), sorted(), clear() and process_directive(), the latter accepting the MANIFEST.in directives include, exclude, global-include, global-exclude, recursive-include, recursive-exclude, graft and prune.]
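A minimal sketch (not part of the archive) of how the Manifest class summarized above is typically driven; the directive strings follow the MANIFEST.in syntax named in the bytecode, and the project path is hypothetical:

# Hypothetical usage example -- illustrative only, not taken from the dump.
from distlib.manifest import Manifest

m = Manifest("/path/to/project")                     # base directory (made up)
m.findall()                                          # collect candidate files under the base
m.process_directive("include *.txt")
m.process_directive("recursive-include docs *.rst")
m.process_directive("prune build")
for path in m.sorted():                              # selected files in directory order
    print(path)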
site-packages/pip/_vendor/distlib/__pycache__/wheel.cpython-36.pyc000064400000060601147511334610021114 0ustar003

���e˘�@s�ddlmZddlZddlZddlZddlZddlmZddl	Z	ddl
Z
ddlZddlZddl
Z
ddlZddlZddlZddlZddlZddlZddlmZmZddlmZmZmZmZmZddlmZddlm Z m!Z!dd	l"m#Z#m$Z$m%Z%m&Z&m'Z'm(Z(m)Z)m*Z*m+Z+dd
l,m-Z-m.Z.ej/e0�Z1da2e3ed��r4dZ4n*ej5j6d
��rHdZ4nej5dk�rZdZ4ndZ4ej7d�Z8e8�s�dej9dd�Z8de8Z:e4e8Z;ej"j<�j=dd�j=dd�Z>ej7d�Z?e?�r�e?j6d��r�e?j=dd�Z?ndd�Z@e@�Z?[@ejAdejBejCB�ZDejAdejBejCB�ZEejAd�ZFejAd �ZGd!ZHd"ZIe
jJd#k�r>d$d%�ZKnd&d%�ZKGd'd(�d(eL�ZMeM�ZNGd)d*�d*eL�ZOd+d,�ZPeP�ZQ[Pd/d-d.�ZRdS)0�)�unicode_literalsN)�message_from_file�)�__version__�DistlibException)�	sysconfig�ZipFile�fsdecode�	text_type�filter)�InstalledDistribution)�Metadata�METADATA_FILENAME)	�FileOperator�convert_path�	CSVReader�	CSVWriter�Cache�cached_property�get_cache_base�read_exports�tempdir)�NormalizedVersion�UnsupportedVersionErrorZpypy_version_infoZpp�javaZjyZcliZip�cp�py_version_nodotz%s%s��py�-�_�.�SOABIzcpython-cCsRdtg}tjd�r|jd�tjd�r0|jd�tjd�dkrH|jd�d	j|�S)
Nr�Py_DEBUG�d�
WITH_PYMALLOC�mZPy_UNICODE_SIZE��u�)�
VER_SUFFIXr�get_config_var�append�join)�parts�r/�/usr/lib/python3.6/wheel.py�_derive_abi;s




r1zz
(?P<nm>[^-]+)
-(?P<vn>\d+[^-]*)
(-(?P<bn>\d+[^-]*))?
-(?P<py>\w+\d+(\.\w+\d+)*)
-(?P<bi>\w+)
-(?P<ar>\w+(\.\w+)*)
\.whl$
z7
(?P<nm>[^-]+)
-(?P<vn>\d+[^-]*)
(-(?P<bn>\d+[^-]*))?$
s
\s*#![^\r\n]*s^(\s*#!("[^"]+"|\S+))\s+(.*)$s#!pythons	#!pythonw�/cCs|S)Nr/)�or/r/r0�<lambda>]sr4cCs|jtjd�S)Nr2)�replace�os�sep)r3r/r/r0r4_sc@s6eZdZdd�Zdd�Zdd�Zddd	�Zd
d�ZdS)
�MountercCsi|_i|_dS)N)�
impure_wheels�libs)�selfr/r/r0�__init__cszMounter.__init__cCs||j|<|jj|�dS)N)r9r:�update)r;�pathname�
extensionsr/r/r0�addgs
zMounter.addcCs4|jj|�}x"|D]\}}||jkr|j|=qWdS)N)r9�popr:)r;r>r?�k�vr/r/r0�removeks
zMounter.removeNcCs||jkr|}nd}|S)N)r:)r;�fullname�path�resultr/r/r0�find_moduleqs
zMounter.find_modulecCsj|tjkrtj|}nP||jkr,td|��tj||j|�}||_|jdd�}t|�dkrf|d|_	|S)Nzunable to find extension for %sr!rr)
�sys�modulesr:�ImportError�impZload_dynamic�
__loader__�rsplit�len�__package__)r;rErGr.r/r/r0�load_modulexs


zMounter.load_module)N)�__name__�
__module__�__qualname__r<r@rDrHrQr/r/r/r0r8bs

r8c@s�eZdZdZd2ZdZd3dd�Zedd	��Zed
d��Z	edd
��Z
edd��Zdd�Z
edd��Zdd�Zd4dd�Zdd�Zdd�Zdd�Zd5dd�Zd d!�Zd"d#�Zd$d%�Zd&d'�Zd(d)�Zd6d*d+�Zd,d-�Zd.d/�Zd7d0d1�ZdS)8�Wheelz@
    Class to build and install from Wheel files (PEP 427).
    rZsha256NFcCs8||_||_d|_tg|_dg|_dg|_tj�|_	|dkrRd|_
d|_|j|_
n�tj|�}|r�|jd�}|d|_
|djd	d
�|_|d|_|j|_
n�tjj|�\}}tj|�}|s�td|��|r�tjj|�|_	||_
|jd�}|d|_
|d|_|d|_|d
jd�|_|djd�|_|djd�|_dS)zB
        Initialise an instance using a (valid) filename.
        r)�none�anyNZdummyz0.1ZnmZvnr rZbnzInvalid name or filename: %rrr!Zbi�ar)�signZ
should_verify�buildver�PYVER�pyver�abi�archr6�getcwd�dirname�name�version�filenameZ	_filename�NAME_VERSION_RE�match�	groupdictr5rF�split�FILENAME_REr�abspath)r;rcrY�verifyr&�infor`r/r/r0r<�sB











zWheel.__init__cCs^|jrd|j}nd}dj|j�}dj|j�}dj|j�}|jjdd�}d|j|||||fS)zJ
        Build and return a filename from the various components.
        rr)r!r z%s-%s%s-%s-%s-%s.whl)rZr-r\r]r^rbr5ra)r;rZr\r]r^rbr/r/r0rc�s
zWheel.filenamecCstjj|j|j�}tjj|�S)N)r6rFr-r`rc�isfile)r;rFr/r/r0�exists�szWheel.existsccs@x:|jD]0}x*|jD] }x|jD]}|||fVq WqWqWdS)N)r\r]r^)r;r\r]r^r/r/r0�tags�sz
Wheel.tagscCs�tjj|j|j�}d|j|jf}d|}tjd�}t	|d���}|j
|�}|djdd�}tdd	�|D��}|d
krzd
}	nt
}	y8tj||	�}
|j|
��}||�}t|d�}
WdQRXWn tk
r�td|	��YnXWdQRX|
S)Nz%s-%sz%s.dist-infozutf-8�rz
Wheel-Versionr!rcSsg|]}t|��qSr/)�int)�.0�ir/r/r0�
<listcomp>�sz"Wheel.metadata.<locals>.<listcomp>ZMETADATA)Zfileobjz$Invalid wheel, because %s is missing)rr)r6rFr-r`rcrarb�codecs�	getreaderr�get_wheel_metadatarg�tupler�	posixpath�openr
�KeyError�
ValueError)r;r>�name_ver�info_dir�wrapper�zf�wheel_metadata�wv�file_version�fn�metadata_filename�bf�wfrGr/r/r0�metadata�s(

zWheel.metadatac	CsXd|j|jf}d|}tj|d�}|j|��}tjd�|�}t|�}WdQRXt|�S)Nz%s-%sz%s.dist-info�WHEELzutf-8)	rarbrxr-ryrtrur�dict)r;rr|r}r�r�r��messager/r/r0rv�szWheel.get_wheel_metadatac	Cs6tjj|j|j�}t|d��}|j|�}WdQRX|S)Nro)r6rFr-r`rcrrv)r;r>rrGr/r/r0rk�sz
Wheel.infocCs�tj|�}|r||j�}|d|�||d�}}d|j�krBt}nt}tj|�}|rfd|j�d
}nd}||}||}nT|jd�}|jd�}	|dks�||	kr�d}
n|||d�d	kr�d	}
nd}
t|
|}|S)Nspythonw� r��
�
rrs
���)	�
SHEBANG_REre�end�lower�SHEBANG_PYTHONW�SHEBANG_PYTHON�SHEBANG_DETAIL_RE�groups�find)r;�datar&r�ZshebangZdata_after_shebangZshebang_python�argsZcrZlfZtermr/r/r0�process_shebang�s,




zWheel.process_shebangcCsh|dkr|j}ytt|�}Wn tk
r<td|��YnX||�j�}tj|�jd�j	d�}||fS)NzUnsupported hash algorithm: %r�=�ascii)
�	hash_kind�getattr�hashlib�AttributeErrorr�digest�base64Zurlsafe_b64encode�rstrip�decode)r;r�r��hasherrGr/r/r0�get_hashszWheel.get_hashc
Csbt|�}ttjj||��}|j|ddf�|j�t|��}x|D]}|j|�qBWWdQRXdS)Nr))	�list�to_posixr6rF�relpathr,�sortrZwriterow)r;�recordsZrecord_path�base�p�writer�rowr/r/r0�write_record's

zWheel.write_recordcCs�g}|\}}tt|j�}xX|D]P\}}	t|	d��}
|
j�}WdQRXd|j|�}tjj|	�}
|j	|||
f�qWtjj
|d�}	|j||	|�ttjj
|d��}|j	||	f�dS)N�rbz%s=%s�RECORD)
r�r�r�ry�readr�r6rF�getsizer,r-r�r�)r;rk�libdir�
archive_pathsr��distinfor}r��apr��fr�r��sizer/r/r0�
write_records0szWheel.write_recordscCsJt|dtj��2}x*|D]"\}}tjd||�|j||�qWWdQRXdS)N�wzWrote %s to %s in wheel)r�zipfileZZIP_DEFLATED�logger�debug�write)r;r>r�rr�r�r/r/r0�	build_zip@szWheel.build_zipc!s�|dkri}tt�fdd�d%��d}|dkrFd}tg}tg}tg}nd}tg}d	g}d
g}|jd|�|_|jd|�|_|jd
|�|_	�|}	d|j
|jf}
d|
}d|
}g}
x�d&D]�}|�kr�q��|}tj
j|�r�x�tj|�D]�\}}}x�|D]�}ttj
j||��}tj
j||�}ttj
j|||��}|
j||f�|dk�r�|jd��r�t|d��}|j�}WdQRX|j|�}t|d��}|j|�WdQRX�q�Wq�Wq�W|	}d}x�tj|�D]�\}}}||k�r"x@t|�D]4\}}t|�}|jd��r�tj
j||�}||=P�q�W|�s"td��xP|D]H}t|�jd'��r@�q(tj
j||�}ttj
j||��}|
j||f��q(W�q�Wtj|�}xJ|D]B}|d(k�r�ttj
j||��}ttj
j||��}|
j||f��q�Wd|�p�|jd td!|g}x*|jD] \}}}|jd"|||f��q�Wtj
j|d�}t|d#��}|jd$j|��WdQRXttj
j|d��}|
j||f�|j ||f|	|
�tj
j|j!|j"�} |j#| |
�| S))z�
        Build a wheel from files in specified paths, and use any specified tags
        when determining the name of the wheel.
        Ncs|�kS)Nr/)r3)�pathsr/r0r4NszWheel.build.<locals>.<lambda>�purelib�platlibrZfalse�truerVrWr\r]r^z%s-%sz%s.dataz%s.dist-infor��headers�scriptsz.exer��wbz
.dist-infoz(.dist-info directory expected, not found�.pyc�.pyor��	INSTALLER�SHAREDr�zWheel-Version: %d.%dzGenerator: distlib %szRoot-Is-Purelib: %sz
Tag: %s-%s-%sr��
)r�r�)r�r�r�)r�r�)r�r�r�r�)$r�r�IMPVER�ABI�ARCHr[�getr\r]r^rarbr6rF�isdir�walkr	r-r�r�r,�endswithryr�r�r��	enumerate�AssertionError�listdir�
wheel_versionrrnr�r`rcr�)!r;r�rnr�ZlibkeyZis_pureZ
default_pyverZdefault_abiZdefault_archr�r|�data_dirr}r��keyrF�root�dirs�filesr�r��rpr�r�r�r�rr�dnr�r\r]r^r>r/)r�r0�buildFs�


"





zWheel.buildcBIKs`|j}|jd�}|jdd�}tjj|j|j�}d|j|jf}d|}	d|}
t	j|
t
�}t	j|
d�}t	j|
d�}
tjd	�}t
|d
����}|j|��}||�}t|�}WdQRX|djd
d�}tdd�|D��}||jkr�|r�||j|�|ddk�r|d}n|d}i}|j|
��<}t|d��&}x|D]}|d}|||<�q.WWdQRXWdQRXt	j|	d�}t	j|
d�}t	j|	dd�}t|d�}d|_tj}g} tj�}!|!|_d|_�z��y^�x�|j�D�]�}"|"j}#t|#t��r�|#}$n
|#jd	�}$|$j d��r��q�||$}|d�r0t!|"j"�|dk�r0t#d|$��|d�r�|djdd�\}%}&|j|#��}|j$�}'WdQRX|j%|'|%�\}(})|)|&k�r�t#d|#��|�r�|$j&||f��r�t'j(d |$��q�|$j&|��o�|$j d!�}*|$j&|��r|$jd"d�\}(}+},tjj||+t)|,��}-n$|$||
fk�r�q�tjj|t)|$��}-|*�s|j|#��}|j*||-�WdQRX| j+|-�|�r�|d�r�t|-d#��4}|j$�}'|j%|'|%�\}(}.|.|)k�r�t#d$|-��WdQRX|�rx|-j d%��rxy|j,|-�}/| j+|/�Wn$t-k
�rt'j.d&dd'�YnXnttjj/t)|#��}0tjj|!|0�}1|j|#��}|j*||1�WdQRXtjj|-�\}2}0|2|_|j0|0�}3|j1|3�| j2|3��q�W|�r�t'j(d(�d}4�n~d}5|j3d}|d)k�r~t	j|
d*�}6y�|j|6��}t4|�}7WdQRXi}5xxd<D]p}8d-|8}9|9|7k�r�i|5d.|8<}:xF|7|9j5�D]6};d/|;j6|;j7f}<|;j8�rB|<d0|;j87}<|<|:|;j<�qW�q�WWn t-k
�rzt'j.d1�YnXndyB|j|��.}||�}t9j:|�jd2�}5|5�r�|5jd3�}5WdQRXWn t-k
�r�t'j.d4�YnX|5�r�|5jd5i�}=|5jd6i�}>|=�s|>�r�|jdd�}?tjj;|?��s.t<d7��|?|_x6|=j=�D]*\}9};d8|9|;f}@|j0|@�}3|j1|3��q>W|>�r�d,di}Ax8|>j=�D],\}9};d8|9|;f}@|j0|@|A�}3|j1|3��q�Wtjj||
�}t>|�}4t?|�}|d=|d=||d9<|4j@||�}|�r| j+|�|4jA| |d:|�|4St-k
�r@t'jBd;�|jC��YnXWdtDjE|!�XWdQRXdS)=a�
        Install a wheel to the specified paths. If kwarg ``warner`` is
        specified, it should be a callable, which will be called with two
        tuples indicating the wheel version of this software and the wheel
        version in the file, if there is a discrepancy in the versions.
        This can be used to issue any warnings to raise any exceptions.
        If kwarg ``lib_only`` is True, only the purelib/platlib files are
        installed, and the headers, scripts, data and dist-info metadata are
        not written.

        The return value is a :class:`InstalledDistribution` instance unless
        ``options.lib_only`` is True, in which case the return value is ``None``.
        �warner�lib_onlyFz%s-%sz%s.dataz%s.dist-infor�r�zutf-8roNz
Wheel-Versionr!rcSsg|]}t|��qSr/)rp)rqrrr/r/r0rs�sz!Wheel.install.<locals>.<listcomp>zRoot-Is-Purelibr�r�r�)�streamrr)r�)�dry_runTz/RECORD.jwsrzsize mismatch for %s�=zdigest mismatch for %szlib_only: skipping %sz.exer2r�zdigest mismatch on write for %sz.pyzByte-compilation failed)�exc_infozlib_only: returning Nonez1.0zentry_points.txt�console�guiz
%s_scriptszwrap_%sz%s:%sz %szAUnable to read legacy script metadata, so cannot generate scriptsr?zpython.commandsz8Unable to read JSON metadata, so cannot generate scriptsZwrap_consoleZwrap_guizValid script path not specifiedz%s = %s�lib�prefixzinstallation failed.)r�r�)Fr�r�r6rFr-r`rcrarbrxrrtrurryrrgrwr�rr�recordrI�dont_write_bytecode�tempfileZmkdtempZ
source_dirZ
target_dir�infolist�
isinstancer
r�r��str�	file_sizerr�r��
startswithr�r�rZcopy_streamr,Zbyte_compile�	ExceptionZwarning�basenameZmakeZset_executable_mode�extendrkr�valuesr��suffix�flags�json�loadr�r{�itemsrr�Zwrite_shared_locationsZwrite_installed_filesZ	exceptionZrollback�shutilZrmtree)Br;r�Zmaker�kwargsr�r�r�r>r|r�r}�
[binary .pyc data omitted: the remainder of the CPython 3.6 bytecode for pip/_vendor/distlib/wheel.py is not
recoverable as text. Recoverable docstrings and identifiers from this stretch:
  Wheel.install            - installs the wheel and writes console_scripts / gui_scripts entry points
  Wheel._get_dylib_cache   - per-interpreter "dylib-cache" directory under the distlib cache base
  Wheel._get_extensions    - extracts C extensions listed in <dist>.dist-info/EXTENSIONS into that cache
  Wheel.is_compatible      - "Determine if a wheel is compatible with the running system."
  Wheel.is_mountable       - "Determine if a wheel is asserted as mountable by its metadata."
  Wheel.mount / unmount    - add or remove the wheel on sys.path ("Wheel %s not compatible with this Python.",
                             "Wheel %s is marked as not mountable.", "%s already in path")
  Wheel.verify             - checks Wheel-Version and the RECORD entries ("invalid entry in wheel: %r",
                             "size mismatch for %s", "digest mismatch for %s")
  Wheel.update(modifier, dest_dir=None, **kwargs)
                           - "Update the contents of a wheel in a generic way. The modifier should be a
                             callable which expects a dictionary argument: its keys are archive-entry paths,
                             and its values are absolute filesystem paths where the contents of the
                             corresponding archive entries can be found. The modifier is free to change the
                             contents of the files pointed to, add new entries and remove entries, before
                             returning. ... If ``dest_dir`` is specified, the new wheel is written there --
                             otherwise, the original wheel is overwritten. The modifier should return True
                             if it updated the wheel, else False. This method returns the same value the
                             modifier returns."
  compatible_tags()        - "Return (pyver, abi, arch) tuples compatible with this Python."
  is_compatible(wheel, tags=None), COMPATIBLE_TAGS]
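Illustrative usage (not part of the archived files): a minimal sketch of the Wheel API summarised above,
assuming pip's vendored distlib is importable; the wheel path, output directory and added file are
hypothetical, and the exact signatures should be treated as approximate.

    from pip._vendor.distlib.wheel import Wheel, is_compatible

    w = Wheel('/tmp/example_pkg-1.0-py3-none-any.whl')   # hypothetical wheel file
    print(w.name, w.version)         # parsed from the wheel filename
    if w.is_compatible():            # compares the wheel's (pyver, abi, arch) tags with compatible_tags()
        w.verify()                   # re-checks RECORD sizes and digests, raising on a mismatch
    print(is_compatible(w))          # module-level helper used internally

    # Wheel.update() takes a modifier callable, per the docstring preserved above.
    def add_note(path_map):
        # path_map maps archive-entry names to extracted files on disk; adding a key adds an entry
        note = '/tmp/NOTICE.txt'                              # hypothetical extra file
        with open(note, 'w') as fh:
            fh.write('rebuilt for internal mirror\n')
        path_map['example_pkg-1.0.dist-info/NOTICE.txt'] = note
        return True                                           # True means the wheel was changed

    w.update(add_note, dest_dir='/tmp/rebuilt')               # dest_dir must be an existing directory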
site-packages/pip/_vendor/distlib/__pycache__/resources.cpython-36.pyc000064400000025116147511334610022024 0ustar00
[binary .pyc data omitted: CPython 3.6 bytecode for pip/_vendor/distlib/resources.py. Recoverable docstrings
and identifiers:
  ResourceCache            - cache rooted at <cache base>/resource-cache; is_stale() ("Is the cache stale for
                             the given resource?"), get() ("Get a resource into the cache ... The pathname of
                             the resource in the cache.")
  Resource                 - "A class representing an in-package resource, such as a data file. This is not
                             normally instantiated by user code, but rather by a ResourceFinder which manages
                             the resource."  (as_stream(), file_path, bytes, size)
  ResourceContainer        - container resource exposing .resources
  ResourceFinder           - "Resource finder for file system resources."
  ZipResourceFinder        - "Resource finder for resources in .zip files."
  register_finder(loader, finder_maker)
  finder(package)          - "Return a resource finder for a package."
  finder_for_path(path)    - "Return a resource finder for a path, which should represent a container."]
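Illustrative usage (not part of the archived files): a sketch of the resource-finder API described above,
assuming the vendored distlib is importable. 't32.exe' is one of the launcher binaries distlib ships, so it
should be present whether the package lives on the filesystem or inside a zip.

    from pip._vendor.distlib.resources import finder

    f = finder('pip._vendor.distlib')       # ResourceFinder or ZipResourceFinder, depending on the loader
    r = f.find('t32.exe')                   # a Resource, or None if the name is missing
    if r is not None and not r.is_container:
        payload = r.bytes                   # whole file as bytes
        with r.as_stream() as stream:       # or read it as a binary stream
            head = stream.read(16)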
site-packages/pip/_vendor/distlib/__pycache__/compat.cpython-36.opt-1.pyc000064400000076203147511334610022237 0ustar003

[binary .pyc data omitted: CPython 3.6 bytecode for pip/_vendor/distlib/compat.py, the Python 2/3
compatibility layer. Recoverable docstrings and identifiers:
  urllib / urllib2 / httplib / xmlrpclib re-exports selected on sys.version_info, plus quote/unquote,
  splituser and TextIOWrapper for the Python 3 branch
  CertificateError, _dnsname_match  - "Matching according to RFC 6125, section 6.4.3"
  match_hostname(cert, hostname)    - "Verify that *cert* ... matches the *hostname*. RFC 2818 and RFC 6125
                                      rules are followed, but IP addresses are not accepted for *hostname*."
  Container                         - "A generic container for when multiple values need to be returned"
  which(cmd, mode, path)            - "Given a command, mode, and a PATH string, return the path which
                                      conforms to the given mode on the PATH, or None if there is no such
                                      file."
  ZipFile / ZipExtFile              - wrappers adding context-manager support on older Pythons
  python_implementation()           - "Return a string identifying the Python implementation."
  callable(), fsencode(), fsdecode(), detect_encoding() backports
  ChainMap                          - "A ChainMap groups multiple dicts (or other mappings) together to
                                      create a single, updateable view."
  cache_from_source(), OrderedDict  - "Dictionary that remembers insertion order"
  BaseConfigurator and ConvertingDict / ConvertingList / ConvertingTuple
                                    - "The configurator base class which defines some useful defaults."]
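Illustrative usage (not part of the archived files): compat is an internal shim layer rather than a public
API, but two of the backports listed above can be exercised directly; on Python 3 they simply resolve to the
standard-library versions.

    from pip._vendor.distlib.compat import which, ChainMap

    print(which('python3'))                         # shutil.which backport; None if not on PATH
    defaults = {'colour': 'red', 'user': 'guest'}
    overrides = {'user': 'admin'}
    merged = ChainMap(overrides, defaults)
    print(merged['user'], merged['colour'])         # 'admin red' -- the first mapping wins on lookup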
site-packages/pip/_vendor/distlib/__pycache__/scripts.cpython-36.opt-1.pyc000064400000023415147511334610022440 0ustar003

[binary .pyc data omitted: CPython 3.6 bytecode for pip/_vendor/distlib/scripts.py. Recoverable content:
  _DEFAULT_MANIFEST  - Windows UAC manifest template (requestedExecutionLevel level="asInvoker"
                       uiAccess="false")
  SCRIPT_TEMPLATE    - the generated wrapper: resolves '%(module)s' / '%(func)s', strips a
                       (-script.pyw? | .exe) suffix from sys.argv[0], calls the entry point and exits with
                       its return code (writes the exception to stderr and uses rc = 1 on failure)
  FIRST_LINE_RE      - ^#!.*pythonw?[0-9.]*([ \t].*)?$  (used when adjusting the shebang of copied scripts)
  _enquote_executable(executable)
  ScriptMaker        - "A class to copy or create scripts from source scripts or callable specifications."
                       make() ("Make a script. :param specification: ... a valid export entry specification
                       (to make a script from a callable) or a filename (to make a script by copying from a
                       source location)"), make_multiple(), _get_shebang(), _write_script(), 'X' / 'X.Y'
                       variants, and .exe launcher binaries looked up through the resource finder on Windows]
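Illustrative usage (not part of the archived files): a sketch of ScriptMaker writing the wrapper described by
SCRIPT_TEMPLATE above; the target directory, interpreter path and entry-point specification are hypothetical.

    from pip._vendor.distlib.scripts import ScriptMaker

    maker = ScriptMaker(source_dir=None, target_dir='/tmp/bin')
    maker.executable = '/usr/bin/python3'              # optional: pin the shebang interpreter
    written = maker.make('mytool = mypkg.cli:main')    # callable specification, console-script style
    print(written)    # list of files created; on Windows matching .exe launchers are added as well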
site-packages/pip/_vendor/distlib/__pycache__/resources.cpython-36.opt-1.pyc000064400000025116147511334610022763 0ustar003

[binary .pyc data omitted: CPython 3.6 bytecode for pip/_vendor/distlib/resources.py built at optimization
level 1; the payload duplicates the resources.cpython-36.pyc entry above.]
site-packages/pip/_vendor/distlib/__pycache__/metadata.cpython-36.pyc000064400000064571147511334610021602 0ustar003

[binary .pyc data omitted: CPython 3.6 bytecode for pip/_vendor/distlib/metadata.py. Recoverable docstrings
and identifiers:
  module docstring         - "Implementation of the Metadata for Python packages PEPs. Supports all metadata
                             formats (1.0, 1.1, 1.2, and 2.0 experimental)."
  MetadataMissingError ("A required metadata is missing"), MetadataConflictError,
  MetadataUnrecognizedVersionError, MetadataInvalidError
  per-version field lists (Metadata-Version, Name, Version, Platform, Summary, Description, Keywords,
  Home-page, Author, ... Requires-Dist, Provides-Dist, Obsoletes-Dist, Requires-Python, Project-URL, ...)
  _best_version()          - "Detect the best version depending on the fields used."
  LegacyMetadata           - "The legacy metadata of a release. Supports versions 1.0, 1.1 and 1.2
                             (auto-detected). You can instantiate the class with one of these arguments
                             (or none): *path*, *fileobj*, *mapping*, *scheme*."  (read()/write() of the
                             PKG-INFO format, check(), todict(), update())
  Metadata                 - "The metadata of a release. This implementation uses 2.0 (JSON) metadata where
                             possible. If not possible, it wraps a LegacyMetadata instance which handles the
                             key-value metadata format."
  METADATA_FILENAME = 'pydist.json', WHEEL_METADATA_FILENAME = 'metadata.json']

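For orientation, here is a minimal sketch (not taken from the archive, and assuming a wheel-style METADATA file is present in the current directory) of how the Metadata class compiled above is typically driven through pip's vendored distlib:

# Sketch only: uses pip's vendored copy of distlib; adjust the import to
# "from distlib.metadata import Metadata" for a standalone distlib install.
from pip._vendor.distlib.metadata import Metadata

md = Metadata(path='METADATA')            # parse an existing metadata file
print(md.name, md.version)                # core fields exposed as attributes
md.add_requirements(['requests (>= 2.0)'])
print(sorted(md.todict().keys()))         # plain-dict view of the metadata
md.write(path='METADATA.out')             # serialise the (possibly updated) metadata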
	site-packages/pip/_vendor/distlib/__pycache__/locators.cpython-36.opt-1.pyc000064400000113116147511334610022575 0ustar003
[Compiled CPython 3.6 bytecode for pip/_vendor/distlib/locators.py; the binary stream cannot be reproduced as text. Its readable docstrings describe: get_all_distribution_names(url) for listing every project known to an index (DEFAULT_INDEX is https://pypi.python.org/pypi); a RedirectHandler that works around a bug in some Python 3.2.x releases; the Locator base class with get_project(), score_url(), prefer_url(), convert_url_to_download_info() and locate(); the concrete locators PyPIRPCLocator, PyPIJSONLocator, SimpleScrapingLocator, DirectoryLocator, JSONLocator, DistPathLocator and AggregatingLocator (the default locator points at https://pypi.python.org/simple/); the Page helper for scraping HTML link pages; and DependencyFinder, whose find() method resolves a requirement together with its dependencies and reports any that cannot be satisfied.]
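As a rough illustration of the locator API described in those docstrings, the following sketch (not part of the archive; it assumes network access to PyPI and uses pip's vendored distlib, with a hypothetical local wheelhouse path) exercises the module-level locate() helper and a DirectoryLocator:

# Sketch only: locate() is the convenience wrapper around the default
# AggregatingLocator; it returns a Distribution or None.
from pip._vendor.distlib.locators import (
    locate, get_all_distribution_names, DirectoryLocator,
)

dist = locate('requests (>= 2.0, < 3.0)')
if dist is not None:
    print(dist.name, dist.version, dist.source_url)

# Every project name known to the index (XML-RPC call; can be slow).
names = get_all_distribution_names()
print(len(names), 'projects known to the index')

# Locators can also scan a local directory tree of archives/wheels.
local = DirectoryLocator('/tmp/wheelhouse', recursive=True)
print(sorted(local.get_distribution_names()))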
site-packages/pip/_vendor/distlib/__pycache__/__init__.cpython-36.opt-1.pyc000064400000001703147511334610022504 0ustar003

���eE�@snddlZdZGdd�de�ZyddlmZWn&ek
rRGdd�dej�ZYnXeje�Z	e	j
e��dS)�Nz0.2.4c@seZdZdS)�DistlibExceptionN)�__name__�
__module__�__qualname__�rr�/usr/lib/python3.6/__init__.pyrsr)�NullHandlerc@s$eZdZdd�Zdd�Zdd�ZdS)rcCsdS)Nr)�self�recordrrr�handleszNullHandler.handlecCsdS)Nr)r	r
rrr�emitszNullHandler.emitcCs
d|_dS)N)�lock)r	rrr�
createLockszNullHandler.createLockN)rrrrrrrrrrrsr)Zlogging�__version__�	Exceptionrr�ImportErrorZHandlerZ	getLoggerrZloggerZ
addHandlerrrrr�<module>s
site-packages/pip/_vendor/distlib/__pycache__/util.cpython-36.opt-1.pyc000064400000126527147511334610021736 0ustar003

���ei��@s>ddlZddlmZddlZddlZddlmZddlZddl	Z	ddl
[Compiled CPython 3.6 bytecode for pip/_vendor/distlib/util.py; the binary stream cannot be reproduced as text. Its readable docstrings cover the module's helpers: parse_requirement() and parse_name_and_version(), get_resources_dests(), convert_path(), the FileOperator class, ExportEntry/get_export_entry() for entry-point style specifications, get_cache_base() and path_to_cache_dir(), split_filename(), get_extras(), the Cache, EventMixin, Sequencer and Progress classes, unarchive() and zip_dir(), CSVReader/CSVWriter, the Configurator and SubprocessMixin mix-ins, and normalize_name() (PEP 503 normalisation).]
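Two of the util helpers listed above are plain string parsers, so a short sketch (not part of the archive; the entry-point specification below is made up for illustration) can show their behaviour directly:

# Sketch only: no filesystem or network access needed.
from pip._vendor.distlib.util import parse_name_and_version, get_export_entry

# 'foo (1.0)' -> ('foo', '1.0'); ill-formed input raises DistlibException.
name, version = parse_name_and_version('foo (1.0)')
print(name, version)

# 'name = package.module:callable [flags]' -> ExportEntry with name,
# prefix, suffix and flags attributes.
entry = get_export_entry('console = distlib.scripts:main [gui]')
print(entry.name, entry.prefix, entry.suffix, entry.flags)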


*)	
:+site-packages/pip/_vendor/distlib/__pycache__/database.cpython-36.pyc000064400000122116147511334610021554 0ustar003

[Compiled CPython 3.6 bytecode for pip/_vendor/distlib/database.py ("PEP 376 implementation"); the binary stream cannot be reproduced as text, and the dump breaks off part-way through this file. Its readable docstrings cover the _Cache helper (a simple cache mapping names and .dist-info paths to distributions) and the DistributionPath class, whose distinfo_dirname(), get_distributions(), get_distribution() and provides_distribution() methods enumerate and query the distributions (.dist-info and, optionally, legacy .egg-info) installed on a search path.]
        If a *version* is provided, it will be used to filter the results.

        This function only returns the first result found, since no more than
        one values are expected. If the directory is not found, returns ``None``.

        :parameter version: a version specifier that indicates the version
                            required, conforming to the format in ``PEP-345``

        :type name: string
        :type version: string
        Nz%s (%s)zinvalid name or version: %r, %r)r5�matcher�
ValueErrorrr`�providesr�match)	r rr]rdr(�provided�p�p_name�p_verr!r!r"�provides_distribution�s$
z&DistributionPath.provides_distributioncCs(|j|�}|dkrtd|��|j|�S)z5
        Return the path to a resource file.
        Nzno distribution named %r found)rc�LookupError�get_resource_path)r r�
relative_pathr(r!r!r"�
get_file_paths
zDistributionPath.get_file_pathccs`xZ|j�D]N}|j}||kr
||}|dk	r@||krX||Vq
x|j�D]
}|VqJWq
WdS)z�
        Return all of the exported entries in a particular category.

        :param category: The category to search for entries.
        :param name: If specified, only entries with that name are returned.
        N)r`�exportsr_)r �categoryrr(rS�d�vr!r!r"�get_exported_entries"sz%DistributionPath.get_exported_entries)NF)N)N)r*r+r,r-r#r6r8�propertyZ
cache_enabledr9rVrX�classmethodr^r`rcrlrprur!r!r!r"rJs

*
$	c@s�eZdZdZdZdZdd�Zedd��ZeZ	edd��Z
ed	d
��Zdd�Zed
d��Z
edd��Zedd��Zedd��Zedd��Zdd�Zdd�Zdd�Zdd�ZdS) rz�
    A base class for distributions, whether installed or from indexes.
    Either way, it must have some metadata, so that's all that's needed
    for construction.
    FcCsL||_|j|_|jj�|_|j|_d|_d|_d|_d|_t	�|_
i|_dS)z�
        Initialise an instance.
        :param metadata: The instance of :class:`Metadata` describing this
        distribution.
        N)r>rrar&r]Zlocator�digest�extras�contextrBZ
download_urlsZdigests)r r>r!r!r"r#GszDistribution.__init__cCs|jjS)zH
        The source archive download URL for this distribution.
        )r>�
source_url)r r!r!r"r{XszDistribution.source_urlcCsd|j|jfS)zX
        A utility property which displays the name and version in parentheses.
        z%s (%s))rr])r r!r!r"�name_and_versionaszDistribution.name_and_versioncCs.|jj}d|j|jf}||kr*|j|�|S)z�
        A set of distribution names and versions provided by this distribution.
        :return: A set of "name (version)" strings.
        z%s (%s))r>rfrr]r')r Zplist�sr!r!r"rfhs

zDistribution.providescCs8|j}tjd|j��t||�}t|j||j|jd��S)Nz%Getting requirements from metadata %r)ryr?)	r>rMrNZtodict�getattrrBZget_requirementsryrz)r Zreq_attr�mdZreqtsr!r!r"�_get_requirementsts

zDistribution._get_requirementscCs
|jd�S)N�run_requires)r�)r r!r!r"r�{szDistribution.run_requirescCs
|jd�S)N�
meta_requires)r�)r r!r!r"r�szDistribution.meta_requirescCs
|jd�S)N�build_requires)r�)r r!r!r"r��szDistribution.build_requirescCs
|jd�S)N�
test_requires)r�)r r!r!r"r��szDistribution.test_requirescCs
|jd�S)N�dev_requires)r�)r r!r!r"r��szDistribution.dev_requiresc
Cs�t|�}t|jj�}y|j|j�}Wn6tk
rZtjd|�|j	�d}|j|�}YnX|j
}d}xJ|jD]@}t|�\}}	||kr�qny|j
|	�}PWqntk
r�YqnXqnW|S)z�
        Say if this instance matches (fulfills) a requirement.
        :param req: The requirement to match.
        :rtype req: str
        :return: True if it matches, else False.
        z+could not read version %r - using name onlyrF)rrr>r=rd�requirementrrM�warning�splitr&rfrrg)
r �reqrSr=rdrrbrirjrkr!r!r"�matches_requirement�s*	

z Distribution.matches_requirementcCs(|jrd|j}nd}d|j|j|fS)zC
        Return a textual representation of this instance,
        z [%s]r:z<Distribution %s (%s)%s>)r{rr])r �suffixr!r!r"�__repr__�szDistribution.__repr__cCs>t|�t|�k	rd}n$|j|jko8|j|jko8|j|jk}|S)a<
        See if this distribution is the same as another.
        :param other: The distribution to compare with. To be equal to one
                      another. distributions must have the same type, name,
                      version and source_url.
        :return: True if it is the same, else False.
        F)�typerr]r{)r �otherrbr!r!r"�__eq__�szDistribution.__eq__cCst|j�t|j�t|j�S)zH
        Compute hash in a way which matches the equality test.
        )�hashrr]r{)r r!r!r"�__hash__�szDistribution.__hash__N)r*r+r,r-Zbuild_time_dependency�	requestedr#rvr{Zdownload_urlr|rfr�r�r�r�r�r�r�r�r�r�r!r!r!r"r5s$"
cs0eZdZdZdZd�fdd�	Zddd�Z�ZS)	rz]
    This is the base class for installed distributions (whether PEP 376 or
    legacy).
    Ncs tt|�j|�||_||_dS)a
        Initialise an instance.
        :param metadata: An instance of :class:`Metadata` which describes the
                         distribution. This will normally have been initialised
                         from a metadata file in the ``path``.
        :param path:     The path of the ``.dist-info`` or ``.egg-info``
                         directory for the distribution.
        :param env:      This is normally the :class:`DistributionPath`
                         instance where this distribution was found.
        N)�superrr#r�	dist_path)r r>rr?)�	__class__r!r"r#�sz"BaseInstalledDistribution.__init__cCsd|dkr|j}|dkr"tj}d}ntt|�}d|j}||�j�}tj|�jd�jd�}d||fS)a�
        Get the hash of some data, using a particular hash algorithm, if
        specified.

        :param data: The data to be hashed.
        :type data: bytes
        :param hasher: The name of a hash implementation, supported by hashlib,
                       or ``None``. Examples of valid values are ``'sha1'``,
                       ``'sha224'``, ``'sha384'``, '``sha256'``, ``'md5'`` and
                       ``'sha512'``. If no hasher is specified, the ``hasher``
                       attribute of the :class:`InstalledDistribution` instance
                       is used. If the hasher is determined to be ``None``, MD5
                       is used as the hashing algorithm.
        :returns: The hash of the data. If a hasher was explicitly specified,
                  the returned hash will be prefixed with the specified hasher
                  followed by '='.
        :rtype: str
        Nr:z%s=�=�asciiz%s%s)	�hasher�hashlib�md5r~rx�base64Zurlsafe_b64encode�rstrip�decode)r �datar��prefixrxr!r!r"�get_hash�s

z"BaseInstalledDistribution.get_hash)N)N)r*r+r,r-r�r#r��
__classcell__r!r!)r�r"r�scs�eZdZdZdZd'�fdd�	Zdd�Zdd	�Zd
d�Ze	dd
��Z
dd�Zdd�Zdd�Z
dd�Zd(dd�Zdd�Ze	dd��Zd)dd�Zdd �Zd!d"�Zd#d$�Zd%d&�ZejZ�ZS)*ra
    Created with the *path* of the ``.dist-info`` directory provided to the
    constructor. It reads the metadata contained in ``pydist.json`` when it is
    instantiated., or uses a passed in Metadata instance (useful for when
    dry-run mode is being used).
    Zsha256Ncs0tj|�|_}|dkr(ddl}|j�|rN|jrN||jjkrN|jj|j}nt|dkr�|j	t
�}|dkrr|j	t�}|dkr�|j	d�}|dkr�tdt
|f��t
j|j���}t|dd�}WdQRXtt|�j|||�|r�|jr�|jj|�y|j	d�}Wn&tk
�r ddl}|j�YnX|dk	|_dS)NrZMETADATAzno %s found in %sr;)r<r=r)rrCrR�pdbZ	set_tracer4r2rr>rDr
rrerJrKrLr	r�rr#r)�AttributeErrorr�)r rr>r?rRr�rSrU)r�r!r"r#s4




zInstalledDistribution.__init__cCsd|j|j|jfS)Nz#<InstalledDistribution %r %s at %r>)rr]r)r r!r!r"r�2szInstalledDistribution.__repr__cCsd|j|jfS)Nz%s %s)rr])r r!r!r"�__str__6szInstalledDistribution.__str__c
Cs�g}|jd�}tj|j���`}t|d��J}xB|D]:}dd�tt|�d�D�}||\}}}	|j|||	f�q0WWdQRXWdQRX|S)a"
        Get the list of installed files for the distribution
        :return: A list of tuples of path, hash and size. Note that hash and
                 size might be ``None`` for some entries. The path is exactly
                 as stored in the file (which is as in PEP 376).
        r)rUcSsg|]}d�qS)Nr!)�.0�ir!r!r"�
<listcomp>Hsz6InstalledDistribution._get_records.<locals>.<listcomp>�N)�get_distinfo_resourcerJrKrLr�range�lenr')
r �resultsrSrUZ
record_reader�row�missingr�checksum�sizer!r!r"�_get_records9s

(z"InstalledDistribution._get_recordscCsi}|jt�}|r|j�}|S)a
        Return the information exported by this distribution.
        :return: A dictionary of exports, mapping an export category to a dict
                 of :class:`ExportEntry` instances describing the individual
                 export entries, and keyed by name.
        )r��EXPORTS_FILENAMEr)r rbrSr!r!r"rqPs

zInstalledDistribution.exportsc	Cs8i}|jt�}|r4tj|j���}t|�}WdQRX|S)z�
        Read exports data from a file in .ini format.

        :return: A dictionary of exports, mapping an export category to a list
                 of :class:`ExportEntry` instances describing the individual
                 export entries.
        N)r�r�rJrKrLr)r rbrSrUr!r!r"r^s
z"InstalledDistribution.read_exportsc
Cs.|jt�}t|d��}t||�WdQRXdS)a
        Write a dictionary of exports to a file in .ini format.
        :param exports: A dictionary of exports, mapping an export category to
                        a list of :class:`ExportEntry` instances describing the
                        individual export entries.
        �wN)�get_distinfo_filer��openr)r rqZrf�fr!r!r"rms
z#InstalledDistribution.write_exportscCsh|jd�}tj|j���:}t|d��$}x|D]\}}||kr,|Sq,WWdQRXWdQRXtd|��dS)aW
        NOTE: This API may change in the future.

        Return the absolute path to a resource file with the given relative
        path.

        :param relative_path: The path, relative to .dist-info, of the resource
                              of interest.
        :return: The absolute path where the resource is to be found.
        r)rUNz3no resource file with relative path %r is installed)r�rJrKrLr�KeyError)r rorSrUZresources_readerZrelativeZdestinationr!r!r"rnxs
z'InstalledDistribution.get_resource_pathccsx|j�D]
}|Vq
WdS)z�
        Iterates over the ``RECORD`` entries and returns a tuple
        ``(path, hash, size)`` for each line.

        :returns: iterator of (path, hash, size)
        N)r�)r rbr!r!r"�list_installed_files�sz*InstalledDistribution.list_installed_filesFcCs,tjj|d�}tjj|j�}|j|�}tjj|d�}|jd�}tjd|�|rRdSt|���}x�|D]�}tjj	|�s||j
d	�r�d}	}
n4dtjj|�}
t|d��}|j
|j��}	WdQRX|j|�s�|r�|j|�r�tjj||�}|j||	|
f�qbW|j|��rtjj||�}|j|ddf�WdQRX|S)
z�
        Writes the ``RECORD`` file, using the ``paths`` iterable passed in. Any
        existing ``RECORD`` file is silently overwritten.

        prefix is used to determine when to write absolute paths.
        r:rzcreating %sN�.pyc�.pyoz%d�rb)r�r�)�osrrI�dirname�
startswithr�rM�infor�isdirrF�getsizer�r��read�relpathZwriterow)r �pathsr��dry_run�baseZbase_under_prefix�record_path�writerr�
hash_valuer��fpr!r!r"�write_installed_files�s.





z+InstalledDistribution.write_installed_filesc
Csg}tjj|j�}|jd�}�x�|j�D]�\}}}tjj|�sLtjj||�}||krVq(tjj|�sv|j|dddf�q(tjj	|�r(t
tjj|��}|r�||kr�|j|d||f�q(|r(d|kr�|jdd�d}nd	}t
|d
��2}	|j|	j�|�}
|
|k�r|j|d||
f�Wd	QRXq(W|S)a�
        Checks that the hashes and sizes of the files in ``RECORD`` are
        matched by the files themselves. Returns a (possibly empty) list of
        mismatches. Each entry in the mismatch list will be a tuple consisting
        of the path, 'exists', 'size' or 'hash' according to what didn't match
        (existence is checked first, then size, then hash), the expected
        value and the actual value.
        r�existsTFr��=rrNr�r�)r�rr�r�r��isabsrIr�r'�isfile�strr�r�r�r�r�)r �
mismatchesr�r�rr�r�Zactual_sizer�r�Zactual_hashr!r!r"�check_installed_files�s.	

 z+InstalledDistribution.check_installed_filescCs�i}tjj|jd�}tjj|�r�tj|ddd��}|j�j�}WdQRXx@|D]8}|jdd�\}}|dkr~|j	|g�j
|�qN|||<qNW|S)	a�
        A dictionary of shared locations whose keys are in the set 'prefix',
        'purelib', 'platlib', 'scripts', 'headers', 'data' and 'namespace'.
        The corresponding value is the absolute path of that category for
        this distribution, and takes into account any paths selected by the
        user at installation time (e.g. via command-line arguments). In the
        case of the 'namespace' key, this would be a list of absolute paths
        for the roots of namespace packages in this distribution.

        The first time this property is accessed, the relevant information is
        read from the SHARED file in the .dist-info directory.
        rrSzutf-8)�encodingNr�r�	namespace)r�rrIr��codecsr�r��
splitlinesr�r%r')r rb�shared_pathr��lines�liner&r7r!r!r"�shared_locations�s
z&InstalledDistribution.shared_locationsc	
Cs�tjj|jd�}tjd|�|r$dSg}x6dD].}||}tjj||�r.|jd	||f�q.Wx"|jd
f�D]}|jd|�qnWtj	|dd
d��}|j
dj|��WdQRX|S)aa
        Write shared location information to the SHARED file in .dist-info.
        :param paths: A dictionary as described in the documentation for
        :meth:`shared_locations`.
        :param dry_run: If True, the action is logged but no file is actually
                        written.
        :return: The path of the file written to.
        rzcreating %sNr��lib�headers�scriptsr�z%s=%sr�znamespace=%sr�zutf-8)r��
)r�r�r�r�r�)r�rrIrMr�r�r'�getr�r��write)	r r�r�r�r�r&r�nsr�r!r!r"�write_shared_locations�s	
z,InstalledDistribution.write_shared_locationscCsF|tkrtd||jf��tj|j�}|dkr<td|j��|j|�S)Nz+invalid path for a dist-info file: %r at %rzUnable to get a finder for %s)�
DIST_FILESrrrrCrD)r rrRr!r!r"r�sz+InstalledDistribution.get_distinfo_resourcecCs~|jtj�dkrT|jtj�dd�\}}||jjtj�dkrTtd||j|jf��|tkrntd||jf��tjj	|j|�S)	a�
        Returns a path located under the ``.dist-info`` directory. Returns a
        string representing the path.

        :parameter path: a ``'/'``-separated path relative to the
                         ``.dist-info`` directory or an absolute path;
                         If *path* is an absolute path and doesn't start
                         with the ``.dist-info`` directory path,
                         a :class:`DistlibException` is raised
        :type path: str
        :rtype: str
        r�Nrz;dist-info file %r does not belong to the %r %s distributionz+invalid path for a dist-info file: %r at %r������)
rDr��sepr�rrrr]r�rI)r rr^r!r!r"r�sz'InstalledDistribution.get_distinfo_fileccsVtjj|j�}xB|j�D]6\}}}tjj|�s<tjj||�}|j|j�r|VqWdS)z�
        Iterates over the ``RECORD`` entries and returns paths for each line if
        the path is pointing to a file located in the ``.dist-info`` directory
        or one of its subdirectories.

        :returns: iterator of paths
        N)r�rr�r�r�rIr�)r r�rr�r�r!r!r"�list_distinfo_files6sz)InstalledDistribution.list_distinfo_filescCst|t�o|j|jkS)N)rWrr)r r�r!r!r"r�Fs
zInstalledDistribution.__eq__)NN)F)F)r*r+r,r-r�r#r�r�r�r
rqrrrnr�r�r�r�r�r�r�r�r��objectr�r�r!r!)r�r"r	s(

##
	csjeZdZdZdZiZd�fdd�	Zdd�Zdd	�Zd
d�Z	dd
�Z
dd�Zddd�Zdd�Z
ejZ�ZS)raCreated with the *path* of the ``.egg-info`` directory or file provided
    to the constructor. It reads the metadata contained in the file itself, or
    if the given path happens to be a directory, the metadata is read from the
    file ``PKG-INFO`` under that directory.TNcs�dd�}||_||_|rJ|jrJ||jjkrJ|jj|j}|||j|j�n0|j|�}|||j|j�|rz|jrz|jj|�t	t
|�j|||�dS)NcSs||_|j�|_||_dS)N)rrar&r])r}�nrtr!r!r"�set_name_and_versionXs
z:EggInfoDistribution.__init__.<locals>.set_name_and_version)rr�r4r3r>rr]�
_get_metadatar)r�rr#)r rr?r�r>)r�r!r"r#Ws

zEggInfoDistribution.__init__c
s2d}dd���fdd�}|jd�r�tjj|�rdtjj|dd�}t|dd	�}tjj|dd
�}||�}n`tj|�}t|j	d�j
d��}t|dd
�}y|j	d�}	�|	j
d��}Wntk
r�d}YnXnX|jd��rtjj|��rtjj|d
�}||�}tjj|d�}t|dd	�}ntd|��|�r.|j
|�|S)NcSs�g}|j�}x�|D]�}|j�}|jd�r6tjd|�Pt|�}|sPtjd|�q|jr`tjd�|jst|j|j	�qdj
dd�|jD��}|jd|j	|f�qW|S)	z�Create a list of dependencies from a requires.txt file.

            *data*: the contents of a setuptools-produced requires.txt file.
            �[z.Unexpected line: quitting requirement scan: %rz#Not recognised as a requirement: %rz4extra requirements in requires.txt are not supportedz, css|]}d|VqdS)z%s%sNr!)r��cr!r!r"�	<genexpr>�szQEggInfoDistribution._get_metadata.<locals>.parse_requires_data.<locals>.<genexpr>z%s (%s))r��stripr�rMr�rryZconstraintsr'rrI)r��reqsr�r�rSZconsr!r!r"�parse_requires_dataos&


z>EggInfoDistribution._get_metadata.<locals>.parse_requires_datacsHg}y*tj|dd��}�|j��}WdQRXWntk
rBYnX|S)z�Create a list of dependencies from a requires.txt file.

            *req_path*: the path to a setuptools-produced requires.txt file.
            rSzutf-8N)r�r�r��IOError)�req_pathr�r�)r�r!r"�parse_requires_path�sz>EggInfoDistribution._get_metadata.<locals>.parse_requires_pathz.eggzEGG-INFOzPKG-INFOr;)rr=zrequires.txtzEGG-INFO/PKG-INFO�utf8)r<r=zEGG-INFO/requires.txtzutf-8z	.egg-infoz,path must end with .egg-info or .egg, got %r)rFr�rr�rIr	�	zipimport�zipimporterr�get_datar�r�rZadd_requirements)
r r�requiresr��	meta_pathr>r�Zzipfr<r�r!)r�r"r�ls:




z!EggInfoDistribution._get_metadatacCsd|j|j|jfS)Nz!<EggInfoDistribution %r %s at %r>)rr]r)r r!r!r"r��szEggInfoDistribution.__repr__cCsd|j|jfS)Nz%s %s)rr])r r!r!r"r��szEggInfoDistribution.__str__cCsdg}tjj|jd�}tjj|�r`x>|j�D]2\}}}||kr>q*tjj|�s*|j|dddf�q*W|S)a�
        Checks that the hashes and sizes of the files in ``RECORD`` are
        matched by the files themselves. Returns a (possibly empty) list of
        mismatches. Each entry in the mismatch list will be a tuple consisting
        of the path, 'exists', 'size' or 'hash' according to what didn't match
        (existence is checked first, then size, then hash), the expected
        value and the actual value.
        zinstalled-files.txtr�TF)r�rrIr�r�r')r r�r�rrZr!r!r"r��s	z)EggInfoDistribution.check_installed_filesc
Cs�dd�}dd�}tjj|jd�}g}tjj|�r�tj|ddd��|}xt|D]l}|j�}tjjtjj|j|��}tjj|�s�tj	d	|�|j
d
�r�qHtjj|�sH|j|||�||�f�qHWWdQRX|j|ddf�|S)z�
        Iterates over the ``installed-files.txt`` entries and returns a tuple
        ``(path, hash, size)`` for each line.

        :returns: a list of (path, hash, size)
        c
Ss0t|d�}z|j�}Wd|j�Xtj|�j�S)Nr�)r�r��closer�r�Z	hexdigest)rr�Zcontentr!r!r"�_md5�s


z6EggInfoDistribution.list_installed_files.<locals>._md5cSstj|�jS)N)r��stat�st_size)rr!r!r"�_size�sz7EggInfoDistribution.list_installed_files.<locals>._sizezinstalled-files.txtrSzutf-8)r�zNon-existent file: %s�.pyc�.pyoN)rr)
r�rrIr�r�r�r��normpathrMr�rFr�r')r r�rr�rbr�r�rir!r!r"r��s"

&z(EggInfoDistribution.list_installed_filesFccs�tjj|jd�}d}tj|ddd��d}x\|D]T}|j�}|dkrFd}q,|s,tjjtjj|j|��}|j|j�r,|rz|Vq,|Vq,WWdQRXdS)	a
        Iterates over the ``installed-files.txt`` entries and returns paths for
        each line if the path is pointing to a file located in the
        ``.egg-info`` directory or one of its subdirectories.

        :parameter absolute: If *absolute* is ``True``, each returned path is
                          transformed into a local absolute path. Otherwise the
                          raw value from ``installed-files.txt`` is returned.
        :type absolute: boolean
        :returns: iterator of paths
        zinstalled-files.txtTrSzutf-8)r�z./FN)r�rrIr�r�r�rr�)r Zabsoluter��skipr�r�rir!r!r"r��s
z'EggInfoDistribution.list_distinfo_filescCst|t�o|j|jkS)N)rWrr)r r�r!r!r"r�s
zEggInfoDistribution.__eq__)N)F)r*r+r,r-r�r�r#r�r�r�r�r�r�r�r�r�r�r!r!)r�r"rNsK&
c@s^eZdZdZdd�Zdd�Zddd�Zd	d
�Zdd�Zddd�Z	ddd�Z
dd�Zdd�ZdS)�DependencyGrapha�
    Represents a dependency graph between distributions.

    The dependency relationships are stored in an ``adjacency_list`` that maps
    distributions to a list of ``(other, label)`` tuples where  ``other``
    is a distribution and the edge is labeled with ``label`` (i.e. the version
    specifier, if such was provided). Also, for more efficient traversal, for
    every distribution ``x``, a list of predecessors is kept in
    ``reverse_list[x]``. An edge from distribution ``a`` to
    distribution ``b`` means that ``a`` depends on ``b``. If any missing
    dependencies are found, they are stored in ``missing``, which is a
    dictionary that maps distributions to a list of requirements that were not
    provided by any other distributions.
    cCsi|_i|_i|_dS)N)�adjacency_list�reverse_listr�)r r!r!r"r#.szDependencyGraph.__init__cCsg|j|<g|j|<dS)z�Add the *distribution* to the graph.

        :type distribution: :class:`distutils2.database.InstalledDistribution`
                            or :class:`distutils2.database.EggInfoDistribution`
        N)rr)r �distributionr!r!r"�add_distribution3s
z DependencyGraph.add_distributionNcCs6|j|j||f�||j|kr2|j|j|�dS)a�Add an edge from distribution *x* to distribution *y* with the given
        *label*.

        :type x: :class:`distutils2.database.InstalledDistribution` or
                 :class:`distutils2.database.EggInfoDistribution`
        :type y: :class:`distutils2.database.InstalledDistribution` or
                 :class:`distutils2.database.EggInfoDistribution`
        :type label: ``str`` or ``None``
        N)rr'r)r �x�y�labelr!r!r"�add_edge=s
zDependencyGraph.add_edgecCs&tjd||�|jj|g�j|�dS)a
        Add a missing *requirement* for the given *distribution*.

        :type distribution: :class:`distutils2.database.InstalledDistribution`
                            or :class:`distutils2.database.EggInfoDistribution`
        :type requirement: ``str``
        z
%s missing %rN)rMrNr�r%r')r rr�r!r!r"�add_missingLszDependencyGraph.add_missingcCsd|j|jfS)Nz%s %s)rr])r r(r!r!r"�
_repr_distWszDependencyGraph._repr_distrcCs�|j|�g}xv|j|D]h\}}|j|�}|dk	r>d||f}|jd|t|��|j||d�}|jd�}|j|dd��qWdj|�S)zPrints only a subgraphNz%s [%s]z    rr�)rrr'r��	repr_noder��extendrI)r r(�level�outputr�rZ	suboutputZsubsr!r!r"rZs

zDependencyGraph.repr_nodeTcCs�g}|jd�x||jj�D]n\}}t|�dkr>|r>|j|�xH|D]@\}}|dk	rn|jd|j|j|f�qD|jd|j|jf�qDWqW|r�t|�dkr�|jd�|jd�|jd�x&|D]}|jd	|j�|jd
�q�W|jd�|jd�dS)a9Writes a DOT output for the graph to the provided file *f*.

        If *skip_disconnected* is set to ``True``, then all distributions
        that are not dependent on any other distribution are skipped.

        :type f: has to support ``file``-like operations
        :type skip_disconnected: ``bool``
        zdigraph dependencies {
rNz"%s" -> "%s" [label="%s"]
z
"%s" -> "%s"
zsubgraph disconnected {
zlabel = "Disconnected"
zbgcolor = red
z"%s"r�z}
)r�r�itemsr�r'r)r r�Zskip_disconnectedZdisconnectedr(�adjsr�rr!r!r"�to_dotgs&	






zDependencyGraph.to_dotcs�g}i}x&|jj�D]\}}|dd�||<qWx�g�x4t|j��dd�D]\}}|sL�j|�||=qLW�srPx*|j�D]\}}�fdd�|D�||<q|Wtjddd��D��|j��q2W|t|j��fS)aa
        Perform a topological sort of the graph.
        :return: A tuple, the first element of which is a topologically sorted
                 list of distributions, and the second element of which is a
                 list of distributions that cannot be sorted because they have
                 circular dependencies and so form a cycle.
        Ncs g|]\}}|�kr||f�qSr!r!)r�rsrS)�	to_remover!r"r��sz4DependencyGraph.topological_sort.<locals>.<listcomp>zMoving to result: %scSsg|]}d|j|jf�qS)z%s (%s))rr])r�rsr!r!r"r��s)rr�listr'rMrNr�keys)r rbZalist�krtr!)rr"�topological_sort�s$

z DependencyGraph.topological_sortcCs6g}x&|jj�D]\}}|j|j|��qWdj|�S)zRepresentation of the graphr�)rrr'rrI)r rr(rr!r!r"r��szDependencyGraph.__repr__)N)r)T)
r*r+r,r-r#r	r
rrrrrr�r!r!r!r"rs



 rr.cCsft|�}t�}i}xX|D]P}|j|�x@|jD]6}t|�\}}tjd|||�|j|g�j||f�q.WqWx�|D]�}|j	|j
B|jB|jB}x�|D]�}	y|j
|	�}
Wn6tk
r�tjd|	�|	j�d}|j
|�}
YnX|
j}d}||k�rJxV||D]J\}}y|
j|�}
Wntk
�r,d}
YnX|
r�|j|||	�d}Pq�W|s�|j||	�q�WqrW|S)a6Makes a dependency graph from the given distributions.

    :parameter dists: a list of distributions
    :type dists: list of :class:`distutils2.database.InstalledDistribution` and
                 :class:`distutils2.database.EggInfoDistribution` instances
    :rtype: a :class:`DependencyGraph` instance
    zAdd to provided: %s, %s, %sz+could not read version %r - using name onlyrFT)rrr	rfrrMrNr%r'r�r�r�r�rdrr�r�r&rgr
r)�distsr=�graphrhr(rirr]r�r�rdZmatchedZproviderrgr!r!r"�
make_graph�sD





rcCs~||krtd|j��t|�}|g}|j|}x@|rn|j�}|j|�x$|j|D]}||krR|j|�qRWq0W|jd�|S)z�Recursively generate a list of distributions from *dists* that are
    dependent on *dist*.

    :param dists: a list of distributions
    :param dist: a distribution, member of *dists* for which we are interested
    z1given distribution %r is not a member of the listr)rrrr�popr')rr(rZdep�todorsZsuccr!r!r"�get_dependent_dists�s



r!cCsv||krtd|j��t|�}g}|j|}xD|rp|j�d}|j|�x$|j|D]}||krT|j|�qTWq.W|S)z�Recursively generate a list of distributions from *dists* that are
    required by *dist*.

    :param dists: a list of distributions
    :param dist: a distribution, member of *dists* for which we are interested
    z1given distribution %r is not a member of the listr)rrrrrr')rr(rr�r rsZpredr!r!r"�get_required_dists�s


r"cKs4|jdd�}tf|�}||_||_|p(d|_t|�S)zO
    A convenience method for making a dist given just a name and version.
    �summaryzPlaceholder for summary)rr	rr]r#r)rr]�kwargsr#rr!r!r"�	make_dists

r%)r.)6r-Z
__future__rr�r�rJr�Zloggingr�rHr/r�r:rr�compatrr]rrr>r	r
r�utilrr
rrrrr�__all__Z	getLoggerr*rMr�ZCOMMANDS_FILENAMEr�rGr�rrrrrrrOrPrrr!r"r%r!r!r!r"�<module>sV$

l7GM
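The recovered docstrings describe how this installed-package database is meant to be driven. A hedged sketch based only on those docstrings (DistributionPath scanning sys.path, make_graph() building a DependencyGraph, topological_sort() returning the sorted distributions plus any cycle members); illustrative rather than a verified use of this exact build:

    from pip._vendor.distlib.database import DistributionPath, make_graph

    # Scan sys.path; include_egg=True also picks up legacy .egg/.egg-info metadata.
    dist_path = DistributionPath(include_egg=True)
    dists = list(dist_path.get_distributions())
    for dist in dists:
        print(dist.name, dist.version)

    # Build the dependency graph and sort it topologically; the second element
    # lists distributions that could not be sorted because they form a cycle.
    graph = make_graph(dists)
    ordered, cyclic = graph.topological_sort()
    print([d.name for d in ordered], [d.name for d in cyclic])
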
site-packages/pip/_vendor/distlib/__pycache__/compat.cpython-36.pyc

[Compiled CPython 3.6 bytecode; the binary body is not reproducible as text. The legible strings identify it as distlib's compat module, a Python 2/3 shim that re-exports urllib/urlparse helpers, ZipFile with context-manager support, string_types/text_type, and backports of match_hostname() and CertificateError (RFC 6125 host matching), which(), fsencode()/fsdecode(), detect_encoding(), ChainMap, OrderedDict, cache_from_source(), and the BaseConfigurator/ConvertingDict family used for dict-based configuration.]
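compat exists so the rest of distlib can import one stable surface regardless of Python version. A small illustrative sketch using a few of the recovered names (ZipFile with guaranteed context-manager support, urlparse, string_types); it assumes the vendored copy inside pip and is not specific to this exact build:

    from io import BytesIO
    from pip._vendor.distlib.compat import ZipFile, urlparse, string_types

    # On Python 3 this ZipFile is simply zipfile.ZipFile; the shim only matters
    # on old interpreters where the context-manager protocol was missing.
    buf = BytesIO()
    with ZipFile(buf, "w") as zf:
        zf.writestr("hello.txt", "hi")
    with ZipFile(buf) as zf:
        print(zf.namelist())                                   # ['hello.txt']

    print(urlparse("https://example.com/simple/pip/").netloc)  # example.com
    print(isinstance("distlib", string_types))                 # True
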

site-packages/pip/_vendor/distlib/__pycache__/__init__.cpython-36.pyc

[Compiled CPython 3.6 bytecode; the binary body is not reproducible as text. The legible strings show the package initializer: a __version__ string of 0.2.4, the DistlibException base class, a fallback NullHandler definition for logging modules that lack one, and attachment of a NullHandler to the package logger.]
site-packages/pip/_vendor/distlib/__pycache__/metadata.cpython-36.opt-1.pyc

[Compiled, optimized CPython 3.6 bytecode; the binary body is not reproducible as text and continues past this point. The legible strings identify it as distlib's metadata module ("Implementation of the Metadata for Python packages PEPs. Supports all metadata formats (1.0, 1.1, 1.2, and 2.0 experimental)."): the MetadataMissingError / MetadataConflictError / MetadataUnrecognizedVersionError / MetadataInvalidError exceptions, the field tables for metadata versions 1.0 through 2.0 with the _best_version() detector, the LegacyMetadata class for key/value PKG-INFO data, and the newer JSON-backed Metadata class (METADATA_FILENAME "pydist.json").]
r|rN�common�mapped�lkZmaker�resultrO�sentinel�drrrr�sF




zMetadata.__getattribute__cCsH||jkrD|j|\}}|p |j|krD|j|�}|sDtd||f��dS)Nz.'%s' is an invalid value for the '%s' property)�SYNTAX_VALIDATORSrw�matchr)r|rNrOrw�pattern�
exclusions�mrrr�_validate_values

zMetadata._validate_valuecCs*|j||�tj|d�}tj|d�}||kr�||\}}|jrV|dkrJt�||j|<nf|d
krj||j|<nR|jjdi�}|dkr�||d	<n2|dkr�|jd
i�}|||<n|jdi�}|||<nh||kr�tj|||�nP|dk�rt|t	��r|j
�}|�r|j�}ng}|j�r||j|<n
||j|<dS)Nrrrrrrr�rzpython.commandszpython.detailszpython.exportsrV)rrrrr�)r'rrr��NotImplementedErrorr�
setdefault�__setattr__r�rr�r�)r|rNrOrrrr�r!rrrr*s>




zMetadata.__setattr__cCst|j|jd�S)NT)rprRr>)r|rrr�name_and_version@szMetadata.name_and_versioncCsF|jr|jd}n|jjdg�}d|j|jf}||krB|j|�|S)Nz
Provides-Distrfz%s (%s))r�rr)rRr>rF)r|r�srrrrfDs
zMetadata.providescCs |jr||jd<n
||jd<dS)Nz
Provides-Distrf)r�r)r|rOrrrrfOsc
Cs�|jr|}n�g}t|pg|j�}xl|D]d}d|kr@d|kr@d}n8d|krNd}n|jd�|k}|rx|jd�}|rxt||�}|r&|j|d�q&WxNdD]F}d|}	|	|kr�|j|	�|jjd	|g�}|j|j|||d
��q�W|S)a�
        Base method to get dependencies, given a set of extras
        to satisfy and an optional environment context.
        :param reqts: A list of sometimes-wanted dependencies,
                      perhaps dependent on extras and environment.
        :param extras: A list of optional components being requested.
        :param env: An optional environment for marker evaluation.
        �extra�environmentTre�build�dev�testz:%s:z%s_requires)r�env)r/r0r1)	r�rrr�r
�extendrGr�get_requirements)
r|�reqtsrr2rr!�includerBrN�errrr4Vs0	




zMetadata.get_requirementscCs|jr|j�S|jS)N)r��_from_legacyr)r|rrr�
dictionary�szMetadata.dictionarycCs|jrt�nt|j|j�SdS)N)r�r(rr�DEPENDENCY_KEYS)r|rrr�dependencies�szMetadata.dependenciescCs|jrt�n|jj|�dS)N)r�r(rrz)r|rOrrrr;�sc	Cs�|jd�|jkrt��g}x0|jj�D]"\}}||kr&||kr&|j|�q&W|rfddj|�}t|��x"|j�D]\}}|j|||�qpWdS)NrQzMissing metadata items: %sz, )	r�rr�MANDATORY_KEYSrErFr�rr')	r|rrwr�rNr%r�r�r�rrrr�szMetadata._validate_mappingcCsB|jr.|jjd�\}}|s|r>tjd||�n|j|j|j�dS)NTz#Metadata: missing: %s, warnings: %s)r�r�r�r�rrrw)r|r�r�rrrr�s
zMetadata.validatecCs(|jr|jjd�St|j|j�}|SdS)NT)r�r�rr�
INDEX_KEYS)r|rrrrr��szMetadata.todictc
Cs�|j|jd�}|jjd�}x2dD]*}||kr |dkr:d	}n|}||||<q W|jd
g�}|dgkrhg}||d<d}x2|D]*\}}||krz||rzd||ig||<qzW|j|d<i}i}	|S)N)rQrTrRr>r\rTrUr]r�r"�rVrarrbrrerf)rRr>r\rTrUr]�rar�rbr)r?r@)rrr�r�r�rf)
r|rZlmdr��nk�kwr@�okrXrZrrrr8�s.


zMetadata._from_legacyrrr&r r!)rRr>r\rTrUr�cCs�dd�}t�}|j}x*|jj�D]\}}||kr ||||<q W||j|j�}||j|j�}|jrtt	|j�|d<t	|�|d<t	|�|d<|S)NcSs�t�}x�|D]�}|jd�}|jd�}|d}xb|D]Z}|rN|rN|j|�q2d}|r^d|}|rx|rtd||f}n|}|jdj||f��q2WqW|S)Nr-r.rer>z
extra == "%s"z(%s) and %sr�)r�r��addr�)Zentriesr5r7r-r2Zrlistr�rBrrr�process_entries�s"



z,Metadata._to_legacy.<locals>.process_entrieszProvides-Extraz
Requires-DistzSetup-Requires-Dist)
rqr�LEGACY_MAPPINGrErrrrr�sorted)r|rErZnmdrArCZr1Zr2rrr�
_to_legacy�szMetadata._to_legacyFTcCs�||gjd�dkrtd��|j�|r`|jr4|j}n|j�}|rP|j||d�q�|j||d�n^|jrp|j�}n|j}|r�t	j
||dddd�n.tj|dd��}t	j
||dddd�WdQRXdS)	Nrz)Exactly one of path and fileobj is needed)r�Trs)Zensure_ascii�indentZ	sort_keysr�zutf-8)
rtr
rr�rHr�r�r8rr�dumpr�r�)r|r}r~r�r�Z	legacy_mdr!rrrrr��s&

zMetadata.writecCs�|jr|jj|�nt|jjdg�}d}x"|D]}d|kr,d|kr,|}Pq,W|dkrhd|i}|jd|�n t|d�t|�B}t|�|d<dS)Nrr.r-rer)r�r�rr)�insertr�rG)r|r�r�always�entryZrsetrrrr�s
zMetadata.add_requirementscCs*|jpd}|jpd}d|jj|j||fS)Nz	(no name)z
no versionz<%s %s %s (%s)>)rRr>r�rrQ)r|rRr>rrrr�(s

zMetadata.__repr__)r�)r�)r�)r�)r�rrw)NNNrr)rRr>r\rVrT)rN)r
N)N)NN)NNFT)/rrrr�re�compileZMETADATA_VERSION_MATCHER�IZNAME_MATCHERrZVERSION_MATCHERZSUMMARY_MATCHERrrrr<r=r:r"�	__slots__r�r�rr�Z	none_list�dictZ	none_dictrrr'r*�propertyr+rf�setterr4r9r;rrr�r8rFrHr�r�r�rrrrrvsx


,+

'
*	%
)rrrrr r!r"r#r$r%r&)rrrrr'r r!r"r#r$r%r&r(r)r*r+r,)r*r+r,r(r))rrrrr'r r!r"r#r$r%r-r.r&r(r)r/r0r1r2r3r4)r1r2r3r/r4r-r.r0)rrrrr'r r!r"r#r$r%r-r.r&r(r)r/r0r1r2r3r4r5r6r7r8r9)r5r9r6r7r8)r2r/r1)r3)r)rr(r*r,r+r/r1r2r4r0r'r7r9r8)r0)r")r$r-r r!)F)BrZ
__future__rr�Zemailrrr�rNr>rr�compatrrr	rAr
�utilrrr>r
rZ	getLoggerrr�rrrr�__all__rrrOr�r�r:r;rIr<rJr=rKr�r�rzZEXTRA_REr?rPr�r�r�r�r�r�r�r�rr�rmrprqZMETADATA_FILENAMEZWHEEL_METADATA_FILENAMErrrrr�<module>	s�








9


	site-packages/pip/_vendor/distlib/__pycache__/manifest.cpython-36.opt-1.pyc000064400000023660147511334610022561 0ustar003
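A minimal sketch of how the key/value layer described above is typically driven, assuming the standalone distlib package and an existing PKG-INFO file (the file name and fields shown are illustrative only):

from distlib.metadata import LegacyMetadata

md = LegacyMetadata(path='PKG-INFO')      # parse the key/value (PKG-INFO) format
print(md['Name'], md['Version'])
missing, warnings = md.check()            # 'missing' lists absent mandatory fields
data = md.todict()                        # underscore-lowercase keys, e.g. 'home_page'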

site-packages/pip/_vendor/distlib/__pycache__/manifest.cpython-36.opt-1.pyc

Compiled CPython 3.6 bytecode (binary) for distlib's manifest module, described in its docstring as a class representing the list of files in a distribution, equivalent to distutils.filelist but fixing some problems. The recoverable docstrings cover the Manifest class (a list of files built by exploring the filesystem and filtered by applying various patterns), its findall, add, add_many, sorted and clear methods, and process_directive, which handles MANIFEST.in-style directives (include, exclude, global-include, global-exclude, recursive-include, recursive-exclude, graft, prune) through the pattern helpers _include_pattern, _exclude_pattern, _translate_pattern and _glob_to_re.
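A minimal sketch of the directive-driven file selection described above, assuming the standalone distlib package; the project path and patterns are illustrative only:

from distlib.manifest import Manifest

m = Manifest('/path/to/project')                      # base directory to explore
m.findall()                                           # populate the list of all files found
m.process_directive('include *.txt')                  # MANIFEST.in-style directives
m.process_directive('recursive-include docs *.rst')
print(m.sorted())                                     # the selected files, in directory order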

site-packages/pip/_vendor/distlib/__pycache__/util.cpython-36.pyc

Compiled CPython 3.6 bytecode (binary) for distlib's util module. The recoverable names and docstrings cover requirement parsing (parse_requirement), resource destination mapping (get_resources_dests), interpreter and venv detection (in_venv, get_executable), export-entry handling (ExportEntry, get_export_entry, read_exports, write_exports), path conversion (convert_path), the FileOperator class for copying, writing and byte-compiling files with dry-run support, cache helpers (get_cache_base, path_to_cache_dir, Cache), name and version helpers (split_filename, parse_name_and_version, get_extras, normalize_name per PEP 503), a simple publish/subscribe EventMixin, a Sequencer for ordering steps, archive helpers (unarchive, zip_dir), a Progress class, extended globbing supporting ** and {a,b} patterns, certificate-verifying HTTPS and XML-RPC transport wrappers, CSV reader/writer helpers, a Configurator and a SubprocessMixin.
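Two of the small helpers listed above can be shown in isolation; a minimal sketch assuming the standalone distlib package:

from distlib.util import parse_name_and_version, normalize_name

name, version = parse_name_and_version('foo (1.0)')   # -> ('foo', '1.0')
print(normalize_name('Foo_Bar.baz'))                   # -> 'foo-bar-baz' (PEP 503 style)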

site-packages/pip/_vendor/distlib/__pycache__/markers.cpython-36.opt-1.pyc

Compiled CPython 3.6 bytecode (binary) for distlib's markers module, a parser for the environment-marker micro-language defined in PEP 345. The recoverable docstrings describe the Evaluator class, a limited evaluator for Python expressions with a fixed operator table and allowed names such as sys_platform, python_version, python_full_version, os_name, platform_release, platform_version, platform_machine and platform_python_implementation, and the interpret(marker, execution_context=None) function, which interprets a marker and returns a result depending on the environment.

���e]R�@s�ddlZddlZddlZddlZddlZddlZyddlmZWn ek
r`ddl	mZYnXddl
mZddlm
Z
mZmZmZmZmZddlmZmZmZeje�ZdZdZGd	d
�d
e�ZdS)�N)�Thread�)�DistlibException)�HTTPBasicAuthHandler�Request�HTTPPasswordMgr�urlparse�build_opener�string_types)�cached_property�zip_dir�ServerProxyzhttps://pypi.python.org/pypi�pypic@s�eZdZdZdZd*dd�Zdd�Zdd	�Zd
d�Zdd
�Z	dd�Z
dd�Zd+dd�Zd,dd�Z
d-dd�Zd.dd�Zdd�Zd/dd�Zd0d d!�Zd1d"d#�Zd$d%�Zd&d'�Zd2d(d)�ZdS)3�PackageIndexzc
    This class represents a package index compatible with PyPI, the Python
    Package Index.
    s.----------ThIs_Is_tHe_distlib_index_bouNdaRY_$NcCs�|pt|_|j�t|j�\}}}}}}|s<|s<|s<|dkrJtd|j��d|_d|_d|_d|_d|_	t
tjd��R}xJdD]B}	y(t
j|	dg||d	�}
|
d
kr�|	|_PWq|tk
r�Yq|Xq|WWdQRXdS)
z�
        Initialise an instance.

        :param url: The URL of the index. If not specified, the URL for PyPI is
                    used.
        �http�httpszinvalid repository: %sN�w�gpg�gpg2z	--version)�stdout�stderrr)rr)rr)�
DEFAULT_INDEX�url�read_configurationrr�password_handler�ssl_verifierr�gpg_home�	rpc_proxy�open�os�devnull�
subprocessZ
check_call�OSError)�selfr�scheme�netloc�pathZparamsZqueryZfragZsink�s�rc�r)�/usr/lib/python3.6/index.py�__init__$s(

zPackageIndex.__init__cCs&ddlm}ddlm}|�}||�S)zs
        Get the distutils command for interacting with PyPI configurations.
        :return: the command.
        r)�Distribution)�
PyPIRCCommand)Zdistutils.corer,Zdistutils.configr-)r#r,r-�dr)r)r*�_get_pypirc_commandBsz PackageIndex._get_pypirc_commandcCsR|j�}|j|_|j�}|jd�|_|jd�|_|jdd�|_|jd|j�|_dS)z�
        Read the PyPI access configuration as supported by distutils, getting
        PyPI to do the actual work. This populates ``username``, ``password``,
        ``realm`` and ``url`` attributes from the configuration.
        �username�password�realmr�
repositoryN)r/rr3Z_read_pypirc�getr0r1r2)r#�cZcfgr)r)r*rLszPackageIndex.read_configurationcCs$|j�|j�}|j|j|j�dS)z�
        Save the PyPI access configuration. You must have set ``username`` and
        ``password`` attributes before calling this method.

        Again, distutils is used to do the actual work.
        N)�check_credentialsr/Z
_store_pypircr0r1)r#r5r)r)r*�save_configuration[szPackageIndex.save_configurationcCs\|jdks|jdkrtd��t�}t|j�\}}}}}}|j|j||j|j�t|�|_	dS)zp
        Check that ``username`` and ``password`` have been set, and raise an
        exception if not.
        Nz!username and password must be set)
r0r1rrrrZadd_passwordr2rr)r#Zpm�_r%r)r)r*r6gszPackageIndex.check_credentialscCs\|j�|j�|j�}d|d<|j|j�g�}|j|�}d|d<|j|j�g�}|j|�S)aq
        Register a distribution on PyPI, using the provided metadata.

        :param metadata: A :class:`Metadata` instance defining at least a name
                         and version number for the distribution to be
                         registered.
        :return: The HTTP response received from PyPI upon submission of the
                request.
        Zverifyz:actionZsubmit)r6�validate�todict�encode_request�items�send_request)r#�metadatar.�requestZresponser)r)r*�registerss

zPackageIndex.registercCsJx<|j�}|sP|jd�j�}|j|�tjd||f�qW|j�dS)ar
        Thread runner for reading lines of from a subprocess into a buffer.

        :param name: The logical name of the stream (used for logging only).
        :param stream: The stream to read from. This will typically a pipe
                       connected to the output stream of a subprocess.
        :param outbuf: The list to append the read lines to.
        zutf-8z%s: %sN)�readline�decode�rstrip�append�logger�debug�close)r#�name�streamZoutbufr'r)r)r*�_reader�s	
zPackageIndex._readercCs�|jdddg}|dkr|j}|r.|jd|g�|dk	rF|jdddg�tj�}tjj|tjj|�d	�}|jd
dd|d
||g�t	j
ddj|��||fS)a�
        Return a suitable command for signing a file.

        :param filename: The pathname to the file to be signed.
        :param signer: The identifier of the signer of the file.
        :param sign_password: The passphrase for the signer's
                              private key used for signing.
        :param keystore: The path to a directory which contains the keys
                         used in verification. If not specified, the
                         instance's ``gpg_home`` attribute is used instead.
        :return: The signing command as a list suitable to be
                 passed to :class:`subprocess.Popen`.
        z--status-fd�2z--no-ttyNz	--homedirz--batchz--passphrase-fd�0z.ascz
--detach-signz--armorz--local-userz--outputzinvoking: %s� )rr�extend�tempfileZmkdtemprr&�join�basenamerErF)r#�filename�signer�
sign_password�keystore�cmdZtdZsfr)r)r*�get_sign_command�s
zPackageIndex.get_sign_commandc	Cs�tjtjd�}|dk	r tj|d<g}g}tj|f|�}t|jd|j|fd�}|j�t|jd|j|fd�}|j�|dk	r�|jj	|�|jj
�|j�|j�|j�|j
||fS)a�
        Run a command in a child process , passing it any input data specified.

        :param cmd: The command to run.
        :param input_data: If specified, this must be a byte string containing
                           data to be sent to the child process.
        :return: A tuple consisting of the subprocess' exit code, a list of
                 lines read from the subprocess' ``stdout``, and a list of
                 lines read from the subprocess' ``stderr``.
        )rrN�stdinr)�target�argsr)r!�PIPE�PopenrrJr�startrrX�writerG�waitrP�
returncode)	r#rVZ
input_data�kwargsrr�pZt1Zt2r)r)r*�run_command�s$


zPackageIndex.run_commandc
CsD|j||||�\}}|j||jd��\}}}	|dkr@td|��|S)aR
        Sign a file.

        :param filename: The pathname to the file to be signed.
        :param signer: The identifier of the signer of the file.
        :param sign_password: The passphrase for the signer's
                              private key used for signing.
        :param keystore: The path to a directory which contains the keys
                         used in signing. If not specified, the instance's
                         ``gpg_home`` attribute is used instead.
        :return: The absolute pathname of the file where the signature is
                 stored.
        zutf-8rz&sign command failed with error code %s)rWrc�encoder)
r#rRrSrTrUrV�sig_filer(rrr)r)r*�	sign_file�s

zPackageIndex.sign_file�sdist�sourcecCs(|j�tjj|�s td|��|j�|j�}d}	|rZ|jsJtj	d�n|j
||||�}	t|d��}
|
j�}WdQRXt
j|�j�}t
j|�j�}
|jdd||||
d��dtjj|�|fg}|	�rt|	d��}
|
j�}WdQRX|jd	tjj|	�|f�tjtjj|	��|j|j�|�}|j|�S)
a�
        Upload a release file to the index.

        :param metadata: A :class:`Metadata` instance defining at least a name
                         and version number for the file to be uploaded.
        :param filename: The pathname of the file to be uploaded.
        :param signer: The identifier of the signer of the file.
        :param sign_password: The passphrase for the signer's
                              private key used for signing.
        :param filetype: The type of the file being uploaded. This is the
                        distutils command which produced that file, e.g.
                        ``sdist`` or ``bdist_wheel``.
        :param pyversion: The version of Python which the release relates
                          to. For code compatible with any Python, this would
                          be ``source``, otherwise it would be e.g. ``3.2``.
        :param keystore: The path to a directory which contains the keys
                         used in signing. If not specified, the instance's
                         ``gpg_home`` attribute is used instead.
        :return: The HTTP response received from PyPI upon submission of the
                request.
        z
not found: %sNz)no signing program available - not signed�rbZfile_upload�1)z:actionZprotocol_version�filetype�	pyversion�
md5_digest�
sha256_digest�contentZ
gpg_signature)r6rr&�existsrr9r:rrEZwarningrfr�read�hashlib�md5�	hexdigestZsha256�updaterQrD�shutilZrmtree�dirnamer;r<r=)r#r>rRrSrTrkrlrUr.re�fZ	file_datarmrn�filesZsig_datar?r)r)r*�upload_file�s>

zPackageIndex.upload_filec
[compiled CPython 3.6 bytecode -- remainder of distlib's PackageIndex class; the binary stream is not recoverable as text. The readable docstrings embedded in it document the remaining methods:
    upload_documentation(metadata, doc_dir): upload the directory containing index.html to the index; returns the HTTP response received from PyPI.
    get_verify_command(signature_filename, data_filename, keystore=None): build the gpg verify command as a list suitable for subprocess.Popen.
    verify_signature(signature_filename, data_filename, keystore=None): run the verify command; returns True if the signature was verified, raises if gpg is unavailable.
    download_file(url, destfile, digest=None, reporthook=None): like urlretrieve, but can compute a digest during download and check it against an expected (hasher, value) pair.
    send_request(req): send a standard library Request to PyPI and return its response.
    encode_request(fields, files): encode fields and files for posting as a multipart/form-data request.
    search(terms, operator=None): search the index; a plain string is treated as a name query and the operator defaults to 'and'.]
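A minimal usage sketch of the methods summarised above, assuming pip's vendored distlib is importable; the URL, filenames and digest value are placeholders invented for illustration, not values taken from this archive:

from pip._vendor.distlib.index import PackageIndex

index = PackageIndex()  # defaults to the public PyPI endpoint
# download_file checks the (hasher, expected_value) digest while streaming the file
index.download_file(
    'https://example.com/packages/demo-1.0.tar.gz',   # placeholder URL
    'demo-1.0.tar.gz',
    digest=('sha256', 'expected-hex-digest'),          # placeholder digest
)
# verify_signature needs a working gpg; it returns True/False for the check itself
if index.gpg:
    ok = index.verify_signature('demo-1.0.tar.gz.asc', 'demo-1.0.tar.gz')
    print('signature verified:', ok)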
site-packages/pip/_vendor/distlib/__pycache__/scripts.cpython-36.pyc000064400000023415147511334610021501 0ustar003

[compiled CPython 3.6 bytecode for distlib's scripts module; the binary stream is not recoverable as text. The readable fragments show: a Windows executable manifest template (_DEFAULT_MANIFEST), the shebang-matching regex FIRST_LINE_RE, the SCRIPT_TEMPLATE used to generate wrapper scripts for "module:func" entry points, an _enquote_executable helper, and the ScriptMaker class ("A class to copy or create scripts from source scripts or callable specifications") whose make(specification, options=None) and make_multiple(specifications, options=None) methods return the absolute pathnames written.]
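A hedged sketch of the ScriptMaker call pattern described by the surviving docstrings; the target directory and the 'demo = demo.cli:main' entry are invented placeholders:

from pip._vendor.distlib.scripts import ScriptMaker

# source_dir may be None when only callable specifications are used
maker = ScriptMaker(source_dir=None, target_dir='build/bin', add_launchers=False)
# a specification of the form "name = module:attr" produces a wrapper script;
# a plain filename would instead be copied with its shebang adjusted
written = maker.make('demo = demo.cli:main')   # hypothetical entry point
print(written)                                 # absolute paths that were written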
site-packages/pip/_vendor/distlib/__pycache__/version.cpython-36.pyc000064400000050714147511334610021501 0ustar003

[compiled CPython 3.6 bytecode for distlib's version module: "Implementation of a flexible versioning scheme providing support for PEP-440, setuptools-compatible and semantic versioning." It exports NormalizedVersion, NormalizedMatcher, LegacyVersion, LegacyMatcher, SemanticVersion, SemanticMatcher, UnsupportedVersionError and get_scheme, and defines Version/Matcher base classes, suggestion helpers and a VersionScheme registry keyed by 'normalized', 'legacy', 'semantic' and 'default' (an alias for 'normalized').]
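A brief sketch of the exported API, assuming pip's vendored copy of this module; the package name and constraints are illustrative only:

from pip._vendor.distlib.version import (NormalizedVersion, NormalizedMatcher,
                                          get_scheme)

# PEP-440 ("normalized") ordering: pre-releases sort before the final release
assert NormalizedVersion('1.2.0') > NormalizedVersion('1.2.0b1')

matcher = NormalizedMatcher('requests (>=2.0, <3.0)')   # "name (constraints)" form
print(matcher.match('2.31.0'))   # True
print(matcher.match('3.0.0'))    # False

scheme = get_scheme('default')   # alias for the 'normalized' scheme
print(scheme.is_valid_version('1.0.post1'))   # True
print(scheme.suggest('1.0-final'))            # suggests a normalized form, e.g. '1.0'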
site-packages/pip/_vendor/distlib/__pycache__/database.cpython-36.opt-1.pyc000064400000122116147511334610022513 0ustar003

[compiled CPython 3.6 bytecode for distlib's database module ("PEP 376 implementation."); the binary stream is not recoverable as text. It exports Distribution, BaseInstalledDistribution, InstalledDistribution, EggInfoDistribution and DistributionPath; the readable docstrings also describe a DependencyGraph class and the make_graph, get_dependent_dists, get_required_dists and make_dist helpers for inspecting installed-distribution metadata.]
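A hedged sketch of how the classes listed above are typically queried, assuming pip's vendored distlib and an environment where pip itself is installed:

from pip._vendor.distlib.database import DistributionPath, make_graph

dist_path = DistributionPath(include_egg=True)    # also scan legacy .egg-info metadata
dist = dist_path.get_distribution('pip')          # None if pip is not on sys.path
if dist is not None:
    print(dist.name, dist.version)
    print(sorted(dist.provides))                  # "name (version)" strings

# make_graph wires each distribution to the distributions satisfying its requirements
graph = make_graph(list(dist_path.get_distributions()))
print(len(graph.missing), 'distributions have unsatisfied requirements')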
6site-packages/pip/_vendor/distlib/manifest.py000064400000034732147511334610015340 0ustar00# -*- coding: utf-8 -*-
#
# Copyright (C) 2012-2013 Python Software Foundation.
# See LICENSE.txt and CONTRIBUTORS.txt.
#
"""
Class representing the list of files in a distribution.

Equivalent to distutils.filelist, but fixes some problems.
"""
import fnmatch
import logging
import os
import re
import sys

from . import DistlibException
from .compat import fsdecode
from .util import convert_path


__all__ = ['Manifest']

logger = logging.getLogger(__name__)

# a \ followed by some spaces + EOL
_COLLAPSE_PATTERN = re.compile('\\\\w*\n', re.M)
_COMMENTED_LINE = re.compile('#.*?(?=\n)|\n(?=$)', re.M | re.S)

#
# Due to the different results returned by fnmatch.translate, we need
# to do slightly different processing for Python 2.7 and 3.2 ... this needed
# to be brought in for Python 3.6 onwards.
#
_PYTHON_VERSION = sys.version_info[:2]

class Manifest(object):
    """A list of files built by on exploring the filesystem and filtered by
    applying various patterns to what we find there.
    """

    def __init__(self, base=None):
        """
        Initialise an instance.

        :param base: The base directory to explore under.
        """
        self.base = os.path.abspath(os.path.normpath(base or os.getcwd()))
        self.prefix = self.base + os.sep
        self.allfiles = None
        self.files = set()

    #
    # Public API
    #

    def findall(self):
        """Find all files under the base and set ``allfiles`` to the absolute
        pathnames of files found.
        """
        from stat import S_ISREG, S_ISDIR, S_ISLNK

        self.allfiles = allfiles = []
        root = self.base
        stack = [root]
        pop = stack.pop
        push = stack.append

        while stack:
            root = pop()
            names = os.listdir(root)

            for name in names:
                fullname = os.path.join(root, name)

                # Avoid excess stat calls -- just one will do, thank you!
                stat = os.stat(fullname)
                mode = stat.st_mode
                if S_ISREG(mode):
                    allfiles.append(fsdecode(fullname))
                elif S_ISDIR(mode) and not S_ISLNK(mode):
                    push(fullname)

    def add(self, item):
        """
        Add a file to the manifest.

        :param item: The pathname to add. This can be relative to the base.
        """
        if not item.startswith(self.prefix):
            item = os.path.join(self.base, item)
        self.files.add(os.path.normpath(item))

    def add_many(self, items):
        """
        Add a list of files to the manifest.

        :param items: The pathnames to add. These can be relative to the base.
        """
        for item in items:
            self.add(item)

    def sorted(self, wantdirs=False):
        """
        Return sorted files in directory order
        """

        def add_dir(dirs, d):
            dirs.add(d)
            logger.debug('add_dir added %s', d)
            if d != self.base:
                parent, _ = os.path.split(d)
                assert parent not in ('', '/')
                add_dir(dirs, parent)

        result = set(self.files)    # make a copy!
        if wantdirs:
            dirs = set()
            for f in result:
                add_dir(dirs, os.path.dirname(f))
            result |= dirs
        return [os.path.join(*path_tuple) for path_tuple in
                sorted(os.path.split(path) for path in result)]

    def clear(self):
        """Clear all collected files."""
        self.files = set()
        self.allfiles = []

    def process_directive(self, directive):
        """
        Process a directive which either adds some files from ``allfiles`` to
        ``files``, or removes some files from ``files``.

        :param directive: The directive to process. This should be in a format
                     compatible with distutils ``MANIFEST.in`` files:

                     http://docs.python.org/distutils/sourcedist.html#commands
        """
        # Parse the line: split it up, make sure the right number of words
        # is there, and return the relevant words.  'action' is always
        # defined: it's the first word of the line.  Which of the other
        # three are defined depends on the action; it'll be either
        # patterns, (dir and patterns), or (dirpattern).
        action, patterns, thedir, dirpattern = self._parse_directive(directive)

        # OK, now we know that the action is valid and we have the
        # right number of words on the line for that action -- so we
        # can proceed with minimal error-checking.
        if action == 'include':
            for pattern in patterns:
                if not self._include_pattern(pattern, anchor=True):
                    logger.warning('no files found matching %r', pattern)

        elif action == 'exclude':
            for pattern in patterns:
                found = self._exclude_pattern(pattern, anchor=True)
                #if not found:
                #    logger.warning('no previously-included files '
                #                   'found matching %r', pattern)

        elif action == 'global-include':
            for pattern in patterns:
                if not self._include_pattern(pattern, anchor=False):
                    logger.warning('no files found matching %r '
                                   'anywhere in distribution', pattern)

        elif action == 'global-exclude':
            for pattern in patterns:
                found = self._exclude_pattern(pattern, anchor=False)
                #if not found:
                #    logger.warning('no previously-included files '
                #                   'matching %r found anywhere in '
                #                   'distribution', pattern)

        elif action == 'recursive-include':
            for pattern in patterns:
                if not self._include_pattern(pattern, prefix=thedir):
                    logger.warning('no files found matching %r '
                                   'under directory %r', pattern, thedir)

        elif action == 'recursive-exclude':
            for pattern in patterns:
                found = self._exclude_pattern(pattern, prefix=thedir)
                #if not found:
                #    logger.warning('no previously-included files '
                #                   'matching %r found under directory %r',
                #                   pattern, thedir)

        elif action == 'graft':
            if not self._include_pattern(None, prefix=dirpattern):
                logger.warning('no directories found matching %r',
                               dirpattern)

        elif action == 'prune':
            if not self._exclude_pattern(None, prefix=dirpattern):
                logger.warning('no previously-included directories found '
                               'matching %r', dirpattern)
        else:   # pragma: no cover
            # This should never happen, as it should be caught in
            # _parse_template_line
            raise DistlibException(
                'invalid action %r' % action)

    #
    # Private API
    #

    def _parse_directive(self, directive):
        """
        Validate a directive.
        :param directive: The directive to validate.
        :return: A tuple of action, patterns, thedir, dir_patterns
        """
        words = directive.split()
        if len(words) == 1 and words[0] not in ('include', 'exclude',
                                                'global-include',
                                                'global-exclude',
                                                'recursive-include',
                                                'recursive-exclude',
                                                'graft', 'prune'):
            # no action given, let's use the default 'include'
            words.insert(0, 'include')

        action = words[0]
        patterns = thedir = dir_pattern = None

        if action in ('include', 'exclude',
                      'global-include', 'global-exclude'):
            if len(words) < 2:
                raise DistlibException(
                    '%r expects <pattern1> <pattern2> ...' % action)

            patterns = [convert_path(word) for word in words[1:]]

        elif action in ('recursive-include', 'recursive-exclude'):
            if len(words) < 3:
                raise DistlibException(
                    '%r expects <dir> <pattern1> <pattern2> ...' % action)

            thedir = convert_path(words[1])
            patterns = [convert_path(word) for word in words[2:]]

        elif action in ('graft', 'prune'):
            if len(words) != 2:
                raise DistlibException(
                    '%r expects a single <dir_pattern>' % action)

            dir_pattern = convert_path(words[1])

        else:
            raise DistlibException('unknown action %r' % action)

        return action, patterns, thedir, dir_pattern

    def _include_pattern(self, pattern, anchor=True, prefix=None,
                         is_regex=False):
        """Select strings (presumably filenames) from 'self.files' that
        match 'pattern', a Unix-style wildcard (glob) pattern.

        Patterns are not quite the same as implemented by the 'fnmatch'
        module: '*' and '?'  match non-special characters, where "special"
        is platform-dependent: slash on Unix; colon, slash, and backslash on
        DOS/Windows; and colon on Mac OS.

        If 'anchor' is true (the default), then the pattern match is more
        stringent: "*.py" will match "foo.py" but not "foo/bar.py".  If
        'anchor' is false, both of these will match.

        If 'prefix' is supplied, then only filenames starting with 'prefix'
        (itself a pattern) and ending with 'pattern', with anything in between
        them, will match.  'anchor' is ignored in this case.

        If 'is_regex' is true, 'anchor' and 'prefix' are ignored, and
        'pattern' is assumed to be either a string containing a regex or a
        regex object -- no translation is done, the regex is just compiled
        and used as-is.

        Selected strings will be added to self.files.

        Return True if files are found.
        """
        # XXX docstring lying about what the special chars are?
        found = False
        pattern_re = self._translate_pattern(pattern, anchor, prefix, is_regex)

        # delayed loading of allfiles list
        if self.allfiles is None:
            self.findall()

        for name in self.allfiles:
            if pattern_re.search(name):
                self.files.add(name)
                found = True
        return found

    def _exclude_pattern(self, pattern, anchor=True, prefix=None,
                         is_regex=False):
        """Remove strings (presumably filenames) from 'files' that match
        'pattern'.

        Other parameters are the same as for '_include_pattern()', above.
        The set 'self.files' is modified in place. Return True if files are
        found.

        Used, for example, to exclude SCM subdirectories when packaging
        source distributions.
        """
        found = False
        pattern_re = self._translate_pattern(pattern, anchor, prefix, is_regex)
        for f in list(self.files):
            if pattern_re.search(f):
                self.files.remove(f)
                found = True
        return found

    def _translate_pattern(self, pattern, anchor=True, prefix=None,
                           is_regex=False):
        """Translate a shell-like wildcard pattern to a compiled regular
        expression.

        Return the compiled regex.  If 'is_regex' true,
        then 'pattern' is directly compiled to a regex (if it's a string)
        or just returned as-is (assumes it's a regex object).
        """
        if is_regex:
            if isinstance(pattern, str):
                return re.compile(pattern)
            else:
                return pattern

        if _PYTHON_VERSION > (3, 2):
            # ditch start and end characters
            start, _, end = self._glob_to_re('_').partition('_')

        if pattern:
            pattern_re = self._glob_to_re(pattern)
            if _PYTHON_VERSION > (3, 2):
                assert pattern_re.startswith(start) and pattern_re.endswith(end)
        else:
            pattern_re = ''

        base = re.escape(os.path.join(self.base, ''))
        if prefix is not None:
            # ditch end of pattern character
            if _PYTHON_VERSION <= (3, 2):
                empty_pattern = self._glob_to_re('')
                prefix_re = self._glob_to_re(prefix)[:-len(empty_pattern)]
            else:
                prefix_re = self._glob_to_re(prefix)
                assert prefix_re.startswith(start) and prefix_re.endswith(end)
                prefix_re = prefix_re[len(start): len(prefix_re) - len(end)]
            sep = os.sep
            if os.sep == '\\':
                sep = r'\\'
            if _PYTHON_VERSION <= (3, 2):
                pattern_re = '^' + base + sep.join((prefix_re,
                                                    '.*' + pattern_re))
            else:
                pattern_re = pattern_re[len(start): len(pattern_re) - len(end)]
                pattern_re = r'%s%s%s%s.*%s%s' % (start, base, prefix_re, sep,
                                                  pattern_re, end)
        else:  # no prefix -- respect anchor flag
            if anchor:
                if _PYTHON_VERSION <= (3, 2):
                    pattern_re = '^' + base + pattern_re
                else:
                    pattern_re = r'%s%s%s' % (start, base, pattern_re[len(start):])

        return re.compile(pattern_re)

    def _glob_to_re(self, pattern):
        """Translate a shell-like glob pattern to a regular expression.

        Return a string containing the regex.  Differs from
        'fnmatch.translate()' in that '*' does not match "special characters"
        (which are platform-specific).
        """
        pattern_re = fnmatch.translate(pattern)

        # '?' and '*' in the glob pattern become '.' and '.*' in the RE, which
        # IMHO is wrong -- '?' and '*' aren't supposed to match slash in Unix,
        # and by extension they shouldn't match such "special characters" under
        # any OS.  So change all non-escaped dots in the RE to match any
        # character except the special characters (currently: just os.sep).
        sep = os.sep
        if os.sep == '\\':
            # we're using a regex to manipulate a regex, so we need
            # to escape the backslash twice
            sep = r'\\\\'
        escaped = r'\1[^%s]' % sep
        pattern_re = re.sub(r'((?<!\\)(\\\\)*)\.', escaped, pattern_re)
        return pattern_re
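
# Illustrative only (not part of the original module): a minimal sketch of
# driving a Manifest with MANIFEST.in-style directives.  It assumes the
# constructor accepts the base directory (as the use of 'self.base' above
# suggests); 'project_root' is a placeholder and nothing here runs at import.
def _manifest_directive_example(project_root):
    m = Manifest(project_root)
    m.findall()     # assumed to populate 'allfiles' by walking the base;
                    # _include_pattern would also load it lazily
    m.process_directive('include *.txt')                 # anchored include
    m.process_directive('recursive-include docs *.rst')  # include under docs/
    m.process_directive('global-exclude *.py[co]')       # drop compiled files
    return m.sorted(wantdirs=True)   # directory-ordered paths, dirs included
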
site-packages/pip/_vendor/distlib/resources.py
# -*- coding: utf-8 -*-
#
# Copyright (C) 2013-2016 Vinay Sajip.
# Licensed to the Python Software Foundation under a contributor agreement.
# See LICENSE.txt and CONTRIBUTORS.txt.
#
from __future__ import unicode_literals

import bisect
import io
import logging
import os
import pkgutil
import shutil
import sys
import types
import zipimport

from . import DistlibException
from .util import cached_property, get_cache_base, path_to_cache_dir, Cache

logger = logging.getLogger(__name__)


cache = None    # created when needed


class ResourceCache(Cache):
    def __init__(self, base=None):
        if base is None:
            # Use native string to avoid issues on 2.x: see Python #20140.
            base = os.path.join(get_cache_base(), str('resource-cache'))
        super(ResourceCache, self).__init__(base)

    def is_stale(self, resource, path):
        """
        Is the cache stale for the given resource?

        :param resource: The :class:`Resource` being cached.
        :param path: The path of the resource in the cache.
        :return: True if the cache is stale.
        """
        # Cache invalidation is a hard problem :-)
        return True

    def get(self, resource):
        """
        Get a resource into the cache.

        :param resource: A :class:`Resource` instance.
        :return: The pathname of the resource in the cache.
        """
        prefix, path = resource.finder.get_cache_info(resource)
        if prefix is None:
            result = path
        else:
            result = os.path.join(self.base, self.prefix_to_dir(prefix), path)
            dirname = os.path.dirname(result)
            if not os.path.isdir(dirname):
                os.makedirs(dirname)
            if not os.path.exists(result):
                stale = True
            else:
                stale = self.is_stale(resource, path)
            if stale:
                # write the bytes of the resource to the cache location
                with open(result, 'wb') as f:
                    f.write(resource.bytes)
        return result


class ResourceBase(object):
    def __init__(self, finder, name):
        self.finder = finder
        self.name = name


class Resource(ResourceBase):
    """
    A class representing an in-package resource, such as a data file. This is
    not normally instantiated by user code, but rather by a
    :class:`ResourceFinder` which manages the resource.
    """
    is_container = False        # Backwards compatibility

    def as_stream(self):
        """
        Get the resource as a stream.

        This is not a property to make it obvious that it returns a new stream
        each time.
        """
        return self.finder.get_stream(self)

    @cached_property
    def file_path(self):
        global cache
        if cache is None:
            cache = ResourceCache()
        return cache.get(self)

    @cached_property
    def bytes(self):
        return self.finder.get_bytes(self)

    @cached_property
    def size(self):
        return self.finder.get_size(self)


class ResourceContainer(ResourceBase):
    is_container = True     # Backwards compatibility

    @cached_property
    def resources(self):
        return self.finder.get_resources(self)


class ResourceFinder(object):
    """
    Resource finder for file system resources.
    """

    if sys.platform.startswith('java'):
        skipped_extensions = ('.pyc', '.pyo', '.class')
    else:
        skipped_extensions = ('.pyc', '.pyo')

    def __init__(self, module):
        self.module = module
        self.loader = getattr(module, '__loader__', None)
        self.base = os.path.dirname(getattr(module, '__file__', ''))

    def _adjust_path(self, path):
        return os.path.realpath(path)

    def _make_path(self, resource_name):
        # Issue #50: need to preserve type of path on Python 2.x
        # like os.path._get_sep
        if isinstance(resource_name, bytes):    # should only happen on 2.x
            sep = b'/'
        else:
            sep = '/'
        parts = resource_name.split(sep)
        parts.insert(0, self.base)
        result = os.path.join(*parts)
        return self._adjust_path(result)

    def _find(self, path):
        return os.path.exists(path)

    def get_cache_info(self, resource):
        return None, resource.path

    def find(self, resource_name):
        path = self._make_path(resource_name)
        if not self._find(path):
            result = None
        else:
            if self._is_directory(path):
                result = ResourceContainer(self, resource_name)
            else:
                result = Resource(self, resource_name)
            result.path = path
        return result

    def get_stream(self, resource):
        return open(resource.path, 'rb')

    def get_bytes(self, resource):
        with open(resource.path, 'rb') as f:
            return f.read()

    def get_size(self, resource):
        return os.path.getsize(resource.path)

    def get_resources(self, resource):
        def allowed(f):
            return (f != '__pycache__' and not
                    f.endswith(self.skipped_extensions))
        return set([f for f in os.listdir(resource.path) if allowed(f)])

    def is_container(self, resource):
        return self._is_directory(resource.path)

    _is_directory = staticmethod(os.path.isdir)

    def iterator(self, resource_name):
        resource = self.find(resource_name)
        if resource is not None:
            todo = [resource]
            while todo:
                resource = todo.pop(0)
                yield resource
                if resource.is_container:
                    rname = resource.name
                    for name in resource.resources:
                        if not rname:
                            new_name = name
                        else:
                            new_name = '/'.join([rname, name])
                        child = self.find(new_name)
                        if child.is_container:
                            todo.append(child)
                        else:
                            yield child


class ZipResourceFinder(ResourceFinder):
    """
    Resource finder for resources in .zip files.
    """
    def __init__(self, module):
        super(ZipResourceFinder, self).__init__(module)
        archive = self.loader.archive
        self.prefix_len = 1 + len(archive)
        # PyPy doesn't have a _files attr on zipimporter, and you can't set one
        if hasattr(self.loader, '_files'):
            self._files = self.loader._files
        else:
            self._files = zipimport._zip_directory_cache[archive]
        self.index = sorted(self._files)

    def _adjust_path(self, path):
        return path

    def _find(self, path):
        path = path[self.prefix_len:]
        if path in self._files:
            result = True
        else:
            if path and path[-1] != os.sep:
                path = path + os.sep
            i = bisect.bisect(self.index, path)
            try:
                result = self.index[i].startswith(path)
            except IndexError:
                result = False
        if not result:
            logger.debug('_find failed: %r %r', path, self.loader.prefix)
        else:
            logger.debug('_find worked: %r %r', path, self.loader.prefix)
        return result

    def get_cache_info(self, resource):
        prefix = self.loader.archive
        path = resource.path[1 + len(prefix):]
        return prefix, path

    def get_bytes(self, resource):
        return self.loader.get_data(resource.path)

    def get_stream(self, resource):
        return io.BytesIO(self.get_bytes(resource))

    def get_size(self, resource):
        path = resource.path[self.prefix_len:]
        return self._files[path][3]

    def get_resources(self, resource):
        path = resource.path[self.prefix_len:]
        if path and path[-1] != os.sep:
            path += os.sep
        plen = len(path)
        result = set()
        i = bisect.bisect(self.index, path)
        while i < len(self.index):
            if not self.index[i].startswith(path):
                break
            s = self.index[i][plen:]
            result.add(s.split(os.sep, 1)[0])   # only immediate children
            i += 1
        return result

    def _is_directory(self, path):
        path = path[self.prefix_len:]
        if path and path[-1] != os.sep:
            path += os.sep
        i = bisect.bisect(self.index, path)
        try:
            result = self.index[i].startswith(path)
        except IndexError:
            result = False
        return result
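
# Illustrative only (not part of the original module): '_find' and
# '_is_directory' above rely on the sorted index: a name ending with the
# separator denotes a directory exactly when the entry located by bisect
# still starts with it.  A toy sketch of that idea, using '/' as separator:
def _bisect_prefix_example():
    index = sorted(['pkg/__init__.py', 'pkg/data/a.txt', 'pkg/data/b.txt'])
    path = 'pkg/data/'                   # candidate directory, with separator
    i = bisect.bisect(index, path)       # first entry sorting after 'path'
    return i < len(index) and index[i].startswith(path)   # True -> directory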

_finder_registry = {
    type(None): ResourceFinder,
    zipimport.zipimporter: ZipResourceFinder
}

try:
    # In Python 3.6, _frozen_importlib -> _frozen_importlib_external
    try:
        import _frozen_importlib_external as _fi
    except ImportError:
        import _frozen_importlib as _fi
    _finder_registry[_fi.SourceFileLoader] = ResourceFinder
    _finder_registry[_fi.FileFinder] = ResourceFinder
    del _fi
except (ImportError, AttributeError):
    pass


def register_finder(loader, finder_maker):
    _finder_registry[type(loader)] = finder_maker

_finder_cache = {}


def finder(package):
    """
    Return a resource finder for a package.
    :param package: The name of the package.
    :return: A :class:`ResourceFinder` instance for the package.
    """
    if package in _finder_cache:
        result = _finder_cache[package]
    else:
        if package not in sys.modules:
            __import__(package)
        module = sys.modules[package]
        path = getattr(module, '__path__', None)
        if path is None:
            raise DistlibException('You cannot get a finder for a module, '
                                   'only for a package')
        loader = getattr(module, '__loader__', None)
        finder_maker = _finder_registry.get(type(loader))
        if finder_maker is None:
            raise DistlibException('Unable to locate finder for %r' % package)
        result = finder_maker(module)
        _finder_cache[package] = result
    return result


_dummy_module = types.ModuleType(str('__dummy__'))


def finder_for_path(path):
    """
    Return a resource finder for a path, which should represent a container.

    :param path: The path.
    :return: A :class:`ResourceFinder` instance for the path.
    """
    result = None
    # calls any path hooks, gets importer into cache
    pkgutil.get_importer(path)
    loader = sys.path_importer_cache.get(path)
    finder = _finder_registry.get(type(loader))
    if finder:
        module = _dummy_module
        module.__file__ = os.path.join(path, '')
        module.__loader__ = loader
        result = finder(module)
    return result
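
# Illustrative only (not part of the original module): a sketch of the typical
# calling pattern.  The package and resource names are hypothetical; 'finder'
# caches one ResourceFinder (or ZipResourceFinder) per package and 'find'
# returns Resource or ResourceContainer objects.
def _finder_usage_example():
    f = finder('mypackage')               # hypothetical importable package
    res = f.find('data/defaults.cfg')     # hypothetical resource name
    if res is not None and not res.is_container:
        data = res.bytes                  # whole payload as bytes
        with res.as_stream() as stream:   # or read it as a binary stream
            first_line = stream.readline()
        return data, first_line
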
site-packages/pip/_vendor/distlib/version.py
# -*- coding: utf-8 -*-
#
# Copyright (C) 2012-2016 The Python Software Foundation.
# See LICENSE.txt and CONTRIBUTORS.txt.
#
"""
Implementation of a flexible versioning scheme providing support for PEP-440,
setuptools-compatible and semantic versioning.
"""

import logging
import re

from .compat import string_types

__all__ = ['NormalizedVersion', 'NormalizedMatcher',
           'LegacyVersion', 'LegacyMatcher',
           'SemanticVersion', 'SemanticMatcher',
           'UnsupportedVersionError', 'get_scheme']

logger = logging.getLogger(__name__)


class UnsupportedVersionError(ValueError):
    """This is an unsupported version."""
    pass


class Version(object):
    def __init__(self, s):
        self._string = s = s.strip()
        self._parts = parts = self.parse(s)
        assert isinstance(parts, tuple)
        assert len(parts) > 0

    def parse(self, s):
        raise NotImplementedError('please implement in a subclass')

    def _check_compatible(self, other):
        if type(self) != type(other):
            raise TypeError('cannot compare %r and %r' % (self, other))

    def __eq__(self, other):
        self._check_compatible(other)
        return self._parts == other._parts

    def __ne__(self, other):
        return not self.__eq__(other)

    def __lt__(self, other):
        self._check_compatible(other)
        return self._parts < other._parts

    def __gt__(self, other):
        return not (self.__lt__(other) or self.__eq__(other))

    def __le__(self, other):
        return self.__lt__(other) or self.__eq__(other)

    def __ge__(self, other):
        return self.__gt__(other) or self.__eq__(other)

    # See http://docs.python.org/reference/datamodel#object.__hash__
    def __hash__(self):
        return hash(self._parts)

    def __repr__(self):
        return "%s('%s')" % (self.__class__.__name__, self._string)

    def __str__(self):
        return self._string

    @property
    def is_prerelease(self):
        raise NotImplementedError('Please implement in subclasses.')


class Matcher(object):
    version_class = None

    dist_re = re.compile(r"^(\w[\s\w'.-]*)(\((.*)\))?")
    comp_re = re.compile(r'^(<=|>=|<|>|!=|={2,3}|~=)?\s*([^\s,]+)$')
    num_re = re.compile(r'^\d+(\.\d+)*$')

    # value is either a callable or the name of a method
    _operators = {
        '<': lambda v, c, p: v < c,
        '>': lambda v, c, p: v > c,
        '<=': lambda v, c, p: v == c or v < c,
        '>=': lambda v, c, p: v == c or v > c,
        '==': lambda v, c, p: v == c,
        '===': lambda v, c, p: v == c,
        # by default, compatible => >=.
        '~=': lambda v, c, p: v == c or v > c,
        '!=': lambda v, c, p: v != c,
    }

    def __init__(self, s):
        if self.version_class is None:
            raise ValueError('Please specify a version class')
        self._string = s = s.strip()
        m = self.dist_re.match(s)
        if not m:
            raise ValueError('Not valid: %r' % s)
        groups = m.groups('')
        self.name = groups[0].strip()
        self.key = self.name.lower()    # for case-insensitive comparisons
        clist = []
        if groups[2]:
            constraints = [c.strip() for c in groups[2].split(',')]
            for c in constraints:
                m = self.comp_re.match(c)
                if not m:
                    raise ValueError('Invalid %r in %r' % (c, s))
                groups = m.groups()
                op = groups[0] or '~='
                s = groups[1]
                if s.endswith('.*'):
                    if op not in ('==', '!='):
                        raise ValueError('\'.*\' not allowed for '
                                         '%r constraints' % op)
                    # Could be a partial version (e.g. for '2.*') which
                    # won't parse as a version, so keep it as a string
                    vn, prefix = s[:-2], True
                    if not self.num_re.match(vn):
                        # Just to check that vn is a valid version
                        self.version_class(vn)
                else:
                    # Should parse as a version, so we can create an
                    # instance for the comparison
                    vn, prefix = self.version_class(s), False
                clist.append((op, vn, prefix))
        self._parts = tuple(clist)

    def match(self, version):
        """
        Check if the provided version matches the constraints.

        :param version: The version to match against this instance.
        :type version: String or :class:`Version` instance.
        """
        if isinstance(version, string_types):
            version = self.version_class(version)
        for operator, constraint, prefix in self._parts:
            f = self._operators.get(operator)
            if isinstance(f, string_types):
                f = getattr(self, f)
            if not f:
                msg = ('%r not implemented '
                       'for %s' % (operator, self.__class__.__name__))
                raise NotImplementedError(msg)
            if not f(version, constraint, prefix):
                return False
        return True

    @property
    def exact_version(self):
        result = None
        if len(self._parts) == 1 and self._parts[0][0] in ('==', '==='):
            result = self._parts[0][1]
        return result

    def _check_compatible(self, other):
        if type(self) != type(other) or self.name != other.name:
            raise TypeError('cannot compare %s and %s' % (self, other))

    def __eq__(self, other):
        self._check_compatible(other)
        return self.key == other.key and self._parts == other._parts

    def __ne__(self, other):
        return not self.__eq__(other)

    # See http://docs.python.org/reference/datamodel#object.__hash__
    def __hash__(self):
        return hash(self.key) + hash(self._parts)

    def __repr__(self):
        return "%s(%r)" % (self.__class__.__name__, self._string)

    def __str__(self):
        return self._string


PEP440_VERSION_RE = re.compile(r'^v?(\d+!)?(\d+(\.\d+)*)((a|b|c|rc)(\d+))?'
                               r'(\.(post)(\d+))?(\.(dev)(\d+))?'
                               r'(\+([a-zA-Z\d]+(\.[a-zA-Z\d]+)?))?$')


def _pep_440_key(s):
    s = s.strip()
    m = PEP440_VERSION_RE.match(s)
    if not m:
        raise UnsupportedVersionError('Not a valid version: %s' % s)
    groups = m.groups()
    nums = tuple(int(v) for v in groups[1].split('.'))
    while len(nums) > 1 and nums[-1] == 0:
        nums = nums[:-1]

    if not groups[0]:
        epoch = 0
    else:
        epoch = int(groups[0])
    pre = groups[4:6]
    post = groups[7:9]
    dev = groups[10:12]
    local = groups[13]
    if pre == (None, None):
        pre = ()
    else:
        pre = pre[0], int(pre[1])
    if post == (None, None):
        post = ()
    else:
        post = post[0], int(post[1])
    if dev == (None, None):
        dev = ()
    else:
        dev = dev[0], int(dev[1])
    if local is None:
        local = ()
    else:
        parts = []
        for part in local.split('.'):
            # to ensure that numeric compares as > lexicographic, avoid
            # comparing them directly, but encode a tuple which ensures
            # correct sorting
            if part.isdigit():
                part = (1, int(part))
            else:
                part = (0, part)
            parts.append(part)
        local = tuple(parts)
    if not pre:
        # either before pre-release, or final release and after
        if not post and dev:
            # before pre-release
            pre = ('a', -1)     # to sort before a0
        else:
            pre = ('z',)        # to sort after all pre-releases
    # now look at the state of post and dev.
    if not post:
        post = ('_',)   # sort before 'a'
    if not dev:
        dev = ('final',)

    #print('%s -> %s' % (s, m.groups()))
    return epoch, nums, pre, post, dev, local


_normalized_key = _pep_440_key


class NormalizedVersion(Version):
    """A rational version.

    Good:
        1.2         # equivalent to "1.2.0"
        1.2.0
        1.2a1
        1.2.3a2
        1.2.3b1
        1.2.3c1
        1.2.3.4
        TODO: fill this out

    Bad:
        1           # minimum two numbers
        1.2a        # release level must have a release serial
        1.2.3b
    """
    def parse(self, s):
        result = _normalized_key(s)
        # _normalized_key loses trailing zeroes in the release
        # clause, since that's needed to ensure that X.Y == X.Y.0 == X.Y.0.0
        # However, PEP 440 prefix matching needs it: for example,
        # (~= 1.4.5.0) matches differently to (~= 1.4.5.0.0).
        m = PEP440_VERSION_RE.match(s)      # must succeed
        groups = m.groups()
        self._release_clause = tuple(int(v) for v in groups[1].split('.'))
        return result

    PREREL_TAGS = set(['a', 'b', 'c', 'rc', 'dev'])

    @property
    def is_prerelease(self):
        return any(t[0] in self.PREREL_TAGS for t in self._parts if t)
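
# Illustrative only (not part of the original module): a few side-effect-free
# checks showing how the ordering and prerelease rules encoded in
# _pep_440_key behave for NormalizedVersion.
def _normalized_version_example():
    assert NormalizedVersion('1.2') == NormalizedVersion('1.2.0')  # zeros trimmed
    assert NormalizedVersion('1.2a1') < NormalizedVersion('1.2')   # pre-release first
    assert NormalizedVersion('1.2') < NormalizedVersion('1.2.post1')
    assert NormalizedVersion('1.2a1').is_prerelease
    assert not NormalizedVersion('1.2').is_prerelease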


def _match_prefix(x, y):
    x = str(x)
    y = str(y)
    if x == y:
        return True
    if not x.startswith(y):
        return False
    n = len(y)
    return x[n] == '.'


class NormalizedMatcher(Matcher):
    version_class = NormalizedVersion

    # value is either a callable or the name of a method
    _operators = {
        '~=': '_match_compatible',
        '<': '_match_lt',
        '>': '_match_gt',
        '<=': '_match_le',
        '>=': '_match_ge',
        '==': '_match_eq',
        '===': '_match_arbitrary',
        '!=': '_match_ne',
    }

    def _adjust_local(self, version, constraint, prefix):
        if prefix:
            strip_local = '+' not in constraint and version._parts[-1]
        else:
            # both constraint and version are
            # NormalizedVersion instances.
            # If constraint does not have a local component,
            # ensure the version doesn't, either.
            strip_local = not constraint._parts[-1] and version._parts[-1]
        if strip_local:
            s = version._string.split('+', 1)[0]
            version = self.version_class(s)
        return version, constraint

    def _match_lt(self, version, constraint, prefix):
        version, constraint = self._adjust_local(version, constraint, prefix)
        if version >= constraint:
            return False
        release_clause = constraint._release_clause
        pfx = '.'.join([str(i) for i in release_clause])
        return not _match_prefix(version, pfx)

    def _match_gt(self, version, constraint, prefix):
        version, constraint = self._adjust_local(version, constraint, prefix)
        if version <= constraint:
            return False
        release_clause = constraint._release_clause
        pfx = '.'.join([str(i) for i in release_clause])
        return not _match_prefix(version, pfx)

    def _match_le(self, version, constraint, prefix):
        version, constraint = self._adjust_local(version, constraint, prefix)
        return version <= constraint

    def _match_ge(self, version, constraint, prefix):
        version, constraint = self._adjust_local(version, constraint, prefix)
        return version >= constraint

    def _match_eq(self, version, constraint, prefix):
        version, constraint = self._adjust_local(version, constraint, prefix)
        if not prefix:
            result = (version == constraint)
        else:
            result = _match_prefix(version, constraint)
        return result

    def _match_arbitrary(self, version, constraint, prefix):
        return str(version) == str(constraint)

    def _match_ne(self, version, constraint, prefix):
        version, constraint = self._adjust_local(version, constraint, prefix)
        if not prefix:
            result = (version != constraint)
        else:
            result = not _match_prefix(version, constraint)
        return result

    def _match_compatible(self, version, constraint, prefix):
        version, constraint = self._adjust_local(version, constraint, prefix)
        if version == constraint:
            return True
        if version < constraint:
            return False
#        if not prefix:
#            return True
        release_clause = constraint._release_clause
        if len(release_clause) > 1:
            release_clause = release_clause[:-1]
        pfx = '.'.join([str(i) for i in release_clause])
        return _match_prefix(version, pfx)
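
# Illustrative only (not part of the original module): how the operator
# methods above combine when a requirement in the "name (constraints)" form
# parsed by Matcher.__init__ is matched against candidate versions.
def _normalized_matcher_example():
    m = NormalizedMatcher('foo (>= 1.2, < 2.0, != 1.3)')
    assert m.match('1.2.1')
    assert not m.match('1.3')      # excluded explicitly
    assert not m.match('2.0')      # upper bound is exclusive
    c = NormalizedMatcher('bar (~= 1.4.2)')
    assert c.match('1.4.5') and not c.match('1.5.0')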

_REPLACEMENTS = (
    (re.compile('[.+-]$'), ''),                     # remove trailing puncts
    (re.compile(r'^[.](\d)'), r'0.\1'),             # .N -> 0.N at start
    (re.compile('^[.-]'), ''),                      # remove leading puncts
    (re.compile(r'^\((.*)\)$'), r'\1'),             # remove parentheses
    (re.compile(r'^v(ersion)?\s*(\d+)'), r'\2'),    # remove leading v(ersion)
    (re.compile(r'^r(ev)?\s*(\d+)'), r'\2'),        # remove leading r(ev)
    (re.compile('[.]{2,}'), '.'),                   # multiple runs of '.'
    (re.compile(r'\b(alfa|apha)\b'), 'alpha'),      # misspelt alpha
    (re.compile(r'\b(pre-alpha|prealpha)\b'),
                'pre.alpha'),                       # standardise
    (re.compile(r'\(beta\)$'), 'beta'),             # remove parentheses
)

_SUFFIX_REPLACEMENTS = (
    (re.compile('^[:~._+-]+'), ''),                   # remove leading puncts
    (re.compile('[,*")([\]]'), ''),                   # remove unwanted chars
    (re.compile('[~:+_ -]'), '.'),                    # replace illegal chars
    (re.compile('[.]{2,}'), '.'),                   # multiple runs of '.'
    (re.compile(r'\.$'), ''),                       # trailing '.'
)

_NUMERIC_PREFIX = re.compile(r'(\d+(\.\d+)*)')


def _suggest_semantic_version(s):
    """
    Try to suggest a semantic form for a version for which
    _suggest_normalized_version couldn't come up with anything.
    """
    result = s.strip().lower()
    for pat, repl in _REPLACEMENTS:
        result = pat.sub(repl, result)
    if not result:
        result = '0.0.0'

    # Now look for numeric prefix, and separate it out from
    # the rest.
    #import pdb; pdb.set_trace()
    m = _NUMERIC_PREFIX.match(result)
    if not m:
        prefix = '0.0.0'
        suffix = result
    else:
        prefix = m.groups()[0].split('.')
        prefix = [int(i) for i in prefix]
        while len(prefix) < 3:
            prefix.append(0)
        if len(prefix) == 3:
            suffix = result[m.end():]
        else:
            suffix = '.'.join([str(i) for i in prefix[3:]]) + result[m.end():]
            prefix = prefix[:3]
        prefix = '.'.join([str(i) for i in prefix])
        suffix = suffix.strip()
    if suffix:
        #import pdb; pdb.set_trace()
        # massage the suffix.
        for pat, repl in _SUFFIX_REPLACEMENTS:
            suffix = pat.sub(repl, suffix)

    if not suffix:
        result = prefix
    else:
        sep = '-' if 'dev' in suffix else '+'
        result = prefix + sep + suffix
    if not is_semver(result):
        result = None
    return result


def _suggest_normalized_version(s):
    """Suggest a normalized version close to the given version string.

    If you have a version string that isn't rational (i.e. NormalizedVersion
    doesn't like it) then you might be able to get an equivalent (or close)
    rational version from this function.

    This does a number of simple normalizations to the given string, based
    on observation of versions currently in use on PyPI. Given a dump of
    those versions during PyCon 2009, 4287 of them:
    - 2312 (53.93%) match NormalizedVersion without change
      with the automatic suggestion
    - 3474 (81.04%) match when using this suggestion method

    @param s {str} An irrational version string.
    @returns A rational version string, or None, if couldn't determine one.
    """
    try:
        _normalized_key(s)
        return s   # already rational
    except UnsupportedVersionError:
        pass

    rs = s.lower()

    # part of this could use maketrans
    for orig, repl in (('-alpha', 'a'), ('-beta', 'b'), ('alpha', 'a'),
                       ('beta', 'b'), ('rc', 'c'), ('-final', ''),
                       ('-pre', 'c'),
                       ('-release', ''), ('.release', ''), ('-stable', ''),
                       ('+', '.'), ('_', '.'), (' ', ''), ('.final', ''),
                       ('final', '')):
        rs = rs.replace(orig, repl)

    # if something ends with dev or pre, we add a 0
    rs = re.sub(r"pre$", r"pre0", rs)
    rs = re.sub(r"dev$", r"dev0", rs)

    # if we have something like "b-2" or "a.2" at the end of the
    # version, that is probably beta, alpha, etc
    # let's remove the dash or dot
    rs = re.sub(r"([abc]|rc)[\-\.](\d+)$", r"\1\2", rs)

    # 1.0-dev-r371 -> 1.0.dev371
    # 0.1-dev-r79 -> 0.1.dev79
    rs = re.sub(r"[\-\.](dev)[\-\.]?r?(\d+)$", r".\1\2", rs)

    # Clean: 2.0.a.3, 2.0.b1, 0.9.0~c1
    rs = re.sub(r"[.~]?([abc])\.?", r"\1", rs)

    # Clean: v0.3, v1.0
    if rs.startswith('v'):
        rs = rs[1:]

    # Clean leading '0's on numbers.
    #TODO: unintended side-effect on, e.g., "2003.05.09"
    # PyPI stats: 77 (~2%) better
    rs = re.sub(r"\b0+(\d+)(?!\d)", r"\1", rs)

    # Clean a/b/c with no version. E.g. "1.0a" -> "1.0a0". Setuptools infers
    # zero.
    # PyPI stats: 245 (7.56%) better
    rs = re.sub(r"(\d+[abc])$", r"\g<1>0", rs)

    # the 'dev-rNNN' tag is a dev tag
    rs = re.sub(r"\.?(dev-r|dev\.r)\.?(\d+)$", r".dev\2", rs)

    # clean the - when used as a pre delimiter
    rs = re.sub(r"-(a|b|c)(\d+)$", r"\1\2", rs)

    # a terminal "dev" or "devel" can be changed into ".dev0"
    rs = re.sub(r"[\.\-](dev|devel)$", r".dev0", rs)

    # a terminal "dev" can be changed into ".dev0"
    rs = re.sub(r"(?![\.\-])dev$", r".dev0", rs)

    # a terminal "final" or "stable" can be removed
    rs = re.sub(r"(final|stable)$", "", rs)

    # The 'r' and the '-' tags are post release tags
    #   0.4a1.r10       ->  0.4a1.post10
    #   0.9.33-17222    ->  0.9.33.post17222
    #   0.9.33-r17222   ->  0.9.33.post17222
    rs = re.sub(r"\.?(r|-|-r)\.?(\d+)$", r".post\2", rs)

    # Clean 'r' instead of 'dev' usage:
    #   0.9.33+r17222   ->  0.9.33.dev17222
    #   1.0dev123       ->  1.0.dev123
    #   1.0.git123      ->  1.0.dev123
    #   1.0.bzr123      ->  1.0.dev123
    #   0.1a0dev.123    ->  0.1a0.dev123
    # PyPI stats:  ~150 (~4%) better
    rs = re.sub(r"\.?(dev|git|bzr)\.?(\d+)$", r".dev\2", rs)

    # Clean '.pre' (normalized from '-pre' above) instead of 'c' usage:
    #   0.2.pre1        ->  0.2c1
    #   0.2-c1         ->  0.2c1
    #   1.0preview123   ->  1.0c123
    # PyPI stats: ~21 (0.62%) better
    rs = re.sub(r"\.?(pre|preview|-c)(\d+)$", r"c\g<2>", rs)

    # Tcl/Tk uses "px" for their post release markers
    rs = re.sub(r"p(\d+)$", r".post\1", rs)

    try:
        _normalized_key(rs)
    except UnsupportedVersionError:
        rs = None
    return rs
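
# Illustrative only (not part of the original module): the kinds of rewrites
# the suggestion helpers perform, using transformations already documented in
# the comments above.
def _suggestion_example():
    assert _suggest_normalized_version('1.0') == '1.0'   # already rational
    assert _suggest_normalized_version('1.0-dev-r371') == '1.0.dev371'
    assert _suggest_normalized_version('0.9.33-r17222') == '0.9.33.post17222'
    assert _suggest_semantic_version('2.1') == '2.1.0'   # padded to semver form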

#
#   Legacy version processing (distribute-compatible)
#

_VERSION_PART = re.compile(r'([a-z]+|\d+|[\.-])', re.I)
_VERSION_REPLACE = {
    'pre': 'c',
    'preview': 'c',
    '-': 'final-',
    'rc': 'c',
    'dev': '@',
    '': None,
    '.': None,
}


def _legacy_key(s):
    def get_parts(s):
        result = []
        for p in _VERSION_PART.split(s.lower()):
            p = _VERSION_REPLACE.get(p, p)
            if p:
                if '0' <= p[:1] <= '9':
                    p = p.zfill(8)
                else:
                    p = '*' + p
                result.append(p)
        result.append('*final')
        return result

    result = []
    for p in get_parts(s):
        if p.startswith('*'):
            if p < '*final':
                while result and result[-1] == '*final-':
                    result.pop()
            while result and result[-1] == '00000000':
                result.pop()
        result.append(p)
    return tuple(result)


class LegacyVersion(Version):
    def parse(self, s):
        return _legacy_key(s)

    @property
    def is_prerelease(self):
        result = False
        for x in self._parts:
            if (isinstance(x, string_types) and x.startswith('*') and
                x < '*final'):
                result = True
                break
        return result


class LegacyMatcher(Matcher):
    version_class = LegacyVersion

    _operators = dict(Matcher._operators)
    _operators['~='] = '_match_compatible'

    numeric_re = re.compile(r'^(\d+(\.\d+)*)')

    def _match_compatible(self, version, constraint, prefix):
        if version < constraint:
            return False
        m = self.numeric_re.match(str(constraint))
        if not m:
            logger.warning('Cannot compute compatible match for version %s '
                           ' and constraint %s', version, constraint)
            return True
        s = m.groups()[0]
        if '.' in s:
            s = s.rsplit('.', 1)[0]
        return _match_prefix(version, s)
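
# Illustrative only (not part of the original module): the setuptools-style
# ordering produced by _legacy_key, where alphabetic tags sort before the
# '*final' marker and are therefore treated as pre-releases.
def _legacy_version_example():
    assert LegacyVersion('1.0a1') < LegacyVersion('1.0')
    assert LegacyVersion('1.0') < LegacyVersion('1.0.post1')
    assert LegacyVersion('1.0a1').is_prerelease
    assert LegacyMatcher('foo (~= 1.4.2)').match('1.4.9')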

#
#   Semantic versioning
#

_SEMVER_RE = re.compile(r'^(\d+)\.(\d+)\.(\d+)'
                        r'(-[a-z0-9]+(\.[a-z0-9-]+)*)?'
                        r'(\+[a-z0-9]+(\.[a-z0-9-]+)*)?$', re.I)


def is_semver(s):
    return _SEMVER_RE.match(s)


def _semantic_key(s):
    def make_tuple(s, absent):
        if s is None:
            result = (absent,)
        else:
            parts = s[1:].split('.')
            # We can't compare ints and strings on Python 3, so fudge it
            # by zero-filling numeric values to simulate a numeric comparison
            result = tuple([p.zfill(8) if p.isdigit() else p for p in parts])
        return result

    m = is_semver(s)
    if not m:
        raise UnsupportedVersionError(s)
    groups = m.groups()
    major, minor, patch = [int(i) for i in groups[:3]]
    # choose the '|' and '*' so that versions sort correctly
    pre, build = make_tuple(groups[3], '|'), make_tuple(groups[5], '*')
    return (major, minor, patch), pre, build


class SemanticVersion(Version):
    def parse(self, s):
        return _semantic_key(s)

    @property
    def is_prerelease(self):
        return self._parts[1][0] != '|'
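
# Illustrative only (not part of the original module): semantic versions carry
# pre-release data after '-' and build metadata after '+'; _semantic_key
# encodes their presence so that ordering and is_prerelease work as expected.
def _semantic_version_example():
    assert is_semver('1.2.3-rc.1+build.7')
    assert SemanticVersion('1.0.0-alpha') < SemanticVersion('1.0.0')
    assert SemanticVersion('1.0.0-alpha').is_prerelease
    assert not SemanticVersion('1.0.0+build.5').is_prerelease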


class SemanticMatcher(Matcher):
    version_class = SemanticVersion


class VersionScheme(object):
    def __init__(self, key, matcher, suggester=None):
        self.key = key
        self.matcher = matcher
        self.suggester = suggester

    def is_valid_version(self, s):
        try:
            self.matcher.version_class(s)
            result = True
        except UnsupportedVersionError:
            result = False
        return result

    def is_valid_matcher(self, s):
        try:
            self.matcher(s)
            result = True
        except UnsupportedVersionError:
            result = False
        return result

    def is_valid_constraint_list(self, s):
        """
        Used for processing some metadata fields
        """
        return self.is_valid_matcher('dummy_name (%s)' % s)

    def suggest(self, s):
        if self.suggester is None:
            result = None
        else:
            result = self.suggester(s)
        return result

_SCHEMES = {
    'normalized': VersionScheme(_normalized_key, NormalizedMatcher,
                                _suggest_normalized_version),
    'legacy': VersionScheme(_legacy_key, LegacyMatcher, lambda self, s: s),
    'semantic': VersionScheme(_semantic_key, SemanticMatcher,
                              _suggest_semantic_version),
}

_SCHEMES['default'] = _SCHEMES['normalized']


def get_scheme(name):
    if name not in _SCHEMES:
        raise ValueError('unknown scheme name: %r' % name)
    return _SCHEMES[name]
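
# Illustrative only (not part of the original module): 'get_scheme' is the
# usual entry point; each VersionScheme bundles a key function, a matcher
# class and an optional suggester for strings its scheme cannot parse.
def _scheme_usage_example():
    scheme = get_scheme('normalized')
    assert scheme.is_valid_version('1.0.post1')
    assert not scheme.is_valid_version('1.0-dev-r371')
    assert scheme.suggest('1.0-dev-r371') == '1.0.dev371'
    assert scheme.is_valid_constraint_list('>= 1.2, < 2.0')
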
site-packages/pip/_vendor/distlib/markers.py
# -*- coding: utf-8 -*-
#
# Copyright (C) 2012-2013 Vinay Sajip.
# Licensed to the Python Software Foundation under a contributor agreement.
# See LICENSE.txt and CONTRIBUTORS.txt.
#
"""Parser for the environment markers micro-language defined in PEP 345."""

import ast
import os
import sys
import platform

from .compat import python_implementation, string_types
from .util import in_venv

__all__ = ['interpret']


class Evaluator(object):
    """
    A limited evaluator for Python expressions.
    """

    operators = {
        'eq': lambda x, y: x == y,
        'gt': lambda x, y: x > y,
        'gte': lambda x, y: x >= y,
        'in': lambda x, y: x in y,
        'lt': lambda x, y: x < y,
        'lte': lambda x, y: x <= y,
        'not': lambda x: not x,
        'noteq': lambda x, y: x != y,
        'notin': lambda x, y: x not in y,
    }

    allowed_values = {
        'sys_platform': sys.platform,
        'python_version': '%s.%s' % sys.version_info[:2],
        # parsing sys.version is not reliable, but there is no other
        # way to get e.g. 2.7.2+, and the PEP is defined with sys.version
        'python_full_version': sys.version.split(' ', 1)[0],
        'os_name': os.name,
        'platform_in_venv': str(in_venv()),
        'platform_release': platform.release(),
        'platform_version': platform.version(),
        'platform_machine': platform.machine(),
        'platform_python_implementation': python_implementation(),
    }

    def __init__(self, context=None):
        """
        Initialise an instance.

        :param context: If specified, names are looked up in this mapping.
        """
        self.context = context or {}
        self.source = None

    def get_fragment(self, offset):
        """
        Get the part of the source which is causing a problem.
        """
        fragment_len = 10
        s = '%r' % (self.source[offset:offset + fragment_len])
        if offset + fragment_len < len(self.source):
            s += '...'
        return s

    def get_handler(self, node_type):
        """
        Get a handler for the specified AST node type.
        """
        return getattr(self, 'do_%s' % node_type, None)

    def evaluate(self, node, filename=None):
        """
        Evaluate a source string or node, using ``filename`` when
        displaying errors.
        """
        if isinstance(node, string_types):
            self.source = node
            kwargs = {'mode': 'eval'}
            if filename:
                kwargs['filename'] = filename
            try:
                node = ast.parse(node, **kwargs)
            except SyntaxError as e:
                s = self.get_fragment(e.offset)
                raise SyntaxError('syntax error %s' % s)
        node_type = node.__class__.__name__.lower()
        handler = self.get_handler(node_type)
        if handler is None:
            if self.source is None:
                s = '(source not available)'
            else:
                s = self.get_fragment(node.col_offset)
            raise SyntaxError("don't know how to evaluate %r %s" % (
                node_type, s))
        return handler(node)

    def get_attr_key(self, node):
        assert isinstance(node, ast.Attribute), 'attribute node expected'
        return '%s.%s' % (node.value.id, node.attr)

    def do_attribute(self, node):
        if not isinstance(node.value, ast.Name):
            valid = False
            # no dotted key can be built; report the offending source fragment
            if self.source is None:
                key = '(source not available)'
            else:
                key = self.get_fragment(node.col_offset)
        else:
            key = self.get_attr_key(node)
            valid = key in self.context or key in self.allowed_values
        if not valid:
            raise SyntaxError('invalid expression: %s' % key)
        if key in self.context:
            result = self.context[key]
        else:
            result = self.allowed_values[key]
        return result

    def do_boolop(self, node):
        result = self.evaluate(node.values[0])
        is_or = node.op.__class__ is ast.Or
        is_and = node.op.__class__ is ast.And
        assert is_or or is_and
        if (is_and and result) or (is_or and not result):
            for n in node.values[1:]:
                result = self.evaluate(n)
                if (is_or and result) or (is_and and not result):
                    break
        return result

    def do_compare(self, node):
        def sanity_check(lhsnode, rhsnode):
            valid = True
            if isinstance(lhsnode, ast.Str) and isinstance(rhsnode, ast.Str):
                valid = False
            #elif (isinstance(lhsnode, ast.Attribute)
            #      and isinstance(rhsnode, ast.Attribute)):
            #    klhs = self.get_attr_key(lhsnode)
            #    krhs = self.get_attr_key(rhsnode)
            #    valid = klhs != krhs
            if not valid:
                s = self.get_fragment(node.col_offset)
                raise SyntaxError('Invalid comparison: %s' % s)

        lhsnode = node.left
        lhs = self.evaluate(lhsnode)
        result = True
        for op, rhsnode in zip(node.ops, node.comparators):
            sanity_check(lhsnode, rhsnode)
            op = op.__class__.__name__.lower()
            if op not in self.operators:
                raise SyntaxError('unsupported operation: %r' % op)
            rhs = self.evaluate(rhsnode)
            result = self.operators[op](lhs, rhs)
            if not result:
                break
            lhs = rhs
            lhsnode = rhsnode
        return result

    def do_expression(self, node):
        return self.evaluate(node.body)

    def do_name(self, node):
        valid = False
        if node.id in self.context:
            valid = True
            result = self.context[node.id]
        elif node.id in self.allowed_values:
            valid = True
            result = self.allowed_values[node.id]
        if not valid:
            raise SyntaxError('invalid expression: %s' % node.id)
        return result

    def do_str(self, node):
        return node.s


def interpret(marker, execution_context=None):
    """
    Interpret a marker and return a result depending on environment.

    :param marker: The marker to interpret.
    :type marker: str
    :param execution_context: The context used for name lookup.
    :type execution_context: mapping
    """
    return Evaluator(execution_context).evaluate(marker.strip())
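
# Illustrative only (not part of the original module): 'interpret' evaluates a
# marker against Evaluator.allowed_values, with names in the optional context
# mapping taking precedence.  This sketch assumes the Python 3.6 interpreter
# this tree targets, where string literals parse as ast.Str nodes.
def _marker_example():
    ctx = {'python_version': '3.6', 'sys_platform': 'linux'}
    assert interpret('python_version >= "2.6"', ctx)
    assert not interpret('sys_platform == "win32" and python_version < "3.0"',
                         ctx)
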
site-packages/pip/_vendor/six.py
"""Utilities for writing code that runs on Python 2 and 3"""

# Copyright (c) 2010-2015 Benjamin Peterson
#
# Permission is hereby granted, free of charge, to any person obtaining a copy
# of this software and associated documentation files (the "Software"), to deal
# in the Software without restriction, including without limitation the rights
# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
# copies of the Software, and to permit persons to whom the Software is
# furnished to do so, subject to the following conditions:
#
# The above copyright notice and this permission notice shall be included in all
# copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
# SOFTWARE.

from __future__ import absolute_import

import functools
import itertools
import operator
import sys
import types

__author__ = "Benjamin Peterson <benjamin@python.org>"
__version__ = "1.10.0"


# Useful for very coarse version differentiation.
PY2 = sys.version_info[0] == 2
PY3 = sys.version_info[0] == 3
PY34 = sys.version_info[0:2] >= (3, 4)

if PY3:
    string_types = str,
    integer_types = int,
    class_types = type,
    text_type = str
    binary_type = bytes

    MAXSIZE = sys.maxsize
else:
    string_types = basestring,
    integer_types = (int, long)
    class_types = (type, types.ClassType)
    text_type = unicode
    binary_type = str

    if sys.platform.startswith("java"):
        # Jython always uses 32 bits.
        MAXSIZE = int((1 << 31) - 1)
    else:
        # It's possible to have sizeof(long) != sizeof(Py_ssize_t).
        class X(object):

            def __len__(self):
                return 1 << 31
        try:
            len(X())
        except OverflowError:
            # 32-bit
            MAXSIZE = int((1 << 31) - 1)
        else:
            # 64-bit
            MAXSIZE = int((1 << 63) - 1)
        del X
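
# Illustrative only (not part of six): the aliases above are typically used in
# isinstance checks that must behave identically under Python 2 and Python 3.
def _compat_types_example(value):
    if isinstance(value, string_types):    # str on 3.x, basestring on 2.x
        return value
    if isinstance(value, binary_type):     # bytes on 3.x, str on 2.x
        return value.decode('utf-8')
    return text_type(value)                # coerce everything else to text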


def _add_doc(func, doc):
    """Add documentation to a function."""
    func.__doc__ = doc


def _import_module(name):
    """Import module, returning the module after the last dot."""
    __import__(name)
    return sys.modules[name]


class _LazyDescr(object):

    def __init__(self, name):
        self.name = name

    def __get__(self, obj, tp):
        result = self._resolve()
        setattr(obj, self.name, result)  # Invokes __set__.
        try:
            # This is a bit ugly, but it avoids running this again by
            # removing this descriptor.
            delattr(obj.__class__, self.name)
        except AttributeError:
            pass
        return result


class MovedModule(_LazyDescr):

    def __init__(self, name, old, new=None):
        super(MovedModule, self).__init__(name)
        if PY3:
            if new is None:
                new = name
            self.mod = new
        else:
            self.mod = old

    def _resolve(self):
        return _import_module(self.mod)

    def __getattr__(self, attr):
        _module = self._resolve()
        value = getattr(_module, attr)
        setattr(self, attr, value)
        return value


class _LazyModule(types.ModuleType):

    def __init__(self, name):
        super(_LazyModule, self).__init__(name)
        self.__doc__ = self.__class__.__doc__

    def __dir__(self):
        attrs = ["__doc__", "__name__"]
        attrs += [attr.name for attr in self._moved_attributes]
        return attrs

    # Subclasses should override this
    _moved_attributes = []


class MovedAttribute(_LazyDescr):

    def __init__(self, name, old_mod, new_mod, old_attr=None, new_attr=None):
        super(MovedAttribute, self).__init__(name)
        if PY3:
            if new_mod is None:
                new_mod = name
            self.mod = new_mod
            if new_attr is None:
                if old_attr is None:
                    new_attr = name
                else:
                    new_attr = old_attr
            self.attr = new_attr
        else:
            self.mod = old_mod
            if old_attr is None:
                old_attr = name
            self.attr = old_attr

    def _resolve(self):
        module = _import_module(self.mod)
        return getattr(module, self.attr)


class _SixMetaPathImporter(object):

    """
    A meta path importer to import six.moves and its submodules.

    This class implements a PEP302 finder and loader. It should be compatible
    with Python 2.5 and all existing versions of Python 3.
    """

    def __init__(self, six_module_name):
        self.name = six_module_name
        self.known_modules = {}

    def _add_module(self, mod, *fullnames):
        for fullname in fullnames:
            self.known_modules[self.name + "." + fullname] = mod

    def _get_module(self, fullname):
        return self.known_modules[self.name + "." + fullname]

    def find_module(self, fullname, path=None):
        if fullname in self.known_modules:
            return self
        return None

    def __get_module(self, fullname):
        try:
            return self.known_modules[fullname]
        except KeyError:
            raise ImportError("This loader does not know module " + fullname)

    def load_module(self, fullname):
        try:
            # in case of a reload
            return sys.modules[fullname]
        except KeyError:
            pass
        mod = self.__get_module(fullname)
        if isinstance(mod, MovedModule):
            mod = mod._resolve()
        else:
            mod.__loader__ = self
        sys.modules[fullname] = mod
        return mod

    def is_package(self, fullname):
        """
        Return true if the named module is a package.

        We need this method to get correct spec objects with
        Python 3.4 (see PEP451)
        """
        return hasattr(self.__get_module(fullname), "__path__")

    def get_code(self, fullname):
        """Return None

        Required, if is_package is implemented"""
        self.__get_module(fullname)  # eventually raises ImportError
        return None
    get_source = get_code  # same as get_code

_importer = _SixMetaPathImporter(__name__)
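
# Illustrative only (not part of six): a minimal sketch of the PEP 302
# finder/loader protocol implemented above, using a hypothetical 'demo'
# namespace registered by hand (load_module also records it in sys.modules).
def _meta_path_importer_example():
    demo = _SixMetaPathImporter('demo')
    mod = types.ModuleType('demo.sub')
    demo._add_module(mod, 'sub')                  # registered as 'demo.sub'
    assert demo.find_module('demo.sub') is demo   # finder half of the protocol
    assert demo.load_module('demo.sub') is mod    # loader half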


class _MovedItems(_LazyModule):

    """Lazy loading of moved objects"""
    __path__ = []  # mark as package


_moved_attributes = [
    MovedAttribute("cStringIO", "cStringIO", "io", "StringIO"),
    MovedAttribute("filter", "itertools", "builtins", "ifilter", "filter"),
    MovedAttribute("filterfalse", "itertools", "itertools", "ifilterfalse", "filterfalse"),
    MovedAttribute("input", "__builtin__", "builtins", "raw_input", "input"),
    MovedAttribute("intern", "__builtin__", "sys"),
    MovedAttribute("map", "itertools", "builtins", "imap", "map"),
    MovedAttribute("getcwd", "os", "os", "getcwdu", "getcwd"),
    MovedAttribute("getcwdb", "os", "os", "getcwd", "getcwdb"),
    MovedAttribute("range", "__builtin__", "builtins", "xrange", "range"),
    MovedAttribute("reload_module", "__builtin__", "importlib" if PY34 else "imp", "reload"),
    MovedAttribute("reduce", "__builtin__", "functools"),
    MovedAttribute("shlex_quote", "pipes", "shlex", "quote"),
    MovedAttribute("StringIO", "StringIO", "io"),
    MovedAttribute("UserDict", "UserDict", "collections"),
    MovedAttribute("UserList", "UserList", "collections"),
    MovedAttribute("UserString", "UserString", "collections"),
    MovedAttribute("xrange", "__builtin__", "builtins", "xrange", "range"),
    MovedAttribute("zip", "itertools", "builtins", "izip", "zip"),
    MovedAttribute("zip_longest", "itertools", "itertools", "izip_longest", "zip_longest"),
    MovedModule("builtins", "__builtin__"),
    MovedModule("configparser", "ConfigParser"),
    MovedModule("copyreg", "copy_reg"),
    MovedModule("dbm_gnu", "gdbm", "dbm.gnu"),
    MovedModule("_dummy_thread", "dummy_thread", "_dummy_thread"),
    MovedModule("http_cookiejar", "cookielib", "http.cookiejar"),
    MovedModule("http_cookies", "Cookie", "http.cookies"),
    MovedModule("html_entities", "htmlentitydefs", "html.entities"),
    MovedModule("html_parser", "HTMLParser", "html.parser"),
    MovedModule("http_client", "httplib", "http.client"),
    MovedModule("email_mime_multipart", "email.MIMEMultipart", "email.mime.multipart"),
    MovedModule("email_mime_nonmultipart", "email.MIMENonMultipart", "email.mime.nonmultipart"),
    MovedModule("email_mime_text", "email.MIMEText", "email.mime.text"),
    MovedModule("email_mime_base", "email.MIMEBase", "email.mime.base"),
    MovedModule("BaseHTTPServer", "BaseHTTPServer", "http.server"),
    MovedModule("CGIHTTPServer", "CGIHTTPServer", "http.server"),
    MovedModule("SimpleHTTPServer", "SimpleHTTPServer", "http.server"),
    MovedModule("cPickle", "cPickle", "pickle"),
    MovedModule("queue", "Queue"),
    MovedModule("reprlib", "repr"),
    MovedModule("socketserver", "SocketServer"),
    MovedModule("_thread", "thread", "_thread"),
    MovedModule("tkinter", "Tkinter"),
    MovedModule("tkinter_dialog", "Dialog", "tkinter.dialog"),
    MovedModule("tkinter_filedialog", "FileDialog", "tkinter.filedialog"),
    MovedModule("tkinter_scrolledtext", "ScrolledText", "tkinter.scrolledtext"),
    MovedModule("tkinter_simpledialog", "SimpleDialog", "tkinter.simpledialog"),
    MovedModule("tkinter_tix", "Tix", "tkinter.tix"),
    MovedModule("tkinter_ttk", "ttk", "tkinter.ttk"),
    MovedModule("tkinter_constants", "Tkconstants", "tkinter.constants"),
    MovedModule("tkinter_dnd", "Tkdnd", "tkinter.dnd"),
    MovedModule("tkinter_colorchooser", "tkColorChooser",
                "tkinter.colorchooser"),
    MovedModule("tkinter_commondialog", "tkCommonDialog",
                "tkinter.commondialog"),
    MovedModule("tkinter_tkfiledialog", "tkFileDialog", "tkinter.filedialog"),
    MovedModule("tkinter_font", "tkFont", "tkinter.font"),
    MovedModule("tkinter_messagebox", "tkMessageBox", "tkinter.messagebox"),
    MovedModule("tkinter_tksimpledialog", "tkSimpleDialog",
                "tkinter.simpledialog"),
    MovedModule("urllib_parse", __name__ + ".moves.urllib_parse", "urllib.parse"),
    MovedModule("urllib_error", __name__ + ".moves.urllib_error", "urllib.error"),
    MovedModule("urllib", __name__ + ".moves.urllib", __name__ + ".moves.urllib"),
    MovedModule("urllib_robotparser", "robotparser", "urllib.robotparser"),
    MovedModule("xmlrpc_client", "xmlrpclib", "xmlrpc.client"),
    MovedModule("xmlrpc_server", "SimpleXMLRPCServer", "xmlrpc.server"),
]
# Add windows specific modules.
if sys.platform == "win32":
    _moved_attributes += [
        MovedModule("winreg", "_winreg"),
    ]

for attr in _moved_attributes:
    setattr(_MovedItems, attr.name, attr)
    if isinstance(attr, MovedModule):
        _importer._add_module(attr, "moves." + attr.name)
del attr

_MovedItems._moved_attributes = _moved_attributes

moves = _MovedItems(__name__ + ".moves")
_importer._add_module(moves, "moves")


class Module_six_moves_urllib_parse(_LazyModule):

    """Lazy loading of moved objects in six.moves.urllib_parse"""


_urllib_parse_moved_attributes = [
    MovedAttribute("ParseResult", "urlparse", "urllib.parse"),
    MovedAttribute("SplitResult", "urlparse", "urllib.parse"),
    MovedAttribute("parse_qs", "urlparse", "urllib.parse"),
    MovedAttribute("parse_qsl", "urlparse", "urllib.parse"),
    MovedAttribute("urldefrag", "urlparse", "urllib.parse"),
    MovedAttribute("urljoin", "urlparse", "urllib.parse"),
    MovedAttribute("urlparse", "urlparse", "urllib.parse"),
    MovedAttribute("urlsplit", "urlparse", "urllib.parse"),
    MovedAttribute("urlunparse", "urlparse", "urllib.parse"),
    MovedAttribute("urlunsplit", "urlparse", "urllib.parse"),
    MovedAttribute("quote", "urllib", "urllib.parse"),
    MovedAttribute("quote_plus", "urllib", "urllib.parse"),
    MovedAttribute("unquote", "urllib", "urllib.parse"),
    MovedAttribute("unquote_plus", "urllib", "urllib.parse"),
    MovedAttribute("urlencode", "urllib", "urllib.parse"),
    MovedAttribute("splitquery", "urllib", "urllib.parse"),
    MovedAttribute("splittag", "urllib", "urllib.parse"),
    MovedAttribute("splituser", "urllib", "urllib.parse"),
    MovedAttribute("uses_fragment", "urlparse", "urllib.parse"),
    MovedAttribute("uses_netloc", "urlparse", "urllib.parse"),
    MovedAttribute("uses_params", "urlparse", "urllib.parse"),
    MovedAttribute("uses_query", "urlparse", "urllib.parse"),
    MovedAttribute("uses_relative", "urlparse", "urllib.parse"),
]
for attr in _urllib_parse_moved_attributes:
    setattr(Module_six_moves_urllib_parse, attr.name, attr)
del attr

Module_six_moves_urllib_parse._moved_attributes = _urllib_parse_moved_attributes

_importer._add_module(Module_six_moves_urllib_parse(__name__ + ".moves.urllib_parse"),
                      "moves.urllib_parse", "moves.urllib.parse")


class Module_six_moves_urllib_error(_LazyModule):

    """Lazy loading of moved objects in six.moves.urllib_error"""


_urllib_error_moved_attributes = [
    MovedAttribute("URLError", "urllib2", "urllib.error"),
    MovedAttribute("HTTPError", "urllib2", "urllib.error"),
    MovedAttribute("ContentTooShortError", "urllib", "urllib.error"),
]
for attr in _urllib_error_moved_attributes:
    setattr(Module_six_moves_urllib_error, attr.name, attr)
del attr

Module_six_moves_urllib_error._moved_attributes = _urllib_error_moved_attributes

_importer._add_module(Module_six_moves_urllib_error(__name__ + ".moves.urllib.error"),
                      "moves.urllib_error", "moves.urllib.error")


class Module_six_moves_urllib_request(_LazyModule):

    """Lazy loading of moved objects in six.moves.urllib_request"""


_urllib_request_moved_attributes = [
    MovedAttribute("urlopen", "urllib2", "urllib.request"),
    MovedAttribute("install_opener", "urllib2", "urllib.request"),
    MovedAttribute("build_opener", "urllib2", "urllib.request"),
    MovedAttribute("pathname2url", "urllib", "urllib.request"),
    MovedAttribute("url2pathname", "urllib", "urllib.request"),
    MovedAttribute("getproxies", "urllib", "urllib.request"),
    MovedAttribute("Request", "urllib2", "urllib.request"),
    MovedAttribute("OpenerDirector", "urllib2", "urllib.request"),
    MovedAttribute("HTTPDefaultErrorHandler", "urllib2", "urllib.request"),
    MovedAttribute("HTTPRedirectHandler", "urllib2", "urllib.request"),
    MovedAttribute("HTTPCookieProcessor", "urllib2", "urllib.request"),
    MovedAttribute("ProxyHandler", "urllib2", "urllib.request"),
    MovedAttribute("BaseHandler", "urllib2", "urllib.request"),
    MovedAttribute("HTTPPasswordMgr", "urllib2", "urllib.request"),
    MovedAttribute("HTTPPasswordMgrWithDefaultRealm", "urllib2", "urllib.request"),
    MovedAttribute("AbstractBasicAuthHandler", "urllib2", "urllib.request"),
    MovedAttribute("HTTPBasicAuthHandler", "urllib2", "urllib.request"),
    MovedAttribute("ProxyBasicAuthHandler", "urllib2", "urllib.request"),
    MovedAttribute("AbstractDigestAuthHandler", "urllib2", "urllib.request"),
    MovedAttribute("HTTPDigestAuthHandler", "urllib2", "urllib.request"),
    MovedAttribute("ProxyDigestAuthHandler", "urllib2", "urllib.request"),
    MovedAttribute("HTTPHandler", "urllib2", "urllib.request"),
    MovedAttribute("HTTPSHandler", "urllib2", "urllib.request"),
    MovedAttribute("FileHandler", "urllib2", "urllib.request"),
    MovedAttribute("FTPHandler", "urllib2", "urllib.request"),
    MovedAttribute("CacheFTPHandler", "urllib2", "urllib.request"),
    MovedAttribute("UnknownHandler", "urllib2", "urllib.request"),
    MovedAttribute("HTTPErrorProcessor", "urllib2", "urllib.request"),
    MovedAttribute("urlretrieve", "urllib", "urllib.request"),
    MovedAttribute("urlcleanup", "urllib", "urllib.request"),
    MovedAttribute("URLopener", "urllib", "urllib.request"),
    MovedAttribute("FancyURLopener", "urllib", "urllib.request"),
    MovedAttribute("proxy_bypass", "urllib", "urllib.request"),
]
for attr in _urllib_request_moved_attributes:
    setattr(Module_six_moves_urllib_request, attr.name, attr)
del attr

Module_six_moves_urllib_request._moved_attributes = _urllib_request_moved_attributes

_importer._add_module(Module_six_moves_urllib_request(__name__ + ".moves.urllib.request"),
                      "moves.urllib_request", "moves.urllib.request")


class Module_six_moves_urllib_response(_LazyModule):

    """Lazy loading of moved objects in six.moves.urllib_response"""


_urllib_response_moved_attributes = [
    MovedAttribute("addbase", "urllib", "urllib.response"),
    MovedAttribute("addclosehook", "urllib", "urllib.response"),
    MovedAttribute("addinfo", "urllib", "urllib.response"),
    MovedAttribute("addinfourl", "urllib", "urllib.response"),
]
for attr in _urllib_response_moved_attributes:
    setattr(Module_six_moves_urllib_response, attr.name, attr)
del attr

Module_six_moves_urllib_response._moved_attributes = _urllib_response_moved_attributes

_importer._add_module(Module_six_moves_urllib_response(__name__ + ".moves.urllib.response"),
                      "moves.urllib_response", "moves.urllib.response")


class Module_six_moves_urllib_robotparser(_LazyModule):

    """Lazy loading of moved objects in six.moves.urllib_robotparser"""


_urllib_robotparser_moved_attributes = [
    MovedAttribute("RobotFileParser", "robotparser", "urllib.robotparser"),
]
for attr in _urllib_robotparser_moved_attributes:
    setattr(Module_six_moves_urllib_robotparser, attr.name, attr)
del attr

Module_six_moves_urllib_robotparser._moved_attributes = _urllib_robotparser_moved_attributes

_importer._add_module(Module_six_moves_urllib_robotparser(__name__ + ".moves.urllib.robotparser"),
                      "moves.urllib_robotparser", "moves.urllib.robotparser")


class Module_six_moves_urllib(types.ModuleType):

    """Create a six.moves.urllib namespace that resembles the Python 3 namespace"""
    __path__ = []  # mark as package
    parse = _importer._get_module("moves.urllib_parse")
    error = _importer._get_module("moves.urllib_error")
    request = _importer._get_module("moves.urllib_request")
    response = _importer._get_module("moves.urllib_response")
    robotparser = _importer._get_module("moves.urllib_robotparser")

    def __dir__(self):
        return ['parse', 'error', 'request', 'response', 'robotparser']

_importer._add_module(Module_six_moves_urllib(__name__ + ".moves.urllib"),
                      "moves.urllib")


def add_move(move):
    """Add an item to six.moves."""
    setattr(_MovedItems, move.name, move)


def remove_move(name):
    """Remove item from six.moves."""
    try:
        delattr(_MovedItems, name)
    except AttributeError:
        try:
            del moves.__dict__[name]
        except KeyError:
            raise AttributeError("no such move, %r" % (name,))


if PY3:
    _meth_func = "__func__"
    _meth_self = "__self__"

    _func_closure = "__closure__"
    _func_code = "__code__"
    _func_defaults = "__defaults__"
    _func_globals = "__globals__"
else:
    _meth_func = "im_func"
    _meth_self = "im_self"

    _func_closure = "func_closure"
    _func_code = "func_code"
    _func_defaults = "func_defaults"
    _func_globals = "func_globals"


try:
    advance_iterator = next
except NameError:
    def advance_iterator(it):
        return it.next()
next = advance_iterator


try:
    callable = callable
except NameError:
    def callable(obj):
        return any("__call__" in klass.__dict__ for klass in type(obj).__mro__)


if PY3:
    def get_unbound_function(unbound):
        return unbound

    create_bound_method = types.MethodType

    def create_unbound_method(func, cls):
        return func

    Iterator = object
else:
    def get_unbound_function(unbound):
        return unbound.im_func

    def create_bound_method(func, obj):
        return types.MethodType(func, obj, obj.__class__)

    def create_unbound_method(func, cls):
        return types.MethodType(func, None, cls)

    class Iterator(object):

        def next(self):
            return type(self).__next__(self)

    callable = callable
_add_doc(get_unbound_function,
         """Get the function out of a possibly unbound function""")


get_method_function = operator.attrgetter(_meth_func)
get_method_self = operator.attrgetter(_meth_self)
get_function_closure = operator.attrgetter(_func_closure)
get_function_code = operator.attrgetter(_func_code)
get_function_defaults = operator.attrgetter(_func_defaults)
get_function_globals = operator.attrgetter(_func_globals)


if PY3:
    def iterkeys(d, **kw):
        return iter(d.keys(**kw))

    def itervalues(d, **kw):
        return iter(d.values(**kw))

    def iteritems(d, **kw):
        return iter(d.items(**kw))

    def iterlists(d, **kw):
        return iter(d.lists(**kw))

    viewkeys = operator.methodcaller("keys")

    viewvalues = operator.methodcaller("values")

    viewitems = operator.methodcaller("items")
else:
    def iterkeys(d, **kw):
        return d.iterkeys(**kw)

    def itervalues(d, **kw):
        return d.itervalues(**kw)

    def iteritems(d, **kw):
        return d.iteritems(**kw)

    def iterlists(d, **kw):
        return d.iterlists(**kw)

    viewkeys = operator.methodcaller("viewkeys")

    viewvalues = operator.methodcaller("viewvalues")

    viewitems = operator.methodcaller("viewitems")

_add_doc(iterkeys, "Return an iterator over the keys of a dictionary.")
_add_doc(itervalues, "Return an iterator over the values of a dictionary.")
_add_doc(iteritems,
         "Return an iterator over the (key, value) pairs of a dictionary.")
_add_doc(iterlists,
         "Return an iterator over the (key, [values]) pairs of a dictionary.")


if PY3:
    def b(s):
        return s.encode("latin-1")

    def u(s):
        return s
    unichr = chr
    import struct
    int2byte = struct.Struct(">B").pack
    del struct
    byte2int = operator.itemgetter(0)
    indexbytes = operator.getitem
    iterbytes = iter
    import io
    StringIO = io.StringIO
    BytesIO = io.BytesIO
    _assertCountEqual = "assertCountEqual"
    if sys.version_info[1] <= 1:
        _assertRaisesRegex = "assertRaisesRegexp"
        _assertRegex = "assertRegexpMatches"
    else:
        _assertRaisesRegex = "assertRaisesRegex"
        _assertRegex = "assertRegex"
else:
    def b(s):
        return s
    # Workaround for standalone backslash

    def u(s):
        return unicode(s.replace(r'\\', r'\\\\'), "unicode_escape")
    unichr = unichr
    int2byte = chr

    def byte2int(bs):
        return ord(bs[0])

    def indexbytes(buf, i):
        return ord(buf[i])
    iterbytes = functools.partial(itertools.imap, ord)
    import StringIO
    StringIO = BytesIO = StringIO.StringIO
    _assertCountEqual = "assertItemsEqual"
    _assertRaisesRegex = "assertRaisesRegexp"
    _assertRegex = "assertRegexpMatches"
_add_doc(b, """Byte literal""")
_add_doc(u, """Text literal""")


def assertCountEqual(self, *args, **kwargs):
    return getattr(self, _assertCountEqual)(*args, **kwargs)


def assertRaisesRegex(self, *args, **kwargs):
    return getattr(self, _assertRaisesRegex)(*args, **kwargs)


def assertRegex(self, *args, **kwargs):
    return getattr(self, _assertRegex)(*args, **kwargs)


if PY3:
    exec_ = getattr(moves.builtins, "exec")

    def reraise(tp, value, tb=None):
        if value is None:
            value = tp()
        if value.__traceback__ is not tb:
            raise value.with_traceback(tb)
        raise value

else:
    def exec_(_code_, _globs_=None, _locs_=None):
        """Execute code in a namespace."""
        if _globs_ is None:
            frame = sys._getframe(1)
            _globs_ = frame.f_globals
            if _locs_ is None:
                _locs_ = frame.f_locals
            del frame
        elif _locs_ is None:
            _locs_ = _globs_
        exec("""exec _code_ in _globs_, _locs_""")

    exec_("""def reraise(tp, value, tb=None):
    raise tp, value, tb
""")


if sys.version_info[:2] == (3, 2):
    exec_("""def raise_from(value, from_value):
    if from_value is None:
        raise value
    raise value from from_value
""")
elif sys.version_info[:2] > (3, 2):
    exec_("""def raise_from(value, from_value):
    raise value from from_value
""")
else:
    def raise_from(value, from_value):
        raise value
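

# Illustrative sketch, not part of six itself: re-raising an active exception
# with its original traceback via reraise(). Wrapped in a function so nothing
# runs at import time.
def _demo_reraise():
    try:
        try:
            raise KeyError("missing")
        except KeyError:
            # Re-raise the active exception, preserving its traceback.
            reraise(*sys.exc_info())
    except KeyError as exc:
        assert "missing" in str(exc)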


print_ = getattr(moves.builtins, "print", None)
if print_ is None:
    def print_(*args, **kwargs):
        """The new-style print function for Python 2.4 and 2.5."""
        fp = kwargs.pop("file", sys.stdout)
        if fp is None:
            return

        def write(data):
            if not isinstance(data, basestring):
                data = str(data)
            # If the file has an encoding, encode unicode with it.
            if (isinstance(fp, file) and
                    isinstance(data, unicode) and
                    fp.encoding is not None):
                errors = getattr(fp, "errors", None)
                if errors is None:
                    errors = "strict"
                data = data.encode(fp.encoding, errors)
            fp.write(data)
        want_unicode = False
        sep = kwargs.pop("sep", None)
        if sep is not None:
            if isinstance(sep, unicode):
                want_unicode = True
            elif not isinstance(sep, str):
                raise TypeError("sep must be None or a string")
        end = kwargs.pop("end", None)
        if end is not None:
            if isinstance(end, unicode):
                want_unicode = True
            elif not isinstance(end, str):
                raise TypeError("end must be None or a string")
        if kwargs:
            raise TypeError("invalid keyword arguments to print()")
        if not want_unicode:
            for arg in args:
                if isinstance(arg, unicode):
                    want_unicode = True
                    break
        if want_unicode:
            newline = unicode("\n")
            space = unicode(" ")
        else:
            newline = "\n"
            space = " "
        if sep is None:
            sep = space
        if end is None:
            end = newline
        for i, arg in enumerate(args):
            if i:
                write(sep)
            write(arg)
        write(end)
if sys.version_info[:2] < (3, 3):
    _print = print_

    def print_(*args, **kwargs):
        fp = kwargs.get("file", sys.stdout)
        flush = kwargs.pop("flush", False)
        _print(*args, **kwargs)
        if flush and fp is not None:
            fp.flush()

_add_doc(reraise, """Reraise an exception.""")

if sys.version_info[0:2] < (3, 4):
    def wraps(wrapped, assigned=functools.WRAPPER_ASSIGNMENTS,
              updated=functools.WRAPPER_UPDATES):
        def wrapper(f):
            f = functools.wraps(wrapped, assigned, updated)(f)
            f.__wrapped__ = wrapped
            return f
        return wrapper
else:
    wraps = functools.wraps


def with_metaclass(meta, *bases):
    """Create a base class with a metaclass."""
    # This requires a bit of explanation: the basic idea is to make a dummy
    # metaclass for one level of class instantiation that replaces itself with
    # the actual metaclass.
    class metaclass(meta):

        def __new__(cls, name, this_bases, d):
            return meta(name, bases, d)
    return type.__new__(metaclass, 'temporary_class', (), {})
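

# Illustrative sketch, not part of six itself: with_metaclass() lets the same
# class statement work under both Python 2 and Python 3 metaclass syntax. The
# names DemoMeta and DemoBase are made up for the example. Wrapped in a
# function so nothing runs at import time.
def _demo_with_metaclass():
    class DemoMeta(type):
        def __new__(mcls, name, bases, d):
            d["created_by_meta"] = True
            return super(DemoMeta, mcls).__new__(mcls, name, bases, d)

    class DemoBase(with_metaclass(DemoMeta, object)):
        pass

    # The temporary class is replaced by one really created by DemoMeta.
    assert type(DemoBase) is DemoMeta
    assert DemoBase.created_by_meta is True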


def add_metaclass(metaclass):
    """Class decorator for creating a class with a metaclass."""
    def wrapper(cls):
        orig_vars = cls.__dict__.copy()
        slots = orig_vars.get('__slots__')
        if slots is not None:
            if isinstance(slots, str):
                slots = [slots]
            for slots_var in slots:
                orig_vars.pop(slots_var)
        orig_vars.pop('__dict__', None)
        orig_vars.pop('__weakref__', None)
        return metaclass(cls.__name__, cls.__bases__, orig_vars)
    return wrapper
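

# Illustrative sketch, not part of six itself: the decorator form of attaching
# a metaclass. The names TaggingMeta and Plugin are made up for the example.
# Wrapped in a function so nothing runs at import time.
def _demo_add_metaclass():
    class TaggingMeta(type):
        def __new__(mcls, name, bases, d):
            cls = super(TaggingMeta, mcls).__new__(mcls, name, bases, d)
            cls.tag = name.lower()
            return cls

    @add_metaclass(TaggingMeta)
    class Plugin(object):
        pass

    assert type(Plugin) is TaggingMeta
    assert Plugin.tag == "plugin"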


def python_2_unicode_compatible(klass):
    """
    A decorator that defines __unicode__ and __str__ methods under Python 2.
    Under Python 3 it does nothing.

    To support Python 2 and 3 with a single code base, define a __str__ method
    returning text and apply this decorator to the class.
    """
    if PY2:
        if '__str__' not in klass.__dict__:
            raise ValueError("@python_2_unicode_compatible cannot be applied "
                             "to %s because it doesn't define __str__()." %
                             klass.__name__)
        klass.__unicode__ = klass.__str__
        klass.__str__ = lambda self: self.__unicode__().encode('utf-8')
    return klass
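

# Illustrative sketch, not part of six itself: a class that defines only
# __str__ (returning text) and relies on the decorator above for Python 2
# support. The Greeting class is made up for the example. Wrapped in a
# function so nothing runs at import time.
def _demo_python_2_unicode_compatible():
    @python_2_unicode_compatible
    class Greeting(object):
        def __str__(self):
            return u("hello")

    # On Python 3 the decorator is a no-op; on Python 2 it installs
    # __unicode__ and makes __str__ return UTF-8 encoded bytes.
    assert "hello" in "%s" % Greeting()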


# Complete the moves implementation.
# This code is at the end of this module to speed up module loading.
# Turn this module into a package.
__path__ = []  # required for PEP 302 and PEP 451
__package__ = __name__  # see PEP 366 @ReservedAssignment
if globals().get("__spec__") is not None:
    __spec__.submodule_search_locations = []  # PEP 451 @UndefinedVariable
# Remove other six meta path importers, since they cause problems. This can
# happen if six is removed from sys.modules and then reloaded. (Setuptools does
# this for some reason.)
if sys.meta_path:
    for i, importer in enumerate(sys.meta_path):
        # Here's some real nastiness: Another "instance" of the six module might
        # be floating around. Therefore, we can't use isinstance() to check for
        # the six meta path importer, since the other six instance will have
        # inserted an importer with different class.
        if (type(importer).__name__ == "_SixMetaPathImporter" and
                importer.name == __name__):
            del sys.meta_path[i]
            break
    del i, importer
# Finally, add the importer to the meta path import hook.
sys.meta_path.append(_importer)
site-packages/pip/_vendor/cachecontrol/controller.py000064400000031340147511334610016717 0ustar00"""
The httplib2 algorithms ported for use with requests.
"""
import logging
import re
import calendar
import time
from email.utils import parsedate_tz

from pip._vendor.requests.structures import CaseInsensitiveDict

from .cache import DictCache
from .serialize import Serializer


logger = logging.getLogger(__name__)

URI = re.compile(r"^(([^:/?#]+):)?(//([^/?#]*))?([^?#]*)(\?([^#]*))?(#(.*))?")


def parse_uri(uri):
    """Parses a URI using the regex given in Appendix B of RFC 3986.

        (scheme, authority, path, query, fragment) = parse_uri(uri)
    """
    groups = URI.match(uri).groups()
    return (groups[1], groups[3], groups[4], groups[6], groups[8])


class CacheController(object):
    """An interface to see if request should cached or not.
    """
    def __init__(self, cache=None, cache_etags=True, serializer=None):
        self.cache = cache or DictCache()
        self.cache_etags = cache_etags
        self.serializer = serializer or Serializer()

    @classmethod
    def _urlnorm(cls, uri):
        """Normalize the URL to create a safe key for the cache"""
        (scheme, authority, path, query, fragment) = parse_uri(uri)
        if not scheme or not authority:
            raise Exception("Only absolute URIs are allowed. uri = %s" % uri)

        scheme = scheme.lower()
        authority = authority.lower()

        if not path:
            path = "/"

        # Could do syntax based normalization of the URI before
        # computing the digest. See Section 6.2.2 of Std 66.
        request_uri = query and "?".join([path, query]) or path
        defrag_uri = scheme + "://" + authority + request_uri

        return defrag_uri

    @classmethod
    def cache_url(cls, uri):
        return cls._urlnorm(uri)

    def parse_cache_control(self, headers):
        """
        Parse the cache control headers returning a dictionary with values
        for the different directives.
        """
        retval = {}

        cc_header = 'cache-control'
        if 'Cache-Control' in headers:
            cc_header = 'Cache-Control'

        if cc_header in headers:
            parts = headers[cc_header].split(',')
            parts_with_args = [
                tuple([x.strip().lower() for x in part.split("=", 1)])
                for part in parts if -1 != part.find("=")
            ]
            parts_wo_args = [
                (name.strip().lower(), 1)
                for name in parts if -1 == name.find("=")
            ]
            retval = dict(parts_with_args + parts_wo_args)
        return retval

    def cached_request(self, request):
        """
        Return a cached response if it exists in the cache, otherwise
        return False.
        """
        cache_url = self.cache_url(request.url)
        logger.debug('Looking up "%s" in the cache', cache_url)
        cc = self.parse_cache_control(request.headers)

        # Bail out if the request insists on fresh data
        if 'no-cache' in cc:
            logger.debug('Request header has "no-cache", cache bypassed')
            return False

        if 'max-age' in cc and cc['max-age'] == '0':
            logger.debug('Request header has "max-age" as 0, cache bypassed')
            return False

        # Request allows serving from the cache, let's see if we find something
        cache_data = self.cache.get(cache_url)
        if cache_data is None:
            logger.debug('No cache entry available')
            return False

        # Check whether it can be deserialized
        resp = self.serializer.loads(request, cache_data)
        if not resp:
            logger.warning('Cache entry deserialization failed, entry ignored')
            return False

        # If we have a cached 301, return it immediately. We don't
        # need to test our response for other headers b/c it is
        # intrinsically "cacheable" as it is Permanent.
        # See:
        #   https://tools.ietf.org/html/rfc7231#section-6.4.2
        #
        # Client can try to refresh the value by repeating the request
        # with cache busting headers as usual (ie no-cache).
        if resp.status == 301:
            msg = ('Returning cached "301 Moved Permanently" response '
                   '(ignoring date and etag information)')
            logger.debug(msg)
            return resp

        headers = CaseInsensitiveDict(resp.headers)
        if not headers or 'date' not in headers:
            if 'etag' not in headers:
                # Without date or etag, the cached response can never be used
                # and should be deleted.
                logger.debug('Purging cached response: no date or etag')
                self.cache.delete(cache_url)
            logger.debug('Ignoring cached response: no date')
            return False

        now = time.time()
        date = calendar.timegm(
            parsedate_tz(headers['date'])
        )
        current_age = max(0, now - date)
        logger.debug('Current age based on date: %i', current_age)

        # TODO: There is an assumption that the result will be a
        #       urllib3 response object. This may not be best since we
        #       could probably avoid instantiating or constructing the
        #       response until we know we need it.
        resp_cc = self.parse_cache_control(headers)

        # determine freshness
        freshness_lifetime = 0

        # Check the max-age pragma in the cache control header
        if 'max-age' in resp_cc and resp_cc['max-age'].isdigit():
            freshness_lifetime = int(resp_cc['max-age'])
            logger.debug('Freshness lifetime from max-age: %i',
                         freshness_lifetime)

        # If there isn't a max-age, check for an expires header
        elif 'expires' in headers:
            expires = parsedate_tz(headers['expires'])
            if expires is not None:
                expire_time = calendar.timegm(expires) - date
                freshness_lifetime = max(0, expire_time)
                logger.debug("Freshness lifetime from expires: %i",
                             freshness_lifetime)

        # Determine if we are setting a freshness limit in the
        # request. Note that this overrides what was in the response.
        if 'max-age' in cc:
            try:
                freshness_lifetime = int(cc['max-age'])
                logger.debug('Freshness lifetime from request max-age: %i',
                             freshness_lifetime)
            except ValueError:
                freshness_lifetime = 0

        if 'min-fresh' in cc:
            try:
                min_fresh = int(cc['min-fresh'])
            except ValueError:
                min_fresh = 0
            # adjust our current age by our min fresh
            current_age += min_fresh
            logger.debug('Adjusted current age from min-fresh: %i',
                         current_age)

        # Return entry if it is fresh enough
        if freshness_lifetime > current_age:
            logger.debug('The response is "fresh", returning cached response')
            logger.debug('%i > %i', freshness_lifetime, current_age)
            return resp

        # we're not fresh. If we don't have an Etag, clear it out
        if 'etag' not in headers:
            logger.debug(
                'The cached response is "stale" with no etag, purging'
            )
            self.cache.delete(cache_url)

        # Stale and nothing to revalidate with; report a cache miss
        return False

    def conditional_headers(self, request):
        cache_url = self.cache_url(request.url)
        resp = self.serializer.loads(request, self.cache.get(cache_url))
        new_headers = {}

        if resp:
            headers = CaseInsensitiveDict(resp.headers)

            if 'etag' in headers:
                new_headers['If-None-Match'] = headers['ETag']

            if 'last-modified' in headers:
                new_headers['If-Modified-Since'] = headers['Last-Modified']

        return new_headers

    def cache_response(self, request, response, body=None):
        """
        Algorithm for caching requests.

        This assumes a requests Response object.
        """
        # From httplib2: Don't cache 206's since we aren't going to
        #                handle byte range requests
        cacheable_status_codes = [200, 203, 300, 301]
        if response.status not in cacheable_status_codes:
            logger.debug(
                'Status code %s not in %s',
                response.status,
                cacheable_status_codes
            )
            return

        response_headers = CaseInsensitiveDict(response.headers)

        # If we've been given a body, and our response has a Content-Length
        # header, and that Content-Length is a valid integer, then we can
        # check whether the body we've been given matches the expected size;
        # if it doesn't, we'll just skip trying to cache it.
        if (body is not None and
                "content-length" in response_headers and
                response_headers["content-length"].isdigit() and
                int(response_headers["content-length"]) != len(body)):
            return

        cc_req = self.parse_cache_control(request.headers)
        cc = self.parse_cache_control(response_headers)

        cache_url = self.cache_url(request.url)
        logger.debug('Updating cache with response from "%s"', cache_url)

        # Delete it from the cache if we happen to have it stored there
        no_store = False
        if cc.get('no-store'):
            no_store = True
            logger.debug('Response header has "no-store"')
        if cc_req.get('no-store'):
            no_store = True
            logger.debug('Request header has "no-store"')
        if no_store and self.cache.get(cache_url):
            logger.debug('Purging existing cache entry to honor "no-store"')
            self.cache.delete(cache_url)

        # If we've been given an etag, then keep the response
        if self.cache_etags and 'etag' in response_headers:
            logger.debug('Caching due to etag')
            self.cache.set(
                cache_url,
                self.serializer.dumps(request, response, body=body),
            )

        # Add any 301s to the cache. We do this before looking at
        # the Date header.
        elif response.status == 301:
            logger.debug('Caching permanent redirect')
            self.cache.set(
                cache_url,
                self.serializer.dumps(request, response)
            )

        # Add to the cache if the response headers demand it. If there
        # is no date header then we can't do anything about expiring
        # the cache.
        elif 'date' in response_headers:
            # cache when there is a max-age > 0
            if cc and cc.get('max-age'):
                if cc['max-age'].isdigit() and int(cc['max-age']) > 0:
                    logger.debug('Caching b/c date exists and max-age > 0')
                    self.cache.set(
                        cache_url,
                        self.serializer.dumps(request, response, body=body),
                    )

            # If the response can expire, it means we should cache it
            # in the meantime.
            elif 'expires' in response_headers:
                if response_headers['expires']:
                    logger.debug('Caching b/c of expires header')
                    self.cache.set(
                        cache_url,
                        self.serializer.dumps(request, response, body=body),
                    )

    def update_cached_response(self, request, response):
        """On a 304 we will get a new set of headers that we want to
        update our cached value with, assuming we have one.

        This should only ever be called when we've sent an ETag and
        gotten a 304 as the response.
        """
        cache_url = self.cache_url(request.url)

        cached_response = self.serializer.loads(
            request,
            self.cache.get(cache_url)
        )

        if not cached_response:
            # we didn't have a cached response
            return response

        # Let's update our headers with the headers from the new response:
        # http://tools.ietf.org/html/draft-ietf-httpbis-p4-conditional-26#section-4.1
        #
        # The server isn't supposed to send headers that would make
        # the cached body invalid. But... just in case, we'll be sure
        # to strip out ones we know might be problematic due to
        # typical assumptions.
        excluded_headers = [
            "content-length",
        ]

        cached_response.headers.update(
            dict((k, v) for k, v in response.headers.items()
                 if k.lower() not in excluded_headers)
        )

        # we want a 200 b/c we have content via the cache
        cached_response.status = 200

        # update our cache
        self.cache.set(
            cache_url,
            self.serializer.dumps(request, cached_response),
        )

        return cached_response
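

# Illustrative sketch, not part of the vendored module: what the URL
# normalization and Cache-Control parsing above produce. Wrapped in a function
# so nothing runs at import time; the URLs and header values are made up.
def _demo_cache_controller_helpers():
    assert parse_uri("http://example.com/a?x=1#frag") == (
        "http", "example.com", "/a", "x=1", "frag")
    # Scheme and authority are lowercased to build a stable cache key.
    assert CacheController.cache_url("HTTP://Example.COM/a?x=1") == (
        "http://example.com/a?x=1")
    cc = CacheController().parse_cache_control(
        {"Cache-Control": "max-age=300, no-store"})
    # Directives with values keep them as strings; bare directives map to 1.
    assert cc == {"max-age": "300", "no-store": 1}
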
site-packages/pip/_vendor/cachecontrol/serialize.py000064400000014610147511334610016524 0ustar00import base64
import io
import json
import zlib

from pip._vendor.requests.structures import CaseInsensitiveDict

from .compat import HTTPResponse, pickle, text_type


def _b64_encode_bytes(b):
    return base64.b64encode(b).decode("ascii")


def _b64_encode_str(s):
    return _b64_encode_bytes(s.encode("utf8"))


def _b64_encode(s):
    if isinstance(s, text_type):
        return _b64_encode_str(s)
    return _b64_encode_bytes(s)


def _b64_decode_bytes(b):
    return base64.b64decode(b.encode("ascii"))


def _b64_decode_str(s):
    return _b64_decode_bytes(s).decode("utf8")
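

# Illustrative sketch, not part of the vendored module: the helpers above
# round-trip text and raw bytes through base64 for JSON-safe storage. Wrapped
# in a function so nothing runs at import time.
def _demo_b64_round_trip():
    assert _b64_encode_str("hello") == "aGVsbG8="
    assert _b64_decode_str(_b64_encode_str("hello")) == "hello"
    assert _b64_decode_bytes(_b64_encode_bytes(b"\x00\xff")) == b"\x00\xff"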


class Serializer(object):

    def dumps(self, request, response, body=None):
        response_headers = CaseInsensitiveDict(response.headers)

        if body is None:
            body = response.read(decode_content=False)

            # NOTE: 99% sure this is dead code. I'm only leaving it
            #       here b/c I don't have a test yet to prove
            #       it. Basically, before using
            #       `cachecontrol.filewrapper.CallbackFileWrapper`,
            #       this made an effort to reset the file handle. The
            #       `CallbackFileWrapper` short circuits this code by
            #       setting the body as the content is consumed; the
            #       result is that a `body` argument is *always* passed
            #       into cache_response, and in turn,
            #       `Serializer.dumps`.
            response._fp = io.BytesIO(body)

        data = {
            "response": {
                "body": _b64_encode_bytes(body),
                "headers": dict(
                    (_b64_encode(k), _b64_encode(v))
                    for k, v in response.headers.items()
                ),
                "status": response.status,
                "version": response.version,
                "reason": _b64_encode_str(response.reason),
                "strict": response.strict,
                "decode_content": response.decode_content,
            },
        }

        # Construct our vary headers
        data["vary"] = {}
        if "vary" in response_headers:
            varied_headers = response_headers['vary'].split(',')
            for header in varied_headers:
                header = header.strip()
                data["vary"][header] = request.headers.get(header, None)

        # Encode our Vary headers to ensure they can be serialized as JSON
        data["vary"] = dict(
            (_b64_encode(k), _b64_encode(v) if v is not None else v)
            for k, v in data["vary"].items()
        )

        return b",".join([
            b"cc=2",
            zlib.compress(
                json.dumps(
                    data, separators=(",", ":"), sort_keys=True,
                ).encode("utf8"),
            ),
        ])

    def loads(self, request, data):
        # Short circuit if we've been given an empty set of data
        if not data:
            return

        # Determine what version of the serializer the data was serialized
        # with
        try:
            ver, data = data.split(b",", 1)
        except ValueError:
            ver = b"cc=0"

        # Make sure that our "ver" is actually a version and isn't a false
        # positive from a comma being in the data stream.
        if ver[:3] != b"cc=":
            data = ver + data
            ver = b"cc=0"

        # Get the version number out of the cc=N
        ver = ver.split(b"=", 1)[-1].decode("ascii")

        # Dispatch to the actual load method for the given version
        try:
            return getattr(self, "_loads_v{0}".format(ver))(request, data)
        except AttributeError:
            # This is a version we don't have a loads function for, so we'll
            # just treat it as a miss and return None
            return

    def prepare_response(self, request, cached):
        """Verify our vary headers match and construct a real urllib3
        HTTPResponse object.
        """
        # Special case the '*' Vary value as it means we cannot actually
        # determine if the cached response is suitable for this request.
        if "*" in cached.get("vary", {}):
            return

        # Ensure that the Vary headers for the cached response match our
        # request
        for header, value in cached.get("vary", {}).items():
            if request.headers.get(header, None) != value:
                return

        body_raw = cached["response"].pop("body")

        headers = CaseInsensitiveDict(data=cached['response']['headers'])
        if headers.get('transfer-encoding', '') == 'chunked':
            headers.pop('transfer-encoding')

        cached['response']['headers'] = headers

        try:
            body = io.BytesIO(body_raw)
        except TypeError:
            # This can happen if cachecontrol serialized to v1 format (pickle)
            # using Python 2. A Python 2 str(byte string) will be unpickled as
            # a Python 3 str (unicode string), which will cause the above to
            # fail with:
            #
            #     TypeError: 'str' does not support the buffer interface
            body = io.BytesIO(body_raw.encode('utf8'))

        return HTTPResponse(
            body=body,
            preload_content=False,
            **cached["response"]
        )

    def _loads_v0(self, request, data):
        # The original legacy cache data. This doesn't contain enough
        # information to construct everything we need, so we'll treat this as
        # a miss.
        return

    def _loads_v1(self, request, data):
        try:
            cached = pickle.loads(data)
        except ValueError:
            return

        return self.prepare_response(request, cached)

    def _loads_v2(self, request, data):
        try:
            cached = json.loads(zlib.decompress(data).decode("utf8"))
        except ValueError:
            return

        # We need to decode the items that we've base64 encoded
        cached["response"]["body"] = _b64_decode_bytes(
            cached["response"]["body"]
        )
        cached["response"]["headers"] = dict(
            (_b64_decode_str(k), _b64_decode_str(v))
            for k, v in cached["response"]["headers"].items()
        )
        cached["response"]["reason"] = _b64_decode_str(
            cached["response"]["reason"],
        )
        cached["vary"] = dict(
            (_b64_decode_str(k), _b64_decode_str(v) if v is not None else v)
            for k, v in cached["vary"].items()
        )

        return self.prepare_response(request, cached)
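

# Illustrative sketch, not part of the vendored module: how loads() dispatches
# on the "cc=N" version prefix. Unknown or legacy formats are treated as cache
# misses. Wrapped in a function so nothing runs at import time.
def _demo_serializer_version_dispatch():
    s = Serializer()
    assert s.loads(None, b"") is None                     # empty data: miss
    assert s.loads(None, b"cc=9,future-format") is None   # unknown version: miss
    assert s.loads(None, b"legacy-pickled-data") is None  # v0 legacy data: miss
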
site-packages/pip/_vendor/cachecontrol/adapter.py000064400000011000147511334610016143 0ustar00import types
import functools

from pip._vendor.requests.adapters import HTTPAdapter

from .controller import CacheController
from .cache import DictCache
from .filewrapper import CallbackFileWrapper


class CacheControlAdapter(HTTPAdapter):
    invalidating_methods = set(['PUT', 'DELETE'])

    def __init__(self, cache=None,
                 cache_etags=True,
                 controller_class=None,
                 serializer=None,
                 heuristic=None,
                 *args, **kw):
        super(CacheControlAdapter, self).__init__(*args, **kw)
        self.cache = cache or DictCache()
        self.heuristic = heuristic

        controller_factory = controller_class or CacheController
        self.controller = controller_factory(
            self.cache,
            cache_etags=cache_etags,
            serializer=serializer,
        )

    def send(self, request, **kw):
        """
        Send a request. Use the request information to see if it
        exists in the cache and cache the response if we need to and can.
        """
        if request.method == 'GET':
            cached_response = self.controller.cached_request(request)
            if cached_response:
                return self.build_response(request, cached_response,
                                           from_cache=True)

            # check for etags and add headers if appropriate
            request.headers.update(
                self.controller.conditional_headers(request)
            )

        resp = super(CacheControlAdapter, self).send(request, **kw)

        return resp

    def build_response(self, request, response, from_cache=False):
        """
        Build a response by making a request or using the cache.

        This will end up calling send and returning a potentially
        cached response
        """
        if not from_cache and request.method == 'GET':
            # Check for any heuristics that might update headers
            # before trying to cache.
            if self.heuristic:
                response = self.heuristic.apply(response)

            # apply any expiration heuristics
            if response.status == 304:
                # We must have sent an ETag request. This could mean
                # that our cached entry has expired already or that we
                # simply have an etag. In either case, we want to try
                # to update the cached entry with the new headers.
                cached_response = self.controller.update_cached_response(
                    request, response
                )

                if cached_response is not response:
                    from_cache = True

                # We are done with the server response, read a
                # possible response body (compliant servers will
                # not return one, but we cannot be 100% sure) and
                # release the connection back to the pool.
                response.read(decode_content=False)
                response.release_conn()

                response = cached_response

            # We always cache the 301 responses
            elif response.status == 301:
                self.controller.cache_response(request, response)
            else:
                # Wrap the response file with a wrapper that will cache the
                #   response when the stream has been consumed.
                response._fp = CallbackFileWrapper(
                    response._fp,
                    functools.partial(
                        self.controller.cache_response,
                        request,
                        response,
                    )
                )
                if response.chunked:
                    super_update_chunk_length = response._update_chunk_length

                    def _update_chunk_length(self):
                        super_update_chunk_length()
                        if self.chunk_left == 0:
                            self._fp._close()
                    response._update_chunk_length = types.MethodType(_update_chunk_length, response)

        resp = super(CacheControlAdapter, self).build_response(
            request, response
        )

        # See if we should invalidate the cache.
        if request.method in self.invalidating_methods and resp.ok:
            cache_url = self.controller.cache_url(request.url)
            self.cache.delete(cache_url)

        # Give the request a from_cache attr to let people use it
        resp.from_cache = from_cache

        return resp

    def close(self):
        self.cache.close()
        super(CacheControlAdapter, self).close()
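

# Illustrative sketch, not part of the vendored module: the adapter is meant
# to be mounted on a requests Session so GET responses flow through the cache
# controller above. Wrapped in a function so nothing runs at import time.
def _demo_mount_adapter():
    from pip._vendor import requests

    sess = requests.Session()
    adapter = CacheControlAdapter(cache=DictCache(), cache_etags=True)
    sess.mount("http://", adapter)
    sess.mount("https://", adapter)
    # Responses returned through `sess` gain a `from_cache` attribute.
    return sess
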
site-packages/pip/_vendor/cachecontrol/caches/__pycache__/ [compiled CPython 3.6 bytecode (*.cpython-36.pyc and *.cpython-36.opt-1.pyc for __init__, file_cache and redis_cache); binary content not reproduced]
site-packages/pip/_vendor/cachecontrol/caches/__init__.py000064400000000561147511334610017522 0ustar00from textwrap import dedent

try:
    from .file_cache import FileCache
except ImportError:
    notice = dedent('''
    NOTE: In order to use the FileCache you must have
    lockfile installed. You can install it via pip:
      pip install lockfile
    ''')
    print(notice)


try:
    import redis
    from .redis_cache import RedisCache
except ImportError:
    pass
site-packages/pip/_vendor/cachecontrol/caches/redis_cache.py000064400000001715147511334610020216 0ustar00from __future__ import division

from datetime import datetime


def total_seconds(td):
    """Python 2.6 compatability"""
    if hasattr(td, 'total_seconds'):
        return td.total_seconds()

    ms = td.microseconds
    secs = (td.seconds + td.days * 24 * 3600)
    return (ms + secs * 10**6) / 10**6
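

# Illustrative sketch, not part of the vendored module: total_seconds() mirrors
# timedelta.total_seconds() for interpreters that lack it. Wrapped in a
# function so nothing runs at import time.
def _demo_total_seconds():
    from datetime import timedelta

    assert total_seconds(timedelta(minutes=2)) == 120
    assert total_seconds(timedelta(days=1)) == 86400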


class RedisCache(object):

    def __init__(self, conn):
        self.conn = conn

    def get(self, key):
        return self.conn.get(key)

    def set(self, key, value, expires=None):
        if not expires:
            self.conn.set(key, value)
        else:
            expires = expires - datetime.now()
            self.conn.setex(key, total_seconds(expires), value)

    def delete(self, key):
        self.conn.delete(key)

    def clear(self):
        """Helper for clearing all the keys in a database. Use with
        caution!"""
        for key in self.conn.keys():
            self.conn.delete(key)

    def close(self):
        self.conn.disconnect()
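

# Illustrative sketch, not part of the vendored module: wiring RedisCache to a
# redis-py client. The host, port and db values are placeholders, redis is an
# optional dependency, and the calls assume a reachable server. Wrapped in a
# function so nothing runs at import time.
def _demo_redis_cache():
    import redis

    pool = redis.ConnectionPool(host="localhost", port=6379, db=0)
    cache = RedisCache(redis.Redis(connection_pool=pool))
    cache.set("http://example.com/", b"cached-bytes")
    assert cache.get("http://example.com/") == b"cached-bytes"
    cache.delete("http://example.com/")
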
site-packages/pip/_vendor/cachecontrol/caches/file_cache.py000064400000006714147511334610020033 0ustar00import hashlib
import os

from pip._vendor.lockfile import LockFile
from pip._vendor.lockfile.mkdirlockfile import MkdirLockFile

from ..cache import BaseCache
from ..controller import CacheController


def _secure_open_write(filename, fmode):
    # We only want to write to this file, so open it in write only mode
    flags = os.O_WRONLY

    # os.O_CREAT | os.O_EXCL will fail if the file already exists, so we only
    #  will open *new* files.
    # We specify this because we want to ensure that the mode we pass is the
    # mode of the file.
    flags |= os.O_CREAT | os.O_EXCL

    # Do not follow symlinks to prevent someone from making a symlink that
    # we follow and insecurely open a cache file.
    if hasattr(os, "O_NOFOLLOW"):
        flags |= os.O_NOFOLLOW

    # On Windows we'll mark this file as binary
    if hasattr(os, "O_BINARY"):
        flags |= os.O_BINARY

    # Before we open our file, we want to delete any existing file that is
    # there
    try:
        os.remove(filename)
    except (IOError, OSError):
        # The file must not exist already, so we can just skip ahead to opening
        pass

    # Open our file, the use of os.O_CREAT | os.O_EXCL will ensure that if a
    # race condition happens between the os.remove and this line, that an
    # error will be raised. Because we utilize a lockfile this should only
    # happen if someone is attempting to attack us.
    fd = os.open(filename, flags, fmode)
    try:
        return os.fdopen(fd, "wb")
    except:
        # An error occurred wrapping our FD in a file object
        os.close(fd)
        raise


class FileCache(BaseCache):
    def __init__(self, directory, forever=False, filemode=0o0600,
                 dirmode=0o0700, use_dir_lock=None, lock_class=None):

        if use_dir_lock is not None and lock_class is not None:
            raise ValueError("Cannot use use_dir_lock and lock_class together")

        if use_dir_lock:
            lock_class = MkdirLockFile

        if lock_class is None:
            lock_class = LockFile

        self.directory = directory
        self.forever = forever
        self.filemode = filemode
        self.dirmode = dirmode
        self.lock_class = lock_class


    @staticmethod
    def encode(x):
        return hashlib.sha224(x.encode()).hexdigest()

    def _fn(self, name):
        # NOTE: This method should not change as some may depend on it.
        #       See: https://github.com/ionrock/cachecontrol/issues/63
        hashed = self.encode(name)
        parts = list(hashed[:5]) + [hashed]
        return os.path.join(self.directory, *parts)

    def get(self, key):
        name = self._fn(key)
        if not os.path.exists(name):
            return None

        with open(name, 'rb') as fh:
            return fh.read()

    def set(self, key, value):
        name = self._fn(key)

        # Make sure the directory exists
        try:
            os.makedirs(os.path.dirname(name), self.dirmode)
        except (IOError, OSError):
            pass

        with self.lock_class(name) as lock:
            # Write our actual file
            with _secure_open_write(lock.path, self.filemode) as fh:
                fh.write(value)

    def delete(self, key):
        name = self._fn(key)
        if not self.forever:
            os.remove(name)


def url_to_file_path(url, filecache):
    """Return the file cache path based on the URL.

    This does not ensure the file exists!
    """
    key = CacheController.cache_url(url)
    return filecache._fn(key)
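
A minimal usage sketch; the cache directory and URL are placeholders.

from pip._vendor import requests
from pip._vendor.cachecontrol import CacheControl
from pip._vendor.cachecontrol.caches.file_cache import FileCache, url_to_file_path

filecache = FileCache('.web_cache')                  # placeholder directory
sess = CacheControl(requests.Session(), cache=filecache)
sess.get('https://example.com/')                     # placeholder URL

# Keys are SHA-224 hashes sharded into five single-character subdirectories.
print(url_to_file_path('https://example.com/', filecache))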
site-packages/pip/_vendor/cachecontrol/__init__.py000064400000000456147511334610016277 0ustar00"""CacheControl import Interface.

Make it easy to import from cachecontrol without long namespaces.
"""
__author__ = 'Eric Larson'
__email__ = 'eric@ionrock.org'
__version__ = '0.11.7'

from .wrapper import CacheControl
from .adapter import CacheControlAdapter
from .controller import CacheController
site-packages/pip/_vendor/cachecontrol/cache.py000064400000001426147511334610015601 0ustar00"""
The cache object API for implementing caches. The default is a thread
safe in-memory dictionary.
"""
from threading import Lock


class BaseCache(object):

    def get(self, key):
        raise NotImplementedError()

    def set(self, key, value):
        raise NotImplementedError()

    def delete(self, key):
        raise NotImplementedError()

    def close(self):
        pass


class DictCache(BaseCache):

    def __init__(self, init_dict=None):
        self.lock = Lock()
        self.data = init_dict or {}

    def get(self, key):
        return self.data.get(key, None)

    def set(self, key, value):
        with self.lock:
            self.data.update({key: value})

    def delete(self, key):
        with self.lock:
            if key in self.data:
                self.data.pop(key)
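
A minimal sketch of the cache interface using the default in-memory backend; the key and value are placeholders.

from pip._vendor.cachecontrol.cache import DictCache

cache = DictCache()
cache.set('key', b'serialized response')     # updates the dict under the lock
assert cache.get('key') == b'serialized response'
cache.delete('key')
assert cache.get('key') is None              # missing keys return None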
site-packages/pip/_vendor/cachecontrol/_cmd.py000064400000002450147511334610015436 0ustar00import logging

from pip._vendor import requests

from pip._vendor.cachecontrol.adapter import CacheControlAdapter
from pip._vendor.cachecontrol.cache import DictCache
from pip._vendor.cachecontrol.controller import logger

from argparse import ArgumentParser


def setup_logging():
    logger.setLevel(logging.DEBUG)
    handler = logging.StreamHandler()
    logger.addHandler(handler)


def get_session():
    adapter = CacheControlAdapter(
        DictCache(),
        cache_etags=True,
        serializer=None,
        heuristic=None,
    )
    sess = requests.Session()
    sess.mount('http://', adapter)
    sess.mount('https://', adapter)

    sess.cache_controller = adapter.controller
    return sess


def get_args():
    parser = ArgumentParser()
    parser.add_argument('url', help='The URL to try and cache')
    return parser.parse_args()


def main(args=None):
    args = get_args()
    sess = get_session()

    # Make a request to get a response
    resp = sess.get(args.url)

    # Turn on logging
    setup_logging()

    # try setting the cache
    sess.cache_controller.cache_response(resp.request, resp.raw)

    # Now try to get it
    if sess.cache_controller.cached_request(resp.request):
        print('Cached!')
    else:
        print('Not cached :(')


if __name__ == '__main__':
    main()
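
A minimal sketch of driving the same helpers programmatically instead of via argparse; the URL is a placeholder.

from pip._vendor.cachecontrol._cmd import get_session, setup_logging

setup_logging()
sess = get_session()
resp = sess.get('https://example.com/')      # placeholder URL
sess.cache_controller.cache_response(resp.request, resp.raw)
print(bool(sess.cache_controller.cached_request(resp.request)))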
site-packages/pip/_vendor/cachecontrol/filewrapper.py000064400000004743147511334610017063 0ustar00from io import BytesIO


class CallbackFileWrapper(object):
    """
    Small wrapper around a fp object which will tee everything read into a
    buffer, and when that file is closed it will execute a callback with the
    contents of that buffer.

    All attributes are proxied to the underlying file object.

    This class uses members with a double underscore (__) leading prefix so as
    not to accidentally shadow an attribute.
    """

    def __init__(self, fp, callback):
        self.__buf = BytesIO()
        self.__fp = fp
        self.__callback = callback

    def __getattr__(self, name):
        # The vagaries of garbage collection mean that self.__fp is
        # not always set. Using __getattribute__ with the mangled
        # private name[0] lets us look the attribute up and raise an
        # AttributeError when it doesn't exist. This stops things from
        # infinitely recursing into getattr in the case where
        # self.__fp hasn't been set.
        #
        # [0] https://docs.python.org/2/reference/expressions.html#atom-identifiers
        fp = self.__getattribute__('_CallbackFileWrapper__fp')
        return getattr(fp, name)

    def __is_fp_closed(self):
        try:
            return self.__fp.fp is None
        except AttributeError:
            pass

        try:
            return self.__fp.closed
        except AttributeError:
            pass

        # We just don't cache it then.
        # TODO: Add some logging here...
        return False

    def _close(self):
        if self.__callback:
            self.__callback(self.__buf.getvalue())

        # We assign this to None here, because otherwise we can get into
        # really tricky problems where the CPython interpreter deadlocks
        # because the callback is holding a reference to something which
        # has a __del__ method. Setting this to None breaks the cycle
        # and allows the garbage collector to do its thing normally.
        self.__callback = None

    def read(self, amt=None):
        data = self.__fp.read(amt)
        self.__buf.write(data)
        if self.__is_fp_closed():
            self._close()

        return data

    def _safe_read(self, amt):
        data = self.__fp._safe_read(amt)
        if amt == 2 and data == b'\r\n':
            # urllib executes this read to toss the CRLF at the end
            # of the chunk.
            return data

        self.__buf.write(data)
        if self.__is_fp_closed():
            self._close()

        return data
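
A minimal sketch of the tee-and-callback behaviour; FakeHTTPResponse is a hypothetical stand-in for the httplib response object urllib3 normally supplies.

from io import BytesIO
from pip._vendor.cachecontrol.filewrapper import CallbackFileWrapper

class FakeHTTPResponse(object):
    """Illustrative object exposing the .fp/.read shape the wrapper probes."""
    def __init__(self, data):
        self.fp = BytesIO(data)

    def read(self, amt=None):
        data = self.fp.read(amt)
        if not data:
            self.fp = None        # httplib signals EOF by dropping .fp
        return data

wrapped = CallbackFileWrapper(FakeHTTPResponse(b'cached body'),
                              lambda buf: print('callback got:', buf))
while wrapped.read(4):
    pass                          # the callback fires once the body is exhausted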
site-packages/pip/_vendor/cachecontrol/__pycache__/ [compiled CPython 3.6 bytecode omitted; not meaningful as text: adapter, cache, compat, controller, filewrapper, heuristics, serialize, wrapper, _cmd and __init__, each present as both .cpython-36.pyc and .cpython-36.opt-1.pyc]
site-packages/pip/_vendor/cachecontrol/wrapper.py000064400000000762147511334610016220 0ustar00from .adapter import CacheControlAdapter
from .cache import DictCache


def CacheControl(sess,
                 cache=None,
                 cache_etags=True,
                 serializer=None,
                 heuristic=None):

    cache = cache or DictCache()
    adapter = CacheControlAdapter(
        cache,
        cache_etags=cache_etags,
        serializer=serializer,
        heuristic=heuristic,
    )
    sess.mount('http://', adapter)
    sess.mount('https://', adapter)

    return sess
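
A minimal usage sketch; by default the adapter is backed by the in-memory DictCache, and the URL is a placeholder.

from pip._vendor import requests
from pip._vendor.cachecontrol import CacheControl

sess = CacheControl(requests.Session())
first = sess.get('https://example.com/')     # placeholder URL, hits the network
second = sess.get('https://example.com/')    # may be served from the cache
print(getattr(second, 'from_cache', False))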
site-packages/pip/_vendor/cachecontrol/heuristics.py000064400000010055147511334610016716 0ustar00import calendar
import time

from email.utils import formatdate, parsedate, parsedate_tz

from datetime import datetime, timedelta

TIME_FMT = "%a, %d %b %Y %H:%M:%S GMT"


def expire_after(delta, date=None):
    date = date or datetime.now()
    return date + delta


def datetime_to_header(dt):
    return formatdate(calendar.timegm(dt.timetuple()))


class BaseHeuristic(object):

    def warning(self, response):
        """
        Return a valid 1xx warning header value describing the cache
        adjustments.

        The response is provided to allow warnings like 113
        http://tools.ietf.org/html/rfc7234#section-5.5.4 where we need
        to explicitly say the response is over 24 hours old.
        """
        return '110 - "Response is Stale"'

    def update_headers(self, response):
        """Update the response headers with any new headers.

        NOTE: This SHOULD always include some Warning header to
              signify that the response was cached by the client, not
              by way of the provided headers.
        """
        return {}

    def apply(self, response):
        updated_headers = self.update_headers(response)

        if updated_headers:
            response.headers.update(updated_headers)
            warning_header_value = self.warning(response)
            if warning_header_value is not None:
                response.headers.update({'Warning': warning_header_value})

        return response


class OneDayCache(BaseHeuristic):
    """
    Cache the response by providing an expires 1 day in the
    future.
    """
    def update_headers(self, response):
        headers = {}

        if 'expires' not in response.headers:
            date = parsedate(response.headers['date'])
            expires = expire_after(timedelta(days=1),
                                   date=datetime(*date[:6]))
            headers['expires'] = datetime_to_header(expires)
            headers['cache-control'] = 'public'
        return headers


class ExpiresAfter(BaseHeuristic):
    """
    Cache **all** requests for a defined time period.
    """

    def __init__(self, **kw):
        self.delta = timedelta(**kw)

    def update_headers(self, response):
        expires = expire_after(self.delta)
        return {
            'expires': datetime_to_header(expires),
            'cache-control': 'public',
        }

    def warning(self, response):
        tmpl = '110 - Automatically cached for %s. Response might be stale'
        return tmpl % self.delta


class LastModified(BaseHeuristic):
    """
    If there is no Expires header already, fall back on Last-Modified
    using the heuristic from
    http://tools.ietf.org/html/rfc7234#section-4.2.2
    to calculate a reasonable value.

    Firefox also does something like this per
    https://developer.mozilla.org/en-US/docs/Web/HTTP/Caching_FAQ
    http://lxr.mozilla.org/mozilla-release/source/netwerk/protocol/http/nsHttpResponseHead.cpp#397
    Unlike Mozilla, we limit this to 24 hours.
    """
    cacheable_by_default_statuses = set([
        200, 203, 204, 206, 300, 301, 404, 405, 410, 414, 501
    ])

    def update_headers(self, resp):
        headers = resp.headers

        if 'expires' in headers:
            return {}

        if 'cache-control' in headers and headers['cache-control'] != 'public':
            return {}

        if resp.status not in self.cacheable_by_default_statuses:
            return {}

        if 'date' not in headers or 'last-modified' not in headers:
            return {}

        date = calendar.timegm(parsedate_tz(headers['date']))
        last_modified = parsedate(headers['last-modified'])
        if date is None or last_modified is None:
            return {}

        now = time.time()
        current_age = max(0, now - date)
        delta = date - calendar.timegm(last_modified)
        freshness_lifetime = max(0, min(delta / 10, 24 * 3600))
        if freshness_lifetime <= current_age:
            return {}

        expires = date + freshness_lifetime
        return {'expires': time.strftime(TIME_FMT, time.gmtime(expires))}

    def warning(self, resp):
        return None
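
A minimal sketch of attaching one of these heuristics so responses without explicit caching headers still get cached for a while; the one-hour figure and URL are arbitrary.

from pip._vendor import requests
from pip._vendor.cachecontrol import CacheControl
from pip._vendor.cachecontrol.heuristics import ExpiresAfter

# ExpiresAfter stamps every response with expires/cache-control plus a Warning.
sess = CacheControl(requests.Session(), heuristic=ExpiresAfter(hours=1))
sess.get('https://example.com/')             # placeholder URL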
site-packages/pip/_vendor/cachecontrol/compat.py000064400000000574147511334610016024 0ustar00try:
    from urllib.parse import urljoin
except ImportError:
    from urlparse import urljoin


try:
    import cPickle as pickle
except ImportError:
    import pickle


from pip._vendor.urllib3.response import HTTPResponse
from pip._vendor.urllib3.util import is_fp_closed

# Replicate some six behaviour
try:
    text_type = (unicode,)
except NameError:
    text_type = (str,)
site-packages/pip/_vendor/retrying.py000064400000023364147511334610013742 0ustar00## Copyright 2013-2014 Ray Holder
##
## Licensed under the Apache License, Version 2.0 (the "License");
## you may not use this file except in compliance with the License.
## You may obtain a copy of the License at
##
## http://www.apache.org/licenses/LICENSE-2.0
##
## Unless required by applicable law or agreed to in writing, software
## distributed under the License is distributed on an "AS IS" BASIS,
## WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
## See the License for the specific language governing permissions and
## limitations under the License.

import random
from pip._vendor import six
import sys
import time
import traceback


# sys.maxint / 2, since Python 3.2 doesn't have a sys.maxint...
MAX_WAIT = 1073741823


def retry(*dargs, **dkw):
    """
    Decorator function that instantiates the Retrying object
    @param *dargs: positional arguments passed to Retrying object
    @param **dkw: keyword arguments passed to the Retrying object
    """
    # support both @retry and @retry() as valid syntax
    if len(dargs) == 1 and callable(dargs[0]):
        def wrap_simple(f):

            @six.wraps(f)
            def wrapped_f(*args, **kw):
                return Retrying().call(f, *args, **kw)

            return wrapped_f

        return wrap_simple(dargs[0])

    else:
        def wrap(f):

            @six.wraps(f)
            def wrapped_f(*args, **kw):
                return Retrying(*dargs, **dkw).call(f, *args, **kw)

            return wrapped_f

        return wrap


class Retrying(object):

    def __init__(self,
                 stop=None, wait=None,
                 stop_max_attempt_number=None,
                 stop_max_delay=None,
                 wait_fixed=None,
                 wait_random_min=None, wait_random_max=None,
                 wait_incrementing_start=None, wait_incrementing_increment=None,
                 wait_exponential_multiplier=None, wait_exponential_max=None,
                 retry_on_exception=None,
                 retry_on_result=None,
                 wrap_exception=False,
                 stop_func=None,
                 wait_func=None,
                 wait_jitter_max=None):

        self._stop_max_attempt_number = 5 if stop_max_attempt_number is None else stop_max_attempt_number
        self._stop_max_delay = 100 if stop_max_delay is None else stop_max_delay
        self._wait_fixed = 1000 if wait_fixed is None else wait_fixed
        self._wait_random_min = 0 if wait_random_min is None else wait_random_min
        self._wait_random_max = 1000 if wait_random_max is None else wait_random_max
        self._wait_incrementing_start = 0 if wait_incrementing_start is None else wait_incrementing_start
        self._wait_incrementing_increment = 100 if wait_incrementing_increment is None else wait_incrementing_increment
        self._wait_exponential_multiplier = 1 if wait_exponential_multiplier is None else wait_exponential_multiplier
        self._wait_exponential_max = MAX_WAIT if wait_exponential_max is None else wait_exponential_max
        self._wait_jitter_max = 0 if wait_jitter_max is None else wait_jitter_max

        # TODO add chaining of stop behaviors
        # stop behavior
        stop_funcs = []
        if stop_max_attempt_number is not None:
            stop_funcs.append(self.stop_after_attempt)

        if stop_max_delay is not None:
            stop_funcs.append(self.stop_after_delay)

        if stop_func is not None:
            self.stop = stop_func

        elif stop is None:
            self.stop = lambda attempts, delay: any(f(attempts, delay) for f in stop_funcs)

        else:
            self.stop = getattr(self, stop)

        # TODO add chaining of wait behaviors
        # wait behavior
        wait_funcs = [lambda *args, **kwargs: 0]
        if wait_fixed is not None:
            wait_funcs.append(self.fixed_sleep)

        if wait_random_min is not None or wait_random_max is not None:
            wait_funcs.append(self.random_sleep)

        if wait_incrementing_start is not None or wait_incrementing_increment is not None:
            wait_funcs.append(self.incrementing_sleep)

        if wait_exponential_multiplier is not None or wait_exponential_max is not None:
            wait_funcs.append(self.exponential_sleep)

        if wait_func is not None:
            self.wait = wait_func

        elif wait is None:
            self.wait = lambda attempts, delay: max(f(attempts, delay) for f in wait_funcs)

        else:
            self.wait = getattr(self, wait)

        # retry on exception filter
        if retry_on_exception is None:
            self._retry_on_exception = self.always_reject
        else:
            self._retry_on_exception = retry_on_exception

        # TODO simplify retrying by Exception types
        # retry on result filter
        if retry_on_result is None:
            self._retry_on_result = self.never_reject
        else:
            self._retry_on_result = retry_on_result

        self._wrap_exception = wrap_exception

    def stop_after_attempt(self, previous_attempt_number, delay_since_first_attempt_ms):
        """Stop after the previous attempt >= stop_max_attempt_number."""
        return previous_attempt_number >= self._stop_max_attempt_number

    def stop_after_delay(self, previous_attempt_number, delay_since_first_attempt_ms):
        """Stop after the time from the first attempt >= stop_max_delay."""
        return delay_since_first_attempt_ms >= self._stop_max_delay

    def no_sleep(self, previous_attempt_number, delay_since_first_attempt_ms):
        """Don't sleep at all before retrying."""
        return 0

    def fixed_sleep(self, previous_attempt_number, delay_since_first_attempt_ms):
        """Sleep a fixed amount of time between each retry."""
        return self._wait_fixed

    def random_sleep(self, previous_attempt_number, delay_since_first_attempt_ms):
        """Sleep a random amount of time between wait_random_min and wait_random_max"""
        return random.randint(self._wait_random_min, self._wait_random_max)

    def incrementing_sleep(self, previous_attempt_number, delay_since_first_attempt_ms):
        """
        Sleep an incremental amount of time after each attempt, starting at
        wait_incrementing_start and incrementing by wait_incrementing_increment
        """
        result = self._wait_incrementing_start + (self._wait_incrementing_increment * (previous_attempt_number - 1))
        if result < 0:
            result = 0
        return result

    def exponential_sleep(self, previous_attempt_number, delay_since_first_attempt_ms):
        exp = 2 ** previous_attempt_number
        result = self._wait_exponential_multiplier * exp
        if result > self._wait_exponential_max:
            result = self._wait_exponential_max
        if result < 0:
            result = 0
        return result

    def never_reject(self, result):
        return False

    def always_reject(self, result):
        return True

    def should_reject(self, attempt):
        reject = False
        if attempt.has_exception:
            reject |= self._retry_on_exception(attempt.value[1])
        else:
            reject |= self._retry_on_result(attempt.value)

        return reject

    def call(self, fn, *args, **kwargs):
        start_time = int(round(time.time() * 1000))
        attempt_number = 1
        while True:
            try:
                attempt = Attempt(fn(*args, **kwargs), attempt_number, False)
            except:
                tb = sys.exc_info()
                attempt = Attempt(tb, attempt_number, True)

            if not self.should_reject(attempt):
                return attempt.get(self._wrap_exception)

            delay_since_first_attempt_ms = int(round(time.time() * 1000)) - start_time
            if self.stop(attempt_number, delay_since_first_attempt_ms):
                if not self._wrap_exception and attempt.has_exception:
                    # get() on an attempt with an exception should cause it to be raised, but raise just in case
                    raise attempt.get()
                else:
                    raise RetryError(attempt)
            else:
                sleep = self.wait(attempt_number, delay_since_first_attempt_ms)
                if self._wait_jitter_max:
                    jitter = random.random() * self._wait_jitter_max
                    sleep = sleep + max(0, jitter)
                time.sleep(sleep / 1000.0)

            attempt_number += 1


class Attempt(object):
    """
    An Attempt encapsulates a call to a target function that may end as a
    normal return value from the function or an Exception depending on what
    occurred during the execution.
    """

    def __init__(self, value, attempt_number, has_exception):
        self.value = value
        self.attempt_number = attempt_number
        self.has_exception = has_exception

    def get(self, wrap_exception=False):
        """
        Return the return value of this Attempt instance or raise an Exception.
        If wrap_exception is true, this Attempt is wrapped inside of a
        RetryError before being raised.
        """
        if self.has_exception:
            if wrap_exception:
                raise RetryError(self)
            else:
                six.reraise(self.value[0], self.value[1], self.value[2])
        else:
            return self.value

    def __repr__(self):
        if self.has_exception:
            return "Attempts: {0}, Error:\n{1}".format(self.attempt_number, "".join(traceback.format_tb(self.value[2])))
        else:
            return "Attempts: {0}, Value: {1}".format(self.attempt_number, self.value)


class RetryError(Exception):
    """
    A RetryError encapsulates the last Attempt instance right before giving up.
    """

    def __init__(self, last_attempt):
        self.last_attempt = last_attempt

    def __str__(self):
        return "RetryError[{0}]".format(self.last_attempt)
site-packages/pip/_vendor/requests/__version__.py
# .-. .-. .-. . . .-. .-. .-. .-.
# |(  |-  |.| | | |-  `-.  |  `-.
# ' ' `-' `-`.`-' `-' `-'  '  `-'

__title__ = 'requests'
__description__ = 'Python HTTP for Humans.'
__url__ = 'http://python-requests.org'
__version__ = '2.18.4'
__build__ = 0x021804
__author__ = 'Kenneth Reitz'
__author_email__ = 'me@kennethreitz.org'
__license__ = 'Apache 2.0'
__copyright__ = 'Copyright 2017 Kenneth Reitz'
__cake__ = u'\u2728 \U0001f370 \u2728'
site-packages/pip/_vendor/requests/compat.py
# -*- coding: utf-8 -*-

"""
requests.compat
~~~~~~~~~~~~~~~

This module handles import compatibility issues between Python 2 and
Python 3.
"""

from pip._vendor import chardet

import sys

# -------
# Pythons
# -------

# Syntax sugar.
_ver = sys.version_info

#: Python 2.x?
is_py2 = (_ver[0] == 2)

#: Python 3.x?
is_py3 = (_ver[0] == 3)

# try:
#     import simplejson as json
# except ImportError:
import json

# ---------
# Specifics
# ---------

if is_py2:
    from urllib import (
        quote, unquote, quote_plus, unquote_plus, urlencode, getproxies,
        proxy_bypass, proxy_bypass_environment, getproxies_environment)
    from urlparse import urlparse, urlunparse, urljoin, urlsplit, urldefrag
    from urllib2 import parse_http_list
    import cookielib
    from Cookie import Morsel
    from StringIO import StringIO

    from pip._vendor.urllib3.packages.ordered_dict import OrderedDict

    builtin_str = str
    bytes = str
    str = unicode
    basestring = basestring
    numeric_types = (int, long, float)
    integer_types = (int, long)

elif is_py3:
    from urllib.parse import urlparse, urlunparse, urljoin, urlsplit, urlencode, quote, unquote, quote_plus, unquote_plus, urldefrag
    from urllib.request import parse_http_list, getproxies, proxy_bypass, proxy_bypass_environment, getproxies_environment
    from http import cookiejar as cookielib
    from http.cookies import Morsel
    from io import StringIO
    from collections import OrderedDict

    builtin_str = str
    str = str
    bytes = bytes
    basestring = (str, bytes)
    numeric_types = (int, float)
    integer_types = (int,)
site-packages/pip/_vendor/requests/__init__.py
# -*- coding: utf-8 -*-

#   __
#  /__)  _  _     _   _ _/   _
# / (   (- (/ (/ (- _)  /  _)
#          /

"""
Requests HTTP Library
~~~~~~~~~~~~~~~~~~~~~

Requests is an HTTP library, written in Python, for human beings. Basic GET
usage:

   >>> import requests
   >>> r = requests.get('https://www.python.org')
   >>> r.status_code
   200
   >>> b'Python is a programming language' in r.content
   True

... or POST:

   >>> payload = dict(key1='value1', key2='value2')
   >>> r = requests.post('http://httpbin.org/post', data=payload)
   >>> print(r.text)
   {
     ...
     "form": {
       "key2": "value2",
       "key1": "value1"
     },
     ...
   }

The other HTTP methods are supported - see `requests.api`. Full documentation
is at <http://python-requests.org>.

:copyright: (c) 2017 by Kenneth Reitz.
:license: Apache 2.0, see LICENSE for more details.
"""

from pip._vendor import urllib3
from pip._vendor import chardet
import warnings
from .exceptions import RequestsDependencyWarning


def check_compatibility(urllib3_version, chardet_version):
    urllib3_version = urllib3_version.split('.')
    assert urllib3_version != ['dev']  # Verify urllib3 isn't installed from git.

    # Sometimes, urllib3 only reports its version as 16.1.
    if len(urllib3_version) == 2:
        urllib3_version.append('0')

    # Check urllib3 for compatibility.
    major, minor, patch = urllib3_version  # noqa: F811
    major, minor, patch = int(major), int(minor), int(patch)
    # urllib3 >= 1.21.1, <= 1.22
    assert major == 1
    assert minor >= 21
    assert minor <= 22

    # Check chardet for compatibility.
    major, minor, patch = chardet_version.split('.')[:3]
    major, minor, patch = int(major), int(minor), int(patch)
    # chardet >= 3.0.2, < 3.1.0
    assert major == 3
    assert minor < 1
    assert patch >= 2


# Check imported dependencies for compatibility.
try:
    check_compatibility(urllib3.__version__, chardet.__version__)
except (AssertionError, ValueError):
    warnings.warn("urllib3 ({0}) or chardet ({1}) doesn't match a supported "
                  "version!".format(urllib3.__version__, chardet.__version__),
                  RequestsDependencyWarning)

# Attempt to enable urllib3's SNI support, if possible
# try:
#     from pip._vendor.urllib3.contrib import pyopenssl
#     pyopenssl.inject_into_urllib3()
# except ImportError:
#     pass

# urllib3's DependencyWarnings should be silenced.
from pip._vendor.urllib3.exceptions import DependencyWarning
warnings.simplefilter('ignore', DependencyWarning)

from .__version__ import __title__, __description__, __url__, __version__
from .__version__ import __build__, __author__, __author_email__, __license__
from .__version__ import __copyright__, __cake__

from . import utils
from . import packages
from .models import Request, Response, PreparedRequest
from .api import request, get, head, post, patch, put, delete, options
from .sessions import session, Session
from .status_codes import codes
from .exceptions import (
    RequestException, Timeout, URLRequired,
    TooManyRedirects, HTTPError, ConnectionError,
    FileModeWarning, ConnectTimeout, ReadTimeout
)

# Set default logging handler to avoid "No handler found" warnings.
import logging
try:  # Python 2.7+
    from logging import NullHandler
except ImportError:
    class NullHandler(logging.Handler):
        def emit(self, record):
            pass

logging.getLogger(__name__).addHandler(NullHandler())

# FileModeWarnings go off per the default.
warnings.simplefilter('default', FileModeWarning, append=True)
site-packages/pip/_vendor/requests/sessions.py
# -*- coding: utf-8 -*-

"""
requests.sessions
~~~~~~~~~~~~~~~~~

This module provides a Session object to manage and persist settings across
requests (cookies, auth, proxies).
"""
import os
import platform
import time
from collections import Mapping
from datetime import timedelta

from .auth import _basic_auth_str
from .compat import cookielib, is_py3, OrderedDict, urljoin, urlparse
from .cookies import (
    cookiejar_from_dict, extract_cookies_to_jar, RequestsCookieJar, merge_cookies)
from .models import Request, PreparedRequest, DEFAULT_REDIRECT_LIMIT
from .hooks import default_hooks, dispatch_hook
from ._internal_utils import to_native_string
from .utils import to_key_val_list, default_headers
from .exceptions import (
    TooManyRedirects, InvalidSchema, ChunkedEncodingError, ContentDecodingError)

from .structures import CaseInsensitiveDict
from .adapters import HTTPAdapter

from .utils import (
    requote_uri, get_environ_proxies, get_netrc_auth, should_bypass_proxies,
    get_auth_from_url, rewind_body, DEFAULT_PORTS
)

from .status_codes import codes

# formerly defined here, reexposed here for backward compatibility
from .models import REDIRECT_STATI

# Preferred clock, based on which one is more accurate on a given system.
if platform.system() == 'Windows':
    try:  # Python 3.3+
        preferred_clock = time.perf_counter
    except AttributeError:  # Earlier than Python 3.
        preferred_clock = time.clock
else:
    preferred_clock = time.time


def merge_setting(request_setting, session_setting, dict_class=OrderedDict):
    """Determines appropriate setting for a given request, taking into account
    the explicit setting on that request, and the setting in the session. If a
    setting is a dictionary, they will be merged together using `dict_class`
    """

    if session_setting is None:
        return request_setting

    if request_setting is None:
        return session_setting

    # Bypass if not a dictionary (e.g. verify)
    if not (
            isinstance(session_setting, Mapping) and
            isinstance(request_setting, Mapping)
    ):
        return request_setting

    merged_setting = dict_class(to_key_val_list(session_setting))
    merged_setting.update(to_key_val_list(request_setting))

    # Remove keys that are set to None. Extract keys first to avoid altering
    # the dictionary during iteration.
    none_keys = [k for (k, v) in merged_setting.items() if v is None]
    for key in none_keys:
        del merged_setting[key]

    return merged_setting
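

# An illustrative sketch (not part of the library) of how merge_setting()
# resolves per-request vs. session values: request-level keys win, and keys
# explicitly set to None are removed from the merged result.
def _example_merge_setting():
    session_headers = {'Accept': '*/*', 'X-Trace': 'abc'}
    request_headers = {'Accept': 'application/json', 'X-Trace': None}
    merged = merge_setting(request_headers, session_headers)
    # dict(merged) == {'Accept': 'application/json'}: the request value wins
    # and 'X-Trace' is dropped because it was explicitly set to None.
    return merged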


def merge_hooks(request_hooks, session_hooks, dict_class=OrderedDict):
    """Properly merges both requests and session hooks.

    This is necessary because when request_hooks == {'response': []}, the
    merge breaks Session hooks entirely.
    """
    if session_hooks is None or session_hooks.get('response') == []:
        return request_hooks

    if request_hooks is None or request_hooks.get('response') == []:
        return session_hooks

    return merge_setting(request_hooks, session_hooks, dict_class)


class SessionRedirectMixin(object):

    def get_redirect_target(self, resp):
        """Receives a Response. Returns a redirect URI or ``None``"""
        # Due to the nature of how requests processes redirects this method will
        # be called at least once upon the original response and at least twice
        # on each subsequent redirect response (if any).
        # If a custom mixin is used to handle this logic, it may be advantageous
        # to cache the redirect location onto the response object as a private
        # attribute.
        if resp.is_redirect:
            location = resp.headers['location']
            # Currently the underlying http module on py3 decodes headers
            # in latin1, but empirical evidence suggests that latin1 is very
            # rarely used with non-ASCII characters in HTTP headers.
            # A UTF-8 encoded header is much more likely than a latin1 one,
            # and this causes incorrect handling of UTF-8 encoded location
            # headers. To solve this, we re-encode the location in latin1.
            if is_py3:
                location = location.encode('latin1')
            return to_native_string(location, 'utf8')
        return None


    def should_strip_auth(self, old_url, new_url):
        """Decide whether Authorization header should be removed when redirecting"""
        old_parsed = urlparse(old_url)
        new_parsed = urlparse(new_url)
        if old_parsed.hostname != new_parsed.hostname:
            return True
        # Special case: allow http -> https redirect when using the standard
        # ports. This isn't specified by RFC 7235, but is kept to avoid
        # breaking backwards compatibility with older versions of requests
        # that allowed any redirects on the same host.
        if (old_parsed.scheme == 'http' and old_parsed.port in (80, None)
                and new_parsed.scheme == 'https' and new_parsed.port in (443, None)):
            return False

        # Handle default port usage corresponding to scheme.
        changed_port = old_parsed.port != new_parsed.port
        changed_scheme = old_parsed.scheme != new_parsed.scheme
        default_port = (DEFAULT_PORTS.get(old_parsed.scheme, None), None)
        if (not changed_scheme and old_parsed.port in default_port
                and new_parsed.port in default_port):
            return False

        # Standard case: root URI must match
        return changed_port or changed_scheme

    def resolve_redirects(self, resp, req, stream=False, timeout=None,
                          verify=True, cert=None, proxies=None, yield_requests=False, **adapter_kwargs):
        """Receives a Response. Returns a generator of Responses or Requests."""

        hist = []  # keep track of history

        url = self.get_redirect_target(resp)
        while url:
            prepared_request = req.copy()

            # Update history and keep track of redirects.
            # resp.history must ignore the original request in this loop
            hist.append(resp)
            resp.history = hist[1:]

            try:
                resp.content  # Consume socket so it can be released
            except (ChunkedEncodingError, ContentDecodingError, RuntimeError):
                resp.raw.read(decode_content=False)

            if len(resp.history) >= self.max_redirects:
                raise TooManyRedirects('Exceeded %s redirects.' % self.max_redirects, response=resp)

            # Release the connection back into the pool.
            resp.close()

            # Handle redirection without scheme (see: RFC 1808 Section 4)
            if url.startswith('//'):
                parsed_rurl = urlparse(resp.url)
                url = '%s:%s' % (to_native_string(parsed_rurl.scheme), url)

            # The scheme should be lower case...
            parsed = urlparse(url)
            url = parsed.geturl()

            # Facilitate relative 'location' headers, as allowed by RFC 7231.
            # (e.g. '/path/to/resource' instead of 'http://domain.tld/path/to/resource')
            # Compliant with RFC3986, we percent encode the url.
            if not parsed.netloc:
                url = urljoin(resp.url, requote_uri(url))
            else:
                url = requote_uri(url)

            prepared_request.url = to_native_string(url)

            self.rebuild_method(prepared_request, resp)

            # https://github.com/requests/requests/issues/1084
            if resp.status_code not in (codes.temporary_redirect, codes.permanent_redirect):
                # https://github.com/requests/requests/issues/3490
                purged_headers = ('Content-Length', 'Content-Type', 'Transfer-Encoding')
                for header in purged_headers:
                    prepared_request.headers.pop(header, None)
                prepared_request.body = None

            headers = prepared_request.headers
            try:
                del headers['Cookie']
            except KeyError:
                pass

            # Extract any cookies sent on the response to the cookiejar
            # in the new request. Because we've mutated our copied prepared
            # request, use the old one that we haven't yet touched.
            extract_cookies_to_jar(prepared_request._cookies, req, resp.raw)
            merge_cookies(prepared_request._cookies, self.cookies)
            prepared_request.prepare_cookies(prepared_request._cookies)

            # Rebuild auth and proxy information.
            proxies = self.rebuild_proxies(prepared_request, proxies)
            self.rebuild_auth(prepared_request, resp)

            # A failed tell() sets `_body_position` to `object()`. This non-None
            # value ensures `rewindable` will be True, allowing us to raise an
            # UnrewindableBodyError, instead of hanging the connection.
            rewindable = (
                prepared_request._body_position is not None and
                ('Content-Length' in headers or 'Transfer-Encoding' in headers)
            )

            # Attempt to rewind consumed file-like object.
            if rewindable:
                rewind_body(prepared_request)

            # Override the original request.
            req = prepared_request

            if yield_requests:
                yield req
            else:

                resp = self.send(
                    req,
                    stream=stream,
                    timeout=timeout,
                    verify=verify,
                    cert=cert,
                    proxies=proxies,
                    allow_redirects=False,
                    **adapter_kwargs
                )

                extract_cookies_to_jar(self.cookies, prepared_request, resp.raw)

                # extract redirect url, if any, for the next loop
                url = self.get_redirect_target(resp)
                yield resp

    def rebuild_auth(self, prepared_request, response):
        """When being redirected we may want to strip authentication from the
        request to avoid leaking credentials. This method intelligently removes
        and reapplies authentication where possible to avoid credential loss.
        """
        headers = prepared_request.headers
        url = prepared_request.url

        if 'Authorization' in headers and self.should_strip_auth(response.request.url, url):
            # If we get redirected to a new host, we should strip out any
            # authentication headers.
            del headers['Authorization']

        # .netrc might have more auth for us on our new host.
        new_auth = get_netrc_auth(url) if self.trust_env else None
        if new_auth is not None:
            prepared_request.prepare_auth(new_auth)

        return

    def rebuild_proxies(self, prepared_request, proxies):
        """This method re-evaluates the proxy configuration by considering the
        environment variables. If we are redirected to a URL covered by
        NO_PROXY, we strip the proxy configuration. Otherwise, we set missing
        proxy keys for this URL (in case they were stripped by a previous
        redirect).

        This method also replaces the Proxy-Authorization header where
        necessary.

        :rtype: dict
        """
        proxies = proxies if proxies is not None else {}
        headers = prepared_request.headers
        url = prepared_request.url
        scheme = urlparse(url).scheme
        new_proxies = proxies.copy()
        no_proxy = proxies.get('no_proxy')

        bypass_proxy = should_bypass_proxies(url, no_proxy=no_proxy)
        if self.trust_env and not bypass_proxy:
            environ_proxies = get_environ_proxies(url, no_proxy=no_proxy)

            proxy = environ_proxies.get(scheme, environ_proxies.get('all'))

            if proxy:
                new_proxies.setdefault(scheme, proxy)

        if 'Proxy-Authorization' in headers:
            del headers['Proxy-Authorization']

        try:
            username, password = get_auth_from_url(new_proxies[scheme])
        except KeyError:
            username, password = None, None

        if username and password:
            headers['Proxy-Authorization'] = _basic_auth_str(username, password)

        return new_proxies

    def rebuild_method(self, prepared_request, response):
        """When being redirected we may want to change the method of the request
        based on certain specs or browser behavior.
        """
        method = prepared_request.method

        # http://tools.ietf.org/html/rfc7231#section-6.4.4
        if response.status_code == codes.see_other and method != 'HEAD':
            method = 'GET'

        # Do what the browsers do, despite standards...
        # First, turn 302s into GETs.
        if response.status_code == codes.found and method != 'HEAD':
            method = 'GET'

        # Second, if a POST is responded to with a 301, turn it into a GET.
        # This bizarre behaviour is explained in Issue 1704.
        if response.status_code == codes.moved and method == 'POST':
            method = 'GET'

        prepared_request.method = method
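

# An illustrative sketch (not part of the library) of should_strip_auth():
# a same-host http -> https upgrade on default ports keeps the Authorization
# header, while a redirect to a different host drops it.
def _example_should_strip_auth():
    mixin = SessionRedirectMixin()
    strips_on_upgrade = mixin.should_strip_auth(
        'http://example.com/a', 'https://example.com/b')
    strips_cross_host = mixin.should_strip_auth(
        'https://example.com/a', 'https://other.example.org/b')
    return strips_on_upgrade, strips_cross_host  # -> (False, True)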


class Session(SessionRedirectMixin):
    """A Requests session.

    Provides cookie persistence, connection-pooling, and configuration.

    Basic Usage::

      >>> import requests
      >>> s = requests.Session()
      >>> s.get('http://httpbin.org/get')
      <Response [200]>

    Or as a context manager::

      >>> with requests.Session() as s:
      ...     s.get('http://httpbin.org/get')
      <Response [200]>
    """

    __attrs__ = [
        'headers', 'cookies', 'auth', 'proxies', 'hooks', 'params', 'verify',
        'cert', 'prefetch', 'adapters', 'stream', 'trust_env',
        'max_redirects',
    ]

    def __init__(self):

        #: A case-insensitive dictionary of headers to be sent on each
        #: :class:`Request <Request>` sent from this
        #: :class:`Session <Session>`.
        self.headers = default_headers()

        #: Default Authentication tuple or object to attach to
        #: :class:`Request <Request>`.
        self.auth = None

        #: Dictionary mapping protocol or protocol and host to the URL of the proxy
        #: (e.g. {'http': 'foo.bar:3128', 'http://host.name': 'foo.bar:4012'}) to
        #: be used on each :class:`Request <Request>`.
        self.proxies = {}

        #: Event-handling hooks.
        self.hooks = default_hooks()

        #: Dictionary of querystring data to attach to each
        #: :class:`Request <Request>`. The dictionary values may be lists for
        #: representing multivalued query parameters.
        self.params = {}

        #: Stream response content default.
        self.stream = False

        #: SSL Verification default.
        self.verify = True

        #: SSL client certificate default, if String, path to ssl client
        #: cert file (.pem). If Tuple, ('cert', 'key') pair.
        self.cert = None

        #: Maximum number of redirects allowed. If the request exceeds this
        #: limit, a :class:`TooManyRedirects` exception is raised.
        #: This defaults to requests.models.DEFAULT_REDIRECT_LIMIT, which is
        #: 30.
        self.max_redirects = DEFAULT_REDIRECT_LIMIT

        #: Trust environment settings for proxy configuration, default
        #: authentication and similar.
        self.trust_env = True

        #: A CookieJar containing all currently outstanding cookies set on this
        #: session. By default it is a
        #: :class:`RequestsCookieJar <requests.cookies.RequestsCookieJar>`, but
        #: may be any other ``cookielib.CookieJar`` compatible object.
        self.cookies = cookiejar_from_dict({})

        # Default connection adapters.
        self.adapters = OrderedDict()
        self.mount('https://', HTTPAdapter())
        self.mount('http://', HTTPAdapter())

    def __enter__(self):
        return self

    def __exit__(self, *args):
        self.close()

    def prepare_request(self, request):
        """Constructs a :class:`PreparedRequest <PreparedRequest>` for
        transmission and returns it. The :class:`PreparedRequest` has settings
        merged from the :class:`Request <Request>` instance and those of the
        :class:`Session`.

        :param request: :class:`Request` instance to prepare with this
            session's settings.
        :rtype: requests.PreparedRequest
        """
        cookies = request.cookies or {}

        # Bootstrap CookieJar.
        if not isinstance(cookies, cookielib.CookieJar):
            cookies = cookiejar_from_dict(cookies)

        # Merge with session cookies
        merged_cookies = merge_cookies(
            merge_cookies(RequestsCookieJar(), self.cookies), cookies)

        # Set environment's basic authentication if not explicitly set.
        auth = request.auth
        if self.trust_env and not auth and not self.auth:
            auth = get_netrc_auth(request.url)

        p = PreparedRequest()
        p.prepare(
            method=request.method.upper(),
            url=request.url,
            files=request.files,
            data=request.data,
            json=request.json,
            headers=merge_setting(request.headers, self.headers, dict_class=CaseInsensitiveDict),
            params=merge_setting(request.params, self.params),
            auth=merge_setting(auth, self.auth),
            cookies=merged_cookies,
            hooks=merge_hooks(request.hooks, self.hooks),
        )
        return p

    def request(self, method, url,
            params=None, data=None, headers=None, cookies=None, files=None,
            auth=None, timeout=None, allow_redirects=True, proxies=None,
            hooks=None, stream=None, verify=None, cert=None, json=None):
        """Constructs a :class:`Request <Request>`, prepares it and sends it.
        Returns :class:`Response <Response>` object.

        :param method: method for the new :class:`Request` object.
        :param url: URL for the new :class:`Request` object.
        :param params: (optional) Dictionary or bytes to be sent in the query
            string for the :class:`Request`.
        :param data: (optional) Dictionary, bytes, or file-like object to send
            in the body of the :class:`Request`.
        :param json: (optional) json to send in the body of the
            :class:`Request`.
        :param headers: (optional) Dictionary of HTTP Headers to send with the
            :class:`Request`.
        :param cookies: (optional) Dict or CookieJar object to send with the
            :class:`Request`.
        :param files: (optional) Dictionary of ``'filename': file-like-objects``
            for multipart encoding upload.
        :param auth: (optional) Auth tuple or callable to enable
            Basic/Digest/Custom HTTP Auth.
        :param timeout: (optional) How long to wait for the server to send
            data before giving up, as a float, or a :ref:`(connect timeout,
            read timeout) <timeouts>` tuple.
        :type timeout: float or tuple
        :param allow_redirects: (optional) Set to True by default.
        :type allow_redirects: bool
        :param proxies: (optional) Dictionary mapping protocol or protocol and
            hostname to the URL of the proxy.
        :param stream: (optional) whether to immediately download the response
            content. Defaults to ``False``.
        :param verify: (optional) Either a boolean, in which case it controls whether we verify
            the server's TLS certificate, or a string, in which case it must be a path
            to a CA bundle to use. Defaults to ``True``.
        :param cert: (optional) if String, path to ssl client cert file (.pem).
            If Tuple, ('cert', 'key') pair.
        :rtype: requests.Response
        """
        # Create the Request.
        req = Request(
            method=method.upper(),
            url=url,
            headers=headers,
            files=files,
            data=data or {},
            json=json,
            params=params or {},
            auth=auth,
            cookies=cookies,
            hooks=hooks,
        )
        prep = self.prepare_request(req)

        proxies = proxies or {}

        settings = self.merge_environment_settings(
            prep.url, proxies, stream, verify, cert
        )

        # Send the request.
        send_kwargs = {
            'timeout': timeout,
            'allow_redirects': allow_redirects,
        }
        send_kwargs.update(settings)
        resp = self.send(prep, **send_kwargs)

        return resp

    def get(self, url, **kwargs):
        r"""Sends a GET request. Returns :class:`Response` object.

        :param url: URL for the new :class:`Request` object.
        :param \*\*kwargs: Optional arguments that ``request`` takes.
        :rtype: requests.Response
        """

        kwargs.setdefault('allow_redirects', True)
        return self.request('GET', url, **kwargs)

    def options(self, url, **kwargs):
        r"""Sends a OPTIONS request. Returns :class:`Response` object.

        :param url: URL for the new :class:`Request` object.
        :param \*\*kwargs: Optional arguments that ``request`` takes.
        :rtype: requests.Response
        """

        kwargs.setdefault('allow_redirects', True)
        return self.request('OPTIONS', url, **kwargs)

    def head(self, url, **kwargs):
        r"""Sends a HEAD request. Returns :class:`Response` object.

        :param url: URL for the new :class:`Request` object.
        :param \*\*kwargs: Optional arguments that ``request`` takes.
        :rtype: requests.Response
        """

        kwargs.setdefault('allow_redirects', False)
        return self.request('HEAD', url, **kwargs)

    def post(self, url, data=None, json=None, **kwargs):
        r"""Sends a POST request. Returns :class:`Response` object.

        :param url: URL for the new :class:`Request` object.
        :param data: (optional) Dictionary, bytes, or file-like object to send in the body of the :class:`Request`.
        :param json: (optional) json to send in the body of the :class:`Request`.
        :param \*\*kwargs: Optional arguments that ``request`` takes.
        :rtype: requests.Response
        """

        return self.request('POST', url, data=data, json=json, **kwargs)

    def put(self, url, data=None, **kwargs):
        r"""Sends a PUT request. Returns :class:`Response` object.

        :param url: URL for the new :class:`Request` object.
        :param data: (optional) Dictionary, bytes, or file-like object to send in the body of the :class:`Request`.
        :param \*\*kwargs: Optional arguments that ``request`` takes.
        :rtype: requests.Response
        """

        return self.request('PUT', url, data=data, **kwargs)

    def patch(self, url, data=None, **kwargs):
        r"""Sends a PATCH request. Returns :class:`Response` object.

        :param url: URL for the new :class:`Request` object.
        :param data: (optional) Dictionary, bytes, or file-like object to send in the body of the :class:`Request`.
        :param \*\*kwargs: Optional arguments that ``request`` takes.
        :rtype: requests.Response
        """

        return self.request('PATCH', url, data=data, **kwargs)

    def delete(self, url, **kwargs):
        r"""Sends a DELETE request. Returns :class:`Response` object.

        :param url: URL for the new :class:`Request` object.
        :param \*\*kwargs: Optional arguments that ``request`` takes.
        :rtype: requests.Response
        """

        return self.request('DELETE', url, **kwargs)

    def send(self, request, **kwargs):
        """Send a given PreparedRequest.

        :rtype: requests.Response
        """
        # Set defaults that the hooks can utilize to ensure they always have
        # the correct parameters to reproduce the previous request.
        kwargs.setdefault('stream', self.stream)
        kwargs.setdefault('verify', self.verify)
        kwargs.setdefault('cert', self.cert)
        kwargs.setdefault('proxies', self.proxies)

        # It's possible that users might accidentally send a Request object.
        # Guard against that specific failure case.
        if isinstance(request, Request):
            raise ValueError('You can only send PreparedRequests.')

        # Set up variables needed for resolve_redirects and dispatching of hooks
        allow_redirects = kwargs.pop('allow_redirects', True)
        stream = kwargs.get('stream')
        hooks = request.hooks

        # Get the appropriate adapter to use
        adapter = self.get_adapter(url=request.url)

        # Start time (approximately) of the request
        start = preferred_clock()

        # Send the request
        r = adapter.send(request, **kwargs)

        # Total elapsed time of the request (approximately)
        elapsed = preferred_clock() - start
        r.elapsed = timedelta(seconds=elapsed)

        # Response manipulation hooks
        r = dispatch_hook('response', hooks, r, **kwargs)

        # Persist cookies
        if r.history:

            # If the hooks create history then we want those cookies too
            for resp in r.history:
                extract_cookies_to_jar(self.cookies, resp.request, resp.raw)

        extract_cookies_to_jar(self.cookies, request, r.raw)

        # Redirect resolving generator.
        gen = self.resolve_redirects(r, request, **kwargs)

        # Resolve redirects if allowed.
        history = [resp for resp in gen] if allow_redirects else []

        # Shuffle things around if there's history.
        if history:
            # Insert the first (original) request at the start
            history.insert(0, r)
            # Get the last request made
            r = history.pop()
            r.history = history

        # If redirects aren't being followed, store the response on the Request for Response.next().
        if not allow_redirects:
            try:
                r._next = next(self.resolve_redirects(r, request, yield_requests=True, **kwargs))
            except StopIteration:
                pass

        if not stream:
            r.content

        return r

    def merge_environment_settings(self, url, proxies, stream, verify, cert):
        """
        Check the environment and merge it with some settings.

        :rtype: dict
        """
        # Gather clues from the surrounding environment.
        if self.trust_env:
            # Set environment's proxies.
            no_proxy = proxies.get('no_proxy') if proxies is not None else None
            env_proxies = get_environ_proxies(url, no_proxy=no_proxy)
            for (k, v) in env_proxies.items():
                proxies.setdefault(k, v)

            # Look for requests environment configuration and be compatible
            # with cURL.
            if verify is True or verify is None:
                verify = (os.environ.get('REQUESTS_CA_BUNDLE') or
                          os.environ.get('CURL_CA_BUNDLE'))

        # Merge all the kwargs.
        proxies = merge_setting(proxies, self.proxies)
        stream = merge_setting(stream, self.stream)
        verify = merge_setting(verify, self.verify)
        cert = merge_setting(cert, self.cert)

        return {'verify': verify, 'proxies': proxies, 'stream': stream,
                'cert': cert}

    def get_adapter(self, url):
        """
        Returns the appropriate connection adapter for the given URL.

        :rtype: requests.adapters.BaseAdapter
        """
        for (prefix, adapter) in self.adapters.items():

            if url.lower().startswith(prefix):
                return adapter

        # Nothing matches :-/
        raise InvalidSchema("No connection adapters were found for '%s'" % url)

    def close(self):
        """Closes all adapters and as such the session"""
        for v in self.adapters.values():
            v.close()

    def mount(self, prefix, adapter):
        """Registers a connection adapter to a prefix.

        Adapters are sorted in descending order by prefix length.
        """
        self.adapters[prefix] = adapter
        keys_to_move = [k for k in self.adapters if len(k) < len(prefix)]

        for key in keys_to_move:
            self.adapters[key] = self.adapters.pop(key)

    def __getstate__(self):
        state = dict((attr, getattr(self, attr, None)) for attr in self.__attrs__)
        return state

    def __setstate__(self, state):
        for attr, value in state.items():
            setattr(self, attr, value)
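

# An illustrative sketch (not part of the library) of mount()/get_adapter():
# the longest matching prefix wins, so an adapter mounted for a specific host
# takes precedence over the generic 'https://' adapter. The host name and
# retry count below are made up for the example.
def _example_mount_ordering():
    s = Session()
    s.mount('https://slow.example.com/', HTTPAdapter(max_retries=5))
    adapter = s.get_adapter('https://slow.example.com/path')
    return adapter.max_retries  # the Retry policy of the host-specific adapter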


def session():
    """
    Returns a :class:`Session` for context-management.

    :rtype: Session
    """

    return Session()
site-packages/pip/_vendor/requests/help.py
"""Module containing bug report helper(s)."""
from __future__ import print_function

import json
import platform
import sys
import ssl

from pip._vendor import idna
from pip._vendor import urllib3
from pip._vendor import chardet

from . import __version__ as requests_version

try:
    from .packages.urllib3.contrib import pyopenssl
except ImportError:
    pyopenssl = None
    OpenSSL = None
    cryptography = None
else:
    import OpenSSL
    import cryptography


def _implementation():
    """Return a dict with the Python implementation and version.

    Provide both the name and the version of the Python implementation
    currently running. For example, on CPython 2.7.5 it will return
    {'name': 'CPython', 'version': '2.7.5'}.

    This function works best on CPython and PyPy: in particular, it probably
    doesn't work for Jython or IronPython. Future investigation should be done
    to work out the correct shape of the code for those platforms.
    """
    implementation = platform.python_implementation()

    if implementation == 'CPython':
        implementation_version = platform.python_version()
    elif implementation == 'PyPy':
        implementation_version = '%s.%s.%s' % (sys.pypy_version_info.major,
                                               sys.pypy_version_info.minor,
                                               sys.pypy_version_info.micro)
        if sys.pypy_version_info.releaselevel != 'final':
            implementation_version = ''.join([
                implementation_version, sys.pypy_version_info.releaselevel
            ])
    elif implementation == 'Jython':
        implementation_version = platform.python_version()  # Complete Guess
    elif implementation == 'IronPython':
        implementation_version = platform.python_version()  # Complete Guess
    else:
        implementation_version = 'Unknown'

    return {'name': implementation, 'version': implementation_version}


def info():
    """Generate information for a bug report."""
    try:
        platform_info = {
            'system': platform.system(),
            'release': platform.release(),
        }
    except IOError:
        platform_info = {
            'system': 'Unknown',
            'release': 'Unknown',
        }

    implementation_info = _implementation()
    urllib3_info = {'version': urllib3.__version__}
    chardet_info = {'version': chardet.__version__}

    pyopenssl_info = {
        'version': None,
        'openssl_version': '',
    }
    if OpenSSL:
        pyopenssl_info = {
            'version': OpenSSL.__version__,
            'openssl_version': '%x' % OpenSSL.SSL.OPENSSL_VERSION_NUMBER,
        }
    cryptography_info = {
        'version': getattr(cryptography, '__version__', ''),
    }
    idna_info = {
        'version': getattr(idna, '__version__', ''),
    }

    # OPENSSL_VERSION_NUMBER doesn't exist in the Python 2.6 ssl module.
    system_ssl = getattr(ssl, 'OPENSSL_VERSION_NUMBER', None)
    system_ssl_info = {
        'version': '%x' % system_ssl if system_ssl is not None else ''
    }

    return {
        'platform': platform_info,
        'implementation': implementation_info,
        'system_ssl': system_ssl_info,
        'using_pyopenssl': pyopenssl is not None,
        'pyOpenSSL': pyopenssl_info,
        'urllib3': urllib3_info,
        'chardet': chardet_info,
        'cryptography': cryptography_info,
        'idna': idna_info,
        'requests': {
            'version': requests_version,
        },
    }


def main():
    """Pretty-print the bug information as JSON."""
    print(json.dumps(info(), sort_keys=True, indent=2))


if __name__ == '__main__':
    main()
site-packages/pip/_vendor/requests/_internal_utils.py
# -*- coding: utf-8 -*-

"""
requests._internal_utils
~~~~~~~~~~~~~~~~~~~~~~~~

Provides utility functions that are consumed internally by Requests
which depend on extremely few external helpers (such as compat)
"""

from .compat import is_py2, builtin_str, str


def to_native_string(string, encoding='ascii'):
    """Given a string object, regardless of type, returns a representation of
    that string in the native string type, encoding and decoding where
    necessary. This assumes ASCII unless told otherwise.
    """
    if isinstance(string, builtin_str):
        out = string
    else:
        if is_py2:
            out = string.encode(encoding)
        else:
            out = string.decode(encoding)

    return out
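

# An illustrative sketch (not part of the library): native strings pass
# through to_native_string() unchanged, while byte strings are decoded
# (ASCII by default) to the native str type.
def _example_to_native_string():
    assert to_native_string('abc') == 'abc'
    assert to_native_string(b'abc') == 'abc'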


def unicode_is_ascii(u_string):
    """Determine if unicode string only contains ASCII characters.

    :param str u_string: unicode string to check. Must be unicode
        and not Python 2 `str`.
    :rtype: bool
    """
    assert isinstance(u_string, str)
    try:
        u_string.encode('ascii')
        return True
    except UnicodeEncodeError:
        return False
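

# An illustrative sketch (not part of the library): pure-ASCII text passes
# the check, while anything containing non-ASCII characters does not.
def _example_unicode_is_ascii():
    return unicode_is_ascii(u'text/plain'), unicode_is_ascii(u'köln')  # -> (True, False)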
site-packages/pip/_vendor/requests/structures.py
# -*- coding: utf-8 -*-

"""
requests.structures
~~~~~~~~~~~~~~~~~~~

Data structures that power Requests.
"""

import collections

from .compat import OrderedDict


class CaseInsensitiveDict(collections.MutableMapping):
    """A case-insensitive ``dict``-like object.

    Implements all methods and operations of
    ``collections.MutableMapping`` as well as dict's ``copy``. Also
    provides ``lower_items``.

    All keys are expected to be strings. The structure remembers the
    case of the last key to be set, and ``iter(instance)``,
    ``keys()``, ``items()``, ``iterkeys()``, and ``iteritems()``
    will contain case-sensitive keys. However, querying and contains
    testing are case insensitive::

        cid = CaseInsensitiveDict()
        cid['Accept'] = 'application/json'
        cid['aCCEPT'] == 'application/json'  # True
        list(cid) == ['Accept']  # True

    For example, ``headers['content-encoding']`` will return the
    value of a ``'Content-Encoding'`` response header, regardless
    of how the header name was originally stored.

    If the constructor, ``.update``, or equality comparison
    operations are given keys that have equal ``.lower()``s, the
    behavior is undefined.
    """

    def __init__(self, data=None, **kwargs):
        self._store = OrderedDict()
        if data is None:
            data = {}
        self.update(data, **kwargs)

    def __setitem__(self, key, value):
        # Use the lowercased key for lookups, but store the actual
        # key alongside the value.
        self._store[key.lower()] = (key, value)

    def __getitem__(self, key):
        return self._store[key.lower()][1]

    def __delitem__(self, key):
        del self._store[key.lower()]

    def __iter__(self):
        return (casedkey for casedkey, mappedvalue in self._store.values())

    def __len__(self):
        return len(self._store)

    def lower_items(self):
        """Like iteritems(), but with all lowercase keys."""
        return (
            (lowerkey, keyval[1])
            for (lowerkey, keyval)
            in self._store.items()
        )

    def __eq__(self, other):
        if isinstance(other, collections.Mapping):
            other = CaseInsensitiveDict(other)
        else:
            return NotImplemented
        # Compare insensitively
        return dict(self.lower_items()) == dict(other.lower_items())

    # Copy is required
    def copy(self):
        return CaseInsensitiveDict(self._store.values())

    def __repr__(self):
        return str(dict(self.items()))


class LookupDict(dict):
    """Dictionary lookup object."""

    def __init__(self, name=None):
        self.name = name
        super(LookupDict, self).__init__()

    def __repr__(self):
        return '<lookup \'%s\'>' % (self.name)

    def __getitem__(self, key):
        # We allow fall-through here, so values default to None

        return self.__dict__.get(key, None)

    def get(self, key, default=None):
        return self.__dict__.get(key, default)
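

# An illustrative sketch (not part of the library) of LookupDict's
# fall-through behaviour: values live in the instance __dict__, and missing
# keys return None instead of raising KeyError.
def _example_lookup_dict():
    demo_codes = LookupDict(name='demo status codes')
    demo_codes.ok = 200                       # stored as an instance attribute
    return demo_codes['ok'], demo_codes['missing']  # -> (200, None)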
site-packages/pip/_vendor/requests/adapters.py
# -*- coding: utf-8 -*-

"""
requests.adapters
~~~~~~~~~~~~~~~~~

This module contains the transport adapters that Requests uses to define
and maintain connections.
"""

import os.path
import socket

from pip._vendor.urllib3.poolmanager import PoolManager, proxy_from_url
from pip._vendor.urllib3.response import HTTPResponse
from pip._vendor.urllib3.util import Timeout as TimeoutSauce
from pip._vendor.urllib3.util.retry import Retry
from pip._vendor.urllib3.exceptions import ClosedPoolError
from pip._vendor.urllib3.exceptions import ConnectTimeoutError
from pip._vendor.urllib3.exceptions import HTTPError as _HTTPError
from pip._vendor.urllib3.exceptions import MaxRetryError
from pip._vendor.urllib3.exceptions import NewConnectionError
from pip._vendor.urllib3.exceptions import ProxyError as _ProxyError
from pip._vendor.urllib3.exceptions import ProtocolError
from pip._vendor.urllib3.exceptions import ReadTimeoutError
from pip._vendor.urllib3.exceptions import SSLError as _SSLError
from pip._vendor.urllib3.exceptions import ResponseError

from .models import Response
from .compat import urlparse, basestring
from .utils import (DEFAULT_CA_BUNDLE_PATH, get_encoding_from_headers,
                    prepend_scheme_if_needed, get_auth_from_url, urldefragauth,
                    select_proxy)
from .structures import CaseInsensitiveDict
from .cookies import extract_cookies_to_jar
from .exceptions import (ConnectionError, ConnectTimeout, ReadTimeout, SSLError,
                         ProxyError, RetryError, InvalidSchema)
from .auth import _basic_auth_str

try:
    from pip._vendor.urllib3.contrib.socks import SOCKSProxyManager
except ImportError:
    def SOCKSProxyManager(*args, **kwargs):
        raise InvalidSchema("Missing dependencies for SOCKS support.")

DEFAULT_POOLBLOCK = False
DEFAULT_POOLSIZE = 10
DEFAULT_RETRIES = 0
DEFAULT_POOL_TIMEOUT = None


class BaseAdapter(object):
    """The Base Transport Adapter"""

    def __init__(self):
        super(BaseAdapter, self).__init__()

    def send(self, request, stream=False, timeout=None, verify=True,
             cert=None, proxies=None):
        """Sends PreparedRequest object. Returns Response object.

        :param request: The :class:`PreparedRequest <PreparedRequest>` being sent.
        :param stream: (optional) Whether to stream the request content.
        :param timeout: (optional) How long to wait for the server to send
            data before giving up, as a float, or a :ref:`(connect timeout,
            read timeout) <timeouts>` tuple.
        :type timeout: float or tuple
        :param verify: (optional) Either a boolean, in which case it controls whether we verify
            the server's TLS certificate, or a string, in which case it must be a path
            to a CA bundle to use
        :param cert: (optional) Any user-provided SSL certificate to be trusted.
        :param proxies: (optional) The proxies dictionary to apply to the request.
        """
        raise NotImplementedError

    def close(self):
        """Cleans up adapter specific items."""
        raise NotImplementedError


class HTTPAdapter(BaseAdapter):
    """The built-in HTTP Adapter for urllib3.

    Provides a general-case interface for Requests sessions to contact HTTP and
    HTTPS urls by implementing the Transport Adapter interface. This class will
    usually be created by the :class:`Session <Session>` class under the
    covers.

    :param pool_connections: The number of urllib3 connection pools to cache.
    :param pool_maxsize: The maximum number of connections to save in the pool.
    :param max_retries: The maximum number of retries each connection
        should attempt. Note, this applies only to failed DNS lookups, socket
        connections and connection timeouts, never to requests where data has
        made it to the server. By default, Requests does not retry failed
        connections. If you need granular control over the conditions under
        which we retry a request, import urllib3's ``Retry`` class and pass
        that instead.
    :param pool_block: Whether the connection pool should block for connections.

    Usage::

      >>> import requests
      >>> s = requests.Session()
      >>> a = requests.adapters.HTTPAdapter(max_retries=3)
      >>> s.mount('http://', a)
    """
    __attrs__ = ['max_retries', 'config', '_pool_connections', '_pool_maxsize',
                 '_pool_block']

    def __init__(self, pool_connections=DEFAULT_POOLSIZE,
                 pool_maxsize=DEFAULT_POOLSIZE, max_retries=DEFAULT_RETRIES,
                 pool_block=DEFAULT_POOLBLOCK):
        if max_retries == DEFAULT_RETRIES:
            self.max_retries = Retry(0, read=False)
        else:
            self.max_retries = Retry.from_int(max_retries)
        self.config = {}
        self.proxy_manager = {}

        super(HTTPAdapter, self).__init__()

        self._pool_connections = pool_connections
        self._pool_maxsize = pool_maxsize
        self._pool_block = pool_block

        self.init_poolmanager(pool_connections, pool_maxsize, block=pool_block)

    def __getstate__(self):
        return dict((attr, getattr(self, attr, None)) for attr in
                    self.__attrs__)

    def __setstate__(self, state):
        # Can't handle by adding 'proxy_manager' to self.__attrs__ because
        # self.poolmanager uses a lambda function, which isn't pickleable.
        self.proxy_manager = {}
        self.config = {}

        for attr, value in state.items():
            setattr(self, attr, value)

        self.init_poolmanager(self._pool_connections, self._pool_maxsize,
                              block=self._pool_block)

    def init_poolmanager(self, connections, maxsize, block=DEFAULT_POOLBLOCK, **pool_kwargs):
        """Initializes a urllib3 PoolManager.

        This method should not be called from user code, and is only
        exposed for use when subclassing the
        :class:`HTTPAdapter <requests.adapters.HTTPAdapter>`.

        :param connections: The number of urllib3 connection pools to cache.
        :param maxsize: The maximum number of connections to save in the pool.
        :param block: Block when no free connections are available.
        :param pool_kwargs: Extra keyword arguments used to initialize the Pool Manager.
        """
        # save these values for pickling
        self._pool_connections = connections
        self._pool_maxsize = maxsize
        self._pool_block = block

        self.poolmanager = PoolManager(num_pools=connections, maxsize=maxsize,
                                       block=block, strict=True, **pool_kwargs)

    def proxy_manager_for(self, proxy, **proxy_kwargs):
        """Return urllib3 ProxyManager for the given proxy.

        This method should not be called from user code, and is only
        exposed for use when subclassing the
        :class:`HTTPAdapter <requests.adapters.HTTPAdapter>`.

        :param proxy: The proxy to return a urllib3 ProxyManager for.
        :param proxy_kwargs: Extra keyword arguments used to configure the Proxy Manager.
        :returns: ProxyManager
        :rtype: urllib3.ProxyManager
        """
        if proxy in self.proxy_manager:
            manager = self.proxy_manager[proxy]
        elif proxy.lower().startswith('socks'):
            username, password = get_auth_from_url(proxy)
            manager = self.proxy_manager[proxy] = SOCKSProxyManager(
                proxy,
                username=username,
                password=password,
                num_pools=self._pool_connections,
                maxsize=self._pool_maxsize,
                block=self._pool_block,
                **proxy_kwargs
            )
        else:
            proxy_headers = self.proxy_headers(proxy)
            manager = self.proxy_manager[proxy] = proxy_from_url(
                proxy,
                proxy_headers=proxy_headers,
                num_pools=self._pool_connections,
                maxsize=self._pool_maxsize,
                block=self._pool_block,
                **proxy_kwargs)

        return manager

    def cert_verify(self, conn, url, verify, cert):
        """Verify a SSL certificate. This method should not be called from user
        code, and is only exposed for use when subclassing the
        :class:`HTTPAdapter <requests.adapters.HTTPAdapter>`.

        :param conn: The urllib3 connection object associated with the cert.
        :param url: The requested URL.
        :param verify: Either a boolean, in which case it controls whether we verify
            the server's TLS certificate, or a string, in which case it must be a path
            to a CA bundle to use
        :param cert: The SSL certificate to verify.
        """
        if url.lower().startswith('https') and verify:

            cert_loc = None

            # Allow self-specified cert location.
            if verify is not True:
                cert_loc = verify

            if not cert_loc:
                cert_loc = DEFAULT_CA_BUNDLE_PATH

            if not cert_loc or not os.path.exists(cert_loc):
                raise IOError("Could not find a suitable TLS CA certificate bundle, "
                              "invalid path: {0}".format(cert_loc))

            conn.cert_reqs = 'CERT_REQUIRED'

            if not os.path.isdir(cert_loc):
                conn.ca_certs = cert_loc
            else:
                conn.ca_cert_dir = cert_loc
        else:
            conn.cert_reqs = 'CERT_NONE'
            conn.ca_certs = None
            conn.ca_cert_dir = None

        if cert:
            if not isinstance(cert, basestring):
                conn.cert_file = cert[0]
                conn.key_file = cert[1]
            else:
                conn.cert_file = cert
                conn.key_file = None
            if conn.cert_file and not os.path.exists(conn.cert_file):
                raise IOError("Could not find the TLS certificate file, "
                              "invalid path: {0}".format(conn.cert_file))
            if conn.key_file and not os.path.exists(conn.key_file):
                raise IOError("Could not find the TLS key file, "
                              "invalid path: {0}".format(conn.key_file))

    def build_response(self, req, resp):
        """Builds a :class:`Response <requests.Response>` object from a urllib3
        response. This should not be called from user code, and is only exposed
        for use when subclassing the
        :class:`HTTPAdapter <requests.adapters.HTTPAdapter>`

        :param req: The :class:`PreparedRequest <PreparedRequest>` used to generate the response.
        :param resp: The urllib3 response object.
        :rtype: requests.Response
        """
        response = Response()

        # Fallback to None if there's no status_code, for whatever reason.
        response.status_code = getattr(resp, 'status', None)

        # Make headers case-insensitive.
        response.headers = CaseInsensitiveDict(getattr(resp, 'headers', {}))

        # Set encoding.
        response.encoding = get_encoding_from_headers(response.headers)
        response.raw = resp
        response.reason = response.raw.reason

        if isinstance(req.url, bytes):
            response.url = req.url.decode('utf-8')
        else:
            response.url = req.url

        # Add new cookies from the server.
        extract_cookies_to_jar(response.cookies, req, resp)

        # Give the Response some context.
        response.request = req
        response.connection = self

        return response

    def get_connection(self, url, proxies=None):
        """Returns a urllib3 connection for the given URL. This should not be
        called from user code, and is only exposed for use when subclassing the
        :class:`HTTPAdapter <requests.adapters.HTTPAdapter>`.

        :param url: The URL to connect to.
        :param proxies: (optional) A Requests-style dictionary of proxies used on this request.
        :rtype: urllib3.ConnectionPool
        """
        proxy = select_proxy(url, proxies)

        if proxy:
            proxy = prepend_scheme_if_needed(proxy, 'http')
            proxy_manager = self.proxy_manager_for(proxy)
            conn = proxy_manager.connection_from_url(url)
        else:
            # Only scheme should be lower case
            parsed = urlparse(url)
            url = parsed.geturl()
            conn = self.poolmanager.connection_from_url(url)

        return conn

    def close(self):
        """Disposes of any internal state.

        Currently, this closes the PoolManager and any active ProxyManager,
        which closes any pooled connections.
        """
        self.poolmanager.clear()
        for proxy in self.proxy_manager.values():
            proxy.clear()

    def request_url(self, request, proxies):
        """Obtain the url to use when making the final request.

        If the message is being sent through an HTTP proxy, the full URL has to
        be used. Otherwise, we should only use the path portion of the URL.

        This should not be called from user code, and is only exposed for use
        when subclassing the
        :class:`HTTPAdapter <requests.adapters.HTTPAdapter>`.

        :param request: The :class:`PreparedRequest <PreparedRequest>` being sent.
        :param proxies: A dictionary of schemes or schemes and hosts to proxy URLs.
        :rtype: str
        """
        proxy = select_proxy(request.url, proxies)
        scheme = urlparse(request.url).scheme

        is_proxied_http_request = (proxy and scheme != 'https')
        using_socks_proxy = False
        if proxy:
            proxy_scheme = urlparse(proxy).scheme.lower()
            using_socks_proxy = proxy_scheme.startswith('socks')

        url = request.path_url
        if is_proxied_http_request and not using_socks_proxy:
            url = urldefragauth(request.url)

        return url
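    # Illustrative example (hypothetical URLs and proxy):
    #
    #     adapter = HTTPAdapter()
    #     req = requests.Request('GET', 'http://example.com/a?b=1').prepare()
    #     adapter.request_url(req, {})                              # -> '/a?b=1'
    #     adapter.request_url(req, {'http': 'http://proxy:3128'})
    #     # -> 'http://example.com/a?b=1' (absolute form for a plain HTTP proxy)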

    def add_headers(self, request, **kwargs):
        """Add any headers needed by the connection. As of v2.0 this does
        nothing by default, but is left for overriding by users that subclass
        the :class:`HTTPAdapter <requests.adapters.HTTPAdapter>`.

        This should not be called from user code, and is only exposed for use
        when subclassing the
        :class:`HTTPAdapter <requests.adapters.HTTPAdapter>`.

        :param request: The :class:`PreparedRequest <PreparedRequest>` to add headers to.
        :param kwargs: The keyword arguments from the call to send().
        """
        pass

    def proxy_headers(self, proxy):
        """Returns a dictionary of the headers to add to any request sent
        through a proxy. This works with urllib3 magic to ensure that they are
        correctly sent to the proxy, rather than in a tunnelled request if
        CONNECT is being used.

        This should not be called from user code, and is only exposed for use
        when subclassing the
        :class:`HTTPAdapter <requests.adapters.HTTPAdapter>`.

        :param proxy: The url of the proxy being used for this request.
        :rtype: dict
        """
        headers = {}
        username, password = get_auth_from_url(proxy)

        if username:
            headers['Proxy-Authorization'] = _basic_auth_str(username,
                                                             password)

        return headers
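    # Illustrative example (made-up credentials): userinfo embedded in the
    # proxy URL is turned into a Proxy-Authorization header.
    #
    #     HTTPAdapter().proxy_headers('http://user:pass@10.10.1.10:3128/')
    #     # -> {'Proxy-Authorization': 'Basic dXNlcjpwYXNz'}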

    def send(self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None):
        """Sends PreparedRequest object. Returns Response object.

        :param request: The :class:`PreparedRequest <PreparedRequest>` being sent.
        :param stream: (optional) Whether to stream the request content.
        :param timeout: (optional) How long to wait for the server to send
            data before giving up, as a float, or a :ref:`(connect timeout,
            read timeout) <timeouts>` tuple.
        :type timeout: float or tuple or urllib3 Timeout object
        :param verify: (optional) Either a boolean, in which case it controls whether
            we verify the server's TLS certificate, or a string, in which case it
            must be a path to a CA bundle to use
        :param cert: (optional) Any user-provided SSL certificate to be trusted.
        :param proxies: (optional) The proxies dictionary to apply to the request.
        :rtype: requests.Response
        """

        conn = self.get_connection(request.url, proxies)

        self.cert_verify(conn, request.url, verify, cert)
        url = self.request_url(request, proxies)
        self.add_headers(request)

        chunked = not (request.body is None or 'Content-Length' in request.headers)

        if isinstance(timeout, tuple):
            try:
                connect, read = timeout
                timeout = TimeoutSauce(connect=connect, read=read)
            except ValueError as e:
                # this may raise a string formatting error.
                err = ("Invalid timeout {0}. Pass a (connect, read) "
                       "timeout tuple, or a single float to set "
                       "both timeouts to the same value".format(timeout))
                raise ValueError(err)
        elif isinstance(timeout, TimeoutSauce):
            pass
        else:
            timeout = TimeoutSauce(connect=timeout, read=timeout)
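        # Illustrative values: timeout=5 applies five seconds to both the
        # connect and the read phase, while timeout=(3.05, 27) waits 3.05s for
        # the connection and 27s for data; both forms end up as a TimeoutSauce.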

        try:
            if not chunked:
                resp = conn.urlopen(
                    method=request.method,
                    url=url,
                    body=request.body,
                    headers=request.headers,
                    redirect=False,
                    assert_same_host=False,
                    preload_content=False,
                    decode_content=False,
                    retries=self.max_retries,
                    timeout=timeout
                )

            # Send the request chunk by chunk when the body length is unknown.
            else:
                if hasattr(conn, 'proxy_pool'):
                    conn = conn.proxy_pool

                low_conn = conn._get_conn(timeout=DEFAULT_POOL_TIMEOUT)

                try:
                    low_conn.putrequest(request.method,
                                        url,
                                        skip_accept_encoding=True)

                    for header, value in request.headers.items():
                        low_conn.putheader(header, value)

                    low_conn.endheaders()

                    for i in request.body:
                        low_conn.send(hex(len(i))[2:].encode('utf-8'))
                        low_conn.send(b'\r\n')
                        low_conn.send(i)
                        low_conn.send(b'\r\n')
                    low_conn.send(b'0\r\n\r\n')

                    # Receive the response from the server
                    try:
                        # For Python 2.7+ versions, use buffering of HTTP
                        # responses
                        r = low_conn.getresponse(buffering=True)
                    except TypeError:
                        # For compatibility with Python 2.6 versions and back
                        r = low_conn.getresponse()

                    resp = HTTPResponse.from_httplib(
                        r,
                        pool=conn,
                        connection=low_conn,
                        preload_content=False,
                        decode_content=False
                    )
                except:
                    # If we hit any problems here, clean up the connection.
                    # Then, reraise so that we can handle the actual exception.
                    low_conn.close()
                    raise

        except (ProtocolError, socket.error) as err:
            raise ConnectionError(err, request=request)

        except MaxRetryError as e:
            if isinstance(e.reason, ConnectTimeoutError):
                # TODO: Remove this in 3.0.0: see #2811
                if not isinstance(e.reason, NewConnectionError):
                    raise ConnectTimeout(e, request=request)

            if isinstance(e.reason, ResponseError):
                raise RetryError(e, request=request)

            if isinstance(e.reason, _ProxyError):
                raise ProxyError(e, request=request)

            if isinstance(e.reason, _SSLError):
                # This branch is for urllib3 v1.22 and later.
                raise SSLError(e, request=request)

            raise ConnectionError(e, request=request)

        except ClosedPoolError as e:
            raise ConnectionError(e, request=request)

        except _ProxyError as e:
            raise ProxyError(e)

        except (_SSLError, _HTTPError) as e:
            if isinstance(e, _SSLError):
                # This branch is for urllib3 versions earlier than v1.22
                raise SSLError(e, request=request)
            elif isinstance(e, ReadTimeoutError):
                raise ReadTimeout(e, request=request)
            else:
                raise

        return self.build_response(request, resp)
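    # Usage sketch (illustrative, not part of this module): HTTPAdapter is
    # normally driven through a Session rather than being called directly.
    #
    #     import requests
    #     from requests.adapters import HTTPAdapter
    #
    #     session = requests.Session()
    #     session.mount('https://', HTTPAdapter(max_retries=3))
    #     response = session.get('https://example.com/', timeout=(3.05, 27))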
site-packages/pip/_vendor/requests/hooks.py000064400000001377147511334610015075 0ustar00# -*- coding: utf-8 -*-

"""
requests.hooks
~~~~~~~~~~~~~~

This module provides the capabilities for the Requests hooks system.

Available hooks:

``response``:
    The response generated from a Request.
"""
HOOKS = ['response']


def default_hooks():
    return dict((event, []) for event in HOOKS)

# TODO: response is the only one


def dispatch_hook(key, hooks, hook_data, **kwargs):
    """Dispatches a hook dictionary on a given piece of data."""
    hooks = hooks or dict()
    hooks = hooks.get(key)
    if hooks:
        if hasattr(hooks, '__call__'):
            hooks = [hooks]
        for hook in hooks:
            _hook_data = hook(hook_data, **kwargs)
            if _hook_data is not None:
                hook_data = _hook_data
    return hook_data
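# Usage sketch (illustrative): a 'response' hook receives the Response object
# and may return a replacement for it; returning None keeps the original.
#
#     import requests
#
#     def log_status(response, *args, **kwargs):
#         print(response.status_code, response.url)
#
#     requests.get('http://httpbin.org/get', hooks={'response': log_status})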
site-packages/pip/_vendor/requests/utils.py000064400000066057147511334610015120 0ustar00# -*- coding: utf-8 -*-

"""
requests.utils
~~~~~~~~~~~~~~

This module provides utility functions that are used within Requests
that are also useful for external consumption.
"""

import cgi
import codecs
import collections
import contextlib
import io
import os
import platform
import re
import socket
import struct
import warnings

from .__version__ import __version__
from . import certs
# to_native_string is unused here, but imported here for backwards compatibility
from ._internal_utils import to_native_string
from .compat import parse_http_list as _parse_list_header
from .compat import (
    quote, urlparse, bytes, str, OrderedDict, unquote, getproxies,
    proxy_bypass, urlunparse, basestring, integer_types, is_py3,
    proxy_bypass_environment, getproxies_environment)
from .cookies import cookiejar_from_dict
from .structures import CaseInsensitiveDict
from .exceptions import (
    InvalidURL, InvalidHeader, FileModeWarning, UnrewindableBodyError)

NETRC_FILES = ('.netrc', '_netrc')

DEFAULT_CA_BUNDLE_PATH = certs.where()

DEFAULT_PORTS = {'http': 80, 'https': 443}

if platform.system() == 'Windows':
    # provide a proxy_bypass version on Windows without DNS lookups

    def proxy_bypass_registry(host):
        if is_py3:
            import winreg
        else:
            import _winreg as winreg
        try:
            internetSettings = winreg.OpenKey(winreg.HKEY_CURRENT_USER,
                r'Software\Microsoft\Windows\CurrentVersion\Internet Settings')
            proxyEnable = winreg.QueryValueEx(internetSettings,
                                              'ProxyEnable')[0]
            proxyOverride = winreg.QueryValueEx(internetSettings,
                                                'ProxyOverride')[0]
        except OSError:
            return False
        if not proxyEnable or not proxyOverride:
            return False

        # make a check value list from the registry entry: replace the
        # '<local>' string by the localhost entry and the corresponding
        # canonical entry.
        proxyOverride = proxyOverride.split(';')
        # now check if we match one of the registry values.
        for test in proxyOverride:
            if test == '<local>':
                if '.' not in host:
                    return True
            test = test.replace(".", r"\.")     # mask dots
            test = test.replace("*", r".*")     # change glob sequence
            test = test.replace("?", r".")      # change glob char
            if re.match(test, host, re.I):
                return True
        return False

    def proxy_bypass(host):  # noqa
        """Return True, if the host should be bypassed.

        Checks proxy settings gathered from the environment, if specified,
        or the registry.
        """
        if getproxies_environment():
            return proxy_bypass_environment(host)
        else:
            return proxy_bypass_registry(host)


def dict_to_sequence(d):
    """Returns an internal sequence dictionary update."""

    if hasattr(d, 'items'):
        d = d.items()

    return d


def super_len(o):
    total_length = None
    current_position = 0

    if hasattr(o, '__len__'):
        total_length = len(o)

    elif hasattr(o, 'len'):
        total_length = o.len

    elif hasattr(o, 'fileno'):
        try:
            fileno = o.fileno()
        except io.UnsupportedOperation:
            pass
        else:
            total_length = os.fstat(fileno).st_size

            # Having used fstat to determine the file length, we need to
            # confirm that this file was opened up in binary mode.
            if 'b' not in o.mode:
                warnings.warn((
                    "Requests has determined the content-length for this "
                    "request using the binary size of the file: however, the "
                    "file has been opened in text mode (i.e. without the 'b' "
                    "flag in the mode). This may lead to an incorrect "
                    "content-length. In Requests 3.0, support will be removed "
                    "for files in text mode."),
                    FileModeWarning
                )

    if hasattr(o, 'tell'):
        try:
            current_position = o.tell()
        except (OSError, IOError):
            # This can happen in some weird situations, such as when the file
            # is actually a special file descriptor like stdin. In this
            # instance, we don't know what the length is, so set it to zero and
            # let requests chunk it instead.
            if total_length is not None:
                current_position = total_length
        else:
            if hasattr(o, 'seek') and total_length is None:
                # StringIO and BytesIO have seek but no usable fileno
                try:
                    # seek to end of file
                    o.seek(0, 2)
                    total_length = o.tell()

                    # seek back to current position to support
                    # partially read file-like objects
                    o.seek(current_position or 0)
                except (OSError, IOError):
                    total_length = 0

    if total_length is None:
        total_length = 0

    return max(0, total_length - current_position)
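# Illustrative example: super_len() reports how many bytes are still left to
# send, taking the current read position into account.
#
#     from io import BytesIO
#     buf = BytesIO(b'abcdef')
#     buf.read(2)
#     super_len(buf)  # -> 4, only the unread tail is counted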


def get_netrc_auth(url, raise_errors=False):
    """Returns the Requests tuple auth for a given url from netrc."""

    try:
        from netrc import netrc, NetrcParseError

        netrc_path = None

        for f in NETRC_FILES:
            try:
                loc = os.path.expanduser('~/{0}'.format(f))
            except KeyError:
                # os.path.expanduser can fail when $HOME is undefined and
                # getpwuid fails. See http://bugs.python.org/issue20164 &
                # https://github.com/requests/requests/issues/1846
                return

            if os.path.exists(loc):
                netrc_path = loc
                break

        # Abort early if there isn't one.
        if netrc_path is None:
            return

        ri = urlparse(url)

        # Strip port numbers from netloc. This weird ``if...decode`` dance is
        # used for Python 3.2, which doesn't support unicode literals.
        splitstr = b':'
        if isinstance(url, str):
            splitstr = splitstr.decode('ascii')
        host = ri.netloc.split(splitstr)[0]

        try:
            _netrc = netrc(netrc_path).authenticators(host)
            if _netrc:
                # Return with login / password
                login_i = (0 if _netrc[0] else 1)
                return (_netrc[login_i], _netrc[2])
        except (NetrcParseError, IOError):
            # If there was a parsing error or a permissions issue reading the file,
            # we'll just skip netrc auth unless explicitly asked to raise errors.
            if raise_errors:
                raise

    # AppEngine hackiness.
    except (ImportError, AttributeError):
        pass
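# Illustrative example (made-up host and credentials): with a ~/.netrc entry
# such as
#
#     machine example.com
#       login alice
#       password s3cret
#
# get_netrc_auth('https://example.com/path') returns ('alice', 's3cret').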


def guess_filename(obj):
    """Tries to guess the filename of the given object."""
    name = getattr(obj, 'name', None)
    if (name and isinstance(name, basestring) and name[0] != '<' and
            name[-1] != '>'):
        return os.path.basename(name)


def from_key_val_list(value):
    """Take an object and test to see if it can be represented as a
    dictionary. If it can be, return an OrderedDict, e.g.,

    ::

        >>> from_key_val_list([('key', 'val')])
        OrderedDict([('key', 'val')])
        >>> from_key_val_list('string')
        ValueError: cannot encode objects that are not 2-tuples
        >>> from_key_val_list({'key': 'val'})
        OrderedDict([('key', 'val')])

    :rtype: OrderedDict
    """
    if value is None:
        return None

    if isinstance(value, (str, bytes, bool, int)):
        raise ValueError('cannot encode objects that are not 2-tuples')

    return OrderedDict(value)


def to_key_val_list(value):
    """Take an object and test to see if it can be represented as a
    dictionary. If it can be, return a list of tuples, e.g.,

    ::

        >>> to_key_val_list([('key', 'val')])
        [('key', 'val')]
        >>> to_key_val_list({'key': 'val'})
        [('key', 'val')]
        >>> to_key_val_list('string')
        ValueError: cannot encode objects that are not 2-tuples.

    :rtype: list
    """
    if value is None:
        return None

    if isinstance(value, (str, bytes, bool, int)):
        raise ValueError('cannot encode objects that are not 2-tuples')

    if isinstance(value, collections.Mapping):
        value = value.items()

    return list(value)


# From mitsuhiko/werkzeug (used with permission).
def parse_list_header(value):
    """Parse lists as described by RFC 2068 Section 2.

    In particular, parse comma-separated lists where the elements of
    the list may include quoted-strings.  A quoted-string could
    contain a comma.  A non-quoted string could have quotes in the
    middle.  Quotes are removed automatically after parsing.

    It basically works like :func:`parse_set_header`, except that items
    may appear multiple times and case sensitivity is preserved.

    The return value is a standard :class:`list`:

    >>> parse_list_header('token, "quoted value"')
    ['token', 'quoted value']

    To create a header from the :class:`list` again, use the
    :func:`dump_header` function.

    :param value: a string with a list header.
    :return: :class:`list`
    :rtype: list
    """
    result = []
    for item in _parse_list_header(value):
        if item[:1] == item[-1:] == '"':
            item = unquote_header_value(item[1:-1])
        result.append(item)
    return result


# From mitsuhiko/werkzeug (used with permission).
def parse_dict_header(value):
    """Parse lists of key, value pairs as described by RFC 2068 Section 2 and
    convert them into a python dict:

    >>> d = parse_dict_header('foo="is a fish", bar="as well"')
    >>> type(d) is dict
    True
    >>> sorted(d.items())
    [('bar', 'as well'), ('foo', 'is a fish')]

    If there is no value for a key it will be `None`:

    >>> parse_dict_header('key_without_value')
    {'key_without_value': None}

    To create a header from the :class:`dict` again, use the
    :func:`dump_header` function.

    :param value: a string with a dict header.
    :return: :class:`dict`
    :rtype: dict
    """
    result = {}
    for item in _parse_list_header(value):
        if '=' not in item:
            result[item] = None
            continue
        name, value = item.split('=', 1)
        if value[:1] == value[-1:] == '"':
            value = unquote_header_value(value[1:-1])
        result[name] = value
    return result


# From mitsuhiko/werkzeug (used with permission).
def unquote_header_value(value, is_filename=False):
    r"""Unquotes a header value.  (Reversal of :func:`quote_header_value`).
    This does not apply the real (RFC) unquoting, but instead reverses the
    quoting scheme that browsers actually use.

    :param value: the header value to unquote.
    :rtype: str
    """
    if value and value[0] == value[-1] == '"':
        # this is not the real unquoting, but fixing this so that the
        # RFC is met will result in bugs with internet explorer and
        # probably some other browsers as well.  IE for example is
        # uploading files with "C:\foo\bar.txt" as filename
        value = value[1:-1]

        # if this is a filename and the starting characters look like
        # a UNC path, then just return the value without quotes.  Using the
        # replace sequence below on a UNC path has the effect of turning
        # the leading double slash into a single slash and then
        # _fix_ie_filename() doesn't work correctly.  See #458.
        if not is_filename or value[:2] != '\\\\':
            return value.replace('\\\\', '\\').replace('\\"', '"')
    return value


def dict_from_cookiejar(cj):
    """Returns a key/value dictionary from a CookieJar.

    :param cj: CookieJar object to extract cookies from.
    :rtype: dict
    """

    cookie_dict = {}

    for cookie in cj:
        cookie_dict[cookie.name] = cookie.value

    return cookie_dict


def add_dict_to_cookiejar(cj, cookie_dict):
    """Returns a CookieJar from a key/value dictionary.

    :param cj: CookieJar to insert cookies into.
    :param cookie_dict: Dict of key/values to insert into CookieJar.
    :rtype: CookieJar
    """

    return cookiejar_from_dict(cookie_dict, cj)


def get_encodings_from_content(content):
    """Returns encodings from given content string.

    :param content: bytestring to extract encodings from.
    """
    warnings.warn((
        'In requests 3.0, get_encodings_from_content will be removed. For '
        'more information, please see the discussion on issue #2266. (This'
        ' warning should only appear once.)'),
        DeprecationWarning)

    charset_re = re.compile(r'<meta.*?charset=["\']*(.+?)["\'>]', flags=re.I)
    pragma_re = re.compile(r'<meta.*?content=["\']*;?charset=(.+?)["\'>]', flags=re.I)
    xml_re = re.compile(r'^<\?xml.*?encoding=["\']*(.+?)["\'>]')

    return (charset_re.findall(content) +
            pragma_re.findall(content) +
            xml_re.findall(content))


def get_encoding_from_headers(headers):
    """Returns encodings from given HTTP Header Dict.

    :param headers: dictionary to extract encoding from.
    :rtype: str
    """

    content_type = headers.get('content-type')

    if not content_type:
        return None

    content_type, params = cgi.parse_header(content_type)

    if 'charset' in params:
        return params['charset'].strip("'\"")

    if 'text' in content_type:
        return 'ISO-8859-1'
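# Illustrative examples:
#
#     get_encoding_from_headers({'content-type': 'text/html; charset=utf-8'})
#     # -> 'utf-8'
#     get_encoding_from_headers({'content-type': 'text/html'})
#     # -> 'ISO-8859-1' (the RFC 2616 default for text/* without a charset)
#     get_encoding_from_headers({'content-type': 'application/json'})
#     # -> None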


def stream_decode_response_unicode(iterator, r):
    """Stream decodes a iterator."""

    if r.encoding is None:
        for item in iterator:
            yield item
        return

    decoder = codecs.getincrementaldecoder(r.encoding)(errors='replace')
    for chunk in iterator:
        rv = decoder.decode(chunk)
        if rv:
            yield rv
    rv = decoder.decode(b'', final=True)
    if rv:
        yield rv


def iter_slices(string, slice_length):
    """Iterate over slices of a string."""
    pos = 0
    if slice_length is None or slice_length <= 0:
        slice_length = len(string)
    while pos < len(string):
        yield string[pos:pos + slice_length]
        pos += slice_length


def get_unicode_from_response(r):
    """Returns the requested content back in unicode.

    :param r: Response object to get unicode content from.

    The following are tried, in order:

    1. the charset from the content-type header
    2. falling back to decoding with replacement of undecodable characters

    :rtype: str
    """
    warnings.warn((
        'In requests 3.0, get_unicode_from_response will be removed. For '
        'more information, please see the discussion on issue #2266. (This'
        ' warning should only appear once.)'),
        DeprecationWarning)

    tried_encodings = []

    # Try charset from content-type
    encoding = get_encoding_from_headers(r.headers)

    if encoding:
        try:
            return str(r.content, encoding)
        except UnicodeError:
            tried_encodings.append(encoding)

    # Fall back:
    try:
        return str(r.content, encoding, errors='replace')
    except TypeError:
        return r.content


# The unreserved URI characters (RFC 3986)
UNRESERVED_SET = frozenset(
    "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz" + "0123456789-._~")


def unquote_unreserved(uri):
    """Un-escape any percent-escape sequences in a URI that are unreserved
    characters. This leaves all reserved, illegal and non-ASCII bytes encoded.

    :rtype: str
    """
    parts = uri.split('%')
    for i in range(1, len(parts)):
        h = parts[i][0:2]
        if len(h) == 2 and h.isalnum():
            try:
                c = chr(int(h, 16))
            except ValueError:
                raise InvalidURL("Invalid percent-escape sequence: '%s'" % h)

            if c in UNRESERVED_SET:
                parts[i] = c + parts[i][2:]
            else:
                parts[i] = '%' + parts[i]
        else:
            parts[i] = '%' + parts[i]
    return ''.join(parts)


def requote_uri(uri):
    """Re-quote the given URI.

    This function passes the given URI through an unquote/quote cycle to
    ensure that it is fully and consistently quoted.

    :rtype: str
    """
    safe_with_percent = "!#$%&'()*+,/:;=?@[]~"
    safe_without_percent = "!#$&'()*+,/:;=?@[]~"
    try:
        # Unquote only the unreserved characters
        # Then quote only illegal characters (do not quote reserved,
        # unreserved, or '%')
        return quote(unquote_unreserved(uri), safe=safe_with_percent)
    except InvalidURL:
        # We couldn't unquote the given URI, so let's try quoting it, but
        # there may be unquoted '%'s in the URI. We need to make sure they're
        # properly quoted so they do not cause issues elsewhere.
        return quote(uri, safe=safe_without_percent)
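# Illustrative example: already percent-encoded sequences survive the round
# trip, while characters that are illegal in a URI get escaped.
#
#     requote_uri('http://example.com/a%20b?q=hello world')
#     # -> 'http://example.com/a%20b?q=hello%20world'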


def address_in_network(ip, net):
    """This function allows you to check if an IP belongs to a network subnet

    Example: returns True if ip = 192.168.1.1 and net = 192.168.1.0/24
             returns False if ip = 192.168.1.1 and net = 192.168.100.0/24

    :rtype: bool
    """
    ipaddr = struct.unpack('=L', socket.inet_aton(ip))[0]
    netaddr, bits = net.split('/')
    netmask = struct.unpack('=L', socket.inet_aton(dotted_netmask(int(bits))))[0]
    network = struct.unpack('=L', socket.inet_aton(netaddr))[0] & netmask
    return (ipaddr & netmask) == (network & netmask)


def dotted_netmask(mask):
    """Converts mask from /xx format to xxx.xxx.xxx.xxx

    Example: if mask is 24 function returns 255.255.255.0

    :rtype: str
    """
    bits = 0xffffffff ^ (1 << 32 - mask) - 1
    return socket.inet_ntoa(struct.pack('>I', bits))


def is_ipv4_address(string_ip):
    """
    :rtype: bool
    """
    try:
        socket.inet_aton(string_ip)
    except socket.error:
        return False
    return True


def is_valid_cidr(string_network):
    """
    Very simple check of the cidr format in no_proxy variable.

    :rtype: bool
    """
    if string_network.count('/') == 1:
        try:
            mask = int(string_network.split('/')[1])
        except ValueError:
            return False

        if mask < 1 or mask > 32:
            return False

        try:
            socket.inet_aton(string_network.split('/')[0])
        except socket.error:
            return False
    else:
        return False
    return True


@contextlib.contextmanager
def set_environ(env_name, value):
    """Set the environment variable 'env_name' to 'value'

    Save previous value, yield, and then restore the previous value stored in
    the environment variable 'env_name'.

    If 'value' is None, do nothing"""
    value_changed = value is not None
    if value_changed:
        old_value = os.environ.get(env_name)
        os.environ[env_name] = value
    try:
        yield
    finally:
        if value_changed:
            if old_value is None:
                del os.environ[env_name]
            else:
                os.environ[env_name] = old_value


def should_bypass_proxies(url, no_proxy):
    """
    Returns whether we should bypass proxies or not.

    :rtype: bool
    """
    get_proxy = lambda k: os.environ.get(k) or os.environ.get(k.upper())

    # First check whether no_proxy is defined. If it is, check that the URL
    # we're getting isn't in the no_proxy list.
    no_proxy_arg = no_proxy
    if no_proxy is None:
        no_proxy = get_proxy('no_proxy')
    netloc = urlparse(url).netloc

    if no_proxy:
        # We need to check whether the netloc (both with and without the
        # port) matches the end of any of the no_proxy entries.
        no_proxy = (
            host for host in no_proxy.replace(' ', '').split(',') if host
        )

        ip = netloc.split(':')[0]
        if is_ipv4_address(ip):
            for proxy_ip in no_proxy:
                if is_valid_cidr(proxy_ip):
                    if address_in_network(ip, proxy_ip):
                        return True
                elif ip == proxy_ip:
                    # The no_proxy entry was given in plain IP notation
                    # (rather than CIDR notation) and matches the URL's IP.
                    return True
        else:
            for host in no_proxy:
                if netloc.endswith(host) or netloc.split(':')[0].endswith(host):
                    # The URL does match something in no_proxy, so we don't want
                    # to apply the proxies on this URL.
                    return True

    # If the system proxy settings indicate that this URL should be bypassed,
    # don't proxy.
    # The proxy_bypass function is incredibly buggy on OS X in early versions
    # of Python 2.6, so allow this call to fail. Only catch the specific
    # exceptions we've seen, though: this call failing in other ways can reveal
    # legitimate problems.
    with set_environ('no_proxy', no_proxy_arg):
        try:
            bypass = proxy_bypass(netloc)
        except (TypeError, socket.gaierror):
            bypass = False

    if bypass:
        return True

    return False
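# Illustrative example (made-up hosts): with no_proxy set to
# 'localhost,.internal.example.com,10.0.0.0/8', proxies are bypassed for
# 'http://api.internal.example.com/' (suffix match) and 'http://10.1.2.3/'
# (CIDR match), but not for 'http://example.org/'.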


def get_environ_proxies(url, no_proxy=None):
    """
    Return a dict of environment proxies.

    :rtype: dict
    """
    if should_bypass_proxies(url, no_proxy=no_proxy):
        return {}
    else:
        return getproxies()


def select_proxy(url, proxies):
    """Select a proxy for the url, if applicable.

    :param url: The url of the request
    :param proxies: A dictionary of schemes or schemes and hosts to proxy URLs
    """
    proxies = proxies or {}
    urlparts = urlparse(url)
    if urlparts.hostname is None:
        return proxies.get(urlparts.scheme, proxies.get('all'))

    proxy_keys = [
        urlparts.scheme + '://' + urlparts.hostname,
        urlparts.scheme,
        'all://' + urlparts.hostname,
        'all',
    ]
    proxy = None
    for proxy_key in proxy_keys:
        if proxy_key in proxies:
            proxy = proxies[proxy_key]
            break

    return proxy
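# Illustrative example (made-up proxies): the most specific key wins.
#
#     proxies = {'http://example.com': 'http://proxy-a:8080',
#                'http': 'http://proxy-b:8080',
#                'all': 'http://proxy-c:8080'}
#     select_proxy('http://example.com/page', proxies)  # -> 'http://proxy-a:8080'
#     select_proxy('http://other.org/', proxies)        # -> 'http://proxy-b:8080'
#     select_proxy('ftp://other.org/', proxies)         # -> 'http://proxy-c:8080'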


def default_user_agent(name="python-requests"):
    """
    Return a string representing the default user agent.

    :rtype: str
    """
    return '%s/%s' % (name, __version__)


def default_headers():
    """
    :rtype: requests.structures.CaseInsensitiveDict
    """
    return CaseInsensitiveDict({
        'User-Agent': default_user_agent(),
        'Accept-Encoding': ', '.join(('gzip', 'deflate')),
        'Accept': '*/*',
        'Connection': 'keep-alive',
    })


def parse_header_links(value):
    """Return a dict of parsed link headers proxies.

    i.e. Link: <http:/.../front.jpeg>; rel=front; type="image/jpeg",<http://.../back.jpeg>; rel=back;type="image/jpeg"

    :rtype: list
    """

    links = []

    replace_chars = ' \'"'

    for val in re.split(', *<', value):
        try:
            url, params = val.split(';', 1)
        except ValueError:
            url, params = val, ''

        link = {'url': url.strip('<> \'"')}

        for param in params.split(';'):
            try:
                key, value = param.split('=')
            except ValueError:
                break

            link[key.strip(replace_chars)] = value.strip(replace_chars)

        links.append(link)

    return links
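# Illustrative example (hypothetical pagination links):
#
#     parse_header_links('<http://example.com/page/2>; rel="next", '
#                        '<http://example.com/page/10>; rel="last"')
#     # -> [{'url': 'http://example.com/page/2', 'rel': 'next'},
#     #     {'url': 'http://example.com/page/10', 'rel': 'last'}]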


# Null bytes; no need to recreate these on each call to guess_json_utf
_null = '\x00'.encode('ascii')  # encoding to ASCII for Python 3
_null2 = _null * 2
_null3 = _null * 3


def guess_json_utf(data):
    """
    :rtype: str
    """
    # JSON always starts with two ASCII characters, so detection is as
    # easy as counting the null bytes: their number and position
    # determine the encoding. Also detect a BOM, if present.
    sample = data[:4]
    if sample in (codecs.BOM_UTF32_LE, codecs.BOM_UTF32_BE):
        return 'utf-32'     # BOM included
    if sample[:3] == codecs.BOM_UTF8:
        return 'utf-8-sig'  # BOM included, MS style (discouraged)
    if sample[:2] in (codecs.BOM_UTF16_LE, codecs.BOM_UTF16_BE):
        return 'utf-16'     # BOM included
    nullcount = sample.count(_null)
    if nullcount == 0:
        return 'utf-8'
    if nullcount == 2:
        if sample[::2] == _null2:   # 1st and 3rd are null
            return 'utf-16-be'
        if sample[1::2] == _null2:  # 2nd and 4th are null
            return 'utf-16-le'
        # Did not detect 2 valid UTF-16 ascii-range characters
    if nullcount == 3:
        if sample[:3] == _null3:
            return 'utf-32-be'
        if sample[1:] == _null3:
            return 'utf-32-le'
        # Did not detect a valid UTF-32 ascii-range character
    return None
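# Illustrative examples: the detection needs only the first four bytes.
#
#     guess_json_utf(b'{"a": 1}')                     # -> 'utf-8'
#     guess_json_utf('{"a": 1}'.encode('utf-16'))     # -> 'utf-16' (BOM present)
#     guess_json_utf('{"a": 1}'.encode('utf-32-be'))  # -> 'utf-32-be'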


def prepend_scheme_if_needed(url, new_scheme):
    """Given a URL that may or may not have a scheme, prepend the given scheme.
    Does not replace a present scheme with the one provided as an argument.

    :rtype: str
    """
    scheme, netloc, path, params, query, fragment = urlparse(url, new_scheme)

    # urlparse is a finicky beast, and sometimes decides that there isn't a
    # netloc present. Assume that it's being over-cautious, and switch netloc
    # and path if urlparse decided there was no netloc.
    if not netloc:
        netloc, path = path, netloc

    return urlunparse((scheme, netloc, path, params, query, fragment))


def get_auth_from_url(url):
    """Given a url with authentication components, extract them into a tuple of
    username, password.

    :rtype: (str,str)
    """
    parsed = urlparse(url)

    try:
        auth = (unquote(parsed.username), unquote(parsed.password))
    except (AttributeError, TypeError):
        auth = ('', '')

    return auth


# Moved outside of function to avoid recompile every call
_CLEAN_HEADER_REGEX_BYTE = re.compile(b'^\\S[^\\r\\n]*$|^$')
_CLEAN_HEADER_REGEX_STR = re.compile(r'^\S[^\r\n]*$|^$')


def check_header_validity(header):
    """Verifies that header value is a string which doesn't contain
    leading whitespace or return characters. This prevents unintended
    header injection.

    :param header: tuple, in the format (name, value).
    """
    name, value = header

    if isinstance(value, bytes):
        pat = _CLEAN_HEADER_REGEX_BYTE
    else:
        pat = _CLEAN_HEADER_REGEX_STR
    try:
        if not pat.match(value):
            raise InvalidHeader("Invalid return character or leading space in header: %s" % name)
    except TypeError:
        raise InvalidHeader("Value for header {%s: %s} must be of type str or "
                            "bytes, not %s" % (name, value, type(value)))


def urldefragauth(url):
    """
    Given a url remove the fragment and the authentication part.

    :rtype: str
    """
    scheme, netloc, path, params, query, fragment = urlparse(url)

    # see func:`prepend_scheme_if_needed`
    if not netloc:
        netloc, path = path, netloc

    netloc = netloc.rsplit('@', 1)[-1]

    return urlunparse((scheme, netloc, path, params, query, ''))


def rewind_body(prepared_request):
    """Move file pointer back to its recorded starting position
    so it can be read again on redirect.
    """
    body_seek = getattr(prepared_request.body, 'seek', None)
    if body_seek is not None and isinstance(prepared_request._body_position, integer_types):
        try:
            body_seek(prepared_request._body_position)
        except (IOError, OSError):
            raise UnrewindableBodyError("An error occurred when rewinding request "
                                        "body for redirect.")
    else:
        raise UnrewindableBodyError("Unable to rewind request body for redirect.")


���e]�@s\dZddlmZdd�Zddd�Zdd	�Zd
d�Zddd
�Zddd�Zddd�Z	dd�Z
dS)z�
requests.api
~~~~~~~~~~~~

This module implements the Requests API.

:copyright: (c) 2012 by Kenneth Reitz.
:license: Apache2, see LICENSE for more details.
�)�sessionscKs*tj��}|jf||d�|��SQRXdS)a�	Constructs and sends a :class:`Request <Request>`.

    :param method: method for the new :class:`Request` object.
    :param url: URL for the new :class:`Request` object.
    :param params: (optional) Dictionary or bytes to be sent in the query string for the :class:`Request`.
    :param data: (optional) Dictionary or list of tuples ``[(key, value)]`` (will be form-encoded), bytes, or file-like object to send in the body of the :class:`Request`.
    :param json: (optional) json data to send in the body of the :class:`Request`.
    :param headers: (optional) Dictionary of HTTP Headers to send with the :class:`Request`.
    :param cookies: (optional) Dict or CookieJar object to send with the :class:`Request`.
    :param files: (optional) Dictionary of ``'name': file-like-objects`` (or ``{'name': file-tuple}``) for multipart encoding upload.
        ``file-tuple`` can be a 2-tuple ``('filename', fileobj)``, 3-tuple ``('filename', fileobj, 'content_type')``
        or a 4-tuple ``('filename', fileobj, 'content_type', custom_headers)``, where ``'content-type'`` is a string
        defining the content type of the given file and ``custom_headers`` a dict-like object containing additional headers
        to add for the file.
    :param auth: (optional) Auth tuple to enable Basic/Digest/Custom HTTP Auth.
    :param timeout: (optional) How many seconds to wait for the server to send data
        before giving up, as a float, or a :ref:`(connect timeout, read
        timeout) <timeouts>` tuple.
    :type timeout: float or tuple
    :param allow_redirects: (optional) Boolean. Enable/disable GET/OPTIONS/POST/PUT/PATCH/DELETE/HEAD redirection. Defaults to ``True``.
    :type allow_redirects: bool
    :param proxies: (optional) Dictionary mapping protocol to the URL of the proxy.
    :param verify: (optional) Either a boolean, in which case it controls whether we verify
            the server's TLS certificate, or a string, in which case it must be a path
            to a CA bundle to use. Defaults to ``True``.
    :param stream: (optional) if ``False``, the response content will be immediately downloaded.
    :param cert: (optional) if String, path to ssl client cert file (.pem). If Tuple, ('cert', 'key') pair.
    :return: :class:`Response <Response>` object
    :rtype: requests.Response

    Usage::

      >>> import requests
      >>> req = requests.request('GET', 'http://httpbin.org/get')
      <Response [200]>
    )�method�urlN)rZSession�request)rr�kwargsZsession�r�/usr/lib/python3.6/api.pyrs)
rNcKs"|jdd�td|fd|i|��S)aOSends a GET request.

    :param url: URL for the new :class:`Request` object.
    :param params: (optional) Dictionary or bytes to be sent in the query string for the :class:`Request`.
    :param \*\*kwargs: Optional arguments that ``request`` takes.
    :return: :class:`Response <Response>` object
    :rtype: requests.Response
    �allow_redirectsT�get�params)�
setdefaultr)rrrrrrr
=s
r
cKs|jdd�td|f|�S)z�Sends an OPTIONS request.

    :param url: URL for the new :class:`Request` object.
    :param \*\*kwargs: Optional arguments that ``request`` takes.
    :return: :class:`Response <Response>` object
    :rtype: requests.Response
    r	T�options)rr)rrrrrr
Ks	r
cKs|jdd�td|f|�S)z�Sends a HEAD request.

    :param url: URL for the new :class:`Request` object.
    :param \*\*kwargs: Optional arguments that ``request`` takes.
    :return: :class:`Response <Response>` object
    :rtype: requests.Response
    r	F�head)rr)rrrrrrXs	rcKstd|f||d�|��S)a�Sends a POST request.

    :param url: URL for the new :class:`Request` object.
    :param data: (optional) Dictionary (will be form-encoded), bytes, or file-like object to send in the body of the :class:`Request`.
    :param json: (optional) json data to send in the body of the :class:`Request`.
    :param \*\*kwargs: Optional arguments that ``request`` takes.
    :return: :class:`Response <Response>` object
    :rtype: requests.Response
    �post)�data�json)r)rrrrrrrresrcKstd|fd|i|��S)a�Sends a PUT request.

    :param url: URL for the new :class:`Request` object.
    :param data: (optional) Dictionary (will be form-encoded), bytes, or file-like object to send in the body of the :class:`Request`.
    :param json: (optional) json data to send in the body of the :class:`Request`.
    :param \*\*kwargs: Optional arguments that ``request`` takes.
    :return: :class:`Response <Response>` object
    :rtype: requests.Response
    �putr)r)rrrrrrrssrcKstd|fd|i|��S)a�Sends a PATCH request.

    :param url: URL for the new :class:`Request` object.
    :param data: (optional) Dictionary (will be form-encoded), bytes, or file-like object to send in the body of the :class:`Request`.
    :param json: (optional) json data to send in the body of the :class:`Request`.
    :param \*\*kwargs: Optional arguments that ``request`` takes.
    :return: :class:`Response <Response>` object
    :rtype: requests.Response
    �patchr)r)rrrrrrr�srcKstd|f|�S)z�Sends a DELETE request.

    :param url: URL for the new :class:`Request` object.
    :param \*\*kwargs: Optional arguments that ``request`` takes.
    :return: :class:`Response <Response>` object
    :rtype: requests.Response
    �delete)r)rrrrrr�s	r)N)NN)N)N)�__doc__�rrr
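Each helper above simply delegates to ``request`` with the method name filled in. A short illustrative sketch of the per-verb helpers (the URLs are illustrative; any reachable HTTP endpoint would do)::

    import requests

    # GET with query parameters (get() sets allow_redirects=True by default).
    r = requests.get('http://httpbin.org/get', params={'page': 2})

    # POST a form-encoded body; pass json=... instead to send a JSON body.
    r = requests.post('http://httpbin.org/post', data={'key': 'value'})

    # PUT and PATCH take the same data/json keyword arguments as post().
    r = requests.put('http://httpbin.org/put', json={'name': 'example'})

    # HEAD disables redirects by default, mirroring the head() helper above.
    r = requests.head('http://httpbin.org/get')

    r = requests.delete('http://httpbin.org/delete')
    print(r.status_code)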
r
rrrrrrrrr�<module>s-




site-packages/pip/_vendor/requests/__pycache__/__init__.cpython-36.pyc000064400000006060147511334610021767 0ustar003

���e�
�@s�dZddlmZddlmZddlZddlmZdd�Zyeejej�Wn0e	e
fk
rzejd	jejej�e�YnXdd
l
mZejde�ddlmZmZmZmZdd
lmZmZmZmZddlmZmZddlmZddlmZddlmZmZmZddl m!Z!m"Z"m#Z#m$Z$m%Z%m&Z&m'Z'm(Z(ddl)m*Z*m+Z+ddl,m-Z-ddlm.Z.m/Z/m0Z0m1Z1m2Z2m3Z3m4Z4m5Z5m6Z6ddl7Z7yddl7m8Z8Wn(e9k
�r�Gdd�de7j:�Z8YnXe7j;e<�j=e8��ejde4dd�dS)a�
Requests HTTP Library
~~~~~~~~~~~~~~~~~~~~~

Requests is an HTTP library, written in Python, for human beings. Basic GET
usage:

   >>> import requests
   >>> r = requests.get('https://www.python.org')
   >>> r.status_code
   200
   >>> 'Python is a programming language' in r.content
   True

... or POST:

   >>> payload = dict(key1='value1', key2='value2')
   >>> r = requests.post('http://httpbin.org/post', data=payload)
   >>> print(r.text)
   {
     ...
     "form": {
       "key2": "value2",
       "key1": "value1"
     },
     ...
   }

The other HTTP methods are supported - see `requests.api`. Full documentation
is at <http://python-requests.org>.

:copyright: (c) 2017 by Kenneth Reitz.
:license: Apache 2.0, see LICENSE for more details.
�)�urllib3)�chardetN�)�RequestsDependencyWarningcCs�|jd�}|dgkst�t|�dkr.|jd�|\}}}t|�t|�t|�}}}|dks`t�|dkslt�|dksxt�|jd�dd�\}}}t|�t|�t|�}}}|dks�t�|dks�t�|dks�t�dS)	N�.Zdev��0r���)�split�AssertionError�len�append�int)Zurllib3_versionZchardet_version�major�minor�patch�r�/usr/lib/python3.6/__init__.py�check_compatibility1s


rzAurllib3 ({0}) or chardet ({1}) doesn't match a supported version!)�DependencyWarning�ignore)�	__title__�__description__�__url__�__version__)�	__build__�
__author__�__author_email__�__license__)�
__copyright__�__cake__)�utils)�packages)�Request�Response�PreparedRequest)�request�get�head�postr�put�delete�options)�session�Session)�codes)	�RequestException�Timeout�URLRequired�TooManyRedirects�	HTTPError�ConnectionError�FileModeWarning�ConnectTimeout�ReadTimeout)�NullHandlerc@seZdZdd�ZdS)r;cCsdS)Nr)�self�recordrrr�emitsszNullHandler.emitN)�__name__�
__module__�__qualname__r>rrrrr;rsr;�defaultT)r)>�__doc__Zpip._vendorrr�warnings�
exceptionsrrrr
�
ValueError�warn�formatZpip._vendor.urllib3.exceptionsr�simplefilterrrrrrrr r!r"�r#r$Zmodelsr%r&r'Zapir(r)r*r+rr,r-r.Zsessionsr/r0Zstatus_codesr1r2r3r4r5r6r7r8r9r:Zloggingr;�ImportErrorZHandlerZ	getLoggerr?Z
addHandlerrrrr�<module>)s<
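The tail of the module above attaches a ``NullHandler`` to the package logger so the library stays silent unless the application configures logging. A hedged sketch of how a caller might opt in to debug output (standard-library logging only; the logger name ``'requests'`` applies to the standalone package, while the vendored copy logs under ``'pip._vendor.requests'``)::

    import logging
    import requests

    # Attach a real handler and lower the level; urllib3's connection-pool
    # messages ("Starting new HTTP connection ...") then become visible.
    logging.basicConfig(level=logging.DEBUG)
    logging.getLogger('requests').setLevel(logging.DEBUG)

    requests.get('http://httpbin.org/get')   # connection details now logged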

(,site-packages/pip/_vendor/requests/__pycache__/packages.cpython-36.opt-1.pyc000064400000000676147511334610022754 0ustar003

���e��@s~ddlZxpdD]hZdeZee�e�e<xLeej�D]>ZeeksNejed�r4ee	d�d�Z
ejeejde
<q4WqWdS)	�N�urllib3�idna�chardetzpip._vendor.�.zpip._vendor.requests.packages.)rrr)�sys�packageZvendored_package�
__import__�locals�list�modules�mod�
startswith�lenZunprefixed_mod�rr�/usr/lib/python3.6/packages.py�<module>s
site-packages/pip/_vendor/requests/__pycache__/status_codes.cpython-36.pyc000064400000007012147511334610022726 0ustar003

���e��F@s�ddlmZd�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�dz�DZed{d|�ZxNej�D]B\ZZx6eD].Zeeee�ej	dă�s�eeej
�e��q�W�q�WdS)��)�
LookupDict�continue�switching_protocols�
processing�
checkpoint�uri_too_long�request_uri_too_long�ok�okay�all_ok�all_okay�all_good�\o/�✓�created�accepted�non_authoritative_info�non_authoritative_information�
no_content�
reset_content�reset�partial_content�partial�multi_status�multiple_status�multi_stati�multiple_stati�already_reported�im_used�multiple_choices�moved_permanently�moved�\o-�found�	see_other�other�not_modified�	use_proxy�switch_proxy�temporary_redirect�temporary_moved�	temporary�permanent_redirect�resume_incomplete�resume�bad_request�bad�unauthorized�payment_required�payment�	forbidden�	not_found�-o-�method_not_allowed�not_allowed�not_acceptable�proxy_authentication_required�
proxy_auth�proxy_authentication�request_timeout�timeout�conflict�gone�length_required�precondition_failed�precondition�request_entity_too_large�request_uri_too_large�unsupported_media_type�unsupported_media�
media_type�requested_range_not_satisfiable�requested_range�range_not_satisfiable�expectation_failed�im_a_teapot�teapot�
i_am_a_teapot�misdirected_request�unprocessable_entity�
unprocessable�locked�failed_dependency�
dependency�unordered_collection�	unordered�upgrade_required�upgrade�precondition_required�too_many_requests�too_many�header_fields_too_large�fields_too_large�no_response�none�
retry_with�retry�$blocked_by_windows_parental_controls�parental_controls�unavailable_for_legal_reasons�
legal_reasons�client_closed_request�internal_server_error�server_error�/o\�✗�not_implemented�bad_gateway�service_unavailable�unavailable�gateway_timeout�http_version_not_supported�http_version�variant_also_negotiates�insufficient_storage�bandwidth_limit_exceeded�	bandwidth�not_extended�network_authentication_required�network_auth�network_authentication)D�d�e�f�g�z��������������������i,i-i.i/i0i1i2i3i4i�i�i�i�i�i�i�i�i�i�i�i�i�i�i�i�i�i�i�i�i�i�i�i�i�i�i�i�i�i�i�i�i�i�i�i�i�i�i�i�i�i�i�i�Zstatus_codes)�name�\�/N)r)r)r)r)rr)r	r
rrr
rr)r)r)rr)r)rr)rr)rrrr)r)r)r)r r!r")r#)r$r%)r&)r')r()r)r*r+)r,r-r.)r/r0)r1)r2r3)r4)r5r6)r7r8)r9)r:r;r<)r=r>)r?)r@)rA)rBrC)rD)rE)rFrGrH)rIrJrK)rL)rMrNrO)rP)rQrR)rS)rTrU)rVrW)rXrY)rZrC)r[r\)r]r^)r_r`)rarb)rcrd)rerf)rg)rhrirjrk)rl)rm)rnro)rp)rqrr)rs)rt)rurv)rw)rxryrz)r�r�)Z
structuresrZ_codesZcodes�items�codeZtitles�title�setattr�
startswith�upper�r�r��"/usr/lib/python3.6/status_codes.py�<module>s�
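The table of names above feeds a ``LookupDict`` exposed as ``requests.codes``, where every alias maps to its numeric status code. A small sketch (assuming the standalone ``requests`` package and a reachable httpbin.org)::

    import requests

    # Several aliases point at the same number.
    assert requests.codes.ok == requests.codes['\\o/'] == 200
    assert requests.codes.not_found == 404
    assert requests.codes.teapot == 418

    # Typical use: compare a response against a named code.
    r = requests.get('http://httpbin.org/status/404')
    if r.status_code == requests.codes.not_found:
        print('missing resource')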

site-packages/pip/_vendor/requests/__pycache__/adapters.cpython-36.opt-1.pyc000064400000040074147511334610022775 0ustar003

���eR�@s�dZddlZddlZddlmZmZddlmZddl	m
Zddlm
Z
ddlmZddlmZdd	lmZdd
lmZddlmZddlmZdd
lmZddlmZddlmZddlmZddlmZddlmZm Z ddl!m"Z"m#Z#m$Z$m%Z%m&Z&m'Z'ddl(m)Z)ddl*m+Z+ddl,m-Z-m.Z.m/Z/mZmZm0Z0m1Z1ddl2m3Z3yddl4m5Z5Wne6k
�rrdd�Z5YnXdZ7dZ8dZ9dZ:Gdd�de;�Z<Gd d!�d!e<�Z=dS)"z�
requests.adapters
~~~~~~~~~~~~~~~~~

This module contains the transport adapters that Requests uses to define
and maintain connections.
�N)�PoolManager�proxy_from_url)�HTTPResponse)�Timeout)�Retry)�ClosedPoolError)�ConnectTimeoutError)�	HTTPError)�
MaxRetryError)�NewConnectionError)�
ProxyError)�
ProtocolError)�ReadTimeoutError)�SSLError)�
ResponseError�)�Response)�urlparse�
basestring)�DEFAULT_CA_BUNDLE_PATH�get_encoding_from_headers�prepend_scheme_if_needed�get_auth_from_url�
urldefragauth�select_proxy)�CaseInsensitiveDict)�extract_cookies_to_jar)�ConnectionError�ConnectTimeout�ReadTimeoutrr�
RetryError�
InvalidSchema)�_basic_auth_str)�SOCKSProxyManagercOstd��dS)Nz'Missing dependencies for SOCKS support.)r!)�args�kwargs�r&�/usr/lib/python3.6/adapters.pyr#+sr#F�
cs2eZdZdZ�fdd�Zddd�Zd	d
�Z�ZS)�BaseAdapterzThe Base Transport Adaptercstt|�j�dS)N)�superr)�__init__)�self)�	__class__r&r'r+7szBaseAdapter.__init__FNTcCst�dS)aCSends PreparedRequest object. Returns Response object.

        :param request: The :class:`PreparedRequest <PreparedRequest>` being sent.
        :param stream: (optional) Whether to stream the request content.
        :param timeout: (optional) How long to wait for the server to send
            data before giving up, as a float, or a :ref:`(connect timeout,
            read timeout) <timeouts>` tuple.
        :type timeout: float or tuple
        :param verify: (optional) Either a boolean, in which case it controls whether we verify
            the server's TLS certificate, or a string, in which case it must be a path
            to a CA bundle to use
        :param cert: (optional) Any user-provided SSL certificate to be trusted.
        :param proxies: (optional) The proxies dictionary to apply to the request.
        N)�NotImplementedError)r,�request�stream�timeout�verify�cert�proxiesr&r&r'�send:szBaseAdapter.sendcCst�dS)z!Cleans up adapter specific items.N)r.)r,r&r&r'�closeLszBaseAdapter.close)FNTNN)�__name__�
__module__�__qualname__�__doc__r+r5r6�
__classcell__r&r&)r-r'r)4s

r)cs�eZdZdZdddddgZeeeef�fdd�	Zd	d
�Z	dd�Z
efd
d�Zdd�Zdd�Z
dd�Zd$dd�Zdd�Zdd�Zdd�Zdd�Zd%d"d#�Z�ZS)&�HTTPAdaptera�The built-in HTTP Adapter for urllib3.

    Provides a general-case interface for Requests sessions to contact HTTP and
    HTTPS urls by implementing the Transport Adapter interface. This class will
    usually be created by the :class:`Session <Session>` class under the
    covers.

    :param pool_connections: The number of urllib3 connection pools to cache.
    :param pool_maxsize: The maximum number of connections to save in the pool.
    :param max_retries: The maximum number of retries each connection
        should attempt. Note, this applies only to failed DNS lookups, socket
        connections and connection timeouts, never to requests where data has
        made it to the server. By default, Requests does not retry failed
        connections. If you need granular control over the conditions under
        which we retry a request, import urllib3's ``Retry`` class and pass
        that instead.
    :param pool_block: Whether the connection pool should block for connections.

    Usage::

      >>> import requests
      >>> s = requests.Session()
      >>> a = requests.adapters.HTTPAdapter(max_retries=3)
      >>> s.mount('http://', a)
    �max_retries�config�_pool_connections�
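Beyond the usage shown in the docstring, the pool and retry knobs above are the usual reason to mount a custom adapter. A hedged sketch (the parameter values are illustrative only)::

    import requests
    from requests.adapters import HTTPAdapter

    session = requests.Session()

    # Cache more connection pools and allow a few retries for failed DNS
    # lookups and connection timeouts, as described in the class docstring.
    adapter = HTTPAdapter(pool_connections=20, pool_maxsize=50, max_retries=3)
    session.mount('http://', adapter)
    session.mount('https://', adapter)

    response = session.get('http://httpbin.org/get', timeout=5)
    print(response.status_code)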
_pool_maxsize�_pool_blockcsd|tkrtddd�|_ntj|�|_i|_i|_tt|�j�||_	||_
||_|j|||d�dS)NrF)�read)�block)
�DEFAULT_RETRIESrr=Zfrom_intr>�
proxy_managerr*r<r+r?r@rA�init_poolmanager)r,Zpool_connectionsZpool_maxsizer=Z
pool_block)r-r&r'r+nszHTTPAdapter.__init__cst�fdd��jD��S)Nc3s|]}|t�|d�fVqdS)N)�getattr)�.0�attr)r,r&r'�	<genexpr>�sz+HTTPAdapter.__getstate__.<locals>.<genexpr>)�dict�	__attrs__)r,r&)r,r'�__getstate__�szHTTPAdapter.__getstate__cCsHi|_i|_x |j�D]\}}t|||�qW|j|j|j|jd�dS)N)rC)rEr>�items�setattrrFr?r@rA)r,�staterI�valuer&r&r'�__setstate__�szHTTPAdapter.__setstate__cKs0||_||_||_tf|||dd�|��|_dS)aInitializes a urllib3 PoolManager.

        This method should not be called from user code, and is only
        exposed for use when subclassing the
        :class:`HTTPAdapter <requests.adapters.HTTPAdapter>`.

        :param connections: The number of urllib3 connection pools to cache.
        :param maxsize: The maximum number of connections to save in the pool.
        :param block: Block when no free connections are available.
        :param pool_kwargs: Extra keyword arguments used to initialize the Pool Manager.
        T)�	num_pools�maxsizerC�strictN)r?r@rAr�poolmanager)r,ZconnectionsrTrCZpool_kwargsr&r&r'rF�s

zHTTPAdapter.init_poolmanagercKs�||jkr|j|}n||j�jd�r^t|�\}}t|f|||j|j|jd�|��}|j|<n4|j|�}t	|f||j|j|jd�|��}|j|<|S)a�Return urllib3 ProxyManager for the given proxy.

        This method should not be called from user code, and is only
        exposed for use when subclassing the
        :class:`HTTPAdapter <requests.adapters.HTTPAdapter>`.

        :param proxy: The proxy to return a urllib3 ProxyManager for.
        :param proxy_kwargs: Extra keyword arguments used to configure the Proxy Manager.
        :returns: ProxyManager
        :rtype: urllib3.ProxyManager
        �socks)�username�passwordrSrTrC)�
proxy_headersrSrTrC)
rE�lower�
startswithrr#r?r@rArZr)r,�proxyZproxy_kwargsZmanagerrXrYrZr&r&r'�proxy_manager_for�s*

zHTTPAdapter.proxy_manager_forcCs|j�jd�rn|rnd}|dk	r"|}|s*t}|s>tjj|�rLtdj|���d|_tjj	|�sf||_
q�||_nd|_d|_
d|_|�rt|t
�s�|d|_|d|_n||_d|_|jr�tjj|j�r�td	j|j���|jo�tjj|j��rtd
j|j���dS)aAVerify a SSL certificate. This method should not be called from user
        code, and is only exposed for use when subclassing the
        :class:`HTTPAdapter <requests.adapters.HTTPAdapter>`.

        :param conn: The urllib3 connection object associated with the cert.
        :param url: The requested URL.
        :param verify: Either a boolean, in which case it controls whether we verify
            the server's TLS certificate, or a string, in which case it must be a path
            to a CA bundle to use
        :param cert: The SSL certificate to verify.
        �httpsNTzFCould not find a suitable TLS CA certificate bundle, invalid path: {0}Z
CERT_REQUIREDZ	CERT_NONErrz:Could not find the TLS certificate file, invalid path: {0}z2Could not find the TLS key file, invalid path: {0})r[r\r�os�path�exists�IOError�formatZ	cert_reqs�isdirZca_certsZca_cert_dir�
isinstancerZ	cert_fileZkey_file)r,�conn�urlr2r3Zcert_locr&r&r'�cert_verify�s8


zHTTPAdapter.cert_verifycCs�t�}t|dd�|_tt|di��|_t|j�|_||_|jj|_t	|j
t�r^|j
jd�|_
n|j
|_
t
|j||�||_||_|S)a�Builds a :class:`Response <requests.Response>` object from a urllib3
        response. This should not be called from user code, and is only exposed
        for use when subclassing the
        :class:`HTTPAdapter <requests.adapters.HTTPAdapter>`

        :param req: The :class:`PreparedRequest <PreparedRequest>` used to generate the response.
        :param resp: The urllib3 response object.
        :rtype: requests.Response
        ZstatusN�headerszutf-8)rrGZstatus_coderrjr�encoding�raw�reasonrfrh�bytes�decoder�cookiesr/�
connection)r,Zreq�respZresponser&r&r'�build_response�s

zHTTPAdapter.build_responseNcCsNt||�}|r.t|d�}|j|�}|j|�}nt|�}|j�}|jj|�}|S)a�Returns a urllib3 connection for the given URL. This should not be
        called from user code, and is only exposed for use when subclassing the
        :class:`HTTPAdapter <requests.adapters.HTTPAdapter>`.

        :param url: The URL to connect to.
        :param proxies: (optional) A Requests-style dictionary of proxies used on this request.
        :rtype: urllib3.ConnectionPool
        Zhttp)rrr^Zconnection_from_urlrZgeturlrV)r,rhr4r]rErgZparsedr&r&r'�get_connection"s	


zHTTPAdapter.get_connectioncCs*|jj�x|jj�D]}|j�qWdS)z�Disposes of any internal state.

        Currently, this closes the PoolManager and any active ProxyManager,
        which closes any pooled connections.
        N)rV�clearrE�values)r,r]r&r&r'r69s
zHTTPAdapter.closec	Csbt|j|�}t|j�j}|o"|dk}d}|rDt|�jj�}|jd�}|j}|r^|r^t|j�}|S)a?Obtain the url to use when making the final request.

        If the message is being sent through a HTTP proxy, the full URL has to
        be used. Otherwise, we should only use the path portion of the URL.

        This should not be called from user code, and is only exposed for use
        when subclassing the
        :class:`HTTPAdapter <requests.adapters.HTTPAdapter>`.

        :param request: The :class:`PreparedRequest <PreparedRequest>` being sent.
        :param proxies: A dictionary of schemes or schemes and hosts to proxy URLs.
        :rtype: str
        r_FrW)rrhr�schemer[r\Zpath_urlr)	r,r/r4r]rwZis_proxied_http_requestZusing_socks_proxyZproxy_schemerhr&r&r'�request_urlCs


zHTTPAdapter.request_urlcKsdS)a"Add any headers needed by the connection. As of v2.0 this does
        nothing by default, but is left for overriding by users that subclass
        the :class:`HTTPAdapter <requests.adapters.HTTPAdapter>`.

        This should not be called from user code, and is only exposed for use
        when subclassing the
        :class:`HTTPAdapter <requests.adapters.HTTPAdapter>`.

        :param request: The :class:`PreparedRequest <PreparedRequest>` to add headers to.
        :param kwargs: The keyword arguments from the call to send().
        Nr&)r,r/r%r&r&r'�add_headers`szHTTPAdapter.add_headerscCs&i}t|�\}}|r"t||�|d<|S)a
Returns a dictionary of the headers to add to any request sent
        through a proxy. This works with urllib3 magic to ensure that they are
        correctly sent to the proxy, rather than in a tunnelled request if
        CONNECT is being used.

        This should not be called from user code, and is only exposed for use
        when subclassing the
        :class:`HTTPAdapter <requests.adapters.HTTPAdapter>`.

        :param proxies: The url of the proxy being used for this request.
        :rtype: dict
        zProxy-Authorization)rr")r,r]rjrXrYr&r&r'rZns

zHTTPAdapter.proxy_headersFTc)Cs�|j|j|�}|j||j||�|j||�}|j|�|jdkpHd|jk}	t|t�r�y|\}
}t	|
|d�}Wq�t
k
r�}zdj|�}
t
|
��WYdd}~Xq�Xnt|t	�r�nt	||d�}�yL|	s�|j|j
||j|jdddd|j|d�
}�nt|d��r|j}|jtd�}y�|j|j
|d	d
�x$|jj�D]\}}|j||��q.W|j�xN|jD]D}|jtt|��dd�jd��|jd
�|j|�|jd
��qXW|jd�y|jd	d�}Wntk
�r�|j�}YnXtj|||ddd�}Wn|j��YnXW�n�t t!j"fk
�rD}
zt#|
|d��WYdd}
~
X�nZt$k
�r�}z�t|j%t&��r~t|j%t'��s~t(||d��t|j%t)��r�t*||d��t|j%t+��r�t,||d��t|j%t-��r�t.||d��t#||d��WYdd}~Xn�t/k
�r}zt#||d��WYdd}~Xn�t+k
�r@}zt,|��WYdd}~Xn^t-t0fk
�r�}z<t|t-��rpt.||d��nt|t1��r�t2||d��n�WYdd}~XnX|j3||�S)aSends PreparedRequest object. Returns Response object.

        :param request: The :class:`PreparedRequest <PreparedRequest>` being sent.
        :param stream: (optional) Whether to stream the request content.
        :param timeout: (optional) How long to wait for the server to send
            data before giving up, as a float, or a :ref:`(connect timeout,
            read timeout) <timeouts>` tuple.
        :type timeout: float or tuple or urllib3 Timeout object
        :param verify: (optional) Either a boolean, in which case it controls whether
            we verify the server's TLS certificate, or a string, in which case it
            must be a path to a CA bundle to use
        :param cert: (optional) Any user-provided SSL certificate to be trusted.
        :param proxies: (optional) The proxies dictionary to apply to the request.
        :rtype: requests.Response
        NzContent-Length)�connectrBzsInvalid timeout {0}. Pass a (connect, read) timeout tuple, or a single float to set both timeouts to the same valueF)
�methodrh�bodyrjZredirectZassert_same_host�preload_content�decode_contentZretriesr1�
proxy_pool)r1T)Zskip_accept_encoding�zutf-8s
s0

)�	buffering)Zpoolrqr}r~)r/)4rtrhrirxryr|rjrf�tuple�TimeoutSauce�
ValueErrorrdZurlopenr{r=�hasattrrZ	_get_conn�DEFAULT_POOL_TIMEOUTZ
putrequestrNZ	putheaderZ
endheadersr5�hex�len�encodeZgetresponse�	TypeErrorrZfrom_httplibr6r
�socket�errorrr
rmrrrrr �_ProxyErrorr�	_SSLErrorrr�
_HTTPErrorrrrs)r,r/r0r1r2r3r4rgrhZchunkedrzrB�e�errrrZlow_conn�headerrQ�i�rr&r&r'r5�s�


 


zHTTPAdapter.send)N)FNTNN)r7r8r9r:rL�DEFAULT_POOLSIZErD�DEFAULT_POOLBLOCKr+rMrRrFr^rirsrtr6rxryrZr5r;r&r&)r-r'r<Qs$%4%

r<)>r:Zos.pathr`r�Zpip._vendor.urllib3.poolmanagerrrZpip._vendor.urllib3.responserZpip._vendor.urllib3.utilrr�Zpip._vendor.urllib3.util.retryrZpip._vendor.urllib3.exceptionsrrr	r�r
rrr�r
rrr�rZmodelsr�compatrrZutilsrrrrrrZ
structuresrrpr�
exceptionsrrrr r!Zauthr"Z!pip._vendor.urllib3.contrib.socksr#�ImportErrorr�r�rDr��objectr)r<r&r&r&r'�<module>	sB $site-packages/pip/_vendor/requests/__pycache__/help.cpython-36.opt-1.pyc000064400000005067147511334610022125 0ustar003

���eS�@s�dZddlmZddlZddlZddlZddlZddlmZddlm	Z	ddlm
Z
ddlmZ
ydd	lmZWn ek
r�dZdZdZYnXddlZddlZd
d�Zdd
�Zdd�Zedkr�e�dS)z'Module containing bug report helper(s).�)�print_functionN)�idna)�urllib3)�chardet�)�__version__)�	pyopensslcCs�tj�}|dkrtj�}nj|dkr\dtjjtjjtjjf}tjjdkr�dj	|tjjg�}n(|dkrntj�}n|dkr�tj�}nd}||d	�S)
a�Return a dict with the Python implementation and version.

    Provide both the name and the version of the Python implementation
    currently running. For example, on CPython 2.7.5 it will return
    {'name': 'CPython', 'version': '2.7.5'}.

    This function works best on CPython and PyPy: in particular, it probably
    doesn't work for Jython or IronPython. Future investigation should be done
    to work out the correct shape of the code for those platforms.
    ZCPythonZPyPyz%s.%s.%s�final�ZJythonZ
IronPython�Unknown)�name�version)
�platformZpython_implementationZpython_version�sysZpypy_version_info�major�minor�micro�releaselevel�join)�implementationZimplementation_version�r�/usr/lib/python3.6/help.py�_implementations 


rc	Cs�ytj�tj�d�}Wntk
r4ddd�}YnXt�}dtji}dtji}ddd�}trrtjdtj	j
d�}dttdd�i}dtt
dd�i}ttd	d�}d|dk	r�d|ndi}|||tdk	|||||dtid
�
S)z&Generate information for a bug report.)�system�releaserr
Nr
)r
Zopenssl_versionz%xr�OPENSSL_VERSION_NUMBER)
rr�
system_sslZusing_pyopensslZ	pyOpenSSLrr�cryptographyrZrequests)rrr�IOErrorrrrr�OpenSSLZSSLr�getattrrr�sslr�requests_version)	Z
platform_infoZimplementation_infoZurllib3_infoZchardet_infoZpyopenssl_infoZcryptography_infoZ	idna_inforZsystem_ssl_inforrr�info;s8

r#cCsttjt�ddd��dS)z)Pretty-print the bug information as JSON.T�)Z	sort_keys�indentN)�print�json�dumpsr#rrrr�mainrsr)�__main__)�__doc__Z
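The ``info()`` and ``main()`` helpers above exist so bug reports can include the exact dependency versions. A minimal sketch of calling them from user code (the module is importable as ``requests.help`` in the standalone distribution)::

    import json
    from requests import help as requests_help

    report = requests_help.info()   # dict of platform, urllib3, chardet, ssl details
    print(json.dumps(report, sort_keys=True, indent=2))

    # main() prints the same report; it is what runs when the module is
    # executed directly.
    requests_help.main()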
__future__rr'rrr!Zpip._vendorrrrr
rr"Zpackages.urllib3.contribr�ImportErrorrrrr#r)�__name__rrrr�<module>s,
!7site-packages/pip/_vendor/requests/__pycache__/exceptions.cpython-36.opt-1.pyc000064400000012174147511334620023354 0ustar003

���e+�@s�dZddlmZGdd�de�ZGdd�de�ZGdd�de�ZGd	d
�d
e�ZGdd�de�ZGd
d�de�Z	Gdd�dee	�Z
Gdd�de	�ZGdd�de�ZGdd�de�Z
Gdd�dee�ZGdd�dee�ZGdd�dee�ZGdd�dee�ZGdd �d e�ZGd!d"�d"ee�ZGd#d$�d$ee�ZGd%d&�d&e�ZGd'd(�d(e�ZGd)d*�d*e�ZGd+d,�d,ee�ZGd-d.�d.e�Zd/S)0z`
requests.exceptions
~~~~~~~~~~~~~~~~~~~

This module contains the set of Requests' exceptions.
�)�	HTTPErrorcs eZdZdZ�fdd�Z�ZS)�RequestExceptionzTThere was an ambiguous exception that occurred while handling your
    request.
    csZ|jdd�}||_|jdd�|_|dk	rD|jrDt|d�rD|jj|_tt|�j||�dS)zBInitialize RequestException with `request` and `response` objects.�responseN�request)�poprr�hasattr�superr�__init__)�self�args�kwargsr)�	__class__�� /usr/lib/python3.6/exceptions.pyr	s

zRequestException.__init__)�__name__�
__module__�__qualname__�__doc__r	�
__classcell__rr)r
rrsrc@seZdZdZdS)rzAn HTTP error occurred.N)rrrrrrrrrsrc@seZdZdZdS)�ConnectionErrorzA Connection error occurred.N)rrrrrrrrr src@seZdZdZdS)�
ProxyErrorzA proxy error occurred.N)rrrrrrrrr$src@seZdZdZdS)�SSLErrorzAn SSL error occurred.N)rrrrrrrrr(src@seZdZdZdS)�Timeoutz�The request timed out.

    Catching this error will catch both
    :exc:`~requests.exceptions.ConnectTimeout` and
    :exc:`~requests.exceptions.ReadTimeout` errors.
    N)rrrrrrrrr,src@seZdZdZdS)�ConnectTimeoutz�The request timed out while trying to connect to the remote server.

    Requests that produced this error are safe to retry.
    N)rrrrrrrrr5src@seZdZdZdS)�ReadTimeoutz@The server did not send any data in the allotted amount of time.N)rrrrrrrrr<src@seZdZdZdS)�URLRequiredz*A valid URL is required to make a request.N)rrrrrrrrr@src@seZdZdZdS)�TooManyRedirectszToo many redirects.N)rrrrrrrrrDsrc@seZdZdZdS)�
MissingSchemaz/The URL schema (e.g. http or https) is missing.N)rrrrrrrrrHsrc@seZdZdZdS)�
InvalidSchemaz"See defaults.py for valid schemas.N)rrrrrrrrrLsrc@seZdZdZdS)�
InvalidURLz%The URL provided was somehow invalid.N)rrrrrrrrrPsrc@seZdZdZdS)�
InvalidHeaderz.The header value provided was somehow invalid.N)rrrrrrrrr Tsr c@seZdZdZdS)�ChunkedEncodingErrorz?The server declared chunked encoding but sent an invalid chunk.N)rrrrrrrrr!Xsr!c@seZdZdZdS)�ContentDecodingErrorz!Failed to decode response contentN)rrrrrrrrr"\sr"c@seZdZdZdS)�StreamConsumedErrorz2The content for this response was already consumedN)rrrrrrrrr#`sr#c@seZdZdZdS)�
RetryErrorzCustom retries logic failedN)rrrrrrrrr$dsr$c@seZdZdZdS)�UnrewindableBodyErrorz:Requests encountered an error when trying to rewind a bodyN)rrrrrrrrr%hsr%c@seZdZdZdS)�RequestsWarningzBase warning for Requests.N)rrrrrrrrr&nsr&c@seZdZdZdS)�FileModeWarningzJA file was opened in text mode, but Requests determined its binary length.N)rrrrrrrrr'ssr'c@seZdZdZdS)�RequestsDependencyWarningz@An imported dependency doesn't match the expected version range.N)rrrrrrrrr(xsr(N)rZpip._vendor.urllib3.exceptionsrZ
BaseHTTPError�IOErrorrrrrrrrrr�
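Because every class above ultimately derives from ``RequestException``, callers can catch narrowly or broadly. A hedged sketch of the usual pattern (the URL is illustrative)::

    import requests
    from requests.exceptions import ConnectTimeout, HTTPError, RequestException

    try:
        r = requests.get('http://httpbin.org/status/503', timeout=2)
        r.raise_for_status()               # converts 4xx/5xx into HTTPError
    except ConnectTimeout:
        print('could not connect in time; safe to retry')
    except HTTPError as exc:
        print('server answered with an error status:', exc.response.status_code)
    except RequestException as exc:
        print('any other requests failure:', exc)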
ValueErrorrrrr r!r"�	TypeErrorr#r$r%�Warningr&�DeprecationWarningr'r(rrrr�<module>s.	site-packages/pip/_vendor/requests/__pycache__/structures.cpython-36.opt-1.pyc000064400000010346147511334620023415 0ustar003

���e��@s>dZddlZddlmZGdd�dej�ZGdd�de�ZdS)	zO
requests.structures
~~~~~~~~~~~~~~~~~~~

Data structures that power Requests.
�N�)�OrderedDictc@sbeZdZdZddd�Zdd�Zdd�Zd	d
�Zdd�Zd
d�Z	dd�Z
dd�Zdd�Zdd�Z
dS)�CaseInsensitiveDicta�A case-insensitive ``dict``-like object.

    Implements all methods and operations of
    ``collections.MutableMapping`` as well as dict's ``copy``. Also
    provides ``lower_items``.

    All keys are expected to be strings. The structure remembers the
    case of the last key to be set, and ``iter(instance)``,
    ``keys()``, ``items()``, ``iterkeys()``, and ``iteritems()``
    will contain case-sensitive keys. However, querying and contains
    testing is case insensitive::

        cid = CaseInsensitiveDict()
        cid['Accept'] = 'application/json'
        cid['aCCEPT'] == 'application/json'  # True
        list(cid) == ['Accept']  # True

    For example, ``headers['content-encoding']`` will return the
    value of a ``'Content-Encoding'`` response header, regardless
    of how the header name was originally stored.

    If the constructor, ``.update``, or equality comparison
    operations are given keys that have equal ``.lower()``s, the
    behavior is undefined.
    NcKs&t�|_|dkri}|j|f|�dS)N)r�_store�update)�self�data�kwargs�r
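The behaviour described above is easiest to see in isolation; a response's ``headers`` attribute is one of these objects. A short sketch (importing the class from its public location in the standalone package)::

    from requests.structures import CaseInsensitiveDict

    headers = CaseInsensitiveDict()
    headers['Content-Encoding'] = 'gzip'

    assert headers['content-encoding'] == 'gzip'   # lookup ignores case
    assert 'CONTENT-ENCODING' in headers
    assert list(headers) == ['Content-Encoding']   # last-set casing is kept

    # lower_items() yields lowercase keys, handy for comparisons.
    assert dict(headers.lower_items()) == {'content-encoding': 'gzip'}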
� /usr/lib/python3.6/structures.py�__init__*szCaseInsensitiveDict.__init__cCs||f|j|j�<dS)N)r�lower)r�key�valuer
r
r�__setitem__0szCaseInsensitiveDict.__setitem__cCs|j|j�dS)Nr)rr
)rrr
r
r�__getitem__5szCaseInsensitiveDict.__getitem__cCs|j|j�=dS)N)rr
)rrr
r
r�__delitem__8szCaseInsensitiveDict.__delitem__cCsdd�|jj�D�S)Ncss|]\}}|VqdS)Nr
)�.0ZcasedkeyZmappedvaluer
r
r�	<genexpr><sz/CaseInsensitiveDict.__iter__.<locals>.<genexpr>)r�values)rr
r
r�__iter__;szCaseInsensitiveDict.__iter__cCs
t|j�S)N)�lenr)rr
r
r�__len__>szCaseInsensitiveDict.__len__cCsdd�|jj�D�S)z.Like iteritems(), but with all lowercase keys.css|]\}}||dfVqdS)rNr
)rZlowerkeyZkeyvalr
r
rrDsz2CaseInsensitiveDict.lower_items.<locals>.<genexpr>)r�items)rr
r
r�lower_itemsAszCaseInsensitiveDict.lower_itemscCs2t|tj�rt|�}ntSt|j��t|j��kS)N)�
isinstance�collections�Mappingr�NotImplemented�dictr)r�otherr
r
r�__eq__Is
zCaseInsensitiveDict.__eq__cCst|jj��S)N)rrr)rr
r
r�copyRszCaseInsensitiveDict.copycCstt|j���S)N)�strrr)rr
r
r�__repr__UszCaseInsensitiveDict.__repr__)N)�__name__�
__module__�__qualname__�__doc__rrrrrrrr!r"r$r
r
r
rrs
	rcs<eZdZdZd�fdd�	Zdd�Zdd�Zdd	d
�Z�ZS)
�
LookupDictzDictionary lookup object.Ncs||_tt|�j�dS)N)�name�superr)r)rr*)�	__class__r
rr\szLookupDict.__init__cCs
d|jS)Nz
<lookup '%s'>)r*)rr
r
rr$`szLookupDict.__repr__cCs|jj|d�S)N)�__dict__�get)rrr
r
rrcszLookupDict.__getitem__cCs|jj||�S)N)r-r.)rr�defaultr
r
rr.hszLookupDict.get)N)N)	r%r&r'r(rr$rr.�
__classcell__r
r
)r,rr)Ys
r))r(r�compatr�MutableMappingrrr)r
r
r
r�<module>sJsite-packages/pip/_vendor/requests/__pycache__/utils.cpython-36.opt-1.pyc000064400000050320147511334620022326 0ustar003

���e/l�@s�dZddlZddlZddlZddlZddlZddlZddlZddlZddl	Z	ddl
Z
ddlZddlmZddl
mZddlmZddlmZddlmZmZmZmZmZmZmZmZmZmZmZmZm Z m!Z!dd	l"m#Z#dd
l$m%Z%ddl&m'Z'm(Z(m)Z)m*Z*dfZ+ej,�Z-ddd�Z.ej/�dk�r0dd�Z0dd�Zdd�Z1dd�Z2dgdd�Z3dd�Z4dd �Z5d!d"�Z6d#d$�Z7d%d&�Z8dhd'd(�Z9d)d*�Z:d+d,�Z;d-d.�Z<d/d0�Z=d1d2�Z>d3d4�Z?d5d6�Z@eAdi�ZBd9d:�ZCd;d<�ZDd=d>�ZEd?d@�ZFdAdB�ZGdCdD�ZHejIdEdF��ZJdGdH�ZKdjdIdJ�ZLdKdL�ZMdkdNdO�ZNdPdQ�ZOdRdS�ZPdTjQdU�ZReRdVZSeRdWZTdXdY�ZUdZd[�ZVd\d]�ZWejXd^�ZYejXd_�ZZd`da�Z[dbdc�Z\ddde�Z]dS)lz�
requests.utils
~~~~~~~~~~~~~~

This module provides utility functions that are used within Requests
that are also useful for external consumption.
�N�)�__version__)�certs)�to_native_string)�parse_http_list)�quote�urlparse�bytes�str�OrderedDict�unquote�
getproxies�proxy_bypass�
urlunparse�
basestring�
integer_types�is_py3�proxy_bypass_environment�getproxies_environment)�cookiejar_from_dict)�CaseInsensitiveDict)�
InvalidURL�
InvalidHeader�FileModeWarning�UnrewindableBodyError�.netrc�_netrc�Pi�)ZhttpZhttpsZWindowsc
Cs�trddl}nddl}y2|j|jd�}|j|d�d}|j|d�d}Wntk
r\dSX|sj|rndS|jd�}xX|D]P}|dkr�d|kr�d	S|jdd
�}|jdd�}|jd
d�}t	j
||t	j�r~d	Sq~WdS)Nrz;Software\Microsoft\Windows\CurrentVersion\Internet SettingsZProxyEnableZ
ProxyOverrideF�;z<local>�.Tz\.�*z.*�?)r�winreg�_winreg�OpenKey�HKEY_CURRENT_USERZQueryValueEx�OSError�split�replace�re�match�I)�hostr"ZinternetSettingsZproxyEnableZ
proxyOverrideZtest�r-�/usr/lib/python3.6/utils.py�proxy_bypass_registry.s2



r/cCst�rt|�St|�SdS)z�Return True, if the host should be bypassed.

        Checks proxy settings gathered from the environment, if specified,
        or the registry.
        N)rrr/)r,r-r-r.rOsrcCst|d�r|j�}|S)z/Returns an internal sequence dictionary update.�items)�hasattrr0)�dr-r-r.�dict_to_sequence[s
r3cCs2d}d}t|d�rt|�}nbt|d�r.|j}nPt|d�r~y|j�}Wntjk
rZYn$Xtj|�j}d|jkr~t	j
dt�t|d��ry|j�}Wn$t
tfk
r�|dk	r�|}Yn\Xt|d�o�|dk�ry&|jdd	�|j�}|j|p�d�Wnt
tfk
�rd}YnX|dk�r$d}td||�S)
Nr�__len__�len�fileno�ba%Requests has determined the content-length for this request using the binary size of the file: however, the file has been opened in text mode (i.e. without the 'b' flag in the mode). This may lead to an incorrect content-length. In Requests 3.0, support will be removed for files in text mode.�tell�seek�)r1r5r6�io�UnsupportedOperation�os�fstat�st_size�mode�warnings�warnrr8r&�IOErrorr9�max)�oZtotal_lengthZcurrent_positionr6r-r-r.�	super_lends@







rFFcCsy�ddlm}m}d}xJtD]B}ytjjdj|��}Wntk
rJdSXtjj|�r|}PqW|dkrndSt	|�}d}t
|t�r�|jd�}|j
j|�d}	y6||�j|	�}
|
r�|
dr�dnd}|
||
dfSWn|tfk
r�|r�YnXWnttfk
�rYnXdS)	z;Returns the Requests tuple auth for a given url from netrc.r)�netrc�NetrcParseErrorNz~/{0}�:�asciirr:)rGrH�NETRC_FILESr=�path�
expanduser�format�KeyError�existsr�
isinstancer
�decode�netlocr'ZauthenticatorsrC�ImportError�AttributeError)�urlZraise_errorsrGrHZ
netrc_path�f�locZriZsplitstrr,rZlogin_ir-r-r.�get_netrc_auth�s8


rYcCsBt|dd�}|r>t|t�r>|ddkr>|ddkr>tjj|�SdS)z0Tries to guess the filename of the given object.�nameNr�<r�>���)�getattrrQrr=rL�basename)�objrZr-r-r.�guess_filename�sracCs.|dkrdSt|ttttf�r&td��t|�S)a�Take an object and test to see if it can be represented as a
    dictionary. Unless it can not be represented as such, return an
    OrderedDict, e.g.,

    ::

        >>> from_key_val_list([('key', 'val')])
        OrderedDict([('key', 'val')])
        >>> from_key_val_list('string')
        ValueError: need more than 1 value to unpack
        >>> from_key_val_list({'key': 'val'})
        OrderedDict([('key', 'val')])

    :rtype: OrderedDict
    Nz+cannot encode objects that are not 2-tuples)rQr
r	�bool�int�
ValueErrorr)�valuer-r-r.�from_key_val_list�s
rfcCsB|dkrdSt|ttttf�r&td��t|tj�r:|j�}t	|�S)a�Take an object and test to see if it can be represented as a
    dictionary. If it can be, return a list of tuples, e.g.,

    ::

        >>> to_key_val_list([('key', 'val')])
        [('key', 'val')]
        >>> to_key_val_list({'key': 'val'})
        [('key', 'val')]
        >>> to_key_val_list('string')
        ValueError: cannot encode objects that are not 2-tuples.

    :rtype: list
    Nz+cannot encode objects that are not 2-tuples)
rQr
r	rbrcrd�collections�Mappingr0�list)rer-r-r.�to_key_val_list�srjcCs\g}xRt|�D]F}|dd�|dd�ko4dknrJt|dd��}|j|�qW|S)aParse lists as described by RFC 2068 Section 2.

    In particular, parse comma-separated lists where the elements of
    the list may include quoted-strings.  A quoted-string could
    contain a comma.  A non-quoted string could have quotes in the
    middle.  Quotes are removed automatically after parsing.

    It basically works like :func:`parse_set_header` just that items
    may appear multiple times and case sensitivity is preserved.

    The return value is a standard :class:`list`:

    >>> parse_list_header('token, "quoted value"')
    ['token', 'quoted value']

    To create a header from the :class:`list` again, use the
    :func:`dump_header` function.

    :param value: a string with a list header.
    :return: :class:`list`
    :rtype: list
    Nr�"r]r])�_parse_list_header�unquote_header_value�append)re�result�itemr-r-r.�parse_list_headers(rqcCs|i}xrt|�D]f}d|kr$d||<q|jdd�\}}|dd�|dd�koVdknrlt|dd��}|||<qW|S)a^Parse lists of key, value pairs as described by RFC 2068 Section 2 and
    convert them into a python dict:

    >>> d = parse_dict_header('foo="is a fish", bar="as well"')
    >>> type(d) is dict
    True
    >>> sorted(d.items())
    [('bar', 'as well'), ('foo', 'is a fish')]

    If there is no value for a key it will be `None`:

    >>> parse_dict_header('key_without_value')
    {'key_without_value': None}

    To create a header from the :class:`dict` again, use the
    :func:`dump_header` function.

    :param value: a string with a dict header.
    :return: :class:`dict`
    :rtype: dict
    �=Nrrkr]r])rlr'rm)rerorprZr-r-r.�parse_dict_header1s(rscCs^|rZ|d|d	kodknrZ|dd
�}|sF|dd�dkrZ|jdd�jdd�S|S)z�Unquotes a header value.  (Reversal of :func:`quote_header_value`).
    This does not use the real unquoting but what browsers are actually
    using for quoting.

    :param value: the header value to unquote.
    :rtype: str
    rrrkNr:z\\�\z\"r]r])r()reZis_filenamer-r-r.rmTs
$rmcCs"i}x|D]}|j||j<q
W|S)z�Returns a key/value dictionary from a CookieJar.

    :param cj: CookieJar object to extract cookies from.
    :rtype: dict
    )rerZ)�cj�cookie_dictZcookier-r-r.�dict_from_cookiejarms
rwcCs
t||�S)z�Returns a CookieJar from a key/value dictionary.

    :param cj: CookieJar to insert cookies into.
    :param cookie_dict: Dict of key/values to insert into CookieJar.
    :rtype: CookieJar
    )r)rurvr-r-r.�add_dict_to_cookiejar|srxcCsTtjdt�tjdtjd�}tjdtjd�}tjd�}|j|�|j|�|j|�S)zlReturns encodings from given content string.

    :param content: bytestring to extract encodings from.
    z�In requests 3.0, get_encodings_from_content will be removed. For more information, please see the discussion on issue #2266. (This warning should only appear once.)z!<meta.*?charset=["\']*(.+?)["\'>])�flagsz+<meta.*?content=["\']*;?charset=(.+?)["\'>]z$^<\?xml.*?encoding=["\']*(.+?)["\'>])rArB�DeprecationWarningr)�compiler+�findall)�contentZ
charset_reZ	pragma_reZxml_rer-r-r.�get_encodings_from_content�s
r~cCsF|jd�}|sdStj|�\}}d|kr6|djd�Sd|krBdSdS)z}Returns encodings from given HTTP Header Dict.

    :param headers: dictionary to extract encoding from.
    :rtype: str
    zcontent-typeN�charsetz'"�textz
ISO-8859-1)�get�cgiZparse_header�strip)�headersZcontent_type�paramsr-r-r.�get_encoding_from_headers�s
r�ccsr|jdkr"x|D]
}|VqWdStj|j�dd�}x |D]}|j|�}|r:|Vq:W|jddd�}|rn|VdS)zStream decodes a iterator.Nr()�errors�T)�final)�encoding�codecs�getincrementaldecoderrR)�iterator�rrp�decoder�chunk�rvr-r-r.�stream_decode_response_unicode�s





r�ccsLd}|dks|dkrt|�}x*|t|�krF||||�V||7}qWdS)z Iterate over slices of a string.rN)r5)�stringZslice_length�posr-r-r.�iter_slices�sr�cCsvtjdt�g}t|j�}|rJyt|j|�Stk
rH|j|�YnXyt|j|dd�St	k
rp|jSXdS)z�Returns the requested content back in unicode.

    :param r: Response object to get unicode content from.

    Tried:

    1. charset from content-type
    2. fall back and replace all unicode characters

    :rtype: str
    z�In requests 3.0, get_unicode_from_response will be removed. For more information, please see the discussion on issue #2266. (This warning should only appear once.)r()r�N)
rArBrzr�r�r
r}�UnicodeErrorrn�	TypeError)r�Ztried_encodingsr�r-r-r.�get_unicode_from_response�s
r�Z4ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyzz0123456789-._~c
Cs�|jd�}x�tdt|��D]�}||dd�}t|�dkr�|j�r�ytt|d��}Wn tk
rttd|��YnX|tkr�|||dd�||<q�d||||<qd||||<qWdj	|�S)	z�Un-escape any percent-escape sequences in a URI that are unreserved
    characters. This leaves all reserved, illegal and non-ASCII bytes encoded.

    :rtype: str
    �%rrr:�z%Invalid percent-escape sequence: '%s'N�)
r'�ranger5�isalnum�chrrcrdr�UNRESERVED_SET�join)�uri�parts�i�h�cr-r-r.�unquote_unreserved�s
r�cCs:d}d}ytt|�|d�Stk
r4t||d�SXdS)z�Re-quote the given URI.

    This function passes the given URI through an unquote/quote cycle to
    ensure that it is fully and consistently quoted.

    :rtype: str
    z!#$%&'()*+,/:;=?@[]~z!#$&'()*+,/:;=?@[]~)ZsafeN)rr�r)r�Zsafe_with_percentZsafe_without_percentr-r-r.�requote_uri
sr�cCsltjdtj|��d}|jd�\}}tjdtjtt|����d}tjdtj|��d|@}||@||@kS)z�This function allows you to check if an IP belongs to a network subnet

    Example: returns True if ip = 192.168.1.1 and net = 192.168.1.0/24
             returns False if ip = 192.168.1.1 and net = 192.168.100.0/24

    :rtype: bool
    z=Lr�/)�struct�unpack�socket�	inet_atonr'�dotted_netmaskrc)�ipZnetZipaddrZnetaddr�bitsZnetmaskZnetworkr-r-r.�address_in_network#s
r�cCs&ddd|>dA}tjtjd|��S)z�Converts mask from /xx format to xxx.xxx.xxx.xxx

    Example: if mask is 24 function returns 255.255.255.0

    :rtype: str
    l��r� z>I)r�Z	inet_ntoar��pack)�maskr�r-r-r.r�2sr�cCs*ytj|�Wntjk
r$dSXdS)z
    :rtype: bool
    FT)r�r��error)Z	string_ipr-r-r.�is_ipv4_address=s
r�cCs�|jd�dkr�yt|jd�d�}Wntk
r8dSX|dksJ|dkrNdSytj|jd�d�Wq�tjk
r|dSXndSdS)zV
    Very simple check of the cidr format in no_proxy variable.

    :rtype: bool
    r�rFr�rT)�countrcr'rdr�r�r�)Zstring_networkr�r-r-r.�
is_valid_cidrHsr�ccsT|dk	}|r"tjj|�}|tj|<z
dVWd|rN|dkrDtj|=n
|tj|<XdS)z�Set the environment variable 'env_name' to 'value'

    Save previous value, yield, and then restore the previous value stored in
    the environment variable 'env_name'.

    If 'value' is None, do nothingN)r=�environr�)Zenv_namereZ
value_changedZ	old_valuer-r-r.�set_environ`s


r�c	Csdd�}|}|dkr|d�}t|�j}|r�dd�|jdd�jd	�D�}|jd
�d}t|�r�xb|D](}t|�r~t||�r�dSqb||krbdSqbWn0x.|D]&}|j|�s�|jd
�dj|�r�dSq�Wtd|��2yt	|�}Wnt
tjfk
r�d
}YnXWdQRX|�rdSd
S)zL
    Returns whether we should bypass proxies or not.

    :rtype: bool
    cSstjj|�ptjj|j��S)N)r=r�r��upper)�kr-r-r.�<lambda>|sz'should_bypass_proxies.<locals>.<lambda>N�no_proxycss|]}|r|VqdS)Nr-)�.0r,r-r-r.�	<genexpr>�sz(should_bypass_proxies.<locals>.<genexpr>� r��,�:rTF)
rrSr(r'r�r�r��endswithr�rr�r�Zgaierror)	rVr�Z	get_proxyZno_proxy_argrSr�Zproxy_ipr,Zbypassr-r-r.�should_bypass_proxiesvs4




r�cCst||d�riSt�SdS)zA
    Return a dict of environment proxies.

    :rtype: dict
    )r�N)r�r
)rVr�r-r-r.�get_environ_proxies�sr�cCsv|pi}t|�}|jdkr.|j|j|jd��S|jd|j|jd|jdg}d}x|D]}||krX||}PqXW|S)z�Select a proxy for the url, if applicable.

    :param url: The url being for the request
    :param proxies: A dictionary of schemes or schemes and hosts to proxy URLs
    N�allz://zall://)rZhostnamer��scheme)rVZproxiesZurlpartsZ
proxy_keys�proxyZ	proxy_keyr-r-r.�select_proxy�s

r��python-requestscCsd|tfS)zO
    Return a string representing the default user agent.

    :rtype: str
    z%s/%s)r)rZr-r-r.�default_user_agent�sr�cCstt�djd�ddd��S)z9
    :rtype: requests.structures.CaseInsensitiveDict
    z, �gzip�deflatez*/*z
keep-alive)z
User-AgentzAccept-EncodingZAcceptZ
Connection)r�r�)rr�r�r-r-r-r.�default_headers�s
r�c	Cs�g}d}x�tjd|�D]�}y|jdd�\}}Wntk
rL|d}}YnXd|jd�i}xP|jd�D]B}y|jd�\}}Wntk
r�PYnX|j|�||j|�<qhW|j|�qW|S)	z�Return a dict of parsed link headers proxies.

    i.e. Link: <http:/.../front.jpeg>; rel=front; type="image/jpeg",<http://.../back.jpeg>; rel=back;type="image/jpeg"

    :rtype: list
    z '"z, *<rrr�rVz<> '"rr)r)r'rdr�rn)	reZlinksZ
replace_chars�valrVr��linkZparam�keyr-r-r.�parse_header_links�s r��rJr:�cCs�|dd�}|tjtjfkr dS|dd�tjkr6dS|dd�tjtjfkrRdS|jt�}|dkrhd	S|dkr�|ddd�tkr�d
S|ddd�tkr�dS|dkr�|dd�t	kr�d
S|dd�t	kr�dSdS)z
    :rtype: str
    N�zutf-32r�z	utf-8-sigr:zutf-16rzutf-8z	utf-16-berz	utf-16-lez	utf-32-bez	utf-32-le)
r��BOM_UTF32_LE�BOM_UTF32_BE�BOM_UTF8�BOM_UTF16_LE�BOM_UTF16_BEr��_null�_null2�_null3)�dataZsampleZ	nullcountr-r-r.�guess_json_utfs*
r�cCs8t||�\}}}}}}|s$||}}t||||||f�S)z�Given a URL that may or may not have a scheme, prepend the given scheme.
    Does not replace a present scheme with the one provided as an argument.

    :rtype: str
    )rr)rVZ
new_schemer�rSrLr��query�fragmentr-r-r.�prepend_scheme_if_needed1s
r�cCsBt|�}yt|j�t|j�f}Wnttfk
r<d}YnX|S)z{Given a url with authentication components, extract them into a tuple of
    username,password.

    :rtype: (str,str)
    r�)r�r�)rrZusernameZpasswordrUr�)rVZparsedZauthr-r-r.�get_auth_from_urlBs
r�s^\S[^\r\n]*$|^$z^\S[^\r\n]*$|^$cCsf|\}}t|t�rt}nt}y|j|�s4td|��Wn*tk
r`td||t|�f��YnXdS)z�Verifies that header value is a string which doesn't contain
    leading whitespace or return characters. This prevents unintended
    header injection.

    :param header: tuple, in the format (name, value).
    z7Invalid return character or leading space in header: %sz>Value for header {%s: %s} must be of type str or bytes, not %sN)rQr	�_CLEAN_HEADER_REGEX_BYTE�_CLEAN_HEADER_REGEX_STRr*rr��type)�headerrZreZpatr-r-r.�check_header_validityWs

r�cCsFt|�\}}}}}}|s"||}}|jdd�d}t|||||df�S)zW
    Given a url remove the fragment and the authentication part.

    :rtype: str
    �@rr�r])r�rsplitr)rVr�rSrLr�r�r�r-r-r.�
urldefragauthls

r�cCs`t|jdd�}|dk	rTt|jt�rTy||j�Wq\ttfk
rPtd��Yq\Xntd��dS)zfMove file pointer back to its recorded starting position
    so it can be read again on redirect.
    r9Nz;An error occurred when rewinding request body for redirect.z+Unable to rewind request body for redirect.)r^ZbodyrQZ_body_positionrrCr&r)Zprepared_requestZ	body_seekr-r-r.�rewind_body}sr�)rr)F)FzBABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789-._~)N)r�)^�__doc__r�r�rg�
contextlibr;r=�platformr)r�r�rArr�rZ_internal_utilsr�compatrrlrrr	r
rrr
rrrrrrrZcookiesrZ
structuresr�
exceptionsrrrrrK�whereZDEFAULT_CA_BUNDLE_PATHZ
DEFAULT_PORTS�systemr/r3rFrYrarfrjrqrsrmrwrxr~r�r�r�r��	frozensetr�r�r�r�r�r�r��contextmanagerr�r�r�r�r�r�r��encoder�r�r�r�r�r�r{r�r�r�r�r�r-r-r-r.�<module>	s�@
!	=
3 #

%9

	"
 

site-packages/pip/_vendor/requests/__pycache__/certs.cpython-36.pyc000064400000001052147511334620021345 0ustar003

���e��@s&dZddlmZedkr"ee��dS)uF
requests.certs
~~~~~~~~~~~~~~

This module returns the preferred default CA certificate bundle. There is
only one — the one from the certifi package.

If you are packaging Requests, e.g., for a Linux distribution or a managed
environment, you can change the definition of where() to return a separately
packaged CA bundle.
�)�where�__main__N)�__doc__Zpip._vendor.certifir�__name__�print�rr�/usr/lib/python3.6/certs.py�<module>ssite-packages/pip/_vendor/requests/__pycache__/hooks.cpython-36.pyc000064400000001626147511334620021357 0ustar003

���e��@sdZdgZdd�Zdd�ZdS)z�
requests.hooks
~~~~~~~~~~~~~~

This module provides the capabilities for the Requests hooks system.

Available hooks:

``response``:
    The response generated from a Request.
ZresponsecCstdd�tD��S)Ncss|]}|gfVqdS)N�)�.0Zeventrr�/usr/lib/python3.6/hooks.py�	<genexpr>sz default_hooks.<locals>.<genexpr>)�dict�HOOKSrrrr�
default_hookssrcKsR|pt�}|j|�}|rNt|d�r(|g}x$|D]}||f|�}|dk	r.|}q.W|S)z6Dispatches a hook dictionary on a given piece of data.�__call__N)r�get�hasattr)�keyZhooksZ	hook_data�kwargs�hookZ
_hook_datarrr�
dispatch_hooks



rN)�__doc__rrrrrrr�<module>
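The only event dispatched above is ``response``, so a hook is just a callable that receives the ``Response`` and may return a replacement. A hedged sketch of wiring one in (the hook function name is made up)::

    import requests

    def log_status(response, *args, **kwargs):
        # Called once per response; returning None keeps the original object.
        print(response.request.method, response.url, response.status_code)

    # Per-request hook...
    requests.get('http://httpbin.org/get', hooks={'response': [log_status]})

    # ...or registered on a Session for every request it sends.
    session = requests.Session()
    session.hooks['response'].append(log_status)
    session.get('http://httpbin.org/get')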
ssite-packages/pip/_vendor/requests/__pycache__/help.cpython-36.pyc000064400000005067147511334620021167 0ustar003

���eS�@s�dZddlmZddlZddlZddlZddlZddlmZddlm	Z	ddlm
Z
ddlmZ
ydd	lmZWn ek
r�dZdZdZYnXddlZddlZd
d�Zdd
�Zdd�Zedkr�e�dS)z'Module containing bug report helper(s).�)�print_functionN)�idna)�urllib3)�chardet�)�__version__)�	pyopensslcCs�tj�}|dkrtj�}nj|dkr\dtjjtjjtjjf}tjjdkr�dj	|tjjg�}n(|dkrntj�}n|dkr�tj�}nd}||d	�S)
a�Return a dict with the Python implementation and version.

    Provide both the name and the version of the Python implementation
    currently running. For example, on CPython 2.7.5 it will return
    {'name': 'CPython', 'version': '2.7.5'}.

    This function works best on CPython and PyPy: in particular, it probably
    doesn't work for Jython or IronPython. Future investigation should be done
    to work out the correct shape of the code for those platforms.
    ZCPythonZPyPyz%s.%s.%s�final�ZJythonZ
IronPython�Unknown)�name�version)
�platformZpython_implementationZpython_version�sysZpypy_version_info�major�minor�micro�releaselevel�join)�implementationZimplementation_version�r�/usr/lib/python3.6/help.py�_implementations 


rc	Cs�ytj�tj�d�}Wntk
r4ddd�}YnXt�}dtji}dtji}ddd�}trrtjdtj	j
d�}dttdd�i}dtt
dd�i}ttd	d�}d|dk	r�d|ndi}|||tdk	|||||dtid
�
S)z&Generate information for a bug report.)�system�releaserr
Nr
)r
Zopenssl_versionz%xr�OPENSSL_VERSION_NUMBER)
rr�
system_sslZusing_pyopensslZ	pyOpenSSLrr�cryptographyrZrequests)rrr�IOErrorrrrr�OpenSSLZSSLr�getattrrr�sslr�requests_version)	Z
platform_infoZimplementation_infoZurllib3_infoZchardet_infoZpyopenssl_infoZcryptography_infoZ	idna_inforZsystem_ssl_inforrr�info;s8

r#cCsttjt�ddd��dS)z)Pretty-print the bug information as JSON.T�)Z	sort_keys�indentN)�print�json�dumpsr#rrrr�mainrsr)�__main__)�__doc__Z
__future__rr'rrr!Zpip._vendorrrrr
rr"Zpackages.urllib3.contribr�ImportErrorrrrr#r)�__name__rrrr�<module>s,
!7site-packages/pip/_vendor/requests/__pycache__/models.cpython-36.opt-1.pyc000064400000056505147511334620022464 0ustar003

���e��@s�dZddlZddlZddlZddlZddlmZddlm	Z	ddl
mZddlm
Z
mZmZmZddlmZdd	lmZdd
lmZddlmZddlmZmZmZdd
lmZmZm Z m!Z!m"Z"m#Z#m$Z$ddl%m&Z&m'Z'ddl(m)Z)m*Z*m+Z+m,Z,m-Z-m.Z.m/Z/m0Z0m1Z1m2Z2ddl3m4Z4m5Z5m6Z6m7Z7m8Z8m9Z9m:Z:m;Z;m<Z<m=Z=ddl3m>Z?ddl@mAZAeAjBeAjCeAjDeAjEeAjFfZGdZHd!ZIdZJGdd�deK�ZLGdd�deK�ZMGdd�deM�ZNGdd�deLeM�ZOGdd �d eK�ZPdS)"z`
requests.models
~~~~~~~~~~~~~~~

This module contains the primary objects that power Requests.
�N)�RequestField)�encode_multipart_formdata)�	parse_url)�DecodeError�ReadTimeoutError�
ProtocolError�LocationParseError)�UnsupportedOperation�)�
default_hooks)�CaseInsensitiveDict)�
HTTPBasicAuth)�cookiejar_from_dict�get_cookie_header�_copy_cookie_jar)�	HTTPError�
MissingSchema�
InvalidURL�ChunkedEncodingError�ContentDecodingError�ConnectionError�StreamConsumedError)�to_native_string�unicode_is_ascii)
�guess_filename�get_auth_from_url�requote_uri�stream_decode_response_unicode�to_key_val_list�parse_header_links�iter_slices�guess_json_utf�	super_len�check_header_validity)
�	cookielib�
urlunparse�urlsplit�	urlencode�str�bytes�is_py2�chardet�builtin_str�
basestring)�json)�codes��
iic@s0eZdZedd��Zedd��Zedd��ZdS)�RequestEncodingMixincCsNg}t|j�}|j}|sd}|j|�|j}|rD|jd�|j|�dj|�S)zBuild the path URL to use.�/�?�)r&�url�path�append�query�join)�selfr6�pr7r9�r=�/usr/lib/python3.6/models.py�path_url=s



zRequestEncodingMixin.path_urlcCs�t|ttf�r|St|d�r |St|d�r�g}x|t|�D]p\}}t|t�sVt|d�r\|g}xJ|D]B}|dk	rb|jt|t�r�|jd�n|t|t�r�|jd�n|f�qbWq8Wt|dd�S|SdS)z�Encode parameters in a piece of data.

        Will successfully encode parameters when passed as a dict or a list of
        2-tuples. Order is retained if data is a list of 2-tuples but arbitrary
        if parameters are supplied as a dict.
        �read�__iter__Nzutf-8T)Zdoseq)	�
isinstancer(r)�hasattrrr-r8�encoder')�data�result�kZvs�vr=r=r>�_encode_paramsRs 	


$z#RequestEncodingMixin._encode_paramscCs�|std��nt|t�r td��g}t|p,i�}t|p8i�}x�|D]�\}}t|t�s`t|d�rf|g}x\|D]T}|dk	rlt|t�s�t|�}|jt|t�r�|jd�n|t|t�r�|j	d�n|f�qlWqBWx�|D]�\}}d}d}	t|t
tf��r.t|�dk�r|\}
}n&t|�dk�r |\}
}}n|\}
}}}	nt
|��p:|}
|}t|tttf��rX|}n|j�}t|||
|	d�}
|
j|d	�|j|
�q�Wt|�\}}||fS)
a�Build the body for a multipart/form-data request.

        Will successfully encode files when passed as a dict or a list of
        tuples. Order is retained if data is a list of tuples but arbitrary
        if parameters are supplied as a dict.
        The tuples may be 2-tuples (filename, fileobj), 3-tuples (filename, fileobj, contentype)
        or 4-tuples (filename, fileobj, contentype, custom_headers).
        zFiles must be provided.zData must not be a string.rANzutf-8��)�namerE�filename�headers)�content_type)�
ValueErrorrBr-rrCr)r(r8�decoderD�tuple�list�lenr�	bytearrayr@rZmake_multipartr)�filesrEZ
new_fieldsZfieldsZfield�valrHrGZftZfh�fn�fpZfdataZrf�bodyrOr=r=r>�
_encode_filesmsH




$
z"RequestEncodingMixin._encode_filesN)�__name__�
__module__�__qualname__�propertyr?�staticmethodrIr[r=r=r=r>r2<sr2c@seZdZdd�Zdd�ZdS)�RequestHooksMixincCs\||jkrtd|��t|tj�r4|j|j|�n$t|d�rX|j|jdd�|D��dS)zProperly register a hook.z1Unsupported event specified, with event name "%s"rAcss|]}t|tj�r|VqdS)N)rB�collections�Callable)�.0�hr=r=r>�	<genexpr>�sz2RequestHooksMixin.register_hook.<locals>.<genexpr>N)�hooksrPrBrbrcr8rC�extend)r;�event�hookr=r=r>�
register_hook�s

zRequestHooksMixin.register_hookcCs.y|j|j|�dStk
r(dSXdS)ziDeregister a previously registered hook.
        Returns True if the hook existed, False if not.
        TFN)rg�removerP)r;rirjr=r=r>�deregister_hook�s
z!RequestHooksMixin.deregister_hookN)r\r]r^rkrmr=r=r=r>ra�srac
@s*eZdZdZd	dd�Zdd�Zdd�ZdS)
�Requesta�A user-created :class:`Request <Request>` object.

    Used to prepare a :class:`PreparedRequest <PreparedRequest>`, which is sent to the server.

    :param method: HTTP method to use.
    :param url: URL to send.
    :param headers: dictionary of headers to send.
    :param files: dictionary of {filename: fileobject} files to multipart upload.
    :param data: the body to attach to the request. If a dictionary is provided, form-encoding will take place.
    :param json: json for the body to attach to the request (if files or data is not specified).
    :param params: dictionary of URL parameters to append to the URL.
    :param auth: Auth handler or (user, pass) tuple.
    :param cookies: dictionary or CookieJar of cookies to attach to this request.
    :param hooks: dictionary of callback hooks, for internal usage.

    Usage::

      >>> import requests
      >>> req = requests.Request('GET', 'http://httpbin.org/get')
      >>> req.prepare()
      <PreparedRequest [GET]>
    Nc
Cs�|dkrgn|}|dkrgn|}|dkr,in|}|dkr<in|}|	dkrLin|	}	t�|_x&t|	j��D]\}}|j||d�qfW||_||_||_||_||_	|
|_
||_||_||_
dS)N)rirj)rrgrS�itemsrk�methodr6rNrVrEr.�params�auth�cookies)
r;rpr6rNrVrErqrrrsrgr.rGrHr=r=r>�__init__�s"zRequest.__init__cCs
d|jS)Nz<Request [%s]>)rp)r;r=r=r>�__repr__�szRequest.__repr__cCs<t�}|j|j|j|j|j|j|j|j|j	|j
|jd�
|S)zXConstructs a :class:`PreparedRequest <PreparedRequest>` for transmission and returns it.)
rpr6rNrVrEr.rqrrrsrg)�PreparedRequest�preparerpr6rNrVrEr.rqrrrsrg)r;r<r=r=r>rw�s
zRequest.prepare)
NNNNNNNNNN)r\r]r^�__doc__rtrurwr=r=r=r>rn�s

rnc
@s�eZdZdZdd�Zddd�Zdd�Zd	d
�Zdd�Ze	d
d��Z
dd�Zdd�Zddd�Z
dd�Zd dd�Zdd�Zdd�ZdS)!rva�The fully mutable :class:`PreparedRequest <PreparedRequest>` object,
    containing the exact bytes that will be sent to the server.

    Generated from either a :class:`Request <Request>` object or manually.

    Usage::

      >>> import requests
      >>> req = requests.Request('GET', 'http://httpbin.org/get')
      >>> r = req.prepare()
      <PreparedRequest [GET]>

      >>> s = requests.Session()
      >>> s.send(r)
      <Response [200]>
    cCs0d|_d|_d|_d|_d|_t�|_d|_dS)N)rpr6rN�_cookiesrZrrg�_body_position)r;r=r=r>rtszPreparedRequest.__init__NcCsR|j|�|j||�|j|�|j|�|j|||
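The prepared-request flow shown in the two docstrings above is useful when headers or the body need to be inspected or adjusted before sending. A hedged sketch (values are illustrative)::

    import requests

    req = requests.Request(
        'POST',
        'http://httpbin.org/post',
        data={'key': 'value'},
        headers={'X-Trace': 'abc123'},
    )
    prepared = req.prepare()

    # The exact bytes that would go on the wire are now visible.
    print(prepared.method, prepared.url)
    print(prepared.headers['Content-Type'])   # application/x-www-form-urlencoded
    print(prepared.body)                      # key=value

    with requests.Session() as session:
        response = session.send(prepared, timeout=10)
        print(response.status_code)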
�|j||�|j|	�dS)z6Prepares the entire request with the given parameters.N)�prepare_method�prepare_url�prepare_headers�prepare_cookies�prepare_body�prepare_auth�
prepare_hooks)r;rpr6rNrVrErqrrrsrgr.r=r=r>rw+s


zPreparedRequest.preparecCs
d|jS)Nz<PreparedRequest [%s]>)rp)r;r=r=r>ru=szPreparedRequest.__repr__cCsXt�}|j|_|j|_|jdk	r*|jj�nd|_t|j�|_|j|_|j|_|j	|_	|S)N)
rvrpr6rN�copyrryrZrgrz)r;r<r=r=r>r�@szPreparedRequest.copycCs$||_|jdk	r t|jj��|_dS)zPrepares the given HTTP method.N)rpr�upper)r;rpr=r=r>r{Ks
zPreparedRequest.prepare_methodcCs@ddl}y|j|dd�jd�}Wn|jk
r:t�YnX|S)NrT)Zuts46zutf-8)�idnarDrQZ	IDNAError�UnicodeError)�hostr�r=r=r>�_get_idna_encoded_hostQs
z&PreparedRequest._get_idna_encoded_hostcCs0t|t�r|jd�}ntr"t|�nt|�}|j�}d|krT|j�jd�rT||_	dSyt
|�\}}}}}}}	Wn,tk
r�}
zt|
j
��WYdd}
~
XnX|s�d}|jt|d��}t|��|s�td|��t|��sy|j|�}Wntk
�rtd��YnXn|jd��rtd��|�p"d	}|�r2|d
7}||7}|�rP|dt|�7}|�sZd}t�r�t|t��rv|jd�}t|t��r�|jd�}t|t��r�|jd�}t|t��r�|jd�}t|	t��r�|	jd�}	t|ttf��r�t|�}|j|�}
|
�r|�r
d
||
f}n|
}tt|||d||	g��}||_	dS)zPrepares the given HTTP URL.�utf8�:ZhttpNzDInvalid URL {0!r}: No schema supplied. Perhaps you meant http://{0}?z Invalid URL %r: No host suppliedzURL has an invalid label.�*r5�@r3zutf-8z%s&%s)rBr)rQr*Zunicoder(�lstrip�lower�
startswithr6rrr�args�formatrrrr�r�rDrIrr%)r;r6rq�schemerrr�Zportr7r9Zfragment�e�errorZnetlocZ
enc_paramsr=r=r>r|[sh








zPreparedRequest.prepare_urlcCs@t�|_|r<x.|j�D]"}t|�|\}}||jt|�<qWdS)z Prepares the given HTTP headers.N)rrNror#r)r;rN�headerrL�valuer=r=r>r}�szPreparedRequest.prepare_headerscCsvd}d}|r8|dk	r8d}tj|�}t|t�s8|jd�}tt|d�t|ttt	t
jf�g�}yt|�}Wnt
ttfk
r�d}YnX|r�|}t|dd�dk	r�y|j�|_Wn ttfk
r�t�|_YnX|r�td��|r�t|�|jd<n
d|jd	<np|�r|j||�\}}n2|�rF|j|�}t|t��s<t|d
��rBd}nd}|j|�|�rld|jk�rl||jd
<||_dS)z"Prepares the given HTTP body data.Nzapplication/jsonzutf-8rA�tellz1Streamed bodies and files are mutually exclusive.zContent-LengthZchunkedzTransfer-Encodingr@z!application/x-www-form-urlencodedzcontent-typezContent-Type)�complexjson�dumpsrBr)rD�allrCr-rSrRrb�Mappingr"�	TypeError�AttributeErrorr	�getattrr�rz�IOError�OSError�object�NotImplementedErrorr,rNr[rI�prepare_content_lengthrZ)r;rErVr.rZrOZ	is_stream�lengthr=r=r>r�sJ






zPreparedRequest.prepare_bodycCsL|dk	r$t|�}|rHt|�|jd<n$|jdkrH|jjd�dkrHd|jd<dS)z>Prepare Content-Length header based on request method and bodyNzContent-Length�GET�HEAD�0)r�r�)r"r,rNrp�get)r;rZr�r=r=r>r�sz&PreparedRequest.prepare_content_lengthr5cCsj|dkr"t|j�}t|�r|nd}|rft|t�rDt|�dkrDt|�}||�}|jj|j�|j	|j
�dS)z"Prepares the given HTTP auth data.NrJ)rr6�anyrBrRrTr
�__dict__�updater�rZ)r;rrr6Zurl_auth�rr=r=r>r�s
zPreparedRequest.prepare_authcCs@t|tj�r||_n
t|�|_t|j|�}|dk	r<||jd<dS)aPrepares the given HTTP cookie data.

        This function eventually generates a ``Cookie`` header from the
        given cookies using cookielib. Due to cookielib's design, the header
        will not be regenerated if it already exists, meaning this function
        can only be called once for the life of the
        :class:`PreparedRequest <PreparedRequest>` object. Any subsequent calls
        to ``prepare_cookies`` will have no actual effect, unless the "Cookie"
        header is removed beforehand.
        NZCookie)rBr$Z	CookieJarryrrrN)r;rsZ
cookie_headerr=r=r>r~$s
zPreparedRequest.prepare_cookiescCs*|pg}x|D]}|j|||�qWdS)zPrepares the given hooks.N)rk)r;rgrir=r=r>r�8s
zPreparedRequest.prepare_hooks)
NNNNNNNNNN)N)r5)r\r]r^rxrtrwrur�r{r`r�r|r}rr�r�r~r�r=r=r=r>rvs

V
E
rvc
@seZdZdZdddddddd	d
dg
Zdd
�Zdd�Zdd�Zdd�Zdd�Z	dd�Z
dd�Zdd�Zdd�Z
edd��Zed d!��Zed"d#��Zed$d%��Zed&d'��Zd;d*d+�Zed,d,fd-d.�Zed/d0��Zed1d2��Zd3d4�Zed5d6��Zd7d8�Zd9d:�Zd,S)<�ResponsezhThe :class:`Response <Response>` object, which contains a
    server's response to an HTTP request.
    �_content�status_coderNr6�history�encoding�reasonrs�elapsed�requestcCs^d|_d|_d|_d|_t�|_d|_d|_d|_g|_	d|_
ti�|_t
jd�|_d|_dS)NFr)r��_content_consumed�_nextr�rrN�rawr6r�r�r�rrs�datetimeZ	timedeltar�r�)r;r=r=r>rtLs
zResponse.__init__cCs|S)Nr=)r;r=r=r>�	__enter__{szResponse.__enter__cGs|j�dS)N)�close)r;r�r=r=r>�__exit__~szResponse.__exit__cs$�js�jt�fdd��jD��S)Nc3s|]}|t�|d�fVqdS)N)r�)rd�attr)r;r=r>rf�sz(Response.__getstate__.<locals>.<genexpr>)r��content�dict�	__attrs__)r;r=)r;r>�__getstate__�s

zResponse.__getstate__cCs>x |j�D]\}}t|||�q
Wt|dd�t|dd�dS)Nr�Tr�)ro�setattr)r;�staterLr�r=r=r>�__setstate__�szResponse.__setstate__cCs
d|jS)Nz<Response [%s]>)r�)r;r=r=r>ru�szResponse.__repr__cCs|jS)akReturns True if :attr:`status_code` is less than 400.

        This attribute checks if the status code of the response is between
        400 and 600 to see if there was a client error or a server error. If
        the status code, is between 200 and 400, this will return True. This
        is **not** a check to see if the response code is ``200 OK``.
        )�ok)r;r=r=r>�__bool__�szResponse.__bool__cCs|jS)akReturns True if :attr:`status_code` is less than 400.

        This attribute checks if the status code of the response is between
        400 and 600 to see if there was a client error or a server error. If
        the status code, is between 200 and 400, this will return True. This
        is **not** a check to see if the response code is ``200 OK``.
        )r�)r;r=r=r>�__nonzero__�szResponse.__nonzero__cCs
|jd�S)z,Allows you to use a response as an iterator.�)�iter_content)r;r=r=r>rA�szResponse.__iter__cCs&y|j�Wntk
r dSXdS)akReturns True if :attr:`status_code` is less than 400.

        This attribute checks if the status code of the response is between
        400 and 600 to see if there was a client error or a server error. If
        the status code, is between 200 and 400, this will return True. This
        is **not** a check to see if the response code is ``200 OK``.
        FT)�raise_for_statusr)r;r=r=r>r��s
	zResponse.okcCsd|jko|jtkS)z�True if this Response is a well-formed HTTP redirect that could have
        been processed automatically (by :meth:`Session.resolve_redirects`).
        �location)rNr��REDIRECT_STATI)r;r=r=r>�is_redirect�szResponse.is_redirectcCsd|jko|jtjtjfkS)z@True if this Response one of the permanent versions of redirect.r�)rNr�r/Zmoved_permanently�permanent_redirect)r;r=r=r>�is_permanent_redirect�szResponse.is_permanent_redirectcCs|jS)zTReturns a PreparedRequest for the next request in a redirect chain, if there is one.)r�)r;r=r=r>�next�sz
Response.nextcCstj|j�dS)z7The apparent encoding, provided by the chardet library.r�)r+Zdetectr�)r;r=r=r>�apparent_encoding�szResponse.apparent_encodingr
Fcs~��fdd�}�jr(t�jt�r(t��n$�dk	rLt�t�rLtdt����t�j��}|�}�jrh|n|}|rzt	|��}|S)a�Iterates over the response data.  When stream=True is set on the
        request, this avoids reading the content at once into memory for
        large responses.  The chunk size is the number of bytes it should
        read into memory.  This is not necessarily the length of each item
        returned as decoding can take place.

        chunk_size must be of type int or None. A value of None will
        function differently depending on the value of `stream`.
        stream=True will read data as it arrives in whatever size the
        chunks are received. If stream=False, data is returned as
        a single chunk.

        If decode_unicode is True, content will be decoded using the best
        available encoding based on the response.
        c3s�t�jd�r�y$x�jj�dd�D]
}|Vq WWq�tk
rZ}zt|��WYdd}~Xq�tk
r�}zt|��WYdd}~Xq�tk
r�}zt|��WYdd}~Xq�Xnx�jj	��}|s�P|Vq�Wd�_
dS)N�streamT)Zdecode_content)rCr�r�rrrrrrr@r�)�chunkr�)�
chunk_sizer;r=r>�generate�s 
z'Response.iter_content.<locals>.generateNz.chunk_size must be an int, it is instead a %s.)
r�rBr��boolr�intr��typer r)r;r��decode_unicoder�Z
reused_chunksZ
stream_chunksZchunksr=)r�r;r>r��s
zResponse.iter_contentNccs�d}x�|j||d�D]r}|dk	r(||}|r8|j|�}n|j�}|rn|drn|rn|dd|dkrn|j�}nd}x|D]
}|VqxWqW|dk	r�|VdS)z�Iterates over the response data, one line at a time.  When
        stream=True is set on the request, this avoids reading the
        content at once into memory for large responses.

        .. note:: This method is not reentrant safe.
        N)r�r�r
���r�r�r�)r��split�
splitlines�pop)r;r�r�Z	delimiter�pendingr��lines�liner=r=r>�
iter_liness$

zResponse.iter_linescCsZ|jdkrN|jrtd��|jdks,|jdkr4d|_nt�j|jt��pJt�|_d|_|jS)z"Content of the response, in bytes.Fz2The content for this response was already consumedrNT)	r�r��RuntimeErrorr�r�r)r:r��CONTENT_CHUNK_SIZE)r;r=r=r>r�*s
zResponse.contentcCshd}|j}|jstd�S|jdkr(|j}yt|j|dd�}Wn&ttfk
rbt|jdd�}YnX|S)a�Content of the response, in unicode.

        If Response.encoding is None, encoding will be guessed using
        ``chardet``.

        The encoding of the response content is determined based solely on HTTP
        headers, following RFC 2616 to the letter. If you can take advantage of
        non-HTTP knowledge to make a better guess at the encoding, you should
        set ``r.encoding`` appropriately before accessing this property.
        Nr5�replace)�errors)r�r�r(r��LookupErrorr�)r;r�r�r=r=r>�text>s
z
Response.textcKsj|jrZ|jrZt|j�dkrZt|j�}|dk	rZytj|jj|�f|�Stk
rXYnXtj|jf|�S)z�Returns the json-encoded content of a response, if any.

        :param \*\*kwargs: Optional arguments that ``json.loads`` takes.
        :raises ValueError: If the response body does not contain valid json.
        rKN)	r�r�rTr!r��loadsrQ�UnicodeDecodeErrorr�)r;�kwargsr�r=r=r>r.ds
z
Response.jsoncCsJ|jjd�}i}|rFt|�}x(|D] }|jd�p8|jd�}|||<q"W|S)z8Returns the parsed header links of the response, if any.�linkZrelr6)rNr�r)r;r��l�linksr��keyr=r=r>r�~s
zResponse.linkscCs�d}t|jt�rDy|jjd�}WqJtk
r@|jjd�}YqJXn|j}d|jko^dknrxd|j||jf}n,d|jko�dknr�d|j||jf}|r�t||d	��d
S)z2Raises stored :class:`HTTPError`, if one occurred.r5zutf-8z
iso-8859-1i�i�z%s Client Error: %s for url: %siXz%s Server Error: %s for url: %s)ZresponseN)rBr�r)rQr�r�r6r)r;Zhttp_error_msgr�r=r=r>r��szResponse.raise_for_statuscCs0|js|jj�t|jdd�}|dk	r,|�dS)z�Releases the connection back to the pool. Once this method has been
        called the underlying ``raw`` object must not be accessed again.

        *Note: Should not normally need to be called explicitly.*
        �release_connN)r�r�r�r�)r;r�r=r=r>r��s

zResponse.close)r
F)r\r]r^rxr�rtr�r�r�r�rur�r�rAr_r�r�r�r�r�r��ITER_CHUNK_SIZEr�r�r�r.r�r�r�r=r=r=r>r�Bs2
/


7&r�i()Qrxrbr��sysZencodings.idnaZ	encodingsZpip._vendor.urllib3.fieldsrZpip._vendor.urllib3.filepostrZpip._vendor.urllib3.utilrZpip._vendor.urllib3.exceptionsrrrr�ior	rgrZ
structuresrrrr
rsrrr�
exceptionsrrrrrrrZ_internal_utilsrrZutilsrrrrrrr r!r"r#�compatr$r%r&r'r(r)r*r+r,r-r.r�Zstatus_codesr/Zmoved�found�otherZtemporary_redirectr�r�ZDEFAULT_REDIRECT_LIMITr�r�r�r2rarnrvr�r=r=r=r>�<module>sD$00nF<site-packages/pip/_vendor/requests/__pycache__/_internal_utils.cpython-36.pyc000064400000002315147511334620023423 0ustar003

���eH�@s.dZddlmZmZmZd	dd�Zdd�ZdS)
z�
requests._internal_utils
~~~~~~~~~~~~~~

Provides utility functions that are consumed internally by Requests
which depend on extremely few external helpers (such as compat)
�)�is_py2�builtin_str�str�asciicCs.t|t�r|}ntr |j|�}n
|j|�}|S)z�Given a string object, regardless of type, returns a representation of
    that string in the native string type, encoding and decoding where
    necessary. This assumes ASCII unless told otherwise.
    )�
isinstancerr�encode�decode)�string�encoding�out�r�%/usr/lib/python3.6/_internal_utils.py�to_native_strings

rcCs6t|t�st�y|jd�dStk
r0dSXdS)z�Determine if unicode string only contains ASCII characters.

    :param str u_string: unicode string to check. Must be unicode
        and not Python 2 `str`.
    :rtype: bool
    rTFN)rr�AssertionErrorr�UnicodeEncodeError)Zu_stringrrr
�unicode_is_asciis
rN)r)�__doc__�compatrrrrrrrrr
�<module>	s
site-packages/pip/_vendor/requests/__pycache__/adapters.cpython-36.pyc000064400000040074147511334620022037 0ustar003

���eR�@s�dZddlZddlZddlmZmZddlmZddl	m
Zddlm
Z
ddlmZddlmZdd	lmZdd
lmZddlmZddlmZdd
lmZddlmZddlmZddlmZddlmZddlmZm Z ddl!m"Z"m#Z#m$Z$m%Z%m&Z&m'Z'ddl(m)Z)ddl*m+Z+ddl,m-Z-m.Z.m/Z/mZmZm0Z0m1Z1ddl2m3Z3yddl4m5Z5Wne6k
�rrdd�Z5YnXdZ7dZ8dZ9dZ:Gdd�de;�Z<Gd d!�d!e<�Z=dS)"z�
requests.adapters
~~~~~~~~~~~~~~~~~

This module contains the transport adapters that Requests uses to define
and maintain connections.
�N)�PoolManager�proxy_from_url)�HTTPResponse)�Timeout)�Retry)�ClosedPoolError)�ConnectTimeoutError)�	HTTPError)�
MaxRetryError)�NewConnectionError)�
ProxyError)�
ProtocolError)�ReadTimeoutError)�SSLError)�
ResponseError�)�Response)�urlparse�
basestring)�DEFAULT_CA_BUNDLE_PATH�get_encoding_from_headers�prepend_scheme_if_needed�get_auth_from_url�
urldefragauth�select_proxy)�CaseInsensitiveDict)�extract_cookies_to_jar)�ConnectionError�ConnectTimeout�ReadTimeoutrr�
RetryError�
InvalidSchema)�_basic_auth_str)�SOCKSProxyManagercOstd��dS)Nz'Missing dependencies for SOCKS support.)r!)�args�kwargs�r&�/usr/lib/python3.6/adapters.pyr#+sr#F�
cs2eZdZdZ�fdd�Zddd�Zd	d
�Z�ZS)�BaseAdapterzThe Base Transport Adaptercstt|�j�dS)N)�superr)�__init__)�self)�	__class__r&r'r+7szBaseAdapter.__init__FNTcCst�dS)aCSends PreparedRequest object. Returns Response object.

        :param request: The :class:`PreparedRequest <PreparedRequest>` being sent.
        :param stream: (optional) Whether to stream the request content.
        :param timeout: (optional) How long to wait for the server to send
            data before giving up, as a float, or a :ref:`(connect timeout,
            read timeout) <timeouts>` tuple.
        :type timeout: float or tuple
        :param verify: (optional) Either a boolean, in which case it controls whether we verify
            the server's TLS certificate, or a string, in which case it must be a path
            to a CA bundle to use
        :param cert: (optional) Any user-provided SSL certificate to be trusted.
        :param proxies: (optional) The proxies dictionary to apply to the request.
        N)�NotImplementedError)r,�request�stream�timeout�verify�cert�proxiesr&r&r'�send:szBaseAdapter.sendcCst�dS)z!Cleans up adapter specific items.N)r.)r,r&r&r'�closeLszBaseAdapter.close)FNTNN)�__name__�
__module__�__qualname__�__doc__r+r5r6�
__classcell__r&r&)r-r'r)4s

r)cs�eZdZdZdddddgZeeeef�fdd�	Zd	d
�Z	dd�Z
efd
d�Zdd�Zdd�Z
dd�Zd$dd�Zdd�Zdd�Zdd�Zdd�Zd%d"d#�Z�ZS)&�HTTPAdaptera�The built-in HTTP Adapter for urllib3.

    Provides a general-case interface for Requests sessions to contact HTTP and
    HTTPS urls by implementing the Transport Adapter interface. This class will
    usually be created by the :class:`Session <Session>` class under the
    covers.

    :param pool_connections: The number of urllib3 connection pools to cache.
    :param pool_maxsize: The maximum number of connections to save in the pool.
    :param max_retries: The maximum number of retries each connection
        should attempt. Note, this applies only to failed DNS lookups, socket
        connections and connection timeouts, never to requests where data has
        made it to the server. By default, Requests does not retry failed
        connections. If you need granular control over the conditions under
        which we retry a request, import urllib3's ``Retry`` class and pass
        that instead.
    :param pool_block: Whether the connection pool should block for connections.

    Usage::

      >>> import requests
      >>> s = requests.Session()
      >>> a = requests.adapters.HTTPAdapter(max_retries=3)
      >>> s.mount('http://', a)
    �max_retries�config�_pool_connections�
_pool_maxsize�_pool_blockcsd|tkrtddd�|_ntj|�|_i|_i|_tt|�j�||_	||_
||_|j|||d�dS)NrF)�read)�block)
�DEFAULT_RETRIESrr=Zfrom_intr>�
proxy_managerr*r<r+r?r@rA�init_poolmanager)r,Zpool_connectionsZpool_maxsizer=Z
pool_block)r-r&r'r+nszHTTPAdapter.__init__cst�fdd��jD��S)Nc3s|]}|t�|d�fVqdS)N)�getattr)�.0�attr)r,r&r'�	<genexpr>�sz+HTTPAdapter.__getstate__.<locals>.<genexpr>)�dict�	__attrs__)r,r&)r,r'�__getstate__�szHTTPAdapter.__getstate__cCsHi|_i|_x |j�D]\}}t|||�qW|j|j|j|jd�dS)N)rC)rEr>�items�setattrrFr?r@rA)r,�staterI�valuer&r&r'�__setstate__�szHTTPAdapter.__setstate__cKs0||_||_||_tf|||dd�|��|_dS)aInitializes a urllib3 PoolManager.

        This method should not be called from user code, and is only
        exposed for use when subclassing the
        :class:`HTTPAdapter <requests.adapters.HTTPAdapter>`.

        :param connections: The number of urllib3 connection pools to cache.
        :param maxsize: The maximum number of connections to save in the pool.
        :param block: Block when no free connections are available.
        :param pool_kwargs: Extra keyword arguments used to initialize the Pool Manager.
        T)�	num_pools�maxsizerC�strictN)r?r@rAr�poolmanager)r,ZconnectionsrTrCZpool_kwargsr&r&r'rF�s

zHTTPAdapter.init_poolmanagercKs�||jkr|j|}n||j�jd�r^t|�\}}t|f|||j|j|jd�|��}|j|<n4|j|�}t	|f||j|j|jd�|��}|j|<|S)a�Return urllib3 ProxyManager for the given proxy.

        This method should not be called from user code, and is only
        exposed for use when subclassing the
        :class:`HTTPAdapter <requests.adapters.HTTPAdapter>`.

        :param proxy: The proxy to return a urllib3 ProxyManager for.
        :param proxy_kwargs: Extra keyword arguments used to configure the Proxy Manager.
        :returns: ProxyManager
        :rtype: urllib3.ProxyManager
        �socks)�username�passwordrSrTrC)�
proxy_headersrSrTrC)
rE�lower�
startswithrr#r?r@rArZr)r,�proxyZproxy_kwargsZmanagerrXrYrZr&r&r'�proxy_manager_for�s*

zHTTPAdapter.proxy_manager_forcCs|j�jd�rn|rnd}|dk	r"|}|s*t}|s>tjj|�rLtdj|���d|_tjj	|�sf||_
q�||_nd|_d|_
d|_|�rt|t
�s�|d|_|d|_n||_d|_|jr�tjj|j�r�td	j|j���|jo�tjj|j��rtd
j|j���dS)aAVerify a SSL certificate. This method should not be called from user
        code, and is only exposed for use when subclassing the
        :class:`HTTPAdapter <requests.adapters.HTTPAdapter>`.

        :param conn: The urllib3 connection object associated with the cert.
        :param url: The requested URL.
        :param verify: Either a boolean, in which case it controls whether we verify
            the server's TLS certificate, or a string, in which case it must be a path
            to a CA bundle to use
        :param cert: The SSL certificate to verify.
        �httpsNTzFCould not find a suitable TLS CA certificate bundle, invalid path: {0}Z
CERT_REQUIREDZ	CERT_NONErrz:Could not find the TLS certificate file, invalid path: {0}z2Could not find the TLS key file, invalid path: {0})r[r\r�os�path�exists�IOError�formatZ	cert_reqs�isdirZca_certsZca_cert_dir�
isinstancerZ	cert_fileZkey_file)r,�conn�urlr2r3Zcert_locr&r&r'�cert_verify�s8


zHTTPAdapter.cert_verifycCs�t�}t|dd�|_tt|di��|_t|j�|_||_|jj|_t	|j
t�r^|j
jd�|_
n|j
|_
t
|j||�||_||_|S)a�Builds a :class:`Response <requests.Response>` object from a urllib3
        response. This should not be called from user code, and is only exposed
        for use when subclassing the
        :class:`HTTPAdapter <requests.adapters.HTTPAdapter>`

        :param req: The :class:`PreparedRequest <PreparedRequest>` used to generate the response.
        :param resp: The urllib3 response object.
        :rtype: requests.Response
        ZstatusN�headerszutf-8)rrGZstatus_coderrjr�encoding�raw�reasonrfrh�bytes�decoder�cookiesr/�
connection)r,Zreq�respZresponser&r&r'�build_response�s

zHTTPAdapter.build_responseNcCsNt||�}|r.t|d�}|j|�}|j|�}nt|�}|j�}|jj|�}|S)a�Returns a urllib3 connection for the given URL. This should not be
        called from user code, and is only exposed for use when subclassing the
        :class:`HTTPAdapter <requests.adapters.HTTPAdapter>`.

        :param url: The URL to connect to.
        :param proxies: (optional) A Requests-style dictionary of proxies used on this request.
        :rtype: urllib3.ConnectionPool
        Zhttp)rrr^Zconnection_from_urlrZgeturlrV)r,rhr4r]rErgZparsedr&r&r'�get_connection"s	


zHTTPAdapter.get_connectioncCs*|jj�x|jj�D]}|j�qWdS)z�Disposes of any internal state.

        Currently, this closes the PoolManager and any active ProxyManager,
        which closes any pooled connections.
        N)rV�clearrE�values)r,r]r&r&r'r69s
zHTTPAdapter.closec	Csbt|j|�}t|j�j}|o"|dk}d}|rDt|�jj�}|jd�}|j}|r^|r^t|j�}|S)a?Obtain the url to use when making the final request.

        If the message is being sent through a HTTP proxy, the full URL has to
        be used. Otherwise, we should only use the path portion of the URL.

        This should not be called from user code, and is only exposed for use
        when subclassing the
        :class:`HTTPAdapter <requests.adapters.HTTPAdapter>`.

        :param request: The :class:`PreparedRequest <PreparedRequest>` being sent.
        :param proxies: A dictionary of schemes or schemes and hosts to proxy URLs.
        :rtype: str
        r_FrW)rrhr�schemer[r\Zpath_urlr)	r,r/r4r]rwZis_proxied_http_requestZusing_socks_proxyZproxy_schemerhr&r&r'�request_urlCs


zHTTPAdapter.request_urlcKsdS)a"Add any headers needed by the connection. As of v2.0 this does
        nothing by default, but is left for overriding by users that subclass
        the :class:`HTTPAdapter <requests.adapters.HTTPAdapter>`.

        This should not be called from user code, and is only exposed for use
        when subclassing the
        :class:`HTTPAdapter <requests.adapters.HTTPAdapter>`.

        :param request: The :class:`PreparedRequest <PreparedRequest>` to add headers to.
        :param kwargs: The keyword arguments from the call to send().
        Nr&)r,r/r%r&r&r'�add_headers`szHTTPAdapter.add_headerscCs&i}t|�\}}|r"t||�|d<|S)a
Returns a dictionary of the headers to add to any request sent
        through a proxy. This works with urllib3 magic to ensure that they are
        correctly sent to the proxy, rather than in a tunnelled request if
        CONNECT is being used.

        This should not be called from user code, and is only exposed for use
        when subclassing the
        :class:`HTTPAdapter <requests.adapters.HTTPAdapter>`.

        :param proxies: The url of the proxy being used for this request.
        :rtype: dict
        zProxy-Authorization)rr")r,r]rjrXrYr&r&r'rZns

zHTTPAdapter.proxy_headersFTc)Cs�|j|j|�}|j||j||�|j||�}|j|�|jdkpHd|jk}	t|t�r�y|\}
}t	|
|d�}Wq�t
k
r�}zdj|�}
t
|
��WYdd}~Xq�Xnt|t	�r�nt	||d�}�yL|	s�|j|j
||j|jdddd|j|d�
}�nt|d��r|j}|jtd�}y�|j|j
|d	d
�x$|jj�D]\}}|j||��q.W|j�xN|jD]D}|jtt|��dd�jd��|jd
�|j|�|jd
��qXW|jd�y|jd	d�}Wntk
�r�|j�}YnXtj|||ddd�}Wn|j��YnXW�n�t t!j"fk
�rD}
zt#|
|d��WYdd}
~
X�nZt$k
�r�}z�t|j%t&��r~t|j%t'��s~t(||d��t|j%t)��r�t*||d��t|j%t+��r�t,||d��t|j%t-��r�t.||d��t#||d��WYdd}~Xn�t/k
�r}zt#||d��WYdd}~Xn�t+k
�r@}zt,|��WYdd}~Xn^t-t0fk
�r�}z<t|t-��rpt.||d��nt|t1��r�t2||d��n�WYdd}~XnX|j3||�S)aSends PreparedRequest object. Returns Response object.

        :param request: The :class:`PreparedRequest <PreparedRequest>` being sent.
        :param stream: (optional) Whether to stream the request content.
        :param timeout: (optional) How long to wait for the server to send
            data before giving up, as a float, or a :ref:`(connect timeout,
            read timeout) <timeouts>` tuple.
        :type timeout: float or tuple or urllib3 Timeout object
        :param verify: (optional) Either a boolean, in which case it controls whether
            we verify the server's TLS certificate, or a string, in which case it
            must be a path to a CA bundle to use
        :param cert: (optional) Any user-provided SSL certificate to be trusted.
        :param proxies: (optional) The proxies dictionary to apply to the request.
        :rtype: requests.Response
        NzContent-Length)�connectrBzsInvalid timeout {0}. Pass a (connect, read) timeout tuple, or a single float to set both timeouts to the same valueF)
�methodrh�bodyrjZredirectZassert_same_host�preload_content�decode_contentZretriesr1�
proxy_pool)r1T)Zskip_accept_encoding�zutf-8s
s0

)�	buffering)Zpoolrqr}r~)r/)4rtrhrirxryr|rjrf�tuple�TimeoutSauce�
ValueErrorrdZurlopenr{r=�hasattrrZ	_get_conn�DEFAULT_POOL_TIMEOUTZ
putrequestrNZ	putheaderZ
endheadersr5�hex�len�encodeZgetresponse�	TypeErrorrZfrom_httplibr6r
�socket�errorrr
rmrrrrr �_ProxyErrorr�	_SSLErrorrr�
_HTTPErrorrrrs)r,r/r0r1r2r3r4rgrhZchunkedrzrB�e�errrrZlow_conn�headerrQ�i�rr&r&r'r5�s�


 


zHTTPAdapter.send)N)FNTNN)r7r8r9r:rL�DEFAULT_POOLSIZErD�DEFAULT_POOLBLOCKr+rMrRrFr^rirsrtr6rxryrZr5r;r&r&)r-r'r<Qs$%4%

r<)>r:Zos.pathr`r�Zpip._vendor.urllib3.poolmanagerrrZpip._vendor.urllib3.responserZpip._vendor.urllib3.utilrr�Zpip._vendor.urllib3.util.retryrZpip._vendor.urllib3.exceptionsrrr	r�r
rrr�r
rrr�rZmodelsr�compatrrZutilsrrrrrrZ
structuresrrpr�
exceptionsrrrr r!Zauthr"Z!pip._vendor.urllib3.contrib.socksr#�ImportErrorr�r�rDr��objectr)r<r&r&r&r'�<module>	sB $site-packages/pip/_vendor/requests/__pycache__/auth.cpython-36.opt-1.pyc000064400000017115147511334620022134 0ustar003

���e&�@s�dZddlZddlZddlZddlZddlZddlZddlmZddl	m
Z
mZmZddl
mZddlmZddlmZd	Zd
Zdd�ZGd
d�de�ZGdd�de�ZGdd�de�ZGdd�de�ZdS)z]
requests.auth
~~~~~~~~~~~~~

This module contains the authentication handlers for Requests.
�N)�	b64encode�)�urlparse�str�
basestring)�extract_cookies_to_jar)�to_native_string)�parse_dict_headerz!application/x-www-form-urlencodedzmultipart/form-datacCs�t|t�s&tjdj|�td�t|�}t|t�sLtjdj|�td�t|�}t|t�r`|jd�}t|t�rt|jd�}dtt	dj
||f��j��}|S)zReturns a Basic Auth string.z�Non-string usernames will no longer be supported in Requests 3.0.0. Please convert the object you've passed in ({0!r}) to a string or bytes object in the near future to avoid problems.)�categoryz�Non-string passwords will no longer be supported in Requests 3.0.0. Please convert the object you've passed in ({0!r}) to a string or bytes object in the near future to avoid problems.�latin1zBasic �:)�
isinstancer�warnings�warn�format�DeprecationWarningr�encoderr�join�strip)�username�passwordZauthstr�r�/usr/lib/python3.6/auth.py�_basic_auth_strs&






rc@seZdZdZdd�ZdS)�AuthBasez4Base class that all auth implementations derive fromcCstd��dS)NzAuth hooks must be callable.)�NotImplementedError)�self�rrrr�__call__KszAuthBase.__call__N)�__name__�
__module__�__qualname__�__doc__rrrrrrHsrc@s0eZdZdZdd�Zdd�Zdd�Zdd	�Zd
S)�
HTTPBasicAuthz?Attaches HTTP Basic Authentication to the given Request object.cCs||_||_dS)N)rr)rrrrrr�__init__RszHTTPBasicAuth.__init__cCs(t|jt|dd�k|jt|dd�kg�S)Nrr)�allr�getattrr)r�otherrrr�__eq__VszHTTPBasicAuth.__eq__cCs
||kS)Nr)rr'rrr�__ne__\szHTTPBasicAuth.__ne__cCst|j|j�|jd<|S)N�
Authorization)rrr�headers)rrrrrr_szHTTPBasicAuth.__call__N)rr r!r"r$r(r)rrrrrr#Os
r#c@seZdZdZdd�ZdS)�
HTTPProxyAuthz=Attaches HTTP Proxy Authentication to a given Request object.cCst|j|j�|jd<|S)NzProxy-Authorization)rrrr+)rrrrrrgszHTTPProxyAuth.__call__N)rr r!r"rrrrrr,dsr,c@sPeZdZdZdd�Zdd�Zdd�Zdd	�Zd
d�Zdd
�Z	dd�Z
dd�ZdS)�HTTPDigestAuthz@Attaches HTTP Digest Authentication to the given Request object.cCs||_||_tj�|_dS)N)rr�	threadingZlocal�
_thread_local)rrrrrrr$oszHTTPDigestAuth.__init__cCs@t|jd�s<d|j_d|j_d|j_i|j_d|j_d|j_dS)N�initT�r)�hasattrr/r0�
last_nonce�nonce_count�chal�pos�
num_401_calls)rrrr�init_per_thread_stateusz$HTTPDigestAuth.init_per_thread_statecsj|jjd}|jjd}|jjjd�}|jjjd�}|jjjd�}d�|dkrTd}n|j�}|dksl|dkrzd	d
�}	|	�n|dkr�dd
�}
|
��fdd�}�dkr�dSd}t|�}
|
jp�d}|
jr�|d|
j7}d|j||jf}d||f}�|�}�|�}||jj	k�r|jj
d7_
nd|j_
d|jj
}t|jj
�jd�}||jd�7}|t
j�jd�7}|tjd�7}tj|�j�dd�}|dk�r��d|||f�}|�s�||d||f�}n<|dk�s�d|jd�k�r�d|||d|f}|||�}ndS||j_	d|j||||f}|�r(|d|7}|�r:|d|7}|�rL|d|7}|�rb|d ||f7}d!|S)"z
        :rtype: str
        �realm�nonce�qop�	algorithm�opaqueNZMD5zMD5-SESScSs"t|t�r|jd�}tj|�j�S)Nzutf-8)r
rr�hashlibZmd5�	hexdigest)�xrrr�md5_utf8�s

z4HTTPDigestAuth.build_digest_header.<locals>.md5_utf8ZSHAcSs"t|t�r|jd�}tj|�j�S)Nzutf-8)r
rrr>�sha1r?)r@rrr�sha_utf8�s

z4HTTPDigestAuth.build_digest_header.<locals>.sha_utf8cs�d||f�S)Nz%s:%sr)�s�d)�	hash_utf8rr�<lambda>�sz4HTTPDigestAuth.build_digest_header.<locals>.<lambda>�/�?z%s:%s:%sz%s:%srz%08xzutf-8��Zauth�,z%s:%s:%s:%s:%sz>username="%s", realm="%s", nonce="%s", uri="%s", response="%s"z
, opaque="%s"z, algorithm="%s"z
, digest="%s"z , qop="auth", nc=%s, cnonce="%s"z	Digest %s)r/r5�get�upperr�pathZqueryrrr3r4rr�timeZctime�os�urandomr>rBr?�split)r�method�urlr9r:r;r<r=Z
_algorithmrArCZKDZentdigZp_parsedrOZA1ZA2ZHA1ZHA2ZncvaluerDZcnonceZrespdigZnoncebit�baser)rFr�build_digest_headersr

z"HTTPDigestAuth.build_digest_headercKs|jrd|j_dS)z)Reset num_401_calls counter on redirects.rN)Zis_redirectr/r7)rr�kwargsrrr�handle_redirect�szHTTPDigestAuth.handle_redirectcKs"d|jkodkns&d|j_|S|jjdk	rD|jjj|jj�|jjdd�}d|j	�koh|jjdk�r|jjd7_t
jd	t
jd
�}t
|jd|dd��|j_|j|j�|jj�}t|j|j|j�|j|j�|j|j|j�|jd<|jj|f|�}|jj|�||_|Sd|j_|S)
zo
        Takes the given response and tries digest-auth, if needed.

        :rtype: requests.Response
        i�i�rNzwww-authenticater1Zdigest�zdigest )�flags)�countr*)Zstatus_coder/r7r6Zrequest�body�seekr+rM�lower�re�compile�
IGNORECASEr	�subr5Zcontent�close�copyrZ_cookies�rawZprepare_cookiesrWrTrUZ
connection�send�history�append)rrrXZs_authZpatZprepZ_rrrr�
handle_401�s.	
zHTTPDigestAuth.handle_401cCs~|j�|jjr&|j|j|j�|jd<y|jj�|j_	Wnt
k
rTd|j_	YnX|jd|j�|jd|j
�d|j_|S)Nr*Zresponser)r8r/r3rWrTrUr+r]�tellr6�AttributeErrorZ
register_hookrjrYr7)rrrrrr
szHTTPDigestAuth.__call__cCs(t|jt|dd�k|jt|dd�kg�S)Nrr)r%rr&r)rr'rrrr(szHTTPDigestAuth.__eq__cCs
||kS)Nr)rr'rrrr)$szHTTPDigestAuth.__ne__N)rr r!r"r$r8rWrYrjrr(r)rrrrr-ls
Z,r-)r"rQr`rPr>r.r�base64r�compatrrrZcookiesrZ_internal_utilsrZutilsr	ZCONTENT_TYPE_FORM_URLENCODEDZCONTENT_TYPE_MULTI_PARTr�objectrr#r,r-rrrr�<module>s$,site-packages/pip/_vendor/requests/__pycache__/cookies.cpython-36.pyc000064400000044057147511334620021675 0ustar003

���e G�
@sdZddlZddlZddlZddlZddlmZddlmZm	Z	m
Z
mZyddlZWne
k
rpddlZYnXGdd�de�ZGdd	�d	e�Zd
d�Zdd
�Zddd�ZGdd�de�ZGdd�dejej�Zdd�Zdd�Zdd�Zd dd�Zdd�ZdS)!z�
requests.cookies
~~~~~~~~~~~~~~~~

Compatibility code to be able to use `cookielib.CookieJar` with requests.

requests.utils imports from here, so be careful with imports.
�N�)�to_native_string)�	cookielib�urlparse�
urlunparse�Morselc@s�eZdZdZdd�Zdd�Zdd�Zdd	�Zd
d�Zdd
�Z	dd�Z
ddd�Zdd�Zdd�Z
dd�Zedd��Zedd��Zedd��ZdS) �MockRequesta�Wraps a `requests.Request` to mimic a `urllib2.Request`.

    The code in `cookielib.CookieJar` expects this interface in order to correctly
    manage cookie policies, i.e., determine whether a cookie can be set, given the
    domains of the request and the cookie.

    The original request object is read-only. The client is responsible for collecting
    the new headers via `get_new_headers()` and interpreting them appropriately. You
    probably want `get_cookie_header`, defined below.
    cCs ||_i|_t|jj�j|_dS)N)�_r�_new_headersr�url�scheme�type)�self�request�r�/usr/lib/python3.6/cookies.py�__init__&szMockRequest.__init__cCs|jS)N)r
)rrrr�get_type+szMockRequest.get_typecCst|jj�jS)N)rr	rZnetloc)rrrr�get_host.szMockRequest.get_hostcCs|j�S)N)r)rrrr�get_origin_req_host1szMockRequest.get_origin_req_hostcCsT|jjjd�s|jjSt|jjddd�}t|jj�}t|j||j|j	|j
|jg�S)NZHostzutf-8)�encoding)r	�headers�getrrrrr�pathZparamsZqueryZfragment)r�hostZparsedrrr�get_full_url4szMockRequest.get_full_urlcCsdS)NTr)rrrr�is_unverifiableBszMockRequest.is_unverifiablecCs||jjkp||jkS)N)r	rr
)r�namerrr�
has_headerEszMockRequest.has_headerNcCs|jjj||jj||��S)N)r	rrr
)rr�defaultrrr�
get_headerHszMockRequest.get_headercCstd��dS)zMcookielib has no legitimate use for this method; add it back if you find one.z=Cookie headers should be added with add_unredirected_header()N)�NotImplementedError)r�key�valrrr�
add_headerKszMockRequest.add_headercCs||j|<dS)N)r
)rr�valuerrr�add_unredirected_headerOsz#MockRequest.add_unredirected_headercCs|jS)N)r
)rrrr�get_new_headersRszMockRequest.get_new_headerscCs|j�S)N)r)rrrr�unverifiableUszMockRequest.unverifiablecCs|j�S)N)r)rrrr�origin_req_hostYszMockRequest.origin_req_hostcCs|j�S)N)r)rrrrr]szMockRequest.host)N)�__name__�
__module__�__qualname__�__doc__rrrrrrrr r$r&r'�propertyr(r)rrrrrrs

rc@s(eZdZdZdd�Zdd�Zdd�ZdS)	�MockResponsez�Wraps a `httplib.HTTPMessage` to mimic a `urllib.addinfourl`.

    ...what? Basically, expose the parsed HTTP headers from the server response
    the way `cookielib` expects to see them.
    cCs
||_dS)z�Make a MockResponse for `cookielib` to read.

        :param headers: a httplib.HTTPMessage or analogous carrying the headers
        N)�_headers)rrrrrriszMockResponse.__init__cCs|jS)N)r0)rrrr�infopszMockResponse.infocCs|jj|�dS)N)r0�
getheaders)rrrrrr2sszMockResponse.getheadersN)r*r+r,r-rr1r2rrrrr/bsr/cCs8t|d�o|jsdSt|�}t|jj�}|j||�dS)z�Extract the cookies from the response into a CookieJar.

    :param jar: cookielib.CookieJar (not necessarily a RequestsCookieJar)
    :param request: our own requests.Request object
    :param response: urllib3.HTTPResponse object
    �_original_responseN)�hasattrr3rr/�msgZextract_cookies)�jarrZresponseZreq�resrrr�extract_cookies_to_jarws
r8cCs t|�}|j|�|j�jd�S)zj
    Produce an appropriate Cookie header string to be sent with `request`, or None.

    :rtype: str
    �Cookie)rZadd_cookie_headerr'r)r6r�rrrr�get_cookie_header�s
r;cCs�g}xV|D]N}|j|krq
|dk	r.||jkr.q
|dk	rB||jkrBq
|j|j|j|jf�q
Wx |D]\}}}|j|||�qbWdS)zkUnsets a cookie by name, by default over all domains and paths.

    Wraps CookieJar.clear(), is O(n).
    N)r�domainr�append�clear)�	cookiejarrr<rZ
clearables�cookierrr�remove_cookie_by_name�s

rAc@seZdZdZdS)�CookieConflictErrorz�There are two cookies that meet the criteria specified in the cookie jar.
    Use .get and .set and include domain and path args in order to be more specific.
    N)r*r+r,r-rrrrrB�srBcs�eZdZdZd1dd�Zdd�Zdd�Zd	d
�Zdd�Zd
d�Z	dd�Z
dd�Zdd�Zdd�Z
dd�Zd2dd�Z�fdd�Zdd�Zdd �Zd!d"�Z�fd#d$�Z�fd%d&�Zd3d'd(�Zd4d)d*�Zd+d,�Zd-d.�Zd/d0�Z�ZS)5�RequestsCookieJara�Compatibility class; is a cookielib.CookieJar, but exposes a dict
    interface.

    This is the CookieJar we create by default for requests and sessions that
    don't specify one, since some clients may expect response.cookies and
    session.cookies to support dict operations.

    Requests does not use the dict interface internally; it's just for
    compatibility with external client code. All requests code should work
    out of the box with externally provided instances of ``CookieJar``, e.g.
    ``LWPCookieJar`` and ``FileCookieJar``.

    Unlike a regular CookieJar, this class is pickleable.

    .. warning:: dictionary operations that are normally O(1) may be O(n).
    NcCs(y|j|||�Stk
r"|SXdS)z�Dict-like get() that also supports optional domain and path args in
        order to resolve naming collisions from using one cookie jar over
        multiple domains.

        .. warning:: operation is O(n), not O(1).
        N)�_find_no_duplicates�KeyError)rrrr<rrrrr�szRequestsCookieJar.getcKsX|dkr(t|||jd�|jd�d�dSt|t�r<t|�}nt||f|�}|j|�|S)z�Dict-like set() that also supports optional domain and path args in
        order to resolve naming collisions from using one cookie jar over
        multiple domains.
        Nr<r)r<r)rAr�
isinstancer�morsel_to_cookie�
create_cookie�
set_cookie)rrr%�kwargs�crrr�set�s


zRequestsCookieJar.setccsxt|�D]}|jVq
WdS)z�Dict-like iterkeys() that returns an iterator of names of cookies
        from the jar.

        .. seealso:: itervalues() and iteritems().
        N)�iterr)rr@rrr�iterkeys�szRequestsCookieJar.iterkeyscCst|j��S)z�Dict-like keys() that returns a list of names of cookies from the
        jar.

        .. seealso:: values() and items().
        )�listrN)rrrr�keys�szRequestsCookieJar.keysccsxt|�D]}|jVq
WdS)z�Dict-like itervalues() that returns an iterator of values of cookies
        from the jar.

        .. seealso:: iterkeys() and iteritems().
        N)rMr%)rr@rrr�
itervalues�szRequestsCookieJar.itervaluescCst|j��S)z�Dict-like values() that returns a list of values of cookies from the
        jar.

        .. seealso:: keys() and items().
        )rOrQ)rrrr�values�szRequestsCookieJar.valuesccs$xt|�D]}|j|jfVq
WdS)z�Dict-like iteritems() that returns an iterator of name-value tuples
        from the jar.

        .. seealso:: iterkeys() and itervalues().
        N)rMrr%)rr@rrr�	iteritems�szRequestsCookieJar.iteritemscCst|j��S)z�Dict-like items() that returns a list of name-value tuples from the
        jar. Allows client-code to call ``dict(RequestsCookieJar)`` and get a
        vanilla python dict of key value pairs.

        .. seealso:: keys() and values().
        )rOrS)rrrr�itemsszRequestsCookieJar.itemscCs0g}x&t|�D]}|j|kr|j|j�qW|S)z2Utility method to list all the domains in the jar.)rMr<r=)r�domainsr@rrr�list_domainss

zRequestsCookieJar.list_domainscCs0g}x&t|�D]}|j|kr|j|j�qW|S)z0Utility method to list all the paths in the jar.)rMrr=)r�pathsr@rrr�
list_pathss

zRequestsCookieJar.list_pathscCs>g}x4t|�D](}|jdk	r*|j|kr*dS|j|j�qWdS)zvReturns True if there are multiple domains in the jar.
        Returns False otherwise.

        :rtype: bool
        NTF)rMr<r=)rrUr@rrr�multiple_domainssz"RequestsCookieJar.multiple_domainscCsJi}x@t|�D]4}|dks$|j|kr|dks6|j|kr|j||j<qW|S)z�Takes as an argument an optional domain and path and returns a plain
        old Python dict of name-value pairs of cookies that meet the
        requirements.

        :rtype: dict
        N)rMr<rr%r)rr<rZ
dictionaryr@rrr�get_dict,szRequestsCookieJar.get_dictcs*ytt|�j|�Stk
r$dSXdS)NT)�superrC�__contains__rB)rr)�	__class__rrr\<szRequestsCookieJar.__contains__cCs
|j|�S)z�Dict-like __getitem__() for compatibility with client code. Throws
        exception if there are more than one cookie with name. In that case,
        use the more explicit get() method instead.

        .. warning:: operation is O(n), not O(1).
        )rD)rrrrr�__getitem__BszRequestsCookieJar.__getitem__cCs|j||�dS)z�Dict-like __setitem__ for compatibility with client code. Throws
        exception if there is already a cookie of that name in the jar. In that
        case, use the more explicit set() method instead.
        N)rL)rrr%rrr�__setitem__KszRequestsCookieJar.__setitem__cCst||�dS)zlDeletes a cookie given a name. Wraps ``cookielib.CookieJar``'s
        ``remove_cookie_by_name()``.
        N)rA)rrrrr�__delitem__RszRequestsCookieJar.__delitem__csLt|jd�r4|jjd�r4|jjd�r4|jjdd�|_tt|�j|f|�|�S)N�
startswith�"z\"�)r4r%ra�endswith�replacer[rCrI)rr@�argsrJ)r]rrrIXs$zRequestsCookieJar.set_cookiecs@t|tj�r,x.|D]}|jtj|��qWntt|�j|�dS)zAUpdates this jar with cookies from another CookieJar or dict-likeN)rFr�	CookieJarrI�copyr[rC�update)r�otherr@)r]rrri]s
zRequestsCookieJar.updatecCs\xDt|�D]8}|j|kr
|dks*|j|kr
|dks<|j|kr
|jSq
Wtd|||f��dS)a�Requests uses this method internally to get cookie values.

        If there are conflicting cookies, _find arbitrarily chooses one.
        See _find_no_duplicates if you want an exception thrown if there are
        conflicting cookies.

        :param name: a string containing name of cookie
        :param domain: (optional) string containing domain of cookie
        :param path: (optional) string containing path of cookie
        :return: cookie.value
        Nzname=%r, domain=%r, path=%r)rMrr<rr%rE)rrr<rr@rrr�_findes

zRequestsCookieJar._findcCs|d}xXt|�D]L}|j|kr|dks.|j|kr|dks@|j|kr|dk	rTtd|��|j}qW|rf|Std|||f��dS)a�Both ``__get_item__`` and ``get`` call this function: it's never
        used elsewhere in Requests.

        :param name: a string containing name of cookie
        :param domain: (optional) string containing domain of cookie
        :param path: (optional) string containing path of cookie
        :raises KeyError: if cookie is not found
        :raises CookieConflictError: if there are multiple cookies
            that match name and optionally domain and path
        :return: cookie.value
        Nz(There are multiple cookies with name, %rzname=%r, domain=%r, path=%r)rMrr<rrBr%rE)rrr<rZtoReturnr@rrrrDys

z%RequestsCookieJar._find_no_duplicatescCs|jj�}|jd�|S)z4Unlike a normal CookieJar, this class is pickleable.�
_cookies_lock)�__dict__rh�pop)r�staterrr�__getstate__�s

zRequestsCookieJar.__getstate__cCs$|jj|�d|jkr tj�|_dS)z4Unlike a normal CookieJar, this class is pickleable.rlN)rmri�	threading�RLockrl)rrorrr�__setstate__�s
zRequestsCookieJar.__setstate__cCst�}|j|�|S)z(Return a copy of this RequestsCookieJar.)rCri)rZnew_cjrrrrh�s
zRequestsCookieJar.copy)NNN)NN)NN)NN)r*r+r,r-rrLrNrPrQrRrSrTrVrXrYrZr\r^r_r`rIrirkrDrprsrh�
__classcell__rr)r]rrC�s0
				
	

rCcCsR|dkrdSt|d�r|j�Stj|�}|j�x|D]}|jtj|��q6W|S)Nrh)r4rhr>rI)r6Znew_jarr@rrr�_copy_cookie_jar�s


rucKs�td||ddddddddddidd�
}t|�t|�}|rNd	}t|t|���|j|�t|d
�|d<t|d�|d
<|djd�|d<t|d�|d<tjf|�S)z�Make a cookie from underspecified parameters.

    By default, the pair of `name` and `value` will be set for the domain ''
    and sent on every request (this is sometimes called a "supercookie").
    rNrc�/FT�HttpOnly)
�versionrr%�portr<r�secure�expires�discard�comment�comment_url�rest�rfc2109z4create_cookie() got unexpected keyword arguments: %sryZport_specifiedr<Zdomain_specified�.Zdomain_initial_dotrZpath_specified)	�dictrL�	TypeErrorrOri�boolrarr9)rr%rJ�resultZbadargs�errrrrrH�s0
rHcCs�d}|drPyttj�t|d��}Wqrtk
rLtd|d��YqrXn"|drrd}tjtj|d|��}t|dt|d�d|d||j	|d	dd
|didt|d�|j
|d
p�dd�
S)zBConvert a Morsel object into a Cookie containing the one k/v pair.Nzmax-agezmax-age: %s must be integerr{z%a, %d-%b-%Y %H:%M:%S GMTr}Fr<rrwZhttponlyrzrxr)
r}r~r|r<r{rrryrr�rzr%rx)�int�time�
ValueErrorr��calendarZtimegmZstrptimerHr�r"r%)Zmorselr{Z
time_templaterrrrG�s0


rGTcCsV|dkrt�}|dk	rRdd�|D�}x,|D]$}|s:||kr*|jt|||��q*W|S)a-Returns a CookieJar from a key/value dictionary.

    :param cookie_dict: Dict of key/values to insert into CookieJar.
    :param cookiejar: (optional) A cookiejar to add the cookies to.
    :param overwrite: (optional) If False, will not replace cookies
        already in the jar with new ones.
    NcSsg|]
}|j�qSr)r)�.0r@rrr�
<listcomp>sz'cookiejar_from_dict.<locals>.<listcomp>)rCrIrH)Zcookie_dictr?�	overwriteZnames_from_jarrrrr�cookiejar_from_dict�s
r�cCszt|tj�std��t|t�r.t||dd�}nHt|tj�rvy|j|�Wn,tk
rtx|D]}|j|�q^WYnX|S)z�Add cookies to cookiejar and returns a merged CookieJar.

    :param cookiejar: CookieJar object to add the cookies to.
    :param cookies: Dictionary or CookieJar object to be added.
    z!You can only merge into CookieJarF)r?r�)	rFrrgr�r�r�ri�AttributeErrorrI)r?ZcookiesZ
cookie_in_jarrrr�
merge_cookiess

r�)NN)NT)r-rhr�r��collectionsZ_internal_utilsr�compatrrrrrq�ImportErrorZdummy_threading�objectrr/r8r;rA�RuntimeErrorrBrg�MutableMappingrCrurHrGr�r�rrrr�<module>
s.H
{#
site-packages/pip/_vendor/requests/__pycache__/packages.cpython-36.pyc000064400000000676147511334620022016 0ustar003

���e��@s~ddlZxpdD]hZdeZee�e�e<xLeej�D]>ZeeksNejed�r4ee	d�d�Z
ejeejde
<q4WqWdS)	�N�urllib3�idna�chardetzpip._vendor.�.zpip._vendor.requests.packages.)rrr)�sys�packageZvendored_package�
__import__�locals�list�modules�mod�
startswith�lenZunprefixed_mod�rr�/usr/lib/python3.6/packages.py�<module>s
site-packages/pip/_vendor/requests/__pycache__/compat.cpython-36.opt-1.pyc000064400000002653147511334620022457 0ustar003

���eZ�@s�dZddlmZddlZejZeddkZeddkZddlZer�ddl	m
Z
mZmZm
Z
mZmZmZmZmZddlmZmZmZmZmZddlmZddlZdd	lmZdd
lmZddlmZe Z!e Z"e#Z e$Z$e%e&e'fZ(e%e&fZ)n�e�r�ddl*mZmZmZmZmZm
Z
mZmZm
Z
mZdd
l+mZmZmZmZmZddl,m-Zdd	l.mZdd
l/mZddl0mZe Z!e Z e"Z"e e"fZ$e%e'fZ(e%fZ)dS)zq
requests.compat
~~~~~~~~~~~~~~~

This module handles import compatibility issues between Python 2 and
Python 3.
�)�chardetN��)	�quote�unquote�
quote_plus�unquote_plus�	urlencode�
getproxies�proxy_bypass�proxy_bypass_environment�getproxies_environment)�urlparse�
urlunparse�urljoin�urlsplit�	urldefrag)�parse_http_list)�Morsel)�StringIO)�OrderedDict)
rrrrr	rrrrr)rr
rrr
)�	cookiejar)1�__doc__Zpip._vendorr�sys�version_infoZ_verZis_py2Zis_py3ZjsonZurllibrrrrr	r
rrr
rrrrrZurllib2rZ	cookielibZCookierrZ)pip._vendor.urllib3.packages.ordered_dictr�strZbuiltin_str�bytesZunicodeZ
basestring�intZlong�floatZ
numeric_typesZ
integer_typesZurllib.parseZurllib.requestZhttprZhttp.cookies�io�collections�r!r!�/usr/lib/python3.6/compat.py�<module>	sB,

0site-packages/pip/_vendor/requests/__pycache__/structures.cpython-36.pyc000064400000010346147511334620022456 0ustar003

���e��@s>dZddlZddlmZGdd�dej�ZGdd�de�ZdS)	zO
requests.structures
~~~~~~~~~~~~~~~~~~~

Data structures that power Requests.
�N�)�OrderedDictc@sbeZdZdZddd�Zdd�Zdd�Zd	d
�Zdd�Zd
d�Z	dd�Z
dd�Zdd�Zdd�Z
dS)�CaseInsensitiveDicta�A case-insensitive ``dict``-like object.

    Implements all methods and operations of
    ``collections.MutableMapping`` as well as dict's ``copy``. Also
    provides ``lower_items``.

    All keys are expected to be strings. The structure remembers the
    case of the last key to be set, and ``iter(instance)``,
    ``keys()``, ``items()``, ``iterkeys()``, and ``iteritems()``
    will contain case-sensitive keys. However, querying and contains
    testing is case insensitive::

        cid = CaseInsensitiveDict()
        cid['Accept'] = 'application/json'
        cid['aCCEPT'] == 'application/json'  # True
        list(cid) == ['Accept']  # True

    For example, ``headers['content-encoding']`` will return the
    value of a ``'Content-Encoding'`` response header, regardless
    of how the header name was originally stored.

    If the constructor, ``.update``, or equality comparison
    operations are given keys that have equal ``.lower()``s, the
    behavior is undefined.
    NcKs&t�|_|dkri}|j|f|�dS)N)r�_store�update)�self�data�kwargs�r
� /usr/lib/python3.6/structures.py�__init__*szCaseInsensitiveDict.__init__cCs||f|j|j�<dS)N)r�lower)r�key�valuer
r
r�__setitem__0szCaseInsensitiveDict.__setitem__cCs|j|j�dS)Nr)rr
)rrr
r
r�__getitem__5szCaseInsensitiveDict.__getitem__cCs|j|j�=dS)N)rr
)rrr
r
r�__delitem__8szCaseInsensitiveDict.__delitem__cCsdd�|jj�D�S)Ncss|]\}}|VqdS)Nr
)�.0ZcasedkeyZmappedvaluer
r
r�	<genexpr><sz/CaseInsensitiveDict.__iter__.<locals>.<genexpr>)r�values)rr
r
r�__iter__;szCaseInsensitiveDict.__iter__cCs
t|j�S)N)�lenr)rr
r
r�__len__>szCaseInsensitiveDict.__len__cCsdd�|jj�D�S)z.Like iteritems(), but with all lowercase keys.css|]\}}||dfVqdS)rNr
)rZlowerkeyZkeyvalr
r
rrDsz2CaseInsensitiveDict.lower_items.<locals>.<genexpr>)r�items)rr
r
r�lower_itemsAszCaseInsensitiveDict.lower_itemscCs2t|tj�rt|�}ntSt|j��t|j��kS)N)�
isinstance�collections�Mappingr�NotImplemented�dictr)r�otherr
r
r�__eq__Is
zCaseInsensitiveDict.__eq__cCst|jj��S)N)rrr)rr
r
r�copyRszCaseInsensitiveDict.copycCstt|j���S)N)�strrr)rr
r
r�__repr__UszCaseInsensitiveDict.__repr__)N)�__name__�
__module__�__qualname__�__doc__rrrrrrrr!r"r$r
r
r
rrs
	rcs<eZdZdZd�fdd�	Zdd�Zdd�Zdd	d
�Z�ZS)
�
LookupDictzDictionary lookup object.Ncs||_tt|�j�dS)N)�name�superr)r)rr*)�	__class__r
rr\szLookupDict.__init__cCs
d|jS)Nz
<lookup '%s'>)r*)rr
r
rr$`szLookupDict.__repr__cCs|jj|d�S)N)�__dict__�get)rrr
r
rrcszLookupDict.__getitem__cCs|jj||�S)N)r-r.)rr�defaultr
r
rr.hszLookupDict.get)N)N)	r%r&r'r(rr$rr.�
__classcell__r
r
)r,rr)Ys
r))r(r�compatr�MutableMappingrrr)r
r
r
r�<module>sJsite-packages/pip/_vendor/requests/__pycache__/utils.cpython-36.pyc000064400000050320147511334620021367 0ustar003

���e/l�@s�dZddlZddlZddlZddlZddlZddlZddlZddlZddl	Z	ddl
Z
ddlZddlmZddl
mZddlmZddlmZddlmZmZmZmZmZmZmZmZmZmZmZmZm Z m!Z!dd	l"m#Z#dd
l$m%Z%ddl&m'Z'm(Z(m)Z)m*Z*dfZ+ej,�Z-ddd�Z.ej/�dk�r0dd�Z0dd�Zdd�Z1dd�Z2dgdd�Z3dd�Z4dd �Z5d!d"�Z6d#d$�Z7d%d&�Z8dhd'd(�Z9d)d*�Z:d+d,�Z;d-d.�Z<d/d0�Z=d1d2�Z>d3d4�Z?d5d6�Z@eAdi�ZBd9d:�ZCd;d<�ZDd=d>�ZEd?d@�ZFdAdB�ZGdCdD�ZHejIdEdF��ZJdGdH�ZKdjdIdJ�ZLdKdL�ZMdkdNdO�ZNdPdQ�ZOdRdS�ZPdTjQdU�ZReRdVZSeRdWZTdXdY�ZUdZd[�ZVd\d]�ZWejXd^�ZYejXd_�ZZd`da�Z[dbdc�Z\ddde�Z]dS)lz�
requests.utils
~~~~~~~~~~~~~~

This module provides utility functions that are used within Requests
that are also useful for external consumption.
�N�)�__version__)�certs)�to_native_string)�parse_http_list)�quote�urlparse�bytes�str�OrderedDict�unquote�
getproxies�proxy_bypass�
urlunparse�
basestring�
integer_types�is_py3�proxy_bypass_environment�getproxies_environment)�cookiejar_from_dict)�CaseInsensitiveDict)�
InvalidURL�
InvalidHeader�FileModeWarning�UnrewindableBodyError�.netrc�_netrc�Pi�)ZhttpZhttpsZWindowsc
Cs�trddl}nddl}y2|j|jd�}|j|d�d}|j|d�d}Wntk
r\dSX|sj|rndS|jd�}xX|D]P}|dkr�d|kr�d	S|jdd
�}|jdd�}|jd
d�}t	j
||t	j�r~d	Sq~WdS)Nrz;Software\Microsoft\Windows\CurrentVersion\Internet SettingsZProxyEnableZ
ProxyOverrideF�;z<local>�.Tz\.�*z.*�?)r�winreg�_winreg�OpenKey�HKEY_CURRENT_USERZQueryValueEx�OSError�split�replace�re�match�I)�hostr"ZinternetSettingsZproxyEnableZ
proxyOverrideZtest�r-�/usr/lib/python3.6/utils.py�proxy_bypass_registry.s2



r/cCst�rt|�St|�SdS)z�Return True, if the host should be bypassed.

        Checks proxy settings gathered from the environment, if specified,
        or the registry.
        N)rrr/)r,r-r-r.rOsrcCst|d�r|j�}|S)z/Returns an internal sequence dictionary update.�items)�hasattrr0)�dr-r-r.�dict_to_sequence[s
r3cCs2d}d}t|d�rt|�}nbt|d�r.|j}nPt|d�r~y|j�}Wntjk
rZYn$Xtj|�j}d|jkr~t	j
dt�t|d��ry|j�}Wn$t
tfk
r�|dk	r�|}Yn\Xt|d�o�|dk�ry&|jdd	�|j�}|j|p�d�Wnt
tfk
�rd}YnX|dk�r$d}td||�S)
Nr�__len__�len�fileno�ba%Requests has determined the content-length for this request using the binary size of the file: however, the file has been opened in text mode (i.e. without the 'b' flag in the mode). This may lead to an incorrect content-length. In Requests 3.0, support will be removed for files in text mode.�tell�seek�)r1r5r6�io�UnsupportedOperation�os�fstat�st_size�mode�warnings�warnrr8r&�IOErrorr9�max)�oZtotal_lengthZcurrent_positionr6r-r-r.�	super_lends@







rFFcCsy�ddlm}m}d}xJtD]B}ytjjdj|��}Wntk
rJdSXtjj|�r|}PqW|dkrndSt	|�}d}t
|t�r�|jd�}|j
j|�d}	y6||�j|	�}
|
r�|
dr�dnd}|
||
dfSWn|tfk
r�|r�YnXWnttfk
�rYnXdS)	z;Returns the Requests tuple auth for a given url from netrc.r)�netrc�NetrcParseErrorNz~/{0}�:�asciirr:)rGrH�NETRC_FILESr=�path�
expanduser�format�KeyError�existsr�
isinstancer
�decode�netlocr'ZauthenticatorsrC�ImportError�AttributeError)�urlZraise_errorsrGrHZ
netrc_path�f�locZriZsplitstrr,rZlogin_ir-r-r.�get_netrc_auth�s8


rYcCsBt|dd�}|r>t|t�r>|ddkr>|ddkr>tjj|�SdS)z0Tries to guess the filename of the given object.�nameNr�<r�>���)�getattrrQrr=rL�basename)�objrZr-r-r.�guess_filename�sracCs.|dkrdSt|ttttf�r&td��t|�S)a�Take an object and test to see if it can be represented as a
    dictionary. Unless it can not be represented as such, return an
    OrderedDict, e.g.,

    ::

        >>> from_key_val_list([('key', 'val')])
        OrderedDict([('key', 'val')])
        >>> from_key_val_list('string')
        ValueError: need more than 1 value to unpack
        >>> from_key_val_list({'key': 'val'})
        OrderedDict([('key', 'val')])

    :rtype: OrderedDict
    Nz+cannot encode objects that are not 2-tuples)rQr
r	�bool�int�
ValueErrorr)�valuer-r-r.�from_key_val_list�s
rfcCsB|dkrdSt|ttttf�r&td��t|tj�r:|j�}t	|�S)a�Take an object and test to see if it can be represented as a
    dictionary. If it can be, return a list of tuples, e.g.,

    ::

        >>> to_key_val_list([('key', 'val')])
        [('key', 'val')]
        >>> to_key_val_list({'key': 'val'})
        [('key', 'val')]
        >>> to_key_val_list('string')
        ValueError: cannot encode objects that are not 2-tuples.

    :rtype: list
    Nz+cannot encode objects that are not 2-tuples)
rQr
r	rbrcrd�collections�Mappingr0�list)rer-r-r.�to_key_val_list�srjcCs\g}xRt|�D]F}|dd�|dd�ko4dknrJt|dd��}|j|�qW|S)aParse lists as described by RFC 2068 Section 2.

    In particular, parse comma-separated lists where the elements of
    the list may include quoted-strings.  A quoted-string could
    contain a comma.  A non-quoted string could have quotes in the
    middle.  Quotes are removed automatically after parsing.

    It basically works like :func:`parse_set_header` just that items
    may appear multiple times and case sensitivity is preserved.

    The return value is a standard :class:`list`:

    >>> parse_list_header('token, "quoted value"')
    ['token', 'quoted value']

    To create a header from the :class:`list` again, use the
    :func:`dump_header` function.

    :param value: a string with a list header.
    :return: :class:`list`
    :rtype: list
    Nr�"r]r])�_parse_list_header�unquote_header_value�append)re�result�itemr-r-r.�parse_list_headers(rqcCs|i}xrt|�D]f}d|kr$d||<q|jdd�\}}|dd�|dd�koVdknrlt|dd��}|||<qW|S)a^Parse lists of key, value pairs as described by RFC 2068 Section 2 and
    convert them into a python dict:

    >>> d = parse_dict_header('foo="is a fish", bar="as well"')
    >>> type(d) is dict
    True
    >>> sorted(d.items())
    [('bar', 'as well'), ('foo', 'is a fish')]

    If there is no value for a key it will be `None`:

    >>> parse_dict_header('key_without_value')
    {'key_without_value': None}

    To create a header from the :class:`dict` again, use the
    :func:`dump_header` function.

    :param value: a string with a dict header.
    :return: :class:`dict`
    :rtype: dict
    �=Nrrkr]r])rlr'rm)rerorprZr-r-r.�parse_dict_header1s(rscCs^|rZ|d|d	kodknrZ|dd
�}|sF|dd�dkrZ|jdd�jdd�S|S)z�Unquotes a header value.  (Reversal of :func:`quote_header_value`).
    This does not use the real unquoting but what browsers are actually
    using for quoting.

    :param value: the header value to unquote.
    :rtype: str
    rrrkNr:z\\�\z\"r]r])r()reZis_filenamer-r-r.rmTs
$rmcCs"i}x|D]}|j||j<q
W|S)z�Returns a key/value dictionary from a CookieJar.

    :param cj: CookieJar object to extract cookies from.
    :rtype: dict
    )rerZ)�cj�cookie_dictZcookier-r-r.�dict_from_cookiejarms
rwcCs
t||�S)z�Returns a CookieJar from a key/value dictionary.

    :param cj: CookieJar to insert cookies into.
    :param cookie_dict: Dict of key/values to insert into CookieJar.
    :rtype: CookieJar
    )r)rurvr-r-r.�add_dict_to_cookiejar|srxcCsTtjdt�tjdtjd�}tjdtjd�}tjd�}|j|�|j|�|j|�S)zlReturns encodings from given content string.

    :param content: bytestring to extract encodings from.
    z�In requests 3.0, get_encodings_from_content will be removed. For more information, please see the discussion on issue #2266. (This warning should only appear once.)z!<meta.*?charset=["\']*(.+?)["\'>])�flagsz+<meta.*?content=["\']*;?charset=(.+?)["\'>]z$^<\?xml.*?encoding=["\']*(.+?)["\'>])rArB�DeprecationWarningr)�compiler+�findall)�contentZ
charset_reZ	pragma_reZxml_rer-r-r.�get_encodings_from_content�s
r~cCsF|jd�}|sdStj|�\}}d|kr6|djd�Sd|krBdSdS)z}Returns encodings from given HTTP Header Dict.

    :param headers: dictionary to extract encoding from.
    :rtype: str
    zcontent-typeN�charsetz'"�textz
ISO-8859-1)�get�cgiZparse_header�strip)�headersZcontent_type�paramsr-r-r.�get_encoding_from_headers�s
r�ccsr|jdkr"x|D]
}|VqWdStj|j�dd�}x |D]}|j|�}|r:|Vq:W|jddd�}|rn|VdS)zStream decodes a iterator.Nr()�errors�T)�final)�encoding�codecs�getincrementaldecoderrR)�iterator�rrp�decoder�chunk�rvr-r-r.�stream_decode_response_unicode�s





r�ccsLd}|dks|dkrt|�}x*|t|�krF||||�V||7}qWdS)z Iterate over slices of a string.rN)r5)�stringZslice_length�posr-r-r.�iter_slices�sr�cCsvtjdt�g}t|j�}|rJyt|j|�Stk
rH|j|�YnXyt|j|dd�St	k
rp|jSXdS)z�Returns the requested content back in unicode.

    :param r: Response object to get unicode content from.

    Tried:

    1. charset from content-type
    2. fall back and replace all unicode characters

    :rtype: str
    z�In requests 3.0, get_unicode_from_response will be removed. For more information, please see the discussion on issue #2266. (This warning should only appear once.)r()r�N)
rArBrzr�r�r
r}�UnicodeErrorrn�	TypeError)r�Ztried_encodingsr�r-r-r.�get_unicode_from_response�s
r�Z4ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyzz0123456789-._~c
Cs�|jd�}x�tdt|��D]�}||dd�}t|�dkr�|j�r�ytt|d��}Wn tk
rttd|��YnX|tkr�|||dd�||<q�d||||<qd||||<qWdj	|�S)	z�Un-escape any percent-escape sequences in a URI that are unreserved
    characters. This leaves all reserved, illegal and non-ASCII bytes encoded.

    :rtype: str
    �%rrr:�z%Invalid percent-escape sequence: '%s'N�)
r'�ranger5�isalnum�chrrcrdr�UNRESERVED_SET�join)�uri�parts�i�h�cr-r-r.�unquote_unreserved�s
r�cCs:d}d}ytt|�|d�Stk
r4t||d�SXdS)z�Re-quote the given URI.

    This function passes the given URI through an unquote/quote cycle to
    ensure that it is fully and consistently quoted.

    :rtype: str
    z!#$%&'()*+,/:;=?@[]~z!#$&'()*+,/:;=?@[]~)ZsafeN)rr�r)r�Zsafe_with_percentZsafe_without_percentr-r-r.�requote_uri
sr�cCsltjdtj|��d}|jd�\}}tjdtjtt|����d}tjdtj|��d|@}||@||@kS)z�This function allows you to check if an IP belongs to a network subnet

    Example: returns True if ip = 192.168.1.1 and net = 192.168.1.0/24
             returns False if ip = 192.168.1.1 and net = 192.168.100.0/24

    :rtype: bool
    z=Lr�/)�struct�unpack�socket�	inet_atonr'�dotted_netmaskrc)�ipZnetZipaddrZnetaddr�bitsZnetmaskZnetworkr-r-r.�address_in_network#s
r�cCs&ddd|>dA}tjtjd|��S)z�Converts mask from /xx format to xxx.xxx.xxx.xxx

    Example: if mask is 24 function returns 255.255.255.0

    :rtype: str
    l��r� z>I)r�Z	inet_ntoar��pack)�maskr�r-r-r.r�2sr�cCs*ytj|�Wntjk
r$dSXdS)z
    :rtype: bool
    FT)r�r��error)Z	string_ipr-r-r.�is_ipv4_address=s
r�cCs�|jd�dkr�yt|jd�d�}Wntk
r8dSX|dksJ|dkrNdSytj|jd�d�Wq�tjk
r|dSXndSdS)zV
    Very simple check of the cidr format in no_proxy variable.

    :rtype: bool
    r�rFr�rT)�countrcr'rdr�r�r�)Zstring_networkr�r-r-r.�
is_valid_cidrHsr�ccsT|dk	}|r"tjj|�}|tj|<z
dVWd|rN|dkrDtj|=n
|tj|<XdS)z�Set the environment variable 'env_name' to 'value'

    Save previous value, yield, and then restore the previous value stored in
    the environment variable 'env_name'.

    If 'value' is None, do nothingN)r=�environr�)Zenv_namereZ
value_changedZ	old_valuer-r-r.�set_environ`s


r�c	Csdd�}|}|dkr|d�}t|�j}|r�dd�|jdd�jd	�D�}|jd
�d}t|�r�xb|D](}t|�r~t||�r�dSqb||krbdSqbWn0x.|D]&}|j|�s�|jd
�dj|�r�dSq�Wtd|��2yt	|�}Wnt
tjfk
r�d
}YnXWdQRX|�rdSd
S)zL
    Returns whether we should bypass proxies or not.

    :rtype: bool
    cSstjj|�ptjj|j��S)N)r=r�r��upper)�kr-r-r.�<lambda>|sz'should_bypass_proxies.<locals>.<lambda>N�no_proxycss|]}|r|VqdS)Nr-)�.0r,r-r-r.�	<genexpr>�sz(should_bypass_proxies.<locals>.<genexpr>� r��,�:rTF)
rrSr(r'r�r�r��endswithr�rr�r�Zgaierror)	rVr�Z	get_proxyZno_proxy_argrSr�Zproxy_ipr,Zbypassr-r-r.�should_bypass_proxiesvs4




r�cCst||d�riSt�SdS)zA
    Return a dict of environment proxies.

    :rtype: dict
    )r�N)r�r
)rVr�r-r-r.�get_environ_proxies�sr�cCsv|pi}t|�}|jdkr.|j|j|jd��S|jd|j|jd|jdg}d}x|D]}||krX||}PqXW|S)z�Select a proxy for the url, if applicable.

    :param url: The url being for the request
    :param proxies: A dictionary of schemes or schemes and hosts to proxy URLs
    N�allz://zall://)rZhostnamer��scheme)rVZproxiesZurlpartsZ
proxy_keys�proxyZ	proxy_keyr-r-r.�select_proxy�s

r��python-requestscCsd|tfS)zO
    Return a string representing the default user agent.

    :rtype: str
    z%s/%s)r)rZr-r-r.�default_user_agent�sr�cCstt�djd�ddd��S)z9
    :rtype: requests.structures.CaseInsensitiveDict
    z, �gzip�deflatez*/*z
keep-alive)z
User-AgentzAccept-EncodingZAcceptZ
Connection)r�r�)rr�r�r-r-r-r.�default_headers�s
r�c	Cs�g}d}x�tjd|�D]�}y|jdd�\}}Wntk
rL|d}}YnXd|jd�i}xP|jd�D]B}y|jd�\}}Wntk
r�PYnX|j|�||j|�<qhW|j|�qW|S)	z�Return a dict of parsed link headers proxies.

    i.e. Link: <http:/.../front.jpeg>; rel=front; type="image/jpeg",<http://.../back.jpeg>; rel=back;type="image/jpeg"

    :rtype: list
    z '"z, *<rrr�rVz<> '"rr)r)r'rdr�rn)	reZlinksZ
replace_chars�valrVr��linkZparam�keyr-r-r.�parse_header_links�s r��rJr:�cCs�|dd�}|tjtjfkr dS|dd�tjkr6dS|dd�tjtjfkrRdS|jt�}|dkrhd	S|dkr�|ddd�tkr�d
S|ddd�tkr�dS|dkr�|dd�t	kr�d
S|dd�t	kr�dSdS)z
    :rtype: str
    N�zutf-32r�z	utf-8-sigr:zutf-16rzutf-8z	utf-16-berz	utf-16-lez	utf-32-bez	utf-32-le)
r��BOM_UTF32_LE�BOM_UTF32_BE�BOM_UTF8�BOM_UTF16_LE�BOM_UTF16_BEr��_null�_null2�_null3)�dataZsampleZ	nullcountr-r-r.�guess_json_utfs*
r�cCs8t||�\}}}}}}|s$||}}t||||||f�S)z�Given a URL that may or may not have a scheme, prepend the given scheme.
    Does not replace a present scheme with the one provided as an argument.

    :rtype: str
    )rr)rVZ
new_schemer�rSrLr��query�fragmentr-r-r.�prepend_scheme_if_needed1s
r�cCsBt|�}yt|j�t|j�f}Wnttfk
r<d}YnX|S)z{Given a url with authentication components, extract them into a tuple of
    username,password.

    :rtype: (str,str)
    r�)r�r�)rrZusernameZpasswordrUr�)rVZparsedZauthr-r-r.�get_auth_from_urlBs
r�s^\S[^\r\n]*$|^$z^\S[^\r\n]*$|^$cCsf|\}}t|t�rt}nt}y|j|�s4td|��Wn*tk
r`td||t|�f��YnXdS)z�Verifies that header value is a string which doesn't contain
    leading whitespace or return characters. This prevents unintended
    header injection.

    :param header: tuple, in the format (name, value).
    z7Invalid return character or leading space in header: %sz>Value for header {%s: %s} must be of type str or bytes, not %sN)rQr	�_CLEAN_HEADER_REGEX_BYTE�_CLEAN_HEADER_REGEX_STRr*rr��type)�headerrZreZpatr-r-r.�check_header_validityWs

r�cCsFt|�\}}}}}}|s"||}}|jdd�d}t|||||df�S)zW
    Given a url remove the fragment and the authentication part.

    :rtype: str
    �@rr�r])r�rsplitr)rVr�rSrLr�r�r�r-r-r.�
urldefragauthls

r�cCs`t|jdd�}|dk	rTt|jt�rTy||j�Wq\ttfk
rPtd��Yq\Xntd��dS)zfMove file pointer back to its recorded starting position
    so it can be read again on redirect.
    r9Nz;An error occurred when rewinding request body for redirect.z+Unable to rewind request body for redirect.)r^ZbodyrQZ_body_positionrrCr&r)Zprepared_requestZ	body_seekr-r-r.�rewind_body}sr�)rr)F)FzBABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789-._~)N)r�)^�__doc__r�r�rg�
contextlibr;r=�platformr)r�r�rArr�rZ_internal_utilsr�compatrrlrrr	r
rrr
rrrrrrrZcookiesrZ
structuresr�
exceptionsrrrrrK�whereZDEFAULT_CA_BUNDLE_PATHZ
DEFAULT_PORTS�systemr/r3rFrYrarfrjrqrsrmrwrxr~r�r�r�r��	frozensetr�r�r�r�r�r�r��contextmanagerr�r�r�r�r�r�r��encoder�r�r�r�r�r�r{r�r�r�r�r�r-r-r-r.�<module>	s�@
!	=
3 #

%9

	"
 

site-packages/pip/_vendor/requests/__pycache__/_internal_utils.cpython-36.opt-1.pyc000064400000002243147511334620024362 0ustar003

���eH�@s.dZddlmZmZmZd	dd�Zdd�ZdS)
z�
requests._internal_utils
~~~~~~~~~~~~~~~~~~~~~~~~

Provides utility functions that are consumed internally by Requests
which depend on extremely few external helpers (such as compat)
�)�is_py2�builtin_str�str�asciicCs.t|t�r|}ntr |j|�}n
|j|�}|S)z�Given a string object, regardless of type, returns a representation of
    that string in the native string type, encoding and decoding where
    necessary. This assumes ASCII unless told otherwise.
    )�
isinstancerr�encode�decode)�string�encoding�out�r�%/usr/lib/python3.6/_internal_utils.py�to_native_strings

rcCs(y|jd�dStk
r"dSXdS)z�Determine if unicode string only contains ASCII characters.

    :param str u_string: unicode string to check. Must be unicode
        and not Python 2 `str`.
    :rtype: bool
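    A minimal illustration of the two helpers documented in this module
    (import path assumed)::

      >>> from requests._internal_utils import to_native_string, unicode_is_ascii
      >>> to_native_string(b'header-value')   # bytes are decoded on Python 3
      'header-value'
      >>> unicode_is_ascii(u'plain ascii')
      True
      >>> unicode_is_ascii(u'sch\u00f6n')
      False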
    rTFN)r�UnicodeEncodeError)Zu_stringrrr
�unicode_is_asciis

rN)r)�__doc__�compatrrrrrrrrr
�<module>	s
site-packages/pip/_vendor/requests/__pycache__/auth.cpython-36.pyc000064400000017115147511334620021175 0ustar003

���e&�@s�dZddlZddlZddlZddlZddlZddlZddlmZddl	m
Z
mZmZddl
mZddlmZddlmZd	Zd
Zdd�ZGd
d�de�ZGdd�de�ZGdd�de�ZGdd�de�ZdS)z]
requests.auth
~~~~~~~~~~~~~

This module contains the authentication handlers for Requests.
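A brief, illustrative sketch of attaching the handlers defined below
(``HTTPBasicAuth`` and ``HTTPDigestAuth``; the host and credentials are made up)::

   >>> import requests
   >>> from requests.auth import HTTPBasicAuth, HTTPDigestAuth
   >>> requests.get('https://httpbin.org/basic-auth/user/pass',
   ...              auth=HTTPBasicAuth('user', 'pass'))
   <Response [200]>
   >>> requests.get('https://httpbin.org/digest-auth/auth/user/pass',
   ...              auth=HTTPDigestAuth('user', 'pass'))
   <Response [200]>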
�N)�	b64encode�)�urlparse�str�
basestring)�extract_cookies_to_jar)�to_native_string)�parse_dict_headerz!application/x-www-form-urlencodedzmultipart/form-datacCs�t|t�s&tjdj|�td�t|�}t|t�sLtjdj|�td�t|�}t|t�r`|jd�}t|t�rt|jd�}dtt	dj
||f��j��}|S)zReturns a Basic Auth string.z�Non-string usernames will no longer be supported in Requests 3.0.0. Please convert the object you've passed in ({0!r}) to a string or bytes object in the near future to avoid problems.)�categoryz�Non-string passwords will no longer be supported in Requests 3.0.0. Please convert the object you've passed in ({0!r}) to a string or bytes object in the near future to avoid problems.�latin1zBasic �:)�
isinstancer�warnings�warn�format�DeprecationWarningr�encoderr�join�strip)�username�passwordZauthstr�r�/usr/lib/python3.6/auth.py�_basic_auth_strs&






rc@seZdZdZdd�ZdS)�AuthBasez4Base class that all auth implementations derive fromcCstd��dS)NzAuth hooks must be callable.)�NotImplementedError)�self�rrrr�__call__KszAuthBase.__call__N)�__name__�
__module__�__qualname__�__doc__rrrrrrHsrc@s0eZdZdZdd�Zdd�Zdd�Zdd	�Zd
S)�
HTTPBasicAuthz?Attaches HTTP Basic Authentication to the given Request object.cCs||_||_dS)N)rr)rrrrrr�__init__RszHTTPBasicAuth.__init__cCs(t|jt|dd�k|jt|dd�kg�S)Nrr)�allr�getattrr)r�otherrrr�__eq__VszHTTPBasicAuth.__eq__cCs
||kS)Nr)rr'rrr�__ne__\szHTTPBasicAuth.__ne__cCst|j|j�|jd<|S)N�
Authorization)rrr�headers)rrrrrr_szHTTPBasicAuth.__call__N)rr r!r"r$r(r)rrrrrr#Os
r#c@seZdZdZdd�ZdS)�
HTTPProxyAuthz=Attaches HTTP Proxy Authentication to a given Request object.cCst|j|j�|jd<|S)NzProxy-Authorization)rrrr+)rrrrrrgszHTTPProxyAuth.__call__N)rr r!r"rrrrrr,dsr,c@sPeZdZdZdd�Zdd�Zdd�Zdd	�Zd
d�Zdd
�Z	dd�Z
dd�ZdS)�HTTPDigestAuthz@Attaches HTTP Digest Authentication to the given Request object.cCs||_||_tj�|_dS)N)rr�	threadingZlocal�
_thread_local)rrrrrrr$oszHTTPDigestAuth.__init__cCs@t|jd�s<d|j_d|j_d|j_i|j_d|j_d|j_dS)N�initT�r)�hasattrr/r0�
last_nonce�nonce_count�chal�pos�
num_401_calls)rrrr�init_per_thread_stateusz$HTTPDigestAuth.init_per_thread_statecsj|jjd}|jjd}|jjjd�}|jjjd�}|jjjd�}d�|dkrTd}n|j�}|dksl|dkrzd	d
�}	|	�n|dkr�dd
�}
|
��fdd�}�dkr�dSd}t|�}
|
jp�d}|
jr�|d|
j7}d|j||jf}d||f}�|�}�|�}||jj	k�r|jj
d7_
nd|j_
d|jj
}t|jj
�jd�}||jd�7}|t
j�jd�7}|tjd�7}tj|�j�dd�}|dk�r��d|||f�}|�s�||d||f�}n<|dk�s�d|jd�k�r�d|||d|f}|||�}ndS||j_	d|j||||f}|�r(|d|7}|�r:|d|7}|�rL|d|7}|�rb|d ||f7}d!|S)"z
        :rtype: str
        �realm�nonce�qop�	algorithm�opaqueNZMD5zMD5-SESScSs"t|t�r|jd�}tj|�j�S)Nzutf-8)r
rr�hashlibZmd5�	hexdigest)�xrrr�md5_utf8�s

z4HTTPDigestAuth.build_digest_header.<locals>.md5_utf8ZSHAcSs"t|t�r|jd�}tj|�j�S)Nzutf-8)r
rrr>�sha1r?)r@rrr�sha_utf8�s

z4HTTPDigestAuth.build_digest_header.<locals>.sha_utf8cs�d||f�S)Nz%s:%sr)�s�d)�	hash_utf8rr�<lambda>�sz4HTTPDigestAuth.build_digest_header.<locals>.<lambda>�/�?z%s:%s:%sz%s:%srz%08xzutf-8��Zauth�,z%s:%s:%s:%s:%sz>username="%s", realm="%s", nonce="%s", uri="%s", response="%s"z
, opaque="%s"z, algorithm="%s"z
, digest="%s"z , qop="auth", nc=%s, cnonce="%s"z	Digest %s)r/r5�get�upperr�pathZqueryrrr3r4rr�timeZctime�os�urandomr>rBr?�split)r�method�urlr9r:r;r<r=Z
_algorithmrArCZKDZentdigZp_parsedrOZA1ZA2ZHA1ZHA2ZncvaluerDZcnonceZrespdigZnoncebit�baser)rFr�build_digest_headersr

z"HTTPDigestAuth.build_digest_headercKs|jrd|j_dS)z)Reset num_401_calls counter on redirects.rN)Zis_redirectr/r7)rr�kwargsrrr�handle_redirect�szHTTPDigestAuth.handle_redirectcKs"d|jkodkns&d|j_|S|jjdk	rD|jjj|jj�|jjdd�}d|j	�koh|jjdk�r|jjd7_t
jd	t
jd
�}t
|jd|dd��|j_|j|j�|jj�}t|j|j|j�|j|j�|j|j|j�|jd<|jj|f|�}|jj|�||_|Sd|j_|S)
zo
        Takes the given response and tries digest-auth, if needed.

        :rtype: requests.Response
        i�i�rNzwww-authenticater1Zdigest�zdigest )�flags)�countr*)Zstatus_coder/r7r6Zrequest�body�seekr+rM�lower�re�compile�
IGNORECASEr	�subr5Zcontent�close�copyrZ_cookies�rawZprepare_cookiesrWrTrUZ
connection�send�history�append)rrrXZs_authZpatZprepZ_rrrr�
handle_401�s.	
zHTTPDigestAuth.handle_401cCs~|j�|jjr&|j|j|j�|jd<y|jj�|j_	Wnt
k
rTd|j_	YnX|jd|j�|jd|j
�d|j_|S)Nr*Zresponser)r8r/r3rWrTrUr+r]�tellr6�AttributeErrorZ
register_hookrjrYr7)rrrrrr
szHTTPDigestAuth.__call__cCs(t|jt|dd�k|jt|dd�kg�S)Nrr)r%rr&r)rr'rrrr(szHTTPDigestAuth.__eq__cCs
||kS)Nr)rr'rrrr)$szHTTPDigestAuth.__ne__N)rr r!r"r$r8rWrYrjrr(r)rrrrr-ls
Z,r-)r"rQr`rPr>r.r�base64r�compatrrrZcookiesrZ_internal_utilsrZutilsr	ZCONTENT_TYPE_FORM_URLENCODEDZCONTENT_TYPE_MULTI_PARTr�objectrr#r,r-rrrr�<module>s$,site-packages/pip/_vendor/requests/__pycache__/compat.cpython-36.pyc000064400000002653147511334620021520 0ustar003

���eZ�@s�dZddlmZddlZejZeddkZeddkZddlZer�ddl	m
Z
mZmZm
Z
mZmZmZmZmZddlmZmZmZmZmZddlmZddlZdd	lmZdd
lmZddlmZe Z!e Z"e#Z e$Z$e%e&e'fZ(e%e&fZ)n�e�r�ddl*mZmZmZmZmZm
Z
mZmZm
Z
mZdd
l+mZmZmZmZmZddl,m-Zdd	l.mZdd
l/mZddl0mZe Z!e Z e"Z"e e"fZ$e%e'fZ(e%fZ)dS)zq
requests.compat
~~~~~~~~~~~~~~~

This module handles import compatibility issues between Python 2 and
Python 3.
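Downstream code imports the shims from here instead of branching on the
interpreter version itself; a small sketch using names visible in this
module's import table::

   >>> from requests.compat import urlparse, urljoin, is_py2
   >>> urljoin('http://example.org/a/', 'b')
   'http://example.org/a/b'
   >>> urlparse('http://example.org/a?q=1').query
   'q=1'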
�)�chardetN��)	�quote�unquote�
quote_plus�unquote_plus�	urlencode�
getproxies�proxy_bypass�proxy_bypass_environment�getproxies_environment)�urlparse�
urlunparse�urljoin�urlsplit�	urldefrag)�parse_http_list)�Morsel)�StringIO)�OrderedDict)
rrrrr	rrrrr)rr
rrr
)�	cookiejar)1�__doc__Zpip._vendorr�sys�version_infoZ_verZis_py2Zis_py3ZjsonZurllibrrrrr	r
rrr
rrrrrZurllib2rZ	cookielibZCookierrZ)pip._vendor.urllib3.packages.ordered_dictr�strZbuiltin_str�bytesZunicodeZ
basestring�intZlong�floatZ
numeric_typesZ
integer_typesZurllib.parseZurllib.requestZhttprZhttp.cookies�io�collections�r!r!�/usr/lib/python3.6/compat.py�<module>	sB,

0site-packages/pip/_vendor/requests/__pycache__/__init__.cpython-36.opt-1.pyc000064400000005665147511334620022741 0ustar003

���e�
�@s�dZddlmZddlmZddlZddlmZdd�Zyeejej�Wn0e	e
fk
rzejd	jejej�e�YnXdd
l
mZejde�ddlmZmZmZmZdd
lmZmZmZmZddlmZmZddlmZddlmZddlmZmZmZddl m!Z!m"Z"m#Z#m$Z$m%Z%m&Z&m'Z'm(Z(ddl)m*Z*m+Z+ddl,m-Z-ddlm.Z.m/Z/m0Z0m1Z1m2Z2m3Z3m4Z4m5Z5m6Z6ddl7Z7yddl7m8Z8Wn(e9k
�r�Gdd�de7j:�Z8YnXe7j;e<�j=e8��ejde4dd�dS)a�
Requests HTTP Library
~~~~~~~~~~~~~~~~~~~~~

Requests is an HTTP library, written in Python, for human beings. Basic GET
usage:

   >>> import requests
   >>> r = requests.get('https://www.python.org')
   >>> r.status_code
   200
   >>> 'Python is a programming language' in r.content
   True

... or POST:

   >>> payload = dict(key1='value1', key2='value2')
   >>> r = requests.post('http://httpbin.org/post', data=payload)
   >>> print(r.text)
   {
     ...
     "form": {
       "key2": "value2",
       "key1": "value1"
     },
     ...
   }

The other HTTP methods are supported - see `requests.api`. Full documentation
is at <http://python-requests.org>.

:copyright: (c) 2017 by Kenneth Reitz.
:license: Apache 2.0, see LICENSE for more details.
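The import table recovered further down shows the package also re-exporting its
exception types (``RequestException``, ``Timeout``, ``ConnectionError``, ...); a
hedged sketch of catching them at the top level::

   >>> import requests
   >>> try:
   ...     r = requests.get('https://httpbin.org/delay/10', timeout=1)
   ... except requests.Timeout:
   ...     print('request timed out')
   ... except requests.ConnectionError:
   ...     print('network problem')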
�)�urllib3)�chardetN�)�RequestsDependencyWarningcCs~|jd�}t|�dkr |jd�|\}}}t|�t|�t|�}}}|jd�dd�\}}}t|�t|�t|�}}}dS)N�.��0�)�split�len�append�int)Zurllib3_versionZchardet_version�major�minor�patch�r�/usr/lib/python3.6/__init__.py�check_compatibility1s


rzAurllib3 ({0}) or chardet ({1}) doesn't match a supported version!)�DependencyWarning�ignore)�	__title__�__description__�__url__�__version__)�	__build__�
__author__�__author_email__�__license__)�
__copyright__�__cake__)�utils)�packages)�Request�Response�PreparedRequest)�request�get�head�postr�put�delete�options)�session�Session)�codes)	�RequestException�Timeout�URLRequired�TooManyRedirects�	HTTPError�ConnectionError�FileModeWarning�ConnectTimeout�ReadTimeout)�NullHandlerc@seZdZdd�ZdS)r8cCsdS)Nr)�self�recordrrr�emitsszNullHandler.emitN)�__name__�
__module__�__qualname__r;rrrrr8rsr8�defaultT)r)>�__doc__Zpip._vendorrr�warnings�
exceptionsrrr�AssertionError�
ValueError�warn�formatZpip._vendor.urllib3.exceptionsr�simplefilterrrrrrrrrr�r r!Zmodelsr"r#r$Zapir%r&r'r(rr)r*r+Zsessionsr,r-Zstatus_codesr.r/r0r1r2r3r4r5r6r7Zloggingr8�ImportErrorZHandlerZ	getLoggerr<Z
addHandlerrrrr�<module>)s<

(,site-packages/pip/_vendor/requests/__pycache__/models.cpython-36.pyc000064400000056505147511334620021525 0ustar003

���e��@s�dZddlZddlZddlZddlZddlmZddlm	Z	ddl
mZddlm
Z
mZmZmZddlmZdd	lmZdd
lmZddlmZddlmZmZmZdd
lmZmZm Z m!Z!m"Z"m#Z#m$Z$ddl%m&Z&m'Z'ddl(m)Z)m*Z*m+Z+m,Z,m-Z-m.Z.m/Z/m0Z0m1Z1m2Z2ddl3m4Z4m5Z5m6Z6m7Z7m8Z8m9Z9m:Z:m;Z;m<Z<m=Z=ddl3m>Z?ddl@mAZAeAjBeAjCeAjDeAjEeAjFfZGdZHd!ZIdZJGdd�deK�ZLGdd�deK�ZMGdd�deM�ZNGdd�deLeM�ZOGdd �d eK�ZPdS)"z`
requests.models
~~~~~~~~~~~~~~~

This module contains the primary objects that power Requests.
�N)�RequestField)�encode_multipart_formdata)�	parse_url)�DecodeError�ReadTimeoutError�
ProtocolError�LocationParseError)�UnsupportedOperation�)�
default_hooks)�CaseInsensitiveDict)�
HTTPBasicAuth)�cookiejar_from_dict�get_cookie_header�_copy_cookie_jar)�	HTTPError�
MissingSchema�
InvalidURL�ChunkedEncodingError�ContentDecodingError�ConnectionError�StreamConsumedError)�to_native_string�unicode_is_ascii)
�guess_filename�get_auth_from_url�requote_uri�stream_decode_response_unicode�to_key_val_list�parse_header_links�iter_slices�guess_json_utf�	super_len�check_header_validity)
�	cookielib�
urlunparse�urlsplit�	urlencode�str�bytes�is_py2�chardet�builtin_str�
basestring)�json)�codes��
iic@s0eZdZedd��Zedd��Zedd��ZdS)�RequestEncodingMixincCsNg}t|j�}|j}|sd}|j|�|j}|rD|jd�|j|�dj|�S)zBuild the path URL to use.�/�?�)r&�url�path�append�query�join)�selfr6�pr7r9�r=�/usr/lib/python3.6/models.py�path_url=s



zRequestEncodingMixin.path_urlcCs�t|ttf�r|St|d�r |St|d�r�g}x|t|�D]p\}}t|t�sVt|d�r\|g}xJ|D]B}|dk	rb|jt|t�r�|jd�n|t|t�r�|jd�n|f�qbWq8Wt|dd�S|SdS)z�Encode parameters in a piece of data.

        Will successfully encode parameters when passed as a dict or a list of
        2-tuples. Order is retained if data is a list of 2-tuples but arbitrary
        if parameters are supplied as a dict.
        �read�__iter__Nzutf-8T)Zdoseq)	�
isinstancer(r)�hasattrrr-r8�encoder')�data�result�kZvs�vr=r=r>�_encode_paramsRs 	


$z#RequestEncodingMixin._encode_paramscCs�|std��nt|t�r td��g}t|p,i�}t|p8i�}x�|D]�\}}t|t�s`t|d�rf|g}x\|D]T}|dk	rlt|t�s�t|�}|jt|t�r�|jd�n|t|t�r�|j	d�n|f�qlWqBWx�|D]�\}}d}d}	t|t
tf��r.t|�dk�r|\}
}n&t|�dk�r |\}
}}n|\}
}}}	nt
|��p:|}
|}t|tttf��rX|}n|j�}t|||
|	d�}
|
j|d	�|j|
�q�Wt|�\}}||fS)
a�Build the body for a multipart/form-data request.

        Will successfully encode files when passed as a dict or a list of
        tuples. Order is retained if data is a list of tuples but arbitrary
        if parameters are supplied as a dict.
        The tuples may be 2-tuples (filename, fileobj), 3-tuples (filename, fileobj, contentype)
        or 4-tuples (filename, fileobj, contentype, custom_headers).
        zFiles must be provided.zData must not be a string.rANzutf-8��)�namerE�filename�headers)�content_type)�
ValueErrorrBr-rrCr)r(r8�decoderD�tuple�list�lenr�	bytearrayr@rZmake_multipartr)�filesrEZ
new_fieldsZfieldsZfield�valrHrGZftZfh�fn�fpZfdataZrf�bodyrOr=r=r>�
_encode_filesmsH




$
z"RequestEncodingMixin._encode_filesN)�__name__�
__module__�__qualname__�propertyr?�staticmethodrIr[r=r=r=r>r2<sr2c@seZdZdd�Zdd�ZdS)�RequestHooksMixincCs\||jkrtd|��t|tj�r4|j|j|�n$t|d�rX|j|jdd�|D��dS)zProperly register a hook.z1Unsupported event specified, with event name "%s"rAcss|]}t|tj�r|VqdS)N)rB�collections�Callable)�.0�hr=r=r>�	<genexpr>�sz2RequestHooksMixin.register_hook.<locals>.<genexpr>N)�hooksrPrBrbrcr8rC�extend)r;�event�hookr=r=r>�
register_hook�s

zRequestHooksMixin.register_hookcCs.y|j|j|�dStk
r(dSXdS)ziDeregister a previously registered hook.
        Returns True if the hook existed, False if not.
        TFN)rg�removerP)r;rirjr=r=r>�deregister_hook�s
z!RequestHooksMixin.deregister_hookN)r\r]r^rkrmr=r=r=r>ra�srac
@s*eZdZdZd	dd�Zdd�Zdd�ZdS)
�Requesta�A user-created :class:`Request <Request>` object.

    Used to prepare a :class:`PreparedRequest <PreparedRequest>`, which is sent to the server.

    :param method: HTTP method to use.
    :param url: URL to send.
    :param headers: dictionary of headers to send.
    :param files: dictionary of {filename: fileobject} files to multipart upload.
    :param data: the body to attach to the request. If a dictionary is provided, form-encoding will take place.
    :param json: json for the body to attach to the request (if files or data is not specified).
    :param params: dictionary of URL parameters to append to the URL.
    :param auth: Auth handler or (user, pass) tuple.
    :param cookies: dictionary or CookieJar of cookies to attach to this request.
    :param hooks: dictionary of callback hooks, for internal usage.

    Usage::

      >>> import requests
      >>> req = requests.Request('GET', 'http://httpbin.org/get')
      >>> req.prepare()
      <PreparedRequest [GET]>
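      The prepare step is also what allows the request to be inspected or
      tweaked before it goes on the wire; a hedged sketch::

      >>> req = requests.Request('POST', 'http://httpbin.org/post',
      ...                        data={'k': 'v'}, headers={'X-Demo': '1'})
      >>> prepped = req.prepare()          # body and headers are finalised here
      >>> prepped.headers['X-Demo']
      '1'
      >>> with requests.Session() as s:
      ...     resp = s.send(prepped, timeout=5)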
    Nc
Cs�|dkrgn|}|dkrgn|}|dkr,in|}|dkr<in|}|	dkrLin|	}	t�|_x&t|	j��D]\}}|j||d�qfW||_||_||_||_||_	|
|_
||_||_||_
dS)N)rirj)rrgrS�itemsrk�methodr6rNrVrEr.�params�auth�cookies)
r;rpr6rNrVrErqrrrsrgr.rGrHr=r=r>�__init__�s"zRequest.__init__cCs
d|jS)Nz<Request [%s]>)rp)r;r=r=r>�__repr__�szRequest.__repr__cCs<t�}|j|j|j|j|j|j|j|j|j	|j
|jd�
|S)zXConstructs a :class:`PreparedRequest <PreparedRequest>` for transmission and returns it.)
rpr6rNrVrEr.rqrrrsrg)�PreparedRequest�preparerpr6rNrVrEr.rqrrrsrg)r;r<r=r=r>rw�s
zRequest.prepare)
NNNNNNNNNN)r\r]r^�__doc__rtrurwr=r=r=r>rn�s

rnc
@s�eZdZdZdd�Zddd�Zdd�Zd	d
�Zdd�Ze	d
d��Z
dd�Zdd�Zddd�Z
dd�Zd dd�Zdd�Zdd�ZdS)!rva�The fully mutable :class:`PreparedRequest <PreparedRequest>` object,
    containing the exact bytes that will be sent to the server.

    Generated from either a :class:`Request <Request>` object or manually.

    Usage::

      >>> import requests
      >>> req = requests.Request('GET', 'http://httpbin.org/get')
      >>> r = req.prepare()
      <PreparedRequest [GET]>

      >>> s = requests.Session()
      >>> s.send(r)
      <Response [200]>
    cCs0d|_d|_d|_d|_d|_t�|_d|_dS)N)rpr6rN�_cookiesrZrrg�_body_position)r;r=r=r>rtszPreparedRequest.__init__NcCsR|j|�|j||�|j|�|j|�|j|||
�|j||�|j|	�dS)z6Prepares the entire request with the given parameters.N)�prepare_method�prepare_url�prepare_headers�prepare_cookies�prepare_body�prepare_auth�
prepare_hooks)r;rpr6rNrVrErqrrrsrgr.r=r=r>rw+s


zPreparedRequest.preparecCs
d|jS)Nz<PreparedRequest [%s]>)rp)r;r=r=r>ru=szPreparedRequest.__repr__cCsXt�}|j|_|j|_|jdk	r*|jj�nd|_t|j�|_|j|_|j|_|j	|_	|S)N)
rvrpr6rN�copyrryrZrgrz)r;r<r=r=r>r�@szPreparedRequest.copycCs$||_|jdk	r t|jj��|_dS)zPrepares the given HTTP method.N)rpr�upper)r;rpr=r=r>r{Ks
zPreparedRequest.prepare_methodcCs@ddl}y|j|dd�jd�}Wn|jk
r:t�YnX|S)NrT)Zuts46zutf-8)�idnarDrQZ	IDNAError�UnicodeError)�hostr�r=r=r>�_get_idna_encoded_hostQs
z&PreparedRequest._get_idna_encoded_hostcCs0t|t�r|jd�}ntr"t|�nt|�}|j�}d|krT|j�jd�rT||_	dSyt
|�\}}}}}}}	Wn,tk
r�}
zt|
j
��WYdd}
~
XnX|s�d}|jt|d��}t|��|s�td|��t|��sy|j|�}Wntk
�rtd��YnXn|jd��rtd��|�p"d	}|�r2|d
7}||7}|�rP|dt|�7}|�sZd}t�r�t|t��rv|jd�}t|t��r�|jd�}t|t��r�|jd�}t|t��r�|jd�}t|	t��r�|	jd�}	t|ttf��r�t|�}|j|�}
|
�r|�r
d
||
f}n|
}tt|||d||	g��}||_	dS)zPrepares the given HTTP URL.�utf8�:ZhttpNzDInvalid URL {0!r}: No schema supplied. Perhaps you meant http://{0}?z Invalid URL %r: No host suppliedzURL has an invalid label.�*r5�@r3zutf-8z%s&%s)rBr)rQr*Zunicoder(�lstrip�lower�
startswithr6rrr�args�formatrrrr�r�rDrIrr%)r;r6rq�schemerrr�Zportr7r9Zfragment�e�errorZnetlocZ
enc_paramsr=r=r>r|[sh








zPreparedRequest.prepare_urlcCs@t�|_|r<x.|j�D]"}t|�|\}}||jt|�<qWdS)z Prepares the given HTTP headers.N)rrNror#r)r;rN�headerrL�valuer=r=r>r}�szPreparedRequest.prepare_headerscCsvd}d}|r8|dk	r8d}tj|�}t|t�s8|jd�}tt|d�t|ttt	t
jf�g�}yt|�}Wnt
ttfk
r�d}YnX|r�|}t|dd�dk	r�y|j�|_Wn ttfk
r�t�|_YnX|r�td��|r�t|�|jd<n
d|jd	<np|�r|j||�\}}n2|�rF|j|�}t|t��s<t|d
��rBd}nd}|j|�|�rld|jk�rl||jd
<||_dS)z"Prepares the given HTTP body data.Nzapplication/jsonzutf-8rA�tellz1Streamed bodies and files are mutually exclusive.zContent-LengthZchunkedzTransfer-Encodingr@z!application/x-www-form-urlencodedzcontent-typezContent-Type)�complexjson�dumpsrBr)rD�allrCr-rSrRrb�Mappingr"�	TypeError�AttributeErrorr	�getattrr�rz�IOError�OSError�object�NotImplementedErrorr,rNr[rI�prepare_content_lengthrZ)r;rErVr.rZrOZ	is_stream�lengthr=r=r>r�sJ






zPreparedRequest.prepare_bodycCsL|dk	r$t|�}|rHt|�|jd<n$|jdkrH|jjd�dkrHd|jd<dS)z>Prepare Content-Length header based on request method and bodyNzContent-Length�GET�HEAD�0)r�r�)r"r,rNrp�get)r;rZr�r=r=r>r�sz&PreparedRequest.prepare_content_lengthr5cCsj|dkr"t|j�}t|�r|nd}|rft|t�rDt|�dkrDt|�}||�}|jj|j�|j	|j
�dS)z"Prepares the given HTTP auth data.NrJ)rr6�anyrBrRrTr
�__dict__�updater�rZ)r;rrr6Zurl_auth�rr=r=r>r�s
zPreparedRequest.prepare_authcCs@t|tj�r||_n
t|�|_t|j|�}|dk	r<||jd<dS)aPrepares the given HTTP cookie data.

        This function eventually generates a ``Cookie`` header from the
        given cookies using cookielib. Due to cookielib's design, the header
        will not be regenerated if it already exists, meaning this function
        can only be called once for the life of the
        :class:`PreparedRequest <PreparedRequest>` object. Any subsequent calls
        to ``prepare_cookies`` will have no actual effect, unless the "Cookie"
        header is removed beforehand.
        NZCookie)rBr$Z	CookieJarryrrrN)r;rsZ
cookie_headerr=r=r>r~$s
zPreparedRequest.prepare_cookiescCs*|pg}x|D]}|j|||�qWdS)zPrepares the given hooks.N)rk)r;rgrir=r=r>r�8s
zPreparedRequest.prepare_hooks)
NNNNNNNNNN)N)r5)r\r]r^rxrtrwrur�r{r`r�r|r}rr�r�r~r�r=r=r=r>rvs

V
E
rvc
@seZdZdZdddddddd	d
dg
Zdd
�Zdd�Zdd�Zdd�Zdd�Z	dd�Z
dd�Zdd�Zdd�Z
edd��Zed d!��Zed"d#��Zed$d%��Zed&d'��Zd;d*d+�Zed,d,fd-d.�Zed/d0��Zed1d2��Zd3d4�Zed5d6��Zd7d8�Zd9d:�Zd,S)<�ResponsezhThe :class:`Response <Response>` object, which contains a
    server's response to an HTTP request.
    �_content�status_coderNr6�history�encoding�reasonrs�elapsed�requestcCs^d|_d|_d|_d|_t�|_d|_d|_d|_g|_	d|_
ti�|_t
jd�|_d|_dS)NFr)r��_content_consumed�_nextr�rrN�rawr6r�r�r�rrs�datetimeZ	timedeltar�r�)r;r=r=r>rtLs
zResponse.__init__cCs|S)Nr=)r;r=r=r>�	__enter__{szResponse.__enter__cGs|j�dS)N)�close)r;r�r=r=r>�__exit__~szResponse.__exit__cs$�js�jt�fdd��jD��S)Nc3s|]}|t�|d�fVqdS)N)r�)rd�attr)r;r=r>rf�sz(Response.__getstate__.<locals>.<genexpr>)r��content�dict�	__attrs__)r;r=)r;r>�__getstate__�s

zResponse.__getstate__cCs>x |j�D]\}}t|||�q
Wt|dd�t|dd�dS)Nr�Tr�)ro�setattr)r;�staterLr�r=r=r>�__setstate__�szResponse.__setstate__cCs
d|jS)Nz<Response [%s]>)r�)r;r=r=r>ru�szResponse.__repr__cCs|jS)akReturns True if :attr:`status_code` is less than 400.

        This attribute checks if the status code of the response is between
        400 and 600 to see if there was a client error or a server error. If
        the status code, is between 200 and 400, this will return True. This
        is **not** a check to see if the response code is ``200 OK``.
        )�ok)r;r=r=r>�__bool__�szResponse.__bool__cCs|jS)akReturns True if :attr:`status_code` is less than 400.

        This attribute checks if the status code of the response is between
        400 and 600 to see if there was a client error or a server error. If
        the status code, is between 200 and 400, this will return True. This
        is **not** a check to see if the response code is ``200 OK``.
        )r�)r;r=r=r>�__nonzero__�szResponse.__nonzero__cCs
|jd�S)z,Allows you to use a response as an iterator.�)�iter_content)r;r=r=r>rA�szResponse.__iter__cCs&y|j�Wntk
r dSXdS)akReturns True if :attr:`status_code` is less than 400.

        This attribute checks if the status code of the response is between
        400 and 600 to see if there was a client error or a server error. If
        the status code, is between 200 and 400, this will return True. This
        is **not** a check to see if the response code is ``200 OK``.
        FT)�raise_for_statusr)r;r=r=r>r��s
	zResponse.okcCsd|jko|jtkS)z�True if this Response is a well-formed HTTP redirect that could have
        been processed automatically (by :meth:`Session.resolve_redirects`).
        �location)rNr��REDIRECT_STATI)r;r=r=r>�is_redirect�szResponse.is_redirectcCsd|jko|jtjtjfkS)z@True if this Response one of the permanent versions of redirect.r�)rNr�r/Zmoved_permanently�permanent_redirect)r;r=r=r>�is_permanent_redirect�szResponse.is_permanent_redirectcCs|jS)zTReturns a PreparedRequest for the next request in a redirect chain, if there is one.)r�)r;r=r=r>�next�sz
Response.nextcCstj|j�dS)z7The apparent encoding, provided by the chardet library.r�)r+Zdetectr�)r;r=r=r>�apparent_encoding�szResponse.apparent_encodingr
Fcs~��fdd�}�jr(t�jt�r(t��n$�dk	rLt�t�rLtdt����t�j��}|�}�jrh|n|}|rzt	|��}|S)a�Iterates over the response data.  When stream=True is set on the
        request, this avoids reading the content at once into memory for
        large responses.  The chunk size is the number of bytes it should
        read into memory.  This is not necessarily the length of each item
        returned as decoding can take place.

        chunk_size must be of type int or None. A value of None will
        function differently depending on the value of `stream`.
        stream=True will read data as it arrives in whatever size the
        chunks are received. If stream=False, data is returned as
        a single chunk.

        If decode_unicode is True, content will be decoded using the best
        available encoding based on the response.
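        A sketch of streaming a large download as described above (URL, file
        name and chunk size are illustrative)::

          >>> r = requests.get('https://example.org/big.bin', stream=True)
          >>> with open('big.bin', 'wb') as fh:
          ...     for chunk in r.iter_content(chunk_size=8192):
          ...         if chunk:            # skip keep-alive chunks
          ...             fh.write(chunk)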
        c3s�t�jd�r�y$x�jj�dd�D]
}|Vq WWq�tk
rZ}zt|��WYdd}~Xq�tk
r�}zt|��WYdd}~Xq�tk
r�}zt|��WYdd}~Xq�Xnx�jj	��}|s�P|Vq�Wd�_
dS)N�streamT)Zdecode_content)rCr�r�rrrrrrr@r�)�chunkr�)�
chunk_sizer;r=r>�generate�s 
z'Response.iter_content.<locals>.generateNz.chunk_size must be an int, it is instead a %s.)
r�rBr��boolr�intr��typer r)r;r��decode_unicoder�Z
reused_chunksZ
stream_chunksZchunksr=)r�r;r>r��s
zResponse.iter_contentNccs�d}x�|j||d�D]r}|dk	r(||}|r8|j|�}n|j�}|rn|drn|rn|dd|dkrn|j�}nd}x|D]
}|VqxWqW|dk	r�|VdS)z�Iterates over the response data, one line at a time.  When
        stream=True is set on the request, this avoids reading the
        content at once into memory for large responses.

        .. note:: This method is not reentrant safe.
        N)r�r�r
���r�r�r�)r��split�
splitlines�pop)r;r�r�Z	delimiter�pendingr��lines�liner=r=r>�
iter_liness$

zResponse.iter_linescCsZ|jdkrN|jrtd��|jdks,|jdkr4d|_nt�j|jt��pJt�|_d|_|jS)z"Content of the response, in bytes.Fz2The content for this response was already consumedrNT)	r�r��RuntimeErrorr�r�r)r:r��CONTENT_CHUNK_SIZE)r;r=r=r>r�*s
zResponse.contentcCshd}|j}|jstd�S|jdkr(|j}yt|j|dd�}Wn&ttfk
rbt|jdd�}YnX|S)a�Content of the response, in unicode.

        If Response.encoding is None, encoding will be guessed using
        ``chardet``.

        The encoding of the response content is determined based solely on HTTP
        headers, following RFC 2616 to the letter. If you can take advantage of
        non-HTTP knowledge to make a better guess at the encoding, you should
        set ``r.encoding`` appropriately before accessing this property.
        Nr5�replace)�errors)r�r�r(r��LookupErrorr�)r;r�r�r=r=r>�text>s
z
Response.textcKsj|jrZ|jrZt|j�dkrZt|j�}|dk	rZytj|jj|�f|�Stk
rXYnXtj|jf|�S)z�Returns the json-encoded content of a response, if any.

        :param \*\*kwargs: Optional arguments that ``json.loads`` takes.
        :raises ValueError: If the response body does not contain valid json.
        rKN)	r�r�rTr!r��loadsrQ�UnicodeDecodeErrorr�)r;�kwargsr�r=r=r>r.ds
z
Response.jsoncCsJ|jjd�}i}|rFt|�}x(|D] }|jd�p8|jd�}|||<q"W|S)z8Returns the parsed header links of the response, if any.�linkZrelr6)rNr�r)r;r��l�linksr��keyr=r=r>r�~s
zResponse.linkscCs�d}t|jt�rDy|jjd�}WqJtk
r@|jjd�}YqJXn|j}d|jko^dknrxd|j||jf}n,d|jko�dknr�d|j||jf}|r�t||d	��d
S)z2Raises stored :class:`HTTPError`, if one occurred.r5zutf-8z
iso-8859-1i�i�z%s Client Error: %s for url: %siXz%s Server Error: %s for url: %s)ZresponseN)rBr�r)rQr�r�r6r)r;Zhttp_error_msgr�r=r=r>r��szResponse.raise_for_statuscCs0|js|jj�t|jdd�}|dk	r,|�dS)z�Releases the connection back to the pool. Once this method has been
        called the underlying ``raw`` object must not be accessed again.

        *Note: Should not normally need to be called explicitly.*
        �release_connN)r�r�r�r�)r;r�r=r=r>r��s

zResponse.close)r
F)r\r]r^rxr�rtr�r�r�r�rur�r�rAr_r�r�r�r�r�r��ITER_CHUNK_SIZEr�r�r�r.r�r�r�r=r=r=r>r�Bs2
/


7&r�i()Qrxrbr��sysZencodings.idnaZ	encodingsZpip._vendor.urllib3.fieldsrZpip._vendor.urllib3.filepostrZpip._vendor.urllib3.utilrZpip._vendor.urllib3.exceptionsrrrr�ior	rgrZ
structuresrrrr
rsrrr�
exceptionsrrrrrrrZ_internal_utilsrrZutilsrrrrrrr r!r"r#�compatr$r%r&r'r(r)r*r+r,r-r.r�Zstatus_codesr/Zmoved�found�otherZtemporary_redirectr�r�ZDEFAULT_REDIRECT_LIMITr�r�r�r2rarnrvr�r=r=r=r>�<module>sD$00nF<site-packages/pip/_vendor/requests/__pycache__/status_codes.cpython-36.opt-1.pyc000064400000007012147511334620023666 0ustar003

���e��F@s�ddlmZd�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�dz�DZed{d|�ZxNej�D]B\ZZx6eD].Zeeee�ej	dă�s�eeej
�e��q�W�q�WdS)��)�
LookupDict�continue�switching_protocols�
processing�
checkpoint�uri_too_long�request_uri_too_long�ok�okay�all_ok�all_okay�all_good�\o/�✓�created�accepted�non_authoritative_info�non_authoritative_information�
no_content�
reset_content�reset�partial_content�partial�multi_status�multiple_status�multi_stati�multiple_stati�already_reported�im_used�multiple_choices�moved_permanently�moved�\o-�found�	see_other�other�not_modified�	use_proxy�switch_proxy�temporary_redirect�temporary_moved�	temporary�permanent_redirect�resume_incomplete�resume�bad_request�bad�unauthorized�payment_required�payment�	forbidden�	not_found�-o-�method_not_allowed�not_allowed�not_acceptable�proxy_authentication_required�
proxy_auth�proxy_authentication�request_timeout�timeout�conflict�gone�length_required�precondition_failed�precondition�request_entity_too_large�request_uri_too_large�unsupported_media_type�unsupported_media�
media_type�requested_range_not_satisfiable�requested_range�range_not_satisfiable�expectation_failed�im_a_teapot�teapot�
i_am_a_teapot�misdirected_request�unprocessable_entity�
unprocessable�locked�failed_dependency�
dependency�unordered_collection�	unordered�upgrade_required�upgrade�precondition_required�too_many_requests�too_many�header_fields_too_large�fields_too_large�no_response�none�
retry_with�retry�$blocked_by_windows_parental_controls�parental_controls�unavailable_for_legal_reasons�
legal_reasons�client_closed_request�internal_server_error�server_error�/o\�✗�not_implemented�bad_gateway�service_unavailable�unavailable�gateway_timeout�http_version_not_supported�http_version�variant_also_negotiates�insufficient_storage�bandwidth_limit_exceeded�	bandwidth�not_extended�network_authentication_required�network_auth�network_authentication)D�d�e�f�g�z��������������������i,i-i.i/i0i1i2i3i4i�i�i�i�i�i�i�i�i�i�i�i�i�i�i�i�i�i�i�i�i�i�i�i�i�i�i�i�i�i�i�i�i�i�i�i�i�i�i�i�i�i�i�i�Zstatus_codes)�name�\�/N)r)r)r)r)rr)r	r
rrr
rr)r)r)rr)r)rr)rr)rrrr)r)r)r)r r!r")r#)r$r%)r&)r')r()r)r*r+)r,r-r.)r/r0)r1)r2r3)r4)r5r6)r7r8)r9)r:r;r<)r=r>)r?)r@)rA)rBrC)rD)rE)rFrGrH)rIrJrK)rL)rMrNrO)rP)rQrR)rS)rTrU)rVrW)rXrY)rZrC)r[r\)r]r^)r_r`)rarb)rcrd)rerf)rg)rhrirjrk)rl)rm)rnro)rp)rqrr)rs)rt)rurv)rw)rxryrz)r�r�)Z
structuresrZ_codesZcodes�items�codeZtitles�title�setattr�
startswith�upper�r�r��"/usr/lib/python3.6/status_codes.py�<module>s�

site-packages/pip/_vendor/requests/__pycache__/sessions.cpython-36.pyc000064400000044731147511334620022106 0ustar003

���ep�@s�dZddlZddlZddlZddlmZddlmZddlm	Z	ddl
mZmZm
Z
mZmZddlmZmZmZmZdd	lmZmZmZdd
lmZmZddlmZddlmZm Z dd
l!m"Z"m#Z#m$Z$m%Z%ddl&m'Z'ddl(m)Z)ddlm*Z*m+Z+m,Z,m-Z-m.Z.m/Z/m0Z0ddl1m2Z2ddlm3Z3ej4�dk�rXy
ej5Z6Wne7k
�rTej8Z6YnXnejZ6e
fdd�Z9e
fdd�Z:Gdd�de;�Z<Gdd�de<�Z=dd�Z>dS)z�
requests.session
~~~~~~~~~~~~~~~~

This module provides a Session object to manage and persist settings across
requests (cookies, auth, proxies).
�N)�Mapping)�	timedelta�)�_basic_auth_str)�	cookielib�is_py3�OrderedDict�urljoin�urlparse)�cookiejar_from_dict�extract_cookies_to_jar�RequestsCookieJar�
merge_cookies)�Request�PreparedRequest�DEFAULT_REDIRECT_LIMIT)�
default_hooks�
dispatch_hook)�to_native_string)�to_key_val_list�default_headers)�TooManyRedirects�
InvalidSchema�ChunkedEncodingError�ContentDecodingError)�CaseInsensitiveDict)�HTTPAdapter)�requote_uri�get_environ_proxies�get_netrc_auth�should_bypass_proxies�get_auth_from_url�rewind_body�
DEFAULT_PORTS)�codes)�REDIRECT_STATIZWindowscCst|dkr|S|dkr|St|t�o*t|t�s0|S|t|��}|jt|��dd�|j�D�}x|D]
}||=qbW|S)z�Determines appropriate setting for a given request, taking into account
    the explicit setting on that request, and the setting in the session. If a
    setting is a dictionary, they will be merged together using `dict_class`
    NcSsg|]\}}|dkr|�qS)N�)�.0�k�vr&r&�/usr/lib/python3.6/sessions.py�
<listcomp>Jsz!merge_setting.<locals>.<listcomp>)�
isinstancerr�update�items)Zrequest_settingZsession_setting�
dict_classZmerged_settingZ	none_keys�keyr&r&r*�
merge_setting2s



r1cCs@|dks|jd�gkr|S|dks0|jd�gkr4|St|||�S)z�Properly merges both requests and session hooks.

    This is necessary because when request_hooks == {'response': []}, the
    merge breaks Session hooks entirely.
    N�response)�getr1)Z
request_hooksZ
session_hooksr/r&r&r*�merge_hooksQs
r4c@s>eZdZdd�Zdd�Zddd	�Zd
d�Zdd
�Zdd�ZdS)�SessionRedirectMixincCs,|jr(|jd}tr|jd�}t|d�SdS)z7Receives a Response. Returns a redirect URI or ``None``�location�latin1�utf8N)Zis_redirect�headersr�encoder)�self�respr6r&r&r*�get_redirect_targetbs


z(SessionRedirectMixin.get_redirect_targetcCs�t|�}t|�}|j|jkr dS|jdkrL|jdkrL|jdkrL|jd	krLdS|j|jk}|j|jk}tj|jd�df}|r�|j|kr�|j|kr�dS|p�|S)
zFDecide whether Authorization header should be removed when redirectingTZhttp�PNZhttps�F)r>N)r?N)r
Zhostname�schemeZportr#r3)r;Zold_urlZnew_urlZ
old_parsedZ
new_parsedZchanged_portZchanged_schemeZdefault_portr&r&r*�should_strip_authxs
z&SessionRedirectMixin.should_strip_authFNTc	ks.g}
|j|�}�x|�r(|j�}|
j|�|
dd�|_y
|jWn(tttfk
rj|jj	dd�YnXt
|j�|jkr�td|j|d��|j
�|jd�r�t|j�}
dt|
j�|f}t|�}|j�}|js�t|jt|��}nt|�}t|�|_|j||�|jtjtjfk�r>d}x|D]}|jj|d��q Wd|_|j}y
|d=Wntk
�rdYnXt |j!||j�t"|j!|j#�|j$|j!�|j%||�}|j&||�|j'dk	�o�d	|k�p�d|k}|�r�t(|�|}|�r�|Vq|j)|f|||||dd
�|	��}t |j#||j�|j|�}|VqWdS)zBReceives a Response. Returns a generator of Responses or Requests.rNF)Zdecode_contentzExceeded %s redirects.)r2z//z%s:%s�Content-Length�Content-Type�Transfer-EncodingZCookie)�stream�timeout�verify�cert�proxies�allow_redirects)rBrCrD)*r=�copy�append�history�contentrr�RuntimeError�raw�read�len�
max_redirectsr�close�
startswithr
�urlrr@ZgeturlZnetlocr	r�rebuild_method�status_coder$Ztemporary_redirectZpermanent_redirectr9�popZbody�KeyErrorrZ_cookiesr�cookiesZprepare_cookies�rebuild_proxies�rebuild_authZ_body_positionr"�send)r;r<�reqrErFrGrHrI�yield_requestsZadapter_kwargsZhistrV�prepared_requestZparsed_rurlZparsedZpurged_headers�headerr9Z
rewindabler&r&r*�resolve_redirects�sr









z&SessionRedirectMixin.resolve_redirectscCsR|j}|j}d|kr*|j|jj|�r*|d=|jr8t|�nd}|dk	rN|j|�dS)z�When being redirected we may want to strip authentication from the
        request to avoid leaking credentials. This method intelligently removes
        and reapplies authentication where possible to avoid credential loss.
        Z
AuthorizationN)r9rVrA�request�	trust_envrZprepare_auth)r;rar2r9rVZnew_authr&r&r*r]�s
z!SessionRedirectMixin.rebuild_authc
Cs�|dk	r|ni}|j}|j}t|�j}|j�}|jd�}t||d�}|jr~|r~t||d�}	|	j||	jd��}
|
r~|j	||
�d|kr�|d=yt
||�\}}Wntk
r�d\}}YnX|r�|r�t||�|d<|S)a�This method re-evaluates the proxy configuration by considering the
        environment variables. If we are redirected to a URL covered by
        NO_PROXY, we strip the proxy configuration. Otherwise, we set missing
        proxy keys for this URL (in case they were stripped by a previous
        redirect).

        This method also replaces the Proxy-Authorization header where
        necessary.

        :rtype: dict
        N�no_proxy)rf�allzProxy-Authorization)NN)
r9rVr
r@rKr3r rer�
setdefaultr!rZr)
r;rarIr9rVr@Znew_proxiesrfZbypass_proxyZenviron_proxies�proxyZusernameZpasswordr&r&r*r\s*

z$SessionRedirectMixin.rebuild_proxiescCsX|j}|jtjkr|dkrd}|jtjkr6|dkr6d}|jtjkrN|dkrNd}||_dS)z�When being redirected we may want to change the method of the request
        based on certain specs or browser behavior.
        �HEAD�GET�POSTN)�methodrXr$Z	see_other�foundZmoved)r;rar2rmr&r&r*rW:sz#SessionRedirectMixin.rebuild_method)FNTNNF)	�__name__�
__module__�__qualname__r=rArcr]r\rWr&r&r&r*r5`s
k)r5c@s�eZdZdZdddddddd	d
ddd
dg
Zdd�Zdd�Zdd�Zdd�Zd7dd�Z	dd�Z
dd�Zdd �Zd8d!d"�Z
d9d#d$�Zd:d%d&�Zd'd(�Zd)d*�Zd+d,�Zd-d.�Zd/d0�Zd1d2�Zd3d4�Zd5d6�ZdS);�Sessiona~A Requests session.

    Provides cookie persistence, connection-pooling, and configuration.

    Basic Usage::

      >>> import requests
      >>> s = requests.Session()
      >>> s.get('http://httpbin.org/get')
      <Response [200]>

    Or as a context manager::

      >>> with requests.Session() as s:
      >>>     s.get('http://httpbin.org/get')
      <Response [200]>
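    A slightly fuller sketch of the persistence described above (host and
    header name are illustrative)::

      >>> s = requests.Session()
      >>> s.headers.update({'x-demo': 'true'})   # sent with every request
      >>> s.get('https://httpbin.org/cookies/set/sessioncookie/123456789')
      >>> s.get('https://httpbin.org/cookies').json()
      {'cookies': {'sessioncookie': '123456789'}}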
    r9r[�authrI�hooks�paramsrGrHZprefetch�adaptersrErerScCsrt�|_d|_i|_t�|_i|_d|_d|_d|_	t
|_d|_t
i�|_t�|_|jdt��|jdt��dS)NFTzhttps://zhttp://)rr9rsrIrrtrurErGrHrrSrerr[rrv�mountr)r;r&r&r*�__init__js
zSession.__init__cCs|S)Nr&)r;r&r&r*�	__enter__�szSession.__enter__cGs|j�dS)N)rT)r;�argsr&r&r*�__exit__�szSession.__exit__c
Cs�|jpi}t|tj�st|�}ttt�|j�|�}|j}|jrV|rV|jrVt	|j
�}t�}|j|j
j�|j
|j|j|jt|j|jtd�t|j|j�t||j�|t|j|j�d�
|S)a�Constructs a :class:`PreparedRequest <PreparedRequest>` for
        transmission and returns it. The :class:`PreparedRequest` has settings
        merged from the :class:`Request <Request>` instance and those of the
        :class:`Session`.

        :param request: :class:`Request` instance to prepare with this
            session's settings.
        :rtype: requests.PreparedRequest
        )r/)
rmrV�files�data�jsonr9rursr[rt)r[r,rZ	CookieJarrrr
rsrerrVrZpreparerm�upperr|r}r~r1r9rrur4rt)r;rdr[Zmerged_cookiesrs�pr&r&r*�prepare_request�s*



zSession.prepare_requestNTcCstt|j�||||pi||pi|||d�
}|j|�}|p8i}|j|j||
||�}|	|
d�}|j|�|j|f|�}|S)a�Constructs a :class:`Request <Request>`, prepares it and sends it.
        Returns :class:`Response <Response>` object.

        :param method: method for the new :class:`Request` object.
        :param url: URL for the new :class:`Request` object.
        :param params: (optional) Dictionary or bytes to be sent in the query
            string for the :class:`Request`.
        :param data: (optional) Dictionary, bytes, or file-like object to send
            in the body of the :class:`Request`.
        :param json: (optional) json to send in the body of the
            :class:`Request`.
        :param headers: (optional) Dictionary of HTTP Headers to send with the
            :class:`Request`.
        :param cookies: (optional) Dict or CookieJar object to send with the
            :class:`Request`.
        :param files: (optional) Dictionary of ``'filename': file-like-objects``
            for multipart encoding upload.
        :param auth: (optional) Auth tuple or callable to enable
            Basic/Digest/Custom HTTP Auth.
        :param timeout: (optional) How long to wait for the server to send
            data before giving up, as a float, or a :ref:`(connect timeout,
            read timeout) <timeouts>` tuple.
        :type timeout: float or tuple
        :param allow_redirects: (optional) Set to True by default.
        :type allow_redirects: bool
        :param proxies: (optional) Dictionary mapping protocol or protocol and
            hostname to the URL of the proxy.
        :param stream: (optional) whether to immediately download the response
            content. Defaults to ``False``.
        :param verify: (optional) Either a boolean, in which case it controls whether we verify
            the server's TLS certificate, or a string, in which case it must be a path
            to a CA bundle to use. Defaults to ``True``.
        :param cert: (optional) if String, path to ssl client cert file (.pem).
            If Tuple, ('cert', 'key') pair.
        :rtype: requests.Response
        )
rmrVr9r|r}r~rursr[rt)rFrJ)rrr��merge_environment_settingsrVr-r^)r;rmrVrur}r9r[r|rsrFrJrIrtrErGrHr~r_ZprepZsettingsZsend_kwargsr<r&r&r*rd�s()

zSession.requestcKs|jdd�|jd|f|�S)z�Sends a GET request. Returns :class:`Response` object.

        :param url: URL for the new :class:`Request` object.
        :param \*\*kwargs: Optional arguments that ``request`` takes.
        :rtype: requests.Response
        rJTrk)rhrd)r;rV�kwargsr&r&r*r3szSession.getcKs|jdd�|jd|f|�S)z�Sends a OPTIONS request. Returns :class:`Response` object.

        :param url: URL for the new :class:`Request` object.
        :param \*\*kwargs: Optional arguments that ``request`` takes.
        :rtype: requests.Response
        rJTZOPTIONS)rhrd)r;rVr�r&r&r*�options!szSession.optionscKs|jdd�|jd|f|�S)z�Sends a HEAD request. Returns :class:`Response` object.

        :param url: URL for the new :class:`Request` object.
        :param \*\*kwargs: Optional arguments that ``request`` takes.
        :rtype: requests.Response
        rJFrj)rhrd)r;rVr�r&r&r*�head,szSession.headcKs|jd|f||d�|��S)a�Sends a POST request. Returns :class:`Response` object.

        :param url: URL for the new :class:`Request` object.
        :param data: (optional) Dictionary, bytes, or file-like object to send in the body of the :class:`Request`.
        :param json: (optional) json to send in the body of the :class:`Request`.
        :param \*\*kwargs: Optional arguments that ``request`` takes.
        :rtype: requests.Response
        rl)r}r~)rd)r;rVr}r~r�r&r&r*�post7s
zSession.postcKs|jd|fd|i|��S)aYSends a PUT request. Returns :class:`Response` object.

        :param url: URL for the new :class:`Request` object.
        :param data: (optional) Dictionary, bytes, or file-like object to send in the body of the :class:`Request`.
        :param \*\*kwargs: Optional arguments that ``request`` takes.
        :rtype: requests.Response
        ZPUTr})rd)r;rVr}r�r&r&r*�putCs	zSession.putcKs|jd|fd|i|��S)a[Sends a PATCH request. Returns :class:`Response` object.

        :param url: URL for the new :class:`Request` object.
        :param data: (optional) Dictionary, bytes, or file-like object to send in the body of the :class:`Request`.
        :param \*\*kwargs: Optional arguments that ``request`` takes.
        :rtype: requests.Response
        ZPATCHr})rd)r;rVr}r�r&r&r*�patchNs	z
Session.patchcKs|jd|f|�S)z�Sends a DELETE request. Returns :class:`Response` object.

        :param url: URL for the new :class:`Request` object.
        :param \*\*kwargs: Optional arguments that ``request`` takes.
        :rtype: requests.Response
        ZDELETE)rd)r;rVr�r&r&r*�deleteYszSession.deletec
Ks~|jd|j�|jd|j�|jd|j�|jd|j�t|t�rJtd��|jdd�}|j	d�}|j
}|j|jd�}t
�}|j|f|�}t
�|}	t|	d	�|_td
||f|�}|jr�x |jD]}
t|j|
j|
j�q�Wt|j||j�|j||f|�}|�r
dd�|D�ng}|�r.|jd
|�|j�}||_|�sny"t|j||fddi|���|_Wntk
�rlYnX|�sz|j|S)zISend a given PreparedRequest.

        :rtype: requests.Response
        rErGrHrIz#You can only send PreparedRequests.rJT)rV)Zsecondsr2cSsg|]}|�qSr&r&)r'r<r&r&r*r+�sz Session.send.<locals>.<listcomp>rr`)rhrErGrHrIr,r�
ValueErrorrYr3rt�get_adapterrV�preferred_clockr^r�elapsedrrMrr[rdrPrc�insert�nextZ_next�
StopIterationrN)
r;rdr�rJrErt�adapter�start�rr�r<�genrMr&r&r*r^csB


"zSession.sendc
Cs�|jrr|dk	r|jd�nd}t||d�}x |j�D]\}}	|j||	�q2W|dksZ|dkrrtjjd�pptjjd�}t||j�}t||j	�}t||j
�}t||j�}||||d�S)z^
        Check the environment and merge it with some settings.

        :rtype: dict
        Nrf)rfTZREQUESTS_CA_BUNDLEZCURL_CA_BUNDLE)rGrIrErH)rer3rr.rh�os�environr1rIrErGrH)
r;rVrIrErGrHrfZenv_proxiesr(r)r&r&r*r��sz"Session.merge_environment_settingscCs:x(|jj�D]\}}|j�j|�r|SqWtd|��dS)z~
        Returns the appropriate connection adapter for the given URL.

        :rtype: requests.adapters.BaseAdapter
        z*No connection adapters were found for '%s'N)rvr.�lowerrUr)r;rV�prefixr�r&r&r*r��szSession.get_adaptercCs x|jj�D]}|j�qWdS)z+Closes all adapters and as such the sessionN)rv�valuesrT)r;r)r&r&r*rT�sz
Session.closecsB||j�<�fdd�|jD�}x|D]}|jj|�|j|<q$WdS)zwRegisters a connection adapter to a prefix.

        Adapters are sorted in descending order by prefix length.
        cs g|]}t|�t��kr|�qSr&)rR)r'r()r�r&r*r+�sz!Session.mount.<locals>.<listcomp>N)rvrY)r;r�r�Zkeys_to_mover0r&)r�r*rw�s

z
Session.mountcst�fdd��jD��}|S)Nc3s|]}|t�|d�fVqdS)N)�getattr)r'�attr)r;r&r*�	<genexpr>�sz'Session.__getstate__.<locals>.<genexpr>)�dict�	__attrs__)r;�stater&)r;r*�__getstate__�szSession.__getstate__cCs&x |j�D]\}}t|||�q
WdS)N)r.�setattr)r;r�r��valuer&r&r*�__setstate__�szSession.__setstate__)NNNNNNNTNNNNNN)NN)N)N)rorprq�__doc__r�rxryr{r�rdr3r�r�r�r�r�r�r^r�r�rTrwr�r�r&r&r&r*rrQs2
7)
D



IrrcCst�S)zQ
    Returns a :class:`Session` for context-management.

    :rtype: Session
    )rrr&r&r&r*�session�sr�)?r�r��platformZtime�collectionsrZdatetimerrsr�compatrrrr	r
r[rrr
rZmodelsrrrrtrrZ_internal_utilsrZutilsrr�
exceptionsrrrrZ
structuresrrvrrrrr r!r"r#Zstatus_codesr$r%�systemZperf_counterr��AttributeErrorZclockr1r4�objectr5rrr�r&r&r&r*�<module>	sB$
r"site-packages/pip/_vendor/requests/__pycache__/certs.cpython-36.opt-1.pyc000064400000001052147511334620022304 0ustar003

���e��@s&dZddlmZedkr"ee��dS)uF
requests.certs
~~~~~~~~~~~~~~

This module returns the preferred default CA certificate bundle. There is
only one — the one from the certifi package.

If you are packaging Requests, e.g., for a Linux distribution or a managed
environment, you can change the definition of where() to return a separately
packaged CA bundle.
�)�where�__main__N)�__doc__Zpip._vendor.certifir�__name__�print�rr�/usr/lib/python3.6/certs.py�<module>ssite-packages/pip/_vendor/requests/__pycache__/sessions.cpython-36.opt-1.pyc000064400000044731147511334620023045 0ustar003

���ep�@s�dZddlZddlZddlZddlmZddlmZddlm	Z	ddl
mZmZm
Z
mZmZddlmZmZmZmZdd	lmZmZmZdd
lmZmZddlmZddlmZm Z dd
l!m"Z"m#Z#m$Z$m%Z%ddl&m'Z'ddl(m)Z)ddlm*Z*m+Z+m,Z,m-Z-m.Z.m/Z/m0Z0ddl1m2Z2ddlm3Z3ej4�dk�rXy
ej5Z6Wne7k
�rTej8Z6YnXnejZ6e
fdd�Z9e
fdd�Z:Gdd�de;�Z<Gdd�de<�Z=dd�Z>dS)z�
requests.session
~~~~~~~~~~~~~~~~

This module provides a Session object to manage and persist settings across
requests (cookies, auth, proxies).
�N)�Mapping)�	timedelta�)�_basic_auth_str)�	cookielib�is_py3�OrderedDict�urljoin�urlparse)�cookiejar_from_dict�extract_cookies_to_jar�RequestsCookieJar�
merge_cookies)�Request�PreparedRequest�DEFAULT_REDIRECT_LIMIT)�
default_hooks�
dispatch_hook)�to_native_string)�to_key_val_list�default_headers)�TooManyRedirects�
InvalidSchema�ChunkedEncodingError�ContentDecodingError)�CaseInsensitiveDict)�HTTPAdapter)�requote_uri�get_environ_proxies�get_netrc_auth�should_bypass_proxies�get_auth_from_url�rewind_body�
DEFAULT_PORTS)�codes)�REDIRECT_STATIZWindowscCst|dkr|S|dkr|St|t�o*t|t�s0|S|t|��}|jt|��dd�|j�D�}x|D]
}||=qbW|S)z�Determines appropriate setting for a given request, taking into account
    the explicit setting on that request, and the setting in the session. If a
    setting is a dictionary, they will be merged together using `dict_class`
    NcSsg|]\}}|dkr|�qS)N�)�.0�k�vr&r&�/usr/lib/python3.6/sessions.py�
<listcomp>Jsz!merge_setting.<locals>.<listcomp>)�
isinstancerr�update�items)Zrequest_settingZsession_setting�
dict_classZmerged_settingZ	none_keys�keyr&r&r*�
merge_setting2s



r1cCs@|dks|jd�gkr|S|dks0|jd�gkr4|St|||�S)z�Properly merges both requests and session hooks.

    This is necessary because when request_hooks == {'response': []}, the
    merge breaks Session hooks entirely.
    N�response)�getr1)Z
request_hooksZ
session_hooksr/r&r&r*�merge_hooksQs
r4c@s>eZdZdd�Zdd�Zddd	�Zd
d�Zdd
�Zdd�ZdS)�SessionRedirectMixincCs,|jr(|jd}tr|jd�}t|d�SdS)z7Receives a Response. Returns a redirect URI or ``None``�location�latin1�utf8N)Zis_redirect�headersr�encoder)�self�respr6r&r&r*�get_redirect_targetbs


z(SessionRedirectMixin.get_redirect_targetcCs�t|�}t|�}|j|jkr dS|jdkrL|jdkrL|jdkrL|jd	krLdS|j|jk}|j|jk}tj|jd�df}|r�|j|kr�|j|kr�dS|p�|S)
zFDecide whether Authorization header should be removed when redirectingTZhttp�PNZhttps�F)r>N)r?N)r
Zhostname�schemeZportr#r3)r;Zold_urlZnew_urlZ
old_parsedZ
new_parsedZchanged_portZchanged_schemeZdefault_portr&r&r*�should_strip_authxs
z&SessionRedirectMixin.should_strip_authFNTc	ks.g}
|j|�}�x|�r(|j�}|
j|�|
dd�|_y
|jWn(tttfk
rj|jj	dd�YnXt
|j�|jkr�td|j|d��|j
�|jd�r�t|j�}
dt|
j�|f}t|�}|j�}|js�t|jt|��}nt|�}t|�|_|j||�|jtjtjfk�r>d}x|D]}|jj|d��q Wd|_|j}y
|d=Wntk
�rdYnXt |j!||j�t"|j!|j#�|j$|j!�|j%||�}|j&||�|j'dk	�o�d	|k�p�d|k}|�r�t(|�|}|�r�|Vq|j)|f|||||dd
�|	��}t |j#||j�|j|�}|VqWdS)zBReceives a Response. Returns a generator of Responses or Requests.rNF)Zdecode_contentzExceeded %s redirects.)r2z//z%s:%s�Content-Length�Content-Type�Transfer-EncodingZCookie)�stream�timeout�verify�cert�proxies�allow_redirects)rBrCrD)*r=�copy�append�history�contentrr�RuntimeError�raw�read�len�
max_redirectsr�close�
startswithr
�urlrr@ZgeturlZnetlocr	r�rebuild_method�status_coder$Ztemporary_redirectZpermanent_redirectr9�popZbody�KeyErrorrZ_cookiesr�cookiesZprepare_cookies�rebuild_proxies�rebuild_authZ_body_positionr"�send)r;r<�reqrErFrGrHrI�yield_requestsZadapter_kwargsZhistrV�prepared_requestZparsed_rurlZparsedZpurged_headers�headerr9Z
rewindabler&r&r*�resolve_redirects�sr









z&SessionRedirectMixin.resolve_redirectscCsR|j}|j}d|kr*|j|jj|�r*|d=|jr8t|�nd}|dk	rN|j|�dS)z�When being redirected we may want to strip authentication from the
        request to avoid leaking credentials. This method intelligently removes
        and reapplies authentication where possible to avoid credential loss.
        Z
AuthorizationN)r9rVrA�request�	trust_envrZprepare_auth)r;rar2r9rVZnew_authr&r&r*r]�s
z!SessionRedirectMixin.rebuild_authc
Cs�|dk	r|ni}|j}|j}t|�j}|j�}|jd�}t||d�}|jr~|r~t||d�}	|	j||	jd��}
|
r~|j	||
�d|kr�|d=yt
||�\}}Wntk
r�d\}}YnX|r�|r�t||�|d<|S)a�This method re-evaluates the proxy configuration by considering the
        environment variables. If we are redirected to a URL covered by
        NO_PROXY, we strip the proxy configuration. Otherwise, we set missing
        proxy keys for this URL (in case they were stripped by a previous
        redirect).

        This method also replaces the Proxy-Authorization header where
        necessary.

        :rtype: dict
        N�no_proxy)rf�allzProxy-Authorization)NN)
r9rVr
r@rKr3r rer�
setdefaultr!rZr)
r;rarIr9rVr@Znew_proxiesrfZbypass_proxyZenviron_proxies�proxyZusernameZpasswordr&r&r*r\s*

z$SessionRedirectMixin.rebuild_proxiescCsX|j}|jtjkr|dkrd}|jtjkr6|dkr6d}|jtjkrN|dkrNd}||_dS)z�When being redirected we may want to change the method of the request
        based on certain specs or browser behavior.
        �HEAD�GET�POSTN)�methodrXr$Z	see_other�foundZmoved)r;rar2rmr&r&r*rW:sz#SessionRedirectMixin.rebuild_method)FNTNNF)	�__name__�
__module__�__qualname__r=rArcr]r\rWr&r&r&r*r5`s
k)r5c@s�eZdZdZdddddddd	d
ddd
dg
Zdd�Zdd�Zdd�Zdd�Zd7dd�Z	dd�Z
dd�Zdd �Zd8d!d"�Z
d9d#d$�Zd:d%d&�Zd'd(�Zd)d*�Zd+d,�Zd-d.�Zd/d0�Zd1d2�Zd3d4�Zd5d6�ZdS);�Sessiona~A Requests session.

    Provides cookie persistence, connection-pooling, and configuration.

    Basic Usage::

      >>> import requests
      >>> s = requests.Session()
      >>> s.get('http://httpbin.org/get')
      <Response [200]>

    Or as a context manager::

      >>> with requests.Session() as s:
      >>>     s.get('http://httpbin.org/get')
      <Response [200]>
    r9r[�authrI�hooks�paramsrGrHZprefetch�adaptersrErerScCsrt�|_d|_i|_t�|_i|_d|_d|_d|_	t
|_d|_t
i�|_t�|_|jdt��|jdt��dS)NFTzhttps://zhttp://)rr9rsrIrrtrurErGrHrrSrerr[rrv�mountr)r;r&r&r*�__init__js
zSession.__init__cCs|S)Nr&)r;r&r&r*�	__enter__�szSession.__enter__cGs|j�dS)N)rT)r;�argsr&r&r*�__exit__�szSession.__exit__c
Cs�|jpi}t|tj�st|�}ttt�|j�|�}|j}|jrV|rV|jrVt	|j
�}t�}|j|j
j�|j
|j|j|jt|j|jtd�t|j|j�t||j�|t|j|j�d�
|S)a�Constructs a :class:`PreparedRequest <PreparedRequest>` for
        transmission and returns it. The :class:`PreparedRequest` has settings
        merged from the :class:`Request <Request>` instance and those of the
        :class:`Session`.

        :param request: :class:`Request` instance to prepare with this
            session's settings.
        :rtype: requests.PreparedRequest
        )r/)
rmrV�files�data�jsonr9rursr[rt)r[r,rZ	CookieJarrrr
rsrerrVrZpreparerm�upperr|r}r~r1r9rrur4rt)r;rdr[Zmerged_cookiesrs�pr&r&r*�prepare_request�s*



zSession.prepare_requestNTcCstt|j�||||pi||pi|||d�
}|j|�}|p8i}|j|j||
||�}|	|
d�}|j|�|j|f|�}|S)a�Constructs a :class:`Request <Request>`, prepares it and sends it.
        Returns :class:`Response <Response>` object.

        :param method: method for the new :class:`Request` object.
        :param url: URL for the new :class:`Request` object.
        :param params: (optional) Dictionary or bytes to be sent in the query
            string for the :class:`Request`.
        :param data: (optional) Dictionary, bytes, or file-like object to send
            in the body of the :class:`Request`.
        :param json: (optional) json to send in the body of the
            :class:`Request`.
        :param headers: (optional) Dictionary of HTTP Headers to send with the
            :class:`Request`.
        :param cookies: (optional) Dict or CookieJar object to send with the
            :class:`Request`.
        :param files: (optional) Dictionary of ``'filename': file-like-objects``
            for multipart encoding upload.
        :param auth: (optional) Auth tuple or callable to enable
            Basic/Digest/Custom HTTP Auth.
        :param timeout: (optional) How long to wait for the server to send
            data before giving up, as a float, or a :ref:`(connect timeout,
            read timeout) <timeouts>` tuple.
        :type timeout: float or tuple
        :param allow_redirects: (optional) Set to True by default.
        :type allow_redirects: bool
        :param proxies: (optional) Dictionary mapping protocol or protocol and
            hostname to the URL of the proxy.
        :param stream: (optional) whether to immediately download the response
            content. Defaults to ``False``.
        :param verify: (optional) Either a boolean, in which case it controls whether we verify
            the server's TLS certificate, or a string, in which case it must be a path
            to a CA bundle to use. Defaults to ``True``.
        :param cert: (optional) if String, path to ssl client cert file (.pem).
            If Tuple, ('cert', 'key') pair.
        :rtype: requests.Response
        )
rmrVr9r|r}r~rursr[rt)rFrJ)rrr��merge_environment_settingsrVr-r^)r;rmrVrur}r9r[r|rsrFrJrIrtrErGrHr~r_ZprepZsettingsZsend_kwargsr<r&r&r*rd�s()

zSession.requestcKs|jdd�|jd|f|�S)z�Sends a GET request. Returns :class:`Response` object.

        :param url: URL for the new :class:`Request` object.
        :param \*\*kwargs: Optional arguments that ``request`` takes.
        :rtype: requests.Response
        rJTrk)rhrd)r;rV�kwargsr&r&r*r3szSession.getcKs|jdd�|jd|f|�S)z�Sends a OPTIONS request. Returns :class:`Response` object.

        :param url: URL for the new :class:`Request` object.
        :param \*\*kwargs: Optional arguments that ``request`` takes.
        :rtype: requests.Response
[binary content: remainder of a compiled CPython 3.6 bytecode file (requests' sessions module). The readable fragments are docstrings for Session.options/head/post/put/patch/delete, Session.send, Session.merge_environment_settings, Session.get_adapter, Session.close, Session.mount, the pickling helpers, and the module-level session() factory; the surrounding bytes are marshalled code objects and are not representable as text.]
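
# --- Editor's illustrative sketch (not part of the archived files) ---
# The bytecode above is requests' Session machinery; its docstrings describe the
# per-verb helpers (options/head/post/put/patch/delete), send(), adapter mounting
# and environment merging. A minimal usage sketch, assuming the standalone
# `requests` package is importable and `base_url` points at a reachable server:
import requests

def fetch_with_session(base_url):
    # One Session reuses pooled connections and persists cookies across calls.
    with requests.Session() as s:
        s.headers.update({'User-Agent': 'example-client/1.0'})
        r = s.get(base_url + '/get', params={'q': 'demo'}, timeout=5)
        r.raise_for_status()
        s.post(base_url + '/post', json={'ok': True}, timeout=5)
        return r.status_code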
r"site-packages/pip/_vendor/requests/__pycache__/api.cpython-36.pyc000064400000014374147511334620021011 0ustar003

���e]�@s\dZddlmZdd�Zddd�Zdd	�Zd
d�Zddd
�Zddd�Zddd�Z	dd�Z
dS)z�
requests.api
~~~~~~~~~~~~

This module implements the Requests API.

:copyright: (c) 2012 by Kenneth Reitz.
:license: Apache2, see LICENSE for more details.
�)�sessionscKs*tj��}|jf||d�|��SQRXdS)a�	Constructs and sends a :class:`Request <Request>`.

    :param method: method for the new :class:`Request` object.
    :param url: URL for the new :class:`Request` object.
    :param params: (optional) Dictionary or bytes to be sent in the query string for the :class:`Request`.
    :param data: (optional) Dictionary or list of tuples ``[(key, value)]`` (will be form-encoded), bytes, or file-like object to send in the body of the :class:`Request`.
    :param json: (optional) json data to send in the body of the :class:`Request`.
    :param headers: (optional) Dictionary of HTTP Headers to send with the :class:`Request`.
    :param cookies: (optional) Dict or CookieJar object to send with the :class:`Request`.
    :param files: (optional) Dictionary of ``'name': file-like-objects`` (or ``{'name': file-tuple}``) for multipart encoding upload.
        ``file-tuple`` can be a 2-tuple ``('filename', fileobj)``, 3-tuple ``('filename', fileobj, 'content_type')``
        or a 4-tuple ``('filename', fileobj, 'content_type', custom_headers)``, where ``'content-type'`` is a string
        defining the content type of the given file and ``custom_headers`` a dict-like object containing additional headers
        to add for the file.
    :param auth: (optional) Auth tuple to enable Basic/Digest/Custom HTTP Auth.
    :param timeout: (optional) How many seconds to wait for the server to send data
        before giving up, as a float, or a :ref:`(connect timeout, read
        timeout) <timeouts>` tuple.
    :type timeout: float or tuple
    :param allow_redirects: (optional) Boolean. Enable/disable GET/OPTIONS/POST/PUT/PATCH/DELETE/HEAD redirection. Defaults to ``True``.
    :type allow_redirects: bool
    :param proxies: (optional) Dictionary mapping protocol to the URL of the proxy.
    :param verify: (optional) Either a boolean, in which case it controls whether we verify
            the server's TLS certificate, or a string, in which case it must be a path
            to a CA bundle to use. Defaults to ``True``.
    :param stream: (optional) if ``False``, the response content will be immediately downloaded.
    :param cert: (optional) if String, path to ssl client cert file (.pem). If Tuple, ('cert', 'key') pair.
    :return: :class:`Response <Response>` object
    :rtype: requests.Response

    Usage::

      >>> import requests
      >>> req = requests.request('GET', 'http://httpbin.org/get')
      <Response [200]>
    )�method�urlN)rZSession�request)rr�kwargsZsession�r�/usr/lib/python3.6/api.pyrs)
rNcKs"|jdd�td|fd|i|��S)aOSends a GET request.

    :param url: URL for the new :class:`Request` object.
    :param params: (optional) Dictionary or bytes to be sent in the query string for the :class:`Request`.
    :param \*\*kwargs: Optional arguments that ``request`` takes.
    :return: :class:`Response <Response>` object
    :rtype: requests.Response
    �allow_redirectsT�get�params)�
setdefaultr)rrrrrrr
=s
r
cKs|jdd�td|f|�S)z�Sends an OPTIONS request.

    :param url: URL for the new :class:`Request` object.
    :param \*\*kwargs: Optional arguments that ``request`` takes.
    :return: :class:`Response <Response>` object
    :rtype: requests.Response
    r	T�options)rr)rrrrrr
Ks	r
cKs|jdd�td|f|�S)z�Sends a HEAD request.

    :param url: URL for the new :class:`Request` object.
    :param \*\*kwargs: Optional arguments that ``request`` takes.
    :return: :class:`Response <Response>` object
    :rtype: requests.Response
    r	F�head)rr)rrrrrrXs	rcKstd|f||d�|��S)a�Sends a POST request.

    :param url: URL for the new :class:`Request` object.
    :param data: (optional) Dictionary (will be form-encoded), bytes, or file-like object to send in the body of the :class:`Request`.
    :param json: (optional) json data to send in the body of the :class:`Request`.
    :param \*\*kwargs: Optional arguments that ``request`` takes.
    :return: :class:`Response <Response>` object
    :rtype: requests.Response
    �post)�data�json)r)rrrrrrrresrcKstd|fd|i|��S)a�Sends a PUT request.

    :param url: URL for the new :class:`Request` object.
    :param data: (optional) Dictionary (will be form-encoded), bytes, or file-like object to send in the body of the :class:`Request`.
    :param json: (optional) json data to send in the body of the :class:`Request`.
    :param \*\*kwargs: Optional arguments that ``request`` takes.
    :return: :class:`Response <Response>` object
    :rtype: requests.Response
    �putr)r)rrrrrrrssrcKstd|fd|i|��S)a�Sends a PATCH request.

    :param url: URL for the new :class:`Request` object.
    :param data: (optional) Dictionary (will be form-encoded), bytes, or file-like object to send in the body of the :class:`Request`.
    :param json: (optional) json data to send in the body of the :class:`Request`.
    :param \*\*kwargs: Optional arguments that ``request`` takes.
    :return: :class:`Response <Response>` object
    :rtype: requests.Response
    �patchr)r)rrrrrrr�srcKstd|f|�S)z�Sends a DELETE request.

    :param url: URL for the new :class:`Request` object.
    :param \*\*kwargs: Optional arguments that ``request`` takes.
    :return: :class:`Response <Response>` object
    :rtype: requests.Response
    �delete)r)rrrrrr�s	r)N)NN)N)N)�__doc__�rrr
r
rrrrrrrrr�<module>s-




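# --- Editor's illustrative sketch (not part of the archived files) ---
# The api module documented above exposes functional helpers; each one opens a
# throwaway Session and forwards **kwargs to Session.request(). A minimal sketch,
# assuming the standalone `requests` package and a reachable httpbin-style host:
import requests

def functional_api_demo():
    r = requests.get('http://httpbin.org/get', params={'page': 1}, timeout=5)
    print(r.status_code, r.headers.get('Content-Type'))
    r = requests.post('http://httpbin.org/post', json={'name': 'demo'}, timeout=5)
    return r.json()
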
site-packages/pip/_vendor/requests/status_codes.py000064400000006373147511334620016454 0ustar00
# -*- coding: utf-8 -*-

from .structures import LookupDict

_codes = {

    # Informational.
    100: ('continue',),
    101: ('switching_protocols',),
    102: ('processing',),
    103: ('checkpoint',),
    122: ('uri_too_long', 'request_uri_too_long'),
    200: ('ok', 'okay', 'all_ok', 'all_okay', 'all_good', '\\o/', '✓'),
    201: ('created',),
    202: ('accepted',),
    203: ('non_authoritative_info', 'non_authoritative_information'),
    204: ('no_content',),
    205: ('reset_content', 'reset'),
    206: ('partial_content', 'partial'),
    207: ('multi_status', 'multiple_status', 'multi_stati', 'multiple_stati'),
    208: ('already_reported',),
    226: ('im_used',),

    # Redirection.
    300: ('multiple_choices',),
    301: ('moved_permanently', 'moved', '\\o-'),
    302: ('found',),
    303: ('see_other', 'other'),
    304: ('not_modified',),
    305: ('use_proxy',),
    306: ('switch_proxy',),
    307: ('temporary_redirect', 'temporary_moved', 'temporary'),
    308: ('permanent_redirect',
          'resume_incomplete', 'resume',),  # These 2 to be removed in 3.0

    # Client Error.
    400: ('bad_request', 'bad'),
    401: ('unauthorized',),
    402: ('payment_required', 'payment'),
    403: ('forbidden',),
    404: ('not_found', '-o-'),
    405: ('method_not_allowed', 'not_allowed'),
    406: ('not_acceptable',),
    407: ('proxy_authentication_required', 'proxy_auth', 'proxy_authentication'),
    408: ('request_timeout', 'timeout'),
    409: ('conflict',),
    410: ('gone',),
    411: ('length_required',),
    412: ('precondition_failed', 'precondition'),
    413: ('request_entity_too_large',),
    414: ('request_uri_too_large',),
    415: ('unsupported_media_type', 'unsupported_media', 'media_type'),
    416: ('requested_range_not_satisfiable', 'requested_range', 'range_not_satisfiable'),
    417: ('expectation_failed',),
    418: ('im_a_teapot', 'teapot', 'i_am_a_teapot'),
    421: ('misdirected_request',),
    422: ('unprocessable_entity', 'unprocessable'),
    423: ('locked',),
    424: ('failed_dependency', 'dependency'),
    425: ('unordered_collection', 'unordered'),
    426: ('upgrade_required', 'upgrade'),
    428: ('precondition_required', 'precondition'),
    429: ('too_many_requests', 'too_many'),
    431: ('header_fields_too_large', 'fields_too_large'),
    444: ('no_response', 'none'),
    449: ('retry_with', 'retry'),
    450: ('blocked_by_windows_parental_controls', 'parental_controls'),
    451: ('unavailable_for_legal_reasons', 'legal_reasons'),
    499: ('client_closed_request',),

    # Server Error.
    500: ('internal_server_error', 'server_error', '/o\\', '✗'),
    501: ('not_implemented',),
    502: ('bad_gateway',),
    503: ('service_unavailable', 'unavailable'),
    504: ('gateway_timeout',),
    505: ('http_version_not_supported', 'http_version'),
    506: ('variant_also_negotiates',),
    507: ('insufficient_storage',),
    509: ('bandwidth_limit_exceeded', 'bandwidth'),
    510: ('not_extended',),
    511: ('network_authentication_required', 'network_auth', 'network_authentication'),
}

codes = LookupDict(name='status_codes')

for code, titles in _codes.items():
    for title in titles:
        setattr(codes, title, code)
        if not title.startswith(('\\', '/')):
            setattr(codes, title.upper(), code)
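

# --- Editor's illustrative sketch (not part of the upstream module) ---
# The loop above exposes every numeric status through attribute and key aliases
# on `codes`, plus an upper-case variant for names not starting with '\' or '/':
def _example_status_lookup():
    assert codes.ok == 200
    assert codes.NOT_FOUND == 404
    assert codes['temporary_redirect'] == 307
    assert codes['\\o/'] == 200   # no upper-case alias for symbol names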
site-packages/pip/_vendor/requests/cookies.py000064400000043440147511334620015404 0ustar00
# -*- coding: utf-8 -*-

"""
requests.cookies
~~~~~~~~~~~~~~~~

Compatibility code to be able to use `cookielib.CookieJar` with requests.

requests.utils imports from here, so be careful with imports.
"""

import copy
import time
import calendar
import collections

from ._internal_utils import to_native_string
from .compat import cookielib, urlparse, urlunparse, Morsel

try:
    import threading
except ImportError:
    import dummy_threading as threading


class MockRequest(object):
    """Wraps a `requests.Request` to mimic a `urllib2.Request`.

    The code in `cookielib.CookieJar` expects this interface in order to correctly
    manage cookie policies, i.e., determine whether a cookie can be set, given the
    domains of the request and the cookie.

    The original request object is read-only. The client is responsible for collecting
    the new headers via `get_new_headers()` and interpreting them appropriately. You
    probably want `get_cookie_header`, defined below.
    """

    def __init__(self, request):
        self._r = request
        self._new_headers = {}
        self.type = urlparse(self._r.url).scheme

    def get_type(self):
        return self.type

    def get_host(self):
        return urlparse(self._r.url).netloc

    def get_origin_req_host(self):
        return self.get_host()

    def get_full_url(self):
        # Only return the request's URL if the user hadn't set the Host
        # header
        if not self._r.headers.get('Host'):
            return self._r.url
        # If they did set it, retrieve it and reconstruct the expected domain
        host = to_native_string(self._r.headers['Host'], encoding='utf-8')
        parsed = urlparse(self._r.url)
        # Reconstruct the URL as we expect it
        return urlunparse([
            parsed.scheme, host, parsed.path, parsed.params, parsed.query,
            parsed.fragment
        ])

    def is_unverifiable(self):
        return True

    def has_header(self, name):
        return name in self._r.headers or name in self._new_headers

    def get_header(self, name, default=None):
        return self._r.headers.get(name, self._new_headers.get(name, default))

    def add_header(self, key, val):
        """cookielib has no legitimate use for this method; add it back if you find one."""
        raise NotImplementedError("Cookie headers should be added with add_unredirected_header()")

    def add_unredirected_header(self, name, value):
        self._new_headers[name] = value

    def get_new_headers(self):
        return self._new_headers

    @property
    def unverifiable(self):
        return self.is_unverifiable()

    @property
    def origin_req_host(self):
        return self.get_origin_req_host()

    @property
    def host(self):
        return self.get_host()


class MockResponse(object):
    """Wraps a `httplib.HTTPMessage` to mimic a `urllib.addinfourl`.

    ...what? Basically, expose the parsed HTTP headers from the server response
    the way `cookielib` expects to see them.
    """

    def __init__(self, headers):
        """Make a MockResponse for `cookielib` to read.

        :param headers: a httplib.HTTPMessage or analogous carrying the headers
        """
        self._headers = headers

    def info(self):
        return self._headers

    def getheaders(self, name):
        return self._headers.getheaders(name)


def extract_cookies_to_jar(jar, request, response):
    """Extract the cookies from the response into a CookieJar.

    :param jar: cookielib.CookieJar (not necessarily a RequestsCookieJar)
    :param request: our own requests.Request object
    :param response: urllib3.HTTPResponse object
    """
    if not (hasattr(response, '_original_response') and
            response._original_response):
        return
    # the _original_response field is the wrapped httplib.HTTPResponse object,
    req = MockRequest(request)
    # pull out the HTTPMessage with the headers and put it in the mock:
    res = MockResponse(response._original_response.msg)
    jar.extract_cookies(res, req)


def get_cookie_header(jar, request):
    """
    Produce an appropriate Cookie header string to be sent with `request`, or None.

    :rtype: str
    """
    r = MockRequest(request)
    jar.add_cookie_header(r)
    return r.get_new_headers().get('Cookie')
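

# --- Editor's illustrative sketch (not part of the upstream module) ---
# get_cookie_header() wraps the outgoing request in MockRequest so that a plain
# cookielib policy can compute the Cookie header. `_Req` below is a hypothetical
# stand-in with the two attributes the wrapper reads; cookiejar_from_dict() is
# defined further down in this file.
def _example_cookie_header():
    class _Req(object):
        url = 'http://example.com/path'
        headers = {}

    jar = cookiejar_from_dict({'sessionid': 'abc123'})
    return get_cookie_header(jar, _Req())   # e.g. 'sessionid=abc123'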


def remove_cookie_by_name(cookiejar, name, domain=None, path=None):
    """Unsets a cookie by name, by default over all domains and paths.

    Wraps CookieJar.clear(), is O(n).
    """
    clearables = []
    for cookie in cookiejar:
        if cookie.name != name:
            continue
        if domain is not None and domain != cookie.domain:
            continue
        if path is not None and path != cookie.path:
            continue
        clearables.append((cookie.domain, cookie.path, cookie.name))

    for domain, path, name in clearables:
        cookiejar.clear(domain, path, name)


class CookieConflictError(RuntimeError):
    """There are two cookies that meet the criteria specified in the cookie jar.
    Use .get and .set and include domain and path args in order to be more specific.
    """


class RequestsCookieJar(cookielib.CookieJar, collections.MutableMapping):
    """Compatibility class; is a cookielib.CookieJar, but exposes a dict
    interface.

    This is the CookieJar we create by default for requests and sessions that
    don't specify one, since some clients may expect response.cookies and
    session.cookies to support dict operations.

    Requests does not use the dict interface internally; it's just for
    compatibility with external client code. All requests code should work
    out of the box with externally provided instances of ``CookieJar``, e.g.
    ``LWPCookieJar`` and ``FileCookieJar``.

    Unlike a regular CookieJar, this class is pickleable.

    .. warning:: dictionary operations that are normally O(1) may be O(n).
    """

    def get(self, name, default=None, domain=None, path=None):
        """Dict-like get() that also supports optional domain and path args in
        order to resolve naming collisions from using one cookie jar over
        multiple domains.

        .. warning:: operation is O(n), not O(1).
        """
        try:
            return self._find_no_duplicates(name, domain, path)
        except KeyError:
            return default

    def set(self, name, value, **kwargs):
        """Dict-like set() that also supports optional domain and path args in
        order to resolve naming collisions from using one cookie jar over
        multiple domains.
        """
        # support client code that unsets cookies by assignment of a None value:
        if value is None:
            remove_cookie_by_name(self, name, domain=kwargs.get('domain'), path=kwargs.get('path'))
            return

        if isinstance(value, Morsel):
            c = morsel_to_cookie(value)
        else:
            c = create_cookie(name, value, **kwargs)
        self.set_cookie(c)
        return c

    def iterkeys(self):
        """Dict-like iterkeys() that returns an iterator of names of cookies
        from the jar.

        .. seealso:: itervalues() and iteritems().
        """
        for cookie in iter(self):
            yield cookie.name

    def keys(self):
        """Dict-like keys() that returns a list of names of cookies from the
        jar.

        .. seealso:: values() and items().
        """
        return list(self.iterkeys())

    def itervalues(self):
        """Dict-like itervalues() that returns an iterator of values of cookies
        from the jar.

        .. seealso:: iterkeys() and iteritems().
        """
        for cookie in iter(self):
            yield cookie.value

    def values(self):
        """Dict-like values() that returns a list of values of cookies from the
        jar.

        .. seealso:: keys() and items().
        """
        return list(self.itervalues())

    def iteritems(self):
        """Dict-like iteritems() that returns an iterator of name-value tuples
        from the jar.

        .. seealso:: iterkeys() and itervalues().
        """
        for cookie in iter(self):
            yield cookie.name, cookie.value

    def items(self):
        """Dict-like items() that returns a list of name-value tuples from the
        jar. Allows client-code to call ``dict(RequestsCookieJar)`` and get a
        vanilla python dict of key value pairs.

        .. seealso:: keys() and values().
        """
        return list(self.iteritems())

    def list_domains(self):
        """Utility method to list all the domains in the jar."""
        domains = []
        for cookie in iter(self):
            if cookie.domain not in domains:
                domains.append(cookie.domain)
        return domains

    def list_paths(self):
        """Utility method to list all the paths in the jar."""
        paths = []
        for cookie in iter(self):
            if cookie.path not in paths:
                paths.append(cookie.path)
        return paths

    def multiple_domains(self):
        """Returns True if there are multiple domains in the jar.
        Returns False otherwise.

        :rtype: bool
        """
        domains = []
        for cookie in iter(self):
            if cookie.domain is not None and cookie.domain in domains:
                return True
            domains.append(cookie.domain)
        return False  # there is only one domain in jar

    def get_dict(self, domain=None, path=None):
        """Takes as an argument an optional domain and path and returns a plain
        old Python dict of name-value pairs of cookies that meet the
        requirements.

        :rtype: dict
        """
        dictionary = {}
        for cookie in iter(self):
            if (
                (domain is None or cookie.domain == domain) and
                (path is None or cookie.path == path)
            ):
                dictionary[cookie.name] = cookie.value
        return dictionary

    def __contains__(self, name):
        try:
            return super(RequestsCookieJar, self).__contains__(name)
        except CookieConflictError:
            return True

    def __getitem__(self, name):
        """Dict-like __getitem__() for compatibility with client code. Throws
        exception if there are more than one cookie with name. In that case,
        use the more explicit get() method instead.

        .. warning:: operation is O(n), not O(1).
        """
        return self._find_no_duplicates(name)

    def __setitem__(self, name, value):
        """Dict-like __setitem__ for compatibility with client code. Throws
        exception if there is already a cookie of that name in the jar. In that
        case, use the more explicit set() method instead.
        """
        self.set(name, value)

    def __delitem__(self, name):
        """Deletes a cookie given a name. Wraps ``cookielib.CookieJar``'s
        ``remove_cookie_by_name()``.
        """
        remove_cookie_by_name(self, name)

    def set_cookie(self, cookie, *args, **kwargs):
        if hasattr(cookie.value, 'startswith') and cookie.value.startswith('"') and cookie.value.endswith('"'):
            cookie.value = cookie.value.replace('\\"', '')
        return super(RequestsCookieJar, self).set_cookie(cookie, *args, **kwargs)

    def update(self, other):
        """Updates this jar with cookies from another CookieJar or dict-like"""
        if isinstance(other, cookielib.CookieJar):
            for cookie in other:
                self.set_cookie(copy.copy(cookie))
        else:
            super(RequestsCookieJar, self).update(other)

    def _find(self, name, domain=None, path=None):
        """Requests uses this method internally to get cookie values.

        If there are conflicting cookies, _find arbitrarily chooses one.
        See _find_no_duplicates if you want an exception thrown if there are
        conflicting cookies.

        :param name: a string containing name of cookie
        :param domain: (optional) string containing domain of cookie
        :param path: (optional) string containing path of cookie
        :return: cookie.value
        """
        for cookie in iter(self):
            if cookie.name == name:
                if domain is None or cookie.domain == domain:
                    if path is None or cookie.path == path:
                        return cookie.value

        raise KeyError('name=%r, domain=%r, path=%r' % (name, domain, path))

    def _find_no_duplicates(self, name, domain=None, path=None):
        """Both ``__get_item__`` and ``get`` call this function: it's never
        used elsewhere in Requests.

        :param name: a string containing name of cookie
        :param domain: (optional) string containing domain of cookie
        :param path: (optional) string containing path of cookie
        :raises KeyError: if cookie is not found
        :raises CookieConflictError: if there are multiple cookies
            that match name and optionally domain and path
        :return: cookie.value
        """
        toReturn = None
        for cookie in iter(self):
            if cookie.name == name:
                if domain is None or cookie.domain == domain:
                    if path is None or cookie.path == path:
                        if toReturn is not None:  # if there are multiple cookies that meet passed in criteria
                            raise CookieConflictError('There are multiple cookies with name, %r' % (name))
                        toReturn = cookie.value  # we will eventually return this as long as no cookie conflict

        if toReturn:
            return toReturn
        raise KeyError('name=%r, domain=%r, path=%r' % (name, domain, path))

    def __getstate__(self):
        """Unlike a normal CookieJar, this class is pickleable."""
        state = self.__dict__.copy()
        # remove the unpickleable RLock object
        state.pop('_cookies_lock')
        return state

    def __setstate__(self, state):
        """Unlike a normal CookieJar, this class is pickleable."""
        self.__dict__.update(state)
        if '_cookies_lock' not in self.__dict__:
            self._cookies_lock = threading.RLock()

    def copy(self):
        """Return a copy of this RequestsCookieJar."""
        new_cj = RequestsCookieJar()
        new_cj.update(self)
        return new_cj
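

# --- Editor's illustrative sketch (not part of the upstream module) ---
# A minimal sketch of the dict-like interface documented above. Lookups are O(n)
# scans over the jar, and a name present on several domains raises
# CookieConflictError unless domain/path narrow the search.
def _example_jar_usage():
    jar = RequestsCookieJar()
    jar.set('token', 'abc', domain='example.com', path='/')
    jar.set('token', 'xyz', domain='other.example', path='/')
    assert jar.get('token', domain='other.example') == 'xyz'
    assert sorted(jar.list_domains()) == ['example.com', 'other.example']
    return jar.get_dict(domain='example.com')   # {'token': 'abc'}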


def _copy_cookie_jar(jar):
    if jar is None:
        return None

    if hasattr(jar, 'copy'):
        # We're dealing with an instance of RequestsCookieJar
        return jar.copy()
    # We're dealing with a generic CookieJar instance
    new_jar = copy.copy(jar)
    new_jar.clear()
    for cookie in jar:
        new_jar.set_cookie(copy.copy(cookie))
    return new_jar


def create_cookie(name, value, **kwargs):
    """Make a cookie from underspecified parameters.

    By default, the pair of `name` and `value` will be set for the domain ''
    and sent on every request (this is sometimes called a "supercookie").
    """
    result = dict(
        version=0,
        name=name,
        value=value,
        port=None,
        domain='',
        path='/',
        secure=False,
        expires=None,
        discard=True,
        comment=None,
        comment_url=None,
        rest={'HttpOnly': None},
        rfc2109=False,)

    badargs = set(kwargs) - set(result)
    if badargs:
        err = 'create_cookie() got unexpected keyword arguments: %s'
        raise TypeError(err % list(badargs))

    result.update(kwargs)
    result['port_specified'] = bool(result['port'])
    result['domain_specified'] = bool(result['domain'])
    result['domain_initial_dot'] = result['domain'].startswith('.')
    result['path_specified'] = bool(result['path'])

    return cookielib.Cookie(**result)


def morsel_to_cookie(morsel):
    """Convert a Morsel object into a Cookie containing the one k/v pair."""

    expires = None
    if morsel['max-age']:
        try:
            expires = int(time.time() + int(morsel['max-age']))
        except ValueError:
            raise TypeError('max-age: %s must be integer' % morsel['max-age'])
    elif morsel['expires']:
        time_template = '%a, %d-%b-%Y %H:%M:%S GMT'
        expires = calendar.timegm(
            time.strptime(morsel['expires'], time_template)
        )
    return create_cookie(
        comment=morsel['comment'],
        comment_url=bool(morsel['comment']),
        discard=False,
        domain=morsel['domain'],
        expires=expires,
        name=morsel.key,
        path=morsel['path'],
        port=None,
        rest={'HttpOnly': morsel['httponly']},
        rfc2109=False,
        secure=bool(morsel['secure']),
        value=morsel.value,
        version=morsel['version'] or 0,
    )
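

# --- Editor's illustrative sketch (not part of the upstream module) ---
# Converting a stdlib Morsel (http.cookies, Python 3 assumed) into a cookielib
# Cookie via morsel_to_cookie():
def _example_morsel_conversion():
    from http.cookies import SimpleCookie
    sc = SimpleCookie()
    sc['sessionid'] = 'abc123'
    sc['sessionid']['path'] = '/'
    sc['sessionid']['httponly'] = True
    cookie = morsel_to_cookie(sc['sessionid'])
    return cookie.name, cookie.value, cookie.path   # ('sessionid', 'abc123', '/')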


def cookiejar_from_dict(cookie_dict, cookiejar=None, overwrite=True):
    """Returns a CookieJar from a key/value dictionary.

    :param cookie_dict: Dict of key/values to insert into CookieJar.
    :param cookiejar: (optional) A cookiejar to add the cookies to.
    :param overwrite: (optional) If False, will not replace cookies
        already in the jar with new ones.
    """
    if cookiejar is None:
        cookiejar = RequestsCookieJar()

    if cookie_dict is not None:
        names_from_jar = [cookie.name for cookie in cookiejar]
        for name in cookie_dict:
            if overwrite or (name not in names_from_jar):
                cookiejar.set_cookie(create_cookie(name, cookie_dict[name]))

    return cookiejar


def merge_cookies(cookiejar, cookies):
    """Add cookies to cookiejar and returns a merged CookieJar.

    :param cookiejar: CookieJar object to add the cookies to.
    :param cookies: Dictionary or CookieJar object to be added.
    """
    if not isinstance(cookiejar, cookielib.CookieJar):
        raise ValueError('You can only merge into CookieJar')

    if isinstance(cookies, dict):
        cookiejar = cookiejar_from_dict(
            cookies, cookiejar=cookiejar, overwrite=False)
    elif isinstance(cookies, cookielib.CookieJar):
        try:
            cookiejar.update(cookies)
        except AttributeError:
            for cookie_in_jar in cookies:
                cookiejar.set_cookie(cookie_in_jar)

    return cookiejar
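

# --- Editor's illustrative sketch (not part of the upstream module) ---
# cookiejar_from_dict() builds a jar from plain key/value pairs; merge_cookies()
# folds a dict or another CookieJar into an existing jar without overwriting:
def _example_merge():
    jar = cookiejar_from_dict({'a': '1', 'b': '2'})
    jar = merge_cookies(jar, {'b': 'ignored', 'c': '3'})
    return sorted((c.name, c.value) for c in jar)   # [('a', '1'), ('b', '2'), ('c', '3')]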
site-packages/pip/_vendor/requests/exceptions.py000064400000006053147511334620016130 0ustar00
# -*- coding: utf-8 -*-

"""
requests.exceptions
~~~~~~~~~~~~~~~~~~~

This module contains the set of Requests' exceptions.
"""
from pip._vendor.urllib3.exceptions import HTTPError as BaseHTTPError


class RequestException(IOError):
    """There was an ambiguous exception that occurred while handling your
    request.
    """

    def __init__(self, *args, **kwargs):
        """Initialize RequestException with `request` and `response` objects."""
        response = kwargs.pop('response', None)
        self.response = response
        self.request = kwargs.pop('request', None)
        if (response is not None and not self.request and
                hasattr(response, 'request')):
            self.request = self.response.request
        super(RequestException, self).__init__(*args, **kwargs)


class HTTPError(RequestException):
    """An HTTP error occurred."""


class ConnectionError(RequestException):
    """A Connection error occurred."""


class ProxyError(ConnectionError):
    """A proxy error occurred."""


class SSLError(ConnectionError):
    """An SSL error occurred."""


class Timeout(RequestException):
    """The request timed out.

    Catching this error will catch both
    :exc:`~requests.exceptions.ConnectTimeout` and
    :exc:`~requests.exceptions.ReadTimeout` errors.
    """


class ConnectTimeout(ConnectionError, Timeout):
    """The request timed out while trying to connect to the remote server.

    Requests that produced this error are safe to retry.
    """


class ReadTimeout(Timeout):
    """The server did not send any data in the allotted amount of time."""


class URLRequired(RequestException):
    """A valid URL is required to make a request."""


class TooManyRedirects(RequestException):
    """Too many redirects."""


class MissingSchema(RequestException, ValueError):
    """The URL schema (e.g. http or https) is missing."""


class InvalidSchema(RequestException, ValueError):
    """See defaults.py for valid schemas."""


class InvalidURL(RequestException, ValueError):
    """The URL provided was somehow invalid."""


class InvalidHeader(RequestException, ValueError):
    """The header value provided was somehow invalid."""


class ChunkedEncodingError(RequestException):
    """The server declared chunked encoding but sent an invalid chunk."""


class ContentDecodingError(RequestException, BaseHTTPError):
    """Failed to decode response content"""


class StreamConsumedError(RequestException, TypeError):
    """The content for this response was already consumed"""


class RetryError(RequestException):
    """Custom retries logic failed"""


class UnrewindableBodyError(RequestException):
    """Requests encountered an error when trying to rewind a body"""

# Warnings


class RequestsWarning(Warning):
    """Base warning for Requests."""
    pass


class FileModeWarning(RequestsWarning, DeprecationWarning):
    """A file was opened in text mode, but Requests determined its binary length."""
    pass


class RequestsDependencyWarning(RequestsWarning):
    """An imported dependency doesn't match the expected version range."""
    pass
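

# --- Editor's illustrative sketch (not part of the upstream module) ---
# Typical handling of the hierarchy above: RequestException is the common base,
# Timeout covers ConnectTimeout and ReadTimeout, and HTTPError is raised by
# Response.raise_for_status(). Assumes the standalone `requests` package and a
# caller-supplied `url`.
def _example_error_handling(url):
    import requests
    try:
        r = requests.get(url, timeout=3)
        r.raise_for_status()
    except requests.exceptions.Timeout:
        return 'timed out'
    except requests.exceptions.HTTPError as exc:
        return 'bad status: %s' % exc.response.status_code
    except requests.exceptions.RequestException as exc:
        return 'request failed: %s' % exc
    return r.text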
site-packages/pip/_vendor/requests/certs.py000064400000000721147511334620015063 0ustar00
#!/usr/bin/env python
# -*- coding: utf-8 -*-

"""
requests.certs
~~~~~~~~~~~~~~

This module returns the preferred default CA certificate bundle. There is
only one — the one from the certifi package.

If you are packaging Requests, e.g., for a Linux distribution or a managed
environment, you can change the definition of where() to return a separately
packaged CA bundle.
"""
from pip._vendor.certifi import where

if __name__ == '__main__':
    print(where())
site-packages/pip/_vendor/requests/packages.py000064400000001267147511334620015527 0ustar00
import sys

# This code exists for backwards compatibility reasons.
# I don't like it either. Just look the other way. :)

for package in ('urllib3', 'idna', 'chardet'):
    vendored_package = "pip._vendor." + package
    locals()[package] = __import__(vendored_package)
    # This traversal is apparently necessary such that the identities are
    # preserved (requests.packages.urllib3.* is urllib3.*)
    for mod in list(sys.modules):
        if mod == vendored_package or mod.startswith(vendored_package + '.'):
            unprefixed_mod = mod[len("pip._vendor."):]
            sys.modules['pip._vendor.requests.packages.' + unprefixed_mod] = sys.modules[mod]

# Kinda cool, though, right?
site-packages/pip/_vendor/requests/models.py000064400000102403147511334620015226 0ustar00
# -*- coding: utf-8 -*-

"""
requests.models
~~~~~~~~~~~~~~~

This module contains the primary objects that power Requests.
"""

import collections
import datetime
import sys

# Import encoding now, to avoid implicit import later.
# Implicit import within threads may cause LookupError when standard library is in a ZIP,
# such as in Embedded Python. See https://github.com/requests/requests/issues/3578.
import encodings.idna

from pip._vendor.urllib3.fields import RequestField
from pip._vendor.urllib3.filepost import encode_multipart_formdata
from pip._vendor.urllib3.util import parse_url
from pip._vendor.urllib3.exceptions import (
    DecodeError, ReadTimeoutError, ProtocolError, LocationParseError)

from io import UnsupportedOperation
from .hooks import default_hooks
from .structures import CaseInsensitiveDict

from .auth import HTTPBasicAuth
from .cookies import cookiejar_from_dict, get_cookie_header, _copy_cookie_jar
from .exceptions import (
    HTTPError, MissingSchema, InvalidURL, ChunkedEncodingError,
    ContentDecodingError, ConnectionError, StreamConsumedError)
from ._internal_utils import to_native_string, unicode_is_ascii
from .utils import (
    guess_filename, get_auth_from_url, requote_uri,
    stream_decode_response_unicode, to_key_val_list, parse_header_links,
    iter_slices, guess_json_utf, super_len, check_header_validity)
from .compat import (
    cookielib, urlunparse, urlsplit, urlencode, str, bytes,
    is_py2, chardet, builtin_str, basestring)
from .compat import json as complexjson
from .status_codes import codes

#: The set of HTTP status codes that indicate an automatically
#: processable redirect.
REDIRECT_STATI = (
    codes.moved,               # 301
    codes.found,               # 302
    codes.other,               # 303
    codes.temporary_redirect,  # 307
    codes.permanent_redirect,  # 308
)

DEFAULT_REDIRECT_LIMIT = 30
CONTENT_CHUNK_SIZE = 10 * 1024
ITER_CHUNK_SIZE = 512


class RequestEncodingMixin(object):
    @property
    def path_url(self):
        """Build the path URL to use."""

        url = []

        p = urlsplit(self.url)

        path = p.path
        if not path:
            path = '/'

        url.append(path)

        query = p.query
        if query:
            url.append('?')
            url.append(query)

        return ''.join(url)

    @staticmethod
    def _encode_params(data):
        """Encode parameters in a piece of data.

        Will successfully encode parameters when passed as a dict or a list of
        2-tuples. Order is retained if data is a list of 2-tuples but arbitrary
        if parameters are supplied as a dict.
        """

        if isinstance(data, (str, bytes)):
            return data
        elif hasattr(data, 'read'):
            return data
        elif hasattr(data, '__iter__'):
            result = []
            for k, vs in to_key_val_list(data):
                if isinstance(vs, basestring) or not hasattr(vs, '__iter__'):
                    vs = [vs]
                for v in vs:
                    if v is not None:
                        result.append(
                            (k.encode('utf-8') if isinstance(k, str) else k,
                             v.encode('utf-8') if isinstance(v, str) else v))
            return urlencode(result, doseq=True)
        else:
            return data
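
    # --- Editor's illustrative sketch (not part of the upstream class) ---
    # _encode_params() leaves strings and file-like bodies untouched and
    # form-encodes dicts or lists of 2-tuples, repeating a key once per value:
    @staticmethod
    def _example_encode_params():
        encoded = RequestEncodingMixin._encode_params({'q': 'demo', 'tag': ['a', 'b']})
        # e.g. 'q=demo&tag=a&tag=b' (dict ordering is not guaranteed)
        return encoded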

    @staticmethod
    def _encode_files(files, data):
        """Build the body for a multipart/form-data request.

        Will successfully encode files when passed as a dict or a list of
        tuples. Order is retained if data is a list of tuples but arbitrary
        if parameters are supplied as a dict.
        The tuples may be 2-tuples (filename, fileobj), 3-tuples (filename, fileobj, content_type)
        or 4-tuples (filename, fileobj, content_type, custom_headers).
        """
        if (not files):
            raise ValueError("Files must be provided.")
        elif isinstance(data, basestring):
            raise ValueError("Data must not be a string.")

        new_fields = []
        fields = to_key_val_list(data or {})
        files = to_key_val_list(files or {})

        for field, val in fields:
            if isinstance(val, basestring) or not hasattr(val, '__iter__'):
                val = [val]
            for v in val:
                if v is not None:
                    # Don't call str() on bytestrings: in Py3 it all goes wrong.
                    if not isinstance(v, bytes):
                        v = str(v)

                    new_fields.append(
                        (field.decode('utf-8') if isinstance(field, bytes) else field,
                         v.encode('utf-8') if isinstance(v, str) else v))

        for (k, v) in files:
            # support for explicit filename
            ft = None
            fh = None
            if isinstance(v, (tuple, list)):
                if len(v) == 2:
                    fn, fp = v
                elif len(v) == 3:
                    fn, fp, ft = v
                else:
                    fn, fp, ft, fh = v
            else:
                fn = guess_filename(v) or k
                fp = v

            if isinstance(fp, (str, bytes, bytearray)):
                fdata = fp
            else:
                fdata = fp.read()

            rf = RequestField(name=k, data=fdata, filename=fn, headers=fh)
            rf.make_multipart(content_type=ft)
            new_fields.append(rf)

        body, content_type = encode_multipart_formdata(new_fields)

        return body, content_type
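
    # --- Editor's illustrative sketch (not part of the upstream class) ---
    # _encode_files() backs the public `files=` parameter: each entry may be a
    # bare file object or a (filename, fileobj[, content_type[, headers]]) tuple.
    @staticmethod
    def _example_encode_files():
        files = {'upload': ('report.csv', b'a,b\n1,2\n', 'text/csv')}
        body, content_type = RequestEncodingMixin._encode_files(files, {'kind': 'report'})
        # content_type looks like 'multipart/form-data; boundary=...'
        return content_type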


class RequestHooksMixin(object):
    def register_hook(self, event, hook):
        """Properly register a hook."""

        if event not in self.hooks:
            raise ValueError('Unsupported event specified, with event name "%s"' % (event))

        if isinstance(hook, collections.Callable):
            self.hooks[event].append(hook)
        elif hasattr(hook, '__iter__'):
            self.hooks[event].extend(h for h in hook if isinstance(h, collections.Callable))

    def deregister_hook(self, event, hook):
        """Deregister a previously registered hook.
        Returns True if the hook existed, False if not.
        """

        try:
            self.hooks[event].remove(hook)
            return True
        except ValueError:
            return False
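
    # --- Editor's illustrative sketch (not part of the upstream class) ---
    # Only the 'response' event exists by default (see hooks.default_hooks()).
    # A callable, or an iterable of callables, can be registered per event:
    def _example_register(self):
        def log_status(response, **kwargs):   # hook signature used by requests
            print(response.status_code, response.url)

        self.register_hook('response', log_status)
        return self.hooks['response']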


class Request(RequestHooksMixin):
    """A user-created :class:`Request <Request>` object.

    Used to prepare a :class:`PreparedRequest <PreparedRequest>`, which is sent to the server.

    :param method: HTTP method to use.
    :param url: URL to send.
    :param headers: dictionary of headers to send.
    :param files: dictionary of {filename: fileobject} files to multipart upload.
    :param data: the body to attach to the request. If a dictionary is provided, form-encoding will take place.
    :param json: json for the body to attach to the request (if files or data is not specified).
    :param params: dictionary of URL parameters to append to the URL.
    :param auth: Auth handler or (user, pass) tuple.
    :param cookies: dictionary or CookieJar of cookies to attach to this request.
    :param hooks: dictionary of callback hooks, for internal usage.

    Usage::

      >>> import requests
      >>> req = requests.Request('GET', 'http://httpbin.org/get')
      >>> req.prepare()
      <PreparedRequest [GET]>
    """

    def __init__(self,
            method=None, url=None, headers=None, files=None, data=None,
            params=None, auth=None, cookies=None, hooks=None, json=None):

        # Default empty dicts for dict params.
        data = [] if data is None else data
        files = [] if files is None else files
        headers = {} if headers is None else headers
        params = {} if params is None else params
        hooks = {} if hooks is None else hooks

        self.hooks = default_hooks()
        for (k, v) in list(hooks.items()):
            self.register_hook(event=k, hook=v)

        self.method = method
        self.url = url
        self.headers = headers
        self.files = files
        self.data = data
        self.json = json
        self.params = params
        self.auth = auth
        self.cookies = cookies

    def __repr__(self):
        return '<Request [%s]>' % (self.method)

    def prepare(self):
        """Constructs a :class:`PreparedRequest <PreparedRequest>` for transmission and returns it."""
        p = PreparedRequest()
        p.prepare(
            method=self.method,
            url=self.url,
            headers=self.headers,
            files=self.files,
            data=self.data,
            json=self.json,
            params=self.params,
            auth=self.auth,
            cookies=self.cookies,
            hooks=self.hooks,
        )
        return p


class PreparedRequest(RequestEncodingMixin, RequestHooksMixin):
    """The fully mutable :class:`PreparedRequest <PreparedRequest>` object,
    containing the exact bytes that will be sent to the server.

    Generated from either a :class:`Request <Request>` object or manually.

    Usage::

      >>> import requests
      >>> req = requests.Request('GET', 'http://httpbin.org/get')
      >>> r = req.prepare()
      <PreparedRequest [GET]>

      >>> s = requests.Session()
      >>> s.send(r)
      <Response [200]>
    """

    def __init__(self):
        #: HTTP verb to send to the server.
        self.method = None
        #: HTTP URL to send the request to.
        self.url = None
        #: dictionary of HTTP headers.
        self.headers = None
        # The `CookieJar` used to create the Cookie header will be stored here
        # after prepare_cookies is called
        self._cookies = None
        #: request body to send to the server.
        self.body = None
        #: dictionary of callback hooks, for internal usage.
        self.hooks = default_hooks()
        #: integer denoting starting position of a readable file-like body.
        self._body_position = None

    def prepare(self,
            method=None, url=None, headers=None, files=None, data=None,
            params=None, auth=None, cookies=None, hooks=None, json=None):
        """Prepares the entire request with the given parameters."""

        self.prepare_method(method)
        self.prepare_url(url, params)
        self.prepare_headers(headers)
        self.prepare_cookies(cookies)
        self.prepare_body(data, files, json)
        self.prepare_auth(auth, url)

        # Note that prepare_auth must be last to enable authentication schemes
        # such as OAuth to work on a fully prepared request.

        # This MUST go after prepare_auth. Authenticators could add a hook
        self.prepare_hooks(hooks)

    def __repr__(self):
        return '<PreparedRequest [%s]>' % (self.method)

    def copy(self):
        p = PreparedRequest()
        p.method = self.method
        p.url = self.url
        p.headers = self.headers.copy() if self.headers is not None else None
        p._cookies = _copy_cookie_jar(self._cookies)
        p.body = self.body
        p.hooks = self.hooks
        p._body_position = self._body_position
        return p

    def prepare_method(self, method):
        """Prepares the given HTTP method."""
        self.method = method
        if self.method is not None:
            self.method = to_native_string(self.method.upper())

    @staticmethod
    def _get_idna_encoded_host(host):
        import idna

        try:
            host = idna.encode(host, uts46=True).decode('utf-8')
        except idna.IDNAError:
            raise UnicodeError
        return host

    def prepare_url(self, url, params):
        """Prepares the given HTTP URL."""
        #: Accept objects that have string representations.
        #: We're unable to blindly call unicode/str functions
        #: as this will include the bytestring indicator (b'')
        #: on python 3.x.
        #: https://github.com/requests/requests/pull/2238
        if isinstance(url, bytes):
            url = url.decode('utf8')
        else:
            url = unicode(url) if is_py2 else str(url)

        # Remove leading whitespaces from url
        url = url.lstrip()

        # Don't do any URL preparation for non-HTTP schemes like `mailto`,
        # `data` etc to work around exceptions from `url_parse`, which
        # handles RFC 3986 only.
        if ':' in url and not url.lower().startswith('http'):
            self.url = url
            return

        # Support for unicode domain names and paths.
        try:
            scheme, auth, host, port, path, query, fragment = parse_url(url)
        except LocationParseError as e:
            raise InvalidURL(*e.args)

        if not scheme:
            error = ("Invalid URL {0!r}: No schema supplied. Perhaps you meant http://{0}?")
            error = error.format(to_native_string(url, 'utf8'))

            raise MissingSchema(error)

        if not host:
            raise InvalidURL("Invalid URL %r: No host supplied" % url)

        # In general, we want to try IDNA encoding the hostname if the string contains
        # non-ASCII characters. This allows users to automatically get the correct IDNA
        # behaviour. For strings containing only ASCII characters, we need to also verify
        # it doesn't start with a wildcard (*), before allowing the unencoded hostname.
        if not unicode_is_ascii(host):
            try:
                host = self._get_idna_encoded_host(host)
            except UnicodeError:
                raise InvalidURL('URL has an invalid label.')
        elif host.startswith(u'*'):
            raise InvalidURL('URL has an invalid label.')

        # Carefully reconstruct the network location
        netloc = auth or ''
        if netloc:
            netloc += '@'
        netloc += host
        if port:
            netloc += ':' + str(port)

        # Bare domains aren't valid URLs.
        if not path:
            path = '/'

        if is_py2:
            if isinstance(scheme, str):
                scheme = scheme.encode('utf-8')
            if isinstance(netloc, str):
                netloc = netloc.encode('utf-8')
            if isinstance(path, str):
                path = path.encode('utf-8')
            if isinstance(query, str):
                query = query.encode('utf-8')
            if isinstance(fragment, str):
                fragment = fragment.encode('utf-8')

        if isinstance(params, (str, bytes)):
            params = to_native_string(params)

        enc_params = self._encode_params(params)
        if enc_params:
            if query:
                query = '%s&%s' % (query, enc_params)
            else:
                query = enc_params

        url = requote_uri(urlunparse([scheme, netloc, path, None, query, fragment]))
        self.url = url

    def prepare_headers(self, headers):
        """Prepares the given HTTP headers."""

        self.headers = CaseInsensitiveDict()
        if headers:
            for header in headers.items():
                # Raise exception on invalid header value.
                check_header_validity(header)
                name, value = header
                self.headers[to_native_string(name)] = value

    def prepare_body(self, data, files, json=None):
        """Prepares the given HTTP body data."""

        # Check if file, fo, generator, iterator.
        # If not, run through normal process.

        # Nottin' on you.
        body = None
        content_type = None

        if not data and json is not None:
            # urllib3 requires a bytes-like body. Python 2's json.dumps
            # provides this natively, but Python 3 gives a Unicode string.
            content_type = 'application/json'
            body = complexjson.dumps(json)
            if not isinstance(body, bytes):
                body = body.encode('utf-8')

        is_stream = all([
            hasattr(data, '__iter__'),
            not isinstance(data, (basestring, list, tuple, collections.Mapping))
        ])

        try:
            length = super_len(data)
        except (TypeError, AttributeError, UnsupportedOperation):
            length = None

        if is_stream:
            body = data

            if getattr(body, 'tell', None) is not None:
                # Record the current file position before reading.
                # This will allow us to rewind a file in the event
                # of a redirect.
                try:
                    self._body_position = body.tell()
                except (IOError, OSError):
                    # This differentiates from None, allowing us to catch
                    # a failed `tell()` later when trying to rewind the body
                    self._body_position = object()

            if files:
                raise NotImplementedError('Streamed bodies and files are mutually exclusive.')

            if length:
                self.headers['Content-Length'] = builtin_str(length)
            else:
                self.headers['Transfer-Encoding'] = 'chunked'
        else:
            # Multi-part file uploads.
            if files:
                (body, content_type) = self._encode_files(files, data)
            else:
                if data:
                    body = self._encode_params(data)
                    if isinstance(data, basestring) or hasattr(data, 'read'):
                        content_type = None
                    else:
                        content_type = 'application/x-www-form-urlencoded'

            self.prepare_content_length(body)

            # Add content-type if it wasn't explicitly provided.
            if content_type and ('content-type' not in self.headers):
                self.headers['Content-Type'] = content_type

        self.body = body

    def prepare_content_length(self, body):
        """Prepare Content-Length header based on request method and body"""
        if body is not None:
            length = super_len(body)
            if length:
                # If length exists, set it. Otherwise, we fallback
                # to Transfer-Encoding: chunked.
                self.headers['Content-Length'] = builtin_str(length)
        elif self.method not in ('GET', 'HEAD') and self.headers.get('Content-Length') is None:
            # Set Content-Length to 0 for methods that can have a body
            # but don't provide one. (i.e. not GET or HEAD)
            self.headers['Content-Length'] = '0'

    def prepare_auth(self, auth, url=''):
        """Prepares the given HTTP auth data."""

        # If no Auth is explicitly provided, extract it from the URL first.
        if auth is None:
            url_auth = get_auth_from_url(self.url)
            auth = url_auth if any(url_auth) else None

        if auth:
            if isinstance(auth, tuple) and len(auth) == 2:
                # special-case basic HTTP auth
                auth = HTTPBasicAuth(*auth)

            # Allow auth to make its changes.
            r = auth(self)

            # Update self to reflect the auth changes.
            self.__dict__.update(r.__dict__)

            # Recompute Content-Length
            self.prepare_content_length(self.body)

    def prepare_cookies(self, cookies):
        """Prepares the given HTTP cookie data.

        This function eventually generates a ``Cookie`` header from the
        given cookies using cookielib. Due to cookielib's design, the header
        will not be regenerated if it already exists, meaning this function
        can only be called once for the life of the
        :class:`PreparedRequest <PreparedRequest>` object. Any subsequent calls
        to ``prepare_cookies`` will have no actual effect, unless the "Cookie"
        header is removed beforehand.
        """
        if isinstance(cookies, cookielib.CookieJar):
            self._cookies = cookies
        else:
            self._cookies = cookiejar_from_dict(cookies)

        cookie_header = get_cookie_header(self._cookies, self)
        if cookie_header is not None:
            self.headers['Cookie'] = cookie_header

    def prepare_hooks(self, hooks):
        """Prepares the given hooks."""
        # hooks can be passed as None to the prepare method and to this
        # method. To prevent iterating over None, simply use an empty list
        # if hooks is False-y
        hooks = hooks or []
        for event in hooks:
            self.register_hook(event, hooks[event])
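
    # --- Editor's illustrative sketch (not part of the upstream class) ---
    # The usual flow: build a Request, call .prepare() (or Session.prepare_request
    # to merge session state), tweak the prepared bytes, then pass it to
    # Session.send(). Assumes the standalone `requests` package and a reachable URL.
    @staticmethod
    def _example_prepared_flow():
        import requests
        req = requests.Request('POST', 'http://httpbin.org/post', json={'x': 1})
        prepped = req.prepare()
        prepped.headers['X-Trace'] = 'demo'   # edit the exact bytes before sending
        with requests.Session() as s:
            resp = s.send(prepped, timeout=5)
        return resp.status_code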


class Response(object):
    """The :class:`Response <Response>` object, which contains a
    server's response to an HTTP request.
    """

    __attrs__ = [
        '_content', 'status_code', 'headers', 'url', 'history',
        'encoding', 'reason', 'cookies', 'elapsed', 'request'
    ]

    def __init__(self):
        self._content = False
        self._content_consumed = False
        self._next = None

        #: Integer Code of responded HTTP Status, e.g. 404 or 200.
        self.status_code = None

        #: Case-insensitive Dictionary of Response Headers.
        #: For example, ``headers['content-encoding']`` will return the
        #: value of a ``'Content-Encoding'`` response header.
        self.headers = CaseInsensitiveDict()

        #: File-like object representation of response (for advanced usage).
        #: Use of ``raw`` requires that ``stream=True`` be set on the request.
        # This requirement does not apply for use internally to Requests.
        self.raw = None

        #: Final URL location of Response.
        self.url = None

        #: Encoding to decode with when accessing r.text.
        self.encoding = None

        #: A list of :class:`Response <Response>` objects from
        #: the history of the Request. Any redirect responses will end
        #: up here. The list is sorted from the oldest to the most recent request.
        self.history = []

        #: Textual reason of responded HTTP Status, e.g. "Not Found" or "OK".
        self.reason = None

        #: A CookieJar of Cookies the server sent back.
        self.cookies = cookiejar_from_dict({})

        #: The amount of time elapsed between sending the request
        #: and the arrival of the response (as a timedelta).
        #: This property specifically measures the time taken between sending
        #: the first byte of the request and finishing parsing the headers. It
        #: is therefore unaffected by consuming the response content or the
        #: value of the ``stream`` keyword argument.
        self.elapsed = datetime.timedelta(0)

        #: The :class:`PreparedRequest <PreparedRequest>` object to which this
        #: is a response.
        self.request = None

    def __enter__(self):
        return self

    def __exit__(self, *args):
        self.close()

    def __getstate__(self):
        # Consume everything; accessing the content attribute makes
        # sure the content has been fully read.
        if not self._content_consumed:
            self.content

        return dict(
            (attr, getattr(self, attr, None))
            for attr in self.__attrs__
        )

    def __setstate__(self, state):
        for name, value in state.items():
            setattr(self, name, value)

        # pickled objects do not have .raw
        setattr(self, '_content_consumed', True)
        setattr(self, 'raw', None)

    def __repr__(self):
        return '<Response [%s]>' % (self.status_code)

    def __bool__(self):
        """Returns True if :attr:`status_code` is less than 400.

        This attribute checks if the status code of the response is between
        400 and 600 to see if there was a client error or a server error. If
        the status code is between 200 and 400, this will return True. This
        is **not** a check to see if the response code is ``200 OK``.
        """
        return self.ok

    def __nonzero__(self):
        """Returns True if :attr:`status_code` is less than 400.

        This attribute checks if the status code of the response is between
        400 and 600 to see if there was a client error or a server error. If
        the status code is between 200 and 400, this will return True. This
        is **not** a check to see if the response code is ``200 OK``.
        """
        return self.ok

    def __iter__(self):
        """Allows you to use a response as an iterator."""
        return self.iter_content(128)

    @property
    def ok(self):
        """Returns True if :attr:`status_code` is less than 400.

        This attribute checks if the status code of the response is between
        400 and 600 to see if there was a client error or a server error. If
        the status code is between 200 and 400, this will return True. This
        is **not** a check to see if the response code is ``200 OK``.
        """
        try:
            self.raise_for_status()
        except HTTPError:
            return False
        return True

    @property
    def is_redirect(self):
        """True if this Response is a well-formed HTTP redirect that could have
        been processed automatically (by :meth:`Session.resolve_redirects`).
        """
        return ('location' in self.headers and self.status_code in REDIRECT_STATI)

    @property
    def is_permanent_redirect(self):
        """True if this Response one of the permanent versions of redirect."""
        return ('location' in self.headers and self.status_code in (codes.moved_permanently, codes.permanent_redirect))

    @property
    def next(self):
        """Returns a PreparedRequest for the next request in a redirect chain, if there is one."""
        return self._next

    @property
    def apparent_encoding(self):
        """The apparent encoding, provided by the chardet library."""
        return chardet.detect(self.content)['encoding']

    def iter_content(self, chunk_size=1, decode_unicode=False):
        """Iterates over the response data.  When stream=True is set on the
        request, this avoids reading the content at once into memory for
        large responses.  The chunk size is the number of bytes it should
        read into memory.  This is not necessarily the length of each item
        returned as decoding can take place.

        chunk_size must be of type int or None. A value of None will
        function differently depending on the value of `stream`.
        stream=True will read data as it arrives in whatever size the
        chunks are received. If stream=False, data is returned as
        a single chunk.

        If decode_unicode is True, content will be decoded using the best
        available encoding based on the response.
        """

        def generate():
            # Special case for urllib3.
            if hasattr(self.raw, 'stream'):
                try:
                    for chunk in self.raw.stream(chunk_size, decode_content=True):
                        yield chunk
                except ProtocolError as e:
                    raise ChunkedEncodingError(e)
                except DecodeError as e:
                    raise ContentDecodingError(e)
                except ReadTimeoutError as e:
                    raise ConnectionError(e)
            else:
                # Standard file-like object.
                while True:
                    chunk = self.raw.read(chunk_size)
                    if not chunk:
                        break
                    yield chunk

            self._content_consumed = True

        if self._content_consumed and isinstance(self._content, bool):
            raise StreamConsumedError()
        elif chunk_size is not None and not isinstance(chunk_size, int):
            raise TypeError("chunk_size must be an int, it is instead a %s." % type(chunk_size))
        # simulate reading small chunks of the content
        reused_chunks = iter_slices(self._content, chunk_size)

        stream_chunks = generate()

        chunks = reused_chunks if self._content_consumed else stream_chunks

        if decode_unicode:
            chunks = stream_decode_response_unicode(chunks, self)

        return chunks
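    # Hedged usage sketch (not part of the original module): streaming a large
    # download in fixed-size chunks; the URL and file name are placeholders.
    #
    #     import requests
    #     resp = requests.get('https://example.com/big.bin', stream=True)
    #     with open('big.bin', 'wb') as fh:
    #         for chunk in resp.iter_content(chunk_size=8192):
    #             if chunk:          # skip keep-alive chunks
    #                 fh.write(chunk)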

    def iter_lines(self, chunk_size=ITER_CHUNK_SIZE, decode_unicode=None, delimiter=None):
        """Iterates over the response data, one line at a time.  When
        stream=True is set on the request, this avoids reading the
        content at once into memory for large responses.

        .. note:: This method is not reentrant safe.
        """

        pending = None

        for chunk in self.iter_content(chunk_size=chunk_size, decode_unicode=decode_unicode):

            if pending is not None:
                chunk = pending + chunk

            if delimiter:
                lines = chunk.split(delimiter)
            else:
                lines = chunk.splitlines()

            if lines and lines[-1] and chunk and lines[-1][-1] == chunk[-1]:
                pending = lines.pop()
            else:
                pending = None

            for line in lines:
                yield line

        if pending is not None:
            yield pending
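    # Hedged usage sketch (assumption: the endpoint streams newline-delimited
    # records; the URL is a placeholder):
    #
    #     import requests
    #     resp = requests.get('https://example.com/events', stream=True)
    #     for line in resp.iter_lines(decode_unicode=True):
    #         if line:               # filter out keep-alive newlines
    #             print(line)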

    @property
    def content(self):
        """Content of the response, in bytes."""

        if self._content is False:
            # Read the contents.
            if self._content_consumed:
                raise RuntimeError(
                    'The content for this response was already consumed')

            if self.status_code == 0 or self.raw is None:
                self._content = None
            else:
                self._content = bytes().join(self.iter_content(CONTENT_CHUNK_SIZE)) or bytes()

        self._content_consumed = True
        # don't need to release the connection; that's been handled by urllib3
        # since we exhausted the data.
        return self._content

    @property
    def text(self):
        """Content of the response, in unicode.

        If Response.encoding is None, encoding will be guessed using
        ``chardet``.

        The encoding of the response content is determined based solely on HTTP
        headers, following RFC 2616 to the letter. If you can take advantage of
        non-HTTP knowledge to make a better guess at the encoding, you should
        set ``r.encoding`` appropriately before accessing this property.
        """

        # Try charset from content-type
        content = None
        encoding = self.encoding

        if not self.content:
            return str('')

        # Fallback to auto-detected encoding.
        if self.encoding is None:
            encoding = self.apparent_encoding

        # Decode unicode from given encoding.
        try:
            content = str(self.content, encoding, errors='replace')
        except (LookupError, TypeError):
            # A LookupError is raised if the encoding was not found which could
            # indicate a misspelling or similar mistake.
            #
            # A TypeError can be raised if encoding is None
            #
            # So we try blindly encoding.
            content = str(self.content, errors='replace')

        return content

    def json(self, **kwargs):
        r"""Returns the json-encoded content of a response, if any.

        :param \*\*kwargs: Optional arguments that ``json.loads`` takes.
        :raises ValueError: If the response body does not contain valid json.
        """

        if not self.encoding and self.content and len(self.content) > 3:
            # No encoding set. JSON RFC 4627 section 3 states we should expect
            # UTF-8, -16 or -32. Detect which one to use; If the detection or
            # decoding fails, fall back to `self.text` (using chardet to make
            # a best guess).
            encoding = guess_json_utf(self.content)
            if encoding is not None:
                try:
                    return complexjson.loads(
                        self.content.decode(encoding), **kwargs
                    )
                except UnicodeDecodeError:
                    # Wrong UTF codec detected; usually because it's not UTF-8
                    # but some other 8-bit codec.  This is an RFC violation,
                    # and the server didn't bother to tell us what codec *was*
                    # used.
                    pass
        return complexjson.loads(self.text, **kwargs)

    @property
    def links(self):
        """Returns the parsed header links of the response, if any."""

        header = self.headers.get('link')

        # l = MultiDict()
        l = {}

        if header:
            links = parse_header_links(header)

            for link in links:
                key = link.get('rel') or link.get('url')
                l[key] = link

        return l
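    # Hedged usage sketch: following pagination links exposed via the ``links``
    # property (the GitHub URL is only an example of an API that sends a
    # ``Link`` header):
    #
    #     import requests
    #     resp = requests.get('https://api.github.com/events')
    #     next_page = resp.links.get('next', {}).get('url')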

    def raise_for_status(self):
        """Raises stored :class:`HTTPError`, if one occurred."""

        http_error_msg = ''
        if isinstance(self.reason, bytes):
            # We attempt to decode utf-8 first because some servers
            # choose to localize their reason strings. If the string
            # isn't utf-8, we fall back to iso-8859-1 for all other
            # encodings. (See PR #3538)
            try:
                reason = self.reason.decode('utf-8')
            except UnicodeDecodeError:
                reason = self.reason.decode('iso-8859-1')
        else:
            reason = self.reason

        if 400 <= self.status_code < 500:
            http_error_msg = u'%s Client Error: %s for url: %s' % (self.status_code, reason, self.url)

        elif 500 <= self.status_code < 600:
            http_error_msg = u'%s Server Error: %s for url: %s' % (self.status_code, reason, self.url)

        if http_error_msg:
            raise HTTPError(http_error_msg, response=self)
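
    # Hedged usage sketch: turning HTTP error statuses into exceptions
    # (``url`` is a placeholder):
    #
    #     import requests
    #     try:
    #         resp = requests.get(url)
    #         resp.raise_for_status()
    #     except requests.exceptions.HTTPError as err:
    #         print('request failed: %s' % err)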

    def close(self):
        """Releases the connection back to the pool. Once this method has been
        called the underlying ``raw`` object must not be accessed again.

        *Note: Should not normally need to be called explicitly.*
        """
        if not self._content_consumed:
            self.raw.close()

        release_conn = getattr(self.raw, 'release_conn', None)
        if release_conn is not None:
            release_conn()
# site-packages/pip/_vendor/requests/auth.py
# -*- coding: utf-8 -*-

"""
requests.auth
~~~~~~~~~~~~~

This module contains the authentication handlers for Requests.
"""

import os
import re
import time
import hashlib
import threading
import warnings

from base64 import b64encode

from .compat import urlparse, str, basestring
from .cookies import extract_cookies_to_jar
from ._internal_utils import to_native_string
from .utils import parse_dict_header

CONTENT_TYPE_FORM_URLENCODED = 'application/x-www-form-urlencoded'
CONTENT_TYPE_MULTI_PART = 'multipart/form-data'


def _basic_auth_str(username, password):
    """Returns a Basic Auth string."""

    # "I want us to put a big-ol' comment on top of it that
    # says that this behaviour is dumb but we need to preserve
    # it because people are relying on it."
    #    - Lukasa
    #
    # These are here solely to maintain backwards compatibility
    # for things like ints. This will be removed in 3.0.0.
    if not isinstance(username, basestring):
        warnings.warn(
            "Non-string usernames will no longer be supported in Requests "
            "3.0.0. Please convert the object you've passed in ({0!r}) to "
            "a string or bytes object in the near future to avoid "
            "problems.".format(username),
            category=DeprecationWarning,
        )
        username = str(username)

    if not isinstance(password, basestring):
        warnings.warn(
            "Non-string passwords will no longer be supported in Requests "
            "3.0.0. Please convert the object you've passed in ({0!r}) to "
            "a string or bytes object in the near future to avoid "
            "problems.".format(password),
            category=DeprecationWarning,
        )
        password = str(password)
    # -- End Removal --

    if isinstance(username, str):
        username = username.encode('latin1')

    if isinstance(password, str):
        password = password.encode('latin1')

    authstr = 'Basic ' + to_native_string(
        b64encode(b':'.join((username, password))).strip()
    )

    return authstr
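# Hedged illustration (not part of the original module): the value produced is
# 'Basic ' plus the base64 of the colon-joined credentials, e.g.
#
#     _basic_auth_str('user', 'pass')  ->  'Basic dXNlcjpwYXNz'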


class AuthBase(object):
    """Base class that all auth implementations derive from"""

    def __call__(self, r):
        raise NotImplementedError('Auth hooks must be callable.')


class HTTPBasicAuth(AuthBase):
    """Attaches HTTP Basic Authentication to the given Request object."""

    def __init__(self, username, password):
        self.username = username
        self.password = password

    def __eq__(self, other):
        return all([
            self.username == getattr(other, 'username', None),
            self.password == getattr(other, 'password', None)
        ])

    def __ne__(self, other):
        return not self == other

    def __call__(self, r):
        r.headers['Authorization'] = _basic_auth_str(self.username, self.password)
        return r
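# Hedged usage sketch (the URL is a placeholder): Basic auth can be attached
# explicitly, or via the shorthand two-element tuple, which Requests treats as
# basic auth credentials:
#
#     import requests
#     requests.get('https://example.com/protected', auth=HTTPBasicAuth('user', 'pass'))
#     requests.get('https://example.com/protected', auth=('user', 'pass'))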


class HTTPProxyAuth(HTTPBasicAuth):
    """Attaches HTTP Proxy Authentication to a given Request object."""

    def __call__(self, r):
        r.headers['Proxy-Authorization'] = _basic_auth_str(self.username, self.password)
        return r


class HTTPDigestAuth(AuthBase):
    """Attaches HTTP Digest Authentication to the given Request object."""

    def __init__(self, username, password):
        self.username = username
        self.password = password
        # Keep state in per-thread local storage
        self._thread_local = threading.local()

    def init_per_thread_state(self):
        # Ensure state is initialized just once per-thread
        if not hasattr(self._thread_local, 'init'):
            self._thread_local.init = True
            self._thread_local.last_nonce = ''
            self._thread_local.nonce_count = 0
            self._thread_local.chal = {}
            self._thread_local.pos = None
            self._thread_local.num_401_calls = None

    def build_digest_header(self, method, url):
        """
        :rtype: str
        """

        realm = self._thread_local.chal['realm']
        nonce = self._thread_local.chal['nonce']
        qop = self._thread_local.chal.get('qop')
        algorithm = self._thread_local.chal.get('algorithm')
        opaque = self._thread_local.chal.get('opaque')
        hash_utf8 = None

        if algorithm is None:
            _algorithm = 'MD5'
        else:
            _algorithm = algorithm.upper()
        # lambdas assume digest modules are imported at the top level
        if _algorithm == 'MD5' or _algorithm == 'MD5-SESS':
            def md5_utf8(x):
                if isinstance(x, str):
                    x = x.encode('utf-8')
                return hashlib.md5(x).hexdigest()
            hash_utf8 = md5_utf8
        elif _algorithm == 'SHA':
            def sha_utf8(x):
                if isinstance(x, str):
                    x = x.encode('utf-8')
                return hashlib.sha1(x).hexdigest()
            hash_utf8 = sha_utf8

        KD = lambda s, d: hash_utf8("%s:%s" % (s, d))

        if hash_utf8 is None:
            return None

        # XXX not implemented yet
        entdig = None
        p_parsed = urlparse(url)
        #: path is request-uri defined in RFC 2616 which should not be empty
        path = p_parsed.path or "/"
        if p_parsed.query:
            path += '?' + p_parsed.query

        A1 = '%s:%s:%s' % (self.username, realm, self.password)
        A2 = '%s:%s' % (method, path)

        HA1 = hash_utf8(A1)
        HA2 = hash_utf8(A2)

        if nonce == self._thread_local.last_nonce:
            self._thread_local.nonce_count += 1
        else:
            self._thread_local.nonce_count = 1
        ncvalue = '%08x' % self._thread_local.nonce_count
        s = str(self._thread_local.nonce_count).encode('utf-8')
        s += nonce.encode('utf-8')
        s += time.ctime().encode('utf-8')
        s += os.urandom(8)

        cnonce = (hashlib.sha1(s).hexdigest()[:16])
        if _algorithm == 'MD5-SESS':
            HA1 = hash_utf8('%s:%s:%s' % (HA1, nonce, cnonce))

        if not qop:
            respdig = KD(HA1, "%s:%s" % (nonce, HA2))
        elif qop == 'auth' or 'auth' in qop.split(','):
            noncebit = "%s:%s:%s:%s:%s" % (
                nonce, ncvalue, cnonce, 'auth', HA2
            )
            respdig = KD(HA1, noncebit)
        else:
            # XXX handle auth-int.
            return None

        self._thread_local.last_nonce = nonce

        # XXX should the partial digests be encoded too?
        base = 'username="%s", realm="%s", nonce="%s", uri="%s", ' \
               'response="%s"' % (self.username, realm, nonce, path, respdig)
        if opaque:
            base += ', opaque="%s"' % opaque
        if algorithm:
            base += ', algorithm="%s"' % algorithm
        if entdig:
            base += ', digest="%s"' % entdig
        if qop:
            base += ', qop="auth", nc=%s, cnonce="%s"' % (ncvalue, cnonce)

        return 'Digest %s' % (base)
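    # For reference, a restatement of the qop="auth" computation performed
    # above (H is MD5 or SHA-1, depending on the negotiated algorithm):
    #
    #     HA1      = H(username ":" realm ":" password)
    #     HA2      = H(method ":" digest-uri)
    #     response = H(HA1 ":" nonce ":" nc ":" cnonce ":" "auth" ":" HA2)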

    def handle_redirect(self, r, **kwargs):
        """Reset num_401_calls counter on redirects."""
        if r.is_redirect:
            self._thread_local.num_401_calls = 1

    def handle_401(self, r, **kwargs):
        """
        Takes the given response and tries digest-auth, if needed.

        :rtype: requests.Response
        """

        # If response is not 4xx, do not auth
        # See https://github.com/requests/requests/issues/3772
        if not 400 <= r.status_code < 500:
            self._thread_local.num_401_calls = 1
            return r

        if self._thread_local.pos is not None:
            # Rewind the file position indicator of the body to where
            # it was to resend the request.
            r.request.body.seek(self._thread_local.pos)
        s_auth = r.headers.get('www-authenticate', '')

        if 'digest' in s_auth.lower() and self._thread_local.num_401_calls < 2:

            self._thread_local.num_401_calls += 1
            pat = re.compile(r'digest ', flags=re.IGNORECASE)
            self._thread_local.chal = parse_dict_header(pat.sub('', s_auth, count=1))

            # Consume content and release the original connection
            # to allow our new request to reuse the same one.
            r.content
            r.close()
            prep = r.request.copy()
            extract_cookies_to_jar(prep._cookies, r.request, r.raw)
            prep.prepare_cookies(prep._cookies)

            prep.headers['Authorization'] = self.build_digest_header(
                prep.method, prep.url)
            _r = r.connection.send(prep, **kwargs)
            _r.history.append(r)
            _r.request = prep

            return _r

        self._thread_local.num_401_calls = 1
        return r

    def __call__(self, r):
        # Initialize per-thread state, if needed
        self.init_per_thread_state()
        # If we have a saved nonce, skip the 401
        if self._thread_local.last_nonce:
            r.headers['Authorization'] = self.build_digest_header(r.method, r.url)
        try:
            self._thread_local.pos = r.body.tell()
        except AttributeError:
            # In the case of HTTPDigestAuth being reused and the body of
            # the previous request was a file-like object, pos has the
            # file position of the previous body. Ensure it's set to
            # None.
            self._thread_local.pos = None
        r.register_hook('response', self.handle_401)
        r.register_hook('response', self.handle_redirect)
        self._thread_local.num_401_calls = 1

        return r

    def __eq__(self, other):
        return all([
            self.username == getattr(other, 'username', None),
            self.password == getattr(other, 'password', None)
        ])

    def __ne__(self, other):
        return not self == other
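# Hedged usage sketch (the URL is a placeholder): Digest auth is attached the
# same way as Basic auth; the 401 challenge/response round trip is handled by
# the hooks registered in __call__ above:
#
#     import requests
#     requests.get('https://example.com/protected', auth=HTTPDigestAuth('user', 'pass'))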
# site-packages/pip/_vendor/requests/api.py
# -*- coding: utf-8 -*-

"""
requests.api
~~~~~~~~~~~~

This module implements the Requests API.

:copyright: (c) 2012 by Kenneth Reitz.
:license: Apache2, see LICENSE for more details.
"""

from . import sessions


def request(method, url, **kwargs):
    """Constructs and sends a :class:`Request <Request>`.

    :param method: method for the new :class:`Request` object.
    :param url: URL for the new :class:`Request` object.
    :param params: (optional) Dictionary or bytes to be sent in the query string for the :class:`Request`.
    :param data: (optional) Dictionary or list of tuples ``[(key, value)]`` (will be form-encoded), bytes, or file-like object to send in the body of the :class:`Request`.
    :param json: (optional) json data to send in the body of the :class:`Request`.
    :param headers: (optional) Dictionary of HTTP Headers to send with the :class:`Request`.
    :param cookies: (optional) Dict or CookieJar object to send with the :class:`Request`.
    :param files: (optional) Dictionary of ``'name': file-like-objects`` (or ``{'name': file-tuple}``) for multipart encoding upload.
        ``file-tuple`` can be a 2-tuple ``('filename', fileobj)``, 3-tuple ``('filename', fileobj, 'content_type')``
        or a 4-tuple ``('filename', fileobj, 'content_type', custom_headers)``, where ``'content-type'`` is a string
        defining the content type of the given file and ``custom_headers`` a dict-like object containing additional headers
        to add for the file.
    :param auth: (optional) Auth tuple to enable Basic/Digest/Custom HTTP Auth.
    :param timeout: (optional) How many seconds to wait for the server to send data
        before giving up, as a float, or a :ref:`(connect timeout, read
        timeout) <timeouts>` tuple.
    :type timeout: float or tuple
    :param allow_redirects: (optional) Boolean. Enable/disable GET/OPTIONS/POST/PUT/PATCH/DELETE/HEAD redirection. Defaults to ``True``.
    :type allow_redirects: bool
    :param proxies: (optional) Dictionary mapping protocol to the URL of the proxy.
    :param verify: (optional) Either a boolean, in which case it controls whether we verify
            the server's TLS certificate, or a string, in which case it must be a path
            to a CA bundle to use. Defaults to ``True``.
    :param stream: (optional) if ``False``, the response content will be immediately downloaded.
    :param cert: (optional) if String, path to ssl client cert file (.pem). If Tuple, ('cert', 'key') pair.
    :return: :class:`Response <Response>` object
    :rtype: requests.Response

    Usage::

      >>> import requests
      >>> req = requests.request('GET', 'http://httpbin.org/get')
      >>> req
      <Response [200]>
    """

    # By using the 'with' statement we are sure the session is closed, thus we
    # avoid leaving sockets open which can trigger a ResourceWarning in some
    # cases, and look like a memory leak in others.
    with sessions.Session() as session:
        return session.request(method=method, url=url, **kwargs)


def get(url, params=None, **kwargs):
    r"""Sends a GET request.

    :param url: URL for the new :class:`Request` object.
    :param params: (optional) Dictionary or bytes to be sent in the query string for the :class:`Request`.
    :param \*\*kwargs: Optional arguments that ``request`` takes.
    :return: :class:`Response <Response>` object
    :rtype: requests.Response
    """

    kwargs.setdefault('allow_redirects', True)
    return request('get', url, params=params, **kwargs)


def options(url, **kwargs):
    r"""Sends an OPTIONS request.

    :param url: URL for the new :class:`Request` object.
    :param \*\*kwargs: Optional arguments that ``request`` takes.
    :return: :class:`Response <Response>` object
    :rtype: requests.Response
    """

    kwargs.setdefault('allow_redirects', True)
    return request('options', url, **kwargs)


def head(url, **kwargs):
    r"""Sends a HEAD request.

    :param url: URL for the new :class:`Request` object.
    :param \*\*kwargs: Optional arguments that ``request`` takes.
    :return: :class:`Response <Response>` object
    :rtype: requests.Response
    """

    kwargs.setdefault('allow_redirects', False)
    return request('head', url, **kwargs)


def post(url, data=None, json=None, **kwargs):
    r"""Sends a POST request.

    :param url: URL for the new :class:`Request` object.
    :param data: (optional) Dictionary (will be form-encoded), bytes, or file-like object to send in the body of the :class:`Request`.
    :param json: (optional) json data to send in the body of the :class:`Request`.
    :param \*\*kwargs: Optional arguments that ``request`` takes.
    :return: :class:`Response <Response>` object
    :rtype: requests.Response
    """

    return request('post', url, data=data, json=json, **kwargs)


def put(url, data=None, **kwargs):
    r"""Sends a PUT request.

    :param url: URL for the new :class:`Request` object.
    :param data: (optional) Dictionary (will be form-encoded), bytes, or file-like object to send in the body of the :class:`Request`.
    :param json: (optional) json data to send in the body of the :class:`Request`.
    :param \*\*kwargs: Optional arguments that ``request`` takes.
    :return: :class:`Response <Response>` object
    :rtype: requests.Response
    """

    return request('put', url, data=data, **kwargs)


def patch(url, data=None, **kwargs):
    r"""Sends a PATCH request.

    :param url: URL for the new :class:`Request` object.
    :param data: (optional) Dictionary (will be form-encoded), bytes, or file-like object to send in the body of the :class:`Request`.
    :param json: (optional) json data to send in the body of the :class:`Request`.
    :param \*\*kwargs: Optional arguments that ``request`` takes.
    :return: :class:`Response <Response>` object
    :rtype: requests.Response
    """

    return request('patch', url, data=data, **kwargs)


def delete(url, **kwargs):
    r"""Sends a DELETE request.

    :param url: URL for the new :class:`Request` object.
    :param \*\*kwargs: Optional arguments that ``request`` takes.
    :return: :class:`Response <Response>` object
    :rtype: requests.Response
    """

    return request('delete', url, **kwargs)
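# Hedged usage sketch (httpbin.org is used only as an example echo service):
#
#     post('https://httpbin.org/post', data={'key': 'value'})   # form-encoded body
#     post('https://httpbin.org/post', json={'key': 'value'})   # JSON body
#     put('https://httpbin.org/put', data=b'raw bytes')
#     delete('https://httpbin.org/delete')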
# site-packages/pip/_vendor/distro.py
# Copyright 2015,2016 Nir Cohen
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

"""
The ``distro`` package (``distro`` stands for Linux Distribution) provides
information about the Linux distribution it runs on, such as a reliable
machine-readable distro ID, or version information.

It is a renewed alternative implementation for Python's original
:py:func:`platform.linux_distribution` function, but it provides much more
functionality. An alternative implementation became necessary because Python
3.5 deprecated this function, and Python 3.7 is expected to remove it
altogether. Its predecessor function :py:func:`platform.dist` had already been
deprecated since Python 2.6 and is also expected to be removed in Python 3.7.
Still, there are many cases in which access to Linux distribution information
is needed. See `Python issue 1322 <https://bugs.python.org/issue1322>`_ for
more information.
"""

import os
import re
import sys
import json
import shlex
import logging
import subprocess


if not sys.platform.startswith('linux'):
    raise ImportError('Unsupported platform: {0}'.format(sys.platform))

_UNIXCONFDIR = '/etc'
_OS_RELEASE_BASENAME = 'os-release'

#: Translation table for normalizing the "ID" attribute defined in os-release
#: files, for use by the :func:`distro.id` method.
#:
#: * Key: Value as defined in the os-release file, translated to lower case,
#:   with blanks translated to underscores.
#:
#: * Value: Normalized value.
NORMALIZED_OS_ID = {}

#: Translation table for normalizing the "Distributor ID" attribute returned by
#: the lsb_release command, for use by the :func:`distro.id` method.
#:
#: * Key: Value as returned by the lsb_release command, translated to lower
#:   case, with blanks translated to underscores.
#:
#: * Value: Normalized value.
NORMALIZED_LSB_ID = {
    'enterpriseenterprise': 'oracle',  # Oracle Enterprise Linux
    'redhatenterpriseworkstation': 'rhel',  # RHEL 6.7
}

#: Translation table for normalizing the distro ID derived from the file name
#: of distro release files, for use by the :func:`distro.id` method.
#:
#: * Key: Value as derived from the file name of a distro release file,
#:   translated to lower case, with blanks translated to underscores.
#:
#: * Value: Normalized value.
NORMALIZED_DISTRO_ID = {
    'redhat': 'rhel',  # RHEL 6.x, 7.x
}

# Pattern for content of distro release file (reversed)
_DISTRO_RELEASE_CONTENT_REVERSED_PATTERN = re.compile(
    r'(?:[^)]*\)(.*)\()? *(?:STL )?([\d.+\-a-z]*\d) *(?:esaeler *)?(.+)')

# Pattern for base file name of distro release file
_DISTRO_RELEASE_BASENAME_PATTERN = re.compile(
    r'(\w+)[-_](release|version)$')

# Base file names to be ignored when searching for distro release file
_DISTRO_RELEASE_IGNORE_BASENAMES = (
    'debian_version',
    'lsb-release',
    'oem-release',
    _OS_RELEASE_BASENAME,
    'system-release'
)


def linux_distribution(full_distribution_name=True):
    """
    Return information about the current Linux distribution as a tuple
    ``(id_name, version, codename)`` with items as follows:

    * ``id_name``:  If *full_distribution_name* is false, the result of
      :func:`distro.id`. Otherwise, the result of :func:`distro.name`.

    * ``version``:  The result of :func:`distro.version`.

    * ``codename``:  The result of :func:`distro.codename`.

    The interface of this function is compatible with the original
    :py:func:`platform.linux_distribution` function, supporting a subset of
    its parameters.

    The data it returns may not exactly be the same, because it uses more data
    sources than the original function, and that may lead to different data if
    the Linux distribution is not consistent across multiple data sources it
    provides (there are indeed such distributions ...).

    Another reason for differences is the fact that the :func:`distro.id`
    method normalizes the distro ID string to a reliable machine-readable value
    for a number of popular Linux distributions.
    """
    return _distro.linux_distribution(full_distribution_name)
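# Hedged illustration of the shape of the return value (the exact strings
# depend on the distribution this code runs on):
#
#     linux_distribution()                              # ('Ubuntu', '16.04', 'xenial')
#     linux_distribution(full_distribution_name=False)  # ('ubuntu', '16.04', 'xenial')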


def id():
    """
    Return the distro ID of the current Linux distribution, as a
    machine-readable string.

    For a number of Linux distributions, the returned distro ID value is
    *reliable*, in the sense that it is documented and that it does not change
    across releases of the distribution.

    This package maintains the following reliable distro ID values:

    ==============  =========================================
    Distro ID       Distribution
    ==============  =========================================
    "ubuntu"        Ubuntu
    "debian"        Debian
    "rhel"          RedHat Enterprise Linux
    "centos"        CentOS
    "rocky"         Rocky Linux
    "fedora"        Fedora
    "sles"          SUSE Linux Enterprise Server
    "opensuse"      openSUSE
    "amazon"        Amazon Linux
    "arch"          Arch Linux
    "cloudlinux"    CloudLinux OS
    "exherbo"       Exherbo Linux
    "gentoo"        GenToo Linux
    "ibm_powerkvm"  IBM PowerKVM
    "kvmibm"        KVM for IBM z Systems
    "linuxmint"     Linux Mint
    "mageia"        Mageia
    "mandriva"      Mandriva Linux
    "parallels"     Parallels
    "pidora"        Pidora
    "raspbian"      Raspbian
    "oracle"        Oracle Linux (and Oracle Enterprise Linux)
    "scientific"    Scientific Linux
    "slackware"     Slackware
    "xenserver"     XenServer
    ==============  =========================================

    If you have a need to get distros for reliable IDs added into this set,
    or if you find that the :func:`distro.id` function returns a different
    distro ID for one of the listed distros, please create an issue in the
    `distro issue tracker`_.

    **Lookup hierarchy and transformations:**

    First, the ID is obtained from the following sources, in the specified
    order. The first available and non-empty value is used:

    * the value of the "ID" attribute of the os-release file,

    * the value of the "Distributor ID" attribute returned by the lsb_release
      command,

    * the first part of the file name of the distro release file,

    The ID value determined this way then passes through the following
    transformations before it is returned by this method:

    * it is translated to lower case,

    * blanks (which should not be there anyway) are translated to underscores,

    * a normalization of the ID is performed, based upon
      `normalization tables`_. The purpose of this normalization is to ensure
      that the ID is as reliable as possible, even across incompatible changes
      in the Linux distributions. A common reason for an incompatible change is
      the addition of an os-release file, or the addition of the lsb_release
      command, with ID values that differ from what was previously determined
      from the distro release file name.
    """
    return _distro.id()


def name(pretty=False):
    """
    Return the name of the current Linux distribution, as a human-readable
    string.

    If *pretty* is false, the name is returned without version or codename.
    (e.g. "CentOS Linux")

    If *pretty* is true, the version and codename are appended.
    (e.g. "CentOS Linux 7.1.1503 (Core)")

    **Lookup hierarchy:**

    The name is obtained from the following sources, in the specified order.
    The first available and non-empty value is used:

    * If *pretty* is false:

      - the value of the "NAME" attribute of the os-release file,

      - the value of the "Distributor ID" attribute returned by the lsb_release
        command,

      - the value of the "<name>" field of the distro release file.

    * If *pretty* is true:

      - the value of the "PRETTY_NAME" attribute of the os-release file,

      - the value of the "Description" attribute returned by the lsb_release
        command,

      - the value of the "<name>" field of the distro release file, appended
        with the value of the pretty version ("<version_id>" and "<codename>"
        fields) of the distro release file, if available.
    """
    return _distro.name(pretty)


def version(pretty=False, best=False):
    """
    Return the version of the current Linux distribution, as a human-readable
    string.

    If *pretty* is false, the version is returned without codename (e.g.
    "7.0").

    If *pretty* is true, the codename in parenthesis is appended, if the
    codename is non-empty (e.g. "7.0 (Maipo)").

    Some distributions provide version numbers with different precisions in
    the different sources of distribution information. Examining the different
    sources in a fixed priority order does not always yield the most precise
    version (e.g. for Debian 8.2, or CentOS 7.1).

    The *best* parameter can be used to control the approach for the returned
    version:

    If *best* is false, the first non-empty version number in priority order of
    the examined sources is returned.

    If *best* is true, the most precise version number out of all examined
    sources is returned.

    **Lookup hierarchy:**

    In all cases, the version number is obtained from the following sources.
    If *best* is false, this order represents the priority order:

    * the value of the "VERSION_ID" attribute of the os-release file,
    * the value of the "Release" attribute returned by the lsb_release
      command,
    * the version number parsed from the "<version_id>" field of the first line
      of the distro release file,
    * the version number parsed from the "PRETTY_NAME" attribute of the
      os-release file, if it follows the format of the distro release files.
    * the version number parsed from the "Description" attribute returned by
      the lsb_release command, if it follows the format of the distro release
      files.
    """
    return _distro.version(pretty, best)


def version_parts(best=False):
    """
    Return the version of the current Linux distribution as a tuple
    ``(major, minor, build_number)`` with items as follows:

    * ``major``:  The result of :func:`distro.major_version`.

    * ``minor``:  The result of :func:`distro.minor_version`.

    * ``build_number``:  The result of :func:`distro.build_number`.

    For a description of the *best* parameter, see the :func:`distro.version`
    method.
    """
    return _distro.version_parts(best)


def major_version(best=False):
    """
    Return the major version of the current Linux distribution, as a string,
    if provided.
    Otherwise, the empty string is returned. The major version is the first
    part of the dot-separated version string.

    For a description of the *best* parameter, see the :func:`distro.version`
    method.
    """
    return _distro.major_version(best)


def minor_version(best=False):
    """
    Return the minor version of the current Linux distribution, as a string,
    if provided.
    Otherwise, the empty string is returned. The minor version is the second
    part of the dot-separated version string.

    For a description of the *best* parameter, see the :func:`distro.version`
    method.
    """
    return _distro.minor_version(best)


def build_number(best=False):
    """
    Return the build number of the current Linux distribution, as a string,
    if provided.
    Otherwise, the empty string is returned. The build number is the third part
    of the dot-separated version string.

    For a description of the *best* parameter, see the :func:`distro.version`
    method.
    """
    return _distro.build_number(best)


def like():
    """
    Return a space-separated list of distro IDs of distributions that are
    closely related to the current Linux distribution with regard to packaging
    and programming interfaces, for example distributions the current
    distribution is a derivative from.

    **Lookup hierarchy:**

    This information item is only provided by the os-release file.
    For details, see the description of the "ID_LIKE" attribute in the
    `os-release man page
    <http://www.freedesktop.org/software/systemd/man/os-release.html>`_.
    """
    return _distro.like()


def codename():
    """
    Return the codename for the release of the current Linux distribution,
    as a string.

    If the distribution does not have a codename, an empty string is returned.

    Note that the returned codename is not always really a codename. For
    example, openSUSE returns "x86_64". This function does not handle such
    cases in any special way and just returns the string it finds, if any.

    **Lookup hierarchy:**

    * the codename within the "VERSION" attribute of the os-release file, if
      provided,

    * the value of the "Codename" attribute returned by the lsb_release
      command,

    * the value of the "<codename>" field of the distro release file.
    """
    return _distro.codename()


def info(pretty=False, best=False):
    """
    Return certain machine-readable information items about the current Linux
    distribution in a dictionary, as shown in the following example:

    .. sourcecode:: python

        {
            'id': 'rhel',
            'version': '7.0',
            'version_parts': {
                'major': '7',
                'minor': '0',
                'build_number': ''
            },
            'like': 'fedora',
            'codename': 'Maipo'
        }

    The dictionary structure and keys are always the same, regardless of which
    information items are available in the underlying data sources. The values
    for the various keys are as follows:

    * ``id``:  The result of :func:`distro.id`.

    * ``version``:  The result of :func:`distro.version`.

    * ``version_parts -> major``:  The result of :func:`distro.major_version`.

    * ``version_parts -> minor``:  The result of :func:`distro.minor_version`.

    * ``version_parts -> build_number``:  The result of
      :func:`distro.build_number`.

    * ``like``:  The result of :func:`distro.like`.

    * ``codename``:  The result of :func:`distro.codename`.

    For a description of the *pretty* and *best* parameters, see the
    :func:`distro.version` method.
    """
    return _distro.info(pretty, best)


def os_release_info():
    """
    Return a dictionary containing key-value pairs for the information items
    from the os-release file data source of the current Linux distribution.

    See `os-release file`_ for details about these information items.
    """
    return _distro.os_release_info()


def lsb_release_info():
    """
    Return a dictionary containing key-value pairs for the information items
    from the lsb_release command data source of the current Linux distribution.

    See `lsb_release command output`_ for details about these information
    items.
    """
    return _distro.lsb_release_info()


def distro_release_info():
    """
    Return a dictionary containing key-value pairs for the information items
    from the distro release file data source of the current Linux distribution.

    See `distro release file`_ for details about these information items.
    """
    return _distro.distro_release_info()


def os_release_attr(attribute):
    """
    Return a single named information item from the os-release file data source
    of the current Linux distribution.

    Parameters:

    * ``attribute`` (string): Key of the information item.

    Returns:

    * (string): Value of the information item, if the item exists.
      The empty string, if the item does not exist.

    See `os-release file`_ for details about these information items.
    """
    return _distro.os_release_attr(attribute)


def lsb_release_attr(attribute):
    """
    Return a single named information item from the lsb_release command output
    data source of the current Linux distribution.

    Parameters:

    * ``attribute`` (string): Key of the information item.

    Returns:

    * (string): Value of the information item, if the item exists.
      The empty string, if the item does not exist.

    See `lsb_release command output`_ for details about these information
    items.
    """
    return _distro.lsb_release_attr(attribute)


def distro_release_attr(attribute):
    """
    Return a single named information item from the distro release file
    data source of the current Linux distribution.

    Parameters:

    * ``attribute`` (string): Key of the information item.

    Returns:

    * (string): Value of the information item, if the item exists.
      The empty string, if the item does not exist.

    See `distro release file`_ for details about these information items.
    """
    return _distro.distro_release_attr(attribute)


class LinuxDistribution(object):
    """
    Provides information about a Linux distribution.

    This package creates a private module-global instance of this class with
    default initialization arguments, that is used by the
    `consolidated accessor functions`_ and `single source accessor functions`_.
    By using default initialization arguments, that module-global instance
    returns data about the current Linux distribution (i.e. the distro this
    package runs on).

    Normally, it is not necessary to create additional instances of this class.
    However, in situations where control is needed over the exact data sources
    that are used, instances of this class can be created with a specific
    distro release file, or a specific os-release file, or without invoking the
    lsb_release command.
    """

    def __init__(self,
                 include_lsb=True,
                 os_release_file='',
                 distro_release_file=''):
        """
        The initialization method of this class gathers information from the
        available data sources, and stores that in private instance attributes.
        Subsequent access to the information items uses these private instance
        attributes, so that the data sources are read only once.

        Parameters:

        * ``include_lsb`` (bool): Controls whether the
          `lsb_release command output`_ is included as a data source.

          If the lsb_release command is not available in the program execution
          path, the data source for the lsb_release command will be empty.

        * ``os_release_file`` (string): The path name of the
          `os-release file`_ that is to be used as a data source.

          An empty string (the default) will cause the default path name to
          be used (see `os-release file`_ for details).

          If the specified or defaulted os-release file does not exist, the
          data source for the os-release file will be empty.

        * ``distro_release_file`` (string): The path name of the
          `distro release file`_ that is to be used as a data source.

          An empty string (the default) will cause a default search algorithm
          to be used (see `distro release file`_ for details).

          If the specified distro release file does not exist, or if no default
          distro release file can be found, the data source for the distro
          release file will be empty.

        Public instance attributes:

        * ``os_release_file`` (string): The path name of the
          `os-release file`_ that is actually used as a data source. The
          empty string if no distro release file is used as a data source.

        * ``distro_release_file`` (string): The path name of the
          `distro release file`_ that is actually used as a data source. The
          empty string if no distro release file is used as a data source.

        Raises:

        * :py:exc:`IOError`: Some I/O issue with an os-release file or distro
          release file.

        * :py:exc:`subprocess.CalledProcessError`: The lsb_release command had
          some issue (other than not being available in the program execution
          path).

        * :py:exc:`UnicodeError`: A data source has unexpected characters or
          uses an unexpected encoding.
        """
        self.os_release_file = os_release_file or \
            os.path.join(_UNIXCONFDIR, _OS_RELEASE_BASENAME)
        self.distro_release_file = distro_release_file or ''  # updated later
        self._os_release_info = self._get_os_release_info()
        self._lsb_release_info = self._get_lsb_release_info() \
            if include_lsb else {}
        self._distro_release_info = self._get_distro_release_info()

    def __repr__(self):
        """Return repr of all info
        """
        return \
            "LinuxDistribution(" \
            "os_release_file={0!r}, " \
            "distro_release_file={1!r}, " \
            "_os_release_info={2!r}, " \
            "_lsb_release_info={3!r}, " \
            "_distro_release_info={4!r})".format(
                self.os_release_file,
                self.distro_release_file,
                self._os_release_info,
                self._lsb_release_info,
                self._distro_release_info)

    def linux_distribution(self, full_distribution_name=True):
        """
        Return information about the Linux distribution that is compatible
        with Python's :func:`platform.linux_distribution`, supporting a subset
        of its parameters.

        For details, see :func:`distro.linux_distribution`.
        """
        return (
            self.name() if full_distribution_name else self.id(),
            self.version(),
            self.codename()
        )

    def id(self):
        """Return the distro ID of the Linux distribution, as a string.

        For details, see :func:`distro.id`.
        """
        def normalize(distro_id, table):
            distro_id = distro_id.lower().replace(' ', '_')
            return table.get(distro_id, distro_id)

        distro_id = self.os_release_attr('id')
        if distro_id:
            return normalize(distro_id, NORMALIZED_OS_ID)

        distro_id = self.lsb_release_attr('distributor_id')
        if distro_id:
            return normalize(distro_id, NORMALIZED_LSB_ID)

        distro_id = self.distro_release_attr('id')
        if distro_id:
            return normalize(distro_id, NORMALIZED_DISTRO_ID)

        return ''

    def name(self, pretty=False):
        """
        Return the name of the Linux distribution, as a string.

        For details, see :func:`distro.name`.
        """
        name = self.os_release_attr('name') \
            or self.lsb_release_attr('distributor_id') \
            or self.distro_release_attr('name')
        if pretty:
            name = self.os_release_attr('pretty_name') \
                or self.lsb_release_attr('description')
            if not name:
                name = self.distro_release_attr('name')
                version = self.version(pretty=True)
                if version:
                    name = name + ' ' + version
        return name or ''

    def version(self, pretty=False, best=False):
        """
        Return the version of the Linux distribution, as a string.

        For details, see :func:`distro.version`.
        """
        versions = [
            self.os_release_attr('version_id'),
            self.lsb_release_attr('release'),
            self.distro_release_attr('version_id'),
            self._parse_distro_release_content(
                self.os_release_attr('pretty_name')).get('version_id', ''),
            self._parse_distro_release_content(
                self.lsb_release_attr('description')).get('version_id', '')
        ]
        version = ''
        if best:
            # This algorithm uses the last version in priority order that has
            # the best precision. If the versions are not in conflict, that
            # does not matter; otherwise, using the last one instead of the
            # first one might be considered a surprise.
            for v in versions:
                if v.count(".") > version.count(".") or version == '':
                    version = v
        else:
            for v in versions:
                if v != '':
                    version = v
                    break
        if pretty and version and self.codename():
            version = u'{0} ({1})'.format(version, self.codename())
        return version

    def version_parts(self, best=False):
        """
        Return the version of the Linux distribution, as a tuple of version
        numbers.

        For details, see :func:`distro.version_parts`.
        """
        version_str = self.version(best=best)
        if version_str:
            version_regex = re.compile(r'(\d+)\.?(\d+)?\.?(\d+)?')
            matches = version_regex.match(version_str)
            if matches:
                major, minor, build_number = matches.groups()
                return major, minor or '', build_number or ''
        return '', '', ''
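    # Hedged illustration of the regex split above (version strings are
    # examples only):
    #
    #     '7.1.1503'  ->  ('7', '1', '1503')
    #     '8.2'       ->  ('8', '2', '')
    #     ''          ->  ('', '', '')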

    def major_version(self, best=False):
        """
        Return the major version number of the current distribution.

        For details, see :func:`distro.major_version`.
        """
        return self.version_parts(best)[0]

    def minor_version(self, best=False):
        """
        Return the minor version number of the Linux distribution.

        For details, see :func:`distro.minor_version`.
        """
        return self.version_parts(best)[1]

    def build_number(self, best=False):
        """
        Return the build number of the Linux distribution.

        For details, see :func:`distro.build_number`.
        """
        return self.version_parts(best)[2]

    def like(self):
        """
        Return the IDs of distributions that are like the Linux distribution.

        For details, see :func:`distro.like`.
        """
        return self.os_release_attr('id_like') or ''

    def codename(self):
        """
        Return the codename of the Linux distribution.

        For details, see :func:`distro.codename`.
        """
        return self.os_release_attr('codename') \
            or self.lsb_release_attr('codename') \
            or self.distro_release_attr('codename') \
            or ''

    def info(self, pretty=False, best=False):
        """
        Return certain machine-readable information about the Linux
        distribution.

        For details, see :func:`distro.info`.
        """
        return dict(
            id=self.id(),
            version=self.version(pretty, best),
            version_parts=dict(
                major=self.major_version(best),
                minor=self.minor_version(best),
                build_number=self.build_number(best)
            ),
            like=self.like(),
            codename=self.codename(),
        )

    def os_release_info(self):
        """
        Return a dictionary containing key-value pairs for the information
        items from the os-release file data source of the Linux distribution.

        For details, see :func:`distro.os_release_info`.
        """
        return self._os_release_info

    def lsb_release_info(self):
        """
        Return a dictionary containing key-value pairs for the information
        items from the lsb_release command data source of the Linux
        distribution.

        For details, see :func:`distro.lsb_release_info`.
        """
        return self._lsb_release_info

    def distro_release_info(self):
        """
        Return a dictionary containing key-value pairs for the information
        items from the distro release file data source of the Linux
        distribution.

        For details, see :func:`distro.distro_release_info`.
        """
        return self._distro_release_info

    def os_release_attr(self, attribute):
        """
        Return a single named information item from the os-release file data
        source of the Linux distribution.

        For details, see :func:`distro.os_release_attr`.
        """
        return self._os_release_info.get(attribute, '')

    def lsb_release_attr(self, attribute):
        """
        Return a single named information item from the lsb_release command
        output data source of the Linux distribution.

        For details, see :func:`distro.lsb_release_attr`.
        """
        return self._lsb_release_info.get(attribute, '')

    def distro_release_attr(self, attribute):
        """
        Return a single named information item from the distro release file
        data source of the Linux distribution.

        For details, see :func:`distro.distro_release_attr`.
        """
        return self._distro_release_info.get(attribute, '')

    def _get_os_release_info(self):
        """
        Get the information items from the specified os-release file.

        Returns:
            A dictionary containing all information items.
        """
        if os.path.isfile(self.os_release_file):
            with open(self.os_release_file) as release_file:
                return self._parse_os_release_content(release_file)
        return {}

    @staticmethod
    def _parse_os_release_content(lines):
        """
        Parse the lines of an os-release file.

        Parameters:

        * lines: Iterable through the lines in the os-release file.
                 Each line must be a unicode string or a UTF-8 encoded byte
                 string.

        Returns:
            A dictionary containing all information items.
        """
        props = {}
        lexer = shlex.shlex(lines, posix=True)
        lexer.whitespace_split = True

        # The shlex module defines its `wordchars` variable using literals,
        # making it dependent on the encoding of the Python source file.
        # In Python 2.6 and 2.7, the shlex source file is encoded in
        # 'iso-8859-1', and the `wordchars` variable is defined as a byte
        # string. This causes a UnicodeDecodeError to be raised when the
        # parsed content is a unicode object. The following fix resolves that
        # (... but it should be fixed in shlex...):
        if sys.version_info[0] == 2 and isinstance(lexer.wordchars, bytes):
            lexer.wordchars = lexer.wordchars.decode('iso-8859-1')

        tokens = list(lexer)
        for token in tokens:
            # At this point, all shell-like parsing has been done (i.e.
            # comments processed, quotes and backslash escape sequences
            # processed, multi-line values assembled, trailing newlines
            # stripped, etc.), so the tokens are now either:
            # * variable assignments: var=value
            # * commands or their arguments (not allowed in os-release)
            if '=' in token:
                k, v = token.split('=', 1)
                if isinstance(v, bytes):
                    v = v.decode('utf-8')
                props[k.lower()] = v
                if k == 'VERSION':
                    # this handles cases in which the codename is in
                    # the `(CODENAME)` (rhel, centos, fedora) format
                    # or in the `, CODENAME` format (Ubuntu).
                    codename = re.search(r'(\(\D+\))|,(\s+)?\D+', v)
                    if codename:
                        codename = codename.group()
                        codename = codename.strip('()')
                        codename = codename.strip(',')
                        codename = codename.strip()
                        # codename appears within parentheses.
                        props['codename'] = codename
                    else:
                        props['codename'] = ''
            else:
                # Ignore any tokens that are not variable assignments
                pass
        return props
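
    # Illustrative sketch (added annotation, not in the original source): a
    # minimal os-release snippet parses into lowercased keys plus the derived
    # 'codename' entry, roughly:
    #
    #     LinuxDistribution._parse_os_release_content(
    #         'NAME="Ubuntu"\nVERSION="16.04.1 LTS (Xenial Xerus)"\n')
    #     # -> {'name': 'Ubuntu',
    #     #     'version': '16.04.1 LTS (Xenial Xerus)',
    #     #     'codename': 'Xenial Xerus'}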

    def _get_lsb_release_info(self):
        """
        Get the information items from the lsb_release command output.

        Returns:
            A dictionary containing all information items.
        """
        cmd = 'lsb_release -a'
        process = subprocess.Popen(
            cmd,
            shell=True,
            stdout=subprocess.PIPE,
            stderr=subprocess.PIPE)
        stdout, stderr = process.communicate()
        stdout, stderr = stdout.decode('utf-8'), stderr.decode('utf-8')
        code = process.returncode
        if code == 0:
            content = stdout.splitlines()
            return self._parse_lsb_release_content(content)
        elif code == 127:  # Command not found
            return {}
        else:
            if sys.version_info[:2] >= (3, 5):
                raise subprocess.CalledProcessError(code, cmd, stdout, stderr)
            elif sys.version_info[:2] >= (2, 7):
                raise subprocess.CalledProcessError(code, cmd, stdout)
            elif sys.version_info[:2] == (2, 6):
                raise subprocess.CalledProcessError(code, cmd)

    @staticmethod
    def _parse_lsb_release_content(lines):
        """
        Parse the output of the lsb_release command.

        Parameters:

        * lines: Iterable through the lines of the lsb_release output.
                 Each line must be a unicode string or a UTF-8 encoded byte
                 string.

        Returns:
            A dictionary containing all information items.
        """
        props = {}
        for line in lines:
            line = line.decode('utf-8') if isinstance(line, bytes) else line
            kv = line.strip('\n').split(':', 1)
            if len(kv) != 2:
                # Ignore lines without colon.
                continue
            k, v = kv
            props.update({k.replace(' ', '_').lower(): v.strip()})
        return props
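
    # Illustrative sketch (added annotation, not in the original source):
    # each ``lsb_release -a`` line is split on the first colon; keys are
    # lowercased with spaces turned into underscores, roughly:
    #
    #     LinuxDistribution._parse_lsb_release_content(
    #         ['Distributor ID:\tUbuntu', 'Release:\t16.04'])
    #     # -> {'distributor_id': 'Ubuntu', 'release': '16.04'}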

    def _get_distro_release_info(self):
        """
        Get the information items from the specified distro release file.

        Returns:
            A dictionary containing all information items.
        """
        if self.distro_release_file:
            # If it was specified, we use it and parse what we can, even if
            # its file name or content does not match the expected pattern.
            distro_info = self._parse_distro_release_file(
                self.distro_release_file)
            basename = os.path.basename(self.distro_release_file)
            # The file name pattern for user-specified distro release files
            # is somewhat more tolerant (compared to when searching for the
            # file), because we want to use what was specified as best as
            # possible.
            match = _DISTRO_RELEASE_BASENAME_PATTERN.match(basename)
            if match:
                distro_info['id'] = match.group(1)
            return distro_info
        else:
            basenames = os.listdir(_UNIXCONFDIR)
            # We sort for repeatability in cases where there are multiple
            # distro specific files; e.g. CentOS, Oracle, Enterprise all
            # containing `redhat-release` on top of their own.
            basenames.sort()
            for basename in basenames:
                if basename in _DISTRO_RELEASE_IGNORE_BASENAMES:
                    continue
                match = _DISTRO_RELEASE_BASENAME_PATTERN.match(basename)
                if match:
                    filepath = os.path.join(_UNIXCONFDIR, basename)
                    distro_info = self._parse_distro_release_file(filepath)
                    if 'name' in distro_info:
                        # The name is always present if the pattern matches
                        self.distro_release_file = filepath
                        distro_info['id'] = match.group(1)
                        return distro_info
            return {}

    def _parse_distro_release_file(self, filepath):
        """
        Parse a distro release file.

        Parameters:

        * filepath: Path name of the distro release file.

        Returns:
            A dictionary containing all information items.
        """
        if os.path.isfile(filepath):
            with open(filepath) as fp:
                # Only parse the first line. For instance, on SLES there
                # are multiple lines. We don't want them...
                return self._parse_distro_release_content(fp.readline())
        return {}

    @staticmethod
    def _parse_distro_release_content(line):
        """
        Parse a line from a distro release file.

        Parameters:
        * line: Line from the distro release file. Must be a unicode string
                or a UTF-8 encoded byte string.

        Returns:
            A dictionary containing all information items.
        """
        if isinstance(line, bytes):
            line = line.decode('utf-8')
        matches = _DISTRO_RELEASE_CONTENT_REVERSED_PATTERN.match(
            line.strip()[::-1])
        distro_info = {}
        if matches:
            # regexp ensures non-None
            distro_info['name'] = matches.group(3)[::-1]
            if matches.group(2):
                distro_info['version_id'] = matches.group(2)[::-1]
            if matches.group(1):
                distro_info['codename'] = matches.group(1)[::-1]
        elif line:
            distro_info['name'] = line.strip()
        return distro_info
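
    # Illustrative sketch (added annotation, not in the original source): the
    # reversed-line regexp recovers name, version and codename from a classic
    # release-file line, roughly:
    #
    #     LinuxDistribution._parse_distro_release_content(
    #         'CentOS Linux release 7.1.1503 (Core)')
    #     # -> {'name': 'CentOS Linux', 'version_id': '7.1.1503',
    #     #     'codename': 'Core'}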


_distro = LinuxDistribution()


def main():
    import argparse

    logger = logging.getLogger(__name__)
    logger.setLevel(logging.DEBUG)
    logger.addHandler(logging.StreamHandler(sys.stdout))

    parser = argparse.ArgumentParser(description="Linux distro info tool")
    parser.add_argument(
        '--json',
        '-j',
        help="Output in machine readable format",
        action="store_true")
    args = parser.parse_args()

    if args.json:
        logger.info(json.dumps(info(), indent=4, sort_keys=True))
    else:
        logger.info('Name: %s', name(pretty=True))
        distribution_version = version(pretty=True)
        if distribution_version:
            logger.info('Version: %s', distribution_version)
        distribution_codename = codename()
        if distribution_codename:
            logger.info('Codename: %s', distribution_codename)


if __name__ == '__main__':
    main()
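
# Illustrative usage note (added annotation, not in the original source):
# running this file directly prints the detected distribution; the exact
# output depends on the host system, e.g.:
#
#     $ python distro.py
#     Name: Ubuntu 16.04.1 LTS
#     Version: 16.04.1 LTS (Xenial Xerus)
#     Codename: xenial
#
# With --json / -j the same information is emitted as a JSON document built
# from info().
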
site-packages/pip/_vendor/re-vendor.py
import os
import sys
import pip
import glob
import shutil

here = os.path.abspath(os.path.dirname(__file__))

def usage():
    print("Usage: re-vendor.py [clean|vendor]")
    sys.exit(1)

def clean():
    for fn in os.listdir(here):
        dirname = os.path.join(here, fn)
        if os.path.isdir(dirname):
            shutil.rmtree(dirname)
    # six is a single file, not a package
    os.unlink(os.path.join(here, 'six.py'))

def vendor():
    pip.main(['install', '-t', here, '-r', 'vendor.txt'])
    for dirname in glob.glob('*.egg-info'):
        shutil.rmtree(dirname)

if __name__ == '__main__':
    if len(sys.argv) != 2:
        usage()
    if sys.argv[1] == 'clean':
        clean()
    elif sys.argv[1] == 'vendor':
        vendor()
    else:
        usage()
site-packages/pip/_vendor/idna/uts46data.py
# This file is automatically generated by tools/idna-data
# vim: set fileencoding=utf-8 :

"""IDNA Mapping Table from UTS46."""


__version__ = "6.3.0"
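
# Annotation (added for readability; not part of the generated file): each
# _seg_N() below returns a sorted list of (codepoint, status[, mapping])
# tuples. The one-letter status appears to follow UTS #46: 'V' valid,
# 'M' mapped to the given replacement, 'D' deviation, 'I' ignored,
# 'X' disallowed, and '3' the disallowed_STD3_* categories (mapped when a
# replacement is present). The lookup code in idna.core consumes the
# concatenation of all segments.
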
def _seg_0():
    return [
    (0x0, '3'),
    (0x1, '3'),
    (0x2, '3'),
    (0x3, '3'),
    (0x4, '3'),
    (0x5, '3'),
    (0x6, '3'),
    (0x7, '3'),
    (0x8, '3'),
    (0x9, '3'),
    (0xA, '3'),
    (0xB, '3'),
    (0xC, '3'),
    (0xD, '3'),
    (0xE, '3'),
    (0xF, '3'),
    (0x10, '3'),
    (0x11, '3'),
    (0x12, '3'),
    (0x13, '3'),
    (0x14, '3'),
    (0x15, '3'),
    (0x16, '3'),
    (0x17, '3'),
    (0x18, '3'),
    (0x19, '3'),
    (0x1A, '3'),
    (0x1B, '3'),
    (0x1C, '3'),
    (0x1D, '3'),
    (0x1E, '3'),
    (0x1F, '3'),
    (0x20, '3'),
    (0x21, '3'),
    (0x22, '3'),
    (0x23, '3'),
    (0x24, '3'),
    (0x25, '3'),
    (0x26, '3'),
    (0x27, '3'),
    (0x28, '3'),
    (0x29, '3'),
    (0x2A, '3'),
    (0x2B, '3'),
    (0x2C, '3'),
    (0x2D, 'V'),
    (0x2E, 'V'),
    (0x2F, '3'),
    (0x30, 'V'),
    (0x31, 'V'),
    (0x32, 'V'),
    (0x33, 'V'),
    (0x34, 'V'),
    (0x35, 'V'),
    (0x36, 'V'),
    (0x37, 'V'),
    (0x38, 'V'),
    (0x39, 'V'),
    (0x3A, '3'),
    (0x3B, '3'),
    (0x3C, '3'),
    (0x3D, '3'),
    (0x3E, '3'),
    (0x3F, '3'),
    (0x40, '3'),
    (0x41, 'M', u'a'),
    (0x42, 'M', u'b'),
    (0x43, 'M', u'c'),
    (0x44, 'M', u'd'),
    (0x45, 'M', u'e'),
    (0x46, 'M', u'f'),
    (0x47, 'M', u'g'),
    (0x48, 'M', u'h'),
    (0x49, 'M', u'i'),
    (0x4A, 'M', u'j'),
    (0x4B, 'M', u'k'),
    (0x4C, 'M', u'l'),
    (0x4D, 'M', u'm'),
    (0x4E, 'M', u'n'),
    (0x4F, 'M', u'o'),
    (0x50, 'M', u'p'),
    (0x51, 'M', u'q'),
    (0x52, 'M', u'r'),
    (0x53, 'M', u's'),
    (0x54, 'M', u't'),
    (0x55, 'M', u'u'),
    (0x56, 'M', u'v'),
    (0x57, 'M', u'w'),
    (0x58, 'M', u'x'),
    (0x59, 'M', u'y'),
    (0x5A, 'M', u'z'),
    (0x5B, '3'),
    (0x5C, '3'),
    (0x5D, '3'),
    (0x5E, '3'),
    (0x5F, '3'),
    (0x60, '3'),
    (0x61, 'V'),
    (0x62, 'V'),
    (0x63, 'V'),
    ]

def _seg_1():
    return [
    (0x64, 'V'),
    (0x65, 'V'),
    (0x66, 'V'),
    (0x67, 'V'),
    (0x68, 'V'),
    (0x69, 'V'),
    (0x6A, 'V'),
    (0x6B, 'V'),
    (0x6C, 'V'),
    (0x6D, 'V'),
    (0x6E, 'V'),
    (0x6F, 'V'),
    (0x70, 'V'),
    (0x71, 'V'),
    (0x72, 'V'),
    (0x73, 'V'),
    (0x74, 'V'),
    (0x75, 'V'),
    (0x76, 'V'),
    (0x77, 'V'),
    (0x78, 'V'),
    (0x79, 'V'),
    (0x7A, 'V'),
    (0x7B, '3'),
    (0x7C, '3'),
    (0x7D, '3'),
    (0x7E, '3'),
    (0x7F, '3'),
    (0x80, 'X'),
    (0x81, 'X'),
    (0x82, 'X'),
    (0x83, 'X'),
    (0x84, 'X'),
    (0x85, 'X'),
    (0x86, 'X'),
    (0x87, 'X'),
    (0x88, 'X'),
    (0x89, 'X'),
    (0x8A, 'X'),
    (0x8B, 'X'),
    (0x8C, 'X'),
    (0x8D, 'X'),
    (0x8E, 'X'),
    (0x8F, 'X'),
    (0x90, 'X'),
    (0x91, 'X'),
    (0x92, 'X'),
    (0x93, 'X'),
    (0x94, 'X'),
    (0x95, 'X'),
    (0x96, 'X'),
    (0x97, 'X'),
    (0x98, 'X'),
    (0x99, 'X'),
    (0x9A, 'X'),
    (0x9B, 'X'),
    (0x9C, 'X'),
    (0x9D, 'X'),
    (0x9E, 'X'),
    (0x9F, 'X'),
    (0xA0, '3', u' '),
    (0xA1, 'V'),
    (0xA2, 'V'),
    (0xA3, 'V'),
    (0xA4, 'V'),
    (0xA5, 'V'),
    (0xA6, 'V'),
    (0xA7, 'V'),
    (0xA8, '3', u' ̈'),
    (0xA9, 'V'),
    (0xAA, 'M', u'a'),
    (0xAB, 'V'),
    (0xAC, 'V'),
    (0xAD, 'I'),
    (0xAE, 'V'),
    (0xAF, '3', u' ̄'),
    (0xB0, 'V'),
    (0xB1, 'V'),
    (0xB2, 'M', u'2'),
    (0xB3, 'M', u'3'),
    (0xB4, '3', u' ́'),
    (0xB5, 'M', u'μ'),
    (0xB6, 'V'),
    (0xB7, 'V'),
    (0xB8, '3', u' ̧'),
    (0xB9, 'M', u'1'),
    (0xBA, 'M', u'o'),
    (0xBB, 'V'),
    (0xBC, 'M', u'1⁄4'),
    (0xBD, 'M', u'1⁄2'),
    (0xBE, 'M', u'3⁄4'),
    (0xBF, 'V'),
    (0xC0, 'M', u'à'),
    (0xC1, 'M', u'á'),
    (0xC2, 'M', u'â'),
    (0xC3, 'M', u'ã'),
    (0xC4, 'M', u'ä'),
    (0xC5, 'M', u'å'),
    (0xC6, 'M', u'æ'),
    (0xC7, 'M', u'ç'),
    ]

def _seg_2():
    return [
    (0xC8, 'M', u'è'),
    (0xC9, 'M', u'é'),
    (0xCA, 'M', u'ê'),
    (0xCB, 'M', u'ë'),
    (0xCC, 'M', u'ì'),
    (0xCD, 'M', u'í'),
    (0xCE, 'M', u'î'),
    (0xCF, 'M', u'ï'),
    (0xD0, 'M', u'ð'),
    (0xD1, 'M', u'ñ'),
    (0xD2, 'M', u'ò'),
    (0xD3, 'M', u'ó'),
    (0xD4, 'M', u'ô'),
    (0xD5, 'M', u'õ'),
    (0xD6, 'M', u'ö'),
    (0xD7, 'V'),
    (0xD8, 'M', u'ø'),
    (0xD9, 'M', u'ù'),
    (0xDA, 'M', u'ú'),
    (0xDB, 'M', u'û'),
    (0xDC, 'M', u'ü'),
    (0xDD, 'M', u'ý'),
    (0xDE, 'M', u'þ'),
    (0xDF, 'D', u'ss'),
    (0xE0, 'V'),
    (0xE1, 'V'),
    (0xE2, 'V'),
    (0xE3, 'V'),
    (0xE4, 'V'),
    (0xE5, 'V'),
    (0xE6, 'V'),
    (0xE7, 'V'),
    (0xE8, 'V'),
    (0xE9, 'V'),
    (0xEA, 'V'),
    (0xEB, 'V'),
    (0xEC, 'V'),
    (0xED, 'V'),
    (0xEE, 'V'),
    (0xEF, 'V'),
    (0xF0, 'V'),
    (0xF1, 'V'),
    (0xF2, 'V'),
    (0xF3, 'V'),
    (0xF4, 'V'),
    (0xF5, 'V'),
    (0xF6, 'V'),
    (0xF7, 'V'),
    (0xF8, 'V'),
    (0xF9, 'V'),
    (0xFA, 'V'),
    (0xFB, 'V'),
    (0xFC, 'V'),
    (0xFD, 'V'),
    (0xFE, 'V'),
    (0xFF, 'V'),
    (0x100, 'M', u'ā'),
    (0x101, 'V'),
    (0x102, 'M', u'ă'),
    (0x103, 'V'),
    (0x104, 'M', u'ą'),
    (0x105, 'V'),
    (0x106, 'M', u'ć'),
    (0x107, 'V'),
    (0x108, 'M', u'ĉ'),
    (0x109, 'V'),
    (0x10A, 'M', u'ċ'),
    (0x10B, 'V'),
    (0x10C, 'M', u'č'),
    (0x10D, 'V'),
    (0x10E, 'M', u'ď'),
    (0x10F, 'V'),
    (0x110, 'M', u'đ'),
    (0x111, 'V'),
    (0x112, 'M', u'ē'),
    (0x113, 'V'),
    (0x114, 'M', u'ĕ'),
    (0x115, 'V'),
    (0x116, 'M', u'ė'),
    (0x117, 'V'),
    (0x118, 'M', u'ę'),
    (0x119, 'V'),
    (0x11A, 'M', u'ě'),
    (0x11B, 'V'),
    (0x11C, 'M', u'ĝ'),
    (0x11D, 'V'),
    (0x11E, 'M', u'ğ'),
    (0x11F, 'V'),
    (0x120, 'M', u'ġ'),
    (0x121, 'V'),
    (0x122, 'M', u'ģ'),
    (0x123, 'V'),
    (0x124, 'M', u'ĥ'),
    (0x125, 'V'),
    (0x126, 'M', u'ħ'),
    (0x127, 'V'),
    (0x128, 'M', u'ĩ'),
    (0x129, 'V'),
    (0x12A, 'M', u'ī'),
    (0x12B, 'V'),
    ]

def _seg_3():
    return [
    (0x12C, 'M', u'ĭ'),
    (0x12D, 'V'),
    (0x12E, 'M', u'į'),
    (0x12F, 'V'),
    (0x130, 'M', u'i̇'),
    (0x131, 'V'),
    (0x132, 'M', u'ij'),
    (0x134, 'M', u'ĵ'),
    (0x135, 'V'),
    (0x136, 'M', u'ķ'),
    (0x137, 'V'),
    (0x139, 'M', u'ĺ'),
    (0x13A, 'V'),
    (0x13B, 'M', u'ļ'),
    (0x13C, 'V'),
    (0x13D, 'M', u'ľ'),
    (0x13E, 'V'),
    (0x13F, 'M', u'l·'),
    (0x141, 'M', u'ł'),
    (0x142, 'V'),
    (0x143, 'M', u'ń'),
    (0x144, 'V'),
    (0x145, 'M', u'ņ'),
    (0x146, 'V'),
    (0x147, 'M', u'ň'),
    (0x148, 'V'),
    (0x149, 'M', u'ʼn'),
    (0x14A, 'M', u'ŋ'),
    (0x14B, 'V'),
    (0x14C, 'M', u'ō'),
    (0x14D, 'V'),
    (0x14E, 'M', u'ŏ'),
    (0x14F, 'V'),
    (0x150, 'M', u'ő'),
    (0x151, 'V'),
    (0x152, 'M', u'œ'),
    (0x153, 'V'),
    (0x154, 'M', u'ŕ'),
    (0x155, 'V'),
    (0x156, 'M', u'ŗ'),
    (0x157, 'V'),
    (0x158, 'M', u'ř'),
    (0x159, 'V'),
    (0x15A, 'M', u'ś'),
    (0x15B, 'V'),
    (0x15C, 'M', u'ŝ'),
    (0x15D, 'V'),
    (0x15E, 'M', u'ş'),
    (0x15F, 'V'),
    (0x160, 'M', u'š'),
    (0x161, 'V'),
    (0x162, 'M', u'ţ'),
    (0x163, 'V'),
    (0x164, 'M', u'ť'),
    (0x165, 'V'),
    (0x166, 'M', u'ŧ'),
    (0x167, 'V'),
    (0x168, 'M', u'ũ'),
    (0x169, 'V'),
    (0x16A, 'M', u'ū'),
    (0x16B, 'V'),
    (0x16C, 'M', u'ŭ'),
    (0x16D, 'V'),
    (0x16E, 'M', u'ů'),
    (0x16F, 'V'),
    (0x170, 'M', u'ű'),
    (0x171, 'V'),
    (0x172, 'M', u'ų'),
    (0x173, 'V'),
    (0x174, 'M', u'ŵ'),
    (0x175, 'V'),
    (0x176, 'M', u'ŷ'),
    (0x177, 'V'),
    (0x178, 'M', u'ÿ'),
    (0x179, 'M', u'ź'),
    (0x17A, 'V'),
    (0x17B, 'M', u'ż'),
    (0x17C, 'V'),
    (0x17D, 'M', u'ž'),
    (0x17E, 'V'),
    (0x17F, 'M', u's'),
    (0x180, 'V'),
    (0x181, 'M', u'ɓ'),
    (0x182, 'M', u'ƃ'),
    (0x183, 'V'),
    (0x184, 'M', u'ƅ'),
    (0x185, 'V'),
    (0x186, 'M', u'ɔ'),
    (0x187, 'M', u'ƈ'),
    (0x188, 'V'),
    (0x189, 'M', u'ɖ'),
    (0x18A, 'M', u'ɗ'),
    (0x18B, 'M', u'ƌ'),
    (0x18C, 'V'),
    (0x18E, 'M', u'ǝ'),
    (0x18F, 'M', u'ə'),
    (0x190, 'M', u'ɛ'),
    (0x191, 'M', u'ƒ'),
    (0x192, 'V'),
    (0x193, 'M', u'ɠ'),
    ]

def _seg_4():
    return [
    (0x194, 'M', u'ɣ'),
    (0x195, 'V'),
    (0x196, 'M', u'ɩ'),
    (0x197, 'M', u'ɨ'),
    (0x198, 'M', u'ƙ'),
    (0x199, 'V'),
    (0x19C, 'M', u'ɯ'),
    (0x19D, 'M', u'ɲ'),
    (0x19E, 'V'),
    (0x19F, 'M', u'ɵ'),
    (0x1A0, 'M', u'ơ'),
    (0x1A1, 'V'),
    (0x1A2, 'M', u'ƣ'),
    (0x1A3, 'V'),
    (0x1A4, 'M', u'ƥ'),
    (0x1A5, 'V'),
    (0x1A6, 'M', u'ʀ'),
    (0x1A7, 'M', u'ƨ'),
    (0x1A8, 'V'),
    (0x1A9, 'M', u'ʃ'),
    (0x1AA, 'V'),
    (0x1AC, 'M', u'ƭ'),
    (0x1AD, 'V'),
    (0x1AE, 'M', u'ʈ'),
    (0x1AF, 'M', u'ư'),
    (0x1B0, 'V'),
    (0x1B1, 'M', u'ʊ'),
    (0x1B2, 'M', u'ʋ'),
    (0x1B3, 'M', u'ƴ'),
    (0x1B4, 'V'),
    (0x1B5, 'M', u'ƶ'),
    (0x1B6, 'V'),
    (0x1B7, 'M', u'ʒ'),
    (0x1B8, 'M', u'ƹ'),
    (0x1B9, 'V'),
    (0x1BC, 'M', u'ƽ'),
    (0x1BD, 'V'),
    (0x1C4, 'M', u'dž'),
    (0x1C7, 'M', u'lj'),
    (0x1CA, 'M', u'nj'),
    (0x1CD, 'M', u'ǎ'),
    (0x1CE, 'V'),
    (0x1CF, 'M', u'ǐ'),
    (0x1D0, 'V'),
    (0x1D1, 'M', u'ǒ'),
    (0x1D2, 'V'),
    (0x1D3, 'M', u'ǔ'),
    (0x1D4, 'V'),
    (0x1D5, 'M', u'ǖ'),
    (0x1D6, 'V'),
    (0x1D7, 'M', u'ǘ'),
    (0x1D8, 'V'),
    (0x1D9, 'M', u'ǚ'),
    (0x1DA, 'V'),
    (0x1DB, 'M', u'ǜ'),
    (0x1DC, 'V'),
    (0x1DE, 'M', u'ǟ'),
    (0x1DF, 'V'),
    (0x1E0, 'M', u'ǡ'),
    (0x1E1, 'V'),
    (0x1E2, 'M', u'ǣ'),
    (0x1E3, 'V'),
    (0x1E4, 'M', u'ǥ'),
    (0x1E5, 'V'),
    (0x1E6, 'M', u'ǧ'),
    (0x1E7, 'V'),
    (0x1E8, 'M', u'ǩ'),
    (0x1E9, 'V'),
    (0x1EA, 'M', u'ǫ'),
    (0x1EB, 'V'),
    (0x1EC, 'M', u'ǭ'),
    (0x1ED, 'V'),
    (0x1EE, 'M', u'ǯ'),
    (0x1EF, 'V'),
    (0x1F1, 'M', u'dz'),
    (0x1F4, 'M', u'ǵ'),
    (0x1F5, 'V'),
    (0x1F6, 'M', u'ƕ'),
    (0x1F7, 'M', u'ƿ'),
    (0x1F8, 'M', u'ǹ'),
    (0x1F9, 'V'),
    (0x1FA, 'M', u'ǻ'),
    (0x1FB, 'V'),
    (0x1FC, 'M', u'ǽ'),
    (0x1FD, 'V'),
    (0x1FE, 'M', u'ǿ'),
    (0x1FF, 'V'),
    (0x200, 'M', u'ȁ'),
    (0x201, 'V'),
    (0x202, 'M', u'ȃ'),
    (0x203, 'V'),
    (0x204, 'M', u'ȅ'),
    (0x205, 'V'),
    (0x206, 'M', u'ȇ'),
    (0x207, 'V'),
    (0x208, 'M', u'ȉ'),
    (0x209, 'V'),
    (0x20A, 'M', u'ȋ'),
    (0x20B, 'V'),
    (0x20C, 'M', u'ȍ'),
    ]

def _seg_5():
    return [
    (0x20D, 'V'),
    (0x20E, 'M', u'ȏ'),
    (0x20F, 'V'),
    (0x210, 'M', u'ȑ'),
    (0x211, 'V'),
    (0x212, 'M', u'ȓ'),
    (0x213, 'V'),
    (0x214, 'M', u'ȕ'),
    (0x215, 'V'),
    (0x216, 'M', u'ȗ'),
    (0x217, 'V'),
    (0x218, 'M', u'ș'),
    (0x219, 'V'),
    (0x21A, 'M', u'ț'),
    (0x21B, 'V'),
    (0x21C, 'M', u'ȝ'),
    (0x21D, 'V'),
    (0x21E, 'M', u'ȟ'),
    (0x21F, 'V'),
    (0x220, 'M', u'ƞ'),
    (0x221, 'V'),
    (0x222, 'M', u'ȣ'),
    (0x223, 'V'),
    (0x224, 'M', u'ȥ'),
    (0x225, 'V'),
    (0x226, 'M', u'ȧ'),
    (0x227, 'V'),
    (0x228, 'M', u'ȩ'),
    (0x229, 'V'),
    (0x22A, 'M', u'ȫ'),
    (0x22B, 'V'),
    (0x22C, 'M', u'ȭ'),
    (0x22D, 'V'),
    (0x22E, 'M', u'ȯ'),
    (0x22F, 'V'),
    (0x230, 'M', u'ȱ'),
    (0x231, 'V'),
    (0x232, 'M', u'ȳ'),
    (0x233, 'V'),
    (0x23A, 'M', u'ⱥ'),
    (0x23B, 'M', u'ȼ'),
    (0x23C, 'V'),
    (0x23D, 'M', u'ƚ'),
    (0x23E, 'M', u'ⱦ'),
    (0x23F, 'V'),
    (0x241, 'M', u'ɂ'),
    (0x242, 'V'),
    (0x243, 'M', u'ƀ'),
    (0x244, 'M', u'ʉ'),
    (0x245, 'M', u'ʌ'),
    (0x246, 'M', u'ɇ'),
    (0x247, 'V'),
    (0x248, 'M', u'ɉ'),
    (0x249, 'V'),
    (0x24A, 'M', u'ɋ'),
    (0x24B, 'V'),
    (0x24C, 'M', u'ɍ'),
    (0x24D, 'V'),
    (0x24E, 'M', u'ɏ'),
    (0x24F, 'V'),
    (0x2B0, 'M', u'h'),
    (0x2B1, 'M', u'ɦ'),
    (0x2B2, 'M', u'j'),
    (0x2B3, 'M', u'r'),
    (0x2B4, 'M', u'ɹ'),
    (0x2B5, 'M', u'ɻ'),
    (0x2B6, 'M', u'ʁ'),
    (0x2B7, 'M', u'w'),
    (0x2B8, 'M', u'y'),
    (0x2B9, 'V'),
    (0x2D8, '3', u' ̆'),
    (0x2D9, '3', u' ̇'),
    (0x2DA, '3', u' ̊'),
    (0x2DB, '3', u' ̨'),
    (0x2DC, '3', u' ̃'),
    (0x2DD, '3', u' ̋'),
    (0x2DE, 'V'),
    (0x2E0, 'M', u'ɣ'),
    (0x2E1, 'M', u'l'),
    (0x2E2, 'M', u's'),
    (0x2E3, 'M', u'x'),
    (0x2E4, 'M', u'ʕ'),
    (0x2E5, 'V'),
    (0x340, 'M', u'̀'),
    (0x341, 'M', u'́'),
    (0x342, 'V'),
    (0x343, 'M', u'̓'),
    (0x344, 'M', u'̈́'),
    (0x345, 'M', u'ι'),
    (0x346, 'V'),
    (0x34F, 'I'),
    (0x350, 'V'),
    (0x370, 'M', u'ͱ'),
    (0x371, 'V'),
    (0x372, 'M', u'ͳ'),
    (0x373, 'V'),
    (0x374, 'M', u'ʹ'),
    (0x375, 'V'),
    (0x376, 'M', u'ͷ'),
    (0x377, 'V'),
    ]

def _seg_6():
    return [
    (0x378, 'X'),
    (0x37A, '3', u' ι'),
    (0x37B, 'V'),
    (0x37E, '3', u';'),
    (0x37F, 'X'),
    (0x384, '3', u' ́'),
    (0x385, '3', u' ̈́'),
    (0x386, 'M', u'ά'),
    (0x387, 'M', u'·'),
    (0x388, 'M', u'έ'),
    (0x389, 'M', u'ή'),
    (0x38A, 'M', u'ί'),
    (0x38B, 'X'),
    (0x38C, 'M', u'ό'),
    (0x38D, 'X'),
    (0x38E, 'M', u'ύ'),
    (0x38F, 'M', u'ώ'),
    (0x390, 'V'),
    (0x391, 'M', u'α'),
    (0x392, 'M', u'β'),
    (0x393, 'M', u'γ'),
    (0x394, 'M', u'δ'),
    (0x395, 'M', u'ε'),
    (0x396, 'M', u'ζ'),
    (0x397, 'M', u'η'),
    (0x398, 'M', u'θ'),
    (0x399, 'M', u'ι'),
    (0x39A, 'M', u'κ'),
    (0x39B, 'M', u'λ'),
    (0x39C, 'M', u'μ'),
    (0x39D, 'M', u'ν'),
    (0x39E, 'M', u'ξ'),
    (0x39F, 'M', u'ο'),
    (0x3A0, 'M', u'π'),
    (0x3A1, 'M', u'ρ'),
    (0x3A2, 'X'),
    (0x3A3, 'M', u'σ'),
    (0x3A4, 'M', u'τ'),
    (0x3A5, 'M', u'υ'),
    (0x3A6, 'M', u'φ'),
    (0x3A7, 'M', u'χ'),
    (0x3A8, 'M', u'ψ'),
    (0x3A9, 'M', u'ω'),
    (0x3AA, 'M', u'ϊ'),
    (0x3AB, 'M', u'ϋ'),
    (0x3AC, 'V'),
    (0x3C2, 'D', u'σ'),
    (0x3C3, 'V'),
    (0x3CF, 'M', u'ϗ'),
    (0x3D0, 'M', u'β'),
    (0x3D1, 'M', u'θ'),
    (0x3D2, 'M', u'υ'),
    (0x3D3, 'M', u'ύ'),
    (0x3D4, 'M', u'ϋ'),
    (0x3D5, 'M', u'φ'),
    (0x3D6, 'M', u'π'),
    (0x3D7, 'V'),
    (0x3D8, 'M', u'ϙ'),
    (0x3D9, 'V'),
    (0x3DA, 'M', u'ϛ'),
    (0x3DB, 'V'),
    (0x3DC, 'M', u'ϝ'),
    (0x3DD, 'V'),
    (0x3DE, 'M', u'ϟ'),
    (0x3DF, 'V'),
    (0x3E0, 'M', u'ϡ'),
    (0x3E1, 'V'),
    (0x3E2, 'M', u'ϣ'),
    (0x3E3, 'V'),
    (0x3E4, 'M', u'ϥ'),
    (0x3E5, 'V'),
    (0x3E6, 'M', u'ϧ'),
    (0x3E7, 'V'),
    (0x3E8, 'M', u'ϩ'),
    (0x3E9, 'V'),
    (0x3EA, 'M', u'ϫ'),
    (0x3EB, 'V'),
    (0x3EC, 'M', u'ϭ'),
    (0x3ED, 'V'),
    (0x3EE, 'M', u'ϯ'),
    (0x3EF, 'V'),
    (0x3F0, 'M', u'κ'),
    (0x3F1, 'M', u'ρ'),
    (0x3F2, 'M', u'σ'),
    (0x3F3, 'V'),
    (0x3F4, 'M', u'θ'),
    (0x3F5, 'M', u'ε'),
    (0x3F6, 'V'),
    (0x3F7, 'M', u'ϸ'),
    (0x3F8, 'V'),
    (0x3F9, 'M', u'σ'),
    (0x3FA, 'M', u'ϻ'),
    (0x3FB, 'V'),
    (0x3FD, 'M', u'ͻ'),
    (0x3FE, 'M', u'ͼ'),
    (0x3FF, 'M', u'ͽ'),
    (0x400, 'M', u'ѐ'),
    (0x401, 'M', u'ё'),
    (0x402, 'M', u'ђ'),
    (0x403, 'M', u'ѓ'),
    ]

def _seg_7():
    return [
    (0x404, 'M', u'є'),
    (0x405, 'M', u'ѕ'),
    (0x406, 'M', u'і'),
    (0x407, 'M', u'ї'),
    (0x408, 'M', u'ј'),
    (0x409, 'M', u'љ'),
    (0x40A, 'M', u'њ'),
    (0x40B, 'M', u'ћ'),
    (0x40C, 'M', u'ќ'),
    (0x40D, 'M', u'ѝ'),
    (0x40E, 'M', u'ў'),
    (0x40F, 'M', u'џ'),
    (0x410, 'M', u'а'),
    (0x411, 'M', u'б'),
    (0x412, 'M', u'в'),
    (0x413, 'M', u'г'),
    (0x414, 'M', u'д'),
    (0x415, 'M', u'е'),
    (0x416, 'M', u'ж'),
    (0x417, 'M', u'з'),
    (0x418, 'M', u'и'),
    (0x419, 'M', u'й'),
    (0x41A, 'M', u'к'),
    (0x41B, 'M', u'л'),
    (0x41C, 'M', u'м'),
    (0x41D, 'M', u'н'),
    (0x41E, 'M', u'о'),
    (0x41F, 'M', u'п'),
    (0x420, 'M', u'р'),
    (0x421, 'M', u'с'),
    (0x422, 'M', u'т'),
    (0x423, 'M', u'у'),
    (0x424, 'M', u'ф'),
    (0x425, 'M', u'х'),
    (0x426, 'M', u'ц'),
    (0x427, 'M', u'ч'),
    (0x428, 'M', u'ш'),
    (0x429, 'M', u'щ'),
    (0x42A, 'M', u'ъ'),
    (0x42B, 'M', u'ы'),
    (0x42C, 'M', u'ь'),
    (0x42D, 'M', u'э'),
    (0x42E, 'M', u'ю'),
    (0x42F, 'M', u'я'),
    (0x430, 'V'),
    (0x460, 'M', u'ѡ'),
    (0x461, 'V'),
    (0x462, 'M', u'ѣ'),
    (0x463, 'V'),
    (0x464, 'M', u'ѥ'),
    (0x465, 'V'),
    (0x466, 'M', u'ѧ'),
    (0x467, 'V'),
    (0x468, 'M', u'ѩ'),
    (0x469, 'V'),
    (0x46A, 'M', u'ѫ'),
    (0x46B, 'V'),
    (0x46C, 'M', u'ѭ'),
    (0x46D, 'V'),
    (0x46E, 'M', u'ѯ'),
    (0x46F, 'V'),
    (0x470, 'M', u'ѱ'),
    (0x471, 'V'),
    (0x472, 'M', u'ѳ'),
    (0x473, 'V'),
    (0x474, 'M', u'ѵ'),
    (0x475, 'V'),
    (0x476, 'M', u'ѷ'),
    (0x477, 'V'),
    (0x478, 'M', u'ѹ'),
    (0x479, 'V'),
    (0x47A, 'M', u'ѻ'),
    (0x47B, 'V'),
    (0x47C, 'M', u'ѽ'),
    (0x47D, 'V'),
    (0x47E, 'M', u'ѿ'),
    (0x47F, 'V'),
    (0x480, 'M', u'ҁ'),
    (0x481, 'V'),
    (0x48A, 'M', u'ҋ'),
    (0x48B, 'V'),
    (0x48C, 'M', u'ҍ'),
    (0x48D, 'V'),
    (0x48E, 'M', u'ҏ'),
    (0x48F, 'V'),
    (0x490, 'M', u'ґ'),
    (0x491, 'V'),
    (0x492, 'M', u'ғ'),
    (0x493, 'V'),
    (0x494, 'M', u'ҕ'),
    (0x495, 'V'),
    (0x496, 'M', u'җ'),
    (0x497, 'V'),
    (0x498, 'M', u'ҙ'),
    (0x499, 'V'),
    (0x49A, 'M', u'қ'),
    (0x49B, 'V'),
    (0x49C, 'M', u'ҝ'),
    (0x49D, 'V'),
    (0x49E, 'M', u'ҟ'),
    ]

def _seg_8():
    return [
    (0x49F, 'V'),
    (0x4A0, 'M', u'ҡ'),
    (0x4A1, 'V'),
    (0x4A2, 'M', u'ң'),
    (0x4A3, 'V'),
    (0x4A4, 'M', u'ҥ'),
    (0x4A5, 'V'),
    (0x4A6, 'M', u'ҧ'),
    (0x4A7, 'V'),
    (0x4A8, 'M', u'ҩ'),
    (0x4A9, 'V'),
    (0x4AA, 'M', u'ҫ'),
    (0x4AB, 'V'),
    (0x4AC, 'M', u'ҭ'),
    (0x4AD, 'V'),
    (0x4AE, 'M', u'ү'),
    (0x4AF, 'V'),
    (0x4B0, 'M', u'ұ'),
    (0x4B1, 'V'),
    (0x4B2, 'M', u'ҳ'),
    (0x4B3, 'V'),
    (0x4B4, 'M', u'ҵ'),
    (0x4B5, 'V'),
    (0x4B6, 'M', u'ҷ'),
    (0x4B7, 'V'),
    (0x4B8, 'M', u'ҹ'),
    (0x4B9, 'V'),
    (0x4BA, 'M', u'һ'),
    (0x4BB, 'V'),
    (0x4BC, 'M', u'ҽ'),
    (0x4BD, 'V'),
    (0x4BE, 'M', u'ҿ'),
    (0x4BF, 'V'),
    (0x4C0, 'X'),
    (0x4C1, 'M', u'ӂ'),
    (0x4C2, 'V'),
    (0x4C3, 'M', u'ӄ'),
    (0x4C4, 'V'),
    (0x4C5, 'M', u'ӆ'),
    (0x4C6, 'V'),
    (0x4C7, 'M', u'ӈ'),
    (0x4C8, 'V'),
    (0x4C9, 'M', u'ӊ'),
    (0x4CA, 'V'),
    (0x4CB, 'M', u'ӌ'),
    (0x4CC, 'V'),
    (0x4CD, 'M', u'ӎ'),
    (0x4CE, 'V'),
    (0x4D0, 'M', u'ӑ'),
    (0x4D1, 'V'),
    (0x4D2, 'M', u'ӓ'),
    (0x4D3, 'V'),
    (0x4D4, 'M', u'ӕ'),
    (0x4D5, 'V'),
    (0x4D6, 'M', u'ӗ'),
    (0x4D7, 'V'),
    (0x4D8, 'M', u'ә'),
    (0x4D9, 'V'),
    (0x4DA, 'M', u'ӛ'),
    (0x4DB, 'V'),
    (0x4DC, 'M', u'ӝ'),
    (0x4DD, 'V'),
    (0x4DE, 'M', u'ӟ'),
    (0x4DF, 'V'),
    (0x4E0, 'M', u'ӡ'),
    (0x4E1, 'V'),
    (0x4E2, 'M', u'ӣ'),
    (0x4E3, 'V'),
    (0x4E4, 'M', u'ӥ'),
    (0x4E5, 'V'),
    (0x4E6, 'M', u'ӧ'),
    (0x4E7, 'V'),
    (0x4E8, 'M', u'ө'),
    (0x4E9, 'V'),
    (0x4EA, 'M', u'ӫ'),
    (0x4EB, 'V'),
    (0x4EC, 'M', u'ӭ'),
    (0x4ED, 'V'),
    (0x4EE, 'M', u'ӯ'),
    (0x4EF, 'V'),
    (0x4F0, 'M', u'ӱ'),
    (0x4F1, 'V'),
    (0x4F2, 'M', u'ӳ'),
    (0x4F3, 'V'),
    (0x4F4, 'M', u'ӵ'),
    (0x4F5, 'V'),
    (0x4F6, 'M', u'ӷ'),
    (0x4F7, 'V'),
    (0x4F8, 'M', u'ӹ'),
    (0x4F9, 'V'),
    (0x4FA, 'M', u'ӻ'),
    (0x4FB, 'V'),
    (0x4FC, 'M', u'ӽ'),
    (0x4FD, 'V'),
    (0x4FE, 'M', u'ӿ'),
    (0x4FF, 'V'),
    (0x500, 'M', u'ԁ'),
    (0x501, 'V'),
    (0x502, 'M', u'ԃ'),
    (0x503, 'V'),
    ]

def _seg_9():
    return [
    (0x504, 'M', u'ԅ'),
    (0x505, 'V'),
    (0x506, 'M', u'ԇ'),
    (0x507, 'V'),
    (0x508, 'M', u'ԉ'),
    (0x509, 'V'),
    (0x50A, 'M', u'ԋ'),
    (0x50B, 'V'),
    (0x50C, 'M', u'ԍ'),
    (0x50D, 'V'),
    (0x50E, 'M', u'ԏ'),
    (0x50F, 'V'),
    (0x510, 'M', u'ԑ'),
    (0x511, 'V'),
    (0x512, 'M', u'ԓ'),
    (0x513, 'V'),
    (0x514, 'M', u'ԕ'),
    (0x515, 'V'),
    (0x516, 'M', u'ԗ'),
    (0x517, 'V'),
    (0x518, 'M', u'ԙ'),
    (0x519, 'V'),
    (0x51A, 'M', u'ԛ'),
    (0x51B, 'V'),
    (0x51C, 'M', u'ԝ'),
    (0x51D, 'V'),
    (0x51E, 'M', u'ԟ'),
    (0x51F, 'V'),
    (0x520, 'M', u'ԡ'),
    (0x521, 'V'),
    (0x522, 'M', u'ԣ'),
    (0x523, 'V'),
    (0x524, 'M', u'ԥ'),
    (0x525, 'V'),
    (0x526, 'M', u'ԧ'),
    (0x527, 'V'),
    (0x528, 'X'),
    (0x531, 'M', u'ա'),
    (0x532, 'M', u'բ'),
    (0x533, 'M', u'գ'),
    (0x534, 'M', u'դ'),
    (0x535, 'M', u'ե'),
    (0x536, 'M', u'զ'),
    (0x537, 'M', u'է'),
    (0x538, 'M', u'ը'),
    (0x539, 'M', u'թ'),
    (0x53A, 'M', u'ժ'),
    (0x53B, 'M', u'ի'),
    (0x53C, 'M', u'լ'),
    (0x53D, 'M', u'խ'),
    (0x53E, 'M', u'ծ'),
    (0x53F, 'M', u'կ'),
    (0x540, 'M', u'հ'),
    (0x541, 'M', u'ձ'),
    (0x542, 'M', u'ղ'),
    (0x543, 'M', u'ճ'),
    (0x544, 'M', u'մ'),
    (0x545, 'M', u'յ'),
    (0x546, 'M', u'ն'),
    (0x547, 'M', u'շ'),
    (0x548, 'M', u'ո'),
    (0x549, 'M', u'չ'),
    (0x54A, 'M', u'պ'),
    (0x54B, 'M', u'ջ'),
    (0x54C, 'M', u'ռ'),
    (0x54D, 'M', u'ս'),
    (0x54E, 'M', u'վ'),
    (0x54F, 'M', u'տ'),
    (0x550, 'M', u'ր'),
    (0x551, 'M', u'ց'),
    (0x552, 'M', u'ւ'),
    (0x553, 'M', u'փ'),
    (0x554, 'M', u'ք'),
    (0x555, 'M', u'օ'),
    (0x556, 'M', u'ֆ'),
    (0x557, 'X'),
    (0x559, 'V'),
    (0x560, 'X'),
    (0x561, 'V'),
    (0x587, 'M', u'եւ'),
    (0x588, 'X'),
    (0x589, 'V'),
    (0x58B, 'X'),
    (0x58F, 'V'),
    (0x590, 'X'),
    (0x591, 'V'),
    (0x5C8, 'X'),
    (0x5D0, 'V'),
    (0x5EB, 'X'),
    (0x5F0, 'V'),
    (0x5F5, 'X'),
    (0x606, 'V'),
    (0x61C, 'X'),
    (0x61E, 'V'),
    (0x675, 'M', u'اٴ'),
    (0x676, 'M', u'وٴ'),
    (0x677, 'M', u'ۇٴ'),
    (0x678, 'M', u'يٴ'),
    (0x679, 'V'),
    (0x6DD, 'X'),
    ]

def _seg_10():
    return [
    (0x6DE, 'V'),
    (0x70E, 'X'),
    (0x710, 'V'),
    (0x74B, 'X'),
    (0x74D, 'V'),
    (0x7B2, 'X'),
    (0x7C0, 'V'),
    (0x7FB, 'X'),
    (0x800, 'V'),
    (0x82E, 'X'),
    (0x830, 'V'),
    (0x83F, 'X'),
    (0x840, 'V'),
    (0x85C, 'X'),
    (0x85E, 'V'),
    (0x85F, 'X'),
    (0x8A0, 'V'),
    (0x8A1, 'X'),
    (0x8A2, 'V'),
    (0x8AD, 'X'),
    (0x8E4, 'V'),
    (0x8FF, 'X'),
    (0x900, 'V'),
    (0x958, 'M', u'क़'),
    (0x959, 'M', u'ख़'),
    (0x95A, 'M', u'ग़'),
    (0x95B, 'M', u'ज़'),
    (0x95C, 'M', u'ड़'),
    (0x95D, 'M', u'ढ़'),
    (0x95E, 'M', u'फ़'),
    (0x95F, 'M', u'य़'),
    (0x960, 'V'),
    (0x978, 'X'),
    (0x979, 'V'),
    (0x980, 'X'),
    (0x981, 'V'),
    (0x984, 'X'),
    (0x985, 'V'),
    (0x98D, 'X'),
    (0x98F, 'V'),
    (0x991, 'X'),
    (0x993, 'V'),
    (0x9A9, 'X'),
    (0x9AA, 'V'),
    (0x9B1, 'X'),
    (0x9B2, 'V'),
    (0x9B3, 'X'),
    (0x9B6, 'V'),
    (0x9BA, 'X'),
    (0x9BC, 'V'),
    (0x9C5, 'X'),
    (0x9C7, 'V'),
    (0x9C9, 'X'),
    (0x9CB, 'V'),
    (0x9CF, 'X'),
    (0x9D7, 'V'),
    (0x9D8, 'X'),
    (0x9DC, 'M', u'ড়'),
    (0x9DD, 'M', u'ঢ়'),
    (0x9DE, 'X'),
    (0x9DF, 'M', u'য়'),
    (0x9E0, 'V'),
    (0x9E4, 'X'),
    (0x9E6, 'V'),
    (0x9FC, 'X'),
    (0xA01, 'V'),
    (0xA04, 'X'),
    (0xA05, 'V'),
    (0xA0B, 'X'),
    (0xA0F, 'V'),
    (0xA11, 'X'),
    (0xA13, 'V'),
    (0xA29, 'X'),
    (0xA2A, 'V'),
    (0xA31, 'X'),
    (0xA32, 'V'),
    (0xA33, 'M', u'ਲ਼'),
    (0xA34, 'X'),
    (0xA35, 'V'),
    (0xA36, 'M', u'ਸ਼'),
    (0xA37, 'X'),
    (0xA38, 'V'),
    (0xA3A, 'X'),
    (0xA3C, 'V'),
    (0xA3D, 'X'),
    (0xA3E, 'V'),
    (0xA43, 'X'),
    (0xA47, 'V'),
    (0xA49, 'X'),
    (0xA4B, 'V'),
    (0xA4E, 'X'),
    (0xA51, 'V'),
    (0xA52, 'X'),
    (0xA59, 'M', u'ਖ਼'),
    (0xA5A, 'M', u'ਗ਼'),
    (0xA5B, 'M', u'ਜ਼'),
    (0xA5C, 'V'),
    (0xA5D, 'X'),
    (0xA5E, 'M', u'ਫ਼'),
    (0xA5F, 'X'),
    ]

def _seg_11():
    return [
    (0xA66, 'V'),
    (0xA76, 'X'),
    (0xA81, 'V'),
    (0xA84, 'X'),
    (0xA85, 'V'),
    (0xA8E, 'X'),
    (0xA8F, 'V'),
    (0xA92, 'X'),
    (0xA93, 'V'),
    (0xAA9, 'X'),
    (0xAAA, 'V'),
    (0xAB1, 'X'),
    (0xAB2, 'V'),
    (0xAB4, 'X'),
    (0xAB5, 'V'),
    (0xABA, 'X'),
    (0xABC, 'V'),
    (0xAC6, 'X'),
    (0xAC7, 'V'),
    (0xACA, 'X'),
    (0xACB, 'V'),
    (0xACE, 'X'),
    (0xAD0, 'V'),
    (0xAD1, 'X'),
    (0xAE0, 'V'),
    (0xAE4, 'X'),
    (0xAE6, 'V'),
    (0xAF2, 'X'),
    (0xB01, 'V'),
    (0xB04, 'X'),
    (0xB05, 'V'),
    (0xB0D, 'X'),
    (0xB0F, 'V'),
    (0xB11, 'X'),
    (0xB13, 'V'),
    (0xB29, 'X'),
    (0xB2A, 'V'),
    (0xB31, 'X'),
    (0xB32, 'V'),
    (0xB34, 'X'),
    (0xB35, 'V'),
    (0xB3A, 'X'),
    (0xB3C, 'V'),
    (0xB45, 'X'),
    (0xB47, 'V'),
    (0xB49, 'X'),
    (0xB4B, 'V'),
    (0xB4E, 'X'),
    (0xB56, 'V'),
    (0xB58, 'X'),
    (0xB5C, 'M', u'ଡ଼'),
    (0xB5D, 'M', u'ଢ଼'),
    (0xB5E, 'X'),
    (0xB5F, 'V'),
    (0xB64, 'X'),
    (0xB66, 'V'),
    (0xB78, 'X'),
    (0xB82, 'V'),
    (0xB84, 'X'),
    (0xB85, 'V'),
    (0xB8B, 'X'),
    (0xB8E, 'V'),
    (0xB91, 'X'),
    (0xB92, 'V'),
    (0xB96, 'X'),
    (0xB99, 'V'),
    (0xB9B, 'X'),
    (0xB9C, 'V'),
    (0xB9D, 'X'),
    (0xB9E, 'V'),
    (0xBA0, 'X'),
    (0xBA3, 'V'),
    (0xBA5, 'X'),
    (0xBA8, 'V'),
    (0xBAB, 'X'),
    (0xBAE, 'V'),
    (0xBBA, 'X'),
    (0xBBE, 'V'),
    (0xBC3, 'X'),
    (0xBC6, 'V'),
    (0xBC9, 'X'),
    (0xBCA, 'V'),
    (0xBCE, 'X'),
    (0xBD0, 'V'),
    (0xBD1, 'X'),
    (0xBD7, 'V'),
    (0xBD8, 'X'),
    (0xBE6, 'V'),
    (0xBFB, 'X'),
    (0xC01, 'V'),
    (0xC04, 'X'),
    (0xC05, 'V'),
    (0xC0D, 'X'),
    (0xC0E, 'V'),
    (0xC11, 'X'),
    (0xC12, 'V'),
    (0xC29, 'X'),
    (0xC2A, 'V'),
    (0xC34, 'X'),
    (0xC35, 'V'),
    ]

def _seg_12():
    return [
    (0xC3A, 'X'),
    (0xC3D, 'V'),
    (0xC45, 'X'),
    (0xC46, 'V'),
    (0xC49, 'X'),
    (0xC4A, 'V'),
    (0xC4E, 'X'),
    (0xC55, 'V'),
    (0xC57, 'X'),
    (0xC58, 'V'),
    (0xC5A, 'X'),
    (0xC60, 'V'),
    (0xC64, 'X'),
    (0xC66, 'V'),
    (0xC70, 'X'),
    (0xC78, 'V'),
    (0xC80, 'X'),
    (0xC82, 'V'),
    (0xC84, 'X'),
    (0xC85, 'V'),
    (0xC8D, 'X'),
    (0xC8E, 'V'),
    (0xC91, 'X'),
    (0xC92, 'V'),
    (0xCA9, 'X'),
    (0xCAA, 'V'),
    (0xCB4, 'X'),
    (0xCB5, 'V'),
    (0xCBA, 'X'),
    (0xCBC, 'V'),
    (0xCC5, 'X'),
    (0xCC6, 'V'),
    (0xCC9, 'X'),
    (0xCCA, 'V'),
    (0xCCE, 'X'),
    (0xCD5, 'V'),
    (0xCD7, 'X'),
    (0xCDE, 'V'),
    (0xCDF, 'X'),
    (0xCE0, 'V'),
    (0xCE4, 'X'),
    (0xCE6, 'V'),
    (0xCF0, 'X'),
    (0xCF1, 'V'),
    (0xCF3, 'X'),
    (0xD02, 'V'),
    (0xD04, 'X'),
    (0xD05, 'V'),
    (0xD0D, 'X'),
    (0xD0E, 'V'),
    (0xD11, 'X'),
    (0xD12, 'V'),
    (0xD3B, 'X'),
    (0xD3D, 'V'),
    (0xD45, 'X'),
    (0xD46, 'V'),
    (0xD49, 'X'),
    (0xD4A, 'V'),
    (0xD4F, 'X'),
    (0xD57, 'V'),
    (0xD58, 'X'),
    (0xD60, 'V'),
    (0xD64, 'X'),
    (0xD66, 'V'),
    (0xD76, 'X'),
    (0xD79, 'V'),
    (0xD80, 'X'),
    (0xD82, 'V'),
    (0xD84, 'X'),
    (0xD85, 'V'),
    (0xD97, 'X'),
    (0xD9A, 'V'),
    (0xDB2, 'X'),
    (0xDB3, 'V'),
    (0xDBC, 'X'),
    (0xDBD, 'V'),
    (0xDBE, 'X'),
    (0xDC0, 'V'),
    (0xDC7, 'X'),
    (0xDCA, 'V'),
    (0xDCB, 'X'),
    (0xDCF, 'V'),
    (0xDD5, 'X'),
    (0xDD6, 'V'),
    (0xDD7, 'X'),
    (0xDD8, 'V'),
    (0xDE0, 'X'),
    (0xDF2, 'V'),
    (0xDF5, 'X'),
    (0xE01, 'V'),
    (0xE33, 'M', u'ํา'),
    (0xE34, 'V'),
    (0xE3B, 'X'),
    (0xE3F, 'V'),
    (0xE5C, 'X'),
    (0xE81, 'V'),
    (0xE83, 'X'),
    (0xE84, 'V'),
    (0xE85, 'X'),
    (0xE87, 'V'),
    ]

def _seg_13():
    return [
    (0xE89, 'X'),
    (0xE8A, 'V'),
    (0xE8B, 'X'),
    (0xE8D, 'V'),
    (0xE8E, 'X'),
    (0xE94, 'V'),
    (0xE98, 'X'),
    (0xE99, 'V'),
    (0xEA0, 'X'),
    (0xEA1, 'V'),
    (0xEA4, 'X'),
    (0xEA5, 'V'),
    (0xEA6, 'X'),
    (0xEA7, 'V'),
    (0xEA8, 'X'),
    (0xEAA, 'V'),
    (0xEAC, 'X'),
    (0xEAD, 'V'),
    (0xEB3, 'M', u'ໍາ'),
    (0xEB4, 'V'),
    (0xEBA, 'X'),
    (0xEBB, 'V'),
    (0xEBE, 'X'),
    (0xEC0, 'V'),
    (0xEC5, 'X'),
    (0xEC6, 'V'),
    (0xEC7, 'X'),
    (0xEC8, 'V'),
    (0xECE, 'X'),
    (0xED0, 'V'),
    (0xEDA, 'X'),
    (0xEDC, 'M', u'ຫນ'),
    (0xEDD, 'M', u'ຫມ'),
    (0xEDE, 'V'),
    (0xEE0, 'X'),
    (0xF00, 'V'),
    (0xF0C, 'M', u'་'),
    (0xF0D, 'V'),
    (0xF43, 'M', u'གྷ'),
    (0xF44, 'V'),
    (0xF48, 'X'),
    (0xF49, 'V'),
    (0xF4D, 'M', u'ཌྷ'),
    (0xF4E, 'V'),
    (0xF52, 'M', u'དྷ'),
    (0xF53, 'V'),
    (0xF57, 'M', u'བྷ'),
    (0xF58, 'V'),
    (0xF5C, 'M', u'ཛྷ'),
    (0xF5D, 'V'),
    (0xF69, 'M', u'ཀྵ'),
    (0xF6A, 'V'),
    (0xF6D, 'X'),
    (0xF71, 'V'),
    (0xF73, 'M', u'ཱི'),
    (0xF74, 'V'),
    (0xF75, 'M', u'ཱུ'),
    (0xF76, 'M', u'ྲྀ'),
    (0xF77, 'M', u'ྲཱྀ'),
    (0xF78, 'M', u'ླྀ'),
    (0xF79, 'M', u'ླཱྀ'),
    (0xF7A, 'V'),
    (0xF81, 'M', u'ཱྀ'),
    (0xF82, 'V'),
    (0xF93, 'M', u'ྒྷ'),
    (0xF94, 'V'),
    (0xF98, 'X'),
    (0xF99, 'V'),
    (0xF9D, 'M', u'ྜྷ'),
    (0xF9E, 'V'),
    (0xFA2, 'M', u'ྡྷ'),
    (0xFA3, 'V'),
    (0xFA7, 'M', u'ྦྷ'),
    (0xFA8, 'V'),
    (0xFAC, 'M', u'ྫྷ'),
    (0xFAD, 'V'),
    (0xFB9, 'M', u'ྐྵ'),
    (0xFBA, 'V'),
    (0xFBD, 'X'),
    (0xFBE, 'V'),
    (0xFCD, 'X'),
    (0xFCE, 'V'),
    (0xFDB, 'X'),
    (0x1000, 'V'),
    (0x10A0, 'X'),
    (0x10C7, 'M', u'ⴧ'),
    (0x10C8, 'X'),
    (0x10CD, 'M', u'ⴭ'),
    (0x10CE, 'X'),
    (0x10D0, 'V'),
    (0x10FC, 'M', u'ნ'),
    (0x10FD, 'V'),
    (0x115F, 'X'),
    (0x1161, 'V'),
    (0x1249, 'X'),
    (0x124A, 'V'),
    (0x124E, 'X'),
    (0x1250, 'V'),
    (0x1257, 'X'),
    (0x1258, 'V'),
    ]

def _seg_14():
    return [
    (0x1259, 'X'),
    (0x125A, 'V'),
    (0x125E, 'X'),
    (0x1260, 'V'),
    (0x1289, 'X'),
    (0x128A, 'V'),
    (0x128E, 'X'),
    (0x1290, 'V'),
    (0x12B1, 'X'),
    (0x12B2, 'V'),
    (0x12B6, 'X'),
    (0x12B8, 'V'),
    (0x12BF, 'X'),
    (0x12C0, 'V'),
    (0x12C1, 'X'),
    (0x12C2, 'V'),
    (0x12C6, 'X'),
    (0x12C8, 'V'),
    (0x12D7, 'X'),
    (0x12D8, 'V'),
    (0x1311, 'X'),
    (0x1312, 'V'),
    (0x1316, 'X'),
    (0x1318, 'V'),
    (0x135B, 'X'),
    (0x135D, 'V'),
    (0x137D, 'X'),
    (0x1380, 'V'),
    (0x139A, 'X'),
    (0x13A0, 'V'),
    (0x13F5, 'X'),
    (0x1400, 'V'),
    (0x1680, 'X'),
    (0x1681, 'V'),
    (0x169D, 'X'),
    (0x16A0, 'V'),
    (0x16F1, 'X'),
    (0x1700, 'V'),
    (0x170D, 'X'),
    (0x170E, 'V'),
    (0x1715, 'X'),
    (0x1720, 'V'),
    (0x1737, 'X'),
    (0x1740, 'V'),
    (0x1754, 'X'),
    (0x1760, 'V'),
    (0x176D, 'X'),
    (0x176E, 'V'),
    (0x1771, 'X'),
    (0x1772, 'V'),
    (0x1774, 'X'),
    (0x1780, 'V'),
    (0x17B4, 'X'),
    (0x17B6, 'V'),
    (0x17DE, 'X'),
    (0x17E0, 'V'),
    (0x17EA, 'X'),
    (0x17F0, 'V'),
    (0x17FA, 'X'),
    (0x1800, 'V'),
    (0x1806, 'X'),
    (0x1807, 'V'),
    (0x180B, 'I'),
    (0x180E, 'X'),
    (0x1810, 'V'),
    (0x181A, 'X'),
    (0x1820, 'V'),
    (0x1878, 'X'),
    (0x1880, 'V'),
    (0x18AB, 'X'),
    (0x18B0, 'V'),
    (0x18F6, 'X'),
    (0x1900, 'V'),
    (0x191D, 'X'),
    (0x1920, 'V'),
    (0x192C, 'X'),
    (0x1930, 'V'),
    (0x193C, 'X'),
    (0x1940, 'V'),
    (0x1941, 'X'),
    (0x1944, 'V'),
    (0x196E, 'X'),
    (0x1970, 'V'),
    (0x1975, 'X'),
    (0x1980, 'V'),
    (0x19AC, 'X'),
    (0x19B0, 'V'),
    (0x19CA, 'X'),
    (0x19D0, 'V'),
    (0x19DB, 'X'),
    (0x19DE, 'V'),
    (0x1A1C, 'X'),
    (0x1A1E, 'V'),
    (0x1A5F, 'X'),
    (0x1A60, 'V'),
    (0x1A7D, 'X'),
    (0x1A7F, 'V'),
    (0x1A8A, 'X'),
    (0x1A90, 'V'),
    (0x1A9A, 'X'),
    ]

def _seg_15():
    return [
    (0x1AA0, 'V'),
    (0x1AAE, 'X'),
    (0x1B00, 'V'),
    (0x1B4C, 'X'),
    (0x1B50, 'V'),
    (0x1B7D, 'X'),
    (0x1B80, 'V'),
    (0x1BF4, 'X'),
    (0x1BFC, 'V'),
    (0x1C38, 'X'),
    (0x1C3B, 'V'),
    (0x1C4A, 'X'),
    (0x1C4D, 'V'),
    (0x1C80, 'X'),
    (0x1CC0, 'V'),
    (0x1CC8, 'X'),
    (0x1CD0, 'V'),
    (0x1CF7, 'X'),
    (0x1D00, 'V'),
    (0x1D2C, 'M', u'a'),
    (0x1D2D, 'M', u'æ'),
    (0x1D2E, 'M', u'b'),
    (0x1D2F, 'V'),
    (0x1D30, 'M', u'd'),
    (0x1D31, 'M', u'e'),
    (0x1D32, 'M', u'ǝ'),
    (0x1D33, 'M', u'g'),
    (0x1D34, 'M', u'h'),
    (0x1D35, 'M', u'i'),
    (0x1D36, 'M', u'j'),
    (0x1D37, 'M', u'k'),
    (0x1D38, 'M', u'l'),
    (0x1D39, 'M', u'm'),
    (0x1D3A, 'M', u'n'),
    (0x1D3B, 'V'),
    (0x1D3C, 'M', u'o'),
    (0x1D3D, 'M', u'ȣ'),
    (0x1D3E, 'M', u'p'),
    (0x1D3F, 'M', u'r'),
    (0x1D40, 'M', u't'),
    (0x1D41, 'M', u'u'),
    (0x1D42, 'M', u'w'),
    (0x1D43, 'M', u'a'),
    (0x1D44, 'M', u'ɐ'),
    (0x1D45, 'M', u'ɑ'),
    (0x1D46, 'M', u'ᴂ'),
    (0x1D47, 'M', u'b'),
    (0x1D48, 'M', u'd'),
    (0x1D49, 'M', u'e'),
    (0x1D4A, 'M', u'ə'),
    (0x1D4B, 'M', u'ɛ'),
    (0x1D4C, 'M', u'ɜ'),
    (0x1D4D, 'M', u'g'),
    (0x1D4E, 'V'),
    (0x1D4F, 'M', u'k'),
    (0x1D50, 'M', u'm'),
    (0x1D51, 'M', u'ŋ'),
    (0x1D52, 'M', u'o'),
    (0x1D53, 'M', u'ɔ'),
    (0x1D54, 'M', u'ᴖ'),
    (0x1D55, 'M', u'ᴗ'),
    (0x1D56, 'M', u'p'),
    (0x1D57, 'M', u't'),
    (0x1D58, 'M', u'u'),
    (0x1D59, 'M', u'ᴝ'),
    (0x1D5A, 'M', u'ɯ'),
    (0x1D5B, 'M', u'v'),
    (0x1D5C, 'M', u'ᴥ'),
    (0x1D5D, 'M', u'β'),
    (0x1D5E, 'M', u'γ'),
    (0x1D5F, 'M', u'δ'),
    (0x1D60, 'M', u'φ'),
    (0x1D61, 'M', u'χ'),
    (0x1D62, 'M', u'i'),
    (0x1D63, 'M', u'r'),
    (0x1D64, 'M', u'u'),
    (0x1D65, 'M', u'v'),
    (0x1D66, 'M', u'β'),
    (0x1D67, 'M', u'γ'),
    (0x1D68, 'M', u'ρ'),
    (0x1D69, 'M', u'φ'),
    (0x1D6A, 'M', u'χ'),
    (0x1D6B, 'V'),
    (0x1D78, 'M', u'н'),
    (0x1D79, 'V'),
    (0x1D9B, 'M', u'ɒ'),
    (0x1D9C, 'M', u'c'),
    (0x1D9D, 'M', u'ɕ'),
    (0x1D9E, 'M', u'ð'),
    (0x1D9F, 'M', u'ɜ'),
    (0x1DA0, 'M', u'f'),
    (0x1DA1, 'M', u'ɟ'),
    (0x1DA2, 'M', u'ɡ'),
    (0x1DA3, 'M', u'ɥ'),
    (0x1DA4, 'M', u'ɨ'),
    (0x1DA5, 'M', u'ɩ'),
    (0x1DA6, 'M', u'ɪ'),
    (0x1DA7, 'M', u'ᵻ'),
    (0x1DA8, 'M', u'ʝ'),
    (0x1DA9, 'M', u'ɭ'),
    ]

def _seg_16():
    return [
    (0x1DAA, 'M', u'ᶅ'),
    (0x1DAB, 'M', u'ʟ'),
    (0x1DAC, 'M', u'ɱ'),
    (0x1DAD, 'M', u'ɰ'),
    (0x1DAE, 'M', u'ɲ'),
    (0x1DAF, 'M', u'ɳ'),
    (0x1DB0, 'M', u'ɴ'),
    (0x1DB1, 'M', u'ɵ'),
    (0x1DB2, 'M', u'ɸ'),
    (0x1DB3, 'M', u'ʂ'),
    (0x1DB4, 'M', u'ʃ'),
    (0x1DB5, 'M', u'ƫ'),
    (0x1DB6, 'M', u'ʉ'),
    (0x1DB7, 'M', u'ʊ'),
    (0x1DB8, 'M', u'ᴜ'),
    (0x1DB9, 'M', u'ʋ'),
    (0x1DBA, 'M', u'ʌ'),
    (0x1DBB, 'M', u'z'),
    (0x1DBC, 'M', u'ʐ'),
    (0x1DBD, 'M', u'ʑ'),
    (0x1DBE, 'M', u'ʒ'),
    (0x1DBF, 'M', u'θ'),
    (0x1DC0, 'V'),
    (0x1DE7, 'X'),
    (0x1DFC, 'V'),
    (0x1E00, 'M', u'ḁ'),
    (0x1E01, 'V'),
    (0x1E02, 'M', u'ḃ'),
    (0x1E03, 'V'),
    (0x1E04, 'M', u'ḅ'),
    (0x1E05, 'V'),
    (0x1E06, 'M', u'ḇ'),
    (0x1E07, 'V'),
    (0x1E08, 'M', u'ḉ'),
    (0x1E09, 'V'),
    (0x1E0A, 'M', u'ḋ'),
    (0x1E0B, 'V'),
    (0x1E0C, 'M', u'ḍ'),
    (0x1E0D, 'V'),
    (0x1E0E, 'M', u'ḏ'),
    (0x1E0F, 'V'),
    (0x1E10, 'M', u'ḑ'),
    (0x1E11, 'V'),
    (0x1E12, 'M', u'ḓ'),
    (0x1E13, 'V'),
    (0x1E14, 'M', u'ḕ'),
    (0x1E15, 'V'),
    (0x1E16, 'M', u'ḗ'),
    (0x1E17, 'V'),
    (0x1E18, 'M', u'ḙ'),
    (0x1E19, 'V'),
    (0x1E1A, 'M', u'ḛ'),
    (0x1E1B, 'V'),
    (0x1E1C, 'M', u'ḝ'),
    (0x1E1D, 'V'),
    (0x1E1E, 'M', u'ḟ'),
    (0x1E1F, 'V'),
    (0x1E20, 'M', u'ḡ'),
    (0x1E21, 'V'),
    (0x1E22, 'M', u'ḣ'),
    (0x1E23, 'V'),
    (0x1E24, 'M', u'ḥ'),
    (0x1E25, 'V'),
    (0x1E26, 'M', u'ḧ'),
    (0x1E27, 'V'),
    (0x1E28, 'M', u'ḩ'),
    (0x1E29, 'V'),
    (0x1E2A, 'M', u'ḫ'),
    (0x1E2B, 'V'),
    (0x1E2C, 'M', u'ḭ'),
    (0x1E2D, 'V'),
    (0x1E2E, 'M', u'ḯ'),
    (0x1E2F, 'V'),
    (0x1E30, 'M', u'ḱ'),
    (0x1E31, 'V'),
    (0x1E32, 'M', u'ḳ'),
    (0x1E33, 'V'),
    (0x1E34, 'M', u'ḵ'),
    (0x1E35, 'V'),
    (0x1E36, 'M', u'ḷ'),
    (0x1E37, 'V'),
    (0x1E38, 'M', u'ḹ'),
    (0x1E39, 'V'),
    (0x1E3A, 'M', u'ḻ'),
    (0x1E3B, 'V'),
    (0x1E3C, 'M', u'ḽ'),
    (0x1E3D, 'V'),
    (0x1E3E, 'M', u'ḿ'),
    (0x1E3F, 'V'),
    (0x1E40, 'M', u'ṁ'),
    (0x1E41, 'V'),
    (0x1E42, 'M', u'ṃ'),
    (0x1E43, 'V'),
    (0x1E44, 'M', u'ṅ'),
    (0x1E45, 'V'),
    (0x1E46, 'M', u'ṇ'),
    (0x1E47, 'V'),
    (0x1E48, 'M', u'ṉ'),
    (0x1E49, 'V'),
    (0x1E4A, 'M', u'ṋ'),
    ]

def _seg_17():
    return [
    (0x1E4B, 'V'),
    (0x1E4C, 'M', u'ṍ'),
    (0x1E4D, 'V'),
    (0x1E4E, 'M', u'ṏ'),
    (0x1E4F, 'V'),
    (0x1E50, 'M', u'ṑ'),
    (0x1E51, 'V'),
    (0x1E52, 'M', u'ṓ'),
    (0x1E53, 'V'),
    (0x1E54, 'M', u'ṕ'),
    (0x1E55, 'V'),
    (0x1E56, 'M', u'ṗ'),
    (0x1E57, 'V'),
    (0x1E58, 'M', u'ṙ'),
    (0x1E59, 'V'),
    (0x1E5A, 'M', u'ṛ'),
    (0x1E5B, 'V'),
    (0x1E5C, 'M', u'ṝ'),
    (0x1E5D, 'V'),
    (0x1E5E, 'M', u'ṟ'),
    (0x1E5F, 'V'),
    (0x1E60, 'M', u'ṡ'),
    (0x1E61, 'V'),
    (0x1E62, 'M', u'ṣ'),
    (0x1E63, 'V'),
    (0x1E64, 'M', u'ṥ'),
    (0x1E65, 'V'),
    (0x1E66, 'M', u'ṧ'),
    (0x1E67, 'V'),
    (0x1E68, 'M', u'ṩ'),
    (0x1E69, 'V'),
    (0x1E6A, 'M', u'ṫ'),
    (0x1E6B, 'V'),
    (0x1E6C, 'M', u'ṭ'),
    (0x1E6D, 'V'),
    (0x1E6E, 'M', u'ṯ'),
    (0x1E6F, 'V'),
    (0x1E70, 'M', u'ṱ'),
    (0x1E71, 'V'),
    (0x1E72, 'M', u'ṳ'),
    (0x1E73, 'V'),
    (0x1E74, 'M', u'ṵ'),
    (0x1E75, 'V'),
    (0x1E76, 'M', u'ṷ'),
    (0x1E77, 'V'),
    (0x1E78, 'M', u'ṹ'),
    (0x1E79, 'V'),
    (0x1E7A, 'M', u'ṻ'),
    (0x1E7B, 'V'),
    (0x1E7C, 'M', u'ṽ'),
    (0x1E7D, 'V'),
    (0x1E7E, 'M', u'ṿ'),
    (0x1E7F, 'V'),
    (0x1E80, 'M', u'ẁ'),
    (0x1E81, 'V'),
    (0x1E82, 'M', u'ẃ'),
    (0x1E83, 'V'),
    (0x1E84, 'M', u'ẅ'),
    (0x1E85, 'V'),
    (0x1E86, 'M', u'ẇ'),
    (0x1E87, 'V'),
    (0x1E88, 'M', u'ẉ'),
    (0x1E89, 'V'),
    (0x1E8A, 'M', u'ẋ'),
    (0x1E8B, 'V'),
    (0x1E8C, 'M', u'ẍ'),
    (0x1E8D, 'V'),
    (0x1E8E, 'M', u'ẏ'),
    (0x1E8F, 'V'),
    (0x1E90, 'M', u'ẑ'),
    (0x1E91, 'V'),
    (0x1E92, 'M', u'ẓ'),
    (0x1E93, 'V'),
    (0x1E94, 'M', u'ẕ'),
    (0x1E95, 'V'),
    (0x1E9A, 'M', u'aʾ'),
    (0x1E9B, 'M', u'ṡ'),
    (0x1E9C, 'V'),
    (0x1E9E, 'M', u'ss'),
    (0x1E9F, 'V'),
    (0x1EA0, 'M', u'ạ'),
    (0x1EA1, 'V'),
    (0x1EA2, 'M', u'ả'),
    (0x1EA3, 'V'),
    (0x1EA4, 'M', u'ấ'),
    (0x1EA5, 'V'),
    (0x1EA6, 'M', u'ầ'),
    (0x1EA7, 'V'),
    (0x1EA8, 'M', u'ẩ'),
    (0x1EA9, 'V'),
    (0x1EAA, 'M', u'ẫ'),
    (0x1EAB, 'V'),
    (0x1EAC, 'M', u'ậ'),
    (0x1EAD, 'V'),
    (0x1EAE, 'M', u'ắ'),
    (0x1EAF, 'V'),
    (0x1EB0, 'M', u'ằ'),
    (0x1EB1, 'V'),
    (0x1EB2, 'M', u'ẳ'),
    (0x1EB3, 'V'),
    ]

def _seg_18():
    return [
    (0x1EB4, 'M', u'ẵ'),
    (0x1EB5, 'V'),
    (0x1EB6, 'M', u'ặ'),
    (0x1EB7, 'V'),
    (0x1EB8, 'M', u'ẹ'),
    (0x1EB9, 'V'),
    (0x1EBA, 'M', u'ẻ'),
    (0x1EBB, 'V'),
    (0x1EBC, 'M', u'ẽ'),
    (0x1EBD, 'V'),
    (0x1EBE, 'M', u'ế'),
    (0x1EBF, 'V'),
    (0x1EC0, 'M', u'ề'),
    (0x1EC1, 'V'),
    (0x1EC2, 'M', u'ể'),
    (0x1EC3, 'V'),
    (0x1EC4, 'M', u'ễ'),
    (0x1EC5, 'V'),
    (0x1EC6, 'M', u'ệ'),
    (0x1EC7, 'V'),
    (0x1EC8, 'M', u'ỉ'),
    (0x1EC9, 'V'),
    (0x1ECA, 'M', u'ị'),
    (0x1ECB, 'V'),
    (0x1ECC, 'M', u'ọ'),
    (0x1ECD, 'V'),
    (0x1ECE, 'M', u'ỏ'),
    (0x1ECF, 'V'),
    (0x1ED0, 'M', u'ố'),
    (0x1ED1, 'V'),
    (0x1ED2, 'M', u'ồ'),
    (0x1ED3, 'V'),
    (0x1ED4, 'M', u'ổ'),
    (0x1ED5, 'V'),
    (0x1ED6, 'M', u'ỗ'),
    (0x1ED7, 'V'),
    (0x1ED8, 'M', u'ộ'),
    (0x1ED9, 'V'),
    (0x1EDA, 'M', u'ớ'),
    (0x1EDB, 'V'),
    (0x1EDC, 'M', u'ờ'),
    (0x1EDD, 'V'),
    (0x1EDE, 'M', u'ở'),
    (0x1EDF, 'V'),
    (0x1EE0, 'M', u'ỡ'),
    (0x1EE1, 'V'),
    (0x1EE2, 'M', u'ợ'),
    (0x1EE3, 'V'),
    (0x1EE4, 'M', u'ụ'),
    (0x1EE5, 'V'),
    (0x1EE6, 'M', u'ủ'),
    (0x1EE7, 'V'),
    (0x1EE8, 'M', u'ứ'),
    (0x1EE9, 'V'),
    (0x1EEA, 'M', u'ừ'),
    (0x1EEB, 'V'),
    (0x1EEC, 'M', u'ử'),
    (0x1EED, 'V'),
    (0x1EEE, 'M', u'ữ'),
    (0x1EEF, 'V'),
    (0x1EF0, 'M', u'ự'),
    (0x1EF1, 'V'),
    (0x1EF2, 'M', u'ỳ'),
    (0x1EF3, 'V'),
    (0x1EF4, 'M', u'ỵ'),
    (0x1EF5, 'V'),
    (0x1EF6, 'M', u'ỷ'),
    (0x1EF7, 'V'),
    (0x1EF8, 'M', u'ỹ'),
    (0x1EF9, 'V'),
    (0x1EFA, 'M', u'ỻ'),
    (0x1EFB, 'V'),
    (0x1EFC, 'M', u'ỽ'),
    (0x1EFD, 'V'),
    (0x1EFE, 'M', u'ỿ'),
    (0x1EFF, 'V'),
    (0x1F08, 'M', u'ἀ'),
    (0x1F09, 'M', u'ἁ'),
    (0x1F0A, 'M', u'ἂ'),
    (0x1F0B, 'M', u'ἃ'),
    (0x1F0C, 'M', u'ἄ'),
    (0x1F0D, 'M', u'ἅ'),
    (0x1F0E, 'M', u'ἆ'),
    (0x1F0F, 'M', u'ἇ'),
    (0x1F10, 'V'),
    (0x1F16, 'X'),
    (0x1F18, 'M', u'ἐ'),
    (0x1F19, 'M', u'ἑ'),
    (0x1F1A, 'M', u'ἒ'),
    (0x1F1B, 'M', u'ἓ'),
    (0x1F1C, 'M', u'ἔ'),
    (0x1F1D, 'M', u'ἕ'),
    (0x1F1E, 'X'),
    (0x1F20, 'V'),
    (0x1F28, 'M', u'ἠ'),
    (0x1F29, 'M', u'ἡ'),
    (0x1F2A, 'M', u'ἢ'),
    (0x1F2B, 'M', u'ἣ'),
    (0x1F2C, 'M', u'ἤ'),
    (0x1F2D, 'M', u'ἥ'),
    ]

def _seg_19():
    return [
    (0x1F2E, 'M', u'ἦ'),
    (0x1F2F, 'M', u'ἧ'),
    (0x1F30, 'V'),
    (0x1F38, 'M', u'ἰ'),
    (0x1F39, 'M', u'ἱ'),
    (0x1F3A, 'M', u'ἲ'),
    (0x1F3B, 'M', u'ἳ'),
    (0x1F3C, 'M', u'ἴ'),
    (0x1F3D, 'M', u'ἵ'),
    (0x1F3E, 'M', u'ἶ'),
    (0x1F3F, 'M', u'ἷ'),
    (0x1F40, 'V'),
    (0x1F46, 'X'),
    (0x1F48, 'M', u'ὀ'),
    (0x1F49, 'M', u'ὁ'),
    (0x1F4A, 'M', u'ὂ'),
    (0x1F4B, 'M', u'ὃ'),
    (0x1F4C, 'M', u'ὄ'),
    (0x1F4D, 'M', u'ὅ'),
    (0x1F4E, 'X'),
    (0x1F50, 'V'),
    (0x1F58, 'X'),
    (0x1F59, 'M', u'ὑ'),
    (0x1F5A, 'X'),
    (0x1F5B, 'M', u'ὓ'),
    (0x1F5C, 'X'),
    (0x1F5D, 'M', u'ὕ'),
    (0x1F5E, 'X'),
    (0x1F5F, 'M', u'ὗ'),
    (0x1F60, 'V'),
    (0x1F68, 'M', u'ὠ'),
    (0x1F69, 'M', u'ὡ'),
    (0x1F6A, 'M', u'ὢ'),
    (0x1F6B, 'M', u'ὣ'),
    (0x1F6C, 'M', u'ὤ'),
    (0x1F6D, 'M', u'ὥ'),
    (0x1F6E, 'M', u'ὦ'),
    (0x1F6F, 'M', u'ὧ'),
    (0x1F70, 'V'),
    (0x1F71, 'M', u'ά'),
    (0x1F72, 'V'),
    (0x1F73, 'M', u'έ'),
    (0x1F74, 'V'),
    (0x1F75, 'M', u'ή'),
    (0x1F76, 'V'),
    (0x1F77, 'M', u'ί'),
    (0x1F78, 'V'),
    (0x1F79, 'M', u'ό'),
    (0x1F7A, 'V'),
    (0x1F7B, 'M', u'ύ'),
    (0x1F7C, 'V'),
    (0x1F7D, 'M', u'ώ'),
    (0x1F7E, 'X'),
    (0x1F80, 'M', u'ἀι'),
    (0x1F81, 'M', u'ἁι'),
    (0x1F82, 'M', u'ἂι'),
    (0x1F83, 'M', u'ἃι'),
    (0x1F84, 'M', u'ἄι'),
    (0x1F85, 'M', u'ἅι'),
    (0x1F86, 'M', u'ἆι'),
    (0x1F87, 'M', u'ἇι'),
    (0x1F88, 'M', u'ἀι'),
    (0x1F89, 'M', u'ἁι'),
    (0x1F8A, 'M', u'ἂι'),
    (0x1F8B, 'M', u'ἃι'),
    (0x1F8C, 'M', u'ἄι'),
    (0x1F8D, 'M', u'ἅι'),
    (0x1F8E, 'M', u'ἆι'),
    (0x1F8F, 'M', u'ἇι'),
    (0x1F90, 'M', u'ἠι'),
    (0x1F91, 'M', u'ἡι'),
    (0x1F92, 'M', u'ἢι'),
    (0x1F93, 'M', u'ἣι'),
    (0x1F94, 'M', u'ἤι'),
    (0x1F95, 'M', u'ἥι'),
    (0x1F96, 'M', u'ἦι'),
    (0x1F97, 'M', u'ἧι'),
    (0x1F98, 'M', u'ἠι'),
    (0x1F99, 'M', u'ἡι'),
    (0x1F9A, 'M', u'ἢι'),
    (0x1F9B, 'M', u'ἣι'),
    (0x1F9C, 'M', u'ἤι'),
    (0x1F9D, 'M', u'ἥι'),
    (0x1F9E, 'M', u'ἦι'),
    (0x1F9F, 'M', u'ἧι'),
    (0x1FA0, 'M', u'ὠι'),
    (0x1FA1, 'M', u'ὡι'),
    (0x1FA2, 'M', u'ὢι'),
    (0x1FA3, 'M', u'ὣι'),
    (0x1FA4, 'M', u'ὤι'),
    (0x1FA5, 'M', u'ὥι'),
    (0x1FA6, 'M', u'ὦι'),
    (0x1FA7, 'M', u'ὧι'),
    (0x1FA8, 'M', u'ὠι'),
    (0x1FA9, 'M', u'ὡι'),
    (0x1FAA, 'M', u'ὢι'),
    (0x1FAB, 'M', u'ὣι'),
    (0x1FAC, 'M', u'ὤι'),
    (0x1FAD, 'M', u'ὥι'),
    (0x1FAE, 'M', u'ὦι'),
    ]

def _seg_20():
    return [
    (0x1FAF, 'M', u'ὧι'),
    (0x1FB0, 'V'),
    (0x1FB2, 'M', u'ὰι'),
    (0x1FB3, 'M', u'αι'),
    (0x1FB4, 'M', u'άι'),
    (0x1FB5, 'X'),
    (0x1FB6, 'V'),
    (0x1FB7, 'M', u'ᾶι'),
    (0x1FB8, 'M', u'ᾰ'),
    (0x1FB9, 'M', u'ᾱ'),
    (0x1FBA, 'M', u'ὰ'),
    (0x1FBB, 'M', u'ά'),
    (0x1FBC, 'M', u'αι'),
    (0x1FBD, '3', u' ̓'),
    (0x1FBE, 'M', u'ι'),
    (0x1FBF, '3', u' ̓'),
    (0x1FC0, '3', u' ͂'),
    (0x1FC1, '3', u' ̈͂'),
    (0x1FC2, 'M', u'ὴι'),
    (0x1FC3, 'M', u'ηι'),
    (0x1FC4, 'M', u'ήι'),
    (0x1FC5, 'X'),
    (0x1FC6, 'V'),
    (0x1FC7, 'M', u'ῆι'),
    (0x1FC8, 'M', u'ὲ'),
    (0x1FC9, 'M', u'έ'),
    (0x1FCA, 'M', u'ὴ'),
    (0x1FCB, 'M', u'ή'),
    (0x1FCC, 'M', u'ηι'),
    (0x1FCD, '3', u' ̓̀'),
    (0x1FCE, '3', u' ̓́'),
    (0x1FCF, '3', u' ̓͂'),
    (0x1FD0, 'V'),
    (0x1FD3, 'M', u'ΐ'),
    (0x1FD4, 'X'),
    (0x1FD6, 'V'),
    (0x1FD8, 'M', u'ῐ'),
    (0x1FD9, 'M', u'ῑ'),
    (0x1FDA, 'M', u'ὶ'),
    (0x1FDB, 'M', u'ί'),
    (0x1FDC, 'X'),
    (0x1FDD, '3', u' ̔̀'),
    (0x1FDE, '3', u' ̔́'),
    (0x1FDF, '3', u' ̔͂'),
    (0x1FE0, 'V'),
    (0x1FE3, 'M', u'ΰ'),
    (0x1FE4, 'V'),
    (0x1FE8, 'M', u'ῠ'),
    (0x1FE9, 'M', u'ῡ'),
    (0x1FEA, 'M', u'ὺ'),
    (0x1FEB, 'M', u'ύ'),
    (0x1FEC, 'M', u'ῥ'),
    (0x1FED, '3', u' ̈̀'),
    (0x1FEE, '3', u' ̈́'),
    (0x1FEF, '3', u'`'),
    (0x1FF0, 'X'),
    (0x1FF2, 'M', u'ὼι'),
    (0x1FF3, 'M', u'ωι'),
    (0x1FF4, 'M', u'ώι'),
    (0x1FF5, 'X'),
    (0x1FF6, 'V'),
    (0x1FF7, 'M', u'ῶι'),
    (0x1FF8, 'M', u'ὸ'),
    (0x1FF9, 'M', u'ό'),
    (0x1FFA, 'M', u'ὼ'),
    (0x1FFB, 'M', u'ώ'),
    (0x1FFC, 'M', u'ωι'),
    (0x1FFD, '3', u' ́'),
    (0x1FFE, '3', u' ̔'),
    (0x1FFF, 'X'),
    (0x2000, '3', u' '),
    (0x200B, 'I'),
    (0x200C, 'D', u''),
    (0x200E, 'X'),
    (0x2010, 'V'),
    (0x2011, 'M', u'‐'),
    (0x2012, 'V'),
    (0x2017, '3', u' ̳'),
    (0x2018, 'V'),
    (0x2024, 'X'),
    (0x2027, 'V'),
    (0x2028, 'X'),
    (0x202F, '3', u' '),
    (0x2030, 'V'),
    (0x2033, 'M', u'′′'),
    (0x2034, 'M', u'′′′'),
    (0x2035, 'V'),
    (0x2036, 'M', u'‵‵'),
    (0x2037, 'M', u'‵‵‵'),
    (0x2038, 'V'),
    (0x203C, '3', u'!!'),
    (0x203D, 'V'),
    (0x203E, '3', u' ̅'),
    (0x203F, 'V'),
    (0x2047, '3', u'??'),
    (0x2048, '3', u'?!'),
    (0x2049, '3', u'!?'),
    (0x204A, 'V'),
    (0x2057, 'M', u'′′′′'),
    (0x2058, 'V'),
    ]

def _seg_21():
    return [
    (0x205F, '3', u' '),
    (0x2060, 'I'),
    (0x2061, 'X'),
    (0x2064, 'I'),
    (0x2065, 'X'),
    (0x2070, 'M', u'0'),
    (0x2071, 'M', u'i'),
    (0x2072, 'X'),
    (0x2074, 'M', u'4'),
    (0x2075, 'M', u'5'),
    (0x2076, 'M', u'6'),
    (0x2077, 'M', u'7'),
    (0x2078, 'M', u'8'),
    (0x2079, 'M', u'9'),
    (0x207A, '3', u'+'),
    (0x207B, 'M', u'−'),
    (0x207C, '3', u'='),
    (0x207D, '3', u'('),
    (0x207E, '3', u')'),
    (0x207F, 'M', u'n'),
    (0x2080, 'M', u'0'),
    (0x2081, 'M', u'1'),
    (0x2082, 'M', u'2'),
    (0x2083, 'M', u'3'),
    (0x2084, 'M', u'4'),
    (0x2085, 'M', u'5'),
    (0x2086, 'M', u'6'),
    (0x2087, 'M', u'7'),
    (0x2088, 'M', u'8'),
    (0x2089, 'M', u'9'),
    (0x208A, '3', u'+'),
    (0x208B, 'M', u'−'),
    (0x208C, '3', u'='),
    (0x208D, '3', u'('),
    (0x208E, '3', u')'),
    (0x208F, 'X'),
    (0x2090, 'M', u'a'),
    (0x2091, 'M', u'e'),
    (0x2092, 'M', u'o'),
    (0x2093, 'M', u'x'),
    (0x2094, 'M', u'ə'),
    (0x2095, 'M', u'h'),
    (0x2096, 'M', u'k'),
    (0x2097, 'M', u'l'),
    (0x2098, 'M', u'm'),
    (0x2099, 'M', u'n'),
    (0x209A, 'M', u'p'),
    (0x209B, 'M', u's'),
    (0x209C, 'M', u't'),
    (0x209D, 'X'),
    (0x20A0, 'V'),
    (0x20A8, 'M', u'rs'),
    (0x20A9, 'V'),
    (0x20BB, 'X'),
    (0x20D0, 'V'),
    (0x20F1, 'X'),
    (0x2100, '3', u'a/c'),
    (0x2101, '3', u'a/s'),
    (0x2102, 'M', u'c'),
    (0x2103, 'M', u'°c'),
    (0x2104, 'V'),
    (0x2105, '3', u'c/o'),
    (0x2106, '3', u'c/u'),
    (0x2107, 'M', u'ɛ'),
    (0x2108, 'V'),
    (0x2109, 'M', u'°f'),
    (0x210A, 'M', u'g'),
    (0x210B, 'M', u'h'),
    (0x210F, 'M', u'ħ'),
    (0x2110, 'M', u'i'),
    (0x2112, 'M', u'l'),
    (0x2114, 'V'),
    (0x2115, 'M', u'n'),
    (0x2116, 'M', u'no'),
    (0x2117, 'V'),
    (0x2119, 'M', u'p'),
    (0x211A, 'M', u'q'),
    (0x211B, 'M', u'r'),
    (0x211E, 'V'),
    (0x2120, 'M', u'sm'),
    (0x2121, 'M', u'tel'),
    (0x2122, 'M', u'tm'),
    (0x2123, 'V'),
    (0x2124, 'M', u'z'),
    (0x2125, 'V'),
    (0x2126, 'M', u'ω'),
    (0x2127, 'V'),
    (0x2128, 'M', u'z'),
    (0x2129, 'V'),
    (0x212A, 'M', u'k'),
    (0x212B, 'M', u'å'),
    (0x212C, 'M', u'b'),
    (0x212D, 'M', u'c'),
    (0x212E, 'V'),
    (0x212F, 'M', u'e'),
    (0x2131, 'M', u'f'),
    (0x2132, 'X'),
    (0x2133, 'M', u'm'),
    (0x2134, 'M', u'o'),
    (0x2135, 'M', u'א'),
    ]

def _seg_22():
    return [
    (0x2136, 'M', u'ב'),
    (0x2137, 'M', u'ג'),
    (0x2138, 'M', u'ד'),
    (0x2139, 'M', u'i'),
    (0x213A, 'V'),
    (0x213B, 'M', u'fax'),
    (0x213C, 'M', u'π'),
    (0x213D, 'M', u'γ'),
    (0x213F, 'M', u'π'),
    (0x2140, 'M', u'∑'),
    (0x2141, 'V'),
    (0x2145, 'M', u'd'),
    (0x2147, 'M', u'e'),
    (0x2148, 'M', u'i'),
    (0x2149, 'M', u'j'),
    (0x214A, 'V'),
    (0x2150, 'M', u'1⁄7'),
    (0x2151, 'M', u'1⁄9'),
    (0x2152, 'M', u'1⁄10'),
    (0x2153, 'M', u'1⁄3'),
    (0x2154, 'M', u'2⁄3'),
    (0x2155, 'M', u'1⁄5'),
    (0x2156, 'M', u'2⁄5'),
    (0x2157, 'M', u'3⁄5'),
    (0x2158, 'M', u'4⁄5'),
    (0x2159, 'M', u'1⁄6'),
    (0x215A, 'M', u'5⁄6'),
    (0x215B, 'M', u'1⁄8'),
    (0x215C, 'M', u'3⁄8'),
    (0x215D, 'M', u'5⁄8'),
    (0x215E, 'M', u'7⁄8'),
    (0x215F, 'M', u'1⁄'),
    (0x2160, 'M', u'i'),
    (0x2161, 'M', u'ii'),
    (0x2162, 'M', u'iii'),
    (0x2163, 'M', u'iv'),
    (0x2164, 'M', u'v'),
    (0x2165, 'M', u'vi'),
    (0x2166, 'M', u'vii'),
    (0x2167, 'M', u'viii'),
    (0x2168, 'M', u'ix'),
    (0x2169, 'M', u'x'),
    (0x216A, 'M', u'xi'),
    (0x216B, 'M', u'xii'),
    (0x216C, 'M', u'l'),
    (0x216D, 'M', u'c'),
    (0x216E, 'M', u'd'),
    (0x216F, 'M', u'm'),
    (0x2170, 'M', u'i'),
    (0x2171, 'M', u'ii'),
    (0x2172, 'M', u'iii'),
    (0x2173, 'M', u'iv'),
    (0x2174, 'M', u'v'),
    (0x2175, 'M', u'vi'),
    (0x2176, 'M', u'vii'),
    (0x2177, 'M', u'viii'),
    (0x2178, 'M', u'ix'),
    (0x2179, 'M', u'x'),
    (0x217A, 'M', u'xi'),
    (0x217B, 'M', u'xii'),
    (0x217C, 'M', u'l'),
    (0x217D, 'M', u'c'),
    (0x217E, 'M', u'd'),
    (0x217F, 'M', u'm'),
    (0x2180, 'V'),
    (0x2183, 'X'),
    (0x2184, 'V'),
    (0x2189, 'M', u'0⁄3'),
    (0x218A, 'X'),
    (0x2190, 'V'),
    (0x222C, 'M', u'∫∫'),
    (0x222D, 'M', u'∫∫∫'),
    (0x222E, 'V'),
    (0x222F, 'M', u'∮∮'),
    (0x2230, 'M', u'∮∮∮'),
    (0x2231, 'V'),
    (0x2260, '3'),
    (0x2261, 'V'),
    (0x226E, '3'),
    (0x2270, 'V'),
    (0x2329, 'M', u'〈'),
    (0x232A, 'M', u'〉'),
    (0x232B, 'V'),
    (0x23F4, 'X'),
    (0x2400, 'V'),
    (0x2427, 'X'),
    (0x2440, 'V'),
    (0x244B, 'X'),
    (0x2460, 'M', u'1'),
    (0x2461, 'M', u'2'),
    (0x2462, 'M', u'3'),
    (0x2463, 'M', u'4'),
    (0x2464, 'M', u'5'),
    (0x2465, 'M', u'6'),
    (0x2466, 'M', u'7'),
    (0x2467, 'M', u'8'),
    (0x2468, 'M', u'9'),
    (0x2469, 'M', u'10'),
    (0x246A, 'M', u'11'),
    (0x246B, 'M', u'12'),
    ]

def _seg_23():
    return [
    (0x246C, 'M', u'13'),
    (0x246D, 'M', u'14'),
    (0x246E, 'M', u'15'),
    (0x246F, 'M', u'16'),
    (0x2470, 'M', u'17'),
    (0x2471, 'M', u'18'),
    (0x2472, 'M', u'19'),
    (0x2473, 'M', u'20'),
    (0x2474, '3', u'(1)'),
    (0x2475, '3', u'(2)'),
    (0x2476, '3', u'(3)'),
    (0x2477, '3', u'(4)'),
    (0x2478, '3', u'(5)'),
    (0x2479, '3', u'(6)'),
    (0x247A, '3', u'(7)'),
    (0x247B, '3', u'(8)'),
    (0x247C, '3', u'(9)'),
    (0x247D, '3', u'(10)'),
    (0x247E, '3', u'(11)'),
    (0x247F, '3', u'(12)'),
    (0x2480, '3', u'(13)'),
    (0x2481, '3', u'(14)'),
    (0x2482, '3', u'(15)'),
    (0x2483, '3', u'(16)'),
    (0x2484, '3', u'(17)'),
    (0x2485, '3', u'(18)'),
    (0x2486, '3', u'(19)'),
    (0x2487, '3', u'(20)'),
    (0x2488, 'X'),
    (0x249C, '3', u'(a)'),
    (0x249D, '3', u'(b)'),
    (0x249E, '3', u'(c)'),
    (0x249F, '3', u'(d)'),
    (0x24A0, '3', u'(e)'),
    (0x24A1, '3', u'(f)'),
    (0x24A2, '3', u'(g)'),
    (0x24A3, '3', u'(h)'),
    (0x24A4, '3', u'(i)'),
    (0x24A5, '3', u'(j)'),
    (0x24A6, '3', u'(k)'),
    (0x24A7, '3', u'(l)'),
    (0x24A8, '3', u'(m)'),
    (0x24A9, '3', u'(n)'),
    (0x24AA, '3', u'(o)'),
    (0x24AB, '3', u'(p)'),
    (0x24AC, '3', u'(q)'),
    (0x24AD, '3', u'(r)'),
    (0x24AE, '3', u'(s)'),
    (0x24AF, '3', u'(t)'),
    (0x24B0, '3', u'(u)'),
    (0x24B1, '3', u'(v)'),
    (0x24B2, '3', u'(w)'),
    (0x24B3, '3', u'(x)'),
    (0x24B4, '3', u'(y)'),
    (0x24B5, '3', u'(z)'),
    (0x24B6, 'M', u'a'),
    (0x24B7, 'M', u'b'),
    (0x24B8, 'M', u'c'),
    (0x24B9, 'M', u'd'),
    (0x24BA, 'M', u'e'),
    (0x24BB, 'M', u'f'),
    (0x24BC, 'M', u'g'),
    (0x24BD, 'M', u'h'),
    (0x24BE, 'M', u'i'),
    (0x24BF, 'M', u'j'),
    (0x24C0, 'M', u'k'),
    (0x24C1, 'M', u'l'),
    (0x24C2, 'M', u'm'),
    (0x24C3, 'M', u'n'),
    (0x24C4, 'M', u'o'),
    (0x24C5, 'M', u'p'),
    (0x24C6, 'M', u'q'),
    (0x24C7, 'M', u'r'),
    (0x24C8, 'M', u's'),
    (0x24C9, 'M', u't'),
    (0x24CA, 'M', u'u'),
    (0x24CB, 'M', u'v'),
    (0x24CC, 'M', u'w'),
    (0x24CD, 'M', u'x'),
    (0x24CE, 'M', u'y'),
    (0x24CF, 'M', u'z'),
    (0x24D0, 'M', u'a'),
    (0x24D1, 'M', u'b'),
    (0x24D2, 'M', u'c'),
    (0x24D3, 'M', u'd'),
    (0x24D4, 'M', u'e'),
    (0x24D5, 'M', u'f'),
    (0x24D6, 'M', u'g'),
    (0x24D7, 'M', u'h'),
    (0x24D8, 'M', u'i'),
    (0x24D9, 'M', u'j'),
    (0x24DA, 'M', u'k'),
    (0x24DB, 'M', u'l'),
    (0x24DC, 'M', u'm'),
    (0x24DD, 'M', u'n'),
    (0x24DE, 'M', u'o'),
    (0x24DF, 'M', u'p'),
    (0x24E0, 'M', u'q'),
    (0x24E1, 'M', u'r'),
    (0x24E2, 'M', u's'),
    ]

def _seg_24():
    return [
    (0x24E3, 'M', u't'),
    (0x24E4, 'M', u'u'),
    (0x24E5, 'M', u'v'),
    (0x24E6, 'M', u'w'),
    (0x24E7, 'M', u'x'),
    (0x24E8, 'M', u'y'),
    (0x24E9, 'M', u'z'),
    (0x24EA, 'M', u'0'),
    (0x24EB, 'V'),
    (0x2700, 'X'),
    (0x2701, 'V'),
    (0x2A0C, 'M', u'∫∫∫∫'),
    (0x2A0D, 'V'),
    (0x2A74, '3', u'::='),
    (0x2A75, '3', u'=='),
    (0x2A76, '3', u'==='),
    (0x2A77, 'V'),
    (0x2ADC, 'M', u'⫝̸'),
    (0x2ADD, 'V'),
    (0x2B4D, 'X'),
    (0x2B50, 'V'),
    (0x2B5A, 'X'),
    (0x2C00, 'M', u'ⰰ'),
    (0x2C01, 'M', u'ⰱ'),
    (0x2C02, 'M', u'ⰲ'),
    (0x2C03, 'M', u'ⰳ'),
    (0x2C04, 'M', u'ⰴ'),
    (0x2C05, 'M', u'ⰵ'),
    (0x2C06, 'M', u'ⰶ'),
    (0x2C07, 'M', u'ⰷ'),
    (0x2C08, 'M', u'ⰸ'),
    (0x2C09, 'M', u'ⰹ'),
    (0x2C0A, 'M', u'ⰺ'),
    (0x2C0B, 'M', u'ⰻ'),
    (0x2C0C, 'M', u'ⰼ'),
    (0x2C0D, 'M', u'ⰽ'),
    (0x2C0E, 'M', u'ⰾ'),
    (0x2C0F, 'M', u'ⰿ'),
    (0x2C10, 'M', u'ⱀ'),
    (0x2C11, 'M', u'ⱁ'),
    (0x2C12, 'M', u'ⱂ'),
    (0x2C13, 'M', u'ⱃ'),
    (0x2C14, 'M', u'ⱄ'),
    (0x2C15, 'M', u'ⱅ'),
    (0x2C16, 'M', u'ⱆ'),
    (0x2C17, 'M', u'ⱇ'),
    (0x2C18, 'M', u'ⱈ'),
    (0x2C19, 'M', u'ⱉ'),
    (0x2C1A, 'M', u'ⱊ'),
    (0x2C1B, 'M', u'ⱋ'),
    (0x2C1C, 'M', u'ⱌ'),
    (0x2C1D, 'M', u'ⱍ'),
    (0x2C1E, 'M', u'ⱎ'),
    (0x2C1F, 'M', u'ⱏ'),
    (0x2C20, 'M', u'ⱐ'),
    (0x2C21, 'M', u'ⱑ'),
    (0x2C22, 'M', u'ⱒ'),
    (0x2C23, 'M', u'ⱓ'),
    (0x2C24, 'M', u'ⱔ'),
    (0x2C25, 'M', u'ⱕ'),
    (0x2C26, 'M', u'ⱖ'),
    (0x2C27, 'M', u'ⱗ'),
    (0x2C28, 'M', u'ⱘ'),
    (0x2C29, 'M', u'ⱙ'),
    (0x2C2A, 'M', u'ⱚ'),
    (0x2C2B, 'M', u'ⱛ'),
    (0x2C2C, 'M', u'ⱜ'),
    (0x2C2D, 'M', u'ⱝ'),
    (0x2C2E, 'M', u'ⱞ'),
    (0x2C2F, 'X'),
    (0x2C30, 'V'),
    (0x2C5F, 'X'),
    (0x2C60, 'M', u'ⱡ'),
    (0x2C61, 'V'),
    (0x2C62, 'M', u'ɫ'),
    (0x2C63, 'M', u'ᵽ'),
    (0x2C64, 'M', u'ɽ'),
    (0x2C65, 'V'),
    (0x2C67, 'M', u'ⱨ'),
    (0x2C68, 'V'),
    (0x2C69, 'M', u'ⱪ'),
    (0x2C6A, 'V'),
    (0x2C6B, 'M', u'ⱬ'),
    (0x2C6C, 'V'),
    (0x2C6D, 'M', u'ɑ'),
    (0x2C6E, 'M', u'ɱ'),
    (0x2C6F, 'M', u'ɐ'),
    (0x2C70, 'M', u'ɒ'),
    (0x2C71, 'V'),
    (0x2C72, 'M', u'ⱳ'),
    (0x2C73, 'V'),
    (0x2C75, 'M', u'ⱶ'),
    (0x2C76, 'V'),
    (0x2C7C, 'M', u'j'),
    (0x2C7D, 'M', u'v'),
    (0x2C7E, 'M', u'ȿ'),
    (0x2C7F, 'M', u'ɀ'),
    (0x2C80, 'M', u'ⲁ'),
    (0x2C81, 'V'),
    (0x2C82, 'M', u'ⲃ'),
    ]

def _seg_25():
    return [
    (0x2C83, 'V'),
    (0x2C84, 'M', u'ⲅ'),
    (0x2C85, 'V'),
    (0x2C86, 'M', u'ⲇ'),
    (0x2C87, 'V'),
    (0x2C88, 'M', u'ⲉ'),
    (0x2C89, 'V'),
    (0x2C8A, 'M', u'ⲋ'),
    (0x2C8B, 'V'),
    (0x2C8C, 'M', u'ⲍ'),
    (0x2C8D, 'V'),
    (0x2C8E, 'M', u'ⲏ'),
    (0x2C8F, 'V'),
    (0x2C90, 'M', u'ⲑ'),
    (0x2C91, 'V'),
    (0x2C92, 'M', u'ⲓ'),
    (0x2C93, 'V'),
    (0x2C94, 'M', u'ⲕ'),
    (0x2C95, 'V'),
    (0x2C96, 'M', u'ⲗ'),
    (0x2C97, 'V'),
    (0x2C98, 'M', u'ⲙ'),
    (0x2C99, 'V'),
    (0x2C9A, 'M', u'ⲛ'),
    (0x2C9B, 'V'),
    (0x2C9C, 'M', u'ⲝ'),
    (0x2C9D, 'V'),
    (0x2C9E, 'M', u'ⲟ'),
    (0x2C9F, 'V'),
    (0x2CA0, 'M', u'ⲡ'),
    (0x2CA1, 'V'),
    (0x2CA2, 'M', u'ⲣ'),
    (0x2CA3, 'V'),
    (0x2CA4, 'M', u'ⲥ'),
    (0x2CA5, 'V'),
    (0x2CA6, 'M', u'ⲧ'),
    (0x2CA7, 'V'),
    (0x2CA8, 'M', u'ⲩ'),
    (0x2CA9, 'V'),
    (0x2CAA, 'M', u'ⲫ'),
    (0x2CAB, 'V'),
    (0x2CAC, 'M', u'ⲭ'),
    (0x2CAD, 'V'),
    (0x2CAE, 'M', u'ⲯ'),
    (0x2CAF, 'V'),
    (0x2CB0, 'M', u'ⲱ'),
    (0x2CB1, 'V'),
    (0x2CB2, 'M', u'ⲳ'),
    (0x2CB3, 'V'),
    (0x2CB4, 'M', u'ⲵ'),
    (0x2CB5, 'V'),
    (0x2CB6, 'M', u'ⲷ'),
    (0x2CB7, 'V'),
    (0x2CB8, 'M', u'ⲹ'),
    (0x2CB9, 'V'),
    (0x2CBA, 'M', u'ⲻ'),
    (0x2CBB, 'V'),
    (0x2CBC, 'M', u'ⲽ'),
    (0x2CBD, 'V'),
    (0x2CBE, 'M', u'ⲿ'),
    (0x2CBF, 'V'),
    (0x2CC0, 'M', u'ⳁ'),
    (0x2CC1, 'V'),
    (0x2CC2, 'M', u'ⳃ'),
    (0x2CC3, 'V'),
    (0x2CC4, 'M', u'ⳅ'),
    (0x2CC5, 'V'),
    (0x2CC6, 'M', u'ⳇ'),
    (0x2CC7, 'V'),
    (0x2CC8, 'M', u'ⳉ'),
    (0x2CC9, 'V'),
    (0x2CCA, 'M', u'ⳋ'),
    (0x2CCB, 'V'),
    (0x2CCC, 'M', u'ⳍ'),
    (0x2CCD, 'V'),
    (0x2CCE, 'M', u'ⳏ'),
    (0x2CCF, 'V'),
    (0x2CD0, 'M', u'ⳑ'),
    (0x2CD1, 'V'),
    (0x2CD2, 'M', u'ⳓ'),
    (0x2CD3, 'V'),
    (0x2CD4, 'M', u'ⳕ'),
    (0x2CD5, 'V'),
    (0x2CD6, 'M', u'ⳗ'),
    (0x2CD7, 'V'),
    (0x2CD8, 'M', u'ⳙ'),
    (0x2CD9, 'V'),
    (0x2CDA, 'M', u'ⳛ'),
    (0x2CDB, 'V'),
    (0x2CDC, 'M', u'ⳝ'),
    (0x2CDD, 'V'),
    (0x2CDE, 'M', u'ⳟ'),
    (0x2CDF, 'V'),
    (0x2CE0, 'M', u'ⳡ'),
    (0x2CE1, 'V'),
    (0x2CE2, 'M', u'ⳣ'),
    (0x2CE3, 'V'),
    (0x2CEB, 'M', u'ⳬ'),
    (0x2CEC, 'V'),
    (0x2CED, 'M', u'ⳮ'),
    ]

def _seg_26():
    return [
    (0x2CEE, 'V'),
    (0x2CF2, 'M', u'ⳳ'),
    (0x2CF3, 'V'),
    (0x2CF4, 'X'),
    (0x2CF9, 'V'),
    (0x2D26, 'X'),
    (0x2D27, 'V'),
    (0x2D28, 'X'),
    (0x2D2D, 'V'),
    (0x2D2E, 'X'),
    (0x2D30, 'V'),
    (0x2D68, 'X'),
    (0x2D6F, 'M', u'ⵡ'),
    (0x2D70, 'V'),
    (0x2D71, 'X'),
    (0x2D7F, 'V'),
    (0x2D97, 'X'),
    (0x2DA0, 'V'),
    (0x2DA7, 'X'),
    (0x2DA8, 'V'),
    (0x2DAF, 'X'),
    (0x2DB0, 'V'),
    (0x2DB7, 'X'),
    (0x2DB8, 'V'),
    (0x2DBF, 'X'),
    (0x2DC0, 'V'),
    (0x2DC7, 'X'),
    (0x2DC8, 'V'),
    (0x2DCF, 'X'),
    (0x2DD0, 'V'),
    (0x2DD7, 'X'),
    (0x2DD8, 'V'),
    (0x2DDF, 'X'),
    (0x2DE0, 'V'),
    (0x2E3C, 'X'),
    (0x2E80, 'V'),
    (0x2E9A, 'X'),
    (0x2E9B, 'V'),
    (0x2E9F, 'M', u'母'),
    (0x2EA0, 'V'),
    (0x2EF3, 'M', u'龟'),
    (0x2EF4, 'X'),
    (0x2F00, 'M', u'一'),
    (0x2F01, 'M', u'丨'),
    (0x2F02, 'M', u'丶'),
    (0x2F03, 'M', u'丿'),
    (0x2F04, 'M', u'乙'),
    (0x2F05, 'M', u'亅'),
    (0x2F06, 'M', u'二'),
    (0x2F07, 'M', u'亠'),
    (0x2F08, 'M', u'人'),
    (0x2F09, 'M', u'儿'),
    (0x2F0A, 'M', u'入'),
    (0x2F0B, 'M', u'八'),
    (0x2F0C, 'M', u'冂'),
    (0x2F0D, 'M', u'冖'),
    (0x2F0E, 'M', u'冫'),
    (0x2F0F, 'M', u'几'),
    (0x2F10, 'M', u'凵'),
    (0x2F11, 'M', u'刀'),
    (0x2F12, 'M', u'力'),
    (0x2F13, 'M', u'勹'),
    (0x2F14, 'M', u'匕'),
    (0x2F15, 'M', u'匚'),
    (0x2F16, 'M', u'匸'),
    (0x2F17, 'M', u'十'),
    (0x2F18, 'M', u'卜'),
    (0x2F19, 'M', u'卩'),
    (0x2F1A, 'M', u'厂'),
    (0x2F1B, 'M', u'厶'),
    (0x2F1C, 'M', u'又'),
    (0x2F1D, 'M', u'口'),
    (0x2F1E, 'M', u'囗'),
    (0x2F1F, 'M', u'土'),
    (0x2F20, 'M', u'士'),
    (0x2F21, 'M', u'夂'),
    (0x2F22, 'M', u'夊'),
    (0x2F23, 'M', u'夕'),
    (0x2F24, 'M', u'大'),
    (0x2F25, 'M', u'女'),
    (0x2F26, 'M', u'子'),
    (0x2F27, 'M', u'宀'),
    (0x2F28, 'M', u'寸'),
    (0x2F29, 'M', u'小'),
    (0x2F2A, 'M', u'尢'),
    (0x2F2B, 'M', u'尸'),
    (0x2F2C, 'M', u'屮'),
    (0x2F2D, 'M', u'山'),
    (0x2F2E, 'M', u'巛'),
    (0x2F2F, 'M', u'工'),
    (0x2F30, 'M', u'己'),
    (0x2F31, 'M', u'巾'),
    (0x2F32, 'M', u'干'),
    (0x2F33, 'M', u'幺'),
    (0x2F34, 'M', u'广'),
    (0x2F35, 'M', u'廴'),
    (0x2F36, 'M', u'廾'),
    (0x2F37, 'M', u'弋'),
    (0x2F38, 'M', u'弓'),
    (0x2F39, 'M', u'彐'),
    ]

def _seg_27():
    return [
    (0x2F3A, 'M', u'彡'),
    (0x2F3B, 'M', u'彳'),
    (0x2F3C, 'M', u'心'),
    (0x2F3D, 'M', u'戈'),
    (0x2F3E, 'M', u'戶'),
    (0x2F3F, 'M', u'手'),
    (0x2F40, 'M', u'支'),
    (0x2F41, 'M', u'攴'),
    (0x2F42, 'M', u'文'),
    (0x2F43, 'M', u'斗'),
    (0x2F44, 'M', u'斤'),
    (0x2F45, 'M', u'方'),
    (0x2F46, 'M', u'无'),
    (0x2F47, 'M', u'日'),
    (0x2F48, 'M', u'曰'),
    (0x2F49, 'M', u'月'),
    (0x2F4A, 'M', u'木'),
    (0x2F4B, 'M', u'欠'),
    (0x2F4C, 'M', u'止'),
    (0x2F4D, 'M', u'歹'),
    (0x2F4E, 'M', u'殳'),
    (0x2F4F, 'M', u'毋'),
    (0x2F50, 'M', u'比'),
    (0x2F51, 'M', u'毛'),
    (0x2F52, 'M', u'氏'),
    (0x2F53, 'M', u'气'),
    (0x2F54, 'M', u'水'),
    (0x2F55, 'M', u'火'),
    (0x2F56, 'M', u'爪'),
    (0x2F57, 'M', u'父'),
    (0x2F58, 'M', u'爻'),
    (0x2F59, 'M', u'爿'),
    (0x2F5A, 'M', u'片'),
    (0x2F5B, 'M', u'牙'),
    (0x2F5C, 'M', u'牛'),
    (0x2F5D, 'M', u'犬'),
    (0x2F5E, 'M', u'玄'),
    (0x2F5F, 'M', u'玉'),
    (0x2F60, 'M', u'瓜'),
    (0x2F61, 'M', u'瓦'),
    (0x2F62, 'M', u'甘'),
    (0x2F63, 'M', u'生'),
    (0x2F64, 'M', u'用'),
    (0x2F65, 'M', u'田'),
    (0x2F66, 'M', u'疋'),
    (0x2F67, 'M', u'疒'),
    (0x2F68, 'M', u'癶'),
    (0x2F69, 'M', u'白'),
    (0x2F6A, 'M', u'皮'),
    (0x2F6B, 'M', u'皿'),
    (0x2F6C, 'M', u'目'),
    (0x2F6D, 'M', u'矛'),
    (0x2F6E, 'M', u'矢'),
    (0x2F6F, 'M', u'石'),
    (0x2F70, 'M', u'示'),
    (0x2F71, 'M', u'禸'),
    (0x2F72, 'M', u'禾'),
    (0x2F73, 'M', u'穴'),
    (0x2F74, 'M', u'立'),
    (0x2F75, 'M', u'竹'),
    (0x2F76, 'M', u'米'),
    (0x2F77, 'M', u'糸'),
    (0x2F78, 'M', u'缶'),
    (0x2F79, 'M', u'网'),
    (0x2F7A, 'M', u'羊'),
    (0x2F7B, 'M', u'羽'),
    (0x2F7C, 'M', u'老'),
    (0x2F7D, 'M', u'而'),
    (0x2F7E, 'M', u'耒'),
    (0x2F7F, 'M', u'耳'),
    (0x2F80, 'M', u'聿'),
    (0x2F81, 'M', u'肉'),
    (0x2F82, 'M', u'臣'),
    (0x2F83, 'M', u'自'),
    (0x2F84, 'M', u'至'),
    (0x2F85, 'M', u'臼'),
    (0x2F86, 'M', u'舌'),
    (0x2F87, 'M', u'舛'),
    (0x2F88, 'M', u'舟'),
    (0x2F89, 'M', u'艮'),
    (0x2F8A, 'M', u'色'),
    (0x2F8B, 'M', u'艸'),
    (0x2F8C, 'M', u'虍'),
    (0x2F8D, 'M', u'虫'),
    (0x2F8E, 'M', u'血'),
    (0x2F8F, 'M', u'行'),
    (0x2F90, 'M', u'衣'),
    (0x2F91, 'M', u'襾'),
    (0x2F92, 'M', u'見'),
    (0x2F93, 'M', u'角'),
    (0x2F94, 'M', u'言'),
    (0x2F95, 'M', u'谷'),
    (0x2F96, 'M', u'豆'),
    (0x2F97, 'M', u'豕'),
    (0x2F98, 'M', u'豸'),
    (0x2F99, 'M', u'貝'),
    (0x2F9A, 'M', u'赤'),
    (0x2F9B, 'M', u'走'),
    (0x2F9C, 'M', u'足'),
    (0x2F9D, 'M', u'身'),
    ]

def _seg_28():
    return [
    (0x2F9E, 'M', u'車'),
    (0x2F9F, 'M', u'辛'),
    (0x2FA0, 'M', u'辰'),
    (0x2FA1, 'M', u'辵'),
    (0x2FA2, 'M', u'邑'),
    (0x2FA3, 'M', u'酉'),
    (0x2FA4, 'M', u'釆'),
    (0x2FA5, 'M', u'里'),
    (0x2FA6, 'M', u'金'),
    (0x2FA7, 'M', u'長'),
    (0x2FA8, 'M', u'門'),
    (0x2FA9, 'M', u'阜'),
    (0x2FAA, 'M', u'隶'),
    (0x2FAB, 'M', u'隹'),
    (0x2FAC, 'M', u'雨'),
    (0x2FAD, 'M', u'靑'),
    (0x2FAE, 'M', u'非'),
    (0x2FAF, 'M', u'面'),
    (0x2FB0, 'M', u'革'),
    (0x2FB1, 'M', u'韋'),
    (0x2FB2, 'M', u'韭'),
    (0x2FB3, 'M', u'音'),
    (0x2FB4, 'M', u'頁'),
    (0x2FB5, 'M', u'風'),
    (0x2FB6, 'M', u'飛'),
    (0x2FB7, 'M', u'食'),
    (0x2FB8, 'M', u'首'),
    (0x2FB9, 'M', u'香'),
    (0x2FBA, 'M', u'馬'),
    (0x2FBB, 'M', u'骨'),
    (0x2FBC, 'M', u'高'),
    (0x2FBD, 'M', u'髟'),
    (0x2FBE, 'M', u'鬥'),
    (0x2FBF, 'M', u'鬯'),
    (0x2FC0, 'M', u'鬲'),
    (0x2FC1, 'M', u'鬼'),
    (0x2FC2, 'M', u'魚'),
    (0x2FC3, 'M', u'鳥'),
    (0x2FC4, 'M', u'鹵'),
    (0x2FC5, 'M', u'鹿'),
    (0x2FC6, 'M', u'麥'),
    (0x2FC7, 'M', u'麻'),
    (0x2FC8, 'M', u'黃'),
    (0x2FC9, 'M', u'黍'),
    (0x2FCA, 'M', u'黑'),
    (0x2FCB, 'M', u'黹'),
    (0x2FCC, 'M', u'黽'),
    (0x2FCD, 'M', u'鼎'),
    (0x2FCE, 'M', u'鼓'),
    (0x2FCF, 'M', u'鼠'),
    (0x2FD0, 'M', u'鼻'),
    (0x2FD1, 'M', u'齊'),
    (0x2FD2, 'M', u'齒'),
    (0x2FD3, 'M', u'龍'),
    (0x2FD4, 'M', u'龜'),
    (0x2FD5, 'M', u'龠'),
    (0x2FD6, 'X'),
    (0x3000, '3', u' '),
    (0x3001, 'V'),
    (0x3002, 'M', u'.'),
    (0x3003, 'V'),
    (0x3036, 'M', u'〒'),
    (0x3037, 'V'),
    (0x3038, 'M', u'十'),
    (0x3039, 'M', u'卄'),
    (0x303A, 'M', u'卅'),
    (0x303B, 'V'),
    (0x3040, 'X'),
    (0x3041, 'V'),
    (0x3097, 'X'),
    (0x3099, 'V'),
    (0x309B, '3', u' ゙'),
    (0x309C, '3', u' ゚'),
    (0x309D, 'V'),
    (0x309F, 'M', u'より'),
    (0x30A0, 'V'),
    (0x30FF, 'M', u'コト'),
    (0x3100, 'X'),
    (0x3105, 'V'),
    (0x312E, 'X'),
    (0x3131, 'M', u'ᄀ'),
    (0x3132, 'M', u'ᄁ'),
    (0x3133, 'M', u'ᆪ'),
    (0x3134, 'M', u'ᄂ'),
    (0x3135, 'M', u'ᆬ'),
    (0x3136, 'M', u'ᆭ'),
    (0x3137, 'M', u'ᄃ'),
    (0x3138, 'M', u'ᄄ'),
    (0x3139, 'M', u'ᄅ'),
    (0x313A, 'M', u'ᆰ'),
    (0x313B, 'M', u'ᆱ'),
    (0x313C, 'M', u'ᆲ'),
    (0x313D, 'M', u'ᆳ'),
    (0x313E, 'M', u'ᆴ'),
    (0x313F, 'M', u'ᆵ'),
    (0x3140, 'M', u'ᄚ'),
    (0x3141, 'M', u'ᄆ'),
    (0x3142, 'M', u'ᄇ'),
    (0x3143, 'M', u'ᄈ'),
    (0x3144, 'M', u'ᄡ'),
    ]

def _seg_29():
    return [
    (0x3145, 'M', u'ᄉ'),
    (0x3146, 'M', u'ᄊ'),
    (0x3147, 'M', u'ᄋ'),
    (0x3148, 'M', u'ᄌ'),
    (0x3149, 'M', u'ᄍ'),
    (0x314A, 'M', u'ᄎ'),
    (0x314B, 'M', u'ᄏ'),
    (0x314C, 'M', u'ᄐ'),
    (0x314D, 'M', u'ᄑ'),
    (0x314E, 'M', u'ᄒ'),
    (0x314F, 'M', u'ᅡ'),
    (0x3150, 'M', u'ᅢ'),
    (0x3151, 'M', u'ᅣ'),
    (0x3152, 'M', u'ᅤ'),
    (0x3153, 'M', u'ᅥ'),
    (0x3154, 'M', u'ᅦ'),
    (0x3155, 'M', u'ᅧ'),
    (0x3156, 'M', u'ᅨ'),
    (0x3157, 'M', u'ᅩ'),
    (0x3158, 'M', u'ᅪ'),
    (0x3159, 'M', u'ᅫ'),
    (0x315A, 'M', u'ᅬ'),
    (0x315B, 'M', u'ᅭ'),
    (0x315C, 'M', u'ᅮ'),
    (0x315D, 'M', u'ᅯ'),
    (0x315E, 'M', u'ᅰ'),
    (0x315F, 'M', u'ᅱ'),
    (0x3160, 'M', u'ᅲ'),
    (0x3161, 'M', u'ᅳ'),
    (0x3162, 'M', u'ᅴ'),
    (0x3163, 'M', u'ᅵ'),
    (0x3164, 'X'),
    (0x3165, 'M', u'ᄔ'),
    (0x3166, 'M', u'ᄕ'),
    (0x3167, 'M', u'ᇇ'),
    (0x3168, 'M', u'ᇈ'),
    (0x3169, 'M', u'ᇌ'),
    (0x316A, 'M', u'ᇎ'),
    (0x316B, 'M', u'ᇓ'),
    (0x316C, 'M', u'ᇗ'),
    (0x316D, 'M', u'ᇙ'),
    (0x316E, 'M', u'ᄜ'),
    (0x316F, 'M', u'ᇝ'),
    (0x3170, 'M', u'ᇟ'),
    (0x3171, 'M', u'ᄝ'),
    (0x3172, 'M', u'ᄞ'),
    (0x3173, 'M', u'ᄠ'),
    (0x3174, 'M', u'ᄢ'),
    (0x3175, 'M', u'ᄣ'),
    (0x3176, 'M', u'ᄧ'),
    (0x3177, 'M', u'ᄩ'),
    (0x3178, 'M', u'ᄫ'),
    (0x3179, 'M', u'ᄬ'),
    (0x317A, 'M', u'ᄭ'),
    (0x317B, 'M', u'ᄮ'),
    (0x317C, 'M', u'ᄯ'),
    (0x317D, 'M', u'ᄲ'),
    (0x317E, 'M', u'ᄶ'),
    (0x317F, 'M', u'ᅀ'),
    (0x3180, 'M', u'ᅇ'),
    (0x3181, 'M', u'ᅌ'),
    (0x3182, 'M', u'ᇱ'),
    (0x3183, 'M', u'ᇲ'),
    (0x3184, 'M', u'ᅗ'),
    (0x3185, 'M', u'ᅘ'),
    (0x3186, 'M', u'ᅙ'),
    (0x3187, 'M', u'ᆄ'),
    (0x3188, 'M', u'ᆅ'),
    (0x3189, 'M', u'ᆈ'),
    (0x318A, 'M', u'ᆑ'),
    (0x318B, 'M', u'ᆒ'),
    (0x318C, 'M', u'ᆔ'),
    (0x318D, 'M', u'ᆞ'),
    (0x318E, 'M', u'ᆡ'),
    (0x318F, 'X'),
    (0x3190, 'V'),
    (0x3192, 'M', u'一'),
    (0x3193, 'M', u'二'),
    (0x3194, 'M', u'三'),
    (0x3195, 'M', u'四'),
    (0x3196, 'M', u'上'),
    (0x3197, 'M', u'中'),
    (0x3198, 'M', u'下'),
    (0x3199, 'M', u'甲'),
    (0x319A, 'M', u'乙'),
    (0x319B, 'M', u'丙'),
    (0x319C, 'M', u'丁'),
    (0x319D, 'M', u'天'),
    (0x319E, 'M', u'地'),
    (0x319F, 'M', u'人'),
    (0x31A0, 'V'),
    (0x31BB, 'X'),
    (0x31C0, 'V'),
    (0x31E4, 'X'),
    (0x31F0, 'V'),
    (0x3200, '3', u'(ᄀ)'),
    (0x3201, '3', u'(ᄂ)'),
    (0x3202, '3', u'(ᄃ)'),
    (0x3203, '3', u'(ᄅ)'),
    (0x3204, '3', u'(ᄆ)'),
    ]

def _seg_30():
    return [
    (0x3205, '3', u'(ᄇ)'),
    (0x3206, '3', u'(ᄉ)'),
    (0x3207, '3', u'(ᄋ)'),
    (0x3208, '3', u'(ᄌ)'),
    (0x3209, '3', u'(ᄎ)'),
    (0x320A, '3', u'(ᄏ)'),
    (0x320B, '3', u'(ᄐ)'),
    (0x320C, '3', u'(ᄑ)'),
    (0x320D, '3', u'(ᄒ)'),
    (0x320E, '3', u'(가)'),
    (0x320F, '3', u'(나)'),
    (0x3210, '3', u'(다)'),
    (0x3211, '3', u'(라)'),
    (0x3212, '3', u'(마)'),
    (0x3213, '3', u'(바)'),
    (0x3214, '3', u'(사)'),
    (0x3215, '3', u'(아)'),
    (0x3216, '3', u'(자)'),
    (0x3217, '3', u'(차)'),
    (0x3218, '3', u'(카)'),
    (0x3219, '3', u'(타)'),
    (0x321A, '3', u'(파)'),
    (0x321B, '3', u'(하)'),
    (0x321C, '3', u'(주)'),
    (0x321D, '3', u'(오전)'),
    (0x321E, '3', u'(오후)'),
    (0x321F, 'X'),
    (0x3220, '3', u'(一)'),
    (0x3221, '3', u'(二)'),
    (0x3222, '3', u'(三)'),
    (0x3223, '3', u'(四)'),
    (0x3224, '3', u'(五)'),
    (0x3225, '3', u'(六)'),
    (0x3226, '3', u'(七)'),
    (0x3227, '3', u'(八)'),
    (0x3228, '3', u'(九)'),
    (0x3229, '3', u'(十)'),
    (0x322A, '3', u'(月)'),
    (0x322B, '3', u'(火)'),
    (0x322C, '3', u'(水)'),
    (0x322D, '3', u'(木)'),
    (0x322E, '3', u'(金)'),
    (0x322F, '3', u'(土)'),
    (0x3230, '3', u'(日)'),
    (0x3231, '3', u'(株)'),
    (0x3232, '3', u'(有)'),
    (0x3233, '3', u'(社)'),
    (0x3234, '3', u'(名)'),
    (0x3235, '3', u'(特)'),
    (0x3236, '3', u'(財)'),
    (0x3237, '3', u'(祝)'),
    (0x3238, '3', u'(労)'),
    (0x3239, '3', u'(代)'),
    (0x323A, '3', u'(呼)'),
    (0x323B, '3', u'(学)'),
    (0x323C, '3', u'(監)'),
    (0x323D, '3', u'(企)'),
    (0x323E, '3', u'(資)'),
    (0x323F, '3', u'(協)'),
    (0x3240, '3', u'(祭)'),
    (0x3241, '3', u'(休)'),
    (0x3242, '3', u'(自)'),
    (0x3243, '3', u'(至)'),
    (0x3244, 'M', u'問'),
    (0x3245, 'M', u'幼'),
    (0x3246, 'M', u'文'),
    (0x3247, 'M', u'箏'),
    (0x3248, 'V'),
    (0x3250, 'M', u'pte'),
    (0x3251, 'M', u'21'),
    (0x3252, 'M', u'22'),
    (0x3253, 'M', u'23'),
    (0x3254, 'M', u'24'),
    (0x3255, 'M', u'25'),
    (0x3256, 'M', u'26'),
    (0x3257, 'M', u'27'),
    (0x3258, 'M', u'28'),
    (0x3259, 'M', u'29'),
    (0x325A, 'M', u'30'),
    (0x325B, 'M', u'31'),
    (0x325C, 'M', u'32'),
    (0x325D, 'M', u'33'),
    (0x325E, 'M', u'34'),
    (0x325F, 'M', u'35'),
    (0x3260, 'M', u'ᄀ'),
    (0x3261, 'M', u'ᄂ'),
    (0x3262, 'M', u'ᄃ'),
    (0x3263, 'M', u'ᄅ'),
    (0x3264, 'M', u'ᄆ'),
    (0x3265, 'M', u'ᄇ'),
    (0x3266, 'M', u'ᄉ'),
    (0x3267, 'M', u'ᄋ'),
    (0x3268, 'M', u'ᄌ'),
    (0x3269, 'M', u'ᄎ'),
    (0x326A, 'M', u'ᄏ'),
    (0x326B, 'M', u'ᄐ'),
    (0x326C, 'M', u'ᄑ'),
    (0x326D, 'M', u'ᄒ'),
    (0x326E, 'M', u'가'),
    (0x326F, 'M', u'나'),
    ]

def _seg_31():
    return [
    (0x3270, 'M', u'다'),
    (0x3271, 'M', u'라'),
    (0x3272, 'M', u'마'),
    (0x3273, 'M', u'바'),
    (0x3274, 'M', u'사'),
    (0x3275, 'M', u'아'),
    (0x3276, 'M', u'자'),
    (0x3277, 'M', u'차'),
    (0x3278, 'M', u'카'),
    (0x3279, 'M', u'타'),
    (0x327A, 'M', u'파'),
    (0x327B, 'M', u'하'),
    (0x327C, 'M', u'참고'),
    (0x327D, 'M', u'주의'),
    (0x327E, 'M', u'우'),
    (0x327F, 'V'),
    (0x3280, 'M', u'一'),
    (0x3281, 'M', u'二'),
    (0x3282, 'M', u'三'),
    (0x3283, 'M', u'四'),
    (0x3284, 'M', u'五'),
    (0x3285, 'M', u'六'),
    (0x3286, 'M', u'七'),
    (0x3287, 'M', u'八'),
    (0x3288, 'M', u'九'),
    (0x3289, 'M', u'十'),
    (0x328A, 'M', u'月'),
    (0x328B, 'M', u'火'),
    (0x328C, 'M', u'水'),
    (0x328D, 'M', u'木'),
    (0x328E, 'M', u'金'),
    (0x328F, 'M', u'土'),
    (0x3290, 'M', u'日'),
    (0x3291, 'M', u'株'),
    (0x3292, 'M', u'有'),
    (0x3293, 'M', u'社'),
    (0x3294, 'M', u'名'),
    (0x3295, 'M', u'特'),
    (0x3296, 'M', u'財'),
    (0x3297, 'M', u'祝'),
    (0x3298, 'M', u'労'),
    (0x3299, 'M', u'秘'),
    (0x329A, 'M', u'男'),
    (0x329B, 'M', u'女'),
    (0x329C, 'M', u'適'),
    (0x329D, 'M', u'優'),
    (0x329E, 'M', u'印'),
    (0x329F, 'M', u'注'),
    (0x32A0, 'M', u'項'),
    (0x32A1, 'M', u'休'),
    (0x32A2, 'M', u'写'),
    (0x32A3, 'M', u'正'),
    (0x32A4, 'M', u'上'),
    (0x32A5, 'M', u'中'),
    (0x32A6, 'M', u'下'),
    (0x32A7, 'M', u'左'),
    (0x32A8, 'M', u'右'),
    (0x32A9, 'M', u'医'),
    (0x32AA, 'M', u'宗'),
    (0x32AB, 'M', u'学'),
    (0x32AC, 'M', u'監'),
    (0x32AD, 'M', u'企'),
    (0x32AE, 'M', u'資'),
    (0x32AF, 'M', u'協'),
    (0x32B0, 'M', u'夜'),
    (0x32B1, 'M', u'36'),
    (0x32B2, 'M', u'37'),
    (0x32B3, 'M', u'38'),
    (0x32B4, 'M', u'39'),
    (0x32B5, 'M', u'40'),
    (0x32B6, 'M', u'41'),
    (0x32B7, 'M', u'42'),
    (0x32B8, 'M', u'43'),
    (0x32B9, 'M', u'44'),
    (0x32BA, 'M', u'45'),
    (0x32BB, 'M', u'46'),
    (0x32BC, 'M', u'47'),
    (0x32BD, 'M', u'48'),
    (0x32BE, 'M', u'49'),
    (0x32BF, 'M', u'50'),
    (0x32C0, 'M', u'1月'),
    (0x32C1, 'M', u'2月'),
    (0x32C2, 'M', u'3月'),
    (0x32C3, 'M', u'4月'),
    (0x32C4, 'M', u'5月'),
    (0x32C5, 'M', u'6月'),
    (0x32C6, 'M', u'7月'),
    (0x32C7, 'M', u'8月'),
    (0x32C8, 'M', u'9月'),
    (0x32C9, 'M', u'10月'),
    (0x32CA, 'M', u'11月'),
    (0x32CB, 'M', u'12月'),
    (0x32CC, 'M', u'hg'),
    (0x32CD, 'M', u'erg'),
    (0x32CE, 'M', u'ev'),
    (0x32CF, 'M', u'ltd'),
    (0x32D0, 'M', u'ア'),
    (0x32D1, 'M', u'イ'),
    (0x32D2, 'M', u'ウ'),
    (0x32D3, 'M', u'エ'),
    ]

def _seg_32():
    return [
    (0x32D4, 'M', u'オ'),
    (0x32D5, 'M', u'カ'),
    (0x32D6, 'M', u'キ'),
    (0x32D7, 'M', u'ク'),
    (0x32D8, 'M', u'ケ'),
    (0x32D9, 'M', u'コ'),
    (0x32DA, 'M', u'サ'),
    (0x32DB, 'M', u'シ'),
    (0x32DC, 'M', u'ス'),
    (0x32DD, 'M', u'セ'),
    (0x32DE, 'M', u'ソ'),
    (0x32DF, 'M', u'タ'),
    (0x32E0, 'M', u'チ'),
    (0x32E1, 'M', u'ツ'),
    (0x32E2, 'M', u'テ'),
    (0x32E3, 'M', u'ト'),
    (0x32E4, 'M', u'ナ'),
    (0x32E5, 'M', u'ニ'),
    (0x32E6, 'M', u'ヌ'),
    (0x32E7, 'M', u'ネ'),
    (0x32E8, 'M', u'ノ'),
    (0x32E9, 'M', u'ハ'),
    (0x32EA, 'M', u'ヒ'),
    (0x32EB, 'M', u'フ'),
    (0x32EC, 'M', u'ヘ'),
    (0x32ED, 'M', u'ホ'),
    (0x32EE, 'M', u'マ'),
    (0x32EF, 'M', u'ミ'),
    (0x32F0, 'M', u'ム'),
    (0x32F1, 'M', u'メ'),
    (0x32F2, 'M', u'モ'),
    (0x32F3, 'M', u'ヤ'),
    (0x32F4, 'M', u'ユ'),
    (0x32F5, 'M', u'ヨ'),
    (0x32F6, 'M', u'ラ'),
    (0x32F7, 'M', u'リ'),
    (0x32F8, 'M', u'ル'),
    (0x32F9, 'M', u'レ'),
    (0x32FA, 'M', u'ロ'),
    (0x32FB, 'M', u'ワ'),
    (0x32FC, 'M', u'ヰ'),
    (0x32FD, 'M', u'ヱ'),
    (0x32FE, 'M', u'ヲ'),
    (0x32FF, 'X'),
    (0x3300, 'M', u'アパート'),
    (0x3301, 'M', u'アルファ'),
    (0x3302, 'M', u'アンペア'),
    (0x3303, 'M', u'アール'),
    (0x3304, 'M', u'イニング'),
    (0x3305, 'M', u'インチ'),
    (0x3306, 'M', u'ウォン'),
    (0x3307, 'M', u'エスクード'),
    (0x3308, 'M', u'エーカー'),
    (0x3309, 'M', u'オンス'),
    (0x330A, 'M', u'オーム'),
    (0x330B, 'M', u'カイリ'),
    (0x330C, 'M', u'カラット'),
    (0x330D, 'M', u'カロリー'),
    (0x330E, 'M', u'ガロン'),
    (0x330F, 'M', u'ガンマ'),
    (0x3310, 'M', u'ギガ'),
    (0x3311, 'M', u'ギニー'),
    (0x3312, 'M', u'キュリー'),
    (0x3313, 'M', u'ギルダー'),
    (0x3314, 'M', u'キロ'),
    (0x3315, 'M', u'キログラム'),
    (0x3316, 'M', u'キロメートル'),
    (0x3317, 'M', u'キロワット'),
    (0x3318, 'M', u'グラム'),
    (0x3319, 'M', u'グラムトン'),
    (0x331A, 'M', u'クルゼイロ'),
    (0x331B, 'M', u'クローネ'),
    (0x331C, 'M', u'ケース'),
    (0x331D, 'M', u'コルナ'),
    (0x331E, 'M', u'コーポ'),
    (0x331F, 'M', u'サイクル'),
    (0x3320, 'M', u'サンチーム'),
    (0x3321, 'M', u'シリング'),
    (0x3322, 'M', u'センチ'),
    (0x3323, 'M', u'セント'),
    (0x3324, 'M', u'ダース'),
    (0x3325, 'M', u'デシ'),
    (0x3326, 'M', u'ドル'),
    (0x3327, 'M', u'トン'),
    (0x3328, 'M', u'ナノ'),
    (0x3329, 'M', u'ノット'),
    (0x332A, 'M', u'ハイツ'),
    (0x332B, 'M', u'パーセント'),
    (0x332C, 'M', u'パーツ'),
    (0x332D, 'M', u'バーレル'),
    (0x332E, 'M', u'ピアストル'),
    (0x332F, 'M', u'ピクル'),
    (0x3330, 'M', u'ピコ'),
    (0x3331, 'M', u'ビル'),
    (0x3332, 'M', u'ファラッド'),
    (0x3333, 'M', u'フィート'),
    (0x3334, 'M', u'ブッシェル'),
    (0x3335, 'M', u'フラン'),
    (0x3336, 'M', u'ヘクタール'),
    (0x3337, 'M', u'ペソ'),
    ]

def _seg_33():
    return [
    (0x3338, 'M', u'ペニヒ'),
    (0x3339, 'M', u'ヘルツ'),
    (0x333A, 'M', u'ペンス'),
    (0x333B, 'M', u'ページ'),
    (0x333C, 'M', u'ベータ'),
    (0x333D, 'M', u'ポイント'),
    (0x333E, 'M', u'ボルト'),
    (0x333F, 'M', u'ホン'),
    (0x3340, 'M', u'ポンド'),
    (0x3341, 'M', u'ホール'),
    (0x3342, 'M', u'ホーン'),
    (0x3343, 'M', u'マイクロ'),
    (0x3344, 'M', u'マイル'),
    (0x3345, 'M', u'マッハ'),
    (0x3346, 'M', u'マルク'),
    (0x3347, 'M', u'マンション'),
    (0x3348, 'M', u'ミクロン'),
    (0x3349, 'M', u'ミリ'),
    (0x334A, 'M', u'ミリバール'),
    (0x334B, 'M', u'メガ'),
    (0x334C, 'M', u'メガトン'),
    (0x334D, 'M', u'メートル'),
    (0x334E, 'M', u'ヤード'),
    (0x334F, 'M', u'ヤール'),
    (0x3350, 'M', u'ユアン'),
    (0x3351, 'M', u'リットル'),
    (0x3352, 'M', u'リラ'),
    (0x3353, 'M', u'ルピー'),
    (0x3354, 'M', u'ルーブル'),
    (0x3355, 'M', u'レム'),
    (0x3356, 'M', u'レントゲン'),
    (0x3357, 'M', u'ワット'),
    (0x3358, 'M', u'0点'),
    (0x3359, 'M', u'1点'),
    (0x335A, 'M', u'2点'),
    (0x335B, 'M', u'3点'),
    (0x335C, 'M', u'4点'),
    (0x335D, 'M', u'5点'),
    (0x335E, 'M', u'6点'),
    (0x335F, 'M', u'7点'),
    (0x3360, 'M', u'8点'),
    (0x3361, 'M', u'9点'),
    (0x3362, 'M', u'10点'),
    (0x3363, 'M', u'11点'),
    (0x3364, 'M', u'12点'),
    (0x3365, 'M', u'13点'),
    (0x3366, 'M', u'14点'),
    (0x3367, 'M', u'15点'),
    (0x3368, 'M', u'16点'),
    (0x3369, 'M', u'17点'),
    (0x336A, 'M', u'18点'),
    (0x336B, 'M', u'19点'),
    (0x336C, 'M', u'20点'),
    (0x336D, 'M', u'21点'),
    (0x336E, 'M', u'22点'),
    (0x336F, 'M', u'23点'),
    (0x3370, 'M', u'24点'),
    (0x3371, 'M', u'hpa'),
    (0x3372, 'M', u'da'),
    (0x3373, 'M', u'au'),
    (0x3374, 'M', u'bar'),
    (0x3375, 'M', u'ov'),
    (0x3376, 'M', u'pc'),
    (0x3377, 'M', u'dm'),
    (0x3378, 'M', u'dm2'),
    (0x3379, 'M', u'dm3'),
    (0x337A, 'M', u'iu'),
    (0x337B, 'M', u'平成'),
    (0x337C, 'M', u'昭和'),
    (0x337D, 'M', u'大正'),
    (0x337E, 'M', u'明治'),
    (0x337F, 'M', u'株式会社'),
    (0x3380, 'M', u'pa'),
    (0x3381, 'M', u'na'),
    (0x3382, 'M', u'μa'),
    (0x3383, 'M', u'ma'),
    (0x3384, 'M', u'ka'),
    (0x3385, 'M', u'kb'),
    (0x3386, 'M', u'mb'),
    (0x3387, 'M', u'gb'),
    (0x3388, 'M', u'cal'),
    (0x3389, 'M', u'kcal'),
    (0x338A, 'M', u'pf'),
    (0x338B, 'M', u'nf'),
    (0x338C, 'M', u'μf'),
    (0x338D, 'M', u'μg'),
    (0x338E, 'M', u'mg'),
    (0x338F, 'M', u'kg'),
    (0x3390, 'M', u'hz'),
    (0x3391, 'M', u'khz'),
    (0x3392, 'M', u'mhz'),
    (0x3393, 'M', u'ghz'),
    (0x3394, 'M', u'thz'),
    (0x3395, 'M', u'μl'),
    (0x3396, 'M', u'ml'),
    (0x3397, 'M', u'dl'),
    (0x3398, 'M', u'kl'),
    (0x3399, 'M', u'fm'),
    (0x339A, 'M', u'nm'),
    (0x339B, 'M', u'μm'),
    ]

def _seg_34():
    return [
    (0x339C, 'M', u'mm'),
    (0x339D, 'M', u'cm'),
    (0x339E, 'M', u'km'),
    (0x339F, 'M', u'mm2'),
    (0x33A0, 'M', u'cm2'),
    (0x33A1, 'M', u'm2'),
    (0x33A2, 'M', u'km2'),
    (0x33A3, 'M', u'mm3'),
    (0x33A4, 'M', u'cm3'),
    (0x33A5, 'M', u'm3'),
    (0x33A6, 'M', u'km3'),
    (0x33A7, 'M', u'm∕s'),
    (0x33A8, 'M', u'm∕s2'),
    (0x33A9, 'M', u'pa'),
    (0x33AA, 'M', u'kpa'),
    (0x33AB, 'M', u'mpa'),
    (0x33AC, 'M', u'gpa'),
    (0x33AD, 'M', u'rad'),
    (0x33AE, 'M', u'rad∕s'),
    (0x33AF, 'M', u'rad∕s2'),
    (0x33B0, 'M', u'ps'),
    (0x33B1, 'M', u'ns'),
    (0x33B2, 'M', u'μs'),
    (0x33B3, 'M', u'ms'),
    (0x33B4, 'M', u'pv'),
    (0x33B5, 'M', u'nv'),
    (0x33B6, 'M', u'μv'),
    (0x33B7, 'M', u'mv'),
    (0x33B8, 'M', u'kv'),
    (0x33B9, 'M', u'mv'),
    (0x33BA, 'M', u'pw'),
    (0x33BB, 'M', u'nw'),
    (0x33BC, 'M', u'μw'),
    (0x33BD, 'M', u'mw'),
    (0x33BE, 'M', u'kw'),
    (0x33BF, 'M', u'mw'),
    (0x33C0, 'M', u'kω'),
    (0x33C1, 'M', u'mω'),
    (0x33C2, 'X'),
    (0x33C3, 'M', u'bq'),
    (0x33C4, 'M', u'cc'),
    (0x33C5, 'M', u'cd'),
    (0x33C6, 'M', u'c∕kg'),
    (0x33C7, 'X'),
    (0x33C8, 'M', u'db'),
    (0x33C9, 'M', u'gy'),
    (0x33CA, 'M', u'ha'),
    (0x33CB, 'M', u'hp'),
    (0x33CC, 'M', u'in'),
    (0x33CD, 'M', u'kk'),
    (0x33CE, 'M', u'km'),
    (0x33CF, 'M', u'kt'),
    (0x33D0, 'M', u'lm'),
    (0x33D1, 'M', u'ln'),
    (0x33D2, 'M', u'log'),
    (0x33D3, 'M', u'lx'),
    (0x33D4, 'M', u'mb'),
    (0x33D5, 'M', u'mil'),
    (0x33D6, 'M', u'mol'),
    (0x33D7, 'M', u'ph'),
    (0x33D8, 'X'),
    (0x33D9, 'M', u'ppm'),
    (0x33DA, 'M', u'pr'),
    (0x33DB, 'M', u'sr'),
    (0x33DC, 'M', u'sv'),
    (0x33DD, 'M', u'wb'),
    (0x33DE, 'M', u'v∕m'),
    (0x33DF, 'M', u'a∕m'),
    (0x33E0, 'M', u'1日'),
    (0x33E1, 'M', u'2日'),
    (0x33E2, 'M', u'3日'),
    (0x33E3, 'M', u'4日'),
    (0x33E4, 'M', u'5日'),
    (0x33E5, 'M', u'6日'),
    (0x33E6, 'M', u'7日'),
    (0x33E7, 'M', u'8日'),
    (0x33E8, 'M', u'9日'),
    (0x33E9, 'M', u'10日'),
    (0x33EA, 'M', u'11日'),
    (0x33EB, 'M', u'12日'),
    (0x33EC, 'M', u'13日'),
    (0x33ED, 'M', u'14日'),
    (0x33EE, 'M', u'15日'),
    (0x33EF, 'M', u'16日'),
    (0x33F0, 'M', u'17日'),
    (0x33F1, 'M', u'18日'),
    (0x33F2, 'M', u'19日'),
    (0x33F3, 'M', u'20日'),
    (0x33F4, 'M', u'21日'),
    (0x33F5, 'M', u'22日'),
    (0x33F6, 'M', u'23日'),
    (0x33F7, 'M', u'24日'),
    (0x33F8, 'M', u'25日'),
    (0x33F9, 'M', u'26日'),
    (0x33FA, 'M', u'27日'),
    (0x33FB, 'M', u'28日'),
    (0x33FC, 'M', u'29日'),
    (0x33FD, 'M', u'30日'),
    (0x33FE, 'M', u'31日'),
    (0x33FF, 'M', u'gal'),
    ]

def _seg_35():
    return [
    (0x3400, 'V'),
    (0x4DB6, 'X'),
    (0x4DC0, 'V'),
    (0x9FCD, 'X'),
    (0xA000, 'V'),
    (0xA48D, 'X'),
    (0xA490, 'V'),
    (0xA4C7, 'X'),
    (0xA4D0, 'V'),
    (0xA62C, 'X'),
    (0xA640, 'M', u'ꙁ'),
    (0xA641, 'V'),
    (0xA642, 'M', u'ꙃ'),
    (0xA643, 'V'),
    (0xA644, 'M', u'ꙅ'),
    (0xA645, 'V'),
    (0xA646, 'M', u'ꙇ'),
    (0xA647, 'V'),
    (0xA648, 'M', u'ꙉ'),
    (0xA649, 'V'),
    (0xA64A, 'M', u'ꙋ'),
    (0xA64B, 'V'),
    (0xA64C, 'M', u'ꙍ'),
    (0xA64D, 'V'),
    (0xA64E, 'M', u'ꙏ'),
    (0xA64F, 'V'),
    (0xA650, 'M', u'ꙑ'),
    (0xA651, 'V'),
    (0xA652, 'M', u'ꙓ'),
    (0xA653, 'V'),
    (0xA654, 'M', u'ꙕ'),
    (0xA655, 'V'),
    (0xA656, 'M', u'ꙗ'),
    (0xA657, 'V'),
    (0xA658, 'M', u'ꙙ'),
    (0xA659, 'V'),
    (0xA65A, 'M', u'ꙛ'),
    (0xA65B, 'V'),
    (0xA65C, 'M', u'ꙝ'),
    (0xA65D, 'V'),
    (0xA65E, 'M', u'ꙟ'),
    (0xA65F, 'V'),
    (0xA660, 'M', u'ꙡ'),
    (0xA661, 'V'),
    (0xA662, 'M', u'ꙣ'),
    (0xA663, 'V'),
    (0xA664, 'M', u'ꙥ'),
    (0xA665, 'V'),
    (0xA666, 'M', u'ꙧ'),
    (0xA667, 'V'),
    (0xA668, 'M', u'ꙩ'),
    (0xA669, 'V'),
    (0xA66A, 'M', u'ꙫ'),
    (0xA66B, 'V'),
    (0xA66C, 'M', u'ꙭ'),
    (0xA66D, 'V'),
    (0xA680, 'M', u'ꚁ'),
    (0xA681, 'V'),
    (0xA682, 'M', u'ꚃ'),
    (0xA683, 'V'),
    (0xA684, 'M', u'ꚅ'),
    (0xA685, 'V'),
    (0xA686, 'M', u'ꚇ'),
    (0xA687, 'V'),
    (0xA688, 'M', u'ꚉ'),
    (0xA689, 'V'),
    (0xA68A, 'M', u'ꚋ'),
    (0xA68B, 'V'),
    (0xA68C, 'M', u'ꚍ'),
    (0xA68D, 'V'),
    (0xA68E, 'M', u'ꚏ'),
    (0xA68F, 'V'),
    (0xA690, 'M', u'ꚑ'),
    (0xA691, 'V'),
    (0xA692, 'M', u'ꚓ'),
    (0xA693, 'V'),
    (0xA694, 'M', u'ꚕ'),
    (0xA695, 'V'),
    (0xA696, 'M', u'ꚗ'),
    (0xA697, 'V'),
    (0xA698, 'X'),
    (0xA69F, 'V'),
    (0xA6F8, 'X'),
    (0xA700, 'V'),
    (0xA722, 'M', u'ꜣ'),
    (0xA723, 'V'),
    (0xA724, 'M', u'ꜥ'),
    (0xA725, 'V'),
    (0xA726, 'M', u'ꜧ'),
    (0xA727, 'V'),
    (0xA728, 'M', u'ꜩ'),
    (0xA729, 'V'),
    (0xA72A, 'M', u'ꜫ'),
    (0xA72B, 'V'),
    (0xA72C, 'M', u'ꜭ'),
    (0xA72D, 'V'),
    (0xA72E, 'M', u'ꜯ'),
    (0xA72F, 'V'),
    (0xA732, 'M', u'ꜳ'),
    (0xA733, 'V'),
    ]

def _seg_36():
    return [
    (0xA734, 'M', u'ꜵ'),
    (0xA735, 'V'),
    (0xA736, 'M', u'ꜷ'),
    (0xA737, 'V'),
    (0xA738, 'M', u'ꜹ'),
    (0xA739, 'V'),
    (0xA73A, 'M', u'ꜻ'),
    (0xA73B, 'V'),
    (0xA73C, 'M', u'ꜽ'),
    (0xA73D, 'V'),
    (0xA73E, 'M', u'ꜿ'),
    (0xA73F, 'V'),
    (0xA740, 'M', u'ꝁ'),
    (0xA741, 'V'),
    (0xA742, 'M', u'ꝃ'),
    (0xA743, 'V'),
    (0xA744, 'M', u'ꝅ'),
    (0xA745, 'V'),
    (0xA746, 'M', u'ꝇ'),
    (0xA747, 'V'),
    (0xA748, 'M', u'ꝉ'),
    (0xA749, 'V'),
    (0xA74A, 'M', u'ꝋ'),
    (0xA74B, 'V'),
    (0xA74C, 'M', u'ꝍ'),
    (0xA74D, 'V'),
    (0xA74E, 'M', u'ꝏ'),
    (0xA74F, 'V'),
    (0xA750, 'M', u'ꝑ'),
    (0xA751, 'V'),
    (0xA752, 'M', u'ꝓ'),
    (0xA753, 'V'),
    (0xA754, 'M', u'ꝕ'),
    (0xA755, 'V'),
    (0xA756, 'M', u'ꝗ'),
    (0xA757, 'V'),
    (0xA758, 'M', u'ꝙ'),
    (0xA759, 'V'),
    (0xA75A, 'M', u'ꝛ'),
    (0xA75B, 'V'),
    (0xA75C, 'M', u'ꝝ'),
    (0xA75D, 'V'),
    (0xA75E, 'M', u'ꝟ'),
    (0xA75F, 'V'),
    (0xA760, 'M', u'ꝡ'),
    (0xA761, 'V'),
    (0xA762, 'M', u'ꝣ'),
    (0xA763, 'V'),
    (0xA764, 'M', u'ꝥ'),
    (0xA765, 'V'),
    (0xA766, 'M', u'ꝧ'),
    (0xA767, 'V'),
    (0xA768, 'M', u'ꝩ'),
    (0xA769, 'V'),
    (0xA76A, 'M', u'ꝫ'),
    (0xA76B, 'V'),
    (0xA76C, 'M', u'ꝭ'),
    (0xA76D, 'V'),
    (0xA76E, 'M', u'ꝯ'),
    (0xA76F, 'V'),
    (0xA770, 'M', u'ꝯ'),
    (0xA771, 'V'),
    (0xA779, 'M', u'ꝺ'),
    (0xA77A, 'V'),
    (0xA77B, 'M', u'ꝼ'),
    (0xA77C, 'V'),
    (0xA77D, 'M', u'ᵹ'),
    (0xA77E, 'M', u'ꝿ'),
    (0xA77F, 'V'),
    (0xA780, 'M', u'ꞁ'),
    (0xA781, 'V'),
    (0xA782, 'M', u'ꞃ'),
    (0xA783, 'V'),
    (0xA784, 'M', u'ꞅ'),
    (0xA785, 'V'),
    (0xA786, 'M', u'ꞇ'),
    (0xA787, 'V'),
    (0xA78B, 'M', u'ꞌ'),
    (0xA78C, 'V'),
    (0xA78D, 'M', u'ɥ'),
    (0xA78E, 'V'),
    (0xA78F, 'X'),
    (0xA790, 'M', u'ꞑ'),
    (0xA791, 'V'),
    (0xA792, 'M', u'ꞓ'),
    (0xA793, 'V'),
    (0xA794, 'X'),
    (0xA7A0, 'M', u'ꞡ'),
    (0xA7A1, 'V'),
    (0xA7A2, 'M', u'ꞣ'),
    (0xA7A3, 'V'),
    (0xA7A4, 'M', u'ꞥ'),
    (0xA7A5, 'V'),
    (0xA7A6, 'M', u'ꞧ'),
    (0xA7A7, 'V'),
    (0xA7A8, 'M', u'ꞩ'),
    (0xA7A9, 'V'),
    (0xA7AA, 'M', u'ɦ'),
    (0xA7AB, 'X'),
    (0xA7F8, 'M', u'ħ'),
    ]

def _seg_37():
    return [
    (0xA7F9, 'M', u'œ'),
    (0xA7FA, 'V'),
    (0xA82C, 'X'),
    (0xA830, 'V'),
    (0xA83A, 'X'),
    (0xA840, 'V'),
    (0xA878, 'X'),
    (0xA880, 'V'),
    (0xA8C5, 'X'),
    (0xA8CE, 'V'),
    (0xA8DA, 'X'),
    (0xA8E0, 'V'),
    (0xA8FC, 'X'),
    (0xA900, 'V'),
    (0xA954, 'X'),
    (0xA95F, 'V'),
    (0xA97D, 'X'),
    (0xA980, 'V'),
    (0xA9CE, 'X'),
    (0xA9CF, 'V'),
    (0xA9DA, 'X'),
    (0xA9DE, 'V'),
    (0xA9E0, 'X'),
    (0xAA00, 'V'),
    (0xAA37, 'X'),
    (0xAA40, 'V'),
    (0xAA4E, 'X'),
    (0xAA50, 'V'),
    (0xAA5A, 'X'),
    (0xAA5C, 'V'),
    (0xAA7C, 'X'),
    (0xAA80, 'V'),
    (0xAAC3, 'X'),
    (0xAADB, 'V'),
    (0xAAF7, 'X'),
    (0xAB01, 'V'),
    (0xAB07, 'X'),
    (0xAB09, 'V'),
    (0xAB0F, 'X'),
    (0xAB11, 'V'),
    (0xAB17, 'X'),
    (0xAB20, 'V'),
    (0xAB27, 'X'),
    (0xAB28, 'V'),
    (0xAB2F, 'X'),
    (0xABC0, 'V'),
    (0xABEE, 'X'),
    (0xABF0, 'V'),
    (0xABFA, 'X'),
    (0xAC00, 'V'),
    (0xD7A4, 'X'),
    (0xD7B0, 'V'),
    (0xD7C7, 'X'),
    (0xD7CB, 'V'),
    (0xD7FC, 'X'),
    (0xF900, 'M', u'豈'),
    (0xF901, 'M', u'更'),
    (0xF902, 'M', u'車'),
    (0xF903, 'M', u'賈'),
    (0xF904, 'M', u'滑'),
    (0xF905, 'M', u'串'),
    (0xF906, 'M', u'句'),
    (0xF907, 'M', u'龜'),
    (0xF909, 'M', u'契'),
    (0xF90A, 'M', u'金'),
    (0xF90B, 'M', u'喇'),
    (0xF90C, 'M', u'奈'),
    (0xF90D, 'M', u'懶'),
    (0xF90E, 'M', u'癩'),
    (0xF90F, 'M', u'羅'),
    (0xF910, 'M', u'蘿'),
    (0xF911, 'M', u'螺'),
    (0xF912, 'M', u'裸'),
    (0xF913, 'M', u'邏'),
    (0xF914, 'M', u'樂'),
    (0xF915, 'M', u'洛'),
    (0xF916, 'M', u'烙'),
    (0xF917, 'M', u'珞'),
    (0xF918, 'M', u'落'),
    (0xF919, 'M', u'酪'),
    (0xF91A, 'M', u'駱'),
    (0xF91B, 'M', u'亂'),
    (0xF91C, 'M', u'卵'),
    (0xF91D, 'M', u'欄'),
    (0xF91E, 'M', u'爛'),
    (0xF91F, 'M', u'蘭'),
    (0xF920, 'M', u'鸞'),
    (0xF921, 'M', u'嵐'),
    (0xF922, 'M', u'濫'),
    (0xF923, 'M', u'藍'),
    (0xF924, 'M', u'襤'),
    (0xF925, 'M', u'拉'),
    (0xF926, 'M', u'臘'),
    (0xF927, 'M', u'蠟'),
    (0xF928, 'M', u'廊'),
    (0xF929, 'M', u'朗'),
    (0xF92A, 'M', u'浪'),
    (0xF92B, 'M', u'狼'),
    (0xF92C, 'M', u'郎'),
    (0xF92D, 'M', u'來'),
    ]

def _seg_38():
    return [
    (0xF92E, 'M', u'冷'),
    (0xF92F, 'M', u'勞'),
    (0xF930, 'M', u'擄'),
    (0xF931, 'M', u'櫓'),
    (0xF932, 'M', u'爐'),
    (0xF933, 'M', u'盧'),
    (0xF934, 'M', u'老'),
    (0xF935, 'M', u'蘆'),
    (0xF936, 'M', u'虜'),
    (0xF937, 'M', u'路'),
    (0xF938, 'M', u'露'),
    (0xF939, 'M', u'魯'),
    (0xF93A, 'M', u'鷺'),
    (0xF93B, 'M', u'碌'),
    (0xF93C, 'M', u'祿'),
    (0xF93D, 'M', u'綠'),
    (0xF93E, 'M', u'菉'),
    (0xF93F, 'M', u'錄'),
    (0xF940, 'M', u'鹿'),
    (0xF941, 'M', u'論'),
    (0xF942, 'M', u'壟'),
    (0xF943, 'M', u'弄'),
    (0xF944, 'M', u'籠'),
    (0xF945, 'M', u'聾'),
    (0xF946, 'M', u'牢'),
    (0xF947, 'M', u'磊'),
    (0xF948, 'M', u'賂'),
    (0xF949, 'M', u'雷'),
    (0xF94A, 'M', u'壘'),
    (0xF94B, 'M', u'屢'),
    (0xF94C, 'M', u'樓'),
    (0xF94D, 'M', u'淚'),
    (0xF94E, 'M', u'漏'),
    (0xF94F, 'M', u'累'),
    (0xF950, 'M', u'縷'),
    (0xF951, 'M', u'陋'),
    (0xF952, 'M', u'勒'),
    (0xF953, 'M', u'肋'),
    (0xF954, 'M', u'凜'),
    (0xF955, 'M', u'凌'),
    (0xF956, 'M', u'稜'),
    (0xF957, 'M', u'綾'),
    (0xF958, 'M', u'菱'),
    (0xF959, 'M', u'陵'),
    (0xF95A, 'M', u'讀'),
    (0xF95B, 'M', u'拏'),
    (0xF95C, 'M', u'樂'),
    (0xF95D, 'M', u'諾'),
    (0xF95E, 'M', u'丹'),
    (0xF95F, 'M', u'寧'),
    (0xF960, 'M', u'怒'),
    (0xF961, 'M', u'率'),
    (0xF962, 'M', u'異'),
    (0xF963, 'M', u'北'),
    (0xF964, 'M', u'磻'),
    (0xF965, 'M', u'便'),
    (0xF966, 'M', u'復'),
    (0xF967, 'M', u'不'),
    (0xF968, 'M', u'泌'),
    (0xF969, 'M', u'數'),
    (0xF96A, 'M', u'索'),
    (0xF96B, 'M', u'參'),
    (0xF96C, 'M', u'塞'),
    (0xF96D, 'M', u'省'),
    (0xF96E, 'M', u'葉'),
    (0xF96F, 'M', u'說'),
    (0xF970, 'M', u'殺'),
    (0xF971, 'M', u'辰'),
    (0xF972, 'M', u'沈'),
    (0xF973, 'M', u'拾'),
    (0xF974, 'M', u'若'),
    (0xF975, 'M', u'掠'),
    (0xF976, 'M', u'略'),
    (0xF977, 'M', u'亮'),
    (0xF978, 'M', u'兩'),
    (0xF979, 'M', u'凉'),
    (0xF97A, 'M', u'梁'),
    (0xF97B, 'M', u'糧'),
    (0xF97C, 'M', u'良'),
    (0xF97D, 'M', u'諒'),
    (0xF97E, 'M', u'量'),
    (0xF97F, 'M', u'勵'),
    (0xF980, 'M', u'呂'),
    (0xF981, 'M', u'女'),
    (0xF982, 'M', u'廬'),
    (0xF983, 'M', u'旅'),
    (0xF984, 'M', u'濾'),
    (0xF985, 'M', u'礪'),
    (0xF986, 'M', u'閭'),
    (0xF987, 'M', u'驪'),
    (0xF988, 'M', u'麗'),
    (0xF989, 'M', u'黎'),
    (0xF98A, 'M', u'力'),
    (0xF98B, 'M', u'曆'),
    (0xF98C, 'M', u'歷'),
    (0xF98D, 'M', u'轢'),
    (0xF98E, 'M', u'年'),
    (0xF98F, 'M', u'憐'),
    (0xF990, 'M', u'戀'),
    (0xF991, 'M', u'撚'),
    ]

def _seg_39():
    return [
    (0xF992, 'M', u'漣'),
    (0xF993, 'M', u'煉'),
    (0xF994, 'M', u'璉'),
    (0xF995, 'M', u'秊'),
    (0xF996, 'M', u'練'),
    (0xF997, 'M', u'聯'),
    (0xF998, 'M', u'輦'),
    (0xF999, 'M', u'蓮'),
    (0xF99A, 'M', u'連'),
    (0xF99B, 'M', u'鍊'),
    (0xF99C, 'M', u'列'),
    (0xF99D, 'M', u'劣'),
    (0xF99E, 'M', u'咽'),
    (0xF99F, 'M', u'烈'),
    (0xF9A0, 'M', u'裂'),
    (0xF9A1, 'M', u'說'),
    (0xF9A2, 'M', u'廉'),
    (0xF9A3, 'M', u'念'),
    (0xF9A4, 'M', u'捻'),
    (0xF9A5, 'M', u'殮'),
    (0xF9A6, 'M', u'簾'),
    (0xF9A7, 'M', u'獵'),
    (0xF9A8, 'M', u'令'),
    (0xF9A9, 'M', u'囹'),
    (0xF9AA, 'M', u'寧'),
    (0xF9AB, 'M', u'嶺'),
    (0xF9AC, 'M', u'怜'),
    (0xF9AD, 'M', u'玲'),
    (0xF9AE, 'M', u'瑩'),
    (0xF9AF, 'M', u'羚'),
    (0xF9B0, 'M', u'聆'),
    (0xF9B1, 'M', u'鈴'),
    (0xF9B2, 'M', u'零'),
    (0xF9B3, 'M', u'靈'),
    (0xF9B4, 'M', u'領'),
    (0xF9B5, 'M', u'例'),
    (0xF9B6, 'M', u'禮'),
    (0xF9B7, 'M', u'醴'),
    (0xF9B8, 'M', u'隸'),
    (0xF9B9, 'M', u'惡'),
    (0xF9BA, 'M', u'了'),
    (0xF9BB, 'M', u'僚'),
    (0xF9BC, 'M', u'寮'),
    (0xF9BD, 'M', u'尿'),
    (0xF9BE, 'M', u'料'),
    (0xF9BF, 'M', u'樂'),
    (0xF9C0, 'M', u'燎'),
    (0xF9C1, 'M', u'療'),
    (0xF9C2, 'M', u'蓼'),
    (0xF9C3, 'M', u'遼'),
    (0xF9C4, 'M', u'龍'),
    (0xF9C5, 'M', u'暈'),
    (0xF9C6, 'M', u'阮'),
    (0xF9C7, 'M', u'劉'),
    (0xF9C8, 'M', u'杻'),
    (0xF9C9, 'M', u'柳'),
    (0xF9CA, 'M', u'流'),
    (0xF9CB, 'M', u'溜'),
    (0xF9CC, 'M', u'琉'),
    (0xF9CD, 'M', u'留'),
    (0xF9CE, 'M', u'硫'),
    (0xF9CF, 'M', u'紐'),
    (0xF9D0, 'M', u'類'),
    (0xF9D1, 'M', u'六'),
    (0xF9D2, 'M', u'戮'),
    (0xF9D3, 'M', u'陸'),
    (0xF9D4, 'M', u'倫'),
    (0xF9D5, 'M', u'崙'),
    (0xF9D6, 'M', u'淪'),
    (0xF9D7, 'M', u'輪'),
    (0xF9D8, 'M', u'律'),
    (0xF9D9, 'M', u'慄'),
    (0xF9DA, 'M', u'栗'),
    (0xF9DB, 'M', u'率'),
    (0xF9DC, 'M', u'隆'),
    (0xF9DD, 'M', u'利'),
    (0xF9DE, 'M', u'吏'),
    (0xF9DF, 'M', u'履'),
    (0xF9E0, 'M', u'易'),
    (0xF9E1, 'M', u'李'),
    (0xF9E2, 'M', u'梨'),
    (0xF9E3, 'M', u'泥'),
    (0xF9E4, 'M', u'理'),
    (0xF9E5, 'M', u'痢'),
    (0xF9E6, 'M', u'罹'),
    (0xF9E7, 'M', u'裏'),
    (0xF9E8, 'M', u'裡'),
    (0xF9E9, 'M', u'里'),
    (0xF9EA, 'M', u'離'),
    (0xF9EB, 'M', u'匿'),
    (0xF9EC, 'M', u'溺'),
    (0xF9ED, 'M', u'吝'),
    (0xF9EE, 'M', u'燐'),
    (0xF9EF, 'M', u'璘'),
    (0xF9F0, 'M', u'藺'),
    (0xF9F1, 'M', u'隣'),
    (0xF9F2, 'M', u'鱗'),
    (0xF9F3, 'M', u'麟'),
    (0xF9F4, 'M', u'林'),
    (0xF9F5, 'M', u'淋'),
    ]

def _seg_40():
    return [
    (0xF9F6, 'M', u'臨'),
    (0xF9F7, 'M', u'立'),
    (0xF9F8, 'M', u'笠'),
    (0xF9F9, 'M', u'粒'),
    (0xF9FA, 'M', u'狀'),
    (0xF9FB, 'M', u'炙'),
    (0xF9FC, 'M', u'識'),
    (0xF9FD, 'M', u'什'),
    (0xF9FE, 'M', u'茶'),
    (0xF9FF, 'M', u'刺'),
    (0xFA00, 'M', u'切'),
    (0xFA01, 'M', u'度'),
    (0xFA02, 'M', u'拓'),
    (0xFA03, 'M', u'糖'),
    (0xFA04, 'M', u'宅'),
    (0xFA05, 'M', u'洞'),
    (0xFA06, 'M', u'暴'),
    (0xFA07, 'M', u'輻'),
    (0xFA08, 'M', u'行'),
    (0xFA09, 'M', u'降'),
    (0xFA0A, 'M', u'見'),
    (0xFA0B, 'M', u'廓'),
    (0xFA0C, 'M', u'兀'),
    (0xFA0D, 'M', u'嗀'),
    (0xFA0E, 'V'),
    (0xFA10, 'M', u'塚'),
    (0xFA11, 'V'),
    (0xFA12, 'M', u'晴'),
    (0xFA13, 'V'),
    (0xFA15, 'M', u'凞'),
    (0xFA16, 'M', u'猪'),
    (0xFA17, 'M', u'益'),
    (0xFA18, 'M', u'礼'),
    (0xFA19, 'M', u'神'),
    (0xFA1A, 'M', u'祥'),
    (0xFA1B, 'M', u'福'),
    (0xFA1C, 'M', u'靖'),
    (0xFA1D, 'M', u'精'),
    (0xFA1E, 'M', u'羽'),
    (0xFA1F, 'V'),
    (0xFA20, 'M', u'蘒'),
    (0xFA21, 'V'),
    (0xFA22, 'M', u'諸'),
    (0xFA23, 'V'),
    (0xFA25, 'M', u'逸'),
    (0xFA26, 'M', u'都'),
    (0xFA27, 'V'),
    (0xFA2A, 'M', u'飯'),
    (0xFA2B, 'M', u'飼'),
    (0xFA2C, 'M', u'館'),
    (0xFA2D, 'M', u'鶴'),
    (0xFA2E, 'M', u'郞'),
    (0xFA2F, 'M', u'隷'),
    (0xFA30, 'M', u'侮'),
    (0xFA31, 'M', u'僧'),
    (0xFA32, 'M', u'免'),
    (0xFA33, 'M', u'勉'),
    (0xFA34, 'M', u'勤'),
    (0xFA35, 'M', u'卑'),
    (0xFA36, 'M', u'喝'),
    (0xFA37, 'M', u'嘆'),
    (0xFA38, 'M', u'器'),
    (0xFA39, 'M', u'塀'),
    (0xFA3A, 'M', u'墨'),
    (0xFA3B, 'M', u'層'),
    (0xFA3C, 'M', u'屮'),
    (0xFA3D, 'M', u'悔'),
    (0xFA3E, 'M', u'慨'),
    (0xFA3F, 'M', u'憎'),
    (0xFA40, 'M', u'懲'),
    (0xFA41, 'M', u'敏'),
    (0xFA42, 'M', u'既'),
    (0xFA43, 'M', u'暑'),
    (0xFA44, 'M', u'梅'),
    (0xFA45, 'M', u'海'),
    (0xFA46, 'M', u'渚'),
    (0xFA47, 'M', u'漢'),
    (0xFA48, 'M', u'煮'),
    (0xFA49, 'M', u'爫'),
    (0xFA4A, 'M', u'琢'),
    (0xFA4B, 'M', u'碑'),
    (0xFA4C, 'M', u'社'),
    (0xFA4D, 'M', u'祉'),
    (0xFA4E, 'M', u'祈'),
    (0xFA4F, 'M', u'祐'),
    (0xFA50, 'M', u'祖'),
    (0xFA51, 'M', u'祝'),
    (0xFA52, 'M', u'禍'),
    (0xFA53, 'M', u'禎'),
    (0xFA54, 'M', u'穀'),
    (0xFA55, 'M', u'突'),
    (0xFA56, 'M', u'節'),
    (0xFA57, 'M', u'練'),
    (0xFA58, 'M', u'縉'),
    (0xFA59, 'M', u'繁'),
    (0xFA5A, 'M', u'署'),
    (0xFA5B, 'M', u'者'),
    (0xFA5C, 'M', u'臭'),
    (0xFA5D, 'M', u'艹'),
    (0xFA5F, 'M', u'著'),
    ]

def _seg_41():
    return [
    (0xFA60, 'M', u'褐'),
    (0xFA61, 'M', u'視'),
    (0xFA62, 'M', u'謁'),
    (0xFA63, 'M', u'謹'),
    (0xFA64, 'M', u'賓'),
    (0xFA65, 'M', u'贈'),
    (0xFA66, 'M', u'辶'),
    (0xFA67, 'M', u'逸'),
    (0xFA68, 'M', u'難'),
    (0xFA69, 'M', u'響'),
    (0xFA6A, 'M', u'頻'),
    (0xFA6B, 'M', u'恵'),
    (0xFA6C, 'M', u'𤋮'),
    (0xFA6D, 'M', u'舘'),
    (0xFA6E, 'X'),
    (0xFA70, 'M', u'並'),
    (0xFA71, 'M', u'况'),
    (0xFA72, 'M', u'全'),
    (0xFA73, 'M', u'侀'),
    (0xFA74, 'M', u'充'),
    (0xFA75, 'M', u'冀'),
    (0xFA76, 'M', u'勇'),
    (0xFA77, 'M', u'勺'),
    (0xFA78, 'M', u'喝'),
    (0xFA79, 'M', u'啕'),
    (0xFA7A, 'M', u'喙'),
    (0xFA7B, 'M', u'嗢'),
    (0xFA7C, 'M', u'塚'),
    (0xFA7D, 'M', u'墳'),
    (0xFA7E, 'M', u'奄'),
    (0xFA7F, 'M', u'奔'),
    (0xFA80, 'M', u'婢'),
    (0xFA81, 'M', u'嬨'),
    (0xFA82, 'M', u'廒'),
    (0xFA83, 'M', u'廙'),
    (0xFA84, 'M', u'彩'),
    (0xFA85, 'M', u'徭'),
    (0xFA86, 'M', u'惘'),
    (0xFA87, 'M', u'慎'),
    (0xFA88, 'M', u'愈'),
    (0xFA89, 'M', u'憎'),
    (0xFA8A, 'M', u'慠'),
    (0xFA8B, 'M', u'懲'),
    (0xFA8C, 'M', u'戴'),
    (0xFA8D, 'M', u'揄'),
    (0xFA8E, 'M', u'搜'),
    (0xFA8F, 'M', u'摒'),
    (0xFA90, 'M', u'敖'),
    (0xFA91, 'M', u'晴'),
    (0xFA92, 'M', u'朗'),
    (0xFA93, 'M', u'望'),
    (0xFA94, 'M', u'杖'),
    (0xFA95, 'M', u'歹'),
    (0xFA96, 'M', u'殺'),
    (0xFA97, 'M', u'流'),
    (0xFA98, 'M', u'滛'),
    (0xFA99, 'M', u'滋'),
    (0xFA9A, 'M', u'漢'),
    (0xFA9B, 'M', u'瀞'),
    (0xFA9C, 'M', u'煮'),
    (0xFA9D, 'M', u'瞧'),
    (0xFA9E, 'M', u'爵'),
    (0xFA9F, 'M', u'犯'),
    (0xFAA0, 'M', u'猪'),
    (0xFAA1, 'M', u'瑱'),
    (0xFAA2, 'M', u'甆'),
    (0xFAA3, 'M', u'画'),
    (0xFAA4, 'M', u'瘝'),
    (0xFAA5, 'M', u'瘟'),
    (0xFAA6, 'M', u'益'),
    (0xFAA7, 'M', u'盛'),
    (0xFAA8, 'M', u'直'),
    (0xFAA9, 'M', u'睊'),
    (0xFAAA, 'M', u'着'),
    (0xFAAB, 'M', u'磌'),
    (0xFAAC, 'M', u'窱'),
    (0xFAAD, 'M', u'節'),
    (0xFAAE, 'M', u'类'),
    (0xFAAF, 'M', u'絛'),
    (0xFAB0, 'M', u'練'),
    (0xFAB1, 'M', u'缾'),
    (0xFAB2, 'M', u'者'),
    (0xFAB3, 'M', u'荒'),
    (0xFAB4, 'M', u'華'),
    (0xFAB5, 'M', u'蝹'),
    (0xFAB6, 'M', u'襁'),
    (0xFAB7, 'M', u'覆'),
    (0xFAB8, 'M', u'視'),
    (0xFAB9, 'M', u'調'),
    (0xFABA, 'M', u'諸'),
    (0xFABB, 'M', u'請'),
    (0xFABC, 'M', u'謁'),
    (0xFABD, 'M', u'諾'),
    (0xFABE, 'M', u'諭'),
    (0xFABF, 'M', u'謹'),
    (0xFAC0, 'M', u'變'),
    (0xFAC1, 'M', u'贈'),
    (0xFAC2, 'M', u'輸'),
    (0xFAC3, 'M', u'遲'),
    (0xFAC4, 'M', u'醙'),
    ]

def _seg_42():
    return [
    (0xFAC5, 'M', u'鉶'),
    (0xFAC6, 'M', u'陼'),
    (0xFAC7, 'M', u'難'),
    (0xFAC8, 'M', u'靖'),
    (0xFAC9, 'M', u'韛'),
    (0xFACA, 'M', u'響'),
    (0xFACB, 'M', u'頋'),
    (0xFACC, 'M', u'頻'),
    (0xFACD, 'M', u'鬒'),
    (0xFACE, 'M', u'龜'),
    (0xFACF, 'M', u'𢡊'),
    (0xFAD0, 'M', u'𢡄'),
    (0xFAD1, 'M', u'𣏕'),
    (0xFAD2, 'M', u'㮝'),
    (0xFAD3, 'M', u'䀘'),
    (0xFAD4, 'M', u'䀹'),
    (0xFAD5, 'M', u'𥉉'),
    (0xFAD6, 'M', u'𥳐'),
    (0xFAD7, 'M', u'𧻓'),
    (0xFAD8, 'M', u'齃'),
    (0xFAD9, 'M', u'龎'),
    (0xFADA, 'X'),
    (0xFB00, 'M', u'ff'),
    (0xFB01, 'M', u'fi'),
    (0xFB02, 'M', u'fl'),
    (0xFB03, 'M', u'ffi'),
    (0xFB04, 'M', u'ffl'),
    (0xFB05, 'M', u'st'),
    (0xFB07, 'X'),
    (0xFB13, 'M', u'մն'),
    (0xFB14, 'M', u'մե'),
    (0xFB15, 'M', u'մի'),
    (0xFB16, 'M', u'վն'),
    (0xFB17, 'M', u'մխ'),
    (0xFB18, 'X'),
    (0xFB1D, 'M', u'יִ'),
    (0xFB1E, 'V'),
    (0xFB1F, 'M', u'ײַ'),
    (0xFB20, 'M', u'ע'),
    (0xFB21, 'M', u'א'),
    (0xFB22, 'M', u'ד'),
    (0xFB23, 'M', u'ה'),
    (0xFB24, 'M', u'כ'),
    (0xFB25, 'M', u'ל'),
    (0xFB26, 'M', u'ם'),
    (0xFB27, 'M', u'ר'),
    (0xFB28, 'M', u'ת'),
    (0xFB29, '3', u'+'),
    (0xFB2A, 'M', u'שׁ'),
    (0xFB2B, 'M', u'שׂ'),
    (0xFB2C, 'M', u'שּׁ'),
    (0xFB2D, 'M', u'שּׂ'),
    (0xFB2E, 'M', u'אַ'),
    (0xFB2F, 'M', u'אָ'),
    (0xFB30, 'M', u'אּ'),
    (0xFB31, 'M', u'בּ'),
    (0xFB32, 'M', u'גּ'),
    (0xFB33, 'M', u'דּ'),
    (0xFB34, 'M', u'הּ'),
    (0xFB35, 'M', u'וּ'),
    (0xFB36, 'M', u'זּ'),
    (0xFB37, 'X'),
    (0xFB38, 'M', u'טּ'),
    (0xFB39, 'M', u'יּ'),
    (0xFB3A, 'M', u'ךּ'),
    (0xFB3B, 'M', u'כּ'),
    (0xFB3C, 'M', u'לּ'),
    (0xFB3D, 'X'),
    (0xFB3E, 'M', u'מּ'),
    (0xFB3F, 'X'),
    (0xFB40, 'M', u'נּ'),
    (0xFB41, 'M', u'סּ'),
    (0xFB42, 'X'),
    (0xFB43, 'M', u'ףּ'),
    (0xFB44, 'M', u'פּ'),
    (0xFB45, 'X'),
    (0xFB46, 'M', u'צּ'),
    (0xFB47, 'M', u'קּ'),
    (0xFB48, 'M', u'רּ'),
    (0xFB49, 'M', u'שּ'),
    (0xFB4A, 'M', u'תּ'),
    (0xFB4B, 'M', u'וֹ'),
    (0xFB4C, 'M', u'בֿ'),
    (0xFB4D, 'M', u'כֿ'),
    (0xFB4E, 'M', u'פֿ'),
    (0xFB4F, 'M', u'אל'),
    (0xFB50, 'M', u'ٱ'),
    (0xFB52, 'M', u'ٻ'),
    (0xFB56, 'M', u'پ'),
    (0xFB5A, 'M', u'ڀ'),
    (0xFB5E, 'M', u'ٺ'),
    (0xFB62, 'M', u'ٿ'),
    (0xFB66, 'M', u'ٹ'),
    (0xFB6A, 'M', u'ڤ'),
    (0xFB6E, 'M', u'ڦ'),
    (0xFB72, 'M', u'ڄ'),
    (0xFB76, 'M', u'ڃ'),
    (0xFB7A, 'M', u'چ'),
    (0xFB7E, 'M', u'ڇ'),
    (0xFB82, 'M', u'ڍ'),
    ]

def _seg_43():
    return [
    (0xFB84, 'M', u'ڌ'),
    (0xFB86, 'M', u'ڎ'),
    (0xFB88, 'M', u'ڈ'),
    (0xFB8A, 'M', u'ژ'),
    (0xFB8C, 'M', u'ڑ'),
    (0xFB8E, 'M', u'ک'),
    (0xFB92, 'M', u'گ'),
    (0xFB96, 'M', u'ڳ'),
    (0xFB9A, 'M', u'ڱ'),
    (0xFB9E, 'M', u'ں'),
    (0xFBA0, 'M', u'ڻ'),
    (0xFBA4, 'M', u'ۀ'),
    (0xFBA6, 'M', u'ہ'),
    (0xFBAA, 'M', u'ھ'),
    (0xFBAE, 'M', u'ے'),
    (0xFBB0, 'M', u'ۓ'),
    (0xFBB2, 'V'),
    (0xFBC2, 'X'),
    (0xFBD3, 'M', u'ڭ'),
    (0xFBD7, 'M', u'ۇ'),
    (0xFBD9, 'M', u'ۆ'),
    (0xFBDB, 'M', u'ۈ'),
    (0xFBDD, 'M', u'ۇٴ'),
    (0xFBDE, 'M', u'ۋ'),
    (0xFBE0, 'M', u'ۅ'),
    (0xFBE2, 'M', u'ۉ'),
    (0xFBE4, 'M', u'ې'),
    (0xFBE8, 'M', u'ى'),
    (0xFBEA, 'M', u'ئا'),
    (0xFBEC, 'M', u'ئە'),
    (0xFBEE, 'M', u'ئو'),
    (0xFBF0, 'M', u'ئۇ'),
    (0xFBF2, 'M', u'ئۆ'),
    (0xFBF4, 'M', u'ئۈ'),
    (0xFBF6, 'M', u'ئې'),
    (0xFBF9, 'M', u'ئى'),
    (0xFBFC, 'M', u'ی'),
    (0xFC00, 'M', u'ئج'),
    (0xFC01, 'M', u'ئح'),
    (0xFC02, 'M', u'ئم'),
    (0xFC03, 'M', u'ئى'),
    (0xFC04, 'M', u'ئي'),
    (0xFC05, 'M', u'بج'),
    (0xFC06, 'M', u'بح'),
    (0xFC07, 'M', u'بخ'),
    (0xFC08, 'M', u'بم'),
    (0xFC09, 'M', u'بى'),
    (0xFC0A, 'M', u'بي'),
    (0xFC0B, 'M', u'تج'),
    (0xFC0C, 'M', u'تح'),
    (0xFC0D, 'M', u'تخ'),
    (0xFC0E, 'M', u'تم'),
    (0xFC0F, 'M', u'تى'),
    (0xFC10, 'M', u'تي'),
    (0xFC11, 'M', u'ثج'),
    (0xFC12, 'M', u'ثم'),
    (0xFC13, 'M', u'ثى'),
    (0xFC14, 'M', u'ثي'),
    (0xFC15, 'M', u'جح'),
    (0xFC16, 'M', u'جم'),
    (0xFC17, 'M', u'حج'),
    (0xFC18, 'M', u'حم'),
    (0xFC19, 'M', u'خج'),
    (0xFC1A, 'M', u'خح'),
    (0xFC1B, 'M', u'خم'),
    (0xFC1C, 'M', u'سج'),
    (0xFC1D, 'M', u'سح'),
    (0xFC1E, 'M', u'سخ'),
    (0xFC1F, 'M', u'سم'),
    (0xFC20, 'M', u'صح'),
    (0xFC21, 'M', u'صم'),
    (0xFC22, 'M', u'ضج'),
    (0xFC23, 'M', u'ضح'),
    (0xFC24, 'M', u'ضخ'),
    (0xFC25, 'M', u'ضم'),
    (0xFC26, 'M', u'طح'),
    (0xFC27, 'M', u'طم'),
    (0xFC28, 'M', u'ظم'),
    (0xFC29, 'M', u'عج'),
    (0xFC2A, 'M', u'عم'),
    (0xFC2B, 'M', u'غج'),
    (0xFC2C, 'M', u'غم'),
    (0xFC2D, 'M', u'فج'),
    (0xFC2E, 'M', u'فح'),
    (0xFC2F, 'M', u'فخ'),
    (0xFC30, 'M', u'فم'),
    (0xFC31, 'M', u'فى'),
    (0xFC32, 'M', u'في'),
    (0xFC33, 'M', u'قح'),
    (0xFC34, 'M', u'قم'),
    (0xFC35, 'M', u'قى'),
    (0xFC36, 'M', u'قي'),
    (0xFC37, 'M', u'كا'),
    (0xFC38, 'M', u'كج'),
    (0xFC39, 'M', u'كح'),
    (0xFC3A, 'M', u'كخ'),
    (0xFC3B, 'M', u'كل'),
    (0xFC3C, 'M', u'كم'),
    (0xFC3D, 'M', u'كى'),
    (0xFC3E, 'M', u'كي'),
    ]

def _seg_44():
    return [
    (0xFC3F, 'M', u'لج'),
    (0xFC40, 'M', u'لح'),
    (0xFC41, 'M', u'لخ'),
    (0xFC42, 'M', u'لم'),
    (0xFC43, 'M', u'لى'),
    (0xFC44, 'M', u'لي'),
    (0xFC45, 'M', u'مج'),
    (0xFC46, 'M', u'مح'),
    (0xFC47, 'M', u'مخ'),
    (0xFC48, 'M', u'مم'),
    (0xFC49, 'M', u'مى'),
    (0xFC4A, 'M', u'مي'),
    (0xFC4B, 'M', u'نج'),
    (0xFC4C, 'M', u'نح'),
    (0xFC4D, 'M', u'نخ'),
    (0xFC4E, 'M', u'نم'),
    (0xFC4F, 'M', u'نى'),
    (0xFC50, 'M', u'ني'),
    (0xFC51, 'M', u'هج'),
    (0xFC52, 'M', u'هم'),
    (0xFC53, 'M', u'هى'),
    (0xFC54, 'M', u'هي'),
    (0xFC55, 'M', u'يج'),
    (0xFC56, 'M', u'يح'),
    (0xFC57, 'M', u'يخ'),
    (0xFC58, 'M', u'يم'),
    (0xFC59, 'M', u'يى'),
    (0xFC5A, 'M', u'يي'),
    (0xFC5B, 'M', u'ذٰ'),
    (0xFC5C, 'M', u'رٰ'),
    (0xFC5D, 'M', u'ىٰ'),
    (0xFC5E, '3', u' ٌّ'),
    (0xFC5F, '3', u' ٍّ'),
    (0xFC60, '3', u' َّ'),
    (0xFC61, '3', u' ُّ'),
    (0xFC62, '3', u' ِّ'),
    (0xFC63, '3', u' ّٰ'),
    (0xFC64, 'M', u'ئر'),
    (0xFC65, 'M', u'ئز'),
    (0xFC66, 'M', u'ئم'),
    (0xFC67, 'M', u'ئن'),
    (0xFC68, 'M', u'ئى'),
    (0xFC69, 'M', u'ئي'),
    (0xFC6A, 'M', u'بر'),
    (0xFC6B, 'M', u'بز'),
    (0xFC6C, 'M', u'بم'),
    (0xFC6D, 'M', u'بن'),
    (0xFC6E, 'M', u'بى'),
    (0xFC6F, 'M', u'بي'),
    (0xFC70, 'M', u'تر'),
    (0xFC71, 'M', u'تز'),
    (0xFC72, 'M', u'تم'),
    (0xFC73, 'M', u'تن'),
    (0xFC74, 'M', u'تى'),
    (0xFC75, 'M', u'تي'),
    (0xFC76, 'M', u'ثر'),
    (0xFC77, 'M', u'ثز'),
    (0xFC78, 'M', u'ثم'),
    (0xFC79, 'M', u'ثن'),
    (0xFC7A, 'M', u'ثى'),
    (0xFC7B, 'M', u'ثي'),
    (0xFC7C, 'M', u'فى'),
    (0xFC7D, 'M', u'في'),
    (0xFC7E, 'M', u'قى'),
    (0xFC7F, 'M', u'قي'),
    (0xFC80, 'M', u'كا'),
    (0xFC81, 'M', u'كل'),
    (0xFC82, 'M', u'كم'),
    (0xFC83, 'M', u'كى'),
    (0xFC84, 'M', u'كي'),
    (0xFC85, 'M', u'لم'),
    (0xFC86, 'M', u'لى'),
    (0xFC87, 'M', u'لي'),
    (0xFC88, 'M', u'ما'),
    (0xFC89, 'M', u'مم'),
    (0xFC8A, 'M', u'نر'),
    (0xFC8B, 'M', u'نز'),
    (0xFC8C, 'M', u'نم'),
    (0xFC8D, 'M', u'نن'),
    (0xFC8E, 'M', u'نى'),
    (0xFC8F, 'M', u'ني'),
    (0xFC90, 'M', u'ىٰ'),
    (0xFC91, 'M', u'ير'),
    (0xFC92, 'M', u'يز'),
    (0xFC93, 'M', u'يم'),
    (0xFC94, 'M', u'ين'),
    (0xFC95, 'M', u'يى'),
    (0xFC96, 'M', u'يي'),
    (0xFC97, 'M', u'ئج'),
    (0xFC98, 'M', u'ئح'),
    (0xFC99, 'M', u'ئخ'),
    (0xFC9A, 'M', u'ئم'),
    (0xFC9B, 'M', u'ئه'),
    (0xFC9C, 'M', u'بج'),
    (0xFC9D, 'M', u'بح'),
    (0xFC9E, 'M', u'بخ'),
    (0xFC9F, 'M', u'بم'),
    (0xFCA0, 'M', u'به'),
    (0xFCA1, 'M', u'تج'),
    (0xFCA2, 'M', u'تح'),
    ]

def _seg_45():
    return [
    (0xFCA3, 'M', u'تخ'),
    (0xFCA4, 'M', u'تم'),
    (0xFCA5, 'M', u'ته'),
    (0xFCA6, 'M', u'ثم'),
    (0xFCA7, 'M', u'جح'),
    (0xFCA8, 'M', u'جم'),
    (0xFCA9, 'M', u'حج'),
    (0xFCAA, 'M', u'حم'),
    (0xFCAB, 'M', u'خج'),
    (0xFCAC, 'M', u'خم'),
    (0xFCAD, 'M', u'سج'),
    (0xFCAE, 'M', u'سح'),
    (0xFCAF, 'M', u'سخ'),
    (0xFCB0, 'M', u'سم'),
    (0xFCB1, 'M', u'صح'),
    (0xFCB2, 'M', u'صخ'),
    (0xFCB3, 'M', u'صم'),
    (0xFCB4, 'M', u'ضج'),
    (0xFCB5, 'M', u'ضح'),
    (0xFCB6, 'M', u'ضخ'),
    (0xFCB7, 'M', u'ضم'),
    (0xFCB8, 'M', u'طح'),
    (0xFCB9, 'M', u'ظم'),
    (0xFCBA, 'M', u'عج'),
    (0xFCBB, 'M', u'عم'),
    (0xFCBC, 'M', u'غج'),
    (0xFCBD, 'M', u'غم'),
    (0xFCBE, 'M', u'فج'),
    (0xFCBF, 'M', u'فح'),
    (0xFCC0, 'M', u'فخ'),
    (0xFCC1, 'M', u'فم'),
    (0xFCC2, 'M', u'قح'),
    (0xFCC3, 'M', u'قم'),
    (0xFCC4, 'M', u'كج'),
    (0xFCC5, 'M', u'كح'),
    (0xFCC6, 'M', u'كخ'),
    (0xFCC7, 'M', u'كل'),
    (0xFCC8, 'M', u'كم'),
    (0xFCC9, 'M', u'لج'),
    (0xFCCA, 'M', u'لح'),
    (0xFCCB, 'M', u'لخ'),
    (0xFCCC, 'M', u'لم'),
    (0xFCCD, 'M', u'له'),
    (0xFCCE, 'M', u'مج'),
    (0xFCCF, 'M', u'مح'),
    (0xFCD0, 'M', u'مخ'),
    (0xFCD1, 'M', u'مم'),
    (0xFCD2, 'M', u'نج'),
    (0xFCD3, 'M', u'نح'),
    (0xFCD4, 'M', u'نخ'),
    (0xFCD5, 'M', u'نم'),
    (0xFCD6, 'M', u'نه'),
    (0xFCD7, 'M', u'هج'),
    (0xFCD8, 'M', u'هم'),
    (0xFCD9, 'M', u'هٰ'),
    (0xFCDA, 'M', u'يج'),
    (0xFCDB, 'M', u'يح'),
    (0xFCDC, 'M', u'يخ'),
    (0xFCDD, 'M', u'يم'),
    (0xFCDE, 'M', u'يه'),
    (0xFCDF, 'M', u'ئم'),
    (0xFCE0, 'M', u'ئه'),
    (0xFCE1, 'M', u'بم'),
    (0xFCE2, 'M', u'به'),
    (0xFCE3, 'M', u'تم'),
    (0xFCE4, 'M', u'ته'),
    (0xFCE5, 'M', u'ثم'),
    (0xFCE6, 'M', u'ثه'),
    (0xFCE7, 'M', u'سم'),
    (0xFCE8, 'M', u'سه'),
    (0xFCE9, 'M', u'شم'),
    (0xFCEA, 'M', u'شه'),
    (0xFCEB, 'M', u'كل'),
    (0xFCEC, 'M', u'كم'),
    (0xFCED, 'M', u'لم'),
    (0xFCEE, 'M', u'نم'),
    (0xFCEF, 'M', u'نه'),
    (0xFCF0, 'M', u'يم'),
    (0xFCF1, 'M', u'يه'),
    (0xFCF2, 'M', u'ـَّ'),
    (0xFCF3, 'M', u'ـُّ'),
    (0xFCF4, 'M', u'ـِّ'),
    (0xFCF5, 'M', u'طى'),
    (0xFCF6, 'M', u'طي'),
    (0xFCF7, 'M', u'عى'),
    (0xFCF8, 'M', u'عي'),
    (0xFCF9, 'M', u'غى'),
    (0xFCFA, 'M', u'غي'),
    (0xFCFB, 'M', u'سى'),
    (0xFCFC, 'M', u'سي'),
    (0xFCFD, 'M', u'شى'),
    (0xFCFE, 'M', u'شي'),
    (0xFCFF, 'M', u'حى'),
    (0xFD00, 'M', u'حي'),
    (0xFD01, 'M', u'جى'),
    (0xFD02, 'M', u'جي'),
    (0xFD03, 'M', u'خى'),
    (0xFD04, 'M', u'خي'),
    (0xFD05, 'M', u'صى'),
    (0xFD06, 'M', u'صي'),
    ]

def _seg_46():
    return [
    (0xFD07, 'M', u'ضى'),
    (0xFD08, 'M', u'ضي'),
    (0xFD09, 'M', u'شج'),
    (0xFD0A, 'M', u'شح'),
    (0xFD0B, 'M', u'شخ'),
    (0xFD0C, 'M', u'شم'),
    (0xFD0D, 'M', u'شر'),
    (0xFD0E, 'M', u'سر'),
    (0xFD0F, 'M', u'صر'),
    (0xFD10, 'M', u'ضر'),
    (0xFD11, 'M', u'طى'),
    (0xFD12, 'M', u'طي'),
    (0xFD13, 'M', u'عى'),
    (0xFD14, 'M', u'عي'),
    (0xFD15, 'M', u'غى'),
    (0xFD16, 'M', u'غي'),
    (0xFD17, 'M', u'سى'),
    (0xFD18, 'M', u'سي'),
    (0xFD19, 'M', u'شى'),
    (0xFD1A, 'M', u'شي'),
    (0xFD1B, 'M', u'حى'),
    (0xFD1C, 'M', u'حي'),
    (0xFD1D, 'M', u'جى'),
    (0xFD1E, 'M', u'جي'),
    (0xFD1F, 'M', u'خى'),
    (0xFD20, 'M', u'خي'),
    (0xFD21, 'M', u'صى'),
    (0xFD22, 'M', u'صي'),
    (0xFD23, 'M', u'ضى'),
    (0xFD24, 'M', u'ضي'),
    (0xFD25, 'M', u'شج'),
    (0xFD26, 'M', u'شح'),
    (0xFD27, 'M', u'شخ'),
    (0xFD28, 'M', u'شم'),
    (0xFD29, 'M', u'شر'),
    (0xFD2A, 'M', u'سر'),
    (0xFD2B, 'M', u'صر'),
    (0xFD2C, 'M', u'ضر'),
    (0xFD2D, 'M', u'شج'),
    (0xFD2E, 'M', u'شح'),
    (0xFD2F, 'M', u'شخ'),
    (0xFD30, 'M', u'شم'),
    (0xFD31, 'M', u'سه'),
    (0xFD32, 'M', u'شه'),
    (0xFD33, 'M', u'طم'),
    (0xFD34, 'M', u'سج'),
    (0xFD35, 'M', u'سح'),
    (0xFD36, 'M', u'سخ'),
    (0xFD37, 'M', u'شج'),
    (0xFD38, 'M', u'شح'),
    (0xFD39, 'M', u'شخ'),
    (0xFD3A, 'M', u'طم'),
    (0xFD3B, 'M', u'ظم'),
    (0xFD3C, 'M', u'اً'),
    (0xFD3E, 'V'),
    (0xFD40, 'X'),
    (0xFD50, 'M', u'تجم'),
    (0xFD51, 'M', u'تحج'),
    (0xFD53, 'M', u'تحم'),
    (0xFD54, 'M', u'تخم'),
    (0xFD55, 'M', u'تمج'),
    (0xFD56, 'M', u'تمح'),
    (0xFD57, 'M', u'تمخ'),
    (0xFD58, 'M', u'جمح'),
    (0xFD5A, 'M', u'حمي'),
    (0xFD5B, 'M', u'حمى'),
    (0xFD5C, 'M', u'سحج'),
    (0xFD5D, 'M', u'سجح'),
    (0xFD5E, 'M', u'سجى'),
    (0xFD5F, 'M', u'سمح'),
    (0xFD61, 'M', u'سمج'),
    (0xFD62, 'M', u'سمم'),
    (0xFD64, 'M', u'صحح'),
    (0xFD66, 'M', u'صمم'),
    (0xFD67, 'M', u'شحم'),
    (0xFD69, 'M', u'شجي'),
    (0xFD6A, 'M', u'شمخ'),
    (0xFD6C, 'M', u'شمم'),
    (0xFD6E, 'M', u'ضحى'),
    (0xFD6F, 'M', u'ضخم'),
    (0xFD71, 'M', u'طمح'),
    (0xFD73, 'M', u'طمم'),
    (0xFD74, 'M', u'طمي'),
    (0xFD75, 'M', u'عجم'),
    (0xFD76, 'M', u'عمم'),
    (0xFD78, 'M', u'عمى'),
    (0xFD79, 'M', u'غمم'),
    (0xFD7A, 'M', u'غمي'),
    (0xFD7B, 'M', u'غمى'),
    (0xFD7C, 'M', u'فخم'),
    (0xFD7E, 'M', u'قمح'),
    (0xFD7F, 'M', u'قمم'),
    (0xFD80, 'M', u'لحم'),
    (0xFD81, 'M', u'لحي'),
    (0xFD82, 'M', u'لحى'),
    (0xFD83, 'M', u'لجج'),
    (0xFD85, 'M', u'لخم'),
    (0xFD87, 'M', u'لمح'),
    (0xFD89, 'M', u'محج'),
    (0xFD8A, 'M', u'محم'),
    ]

def _seg_47():
    return [
    (0xFD8B, 'M', u'محي'),
    (0xFD8C, 'M', u'مجح'),
    (0xFD8D, 'M', u'مجم'),
    (0xFD8E, 'M', u'مخج'),
    (0xFD8F, 'M', u'مخم'),
    (0xFD90, 'X'),
    (0xFD92, 'M', u'مجخ'),
    (0xFD93, 'M', u'همج'),
    (0xFD94, 'M', u'همم'),
    (0xFD95, 'M', u'نحم'),
    (0xFD96, 'M', u'نحى'),
    (0xFD97, 'M', u'نجم'),
    (0xFD99, 'M', u'نجى'),
    (0xFD9A, 'M', u'نمي'),
    (0xFD9B, 'M', u'نمى'),
    (0xFD9C, 'M', u'يمم'),
    (0xFD9E, 'M', u'بخي'),
    (0xFD9F, 'M', u'تجي'),
    (0xFDA0, 'M', u'تجى'),
    (0xFDA1, 'M', u'تخي'),
    (0xFDA2, 'M', u'تخى'),
    (0xFDA3, 'M', u'تمي'),
    (0xFDA4, 'M', u'تمى'),
    (0xFDA5, 'M', u'جمي'),
    (0xFDA6, 'M', u'جحى'),
    (0xFDA7, 'M', u'جمى'),
    (0xFDA8, 'M', u'سخى'),
    (0xFDA9, 'M', u'صحي'),
    (0xFDAA, 'M', u'شحي'),
    (0xFDAB, 'M', u'ضحي'),
    (0xFDAC, 'M', u'لجي'),
    (0xFDAD, 'M', u'لمي'),
    (0xFDAE, 'M', u'يحي'),
    (0xFDAF, 'M', u'يجي'),
    (0xFDB0, 'M', u'يمي'),
    (0xFDB1, 'M', u'ممي'),
    (0xFDB2, 'M', u'قمي'),
    (0xFDB3, 'M', u'نحي'),
    (0xFDB4, 'M', u'قمح'),
    (0xFDB5, 'M', u'لحم'),
    (0xFDB6, 'M', u'عمي'),
    (0xFDB7, 'M', u'كمي'),
    (0xFDB8, 'M', u'نجح'),
    (0xFDB9, 'M', u'مخي'),
    (0xFDBA, 'M', u'لجم'),
    (0xFDBB, 'M', u'كمم'),
    (0xFDBC, 'M', u'لجم'),
    (0xFDBD, 'M', u'نجح'),
    (0xFDBE, 'M', u'جحي'),
    (0xFDBF, 'M', u'حجي'),
    (0xFDC0, 'M', u'مجي'),
    (0xFDC1, 'M', u'فمي'),
    (0xFDC2, 'M', u'بحي'),
    (0xFDC3, 'M', u'كمم'),
    (0xFDC4, 'M', u'عجم'),
    (0xFDC5, 'M', u'صمم'),
    (0xFDC6, 'M', u'سخي'),
    (0xFDC7, 'M', u'نجي'),
    (0xFDC8, 'X'),
    (0xFDF0, 'M', u'صلے'),
    (0xFDF1, 'M', u'قلے'),
    (0xFDF2, 'M', u'الله'),
    (0xFDF3, 'M', u'اكبر'),
    (0xFDF4, 'M', u'محمد'),
    (0xFDF5, 'M', u'صلعم'),
    (0xFDF6, 'M', u'رسول'),
    (0xFDF7, 'M', u'عليه'),
    (0xFDF8, 'M', u'وسلم'),
    (0xFDF9, 'M', u'صلى'),
    (0xFDFA, '3', u'صلى الله عليه وسلم'),
    (0xFDFB, '3', u'جل جلاله'),
    (0xFDFC, 'M', u'ریال'),
    (0xFDFD, 'V'),
    (0xFDFE, 'X'),
    (0xFE00, 'I'),
    (0xFE10, '3', u','),
    (0xFE11, 'M', u'、'),
    (0xFE12, 'X'),
    (0xFE13, '3', u':'),
    (0xFE14, '3', u';'),
    (0xFE15, '3', u'!'),
    (0xFE16, '3', u'?'),
    (0xFE17, 'M', u'〖'),
    (0xFE18, 'M', u'〗'),
    (0xFE19, 'X'),
    (0xFE20, 'V'),
    (0xFE27, 'X'),
    (0xFE31, 'M', u'—'),
    (0xFE32, 'M', u'–'),
    (0xFE33, '3', u'_'),
    (0xFE35, '3', u'('),
    (0xFE36, '3', u')'),
    (0xFE37, '3', u'{'),
    (0xFE38, '3', u'}'),
    (0xFE39, 'M', u'〔'),
    (0xFE3A, 'M', u'〕'),
    (0xFE3B, 'M', u'【'),
    (0xFE3C, 'M', u'】'),
    (0xFE3D, 'M', u'《'),
    (0xFE3E, 'M', u'》'),
    ]

def _seg_48():
    return [
    (0xFE3F, 'M', u'〈'),
    (0xFE40, 'M', u'〉'),
    (0xFE41, 'M', u'「'),
    (0xFE42, 'M', u'」'),
    (0xFE43, 'M', u'『'),
    (0xFE44, 'M', u'』'),
    (0xFE45, 'V'),
    (0xFE47, '3', u'['),
    (0xFE48, '3', u']'),
    (0xFE49, '3', u' ̅'),
    (0xFE4D, '3', u'_'),
    (0xFE50, '3', u','),
    (0xFE51, 'M', u'、'),
    (0xFE52, 'X'),
    (0xFE54, '3', u';'),
    (0xFE55, '3', u':'),
    (0xFE56, '3', u'?'),
    (0xFE57, '3', u'!'),
    (0xFE58, 'M', u'—'),
    (0xFE59, '3', u'('),
    (0xFE5A, '3', u')'),
    (0xFE5B, '3', u'{'),
    (0xFE5C, '3', u'}'),
    (0xFE5D, 'M', u'〔'),
    (0xFE5E, 'M', u'〕'),
    (0xFE5F, '3', u'#'),
    (0xFE60, '3', u'&'),
    (0xFE61, '3', u'*'),
    (0xFE62, '3', u'+'),
    (0xFE63, 'M', u'-'),
    (0xFE64, '3', u'<'),
    (0xFE65, '3', u'>'),
    (0xFE66, '3', u'='),
    (0xFE67, 'X'),
    (0xFE68, '3', u'\\'),
    (0xFE69, '3', u'$'),
    (0xFE6A, '3', u'%'),
    (0xFE6B, '3', u'@'),
    (0xFE6C, 'X'),
    (0xFE70, '3', u' ً'),
    (0xFE71, 'M', u'ـً'),
    (0xFE72, '3', u' ٌ'),
    (0xFE73, 'V'),
    (0xFE74, '3', u' ٍ'),
    (0xFE75, 'X'),
    (0xFE76, '3', u' َ'),
    (0xFE77, 'M', u'ـَ'),
    (0xFE78, '3', u' ُ'),
    (0xFE79, 'M', u'ـُ'),
    (0xFE7A, '3', u' ِ'),
    (0xFE7B, 'M', u'ـِ'),
    (0xFE7C, '3', u' ّ'),
    (0xFE7D, 'M', u'ـّ'),
    (0xFE7E, '3', u' ْ'),
    (0xFE7F, 'M', u'ـْ'),
    (0xFE80, 'M', u'ء'),
    (0xFE81, 'M', u'آ'),
    (0xFE83, 'M', u'أ'),
    (0xFE85, 'M', u'ؤ'),
    (0xFE87, 'M', u'إ'),
    (0xFE89, 'M', u'ئ'),
    (0xFE8D, 'M', u'ا'),
    (0xFE8F, 'M', u'ب'),
    (0xFE93, 'M', u'ة'),
    (0xFE95, 'M', u'ت'),
    (0xFE99, 'M', u'ث'),
    (0xFE9D, 'M', u'ج'),
    (0xFEA1, 'M', u'ح'),
    (0xFEA5, 'M', u'خ'),
    (0xFEA9, 'M', u'د'),
    (0xFEAB, 'M', u'ذ'),
    (0xFEAD, 'M', u'ر'),
    (0xFEAF, 'M', u'ز'),
    (0xFEB1, 'M', u'س'),
    (0xFEB5, 'M', u'ش'),
    (0xFEB9, 'M', u'ص'),
    (0xFEBD, 'M', u'ض'),
    (0xFEC1, 'M', u'ط'),
    (0xFEC5, 'M', u'ظ'),
    (0xFEC9, 'M', u'ع'),
    (0xFECD, 'M', u'غ'),
    (0xFED1, 'M', u'ف'),
    (0xFED5, 'M', u'ق'),
    (0xFED9, 'M', u'ك'),
    (0xFEDD, 'M', u'ل'),
    (0xFEE1, 'M', u'م'),
    (0xFEE5, 'M', u'ن'),
    (0xFEE9, 'M', u'ه'),
    (0xFEED, 'M', u'و'),
    (0xFEEF, 'M', u'ى'),
    (0xFEF1, 'M', u'ي'),
    (0xFEF5, 'M', u'لآ'),
    (0xFEF7, 'M', u'لأ'),
    (0xFEF9, 'M', u'لإ'),
    (0xFEFB, 'M', u'لا'),
    (0xFEFD, 'X'),
    (0xFEFF, 'I'),
    (0xFF00, 'X'),
    (0xFF01, '3', u'!'),
    (0xFF02, '3', u'"'),
    ]

def _seg_49():
    return [
    (0xFF03, '3', u'#'),
    (0xFF04, '3', u'$'),
    (0xFF05, '3', u'%'),
    (0xFF06, '3', u'&'),
    (0xFF07, '3', u'\''),
    (0xFF08, '3', u'('),
    (0xFF09, '3', u')'),
    (0xFF0A, '3', u'*'),
    (0xFF0B, '3', u'+'),
    (0xFF0C, '3', u','),
    (0xFF0D, 'M', u'-'),
    (0xFF0E, 'M', u'.'),
    (0xFF0F, '3', u'/'),
    (0xFF10, 'M', u'0'),
    (0xFF11, 'M', u'1'),
    (0xFF12, 'M', u'2'),
    (0xFF13, 'M', u'3'),
    (0xFF14, 'M', u'4'),
    (0xFF15, 'M', u'5'),
    (0xFF16, 'M', u'6'),
    (0xFF17, 'M', u'7'),
    (0xFF18, 'M', u'8'),
    (0xFF19, 'M', u'9'),
    (0xFF1A, '3', u':'),
    (0xFF1B, '3', u';'),
    (0xFF1C, '3', u'<'),
    (0xFF1D, '3', u'='),
    (0xFF1E, '3', u'>'),
    (0xFF1F, '3', u'?'),
    (0xFF20, '3', u'@'),
    (0xFF21, 'M', u'a'),
    (0xFF22, 'M', u'b'),
    (0xFF23, 'M', u'c'),
    (0xFF24, 'M', u'd'),
    (0xFF25, 'M', u'e'),
    (0xFF26, 'M', u'f'),
    (0xFF27, 'M', u'g'),
    (0xFF28, 'M', u'h'),
    (0xFF29, 'M', u'i'),
    (0xFF2A, 'M', u'j'),
    (0xFF2B, 'M', u'k'),
    (0xFF2C, 'M', u'l'),
    (0xFF2D, 'M', u'm'),
    (0xFF2E, 'M', u'n'),
    (0xFF2F, 'M', u'o'),
    (0xFF30, 'M', u'p'),
    (0xFF31, 'M', u'q'),
    (0xFF32, 'M', u'r'),
    (0xFF33, 'M', u's'),
    (0xFF34, 'M', u't'),
    (0xFF35, 'M', u'u'),
    (0xFF36, 'M', u'v'),
    (0xFF37, 'M', u'w'),
    (0xFF38, 'M', u'x'),
    (0xFF39, 'M', u'y'),
    (0xFF3A, 'M', u'z'),
    (0xFF3B, '3', u'['),
    (0xFF3C, '3', u'\\'),
    (0xFF3D, '3', u']'),
    (0xFF3E, '3', u'^'),
    (0xFF3F, '3', u'_'),
    (0xFF40, '3', u'`'),
    (0xFF41, 'M', u'a'),
    (0xFF42, 'M', u'b'),
    (0xFF43, 'M', u'c'),
    (0xFF44, 'M', u'd'),
    (0xFF45, 'M', u'e'),
    (0xFF46, 'M', u'f'),
    (0xFF47, 'M', u'g'),
    (0xFF48, 'M', u'h'),
    (0xFF49, 'M', u'i'),
    (0xFF4A, 'M', u'j'),
    (0xFF4B, 'M', u'k'),
    (0xFF4C, 'M', u'l'),
    (0xFF4D, 'M', u'm'),
    (0xFF4E, 'M', u'n'),
    (0xFF4F, 'M', u'o'),
    (0xFF50, 'M', u'p'),
    (0xFF51, 'M', u'q'),
    (0xFF52, 'M', u'r'),
    (0xFF53, 'M', u's'),
    (0xFF54, 'M', u't'),
    (0xFF55, 'M', u'u'),
    (0xFF56, 'M', u'v'),
    (0xFF57, 'M', u'w'),
    (0xFF58, 'M', u'x'),
    (0xFF59, 'M', u'y'),
    (0xFF5A, 'M', u'z'),
    (0xFF5B, '3', u'{'),
    (0xFF5C, '3', u'|'),
    (0xFF5D, '3', u'}'),
    (0xFF5E, '3', u'~'),
    (0xFF5F, 'M', u'⦅'),
    (0xFF60, 'M', u'⦆'),
    (0xFF61, 'M', u'.'),
    (0xFF62, 'M', u'「'),
    (0xFF63, 'M', u'」'),
    (0xFF64, 'M', u'、'),
    (0xFF65, 'M', u'・'),
    (0xFF66, 'M', u'ヲ'),
    ]

def _seg_50():
    return [
    (0xFF67, 'M', u'ァ'),
    (0xFF68, 'M', u'ィ'),
    (0xFF69, 'M', u'ゥ'),
    (0xFF6A, 'M', u'ェ'),
    (0xFF6B, 'M', u'ォ'),
    (0xFF6C, 'M', u'ャ'),
    (0xFF6D, 'M', u'ュ'),
    (0xFF6E, 'M', u'ョ'),
    (0xFF6F, 'M', u'ッ'),
    (0xFF70, 'M', u'ー'),
    (0xFF71, 'M', u'ア'),
    (0xFF72, 'M', u'イ'),
    (0xFF73, 'M', u'ウ'),
    (0xFF74, 'M', u'エ'),
    (0xFF75, 'M', u'オ'),
    (0xFF76, 'M', u'カ'),
    (0xFF77, 'M', u'キ'),
    (0xFF78, 'M', u'ク'),
    (0xFF79, 'M', u'ケ'),
    (0xFF7A, 'M', u'コ'),
    (0xFF7B, 'M', u'サ'),
    (0xFF7C, 'M', u'シ'),
    (0xFF7D, 'M', u'ス'),
    (0xFF7E, 'M', u'セ'),
    (0xFF7F, 'M', u'ソ'),
    (0xFF80, 'M', u'タ'),
    (0xFF81, 'M', u'チ'),
    (0xFF82, 'M', u'ツ'),
    (0xFF83, 'M', u'テ'),
    (0xFF84, 'M', u'ト'),
    (0xFF85, 'M', u'ナ'),
    (0xFF86, 'M', u'ニ'),
    (0xFF87, 'M', u'ヌ'),
    (0xFF88, 'M', u'ネ'),
    (0xFF89, 'M', u'ノ'),
    (0xFF8A, 'M', u'ハ'),
    (0xFF8B, 'M', u'ヒ'),
    (0xFF8C, 'M', u'フ'),
    (0xFF8D, 'M', u'ヘ'),
    (0xFF8E, 'M', u'ホ'),
    (0xFF8F, 'M', u'マ'),
    (0xFF90, 'M', u'ミ'),
    (0xFF91, 'M', u'ム'),
    (0xFF92, 'M', u'メ'),
    (0xFF93, 'M', u'モ'),
    (0xFF94, 'M', u'ヤ'),
    (0xFF95, 'M', u'ユ'),
    (0xFF96, 'M', u'ヨ'),
    (0xFF97, 'M', u'ラ'),
    (0xFF98, 'M', u'リ'),
    (0xFF99, 'M', u'ル'),
    (0xFF9A, 'M', u'レ'),
    (0xFF9B, 'M', u'ロ'),
    (0xFF9C, 'M', u'ワ'),
    (0xFF9D, 'M', u'ン'),
    (0xFF9E, 'M', u'゙'),
    (0xFF9F, 'M', u'゚'),
    (0xFFA0, 'X'),
    (0xFFA1, 'M', u'ᄀ'),
    (0xFFA2, 'M', u'ᄁ'),
    (0xFFA3, 'M', u'ᆪ'),
    (0xFFA4, 'M', u'ᄂ'),
    (0xFFA5, 'M', u'ᆬ'),
    (0xFFA6, 'M', u'ᆭ'),
    (0xFFA7, 'M', u'ᄃ'),
    (0xFFA8, 'M', u'ᄄ'),
    (0xFFA9, 'M', u'ᄅ'),
    (0xFFAA, 'M', u'ᆰ'),
    (0xFFAB, 'M', u'ᆱ'),
    (0xFFAC, 'M', u'ᆲ'),
    (0xFFAD, 'M', u'ᆳ'),
    (0xFFAE, 'M', u'ᆴ'),
    (0xFFAF, 'M', u'ᆵ'),
    (0xFFB0, 'M', u'ᄚ'),
    (0xFFB1, 'M', u'ᄆ'),
    (0xFFB2, 'M', u'ᄇ'),
    (0xFFB3, 'M', u'ᄈ'),
    (0xFFB4, 'M', u'ᄡ'),
    (0xFFB5, 'M', u'ᄉ'),
    (0xFFB6, 'M', u'ᄊ'),
    (0xFFB7, 'M', u'ᄋ'),
    (0xFFB8, 'M', u'ᄌ'),
    (0xFFB9, 'M', u'ᄍ'),
    (0xFFBA, 'M', u'ᄎ'),
    (0xFFBB, 'M', u'ᄏ'),
    (0xFFBC, 'M', u'ᄐ'),
    (0xFFBD, 'M', u'ᄑ'),
    (0xFFBE, 'M', u'ᄒ'),
    (0xFFBF, 'X'),
    (0xFFC2, 'M', u'ᅡ'),
    (0xFFC3, 'M', u'ᅢ'),
    (0xFFC4, 'M', u'ᅣ'),
    (0xFFC5, 'M', u'ᅤ'),
    (0xFFC6, 'M', u'ᅥ'),
    (0xFFC7, 'M', u'ᅦ'),
    (0xFFC8, 'X'),
    (0xFFCA, 'M', u'ᅧ'),
    (0xFFCB, 'M', u'ᅨ'),
    (0xFFCC, 'M', u'ᅩ'),
    (0xFFCD, 'M', u'ᅪ'),
    ]

def _seg_51():
    return [
    (0xFFCE, 'M', u'ᅫ'),
    (0xFFCF, 'M', u'ᅬ'),
    (0xFFD0, 'X'),
    (0xFFD2, 'M', u'ᅭ'),
    (0xFFD3, 'M', u'ᅮ'),
    (0xFFD4, 'M', u'ᅯ'),
    (0xFFD5, 'M', u'ᅰ'),
    (0xFFD6, 'M', u'ᅱ'),
    (0xFFD7, 'M', u'ᅲ'),
    (0xFFD8, 'X'),
    (0xFFDA, 'M', u'ᅳ'),
    (0xFFDB, 'M', u'ᅴ'),
    (0xFFDC, 'M', u'ᅵ'),
    (0xFFDD, 'X'),
    (0xFFE0, 'M', u'¢'),
    (0xFFE1, 'M', u'£'),
    (0xFFE2, 'M', u'¬'),
    (0xFFE3, '3', u' ̄'),
    (0xFFE4, 'M', u'¦'),
    (0xFFE5, 'M', u'¥'),
    (0xFFE6, 'M', u'₩'),
    (0xFFE7, 'X'),
    (0xFFE8, 'M', u'│'),
    (0xFFE9, 'M', u'←'),
    (0xFFEA, 'M', u'↑'),
    (0xFFEB, 'M', u'→'),
    (0xFFEC, 'M', u'↓'),
    (0xFFED, 'M', u'■'),
    (0xFFEE, 'M', u'○'),
    (0xFFEF, 'X'),
    (0x10000, 'V'),
    (0x1000C, 'X'),
    (0x1000D, 'V'),
    (0x10027, 'X'),
    (0x10028, 'V'),
    (0x1003B, 'X'),
    (0x1003C, 'V'),
    (0x1003E, 'X'),
    (0x1003F, 'V'),
    (0x1004E, 'X'),
    (0x10050, 'V'),
    (0x1005E, 'X'),
    (0x10080, 'V'),
    (0x100FB, 'X'),
    (0x10100, 'V'),
    (0x10103, 'X'),
    (0x10107, 'V'),
    (0x10134, 'X'),
    (0x10137, 'V'),
    (0x1018B, 'X'),
    (0x10190, 'V'),
    (0x1019C, 'X'),
    (0x101D0, 'V'),
    (0x101FE, 'X'),
    (0x10280, 'V'),
    (0x1029D, 'X'),
    (0x102A0, 'V'),
    (0x102D1, 'X'),
    (0x10300, 'V'),
    (0x1031F, 'X'),
    (0x10320, 'V'),
    (0x10324, 'X'),
    (0x10330, 'V'),
    (0x1034B, 'X'),
    (0x10380, 'V'),
    (0x1039E, 'X'),
    (0x1039F, 'V'),
    (0x103C4, 'X'),
    (0x103C8, 'V'),
    (0x103D6, 'X'),
    (0x10400, 'M', u'𐐨'),
    (0x10401, 'M', u'𐐩'),
    (0x10402, 'M', u'𐐪'),
    (0x10403, 'M', u'𐐫'),
    (0x10404, 'M', u'𐐬'),
    (0x10405, 'M', u'𐐭'),
    (0x10406, 'M', u'𐐮'),
    (0x10407, 'M', u'𐐯'),
    (0x10408, 'M', u'𐐰'),
    (0x10409, 'M', u'𐐱'),
    (0x1040A, 'M', u'𐐲'),
    (0x1040B, 'M', u'𐐳'),
    (0x1040C, 'M', u'𐐴'),
    (0x1040D, 'M', u'𐐵'),
    (0x1040E, 'M', u'𐐶'),
    (0x1040F, 'M', u'𐐷'),
    (0x10410, 'M', u'𐐸'),
    (0x10411, 'M', u'𐐹'),
    (0x10412, 'M', u'𐐺'),
    (0x10413, 'M', u'𐐻'),
    (0x10414, 'M', u'𐐼'),
    (0x10415, 'M', u'𐐽'),
    (0x10416, 'M', u'𐐾'),
    (0x10417, 'M', u'𐐿'),
    (0x10418, 'M', u'𐑀'),
    (0x10419, 'M', u'𐑁'),
    (0x1041A, 'M', u'𐑂'),
    (0x1041B, 'M', u'𐑃'),
    (0x1041C, 'M', u'𐑄'),
    (0x1041D, 'M', u'𐑅'),
    ]

def _seg_52():
    return [
    (0x1041E, 'M', u'𐑆'),
    (0x1041F, 'M', u'𐑇'),
    (0x10420, 'M', u'𐑈'),
    (0x10421, 'M', u'𐑉'),
    (0x10422, 'M', u'𐑊'),
    (0x10423, 'M', u'𐑋'),
    (0x10424, 'M', u'𐑌'),
    (0x10425, 'M', u'𐑍'),
    (0x10426, 'M', u'𐑎'),
    (0x10427, 'M', u'𐑏'),
    (0x10428, 'V'),
    (0x1049E, 'X'),
    (0x104A0, 'V'),
    (0x104AA, 'X'),
    (0x10800, 'V'),
    (0x10806, 'X'),
    (0x10808, 'V'),
    (0x10809, 'X'),
    (0x1080A, 'V'),
    (0x10836, 'X'),
    (0x10837, 'V'),
    (0x10839, 'X'),
    (0x1083C, 'V'),
    (0x1083D, 'X'),
    (0x1083F, 'V'),
    (0x10856, 'X'),
    (0x10857, 'V'),
    (0x10860, 'X'),
    (0x10900, 'V'),
    (0x1091C, 'X'),
    (0x1091F, 'V'),
    (0x1093A, 'X'),
    (0x1093F, 'V'),
    (0x10940, 'X'),
    (0x10980, 'V'),
    (0x109B8, 'X'),
    (0x109BE, 'V'),
    (0x109C0, 'X'),
    (0x10A00, 'V'),
    (0x10A04, 'X'),
    (0x10A05, 'V'),
    (0x10A07, 'X'),
    (0x10A0C, 'V'),
    (0x10A14, 'X'),
    (0x10A15, 'V'),
    (0x10A18, 'X'),
    (0x10A19, 'V'),
    (0x10A34, 'X'),
    (0x10A38, 'V'),
    (0x10A3B, 'X'),
    (0x10A3F, 'V'),
    (0x10A48, 'X'),
    (0x10A50, 'V'),
    (0x10A59, 'X'),
    (0x10A60, 'V'),
    (0x10A80, 'X'),
    (0x10B00, 'V'),
    (0x10B36, 'X'),
    (0x10B39, 'V'),
    (0x10B56, 'X'),
    (0x10B58, 'V'),
    (0x10B73, 'X'),
    (0x10B78, 'V'),
    (0x10B80, 'X'),
    (0x10C00, 'V'),
    (0x10C49, 'X'),
    (0x10E60, 'V'),
    (0x10E7F, 'X'),
    (0x11000, 'V'),
    (0x1104E, 'X'),
    (0x11052, 'V'),
    (0x11070, 'X'),
    (0x11080, 'V'),
    (0x110BD, 'X'),
    (0x110BE, 'V'),
    (0x110C2, 'X'),
    (0x110D0, 'V'),
    (0x110E9, 'X'),
    (0x110F0, 'V'),
    (0x110FA, 'X'),
    (0x11100, 'V'),
    (0x11135, 'X'),
    (0x11136, 'V'),
    (0x11144, 'X'),
    (0x11180, 'V'),
    (0x111C9, 'X'),
    (0x111D0, 'V'),
    (0x111DA, 'X'),
    (0x11680, 'V'),
    (0x116B8, 'X'),
    (0x116C0, 'V'),
    (0x116CA, 'X'),
    (0x12000, 'V'),
    (0x1236F, 'X'),
    (0x12400, 'V'),
    (0x12463, 'X'),
    (0x12470, 'V'),
    (0x12474, 'X'),
    (0x13000, 'V'),
    (0x1342F, 'X'),
    ]

def _seg_53():
    return [
    (0x16800, 'V'),
    (0x16A39, 'X'),
    (0x16F00, 'V'),
    (0x16F45, 'X'),
    (0x16F50, 'V'),
    (0x16F7F, 'X'),
    (0x16F8F, 'V'),
    (0x16FA0, 'X'),
    (0x1B000, 'V'),
    (0x1B002, 'X'),
    (0x1D000, 'V'),
    (0x1D0F6, 'X'),
    (0x1D100, 'V'),
    (0x1D127, 'X'),
    (0x1D129, 'V'),
    (0x1D15E, 'M', u'𝅗𝅥'),
    (0x1D15F, 'M', u'𝅘𝅥'),
    (0x1D160, 'M', u'𝅘𝅥𝅮'),
    (0x1D161, 'M', u'𝅘𝅥𝅯'),
    (0x1D162, 'M', u'𝅘𝅥𝅰'),
    (0x1D163, 'M', u'𝅘𝅥𝅱'),
    (0x1D164, 'M', u'𝅘𝅥𝅲'),
    (0x1D165, 'V'),
    (0x1D173, 'X'),
    (0x1D17B, 'V'),
    (0x1D1BB, 'M', u'𝆹𝅥'),
    (0x1D1BC, 'M', u'𝆺𝅥'),
    (0x1D1BD, 'M', u'𝆹𝅥𝅮'),
    (0x1D1BE, 'M', u'𝆺𝅥𝅮'),
    (0x1D1BF, 'M', u'𝆹𝅥𝅯'),
    (0x1D1C0, 'M', u'𝆺𝅥𝅯'),
    (0x1D1C1, 'V'),
    (0x1D1DE, 'X'),
    (0x1D200, 'V'),
    (0x1D246, 'X'),
    (0x1D300, 'V'),
    (0x1D357, 'X'),
    (0x1D360, 'V'),
    (0x1D372, 'X'),
    (0x1D400, 'M', u'a'),
    (0x1D401, 'M', u'b'),
    (0x1D402, 'M', u'c'),
    (0x1D403, 'M', u'd'),
    (0x1D404, 'M', u'e'),
    (0x1D405, 'M', u'f'),
    (0x1D406, 'M', u'g'),
    (0x1D407, 'M', u'h'),
    (0x1D408, 'M', u'i'),
    (0x1D409, 'M', u'j'),
    (0x1D40A, 'M', u'k'),
    (0x1D40B, 'M', u'l'),
    (0x1D40C, 'M', u'm'),
    (0x1D40D, 'M', u'n'),
    (0x1D40E, 'M', u'o'),
    (0x1D40F, 'M', u'p'),
    (0x1D410, 'M', u'q'),
    (0x1D411, 'M', u'r'),
    (0x1D412, 'M', u's'),
    (0x1D413, 'M', u't'),
    (0x1D414, 'M', u'u'),
    (0x1D415, 'M', u'v'),
    (0x1D416, 'M', u'w'),
    (0x1D417, 'M', u'x'),
    (0x1D418, 'M', u'y'),
    (0x1D419, 'M', u'z'),
    (0x1D41A, 'M', u'a'),
    (0x1D41B, 'M', u'b'),
    (0x1D41C, 'M', u'c'),
    (0x1D41D, 'M', u'd'),
    (0x1D41E, 'M', u'e'),
    (0x1D41F, 'M', u'f'),
    (0x1D420, 'M', u'g'),
    (0x1D421, 'M', u'h'),
    (0x1D422, 'M', u'i'),
    (0x1D423, 'M', u'j'),
    (0x1D424, 'M', u'k'),
    (0x1D425, 'M', u'l'),
    (0x1D426, 'M', u'm'),
    (0x1D427, 'M', u'n'),
    (0x1D428, 'M', u'o'),
    (0x1D429, 'M', u'p'),
    (0x1D42A, 'M', u'q'),
    (0x1D42B, 'M', u'r'),
    (0x1D42C, 'M', u's'),
    (0x1D42D, 'M', u't'),
    (0x1D42E, 'M', u'u'),
    (0x1D42F, 'M', u'v'),
    (0x1D430, 'M', u'w'),
    (0x1D431, 'M', u'x'),
    (0x1D432, 'M', u'y'),
    (0x1D433, 'M', u'z'),
    (0x1D434, 'M', u'a'),
    (0x1D435, 'M', u'b'),
    (0x1D436, 'M', u'c'),
    (0x1D437, 'M', u'd'),
    (0x1D438, 'M', u'e'),
    (0x1D439, 'M', u'f'),
    (0x1D43A, 'M', u'g'),
    (0x1D43B, 'M', u'h'),
    (0x1D43C, 'M', u'i'),
    ]

def _seg_54():
    return [
    (0x1D43D, 'M', u'j'),
    (0x1D43E, 'M', u'k'),
    (0x1D43F, 'M', u'l'),
    (0x1D440, 'M', u'm'),
    (0x1D441, 'M', u'n'),
    (0x1D442, 'M', u'o'),
    (0x1D443, 'M', u'p'),
    (0x1D444, 'M', u'q'),
    (0x1D445, 'M', u'r'),
    (0x1D446, 'M', u's'),
    (0x1D447, 'M', u't'),
    (0x1D448, 'M', u'u'),
    (0x1D449, 'M', u'v'),
    (0x1D44A, 'M', u'w'),
    (0x1D44B, 'M', u'x'),
    (0x1D44C, 'M', u'y'),
    (0x1D44D, 'M', u'z'),
    (0x1D44E, 'M', u'a'),
    (0x1D44F, 'M', u'b'),
    (0x1D450, 'M', u'c'),
    (0x1D451, 'M', u'd'),
    (0x1D452, 'M', u'e'),
    (0x1D453, 'M', u'f'),
    (0x1D454, 'M', u'g'),
    (0x1D455, 'X'),
    (0x1D456, 'M', u'i'),
    (0x1D457, 'M', u'j'),
    (0x1D458, 'M', u'k'),
    (0x1D459, 'M', u'l'),
    (0x1D45A, 'M', u'm'),
    (0x1D45B, 'M', u'n'),
    (0x1D45C, 'M', u'o'),
    (0x1D45D, 'M', u'p'),
    (0x1D45E, 'M', u'q'),
    (0x1D45F, 'M', u'r'),
    (0x1D460, 'M', u's'),
    (0x1D461, 'M', u't'),
    (0x1D462, 'M', u'u'),
    (0x1D463, 'M', u'v'),
    (0x1D464, 'M', u'w'),
    (0x1D465, 'M', u'x'),
    (0x1D466, 'M', u'y'),
    (0x1D467, 'M', u'z'),
    (0x1D468, 'M', u'a'),
    (0x1D469, 'M', u'b'),
    (0x1D46A, 'M', u'c'),
    (0x1D46B, 'M', u'd'),
    (0x1D46C, 'M', u'e'),
    (0x1D46D, 'M', u'f'),
    (0x1D46E, 'M', u'g'),
    (0x1D46F, 'M', u'h'),
    (0x1D470, 'M', u'i'),
    (0x1D471, 'M', u'j'),
    (0x1D472, 'M', u'k'),
    (0x1D473, 'M', u'l'),
    (0x1D474, 'M', u'm'),
    (0x1D475, 'M', u'n'),
    (0x1D476, 'M', u'o'),
    (0x1D477, 'M', u'p'),
    (0x1D478, 'M', u'q'),
    (0x1D479, 'M', u'r'),
    (0x1D47A, 'M', u's'),
    (0x1D47B, 'M', u't'),
    (0x1D47C, 'M', u'u'),
    (0x1D47D, 'M', u'v'),
    (0x1D47E, 'M', u'w'),
    (0x1D47F, 'M', u'x'),
    (0x1D480, 'M', u'y'),
    (0x1D481, 'M', u'z'),
    (0x1D482, 'M', u'a'),
    (0x1D483, 'M', u'b'),
    (0x1D484, 'M', u'c'),
    (0x1D485, 'M', u'd'),
    (0x1D486, 'M', u'e'),
    (0x1D487, 'M', u'f'),
    (0x1D488, 'M', u'g'),
    (0x1D489, 'M', u'h'),
    (0x1D48A, 'M', u'i'),
    (0x1D48B, 'M', u'j'),
    (0x1D48C, 'M', u'k'),
    (0x1D48D, 'M', u'l'),
    (0x1D48E, 'M', u'm'),
    (0x1D48F, 'M', u'n'),
    (0x1D490, 'M', u'o'),
    (0x1D491, 'M', u'p'),
    (0x1D492, 'M', u'q'),
    (0x1D493, 'M', u'r'),
    (0x1D494, 'M', u's'),
    (0x1D495, 'M', u't'),
    (0x1D496, 'M', u'u'),
    (0x1D497, 'M', u'v'),
    (0x1D498, 'M', u'w'),
    (0x1D499, 'M', u'x'),
    (0x1D49A, 'M', u'y'),
    (0x1D49B, 'M', u'z'),
    (0x1D49C, 'M', u'a'),
    (0x1D49D, 'X'),
    (0x1D49E, 'M', u'c'),
    (0x1D49F, 'M', u'd'),
    (0x1D4A0, 'X'),
    ]

def _seg_55():
    return [
    (0x1D4A2, 'M', u'g'),
    (0x1D4A3, 'X'),
    (0x1D4A5, 'M', u'j'),
    (0x1D4A6, 'M', u'k'),
    (0x1D4A7, 'X'),
    (0x1D4A9, 'M', u'n'),
    (0x1D4AA, 'M', u'o'),
    (0x1D4AB, 'M', u'p'),
    (0x1D4AC, 'M', u'q'),
    (0x1D4AD, 'X'),
    (0x1D4AE, 'M', u's'),
    (0x1D4AF, 'M', u't'),
    (0x1D4B0, 'M', u'u'),
    (0x1D4B1, 'M', u'v'),
    (0x1D4B2, 'M', u'w'),
    (0x1D4B3, 'M', u'x'),
    (0x1D4B4, 'M', u'y'),
    (0x1D4B5, 'M', u'z'),
    (0x1D4B6, 'M', u'a'),
    (0x1D4B7, 'M', u'b'),
    (0x1D4B8, 'M', u'c'),
    (0x1D4B9, 'M', u'd'),
    (0x1D4BA, 'X'),
    (0x1D4BB, 'M', u'f'),
    (0x1D4BC, 'X'),
    (0x1D4BD, 'M', u'h'),
    (0x1D4BE, 'M', u'i'),
    (0x1D4BF, 'M', u'j'),
    (0x1D4C0, 'M', u'k'),
    (0x1D4C1, 'M', u'l'),
    (0x1D4C2, 'M', u'm'),
    (0x1D4C3, 'M', u'n'),
    (0x1D4C4, 'X'),
    (0x1D4C5, 'M', u'p'),
    (0x1D4C6, 'M', u'q'),
    (0x1D4C7, 'M', u'r'),
    (0x1D4C8, 'M', u's'),
    (0x1D4C9, 'M', u't'),
    (0x1D4CA, 'M', u'u'),
    (0x1D4CB, 'M', u'v'),
    (0x1D4CC, 'M', u'w'),
    (0x1D4CD, 'M', u'x'),
    (0x1D4CE, 'M', u'y'),
    (0x1D4CF, 'M', u'z'),
    (0x1D4D0, 'M', u'a'),
    (0x1D4D1, 'M', u'b'),
    (0x1D4D2, 'M', u'c'),
    (0x1D4D3, 'M', u'd'),
    (0x1D4D4, 'M', u'e'),
    (0x1D4D5, 'M', u'f'),
    (0x1D4D6, 'M', u'g'),
    (0x1D4D7, 'M', u'h'),
    (0x1D4D8, 'M', u'i'),
    (0x1D4D9, 'M', u'j'),
    (0x1D4DA, 'M', u'k'),
    (0x1D4DB, 'M', u'l'),
    (0x1D4DC, 'M', u'm'),
    (0x1D4DD, 'M', u'n'),
    (0x1D4DE, 'M', u'o'),
    (0x1D4DF, 'M', u'p'),
    (0x1D4E0, 'M', u'q'),
    (0x1D4E1, 'M', u'r'),
    (0x1D4E2, 'M', u's'),
    (0x1D4E3, 'M', u't'),
    (0x1D4E4, 'M', u'u'),
    (0x1D4E5, 'M', u'v'),
    (0x1D4E6, 'M', u'w'),
    (0x1D4E7, 'M', u'x'),
    (0x1D4E8, 'M', u'y'),
    (0x1D4E9, 'M', u'z'),
    (0x1D4EA, 'M', u'a'),
    (0x1D4EB, 'M', u'b'),
    (0x1D4EC, 'M', u'c'),
    (0x1D4ED, 'M', u'd'),
    (0x1D4EE, 'M', u'e'),
    (0x1D4EF, 'M', u'f'),
    (0x1D4F0, 'M', u'g'),
    (0x1D4F1, 'M', u'h'),
    (0x1D4F2, 'M', u'i'),
    (0x1D4F3, 'M', u'j'),
    (0x1D4F4, 'M', u'k'),
    (0x1D4F5, 'M', u'l'),
    (0x1D4F6, 'M', u'm'),
    (0x1D4F7, 'M', u'n'),
    (0x1D4F8, 'M', u'o'),
    (0x1D4F9, 'M', u'p'),
    (0x1D4FA, 'M', u'q'),
    (0x1D4FB, 'M', u'r'),
    (0x1D4FC, 'M', u's'),
    (0x1D4FD, 'M', u't'),
    (0x1D4FE, 'M', u'u'),
    (0x1D4FF, 'M', u'v'),
    (0x1D500, 'M', u'w'),
    (0x1D501, 'M', u'x'),
    (0x1D502, 'M', u'y'),
    (0x1D503, 'M', u'z'),
    (0x1D504, 'M', u'a'),
    (0x1D505, 'M', u'b'),
    (0x1D506, 'X'),
    (0x1D507, 'M', u'd'),
    ]

def _seg_56():
    return [
    (0x1D508, 'M', u'e'),
    (0x1D509, 'M', u'f'),
    (0x1D50A, 'M', u'g'),
    (0x1D50B, 'X'),
    (0x1D50D, 'M', u'j'),
    (0x1D50E, 'M', u'k'),
    (0x1D50F, 'M', u'l'),
    (0x1D510, 'M', u'm'),
    (0x1D511, 'M', u'n'),
    (0x1D512, 'M', u'o'),
    (0x1D513, 'M', u'p'),
    (0x1D514, 'M', u'q'),
    (0x1D515, 'X'),
    (0x1D516, 'M', u's'),
    (0x1D517, 'M', u't'),
    (0x1D518, 'M', u'u'),
    (0x1D519, 'M', u'v'),
    (0x1D51A, 'M', u'w'),
    (0x1D51B, 'M', u'x'),
    (0x1D51C, 'M', u'y'),
    (0x1D51D, 'X'),
    (0x1D51E, 'M', u'a'),
    (0x1D51F, 'M', u'b'),
    (0x1D520, 'M', u'c'),
    (0x1D521, 'M', u'd'),
    (0x1D522, 'M', u'e'),
    (0x1D523, 'M', u'f'),
    (0x1D524, 'M', u'g'),
    (0x1D525, 'M', u'h'),
    (0x1D526, 'M', u'i'),
    (0x1D527, 'M', u'j'),
    (0x1D528, 'M', u'k'),
    (0x1D529, 'M', u'l'),
    (0x1D52A, 'M', u'm'),
    (0x1D52B, 'M', u'n'),
    (0x1D52C, 'M', u'o'),
    (0x1D52D, 'M', u'p'),
    (0x1D52E, 'M', u'q'),
    (0x1D52F, 'M', u'r'),
    (0x1D530, 'M', u's'),
    (0x1D531, 'M', u't'),
    (0x1D532, 'M', u'u'),
    (0x1D533, 'M', u'v'),
    (0x1D534, 'M', u'w'),
    (0x1D535, 'M', u'x'),
    (0x1D536, 'M', u'y'),
    (0x1D537, 'M', u'z'),
    (0x1D538, 'M', u'a'),
    (0x1D539, 'M', u'b'),
    (0x1D53A, 'X'),
    (0x1D53B, 'M', u'd'),
    (0x1D53C, 'M', u'e'),
    (0x1D53D, 'M', u'f'),
    (0x1D53E, 'M', u'g'),
    (0x1D53F, 'X'),
    (0x1D540, 'M', u'i'),
    (0x1D541, 'M', u'j'),
    (0x1D542, 'M', u'k'),
    (0x1D543, 'M', u'l'),
    (0x1D544, 'M', u'm'),
    (0x1D545, 'X'),
    (0x1D546, 'M', u'o'),
    (0x1D547, 'X'),
    (0x1D54A, 'M', u's'),
    (0x1D54B, 'M', u't'),
    (0x1D54C, 'M', u'u'),
    (0x1D54D, 'M', u'v'),
    (0x1D54E, 'M', u'w'),
    (0x1D54F, 'M', u'x'),
    (0x1D550, 'M', u'y'),
    (0x1D551, 'X'),
    (0x1D552, 'M', u'a'),
    (0x1D553, 'M', u'b'),
    (0x1D554, 'M', u'c'),
    (0x1D555, 'M', u'd'),
    (0x1D556, 'M', u'e'),
    (0x1D557, 'M', u'f'),
    (0x1D558, 'M', u'g'),
    (0x1D559, 'M', u'h'),
    (0x1D55A, 'M', u'i'),
    (0x1D55B, 'M', u'j'),
    (0x1D55C, 'M', u'k'),
    (0x1D55D, 'M', u'l'),
    (0x1D55E, 'M', u'm'),
    (0x1D55F, 'M', u'n'),
    (0x1D560, 'M', u'o'),
    (0x1D561, 'M', u'p'),
    (0x1D562, 'M', u'q'),
    (0x1D563, 'M', u'r'),
    (0x1D564, 'M', u's'),
    (0x1D565, 'M', u't'),
    (0x1D566, 'M', u'u'),
    (0x1D567, 'M', u'v'),
    (0x1D568, 'M', u'w'),
    (0x1D569, 'M', u'x'),
    (0x1D56A, 'M', u'y'),
    (0x1D56B, 'M', u'z'),
    (0x1D56C, 'M', u'a'),
    (0x1D56D, 'M', u'b'),
    (0x1D56E, 'M', u'c'),
    ]

def _seg_57():
    return [
    (0x1D56F, 'M', u'd'),
    (0x1D570, 'M', u'e'),
    (0x1D571, 'M', u'f'),
    (0x1D572, 'M', u'g'),
    (0x1D573, 'M', u'h'),
    (0x1D574, 'M', u'i'),
    (0x1D575, 'M', u'j'),
    (0x1D576, 'M', u'k'),
    (0x1D577, 'M', u'l'),
    (0x1D578, 'M', u'm'),
    (0x1D579, 'M', u'n'),
    (0x1D57A, 'M', u'o'),
    (0x1D57B, 'M', u'p'),
    (0x1D57C, 'M', u'q'),
    (0x1D57D, 'M', u'r'),
    (0x1D57E, 'M', u's'),
    (0x1D57F, 'M', u't'),
    (0x1D580, 'M', u'u'),
    (0x1D581, 'M', u'v'),
    (0x1D582, 'M', u'w'),
    (0x1D583, 'M', u'x'),
    (0x1D584, 'M', u'y'),
    (0x1D585, 'M', u'z'),
    (0x1D586, 'M', u'a'),
    (0x1D587, 'M', u'b'),
    (0x1D588, 'M', u'c'),
    (0x1D589, 'M', u'd'),
    (0x1D58A, 'M', u'e'),
    (0x1D58B, 'M', u'f'),
    (0x1D58C, 'M', u'g'),
    (0x1D58D, 'M', u'h'),
    (0x1D58E, 'M', u'i'),
    (0x1D58F, 'M', u'j'),
    (0x1D590, 'M', u'k'),
    (0x1D591, 'M', u'l'),
    (0x1D592, 'M', u'm'),
    (0x1D593, 'M', u'n'),
    (0x1D594, 'M', u'o'),
    (0x1D595, 'M', u'p'),
    (0x1D596, 'M', u'q'),
    (0x1D597, 'M', u'r'),
    (0x1D598, 'M', u's'),
    (0x1D599, 'M', u't'),
    (0x1D59A, 'M', u'u'),
    (0x1D59B, 'M', u'v'),
    (0x1D59C, 'M', u'w'),
    (0x1D59D, 'M', u'x'),
    (0x1D59E, 'M', u'y'),
    (0x1D59F, 'M', u'z'),
    (0x1D5A0, 'M', u'a'),
    (0x1D5A1, 'M', u'b'),
    (0x1D5A2, 'M', u'c'),
    (0x1D5A3, 'M', u'd'),
    (0x1D5A4, 'M', u'e'),
    (0x1D5A5, 'M', u'f'),
    (0x1D5A6, 'M', u'g'),
    (0x1D5A7, 'M', u'h'),
    (0x1D5A8, 'M', u'i'),
    (0x1D5A9, 'M', u'j'),
    (0x1D5AA, 'M', u'k'),
    (0x1D5AB, 'M', u'l'),
    (0x1D5AC, 'M', u'm'),
    (0x1D5AD, 'M', u'n'),
    (0x1D5AE, 'M', u'o'),
    (0x1D5AF, 'M', u'p'),
    (0x1D5B0, 'M', u'q'),
    (0x1D5B1, 'M', u'r'),
    (0x1D5B2, 'M', u's'),
    (0x1D5B3, 'M', u't'),
    (0x1D5B4, 'M', u'u'),
    (0x1D5B5, 'M', u'v'),
    (0x1D5B6, 'M', u'w'),
    (0x1D5B7, 'M', u'x'),
    (0x1D5B8, 'M', u'y'),
    (0x1D5B9, 'M', u'z'),
    (0x1D5BA, 'M', u'a'),
    (0x1D5BB, 'M', u'b'),
    (0x1D5BC, 'M', u'c'),
    (0x1D5BD, 'M', u'd'),
    (0x1D5BE, 'M', u'e'),
    (0x1D5BF, 'M', u'f'),
    (0x1D5C0, 'M', u'g'),
    (0x1D5C1, 'M', u'h'),
    (0x1D5C2, 'M', u'i'),
    (0x1D5C3, 'M', u'j'),
    (0x1D5C4, 'M', u'k'),
    (0x1D5C5, 'M', u'l'),
    (0x1D5C6, 'M', u'm'),
    (0x1D5C7, 'M', u'n'),
    (0x1D5C8, 'M', u'o'),
    (0x1D5C9, 'M', u'p'),
    (0x1D5CA, 'M', u'q'),
    (0x1D5CB, 'M', u'r'),
    (0x1D5CC, 'M', u's'),
    (0x1D5CD, 'M', u't'),
    (0x1D5CE, 'M', u'u'),
    (0x1D5CF, 'M', u'v'),
    (0x1D5D0, 'M', u'w'),
    (0x1D5D1, 'M', u'x'),
    (0x1D5D2, 'M', u'y'),
    ]

def _seg_58():
    return [
    (0x1D5D3, 'M', u'z'),
    (0x1D5D4, 'M', u'a'),
    (0x1D5D5, 'M', u'b'),
    (0x1D5D6, 'M', u'c'),
    (0x1D5D7, 'M', u'd'),
    (0x1D5D8, 'M', u'e'),
    (0x1D5D9, 'M', u'f'),
    (0x1D5DA, 'M', u'g'),
    (0x1D5DB, 'M', u'h'),
    (0x1D5DC, 'M', u'i'),
    (0x1D5DD, 'M', u'j'),
    (0x1D5DE, 'M', u'k'),
    (0x1D5DF, 'M', u'l'),
    (0x1D5E0, 'M', u'm'),
    (0x1D5E1, 'M', u'n'),
    (0x1D5E2, 'M', u'o'),
    (0x1D5E3, 'M', u'p'),
    (0x1D5E4, 'M', u'q'),
    (0x1D5E5, 'M', u'r'),
    (0x1D5E6, 'M', u's'),
    (0x1D5E7, 'M', u't'),
    (0x1D5E8, 'M', u'u'),
    (0x1D5E9, 'M', u'v'),
    (0x1D5EA, 'M', u'w'),
    (0x1D5EB, 'M', u'x'),
    (0x1D5EC, 'M', u'y'),
    (0x1D5ED, 'M', u'z'),
    (0x1D5EE, 'M', u'a'),
    (0x1D5EF, 'M', u'b'),
    (0x1D5F0, 'M', u'c'),
    (0x1D5F1, 'M', u'd'),
    (0x1D5F2, 'M', u'e'),
    (0x1D5F3, 'M', u'f'),
    (0x1D5F4, 'M', u'g'),
    (0x1D5F5, 'M', u'h'),
    (0x1D5F6, 'M', u'i'),
    (0x1D5F7, 'M', u'j'),
    (0x1D5F8, 'M', u'k'),
    (0x1D5F9, 'M', u'l'),
    (0x1D5FA, 'M', u'm'),
    (0x1D5FB, 'M', u'n'),
    (0x1D5FC, 'M', u'o'),
    (0x1D5FD, 'M', u'p'),
    (0x1D5FE, 'M', u'q'),
    (0x1D5FF, 'M', u'r'),
    (0x1D600, 'M', u's'),
    (0x1D601, 'M', u't'),
    (0x1D602, 'M', u'u'),
    (0x1D603, 'M', u'v'),
    (0x1D604, 'M', u'w'),
    (0x1D605, 'M', u'x'),
    (0x1D606, 'M', u'y'),
    (0x1D607, 'M', u'z'),
    (0x1D608, 'M', u'a'),
    (0x1D609, 'M', u'b'),
    (0x1D60A, 'M', u'c'),
    (0x1D60B, 'M', u'd'),
    (0x1D60C, 'M', u'e'),
    (0x1D60D, 'M', u'f'),
    (0x1D60E, 'M', u'g'),
    (0x1D60F, 'M', u'h'),
    (0x1D610, 'M', u'i'),
    (0x1D611, 'M', u'j'),
    (0x1D612, 'M', u'k'),
    (0x1D613, 'M', u'l'),
    (0x1D614, 'M', u'm'),
    (0x1D615, 'M', u'n'),
    (0x1D616, 'M', u'o'),
    (0x1D617, 'M', u'p'),
    (0x1D618, 'M', u'q'),
    (0x1D619, 'M', u'r'),
    (0x1D61A, 'M', u's'),
    (0x1D61B, 'M', u't'),
    (0x1D61C, 'M', u'u'),
    (0x1D61D, 'M', u'v'),
    (0x1D61E, 'M', u'w'),
    (0x1D61F, 'M', u'x'),
    (0x1D620, 'M', u'y'),
    (0x1D621, 'M', u'z'),
    (0x1D622, 'M', u'a'),
    (0x1D623, 'M', u'b'),
    (0x1D624, 'M', u'c'),
    (0x1D625, 'M', u'd'),
    (0x1D626, 'M', u'e'),
    (0x1D627, 'M', u'f'),
    (0x1D628, 'M', u'g'),
    (0x1D629, 'M', u'h'),
    (0x1D62A, 'M', u'i'),
    (0x1D62B, 'M', u'j'),
    (0x1D62C, 'M', u'k'),
    (0x1D62D, 'M', u'l'),
    (0x1D62E, 'M', u'm'),
    (0x1D62F, 'M', u'n'),
    (0x1D630, 'M', u'o'),
    (0x1D631, 'M', u'p'),
    (0x1D632, 'M', u'q'),
    (0x1D633, 'M', u'r'),
    (0x1D634, 'M', u's'),
    (0x1D635, 'M', u't'),
    (0x1D636, 'M', u'u'),
    ]

def _seg_59():
    return [
    (0x1D637, 'M', u'v'),
    (0x1D638, 'M', u'w'),
    (0x1D639, 'M', u'x'),
    (0x1D63A, 'M', u'y'),
    (0x1D63B, 'M', u'z'),
    (0x1D63C, 'M', u'a'),
    (0x1D63D, 'M', u'b'),
    (0x1D63E, 'M', u'c'),
    (0x1D63F, 'M', u'd'),
    (0x1D640, 'M', u'e'),
    (0x1D641, 'M', u'f'),
    (0x1D642, 'M', u'g'),
    (0x1D643, 'M', u'h'),
    (0x1D644, 'M', u'i'),
    (0x1D645, 'M', u'j'),
    (0x1D646, 'M', u'k'),
    (0x1D647, 'M', u'l'),
    (0x1D648, 'M', u'm'),
    (0x1D649, 'M', u'n'),
    (0x1D64A, 'M', u'o'),
    (0x1D64B, 'M', u'p'),
    (0x1D64C, 'M', u'q'),
    (0x1D64D, 'M', u'r'),
    (0x1D64E, 'M', u's'),
    (0x1D64F, 'M', u't'),
    (0x1D650, 'M', u'u'),
    (0x1D651, 'M', u'v'),
    (0x1D652, 'M', u'w'),
    (0x1D653, 'M', u'x'),
    (0x1D654, 'M', u'y'),
    (0x1D655, 'M', u'z'),
    (0x1D656, 'M', u'a'),
    (0x1D657, 'M', u'b'),
    (0x1D658, 'M', u'c'),
    (0x1D659, 'M', u'd'),
    (0x1D65A, 'M', u'e'),
    (0x1D65B, 'M', u'f'),
    (0x1D65C, 'M', u'g'),
    (0x1D65D, 'M', u'h'),
    (0x1D65E, 'M', u'i'),
    (0x1D65F, 'M', u'j'),
    (0x1D660, 'M', u'k'),
    (0x1D661, 'M', u'l'),
    (0x1D662, 'M', u'm'),
    (0x1D663, 'M', u'n'),
    (0x1D664, 'M', u'o'),
    (0x1D665, 'M', u'p'),
    (0x1D666, 'M', u'q'),
    (0x1D667, 'M', u'r'),
    (0x1D668, 'M', u's'),
    (0x1D669, 'M', u't'),
    (0x1D66A, 'M', u'u'),
    (0x1D66B, 'M', u'v'),
    (0x1D66C, 'M', u'w'),
    (0x1D66D, 'M', u'x'),
    (0x1D66E, 'M', u'y'),
    (0x1D66F, 'M', u'z'),
    (0x1D670, 'M', u'a'),
    (0x1D671, 'M', u'b'),
    (0x1D672, 'M', u'c'),
    (0x1D673, 'M', u'd'),
    (0x1D674, 'M', u'e'),
    (0x1D675, 'M', u'f'),
    (0x1D676, 'M', u'g'),
    (0x1D677, 'M', u'h'),
    (0x1D678, 'M', u'i'),
    (0x1D679, 'M', u'j'),
    (0x1D67A, 'M', u'k'),
    (0x1D67B, 'M', u'l'),
    (0x1D67C, 'M', u'm'),
    (0x1D67D, 'M', u'n'),
    (0x1D67E, 'M', u'o'),
    (0x1D67F, 'M', u'p'),
    (0x1D680, 'M', u'q'),
    (0x1D681, 'M', u'r'),
    (0x1D682, 'M', u's'),
    (0x1D683, 'M', u't'),
    (0x1D684, 'M', u'u'),
    (0x1D685, 'M', u'v'),
    (0x1D686, 'M', u'w'),
    (0x1D687, 'M', u'x'),
    (0x1D688, 'M', u'y'),
    (0x1D689, 'M', u'z'),
    (0x1D68A, 'M', u'a'),
    (0x1D68B, 'M', u'b'),
    (0x1D68C, 'M', u'c'),
    (0x1D68D, 'M', u'd'),
    (0x1D68E, 'M', u'e'),
    (0x1D68F, 'M', u'f'),
    (0x1D690, 'M', u'g'),
    (0x1D691, 'M', u'h'),
    (0x1D692, 'M', u'i'),
    (0x1D693, 'M', u'j'),
    (0x1D694, 'M', u'k'),
    (0x1D695, 'M', u'l'),
    (0x1D696, 'M', u'm'),
    (0x1D697, 'M', u'n'),
    (0x1D698, 'M', u'o'),
    (0x1D699, 'M', u'p'),
    (0x1D69A, 'M', u'q'),
    ]

def _seg_60():
    return [
    (0x1D69B, 'M', u'r'),
    (0x1D69C, 'M', u's'),
    (0x1D69D, 'M', u't'),
    (0x1D69E, 'M', u'u'),
    (0x1D69F, 'M', u'v'),
    (0x1D6A0, 'M', u'w'),
    (0x1D6A1, 'M', u'x'),
    (0x1D6A2, 'M', u'y'),
    (0x1D6A3, 'M', u'z'),
    (0x1D6A4, 'M', u'ı'),
    (0x1D6A5, 'M', u'ȷ'),
    (0x1D6A6, 'X'),
    (0x1D6A8, 'M', u'α'),
    (0x1D6A9, 'M', u'β'),
    (0x1D6AA, 'M', u'γ'),
    (0x1D6AB, 'M', u'δ'),
    (0x1D6AC, 'M', u'ε'),
    (0x1D6AD, 'M', u'ζ'),
    (0x1D6AE, 'M', u'η'),
    (0x1D6AF, 'M', u'θ'),
    (0x1D6B0, 'M', u'ι'),
    (0x1D6B1, 'M', u'κ'),
    (0x1D6B2, 'M', u'λ'),
    (0x1D6B3, 'M', u'μ'),
    (0x1D6B4, 'M', u'ν'),
    (0x1D6B5, 'M', u'ξ'),
    (0x1D6B6, 'M', u'ο'),
    (0x1D6B7, 'M', u'π'),
    (0x1D6B8, 'M', u'ρ'),
    (0x1D6B9, 'M', u'θ'),
    (0x1D6BA, 'M', u'σ'),
    (0x1D6BB, 'M', u'τ'),
    (0x1D6BC, 'M', u'υ'),
    (0x1D6BD, 'M', u'φ'),
    (0x1D6BE, 'M', u'χ'),
    (0x1D6BF, 'M', u'ψ'),
    (0x1D6C0, 'M', u'ω'),
    (0x1D6C1, 'M', u'∇'),
    (0x1D6C2, 'M', u'α'),
    (0x1D6C3, 'M', u'β'),
    (0x1D6C4, 'M', u'γ'),
    (0x1D6C5, 'M', u'δ'),
    (0x1D6C6, 'M', u'ε'),
    (0x1D6C7, 'M', u'ζ'),
    (0x1D6C8, 'M', u'η'),
    (0x1D6C9, 'M', u'θ'),
    (0x1D6CA, 'M', u'ι'),
    (0x1D6CB, 'M', u'κ'),
    (0x1D6CC, 'M', u'λ'),
    (0x1D6CD, 'M', u'μ'),
    (0x1D6CE, 'M', u'ν'),
    (0x1D6CF, 'M', u'ξ'),
    (0x1D6D0, 'M', u'ο'),
    (0x1D6D1, 'M', u'π'),
    (0x1D6D2, 'M', u'ρ'),
    (0x1D6D3, 'M', u'σ'),
    (0x1D6D5, 'M', u'τ'),
    (0x1D6D6, 'M', u'υ'),
    (0x1D6D7, 'M', u'φ'),
    (0x1D6D8, 'M', u'χ'),
    (0x1D6D9, 'M', u'ψ'),
    (0x1D6DA, 'M', u'ω'),
    (0x1D6DB, 'M', u'∂'),
    (0x1D6DC, 'M', u'ε'),
    (0x1D6DD, 'M', u'θ'),
    (0x1D6DE, 'M', u'κ'),
    (0x1D6DF, 'M', u'φ'),
    (0x1D6E0, 'M', u'ρ'),
    (0x1D6E1, 'M', u'π'),
    (0x1D6E2, 'M', u'α'),
    (0x1D6E3, 'M', u'β'),
    (0x1D6E4, 'M', u'γ'),
    (0x1D6E5, 'M', u'δ'),
    (0x1D6E6, 'M', u'ε'),
    (0x1D6E7, 'M', u'ζ'),
    (0x1D6E8, 'M', u'η'),
    (0x1D6E9, 'M', u'θ'),
    (0x1D6EA, 'M', u'ι'),
    (0x1D6EB, 'M', u'κ'),
    (0x1D6EC, 'M', u'λ'),
    (0x1D6ED, 'M', u'μ'),
    (0x1D6EE, 'M', u'ν'),
    (0x1D6EF, 'M', u'ξ'),
    (0x1D6F0, 'M', u'ο'),
    (0x1D6F1, 'M', u'π'),
    (0x1D6F2, 'M', u'ρ'),
    (0x1D6F3, 'M', u'θ'),
    (0x1D6F4, 'M', u'σ'),
    (0x1D6F5, 'M', u'τ'),
    (0x1D6F6, 'M', u'υ'),
    (0x1D6F7, 'M', u'φ'),
    (0x1D6F8, 'M', u'χ'),
    (0x1D6F9, 'M', u'ψ'),
    (0x1D6FA, 'M', u'ω'),
    (0x1D6FB, 'M', u'∇'),
    (0x1D6FC, 'M', u'α'),
    (0x1D6FD, 'M', u'β'),
    (0x1D6FE, 'M', u'γ'),
    (0x1D6FF, 'M', u'δ'),
    (0x1D700, 'M', u'ε'),
    ]

def _seg_61():
    return [
    (0x1D701, 'M', u'ζ'),
    (0x1D702, 'M', u'η'),
    (0x1D703, 'M', u'θ'),
    (0x1D704, 'M', u'ι'),
    (0x1D705, 'M', u'κ'),
    (0x1D706, 'M', u'λ'),
    (0x1D707, 'M', u'μ'),
    (0x1D708, 'M', u'ν'),
    (0x1D709, 'M', u'ξ'),
    (0x1D70A, 'M', u'ο'),
    (0x1D70B, 'M', u'π'),
    (0x1D70C, 'M', u'ρ'),
    (0x1D70D, 'M', u'σ'),
    (0x1D70F, 'M', u'τ'),
    (0x1D710, 'M', u'υ'),
    (0x1D711, 'M', u'φ'),
    (0x1D712, 'M', u'χ'),
    (0x1D713, 'M', u'ψ'),
    (0x1D714, 'M', u'ω'),
    (0x1D715, 'M', u'∂'),
    (0x1D716, 'M', u'ε'),
    (0x1D717, 'M', u'θ'),
    (0x1D718, 'M', u'κ'),
    (0x1D719, 'M', u'φ'),
    (0x1D71A, 'M', u'ρ'),
    (0x1D71B, 'M', u'π'),
    (0x1D71C, 'M', u'α'),
    (0x1D71D, 'M', u'β'),
    (0x1D71E, 'M', u'γ'),
    (0x1D71F, 'M', u'δ'),
    (0x1D720, 'M', u'ε'),
    (0x1D721, 'M', u'ζ'),
    (0x1D722, 'M', u'η'),
    (0x1D723, 'M', u'θ'),
    (0x1D724, 'M', u'ι'),
    (0x1D725, 'M', u'κ'),
    (0x1D726, 'M', u'λ'),
    (0x1D727, 'M', u'μ'),
    (0x1D728, 'M', u'ν'),
    (0x1D729, 'M', u'ξ'),
    (0x1D72A, 'M', u'ο'),
    (0x1D72B, 'M', u'π'),
    (0x1D72C, 'M', u'ρ'),
    (0x1D72D, 'M', u'θ'),
    (0x1D72E, 'M', u'σ'),
    (0x1D72F, 'M', u'τ'),
    (0x1D730, 'M', u'υ'),
    (0x1D731, 'M', u'φ'),
    (0x1D732, 'M', u'χ'),
    (0x1D733, 'M', u'ψ'),
    (0x1D734, 'M', u'ω'),
    (0x1D735, 'M', u'∇'),
    (0x1D736, 'M', u'α'),
    (0x1D737, 'M', u'β'),
    (0x1D738, 'M', u'γ'),
    (0x1D739, 'M', u'δ'),
    (0x1D73A, 'M', u'ε'),
    (0x1D73B, 'M', u'ζ'),
    (0x1D73C, 'M', u'η'),
    (0x1D73D, 'M', u'θ'),
    (0x1D73E, 'M', u'ι'),
    (0x1D73F, 'M', u'κ'),
    (0x1D740, 'M', u'λ'),
    (0x1D741, 'M', u'μ'),
    (0x1D742, 'M', u'ν'),
    (0x1D743, 'M', u'ξ'),
    (0x1D744, 'M', u'ο'),
    (0x1D745, 'M', u'π'),
    (0x1D746, 'M', u'ρ'),
    (0x1D747, 'M', u'σ'),
    (0x1D749, 'M', u'τ'),
    (0x1D74A, 'M', u'υ'),
    (0x1D74B, 'M', u'φ'),
    (0x1D74C, 'M', u'χ'),
    (0x1D74D, 'M', u'ψ'),
    (0x1D74E, 'M', u'ω'),
    (0x1D74F, 'M', u'∂'),
    (0x1D750, 'M', u'ε'),
    (0x1D751, 'M', u'θ'),
    (0x1D752, 'M', u'κ'),
    (0x1D753, 'M', u'φ'),
    (0x1D754, 'M', u'ρ'),
    (0x1D755, 'M', u'π'),
    (0x1D756, 'M', u'α'),
    (0x1D757, 'M', u'β'),
    (0x1D758, 'M', u'γ'),
    (0x1D759, 'M', u'δ'),
    (0x1D75A, 'M', u'ε'),
    (0x1D75B, 'M', u'ζ'),
    (0x1D75C, 'M', u'η'),
    (0x1D75D, 'M', u'θ'),
    (0x1D75E, 'M', u'ι'),
    (0x1D75F, 'M', u'κ'),
    (0x1D760, 'M', u'λ'),
    (0x1D761, 'M', u'μ'),
    (0x1D762, 'M', u'ν'),
    (0x1D763, 'M', u'ξ'),
    (0x1D764, 'M', u'ο'),
    (0x1D765, 'M', u'π'),
    (0x1D766, 'M', u'ρ'),
    ]

def _seg_62():
    return [
    (0x1D767, 'M', u'θ'),
    (0x1D768, 'M', u'σ'),
    (0x1D769, 'M', u'τ'),
    (0x1D76A, 'M', u'υ'),
    (0x1D76B, 'M', u'φ'),
    (0x1D76C, 'M', u'χ'),
    (0x1D76D, 'M', u'ψ'),
    (0x1D76E, 'M', u'ω'),
    (0x1D76F, 'M', u'∇'),
    (0x1D770, 'M', u'α'),
    (0x1D771, 'M', u'β'),
    (0x1D772, 'M', u'γ'),
    (0x1D773, 'M', u'δ'),
    (0x1D774, 'M', u'ε'),
    (0x1D775, 'M', u'ζ'),
    (0x1D776, 'M', u'η'),
    (0x1D777, 'M', u'θ'),
    (0x1D778, 'M', u'ι'),
    (0x1D779, 'M', u'κ'),
    (0x1D77A, 'M', u'λ'),
    (0x1D77B, 'M', u'μ'),
    (0x1D77C, 'M', u'ν'),
    (0x1D77D, 'M', u'ξ'),
    (0x1D77E, 'M', u'ο'),
    (0x1D77F, 'M', u'π'),
    (0x1D780, 'M', u'ρ'),
    (0x1D781, 'M', u'σ'),
    (0x1D783, 'M', u'τ'),
    (0x1D784, 'M', u'υ'),
    (0x1D785, 'M', u'φ'),
    (0x1D786, 'M', u'χ'),
    (0x1D787, 'M', u'ψ'),
    (0x1D788, 'M', u'ω'),
    (0x1D789, 'M', u'∂'),
    (0x1D78A, 'M', u'ε'),
    (0x1D78B, 'M', u'θ'),
    (0x1D78C, 'M', u'κ'),
    (0x1D78D, 'M', u'φ'),
    (0x1D78E, 'M', u'ρ'),
    (0x1D78F, 'M', u'π'),
    (0x1D790, 'M', u'α'),
    (0x1D791, 'M', u'β'),
    (0x1D792, 'M', u'γ'),
    (0x1D793, 'M', u'δ'),
    (0x1D794, 'M', u'ε'),
    (0x1D795, 'M', u'ζ'),
    (0x1D796, 'M', u'η'),
    (0x1D797, 'M', u'θ'),
    (0x1D798, 'M', u'ι'),
    (0x1D799, 'M', u'κ'),
    (0x1D79A, 'M', u'λ'),
    (0x1D79B, 'M', u'μ'),
    (0x1D79C, 'M', u'ν'),
    (0x1D79D, 'M', u'ξ'),
    (0x1D79E, 'M', u'ο'),
    (0x1D79F, 'M', u'π'),
    (0x1D7A0, 'M', u'ρ'),
    (0x1D7A1, 'M', u'θ'),
    (0x1D7A2, 'M', u'σ'),
    (0x1D7A3, 'M', u'τ'),
    (0x1D7A4, 'M', u'υ'),
    (0x1D7A5, 'M', u'φ'),
    (0x1D7A6, 'M', u'χ'),
    (0x1D7A7, 'M', u'ψ'),
    (0x1D7A8, 'M', u'ω'),
    (0x1D7A9, 'M', u'∇'),
    (0x1D7AA, 'M', u'α'),
    (0x1D7AB, 'M', u'β'),
    (0x1D7AC, 'M', u'γ'),
    (0x1D7AD, 'M', u'δ'),
    (0x1D7AE, 'M', u'ε'),
    (0x1D7AF, 'M', u'ζ'),
    (0x1D7B0, 'M', u'η'),
    (0x1D7B1, 'M', u'θ'),
    (0x1D7B2, 'M', u'ι'),
    (0x1D7B3, 'M', u'κ'),
    (0x1D7B4, 'M', u'λ'),
    (0x1D7B5, 'M', u'μ'),
    (0x1D7B6, 'M', u'ν'),
    (0x1D7B7, 'M', u'ξ'),
    (0x1D7B8, 'M', u'ο'),
    (0x1D7B9, 'M', u'π'),
    (0x1D7BA, 'M', u'ρ'),
    (0x1D7BB, 'M', u'σ'),
    (0x1D7BD, 'M', u'τ'),
    (0x1D7BE, 'M', u'υ'),
    (0x1D7BF, 'M', u'φ'),
    (0x1D7C0, 'M', u'χ'),
    (0x1D7C1, 'M', u'ψ'),
    (0x1D7C2, 'M', u'ω'),
    (0x1D7C3, 'M', u'∂'),
    (0x1D7C4, 'M', u'ε'),
    (0x1D7C5, 'M', u'θ'),
    (0x1D7C6, 'M', u'κ'),
    (0x1D7C7, 'M', u'φ'),
    (0x1D7C8, 'M', u'ρ'),
    (0x1D7C9, 'M', u'π'),
    (0x1D7CA, 'M', u'ϝ'),
    (0x1D7CC, 'X'),
    (0x1D7CE, 'M', u'0'),
    ]

def _seg_63():
    return [
    (0x1D7CF, 'M', u'1'),
    (0x1D7D0, 'M', u'2'),
    (0x1D7D1, 'M', u'3'),
    (0x1D7D2, 'M', u'4'),
    (0x1D7D3, 'M', u'5'),
    (0x1D7D4, 'M', u'6'),
    (0x1D7D5, 'M', u'7'),
    (0x1D7D6, 'M', u'8'),
    (0x1D7D7, 'M', u'9'),
    (0x1D7D8, 'M', u'0'),
    (0x1D7D9, 'M', u'1'),
    (0x1D7DA, 'M', u'2'),
    (0x1D7DB, 'M', u'3'),
    (0x1D7DC, 'M', u'4'),
    (0x1D7DD, 'M', u'5'),
    (0x1D7DE, 'M', u'6'),
    (0x1D7DF, 'M', u'7'),
    (0x1D7E0, 'M', u'8'),
    (0x1D7E1, 'M', u'9'),
    (0x1D7E2, 'M', u'0'),
    (0x1D7E3, 'M', u'1'),
    (0x1D7E4, 'M', u'2'),
    (0x1D7E5, 'M', u'3'),
    (0x1D7E6, 'M', u'4'),
    (0x1D7E7, 'M', u'5'),
    (0x1D7E8, 'M', u'6'),
    (0x1D7E9, 'M', u'7'),
    (0x1D7EA, 'M', u'8'),
    (0x1D7EB, 'M', u'9'),
    (0x1D7EC, 'M', u'0'),
    (0x1D7ED, 'M', u'1'),
    (0x1D7EE, 'M', u'2'),
    (0x1D7EF, 'M', u'3'),
    (0x1D7F0, 'M', u'4'),
    (0x1D7F1, 'M', u'5'),
    (0x1D7F2, 'M', u'6'),
    (0x1D7F3, 'M', u'7'),
    (0x1D7F4, 'M', u'8'),
    (0x1D7F5, 'M', u'9'),
    (0x1D7F6, 'M', u'0'),
    (0x1D7F7, 'M', u'1'),
    (0x1D7F8, 'M', u'2'),
    (0x1D7F9, 'M', u'3'),
    (0x1D7FA, 'M', u'4'),
    (0x1D7FB, 'M', u'5'),
    (0x1D7FC, 'M', u'6'),
    (0x1D7FD, 'M', u'7'),
    (0x1D7FE, 'M', u'8'),
    (0x1D7FF, 'M', u'9'),
    (0x1D800, 'X'),
    (0x1EE00, 'M', u'ا'),
    (0x1EE01, 'M', u'ب'),
    (0x1EE02, 'M', u'ج'),
    (0x1EE03, 'M', u'د'),
    (0x1EE04, 'X'),
    (0x1EE05, 'M', u'و'),
    (0x1EE06, 'M', u'ز'),
    (0x1EE07, 'M', u'ح'),
    (0x1EE08, 'M', u'ط'),
    (0x1EE09, 'M', u'ي'),
    (0x1EE0A, 'M', u'ك'),
    (0x1EE0B, 'M', u'ل'),
    (0x1EE0C, 'M', u'م'),
    (0x1EE0D, 'M', u'ن'),
    (0x1EE0E, 'M', u'س'),
    (0x1EE0F, 'M', u'ع'),
    (0x1EE10, 'M', u'ف'),
    (0x1EE11, 'M', u'ص'),
    (0x1EE12, 'M', u'ق'),
    (0x1EE13, 'M', u'ر'),
    (0x1EE14, 'M', u'ش'),
    (0x1EE15, 'M', u'ت'),
    (0x1EE16, 'M', u'ث'),
    (0x1EE17, 'M', u'خ'),
    (0x1EE18, 'M', u'ذ'),
    (0x1EE19, 'M', u'ض'),
    (0x1EE1A, 'M', u'ظ'),
    (0x1EE1B, 'M', u'غ'),
    (0x1EE1C, 'M', u'ٮ'),
    (0x1EE1D, 'M', u'ں'),
    (0x1EE1E, 'M', u'ڡ'),
    (0x1EE1F, 'M', u'ٯ'),
    (0x1EE20, 'X'),
    (0x1EE21, 'M', u'ب'),
    (0x1EE22, 'M', u'ج'),
    (0x1EE23, 'X'),
    (0x1EE24, 'M', u'ه'),
    (0x1EE25, 'X'),
    (0x1EE27, 'M', u'ح'),
    (0x1EE28, 'X'),
    (0x1EE29, 'M', u'ي'),
    (0x1EE2A, 'M', u'ك'),
    (0x1EE2B, 'M', u'ل'),
    (0x1EE2C, 'M', u'م'),
    (0x1EE2D, 'M', u'ن'),
    (0x1EE2E, 'M', u'س'),
    (0x1EE2F, 'M', u'ع'),
    (0x1EE30, 'M', u'ف'),
    (0x1EE31, 'M', u'ص'),
    (0x1EE32, 'M', u'ق'),
    ]

def _seg_64():
    return [
    (0x1EE33, 'X'),
    (0x1EE34, 'M', u'ش'),
    (0x1EE35, 'M', u'ت'),
    (0x1EE36, 'M', u'ث'),
    (0x1EE37, 'M', u'خ'),
    (0x1EE38, 'X'),
    (0x1EE39, 'M', u'ض'),
    (0x1EE3A, 'X'),
    (0x1EE3B, 'M', u'غ'),
    (0x1EE3C, 'X'),
    (0x1EE42, 'M', u'ج'),
    (0x1EE43, 'X'),
    (0x1EE47, 'M', u'ح'),
    (0x1EE48, 'X'),
    (0x1EE49, 'M', u'ي'),
    (0x1EE4A, 'X'),
    (0x1EE4B, 'M', u'ل'),
    (0x1EE4C, 'X'),
    (0x1EE4D, 'M', u'ن'),
    (0x1EE4E, 'M', u'س'),
    (0x1EE4F, 'M', u'ع'),
    (0x1EE50, 'X'),
    (0x1EE51, 'M', u'ص'),
    (0x1EE52, 'M', u'ق'),
    (0x1EE53, 'X'),
    (0x1EE54, 'M', u'ش'),
    (0x1EE55, 'X'),
    (0x1EE57, 'M', u'خ'),
    (0x1EE58, 'X'),
    (0x1EE59, 'M', u'ض'),
    (0x1EE5A, 'X'),
    (0x1EE5B, 'M', u'غ'),
    (0x1EE5C, 'X'),
    (0x1EE5D, 'M', u'ں'),
    (0x1EE5E, 'X'),
    (0x1EE5F, 'M', u'ٯ'),
    (0x1EE60, 'X'),
    (0x1EE61, 'M', u'ب'),
    (0x1EE62, 'M', u'ج'),
    (0x1EE63, 'X'),
    (0x1EE64, 'M', u'ه'),
    (0x1EE65, 'X'),
    (0x1EE67, 'M', u'ح'),
    (0x1EE68, 'M', u'ط'),
    (0x1EE69, 'M', u'ي'),
    (0x1EE6A, 'M', u'ك'),
    (0x1EE6B, 'X'),
    (0x1EE6C, 'M', u'م'),
    (0x1EE6D, 'M', u'ن'),
    (0x1EE6E, 'M', u'س'),
    (0x1EE6F, 'M', u'ع'),
    (0x1EE70, 'M', u'ف'),
    (0x1EE71, 'M', u'ص'),
    (0x1EE72, 'M', u'ق'),
    (0x1EE73, 'X'),
    (0x1EE74, 'M', u'ش'),
    (0x1EE75, 'M', u'ت'),
    (0x1EE76, 'M', u'ث'),
    (0x1EE77, 'M', u'خ'),
    (0x1EE78, 'X'),
    (0x1EE79, 'M', u'ض'),
    (0x1EE7A, 'M', u'ظ'),
    (0x1EE7B, 'M', u'غ'),
    (0x1EE7C, 'M', u'ٮ'),
    (0x1EE7D, 'X'),
    (0x1EE7E, 'M', u'ڡ'),
    (0x1EE7F, 'X'),
    (0x1EE80, 'M', u'ا'),
    (0x1EE81, 'M', u'ب'),
    (0x1EE82, 'M', u'ج'),
    (0x1EE83, 'M', u'د'),
    (0x1EE84, 'M', u'ه'),
    (0x1EE85, 'M', u'و'),
    (0x1EE86, 'M', u'ز'),
    (0x1EE87, 'M', u'ح'),
    (0x1EE88, 'M', u'ط'),
    (0x1EE89, 'M', u'ي'),
    (0x1EE8A, 'X'),
    (0x1EE8B, 'M', u'ل'),
    (0x1EE8C, 'M', u'م'),
    (0x1EE8D, 'M', u'ن'),
    (0x1EE8E, 'M', u'س'),
    (0x1EE8F, 'M', u'ع'),
    (0x1EE90, 'M', u'ف'),
    (0x1EE91, 'M', u'ص'),
    (0x1EE92, 'M', u'ق'),
    (0x1EE93, 'M', u'ر'),
    (0x1EE94, 'M', u'ش'),
    (0x1EE95, 'M', u'ت'),
    (0x1EE96, 'M', u'ث'),
    (0x1EE97, 'M', u'خ'),
    (0x1EE98, 'M', u'ذ'),
    (0x1EE99, 'M', u'ض'),
    (0x1EE9A, 'M', u'ظ'),
    (0x1EE9B, 'M', u'غ'),
    (0x1EE9C, 'X'),
    (0x1EEA1, 'M', u'ب'),
    (0x1EEA2, 'M', u'ج'),
    (0x1EEA3, 'M', u'د'),
    (0x1EEA4, 'X'),
    ]

def _seg_65():
    return [
    (0x1EEA5, 'M', u'و'),
    (0x1EEA6, 'M', u'ز'),
    (0x1EEA7, 'M', u'ح'),
    (0x1EEA8, 'M', u'ط'),
    (0x1EEA9, 'M', u'ي'),
    (0x1EEAA, 'X'),
    (0x1EEAB, 'M', u'ل'),
    (0x1EEAC, 'M', u'م'),
    (0x1EEAD, 'M', u'ن'),
    (0x1EEAE, 'M', u'س'),
    (0x1EEAF, 'M', u'ع'),
    (0x1EEB0, 'M', u'ف'),
    (0x1EEB1, 'M', u'ص'),
    (0x1EEB2, 'M', u'ق'),
    (0x1EEB3, 'M', u'ر'),
    (0x1EEB4, 'M', u'ش'),
    (0x1EEB5, 'M', u'ت'),
    (0x1EEB6, 'M', u'ث'),
    (0x1EEB7, 'M', u'خ'),
    (0x1EEB8, 'M', u'ذ'),
    (0x1EEB9, 'M', u'ض'),
    (0x1EEBA, 'M', u'ظ'),
    (0x1EEBB, 'M', u'غ'),
    (0x1EEBC, 'X'),
    (0x1EEF0, 'V'),
    (0x1EEF2, 'X'),
    (0x1F000, 'V'),
    (0x1F02C, 'X'),
    (0x1F030, 'V'),
    (0x1F094, 'X'),
    (0x1F0A0, 'V'),
    (0x1F0AF, 'X'),
    (0x1F0B1, 'V'),
    (0x1F0BF, 'X'),
    (0x1F0C1, 'V'),
    (0x1F0D0, 'X'),
    (0x1F0D1, 'V'),
    (0x1F0E0, 'X'),
    (0x1F101, '3', u'0,'),
    (0x1F102, '3', u'1,'),
    (0x1F103, '3', u'2,'),
    (0x1F104, '3', u'3,'),
    (0x1F105, '3', u'4,'),
    (0x1F106, '3', u'5,'),
    (0x1F107, '3', u'6,'),
    (0x1F108, '3', u'7,'),
    (0x1F109, '3', u'8,'),
    (0x1F10A, '3', u'9,'),
    (0x1F10B, 'X'),
    (0x1F110, '3', u'(a)'),
    (0x1F111, '3', u'(b)'),
    (0x1F112, '3', u'(c)'),
    (0x1F113, '3', u'(d)'),
    (0x1F114, '3', u'(e)'),
    (0x1F115, '3', u'(f)'),
    (0x1F116, '3', u'(g)'),
    (0x1F117, '3', u'(h)'),
    (0x1F118, '3', u'(i)'),
    (0x1F119, '3', u'(j)'),
    (0x1F11A, '3', u'(k)'),
    (0x1F11B, '3', u'(l)'),
    (0x1F11C, '3', u'(m)'),
    (0x1F11D, '3', u'(n)'),
    (0x1F11E, '3', u'(o)'),
    (0x1F11F, '3', u'(p)'),
    (0x1F120, '3', u'(q)'),
    (0x1F121, '3', u'(r)'),
    (0x1F122, '3', u'(s)'),
    (0x1F123, '3', u'(t)'),
    (0x1F124, '3', u'(u)'),
    (0x1F125, '3', u'(v)'),
    (0x1F126, '3', u'(w)'),
    (0x1F127, '3', u'(x)'),
    (0x1F128, '3', u'(y)'),
    (0x1F129, '3', u'(z)'),
    (0x1F12A, 'M', u'〔s〕'),
    (0x1F12B, 'M', u'c'),
    (0x1F12C, 'M', u'r'),
    (0x1F12D, 'M', u'cd'),
    (0x1F12E, 'M', u'wz'),
    (0x1F12F, 'X'),
    (0x1F130, 'M', u'a'),
    (0x1F131, 'M', u'b'),
    (0x1F132, 'M', u'c'),
    (0x1F133, 'M', u'd'),
    (0x1F134, 'M', u'e'),
    (0x1F135, 'M', u'f'),
    (0x1F136, 'M', u'g'),
    (0x1F137, 'M', u'h'),
    (0x1F138, 'M', u'i'),
    (0x1F139, 'M', u'j'),
    (0x1F13A, 'M', u'k'),
    (0x1F13B, 'M', u'l'),
    (0x1F13C, 'M', u'm'),
    (0x1F13D, 'M', u'n'),
    (0x1F13E, 'M', u'o'),
    (0x1F13F, 'M', u'p'),
    (0x1F140, 'M', u'q'),
    (0x1F141, 'M', u'r'),
    (0x1F142, 'M', u's'),
    ]

def _seg_66():
    return [
    (0x1F143, 'M', u't'),
    (0x1F144, 'M', u'u'),
    (0x1F145, 'M', u'v'),
    (0x1F146, 'M', u'w'),
    (0x1F147, 'M', u'x'),
    (0x1F148, 'M', u'y'),
    (0x1F149, 'M', u'z'),
    (0x1F14A, 'M', u'hv'),
    (0x1F14B, 'M', u'mv'),
    (0x1F14C, 'M', u'sd'),
    (0x1F14D, 'M', u'ss'),
    (0x1F14E, 'M', u'ppv'),
    (0x1F14F, 'M', u'wc'),
    (0x1F150, 'V'),
    (0x1F16A, 'M', u'mc'),
    (0x1F16B, 'M', u'md'),
    (0x1F16C, 'X'),
    (0x1F170, 'V'),
    (0x1F190, 'M', u'dj'),
    (0x1F191, 'V'),
    (0x1F19B, 'X'),
    (0x1F1E6, 'V'),
    (0x1F200, 'M', u'ほか'),
    (0x1F201, 'M', u'ココ'),
    (0x1F202, 'M', u'サ'),
    (0x1F203, 'X'),
    (0x1F210, 'M', u'手'),
    (0x1F211, 'M', u'字'),
    (0x1F212, 'M', u'双'),
    (0x1F213, 'M', u'デ'),
    (0x1F214, 'M', u'二'),
    (0x1F215, 'M', u'多'),
    (0x1F216, 'M', u'解'),
    (0x1F217, 'M', u'天'),
    (0x1F218, 'M', u'交'),
    (0x1F219, 'M', u'映'),
    (0x1F21A, 'M', u'無'),
    (0x1F21B, 'M', u'料'),
    (0x1F21C, 'M', u'前'),
    (0x1F21D, 'M', u'後'),
    (0x1F21E, 'M', u'再'),
    (0x1F21F, 'M', u'新'),
    (0x1F220, 'M', u'初'),
    (0x1F221, 'M', u'終'),
    (0x1F222, 'M', u'生'),
    (0x1F223, 'M', u'販'),
    (0x1F224, 'M', u'声'),
    (0x1F225, 'M', u'吹'),
    (0x1F226, 'M', u'演'),
    (0x1F227, 'M', u'投'),
    (0x1F228, 'M', u'捕'),
    (0x1F229, 'M', u'一'),
    (0x1F22A, 'M', u'三'),
    (0x1F22B, 'M', u'遊'),
    (0x1F22C, 'M', u'左'),
    (0x1F22D, 'M', u'中'),
    (0x1F22E, 'M', u'右'),
    (0x1F22F, 'M', u'指'),
    (0x1F230, 'M', u'走'),
    (0x1F231, 'M', u'打'),
    (0x1F232, 'M', u'禁'),
    (0x1F233, 'M', u'空'),
    (0x1F234, 'M', u'合'),
    (0x1F235, 'M', u'満'),
    (0x1F236, 'M', u'有'),
    (0x1F237, 'M', u'月'),
    (0x1F238, 'M', u'申'),
    (0x1F239, 'M', u'割'),
    (0x1F23A, 'M', u'営'),
    (0x1F23B, 'X'),
    (0x1F240, 'M', u'〔本〕'),
    (0x1F241, 'M', u'〔三〕'),
    (0x1F242, 'M', u'〔二〕'),
    (0x1F243, 'M', u'〔安〕'),
    (0x1F244, 'M', u'〔点〕'),
    (0x1F245, 'M', u'〔打〕'),
    (0x1F246, 'M', u'〔盗〕'),
    (0x1F247, 'M', u'〔勝〕'),
    (0x1F248, 'M', u'〔敗〕'),
    (0x1F249, 'X'),
    (0x1F250, 'M', u'得'),
    (0x1F251, 'M', u'可'),
    (0x1F252, 'X'),
    (0x1F300, 'V'),
    (0x1F321, 'X'),
    (0x1F330, 'V'),
    (0x1F336, 'X'),
    (0x1F337, 'V'),
    (0x1F37D, 'X'),
    (0x1F380, 'V'),
    (0x1F394, 'X'),
    (0x1F3A0, 'V'),
    (0x1F3C5, 'X'),
    (0x1F3C6, 'V'),
    (0x1F3CB, 'X'),
    (0x1F3E0, 'V'),
    (0x1F3F1, 'X'),
    (0x1F400, 'V'),
    (0x1F43F, 'X'),
    (0x1F440, 'V'),
    ]

def _seg_67():
    return [
    (0x1F441, 'X'),
    (0x1F442, 'V'),
    (0x1F4F8, 'X'),
    (0x1F4F9, 'V'),
    (0x1F4FD, 'X'),
    (0x1F500, 'V'),
    (0x1F53E, 'X'),
    (0x1F540, 'V'),
    (0x1F544, 'X'),
    (0x1F550, 'V'),
    (0x1F568, 'X'),
    (0x1F5FB, 'V'),
    (0x1F641, 'X'),
    (0x1F645, 'V'),
    (0x1F650, 'X'),
    (0x1F680, 'V'),
    (0x1F6C6, 'X'),
    (0x1F700, 'V'),
    (0x1F774, 'X'),
    (0x20000, 'V'),
    (0x2A6D7, 'X'),
    (0x2A700, 'V'),
    (0x2B735, 'X'),
    (0x2B740, 'V'),
    (0x2B81E, 'X'),
    (0x2F800, 'M', u'丽'),
    (0x2F801, 'M', u'丸'),
    (0x2F802, 'M', u'乁'),
    (0x2F803, 'M', u'𠄢'),
    (0x2F804, 'M', u'你'),
    (0x2F805, 'M', u'侮'),
    (0x2F806, 'M', u'侻'),
    (0x2F807, 'M', u'倂'),
    (0x2F808, 'M', u'偺'),
    (0x2F809, 'M', u'備'),
    (0x2F80A, 'M', u'僧'),
    (0x2F80B, 'M', u'像'),
    (0x2F80C, 'M', u'㒞'),
    (0x2F80D, 'M', u'𠘺'),
    (0x2F80E, 'M', u'免'),
    (0x2F80F, 'M', u'兔'),
    (0x2F810, 'M', u'兤'),
    (0x2F811, 'M', u'具'),
    (0x2F812, 'M', u'𠔜'),
    (0x2F813, 'M', u'㒹'),
    (0x2F814, 'M', u'內'),
    (0x2F815, 'M', u'再'),
    (0x2F816, 'M', u'𠕋'),
    (0x2F817, 'M', u'冗'),
    (0x2F818, 'M', u'冤'),
    (0x2F819, 'M', u'仌'),
    (0x2F81A, 'M', u'冬'),
    (0x2F81B, 'M', u'况'),
    (0x2F81C, 'M', u'𩇟'),
    (0x2F81D, 'M', u'凵'),
    (0x2F81E, 'M', u'刃'),
    (0x2F81F, 'M', u'㓟'),
    (0x2F820, 'M', u'刻'),
    (0x2F821, 'M', u'剆'),
    (0x2F822, 'M', u'割'),
    (0x2F823, 'M', u'剷'),
    (0x2F824, 'M', u'㔕'),
    (0x2F825, 'M', u'勇'),
    (0x2F826, 'M', u'勉'),
    (0x2F827, 'M', u'勤'),
    (0x2F828, 'M', u'勺'),
    (0x2F829, 'M', u'包'),
    (0x2F82A, 'M', u'匆'),
    (0x2F82B, 'M', u'北'),
    (0x2F82C, 'M', u'卉'),
    (0x2F82D, 'M', u'卑'),
    (0x2F82E, 'M', u'博'),
    (0x2F82F, 'M', u'即'),
    (0x2F830, 'M', u'卽'),
    (0x2F831, 'M', u'卿'),
    (0x2F834, 'M', u'𠨬'),
    (0x2F835, 'M', u'灰'),
    (0x2F836, 'M', u'及'),
    (0x2F837, 'M', u'叟'),
    (0x2F838, 'M', u'𠭣'),
    (0x2F839, 'M', u'叫'),
    (0x2F83A, 'M', u'叱'),
    (0x2F83B, 'M', u'吆'),
    (0x2F83C, 'M', u'咞'),
    (0x2F83D, 'M', u'吸'),
    (0x2F83E, 'M', u'呈'),
    (0x2F83F, 'M', u'周'),
    (0x2F840, 'M', u'咢'),
    (0x2F841, 'M', u'哶'),
    (0x2F842, 'M', u'唐'),
    (0x2F843, 'M', u'啓'),
    (0x2F844, 'M', u'啣'),
    (0x2F845, 'M', u'善'),
    (0x2F847, 'M', u'喙'),
    (0x2F848, 'M', u'喫'),
    (0x2F849, 'M', u'喳'),
    (0x2F84A, 'M', u'嗂'),
    (0x2F84B, 'M', u'圖'),
    (0x2F84C, 'M', u'嘆'),
    (0x2F84D, 'M', u'圗'),
    ]

def _seg_68():
    return [
    (0x2F84E, 'M', u'噑'),
    (0x2F84F, 'M', u'噴'),
    (0x2F850, 'M', u'切'),
    (0x2F851, 'M', u'壮'),
    (0x2F852, 'M', u'城'),
    (0x2F853, 'M', u'埴'),
    (0x2F854, 'M', u'堍'),
    (0x2F855, 'M', u'型'),
    (0x2F856, 'M', u'堲'),
    (0x2F857, 'M', u'報'),
    (0x2F858, 'M', u'墬'),
    (0x2F859, 'M', u'𡓤'),
    (0x2F85A, 'M', u'売'),
    (0x2F85B, 'M', u'壷'),
    (0x2F85C, 'M', u'夆'),
    (0x2F85D, 'M', u'多'),
    (0x2F85E, 'M', u'夢'),
    (0x2F85F, 'M', u'奢'),
    (0x2F860, 'M', u'𡚨'),
    (0x2F861, 'M', u'𡛪'),
    (0x2F862, 'M', u'姬'),
    (0x2F863, 'M', u'娛'),
    (0x2F864, 'M', u'娧'),
    (0x2F865, 'M', u'姘'),
    (0x2F866, 'M', u'婦'),
    (0x2F867, 'M', u'㛮'),
    (0x2F868, 'X'),
    (0x2F869, 'M', u'嬈'),
    (0x2F86A, 'M', u'嬾'),
    (0x2F86C, 'M', u'𡧈'),
    (0x2F86D, 'M', u'寃'),
    (0x2F86E, 'M', u'寘'),
    (0x2F86F, 'M', u'寧'),
    (0x2F870, 'M', u'寳'),
    (0x2F871, 'M', u'𡬘'),
    (0x2F872, 'M', u'寿'),
    (0x2F873, 'M', u'将'),
    (0x2F874, 'X'),
    (0x2F875, 'M', u'尢'),
    (0x2F876, 'M', u'㞁'),
    (0x2F877, 'M', u'屠'),
    (0x2F878, 'M', u'屮'),
    (0x2F879, 'M', u'峀'),
    (0x2F87A, 'M', u'岍'),
    (0x2F87B, 'M', u'𡷤'),
    (0x2F87C, 'M', u'嵃'),
    (0x2F87D, 'M', u'𡷦'),
    (0x2F87E, 'M', u'嵮'),
    (0x2F87F, 'M', u'嵫'),
    (0x2F880, 'M', u'嵼'),
    (0x2F881, 'M', u'巡'),
    (0x2F882, 'M', u'巢'),
    (0x2F883, 'M', u'㠯'),
    (0x2F884, 'M', u'巽'),
    (0x2F885, 'M', u'帨'),
    (0x2F886, 'M', u'帽'),
    (0x2F887, 'M', u'幩'),
    (0x2F888, 'M', u'㡢'),
    (0x2F889, 'M', u'𢆃'),
    (0x2F88A, 'M', u'㡼'),
    (0x2F88B, 'M', u'庰'),
    (0x2F88C, 'M', u'庳'),
    (0x2F88D, 'M', u'庶'),
    (0x2F88E, 'M', u'廊'),
    (0x2F88F, 'M', u'𪎒'),
    (0x2F890, 'M', u'廾'),
    (0x2F891, 'M', u'𢌱'),
    (0x2F893, 'M', u'舁'),
    (0x2F894, 'M', u'弢'),
    (0x2F896, 'M', u'㣇'),
    (0x2F897, 'M', u'𣊸'),
    (0x2F898, 'M', u'𦇚'),
    (0x2F899, 'M', u'形'),
    (0x2F89A, 'M', u'彫'),
    (0x2F89B, 'M', u'㣣'),
    (0x2F89C, 'M', u'徚'),
    (0x2F89D, 'M', u'忍'),
    (0x2F89E, 'M', u'志'),
    (0x2F89F, 'M', u'忹'),
    (0x2F8A0, 'M', u'悁'),
    (0x2F8A1, 'M', u'㤺'),
    (0x2F8A2, 'M', u'㤜'),
    (0x2F8A3, 'M', u'悔'),
    (0x2F8A4, 'M', u'𢛔'),
    (0x2F8A5, 'M', u'惇'),
    (0x2F8A6, 'M', u'慈'),
    (0x2F8A7, 'M', u'慌'),
    (0x2F8A8, 'M', u'慎'),
    (0x2F8A9, 'M', u'慌'),
    (0x2F8AA, 'M', u'慺'),
    (0x2F8AB, 'M', u'憎'),
    (0x2F8AC, 'M', u'憲'),
    (0x2F8AD, 'M', u'憤'),
    (0x2F8AE, 'M', u'憯'),
    (0x2F8AF, 'M', u'懞'),
    (0x2F8B0, 'M', u'懲'),
    (0x2F8B1, 'M', u'懶'),
    (0x2F8B2, 'M', u'成'),
    (0x2F8B3, 'M', u'戛'),
    (0x2F8B4, 'M', u'扝'),
    ]

def _seg_69():
    return [
    (0x2F8B5, 'M', u'抱'),
    (0x2F8B6, 'M', u'拔'),
    (0x2F8B7, 'M', u'捐'),
    (0x2F8B8, 'M', u'𢬌'),
    (0x2F8B9, 'M', u'挽'),
    (0x2F8BA, 'M', u'拼'),
    (0x2F8BB, 'M', u'捨'),
    (0x2F8BC, 'M', u'掃'),
    (0x2F8BD, 'M', u'揤'),
    (0x2F8BE, 'M', u'𢯱'),
    (0x2F8BF, 'M', u'搢'),
    (0x2F8C0, 'M', u'揅'),
    (0x2F8C1, 'M', u'掩'),
    (0x2F8C2, 'M', u'㨮'),
    (0x2F8C3, 'M', u'摩'),
    (0x2F8C4, 'M', u'摾'),
    (0x2F8C5, 'M', u'撝'),
    (0x2F8C6, 'M', u'摷'),
    (0x2F8C7, 'M', u'㩬'),
    (0x2F8C8, 'M', u'敏'),
    (0x2F8C9, 'M', u'敬'),
    (0x2F8CA, 'M', u'𣀊'),
    (0x2F8CB, 'M', u'旣'),
    (0x2F8CC, 'M', u'書'),
    (0x2F8CD, 'M', u'晉'),
    (0x2F8CE, 'M', u'㬙'),
    (0x2F8CF, 'M', u'暑'),
    (0x2F8D0, 'M', u'㬈'),
    (0x2F8D1, 'M', u'㫤'),
    (0x2F8D2, 'M', u'冒'),
    (0x2F8D3, 'M', u'冕'),
    (0x2F8D4, 'M', u'最'),
    (0x2F8D5, 'M', u'暜'),
    (0x2F8D6, 'M', u'肭'),
    (0x2F8D7, 'M', u'䏙'),
    (0x2F8D8, 'M', u'朗'),
    (0x2F8D9, 'M', u'望'),
    (0x2F8DA, 'M', u'朡'),
    (0x2F8DB, 'M', u'杞'),
    (0x2F8DC, 'M', u'杓'),
    (0x2F8DD, 'M', u'𣏃'),
    (0x2F8DE, 'M', u'㭉'),
    (0x2F8DF, 'M', u'柺'),
    (0x2F8E0, 'M', u'枅'),
    (0x2F8E1, 'M', u'桒'),
    (0x2F8E2, 'M', u'梅'),
    (0x2F8E3, 'M', u'𣑭'),
    (0x2F8E4, 'M', u'梎'),
    (0x2F8E5, 'M', u'栟'),
    (0x2F8E6, 'M', u'椔'),
    (0x2F8E7, 'M', u'㮝'),
    (0x2F8E8, 'M', u'楂'),
    (0x2F8E9, 'M', u'榣'),
    (0x2F8EA, 'M', u'槪'),
    (0x2F8EB, 'M', u'檨'),
    (0x2F8EC, 'M', u'𣚣'),
    (0x2F8ED, 'M', u'櫛'),
    (0x2F8EE, 'M', u'㰘'),
    (0x2F8EF, 'M', u'次'),
    (0x2F8F0, 'M', u'𣢧'),
    (0x2F8F1, 'M', u'歔'),
    (0x2F8F2, 'M', u'㱎'),
    (0x2F8F3, 'M', u'歲'),
    (0x2F8F4, 'M', u'殟'),
    (0x2F8F5, 'M', u'殺'),
    (0x2F8F6, 'M', u'殻'),
    (0x2F8F7, 'M', u'𣪍'),
    (0x2F8F8, 'M', u'𡴋'),
    (0x2F8F9, 'M', u'𣫺'),
    (0x2F8FA, 'M', u'汎'),
    (0x2F8FB, 'M', u'𣲼'),
    (0x2F8FC, 'M', u'沿'),
    (0x2F8FD, 'M', u'泍'),
    (0x2F8FE, 'M', u'汧'),
    (0x2F8FF, 'M', u'洖'),
    (0x2F900, 'M', u'派'),
    (0x2F901, 'M', u'海'),
    (0x2F902, 'M', u'流'),
    (0x2F903, 'M', u'浩'),
    (0x2F904, 'M', u'浸'),
    (0x2F905, 'M', u'涅'),
    (0x2F906, 'M', u'𣴞'),
    (0x2F907, 'M', u'洴'),
    (0x2F908, 'M', u'港'),
    (0x2F909, 'M', u'湮'),
    (0x2F90A, 'M', u'㴳'),
    (0x2F90B, 'M', u'滋'),
    (0x2F90C, 'M', u'滇'),
    (0x2F90D, 'M', u'𣻑'),
    (0x2F90E, 'M', u'淹'),
    (0x2F90F, 'M', u'潮'),
    (0x2F910, 'M', u'𣽞'),
    (0x2F911, 'M', u'𣾎'),
    (0x2F912, 'M', u'濆'),
    (0x2F913, 'M', u'瀹'),
    (0x2F914, 'M', u'瀞'),
    (0x2F915, 'M', u'瀛'),
    (0x2F916, 'M', u'㶖'),
    (0x2F917, 'M', u'灊'),
    (0x2F918, 'M', u'災'),
    ]

def _seg_70():
    return [
    (0x2F919, 'M', u'灷'),
    (0x2F91A, 'M', u'炭'),
    (0x2F91B, 'M', u'𠔥'),
    (0x2F91C, 'M', u'煅'),
    (0x2F91D, 'M', u'𤉣'),
    (0x2F91E, 'M', u'熜'),
    (0x2F91F, 'X'),
    (0x2F920, 'M', u'爨'),
    (0x2F921, 'M', u'爵'),
    (0x2F922, 'M', u'牐'),
    (0x2F923, 'M', u'𤘈'),
    (0x2F924, 'M', u'犀'),
    (0x2F925, 'M', u'犕'),
    (0x2F926, 'M', u'𤜵'),
    (0x2F927, 'M', u'𤠔'),
    (0x2F928, 'M', u'獺'),
    (0x2F929, 'M', u'王'),
    (0x2F92A, 'M', u'㺬'),
    (0x2F92B, 'M', u'玥'),
    (0x2F92C, 'M', u'㺸'),
    (0x2F92E, 'M', u'瑇'),
    (0x2F92F, 'M', u'瑜'),
    (0x2F930, 'M', u'瑱'),
    (0x2F931, 'M', u'璅'),
    (0x2F932, 'M', u'瓊'),
    (0x2F933, 'M', u'㼛'),
    (0x2F934, 'M', u'甤'),
    (0x2F935, 'M', u'𤰶'),
    (0x2F936, 'M', u'甾'),
    (0x2F937, 'M', u'𤲒'),
    (0x2F938, 'M', u'異'),
    (0x2F939, 'M', u'𢆟'),
    (0x2F93A, 'M', u'瘐'),
    (0x2F93B, 'M', u'𤾡'),
    (0x2F93C, 'M', u'𤾸'),
    (0x2F93D, 'M', u'𥁄'),
    (0x2F93E, 'M', u'㿼'),
    (0x2F93F, 'M', u'䀈'),
    (0x2F940, 'M', u'直'),
    (0x2F941, 'M', u'𥃳'),
    (0x2F942, 'M', u'𥃲'),
    (0x2F943, 'M', u'𥄙'),
    (0x2F944, 'M', u'𥄳'),
    (0x2F945, 'M', u'眞'),
    (0x2F946, 'M', u'真'),
    (0x2F948, 'M', u'睊'),
    (0x2F949, 'M', u'䀹'),
    (0x2F94A, 'M', u'瞋'),
    (0x2F94B, 'M', u'䁆'),
    (0x2F94C, 'M', u'䂖'),
    (0x2F94D, 'M', u'𥐝'),
    (0x2F94E, 'M', u'硎'),
    (0x2F94F, 'M', u'碌'),
    (0x2F950, 'M', u'磌'),
    (0x2F951, 'M', u'䃣'),
    (0x2F952, 'M', u'𥘦'),
    (0x2F953, 'M', u'祖'),
    (0x2F954, 'M', u'𥚚'),
    (0x2F955, 'M', u'𥛅'),
    (0x2F956, 'M', u'福'),
    (0x2F957, 'M', u'秫'),
    (0x2F958, 'M', u'䄯'),
    (0x2F959, 'M', u'穀'),
    (0x2F95A, 'M', u'穊'),
    (0x2F95B, 'M', u'穏'),
    (0x2F95C, 'M', u'𥥼'),
    (0x2F95D, 'M', u'𥪧'),
    (0x2F95F, 'X'),
    (0x2F960, 'M', u'䈂'),
    (0x2F961, 'M', u'𥮫'),
    (0x2F962, 'M', u'篆'),
    (0x2F963, 'M', u'築'),
    (0x2F964, 'M', u'䈧'),
    (0x2F965, 'M', u'𥲀'),
    (0x2F966, 'M', u'糒'),
    (0x2F967, 'M', u'䊠'),
    (0x2F968, 'M', u'糨'),
    (0x2F969, 'M', u'糣'),
    (0x2F96A, 'M', u'紀'),
    (0x2F96B, 'M', u'𥾆'),
    (0x2F96C, 'M', u'絣'),
    (0x2F96D, 'M', u'䌁'),
    (0x2F96E, 'M', u'緇'),
    (0x2F96F, 'M', u'縂'),
    (0x2F970, 'M', u'繅'),
    (0x2F971, 'M', u'䌴'),
    (0x2F972, 'M', u'𦈨'),
    (0x2F973, 'M', u'𦉇'),
    (0x2F974, 'M', u'䍙'),
    (0x2F975, 'M', u'𦋙'),
    (0x2F976, 'M', u'罺'),
    (0x2F977, 'M', u'𦌾'),
    (0x2F978, 'M', u'羕'),
    (0x2F979, 'M', u'翺'),
    (0x2F97A, 'M', u'者'),
    (0x2F97B, 'M', u'𦓚'),
    (0x2F97C, 'M', u'𦔣'),
    (0x2F97D, 'M', u'聠'),
    (0x2F97E, 'M', u'𦖨'),
    (0x2F97F, 'M', u'聰'),
    ]

def _seg_71():
    return [
    (0x2F980, 'M', u'𣍟'),
    (0x2F981, 'M', u'䏕'),
    (0x2F982, 'M', u'育'),
    (0x2F983, 'M', u'脃'),
    (0x2F984, 'M', u'䐋'),
    (0x2F985, 'M', u'脾'),
    (0x2F986, 'M', u'媵'),
    (0x2F987, 'M', u'𦞧'),
    (0x2F988, 'M', u'𦞵'),
    (0x2F989, 'M', u'𣎓'),
    (0x2F98A, 'M', u'𣎜'),
    (0x2F98B, 'M', u'舁'),
    (0x2F98C, 'M', u'舄'),
    (0x2F98D, 'M', u'辞'),
    (0x2F98E, 'M', u'䑫'),
    (0x2F98F, 'M', u'芑'),
    (0x2F990, 'M', u'芋'),
    (0x2F991, 'M', u'芝'),
    (0x2F992, 'M', u'劳'),
    (0x2F993, 'M', u'花'),
    (0x2F994, 'M', u'芳'),
    (0x2F995, 'M', u'芽'),
    (0x2F996, 'M', u'苦'),
    (0x2F997, 'M', u'𦬼'),
    (0x2F998, 'M', u'若'),
    (0x2F999, 'M', u'茝'),
    (0x2F99A, 'M', u'荣'),
    (0x2F99B, 'M', u'莭'),
    (0x2F99C, 'M', u'茣'),
    (0x2F99D, 'M', u'莽'),
    (0x2F99E, 'M', u'菧'),
    (0x2F99F, 'M', u'著'),
    (0x2F9A0, 'M', u'荓'),
    (0x2F9A1, 'M', u'菊'),
    (0x2F9A2, 'M', u'菌'),
    (0x2F9A3, 'M', u'菜'),
    (0x2F9A4, 'M', u'𦰶'),
    (0x2F9A5, 'M', u'𦵫'),
    (0x2F9A6, 'M', u'𦳕'),
    (0x2F9A7, 'M', u'䔫'),
    (0x2F9A8, 'M', u'蓱'),
    (0x2F9A9, 'M', u'蓳'),
    (0x2F9AA, 'M', u'蔖'),
    (0x2F9AB, 'M', u'𧏊'),
    (0x2F9AC, 'M', u'蕤'),
    (0x2F9AD, 'M', u'𦼬'),
    (0x2F9AE, 'M', u'䕝'),
    (0x2F9AF, 'M', u'䕡'),
    (0x2F9B0, 'M', u'𦾱'),
    (0x2F9B1, 'M', u'𧃒'),
    (0x2F9B2, 'M', u'䕫'),
    (0x2F9B3, 'M', u'虐'),
    (0x2F9B4, 'M', u'虜'),
    (0x2F9B5, 'M', u'虧'),
    (0x2F9B6, 'M', u'虩'),
    (0x2F9B7, 'M', u'蚩'),
    (0x2F9B8, 'M', u'蚈'),
    (0x2F9B9, 'M', u'蜎'),
    (0x2F9BA, 'M', u'蛢'),
    (0x2F9BB, 'M', u'蝹'),
    (0x2F9BC, 'M', u'蜨'),
    (0x2F9BD, 'M', u'蝫'),
    (0x2F9BE, 'M', u'螆'),
    (0x2F9BF, 'X'),
    (0x2F9C0, 'M', u'蟡'),
    (0x2F9C1, 'M', u'蠁'),
    (0x2F9C2, 'M', u'䗹'),
    (0x2F9C3, 'M', u'衠'),
    (0x2F9C4, 'M', u'衣'),
    (0x2F9C5, 'M', u'𧙧'),
    (0x2F9C6, 'M', u'裗'),
    (0x2F9C7, 'M', u'裞'),
    (0x2F9C8, 'M', u'䘵'),
    (0x2F9C9, 'M', u'裺'),
    (0x2F9CA, 'M', u'㒻'),
    (0x2F9CB, 'M', u'𧢮'),
    (0x2F9CC, 'M', u'𧥦'),
    (0x2F9CD, 'M', u'䚾'),
    (0x2F9CE, 'M', u'䛇'),
    (0x2F9CF, 'M', u'誠'),
    (0x2F9D0, 'M', u'諭'),
    (0x2F9D1, 'M', u'變'),
    (0x2F9D2, 'M', u'豕'),
    (0x2F9D3, 'M', u'𧲨'),
    (0x2F9D4, 'M', u'貫'),
    (0x2F9D5, 'M', u'賁'),
    (0x2F9D6, 'M', u'贛'),
    (0x2F9D7, 'M', u'起'),
    (0x2F9D8, 'M', u'𧼯'),
    (0x2F9D9, 'M', u'𠠄'),
    (0x2F9DA, 'M', u'跋'),
    (0x2F9DB, 'M', u'趼'),
    (0x2F9DC, 'M', u'跰'),
    (0x2F9DD, 'M', u'𠣞'),
    (0x2F9DE, 'M', u'軔'),
    (0x2F9DF, 'M', u'輸'),
    (0x2F9E0, 'M', u'𨗒'),
    (0x2F9E1, 'M', u'𨗭'),
    (0x2F9E2, 'M', u'邔'),
    (0x2F9E3, 'M', u'郱'),
    ]

def _seg_72():
    return [
    (0x2F9E4, 'M', u'鄑'),
    (0x2F9E5, 'M', u'𨜮'),
    (0x2F9E6, 'M', u'鄛'),
    (0x2F9E7, 'M', u'鈸'),
    (0x2F9E8, 'M', u'鋗'),
    (0x2F9E9, 'M', u'鋘'),
    (0x2F9EA, 'M', u'鉼'),
    (0x2F9EB, 'M', u'鏹'),
    (0x2F9EC, 'M', u'鐕'),
    (0x2F9ED, 'M', u'𨯺'),
    (0x2F9EE, 'M', u'開'),
    (0x2F9EF, 'M', u'䦕'),
    (0x2F9F0, 'M', u'閷'),
    (0x2F9F1, 'M', u'𨵷'),
    (0x2F9F2, 'M', u'䧦'),
    (0x2F9F3, 'M', u'雃'),
    (0x2F9F4, 'M', u'嶲'),
    (0x2F9F5, 'M', u'霣'),
    (0x2F9F6, 'M', u'𩅅'),
    (0x2F9F7, 'M', u'𩈚'),
    (0x2F9F8, 'M', u'䩮'),
    (0x2F9F9, 'M', u'䩶'),
    (0x2F9FA, 'M', u'韠'),
    (0x2F9FB, 'M', u'𩐊'),
    (0x2F9FC, 'M', u'䪲'),
    (0x2F9FD, 'M', u'𩒖'),
    (0x2F9FE, 'M', u'頋'),
    (0x2FA00, 'M', u'頩'),
    (0x2FA01, 'M', u'𩖶'),
    (0x2FA02, 'M', u'飢'),
    (0x2FA03, 'M', u'䬳'),
    (0x2FA04, 'M', u'餩'),
    (0x2FA05, 'M', u'馧'),
    (0x2FA06, 'M', u'駂'),
    (0x2FA07, 'M', u'駾'),
    (0x2FA08, 'M', u'䯎'),
    (0x2FA09, 'M', u'𩬰'),
    (0x2FA0A, 'M', u'鬒'),
    (0x2FA0B, 'M', u'鱀'),
    (0x2FA0C, 'M', u'鳽'),
    (0x2FA0D, 'M', u'䳎'),
    (0x2FA0E, 'M', u'䳭'),
    (0x2FA0F, 'M', u'鵧'),
    (0x2FA10, 'M', u'𪃎'),
    (0x2FA11, 'M', u'䳸'),
    (0x2FA12, 'M', u'𪄅'),
    (0x2FA13, 'M', u'𪈎'),
    (0x2FA14, 'M', u'𪊑'),
    (0x2FA15, 'M', u'麻'),
    (0x2FA16, 'M', u'䵖'),
    (0x2FA17, 'M', u'黹'),
    (0x2FA18, 'M', u'黾'),
    (0x2FA19, 'M', u'鼅'),
    (0x2FA1A, 'M', u'鼏'),
    (0x2FA1B, 'M', u'鼖'),
    (0x2FA1C, 'M', u'鼻'),
    (0x2FA1D, 'M', u'𪘀'),
    (0x2FA1E, 'X'),
    (0xE0100, 'I'),
    (0xE01F0, 'X'),
    ]

uts46data = tuple(
    _seg_0()
    + _seg_1()
    + _seg_2()
    + _seg_3()
    + _seg_4()
    + _seg_5()
    + _seg_6()
    + _seg_7()
    + _seg_8()
    + _seg_9()
    + _seg_10()
    + _seg_11()
    + _seg_12()
    + _seg_13()
    + _seg_14()
    + _seg_15()
    + _seg_16()
    + _seg_17()
    + _seg_18()
    + _seg_19()
    + _seg_20()
    + _seg_21()
    + _seg_22()
    + _seg_23()
    + _seg_24()
    + _seg_25()
    + _seg_26()
    + _seg_27()
    + _seg_28()
    + _seg_29()
    + _seg_30()
    + _seg_31()
    + _seg_32()
    + _seg_33()
    + _seg_34()
    + _seg_35()
    + _seg_36()
    + _seg_37()
    + _seg_38()
    + _seg_39()
    + _seg_40()
    + _seg_41()
    + _seg_42()
    + _seg_43()
    + _seg_44()
    + _seg_45()
    + _seg_46()
    + _seg_47()
    + _seg_48()
    + _seg_49()
    + _seg_50()
    + _seg_51()
    + _seg_52()
    + _seg_53()
    + _seg_54()
    + _seg_55()
    + _seg_56()
    + _seg_57()
    + _seg_58()
    + _seg_59()
    + _seg_60()
    + _seg_61()
    + _seg_62()
    + _seg_63()
    + _seg_64()
    + _seg_65()
    + _seg_66()
    + _seg_67()
    + _seg_68()
    + _seg_69()
    + _seg_70()
    + _seg_71()
    + _seg_72()
)
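
The assembled ``uts46data`` tuple is sorted by starting code point, with each row of the form (start, status) or (start, status, mapping); a row covers every code point from its start up to the start of the next row. As a hedged illustration only (the helper name ``lookup_status`` is invented for this sketch; idna.core performs an equivalent bisect-based lookup internally when remapping domains), the covering row for an arbitrary code point can be found with a binary search:

import bisect

def lookup_status(code_point, table=None):
    """Return the (start, status[, mapping]) row covering ``code_point``.

    Minimal sketch: assumes ``table`` is sorted by starting code point,
    as ``uts46data`` above is.
    """
    if table is None:
        table = uts46data
    # 'Z' sorts after every status code used in the table ('3', 'D', 'I',
    # 'M', 'V', 'X'), so bisect_left lands just past the last row whose
    # start is <= code_point; the previous index is the covering row.
    idx = bisect.bisect_left(table, (code_point, 'Z'))
    return table[idx - 1]

# For example (values taken from the segments above):
#   lookup_status(0x1F21A) -> (0x1F21A, 'M', u'無')   # mapped codepoint
#   lookup_status(0x1F350) -> (0x1F337, 'V')          # inside a valid range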
site-packages/pip/_vendor/idna/__pycache__/core.cpython-36.pyc [compiled CPython 3.6 bytecode cache; binary contents omitted]
site-packages/pip/_vendor/idna/__pycache__/package_data.cpython-36.pyc [compiled CPython 3.6 bytecode cache; binary contents omitted]
site-packages/pip/_vendor/idna/__pycache__/core.cpython-36.opt-1.pyc [compiled CPython 3.6 bytecode cache; binary contents omitted]
site-packages/pip/_vendor/idna/__pycache__/codec.cpython-36.opt-1.pyc [compiled CPython 3.6 bytecode cache; binary contents omitted]
site-packages/pip/_vendor/idna/__pycache__/intranges.cpython-36.opt-1.pyc [compiled CPython 3.6 bytecode cache; binary contents omitted]
site-packages/pip/_vendor/idna/__pycache__/intranges.cpython-36.pyc [compiled CPython 3.6 bytecode cache; binary contents omitted]
site-packages/pip/_vendor/idna/__pycache__/uts46data.cpython-36.opt-1.pyc [compiled CPython 3.6 bytecode cache; binary contents omitted; dump truncated in the source]
/�入�/�八�/�冂�
/�冖�/�冫�/�几�/�凵�/�刀�/�力�/�勹�/�匕�/�匚�/�匸�/�十�/�卜�/�卩�/�厂�/�厶�/�又�/�口�/�囗�/�土� /�士�!/�夂�"/�夊�#/�夕�$/�大�%/�女�&/�子�'/�宀�(/�寸�)/�小�*/�尢�+/�尸�,/�屮�-/�山�./�巛�//�工�0/�己�1/�巾�2/�干�3/�幺�4/�广�5/�廴�6/�廾�7/�弋�8/�弓�9/�彐)r�r0)r�rEr�)r�r0)r�r�)r�r0)r�r�)r�r0)r�r�)r�r0)r�r�)r�r0)r�r�)r�rEr�)r�r0)r�r�)r�r0)r�r�)r�r0)r�r�)r�r0)r�r�)r�r0)r�r�)r�r0)r�r�)r�r0)r�r�)r�r0)r�r�)r�r0)r�r�)r�r0)r�r�)r�r0)r�r�)r�r0)r�r�)r�r0)r�rEr�)r�r0)r�rEr�)r�r�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr)rrEr)rrEr)rrEr)rrEr)r	rEr
)rrEr)r
rEr)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr )r!rEr")r#rEr$)r%rEr&)r'rEr()r)rEr*)r+rEr,)r-rEr.)r/rEr0)r1rEr2)r3rEr4)r5rEr6)r7rEr8)r9rEr:)r;rEr<)r=rEr>)r?rEr@r�r�r�r�r��_seg_26�
s�rAcfCs(d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d��d�d�d�d�d�d�d�d�d�d	�d
�d�d�d
�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d �d!�d"�d#�d$�d%�d&�d'�d(�d)�d*�d+�d,�d-gdS(.N�:/rE�彡�;/�彳�</�心�=/�戈�>/�戶�?/�手�@/�支�A/�攴�B/�文�C/�斗�D/�斤�E/�方�F/�无�G/�日�H/�曰�I/�月�J/�木�K/�欠�L/�止�M/�歹�N/�殳�O/�毋�P/�比�Q/�毛�R/�氏�S/�气�T/�水�U/�火�V/�爪�W/�父�X/�爻�Y/�爿�Z/�片�[/�牙�\/�牛�]/�犬�^/�玄�_/�玉�`/�瓜�a/�瓦�b/�甘�c/�生�d/�用�e/�田�f/�疋�g/�疒�h/�癶�i/�白�j/�皮�k/�皿�l/�目�m/�矛�n/�矢�o/�石�p/�示�q/�禸�r/�禾�s/�穴�t/�立�u/�竹�v/�米�w/�糸�x/�缶�y/�网�z/�羊�{/�羽�|/�老�}/�而�~/�耒�/�耳�/�聿�/�肉�/�臣�/�自�/�至�/�臼�/�舌�/�舛�/�舟�/�艮�/�色�/�艸�/�虍�/�虫�/�血�/�行�/�衣�/�襾�/�見�/�角�/�言�/�谷�/�豆�/�豕�/�豸�/�貝�/�赤�/�走�/�足�/�身)rBrErC)rDrErE)rFrErG)rHrErI)rJrErK)rLrErM)rNrErO)rPrErQ)rRrErS)rTrErU)rVrErW)rXrErY)rZrEr[)r\rEr])r^rEr_)r`rEra)rbrErc)rdrEre)rfrErg)rhrEri)rjrErk)rlrErm)rnrEro)rprErq)rrrErs)rtrEru)rvrErw)rxrEry)rzrEr{)r|rEr})r~rEr)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)rrEr)rrEr)rrEr)rrEr)rrEr	r�r�r�r�r��_seg_27s�r
cfCsd�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d��d�d�d�d�d�d�d�d�d�d	�d
�d�d�d
�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d �d!�d"gdS(#N�/rE�車�/�辛�/�辰�/�辵�/�邑�/�酉�/�釆�/�里�/�金�/�長�/�門�/�阜�/�隶�/�隹�/�雨�/�靑�/�非�/�面�/�革�/�韋�/�韭�/�音�/�頁�/�風�/�飛�/�食�/�首�/�香�/�馬�/�骨�/�高�/�髟�/�鬥�/�鬯�/�鬲�/�鬼��/�魚��/�鳥��/�鹵��/�鹿��/�麥��/�麻��/�黃��/�黍��/�黑��/�黹��/�黽��/�鼎��/�鼓��/�鼠��/�鼻��/�齊��/�齒��/�龍��/�龜��/�龠��/r��0rr��0r0�0�.�0�60�〒�70�80�十�90�卄�:0�卅�;0�@0�A0�0�0�0� ゙�0� ゚�0�0�より�0�0�コト�1�1�.1�11�ᄀ�21�ᄁ�31�ᆪ�41�ᄂ�51�ᆬ�61�ᆭ�71�ᄃ�81�ᄄ�91�ᄅ�:1�ᆰ�;1�ᆱ�<1�ᆲ�=1�ᆳ�>1�ᆴ�?1�ᆵ�@1�ᄚ�A1�ᄆ�B1�ᄇ�C1�ᄈ�D1�ᄡ)rrEr)r
rEr)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr )r!rEr")r#rEr$)r%rEr&)r'rEr()r)rEr*)r+rEr,)r-rEr.)r/rEr0)r1rEr2)r3rEr4)r5rEr6)r7rEr8)r9rEr:)r;rEr<)r=rEr>)r?rEr@)rArErB)rCrErD)rErErF)rGrErH)rIrErJ)rKrErL)rMrErN)rOrErP)rQrErR)rSrErT)rUrErV)rWrErX)rYrErZ)r[rEr\)r]rEr^)r_rEr`)rarErb)rcrErd)rerErf)rgrErh)rirErj)rkrErl)rmrErn)rorErp)rqrErr)rsrErt)rurErv)rwrErx)ryrErz)r{r�)r|rr�)r}r0)r~rEr)r�r0)r�rEr�)r�r0)r�rEr�)r�rEr�)r�rEr�)r�r0)r�r�)r�r0)r�r�)r�r0)r�rr�)r�rr�)r�r0)r�rEr�)r�r0)r�rEr�)r�r�)r�r0)r�r�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�r�r�r�r�r��_seg_28hs�r�cfCsd�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d��d�d�d�d�d�d�d�d�d�d	�d
�d�d�d
�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d �d!�d"�d#�d$�d%�d&�d'�d(gdS()N�E1rE�ᄉ�F1�ᄊ�G1�ᄋ�H1�ᄌ�I1�ᄍ�J1�ᄎ�K1�ᄏ�L1�ᄐ�M1�ᄑ�N1�ᄒ�O1�ᅡ�P1�ᅢ�Q1�ᅣ�R1�ᅤ�S1�ᅥ�T1�ᅦ�U1�ᅧ�V1�ᅨ�W1�ᅩ�X1�ᅪ�Y1�ᅫ�Z1�ᅬ�[1�ᅭ�\1�ᅮ�]1�ᅯ�^1�ᅰ�_1�ᅱ�`1�ᅲ�a1�ᅳ�b1�ᅴ�c1�ᅵ�d1r��e1�ᄔ�f1�ᄕ�g1�ᇇ�h1�ᇈ�i1�ᇌ�j1�ᇎ�k1�ᇓ�l1�ᇗ�m1�ᇙ�n1�ᄜ�o1�ᇝ�p1�ᇟ�q1�ᄝ�r1�ᄞ�s1�ᄠ�t1�ᄢ�u1�ᄣ�v1�ᄧ�w1�ᄩ�x1�ᄫ�y1�ᄬ�z1�ᄭ�{1�ᄮ�|1�ᄯ�}1�ᄲ�~1�ᄶ�1�ᅀ�1�ᅇ�1�ᅌ�1�ᇱ�1�ᇲ�1�ᅗ�1�ᅘ�1�ᅙ�1�ᆄ�1�ᆅ�1�ᆈ�1�ᆑ�1�ᆒ�1�ᆔ�1�ᆞ�1�ᆡ�1�1r0�1�一�1�二�1�三�1�四�1�上�1�中�1�下�1�甲�1�乙�1�丙�1�丁�1�天�1�地�1�人�1�1�1��1��1�2r�(ᄀ)�2�(ᄂ)�2�(ᄃ)�2�(ᄅ)�2�(ᄆ))r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr)rrEr)rr�)rrEr)rrEr)rrEr	)r
rEr)rrEr
)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr)r rEr!)r"rEr#)r$rEr%)r&rEr')r(rEr))r*rEr+)r,rEr-)r.rEr/)r0rEr1)r2rEr3)r4rEr5)r6rEr7)r8rEr9)r:rEr;)r<rEr=)r>rEr?)r@rErA)rBrErC)rDrErE)rFrErG)rHrErI)rJrErK)rLrErM)rNrErO)rPrErQ)rRrErS)rTrErU)rVrErW)rXr�)rYr0)rZrEr[)r\rEr])r^rEr_)r`rEra)rbrErc)rdrEre)rfrErg)rhrEri)rjrErk)rlrErm)rnrEro)rprErq)rrrErs)rtrEru)rvr0)rwr�)rxr0)ryr�)rzr0)r{rr|)r}rr~)rrr�)r�rr�)r�rr�r�r�r�r�r��_seg_29�s�r�cfCs*d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d��d�d�d�d�d�d�d�d�d�d	�d
�d�d�d
�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d �d!�d"�d#�d$�d%�d&�d'�d(�d)�d*�d+�d,�d-�d.gdS(/N�2r�(ᄇ)�2�(ᄉ)�2�(ᄋ)�2�(ᄌ)�	2�(ᄎ)�
2�(ᄏ)�2�(ᄐ)�2�(ᄑ)�
2�(ᄒ)�2�(가)�2�(나)�2�(다)�2�(라)�2�(마)�2�(바)�2�(사)�2�(아)�2�(자)�2�(차)�2�(카)�2�(타)�2�(파)�2�(하)�2�(주)�2�(오전)�2�(오후)�2r�� 2�(一)�!2�(二)�"2�(三)�#2�(四)�$2�(五)�%2�(六)�&2�(七)�'2�(八)�(2�(九)�)2�(十)�*2�(月)�+2�(火)�,2�(水)�-2�(木)�.2�(金)�/2�(土)�02�(日)�12�(株)�22�(有)�32�(社)�42�(名)�52�(特)�62�(財)�72�(祝)�82�(労)�92�(代)�:2�(呼)�;2�(学)�<2�(監)�=2�(企)�>2�(資)�?2�(協)�@2�(祭)�A2�(休)�B2�(自)�C2�(至)�D2rE�問�E2�幼�F2�文�G2�箏�H2r0�P2�pte�Q2�21�R2�22�S2�23�T2�24�U2�25�V2�26�W2�27�X2�28�Y2�29�Z2�30�[2�31�\2�32�]2�33�^2�34�_2�35�`2�ᄀ�a2�ᄂ�b2�ᄃ�c2�ᄅ�d2�ᄆ�e2�ᄇ�f2�ᄉ�g2�ᄋ�h2�ᄌ�i2�ᄎ�j2�ᄏ�k2�ᄐ�l2�ᄑ�m2�ᄒ�n2�가�o2�나)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�r�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr)rrr)rrEr)rrEr)rrEr)r	rEr
)rr0)rrEr
)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr)r rEr!)r"rEr#)r$rEr%)r&rEr')r(rEr))r*rEr+)r,rEr-)r.rEr/)r0rEr1)r2rEr3)r4rEr5)r6rEr7)r8rEr9)r:rEr;)r<rEr=)r>rEr?)r@rErA)rBrErC)rDrErE)rFrErG)rHrErI)rJrErKr�r�r�r�r��_seg_308s�rLcfCs(d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d��d�d�d�d�d�d�d�d�d�d	�d
�d�d�d
�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d �d!�d"�d#�d$�d%�d&�d'�d(�d)�d*�d+�d,�d-gdS(.N�p2rE�다�q2�라�r2�마�s2�바�t2�사�u2�아�v2�자�w2�차�x2�카�y2�타�z2�파�{2�하�|2�참고�}2�주의�~2�우�2r0�2�一�2�二�2�三�2�四�2�五�2�六�2�七�2�八�2�九�2�十�2�月�2�火�2�水�2�木�2�金�2�土�2�日�2�株�2�有�2�社�2�名�2�特�2�財�2�祝�2�労�2�秘�2�男�2�女�2�適�2�優�2�印�2�注�2�項�2�休�2�写�2�正�2�上�2�中�2�下�2�左�2�右�2�医�2�宗�2�学�2�監�2�企�2�資�2�協�2�夜�2�36�2�37�2�38�2�39�2�40�2�41�2�42�2�43�2�44�2�45�2�46�2�47�2�48�2�49�2�50�2�1月�2�2月��2�3月��2�4月��2�5月��2�6月��2�7月��2�8月��2�9月��2�10月��2�11月��2�12月��2�hg��2�erg��2�ev��2�ltd��2�ア��2�イ��2�ウ��2�エ)rMrErN)rOrErP)rQrErR)rSrErT)rUrErV)rWrErX)rYrErZ)r[rEr\)r]rEr^)r_rEr`)rarErb)rcrErd)rerErf)rgrErh)rirErj)rkr0)rlrErm)rnrEro)rprErq)rrrErs)rtrEru)rvrErw)rxrEry)rzrEr{)r|rEr})r~rEr)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)rrEr)rrEr)rrEr)rrEr)rrEr	)r
rEr)rrEr
)rrEr)rrEr)rrErr�r�r�r�r��_seg_31�s�rcfCs(d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d��d�d�d�d�d�d�d�d�d�d	�d
�d�d�d
�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d �d!�d"�d#�d$�d%�d&�d'�d(�d)�d*�d+�d,�d-gdS(.N��2rE�オ��2�カ��2�キ��2�ク��2�ケ��2�コ��2�サ��2�シ��2�ス��2�セ��2�ソ��2�タ��2�チ��2�ツ��2�テ��2�ト��2�ナ��2�ニ��2�ヌ��2�ネ��2�ノ��2�ハ��2�ヒ��2�フ��2�ヘ��2�ホ��2�マ��2�ミ��2�ム��2�メ��2�モ��2�ヤ��2�ユ�2�ヨ�2�ラ�2�リ�2�ル�2�レ�2�ロ�2�ワ�2�ヰ�2�ヱ�2�ヲ�2r��3�アパート�3�アルファ�3�アンペア�3�	アール�3�イニング�3�	インチ�3�	ウォン�3�エスクード�3�エーカー�	3�	オンス�
3�	オーム�3�	カイリ�3�カラット�
3�カロリー�3�	ガロン�3�	ガンマ�3�ギガ�3�	ギニー�3�キュリー�3�ギルダー�3�キロ�3�キログラム�3�キロメートル�3�キロワット�3�	グラム�3�グラムトン�3�クルゼイロ�3�クローネ�3�	ケース�3�	コルナ�3�	コーポ�3�サイクル� 3�サンチーム�!3�シリング�"3�	センチ�#3�	セント�$3�	ダース�%3�デシ�&3�ドル�'3�トン�(3�ナノ�)3�	ノット�*3�	ハイツ�+3�パーセント�,3�	パーツ�-3�バーレル�.3�ピアストル�/3�	ピクル�03�ピコ�13�ビル�23�ファラッド�33�フィート�43�ブッシェル�53�	フラン�63�ヘクタール�73�ペソ)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr )r!rEr")r#rEr$)r%rEr&)r'rEr()r)rEr*)r+rEr,)r-rEr.)r/rEr0)r1rEr2)r3rEr4)r5rEr6)r7rEr8)r9rEr:)r;rEr<)r=rEr>)r?rEr@)rArErB)rCrErD)rErErF)rGrErH)rIrErJ)rKrErL)rMrErN)rOrErP)rQrErR)rSrErT)rUrErV)rWrErX)rYrErZ)r[rEr\)r]rEr^)r_rEr`)rarErb)rcrErd)rerErf)rgrErh)rirErj)rkr�)rlrErm)rnrEro)rprErq)rrrErs)rtrEru)rvrErw)rxrEry)rzrEr{)r|rEr})r~rEr)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�r�r�r�r�r��_seg_32
s�r�cfCs(d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d��d�d�d�d�d�d�d�d�d�d	�d
�d�d�d
�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d �d!�d"�d#�d$�d%�d&�d'�d(�d)�d*�d+�d,�d-gdS(.N�83rE�	ペニヒ�93�	ヘルツ�:3�	ペンス�;3�	ページ�<3�	ベータ�=3�ポイント�>3�	ボルト�?3�ホン�@3�	ポンド�A3�	ホール�B3�	ホーン�C3�マイクロ�D3�	マイル�E3�	マッハ�F3�	マルク�G3�マンション�H3�ミクロン�I3�ミリ�J3�ミリバール�K3�メガ�L3�メガトン�M3�メートル�N3�	ヤード�O3�	ヤール�P3�	ユアン�Q3�リットル�R3�リラ�S3�	ルピー�T3�ルーブル�U3�レム�V3�レントゲン�W3�	ワット�X3�0点�Y3�1点�Z3�2点�[3�3点�\3�4点�]3�5点�^3�6点�_3�7点�`3�8点�a3�9点�b3�10点�c3�11点�d3�12点�e3�13点�f3�14点�g3�15点�h3�16点�i3�17点�j3�18点�k3�19点�l3�20点�m3�21点�n3�22点�o3�23点�p3�24点�q3�hpa�r3�da�s3�au�t3�bar�u3�ov�v3�pc�w3�dm�x3�dm2�y3�dm3�z3�iu�{3�平成�|3�昭和�}3�大正�~3�明治�3�株式会社�3�pa�3�na�3�μa�3�ma�3�ka�3�kb�3�mb�3�gb�3�cal�3�kcal�3�pf�3�nf�3�μf�3�μg�3�mg�3�kg�3�hz�3�khz�3�mhz�3�ghz�3�thz�3�μl�3�ml�3�dl�3�kl�3�fm�3�nm�3�μm)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr)rrEr)rrEr)rrEr)rrEr)r	rEr
)rrEr)r
rEr)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr )r!rEr")r#rEr$)r%rEr&)r'rEr()r)rEr*)r+rEr,)r-rEr.)r/rEr0)r1rEr2)r3rEr4)r5rEr6)r7rEr8)r9rEr:)r;rEr<)r=rEr>)r?rEr@)rArErB)rCrErD)rErErF)rGrErH)rIrErJ)rKrErL)rMrErN)rOrErP)rQrErR)rSrErT)rUrErV)rWrErX)rYrErZ)r[rEr\)r]rEr^)r_rEr`)rarErb)rcrErd)rerErf)rgrErh)rirErj)rkrErl)rmrErn)rorErp)rqrErr)rsrErt)rurErv)rwrErx)ryrErz)r{rEr|)r}rEr~)rrEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�r�r�r�r�r��_seg_33p
s�r�cfCsd�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d��d�d�d�d�d�d�d�d�d�d	�d
�d�d�d
�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d �d!�d"�d#�d$�d%�d&�d'�d(gdS()N�3rE�mm�3�cm�3�km�3�mm2�3�cm2�3�m2�3�km2�3�mm3�3�cm3�3�m3�3�km3�3�m∕s�3�m∕s2�3rn�3�kpa�3�mpa�3�gpa�3�rad�3�rad∕s�3�rad∕s2�3�ps�3�ns�3�μs�3�ms�3�pv�3�nv�3�μv�3�mv�3�kv�3�3�pw�3�nw�3�μw�3�mw�3�kw�3�3�kω�3�mω��3r���3�bq��3�cc��3�cd��3�c∕kg��3��3�db��3�gy��3�ha��3�hp��3�in��3�kk��3��3�kt��3�lm��3�ln��3�log��3�lx��3rz��3�mil��3�mol��3�ph��3��3�ppm��3�pr��3�sr��3�sv��3�wb��3�v∕m��3�a∕m��3�1日��3�2日��3�3日��3�4日��3�5日��3�6日��3�7日��3�8日��3�9日��3�10日��3�11日��3�12日��3�13日��3�14日��3�15日��3�16日��3�17日��3�18日��3�19日��3�20日��3�21日�3�22日�3�23日�3�24日�3�25日�3�26日�3�27日�3�28日�3�29日�3�30日�3�31日�3�gal)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rErn)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�r�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�r�)r�rEr�)r�rEr�)r�rEr�)r�rEr)rrEr)rrEr)rrEr�)rrEr)rrEr	)r
rEr)rrEr
)rrEr)rrErz)rrEr)rrEr)rrEr)rr�)rrEr)rrEr)rrEr)rrEr)r rEr!)r"rEr#)r$rEr%)r&rEr')r(rEr))r*rEr+)r,rEr-)r.rEr/)r0rEr1)r2rEr3)r4rEr5)r6rEr7)r8rEr9)r:rEr;)r<rEr=)r>rEr?)r@rErA)rBrErC)rDrErE)rFrErG)rHrErI)rJrErK)rLrErM)rNrErO)rPrErQ)rRrErS)rTrErU)rVrErW)rXrErY)rZrEr[)r\rEr])r^rEr_)r`rEra)rbrErc)rdrErer�r�r�r�r��_seg_34�
s�rfceCs�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�gdS)�N�4r0�Mr��M�͟��鍤鐤�Ǥ�Ф�,��@�rE�ꙁ�A��B��ꙃ�C��D��ꙅ�E��F��ꙇ�G��H��ꙉ�I��J��ꙋ�K��L��ꙍ�M��N��ꙏ�O��P��ꙑ�Q��R��ꙓ�S��T��ꙕ�U��V��ꙗ�W��X��ꙙ�Y��Z��ꙛ�[��\��ꙝ�]��^��ꙟ�_��`��ꙡ�a��b��ꙣ�c��d��ꙥ�e��f��ꙧ�g��h��ꙩ�i��j��ꙫ�k��l��ꙭ�m�逦�ꚁ遦邦�ꚃ郦鄦�ꚅ酦醦�ꚇ釦鈦�ꚉ鉦銦�ꚋ鋦錦�ꚍ鍦鎦�ꚏ鏦鐦�ꚑ鑦钦�ꚓ铦锦�ꚕ镦閦�ꚗ闦阦韦����"��ꜣ�#��$��ꜥ�%��&��ꜧ�'��(��ꜩ�)��*��ꜫ�+��,��ꜭ�-��.��ꜯ�/��2��ꜳ�3�)rgr0)rhr�)rir0)rjr�)rkr0)rlr�)rmr0)rnr�)ror0)rpr�)rqrErr)rsr0)rtrEru)rvr0)rwrErx)ryr0)rzrEr{)r|r0)r}rEr~)rr0)r�rEr�)r�r0)r�rEr�)r�r0)r�rEr�)r�r0)r�rEr�)r�r0)r�rEr�)r�r0)r�rEr�)r�r0)r�rEr�)r�r0)r�rEr�)r�r0)r�rEr�)r�r0)r�rEr�)r�r0)r�rEr�)r�r0)r�rEr�)r�r0)r�rEr�)r�r0)r�rEr�)r�r0)r�rEr�)r�r0)r�rEr�)r�r0)r�rEr�)r�r0)r�rEr�)r�r0)r�rEr�)r�r0)r�rEr�)r�r0)r�rEr�)r�r0)r�rEr�)r�r0)r�rEr�)r�r0)r�rEr�)r�r0)r�rEr�)r�r0)r�rEr�)r�r0)r�rEr�)r�r0)r�rEr�)r�r0)r�rEr�)r�r0)r�rEr�)r�r0)r�r�)r�r0)r�r�)r�r0)r�rEr�)r�r0)r�rEr�)r�r0)r�rEr�)r�r0)r�rEr�)r�r0)r�rEr�)r�r0)r�rEr�)r�r0)r�rEr�)r�r0)r�rEr�)r�r0r�r�r�r�r��_seg_35@s�r�cfCs�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�gdS)�N�4�rE�ꜵ�5�r0�6��ꜷ�7��8��ꜹ�9��:��ꜻ�;��<��ꜽ�=��>��ꜿ�?��@��ꝁ�A��B��ꝃ�C��D��ꝅ�E��F��ꝇ�G��H��ꝉ�I��J��ꝋ�K��L��ꝍ�M��N��ꝏ�O��P��ꝑ�Q��R��ꝓ�S��T��ꝕ�U��V��ꝗ�W��X��ꝙ�Y��Z��ꝛ�[��\��ꝝ�]��^��ꝟ�_��`��ꝡ�a��b��ꝣ�c��d��ꝥ�e��f��ꝧ�g��h��ꝩ�i��j��ꝫ�k��l��ꝭ�m��n��ꝯ�o��p��q��y��ꝺ�z��{��ꝼ�|��}��ᵹ�~��ꝿ��逧�ꞁ遧邧�ꞃ郧鄧�ꞅ酧醧�ꞇ釧鋧�ꞌ錧鍧�ɥ鎧鏧r�鐧�ꞑ鑧钧�ꞓ铧锧頧�ꞡ顧颧�ꞣ飧餧�ꞥ饧馧�ꞧ駧騧�ꞩ驧骧�ɦ髧��ħ)r�rEr�)r�r0)r�rEr�)r�r0)r�rEr�)r�r0)rrEr)rr0)rrEr)rr0)rrEr)rr0)r	rEr
)rr0)rrEr
)rr0)rrEr)rr0)rrEr)rr0)rrEr)rr0)rrEr)rr0)rrEr)rr0)rrEr)r r0)r!rEr")r#r0)r$rEr%)r&r0)r'rEr()r)r0)r*rEr+)r,r0)r-rEr.)r/r0)r0rEr1)r2r0)r3rEr4)r5r0)r6rEr7)r8r0)r9rEr:)r;r0)r<rEr=)r>r0)r?rEr@)rAr0)rBrErC)rDr0)rErErF)rGr0)rHrErI)rJr0)rKrErL)rMr0)rNrErO)rPr0)rQrErO)rRr0)rSrErT)rUr0)rVrErW)rXr0)rYrErZ)r[rEr\)r]r0)r^rEr_)r`r0)rarErb)rcr0)rdrEre)rfr0)rgrErh)rir0)rjrErk)rlr0)rmrErn)ror0)rpr�)rqrErr)rsr0)rtrEru)rvr0)rwr�)rxrEry)rzr0)r{rEr|)r}r0)r~rEr)r�r0)r�rEr�)r�r0)r�rEr�)r�r0)r�rEr�)r�r�)r�rEr�r�r�r�r�r��_seg_36�s�r�cfCs�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�gdS)�N�rE�œ�r0�,�r��0��:��@��x�逨�Ũ�Ψ�ڨ������T��_��}�逩�Ω�ϩ�ک�ީ�����7��@��N��P��Z��\��|�逪�ê�۪������	�������� ��'��(��/������������������������豈���更���車���賈���滑���串���句���龜�	��契�
��金���喇���奈�
��懶���癩���羅���蘿���螺���裸���邏���樂���洛���烙���珞���落���酪���駱���亂���卵���欄���爛���蘭� ��鸞�!��嵐�"��濫�#��藍�$��襤�%��拉�&��臘�'��蠟�(��廊�)��朗�*��浪�+��狼�,��郎�-��來)r�rEr�)r�r0)r�r�)r�r0)r�r�)r�r0)r�r�)r�r0)r�r�)r�r0)r�r�)r�r0)r�r�)r�r0)r�r�)r�r0)r�r�)r�r0)r�r�)r�r0)r�r�)r�r0)r�r�)r�r0)r�r�)r�r0)r�r�)r�r0)r�r�)r�r0)r�r�)r�r0)r�r�)r�r0)r�r�)r�r0)r�r�)r�r0)r�r�)r�r0)r�r�)r�r0)r�r�)r�r0)r�r�)r�r0)r�r�)r�r0)r�r�)r�r0)r�r�)r�r0)r�r�)r�r0)r�r�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr)rrEr)rrEr)rrEr)rrEr)r	rEr
)rrEr)r
rEr)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr)rrErr�r�r�r�r��_seg_37s�rcfCs(d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d��d�d�d�d�d�d�d�d�d�d	�d
�d�d�d
�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d �d!�d"�d#�d$�d%�d&�d'�d(�d)�d*�d+�d,�d-gdS(.N�.�rE�冷�/��勞�0��擄�1��櫓�2��爐�3��盧�4��老�5��蘆�6��虜�7��路�8��露�9��魯�:��鷺�;��碌�<��祿�=��綠�>��菉�?��錄�@��鹿�A��論�B��壟�C��弄�D��籠�E��聾�F��牢�G��磊�H��賂�I��雷�J��壘�K��屢�L��樓�M��淚�N��漏�O��累�P��縷�Q��陋�R��勒�S��肋�T��凜�U��凌�V��稜�W��綾�X��菱�Y��陵�Z��讀�[��拏�\��樂�]��諾�^��丹�_��寧�`��怒�a��率�b��異�c��北�d��磻�e��便�f��復�g��不�h��泌�i��數�j��索�k��參�l��塞�m��省�n��葉�o��說�p��殺�q��辰�r��沈�s��拾�t��若�u��掠�v��略�w��亮�x��兩�y��凉�z��梁�{��糧�|��良�}��諒�~��量���勵��呂��女��廬��旅��濾��礪��閭��驪��麗��黎��力��曆��歷��轢��年��憐��戀��撚)r rEr!)r"rEr#)r$rEr%)r&rEr')r(rEr))r*rEr+)r,rEr-)r.rEr/)r0rEr1)r2rEr3)r4rEr5)r6rEr7)r8rEr9)r:rEr;)r<rEr=)r>rEr?)r@rErA)rBrErC)rDrErE)rFrErG)rHrErI)rJrErK)rLrErM)rNrErO)rPrErQ)rRrErS)rTrErU)rVrErW)rXrErY)rZrEr[)r\rEr])r^rEr_)r`rEra)rbrErc)rdrEre)rfrErg)rhrEri)rjrErk)rlrErm)rnrEro)rprErq)rrrErs)rtrEru)rvrErw)rxrEry)rzrEr{)r|rEr})r~rEr)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�r�r�r�r�r��_seg_38xs�r�cfCs(d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d��d�d�d�d�d�d�d�d�d�d	�d
�d�d�d
�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d �d!�d"�d#�d$�d%�d&�d'�d(�d)�d*�d+�d,�d-gdS(.N�rE�漣��煉��璉��秊��練��聯��輦��蓮��連��鍊��列��劣��咽��烈��裂��說��廉��念��捻��殮��簾��獵��令��囹��寧��嶺��怜��玲��瑩��羚��聆��鈴��零��靈��領��例��禮��醴��隸��惡��了��僚��寮��尿��料��樂��燎��療���蓼���遼���龍���暈���阮���劉���杻���柳���流���溜���琉���留���硫���紐���類���六���戮���陸���倫���崙���淪���輪���律���慄���栗���率���隆���利���吏���履���易���李���梨���泥���理���痢���罹���裏���裡���里���離���匿���溺���吝���燐���璘���藺���隣���鱗���麟���林��淋)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr)rrEr)rrEr)rrEr)rrEr)r	rEr
)rrEr)r
rEr)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr )r!rEr")r#rEr$)r%rEr&)r'rEr()r)rEr*)r+rEr,)r-rEr.)r/rEr0)r1rEr2)r3rEr4)r5rEr6)r7rEr8)r9rEr:)r;rEr<)r=rEr>)r?rEr@)rArErB)rCrErD)rErErF)rGrErH)rIrErJ)rKrErL)rMrErN)rOrErP)rQrErR)rSrErT)rUrErV)rWrErX)rYrErZ)r[rEr\)r]rEr^)r_rEr`)rarErb)rcrErd)rerErf)rgrErh)rirErj)rkrErl)rmrErn)rorErp)rqrErr)rsrErt)rurErv)rwrErx)ryrErz)r{rEr|)r}rEr~)rrEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�r�r�r�r�r��_seg_39�s�r�cfCsd�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d��d�d�d�d�d�d�d�d�d�d	�d
�d�d�d
�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d �d!�d"�d#�d$�d%�d&�d'gdS((N�rE�臨��立��笠��粒��狀��炙��識��什��茶��刺���切���度���拓���糖���宅���洞���暴���輻���行�	��降�
��見���廓���兀�
��嗀��r0���塚�����晴�����凞���猪���益���礼���神���祥���福���靖���精���羽��� ��蘒�!��"��諸�#��%��逸�&��都�'��*��飯�+��飼�,��館�-��鶴�.��郞�/��隷�0��侮�1��僧�2��免�3��勉�4��勤�5��卑�6��喝�7��嘆�8��器�9��塀�:��墨�;��層�<��屮�=��悔�>��慨�?��憎�@��懲�A��敏�B��既�C��暑�D��梅�E��海�F��渚�G��漢�H��煮�I��爫�J��琢�K��碑�L��社�M��祉�N��祈�O��祐�P��祖�Q��祝�R��禍�S��禎�T��穀�U��突�V��節�W��練�X��縉�Y��繁�Z��署�[��者�\��臭�]��艹�_��著)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�r0)r�rEr�)r�r0)r�rEr�)r�r0)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�r0)r�rEr�)rr0)rrEr)rr0)rrEr)rrEr)rr0)r	rEr
)rrEr)r
rEr)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr )r!rEr")r#rEr$)r%rEr&)r'rEr()r)rEr*)r+rEr,)r-rEr.)r/rEr0)r1rEr2)r3rEr4)r5rEr6)r7rEr8)r9rEr:)r;rEr<)r=rEr>)r?rEr@)rArErB)rCrErD)rErErF)rGrErH)rIrErJ)rKrErL)rMrErN)rOrErP)rQrErR)rSrErT)rUrErV)rWrErX)rYrErZ)r[rEr\)r]rEr^)r_rEr`)rarErb)rcrErd)rerErf)rgrErh)rirErj)rkrErl)rmrErn)rorErp)rqrErrr�r�r�r�r��_seg_40Hs�rscfCs d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d��d�d�d�d�d�d�d�d�d�d	�d
�d�d�d
�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d �d!�d"�d#�d$�d%�d&�d'�d(�d)gdS(*N�`�rE�褐�a��視�b��謁�c��謹�d��賓�e��贈�f��辶�g��逸�h��難�i��響�j��頻�k��恵�l��𤋮�m��舘�n�r��p��並�q��况�r��全�s��侀�t��充�u��冀�v��勇�w��勺�x��喝�y��啕�z��喙�{��嗢�|��塚�}��墳�~��奄���奔��婢��嬨��廒��廙��彩��徭��惘��慎��愈��憎��慠��懲��戴��揄��搜��摒��敖��晴��朗��望��杖��歹��殺��流��滛��滋��漢��瀞��煮��瞧��爵��犯��猪��瑱��甆��画��瘝��瘟��益��盛��直��睊��着��磌��窱��節��类��絛��練��缾��者��荒��華��蝹��襁��覆���調��諸��請���諾��諭���變����輸���遲���醙)rtrEru)rvrErw)rxrEry)rzrEr{)r|rEr})r~rEr)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�r�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr)rrEr)rrEr)rrEr)rrEr)r	rEr
)rrEr)r
rEr)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr )r!rErw)r"rEr#)r$rEr%)r&rEr')r(rEry)r)rEr*)r+rEr,)r-rEr{)r.rEr/)r0rEr)r1rEr2)r3rEr4)r5rEr6r�r�r�r�r��_seg_41�s�r7cfCsd�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d��d�d�d�d�d�d�d�d�d�d	�d
�d�d�d
�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d �d!�d"�d#�d$�d%�d&�d'gdS((N��rE�鉶���陼���難���靖���韛���響���頋���頻���鬒���龜���𢡊���𢡄���𣏕���㮝���䀘���䀹���𥉉���𥳐���𧻓���齃���龎��r����ff���fi���fl���ffi���ffl���st�����մն���մե���մի���վն���մխ�����יִ��r0���ײַ� ��ע�!��א�"��ד�#��ה�$��כ�%��ל�&��ם�'��ר�(��ת�)�rr��*��שׁ�+��שׂ�,��שּׁ�-��שּׂ�.��אַ�/��אָ�0��אּ�1��בּ�2��גּ�3��דּ�4��הּ�5��וּ�6��זּ�7��8��טּ�9��יּ�:��ךּ�;��כּ�<��לּ�=��>��מּ�?��@��נּ�A��סּ�B��C��ףּ�D��פּ�E��F��צּ�G��קּ�H��רּ�I��שּ�J��תּ�K��וֹ�L��בֿ�M��כֿ�N��פֿ�O��אל�P��ٱ�R��ٻ�V��پ�Z��ڀ�^��ٺ�b��ٿ�f��ٹ�j��ڤ�n��ڦ�r��ڄ�v��ڃ�z��چ�~��ڇ��ڍ)r8rEr9)r:rEr;)r<rEr=)r>rEr?)r@rErA)rBrErC)rDrErE)rFrErG)rHrErI)rJrErK)rLrErM)rNrErO)rPrErQ)rRrErS)rTrErU)rVrErW)rXrErY)rZrEr[)r\rEr])r^rEr_)r`rEra)rbr�)rcrErd)rerErf)rgrErh)rirErj)rkrErl)rmrErn)ror�)rprErq)rrrErs)rtrEru)rvrErw)rxrEry)rzr�)r{rEr|)r}r0)r~rEr)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�r�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�r�)r�rEr�)r�r�)r�rEr�)r�rEr�)r�r�)r�rEr�)r�rEr�)r�r�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�r�r�r�r�r��_seg_42s�r�cfCs&d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d��d�d�d�d�d�d�d�d�d�d	�d
�d�d�d
�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d �d!�d"�d#�d$�d%�d&�d'�d(�d)�d*�d+�d,gdS(-N�rE�ڌ��ڎ��ڈ��ژ��ڑ��ک��گ��ڳ��ڱ��ں��ڻ��ۀ��ہ��ھ��ے��ۓ�r0��r����ڭ���ۇ���ۆ���ۈ���ۇٴ���ۋ���ۅ���ۉ���ې���ى���ئا���ئە���ئو���ئۇ���ئۆ���ئۈ��ئې��ئى��ی���ئج���ئح���ئم�����ئي���بج���بح���بخ���بم�	��بى�
��بي���تج���تح�
��تخ���تم���تى���تي���ثج���ثم���ثى���ثي���جح���جم���حج���حم���خج���خح���خم���سج���سح���سخ���سم� ��صح�!��صم�"��ضج�#��ضح�$��ضخ�%��ضم�&��طح�'��طم�(��ظم�)��عج�*��عم�+��غج�,��غم�-��فج�.��فح�/��فخ�0��فم�1��فى�2��في�3��قح�4��قم�5��قى�6��قي�7��كا�8��كج�9��كح�:��كخ�;��كل�<��كم�=��كى�>��كي)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr)rrEr)rrEr)rrEr)rrEr)r	rEr
)rrEr)r
rEr)rrEr)rrEr)rrEr)rrEr)rr0)rr�)rrEr)rrEr)rrEr)rrEr )r!rEr")r#rEr$)r%rEr&)r'rEr()r)rEr*)r+rEr,)r-rEr.)r/rEr0)r1rEr2)r3rEr4)r5rEr6)r7rEr8)r9rEr:)r;rEr<)r=rEr>)r?rEr@)rArErB)rCrErD)rErEr<)rFrErG)rHrErI)rJrErK)rLrErM)rNrErO)rPrErQ)rRrErS)rTrErU)rVrErW)rXrErY)rZrEr[)r\rEr])r^rEr_)r`rEra)rbrErc)rdrEre)rfrErg)rhrEri)rjrErk)rlrErm)rnrEro)rprErq)rrrErs)rtrEru)rvrErw)rxrEry)rzrEr{)r|rEr})r~rEr)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�r�r�r�r�r��_seg_43�s�r�cfCsd�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d��d�d�d�d�d�d�d�d�d�d	�d
�d�d�d
�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d �d!gdS("N�?�rE�لج�@��لح�A��لخ�B��لم�C��لى�D��لي�E��مج�F��مح�G��مخ�H��مم�I��مى�J��مي�K��نج�L��نح�M��نخ�N��نم�O��نى�P��ني�Q��هج�R��هم�S��هى�T��هي�U��يج�V��يح�W��يخ�X��يم�Y��يى�Z��يي�[��ذٰ�\��رٰ�]��ىٰ�^�r� ٌّ�_�� ٍّ�`�� َّ�a�� ُّ�b�� ِّ�c�� ّٰ�d��ئر�e��ئز�f��ئم�g��ئن�h��ئى�i��ئي�j��بر�k��بز�l��بم�m��بن�n��بى�o��بي�p��تر�q��تز�r��تم�s��تن�t��تى�u��تي�v��ثر�w��ثز�x��ثم�y��ثن�z��ثى�{��ثي�|��فى�}��في�~��قى���قي��كا��كل��كم��كى��كي�����ما���نر��نز���نن�����ير��يز���ين����ئج��ئح��ئخ���ئه��بج��بح��بخ���به��تج��تح)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rr�)r�rr�)r�rr)rrr)rrr)rrr)rrEr)r	rEr
)rrEr)r
rEr)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr )r!rEr")r#rEr$)r%rEr&)r'rEr()r)rEr*)r+rEr,)r-rEr.)r/rEr0)r1rEr2)r3rEr4)r5rEr6)r7rEr8)r9rEr:)r;rEr<)r=rEr>)r?rEr@)rArErB)rCrErD)rErErF)rGrErH)rIrEr�)rJrEr�)rKrEr�)rLrErM)rNrEr�)rOrErP)rQrErR)rSrEr�)rTrErU)rVrEr�)rWrEr�)rXrEr�)rYrErZ)r[rEr\)r]rEr�)r^rEr_)r`rEr�)rarEr�)rbrErc)rdrEre)rfrErg)rhrEr)rirErj)rkrErl)rmrErn)rorErp)rqrEr)rrrErs)rtrEru)rvrErwr�r�r�r�r��_seg_44�s�rxcfCsd�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d��d�d�d�d�d�d�d�d�d�d	�d
�d�d�d
�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d �d!�d"gdS(#N�rE�تخ��تم��ته��ثم��جح��جم��حج��حم��خج��خم��سج��سح��سخ��سم��صح��صخ��صم��ضج��ضح��ضخ��ضم��طح��ظم��عج��عم��غج��غم��فج��فح��فخ��فم���قح���قم���كج���كح���كخ���كل���كم���لج���لح���لخ���لم���له���مج���مح���مخ���مم���نج���نح���نخ���نم���نه���هج���هم���هٰ���يج���يح���يخ���يم���يه���ئم���ئه���بم���به���������ثه�����سه���شم���شه�����������������ـَّ���ـُّ���ـِّ��طى��طي��عى��عي��غى��غي��سى��سي��شى��شي��حى���حي���جى���جي���خى���خي���صى���صي)ryrErz)r{rEr|)r}rEr~)rrEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr|)r�rEr~)r�rEr�)r�rEr�)r�rEr�)r�rEr)rrEr)rrEr)rrEr�)rrEr�)rrEr�)rrEr�)r	rEr�)r
rEr�)rrEr�)rrEr
)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr)r rEr!)r"rEr#)r$rEr%)r&rEr')r(rEr))r*rEr+)r,rEr-)r.rEr/)r0rEr1)r2rEr3)r4rEr5r�r�r�r�r��_seg_45Ps�r6cfCsd�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d��d�d�d�d�d�d�d�d�d�d	�d
�d�d�d
�d�d�d�d�d�d�d�d�d�d�d�d�d�dgdS(N��rE�ضى���ضي�	��شج�
��شح���شخ���شم�
��شر���سر���صر���ضر���طى���طي���عى���عي���غى���غي���سى���سي���شى���شي���حى���حي���جى���جي���خى� ��خي�!��صى�"��صي�#��$��%��&��'��(��)��*��+��,��-��.��/��0��1��سه�2��شه�3��طم�4��سج�5��سح�6��سخ�7��8��9��:��;��ظم�<��اً�>�r0�@�r��P��تجم�Q��تحج�S��تحم�T��تخم�U��تمج�V��تمح�W��تمخ�X��جمح�Z��حمي�[��حمى�\��سحج�]��سجح�^��سجى�_��سمح�a��سمج�b��سمم�d��صحح�f��صمم�g��شحم�i��شجي�j��شمخ�l��شمم�n��ضحى�o��ضخم�q��طمح�s��طمم�t��طمي�u��عجم�v��عمم�x��عمى�y��غمم�z��غمي�{��غمى�|��فخم�~��قمح���قمم��لحم��لحي��لحى��لجج��لخم��لمح��محج��محم)r7rEr8)r9rEr:)r;rEr<)r=rEr>)r?rEr@)rArErB)rCrErD)rErErF)rGrErH)rIrErJ)rKrErL)rMrErN)rOrErP)rQrErR)rSrErT)rUrErV)rWrErX)rYrErZ)r[rEr\)r]rEr^)r_rEr`)rarErb)rcrErd)rerErf)rgrErh)rirErj)rkrErl)rmrErn)rorEr8)rprEr:)rqrEr<)rrrEr>)rsrEr@)rtrErB)rurErD)rvrErF)rwrErH)rxrErJ)ryrEr<)rzrEr>)r{rEr@)r|rErB)r}rEr~)rrEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr<)r�rEr>)r�rEr@)r�rEr�)r�rEr�)r�rEr�)r�r0)r�r�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�r�r�r�r�r��_seg_46�s�r�cfCsd�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d��d�d�d�d�d�d�d�d�d�d	�d
�d�d�d
�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d �d!�d"�d#�d$�d%gdS(&N�rE�محي��مجح��مجم��مخج��مخم�r���مجخ��همج��همم��نحم��نحى��نجم��نجى��نمي��نمى��يمم��بخي��تجي��تجى��تخي��تخى��تمي��تمى��جمي��جحى��جمى��سخى��صحي��شحي��ضحي��لجي��لمي��يحي��يجي��يمي��ممي��قمي��نحي��قمح��لحم��عمي��كمي��نجح��مخي��لجم��كمم����جحي��حجي��مجي��فمي���بحي�����عجم���صمم���سخي���نجي�����صلے���قلے���الله���اكبر���محمد��صلعم��رسول��عليه��وسلم��صلى�r�!صلى الله عليه وسلم��جل جلاله��ریال�r0���r����,���、�����:��rl���!���?���〖���〗��� ��'��1��—�2��–�3��_�5�r��6�r��7��{�8��}�9��〔�:��〕�;��【�<��】�=��《�>��》)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�r�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr)rrEr)rrEr)rrEr)rrEr)r	rEr
)rrEr)r
rEr)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr )r!rEr")r#rEr$)r%rEr&)r'rEr()r)rEr*)r+rEr,)r-rEr.)r/rEr0)r1rEr2)r3rEr4)r5rEr6)r7rEr8)r9rEr:)r;rEr<)r=rEr>)r?rEr@)rArErB)rCrErD)rErErF)rGrErD)rHrEr@)rIrErJ)rKrErL)rMrErN)rOrErP)rQrErR)rSrErF)rTrErU)rVrErW)rXrErY)rZrEr[)r\r�)r]rEr^)r_rEr`)rarErb)rcrErd)rerErf)rgrErh)rirErj)rkrErl)rmrErn)rorErp)rqrrr)rsrrt)rurErv)rwr0)rxr�)ryr�)rzrr{)r|rEr})r~r�)rrr�)r�rrl)r�rr�)r�rr�)r�rEr�)r�rEr�)r�r�)r�r0)r�r�)r�rEr�)r�rEr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�r�r�r�r�r��_seg_47 s�r�cfCsd�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d��d�d�d�d�d�d�d�d�d�d	�d
�d�d�d
�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d �d!�d"�d#�d$�d%�d&�d'gdS((N�?�rE�〈�@��〉�A��「�B��」�C��『�D��』�E�r0�G�r�[�H��]�I�� ̅�M�r��P�r{�Q��、�R�r��T�rl�U�r��V�r��W�r��X��—�Y�r��Z�r��[�r��\�r��]��〔�^��〕�_��#�`��&�a��*�b�r��c��-�d��<�e��>�f�r��g��h��\�i��$�j��%�k��@�l��p�� ً�q��ـً�r�� ٌ�s��t�� ٍ�u��v�� َ�w��ـَ�x�� ُ�y��ـُ�z�� ِ�{��ـِ�|�� ّ�}��ـّ�~�� ْ���ـْ��ء��آ��أ��ؤ��إ��ئ��ا��ب��ة��ت��ث��ج��ح��خ��د��ذ��ر��ز��س��ش��ص��ض��ط���ظ���ع���غ���ف���ق���ك���ل���م���ن���ه���و���ى���ي��لآ��لأ��لإ��لا��r��������")r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�r0)r�rr�)r�rr�)r�rr�)r�rr�)r�rr{)r�rEr�)r�r�)r�rrl)r�rr�)r�rr�)r�rr�)r�rEr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rEr�)r�rEr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rEr�)r�rr�)r�rr�)r�rr�)r�r�)r�rr�)r�rr�)r�rr�)r�rr�)r�r�)r�rr�)r�rEr�)r�rr�)r�r0)r�rr�)r�r�)r�rr�)r�rEr�)r�rr�)r�rEr�)r�rr�)r�rEr�)r�rr�)r�rEr�)r�rr�)rrEr)rrEr)rrEr)rrEr)rrEr	)r
rEr)rrEr
)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr)r rEr!)r"rEr#)r$rEr%)r&rEr')r(rEr))r*rEr+)r,rEr-)r.rEr/)r0rEr1)r2rEr3)r4rEr5)r6rEr7)r8rEr9)r:rEr;)r<rEr=)r>rEr?)r@rErA)rBrErC)rDrErE)rFrErG)rHrErI)rJrErK)rLrErM)rNrErO)rPrErQ)rRr�)rSr�)rTr�)rUrr�)rVrrWr�r�r�r�r��_seg_48�s�rXcfCs�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d��d�d�d�d�d�d�d�d�d�d	�d
�d�d�d
�d�d�d�d�dgdS(N��rr���r���r���r����'��r��	�r��
�r���r���r{�
�rEr���r���/��r���r���r�����r���r���r���r���r���r���r���rl��r���r���r���r�� �r��!�rF�"�rH�#�rJ�$�rL�%�rN�&�rP�'�rR�(�rT�)�rV�*�rX�+�rZ�,�r\�-�r^�.�r`�/�rb�0�rd�1�rf�2�rh�3�rj�4�rl�5�rn�6�rp�7�rr�8�rt�9�rv�:�rx�;�r��<�r��=�r��>��^�?�r��@�rm�A��B��C��D��E��F��G��H��I��J��K��L��M��N��O��P��Q��R��S��T��U��V��W��X��Y��Z��[�r��\��|�]�r��^��~�_��⦅�`��⦆�a��b��「�c��」�d��、�e��・�f��ヲ)rYrr�)rZrr�)r[rr�)r\rr�)r]rr^)r_rr�)r`rr�)rarr�)rbrr�)rcrr{)rdrEr�)rerEr)rfrrg)rhrEr�)rirEr�)rjrEr�)rkrEr)rlrEr�)rmrEr�)rnrEr�)rorEr�)rprEr�)rqrEr�)rrrr�)rsrrl)rtrr�)rurr�)rvrr�)rwrr�)rxrr�)ryrErF)rzrErH)r{rErJ)r|rErL)r}rErN)r~rErP)rrErR)r�rErT)r�rErV)r�rErX)r�rErZ)r�rEr\)r�rEr^)r�rEr`)r�rErb)r�rErd)r�rErf)r�rErh)r�rErj)r�rErl)r�rErn)r�rErp)r�rErr)r�rErt)r�rErv)r�rErx)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rrm)r�rErF)r�rErH)r�rErJ)r�rErL)r�rErN)r�rErP)r�rErR)r�rErT)r�rErV)r�rErX)r�rErZ)r�rEr\)r�rEr^)r�rEr`)r�rErb)r�rErd)r�rErf)r�rErh)r�rErj)r�rErl)r�rErn)r�rErp)r�rErr)r�rErt)r�rErv)r�rErx)r�rr�)r�rr�)r�rr�)r�rr�)r�rEr�)r�rEr�)r�rEr)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�r�r�r�r�r��_seg_49�s�r�cfCs$d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d��d�d�d�d�d�d�d�d�d�d	�d
�d�d�d
�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d �d!�d"�d#�d$�d%�d&�d'�d(�d)�d*�d+gdS(,N�g�rE�ァ�h��ィ�i��ゥ�j��ェ�k��ォ�l��ャ�m��ュ�n��ョ�o��ッ�p��ー�q��ア�r��イ�s��ウ�t��エ�u��オ�v��カ�w��キ�x��ク�y��ケ�z��コ�{��サ�|��シ�}��ス�~��セ���ソ��タ��チ��ツ��テ��ト��ナ��ニ��ヌ��ネ��ノ��ハ��ヒ��フ��ヘ��ホ��マ��ミ��ム��メ��モ��ヤ��ユ��ヨ��ラ��リ��ル��レ��ロ��ワ��ン��゙��゚�r���ᄀ��ᄁ��ᆪ��ᄂ��ᆬ��ᆭ��ᄃ��ᄄ��ᄅ��ᆰ��ᆱ��ᆲ��ᆳ��ᆴ��ᆵ��ᄚ��ᄆ��ᄇ��ᄈ��ᄡ��ᄉ��ᄊ��ᄋ��ᄌ��ᄍ��ᄎ��ᄏ��ᄐ��ᄑ��ᄒ����ᅡ���ᅢ���ᅣ���ᅤ���ᅥ���ᅦ�����ᅧ���ᅨ���ᅩ���ᅪ)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r rEr )r rEr )r rEr )r rEr )r rEr	 )r
 rEr )r rEr
 )r rEr )r rEr )r rEr )r rEr )r rEr )r rEr )r rEr )r rEr )r rEr )r  rEr! )r" rEr# )r$ rEr% )r& rEr' )r( rEr) )r* rEr+ )r, rEr- )r. rEr/ )r0 rEr1 )r2 rEr3 )r4 rEr5 )r6 rEr7 )r8 rEr9 )r: rEr; )r< r�)r= rEr> )r? rEr@ )rA rErB )rC rErD )rE rErF )rG rErH )rI rErJ )rK rErL )rM rErN )rO rErP )rQ rErR )rS rErT )rU rErV )rW rErX )rY rErZ )r[ rEr\ )r] rEr^ )r_ rEr` )ra rErb )rc rErd )re rErf )rg rErh )ri rErj )rk rErl )rm rErn )ro rErp )rq rErr )rs rErt )ru rErv )rw rErx )ry r�)rz rEr{ )r| rEr} )r~ rEr )r� rEr� )r� rEr� )r� rEr� )r� r�)r� rEr� )r� rEr� )r� rEr� )r� rEr� r�r�r�r�r��_seg_50Xs�r� cfCs�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d��d�d�d�dgdS(N��rE�ᅫ���ᅬ��r����ᅭ���ᅮ���ᅯ���ᅰ���ᅱ���ᅲ�����ᅳ���ᅴ���ᅵ�����¢���£���¬��r� ̄���¦���¥���₩�����│���←���↑���→���↓���■���○���r0��
�'�(�;�<�>�?�N�P�^������4�7�������������� �$�0�K�����������𐐨��𐐩��𐐪��𐐫��𐐬��𐐭��𐐮��𐐯��𐐰�	�𐐱�
�𐐲��𐐳��𐐴�
�𐐵��𐐶��𐐷��𐐸��𐐹��𐐺��𐐻��𐐼��𐐽��𐐾��𐐿��𐑀��𐑁��𐑂��𐑃��𐑄��𐑅)r� rEr� )r� rEr� )r� r�)r� rEr� )r� rEr� )r� rEr� )r� rEr� )r� rEr� )r� rEr� )r� r�)r� rEr� )r� rEr� )r� rEr� )r� r�)r� rEr� )r� rEr� )r� rEr� )r� rr� )r� rEr� )r� rEr� )r� rEr� )r� r�)r� rEr� )r� rEr� )r� rEr� )r� rEr� )r� rEr� )r� rEr� )r� rEr� )r� r�)r� r0)r� r�)r� r0)r� r�)r� r0)r� r�)r� r0)r� r�)r� r0)r� r�)r� r0)r� r�)r� r0)r� r�)r� r0)r� r�)r� r0)r� r�)r� r0)r� r�)r� r0)r� r�)r� r0)r� r�)r� r0)r� r�)r� r0)r� r�)r� r0)r� r�)r� r0)r� r�)r� r0)r� r�)r� r0)r� r�)r� r0)r� r�)r� r0)r� r�)r� rEr� )r� rEr� )r� rEr� )r� rEr� )r� rEr� )r� rEr� )r� rEr� )r� rEr� )r� rEr!)r!rEr!)r!rEr!)r!rEr!)r!rEr!)r	!rEr
!)r!rEr!)r
!rEr!)r!rEr!)r!rEr!)r!rEr!)r!rEr!)r!rEr!)r!rEr!)r!rEr!)r!rEr!)r!rEr !)r!!rEr"!)r#!rEr$!)r%!rEr&!)r'!rEr(!)r)!rEr*!r�r�r�r�r��_seg_51�s�r+!ceCs�drdsdtdudvdwdxdydzd{d|d}d~dd�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�gdS)�N�rE�𐑆��𐑇� �𐑈�!�𐑉�"�𐑊�#�𐑋�$�𐑌�%�𐑍�&�𐑎�'�𐑏�(r0�r�������	�
�6�7�9�<�=�?�V�W�`�	�	�	�:	�?	�@	�	�	�	�	�
�
�
�
�
�
�
�
�
�4
�8
�;
�?
�H
�P
�Y
�`
�
��6�9�V�X�s�x���I�`���N�R�p��������������5�6�D������������� �o#�$�c$�p$�t$�0�/4)r,!rEr-!)r.!rEr/!)r0!rEr1!)r2!rEr3!)r4!rEr5!)r6!rEr7!)r8!rEr9!)r:!rEr;!)r<!rEr=!)r>!rEr?!)r@!r0)rA!r�)rB!r0)rC!r�)rD!r0)rE!r�)rF!r0)rG!r�)rH!r0)rI!r�)rJ!r0)rK!r�)rL!r0)rM!r�)rN!r0)rO!r�)rP!r0)rQ!r�)rR!r0)rS!r�)rT!r0)rU!r�)rV!r0)rW!r�)rX!r0)rY!r�)rZ!r0)r[!r�)r\!r0)r]!r�)r^!r0)r_!r�)r`!r0)ra!r�)rb!r0)rc!r�)rd!r0)re!r�)rf!r0)rg!r�)rh!r0)ri!r�)rj!r0)rk!r�)rl!r0)rm!r�)rn!r0)ro!r�)rp!r0)rq!r�)rr!r0)rs!r�)rt!r0)ru!r�)rv!r0)rw!r�)rx!r0)ry!r�)rz!r0)r{!r�)r|!r0)r}!r�)r~!r0)r!r�)r�!r0)r�!r�)r�!r0)r�!r�)r�!r0)r�!r�)r�!r0)r�!r�)r�!r0)r�!r�)r�!r0)r�!r�)r�!r0)r�!r�)r�!r0)r�!r�)r�!r0)r�!r�)r�!r0)r�!r�)r�!r0)r�!r�)r�!r0)r�!r�)r�!r0)r�!r�r�r�r�r�r��_seg_52(s�r�!cfCs�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�gdS)�N�hr0�9jr��o�Eo�Po�o�o�o�����������'��)��^�rE�𝅗𝅥�_��𝅘𝅥�`��𝅘𝅥𝅮�a��𝅘𝅥𝅯�b��𝅘𝅥𝅰�c��𝅘𝅥𝅱�d��𝅘𝅥𝅲�e��s��{����𝆹𝅥���𝆺𝅥���𝆹𝅥𝅮���𝆺𝅥𝅮���𝆹𝅥𝅯���𝆺𝅥𝅯��������F����W��`��r���rF��rH��rJ��rL��rN��rP��rR��rT��rV�	�rX�
�rZ��r\��r^�
�r`��rb��rd��rf��rh��rj��rl��rn��rp��rr��rt��rv��rx������������� ��!��"��#��$��%��&��'��(��)��*��+��,��-��.��/��0��1��2��3��4��5��6��7��8��9��:��;��<�)r�!r0)r�!r�)r�!r0)r�!r�)r�!r0)r�!r�)r�!r0)r�!r�)r�!r0)r�!r�)r�!r0)r�!r�)r�!r0)r�!r�)r�!r0)r�!rEr�!)r�!rEr�!)r�!rEr�!)r�!rEr�!)r�!rEr�!)r�!rEr�!)r�!rEr�!)r�!r0)r�!r�)r�!r0)r�!rEr�!)r�!rEr�!)r�!rEr�!)r�!rEr�!)r�!rEr�!)r�!rEr�!)r�!r0)r�!r�)r�!r0)r�!r�)r�!r0)r�!r�)r�!r0)r�!r�)r�!rErF)r�!rErH)r�!rErJ)r�!rErL)r�!rErN)r�!rErP)r�!rErR)r�!rErT)r�!rErV)r�!rErX)r�!rErZ)r�!rEr\)r�!rEr^)r�!rEr`)r�!rErb)r�!rErd)r�!rErf)r�!rErh)r�!rErj)r�!rErl)r�!rErn)r�!rErp)r�!rErr)r�!rErt)r�!rErv)r�!rErx)r�!rErF)r�!rErH)r�!rErJ)r�!rErL)r�!rErN)r�!rErP)r�!rErR)r�!rErT)r�!rErV)r�!rErX)r�!rErZ)r�!rEr\)r�!rEr^)r�!rEr`)r�!rErb)r�!rErd)r�!rErf)r�!rErh)r�!rErj)r�!rErl)r�!rErn)r�!rErp)r�!rErr)r"rErt)r"rErv)r"rErx)r"rErF)r"rErH)r"rErJ)r"rErL)r"rErN)r"rErP)r	"rErR)r
"rErT)r"rErVr�r�r�r�r��_seg_53�s�r"ceCs�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�gdS)�N�=�rErX�>�rZ�?�r\�@�r^�A�r`�B�rb�C�rd�D�rf�E�rh�F�rj�G�rl�H�rn�I�rp�J�rr�K�rt�L�rv�M�rx�N�rF�O�rH�P�rJ�Q�rL�R�rN�S�rP�T�rR�U�r��V�rV�W��X��Y��Z��[��\��]��^��_��`��a��b��c��d��e��f��g��h��i��j��k��l��m��n��o�rT�p��q��r��s��t��u��v��w��x��y��z��{��|��}��~���������������������������������������������������������������������)r
"rErX)r"rErZ)r"rEr\)r"rEr^)r"rEr`)r"rErb)r"rErd)r"rErf)r"rErh)r"rErj)r"rErl)r"rErn)r"rErp)r"rErr)r"rErt)r"rErv)r"rErx)r"rErF)r"rErH)r "rErJ)r!"rErL)r""rErN)r#"rErP)r$"rErR)r%"r�)r&"rErV)r'"rErX)r("rErZ)r)"rEr\)r*"rEr^)r+"rEr`)r,"rErb)r-"rErd)r."rErf)r/"rErh)r0"rErj)r1"rErl)r2"rErn)r3"rErp)r4"rErr)r5"rErt)r6"rErv)r7"rErx)r8"rErF)r9"rErH)r:"rErJ)r;"rErL)r<"rErN)r="rErP)r>"rErR)r?"rErT)r@"rErV)rA"rErX)rB"rErZ)rC"rEr\)rD"rEr^)rE"rEr`)rF"rErb)rG"rErd)rH"rErf)rI"rErh)rJ"rErj)rK"rErl)rL"rErn)rM"rErp)rN"rErr)rO"rErt)rP"rErv)rQ"rErx)rR"rErF)rS"rErH)rT"rErJ)rU"rErL)rV"rErN)rW"rErP)rX"rErR)rY"rErT)rZ"rErV)r["rErX)r\"rErZ)r]"rEr\)r^"rEr^)r_"rEr`)r`"rErb)ra"rErd)rb"rErf)rc"rErh)rd"rErj)re"rErl)rf"rErn)rg"rErp)rh"rErr)ri"rErt)rj"rErv)rk"rErx)rl"rErF)rm"r�)rn"rErJ)ro"rErL)rp"r�r�r�r�r�r��_seg_54�s�rq"cfCs�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�gdS)�N��rErR��r���rX��rZ����r`��rb��rd��rf����rj��rl��rn��rp��rr��rt��rv��rx��rF��rH��rJ��rL����rP����rT��rV������r\���r^���������������rh���������������������������������������rN��������������������������������������������������������������������������������������������������������������������������������������)rr"rErR)rs"r�)rt"rErX)ru"rErZ)rv"r�)rw"rEr`)rx"rErb)ry"rErd)rz"rErf)r{"r�)r|"rErj)r}"rErl)r~"rErn)r"rErp)r�"rErr)r�"rErt)r�"rErv)r�"rErx)r�"rErF)r�"rErH)r�"rErJ)r�"rErL)r�"r�)r�"rErP)r�"r�)r�"rErT)r�"rErV)r�"rErX)r�"rErZ)r�"rEr\)r�"rEr^)r�"rEr`)r�"r�)r�"rErd)r�"rErf)r�"rErh)r�"rErj)r�"rErl)r�"rErn)r�"rErp)r�"rErr)r�"rErt)r�"rErv)r�"rErx)r�"rErF)r�"rErH)r�"rErJ)r�"rErL)r�"rErN)r�"rErP)r�"rErR)r�"rErT)r�"rErV)r�"rErX)r�"rErZ)r�"rEr\)r�"rEr^)r�"rEr`)r�"rErb)r�"rErd)r�"rErf)r�"rErh)r�"rErj)r�"rErl)r�"rErn)r�"rErp)r�"rErr)r�"rErt)r�"rErv)r�"rErx)r�"rErF)r�"rErH)r�"rErJ)r�"rErL)r�"rErN)r�"rErP)r�"rErR)r�"rErT)r�"rErV)r�"rErX)r�"rErZ)r�"rEr\)r�"rEr^)r�"rEr`)r�"rErb)r�"rErd)r�"rErf)r�"rErh)r�"rErj)r�"rErl)r�"rErn)r�"rErp)r�"rErr)r�"rErt)r�"rErv)r�"rErx)r�"rErF)r�"rErH)r�"r�)r�"rErLr�r�r�r�r��_seg_55`s�r�"cfCs�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�gdS)�N��rErN�	�rP�
�rR��r��
�rX��rZ��r\��r^��r`��rb��rd��rf����rj��rl��rn��rp��rr��rt��rv����rF��rH� �rJ�!�rL�"��#��$��%�rT�&�rV�'��(��)��*��+��,��-��.��/�rh�0��1��2��3��4��5��6��7�rx�8��9��:��;��<��=��>��?��@��A��B��C��D��E��F��G��J��K��L��M��N��O��P��Q��R��S��T��U��V��W��X��Y��Z��[��\��]��^��_��`��a��b��c��d��e��f��g��h��i��j��k��l��m��n�)r�"rErN)r�"rErP)r�"rErR)r�"r�)r�"rErX)r�"rErZ)r�"rEr\)r�"rEr^)r�"rEr`)r�"rErb)r�"rErd)r�"rErf)r�"r�)r�"rErj)r�"rErl)r�"rErn)r�"rErp)r�"rErr)r�"rErt)r�"rErv)r�"r�)r�"rErF)r�"rErH)r�"rErJ)r�"rErL)r�"rErN)r�"rErP)r�"rErR)r�"rErT)r�"rErV)r�"rErX)r�"rErZ)r�"rEr\)r�"rEr^)r�"rEr`)r�"rErb)r�"rErd)r�"rErf)r�"rErh)r�"rErj)r�"rErl)r#rErn)r#rErp)r#rErr)r#rErt)r#rErv)r#rErx)r#rErF)r#rErH)r#r�)r	#rErL)r
#rErN)r#rErP)r#rErR)r
#r�)r#rErV)r#rErX)r#rErZ)r#rEr\)r#rEr^)r#r�)r#rErb)r#r�)r#rErj)r#rErl)r#rErn)r#rErp)r#rErr)r#rErt)r#rErv)r#r�)r#rErF)r#rErH)r #rErJ)r!#rErL)r"#rErN)r##rErP)r$#rErR)r%#rErT)r&#rErV)r'#rErX)r(#rErZ)r)#rEr\)r*#rEr^)r+#rEr`)r,#rErb)r-#rErd)r.#rErf)r/#rErh)r0#rErj)r1#rErl)r2#rErn)r3#rErp)r4#rErr)r5#rErt)r6#rErv)r7#rErx)r8#rErF)r9#rErH)r:#rErJr�r�r�r�r��_seg_56�s�r;#cfCs�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�gdS)�N�o�rErL�p�rN�q�rP�r�rR�s�rT�t�rV�u�rX�v�rZ�w�r\�x�r^�y�r`�z�rb�{�rd�|�rf�}�rh�~�rj��rl��rn��rp��rr��rt��rv��rx��rF��rH��rJ���������������������������������������������������������������������������������������������������������������������������������������������������������������������)r<#rErL)r=#rErN)r>#rErP)r?#rErR)r@#rErT)rA#rErV)rB#rErX)rC#rErZ)rD#rEr\)rE#rEr^)rF#rEr`)rG#rErb)rH#rErd)rI#rErf)rJ#rErh)rK#rErj)rL#rErl)rM#rErn)rN#rErp)rO#rErr)rP#rErt)rQ#rErv)rR#rErx)rS#rErF)rT#rErH)rU#rErJ)rV#rErL)rW#rErN)rX#rErP)rY#rErR)rZ#rErT)r[#rErV)r\#rErX)r]#rErZ)r^#rEr\)r_#rEr^)r`#rEr`)ra#rErb)rb#rErd)rc#rErf)rd#rErh)re#rErj)rf#rErl)rg#rErn)rh#rErp)ri#rErr)rj#rErt)rk#rErv)rl#rErx)rm#rErF)rn#rErH)ro#rErJ)rp#rErL)rq#rErN)rr#rErP)rs#rErR)rt#rErT)ru#rErV)rv#rErX)rw#rErZ)rx#rEr\)ry#rEr^)rz#rEr`)r{#rErb)r|#rErd)r}#rErf)r~#rErh)r#rErj)r�#rErl)r�#rErn)r�#rErp)r�#rErr)r�#rErt)r�#rErv)r�#rErx)r�#rErF)r�#rErH)r�#rErJ)r�#rErL)r�#rErN)r�#rErP)r�#rErR)r�#rErT)r�#rErV)r�#rErX)r�#rErZ)r�#rEr\)r�#rEr^)r�#rEr`)r�#rErb)r�#rErd)r�#rErf)r�#rErh)r�#rErj)r�#rErl)r�#rErn)r�#rErp)r�#rErr)r�#rErt)r�#rErvr�r�r�r�r��_seg_570s�r�#cfCs�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�gdS)�N���rErx���rF���rH���rJ���rL���rN���rP���rR���rT���rV���rX���rZ���r\���r^���r`���rb���rd���rf���rh���rj���rl���rn���rp���rr���rt���rv�����������������������������������������������������������������	��
������
�������������������������������������� ��!��"��#��$��%��&��'��(��)��*��+��,��-��.��/��0��1��2��3��4��5��6�)r�#rErx)r�#rErF)r�#rErH)r�#rErJ)r�#rErL)r�#rErN)r�#rErP)r�#rErR)r�#rErT)r�#rErV)r�#rErX)r�#rErZ)r�#rEr\)r�#rEr^)r�#rEr`)r�#rErb)r�#rErd)r�#rErf)r�#rErh)r�#rErj)r�#rErl)r�#rErn)r�#rErp)r�#rErr)r�#rErt)r�#rErv)r�#rErx)r�#rErF)r�#rErH)r�#rErJ)r�#rErL)r�#rErN)r�#rErP)r�#rErR)r�#rErT)r�#rErV)r�#rErX)r�#rErZ)r�#rEr\)r�#rEr^)r�#rEr`)r�#rErb)r�#rErd)r�#rErf)r�#rErh)r�#rErj)r�#rErl)r�#rErn)r�#rErp)r�#rErr)r�#rErt)r�#rErv)r�#rErx)r�#rErF)r�#rErH)r�#rErJ)r�#rErL)r�#rErN)r�#rErP)r�#rErR)r�#rErT)r�#rErV)r�#rErX)r�#rErZ)r�#rEr\)r�#rEr^)r�#rEr`)r�#rErb)r�#rErd)r�#rErf)r�#rErh)r�#rErj)r�#rErl)r�#rErn)r�#rErp)r�#rErr)r�#rErt)r�#rErv)r�#rErx)r�#rErF)r�#rErH)r�#rErJ)r�#rErL)r�#rErN)r�#rErP)r�#rErR)r�#rErT)r�#rErV)r�#rErX)r�#rErZ)r�#rEr\)r�#rEr^)r�#rEr`)r�#rErb)r�#rErd)r$rErf)r$rErh)r$rErj)r$rErl)r$rErnr�r�r�r�r��_seg_58�s�r$cfCs�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�gdS)�N�7�rErp�8�rr�9�rt�:�rv�;�rx�<�rF�=�rH�>�rJ�?�rL�@�rN�A�rP�B�rR�C�rT�D�rV�E�rX�F�rZ�G�r\�H�r^�I�r`�J�rb�K�rd�L�rf�M�rh�N�rj�O�rl�P�rn�Q��R��S��T��U��V��W��X��Y��Z��[��\��]��^��_��`��a��b��c��d��e��f��g��h��i��j��k��l��m��n��o��p��q��r��s��t��u��v��w��x��y��z��{��|��}��~���������������������������������������������������������)r$rErp)r$rErr)r$rErt)r	$rErv)r
$rErx)r$rErF)r$rErH)r
$rErJ)r$rErL)r$rErN)r$rErP)r$rErR)r$rErT)r$rErV)r$rErX)r$rErZ)r$rEr\)r$rEr^)r$rEr`)r$rErb)r$rErd)r$rErf)r$rErh)r$rErj)r$rErl)r$rErn)r $rErp)r!$rErr)r"$rErt)r#$rErv)r$$rErx)r%$rErF)r&$rErH)r'$rErJ)r($rErL)r)$rErN)r*$rErP)r+$rErR)r,$rErT)r-$rErV)r.$rErX)r/$rErZ)r0$rEr\)r1$rEr^)r2$rEr`)r3$rErb)r4$rErd)r5$rErf)r6$rErh)r7$rErj)r8$rErl)r9$rErn)r:$rErp)r;$rErr)r<$rErt)r=$rErv)r>$rErx)r?$rErF)r@$rErH)rA$rErJ)rB$rErL)rC$rErN)rD$rErP)rE$rErR)rF$rErT)rG$rErV)rH$rErX)rI$rErZ)rJ$rEr\)rK$rEr^)rL$rEr`)rM$rErb)rN$rErd)rO$rErf)rP$rErh)rQ$rErj)rR$rErl)rS$rErn)rT$rErp)rU$rErr)rV$rErt)rW$rErv)rX$rErx)rY$rErF)rZ$rErH)r[$rErJ)r\$rErL)r]$rErN)r^$rErP)r_$rErR)r`$rErT)ra$rErV)rb$rErX)rc$rErZ)rd$rEr\)re$rEr^)rf$rEr`)rg$rErb)rh$rErd)ri$rErfr�r�r�r�r��_seg_59s�rj$cfCs�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�gdS)�N��rErh��rj��rl��rn��rp��rr��rt��rv��rx���ı���ȷ��r����α���β���γ���δ���ε���ζ���η���θ���ι���κ���λ���μ���ν���ξ���ο���π���ρ�����σ���τ���υ���φ���χ���ψ���ω���∇����������������������������������������������������������������������������∂���������������������������������������������������������������������������������������������������)rk$rErh)rl$rErj)rm$rErl)rn$rErn)ro$rErp)rp$rErr)rq$rErt)rr$rErv)rs$rErx)rt$rEru$)rv$rErw$)rx$r�)ry$rErz$)r{$rEr|$)r}$rEr~$)r$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rErz$)r�$rEr|$)r�$rEr~$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rErz$)r�$rEr|$)r�$rEr~$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rErz$)r�$rEr|$)r�$rEr~$)r�$rEr�$)r�$rEr�$r�r�r�r�r��_seg_60hs�r�$cfCs�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�gdS)�N��rE�ζ���η���θ���ι���κ���λ���μ���ν�	��ξ�
��ο���π���ρ�
��σ���τ���υ���φ���χ���ψ���ω���∂���ε�������������α���β���γ���δ� ��!��"��#��$��%��&��'��(��)��*��+��,��-��.��/��0��1��2��3��4��5��∇�6��7��8��9��:��;��<��=��>��?��@��A��B��C��D��E��F��G��I��J��K��L��M��N��O��P��Q��R��S��T��U��V��W��X��Y��Z��[��\��]��^��_��`��a��b��c��d��e��f�)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r%rEr%)r%rEr%)r%rEr%)r%rEr%)r%rEr	%)r
%rEr%)r%rEr
%)r%rEr%)r%rEr%)r%rEr%)r%rEr%)r%rEr�$)r%rEr�$)r%rEr%)r%rEr%)r%rEr%)r%rEr%)r%rEr%)r%rEr %)r!%rEr"%)r#%rEr%)r$%rEr�$)r%%rEr�$)r&%rEr�$)r'%rEr�$)r(%rEr�$)r)%rEr�$)r*%rEr�$)r+%rEr�$)r,%rEr�$)r-%rEr�$)r.%rEr%)r/%rEr%)r0%rEr�$)r1%rEr%)r2%rEr%)r3%rEr	%)r4%rEr%)r5%rEr
%)r6%rEr%)r7%rEr%)r8%rEr9%)r:%rEr%)r;%rEr%)r<%rEr %)r=%rEr"%)r>%rEr%)r?%rEr�$)r@%rEr�$)rA%rEr�$)rB%rEr�$)rC%rEr�$)rD%rEr�$)rE%rEr�$)rF%rEr�$)rG%rEr�$)rH%rEr�$)rI%rEr%)rJ%rEr%)rK%rEr%)rL%rEr%)rM%rEr	%)rN%rEr%)rO%rEr
%)rP%rEr%)rQ%rEr%)rR%rEr%)rS%rEr%)rT%rEr�$)rU%rEr�$)rV%rEr%)rW%rEr%)rX%rEr%)rY%rEr%)rZ%rEr%)r[%rEr %)r\%rEr"%)r]%rEr%)r^%rEr�$)r_%rEr�$)r`%rEr�$)ra%rEr�$)rb%rEr�$)rc%rEr�$)rd%rEr�$)re%rEr�$)rf%rEr�$)rg%rEr�$)rh%rEr%)ri%rEr%r�r�r�r�r��_seg_61�s�rj%cfCs�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�gdS)�N�g�rE�θ�h��σ�i��τ�j��υ�k��φ�l��χ�m��ψ�n��ω�o��∇�p��α�q��β�r��γ�s��δ�t��ε�u��ζ�v��η�w��x��ι�y��κ�z��λ�{��μ�|��ν�}��ξ�~��ο���π���ρ�����������������∂������������������������������������������������������������������������������������������������������������������������������������������ϝ���r����r�)rk%rErl%)rm%rErn%)ro%rErp%)rq%rErr%)rs%rErt%)ru%rErv%)rw%rErx%)ry%rErz%)r{%rEr|%)r}%rEr~%)r%rEr�%)r�%rEr�%)r�%rEr�%)r�%rEr�%)r�%rEr�%)r�%rEr�%)r�%rErl%)r�%rEr�%)r�%rEr�%)r�%rEr�%)r�%rEr�%)r�%rEr�%)r�%rEr�%)r�%rEr�%)r�%rEr�%)r�%rEr�%)r�%rErn%)r�%rErp%)r�%rErr%)r�%rErt%)r�%rErv%)r�%rErx%)r�%rErz%)r�%rEr�%)r�%rEr�%)r�%rErl%)r�%rEr�%)r�%rErt%)r�%rEr�%)r�%rEr�%)r�%rEr~%)r�%rEr�%)r�%rEr�%)r�%rEr�%)r�%rEr�%)r�%rEr�%)r�%rEr�%)r�%rErl%)r�%rEr�%)r�%rEr�%)r�%rEr�%)r�%rEr�%)r�%rEr�%)r�%rEr�%)r�%rEr�%)r�%rEr�%)r�%rEr�%)r�%rErl%)r�%rErn%)r�%rErp%)r�%rErr%)r�%rErt%)r�%rErv%)r�%rErx%)r�%rErz%)r�%rEr|%)r�%rEr~%)r�%rEr�%)r�%rEr�%)r�%rEr�%)r�%rEr�%)r�%rEr�%)r�%rEr�%)r�%rErl%)r�%rEr�%)r�%rEr�%)r�%rEr�%)r�%rEr�%)r�%rEr�%)r�%rEr�%)r�%rEr�%)r�%rEr�%)r�%rEr�%)r�%rErn%)r�%rErp%)r�%rErr%)r�%rErt%)r�%rErv%)r�%rErx%)r�%rErz%)r�%rEr�%)r�%rEr�%)r�%rErl%)r�%rEr�%)r�%rErt%)r�%rEr�%)r�%rEr�%)r�%rEr�%)r�%r�)r�%rEr�r�r�r�r�r��_seg_628s�r�%cfCs�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�gdS)�N���rEr����r����r���r����r����r����r����r����r����r�������������������������������������������������������������������������������������������������������������r����ا���ب���ج���د�����و���ز���ح���ط�	��ي�
��ك���ل���م�
��ن���س���ع���ف���ص���ق���ر���ش���ت���ث���خ���ذ���ض���ظ���غ���ٮ���ں���ڡ���ٯ� ��!��"��#��$��ه�%��'��(��)��*��+��,��-��.��/��0��1��2�)r�%rEr�)r�%rEr�)r�%rEr)r�%rEr�)r�%rEr�)r�%rEr�)r�%rEr�)r�%rEr�)r�%rEr�)r�%rEr�)r�%rEr�)r�%rEr�)r�%rEr)r�%rEr�)r�%rEr�)r�%rEr�)r�%rEr�)r�%rEr�)r�%rEr�)r�%rEr�)r�%rEr�)r&rEr�)r&rEr)r&rEr�)r&rEr�)r&rEr�)r&rEr�)r&rEr�)r&rEr�)r&rEr�)r	&rEr�)r
&rEr�)r&rEr)r&rEr�)r
[binary .pyc data omitted: tail of a compiled idna uts46data module, followed by these archive members]
site-packages/pip/_vendor/idna/__pycache__/package_data.cpython-36.opt-1.pyc
site-packages/pip/_vendor/idna/__pycache__/idnadata.cpython-36.pyc
site-packages/pip/_vendor/idna/__pycache__/__init__.cpython-36.opt-1.pyc
site-packages/pip/_vendor/idna/__pycache__/__init__.cpython-36.pyc
site-packages/pip/_vendor/idna/__pycache__/compat.cpython-36.pyc
site-packages/pip/_vendor/idna/__pycache__/compat.cpython-36.opt-1.pyc
site-packages/pip/_vendor/idna/__pycache__/idnadata.cpython-36.opt-1.pyc
site-packages/pip/_vendor/idna/__pycache__/uts46data.cpython-36.pyc
s�r�cfCs�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d��d�d�d�d�d�d�d�d�d�d	gdS(
N��,r0��,rE�ⳳ��,��,r��,�&-�'-�(-�--�.-�0-�h-�o-�ⵡ�p-�q-�-�-�-�-�-�-�-�-�-�-�-��-��-��-��-��-��-��-��-�<.�.�.�.�.�母�.��.�龟��.�/�一�/�丨�/�丶�/�丿�/�乙�/�亅�/�二�/�亠�/�人�	/�儿�
/�入�/�八�/�冂�
/�冖�/�冫�/�几�/�凵�/�刀�/�力�/�勹�/�匕�/�匚�/�匸�/�十�/�卜�/�卩�/�厂�/�厶�/�又�/�口�/�囗�/�土� /�士�!/�夂�"/�夊�#/�夕�$/�大�%/�女�&/�子�'/�宀�(/�寸�)/�小�*/�尢�+/�尸�,/�屮�-/�山�./�巛�//�工�0/�己�1/�巾�2/�干�3/�幺�4/�广�5/�廴�6/�廾�7/�弋�8/�弓�9/�彐)r�r0)r�rEr�)r�r0)r�r�)r�r0)r�r�)r�r0)r�r�)r�r0)r�r�)r�r0)r�r�)r�rEr�)r�r0)r�r�)r�r0)r�r�)r�r0)r�r�)r�r0)r�r�)r�r0)r�r�)r�r0)r�r�)r�r0)r�r�)r�r0)r�r�)r�r0)r�r�)r�r0)r�r�)r�r0)r�r�)r�r0)r�r�)r�r0)r�rEr�)r�r0)r�rEr�)r�r�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr)rrEr)rrEr)rrEr)rrEr)r	rEr
)rrEr)r
rEr)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr )r!rEr")r#rEr$)r%rEr&)r'rEr()r)rEr*)r+rEr,)r-rEr.)r/rEr0)r1rEr2)r3rEr4)r5rEr6)r7rEr8)r9rEr:)r;rEr<)r=rEr>)r?rEr@r�r�r�r�r��_seg_26�
s�rAcfCs(d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d��d�d�d�d�d�d�d�d�d�d	�d
�d�d�d
�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d �d!�d"�d#�d$�d%�d&�d'�d(�d)�d*�d+�d,�d-gdS(.N�:/rE�彡�;/�彳�</�心�=/�戈�>/�戶�?/�手�@/�支�A/�攴�B/�文�C/�斗�D/�斤�E/�方�F/�无�G/�日�H/�曰�I/�月�J/�木�K/�欠�L/�止�M/�歹�N/�殳�O/�毋�P/�比�Q/�毛�R/�氏�S/�气�T/�水�U/�火�V/�爪�W/�父�X/�爻�Y/�爿�Z/�片�[/�牙�\/�牛�]/�犬�^/�玄�_/�玉�`/�瓜�a/�瓦�b/�甘�c/�生�d/�用�e/�田�f/�疋�g/�疒�h/�癶�i/�白�j/�皮�k/�皿�l/�目�m/�矛�n/�矢�o/�石�p/�示�q/�禸�r/�禾�s/�穴�t/�立�u/�竹�v/�米�w/�糸�x/�缶�y/�网�z/�羊�{/�羽�|/�老�}/�而�~/�耒�/�耳�/�聿�/�肉�/�臣�/�自�/�至�/�臼�/�舌�/�舛�/�舟�/�艮�/�色�/�艸�/�虍�/�虫�/�血�/�行�/�衣�/�襾�/�見�/�角�/�言�/�谷�/�豆�/�豕�/�豸�/�貝�/�赤�/�走�/�足�/�身)rBrErC)rDrErE)rFrErG)rHrErI)rJrErK)rLrErM)rNrErO)rPrErQ)rRrErS)rTrErU)rVrErW)rXrErY)rZrEr[)r\rEr])r^rEr_)r`rEra)rbrErc)rdrEre)rfrErg)rhrEri)rjrErk)rlrErm)rnrEro)rprErq)rrrErs)rtrEru)rvrErw)rxrEry)rzrEr{)r|rEr})r~rEr)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)rrEr)rrEr)rrEr)rrEr)rrEr	r�r�r�r�r��_seg_27s�r
cfCsd�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d��d�d�d�d�d�d�d�d�d�d	�d
�d�d�d
�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d �d!�d"gdS(#N�/rE�車�/�辛�/�辰�/�辵�/�邑�/�酉�/�釆�/�里�/�金�/�長�/�門�/�阜�/�隶�/�隹�/�雨�/�靑�/�非�/�面�/�革�/�韋�/�韭�/�音�/�頁�/�風�/�飛�/�食�/�首�/�香�/�馬�/�骨�/�高�/�髟�/�鬥�/�鬯�/�鬲�/�鬼��/�魚��/�鳥��/�鹵��/�鹿��/�麥��/�麻��/�黃��/�黍��/�黑��/�黹��/�黽��/�鼎��/�鼓��/�鼠��/�鼻��/�齊��/�齒��/�龍��/�龜��/�龠��/r��0rr��0r0�0�.�0�60�〒�70�80�十�90�卄�:0�卅�;0�@0�A0�0�0�0� ゙�0� ゚�0�0�より�0�0�コト�1�1�.1�11�ᄀ�21�ᄁ�31�ᆪ�41�ᄂ�51�ᆬ�61�ᆭ�71�ᄃ�81�ᄄ�91�ᄅ�:1�ᆰ�;1�ᆱ�<1�ᆲ�=1�ᆳ�>1�ᆴ�?1�ᆵ�@1�ᄚ�A1�ᄆ�B1�ᄇ�C1�ᄈ�D1�ᄡ)rrEr)r
rEr)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr )r!rEr")r#rEr$)r%rEr&)r'rEr()r)rEr*)r+rEr,)r-rEr.)r/rEr0)r1rEr2)r3rEr4)r5rEr6)r7rEr8)r9rEr:)r;rEr<)r=rEr>)r?rEr@)rArErB)rCrErD)rErErF)rGrErH)rIrErJ)rKrErL)rMrErN)rOrErP)rQrErR)rSrErT)rUrErV)rWrErX)rYrErZ)r[rEr\)r]rEr^)r_rEr`)rarErb)rcrErd)rerErf)rgrErh)rirErj)rkrErl)rmrErn)rorErp)rqrErr)rsrErt)rurErv)rwrErx)ryrErz)r{r�)r|rr�)r}r0)r~rEr)r�r0)r�rEr�)r�r0)r�rEr�)r�rEr�)r�rEr�)r�r0)r�r�)r�r0)r�r�)r�r0)r�rr�)r�rr�)r�r0)r�rEr�)r�r0)r�rEr�)r�r�)r�r0)r�r�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�r�r�r�r�r��_seg_28hs�r�cfCsd�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d��d�d�d�d�d�d�d�d�d�d	�d
�d�d�d
�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d �d!�d"�d#�d$�d%�d&�d'�d(gdS()N�E1rE�ᄉ�F1�ᄊ�G1�ᄋ�H1�ᄌ�I1�ᄍ�J1�ᄎ�K1�ᄏ�L1�ᄐ�M1�ᄑ�N1�ᄒ�O1�ᅡ�P1�ᅢ�Q1�ᅣ�R1�ᅤ�S1�ᅥ�T1�ᅦ�U1�ᅧ�V1�ᅨ�W1�ᅩ�X1�ᅪ�Y1�ᅫ�Z1�ᅬ�[1�ᅭ�\1�ᅮ�]1�ᅯ�^1�ᅰ�_1�ᅱ�`1�ᅲ�a1�ᅳ�b1�ᅴ�c1�ᅵ�d1r��e1�ᄔ�f1�ᄕ�g1�ᇇ�h1�ᇈ�i1�ᇌ�j1�ᇎ�k1�ᇓ�l1�ᇗ�m1�ᇙ�n1�ᄜ�o1�ᇝ�p1�ᇟ�q1�ᄝ�r1�ᄞ�s1�ᄠ�t1�ᄢ�u1�ᄣ�v1�ᄧ�w1�ᄩ�x1�ᄫ�y1�ᄬ�z1�ᄭ�{1�ᄮ�|1�ᄯ�}1�ᄲ�~1�ᄶ�1�ᅀ�1�ᅇ�1�ᅌ�1�ᇱ�1�ᇲ�1�ᅗ�1�ᅘ�1�ᅙ�1�ᆄ�1�ᆅ�1�ᆈ�1�ᆑ�1�ᆒ�1�ᆔ�1�ᆞ�1�ᆡ�1�1r0�1�一�1�二�1�三�1�四�1�上�1�中�1�下�1�甲�1�乙�1�丙�1�丁�1�天�1�地�1�人�1�1�1��1��1�2r�(ᄀ)�2�(ᄂ)�2�(ᄃ)�2�(ᄅ)�2�(ᄆ))r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr)rrEr)rr�)rrEr)rrEr)rrEr	)r
rEr)rrEr
)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr)r rEr!)r"rEr#)r$rEr%)r&rEr')r(rEr))r*rEr+)r,rEr-)r.rEr/)r0rEr1)r2rEr3)r4rEr5)r6rEr7)r8rEr9)r:rEr;)r<rEr=)r>rEr?)r@rErA)rBrErC)rDrErE)rFrErG)rHrErI)rJrErK)rLrErM)rNrErO)rPrErQ)rRrErS)rTrErU)rVrErW)rXr�)rYr0)rZrEr[)r\rEr])r^rEr_)r`rEra)rbrErc)rdrEre)rfrErg)rhrEri)rjrErk)rlrErm)rnrEro)rprErq)rrrErs)rtrEru)rvr0)rwr�)rxr0)ryr�)rzr0)r{rr|)r}rr~)rrr�)r�rr�)r�rr�r�r�r�r�r��_seg_29�s�r�cfCs*d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d��d�d�d�d�d�d�d�d�d�d	�d
�d�d�d
�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d �d!�d"�d#�d$�d%�d&�d'�d(�d)�d*�d+�d,�d-�d.gdS(/N�2r�(ᄇ)�2�(ᄉ)�2�(ᄋ)�2�(ᄌ)�	2�(ᄎ)�
2�(ᄏ)�2�(ᄐ)�2�(ᄑ)�
2�(ᄒ)�2�(가)�2�(나)�2�(다)�2�(라)�2�(마)�2�(바)�2�(사)�2�(아)�2�(자)�2�(차)�2�(카)�2�(타)�2�(파)�2�(하)�2�(주)�2�(오전)�2�(오후)�2r�� 2�(一)�!2�(二)�"2�(三)�#2�(四)�$2�(五)�%2�(六)�&2�(七)�'2�(八)�(2�(九)�)2�(十)�*2�(月)�+2�(火)�,2�(水)�-2�(木)�.2�(金)�/2�(土)�02�(日)�12�(株)�22�(有)�32�(社)�42�(名)�52�(特)�62�(財)�72�(祝)�82�(労)�92�(代)�:2�(呼)�;2�(学)�<2�(監)�=2�(企)�>2�(資)�?2�(協)�@2�(祭)�A2�(休)�B2�(自)�C2�(至)�D2rE�問�E2�幼�F2�文�G2�箏�H2r0�P2�pte�Q2�21�R2�22�S2�23�T2�24�U2�25�V2�26�W2�27�X2�28�Y2�29�Z2�30�[2�31�\2�32�]2�33�^2�34�_2�35�`2�ᄀ�a2�ᄂ�b2�ᄃ�c2�ᄅ�d2�ᄆ�e2�ᄇ�f2�ᄉ�g2�ᄋ�h2�ᄌ�i2�ᄎ�j2�ᄏ�k2�ᄐ�l2�ᄑ�m2�ᄒ�n2�가�o2�나)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�r�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr)rrr)rrEr)rrEr)rrEr)r	rEr
)rr0)rrEr
)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr)r rEr!)r"rEr#)r$rEr%)r&rEr')r(rEr))r*rEr+)r,rEr-)r.rEr/)r0rEr1)r2rEr3)r4rEr5)r6rEr7)r8rEr9)r:rEr;)r<rEr=)r>rEr?)r@rErA)rBrErC)rDrErE)rFrErG)rHrErI)rJrErKr�r�r�r�r��_seg_308s�rLcfCs(d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d��d�d�d�d�d�d�d�d�d�d	�d
�d�d�d
�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d �d!�d"�d#�d$�d%�d&�d'�d(�d)�d*�d+�d,�d-gdS(.N�p2rE�다�q2�라�r2�마�s2�바�t2�사�u2�아�v2�자�w2�차�x2�카�y2�타�z2�파�{2�하�|2�참고�}2�주의�~2�우�2r0�2�一�2�二�2�三�2�四�2�五�2�六�2�七�2�八�2�九�2�十�2�月�2�火�2�水�2�木�2�金�2�土�2�日�2�株�2�有�2�社�2�名�2�特�2�財�2�祝�2�労�2�秘�2�男�2�女�2�適�2�優�2�印�2�注�2�項�2�休�2�写�2�正�2�上�2�中�2�下�2�左�2�右�2�医�2�宗�2�学�2�監�2�企�2�資�2�協�2�夜�2�36�2�37�2�38�2�39�2�40�2�41�2�42�2�43�2�44�2�45�2�46�2�47�2�48�2�49�2�50�2�1月�2�2月��2�3月��2�4月��2�5月��2�6月��2�7月��2�8月��2�9月��2�10月��2�11月��2�12月��2�hg��2�erg��2�ev��2�ltd��2�ア��2�イ��2�ウ��2�エ)rMrErN)rOrErP)rQrErR)rSrErT)rUrErV)rWrErX)rYrErZ)r[rEr\)r]rEr^)r_rEr`)rarErb)rcrErd)rerErf)rgrErh)rirErj)rkr0)rlrErm)rnrEro)rprErq)rrrErs)rtrEru)rvrErw)rxrEry)rzrEr{)r|rEr})r~rEr)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)rrEr)rrEr)rrEr)rrEr)rrEr	)r
rEr)rrEr
)rrEr)rrEr)rrErr�r�r�r�r��_seg_31�s�rcfCs(d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d��d�d�d�d�d�d�d�d�d�d	�d
�d�d�d
�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d �d!�d"�d#�d$�d%�d&�d'�d(�d)�d*�d+�d,�d-gdS(.N��2rE�オ��2�カ��2�キ��2�ク��2�ケ��2�コ��2�サ��2�シ��2�ス��2�セ��2�ソ��2�タ��2�チ��2�ツ��2�テ��2�ト��2�ナ��2�ニ��2�ヌ��2�ネ��2�ノ��2�ハ��2�ヒ��2�フ��2�ヘ��2�ホ��2�マ��2�ミ��2�ム��2�メ��2�モ��2�ヤ��2�ユ�2�ヨ�2�ラ�2�リ�2�ル�2�レ�2�ロ�2�ワ�2�ヰ�2�ヱ�2�ヲ�2r��3�アパート�3�アルファ�3�アンペア�3�	アール�3�イニング�3�	インチ�3�	ウォン�3�エスクード�3�エーカー�	3�	オンス�
3�	オーム�3�	カイリ�3�カラット�
3�カロリー�3�	ガロン�3�	ガンマ�3�ギガ�3�	ギニー�3�キュリー�3�ギルダー�3�キロ�3�キログラム�3�キロメートル�3�キロワット�3�	グラム�3�グラムトン�3�クルゼイロ�3�クローネ�3�	ケース�3�	コルナ�3�	コーポ�3�サイクル� 3�サンチーム�!3�シリング�"3�	センチ�#3�	セント�$3�	ダース�%3�デシ�&3�ドル�'3�トン�(3�ナノ�)3�	ノット�*3�	ハイツ�+3�パーセント�,3�	パーツ�-3�バーレル�.3�ピアストル�/3�	ピクル�03�ピコ�13�ビル�23�ファラッド�33�フィート�43�ブッシェル�53�	フラン�63�ヘクタール�73�ペソ)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr )r!rEr")r#rEr$)r%rEr&)r'rEr()r)rEr*)r+rEr,)r-rEr.)r/rEr0)r1rEr2)r3rEr4)r5rEr6)r7rEr8)r9rEr:)r;rEr<)r=rEr>)r?rEr@)rArErB)rCrErD)rErErF)rGrErH)rIrErJ)rKrErL)rMrErN)rOrErP)rQrErR)rSrErT)rUrErV)rWrErX)rYrErZ)r[rEr\)r]rEr^)r_rEr`)rarErb)rcrErd)rerErf)rgrErh)rirErj)rkr�)rlrErm)rnrEro)rprErq)rrrErs)rtrEru)rvrErw)rxrEry)rzrEr{)r|rEr})r~rEr)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�r�r�r�r�r��_seg_32
s�r�cfCs(d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d��d�d�d�d�d�d�d�d�d�d	�d
�d�d�d
�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d �d!�d"�d#�d$�d%�d&�d'�d(�d)�d*�d+�d,�d-gdS(.N�83rE�	ペニヒ�93�	ヘルツ�:3�	ペンス�;3�	ページ�<3�	ベータ�=3�ポイント�>3�	ボルト�?3�ホン�@3�	ポンド�A3�	ホール�B3�	ホーン�C3�マイクロ�D3�	マイル�E3�	マッハ�F3�	マルク�G3�マンション�H3�ミクロン�I3�ミリ�J3�ミリバール�K3�メガ�L3�メガトン�M3�メートル�N3�	ヤード�O3�	ヤール�P3�	ユアン�Q3�リットル�R3�リラ�S3�	ルピー�T3�ルーブル�U3�レム�V3�レントゲン�W3�	ワット�X3�0点�Y3�1点�Z3�2点�[3�3点�\3�4点�]3�5点�^3�6点�_3�7点�`3�8点�a3�9点�b3�10点�c3�11点�d3�12点�e3�13点�f3�14点�g3�15点�h3�16点�i3�17点�j3�18点�k3�19点�l3�20点�m3�21点�n3�22点�o3�23点�p3�24点�q3�hpa�r3�da�s3�au�t3�bar�u3�ov�v3�pc�w3�dm�x3�dm2�y3�dm3�z3�iu�{3�平成�|3�昭和�}3�大正�~3�明治�3�株式会社�3�pa�3�na�3�μa�3�ma�3�ka�3�kb�3�mb�3�gb�3�cal�3�kcal�3�pf�3�nf�3�μf�3�μg�3�mg�3�kg�3�hz�3�khz�3�mhz�3�ghz�3�thz�3�μl�3�ml�3�dl�3�kl�3�fm�3�nm�3�μm)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr)rrEr)rrEr)rrEr)rrEr)r	rEr
)rrEr)r
rEr)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr )r!rEr")r#rEr$)r%rEr&)r'rEr()r)rEr*)r+rEr,)r-rEr.)r/rEr0)r1rEr2)r3rEr4)r5rEr6)r7rEr8)r9rEr:)r;rEr<)r=rEr>)r?rEr@)rArErB)rCrErD)rErErF)rGrErH)rIrErJ)rKrErL)rMrErN)rOrErP)rQrErR)rSrErT)rUrErV)rWrErX)rYrErZ)r[rEr\)r]rEr^)r_rEr`)rarErb)rcrErd)rerErf)rgrErh)rirErj)rkrErl)rmrErn)rorErp)rqrErr)rsrErt)rurErv)rwrErx)ryrErz)r{rEr|)r}rEr~)rrEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�r�r�r�r�r��_seg_33p
s�r�cfCsd�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d��d�d�d�d�d�d�d�d�d�d	�d
�d�d�d
�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d �d!�d"�d#�d$�d%�d&�d'�d(gdS()N�3rE�mm�3�cm�3�km�3�mm2�3�cm2�3�m2�3�km2�3�mm3�3�cm3�3�m3�3�km3�3�m∕s�3�m∕s2�3rn�3�kpa�3�mpa�3�gpa�3�rad�3�rad∕s�3�rad∕s2�3�ps�3�ns�3�μs�3�ms�3�pv�3�nv�3�μv�3�mv�3�kv�3�3�pw�3�nw�3�μw�3�mw�3�kw�3�3�kω�3�mω��3r���3�bq��3�cc��3�cd��3�c∕kg��3��3�db��3�gy��3�ha��3�hp��3�in��3�kk��3��3�kt��3�lm��3�ln��3�log��3�lx��3rz��3�mil��3�mol��3�ph��3��3�ppm��3�pr��3�sr��3�sv��3�wb��3�v∕m��3�a∕m��3�1日��3�2日��3�3日��3�4日��3�5日��3�6日��3�7日��3�8日��3�9日��3�10日��3�11日��3�12日��3�13日��3�14日��3�15日��3�16日��3�17日��3�18日��3�19日��3�20日��3�21日�3�22日�3�23日�3�24日�3�25日�3�26日�3�27日�3�28日�3�29日�3�30日�3�31日�3�gal)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rErn)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�r�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�r�)r�rEr�)r�rEr�)r�rEr�)r�rEr)rrEr)rrEr)rrEr�)rrEr)rrEr	)r
rEr)rrEr
)rrEr)rrErz)rrEr)rrEr)rrEr)rr�)rrEr)rrEr)rrEr)rrEr)r rEr!)r"rEr#)r$rEr%)r&rEr')r(rEr))r*rEr+)r,rEr-)r.rEr/)r0rEr1)r2rEr3)r4rEr5)r6rEr7)r8rEr9)r:rEr;)r<rEr=)r>rEr?)r@rErA)rBrErC)rDrErE)rFrErG)rHrErI)rJrErK)rLrErM)rNrErO)rPrErQ)rRrErS)rTrErU)rVrErW)rXrErY)rZrEr[)r\rEr])r^rEr_)r`rEra)rbrErc)rdrErer�r�r�r�r��_seg_34�
s�rfceCs�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�gdS)�N�4r0�Mr��M�͟��鍤鐤�Ǥ�Ф�,��@�rE�ꙁ�A��B��ꙃ�C��D��ꙅ�E��F��ꙇ�G��H��ꙉ�I��J��ꙋ�K��L��ꙍ�M��N��ꙏ�O��P��ꙑ�Q��R��ꙓ�S��T��ꙕ�U��V��ꙗ�W��X��ꙙ�Y��Z��ꙛ�[��\��ꙝ�]��^��ꙟ�_��`��ꙡ�a��b��ꙣ�c��d��ꙥ�e��f��ꙧ�g��h��ꙩ�i��j��ꙫ�k��l��ꙭ�m�逦�ꚁ遦邦�ꚃ郦鄦�ꚅ酦醦�ꚇ釦鈦�ꚉ鉦銦�ꚋ鋦錦�ꚍ鍦鎦�ꚏ鏦鐦�ꚑ鑦钦�ꚓ铦锦�ꚕ镦閦�ꚗ闦阦韦����"��ꜣ�#��$��ꜥ�%��&��ꜧ�'��(��ꜩ�)��*��ꜫ�+��,��ꜭ�-��.��ꜯ�/��2��ꜳ�3�)rgr0)rhr�)rir0)rjr�)rkr0)rlr�)rmr0)rnr�)ror0)rpr�)rqrErr)rsr0)rtrEru)rvr0)rwrErx)ryr0)rzrEr{)r|r0)r}rEr~)rr0)r�rEr�)r�r0)r�rEr�)r�r0)r�rEr�)r�r0)r�rEr�)r�r0)r�rEr�)r�r0)r�rEr�)r�r0)r�rEr�)r�r0)r�rEr�)r�r0)r�rEr�)r�r0)r�rEr�)r�r0)r�rEr�)r�r0)r�rEr�)r�r0)r�rEr�)r�r0)r�rEr�)r�r0)r�rEr�)r�r0)r�rEr�)r�r0)r�rEr�)r�r0)r�rEr�)r�r0)r�rEr�)r�r0)r�rEr�)r�r0)r�rEr�)r�r0)r�rEr�)r�r0)r�rEr�)r�r0)r�rEr�)r�r0)r�rEr�)r�r0)r�rEr�)r�r0)r�rEr�)r�r0)r�rEr�)r�r0)r�rEr�)r�r0)r�rEr�)r�r0)r�r�)r�r0)r�r�)r�r0)r�rEr�)r�r0)r�rEr�)r�r0)r�rEr�)r�r0)r�rEr�)r�r0)r�rEr�)r�r0)r�rEr�)r�r0)r�rEr�)r�r0)r�rEr�)r�r0r�r�r�r�r��_seg_35@s�r�cfCs�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�gdS)�N�4�rE�ꜵ�5�r0�6��ꜷ�7��8��ꜹ�9��:��ꜻ�;��<��ꜽ�=��>��ꜿ�?��@��ꝁ�A��B��ꝃ�C��D��ꝅ�E��F��ꝇ�G��H��ꝉ�I��J��ꝋ�K��L��ꝍ�M��N��ꝏ�O��P��ꝑ�Q��R��ꝓ�S��T��ꝕ�U��V��ꝗ�W��X��ꝙ�Y��Z��ꝛ�[��\��ꝝ�]��^��ꝟ�_��`��ꝡ�a��b��ꝣ�c��d��ꝥ�e��f��ꝧ�g��h��ꝩ�i��j��ꝫ�k��l��ꝭ�m��n��ꝯ�o��p��q��y��ꝺ�z��{��ꝼ�|��}��ᵹ�~��ꝿ��逧�ꞁ遧邧�ꞃ郧鄧�ꞅ酧醧�ꞇ釧鋧�ꞌ錧鍧�ɥ鎧鏧r�鐧�ꞑ鑧钧�ꞓ铧锧頧�ꞡ顧颧�ꞣ飧餧�ꞥ饧馧�ꞧ駧騧�ꞩ驧骧�ɦ髧��ħ)r�rEr�)r�r0)r�rEr�)r�r0)r�rEr�)r�r0)rrEr)rr0)rrEr)rr0)rrEr)rr0)r	rEr
)rr0)rrEr
)rr0)rrEr)rr0)rrEr)rr0)rrEr)rr0)rrEr)rr0)rrEr)rr0)rrEr)r r0)r!rEr")r#r0)r$rEr%)r&r0)r'rEr()r)r0)r*rEr+)r,r0)r-rEr.)r/r0)r0rEr1)r2r0)r3rEr4)r5r0)r6rEr7)r8r0)r9rEr:)r;r0)r<rEr=)r>r0)r?rEr@)rAr0)rBrErC)rDr0)rErErF)rGr0)rHrErI)rJr0)rKrErL)rMr0)rNrErO)rPr0)rQrErO)rRr0)rSrErT)rUr0)rVrErW)rXr0)rYrErZ)r[rEr\)r]r0)r^rEr_)r`r0)rarErb)rcr0)rdrEre)rfr0)rgrErh)rir0)rjrErk)rlr0)rmrErn)ror0)rpr�)rqrErr)rsr0)rtrEru)rvr0)rwr�)rxrEry)rzr0)r{rEr|)r}r0)r~rEr)r�r0)r�rEr�)r�r0)r�rEr�)r�r0)r�rEr�)r�r�)r�rEr�r�r�r�r�r��_seg_36�s�r�cfCs�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�gdS)�N�rE�œ�r0�,�r��0��:��@��x�逨�Ũ�Ψ�ڨ������T��_��}�逩�Ω�ϩ�ک�ީ�����7��@��N��P��Z��\��|�逪�ê�۪������	�������� ��'��(��/������������������������豈���更���車���賈���滑���串���句���龜�	��契�
��金���喇���奈�
��懶���癩���羅���蘿���螺���裸���邏���樂���洛���烙���珞���落���酪���駱���亂���卵���欄���爛���蘭� ��鸞�!��嵐�"��濫�#��藍�$��襤�%��拉�&��臘�'��蠟�(��廊�)��朗�*��浪�+��狼�,��郎�-��來)r�rEr�)r�r0)r�r�)r�r0)r�r�)r�r0)r�r�)r�r0)r�r�)r�r0)r�r�)r�r0)r�r�)r�r0)r�r�)r�r0)r�r�)r�r0)r�r�)r�r0)r�r�)r�r0)r�r�)r�r0)r�r�)r�r0)r�r�)r�r0)r�r�)r�r0)r�r�)r�r0)r�r�)r�r0)r�r�)r�r0)r�r�)r�r0)r�r�)r�r0)r�r�)r�r0)r�r�)r�r0)r�r�)r�r0)r�r�)r�r0)r�r�)r�r0)r�r�)r�r0)r�r�)r�r0)r�r�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr)rrEr)rrEr)rrEr)rrEr)r	rEr
)rrEr)r
rEr)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr)rrErr�r�r�r�r��_seg_37s�rcfCs(d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d��d�d�d�d�d�d�d�d�d�d	�d
�d�d�d
�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d �d!�d"�d#�d$�d%�d&�d'�d(�d)�d*�d+�d,�d-gdS(.N�.�rE�冷�/��勞�0��擄�1��櫓�2��爐�3��盧�4��老�5��蘆�6��虜�7��路�8��露�9��魯�:��鷺�;��碌�<��祿�=��綠�>��菉�?��錄�@��鹿�A��論�B��壟�C��弄�D��籠�E��聾�F��牢�G��磊�H��賂�I��雷�J��壘�K��屢�L��樓�M��淚�N��漏�O��累�P��縷�Q��陋�R��勒�S��肋�T��凜�U��凌�V��稜�W��綾�X��菱�Y��陵�Z��讀�[��拏�\��樂�]��諾�^��丹�_��寧�`��怒�a��率�b��異�c��北�d��磻�e��便�f��復�g��不�h��泌�i��數�j��索�k��參�l��塞�m��省�n��葉�o��說�p��殺�q��辰�r��沈�s��拾�t��若�u��掠�v��略�w��亮�x��兩�y��凉�z��梁�{��糧�|��良�}��諒�~��量���勵��呂��女��廬��旅��濾��礪��閭��驪��麗��黎��力��曆��歷��轢��年��憐��戀��撚)r rEr!)r"rEr#)r$rEr%)r&rEr')r(rEr))r*rEr+)r,rEr-)r.rEr/)r0rEr1)r2rEr3)r4rEr5)r6rEr7)r8rEr9)r:rEr;)r<rEr=)r>rEr?)r@rErA)rBrErC)rDrErE)rFrErG)rHrErI)rJrErK)rLrErM)rNrErO)rPrErQ)rRrErS)rTrErU)rVrErW)rXrErY)rZrEr[)r\rEr])r^rEr_)r`rEra)rbrErc)rdrEre)rfrErg)rhrEri)rjrErk)rlrErm)rnrEro)rprErq)rrrErs)rtrEru)rvrErw)rxrEry)rzrEr{)r|rEr})r~rEr)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�r�r�r�r�r��_seg_38xs�r�cfCs(d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d��d�d�d�d�d�d�d�d�d�d	�d
�d�d�d
�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d �d!�d"�d#�d$�d%�d&�d'�d(�d)�d*�d+�d,�d-gdS(.N�rE�漣��煉��璉��秊��練��聯��輦��蓮��連��鍊��列��劣��咽��烈��裂��說��廉��念��捻��殮��簾��獵��令��囹��寧��嶺��怜��玲��瑩��羚��聆��鈴��零��靈��領��例��禮��醴��隸��惡��了��僚��寮��尿��料��樂��燎��療���蓼���遼���龍���暈���阮���劉���杻���柳���流���溜���琉���留���硫���紐���類���六���戮���陸���倫���崙���淪���輪���律���慄���栗���率���隆���利���吏���履���易���李���梨���泥���理���痢���罹���裏���裡���里���離���匿���溺���吝���燐���璘���藺���隣���鱗���麟���林��淋)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr)rrEr)rrEr)rrEr)rrEr)r	rEr
)rrEr)r
rEr)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr )r!rEr")r#rEr$)r%rEr&)r'rEr()r)rEr*)r+rEr,)r-rEr.)r/rEr0)r1rEr2)r3rEr4)r5rEr6)r7rEr8)r9rEr:)r;rEr<)r=rEr>)r?rEr@)rArErB)rCrErD)rErErF)rGrErH)rIrErJ)rKrErL)rMrErN)rOrErP)rQrErR)rSrErT)rUrErV)rWrErX)rYrErZ)r[rEr\)r]rEr^)r_rEr`)rarErb)rcrErd)rerErf)rgrErh)rirErj)rkrErl)rmrErn)rorErp)rqrErr)rsrErt)rurErv)rwrErx)ryrErz)r{rEr|)r}rEr~)rrEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�r�r�r�r�r��_seg_39�s�r�cfCsd�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d��d�d�d�d�d�d�d�d�d�d	�d
�d�d�d
�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d �d!�d"�d#�d$�d%�d&�d'gdS((N�rE�臨��立��笠��粒��狀��炙��識��什��茶��刺���切���度���拓���糖���宅���洞���暴���輻���行�	��降�
��見���廓���兀�
��嗀��r0���塚�����晴�����凞���猪���益���礼���神���祥���福���靖���精���羽��� ��蘒�!��"��諸�#��%��逸�&��都�'��*��飯�+��飼�,��館�-��鶴�.��郞�/��隷�0��侮�1��僧�2��免�3��勉�4��勤�5��卑�6��喝�7��嘆�8��器�9��塀�:��墨�;��層�<��屮�=��悔�>��慨�?��憎�@��懲�A��敏�B��既�C��暑�D��梅�E��海�F��渚�G��漢�H��煮�I��爫�J��琢�K��碑�L��社�M��祉�N��祈�O��祐�P��祖�Q��祝�R��禍�S��禎�T��穀�U��突�V��節�W��練�X��縉�Y��繁�Z��署�[��者�\��臭�]��艹�_��著)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�r0)r�rEr�)r�r0)r�rEr�)r�r0)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�r0)r�rEr�)rr0)rrEr)rr0)rrEr)rrEr)rr0)r	rEr
)rrEr)r
rEr)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr )r!rEr")r#rEr$)r%rEr&)r'rEr()r)rEr*)r+rEr,)r-rEr.)r/rEr0)r1rEr2)r3rEr4)r5rEr6)r7rEr8)r9rEr:)r;rEr<)r=rEr>)r?rEr@)rArErB)rCrErD)rErErF)rGrErH)rIrErJ)rKrErL)rMrErN)rOrErP)rQrErR)rSrErT)rUrErV)rWrErX)rYrErZ)r[rEr\)r]rEr^)r_rEr`)rarErb)rcrErd)rerErf)rgrErh)rirErj)rkrErl)rmrErn)rorErp)rqrErrr�r�r�r�r��_seg_40Hs�rscfCs d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d��d�d�d�d�d�d�d�d�d�d	�d
�d�d�d
�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d �d!�d"�d#�d$�d%�d&�d'�d(�d)gdS(*N�`�rE�褐�a��視�b��謁�c��謹�d��賓�e��贈�f��辶�g��逸�h��難�i��響�j��頻�k��恵�l��𤋮�m��舘�n�r��p��並�q��况�r��全�s��侀�t��充�u��冀�v��勇�w��勺�x��喝�y��啕�z��喙�{��嗢�|��塚�}��墳�~��奄���奔��婢��嬨��廒��廙��彩��徭��惘��慎��愈��憎��慠��懲��戴��揄��搜��摒��敖��晴��朗��望��杖��歹��殺��流��滛��滋��漢��瀞��煮��瞧��爵��犯��猪��瑱��甆��画��瘝��瘟��益��盛��直��睊��着��磌��窱��節��类��絛��練��缾��者��荒��華��蝹��襁��覆���調��諸��請���諾��諭���變����輸���遲���醙)rtrEru)rvrErw)rxrEry)rzrEr{)r|rEr})r~rEr)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�r�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr)rrEr)rrEr)rrEr)rrEr)r	rEr
)rrEr)r
rEr)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr )r!rErw)r"rEr#)r$rEr%)r&rEr')r(rEry)r)rEr*)r+rEr,)r-rEr{)r.rEr/)r0rEr)r1rEr2)r3rEr4)r5rEr6r�r�r�r�r��_seg_41�s�r7cfCsd�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d��d�d�d�d�d�d�d�d�d�d	�d
�d�d�d
�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d �d!�d"�d#�d$�d%�d&�d'gdS((N��rE�鉶���陼���難���靖���韛���響���頋���頻���鬒���龜���𢡊���𢡄���𣏕���㮝���䀘���䀹���𥉉���𥳐���𧻓���齃���龎��r����ff���fi���fl���ffi���ffl���st�����մն���մե���մի���վն���մխ�����יִ��r0���ײַ� ��ע�!��א�"��ד�#��ה�$��כ�%��ל�&��ם�'��ר�(��ת�)�rr��*��שׁ�+��שׂ�,��שּׁ�-��שּׂ�.��אַ�/��אָ�0��אּ�1��בּ�2��גּ�3��דּ�4��הּ�5��וּ�6��זּ�7��8��טּ�9��יּ�:��ךּ�;��כּ�<��לּ�=��>��מּ�?��@��נּ�A��סּ�B��C��ףּ�D��פּ�E��F��צּ�G��קּ�H��רּ�I��שּ�J��תּ�K��וֹ�L��בֿ�M��כֿ�N��פֿ�O��אל�P��ٱ�R��ٻ�V��پ�Z��ڀ�^��ٺ�b��ٿ�f��ٹ�j��ڤ�n��ڦ�r��ڄ�v��ڃ�z��چ�~��ڇ��ڍ)r8rEr9)r:rEr;)r<rEr=)r>rEr?)r@rErA)rBrErC)rDrErE)rFrErG)rHrErI)rJrErK)rLrErM)rNrErO)rPrErQ)rRrErS)rTrErU)rVrErW)rXrErY)rZrEr[)r\rEr])r^rEr_)r`rEra)rbr�)rcrErd)rerErf)rgrErh)rirErj)rkrErl)rmrErn)ror�)rprErq)rrrErs)rtrEru)rvrErw)rxrEry)rzr�)r{rEr|)r}r0)r~rEr)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�r�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�r�)r�rEr�)r�r�)r�rEr�)r�rEr�)r�r�)r�rEr�)r�rEr�)r�r�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�r�r�r�r�r��_seg_42s�r�cfCs&d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d��d�d�d�d�d�d�d�d�d�d	�d
�d�d�d
�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d �d!�d"�d#�d$�d%�d&�d'�d(�d)�d*�d+�d,gdS(-N�rE�ڌ��ڎ��ڈ��ژ��ڑ��ک��گ��ڳ��ڱ��ں��ڻ��ۀ��ہ��ھ��ے��ۓ�r0��r����ڭ���ۇ���ۆ���ۈ���ۇٴ���ۋ���ۅ���ۉ���ې���ى���ئا���ئە���ئو���ئۇ���ئۆ���ئۈ��ئې��ئى��ی���ئج���ئح���ئم�����ئي���بج���بح���بخ���بم�	��بى�
��بي���تج���تح�
��تخ���تم���تى���تي���ثج���ثم���ثى���ثي���جح���جم���حج���حم���خج���خح���خم���سج���سح���سخ���سم� ��صح�!��صم�"��ضج�#��ضح�$��ضخ�%��ضم�&��طح�'��طم�(��ظم�)��عج�*��عم�+��غج�,��غم�-��فج�.��فح�/��فخ�0��فم�1��فى�2��في�3��قح�4��قم�5��قى�6��قي�7��كا�8��كج�9��كح�:��كخ�;��كل�<��كم�=��كى�>��كي)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr)rrEr)rrEr)rrEr)rrEr)r	rEr
)rrEr)r
rEr)rrEr)rrEr)rrEr)rrEr)rr0)rr�)rrEr)rrEr)rrEr)rrEr )r!rEr")r#rEr$)r%rEr&)r'rEr()r)rEr*)r+rEr,)r-rEr.)r/rEr0)r1rEr2)r3rEr4)r5rEr6)r7rEr8)r9rEr:)r;rEr<)r=rEr>)r?rEr@)rArErB)rCrErD)rErEr<)rFrErG)rHrErI)rJrErK)rLrErM)rNrErO)rPrErQ)rRrErS)rTrErU)rVrErW)rXrErY)rZrEr[)r\rEr])r^rEr_)r`rEra)rbrErc)rdrEre)rfrErg)rhrEri)rjrErk)rlrErm)rnrEro)rprErq)rrrErs)rtrEru)rvrErw)rxrEry)rzrEr{)r|rEr})r~rEr)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�r�r�r�r�r��_seg_43�s�r�cfCsd�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d��d�d�d�d�d�d�d�d�d�d	�d
�d�d�d
�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d �d!gdS("N�?�rE�لج�@��لح�A��لخ�B��لم�C��لى�D��لي�E��مج�F��مح�G��مخ�H��مم�I��مى�J��مي�K��نج�L��نح�M��نخ�N��نم�O��نى�P��ني�Q��هج�R��هم�S��هى�T��هي�U��يج�V��يح�W��يخ�X��يم�Y��يى�Z��يي�[��ذٰ�\��رٰ�]��ىٰ�^�r� ٌّ�_�� ٍّ�`�� َّ�a�� ُّ�b�� ِّ�c�� ّٰ�d��ئر�e��ئز�f��ئم�g��ئن�h��ئى�i��ئي�j��بر�k��بز�l��بم�m��بن�n��بى�o��بي�p��تر�q��تز�r��تم�s��تن�t��تى�u��تي�v��ثر�w��ثز�x��ثم�y��ثن�z��ثى�{��ثي�|��فى�}��في�~��قى���قي��كا��كل��كم��كى��كي�����ما���نر��نز���نن�����ير��يز���ين����ئج��ئح��ئخ���ئه��بج��بح��بخ���به��تج��تح)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rr�)r�rr�)r�rr)rrr)rrr)rrr)rrEr)r	rEr
)rrEr)r
rEr)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr )r!rEr")r#rEr$)r%rEr&)r'rEr()r)rEr*)r+rEr,)r-rEr.)r/rEr0)r1rEr2)r3rEr4)r5rEr6)r7rEr8)r9rEr:)r;rEr<)r=rEr>)r?rEr@)rArErB)rCrErD)rErErF)rGrErH)rIrEr�)rJrEr�)rKrEr�)rLrErM)rNrEr�)rOrErP)rQrErR)rSrEr�)rTrErU)rVrEr�)rWrEr�)rXrEr�)rYrErZ)r[rEr\)r]rEr�)r^rEr_)r`rEr�)rarEr�)rbrErc)rdrEre)rfrErg)rhrEr)rirErj)rkrErl)rmrErn)rorErp)rqrEr)rrrErs)rtrEru)rvrErwr�r�r�r�r��_seg_44�s�rxcfCsd�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d��d�d�d�d�d�d�d�d�d�d	�d
�d�d�d
�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d �d!�d"gdS(#N�rE�تخ��تم��ته��ثم��جح��جم��حج��حم��خج��خم��سج��سح��سخ��سم��صح��صخ��صم��ضج��ضح��ضخ��ضم��طح��ظم��عج��عم��غج��غم��فج��فح��فخ��فم���قح���قم���كج���كح���كخ���كل���كم���لج���لح���لخ���لم���له���مج���مح���مخ���مم���نج���نح���نخ���نم���نه���هج���هم���هٰ���يج���يح���يخ���يم���يه���ئم���ئه���بم���به���������ثه�����سه���شم���شه�����������������ـَّ���ـُّ���ـِّ��طى��طي��عى��عي��غى��غي��سى��سي��شى��شي��حى���حي���جى���جي���خى���خي���صى���صي)ryrErz)r{rEr|)r}rEr~)rrEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr|)r�rEr~)r�rEr�)r�rEr�)r�rEr�)r�rEr)rrEr)rrEr)rrEr�)rrEr�)rrEr�)rrEr�)r	rEr�)r
rEr�)rrEr�)rrEr
)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr)r rEr!)r"rEr#)r$rEr%)r&rEr')r(rEr))r*rEr+)r,rEr-)r.rEr/)r0rEr1)r2rEr3)r4rEr5r�r�r�r�r��_seg_45Ps�r6cfCsd�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d��d�d�d�d�d�d�d�d�d�d	�d
�d�d�d
�d�d�d�d�d�d�d�d�d�d�d�d�d�dgdS(N��rE�ضى���ضي�	��شج�
��شح���شخ���شم�
��شر���سر���صر���ضر���طى���طي���عى���عي���غى���غي���سى���سي���شى���شي���حى���حي���جى���جي���خى� ��خي�!��صى�"��صي�#��$��%��&��'��(��)��*��+��,��-��.��/��0��1��سه�2��شه�3��طم�4��سج�5��سح�6��سخ�7��8��9��:��;��ظم�<��اً�>�r0�@�r��P��تجم�Q��تحج�S��تحم�T��تخم�U��تمج�V��تمح�W��تمخ�X��جمح�Z��حمي�[��حمى�\��سحج�]��سجح�^��سجى�_��سمح�a��سمج�b��سمم�d��صحح�f��صمم�g��شحم�i��شجي�j��شمخ�l��شمم�n��ضحى�o��ضخم�q��طمح�s��طمم�t��طمي�u��عجم�v��عمم�x��عمى�y��غمم�z��غمي�{��غمى�|��فخم�~��قمح���قمم��لحم��لحي��لحى��لجج��لخم��لمح��محج��محم)r7rEr8)r9rEr:)r;rEr<)r=rEr>)r?rEr@)rArErB)rCrErD)rErErF)rGrErH)rIrErJ)rKrErL)rMrErN)rOrErP)rQrErR)rSrErT)rUrErV)rWrErX)rYrErZ)r[rEr\)r]rEr^)r_rEr`)rarErb)rcrErd)rerErf)rgrErh)rirErj)rkrErl)rmrErn)rorEr8)rprEr:)rqrEr<)rrrEr>)rsrEr@)rtrErB)rurErD)rvrErF)rwrErH)rxrErJ)ryrEr<)rzrEr>)r{rEr@)r|rErB)r}rEr~)rrEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr<)r�rEr>)r�rEr@)r�rEr�)r�rEr�)r�rEr�)r�r0)r�r�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�r�r�r�r�r��_seg_46�s�r�cfCsd�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d��d�d�d�d�d�d�d�d�d�d	�d
�d�d�d
�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d �d!�d"�d#�d$�d%gdS(&N�rE�محي��مجح��مجم��مخج��مخم�r���مجخ��همج��همم��نحم��نحى��نجم��نجى��نمي��نمى��يمم��بخي��تجي��تجى��تخي��تخى��تمي��تمى��جمي��جحى��جمى��سخى��صحي��شحي��ضحي��لجي��لمي��يحي��يجي��يمي��ممي��قمي��نحي��قمح��لحم��عمي��كمي��نجح��مخي��لجم��كمم����جحي��حجي��مجي��فمي���بحي�����عجم���صمم���سخي���نجي�����صلے���قلے���الله���اكبر���محمد��صلعم��رسول��عليه��وسلم��صلى�r�!صلى الله عليه وسلم��جل جلاله��ریال�r0���r����,���、�����:��rl���!���?���〖���〗��� ��'��1��—�2��–�3��_�5�r��6�r��7��{�8��}�9��〔�:��〕�;��【�<��】�=��《�>��》)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�r�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr)rrEr)rrEr)rrEr)rrEr)r	rEr
)rrEr)r
rEr)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr )r!rEr")r#rEr$)r%rEr&)r'rEr()r)rEr*)r+rEr,)r-rEr.)r/rEr0)r1rEr2)r3rEr4)r5rEr6)r7rEr8)r9rEr:)r;rEr<)r=rEr>)r?rEr@)rArErB)rCrErD)rErErF)rGrErD)rHrEr@)rIrErJ)rKrErL)rMrErN)rOrErP)rQrErR)rSrErF)rTrErU)rVrErW)rXrErY)rZrEr[)r\r�)r]rEr^)r_rEr`)rarErb)rcrErd)rerErf)rgrErh)rirErj)rkrErl)rmrErn)rorErp)rqrrr)rsrrt)rurErv)rwr0)rxr�)ryr�)rzrr{)r|rEr})r~r�)rrr�)r�rrl)r�rr�)r�rr�)r�rEr�)r�rEr�)r�r�)r�r0)r�r�)r�rEr�)r�rEr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�r�r�r�r�r��_seg_47 s�r�cfCsd�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d��d�d�d�d�d�d�d�d�d�d	�d
�d�d�d
�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d �d!�d"�d#�d$�d%�d&�d'gdS((N�?�rE�〈�@��〉�A��「�B��」�C��『�D��』�E�r0�G�r�[�H��]�I�� ̅�M�r��P�r{�Q��、�R�r��T�rl�U�r��V�r��W�r��X��—�Y�r��Z�r��[�r��\�r��]��〔�^��〕�_��#�`��&�a��*�b�r��c��-�d��<�e��>�f�r��g��h��\�i��$�j��%�k��@�l��p�� ً�q��ـً�r�� ٌ�s��t�� ٍ�u��v�� َ�w��ـَ�x�� ُ�y��ـُ�z�� ِ�{��ـِ�|�� ّ�}��ـّ�~�� ْ���ـْ��ء��آ��أ��ؤ��إ��ئ��ا��ب��ة��ت��ث��ج��ح��خ��د��ذ��ر��ز��س��ش��ص��ض��ط���ظ���ع���غ���ف���ق���ك���ل���م���ن���ه���و���ى���ي��لآ��لأ��لإ��لا��r��������")r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�r0)r�rr�)r�rr�)r�rr�)r�rr�)r�rr{)r�rEr�)r�r�)r�rrl)r�rr�)r�rr�)r�rr�)r�rEr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rEr�)r�rEr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rEr�)r�rr�)r�rr�)r�rr�)r�r�)r�rr�)r�rr�)r�rr�)r�rr�)r�r�)r�rr�)r�rEr�)r�rr�)r�r0)r�rr�)r�r�)r�rr�)r�rEr�)r�rr�)r�rEr�)r�rr�)r�rEr�)r�rr�)r�rEr�)r�rr�)rrEr)rrEr)rrEr)rrEr)rrEr	)r
rEr)rrEr
)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr)rrEr)r rEr!)r"rEr#)r$rEr%)r&rEr')r(rEr))r*rEr+)r,rEr-)r.rEr/)r0rEr1)r2rEr3)r4rEr5)r6rEr7)r8rEr9)r:rEr;)r<rEr=)r>rEr?)r@rErA)rBrErC)rDrErE)rFrErG)rHrErI)rJrErK)rLrErM)rNrErO)rPrErQ)rRr�)rSr�)rTr�)rUrr�)rVrrWr�r�r�r�r��_seg_48�s�rXcfCs�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d��d�d�d�d�d�d�d�d�d�d	�d
�d�d�d
�d�d�d�d�dgdS(N��rr���r���r���r����'��r��	�r��
�r���r���r{�
�rEr���r���/��r���r���r�����r���r���r���r���r���r���r���rl��r���r���r���r�� �r��!�rF�"�rH�#�rJ�$�rL�%�rN�&�rP�'�rR�(�rT�)�rV�*�rX�+�rZ�,�r\�-�r^�.�r`�/�rb�0�rd�1�rf�2�rh�3�rj�4�rl�5�rn�6�rp�7�rr�8�rt�9�rv�:�rx�;�r��<�r��=�r��>��^�?�r��@�rm�A��B��C��D��E��F��G��H��I��J��K��L��M��N��O��P��Q��R��S��T��U��V��W��X��Y��Z��[�r��\��|�]�r��^��~�_��⦅�`��⦆�a��b��「�c��」�d��、�e��・�f��ヲ)rYrr�)rZrr�)r[rr�)r\rr�)r]rr^)r_rr�)r`rr�)rarr�)rbrr�)rcrr{)rdrEr�)rerEr)rfrrg)rhrEr�)rirEr�)rjrEr�)rkrEr)rlrEr�)rmrEr�)rnrEr�)rorEr�)rprEr�)rqrEr�)rrrr�)rsrrl)rtrr�)rurr�)rvrr�)rwrr�)rxrr�)ryrErF)rzrErH)r{rErJ)r|rErL)r}rErN)r~rErP)rrErR)r�rErT)r�rErV)r�rErX)r�rErZ)r�rEr\)r�rEr^)r�rEr`)r�rErb)r�rErd)r�rErf)r�rErh)r�rErj)r�rErl)r�rErn)r�rErp)r�rErr)r�rErt)r�rErv)r�rErx)r�rr�)r�rr�)r�rr�)r�rr�)r�rr�)r�rrm)r�rErF)r�rErH)r�rErJ)r�rErL)r�rErN)r�rErP)r�rErR)r�rErT)r�rErV)r�rErX)r�rErZ)r�rEr\)r�rEr^)r�rEr`)r�rErb)r�rErd)r�rErf)r�rErh)r�rErj)r�rErl)r�rErn)r�rErp)r�rErr)r�rErt)r�rErv)r�rErx)r�rr�)r�rr�)r�rr�)r�rr�)r�rEr�)r�rEr�)r�rEr)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�r�r�r�r�r��_seg_49�s�r�cfCs$d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d��d�d�d�d�d�d�d�d�d�d	�d
�d�d�d
�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d �d!�d"�d#�d$�d%�d&�d'�d(�d)�d*�d+gdS(,N�g�rE�ァ�h��ィ�i��ゥ�j��ェ�k��ォ�l��ャ�m��ュ�n��ョ�o��ッ�p��ー�q��ア�r��イ�s��ウ�t��エ�u��オ�v��カ�w��キ�x��ク�y��ケ�z��コ�{��サ�|��シ�}��ス�~��セ���ソ��タ��チ��ツ��テ��ト��ナ��ニ��ヌ��ネ��ノ��ハ��ヒ��フ��ヘ��ホ��マ��ミ��ム��メ��モ��ヤ��ユ��ヨ��ラ��リ��ル��レ��ロ��ワ��ン��゙��゚�r���ᄀ��ᄁ��ᆪ��ᄂ��ᆬ��ᆭ��ᄃ��ᄄ��ᄅ��ᆰ��ᆱ��ᆲ��ᆳ��ᆴ��ᆵ��ᄚ��ᄆ��ᄇ��ᄈ��ᄡ��ᄉ��ᄊ��ᄋ��ᄌ��ᄍ��ᄎ��ᄏ��ᄐ��ᄑ��ᄒ����ᅡ���ᅢ���ᅣ���ᅤ���ᅥ���ᅦ�����ᅧ���ᅨ���ᅩ���ᅪ)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r�rEr�)r rEr )r rEr )r rEr )r rEr )r rEr	 )r
 rEr )r rEr
 )r rEr )r rEr )r rEr )r rEr )r rEr )r rEr )r rEr )r rEr )r rEr )r  rEr! )r" rEr# )r$ rEr% )r& rEr' )r( rEr) )r* rEr+ )r, rEr- )r. rEr/ )r0 rEr1 )r2 rEr3 )r4 rEr5 )r6 rEr7 )r8 rEr9 )r: rEr; )r< r�)r= rEr> )r? rEr@ )rA rErB )rC rErD )rE rErF )rG rErH )rI rErJ )rK rErL )rM rErN )rO rErP )rQ rErR )rS rErT )rU rErV )rW rErX )rY rErZ )r[ rEr\ )r] rEr^ )r_ rEr` )ra rErb )rc rErd )re rErf )rg rErh )ri rErj )rk rErl )rm rErn )ro rErp )rq rErr )rs rErt )ru rErv )rw rErx )ry r�)rz rEr{ )r| rEr} )r~ rEr )r� rEr� )r� rEr� )r� rEr� )r� r�)r� rEr� )r� rEr� )r� rEr� )r� rEr� r�r�r�r�r��_seg_50Xs�r� cfCs�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d��d�d�d�dgdS(N��rE�ᅫ���ᅬ��r����ᅭ���ᅮ���ᅯ���ᅰ���ᅱ���ᅲ�����ᅳ���ᅴ���ᅵ�����¢���£���¬��r� ̄���¦���¥���₩�����│���←���↑���→���↓���■���○���r0��
�'�(�;�<�>�?�N�P�^������4�7�������������� �$�0�K�����������𐐨��𐐩��𐐪��𐐫��𐐬��𐐭��𐐮��𐐯��𐐰�	�𐐱�
�𐐲��𐐳��𐐴�
�𐐵��𐐶��𐐷��𐐸��𐐹��𐐺��𐐻��𐐼��𐐽��𐐾��𐐿��𐑀��𐑁��𐑂��𐑃��𐑄��𐑅)r� rEr� )r� rEr� )r� r�)r� rEr� )r� rEr� )r� rEr� )r� rEr� )r� rEr� )r� rEr� )r� r�)r� rEr� )r� rEr� )r� rEr� )r� r�)r� rEr� )r� rEr� )r� rEr� )r� rr� )r� rEr� )r� rEr� )r� rEr� )r� r�)r� rEr� )r� rEr� )r� rEr� )r� rEr� )r� rEr� )r� rEr� )r� rEr� )r� r�)r� r0)r� r�)r� r0)r� r�)r� r0)r� r�)r� r0)r� r�)r� r0)r� r�)r� r0)r� r�)r� r0)r� r�)r� r0)r� r�)r� r0)r� r�)r� r0)r� r�)r� r0)r� r�)r� r0)r� r�)r� r0)r� r�)r� r0)r� r�)r� r0)r� r�)r� r0)r� r�)r� r0)r� r�)r� r0)r� r�)r� r0)r� r�)r� r0)r� r�)r� rEr� )r� rEr� )r� rEr� )r� rEr� )r� rEr� )r� rEr� )r� rEr� )r� rEr� )r� rEr!)r!rEr!)r!rEr!)r!rEr!)r!rEr!)r	!rEr
!)r!rEr!)r
!rEr!)r!rEr!)r!rEr!)r!rEr!)r!rEr!)r!rEr!)r!rEr!)r!rEr!)r!rEr!)r!rEr !)r!!rEr"!)r#!rEr$!)r%!rEr&!)r'!rEr(!)r)!rEr*!r�r�r�r�r��_seg_51�s�r+!ceCs�drdsdtdudvdwdxdydzd{d|d}d~dd�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�gdS)�N�rE�𐑆��𐑇� �𐑈�!�𐑉�"�𐑊�#�𐑋�$�𐑌�%�𐑍�&�𐑎�'�𐑏�(r0�r�������	�
�6�7�9�<�=�?�V�W�`�	�	�	�:	�?	�@	�	�	�	�	�
�
�
�
�
�
�
�
�
�4
�8
�;
�?
�H
�P
�Y
�`
�
��6�9�V�X�s�x���I�`���N�R�p��������������5�6�D������������� �o#�$�c$�p$�t$�0�/4)r,!rEr-!)r.!rEr/!)r0!rEr1!)r2!rEr3!)r4!rEr5!)r6!rEr7!)r8!rEr9!)r:!rEr;!)r<!rEr=!)r>!rEr?!)r@!r0)rA!r�)rB!r0)rC!r�)rD!r0)rE!r�)rF!r0)rG!r�)rH!r0)rI!r�)rJ!r0)rK!r�)rL!r0)rM!r�)rN!r0)rO!r�)rP!r0)rQ!r�)rR!r0)rS!r�)rT!r0)rU!r�)rV!r0)rW!r�)rX!r0)rY!r�)rZ!r0)r[!r�)r\!r0)r]!r�)r^!r0)r_!r�)r`!r0)ra!r�)rb!r0)rc!r�)rd!r0)re!r�)rf!r0)rg!r�)rh!r0)ri!r�)rj!r0)rk!r�)rl!r0)rm!r�)rn!r0)ro!r�)rp!r0)rq!r�)rr!r0)rs!r�)rt!r0)ru!r�)rv!r0)rw!r�)rx!r0)ry!r�)rz!r0)r{!r�)r|!r0)r}!r�)r~!r0)r!r�)r�!r0)r�!r�)r�!r0)r�!r�)r�!r0)r�!r�)r�!r0)r�!r�)r�!r0)r�!r�)r�!r0)r�!r�)r�!r0)r�!r�)r�!r0)r�!r�)r�!r0)r�!r�)r�!r0)r�!r�)r�!r0)r�!r�)r�!r0)r�!r�)r�!r0)r�!r�r�r�r�r�r��_seg_52(s�r�!cfCs�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�gdS)�N�hr0�9jr��o�Eo�Po�o�o�o�����������'��)��^�rE�𝅗𝅥�_��𝅘𝅥�`��𝅘𝅥𝅮�a��𝅘𝅥𝅯�b��𝅘𝅥𝅰�c��𝅘𝅥𝅱�d��𝅘𝅥𝅲�e��s��{����𝆹𝅥���𝆺𝅥���𝆹𝅥𝅮���𝆺𝅥𝅮���𝆹𝅥𝅯���𝆺𝅥𝅯��������F����W��`��r���rF��rH��rJ��rL��rN��rP��rR��rT��rV�	�rX�
�rZ��r\��r^�
�r`��rb��rd��rf��rh��rj��rl��rn��rp��rr��rt��rv��rx������������� ��!��"��#��$��%��&��'��(��)��*��+��,��-��.��/��0��1��2��3��4��5��6��7��8��9��:��;��<�)r�!r0)r�!r�)r�!r0)r�!r�)r�!r0)r�!r�)r�!r0)r�!r�)r�!r0)r�!r�)r�!r0)r�!r�)r�!r0)r�!r�)r�!r0)r�!rEr�!)r�!rEr�!)r�!rEr�!)r�!rEr�!)r�!rEr�!)r�!rEr�!)r�!rEr�!)r�!r0)r�!r�)r�!r0)r�!rEr�!)r�!rEr�!)r�!rEr�!)r�!rEr�!)r�!rEr�!)r�!rEr�!)r�!r0)r�!r�)r�!r0)r�!r�)r�!r0)r�!r�)r�!r0)r�!r�)r�!rErF)r�!rErH)r�!rErJ)r�!rErL)r�!rErN)r�!rErP)r�!rErR)r�!rErT)r�!rErV)r�!rErX)r�!rErZ)r�!rEr\)r�!rEr^)r�!rEr`)r�!rErb)r�!rErd)r�!rErf)r�!rErh)r�!rErj)r�!rErl)r�!rErn)r�!rErp)r�!rErr)r�!rErt)r�!rErv)r�!rErx)r�!rErF)r�!rErH)r�!rErJ)r�!rErL)r�!rErN)r�!rErP)r�!rErR)r�!rErT)r�!rErV)r�!rErX)r�!rErZ)r�!rEr\)r�!rEr^)r�!rEr`)r�!rErb)r�!rErd)r�!rErf)r�!rErh)r�!rErj)r�!rErl)r�!rErn)r�!rErp)r�!rErr)r"rErt)r"rErv)r"rErx)r"rErF)r"rErH)r"rErJ)r"rErL)r"rErN)r"rErP)r	"rErR)r
"rErT)r"rErVr�r�r�r�r��_seg_53�s�r"ceCs�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�gdS)�N�=�rErX�>�rZ�?�r\�@�r^�A�r`�B�rb�C�rd�D�rf�E�rh�F�rj�G�rl�H�rn�I�rp�J�rr�K�rt�L�rv�M�rx�N�rF�O�rH�P�rJ�Q�rL�R�rN�S�rP�T�rR�U�r��V�rV�W��X��Y��Z��[��\��]��^��_��`��a��b��c��d��e��f��g��h��i��j��k��l��m��n��o�rT�p��q��r��s��t��u��v��w��x��y��z��{��|��}��~���������������������������������������������������������������������)r
"rErX)r"rErZ)r"rEr\)r"rEr^)r"rEr`)r"rErb)r"rErd)r"rErf)r"rErh)r"rErj)r"rErl)r"rErn)r"rErp)r"rErr)r"rErt)r"rErv)r"rErx)r"rErF)r"rErH)r "rErJ)r!"rErL)r""rErN)r#"rErP)r$"rErR)r%"r�)r&"rErV)r'"rErX)r("rErZ)r)"rEr\)r*"rEr^)r+"rEr`)r,"rErb)r-"rErd)r."rErf)r/"rErh)r0"rErj)r1"rErl)r2"rErn)r3"rErp)r4"rErr)r5"rErt)r6"rErv)r7"rErx)r8"rErF)r9"rErH)r:"rErJ)r;"rErL)r<"rErN)r="rErP)r>"rErR)r?"rErT)r@"rErV)rA"rErX)rB"rErZ)rC"rEr\)rD"rEr^)rE"rEr`)rF"rErb)rG"rErd)rH"rErf)rI"rErh)rJ"rErj)rK"rErl)rL"rErn)rM"rErp)rN"rErr)rO"rErt)rP"rErv)rQ"rErx)rR"rErF)rS"rErH)rT"rErJ)rU"rErL)rV"rErN)rW"rErP)rX"rErR)rY"rErT)rZ"rErV)r["rErX)r\"rErZ)r]"rEr\)r^"rEr^)r_"rEr`)r`"rErb)ra"rErd)rb"rErf)rc"rErh)rd"rErj)re"rErl)rf"rErn)rg"rErp)rh"rErr)ri"rErt)rj"rErv)rk"rErx)rl"rErF)rm"r�)rn"rErJ)ro"rErL)rp"r�r�r�r�r�r��_seg_54�s�rq"cfCs�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�gdS)�N��rErR��r���rX��rZ����r`��rb��rd��rf����rj��rl��rn��rp��rr��rt��rv��rx��rF��rH��rJ��rL����rP����rT��rV������r\���r^���������������rh���������������������������������������rN��������������������������������������������������������������������������������������������������������������������������������������)rr"rErR)rs"r�)rt"rErX)ru"rErZ)rv"r�)rw"rEr`)rx"rErb)ry"rErd)rz"rErf)r{"r�)r|"rErj)r}"rErl)r~"rErn)r"rErp)r�"rErr)r�"rErt)r�"rErv)r�"rErx)r�"rErF)r�"rErH)r�"rErJ)r�"rErL)r�"r�)r�"rErP)r�"r�)r�"rErT)r�"rErV)r�"rErX)r�"rErZ)r�"rEr\)r�"rEr^)r�"rEr`)r�"r�)r�"rErd)r�"rErf)r�"rErh)r�"rErj)r�"rErl)r�"rErn)r�"rErp)r�"rErr)r�"rErt)r�"rErv)r�"rErx)r�"rErF)r�"rErH)r�"rErJ)r�"rErL)r�"rErN)r�"rErP)r�"rErR)r�"rErT)r�"rErV)r�"rErX)r�"rErZ)r�"rEr\)r�"rEr^)r�"rEr`)r�"rErb)r�"rErd)r�"rErf)r�"rErh)r�"rErj)r�"rErl)r�"rErn)r�"rErp)r�"rErr)r�"rErt)r�"rErv)r�"rErx)r�"rErF)r�"rErH)r�"rErJ)r�"rErL)r�"rErN)r�"rErP)r�"rErR)r�"rErT)r�"rErV)r�"rErX)r�"rErZ)r�"rEr\)r�"rEr^)r�"rEr`)r�"rErb)r�"rErd)r�"rErf)r�"rErh)r�"rErj)r�"rErl)r�"rErn)r�"rErp)r�"rErr)r�"rErt)r�"rErv)r�"rErx)r�"rErF)r�"rErH)r�"r�)r�"rErLr�r�r�r�r��_seg_55`s�r�"cfCs�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�gdS)�N��rErN�	�rP�
�rR��r��
�rX��rZ��r\��r^��r`��rb��rd��rf����rj��rl��rn��rp��rr��rt��rv����rF��rH� �rJ�!�rL�"��#��$��%�rT�&�rV�'��(��)��*��+��,��-��.��/�rh�0��1��2��3��4��5��6��7�rx�8��9��:��;��<��=��>��?��@��A��B��C��D��E��F��G��J��K��L��M��N��O��P��Q��R��S��T��U��V��W��X��Y��Z��[��\��]��^��_��`��a��b��c��d��e��f��g��h��i��j��k��l��m��n�)r�"rErN)r�"rErP)r�"rErR)r�"r�)r�"rErX)r�"rErZ)r�"rEr\)r�"rEr^)r�"rEr`)r�"rErb)r�"rErd)r�"rErf)r�"r�)r�"rErj)r�"rErl)r�"rErn)r�"rErp)r�"rErr)r�"rErt)r�"rErv)r�"r�)r�"rErF)r�"rErH)r�"rErJ)r�"rErL)r�"rErN)r�"rErP)r�"rErR)r�"rErT)r�"rErV)r�"rErX)r�"rErZ)r�"rEr\)r�"rEr^)r�"rEr`)r�"rErb)r�"rErd)r�"rErf)r�"rErh)r�"rErj)r�"rErl)r#rErn)r#rErp)r#rErr)r#rErt)r#rErv)r#rErx)r#rErF)r#rErH)r#r�)r	#rErL)r
#rErN)r#rErP)r#rErR)r
#r�)r#rErV)r#rErX)r#rErZ)r#rEr\)r#rEr^)r#r�)r#rErb)r#r�)r#rErj)r#rErl)r#rErn)r#rErp)r#rErr)r#rErt)r#rErv)r#r�)r#rErF)r#rErH)r #rErJ)r!#rErL)r"#rErN)r##rErP)r$#rErR)r%#rErT)r&#rErV)r'#rErX)r(#rErZ)r)#rEr\)r*#rEr^)r+#rEr`)r,#rErb)r-#rErd)r.#rErf)r/#rErh)r0#rErj)r1#rErl)r2#rErn)r3#rErp)r4#rErr)r5#rErt)r6#rErv)r7#rErx)r8#rErF)r9#rErH)r:#rErJr�r�r�r�r��_seg_56�s�r;#cfCs�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�gdS)�N�o�rErL�p�rN�q�rP�r�rR�s�rT�t�rV�u�rX�v�rZ�w�r\�x�r^�y�r`�z�rb�{�rd�|�rf�}�rh�~�rj��rl��rn��rp��rr��rt��rv��rx��rF��rH��rJ���������������������������������������������������������������������������������������������������������������������������������������������������������������������)r<#rErL)r=#rErN)r>#rErP)r?#rErR)r@#rErT)rA#rErV)rB#rErX)rC#rErZ)rD#rEr\)rE#rEr^)rF#rEr`)rG#rErb)rH#rErd)rI#rErf)rJ#rErh)rK#rErj)rL#rErl)rM#rErn)rN#rErp)rO#rErr)rP#rErt)rQ#rErv)rR#rErx)rS#rErF)rT#rErH)rU#rErJ)rV#rErL)rW#rErN)rX#rErP)rY#rErR)rZ#rErT)r[#rErV)r\#rErX)r]#rErZ)r^#rEr\)r_#rEr^)r`#rEr`)ra#rErb)rb#rErd)rc#rErf)rd#rErh)re#rErj)rf#rErl)rg#rErn)rh#rErp)ri#rErr)rj#rErt)rk#rErv)rl#rErx)rm#rErF)rn#rErH)ro#rErJ)rp#rErL)rq#rErN)rr#rErP)rs#rErR)rt#rErT)ru#rErV)rv#rErX)rw#rErZ)rx#rEr\)ry#rEr^)rz#rEr`)r{#rErb)r|#rErd)r}#rErf)r~#rErh)r#rErj)r�#rErl)r�#rErn)r�#rErp)r�#rErr)r�#rErt)r�#rErv)r�#rErx)r�#rErF)r�#rErH)r�#rErJ)r�#rErL)r�#rErN)r�#rErP)r�#rErR)r�#rErT)r�#rErV)r�#rErX)r�#rErZ)r�#rEr\)r�#rEr^)r�#rEr`)r�#rErb)r�#rErd)r�#rErf)r�#rErh)r�#rErj)r�#rErl)r�#rErn)r�#rErp)r�#rErr)r�#rErt)r�#rErvr�r�r�r�r��_seg_570s�r�#cfCs�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�gdS)�N���rErx���rF���rH���rJ���rL���rN���rP���rR���rT���rV���rX���rZ���r\���r^���r`���rb���rd���rf���rh���rj���rl���rn���rp���rr���rt���rv�����������������������������������������������������������������	��
������
�������������������������������������� ��!��"��#��$��%��&��'��(��)��*��+��,��-��.��/��0��1��2��3��4��5��6�)r�#rErx)r�#rErF)r�#rErH)r�#rErJ)r�#rErL)r�#rErN)r�#rErP)r�#rErR)r�#rErT)r�#rErV)r�#rErX)r�#rErZ)r�#rEr\)r�#rEr^)r�#rEr`)r�#rErb)r�#rErd)r�#rErf)r�#rErh)r�#rErj)r�#rErl)r�#rErn)r�#rErp)r�#rErr)r�#rErt)r�#rErv)r�#rErx)r�#rErF)r�#rErH)r�#rErJ)r�#rErL)r�#rErN)r�#rErP)r�#rErR)r�#rErT)r�#rErV)r�#rErX)r�#rErZ)r�#rEr\)r�#rEr^)r�#rEr`)r�#rErb)r�#rErd)r�#rErf)r�#rErh)r�#rErj)r�#rErl)r�#rErn)r�#rErp)r�#rErr)r�#rErt)r�#rErv)r�#rErx)r�#rErF)r�#rErH)r�#rErJ)r�#rErL)r�#rErN)r�#rErP)r�#rErR)r�#rErT)r�#rErV)r�#rErX)r�#rErZ)r�#rEr\)r�#rEr^)r�#rEr`)r�#rErb)r�#rErd)r�#rErf)r�#rErh)r�#rErj)r�#rErl)r�#rErn)r�#rErp)r�#rErr)r�#rErt)r�#rErv)r�#rErx)r�#rErF)r�#rErH)r�#rErJ)r�#rErL)r�#rErN)r�#rErP)r�#rErR)r�#rErT)r�#rErV)r�#rErX)r�#rErZ)r�#rEr\)r�#rEr^)r�#rEr`)r�#rErb)r�#rErd)r$rErf)r$rErh)r$rErj)r$rErl)r$rErnr�r�r�r�r��_seg_58�s�r$cfCs�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�gdS)�N�7�rErp�8�rr�9�rt�:�rv�;�rx�<�rF�=�rH�>�rJ�?�rL�@�rN�A�rP�B�rR�C�rT�D�rV�E�rX�F�rZ�G�r\�H�r^�I�r`�J�rb�K�rd�L�rf�M�rh�N�rj�O�rl�P�rn�Q��R��S��T��U��V��W��X��Y��Z��[��\��]��^��_��`��a��b��c��d��e��f��g��h��i��j��k��l��m��n��o��p��q��r��s��t��u��v��w��x��y��z��{��|��}��~���������������������������������������������������������)r$rErp)r$rErr)r$rErt)r	$rErv)r
$rErx)r$rErF)r$rErH)r
$rErJ)r$rErL)r$rErN)r$rErP)r$rErR)r$rErT)r$rErV)r$rErX)r$rErZ)r$rEr\)r$rEr^)r$rEr`)r$rErb)r$rErd)r$rErf)r$rErh)r$rErj)r$rErl)r$rErn)r $rErp)r!$rErr)r"$rErt)r#$rErv)r$$rErx)r%$rErF)r&$rErH)r'$rErJ)r($rErL)r)$rErN)r*$rErP)r+$rErR)r,$rErT)r-$rErV)r.$rErX)r/$rErZ)r0$rEr\)r1$rEr^)r2$rEr`)r3$rErb)r4$rErd)r5$rErf)r6$rErh)r7$rErj)r8$rErl)r9$rErn)r:$rErp)r;$rErr)r<$rErt)r=$rErv)r>$rErx)r?$rErF)r@$rErH)rA$rErJ)rB$rErL)rC$rErN)rD$rErP)rE$rErR)rF$rErT)rG$rErV)rH$rErX)rI$rErZ)rJ$rEr\)rK$rEr^)rL$rEr`)rM$rErb)rN$rErd)rO$rErf)rP$rErh)rQ$rErj)rR$rErl)rS$rErn)rT$rErp)rU$rErr)rV$rErt)rW$rErv)rX$rErx)rY$rErF)rZ$rErH)r[$rErJ)r\$rErL)r]$rErN)r^$rErP)r_$rErR)r`$rErT)ra$rErV)rb$rErX)rc$rErZ)rd$rEr\)re$rEr^)rf$rEr`)rg$rErb)rh$rErd)ri$rErfr�r�r�r�r��_seg_59s�rj$cfCs�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�gdS)�N��rErh��rj��rl��rn��rp��rr��rt��rv��rx���ı���ȷ��r����α���β���γ���δ���ε���ζ���η���θ���ι���κ���λ���μ���ν���ξ���ο���π���ρ�����σ���τ���υ���φ���χ���ψ���ω���∇����������������������������������������������������������������������������∂���������������������������������������������������������������������������������������������������)rk$rErh)rl$rErj)rm$rErl)rn$rErn)ro$rErp)rp$rErr)rq$rErt)rr$rErv)rs$rErx)rt$rEru$)rv$rErw$)rx$r�)ry$rErz$)r{$rEr|$)r}$rEr~$)r$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rErz$)r�$rEr|$)r�$rEr~$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rErz$)r�$rEr|$)r�$rEr~$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rErz$)r�$rEr|$)r�$rEr~$)r�$rEr�$)r�$rEr�$r�r�r�r�r��_seg_60hs�r�$cfCs�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�gdS)�N��rE�ζ���η���θ���ι���κ���λ���μ���ν�	��ξ�
��ο���π���ρ�
��σ���τ���υ���φ���χ���ψ���ω���∂���ε�������������α���β���γ���δ� ��!��"��#��$��%��&��'��(��)��*��+��,��-��.��/��0��1��2��3��4��5��∇�6��7��8��9��:��;��<��=��>��?��@��A��B��C��D��E��F��G��I��J��K��L��M��N��O��P��Q��R��S��T��U��V��W��X��Y��Z��[��\��]��^��_��`��a��b��c��d��e��f�)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r�$rEr�$)r%rEr%)r%rEr%)r%rEr%)r%rEr%)r%rEr	%)r
%rEr%)r%rEr
%)r%rEr%)r%rEr%)r%rEr%)r%rEr%)r%rEr�$)r%rEr�$)r%rEr%)r%rEr%)r%rEr%)r%rEr%)r%rEr%)r%rEr %)r!%rEr"%)r#%rEr%)r$%rEr�$)r%%rEr�$)r&%rEr�$)r'%rEr�$)r(%rEr�$)r)%rEr�$)r*%rEr�$)r+%rEr�$)r,%rEr�$)r-%rEr�$)r.%rEr%)r/%rEr%)r0%rEr�$)r1%rEr%)r2%rEr%)r3%rEr	%)r4%rEr%)r5%rEr
%)r6%rEr%)r7%rEr%)r8%rEr9%)r:%rEr%)r;%rEr%)r<%rEr %)r=%rEr"%)r>%rEr%)r?%rEr�$)r@%rEr�$)rA%rEr�$)rB%rEr�$)rC%rEr�$)rD%rEr�$)rE%rEr�$)rF%rEr�$)rG%rEr�$)rH%rEr�$)rI%rEr%)rJ%rEr%)rK%rEr%)rL%rEr%)rM%rEr	%)rN%rEr%)rO%rEr
%)rP%rEr%)rQ%rEr%)rR%rEr%)rS%rEr%)rT%rEr�$)rU%rEr�$)rV%rEr%)rW%rEr%)rX%rEr%)rY%rEr%)rZ%rEr%)r[%rEr %)r\%rEr"%)r]%rEr%)r^%rEr�$)r_%rEr�$)r`%rEr�$)ra%rEr�$)rb%rEr�$)rc%rEr�$)rd%rEr�$)re%rEr�$)rf%rEr�$)rg%rEr�$)rh%rEr%)ri%rEr%r�r�r�r�r��_seg_61�s�rj%cfCs�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�gdS)�N�g�rE�θ�h��σ�i��τ�j��υ�k��φ�l��χ�m��ψ�n��ω�o��∇�p��α�q��β�r��γ�s��δ�t��ε�u��ζ�v��η�w��x��ι�y��κ�z��λ�{��μ�|��ν�}��ξ�~��ο���π���ρ�����������������∂������������������������������������������������������������������������������������������������������������������������������������������ϝ���r����r�)rk%rErl%)rm%rErn%)ro%rErp%)rq%rErr%)rs%rErt%)ru%rErv%)rw%rErx%)ry%rErz%)r{%rEr|%)r}%rEr~%)r%rEr�%)r�%rEr�%)r�%rEr�%)r�%rEr�%)r�%rEr�%)r�%rEr�%)r�%rErl%)r�%rEr�%)r�%rEr�%)r�%rEr�%)r�%rEr�%)r�%rEr�%)r�%rEr�%)r�%rEr�%)r�%rEr�%)r�%rEr�%)r�%rErn%)r�%rErp%)r�%rErr%)r�%rErt%)r�%rErv%)r�%rErx%)r�%rErz%)r�%rEr�%)r�%rEr�%)r�%rErl%)r�%rEr�%)r�%rErt%)r�%rEr�%)r�%rEr�%)r�%rEr~%)r�%rEr�%)r�%rEr�%)r�%rEr�%)r�%rEr�%)r�%rEr�%)r�%rEr�%)r�%rErl%)r�%rEr�%)r�%rEr�%)r�%rEr�%)r�%rEr�%)r�%rEr�%)r�%rEr�%)r�%rEr�%)r�%rEr�%)r�%rEr�%)r�%rErl%)r�%rErn%)r�%rErp%)r�%rErr%)r�%rErt%)r�%rErv%)r�%rErx%)r�%rErz%)r�%rEr|%)r�%rEr~%)r�%rEr�%)r�%rEr�%)r�%rEr�%)r�%rEr�%)r�%rEr�%)r�%rEr�%)r�%rErl%)r�%rEr�%)r�%rEr�%)r�%rEr�%)r�%rEr�%)r�%rEr�%)r�%rEr�%)r�%rEr�%)r�%rEr�%)r�%rEr�%)r�%rErn%)r�%rErp%)r�%rErr%)r�%rErt%)r�%rErv%)r�%rErx%)r�%rErz%)r�%rEr�%)r�%rEr�%)r�%rErl%)r�%rEr�%)r�%rErt%)r�%rEr�%)r�%rEr�%)r�%rEr�%)r�%r�)r�%rEr�r�r�r�r�r��_seg_628s�r�%cfCs�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�gdS)�N���rEr����r����r���r����r����r����r����r����r����r�������������������������������������������������������������������������������������������������������������r����ا���ب���ج���د�����و���ز���ح���ط�	��ي�
��ك���ل���م�
��ن���س���ع���ف���ص���ق���ر���ش���ت���ث���خ���ذ���ض���ظ���غ���ٮ���ں���ڡ���ٯ� ��!��"��#��$��ه�%��'��(��)��*��+��,��-��.��/��0��1��2�)r�%rEr�)r�%rEr�)r�%rEr)r�%rEr�)r�%rEr�)r�%rEr�)r�%rEr�)r�%rEr�)r�%rEr�)r�%rEr�)r�%rEr�)r�%rEr�)r�%rEr)r�%rEr�)r�%rEr�)r�%rEr�)r�%rEr�)r�%rEr�)r�%rEr�)r�%rEr�)r�%rEr�)r&rEr�)r&rEr)r&rEr�)r&rEr�)r&rEr�)r&rEr�)r&rEr�)r&rEr�)r&rEr�)r	&rEr�)r
&rEr�)r&rEr)r&rEr�)r
[binary data omitted: remainder of a compiled Python module containing the uts46data segment tables]

site-packages/pip/_vendor/idna/__pycache__/codec.cpython-36.pyc
[binary data omitted: compiled bytecode for the idna codec module]

site-packages/pip/_vendor/idna/core.py
from . import idnadata
import bisect
import unicodedata
import re
import sys
from .intranges import intranges_contain

_virama_combining_class = 9
_alabel_prefix = b'xn--'
_unicode_dots_re = re.compile(u'[\u002e\u3002\uff0e\uff61]')

if sys.version_info[0] == 3:
    unicode = str
    unichr = chr

class IDNAError(UnicodeError):
    """ Base exception for all IDNA-encoding related problems """
    pass


class IDNABidiError(IDNAError):
    """ Exception when bidirectional requirements are not satisfied """
    pass


class InvalidCodepoint(IDNAError):
    """ Exception when a disallowed or unallocated codepoint is used """
    pass


class InvalidCodepointContext(IDNAError):
    """ Exception when the codepoint is not valid in the context it is used """
    pass


def _combining_class(cp):
    return unicodedata.combining(unichr(cp))

def _is_script(cp, script):
    return intranges_contain(ord(cp), idnadata.scripts[script])

def _punycode(s):
    return s.encode('punycode')

def _unot(s):
    return 'U+{0:04X}'.format(s)


def valid_label_length(label):

    if len(label) > 63:
        return False
    return True


def valid_string_length(label, trailing_dot):

    if len(label) > (254 if trailing_dot else 253):
        return False
    return True


def check_bidi(label, check_ltr=False):

    # Bidi rules should only be applied if string contains RTL characters
    bidi_label = False
    for (idx, cp) in enumerate(label, 1):
        direction = unicodedata.bidirectional(cp)
        if direction == '':
            # String likely comes from a newer version of Unicode
            raise IDNABidiError('Unknown directionality in label {0} at position {1}'.format(repr(label), idx))
        if direction in ['R', 'AL', 'AN']:
            bidi_label = True
            break
    if not bidi_label and not check_ltr:
        return True

    # Bidi rule 1
    direction = unicodedata.bidirectional(label[0])
    if direction in ['R', 'AL']:
        rtl = True
    elif direction == 'L':
        rtl = False
    else:
        raise IDNABidiError('First codepoint in label {0} must be directionality L, R or AL'.format(repr(label)))

    valid_ending = False
    number_type = False
    for (idx, cp) in enumerate(label, 1):
        direction = unicodedata.bidirectional(cp)

        if rtl:
            # Bidi rule 2
            if not direction in ['R', 'AL', 'AN', 'EN', 'ES', 'CS', 'ET', 'ON', 'BN', 'NSM']:
                raise IDNABidiError('Invalid direction for codepoint at position {0} in a right-to-left label'.format(idx))
            # Bidi rule 3
            if direction in ['R', 'AL', 'EN', 'AN']:
                valid_ending = True
            elif direction != 'NSM':
                valid_ending = False
            # Bidi rule 4
            if direction in ['AN', 'EN']:
                if not number_type:
                    number_type = direction
                else:
                    if number_type != direction:
                        raise IDNABidiError('Can not mix numeral types in a right-to-left label')
        else:
            # Bidi rule 5
            if not direction in ['L', 'EN', 'ES', 'CS', 'ET', 'ON', 'BN', 'NSM']:
                raise IDNABidiError('Invalid direction for codepoint at position {0} in a left-to-right label'.format(idx))
            # Bidi rule 6
            if direction in ['L', 'EN']:
                valid_ending = True
            elif direction != 'NSM':
                valid_ending = False

    if not valid_ending:
        raise IDNABidiError('Label ends with illegal codepoint directionality')

    return True
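
# Illustrative note (added here, not part of the original module): the checks
# above implement the RFC 5893 Bidi rules.  For example, a right-to-left label
# that mixes ASCII digits (bidi class 'EN') with Arabic-Indic digits ('AN')
# violates Bidi rule 4 and raises
# IDNABidiError('Can not mix numeral types in a right-to-left label').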


def check_initial_combiner(label):

    if unicodedata.category(label[0])[0] == 'M':
        raise IDNAError('Label begins with an illegal combining character')
    return True


def check_hyphen_ok(label):

    if label[2:4] == '--':
        raise IDNAError('Label has disallowed hyphens in 3rd and 4th position')
    if label[0] == '-' or label[-1] == '-':
        raise IDNAError('Label must not start or end with a hyphen')
    return True


def check_nfc(label):

    if unicodedata.normalize('NFC', label) != label:
        raise IDNAError('Label must be in Normalization Form C')


def valid_contextj(label, pos):

    cp_value = ord(label[pos])

    if cp_value == 0x200c:

        if pos > 0:
            if _combining_class(ord(label[pos - 1])) == _virama_combining_class:
                return True

        ok = False
        for i in range(pos-1, -1, -1):
            joining_type = idnadata.joining_types.get(ord(label[i]))
            if joining_type == ord('T'):
                continue
            if joining_type in [ord('L'), ord('D')]:
                ok = True
                break

        if not ok:
            return False

        ok = False
        for i in range(pos+1, len(label)):
            joining_type = idnadata.joining_types.get(ord(label[i]))
            if joining_type == ord('T'):
                continue
            if joining_type in [ord('R'), ord('D')]:
                ok = True
                break
        return ok

    if cp_value == 0x200d:

        if pos > 0:
            if _combining_class(ord(label[pos - 1])) == _virama_combining_class:
                return True
        return False

    else:

        return False


def valid_contexto(label, pos, exception=False):

    cp_value = ord(label[pos])

    if cp_value == 0x00b7:
        if 0 < pos < len(label)-1:
            if ord(label[pos - 1]) == 0x006c and ord(label[pos + 1]) == 0x006c:
                return True
        return False

    elif cp_value == 0x0375:
        if pos < len(label)-1 and len(label) > 1:
            return _is_script(label[pos + 1], 'Greek')
        return False

    elif cp_value == 0x05f3 or cp_value == 0x05f4:
        if pos > 0:
            return _is_script(label[pos - 1], 'Hebrew')
        return False

    elif cp_value == 0x30fb:
        for cp in label:
            if cp == u'\u30fb':
                continue
            if _is_script(cp, 'Hiragana') or _is_script(cp, 'Katakana') or _is_script(cp, 'Han'):
                return True
        return False

    elif 0x660 <= cp_value <= 0x669:
        for cp in label:
            if 0x6f0 <= ord(cp) <= 0x06f9:
                return False
        return True

    elif 0x6f0 <= cp_value <= 0x6f9:
        for cp in label:
            if 0x660 <= ord(cp) <= 0x0669:
                return False
        return True


def check_label(label):

    if isinstance(label, (bytes, bytearray)):
        label = label.decode('utf-8')
    if len(label) == 0:
        raise IDNAError('Empty Label')

    check_nfc(label)
    check_hyphen_ok(label)
    check_initial_combiner(label)

    for (pos, cp) in enumerate(label):
        cp_value = ord(cp)
        if intranges_contain(cp_value, idnadata.codepoint_classes['PVALID']):
            continue
        elif intranges_contain(cp_value, idnadata.codepoint_classes['CONTEXTJ']):
            if not valid_contextj(label, pos):
                raise InvalidCodepointContext('Joiner {0} not allowed at position {1} in {2}'.format(_unot(cp_value), pos+1, repr(label)))
        elif intranges_contain(cp_value, idnadata.codepoint_classes['CONTEXTO']):
            if not valid_contexto(label, pos):
                raise InvalidCodepointContext('Codepoint {0} not allowed at position {1} in {2}'.format(_unot(cp_value), pos+1, repr(label)))
        else:
            raise InvalidCodepoint('Codepoint {0} at position {1} of {2} not allowed'.format(_unot(cp_value), pos+1, repr(label)))

    check_bidi(label)


def alabel(label):

    try:
        label = label.encode('ascii')
        try:
            ulabel(label)
        except IDNAError:
            raise IDNAError('The label {0} is not a valid A-label'.format(label))
        if not valid_label_length(label):
            raise IDNAError('Label too long')
        return label
    except UnicodeEncodeError:
        pass

    if not label:
        raise IDNAError('No Input')

    label = unicode(label)
    check_label(label)
    label = _punycode(label)
    label = _alabel_prefix + label

    if not valid_label_length(label):
        raise IDNAError('Label too long')

    return label


def ulabel(label):

    if not isinstance(label, (bytes, bytearray)):
        try:
            label = label.encode('ascii')
        except UnicodeEncodeError:
            check_label(label)
            return label

    label = label.lower()
    if label.startswith(_alabel_prefix):
        label = label[len(_alabel_prefix):]
    else:
        check_label(label)
        return label.decode('ascii')

    label = label.decode('punycode')
    check_label(label)
    return label


def uts46_remap(domain, std3_rules=True, transitional=False):
    """Re-map the characters in the string according to UTS46 processing."""
    from .uts46data import uts46data
    output = u""
    try:
        for pos, char in enumerate(domain):
            code_point = ord(char)
            uts46row = uts46data[code_point if code_point < 256 else
                bisect.bisect_left(uts46data, (code_point, "Z")) - 1]
            status = uts46row[1]
            replacement = uts46row[2] if len(uts46row) == 3 else None
            if (status == "V" or
                    (status == "D" and not transitional) or
                    (status == "3" and std3_rules and replacement is None)):
                output += char
            elif replacement is not None and (status == "M" or
                    (status == "3" and std3_rules) or
                    (status == "D" and transitional)):
                output += replacement
            elif status != "I":
                raise IndexError()
        return unicodedata.normalize("NFC", output)
    except IndexError:
        raise InvalidCodepoint(
            "Codepoint {0} not allowed at position {1} in {2}".format(
            _unot(code_point), pos + 1, repr(domain)))
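
# Illustrative note (added here, not part of the original module): with the
# default transitional=False the UTS 46 table lowercases but preserves the
# deviation character U+00DF, so
#   uts46_remap(u'K\xf6nigsg\xe4\xdfchen')  ->  u'k\xf6nigsg\xe4\xdfchen'
# whereas transitional=True maps it to 'ss':
#   uts46_remap(u'K\xf6nigsg\xe4\xdfchen', transitional=True)  ->  u'k\xf6nigsg\xe4sschen'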


def encode(s, strict=False, uts46=False, std3_rules=False, transitional=False):

    if isinstance(s, (bytes, bytearray)):
        s = s.decode("ascii")
    if uts46:
        s = uts46_remap(s, std3_rules, transitional)
    trailing_dot = False
    result = []
    if strict:
        labels = s.split('.')
    else:
        labels = _unicode_dots_re.split(s)
    while labels and not labels[0]:
        del labels[0]
    if not labels:
        raise IDNAError('Empty domain')
    if labels[-1] == '':
        del labels[-1]
        trailing_dot = True
    for label in labels:
        result.append(alabel(label))
    if trailing_dot:
        result.append(b'')
    s = b'.'.join(result)
    if not valid_string_length(s, trailing_dot):
        raise IDNAError('Domain too long')
    return s


def decode(s, strict=False, uts46=False, std3_rules=False):

    if isinstance(s, (bytes, bytearray)):
        s = s.decode("ascii")
    if uts46:
        s = uts46_remap(s, std3_rules, False)
    trailing_dot = False
    result = []
    if not strict:
        labels = _unicode_dots_re.split(s)
    else:
        labels = s.split(u'.')
    while labels and not labels[0]:
        del labels[0]
    if not labels:
        raise IDNAError('Empty domain')
    if not labels[-1]:
        del labels[-1]
        trailing_dot = True
    for label in labels:
        result.append(ulabel(label))
    if trailing_dot:
        result.append(u'')
    return u'.'.join(result)
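
# Usage sketch (added for illustration, not part of the original module):
# encode() splits its input on the four recognised label separators, converts
# each label with alabel() and joins the results with b'.', while decode()
# reverses the process via ulabel().  A pure-ASCII name therefore round-trips
# unchanged:
#
#   >>> encode(u'python.org')
#   b'python.org'
#
# Non-ASCII labels come back from encode() punycoded under the b'xn--' prefix.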
site-packages/pip/_vendor/idna/idnadata.py
# This file is automatically generated by tools/idna-data

__version__ = "6.3.0"
scripts = {
    'Greek': (
        0x37000000374,
        0x37500000378,
        0x37a0000037e,
        0x38400000385,
        0x38600000387,
        0x3880000038b,
        0x38c0000038d,
        0x38e000003a2,
        0x3a3000003e2,
        0x3f000000400,
        0x1d2600001d2b,
        0x1d5d00001d62,
        0x1d6600001d6b,
        0x1dbf00001dc0,
        0x1f0000001f16,
        0x1f1800001f1e,
        0x1f2000001f46,
        0x1f4800001f4e,
        0x1f5000001f58,
        0x1f5900001f5a,
        0x1f5b00001f5c,
        0x1f5d00001f5e,
        0x1f5f00001f7e,
        0x1f8000001fb5,
        0x1fb600001fc5,
        0x1fc600001fd4,
        0x1fd600001fdc,
        0x1fdd00001ff0,
        0x1ff200001ff5,
        0x1ff600001fff,
        0x212600002127,
        0x101400001018b,
        0x1d2000001d246,
    ),
    'Han': (
        0x2e8000002e9a,
        0x2e9b00002ef4,
        0x2f0000002fd6,
        0x300500003006,
        0x300700003008,
        0x30210000302a,
        0x30380000303c,
        0x340000004db6,
        0x4e0000009fcd,
        0xf9000000fa6e,
        0xfa700000fada,
        0x200000002a6d7,
        0x2a7000002b735,
        0x2b7400002b81e,
        0x2f8000002fa1e,
    ),
    'Hebrew': (
        0x591000005c8,
        0x5d0000005eb,
        0x5f0000005f5,
        0xfb1d0000fb37,
        0xfb380000fb3d,
        0xfb3e0000fb3f,
        0xfb400000fb42,
        0xfb430000fb45,
        0xfb460000fb50,
    ),
    'Hiragana': (
        0x304100003097,
        0x309d000030a0,
        0x1b0010001b002,
        0x1f2000001f201,
    ),
    'Katakana': (
        0x30a1000030fb,
        0x30fd00003100,
        0x31f000003200,
        0x32d0000032ff,
        0x330000003358,
        0xff660000ff70,
        0xff710000ff9e,
        0x1b0000001b001,
    ),
}
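# Reader's note (added for illustration, not part of the generated data): each
# value in the script tables above packs a half-open codepoint range into a
# single integer as (start << 32) | end, the encoding consumed by
# intranges.intranges_contain(); e.g. 0x37000000374 stands for the range
# U+0370 (inclusive) through U+0374 (exclusive).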
joining_types = {
    0x600: 85,
    0x601: 85,
    0x602: 85,
    0x603: 85,
    0x604: 85,
    0x608: 85,
    0x60b: 85,
    0x620: 68,
    0x621: 85,
    0x622: 82,
    0x623: 82,
    0x624: 82,
    0x625: 82,
    0x626: 68,
    0x627: 82,
    0x628: 68,
    0x629: 82,
    0x62a: 68,
    0x62b: 68,
    0x62c: 68,
    0x62d: 68,
    0x62e: 68,
    0x62f: 82,
    0x630: 82,
    0x631: 82,
    0x632: 82,
    0x633: 68,
    0x634: 68,
    0x635: 68,
    0x636: 68,
    0x637: 68,
    0x638: 68,
    0x639: 68,
    0x63a: 68,
    0x63b: 68,
    0x63c: 68,
    0x63d: 68,
    0x63e: 68,
    0x63f: 68,
    0x640: 67,
    0x641: 68,
    0x642: 68,
    0x643: 68,
    0x644: 68,
    0x645: 68,
    0x646: 68,
    0x647: 68,
    0x648: 82,
    0x649: 68,
    0x64a: 68,
    0x66e: 68,
    0x66f: 68,
    0x671: 82,
    0x672: 82,
    0x673: 82,
    0x674: 85,
    0x675: 82,
    0x676: 82,
    0x677: 82,
    0x678: 68,
    0x679: 68,
    0x67a: 68,
    0x67b: 68,
    0x67c: 68,
    0x67d: 68,
    0x67e: 68,
    0x67f: 68,
    0x680: 68,
    0x681: 68,
    0x682: 68,
    0x683: 68,
    0x684: 68,
    0x685: 68,
    0x686: 68,
    0x687: 68,
    0x688: 82,
    0x689: 82,
    0x68a: 82,
    0x68b: 82,
    0x68c: 82,
    0x68d: 82,
    0x68e: 82,
    0x68f: 82,
    0x690: 82,
    0x691: 82,
    0x692: 82,
    0x693: 82,
    0x694: 82,
    0x695: 82,
    0x696: 82,
    0x697: 82,
    0x698: 82,
    0x699: 82,
    0x69a: 68,
    0x69b: 68,
    0x69c: 68,
    0x69d: 68,
    0x69e: 68,
    0x69f: 68,
    0x6a0: 68,
    0x6a1: 68,
    0x6a2: 68,
    0x6a3: 68,
    0x6a4: 68,
    0x6a5: 68,
    0x6a6: 68,
    0x6a7: 68,
    0x6a8: 68,
    0x6a9: 68,
    0x6aa: 68,
    0x6ab: 68,
    0x6ac: 68,
    0x6ad: 68,
    0x6ae: 68,
    0x6af: 68,
    0x6b0: 68,
    0x6b1: 68,
    0x6b2: 68,
    0x6b3: 68,
    0x6b4: 68,
    0x6b5: 68,
    0x6b6: 68,
    0x6b7: 68,
    0x6b8: 68,
    0x6b9: 68,
    0x6ba: 68,
    0x6bb: 68,
    0x6bc: 68,
    0x6bd: 68,
    0x6be: 68,
    0x6bf: 68,
    0x6c0: 82,
    0x6c1: 68,
    0x6c2: 68,
    0x6c3: 82,
    0x6c4: 82,
    0x6c5: 82,
    0x6c6: 82,
    0x6c7: 82,
    0x6c8: 82,
    0x6c9: 82,
    0x6ca: 82,
    0x6cb: 82,
    0x6cc: 68,
    0x6cd: 82,
    0x6ce: 68,
    0x6cf: 82,
    0x6d0: 68,
    0x6d1: 68,
    0x6d2: 82,
    0x6d3: 82,
    0x6d5: 82,
    0x6dd: 85,
    0x6ee: 82,
    0x6ef: 82,
    0x6fa: 68,
    0x6fb: 68,
    0x6fc: 68,
    0x6ff: 68,
    0x710: 82,
    0x712: 68,
    0x713: 68,
    0x714: 68,
    0x715: 82,
    0x716: 82,
    0x717: 82,
    0x718: 82,
    0x719: 82,
    0x71a: 68,
    0x71b: 68,
    0x71c: 68,
    0x71d: 68,
    0x71e: 82,
    0x71f: 68,
    0x720: 68,
    0x721: 68,
    0x722: 68,
    0x723: 68,
    0x724: 68,
    0x725: 68,
    0x726: 68,
    0x727: 68,
    0x728: 82,
    0x729: 68,
    0x72a: 82,
    0x72b: 68,
    0x72c: 82,
    0x72d: 68,
    0x72e: 68,
    0x72f: 82,
    0x74d: 82,
    0x74e: 68,
    0x74f: 68,
    0x750: 68,
    0x751: 68,
    0x752: 68,
    0x753: 68,
    0x754: 68,
    0x755: 68,
    0x756: 68,
    0x757: 68,
    0x758: 68,
    0x759: 82,
    0x75a: 82,
    0x75b: 82,
    0x75c: 68,
    0x75d: 68,
    0x75e: 68,
    0x75f: 68,
    0x760: 68,
    0x761: 68,
    0x762: 68,
    0x763: 68,
    0x764: 68,
    0x765: 68,
    0x766: 68,
    0x767: 68,
    0x768: 68,
    0x769: 68,
    0x76a: 68,
    0x76b: 82,
    0x76c: 82,
    0x76d: 68,
    0x76e: 68,
    0x76f: 68,
    0x770: 68,
    0x771: 82,
    0x772: 68,
    0x773: 82,
    0x774: 82,
    0x775: 68,
    0x776: 68,
    0x777: 68,
    0x778: 82,
    0x779: 82,
    0x77a: 68,
    0x77b: 68,
    0x77c: 68,
    0x77d: 68,
    0x77e: 68,
    0x77f: 68,
    0x7ca: 68,
    0x7cb: 68,
    0x7cc: 68,
    0x7cd: 68,
    0x7ce: 68,
    0x7cf: 68,
    0x7d0: 68,
    0x7d1: 68,
    0x7d2: 68,
    0x7d3: 68,
    0x7d4: 68,
    0x7d5: 68,
    0x7d6: 68,
    0x7d7: 68,
    0x7d8: 68,
    0x7d9: 68,
    0x7da: 68,
    0x7db: 68,
    0x7dc: 68,
    0x7dd: 68,
    0x7de: 68,
    0x7df: 68,
    0x7e0: 68,
    0x7e1: 68,
    0x7e2: 68,
    0x7e3: 68,
    0x7e4: 68,
    0x7e5: 68,
    0x7e6: 68,
    0x7e7: 68,
    0x7e8: 68,
    0x7e9: 68,
    0x7ea: 68,
    0x7fa: 67,
    0x840: 82,
    0x841: 68,
    0x842: 68,
    0x843: 68,
    0x844: 68,
    0x845: 68,
    0x846: 82,
    0x847: 68,
    0x848: 68,
    0x849: 82,
    0x84a: 68,
    0x84b: 68,
    0x84c: 68,
    0x84d: 68,
    0x84e: 68,
    0x84f: 82,
    0x850: 68,
    0x851: 68,
    0x852: 68,
    0x853: 68,
    0x854: 82,
    0x855: 68,
    0x856: 85,
    0x857: 85,
    0x858: 85,
    0x8a0: 68,
    0x8a2: 68,
    0x8a3: 68,
    0x8a4: 68,
    0x8a5: 68,
    0x8a6: 68,
    0x8a7: 68,
    0x8a8: 68,
    0x8a9: 68,
    0x8aa: 82,
    0x8ab: 82,
    0x8ac: 82,
    0x1806: 85,
    0x1807: 68,
    0x180a: 67,
    0x180e: 85,
    0x1820: 68,
    0x1821: 68,
    0x1822: 68,
    0x1823: 68,
    0x1824: 68,
    0x1825: 68,
    0x1826: 68,
    0x1827: 68,
    0x1828: 68,
    0x1829: 68,
    0x182a: 68,
    0x182b: 68,
    0x182c: 68,
    0x182d: 68,
    0x182e: 68,
    0x182f: 68,
    0x1830: 68,
    0x1831: 68,
    0x1832: 68,
    0x1833: 68,
    0x1834: 68,
    0x1835: 68,
    0x1836: 68,
    0x1837: 68,
    0x1838: 68,
    0x1839: 68,
    0x183a: 68,
    0x183b: 68,
    0x183c: 68,
    0x183d: 68,
    0x183e: 68,
    0x183f: 68,
    0x1840: 68,
    0x1841: 68,
    0x1842: 68,
    0x1843: 68,
    0x1844: 68,
    0x1845: 68,
    0x1846: 68,
    0x1847: 68,
    0x1848: 68,
    0x1849: 68,
    0x184a: 68,
    0x184b: 68,
    0x184c: 68,
    0x184d: 68,
    0x184e: 68,
    0x184f: 68,
    0x1850: 68,
    0x1851: 68,
    0x1852: 68,
    0x1853: 68,
    0x1854: 68,
    0x1855: 68,
    0x1856: 68,
    0x1857: 68,
    0x1858: 68,
    0x1859: 68,
    0x185a: 68,
    0x185b: 68,
    0x185c: 68,
    0x185d: 68,
    0x185e: 68,
    0x185f: 68,
    0x1860: 68,
    0x1861: 68,
    0x1862: 68,
    0x1863: 68,
    0x1864: 68,
    0x1865: 68,
    0x1866: 68,
    0x1867: 68,
    0x1868: 68,
    0x1869: 68,
    0x186a: 68,
    0x186b: 68,
    0x186c: 68,
    0x186d: 68,
    0x186e: 68,
    0x186f: 68,
    0x1870: 68,
    0x1871: 68,
    0x1872: 68,
    0x1873: 68,
    0x1874: 68,
    0x1875: 68,
    0x1876: 68,
    0x1877: 68,
    0x1880: 85,
    0x1881: 85,
    0x1882: 85,
    0x1883: 85,
    0x1884: 85,
    0x1885: 85,
    0x1886: 85,
    0x1887: 68,
    0x1888: 68,
    0x1889: 68,
    0x188a: 68,
    0x188b: 68,
    0x188c: 68,
    0x188d: 68,
    0x188e: 68,
    0x188f: 68,
    0x1890: 68,
    0x1891: 68,
    0x1892: 68,
    0x1893: 68,
    0x1894: 68,
    0x1895: 68,
    0x1896: 68,
    0x1897: 68,
    0x1898: 68,
    0x1899: 68,
    0x189a: 68,
    0x189b: 68,
    0x189c: 68,
    0x189d: 68,
    0x189e: 68,
    0x189f: 68,
    0x18a0: 68,
    0x18a1: 68,
    0x18a2: 68,
    0x18a3: 68,
    0x18a4: 68,
    0x18a5: 68,
    0x18a6: 68,
    0x18a7: 68,
    0x18a8: 68,
    0x18aa: 68,
    0x200c: 85,
    0x200d: 67,
    0x2066: 85,
    0x2067: 85,
    0x2068: 85,
    0x2069: 85,
    0xa840: 68,
    0xa841: 68,
    0xa842: 68,
    0xa843: 68,
    0xa844: 68,
    0xa845: 68,
    0xa846: 68,
    0xa847: 68,
    0xa848: 68,
    0xa849: 68,
    0xa84a: 68,
    0xa84b: 68,
    0xa84c: 68,
    0xa84d: 68,
    0xa84e: 68,
    0xa84f: 68,
    0xa850: 68,
    0xa851: 68,
    0xa852: 68,
    0xa853: 68,
    0xa854: 68,
    0xa855: 68,
    0xa856: 68,
    0xa857: 68,
    0xa858: 68,
    0xa859: 68,
    0xa85a: 68,
    0xa85b: 68,
    0xa85c: 68,
    0xa85d: 68,
    0xa85e: 68,
    0xa85f: 68,
    0xa860: 68,
    0xa861: 68,
    0xa862: 68,
    0xa863: 68,
    0xa864: 68,
    0xa865: 68,
    0xa866: 68,
    0xa867: 68,
    0xa868: 68,
    0xa869: 68,
    0xa86a: 68,
    0xa86b: 68,
    0xa86c: 68,
    0xa86d: 68,
    0xa86e: 68,
    0xa86f: 68,
    0xa870: 68,
    0xa871: 68,
    0xa872: 76,
    0xa873: 85,
}
codepoint_classes = {
    'PVALID': (
        0x2d0000002e,
        0x300000003a,
        0x610000007b,
        0xdf000000f7,
        0xf800000100,
        0x10100000102,
        0x10300000104,
        0x10500000106,
        0x10700000108,
        0x1090000010a,
        0x10b0000010c,
        0x10d0000010e,
        0x10f00000110,
        0x11100000112,
        0x11300000114,
        0x11500000116,
        0x11700000118,
        0x1190000011a,
        0x11b0000011c,
        0x11d0000011e,
        0x11f00000120,
        0x12100000122,
        0x12300000124,
        0x12500000126,
        0x12700000128,
        0x1290000012a,
        0x12b0000012c,
        0x12d0000012e,
        0x12f00000130,
        0x13100000132,
        0x13500000136,
        0x13700000139,
        0x13a0000013b,
        0x13c0000013d,
        0x13e0000013f,
        0x14200000143,
        0x14400000145,
        0x14600000147,
        0x14800000149,
        0x14b0000014c,
        0x14d0000014e,
        0x14f00000150,
        0x15100000152,
        0x15300000154,
        0x15500000156,
        0x15700000158,
        0x1590000015a,
        0x15b0000015c,
        0x15d0000015e,
        0x15f00000160,
        0x16100000162,
        0x16300000164,
        0x16500000166,
        0x16700000168,
        0x1690000016a,
        0x16b0000016c,
        0x16d0000016e,
        0x16f00000170,
        0x17100000172,
        0x17300000174,
        0x17500000176,
        0x17700000178,
        0x17a0000017b,
        0x17c0000017d,
        0x17e0000017f,
        0x18000000181,
        0x18300000184,
        0x18500000186,
        0x18800000189,
        0x18c0000018e,
        0x19200000193,
        0x19500000196,
        0x1990000019c,
        0x19e0000019f,
        0x1a1000001a2,
        0x1a3000001a4,
        0x1a5000001a6,
        0x1a8000001a9,
        0x1aa000001ac,
        0x1ad000001ae,
        0x1b0000001b1,
        0x1b4000001b5,
        0x1b6000001b7,
        0x1b9000001bc,
        0x1bd000001c4,
        0x1ce000001cf,
        0x1d0000001d1,
        0x1d2000001d3,
        0x1d4000001d5,
        0x1d6000001d7,
        0x1d8000001d9,
        0x1da000001db,
        0x1dc000001de,
        0x1df000001e0,
        0x1e1000001e2,
        0x1e3000001e4,
        0x1e5000001e6,
        0x1e7000001e8,
        0x1e9000001ea,
        0x1eb000001ec,
        0x1ed000001ee,
        0x1ef000001f1,
        0x1f5000001f6,
        0x1f9000001fa,
        0x1fb000001fc,
        0x1fd000001fe,
        0x1ff00000200,
        0x20100000202,
        0x20300000204,
        0x20500000206,
        0x20700000208,
        0x2090000020a,
        0x20b0000020c,
        0x20d0000020e,
        0x20f00000210,
        0x21100000212,
        0x21300000214,
        0x21500000216,
        0x21700000218,
        0x2190000021a,
        0x21b0000021c,
        0x21d0000021e,
        0x21f00000220,
        0x22100000222,
        0x22300000224,
        0x22500000226,
        0x22700000228,
        0x2290000022a,
        0x22b0000022c,
        0x22d0000022e,
        0x22f00000230,
        0x23100000232,
        0x2330000023a,
        0x23c0000023d,
        0x23f00000241,
        0x24200000243,
        0x24700000248,
        0x2490000024a,
        0x24b0000024c,
        0x24d0000024e,
        0x24f000002b0,
        0x2b9000002c2,
        0x2c6000002d2,
        0x2ec000002ed,
        0x2ee000002ef,
        0x30000000340,
        0x34200000343,
        0x3460000034f,
        0x35000000370,
        0x37100000372,
        0x37300000374,
        0x37700000378,
        0x37b0000037e,
        0x39000000391,
        0x3ac000003cf,
        0x3d7000003d8,
        0x3d9000003da,
        0x3db000003dc,
        0x3dd000003de,
        0x3df000003e0,
        0x3e1000003e2,
        0x3e3000003e4,
        0x3e5000003e6,
        0x3e7000003e8,
        0x3e9000003ea,
        0x3eb000003ec,
        0x3ed000003ee,
        0x3ef000003f0,
        0x3f3000003f4,
        0x3f8000003f9,
        0x3fb000003fd,
        0x43000000460,
        0x46100000462,
        0x46300000464,
        0x46500000466,
        0x46700000468,
        0x4690000046a,
        0x46b0000046c,
        0x46d0000046e,
        0x46f00000470,
        0x47100000472,
        0x47300000474,
        0x47500000476,
        0x47700000478,
        0x4790000047a,
        0x47b0000047c,
        0x47d0000047e,
        0x47f00000480,
        0x48100000482,
        0x48300000488,
        0x48b0000048c,
        0x48d0000048e,
        0x48f00000490,
        0x49100000492,
        0x49300000494,
        0x49500000496,
        0x49700000498,
        0x4990000049a,
        0x49b0000049c,
        0x49d0000049e,
        0x49f000004a0,
        0x4a1000004a2,
        0x4a3000004a4,
        0x4a5000004a6,
        0x4a7000004a8,
        0x4a9000004aa,
        0x4ab000004ac,
        0x4ad000004ae,
        0x4af000004b0,
        0x4b1000004b2,
        0x4b3000004b4,
        0x4b5000004b6,
        0x4b7000004b8,
        0x4b9000004ba,
        0x4bb000004bc,
        0x4bd000004be,
        0x4bf000004c0,
        0x4c2000004c3,
        0x4c4000004c5,
        0x4c6000004c7,
        0x4c8000004c9,
        0x4ca000004cb,
        0x4cc000004cd,
        0x4ce000004d0,
        0x4d1000004d2,
        0x4d3000004d4,
        0x4d5000004d6,
        0x4d7000004d8,
        0x4d9000004da,
        0x4db000004dc,
        0x4dd000004de,
        0x4df000004e0,
        0x4e1000004e2,
        0x4e3000004e4,
        0x4e5000004e6,
        0x4e7000004e8,
        0x4e9000004ea,
        0x4eb000004ec,
        0x4ed000004ee,
        0x4ef000004f0,
        0x4f1000004f2,
        0x4f3000004f4,
        0x4f5000004f6,
        0x4f7000004f8,
        0x4f9000004fa,
        0x4fb000004fc,
        0x4fd000004fe,
        0x4ff00000500,
        0x50100000502,
        0x50300000504,
        0x50500000506,
        0x50700000508,
        0x5090000050a,
        0x50b0000050c,
        0x50d0000050e,
        0x50f00000510,
        0x51100000512,
        0x51300000514,
        0x51500000516,
        0x51700000518,
        0x5190000051a,
        0x51b0000051c,
        0x51d0000051e,
        0x51f00000520,
        0x52100000522,
        0x52300000524,
        0x52500000526,
        0x52700000528,
        0x5590000055a,
        0x56100000587,
        0x591000005be,
        0x5bf000005c0,
        0x5c1000005c3,
        0x5c4000005c6,
        0x5c7000005c8,
        0x5d0000005eb,
        0x5f0000005f3,
        0x6100000061b,
        0x62000000640,
        0x64100000660,
        0x66e00000675,
        0x679000006d4,
        0x6d5000006dd,
        0x6df000006e9,
        0x6ea000006f0,
        0x6fa00000700,
        0x7100000074b,
        0x74d000007b2,
        0x7c0000007f6,
        0x8000000082e,
        0x8400000085c,
        0x8a0000008a1,
        0x8a2000008ad,
        0x8e4000008ff,
        0x90000000958,
        0x96000000964,
        0x96600000970,
        0x97100000978,
        0x97900000980,
        0x98100000984,
        0x9850000098d,
        0x98f00000991,
        0x993000009a9,
        0x9aa000009b1,
        0x9b2000009b3,
        0x9b6000009ba,
        0x9bc000009c5,
        0x9c7000009c9,
        0x9cb000009cf,
        0x9d7000009d8,
        0x9e0000009e4,
        0x9e6000009f2,
        0xa0100000a04,
        0xa0500000a0b,
        0xa0f00000a11,
        0xa1300000a29,
        0xa2a00000a31,
        0xa3200000a33,
        0xa3500000a36,
        0xa3800000a3a,
        0xa3c00000a3d,
        0xa3e00000a43,
        0xa4700000a49,
        0xa4b00000a4e,
        0xa5100000a52,
        0xa5c00000a5d,
        0xa6600000a76,
        0xa8100000a84,
        0xa8500000a8e,
        0xa8f00000a92,
        0xa9300000aa9,
        0xaaa00000ab1,
        0xab200000ab4,
        0xab500000aba,
        0xabc00000ac6,
        0xac700000aca,
        0xacb00000ace,
        0xad000000ad1,
        0xae000000ae4,
        0xae600000af0,
        0xb0100000b04,
        0xb0500000b0d,
        0xb0f00000b11,
        0xb1300000b29,
        0xb2a00000b31,
        0xb3200000b34,
        0xb3500000b3a,
        0xb3c00000b45,
        0xb4700000b49,
        0xb4b00000b4e,
        0xb5600000b58,
        0xb5f00000b64,
        0xb6600000b70,
        0xb7100000b72,
        0xb8200000b84,
        0xb8500000b8b,
        0xb8e00000b91,
        0xb9200000b96,
        0xb9900000b9b,
        0xb9c00000b9d,
        0xb9e00000ba0,
        0xba300000ba5,
        0xba800000bab,
        0xbae00000bba,
        0xbbe00000bc3,
        0xbc600000bc9,
        0xbca00000bce,
        0xbd000000bd1,
        0xbd700000bd8,
        0xbe600000bf0,
        0xc0100000c04,
        0xc0500000c0d,
        0xc0e00000c11,
        0xc1200000c29,
        0xc2a00000c34,
        0xc3500000c3a,
        0xc3d00000c45,
        0xc4600000c49,
        0xc4a00000c4e,
        0xc5500000c57,
        0xc5800000c5a,
        0xc6000000c64,
        0xc6600000c70,
        0xc8200000c84,
        0xc8500000c8d,
        0xc8e00000c91,
        0xc9200000ca9,
        0xcaa00000cb4,
        0xcb500000cba,
        0xcbc00000cc5,
        0xcc600000cc9,
        0xcca00000cce,
        0xcd500000cd7,
        0xcde00000cdf,
        0xce000000ce4,
        0xce600000cf0,
        0xcf100000cf3,
        0xd0200000d04,
        0xd0500000d0d,
        0xd0e00000d11,
        0xd1200000d3b,
        0xd3d00000d45,
        0xd4600000d49,
        0xd4a00000d4f,
        0xd5700000d58,
        0xd6000000d64,
        0xd6600000d70,
        0xd7a00000d80,
        0xd8200000d84,
        0xd8500000d97,
        0xd9a00000db2,
        0xdb300000dbc,
        0xdbd00000dbe,
        0xdc000000dc7,
        0xdca00000dcb,
        0xdcf00000dd5,
        0xdd600000dd7,
        0xdd800000de0,
        0xdf200000df4,
        0xe0100000e33,
        0xe3400000e3b,
        0xe4000000e4f,
        0xe5000000e5a,
        0xe8100000e83,
        0xe8400000e85,
        0xe8700000e89,
        0xe8a00000e8b,
        0xe8d00000e8e,
        0xe9400000e98,
        0xe9900000ea0,
        0xea100000ea4,
        0xea500000ea6,
        0xea700000ea8,
        0xeaa00000eac,
        0xead00000eb3,
        0xeb400000eba,
        0xebb00000ebe,
        0xec000000ec5,
        0xec600000ec7,
        0xec800000ece,
        0xed000000eda,
        0xede00000ee0,
        0xf0000000f01,
        0xf0b00000f0c,
        0xf1800000f1a,
        0xf2000000f2a,
        0xf3500000f36,
        0xf3700000f38,
        0xf3900000f3a,
        0xf3e00000f43,
        0xf4400000f48,
        0xf4900000f4d,
        0xf4e00000f52,
        0xf5300000f57,
        0xf5800000f5c,
        0xf5d00000f69,
        0xf6a00000f6d,
        0xf7100000f73,
        0xf7400000f75,
        0xf7a00000f81,
        0xf8200000f85,
        0xf8600000f93,
        0xf9400000f98,
        0xf9900000f9d,
        0xf9e00000fa2,
        0xfa300000fa7,
        0xfa800000fac,
        0xfad00000fb9,
        0xfba00000fbd,
        0xfc600000fc7,
        0x10000000104a,
        0x10500000109e,
        0x10d0000010fb,
        0x10fd00001100,
        0x120000001249,
        0x124a0000124e,
        0x125000001257,
        0x125800001259,
        0x125a0000125e,
        0x126000001289,
        0x128a0000128e,
        0x1290000012b1,
        0x12b2000012b6,
        0x12b8000012bf,
        0x12c0000012c1,
        0x12c2000012c6,
        0x12c8000012d7,
        0x12d800001311,
        0x131200001316,
        0x13180000135b,
        0x135d00001360,
        0x138000001390,
        0x13a0000013f5,
        0x14010000166d,
        0x166f00001680,
        0x16810000169b,
        0x16a0000016eb,
        0x17000000170d,
        0x170e00001715,
        0x172000001735,
        0x174000001754,
        0x17600000176d,
        0x176e00001771,
        0x177200001774,
        0x1780000017b4,
        0x17b6000017d4,
        0x17d7000017d8,
        0x17dc000017de,
        0x17e0000017ea,
        0x18100000181a,
        0x182000001878,
        0x1880000018ab,
        0x18b0000018f6,
        0x19000000191d,
        0x19200000192c,
        0x19300000193c,
        0x19460000196e,
        0x197000001975,
        0x1980000019ac,
        0x19b0000019ca,
        0x19d0000019da,
        0x1a0000001a1c,
        0x1a2000001a5f,
        0x1a6000001a7d,
        0x1a7f00001a8a,
        0x1a9000001a9a,
        0x1aa700001aa8,
        0x1b0000001b4c,
        0x1b5000001b5a,
        0x1b6b00001b74,
        0x1b8000001bf4,
        0x1c0000001c38,
        0x1c4000001c4a,
        0x1c4d00001c7e,
        0x1cd000001cd3,
        0x1cd400001cf7,
        0x1d0000001d2c,
        0x1d2f00001d30,
        0x1d3b00001d3c,
        0x1d4e00001d4f,
        0x1d6b00001d78,
        0x1d7900001d9b,
        0x1dc000001de7,
        0x1dfc00001e00,
        0x1e0100001e02,
        0x1e0300001e04,
        0x1e0500001e06,
        0x1e0700001e08,
        0x1e0900001e0a,
        0x1e0b00001e0c,
        0x1e0d00001e0e,
        0x1e0f00001e10,
        0x1e1100001e12,
        0x1e1300001e14,
        0x1e1500001e16,
        0x1e1700001e18,
        0x1e1900001e1a,
        0x1e1b00001e1c,
        0x1e1d00001e1e,
        0x1e1f00001e20,
        0x1e2100001e22,
        0x1e2300001e24,
        0x1e2500001e26,
        0x1e2700001e28,
        0x1e2900001e2a,
        0x1e2b00001e2c,
        0x1e2d00001e2e,
        0x1e2f00001e30,
        0x1e3100001e32,
        0x1e3300001e34,
        0x1e3500001e36,
        0x1e3700001e38,
        0x1e3900001e3a,
        0x1e3b00001e3c,
        0x1e3d00001e3e,
        0x1e3f00001e40,
        0x1e4100001e42,
        0x1e4300001e44,
        0x1e4500001e46,
        0x1e4700001e48,
        0x1e4900001e4a,
        0x1e4b00001e4c,
        0x1e4d00001e4e,
        0x1e4f00001e50,
        0x1e5100001e52,
        0x1e5300001e54,
        0x1e5500001e56,
        0x1e5700001e58,
        0x1e5900001e5a,
        0x1e5b00001e5c,
        0x1e5d00001e5e,
        0x1e5f00001e60,
        0x1e6100001e62,
        0x1e6300001e64,
        0x1e6500001e66,
        0x1e6700001e68,
        0x1e6900001e6a,
        0x1e6b00001e6c,
        0x1e6d00001e6e,
        0x1e6f00001e70,
        0x1e7100001e72,
        0x1e7300001e74,
        0x1e7500001e76,
        0x1e7700001e78,
        0x1e7900001e7a,
        0x1e7b00001e7c,
        0x1e7d00001e7e,
        0x1e7f00001e80,
        0x1e8100001e82,
        0x1e8300001e84,
        0x1e8500001e86,
        0x1e8700001e88,
        0x1e8900001e8a,
        0x1e8b00001e8c,
        0x1e8d00001e8e,
        0x1e8f00001e90,
        0x1e9100001e92,
        0x1e9300001e94,
        0x1e9500001e9a,
        0x1e9c00001e9e,
        0x1e9f00001ea0,
        0x1ea100001ea2,
        0x1ea300001ea4,
        0x1ea500001ea6,
        0x1ea700001ea8,
        0x1ea900001eaa,
        0x1eab00001eac,
        0x1ead00001eae,
        0x1eaf00001eb0,
        0x1eb100001eb2,
        0x1eb300001eb4,
        0x1eb500001eb6,
        0x1eb700001eb8,
        0x1eb900001eba,
        0x1ebb00001ebc,
        0x1ebd00001ebe,
        0x1ebf00001ec0,
        0x1ec100001ec2,
        0x1ec300001ec4,
        0x1ec500001ec6,
        0x1ec700001ec8,
        0x1ec900001eca,
        0x1ecb00001ecc,
        0x1ecd00001ece,
        0x1ecf00001ed0,
        0x1ed100001ed2,
        0x1ed300001ed4,
        0x1ed500001ed6,
        0x1ed700001ed8,
        0x1ed900001eda,
        0x1edb00001edc,
        0x1edd00001ede,
        0x1edf00001ee0,
        0x1ee100001ee2,
        0x1ee300001ee4,
        0x1ee500001ee6,
        0x1ee700001ee8,
        0x1ee900001eea,
        0x1eeb00001eec,
        0x1eed00001eee,
        0x1eef00001ef0,
        0x1ef100001ef2,
        0x1ef300001ef4,
        0x1ef500001ef6,
        0x1ef700001ef8,
        0x1ef900001efa,
        0x1efb00001efc,
        0x1efd00001efe,
        0x1eff00001f08,
        0x1f1000001f16,
        0x1f2000001f28,
        0x1f3000001f38,
        0x1f4000001f46,
        0x1f5000001f58,
        0x1f6000001f68,
        0x1f7000001f71,
        0x1f7200001f73,
        0x1f7400001f75,
        0x1f7600001f77,
        0x1f7800001f79,
        0x1f7a00001f7b,
        0x1f7c00001f7d,
        0x1fb000001fb2,
        0x1fb600001fb7,
        0x1fc600001fc7,
        0x1fd000001fd3,
        0x1fd600001fd8,
        0x1fe000001fe3,
        0x1fe400001fe8,
        0x1ff600001ff7,
        0x214e0000214f,
        0x218400002185,
        0x2c3000002c5f,
        0x2c6100002c62,
        0x2c6500002c67,
        0x2c6800002c69,
        0x2c6a00002c6b,
        0x2c6c00002c6d,
        0x2c7100002c72,
        0x2c7300002c75,
        0x2c7600002c7c,
        0x2c8100002c82,
        0x2c8300002c84,
        0x2c8500002c86,
        0x2c8700002c88,
        0x2c8900002c8a,
        0x2c8b00002c8c,
        0x2c8d00002c8e,
        0x2c8f00002c90,
        0x2c9100002c92,
        0x2c9300002c94,
        0x2c9500002c96,
        0x2c9700002c98,
        0x2c9900002c9a,
        0x2c9b00002c9c,
        0x2c9d00002c9e,
        0x2c9f00002ca0,
        0x2ca100002ca2,
        0x2ca300002ca4,
        0x2ca500002ca6,
        0x2ca700002ca8,
        0x2ca900002caa,
        0x2cab00002cac,
        0x2cad00002cae,
        0x2caf00002cb0,
        0x2cb100002cb2,
        0x2cb300002cb4,
        0x2cb500002cb6,
        0x2cb700002cb8,
        0x2cb900002cba,
        0x2cbb00002cbc,
        0x2cbd00002cbe,
        0x2cbf00002cc0,
        0x2cc100002cc2,
        0x2cc300002cc4,
        0x2cc500002cc6,
        0x2cc700002cc8,
        0x2cc900002cca,
        0x2ccb00002ccc,
        0x2ccd00002cce,
        0x2ccf00002cd0,
        0x2cd100002cd2,
        0x2cd300002cd4,
        0x2cd500002cd6,
        0x2cd700002cd8,
        0x2cd900002cda,
        0x2cdb00002cdc,
        0x2cdd00002cde,
        0x2cdf00002ce0,
        0x2ce100002ce2,
        0x2ce300002ce5,
        0x2cec00002ced,
        0x2cee00002cf2,
        0x2cf300002cf4,
        0x2d0000002d26,
        0x2d2700002d28,
        0x2d2d00002d2e,
        0x2d3000002d68,
        0x2d7f00002d97,
        0x2da000002da7,
        0x2da800002daf,
        0x2db000002db7,
        0x2db800002dbf,
        0x2dc000002dc7,
        0x2dc800002dcf,
        0x2dd000002dd7,
        0x2dd800002ddf,
        0x2de000002e00,
        0x2e2f00002e30,
        0x300500003008,
        0x302a0000302e,
        0x303c0000303d,
        0x304100003097,
        0x30990000309b,
        0x309d0000309f,
        0x30a1000030fb,
        0x30fc000030ff,
        0x31050000312e,
        0x31a0000031bb,
        0x31f000003200,
        0x340000004db6,
        0x4e0000009fcd,
        0xa0000000a48d,
        0xa4d00000a4fe,
        0xa5000000a60d,
        0xa6100000a62c,
        0xa6410000a642,
        0xa6430000a644,
        0xa6450000a646,
        0xa6470000a648,
        0xa6490000a64a,
        0xa64b0000a64c,
        0xa64d0000a64e,
        0xa64f0000a650,
        0xa6510000a652,
        0xa6530000a654,
        0xa6550000a656,
        0xa6570000a658,
        0xa6590000a65a,
        0xa65b0000a65c,
        0xa65d0000a65e,
        0xa65f0000a660,
        0xa6610000a662,
        0xa6630000a664,
        0xa6650000a666,
        0xa6670000a668,
        0xa6690000a66a,
        0xa66b0000a66c,
        0xa66d0000a670,
        0xa6740000a67e,
        0xa67f0000a680,
        0xa6810000a682,
        0xa6830000a684,
        0xa6850000a686,
        0xa6870000a688,
        0xa6890000a68a,
        0xa68b0000a68c,
        0xa68d0000a68e,
        0xa68f0000a690,
        0xa6910000a692,
        0xa6930000a694,
        0xa6950000a696,
        0xa6970000a698,
        0xa69f0000a6e6,
        0xa6f00000a6f2,
        0xa7170000a720,
        0xa7230000a724,
        0xa7250000a726,
        0xa7270000a728,
        0xa7290000a72a,
        0xa72b0000a72c,
        0xa72d0000a72e,
        0xa72f0000a732,
        0xa7330000a734,
        0xa7350000a736,
        0xa7370000a738,
        0xa7390000a73a,
        0xa73b0000a73c,
        0xa73d0000a73e,
        0xa73f0000a740,
        0xa7410000a742,
        0xa7430000a744,
        0xa7450000a746,
        0xa7470000a748,
        0xa7490000a74a,
        0xa74b0000a74c,
        0xa74d0000a74e,
        0xa74f0000a750,
        0xa7510000a752,
        0xa7530000a754,
        0xa7550000a756,
        0xa7570000a758,
        0xa7590000a75a,
        0xa75b0000a75c,
        0xa75d0000a75e,
        0xa75f0000a760,
        0xa7610000a762,
        0xa7630000a764,
        0xa7650000a766,
        0xa7670000a768,
        0xa7690000a76a,
        0xa76b0000a76c,
        0xa76d0000a76e,
        0xa76f0000a770,
        0xa7710000a779,
        0xa77a0000a77b,
        0xa77c0000a77d,
        0xa77f0000a780,
        0xa7810000a782,
        0xa7830000a784,
        0xa7850000a786,
        0xa7870000a789,
        0xa78c0000a78d,
        0xa78e0000a78f,
        0xa7910000a792,
        0xa7930000a794,
        0xa7a10000a7a2,
        0xa7a30000a7a4,
        0xa7a50000a7a6,
        0xa7a70000a7a8,
        0xa7a90000a7aa,
        0xa7fa0000a828,
        0xa8400000a874,
        0xa8800000a8c5,
        0xa8d00000a8da,
        0xa8e00000a8f8,
        0xa8fb0000a8fc,
        0xa9000000a92e,
        0xa9300000a954,
        0xa9800000a9c1,
        0xa9cf0000a9da,
        0xaa000000aa37,
        0xaa400000aa4e,
        0xaa500000aa5a,
        0xaa600000aa77,
        0xaa7a0000aa7c,
        0xaa800000aac3,
        0xaadb0000aade,
        0xaae00000aaf0,
        0xaaf20000aaf7,
        0xab010000ab07,
        0xab090000ab0f,
        0xab110000ab17,
        0xab200000ab27,
        0xab280000ab2f,
        0xabc00000abeb,
        0xabec0000abee,
        0xabf00000abfa,
        0xac000000d7a4,
        0xfa0e0000fa10,
        0xfa110000fa12,
        0xfa130000fa15,
        0xfa1f0000fa20,
        0xfa210000fa22,
        0xfa230000fa25,
        0xfa270000fa2a,
        0xfb1e0000fb1f,
        0xfe200000fe27,
        0xfe730000fe74,
        0x100000001000c,
        0x1000d00010027,
        0x100280001003b,
        0x1003c0001003e,
        0x1003f0001004e,
        0x100500001005e,
        0x10080000100fb,
        0x101fd000101fe,
        0x102800001029d,
        0x102a0000102d1,
        0x103000001031f,
        0x1033000010341,
        0x103420001034a,
        0x103800001039e,
        0x103a0000103c4,
        0x103c8000103d0,
        0x104280001049e,
        0x104a0000104aa,
        0x1080000010806,
        0x1080800010809,
        0x1080a00010836,
        0x1083700010839,
        0x1083c0001083d,
        0x1083f00010856,
        0x1090000010916,
        0x109200001093a,
        0x10980000109b8,
        0x109be000109c0,
        0x10a0000010a04,
        0x10a0500010a07,
        0x10a0c00010a14,
        0x10a1500010a18,
        0x10a1900010a34,
        0x10a3800010a3b,
        0x10a3f00010a40,
        0x10a6000010a7d,
        0x10b0000010b36,
        0x10b4000010b56,
        0x10b6000010b73,
        0x10c0000010c49,
        0x1100000011047,
        0x1106600011070,
        0x11080000110bb,
        0x110d0000110e9,
        0x110f0000110fa,
        0x1110000011135,
        0x1113600011140,
        0x11180000111c5,
        0x111d0000111da,
        0x11680000116b8,
        0x116c0000116ca,
        0x120000001236f,
        0x130000001342f,
        0x1680000016a39,
        0x16f0000016f45,
        0x16f5000016f7f,
        0x16f8f00016fa0,
        0x1b0000001b002,
        0x200000002a6d7,
        0x2a7000002b735,
        0x2b7400002b81e,
    ),
    'CONTEXTJ': (
        0x200c0000200e,
    ),
    'CONTEXTO': (
        0xb7000000b8,
        0x37500000376,
        0x5f3000005f5,
        0x6600000066a,
        0x6f0000006fa,
        0x30fb000030fc,
    ),
}
site-packages/pip/_vendor/idna/intranges.py
"""
Given a list of integers, made up of (hopefully) a small number of long runs
of consecutive integers, compute a representation of the form
((start1, end1), (start2, end2) ...). Then answer the question "was x present
in the original list?" in time O(log(# runs)).
"""

import bisect

def intranges_from_list(list_):
    """Represent a list of integers as a sequence of ranges:
    ((start_0, end_0), (start_1, end_1), ...), such that the original
    integers are exactly those x such that start_i <= x < end_i for some i.

    Ranges are encoded as single integers (start << 32 | end), not as tuples.
    """

    sorted_list = sorted(list_)
    ranges = []
    last_write = -1
    for i in range(len(sorted_list)):
        if i+1 < len(sorted_list):
            if sorted_list[i] == sorted_list[i+1]-1:
                continue
        current_range = sorted_list[last_write+1:i+1]
        ranges.append(_encode_range(current_range[0], current_range[-1] + 1))
        last_write = i

    return tuple(ranges)

def _encode_range(start, end):
    return (start << 32) | end

def _decode_range(r):
    return (r >> 32), (r & ((1 << 32) - 1))


def intranges_contain(int_, ranges):
    """Determine if `int_` falls into one of the ranges in `ranges`."""
    tuple_ = _encode_range(int_, 0)
    pos = bisect.bisect_left(ranges, tuple_)
    # we could be immediately ahead of a packed range (start, end)
    # with start <= int_ < end (ranges are half-open)
    if pos > 0:
        left, right = _decode_range(ranges[pos-1])
        if left <= int_ < right:
            return True
    # or int_ could sit exactly on the start of the packed range at pos, i.e. (int_, end)
    if pos < len(ranges):
        left, _ = _decode_range(ranges[pos])
        if left == int_:
            return True
    return False
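
# --- Illustrative usage (editor's sketch, not part of the upstream idna
# --- package): the values below are chosen only for demonstration.
if __name__ == "__main__":
    # Three runs -- 1-3, 10-11 and 50 -- stored as (start << 32 | end) integers.
    demo_ranges = intranges_from_list([1, 2, 3, 10, 11, 50])
    assert intranges_contain(2, demo_ranges)        # inside the 1-3 run
    assert intranges_contain(50, demo_ranges)       # single-element run
    assert not intranges_contain(4, demo_ranges)    # gap between runs
    print("intranges demo passed")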
site-packages/pip/_vendor/idna/package_data.py000064400000000025147511334620015364 0ustar00__version__ = '2.6'

site-packages/pip/_vendor/idna/compat.py000064400000000350147511334620014264 0ustar00from .core import *
from .codec import *

def ToASCII(label):
    return encode(label)

def ToUnicode(label):
    return decode(label)

def nameprep(s):
    raise NotImplementedError("IDNA 2008 does not utilise nameprep protocol")

site-packages/pip/_vendor/idna/__init__.py000064400000000072147511334620014541 0ustar00from .package_data import __version__
from .core import *
site-packages/pip/_vendor/idna/codec.py000064400000006343147511334620014066 0ustar00from .core import encode, decode, alabel, ulabel, IDNAError
import codecs
import re

_unicode_dots_re = re.compile(u'[\u002e\u3002\uff0e\uff61]')

class Codec(codecs.Codec):

    def encode(self, data, errors='strict'):

        if errors != 'strict':
            raise IDNAError("Unsupported error handling \"{0}\"".format(errors))

        if not data:
            return "", 0

        return encode(data), len(data)

    def decode(self, data, errors='strict'):

        if errors != 'strict':
            raise IDNAError("Unsupported error handling \"{0}\"".format(errors))

        if not data:
            return u"", 0

        return decode(data), len(data)

class IncrementalEncoder(codecs.BufferedIncrementalEncoder):
    def _buffer_encode(self, data, errors, final):
        if errors != 'strict':
            raise IDNAError("Unsupported error handling \"{0}\"".format(errors))

        if not data:
            return ("", 0)

        labels = _unicode_dots_re.split(data)
        trailing_dot = u''
        if labels:
            if not labels[-1]:
                trailing_dot = '.'
                del labels[-1]
            elif not final:
                # Keep potentially unfinished label until the next call
                del labels[-1]
                if labels:
                    trailing_dot = '.'

        result = []
        size = 0
        for label in labels:
            result.append(alabel(label))
            if size:
                size += 1
            size += len(label)

        # Join with U+002E
        result = ".".join(result) + trailing_dot
        size += len(trailing_dot)
        return (result, size)

class IncrementalDecoder(codecs.BufferedIncrementalDecoder):
    def _buffer_decode(self, data, errors, final):
        if errors != 'strict':
            raise IDNAError("Unsupported error handling \"{0}\"".format(errors))

        if not data:
            return (u"", 0)

        # IDNA allows decoding to operate on Unicode strings, too.  The
        # upstream code used the Python 2 ``unicode`` builtin here, which does
        # not exist on Python 3; byte strings are accepted by decoding them as
        # ASCII first.
        if isinstance(data, bytes):
            data = data.decode("ascii")
        labels = _unicode_dots_re.split(data)

        trailing_dot = u''
        if labels:
            if not labels[-1]:
                trailing_dot = u'.'
                del labels[-1]
            elif not final:
                # Keep potentially unfinished label until the next call
                del labels[-1]
                if labels:
                    trailing_dot = u'.'

        result = []
        size = 0
        for label in labels:
            result.append(ulabel(label))
            if size:
                size += 1
            size += len(label)

        result = u".".join(result) + trailing_dot
        size += len(trailing_dot)
        return (result, size)


class StreamWriter(Codec, codecs.StreamWriter):
    pass

class StreamReader(Codec, codecs.StreamReader):
    pass

def getregentry():
    return codecs.CodecInfo(
        name='idna',
        encode=Codec().encode,
        decode=Codec().decode,
        incrementalencoder=IncrementalEncoder,
        incrementaldecoder=IncrementalDecoder,
        streamwriter=StreamWriter,
        streamreader=StreamReader,
    )
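
# --- Illustrative usage (editor's sketch, not part of the upstream idna
# --- package).  ``_codec_demo`` is a hypothetical helper added purely to show
# --- how the stateless Codec is used; the commented results are what the
# --- IDNA 2008 rules are expected to produce.
def _codec_demo():
    encoded, consumed = Codec().encode(u"b\u00fccher.example")
    # encoded is the ACE form, roughly b'xn--bcher-kva.example'; consumed == 14
    decoded, _ = Codec().decode(u"xn--bcher-kva.example")
    # decoded is the Unicode form again, roughly u'b\u00fccher.example'
    return encoded, consumed, decoded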
site-packages/pip/_vendor/__init__.py000064400000011076147511334620013634 0ustar00"""
pip._vendor is for vendoring dependencies of pip to prevent needing pip to
depend on something external.

Files inside of pip._vendor should be considered immutable and should only be
updated to versions from upstream.
"""
from __future__ import absolute_import

import glob
import os.path
import sys

# Downstream redistributors which have debundled our dependencies should also
# patch this value to be true. This will trigger the additional patching
# to cause things like "six" to be available as pip._vendor.six.
DEBUNDLED = False

# By default, look in this directory for a bunch of .whl files which we will
# add to the beginning of sys.path before attempting to import anything. This
# is done to support downstream re-distributors like Debian and Fedora who
# wish to create their own Wheels for our dependencies to aid in debundling.
WHEEL_DIR = os.path.abspath(os.path.dirname(__file__))


# Define a small helper function to alias our vendored modules to the real ones
# if the vendored ones do not exist. This idea of this was taken from
# https://github.com/kennethreitz/requests/pull/2567.
def vendored(modulename):
    vendored_name = "{0}.{1}".format(__name__, modulename)

    try:
        __import__(vendored_name, globals(), locals(), level=0)
    except ImportError:
        try:
            __import__(modulename, globals(), locals(), level=0)
        except ImportError:
            # We can just silently allow import failures to pass here. If we
            # got to this point it means that ``import pip._vendor.whatever``
            # failed and so did ``import whatever``. Since we're importing this
            # upfront in an attempt to alias imports, not erroring here will
            # just mean we get a regular import error whenever pip *actually*
            # tries to import one of these modules to use it, which actually
            # gives us a better error message than we would have otherwise
            # gotten.
            pass
        else:
            sys.modules[vendored_name] = sys.modules[modulename]
            base, head = vendored_name.rsplit(".", 1)
            setattr(sys.modules[base], head, sys.modules[modulename])


# If we're operating in a debundled setup, then we want to go ahead and trigger
# the aliasing of our vendored libraries as well as looking for wheels to add
# to our sys.path. This will typically cause all of this code to be a no-op;
# however, downstream redistributors can enable it in a consistent way across
# all platforms.
if DEBUNDLED:
    # Actually look inside of WHEEL_DIR to find .whl files and add them to the
    # front of our sys.path.
    sys.path[:] = glob.glob(os.path.join(WHEEL_DIR, "*.whl")) + sys.path

    # Actually alias all of our vendored dependencies.
    vendored("cachecontrol")
    vendored("colorama")
    vendored("distlib")
    vendored("distro")
    vendored("html5lib")
    vendored("lockfile")
    vendored("six")
    vendored("six.moves")
    vendored("six.moves.urllib")
    vendored("packaging")
    vendored("packaging.version")
    vendored("packaging.specifiers")
    vendored("pkg_resources")
    vendored("progress")
    vendored("retrying")
    vendored("requests")
    vendored("requests.packages")
    vendored("requests.packages.urllib3")
    vendored("requests.packages.urllib3._collections")
    vendored("requests.packages.urllib3.connection")
    vendored("requests.packages.urllib3.connectionpool")
    vendored("requests.packages.urllib3.contrib")
    vendored("requests.packages.urllib3.contrib.ntlmpool")
    vendored("requests.packages.urllib3.contrib.pyopenssl")
    vendored("requests.packages.urllib3.exceptions")
    vendored("requests.packages.urllib3.fields")
    vendored("requests.packages.urllib3.filepost")
    vendored("requests.packages.urllib3.packages")
    vendored("requests.packages.urllib3.packages.ordered_dict")
    vendored("requests.packages.urllib3.packages.six")
    vendored("requests.packages.urllib3.packages.ssl_match_hostname")
    vendored("requests.packages.urllib3.packages.ssl_match_hostname."
             "_implementation")
    vendored("requests.packages.urllib3.poolmanager")
    vendored("requests.packages.urllib3.request")
    vendored("requests.packages.urllib3.response")
    vendored("requests.packages.urllib3.util")
    vendored("requests.packages.urllib3.util.connection")
    vendored("requests.packages.urllib3.util.request")
    vendored("requests.packages.urllib3.util.response")
    vendored("requests.packages.urllib3.util.retry")
    vendored("requests.packages.urllib3.util.ssl_")
    vendored("requests.packages.urllib3.util.timeout")
    vendored("requests.packages.urllib3.util.url")
site-packages/pip/_vendor/pkg_resources/__init__.py000064400000311476147511334620016516 0ustar00# coding: utf-8
"""
Package resource API
--------------------

A resource is a logical file contained within a package, or a logical
subdirectory thereof.  The package resource API expects resource names
to have their path parts separated with ``/``, *not* whatever the local
path separator is.  Do not use os.path operations to manipulate resource
names being passed into the API.

The package resource API is designed to work with normal filesystem packages,
.egg files, and unpacked .egg files.  It can also work in a limited way with
.zip files and with custom PEP 302 loaders that support the ``get_data()``
method.
"""

from __future__ import absolute_import

import sys
import os
import io
import time
import re
import types
import zipfile
import zipimport
import warnings
import stat
import functools
import pkgutil
import operator
import platform
import collections
import plistlib
import email.parser
import tempfile
import textwrap
import itertools
from pkgutil import get_importer

try:
    import _imp
except ImportError:
    # Python 3.2 compatibility
    import imp as _imp

from pip._vendor import six
from pip._vendor.six.moves import urllib, map, filter

# capture these to bypass sandboxing
from os import utime
try:
    from os import mkdir, rename, unlink
    WRITE_SUPPORT = True
except ImportError:
    # no write support, probably under GAE
    WRITE_SUPPORT = False

from os import open as os_open
from os.path import isdir, split

try:
    import importlib.machinery as importlib_machinery
    # access attribute to force import under delayed import mechanisms.
    importlib_machinery.__name__
except ImportError:
    importlib_machinery = None

from pip._vendor import appdirs
from pip._vendor import packaging
__import__('pip._vendor.packaging.version')
__import__('pip._vendor.packaging.specifiers')
__import__('pip._vendor.packaging.requirements')
__import__('pip._vendor.packaging.markers')


if (3, 0) < sys.version_info < (3, 3):
    msg = (
        "Support for Python 3.0-3.2 has been dropped. Future versions "
        "will fail here."
    )
    warnings.warn(msg)

# declare some globals that will be defined later to
# satisfy the linters.
require = None
working_set = None


class PEP440Warning(RuntimeWarning):
    """
    Used when there is an issue with a version or specifier not complying with
    PEP 440.
    """


class _SetuptoolsVersionMixin(object):
    def __hash__(self):
        return super(_SetuptoolsVersionMixin, self).__hash__()

    def __lt__(self, other):
        if isinstance(other, tuple):
            return tuple(self) < other
        else:
            return super(_SetuptoolsVersionMixin, self).__lt__(other)

    def __le__(self, other):
        if isinstance(other, tuple):
            return tuple(self) <= other
        else:
            return super(_SetuptoolsVersionMixin, self).__le__(other)

    def __eq__(self, other):
        if isinstance(other, tuple):
            return tuple(self) == other
        else:
            return super(_SetuptoolsVersionMixin, self).__eq__(other)

    def __ge__(self, other):
        if isinstance(other, tuple):
            return tuple(self) >= other
        else:
            return super(_SetuptoolsVersionMixin, self).__ge__(other)

    def __gt__(self, other):
        if isinstance(other, tuple):
            return tuple(self) > other
        else:
            return super(_SetuptoolsVersionMixin, self).__gt__(other)

    def __ne__(self, other):
        if isinstance(other, tuple):
            return tuple(self) != other
        else:
            return super(_SetuptoolsVersionMixin, self).__ne__(other)

    def __getitem__(self, key):
        return tuple(self)[key]

    def __iter__(self):
        component_re = re.compile(r'(\d+ | [a-z]+ | \.| -)', re.VERBOSE)
        replace = {
            'pre': 'c',
            'preview': 'c',
            '-': 'final-',
            'rc': 'c',
            'dev': '@',
        }.get

        def _parse_version_parts(s):
            for part in component_re.split(s):
                part = replace(part, part)
                if not part or part == '.':
                    continue
                if part[:1] in '0123456789':
                    # pad for numeric comparison
                    yield part.zfill(8)
                else:
                    yield '*' + part

            # ensure that alpha/beta/candidate are before final
            yield '*final'

        def old_parse_version(s):
            parts = []
            for part in _parse_version_parts(s.lower()):
                if part.startswith('*'):
                    # remove '-' before a prerelease tag
                    if part < '*final':
                        while parts and parts[-1] == '*final-':
                            parts.pop()
                    # remove trailing zeros from each series of numeric parts
                    while parts and parts[-1] == '00000000':
                        parts.pop()
                parts.append(part)
            return tuple(parts)

        # Warn for use of this function
        warnings.warn(
            "You have iterated over the result of "
            "pkg_resources.parse_version. This is a legacy behavior which is "
            "inconsistent with the new version class introduced in setuptools "
            "8.0. In most cases, conversion to a tuple is unnecessary. For "
            "comparison of versions, sort the Version instances directly. If "
            "you have another use case requiring the tuple, please file a "
            "bug with the setuptools project describing that need.",
            RuntimeWarning,
            stacklevel=1,
        )

        for part in old_parse_version(str(self)):
            yield part


class SetuptoolsVersion(_SetuptoolsVersionMixin, packaging.version.Version):
    pass


class SetuptoolsLegacyVersion(_SetuptoolsVersionMixin,
                              packaging.version.LegacyVersion):
    pass


def parse_version(v):
    try:
        return SetuptoolsVersion(v)
    except packaging.version.InvalidVersion:
        return SetuptoolsLegacyVersion(v)
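

# --- Illustrative usage (editor's sketch, not part of upstream pkg_resources):
# --- parse_version() returns PEP 440 aware version objects and falls back to
# --- the legacy scheme for strings that PEP 440 cannot parse.
def _parse_version_demo():
    assert parse_version("1.9") < parse_version("1.10")    # numeric, not lexicographic
    assert parse_version("1.0rc1") < parse_version("1.0")  # pre-releases sort before releases
    assert isinstance(parse_version("not.a.pep440.version"),
                      SetuptoolsLegacyVersion)
    return True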


_state_vars = {}


def _declare_state(vartype, **kw):
    globals().update(kw)
    _state_vars.update(dict.fromkeys(kw, vartype))


def __getstate__():
    state = {}
    g = globals()
    for k, v in _state_vars.items():
        state[k] = g['_sget_' + v](g[k])
    return state


def __setstate__(state):
    g = globals()
    for k, v in state.items():
        g['_sset_' + _state_vars[k]](k, g[k], v)
    return state


def _sget_dict(val):
    return val.copy()


def _sset_dict(key, ob, state):
    ob.clear()
    ob.update(state)


def _sget_object(val):
    return val.__getstate__()


def _sset_object(key, ob, state):
    ob.__setstate__(state)


_sget_none = _sset_none = lambda *args: None


def get_supported_platform():
    """Return this platform's maximum compatible version.

    distutils.util.get_platform() normally reports the minimum version
    of Mac OS X that would be required to *use* extensions produced by
    distutils.  But what we want when checking compatibility is to know the
    version of Mac OS X that we are *running*.  To allow usage of packages that
    explicitly require a newer version of Mac OS X, we must also know the
    current version of the OS.

    If this condition occurs for any other platform with a version in its
    platform strings, this function should be extended accordingly.
    """
    plat = get_build_platform()
    m = macosVersionString.match(plat)
    if m is not None and sys.platform == "darwin":
        try:
            plat = 'macosx-%s-%s' % ('.'.join(_macosx_vers()[:2]), m.group(3))
        except ValueError:
            # not Mac OS X
            pass
    return plat


__all__ = [
    # Basic resource access and distribution/entry point discovery
    'require', 'run_script', 'get_provider', 'get_distribution',
    'load_entry_point', 'get_entry_map', 'get_entry_info',
    'iter_entry_points',
    'resource_string', 'resource_stream', 'resource_filename',
    'resource_listdir', 'resource_exists', 'resource_isdir',

    # Environmental control
    'declare_namespace', 'working_set', 'add_activation_listener',
    'find_distributions', 'set_extraction_path', 'cleanup_resources',
    'get_default_cache',

    # Primary implementation classes
    'Environment', 'WorkingSet', 'ResourceManager',
    'Distribution', 'Requirement', 'EntryPoint',

    # Exceptions
    'ResolutionError', 'VersionConflict', 'DistributionNotFound',
    'UnknownExtra', 'ExtractionError',

    # Warnings
    'PEP440Warning',

    # Parsing functions and string utilities
    'parse_requirements', 'parse_version', 'safe_name', 'safe_version',
    'get_platform', 'compatible_platforms', 'yield_lines', 'split_sections',
    'safe_extra', 'to_filename', 'invalid_marker', 'evaluate_marker',

    # filesystem utilities
    'ensure_directory', 'normalize_path',

    # Distribution "precedence" constants
    'EGG_DIST', 'BINARY_DIST', 'SOURCE_DIST', 'CHECKOUT_DIST', 'DEVELOP_DIST',

    # "Provider" interfaces, implementations, and registration/lookup APIs
    'IMetadataProvider', 'IResourceProvider', 'FileMetadata',
    'PathMetadata', 'EggMetadata', 'EmptyProvider', 'empty_provider',
    'NullProvider', 'EggProvider', 'DefaultProvider', 'ZipProvider',
    'register_finder', 'register_namespace_handler', 'register_loader_type',
    'fixup_namespace_packages', 'get_importer',

    # Deprecated/backward compatibility only
    'run_main', 'AvailableDistributions',
]


class ResolutionError(Exception):
    """Abstract base for dependency resolution errors"""

    def __repr__(self):
        return self.__class__.__name__ + repr(self.args)


class VersionConflict(ResolutionError):
    """
    An already-installed version conflicts with the requested version.

    Should be initialized with the installed Distribution and the requested
    Requirement.
    """

    _template = "{self.dist} is installed but {self.req} is required"

    @property
    def dist(self):
        return self.args[0]

    @property
    def req(self):
        return self.args[1]

    def report(self):
        return self._template.format(**locals())

    def with_context(self, required_by):
        """
        If required_by is non-empty, return a version of self that is a
        ContextualVersionConflict.
        """
        if not required_by:
            return self
        args = self.args + (required_by,)
        return ContextualVersionConflict(*args)


class ContextualVersionConflict(VersionConflict):
    """
    A VersionConflict that accepts a third parameter, the set of the
    requirements that required the installed Distribution.
    """

    _template = VersionConflict._template + ' by {self.required_by}'

    @property
    def required_by(self):
        return self.args[2]


class DistributionNotFound(ResolutionError):
    """A requested distribution was not found"""

    _template = ("The '{self.req}' distribution was not found "
                 "and is required by {self.requirers_str}")

    @property
    def req(self):
        return self.args[0]

    @property
    def requirers(self):
        return self.args[1]

    @property
    def requirers_str(self):
        if not self.requirers:
            return 'the application'
        return ', '.join(self.requirers)

    def report(self):
        return self._template.format(**locals())

    def __str__(self):
        return self.report()


class UnknownExtra(ResolutionError):
    """Distribution doesn't have an "extra feature" of the given name"""


_provider_factories = {}

PY_MAJOR = sys.version[:3]
EGG_DIST = 3
BINARY_DIST = 2
SOURCE_DIST = 1
CHECKOUT_DIST = 0
DEVELOP_DIST = -1


def register_loader_type(loader_type, provider_factory):
    """Register `provider_factory` to make providers for `loader_type`

    `loader_type` is the type or class of a PEP 302 ``module.__loader__``,
    and `provider_factory` is a function that, passed a *module* object,
    returns an ``IResourceProvider`` for that module.
    """
    _provider_factories[loader_type] = provider_factory


def get_provider(moduleOrReq):
    """Return an IResourceProvider for the named module or requirement"""
    if isinstance(moduleOrReq, Requirement):
        return working_set.find(moduleOrReq) or require(str(moduleOrReq))[0]
    try:
        module = sys.modules[moduleOrReq]
    except KeyError:
        __import__(moduleOrReq)
        module = sys.modules[moduleOrReq]
    loader = getattr(module, '__loader__', None)
    return _find_adapter(_provider_factories, loader)(module)


def _macosx_vers(_cache=[]):
    if not _cache:
        version = platform.mac_ver()[0]
        # fallback for MacPorts
        if version == '':
            plist = '/System/Library/CoreServices/SystemVersion.plist'
            if os.path.exists(plist):
                if hasattr(plistlib, 'readPlist'):
                    plist_content = plistlib.readPlist(plist)
                    if 'ProductVersion' in plist_content:
                        version = plist_content['ProductVersion']

        _cache.append(version.split('.'))
    return _cache[0]


def _macosx_arch(machine):
    return {'PowerPC': 'ppc', 'Power_Macintosh': 'ppc'}.get(machine, machine)


def get_build_platform():
    """Return this platform's string for platform-specific distributions

    XXX Currently this is the same as ``distutils.util.get_platform()``, but it
    needs some hacks for Linux and Mac OS X.
    """
    try:
        # Python 2.7 or >=3.2
        from sysconfig import get_platform
    except ImportError:
        from distutils.util import get_platform

    plat = get_platform()
    if sys.platform == "darwin" and not plat.startswith('macosx-'):
        try:
            version = _macosx_vers()
            machine = os.uname()[4].replace(" ", "_")
            return "macosx-%d.%d-%s" % (int(version[0]), int(version[1]),
                _macosx_arch(machine))
        except ValueError:
            # if someone is running a non-Mac darwin system, this will fall
            # through to the default implementation
            pass
    return plat


macosVersionString = re.compile(r"macosx-(\d+)\.(\d+)-(.*)")
darwinVersionString = re.compile(r"darwin-(\d+)\.(\d+)\.(\d+)-(.*)")
# XXX backward compat
get_platform = get_build_platform


def compatible_platforms(provided, required):
    """Can code for the `provided` platform run on the `required` platform?

    Returns true if either platform is ``None``, or the platforms are equal.

    XXX Needs compatibility checks for Linux and other unixy OSes.
    """
    if provided is None or required is None or provided == required:
        # easy case
        return True

    # Mac OS X special cases
    reqMac = macosVersionString.match(required)
    if reqMac:
        provMac = macosVersionString.match(provided)

        # is this a Mac package?
        if not provMac:
            # this is backwards compatibility for packages built before
            # setuptools 0.6. All packages built after this point will
            # use the new macosx designation.
            provDarwin = darwinVersionString.match(provided)
            if provDarwin:
                dversion = int(provDarwin.group(1))
                macosversion = "%s.%s" % (reqMac.group(1), reqMac.group(2))
                if dversion == 7 and macosversion >= "10.3" or \
                        dversion == 8 and macosversion >= "10.4":
                    return True
            # egg isn't macosx or legacy darwin
            return False

        # are they the same major version and machine type?
        if provMac.group(1) != reqMac.group(1) or \
                provMac.group(3) != reqMac.group(3):
            return False

        # is the required OS major update >= the provided one?
        if int(provMac.group(2)) > int(reqMac.group(2)):
            return False

        return True

    # XXX Linux and other platforms' special cases should go here
    return False
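

# --- Illustrative usage (editor's sketch, not part of upstream pkg_resources):
# --- the Mac OS X special-casing above means an egg built for an older OS
# --- release of the same major version is accepted on a newer one, but not
# --- the other way around.
def _compatible_platforms_demo():
    assert compatible_platforms(None, "macosx-10.12-x86_64")
    assert compatible_platforms("macosx-10.9-x86_64", "macosx-10.12-x86_64")
    assert not compatible_platforms("macosx-10.12-x86_64", "macosx-10.9-x86_64")
    return True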


def run_script(dist_spec, script_name):
    """Locate distribution `dist_spec` and run its `script_name` script"""
    ns = sys._getframe(1).f_globals
    name = ns['__name__']
    ns.clear()
    ns['__name__'] = name
    require(dist_spec)[0].run_script(script_name, ns)


# backward compatibility
run_main = run_script


def get_distribution(dist):
    """Return a current distribution object for a Requirement or string"""
    if isinstance(dist, six.string_types):
        dist = Requirement.parse(dist)
    if isinstance(dist, Requirement):
        dist = get_provider(dist)
    if not isinstance(dist, Distribution):
        raise TypeError("Expected string, Requirement, or Distribution", dist)
    return dist


def load_entry_point(dist, group, name):
    """Return `name` entry point of `group` for `dist` or raise ImportError"""
    return get_distribution(dist).load_entry_point(group, name)


def get_entry_map(dist, group=None):
    """Return the entry point map for `group`, or the full entry map"""
    return get_distribution(dist).get_entry_map(group)


def get_entry_info(dist, group, name):
    """Return the EntryPoint object for `group`+`name`, or ``None``"""
    return get_distribution(dist).get_entry_info(group, name)


class IMetadataProvider:
    def has_metadata(name):
        """Does the package's distribution contain the named metadata?"""

    def get_metadata(name):
        """The named metadata resource as a string"""

    def get_metadata_lines(name):
        """Yield named metadata resource as list of non-blank non-comment lines

        Leading and trailing whitespace is stripped from each line, and lines
        with ``#`` as the first non-blank character are omitted."""

    def metadata_isdir(name):
        """Is the named metadata a directory?  (like ``os.path.isdir()``)"""

    def metadata_listdir(name):
        """List of metadata names in the directory (like ``os.listdir()``)"""

    def run_script(script_name, namespace):
        """Execute the named script in the supplied namespace dictionary"""


class IResourceProvider(IMetadataProvider):
    """An object that provides access to package resources"""

    def get_resource_filename(manager, resource_name):
        """Return a true filesystem path for `resource_name`

        `manager` must be an ``IResourceManager``"""

    def get_resource_stream(manager, resource_name):
        """Return a readable file-like object for `resource_name`

        `manager` must be an ``IResourceManager``"""

    def get_resource_string(manager, resource_name):
        """Return a string containing the contents of `resource_name`

        `manager` must be an ``IResourceManager``"""

    def has_resource(resource_name):
        """Does the package contain the named resource?"""

    def resource_isdir(resource_name):
        """Is the named resource a directory?  (like ``os.path.isdir()``)"""

    def resource_listdir(resource_name):
        """List of resource names in the directory (like ``os.listdir()``)"""


class WorkingSet(object):
    """A collection of active distributions on sys.path (or a similar list)"""

    def __init__(self, entries=None):
        """Create working set from list of path entries (default=sys.path)"""
        self.entries = []
        self.entry_keys = {}
        self.by_key = {}
        self.callbacks = []

        if entries is None:
            entries = sys.path

        for entry in entries:
            self.add_entry(entry)

    @classmethod
    def _build_master(cls):
        """
        Prepare the master working set.
        """
        ws = cls()
        try:
            from __main__ import __requires__
        except ImportError:
            # The main program does not list any requirements
            return ws

        # ensure the requirements are met
        try:
            ws.require(__requires__)
        except VersionConflict:
            return cls._build_from_requirements(__requires__)

        return ws

    @classmethod
    def _build_from_requirements(cls, req_spec):
        """
        Build a working set from a requirement spec. Rewrites sys.path.
        """
        # try it without defaults already on sys.path
        # by starting with an empty path
        ws = cls([])
        reqs = parse_requirements(req_spec)
        dists = ws.resolve(reqs, Environment())
        for dist in dists:
            ws.add(dist)

        # add any missing entries from sys.path
        for entry in sys.path:
            if entry not in ws.entries:
                ws.add_entry(entry)

        # then copy back to sys.path
        sys.path[:] = ws.entries
        return ws

    def add_entry(self, entry):
        """Add a path item to ``.entries``, finding any distributions on it

        ``find_distributions(entry, True)`` is used to find distributions
        corresponding to the path entry, and they are added.  `entry` is
        always appended to ``.entries``, even if it is already present.
        (This is because ``sys.path`` can contain the same value more than
        once, and the ``.entries`` of the ``sys.path`` WorkingSet should always
        equal ``sys.path``.)
        """
        self.entry_keys.setdefault(entry, [])
        self.entries.append(entry)
        for dist in find_distributions(entry, True):
            self.add(dist, entry, False)

    def __contains__(self, dist):
        """True if `dist` is the active distribution for its project"""
        return self.by_key.get(dist.key) == dist

    def find(self, req):
        """Find a distribution matching requirement `req`

        If there is an active distribution for the requested project, this
        returns it as long as it meets the version requirement specified by
        `req`.  But, if there is an active distribution for the project and it
        does *not* meet the `req` requirement, ``VersionConflict`` is raised.
        If there is no active distribution for the requested project, ``None``
        is returned.
        """
        dist = self.by_key.get(req.key)
        if dist is not None and dist not in req:
            # XXX add more info
            raise VersionConflict(dist, req)
        return dist

    def iter_entry_points(self, group, name=None):
        """Yield entry point objects from `group` matching `name`

        If `name` is None, yields all entry points in `group` from all
        distributions in the working set, otherwise only ones matching
        both `group` and `name` are yielded (in distribution order).
        """
        for dist in self:
            entries = dist.get_entry_map(group)
            if name is None:
                for ep in entries.values():
                    yield ep
            elif name in entries:
                yield entries[name]

    def run_script(self, requires, script_name):
        """Locate distribution for `requires` and run `script_name` script"""
        ns = sys._getframe(1).f_globals
        name = ns['__name__']
        ns.clear()
        ns['__name__'] = name
        self.require(requires)[0].run_script(script_name, ns)

    def __iter__(self):
        """Yield distributions for non-duplicate projects in the working set

        The yield order is the order in which the items' path entries were
        added to the working set.
        """
        seen = {}
        for item in self.entries:
            if item not in self.entry_keys:
                # workaround a cache issue
                continue

            for key in self.entry_keys[item]:
                if key not in seen:
                    seen[key] = 1
                    yield self.by_key[key]

    def add(self, dist, entry=None, insert=True, replace=False):
        """Add `dist` to working set, associated with `entry`

        If `entry` is unspecified, it defaults to the ``.location`` of `dist`.
        On exit from this routine, `entry` is added to the end of the working
        set's ``.entries`` (if it wasn't already present).

        `dist` is only added to the working set if it's for a project that
        doesn't already have a distribution in the set, unless `replace=True`.
        If it's added, any callbacks registered with the ``subscribe()`` method
        will be called.
        """
        if insert:
            dist.insert_on(self.entries, entry, replace=replace)

        if entry is None:
            entry = dist.location
        keys = self.entry_keys.setdefault(entry, [])
        keys2 = self.entry_keys.setdefault(dist.location, [])
        if not replace and dist.key in self.by_key:
            # ignore hidden distros
            return

        self.by_key[dist.key] = dist
        if dist.key not in keys:
            keys.append(dist.key)
        if dist.key not in keys2:
            keys2.append(dist.key)
        self._added_new(dist)

    def resolve(self, requirements, env=None, installer=None,
            replace_conflicting=False):
        """List all distributions needed to (recursively) meet `requirements`

        `requirements` must be a sequence of ``Requirement`` objects.  `env`,
        if supplied, should be an ``Environment`` instance.  If
        not supplied, it defaults to all distributions available within any
        entry or distribution in the working set.  `installer`, if supplied,
        will be invoked with each requirement that cannot be met by an
        already-installed distribution; it should return a ``Distribution`` or
        ``None``.

        Unless `replace_conflicting=True`, raises a VersionConflict exception if
        any requirements are found on the path that have the correct name but
        the wrong version.  Otherwise, if an `installer` is supplied it will be
        invoked to obtain the correct version of the requirement and activate
        it.
        """

        # set up the stack
        requirements = list(requirements)[::-1]
        # set of processed requirements
        processed = {}
        # key -> dist
        best = {}
        to_activate = []

        req_extras = _ReqExtras()

        # Mapping of requirement to set of distributions that required it;
        # useful for reporting info about conflicts.
        required_by = collections.defaultdict(set)

        while requirements:
            # process dependencies breadth-first
            req = requirements.pop(0)
            if req in processed:
                # Ignore cyclic or redundant dependencies
                continue

            if not req_extras.markers_pass(req):
                continue

            dist = best.get(req.key)
            if dist is None:
                # Find the best distribution and add it to the map
                dist = self.by_key.get(req.key)
                if dist is None or (dist not in req and replace_conflicting):
                    ws = self
                    if env is None:
                        if dist is None:
                            env = Environment(self.entries)
                        else:
                            # Use an empty environment and workingset to avoid
                            # any further conflicts with the conflicting
                            # distribution
                            env = Environment([])
                            ws = WorkingSet([])
                    dist = best[req.key] = env.best_match(req, ws, installer)
                    if dist is None:
                        requirers = required_by.get(req, None)
                        raise DistributionNotFound(req, requirers)
                to_activate.append(dist)
            if dist not in req:
                # Oops, the "best" so far conflicts with a dependency
                dependent_req = required_by[req]
                raise VersionConflict(dist, req).with_context(dependent_req)

            # push the new requirements onto the stack
            new_requirements = dist.requires(req.extras)[::-1]
            requirements.extend(new_requirements)

            # Register the new requirements needed by req
            for new_requirement in new_requirements:
                required_by[new_requirement].add(req.project_name)
                req_extras[new_requirement] = req.extras

            processed[req] = True

        # return list of distros to activate
        return to_activate

    def find_plugins(self, plugin_env, full_env=None, installer=None,
            fallback=True):
        """Find all activatable distributions in `plugin_env`

        Example usage::

            distributions, errors = working_set.find_plugins(
                Environment(plugin_dirlist)
            )
            # add plugins+libs to sys.path
            map(working_set.add, distributions)
            # display errors
            print('Could not load', errors)

        The `plugin_env` should be an ``Environment`` instance that contains
        only distributions that are in the project's "plugin directory" or
        directories. The `full_env`, if supplied, should be an ``Environment``
        that contains all currently-available distributions.  If `full_env` is not
        supplied, one is created automatically from the ``WorkingSet`` this
        method is called on, which will typically mean that every directory on
        ``sys.path`` will be scanned for distributions.

        `installer` is a standard installer callback as used by the
        ``resolve()`` method. The `fallback` flag indicates whether we should
        attempt to resolve older versions of a plugin if the newest version
        cannot be resolved.

        This method returns a 2-tuple: (`distributions`, `error_info`), where
        `distributions` is a list of the distributions found in `plugin_env`
        that were loadable, along with any other distributions that are needed
        to resolve their dependencies.  `error_info` is a dictionary mapping
        unloadable plugin distributions to an exception instance describing the
        error that occurred. Usually this will be a ``DistributionNotFound`` or
        ``VersionConflict`` instance.
        """

        plugin_projects = list(plugin_env)
        # scan project names in alphabetic order
        plugin_projects.sort()

        error_info = {}
        distributions = {}

        if full_env is None:
            env = Environment(self.entries)
            env += plugin_env
        else:
            env = full_env + plugin_env

        shadow_set = self.__class__([])
        # put all our entries in shadow_set
        list(map(shadow_set.add, self))

        for project_name in plugin_projects:

            for dist in plugin_env[project_name]:

                req = [dist.as_requirement()]

                try:
                    resolvees = shadow_set.resolve(req, env, installer)

                except ResolutionError as v:
                    # save error info
                    error_info[dist] = v
                    if fallback:
                        # try the next older version of project
                        continue
                    else:
                        # give up on this project, keep going
                        break

                else:
                    list(map(shadow_set.add, resolvees))
                    distributions.update(dict.fromkeys(resolvees))

                    # success, no need to try any more versions of this project
                    break

        distributions = list(distributions)
        distributions.sort()

        return distributions, error_info

    def require(self, *requirements):
        """Ensure that distributions matching `requirements` are activated

        `requirements` must be a string or a (possibly-nested) sequence
        thereof, specifying the distributions and versions required.  The
        return value is a sequence of the distributions that needed to be
        activated to fulfill the requirements; all relevant distributions are
        included, even if they were already activated in this working set.
        """
        needed = self.resolve(parse_requirements(requirements))

        for dist in needed:
            self.add(dist)

        return needed

    def subscribe(self, callback, existing=True):
        """Invoke `callback` for all distributions

        If `existing=True` (default),
        call on all existing ones, as well.
        """
        if callback in self.callbacks:
            return
        self.callbacks.append(callback)
        if not existing:
            return
        for dist in self:
            callback(dist)

    def _added_new(self, dist):
        for callback in self.callbacks:
            callback(dist)

    def __getstate__(self):
        return (
            self.entries[:], self.entry_keys.copy(), self.by_key.copy(),
            self.callbacks[:]
        )

    def __setstate__(self, e_k_b_c):
        entries, keys, by_key, callbacks = e_k_b_c
        self.entries = entries[:]
        self.entry_keys = keys.copy()
        self.by_key = by_key.copy()
        self.callbacks = callbacks[:]
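

# --- Illustrative usage (editor's sketch, not part of upstream pkg_resources):
# --- a WorkingSet built from sys.path can answer "which distributions satisfy
# --- this requirement?" and enumerate entry points.  The requirement string
# --- and entry point group below are examples only.
def _working_set_demo():
    ws = WorkingSet()                                # snapshot of sys.path
    needed = ws.resolve(parse_requirements("pip"))   # distributions needed for "pip"
    scripts = [ep.name for ep in ws.iter_entry_points("console_scripts")]
    return needed, scripts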


class _ReqExtras(dict):
    """
    Map each requirement to the extras that demanded it.
    """

    def markers_pass(self, req):
        """
        Evaluate markers for req against each extra that
        demanded it.

        Return False if the req has a marker and fails
        evaluation. Otherwise, return True.
        """
        extra_evals = (
            req.marker.evaluate({'extra': extra})
            for extra in self.get(req, ()) + (None,)
        )
        return not req.marker or any(extra_evals)


class Environment(object):
    """Searchable snapshot of distributions on a search path"""

    def __init__(self, search_path=None, platform=get_supported_platform(),
            python=PY_MAJOR):
        """Snapshot distributions available on a search path

        Any distributions found on `search_path` are added to the environment.
        `search_path` should be a sequence of ``sys.path`` items.  If not
        supplied, ``sys.path`` is used.

        `platform` is an optional string specifying the name of the platform
        that platform-specific distributions must be compatible with.  If
        unspecified, it defaults to the current platform.  `python` is an
        optional string naming the desired version of Python (e.g. ``'3.3'``);
        it defaults to the current version.

        You may explicitly set `platform` (and/or `python`) to ``None`` if you
        wish to map *all* distributions, not just those compatible with the
        running platform or Python version.
        """
        self._distmap = {}
        self.platform = platform
        self.python = python
        self.scan(search_path)

    def can_add(self, dist):
        """Is distribution `dist` acceptable for this environment?

        The distribution must match the platform and python version
        requirements specified when this environment was created, or False
        is returned.
        """
        return (self.python is None or dist.py_version is None
            or dist.py_version == self.python) \
            and compatible_platforms(dist.platform, self.platform)

    def remove(self, dist):
        """Remove `dist` from the environment"""
        self._distmap[dist.key].remove(dist)

    def scan(self, search_path=None):
        """Scan `search_path` for distributions usable in this environment

        Any distributions found are added to the environment.
        `search_path` should be a sequence of ``sys.path`` items.  If not
        supplied, ``sys.path`` is used.  Only distributions conforming to
        the platform/python version defined at initialization are added.
        """
        if search_path is None:
            search_path = sys.path

        for item in search_path:
            for dist in find_distributions(item):
                self.add(dist)

    def __getitem__(self, project_name):
        """Return a newest-to-oldest list of distributions for `project_name`

        Uses case-insensitive `project_name` comparison, assuming all the
        project's distributions use their project's name converted to all
        lowercase as their key.

        """
        distribution_key = project_name.lower()
        return self._distmap.get(distribution_key, [])

    def add(self, dist):
        """Add `dist` if we ``can_add()`` it and it has not already been added
        """
        if self.can_add(dist) and dist.has_version():
            dists = self._distmap.setdefault(dist.key, [])
            if dist not in dists:
                dists.append(dist)
                dists.sort(key=operator.attrgetter('hashcmp'), reverse=True)

    def best_match(self, req, working_set, installer=None):
        """Find distribution best matching `req` and usable on `working_set`

        This calls the ``find(req)`` method of the `working_set` to see if a
        suitable distribution is already active.  (This may raise
        ``VersionConflict`` if an unsuitable version of the project is already
        active in the specified `working_set`.)  If a suitable distribution
        isn't active, this method returns the newest distribution in the
        environment that meets the ``Requirement`` in `req`.  If no suitable
        distribution is found, and `installer` is supplied, then the result of
        calling the environment's ``obtain(req, installer)`` method will be
        returned.
        """
        dist = working_set.find(req)
        if dist is not None:
            return dist
        for dist in self[req.key]:
            if dist in req:
                return dist
        # try to download/install
        return self.obtain(req, installer)

    def obtain(self, requirement, installer=None):
        """Obtain a distribution matching `requirement` (e.g. via download)

        Obtain a distro that matches requirement (e.g. via download).  In the
        base ``Environment`` class, this routine just returns
        ``installer(requirement)``, unless `installer` is None, in which case
        None is returned instead.  This method is a hook that allows subclasses
        to attempt other ways of obtaining a distribution before falling back
        to the `installer` argument."""
        if installer is not None:
            return installer(requirement)

    def __iter__(self):
        """Yield the unique project names of the available distributions"""
        for key in self._distmap.keys():
            if self[key]:
                yield key

    def __iadd__(self, other):
        """In-place addition of a distribution or environment"""
        if isinstance(other, Distribution):
            self.add(other)
        elif isinstance(other, Environment):
            for project in other:
                for dist in other[project]:
                    self.add(dist)
        else:
            raise TypeError("Can't add %r to environment" % (other,))
        return self

    def __add__(self, other):
        """Add an environment or distribution to an environment"""
        new = self.__class__([], platform=None, python=None)
        for env in self, other:
            new += env
        return new
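

# --- Illustrative usage (editor's sketch, not part of upstream pkg_resources):
# --- an Environment is a passive index of distributions found on a search
# --- path; best_match() consults an active WorkingSet first.  The project
# --- name below is an example only.
def _environment_demo():
    env = Environment()              # scans sys.path by default
    available = env["setuptools"]    # newest-to-oldest list (may be empty)
    projects = sorted(env)           # unique project keys in the snapshot
    return available, projects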


# XXX backward compatibility
AvailableDistributions = Environment


class ExtractionError(RuntimeError):
    """An error occurred extracting a resource

    The following attributes are available from instances of this exception:

    manager
        The resource manager that raised this exception

    cache_path
        The base directory for resource extraction

    original_error
        The exception instance that caused extraction to fail
    """


class ResourceManager:
    """Manage resource extraction and packages"""
    extraction_path = None

    def __init__(self):
        self.cached_files = {}

    def resource_exists(self, package_or_requirement, resource_name):
        """Does the named resource exist?"""
        return get_provider(package_or_requirement).has_resource(resource_name)

    def resource_isdir(self, package_or_requirement, resource_name):
        """Is the named resource an existing directory?"""
        return get_provider(package_or_requirement).resource_isdir(
            resource_name
        )

    def resource_filename(self, package_or_requirement, resource_name):
        """Return a true filesystem path for specified resource"""
        return get_provider(package_or_requirement).get_resource_filename(
            self, resource_name
        )

    def resource_stream(self, package_or_requirement, resource_name):
        """Return a readable file-like object for specified resource"""
        return get_provider(package_or_requirement).get_resource_stream(
            self, resource_name
        )

    def resource_string(self, package_or_requirement, resource_name):
        """Return specified resource as a string"""
        return get_provider(package_or_requirement).get_resource_string(
            self, resource_name
        )

    def resource_listdir(self, package_or_requirement, resource_name):
        """List the contents of the named resource directory"""
        return get_provider(package_or_requirement).resource_listdir(
            resource_name
        )

    def extraction_error(self):
        """Give an error message for problems extracting file(s)"""

        old_exc = sys.exc_info()[1]
        cache_path = self.extraction_path or get_default_cache()

        tmpl = textwrap.dedent("""
            Can't extract file(s) to egg cache

            The following error occurred while trying to extract file(s) to the Python egg
            cache:

              {old_exc}

            The Python egg cache directory is currently set to:

              {cache_path}

            Perhaps your account does not have write access to this directory?  You can
            change the cache directory by setting the PYTHON_EGG_CACHE environment
            variable to point to an accessible directory.
            """).lstrip()
        err = ExtractionError(tmpl.format(**locals()))
        err.manager = self
        err.cache_path = cache_path
        err.original_error = old_exc
        raise err

    def get_cache_path(self, archive_name, names=()):
        """Return absolute location in cache for `archive_name` and `names`

        The parent directory of the resulting path will be created if it does
        not already exist.  `archive_name` should be the base filename of the
        enclosing egg (which may not be the name of the enclosing zipfile!),
        including its ".egg" extension.  `names`, if provided, should be a
        sequence of path name parts "under" the egg's extraction location.

        This method should only be called by resource providers that need to
        obtain an extraction location, and only for names they intend to
        extract, as it tracks the generated names for possible cleanup later.
        """
        extract_path = self.extraction_path or get_default_cache()
        target_path = os.path.join(extract_path, archive_name + '-tmp', *names)
        try:
            _bypass_ensure_directory(target_path)
        except:
            self.extraction_error()

        self._warn_unsafe_extraction_path(extract_path)

        self.cached_files[target_path] = 1
        return target_path

    @staticmethod
    def _warn_unsafe_extraction_path(path):
        """
        If the default extraction path is overridden and set to an insecure
        location, such as /tmp, it opens up an opportunity for an attacker to
        replace an extracted file with an unauthorized payload. Warn the user
        if a known insecure location is used.

        See Distribute #375 for more details.
        """
        if os.name == 'nt' and not path.startswith(os.environ['windir']):
            # On Windows, permissions are generally restrictive by default
            #  and temp directories are not writable by other users, so
            #  bypass the warning.
            return
        mode = os.stat(path).st_mode
        if mode & stat.S_IWOTH or mode & stat.S_IWGRP:
            msg = ("%s is writable by group/others and vulnerable to attack "
                "when "
                "used with get_resource_filename. Consider a more secure "
                "location (set with .set_extraction_path or the "
                "PYTHON_EGG_CACHE environment variable)." % path)
            warnings.warn(msg, UserWarning)

    def postprocess(self, tempname, filename):
        """Perform any platform-specific postprocessing of `tempname`

        This is where Mac header rewrites should be done; other platforms don't
        have anything special they should do.

        Resource providers should call this method ONLY after successfully
        extracting a compressed resource.  They must NOT call it on resources
        that are already in the filesystem.

        `tempname` is the current (temporary) name of the file, and `filename`
        is the name it will be renamed to by the caller after this routine
        returns.
        """

        if os.name == 'posix':
            # Make the resource executable
            mode = ((os.stat(tempname).st_mode) | 0o555) & 0o7777
            os.chmod(tempname, mode)

    def set_extraction_path(self, path):
        """Set the base path where resources will be extracted to, if needed.

        If you do not call this routine before any extractions take place, the
        path defaults to the return value of ``get_default_cache()``.  (Which
        is based on the ``PYTHON_EGG_CACHE`` environment variable, with various
        platform-specific fallbacks.  See that routine's documentation for more
        details.)

        Resources are extracted to subdirectories of this path based upon
        information given by the ``IResourceProvider``.  You may set this to a
        temporary directory, but then you must call ``cleanup_resources()`` to
        delete the extracted files when done.  There is no guarantee that
        ``cleanup_resources()`` will be able to remove all extracted files.

        (Note: you may not change the extraction path for a given resource
        manager once resources have been extracted, unless you first call
        ``cleanup_resources()``.)
        """
        if self.cached_files:
            raise ValueError(
                "Can't change extraction path, files already extracted"
            )

        self.extraction_path = path

    def cleanup_resources(self, force=False):
        """
        Delete all extracted resource files and directories, returning a list
        of the file and directory names that could not be successfully removed.
        This function does not have any concurrency protection, so it should
        generally only be called when the extraction path is a temporary
        directory exclusive to a single process.  This method is not
        automatically called; you must call it explicitly or register it as an
        ``atexit`` function if you wish to ensure cleanup of a temporary
        directory used for extractions.
        """
        # XXX
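

# --- Illustrative usage (editor's sketch, not part of upstream pkg_resources):
# --- ResourceManager resolves resource names relative to a package; the
# --- package and resource names below are examples only.
def _resource_manager_demo():
    manager = ResourceManager()
    if manager.resource_exists("pip", "__init__.py"):
        # a real filesystem path to the resource (extracted to the cache if needed)
        path = manager.resource_filename("pip", "__init__.py")
        # the raw contents of the resource as bytes
        data = manager.resource_string("pip", "__init__.py")
        return path, len(data)
    return None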


def get_default_cache():
    """
    Return the ``PYTHON_EGG_CACHE`` environment variable
    or a platform-relevant user cache dir for an app
    named "Python-Eggs".
    """
    return (
        os.environ.get('PYTHON_EGG_CACHE')
        or appdirs.user_cache_dir(appname='Python-Eggs')
    )


def safe_name(name):
    """Convert an arbitrary string to a standard distribution name

    Any runs of non-alphanumeric/. characters are replaced with a single '-'.
    """
    return re.sub('[^A-Za-z0-9.]+', '-', name)


def safe_version(version):
    """
    Convert an arbitrary string to a standard version string
    """
    try:
        # normalize the version
        return str(packaging.version.Version(version))
    except packaging.version.InvalidVersion:
        version = version.replace(' ', '.')
        return re.sub('[^A-Za-z0-9.]+', '-', version)


def safe_extra(extra):
    """Convert an arbitrary string to a standard 'extra' name

    Any runs of non-alphanumeric characters are replaced with a single '_',
    and the result is always lowercased.
    """
    return re.sub('[^A-Za-z0-9.-]+', '_', extra).lower()


def to_filename(name):
    """Convert a project or version name to its filename-escaped form

    Any '-' characters are currently replaced with '_'.
    """
    return name.replace('-', '_')
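
# Rough examples of the normalization helpers above (assuming the current
# packaging backend; normalization of unusual inputs may differ slightly):
#
#     safe_name('my_project')     # -> 'my-project'
#     safe_version('1.2 beta')    # -> '1.2.beta' (not valid PEP 440, so sanitized)
#     safe_extra('Feature-1')     # -> 'feature-1'
#     to_filename('my-project')   # -> 'my_project'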


def invalid_marker(text):
    """
    Validate text as a PEP 508 environment marker; return an exception
    if invalid or False otherwise.
    """
    try:
        evaluate_marker(text)
    except SyntaxError as e:
        e.filename = None
        e.lineno = None
        return e
    return False


def evaluate_marker(text, extra=None):
    """
    Evaluate a PEP 508 environment marker.
    Return a boolean indicating the marker result in this environment.
    Raise SyntaxError if marker is invalid.

    This implementation uses the 'pyparsing' module.
    """
    try:
        marker = packaging.markers.Marker(text)
        return marker.evaluate()
    except packaging.markers.InvalidMarker as e:
        raise SyntaxError(e)
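
# For example (illustrative; the boolean result depends on the interpreter
# actually running this code):
#
#     evaluate_marker('python_version >= "2.6"')  # -> True on supported Pythons
#     invalid_marker('python_version >= "2.6"')   # -> False (marker is valid)
#     invalid_marker('not-a-marker')              # -> a SyntaxError instance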


class NullProvider:
    """Try to implement resources and metadata for arbitrary PEP 302 loaders"""

    egg_name = None
    egg_info = None
    loader = None

    def __init__(self, module):
        self.loader = getattr(module, '__loader__', None)
        self.module_path = os.path.dirname(getattr(module, '__file__', ''))

    def get_resource_filename(self, manager, resource_name):
        return self._fn(self.module_path, resource_name)

    def get_resource_stream(self, manager, resource_name):
        return io.BytesIO(self.get_resource_string(manager, resource_name))

    def get_resource_string(self, manager, resource_name):
        return self._get(self._fn(self.module_path, resource_name))

    def has_resource(self, resource_name):
        return self._has(self._fn(self.module_path, resource_name))

    def has_metadata(self, name):
        return self.egg_info and self._has(self._fn(self.egg_info, name))

    def get_metadata(self, name):
        if not self.egg_info:
            return ""
        value = self._get(self._fn(self.egg_info, name))
        return value.decode('utf-8') if six.PY3 else value

    def get_metadata_lines(self, name):
        return yield_lines(self.get_metadata(name))

    def resource_isdir(self, resource_name):
        return self._isdir(self._fn(self.module_path, resource_name))

    def metadata_isdir(self, name):
        return self.egg_info and self._isdir(self._fn(self.egg_info, name))

    def resource_listdir(self, resource_name):
        return self._listdir(self._fn(self.module_path, resource_name))

    def metadata_listdir(self, name):
        if self.egg_info:
            return self._listdir(self._fn(self.egg_info, name))
        return []

    def run_script(self, script_name, namespace):
        script = 'scripts/' + script_name
        if not self.has_metadata(script):
            raise ResolutionError("No script named %r" % script_name)
        script_text = self.get_metadata(script).replace('\r\n', '\n')
        script_text = script_text.replace('\r', '\n')
        script_filename = self._fn(self.egg_info, script)
        namespace['__file__'] = script_filename
        if os.path.exists(script_filename):
            source = open(script_filename).read()
            code = compile(source, script_filename, 'exec')
            exec(code, namespace, namespace)
        else:
            from linecache import cache
            cache[script_filename] = (
                len(script_text), 0, script_text.split('\n'), script_filename
            )
            script_code = compile(script_text, script_filename, 'exec')
            exec(script_code, namespace, namespace)

    def _has(self, path):
        raise NotImplementedError(
            "Can't perform this operation for unregistered loader type"
        )

    def _isdir(self, path):
        raise NotImplementedError(
            "Can't perform this operation for unregistered loader type"
        )

    def _listdir(self, path):
        raise NotImplementedError(
            "Can't perform this operation for unregistered loader type"
        )

    def _fn(self, base, resource_name):
        if resource_name:
            return os.path.join(base, *resource_name.split('/'))
        return base

    def _get(self, path):
        if hasattr(self.loader, 'get_data'):
            return self.loader.get_data(path)
        raise NotImplementedError(
            "Can't perform this operation for loaders without 'get_data()'"
        )


register_loader_type(object, NullProvider)


class EggProvider(NullProvider):
    """Provider based on a virtual filesystem"""

    def __init__(self, module):
        NullProvider.__init__(self, module)
        self._setup_prefix()

    def _setup_prefix(self):
        # we assume here that our metadata may be nested inside a "basket"
        # of multiple eggs; that's why we use module_path instead of .archive
        path = self.module_path
        old = None
        while path != old:
            if _is_unpacked_egg(path):
                self.egg_name = os.path.basename(path)
                self.egg_info = os.path.join(path, 'EGG-INFO')
                self.egg_root = path
                break
            old = path
            path, base = os.path.split(path)


class DefaultProvider(EggProvider):
    """Provides access to package resources in the filesystem"""

    def _has(self, path):
        return os.path.exists(path)

    def _isdir(self, path):
        return os.path.isdir(path)

    def _listdir(self, path):
        return os.listdir(path)

    def get_resource_stream(self, manager, resource_name):
        return open(self._fn(self.module_path, resource_name), 'rb')

    def _get(self, path):
        with open(path, 'rb') as stream:
            return stream.read()

    @classmethod
    def _register(cls):
        loader_cls = getattr(importlib_machinery, 'SourceFileLoader',
            type(None))
        register_loader_type(loader_cls, cls)


DefaultProvider._register()


class EmptyProvider(NullProvider):
    """Provider that returns nothing for all requests"""

    _isdir = _has = lambda self, path: False
    _get = lambda self, path: ''
    _listdir = lambda self, path: []
    module_path = None

    def __init__(self):
        pass


empty_provider = EmptyProvider()


class ZipManifests(dict):
    """
    zip manifest builder
    """

    @classmethod
    def build(cls, path):
        """
        Build a dictionary similar to the zipimport directory
        caches, except instead of tuples, store ZipInfo objects.

        Use a platform-specific path separator (os.sep) for the path keys
        for compatibility with pypy on Windows.
        """
        with ContextualZipFile(path) as zfile:
            items = (
                (
                    name.replace('/', os.sep),
                    zfile.getinfo(name),
                )
                for name in zfile.namelist()
            )
            return dict(items)

    load = build


class MemoizedZipManifests(ZipManifests):
    """
    Memoized zipfile manifests.
    """
    manifest_mod = collections.namedtuple('manifest_mod', 'manifest mtime')

    def load(self, path):
        """
        Load a manifest at path or return a suitable manifest already loaded.
        """
        path = os.path.normpath(path)
        mtime = os.stat(path).st_mtime

        if path not in self or self[path].mtime != mtime:
            manifest = self.build(path)
            self[path] = self.manifest_mod(manifest, mtime)

        return self[path].manifest


class ContextualZipFile(zipfile.ZipFile):
    """
    Supplement ZipFile class to support context manager for Python 2.6
    """

    def __enter__(self):
        return self

    def __exit__(self, type, value, traceback):
        self.close()

    def __new__(cls, *args, **kwargs):
        """
        Construct a ZipFile or ContextualZipFile as appropriate
        """
        if hasattr(zipfile.ZipFile, '__exit__'):
            return zipfile.ZipFile(*args, **kwargs)
        return super(ContextualZipFile, cls).__new__(cls)


class ZipProvider(EggProvider):
    """Resource support for zips and eggs"""

    eagers = None
    _zip_manifests = MemoizedZipManifests()

    def __init__(self, module):
        EggProvider.__init__(self, module)
        self.zip_pre = self.loader.archive + os.sep

    def _zipinfo_name(self, fspath):
        # Convert a virtual filename (full path to file) into a zipfile subpath
        # usable with the zipimport directory cache for our target archive
        if fspath.startswith(self.zip_pre):
            return fspath[len(self.zip_pre):]
        raise AssertionError(
            "%s is not a subpath of %s" % (fspath, self.zip_pre)
        )

    def _parts(self, zip_path):
        # Convert a zipfile subpath into an egg-relative path part list.
        # pseudo-fs path
        fspath = self.zip_pre + zip_path
        if fspath.startswith(self.egg_root + os.sep):
            return fspath[len(self.egg_root) + 1:].split(os.sep)
        raise AssertionError(
            "%s is not a subpath of %s" % (fspath, self.egg_root)
        )

    @property
    def zipinfo(self):
        return self._zip_manifests.load(self.loader.archive)

    def get_resource_filename(self, manager, resource_name):
        if not self.egg_name:
            raise NotImplementedError(
                "resource_filename() only supported for .egg, not .zip"
            )
        # no need to lock for extraction, since we use temp names
        zip_path = self._resource_to_zip(resource_name)
        eagers = self._get_eager_resources()
        if '/'.join(self._parts(zip_path)) in eagers:
            for name in eagers:
                self._extract_resource(manager, self._eager_to_zip(name))
        return self._extract_resource(manager, zip_path)

    @staticmethod
    def _get_date_and_size(zip_stat):
        size = zip_stat.file_size
        # ymdhms+wday, yday, dst
        date_time = zip_stat.date_time + (0, 0, -1)
        # 1980 offset already done
        timestamp = time.mktime(date_time)
        return timestamp, size

    def _extract_resource(self, manager, zip_path):

        if zip_path in self._index():
            for name in self._index()[zip_path]:
                last = self._extract_resource(
                    manager, os.path.join(zip_path, name)
                )
            # return the extracted directory name
            return os.path.dirname(last)

        timestamp, size = self._get_date_and_size(self.zipinfo[zip_path])

        if not WRITE_SUPPORT:
            raise IOError('"os.rename" and "os.unlink" are not supported '
                          'on this platform')
        try:

            real_path = manager.get_cache_path(
                self.egg_name, self._parts(zip_path)
            )

            if self._is_current(real_path, zip_path):
                return real_path

            outf, tmpnam = _mkstemp(".$extract", dir=os.path.dirname(real_path))
            os.write(outf, self.loader.get_data(zip_path))
            os.close(outf)
            utime(tmpnam, (timestamp, timestamp))
            manager.postprocess(tmpnam, real_path)

            try:
                rename(tmpnam, real_path)

            except os.error:
                if os.path.isfile(real_path):
                    if self._is_current(real_path, zip_path):
                        # the file became current since it was checked above,
                        #  so proceed.
                        return real_path
                    # Windows, del old file and retry
                    elif os.name == 'nt':
                        unlink(real_path)
                        rename(tmpnam, real_path)
                        return real_path
                raise

        except os.error:
            # report a user-friendly error
            manager.extraction_error()

        return real_path

    def _is_current(self, file_path, zip_path):
        """
        Return True if the file_path is current for this zip_path
        """
        timestamp, size = self._get_date_and_size(self.zipinfo[zip_path])
        if not os.path.isfile(file_path):
            return False
        stat = os.stat(file_path)
        if stat.st_size != size or stat.st_mtime != timestamp:
            return False
        # check that the contents match
        zip_contents = self.loader.get_data(zip_path)
        with open(file_path, 'rb') as f:
            file_contents = f.read()
        return zip_contents == file_contents

    def _get_eager_resources(self):
        if self.eagers is None:
            eagers = []
            for name in ('native_libs.txt', 'eager_resources.txt'):
                if self.has_metadata(name):
                    eagers.extend(self.get_metadata_lines(name))
            self.eagers = eagers
        return self.eagers

    def _index(self):
        try:
            return self._dirindex
        except AttributeError:
            ind = {}
            for path in self.zipinfo:
                parts = path.split(os.sep)
                while parts:
                    parent = os.sep.join(parts[:-1])
                    if parent in ind:
                        ind[parent].append(parts[-1])
                        break
                    else:
                        ind[parent] = [parts.pop()]
            self._dirindex = ind
            return ind

    def _has(self, fspath):
        zip_path = self._zipinfo_name(fspath)
        return zip_path in self.zipinfo or zip_path in self._index()

    def _isdir(self, fspath):
        return self._zipinfo_name(fspath) in self._index()

    def _listdir(self, fspath):
        return list(self._index().get(self._zipinfo_name(fspath), ()))

    def _eager_to_zip(self, resource_name):
        return self._zipinfo_name(self._fn(self.egg_root, resource_name))

    def _resource_to_zip(self, resource_name):
        return self._zipinfo_name(self._fn(self.module_path, resource_name))


register_loader_type(zipimport.zipimporter, ZipProvider)


class FileMetadata(EmptyProvider):
    """Metadata handler for standalone PKG-INFO files

    Usage::

        metadata = FileMetadata("/path/to/PKG-INFO")

    This provider rejects all data and metadata requests except for PKG-INFO,
    which is treated as existing, and will be the contents of the file at
    the provided location.
    """

    def __init__(self, path):
        self.path = path

    def has_metadata(self, name):
        return name == 'PKG-INFO' and os.path.isfile(self.path)

    def get_metadata(self, name):
        if name != 'PKG-INFO':
            raise KeyError("No metadata except PKG-INFO is available")

        with io.open(self.path, encoding='utf-8', errors="replace") as f:
            metadata = f.read()
        self._warn_on_replacement(metadata)
        return metadata

    def _warn_on_replacement(self, metadata):
        # Python 2.6 and 3.2 compat for: replacement_char = '�'
        replacement_char = b'\xef\xbf\xbd'.decode('utf-8')
        if replacement_char in metadata:
            tmpl = "{self.path} could not be properly decoded in UTF-8"
            msg = tmpl.format(**locals())
            warnings.warn(msg)

    def get_metadata_lines(self, name):
        return yield_lines(self.get_metadata(name))


class PathMetadata(DefaultProvider):
    """Metadata provider for egg directories

    Usage::

        # Development eggs:

        egg_info = "/path/to/PackageName.egg-info"
        base_dir = os.path.dirname(egg_info)
        metadata = PathMetadata(base_dir, egg_info)
        dist_name = os.path.splitext(os.path.basename(egg_info))[0]
        dist = Distribution(base_dir, project_name=dist_name, metadata=metadata)

        # Unpacked egg directories:

        egg_path = "/path/to/PackageName-ver-pyver-etc.egg"
        metadata = PathMetadata(egg_path, os.path.join(egg_path,'EGG-INFO'))
        dist = Distribution.from_filename(egg_path, metadata=metadata)
    """

    def __init__(self, path, egg_info):
        self.module_path = path
        self.egg_info = egg_info


class EggMetadata(ZipProvider):
    """Metadata provider for .egg files"""

    def __init__(self, importer):
        """Create a metadata provider from a zipimporter"""

        self.zip_pre = importer.archive + os.sep
        self.loader = importer
        if importer.prefix:
            self.module_path = os.path.join(importer.archive, importer.prefix)
        else:
            self.module_path = importer.archive
        self._setup_prefix()


_declare_state('dict', _distribution_finders={})


def register_finder(importer_type, distribution_finder):
    """Register `distribution_finder` to find distributions in sys.path items

    `importer_type` is the type or class of a PEP 302 "Importer" (sys.path item
    handler), and `distribution_finder` is a callable that, passed a path
    item and the importer instance, yields ``Distribution`` instances found on
    that path item.  See ``pkg_resources.find_on_path`` for an example."""
    _distribution_finders[importer_type] = distribution_finder


def find_distributions(path_item, only=False):
    """Yield distributions accessible via `path_item`"""
    importer = get_importer(path_item)
    finder = _find_adapter(_distribution_finders, importer)
    return finder(importer, path_item, only)
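
# A hedged usage sketch: scan a directory (the path below is hypothetical) for
# whatever distributions the registered finders recognize there.
#
#     for dist in find_distributions('/path/to/site-packages'):
#         print(dist.project_name, dist.version)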


def find_eggs_in_zip(importer, path_item, only=False):
    """
    Find eggs in zip files; possibly multiple nested eggs.
    """
    if importer.archive.endswith('.whl'):
        # wheels are not supported with this finder
        # they don't have PKG-INFO metadata, and won't ever contain eggs
        return
    metadata = EggMetadata(importer)
    if metadata.has_metadata('PKG-INFO'):
        yield Distribution.from_filename(path_item, metadata=metadata)
    if only:
        # don't yield nested distros
        return
    for subitem in metadata.resource_listdir('/'):
        if _is_unpacked_egg(subitem):
            subpath = os.path.join(path_item, subitem)
            for dist in find_eggs_in_zip(zipimport.zipimporter(subpath), subpath):
                yield dist


register_finder(zipimport.zipimporter, find_eggs_in_zip)


def find_nothing(importer, path_item, only=False):
    return ()


register_finder(object, find_nothing)


def _by_version_descending(names):
    """
    Given a list of filenames, return them in descending order
    by version number.

    >>> names = 'bar', 'foo', 'Python-2.7.10.egg', 'Python-2.7.2.egg'
    >>> _by_version_descending(names)
    ['Python-2.7.10.egg', 'Python-2.7.2.egg', 'foo', 'bar']
    >>> names = 'Setuptools-1.2.3b1.egg', 'Setuptools-1.2.3.egg'
    >>> _by_version_descending(names)
    ['Setuptools-1.2.3.egg', 'Setuptools-1.2.3b1.egg']
    >>> names = 'Setuptools-1.2.3b1.egg', 'Setuptools-1.2.3.post1.egg'
    >>> _by_version_descending(names)
    ['Setuptools-1.2.3.post1.egg', 'Setuptools-1.2.3b1.egg']
    """
    def _by_version(name):
        """
        Parse each component of the filename
        """
        name, ext = os.path.splitext(name)
        parts = itertools.chain(name.split('-'), [ext])
        return [packaging.version.parse(part) for part in parts]

    return sorted(names, key=_by_version, reverse=True)


def find_on_path(importer, path_item, only=False):
    """Yield distributions accessible on a sys.path directory"""
    path_item = _normalize_cached(path_item)

    if os.path.isdir(path_item) and os.access(path_item, os.R_OK):
        if _is_unpacked_egg(path_item):
            yield Distribution.from_filename(
                path_item, metadata=PathMetadata(
                    path_item, os.path.join(path_item, 'EGG-INFO')
                )
            )
        else:
            # scan for .egg and .egg-info in directory
            path_item_entries = _by_version_descending(os.listdir(path_item))
            for entry in path_item_entries:
                lower = entry.lower()
                if lower.endswith('.egg-info') or lower.endswith('.dist-info'):
                    fullpath = os.path.join(path_item, entry)
                    if os.path.isdir(fullpath):
                        # egg-info directory, allow getting metadata
                        if len(os.listdir(fullpath)) == 0:
                            # Empty egg directory, skip.
                            continue
                        metadata = PathMetadata(path_item, fullpath)
                    else:
                        metadata = FileMetadata(fullpath)
                    yield Distribution.from_location(
                        path_item, entry, metadata, precedence=DEVELOP_DIST
                    )
                elif not only and _is_unpacked_egg(entry):
                    dists = find_distributions(os.path.join(path_item, entry))
                    for dist in dists:
                        yield dist
                elif not only and lower.endswith('.egg-link'):
                    with open(os.path.join(path_item, entry)) as entry_file:
                        entry_lines = entry_file.readlines()
                    for line in entry_lines:
                        if not line.strip():
                            continue
                        path = os.path.join(path_item, line.rstrip())
                        dists = find_distributions(path)
                        for item in dists:
                            yield item
                        break


register_finder(pkgutil.ImpImporter, find_on_path)

if hasattr(importlib_machinery, 'FileFinder'):
    register_finder(importlib_machinery.FileFinder, find_on_path)

_declare_state('dict', _namespace_handlers={})
_declare_state('dict', _namespace_packages={})


def register_namespace_handler(importer_type, namespace_handler):
    """Register `namespace_handler` to declare namespace packages

    `importer_type` is the type or class of a PEP 302 "Importer" (sys.path item
    handler), and `namespace_handler` is a callable like this::

        def namespace_handler(importer, path_entry, moduleName, module):
            # return a path_entry to use for child packages

    Namespace handlers are only called if the importer object has already
    agreed that it can handle the relevant path item, and they should only
    return a subpath if the module __path__ does not already contain an
    equivalent subpath.  For an example namespace handler, see
    ``pkg_resources.file_ns_handler``.
    """
    _namespace_handlers[importer_type] = namespace_handler


def _handle_ns(packageName, path_item):
    """Ensure that named package includes a subpath of path_item (if needed)"""

    importer = get_importer(path_item)
    if importer is None:
        return None
    loader = importer.find_module(packageName)
    if loader is None:
        return None
    module = sys.modules.get(packageName)
    if module is None:
        module = sys.modules[packageName] = types.ModuleType(packageName)
        module.__path__ = []
        _set_parent_ns(packageName)
    elif not hasattr(module, '__path__'):
        raise TypeError("Not a package:", packageName)
    handler = _find_adapter(_namespace_handlers, importer)
    subpath = handler(importer, path_item, packageName, module)
    if subpath is not None:
        path = module.__path__
        path.append(subpath)
        loader.load_module(packageName)
        _rebuild_mod_path(path, packageName, module)
    return subpath


def _rebuild_mod_path(orig_path, package_name, module):
    """
    Rebuild module.__path__ ensuring that all entries are ordered
    corresponding to their sys.path order
    """
    sys_path = [_normalize_cached(p) for p in sys.path]

    def safe_sys_path_index(entry):
        """
        Workaround for #520 and #513.
        """
        try:
            return sys_path.index(entry)
        except ValueError:
            return float('inf')

    def position_in_sys_path(path):
        """
        Return the ordinal of the path based on its position in sys.path
        """
        path_parts = path.split(os.sep)
        module_parts = package_name.count('.') + 1
        parts = path_parts[:-module_parts]
        return safe_sys_path_index(_normalize_cached(os.sep.join(parts)))

    orig_path.sort(key=position_in_sys_path)
    module.__path__[:] = [_normalize_cached(p) for p in orig_path]


def declare_namespace(packageName):
    """Declare that package 'packageName' is a namespace package"""

    _imp.acquire_lock()
    try:
        if packageName in _namespace_packages:
            return

        path, parent = sys.path, None
        if '.' in packageName:
            parent = '.'.join(packageName.split('.')[:-1])
            declare_namespace(parent)
            if parent not in _namespace_packages:
                __import__(parent)
            try:
                path = sys.modules[parent].__path__
            except AttributeError:
                raise TypeError("Not a package:", parent)

        # Track what packages are namespaces, so when new path items are added,
        # they can be updated
        _namespace_packages.setdefault(parent, []).append(packageName)
        _namespace_packages.setdefault(packageName, [])

        for path_item in path:
            # Ensure all the parent's path items are reflected in the child,
            # if they apply
            _handle_ns(packageName, path_item)

    finally:
        _imp.release_lock()
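
# The classic pkg_resources-style namespace package ships an ``__init__.py``
# consisting of a single call to this function:
#
#     __import__('pkg_resources').declare_namespace(__name__)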


def fixup_namespace_packages(path_item, parent=None):
    """Ensure that previously-declared namespace packages include path_item"""
    _imp.acquire_lock()
    try:
        for package in _namespace_packages.get(parent, ()):
            subpath = _handle_ns(package, path_item)
            if subpath:
                fixup_namespace_packages(subpath, package)
    finally:
        _imp.release_lock()


def file_ns_handler(importer, path_item, packageName, module):
    """Compute an ns-package subpath for a filesystem or zipfile importer"""

    subpath = os.path.join(path_item, packageName.split('.')[-1])
    normalized = _normalize_cached(subpath)
    for item in module.__path__:
        if _normalize_cached(item) == normalized:
            break
    else:
        # Only return the path if it's not already there
        return subpath


register_namespace_handler(pkgutil.ImpImporter, file_ns_handler)
register_namespace_handler(zipimport.zipimporter, file_ns_handler)

if hasattr(importlib_machinery, 'FileFinder'):
    register_namespace_handler(importlib_machinery.FileFinder, file_ns_handler)


def null_ns_handler(importer, path_item, packageName, module):
    return None


register_namespace_handler(object, null_ns_handler)


def normalize_path(filename):
    """Normalize a file/dir name for comparison purposes"""
    return os.path.normcase(os.path.realpath(filename))


def _normalize_cached(filename, _cache={}):
    try:
        return _cache[filename]
    except KeyError:
        _cache[filename] = result = normalize_path(filename)
        return result


def _is_unpacked_egg(path):
    """
    Determine if given path appears to be an unpacked egg.
    """
    return (
        path.lower().endswith('.egg')
    )


def _set_parent_ns(packageName):
    parts = packageName.split('.')
    name = parts.pop()
    if parts:
        parent = '.'.join(parts)
        setattr(sys.modules[parent], name, sys.modules[packageName])


def yield_lines(strs):
    """Yield non-empty/non-comment lines of a string or sequence"""
    if isinstance(strs, six.string_types):
        for s in strs.splitlines():
            s = s.strip()
            # skip blank lines/comments
            if s and not s.startswith('#'):
                yield s
    else:
        for ss in strs:
            for s in yield_lines(ss):
                yield s
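
# For instance (illustrative):
#
#     list(yield_lines("first\n# a comment\n\nsecond"))
#     # -> ['first', 'second']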


MODULE = re.compile(r"\w+(\.\w+)*$").match
EGG_NAME = re.compile(
    r"""
    (?P<name>[^-]+) (
        -(?P<ver>[^-]+) (
            -py(?P<pyver>[^-]+) (
                -(?P<plat>.+)
            )?
        )?
    )?
    """,
    re.VERBOSE | re.IGNORECASE,
).match


class EntryPoint(object):
    """Object representing an advertised importable object"""

    def __init__(self, name, module_name, attrs=(), extras=(), dist=None):
        if not MODULE(module_name):
            raise ValueError("Invalid module name", module_name)
        self.name = name
        self.module_name = module_name
        self.attrs = tuple(attrs)
        self.extras = Requirement.parse(("x[%s]" % ','.join(extras))).extras
        self.dist = dist

    def __str__(self):
        s = "%s = %s" % (self.name, self.module_name)
        if self.attrs:
            s += ':' + '.'.join(self.attrs)
        if self.extras:
            s += ' [%s]' % ','.join(self.extras)
        return s

    def __repr__(self):
        return "EntryPoint.parse(%r)" % str(self)

    def load(self, require=True, *args, **kwargs):
        """
        Require packages for this EntryPoint, then resolve it.
        """
        if not require or args or kwargs:
            warnings.warn(
                "Parameters to load are deprecated.  Call .resolve and "
                ".require separately.",
                DeprecationWarning,
                stacklevel=2,
            )
        if require:
            self.require(*args, **kwargs)
        return self.resolve()

    def resolve(self):
        """
        Resolve the entry point from its module and attrs.
        """
        module = __import__(self.module_name, fromlist=['__name__'], level=0)
        try:
            return functools.reduce(getattr, self.attrs, module)
        except AttributeError as exc:
            raise ImportError(str(exc))

    def require(self, env=None, installer=None):
        if self.extras and not self.dist:
            raise UnknownExtra("Can't require() without a distribution", self)
        reqs = self.dist.requires(self.extras)
        items = working_set.resolve(reqs, env, installer)
        list(map(working_set.add, items))

    pattern = re.compile(
        r'\s*'
        r'(?P<name>.+?)\s*'
        r'=\s*'
        r'(?P<module>[\w.]+)\s*'
        r'(:\s*(?P<attr>[\w.]+))?\s*'
        r'(?P<extras>\[.*\])?\s*$'
    )

    @classmethod
    def parse(cls, src, dist=None):
        """Parse a single entry point from string `src`

        Entry point syntax follows the form::

            name = some.module:some.attr [extra1, extra2]

        The entry name and module name are required, but the ``:attrs`` and
        ``[extras]`` parts are optional
        """
        m = cls.pattern.match(src)
        if not m:
            msg = "EntryPoint must be in 'name=module:attrs [extras]' format"
            raise ValueError(msg, src)
        res = m.groupdict()
        extras = cls._parse_extras(res['extras'])
        attrs = res['attr'].split('.') if res['attr'] else ()
        return cls(res['name'], res['module'], attrs, extras, dist)

    @classmethod
    def _parse_extras(cls, extras_spec):
        if not extras_spec:
            return ()
        req = Requirement.parse('x' + extras_spec)
        if req.specs:
            raise ValueError()
        return req.extras

    @classmethod
    def parse_group(cls, group, lines, dist=None):
        """Parse an entry point group"""
        if not MODULE(group):
            raise ValueError("Invalid group name", group)
        this = {}
        for line in yield_lines(lines):
            ep = cls.parse(line, dist)
            if ep.name in this:
                raise ValueError("Duplicate entry point", group, ep.name)
            this[ep.name] = ep
        return this

    @classmethod
    def parse_map(cls, data, dist=None):
        """Parse a map of entry point groups"""
        if isinstance(data, dict):
            data = data.items()
        else:
            data = split_sections(data)
        maps = {}
        for group, lines in data:
            if group is None:
                if not lines:
                    continue
                raise ValueError("Entry points must be listed in groups")
            group = group.strip()
            if group in maps:
                raise ValueError("Duplicate group name", group)
            maps[group] = cls.parse_group(group, lines, dist)
        return maps
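
# A small parsing sketch (the package and callable names are hypothetical):
#
#     ep = EntryPoint.parse('main = mypkg.cli:run [extra1]')
#     ep.name         # -> 'main'
#     ep.module_name  # -> 'mypkg.cli'
#     ep.attrs        # -> ('run',)
#     ep.extras       # -> ('extra1',)
#     ep.resolve()    # imports mypkg.cli and returns its ``run`` attribute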


def _remove_md5_fragment(location):
    if not location:
        return ''
    parsed = urllib.parse.urlparse(location)
    if parsed[-1].startswith('md5='):
        return urllib.parse.urlunparse(parsed[:-1] + ('',))
    return location


def _version_from_file(lines):
    """
    Given an iterable of lines from a Metadata file, return
    the value of the Version field, if present, or None otherwise.
    """
    is_version_line = lambda line: line.lower().startswith('version:')
    version_lines = filter(is_version_line, lines)
    line = next(iter(version_lines), '')
    _, _, value = line.partition(':')
    return safe_version(value.strip()) or None


class Distribution(object):
    """Wrap an actual or potential sys.path entry w/metadata"""
    PKG_INFO = 'PKG-INFO'

    def __init__(self, location=None, metadata=None, project_name=None,
            version=None, py_version=PY_MAJOR, platform=None,
            precedence=EGG_DIST):
        self.project_name = safe_name(project_name or 'Unknown')
        if version is not None:
            self._version = safe_version(version)
        self.py_version = py_version
        self.platform = platform
        self.location = location
        self.precedence = precedence
        self._provider = metadata or empty_provider

    @classmethod
    def from_location(cls, location, basename, metadata=None, **kw):
        project_name, version, py_version, platform = [None] * 4
        basename, ext = os.path.splitext(basename)
        if ext.lower() in _distributionImpl:
            cls = _distributionImpl[ext.lower()]

            match = EGG_NAME(basename)
            if match:
                project_name, version, py_version, platform = match.group(
                    'name', 'ver', 'pyver', 'plat'
                )
        return cls(
            location, metadata, project_name=project_name, version=version,
            py_version=py_version, platform=platform, **kw
        )._reload_version()

    def _reload_version(self):
        return self

    @property
    def hashcmp(self):
        return (
            self.parsed_version,
            self.precedence,
            self.key,
            _remove_md5_fragment(self.location),
            self.py_version or '',
            self.platform or '',
        )

    def __hash__(self):
        return hash(self.hashcmp)

    def __lt__(self, other):
        return self.hashcmp < other.hashcmp

    def __le__(self, other):
        return self.hashcmp <= other.hashcmp

    def __gt__(self, other):
        return self.hashcmp > other.hashcmp

    def __ge__(self, other):
        return self.hashcmp >= other.hashcmp

    def __eq__(self, other):
        if not isinstance(other, self.__class__):
            # It's not a Distribution, so they are not equal
            return False
        return self.hashcmp == other.hashcmp

    def __ne__(self, other):
        return not self == other

    # These properties have to be lazy so that we don't have to load any
    # metadata until/unless it's actually needed.  (i.e., some distributions
    # may not know their name or version without loading PKG-INFO)

    @property
    def key(self):
        try:
            return self._key
        except AttributeError:
            self._key = key = self.project_name.lower()
            return key

    @property
    def parsed_version(self):
        if not hasattr(self, "_parsed_version"):
            self._parsed_version = parse_version(self.version)

        return self._parsed_version

    def _warn_legacy_version(self):
        LV = packaging.version.LegacyVersion
        is_legacy = isinstance(self._parsed_version, LV)
        if not is_legacy:
            return

        # An empty version is technically a legacy (non-PEP 440) version,
        # but it is unlikely to have been supplied deliberately; it more
        # likely comes from setuptools attempting to parse a filename.
        # So only warn when there is actually a version string present.
        if not self.version:
            return

        tmpl = textwrap.dedent("""
            '{project_name} ({version})' is being parsed as a legacy,
            non PEP 440,
            version. You may find odd behavior and sort order.
            In particular it will be sorted as less than 0.0. It
            is recommended to migrate to PEP 440 compatible
            versions.
            """).strip().replace('\n', ' ')

        warnings.warn(tmpl.format(**vars(self)), PEP440Warning)

    @property
    def version(self):
        try:
            return self._version
        except AttributeError:
            version = _version_from_file(self._get_metadata(self.PKG_INFO))
            if version is None:
                tmpl = "Missing 'Version:' header and/or %s file"
                raise ValueError(tmpl % self.PKG_INFO, self)
            return version

    @property
    def _dep_map(self):
        try:
            return self.__dep_map
        except AttributeError:
            dm = self.__dep_map = {None: []}
            for name in 'requires.txt', 'depends.txt':
                for extra, reqs in split_sections(self._get_metadata(name)):
                    if extra:
                        if ':' in extra:
                            extra, marker = extra.split(':', 1)
                            if invalid_marker(marker):
                                # XXX warn
                                reqs = []
                            elif not evaluate_marker(marker):
                                reqs = []
                        extra = safe_extra(extra) or None
                    dm.setdefault(extra, []).extend(parse_requirements(reqs))
            return dm

    def requires(self, extras=()):
        """List of Requirements needed for this distro if `extras` are used"""
        dm = self._dep_map
        deps = []
        deps.extend(dm.get(None, ()))
        for ext in extras:
            try:
                deps.extend(dm[safe_extra(ext)])
            except KeyError:
                raise UnknownExtra(
                    "%s has no such extra feature %r" % (self, ext)
                )
        return deps

    def _get_metadata(self, name):
        if self.has_metadata(name):
            for line in self.get_metadata_lines(name):
                yield line

    def activate(self, path=None, replace=False):
        """Ensure distribution is importable on `path` (default=sys.path)"""
        if path is None:
            path = sys.path
        self.insert_on(path, replace=replace)
        if path is sys.path:
            fixup_namespace_packages(self.location)
            for pkg in self._get_metadata('namespace_packages.txt'):
                if pkg in sys.modules:
                    declare_namespace(pkg)

    def egg_name(self):
        """Return what this distribution's standard .egg filename should be"""
        filename = "%s-%s-py%s" % (
            to_filename(self.project_name), to_filename(self.version),
            self.py_version or PY_MAJOR
        )

        if self.platform:
            filename += '-' + self.platform
        return filename

    def __repr__(self):
        if self.location:
            return "%s (%s)" % (self, self.location)
        else:
            return str(self)

    def __str__(self):
        try:
            version = getattr(self, 'version', None)
        except ValueError:
            version = None
        version = version or "[unknown version]"
        return "%s %s" % (self.project_name, version)

    def __getattr__(self, attr):
        """Delegate all unrecognized public attributes to .metadata provider"""
        if attr.startswith('_'):
            raise AttributeError(attr)
        return getattr(self._provider, attr)

    @classmethod
    def from_filename(cls, filename, metadata=None, **kw):
        return cls.from_location(
            _normalize_cached(filename), os.path.basename(filename), metadata,
            **kw
        )

    def as_requirement(self):
        """Return a ``Requirement`` that matches this distribution exactly"""
        if isinstance(self.parsed_version, packaging.version.Version):
            spec = "%s==%s" % (self.project_name, self.parsed_version)
        else:
            spec = "%s===%s" % (self.project_name, self.parsed_version)

        return Requirement.parse(spec)

    def load_entry_point(self, group, name):
        """Return the `name` entry point of `group` or raise ImportError"""
        ep = self.get_entry_info(group, name)
        if ep is None:
            raise ImportError("Entry point %r not found" % ((group, name),))
        return ep.load()

    def get_entry_map(self, group=None):
        """Return the entry point map for `group`, or the full entry map"""
        try:
            ep_map = self._ep_map
        except AttributeError:
            ep_map = self._ep_map = EntryPoint.parse_map(
                self._get_metadata('entry_points.txt'), self
            )
        if group is not None:
            return ep_map.get(group, {})
        return ep_map

    def get_entry_info(self, group, name):
        """Return the EntryPoint object for `group`+`name`, or ``None``"""
        return self.get_entry_map(group).get(name)

    def insert_on(self, path, loc=None, replace=False):
        """Ensure self.location is on path

        If replace=False (default):
            - If location is already in path anywhere, do nothing.
            - Else:
              - If it's an egg and its parent directory is on path,
                insert just ahead of the parent.
              - Else: add to the end of path.
        If replace=True:
            - If location is already on path anywhere (not eggs)
              or higher priority than its parent (eggs)
              do nothing.
            - Else:
              - If it's an egg and its parent directory is on path,
                insert just ahead of the parent,
                removing any lower-priority entries.
              - Else: add it to the front of path.
        """

        loc = loc or self.location
        if not loc:
            return

        nloc = _normalize_cached(loc)
        bdir = os.path.dirname(nloc)
        npath = [(p and _normalize_cached(p) or p) for p in path]

        for p, item in enumerate(npath):
            if item == nloc:
                if replace:
                    break
                else:
                    # don't modify path (even removing duplicates) if found and not replace
                    return
            elif item == bdir and self.precedence == EGG_DIST:
                # if it's an .egg, give it precedence over its directory
                # UNLESS it's already been added to sys.path and replace=False
                if (not replace) and nloc in npath[p:]:
                    return
                if path is sys.path:
                    self.check_version_conflict()
                path.insert(p, loc)
                npath.insert(p, nloc)
                break
        else:
            if path is sys.path:
                self.check_version_conflict()
            if replace:
                path.insert(0, loc)
            else:
                path.append(loc)
            return

        # p is the spot where we found or inserted loc; now remove duplicates
        while True:
            try:
                np = npath.index(nloc, p + 1)
            except ValueError:
                break
            else:
                del npath[np], path[np]
                # ha!
                p = np

        return

    def check_version_conflict(self):
        if self.key == 'setuptools':
            # ignore the inevitable setuptools self-conflicts  :(
            return

        nsp = dict.fromkeys(self._get_metadata('namespace_packages.txt'))
        loc = normalize_path(self.location)
        for modname in self._get_metadata('top_level.txt'):
            if (modname not in sys.modules or modname in nsp
                    or modname in _namespace_packages):
                continue
            if modname in ('pkg_resources', 'setuptools', 'site'):
                continue
            fn = getattr(sys.modules[modname], '__file__', None)
            if fn and (normalize_path(fn).startswith(loc) or
                       fn.startswith(self.location)):
                continue
            issue_warning(
                "Module %s was already imported from %s, but %s is being added"
                " to sys.path" % (modname, fn, self.location),
            )

    def has_version(self):
        try:
            self.version
        except ValueError:
            issue_warning("Unbuilt egg for " + repr(self))
            return False
        return True

    def clone(self, **kw):
        """Copy this distribution, substituting in any changed keyword args"""
        names = 'project_name version py_version platform location precedence'
        for attr in names.split():
            kw.setdefault(attr, getattr(self, attr, None))
        kw.setdefault('metadata', self._provider)
        return self.__class__(**kw)

    @property
    def extras(self):
        return [dep for dep in self._dep_map if dep]


class EggInfoDistribution(Distribution):
    def _reload_version(self):
        """
        Packages installed by distutils (e.g. numpy or scipy)
        use an old safe_version, so their version numbers can get
        mangled when converted to filenames (e.g., 1.11.0.dev0+2329eae
        becomes 1.11.0.dev0_2329eae). These distributions will not be
        parsed properly downstream by Distribution and safe_version,
        so take an extra step and try to get the version number from
        the metadata file itself instead of the filename.
        """
        md_version = _version_from_file(self._get_metadata(self.PKG_INFO))
        if md_version:
            self._version = md_version
        return self


class DistInfoDistribution(Distribution):
    """Wrap an actual or potential sys.path entry w/metadata, .dist-info style"""
    PKG_INFO = 'METADATA'
    EQEQ = re.compile(r"([\(,])\s*(\d.*?)\s*([,\)])")

    @property
    def _parsed_pkg_info(self):
        """Parse and cache metadata"""
        try:
            return self._pkg_info
        except AttributeError:
            metadata = self.get_metadata(self.PKG_INFO)
            self._pkg_info = email.parser.Parser().parsestr(metadata)
            return self._pkg_info

    @property
    def _dep_map(self):
        try:
            return self.__dep_map
        except AttributeError:
            self.__dep_map = self._compute_dependencies()
            return self.__dep_map

    def _compute_dependencies(self):
        """Recompute this distribution's dependencies."""
        dm = self.__dep_map = {None: []}

        reqs = []
        # Including any condition expressions
        for req in self._parsed_pkg_info.get_all('Requires-Dist') or []:
            reqs.extend(parse_requirements(req))

        def reqs_for_extra(extra):
            for req in reqs:
                if not req.marker or req.marker.evaluate({'extra': extra}):
                    yield req

        common = frozenset(reqs_for_extra(None))
        dm[None].extend(common)

        for extra in self._parsed_pkg_info.get_all('Provides-Extra') or []:
            s_extra = safe_extra(extra.strip())
            dm[s_extra] = list(frozenset(reqs_for_extra(extra)) - common)

        return dm


_distributionImpl = {
    '.egg': Distribution,
    '.egg-info': EggInfoDistribution,
    '.dist-info': DistInfoDistribution,
    }
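
# Given that mapping, ``Distribution.from_filename`` picks an implementation
# from the file extension and parses the name/version out of the basename.
# For example (hypothetical path):
#
#     dist = Distribution.from_filename('/tmp/Foo-1.0-py3.6.egg')
#     dist.project_name      # -> 'Foo'
#     dist.version           # -> '1.0'
#     dist.egg_name()        # -> 'Foo-1.0-py3.6'
#     dist.as_requirement()  # -> Requirement.parse('Foo==1.0')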


def issue_warning(*args, **kw):
    level = 1
    g = globals()
    try:
        # find the first stack frame that is *not* code in
        # the pkg_resources module, to use for the warning
        while sys._getframe(level).f_globals is g:
            level += 1
    except ValueError:
        pass
    warnings.warn(stacklevel=level + 1, *args, **kw)


class RequirementParseError(ValueError):
    def __str__(self):
        return ' '.join(self.args)


def parse_requirements(strs):
    """Yield ``Requirement`` objects for each specification in `strs`

    `strs` must be a string, or a (possibly-nested) iterable thereof.
    """
    # create a steppable iterator, so we can handle \-continuations
    lines = iter(yield_lines(strs))

    for line in lines:
        # Drop comments -- a hash without a space may be in a URL.
        if ' #' in line:
            line = line[:line.find(' #')]
        # If there is a line continuation, drop it, and append the next line.
        if line.endswith('\\'):
            line = line[:-2].strip()
            line += next(lines)
        yield Requirement(line)


class Requirement(packaging.requirements.Requirement):
    def __init__(self, requirement_string):
        """DO NOT CALL THIS UNDOCUMENTED METHOD; use Requirement.parse()!"""
        try:
            super(Requirement, self).__init__(requirement_string)
        except packaging.requirements.InvalidRequirement as e:
            raise RequirementParseError(str(e))
        self.unsafe_name = self.name
        project_name = safe_name(self.name)
        self.project_name, self.key = project_name, project_name.lower()
        self.specs = [
            (spec.operator, spec.version) for spec in self.specifier]
        self.extras = tuple(map(safe_extra, self.extras))
        self.hashCmp = (
            self.key,
            self.specifier,
            frozenset(self.extras),
            str(self.marker) if self.marker else None,
        )
        self.__hash = hash(self.hashCmp)

    def __eq__(self, other):
        return (
            isinstance(other, Requirement) and
            self.hashCmp == other.hashCmp
        )

    def __ne__(self, other):
        return not self == other

    def __contains__(self, item):
        if isinstance(item, Distribution):
            if item.key != self.key:
                return False

            item = item.version

        # Allow prereleases always in order to match the previous behavior of
        # this method. In the future this should be smarter and follow PEP 440
        # more accurately.
        return self.specifier.contains(item, prereleases=True)

    def __hash__(self):
        return self.__hash

    def __repr__(self): return "Requirement.parse(%r)" % str(self)

    @staticmethod
    def parse(s):
        req, = parse_requirements(s)
        return req
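
# For example:
#
#     req = Requirement.parse('requests>=2.18,<3; python_version >= "2.7"')
#     req.key          # -> 'requests'
#     req.specs        # -> [('>=', '2.18'), ('<', '3')] (order may vary)
#     '2.19.1' in req  # -> True (plain version strings are matched against
#                      #    the specifier, with prereleases allowed)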


def _get_mro(cls):
    """Get an mro for a type or classic class"""
    if not isinstance(cls, type):

        class cls(cls, object):
            pass

        return cls.__mro__[1:]
    return cls.__mro__


def _find_adapter(registry, ob):
    """Return an adapter factory for `ob` from `registry`"""
    for t in _get_mro(getattr(ob, '__class__', type(ob))):
        if t in registry:
            return registry[t]


def ensure_directory(path):
    """Ensure that the parent directory of `path` exists"""
    dirname = os.path.dirname(path)
    if not os.path.isdir(dirname):
        os.makedirs(dirname)


def _bypass_ensure_directory(path):
    """Sandbox-bypassing version of ensure_directory()"""
    if not WRITE_SUPPORT:
        raise IOError('"os.mkdir" not supported on this platform.')
    dirname, filename = split(path)
    if dirname and filename and not isdir(dirname):
        _bypass_ensure_directory(dirname)
        mkdir(dirname, 0o755)


def split_sections(s):
    """Split a string or iterable thereof into (section, content) pairs

    Each ``section`` is a stripped version of the section header ("[section]")
    and each ``content`` is a list of stripped lines excluding blank lines and
    comment-only lines.  If there are any such lines before the first section
    header, they're returned in a first ``section`` of ``None``.
    """
    section = None
    content = []
    for line in yield_lines(s):
        if line.startswith("["):
            if line.endswith("]"):
                if section or content:
                    yield section, content
                section = line[1:-1].strip()
                content = []
            else:
                raise ValueError("Invalid section heading", line)
        else:
            content.append(line)

    # wrap up last segment
    yield section, content
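
# For example:
#
#     list(split_sections(['tzdata', '[docs]', 'sphinx', 'rst2pdf']))
#     # -> [(None, ['tzdata']), ('docs', ['sphinx', 'rst2pdf'])]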


def _mkstemp(*args, **kw):
    old_open = os.open
    try:
        # temporarily bypass sandboxing
        os.open = os_open
        return tempfile.mkstemp(*args, **kw)
    finally:
        # and then put it back
        os.open = old_open


# Silence the PEP440Warning by default, so that end users don't get hit by it
# randomly just because they use pkg_resources. We want to append the rule
# because we want earlier uses of filterwarnings to take precedence over this
# one.
warnings.filterwarnings("ignore", category=PEP440Warning, append=True)


# from jaraco.functools 1.3
def _call_aside(f, *args, **kwargs):
    f(*args, **kwargs)
    return f


@_call_aside
def _initialize(g=globals()):
    "Set up global resource manager (deliberately not state-saved)"
    manager = ResourceManager()
    g['_manager'] = manager
    for name in dir(manager):
        if not name.startswith('_'):
            g[name] = getattr(manager, name)


@_call_aside
def _initialize_master_working_set():
    """
    Prepare the master working set and make the ``require()``
    API available.

    This function has explicit effects on the global state
    of pkg_resources. It is intended to be invoked once at
    the initialization of this module.

    Invocation by other packages is unsupported and done
    at their own risk.
    """
    working_set = WorkingSet._build_master()
    _declare_state('object', working_set=working_set)

    require = working_set.require
    iter_entry_points = working_set.iter_entry_points
    add_activation_listener = working_set.subscribe
    run_script = working_set.run_script
    # backward compatibility
    run_main = run_script
    # Activate all distributions already on sys.path with replace=False and
    # ensure that all distributions added to the working set in the future
    # (e.g. by calling ``require()``) will get activated as well,
    # with higher priority (replace=True).
    dist = None  # ensure dist is defined for del dist below
    for dist in working_set:
        dist.activate(replace=False)
    del dist
    add_activation_listener(lambda dist: dist.activate(replace=True), existing=False)
    working_set.entries = []
    # match order
    list(map(working_set.add_entry, sys.path))
    globals().update(locals())
_sset_dict�sracCs|j�S)N)rZ)r]rrr�_sget_object�srbcCs|j|�dS)N)r[)r'r`rWrrr�_sset_object�srccGsdS)Nr)�argsrrr�<lambda>�srecCsbt�}tj|�}|dk	r^tjdkr^y&ddjt�dd��|jd�f}Wntk
r\YnX|S)aZReturn this platform's maximum compatible version.

    distutils.util.get_platform() normally reports the minimum version
    of Mac OS X that would be required to *use* extensions produced by
    distutils.  But what we want when checking compatibility is to know the
    version of Mac OS X that we are *running*.  To allow usage of packages that
    explicitly require a newer version of Mac OS X, we must also know the
    current version of the OS.

    If this condition occurs for any other platform with a version in its
    platform strings, this function should be extended accordingly.
    N�darwinzmacosx-%s-%sr,�r)	�get_build_platform�macosVersionString�match�sys�platform�join�_macosx_vers�group�
ValueError)�plat�mrrr�get_supported_platform�s

&rs�require�
run_script�get_provider�get_distribution�load_entry_point�
get_entry_map�get_entry_info�iter_entry_points�resource_string�resource_stream�resource_filename�resource_listdir�resource_exists�resource_isdir�declare_namespace�working_set�add_activation_listener�find_distributions�set_extraction_path�cleanup_resources�get_default_cache�Environment�
WorkingSet�ResourceManager�Distribution�Requirement�
EntryPoint�ResolutionError�VersionConflict�DistributionNotFound�UnknownExtra�ExtractionError�parse_requirements�	safe_name�safe_version�get_platform�compatible_platforms�yield_lines�split_sections�
safe_extra�to_filename�invalid_marker�evaluate_marker�ensure_directory�normalize_path�EGG_DIST�BINARY_DIST�SOURCE_DIST�
CHECKOUT_DIST�DEVELOP_DIST�IMetadataProvider�IResourceProvider�FileMetadata�PathMetadata�EggMetadata�
EmptyProvider�empty_provider�NullProvider�EggProvider�DefaultProvider�ZipProvider�register_finder�register_namespace_handler�register_loader_type�fixup_namespace_packagesr�run_main�AvailableDistributionsc@seZdZdZdd�ZdS)r�z.Abstract base for dependency resolution errorscCs|jjt|j�S)N)rr�reprrd)rrrr�__repr__IszResolutionError.__repr__N)rrrrr�rrrrr�Fsc@s<eZdZdZdZedd��Zedd��Zdd�Zd	d
�Z	dS)r�z�
    An already-installed version conflicts with the requested version.

    Should be initialized with the installed Distribution and the requested
    Requirement.
    z3{self.dist} is installed but {self.req} is requiredcCs
|jdS)Nr)rd)rrrr�distWszVersionConflict.distcCs
|jdS)Nr-)rd)rrrr�req[szVersionConflict.reqcCs|jjft��S)N)�	_template�format�locals)rrrr�report_szVersionConflict.reportcCs|s|S|j|f}t|�S)zt
        If required_by is non-empty, return a version of self that is a
        ContextualVersionConflict.
        )rd�ContextualVersionConflict)r�required_byrdrrr�with_contextbszVersionConflict.with_contextN)
rrrrr��propertyr�r�r�r�rrrrr�Msc@s&eZdZdZejdZedd��ZdS)r�z�
    A VersionConflict that accepts a third parameter, the set of the
    requirements that required the installed Distribution.
    z by {self.required_by}cCs
|jdS)Nrg)rd)rrrrr�usz%ContextualVersionConflict.required_byN)rrrrr�r�r�r�rrrrr�ms
r�c@sHeZdZdZdZedd��Zedd��Zedd��Zd	d
�Z	dd�Z
d
S)r�z&A requested distribution was not foundzSThe '{self.req}' distribution was not found and is required by {self.requirers_str}cCs
|jdS)Nr)rd)rrrrr��szDistributionNotFound.reqcCs
|jdS)Nr-)rd)rrrr�	requirers�szDistributionNotFound.requirerscCs|js
dSdj|j�S)Nzthe applicationz, )r�rm)rrrr�
requirers_str�sz"DistributionNotFound.requirers_strcCs|jjft��S)N)r�r�r�)rrrrr��szDistributionNotFound.reportcCs|j�S)N)r�)rrrr�__str__�szDistributionNotFound.__str__N)rrrrr�r�r�r�r�r�r�rrrrr�zsc@seZdZdZdS)r�z>Distribution doesn't have an "extra feature" of the given nameN)rrrrrrrrr��srgr-cCs|t|<dS)aRegister `provider_factory` to make providers for `loader_type`

    `loader_type` is the type or class of a PEP 302 ``module.__loader__``,
    and `provider_factory` is a function that, passed a *module* object,
    returns an ``IResourceProvider`` for that module.
    N)�_provider_factories)Zloader_typeZprovider_factoryrrrr��scCstt|t�r$tj|�p"tt|��dSytj|}Wn&tk
rXt	|�tj|}YnXt
|dd�}tt|�|�S)z?Return an IResourceProvider for the named module or requirementr�
__loader__N)
rr�r��findrtrFrk�modules�KeyError�
__import__�getattr�
_find_adapterr�)ZmoduleOrReq�module�loaderrrrrv�s
cCsd|s\tj�d}|dkrLd}tjj|�rLttd�rLtj|�}d|krL|d}|j|j	d��|dS)Nr�z0/System/Library/CoreServices/SystemVersion.plist�	readPlistZProductVersionr,)
rlZmac_ver�os�path�exists�hasattr�plistlibr�r;r)�_cacherKZplistZ
plist_contentrrrrn�s

rncCsddd�j||�S)NZppc)ZPowerPCZPower_Macintosh)rB)�machinerrr�_macosx_arch�sr�cCs�yddlm}Wn tk
r0ddlm}YnX|�}tjdkr�|jd�r�y<t�}tj	�dj
dd�}dt|d�t|d	�t|�fSt
k
r�YnX|S)
z�Return this platform's string for platform-specific distributions

    XXX Currently this is the same as ``distutils.util.get_platform()``, but it
    needs some hacks for Linux and Mac OS X.
    r)r�rfzmacosx-�� �_zmacosx-%d.%d-%sr-)�	sysconfigr��ImportErrorZdistutils.utilrkrlr9rnr��unamer5�intr�rp)r�rqrKr�rrrrh�srhzmacosx-(\d+)\.(\d+)-(.*)zdarwin-(\d+)\.(\d+)\.(\d+)-(.*)cCs�|dks|dks||krdStj|�}|r�tj|�}|s�tj|�}|r�t|jd��}d|jd�|jd�f}|dkr||dks�|dkr�|d	kr�dSd
S|jd�|jd�ks�|jd�|jd�kr�d
St|jd��t|jd��kr�d
SdSd
S)z�Can code for the `provided` platform run on the `required` platform?

    Returns true if either platform is ``None``, or the platforms are equal.

    XXX Needs compatibility checks for Linux and other unixy OSes.
    NTr-z%s.%srg�z10.3r/z10.4Fr)rirj�darwinVersionStringr�ro)ZprovidedZrequiredZreqMacZprovMacZ
provDarwinZdversionZmacosversionrrrr��s*


cCs<tjd�j}|d}|j�||d<t|�dj||�dS)z@Locate distribution `dist_spec` and run its `script_name` scriptr-rrN)rk�	_getframe�	f_globalsr_rtru)Z	dist_spec�script_name�ns�namerrrrus
cCs@t|tj�rtj|�}t|t�r(t|�}t|t�s<td|��|S)z@Return a current distribution object for a Requirement or stringz-Expected string, Requirement, or Distribution)rr�string_typesr��parservr��	TypeError)r�rrrrw)s



cCst|�j||�S)zDReturn `name` entry point of `group` for `dist` or raise ImportError)rwrx)r�ror�rrrrx4scCst|�j|�S)z=Return the entry point map for `group`, or the full entry map)rwry)r�rorrrry9scCst|�j||�S)z<Return the EntryPoint object for `group`+`name`, or ``None``)rwrz)r�ror�rrrrz>sc@s<eZdZdd�Zdd�Zdd�Zdd�Zd	d
�Zdd�Zd
S)r�cCsdS)z;Does the package's distribution contain the named metadata?Nr)r�rrr�has_metadataDszIMetadataProvider.has_metadatacCsdS)z'The named metadata resource as a stringNr)r�rrr�get_metadataGszIMetadataProvider.get_metadatacCsdS)z�Yield named metadata resource as list of non-blank non-comment lines

       Leading and trailing whitespace is stripped from each line, and lines
       with ``#`` as the first non-blank character are omitted.Nr)r�rrr�get_metadata_linesJsz$IMetadataProvider.get_metadata_linescCsdS)z>Is the named metadata a directory?  (like ``os.path.isdir()``)Nr)r�rrr�metadata_isdirPsz IMetadataProvider.metadata_isdircCsdS)z?List of metadata names in the directory (like ``os.listdir()``)Nr)r�rrr�metadata_listdirSsz"IMetadataProvider.metadata_listdircCsdS)z=Execute the named script in the supplied namespace dictionaryNr)r��	namespacerrrruVszIMetadataProvider.run_scriptN)	rrrr�r�r�r�r�rurrrrr�Csc@s@eZdZdZdd�Zdd�Zdd�Zdd	�Zd
d�Zdd
�Z	dS)r�z3An object that provides access to package resourcescCsdS)zdReturn a true filesystem path for `resource_name`

        `manager` must be an ``IResourceManager``Nr)�manager�
resource_namerrr�get_resource_filename]sz'IResourceProvider.get_resource_filenamecCsdS)ziReturn a readable file-like object for `resource_name`

        `manager` must be an ``IResourceManager``Nr)r�r�rrr�get_resource_streambsz%IResourceProvider.get_resource_streamcCsdS)zmReturn a string containing the contents of `resource_name`

        `manager` must be an ``IResourceManager``Nr)r�r�rrr�get_resource_stringgsz%IResourceProvider.get_resource_stringcCsdS)z,Does the package contain the named resource?Nr)r�rrr�has_resourcelszIResourceProvider.has_resourcecCsdS)z>Is the named resource a directory?  (like ``os.path.isdir()``)Nr)r�rrrr�osz IResourceProvider.resource_isdircCsdS)z?List of resource names in the directory (like ``os.listdir()``)Nr)r�rrrrrsz"IResourceProvider.resource_listdirN)
rrrrr�r�r�r�r�rrrrrr�Zsc@s�eZdZdZd'dd�Zedd��Zedd��Zd	d
�Zdd�Z	d
d�Z
d(dd�Zdd�Zdd�Z
d)dd�Zd*dd�Zd+dd�Zdd�Zd,dd �Zd!d"�Zd#d$�Zd%d&�ZdS)-r�zDA collection of active distributions on sys.path (or a similar list)NcCsBg|_i|_i|_g|_|dkr&tj}x|D]}|j|�q,WdS)z?Create working set from list of path entries (default=sys.path)N)�entries�
entry_keys�by_key�	callbacksrkr��	add_entry)rr��entryrrr�__init__ys
zWorkingSet.__init__cCsZ|�}yddlm}Wntk
r*|SXy|j|�Wntk
rT|j|�SX|S)z1
        Prepare the master working set.
        r)�__requires__)�__main__r�r�rtr��_build_from_requirements)�cls�wsr�rrr�
_build_master�szWorkingSet._build_mastercCsn|g�}t|�}|j|t��}x|D]}|j|�q$Wx"tjD]}||jkr>|j|�q>W|jtjdd�<|S)zQ
        Build a working set from a requirement spec. Rewrites sys.path.
        N)r��resolver��addrkr�r�r�)rZreq_specr�reqs�distsr�r�rrrr�s

z#WorkingSet._build_from_requirementscCs@|jj|g�|jj|�x t|d�D]}|j||d�q&WdS)a�Add a path item to ``.entries``, finding any distributions on it

        ``find_distributions(entry, True)`` is used to find distributions
        corresponding to the path entry, and they are added.  `entry` is
        always appended to ``.entries``, even if it is already present.
        (This is because ``sys.path`` can contain the same value more than
        once, and the ``.entries`` of the ``sys.path`` WorkingSet should always
        equal ``sys.path``.)
        TFN)r��
setdefaultr�r;r�r)rr�r�rrrr��s
zWorkingSet.add_entrycCs|jj|j�|kS)z9True if `dist` is the active distribution for its project)r�rBr')rr�rrr�__contains__�szWorkingSet.__contains__cCs,|jj|j�}|dk	r(||kr(t||��|S)a�Find a distribution matching requirement `req`

        If there is an active distribution for the requested project, this
        returns it as long as it meets the version requirement specified by
        `req`.  But, if there is an active distribution for the project and it
        does *not* meet the `req` requirement, ``VersionConflict`` is raised.
        If there is no active distribution for the requested project, ``None``
        is returned.
        N)r�rBr'r�)rr�r�rrrr��s

zWorkingSet.findccsPxJ|D]B}|j|�}|dkr6x*|j�D]
}|Vq&Wq||kr||VqWdS)aYield entry point objects from `group` matching `name`

        If `name` is None, yields all entry points in `group` from all
        distributions in the working set, otherwise only ones matching
        both `group` and `name` are yielded (in distribution order).
        N)ry�values)rror�r�r��eprrrr{�s

zWorkingSet.iter_entry_pointscCs>tjd�j}|d}|j�||d<|j|�dj||�dS)z?Locate distribution for `requires` and run `script_name` scriptr-rrN)rkr�r�r_rtru)r�requiresr�r�r�rrrru�s
zWorkingSet.run_scriptccsTi}xJ|jD]@}||jkrqx.|j|D] }||kr(d||<|j|Vq(WqWdS)z�Yield distributions for non-duplicate projects in the working set

        The yield order is the order in which the items' path entries were
        added to the working set.
        r-N)r�r�r�)r�seen�itemr'rrrrG�s
zWorkingSet.__iter__TFcCs�|r|j|j||d�|dkr$|j}|jj|g�}|jj|jg�}|rX|j|jkrXdS||j|j<|j|krz|j|j�|j|kr�|j|j�|j|�dS)aAdd `dist` to working set, associated with `entry`

        If `entry` is unspecified, it defaults to the ``.location`` of `dist`.
        On exit from this routine, `entry` is added to the end of the working
        set's ``.entries`` (if it wasn't already present).

        `dist` is only added to the working set if it's for a project that
        doesn't already have a distribution in the set, unless `replace=True`.
        If it's added, any callbacks registered with the ``subscribe()`` method
        will be called.
        )r5N)	�	insert_onr��locationr�rr'r�r;�
_added_new)rr�r��insertr5�keysZkeys2rrrr�s

zWorkingSet.addcCs|t|�ddd�}i}i}g}t�}tjt�}	�xF|�rv|jd�}
|
|krLq2|j|
�sXq2|j|
j�}|dk�r|j	j|
j�}|dks�||
kr�|r�|}|dkr�|dkr�t
|j�}nt
g�}tg�}|j
|
||�}||
j<|dkr�|	j|
d�}
t|
|
��|j|�||
k�r"|	|
}t||
�j|��|j|
j�ddd�}|j|�x(|D] }|	|j|
j�|
j||<�qHWd||
<q2W|S)aeList all distributions needed to (recursively) meet `requirements`

        `requirements` must be a sequence of ``Requirement`` objects.  `env`,
        if supplied, should be an ``Environment`` instance.  If
        not supplied, it defaults to all distributions available within any
        entry or distribution in the working set.  `installer`, if supplied,
        will be invoked with each requirement that cannot be met by an
        already-installed distribution; it should return a ``Distribution`` or
        ``None``.

        Unless `replace_conflicting=True`, raises a VersionConflict exception if
        any requirements are found on the path that have the correct name but
        the wrong version.  Otherwise, if an `installer` is supplied it will be
        invoked to obtain the correct version of the requirement and activate
        it.
        Nr-rTr7r7)�list�
_ReqExtras�collections�defaultdict�setr:�markers_passrBr'r�r�r�r��
best_matchr�r;r�r�r�extras�extendr�project_name)r�requirements�env�	installerZreplace_conflictingZ	processedZbestZto_activateZ
req_extrasr�r�r�rr�Z
dependent_reqZnew_requirementsZnew_requirementrrrrsJ









zWorkingSet.resolvecCst|�}|j�i}i}|dkr4t|j�}||7}n||}|jg�}	tt|	j|��x�|D]�}
x�||
D]x}|j�g}y|	j|||�}
Wn4t	k
r�}z|||<|r�wjnPWYdd}~XqjXtt|	j|
��|j
tj|
��PqjWq\Wt|�}|j�||fS)asFind all activatable distributions in `plugin_env`

        Example usage::

            distributions, errors = working_set.find_plugins(
                Environment(plugin_dirlist)
            )
            # add plugins+libs to sys.path
            map(working_set.add, distributions)
            # display errors
            print('Could not load', errors)

        The `plugin_env` should be an ``Environment`` instance that contains
        only distributions that are in the project's "plugin directory" or
        directories. The `full_env`, if supplied, should be an ``Environment``
        contains all currently-available distributions.  If `full_env` is not
        supplied, one is created automatically from the ``WorkingSet`` this
        method is called on, which will typically mean that every directory on
        ``sys.path`` will be scanned for distributions.

        `installer` is a standard installer callback as used by the
        ``resolve()`` method. The `fallback` flag indicates whether we should
        attempt to resolve older versions of a plugin if the newest version
        cannot be resolved.

        This method returns a 2-tuple: (`distributions`, `error_info`), where
        `distributions` is a list of the distributions found in `plugin_env`
        that were loadable, along with any other distributions that are needed
        to resolve their dependencies.  `error_info` is a dictionary mapping
        unloadable plugin distributions to an exception instance describing the
        error that occurred. Usually this will be a ``DistributionNotFound`` or
        ``VersionConflict`` instance.
        N)
r�sortr�r�rrr�as_requirementrr�rPrRrS)rZ
plugin_envZfull_envr ZfallbackZplugin_projectsZ
error_infoZ
distributionsrZ
shadow_setrr�r�Z	resolveesrMrrr�find_pluginsks4$





zWorkingSet.find_pluginscGs*|jt|��}x|D]}|j|�qW|S)a�Ensure that distributions matching `requirements` are activated

        `requirements` must be a string or a (possibly-nested) sequence
        thereof, specifying the distributions and versions required.  The
        return value is a sequence of the distributions that needed to be
        activated to fulfill the requirements; all relevant distributions are
        included, even if they were already activated in this working set.
        )rr�r)rrZneededr�rrrrt�s	
zWorkingSet.requirecCs<||jkrdS|jj|�|s"dSx|D]}||�q(WdS)z�Invoke `callback` for all distributions

        If `existing=True` (default),
        call on all existing ones, as well.
        N)r�r;)r�callback�existingr�rrr�	subscribe�s

zWorkingSet.subscribecCsx|jD]}||�qWdS)N)r�)rr�r$rrrr�szWorkingSet._added_newcCs,|jdd�|jj�|jj�|jdd�fS)N)r�r�r\r�r�)rrrrrZ�szWorkingSet.__getstate__cCs@|\}}}}|dd�|_|j�|_|j�|_|dd�|_dS)N)r�r\r�r�r�)rZe_k_b_cr�rr�r�rrrr[�s


zWorkingSet.__setstate__)N)N)NTF)NNF)NNT)T)rrrrr��classmethodrrr�r	r�r{rurGrrr#rtr&rrZr[rrrrr�vs(




Q
S
c@seZdZdZdd�ZdS)rz>
    Map each requirement to the extras that demanded it.
    cs.�fdd�|j�f�dD�}�jp,t|�S)z�
        Evaluate markers for req against each extra that
        demanded it.

        Return False if the req has a marker and fails
        evaluation. Otherwise, return True.
        c3s|]}�jjd|i�VqdS)�extraN)�marker�evaluate)�.0r()r�rr�	<genexpr>�sz*_ReqExtras.markers_pass.<locals>.<genexpr>N)N)rBr)�any)rr�Zextra_evalsr)r�rr�s	
z_ReqExtras.markers_passN)rrrrrrrrrr�src@sxeZdZdZde�efdd�Zdd�Zdd�Zdd	d
�Z	dd�Z
d
d�Zddd�Zddd�Z
dd�Zdd�Zdd�ZdS)r�z5Searchable snapshot of distributions on a search pathNcCs i|_||_||_|j|�dS)a!Snapshot distributions available on a search path

        Any distributions found on `search_path` are added to the environment.
        `search_path` should be a sequence of ``sys.path`` items.  If not
        supplied, ``sys.path`` is used.

        `platform` is an optional string specifying the name of the platform
        that platform-specific distributions must be compatible with.  If
        unspecified, it defaults to the current platform.  `python` is an
        optional string naming the desired version of Python (e.g. ``'3.3'``);
        it defaults to the current version.

        You may explicitly set `platform` (and/or `python`) to ``None`` if you
        wish to map *all* distributions, not just those compatible with the
        running platform or Python version.
        N)�_distmaprl�python�scan)r�search_pathrlr/rrrr�szEnvironment.__init__cCs.|jdks |jdks |j|jko,t|j|j�S)z�Is distribution `dist` acceptable for this environment?

        The distribution must match the platform and python version
        requirements specified when this environment was created, or False
        is returned.
        N)r/�
py_versionr�rl)rr�rrr�can_addszEnvironment.can_addcCs|j|jj|�dS)z"Remove `dist` from the environmentN)r.r'�remove)rr�rrrr4(szEnvironment.removecCs<|dkrtj}x(|D] }xt|�D]}|j|�q"WqWdS)adScan `search_path` for distributions usable in this environment

        Any distributions found are added to the environment.
        `search_path` should be a sequence of ``sys.path`` items.  If not
        supplied, ``sys.path`` is used.  Only distributions conforming to
        the platform/python version defined at initialization are added.
        N)rkr�r�r)rr1rr�rrrr0,s

zEnvironment.scancCs|j�}|jj|g�S)aReturn a newest-to-oldest list of distributions for `project_name`

        Uses case-insensitive `project_name` comparison, assuming all the
        project's distributions use their project's name converted to all
        lowercase as their key.

        )r8r.rB)rrZdistribution_keyrrrr(;szEnvironment.__getitem__cCsL|j|�rH|j�rH|jj|jg�}||krH|j|�|jtjd�dd�dS)zLAdd `dist` if we ``can_add()`` it and it has not already been added
        �hashcmpT)r'�reverseN)	r3�has_versionr.rr'r;r!�operator�
attrgetter)rr�rrrrrFs

zEnvironment.addcCsB|j|�}|dk	r|Sx||jD]}||kr"|Sq"W|j||�S)a�Find distribution best matching `req` and usable on `working_set`

        This calls the ``find(req)`` method of the `working_set` to see if a
        suitable distribution is already active.  (This may raise
        ``VersionConflict`` if an unsuitable version of the project is already
        active in the specified `working_set`.)  If a suitable distribution
        isn't active, this method returns the newest distribution in the
        environment that meets the ``Requirement`` in `req`.  If no suitable
        distribution is found, and `installer` is supplied, then the result of
        calling the environment's ``obtain(req, installer)`` method will be
        returned.
        N)r�r'�obtain)rr�r�r r�rrrrOs
zEnvironment.best_matchcCs|dk	r||�SdS)a�Obtain a distribution matching `requirement` (e.g. via download)

        Obtain a distro that matches requirement (e.g. via download).  In the
        base ``Environment`` class, this routine just returns
        ``installer(requirement)``, unless `installer` is None, in which case
        None is returned instead.  This method is a hook that allows subclasses
        to attempt other ways of obtaining a distribution before falling back
        to the `installer` argument.Nr)rZrequirementr rrrr:es	zEnvironment.obtainccs&x |jj�D]}||r|VqWdS)z=Yield the unique project names of the available distributionsN)r.r)rr'rrrrGqszEnvironment.__iter__cCs^t|t�r|j|�nDt|t�rLx8|D] }x||D]}|j|�q4Wq&Wntd|f��|S)z2In-place addition of a distribution or environmentzCan't add %r to environment)rr�rr�r�)rr!Zprojectr�rrr�__iadd__ws


zEnvironment.__iadd__cCs.|jgddd�}x||fD]}||7}qW|S)z4Add an environment or distribution to an environmentN)rlr/)r)rr!�newrrrr�__add__�szEnvironment.__add__)N)N)N)rrrrrs�PY_MAJORr�r3r4r0r(rrr:rGr;r=rrrrr�s
	

c@seZdZdZdS)r�aTAn error occurred extracting a resource

    The following attributes are available from instances of this exception:

    manager
        The resource manager that raised this exception

    cache_path
        The base directory for resource extraction

    original_error
        The exception instance that caused extraction to fail
    N)rrrrrrrrr��s
c@s�eZdZdZdZdd�Zdd�Zdd�Zd	d
�Zdd�Z	d
d�Z
dd�Zdd�Zffdd�Z
edd��Zdd�Zdd�Zddd�ZdS)r�z'Manage resource extraction and packagesNcCs
i|_dS)N)�cached_files)rrrrr��szResourceManager.__init__cCst|�j|�S)zDoes the named resource exist?)rvr�)r�package_or_requirementr�rrrr��szResourceManager.resource_existscCst|�j|�S)z,Is the named resource an existing directory?)rvr�)rr@r�rrrr��szResourceManager.resource_isdircCst|�j||�S)z4Return a true filesystem path for specified resource)rvr�)rr@r�rrrr~�sz!ResourceManager.resource_filenamecCst|�j||�S)z9Return a readable file-like object for specified resource)rvr�)rr@r�rrrr}�szResourceManager.resource_streamcCst|�j||�S)z%Return specified resource as a string)rvr�)rr@r�rrrr|�szResourceManager.resource_stringcCst|�j|�S)z1List the contents of the named resource directory)rvr)rr@r�rrrr�sz ResourceManager.resource_listdircCsRtj�d}|jpt�}tjd�j�}t|jft	���}||_
||_||_|�dS)z5Give an error message for problems extracting file(s)r-a
            Can't extract file(s) to egg cache

            The following error occurred while trying to extract file(s) to the Python egg
            cache:

              {old_exc}

            The Python egg cache directory is currently set to:

              {cache_path}

            Perhaps your account does not have write access to this directory?  You can
            change the cache directory by setting the PYTHON_EGG_CACHE environment
            variable to point to an accessible directory.
            N)
rk�exc_info�extraction_pathr��textwrap�dedent�lstripr�r�r�r��
cache_pathZoriginal_error)r�old_excrF�tmpl�errrrr�extraction_error�s
z ResourceManager.extraction_errorc	Cs^|jp
t�}tjj||df|��}yt|�Wn|j�YnX|j|�d|j|<|S)a�Return absolute location in cache for `archive_name` and `names`

        The parent directory of the resulting path will be created if it does
        not already exist.  `archive_name` should be the base filename of the
        enclosing egg (which may not be the name of the enclosing zipfile!),
        including its ".egg" extension.  `names`, if provided, should be a
        sequence of path name parts "under" the egg's extraction location.

        This method should only be called by resource providers that need to
        obtain an extraction location, and only for names they intend to
        extract, as it tracks the generated names for possible cleanup later.
        z-tmpr-)	rBr�r�r�rm�_bypass_ensure_directoryrJ�_warn_unsafe_extraction_pathr?)rZarchive_name�namesZextract_pathZtarget_pathrrr�get_cache_path�s


zResourceManager.get_cache_pathcCsXtjdkr |jtjd�r dStj|�j}|tj@s@|tj@rTd|}tj	|t
�dS)aN
        If the default extraction path is overridden and set to an insecure
        location, such as /tmp, it opens up an opportunity for an attacker to
        replace an extracted file with an unauthorized payload. Warn the user
        if a known insecure location is used.

        See Distribute #375 for more details.
        �ntZwindirNz�%s is writable by group/others and vulnerable to attack when used with get_resource_filename. Consider a more secure location (set with .set_extraction_path or the PYTHON_EGG_CACHE environment variable).)r�r�r9�environ�stat�st_mode�S_IWOTH�S_IWGRPrCrD�UserWarning)r��mode�msgrrrrL�s
z,ResourceManager._warn_unsafe_extraction_pathcCs.tjdkr*tj|�jdBd@}tj||�dS)a4Perform any platform-specific postprocessing of `tempname`

        This is where Mac header rewrites should be done; other platforms don't
        have anything special they should do.

        Resource providers should call this method ONLY after successfully
        extracting a compressed resource.  They must NOT call it on resources
        that are already in the filesystem.

        `tempname` is the current (temporary) name of the file, and `filename`
        is the name it will be renamed to by the caller after this routine
        returns.
        �posiximi�N)r�r�rQrR�chmod)rZtempname�filenamerVrrr�postprocesss
zResourceManager.postprocesscCs|jrtd��||_dS)a�Set the base path where resources will be extracted to, if needed.

        If you do not call this routine before any extractions take place, the
        path defaults to the return value of ``get_default_cache()``.  (Which
        is based on the ``PYTHON_EGG_CACHE`` environment variable, with various
        platform-specific fallbacks.  See that routine's documentation for more
        details.)

        Resources are extracted to subdirectories of this path based upon
        information given by the ``IResourceProvider``.  You may set this to a
        temporary directory, but then you must call ``cleanup_resources()`` to
        delete the extracted files when done.  There is no guarantee that
        ``cleanup_resources()`` will be able to remove all extracted files.

        (Note: you may not change the extraction path for a given resource
        manager once resources have been extracted, unless you first call
        ``cleanup_resources()``.)
        z5Can't change extraction path, files already extractedN)r?rprB)rr�rrrr�)sz#ResourceManager.set_extraction_pathFcCsdS)aB
        Delete all extracted resource files and directories, returning a list
        of the file and directory names that could not be successfully removed.
        This function does not have any concurrency protection, so it should
        generally only be called when the extraction path is a temporary
        directory exclusive to a single process.  This method is not
        automatically called; you must call it explicitly or register it as an
        ``atexit`` function if you wish to ensure cleanup of a temporary
        directory used for extractions.
        Nr)r�forcerrrr�Csz!ResourceManager.cleanup_resources)F)rrrrrBr�r�r�r~r}r|rrJrN�staticmethodrLr[r�r�rrrrr��scCstjjd�ptjdd�S)z�
    Return the ``PYTHON_EGG_CACHE`` environment variable
    or a platform-relevant user cache dir for an app
    named "Python-Eggs".
    ZPYTHON_EGG_CACHEzPython-Eggs)Zappname)r�rPrBrZuser_cache_dirrrrrr�QscCstjdd|�S)z�Convert an arbitrary string to a standard distribution name

    Any runs of non-alphanumeric/. characters are replaced with a single '-'.
    z[^A-Za-z0-9.]+r+)r?�sub)r�rrrr�]scCsDyttjj|��Stjjk
r>|jdd�}tjdd|�SXdS)zB
    Convert an arbitrary string to a standard version string
    r�r,z[^A-Za-z0-9.]+r+N)rFrrK�VersionrLr5r?r^)rKrrrr�es
cCstjdd|�j�S)z�Convert an arbitrary string to a standard 'extra' name

    Any runs of non-alphanumeric characters are replaced with a single '_',
    and the result is always lowercased.
    z[^A-Za-z0-9.-]+r�)r?r^r8)r(rrrr�qscCs|jdd�S)z|Convert a project or version name to its filename-escaped form

    Any '-' characters are currently replaced with '_'.
    r+r�)r5)r�rrrr�zscCs>yt|�Wn,tk
r8}zd|_d|_|Sd}~XnXdS)zo
    Validate text as a PEP 508 environment marker; return an exception
    if invalid or False otherwise.
    NF)r��SyntaxErrorrZ�lineno)�text�errrr��scCsHytjj|�}|j�Stjjk
rB}zt|��WYdd}~XnXdS)z�
    Evaluate a PEP 508 environment marker.
    Return a boolean indicating the marker result in this environment.
    Raise SyntaxError if marker is invalid.

    This implementation uses the 'pyparsing' module.
    N)rZmarkersZMarkerr*Z
InvalidMarkerr`)rbr(r)rcrrrr��s
c@s�eZdZdZdZdZdZdd�Zdd�Zdd�Z	d	d
�Z
dd�Zd
d�Zdd�Z
dd�Zdd�Zdd�Zdd�Zdd�Zdd�Zdd�Zdd �Zd!d"�Zd#d$�Zd%d&�ZdS)'r�zETry to implement resources and metadata for arbitrary PEP 302 loadersNcCs(t|dd�|_tjjt|dd��|_dS)Nr��__file__r�)r�r�r�r��dirname�module_path)rr�rrrr��szNullProvider.__init__cCs|j|j|�S)N)�_fnrf)rr�r�rrrr��sz"NullProvider.get_resource_filenamecCstj|j||��S)N)�io�BytesIOr�)rr�r�rrrr��sz NullProvider.get_resource_streamcCs|j|j|j|��S)N)�_getrgrf)rr�r�rrrr��sz NullProvider.get_resource_stringcCs|j|j|j|��S)N)�_hasrgrf)rr�rrrr��szNullProvider.has_resourcecCs|jo|j|j|j|��S)N)�egg_inforkrg)rr�rrrr��szNullProvider.has_metadatacCs2|js
dS|j|j|j|��}tjr.|jd�S|S)Nr�zutf-8)rlrjrgrZPY3�decode)rr��valuerrrr��szNullProvider.get_metadatacCst|j|��S)N)r�r�)rr�rrrr��szNullProvider.get_metadata_linescCs|j|j|j|��S)N)�_isdirrgrf)rr�rrrr��szNullProvider.resource_isdircCs|jo|j|j|j|��S)N)rlrorg)rr�rrrr��szNullProvider.metadata_isdircCs|j|j|j|��S)N)�_listdirrgrf)rr�rrrr�szNullProvider.resource_listdircCs|jr|j|j|j|��SgS)N)rlrprg)rr�rrrr��szNullProvider.metadata_listdirc
Cs�d|}|j|�std|��|j|�jdd�}|jdd�}|j|j|�}||d<tjj|�r�t	|�j
�}t||d�}t|||�n>dd	l
m}t|�d|jd�|f||<t||d�}	t|	||�dS)
Nzscripts/zNo script named %rz
�
�
rd�execr)�cache)r�r�r�r5rgrlr�r�r�r�readr@rs�	linecachert�lenr)
rr�r�ZscriptZscript_textZscript_filename�source�codertZscript_coderrrru�s
zNullProvider.run_scriptcCstd��dS)Nz9Can't perform this operation for unregistered loader type)�NotImplementedError)rr�rrrrk�szNullProvider._hascCstd��dS)Nz9Can't perform this operation for unregistered loader type)rz)rr�rrrro�szNullProvider._isdircCstd��dS)Nz9Can't perform this operation for unregistered loader type)rz)rr�rrrrp�szNullProvider._listdircCs |rtjj|f|jd���S|S)N�/)r�r�rmr)r�baser�rrrrg�szNullProvider._fncCs$t|jd�r|jj|�Std��dS)N�get_dataz=Can't perform this operation for loaders without 'get_data()')r�r�r}rz)rr�rrrrj�szNullProvider._get)rrrr�egg_namerlr�r�r�r�r�r�r�r�r�r�r�rr�rurkrorprgrjrrrrr��s,c@s eZdZdZdd�Zdd�ZdS)r�z&Provider based on a virtual filesystemcCstj||�|j�dS)N)r�r��
_setup_prefix)rr�rrrr�szEggProvider.__init__cCs^|j}d}xN||krXt|�rBtjj|�|_tjj|d�|_||_P|}tjj	|�\}}qWdS)NzEGG-INFO)
rf�_is_unpacked_eggr�r��basenamer~rmrl�egg_rootr)rr��oldr|rrrr
s
zEggProvider._setup_prefixN)rrrrr�rrrrrr�sc@sDeZdZdZdd�Zdd�Zdd�Zdd	�Zd
d�Ze	dd
��Z
dS)r�z6Provides access to package resources in the filesystemcCstjj|�S)N)r�r�r�)rr�rrrrkszDefaultProvider._hascCstjj|�S)N)r�r�r
)rr�rrrroszDefaultProvider._isdircCs
tj|�S)N)r��listdir)rr�rrrrp"szDefaultProvider._listdircCst|j|j|�d�S)N�rb)rrgrf)rr�r�rrrr�%sz#DefaultProvider.get_resource_streamc	Cst|d��
}|j�SQRXdS)Nr�)rru)rr��streamrrrrj(szDefaultProvider._getcCsttdtd��}t||�dS)N�SourceFileLoader)r��importlib_machinery�typer�)rZ
loader_clsrrr�	_register,s
zDefaultProvider._registerN)rrrrrkrorpr�rjr'r�rrrrr�sc@s8eZdZdZdd�ZZdd�Zdd�ZdZdd�Z	dS)	r�z.Provider that returns nothing for all requestscCsdS)NFr)rr�rrrre9szEmptyProvider.<lambda>cCsdS)Nr�r)rr�rrrre:scCsgS)Nr)rr�rrrre;sNcCsdS)Nr)rrrrr�>szEmptyProvider.__init__)
rrrrrorkrjrprfr�rrrrr�6sc@s eZdZdZedd��ZeZdS)�ZipManifestsz
    zip manifest builder
    c
s2t|�� ��fdd��j�D�}t|�SQRXdS)a
        Build a dictionary similar to the zipimport directory
        caches, except instead of tuples, store ZipInfo objects.

        Use a platform-specific path separator (os.sep) for the path keys
        for compatibility with pypy on Windows.
        c3s&|]}|jdtj��j|�fVqdS)r{N)r5r��sepZgetinfo)r+r�)�zfilerrr,Usz%ZipManifests.build.<locals>.<genexpr>N)�ContextualZipFileZnamelistrR)rr�rVr)r�r�buildJs	

zZipManifests.buildN)rrrrr'r��loadrrrrr�Esr�c@s$eZdZdZejdd�Zdd�ZdS)�MemoizedZipManifestsz%
    Memoized zipfile manifests.
    �manifest_modzmanifest mtimecCsRtjj|�}tj|�j}||ks.||j|krH|j|�}|j||�||<||jS)zW
        Load a manifest at path or return a suitable manifest already loaded.
        )	r�r��normpathrQ�st_mtime�mtimer�r��manifest)rr�r�r�rrrr�fs
zMemoizedZipManifests.loadN)rrrrr�
namedtupler�r�rrrrr�`sr�cs0eZdZdZdd�Zdd�Z�fdd�Z�ZS)r�zL
    Supplement ZipFile class to support context manager for Python 2.6
    cCs|S)Nr)rrrr�	__enter__yszContextualZipFile.__enter__cCs|j�dS)N)�close)rr�rn�	tracebackrrr�__exit__|szContextualZipFile.__exit__cs(ttjd�rtj||�Stt|�j|�S)zI
        Construct a ZipFile or ContextualZipFile as appropriate
        r�)r��zipfile�ZipFilerr��__new__)rrd�kwargs)rrrr�szContextualZipFile.__new__)rrrrr�r�r�rHrr)rrr�tsr�c@s�eZdZdZdZe�Zdd�Zdd�Zdd�Z	e
d	d
��Zdd�Ze
d
d��Zdd�Zdd�Zdd�Zdd�Zdd�Zdd�Zdd�Zdd�Zdd �ZdS)!r�z"Resource support for zips and eggsNcCs tj||�|jjtj|_dS)N)r�r�r��archiver�r��zip_pre)rr�rrrr��szZipProvider.__init__cCs4|j|j�r|t|j�d�Std||jf��dS)Nz%s is not a subpath of %s)r9r�rw�AssertionError)r�fspathrrr�
_zipinfo_name�szZipProvider._zipinfo_namecCsP|j|}|j|jtj�r:|t|j�dd�jtj�Std||jf��dS)Nr-z%s is not a subpath of %s)r�r9r�r�r�rwrr�)r�zip_pathr�rrr�_parts�s

zZipProvider._partscCs|jj|jj�S)N)�_zip_manifestsr�r�r�)rrrr�zipinfo�szZipProvider.zipinfocCs`|jstd��|j|�}|j�}dj|j|��|krTx|D]}|j||j|��q:W|j||�S)Nz5resource_filename() only supported for .egg, not .zipr{)r~rz�_resource_to_zip�_get_eager_resourcesrmr��_extract_resource�
_eager_to_zip)rr�r�r��eagersr�rrrr��s

z!ZipProvider.get_resource_filenamecCs"|j}|jd}tj|�}||fS)Nrr-r7)rrr7)Z	file_size�	date_time�timeZmktime)Zzip_stat�sizer��	timestamprrr�_get_date_and_size�s

zZipProvider._get_date_and_sizec
Csn||j�krDx*|j�|D]}|j|tjj||��}qWtjj|�S|j|j|�\}}tsdt	d��y�|j
|j|j|��}|j
||�r�|Stdtjj|�d�\}}	tj||jj|��tj|�t|	||f�|j|	|�yt|	|�Wn\tjk
�rDtjj|��r>|j
||��r|Stjdk�r>t|�t|	|�|S�YnXWn tjk
�rh|j�YnX|S)Nz>"os.rename" and "os.unlink" are not supported on this platformz	.$extract)�dirrO)�_indexr�r�r�rmrer�r��
WRITE_SUPPORT�IOErrorrNr~r��_is_current�_mkstemp�writer�r}r�rr[r
�error�isfiler�rrJ)
rr�r�r�Zlastr�r�Z	real_pathZoutfZtmpnamrrrr��s@

zZipProvider._extract_resourcec		Csx|j|j|�\}}tjj|�s$dStj|�}|j|ksB|j|krFdS|jj	|�}t
|d��}|j�}WdQRX||kS)zK
        Return True if the file_path is current for this zip_path
        Fr�N)r�r�r�r�r�rQ�st_sizer�r�r}rru)	rZ	file_pathr�r�r�rQZzip_contents�fZ
file_contentsrrrr��s
zZipProvider._is_currentcCsB|jdkr<g}x&dD]}|j|�r|j|j|��qW||_|jS)N�native_libs.txt�eager_resources.txt)r�r�)r�r�rr�)rr�r�rrrr�s


z ZipProvider._get_eager_resourcescCs�y|jStk
r�i}xd|jD]Z}|jtj�}xH|rztjj|dd��}||krj||j|d�Pq4|j�g||<q4Wq"W||_|SXdS)Nr-r7r7)	Z	_dirindex�AttributeErrorr�rr�r�rmr;r:)rZindr�r<�parentrrrr�szZipProvider._indexcCs |j|�}||jkp||j�kS)N)r�r�r�)rr�r�rrrrks
zZipProvider._hascCs|j|�|j�kS)N)r�r�)rr�rrrro!szZipProvider._isdircCst|j�j|j|�f��S)N)rr�rBr�)rr�rrrrp$szZipProvider._listdircCs|j|j|j|��S)N)r�rgr�)rr�rrrr�'szZipProvider._eager_to_zipcCs|j|j|j|��S)N)r�rgrf)rr�rrrr�*szZipProvider._resource_to_zip)rrrrr�r�r�r�r�r�r�r�r�r]r�r�r�r�r�rkrorpr�r�rrrrr��s$	

	4	c@s8eZdZdZdd�Zdd�Zdd�Zdd	�Zd
d�ZdS)
r�a*Metadata handler for standalone PKG-INFO files

    Usage::

        metadata = FileMetadata("/path/to/PKG-INFO")

    This provider rejects all data and metadata requests except for PKG-INFO,
    which is treated as existing, and will be the contents of the file at
    the provided location.
    cCs
||_dS)N)r�)rr�rrrr�=szFileMetadata.__init__cCs|dkotjj|j�S)NzPKG-INFO)r�r�r�)rr�rrrr�@szFileMetadata.has_metadatac	CsD|dkrtd��tj|jddd��}|j�}WdQRX|j|�|S)NzPKG-INFOz(No metadata except PKG-INFO is availablezutf-8r5)�encoding�errors)r�rhrr�ru�_warn_on_replacement)rr�r��metadatarrrr�Cs
zFileMetadata.get_metadatacCs2djd�}||kr.d}|jft��}tj|�dS)Ns�zutf-8z2{self.path} could not be properly decoded in UTF-8)rmr�r�rCrD)rr�Zreplacement_charrHrWrrrr�Ls

z!FileMetadata._warn_on_replacementcCst|j|��S)N)r�r�)rr�rrrr�TszFileMetadata.get_metadata_linesN)	rrrrr�r�r�r�r�rrrrr�1s
	c@seZdZdZdd�ZdS)r�asMetadata provider for egg directories

    Usage::

        # Development eggs:

        egg_info = "/path/to/PackageName.egg-info"
        base_dir = os.path.dirname(egg_info)
        metadata = PathMetadata(base_dir, egg_info)
        dist_name = os.path.splitext(os.path.basename(egg_info))[0]
        dist = Distribution(basedir, project_name=dist_name, metadata=metadata)

        # Unpacked egg directories:

        egg_path = "/path/to/PackageName-ver-pyver-etc.egg"
        metadata = PathMetadata(egg_path, os.path.join(egg_path,'EGG-INFO'))
        dist = Distribution.from_filename(egg_path, metadata=metadata)
    cCs||_||_dS)N)rfrl)rr�rlrrrr�lszPathMetadata.__init__N)rrrrr�rrrrr�Xsc@seZdZdZdd�ZdS)r�z Metadata provider for .egg filescCsD|jtj|_||_|jr0tjj|j|j�|_n|j|_|j	�dS)z-Create a metadata provider from a zipimporterN)
r�r�r�r�r��prefixr�rmrfr)r�importerrrrr�tszEggMetadata.__init__N)rrrrr�rrrrr�qsrR)�_distribution_finderscCs|t|<dS)axRegister `distribution_finder` to find distributions in sys.path items

    `importer_type` is the type or class of a PEP 302 "Importer" (sys.path item
    handler), and `distribution_finder` is a callable that, passed a path
    item and the importer instance, yields ``Distribution`` instances found on
    that path item.  See ``pkg_resources.find_on_path`` for an example.N)r�)�
importer_typeZdistribution_finderrrrr��scCst|�}tt|�}||||�S)z.Yield distributions accessible via `path_item`)rr�r�)�	path_item�onlyr��finderrrrr��s
ccs�|jjd�rdSt|�}|jd�r2tj||d�V|r:dSxH|jd�D]:}t|�rFtj	j
||�}xttj
|�|�D]
}|VqrWqFWdS)z@
    Find eggs in zip files; possibly multiple nested eggs.
    z.whlNzPKG-INFO)r�r{)r��endswithr�r�r��
from_filenamerr�r�r�rm�find_eggs_in_zip�	zipimport�zipimporter)r�r�r�r�Zsubitem�subpathr�rrrr��s
r�cCsfS)Nr)r�r�r�rrr�find_nothing�sr�cCsdd�}t||dd�S)aL
    Given a list of filenames, return them in descending order
    by version number.

    >>> names = 'bar', 'foo', 'Python-2.7.10.egg', 'Python-2.7.2.egg'
    >>> _by_version_descending(names)
    ['Python-2.7.10.egg', 'Python-2.7.2.egg', 'foo', 'bar']
    >>> names = 'Setuptools-1.2.3b1.egg', 'Setuptools-1.2.3.egg'
    >>> _by_version_descending(names)
    ['Setuptools-1.2.3.egg', 'Setuptools-1.2.3b1.egg']
    >>> names = 'Setuptools-1.2.3b1.egg', 'Setuptools-1.2.3.post1.egg'
    >>> _by_version_descending(names)
    ['Setuptools-1.2.3.post1.egg', 'Setuptools-1.2.3b1.egg']
    cSs2tjj|�\}}tj|jd�|g�}dd�|D�S)z6
        Parse each component of the filename
        r+cSsg|]}tjj|��qSr)rrKr�)r+r3rrr�
<listcomp>�sz?_by_version_descending.<locals>._by_version.<locals>.<listcomp>)r�r��splitext�	itertools�chainr)r��extr<rrr�_by_version�sz+_by_version_descending.<locals>._by_versionT)r'r6)�sorted)rMr�rrr�_by_version_descending�sr�ccs�t|�}tjj|�o tj|tj��r�t|�rPtj|t	|tjj
|d��d�V�nTttj|��}�xB|D�]8}|j
�}|jd�s�|jd�r�tjj
||�}tjj|�r�ttj|��dkr�qft	||�}nt|�}tj|||td�Vqf|o�t|��rttjj
||��}x�|D]}	|	V�qWqf|rf|jd�rfttjj
||���}
|
j�}WdQRXxN|D]F}|j��sh�qVtjj
||j��}
t|
�}x|D]}|V�q�WP�qVWqfWdS)	z6Yield distributions accessible on a sys.path directoryzEGG-INFO)r�z	.egg-infoz
.dist-infor)�
precedencez	.egg-linkN)�_normalize_cachedr�r�r
�access�R_OKr�r�r�r�rmr�r�r8r�rwr��
from_locationr�r�r�	readlines�strip�rstrip)r�r�r�Zpath_item_entriesr�r8Zfullpathr�rr�Z
entry_fileZentry_lines�liner�rrrr�find_on_path�sB



r��
FileFinder)�_namespace_handlers)�_namespace_packagescCs|t|<dS)a�Register `namespace_handler` to declare namespace packages

    `importer_type` is the type or class of a PEP 302 "Importer" (sys.path item
    handler), and `namespace_handler` is a callable like this::

        def namespace_handler(importer, path_entry, moduleName, module):
            # return a path_entry to use for child packages

    Namespace handlers are only called if the importer object has already
    agreed that it can handle the relevant path item, and they should only
    return a subpath if the module __path__ does not already contain an
    equivalent subpath.  For an example namespace handler, see
    ``pkg_resources.file_ns_handler``.
    N)r�)r�Znamespace_handlerrrrr�scCs�t|�}|dkrdS|j|�}|dkr*dStjj|�}|dkrbtj|�}tj|<g|_t|�nt	|d�svt
d|��tt|�}|||||�}|dk	r�|j}|j
|�|j|�t|||�|S)zEEnsure that named package includes a subpath of path_item (if needed)N�__path__zNot a package:)r�find_modulerkr�rB�types�
ModuleTyper��_set_parent_nsr�r�r�r�r;�load_module�_rebuild_mod_path)�packageNamer�r�r�r�Zhandlerr�r�rrr�
_handle_nss*






r�csRdd�tjD���fdd����fdd�}|j|d�dd�|D�|jd	d	�<d	S)
zq
    Rebuild module.__path__ ensuring that all entries are ordered
    corresponding to their sys.path order
    cSsg|]}t|��qSr)r�)r+�prrrr�5sz%_rebuild_mod_path.<locals>.<listcomp>cs(y
�j|�Stk
r"td�SXdS)z/
        Workaround for #520 and #513.
        �infN)�indexrp�float)r�)�sys_pathrr�safe_sys_path_index7s
z._rebuild_mod_path.<locals>.safe_sys_path_indexcs<|jtj�}�jd�d}|d|�}�ttjj|���S)zR
        Return the ordinal of the path based on its position in sys.path
        r,r-N)rr�r��countr�rm)r��
path_partsZmodule_partsr<)�package_namer�rr�position_in_sys_path@sz/_rebuild_mod_path.<locals>.position_in_sys_path)r'cSsg|]}t|��qSr)r�)r+r�rrrr�JsN)rkr�r!r�)Z	orig_pathr�r�r�r)r�r�r�rr�0s
		r�cCs�tj�z�|tkrdStjd}}d|kr�dj|jd�dd��}t|�|tkrZt|�ytj	|j
}Wntk
r�td|��YnXtj
|g�j|�tj
|g�x|D]}t||�q�WWdtj�XdS)z9Declare that package 'packageName' is a namespace packageNr,r-zNot a package:r7)�_imp�acquire_lockr�rkr�rmrr�r�r�r�r�r�rr;r��release_lock)r�r�r�r�rrrr�Ms&
c
CsJtj�z2x,tj|f�D]}t||�}|rt||�qWWdtj�XdS)zDEnsure that previously-declared namespace packages include path_itemN)r�r�r�rBr�r�r�)r�r��packager�rrrr�ns
cCsFtjj||jd�d�}t|�}x |jD]}t|�|kr(Pq(W|SdS)zBCompute an ns-package subpath for a filesystem or zipfile importerr,r-Nr7)r�r�rmrr�r�)r�r�r�r�r�Z
normalizedrrrr�file_ns_handlerzsrcCsdS)Nr)r�r�r�r�rrr�null_ns_handler�srcCstjjtjj|��S)z1Normalize a file/dir name for comparison purposes)r�r��normcase�realpath)rZrrrr��scCs2y||Stk
r,t|�||<}|SXdS)N)r�r�)rZr��resultrrrr��s
r�cCs|j�jd�S)z@
    Determine if given path appears to be an unpacked egg.
    z.egg)r8r�)r�rrrr��sr�cCs<|jd�}|j�}|r8dj|�}ttj||tj|�dS)Nr,)rr:rm�setattrrkr�)r�r<r�r�rrrr��s


r�ccsht|tj�r>xV|j�D]"}|j�}|r|jd�r|VqWn&x$|D]}xt|�D]
}|VqRWqDWdS)z9Yield non-empty/non-comment lines of a string or sequence�#N)rrr��
splitlinesr�r9r�)�strsr2Zssrrrr��s
z\w+(\.\w+)*$z�
    (?P<name>[^-]+) (
        -(?P<ver>[^-]+) (
            -py(?P<pyver>[^-]+) (
                -(?P<plat>.+)
            )?
        )?
    )?
    c@s�eZdZdZffdfdd�Zdd�Zdd�Zdd
d�Zdd
�Zddd�Z	e
jd�Ze
ddd��Ze
dd��Ze
ddd��Ze
ddd��ZdS)r�z3Object representing an advertised importable objectNcCsJt|�std|��||_||_t|�|_tjddj|��j	|_	||_
dS)NzInvalid module namezx[%s]�,)�MODULErpr��module_namer�attrsr�r�rmrr�)rr�rrrr�rrrr��s

zEntryPoint.__init__cCsHd|j|jf}|jr*|ddj|j�7}|jrD|ddj|j�7}|S)Nz%s = %s�:r,z [%s]r	)r�rrrmr)rr2rrrr��szEntryPoint.__str__cCsdt|�S)NzEntryPoint.parse(%r))rF)rrrrr��szEntryPoint.__repr__TcOs6|s|s|rtjdtdd�|r.|j||�|j�S)zH
        Require packages for this EntryPoint, then resolve it.
        zJParameters to load are deprecated.  Call .resolve and .require separately.rg)r>)rCrD�DeprecationWarningrtr)rrtrdr�rrrr��szEntryPoint.loadcCsVt|jdgdd�}ytjt|j|�Stk
rP}ztt|���WYdd}~XnXdS)zD
        Resolve the entry point from its module and attrs.
        rr)�fromlist�levelN)	r�r�	functools�reducer�rr�r�rF)rr��excrrrr�s
zEntryPoint.resolvecCsH|jr|jrtd|��|jj|j�}tj|||�}tttj|��dS)Nz&Can't require() without a distribution)	rr�r�rr�rrrr)rrr rrVrrrrt	s

zEntryPoint.requirez]\s*(?P<name>.+?)\s*=\s*(?P<module>[\w.]+)\s*(:\s*(?P<attr>[\w.]+))?\s*(?P<extras>\[.*\])?\s*$cCsf|jj|�}|sd}t||��|j�}|j|d�}|drJ|djd�nf}||d|d|||�S)aParse a single entry point from string `src`

        Entry point syntax follows the form::

            name = some.module:some.attr [extra1, extra2]

        The entry name and module name are required, but the ``:attrs`` and
        ``[extras]`` parts are optional
        z9EntryPoint must be in 'name=module:attrs [extras]' formatr�attrr,r�r�)�patternrjrp�	groupdict�
_parse_extrasr)r�srcr�rrrW�resrrrrrr�	s
zEntryPoint.parsecCs(|sfStjd|�}|jr"t��|jS)N�x)r�r��specsrpr)rZextras_specr�rrrr$	szEntryPoint._parse_extrascCsZt|�std|��i}x>t|�D]2}|j||�}|j|krHtd||j��|||j<q W|S)zParse an entry point groupzInvalid group namezDuplicate entry point)r
rpr�r�r�)rro�linesr��thisr�rrrr�parse_group-	s

zEntryPoint.parse_groupcCsxt|t�r|j�}nt|�}i}xR|D]J\}}|dkrD|s<q&td��|j�}||kr^td|��|j|||�||<q&W|S)z!Parse a map of entry point groupsNz%Entry points must be listed in groupszDuplicate group name)rrRrVr�rpr�r)r�datar��mapsrorrrr�	parse_map:	s


zEntryPoint.parse_map)T)NN)N)N)N)rrrrr�r�r�r�rrtr?r@rr'r�rrr!rrrrr��s 	


	cCs>|sdStjj|�}|djd�r:tjj|dd�d�S|S)Nr�r-zmd5=r7r7)r�)rr�Zurlparser9Z
urlunparse)rZparsedrrr�_remove_md5_fragmentN	sr"cCs@dd�}t||�}tt|�d�}|jd�\}}}t|j��p>dS)z�
    Given an iterable of lines from a Metadata file, return
    the value of the Version field, if present, or None otherwise.
    cSs|j�jd�S)Nzversion:)r8r9)r�rrrre\	sz$_version_from_file.<locals>.<lambda>r�r
N)r�next�iter�	partitionr�r�)rZis_version_lineZ
version_linesr�r�rnrrr�_version_from_fileW	s

r&c@sZeZdZdZdZddddedefdd�ZedGdd��Z	dd	�Z
ed
d��Zdd
�Z
dd�Zdd�Zdd�Zdd�Zdd�Zdd�Zedd��Zedd��Zdd�Zed d!��Zed"d#��Zffd$d%�Zd&d'�ZdHd)d*�Zd+d,�Zd-d.�Zd/d0�Zd1d2�ZedId3d4��Z d5d6�Z!d7d8�Z"dJd9d:�Z#d;d<�Z$dKd=d>�Z%d?d@�Z&dAdB�Z'dCdD�Z(edEdF��Z)dS)Lr�z5Wrap an actual or potential sys.path entry w/metadatazPKG-INFONcCsFt|pd�|_|dk	r t|�|_||_||_||_||_|p>t|_	dS)NZUnknown)
r�rr��_versionr2rlrr�r��	_provider)rrr�rrKr2rlr�rrrr�g	s
zDistribution.__init__cKs~dgd\}}}}tjj|�\}}	|	j�tkr^t|	j�}t|�}
|
r^|
jdddd�\}}}}|||f||||d�|��j�S)Nr�r�ZverZpyverrq)rrKr2rl)r�r�r�r8�_distributionImpl�EGG_NAMEro�_reload_version)rrr�r�rTrrKr2rlr�rjrrrr�s	s
zDistribution.from_locationcCs|S)Nr)rrrrr+�	szDistribution._reload_versioncCs(|j|j|jt|j�|jpd|jp$dfS)Nr�)�parsed_versionr�r'r"rr2rl)rrrrr5�	szDistribution.hashcmpcCs
t|j�S)N)�hashr5)rrrrr�	szDistribution.__hash__cCs|j|jkS)N)r5)rr!rrrr �	szDistribution.__lt__cCs|j|jkS)N)r5)rr!rrrr"�	szDistribution.__le__cCs|j|jkS)N)r5)rr!rrrr%�	szDistribution.__gt__cCs|j|jkS)N)r5)rr!rrrr$�	szDistribution.__ge__cCst||j�sdS|j|jkS)NF)rrr5)rr!rrrr#�	szDistribution.__eq__cCs
||kS)Nr)rr!rrrr&�	szDistribution.__ne__cCs0y|jStk
r*|jj�|_}|SXdS)N)Z_keyr�rr8)rr'rrrr'�	s
zDistribution.keycCst|d�st|j�|_|jS)N�_parsed_version)r�rNrKr.)rrrrr,�	s
zDistribution.parsed_versioncCsXtjj}t|j|�}|sdS|js&dStjd�j�jdd�}t	j
|jft|��t
�dS)Na>
            '{project_name} ({version})' is being parsed as a legacy,
            non PEP 440,
            version. You may find odd behavior and sort order.
            In particular it will be sorted as less than 0.0. It
            is recommended to migrate to PEP 440 compatible
            versions.
            rqr�)rrK�
LegacyVersionrr.rCrDr�r5rCrDr��varsr)rZLVZ	is_legacyrHrrr�_warn_legacy_version�	sz!Distribution._warn_legacy_versioncCsLy|jStk
rFt|j|j��}|dkrBd}t||j|��|SXdS)Nz(Missing 'Version:' header and/or %s file)r'r�r&�
_get_metadata�PKG_INFOrp)rrKrHrrrrK�	szDistribution.versioncCs�y|jStk
r�dgi}|_x�dD]x}xrt|j|��D]`\}}|r�d|kr||jdd�\}}t|�rpg}nt|�s|g}t|�p�d}|j|g�j	t
|��q>Wq*W|SXdS)N�requires.txt�depends.txtr
r-)r4r5)Z_Distribution__dep_mapr�r�r2rr�r�r�rrr�)r�dmr�r(rr)rrr�_dep_map�	s 
zDistribution._dep_mapcCsj|j}g}|j|jdf��xH|D]@}y|j|t|��Wq"tk
r`td||f��Yq"Xq"W|S)z@List of Requirements needed for this distro if `extras` are usedNz%s has no such extra feature %r)r7rrBr�r�r�)rrr6Zdepsr�rrrr�	s
zDistribution.requiresccs(|j|�r$x|j|�D]
}|VqWdS)N)r�r�)rr�r�rrrr2
s
zDistribution._get_metadataFcCsZ|dkrtj}|j||d�|tjkrVt|j�x$|jd�D]}|tjkr<t|�q<WdS)z>Ensure distribution is importable on `path` (default=sys.path)N)r5znamespace_packages.txt)rkr�rr�rr2r�r�)rr�r5Zpkgrrr�activate	
s


zDistribution.activatecCs8dt|j�t|j�|jptf}|jr4|d|j7}|S)z@Return what this distribution's standard .egg filename should bez
%s-%s-py%sr+)r�rrKr2r>rl)rrZrrrr~
szDistribution.egg_namecCs |jrd||jfSt|�SdS)Nz%s (%s))rrF)rrrrr�
szDistribution.__repr__cCs@yt|dd�}Wntk
r(d}YnX|p0d}d|j|fS)NrKz[unknown version]z%s %s)r�rpr)rrKrrrr�%
s
zDistribution.__str__cCs|jd�rt|��t|j|�S)zADelegate all unrecognized public attributes to .metadata providerr�)r9r�r�r()rrrrr�__getattr__-
s
zDistribution.__getattr__cKs|jt|�tjj|�|f|�S)N)r�r�r�r�r�)rrZr�rTrrrr�3
szDistribution.from_filenamecCs<t|jtjj�r"d|j|jf}nd|j|jf}tj|�S)z?Return a ``Requirement`` that matches this distribution exactlyz%s==%sz%s===%s)rr,rrKr_rr�r�)r�specrrrr":
szDistribution.as_requirementcCs.|j||�}|dkr&td||ff��|j�S)z=Return the `name` entry point of `group` or raise ImportErrorNzEntry point %r not found)rzr�r�)rror�rrrrrxC
szDistribution.load_entry_pointcCsPy
|j}Wn,tk
r6tj|jd�|�}|_YnX|dk	rL|j|i�S|S)z=Return the entry point map for `group`, or the full entry mapzentry_points.txtN)Z_ep_mapr�r�r!r2rB)rroZep_maprrrryJ
s
zDistribution.get_entry_mapcCs|j|�j|�S)z<Return the EntryPoint object for `group`+`name`, or ``None``)ryrB)rror�rrrrzV
szDistribution.get_entry_infoc
Cs2|p|j}|sdSt|�}tjj|�}dd�|D�}x�t|�D]v\}}||kr\|rVPq�dSq>||kr>|jtkr>|r�|||d�kr�dS|tjkr�|j	�|j
||�|j
||�Pq>W|tjkr�|j	�|r�|j
d|�n
|j|�dSxBy|j||d�}	Wnt
k
�rPYq�X||	=||	=|	}q�WdS)a�Ensure self.location is on path

        If replace=False (default):
            - If location is already in path anywhere, do nothing.
            - Else:
              - If it's an egg and its parent directory is on path,
                insert just ahead of the parent.
              - Else: add to the end of path.
        If replace=True:
            - If location is already on path anywhere (not eggs)
              or higher priority than its parent (eggs)
              do nothing.
            - Else:
              - If it's an egg and its parent directory is on path,
                insert just ahead of the parent,
                removing any lower-priority entries.
              - Else: add it to the front of path.
        NcSsg|]}|rt|�p|�qSr)r�)r+r�rrrr�t
sz*Distribution.insert_on.<locals>.<listcomp>rr-)rr�r�r�re�	enumerater�r�rk�check_version_conflictrr;r�rp)
rr��locr5ZnlocZbdirZnpathr�rZnprrrrZ
sB



zDistribution.insert_oncCs�|jdkrdStj|jd��}t|j�}x~|jd�D]p}|tjks4||ks4|tkrTq4|dkr^q4t	tj|dd�}|r�t|�j
|�s4|j
|j�r�q4td|||jf�q4WdS)	N�
setuptoolsznamespace_packages.txtz
top_level.txt�
pkg_resources�siterdzIModule %s was already imported from %s, but %s is being added to sys.path)r?r>r@)r'rRrSr2r�rrkr�r�r�r9�
issue_warning)rZnspr=�modname�fnrrrr<�
s"

z#Distribution.check_version_conflictcCs4y
|jWn$tk
r.tdt|��dSXdS)NzUnbuilt egg for FT)rKrprAr�)rrrrr7�
s
zDistribution.has_versioncKsDd}x$|j�D]}|j|t||d��qW|jd|j�|jf|�S)z@Copy this distribution, substituting in any changed keyword argsz<project_name version py_version platform location precedenceNr�)rrr�r(r)rrTrMrrrr�clone�
s
zDistribution.clonecCsdd�|jD�S)NcSsg|]}|r|�qSrr)r+Zdeprrrr��
sz'Distribution.extras.<locals>.<listcomp>)r7)rrrrr�
szDistribution.extras)N)NF)N)N)NF)*rrrrr3r>r�r�r'r�r+r�r5rr r"r%r$r#r&r'r,r1rKr7rr2r8r~r�r�r9r�r"rxryrzrr<r7rDrrrrrr�c	sN

	

Cc@seZdZdd�ZdS)�EggInfoDistributioncCst|j|j��}|r||_|S)a�
        Packages installed by distutils (e.g. numpy or scipy),
        which uses an old safe_version, and so
        their version numbers can get mangled when
        converted to filenames (e.g., 1.11.0.dev0+2329eae to
        1.11.0.dev0_2329eae). These distributions will not be
        parsed properly
        downstream by Distribution and safe_version, so
        take an extra step and try to get the version number from
        the metadata file itself instead of the filename.
        )r&r2r3r')rZ
md_versionrrrr+�
sz#EggInfoDistribution._reload_versionN)rrrr+rrrrrE�
srEc@s>eZdZdZdZejd�Zedd��Z	edd��Z
dd	�Zd
S)�DistInfoDistributionzGWrap an actual or potential sys.path entry w/metadata, .dist-info styleZMETADATAz([\(,])\s*(\d.*?)\s*([,\)])cCs@y|jStk
r:|j|j�}tjj�j|�|_|jSXdS)zParse and cache metadataN)Z	_pkg_infor�r�r3�email�parserZParserZparsestr)rr�rrr�_parsed_pkg_info�
sz%DistInfoDistribution._parsed_pkg_infocCs,y|jStk
r&|j�|_|jSXdS)N)�_DistInfoDistribution__dep_mapr��_compute_dependencies)rrrrr7�
s

zDistInfoDistribution._dep_mapcs�dgi}|_g�x&|jjd�p"gD]}�jt|��q$W�fdd�}t|d��}|dj|�x<|jjd�ppgD](}t|j��}tt||��|�||<qrW|S)z+Recompute this distribution's dependencies.Nz
Requires-Distc3s0x*�D]"}|js"|jjd|i�r|VqWdS)Nr()r)r*)r(r�)rrr�reqs_for_extra�
s
zBDistInfoDistribution._compute_dependencies.<locals>.reqs_for_extrazProvides-Extra)	rJrIZget_allrr��	frozensetr�r�r)rr6r�rL�commonr(Zs_extrar)rrrK�
sz*DistInfoDistribution._compute_dependenciesN)rrrrr3r?r@ZEQEQr�rIr7rKrrrrrF�
s

rF)z.eggz	.egg-infoz
.dist-infoc
Os^d}t�}y"xtj|�j|kr(|d7}qWWntk
r@YnXtj|d|di|��dS)Nr-r>)rOrkr�r�rprCrD)rdrTrrXrrrrAsrAc@seZdZdd�ZdS)�RequirementParseErrorcCsdj|j�S)Nr�)rmrd)rrrrr�szRequirementParseError.__str__N)rrrr�rrrrrOsrOccshtt|��}xV|D]N}d|kr0|d|jd��}|jd�rV|dd�j�}|t|�7}t|�VqWdS)z�Yield ``Requirement`` objects for each specification in `strs`

    `strs` must be a string, or a (possibly-nested) iterable thereof.
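    Typical use is to hand it one or more specifier strings and iterate over
    the resulting ``Requirement`` objects, for example (specifiers below are
    illustrative):

    from pkg_resources import parse_requirements

    for req in parse_requirements("requests>=2.0\nsix; python_version < '3'"):
        print(req.project_name, str(req.specifier))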
[compiled bytecode: class Requirement ("DO NOT CALL THIS UNDOCUMENTED METHOD; use Requirement.parse()!"; __eq__, __ne__, __contains__, __hash__, __repr__, parse); _get_mro; _find_adapter; ensure_directory; _bypass_ensure_directory]
[compiled bytecode: split_sections -- docstring follows]
    Split a string or iterable thereof into (section, content) pairs

    Each ``section`` is a stripped version of the section header ("[section]")
    and each ``content`` is a list of stripped lines excluding blank lines and
    comment-only lines.  If there are any such lines before the first section
    header, they're returned in a first ``section`` of ``None``.
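    For example, the behaviour described above splits an entry-points-style
    text like this (input is illustrative):

    from pkg_resources import split_sections

    text = '''
    shared line
    [console_scripts]
    foo = pkg.mod:main
    '''
    for section, content in split_sections(text):
        print(section, content)
    # -> None ['shared line']
    #    console_scripts ['foo = pkg.mod:main']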
[compiled bytecode: split_sections body; _mkstemp; filterwarnings setup; _call_aside; _initialize]
[compiled bytecode: _initialize_master_working_set -- docstring follows]
    Prepare the master working set and make the ``require()`` API available.

    This function has explicit effects on the global state
    of pkg_resources. It is intended to be invoked once at
    the initialization of this module.

    Invocation by other packages is unsupported and done
    at their own risk.
[compiled bytecode: remainder of the module code object, constants table and archive padding]
	site-packages/pip/_vendor/pkg_resources/__pycache__/__init__.cpython-36.opt-1.pyc000064400000271767147511334620023751 0ustar003

���e>��^@s�dZddlmZddlZddlZddlZddlZddlZddlZddl	Z	ddl
Z
ddlZddlZddl
Z
ddlZddlZddlZddlZddlZddlZddlZddlZddlZddlmZyddlZWnek
r�ddlZYnXddlmZddlmZm Z m!Z!ddlm"Z"yddlm#Z#m$Z$m%Z%d	Z&Wnek
�rHd
Z&YnXddlm'Z(ddl)m*Z*m+Z+yddl,j-Z.e.j/Wnek
�r�dZ.YnXdd
lm0Z0ddlm1Z1e2d�e2d�e2d�e2d�d�ej3k�o�d�kn�r�dZ4ej5e4�dZ6dZ7Gdd�de8�Z9Gdd�de:�Z;Gdd�de;e1j<j=�Z>Gdd�de;e1j<j?�Z@dd�ZAiZBdd �ZCd!d"�ZDd#d$�ZEd%d&�ZFd'd(�ZGd)d*�ZHd+d,�ZId-d.�ZJZKd/d0�ZLd1d2d3d4d5d6d7d8d9d:d;d<d=d>d?d@dAdBdCdDdEdFdGdHdIdJdKdLdMdNdOdPddQddRdSdTdUdVdWdXdYdZd[d\d]d^d_d`dadbdcdddedfdgdhdidjdkdldmdndodpdqdrdsdtgFZMGdudL�dLeN�ZOGdvdM�dMeO�ZPGdwdx�dxeP�ZQGdydN�dNeO�ZRGdzdO�dOeO�ZSiZTej<dd�ZUdZVd{ZWd|ZXdZYd�ZZd}dp�Z[d~d3�Z\gfdd��Z]d�d��Z^d�d��Z_ej`d��Zaej`d��Zbe_Zcd�dU�Zdd�d2�ZeeeZfd�d4�Zgd�d5�Zhd�d�d6�Zid�d7�ZjGd�dc�dc�ZkGd�dd�ddek�ZlGd�dG�dGe:�ZmGd�d��d�en�ZoGd�dF�dFe:�ZpepZqGd�dP�dPer�ZsGd�dH�dH�Ztd�dE�Zud�dR�Zvd�dS�Zwd�dX�Zxd�dY�Zyd�dZ�Zzd�d�d[�Z{Gd�dj�dj�Z|e[e:e|�Gd�dk�dke|�Z}Gd�dl�dle}�Z~e~j�Gd�dh�dhe|�Z�e��Z�Gd�d��d�en�Z�Gd�d��d�e��Z�Gd�d��d�e	j��Z�Gd�dm�dme}�Z�e[e
j�e��Gd�de�dee��Z�Gd�df�dfe~�Z�Gd�dg�dge��Z�eCd�id��d�dn�Z�d�d�dB�Z�d�d�d��Z�e�e
j�e��d�d�d��Z�e�e:e��d�d��Z�d�d�d��Z�e�ej�e��e�e.d���r"e�e.j�e��eCd�id��eCd�id��d�do�Z�d�d��Z�d�d��Z�d�d?�Z�d�d�dq�Z�d�d��Z�e�ej�e��e�e
j�e��e�e.d���r�e�e.j�e��d�dÄZ�e�e:e��d�d]�Z�ifd�dƄZ�d�dȄZ�d�dʄZ�d�dV�Z�ej`d̃j�Z�ej`d�ej�ej�B�j�Z�Gd�dK�dKe:�Z�d�dЄZ�d�d҄Z�Gd�dI�dIe:�Z�Gd�dՄd�e��Z�Gd�dׄd�e��Z�e�e�e�d؜Z�d�dڄZ�Gd�d܄d�e��Z�d�dQ�Z�Gd�dJ�dJe1j�j��Z�d�d�Z�d�d�Z�d�d\�Z�d�d�Z�d�dW�Z�d�d�Z�ej�d�e9d	d�d�d�Z�e�e��fd�d��Z�e�d�d��Z�dS)�aZ
Package resource API
--------------------

A resource is a logical file contained within a package, or a logical
subdirectory thereof.  The package resource API expects resource names
to have their path parts separated with ``/``, *not* whatever the local
path separator is.  Do not use os.path operations to manipulate resource
names being passed into the API.

The package resource API is designed to work with normal filesystem packages,
.egg files, and unpacked .egg files.  It can also work in a limited way with
.zip files and with custom PEP 302 loaders that support the ``get_data()``
method.
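In practice resources are addressed by package name plus a '/'-separated
resource name, for example (package and file names here are illustrative):

from pkg_resources import resource_string, resource_filename

# Read a data file bundled with a package, whether it is installed as a
# plain directory, a .egg file, or a zipped distribution.
data = resource_string('mypackage', 'templates/default.cfg')

# Or obtain a real filesystem path, extracting from the zip/egg if needed.
path = resource_filename('mypackage', 'templates/default.cfg')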
�)�absolute_importN)�get_importer)�six)�urllib�map�filter)�utime)�mkdir�rename�unlinkTF)�open)�isdir�split)�appdirs)�	packagingzpip._vendor.packaging.versionz pip._vendor.packaging.specifiersz"pip._vendor.packaging.requirementszpip._vendor.packaging.markers�zLSupport for Python 3.0-3.2 has been dropped. Future versions will fail here.c@seZdZdZdS)�
PEP440Warningza
    Used when there is an issue with a version or specifier not complying with
    PEP 440.
    N)�__name__�
__module__�__qualname__�__doc__�rr�/usr/lib/python3.6/__init__.pyr[srcsteZdZ�fdd�Z�fdd�Z�fdd�Z�fdd�Z�fd	d
�Z�fdd�Z�fd
d�Z	dd�Z
dd�Z�ZS)�_SetuptoolsVersionMixincstt|�j�S)N)�superr�__hash__)�self)�	__class__rrrcsz _SetuptoolsVersionMixin.__hash__cs*t|t�rt|�|kStt|�j|�SdS)N)�
isinstance�tuplerr�__lt__)r�other)rrrr fs
z_SetuptoolsVersionMixin.__lt__cs*t|t�rt|�|kStt|�j|�SdS)N)rrrr�__le__)rr!)rrrr"ls
z_SetuptoolsVersionMixin.__le__cs*t|t�rt|�|kStt|�j|�SdS)N)rrrr�__eq__)rr!)rrrr#rs
z_SetuptoolsVersionMixin.__eq__cs*t|t�rt|�|kStt|�j|�SdS)N)rrrr�__ge__)rr!)rrrr$xs
z_SetuptoolsVersionMixin.__ge__cs*t|t�rt|�|kStt|�j|�SdS)N)rrrr�__gt__)rr!)rrrr%~s
z_SetuptoolsVersionMixin.__gt__cs*t|t�rt|�|kStt|�j|�SdS)N)rrrr�__ne__)rr!)rrrr&�s
z_SetuptoolsVersionMixin.__ne__cCst|�|S)N)r)r�keyrrr�__getitem__�sz#_SetuptoolsVersionMixin.__getitem__c#sjtjdtj��dddddd�j���fdd���fdd	�}tjd
tdd�x|t|��D]
}|VqXWdS)
Nz(\d+ | [a-z]+ | \.| -)�czfinal-�@)ZpreZpreview�-ZrcZdevc3s`xT�j|�D]F}�||�}|s|dkr*q|dd�dkrH|jd�Vqd|VqWdVdS)N�.��
0123456789��*z*final)r�zfill)�s�part)�component_re�replacerr�_parse_version_parts�s
z>_SetuptoolsVersionMixin.__iter__.<locals>._parse_version_partscszg}xl�|j��D]\}|jd�rd|dkrFx|rD|ddkrD|j�q*Wx|rb|ddkrb|j�qHW|j|�qWt|�S)Nr0z*finalr-z*final-Z00000000���r7)�lower�
startswith�pop�appendr)r2�partsr3)r6rr�old_parse_version�s
z;_SetuptoolsVersionMixin.__iter__.<locals>.old_parse_versiona�You have iterated over the result of pkg_resources.parse_version. This is a legacy behavior which is inconsistent with the new version class introduced in setuptools 8.0. In most cases, conversion to a tuple is unnecessary. For comparison of versions, sort the Version instances directly. If you have another use case requiring the tuple, please file a bug with the setuptools project describing that need.r-)�
stacklevel)�re�compile�VERBOSE�get�warnings�warn�RuntimeWarning�str)rr=r3r)r6r4r5r�__iter__�s
z _SetuptoolsVersionMixin.__iter__)
rrrrr r"r#r$r%r&r(rG�
__classcell__rr)rrrbsrc@seZdZdS)�SetuptoolsVersionN)rrrrrrrrI�srIc@seZdZdS)�SetuptoolsLegacyVersionN)rrrrrrrrJ�srJcCs*yt|�Stjjk
r$t|�SXdS)N)rIr�version�InvalidVersionrJ)�vrrr�
parse_version�srNcKs"t�j|�tjtj||��dS)N)�globals�update�_state_vars�dict�fromkeys)Zvartype�kwrrr�_declare_state�srUcCs<i}t�}x,tj�D] \}}|d|||�||<qW|S)NZ_sget_)rOrQ�items)�state�g�krMrrr�__getstate__�s
rZcCs<t�}x0|j�D]$\}}|dt|||||�qW|S)NZ_sset_)rOrVrQ)rWrXrYrMrrr�__setstate__�s r[cCs|j�S)N)�copy)�valrrr�
_sget_dict�sr^cCs|j�|j|�dS)N)�clearrP)r'�obrWrrr�
_sset_dict�sracCs|j�S)N)rZ)r]rrr�_sget_object�srbcCs|j|�dS)N)r[)r'r`rWrrr�_sset_object�srccGsdS)Nr)�argsrrr�<lambda>�srecCsbt�}tj|�}|dk	r^tjdkr^y&ddjt�dd��|jd�f}Wntk
r\YnX|S)aZReturn this platform's maximum compatible version.

    distutils.util.get_platform() normally reports the minimum version
    of Mac OS X that would be required to *use* extensions produced by
    distutils.  But what we want when checking compatibility is to know the
    version of Mac OS X that we are *running*.  To allow usage of packages that
    explicitly require a newer version of Mac OS X, we must also know the
    current version of the OS.

    If this condition occurs for any other platform with a version in its
    platform strings, this function should be extended accordingly.
    N�darwinzmacosx-%s-%sr,�r)	�get_build_platform�macosVersionString�match�sys�platform�join�_macosx_vers�group�
ValueError)�plat�mrrr�get_supported_platform�s

&rs�require�
run_script�get_provider�get_distribution�load_entry_point�
get_entry_map�get_entry_info�iter_entry_points�resource_string�resource_stream�resource_filename�resource_listdir�resource_exists�resource_isdir�declare_namespace�working_set�add_activation_listener�find_distributions�set_extraction_path�cleanup_resources�get_default_cache�Environment�
WorkingSet�ResourceManager�Distribution�Requirement�
EntryPoint�ResolutionError�VersionConflict�DistributionNotFound�UnknownExtra�ExtractionError�parse_requirements�	safe_name�safe_version�get_platform�compatible_platforms�yield_lines�split_sections�
safe_extra�to_filename�invalid_marker�evaluate_marker�ensure_directory�normalize_path�EGG_DIST�BINARY_DIST�SOURCE_DIST�
CHECKOUT_DIST�DEVELOP_DIST�IMetadataProvider�IResourceProvider�FileMetadata�PathMetadata�EggMetadata�
EmptyProvider�empty_provider�NullProvider�EggProvider�DefaultProvider�ZipProvider�register_finder�register_namespace_handler�register_loader_type�fixup_namespace_packagesr�run_main�AvailableDistributionsc@seZdZdZdd�ZdS)r�z.Abstract base for dependency resolution errorscCs|jjt|j�S)N)rr�reprrd)rrrr�__repr__IszResolutionError.__repr__N)rrrrr�rrrrr�Fsc@s<eZdZdZdZedd��Zedd��Zdd�Zd	d
�Z	dS)r�z�
    An already-installed version conflicts with the requested version.

    Should be initialized with the installed Distribution and the requested
    Requirement.
    z3{self.dist} is installed but {self.req} is requiredcCs
|jdS)Nr)rd)rrrr�distWszVersionConflict.distcCs
|jdS)Nr-)rd)rrrr�req[szVersionConflict.reqcCs|jjft��S)N)�	_template�format�locals)rrrr�report_szVersionConflict.reportcCs|s|S|j|f}t|�S)zt
        If required_by is non-empty, return a version of self that is a
        ContextualVersionConflict.
        )rd�ContextualVersionConflict)r�required_byrdrrr�with_contextbszVersionConflict.with_contextN)
rrrrr��propertyr�r�r�r�rrrrr�Msc@s&eZdZdZejdZedd��ZdS)r�z�
    A VersionConflict that accepts a third parameter, the set of the
    requirements that required the installed Distribution.
    z by {self.required_by}cCs
|jdS)Nrg)rd)rrrrr�usz%ContextualVersionConflict.required_byN)rrrrr�r�r�r�rrrrr�ms
r�c@sHeZdZdZdZedd��Zedd��Zedd��Zd	d
�Z	dd�Z
d
S)r�z&A requested distribution was not foundzSThe '{self.req}' distribution was not found and is required by {self.requirers_str}cCs
|jdS)Nr)rd)rrrrr��szDistributionNotFound.reqcCs
|jdS)Nr-)rd)rrrr�	requirers�szDistributionNotFound.requirerscCs|js
dSdj|j�S)Nzthe applicationz, )r�rm)rrrr�
requirers_str�sz"DistributionNotFound.requirers_strcCs|jjft��S)N)r�r�r�)rrrrr��szDistributionNotFound.reportcCs|j�S)N)r�)rrrr�__str__�szDistributionNotFound.__str__N)rrrrr�r�r�r�r�r�r�rrrrr�zsc@seZdZdZdS)r�z>Distribution doesn't have an "extra feature" of the given nameN)rrrrrrrrr��srgr-cCs|t|<dS)aRegister `provider_factory` to make providers for `loader_type`

    `loader_type` is the type or class of a PEP 302 ``module.__loader__``,
    and `provider_factory` is a function that, passed a *module* object,
    returns an ``IResourceProvider`` for that module.
    N)�_provider_factories)Zloader_typeZprovider_factoryrrrr��scCstt|t�r$tj|�p"tt|��dSytj|}Wn&tk
rXt	|�tj|}YnXt
|dd�}tt|�|�S)z?Return an IResourceProvider for the named module or requirementr�
__loader__N)
rr�r��findrtrFrk�modules�KeyError�
__import__�getattr�
_find_adapterr�)ZmoduleOrReq�module�loaderrrrrv�s
cCsd|s\tj�d}|dkrLd}tjj|�rLttd�rLtj|�}d|krL|d}|j|j	d��|dS)Nr�z0/System/Library/CoreServices/SystemVersion.plist�	readPlistZProductVersionr,)
rlZmac_ver�os�path�exists�hasattr�plistlibr�r;r)�_cacherKZplistZ
plist_contentrrrrn�s

rncCsddd�j||�S)NZppc)ZPowerPCZPower_Macintosh)rB)�machinerrr�_macosx_arch�sr�cCs�yddlm}Wn tk
r0ddlm}YnX|�}tjdkr�|jd�r�y<t�}tj	�dj
dd�}dt|d�t|d	�t|�fSt
k
r�YnX|S)
z�Return this platform's string for platform-specific distributions

    XXX Currently this is the same as ``distutils.util.get_platform()``, but it
    needs some hacks for Linux and Mac OS X.
    r)r�rfzmacosx-�� �_zmacosx-%d.%d-%sr-)�	sysconfigr��ImportErrorZdistutils.utilrkrlr9rnr��unamer5�intr�rp)r�rqrKr�rrrrh�srhzmacosx-(\d+)\.(\d+)-(.*)zdarwin-(\d+)\.(\d+)\.(\d+)-(.*)cCs�|dks|dks||krdStj|�}|r�tj|�}|s�tj|�}|r�t|jd��}d|jd�|jd�f}|dkr||dks�|dkr�|d	kr�dSd
S|jd�|jd�ks�|jd�|jd�kr�d
St|jd��t|jd��kr�d
SdSd
S)z�Can code for the `provided` platform run on the `required` platform?

    Returns true if either platform is ``None``, or the platforms are equal.

    XXX Needs compatibility checks for Linux and other unixy OSes.
    NTr-z%s.%srg�z10.3r/z10.4Fr)rirj�darwinVersionStringr�ro)ZprovidedZrequiredZreqMacZprovMacZ
provDarwinZdversionZmacosversionrrrr��s*


cCs<tjd�j}|d}|j�||d<t|�dj||�dS)z@Locate distribution `dist_spec` and run its `script_name` scriptr-rrN)rk�	_getframe�	f_globalsr_rtru)Z	dist_spec�script_name�ns�namerrrrus
cCs@t|tj�rtj|�}t|t�r(t|�}t|t�s<td|��|S)z@Return a current distribution object for a Requirement or stringz-Expected string, Requirement, or Distribution)rr�string_typesr��parservr��	TypeError)r�rrrrw)s



cCst|�j||�S)zDReturn `name` entry point of `group` for `dist` or raise ImportError)rwrx)r�ror�rrrrx4scCst|�j|�S)z=Return the entry point map for `group`, or the full entry map)rwry)r�rorrrry9scCst|�j||�S)z<Return the EntryPoint object for `group`+`name`, or ``None``)rwrz)r�ror�rrrrz>sc@s<eZdZdd�Zdd�Zdd�Zdd�Zd	d
�Zdd�Zd
S)r�cCsdS)z;Does the package's distribution contain the named metadata?Nr)r�rrr�has_metadataDszIMetadataProvider.has_metadatacCsdS)z'The named metadata resource as a stringNr)r�rrr�get_metadataGszIMetadataProvider.get_metadatacCsdS)z�Yield named metadata resource as list of non-blank non-comment lines

       Leading and trailing whitespace is stripped from each line, and lines
       with ``#`` as the first non-blank character are omitted.Nr)r�rrr�get_metadata_linesJsz$IMetadataProvider.get_metadata_linescCsdS)z>Is the named metadata a directory?  (like ``os.path.isdir()``)Nr)r�rrr�metadata_isdirPsz IMetadataProvider.metadata_isdircCsdS)z?List of metadata names in the directory (like ``os.listdir()``)Nr)r�rrr�metadata_listdirSsz"IMetadataProvider.metadata_listdircCsdS)z=Execute the named script in the supplied namespace dictionaryNr)r��	namespacerrrruVszIMetadataProvider.run_scriptN)	rrrr�r�r�r�r�rurrrrr�Csc@s@eZdZdZdd�Zdd�Zdd�Zdd	�Zd
d�Zdd
�Z	dS)r�z3An object that provides access to package resourcescCsdS)zdReturn a true filesystem path for `resource_name`

        `manager` must be an ``IResourceManager``Nr)�manager�
resource_namerrr�get_resource_filename]sz'IResourceProvider.get_resource_filenamecCsdS)ziReturn a readable file-like object for `resource_name`

        `manager` must be an ``IResourceManager``Nr)r�r�rrr�get_resource_streambsz%IResourceProvider.get_resource_streamcCsdS)zmReturn a string containing the contents of `resource_name`

        `manager` must be an ``IResourceManager``Nr)r�r�rrr�get_resource_stringgsz%IResourceProvider.get_resource_stringcCsdS)z,Does the package contain the named resource?Nr)r�rrr�has_resourcelszIResourceProvider.has_resourcecCsdS)z>Is the named resource a directory?  (like ``os.path.isdir()``)Nr)r�rrrr�osz IResourceProvider.resource_isdircCsdS)z?List of resource names in the directory (like ``os.listdir()``)Nr)r�rrrrrsz"IResourceProvider.resource_listdirN)
rrrrr�r�r�r�r�rrrrrr�Zsc@s�eZdZdZd'dd�Zedd��Zedd��Zd	d
�Zdd�Z	d
d�Z
d(dd�Zdd�Zdd�Z
d)dd�Zd*dd�Zd+dd�Zdd�Zd,dd �Zd!d"�Zd#d$�Zd%d&�ZdS)-r�zDA collection of active distributions on sys.path (or a similar list)NcCsBg|_i|_i|_g|_|dkr&tj}x|D]}|j|�q,WdS)z?Create working set from list of path entries (default=sys.path)N)�entries�
entry_keys�by_key�	callbacksrkr��	add_entry)rr��entryrrr�__init__ys
zWorkingSet.__init__cCsZ|�}yddlm}Wntk
r*|SXy|j|�Wntk
rT|j|�SX|S)z1
        Prepare the master working set.
        r)�__requires__)�__main__r�r�rtr��_build_from_requirements)�cls�wsr�rrr�
_build_master�szWorkingSet._build_mastercCsn|g�}t|�}|j|t��}x|D]}|j|�q$Wx"tjD]}||jkr>|j|�q>W|jtjdd�<|S)zQ
        Build a working set from a requirement spec. Rewrites sys.path.
        N)r��resolver��addrkr�r�r�)rZreq_specr�reqs�distsr�r�rrrr�s

z#WorkingSet._build_from_requirementscCs@|jj|g�|jj|�x t|d�D]}|j||d�q&WdS)a�Add a path item to ``.entries``, finding any distributions on it

        ``find_distributions(entry, True)`` is used to find distributions
        corresponding to the path entry, and they are added.  `entry` is
        always appended to ``.entries``, even if it is already present.
        (This is because ``sys.path`` can contain the same value more than
        once, and the ``.entries`` of the ``sys.path`` WorkingSet should always
        equal ``sys.path``.)
        TFN)r��
setdefaultr�r;r�r)rr�r�rrrr��s
zWorkingSet.add_entrycCs|jj|j�|kS)z9True if `dist` is the active distribution for its project)r�rBr')rr�rrr�__contains__�szWorkingSet.__contains__cCs,|jj|j�}|dk	r(||kr(t||��|S)a�Find a distribution matching requirement `req`

        If there is an active distribution for the requested project, this
        returns it as long as it meets the version requirement specified by
        `req`.  But, if there is an active distribution for the project and it
        does *not* meet the `req` requirement, ``VersionConflict`` is raised.
        If there is no active distribution for the requested project, ``None``
        is returned.
        N)r�rBr'r�)rr�r�rrrr��s

zWorkingSet.findccsPxJ|D]B}|j|�}|dkr6x*|j�D]
}|Vq&Wq||kr||VqWdS)aYield entry point objects from `group` matching `name`

        If `name` is None, yields all entry points in `group` from all
        distributions in the working set, otherwise only ones matching
        both `group` and `name` are yielded (in distribution order).
        N)ry�values)rror�r�r��eprrrr{�s

zWorkingSet.iter_entry_pointscCs>tjd�j}|d}|j�||d<|j|�dj||�dS)z?Locate distribution for `requires` and run `script_name` scriptr-rrN)rkr�r�r_rtru)r�requiresr�r�r�rrrru�s
zWorkingSet.run_scriptccsTi}xJ|jD]@}||jkrqx.|j|D] }||kr(d||<|j|Vq(WqWdS)z�Yield distributions for non-duplicate projects in the working set

        The yield order is the order in which the items' path entries were
        added to the working set.
        r-N)r�r�r�)r�seen�itemr'rrrrG�s
zWorkingSet.__iter__TFcCs�|r|j|j||d�|dkr$|j}|jj|g�}|jj|jg�}|rX|j|jkrXdS||j|j<|j|krz|j|j�|j|kr�|j|j�|j|�dS)aAdd `dist` to working set, associated with `entry`

        If `entry` is unspecified, it defaults to the ``.location`` of `dist`.
        On exit from this routine, `entry` is added to the end of the working
        set's ``.entries`` (if it wasn't already present).

        `dist` is only added to the working set if it's for a project that
        doesn't already have a distribution in the set, unless `replace=True`.
        If it's added, any callbacks registered with the ``subscribe()`` method
        will be called.
        )r5N)	�	insert_onr��locationr�rr'r�r;�
_added_new)rr�r��insertr5�keysZkeys2rrrr�s

zWorkingSet.addcCs|t|�ddd�}i}i}g}t�}tjt�}	�xF|�rv|jd�}
|
|krLq2|j|
�sXq2|j|
j�}|dk�r|j	j|
j�}|dks�||
kr�|r�|}|dkr�|dkr�t
|j�}nt
g�}tg�}|j
|
||�}||
j<|dkr�|	j|
d�}
t|
|
��|j|�||
k�r"|	|
}t||
�j|��|j|
j�ddd�}|j|�x(|D] }|	|j|
j�|
j||<�qHWd||
<q2W|S)aeList all distributions needed to (recursively) meet `requirements`

        `requirements` must be a sequence of ``Requirement`` objects.  `env`,
        if supplied, should be an ``Environment`` instance.  If
        not supplied, it defaults to all distributions available within any
        entry or distribution in the working set.  `installer`, if supplied,
        will be invoked with each requirement that cannot be met by an
        already-installed distribution; it should return a ``Distribution`` or
        ``None``.

        Unless `replace_conflicting=True`, raises a VersionConflict exception if
        any requirements are found on the path that have the correct name but
        the wrong version.  Otherwise, if an `installer` is supplied it will be
        invoked to obtain the correct version of the requirement and activate
        it.
        Nr-rTr7r7)�list�
_ReqExtras�collections�defaultdict�setr:�markers_passrBr'r�r�r�r��
best_matchr�r;r�r�r�extras�extendr�project_name)r�requirements�env�	installerZreplace_conflictingZ	processedZbestZto_activateZ
req_extrasr�r�r�rr�Z
dependent_reqZnew_requirementsZnew_requirementrrrrsJ









zWorkingSet.resolvecCst|�}|j�i}i}|dkr4t|j�}||7}n||}|jg�}	tt|	j|��x�|D]�}
x�||
D]x}|j�g}y|	j|||�}
Wn4t	k
r�}z|||<|r�wjnPWYdd}~XqjXtt|	j|
��|j
tj|
��PqjWq\Wt|�}|j�||fS)asFind all activatable distributions in `plugin_env`

        Example usage::

            distributions, errors = working_set.find_plugins(
                Environment(plugin_dirlist)
            )
            # add plugins+libs to sys.path
            map(working_set.add, distributions)
            # display errors
            print('Could not load', errors)

        The `plugin_env` should be an ``Environment`` instance that contains
        only distributions that are in the project's "plugin directory" or
        directories. The `full_env`, if supplied, should be an ``Environment``
        contains all currently-available distributions.  If `full_env` is not
        supplied, one is created automatically from the ``WorkingSet`` this
        method is called on, which will typically mean that every directory on
        ``sys.path`` will be scanned for distributions.

        `installer` is a standard installer callback as used by the
        ``resolve()`` method. The `fallback` flag indicates whether we should
        attempt to resolve older versions of a plugin if the newest version
        cannot be resolved.

        This method returns a 2-tuple: (`distributions`, `error_info`), where
        `distributions` is a list of the distributions found in `plugin_env`
        that were loadable, along with any other distributions that are needed
        to resolve their dependencies.  `error_info` is a dictionary mapping
        unloadable plugin distributions to an exception instance describing the
        error that occurred. Usually this will be a ``DistributionNotFound`` or
        ``VersionConflict`` instance.
        N)
r�sortr�r�rrr�as_requirementrr�rPrRrS)rZ
plugin_envZfull_envr ZfallbackZplugin_projectsZ
error_infoZ
distributionsrZ
shadow_setrr�r�Z	resolveesrMrrr�find_pluginsks4$





zWorkingSet.find_pluginscGs*|jt|��}x|D]}|j|�qW|S)a�Ensure that distributions matching `requirements` are activated

        `requirements` must be a string or a (possibly-nested) sequence
        thereof, specifying the distributions and versions required.  The
        return value is a sequence of the distributions that needed to be
        activated to fulfill the requirements; all relevant distributions are
        included, even if they were already activated in this working set.
        )rr�r)rrZneededr�rrrrt�s	
zWorkingSet.requirecCs<||jkrdS|jj|�|s"dSx|D]}||�q(WdS)z�Invoke `callback` for all distributions

        If `existing=True` (default),
        call on all existing ones, as well.
        N)r�r;)r�callback�existingr�rrr�	subscribe�s

zWorkingSet.subscribecCsx|jD]}||�qWdS)N)r�)rr�r$rrrr�szWorkingSet._added_newcCs,|jdd�|jj�|jj�|jdd�fS)N)r�r�r\r�r�)rrrrrZ�szWorkingSet.__getstate__cCs@|\}}}}|dd�|_|j�|_|j�|_|dd�|_dS)N)r�r\r�r�r�)rZe_k_b_cr�rr�r�rrrr[�s


zWorkingSet.__setstate__)N)N)NTF)NNF)NNT)T)rrrrr��classmethodrrr�r	r�r{rurGrrr#rtr&rrZr[rrrrr�vs(




Q
S
c@seZdZdZdd�ZdS)rz>
    Map each requirement to the extras that demanded it.
    cs.�fdd�|j�f�dD�}�jp,t|�S)z�
        Evaluate markers for req against each extra that
        demanded it.

        Return False if the req has a marker and fails
        evaluation. Otherwise, return True.
        c3s|]}�jjd|i�VqdS)�extraN)�marker�evaluate)�.0r()r�rr�	<genexpr>�sz*_ReqExtras.markers_pass.<locals>.<genexpr>N)N)rBr)�any)rr�Zextra_evalsr)r�rr�s	
z_ReqExtras.markers_passN)rrrrrrrrrr�src@sxeZdZdZde�efdd�Zdd�Zdd�Zdd	d
�Z	dd�Z
d
d�Zddd�Zddd�Z
dd�Zdd�Zdd�ZdS)r�z5Searchable snapshot of distributions on a search pathNcCs i|_||_||_|j|�dS)a!Snapshot distributions available on a search path

        Any distributions found on `search_path` are added to the environment.
        `search_path` should be a sequence of ``sys.path`` items.  If not
        supplied, ``sys.path`` is used.

        `platform` is an optional string specifying the name of the platform
        that platform-specific distributions must be compatible with.  If
        unspecified, it defaults to the current platform.  `python` is an
        optional string naming the desired version of Python (e.g. ``'3.3'``);
        it defaults to the current version.

        You may explicitly set `platform` (and/or `python`) to ``None`` if you
        wish to map *all* distributions, not just those compatible with the
        running platform or Python version.
        N)�_distmaprl�python�scan)r�search_pathrlr/rrrr�szEnvironment.__init__cCs.|jdks |jdks |j|jko,t|j|j�S)z�Is distribution `dist` acceptable for this environment?

        The distribution must match the platform and python version
        requirements specified when this environment was created, or False
        is returned.
        N)r/�
py_versionr�rl)rr�rrr�can_addszEnvironment.can_addcCs|j|jj|�dS)z"Remove `dist` from the environmentN)r.r'�remove)rr�rrrr4(szEnvironment.removecCs<|dkrtj}x(|D] }xt|�D]}|j|�q"WqWdS)adScan `search_path` for distributions usable in this environment

        Any distributions found are added to the environment.
        `search_path` should be a sequence of ``sys.path`` items.  If not
        supplied, ``sys.path`` is used.  Only distributions conforming to
        the platform/python version defined at initialization are added.
        N)rkr�r�r)rr1rr�rrrr0,s

zEnvironment.scancCs|j�}|jj|g�S)aReturn a newest-to-oldest list of distributions for `project_name`

        Uses case-insensitive `project_name` comparison, assuming all the
        project's distributions use their project's name converted to all
        lowercase as their key.

        )r8r.rB)rrZdistribution_keyrrrr(;szEnvironment.__getitem__cCsL|j|�rH|j�rH|jj|jg�}||krH|j|�|jtjd�dd�dS)zLAdd `dist` if we ``can_add()`` it and it has not already been added
        �hashcmpT)r'�reverseN)	r3�has_versionr.rr'r;r!�operator�
attrgetter)rr�rrrrrFs

zEnvironment.addcCsB|j|�}|dk	r|Sx||jD]}||kr"|Sq"W|j||�S)a�Find distribution best matching `req` and usable on `working_set`

        This calls the ``find(req)`` method of the `working_set` to see if a
        suitable distribution is already active.  (This may raise
        ``VersionConflict`` if an unsuitable version of the project is already
        active in the specified `working_set`.)  If a suitable distribution
        isn't active, this method returns the newest distribution in the
        environment that meets the ``Requirement`` in `req`.  If no suitable
        distribution is found, and `installer` is supplied, then the result of
        calling the environment's ``obtain(req, installer)`` method will be
        returned.
        N)r�r'�obtain)rr�r�r r�rrrrOs
zEnvironment.best_matchcCs|dk	r||�SdS)a�Obtain a distribution matching `requirement` (e.g. via download)

        Obtain a distro that matches requirement (e.g. via download).  In the
        base ``Environment`` class, this routine just returns
        ``installer(requirement)``, unless `installer` is None, in which case
        None is returned instead.  This method is a hook that allows subclasses
        to attempt other ways of obtaining a distribution before falling back
        to the `installer` argument.Nr)rZrequirementr rrrr:es	zEnvironment.obtainccs&x |jj�D]}||r|VqWdS)z=Yield the unique project names of the available distributionsN)r.r)rr'rrrrGqszEnvironment.__iter__cCs^t|t�r|j|�nDt|t�rLx8|D] }x||D]}|j|�q4Wq&Wntd|f��|S)z2In-place addition of a distribution or environmentzCan't add %r to environment)rr�rr�r�)rr!Zprojectr�rrr�__iadd__ws


zEnvironment.__iadd__cCs.|jgddd�}x||fD]}||7}qW|S)z4Add an environment or distribution to an environmentN)rlr/)r)rr!�newrrrr�__add__�szEnvironment.__add__)N)N)N)rrrrrs�PY_MAJORr�r3r4r0r(rrr:rGr;r=rrrrr�s
	

c@seZdZdZdS)r�aTAn error occurred extracting a resource

    The following attributes are available from instances of this exception:

    manager
        The resource manager that raised this exception

    cache_path
        The base directory for resource extraction

    original_error
        The exception instance that caused extraction to fail
    N)rrrrrrrrr��s
c@s�eZdZdZdZdd�Zdd�Zdd�Zd	d
�Zdd�Z	d
d�Z
dd�Zdd�Zffdd�Z
edd��Zdd�Zdd�Zddd�ZdS)r�z'Manage resource extraction and packagesNcCs
i|_dS)N)�cached_files)rrrrr��szResourceManager.__init__cCst|�j|�S)zDoes the named resource exist?)rvr�)r�package_or_requirementr�rrrr��szResourceManager.resource_existscCst|�j|�S)z,Is the named resource an existing directory?)rvr�)rr@r�rrrr��szResourceManager.resource_isdircCst|�j||�S)z4Return a true filesystem path for specified resource)rvr�)rr@r�rrrr~�sz!ResourceManager.resource_filenamecCst|�j||�S)z9Return a readable file-like object for specified resource)rvr�)rr@r�rrrr}�szResourceManager.resource_streamcCst|�j||�S)z%Return specified resource as a string)rvr�)rr@r�rrrr|�szResourceManager.resource_stringcCst|�j|�S)z1List the contents of the named resource directory)rvr)rr@r�rrrr�sz ResourceManager.resource_listdircCsRtj�d}|jpt�}tjd�j�}t|jft	���}||_
||_||_|�dS)z5Give an error message for problems extracting file(s)r-a
            Can't extract file(s) to egg cache

            The following error occurred while trying to extract file(s) to the Python egg
            cache:

              {old_exc}

            The Python egg cache directory is currently set to:

              {cache_path}

            Perhaps your account does not have write access to this directory?  You can
            change the cache directory by setting the PYTHON_EGG_CACHE environment
            variable to point to an accessible directory.
            N)
rk�exc_info�extraction_pathr��textwrap�dedent�lstripr�r�r�r��
cache_pathZoriginal_error)r�old_excrF�tmpl�errrrr�extraction_error�s
z ResourceManager.extraction_errorc	Cs^|jp
t�}tjj||df|��}yt|�Wn|j�YnX|j|�d|j|<|S)a�Return absolute location in cache for `archive_name` and `names`

        The parent directory of the resulting path will be created if it does
        not already exist.  `archive_name` should be the base filename of the
        enclosing egg (which may not be the name of the enclosing zipfile!),
        including its ".egg" extension.  `names`, if provided, should be a
        sequence of path name parts "under" the egg's extraction location.

        This method should only be called by resource providers that need to
        obtain an extraction location, and only for names they intend to
        extract, as it tracks the generated names for possible cleanup later.
        z-tmpr-)	rBr�r�r�rm�_bypass_ensure_directoryrJ�_warn_unsafe_extraction_pathr?)rZarchive_name�namesZextract_pathZtarget_pathrrr�get_cache_path�s


zResourceManager.get_cache_pathcCsXtjdkr |jtjd�r dStj|�j}|tj@s@|tj@rTd|}tj	|t
�dS)aN
        If the default extraction path is overridden and set to an insecure
        location, such as /tmp, it opens up an opportunity for an attacker to
        replace an extracted file with an unauthorized payload. Warn the user
        if a known insecure location is used.

        See Distribute #375 for more details.
        �ntZwindirNz�%s is writable by group/others and vulnerable to attack when used with get_resource_filename. Consider a more secure location (set with .set_extraction_path or the PYTHON_EGG_CACHE environment variable).)r�r�r9�environ�stat�st_mode�S_IWOTH�S_IWGRPrCrD�UserWarning)r��mode�msgrrrrL�s
z,ResourceManager._warn_unsafe_extraction_pathcCs.tjdkr*tj|�jdBd@}tj||�dS)a4Perform any platform-specific postprocessing of `tempname`

        This is where Mac header rewrites should be done; other platforms don't
        have anything special they should do.

        Resource providers should call this method ONLY after successfully
        extracting a compressed resource.  They must NOT call it on resources
        that are already in the filesystem.

        `tempname` is the current (temporary) name of the file, and `filename`
        is the name it will be renamed to by the caller after this routine
        returns.
        �posiximi�N)r�r�rQrR�chmod)rZtempname�filenamerVrrr�postprocesss
zResourceManager.postprocesscCs|jrtd��||_dS)a�Set the base path where resources will be extracted to, if needed.

        If you do not call this routine before any extractions take place, the
        path defaults to the return value of ``get_default_cache()``.  (Which
        is based on the ``PYTHON_EGG_CACHE`` environment variable, with various
        platform-specific fallbacks.  See that routine's documentation for more
        details.)

        Resources are extracted to subdirectories of this path based upon
        information given by the ``IResourceProvider``.  You may set this to a
        temporary directory, but then you must call ``cleanup_resources()`` to
        delete the extracted files when done.  There is no guarantee that
        ``cleanup_resources()`` will be able to remove all extracted files.

        (Note: you may not change the extraction path for a given resource
        manager once resources have been extracted, unless you first call
        ``cleanup_resources()``.)
        z5Can't change extraction path, files already extractedN)r?rprB)rr�rrrr�)sz#ResourceManager.set_extraction_pathFcCsdS)aB
        Delete all extracted resource files and directories, returning a list
        of the file and directory names that could not be successfully removed.
        This function does not have any concurrency protection, so it should
        generally only be called when the extraction path is a temporary
        directory exclusive to a single process.  This method is not
        automatically called; you must call it explicitly or register it as an
        ``atexit`` function if you wish to ensure cleanup of a temporary
        directory used for extractions.
        Nr)r�forcerrrr�Csz!ResourceManager.cleanup_resources)F)rrrrrBr�r�r�r~r}r|rrJrN�staticmethodrLr[r�r�rrrrr��scCstjjd�ptjdd�S)z�
    Return the ``PYTHON_EGG_CACHE`` environment variable
    or a platform-relevant user cache dir for an app
    named "Python-Eggs".
    ZPYTHON_EGG_CACHEzPython-Eggs)Zappname)r�rPrBrZuser_cache_dirrrrrr�QscCstjdd|�S)z�Convert an arbitrary string to a standard distribution name

    Any runs of non-alphanumeric/. characters are replaced with a single '-'.
    z[^A-Za-z0-9.]+r+)r?�sub)r�rrrr�]scCsDyttjj|��Stjjk
r>|jdd�}tjdd|�SXdS)zB
    Convert an arbitrary string to a standard version string
    r�r,z[^A-Za-z0-9.]+r+N)rFrrK�VersionrLr5r?r^)rKrrrr�es
cCstjdd|�j�S)z�Convert an arbitrary string to a standard 'extra' name

    Any runs of non-alphanumeric characters are replaced with a single '_',
    and the result is always lowercased.
    z[^A-Za-z0-9.-]+r�)r?r^r8)r(rrrr�qscCs|jdd�S)z|Convert a project or version name to its filename-escaped form

    Any '-' characters are currently replaced with '_'.
    r+r�)r5)r�rrrr�zscCs>yt|�Wn,tk
r8}zd|_d|_|Sd}~XnXdS)zo
    Validate text as a PEP 508 environment marker; return an exception
    if invalid or False otherwise.
    NF)r��SyntaxErrorrZ�lineno)�text�errrr��scCsHytjj|�}|j�Stjjk
rB}zt|��WYdd}~XnXdS)z�
    Evaluate a PEP 508 environment marker.
    Return a boolean indicating the marker result in this environment.
    Raise SyntaxError if marker is invalid.

    This implementation uses the 'pyparsing' module.
    N)rZmarkersZMarkerr*Z
InvalidMarkerr`)rbr(r)rcrrrr��s
c@s�eZdZdZdZdZdZdd�Zdd�Zdd�Z	d	d
�Z
dd�Zd
d�Zdd�Z
dd�Zdd�Zdd�Zdd�Zdd�Zdd�Zdd�Zdd �Zd!d"�Zd#d$�Zd%d&�ZdS)'r�zETry to implement resources and metadata for arbitrary PEP 302 loadersNcCs(t|dd�|_tjjt|dd��|_dS)Nr��__file__r�)r�r�r�r��dirname�module_path)rr�rrrr��szNullProvider.__init__cCs|j|j|�S)N)�_fnrf)rr�r�rrrr��sz"NullProvider.get_resource_filenamecCstj|j||��S)N)�io�BytesIOr�)rr�r�rrrr��sz NullProvider.get_resource_streamcCs|j|j|j|��S)N)�_getrgrf)rr�r�rrrr��sz NullProvider.get_resource_stringcCs|j|j|j|��S)N)�_hasrgrf)rr�rrrr��szNullProvider.has_resourcecCs|jo|j|j|j|��S)N)�egg_inforkrg)rr�rrrr��szNullProvider.has_metadatacCs2|js
dS|j|j|j|��}tjr.|jd�S|S)Nr�zutf-8)rlrjrgrZPY3�decode)rr��valuerrrr��szNullProvider.get_metadatacCst|j|��S)N)r�r�)rr�rrrr��szNullProvider.get_metadata_linescCs|j|j|j|��S)N)�_isdirrgrf)rr�rrrr��szNullProvider.resource_isdircCs|jo|j|j|j|��S)N)rlrorg)rr�rrrr��szNullProvider.metadata_isdircCs|j|j|j|��S)N)�_listdirrgrf)rr�rrrr�szNullProvider.resource_listdircCs|jr|j|j|j|��SgS)N)rlrprg)rr�rrrr��szNullProvider.metadata_listdirc
Cs�d|}|j|�std|��|j|�jdd�}|jdd�}|j|j|�}||d<tjj|�r�t	|�j
�}t||d�}t|||�n>dd	l
m}t|�d|jd�|f||<t||d�}	t|	||�dS)
Nzscripts/zNo script named %rz
�
�
rd�execr)�cache)r�r�r�r5rgrlr�r�r�r�readr@rs�	linecachert�lenr)
rr�r�ZscriptZscript_textZscript_filename�source�codertZscript_coderrrru�s
zNullProvider.run_scriptcCstd��dS)Nz9Can't perform this operation for unregistered loader type)�NotImplementedError)rr�rrrrk�szNullProvider._hascCstd��dS)Nz9Can't perform this operation for unregistered loader type)rz)rr�rrrro�szNullProvider._isdircCstd��dS)Nz9Can't perform this operation for unregistered loader type)rz)rr�rrrrp�szNullProvider._listdircCs |rtjj|f|jd���S|S)N�/)r�r�rmr)r�baser�rrrrg�szNullProvider._fncCs$t|jd�r|jj|�Std��dS)N�get_dataz=Can't perform this operation for loaders without 'get_data()')r�r�r}rz)rr�rrrrj�szNullProvider._get)rrrr�egg_namerlr�r�r�r�r�r�r�r�r�r�r�rr�rurkrorprgrjrrrrr��s,c@s eZdZdZdd�Zdd�ZdS)r�z&Provider based on a virtual filesystemcCstj||�|j�dS)N)r�r��
_setup_prefix)rr�rrrr�szEggProvider.__init__cCs^|j}d}xN||krXt|�rBtjj|�|_tjj|d�|_||_P|}tjj	|�\}}qWdS)NzEGG-INFO)
rf�_is_unpacked_eggr�r��basenamer~rmrl�egg_rootr)rr��oldr|rrrr
s
zEggProvider._setup_prefixN)rrrrr�rrrrrr�sc@sDeZdZdZdd�Zdd�Zdd�Zdd	�Zd
d�Ze	dd
��Z
dS)r�z6Provides access to package resources in the filesystemcCstjj|�S)N)r�r�r�)rr�rrrrkszDefaultProvider._hascCstjj|�S)N)r�r�r
)rr�rrrroszDefaultProvider._isdircCs
tj|�S)N)r��listdir)rr�rrrrp"szDefaultProvider._listdircCst|j|j|�d�S)N�rb)rrgrf)rr�r�rrrr�%sz#DefaultProvider.get_resource_streamc	Cst|d��
}|j�SQRXdS)Nr�)rru)rr��streamrrrrj(szDefaultProvider._getcCsttdtd��}t||�dS)N�SourceFileLoader)r��importlib_machinery�typer�)rZ
loader_clsrrr�	_register,s
zDefaultProvider._registerN)rrrrrkrorpr�rjr'r�rrrrr�sc@s8eZdZdZdd�ZZdd�Zdd�ZdZdd�Z	dS)	r�z.Provider that returns nothing for all requestscCsdS)NFr)rr�rrrre9szEmptyProvider.<lambda>cCsdS)Nr�r)rr�rrrre:scCsgS)Nr)rr�rrrre;sNcCsdS)Nr)rrrrr�>szEmptyProvider.__init__)
rrrrrorkrjrprfr�rrrrr�6sc@s eZdZdZedd��ZeZdS)�ZipManifestsz
    zip manifest builder
    c
s2t|�� ��fdd��j�D�}t|�SQRXdS)a
        Build a dictionary similar to the zipimport directory
        caches, except instead of tuples, store ZipInfo objects.

        Use a platform-specific path separator (os.sep) for the path keys
        for compatibility with pypy on Windows.
        c3s&|]}|jdtj��j|�fVqdS)r{N)r5r��sepZgetinfo)r+r�)�zfilerrr,Usz%ZipManifests.build.<locals>.<genexpr>N)�ContextualZipFileZnamelistrR)rr�rVr)r�r�buildJs	

zZipManifests.buildN)rrrrr'r��loadrrrrr�Esr�c@s$eZdZdZejdd�Zdd�ZdS)�MemoizedZipManifestsz%
    Memoized zipfile manifests.
    �manifest_modzmanifest mtimecCsRtjj|�}tj|�j}||ks.||j|krH|j|�}|j||�||<||jS)zW
        Load a manifest at path or return a suitable manifest already loaded.
        )	r�r��normpathrQ�st_mtime�mtimer�r��manifest)rr�r�r�rrrr�fs
zMemoizedZipManifests.loadN)rrrrr�
namedtupler�r�rrrrr�`sr�cs0eZdZdZdd�Zdd�Z�fdd�Z�ZS)r�zL
    Supplement ZipFile class to support context manager for Python 2.6
    cCs|S)Nr)rrrr�	__enter__yszContextualZipFile.__enter__cCs|j�dS)N)�close)rr�rn�	tracebackrrr�__exit__|szContextualZipFile.__exit__cs(ttjd�rtj||�Stt|�j|�S)zI
        Construct a ZipFile or ContextualZipFile as appropriate
        r�)r��zipfile�ZipFilerr��__new__)rrd�kwargs)rrrr�szContextualZipFile.__new__)rrrrr�r�r�rHrr)rrr�tsr�c@s�eZdZdZdZe�Zdd�Zdd�Zdd�Z	e
d	d
��Zdd�Ze
d
d��Zdd�Zdd�Zdd�Zdd�Zdd�Zdd�Zdd�Zdd�Zdd �ZdS)!r�z"Resource support for zips and eggsNcCs tj||�|jjtj|_dS)N)r�r�r��archiver�r��zip_pre)rr�rrrr��szZipProvider.__init__cCs4|j|j�r|t|j�d�Std||jf��dS)Nz%s is not a subpath of %s)r9r�rw�AssertionError)r�fspathrrr�
_zipinfo_name�szZipProvider._zipinfo_namecCsP|j|}|j|jtj�r:|t|j�dd�jtj�Std||jf��dS)Nr-z%s is not a subpath of %s)r�r9r�r�r�rwrr�)r�zip_pathr�rrr�_parts�s

zZipProvider._partscCs|jj|jj�S)N)�_zip_manifestsr�r�r�)rrrr�zipinfo�szZipProvider.zipinfocCs`|jstd��|j|�}|j�}dj|j|��|krTx|D]}|j||j|��q:W|j||�S)Nz5resource_filename() only supported for .egg, not .zipr{)r~rz�_resource_to_zip�_get_eager_resourcesrmr��_extract_resource�
_eager_to_zip)rr�r�r��eagersr�rrrr��s

z!ZipProvider.get_resource_filenamecCs"|j}|jd}tj|�}||fS)Nrr-r7)rrr7)Z	file_size�	date_time�timeZmktime)Zzip_stat�sizer��	timestamprrr�_get_date_and_size�s

zZipProvider._get_date_and_sizec
Csn||j�krDx*|j�|D]}|j|tjj||��}qWtjj|�S|j|j|�\}}tsdt	d��y�|j
|j|j|��}|j
||�r�|Stdtjj|�d�\}}	tj||jj|��tj|�t|	||f�|j|	|�yt|	|�Wn\tjk
�rDtjj|��r>|j
||��r|Stjdk�r>t|�t|	|�|S�YnXWn tjk
�rh|j�YnX|S)Nz>"os.rename" and "os.unlink" are not supported on this platformz	.$extract)�dirrO)�_indexr�r�r�rmrer�r��
WRITE_SUPPORT�IOErrorrNr~r��_is_current�_mkstemp�writer�r}r�rr[r
�error�isfiler�rrJ)
rr�r�r�Zlastr�r�Z	real_pathZoutfZtmpnamrrrr��s@

zZipProvider._extract_resourcec		Csx|j|j|�\}}tjj|�s$dStj|�}|j|ksB|j|krFdS|jj	|�}t
|d��}|j�}WdQRX||kS)zK
        Return True if the file_path is current for this zip_path
        Fr�N)r�r�r�r�r�rQ�st_sizer�r�r}rru)	rZ	file_pathr�r�r�rQZzip_contents�fZ
file_contentsrrrr��s
zZipProvider._is_currentcCsB|jdkr<g}x&dD]}|j|�r|j|j|��qW||_|jS)N�native_libs.txt�eager_resources.txt)r�r�)r�r�rr�)rr�r�rrrr�s


z ZipProvider._get_eager_resourcescCs�y|jStk
r�i}xd|jD]Z}|jtj�}xH|rztjj|dd��}||krj||j|d�Pq4|j�g||<q4Wq"W||_|SXdS)Nr-r7r7)	Z	_dirindex�AttributeErrorr�rr�r�rmr;r:)rZindr�r<�parentrrrr�szZipProvider._indexcCs |j|�}||jkp||j�kS)N)r�r�r�)rr�r�rrrrks
zZipProvider._hascCs|j|�|j�kS)N)r�r�)rr�rrrro!szZipProvider._isdircCst|j�j|j|�f��S)N)rr�rBr�)rr�rrrrp$szZipProvider._listdircCs|j|j|j|��S)N)r�rgr�)rr�rrrr�'szZipProvider._eager_to_zipcCs|j|j|j|��S)N)r�rgrf)rr�rrrr�*szZipProvider._resource_to_zip)rrrrr�r�r�r�r�r�r�r�r�r]r�r�r�r�r�rkrorpr�r�rrrrr��s$	

	4	c@s8eZdZdZdd�Zdd�Zdd�Zdd	�Zd
d�ZdS)
r�a*Metadata handler for standalone PKG-INFO files

    Usage::

        metadata = FileMetadata("/path/to/PKG-INFO")

    This provider rejects all data and metadata requests except for PKG-INFO,
    which is treated as existing, and will be the contents of the file at
    the provided location.
    cCs
||_dS)N)r�)rr�rrrr�=szFileMetadata.__init__cCs|dkotjj|j�S)NzPKG-INFO)r�r�r�)rr�rrrr�@szFileMetadata.has_metadatac	CsD|dkrtd��tj|jddd��}|j�}WdQRX|j|�|S)NzPKG-INFOz(No metadata except PKG-INFO is availablezutf-8r5)�encoding�errors)r�rhrr�ru�_warn_on_replacement)rr�r��metadatarrrr�Cs
zFileMetadata.get_metadatacCs2djd�}||kr.d}|jft��}tj|�dS)Ns�zutf-8z2{self.path} could not be properly decoded in UTF-8)rmr�r�rCrD)rr�Zreplacement_charrHrWrrrr�Ls

z!FileMetadata._warn_on_replacementcCst|j|��S)N)r�r�)rr�rrrr�TszFileMetadata.get_metadata_linesN)	rrrrr�r�r�r�r�rrrrr�1s
	c@seZdZdZdd�ZdS)r�asMetadata provider for egg directories

    Usage::

        # Development eggs:

        egg_info = "/path/to/PackageName.egg-info"
        base_dir = os.path.dirname(egg_info)
        metadata = PathMetadata(base_dir, egg_info)
        dist_name = os.path.splitext(os.path.basename(egg_info))[0]
        dist = Distribution(basedir, project_name=dist_name, metadata=metadata)

        # Unpacked egg directories:

        egg_path = "/path/to/PackageName-ver-pyver-etc.egg"
        metadata = PathMetadata(egg_path, os.path.join(egg_path,'EGG-INFO'))
        dist = Distribution.from_filename(egg_path, metadata=metadata)
    cCs||_||_dS)N)rfrl)rr�rlrrrr�lszPathMetadata.__init__N)rrrrr�rrrrr�Xsc@seZdZdZdd�ZdS)r�z Metadata provider for .egg filescCsD|jtj|_||_|jr0tjj|j|j�|_n|j|_|j	�dS)z-Create a metadata provider from a zipimporterN)
r�r�r�r�r��prefixr�rmrfr)r�importerrrrr�tszEggMetadata.__init__N)rrrrr�rrrrr�qsrR)�_distribution_finderscCs|t|<dS)axRegister `distribution_finder` to find distributions in sys.path items

    `importer_type` is the type or class of a PEP 302 "Importer" (sys.path item
    handler), and `distribution_finder` is a callable that, passed a path
    item and the importer instance, yields ``Distribution`` instances found on
    that path item.  See ``pkg_resources.find_on_path`` for an example.N)r�)�
importer_typeZdistribution_finderrrrr��scCst|�}tt|�}||||�S)z.Yield distributions accessible via `path_item`)rr�r�)�	path_item�onlyr��finderrrrr��s
ccs�|jjd�rdSt|�}|jd�r2tj||d�V|r:dSxH|jd�D]:}t|�rFtj	j
||�}xttj
|�|�D]
}|VqrWqFWdS)z@
    Find eggs in zip files; possibly multiple nested eggs.
    z.whlNzPKG-INFO)r�r{)r��endswithr�r�r��
from_filenamerr�r�r�rm�find_eggs_in_zip�	zipimport�zipimporter)r�r�r�r�Zsubitem�subpathr�rrrr��s
r�cCsfS)Nr)r�r�r�rrr�find_nothing�sr�cCsdd�}t||dd�S)aL
    Given a list of filenames, return them in descending order
    by version number.

    >>> names = 'bar', 'foo', 'Python-2.7.10.egg', 'Python-2.7.2.egg'
    >>> _by_version_descending(names)
    ['Python-2.7.10.egg', 'Python-2.7.2.egg', 'foo', 'bar']
    >>> names = 'Setuptools-1.2.3b1.egg', 'Setuptools-1.2.3.egg'
    >>> _by_version_descending(names)
    ['Setuptools-1.2.3.egg', 'Setuptools-1.2.3b1.egg']
    >>> names = 'Setuptools-1.2.3b1.egg', 'Setuptools-1.2.3.post1.egg'
    >>> _by_version_descending(names)
    ['Setuptools-1.2.3.post1.egg', 'Setuptools-1.2.3b1.egg']
    cSs2tjj|�\}}tj|jd�|g�}dd�|D�S)z6
        Parse each component of the filename
        r+cSsg|]}tjj|��qSr)rrKr�)r+r3rrr�
<listcomp>�sz?_by_version_descending.<locals>._by_version.<locals>.<listcomp>)r�r��splitext�	itertools�chainr)r��extr<rrr�_by_version�sz+_by_version_descending.<locals>._by_versionT)r'r6)�sorted)rMr�rrr�_by_version_descending�sr�ccs�t|�}tjj|�o tj|tj��r�t|�rPtj|t	|tjj
|d��d�V�nTttj|��}�xB|D�]8}|j
�}|jd�s�|jd�r�tjj
||�}tjj|�r�ttj|��dkr�qft	||�}nt|�}tj|||td�Vqf|o�t|��rttjj
||��}x�|D]}	|	V�qWqf|rf|jd�rfttjj
||���}
|
j�}WdQRXxN|D]F}|j��sh�qVtjj
||j��}
t|
�}x|D]}|V�q�WP�qVWqfWdS)	z6Yield distributions accessible on a sys.path directoryzEGG-INFO)r�z	.egg-infoz
.dist-infor)�
precedencez	.egg-linkN)�_normalize_cachedr�r�r
�access�R_OKr�r�r�r�rmr�r�r8r�rwr��
from_locationr�r�r�	readlines�strip�rstrip)r�r�r�Zpath_item_entriesr�r8Zfullpathr�rr�Z
entry_fileZentry_lines�liner�rrrr�find_on_path�sB



r��
FileFinder)�_namespace_handlers)�_namespace_packagescCs|t|<dS)a�Register `namespace_handler` to declare namespace packages

    `importer_type` is the type or class of a PEP 302 "Importer" (sys.path item
    handler), and `namespace_handler` is a callable like this::

        def namespace_handler(importer, path_entry, moduleName, module):
            # return a path_entry to use for child packages

    Namespace handlers are only called if the importer object has already
    agreed that it can handle the relevant path item, and they should only
    return a subpath if the module __path__ does not already contain an
    equivalent subpath.  For an example namespace handler, see
    ``pkg_resources.file_ns_handler``.
    N)r�)r�Znamespace_handlerrrrr�scCs�t|�}|dkrdS|j|�}|dkr*dStjj|�}|dkrbtj|�}tj|<g|_t|�nt	|d�svt
d|��tt|�}|||||�}|dk	r�|j}|j
|�|j|�t|||�|S)zEEnsure that named package includes a subpath of path_item (if needed)N�__path__zNot a package:)r�find_modulerkr�rB�types�
ModuleTyper��_set_parent_nsr�r�r�r�r;�load_module�_rebuild_mod_path)�packageNamer�r�r�r�Zhandlerr�r�rrr�
_handle_nss*






r�csRdd�tjD���fdd����fdd�}|j|d�dd�|D�|jd	d	�<d	S)
zq
    Rebuild module.__path__ ensuring that all entries are ordered
    corresponding to their sys.path order
    cSsg|]}t|��qSr)r�)r+�prrrr�5sz%_rebuild_mod_path.<locals>.<listcomp>cs(y
�j|�Stk
r"td�SXdS)z/
        Workaround for #520 and #513.
        �infN)�indexrp�float)r�)�sys_pathrr�safe_sys_path_index7s
z._rebuild_mod_path.<locals>.safe_sys_path_indexcs<|jtj�}�jd�d}|d|�}�ttjj|���S)zR
        Return the ordinal of the path based on its position in sys.path
        r,r-N)rr�r��countr�rm)r��
path_partsZmodule_partsr<)�package_namer�rr�position_in_sys_path@sz/_rebuild_mod_path.<locals>.position_in_sys_path)r'cSsg|]}t|��qSr)r�)r+r�rrrr�JsN)rkr�r!r�)Z	orig_pathr�r�r�r)r�r�r�rr�0s
		r�cCs�tj�z�|tkrdStjd}}d|kr�dj|jd�dd��}t|�|tkrZt|�ytj	|j
}Wntk
r�td|��YnXtj
|g�j|�tj
|g�x|D]}t||�q�WWdtj�XdS)z9Declare that package 'packageName' is a namespace packageNr,r-zNot a package:r7)�_imp�acquire_lockr�rkr�rmrr�r�r�r�r�r�rr;r��release_lock)r�r�r�r�rrrr�Ms&
c
CsJtj�z2x,tj|f�D]}t||�}|rt||�qWWdtj�XdS)zDEnsure that previously-declared namespace packages include path_itemN)r�r�r�rBr�r�r�)r�r��packager�rrrr�ns
cCsFtjj||jd�d�}t|�}x |jD]}t|�|kr(Pq(W|SdS)zBCompute an ns-package subpath for a filesystem or zipfile importerr,r-Nr7)r�r�rmrr�r�)r�r�r�r�r�Z
normalizedrrrr�file_ns_handlerzsrcCsdS)Nr)r�r�r�r�rrr�null_ns_handler�srcCstjjtjj|��S)z1Normalize a file/dir name for comparison purposes)r�r��normcase�realpath)rZrrrr��scCs2y||Stk
r,t|�||<}|SXdS)N)r�r�)rZr��resultrrrr��s
r�cCs|j�jd�S)z@
    Determine if given path appears to be an unpacked egg.
    z.egg)r8r�)r�rrrr��sr�cCs<|jd�}|j�}|r8dj|�}ttj||tj|�dS)Nr,)rr:rm�setattrrkr�)r�r<r�r�rrrr��s


r�ccsht|tj�r>xV|j�D]"}|j�}|r|jd�r|VqWn&x$|D]}xt|�D]
}|VqRWqDWdS)z9Yield non-empty/non-comment lines of a string or sequence�#N)rrr��
splitlinesr�r9r�)�strsr2Zssrrrr��s
z\w+(\.\w+)*$z�
    (?P<name>[^-]+) (
        -(?P<ver>[^-]+) (
            -py(?P<pyver>[^-]+) (
                -(?P<plat>.+)
            )?
        )?
    )?
    c@s�eZdZdZffdfdd�Zdd�Zdd�Zdd
d�Zdd
�Zddd�Z	e
jd�Ze
ddd��Ze
dd��Ze
ddd��Ze
ddd��ZdS)r�z3Object representing an advertised importable objectNcCsJt|�std|��||_||_t|�|_tjddj|��j	|_	||_
dS)NzInvalid module namezx[%s]�,)�MODULErpr��module_namer�attrsr�r�rmrr�)rr�rrrr�rrrr��s

zEntryPoint.__init__cCsHd|j|jf}|jr*|ddj|j�7}|jrD|ddj|j�7}|S)Nz%s = %s�:r,z [%s]r	)r�rrrmr)rr2rrrr��szEntryPoint.__str__cCsdt|�S)NzEntryPoint.parse(%r))rF)rrrrr��szEntryPoint.__repr__TcOs6|s|s|rtjdtdd�|r.|j||�|j�S)zH
        Require packages for this EntryPoint, then resolve it.
        zJParameters to load are deprecated.  Call .resolve and .require separately.rg)r>)rCrD�DeprecationWarningrtr)rrtrdr�rrrr��szEntryPoint.loadcCsVt|jdgdd�}ytjt|j|�Stk
rP}ztt|���WYdd}~XnXdS)zD
        Resolve the entry point from its module and attrs.
        rr)�fromlist�levelN)	r�r�	functools�reducer�rr�r�rF)rr��excrrrr�s
zEntryPoint.resolvecCsH|jr|jrtd|��|jj|j�}tj|||�}tttj|��dS)Nz&Can't require() without a distribution)	rr�r�rr�rrrr)rrr rrVrrrrt	s

zEntryPoint.requirez]\s*(?P<name>.+?)\s*=\s*(?P<module>[\w.]+)\s*(:\s*(?P<attr>[\w.]+))?\s*(?P<extras>\[.*\])?\s*$cCsf|jj|�}|sd}t||��|j�}|j|d�}|drJ|djd�nf}||d|d|||�S)aParse a single entry point from string `src`

        Entry point syntax follows the form::

            name = some.module:some.attr [extra1, extra2]

        The entry name and module name are required, but the ``:attrs`` and
        ``[extras]`` parts are optional
[compiled Python bytecode (.pyc) omitted: binary data, not human-readable]
site-packages/pip/compat/dictconfig.py
# This is a copy of the Python logging.config.dictconfig module,
# reproduced with permission. It is provided here for backwards
# compatibility for Python versions prior to 2.7.
#
# Copyright 2009-2010 by Vinay Sajip. All Rights Reserved.
#
# Permission to use, copy, modify, and distribute this software and its
# documentation for any purpose and without fee is hereby granted,
# provided that the above copyright notice appear in all copies and that
# both that copyright notice and this permission notice appear in
# supporting documentation, and that the name of Vinay Sajip
# not be used in advertising or publicity pertaining to distribution
# of the software without specific, written prior permission.
# VINAY SAJIP DISCLAIMS ALL WARRANTIES WITH REGARD TO THIS SOFTWARE, INCLUDING
# ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL
# VINAY SAJIP BE LIABLE FOR ANY SPECIAL, INDIRECT OR CONSEQUENTIAL DAMAGES OR
# ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER
# IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT
# OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.
from __future__ import absolute_import

import logging.handlers
import re
import sys
import types

from pip._vendor import six

# flake8: noqa

IDENTIFIER = re.compile('^[a-z_][a-z0-9_]*$', re.I)


def valid_ident(s):
    m = IDENTIFIER.match(s)
    if not m:
        raise ValueError('Not a valid Python identifier: %r' % s)
    return True

#
# This function is defined in logging only in recent versions of Python
#
try:
    from logging import _checkLevel
except ImportError:
    def _checkLevel(level):
        if isinstance(level, int):
            rv = level
        elif str(level) == level:
            if level not in logging._levelNames:
                raise ValueError('Unknown level: %r' % level)
            rv = logging._levelNames[level]
        else:
            raise TypeError('Level not an integer or a '
                            'valid string: %r' % level)
        return rv

# The ConvertingXXX classes are wrappers around standard Python containers,
# and they serve to convert any suitable values in the container. The
# conversion converts base dicts, lists and tuples to their wrapped
# equivalents, whereas strings which match a conversion format are converted
# appropriately.
#
# Each wrapper should have a configurator attribute holding the actual
# configurator to use for conversion.
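#
# For example (an illustrative sketch, not taken from the code below): given a
# configurator whose config is {'handlers': {'console': {...}}}, the string
# 'cfg://handlers.console' is resolved by cfg_convert(), which looks up
# 'handlers' and then '.console'; a string such as 'ext://sys.stderr' is
# resolved by ext_convert(), which imports 'sys' and returns the actual
# sys.stderr object.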


class ConvertingDict(dict):
    """A converting dictionary wrapper."""

    def __getitem__(self, key):
        value = dict.__getitem__(self, key)
        result = self.configurator.convert(value)
        # If the converted value is different, save for next time
        if value is not result:
            self[key] = result
            if type(result) in (ConvertingDict, ConvertingList,
                                ConvertingTuple):
                result.parent = self
                result.key = key
        return result

    def get(self, key, default=None):
        value = dict.get(self, key, default)
        result = self.configurator.convert(value)
        # If the converted value is different, save for next time
        if value is not result:
            self[key] = result
            if type(result) in (ConvertingDict, ConvertingList,
                                ConvertingTuple):
                result.parent = self
                result.key = key
        return result

    def pop(self, key, default=None):
        value = dict.pop(self, key, default)
        result = self.configurator.convert(value)
        if value is not result:
            if type(result) in (ConvertingDict, ConvertingList,
                                ConvertingTuple):
                result.parent = self
                result.key = key
        return result


class ConvertingList(list):
    """A converting list wrapper."""
    def __getitem__(self, key):
        value = list.__getitem__(self, key)
        result = self.configurator.convert(value)
        # If the converted value is different, save for next time
        if value is not result:
            self[key] = result
            if type(result) in (ConvertingDict, ConvertingList,
                                ConvertingTuple):
                result.parent = self
                result.key = key
        return result

    def pop(self, idx=-1):
        value = list.pop(self, idx)
        result = self.configurator.convert(value)
        if value is not result:
            if type(result) in (ConvertingDict, ConvertingList,
                                ConvertingTuple):
                result.parent = self
        return result


class ConvertingTuple(tuple):
    """A converting tuple wrapper."""
    def __getitem__(self, key):
        value = tuple.__getitem__(self, key)
        result = self.configurator.convert(value)
        if value is not result:
            if type(result) in (ConvertingDict, ConvertingList,
                                ConvertingTuple):
                result.parent = self
                result.key = key
        return result


class BaseConfigurator(object):
    """
    The configurator base class which defines some useful defaults.
    """

    CONVERT_PATTERN = re.compile(r'^(?P<prefix>[a-z]+)://(?P<suffix>.*)$')

    WORD_PATTERN = re.compile(r'^\s*(\w+)\s*')
    DOT_PATTERN = re.compile(r'^\.\s*(\w+)\s*')
    INDEX_PATTERN = re.compile(r'^\[\s*(\w+)\s*\]\s*')
    DIGIT_PATTERN = re.compile(r'^\d+$')

    value_converters = {
        'ext' : 'ext_convert',
        'cfg' : 'cfg_convert',
    }

    # We might want to use a different one, e.g. importlib
    importer = __import__

    def __init__(self, config):
        self.config = ConvertingDict(config)
        self.config.configurator = self

    def resolve(self, s):
        """
        Resolve strings to objects using standard import and attribute
        syntax.
        """
        name = s.split('.')
        used = name.pop(0)
        try:
            found = self.importer(used)
            for frag in name:
                used += '.' + frag
                try:
                    found = getattr(found, frag)
                except AttributeError:
                    self.importer(used)
                    found = getattr(found, frag)
            return found
        except ImportError:
            e, tb = sys.exc_info()[1:]
            v = ValueError('Cannot resolve %r: %s' % (s, e))
            v.__cause__, v.__traceback__ = e, tb
            raise v

    def ext_convert(self, value):
        """Default converter for the ext:// protocol."""
        return self.resolve(value)

    def cfg_convert(self, value):
        """Default converter for the cfg:// protocol."""
        rest = value
        m = self.WORD_PATTERN.match(rest)
        if m is None:
            raise ValueError("Unable to convert %r" % value)
        else:
            rest = rest[m.end():]
            d = self.config[m.groups()[0]]
            # print d, rest
            while rest:
                m = self.DOT_PATTERN.match(rest)
                if m:
                    d = d[m.groups()[0]]
                else:
                    m = self.INDEX_PATTERN.match(rest)
                    if m:
                        idx = m.groups()[0]
                        if not self.DIGIT_PATTERN.match(idx):
                            d = d[idx]
                        else:
                            try:
                                n = int(idx)  # try as number first (most likely)
                                d = d[n]
                            except TypeError:
                                d = d[idx]
                if m:
                    rest = rest[m.end():]
                else:
                    raise ValueError('Unable to convert '
                                     '%r at %r' % (value, rest))
        # rest should be empty
        return d

    def convert(self, value):
        """
        Convert values to an appropriate type. dicts, lists and tuples are
        replaced by their converting alternatives. Strings are checked to
        see if they have a conversion format and are converted if they do.
        """
        if not isinstance(value, ConvertingDict) and isinstance(value, dict):
            value = ConvertingDict(value)
            value.configurator = self
        elif not isinstance(value, ConvertingList) and isinstance(value, list):
            value = ConvertingList(value)
            value.configurator = self
        elif not isinstance(value, ConvertingTuple) and\
                 isinstance(value, tuple):
            value = ConvertingTuple(value)
            value.configurator = self
        elif isinstance(value, six.string_types):  # str for py3k
            m = self.CONVERT_PATTERN.match(value)
            if m:
                d = m.groupdict()
                prefix = d['prefix']
                converter = self.value_converters.get(prefix, None)
                if converter:
                    suffix = d['suffix']
                    converter = getattr(self, converter)
                    value = converter(suffix)
        return value

    def configure_custom(self, config):
        """Configure an object with a user-supplied factory."""
        c = config.pop('()')
        if not hasattr(c, '__call__') and hasattr(types, 'ClassType') and type(c) != types.ClassType:
            c = self.resolve(c)
        props = config.pop('.', None)
        # Check for valid identifiers
        kwargs = dict((k, config[k]) for k in config if valid_ident(k))
        result = c(**kwargs)
        if props:
            for name, value in props.items():
                setattr(result, name, value)
        return result

    def as_tuple(self, value):
        """Utility function which converts lists to tuples."""
        if isinstance(value, list):
            value = tuple(value)
        return value


class DictConfigurator(BaseConfigurator):
    """
    Configure logging using a dictionary-like object to describe the
    configuration.
    """

    def configure(self):
        """Do the configuration."""

        config = self.config
        if 'version' not in config:
            raise ValueError("dictionary doesn't specify a version")
        if config['version'] != 1:
            raise ValueError("Unsupported version: %s" % config['version'])
        incremental = config.pop('incremental', False)
        EMPTY_DICT = {}
        logging._acquireLock()
        try:
            if incremental:
                handlers = config.get('handlers', EMPTY_DICT)
                # incremental handler config only if handler name
                # ties in to logging._handlers (Python 2.7)
                if sys.version_info[:2] == (2, 7):
                    for name in handlers:
                        if name not in logging._handlers:
                            raise ValueError('No handler found with '
                                             'name %r'  % name)
                        else:
                            try:
                                handler = logging._handlers[name]
                                handler_config = handlers[name]
                                level = handler_config.get('level', None)
                                if level:
                                    handler.setLevel(_checkLevel(level))
                            except StandardError as e:
                                raise ValueError('Unable to configure handler '
                                                 '%r: %s' % (name, e))
                loggers = config.get('loggers', EMPTY_DICT)
                for name in loggers:
                    try:
                        self.configure_logger(name, loggers[name], True)
                    except StandardError as e:
                        raise ValueError('Unable to configure logger '
                                         '%r: %s' % (name, e))
                root = config.get('root', None)
                if root:
                    try:
                        self.configure_root(root, True)
                    except StandardError as e:
                        raise ValueError('Unable to configure root '
                                         'logger: %s' % e)
            else:
                disable_existing = config.pop('disable_existing_loggers', True)

                logging._handlers.clear()
                del logging._handlerList[:]

                # Do formatters first - they don't refer to anything else
                formatters = config.get('formatters', EMPTY_DICT)
                for name in formatters:
                    try:
                        formatters[name] = self.configure_formatter(
                                                            formatters[name])
                    except StandardError as e:
                        raise ValueError('Unable to configure '
                                         'formatter %r: %s' % (name, e))
                # Next, do filters - they don't refer to anything else, either
                filters = config.get('filters', EMPTY_DICT)
                for name in filters:
                    try:
                        filters[name] = self.configure_filter(filters[name])
                    except StandardError as e:
                        raise ValueError('Unable to configure '
                                         'filter %r: %s' % (name, e))

                # Next, do handlers - they refer to formatters and filters
                # As handlers can refer to other handlers, sort the keys
                # to allow a deterministic order of configuration
                handlers = config.get('handlers', EMPTY_DICT)
                for name in sorted(handlers):
                    try:
                        handler = self.configure_handler(handlers[name])
                        handler.name = name
                        handlers[name] = handler
                    except StandardError as e:
                        raise ValueError('Unable to configure handler '
                                         '%r: %s' % (name, e))
                # Next, do loggers - they refer to handlers and filters

                # we don't want to lose the existing loggers,
                # since other threads may have pointers to them.
                # existing is set to contain all existing loggers,
                # and as we go through the new configuration we
                # remove any which are configured. At the end,
                # what's left in existing is the set of loggers
                # which were in the previous configuration but
                # which are not in the new configuration.
                root = logging.root
                existing = list(root.manager.loggerDict)
                # The list needs to be sorted so that we can
                # avoid disabling child loggers of explicitly
                # named loggers. With a sorted list it is easier
                # to find the child loggers.
                existing.sort()
                # We'll keep the list of existing loggers
                # which are children of named loggers here...
                child_loggers = []
                # now set up the new ones...
                loggers = config.get('loggers', EMPTY_DICT)
                for name in loggers:
                    if name in existing:
                        i = existing.index(name)
                        prefixed = name + "."
                        pflen = len(prefixed)
                        num_existing = len(existing)
                        i = i + 1  # look at the entry after name
                        while (i < num_existing) and\
                              (existing[i][:pflen] == prefixed):
                            child_loggers.append(existing[i])
                            i = i + 1
                        existing.remove(name)
                    try:
                        self.configure_logger(name, loggers[name])
                    except StandardError as e:
                        raise ValueError('Unable to configure logger '
                                         '%r: %s' % (name, e))

                # Disable any old loggers. There's no point deleting
                # them as other threads may continue to hold references
                # and by disabling them, you stop them doing any logging.
                # However, don't disable children of named loggers, as that's
                # probably not what was intended by the user.
                for log in existing:
                    logger = root.manager.loggerDict[log]
                    if log in child_loggers:
                        logger.level = logging.NOTSET
                        logger.handlers = []
                        logger.propagate = True
                    elif disable_existing:
                        logger.disabled = True

                # And finally, do the root logger
                root = config.get('root', None)
                if root:
                    try:
                        self.configure_root(root)
                    except StandardError as e:
                        raise ValueError('Unable to configure root '
                                         'logger: %s' % e)
        finally:
            logging._releaseLock()

    def configure_formatter(self, config):
        """Configure a formatter from a dictionary."""
        if '()' in config:
            factory = config['()']  # for use in exception handler
            try:
                result = self.configure_custom(config)
            except TypeError as te:
                if "'format'" not in str(te):
                    raise
                # Name of parameter changed from fmt to format.
                # Retry with old name.
                # This is so that code can be used with older Python versions
                # (e.g. by Django)
                config['fmt'] = config.pop('format')
                config['()'] = factory
                result = self.configure_custom(config)
        else:
            fmt = config.get('format', None)
            dfmt = config.get('datefmt', None)
            result = logging.Formatter(fmt, dfmt)
        return result

    def configure_filter(self, config):
        """Configure a filter from a dictionary."""
        if '()' in config:
            result = self.configure_custom(config)
        else:
            name = config.get('name', '')
            result = logging.Filter(name)
        return result

    def add_filters(self, filterer, filters):
        """Add filters to a filterer from a list of names."""
        for f in filters:
            try:
                filterer.addFilter(self.config['filters'][f])
            except StandardError as e:
                raise ValueError('Unable to add filter %r: %s' % (f, e))

    def configure_handler(self, config):
        """Configure a handler from a dictionary."""
        formatter = config.pop('formatter', None)
        if formatter:
            try:
                formatter = self.config['formatters'][formatter]
            except StandardError as e:
                raise ValueError('Unable to set formatter '
                                 '%r: %s' % (formatter, e))
        level = config.pop('level', None)
        filters = config.pop('filters', None)
        if '()' in config:
            c = config.pop('()')
            if not hasattr(c, '__call__') and hasattr(types, 'ClassType') and type(c) != types.ClassType:
                c = self.resolve(c)
            factory = c
        else:
            klass = self.resolve(config.pop('class'))
            # Special case for handler which refers to another handler
            if issubclass(klass, logging.handlers.MemoryHandler) and\
                'target' in config:
                try:
                    config['target'] = self.config['handlers'][config['target']]
                except StandardError as e:
                    raise ValueError('Unable to set target handler '
                                     '%r: %s' % (config['target'], e))
            elif issubclass(klass, logging.handlers.SMTPHandler) and\
                'mailhost' in config:
                config['mailhost'] = self.as_tuple(config['mailhost'])
            elif issubclass(klass, logging.handlers.SysLogHandler) and\
                'address' in config:
                config['address'] = self.as_tuple(config['address'])
            factory = klass
        kwargs = dict((k, config[k]) for k in config if valid_ident(k))
        try:
            result = factory(**kwargs)
        except TypeError as te:
            if "'stream'" not in str(te):
                raise
            # The argument name changed from strm to stream
            # Retry with old name.
            # This is so that code can be used with older Python versions
            # (e.g. by Django)
            kwargs['strm'] = kwargs.pop('stream')
            result = factory(**kwargs)
        if formatter:
            result.setFormatter(formatter)
        if level is not None:
            result.setLevel(_checkLevel(level))
        if filters:
            self.add_filters(result, filters)
        return result

    def add_handlers(self, logger, handlers):
        """Add handlers to a logger from a list of names."""
        for h in handlers:
            try:
                logger.addHandler(self.config['handlers'][h])
            except StandardError as e:
                raise ValueError('Unable to add handler %r: %s' % (h, e))

    def common_logger_config(self, logger, config, incremental=False):
        """
        Perform configuration which is common to root and non-root loggers.
        """
        level = config.get('level', None)
        if level is not None:
            logger.setLevel(_checkLevel(level))
        if not incremental:
            # Remove any existing handlers
            for h in logger.handlers[:]:
                logger.removeHandler(h)
            handlers = config.get('handlers', None)
            if handlers:
                self.add_handlers(logger, handlers)
            filters = config.get('filters', None)
            if filters:
                self.add_filters(logger, filters)

    def configure_logger(self, name, config, incremental=False):
        """Configure a non-root logger from a dictionary."""
        logger = logging.getLogger(name)
        self.common_logger_config(logger, config, incremental)
        propagate = config.get('propagate', None)
        if propagate is not None:
            logger.propagate = propagate

    def configure_root(self, config, incremental=False):
        """Configure a root logger from a dictionary."""
        root = logging.getLogger()
        self.common_logger_config(root, config, incremental)

dictConfigClass = DictConfigurator


def dictConfig(config):
    """Configure logging using a dictionary."""
    dictConfigClass(config).configure()
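

# ---------------------------------------------------------------------------
# Minimal usage sketch (illustrative only; the names "simple" and "console"
# are arbitrary examples).  It shows the dictionary schema DictConfigurator
# expects: a 'version' key equal to 1, plus optional 'formatters', 'filters',
# 'handlers', 'loggers' and 'root' sections.  Only runs as a script.
if __name__ == '__main__':
    _example_config = {
        'version': 1,
        'formatters': {
            'simple': {'format': '%(levelname)s %(name)s: %(message)s'},
        },
        'handlers': {
            'console': {
                'class': 'logging.StreamHandler',
                'level': 'DEBUG',
                'formatter': 'simple',
            },
        },
        'root': {'level': 'DEBUG', 'handlers': ['console']},
    }
    dictConfig(_example_config)
    logging.getLogger('example').info('logging configured via dictConfig')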
site-packages/pip/compat/__pycache__/__init__.cpython-36.pyc
    [compiled bytecode omitted: binary data, not human-readable]
site-packages/pip/compat/__pycache__/__init__.cpython-36.opt-1.pyc
    [compiled bytecode omitted: binary data, not human-readable]
site-packages/pip/compat/__pycache__/dictconfig.cpython-36.opt-1.pyc
    [compiled bytecode omitted: binary data, not human-readable]
site-packages/pip/compat/__pycache__/dictconfig.cpython-36.pyc
    [compiled bytecode omitted: binary data, not human-readable]
site-packages/pip/compat/__init__.py
"""Stuff that differs in different Python versions and platform
distributions."""
from __future__ import absolute_import, division

import os
import sys

from pip._vendor.six import text_type

try:
    from logging.config import dictConfig as logging_dictConfig
except ImportError:
    from pip.compat.dictconfig import dictConfig as logging_dictConfig

try:
    from collections import OrderedDict
except ImportError:
    from pip._vendor.ordereddict import OrderedDict

try:
    import ipaddress
except ImportError:
    try:
        from pip._vendor import ipaddress
    except ImportError:
        import ipaddr as ipaddress
        ipaddress.ip_address = ipaddress.IPAddress
        ipaddress.ip_network = ipaddress.IPNetwork


try:
    import sysconfig

    def get_stdlib():
        paths = [
            sysconfig.get_path("stdlib"),
            sysconfig.get_path("platstdlib"),
        ]
        return set(filter(bool, paths))
except ImportError:
    from distutils import sysconfig

    def get_stdlib():
        paths = [
            sysconfig.get_python_lib(standard_lib=True),
            sysconfig.get_python_lib(standard_lib=True, plat_specific=True),
        ]
        return set(filter(bool, paths))


__all__ = [
    "logging_dictConfig", "ipaddress", "uses_pycache", "console_to_str",
    "native_str", "get_path_uid", "stdlib_pkgs", "WINDOWS", "samefile",
    "OrderedDict",
]


if sys.version_info >= (3, 4):
    uses_pycache = True
    from importlib.util import cache_from_source
else:
    import imp
    uses_pycache = hasattr(imp, 'cache_from_source')
    if uses_pycache:
        cache_from_source = imp.cache_from_source
    else:
        cache_from_source = None


if sys.version_info >= (3,):
    def console_to_str(s):
        try:
            return s.decode(sys.__stdout__.encoding)
        except UnicodeDecodeError:
            return s.decode('utf_8')

    def native_str(s, replace=False):
        if isinstance(s, bytes):
            return s.decode('utf-8', 'replace' if replace else 'strict')
        return s

else:
    def console_to_str(s):
        return s

    def native_str(s, replace=False):
        # Replace is ignored -- unicode to UTF-8 can't fail
        if isinstance(s, text_type):
            return s.encode('utf-8')
        return s


def total_seconds(td):
    if hasattr(td, "total_seconds"):
        return td.total_seconds()
    else:
        val = td.microseconds + (td.seconds + td.days * 24 * 3600) * 10 ** 6
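        # equivalent to td.total_seconds(); e.g. timedelta(days=1) yields 86400.0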
        return val / 10 ** 6


def get_path_uid(path):
    """
    Return path's uid.

    Does not follow symlinks:
        https://github.com/pypa/pip/pull/935#discussion_r5307003

    Placed this function in compat due to differences on AIX and
    Jython, that should eventually go away.

    :raises OSError: When path is a symlink or can't be read.
    """
    if hasattr(os, 'O_NOFOLLOW'):
        fd = os.open(path, os.O_RDONLY | os.O_NOFOLLOW)
        file_uid = os.fstat(fd).st_uid
        os.close(fd)
    else:  # AIX and Jython
        # WARNING: time of check vulnerability, but best we can do w/o NOFOLLOW
        if not os.path.islink(path):
            # older versions of Jython don't have `os.fstat`
            file_uid = os.stat(path).st_uid
        else:
            # raise OSError for parity with os.O_NOFOLLOW above
            raise OSError(
                "%s is a symlink; Will not return uid for symlinks" % path
            )
    return file_uid
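
# Hedged usage sketch (not in the original source); '/tmp' is only an example
# path and the behaviour assumes a POSIX system:
#
#     >>> import os
#     >>> get_path_uid('/tmp') == os.stat('/tmp').st_uid
#     True
#
# For a symlink, the call raises OSError instead of following the link.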


def expanduser(path):
    """
    Expand ~ and ~user constructions.

    Includes a workaround for http://bugs.python.org/issue14768
    """
    expanded = os.path.expanduser(path)
    if path.startswith('~/') and expanded.startswith('//'):
        expanded = expanded[1:]
    return expanded
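
# Hedged illustration of the workaround above (hypothetical environment): on
# affected Python versions, HOME='/' makes os.path.expanduser('~/x') return
# '//x'; this wrapper strips the duplicated leading slash so the result is
# '/x'.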


# Packages in the stdlib that may have installation metadata but should not be
# considered 'installed'. This could theoretically be determined from
# dist.location (py27: `sysconfig.get_paths()['stdlib']`,
# py26: sysconfig.get_config_vars('LIBDEST')), but platform variation may make
# that unreliable, so the list is hard-coded.
stdlib_pkgs = ('python', 'wsgiref')
if sys.version_info >= (2, 7):
    stdlib_pkgs += ('argparse',)


# windows detection, covers cpython and ironpython
WINDOWS = (sys.platform.startswith("win") or
           (sys.platform == 'cli' and os.name == 'nt'))


def samefile(file1, file2):
    """Provide an alternative for os.path.samefile on Windows/Python2"""
    if hasattr(os.path, 'samefile'):
        return os.path.samefile(file1, file2)
    else:
        path1 = os.path.normcase(os.path.abspath(file1))
        path2 = os.path.normcase(os.path.abspath(file2))
        return path1 == path2
site-packages/pip/cmdoptions.py
"""
shared options and groups

The principle here is to define options once, but *not* instantiate them
globally. One reason is that options with action='append' can carry state
between parses; pip parses general options twice internally and shouldn't
pass that state on. To be consistent, all options follow this design.

"""
from __future__ import absolute_import

from functools import partial
from optparse import OptionGroup, SUPPRESS_HELP, Option
import warnings

from pip.index import (
    FormatControl, fmt_ctl_handle_mutual_exclude, fmt_ctl_no_binary,
    fmt_ctl_no_use_wheel)
from pip.models import PyPI
from pip.locations import USER_CACHE_DIR, src_prefix
from pip.utils.hashes import STRONG_HASHES


def make_option_group(group, parser):
    """
    Return an OptionGroup object
    group  -- assumed to be dict with 'name' and 'options' keys
    parser -- an optparse Parser
    """
    option_group = OptionGroup(parser, group['name'])
    for option in group['options']:
        option_group.add_option(option())
    return option_group
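
# Illustrative sketch (not part of pip) of the "define once, instantiate per
# parse" design described in the module docstring: each entry in a group's
# 'options' list is a callable (a functools.partial of Option, or a function
# returning an Option), so a fresh Option instance is built for every parser.
#
#     from optparse import OptionParser
#
#     example_group = {'name': 'Example Options', 'options': [verbose, quiet]}
#     parser = OptionParser()
#     parser.add_option_group(make_option_group(example_group, parser))
#
# 'example_group' is hypothetical; 'verbose' and 'quiet' are the partials
# defined later in this module.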


def resolve_wheel_no_use_binary(options):
    if not options.use_wheel:
        control = options.format_control
        fmt_ctl_no_use_wheel(control)


def check_install_build_global(options, check_options=None):
    """Disable wheels if per-setup.py call options are set.

    :param options: The OptionParser options to update.
    :param check_options: The options to check, if not supplied defaults to
        options.
    """
    if check_options is None:
        check_options = options

    def getname(n):
        return getattr(check_options, n, None)
    names = ["build_options", "global_options", "install_options"]
    if any(map(getname, names)):
        control = options.format_control
        fmt_ctl_no_binary(control)
        warnings.warn(
            'Disabling all use of wheels due to the use of --build-options '
            '/ --global-options / --install-options.', stacklevel=2)


###########
# options #
###########

help_ = partial(
    Option,
    '-h', '--help',
    dest='help',
    action='help',
    help='Show help.')

isolated_mode = partial(
    Option,
    "--isolated",
    dest="isolated_mode",
    action="store_true",
    default=False,
    help=(
        "Run pip in an isolated mode, ignoring environment variables and user "
        "configuration."
    ),
)

require_virtualenv = partial(
    Option,
    # Run only if inside a virtualenv, bail if not.
    '--require-virtualenv', '--require-venv',
    dest='require_venv',
    action='store_true',
    default=False,
    help=SUPPRESS_HELP)

verbose = partial(
    Option,
    '-v', '--verbose',
    dest='verbose',
    action='count',
    default=0,
    help='Give more output. Option is additive, and can be used up to 3 times.'
)

version = partial(
    Option,
    '-V', '--version',
    dest='version',
    action='store_true',
    help='Show version and exit.')

quiet = partial(
    Option,
    '-q', '--quiet',
    dest='quiet',
    action='count',
    default=0,
    help=('Give less output. Option is additive, and can be used up to 3'
          ' times (corresponding to WARNING, ERROR, and CRITICAL logging'
          ' levels).')
)

log = partial(
    Option,
    "--log", "--log-file", "--local-log",
    dest="log",
    metavar="path",
    help="Path to a verbose appending log."
)

no_input = partial(
    Option,
    # Don't ask for input
    '--no-input',
    dest='no_input',
    action='store_true',
    default=False,
    help=SUPPRESS_HELP)

proxy = partial(
    Option,
    '--proxy',
    dest='proxy',
    type='str',
    default='',
    help="Specify a proxy in the form [user:passwd@]proxy.server:port.")

retries = partial(
    Option,
    '--retries',
    dest='retries',
    type='int',
    default=5,
    help="Maximum number of retries each connection should attempt "
         "(default %default times).")

timeout = partial(
    Option,
    '--timeout', '--default-timeout',
    metavar='sec',
    dest='timeout',
    type='float',
    default=15,
    help='Set the socket timeout (default %default seconds).')

default_vcs = partial(
    Option,
    # The default version control system for editables, e.g. 'svn'
    '--default-vcs',
    dest='default_vcs',
    type='str',
    default='',
    help=SUPPRESS_HELP)

skip_requirements_regex = partial(
    Option,
    # A regex to be used to skip requirements
    '--skip-requirements-regex',
    dest='skip_requirements_regex',
    type='str',
    default='',
    help=SUPPRESS_HELP)


def exists_action():
    return Option(
        # Option when a path already exists
        '--exists-action',
        dest='exists_action',
        type='choice',
        choices=['s', 'i', 'w', 'b', 'a'],
        default=[],
        action='append',
        metavar='action',
        help="Default action when a path already exists: "
        "(s)witch, (i)gnore, (w)ipe, (b)ackup, (a)bort.")


cert = partial(
    Option,
    '--cert',
    dest='cert',
    type='str',
    metavar='path',
    help="Path to alternate CA bundle.")

client_cert = partial(
    Option,
    '--client-cert',
    dest='client_cert',
    type='str',
    default=None,
    metavar='path',
    help="Path to SSL client certificate, a single file containing the "
         "private key and the certificate in PEM format.")

index_url = partial(
    Option,
    '-i', '--index-url', '--pypi-url',
    dest='index_url',
    metavar='URL',
    default=PyPI.simple_url,
    help="Base URL of Python Package Index (default %default). "
         "This should point to a repository compliant with PEP 503 "
         "(the simple repository API) or a local directory laid out "
         "in the same format.")


def extra_index_url():
    return Option(
        '--extra-index-url',
        dest='extra_index_urls',
        metavar='URL',
        action='append',
        default=[],
        help="Extra URLs of package indexes to use in addition to "
             "--index-url. Should follow the same rules as "
             "--index-url."
    )


no_index = partial(
    Option,
    '--no-index',
    dest='no_index',
    action='store_true',
    default=False,
    help='Ignore package index (only looking at --find-links URLs instead).')


def find_links():
    return Option(
        '-f', '--find-links',
        dest='find_links',
        action='append',
        default=[],
        metavar='url',
        help="If a url or path to an html file, then parse for links to "
             "archives. If a local path or file:// url that's a directory, "
             "then look for archives in the directory listing.")


def allow_external():
    return Option(
        "--allow-external",
        dest="allow_external",
        action="append",
        default=[],
        metavar="PACKAGE",
        help=SUPPRESS_HELP,
    )


allow_all_external = partial(
    Option,
    "--allow-all-external",
    dest="allow_all_external",
    action="store_true",
    default=False,
    help=SUPPRESS_HELP,
)


def trusted_host():
    return Option(
        "--trusted-host",
        dest="trusted_hosts",
        action="append",
        metavar="HOSTNAME",
        default=[],
        help="Mark this host as trusted, even though it does not have valid "
             "or any HTTPS.",
    )


# Remove after 7.0
no_allow_external = partial(
    Option,
    "--no-allow-external",
    dest="allow_all_external",
    action="store_false",
    default=False,
    help=SUPPRESS_HELP,
)


# Remove --allow-insecure after 7.0
def allow_unsafe():
    return Option(
        "--allow-unverified", "--allow-insecure",
        dest="allow_unverified",
        action="append",
        default=[],
        metavar="PACKAGE",
        help=SUPPRESS_HELP,
    )

# Remove after 7.0
no_allow_unsafe = partial(
    Option,
    "--no-allow-insecure",
    dest="allow_all_insecure",
    action="store_false",
    default=False,
    help=SUPPRESS_HELP
)

# Remove after 1.5
process_dependency_links = partial(
    Option,
    "--process-dependency-links",
    dest="process_dependency_links",
    action="store_true",
    default=False,
    help="Enable the processing of dependency links.",
)


def constraints():
    return Option(
        '-c', '--constraint',
        dest='constraints',
        action='append',
        default=[],
        metavar='file',
        help='Constrain versions using the given constraints file. '
        'This option can be used multiple times.')


def requirements():
    return Option(
        '-r', '--requirement',
        dest='requirements',
        action='append',
        default=[],
        metavar='file',
        help='Install from the given requirements file. '
        'This option can be used multiple times.')


def editable():
    return Option(
        '-e', '--editable',
        dest='editables',
        action='append',
        default=[],
        metavar='path/url',
        help=('Install a project in editable mode (i.e. setuptools '
              '"develop mode") from a local project path or a VCS url.'),
    )

src = partial(
    Option,
    '--src', '--source', '--source-dir', '--source-directory',
    dest='src_dir',
    metavar='dir',
    default=src_prefix,
    help='Directory to check out editable projects into. '
    'The default in a virtualenv is "<venv path>/src". '
    'The default for global installs is "<current dir>/src".'
)

# XXX: deprecated, remove in 9.0
use_wheel = partial(
    Option,
    '--use-wheel',
    dest='use_wheel',
    action='store_true',
    default=True,
    help=SUPPRESS_HELP,
)

# XXX: deprecated, remove in 9.0
no_use_wheel = partial(
    Option,
    '--no-use-wheel',
    dest='use_wheel',
    action='store_false',
    default=True,
    help=('Do not find and prefer wheel archives when searching indexes and '
          'find-links locations. DEPRECATED in favour of --no-binary.'),
)


def _get_format_control(values, option):
    """Get a format_control object."""
    return getattr(values, option.dest)


def _handle_no_binary(option, opt_str, value, parser):
    existing = getattr(parser.values, option.dest)
    fmt_ctl_handle_mutual_exclude(
        value, existing.no_binary, existing.only_binary)


def _handle_only_binary(option, opt_str, value, parser):
    existing = getattr(parser.values, option.dest)
    fmt_ctl_handle_mutual_exclude(
        value, existing.only_binary, existing.no_binary)


def no_binary():
    return Option(
        "--no-binary", dest="format_control", action="callback",
        callback=_handle_no_binary, type="str",
        default=FormatControl(set(), set()),
        help="Do not use binary packages. Can be supplied multiple times, and "
             "each time adds to the existing value. Accepts either :all: to "
             "disable all binary packages, :none: to empty the set, or one or "
             "more package names with commas between them. Note that some "
             "packages are tricky to compile and may fail to install when "
             "this option is used on them.")


def only_binary():
    return Option(
        "--only-binary", dest="format_control", action="callback",
        callback=_handle_only_binary, type="str",
        default=FormatControl(set(), set()),
        help="Do not use source packages. Can be supplied multiple times, and "
             "each time adds to the existing value. Accepts either :all: to "
             "disable all source packages, :none: to empty the set, or one or "
             "more package names with commas between them. Packages without "
             "binary distributions will fail to install when this option is "
             "used on them.")


cache_dir = partial(
    Option,
    "--cache-dir",
    dest="cache_dir",
    default=USER_CACHE_DIR,
    metavar="dir",
    help="Store the cache data in <dir>."
)

no_cache = partial(
    Option,
    "--no-cache-dir",
    dest="cache_dir",
    action="store_false",
    help="Disable the cache.",
)

no_deps = partial(
    Option,
    '--no-deps', '--no-dependencies',
    dest='ignore_dependencies',
    action='store_true',
    default=False,
    help="Don't install package dependencies.")

build_dir = partial(
    Option,
    '-b', '--build', '--build-dir', '--build-directory',
    dest='build_dir',
    metavar='dir',
    help='Directory to unpack packages into and build in.'
)

ignore_requires_python = partial(
    Option,
    '--ignore-requires-python',
    dest='ignore_requires_python',
    action='store_true',
    help='Ignore the Requires-Python information.')

install_options = partial(
    Option,
    '--install-option',
    dest='install_options',
    action='append',
    metavar='options',
    help="Extra arguments to be supplied to the setup.py install "
         "command (use like --install-option=\"--install-scripts=/usr/local/"
         "bin\"). Use multiple --install-option options to pass multiple "
         "options to setup.py install. If you are using an option with a "
         "directory path, be sure to use absolute path.")

global_options = partial(
    Option,
    '--global-option',
    dest='global_options',
    action='append',
    metavar='options',
    help="Extra global options to be supplied to the setup.py "
         "call before the install command.")

no_clean = partial(
    Option,
    '--no-clean',
    action='store_true',
    default=False,
    help="Don't clean up build directories.")

pre = partial(
    Option,
    '--pre',
    action='store_true',
    default=False,
    help="Include pre-release and development versions. By default, "
         "pip only finds stable versions.")

disable_pip_version_check = partial(
    Option,
    "--disable-pip-version-check",
    dest="disable_pip_version_check",
    action="store_true",
    default=False,
    help="Don't periodically check PyPI to determine whether a new version "
         "of pip is available for download. Implied with --no-index.")

# Deprecated, Remove later
always_unzip = partial(
    Option,
    '-Z', '--always-unzip',
    dest='always_unzip',
    action='store_true',
    help=SUPPRESS_HELP,
)


def _merge_hash(option, opt_str, value, parser):
    """Given a value spelled "algo:digest", append the digest to a list
    pointed to in a dict by the algo name."""
    if not parser.values.hashes:
        parser.values.hashes = {}
    try:
        algo, digest = value.split(':', 1)
    except ValueError:
        parser.error('Arguments to %s must be a hash name '
                     'followed by a value, like --hash=sha256:abcde...' %
                     opt_str)
    if algo not in STRONG_HASHES:
        parser.error('Allowed hash algorithms for %s are %s.' %
                     (opt_str, ', '.join(STRONG_HASHES)))
    parser.values.hashes.setdefault(algo, []).append(digest)
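
# Illustrative result (digests are hypothetical): after parsing
# "--hash=sha256:abc... --hash=sha256:def..." on a single requirement,
# parser.values.hashes == {'sha256': ['abc...', 'def...']}.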


hash = partial(
    Option,
    '--hash',
    # Hash values eventually end up in InstallRequirement.hashes due to
    # __dict__ copying in process_line().
    dest='hashes',
    action='callback',
    callback=_merge_hash,
    type='string',
    help="Verify that the package's archive matches this "
         'hash before installing. Example: --hash=sha256:abcdef...')


require_hashes = partial(
    Option,
    '--require-hashes',
    dest='require_hashes',
    action='store_true',
    default=False,
    help='Require a hash to check each requirement against, for '
         'repeatable installs. This option is implied when any package in a '
         'requirements file has a --hash option.')


##########
# groups #
##########

general_group = {
    'name': 'General Options',
    'options': [
        help_,
        isolated_mode,
        require_virtualenv,
        verbose,
        version,
        quiet,
        log,
        no_input,
        proxy,
        retries,
        timeout,
        default_vcs,
        skip_requirements_regex,
        exists_action,
        trusted_host,
        cert,
        client_cert,
        cache_dir,
        no_cache,
        disable_pip_version_check,
    ]
}

non_deprecated_index_group = {
    'name': 'Package Index Options',
    'options': [
        index_url,
        extra_index_url,
        no_index,
        find_links,
        process_dependency_links,
    ]
}

index_group = {
    'name': 'Package Index Options (including deprecated options)',
    'options': non_deprecated_index_group['options'] + [
        allow_external,
        allow_all_external,
        no_allow_external,
        allow_unsafe,
        no_allow_unsafe,
    ]
}
site-packages/pip/__init__.py
from __future__ import absolute_import

import locale
import logging
import os
import optparse
import warnings

import sys
import re

# 2016-06-17 barry@debian.org: urllib3 1.14 added optional support for socks,
# but if invoked (i.e. imported), it will issue a warning to stderr if socks
# isn't available.  requests unconditionally imports urllib3's socks contrib
# module, triggering this warning.  The warning breaks DEP-8 tests (because of
# the stderr output) and is just plain annoying in normal usage.  I don't want
# to add socks as yet another dependency for pip, nor do I want to allow-stderr
# in the DEP-8 tests, so just suppress the warning.  pdb tells me this has to
# be done before the import of pip.vcs.
from pip._vendor.urllib3.exceptions import DependencyWarning
warnings.filterwarnings("ignore", category=DependencyWarning)  # noqa

# We want to inject the use of SecureTransport as early as possible so that any
# references or sessions or what have you are ensured to have it, however we
# only want to do this in the case that we're running on macOS and the linked
# OpenSSL is too old to handle TLSv1.2
try:
    import ssl
except ImportError:
    pass
else:
    if (sys.platform == "darwin" and
            getattr(ssl, "OPENSSL_VERSION_NUMBER", 0) < 0x1000100f):  # OpenSSL 1.0.1
        try:
            from pip._vendor.urllib3.contrib import securetransport
        except (ImportError, OSError):
            pass
        else:
            securetransport.inject_into_urllib3()

from pip.exceptions import InstallationError, CommandError, PipError
from pip.utils import get_installed_distributions, get_prog
from pip.utils import deprecation, dist_is_editable
from pip.vcs import git, mercurial, subversion, bazaar  # noqa
from pip.baseparser import ConfigOptionParser, UpdatingDefaultsHelpFormatter
from pip.commands import get_summaries, get_similar_commands
from pip.commands import commands_dict
from pip._vendor.urllib3.exceptions import InsecureRequestWarning


# assignment for flake8 to be happy

# This fixes a peculiarity when importing via __import__ - as we are
# initialising the pip module, "from pip import cmdoptions" is recursive
# and appears not to work properly in that situation.
import pip.cmdoptions
cmdoptions = pip.cmdoptions

# The version as used in the setup.py and the docs conf.py
__version__ = "9.0.3"


logger = logging.getLogger(__name__)

# Hide the InsecureRequestWarning from urllib3
warnings.filterwarnings("ignore", category=InsecureRequestWarning)


def autocomplete():
    """Command and option completion for the main option parser (and options)
    and its subcommands (and options).

    Enable by sourcing one of the completion shell scripts (bash, zsh or fish).
    """
    # Don't complete if user hasn't sourced bash_completion file.
    if 'PIP_AUTO_COMPLETE' not in os.environ:
        return
    cwords = os.environ['COMP_WORDS'].split()[1:]
    cword = int(os.environ['COMP_CWORD'])
    try:
        current = cwords[cword - 1]
    except IndexError:
        current = ''

    subcommands = [cmd for cmd, summary in get_summaries()]
    options = []
    # subcommand
    try:
        subcommand_name = [w for w in cwords if w in subcommands][0]
    except IndexError:
        subcommand_name = None

    parser = create_main_parser()
    # subcommand options
    if subcommand_name:
        # special case: 'help' subcommand has no options
        if subcommand_name == 'help':
            sys.exit(1)
        # special case: list locally installed dists for uninstall command
        if subcommand_name == 'uninstall' and not current.startswith('-'):
            installed = []
            lc = current.lower()
            for dist in get_installed_distributions(local_only=True):
                if dist.key.startswith(lc) and dist.key not in cwords[1:]:
                    installed.append(dist.key)
            # if there are no dists installed, fall back to option completion
            if installed:
                for dist in installed:
                    print(dist)
                sys.exit(1)

        subcommand = commands_dict[subcommand_name]()
        options += [(opt.get_opt_string(), opt.nargs)
                    for opt in subcommand.parser.option_list_all
                    if opt.help != optparse.SUPPRESS_HELP]

        # filter out previously specified options from available options
        prev_opts = [x.split('=')[0] for x in cwords[1:cword - 1]]
        options = [(x, v) for (x, v) in options if x not in prev_opts]
        # filter options by current input
        options = [(k, v) for k, v in options if k.startswith(current)]
        for option in options:
            opt_label = option[0]
            # append '=' to options which require args
            if option[1]:
                opt_label += '='
            print(opt_label)
    else:
        # show main parser options only when necessary
        if current.startswith('-') or current.startswith('--'):
            opts = [i.option_list for i in parser.option_groups]
            opts.append(parser.option_list)
            opts = (o for it in opts for o in it)

            subcommands += [i.get_opt_string() for i in opts
                            if i.help != optparse.SUPPRESS_HELP]

        print(' '.join([x for x in subcommands if x.startswith(current)]))
    sys.exit(1)
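
# Hedged sketch of what the completion scripts arrange (values are
# illustrative): invoking
#
#     PIP_AUTO_COMPLETE=1 COMP_WORDS='pip ins' COMP_CWORD=1 pip
#
# makes autocomplete() print the matching subcommands (here "install") and
# exit before normal command processing.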


def create_main_parser():
    parser_kw = {
        'usage': '\n%prog <command> [options]',
        'add_help_option': False,
        'formatter': UpdatingDefaultsHelpFormatter(),
        'name': 'global',
        'prog': get_prog(),
    }

    parser = ConfigOptionParser(**parser_kw)
    parser.disable_interspersed_args()

    pip_pkg_dir = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
    parser.version = 'pip %s from %s (python %s)' % (
        __version__, pip_pkg_dir, sys.version[:3])

    # add the general options
    gen_opts = cmdoptions.make_option_group(cmdoptions.general_group, parser)
    parser.add_option_group(gen_opts)

    parser.main = True  # so the help formatter knows

    # create command listing for description
    command_summaries = get_summaries()
    description = [''] + ['%-27s %s' % (i, j) for i, j in command_summaries]
    parser.description = '\n'.join(description)

    return parser


def parseopts(args):
    parser = create_main_parser()

    # Note: parser calls disable_interspersed_args(), so the result of this
    # call is to split the initial args into the general options before the
    # subcommand and everything else.
    # For example:
    #  args: ['--timeout=5', 'install', '--user', 'INITools']
    #  general_options: ['--timeout=5']
    #  args_else: ['install', '--user', 'INITools']
    general_options, args_else = parser.parse_args(args)

    # --version
    if general_options.version:
        sys.stdout.write(parser.version)
        sys.stdout.write(os.linesep)
        sys.exit()

    # pip || pip help -> print_help()
    if not args_else or (args_else[0] == 'help' and len(args_else) == 1):
        parser.print_help()
        sys.exit()

    # the subcommand name
    cmd_name = args_else[0]

    if cmd_name not in commands_dict:
        guess = get_similar_commands(cmd_name)

        msg = ['unknown command "%s"' % cmd_name]
        if guess:
            msg.append('maybe you meant "%s"' % guess)

        raise CommandError(' - '.join(msg))

    # all the args without the subcommand
    cmd_args = args[:]
    cmd_args.remove(cmd_name)

    return cmd_name, cmd_args


def check_isolated(args):
    isolated = False

    if "--isolated" in args:
        isolated = True

    return isolated


def main(args=None):
    if args is None:
        args = sys.argv[1:]

    # Configure our deprecation warnings to be sent through loggers
    deprecation.install_warning_logger()

    autocomplete()

    try:
        cmd_name, cmd_args = parseopts(args)
    except PipError as exc:
        sys.stderr.write("ERROR: %s" % exc)
        sys.stderr.write(os.linesep)
        sys.exit(1)

    # Needed for locale.getpreferredencoding(False) to work
    # in pip.utils.encoding.auto_decode
    try:
        locale.setlocale(locale.LC_ALL, '')
    except locale.Error as e:
        # setlocale can apparently crash if locales are uninitialized
        logger.debug("Ignoring error %s when setting locale", e)
    command = commands_dict[cmd_name](isolated=check_isolated(cmd_args))
    return command.main(cmd_args)


# ###########################################################
# # Writing freeze files

class FrozenRequirement(object):

    def __init__(self, name, req, editable, comments=()):
        self.name = name
        self.req = req
        self.editable = editable
        self.comments = comments

    _rev_re = re.compile(r'-r(\d+)$')
    _date_re = re.compile(r'-(20\d\d\d\d\d\d)$')

    @classmethod
    def from_dist(cls, dist, dependency_links):
        location = os.path.normcase(os.path.abspath(dist.location))
        comments = []
        from pip.vcs import vcs, get_src_requirement
        if dist_is_editable(dist) and vcs.get_backend_name(location):
            editable = True
            try:
                req = get_src_requirement(dist, location)
            except InstallationError as exc:
                logger.warning(
                    "Error when trying to get requirement for VCS system %s, "
                    "falling back to uneditable format", exc
                )
                req = None
            if req is None:
                logger.warning(
                    'Could not determine repository location of %s', location
                )
                comments.append(
                    '## !! Could not determine repository location'
                )
                req = dist.as_requirement()
                editable = False
        else:
            editable = False
            req = dist.as_requirement()
            specs = req.specs
            assert len(specs) == 1 and specs[0][0] in ["==", "==="], \
                'Expected 1 spec with == or ===; specs = %r; dist = %r' % \
                (specs, dist)
            version = specs[0][1]
            ver_match = cls._rev_re.search(version)
            date_match = cls._date_re.search(version)
            if ver_match or date_match:
                svn_backend = vcs.get_backend('svn')
                # Initialise here to avoid an unbound name below if the svn
                # backend is unavailable.
                svn_location = None
                if svn_backend:
                    svn_location = svn_backend().get_location(
                        dist,
                        dependency_links,
                    )
                if not svn_location:
                    logger.warning(
                        'Warning: cannot find svn location for %s', req)
                    comments.append(
                        '## FIXME: could not find svn URL in dependency_links '
                        'for this package:'
                    )
                else:
                    comments.append(
                        '# Installing as editable to satisfy requirement %s:' %
                        req
                    )
                    if ver_match:
                        rev = ver_match.group(1)
                    else:
                        rev = '{%s}' % date_match.group(1)
                    editable = True
                    req = '%s@%s#egg=%s' % (
                        svn_location,
                        rev,
                        cls.egg_name(dist)
                    )
        return cls(dist.project_name, req, editable, comments)

    @staticmethod
    def egg_name(dist):
        name = dist.egg_name()
        match = re.search(r'-py\d\.\d$', name)
        if match:
            name = name[:match.start()]
        return name

    def __str__(self):
        req = self.req
        if self.editable:
            req = '-e %s' % req
        return '\n'.join(list(self.comments) + [str(req)]) + '\n'
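
# Illustrative freeze-line formatting (values are hypothetical):
#
#     str(FrozenRequirement('example', 'example==1.0', editable=False))
#         -> 'example==1.0\n'
#     str(FrozenRequirement('example', 'git+https://host/example.git#egg=example',
#                           editable=True))
#         -> '-e git+https://host/example.git#egg=example\n'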


if __name__ == '__main__':
    sys.exit(main())
site-packages/pip/utils/__pycache__/filesystem.cpython-36.pyc000064400000001064147511334620020245 0ustar003

���e��@s(ddlZddlZddlmZdd�ZdS)�N)�get_path_uidcCs�ttd�sdSd}xp||kr�tjj|�rntj�dkr^yt|�}Wntk
rTdSX|dkStj|tj�Sq|tjj	|�}}qWdS)N�geteuidTrF)
�hasattr�os�path�lexistsrr�OSError�access�W_OK�dirname)rZpreviousZpath_uid�r� /usr/lib/python3.6/filesystem.py�check_path_owners

r)rZos.pathZ
pip.compatrrrrrr
�<module>ssite-packages/pip/utils/__pycache__/encoding.cpython-36.opt-1.pyc000064400000001747147511334620020616 0ustar003

���e��@sjddlZddlZddlZejdfejdfejdfejdfejdfejdfej	dfgZ
ejd	�Zd
d�Z
dS)�N�utf8�utf16zutf16-bezutf16-le�utf32zutf32-bezutf32-lescoding[:=]\s*([-\w.]+)cCs�x0tD](\}}|j|�r|t|�d�j|�SqWxV|jd�dd�D]@}|dd�dkrFtj|�rFtj|�j�djd�}|j|�SqFW|jtj	d��S)	z�Check a bytes string for a BOM to correctly detect the encoding

    Fallback to locale.getpreferredencoding(False) like open() on Python3N�
�r��#�asciiF)
�BOMS�
startswith�len�decode�split�ENCODING_RE�search�groups�locale�getpreferredencoding)�dataZbom�encoding�line�r�/usr/lib/python3.6/encoding.py�auto_decodes
r)�codecsr�re�BOM_UTF8�	BOM_UTF16�BOM_UTF16_BE�BOM_UTF16_LE�	BOM_UTF32�BOM_UTF32_BE�BOM_UTF32_LEr
�compilerrrrrr�<module>s
site-packages/pip/utils/__pycache__/build.cpython-36.opt-1.pyc000064400000002420147511334620020114 0ustar003

���e �@s<ddlmZddlZddlZddlmZGdd�de�ZdS)�)�absolute_importN)�rmtreec@s6eZdZddd�Zdd�Zdd�Zdd	�Zd
d�ZdS)
�BuildDirectoryNcCsL|dkr|dkrd}|dkr<tjjtjdd��}|dkr<d}||_||_dS)NTz
pip-build-)�prefix)�os�path�realpath�tempfileZmkdtemp�name�delete)�selfr
r�r
�/usr/lib/python3.6/build.py�__init__szBuildDirectory.__init__cCsdj|jj|j�S)Nz	<{} {!r}>)�format�	__class__�__name__r
)rr
r
r�__repr__szBuildDirectory.__repr__cCs|jS)N)r
)rr
r
r�	__enter__"szBuildDirectory.__enter__cCs|j�dS)N)�cleanup)r�exc�value�tbr
r
r�__exit__%szBuildDirectory.__exit__cCs|jrt|j�dS)N)rrr
)rr
r
rr(szBuildDirectory.cleanup)NN)r�
__module__�__qualname__rrrrrr
r
r
rr	s

r)	Z
__future__rZos.pathrr	Z	pip.utilsr�objectrr
r
r
r�<module>ssite-packages/pip/utils/__pycache__/appdirs.cpython-36.opt-1.pyc000064400000017002147511334620020461 0ustar003

���ek"�@s�dZddlmZddlZddlZddlmZmZddlm	Z	m
Z
dd�Zdd	d
�Zddd
�Z
dd�Zdd�Zdd�Zer�yddlZeZWnek
r�eZYnXdd�ZdS)zd
This code was taken from https://github.com/ActiveState/appdirs and modified
to suit our purposes.
�)�absolute_importN)�WINDOWS�
expanduser)�PY2�	text_typecCs�tr<tjjtd��}tr*t|t�r*t|�}tjj	||d�}n@t
jdkr^td�}tjj	||�}ntj
dtd��}tjj	||�}|S)a5
    Return full path to the user-specific cache dir for this application.

        "appname" is the name of application.

    Typical user cache directories are:
        macOS:      ~/Library/Caches/<AppName>
        Unix:       ~/.cache/<AppName> (XDG default)
        Windows:    C:\Users\<username>\AppData\Local\<AppName>\Cache

    On Windows the only suggestion in the MSDN docs is that local settings go
    in the `CSIDL_LOCAL_APPDATA` directory. This is identical to the
    non-roaming app data dir (the default returned by `user_data_dir`). Apps
    typically put cache data somewhere *under* the given dir here. Some
    examples:
        ...\Mozilla\Firefox\Profiles\<ProfileName>\Cache
        ...\Acme\SuperApp\Cache\1.0

    OPINION: This function appends "Cache" to the `CSIDL_LOCAL_APPDATA` value.
    �CSIDL_LOCAL_APPDATAZCache�darwinz~/Library/CachesZXDG_CACHE_HOMEz~/.cache)r�os�path�normpath�_get_win_folderr�
isinstancer�_win_path_to_bytes�join�sys�platformr�getenv)�appnamer
�r�/usr/lib/python3.6/appdirs.py�user_cache_dirs
rFcCshtr,|rdpd}tjjtjjt|��|�}n8tjdkrJtjjtd�|�}ntjjtj	dtd��|�}|S)aS
    Return full path to the user-specific data dir for this application.

        "appname" is the name of application.
            If None, just the system directory is returned.
        "roaming" (boolean, default False) can be set True to use the Windows
            roaming appdata directory. That means that for users on a Windows
            network setup for roaming profiles, this user data will be
            sync'd on login. See
            <http://technet.microsoft.com/en-us/library/cc766489(WS.10).aspx>
            for a discussion of issues.

    Typical user data directories are:
        macOS:                  ~/Library/Application Support/<AppName>
        Unix:                   ~/.local/share/<AppName>    # or in
                                $XDG_DATA_HOME, if defined
        Win XP (not roaming):   C:\Documents and Settings\<username>\ ...
                                ...Application Data\<AppName>
        Win XP (roaming):       C:\Documents and Settings\<username>\Local ...
                                ...Settings\Application Data\<AppName>
        Win 7  (not roaming):   C:\Users\<username>\AppData\Local\<AppName>
        Win 7  (roaming):       C:\Users\<username>\AppData\Roaming\<AppName>

    For Unix, we follow the XDG spec and support $XDG_DATA_HOME.
    That means, by default "~/.local/share/<AppName>".
    �
CSIDL_APPDATArrz~/Library/Application Support/Z
XDG_DATA_HOMEz~/.local/share)
rr	r
rrrrrrr)r�roaming�constr
rrr�
user_data_dir>s
rTcCsHtrt||d�}n2tjdkr&t|�}ntjdtd��}tjj||�}|S)arReturn full path to the user-specific config dir for this application.

        "appname" is the name of application.
            If None, just the system directory is returned.
        "roaming" (boolean, default True) can be set False to not use the
            Windows roaming appdata directory. That means that for users on a
            Windows network setup for roaming profiles, this user data will be
            sync'd on login. See
            <http://technet.microsoft.com/en-us/library/cc766489(WS.10).aspx>
            for a discussion of issues.

    Typical user data directories are:
        macOS:                  same as user_data_dir
        Unix:                   ~/.config/<AppName>
        Win *:                  same as user_data_dir

    For Unix, we follow the XDG spec and support $XDG_CONFIG_HOME.
    That means, by default "~/.config/<AppName>".
    )rrZXDG_CONFIG_HOMEz	~/.config)	rrrrr	rrr
r)rrr
rrr�user_config_dirjs

rcs�tr&tjjtd��}tjj|��g}nVtjdkrBtjjd��g}n:tjdd�}|rn�fdd�|j	tj
�D�}ng}|jd�|S)	a�Return a list of potential user-shared config dirs for this application.

        "appname" is the name of application.

    Typical user config directories are:
        macOS:      /Library/Application Support/<AppName>/
        Unix:       /etc or $XDG_CONFIG_DIRS[i]/<AppName>/ for each value in
                    $XDG_CONFIG_DIRS
        Win XP:     C:\Documents and Settings\All Users\Application ...
                    ...Data\<AppName>        Vista:      (Fail! "C:\ProgramData" is a hidden *system* directory
                    on Vista.)
        Win 7:      Hidden, but writeable on Win 7:
                    C:\ProgramData\<AppName>    �CSIDL_COMMON_APPDATArz/Library/Application SupportZXDG_CONFIG_DIRSz/etc/xdgcsg|]}tjjt|����qSr)r	r
rr)�.0�x)rrr�
<listcomp>�sz$site_config_dirs.<locals>.<listcomp>z/etc)rr	r
rrrrrr�split�pathsep�append)rr
ZpathlistZxdg_config_dirsr)rr�site_config_dirs�s


r#cCs:ddl}dddd�|}|j|jd�}|j||�\}}|S)z�
    This is a fallback technique at best. I'm not sure if using the
    registry for this guarantees us the correct answer for all CSIDL_*
    names.
    rNZAppDatazCommon AppDataz
Local AppData)rrrz@Software\Microsoft\Windows\CurrentVersion\Explorer\Shell Folders)�_winreg�OpenKey�HKEY_CURRENT_USERZQueryValueEx)�
csidl_namer$Zshell_folder_name�keyZ	directoryZ_typerrr�_get_win_folder_from_registry�sr)cCs�dddd�|}tjd�}tjjjd|dd|�d}x|D]}t|�dkr:d	}Pq:W|rztjd�}tjjj|j|d�rz|}|jS)
N��#�)rrrirF�T)	�ctypesZcreate_unicode_bufferZwindllZshell32ZSHGetFolderPathW�ordZkernel32ZGetShortPathNameW�value)r'Zcsidl_constZbufZ
has_high_char�cZbuf2rrr�_get_win_folder_with_ctypes�s 


r2c
Cs6x0dD](}y
|j|�Sttfk
r,YqXqW|S)a�Encode Windows paths to bytes. Only used on Python 2.

    Motivation is to be consistent with other operating systems where paths
    are also returned as bytes. This avoids problems mixing bytes and Unicode
    elsewhere in the codebase. For more details and discussion see
    <https://github.com/pypa/pip/issues/3463>.

    If encoding using ASCII and MBCS fails, return the original Unicode path.
    �ASCII�MBCS)r3r4)�encode�UnicodeEncodeError�LookupError)r
�encodingrrrr�s



r)F)T)�__doc__Z
__future__rr	rZ
pip.compatrrZpip._vendor.sixrrrrrr#r)r2r.r�ImportErrorrrrrr�<module>s$0
,
!(
site-packages/pip/utils/__pycache__/packaging.cpython-36.pyc000064400000003657147511334620020017 0ustar003

���e �@s~ddlmZddlmZddlZddlZddlmZddlmZddl	m
Z
ddlmZej
e�Zdd	�Zd
d�Zdd
�ZdS)�)�absolute_import)�
FeedParserN)�
specifiers)�version)�
pkg_resources)�
exceptionscCs>|dkrdStj|�}tjdjtttjdd����}||kS)aG
    Check if the python version in use match the `requires_python` specifier.

    Returns `True` if the version of python in use matches the requirement.
    Returns `False` if the version of python in use does not matches the
    requirement.

    Raises an InvalidSpecifier if `requires_python` have an invalid format.
    NT�.�)	rZSpecifierSetr�parse�join�map�str�sys�version_info)�requires_pythonZrequires_python_specifierZpython_version�r�/usr/lib/python3.6/packaging.py�check_requires_pythons


 rcCs8t|tj�r |jd�r |jd�S|jd�r4|jd�SdS)NZMETADATAzPKG-INFO)�
isinstancerZDistInfoDistributionZhas_metadata�get_metadata)�distrrrr%s



rcCs�t|�}t�}|j|�|j�}|jd�}y8t|�s`tjd|j|dj	t
ttj
dd���f��Wn8tjk
r�}ztjd|j||f�dSd}~XnXdS)NzRequires-Pythonz4%s requires Python '%s' but the running Python is %srr	z7Package %s has an invalid Requires-Python entry %s - %s)rrZfeed�close�getrrZUnsupportedPythonVersionZproject_namerrr
rrrZInvalidSpecifier�loggerZwarning)rZmetadataZfeed_parserZ
pkg_info_dictr�errr�check_dist_requires_python-s"

$r)Z
__future__rZemail.parserrZloggingrZpip._vendor.packagingrrZpip._vendorrZpiprZ	getLogger�__name__rrrrrrrr�<module>s
site-packages/pip/utils/__pycache__/appdirs.cpython-36.pyc000064400000017002147511334620017522 0ustar003

���ek"�@s�dZddlmZddlZddlZddlmZmZddlm	Z	m
Z
dd�Zdd	d
�Zddd
�Z
dd�Zdd�Zdd�Zer�yddlZeZWnek
r�eZYnXdd�ZdS)zd
This code was taken from https://github.com/ActiveState/appdirs and modified
to suit our purposes.
�)�absolute_importN)�WINDOWS�
expanduser)�PY2�	text_typecCs�tr<tjjtd��}tr*t|t�r*t|�}tjj	||d�}n@t
jdkr^td�}tjj	||�}ntj
dtd��}tjj	||�}|S)a5
    Return full path to the user-specific cache dir for this application.

        "appname" is the name of application.

    Typical user cache directories are:
        macOS:      ~/Library/Caches/<AppName>
        Unix:       ~/.cache/<AppName> (XDG default)
        Windows:    C:\Users\<username>\AppData\Local\<AppName>\Cache

    On Windows the only suggestion in the MSDN docs is that local settings go
    in the `CSIDL_LOCAL_APPDATA` directory. This is identical to the
    non-roaming app data dir (the default returned by `user_data_dir`). Apps
    typically put cache data somewhere *under* the given dir here. Some
    examples:
        ...\Mozilla\Firefox\Profiles\<ProfileName>\Cache
        ...\Acme\SuperApp\Cache\1.0

    OPINION: This function appends "Cache" to the `CSIDL_LOCAL_APPDATA` value.
    �CSIDL_LOCAL_APPDATAZCache�darwinz~/Library/CachesZXDG_CACHE_HOMEz~/.cache)r�os�path�normpath�_get_win_folderr�
isinstancer�_win_path_to_bytes�join�sys�platformr�getenv)�appnamer
�r�/usr/lib/python3.6/appdirs.py�user_cache_dirs
rFcCshtr,|rdpd}tjjtjjt|��|�}n8tjdkrJtjjtd�|�}ntjjtj	dtd��|�}|S)aS
    Return full path to the user-specific data dir for this application.

        "appname" is the name of application.
            If None, just the system directory is returned.
        "roaming" (boolean, default False) can be set True to use the Windows
            roaming appdata directory. That means that for users on a Windows
            network setup for roaming profiles, this user data will be
            sync'd on login. See
            <http://technet.microsoft.com/en-us/library/cc766489(WS.10).aspx>
            for a discussion of issues.

    Typical user data directories are:
        macOS:                  ~/Library/Application Support/<AppName>
        Unix:                   ~/.local/share/<AppName>    # or in
                                $XDG_DATA_HOME, if defined
        Win XP (not roaming):   C:\Documents and Settings\<username>\ ...
                                ...Application Data\<AppName>
        Win XP (roaming):       C:\Documents and Settings\<username>\Local ...
                                ...Settings\Application Data\<AppName>
        Win 7  (not roaming):   C:\Users\<username>\AppData\Local\<AppName>
        Win 7  (roaming):       C:\Users\<username>\AppData\Roaming\<AppName>

    For Unix, we follow the XDG spec and support $XDG_DATA_HOME.
    That means, by default "~/.local/share/<AppName>".
    �
CSIDL_APPDATArrz~/Library/Application Support/Z
XDG_DATA_HOMEz~/.local/share)
rr	r
rrrrrrr)r�roaming�constr
rrr�
user_data_dir>s
rTcCsHtrt||d�}n2tjdkr&t|�}ntjdtd��}tjj||�}|S)arReturn full path to the user-specific config dir for this application.

        "appname" is the name of application.
            If None, just the system directory is returned.
        "roaming" (boolean, default True) can be set False to not use the
            Windows roaming appdata directory. That means that for users on a
            Windows network setup for roaming profiles, this user data will be
            sync'd on login. See
            <http://technet.microsoft.com/en-us/library/cc766489(WS.10).aspx>
            for a discussion of issues.

    Typical user data directories are:
        macOS:                  same as user_data_dir
        Unix:                   ~/.config/<AppName>
        Win *:                  same as user_data_dir

    For Unix, we follow the XDG spec and support $XDG_CONFIG_HOME.
    That means, by default "~/.config/<AppName>".
    )rrZXDG_CONFIG_HOMEz	~/.config)	rrrrr	rrr
r)rrr
rrr�user_config_dirjs

rcs�tr&tjjtd��}tjj|��g}nVtjdkrBtjjd��g}n:tjdd�}|rn�fdd�|j	tj
�D�}ng}|jd�|S)	a�Return a list of potential user-shared config dirs for this application.

        "appname" is the name of application.

    Typical user config directories are:
        macOS:      /Library/Application Support/<AppName>/
        Unix:       /etc or $XDG_CONFIG_DIRS[i]/<AppName>/ for each value in
                    $XDG_CONFIG_DIRS
        Win XP:     C:\Documents and Settings\All Users\Application ...
                    ...Data\<AppName>        Vista:      (Fail! "C:\ProgramData" is a hidden *system* directory
                    on Vista.)
        Win 7:      Hidden, but writeable on Win 7:
                    C:\ProgramData\<AppName>    �CSIDL_COMMON_APPDATArz/Library/Application SupportZXDG_CONFIG_DIRSz/etc/xdgcsg|]}tjjt|����qSr)r	r
rr)�.0�x)rrr�
<listcomp>�sz$site_config_dirs.<locals>.<listcomp>z/etc)rr	r
rrrrrr�split�pathsep�append)rr
ZpathlistZxdg_config_dirsr)rr�site_config_dirs�s


r#cCs:ddl}dddd�|}|j|jd�}|j||�\}}|S)z�
    This is a fallback technique at best. I'm not sure if using the
    registry for this guarantees us the correct answer for all CSIDL_*
    names.
    rNZAppDatazCommon AppDataz
Local AppData)rrrz@Software\Microsoft\Windows\CurrentVersion\Explorer\Shell Folders)�_winreg�OpenKey�HKEY_CURRENT_USERZQueryValueEx)�
csidl_namer$Zshell_folder_name�keyZ	directoryZ_typerrr�_get_win_folder_from_registry�sr)cCs�dddd�|}tjd�}tjjjd|dd|�d}x|D]}t|�dkr:d	}Pq:W|rztjd�}tjjj|j|d�rz|}|jS)
N��#�)rrrirF�T)	�ctypesZcreate_unicode_bufferZwindllZshell32ZSHGetFolderPathW�ordZkernel32ZGetShortPathNameW�value)r'Zcsidl_constZbufZ
has_high_char�cZbuf2rrr�_get_win_folder_with_ctypes�s 


r2c
Cs6x0dD](}y
|j|�Sttfk
r,YqXqW|S)a�Encode Windows paths to bytes. Only used on Python 2.

    Motivation is to be consistent with other operating systems where paths
    are also returned as bytes. This avoids problems mixing bytes and Unicode
    elsewhere in the codebase. For more details and discussion see
    <https://github.com/pypa/pip/issues/3463>.

    If encoding using ASCII and MBCS fails, return the original Unicode path.
    �ASCII�MBCS)r3r4)�encode�UnicodeEncodeError�LookupError)r
�encodingrrrr�s



r)F)T)�__doc__Z
__future__rr	rZ
pip.compatrrZpip._vendor.sixrrrrrr#r)r2r.r�ImportErrorrrrrr�<module>s$0
,
!(
site-packages/pip/utils/__pycache__/__init__.cpython-36.pyc000064400000054152147511334620017626 0ustar003

���eml�@s�ddlmZddlmZddlZddlZddlZddlZddlZ	ddl
Z
ddlZddlZddl
Z
ddlZddlZddlZddlZddlZddlmZddlmZmZmZddlmZmZmZmZmZmZddl m!Z!ddl"m#Z#dd	l$m%Z%dd
l&m'Z'e%�rddlm(Z)nddlm)Z)d
ddddddddddddddddddd d!d"d#d$gZ*e	j+e,�Z-dzZ.d{Z/d|Z0d}Z1e0e.e1e/Z2e0e1Z3yddl4Z4e3e.7Z3Wn e5k
�r�e-j6d1�YnXyddl7Z7e3e/7Z3Wn e5k
�r�e-j6d2�YnXd3d4�Z8d5d!�Z9d6d�Z:e'd7d8d9�d~d;d
��Z;d<d=�Z<d>d�Z=dd@d�Z>dAdB�Z?dCd�Z@dDd�ZAdEd�ZBdFd�ZCdGd�ZDejEfdHdI�ZFdJd�ZGdKd�ZHd�dMd�ZIdNd�ZJdOd�ZKdPdQ�ZLdRdS�ZMdTdU�ZNdVdW�ZOdXdY�ZPdZd[�ZQdLedLd:d:fd\d]�ZRd^d_�ZSd`da�ZTdbd�ZUdcdd�ZVd�ded�ZWdfd�ZXdgd�ZYd�did�ZZdjdk�Z[dldm�Z\Gdndo�doe]�Z^Gdpdq�dqe)�Z_ej`drds��Zadtd �ZbGdudv�dve]�Zcd�dwd$�Zddxdy�ZedS)��)�absolute_import)�dequeN)�InstallationError)�console_to_str�
expanduser�stdlib_pkgs)�
site_packages�	user_site�running_under_virtualenv�virtualenv_no_global�write_delete_marker_file�distutils_scheme)�
pkg_resources)�input)�PY2)�retry)�BytesIO)�StringIO�rmtree�display_path�
backup_dir�ask�splitext�format_size�is_installable_dir�is_svn_page�
file_contents�split_leading_dir�has_leading_dir�normalize_path�renames�get_terminal_size�get_prog�
unzip_file�
untar_file�unpack_file�call_subprocess�captured_stdout�
ensure_dir�ARCHIVE_EXTENSIONS�SUPPORTED_EXTENSIONS�get_installed_version�.tar.bz2�.tbz�.tar.xz�.txz�.tlz�.tar.lz�	.tar.lzma�.zip�.whl�.tar.gz�.tgz�.tarzbz2 module is not availablezlzma module is not availablecOs,yt|�Stk
r&|||��YnXdS)N)�
__import__�ImportError)Zpkg_or_module_stringZ
ExceptionType�args�kwargs�r<�/usr/lib/python3.6/__init__.py�import_or_raiseIsr>cCsDytj|�Wn0tk
r>}z|jtjkr.�WYdd}~XnXdS)z os.path.makedirs without EEXIST.N)�os�makedirs�OSError�errnoZEEXIST)�path�er<r<r=r(Ps
c
CsDy$tjjtjd�dkr"dtjSWntttfk
r>YnXdS)Nr�__main__.py�-cz	%s -m pipZpip)rErF)	r?rC�basename�sys�argv�
executable�AttributeError�	TypeError�
IndexErrorr<r<r<r=r"Ysi�i�)Zstop_max_delayZ
wait_fixedFcCstj||td�dS)N)�
ignore_errors�onerror)�shutilr�rmtree_errorhandler)�dirrNr<r<r=rcscCs2tj|�jtj@r,tj|tj�||�dS�dS)z�On Windows, the files in .svn are read-only, so when rmtree() tries to
    remove them, an exception is thrown.  We catch that here, remove the
    read-only attribute, and hopefully continue without problems.N)r?�stat�st_mode�S_IREAD�chmod�S_IWRITE)�funcrC�exc_infor<r<r=rQis
rQcCsttjjtjj|��}tjddkrB|jtj�d�}|jtj	�d�}|j
tj�tjj�rpd|t
tj��d�}|S)zTGives the display value for a given path, making it relative to cwd
    if possible.r��replace�.N)r?rC�normcase�abspathrH�version_info�decode�getfilesystemencoding�encode�getdefaultencoding�
startswith�getcwd�sep�len)rCr<r<r=rxs�.bakcCs:d}|}x(tjj||�r0|d7}|t|�}q
W||S)z\Figure out the name of a directory to back up the given dir to
    (adding .bak, .bak2, etc)�)r?rC�exists�str)rR�ext�n�	extensionr<r<r=r�scCs2x&tjjdd�j�D]}||kr|SqWt||�S)NZPIP_EXISTS_ACTION�)r?�environ�get�splitr)�message�options�actionr<r<r=�ask_path_exists�srvcCsZxTtjjd�rtd|��t|�}|j�j�}||krNtd|dj|�f�q|SqWdS)z@Ask the message interactively, with the given possible responsesZPIP_NO_INPUTz7No input was expected ($PIP_NO_INPUT set); question: %sz<Your response (%r) was not one of the expected responses: %sz, N)	r?rprq�	Exceptionr�strip�lower�print�join)rsrtZresponser<r<r=r�scCsL|dkrd|ddS|d	kr,d|dS|dkr@d|dSd|SdS)
Ni�z%.1fMBg@�@�
z%ikBz%.1fkBz%ibytesi@Bi'r<)�bytesr<r<r=r�scCs2tjj|�sdStjj|d�}tjj|�r.dSdS)z@Return True if `path` is a directory containing a setup.py file.Fzsetup.pyT)r?rC�isdirr{�isfile)rCZsetup_pyr<r<r=r�scCstjd|�otjd|tj�S)zT
    Returns true if the page appears to be the index page of an svn repository
    z<title>[^<]*Revision \d+:z#Powered by (?:<a[^>]*?>)?Subversion)�re�search�I)Zhtmlr<r<r=r�sc	Cs$t|d��}|j�jd�SQRXdS)N�rbzutf-8)�open�readr`)�filename�fpr<r<r=r�sccs x|j|�}|sP|VqWdS)z7Yield pieces of data from a file-like object until EOF.N)r�)�file�size�chunkr<r<r=�read_chunks�s

r�cCsh|jd�jd�}d|krHd|kr4|jd�|jd�ks<d|krH|jdd�Sd|kr\|jdd�S|dfSdS)N�/�\riro)�lstrip�findrr)rCr<r<r=r�s$cCsDd}x:|D]2}t|�\}}|s"dS|dkr0|}q
||kr
dSq
WdS)zyReturns true if all the paths have the same leading path name
    (i.e., everything is in one subdirectory in an archive)NFT)r)�pathsZ
common_prefixrC�prefix�restr<r<r=r�s
TcCs2t|�}|rtjj|�}ntjj|�}tjj|�S)zN
    Convert a path to its canonical, case-normalized, absolute version.

    )rr?rC�realpathr^r])rCZresolve_symlinksr<r<r=r�s
cCs@tj|�\}}|j�jd�r8|dd�|}|dd�}||fS)z,Like os.path.splitext, but take off .tar tooz.tar�N���r�)�	posixpathrry�endswith)rC�baserlr<r<r=r�s
cCs|tjj|�\}}|r0|r0tjj|�r0tj|�tj||�tjj|�\}}|rx|rxytj|�Wntk
rvYnXdS)z7Like os.renames(), but handles renaming across devices.N)	r?rCrrrjr@rPZmove�
removedirsrA)�old�new�head�tailr<r<r=r s
cCst�s
dSt|�jttj��S)z�
    Return True if path is within sys.prefix, if we're running in a virtualenv.

    If we're not in a virtualenv, all paths are considered "local."

    T)r
rrdrHr�)rCr<r<r=�is_localsr�cCstt|��S)z�
    Return True if given Distribution object is installed locally
    (i.e. within current virtualenv).

    Always True if we're not in a virtualenv.

    )r��
dist_location)�distr<r<r=�
dist_is_local!sr�cCstt|��}|jtt��S)zF
    Return True if given Distribution is installed in user site.
    )rr�rdr	)r��	norm_pathr<r<r=�dist_in_usersite,sr�cCstt|��jtt��S)ze
    Return True if given Distribution is installed in
    distutils.sysconfig.get_python_lib().
    )rr�rdr)r�r<r<r=�dist_in_site_packages4s
r�cCs,tt|��}|jttd�djd�d��S)zf
    Return True if given Distribution is installed in
    path matching distutils_scheme layout.
    ro�purelib�pythonr)rr�rdr
rr)r�r�r<r<r=�dist_in_install_path>sr�cCs8x2tjD](}tjj||jd�}tjj|�rdSqWdS)z$Is distribution an editable install?z	.egg-linkTF)rHrCr?r{�project_namer)r�Z	path_item�egg_linkr<r<r=�dist_is_editableHs
r�csl|r
t�ndd��|r dd��ndd��|r6dd��ndd��|rHt�nd	d
�������fdd�tjD�S)
a�
    Return a list of installed Distribution objects.

    If ``local_only`` is True (default), only return installations
    local to the current virtualenv, if in a virtualenv.

    ``skip`` argument is an iterable of lower-case project names to
    ignore; defaults to stdlib_pkgs

    If ``editables`` is False, don't report editables.

    If ``editables_only`` is True , only report editables.

    If ``user_only`` is True , only report installations in the user
    site directory.

    cSsdS)NTr<)�dr<r<r=�
local_testjsz/get_installed_distributions.<locals>.local_testcSsdS)NTr<)r�r<r<r=�
editable_testnsz2get_installed_distributions.<locals>.editable_testcSs
t|�S)N)r�)r�r<r<r=r�qscSst|�S)N)r�)r�r<r<r=�editables_only_testusz8get_installed_distributions.<locals>.editables_only_testcSsdS)NTr<)r�r<r<r=r�xscSsdS)NTr<)r�r<r<r=�	user_test~sz.get_installed_distributions.<locals>.user_testcs:g|]2}�|�r|j�kr�|�r�|�r�|�r|�qSr<)�key)�.0r�)r�r�r��skipr�r<r=�
<listcomp>�s
z/get_installed_distributions.<locals>.<listcomp>)r�r�r�working_set)Z
local_onlyr�Zinclude_editablesZeditables_onlyZ	user_onlyr<)r�r�r�r�r�r=�get_installed_distributionsQs

r�cCs�g}t�r6t�r|jt�qN|jt�trN|jt�ntrD|jt�|jt�x0|D](}tjj||j�d}tjj	|�rT|SqTWdS)a
    Return the path for the .egg-link file if it exists, otherwise, None.

    There's 3 scenarios:
    1) not in a virtualenv
       try to find in site.USER_SITE, then site_packages
    2) in a no-global virtualenv
       try to find in site_packages
    3) in a yes-global virtualenv
       try to find in site_packages, then site.USER_SITE
       (don't look in global location)

    For #1 and #3, there could be odd cases, where there's an egg-link in 2
    locations.

    This method will just return the first one found.
    z	.egg-linkN)
r
r�appendrr	r?rCr{r�r)r�ZsitesZsiteZegglinkr<r<r=�
egg_link_path�s



r�cCst|�}|r|S|jS)z�
    Get the site-packages location of this distribution. Generally
    this is dist.location, except in the case of develop-installed
    packages, where dist.location is the source code location, and we
    want to know where the egg-link file is.

    )r��location)r�r�r<r<r=r��sr�c
Cs�dd�}|d�p|d�p|d�}|sZy(tjtj�tj�}||�}tj|�WnYnX|sztjjdd�tjjdd	�f}t|d�t|d�fS)
zlReturns a tuple (x, y) representing the width(x) and the height(x)
    in characters of the terminal window.cSsPy4ddl}ddl}ddl}|jd|j||jd��}Wn
dS|dkrLdS|S)NrZhhZ1234)rr)�fcntl�termios�struct�unpackZioctlZ
TIOCGWINSZ)�fdr�r�r��crr<r<r=�ioctl_GWINSZ�sz'get_terminal_size.<locals>.ioctl_GWINSZrrirZZLINES�ZCOLUMNS�P)r?r��ctermid�O_RDONLY�closerprq�int)r�r�r�r<r<r=r!�scCstjd�}tj|�|S)zBGet the current umask which involves having to set it temporarily.r)r?�umask)�maskr<r<r=�
current_umask�s

r�c
Cst|�t|d�}z�tj|dd�}t|j��o0|}x�|j�D]�}|j}|j|�}|}	|rdt	|�d}	t
jj||	�}	t
jj
|	�}
|	jd�s�|	jd�r�t|	�q<t|
�t|	d�}z|j|�Wd|j�|jd	?}|r�tj|�r�|d
@r�t
j|	dt�d
B�Xq<WWd|j�XdS)a�
    Unzip the file (with path `filename`) to the destination `location`.  All
    files are written based on system defaults and umask (i.e. permissions are
    not preserved), except that regular file members with any execute
    permissions (user, group, or world) have "chmod +x" applied after being
    written. Note that for windows, any execute changes using os.chmod are
    no-ops per the python docs.
    r�T)Z
allowZip64rir�r��wbN��Ii�)r(r��zipfileZZipFilerZnamelistZinfolistr�r�rr?rCr{�dirnamer��writer�Z
external_attrrS�S_ISREGrVr�)
r�r��flattenZzipfp�zip�leading�info�name�data�fnrRr��moder<r<r=r#�s0	




 c(Cs@t|�|j�jd�s$|j�jd�r*d}nL|j�jt�r>d}n8|j�jt�rRd}n$|j�jd�rfd}ntjd|�d	}tj||�}�z�t	d
d�|j
�D��}�x�|j
�D�]�}|j}|dkr�q�|r�t|�d
}t
jj||�}ytj|j|d�|�Wntjk
�rYnX|j��r"t|�q�|j��rxy|j||�Wn8tk
�rt}ztjd||j|�w�WYdd}~XnXq�y|j|�}	Wn<ttfk
�r�}ztjd||j|�w�WYdd}~XnXtt
jj|��t|d��}
tj|	|
�WdQRX|	j�|j||�|jd@r�t
j |dt!�dB�q�WWd|j�XdS)a�
    Untar the file (with path `filename`) to the destination `location`.
    All files are written based on system defaults and umask (i.e. permissions
    are not preserved), except that regular file members with any execute
    permissions (user, group, or world) have "chmod +x" applied after being
    written.  Note that for windows, any execute changes using os.chmod are
    no-ops per the python docs.
    z.gzz.tgzzr:gzzr:bz2zr:xzz.tar�rz-Cannot determine compression type for file %szr:*cSsg|]}|jdkr|j�qS)�pax_global_header)r�)r��memberr<r<r=r�(szuntar_file.<locals>.<listcomp>r�ri)r�z/In the tar file %s the member %s is invalid: %sNr�r�i�)"r(ryr��BZ2_EXTENSIONS�
XZ_EXTENSIONS�logger�warning�tarfiler�rZ
getmembersr�rr?rCr{Zdata_filterr[ZLinkOutsideDestinationErrorr~ZissymZ_extract_memberrwZextractfile�KeyErrorrKr�rPZcopyfileobjr��utimer�rVr�)r�r�r�Ztarr�r�r�rC�excr�Zdestfpr<r<r=r$
sh	



cCs�tjj|�}|dks,|j�jt�s,tj|�rDt|||jd�d�n�|dkslt	j
|�sl|j�jttt
�rxt||�nX|r�|jd�r�tt|��r�ddlm}|d|j�j|�ntjd	|||�td
|��dS)Nzapplication/zipz.whl)r�zapplication/x-gzipz	text/htmlr)�
Subversionzsvn+zZCannot unpack file %s (downloaded from %s, content-type: %s); cannot detect archive formatz%Cannot determine archive format of %s)r?rCr�ryr��ZIP_EXTENSIONSr�Z
is_zipfiler#r�Z
is_tarfile�TAR_EXTENSIONSr�r�r$rdrrZpip.vcs.subversionr�Zurlr�r��criticalr)r�r�Zcontent_type�linkr�r<r<r=r%`s,


�raisecCs,|r
d}ntj}|dkrng}xF|D]>}	d|	ksFd|	ksFd|	ksFd|	krVd|	jdd�}	|j|	�q"Wdj|�}tjd|�tjj	�}
|r�|
j
|�ytj|tjd|||
d�}Wn2t
k
r�}ztjd	||��WYdd}~XnX|dk	�rNg}
x\t|jj��}|�sP|j�}|
j|d�tj�tjk�r:tj|�q�|dk	r�|j�q�W|j�|dk	�r~|j�rt|jd
�n
|jd�|j�r|dk�r�tj�tjk�r�|�r�tjd
|�tjdj|
�d�td||j|f��n:|dk�r�tjd||j|�n|dk�rntdt|���|�s(dj|
�SdS)N� �
�"�'z"%s"z\"zRunning command %s)�stderr�stdin�stdout�cwd�envz#Error %s while executing command %s�error�doner�z Complete output from command %s:roz)
----------------------------------------z,Command "%s" failed with error code %s in %s�warnz$Command "%s" had error code %s in %s�ignorezInvalid value: on_returncode=%s)�
subprocess�PIPEr[r�r{r��debugr?rp�copy�update�PopenZSTDOUTrwr�rr��readline�rstripZgetEffectiveLevel�std_logging�DEBUGZspin�wait�
returncodeZfinishr�rr��
ValueError�repr)�cmdZshow_stdoutr�Z
on_returncodeZcommand_descZ
extra_environZspinnerr�Z	cmd_parts�partr��procr�Z
all_output�liner<r<r=r&�sz
 










cCsxt|d��}|j�}WdQRXdtjd�dg}x4|D],}y|j|�}Wntk
r\w4YnXPq4Wt|�tkstt�|S)aRReturn the contents of *filename*.

    Try to decode the file contents with utf-8, the preferred system encoding
    (e.g., cp1252 on some Windows machines), and latin1, in that order.
    Decoding a byte string with latin1 will never raise an error. In the worst
    case, the returned string will contain some garbage characters.
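The decode fallback described above can be sketched in a few lines (illustrative; the helper name is made up and bytes input is assumed):

import locale

def decode_with_fallback(data):
    # utf-8 first, then the preferred system encoding, then latin1,
    # which never raises but may yield some garbage characters.
    for encoding in ['utf-8', locale.getpreferredencoding(False), 'latin1']:
        try:
            return data.decode(encoding)
        except UnicodeDecodeError:
            continue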

    r�Nzutf-8F�latin1)	r�r��locale�getpreferredencodingr`�UnicodeDecodeError�typer}�AssertionError)r�r�r�Z	encodings�encr<r<r=�read_text_file�s	
rcCstj|�t|�dS)N)r?r@r)Z	build_dirr<r<r=�_make_build_dir�s
rc@s(eZdZdZdd�Zdd�Zdd�ZdS)	�FakeFilezQWrap a list of lines in an object with readline() to make
    ConfigParser happy.cCsdd�|D�|_dS)Ncss|]
}|VqdS)Nr<)r��lr<r<r=�	<genexpr>sz$FakeFile.__init__.<locals>.<genexpr>)�_gen)�self�linesr<r<r=�__init__szFakeFile.__init__cCsDy*y
t|j�Stk
r&|jj�SXWntk
r>dSXdS)Nro)�nextr�	NameError�
StopIteration)rr<r<r=r�s
zFakeFile.readlinecCs|jS)N)r)rr<r<r=�__iter__szFakeFile.__iter__N)�__name__�
__module__�__qualname__�__doc__rr�rr<r<r<r=rs	rc@s$eZdZedd��Zedd��ZdS)�
StreamWrappercCs||_|�S)N)�orig_stream)�clsr!r<r<r=�from_streamszStreamWrapper.from_streamcCs|jjS)N)r!�encoding)rr<r<r=r$szStreamWrapper.encodingN)rrr�classmethodr#�propertyr$r<r<r<r=r sr c
cs@tt|�}tt|tj|��ztt|�VWdtt||�XdS)z�Return a context manager used by captured_stdout/stdin/stderr
    that temporarily replaces the sys stream *stream_name* with a StringIO.

    Taken from Lib/support/__init__.py in the CPython repo.
    N)�getattrrH�setattrr r#)Zstream_nameZorig_stdoutr<r<r=�captured_output s

r)cCstd�S)z�Capture the output of sys.stdout:

       with captured_stdout() as stdout:
           print('hello')
       self.assertEqual(stdout.getvalue(), 'hello
')

    Taken from Lib/support/__init__.py in the CPython repo.
    r�)r)r<r<r<r=r'/s	c@s eZdZdZdd�Zdd�ZdS)�cached_propertyz�A property that is only computed once per instance and then replaces
       itself with an ordinary attribute. Deleting the attribute resets the
       property.

       Source: https://github.com/bottlepy/bottle/blob/0.11.5/bottle.py#L175
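A usage sketch of the pattern documented above: the decorated method runs once, and the computed value then shadows the descriptor in the instance __dict__. The Dataset class is hypothetical and assumes the cached_property decorator described here:

class Dataset(object):
    def __init__(self, path):
        self.path = path

    @cached_property
    def checksum(self):
        # Expensive work happens only on first access; the result is then
        # stored on the instance and this method is not called again.
        print("hashing %s ..." % self.path)
        return "deadbeef"

# d = Dataset("data.bin")
# d.checksum   # computes and prints once
# d.checksum   # served from the cached instance attribute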
    cCst|d�|_||_dS)Nr)r'rrX)rrXr<r<r=rCszcached_property.__init__cCs(|dkr|S|j|�}|j|jj<|S)N)rX�__dict__r)r�objr"�valuer<r<r=�__get__Gszcached_property.__get__N)rrrrrr.r<r<r<r=r*;sr*cCs@tjj|�}|dkrtj�}n
tj|�}|j|�}|r<|jSdS)zCGet the installed version of dist_name avoiding pkg_resources cacheN)rZRequirement�parseZ
WorkingSetr��version)Z	dist_nameZlookup_dirsZreqr�r�r<r<r=r+Os


cCst|dd�dS)zConsume an iterable at C speed.r)�maxlenN)r)�iteratorr<r<r=�consumecsr3)r,r-)r.r/r0r1r2)r3r4)r5r6r7)F)rh)T)T)TNr�NNN)N)fZ
__future__r�collectionsr�
contextlibrB�ior	Zloggingr�r�r?r�rPrSr�rHr�r�Zpip.exceptionsrZ
pip.compatrrrZ
pip.locationsrr	r
rrr
Zpip._vendorrZpip._vendor.six.movesrZpip._vendor.sixrZpip._vendor.retryingrrr�__all__Z	getLoggerrr�r�r�r�r�r)r*�bz2r9r�Zlzmar>r(r"rrQrrrvrrrrr�DEFAULT_BUFFER_SIZEr�rrrrr r�r�r�r�r�r�r�r�r�r!r�r#r$r%r&rr�objectrr �contextmanagerr)r'r*r+r3r<r<r<r=�<module>s� 
	



	

	

	5%
+S!
_

site-packages/pip/utils/__pycache__/deprecation.cpython-36.opt-1.pyc000064400000003255147511334620021321 0ustar003

���e��@s�dZddlmZddlZddlZGdd�de�ZGdd�de�ZGdd	�d	e�Z	Gd
d�dee�Z
Gdd
�d
e�Zdaddd�Z
dd�ZdS)zN
A module that implements tooling to enable easy warnings about deprecations.
�)�absolute_importNc@seZdZdS)�PipDeprecationWarningN)�__name__�
__module__�__qualname__�rr�!/usr/lib/python3.6/deprecation.pyr
src@seZdZdS)�PendingN)rrrrrrrr	sr	c@seZdZdS)�RemovedInPip10WarningN)rrrrrrrr
sr
c@seZdZdS)�RemovedInPip11WarningN)rrrrrrrrsrc@seZdZdS)�Python26DeprecationWarningN)rrrrrrrrsrcCsx|dk	r$tdk	rtt||||||�nPt|t�rbtjd�}d|}t|t�rV|j|�qt|j|�nt||||||�dS)Nzpip.deprecationszDEPRECATION: %s)�_warnings_showwarning�
issubclassr�loggingZ	getLoggerr	Zwarning�error)�message�category�filename�lineno�file�lineZloggerZlog_messagerrr�_showwarning$s


rcCs(tjdtdd�tdkr$tjatt_dS)N�defaultT)�append)�warnings�simplefilterrr
�showwarningrrrrr�install_warning_loggerDsr)NN)�__doc__Z
__future__rrr�Warningr�objectr	r
rrr
rrrrrr�<module>s
 site-packages/pip/utils/__pycache__/glibc.cpython-36.opt-1.pyc000064400000002503147511334620020077 0ustar003

���e{�@sPddlmZddlZddlZddlZddlZdd�Zdd�Zdd�Zd	d
�Z	dS)�)�absolute_importNcCsPtjd�}y
|j}Wntk
r(dSXtj|_|�}t|t�sL|jd�}|S)z9Returns glibc version string, or None if not using glibc.N�ascii)	�ctypesZCDLL�gnu_get_libc_version�AttributeErrorZc_char_pZrestype�
isinstance�str�decode)Zprocess_namespacer�version_str�r�/usr/lib/python3.6/glibc.py�glibc_version_string	s



r
cCsHtjd|�}|s$tjd|t�dSt|jd��|koFt|jd��|kS)Nz$(?P<major>[0-9]+)\.(?P<minor>[0-9]+)z=Expected glibc version with 2 components major.minor, got: %sF�major�minor)�re�match�warnings�warn�RuntimeWarning�int�group)r
�required_major�
minimum_minor�mrrr�check_glibc_version#s
rcCst�}|dkrdSt|||�S)NF)r
r)rrr
rrr�have_compatible_glibc3srcCs"t�}|dkrtj�Sd|fSdS)NZglibc)r
�platform�libc_ver)Z
glibc_versionrrrrKsr)
Z
__future__rrrrrr
rrrrrrr�<module>ssite-packages/pip/utils/__pycache__/ui.cpython-36.opt-1.pyc000064400000022441147511334620017437 0ustar003

���eM-�@s�ddlmZddlmZddlZddlZddlmZmZmZddlZddl	Z	ddl
Z
ddlmZddl
mZddlmZddlmZdd	lmZmZdd
lmZmZmZddlmZyddlmZWnek
r�dZYnXe
je�Z d
d�Z!e!ee�Z"Gdd�de#�Z$Gdd�de#�Z%Gdd�de#�Z&Gdd�de&e$e%e"�Z'Gdd�de&e$e%ee�Z(e	j)dd��Z*Gdd�de#�Z+Gdd�de#�Z,Gdd �d e#�Z-e	j)d!d"��Z.dS)#�)�absolute_import)�divisionN)�signal�SIGINT�default_int_handler)�WINDOWS)�format_size)�get_indentation)�six)�Bar�IncrementalBar)�WritelnMixin�HIDE_CURSOR�SHOW_CURSOR)�Spinner)�coloramacCs�t|jdd�}|s|St|dtj��t|dtj��g}|tt|dg��7}ytj�j|�j|�Wntk
rv|SX|SdS)N�encodingZ
empty_fillZfill�phases)�getattr�filer
Z	text_type�list�join�encode�UnicodeEncodeError)Z	preferredZfallbackrZ
characters�r�/usr/lib/python3.6/ui.py�_select_progress_classsrcs4eZdZdZ�fdd�Z�fdd�Zdd�Z�ZS)�InterruptibleMixina�
    Helper to ensure that self.finish() gets called on keyboard interrupt.

    This allows downloads to be interrupted without leaving temporary state
    (like hidden cursors) behind.

    This class is similar to the progress library's existing SigIntMixin
    helper, but as of version 1.2, that helper has the following problems:

    1. It calls sys.exit().
    2. It discards the existing SIGINT handler completely.
    3. It leaves its own handler in place even after an uninterrupted finish,
       which will have unexpected delayed effects if the user triggers an
       unrelated keyboard interrupt some time after a progress-displaying
       download has already completed, for example.
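The save-and-restore behaviour described above follows the usual signal-handler pattern; roughly (an illustrative sketch, not the mixin's actual code):

import signal
from signal import SIGINT, default_int_handler

class SigintAware(object):
    def start(self):
        # Remember whatever SIGINT handler was installed before us.
        self.original_handler = signal.signal(SIGINT, self.handle_sigint)
        if self.original_handler is None:
            self.original_handler = default_int_handler

    def finish(self):
        # Always put the previous handler back, interrupted or not.
        signal.signal(SIGINT, self.original_handler)

    def handle_sigint(self, signum, frame):
        self.finish()
        self.original_handler(signum, frame)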
    cs4tt|�j||�tt|j�|_|jdkr0t|_dS)z=
        Save the original SIGINT handler for later.
        N)�superr�__init__rr�
handle_sigint�original_handlerr)�self�args�kwargs)�	__class__rrrNs
zInterruptibleMixin.__init__cstt|�j�tt|j�dS)z�
        Restore the original SIGINT handler after finishing.

        This should happen regardless of whether the progress display finishes
        normally, or gets interrupted.
        N)rr�finishrrr!)r")r%rrr&^szInterruptibleMixin.finishcCs|j�|j||�dS)z�
        Call self.finish() before delegating to the original SIGINT handler.

        This handler should only be in place while the progress display is
        active.
        N)r&r!)r"Zsignum�framerrrr hsz InterruptibleMixin.handle_sigint)�__name__�
__module__�__qualname__�__doc__rr&r �
__classcell__rr)r%rr<s
rcsJeZdZ�fdd�Zedd��Zedd��Zedd��Zdd
d�Z�Z	S)
�DownloadProgressMixincs,tt|�j||�dt�d|j|_dS)N� �)rr-rr	�message)r"r#r$)r%rrruszDownloadProgressMixin.__init__cCs
t|j�S)N)r�index)r"rrr�
downloadedysz DownloadProgressMixin.downloadedcCs |jdkrdStd|j�dS)Ngz...�z/s)Zavgr)r"rrr�download_speed}s
z$DownloadProgressMixin.download_speedcCs|jrd|jSdS)Nzeta %s�)ZetaZeta_td)r"rrr�
pretty_eta�s
z DownloadProgressMixin.pretty_etar3ccs*x|D]}|V|j|�qW|j�dS)N)�nextr&)r"�it�n�xrrr�iter�s
zDownloadProgressMixin.iter)r3)
r(r)r*r�propertyr2r4r6r;r,rr)r%rr-ss
r-cseZdZ�fdd�Z�ZS)�WindowsMixincs\tr�jrd�_tt��j||�trXtrXtj�j��_�fdd��j_�fdd��j_	dS)NFcs�jjj�S)N)r�wrapped�isattyr)r"rr�<lambda>�sz'WindowsMixin.__init__.<locals>.<lambda>cs�jjj�S)N)rr>�flushr)r"rrr@�s)
rZhide_cursorrr=rrZAnsiToWin32rr?rA)r"r#r$)r%)r"rr�s
zWindowsMixin.__init__)r(r)r*rr,rr)r%rr=�sr=c@seZdZejZdZdZdS)�DownloadProgressBarz
%(percent)d%%z0%(downloaded)s %(download_speed)s %(pretty_eta)sN)r(r)r*�sys�stdoutrr0�suffixrrrrrB�srBc@s&eZdZejZdZdd�Zdd�ZdS)�DownloadProgressSpinnerz!%(downloaded)s %(download_speed)scCs"t|d�stj|j�|_t|j�S)N�_phaser)�hasattr�	itertools�cyclerrGr7)r"rrr�
next_phase�s
z"DownloadProgressSpinner.next_phasecCsN|j|}|j�}|j|}dj||r*dnd||r6dnd|g�}|j|�dS)Nr5r.)r0rKrErZwriteln)r"r0ZphaserE�linerrr�update�s



zDownloadProgressSpinner.updateN)	r(r)r*rCrDrrErKrMrrrrrF�srFccsRtrdVnB|j�s$tj�tjkr,dVn"|jt�z
dVWd|jt�XdS)N)	rr?�logger�getEffectiveLevel�logging�INFO�writerr)rrrr�
hidden_cursor�s

rSc@s$eZdZdd�Zdd�Zdd�ZdS)�RateLimitercCs||_d|_dS)Nr)�_min_update_interval_seconds�_last_update)r"�min_update_interval_secondsrrrr�szRateLimiter.__init__cCstj�}||j}||jkS)N)�timerVrU)r"ZnowZdeltarrr�ready�s
zRateLimiter.readycCstj�|_dS)N)rXrV)r"rrr�reset�szRateLimiter.resetN)r(r)r*rrYrZrrrrrT�srTc@s.eZdZddd�Zdd�Zdd	�Zd
d�ZdS)
�InteractiveSpinnerN�-\|/��?cCs\||_|dkrtj}||_t|�|_d|_tj|�|_	|jj
dt�|jd�d|_dS)NFr.z ... r)
�_messagerCrD�_filerT�
_rate_limiter�	_finishedrIrJ�_spin_cyclerRr	�_width)r"r0rZ
spin_charsrWrrrr�s
zInteractiveSpinner.__init__cCsRd|j}|jj|d|j|�|jj|�t|�|_|jj�|jj�dS)N�r.)rcr_rR�lenrAr`rZ)r"�statusZbackuprrr�_write	s


zInteractiveSpinner._writecCs,|jr
dS|jj�sdS|jt|j��dS)N)rar`rYrgr7rb)r"rrr�spins

zInteractiveSpinner.spincCs4|jr
dS|j|�|jjd�|jj�d|_dS)N�
T)rargr_rRrA)r"�final_statusrrrr&s

zInteractiveSpinner.finish)Nr\r])r(r)r*rrgrhr&rrrrr[�s


r[c@s.eZdZddd�Zdd�Zdd�Zdd	�Zd
S)�NonInteractiveSpinner�<cCs$||_d|_t|�|_|jd�dS)NFZstarted)r^rarTr`�_update)r"r0rWrrrr*s
zNonInteractiveSpinner.__init__cCs|jj�tjd|j|�dS)Nz%s: %s)r`rZrN�infor^)r"rfrrrrm0s
zNonInteractiveSpinner._updatecCs&|jr
dS|jj�sdS|jd�dS)Nzstill running...)rar`rYrm)r"rrrrh5s

zNonInteractiveSpinner.spincCs$|jr
dS|jd|f�d|_dS)Nzfinished with status '%s'T)rarm)r"rjrrrr&<szNonInteractiveSpinner.finishN)rl)r(r)r*rrmrhr&rrrrrk)s
rkccs�tjj�r"tj�tjkr"t|�}nt|�}y t	tj��|VWdQRXWn>t
k
rj|jd��Yn*tk
r�|jd��YnX|jd�dS)NZcanceled�error�done)
rCrDr?rNrOrPrQr[rkrS�KeyboardInterruptr&�	Exception)r0Zspinnerrrr�open_spinnerCs


rs)/Z
__future__rrrIrCrrrrX�
contextlibrPZ
pip.compatrZ	pip.utilsrZpip.utils.loggingr	Zpip._vendorr
Zpip._vendor.progress.barrrZpip._vendor.progress.helpersr
rrZpip._vendor.progress.spinnerrrrrZ	getLoggerr(rNrZ_BaseBar�objectrr-r=rBrF�contextmanagerrSrTr[rkrsrrrr�<module>sB


7
!0site-packages/pip/utils/__pycache__/build.cpython-36.pyc000064400000002420147511334620017155 0ustar003

���e �@s<ddlmZddlZddlZddlmZGdd�de�ZdS)�)�absolute_importN)�rmtreec@s6eZdZddd�Zdd�Zdd�Zdd	�Zd
d�ZdS)
�BuildDirectoryNcCsL|dkr|dkrd}|dkr<tjjtjdd��}|dkr<d}||_||_dS)NTz
pip-build-)�prefix)�os�path�realpath�tempfileZmkdtemp�name�delete)�selfr
r�r
�/usr/lib/python3.6/build.py�__init__szBuildDirectory.__init__cCsdj|jj|j�S)Nz	<{} {!r}>)�format�	__class__�__name__r
)rr
r
r�__repr__szBuildDirectory.__repr__cCs|jS)N)r
)rr
r
r�	__enter__"szBuildDirectory.__enter__cCs|j�dS)N)�cleanup)r�exc�value�tbr
r
r�__exit__%szBuildDirectory.__exit__cCs|jrt|j�dS)N)rrr
)rr
r
rr(szBuildDirectory.cleanup)NN)r�
__module__�__qualname__rrrrrr
r
r
rr	s

r)	Z
__future__rZos.pathrr	Z	pip.utilsr�objectrr
r
r
r�<module>ssite-packages/pip/utils/__pycache__/packaging.cpython-36.opt-1.pyc000064400000003657147511334620020756 0ustar003

���e �@s~ddlmZddlmZddlZddlZddlmZddlmZddl	m
Z
ddlmZej
e�Zdd	�Zd
d�Zdd
�ZdS)�)�absolute_import)�
FeedParserN)�
specifiers)�version)�
pkg_resources)�
exceptionscCs>|dkrdStj|�}tjdjtttjdd����}||kS)aG
    Check if the python version in use matches the `requires_python` specifier.

    Returns `True` if the version of python in use matches the requirement.
    Returns `False` if the version of python in use does not match the
    requirement.

    Raises an InvalidSpecifier if `requires_python` has an invalid format.
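A sketch of that check using the vendored packaging.specifiers API this module imports (illustrative; the real check_requires_python lives in pip.utils.packaging):

import sys
from pip._vendor.packaging import specifiers

def python_matches(requires_python):
    # No constraint means any interpreter is acceptable.
    if requires_python is None:
        return True
    spec = specifiers.SpecifierSet(requires_python)
    current = '.'.join(str(part) for part in sys.version_info[:3])
    return current in spec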
    NT�.�)	rZSpecifierSetr�parse�join�map�str�sys�version_info)�requires_pythonZrequires_python_specifierZpython_version�r�/usr/lib/python3.6/packaging.py�check_requires_pythons


 rcCs8t|tj�r |jd�r |jd�S|jd�r4|jd�SdS)NZMETADATAzPKG-INFO)�
isinstancerZDistInfoDistributionZhas_metadata�get_metadata)�distrrrr%s



rcCs�t|�}t�}|j|�|j�}|jd�}y8t|�s`tjd|j|dj	t
ttj
dd���f��Wn8tjk
r�}ztjd|j||f�dSd}~XnXdS)NzRequires-Pythonz4%s requires Python '%s' but the running Python is %srr	z7Package %s has an invalid Requires-Python entry %s - %s)rrZfeed�close�getrrZUnsupportedPythonVersionZproject_namerrr
rrrZInvalidSpecifier�loggerZwarning)rZmetadataZfeed_parserZ
pkg_info_dictr�errr�check_dist_requires_python-s"

$r)Z
__future__rZemail.parserrZloggingrZpip._vendor.packagingrrZpip._vendorrZpiprZ	getLogger�__name__rrrrrrrr�<module>s
site-packages/pip/utils/__pycache__/logging.cpython-36.opt-1.pyc000064400000007447147511334620020461 0ustar003

���e��@sddlmZddlZddlZddlZddlZyddlZWnek
rTddlZYnXddl	m
Z
ddlmZyddl
mZWnek
r�dZYnXej�Zde_ejddd��Zd	d
�ZGdd�dej�Zd
d�ZGdd�dej�ZGdd�dejj�ZGdd�dej�ZdS)�)�absolute_importN)�WINDOWS)�
ensure_dir)�colorama�ccs.tj|7_z
dVWdtj|8_XdS)zv
    A context manager which will cause the log output to be indented for any
    log messages emitted inside it.
    N)�
_log_state�indentation)Znum�r	�/usr/lib/python3.6/logging.py�
indent_logs
rcCsttdd�S)Nrr)�getattrrr	r	r	r
�get_indentation)sr
c@seZdZdd�ZdS)�IndentingFormattercCs,tjj||�}djdd�|jd�D��}|S)z�
        Calls the standard formatter, but will indent all of the log messages
        by our current indentation level.
        �cSsg|]}dt�|�qS)� )r
)�.0�liner	r	r
�
<listcomp>6sz-IndentingFormatter.format.<locals>.<listcomp>T)�logging�	Formatter�format�join�
splitlines)�self�recordZ	formattedr	r	r
r/s
zIndentingFormatter.formatN)�__name__�
__module__�__qualname__rr	r	r	r
r-srcs�fdd�}|S)Ncsdjt��|tjjg�S)Nr)r�listrZStyleZ	RESET_ALL)Zinp)�colorsr	r
�wrapped=sz_color_wrap.<locals>.wrappedr	)rr r	)rr
�_color_wrap<sr!c@sTeZdZer2ejeejj�fej	eejj
�fgZngZddd�Zdd�Z
dd�ZdS)	�ColorizedStreamHandlerNcCs(tjj||�tr$tr$tj|j�|_dS)N)r�
StreamHandler�__init__rr�AnsiToWin32�stream)rr&r	r	r
r$NszColorizedStreamHandler.__init__cCsRtsdSt|jtj�s|jn|jj}t|d�r:|j�r:dStjj	d�dkrNdSdS)NF�isattyTZTERMZANSI)
r�
isinstancer&r%r �hasattrr'�os�environ�get)rZreal_streamr	r	r
�should_colorTsz#ColorizedStreamHandler.should_colorcCsBtjj||�}|j�r>x&|jD]\}}|j|kr||�}PqW|S)N)rr#rr-�COLORS�levelno)rr�msg�levelZcolorr	r	r
ris
zColorizedStreamHandler.format)N)rrrrrZERRORr!ZForeZREDZWARNINGZYELLOWr.r$r-rr	r	r	r
r"Bs
r"c@seZdZdd�ZdS)�BetterRotatingFileHandlercCs ttjj|j��tjjj|�S)N)	rr*�path�dirnameZbaseFilenamer�handlers�RotatingFileHandler�_open)rr	r	r
r7wszBetterRotatingFileHandler._openN)rrrr7r	r	r	r
r2usr2c@seZdZdd�Zdd�ZdS)�MaxLevelFiltercCs
||_dS)N)r1)rr1r	r	r
r$~szMaxLevelFilter.__init__cCs|j|jkS)N)r/r1)rrr	r	r
�filter�szMaxLevelFilter.filterN)rrrr$r9r	r	r	r
r8|sr8)r) Z
__future__r�
contextlibrZlogging.handlersr*Z	threading�ImportErrorZdummy_threadingZ
pip.compatrZ	pip.utilsrZpip._vendorr�	ExceptionZlocalrr�contextmanagerrr
rrr!r#r"r5r6r2�Filterr8r	r	r	r
�<module>s0
3site-packages/pip/utils/__pycache__/setuptools_build.cpython-36.pyc000064400000000464147511334620021464 0ustar003

���e�@sdZdS)z�import setuptools, tokenize;__file__=%r;f=getattr(tokenize, 'open', open)(__file__);code=f.read().replace('\r\n', '\n');f.close();exec(compile(code, __file__, 'exec'))N)ZSETUPTOOLS_SHIM�rr�&/usr/lib/python3.6/setuptools_build.py�<module>ssite-packages/pip/utils/__pycache__/__init__.cpython-36.opt-1.pyc000064400000054075147511334620020571 0ustar003

���eml�@s�ddlmZddlmZddlZddlZddlZddlZddlZ	ddl
Z
ddlZddlZddl
Z
ddlZddlZddlZddlZddlZddlmZddlmZmZmZddlmZmZmZmZmZmZddl m!Z!ddl"m#Z#dd	l$m%Z%dd
l&m'Z'e%�rddlm(Z)nddlm)Z)d
ddddddddddddddddddd d!d"d#d$gZ*e	j+e,�Z-dzZ.d{Z/d|Z0d}Z1e0e.e1e/Z2e0e1Z3yddl4Z4e3e.7Z3Wn e5k
�r�e-j6d1�YnXyddl7Z7e3e/7Z3Wn e5k
�r�e-j6d2�YnXd3d4�Z8d5d!�Z9d6d�Z:e'd7d8d9�d~d;d
��Z;d<d=�Z<d>d�Z=dd@d�Z>dAdB�Z?dCd�Z@dDd�ZAdEd�ZBdFd�ZCdGd�ZDejEfdHdI�ZFdJd�ZGdKd�ZHd�dMd�ZIdNd�ZJdOd�ZKdPdQ�ZLdRdS�ZMdTdU�ZNdVdW�ZOdXdY�ZPdZd[�ZQdLedLd:d:fd\d]�ZRd^d_�ZSd`da�ZTdbd�ZUdcdd�ZVd�ded�ZWdfd�ZXdgd�ZYd�did�ZZdjdk�Z[dldm�Z\Gdndo�doe]�Z^Gdpdq�dqe)�Z_ej`drds��Zadtd �ZbGdudv�dve]�Zcd�dwd$�Zddxdy�ZedS)��)�absolute_import)�dequeN)�InstallationError)�console_to_str�
expanduser�stdlib_pkgs)�
site_packages�	user_site�running_under_virtualenv�virtualenv_no_global�write_delete_marker_file�distutils_scheme)�
pkg_resources)�input)�PY2)�retry)�BytesIO)�StringIO�rmtree�display_path�
backup_dir�ask�splitext�format_size�is_installable_dir�is_svn_page�
file_contents�split_leading_dir�has_leading_dir�normalize_path�renames�get_terminal_size�get_prog�
unzip_file�
untar_file�unpack_file�call_subprocess�captured_stdout�
ensure_dir�ARCHIVE_EXTENSIONS�SUPPORTED_EXTENSIONS�get_installed_version�.tar.bz2�.tbz�.tar.xz�.txz�.tlz�.tar.lz�	.tar.lzma�.zip�.whl�.tar.gz�.tgz�.tarzbz2 module is not availablezlzma module is not availablecOs,yt|�Stk
r&|||��YnXdS)N)�
__import__�ImportError)Zpkg_or_module_stringZ
ExceptionType�args�kwargs�r<�/usr/lib/python3.6/__init__.py�import_or_raiseIsr>cCsDytj|�Wn0tk
r>}z|jtjkr.�WYdd}~XnXdS)z os.path.makedirs without EEXIST.N)�os�makedirs�OSError�errnoZEEXIST)�path�er<r<r=r(Ps
c
CsDy$tjjtjd�dkr"dtjSWntttfk
r>YnXdS)Nr�__main__.py�-cz	%s -m pipZpip)rErF)	r?rC�basename�sys�argv�
executable�AttributeError�	TypeError�
IndexErrorr<r<r<r=r"Ysi�i�)Zstop_max_delayZ
wait_fixedFcCstj||td�dS)N)�
ignore_errors�onerror)�shutilr�rmtree_errorhandler)�dirrNr<r<r=rcscCs2tj|�jtj@r,tj|tj�||�dS�dS)z�On Windows, the files in .svn are read-only, so when rmtree() tries to
    remove them, an exception is thrown.  We catch that here, remove the
    read-only attribute, and hopefully continue without problems.N)r?�stat�st_mode�S_IREAD�chmod�S_IWRITE)�funcrC�exc_infor<r<r=rQis
rQcCsttjjtjj|��}tjddkrB|jtj�d�}|jtj	�d�}|j
tj�tjj�rpd|t
tj��d�}|S)zTGives the display value for a given path, making it relative to cwd
    if possible.r��replace�.N)r?rC�normcase�abspathrH�version_info�decode�getfilesystemencoding�encode�getdefaultencoding�
startswith�getcwd�sep�len)rCr<r<r=rxs�.bakcCs:d}|}x(tjj||�r0|d7}|t|�}q
W||S)z\Figure out the name of a directory to back up the given dir to
    (adding .bak, .bak2, etc)�)r?rC�exists�str)rR�ext�n�	extensionr<r<r=r�scCs2x&tjjdd�j�D]}||kr|SqWt||�S)NZPIP_EXISTS_ACTION�)r?�environ�get�splitr)�message�options�actionr<r<r=�ask_path_exists�srvcCsZxTtjjd�rtd|��t|�}|j�j�}||krNtd|dj|�f�q|SqWdS)z@Ask the message interactively, with the given possible responsesZPIP_NO_INPUTz7No input was expected ($PIP_NO_INPUT set); question: %sz<Your response (%r) was not one of the expected responses: %sz, N)	r?rprq�	Exceptionr�strip�lower�print�join)rsrtZresponser<r<r=r�scCsL|dkrd|ddS|d	kr,d|dS|dkr@d|dSd|SdS)
Ni�z%.1fMBg@�@�
z%ikBz%.1fkBz%ibytesi@Bi'r<)�bytesr<r<r=r�scCs2tjj|�sdStjj|d�}tjj|�r.dSdS)z@Return True if `path` is a directory containing a setup.py file.Fzsetup.pyT)r?rC�isdirr{�isfile)rCZsetup_pyr<r<r=r�scCstjd|�otjd|tj�S)zT
    Returns true if the page appears to be the index page of an svn repository
    z<title>[^<]*Revision \d+:z#Powered by (?:<a[^>]*?>)?Subversion)�re�search�I)Zhtmlr<r<r=r�sc	Cs$t|d��}|j�jd�SQRXdS)N�rbzutf-8)�open�readr`)�filename�fpr<r<r=r�sccs x|j|�}|sP|VqWdS)z7Yield pieces of data from a file-like object until EOF.N)r�)�file�size�chunkr<r<r=�read_chunks�s

r�cCsh|jd�jd�}d|krHd|kr4|jd�|jd�ks<d|krH|jdd�Sd|kr\|jdd�S|dfSdS)N�/�\riro)�lstrip�findrr)rCr<r<r=r�s$cCsDd}x:|D]2}t|�\}}|s"dS|dkr0|}q
||kr
dSq
WdS)zyReturns true if all the paths have the same leading path name
    (i.e., everything is in one subdirectory in an archive)NFT)r)�pathsZ
common_prefixrC�prefix�restr<r<r=r�s
TcCs2t|�}|rtjj|�}ntjj|�}tjj|�S)zN
    Convert a path to its canonical, case-normalized, absolute version.

    )rr?rC�realpathr^r])rCZresolve_symlinksr<r<r=r�s
cCs@tj|�\}}|j�jd�r8|dd�|}|dd�}||fS)z,Like os.path.splitext, but take off .tar tooz.tar�N���r�)�	posixpathrry�endswith)rC�baserlr<r<r=r�s
cCs|tjj|�\}}|r0|r0tjj|�r0tj|�tj||�tjj|�\}}|rx|rxytj|�Wntk
rvYnXdS)z7Like os.renames(), but handles renaming across devices.N)	r?rCrrrjr@rPZmove�
removedirsrA)�old�new�head�tailr<r<r=r s
cCst�s
dSt|�jttj��S)z�
    Return True if path is within sys.prefix, if we're running in a virtualenv.

    If we're not in a virtualenv, all paths are considered "local."

    T)r
rrdrHr�)rCr<r<r=�is_localsr�cCstt|��S)z�
    Return True if given Distribution object is installed locally
    (i.e. within current virtualenv).

    Always True if we're not in a virtualenv.

    )r��
dist_location)�distr<r<r=�
dist_is_local!sr�cCstt|��}|jtt��S)zF
    Return True if given Distribution is installed in user site.
    )rr�rdr	)r��	norm_pathr<r<r=�dist_in_usersite,sr�cCstt|��jtt��S)ze
    Return True if given Distribution is installed in
    distutils.sysconfig.get_python_lib().
    )rr�rdr)r�r<r<r=�dist_in_site_packages4s
r�cCs,tt|��}|jttd�djd�d��S)zf
    Return True if given Distribution is installed in
    path matching distutils_scheme layout.
    ro�purelib�pythonr)rr�rdr
rr)r�r�r<r<r=�dist_in_install_path>sr�cCs8x2tjD](}tjj||jd�}tjj|�rdSqWdS)z$Is distribution an editable install?z	.egg-linkTF)rHrCr?r{�project_namer)r�Z	path_item�egg_linkr<r<r=�dist_is_editableHs
r�csl|r
t�ndd��|r dd��ndd��|r6dd��ndd��|rHt�nd	d
�������fdd�tjD�S)
a�
    Return a list of installed Distribution objects.

    If ``local_only`` is True (default), only return installations
    local to the current virtualenv, if in a virtualenv.

    ``skip`` argument is an iterable of lower-case project names to
    ignore; defaults to stdlib_pkgs

    If ``editables`` is False, don't report editables.

    If ``editables_only`` is True , only report editables.

    If ``user_only`` is True , only report installations in the user
    site directory.

    cSsdS)NTr<)�dr<r<r=�
local_testjsz/get_installed_distributions.<locals>.local_testcSsdS)NTr<)r�r<r<r=�
editable_testnsz2get_installed_distributions.<locals>.editable_testcSs
t|�S)N)r�)r�r<r<r=r�qscSst|�S)N)r�)r�r<r<r=�editables_only_testusz8get_installed_distributions.<locals>.editables_only_testcSsdS)NTr<)r�r<r<r=r�xscSsdS)NTr<)r�r<r<r=�	user_test~sz.get_installed_distributions.<locals>.user_testcs:g|]2}�|�r|j�kr�|�r�|�r�|�r|�qSr<)�key)�.0r�)r�r�r��skipr�r<r=�
<listcomp>�s
z/get_installed_distributions.<locals>.<listcomp>)r�r�r�working_set)Z
local_onlyr�Zinclude_editablesZeditables_onlyZ	user_onlyr<)r�r�r�r�r�r=�get_installed_distributionsQs

r�cCs�g}t�r6t�r|jt�qN|jt�trN|jt�ntrD|jt�|jt�x0|D](}tjj||j�d}tjj	|�rT|SqTWdS)a
    Return the path for the .egg-link file if it exists, otherwise, None.

    There's 3 scenarios:
    1) not in a virtualenv
       try to find in site.USER_SITE, then site_packages
    2) in a no-global virtualenv
       try to find in site_packages
    3) in a yes-global virtualenv
       try to find in site_packages, then site.USER_SITE
       (don't look in global location)

    For #1 and #3, there could be odd cases, where there's an egg-link in 2
    locations.

    This method will just return the first one found.
    z	.egg-linkN)
r
r�appendrr	r?rCr{r�r)r�ZsitesZsiteZegglinkr<r<r=�
egg_link_path�s



r�cCst|�}|r|S|jS)z�
    Get the site-packages location of this distribution. Generally
    this is dist.location, except in the case of develop-installed
    packages, where dist.location is the source code location, and we
    want to know where the egg-link file is.

    )r��location)r�r�r<r<r=r��sr�c
Cs�dd�}|d�p|d�p|d�}|sZy(tjtj�tj�}||�}tj|�WnYnX|sztjjdd�tjjdd	�f}t|d�t|d�fS)
zlReturns a tuple (x, y) representing the width(x) and the height(y)
    in characters of the terminal window.cSsPy4ddl}ddl}ddl}|jd|j||jd��}Wn
dS|dkrLdS|S)NrZhhZ1234)rr)�fcntl�termios�struct�unpackZioctlZ
TIOCGWINSZ)�fdr�r�r��crr<r<r=�ioctl_GWINSZ�sz'get_terminal_size.<locals>.ioctl_GWINSZrrirZZLINES�ZCOLUMNS�P)r?r��ctermid�O_RDONLY�closerprq�int)r�r�r�r<r<r=r!�scCstjd�}tj|�|S)zBGet the current umask which involves having to set it temporarily.r)r?�umask)�maskr<r<r=�
current_umask�s

r�c
Cst|�t|d�}z�tj|dd�}t|j��o0|}x�|j�D]�}|j}|j|�}|}	|rdt	|�d}	t
jj||	�}	t
jj
|	�}
|	jd�s�|	jd�r�t|	�q<t|
�t|	d�}z|j|�Wd|j�|jd	?}|r�tj|�r�|d
@r�t
j|	dt�d
B�Xq<WWd|j�XdS)a�
    Unzip the file (with path `filename`) to the destination `location`.  All
    files are written based on system defaults and umask (i.e. permissions are
    not preserved), except that regular file members with any execute
    permissions (user, group, or world) have "chmod +x" applied after being
    written. Note that for windows, any execute changes using os.chmod are
    no-ops per the python docs.
    r�T)Z
allowZip64rir�r��wbN��Ii�)r(r��zipfileZZipFilerZnamelistZinfolistr�r�rr?rCr{�dirnamer��writer�Z
external_attrrS�S_ISREGrVr�)
r�r��flattenZzipfp�zip�leading�info�name�data�fnrRr��moder<r<r=r#�s0	




 c(Cs@t|�|j�jd�s$|j�jd�r*d}nL|j�jt�r>d}n8|j�jt�rRd}n$|j�jd�rfd}ntjd|�d	}tj||�}�z�t	d
d�|j
�D��}�x�|j
�D�]�}|j}|dkr�q�|r�t|�d
}t
jj||�}ytj|j|d�|�Wntjk
�rYnX|j��r"t|�q�|j��rxy|j||�Wn8tk
�rt}ztjd||j|�w�WYdd}~XnXq�y|j|�}	Wn<ttfk
�r�}ztjd||j|�w�WYdd}~XnXtt
jj|��t|d��}
tj|	|
�WdQRX|	j�|j||�|jd@r�t
j |dt!�dB�q�WWd|j�XdS)a�
    Untar the file (with path `filename`) to the destination `location`.
    All files are written based on system defaults and umask (i.e. permissions
    are not preserved), except that regular file members with any execute
    permissions (user, group, or world) have "chmod +x" applied after being
    written.  Note that for Windows, any execute changes using os.chmod are
    no-ops per the Python docs.
    z.gzz.tgzzr:gzzr:bz2zr:xzz.tar�rz-Cannot determine compression type for file %szr:*cSsg|]}|jdkr|j�qS)�pax_global_header)r�)r��memberr<r<r=r�(szuntar_file.<locals>.<listcomp>r�ri)r�z/In the tar file %s the member %s is invalid: %sNr�r�i�)"r(ryr��BZ2_EXTENSIONS�
XZ_EXTENSIONS�logger�warning�tarfiler�rZ
getmembersr�rr?rCr{Zdata_filterr[ZLinkOutsideDestinationErrorr~ZissymZ_extract_memberrwZextractfile�KeyErrorrKr�rPZcopyfileobjr��utimer�rVr�)r�r�r�Ztarr�r�r�rC�excr�Zdestfpr<r<r=r$
sh	



cCs�tjj|�}|dks,|j�jt�s,tj|�rDt|||jd�d�n�|dkslt	j
|�sl|j�jttt
�rxt||�nX|r�|jd�r�tt|��r�ddlm}|d|j�j|�ntjd	|||�td
|��dS)Nzapplication/zipz.whl)r�zapplication/x-gzipz	text/htmlr)�
Subversionzsvn+zZCannot unpack file %s (downloaded from %s, content-type: %s); cannot detect archive formatz%Cannot determine archive format of %s)r?rCr�ryr��ZIP_EXTENSIONSr�Z
is_zipfiler#r�Z
is_tarfile�TAR_EXTENSIONSr�r�r$rdrrZpip.vcs.subversionr�Zurlr�r��criticalr)r�r�Zcontent_type�linkr�r<r<r=r%`s,


�raisecCs,|r
d}ntj}|dkrng}xF|D]>}	d|	ksFd|	ksFd|	ksFd|	krVd|	jdd�}	|j|	�q"Wdj|�}tjd|�tjj	�}
|r�|
j
|�ytj|tjd|||
d�}Wn2t
k
r�}ztjd	||��WYdd}~XnX|dk	�rNg}
x\t|jj��}|�sP|j�}|
j|d�tj�tjk�r:tj|�q�|dk	r�|j�q�W|j�|dk	�r~|j�rt|jd
�n
|jd�|j�r|dk�r�tj�tjk�r�|�r�tjd
|�tjdj|
�d�td||j|f��n:|dk�r�tjd||j|�n|dk�rntdt|���|�s(dj|
�SdS)N� �
�"�'z"%s"z\"zRunning command %s)�stderr�stdin�stdout�cwd�envz#Error %s while executing command %s�error�doner�z Complete output from command %s:roz)
----------------------------------------z,Command "%s" failed with error code %s in %s�warnz$Command "%s" had error code %s in %s�ignorezInvalid value: on_returncode=%s)�
subprocess�PIPEr[r�r{r��debugr?rp�copy�update�PopenZSTDOUTrwr�rr��readline�rstripZgetEffectiveLevel�std_logging�DEBUGZspin�wait�
returncodeZfinishr�rr��
ValueError�repr)�cmdZshow_stdoutr�Z
on_returncodeZcommand_descZ
extra_environZspinnerr�Z	cmd_parts�partr��procr�Z
all_output�liner<r<r=r&�sz
 










cCsht|d��}|j�}WdQRXdtjd�dg}x4|D],}y|j|�}Wntk
r\w4YnXPq4W|S)aRReturn the contents of *filename*.

    Try to decode the file contents with utf-8, the preferred system encoding
    (e.g., cp1252 on some Windows machines), and latin1, in that order.
    Decoding a byte string with latin1 will never raise an error. In the worst
    case, the returned string will contain some garbage characters.

    r�Nzutf-8F�latin1)r�r��locale�getpreferredencodingr`�UnicodeDecodeError)r�r�r�Z	encodings�encr<r<r=�read_text_file�s	
r
cCstj|�t|�dS)N)r?r@r)Z	build_dirr<r<r=�_make_build_dir�s
rc@s(eZdZdZdd�Zdd�Zdd�ZdS)	�FakeFilezQWrap a list of lines in an object with readline() to make
    ConfigParser happy.cCsdd�|D�|_dS)Ncss|]
}|VqdS)Nr<)r��lr<r<r=�	<genexpr>sz$FakeFile.__init__.<locals>.<genexpr>)�_gen)�self�linesr<r<r=�__init__szFakeFile.__init__cCsDy*y
t|j�Stk
r&|jj�SXWntk
r>dSXdS)Nro)�nextr�	NameError�
StopIteration)rr<r<r=r�s
zFakeFile.readlinecCs|jS)N)r)rr<r<r=�__iter__szFakeFile.__iter__N)�__name__�
__module__�__qualname__�__doc__rr�rr<r<r<r=rs	rc@s$eZdZedd��Zedd��ZdS)�
StreamWrappercCs||_|�S)N)�orig_stream)�clsrr<r<r=�from_streamszStreamWrapper.from_streamcCs|jjS)N)r�encoding)rr<r<r=r"szStreamWrapper.encodingN)rrr�classmethodr!�propertyr"r<r<r<r=rsrc
cs@tt|�}tt|tj|��ztt|�VWdtt||�XdS)z�Return a context manager used by captured_stdout/stdin/stderr
    that temporarily replaces the sys stream *stream_name* with a StringIO.

    Taken from Lib/support/__init__.py in the CPython repo.
    N)�getattrrH�setattrrr!)Zstream_nameZorig_stdoutr<r<r=�captured_output s

r'cCstd�S)z�Capture the output of sys.stdout:

       with captured_stdout() as stdout:
           print('hello')
       self.assertEqual(stdout.getvalue(), 'hello
')

    Taken from Lib/support/__init__.py in the CPython repo.
    r�)r'r<r<r<r=r'/s	c@s eZdZdZdd�Zdd�ZdS)�cached_propertyz�A property that is only computed once per instance and then replaces
       itself with an ordinary attribute. Deleting the attribute resets the
       property.

       Source: https://github.com/bottlepy/bottle/blob/0.11.5/bottle.py#L175
    cCst|d�|_||_dS)Nr)r%rrX)rrXr<r<r=rCszcached_property.__init__cCs(|dkr|S|j|�}|j|jj<|S)N)rX�__dict__r)r�objr �valuer<r<r=�__get__Gszcached_property.__get__N)rrrrrr,r<r<r<r=r(;sr(cCs@tjj|�}|dkrtj�}n
tj|�}|j|�}|r<|jSdS)zCGet the installed version of dist_name avoiding pkg_resources cacheN)rZRequirement�parseZ
WorkingSetr��version)Z	dist_nameZlookup_dirsZreqr�r�r<r<r=r+Os


cCst|dd�dS)zConsume an iterable at C speed.r)�maxlenN)r)�iteratorr<r<r=�consumecsr1)r,r-)r.r/r0r1r2)r3r4)r5r6r7)F)rh)T)T)TNr�NNN)N)fZ
__future__r�collectionsr�
contextlibrB�ior	Zloggingr�r�r?r�rPrSr�rHr�r�Zpip.exceptionsrZ
pip.compatrrrZ
pip.locationsrr	r
rrr
Zpip._vendorrZpip._vendor.six.movesrZpip._vendor.sixrZpip._vendor.retryingrrr�__all__Z	getLoggerrr�r�r�r�r�r)r*�bz2r9r�Zlzmar>r(r"rrQrrrvrrrrr�DEFAULT_BUFFER_SIZEr�rrrrr r�r�r�r�r�r�r�r�r�r!r�r#r$r%r&r
r�objectrr�contextmanagerr'r'r(r+r1r<r<r<r=�<module>s� 
	



	

	

	5%
+S!
_

site-packages/pip/utils/__pycache__/logging.cpython-36.pyc000064400000007447147511334620017522 0ustar003

���e��@sddlmZddlZddlZddlZddlZyddlZWnek
rTddlZYnXddl	m
Z
ddlmZyddl
mZWnek
r�dZYnXej�Zde_ejddd��Zd	d
�ZGdd�dej�Zd
d�ZGdd�dej�ZGdd�dejj�ZGdd�dej�ZdS)�)�absolute_importN)�WINDOWS)�
ensure_dir)�colorama�ccs.tj|7_z
dVWdtj|8_XdS)zv
    A context manager which will cause the log output to be indented for any
    log messages emitted inside it.
    N)�
_log_state�indentation)Znum�r	�/usr/lib/python3.6/logging.py�
indent_logs
rcCsttdd�S)Nrr)�getattrrr	r	r	r
�get_indentation)sr
c@seZdZdd�ZdS)�IndentingFormattercCs,tjj||�}djdd�|jd�D��}|S)z�
        Calls the standard formatter, but will indent all of the log messages
        by our current indentation level.
        �cSsg|]}dt�|�qS)� )r
)�.0�liner	r	r
�
<listcomp>6sz-IndentingFormatter.format.<locals>.<listcomp>T)�logging�	Formatter�format�join�
splitlines)�self�recordZ	formattedr	r	r
r/s
zIndentingFormatter.formatN)�__name__�
__module__�__qualname__rr	r	r	r
r-srcs�fdd�}|S)Ncsdjt��|tjjg�S)Nr)r�listrZStyleZ	RESET_ALL)Zinp)�colorsr	r
�wrapped=sz_color_wrap.<locals>.wrappedr	)rr r	)rr
�_color_wrap<sr!c@sTeZdZer2ejeejj�fej	eejj
�fgZngZddd�Zdd�Z
dd�ZdS)	�ColorizedStreamHandlerNcCs(tjj||�tr$tr$tj|j�|_dS)N)r�
StreamHandler�__init__rr�AnsiToWin32�stream)rr&r	r	r
r$NszColorizedStreamHandler.__init__cCsRtsdSt|jtj�s|jn|jj}t|d�r:|j�r:dStjj	d�dkrNdSdS)NF�isattyTZTERMZANSI)
r�
isinstancer&r%r �hasattrr'�os�environ�get)rZreal_streamr	r	r
�should_colorTsz#ColorizedStreamHandler.should_colorcCsBtjj||�}|j�r>x&|jD]\}}|j|kr||�}PqW|S)N)rr#rr-�COLORS�levelno)rr�msg�levelZcolorr	r	r
ris
zColorizedStreamHandler.format)N)rrrrrZERRORr!ZForeZREDZWARNINGZYELLOWr.r$r-rr	r	r	r
r"Bs
r"c@seZdZdd�ZdS)�BetterRotatingFileHandlercCs ttjj|j��tjjj|�S)N)	rr*�path�dirnameZbaseFilenamer�handlers�RotatingFileHandler�_open)rr	r	r
r7wszBetterRotatingFileHandler._openN)rrrr7r	r	r	r
r2usr2c@seZdZdd�Zdd�ZdS)�MaxLevelFiltercCs
||_dS)N)r1)rr1r	r	r
r$~szMaxLevelFilter.__init__cCs|j|jkS)N)r/r1)rrr	r	r
�filter�szMaxLevelFilter.filterN)rrrr$r9r	r	r	r
r8|sr8)r) Z
__future__r�
contextlibrZlogging.handlersr*Z	threading�ImportErrorZdummy_threadingZ
pip.compatrZ	pip.utilsrZpip._vendorr�	ExceptionZlocalrr�contextmanagerrr
rrr!r#r"r5r6r2�Filterr8r	r	r	r
�<module>s0
3site-packages/pip/utils/__pycache__/glibc.cpython-36.pyc000064400000002503147511334620017140 0ustar003

���e{�@sPddlmZddlZddlZddlZddlZdd�Zdd�Zdd�Zd	d
�Z	dS)�)�absolute_importNcCsPtjd�}y
|j}Wntk
r(dSXtj|_|�}t|t�sL|jd�}|S)z9Returns glibc version string, or None if not using glibc.N�ascii)	�ctypesZCDLL�gnu_get_libc_version�AttributeErrorZc_char_pZrestype�
isinstance�str�decode)Zprocess_namespacer�version_str�r�/usr/lib/python3.6/glibc.py�glibc_version_string	s



r
cCsHtjd|�}|s$tjd|t�dSt|jd��|koFt|jd��|kS)Nz$(?P<major>[0-9]+)\.(?P<minor>[0-9]+)z=Expected glibc version with 2 components major.minor, got: %sF�major�minor)�re�match�warnings�warn�RuntimeWarning�int�group)r
�required_major�
minimum_minor�mrrr�check_glibc_version#s
rcCst�}|dkrdSt|||�S)NF)r
r)rrr
rrr�have_compatible_glibc3srcCs"t�}|dkrtj�Sd|fSdS)NZglibc)r
�platform�libc_ver)Z
glibc_versionrrrrKsr)
Z
__future__rrrrrr
rrrrrrr�<module>ssite-packages/pip/utils/__pycache__/hashes.cpython-36.opt-1.pyc000064400000006227147511334620020301 0ustar003

���e2�@szddlmZddlZddlmZmZmZddlmZddl	m
Z
mZmZdZ
dddgZGd	d
�d
e�ZGdd�de�ZdS)
�)�absolute_importN)�HashMismatch�HashMissing�InstallationError)�read_chunks)�	iteritems�iterkeys�
itervaluesZsha256Zsha384Zsha512c@sJeZdZdZddd�Zdd�Zdd�Zd	d
�Zdd�Zd
d�Z	dd�Z
dS)�HasheszaA wrapper that builds multiple hashes at once and checks them against
    known-good values

    NcCs|dkrin||_dS)zo
        :param hashes: A dict of algorithm names pointing to lists of allowed
            hex digests
        N)�_allowed)�self�hashes�r�/usr/lib/python3.6/hashes.py�__init__szHashes.__init__c
Cs�i}xJt|j�D]<}ytj|�||<Wqttfk
rJtd|��YqXqWx(|D] }xt|�D]}|j|�qdWqVWx*t	|�D]\}}|j
�|j|kr�dSq�W|j|�dS)z�Check good hashes against ones built from iterable of chunks of
        data.

        Raise HashMismatch if none match.

        zUnknown hash name: %sN)rr�hashlib�new�
ValueError�	TypeErrorrr	�updater�	hexdigest�_raise)rZchunks�gotsZ	hash_name�chunk�hashZgotrrr�check_against_chunks s
zHashes.check_against_chunkscCst|j|��dS)N)rr)rrrrrr7sz
Hashes._raisecCs|jt|��S)zaCheck good hashes against a file-like object

        Raise HashMismatch if none match.

        )rr)r�filerrr�check_against_file:szHashes.check_against_filec	Cs t|d��}|j|�SQRXdS)N�rb)�openr)r�pathrrrr�check_against_pathBszHashes.check_against_pathcCs
t|j�S)z,Return whether I know any known-good hashes.)�boolr)rrrr�__nonzero__FszHashes.__nonzero__cCs|j�S)N)r#)rrrr�__bool__JszHashes.__bool__)N)�__name__�
__module__�__qualname__�__doc__rrrrr!r#r$rrrrr
s
r
cs(eZdZdZ�fdd�Zdd�Z�ZS)�
MissingHashesz�A workalike for Hashes used when we're missing a hash for a requirement

    It computes the actual hash of the requirement and raises a HashMissing
    exception showing it to the user.

    cstt|�jtgid�dS)z!Don't offer the ``hashes`` kwarg.)r
N)�superr)r�
FAVORITE_HASH)r)�	__class__rrrUszMissingHashes.__init__cCst|tj���dS)N)rr+r)rrrrrr[szMissingHashes._raise)r%r&r'r(rr�
__classcell__rr)r,rr)Nsr))Z
__future__rrZpip.exceptionsrrrZ	pip.utilsrZpip._vendor.sixrrr	r+Z
STRONG_HASHES�objectr
r)rrrr�<module>s
:site-packages/pip/utils/__pycache__/deprecation.cpython-36.pyc000064400000003255147511334620020362 0ustar003

���e��@s�dZddlmZddlZddlZGdd�de�ZGdd�de�ZGdd	�d	e�Z	Gd
d�dee�Z
Gdd
�d
e�Zdaddd�Z
dd�ZdS)zN
A module that implements tooling to enable easy warnings about deprecations.
�)�absolute_importNc@seZdZdS)�PipDeprecationWarningN)�__name__�
__module__�__qualname__�rr�!/usr/lib/python3.6/deprecation.pyr
src@seZdZdS)�PendingN)rrrrrrrr	sr	c@seZdZdS)�RemovedInPip10WarningN)rrrrrrrr
sr
c@seZdZdS)�RemovedInPip11WarningN)rrrrrrrrsrc@seZdZdS)�Python26DeprecationWarningN)rrrrrrrrsrcCsx|dk	r$tdk	rtt||||||�nPt|t�rbtjd�}d|}t|t�rV|j|�qt|j|�nt||||||�dS)Nzpip.deprecationszDEPRECATION: %s)�_warnings_showwarning�
issubclassr�loggingZ	getLoggerr	Zwarning�error)�message�category�filename�lineno�file�lineZloggerZlog_messagerrr�_showwarning$s


rcCs(tjdtdd�tdkr$tjatt_dS)N�defaultT)�append)�warnings�simplefilterrr
�showwarningrrrrr�install_warning_loggerDsr)NN)�__doc__Z
__future__rrr�Warningr�objectr	r
rrr
rrrrrr�<module>s
 site-packages/pip/utils/__pycache__/hashes.cpython-36.pyc000064400000006227147511334620017342 0ustar003

���e2�@szddlmZddlZddlmZmZmZddlmZddl	m
Z
mZmZdZ
dddgZGd	d
�d
e�ZGdd�de�ZdS)
�)�absolute_importN)�HashMismatch�HashMissing�InstallationError)�read_chunks)�	iteritems�iterkeys�
itervaluesZsha256Zsha384Zsha512c@sJeZdZdZddd�Zdd�Zdd�Zd	d
�Zdd�Zd
d�Z	dd�Z
dS)�HasheszaA wrapper that builds multiple hashes at once and checks them against
    known-good values

    NcCs|dkrin||_dS)zo
        :param hashes: A dict of algorithm names pointing to lists of allowed
            hex digests
        N)�_allowed)�self�hashes�r�/usr/lib/python3.6/hashes.py�__init__szHashes.__init__c
Cs�i}xJt|j�D]<}ytj|�||<Wqttfk
rJtd|��YqXqWx(|D] }xt|�D]}|j|�qdWqVWx*t	|�D]\}}|j
�|j|kr�dSq�W|j|�dS)z�Check good hashes against ones built from iterable of chunks of
        data.

        Raise HashMismatch if none match.

        zUnknown hash name: %sN)rr�hashlib�new�
ValueError�	TypeErrorrr	�updater�	hexdigest�_raise)rZchunks�gotsZ	hash_name�chunk�hashZgotrrr�check_against_chunks s
zHashes.check_against_chunkscCst|j|��dS)N)rr)rrrrrr7sz
Hashes._raisecCs|jt|��S)zaCheck good hashes against a file-like object

        Raise HashMismatch if none match.

        )rr)r�filerrr�check_against_file:szHashes.check_against_filec	Cs t|d��}|j|�SQRXdS)N�rb)�openr)r�pathrrrr�check_against_pathBszHashes.check_against_pathcCs
t|j�S)z,Return whether I know any known-good hashes.)�boolr)rrrr�__nonzero__FszHashes.__nonzero__cCs|j�S)N)r#)rrrr�__bool__JszHashes.__bool__)N)�__name__�
__module__�__qualname__�__doc__rrrrr!r#r$rrrrr
s
r
cs(eZdZdZ�fdd�Zdd�Z�ZS)�
MissingHashesz�A workalike for Hashes used when we're missing a hash for a requirement

    It computes the actual hash of the requirement and raises a HashMissing
    exception showing it to the user.

    cstt|�jtgid�dS)z!Don't offer the ``hashes`` kwarg.)r
N)�superr)r�
FAVORITE_HASH)r)�	__class__rrrUszMissingHashes.__init__cCst|tj���dS)N)rr+r)rrrrrr[szMissingHashes._raise)r%r&r'r(rr�
__classcell__rr)r,rr)Nsr))Z
__future__rrZpip.exceptionsrrrZ	pip.utilsrZpip._vendor.sixrrr	r+Z
STRONG_HASHES�objectr
r)rrrr�<module>s
:site-packages/pip/utils/__pycache__/outdated.cpython-36.opt-1.pyc000064400000011147147511334620020634 0ustar003

���ee�@s�ddlmZddlZddlZddlZddlZddlZddlm	Z	ddl
mZddl
mZmZddlmZddlmZmZddlmZmZdd	lmZd
Zeje�ZGdd�de�ZGd
d�de�Z dd�Z!dd�Z"dd�Z#dS)�)�absolute_importN)�lockfile)�version)�
total_seconds�WINDOWS)�PyPI)�USER_CACHE_DIR�running_under_virtualenv)�
ensure_dir�get_installed_version)�check_path_ownerz%Y-%m-%dT%H:%M:%SZc@seZdZdd�Zdd�ZdS)�VirtualenvSelfCheckStatecCs\tjjtjd�|_y&t|j��}tj|�|_	WdQRXWnt
tfk
rVi|_	YnXdS)Nzpip-selfcheck.json)�os�path�join�sys�prefix�statefile_path�open�json�load�state�IOError�
ValueError)�self�	statefile�r�/usr/lib/python3.6/outdated.py�__init__sz!VirtualenvSelfCheckState.__init__c
Cs:t|jd��$}tj|jt�|d�|ddd�WdQRXdS)N�w)�
last_check�pypi_versionT�,�:)�	sort_keys�
separators)r"r#)rrr�dump�strftime�SELFCHECK_DATE_FMT)rr!�current_timerrrr�save$szVirtualenvSelfCheckState.saveN)�__name__�
__module__�__qualname__rr*rrrrr
s
r
c@seZdZdd�Zdd�ZdS)�GlobalSelfCheckStatecCsbtjjtd�|_y,t|j��}tj|�tj	|_
WdQRXWn ttt
fk
r\i|_
YnXdS)Nzselfcheck.json)rrrrrrrrrrrrr�KeyError)rrrrrr3s zGlobalSelfCheckState.__init__cCs�ttjj|j��sdSttjj|j��tj|j��ztjj|j�rft	|j��}t
j|�}WdQRXni}|jt
�|d�|tj<t	|jd��}t
j||ddd�WdQRXWdQRXdS)N)r r!rTr"r#)r$r%)r"r#)rrr�dirnamerr
rZLockFile�existsrrrr'r(rrr&)rr!r)rrrrrr*=s
zGlobalSelfCheckState.saveN)r+r,r-rr*rrrrr.2s
r.cCst�rt�St�SdS)N)r	r
r.rrrr�load_selfcheck_statefileXsr2cCsFddl}y"|jd�}|jd�o*d|jd�kS|jk
r@dSXdS)z�Checks whether pip was installed by pip

    This is used to avoid displaying the upgrade message when pip is in fact
    installed by a system package manager, such as dnf on Fedora.
    rN�pipZ	INSTALLERF)�
pkg_resourcesZget_distributionZhas_metadataZget_metadata_linesZDistributionNotFound)r4Zdistrrr�pip_installed_by_pip_s

r5c
CsFtd�}|dkrdStj|�}d}�y�t�}tjj�}d|jkrxd|jkrxtjj|jdt�}t	||�dkrx|jd}|dkr�|j
tjdd	id
�}|j
�dd�tt|j�d
�tjd�D�d}|j||�tj|�}||k�r|j|jk�rt��rt�rd}	nd}	tjd|||	�Wn$tk
�r@tjddd�YnXdS)z�Check for an update for pip.

    Limit the frequency of checks to once per week. State is stored either in
    the active virtualenv or in the user's USER_CACHE_DIR keyed off the prefix
    of the pip script path.
    r3Nr r!���<ZAcceptzapplication/json)ZheaderscSsg|]}tj|�js|�qSr)�packaging_version�parseZ
is_prerelease)�.0�vrrr�
<listcomp>�sz%pip_version_check.<locals>.<listcomp>Zreleases)�key�z
python -m pipz�You are using pip version %s, however version %s is available.
You should consider upgrading via the '%s install --upgrade pip' command.z5There was an error checking the latest version of pipT)�exc_info�i`'i�:	���)rr9r:r2�datetimeZutcnowrZstrptimer(r�getrZpip_json_urlZraise_for_status�sorted�listrr*Zbase_versionr5r�loggerZwarning�	Exception�debug)
ZsessionZinstalled_versionZpip_versionr!rr)r ZrespZremote_versionZpip_cmdrrr�pip_version_checknsL




rJ)$Z
__future__rrCrZloggingZos.pathrrZpip._vendorrZpip._vendor.packagingrr9Z
pip.compatrrZ
pip.modelsrZ
pip.locationsrr	Z	pip.utilsr
rZpip.utils.filesystemrr(Z	getLoggerr+rG�objectr
r.r2r5rJrrrr�<module>s&
&site-packages/pip/utils/__pycache__/outdated.cpython-36.pyc000064400000011147147511334620017675 0ustar003

���ee�@s�ddlmZddlZddlZddlZddlZddlZddlm	Z	ddl
mZddl
mZmZddlmZddlmZmZddlmZmZdd	lmZd
Zeje�ZGdd�de�ZGd
d�de�Z dd�Z!dd�Z"dd�Z#dS)�)�absolute_importN)�lockfile)�version)�
total_seconds�WINDOWS)�PyPI)�USER_CACHE_DIR�running_under_virtualenv)�
ensure_dir�get_installed_version)�check_path_ownerz%Y-%m-%dT%H:%M:%SZc@seZdZdd�Zdd�ZdS)�VirtualenvSelfCheckStatecCs\tjjtjd�|_y&t|j��}tj|�|_	WdQRXWnt
tfk
rVi|_	YnXdS)Nzpip-selfcheck.json)�os�path�join�sys�prefix�statefile_path�open�json�load�state�IOError�
ValueError)�self�	statefile�r�/usr/lib/python3.6/outdated.py�__init__sz!VirtualenvSelfCheckState.__init__c
Cs:t|jd��$}tj|jt�|d�|ddd�WdQRXdS)N�w)�
last_check�pypi_versionT�,�:)�	sort_keys�
separators)r"r#)rrr�dump�strftime�SELFCHECK_DATE_FMT)rr!�current_timerrrr�save$szVirtualenvSelfCheckState.saveN)�__name__�
__module__�__qualname__rr*rrrrr
s
r
c@seZdZdd�Zdd�ZdS)�GlobalSelfCheckStatecCsbtjjtd�|_y,t|j��}tj|�tj	|_
WdQRXWn ttt
fk
r\i|_
YnXdS)Nzselfcheck.json)rrrrrrrrrrrrr�KeyError)rrrrrr3s zGlobalSelfCheckState.__init__cCs�ttjj|j��sdSttjj|j��tj|j��ztjj|j�rft	|j��}t
j|�}WdQRXni}|jt
�|d�|tj<t	|jd��}t
j||ddd�WdQRXWdQRXdS)N)r r!rTr"r#)r$r%)r"r#)rrr�dirnamerr
rZLockFile�existsrrrr'r(rrr&)rr!r)rrrrrr*=s
zGlobalSelfCheckState.saveN)r+r,r-rr*rrrrr.2s
r.cCst�rt�St�SdS)N)r	r
r.rrrr�load_selfcheck_statefileXsr2cCsFddl}y"|jd�}|jd�o*d|jd�kS|jk
r@dSXdS)z�Checks whether pip was installed by pip

    This is used to avoid displaying the upgrade message when pip is in fact
    installed by a system package manager, such as dnf on Fedora.
    rN�pipZ	INSTALLERF)�
pkg_resourcesZget_distributionZhas_metadataZget_metadata_linesZDistributionNotFound)r4Zdistrrr�pip_installed_by_pip_s

r5c
CsFtd�}|dkrdStj|�}d}�y�t�}tjj�}d|jkrxd|jkrxtjj|jdt�}t	||�dkrx|jd}|dkr�|j
tjdd	id
�}|j
�dd�tt|j�d
�tjd�D�d}|j||�tj|�}||k�r|j|jk�rt��rt�rd}	nd}	tjd|||	�Wn$tk
�r@tjddd�YnXdS)z�Check for an update for pip.

    Limit the frequency of checks to once per week. State is stored either in
    the active virtualenv or in the user's USER_CACHE_DIR keyed off the prefix
    of the pip script path.
    r3Nr r!���<ZAcceptzapplication/json)ZheaderscSsg|]}tj|�js|�qSr)�packaging_version�parseZ
is_prerelease)�.0�vrrr�
<listcomp>�sz%pip_version_check.<locals>.<listcomp>Zreleases)�key�z
python -m pipz�You are using pip version %s, however version %s is available.
You should consider upgrading via the '%s install --upgrade pip' command.z5There was an error checking the latest version of pipT)�exc_info�i`'i�:	���)rr9r:r2�datetimeZutcnowrZstrptimer(r�getrZpip_json_urlZraise_for_status�sorted�listrr*Zbase_versionr5r�loggerZwarning�	Exception�debug)
ZsessionZinstalled_versionZpip_versionr!rr)r ZrespZremote_versionZpip_cmdrrr�pip_version_checknsL




rJ)$Z
__future__rrCrZloggingZos.pathrrZpip._vendorrZpip._vendor.packagingrr9Z
pip.compatrrZ
pip.modelsrZ
pip.locationsrr	Z	pip.utilsr
rZpip.utils.filesystemrr(Z	getLoggerr+rG�objectr
r.r2r5rJrrrr�<module>s&
&site-packages/pip/utils/__pycache__/setuptools_build.cpython-36.opt-1.pyc000064400000000464147511334620022423 0ustar003

���e�@sdZdS)z�import setuptools, tokenize;__file__=%r;f=getattr(tokenize, 'open', open)(__file__);code=f.read().replace('\r\n', '\n');f.close();exec(compile(code, __file__, 'exec'))N)ZSETUPTOOLS_SHIM�rr�&/usr/lib/python3.6/setuptools_build.py�<module>ssite-packages/pip/utils/__pycache__/filesystem.cpython-36.opt-1.pyc000064400000001064147511334620021204 0ustar003

���e��@s(ddlZddlZddlmZdd�ZdS)�N)�get_path_uidcCs�ttd�sdSd}xp||kr�tjj|�rntj�dkr^yt|�}Wntk
rTdSX|dkStj|tj�Sq|tjj	|�}}qWdS)N�geteuidTrF)
�hasattr�os�path�lexistsrr�OSError�access�W_OK�dirname)rZpreviousZpath_uid�r� /usr/lib/python3.6/filesystem.py�check_path_owners

r)rZos.pathZ
pip.compatrrrrrr
�<module>ssite-packages/pip/utils/__pycache__/ui.cpython-36.pyc000064400000022534147511334620016503 0ustar003

���eM-�@s�ddlmZddlmZddlZddlZddlmZmZmZddlZddl	Z	ddl
Z
ddlmZddl
mZddlmZddlmZdd	lmZmZdd
lmZmZmZddlmZyddlmZWnek
r�dZYnXe
je�Z d
d�Z!e!ee�Z"Gdd�de#�Z$Gdd�de#�Z%Gdd�de#�Z&Gdd�de&e$e%e"�Z'Gdd�de&e$e%ee�Z(e	j)dd��Z*Gdd�de#�Z+Gdd�de#�Z,Gdd �d e#�Z-e	j)d!d"��Z.dS)#�)�absolute_import)�divisionN)�signal�SIGINT�default_int_handler)�WINDOWS)�format_size)�get_indentation)�six)�Bar�IncrementalBar)�WritelnMixin�HIDE_CURSOR�SHOW_CURSOR)�Spinner)�coloramacCs�t|jdd�}|s|St|dtj��t|dtj��g}|tt|dg��7}ytj�j|�j|�Wntk
rv|SX|SdS)N�encodingZ
empty_fillZfill�phases)�getattr�filer
Z	text_type�list�join�encode�UnicodeEncodeError)Z	preferredZfallbackrZ
characters�r�/usr/lib/python3.6/ui.py�_select_progress_classsrcs4eZdZdZ�fdd�Z�fdd�Zdd�Z�ZS)�InterruptibleMixina�
    Helper to ensure that self.finish() gets called on keyboard interrupt.

    This allows downloads to be interrupted without leaving temporary state
    (like hidden cursors) behind.

    This class is similar to the progress library's existing SigIntMixin
    helper, but as of version 1.2, that helper has the following problems:

    1. It calls sys.exit().
    2. It discards the existing SIGINT handler completely.
    3. It leaves its own handler in place even after an uninterrupted finish,
       which will have unexpected delayed effects if the user triggers an
       unrelated keyboard interrupt some time after a progress-displaying
       download has already completed, for example.
    cs4tt|�j||�tt|j�|_|jdkr0t|_dS)z=
        Save the original SIGINT handler for later.
        N)�superr�__init__rr�
handle_sigint�original_handlerr)�self�args�kwargs)�	__class__rrrNs
zInterruptibleMixin.__init__cstt|�j�tt|j�dS)z�
        Restore the original SIGINT handler after finishing.

        This should happen regardless of whether the progress display finishes
        normally, or gets interrupted.
        N)rr�finishrrr!)r")r%rrr&^szInterruptibleMixin.finishcCs|j�|j||�dS)z�
        Call self.finish() before delegating to the original SIGINT handler.

        This handler should only be in place while the progress display is
        active.
        N)r&r!)r"Zsignum�framerrrr hsz InterruptibleMixin.handle_sigint)�__name__�
__module__�__qualname__�__doc__rr&r �
__classcell__rr)r%rr<s
rcsJeZdZ�fdd�Zedd��Zedd��Zedd��Zdd
d�Z�Z	S)
�DownloadProgressMixincs,tt|�j||�dt�d|j|_dS)N� �)rr-rr	�message)r"r#r$)r%rrruszDownloadProgressMixin.__init__cCs
t|j�S)N)r�index)r"rrr�
downloadedysz DownloadProgressMixin.downloadedcCs |jdkrdStd|j�dS)Ngz...�z/s)Zavgr)r"rrr�download_speed}s
z$DownloadProgressMixin.download_speedcCs|jrd|jSdS)Nzeta %s�)ZetaZeta_td)r"rrr�
pretty_eta�s
z DownloadProgressMixin.pretty_etar3ccs*x|D]}|V|j|�qW|j�dS)N)�nextr&)r"�it�n�xrrr�iter�s
zDownloadProgressMixin.iter)r3)
r(r)r*r�propertyr2r4r6r;r,rr)r%rr-ss
r-cseZdZ�fdd�Z�ZS)�WindowsMixincs\tr�jrd�_tt��j||�trXtrXtj�j��_�fdd��j_�fdd��j_	dS)NFcs�jjj�S)N)r�wrapped�isattyr)r"rr�<lambda>�sz'WindowsMixin.__init__.<locals>.<lambda>cs�jjj�S)N)rr>�flushr)r"rrr@�s)
rZhide_cursorrr=rrZAnsiToWin32rr?rA)r"r#r$)r%)r"rr�s
zWindowsMixin.__init__)r(r)r*rr,rr)r%rr=�sr=c@seZdZejZdZdZdS)�DownloadProgressBarz
%(percent)d%%z0%(downloaded)s %(download_speed)s %(pretty_eta)sN)r(r)r*�sys�stdoutrr0�suffixrrrrrB�srBc@s&eZdZejZdZdd�Zdd�ZdS)�DownloadProgressSpinnerz!%(downloaded)s %(download_speed)scCs"t|d�stj|j�|_t|j�S)N�_phaser)�hasattr�	itertools�cyclerrGr7)r"rrr�
next_phase�s
z"DownloadProgressSpinner.next_phasecCsN|j|}|j�}|j|}dj||r*dnd||r6dnd|g�}|j|�dS)Nr5r.)r0rKrErZwriteln)r"r0ZphaserE�linerrr�update�s



zDownloadProgressSpinner.updateN)	r(r)r*rCrDrrErKrMrrrrrF�srFccsRtrdVnB|j�s$tj�tjkr,dVn"|jt�z
dVWd|jt�XdS)N)	rr?�logger�getEffectiveLevel�logging�INFO�writerr)rrrr�
hidden_cursor�s

rSc@s$eZdZdd�Zdd�Zdd�ZdS)�RateLimitercCs||_d|_dS)Nr)�_min_update_interval_seconds�_last_update)r"�min_update_interval_secondsrrrr�szRateLimiter.__init__cCstj�}||j}||jkS)N)�timerVrU)r"ZnowZdeltarrr�ready�s
zRateLimiter.readycCstj�|_dS)N)rXrV)r"rrr�reset�szRateLimiter.resetN)r(r)r*rrYrZrrrrrT�srTc@s.eZdZddd�Zdd�Zdd	�Zd
d�ZdS)
�InteractiveSpinnerN�-\|/��?cCs\||_|dkrtj}||_t|�|_d|_tj|�|_	|jj
dt�|jd�d|_dS)NFr.z ... r)
�_messagerCrD�_filerT�
_rate_limiter�	_finishedrIrJ�_spin_cyclerRr	�_width)r"r0rZ
spin_charsrWrrrr�s
zInteractiveSpinner.__init__cCs^|jst�d|j}|jj|d|j|�|jj|�t|�|_|jj�|jj�dS)N�r.)	ra�AssertionErrorrcr_rR�lenrAr`rZ)r"�statusZbackuprrr�_write	s


zInteractiveSpinner._writecCs,|jr
dS|jj�sdS|jt|j��dS)N)rar`rYrhr7rb)r"rrr�spins

zInteractiveSpinner.spincCs4|jr
dS|j|�|jjd�|jj�d|_dS)N�
T)rarhr_rRrA)r"�final_statusrrrr&s

zInteractiveSpinner.finish)Nr\r])r(r)r*rrhrir&rrrrr[�s


r[c@s.eZdZddd�Zdd�Zdd�Zdd	�Zd
S)�NonInteractiveSpinner�<cCs$||_d|_t|�|_|jd�dS)NFZstarted)r^rarTr`�_update)r"r0rWrrrr*s
zNonInteractiveSpinner.__init__cCs*|jst�|jj�tjd|j|�dS)Nz%s: %s)rarer`rZrN�infor^)r"rgrrrrn0s
zNonInteractiveSpinner._updatecCs&|jr
dS|jj�sdS|jd�dS)Nzstill running...)rar`rYrn)r"rrrri5s

zNonInteractiveSpinner.spincCs$|jr
dS|jd|f�d|_dS)Nzfinished with status '%s'T)rarn)r"rkrrrr&<szNonInteractiveSpinner.finishN)rm)r(r)r*rrnrir&rrrrrl)s
rlccs�tjj�r"tj�tjkr"t|�}nt|�}y t	tj��|VWdQRXWn>t
k
rj|jd��Yn*tk
r�|jd��YnX|jd�dS)NZcanceled�error�done)
rCrDr?rNrOrPrQr[rlrS�KeyboardInterruptr&�	Exception)r0Zspinnerrrr�open_spinnerCs


rt)/Z
__future__rrrIrCrrrrX�
contextlibrPZ
pip.compatrZ	pip.utilsrZpip.utils.loggingr	Zpip._vendorr
Zpip._vendor.progress.barrrZpip._vendor.progress.helpersr
rrZpip._vendor.progress.spinnerrrrsZ	getLoggerr(rNrZ_BaseBar�objectrr-r=rBrF�contextmanagerrSrTr[rlrtrrrr�<module>sB


7
!0site-packages/pip/utils/__pycache__/encoding.cpython-36.pyc000064400000001747147511334620017657 0ustar003

���e��@sjddlZddlZddlZejdfejdfejdfejdfejdfejdfej	dfgZ
ejd	�Zd
d�Z
dS)�N�utf8�utf16zutf16-bezutf16-le�utf32zutf32-bezutf32-lescoding[:=]\s*([-\w.]+)cCs�x0tD](\}}|j|�r|t|�d�j|�SqWxV|jd�dd�D]@}|dd�dkrFtj|�rFtj|�j�djd�}|j|�SqFW|jtj	d��S)	z�Check a bytes string for a BOM to correctly detect the encoding

    Fallback to locale.getpreferredencoding(False) like open() on Python3N�
�r��#�asciiF)
�BOMS�
startswith�len�decode�split�ENCODING_RE�search�groups�locale�getpreferredencoding)�dataZbom�encoding�line�r�/usr/lib/python3.6/encoding.py�auto_decodes
r)�codecsr�re�BOM_UTF8�	BOM_UTF16�BOM_UTF16_BE�BOM_UTF16_LE�	BOM_UTF32�BOM_UTF32_BE�BOM_UTF32_LEr
�compilerrrrrr�<module>s
site-packages/pip/utils/build.py000064400000002440147511334620012673 0ustar00from __future__ import absolute_import

import os.path
import tempfile

from pip.utils import rmtree


class BuildDirectory(object):

    def __init__(self, name=None, delete=None):
        # If we were not given an explicit directory, and we were not given an
        # explicit delete option, then we'll default to deleting.
        if name is None and delete is None:
            delete = True

        if name is None:
            # We realpath here because some systems have their default tmpdir
            # symlinked to another directory.  This tends to confuse build
            # scripts, so we canonicalize the path by traversing potential
            # symlinks here.
            name = os.path.realpath(tempfile.mkdtemp(prefix="pip-build-"))
            # If we were not given an explicit directory, and we were not given
            # an explicit delete option, then we'll default to deleting.
            if delete is None:
                delete = True

        self.name = name
        self.delete = delete

    def __repr__(self):
        return "<{} {!r}>".format(self.__class__.__name__, self.name)

    def __enter__(self):
        return self.name

    def __exit__(self, exc, value, tb):
        self.cleanup()

    def cleanup(self):
        if self.delete:
            rmtree(self.name)
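A short usage sketch of the context manager above (the temporary path depends on tempfile.mkdtemp):

# Create a temporary build directory that is removed again on exit.
with BuildDirectory() as build_dir:
    print("building in", build_dir)

# A caller-supplied directory is kept unless delete=True is passed explicitly:
# BuildDirectory(name="/tmp/my-build", delete=False)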
site-packages/pip/utils/appdirs.py000064400000021153147511334620013240 0ustar00"""
This code was taken from https://github.com/ActiveState/appdirs and modified
to suit our purposes.
"""
from __future__ import absolute_import

import os
import sys

from pip.compat import WINDOWS, expanduser
from pip._vendor.six import PY2, text_type


def user_cache_dir(appname):
    r"""
    Return full path to the user-specific cache dir for this application.

        "appname" is the name of application.

    Typical user cache directories are:
        macOS:      ~/Library/Caches/<AppName>
        Unix:       ~/.cache/<AppName> (XDG default)
        Windows:    C:\Users\<username>\AppData\Local\<AppName>\Cache

    On Windows the only suggestion in the MSDN docs is that local settings go
    in the `CSIDL_LOCAL_APPDATA` directory. This is identical to the
    non-roaming app data dir (the default returned by `user_data_dir`). Apps
    typically put cache data somewhere *under* the given dir here. Some
    examples:
        ...\Mozilla\Firefox\Profiles\<ProfileName>\Cache
        ...\Acme\SuperApp\Cache\1.0

    OPINION: This function appends "Cache" to the `CSIDL_LOCAL_APPDATA` value.
    """
    if WINDOWS:
        # Get the base path
        path = os.path.normpath(_get_win_folder("CSIDL_LOCAL_APPDATA"))

        # When using Python 2, return paths as bytes on Windows like we do on
        # other operating systems. See helper function docs for more details.
        if PY2 and isinstance(path, text_type):
            path = _win_path_to_bytes(path)

        # Add our app name and Cache directory to it
        path = os.path.join(path, appname, "Cache")
    elif sys.platform == "darwin":
        # Get the base path
        path = expanduser("~/Library/Caches")

        # Add our app name to it
        path = os.path.join(path, appname)
    else:
        # Get the base path
        path = os.getenv("XDG_CACHE_HOME", expanduser("~/.cache"))

        # Add our app name to it
        path = os.path.join(path, appname)

    return path
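
# Illustrative sketch (not part of the original pip module): the result is
# platform dependent, e.g. "~/.cache/pip" on Linux (or "$XDG_CACHE_HOME/pip"
# when that variable is set), "~/Library/Caches/pip" on macOS, and
# "...\AppData\Local\pip\Cache" on Windows.
def _demo_user_cache_dir():
    return user_cache_dir("pip")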


def user_data_dir(appname, roaming=False):
    """
    Return full path to the user-specific data dir for this application.

        "appname" is the name of the application.
            If None, just the system directory is returned.
        "roaming" (boolean, default False) can be set True to use the Windows
            roaming appdata directory. That means that for users on a Windows
            network setup for roaming profiles, this user data will be
            sync'd on login. See
            <http://technet.microsoft.com/en-us/library/cc766489(WS.10).aspx>
            for a discussion of issues.

    Typical user data directories are:
        macOS:                  ~/Library/Application Support/<AppName>
        Unix:                   ~/.local/share/<AppName>    # or in
                                $XDG_DATA_HOME, if defined
        Win XP (not roaming):   C:\Documents and Settings\<username>\ ...
                                ...Application Data\<AppName>
        Win XP (roaming):       C:\Documents and Settings\<username>\Local ...
                                ...Settings\Application Data\<AppName>
        Win 7  (not roaming):   C:\\Users\<username>\AppData\Local\<AppName>
        Win 7  (roaming):       C:\\Users\<username>\AppData\Roaming\<AppName>

    For Unix, we follow the XDG spec and support $XDG_DATA_HOME.
    That means, by default "~/.local/share/<AppName>".
    """
    if WINDOWS:
        const = roaming and "CSIDL_APPDATA" or "CSIDL_LOCAL_APPDATA"
        path = os.path.join(os.path.normpath(_get_win_folder(const)), appname)
    elif sys.platform == "darwin":
        path = os.path.join(
            expanduser('~/Library/Application Support/'),
            appname,
        )
    else:
        path = os.path.join(
            os.getenv('XDG_DATA_HOME', expanduser("~/.local/share")),
            appname,
        )

    return path


def user_config_dir(appname, roaming=True):
    """Return full path to the user-specific config dir for this application.

        "appname" is the name of the application.
            If None, just the system directory is returned.
        "roaming" (boolean, default True) can be set False to not use the
            Windows roaming appdata directory. That means that for users on a
            Windows network setup for roaming profiles, this user data will be
            sync'd on login. See
            <http://technet.microsoft.com/en-us/library/cc766489(WS.10).aspx>
            for a discussion of issues.

    Typical user config directories are:
        macOS:                  same as user_data_dir
        Unix:                   ~/.config/<AppName>
        Win *:                  same as user_data_dir

    For Unix, we follow the XDG spec and support $XDG_CONFIG_HOME.
    That means, by default "~/.config/<AppName>".
    """
    if WINDOWS:
        path = user_data_dir(appname, roaming=roaming)
    elif sys.platform == "darwin":
        path = user_data_dir(appname)
    else:
        path = os.getenv('XDG_CONFIG_HOME', expanduser("~/.config"))
        path = os.path.join(path, appname)

    return path


# for the discussion regarding site_config_dirs locations
# see <https://github.com/pypa/pip/issues/1733>
def site_config_dirs(appname):
    """Return a list of potential user-shared config dirs for this application.

        "appname" is the name of the application.

    Typical site config directories are:
        macOS:      /Library/Application Support/<AppName>/
        Unix:       /etc or $XDG_CONFIG_DIRS[i]/<AppName>/ for each value in
                    $XDG_CONFIG_DIRS
        Win XP:     C:\Documents and Settings\All Users\Application ...
                    ...Data\<AppName>\
        Vista:      (Fail! "C:\ProgramData" is a hidden *system* directory
                    on Vista.)
        Win 7:      Hidden, but writeable on Win 7:
                    C:\ProgramData\<AppName>\
    """
    if WINDOWS:
        path = os.path.normpath(_get_win_folder("CSIDL_COMMON_APPDATA"))
        pathlist = [os.path.join(path, appname)]
    elif sys.platform == 'darwin':
        pathlist = [os.path.join('/Library/Application Support', appname)]
    else:
        # try looking in $XDG_CONFIG_DIRS
        xdg_config_dirs = os.getenv('XDG_CONFIG_DIRS', '/etc/xdg')
        if xdg_config_dirs:
            pathlist = [
                os.path.join(expanduser(x), appname)
                for x in xdg_config_dirs.split(os.pathsep)
            ]
        else:
            pathlist = []

        # always look in /etc directly as well
        pathlist.append('/etc')

    return pathlist
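
# Illustrative sketch (not part of the original pip module). The helper below is
# hypothetical: it combines the site-wide and per-user config directories into
# one search path, site dirs first so that per-user settings can override them.
def _demo_config_search_path(appname):
    candidates = site_config_dirs(appname) + [user_config_dir(appname)]
    return [os.path.join(d, appname + ".conf") for d in candidates]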


# -- Windows support functions --

def _get_win_folder_from_registry(csidl_name):
    """
    This is a fallback technique at best. I'm not sure if using the
    registry for this guarantees us the correct answer for all CSIDL_*
    names.
    """
    import _winreg

    shell_folder_name = {
        "CSIDL_APPDATA": "AppData",
        "CSIDL_COMMON_APPDATA": "Common AppData",
        "CSIDL_LOCAL_APPDATA": "Local AppData",
    }[csidl_name]

    key = _winreg.OpenKey(
        _winreg.HKEY_CURRENT_USER,
        r"Software\Microsoft\Windows\CurrentVersion\Explorer\Shell Folders"
    )
    directory, _type = _winreg.QueryValueEx(key, shell_folder_name)
    return directory


def _get_win_folder_with_ctypes(csidl_name):
    csidl_const = {
        "CSIDL_APPDATA": 26,
        "CSIDL_COMMON_APPDATA": 35,
        "CSIDL_LOCAL_APPDATA": 28,
    }[csidl_name]

    buf = ctypes.create_unicode_buffer(1024)
    ctypes.windll.shell32.SHGetFolderPathW(None, csidl_const, None, 0, buf)

    # Downgrade to short path name if have highbit chars. See
    # <http://bugs.activestate.com/show_bug.cgi?id=85099>.
    has_high_char = False
    for c in buf:
        if ord(c) > 255:
            has_high_char = True
            break
    if has_high_char:
        buf2 = ctypes.create_unicode_buffer(1024)
        if ctypes.windll.kernel32.GetShortPathNameW(buf.value, buf2, 1024):
            buf = buf2

    return buf.value

if WINDOWS:
    try:
        import ctypes
        _get_win_folder = _get_win_folder_with_ctypes
    except ImportError:
        _get_win_folder = _get_win_folder_from_registry


def _win_path_to_bytes(path):
    """Encode Windows paths to bytes. Only used on Python 2.

    Motivation is to be consistent with other operating systems where paths
    are also returned as bytes. This avoids problems mixing bytes and Unicode
    elsewhere in the codebase. For more details and discussion see
    <https://github.com/pypa/pip/issues/3463>.

    If encoding using ASCII and MBCS fails, return the original Unicode path.
    """
    for encoding in ('ASCII', 'MBCS'):
        try:
            return path.encode(encoding)
        except (UnicodeEncodeError, LookupError):
            pass
    return path
site-packages/pip/utils/filesystem.py000064400000001603147511334620013760 0ustar00import os
import os.path

from pip.compat import get_path_uid


def check_path_owner(path):
    # If we don't have a way to check the effective uid of this process, then
    # we'll just assume that we own the directory.
    if not hasattr(os, "geteuid"):
        return True

    previous = None
    while path != previous:
        if os.path.lexists(path):
            # Check if path is writable by current user.
            if os.geteuid() == 0:
                # Special handling for root user in order to handle properly
                # cases where users use sudo without -H flag.
                try:
                    path_uid = get_path_uid(path)
                except OSError:
                    return False
                return path_uid == 0
            else:
                return os.access(path, os.W_OK)
        else:
            previous, path = path, os.path.dirname(path)
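

# Illustrative sketch (not part of the original pip module). check_path_owner()
# is typically used as a guard before writing into a shared directory, e.g. to
# avoid leaving root-owned files behind when running under "sudo" without -H.
def _demo_guarded_write(path, data):
    if not check_path_owner(os.path.dirname(path)):
        return False
    with open(path, "wb") as fp:
        fp.write(data)
    return True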
site-packages/pip/utils/outdated.py000064400000013545147511334620013415 0ustar00from __future__ import absolute_import

import datetime
import json
import logging
import os.path
import sys

from pip._vendor import lockfile
from pip._vendor.packaging import version as packaging_version

from pip.compat import total_seconds, WINDOWS
from pip.models import PyPI
from pip.locations import USER_CACHE_DIR, running_under_virtualenv
from pip.utils import ensure_dir, get_installed_version
from pip.utils.filesystem import check_path_owner


SELFCHECK_DATE_FMT = "%Y-%m-%dT%H:%M:%SZ"


logger = logging.getLogger(__name__)


class VirtualenvSelfCheckState(object):
    def __init__(self):
        self.statefile_path = os.path.join(sys.prefix, "pip-selfcheck.json")

        # Load the existing state
        try:
            with open(self.statefile_path) as statefile:
                self.state = json.load(statefile)
        except (IOError, ValueError):
            self.state = {}

    def save(self, pypi_version, current_time):
        # Attempt to write out our version check file
        with open(self.statefile_path, "w") as statefile:
            json.dump(
                {
                    "last_check": current_time.strftime(SELFCHECK_DATE_FMT),
                    "pypi_version": pypi_version,
                },
                statefile,
                sort_keys=True,
                separators=(",", ":")
            )


class GlobalSelfCheckState(object):
    def __init__(self):
        self.statefile_path = os.path.join(USER_CACHE_DIR, "selfcheck.json")

        # Load the existing state
        try:
            with open(self.statefile_path) as statefile:
                self.state = json.load(statefile)[sys.prefix]
        except (IOError, ValueError, KeyError):
            self.state = {}

    def save(self, pypi_version, current_time):
        # Check to make sure that we own the directory
        if not check_path_owner(os.path.dirname(self.statefile_path)):
            return

        # Now that we've ensured the directory is owned by this user, we'll go
        # ahead and make sure that all our directories are created.
        ensure_dir(os.path.dirname(self.statefile_path))

        # Attempt to write out our version check file
        with lockfile.LockFile(self.statefile_path):
            if os.path.exists(self.statefile_path):
                with open(self.statefile_path) as statefile:
                    state = json.load(statefile)
            else:
                state = {}

            state[sys.prefix] = {
                "last_check": current_time.strftime(SELFCHECK_DATE_FMT),
                "pypi_version": pypi_version,
            }

            with open(self.statefile_path, "w") as statefile:
                json.dump(state, statefile, sort_keys=True,
                          separators=(",", ":"))


def load_selfcheck_statefile():
    if running_under_virtualenv():
        return VirtualenvSelfCheckState()
    else:
        return GlobalSelfCheckState()


def pip_installed_by_pip():
    """Checks whether pip was installed by pip.

    This is used to avoid displaying the upgrade message when pip was in
    fact installed by a system package manager, such as dnf on Fedora.
    """
    import pkg_resources
    try:
        dist = pkg_resources.get_distribution('pip')
        return (dist.has_metadata('INSTALLER') and
                'pip' in dist.get_metadata_lines('INSTALLER'))
    except pkg_resources.DistributionNotFound:
        return False


def pip_version_check(session):
    """Check for an update for pip.

    Limit the frequency of checks to once per week. State is stored either in
    the active virtualenv or in the user's USER_CACHE_DIR keyed off the prefix
    of the pip script path.
    """
    installed_version = get_installed_version("pip")
    if installed_version is None:
        return

    pip_version = packaging_version.parse(installed_version)
    pypi_version = None

    try:
        state = load_selfcheck_statefile()

        current_time = datetime.datetime.utcnow()
        # Determine if we need to refresh the state
        if "last_check" in state.state and "pypi_version" in state.state:
            last_check = datetime.datetime.strptime(
                state.state["last_check"],
                SELFCHECK_DATE_FMT
            )
            if total_seconds(current_time - last_check) < 7 * 24 * 60 * 60:
                pypi_version = state.state["pypi_version"]

        # Refresh the version if we need to or just see if we need to warn
        if pypi_version is None:
            resp = session.get(
                PyPI.pip_json_url,
                headers={"Accept": "application/json"},
            )
            resp.raise_for_status()
            pypi_version = [
                v for v in sorted(
                    list(resp.json()["releases"]),
                    key=packaging_version.parse,
                )
                if not packaging_version.parse(v).is_prerelease
            ][-1]

            # save that we've performed a check
            state.save(pypi_version, current_time)

        remote_version = packaging_version.parse(pypi_version)

        # Determine if our pypi_version is older
        if (pip_version < remote_version and
                pip_version.base_version != remote_version.base_version and
                pip_installed_by_pip()):
            # Advise "python -m pip" on Windows to avoid issues
            # with overwriting pip.exe.
            if WINDOWS:
                pip_cmd = "python -m pip"
            else:
                pip_cmd = "pip"
            logger.warning(
                "You are using pip version %s, however version %s is "
                "available.\nYou should consider upgrading via the "
                "'%s install --upgrade pip' command.",
                pip_version, pypi_version, pip_cmd
            )

    except Exception:
        logger.debug(
            "There was an error checking the latest version of pip",
            exc_info=True,
        )
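

# Illustrative sketch (not part of the original pip module). The selfcheck state
# is a small JSON mapping such as
#   {"last_check": "2017-01-01T00:00:00Z", "pypi_version": "9.0.1"}
# and this is the freshness test pip_version_check() applies before asking PyPI.
def _demo_state_is_fresh(state):
    if "last_check" not in state or "pypi_version" not in state:
        return False
    last_check = datetime.datetime.strptime(state["last_check"],
                                            SELFCHECK_DATE_FMT)
    age = total_seconds(datetime.datetime.utcnow() - last_check)
    return age < 7 * 24 * 60 * 60  # one week, matching pip_version_check()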
site-packages/pip/utils/logging.py000064400000006377147511334620013237 0ustar00from __future__ import absolute_import

import contextlib
import logging
import logging.handlers
import os

try:
    import threading
except ImportError:
    import dummy_threading as threading

from pip.compat import WINDOWS
from pip.utils import ensure_dir

try:
    from pip._vendor import colorama
# Lots of different errors can come from this, including SystemError and
# ImportError.
except Exception:
    colorama = None


_log_state = threading.local()
_log_state.indentation = 0


@contextlib.contextmanager
def indent_log(num=2):
    """
    A context manager which will cause the log output to be indented for any
    log messages emitted inside it.
    """
    _log_state.indentation += num
    try:
        yield
    finally:
        _log_state.indentation -= num


def get_indentation():
    return getattr(_log_state, 'indentation', 0)


class IndentingFormatter(logging.Formatter):

    def format(self, record):
        """
        Calls the standard formatter, but will indent all of the log messages
        by our current indentation level.
        """
        formatted = logging.Formatter.format(self, record)
        formatted = "".join([
            (" " * get_indentation()) + line
            for line in formatted.splitlines(True)
        ])
        return formatted
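

# Illustrative sketch (not part of the original pip module): a handler using
# IndentingFormatter shifts every message emitted inside an indent_log() block
# to the right by the current indentation level. The logger name is hypothetical.
def _demo_indented_logging():
    handler = logging.StreamHandler()
    handler.setFormatter(IndentingFormatter("%(message)s"))
    demo_logger = logging.getLogger("pip.demo")
    demo_logger.addHandler(handler)
    demo_logger.warning("top level")        # printed flush left
    with indent_log():
        demo_logger.warning("nested step")  # printed indented by two spaces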


def _color_wrap(*colors):
    def wrapped(inp):
        return "".join(list(colors) + [inp, colorama.Style.RESET_ALL])
    return wrapped


class ColorizedStreamHandler(logging.StreamHandler):

    # Don't build up a list of colors if we don't have colorama
    if colorama:
        COLORS = [
            # This needs to be in order from highest logging level to lowest.
            (logging.ERROR, _color_wrap(colorama.Fore.RED)),
            (logging.WARNING, _color_wrap(colorama.Fore.YELLOW)),
        ]
    else:
        COLORS = []

    def __init__(self, stream=None):
        logging.StreamHandler.__init__(self, stream)

        if WINDOWS and colorama:
            self.stream = colorama.AnsiToWin32(self.stream)

    def should_color(self):
        # Don't colorize things if we do not have colorama
        if not colorama:
            return False

        real_stream = (
            self.stream if not isinstance(self.stream, colorama.AnsiToWin32)
            else self.stream.wrapped
        )

        # If the stream is a tty we should color it
        if hasattr(real_stream, "isatty") and real_stream.isatty():
            return True

        # If we have an ANSI term we should color it
        if os.environ.get("TERM") == "ANSI":
            return True

        # If anything else we should not color it
        return False

    def format(self, record):
        msg = logging.StreamHandler.format(self, record)

        if self.should_color():
            for level, color in self.COLORS:
                if record.levelno >= level:
                    msg = color(msg)
                    break

        return msg


class BetterRotatingFileHandler(logging.handlers.RotatingFileHandler):

    def _open(self):
        ensure_dir(os.path.dirname(self.baseFilename))
        return logging.handlers.RotatingFileHandler._open(self)


class MaxLevelFilter(logging.Filter):

    def __init__(self, level):
        self.level = level

    def filter(self, record):
        return record.levelno < self.level
site-packages/pip/utils/hashes.py000064400000005462147511334620013056 0ustar00from __future__ import absolute_import

import hashlib

from pip.exceptions import HashMismatch, HashMissing, InstallationError
from pip.utils import read_chunks
from pip._vendor.six import iteritems, iterkeys, itervalues


# The recommended hash algo of the moment. Change this whenever the state of
# the art changes; it won't hurt backward compatibility.
FAVORITE_HASH = 'sha256'


# Names of hashlib algorithms allowed by the --hash option and ``pip hash``
# Currently, those are the ones at least as collision-resistant as sha256.
STRONG_HASHES = ['sha256', 'sha384', 'sha512']


class Hashes(object):
    """A wrapper that builds multiple hashes at once and checks them against
    known-good values

    """
    def __init__(self, hashes=None):
        """
        :param hashes: A dict of algorithm names pointing to lists of allowed
            hex digests
        """
        self._allowed = {} if hashes is None else hashes

    def check_against_chunks(self, chunks):
        """Check good hashes against ones built from iterable of chunks of
        data.

        Raise HashMismatch if none match.

        """
        gots = {}
        for hash_name in iterkeys(self._allowed):
            try:
                gots[hash_name] = hashlib.new(hash_name)
            except (ValueError, TypeError):
                raise InstallationError('Unknown hash name: %s' % hash_name)

        for chunk in chunks:
            for hash in itervalues(gots):
                hash.update(chunk)

        for hash_name, got in iteritems(gots):
            if got.hexdigest() in self._allowed[hash_name]:
                return
        self._raise(gots)

    def _raise(self, gots):
        raise HashMismatch(self._allowed, gots)

    def check_against_file(self, file):
        """Check good hashes against a file-like object

        Raise HashMismatch if none match.

        """
        return self.check_against_chunks(read_chunks(file))

    def check_against_path(self, path):
        with open(path, 'rb') as file:
            return self.check_against_file(file)

    def __nonzero__(self):
        """Return whether I know any known-good hashes."""
        return bool(self._allowed)

    def __bool__(self):
        return self.__nonzero__()
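

# Illustrative sketch (not part of the original pip module). A Hashes object is
# built from {algorithm: [allowed hex digests]}; check_against_chunks() returns
# silently on a match and raises HashMismatch otherwise. The payload is hypothetical.
def _demo_hashes_check():
    payload = b"example payload"
    good = hashlib.new(FAVORITE_HASH, payload).hexdigest()
    hashes = Hashes({FAVORITE_HASH: [good]})
    hashes.check_against_chunks(iter([payload]))  # passes
    try:
        hashes.check_against_chunks(iter([b"tampered payload"]))
    except HashMismatch:
        pass  # a mismatch is reported via this exception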


class MissingHashes(Hashes):
    """A workalike for Hashes used when we're missing a hash for a requirement

    It computes the actual hash of the requirement and raises a HashMissing
    exception showing it to the user.

    """
    def __init__(self):
        """Don't offer the ``hashes`` kwarg."""
        # Pass our favorite hash in to generate a "gotten hash". With the
        # empty list, it will never match, so an error will always raise.
        super(MissingHashes, self).__init__(hashes={FAVORITE_HASH: []})

    def _raise(self, gots):
        raise HashMissing(gots[FAVORITE_HASH].hexdigest())
site-packages/pip/utils/encoding.py000064400000001713147511334620013364 0ustar00import codecs
import locale
import re


BOMS = [
    (codecs.BOM_UTF8, 'utf8'),
    (codecs.BOM_UTF16, 'utf16'),
    (codecs.BOM_UTF16_BE, 'utf16-be'),
    (codecs.BOM_UTF16_LE, 'utf16-le'),
    (codecs.BOM_UTF32, 'utf32'),
    (codecs.BOM_UTF32_BE, 'utf32-be'),
    (codecs.BOM_UTF32_LE, 'utf32-le'),
]

ENCODING_RE = re.compile(br'coding[:=]\s*([-\w.]+)')


def auto_decode(data):
    """Check a bytes string for a BOM to correctly detect the encoding

    Fallback to locale.getpreferredencoding(False) like open() on Python3"""
    for bom, encoding in BOMS:
        if data.startswith(bom):
            return data[len(bom):].decode(encoding)
    # Lets check the first two lines as in PEP263
    for line in data.split(b'\n')[:2]:
        if line[0:1] == b'#' and ENCODING_RE.search(line):
            encoding = ENCODING_RE.search(line).groups()[0].decode('ascii')
            return data.decode(encoding)
    return data.decode(locale.getpreferredencoding(False))
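

# Illustrative sketch (not part of the original pip module): auto_decode()
# prefers a BOM, then a PEP 263 coding comment, then the locale's preferred
# encoding.
def _demo_auto_decode():
    with_bom = codecs.BOM_UTF8 + u"caf\u00e9".encode("utf8")
    assert auto_decode(with_bom) == u"caf\u00e9"
    pep263 = b"# -*- coding: latin-1 -*-\nname = '\xe9'\n"
    assert auto_decode(pep263).endswith(u"name = '\u00e9'\n")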
site-packages/pip/utils/__init__.py000064400000066155147511334620013350 0ustar00from __future__ import absolute_import

from collections import deque
import contextlib
import errno
import io
import locale
# we have a submodule named 'logging' which would shadow this if we used the
# regular name:
import logging as std_logging
import re
import os
import posixpath
import shutil
import stat
import subprocess
import sys
import tarfile
import zipfile

from pip.exceptions import InstallationError
from pip.compat import console_to_str, expanduser, stdlib_pkgs
from pip.locations import (
    site_packages, user_site, running_under_virtualenv, virtualenv_no_global,
    write_delete_marker_file, distutils_scheme,
)
from pip._vendor import pkg_resources
from pip._vendor.six.moves import input
from pip._vendor.six import PY2
from pip._vendor.retrying import retry

if PY2:
    from io import BytesIO as StringIO
else:
    from io import StringIO

__all__ = ['rmtree', 'display_path', 'backup_dir',
           'ask', 'splitext',
           'format_size', 'is_installable_dir',
           'is_svn_page', 'file_contents',
           'split_leading_dir', 'has_leading_dir',
           'normalize_path',
           'renames', 'get_terminal_size', 'get_prog',
           'unzip_file', 'untar_file', 'unpack_file', 'call_subprocess',
           'captured_stdout', 'ensure_dir',
           'ARCHIVE_EXTENSIONS', 'SUPPORTED_EXTENSIONS',
           'get_installed_version']


logger = std_logging.getLogger(__name__)

BZ2_EXTENSIONS = ('.tar.bz2', '.tbz')
XZ_EXTENSIONS = ('.tar.xz', '.txz', '.tlz', '.tar.lz', '.tar.lzma')
ZIP_EXTENSIONS = ('.zip', '.whl')
TAR_EXTENSIONS = ('.tar.gz', '.tgz', '.tar')
ARCHIVE_EXTENSIONS = (
    ZIP_EXTENSIONS + BZ2_EXTENSIONS + TAR_EXTENSIONS + XZ_EXTENSIONS)
SUPPORTED_EXTENSIONS = ZIP_EXTENSIONS + TAR_EXTENSIONS
try:
    import bz2  # noqa
    SUPPORTED_EXTENSIONS += BZ2_EXTENSIONS
except ImportError:
    logger.debug('bz2 module is not available')

try:
    # Only for Python 3.3+
    import lzma  # noqa
    SUPPORTED_EXTENSIONS += XZ_EXTENSIONS
except ImportError:
    logger.debug('lzma module is not available')


def import_or_raise(pkg_or_module_string, ExceptionType, *args, **kwargs):
    try:
        return __import__(pkg_or_module_string)
    except ImportError:
        raise ExceptionType(*args, **kwargs)


def ensure_dir(path):
    """os.makedirs without EEXIST."""
    try:
        os.makedirs(path)
    except OSError as e:
        if e.errno != errno.EEXIST:
            raise


def get_prog():
    try:
        if os.path.basename(sys.argv[0]) in ('__main__.py', '-c'):
            return "%s -m pip" % sys.executable
    except (AttributeError, TypeError, IndexError):
        pass
    return 'pip'


# Retry every half second for up to 3 seconds
@retry(stop_max_delay=3000, wait_fixed=500)
def rmtree(dir, ignore_errors=False):
    shutil.rmtree(dir, ignore_errors=ignore_errors,
                  onerror=rmtree_errorhandler)


def rmtree_errorhandler(func, path, exc_info):
    """On Windows, the files in .svn are read-only, so when rmtree() tries to
    remove them, an exception is thrown.  We catch that here, remove the
    read-only attribute, and hopefully continue without problems."""
    # if file type currently read only
    if os.stat(path).st_mode & stat.S_IREAD:
        # convert to read/write
        os.chmod(path, stat.S_IWRITE)
        # use the original function to repeat the operation
        func(path)
        return
    else:
        raise


def display_path(path):
    """Gives the display value for a given path, making it relative to cwd
    if possible."""
    path = os.path.normcase(os.path.abspath(path))
    if sys.version_info[0] == 2:
        path = path.decode(sys.getfilesystemencoding(), 'replace')
        path = path.encode(sys.getdefaultencoding(), 'replace')
    if path.startswith(os.getcwd() + os.path.sep):
        path = '.' + path[len(os.getcwd()):]
    return path


def backup_dir(dir, ext='.bak'):
    """Figure out the name of a directory to back up the given dir to
    (adding .bak, .bak2, etc)"""
    n = 1
    extension = ext
    while os.path.exists(dir + extension):
        n += 1
        extension = ext + str(n)
    return dir + extension


def ask_path_exists(message, options):
    for action in os.environ.get('PIP_EXISTS_ACTION', '').split():
        if action in options:
            return action
    return ask(message, options)


def ask(message, options):
    """Ask the message interactively, with the given possible responses"""
    while 1:
        if os.environ.get('PIP_NO_INPUT'):
            raise Exception(
                'No input was expected ($PIP_NO_INPUT set); question: %s' %
                message
            )
        response = input(message)
        response = response.strip().lower()
        if response not in options:
            print(
                'Your response (%r) was not one of the expected responses: '
                '%s' % (response, ', '.join(options))
            )
        else:
            return response


def format_size(bytes):
    if bytes > 1000 * 1000:
        return '%.1fMB' % (bytes / 1000.0 / 1000)
    elif bytes > 10 * 1000:
        return '%ikB' % (bytes / 1000)
    elif bytes > 1000:
        return '%.1fkB' % (bytes / 1000.0)
    else:
        return '%ibytes' % bytes
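

# Illustrative sketch (not part of the original pip module): format_size() uses
# decimal (1000-based) units and picks its precision per size bracket.
def _demo_format_size():
    assert format_size(500) == '500bytes'
    assert format_size(150 * 1000) == '150kB'
    assert format_size(2500000) == '2.5MB'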


def is_installable_dir(path):
    """Return True if `path` is a directory containing a setup.py file."""
    if not os.path.isdir(path):
        return False
    setup_py = os.path.join(path, 'setup.py')
    if os.path.isfile(setup_py):
        return True
    return False


def is_svn_page(html):
    """
    Returns true if the page appears to be the index page of an svn repository
    """
    return (re.search(r'<title>[^<]*Revision \d+:', html) and
            re.search(r'Powered by (?:<a[^>]*?>)?Subversion', html, re.I))


def file_contents(filename):
    with open(filename, 'rb') as fp:
        return fp.read().decode('utf-8')


def read_chunks(file, size=io.DEFAULT_BUFFER_SIZE):
    """Yield pieces of data from a file-like object until EOF."""
    while True:
        chunk = file.read(size)
        if not chunk:
            break
        yield chunk


def split_leading_dir(path):
    path = path.lstrip('/').lstrip('\\')
    if '/' in path and (('\\' in path and path.find('/') < path.find('\\')) or
                        '\\' not in path):
        return path.split('/', 1)
    elif '\\' in path:
        return path.split('\\', 1)
    else:
        return path, ''


def has_leading_dir(paths):
    """Returns true if all the paths have the same leading path name
    (i.e., everything is in one subdirectory in an archive)"""
    common_prefix = None
    for path in paths:
        prefix, rest = split_leading_dir(path)
        if not prefix:
            return False
        elif common_prefix is None:
            common_prefix = prefix
        elif prefix != common_prefix:
            return False
    return True
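

# Illustrative sketch (not part of the original pip module): these helpers decide
# whether an archive wraps everything in one top-level directory (the usual sdist
# layout), in which case unpacking strips that leading component. The member
# names below are hypothetical.
def _demo_leading_dir():
    names = ['pkg-1.0/setup.py', 'pkg-1.0/pkg/__init__.py']
    assert has_leading_dir(names)
    assert split_leading_dir(names[0]) == ['pkg-1.0', 'setup.py']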


def normalize_path(path, resolve_symlinks=True):
    """
    Convert a path to its canonical, case-normalized, absolute version.

    """
    path = expanduser(path)
    if resolve_symlinks:
        path = os.path.realpath(path)
    else:
        path = os.path.abspath(path)
    return os.path.normcase(path)


def splitext(path):
    """Like os.path.splitext, but take off .tar too"""
    base, ext = posixpath.splitext(path)
    if base.lower().endswith('.tar'):
        ext = base[-4:] + ext
        base = base[:-4]
    return base, ext
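

# Illustrative sketch (not part of the original pip module): unlike
# os.path.splitext, the ".tar" part of a compound suffix stays with the extension.
def _demo_splitext():
    assert splitext('dir/archive.tar.gz') == ('dir/archive', '.tar.gz')
    assert splitext('archive.zip') == ('archive', '.zip')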


def renames(old, new):
    """Like os.renames(), but handles renaming across devices."""
    # Implementation borrowed from os.renames().
    head, tail = os.path.split(new)
    if head and tail and not os.path.exists(head):
        os.makedirs(head)

    shutil.move(old, new)

    head, tail = os.path.split(old)
    if head and tail:
        try:
            os.removedirs(head)
        except OSError:
            pass


def is_local(path):
    """
    Return True if path is within sys.prefix, if we're running in a virtualenv.

    If we're not in a virtualenv, all paths are considered "local."

    """
    if not running_under_virtualenv():
        return True
    return normalize_path(path).startswith(normalize_path(sys.prefix))


def dist_is_local(dist):
    """
    Return True if given Distribution object is installed locally
    (i.e. within current virtualenv).

    Always True if we're not in a virtualenv.

    """
    return is_local(dist_location(dist))


def dist_in_usersite(dist):
    """
    Return True if given Distribution is installed in user site.
    """
    norm_path = normalize_path(dist_location(dist))
    return norm_path.startswith(normalize_path(user_site))


def dist_in_site_packages(dist):
    """
    Return True if given Distribution is installed in
    distutils.sysconfig.get_python_lib().
    """
    return normalize_path(
        dist_location(dist)
    ).startswith(normalize_path(site_packages))


def dist_in_install_path(dist):
    """
    Return True if given Distribution is installed in
    path matching distutils_scheme layout.
    """
    norm_path = normalize_path(dist_location(dist))
    return norm_path.startswith(normalize_path(
        distutils_scheme("")['purelib'].split('python')[0]))


def dist_is_editable(dist):
    """Is distribution an editable install?"""
    for path_item in sys.path:
        egg_link = os.path.join(path_item, dist.project_name + '.egg-link')
        if os.path.isfile(egg_link):
            return True
    return False


def get_installed_distributions(local_only=True,
                                skip=stdlib_pkgs,
                                include_editables=True,
                                editables_only=False,
                                user_only=False):
    """
    Return a list of installed Distribution objects.

    If ``local_only`` is True (default), only return installations
    local to the current virtualenv, if in a virtualenv.

    ``skip`` argument is an iterable of lower-case project names to
    ignore; defaults to stdlib_pkgs

    If ``include_editables`` is False, don't report editables.

    If ``editables_only`` is True, only report editables.

    If ``user_only`` is True, only report installations in the user
    site directory.

    """
    if local_only:
        local_test = dist_is_local
    else:
        def local_test(d):
            return True

    if include_editables:
        def editable_test(d):
            return True
    else:
        def editable_test(d):
            return not dist_is_editable(d)

    if editables_only:
        def editables_only_test(d):
            return dist_is_editable(d)
    else:
        def editables_only_test(d):
            return True

    if user_only:
        user_test = dist_in_usersite
    else:
        def user_test(d):
            return True

    return [d for d in pkg_resources.working_set
            if local_test(d) and
            d.key not in skip and
            editable_test(d) and
            editables_only_test(d) and
            user_test(d)
            ]


def egg_link_path(dist):
    """
    Return the path for the .egg-link file if it exists, otherwise, None.

    There are 3 scenarios:
    1) not in a virtualenv
       try to find in site.USER_SITE, then site_packages
    2) in a no-global virtualenv
       try to find in site_packages
    3) in a yes-global virtualenv
       try to find in site_packages, then site.USER_SITE
       (don't look in global location)

    For #1 and #3, there could be odd cases, where there's an egg-link in 2
    locations.

    This method will just return the first one found.
    """
    sites = []
    if running_under_virtualenv():
        if virtualenv_no_global():
            sites.append(site_packages)
        else:
            sites.append(site_packages)
            if user_site:
                sites.append(user_site)
    else:
        if user_site:
            sites.append(user_site)
        sites.append(site_packages)

    for site in sites:
        egglink = os.path.join(site, dist.project_name) + '.egg-link'
        if os.path.isfile(egglink):
            return egglink


def dist_location(dist):
    """
    Get the site-packages location of this distribution. Generally
    this is dist.location, except in the case of develop-installed
    packages, where dist.location is the source code location, and we
    want to know where the egg-link file is.

    """
    egg_link = egg_link_path(dist)
    if egg_link:
        return egg_link
    return dist.location


def get_terminal_size():
    """Returns a tuple (x, y) representing the width (x) and the height (y)
    in characters of the terminal window."""
    def ioctl_GWINSZ(fd):
        try:
            import fcntl
            import termios
            import struct
            cr = struct.unpack(
                'hh',
                fcntl.ioctl(fd, termios.TIOCGWINSZ, '1234')
            )
        except:
            return None
        if cr == (0, 0):
            return None
        return cr
    cr = ioctl_GWINSZ(0) or ioctl_GWINSZ(1) or ioctl_GWINSZ(2)
    if not cr:
        try:
            fd = os.open(os.ctermid(), os.O_RDONLY)
            cr = ioctl_GWINSZ(fd)
            os.close(fd)
        except:
            pass
    if not cr:
        cr = (os.environ.get('LINES', 25), os.environ.get('COLUMNS', 80))
    return int(cr[1]), int(cr[0])


def current_umask():
    """Get the current umask which involves having to set it temporarily."""
    mask = os.umask(0)
    os.umask(mask)
    return mask


def unzip_file(filename, location, flatten=True):
    """
    Unzip the file (with path `filename`) to the destination `location`.  All
    files are written based on system defaults and umask (i.e. permissions are
    not preserved), except that regular file members with any execute
    permissions (user, group, or world) have "chmod +x" applied after being
    written. Note that for windows, any execute changes using os.chmod are
    no-ops per the python docs.
    """
    ensure_dir(location)
    zipfp = open(filename, 'rb')
    try:
        zip = zipfile.ZipFile(zipfp, allowZip64=True)
        leading = has_leading_dir(zip.namelist()) and flatten
        for info in zip.infolist():
            name = info.filename
            data = zip.read(name)
            fn = name
            if leading:
                fn = split_leading_dir(name)[1]
            fn = os.path.join(location, fn)
            dir = os.path.dirname(fn)
            if fn.endswith('/') or fn.endswith('\\'):
                # A directory
                ensure_dir(fn)
            else:
                ensure_dir(dir)
                fp = open(fn, 'wb')
                try:
                    fp.write(data)
                finally:
                    fp.close()
                    mode = info.external_attr >> 16
                    # if mode and regular file and any execute permissions for
                    # user/group/world?
                    if mode and stat.S_ISREG(mode) and mode & 0o111:
                        # make dest file have execute for user/group/world
                        # (chmod +x) no-op on windows per python docs
                        os.chmod(fn, (0o777 - current_umask() | 0o111))
    finally:
        zipfp.close()


def untar_file(filename, location):
    """
    Untar the file (with path `filename`) to the destination `location`.
    All files are written based on system defaults and umask (i.e. permissions
    are not preserved), except that regular file members with any execute
    permissions (user, group, or world) have "chmod +x" applied after being
    written.  Note that for windows, any execute changes using os.chmod are
    no-ops per the python docs.
    """
    ensure_dir(location)
    if filename.lower().endswith('.gz') or filename.lower().endswith('.tgz'):
        mode = 'r:gz'
    elif filename.lower().endswith(BZ2_EXTENSIONS):
        mode = 'r:bz2'
    elif filename.lower().endswith(XZ_EXTENSIONS):
        mode = 'r:xz'
    elif filename.lower().endswith('.tar'):
        mode = 'r'
    else:
        logger.warning(
            'Cannot determine compression type for file %s', filename,
        )
        mode = 'r:*'
    tar = tarfile.open(filename, mode)
    try:
        # note: python<=2.5 doesn't seem to know about pax headers, filter them
        leading = has_leading_dir([
            member.name for member in tar.getmembers()
            if member.name != 'pax_global_header'
        ])
        for member in tar.getmembers():
            fn = member.name
            if fn == 'pax_global_header':
                continue
            if leading:
                fn = split_leading_dir(fn)[1]
            path = os.path.join(location, fn)

            # Call the `data` filter for its side effect (raising exception)
            try:
                tarfile.data_filter(member.replace(name=fn), location)
            except tarfile.LinkOutsideDestinationError:
                pass

            if member.isdir():
                ensure_dir(path)
            elif member.issym():
                try:
                    tar._extract_member(member, path)
                except Exception as exc:
                    # Some corrupt tar files seem to produce this
                    # (specifically bad symlinks)
                    logger.warning(
                        'In the tar file %s the member %s is invalid: %s',
                        filename, member.name, exc,
                    )
                    continue
            else:
                try:
                    fp = tar.extractfile(member)
                except (KeyError, AttributeError) as exc:
                    # Some corrupt tar files seem to produce this
                    # (specifically bad symlinks)
                    logger.warning(
                        'In the tar file %s the member %s is invalid: %s',
                        filename, member.name, exc,
                    )
                    continue
                ensure_dir(os.path.dirname(path))
                with open(path, 'wb') as destfp:
                    shutil.copyfileobj(fp, destfp)
                fp.close()
                # Update the timestamp (useful for cython compiled files)
                tar.utime(member, path)
                # member have any execute permissions for user/group/world?
                if member.mode & 0o111:
                    # make dest file have execute for user/group/world
                    # no-op on windows per python docs
                    os.chmod(path, (0o777 - current_umask() | 0o111))
    finally:
        tar.close()


def unpack_file(filename, location, content_type, link):
    filename = os.path.realpath(filename)
    if (content_type == 'application/zip' or
            filename.lower().endswith(ZIP_EXTENSIONS) or
            zipfile.is_zipfile(filename)):
        unzip_file(
            filename,
            location,
            flatten=not filename.endswith('.whl')
        )
    elif (content_type == 'application/x-gzip' or
            tarfile.is_tarfile(filename) or
            filename.lower().endswith(
                TAR_EXTENSIONS + BZ2_EXTENSIONS + XZ_EXTENSIONS)):
        untar_file(filename, location)
    elif (content_type and content_type.startswith('text/html') and
            is_svn_page(file_contents(filename))):
        # We don't really care about this
        from pip.vcs.subversion import Subversion
        Subversion('svn+' + link.url).unpack(location)
    else:
        # FIXME: handle?
        # FIXME: magic signatures?
        logger.critical(
            'Cannot unpack file %s (downloaded from %s, content-type: %s); '
            'cannot detect archive format',
            filename, location, content_type,
        )
        raise InstallationError(
            'Cannot determine archive format of %s' % location
        )


def call_subprocess(cmd, show_stdout=True, cwd=None,
                    on_returncode='raise',
                    command_desc=None,
                    extra_environ=None, spinner=None):
    # This function's handling of subprocess output is confusing and I
    # previously broke it terribly, so as penance I will write a long comment
    # explaining things.
    #
    # The obvious thing that affects output is the show_stdout=
    # kwarg. show_stdout=True means, let the subprocess write directly to our
    # stdout. Even though it is nominally the default, it is almost never used
    # inside pip (and should not be used in new code without a very good
    # reason); as of 2016-02-22 it is only used in a few places inside the VCS
    # wrapper code. Ideally we should get rid of it entirely, because it
    # creates a lot of complexity here for a rarely used feature.
    #
    # Most places in pip set show_stdout=False. What this means is:
    # - We connect the child stdout to a pipe, which we read.
    # - By default, we hide the output but show a spinner -- unless the
    #   subprocess exits with an error, in which case we show the output.
    # - If the --verbose option was passed (= loglevel is DEBUG), then we show
    #   the output unconditionally. (But in this case we don't want to show
    #   the output a second time if it turns out that there was an error.)
    #
    # stderr is always merged with stdout (even if show_stdout=True).
    if show_stdout:
        stdout = None
    else:
        stdout = subprocess.PIPE
    if command_desc is None:
        cmd_parts = []
        for part in cmd:
            if ' ' in part or '\n' in part or '"' in part or "'" in part:
                part = '"%s"' % part.replace('"', '\\"')
            cmd_parts.append(part)
        command_desc = ' '.join(cmd_parts)
    logger.debug("Running command %s", command_desc)
    env = os.environ.copy()
    if extra_environ:
        env.update(extra_environ)
    try:
        proc = subprocess.Popen(
            cmd, stderr=subprocess.STDOUT, stdin=None, stdout=stdout,
            cwd=cwd, env=env)
    except Exception as exc:
        logger.critical(
            "Error %s while executing command %s", exc, command_desc,
        )
        raise
    if stdout is not None:
        all_output = []
        while True:
            line = console_to_str(proc.stdout.readline())
            if not line:
                break
            line = line.rstrip()
            all_output.append(line + '\n')
            if logger.getEffectiveLevel() <= std_logging.DEBUG:
                # Show the line immediately
                logger.debug(line)
            else:
                # Update the spinner
                if spinner is not None:
                    spinner.spin()
    proc.wait()
    if spinner is not None:
        if proc.returncode:
            spinner.finish("error")
        else:
            spinner.finish("done")
    if proc.returncode:
        if on_returncode == 'raise':
            if (logger.getEffectiveLevel() > std_logging.DEBUG and
                    not show_stdout):
                logger.info(
                    'Complete output from command %s:', command_desc,
                )
                logger.info(
                    ''.join(all_output) +
                    '\n----------------------------------------'
                )
            raise InstallationError(
                'Command "%s" failed with error code %s in %s'
                % (command_desc, proc.returncode, cwd))
        elif on_returncode == 'warn':
            logger.warning(
                'Command "%s" had error code %s in %s',
                command_desc, proc.returncode, cwd,
            )
        elif on_returncode == 'ignore':
            pass
        else:
            raise ValueError('Invalid value: on_returncode=%s' %
                             repr(on_returncode))
    if not show_stdout:
        return ''.join(all_output)
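

# Illustrative sketch (not part of the original pip module): with
# show_stdout=False (the common case) the child's merged stdout/stderr is
# returned as a single string once the process exits successfully.
def _demo_call_subprocess():
    output = call_subprocess(
        [sys.executable, '-c', 'print("hello from child")'],
        show_stdout=False,
    )
    assert 'hello from child' in output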


def read_text_file(filename):
    """Return the contents of *filename*.

    Try to decode the file contents with utf-8, the preferred system encoding
    (e.g., cp1252 on some Windows machines), and latin1, in that order.
    Decoding a byte string with latin1 will never raise an error. In the worst
    case, the returned string will contain some garbage characters.

    """
    with open(filename, 'rb') as fp:
        data = fp.read()

    encodings = ['utf-8', locale.getpreferredencoding(False), 'latin1']
    for enc in encodings:
        try:
            data = data.decode(enc)
        except UnicodeDecodeError:
            continue
        break

    assert type(data) != bytes  # Latin1 should have worked.
    return data


def _make_build_dir(build_dir):
    os.makedirs(build_dir)
    write_delete_marker_file(build_dir)


class FakeFile(object):
    """Wrap a list of lines in an object with readline() to make
    ConfigParser happy."""
    def __init__(self, lines):
        self._gen = (l for l in lines)

    def readline(self):
        try:
            try:
                return next(self._gen)
            except NameError:
                return self._gen.next()
        except StopIteration:
            return ''

    def __iter__(self):
        return self._gen


class StreamWrapper(StringIO):

    @classmethod
    def from_stream(cls, orig_stream):
        cls.orig_stream = orig_stream
        return cls()

    # compileall.compile_dir() needs stdout.encoding to print to stdout
    @property
    def encoding(self):
        return self.orig_stream.encoding


@contextlib.contextmanager
def captured_output(stream_name):
    """Return a context manager used by captured_stdout/stdin/stderr
    that temporarily replaces the sys stream *stream_name* with a StringIO.

    Taken from Lib/support/__init__.py in the CPython repo.
    """
    orig_stdout = getattr(sys, stream_name)
    setattr(sys, stream_name, StreamWrapper.from_stream(orig_stdout))
    try:
        yield getattr(sys, stream_name)
    finally:
        setattr(sys, stream_name, orig_stdout)


def captured_stdout():
    """Capture the output of sys.stdout:

       with captured_stdout() as stdout:
           print('hello')
       self.assertEqual(stdout.getvalue(), 'hello\n')

    Taken from Lib/support/__init__.py in the CPython repo.
    """
    return captured_output('stdout')


class cached_property(object):
    """A property that is only computed once per instance and then replaces
       itself with an ordinary attribute. Deleting the attribute resets the
       property.

       Source: https://github.com/bottlepy/bottle/blob/0.11.5/bottle.py#L175
    """

    def __init__(self, func):
        self.__doc__ = getattr(func, '__doc__')
        self.func = func

    def __get__(self, obj, cls):
        if obj is None:
            # We're being accessed from the class itself, not from an object
            return self
        value = obj.__dict__[self.func.__name__] = self.func(obj)
        return value
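

# Illustrative sketch (not part of the original pip module): the first access
# runs the wrapped function, after which the result is stored on the instance
# and the descriptor is bypassed. The demo class is hypothetical.
class _DemoExpensive(object):
    def __init__(self):
        self.calls = 0

    @cached_property
    def answer(self):
        self.calls += 1  # runs only once per instance
        return 42


def _demo_cached_property():
    obj = _DemoExpensive()
    assert obj.answer == 42 and obj.answer == 42
    assert obj.calls == 1  # cached after the first access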


def get_installed_version(dist_name, lookup_dirs=None):
    """Get the installed version of dist_name avoiding pkg_resources cache"""
    # Create a requirement that we'll look for inside of setuptools.
    req = pkg_resources.Requirement.parse(dist_name)

    # We want to avoid having this cached, so we need to construct a new
    # working set each time.
    if lookup_dirs is None:
        working_set = pkg_resources.WorkingSet()
    else:
        working_set = pkg_resources.WorkingSet(lookup_dirs)

    # Get the installed distribution from our working set
    dist = working_set.find(req)

    # Check to see if we got an installed distribution or not, if we did
    # we want to return its version.
    return dist.version if dist else None


def consume(iterator):
    """Consume an iterable at C speed."""
    deque(iterator, maxlen=0)
site-packages/pip/utils/packaging.py000064400000004040147511334620013516 0ustar00from __future__ import absolute_import

from email.parser import FeedParser

import logging
import sys

from pip._vendor.packaging import specifiers
from pip._vendor.packaging import version
from pip._vendor import pkg_resources

from pip import exceptions

logger = logging.getLogger(__name__)


def check_requires_python(requires_python):
    """
    Check if the python version in use matches the `requires_python` specifier.

    Returns `True` if the version of python in use matches the requirement.
    Returns `False` if the version of python in use does not match the
    requirement.

    Raises an InvalidSpecifier if `requires_python` has an invalid format.
    """
    if requires_python is None:
        # The package provides no information
        return True
    requires_python_specifier = specifiers.SpecifierSet(requires_python)

    # We only use major.minor.micro
    python_version = version.parse('.'.join(map(str, sys.version_info[:3])))
    return python_version in requires_python_specifier
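

# Illustrative sketch (not part of the original pip module): the specifier string
# comes straight from a package's Requires-Python metadata.
def _demo_check_requires_python():
    assert check_requires_python(None)        # no declaration -> accepted
    assert check_requires_python('>=2.6')     # satisfied by any supported interpreter
    assert not check_requires_python('<2.0')  # would reject every modern Python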


def get_metadata(dist):
    if (isinstance(dist, pkg_resources.DistInfoDistribution) and
            dist.has_metadata('METADATA')):
        return dist.get_metadata('METADATA')
    elif dist.has_metadata('PKG-INFO'):
        return dist.get_metadata('PKG-INFO')


def check_dist_requires_python(dist):
    metadata = get_metadata(dist)
    feed_parser = FeedParser()
    feed_parser.feed(metadata)
    pkg_info_dict = feed_parser.close()
    requires_python = pkg_info_dict.get('Requires-Python')
    try:
        if not check_requires_python(requires_python):
            raise exceptions.UnsupportedPythonVersion(
                "%s requires Python '%s' but the running Python is %s" % (
                    dist.project_name,
                    requires_python,
                    '.'.join(map(str, sys.version_info[:3])),)
            )
    except specifiers.InvalidSpecifier as e:
        logger.warning(
            "Package %s has an invalid Requires-Python entry %s - %s" % (
                dist.project_name, requires_python, e))
        return
site-packages/pip/utils/setuptools_build.py000064400000000426147511334620015176 0ustar00# Shim to wrap setup.py invocation with setuptools
SETUPTOOLS_SHIM = (
    "import setuptools, tokenize;__file__=%r;"
    "f=getattr(tokenize, 'open', open)(__file__);"
    "code=f.read().replace('\\r\\n', '\\n');"
    "f.close();"
    "exec(compile(code, __file__, 'exec'))"
)
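

# Illustrative sketch (not part of the original pip module): the shim is
# interpolated with the path to setup.py and handed to "python -c", so the
# build goes through setuptools even when setup.py only imports distutils.
# The helper and the 'egg_info' argument here are hypothetical.
def _demo_shim_command(setup_py_path):
    import sys
    return [sys.executable, '-c', SETUPTOOLS_SHIM % setup_py_path, 'egg_info']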
site-packages/pip/utils/glibc.py000064400000005573147511334620012666 0ustar00from __future__ import absolute_import

import re
import ctypes
import platform
import warnings


def glibc_version_string():
    "Returns glibc version string, or None if not using glibc."

    # ctypes.CDLL(None) internally calls dlopen(NULL), and as the dlopen
    # manpage says, "If filename is NULL, then the returned handle is for the
    # main program". This way we can let the linker do the work to figure out
    # which libc our process is actually using.
    process_namespace = ctypes.CDLL(None)
    try:
        gnu_get_libc_version = process_namespace.gnu_get_libc_version
    except AttributeError:
        # Symbol doesn't exist -> therefore, we are not linked to
        # glibc.
        return None

    # Call gnu_get_libc_version, which returns a string like "2.5"
    gnu_get_libc_version.restype = ctypes.c_char_p
    version_str = gnu_get_libc_version()
    # py2 / py3 compatibility:
    if not isinstance(version_str, str):
        version_str = version_str.decode("ascii")

    return version_str


# Separated out from have_compatible_glibc for easier unit testing
def check_glibc_version(version_str, required_major, minimum_minor):
    # Parse string and check against requested version.
    #
    # We use a regexp instead of str.split because we want to discard any
    # random junk that might come after the minor version -- this might happen
    # in patched/forked versions of glibc (e.g. Linaro's version of glibc
    # uses version strings like "2.20-2014.11"). See gh-3588.
    m = re.match(r"(?P<major>[0-9]+)\.(?P<minor>[0-9]+)", version_str)
    if not m:
        warnings.warn("Expected glibc version with 2 components major.minor,"
                      " got: %s" % version_str, RuntimeWarning)
        return False
    return (int(m.group("major")) == required_major and
            int(m.group("minor")) >= minimum_minor)
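

# Illustrative sketch (not part of the original pip module): vendor-patched
# version strings keep any trailing junk out of the comparison, and an
# unparsable string warns and is treated as incompatible.
def _demo_check_glibc_version():
    assert check_glibc_version("2.20-2014.11", 2, 17)   # Linaro-style string
    assert not check_glibc_version("2.5", 2, 17)        # too old
    assert not check_glibc_version("musl-1.1", 2, 17)   # unparsable -> warns, False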


def have_compatible_glibc(required_major, minimum_minor):
    version_str = glibc_version_string()
    if version_str is None:
        return False
    return check_glibc_version(version_str, required_major, minimum_minor)


# platform.libc_ver regularly returns completely nonsensical glibc
# versions. E.g. on my computer, platform says:
#
#   ~$ python2.7 -c 'import platform; print(platform.libc_ver())'
#   ('glibc', '2.7')
#   ~$ python3.5 -c 'import platform; print(platform.libc_ver())'
#   ('glibc', '2.9')
#
# But the truth is:
#
#   ~$ ldd --version
#   ldd (Debian GLIBC 2.22-11) 2.22
#
# This is unfortunate, because it means that the linehaul data on libc
# versions that was generated by pip 8.1.2 and earlier is useless and
# misleading. Solution: instead of using platform, use our code that actually
# works.
def libc_ver():
    glibc_version = glibc_version_string()
    if glibc_version is None:
        # For non-glibc platforms, fall back on platform.libc_ver
        return platform.libc_ver()
    else:
        return ("glibc", glibc_version)
site-packages/pip/utils/ui.py000064400000026515147511334620012222 0ustar00from __future__ import absolute_import
from __future__ import division

import itertools
import sys
from signal import signal, SIGINT, default_int_handler
import time
import contextlib
import logging

from pip.compat import WINDOWS
from pip.utils import format_size
from pip.utils.logging import get_indentation
from pip._vendor import six
from pip._vendor.progress.bar import Bar, IncrementalBar
from pip._vendor.progress.helpers import (WritelnMixin,
                                          HIDE_CURSOR, SHOW_CURSOR)
from pip._vendor.progress.spinner import Spinner

try:
    from pip._vendor import colorama
# Lots of different errors can come from this, including SystemError and
# ImportError.
except Exception:
    colorama = None

logger = logging.getLogger(__name__)


def _select_progress_class(preferred, fallback):
    encoding = getattr(preferred.file, "encoding", None)

    # If we don't know what encoding this file is in, then we'll just assume
    # that it doesn't support unicode and use the ASCII bar.
    if not encoding:
        return fallback

    # Collect all of the possible characters we want to use with the preferred
    # bar.
    characters = [
        getattr(preferred, "empty_fill", six.text_type()),
        getattr(preferred, "fill", six.text_type()),
    ]
    characters += list(getattr(preferred, "phases", []))

    # Try to decode the characters we're using for the bar using the encoding
    # of the given file, if this works then we'll assume that we can use the
    # fancier bar and if not we'll fall back to the plaintext bar.
    try:
        six.text_type().join(characters).encode(encoding)
    except UnicodeEncodeError:
        return fallback
    else:
        return preferred


_BaseBar = _select_progress_class(IncrementalBar, Bar)
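

# Illustrative sketch (not part of pip): the encoding probe performed by
# _select_progress_class(), shown on hypothetical stand-in classes. A file
# that reports an ASCII encoding cannot represent the unicode fill
# characters, so the plain fallback class is returned.
def _example_progress_class_probe():
    import io

    class _FancyBar(object):
        empty_fill = u"\u2591"          # unicode shade/block characters
        fill = u"\u2588"
        phases = (u"\u258f", u"\u258b", u"\u2588")
        file = io.TextIOWrapper(io.BytesIO(), encoding="ascii")

    class _PlainBar(object):
        pass

    return _select_progress_class(_FancyBar, _PlainBar)  # -> _PlainBar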


class InterruptibleMixin(object):
    """
    Helper to ensure that self.finish() gets called on keyboard interrupt.

    This allows downloads to be interrupted without leaving temporary state
    (like hidden cursors) behind.

    This class is similar to the progress library's existing SigIntMixin
    helper, but as of version 1.2, that helper has the following problems:

    1. It calls sys.exit().
    2. It discards the existing SIGINT handler completely.
    3. It leaves its own handler in place even after an uninterrupted finish,
       which will have unexpected delayed effects if the user triggers an
       unrelated keyboard interrupt some time after a progress-displaying
       download has already completed, for example.
    """

    def __init__(self, *args, **kwargs):
        """
        Save the original SIGINT handler for later.
        """
        super(InterruptibleMixin, self).__init__(*args, **kwargs)

        self.original_handler = signal(SIGINT, self.handle_sigint)

        # If signal() returns None, the previous handler was not installed from
        # Python, and we cannot restore it. This probably should not happen,
        # but if it does, we must restore something sensible instead, at least.
        # The least bad option should be Python's default SIGINT handler, which
        # just raises KeyboardInterrupt.
        if self.original_handler is None:
            self.original_handler = default_int_handler

    def finish(self):
        """
        Restore the original SIGINT handler after finishing.

        This should happen regardless of whether the progress display finishes
        normally, or gets interrupted.
        """
        super(InterruptibleMixin, self).finish()
        signal(SIGINT, self.original_handler)

    def handle_sigint(self, signum, frame):
        """
        Call self.finish() before delegating to the original SIGINT handler.

        This handler should only be in place while the progress display is
        active.
        """
        self.finish()
        self.original_handler(signum, frame)
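

# Illustrative sketch (not part of pip): the save/restore pattern the mixin
# implements, without the progress-bar machinery. `work` is a hypothetical
# callable; whatever handler was active before is reinstated afterwards,
# whether or not the call raises.
def _example_sigint_save_restore(work):
    original = signal(SIGINT, default_int_handler)
    if original is None:
        # Handler was not installed from Python; fall back to the default.
        original = default_int_handler
    try:
        return work()
    finally:
        signal(SIGINT, original)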


class DownloadProgressMixin(object):

    def __init__(self, *args, **kwargs):
        super(DownloadProgressMixin, self).__init__(*args, **kwargs)
        self.message = (" " * (get_indentation() + 2)) + self.message

    @property
    def downloaded(self):
        return format_size(self.index)

    @property
    def download_speed(self):
        # Avoid zero division errors...
        if self.avg == 0.0:
            return "..."
        return format_size(1 / self.avg) + "/s"

    @property
    def pretty_eta(self):
        if self.eta:
            return "eta %s" % self.eta_td
        return ""

    def iter(self, it, n=1):
        for x in it:
            yield x
            self.next(n)
        self.finish()


class WindowsMixin(object):

    def __init__(self, *args, **kwargs):
        # The Windows terminal does not support the hide/show cursor ANSI codes
        # even with colorama. So we'll ensure that hide_cursor is False on
        # Windows.
        # This call needs to go before the super() call, so that hide_cursor
        # is set in time. The base progress bar class writes the "hide cursor"
        # code to the terminal in its init, so if we don't set this soon
        # enough, we get a "hide" with no corresponding "show"...
        if WINDOWS and self.hide_cursor:
            self.hide_cursor = False

        super(WindowsMixin, self).__init__(*args, **kwargs)

        # Check if we are running on Windows and we have the colorama module,
        # if we do then wrap our file with it.
        if WINDOWS and colorama:
            self.file = colorama.AnsiToWin32(self.file)
            # The progress code expects to be able to call self.file.isatty()
            # but the colorama.AnsiToWin32() object doesn't have that, so we'll
            # add it.
            self.file.isatty = lambda: self.file.wrapped.isatty()
            # The progress code expects to be able to call self.file.flush()
            # but the colorama.AnsiToWin32() object doesn't have that, so we'll
            # add it.
            self.file.flush = lambda: self.file.wrapped.flush()


class DownloadProgressBar(WindowsMixin, InterruptibleMixin,
                          DownloadProgressMixin, _BaseBar):

    file = sys.stdout
    message = "%(percent)d%%"
    suffix = "%(downloaded)s %(download_speed)s %(pretty_eta)s"


class DownloadProgressSpinner(WindowsMixin, InterruptibleMixin,
                              DownloadProgressMixin, WritelnMixin, Spinner):

    file = sys.stdout
    suffix = "%(downloaded)s %(download_speed)s"

    def next_phase(self):
        if not hasattr(self, "_phaser"):
            self._phaser = itertools.cycle(self.phases)
        return next(self._phaser)

    def update(self):
        message = self.message % self
        phase = self.next_phase()
        suffix = self.suffix % self
        line = ''.join([
            message,
            " " if message else "",
            phase,
            " " if suffix else "",
            suffix,
        ])

        self.writeln(line)


################################################################
# Generic "something is happening" spinners
#
# We don't even try using progress.spinner.Spinner here because it's actually
# simpler to reimplement from scratch than to coerce their code into doing
# what we need.
################################################################

@contextlib.contextmanager
def hidden_cursor(file):
    # The Windows terminal does not support the hide/show cursor ANSI codes,
    # even via colorama. So don't even try.
    if WINDOWS:
        yield
    # We don't want to clutter the output with control characters if we're
    # writing to a file, or if the user is running with --quiet.
    # See https://github.com/pypa/pip/issues/3418
    elif not file.isatty() or logger.getEffectiveLevel() > logging.INFO:
        yield
    else:
        file.write(HIDE_CURSOR)
        try:
            yield
        finally:
            file.write(SHOW_CURSOR)


class RateLimiter(object):
    def __init__(self, min_update_interval_seconds):
        self._min_update_interval_seconds = min_update_interval_seconds
        self._last_update = 0

    def ready(self):
        now = time.time()
        delta = now - self._last_update
        return delta >= self._min_update_interval_seconds

    def reset(self):
        self._last_update = time.time()
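

# Illustrative sketch (not part of pip): the intended RateLimiter protocol.
# `update` is a hypothetical callable; it runs at most once per half second
# no matter how tight the surrounding loop is, because ready() only reports
# True once the configured interval since the last reset() has elapsed.
def _example_rate_limited_loop(update, iterations=1000):
    limiter = RateLimiter(min_update_interval_seconds=0.5)
    for _ in range(iterations):
        if limiter.ready():
            update()
            limiter.reset()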


class InteractiveSpinner(object):
    def __init__(self, message, file=None, spin_chars="-\\|/",
                 # Empirically, 8 updates/second looks nice
                 min_update_interval_seconds=0.125):
        self._message = message
        if file is None:
            file = sys.stdout
        self._file = file
        self._rate_limiter = RateLimiter(min_update_interval_seconds)
        self._finished = False

        self._spin_cycle = itertools.cycle(spin_chars)

        self._file.write(" " * get_indentation() + self._message + " ... ")
        self._width = 0

    def _write(self, status):
        assert not self._finished
        # Erase what we wrote before by backspacing to the beginning, writing
        # spaces to overwrite the old text, and then backspacing again
        backup = "\b" * self._width
        self._file.write(backup + " " * self._width + backup)
        # Now we have a blank slate to add our status
        self._file.write(status)
        self._width = len(status)
        self._file.flush()
        self._rate_limiter.reset()

    def spin(self):
        if self._finished:
            return
        if not self._rate_limiter.ready():
            return
        self._write(next(self._spin_cycle))

    def finish(self, final_status):
        if self._finished:
            return
        self._write(final_status)
        self._file.write("\n")
        self._file.flush()
        self._finished = True


# Used for dumb terminals, non-interactive installs (no tty), etc.
# We still print updates occasionally (once every 60 seconds by default) to
# act as a keep-alive for systems like Travis-CI that take lack-of-output as
# an indication that a task has frozen.
class NonInteractiveSpinner(object):
    def __init__(self, message, min_update_interval_seconds=60):
        self._message = message
        self._finished = False
        self._rate_limiter = RateLimiter(min_update_interval_seconds)
        self._update("started")

    def _update(self, status):
        assert not self._finished
        self._rate_limiter.reset()
        logger.info("%s: %s", self._message, status)

    def spin(self):
        if self._finished:
            return
        if not self._rate_limiter.ready():
            return
        self._update("still running...")

    def finish(self, final_status):
        if self._finished:
            return
        self._update("finished with status '%s'" % (final_status,))
        self._finished = True


@contextlib.contextmanager
def open_spinner(message):
    # Interactive spinner goes directly to sys.stdout rather than being routed
    # through the logging system, but it acts like it has level INFO,
    # i.e. it's only displayed if we're at level INFO or better.
    # Non-interactive spinner goes through the logging system, so it is always
    # in sync with logging configuration.
    if sys.stdout.isatty() and logger.getEffectiveLevel() <= logging.INFO:
        spinner = InteractiveSpinner(message)
    else:
        spinner = NonInteractiveSpinner(message)
    try:
        with hidden_cursor(sys.stdout):
            yield spinner
    except KeyboardInterrupt:
        spinner.finish("canceled")
        raise
    except Exception:
        spinner.finish("error")
        raise
    else:
        spinner.finish("done")
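

# Illustrative sketch (not part of pip): how callers drive open_spinner().
# `steps` is a hypothetical iterable of callables; the context manager picks
# the interactive or non-interactive spinner and writes the final status
# ("done", "error" or "canceled") itself.
def _example_open_spinner_usage(steps):
    with open_spinner("Running example task") as spinner:
        for step in steps:
            step()
            spinner.spin()  # rate-limited internally, cheap to call often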
site-packages/pip/utils/deprecation.py
"""
A module that implements tooling to enable easy warnings about deprecations.
"""
from __future__ import absolute_import

import logging
import warnings


class PipDeprecationWarning(Warning):
    pass


class Pending(object):
    pass


class RemovedInPip10Warning(PipDeprecationWarning):
    pass


class RemovedInPip11Warning(PipDeprecationWarning, Pending):
    pass


class Python26DeprecationWarning(PipDeprecationWarning):
    pass


# Warnings <-> Logging Integration


_warnings_showwarning = None


def _showwarning(message, category, filename, lineno, file=None, line=None):
    if file is not None:
        if _warnings_showwarning is not None:
            _warnings_showwarning(
                message, category, filename, lineno, file, line,
            )
    else:
        if issubclass(category, PipDeprecationWarning):
            # We use a specially named logger which will handle all of the
            # deprecation messages for pip.
            logger = logging.getLogger("pip.deprecations")

            # This is purposely using the % formatter here instead of letting
            # the logging module handle the interpolation. This is because we
            # want it to appear as if someone typed this entire message out.
            log_message = "DEPRECATION: %s" % message

            # PipDeprecationWarnings that are Pending still have at least 2
            # versions to go until they are removed so they can just be
            # warnings.  Otherwise, they will be removed in the very next
            # version of pip. We want these to be more obvious so we use the
            # ERROR logging level.
            if issubclass(category, Pending):
                logger.warning(log_message)
            else:
                logger.error(log_message)
        else:
            _warnings_showwarning(
                message, category, filename, lineno, file, line,
            )


def install_warning_logger():
    # Enable our Deprecation Warnings
    warnings.simplefilter("default", PipDeprecationWarning, append=True)

    global _warnings_showwarning

    if _warnings_showwarning is None:
        _warnings_showwarning = warnings.showwarning
        warnings.showwarning = _showwarning
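

# Illustrative sketch (not part of pip): after install_warning_logger() has
# run, pip deprecation warnings are routed to the "pip.deprecations" logger.
# RemovedInPip10Warning is not Pending, so the message below would be logged
# at ERROR level with a "DEPRECATION: " prefix; the option name is made up.
def _example_emit_deprecation():
    install_warning_logger()
    warnings.warn(
        "--example-flag is deprecated and will be removed",
        RemovedInPip10Warning,
    )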
site-packages/pip/index.py
"""Routines related to PyPI, indexes"""
from __future__ import absolute_import

import logging
import cgi
from collections import namedtuple
import itertools
import sys
import os
import re
import mimetypes
import posixpath
import warnings

from pip._vendor.six.moves.urllib import parse as urllib_parse
from pip._vendor.six.moves.urllib import request as urllib_request

from pip.compat import ipaddress
from pip.utils import (
    cached_property, splitext, normalize_path,
    ARCHIVE_EXTENSIONS, SUPPORTED_EXTENSIONS,
)
from pip.utils.deprecation import RemovedInPip10Warning
from pip.utils.logging import indent_log
from pip.utils.packaging import check_requires_python
from pip.exceptions import (
    DistributionNotFound, BestVersionAlreadyInstalled, InvalidWheelFilename,
    UnsupportedWheel,
)
from pip.download import HAS_TLS, is_url, path_to_url, url_to_path
from pip.wheel import Wheel, wheel_ext
from pip.pep425tags import get_supported
from pip._vendor import html5lib, requests, six
from pip._vendor.packaging.version import parse as parse_version
from pip._vendor.packaging.utils import canonicalize_name
from pip._vendor.packaging import specifiers
from pip._vendor.requests.exceptions import SSLError
from pip._vendor.distlib.compat import unescape


__all__ = ['FormatControl', 'fmt_ctl_handle_mutual_exclude', 'PackageFinder']


SECURE_ORIGINS = [
    # protocol, hostname, port
    # Taken from Chrome's list of secure origins (See: http://bit.ly/1qrySKC)
    ("https", "*", "*"),
    ("*", "localhost", "*"),
    ("*", "127.0.0.0/8", "*"),
    ("*", "::1/128", "*"),
    ("file", "*", None),
    # ssh is always secure.
    ("ssh", "*", "*"),
]


logger = logging.getLogger(__name__)


class InstallationCandidate(object):

    def __init__(self, project, version, location):
        self.project = project
        self.version = parse_version(version)
        self.location = location
        self._key = (self.project, self.version, self.location)

    def __repr__(self):
        return "<InstallationCandidate({0!r}, {1!r}, {2!r})>".format(
            self.project, self.version, self.location,
        )

    def __hash__(self):
        return hash(self._key)

    def __lt__(self, other):
        return self._compare(other, lambda s, o: s < o)

    def __le__(self, other):
        return self._compare(other, lambda s, o: s <= o)

    def __eq__(self, other):
        return self._compare(other, lambda s, o: s == o)

    def __ge__(self, other):
        return self._compare(other, lambda s, o: s >= o)

    def __gt__(self, other):
        return self._compare(other, lambda s, o: s > o)

    def __ne__(self, other):
        return self._compare(other, lambda s, o: s != o)

    def _compare(self, other, method):
        if not isinstance(other, InstallationCandidate):
            return NotImplemented

        return method(self._key, other._key)
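

# Illustrative sketch (not part of pip): candidates compare by their
# (project, parsed version, location) key, so max() picks the highest
# version for a project and "1.10" beats "1.2". The URLs are placeholders.
def _example_candidate_ordering():
    candidates = [
        InstallationCandidate("example", "1.0", "https://host/example-1.0.tar.gz"),
        InstallationCandidate("example", "1.10", "https://host/example-1.10.tar.gz"),
        InstallationCandidate("example", "1.2", "https://host/example-1.2.tar.gz"),
    ]
    return max(candidates).version  # == parse_version("1.10")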


class PackageFinder(object):
    """This finds packages.

    This is meant to match easy_install's technique for looking for
    packages, by reading pages and looking for appropriate links.
    """

    def __init__(self, find_links, index_urls, allow_all_prereleases=False,
                 trusted_hosts=None, process_dependency_links=False,
                 session=None, format_control=None, platform=None,
                 versions=None, abi=None, implementation=None):
        """Create a PackageFinder.

        :param format_control: A FormatControl object or None. Used to control
            the selection of source packages / binary packages when consulting
            the index and links.
        :param platform: A string or None. If None, searches for packages
            that are supported by the current system. Otherwise, will find
            packages that can be built on the platform passed in. These
            packages will only be downloaded for distribution: they will
            not be built locally.
        :param versions: A list of strings or None. This is passed directly
            to pep425tags.py in the get_supported() method.
        :param abi: A string or None. This is passed directly
            to pep425tags.py in the get_supported() method.
        :param implementation: A string or None. This is passed directly
            to pep425tags.py in the get_supported() method.
        """
        if session is None:
            raise TypeError(
                "PackageFinder() missing 1 required keyword argument: "
                "'session'"
            )

        # Build find_links. If an argument starts with ~, it may be
        # a local file relative to a home directory. So try normalizing
        # it and if it exists, use the normalized version.
        # This is deliberately conservative - it might be fine just to
        # blindly normalize anything starting with a ~...
        self.find_links = []
        for link in find_links:
            if link.startswith('~'):
                new_link = normalize_path(link)
                if os.path.exists(new_link):
                    link = new_link
            self.find_links.append(link)

        self.index_urls = index_urls
        self.dependency_links = []

        # These are boring links that have already been logged somehow:
        self.logged_links = set()

        self.format_control = format_control or FormatControl(set(), set())

        # Domains that we won't emit warnings for when not using HTTPS
        self.secure_origins = [
            ("*", host, "*")
            for host in (trusted_hosts if trusted_hosts else [])
        ]

        # Do we want to allow _all_ pre-releases?
        self.allow_all_prereleases = allow_all_prereleases

        # Do we process dependency links?
        self.process_dependency_links = process_dependency_links

        # The Session we'll use to make requests
        self.session = session

        # The valid tags to check potential found wheel candidates against
        self.valid_tags = get_supported(
            versions=versions,
            platform=platform,
            abi=abi,
            impl=implementation,
        )

        # If we don't have TLS enabled, then WARN if anyplace we're looking
        # relies on TLS.
        if not HAS_TLS:
            for link in itertools.chain(self.index_urls, self.find_links):
                parsed = urllib_parse.urlparse(link)
                if parsed.scheme == "https":
                    logger.warning(
                        "pip is configured with locations that require "
                        "TLS/SSL, however the ssl module in Python is not "
                        "available."
                    )
                    break

    def add_dependency_links(self, links):
        # # FIXME: this shouldn't be a global list; it should only
        # # apply to requirements of the package that specifies the
        # # dependency_links value
        # # FIXME: also, we should track comes_from (i.e., use Link)
        if self.process_dependency_links:
            warnings.warn(
                "Dependency Links processing has been deprecated and will be "
                "removed in a future release.",
                RemovedInPip10Warning,
            )
            self.dependency_links.extend(links)

    @staticmethod
    def _sort_locations(locations, expand_dir=False):
        """
        Sort locations into "files" (archives) and "urls", and return
        a pair of lists (files,urls)
        """
        files = []
        urls = []

        # puts the url for the given file path into the appropriate list
        def sort_path(path):
            url = path_to_url(path)
            if mimetypes.guess_type(url, strict=False)[0] == 'text/html':
                urls.append(url)
            else:
                files.append(url)

        for url in locations:

            is_local_path = os.path.exists(url)
            is_file_url = url.startswith('file:')

            if is_local_path or is_file_url:
                if is_local_path:
                    path = url
                else:
                    path = url_to_path(url)
                if os.path.isdir(path):
                    if expand_dir:
                        path = os.path.realpath(path)
                        for item in os.listdir(path):
                            sort_path(os.path.join(path, item))
                    elif is_file_url:
                        urls.append(url)
                elif os.path.isfile(path):
                    sort_path(path)
                else:
                    logger.warning(
                        "Url '%s' is ignored: it is neither a file "
                        "nor a directory.", url)
            elif is_url(url):
                # Only add url with clear scheme
                urls.append(url)
            else:
                logger.warning(
                    "Url '%s' is ignored. It is either a non-existing "
                    "path or lacks a specific scheme.", url)

        return files, urls

    def _candidate_sort_key(self, candidate):
        """
        Function used to generate link sort key for link tuples.
        The greater the return value, the more preferred it is.
        If not finding wheels, then sorted by version only.
        If finding wheels, then the sort order is by version, then:
          1. existing installs
          2. wheels ordered via Wheel.support_index_min(self.valid_tags)
          3. source archives
        Note: it was considered to embed this logic into the Link
              comparison operators, but then different sdist links
              with the same version would have to be considered equal
        """
        support_num = len(self.valid_tags)
        if candidate.location.is_wheel:
            # can raise InvalidWheelFilename
            wheel = Wheel(candidate.location.filename)
            if not wheel.supported(self.valid_tags):
                raise UnsupportedWheel(
                    "%s is not a supported wheel for this platform. It "
                    "can't be sorted." % wheel.filename
                )
            pri = -(wheel.support_index_min(self.valid_tags))
        else:  # sdist
            pri = -(support_num)
        return (candidate.version, pri)

    def _validate_secure_origin(self, logger, location):
        # Determine if this url used a secure transport mechanism
        parsed = urllib_parse.urlparse(str(location))
        origin = (parsed.scheme, parsed.hostname, parsed.port)

        # The protocol to use to see if the protocol matches.
        # Don't count the repository type as part of the protocol: in
        # cases such as "git+ssh", only use "ssh". (I.e., Only verify against
        # the last scheme.)
        protocol = origin[0].rsplit('+', 1)[-1]

        # Determine if our origin is a secure origin by looking through our
        # hardcoded list of secure origins, as well as any additional ones
        # configured on this PackageFinder instance.
        for secure_origin in (SECURE_ORIGINS + self.secure_origins):
            if protocol != secure_origin[0] and secure_origin[0] != "*":
                continue

            try:
                # We need to do this decode dance to ensure that we have a
                # unicode object, even on Python 2.x.
                addr = ipaddress.ip_address(
                    origin[1]
                    if (
                        isinstance(origin[1], six.text_type) or
                        origin[1] is None
                    )
                    else origin[1].decode("utf8")
                )
                network = ipaddress.ip_network(
                    secure_origin[1]
                    if isinstance(secure_origin[1], six.text_type)
                    else secure_origin[1].decode("utf8")
                )
            except ValueError:
                # We don't have both a valid address and a valid network, so
                # we'll check this origin against hostnames.
                if (origin[1] and
                        origin[1].lower() != secure_origin[1].lower() and
                        secure_origin[1] != "*"):
                    continue
            else:
                # We have a valid address and network, so see if the address
                # is contained within the network.
                if addr not in network:
                    continue

            # Check to see if the port matches
            if (origin[2] != secure_origin[2] and
                    secure_origin[2] != "*" and
                    secure_origin[2] is not None):
                continue

            # If we've gotten here, then this origin matches the current
            # secure origin and we should return True
            return True

        # If we've gotten to this point, then the origin isn't secure and we
        # will not accept it as a valid location to search. We will however
        # log a warning that we are ignoring it.
        logger.warning(
            "The repository located at %s is not a trusted or secure host and "
            "is being ignored. If this repository is available via HTTPS it "
            "is recommended to use HTTPS instead, otherwise you may silence "
            "this warning and allow it anyways with '--trusted-host %s'.",
            parsed.hostname,
            parsed.hostname,
        )

        return False

    def _get_index_urls_locations(self, project_name):
        """Returns the locations found via self.index_urls

        Checks the url_name on the main (first in the list) index and
        uses this url_name to produce all locations
        """

        def mkurl_pypi_url(url):
            loc = posixpath.join(
                url,
                urllib_parse.quote(canonicalize_name(project_name)))
            # For maximum compatibility with easy_install, ensure the path
            # ends in a trailing slash.  Although this isn't in the spec
            # (and PyPI can handle it without the slash) some other index
            # implementations might break if they relied on easy_install's
            # behavior.
            if not loc.endswith('/'):
                loc = loc + '/'
            return loc

        return [mkurl_pypi_url(url) for url in self.index_urls]

    def find_all_candidates(self, project_name):
        """Find all available InstallationCandidate for project_name

        This checks index_urls, find_links and dependency_links.
        All versions found are returned as an InstallationCandidate list.

        See _link_package_versions for details on which files are accepted
        """
        index_locations = self._get_index_urls_locations(project_name)
        index_file_loc, index_url_loc = self._sort_locations(index_locations)
        fl_file_loc, fl_url_loc = self._sort_locations(
            self.find_links, expand_dir=True)
        dep_file_loc, dep_url_loc = self._sort_locations(self.dependency_links)

        file_locations = (
            Link(url) for url in itertools.chain(
                index_file_loc, fl_file_loc, dep_file_loc)
        )

        # We trust every url that the user has given us whether it was given
        #   via --index-url or --find-links
        # We explicitly do not trust links that came from dependency_links
        # We want to filter out any thing which does not have a secure origin.
        url_locations = [
            link for link in itertools.chain(
                (Link(url) for url in index_url_loc),
                (Link(url) for url in fl_url_loc),
                (Link(url) for url in dep_url_loc),
            )
            if self._validate_secure_origin(logger, link)
        ]

        logger.debug('%d location(s) to search for versions of %s:',
                     len(url_locations), project_name)

        for location in url_locations:
            logger.debug('* %s', location)

        canonical_name = canonicalize_name(project_name)
        formats = fmt_ctl_formats(self.format_control, canonical_name)
        search = Search(project_name, canonical_name, formats)
        find_links_versions = self._package_versions(
            # We trust every directly linked archive in find_links
            (Link(url, '-f') for url in self.find_links),
            search
        )

        page_versions = []
        for page in self._get_pages(url_locations, project_name):
            logger.debug('Analyzing links from page %s', page.url)
            with indent_log():
                page_versions.extend(
                    self._package_versions(page.links, search)
                )

        dependency_versions = self._package_versions(
            (Link(url) for url in self.dependency_links), search
        )
        if dependency_versions:
            logger.debug(
                'dependency_links found: %s',
                ', '.join([
                    version.location.url for version in dependency_versions
                ])
            )

        file_versions = self._package_versions(file_locations, search)
        if file_versions:
            file_versions.sort(reverse=True)
            logger.debug(
                'Local files found: %s',
                ', '.join([
                    url_to_path(candidate.location.url)
                    for candidate in file_versions
                ])
            )

        # This is an intentional priority ordering
        return (
            file_versions + find_links_versions + page_versions +
            dependency_versions
        )

    def find_requirement(self, req, upgrade):
        """Try to find a Link matching req

        Expects req, an InstallRequirement and upgrade, a boolean
        Returns a Link if found,
        Raises DistributionNotFound or BestVersionAlreadyInstalled otherwise
        """
        all_candidates = self.find_all_candidates(req.name)

        # Filter out anything which doesn't match our specifier
        compatible_versions = set(
            req.specifier.filter(
                # We turn the version object into a str here because otherwise
                # when we're debundled but setuptools isn't, Python will see
                # packaging.version.Version and
                # pkg_resources._vendor.packaging.version.Version as different
                # types. This way we'll use a str as a common data interchange
                # format. If we stop using the pkg_resources provided specifier
                # and start using our own, we can drop the cast to str().
                [str(c.version) for c in all_candidates],
                prereleases=(
                    self.allow_all_prereleases
                    if self.allow_all_prereleases else None
                ),
            )
        )
        applicable_candidates = [
            # Again, converting to str to deal with debundling.
            c for c in all_candidates if str(c.version) in compatible_versions
        ]

        if applicable_candidates:
            best_candidate = max(applicable_candidates,
                                 key=self._candidate_sort_key)
            # If we cannot find a non-yanked candidate,
            # use the best one and print a warning about it.
            # Otherwise, try to find another best candidate, ignoring
            # all the yanked releases.
            if getattr(best_candidate.location, "yanked", False):
                nonyanked_candidates = [
                    c for c in applicable_candidates
                    if not getattr(c.location, "yanked", False)
                ]

                if set(nonyanked_candidates):
                    best_candidate = max(nonyanked_candidates,
                                         key=self._candidate_sort_key)
                else:
                    warning_message = (
                        "WARNING: The candidate selected for download or install "
                        "is a yanked version: '{}' candidate (version {} at {})"
                    ).format(best_candidate.project, best_candidate.version, best_candidate.location)
                    if best_candidate.location.yanked_reason:
                        warning_message += "\nReason for being yanked: {}".format(best_candidate.location.yanked_reason)
                    logger.warning(warning_message)
        else:
            best_candidate = None

        if req.satisfied_by is not None:
            installed_version = parse_version(req.satisfied_by.version)
        else:
            installed_version = None

        if installed_version is None and best_candidate is None:
            logger.critical(
                'Could not find a version that satisfies the requirement %s '
                '(from versions: %s)',
                req,
                ', '.join(
                    sorted(
                        set(str(c.version) for c in all_candidates),
                        key=parse_version,
                    )
                )
            )

            raise DistributionNotFound(
                'No matching distribution found for %s' % req
            )

        best_installed = False
        if installed_version and (
                best_candidate is None or
                best_candidate.version <= installed_version):
            best_installed = True

        if not upgrade and installed_version is not None:
            if best_installed:
                logger.debug(
                    'Existing installed version (%s) is most up-to-date and '
                    'satisfies requirement',
                    installed_version,
                )
            else:
                logger.debug(
                    'Existing installed version (%s) satisfies requirement '
                    '(most up-to-date version is %s)',
                    installed_version,
                    best_candidate.version,
                )
            return None

        if best_installed:
            # We have an existing version, and it's the best version
            logger.debug(
                'Installed version (%s) is most up-to-date (past versions: '
                '%s)',
                installed_version,
                ', '.join(sorted(compatible_versions, key=parse_version)) or
                "none",
            )
            raise BestVersionAlreadyInstalled

        logger.debug(
            'Using version %s (newest of versions: %s)',
            best_candidate.version,
            ', '.join(sorted(compatible_versions, key=parse_version))
        )
        return best_candidate.location

    def _get_pages(self, locations, project_name):
        """
        Yields HTMLPage objects for the given locations, skipping
        locations that have errors.
        """
        seen = set()
        for location in locations:
            if location in seen:
                continue
            seen.add(location)

            page = self._get_page(location)
            if page is None:
                continue

            yield page

    _py_version_re = re.compile(r'-py([123]\.?[0-9]?)$')

    def _sort_links(self, links):
        """
        Returns elements of links in order, non-egg links first, egg links
        second, while eliminating duplicates
        """
        eggs, no_eggs = [], []
        seen = set()
        for link in links:
            if link not in seen:
                seen.add(link)
                if link.egg_fragment:
                    eggs.append(link)
                else:
                    no_eggs.append(link)
        return no_eggs + eggs

    def _package_versions(self, links, search):
        result = []
        for link in self._sort_links(links):
            v = self._link_package_versions(link, search)
            if v is not None:
                result.append(v)
        return result

    def _log_skipped_link(self, link, reason):
        if link not in self.logged_links:
            logger.debug('Skipping link %s; %s', link, reason)
            self.logged_links.add(link)

    def _link_package_versions(self, link, search):
        """Return an InstallationCandidate or None"""
        version = None
        if link.egg_fragment:
            egg_info = link.egg_fragment
            ext = link.ext
        else:
            egg_info, ext = link.splitext()
            if not ext:
                self._log_skipped_link(link, 'not a file')
                return
            if ext not in SUPPORTED_EXTENSIONS:
                self._log_skipped_link(
                    link, 'unsupported archive format: %s' % ext)
                return
            if "binary" not in search.formats and ext == wheel_ext:
                self._log_skipped_link(
                    link, 'No binaries permitted for %s' % search.supplied)
                return
            if "macosx10" in link.path and ext == '.zip':
                self._log_skipped_link(link, 'macosx10 one')
                return
            if ext == wheel_ext:
                try:
                    wheel = Wheel(link.filename)
                except InvalidWheelFilename:
                    self._log_skipped_link(link, 'invalid wheel filename')
                    return
                if canonicalize_name(wheel.name) != search.canonical:
                    self._log_skipped_link(
                        link, 'wrong project name (not %s)' % search.supplied)
                    return

                if not wheel.supported(self.valid_tags):
                    self._log_skipped_link(
                        link, 'it is not compatible with this Python')
                    return

                version = wheel.version

        # This should be up by the search.ok_binary check, but see issue 2700.
        if "source" not in search.formats and ext != wheel_ext:
            self._log_skipped_link(
                link, 'No sources permitted for %s' % search.supplied)
            return

        if not version:
            version = egg_info_matches(egg_info, search.supplied, link)
        if version is None:
            self._log_skipped_link(
                link, 'wrong project name (not %s)' % search.supplied)
            return

        match = self._py_version_re.search(version)
        if match:
            version = version[:match.start()]
            py_version = match.group(1)
            if py_version != sys.version[:3]:
                self._log_skipped_link(
                    link, 'Python version is incorrect')
                return
        try:
            support_this_python = check_requires_python(link.requires_python)
        except specifiers.InvalidSpecifier:
            logger.debug("Package %s has an invalid Requires-Python entry: %s",
                         link.filename, link.requires_python)
            support_this_python = True

        if not support_this_python:
            logger.debug("The package %s is incompatible with the python "
                         "version in use. Acceptable python versions are: %s",
                         link, link.requires_python)
            return
        logger.debug('Found link %s, version: %s', link, version)

        return InstallationCandidate(search.supplied, version, link)

    def _get_page(self, link):
        return HTMLPage.get_page(link, session=self.session)


def egg_info_matches(
        egg_info, search_name, link,
        _egg_info_re=re.compile(r'([a-z0-9_.]+)-([a-z0-9_.!+-]+)', re.I)):
    """Pull the version part out of a string.

    :param egg_info: The string to parse. E.g. foo-2.1
    :param search_name: The name of the package this belongs to. None to
        infer the name. Note that this cannot unambiguously parse strings
        like foo-2-2 which might be foo, 2-2 or foo-2, 2.
    :param link: The link the string came from, for logging on failure.
    """
    match = _egg_info_re.search(egg_info)
    if not match:
        logger.debug('Could not parse version from link: %s', link)
        return None
    if search_name is None:
        full_match = match.group(0)
        return full_match[full_match.index('-'):]
    name = match.group(0).lower()
    # To match the "safe" name that pkg_resources creates:
    name = name.replace('_', '-')
    # project name and version must be separated by a dash
    look_for = search_name.lower() + "-"
    if name.startswith(look_for):
        return match.group(0)[len(look_for):]
    else:
        return None
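

# Illustrative sketch (not part of pip): what egg_info_matches() extracts.
# The link argument is only used for logging a parse failure, so None is
# passed here; the package names are made up.
def _example_egg_info_matches():
    assert egg_info_matches("foo-2.1", "foo", None) == "2.1"
    # Underscores are normalised to dashes before the name comparison.
    assert egg_info_matches("foo_bar-0.3", "foo-bar", None) == "0.3"
    # A fragment that does not start with the searched name yields None.
    assert egg_info_matches("other-1.0", "foo", None) is None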


class HTMLPage(object):
    """Represents one page, along with its URL"""

    def __init__(self, content, url, headers=None):
        # Determine if we have any encoding information in our headers
        encoding = None
        if headers and "Content-Type" in headers:
            content_type, params = cgi.parse_header(headers["Content-Type"])

            if "charset" in params:
                encoding = params['charset']

        self.content = content
        self.parsed = html5lib.parse(
            self.content,
            transport_encoding=encoding,
            namespaceHTMLElements=False,
        )
        self.url = url
        self.headers = headers

    def __str__(self):
        return self.url

    @classmethod
    def get_page(cls, link, skip_archives=True, session=None):
        if session is None:
            raise TypeError(
                "get_page() missing 1 required keyword argument: 'session'"
            )

        url = link.url
        url = url.split('#', 1)[0]

        # Check for VCS schemes that do not support lookup as web pages.
        from pip.vcs import VcsSupport
        for scheme in VcsSupport.schemes:
            if url.lower().startswith(scheme) and url[len(scheme)] in '+:':
                logger.debug('Cannot look at %s URL %s', scheme, link)
                return None

        try:
            if skip_archives:
                filename = link.filename
                for bad_ext in ARCHIVE_EXTENSIONS:
                    if filename.endswith(bad_ext):
                        content_type = cls._get_content_type(
                            url, session=session,
                        )
                        if content_type.lower().startswith('text/html'):
                            break
                        else:
                            logger.debug(
                                'Skipping page %s because of Content-Type: %s',
                                link,
                                content_type,
                            )
                            return

            logger.debug('Getting page %s', url)

            # Tack index.html onto file:// URLs that point to directories
            (scheme, netloc, path, params, query, fragment) = \
                urllib_parse.urlparse(url)
            if (scheme == 'file' and
                    os.path.isdir(urllib_request.url2pathname(path))):
                # add trailing slash if not present so urljoin doesn't trim
                # final segment
                if not url.endswith('/'):
                    url += '/'
                url = urllib_parse.urljoin(url, 'index.html')
                logger.debug(' file: URL is directory, getting %s', url)

            resp = session.get(
                url,
                headers={
                    "Accept": "text/html",
                    "Cache-Control": "max-age=600",
                },
            )
            resp.raise_for_status()

            # The check for archives above only works if the url ends with
            # something that looks like an archive. However that is not a
            # requirement of an url. Unless we issue a HEAD request on every
            # url we cannot know ahead of time for sure if something is HTML
            # or not. However we can check after we've downloaded it.
            content_type = resp.headers.get('Content-Type', 'unknown')
            if not content_type.lower().startswith("text/html"):
                logger.debug(
                    'Skipping page %s because of Content-Type: %s',
                    link,
                    content_type,
                )
                return

            inst = cls(resp.content, resp.url, resp.headers)
        except requests.HTTPError as exc:
            cls._handle_fail(link, exc, url)
        except SSLError as exc:
            reason = ("There was a problem confirming the ssl certificate: "
                      "%s" % exc)
            cls._handle_fail(link, reason, url, meth=logger.info)
        except requests.ConnectionError as exc:
            cls._handle_fail(link, "connection error: %s" % exc, url)
        except requests.Timeout:
            cls._handle_fail(link, "timed out", url)
        else:
            return inst

    @staticmethod
    def _handle_fail(link, reason, url, meth=None):
        if meth is None:
            meth = logger.debug

        meth("Could not fetch URL %s: %s - skipping", link, reason)

    @staticmethod
    def _get_content_type(url, session):
        """Get the Content-Type of the given url, using a HEAD request"""
        scheme, netloc, path, query, fragment = urllib_parse.urlsplit(url)
        if scheme not in ('http', 'https'):
            # FIXME: some warning or something?
            # assertion error?
            return ''

        resp = session.head(url, allow_redirects=True)
        resp.raise_for_status()

        return resp.headers.get("Content-Type", "")

    @cached_property
    def base_url(self):
        bases = [
            x for x in self.parsed.findall(".//base")
            if x.get("href") is not None
        ]
        if bases and bases[0].get("href"):
            return bases[0].get("href")
        else:
            return self.url

    @property
    def links(self):
        """Yields all links in the page"""
        for anchor in self.parsed.findall(".//a"):
            if anchor.get("href"):
                href = anchor.get("href")
                url = self.clean_link(
                    urllib_parse.urljoin(self.base_url, href)
                )
                pyrequire = anchor.get('data-requires-python')
                pyrequire = unescape(pyrequire) if pyrequire else None
                yanked_reason = anchor.get('data-yanked', default=None)
                # Empty or valueless attribute are both parsed as empty string
                if yanked_reason is not None:
                    yanked_reason = unescape(yanked_reason)
                yield Link(url, self, requires_python=pyrequire, yanked_reason=yanked_reason)

    _clean_re = re.compile(r'[^a-z0-9$&+,/:;=?@.#%_\\|-]', re.I)

    def clean_link(self, url):
        """Makes sure a link is fully encoded.  That is, if a ' ' shows up in
        the link, it will be rewritten to %20 (while not over-quoting
        % or other characters)."""
        return self._clean_re.sub(
            lambda match: '%%%2x' % ord(match.group(0)), url)
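

# Illustrative sketch (not part of pip): clean_link() percent-encodes only
# characters outside its allow-list, so a space becomes %20 while an already
# escaped sequence is left alone. The page content and URLs are placeholders.
def _example_clean_link():
    page = HTMLPage(b"<html></html>", "https://host/simple/example/")
    assert (page.clean_link("https://host/some file.tar.gz") ==
            "https://host/some%20file.tar.gz")
    assert (page.clean_link("https://host/already%20clean") ==
            "https://host/already%20clean")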


class Link(object):

    def __init__(self, url, comes_from=None, requires_python=None, yanked_reason=None):
        """
        Object representing a parsed link from https://pypi.python.org/simple/*

        url:
            url of the resource pointed to (href of the link)
        comes_from:
            instance of HTMLPage where the link was found, or string.
        requires_python:
            String containing the `Requires-Python` metadata field, specified
            in PEP 345. This may be specified by a data-requires-python
            attribute in the HTML link tag, as described in PEP 503.
        yanked_reason:
            The reason the file was yanked, if any, taken from the
            data-yanked attribute (PEP 592). None if the file is not yanked.
        """

        # url can be a UNC windows share
        if url.startswith('\\\\'):
            url = path_to_url(url)

        self.url = url
        self.comes_from = comes_from
        self.requires_python = requires_python if requires_python else None
        self.yanked_reason = yanked_reason
        self.yanked = yanked_reason is not None

    def __str__(self):
        if self.requires_python:
            rp = ' (requires-python:%s)' % self.requires_python
        else:
            rp = ''
        if self.comes_from:
            return '%s (from %s)%s' % (self.url, self.comes_from, rp)
        else:
            return str(self.url)

    def __repr__(self):
        return '<Link %s>' % self

    def __eq__(self, other):
        if not isinstance(other, Link):
            return NotImplemented
        return self.url == other.url

    def __ne__(self, other):
        if not isinstance(other, Link):
            return NotImplemented
        return self.url != other.url

    def __lt__(self, other):
        if not isinstance(other, Link):
            return NotImplemented
        return self.url < other.url

    def __le__(self, other):
        if not isinstance(other, Link):
            return NotImplemented
        return self.url <= other.url

    def __gt__(self, other):
        if not isinstance(other, Link):
            return NotImplemented
        return self.url > other.url

    def __ge__(self, other):
        if not isinstance(other, Link):
            return NotImplemented
        return self.url >= other.url

    def __hash__(self):
        return hash(self.url)

    @property
    def filename(self):
        _, netloc, path, _, _ = urllib_parse.urlsplit(self.url)
        name = posixpath.basename(path.rstrip('/')) or netloc
        name = urllib_parse.unquote(name)
        assert name, ('URL %r produced no filename' % self.url)
        return name

    @property
    def scheme(self):
        return urllib_parse.urlsplit(self.url)[0]

    @property
    def netloc(self):
        return urllib_parse.urlsplit(self.url)[1]

    @property
    def path(self):
        return urllib_parse.unquote(urllib_parse.urlsplit(self.url)[2])

    def splitext(self):
        return splitext(posixpath.basename(self.path.rstrip('/')))

    @property
    def ext(self):
        return self.splitext()[1]

    @property
    def url_without_fragment(self):
        scheme, netloc, path, query, fragment = urllib_parse.urlsplit(self.url)
        return urllib_parse.urlunsplit((scheme, netloc, path, query, None))

    _egg_fragment_re = re.compile(r'[#&]egg=([^&]*)')

    @property
    def egg_fragment(self):
        match = self._egg_fragment_re.search(self.url)
        if not match:
            return None
        return match.group(1)

    _subdirectory_fragment_re = re.compile(r'[#&]subdirectory=([^&]*)')

    @property
    def subdirectory_fragment(self):
        match = self._subdirectory_fragment_re.search(self.url)
        if not match:
            return None
        return match.group(1)

    _hash_re = re.compile(
        r'(sha1|sha224|sha384|sha256|sha512|md5)=([a-f0-9]+)'
    )

    @property
    def hash(self):
        match = self._hash_re.search(self.url)
        if match:
            return match.group(2)
        return None

    @property
    def hash_name(self):
        match = self._hash_re.search(self.url)
        if match:
            return match.group(1)
        return None

    @property
    def show_url(self):
        return posixpath.basename(self.url.split('#', 1)[0].split('?', 1)[0])

    @property
    def is_wheel(self):
        return self.ext == wheel_ext

    @property
    def is_artifact(self):
        """
        Determines if this points to an actual artifact (e.g. a tarball) or if
        it points to an "abstract" thing like a path or a VCS location.
        """
        from pip.vcs import vcs

        if self.scheme in vcs.all_schemes:
            return False

        return True


FormatControl = namedtuple('FormatControl', 'no_binary only_binary')
"""This object has two fields, no_binary and only_binary.

If a field is falsy, it isn't set. If it is {':all:'}, it should match all
packages except those listed in the other field. Only one field can be set
to {':all:'} at a time. The rest of the time exact package name matches
are listed, with any given package only showing up in one field at a time.
"""


def fmt_ctl_handle_mutual_exclude(value, target, other):
    new = value.split(',')
    while ':all:' in new:
        other.clear()
        target.clear()
        target.add(':all:')
        del new[:new.index(':all:') + 1]
        if ':none:' not in new:
            # Without a none, we want to discard everything as :all: covers it
            return
    for name in new:
        if name == ':none:':
            target.clear()
            continue
        name = canonicalize_name(name)
        other.discard(name)
        target.add(name)


def fmt_ctl_formats(fmt_ctl, canonical_name):
    result = set(["binary", "source"])
    if canonical_name in fmt_ctl.only_binary:
        result.discard('source')
    elif canonical_name in fmt_ctl.no_binary:
        result.discard('binary')
    elif ':all:' in fmt_ctl.only_binary:
        result.discard('source')
    elif ':all:' in fmt_ctl.no_binary:
        result.discard('binary')
    return frozenset(result)
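

# Illustrative sketch (not part of pip): how the two helpers above interact
# for a command line like "--no-binary pkg-a --only-binary pkg-b". The
# package names are made up.
def _example_format_control():
    fmt_ctl = FormatControl(no_binary=set(), only_binary=set())
    fmt_ctl_handle_mutual_exclude("pkg-a", fmt_ctl.no_binary, fmt_ctl.only_binary)
    fmt_ctl_handle_mutual_exclude("pkg-b", fmt_ctl.only_binary, fmt_ctl.no_binary)
    assert fmt_ctl_formats(fmt_ctl, "pkg-a") == frozenset(["source"])
    assert fmt_ctl_formats(fmt_ctl, "pkg-b") == frozenset(["binary"])
    assert fmt_ctl_formats(fmt_ctl, "pkg-c") == frozenset(["binary", "source"])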


def fmt_ctl_no_binary(fmt_ctl):
    fmt_ctl_handle_mutual_exclude(
        ':all:', fmt_ctl.no_binary, fmt_ctl.only_binary)


def fmt_ctl_no_use_wheel(fmt_ctl):
    fmt_ctl_no_binary(fmt_ctl)
    warnings.warn(
        '--no-use-wheel is deprecated and will be removed in the future. '
        ' Please use --no-binary :all: instead.', RemovedInPip10Warning,
        stacklevel=2)


Search = namedtuple('Search', 'supplied canonical formats')
"""Capture key aspects of a search.

:attribute supplied: The user supplied package.
:attribute canonical: The canonical package name.
:attribute formats: The formats allowed for this package. Should be a set
    with 'binary' or 'source' or both in it.
"""
site-packages/pip/basecommand.py
"""Base Command class, and related routines"""
from __future__ import absolute_import

import logging
import os
import sys
import optparse
import warnings

from pip import cmdoptions
from pip.index import PackageFinder
from pip.locations import running_under_virtualenv
from pip.download import PipSession
from pip.exceptions import (BadCommand, InstallationError, UninstallationError,
                            CommandError, PreviousBuildDirError)

from pip.compat import logging_dictConfig
from pip.baseparser import ConfigOptionParser, UpdatingDefaultsHelpFormatter
from pip.req import InstallRequirement, parse_requirements
from pip.status_codes import (
    SUCCESS, ERROR, UNKNOWN_ERROR, VIRTUALENV_NOT_FOUND,
    PREVIOUS_BUILD_DIR_ERROR,
)
from pip.utils import deprecation, get_prog, normalize_path
from pip.utils.logging import IndentingFormatter
from pip.utils.outdated import pip_version_check


__all__ = ['Command']


logger = logging.getLogger(__name__)


class Command(object):
    name = None
    usage = None
    hidden = False
    log_streams = ("ext://sys.stdout", "ext://sys.stderr")

    def __init__(self, isolated=False):
        parser_kw = {
            'usage': self.usage,
            'prog': '%s %s' % (get_prog(), self.name),
            'formatter': UpdatingDefaultsHelpFormatter(),
            'add_help_option': False,
            'name': self.name,
            'description': self.__doc__,
            'isolated': isolated,
        }

        self.parser = ConfigOptionParser(**parser_kw)

        # Commands should add options to this option group
        optgroup_name = '%s Options' % self.name.capitalize()
        self.cmd_opts = optparse.OptionGroup(self.parser, optgroup_name)

        # Add the general options
        gen_opts = cmdoptions.make_option_group(
            cmdoptions.general_group,
            self.parser,
        )
        self.parser.add_option_group(gen_opts)

    def _build_session(self, options, retries=None, timeout=None):
        session = PipSession(
            cache=(
                normalize_path(os.path.join(options.cache_dir, "http"))
                if options.cache_dir else None
            ),
            retries=retries if retries is not None else options.retries,
            insecure_hosts=options.trusted_hosts,
        )

        # Handle custom ca-bundles from the user
        if options.cert:
            session.verify = options.cert

        # Handle SSL client certificate
        if options.client_cert:
            session.cert = options.client_cert

        # Handle timeouts
        if options.timeout or timeout:
            session.timeout = (
                timeout if timeout is not None else options.timeout
            )

        # Handle configured proxies
        if options.proxy:
            session.proxies = {
                "http": options.proxy,
                "https": options.proxy,
            }

        # Determine if we can prompt the user for authentication or not
        session.auth.prompting = not options.no_input

        return session
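
    # Hedged usage sketch (not pip's own code): a concrete command would
    # typically build the session from the parsed options inside run(), e.g.
    #
    #     with self._build_session(options) as session:
    #         ...  # talk to the index through `session`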

    def parse_args(self, args):
        # factored out for testability
        return self.parser.parse_args(args)

    def main(self, args):
        options, args = self.parse_args(args)

        if options.quiet:
            if options.quiet == 1:
                level = "WARNING"
            elif options.quiet == 2:
                level = "ERROR"
            else:
                level = "CRITICAL"
        elif options.verbose:
            level = "DEBUG"
        else:
            level = "INFO"

        # The root logger should match the "console" level *unless* we
        # specified "--log" to send debug logs to a file.
        root_level = level
        if options.log:
            root_level = "DEBUG"

        logging_dictConfig({
            "version": 1,
            "disable_existing_loggers": False,
            "filters": {
                "exclude_warnings": {
                    "()": "pip.utils.logging.MaxLevelFilter",
                    "level": logging.WARNING,
                },
            },
            "formatters": {
                "indent": {
                    "()": IndentingFormatter,
                    "format": "%(message)s",
                },
            },
            "handlers": {
                "console": {
                    "level": level,
                    "class": "pip.utils.logging.ColorizedStreamHandler",
                    "stream": self.log_streams[0],
                    "filters": ["exclude_warnings"],
                    "formatter": "indent",
                },
                "console_errors": {
                    "level": "WARNING",
                    "class": "pip.utils.logging.ColorizedStreamHandler",
                    "stream": self.log_streams[1],
                    "formatter": "indent",
                },
                "user_log": {
                    "level": "DEBUG",
                    "class": "pip.utils.logging.BetterRotatingFileHandler",
                    "filename": options.log or "/dev/null",
                    "delay": True,
                    "formatter": "indent",
                },
            },
            "root": {
                "level": root_level,
                "handlers": list(filter(None, [
                    "console",
                    "console_errors",
                    "user_log" if options.log else None,
                ])),
            },
            # Disable any logging besides WARNING unless we have DEBUG level
            # logging enabled. These use both pip._vendor and the bare names
            # for the case where someone unbundles our libraries.
            "loggers": dict(
                (
                    name,
                    {
                        "level": (
                            "WARNING"
                            if level in ["INFO", "ERROR"]
                            else "DEBUG"
                        ),
                    },
                )
                for name in ["pip._vendor", "distlib", "requests", "urllib3"]
            ),
        })

        if sys.version_info[:2] == (2, 6):
            warnings.warn(
                "Python 2.6 is no longer supported by the Python core team, "
                "please upgrade your Python. A future version of pip will "
                "drop support for Python 2.6",
                deprecation.Python26DeprecationWarning
            )

        # TODO: try to get these passing down from the command?
        #      without resorting to os.environ to hold these.

        if options.no_input:
            os.environ['PIP_NO_INPUT'] = '1'

        if options.exists_action:
            os.environ['PIP_EXISTS_ACTION'] = ' '.join(options.exists_action)

        if options.require_venv:
            # If a venv is required check if it can really be found
            if not running_under_virtualenv():
                logger.critical(
                    'Could not find an activated virtualenv (required).'
                )
                sys.exit(VIRTUALENV_NOT_FOUND)

        try:
            status = self.run(options, args)
            # FIXME: all commands should return an exit status
            # and when it is done, isinstance is not needed anymore
            if isinstance(status, int):
                return status
        except PreviousBuildDirError as exc:
            logger.critical(str(exc))
            logger.debug('Exception information:', exc_info=True)

            return PREVIOUS_BUILD_DIR_ERROR
        except (InstallationError, UninstallationError, BadCommand) as exc:
            logger.critical(str(exc))
            logger.debug('Exception information:', exc_info=True)

            return ERROR
        except CommandError as exc:
            logger.critical('ERROR: %s', exc)
            logger.debug('Exception information:', exc_info=True)

            return ERROR
        except KeyboardInterrupt:
            logger.critical('Operation cancelled by user')
            logger.debug('Exception information:', exc_info=True)

            return ERROR
        except:
            logger.critical('Exception:', exc_info=True)

            return UNKNOWN_ERROR
        finally:
            # Check if we're using the latest version of pip available
            if (not options.disable_pip_version_check and not
                    getattr(options, "no_index", False)):
                with self._build_session(
                        options,
                        retries=0,
                        timeout=min(5, options.timeout)) as session:
                    pip_version_check(session)

        return SUCCESS
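
# Minimal sketch (an assumption for illustration, not a real pip command):
# a concrete subclass only needs to set `name`/`usage` and implement run(),
# returning one of the status codes imported above, e.g.
#
#     class HelloCommand(Command):
#         """Print a greeting."""
#         name = 'hello'
#         usage = '%prog'
#
#         def run(self, options, args):
#             logger.info('hello world')
#             return SUCCESS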


class RequirementCommand(Command):

    @staticmethod
    def populate_requirement_set(requirement_set, args, options, finder,
                                 session, name, wheel_cache):
        """
        Marshal cmd line args into a requirement set.
        """
        for filename in options.constraints:
            for req in parse_requirements(
                    filename,
                    constraint=True, finder=finder, options=options,
                    session=session, wheel_cache=wheel_cache):
                requirement_set.add_requirement(req)

        for req in args:
            requirement_set.add_requirement(
                InstallRequirement.from_line(
                    req, None, isolated=options.isolated_mode,
                    wheel_cache=wheel_cache
                )
            )

        for req in options.editables:
            requirement_set.add_requirement(
                InstallRequirement.from_editable(
                    req,
                    default_vcs=options.default_vcs,
                    isolated=options.isolated_mode,
                    wheel_cache=wheel_cache
                )
            )

        found_req_in_file = False
        for filename in options.requirements:
            for req in parse_requirements(
                    filename,
                    finder=finder, options=options, session=session,
                    wheel_cache=wheel_cache):
                found_req_in_file = True
                requirement_set.add_requirement(req)
        # If --require-hashes was a line in a requirements file, tell
        # RequirementSet about it:
        requirement_set.require_hashes = options.require_hashes

        if not (args or options.editables or found_req_in_file):
            opts = {'name': name}
            if options.find_links:
                msg = ('You must give at least one requirement to '
                       '%(name)s (maybe you meant "pip %(name)s '
                       '%(links)s"?)' %
                       dict(opts, links=' '.join(options.find_links)))
            else:
                msg = ('You must give at least one requirement '
                       'to %(name)s (see "pip help %(name)s")' % opts)
            logger.warning(msg)

    def _build_package_finder(self, options, session,
                              platform=None, python_versions=None,
                              abi=None, implementation=None):
        """
        Create a package finder appropriate to this requirement command.
        """
        index_urls = [options.index_url] + options.extra_index_urls
        if options.no_index:
            logger.debug('Ignoring indexes: %s', ','.join(index_urls))
            index_urls = []

        return PackageFinder(
            find_links=options.find_links,
            format_control=options.format_control,
            index_urls=index_urls,
            trusted_hosts=options.trusted_hosts,
            allow_all_prereleases=options.pre,
            process_dependency_links=options.process_dependency_links,
            session=session,
            platform=platform,
            versions=python_versions,
            abi=abi,
            implementation=implementation,
        )
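
    # Hedged sketch (not pip's own code): a requirement-based command's run()
    # would typically wire these helpers together roughly as
    #
    #     finder = self._build_package_finder(options, session)
    #     self.populate_requirement_set(
    #         requirement_set, args, options, finder, session, self.name,
    #         wheel_cache)
    #
    # where `requirement_set` and `wheel_cache` are assumed to come from
    # pip.req and pip.wheel respectively.
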
site-packages/pip/exceptions.py
"""Exceptions used throughout package"""
from __future__ import absolute_import

from itertools import chain, groupby, repeat

from pip._vendor.six import iteritems


class PipError(Exception):
    """Base pip exception"""


class InstallationError(PipError):
    """General exception during installation"""


class UninstallationError(PipError):
    """General exception during uninstallation"""


class DistributionNotFound(InstallationError):
    """Raised when a distribution cannot be found to satisfy a requirement"""


class RequirementsFileParseError(InstallationError):
    """Raised when a general error occurs parsing a requirements file line."""


class BestVersionAlreadyInstalled(PipError):
    """Raised when the most up-to-date version of a package is already
    installed."""


class BadCommand(PipError):
    """Raised when virtualenv or a command is not found"""


class CommandError(PipError):
    """Raised when there is an error in command-line arguments"""


class PreviousBuildDirError(PipError):
    """Raised when there's a previous conflicting build directory"""


class InvalidWheelFilename(InstallationError):
    """Invalid wheel filename."""


class UnsupportedWheel(InstallationError):
    """Unsupported wheel."""


class HashErrors(InstallationError):
    """Multiple HashError instances rolled into one for reporting"""

    def __init__(self):
        self.errors = []

    def append(self, error):
        self.errors.append(error)

    def __str__(self):
        lines = []
        self.errors.sort(key=lambda e: e.order)
        for cls, errors_of_cls in groupby(self.errors, lambda e: e.__class__):
            lines.append(cls.head)
            lines.extend(e.body() for e in errors_of_cls)
        if lines:
            return '\n'.join(lines)

    def __nonzero__(self):
        return bool(self.errors)

    def __bool__(self):
        return self.__nonzero__()
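
# Illustrative sketch (not pip's own code): callers collect individual
# HashError instances and raise the container only when it is non-empty, e.g.
#
#     errors = HashErrors()
#     errors.append(HashUnpinned())
#     if errors:
#         raise errors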


class HashError(InstallationError):
    """
    A failure to verify a package against known-good hashes

    :cvar order: An int sorting hash exception classes by difficulty of
        recovery (lower being harder), so the user doesn't bother fretting
        about unpinned packages when he has deeper issues, like VCS
        dependencies, to deal with. Also keeps error reports in a
        deterministic order.
    :cvar head: A section heading for display above potentially many
        exceptions of this kind
    :ivar req: The InstallRequirement that triggered this error. This is
        pasted on after the exception is instantiated, because it's not
        typically available earlier.

    """
    req = None
    head = ''

    def body(self):
        """Return a summary of me for display under the heading.

        This default implementation simply prints a description of the
        triggering requirement.

        :param req: The InstallRequirement that provoked this error, with
            populate_link() having already been called

        """
        return '    %s' % self._requirement_name()

    def __str__(self):
        return '%s\n%s' % (self.head, self.body())

    def _requirement_name(self):
        """Return a description of the requirement that triggered me.

        This default implementation returns the long description of the req,
        including line numbers.

        """
        return str(self.req) if self.req else 'unknown package'


class VcsHashUnsupported(HashError):
    """A hash was provided for a version-control-system-based requirement, but
    we don't have a method for hashing those."""

    order = 0
    head = ("Can't verify hashes for these requirements because we don't "
            "have a way to hash version control repositories:")


class DirectoryUrlHashUnsupported(HashError):
    """A hash was provided for a version-control-system-based requirement, but
    we don't have a method for hashing those."""

    order = 1
    head = ("Can't verify hashes for these file:// requirements because they "
            "point to directories:")


class HashMissing(HashError):
    """A hash was needed for a requirement but is absent."""

    order = 2
    head = ('Hashes are required in --require-hashes mode, but they are '
            'missing from some requirements. Here is a list of those '
            'requirements along with the hashes their downloaded archives '
            'actually had. Add lines like these to your requirements files to '
            'prevent tampering. (If you did not enable --require-hashes '
            'manually, note that it turns on automatically when any package '
            'has a hash.)')

    def __init__(self, gotten_hash):
        """
        :param gotten_hash: The hash of the (possibly malicious) archive we
            just downloaded
        """
        self.gotten_hash = gotten_hash

    def body(self):
        from pip.utils.hashes import FAVORITE_HASH  # Dodge circular import.

        package = None
        if self.req:
            # In the case of URL-based requirements, display the original URL
            # seen in the requirements file rather than the package name,
            # so the output can be directly copied into the requirements file.
            package = (self.req.original_link if self.req.original_link
                       # In case someone feeds something downright stupid
                       # to InstallRequirement's constructor.
                       else getattr(self.req, 'req', None))
        return '    %s --hash=%s:%s' % (package or 'unknown package',
                                        FAVORITE_HASH,
                                        self.gotten_hash)


class HashUnpinned(HashError):
    """A requirement had a hash specified but was not pinned to a specific
    version."""

    order = 3
    head = ('In --require-hashes mode, all requirements must have their '
            'versions pinned with ==. These do not:')


class HashMismatch(HashError):
    """
    Distribution file hash values don't match.

    :ivar package_name: The name of the package that triggered the hash
        mismatch. Feel free to write to this after the exception is raised to
        improve its error message.

    """
    order = 4
    head = ('THESE PACKAGES DO NOT MATCH THE HASHES FROM THE REQUIREMENTS '
            'FILE. If you have updated the package versions, please update '
            'the hashes. Otherwise, examine the package contents carefully; '
            'someone may have tampered with them.')

    def __init__(self, allowed, gots):
        """
        :param allowed: A dict of algorithm names pointing to lists of allowed
            hex digests
        :param gots: A dict of algorithm names pointing to hashes we
            actually got from the files under suspicion
        """
        self.allowed = allowed
        self.gots = gots

    def body(self):
        return '    %s:\n%s' % (self._requirement_name(),
                                self._hash_comparison())

    def _hash_comparison(self):
        """
        Return a comparison of actual and expected hash values.

        Example::

               Expected sha256 abcdeabcdeabcdeabcdeabcdeabcdeabcdeabcdeabcde
                            or 123451234512345123451234512345123451234512345
                    Got        bcdefbcdefbcdefbcdefbcdefbcdefbcdefbcdefbcdef

        """
        def hash_then_or(hash_name):
            # For now, all the decent hashes have 6-char names, so we can get
            # away with hard-coding space literals.
            return chain([hash_name], repeat('    or'))

        lines = []
        for hash_name, expecteds in iteritems(self.allowed):
            prefix = hash_then_or(hash_name)
            lines.extend(('        Expected %s %s' % (next(prefix), e))
                         for e in expecteds)
            lines.append('             Got        %s\n' %
                         self.gots[hash_name].hexdigest())
            prefix = '    or'
        return '\n'.join(lines)
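
# Hedged example (not pip's own code): `allowed` maps algorithm names to lists
# of expected hex digests, while `gots` maps them to hash objects exposing
# hexdigest(), e.g.
#
#     import hashlib
#     err = HashMismatch({'sha256': ['deadbeef']},
#                        {'sha256': hashlib.sha256(b'downloaded bytes')})
#     print(err.body())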


class UnsupportedPythonVersion(InstallationError):
    """Unsupported python version according to Requires-Python package
    metadata."""
site-packages/pip/status_codes.py
from __future__ import absolute_import

SUCCESS = 0
ERROR = 1
UNKNOWN_ERROR = 2
VIRTUALENV_NOT_FOUND = 3
PREVIOUS_BUILD_DIR_ERROR = 4
NO_MATCHES_FOUND = 23
site-packages/pip/baseparser.py
"""Base option parser setup"""
from __future__ import absolute_import

import sys
import optparse
import os
import re
import textwrap
from distutils.util import strtobool

from pip._vendor.six import string_types
from pip._vendor.six.moves import configparser
from pip.locations import (
    legacy_config_file, config_basename, running_under_virtualenv,
    site_config_files
)
from pip.utils import appdirs, get_terminal_size


_environ_prefix_re = re.compile(r"^PIP_", re.I)


class PrettyHelpFormatter(optparse.IndentedHelpFormatter):
    """A prettier/less verbose help formatter for optparse."""

    def __init__(self, *args, **kwargs):
        # help position must be aligned with __init__.parseopts.description
        kwargs['max_help_position'] = 30
        kwargs['indent_increment'] = 1
        kwargs['width'] = get_terminal_size()[0] - 2
        optparse.IndentedHelpFormatter.__init__(self, *args, **kwargs)

    def format_option_strings(self, option):
        return self._format_option_strings(option, ' <%s>', ', ')

    def _format_option_strings(self, option, mvarfmt=' <%s>', optsep=', '):
        """
        Return a comma-separated list of option strings and metavars.

        :param option:  the optparse Option, e.g. with strings ('-f', '--format')
        :param mvarfmt: metavar format string - evaluated as mvarfmt % metavar
        :param optsep:  separator
        """
        opts = []

        if option._short_opts:
            opts.append(option._short_opts[0])
        if option._long_opts:
            opts.append(option._long_opts[0])
        if len(opts) > 1:
            opts.insert(1, optsep)

        if option.takes_value():
            metavar = option.metavar or option.dest.lower()
            opts.append(mvarfmt % metavar.lower())

        return ''.join(opts)
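
    # Illustrative only: for an option declared as
    #     parser.add_option('-f', '--format', dest='format')
    # this renders roughly as "-f, --format <format>".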

    def format_heading(self, heading):
        if heading == 'Options':
            return ''
        return heading + ':\n'

    def format_usage(self, usage):
        """
        Ensure there is only one newline between usage and the first heading
        if there is no description.
        """
        msg = '\nUsage: %s\n' % self.indent_lines(textwrap.dedent(usage), "  ")
        return msg

    def format_description(self, description):
        # leave full control over description to us
        if description:
            if hasattr(self.parser, 'main'):
                label = 'Commands'
            else:
                label = 'Description'
            # some doc strings have initial newlines, some don't
            description = description.lstrip('\n')
            # some doc strings have final newlines and spaces, some don't
            description = description.rstrip()
            # dedent, then reindent
            description = self.indent_lines(textwrap.dedent(description), "  ")
            description = '%s:\n%s\n' % (label, description)
            return description
        else:
            return ''

    def format_epilog(self, epilog):
        # leave full control over epilog to us
        if epilog:
            return epilog
        else:
            return ''

    def indent_lines(self, text, indent):
        new_lines = [indent + line for line in text.split('\n')]
        return "\n".join(new_lines)


class UpdatingDefaultsHelpFormatter(PrettyHelpFormatter):
    """Custom help formatter for use in ConfigOptionParser.

    This updates the defaults before expanding them, allowing
    them to show up correctly in the help listing.
    """

    def expand_default(self, option):
        if self.parser is not None:
            self.parser._update_defaults(self.parser.defaults)
        return optparse.IndentedHelpFormatter.expand_default(self, option)


class CustomOptionParser(optparse.OptionParser):

    def insert_option_group(self, idx, *args, **kwargs):
        """Insert an OptionGroup at a given position."""
        group = self.add_option_group(*args, **kwargs)

        self.option_groups.pop()
        self.option_groups.insert(idx, group)

        return group

    @property
    def option_list_all(self):
        """Get a list of all options, including those in option groups."""
        res = self.option_list[:]
        for i in self.option_groups:
            res.extend(i.option_list)

        return res


class ConfigOptionParser(CustomOptionParser):
    """Custom option parser which updates its defaults by checking the
    configuration files and environmental variables"""

    isolated = False

    def __init__(self, *args, **kwargs):
        self.config = configparser.RawConfigParser()
        self.name = kwargs.pop('name')
        self.isolated = kwargs.pop("isolated", False)
        self.files = self.get_config_files()
        if self.files:
            self.config.read(self.files)
        assert self.name
        optparse.OptionParser.__init__(self, *args, **kwargs)

    def get_config_files(self):
        # the files returned by this method will be parsed in order with the
        # first files listed being overridden by later files in standard
        # ConfigParser fashion
        config_file = os.environ.get('PIP_CONFIG_FILE', False)
        if config_file == os.devnull:
            return []

        # at the base we have any site-wide configuration
        files = list(site_config_files)

        # per-user configuration next
        if not self.isolated:
            if config_file and os.path.exists(config_file):
                files.append(config_file)
            else:
                # This is the legacy config file, we consider it to be a lower
                # priority than the new file location.
                files.append(legacy_config_file)

                # This is the new config file, we consider it to be a higher
                # priority than the legacy file.
                files.append(
                    os.path.join(
                        appdirs.user_config_dir("pip"),
                        config_basename,
                    )
                )

        # finally, the virtualenv configuration, which trumps all the others
        if running_under_virtualenv():
            venv_config_file = os.path.join(
                sys.prefix,
                config_basename,
            )
            if os.path.exists(venv_config_file):
                files.append(venv_config_file)

        return files

    def check_default(self, option, key, val):
        try:
            return option.check_value(key, val)
        except optparse.OptionValueError as exc:
            print("An error occurred during configuration: %s" % exc)
            sys.exit(3)

    def _update_defaults(self, defaults):
        """Updates the given defaults with values from the config files and
        the environ. Does a little special handling for certain types of
        options (lists)."""
        # Then go and look for the other sources of configuration:
        config = {}
        # 1. config files
        for section in ('global', self.name):
            config.update(
                self.normalize_keys(self.get_config_section(section))
            )
        # 2. environmental variables
        if not self.isolated:
            config.update(self.normalize_keys(self.get_environ_vars()))
        # Accumulate complex default state.
        self.values = optparse.Values(self.defaults)
        late_eval = set()
        # Then set the options with those values
        for key, val in config.items():
            # ignore empty values
            if not val:
                continue

            option = self.get_option(key)
            # Ignore options not present in this parser. E.g. non-globals put
            # in [global] by users that want them to apply to all applicable
            # commands.
            if option is None:
                continue

            if option.action in ('store_true', 'store_false', 'count'):
                val = strtobool(val)
            elif option.action == 'append':
                val = val.split()
                val = [self.check_default(option, key, v) for v in val]
            elif option.action == 'callback':
                late_eval.add(option.dest)
                opt_str = option.get_opt_string()
                val = option.convert_value(opt_str, val)
                # From take_action
                args = option.callback_args or ()
                kwargs = option.callback_kwargs or {}
                option.callback(option, opt_str, val, self, *args, **kwargs)
            else:
                val = self.check_default(option, key, val)

            defaults[option.dest] = val

        for key in late_eval:
            defaults[key] = getattr(self.values, key)
        self.values = None
        return defaults

    def normalize_keys(self, items):
        """Return a config dictionary with normalized keys regardless of
        whether the keys were specified in environment variables or in config
        files"""
        normalized = {}
        for key, val in items:
            key = key.replace('_', '-')
            if not key.startswith('--'):
                key = '--%s' % key  # only prefer long opts
            normalized[key] = val
        return normalized
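
    # Hedged example (not pip's own code): keys from config files or PIP_*
    # environment variables are folded onto their long option spellings, e.g.
    #
    #     parser.normalize_keys([('index_url', 'https://example.org/simple'),
    #                            ('timeout', '15')])
    #     # -> {'--index-url': 'https://example.org/simple', '--timeout': '15'}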

    def get_config_section(self, name):
        """Get a section of a configuration"""
        if self.config.has_section(name):
            return self.config.items(name)
        return []

    def get_environ_vars(self):
        """Returns a generator with all environmental vars with prefix PIP_"""
        for key, val in os.environ.items():
            if _environ_prefix_re.search(key):
                yield (_environ_prefix_re.sub("", key).lower(), val)

    def get_default_values(self):
        """Overriding to make updating the defaults after instantiation of
        the option parser possible; _update_defaults() does the dirty work."""
        if not self.process_default_values:
            # Old, pre-Optik 1.5 behaviour.
            return optparse.Values(self.defaults)

        defaults = self._update_defaults(self.defaults.copy())  # ours
        for option in self._get_all_options():
            default = defaults.get(option.dest)
            if isinstance(default, string_types):
                opt_str = option.get_opt_string()
                defaults[option.dest] = option.check_value(opt_str, default)
        return optparse.Values(defaults)

    def error(self, msg):
        self.print_usage(sys.stderr)
        self.exit(2, "%s\n" % msg)
site-packages/pip/download.py
from __future__ import absolute_import

import cgi
import email.utils
import getpass
import json
import logging
import mimetypes
import os
import platform
import re
import shutil
import sys
import tempfile

try:
    import ssl  # noqa
    HAS_TLS = True
except ImportError:
    HAS_TLS = False

from pip._vendor.six.moves.urllib import parse as urllib_parse
from pip._vendor.six.moves.urllib import request as urllib_request

import pip

from pip.exceptions import InstallationError, HashMismatch
from pip.models import PyPI
from pip.utils import (splitext, rmtree, format_size, display_path,
                       backup_dir, ask_path_exists, unpack_file,
                       ARCHIVE_EXTENSIONS, consume, call_subprocess)
from pip.utils.encoding import auto_decode
from pip.utils.filesystem import check_path_owner
from pip.utils.logging import indent_log
from pip.utils.setuptools_build import SETUPTOOLS_SHIM
from pip.utils.glibc import libc_ver
from pip.utils.ui import DownloadProgressBar, DownloadProgressSpinner
from pip.locations import write_delete_marker_file
from pip.vcs import vcs
from pip._vendor import requests, six
from pip._vendor.requests.adapters import BaseAdapter, HTTPAdapter
from pip._vendor.requests.auth import AuthBase, HTTPBasicAuth
from pip._vendor.requests.models import CONTENT_CHUNK_SIZE, Response
from pip._vendor.requests.utils import get_netrc_auth
from pip._vendor.requests.structures import CaseInsensitiveDict
from pip._vendor import urllib3
from pip._vendor.cachecontrol import CacheControlAdapter
from pip._vendor.cachecontrol.caches import FileCache
from pip._vendor.lockfile import LockError
from pip._vendor.six.moves import xmlrpc_client


__all__ = ['get_file_content',
           'is_url', 'url_to_path', 'path_to_url',
           'is_archive_file', 'unpack_vcs_link',
           'unpack_file_url', 'is_vcs_url', 'is_file_url',
           'unpack_http_url', 'unpack_url',
           'parse_content_disposition', 'sanitize_content_filename']


logger = logging.getLogger(__name__)


def user_agent():
    """
    Return a string representing the user agent.
    """
    data = {
        "installer": {"name": "pip", "version": pip.__version__},
        "python": platform.python_version(),
        "implementation": {
            "name": platform.python_implementation(),
        },
    }

    if data["implementation"]["name"] == 'CPython':
        data["implementation"]["version"] = platform.python_version()
    elif data["implementation"]["name"] == 'PyPy':
        if sys.pypy_version_info.releaselevel == 'final':
            pypy_version_info = sys.pypy_version_info[:3]
        else:
            pypy_version_info = sys.pypy_version_info
        data["implementation"]["version"] = ".".join(
            [str(x) for x in pypy_version_info]
        )
    elif data["implementation"]["name"] == 'Jython':
        # Complete Guess
        data["implementation"]["version"] = platform.python_version()
    elif data["implementation"]["name"] == 'IronPython':
        # Complete Guess
        data["implementation"]["version"] = platform.python_version()

    if sys.platform.startswith("linux"):
        from pip._vendor import distro
        distro_infos = dict(filter(
            lambda x: x[1],
            zip(["name", "version", "id"], distro.linux_distribution()),
        ))
        libc = dict(filter(
            lambda x: x[1],
            zip(["lib", "version"], libc_ver()),
        ))
        if libc:
            distro_infos["libc"] = libc
        if distro_infos:
            data["distro"] = distro_infos

    if sys.platform.startswith("darwin") and platform.mac_ver()[0]:
        data["distro"] = {"name": "macOS", "version": platform.mac_ver()[0]}

    if platform.system():
        data.setdefault("system", {})["name"] = platform.system()

    if platform.release():
        data.setdefault("system", {})["release"] = platform.release()

    if platform.machine():
        data["cpu"] = platform.machine()

    # Python 2.6 doesn't have ssl.OPENSSL_VERSION.
    if HAS_TLS and sys.version_info[:2] > (2, 6):
        data["openssl_version"] = ssl.OPENSSL_VERSION

    return "{data[installer][name]}/{data[installer][version]} {json}".format(
        data=data,
        json=json.dumps(data, separators=(",", ":"), sort_keys=True),
    )
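
# Hedged illustration (exact values vary by platform and pip version): the
# returned string is "<name>/<version>" followed by the JSON-encoded data
# dict, roughly
#
#     pip/9.0.1 {"cpu":"x86_64","implementation":{"name":"CPython",...},...}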


class MultiDomainBasicAuth(AuthBase):

    def __init__(self, prompting=True):
        self.prompting = prompting
        self.passwords = {}

    def __call__(self, req):
        parsed = urllib_parse.urlparse(req.url)

        # Get the netloc without any embedded credentials
        netloc = parsed.netloc.rsplit("@", 1)[-1]

        # Set the url of the request to the url without any credentials
        req.url = urllib_parse.urlunparse(parsed[:1] + (netloc,) + parsed[2:])

        # Use any stored credentials that we have for this netloc
        username, password = self.passwords.get(netloc, (None, None))

        # Extract credentials embedded in the url if we have none stored
        if username is None:
            username, password = self.parse_credentials(parsed.netloc)

        # Get creds from netrc if we still don't have them
        if username is None and password is None:
            netrc_auth = get_netrc_auth(req.url)
            username, password = netrc_auth if netrc_auth else (None, None)

        if username or password:
            # Store the username and password
            self.passwords[netloc] = (username, password)

            # Send the basic auth with this request
            req = HTTPBasicAuth(username or "", password or "")(req)

        # Attach a hook to handle 401 responses
        req.register_hook("response", self.handle_401)

        return req

    def handle_401(self, resp, **kwargs):
        # We only care about 401 responses, anything else we want to just
        #   pass through the actual response
        if resp.status_code != 401:
            return resp

        # We are not able to prompt the user so simply return the response
        if not self.prompting:
            return resp

        parsed = urllib_parse.urlparse(resp.url)

        # Prompt the user for a new username and password
        username = six.moves.input("User for %s: " % parsed.netloc)
        password = getpass.getpass("Password: ")

        # Store the new username and password to use for future requests
        if username or password:
            self.passwords[parsed.netloc] = (username, password)

        # Consume content and release the original connection to allow our new
        #   request to reuse the same one.
        resp.content
        resp.raw.release_conn()

        # Add our new username and password to the request
        req = HTTPBasicAuth(username or "", password or "")(resp.request)

        # Send our new request
        new_resp = resp.connection.send(req, **kwargs)
        new_resp.history.append(resp)

        return new_resp

    def parse_credentials(self, netloc):
        if "@" in netloc:
            userinfo = netloc.rsplit("@", 1)[0]
            if ":" in userinfo:
                return userinfo.split(":", 1)
            return userinfo, None
        return None, None
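
    # Illustrative only: parse_credentials("user:s3cret@pypi.example.org")
    # returns ('user', 's3cret'); a netloc with no "@" yields (None, None).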


class LocalFSAdapter(BaseAdapter):

    def send(self, request, stream=None, timeout=None, verify=None, cert=None,
             proxies=None):
        pathname = url_to_path(request.url)

        resp = Response()
        resp.status_code = 200
        resp.url = request.url

        try:
            stats = os.stat(pathname)
        except OSError as exc:
            resp.status_code = 404
            resp.raw = exc
        else:
            modified = email.utils.formatdate(stats.st_mtime, usegmt=True)
            content_type = mimetypes.guess_type(pathname)[0] or "text/plain"
            resp.headers = CaseInsensitiveDict({
                "Content-Type": content_type,
                "Content-Length": stats.st_size,
                "Last-Modified": modified,
            })

            resp.raw = open(pathname, "rb")
            resp.close = resp.raw.close

        return resp

    def close(self):
        pass


class SafeFileCache(FileCache):
    """
    A file-based cache which is safe to use even when the target directory may
    not be accessible or writable.
    """

    def __init__(self, *args, **kwargs):
        super(SafeFileCache, self).__init__(*args, **kwargs)

        # Check to ensure that the directory containing our cache directory
        # is owned by the user currently executing pip. If it does not exist
        # we will check the parent directory until we find one that does exist.
        # If it is not owned by the user executing pip then we will disable
        # the cache and log a warning.
        if not check_path_owner(self.directory):
            logger.warning(
                "The directory '%s' or its parent directory is not owned by "
                "the current user and the cache has been disabled. Please "
                "check the permissions and owner of that directory. If "
                "executing pip with sudo, you may want sudo's -H flag.",
                self.directory,
            )

            # Set our directory to None to disable the Cache
            self.directory = None

    def get(self, *args, **kwargs):
        # If we don't have a directory, then the cache should be a no-op.
        if self.directory is None:
            return

        try:
            return super(SafeFileCache, self).get(*args, **kwargs)
        except (LockError, OSError, IOError):
            # We intentionally silence this error; if we can't access the cache
            # then we can just skip caching and process the request as if
            # caching wasn't enabled.
            pass

    def set(self, *args, **kwargs):
        # If we don't have a directory, then the cache should be a no-op.
        if self.directory is None:
            return

        try:
            return super(SafeFileCache, self).set(*args, **kwargs)
        except (LockError, OSError, IOError):
            # We intentionally silence this error; if we can't access the cache
            # then we can just skip caching and process the request as if
            # caching wasn't enabled.
            pass

    def delete(self, *args, **kwargs):
        # If we don't have a directory, then the cache should be a no-op.
        if self.directory is None:
            return

        try:
            return super(SafeFileCache, self).delete(*args, **kwargs)
        except (LockError, OSError, IOError):
            # We intentionally silence this error; if we can't access the cache
            # then we can just skip caching and process the request as if
            # caching wasn't enabled.
            pass


class InsecureHTTPAdapter(HTTPAdapter):

    def cert_verify(self, conn, url, verify, cert):
        conn.cert_reqs = 'CERT_NONE'
        conn.ca_certs = None


class PipSession(requests.Session):

    timeout = None

    def __init__(self, *args, **kwargs):
        retries = kwargs.pop("retries", 0)
        cache = kwargs.pop("cache", None)
        insecure_hosts = kwargs.pop("insecure_hosts", [])

        super(PipSession, self).__init__(*args, **kwargs)

        # Attach our User Agent to the request
        self.headers["User-Agent"] = user_agent()

        # Attach our Authentication handler to the session
        self.auth = MultiDomainBasicAuth()

        # Create our urllib3.Retry instance which will allow us to customize
        # how we handle retries.
        retries = urllib3.Retry(
            # Set the total number of retries that a particular request can
            # have.
            total=retries,

            # A 503 error from PyPI typically means that the Fastly -> Origin
            # connection got interrupted in some way. A 503 error in general
            # is typically considered a transient error so we'll go ahead and
            # retry it.
            status_forcelist=[503],

            # Add a small amount of back off between failed requests in
            # order to prevent hammering the service.
            backoff_factor=0.25,
        )

        # We want to _only_ cache responses on securely fetched origins. We do
        # this because we can't validate the response of an insecurely fetched
        # origin, and we don't want someone to be able to poison the cache and
        # require manual eviction from the cache to fix it.
        if cache:
            secure_adapter = CacheControlAdapter(
                cache=SafeFileCache(cache, use_dir_lock=True),
                max_retries=retries,
            )
        else:
            secure_adapter = HTTPAdapter(max_retries=retries)

        # Our Insecure HTTPAdapter disables HTTPS validation. It does not
        # support caching (see above) so we'll use it for all http:// URLs as
        # well as any https:// host that we've marked as ignoring TLS errors
        # for.
        insecure_adapter = InsecureHTTPAdapter(max_retries=retries)

        self.mount("https://", secure_adapter)
        self.mount("http://", insecure_adapter)

        # Enable file:// urls
        self.mount("file://", LocalFSAdapter())

        # We want to use a non-validating adapter for any requests which are
        # deemed insecure.
        for host in insecure_hosts:
            self.mount("https://{0}/".format(host), insecure_adapter)

    def request(self, method, url, *args, **kwargs):
        # Allow setting a default timeout on a session
        kwargs.setdefault("timeout", self.timeout)

        # Dispatch the actual request
        return super(PipSession, self).request(method, url, *args, **kwargs)


def get_file_content(url, comes_from=None, session=None):
    """Gets the content of a file; it may be a filename, file: URL, or
    http: URL.  Returns (location, content).  Content is unicode."""
    if session is None:
        raise TypeError(
            "get_file_content() missing 1 required keyword argument: 'session'"
        )

    match = _scheme_re.search(url)
    if match:
        scheme = match.group(1).lower()
        if (scheme == 'file' and comes_from and
                comes_from.startswith('http')):
            raise InstallationError(
                'Requirements file %s references URL %s, which is local'
                % (comes_from, url))
        if scheme == 'file':
            path = url.split(':', 1)[1]
            path = path.replace('\\', '/')
            match = _url_slash_drive_re.match(path)
            if match:
                path = match.group(1) + ':' + path.split('|', 1)[1]
            path = urllib_parse.unquote(path)
            if path.startswith('/'):
                path = '/' + path.lstrip('/')
            url = path
        else:
            # FIXME: catch some errors
            resp = session.get(url)
            resp.raise_for_status()
            return resp.url, resp.text
    try:
        with open(url, 'rb') as f:
            content = auto_decode(f.read())
    except IOError as exc:
        raise InstallationError(
            'Could not open requirements file: %s' % str(exc)
        )
    return url, content


_scheme_re = re.compile(r'^(http|https|file):', re.I)
_url_slash_drive_re = re.compile(r'/*([a-z])\|', re.I)


def is_url(name):
    """Returns true if the name looks like a URL"""
    if ':' not in name:
        return False
    scheme = name.split(':', 1)[0].lower()
    return scheme in ['http', 'https', 'file', 'ftp'] + vcs.all_schemes


def url_to_path(url):
    """
    Convert a file: URL to a path.
    """
    assert url.startswith('file:'), (
        "You can only turn file: urls into filenames (not %r)" % url)

    _, netloc, path, _, _ = urllib_parse.urlsplit(url)

    # if we have a UNC path, prepend UNC share notation
    if netloc:
        netloc = '\\\\' + netloc

    path = urllib_request.url2pathname(netloc + path)
    return path


def path_to_url(path):
    """
    Convert a path to a file: URL.  The path will be made absolute and have
    quoted path parts.
    """
    path = os.path.normpath(os.path.abspath(path))
    url = urllib_parse.urljoin('file:', urllib_request.pathname2url(path))
    return url
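
# Hedged examples (POSIX paths assumed; Windows adds drive and UNC handling):
#
#     url_to_path('file:///tmp/pkgs/foo-1.0.tar.gz')  # -> '/tmp/pkgs/foo-1.0.tar.gz'
#     path_to_url('/tmp/pkgs/foo-1.0.tar.gz')         # -> 'file:///tmp/pkgs/foo-1.0.tar.gz'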


def is_archive_file(name):
    """Return True if `name` is a considered as an archive file."""
    ext = splitext(name)[1].lower()
    if ext in ARCHIVE_EXTENSIONS:
        return True
    return False


def unpack_vcs_link(link, location):
    vcs_backend = _get_used_vcs_backend(link)
    vcs_backend.unpack(location)


def _get_used_vcs_backend(link):
    for backend in vcs.backends:
        if link.scheme in backend.schemes:
            vcs_backend = backend(link.url)
            return vcs_backend


def is_vcs_url(link):
    return bool(_get_used_vcs_backend(link))


def is_file_url(link):
    return link.url.lower().startswith('file:')


def is_dir_url(link):
    """Return whether a file:// Link points to a directory.

    ``link`` must not have any other scheme but file://. Call is_file_url()
    first.

    """
    link_path = url_to_path(link.url_without_fragment)
    return os.path.isdir(link_path)


def _progress_indicator(iterable, *args, **kwargs):
    return iterable


def _download_url(resp, link, content_file, hashes):
    try:
        total_length = int(resp.headers['content-length'])
    except (ValueError, KeyError, TypeError):
        total_length = 0

    cached_resp = getattr(resp, "from_cache", False)

    if logger.getEffectiveLevel() > logging.INFO:
        show_progress = False
    elif cached_resp:
        show_progress = False
    elif total_length > (40 * 1000):
        show_progress = True
    elif not total_length:
        show_progress = True
    else:
        show_progress = False

    show_url = link.show_url

    def resp_read(chunk_size):
        try:
            # Special case for urllib3.
            for chunk in resp.raw.stream(
                    chunk_size,
                    # We use decode_content=False here because we don't
                    # want urllib3 to mess with the raw bytes we get
                    # from the server. If we decompress inside of
                    # urllib3 then we cannot verify the checksum
                    # because the checksum will be of the compressed
                    # file. This breakage will only occur if the
                    # server adds a Content-Encoding header, which
                    # depends on how the server was configured:
                    # - Some servers will notice that the file isn't a
                    #   compressible file and will leave the file alone
                    #   and with an empty Content-Encoding
                    # - Some servers will notice that the file is
                    #   already compressed and will leave the file
                    #   alone and will add a Content-Encoding: gzip
                    #   header
                    # - Some servers won't notice anything at all and
                    #   will take a file that's already been compressed
                    #   and compress it again and set the
                    #   Content-Encoding: gzip header
                    #
                    # By setting this not to decode automatically we
                    # hope to eliminate problems with the second case.
                    decode_content=False):
                yield chunk
        except AttributeError:
            # Standard file-like object.
            while True:
                chunk = resp.raw.read(chunk_size)
                if not chunk:
                    break
                yield chunk

    def written_chunks(chunks):
        for chunk in chunks:
            content_file.write(chunk)
            yield chunk

    progress_indicator = _progress_indicator

    if link.netloc == PyPI.netloc:
        url = show_url
    else:
        url = link.url_without_fragment

    if show_progress:  # We don't show progress on cached responses
        if total_length:
            logger.info("Downloading %s (%s)", url, format_size(total_length))
            progress_indicator = DownloadProgressBar(max=total_length).iter
        else:
            logger.info("Downloading %s", url)
            progress_indicator = DownloadProgressSpinner().iter
    elif cached_resp:
        logger.info("Using cached %s", url)
    else:
        logger.info("Downloading %s", url)

    logger.debug('Downloading from URL %s', link)

    downloaded_chunks = written_chunks(
        progress_indicator(
            resp_read(CONTENT_CHUNK_SIZE),
            CONTENT_CHUNK_SIZE
        )
    )
    if hashes:
        hashes.check_against_chunks(downloaded_chunks)
    else:
        consume(downloaded_chunks)


def _copy_file(filename, location, link):
    copy = True
    download_location = os.path.join(location, link.filename)
    if os.path.exists(download_location):
        response = ask_path_exists(
            'The file %s exists. (i)gnore, (w)ipe, (b)ackup, (a)bort' %
            display_path(download_location), ('i', 'w', 'b', 'a'))
        if response == 'i':
            copy = False
        elif response == 'w':
            logger.warning('Deleting %s', display_path(download_location))
            os.remove(download_location)
        elif response == 'b':
            dest_file = backup_dir(download_location)
            logger.warning(
                'Backing up %s to %s',
                display_path(download_location),
                display_path(dest_file),
            )
            shutil.move(download_location, dest_file)
        elif response == 'a':
            sys.exit(-1)
    if copy:
        shutil.copy(filename, download_location)
        logger.info('Saved %s', display_path(download_location))


def unpack_http_url(link, location, download_dir=None,
                    session=None, hashes=None):
    if session is None:
        raise TypeError(
            "unpack_http_url() missing 1 required keyword argument: 'session'"
        )

    temp_dir = tempfile.mkdtemp('-unpack', 'pip-')

    # If a download dir is specified, is the file already downloaded there?
    already_downloaded_path = None
    if download_dir:
        already_downloaded_path = _check_download_dir(link,
                                                      download_dir,
                                                      hashes)

    if already_downloaded_path:
        from_path = already_downloaded_path
        content_type = mimetypes.guess_type(from_path)[0]
    else:
        # let's download to a tmp dir
        from_path, content_type = _download_http_url(link,
                                                     session,
                                                     temp_dir,
                                                     hashes)

    # unpack the archive to the build dir location. even when only downloading
    # archives, they have to be unpacked to parse dependencies
    unpack_file(from_path, location, content_type, link)

    # a download dir is specified; let's copy the archive there
    if download_dir and not already_downloaded_path:
        _copy_file(from_path, download_dir, link)

    if not already_downloaded_path:
        os.unlink(from_path)
    rmtree(temp_dir)


def unpack_file_url(link, location, download_dir=None, hashes=None):
    """Unpack link into location.

    If download_dir is provided and link points to a file, make a copy
    of the link file inside download_dir.
    """
    link_path = url_to_path(link.url_without_fragment)

    # If it's a url to a local directory
    if is_dir_url(link):
        if os.path.isdir(location):
            rmtree(location)
        shutil.copytree(link_path, location, symlinks=True)
        if download_dir:
            logger.info('Link is a directory, ignoring download_dir')
        return

    # If --require-hashes is off, `hashes` is either empty, the
    # link's embedded hash, or MissingHashes; it is required to
    # match. If --require-hashes is on, we are satisfied by any
    # hash in `hashes` matching: a URL-based or an option-based
    # one; no internet-sourced hash will be in `hashes`.
    if hashes:
        hashes.check_against_path(link_path)

    # If a download dir is specified, is the file already there and valid?
    already_downloaded_path = None
    if download_dir:
        already_downloaded_path = _check_download_dir(link,
                                                      download_dir,
                                                      hashes)

    if already_downloaded_path:
        from_path = already_downloaded_path
    else:
        from_path = link_path

    content_type = mimetypes.guess_type(from_path)[0]

    # unpack the archive to the build dir location. even when only downloading
    # archives, they have to be unpacked to parse dependencies
    unpack_file(from_path, location, content_type, link)

    # a download dir is specified and not already downloaded
    if download_dir and not already_downloaded_path:
        _copy_file(from_path, download_dir, link)


def _copy_dist_from_dir(link_path, location):
    """Copy distribution files in `link_path` to `location`.

    Invoked when user requests to install a local directory. E.g.:

        pip install .
        pip install ~/dev/git-repos/python-prompt-toolkit

    """

    # Note: This is currently VERY SLOW if you have a lot of data in the
    # directory, because it copies everything with `shutil.copytree`.
    # What it should really do is build an sdist and install that.
    # See https://github.com/pypa/pip/issues/2195

    if os.path.isdir(location):
        rmtree(location)

    # build an sdist
    setup_py = 'setup.py'
    sdist_args = [sys.executable]
    sdist_args.append('-c')
    sdist_args.append(SETUPTOOLS_SHIM % setup_py)
    sdist_args.append('sdist')
    sdist_args += ['--dist-dir', location]
    logger.info('Running setup.py sdist for %s', link_path)

    with indent_log():
        call_subprocess(sdist_args, cwd=link_path, show_stdout=False)

    # unpack sdist into `location`
    sdist = os.path.join(location, os.listdir(location)[0])
    logger.info('Unpacking sdist %s into %s', sdist, location)
    unpack_file(sdist, location, content_type=None, link=None)


class PipXmlrpcTransport(xmlrpc_client.Transport):
    """Provide a `xmlrpclib.Transport` implementation via a `PipSession`
    object.
    """

    def __init__(self, index_url, session, use_datetime=False):
        xmlrpc_client.Transport.__init__(self, use_datetime)
        index_parts = urllib_parse.urlparse(index_url)
        self._scheme = index_parts.scheme
        self._session = session

    def request(self, host, handler, request_body, verbose=False):
        parts = (self._scheme, host, handler, None, None, None)
        url = urllib_parse.urlunparse(parts)
        try:
            headers = {'Content-Type': 'text/xml'}
            response = self._session.post(url, data=request_body,
                                          headers=headers, stream=True)
            response.raise_for_status()
            self.verbose = verbose
            return self.parse_response(response.raw)
        except requests.HTTPError as exc:
            logger.critical(
                "HTTP error %s while getting %s",
                exc.response.status_code, url,
            )
            raise


def unpack_url(link, location, download_dir=None,
               only_download=False, session=None, hashes=None):
    """Unpack link.
       If link is a VCS link:
         if only_download, export into download_dir and ignore location
          else unpack into location
       for other types of link:
         - unpack into location
         - if download_dir, copy the file into download_dir
         - if only_download, mark location for deletion

    :param hashes: A Hashes object, one of whose embedded hashes must match,
        or HashMismatch will be raised. If the Hashes is empty, no matches are
        required, and unhashable types of requirements (like VCS ones, which
        would ordinarily raise HashUnsupported) are allowed.
    """
    # non-editable vcs urls
    if is_vcs_url(link):
        unpack_vcs_link(link, location)

    # file urls
    elif is_file_url(link):
        unpack_file_url(link, location, download_dir, hashes=hashes)

    # http urls
    else:
        if session is None:
            session = PipSession()

        unpack_http_url(
            link,
            location,
            download_dir,
            session,
            hashes=hashes
        )
    if only_download:
        write_delete_marker_file(location)


def sanitize_content_filename(filename):
    # type: (str) -> str
    """
    Sanitize the "filename" value from a Content-Disposition header.
    """
    return os.path.basename(filename)


def parse_content_disposition(content_disposition, default_filename):
    # type: (str, str) -> str
    """
    Parse the "filename" value from a Content-Disposition header, and
    return the default filename if the result is empty.
    """
    _type, params = cgi.parse_header(content_disposition)
    filename = params.get('filename')
    if filename:
        # We need to sanitize the filename to prevent directory traversal
        # in case the filename contains ".." path parts.
        filename = sanitize_content_filename(filename)
    return filename or default_filename
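
# Illustrative examples (not part of the original module):
#
#     >>> parse_content_disposition(
#     ...     'attachment; filename="pkg-1.0.tar.gz"', 'fallback.tar.gz')
#     'pkg-1.0.tar.gz'
#     >>> # ".." components are reduced to a basename to block path traversal
#     >>> parse_content_disposition(
#     ...     'attachment; filename="../../evil.tar.gz"', 'fallback.tar.gz')
#     'evil.tar.gz'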


def _download_http_url(link, session, temp_dir, hashes):
    """Download link url into temp_dir using provided session"""
    target_url = link.url.split('#', 1)[0]
    try:
        resp = session.get(
            target_url,
            # We use Accept-Encoding: identity here because requests
            # defaults to accepting compressed responses. This breaks in
            # a variety of ways depending on how the server is configured.
            # - Some servers will notice that the file isn't a compressible
            #   file and will leave the file alone and with an empty
            #   Content-Encoding
            # - Some servers will notice that the file is already
            #   compressed and will leave the file alone and will add a
            #   Content-Encoding: gzip header
            # - Some servers won't notice anything at all and will take
            #   a file that's already been compressed and compress it again
            #   and set the Content-Encoding: gzip header
            # By setting this to request only the identity encoding we're
            # hoping to eliminate the third case. Hopefully there does not
            # exist a server which when given a file will notice it is
            # already compressed and that you're not asking for a
            # compressed file and will then decompress it before sending
            # because if that's the case I don't think it'll ever be
            # possible to make this work.
            headers={"Accept-Encoding": "identity"},
            stream=True,
        )
        resp.raise_for_status()
    except requests.HTTPError as exc:
        logger.critical(
            "HTTP error %s while getting %s", exc.response.status_code, link,
        )
        raise

    content_type = resp.headers.get('content-type', '')
    filename = link.filename  # fallback
    # Have a look at the Content-Disposition header for a better guess
    content_disposition = resp.headers.get('content-disposition')
    if content_disposition:
        filename = parse_content_disposition(content_disposition, filename)
    ext = splitext(filename)[1]
    if not ext:
        ext = mimetypes.guess_extension(content_type)
        if ext:
            filename += ext
    if not ext and link.url != resp.url:
        ext = os.path.splitext(resp.url)[1]
        if ext:
            filename += ext
    file_path = os.path.join(temp_dir, filename)
    with open(file_path, 'wb') as content_file:
        _download_url(resp, link, content_file, hashes)
    return file_path, content_type


def _check_download_dir(link, download_dir, hashes):
    """ Check download_dir for previously downloaded file with correct hash
        If a correct file is found return its path else None
    """
    download_path = os.path.join(download_dir, link.filename)
    if os.path.exists(download_path):
        # If already downloaded, does its hash match?
        logger.info('File was already downloaded %s', download_path)
        if hashes:
            try:
                hashes.check_against_path(download_path)
            except HashMismatch:
                logger.warning(
                    'Previously-downloaded file %s has bad hash. '
                    'Re-downloading.',
                    download_path
                )
                os.unlink(download_path)
                return None
        return download_path
    return None
site-packages/pip/vcs/bazaar.py000064400000007333147511334620012475 0ustar00from __future__ import absolute_import

import logging
import os
import tempfile

# TODO: Get this into six.moves.urllib.parse
try:
    from urllib import parse as urllib_parse
except ImportError:
    import urlparse as urllib_parse

from pip.utils import rmtree, display_path
from pip.vcs import vcs, VersionControl
from pip.download import path_to_url


logger = logging.getLogger(__name__)


class Bazaar(VersionControl):
    name = 'bzr'
    dirname = '.bzr'
    repo_name = 'branch'
    schemes = (
        'bzr', 'bzr+http', 'bzr+https', 'bzr+ssh', 'bzr+sftp', 'bzr+ftp',
        'bzr+lp',
    )

    def __init__(self, url=None, *args, **kwargs):
        super(Bazaar, self).__init__(url, *args, **kwargs)
        # Python >= 2.7.4, 3.3 doesn't have uses_fragment or non_hierarchical
        # Register lp but do not expose as a scheme to support bzr+lp.
        if getattr(urllib_parse, 'uses_fragment', None):
            urllib_parse.uses_fragment.extend(['lp'])
            urllib_parse.non_hierarchical.extend(['lp'])

    def export(self, location):
        """
        Export the Bazaar repository at the url to the destination location
        """
        temp_dir = tempfile.mkdtemp('-export', 'pip-')
        self.unpack(temp_dir)
        if os.path.exists(location):
            # Remove the location to make sure Bazaar can export it correctly
            rmtree(location)
        try:
            self.run_command(['export', location], cwd=temp_dir,
                             show_stdout=False)
        finally:
            rmtree(temp_dir)

    def switch(self, dest, url, rev_options):
        self.run_command(['switch', url], cwd=dest)

    def update(self, dest, rev_options):
        self.run_command(['pull', '-q'] + rev_options, cwd=dest)

    def obtain(self, dest):
        url, rev = self.get_url_rev()
        if rev:
            rev_options = ['-r', rev]
            rev_display = ' (to revision %s)' % rev
        else:
            rev_options = []
            rev_display = ''
        if self.check_destination(dest, url, rev_options, rev_display):
            logger.info(
                'Checking out %s%s to %s',
                url,
                rev_display,
                display_path(dest),
            )
            self.run_command(['branch', '-q'] + rev_options + [url, dest])

    def get_url_rev(self):
        # hotfix: stripping 'bzr+' from 'bzr+ssh://' leaves a bare ssh URL,
        # so re-add the 'bzr+' prefix to the scheme
        url, rev = super(Bazaar, self).get_url_rev()
        if url.startswith('ssh://'):
            url = 'bzr+' + url
        return url, rev

    def get_url(self, location):
        urls = self.run_command(['info'], show_stdout=False, cwd=location)
        for line in urls.splitlines():
            line = line.strip()
            for x in ('checkout of branch: ',
                      'parent branch: '):
                if line.startswith(x):
                    repo = line.split(x)[1]
                    if self._is_local_repository(repo):
                        return path_to_url(repo)
                    return repo
        return None

    def get_revision(self, location):
        revision = self.run_command(
            ['revno'], show_stdout=False, cwd=location)
        return revision.splitlines()[-1]

    def get_src_requirement(self, dist, location):
        repo = self.get_url(location)
        if not repo:
            return None
        if not repo.lower().startswith('bzr:'):
            repo = 'bzr+' + repo
        egg_project_name = dist.egg_name().split('-', 1)[0]
        current_rev = self.get_revision(location)
        return '%s@%s#egg=%s' % (repo, current_rev, egg_project_name)
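
    # Illustrative result (values are made up): a checkout of
    # https://example.com/trunk at revno 42, for a dist whose egg name is
    # "MyProject-1.0", would yield 'bzr+https://example.com/trunk@42#egg=MyProject'.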

    def check_version(self, dest, rev_options):
        """Always assume the versions don't match"""
        return False


vcs.register(Bazaar)
site-packages/pip/vcs/subversion.py000064400000022206147511334620013430 0ustar00from __future__ import absolute_import

import logging
import os
import re

from pip._vendor.six.moves.urllib import parse as urllib_parse

from pip.index import Link
from pip.utils import rmtree, display_path
from pip.utils.logging import indent_log
from pip.vcs import vcs, VersionControl

_svn_xml_url_re = re.compile('url="([^"]+)"')
_svn_rev_re = re.compile(r'committed-rev="(\d+)"')
_svn_url_re = re.compile(r'URL: (.+)')
_svn_revision_re = re.compile(r'Revision: (.+)')
_svn_info_xml_rev_re = re.compile(r'\s*revision="(\d+)"')
_svn_info_xml_url_re = re.compile(r'<url>(.*)</url>')


logger = logging.getLogger(__name__)


class Subversion(VersionControl):
    name = 'svn'
    dirname = '.svn'
    repo_name = 'checkout'
    schemes = ('svn', 'svn+ssh', 'svn+http', 'svn+https', 'svn+svn')

    def get_info(self, location):
        """Returns (url, revision), where both are strings"""
        assert not location.rstrip('/').endswith(self.dirname), \
            'Bad directory: %s' % location
        output = self.run_command(
            ['info', location],
            show_stdout=False,
            extra_environ={'LANG': 'C'},
        )
        match = _svn_url_re.search(output)
        if not match:
            logger.warning(
                'Cannot determine URL of svn checkout %s',
                display_path(location),
            )
            logger.debug('Output that cannot be parsed: \n%s', output)
            return None, None
        url = match.group(1).strip()
        match = _svn_revision_re.search(output)
        if not match:
            logger.warning(
                'Cannot determine revision of svn checkout %s',
                display_path(location),
            )
            logger.debug('Output that cannot be parsed: \n%s', output)
            return url, None
        return url, match.group(1)

    def export(self, location):
        """Export the svn repository at the url to the destination location"""
        url, rev = self.get_url_rev()
        rev_options = get_rev_options(url, rev)
        url = self.remove_auth_from_url(url)
        logger.info('Exporting svn repository %s to %s', url, location)
        with indent_log():
            if os.path.exists(location):
                # Subversion doesn't like to check out over an existing
                # directory; --force fixes this, but it was only added in svn 1.5
                rmtree(location)
            self.run_command(
                ['export'] + rev_options + [url, location],
                show_stdout=False)

    def switch(self, dest, url, rev_options):
        self.run_command(['switch'] + rev_options + [url, dest])

    def update(self, dest, rev_options):
        self.run_command(['update'] + rev_options + [dest])

    def obtain(self, dest):
        url, rev = self.get_url_rev()
        rev_options = get_rev_options(url, rev)
        url = self.remove_auth_from_url(url)
        if rev:
            rev_display = ' (to revision %s)' % rev
        else:
            rev_display = ''
        if self.check_destination(dest, url, rev_options, rev_display):
            logger.info(
                'Checking out %s%s to %s',
                url,
                rev_display,
                display_path(dest),
            )
            self.run_command(['checkout', '-q'] + rev_options + [url, dest])

    def get_location(self, dist, dependency_links):
        for url in dependency_links:
            egg_fragment = Link(url).egg_fragment
            if not egg_fragment:
                continue
            if '-' in egg_fragment:
                # FIXME: will this work when a package has - in the name?
                key = '-'.join(egg_fragment.split('-')[:-1]).lower()
            else:
                key = egg_fragment
            if key == dist.key:
                return url.split('#', 1)[0]
        return None

    def get_revision(self, location):
        """
        Return the maximum revision for all files under a given location
        """
        # Note: taken from setuptools.command.egg_info
        revision = 0

        for base, dirs, files in os.walk(location):
            if self.dirname not in dirs:
                dirs[:] = []
                continue    # no sense walking uncontrolled subdirs
            dirs.remove(self.dirname)
            entries_fn = os.path.join(base, self.dirname, 'entries')
            if not os.path.exists(entries_fn):
                # FIXME: should we warn?
                continue

            dirurl, localrev = self._get_svn_url_rev(base)

            if base == location:
                base_url = dirurl + '/'   # save the root url
            elif not dirurl or not dirurl.startswith(base_url):
                dirs[:] = []
                continue    # not part of the same svn tree, skip it
            revision = max(revision, localrev)
        return revision

    def get_url_rev(self):
        # hotfix: stripping 'svn+' from 'svn+ssh://' leaves a bare ssh URL,
        # so re-add the 'svn+' prefix to the scheme
        url, rev = super(Subversion, self).get_url_rev()
        if url.startswith('ssh://'):
            url = 'svn+' + url
        return url, rev

    def get_url(self, location):
        # In cases where the source is in a subdirectory rather than alongside
        # setup.py, walk up from the location until a directory containing a
        # real setup.py is found
        orig_location = location
        while not os.path.exists(os.path.join(location, 'setup.py')):
            last_location = location
            location = os.path.dirname(location)
            if location == last_location:
                # We've traversed up to the root of the filesystem without
                # finding setup.py
                logger.warning(
                    "Could not find setup.py for directory %s (tried all "
                    "parent directories)",
                    orig_location,
                )
                return None

        return self._get_svn_url_rev(location)[0]

    def _get_svn_url_rev(self, location):
        from pip.exceptions import InstallationError

        entries_path = os.path.join(location, self.dirname, 'entries')
        if os.path.exists(entries_path):
            with open(entries_path) as f:
                data = f.read()
        else:  # subversion >= 1.7 does not have the 'entries' file
            data = ''

        if (data.startswith('8') or
                data.startswith('9') or
                data.startswith('10')):
            data = list(map(str.splitlines, data.split('\n\x0c\n')))
            del data[0][0]  # get rid of the '8'
            url = data[0][3]
            revs = [int(d[9]) for d in data if len(d) > 9 and d[9]] + [0]
        elif data.startswith('<?xml'):
            match = _svn_xml_url_re.search(data)
            if not match:
                raise ValueError('Badly formatted data: %r' % data)
            url = match.group(1)    # get repository URL
            revs = [int(m.group(1)) for m in _svn_rev_re.finditer(data)] + [0]
        else:
            try:
                # subversion >= 1.7
                xml = self.run_command(
                    ['info', '--xml', location],
                    show_stdout=False,
                )
                url = _svn_info_xml_url_re.search(xml).group(1)
                revs = [
                    int(m.group(1)) for m in _svn_info_xml_rev_re.finditer(xml)
                ]
            except InstallationError:
                url, revs = None, []

        if revs:
            rev = max(revs)
        else:
            rev = 0

        return url, rev

    def get_src_requirement(self, dist, location):
        repo = self.get_url(location)
        if repo is None:
            return None
        # FIXME: why not project name?
        egg_project_name = dist.egg_name().split('-', 1)[0]
        rev = self.get_revision(location)
        return 'svn+%s@%s#egg=%s' % (repo, rev, egg_project_name)

    def check_version(self, dest, rev_options):
        """Always assume the versions don't match"""
        return False

    @staticmethod
    def remove_auth_from_url(url):
        # Return a copy of url with 'username:password@' removed.
        # username/pass params are passed to subversion through flags
        # and are not recognized in the url.

        # parsed url
        purl = urllib_parse.urlsplit(url)
        stripped_netloc = purl.netloc.split('@')[-1]

        # stripped url
        url_pieces = (
            purl.scheme, stripped_netloc, purl.path, purl.query, purl.fragment
        )
        surl = urllib_parse.urlunsplit(url_pieces)
        return surl
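
    # Illustrative example: only the userinfo part of the netloc is dropped,
    # the scheme and path are preserved:
    #
    #     >>> Subversion.remove_auth_from_url('svn+https://user:pass@example.com/svn')
    #     'svn+https://example.com/svn'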


def get_rev_options(url, rev):
    if rev:
        rev_options = ['-r', rev]
    else:
        rev_options = []

    r = urllib_parse.urlsplit(url)
    if hasattr(r, 'username'):
        # >= Python-2.5
        username, password = r.username, r.password
    else:
        netloc = r[1]
        if '@' in netloc:
            auth = netloc.split('@')[0]
            if ':' in auth:
                username, password = auth.split(':', 1)
            else:
                username, password = auth, None
        else:
            username, password = None, None

    if username:
        rev_options += ['--username', username]
    if password:
        rev_options += ['--password', password]
    return rev_options
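
# Illustrative example (made-up credentials): auth embedded in the URL is
# translated into svn command-line flags alongside the revision option:
#
#     >>> get_rev_options('svn+https://user:secret@example.com/repo', '1234')
#     ['-r', '1234', '--username', 'user', '--password', 'secret']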


vcs.register(Subversion)
site-packages/pip/vcs/__init__.py000064400000030126147511334620012770 0ustar00"""Handles all VCS (version control) support"""
from __future__ import absolute_import

import errno
import logging
import os
import shutil
import sys

from pip._vendor.six.moves.urllib import parse as urllib_parse

from pip.exceptions import BadCommand
from pip.utils import (display_path, backup_dir, call_subprocess,
                       rmtree, ask_path_exists)


__all__ = ['vcs', 'get_src_requirement']


logger = logging.getLogger(__name__)


class VcsSupport(object):
    _registry = {}
    schemes = ['ssh', 'git', 'hg', 'bzr', 'sftp', 'svn']

    def __init__(self):
        # Register more schemes with urlparse for various version control
        # systems
        urllib_parse.uses_netloc.extend(self.schemes)
        # Python >= 2.7.4, 3.3 doesn't have uses_fragment
        if getattr(urllib_parse, 'uses_fragment', None):
            urllib_parse.uses_fragment.extend(self.schemes)
        super(VcsSupport, self).__init__()

    def __iter__(self):
        return self._registry.__iter__()

    @property
    def backends(self):
        return list(self._registry.values())

    @property
    def dirnames(self):
        return [backend.dirname for backend in self.backends]

    @property
    def all_schemes(self):
        schemes = []
        for backend in self.backends:
            schemes.extend(backend.schemes)
        return schemes

    def register(self, cls):
        if not hasattr(cls, 'name'):
            logger.warning('Cannot register VCS %s', cls.__name__)
            return
        if cls.name not in self._registry:
            self._registry[cls.name] = cls
            logger.debug('Registered VCS backend: %s', cls.name)

    def unregister(self, cls=None, name=None):
        if name in self._registry:
            del self._registry[name]
        elif cls in self._registry.values():
            del self._registry[cls.name]
        else:
            logger.warning('Cannot unregister because no class or name given')

    def get_backend_name(self, location):
        """
        Return the name of the version control backend if found at given
        location, e.g. vcs.get_backend_name('/path/to/vcs/checkout')
        """
        for vc_type in self._registry.values():
            if vc_type.controls_location(location):
                logger.debug('Determined that %s uses VCS: %s',
                             location, vc_type.name)
                return vc_type.name
        return None

    def get_backend(self, name):
        name = name.lower()
        if name in self._registry:
            return self._registry[name]

    def get_backend_from_location(self, location):
        vc_type = self.get_backend_name(location)
        if vc_type:
            return self.get_backend(vc_type)
        return None


vcs = VcsSupport()
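
# Illustrative usage sketch of the module-level registry (backends register
# themselves via ``vcs.register`` when their modules are imported); `path`
# below is a placeholder for a checkout directory:
#
#     backend_cls = vcs.get_backend('git')               # Git class, if registered
#     backend_cls = vcs.get_backend_from_location(path)  # detect from a checkout
#     if backend_cls is not None:
#         url, rev = backend_cls().get_info(path)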


class VersionControl(object):
    name = ''
    dirname = ''
    # List of supported schemes for this Version Control
    schemes = ()

    def __init__(self, url=None, *args, **kwargs):
        self.url = url
        super(VersionControl, self).__init__(*args, **kwargs)

    def _is_local_repository(self, repo):
        """
           posix absolute paths start with os.path.sep,
           win32 ones start with drive (like c:\\folder)
        """
        drive, tail = os.path.splitdrive(repo)
        return repo.startswith(os.path.sep) or drive

    # See issue #1083 for why this method was introduced:
    # https://github.com/pypa/pip/issues/1083
    def translate_egg_surname(self, surname):
        # For example, Django has branches of the form "stable/1.7.x".
        return surname.replace('/', '_')

    def export(self, location):
        """
        Export the repository at the url to the destination location
        i.e. only download the files, without the VCS information
        """
        raise NotImplementedError

    def get_url_rev(self):
        """
        Returns the correct repository URL and revision by parsing the given
        repository URL
        """
        error_message = (
            "Sorry, '%s' is a malformed VCS url. "
            "The format is <vcs>+<protocol>://<url>, "
            "e.g. svn+http://myrepo/svn/MyApp#egg=MyApp"
        )
        assert '+' in self.url, error_message % self.url
        url = self.url.split('+', 1)[1]
        scheme, netloc, path, query, frag = urllib_parse.urlsplit(url)
        rev = None
        if '@' in path:
            path, rev = path.rsplit('@', 1)
        url = urllib_parse.urlunsplit((scheme, netloc, path, query, ''))
        return url, rev
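
    # Illustrative example: the '<vcs>+' prefix, the trailing '@<rev>' and the
    # fragment are all split off:
    #
    #     >>> VersionControl('svn+https://example.com/repo@2019#egg=Proj').get_url_rev()
    #     ('https://example.com/repo', '2019')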

    def get_info(self, location):
        """
        Returns (url, revision), where both are strings
        """
        assert not location.rstrip('/').endswith(self.dirname), \
            'Bad directory: %s' % location
        return self.get_url(location), self.get_revision(location)

    def normalize_url(self, url):
        """
        Normalize a URL for comparison by unquoting it and removing any
        trailing slash.
        """
        return urllib_parse.unquote(url).rstrip('/')

    def compare_urls(self, url1, url2):
        """
        Compare two repo URLs for identity, ignoring incidental differences.
        """
        return (self.normalize_url(url1) == self.normalize_url(url2))

    def obtain(self, dest):
        """
        Called when installing or updating an editable package, takes the
        source path of the checkout.
        """
        raise NotImplementedError

    def switch(self, dest, url, rev_options):
        """
        Switch the repo at ``dest`` to point to ``url``.
        """
        raise NotImplementedError

    def update(self, dest, rev_options):
        """
        Update an already-existing repo to the given ``rev_options``.
        """
        raise NotImplementedError

    def check_version(self, dest, rev_options):
        """
        Return True if the version is identical to what exists and
        doesn't need to be updated.
        """
        raise NotImplementedError

    def check_destination(self, dest, url, rev_options, rev_display):
        """
        Prepare a location to receive a checkout/clone.

        Return True if the location is ready for (and requires) a
        checkout/clone, False otherwise.
        """
        checkout = True
        prompt = False
        if os.path.exists(dest):
            checkout = False
            if os.path.exists(os.path.join(dest, self.dirname)):
                existing_url = self.get_url(dest)
                if self.compare_urls(existing_url, url):
                    logger.debug(
                        '%s in %s exists, and has correct URL (%s)',
                        self.repo_name.title(),
                        display_path(dest),
                        url,
                    )
                    if not self.check_version(dest, rev_options):
                        logger.info(
                            'Updating %s %s%s',
                            display_path(dest),
                            self.repo_name,
                            rev_display,
                        )
                        self.update(dest, rev_options)
                    else:
                        logger.info(
                            'Skipping because already up-to-date.')
                else:
                    logger.warning(
                        '%s %s in %s exists with URL %s',
                        self.name,
                        self.repo_name,
                        display_path(dest),
                        existing_url,
                    )
                    prompt = ('(s)witch, (i)gnore, (w)ipe, (b)ackup ',
                              ('s', 'i', 'w', 'b'))
            else:
                logger.warning(
                    'Directory %s already exists, and is not a %s %s.',
                    dest,
                    self.name,
                    self.repo_name,
                )
                prompt = ('(i)gnore, (w)ipe, (b)ackup ', ('i', 'w', 'b'))
        if prompt:
            logger.warning(
                'The plan is to install the %s repository %s',
                self.name,
                url,
            )
            response = ask_path_exists('What to do?  %s' % prompt[0],
                                       prompt[1])

            if response == 's':
                logger.info(
                    'Switching %s %s to %s%s',
                    self.repo_name,
                    display_path(dest),
                    url,
                    rev_display,
                )
                self.switch(dest, url, rev_options)
            elif response == 'i':
                # do nothing
                pass
            elif response == 'w':
                logger.warning('Deleting %s', display_path(dest))
                rmtree(dest)
                checkout = True
            elif response == 'b':
                dest_dir = backup_dir(dest)
                logger.warning(
                    'Backing up %s to %s', display_path(dest), dest_dir,
                )
                shutil.move(dest, dest_dir)
                checkout = True
            elif response == 'a':
                sys.exit(-1)
        return checkout

    def unpack(self, location):
        """
        Clean up the current location and download the repository at the url
        (including VCS metadata) into location
        """
        if os.path.exists(location):
            rmtree(location)
        self.obtain(location)

    def get_src_requirement(self, dist, location):
        """
        Return a string representing the requirement needed to
        redownload the files currently present in location, something
        like:
          {repository_url}@{revision}#egg={project_name}-{version_identifier}
        """
        raise NotImplementedError

    def get_url(self, location):
        """
        Return the url used at location
        Used in get_info or check_destination
        """
        raise NotImplementedError

    def get_revision(self, location):
        """
        Return the current revision of the files at location
        Used in get_info
        """
        raise NotImplementedError

    def run_command(self, cmd, show_stdout=True, cwd=None,
                    on_returncode='raise',
                    command_desc=None,
                    extra_environ=None, spinner=None):
        """
        Run a VCS subcommand
        This is simply a wrapper around call_subprocess that adds the VCS
        command name, and checks that the VCS is available
        """
        cmd = [self.name] + cmd
        try:
            return call_subprocess(cmd, show_stdout, cwd,
                                   on_returncode,
                                   command_desc, extra_environ,
                                   spinner)
        except OSError as e:
            # errno.ENOENT = no such file or directory
            # In other words, the VCS executable isn't available
            if e.errno == errno.ENOENT:
                raise BadCommand('Cannot find command %r' % self.name)
            else:
                raise  # re-raise exception if a different error occurred

    @classmethod
    def controls_location(cls, location):
        """
        Check if a location is controlled by the vcs.
        It is meant to be overridden to implement smarter detection
        mechanisms for specific vcs.
        """
        logger.debug('Checking in %s for %s (%s)...',
                     location, cls.dirname, cls.name)
        path = os.path.join(location, cls.dirname)
        return os.path.exists(path)


def get_src_requirement(dist, location):
    version_control = vcs.get_backend_from_location(location)
    if version_control:
        try:
            return version_control().get_src_requirement(dist,
                                                         location)
        except BadCommand:
            logger.warning(
                'cannot determine version of editable source in %s '
                '(%s command not found in path)',
                location,
                version_control.name,
            )
            return dist.as_requirement()
    logger.warning(
        'cannot determine version of editable source in %s (is not SVN '
        'checkout, Git clone, Mercurial clone or Bazaar branch)',
        location,
    )
    return dist.as_requirement()
site-packages/pip/vcs/mercurial.py000064400000006620147511334620013216 0ustar00from __future__ import absolute_import

import logging
import os
import tempfile

from pip.utils import display_path, rmtree
from pip.vcs import vcs, VersionControl
from pip.download import path_to_url
from pip._vendor.six.moves import configparser


logger = logging.getLogger(__name__)


class Mercurial(VersionControl):
    name = 'hg'
    dirname = '.hg'
    repo_name = 'clone'
    schemes = ('hg', 'hg+http', 'hg+https', 'hg+ssh', 'hg+static-http')

    def export(self, location):
        """Export the Hg repository at the url to the destination location"""
        temp_dir = tempfile.mkdtemp('-export', 'pip-')
        self.unpack(temp_dir)
        try:
            self.run_command(
                ['archive', location], show_stdout=False, cwd=temp_dir)
        finally:
            rmtree(temp_dir)

    def switch(self, dest, url, rev_options):
        repo_config = os.path.join(dest, self.dirname, 'hgrc')
        config = configparser.SafeConfigParser()
        try:
            config.read(repo_config)
            config.set('paths', 'default', url)
            with open(repo_config, 'w') as config_file:
                config.write(config_file)
        except (OSError, configparser.NoSectionError) as exc:
            logger.warning(
                'Could not switch Mercurial repository to %s: %s', url, exc,
            )
        else:
            self.run_command(['update', '-q'] + rev_options, cwd=dest)

    def update(self, dest, rev_options):
        self.run_command(['pull', '-q'], cwd=dest)
        self.run_command(['update', '-q'] + rev_options, cwd=dest)

    def obtain(self, dest):
        url, rev = self.get_url_rev()
        if rev:
            rev_options = [rev]
            rev_display = ' (to revision %s)' % rev
        else:
            rev_options = []
            rev_display = ''
        if self.check_destination(dest, url, rev_options, rev_display):
            logger.info(
                'Cloning hg %s%s to %s',
                url,
                rev_display,
                display_path(dest),
            )
            self.run_command(['clone', '--noupdate', '-q', url, dest])
            self.run_command(['update', '-q'] + rev_options, cwd=dest)

    def get_url(self, location):
        url = self.run_command(
            ['showconfig', 'paths.default'],
            show_stdout=False, cwd=location).strip()
        if self._is_local_repository(url):
            url = path_to_url(url)
        return url.strip()

    def get_revision(self, location):
        current_revision = self.run_command(
            ['parents', '--template={rev}'],
            show_stdout=False, cwd=location).strip()
        return current_revision

    def get_revision_hash(self, location):
        current_rev_hash = self.run_command(
            ['parents', '--template={node}'],
            show_stdout=False, cwd=location).strip()
        return current_rev_hash

    def get_src_requirement(self, dist, location):
        repo = self.get_url(location)
        if not repo:
            return None
        if not repo.lower().startswith('hg:'):
            repo = 'hg+' + repo
        egg_project_name = dist.egg_name().split('-', 1)[0]
        current_rev_hash = self.get_revision_hash(location)
        return '%s@%s#egg=%s' % (repo, current_rev_hash, egg_project_name)

    def check_version(self, dest, rev_options):
        """Always assume the versions don't match"""
        return False

vcs.register(Mercurial)
site-packages/pip/vcs/git.py000064400000026454147511334620012025 0ustar00from __future__ import absolute_import

import logging
import tempfile
import os.path

from pip.compat import samefile
from pip.exceptions import BadCommand
from pip._vendor.six.moves.urllib import parse as urllib_parse
from pip._vendor.six.moves.urllib import request as urllib_request
from pip._vendor.packaging.version import parse as parse_version

from pip.utils import display_path, rmtree
from pip.vcs import vcs, VersionControl


urlsplit = urllib_parse.urlsplit
urlunsplit = urllib_parse.urlunsplit


logger = logging.getLogger(__name__)


class Git(VersionControl):
    name = 'git'
    dirname = '.git'
    repo_name = 'clone'
    schemes = (
        'git', 'git+http', 'git+https', 'git+ssh', 'git+git', 'git+file',
    )

    def __init__(self, url=None, *args, **kwargs):

        # Works around an apparent Git bug
        # (see http://article.gmane.org/gmane.comp.version-control.git/146500)
        if url:
            scheme, netloc, path, query, fragment = urlsplit(url)
            if scheme.endswith('file'):
                initial_slashes = path[:-len(path.lstrip('/'))]
                newpath = (
                    initial_slashes +
                    urllib_request.url2pathname(path)
                    .replace('\\', '/').lstrip('/')
                )
                url = urlunsplit((scheme, netloc, newpath, query, fragment))
                after_plus = scheme.find('+') + 1
                url = scheme[:after_plus] + urlunsplit(
                    (scheme[after_plus:], netloc, newpath, query, fragment),
                )

        super(Git, self).__init__(url, *args, **kwargs)

    def get_git_version(self):
        VERSION_PFX = 'git version '
        version = self.run_command(['version'], show_stdout=False)
        if version.startswith(VERSION_PFX):
            version = version[len(VERSION_PFX):]
        else:
            version = ''
        # Keep only the first three components of the git version because
        # on Windows it is x.y.z.windows.t, and that parses as a
        # LegacyVersion, which always compares smaller than a Version.
        version = '.'.join(version.split('.')[:3])
        return parse_version(version)
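
    # Illustrative example: "git version 2.17.1.windows.2" is reduced to
    # parse_version('2.17.1'), so the Windows suffix cannot demote the result
    # to a LegacyVersion in the >= comparison used by update() below.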

    def export(self, location):
        """Export the Git repository at the url to the destination location"""
        temp_dir = tempfile.mkdtemp('-export', 'pip-')
        self.unpack(temp_dir)
        try:
            if not location.endswith('/'):
                location = location + '/'
            self.run_command(
                ['checkout-index', '-a', '-f', '--prefix', location],
                show_stdout=False, cwd=temp_dir)
        finally:
            rmtree(temp_dir)

    def check_rev_options(self, rev, dest, rev_options):
        """Check the revision options before checkout to compensate that tags
        and branches may need origin/ as a prefix.
        Returns the SHA1 of the branch or tag if found.
        """
        revisions = self.get_short_refs(dest, rev)

        origin_rev = 'origin/%s' % rev
        if origin_rev in revisions:
            # remote branch
            return [revisions[origin_rev]]
        elif rev in revisions:
            # a local tag or branch name
            return [revisions[rev]]
        else:
            logger.warning(
                "Could not find a tag or branch '%s', assuming commit.", rev,
            )
            return rev_options

    def check_version(self, dest, rev_options):
        """
        Compare the current sha to the ref. ref may be a branch or tag name,
        but current rev will always point to a sha. This means that a branch
        or tag will never compare as True. So this ultimately only matches
        against exact shas.
        """
        return self.get_revision(dest).startswith(rev_options[0])

    def switch(self, dest, url, rev_options):
        self.run_command(['config', 'remote.origin.url', url], cwd=dest)
        self.run_command(['checkout', '-q'] + rev_options, cwd=dest)

        self.update_submodules(dest)

    def update(self, dest, rev_options):
        # First fetch changes from the default remote
        if self.get_git_version() >= parse_version('1.9.0'):
            # fetch tags in addition to everything else
            self.run_command(['fetch', '-q', '--tags'], cwd=dest)
        else:
            self.run_command(['fetch', '-q'], cwd=dest)
        # Then reset to wanted revision (maybe even origin/master)
        if rev_options:
            rev_options = self.check_rev_options(
                rev_options[0], dest, rev_options,
            )
        self.run_command(['reset', '--hard', '-q'] + rev_options, cwd=dest)
        #: update submodules
        self.update_submodules(dest)

    def obtain(self, dest):
        url, rev = self.get_url_rev()
        if rev:
            rev_options = [rev]
            rev_display = ' (to %s)' % rev
        else:
            rev_options = ['origin/master']
            rev_display = ''
        if self.check_destination(dest, url, rev_options, rev_display):
            logger.info(
                'Cloning %s%s to %s', url, rev_display, display_path(dest),
            )
            self.run_command(['clone', '-q', url, dest])

            if rev:
                rev_options = self.check_rev_options(rev, dest, rev_options)
                # Only do a checkout if rev_options differs from HEAD
                if not self.check_version(dest, rev_options):
                    self.run_command(
                        ['checkout', '-q'] + rev_options,
                        cwd=dest,
                    )
            #: repo may contain submodules
            self.update_submodules(dest)

    def get_url(self, location):
        """Return URL of the first remote encountered."""
        remotes = self.run_command(
            ['config', '--get-regexp', r'remote\..*\.url'],
            show_stdout=False, cwd=location)
        remotes = remotes.splitlines()
        found_remote = remotes[0]
        for remote in remotes:
            if remote.startswith('remote.origin.url '):
                found_remote = remote
                break
        url = found_remote.split(' ')[1]
        return url.strip()

    def get_revision(self, location):
        current_rev = self.run_command(
            ['rev-parse', 'HEAD'], show_stdout=False, cwd=location)
        return current_rev.strip()

    def get_full_refs(self, location, pattern=''):
        """Yields tuples of (commit, ref) for branches and tags"""
        output = self.run_command(['show-ref', pattern],
                                  show_stdout=False, cwd=location)
        for line in output.split("\n"):
            line = line.rstrip("\r")
            if not line:
                continue
            try:
                commit, ref = line.split(' ', 1)
            except ValueError:
                # Include the offending line to simplify troubleshooting if
                # this error ever occurs.
                raise ValueError(f'unexpected show-ref line: {line!r}')
            yield commit.strip(), ref.strip()

    def is_ref_remote(self, ref):
        return ref.startswith('refs/remotes/')

    def is_ref_branch(self, ref):
        return ref.startswith('refs/heads/')

    def is_ref_tag(self, ref):
        return ref.startswith('refs/tags/')

    def is_ref_commit(self, ref):
        """A ref is a commit sha if it is not anything else"""
        return not any((
            self.is_ref_remote(ref),
            self.is_ref_branch(ref),
            self.is_ref_tag(ref),
        ))

    # Should deprecate `get_refs` since it's ambiguous
    def get_refs(self, location):
        return self.get_short_refs(location)

    def get_short_refs(self, location, pattern=''):
        """Return map of named refs (branches or tags) to commit hashes."""
        rv = {}
        for commit, ref in self.get_full_refs(location, pattern):
            ref_name = None
            if self.is_ref_remote(ref):
                ref_name = ref[len('refs/remotes/'):]
            elif self.is_ref_branch(ref):
                ref_name = ref[len('refs/heads/'):]
            elif self.is_ref_tag(ref):
                ref_name = ref[len('refs/tags/'):]
            if ref_name is not None:
                rv[ref_name] = commit
        return rv
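
    # Illustrative example: show-ref output listing refs/remotes/origin/master,
    # refs/heads/dev and refs/tags/v1.0 comes back as
    # {'origin/master': <sha>, 'dev': <sha>, 'v1.0': <sha>}.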

    def _get_subdirectory(self, location):
        """Return the relative path of setup.py to the git repo root."""
        # find the repo root
        git_dir = self.run_command(['rev-parse', '--git-dir'],
                                   show_stdout=False, cwd=location).strip()
        if not os.path.isabs(git_dir):
            git_dir = os.path.join(location, git_dir)
        root_dir = os.path.join(git_dir, '..')
        # find setup.py
        orig_location = location
        while not os.path.exists(os.path.join(location, 'setup.py')):
            last_location = location
            location = os.path.dirname(location)
            if location == last_location:
                # We've traversed up to the root of the filesystem without
                # finding setup.py
                logger.warning(
                    "Could not find setup.py for directory %s (tried all "
                    "parent directories)",
                    orig_location,
                )
                return None
        # relative path of setup.py to repo root
        if samefile(root_dir, location):
            return None
        return os.path.relpath(location, root_dir)

    def get_src_requirement(self, dist, location):
        repo = self.get_url(location)
        if not repo:
            return None
        if not repo.lower().startswith('git:'):
            repo = 'git+' + repo
        egg_project_name = dist.egg_name().split('-', 1)[0]
        current_rev = self.get_revision(location)
        req = '%s@%s#egg=%s' % (repo, current_rev, egg_project_name)
        subdirectory = self._get_subdirectory(location)
        if subdirectory:
            req += '&subdirectory=' + subdirectory
        return req

    def get_url_rev(self):
        """
        Prefixes stub URLs like 'user@hostname:user/repo.git' with 'ssh://'.
        That's required because although they use SSH they sometimes don't
        work with an ssh:// scheme (e.g. GitHub). But we need a scheme for
        parsing, so we add it, remove it again afterwards, and return the URL
        as a stub.
        """
        if '://' not in self.url:
            assert 'file:' not in self.url
            self.url = self.url.replace('git+', 'git+ssh://')
            url, rev = super(Git, self).get_url_rev()
            url = url.replace('ssh://', '')
        else:
            url, rev = super(Git, self).get_url_rev()

        return url, rev
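
    # Illustrative example (made-up repository): a stub URL keeps its stub
    # form, while the trailing '@<rev>' is split off:
    #
    #     >>> Git('git+git@github.com:user/repo.git@v1.0').get_url_rev()
    #     ('git@github.com:user/repo.git', 'v1.0')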

    def update_submodules(self, location):
        if not os.path.exists(os.path.join(location, '.gitmodules')):
            return
        self.run_command(
            ['submodule', 'update', '--init', '--recursive', '-q'],
            cwd=location,
        )

    @classmethod
    def controls_location(cls, location):
        if super(Git, cls).controls_location(location):
            return True
        try:
            r = cls().run_command(['rev-parse'],
                                  cwd=location,
                                  show_stdout=False,
                                  on_returncode='ignore')
            return not r
        except BadCommand:
            logger.debug("could not determine if %s is under git control "
                         "because git is not available", location)
            return False


vcs.register(Git)
site-packages/pip/vcs/__pycache__/git.cpython-36.pyc000064400000021350147511334620016277 0ustar003

���e,-�@s�ddlmZddlZddlZddlZddlmZddlm	Z	ddl
mZddl
m
ZddlmZddlmZmZddlmZmZejZejZeje�ZGd	d
�d
e�Zeje�dS)�)�absolute_importN)�samefile)�
BadCommand)�parse)�request)�display_path�rmtree)�vcs�VersionControlcs�eZdZdZdZdZd7Zd8�fd
d�	Zdd
�Zdd�Z	dd�Z
dd�Zdd�Zdd�Z
dd�Zdd�Zdd�Zd9dd �Zd!d"�Zd#d$�Zd%d&�Zd'd(�Zd)d*�Zd:d+d,�Zd-d.�Zd/d0�Z�fd1d2�Zd3d4�Ze�fd5d6��Z�ZS);�Git�gitz.git�clone�git+http�	git+https�git+ssh�git+git�git+fileNcs�|r�t|�\}}}}}|jd�r�|dt|jd���}	|	tj|�jdd�jd�}
t|||
||f�}|jd�d}|d|�t||d�||
||f�}t	t
|�j|f|�|�dS)N�file�/�\�+�)�urlsplit�endswith�len�lstrip�urllib_requestZurl2pathname�replace�
urlunsplit�find�superr�__init__)�self�url�args�kwargs�schemeZnetloc�pathZqueryZfragment�initial_slashes�newpathZ
after_plus)�	__class__��/usr/lib/python3.6/git.pyr! s

zGit.__init__cCsTd}|jdgdd�}|j|�r0|t|�d�}nd}dj|jd�dd��}t|�S)Nzgit version �versionF)�show_stdout��.�)�run_command�
startswithr�join�split�
parse_version)r"ZVERSION_PFXr-r+r+r,�get_git_version5s
zGit.get_git_versioncCsVtjdd�}|j|�z0|jd�s*|d}|jdddd|gd|d	�Wd
t|�Xd
S)z@Export the Git repository at the url to the destination locationz-exportzpip-rzcheckout-indexz-az-fz--prefixF)r.�cwdN)�tempfileZmkdtemp�unpackrr2r)r"�locationZtemp_dirr+r+r,�exportBs

z
Git.exportcCsL|j||�}d|}||kr&||gS||kr8||gStjd|�|SdS)z�Check the revision options before checkout to compensate that tags
        and branches may need origin/ as a prefix.
        Returns the SHA1 of the branch or tag if found.
        z	origin/%sz5Could not find a tag or branch '%s', assuming commit.N)�get_short_refs�logger�warning)r"�rev�dest�rev_optionsZ	revisionsZ
origin_revr+r+r,�check_rev_optionsOs

zGit.check_rev_optionscCs|j|�j|d�S)a

        Compare the current sha to the ref. ref may be a branch or tag name,
        but current rev will always point to a sha. This means that a branch
        or tag will never compare as True. So this ultimately only matches
        against exact shas.
        r)�get_revisionr3)r"rArBr+r+r,�
check_versioncszGit.check_versioncCs8|jdd|g|d�|jddg||d�|j|�dS)N�configzremote.origin.url)r8�checkoutz-q)r2�update_submodules)r"rAr#rBr+r+r,�switchlsz
Git.switchcCst|j�td�kr&|jdddg|d�n|jddg|d�|rN|j|d||�}|jdddg||d�|j|�dS)	Nz1.9.0Zfetchz-qz--tags)r8r�resetz--hard)r7r6r2rCrH)r"rArBr+r+r,�updatersz
Git.updatecCs�|j�\}}|r |g}d|}n
dg}d}|j||||�r�tjd||t|��|jdd||g�|r�|j|||�}|j||�s�|jddg||d�|j|�dS)	Nz (to %s)z
origin/masterr/zCloning %s%s to %sr
z-qrG)r8)	�get_url_revZcheck_destinationr>�inforr2rCrErH)r"rAr#r@rBZrev_displayr+r+r,�obtain�s"

z
Git.obtaincCsZ|jdddgd|d�}|j�}|d}x|D]}|jd�r,|}Pq,W|jd�d	}|j�S)
z+Return URL of the first remote encountered.rFz--get-regexpzremote\..*\.urlF)r.r8rzremote.origin.url � r)r2�
splitlinesr3r5�strip)r"r;ZremotesZfound_remoteZremoter#r+r+r,�get_url�s


zGit.get_urlcCs|jddgd|d�}|j�S)Nz	rev-parseZHEADF)r.r8)r2rQ)r"r;�current_revr+r+r,rD�szGit.get_revisionr/ccs�|jd|gd|d�}xl|jd�D]^}|jd�}|s4q y|jdd�\}}Wn"tk
rjtd|����YnX|j�|j�fVq Wd	S)
z4Yields tuples of (commit, ref) for branches and tagszshow-refF)r.r8�
�
rOrzunexpected show-ref line: N)r2r5�rstrip�
ValueErrorrQ)r"r;�pattern�output�line�commit�refr+r+r,�
get_full_refs�s


zGit.get_full_refscCs
|jd�S)Nz
refs/remotes/)r3)r"r\r+r+r,�
is_ref_remote�szGit.is_ref_remotecCs
|jd�S)Nzrefs/heads/)r3)r"r\r+r+r,�
is_ref_branch�szGit.is_ref_branchcCs
|jd�S)Nz
refs/tags/)r3)r"r\r+r+r,�
is_ref_tag�szGit.is_ref_tagcCs"t|j|�|j|�|j|�f�S)z0A ref is a commit sha if it is not anything else)�anyr^r_r`)r"r\r+r+r,�
is_ref_commit�szGit.is_ref_commitcCs
|j|�S)N)r=)r"r;r+r+r,�get_refs�szGit.get_refscCs�i}x~|j||�D]n\}}d}|j|�r:|td�d�}n6|j|�rV|td�d�}n|j|�rp|td�d�}|dk	r|||<qW|S)z=Return map of named refs (branches or tags) to commit hashes.Nz
refs/remotes/zrefs/heads/z
refs/tags/)r]r^rr_r`)r"r;rX�rvr[r\Zref_namer+r+r,r=�s


zGit.get_short_refscCs�|jddgd|d�j�}tjj|�s2tjj||�}tjj|d�}|}xBtjjtjj|d��s�|}tjj|�}||krFtj	d|�dSqFWt
||�r�dStjj||�S)	z:Return the relative path of setup.py to the git repo root.z	rev-parsez	--git-dirF)r.r8z..zsetup.pyzGCould not find setup.py for directory %s (tried all parent directories)N)r2rQ�osr'�isabsr4�exists�dirnamer>r?r�relpath)r"r;Zgit_dirZroot_dirZ
orig_locationZ
last_locationr+r+r,�_get_subdirectory�s"

zGit._get_subdirectorycCsr|j|�}|j�jd�s d|}|j�jdd�d}|s<dS|j|�}d|||f}|j|�}|rn|d|7}|S)Nzgit:zgit+�-rrz%s@%s#egg=%sz&subdirectory=)rR�lowerr3Zegg_namer5rDrj)r"Zdistr;ZrepoZegg_project_namerSZreqZsubdirectoryr+r+r,�get_src_requirement�s


zGit.get_src_requirementcsbd|jkrHd|jkst�|jjdd�|_tt|�j�\}}|jdd�}ntt|�j�\}}||fS)a;
        Prefixes stub URLs like 'user@hostname:user/repo.git' with 'ssh://'.
        That's required because although they use SSH they sometimes doesn't
        work with a ssh:// scheme (e.g. Github). But we need a scheme for
        parsing. Hence we remove it again afterwards and return it as a stub.
        z://zfile:zgit+z
git+ssh://zssh://r/)r#�AssertionErrorrr rrL)r"r#r@)r*r+r,rLs
zGit.get_url_revcCs6tjjtjj|d��sdS|jdddddg|d�dS)Nz.gitmodulesZ	submodulerKz--initz--recursivez-q)r8)rer'rgr4r2)r"r;r+r+r,rHs
zGit.update_submodulescsVtt|�j|�rdSy|�jdg|ddd�}|Stk
rPtjd|�dSXdS)NTz	rev-parseF�ignore)r8r.Z
on_returncodezKcould not determine if %s is under git control because git is not available)r r�controls_locationr2rr>�debug)�clsr;�r)r*r+r,rp$s
zGit.controls_location)rrrrrr)N)r/)r/)�__name__�
__module__�__qualname__�namerhZ	repo_nameZschemesr!r7r<rCrErIrKrNrRrDr]r^r_r`rbrcr=rjrmrLrH�classmethodrp�
__classcell__r+r+)r*r,rs4

	
	
r)Z
__future__rZloggingr9Zos.pathreZ
pip.compatrZpip.exceptionsrZpip._vendor.six.moves.urllibrZurllib_parserrZpip._vendor.packaging.versionr6Z	pip.utilsrrZpip.vcsr	r
rrZ	getLoggerrtr>r�registerr+r+r+r,�<module>s"
site-packages/pip/vcs/__pycache__/__init__.cpython-36.opt-1.pyc000064400000025316147511334620020220 0ustar003

���eV0�@s�dZddlmZddlZddlZddlZddlZddlZddlm	Z
ddlmZddl
mZmZmZmZmZddgZeje�ZGd	d
�d
e�Ze�ZGdd�de�Zd
d�ZdS)z)Handles all VCS (version control) support�)�absolute_importN)�parse)�
BadCommand)�display_path�
backup_dir�call_subprocess�rmtree�ask_path_exists�vcs�get_src_requirementcs�eZdZiZddddddgZ�fdd�Zd	d
�Zedd��Zed
d��Z	edd��Z
dd�Zddd�Zdd�Z
dd�Zdd�Z�ZS)�
VcsSupportZsshZgitZhgZbzrZsftpZsvncs:tjj|j�ttdd�r(tjj|j�tt|�j�dS)N�
uses_fragment)	�urllib_parseZuses_netloc�extend�schemes�getattrr
�superr�__init__)�self)�	__class__��/usr/lib/python3.6/__init__.pyrszVcsSupport.__init__cCs
|jj�S)N)�	_registry�__iter__)rrrrr$szVcsSupport.__iter__cCst|jj��S)N)�listr�values)rrrr�backends'szVcsSupport.backendscCsdd�|jD�S)NcSsg|]
}|j�qSr)�dirname)�.0�backendrrr�
<listcomp>-sz'VcsSupport.dirnames.<locals>.<listcomp>)r)rrrr�dirnames+szVcsSupport.dirnamescCs$g}x|jD]}|j|j�qW|S)N)rrr)rrrrrr�all_schemes/szVcsSupport.all_schemescCsFt|d�stjd|j�dS|j|jkrB||j|j<tjd|j�dS)N�namezCannot register VCS %szRegistered VCS backend: %s)�hasattr�logger�warning�__name__r#r�debug)r�clsrrr�register6s
zVcsSupport.registerNcCs<||jkr|j|=n$||jj�kr.|j|j=n
tjd�dS)Nz0Cannot unregister because no class or name given)rrr#r%r&)rr)r#rrr�
unregister>s


zVcsSupport.unregistercCs8x2|jj�D]$}|j|�rtjd||j�|jSqWdS)z�
        Return the name of the version control backend if found at given
        location, e.g. vcs.get_backend_name('/path/to/vcs/checkout')
        zDetermine that %s uses VCS: %sN)rr�controls_locationr%r(r#)r�location�vc_typerrr�get_backend_nameFs


zVcsSupport.get_backend_namecCs |j�}||jkr|j|SdS)N)�lowerr)rr#rrr�get_backendRs
zVcsSupport.get_backendcCs|j|�}|r|j|�SdS)N)r/r1)rr-r.rrr�get_backend_from_locationWs

z$VcsSupport.get_backend_from_location)NN)r'�
__module__�__qualname__rrrr�propertyrr!r"r*r+r/r1r2�
__classcell__rr)rrrs	
rcs�eZdZdZdZfZd+�fdd�	Zdd�Zdd�Zd	d
�Z	dd�Z
d
d�Zdd�Zdd�Z
dd�Zdd�Zdd�Zdd�Zdd�Zdd�Zdd �Zd!d"�Zd#d$�Zd,d'd(�Zed)d*��Z�ZS)-�VersionControl�Ncs||_tt|�j||�dS)N)�urlrr7r)rr9�args�kwargs)rrrrgszVersionControl.__init__cCs"tjj|�\}}|jtjj�p |S)zy
           posix absolute paths start with os.path.sep,
           win32 ones start with drive (like c:\folder)
        )�os�path�
splitdrive�
startswith�sep)rZrepoZdrive�tailrrr�_is_local_repositoryksz#VersionControl._is_local_repositorycCs|jdd�S)N�/�_)�replace)rZsurnamerrr�translate_egg_surnameusz$VersionControl.translate_egg_surnamecCst�dS)z�
[compiled CPython 3.6 bytecode omitted: remainder of the pip/vcs/__init__ module (VersionControl base class -- export, get_url_rev, get_info, obtain, switch, update, check_destination, run_command); binary data is not recoverable as text]
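The readable docstrings embedded in the bytecode above describe how pip's VersionControl.get_url_rev splits an editable URL of the form <vcs>+<protocol>://<url>. The following standalone sketch reproduces that split; it is an illustration added here, not the archived implementation, and it uses only the Python 3 standard library.

# Illustrative sketch (not the archived implementation): split a
# '<vcs>+<protocol>://<url>[@rev]#egg=...' string into (url, rev), as the
# get_url_rev docstring above describes.
from urllib.parse import urlsplit, urlunsplit

def split_vcs_url(vcs_url):
    if '+' not in vcs_url:
        raise ValueError(
            "Sorry, %r is a malformed VCS url. "
            "The format is <vcs>+<protocol>://<url>" % vcs_url)
    plain = vcs_url.split('+', 1)[1]          # drop the '<vcs>+' prefix
    scheme, netloc, path, query, frag = urlsplit(plain)
    rev = None
    if '@' in path:
        path, rev = path.rsplit('@', 1)       # optional trailing revision
    return urlunsplit((scheme, netloc, path, query, '')), rev

# split_vcs_url('svn+https://myrepo/svn/MyApp@123#egg=MyApp')
#   -> ('https://myrepo/svn/MyApp', '123')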
site-packages/pip/vcs/__pycache__/mercurial.cpython-36.opt-1.pyc  [compiled CPython 3.6 bytecode omitted]
site-packages/pip/vcs/__pycache__/subversion.cpython-36.opt-1.pyc  [compiled CPython 3.6 bytecode omitted]
site-packages/pip/vcs/__pycache__/subversion.cpython-36.pyc  [compiled CPython 3.6 bytecode omitted]
site-packages/pip/vcs/__pycache__/__init__.cpython-36.pyc  [compiled CPython 3.6 bytecode omitted]
site-packages/pip/vcs/__pycache__/git.cpython-36.opt-1.pyc  [compiled CPython 3.6 bytecode omitted]
site-packages/pip/vcs/__pycache__/mercurial.cpython-36.pyc  [compiled CPython 3.6 bytecode omitted]
site-packages/pip/vcs/__pycache__/bazaar.cpython-36.opt-1.pyc  [compiled CPython 3.6 bytecode omitted]
site-packages/pip/vcs/__pycache__/bazaar.cpython-36.pyc  [compiled CPython 3.6 bytecode omitted]
site-packages/pip/commands/uninstall.py
from __future__ import absolute_import

import pip
from pip.wheel import WheelCache
from pip.req import InstallRequirement, RequirementSet, parse_requirements
from pip.basecommand import Command
from pip.exceptions import InstallationError


class UninstallCommand(Command):
    """
    Uninstall packages.

    pip is able to uninstall most installed packages. Known exceptions are:

    - Pure distutils packages installed with ``python setup.py install``, which
      leave behind no metadata to determine what files were installed.
    - Script wrappers installed by ``python setup.py develop``.
    """
    name = 'uninstall'
    usage = """
      %prog [options] <package> ...
      %prog [options] -r <requirements file> ..."""
    summary = 'Uninstall packages.'

    def __init__(self, *args, **kw):
        super(UninstallCommand, self).__init__(*args, **kw)
        self.cmd_opts.add_option(
            '-r', '--requirement',
            dest='requirements',
            action='append',
            default=[],
            metavar='file',
            help='Uninstall all the packages listed in the given requirements '
                 'file.  This option can be used multiple times.',
        )
        self.cmd_opts.add_option(
            '-y', '--yes',
            dest='yes',
            action='store_true',
            help="Don't ask for confirmation of uninstall deletions.")

        self.parser.insert_option_group(0, self.cmd_opts)

    def run(self, options, args):
        with self._build_session(options) as session:
            format_control = pip.index.FormatControl(set(), set())
            wheel_cache = WheelCache(options.cache_dir, format_control)
            requirement_set = RequirementSet(
                build_dir=None,
                src_dir=None,
                download_dir=None,
                isolated=options.isolated_mode,
                session=session,
                wheel_cache=wheel_cache,
            )
            for name in args:
                requirement_set.add_requirement(
                    InstallRequirement.from_line(
                        name, isolated=options.isolated_mode,
                        wheel_cache=wheel_cache
                    )
                )
            for filename in options.requirements:
                for req in parse_requirements(
                        filename,
                        options=options,
                        session=session,
                        wheel_cache=wheel_cache):
                    requirement_set.add_requirement(req)
            if not requirement_set.has_requirements:
                raise InstallationError(
                    'You must give at least one requirement to %(name)s (see '
                    '"pip help %(name)s")' % dict(name=self.name)
                )
            requirement_set.uninstall(auto_confirm=options.yes)
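For reference, a minimal sketch of driving the UninstallCommand above from Python rather than the 'pip uninstall' command line. It is not part of the archived file and assumes the pip 9-era Command.main() entry point, which parses the argument list and then calls run().

# Sketch only (not part of the archived file).
from pip.commands.uninstall import UninstallCommand

def uninstall_packages(names, assume_yes=True):
    argv = list(names)
    if assume_yes:
        argv.append('--yes')               # skip the interactive confirmation
    return UninstallCommand().main(argv)   # returns a pip status code

# uninstall_packages(['example-package'])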
site-packages/pip/commands/hash.py
from __future__ import absolute_import

import hashlib
import logging
import sys

from pip.basecommand import Command
from pip.status_codes import ERROR
from pip.utils import read_chunks
from pip.utils.hashes import FAVORITE_HASH, STRONG_HASHES


logger = logging.getLogger(__name__)


class HashCommand(Command):
    """
    Compute a hash of a local package archive.

    These can be used with --hash in a requirements file to do repeatable
    installs.

    """
    name = 'hash'
    usage = '%prog [options] <file> ...'
    summary = 'Compute hashes of package archives.'

    def __init__(self, *args, **kw):
        super(HashCommand, self).__init__(*args, **kw)
        self.cmd_opts.add_option(
            '-a', '--algorithm',
            dest='algorithm',
            choices=STRONG_HASHES,
            action='store',
            default=FAVORITE_HASH,
            help='The hash algorithm to use: one of %s' %
                 ', '.join(STRONG_HASHES))
        self.parser.insert_option_group(0, self.cmd_opts)

    def run(self, options, args):
        if not args:
            self.parser.print_usage(sys.stderr)
            return ERROR

        algorithm = options.algorithm
        for path in args:
            logger.info('%s:\n--hash=%s:%s',
                        path, algorithm, _hash_of_file(path, algorithm))


def _hash_of_file(path, algorithm):
    """Return the hash digest of a file."""
    with open(path, 'rb') as archive:
        hash = hashlib.new(algorithm)
        for chunk in read_chunks(archive):
            hash.update(chunk)
    return hash.hexdigest()
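As the docstring above notes, the printed digest is meant for --hash pins in a requirements file. Below is a standalone sketch of the same computation using only the standard library (FAVORITE_HASH in pip.utils.hashes resolves to sha256); it is an added illustration, not part of the archive.

import hashlib

def sha256_of_file(path, chunk_size=8192):
    # Stream the file in chunks, mirroring _hash_of_file() above.
    digest = hashlib.sha256()
    with open(path, 'rb') as archive:
        for chunk in iter(lambda: archive.read(chunk_size), b''):
            digest.update(chunk)
    return digest.hexdigest()

# The value is what a requirements file pins with:
#   SomePackage==1.0 --hash=sha256:<digest>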
site-packages/pip/commands/download.py
from __future__ import absolute_import

import logging
import os

from pip.exceptions import CommandError
from pip.index import FormatControl
from pip.req import RequirementSet
from pip.basecommand import RequirementCommand
from pip import cmdoptions
from pip.utils import ensure_dir, normalize_path
from pip.utils.build import BuildDirectory
from pip.utils.filesystem import check_path_owner


logger = logging.getLogger(__name__)


class DownloadCommand(RequirementCommand):
    """
    Download packages from:

    - PyPI (and other indexes) using requirement specifiers.
    - VCS project urls.
    - Local project directories.
    - Local or remote source archives.

    pip also supports downloading from "requirements files", which provide
    an easy way to specify a whole environment to be downloaded.
    """
    name = 'download'

    usage = """
      %prog [options] <requirement specifier> [package-index-options] ...
      %prog [options] -r <requirements file> [package-index-options] ...
      %prog [options] [-e] <vcs project url> ...
      %prog [options] [-e] <local project path> ...
      %prog [options] <archive url/path> ..."""

    summary = 'Download packages.'

    def __init__(self, *args, **kw):
        super(DownloadCommand, self).__init__(*args, **kw)

        cmd_opts = self.cmd_opts

        cmd_opts.add_option(cmdoptions.constraints())
        cmd_opts.add_option(cmdoptions.editable())
        cmd_opts.add_option(cmdoptions.requirements())
        cmd_opts.add_option(cmdoptions.build_dir())
        cmd_opts.add_option(cmdoptions.no_deps())
        cmd_opts.add_option(cmdoptions.global_options())
        cmd_opts.add_option(cmdoptions.no_binary())
        cmd_opts.add_option(cmdoptions.only_binary())
        cmd_opts.add_option(cmdoptions.src())
        cmd_opts.add_option(cmdoptions.pre())
        cmd_opts.add_option(cmdoptions.no_clean())
        cmd_opts.add_option(cmdoptions.require_hashes())

        cmd_opts.add_option(
            '-d', '--dest', '--destination-dir', '--destination-directory',
            dest='download_dir',
            metavar='dir',
            default=os.curdir,
            help=("Download packages into <dir>."),
        )

        cmd_opts.add_option(
            '--platform',
            dest='platform',
            metavar='platform',
            default=None,
            help=("Only download wheels compatible with <platform>. "
                  "Defaults to the platform of the running system."),
        )

        cmd_opts.add_option(
            '--python-version',
            dest='python_version',
            metavar='python_version',
            default=None,
            help=("Only download wheels compatible with Python "
                  "interpreter version <version>. If not specified, then the "
                  "current system interpreter minor version is used. A major "
                  "version (e.g. '2') can be specified to match all "
                  "minor revs of that major version.  A minor version "
                  "(e.g. '34') can also be specified."),
        )

        cmd_opts.add_option(
            '--implementation',
            dest='implementation',
            metavar='implementation',
            default=None,
            help=("Only download wheels compatible with Python "
                  "implementation <implementation>, e.g. 'pp', 'jy', 'cp', "
                  " or 'ip'. If not specified, then the current "
                  "interpreter implementation is used.  Use 'py' to force "
                  "implementation-agnostic wheels."),
        )

        cmd_opts.add_option(
            '--abi',
            dest='abi',
            metavar='abi',
            default=None,
            help=("Only download wheels compatible with Python "
                  "abi <abi>, e.g. 'pypy_41'.  If not specified, then the "
                  "current interpreter abi tag is used.  Generally "
                  "you will need to specify --implementation, "
                  "--platform, and --python-version when using "
                  "this option."),
        )

        index_opts = cmdoptions.make_option_group(
            cmdoptions.non_deprecated_index_group,
            self.parser,
        )

        self.parser.insert_option_group(0, index_opts)
        self.parser.insert_option_group(0, cmd_opts)

    def run(self, options, args):
        options.ignore_installed = True

        if options.python_version:
            python_versions = [options.python_version]
        else:
            python_versions = None

        dist_restriction_set = any([
            options.python_version,
            options.platform,
            options.abi,
            options.implementation,
        ])
        binary_only = FormatControl(set(), set([':all:']))
        if dist_restriction_set and options.format_control != binary_only:
            raise CommandError(
                "--only-binary=:all: must be set and --no-binary must not "
                "be set (or must be set to :none:) when restricting platform "
                "and interpreter constraints using --python-version, "
                "--platform, --abi, or --implementation."
            )

        options.src_dir = os.path.abspath(options.src_dir)
        options.download_dir = normalize_path(options.download_dir)

        ensure_dir(options.download_dir)

        with self._build_session(options) as session:
            finder = self._build_package_finder(
                options=options,
                session=session,
                platform=options.platform,
                python_versions=python_versions,
                abi=options.abi,
                implementation=options.implementation,
            )
            build_delete = (not (options.no_clean or options.build_dir))
            if options.cache_dir and not check_path_owner(options.cache_dir):
                logger.warning(
                    "The directory '%s' or its parent directory is not owned "
                    "by the current user and caching wheels has been "
                    "disabled. check the permissions and owner of that "
                    "directory. If executing pip with sudo, you may want "
                    "sudo's -H flag.",
                    options.cache_dir,
                )
                options.cache_dir = None

            with BuildDirectory(options.build_dir,
                                delete=build_delete) as build_dir:

                requirement_set = RequirementSet(
                    build_dir=build_dir,
                    src_dir=options.src_dir,
                    download_dir=options.download_dir,
                    ignore_installed=True,
                    ignore_dependencies=options.ignore_dependencies,
                    session=session,
                    isolated=options.isolated_mode,
                    require_hashes=options.require_hashes
                )
                self.populate_requirement_set(
                    requirement_set,
                    args,
                    options,
                    finder,
                    session,
                    self.name,
                    None
                )

                if not requirement_set.has_requirements:
                    return

                requirement_set.prepare_files(finder)

                downloaded = ' '.join([
                    req.name for req in requirement_set.successfully_downloaded
                ])
                if downloaded:
                    logger.info(
                        'Successfully downloaded %s', downloaded
                    )

                # Clean up
                if not options.no_clean:
                    requirement_set.cleanup_files()

        return requirement_set
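A short sketch of exercising the options defined above to fill a local wheelhouse for another platform; it is illustrative only, not part of the archive, and it assumes the pip 9-era Command.main() entry point. Note that --only-binary :all: is required because of the format-control check in run().

# Sketch only (not part of the archived file).  Equivalent CLI:
#   pip download requests==2.12.0 --dest ./wheelhouse \
#       --platform manylinux1_x86_64 --only-binary :all:
from pip.commands.download import DownloadCommand

status = DownloadCommand().main([
    'requests==2.12.0',                    # hypothetical pinned requirement
    '--dest', './wheelhouse',
    '--platform', 'manylinux1_x86_64',
    '--only-binary', ':all:',
])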
site-packages/pip/commands/list.py
from __future__ import absolute_import

import json
import logging
import warnings
try:
    from itertools import zip_longest
except ImportError:
    from itertools import izip_longest as zip_longest

from pip._vendor import six

from pip.basecommand import Command
from pip.exceptions import CommandError
from pip.index import PackageFinder
from pip.utils import (
    get_installed_distributions, dist_is_editable)
from pip.utils.deprecation import RemovedInPip10Warning
from pip.cmdoptions import make_option_group, index_group

logger = logging.getLogger(__name__)


class ListCommand(Command):
    """
    List installed packages, including editables.

    Packages are listed in a case-insensitive sorted order.
    """
    name = 'list'
    usage = """
      %prog [options]"""
    summary = 'List installed packages.'

    def __init__(self, *args, **kw):
        super(ListCommand, self).__init__(*args, **kw)

        cmd_opts = self.cmd_opts

        cmd_opts.add_option(
            '-o', '--outdated',
            action='store_true',
            default=False,
            help='List outdated packages')
        cmd_opts.add_option(
            '-u', '--uptodate',
            action='store_true',
            default=False,
            help='List uptodate packages')
        cmd_opts.add_option(
            '-e', '--editable',
            action='store_true',
            default=False,
            help='List editable projects.')
        cmd_opts.add_option(
            '-l', '--local',
            action='store_true',
            default=False,
            help=('If in a virtualenv that has global access, do not list '
                  'globally-installed packages.'),
        )
        self.cmd_opts.add_option(
            '--user',
            dest='user',
            action='store_true',
            default=False,
            help='Only output packages installed in user-site.')

        cmd_opts.add_option(
            '--pre',
            action='store_true',
            default=False,
            help=("Include pre-release and development versions. By default, "
                  "pip only finds stable versions."),
        )

        cmd_opts.add_option(
            '--format',
            action='store',
            dest='list_format',
            choices=('legacy', 'columns', 'freeze', 'json'),
            help="Select the output format among: legacy (default), columns, "
                 "freeze or json.",
        )

        cmd_opts.add_option(
            '--not-required',
            action='store_true',
            dest='not_required',
            help="List packages that are not dependencies of "
                 "installed packages.",
        )

        index_opts = make_option_group(index_group, self.parser)

        self.parser.insert_option_group(0, index_opts)
        self.parser.insert_option_group(0, cmd_opts)

    def _build_package_finder(self, options, index_urls, session):
        """
        Create a package finder appropriate to this list command.
        """
        return PackageFinder(
            find_links=options.find_links,
            index_urls=index_urls,
            allow_all_prereleases=options.pre,
            trusted_hosts=options.trusted_hosts,
            process_dependency_links=options.process_dependency_links,
            session=session,
        )

    def run(self, options, args):
        if options.allow_external:
            warnings.warn(
                "--allow-external has been deprecated and will be removed in "
                "the future. Due to changes in the repository protocol, it no "
                "longer has any effect.",
                RemovedInPip10Warning,
            )

        if options.allow_all_external:
            warnings.warn(
                "--allow-all-external has been deprecated and will be removed "
                "in the future. Due to changes in the repository protocol, it "
                "no longer has any effect.",
                RemovedInPip10Warning,
            )

        if options.allow_unverified:
            warnings.warn(
                "--allow-unverified has been deprecated and will be removed "
                "in the future. Due to changes in the repository protocol, it "
                "no longer has any effect.",
                RemovedInPip10Warning,
            )

        if options.list_format is None:
            warnings.warn(
                "The default format will switch to columns in the future. "
                "You can use --format=(legacy|columns) (or define a "
                "format=(legacy|columns) in your pip.conf under the [list] "
                "section) to disable this warning.",
                RemovedInPip10Warning,
            )

        if options.outdated and options.uptodate:
            raise CommandError(
                "Options --outdated and --uptodate cannot be combined.")

        packages = get_installed_distributions(
            local_only=options.local,
            user_only=options.user,
            editables_only=options.editable,
        )

        if options.outdated:
            packages = self.get_outdated(packages, options)
        elif options.uptodate:
            packages = self.get_uptodate(packages, options)

        if options.not_required:
            packages = self.get_not_required(packages, options)

        self.output_package_listing(packages, options)

    def get_outdated(self, packages, options):
        return [
            dist for dist in self.iter_packages_latest_infos(packages, options)
            if dist.latest_version > dist.parsed_version
        ]

    def get_uptodate(self, packages, options):
        return [
            dist for dist in self.iter_packages_latest_infos(packages, options)
            if dist.latest_version == dist.parsed_version
        ]

    def get_not_required(self, packages, options):
        dep_keys = set()
        for dist in packages:
            dep_keys.update(requirement.key for requirement in dist.requires())
        return set(pkg for pkg in packages if pkg.key not in dep_keys)

    def iter_packages_latest_infos(self, packages, options):
        index_urls = [options.index_url] + options.extra_index_urls
        if options.no_index:
            logger.debug('Ignoring indexes: %s', ','.join(index_urls))
            index_urls = []

        dependency_links = []
        for dist in packages:
            if dist.has_metadata('dependency_links.txt'):
                dependency_links.extend(
                    dist.get_metadata_lines('dependency_links.txt'),
                )

        with self._build_session(options) as session:
            finder = self._build_package_finder(options, index_urls, session)
            finder.add_dependency_links(dependency_links)

            for dist in packages:
                typ = 'unknown'
                all_candidates = finder.find_all_candidates(dist.key)
                if not options.pre:
                    # Remove prereleases
                    all_candidates = [candidate for candidate in all_candidates
                                      if not candidate.version.is_prerelease]

                if not all_candidates:
                    continue
                best_candidate = max(all_candidates,
                                     key=finder._candidate_sort_key)
                remote_version = best_candidate.version
                if best_candidate.location.is_wheel:
                    typ = 'wheel'
                else:
                    typ = 'sdist'
                # This is dirty but makes the rest of the code much cleaner
                dist.latest_version = remote_version
                dist.latest_filetype = typ
                yield dist

    def output_legacy(self, dist):
        if dist_is_editable(dist):
            return '%s (%s, %s)' % (
                dist.project_name,
                dist.version,
                dist.location,
            )
        else:
            return '%s (%s)' % (dist.project_name, dist.version)

    def output_legacy_latest(self, dist):
        return '%s - Latest: %s [%s]' % (
            self.output_legacy(dist),
            dist.latest_version,
            dist.latest_filetype,
        )

    def output_package_listing(self, packages, options):
        packages = sorted(
            packages,
            key=lambda dist: dist.project_name.lower(),
        )
        if options.list_format == 'columns' and packages:
            data, header = format_for_columns(packages, options)
            self.output_package_listing_columns(data, header)
        elif options.list_format == 'freeze':
            for dist in packages:
                logger.info("%s==%s", dist.project_name, dist.version)
        elif options.list_format == 'json':
            logger.info(format_for_json(packages, options))
        else:  # legacy
            for dist in packages:
                if options.outdated:
                    logger.info(self.output_legacy_latest(dist))
                else:
                    logger.info(self.output_legacy(dist))

    def output_package_listing_columns(self, data, header):
        # insert the header first: we need to know the size of column names
        if len(data) > 0:
            data.insert(0, header)

        pkg_strings, sizes = tabulate(data)

        # Create and add a separator.
        if len(data) > 0:
            pkg_strings.insert(1, " ".join(map(lambda x: '-' * x, sizes)))

        for val in pkg_strings:
            logger.info(val)


def tabulate(vals):
    # From pfmoore on GitHub:
    # https://github.com/pypa/pip/issues/3651#issuecomment-216932564
    assert len(vals) > 0

    sizes = [0] * max(len(x) for x in vals)
    for row in vals:
        sizes = [max(s, len(str(c))) for s, c in zip_longest(sizes, row)]

    result = []
    for row in vals:
        display = " ".join([str(c).ljust(s) if c is not None else ''
                            for s, c in zip_longest(sizes, row)])
        result.append(display)

    return result, sizes


def format_for_columns(pkgs, options):
    """
    Convert the package data into something usable
    by output_package_listing_columns.
    """
    running_outdated = options.outdated
    # Adjust the header for the `pip list --outdated` case.
    if running_outdated:
        header = ["Package", "Version", "Latest", "Type"]
    else:
        header = ["Package", "Version"]

    data = []
    if any(dist_is_editable(x) for x in pkgs):
        header.append("Location")

    for proj in pkgs:
        # if we're working on the 'outdated' list, separate out the
        # latest_version and type
        row = [proj.project_name, proj.version]

        if running_outdated:
            row.append(proj.latest_version)
            row.append(proj.latest_filetype)

        if dist_is_editable(proj):
            row.append(proj.location)

        data.append(row)

    return data, header


def format_for_json(packages, options):
    data = []
    for dist in packages:
        info = {
            'name': dist.project_name,
            'version': six.text_type(dist.version),
        }
        if options.outdated:
            info['latest_version'] = six.text_type(dist.latest_version)
            info['latest_filetype'] = dist.latest_filetype
        data.append(info)
    return json.dumps(data)
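A self-contained sketch of the column layout produced by tabulate() and output_package_listing_columns() above: each column is padded to its widest cell and a dashed rule is inserted after the header row. Added for illustration; the sample package data is made up.

from itertools import zip_longest

rows = [["Package", "Version"], ["requests", "2.12.4"], ["six", "1.10.0"]]
sizes = [0] * max(len(r) for r in rows)
for row in rows:
    sizes = [max(s, len(str(c))) for s, c in zip_longest(sizes, row)]
lines = [" ".join(str(c).ljust(s) for s, c in zip_longest(sizes, row))
         for row in rows]
lines.insert(1, " ".join("-" * s for s in sizes))
print("\n".join(lines))
# Package  Version
# -------- -------
# requests 2.12.4
# six      1.10.0
# (data rows carry trailing padding up to the column width)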
site-packages/pip/commands/help.py
from __future__ import absolute_import

from pip.basecommand import Command, SUCCESS
from pip.exceptions import CommandError


class HelpCommand(Command):
    """Show help for commands"""
    name = 'help'
    usage = """
      %prog <command>"""
    summary = 'Show help for commands.'

    def run(self, options, args):
        from pip.commands import commands_dict, get_similar_commands

        try:
            # 'pip help' with no args is handled by pip.__init__.parseopt()
            cmd_name = args[0]  # the command we need help for
        except IndexError:
            return SUCCESS

        if cmd_name not in commands_dict:
            guess = get_similar_commands(cmd_name)

            msg = ['unknown command "%s"' % cmd_name]
            if guess:
                msg.append('maybe you meant "%s"' % guess)

            raise CommandError(' - '.join(msg))

        command = commands_dict[cmd_name]()
        command.parser.print_help()

        return SUCCESS
site-packages/pip/commands/__init__.py000064400000004304147511334620013775 0ustar00"""
Package containing all pip commands
"""
from __future__ import absolute_import

from pip.commands.completion import CompletionCommand
from pip.commands.download import DownloadCommand
from pip.commands.freeze import FreezeCommand
from pip.commands.hash import HashCommand
from pip.commands.help import HelpCommand
from pip.commands.list import ListCommand
from pip.commands.check import CheckCommand
from pip.commands.search import SearchCommand
from pip.commands.show import ShowCommand
from pip.commands.install import InstallCommand
from pip.commands.uninstall import UninstallCommand
from pip.commands.wheel import WheelCommand


commands_dict = {
    CompletionCommand.name: CompletionCommand,
    FreezeCommand.name: FreezeCommand,
    HashCommand.name: HashCommand,
    HelpCommand.name: HelpCommand,
    SearchCommand.name: SearchCommand,
    ShowCommand.name: ShowCommand,
    InstallCommand.name: InstallCommand,
    UninstallCommand.name: UninstallCommand,
    DownloadCommand.name: DownloadCommand,
    ListCommand.name: ListCommand,
    CheckCommand.name: CheckCommand,
    WheelCommand.name: WheelCommand,
}


commands_order = [
    InstallCommand,
    DownloadCommand,
    UninstallCommand,
    FreezeCommand,
    ListCommand,
    ShowCommand,
    CheckCommand,
    SearchCommand,
    WheelCommand,
    HashCommand,
    CompletionCommand,
    HelpCommand,
]


def get_summaries(ordered=True):
    """Yields sorted (command name, command summary) tuples."""

    if ordered:
        cmditems = _sort_commands(commands_dict, commands_order)
    else:
        cmditems = commands_dict.items()

    for name, command_class in cmditems:
        yield (name, command_class.summary)


def get_similar_commands(name):
    """Command name auto-correct."""
    from difflib import get_close_matches

    name = name.lower()

    close_commands = get_close_matches(name, commands_dict.keys())

    if close_commands:
        return close_commands[0]
    else:
        return False
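
# Illustrative behaviour (doctest-style; the suggestion comes from difflib's
# get_close_matches over the registered command names):
#     >>> get_similar_commands('instal')
#     'install'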


def _sort_commands(cmddict, order):
    def keyfn(key):
        try:
            return order.index(key[1])
        except ValueError:
            # unordered items should come last
            return 0xff

    return sorted(cmddict.items(), key=keyfn)
site-packages/pip/commands/wheel.py000064400000017061147511334620013346 0ustar00# -*- coding: utf-8 -*-
from __future__ import absolute_import

import logging
import os
import warnings

from pip.basecommand import RequirementCommand
from pip.exceptions import CommandError, PreviousBuildDirError
from pip.req import RequirementSet
from pip.utils import import_or_raise
from pip.utils.build import BuildDirectory
from pip.utils.deprecation import RemovedInPip10Warning
from pip.wheel import WheelCache, WheelBuilder
from pip import cmdoptions


logger = logging.getLogger(__name__)


class WheelCommand(RequirementCommand):
    """
    Build Wheel archives for your requirements and dependencies.

    Wheel is a built-package format, and offers the advantage of not
    recompiling your software during every install. For more details, see the
    wheel docs: https://wheel.readthedocs.io/en/latest/

    Requirements: setuptools>=0.8, and wheel.

    'pip wheel' uses the bdist_wheel setuptools extension from the wheel
    package to build individual wheels.

    """

    name = 'wheel'
    usage = """
      %prog [options] <requirement specifier> ...
      %prog [options] -r <requirements file> ...
      %prog [options] [-e] <vcs project url> ...
      %prog [options] [-e] <local project path> ...
      %prog [options] <archive url/path> ..."""

    summary = 'Build wheels from your requirements.'

    def __init__(self, *args, **kw):
        super(WheelCommand, self).__init__(*args, **kw)

        cmd_opts = self.cmd_opts

        cmd_opts.add_option(
            '-w', '--wheel-dir',
            dest='wheel_dir',
            metavar='dir',
            default=os.curdir,
            help=("Build wheels into <dir>, where the default is the "
                  "current working directory."),
        )
        cmd_opts.add_option(cmdoptions.use_wheel())
        cmd_opts.add_option(cmdoptions.no_use_wheel())
        cmd_opts.add_option(cmdoptions.no_binary())
        cmd_opts.add_option(cmdoptions.only_binary())
        cmd_opts.add_option(
            '--build-option',
            dest='build_options',
            metavar='options',
            action='append',
            help="Extra arguments to be supplied to 'setup.py bdist_wheel'.")
        cmd_opts.add_option(cmdoptions.constraints())
        cmd_opts.add_option(cmdoptions.editable())
        cmd_opts.add_option(cmdoptions.requirements())
        cmd_opts.add_option(cmdoptions.src())
        cmd_opts.add_option(cmdoptions.ignore_requires_python())
        cmd_opts.add_option(cmdoptions.no_deps())
        cmd_opts.add_option(cmdoptions.build_dir())

        cmd_opts.add_option(
            '--global-option',
            dest='global_options',
            action='append',
            metavar='options',
            help="Extra global options to be supplied to the setup.py "
            "call before the 'bdist_wheel' command.")

        cmd_opts.add_option(
            '--pre',
            action='store_true',
            default=False,
            help=("Include pre-release and development versions. By default, "
                  "pip only finds stable versions."),
        )

        cmd_opts.add_option(cmdoptions.no_clean())
        cmd_opts.add_option(cmdoptions.require_hashes())

        index_opts = cmdoptions.make_option_group(
            cmdoptions.index_group,
            self.parser,
        )

        self.parser.insert_option_group(0, index_opts)
        self.parser.insert_option_group(0, cmd_opts)

    def check_required_packages(self):
        import_or_raise(
            'wheel.bdist_wheel',
            CommandError,
            "'pip wheel' requires the 'wheel' package. To fix this, run: "
            "pip install wheel"
        )
        pkg_resources = import_or_raise(
            'pkg_resources',
            CommandError,
            "'pip wheel' requires setuptools >= 0.8 for dist-info support."
            " To fix this, run: pip install --upgrade setuptools"
        )
        if not hasattr(pkg_resources, 'DistInfoDistribution'):
            raise CommandError(
                "'pip wheel' requires setuptools >= 0.8 for dist-info "
                "support. To fix this, run: pip install --upgrade "
                "setuptools"
            )

    def run(self, options, args):
        self.check_required_packages()
        cmdoptions.resolve_wheel_no_use_binary(options)
        cmdoptions.check_install_build_global(options)

        if options.allow_external:
            warnings.warn(
                "--allow-external has been deprecated and will be removed in "
                "the future. Due to changes in the repository protocol, it no "
                "longer has any effect.",
                RemovedInPip10Warning,
            )

        if options.allow_all_external:
            warnings.warn(
                "--allow-all-external has been deprecated and will be removed "
                "in the future. Due to changes in the repository protocol, it "
                "no longer has any effect.",
                RemovedInPip10Warning,
            )

        if options.allow_unverified:
            warnings.warn(
                "--allow-unverified has been deprecated and will be removed "
                "in the future. Due to changes in the repository protocol, it "
                "no longer has any effect.",
                RemovedInPip10Warning,
            )

        index_urls = [options.index_url] + options.extra_index_urls
        if options.no_index:
            logger.debug('Ignoring indexes: %s', ','.join(index_urls))
            index_urls = []

        if options.build_dir:
            options.build_dir = os.path.abspath(options.build_dir)

        options.src_dir = os.path.abspath(options.src_dir)

        with self._build_session(options) as session:
            finder = self._build_package_finder(options, session)
            build_delete = (not (options.no_clean or options.build_dir))
            wheel_cache = WheelCache(options.cache_dir, options.format_control)
            with BuildDirectory(options.build_dir,
                                delete=build_delete) as build_dir:
                requirement_set = RequirementSet(
                    build_dir=build_dir,
                    src_dir=options.src_dir,
                    download_dir=None,
                    ignore_dependencies=options.ignore_dependencies,
                    ignore_installed=True,
                    ignore_requires_python=options.ignore_requires_python,
                    isolated=options.isolated_mode,
                    session=session,
                    wheel_cache=wheel_cache,
                    wheel_download_dir=options.wheel_dir,
                    require_hashes=options.require_hashes
                )

                self.populate_requirement_set(
                    requirement_set, args, options, finder, session, self.name,
                    wheel_cache
                )

                if not requirement_set.has_requirements:
                    return

                try:
                    # build wheels
                    wb = WheelBuilder(
                        requirement_set,
                        finder,
                        build_options=options.build_options or [],
                        global_options=options.global_options or [],
                    )
                    if not wb.build():
                        raise CommandError(
                            "Failed to build one or more wheels"
                        )
                except PreviousBuildDirError:
                    options.no_clean = True
                    raise
                finally:
                    if not options.no_clean:
                        requirement_set.cleanup_files()
site-packages/pip/commands/check.py000064400000002546147511334620013321 0ustar00import logging

from pip.basecommand import Command
from pip.operations.check import check_requirements
from pip.utils import get_installed_distributions


logger = logging.getLogger(__name__)


class CheckCommand(Command):
    """Verify installed packages have compatible dependencies."""
    name = 'check'
    usage = """
      %prog [options]"""
    summary = 'Verify installed packages have compatible dependencies.'

    def run(self, options, args):
        dists = get_installed_distributions(local_only=False, skip=())
        missing_reqs_dict, incompatible_reqs_dict = check_requirements(dists)

        for dist in dists:
            key = '%s==%s' % (dist.project_name, dist.version)

            for requirement in missing_reqs_dict.get(key, []):
                logger.info(
                    "%s %s requires %s, which is not installed.",
                    dist.project_name, dist.version, requirement.project_name)

            for requirement, actual in incompatible_reqs_dict.get(key, []):
                logger.info(
                    "%s %s has requirement %s, but you have %s %s.",
                    dist.project_name, dist.version, requirement,
                    actual.project_name, actual.version)

        if missing_reqs_dict or incompatible_reqs_dict:
            return 1
        else:
            logger.info("No broken requirements found.")
site-packages/pip/commands/completion.py000064400000004625147511334620014415 0ustar00from __future__ import absolute_import

import sys
from pip.basecommand import Command

BASE_COMPLETION = """
# pip %(shell)s completion start%(script)s# pip %(shell)s completion end
"""

COMPLETION_SCRIPTS = {
    'bash': """
_pip_completion()
{
    COMPREPLY=( $( COMP_WORDS="${COMP_WORDS[*]}" \\
                   COMP_CWORD=$COMP_CWORD \\
                   PIP_AUTO_COMPLETE=1 $1 ) )
}
complete -o default -F _pip_completion pip
""", 'zsh': """
function _pip_completion {
  local words cword
  read -Ac words
  read -cn cword
  reply=( $( COMP_WORDS="$words[*]" \\
             COMP_CWORD=$(( cword-1 )) \\
             PIP_AUTO_COMPLETE=1 $words[1] ) )
}
compctl -K _pip_completion pip
""", 'fish': """
function __fish_complete_pip
    set -lx COMP_WORDS (commandline -o) ""
    set -lx COMP_CWORD (math (contains -i -- (commandline -t) $COMP_WORDS)-1)
    set -lx PIP_AUTO_COMPLETE 1
    string split \  -- (eval $COMP_WORDS[1])
end
complete -fa "(__fish_complete_pip)" -c pip
"""}


class CompletionCommand(Command):
    """A helper command to be used for command completion."""
    name = 'completion'
    summary = 'A helper command used for command completion.'

    def __init__(self, *args, **kw):
        super(CompletionCommand, self).__init__(*args, **kw)

        cmd_opts = self.cmd_opts

        cmd_opts.add_option(
            '--bash', '-b',
            action='store_const',
            const='bash',
            dest='shell',
            help='Emit completion code for bash')
        cmd_opts.add_option(
            '--zsh', '-z',
            action='store_const',
            const='zsh',
            dest='shell',
            help='Emit completion code for zsh')
        cmd_opts.add_option(
            '--fish', '-f',
            action='store_const',
            const='fish',
            dest='shell',
            help='Emit completion code for fish')

        self.parser.insert_option_group(0, cmd_opts)

    def run(self, options, args):
        """Prints the completion code of the given shell"""
        shells = COMPLETION_SCRIPTS.keys()
        shell_options = ['--' + shell for shell in sorted(shells)]
        if options.shell in shells:
            script = COMPLETION_SCRIPTS.get(options.shell, '')
            print(BASE_COMPLETION % {'script': script, 'shell': options.shell})
        else:
            sys.stderr.write(
                'ERROR: You must pass %s\n' % ' or '.join(shell_options)
            )
site-packages/pip/commands/install.py000064400000043561147511334620013714 0ustar00from __future__ import absolute_import

import logging
import operator
import os
import tempfile
import shutil
import warnings
import sys
from os import path
try:
    import wheel
except ImportError:
    wheel = None

from pip.req import RequirementSet
from pip.basecommand import RequirementCommand
from pip.locations import virtualenv_no_global, distutils_scheme
from pip.exceptions import (
    InstallationError, CommandError, PreviousBuildDirError,
)
from pip import cmdoptions
from pip.utils import ensure_dir, get_installed_version
from pip.utils.build import BuildDirectory
from pip.utils.deprecation import RemovedInPip10Warning
from pip.utils.filesystem import check_path_owner
from pip.wheel import WheelCache, WheelBuilder


logger = logging.getLogger(__name__)


class InstallCommand(RequirementCommand):
    """
    Install packages from:

    - PyPI (and other indexes) using requirement specifiers.
    - VCS project urls.
    - Local project directories.
    - Local or remote source archives.

    pip also supports installing from "requirements files", which provide
    an easy way to specify a whole environment to be installed.
    """
    name = 'install'

    usage = """
      %prog [options] <requirement specifier> [package-index-options] ...
      %prog [options] -r <requirements file> [package-index-options] ...
      %prog [options] [-e] <vcs project url> ...
      %prog [options] [-e] <local project path> ...
      %prog [options] <archive url/path> ..."""

    summary = 'Install packages.'

    def __init__(self, *args, **kw):
        super(InstallCommand, self).__init__(*args, **kw)

        cmd_opts = self.cmd_opts

        cmd_opts.add_option(cmdoptions.constraints())
        cmd_opts.add_option(cmdoptions.editable())
        cmd_opts.add_option(cmdoptions.requirements())
        cmd_opts.add_option(cmdoptions.build_dir())

        cmd_opts.add_option(
            '-t', '--target',
            dest='target_dir',
            metavar='dir',
            default=None,
            help='Install packages into <dir>. '
                 'By default this will not replace existing files/folders in '
                 '<dir>. Use --upgrade to replace existing packages in <dir> '
                 'with new versions.'
        )

        cmd_opts.add_option(
            '-d', '--download', '--download-dir', '--download-directory',
            dest='download_dir',
            metavar='dir',
            default=None,
            help=("Download packages into <dir> instead of installing them, "
                  "regardless of what's already installed."),
        )

        cmd_opts.add_option(cmdoptions.src())

        cmd_opts.add_option(
            '-U', '--upgrade',
            dest='upgrade',
            action='store_true',
            help='Upgrade all specified packages to the newest available '
                 'version. The handling of dependencies depends on the '
                 'upgrade-strategy used.'
        )

        cmd_opts.add_option(
            '--upgrade-strategy',
            dest='upgrade_strategy',
            default='eager',
            choices=['only-if-needed', 'eager'],
            help='Determines how dependency upgrading should be handled. '
                 '"eager" - dependencies are upgraded regardless of '
                 'whether the currently installed version satisfies the '
                 'requirements of the upgraded package(s). '
                 '"only-if-needed" -  are upgraded only when they do not '
                 'satisfy the requirements of the upgraded package(s).'
        )

        cmd_opts.add_option(
            '--force-reinstall',
            dest='force_reinstall',
            action='store_true',
            help='When upgrading, reinstall all packages even if they are '
                 'already up-to-date.')

        cmd_opts.add_option(
            '-I', '--ignore-installed',
            dest='ignore_installed',
            action='store_true',
            help='Ignore the installed packages (reinstalling instead).')

        cmd_opts.add_option(cmdoptions.ignore_requires_python())
        cmd_opts.add_option(cmdoptions.no_deps())

        cmd_opts.add_option(cmdoptions.install_options())
        cmd_opts.add_option(cmdoptions.global_options())

        cmd_opts.add_option(
            '--user',
            dest='use_user_site',
            action='store_true',
            help="Install to the Python user install directory for your "
                 "platform. Typically ~/.local/, or %APPDATA%\Python on "
                 "Windows. (See the Python documentation for site.USER_BASE "
                 "for full details.)")

        cmd_opts.add_option(
            '--egg',
            dest='as_egg',
            action='store_true',
            help="Install packages as eggs, not 'flat', like pip normally "
                 "does. This option is not about installing *from* eggs. "
                 "(WARNING: Because this option overrides pip's normal install"
                 " logic, requirements files may not behave as expected.)")

        cmd_opts.add_option(
            '--root',
            dest='root_path',
            metavar='dir',
            default=None,
            help="Install everything relative to this alternate root "
                 "directory.")

        cmd_opts.add_option(
            '--strip-file-prefix',
            dest='strip_file_prefix',
            metavar='prefix',
            default=None,
            help="Strip given prefix from script paths in wheel RECORD."
        )

        cmd_opts.add_option(
            '--prefix',
            dest='prefix_path',
            metavar='dir',
            default=None,
            help="Installation prefix where lib, bin and other top-level "
                 "folders are placed")

        cmd_opts.add_option(
            "--compile",
            action="store_true",
            dest="compile",
            default=True,
            help="Compile py files to pyc",
        )

        cmd_opts.add_option(
            "--no-compile",
            action="store_false",
            dest="compile",
            help="Do not compile py files to pyc",
        )

        cmd_opts.add_option(cmdoptions.use_wheel())
        cmd_opts.add_option(cmdoptions.no_use_wheel())
        cmd_opts.add_option(cmdoptions.no_binary())
        cmd_opts.add_option(cmdoptions.only_binary())
        cmd_opts.add_option(cmdoptions.pre())
        cmd_opts.add_option(cmdoptions.no_clean())
        cmd_opts.add_option(cmdoptions.require_hashes())

        index_opts = cmdoptions.make_option_group(
            cmdoptions.index_group,
            self.parser,
        )

        self.parser.insert_option_group(0, index_opts)
        self.parser.insert_option_group(0, cmd_opts)

    def run(self, options, args):
        cmdoptions.resolve_wheel_no_use_binary(options)
        cmdoptions.check_install_build_global(options)

        def is_venv():
            return hasattr(sys, 'real_prefix') or \
                    (hasattr(sys, 'base_prefix') and sys.base_prefix != sys.prefix)
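        # Note (assumption about typical environments): a stdlib venv rewrites
        # sys.prefix while leaving sys.base_prefix at the base interpreter,
        # whereas virtualenv sets sys.real_prefix.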

        # Check whether we have root privileges and aren't in venv/virtualenv
        if os.getuid() == 0 and not is_venv():
            logger.warning(
                "WARNING: Running pip install with root privileges is "
                "generally not a good idea. Try `%s install --user` instead.",
                path.basename(sys.argv[0])
            )

        if options.as_egg:
            warnings.warn(
                "--egg has been deprecated and will be removed in the future. "
                "This flag is mutually exclusive with large parts of pip, and "
                "actually using it invalidates pip's ability to manage the "
                "installation process.",
                RemovedInPip10Warning,
            )

        if options.allow_external:
            warnings.warn(
                "--allow-external has been deprecated and will be removed in "
                "the future. Due to changes in the repository protocol, it no "
                "longer has any effect.",
                RemovedInPip10Warning,
            )

        if options.allow_all_external:
            warnings.warn(
                "--allow-all-external has been deprecated and will be removed "
                "in the future. Due to changes in the repository protocol, it "
                "no longer has any effect.",
                RemovedInPip10Warning,
            )

        if options.allow_unverified:
            warnings.warn(
                "--allow-unverified has been deprecated and will be removed "
                "in the future. Due to changes in the repository protocol, it "
                "no longer has any effect.",
                RemovedInPip10Warning,
            )

        if options.download_dir:
            warnings.warn(
                "pip install --download has been deprecated and will be "
                "removed in the future. Pip now has a download command that "
                "should be used instead.",
                RemovedInPip10Warning,
            )
            options.ignore_installed = True

        if options.build_dir:
            options.build_dir = os.path.abspath(options.build_dir)

        options.src_dir = os.path.abspath(options.src_dir)
        install_options = options.install_options or []
        if options.use_user_site:
            if options.prefix_path:
                raise CommandError(
                    "Can not combine '--user' and '--prefix' as they imply "
                    "different installation locations"
                )
            if virtualenv_no_global():
                raise InstallationError(
                    "Can not perform a '--user' install. User site-packages "
                    "are not visible in this virtualenv."
                )
            install_options.append('--user')
            install_options.append('--prefix=')

        temp_target_dir = None
        if options.target_dir:
            options.ignore_installed = True
            temp_target_dir = tempfile.mkdtemp()
            options.target_dir = os.path.abspath(options.target_dir)
            if (os.path.exists(options.target_dir) and not
                    os.path.isdir(options.target_dir)):
                raise CommandError(
                    "Target path exists but is not a directory, will not "
                    "continue."
                )
            install_options.append('--home=' + temp_target_dir)

        global_options = options.global_options or []

        with self._build_session(options) as session:

            finder = self._build_package_finder(options, session)
            build_delete = (not (options.no_clean or options.build_dir))
            wheel_cache = WheelCache(options.cache_dir, options.format_control)
            if options.cache_dir and not check_path_owner(options.cache_dir):
                logger.warning(
                    "The directory '%s' or its parent directory is not owned "
                    "by the current user and caching wheels has been "
                    "disabled. check the permissions and owner of that "
                    "directory. If executing pip with sudo, you may want "
                    "sudo's -H flag.",
                    options.cache_dir,
                )
                options.cache_dir = None

            with BuildDirectory(options.build_dir,
                                delete=build_delete) as build_dir:
                requirement_set = RequirementSet(
                    build_dir=build_dir,
                    src_dir=options.src_dir,
                    download_dir=options.download_dir,
                    upgrade=options.upgrade,
                    upgrade_strategy=options.upgrade_strategy,
                    as_egg=options.as_egg,
                    ignore_installed=options.ignore_installed,
                    ignore_dependencies=options.ignore_dependencies,
                    ignore_requires_python=options.ignore_requires_python,
                    force_reinstall=options.force_reinstall,
                    use_user_site=options.use_user_site,
                    target_dir=temp_target_dir,
                    session=session,
                    pycompile=options.compile,
                    isolated=options.isolated_mode,
                    wheel_cache=wheel_cache,
                    require_hashes=options.require_hashes,
                )

                self.populate_requirement_set(
                    requirement_set, args, options, finder, session, self.name,
                    wheel_cache
                )

                if not requirement_set.has_requirements:
                    return

                try:
                    if (options.download_dir or not wheel or not
                            options.cache_dir):
                        # on -d don't do complex things like building
                        # wheels, and don't try to build wheels when wheel is
                        # not installed.
                        requirement_set.prepare_files(finder)
                    else:
                        # build wheels before install.
                        wb = WheelBuilder(
                            requirement_set,
                            finder,
                            build_options=[],
                            global_options=[],
                        )
                        # Ignore the result: a failed wheel will be
                        # installed from the sdist/vcs whatever.
                        wb.build(autobuilding=True)

                    if not options.download_dir:
                        requirement_set.install(
                            install_options,
                            global_options,
                            root=options.root_path,
                            prefix=options.prefix_path,
                            strip_file_prefix=options.strip_file_prefix,
                        )

                        possible_lib_locations = get_lib_location_guesses(
                            user=options.use_user_site,
                            home=temp_target_dir,
                            root=options.root_path,
                            prefix=options.prefix_path,
                            isolated=options.isolated_mode,
                        )
                        reqs = sorted(
                            requirement_set.successfully_installed,
                            key=operator.attrgetter('name'))
                        items = []
                        for req in reqs:
                            item = req.name
                            try:
                                installed_version = get_installed_version(
                                    req.name, possible_lib_locations
                                )
                                if installed_version:
                                    item += '-' + installed_version
                            except Exception:
                                pass
                            items.append(item)
                        installed = ' '.join(items)
                        if installed:
                            logger.info('Successfully installed %s', installed)
                    else:
                        downloaded = ' '.join([
                            req.name
                            for req in requirement_set.successfully_downloaded
                        ])
                        if downloaded:
                            logger.info(
                                'Successfully downloaded %s', downloaded
                            )
                except PreviousBuildDirError:
                    options.no_clean = True
                    raise
                finally:
                    # Clean up
                    if not options.no_clean:
                        requirement_set.cleanup_files()

        if options.target_dir:
            ensure_dir(options.target_dir)

            # Checking both purelib and platlib directories for installed
            # packages to be moved to target directory
            lib_dir_list = []

            purelib_dir = distutils_scheme('', home=temp_target_dir)['purelib']
            platlib_dir = distutils_scheme('', home=temp_target_dir)['platlib']

            if os.path.exists(purelib_dir):
                lib_dir_list.append(purelib_dir)
            if os.path.exists(platlib_dir) and platlib_dir != purelib_dir:
                lib_dir_list.append(platlib_dir)

            for lib_dir in lib_dir_list:
                for item in os.listdir(lib_dir):
                    target_item_dir = os.path.join(options.target_dir, item)
                    if os.path.exists(target_item_dir):
                        if not options.upgrade:
                            logger.warning(
                                'Target directory %s already exists. Specify '
                                '--upgrade to force replacement.',
                                target_item_dir
                            )
                            continue
                        if os.path.islink(target_item_dir):
                            logger.warning(
                                'Target directory %s already exists and is '
                                'a link. Pip will not automatically replace '
                                'links, please remove if replacement is '
                                'desired.',
                                target_item_dir
                            )
                            continue
                        if os.path.isdir(target_item_dir):
                            shutil.rmtree(target_item_dir)
                        else:
                            os.remove(target_item_dir)

                    shutil.move(
                        os.path.join(lib_dir, item),
                        target_item_dir
                    )
            shutil.rmtree(temp_target_dir)
        return requirement_set


def get_lib_location_guesses(*args, **kwargs):
    scheme = distutils_scheme('', *args, **kwargs)
    return [scheme['purelib'], scheme['platlib']]
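
# Illustrative return value (platform- and option-dependent; an assumption,
# not taken from the source):
#     ['/usr/lib/python3.6/site-packages', '/usr/lib64/python3.6/site-packages']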
site-packages/pip/commands/freeze.py000064400000005423147511334620013521 0ustar00from __future__ import absolute_import

import sys

import pip
from pip.compat import stdlib_pkgs
from pip.basecommand import Command
from pip.operations.freeze import freeze
from pip.wheel import WheelCache


DEV_PKGS = ('pip', 'setuptools', 'distribute', 'wheel')


class FreezeCommand(Command):
    """
    Output installed packages in requirements format.

    Packages are listed in a case-insensitive sorted order.
    """
    name = 'freeze'
    usage = """
      %prog [options]"""
    summary = 'Output installed packages in requirements format.'
    log_streams = ("ext://sys.stderr", "ext://sys.stderr")

    def __init__(self, *args, **kw):
        super(FreezeCommand, self).__init__(*args, **kw)

        self.cmd_opts.add_option(
            '-r', '--requirement',
            dest='requirements',
            action='append',
            default=[],
            metavar='file',
            help="Use the order in the given requirements file and its "
                 "comments when generating output. This option can be "
                 "used multiple times.")
        self.cmd_opts.add_option(
            '-f', '--find-links',
            dest='find_links',
            action='append',
            default=[],
            metavar='URL',
            help='URL for finding packages, which will be added to the '
                 'output.')
        self.cmd_opts.add_option(
            '-l', '--local',
            dest='local',
            action='store_true',
            default=False,
            help='If in a virtualenv that has global access, do not output '
                 'globally-installed packages.')
        self.cmd_opts.add_option(
            '--user',
            dest='user',
            action='store_true',
            default=False,
            help='Only output packages installed in user-site.')
        self.cmd_opts.add_option(
            '--all',
            dest='freeze_all',
            action='store_true',
            help='Do not skip these packages in the output:'
                 ' %s' % ', '.join(DEV_PKGS))

        self.parser.insert_option_group(0, self.cmd_opts)

    def run(self, options, args):
        format_control = pip.index.FormatControl(set(), set())
        wheel_cache = WheelCache(options.cache_dir, format_control)
        skip = set(stdlib_pkgs)
        if not options.freeze_all:
            skip.update(DEV_PKGS)

        freeze_kwargs = dict(
            requirement=options.requirements,
            find_links=options.find_links,
            local_only=options.local,
            user_only=options.user,
            skip_regex=options.skip_requirements_regex,
            isolated=options.isolated_mode,
            wheel_cache=wheel_cache,
            skip=skip)

        for line in freeze(**freeze_kwargs):
            sys.stdout.write(line + '\n')
site-packages/pip/commands/__pycache__/hash.cpython-36.pyc [binary bytecode payload omitted]
site-packages/pip/commands/__pycache__/uninstall.cpython-36.pyc [binary bytecode payload omitted]
site-packages/pip/commands/__pycache__/completion.cpython-36.opt-1.pyc [binary bytecode payload omitted]
site-packages/pip/commands/__pycache__/install.cpython-36.pyc [binary bytecode payload omitted]
site-packages/pip/commands/__pycache__/install.cpython-36.opt-1.pyc [binary bytecode payload omitted]
site-packages/pip/commands/__pycache__/download.cpython-36.pyc [binary bytecode payload omitted]
rr-Z
isolated_moderZpopulate_requirement_setr0Zhas_requirementsZ
prepare_files�joinZsuccessfully_downloaded�infoZ
cleanup_files)r!r'r"r)Zdist_restriction_setZbinary_onlyr(�finderZbuild_deleterZrequirement_setZ
downloadedr%r%r&�run{sx






zDownloadCommand.run)
�__name__�
__module__�__qualname__�__doc__r0ZusageZsummaryrr<�
__classcell__r%r%)r$r&rsQr)Z
__future__rZloggingrZpip.exceptionsrZ	pip.indexrZpip.reqrZpip.basecommandrZpiprZ	pip.utilsrr	Zpip.utils.buildr
Zpip.utils.filesystemrZ	getLoggerr=r8rr%r%r%r&�<module>s
site-packages/pip/commands/__pycache__/search.cpython-36.pyc000064400000010056147511334620017770 0ustar003

���e��@s�ddlmZddlZddlZddlZddlmZmZddlm	Z	ddl
mZddlm
Z
ddlmZddlmZdd	lmZdd
lmZddlmZddlmZdd
lmZeje�ZGdd�de�Z dd�Z!ddd�Z"dd�Z#dS)�)�absolute_importN)�Command�SUCCESS)�OrderedDict)�PipXmlrpcTransport)�PyPI)�get_terminal_size)�
indent_log)�CommandError)�NO_MATCHES_FOUND)�parse)�
pkg_resources)�
xmlrpc_clientcs<eZdZdZdZdZdZ�fdd�Zdd�Zd	d
�Z	�Z
S)�
SearchCommandz@Search for PyPI packages whose name or summary contains <query>.�searchz
      %prog [options] <query>zSearch PyPI for packages.cs@tt|�j||�|jjddddtjdd�|jjd|j�dS)Nz-iz--index�indexZURLz3Base URL of Python Package Index (default %default))�dest�metavar�default�helpr)	�superr�__init__Zcmd_optsZ
add_optionrZpypi_url�parserZinsert_option_group)�self�args�kw)�	__class__��/usr/lib/python3.6/search.pyrszSearchCommand.__init__cCsT|std��|}|j||�}t|�}d}tjj�r<t�d}t||d�|rPtSt	S)Nz)Missing required argument (search query).r)�terminal_width)
r
r�transform_hits�sys�stdout�isattyr�
print_resultsrr)r�optionsr�queryZ	pypi_hits�hitsrrrr�run)s

zSearchCommand.runcCsH|j}|j|��.}t||�}tj||�}|j||d�d�}|SQRXdS)N)�name�summary�or)rZ_build_sessionrrZServerProxyr)rr&r%Z	index_urlZsessionZ	transportZpypir'rrrr9s
zSearchCommand.search)�__name__�
__module__�__qualname__�__doc__r)Zusager*rr(r�
__classcell__rr)rrrsrcCs�t�}xv|D]n}|d}|d}|d}||j�krH|||gd�||<q||dj|�|t||d�kr|||d<qWt|j��S)z�
    The list from pypi is really a list of versions. We want a list of
    packages with the list of versions stored inline. This converts the
    list from pypi into one we can use.
    r)r*�version)r)r*�versionsr2)r�keys�append�highest_version�list�values)r'Zpackages�hitr)r*r1rrrr Bs
r cCsT|sdS|dkr&tdd�|D��d}dd�tjD�}�x|D�]}|d}|dpVd}|jdd	g�d}|dk	r�||d}|dkr�tj||�}d
d|dj|�}d|d||f|f}	yvtj|	�||k�r2tj	|�}
t
��Ht|d�}|
j|k�rtjd|
j�ntjd|
j�tjd|�WdQRXWq>t
k
�rJYq>Xq>WdS)NcSs.g|]&}t|d�t|jddg�d��qS)r)r2�-����)�len�get)�.0r8rrr�
<listcomp>csz!print_results.<locals>.<listcomp>�cSsg|]
}|j�qSr)Zproject_name)r>�prrrr?gsr)r*�r2r9r:��
�
� �z	%-*s - %sz%s (%s)zINSTALLED: %s (latest)z
INSTALLED: %sz
LATEST:    %sr;)�maxr
Zworking_setr=�textwrapZwrap�join�logger�infoZget_distributionr	r5r1�UnicodeEncodeError)r'Zname_column_widthrZinstalled_packagesr8r)r*r1Ztarget_width�lineZdistZlatestrrrr$^s>


r$cCst|td�S)N)�key)rH�
parse_version)r2rrrr5�sr5)NN)$Z
__future__rZloggingr!rIZpip.basecommandrrZ
pip.compatrZpip.downloadrZ
pip.modelsrZ	pip.utilsrZpip.utils.loggingr	Zpip.exceptionsr
Zpip.status_codesrZpip._vendor.packaging.versionrrPZpip._vendorr
Zpip._vendor.six.movesrZ	getLoggerr,rKrr r$r5rrrr�<module>s&
+
&site-packages/pip/commands/__pycache__/__init__.cpython-36.opt-1.pyc000064400000003726147511334620021227 0ustar003

���e��@s&dZddlmZddlmZddlmZddlmZddl	m
Z
ddlmZddl
mZdd	lmZdd
lmZddlmZddlmZdd
lmZddlmZejeejee
je
ejeejeejeejeejeejeejeejeejeiZeeeeeeeeee
eegZddd�Zdd�Zdd�Z dS)z%
Package containing all pip commands
�)�absolute_import)�CompletionCommand)�DownloadCommand)�
FreezeCommand)�HashCommand)�HelpCommand)�ListCommand)�CheckCommand)�
SearchCommand)�ShowCommand)�InstallCommand)�UninstallCommand)�WheelCommandTccs:|rttt�}ntj�}x|D]\}}||jfVqWdS)z5Yields sorted (command name, command summary) tuples.N)�_sort_commands�
commands_dict�commands_order�itemsZsummary)ZorderedZcmditems�nameZ
command_class�r�/usr/lib/python3.6/__init__.py�
get_summaries4s
rcCs6ddlm}|j�}||tj��}|r.|dSdSdS)zCommand name auto-correct.r)�get_close_matchesFN)Zdifflibr�lowerr�keys)rrZclose_commandsrrr�get_similar_commands@srcs�fdd�}t|j�|d�S)Ncs(y�j|d�Stk
r"dSXdS)N��)�index�
ValueError)�key)�orderrr�keyfnOsz_sort_commands.<locals>.keyfn)r)�sortedr)Zcmddictr r!r)r rrNsrN)T)!�__doc__Z
__future__rZpip.commands.completionrZpip.commands.downloadrZpip.commands.freezerZpip.commands.hashrZpip.commands.helprZpip.commands.listrZpip.commands.checkr	Zpip.commands.searchr
Zpip.commands.showrZpip.commands.installrZpip.commands.uninstallr
Zpip.commands.wheelrrrrrrrrrrr�<module>sP

site-packages/pip/commands/__pycache__/show.cpython-36.pyc000064400000012301147511334620017476 0ustar003

���e�@s�ddlmZddlmZddlZddlZddlmZddlm	Z	m
Z
ddlmZddl
mZeje�ZGdd	�d	e�Zd
d�Zdd
d�ZdS)�)�absolute_import)�
FeedParserN)�Command)�SUCCESS�ERROR)�
pkg_resources)�canonicalize_namecs4eZdZdZdZdZdZ�fdd�Zdd�Z�Z	S)	�ShowCommandz6Show information about one or more installed packages.Zshowz$
      %prog [options] <package> ...z*Show information about installed packages.cs>tt|�j||�|jjddddddd�|jjd|j�dS)	Nz-fz--files�files�
store_trueFz7Show the full list of installed files for each package.)�dest�action�default�helpr)�superr	�__init__Zcmd_optsZ
add_option�parserZinsert_option_group)�self�args�kw)�	__class__��/usr/lib/python3.6/show.pyrszShowCommand.__init__cCs8|stjd�tS|}t|�}t||j|jd�s4tStS)Nz.ERROR: Please provide a package name or names.)�
list_files�verbose)�loggerZwarningr�search_packages_info�
print_resultsr
rr)rZoptionsr�query�resultsrrr�run"s
zShowCommand.run)
�__name__�
__module__�__qualname__�__doc__�nameZusage�summaryrr �
__classcell__rr)rrr	sr	c#si�xtjD]}|�t|j�<qWdd�|D�}�x�fdd�|D�D�]Ή�j�j�jdd��j�D�d�}d}d}t�tj�rވj	d�rȈj
d�}dd�|D�}�fd	d�|D�}�fd
d�|D�}�j	d�r܈jd�}nP�j	d��r�j
d�}�fd
d�|D�}�fdd�|D�}�j	d��r.�jd�}�j	d��rL�j
d�}||d<�j	d��r�x,�j
d�D]}	|	j��rd|	j�|d<P�qdWt
�}
|
j|�|
j�}xdD]}|j|�||<�q�Wg}
x4|j�D](}	|	jd��r�|
j|	td�d���q�W|
|d<|�rt|�|d<|VqFWdS)z�
    Gather details from installed distributions. Print distribution name,
    version, location, and installed files. Installed files requires a
    pip generated 'installed-files.txt' in the distributions '.egg-info'
    directory.
    cSsg|]}t|��qSr)r)�.0r%rrr�
<listcomp>:sz(search_packages_info.<locals>.<listcomp>csg|]}|�kr�|�qSrr)r(Zpkg)�	installedrrr)<scSsg|]
}|j�qSr)�project_name)r(Zdeprrrr)As)r%�version�location�requiresNZRECORDcSsg|]}|jd�d�qS)�,r)�split)r(�lrrrr)Iscsg|]}tjj�j|��qSr)�os�path�joinr-)r(�p)�distrrr)Jscsg|]}tjj|�j��qSr)r2r3�relpathr-)r(r5)r6rrr)KsZMETADATAzinstalled-files.txtcsg|]}tjj�j|��qSr)r2r3r4Zegg_info)r(r5)r6rrr)Sscsg|]}tjj|�j��qSr)r2r3r7r-)r(r5)r6rrr)TszPKG-INFOzentry_points.txt�entry_pointsZ	INSTALLER�	installer�metadata-versionr&�	home-page�author�author-email�licensezClassifier: �classifiersr
)r:r&r;r<r=r>)rZworking_setrr+r,r-r.�
isinstanceZDistInfoDistributionZhas_metadataZget_metadata_linesZget_metadata�striprZfeed�close�get�
splitlines�
startswith�append�len�sorted)rr5Zquery_names�packageZ	file_listZmetadata�lines�pathsr8�lineZfeed_parserZ
pkg_info_dict�keyr?r)r6r*rr/s^







rFc	Cs�d}�x�t|�D�]�\}}d}|dkr0tjd�tjd|jdd��tjd|jd	d��tjd
|jdd��tjd|jd
d��tjd|jdd��tjd|jdd��tjd|jdd��tjd|jdd��tjddj|jdg���|�rxtjd|jdd��tjd|jdd��tjd�x"|jdg�D]}tjd|��q0Wtjd �x&|jd!g�D]}tjd|j���q^W|rtjd"�x&|jd#g�D]}tjd|j���q�Wd#|krtjd$�qW|S)%zD
    Print the informations from installed distributions found.
    FTrz---zName: %sr%�zVersion: %sr,zSummary: %sr&z
Home-page: %sz	home-pagez
Author: %sr<zAuthor-email: %szauthor-emailzLicense: %sr>zLocation: %sr-zRequires: %sz, r.zMetadata-Version: %szmetadata-versionz
Installer: %sr9zClassifiers:r?z  %sz
Entry-points:r8zFiles:r
z!Cannot locate installed-files.txt)�	enumerater�inforCr4rA)	Z
distributionsrrZresults_printed�ir6Z
classifier�entryrLrrrrxs>



r)FF)Z
__future__rZemail.parserrZloggingr2Zpip.basecommandrZpip.status_codesrrZpip._vendorrZpip._vendor.packaging.utilsrZ	getLoggerr!rr	rrrrrr�<module>s
Isite-packages/pip/commands/__pycache__/list.cpython-36.pyc000064400000022677147511334620017512 0ustar003

���ei,�@s�ddlmZddlZddlZddlZyddlmZWn ek
rTddlmZYnXddl	m
Z
ddlmZddl
mZddlmZdd	lmZmZdd
lmZddlmZmZeje�ZGdd
�d
e�Zdd�Zdd�Zdd�ZdS)�)�absolute_importN)�zip_longest)�izip_longest)�six)�Command)�CommandError)�
PackageFinder)�get_installed_distributions�dist_is_editable)�RemovedInPip10Warning)�make_option_group�index_groupcs|eZdZdZdZdZdZ�fdd�Zdd�Zd	d
�Z	dd�Z
d
d�Zdd�Zdd�Z
dd�Zdd�Zdd�Zdd�Z�ZS)�ListCommandzt
    List installed packages, including editables.

    Packages are listed in a case-insensitive sorted order.
    �listz
      %prog [options]zList installed packages.cs�tt|�j||�|j}|jdddddd�|jddddd	d�|jd
ddddd�|jd
ddddd�|jjdddddd�|jddddd�|jdddd$dd�|jddd d!d"�tt|j�}|jjd#|�|jjd#|�dS)%Nz-oz
--outdated�
store_trueFzList outdated packages)�action�default�helpz-uz
--uptodatezList uptodate packagesz-ez
--editablezList editable projects.z-lz--localzSIf in a virtualenv that has global access, do not list globally-installed packages.z--user�userz,Only output packages installed in user-site.)�destrrrz--prezYInclude pre-release and development versions. By default, pip only finds stable versions.z--formatZstore�list_format�legacy�columns�freeze�jsonzJSelect the output format among: legacy (default), columns, freeze or json.)rr�choicesrz--not-required�not_requiredz>List packages that are not dependencies of installed packages.)rrrr)rrrr)	�superr�__init__�cmd_optsZ
add_optionrr
�parserZinsert_option_group)�self�args�kwrZ
index_opts)�	__class__��/usr/lib/python3.6/list.pyr#s^zListCommand.__init__cCst|j||j|j|j|d�S)zK
        Create a package finder appropriate to this list command.
        )�
find_links�
index_urlsZallow_all_prereleases�
trusted_hosts�process_dependency_links�session)rr'�prer)r*)r!�optionsr(r+r%r%r&�_build_package_findercsz!ListCommand._build_package_findercCs�|jrtjdt�|jr$tjdt�|jr6tjdt�|jdkrLtjdt�|jr`|jr`t	d��t
|j|j|j
d�}|jr�|j||�}n|jr�|j||�}|jr�|j||�}|j||�dS)Nz�--allow-external has been deprecated and will be removed in the future. Due to changes in the repository protocol, it no longer has any effect.z�--allow-all-external has been deprecated and will be removed in the future. Due to changes in the repository protocol, it no longer has any effect.z�--allow-unverified has been deprecated and will be removed in the future. Due to changes in the repository protocol, it no longer has any effect.z�The default format will switch to columns in the future. You can use --format=(legacy|columns) (or define a format=(legacy|columns) in your pip.conf under the [list] section) to disable this warning.z5Options --outdated and --uptodate cannot be combined.)Z
local_onlyZ	user_onlyZeditables_only)Zallow_external�warnings�warnrZallow_all_externalZallow_unverifiedr�outdatedZuptodaterr	ZlocalrZeditable�get_outdated�get_uptodater�get_not_required�output_package_listing)r!r-r"�packagesr%r%r&�runps<

zListCommand.runcCsdd�|j||�D�S)NcSsg|]}|j|jkr|�qSr%)�latest_version�parsed_version)�.0�distr%r%r&�
<listcomp>�sz,ListCommand.get_outdated.<locals>.<listcomp>)�iter_packages_latest_infos)r!r6r-r%r%r&r2�szListCommand.get_outdatedcCsdd�|j||�D�S)NcSsg|]}|j|jkr|�qSr%)r8r9)r:r;r%r%r&r<�sz,ListCommand.get_uptodate.<locals>.<listcomp>)r=)r!r6r-r%r%r&r3�szListCommand.get_uptodatecsBt��x$|D]}�jdd�|j�D��qWt�fdd�|D��S)Ncss|]}|jVqdS)N)�key)r:Zrequirementr%r%r&�	<genexpr>�sz/ListCommand.get_not_required.<locals>.<genexpr>c3s|]}|j�kr|VqdS)N)r>)r:Zpkg)�dep_keysr%r&r?�s)�set�updateZrequires)r!r6r-r;r%)r@r&r4�s
zListCommand.get_not_requiredccs�|jg|j}|jr*tjddj|��g}g}x&|D]}|jd�r4|j|jd��q4W|j	|���}|j
|||�}|j|�xn|D]f}d}|j|j
�}	|js�dd�|	D�}	|	s�q�t|	|jd�}
|
j}|
jjr�d}nd	}||_||_|Vq�WWdQRXdS)
NzIgnoring indexes: %s�,zdependency_links.txt�unknowncSsg|]}|jjs|�qSr%)�versionZ
is_prerelease)r:�	candidater%r%r&r<�sz:ListCommand.iter_packages_latest_infos.<locals>.<listcomp>)r>ZwheelZsdist)Z	index_urlZextra_index_urlsZno_index�logger�debug�joinZhas_metadata�extendZget_metadata_linesZ_build_sessionr.Zadd_dependency_linksZfind_all_candidatesr>r,�maxZ_candidate_sort_keyrE�locationZis_wheelr8�latest_filetype)r!r6r-r(Zdependency_linksr;r+�finder�typZall_candidatesZbest_candidateZremote_versionr%r%r&r=�s8




z&ListCommand.iter_packages_latest_infoscCs0t|�rd|j|j|jfSd|j|jfSdS)Nz%s (%s, %s)z%s (%s))r
�project_namerErL)r!r;r%r%r&�
output_legacy�s
zListCommand.output_legacycCsd|j|�|j|jfS)Nz%s - Latest: %s [%s])rQr8rM)r!r;r%r%r&�output_legacy_latest�sz ListCommand.output_legacy_latestcCs�t|dd�d�}|jdkr:|r:t||�\}}|j||�n~|jdkrfxr|D]}tjd|j|j�qJWnR|jdkr�tjt||��n6x4|D],}|j	r�tj|j
|��q�tj|j|��q�WdS)NcSs
|jj�S)N)rP�lower)r;r%r%r&�<lambda>�sz4ListCommand.output_package_listing.<locals>.<lambda>)r>rrz%s==%sr)�sortedr�format_for_columns�output_package_listing_columnsrG�inforPrE�format_for_jsonr1rRrQ)r!r6r-�data�headerr;r%r%r&r5�s



z"ListCommand.output_package_listingcCsht|�dkr|jd|�t|�\}}t|�dkrL|jddjtdd�|���x|D]}tj|�qRWdS)Nr�� cSsd|S)N�-r%)�xr%r%r&rT
sz<ListCommand.output_package_listing_columns.<locals>.<lambda>)�len�insert�tabulaterI�maprGrX)r!rZr[Zpkg_strings�sizes�valr%r%r&rWs
z*ListCommand.output_package_listing_columns)�__name__�
__module__�__qualname__�__doc__�nameZusageZsummaryrr.r7r2r3r4r=rQrRr5rW�
__classcell__r%r%)r$r&rs@
6'
rcCs�t|�dkst�dgtdd�|D��}x |D]}dd�t||�D�}q.Wg}x0|D](}djdd�t||�D��}|j|�qTW||fS)Nrcss|]}t|�VqdS)N)r`)r:r_r%r%r&r?sztabulate.<locals>.<genexpr>cSs"g|]\}}t|tt|����qSr%)rKr`�str)r:�s�cr%r%r&r<sztabulate.<locals>.<listcomp>r]cSs*g|]"\}}|dk	r"t|�j|�nd�qS)N�)rl�ljust)r:rmrnr%r%r&r<s)r`�AssertionErrorrKrrI�append)�valsrd�row�resultZdisplayr%r%r&rbs


rbcCs�|j}|rddddg}nddg}g}tdd�|D��r@|jd�xR|D]J}|j|jg}|rr|j|j�|j|j�t|�r�|j|j�|j|�qFW||fS)z_
    Convert the package data into something usable
    by output_package_listing_columns.
    ZPackageZVersionZLatestZTypecss|]}t|�VqdS)N)r
)r:r_r%r%r&r?2sz%format_for_columns.<locals>.<genexpr>ZLocation)	r1�anyrrrPrEr8rMr
rL)Zpkgsr-Zrunning_outdatedr[rZZprojrtr%r%r&rV%s 

rVcCsZg}xJ|D]B}|jtj|j�d�}|jrBtj|j�|d<|j|d<|j|�q
Wtj	|�S)N)rjrEr8rM)
rPrZ	text_typerEr1r8rMrrr�dumps)r6r-rZr;rXr%r%r&rYFs

rY) Z
__future__rrZloggingr/�	itertoolsr�ImportErrorrZpip._vendorrZpip.basecommandrZpip.exceptionsrZ	pip.indexrZ	pip.utilsr	r
Zpip.utils.deprecationrZpip.cmdoptionsrr
Z	getLoggerrfrGrrbrVrYr%r%r%r&�<module>s(
|!site-packages/pip/commands/__pycache__/wheel.cpython-36.opt-1.pyc000064400000012430147511334620020564 0ustar003

���e1�@s�ddlmZddlZddlZddlZddlmZddlmZm	Z	ddl
mZddlm
Z
ddlmZddlmZdd	lmZmZdd
lmZeje�ZGdd�de�ZdS)
�)�absolute_importN)�RequirementCommand)�CommandError�PreviousBuildDirError)�RequirementSet)�import_or_raise)�BuildDirectory)�RemovedInPip10Warning)�
WheelCache�WheelBuilder)�
cmdoptionscs<eZdZdZdZdZdZ�fdd�Zdd�Zd	d
�Z	�Z
S)�WheelCommanda�
    Build Wheel archives for your requirements and dependencies.

    Wheel is a built-package format, and offers the advantage of not
    recompiling your software during every install. For more details, see the
    wheel docs: https://wheel.readthedocs.io/en/latest/

    Requirements: setuptools>=0.8, and wheel.

    'pip wheel' uses the bdist_wheel setuptools extension from the wheel
    package to build individual wheels.

    Zwheelz�
      %prog [options] <requirement specifier> ...
      %prog [options] -r <requirements file> ...
      %prog [options] [-e] <vcs project url> ...
      %prog [options] [-e] <local project path> ...
      %prog [options] <archive url/path> ...z$Build wheels from your requirements.csPtt|�j||�|j}|jddddtjdd�|jtj��|jtj	��|jtj
��|jtj��|jddd	d
dd�|jtj��|jtj
��|jtj��|jtj��|jtj��|jtj��|jtj��|jd
dd
d	dd�|jddddd�|jtj��|jtj��tjtj|j�}|jjd|�|jjd|�dS)Nz-wz--wheel-dir�	wheel_dir�dirzLBuild wheels into <dir>, where the default is the current working directory.)�dest�metavar�default�helpz--build-option�
build_options�options�appendz9Extra arguments to be supplied to 'setup.py bdist_wheel'.)rr�actionrz--global-option�global_optionszZExtra global options to be supplied to the setup.py call before the 'bdist_wheel' command.)rrrrz--pre�
store_trueFzYInclude pre-release and development versions. By default, pip only finds stable versions.)rrrr)�superr
�__init__�cmd_optsZ
add_option�os�curdirrZ	use_wheelZno_use_wheelZ	no_binaryZonly_binaryZconstraintsZeditableZrequirements�src�ignore_requires_pythonZno_deps�	build_dir�no_clean�require_hashesZmake_option_groupZindex_group�parserZinsert_option_group)�self�args�kwrZ
index_opts)�	__class__��/usr/lib/python3.6/wheel.pyr.sVzWheelCommand.__init__cCs.tdtd�tdtd�}t|d�s*td��dS)Nzwheel.bdist_wheelzM'pip wheel' requires the 'wheel' package. To fix this, run: pip install wheel�
pkg_resourceszp'pip wheel' requires setuptools >= 0.8 for dist-info support. To fix this, run: pip install --upgrade setuptoolsZDistInfoDistribution)rr�hasattr)r%r+r)r)r*�check_required_packageshs
z$WheelCommand.check_required_packagesc Cs�|j�tj|�tj|�|jr.tjdt�|jr@tjdt�|j	rRtjdt�|j
g|j}|jr|t
jddj|��g}|jr�tjj|j�|_tjj|j�|_|j|���}|j||�}|jp�|j}t|j|j�}t|j|d���}t||jd|jd|j|j|||j |j!d�}	|j"|	|||||j#|�|	j$�s6dSzZy6t%|	||j&�pJg|j'�pTgd	�}
|
j(��slt)d
��Wnt*k
�r�d|_�YnXWd|j�s�|	j+�XWdQRXWdQRXdS)Nz�--allow-external has been deprecated and will be removed in the future. Due to changes in the repository protocol, it no longer has any effect.z�--allow-all-external has been deprecated and will be removed in the future. Due to changes in the repository protocol, it no longer has any effect.z�--allow-unverified has been deprecated and will be removed in the future. Due to changes in the repository protocol, it no longer has any effect.zIgnoring indexes: %s�,)�deleteT)r!�src_dirZdownload_dir�ignore_dependenciesZignore_installedr �isolated�session�wheel_cacheZwheel_download_dirr#)rrz"Failed to build one or more wheels),r-rZresolve_wheel_no_use_binaryZcheck_install_build_globalZallow_external�warnings�warnr	Zallow_all_externalZallow_unverifiedZ	index_urlZextra_index_urlsZno_index�logger�debug�joinr!r�path�abspathr0Z_build_sessionZ_build_package_finderr"r
�	cache_dirZformat_controlrrr1r Z
isolated_moderr#Zpopulate_requirement_set�nameZhas_requirementsrrrZbuildrrZ
cleanup_files)r%rr&Z
index_urlsr3�finderZbuild_deleter4r!Zrequirement_set�wbr)r)r*�run|sv






zWheelCommand.run)�__name__�
__module__�__qualname__�__doc__r=ZusageZsummaryrr-r@�
__classcell__r)r))r(r*r
s
:r
)Z
__future__rZloggingrr5Zpip.basecommandrZpip.exceptionsrrZpip.reqrZ	pip.utilsrZpip.utils.buildrZpip.utils.deprecationr	Z	pip.wheelr
rZpiprZ	getLoggerrAr7r
r)r)r)r*�<module>s
site-packages/pip/commands/__pycache__/check.cpython-36.pyc000064400000002337147511334620017603 0ustar003

���ef�@sJddlZddlmZddlmZddlmZeje�Z	Gdd�de�Z
dS)�N)�Command)�check_requirements)�get_installed_distributionsc@s$eZdZdZdZdZdZdd�ZdS)�CheckCommandz7Verify installed packages have compatible dependencies.Zcheckz
      %prog [options]c
	Cs�tdfd�}t|�\}}x~|D]v}d|j|jf}x*|j|g�D]}tjd|j|j|j�q@Wx4|j|g�D]$\}}	tjd|j|j||	j|	j�qlWqW|s�|r�dStjd�dS)NF)Z
local_only�skipz%s==%sz*%s %s requires %s, which is not installed.z-%s %s has requirement %s, but you have %s %s.�zNo broken requirements found.)rrZproject_name�version�get�logger�info)
�selfZoptions�argsZdistsZmissing_reqs_dictZincompatible_reqs_dictZdist�keyZrequirement�actual�r�/usr/lib/python3.6/check.py�runs 

zCheckCommand.runN)�__name__�
__module__�__qualname__�__doc__�nameZusageZsummaryrrrrrrs
r)ZloggingZpip.basecommandrZpip.operations.checkrZ	pip.utilsrZ	getLoggerrr
rrrrr�<module>s

site-packages/pip/commands/__pycache__/search.cpython-36.opt-1.pyc000064400000010056147511334620020727 0ustar003

���e��@s�ddlmZddlZddlZddlZddlmZmZddlm	Z	ddl
mZddlm
Z
ddlmZddlmZdd	lmZdd
lmZddlmZddlmZdd
lmZeje�ZGdd�de�Z dd�Z!ddd�Z"dd�Z#dS)�)�absolute_importN)�Command�SUCCESS)�OrderedDict)�PipXmlrpcTransport)�PyPI)�get_terminal_size)�
indent_log)�CommandError)�NO_MATCHES_FOUND)�parse)�
pkg_resources)�
xmlrpc_clientcs<eZdZdZdZdZdZ�fdd�Zdd�Zd	d
�Z	�Z
S)�
SearchCommandz@Search for PyPI packages whose name or summary contains <query>.�searchz
      %prog [options] <query>zSearch PyPI for packages.cs@tt|�j||�|jjddddtjdd�|jjd|j�dS)Nz-iz--index�indexZURLz3Base URL of Python Package Index (default %default))�dest�metavar�default�helpr)	�superr�__init__Zcmd_optsZ
add_optionrZpypi_url�parserZinsert_option_group)�self�args�kw)�	__class__��/usr/lib/python3.6/search.pyrszSearchCommand.__init__cCsT|std��|}|j||�}t|�}d}tjj�r<t�d}t||d�|rPtSt	S)Nz)Missing required argument (search query).r)�terminal_width)
r
r�transform_hits�sys�stdout�isattyr�
print_resultsrr)r�optionsr�queryZ	pypi_hits�hitsrrrr�run)s

zSearchCommand.runcCsH|j}|j|��.}t||�}tj||�}|j||d�d�}|SQRXdS)N)�name�summary�or)rZ_build_sessionrrZServerProxyr)rr&r%Z	index_urlZsessionZ	transportZpypir'rrrr9s
zSearchCommand.search)�__name__�
__module__�__qualname__�__doc__r)Zusager*rr(r�
__classcell__rr)rrrsrcCs�t�}xv|D]n}|d}|d}|d}||j�krH|||gd�||<q||dj|�|t||d�kr|||d<qWt|j��S)z�
    The list from pypi is really a list of versions. We want a list of
    packages with the list of versions stored inline. This converts the
    list from pypi into one we can use.
    r)r*�version)r)r*�versionsr2)r�keys�append�highest_version�list�values)r'Zpackages�hitr)r*r1rrrr Bs
r cCsT|sdS|dkr&tdd�|D��d}dd�tjD�}�x|D�]}|d}|dpVd}|jdd	g�d}|dk	r�||d}|dkr�tj||�}d
d|dj|�}d|d||f|f}	yvtj|	�||k�r2tj	|�}
t
��Ht|d�}|
j|k�rtjd|
j�ntjd|
j�tjd|�WdQRXWq>t
k
�rJYq>Xq>WdS)NcSs.g|]&}t|d�t|jddg�d��qS)r)r2�-����)�len�get)�.0r8rrr�
<listcomp>csz!print_results.<locals>.<listcomp>�cSsg|]
}|j�qSr)Zproject_name)r>�prrrr?gsr)r*�r2r9r:��
�
� �z	%-*s - %sz%s (%s)zINSTALLED: %s (latest)z
INSTALLED: %sz
LATEST:    %sr;)�maxr
Zworking_setr=�textwrapZwrap�join�logger�infoZget_distributionr	r5r1�UnicodeEncodeError)r'Zname_column_widthrZinstalled_packagesr8r)r*r1Ztarget_width�lineZdistZlatestrrrr$^s>


r$cCst|td�S)N)�key)rH�
parse_version)r2rrrr5�sr5)NN)$Z
__future__rZloggingr!rIZpip.basecommandrrZ
pip.compatrZpip.downloadrZ
pip.modelsrZ	pip.utilsrZpip.utils.loggingr	Zpip.exceptionsr
Zpip.status_codesrZpip._vendor.packaging.versionrrPZpip._vendorr
Zpip._vendor.six.movesrZ	getLoggerr,rKrr r$r5rrrr�<module>s&
+
&site-packages/pip/commands/__pycache__/freeze.cpython-36.opt-1.pyc000064400000005000147511334620020733 0ustar003

���e�@sdddlmZddlZddlZddlmZddlmZddlm	Z	ddl
mZd
ZGdd�de�Z
dS)�)�absolute_importN)�stdlib_pkgs)�Command)�freeze)�
WheelCache�pip�
setuptools�
distribute�wheelcs8eZdZdZdZdZdZd
Z�fdd�Zdd	�Z	�Z
S)�
FreezeCommandzx
    Output installed packages in requirements format.

    packages are listed in a case-insensitive sorted order.
    rz
      %prog [options]z1Output installed packages in requirements format.�ext://sys.stderrc	s�tt|�j||�|jjddddgddd�|jjdd	d
dgddd�|jjd
dddddd�|jjdddddd�|jjdddddjt�d�|jjd|j�dS)Nz-rz
--requirement�requirements�append�filez}Use the order in the given requirements file and its comments when generating output. This option can be used multiple times.)�dest�action�default�metavar�helpz-fz--find-links�
find_linksZURLz<URL for finding packages, which will be added to the output.z-lz--local�local�
store_trueFzUIf in a virtualenv that has global access, do not output globally-installed packages.)rrrrz--user�userz,Only output packages installed in user-site.z--all�
freeze_allz,Do not skip these packages in the output: %sz, )rrrr)	�superr�__init__Zcmd_optsZ
add_option�join�DEV_PKGS�parserZinsert_option_group)�self�args�kw)�	__class__��/usr/lib/python3.6/freeze.pyrsDzFreezeCommand.__init__c
Cs�tjjt�t��}t|j|�}tt�}|js6|jt	�t
|j|j|j
|j|j|j||d�}x"tf|�D]}tjj|d�qfWdS)N)ZrequirementrZ
local_onlyZ	user_onlyZ
skip_regex�isolated�wheel_cache�skip�
)r�indexZ
FormatControl�setr�	cache_dirrr�updater�dictr
rrrZskip_requirements_regexZ
isolated_moder�sys�stdout�write)rZoptionsr Zformat_controlr&r'Z
freeze_kwargs�liner#r#r$�runEs 
zFreezeCommand.run)rr)�__name__�
__module__�__qualname__�__doc__�nameZusageZsummaryZlog_streamsrr2�
__classcell__r#r#)r"r$rs*r)rrr	r
)Z
__future__rr.rZ
pip.compatrZpip.basecommandrZpip.operations.freezerZ	pip.wheelrrrr#r#r#r$�<module>ssite-packages/pip/commands/__pycache__/download.cpython-36.opt-1.pyc000064400000012363147511334620021274 0ustar003

���e��@s�ddlmZddlZddlZddlmZddlmZddlm	Z	ddl
mZddlm
Z
ddlmZmZdd	lmZdd
lmZeje�ZGdd�de�ZdS)
�)�absolute_importN)�CommandError)�
FormatControl)�RequirementSet)�RequirementCommand)�
cmdoptions)�
ensure_dir�normalize_path)�BuildDirectory)�check_path_ownercs4eZdZdZdZdZdZ�fdd�Zdd�Z�Z	S)	�DownloadCommandaL
    Download packages from:

    - PyPI (and other indexes) using requirement specifiers.
    - VCS project urls.
    - Local project directories.
    - Local or remote source archives.

    pip also supports downloading from "requirements files", which provide
    an easy way to specify a whole environment to be downloaded.
    Zdownloada%
      %prog [options] <requirement specifier> [package-index-options] ...
      %prog [options] -r <requirements file> [package-index-options] ...
      %prog [options] [-e] <vcs project url> ...
      %prog [options] [-e] <local project path> ...
      %prog [options] <archive url/path> ...zDownload packages.c
s\tt|�j||�|j}|jtj��|jtj��|jtj��|jtj	��|jtj
��|jtj��|jtj��|jtj
��|jtj��|jtj��|jtj��|jtj��|jddddddtjdd�|jd	d
d
ddd�|jdd
d
ddd�|jdddddd�|jdddddd�tjtj|j�}|jjd|�|jjd|�dS)Nz-dz--destz--destination-dirz--destination-directory�download_dir�dirzDownload packages into <dir>.)�dest�metavar�default�helpz
--platform�platformz`Only download wheels compatible with <platform>. Defaults to the platform of the running system.z--python-version�python_versiona&Only download wheels compatible with Python interpreter version <version>. If not specified, then the current system interpreter minor version is used. A major version (e.g. '2') can be specified to match all minor revs of that major version.  A minor version (e.g. '34') can also be specified.z--implementation�implementationz�Only download wheels compatible with Python implementation <implementation>, e.g. 'pp', 'jy', 'cp',  or 'ip'. If not specified, then the current interpreter implementation is used.  Use 'py' to force implementation-agnostic wheels.z--abi�abiz�Only download wheels compatible with Python abi <abi>, e.g. 'pypy_41'.  If not specified, then the current interpreter abi tag is used.  Generally you will need to specify --implementation, --platform, and --python-version when using this option.r)�superr�__init__�cmd_optsZ
add_optionrZconstraintsZeditableZrequirements�	build_dirZno_depsZglobal_optionsZ	no_binaryZonly_binary�srcZpre�no_clean�require_hashes�os�curdirZmake_option_groupZnon_deprecated_index_group�parserZinsert_option_group)�self�args�kwrZ
index_opts)�	__class__��/usr/lib/python3.6/download.pyr*sbzDownloadCommand.__init__cCs�d|_|jr|jg}nd}t|j|j|j|jg�}tt�tdg��}|rZ|j|krZt	d��t
jj|j
�|_
t|j�|_t|j�|j|���}|j|||j||j|jd�}|jp�|j}|jr�t|j�r�tjd|j�d|_t|j|d���}	t|	|j
|jd|j||j|jd�}
|j|
|||||jd�|
j �s2dS|
j!|�dj"d	d
�|
j#D��}|�rdtj$d|�|j�st|
j%�WdQRXWdQRX|
S)NTz:all:z�--only-binary=:all: must be set and --no-binary must not be set (or must be set to :none:) when restricting platform and interpreter constraints using --python-version, --platform, --abi, or --implementation.)�options�sessionr�python_versionsrrz�The directory '%s' or its parent directory is not owned by the current user and caching wheels has been disabled. check the permissions and owner of that directory. If executing pip with sudo, you may want sudo's -H flag.)�delete)r�src_dirr
�ignore_installed�ignore_dependenciesr(�isolatedr� cSsg|]
}|j�qSr%)�name)�.0Zreqr%r%r&�
<listcomp>�sz'DownloadCommand.run.<locals>.<listcomp>zSuccessfully downloaded %s)&r,r�anyrrrr�setZformat_controlrr�path�abspathr+r	r
rZ_build_sessionZ_build_package_finderrr�	cache_dirr�loggerZwarningr
rr-Z
isolated_moderZpopulate_requirement_setr0Zhas_requirementsZ
prepare_files�joinZsuccessfully_downloaded�infoZ
cleanup_files)r!r'r"r)Zdist_restriction_setZbinary_onlyr(�finderZbuild_deleterZrequirement_setZ
downloadedr%r%r&�run{sx






zDownloadCommand.run)
�__name__�
__module__�__qualname__�__doc__r0ZusageZsummaryrr<�
__classcell__r%r%)r$r&rsQr)Z
__future__rZloggingrZpip.exceptionsrZ	pip.indexrZpip.reqrZpip.basecommandrZpiprZ	pip.utilsrr	Zpip.utils.buildr
Zpip.utils.filesystemrZ	getLoggerr=r8rr%r%r%r&�<module>s
site-packages/pip/commands/__pycache__/wheel.cpython-36.pyc000064400000012430147511334620017625 0ustar003

���e1�@s�ddlmZddlZddlZddlZddlmZddlmZm	Z	ddl
mZddlm
Z
ddlmZddlmZdd	lmZmZdd
lmZeje�ZGdd�de�ZdS)
�)�absolute_importN)�RequirementCommand)�CommandError�PreviousBuildDirError)�RequirementSet)�import_or_raise)�BuildDirectory)�RemovedInPip10Warning)�
WheelCache�WheelBuilder)�
cmdoptionscs<eZdZdZdZdZdZ�fdd�Zdd�Zd	d
�Z	�Z
S)�WheelCommanda�
    Build Wheel archives for your requirements and dependencies.

    Wheel is a built-package format, and offers the advantage of not
    recompiling your software during every install. For more details, see the
    wheel docs: https://wheel.readthedocs.io/en/latest/

    Requirements: setuptools>=0.8, and wheel.

    'pip wheel' uses the bdist_wheel setuptools extension from the wheel
    package to build individual wheels.

    Zwheelz�
      %prog [options] <requirement specifier> ...
      %prog [options] -r <requirements file> ...
      %prog [options] [-e] <vcs project url> ...
      %prog [options] [-e] <local project path> ...
      %prog [options] <archive url/path> ...z$Build wheels from your requirements.csPtt|�j||�|j}|jddddtjdd�|jtj��|jtj	��|jtj
��|jtj��|jddd	d
dd�|jtj��|jtj
��|jtj��|jtj��|jtj��|jtj��|jtj��|jd
dd
d	dd�|jddddd�|jtj��|jtj��tjtj|j�}|jjd|�|jjd|�dS)Nz-wz--wheel-dir�	wheel_dir�dirzLBuild wheels into <dir>, where the default is the current working directory.)�dest�metavar�default�helpz--build-option�
build_options�options�appendz9Extra arguments to be supplied to 'setup.py bdist_wheel'.)rr�actionrz--global-option�global_optionszZExtra global options to be supplied to the setup.py call before the 'bdist_wheel' command.)rrrrz--pre�
store_trueFzYInclude pre-release and development versions. By default, pip only finds stable versions.)rrrr)�superr
�__init__�cmd_optsZ
add_option�os�curdirrZ	use_wheelZno_use_wheelZ	no_binaryZonly_binaryZconstraintsZeditableZrequirements�src�ignore_requires_pythonZno_deps�	build_dir�no_clean�require_hashesZmake_option_groupZindex_group�parserZinsert_option_group)�self�args�kwrZ
index_opts)�	__class__��/usr/lib/python3.6/wheel.pyr.sVzWheelCommand.__init__cCs.tdtd�tdtd�}t|d�s*td��dS)Nzwheel.bdist_wheelzM'pip wheel' requires the 'wheel' package. To fix this, run: pip install wheel�
pkg_resourceszp'pip wheel' requires setuptools >= 0.8 for dist-info support. To fix this, run: pip install --upgrade setuptoolsZDistInfoDistribution)rr�hasattr)r%r+r)r)r*�check_required_packageshs
z$WheelCommand.check_required_packagesc Cs�|j�tj|�tj|�|jr.tjdt�|jr@tjdt�|j	rRtjdt�|j
g|j}|jr|t
jddj|��g}|jr�tjj|j�|_tjj|j�|_|j|���}|j||�}|jp�|j}t|j|j�}t|j|d���}t||jd|jd|j|j|||j |j!d�}	|j"|	|||||j#|�|	j$�s6dSzZy6t%|	||j&�pJg|j'�pTgd	�}
|
j(��slt)d
��Wnt*k
�r�d|_�YnXWd|j�s�|	j+�XWdQRXWdQRXdS)Nz�--allow-external has been deprecated and will be removed in the future. Due to changes in the repository protocol, it no longer has any effect.z�--allow-all-external has been deprecated and will be removed in the future. Due to changes in the repository protocol, it no longer has any effect.z�--allow-unverified has been deprecated and will be removed in the future. Due to changes in the repository protocol, it no longer has any effect.zIgnoring indexes: %s�,)�deleteT)r!�src_dirZdownload_dir�ignore_dependenciesZignore_installedr �isolated�session�wheel_cacheZwheel_download_dirr#)rrz"Failed to build one or more wheels),r-rZresolve_wheel_no_use_binaryZcheck_install_build_globalZallow_external�warnings�warnr	Zallow_all_externalZallow_unverifiedZ	index_urlZextra_index_urlsZno_index�logger�debug�joinr!r�path�abspathr0Z_build_sessionZ_build_package_finderr"r
�	cache_dirZformat_controlrrr1r Z
isolated_moderr#Zpopulate_requirement_set�nameZhas_requirementsrrrZbuildrrZ
cleanup_files)r%rr&Z
index_urlsr3�finderZbuild_deleter4r!Zrequirement_set�wbr)r)r*�run|sv






zWheelCommand.run)�__name__�
__module__�__qualname__�__doc__r=ZusageZsummaryrr-r@�
__classcell__r)r))r(r*r
s
:r
)Z
__future__rZloggingrr5Zpip.basecommandrZpip.exceptionsrrZpip.reqrZ	pip.utilsrZpip.utils.buildrZpip.utils.deprecationr	Z	pip.wheelr
rZpiprZ	getLoggerrAr7r
r)r)r)r*�<module>s
site-packages/pip/commands/__pycache__/help.cpython-36.opt-1.pyc000064400000002034147511334620020407 0ustar003

���e��@s<ddlmZddlmZmZddlmZGdd�de�ZdS)�)�absolute_import)�Command�SUCCESS)�CommandErrorc@s$eZdZdZdZdZdZdd�ZdS)�HelpCommandzShow help for commands�helpz
      %prog <command>zShow help for commands.c	Cs�ddlm}m}y|d}Wntk
r0tSX||krl||�}d|g}|r^|jd|�tdj|���||�}|jj	�tS)Nr)�
commands_dict�get_similar_commandszunknown command "%s"zmaybe you meant "%s"z - )
Zpip.commandsrr	�
IndexErrorr�appendr�join�parserZ
print_help)	�selfZoptions�argsrr	Zcmd_nameZguess�msgZcommand�r�/usr/lib/python3.6/help.py�runs


zHelpCommand.runN)�__name__�
__module__�__qualname__�__doc__�nameZusageZsummaryrrrrrrs
rN)Z
__future__rZpip.basecommandrrZpip.exceptionsrrrrrr�<module>ssite-packages/pip/commands/__pycache__/show.cpython-36.opt-1.pyc000064400000012301147511334620020435 0ustar003

���e�@s�ddlmZddlmZddlZddlZddlmZddlm	Z	m
Z
ddlmZddl
mZeje�ZGdd	�d	e�Zd
d�Zdd
d�ZdS)�)�absolute_import)�
FeedParserN)�Command)�SUCCESS�ERROR)�
pkg_resources)�canonicalize_namecs4eZdZdZdZdZdZ�fdd�Zdd�Z�Z	S)	�ShowCommandz6Show information about one or more installed packages.Zshowz$
      %prog [options] <package> ...z*Show information about installed packages.cs>tt|�j||�|jjddddddd�|jjd|j�dS)	Nz-fz--files�files�
store_trueFz7Show the full list of installed files for each package.)�dest�action�default�helpr)�superr	�__init__Zcmd_optsZ
add_option�parserZinsert_option_group)�self�args�kw)�	__class__��/usr/lib/python3.6/show.pyrszShowCommand.__init__cCs8|stjd�tS|}t|�}t||j|jd�s4tStS)Nz.ERROR: Please provide a package name or names.)�
list_files�verbose)�loggerZwarningr�search_packages_info�
print_resultsr
rr)rZoptionsr�query�resultsrrr�run"s
zShowCommand.run)
�__name__�
__module__�__qualname__�__doc__�nameZusage�summaryrr �
__classcell__rr)rrr	sr	c#si�xtjD]}|�t|j�<qWdd�|D�}�x�fdd�|D�D�]Ή�j�j�jdd��j�D�d�}d}d}t�tj�rވj	d�rȈj
d�}dd�|D�}�fd	d�|D�}�fd
d�|D�}�j	d�r܈jd�}nP�j	d��r�j
d�}�fd
d�|D�}�fdd�|D�}�j	d��r.�jd�}�j	d��rL�j
d�}||d<�j	d��r�x,�j
d�D]}	|	j��rd|	j�|d<P�qdWt
�}
|
j|�|
j�}xdD]}|j|�||<�q�Wg}
x4|j�D](}	|	jd��r�|
j|	td�d���q�W|
|d<|�rt|�|d<|VqFWdS)z�
    Gather details from installed distributions. Print distribution name,
    version, location, and installed files. Installed files requires a
    pip generated 'installed-files.txt' in the distributions '.egg-info'
    directory.
    cSsg|]}t|��qSr)r)�.0r%rrr�
<listcomp>:sz(search_packages_info.<locals>.<listcomp>csg|]}|�kr�|�qSrr)r(Zpkg)�	installedrrr)<scSsg|]
}|j�qSr)�project_name)r(Zdeprrrr)As)r%�version�location�requiresNZRECORDcSsg|]}|jd�d�qS)�,r)�split)r(�lrrrr)Iscsg|]}tjj�j|��qSr)�os�path�joinr-)r(�p)�distrrr)Jscsg|]}tjj|�j��qSr)r2r3�relpathr-)r(r5)r6rrr)KsZMETADATAzinstalled-files.txtcsg|]}tjj�j|��qSr)r2r3r4Zegg_info)r(r5)r6rrr)Sscsg|]}tjj|�j��qSr)r2r3r7r-)r(r5)r6rrr)TszPKG-INFOzentry_points.txt�entry_pointsZ	INSTALLER�	installer�metadata-versionr&�	home-page�author�author-email�licensezClassifier: �classifiersr
)r:r&r;r<r=r>)rZworking_setrr+r,r-r.�
isinstanceZDistInfoDistributionZhas_metadataZget_metadata_linesZget_metadata�striprZfeed�close�get�
splitlines�
startswith�append�len�sorted)rr5Zquery_names�packageZ	file_listZmetadata�lines�pathsr8�lineZfeed_parserZ
pkg_info_dict�keyr?r)r6r*rr/s^







rFc	Cs�d}�x�t|�D�]�\}}d}|dkr0tjd�tjd|jdd��tjd|jd	d��tjd
|jdd��tjd|jd
d��tjd|jdd��tjd|jdd��tjd|jdd��tjd|jdd��tjddj|jdg���|�rxtjd|jdd��tjd|jdd��tjd�x"|jdg�D]}tjd|��q0Wtjd �x&|jd!g�D]}tjd|j���q^W|rtjd"�x&|jd#g�D]}tjd|j���q�Wd#|krtjd$�qW|S)%zD
    Print the informations from installed distributions found.
    FTrz---zName: %sr%�zVersion: %sr,zSummary: %sr&z
Home-page: %sz	home-pagez
Author: %sr<zAuthor-email: %szauthor-emailzLicense: %sr>zLocation: %sr-zRequires: %sz, r.zMetadata-Version: %szmetadata-versionz
Installer: %sr9zClassifiers:r?z  %sz
Entry-points:r8zFiles:r
z!Cannot locate installed-files.txt)�	enumerater�inforCr4rA)	Z
distributionsrrZresults_printed�ir6Z
classifier�entryrLrrrrxs>



r)FF)Z
__future__rZemail.parserrZloggingr2Zpip.basecommandrZpip.status_codesrrZpip._vendorrZpip._vendor.packaging.utilsrZ	getLoggerr!rr	rrrrrr�<module>s
Isite-packages/pip/commands/__pycache__/__init__.cpython-36.pyc000064400000003726147511334620020270 0ustar003

���e��@s&dZddlmZddlmZddlmZddlmZddl	m
Z
ddlmZddl
mZdd	lmZdd
lmZddlmZddlmZdd
lmZddlmZejeejee
je
ejeejeejeejeejeejeejeejeejeiZeeeeeeeeee
eegZddd�Zdd�Zdd�Z dS)z%
Package containing all pip commands
�)�absolute_import)�CompletionCommand)�DownloadCommand)�
FreezeCommand)�HashCommand)�HelpCommand)�ListCommand)�CheckCommand)�
SearchCommand)�ShowCommand)�InstallCommand)�UninstallCommand)�WheelCommandTccs:|rttt�}ntj�}x|D]\}}||jfVqWdS)z5Yields sorted (command name, command summary) tuples.N)�_sort_commands�
commands_dict�commands_order�itemsZsummary)ZorderedZcmditems�nameZ
command_class�r�/usr/lib/python3.6/__init__.py�
get_summaries4s
rcCs6ddlm}|j�}||tj��}|r.|dSdSdS)zCommand name auto-correct.r)�get_close_matchesFN)Zdifflibr�lowerr�keys)rrZclose_commandsrrr�get_similar_commands@srcs�fdd�}t|j�|d�S)Ncs(y�j|d�Stk
r"dSXdS)N��)�index�
ValueError)�key)�orderrr�keyfnOsz_sort_commands.<locals>.keyfn)r)�sortedr)Zcmddictr r!r)r rrNsrN)T)!�__doc__Z
__future__rZpip.commands.completionrZpip.commands.downloadrZpip.commands.freezerZpip.commands.hashrZpip.commands.helprZpip.commands.listrZpip.commands.checkr	Zpip.commands.searchr
Zpip.commands.showrZpip.commands.installrZpip.commands.uninstallr
Zpip.commands.wheelrrrrrrrrrrr�<module>sP

site-packages/pip/commands/__pycache__/uninstall.cpython-36.opt-1.pyc000064400000004752147511334620021501 0ustar003

���eD�@s`ddlmZddlZddlmZddlmZmZmZddl	m
Z
ddlmZGdd�de
�Z
dS)	�)�absolute_importN)�
WheelCache)�InstallRequirement�RequirementSet�parse_requirements)�Command)�InstallationErrorcs4eZdZdZdZdZdZ�fdd�Zdd�Z�Z	S)	�UninstallCommandaB
    Uninstall packages.

    pip is able to uninstall most installed packages. Known exceptions are:

    - Pure distutils packages installed with ``python setup.py install``, which
      leave behind no metadata to determine what files were installed.
    - Script wrappers installed by ``python setup.py develop``.
    �	uninstallzU
      %prog [options] <package> ...
      %prog [options] -r <requirements file> ...zUninstall packages.c	sVtt|�j||�|jjddddgddd�|jjdd	d
ddd
�|jjd|j�dS)Nz-rz
--requirement�requirements�append�filezjUninstall all the packages listed in the given requirements file.  This option can be used multiple times.)�dest�action�default�metavar�helpz-yz--yes�yes�
store_truez2Don't ask for confirmation of uninstall deletions.)rrrr)�superr	�__init__Zcmd_optsZ
add_option�parserZinsert_option_group)�self�args�kw)�	__class__��/usr/lib/python3.6/uninstall.pyrszUninstallCommand.__init__c
Cs�|j|���}tjjt�t��}t|j|�}tddd|j||d�}x$|D]}|j	t
j||j|d��qFWx2|jD](}x"t
||||d�D]}	|j	|	�q�WqnW|js�tdt|jd���|j|jd�WdQRXdS)N)Z	build_dirZsrc_dirZdownload_dir�isolated�session�wheel_cache)rr )�optionsrr zLYou must give at least one requirement to %(name)s (see "pip help %(name)s"))�name)Zauto_confirm)Z_build_session�pip�indexZ
FormatControl�setr�	cache_dirrZ
isolated_modeZadd_requirementrZ	from_linerrZhas_requirementsr�dictr"r
r)
rr!rrZformat_controlr Zrequirement_setr"�filenameZreqrrr�run-s6
zUninstallCommand.run)
�__name__�
__module__�__qualname__�__doc__r"ZusageZsummaryrr)�
__classcell__rr)rrr	
s	r	)Z
__future__rr#Z	pip.wheelrZpip.reqrrrZpip.basecommandrZpip.exceptionsrr	rrrr�<module>ssite-packages/pip/commands/__pycache__/list.cpython-36.opt-1.pyc000064400000022630147511334620020436 0ustar003

���ei,�@s�ddlmZddlZddlZddlZyddlmZWn ek
rTddlmZYnXddl	m
Z
ddlmZddl
mZddlmZdd	lmZmZdd
lmZddlmZmZeje�ZGdd
�d
e�Zdd�Zdd�Zdd�ZdS)�)�absolute_importN)�zip_longest)�izip_longest)�six)�Command)�CommandError)�
PackageFinder)�get_installed_distributions�dist_is_editable)�RemovedInPip10Warning)�make_option_group�index_groupcs|eZdZdZdZdZdZ�fdd�Zdd�Zd	d
�Z	dd�Z
d
d�Zdd�Zdd�Z
dd�Zdd�Zdd�Zdd�Z�ZS)�ListCommandzt
    List installed packages, including editables.

    Packages are listed in a case-insensitive sorted order.
    �listz
      %prog [options]zList installed packages.cs�tt|�j||�|j}|jdddddd�|jddddd	d�|jd
ddddd�|jd
ddddd�|jjdddddd�|jddddd�|jdddd$dd�|jddd d!d"�tt|j�}|jjd#|�|jjd#|�dS)%Nz-oz
--outdated�
store_trueFzList outdated packages)�action�default�helpz-uz
--uptodatezList uptodate packagesz-ez
--editablezList editable projects.z-lz--localzSIf in a virtualenv that has global access, do not list globally-installed packages.z--user�userz,Only output packages installed in user-site.)�destrrrz--prezYInclude pre-release and development versions. By default, pip only finds stable versions.z--formatZstore�list_format�legacy�columns�freeze�jsonzJSelect the output format among: legacy (default), columns, freeze or json.)rr�choicesrz--not-required�not_requiredz>List packages that are not dependencies of installed packages.)rrrr)rrrr)	�superr�__init__�cmd_optsZ
add_optionrr
�parserZinsert_option_group)�self�args�kwrZ
index_opts)�	__class__��/usr/lib/python3.6/list.pyr#s^zListCommand.__init__cCst|j||j|j|j|d�S)zK
        Create a package finder appropriate to this list command.
        )�
find_links�
index_urlsZallow_all_prereleases�
trusted_hosts�process_dependency_links�session)rr'�prer)r*)r!�optionsr(r+r%r%r&�_build_package_findercsz!ListCommand._build_package_findercCs�|jrtjdt�|jr$tjdt�|jr6tjdt�|jdkrLtjdt�|jr`|jr`t	d��t
|j|j|j
d�}|jr�|j||�}n|jr�|j||�}|jr�|j||�}|j||�dS)Nz�--allow-external has been deprecated and will be removed in the future. Due to changes in the repository protocol, it no longer has any effect.z�--allow-all-external has been deprecated and will be removed in the future. Due to changes in the repository protocol, it no longer has any effect.z�--allow-unverified has been deprecated and will be removed in the future. Due to changes in the repository protocol, it no longer has any effect.z�The default format will switch to columns in the future. You can use --format=(legacy|columns) (or define a format=(legacy|columns) in your pip.conf under the [list] section) to disable this warning.z5Options --outdated and --uptodate cannot be combined.)Z
local_onlyZ	user_onlyZeditables_only)Zallow_external�warnings�warnrZallow_all_externalZallow_unverifiedr�outdatedZuptodaterr	ZlocalrZeditable�get_outdated�get_uptodater�get_not_required�output_package_listing)r!r-r"�packagesr%r%r&�runps<

zListCommand.runcCsdd�|j||�D�S)NcSsg|]}|j|jkr|�qSr%)�latest_version�parsed_version)�.0�distr%r%r&�
<listcomp>�sz,ListCommand.get_outdated.<locals>.<listcomp>)�iter_packages_latest_infos)r!r6r-r%r%r&r2�szListCommand.get_outdatedcCsdd�|j||�D�S)NcSsg|]}|j|jkr|�qSr%)r8r9)r:r;r%r%r&r<�sz,ListCommand.get_uptodate.<locals>.<listcomp>)r=)r!r6r-r%r%r&r3�szListCommand.get_uptodatecsBt��x$|D]}�jdd�|j�D��qWt�fdd�|D��S)Ncss|]}|jVqdS)N)�key)r:Zrequirementr%r%r&�	<genexpr>�sz/ListCommand.get_not_required.<locals>.<genexpr>c3s|]}|j�kr|VqdS)N)r>)r:Zpkg)�dep_keysr%r&r?�s)�set�updateZrequires)r!r6r-r;r%)r@r&r4�s
zListCommand.get_not_requiredccs�|jg|j}|jr*tjddj|��g}g}x&|D]}|jd�r4|j|jd��q4W|j	|���}|j
|||�}|j|�xn|D]f}d}|j|j
�}	|js�dd�|	D�}	|	s�q�t|	|jd�}
|
j}|
jjr�d}nd	}||_||_|Vq�WWdQRXdS)
NzIgnoring indexes: %s�,zdependency_links.txt�unknowncSsg|]}|jjs|�qSr%)�versionZ
is_prerelease)r:�	candidater%r%r&r<�sz:ListCommand.iter_packages_latest_infos.<locals>.<listcomp>)r>ZwheelZsdist)Z	index_urlZextra_index_urlsZno_index�logger�debug�joinZhas_metadata�extendZget_metadata_linesZ_build_sessionr.Zadd_dependency_linksZfind_all_candidatesr>r,�maxZ_candidate_sort_keyrE�locationZis_wheelr8�latest_filetype)r!r6r-r(Zdependency_linksr;r+�finder�typZall_candidatesZbest_candidateZremote_versionr%r%r&r=�s8




z&ListCommand.iter_packages_latest_infoscCs0t|�rd|j|j|jfSd|j|jfSdS)Nz%s (%s, %s)z%s (%s))r
�project_namerErL)r!r;r%r%r&�
output_legacy�s
zListCommand.output_legacycCsd|j|�|j|jfS)Nz%s - Latest: %s [%s])rQr8rM)r!r;r%r%r&�output_legacy_latest�sz ListCommand.output_legacy_latestcCs�t|dd�d�}|jdkr:|r:t||�\}}|j||�n~|jdkrfxr|D]}tjd|j|j�qJWnR|jdkr�tjt||��n6x4|D],}|j	r�tj|j
|��q�tj|j|��q�WdS)NcSs
|jj�S)N)rP�lower)r;r%r%r&�<lambda>�sz4ListCommand.output_package_listing.<locals>.<lambda>)r>rrz%s==%sr)�sortedr�format_for_columns�output_package_listing_columnsrG�inforPrE�format_for_jsonr1rRrQ)r!r6r-�data�headerr;r%r%r&r5�s



z"ListCommand.output_package_listingcCsht|�dkr|jd|�t|�\}}t|�dkrL|jddjtdd�|���x|D]}tj|�qRWdS)Nr�� cSsd|S)N�-r%)�xr%r%r&rT
sz<ListCommand.output_package_listing_columns.<locals>.<lambda>)�len�insert�tabulaterI�maprGrX)r!rZr[Zpkg_strings�sizes�valr%r%r&rWs
z*ListCommand.output_package_listing_columns)�__name__�
__module__�__qualname__�__doc__�nameZusageZsummaryrr.r7r2r3r4r=rQrRr5rW�
__classcell__r%r%)r$r&rs@
6'
rcCsxdgtdd�|D��}x |D]}dd�t||�D�}qWg}x0|D](}djdd�t||�D��}|j|�qDW||fS)Nrcss|]}t|�VqdS)N)r`)r:r_r%r%r&r?sztabulate.<locals>.<genexpr>cSs"g|]\}}t|tt|����qSr%)rKr`�str)r:�s�cr%r%r&r<sztabulate.<locals>.<listcomp>r]cSs*g|]"\}}|dk	r"t|�j|�nd�qS)N�)rl�ljust)r:rmrnr%r%r&r<s)rKrrI�append)�valsrd�row�resultZdisplayr%r%r&rbs


rbcCs�|j}|rddddg}nddg}g}tdd�|D��r@|jd�xR|D]J}|j|jg}|rr|j|j�|j|j�t|�r�|j|j�|j|�qFW||fS)z_
    Convert the package data into something usable
    by output_package_listing_columns.
    ZPackageZVersionZLatestZTypecss|]}t|�VqdS)N)r
)r:r_r%r%r&r?2sz%format_for_columns.<locals>.<genexpr>ZLocation)	r1�anyrqrPrEr8rMr
rL)Zpkgsr-Zrunning_outdatedr[rZZprojrsr%r%r&rV%s 

rVcCsZg}xJ|D]B}|jtj|j�d�}|jrBtj|j�|d<|j|d<|j|�q
Wtj	|�S)N)rjrEr8rM)
rPrZ	text_typerEr1r8rMrqr�dumps)r6r-rZr;rXr%r%r&rYFs

rY) Z
__future__rrZloggingr/�	itertoolsr�ImportErrorrZpip._vendorrZpip.basecommandrZpip.exceptionsrZ	pip.indexrZ	pip.utilsr	r
Zpip.utils.deprecationrZpip.cmdoptionsrr
Z	getLoggerrfrGrrbrVrYr%r%r%r&�<module>s(
|!site-packages/pip/commands/__pycache__/completion.cpython-36.pyc000064400000005022147511334620020671 0ustar003

���e�	�@sDddlmZddlZddlmZdZdddd�ZGd	d
�d
e�ZdS)�)�absolute_importN)�CommandzJ
# pip %(shell)s completion start%(script)s# pip %(shell)s completion end
z�
_pip_completion()
{
    COMPREPLY=( $( COMP_WORDS="${COMP_WORDS[*]}" \
                   COMP_CWORD=$COMP_CWORD \
                   PIP_AUTO_COMPLETE=1 $1 ) )
}
complete -o default -F _pip_completion pip
z�
function _pip_completion {
  local words cword
  read -Ac words
  read -cn cword
  reply=( $( COMP_WORDS="$words[*]" \
             COMP_CWORD=$(( cword-1 )) \
             PIP_AUTO_COMPLETE=1 $words[1] ) )
}
compctl -K _pip_completion pip
a
function __fish_complete_pip
    set -lx COMP_WORDS (commandline -o) ""
    set -lx COMP_CWORD (math (contains -i -- (commandline -t) $COMP_WORDS)-1)
    set -lx PIP_AUTO_COMPLETE 1
    string split \  -- (eval $COMP_WORDS[1])
end
complete -fa "(__fish_complete_pip)" -c pip
)�bash�zsh�fishcs0eZdZdZdZdZ�fdd�Zdd�Z�ZS)�CompletionCommandz3A helper command to be used for command completion.Z
completionz-A helper command used for command completion.csltt|�j||�|j}|jddddddd�|jdd	dd
ddd�|jdd
ddddd�|jjd|�dS)Nz--bashz-b�store_constr�shellzEmit completion code for bash)�action�const�dest�helpz--zshz-zrzEmit completion code for zshz--fishz-frzEmit completion code for fishr)�superr�__init__�cmd_optsZ
add_option�parserZinsert_option_group)�self�args�kwr)�	__class__�� /usr/lib/python3.6/completion.pyr-s*zCompletionCommand.__init__cCsbtj�}dd�t|�D�}|j|krHtj|jd�}tt||jd��ntjj	ddj
|��dS)z-Prints the completion code of the given shellcSsg|]}d|�qS)z--r)�.0r	rrr�
<listcomp>Jsz)CompletionCommand.run.<locals>.<listcomp>�)�scriptr	zERROR: You must pass %s
z or N)�COMPLETION_SCRIPTS�keys�sortedr	�get�print�BASE_COMPLETION�sys�stderr�write�join)rZoptionsrZshellsZ
shell_optionsrrrr�runGs
zCompletionCommand.run)	�__name__�
__module__�__qualname__�__doc__�nameZsummaryrr&�
__classcell__rr)rrr(s
r)Z
__future__rr"Zpip.basecommandrr!rrrrrr�<module>s
site-packages/pip/commands/__pycache__/hash.cpython-36.opt-1.pyc000064400000003555147511334620020413 0ustar003

���e=�@s~ddlmZddlZddlZddlZddlmZddlmZddl	m
Z
ddlmZm
Z
eje�ZGdd�de�Zd	d
�ZdS)�)�absolute_importN)�Command)�ERROR)�read_chunks)�
FAVORITE_HASH�
STRONG_HASHEScs4eZdZdZdZdZdZ�fdd�Zdd�Z�Z	S)	�HashCommandz�
    Compute a hash of a local package archive.

    These can be used with --hash in a requirements file to do repeatable
    installs.

    �hashz%prog [options] <file> ...z#Compute hashes of package archives.c
sJtt|�j||�|jjdddtdtddjt�d�|jj	d|j�dS)	Nz-az--algorithm�	algorithmZstorez$The hash algorithm to use: one of %sz, )�dest�choices�action�default�helpr)
�superr�__init__Zcmd_optsZ
add_optionrr�join�parserZinsert_option_group)�self�args�kw)�	__class__��/usr/lib/python3.6/hash.pyrszHashCommand.__init__cCsD|s|jjtj�tS|j}x"|D]}tjd||t||��q"WdS)Nz%s:
--hash=%s:%s)	rZprint_usage�sys�stderrrr
�logger�info�
_hash_of_file)rZoptionsrr
�pathrrr�run(s
zHashCommand.run)
�__name__�
__module__�__qualname__�__doc__�nameZusageZsummaryrr �
__classcell__rr)rrrsrc
CsDt|d��,}tj|�}xt|�D]}|j|�q WWdQRX|j�S)z!Return the hash digest of a file.�rbN)�open�hashlib�newr�updateZ	hexdigest)rr
�archiver	�chunkrrrr3s

r)Z
__future__rr)ZloggingrZpip.basecommandrZpip.status_codesrZ	pip.utilsrZpip.utils.hashesrrZ	getLoggerr!rrrrrrr�<module>s
#site-packages/pip/commands/__pycache__/freeze.cpython-36.pyc000064400000005000147511334620017774 0ustar003

���e�@sdddlmZddlZddlZddlmZddlmZddlm	Z	ddl
mZd
ZGdd�de�Z
dS)�)�absolute_importN)�stdlib_pkgs)�Command)�freeze)�
WheelCache�pip�
setuptools�
distribute�wheelcs8eZdZdZdZdZdZd
Z�fdd�Zdd	�Z	�Z
S)�
FreezeCommandzx
    Output installed packages in requirements format.

    packages are listed in a case-insensitive sorted order.
    rz
      %prog [options]z1Output installed packages in requirements format.�ext://sys.stderrc	s�tt|�j||�|jjddddgddd�|jjdd	d
dgddd�|jjd
dddddd�|jjdddddd�|jjdddddjt�d�|jjd|j�dS)Nz-rz
--requirement�requirements�append�filez}Use the order in the given requirements file and its comments when generating output. This option can be used multiple times.)�dest�action�default�metavar�helpz-fz--find-links�
find_linksZURLz<URL for finding packages, which will be added to the output.z-lz--local�local�
store_trueFzUIf in a virtualenv that has global access, do not output globally-installed packages.)rrrrz--user�userz,Only output packages installed in user-site.z--all�
freeze_allz,Do not skip these packages in the output: %sz, )rrrr)	�superr�__init__Zcmd_optsZ
add_option�join�DEV_PKGS�parserZinsert_option_group)�self�args�kw)�	__class__��/usr/lib/python3.6/freeze.pyrsDzFreezeCommand.__init__c
Cs�tjjt�t��}t|j|�}tt�}|js6|jt	�t
|j|j|j
|j|j|j||d�}x"tf|�D]}tjj|d�qfWdS)N)ZrequirementrZ
local_onlyZ	user_onlyZ
skip_regex�isolated�wheel_cache�skip�
)r�indexZ
FormatControl�setr�	cache_dirrr�updater�dictr
rrrZskip_requirements_regexZ
isolated_moder�sys�stdout�write)rZoptionsr Zformat_controlr&r'Z
freeze_kwargs�liner#r#r$�runEs 
zFreezeCommand.run)rr)�__name__�
__module__�__qualname__�__doc__�nameZusageZsummaryZlog_streamsrr2�
__classcell__r#r#)r"r$rs*r)rrr	r
)Z
__future__rr.rZ
pip.compatrZpip.basecommandrZpip.operations.freezerZ	pip.wheelrrrr#r#r#r$�<module>ssite-packages/pip/commands/__pycache__/check.cpython-36.opt-1.pyc000064400000002337147511334620020542 0ustar003

���ef�@sJddlZddlmZddlmZddlmZeje�Z	Gdd�de�Z
dS)�N)�Command)�check_requirements)�get_installed_distributionsc@s$eZdZdZdZdZdZdd�ZdS)�CheckCommandz7Verify installed packages have compatible dependencies.Zcheckz
      %prog [options]c
	Cs�tdfd�}t|�\}}x~|D]v}d|j|jf}x*|j|g�D]}tjd|j|j|j�q@Wx4|j|g�D]$\}}	tjd|j|j||	j|	j�qlWqW|s�|r�dStjd�dS)NF)Z
local_only�skipz%s==%sz*%s %s requires %s, which is not installed.z-%s %s has requirement %s, but you have %s %s.�zNo broken requirements found.)rrZproject_name�version�get�logger�info)
�selfZoptions�argsZdistsZmissing_reqs_dictZincompatible_reqs_dictZdist�keyZrequirement�actual�r�/usr/lib/python3.6/check.py�runs 

zCheckCommand.runN)�__name__�
__module__�__qualname__�__doc__�nameZusageZsummaryrrrrrrs
r)ZloggingZpip.basecommandrZpip.operations.checkrZ	pip.utilsrZ	getLoggerrr
rrrrr�<module>s

site-packages/pip/commands/__pycache__/help.cpython-36.pyc000064400000002034147511334620017450 0ustar003

���e��@s<ddlmZddlmZmZddlmZGdd�de�ZdS)�)�absolute_import)�Command�SUCCESS)�CommandErrorc@s$eZdZdZdZdZdZdd�ZdS)�HelpCommandzShow help for commands�helpz
      %prog <command>zShow help for commands.c	Cs�ddlm}m}y|d}Wntk
r0tSX||krl||�}d|g}|r^|jd|�tdj|���||�}|jj	�tS)Nr)�
commands_dict�get_similar_commandszunknown command "%s"zmaybe you meant "%s"z - )
Zpip.commandsrr	�
IndexErrorr�appendr�join�parserZ
print_help)	�selfZoptions�argsrr	Zcmd_nameZguess�msgZcommand�r�/usr/lib/python3.6/help.py�runs


zHelpCommand.runN)�__name__�
__module__�__qualname__�__doc__�nameZusageZsummaryrrrrrrs
rN)Z
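The plain .pyc and .opt-1.pyc members above are the byte-code caches CPython writes for each module at optimization levels 0 and 1. A minimal sketch of how such a cache directory could be regenerated from the .py sources follows; it assumes a CPython 3.6 interpreter and a writable package directory, and the path used is illustrative only.

# Sketch only - regenerates __pycache__ members like those listed above.
# Assumes CPython 3.6; the directory path is illustrative, not from the archive.
import compileall

pkg_dir = 'site-packages/pip/commands'
compileall.compile_dir(pkg_dir, optimize=0)   # writes *.cpython-36.pyc
compileall.compile_dir(pkg_dir, optimize=1)   # writes *.cpython-36.opt-1.pyc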
site-packages/pip/commands/search.py000064400000010626147511334620013507 0ustar00from __future__ import absolute_import

import logging
import sys
import textwrap

from pip.basecommand import Command, SUCCESS
from pip.compat import OrderedDict
from pip.download import PipXmlrpcTransport
from pip.models import PyPI
from pip.utils import get_terminal_size
from pip.utils.logging import indent_log
from pip.exceptions import CommandError
from pip.status_codes import NO_MATCHES_FOUND
from pip._vendor.packaging.version import parse as parse_version
from pip._vendor import pkg_resources
from pip._vendor.six.moves import xmlrpc_client


logger = logging.getLogger(__name__)


class SearchCommand(Command):
    """Search for PyPI packages whose name or summary contains <query>."""
    name = 'search'
    usage = """
      %prog [options] <query>"""
    summary = 'Search PyPI for packages.'

    def __init__(self, *args, **kw):
        super(SearchCommand, self).__init__(*args, **kw)
        self.cmd_opts.add_option(
            '-i', '--index',
            dest='index',
            metavar='URL',
            default=PyPI.pypi_url,
            help='Base URL of Python Package Index (default %default)')

        self.parser.insert_option_group(0, self.cmd_opts)

    def run(self, options, args):
        if not args:
            raise CommandError('Missing required argument (search query).')
        query = args
        pypi_hits = self.search(query, options)
        hits = transform_hits(pypi_hits)

        terminal_width = None
        if sys.stdout.isatty():
            terminal_width = get_terminal_size()[0]

        print_results(hits, terminal_width=terminal_width)
        if pypi_hits:
            return SUCCESS
        return NO_MATCHES_FOUND

    def search(self, query, options):
        index_url = options.index
        with self._build_session(options) as session:
            transport = PipXmlrpcTransport(index_url, session)
            pypi = xmlrpc_client.ServerProxy(index_url, transport)
            hits = pypi.search({'name': query, 'summary': query}, 'or')
            return hits
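        # Note (editorial, not in the original source): judging from
        # transform_hits() below, each hit returned by the XML-RPC search is a
        # dict with at least 'name', 'summary' and 'version' keys, one entry
        # per release of a matching package.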


def transform_hits(hits):
    """
    The list from pypi is really a list of versions. We want a list of
    packages with the list of versions stored inline. This converts the
    list from pypi into one we can use.
    """
    packages = OrderedDict()
    for hit in hits:
        name = hit['name']
        summary = hit['summary']
        version = hit['version']

        if name not in packages:
            packages[name] = {
                'name': name,
                'summary': summary,
                'versions': [version],
            }
        else:
            packages[name]['versions'].append(version)

            # if this is the highest version, use its summary for the package
            if version == highest_version(packages[name]['versions']):
                packages[name]['summary'] = summary

    return list(packages.values())
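# Illustrative example (not part of the original pip source). Given two
# releases of a hypothetical package in the raw search results:
#   [{'name': 'foo', 'summary': 'Foo pkg', 'version': '1.0'},
#    {'name': 'foo', 'summary': 'Foo pkg', 'version': '2.0'}]
# transform_hits() collapses them into a single entry:
#   [{'name': 'foo', 'summary': 'Foo pkg', 'versions': ['1.0', '2.0']}]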


def print_results(hits, name_column_width=None, terminal_width=None):
    if not hits:
        return
    if name_column_width is None:
        name_column_width = max([
            len(hit['name']) + len(hit.get('versions', ['-'])[-1])
            for hit in hits
        ]) + 4

    installed_packages = [p.project_name for p in pkg_resources.working_set]
    for hit in hits:
        name = hit['name']
        summary = hit['summary'] or ''
        version = hit.get('versions', ['-'])[-1]
        if terminal_width is not None:
            target_width = terminal_width - name_column_width - 5
            if target_width > 10:
                # wrap and indent summary to fit terminal
                summary = textwrap.wrap(summary, target_width)
                summary = ('\n' + ' ' * (name_column_width + 3)).join(summary)

        line = '%-*s - %s' % (name_column_width,
                              '%s (%s)' % (name, version), summary)
        try:
            logger.info(line)
            if name in installed_packages:
                dist = pkg_resources.get_distribution(name)
                with indent_log():
                    latest = highest_version(hit['versions'])
                    if dist.version == latest:
                        logger.info('INSTALLED: %s (latest)', dist.version)
                    else:
                        logger.info('INSTALLED: %s', dist.version)
                        logger.info('LATEST:    %s', latest)
        except UnicodeEncodeError:
            pass


def highest_version(versions):
    return max(versions, key=parse_version)
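# Editorial note (not part of the original source): parse_version applies
# PEP 440 ordering, so highest_version(['1.9', '1.10']) returns '1.10',
# whereas a plain string max() would return '1.9'.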
site-packages/pip/commands/show.py
from __future__ import absolute_import

from email.parser import FeedParser
import logging
import os

from pip.basecommand import Command
from pip.status_codes import SUCCESS, ERROR
from pip._vendor import pkg_resources
from pip._vendor.packaging.utils import canonicalize_name


logger = logging.getLogger(__name__)


class ShowCommand(Command):
    """Show information about one or more installed packages."""
    name = 'show'
    usage = """
      %prog [options] <package> ..."""
    summary = 'Show information about installed packages.'

    def __init__(self, *args, **kw):
        super(ShowCommand, self).__init__(*args, **kw)
        self.cmd_opts.add_option(
            '-f', '--files',
            dest='files',
            action='store_true',
            default=False,
            help='Show the full list of installed files for each package.')

        self.parser.insert_option_group(0, self.cmd_opts)

    def run(self, options, args):
        if not args:
            logger.warning('ERROR: Please provide a package name or names.')
            return ERROR
        query = args

        results = search_packages_info(query)
        if not print_results(
                results, list_files=options.files, verbose=options.verbose):
            return ERROR
        return SUCCESS


def search_packages_info(query):
    """
    Gather details from installed distributions. Print distribution name,
    version, location, and installed files. Installed files requires a
    pip generated 'installed-files.txt' in the distributions '.egg-info'
    directory.
    """
    installed = {}
    for p in pkg_resources.working_set:
        installed[canonicalize_name(p.project_name)] = p

    query_names = [canonicalize_name(name) for name in query]

    for dist in [installed[pkg] for pkg in query_names if pkg in installed]:
        package = {
            'name': dist.project_name,
            'version': dist.version,
            'location': dist.location,
            'requires': [dep.project_name for dep in dist.requires()],
        }
        file_list = None
        metadata = None
        if isinstance(dist, pkg_resources.DistInfoDistribution):
            # RECORD should be part of the .dist-info metadata
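            # Each RECORD line is CSV-formatted as "path,hash,size"; only
            # the path component is used here.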
            if dist.has_metadata('RECORD'):
                lines = dist.get_metadata_lines('RECORD')
                paths = [l.split(',')[0] for l in lines]
                paths = [os.path.join(dist.location, p) for p in paths]
                file_list = [os.path.relpath(p, dist.location) for p in paths]

            if dist.has_metadata('METADATA'):
                metadata = dist.get_metadata('METADATA')
        else:
            # Otherwise use pip's own log of installed files for .egg-info
            # distributions
            if dist.has_metadata('installed-files.txt'):
                paths = dist.get_metadata_lines('installed-files.txt')
                paths = [os.path.join(dist.egg_info, p) for p in paths]
                file_list = [os.path.relpath(p, dist.location) for p in paths]

            if dist.has_metadata('PKG-INFO'):
                metadata = dist.get_metadata('PKG-INFO')

        if dist.has_metadata('entry_points.txt'):
            entry_points = dist.get_metadata_lines('entry_points.txt')
            package['entry_points'] = entry_points

        if dist.has_metadata('INSTALLER'):
            for line in dist.get_metadata_lines('INSTALLER'):
                if line.strip():
                    package['installer'] = line.strip()
                    break

        # @todo: Should pkg_resources.Distribution have a
        # `get_pkg_info` method?
        feed_parser = FeedParser()
        feed_parser.feed(metadata)
        pkg_info_dict = feed_parser.close()
        for key in ('metadata-version', 'summary',
                    'home-page', 'author', 'author-email', 'license'):
            package[key] = pkg_info_dict.get(key)

        # It looks like FeedParser cannot deal with repeated headers
        classifiers = []
        for line in metadata.splitlines():
            if line.startswith('Classifier: '):
                classifiers.append(line[len('Classifier: '):])
        package['classifiers'] = classifiers

        if file_list:
            package['files'] = sorted(file_list)
        yield package
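# Illustrative example (not part of the original pip source): each yielded
# dict looks roughly like
#   {'name': 'example', 'version': '1.0',
#    'location': '/usr/lib/python3.6/site-packages', 'requires': [],
#    'summary': '...', 'license': '...', 'classifiers': [...], 'files': [...]}
# with keys such as 'files', 'entry_points' and 'installer' present only when
# the corresponding metadata files exist.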


def print_results(distributions, list_files=False, verbose=False):
    """
    Print the information gathered for the installed distributions found.
    """
    results_printed = False
    for i, dist in enumerate(distributions):
        results_printed = True
        if i > 0:
            logger.info("---")
        logger.info("Name: %s", dist.get('name', ''))
        logger.info("Version: %s", dist.get('version', ''))
        logger.info("Summary: %s", dist.get('summary', ''))
        logger.info("Home-page: %s", dist.get('home-page', ''))
        logger.info("Author: %s", dist.get('author', ''))
        logger.info("Author-email: %s", dist.get('author-email', ''))
        logger.info("License: %s", dist.get('license', ''))
        logger.info("Location: %s", dist.get('location', ''))
        logger.info("Requires: %s", ', '.join(dist.get('requires', [])))
        if verbose:
            logger.info("Metadata-Version: %s",
                        dist.get('metadata-version', ''))
            logger.info("Installer: %s", dist.get('installer', ''))
            logger.info("Classifiers:")
            for classifier in dist.get('classifiers', []):
                logger.info("  %s", classifier)
            logger.info("Entry-points:")
            for entry in dist.get('entry_points', []):
                logger.info("  %s", entry.strip())
        if list_files:
            logger.info("Files:")
            for line in dist.get('files', []):
                logger.info("  %s", line.strip())
            if "files" not in dist:
                logger.info("Cannot locate installed-files.txt")
    return results_printed
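# Usage sketch (editorial, not part of the original source): a command such as
# `pip show --files requests` reaches ShowCommand.run(), which feeds
# search_packages_info(['requests']) into print_results() to emit the
# Name/Version/.../Files sections above.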
site-packages/pip/__pycache__/status_codes.cpython-36.pyc000064400000000505147511334620017420 0ustar003

���e��@s(ddlmZdZdZdZdZdZdZdS)�)�absolute_import�����N)Z
__future__r�SUCCESSZERRORZ
UNKNOWN_ERRORZVIRTUALENV_NOT_FOUNDZPREVIOUS_BUILD_DIR_ERRORZNO_MATCHES_FOUND�r	r	�"/usr/lib/python3.6/status_codes.py�<module>ssite-packages/pip/__pycache__/__main__.cpython-36.pyc000064400000000551147511334620016441 0ustar003

���eH�@shddlmZddlZddlZedkrFejjejje��Zejjde�ddl	Z	e
dkrdeje	j��dS)�)�absolute_importN��__main__)
Z
__future__r�os�sys�__package__�path�dirname�__file__�insertZpip�__name__�exit�main�rr�/usr/lib/python3.6/__main__.py�<module>ssite-packages/pip/__pycache__/basecommand.cpython-36.opt-1.pyc000064400000016035147511334620020135 0ustar003

���e�.�@s,dZddlmZddlZddlZddlZddlZddlZddlm	Z	ddl
mZddlm
Z
ddlmZddlmZmZmZmZmZdd	lmZdd
lmZmZddlmZmZddlmZm Z m!Z!m"Z"m#Z#dd
l$m%Z%m&Z&m'Z'ddl(m)Z)ddl*m+Z+dgZ,ej-e.�Z/Gdd�de0�Z1Gdd�de1�Z2dS)z(Base Command class, and related routines�)�absolute_importN)�
cmdoptions)�
PackageFinder)�running_under_virtualenv)�
PipSession)�
BadCommand�InstallationError�UninstallationError�CommandError�PreviousBuildDirError)�logging_dictConfig)�ConfigOptionParser�UpdatingDefaultsHelpFormatter)�InstallRequirement�parse_requirements)�SUCCESS�ERROR�
UNKNOWN_ERROR�VIRTUALENV_NOT_FOUND�PREVIOUS_BUILD_DIR_ERROR)�deprecation�get_prog�normalize_path)�IndentingFormatter)�pip_version_check�Commandc@s@eZdZdZdZdZd
Zddd�Zddd�Zd	d
�Z	dd�Z
dS)rNF�ext://sys.stdout�ext://sys.stderrcCsr|jdt�|jft�d|j|j|d�}tf|�|_d|jj�}tj	|j|�|_
tjtj
|j�}|jj|�dS)Nz%s %sF)�usage�prog�	formatterZadd_help_option�name�description�isolatedz
%s Options)rrr!r�__doc__r
�parser�
capitalize�optparseZOptionGroupZcmd_optsrZmake_option_groupZ
general_groupZadd_option_group)�selfr#Z	parser_kwZ
optgroup_nameZgen_opts�r)�!/usr/lib/python3.6/basecommand.py�__init__)szCommand.__init__cCs�t|jrttjj|jd��nd|dk	r*|n|j|jd�}|jrF|j|_	|j
rT|j
|_|js^|rr|dk	rj|n|j|_|jr�|j|jd�|_
|j|j_|S)N�http)�cache�retriesZinsecure_hosts)r,Zhttps)r�	cache_dirr�os�path�joinr.�
trusted_hostsZcertZverifyZclient_cert�timeout�proxyZproxies�no_inputZauthZ	prompting)r(�optionsr.r4�sessionr)r)r*�_build_sessionAs

zCommand._build_sessioncCs|jj|�S)N)r%�
parse_args)r(�argsr)r)r*r:eszCommand.parse_argscs�|j|�\}}|jr8|jdkr"d�|jdkr2d�qHd�n|jrDd�nd��}|jrVd}tddd	d
tjd�idtd
d�i�d|jdd	gdd�dd|jddd�dd|jp�dddd�d�|t	t
ddd|jr�dndg��d�t�fdd�d2D��d"��tj
dd�d3k�rtjd$tj�|j�r(d%tjd&<|j�rBd'j|j�tjd(<|j�rft��sftjd)�tjt��z$y"|j||�}t|t��r�|SW�n�t k
�r�}z tjt!|��tj"d*dd+�t#Sd}~Xn�t$t%t&fk
�r}z tjt!|��tj"d*dd+�t'Sd}~Xn~t(k
�rF}ztjd,|�tj"d*dd+�t'Sd}~XnDt)k
�rrtjd-�tj"d*dd+�t'Stjd.dd+�t*SWd|j+�r�t,|d/d��r�|j-|dt.d0|j/�d1��}t0|�WdQRXXt1S)4N��WARNING�rZCRITICAL�DEBUG�INFOFZexclude_warningsz pip.utils.logging.MaxLevelFilter)z()�level�indentz%(message)s)z()�formatz(pip.utils.logging.ColorizedStreamHandlerr)rA�class�stream�filtersr )rArDrEr z+pip.utils.logging.BetterRotatingFileHandlerz	/dev/nullT)rArD�filenameZdelayr )�console�console_errors�user_logrHrIrJ)rA�handlersc3s&|]}|d�dkrdndifVqdS)rAr@rr=r?N)r@rr))�.0r!)rAr)r*�	<genexpr>�s
zCommand.main.<locals>.<genexpr>�pip._vendor�distlib�requests�urllib3)�versionZdisable_existing_loggersrFZ
formattersrK�rootZloggers�z�Python 2.6 is no longer supported by the Python core team, please upgrade your Python. A future version of pip will drop support for Python 2.6�1ZPIP_NO_INPUT� ZPIP_EXISTS_ACTIONz2Could not find an activated virtualenv (required).zException information:)�exc_infoz	ERROR: %szOperation cancelled by userz
Exception:�no_index�)r.r4)rNrOrPrQ)r>rT)2r:�quiet�verbose�logr�loggingr=r�log_streams�list�filter�dict�sys�version_info�warnings�warnrZPython26DeprecationWarningr6r0�environZ
exists_actionr2Zrequire_venvr�loggerZcritical�exitrZrun�
isinstance�intr�str�debugrrr	rrr
�KeyboardInterruptrZdisable_pip_version_check�getattrr9�minr4rr)r(r;r7Z
root_levelZstatus�excr8r))rAr*�mainis�










zCommand.main)rr)F)NN)�__name__�
__module__�__qualname__r!rZhiddenr^r+r9r:rqr)r)r)r*r#s

$c@s"eZdZedd��Zddd�ZdS)�RequirementCommandc	Cs"x6|jD],}x&t|d||||d�D]}|j|�q"WqWx&|D]}|jtj|d|j|d��q>Wx*|jD] }|jtj||j|j|d��qhWd}	x8|j	D].}x(t|||||d�D]}d}	|j|�q�Wq�W|j
|_
|p�|jp�|	�sd|i}
|j�rd	t|
d
j
|j�d�}nd|
}tj|�dS)
z?
        Marshal cmd line args into a requirement set.
        T)Z
constraint�finderr7r8�wheel_cacheN)r#rw)�default_vcsr#rwF)rvr7r8rwr!z^You must give at least one requirement to %(name)s (maybe you meant "pip %(name)s %(links)s"?)rV)ZlinkszLYou must give at least one requirement to %(name)s (see "pip help %(name)s"))ZconstraintsrZadd_requirementrZ	from_lineZ
isolated_modeZ	editablesZ
from_editablerxZrequirementsZrequire_hashes�
find_linksrar2rgZwarning)Zrequirement_setr;r7rvr8r!rwrGZreqZfound_req_in_fileZopts�msgr)r)r*�populate_requirement_setsF
z+RequirementCommand.populate_requirement_setNc
CsR|jg|j}|jr*tjddj|��g}t|j|j||j	|j
|j|||||d�S)zR
        Create a package finder appropriate to this requirement command.
        zIgnoring indexes: %s�,)ry�format_control�
index_urlsr3Zallow_all_prereleases�process_dependency_linksr8�platformZversions�abi�implementation)Z	index_urlZextra_index_urlsrXrgrlr2rryr}r3Zprer)r(r7r8r�Zpython_versionsr�r�r~r)r)r*�_build_package_finder:s z(RequirementCommand._build_package_finder)NNNN)rrrsrt�staticmethodr{r�r)r)r)r*rus8ru)3r$Z
__future__rr]r0rbr'rdZpiprZ	pip.indexrZ
pip.locationsrZpip.downloadrZpip.exceptionsrrr	r
rZ
pip.compatrZpip.baseparserr
rZpip.reqrrZpip.status_codesrrrrrZ	pip.utilsrrrZpip.utils.loggingrZpip.utils.outdatedr�__all__Z	getLoggerrrrg�objectrrur)r)r)r*�<module>s.
_site-packages/pip/__pycache__/download.cpython-36.pyc000064400000050500147511334620016527 0ustar003

���eO��@s^ddlmZddlZddlZddlZddlZddlZddlZddl	Z	ddl
Z
ddlZddlZddl
Z
ddlZyddlZdZWnek
r�dZYnXddlmZddlmZddlZddlmZmZddlmZdd	lmZmZm Z m!Z!m"Z"m#Z#m$Z$m%Z%m&Z&m'Z'dd
l(m)Z)ddl*m+Z+ddl,m-Z-dd
l.m/Z/ddl0m1Z1ddl2m3Z3m4Z4ddl5m6Z6ddl7m8Z8ddl9m:Z:m;Z;ddl<m=Z=m>Z>ddl?m@Z@mAZAddlBmCZCmDZDddlEmFZFddlGmHZHddl9mIZIddlJmKZKddlLmMZMddlNmOZOddlPmQZQdddd d!d"d#d$d%d&d'd(d)g
ZRejSeT�ZUd*d+�ZVGd,d-�d-e@�ZWGd.d/�d/e=�ZXGd0d1�d1eM�ZYGd2d3�d3e>�ZZGd4d5�d5e:j[�Z\dWd6d�Z]ej^d7ej_�Z`ej^d8ej_�Zad9d�Zbd:d�Zcd;d �Zdd<d!�Zed=d"�Zfd>d?�Zgd@d$�ZhdAd%�ZidBdC�ZjdDdE�ZkdFdG�ZldHdI�ZmdXdJd&�ZndYdKd#�ZodLdM�ZpGdNdO�dOeQjq�ZrdZdPd'�ZsdQd)�ZtdRd(�ZudSdT�ZvdUdV�ZwdS)[�)�absolute_importNTF)�parse)�request)�InstallationError�HashMismatch)�PyPI)
�splitext�rmtree�format_size�display_path�
backup_dir�ask_path_exists�unpack_file�ARCHIVE_EXTENSIONS�consume�call_subprocess)�auto_decode)�check_path_owner)�
indent_log)�SETUPTOOLS_SHIM)�libc_ver)�DownloadProgressBar�DownloadProgressSpinner)�write_delete_marker_file)�vcs)�requests�six)�BaseAdapter�HTTPAdapter)�AuthBase�
HTTPBasicAuth)�CONTENT_CHUNK_SIZE�Response)�get_netrc_auth)�CaseInsensitiveDict)�urllib3)�CacheControlAdapter)�	FileCache)�	LockError)�
xmlrpc_client�get_file_content�is_url�url_to_path�path_to_url�is_archive_file�unpack_vcs_link�unpack_file_url�
is_vcs_url�is_file_url�unpack_http_url�
unpack_url�parse_content_disposition�sanitize_content_filenamecCsdtjd�tj�dtj�id�}|dddkrBtj�|dd<n�|dddkr�tjjd	krntjd
d�}ntj}djd
d�|D��|dd<nB|dddkr�tj�|dd<n |dddkr�tj�|dd<tjj	d��rJddl
m}tt
dd�tdddg|j����}tt
dd�tddgt����}|�r<||d<|�rJ||d<tjj	d��r|tj�d�r|dtj�dd�|d<tj��r�tj�|jdi�d<tj��r�tj�|jdi�d<tj��r�tj�|d<t�r�tjd
d �d)k�r�tj|d"<d#j|tj|d*d&d'�d(�S)+z6
    Return a string representing the user agent.
    �pip)�name�versionr8)Z	installer�python�implementationr;ZCPythonr9ZPyPy�finalN��.cSsg|]}t|��qS�)�str)�.0�xr?r?�/usr/lib/python3.6/download.py�
<listcomp>Tszuser_agent.<locals>.<listcomp>ZJythonZ
IronPython�linuxr)�distrocSs|dS)N�r?)rBr?r?rC�<lambda>`szuser_agent.<locals>.<lambda>�idcSs|dS)NrGr?)rBr?r?rCrHds�lib�libcrF�darwinZmacOS�system�releaseZcpu��Zopenssl_versionz9{data[installer][name]}/{data[installer][version]} {json}�,�:T)Z
separatorsZ	sort_keys)�data�json)rOrP)rQrR)r7�__version__�platformZpython_versionZpython_implementation�sys�pypy_version_info�releaselevel�join�
startswith�pip._vendorrF�dict�filter�zipZlinux_distributionrZmac_verrM�
setdefaultrN�machine�HAS_TLS�version_info�sslZOPENSSL_VERSION�formatrT�dumps)rSrXrFZdistro_infosrKr?r?rC�
user_agent@sP




rgc@s.eZdZddd�Zdd�Zdd�Zdd	�Zd
S)�MultiDomainBasicAuthTcCs||_i|_dS)N)�	prompting�	passwords)�selfrir?r?rC�__init__�szMultiDomainBasicAuth.__init__cCs�tj|j�}|jjdd�d}tj|dd�|f|dd��|_|jj|d�\}}|dkrn|j|j�\}}|dkr�|dkr�t	|j�}|r�|nd\}}|s�|r�||f|j|<t
|p�d|p�d�|�}|jd|j�|S)	N�@rGrO��response���)NN)NN)
�urllib_parse�urlparse�url�netloc�rsplit�
urlunparserj�get�parse_credentialsr#r Z
register_hook�
handle_401)rk�req�parsedrt�username�passwordZ
netrc_authr?r?rC�__call__�s&
zMultiDomainBasicAuth.__call__cKs�|jdkr|S|js|Stj|j�}tjjd|j�}t	j	d�}|sH|rX||f|j
|j<|j|jj
�t|ppd|pvd�|j�}|jj|f|�}|jj|�|S)Ni�z
User for %s: z
Password: rn)�status_coderirqrrrsrZmoves�inputrt�getpassrj�content�rawZrelease_connr rZ
connection�send�history�append)rk�resp�kwargsr{r|r}rzZnew_respr?r?rCry�s


zMultiDomainBasicAuth.handle_401cCs8d|kr4|jdd�d}d|kr,|jdd�S|dfSdS)NrmrGrrR)NN)ru�split)rkrtZuserinfor?r?rCrx�sz&MultiDomainBasicAuth.parse_credentialsN)T)�__name__�
__module__�__qualname__rlr~ryrxr?r?r?rCrh�s
!"rhc@seZdZddd�Zdd�ZdS)�LocalFSAdapterNc
Cs�t|j�}t�}d|_|j|_ytj|�}	Wn.tk
rZ}
zd|_|
|_WYdd}
~
XnPXtj	j
|	jdd�}tj
|�dp~d}t||	j|d��|_t|d�|_|jj|_|S)	N��i�T)Zusegmtrz
text/plain)zContent-TypezContent-Lengthz
Last-Modified�rb)r,rsr"r�os�stat�OSErrorr��emailZutilsZ
formatdate�st_mtime�	mimetypes�
guess_typer$�st_size�headers�open�close)
rkr�stream�timeout�verify�certZproxies�pathnamer�Zstats�excZmodified�content_typer?r?rCr��s$

zLocalFSAdapter.sendcCsdS)Nr?)rkr?r?rCr��szLocalFSAdapter.close)NNNNN)r�r�r�r�r�r?r?r?rCr��s
r�csDeZdZdZ�fdd�Z�fdd�Z�fdd�Z�fdd	�Z�ZS)
�
SafeFileCachezw
    A file based cache which is safe to use even when the target directory may
    not be accessible or writable.
    cs4tt|�j||�t|j�s0tjd|j�d|_dS)Nz�The directory '%s' or its parent directory is not owned by the current user and the cache has been disabled. Please check the permissions and owner of that directory. If executing pip with sudo, you may want sudo's -H flag.)�superr�rlr�	directory�logger�warning)rk�argsr�)�	__class__r?rCrl�s
zSafeFileCache.__init__c
s@|jdkrdSytt|�j||�Stttfk
r:YnXdS)N)r�r�r�rwr(r��IOError)rkr�r�)r�r?rCrws
zSafeFileCache.getc
s@|jdkrdSytt|�j||�Stttfk
r:YnXdS)N)r�r�r��setr(r�r�)rkr�r�)r�r?rCr�s
zSafeFileCache.setc
s@|jdkrdSytt|�j||�Stttfk
r:YnXdS)N)r�r�r��deleter(r�r�)rkr�r�)r�r?rCr�)s
zSafeFileCache.delete)	r�r�r��__doc__rlrwr�r��
__classcell__r?r?)r�rCr��s


r�c@seZdZdd�ZdS)�InsecureHTTPAdaptercCsd|_d|_dS)NZ	CERT_NONE)Z	cert_reqsZca_certs)rkZconnrsr�r�r?r?rC�cert_verify9szInsecureHTTPAdapter.cert_verifyN)r�r�r�r�r?r?r?rCr�7sr�cs,eZdZdZ�fdd�Z�fdd�Z�ZS)�
PipSessionNc	s�|jdd�}|jdd�}|jdg�}tt|�j||�t�|jd<t�|_tj	|dgdd�}|rvt
t|d	d
�|d�}n
t|d�}t
|d�}|jd
|�|jd|�|jdt��x|D]}|jdj|�|�q�WdS)N�retriesr�cache�insecure_hostsz
User-Agenti�g�?)ZtotalZstatus_forcelistZbackoff_factorT)Zuse_dir_lock)r��max_retries)r�zhttps://zhttp://zfile://zhttps://{0}/)�popr�r�rlrgr�rhZauthr%ZRetryr&r�rr�Zmountr�re)	rkr�r�r�r�r�Zsecure_adapterZinsecure_adapter�host)r�r?rCrlBs*




zPipSession.__init__cs(|jd|j�tt|�j||f|�|�S)Nr�)r`r�r�r�r)rk�methodrsr�r�)r�r?rCr~szPipSession.request)r�r�r�r�rlrr�r?r?)r�rCr�>s<r�c
CsL|dkrtd��tj|�}|r�|jd�j�}|dkrR|rR|jd�rRtd||f��|dkr�|jdd�d}|jdd	�}t	j
|�}|r�|jd�d|jd
d�d}tj|�}|jd	�r�d	|j
d	�}|}n|j|�}|j�|j|jfSy&t|d��}t|j��}WdQRXWn4tk
�rB}	ztdt|	���WYdd}	~	XnX||fS)
z�Gets the content of a file; it may be a filename, file: URL, or
    http: URL.  Returns (location, content).  Content is unicode.NzAget_file_content() missing 1 required keyword argument: 'session'rG�file�httpz6Requirements file %s references URL %s, which is localrR�\�/�|r�z$Could not open requirements file: %s)�	TypeError�
_scheme_re�search�group�lowerr[rr��replace�_url_slash_drive_re�matchrqZunquote�lstriprw�raise_for_statusrs�textr�r�readr�r@)
rsZ
comes_from�sessionr��scheme�pathr��fr�r�r?r?rCr*�s>





 z^(http|https|file):z/*([a-z])\|cCs6d|krdS|jdd�dj�}|ddddgtjkS)	z)Returns true if the name looks like a URLrRFrGrr�Zhttpsr�Zftp)r�r�rZall_schemes)r8r�r?r?rCr+�scCsH|jd�std|��tj|�\}}}}}|r6d|}tj||�}|S)z(
    Convert a file: URL to a path.
    zfile:z4You can only turn file: urls into filenames (not %r)z\\)r[�AssertionErrorrqZurlsplit�urllib_requestZurl2pathname)rs�_rtr�r?r?rCr,�s
cCs*tjjtjj|��}tjdtj|��}|S)zh
    Convert a path to a file: URL.  The path will be made absolute and have
    quoted path parts.
    zfile:)r�r��normpath�abspathrqZurljoinr�Zpathname2url)r�rsr?r?rCr-�scCs t|�dj�}|tkrdSdS)z9Return True if `name` is a considered as an archive file.rGTF)rr�r)r8�extr?r?rCr.�scCst|�}|j|�dS)N)�_get_used_vcs_backend�unpack)�link�location�vcs_backendr?r?rCr/�scCs.x(tjD]}|j|jkr||j�}|SqWdS)N)rZbackendsr�Zschemesrs)r�Zbackendr�r?r?rCr��s
r�cCstt|��S)N)�boolr�)r�r?r?rCr1�scCs|jj�jd�S)Nzfile:)rsr�r[)r�r?r?rCr2�scCst|j�}tjj|�S)z�Return whether a file:// Link points to a directory.

    ``link`` must not have any other scheme but file://. Call is_file_url()
    first.

    )r,�url_without_fragmentr�r��isdir)r��	link_pathr?r?rC�
is_dir_url�s
r�cOs|S)Nr?)�iterabler�r�r?r?rC�_progress_indicator�sr�c

sLyt�jd�}Wntttfk
r0d}YnXt�dd�}tj�tj	krRd}n&|r\d}n|dkrjd}n|std}nd}|j
}�fdd	�}�fd
d�}	t}
|jt
jkr�|}n|j}|r�|r�tjd|t|��t|d
�j}
ntjd|�t�j}
n |�rtjd|�ntjd|�tjd|�|	|
|t�t��}|�r@|j|�nt|�dS)Nzcontent-lengthrZ
from_cacheF�(i�Tc3s\y$x�jj|dd�D]
}|VqWWn2tk
rVx�jj|�}|sHP|Vq6WYnXdS)NF)Zdecode_content)r�r��AttributeErrorr�)Z
chunk_size�chunk)r�r?rC�	resp_readsz _download_url.<locals>.resp_readc3s"x|D]}�j|�|VqWdS)N)�write)Zchunksr�)�content_filer?rC�written_chunks;s

z%_download_url.<locals>.written_chunkszDownloading %s (%s))�maxzDownloading %szUsing cached %szDownloading from URL %si@�)�intr��
ValueError�KeyErrorr��getattrr�ZgetEffectiveLevel�logging�INFO�show_urlr�rtrr��infor
r�iterr�debugr!Zcheck_against_chunksr)
r�r�r��hashesZtotal_lengthZcached_respZ
show_progressr�r�r�Zprogress_indicatorrsZdownloaded_chunksr?)r�r�rC�
_download_urlsL
%
r�cCs�d}tjj||j�}tjj|�r�tdt|�d�}|dkr@d}nj|dkrdtjdt|��tj	|�nF|dkr�t
|�}tjd	t|�t|��tj||�n|dkr�t
jd
�|r�tj||�tjdt|��dS)NTz8The file %s exists. (i)gnore, (w)ipe, (b)ackup, (a)abort�i�w�b�aFzDeleting %szBacking up %s to %srGzSaved %s)r�r�r�r�rp)r�r�rZ�filename�existsr
rr�r��remover�shutilZmoverW�exit�copyr�)r�r�r�r�Zdownload_locationroZ	dest_filer?r?rC�
_copy_fileas.

r�c	Cs�|dkrtd��tjdd�}d}|r0t|||�}|rH|}tj|�d}nt||||�\}}t||||�|r~|r~t|||�|s�t	j
|�t|�dS)Nz@unpack_http_url() missing 1 required keyword argument: 'session'z-unpackzpip-r)r��tempfileZmkdtemp�_check_download_dirr�r��_download_http_urlrr�r��unlinkr	)	r�r��download_dirr�r��temp_dir�already_downloaded_path�	from_pathr�r?r?rCr3|s,


cCs�t|j�}t|�rHtjj|�r&t|�tj||dd�|rDt	j
d�dS|rV|j|�d}|rjt|||�}|rt|}n|}t
j|�d}t||||�|r�|r�t|||�dS)z�Unpack link into location.

    If download_dir is provided and link points to a file, make a copy
    of the link file inside download_dir.
    T)Zsymlinksz*Link is a directory, ignoring download_dirNr)r,r�r�r�r�r�r	r�Zcopytreer�r��check_against_pathr�r�r�rr�)r�r�rr�r�rrr�r?r?rCr0�s,



c
Cs�tjj|�rt|�d}tjg}|jd�|jt|�|jd�|d|g7}tj	d|�t
��t||dd�WdQRXtjj|tj
|�d	�}tj	d
||�t||ddd�dS)z�Copy distribution files in `link_path` to `location`.

    Invoked when user requests to install a local directory. E.g.:

        pip install .
        pip install ~/dev/git-repos/python-prompt-toolkit

    zsetup.pyz-c�sdistz
--dist-dirzRunning setup.py sdist for %sF)�cwdZshow_stdoutNrzUnpacking sdist %s into %s)r�r�)r�r�r�r	rW�
executabler�rr�r�rrrZ�listdirr)r�r�Zsetup_pyZ
sdist_argsrr?r?rC�_copy_dist_from_dir�s

rc@s$eZdZdZddd�Zd	dd�ZdS)
�PipXmlrpcTransportzRProvide a `xmlrpclib.Transport` implementation via a `PipSession`
    object.
    FcCs*tjj||�tj|�}|j|_||_dS)N)r)�	Transportrlrqrrr��_scheme�_session)rkZ	index_urlr�Zuse_datetimeZindex_partsr?r?rCrl�s
zPipXmlrpcTransport.__init__c
Cs�|j||dddf}tj|�}y6ddi}|jj|||dd�}|j�||_|j|j�St	j
k
r�}	ztjd|	j
j|��WYdd}	~	XnXdS)NzContent-Typeztext/xmlT)rSr�r�zHTTP error %s while getting %s)rrqrvrZpostr��verboseZparse_responser�r�	HTTPErrorr��criticalror)
rkr�ZhandlerZrequest_bodyr�partsrsr�ror�r?r?rCrs


zPipXmlrpcTransport.requestN)F)F)r�r�r�r�rlrr?r?r?rCr�s
rcCs^t|�rt||�n:t|�r.t||||d�n |dkr<t�}t|||||d�|rZt|�dS)avUnpack link.
       If link is a VCS link:
         if only_download, export into download_dir and ignore location
          else unpack into location
       for other types of link:
         - unpack into location
         - if download_dir, copy the file into download_dir
         - if only_download, mark location for deletion

    :param hashes: A Hashes object, one of whose embedded hashes must match,
        or HashMismatch will be raised. If the Hashes is empty, no matches are
        required, and unhashable types of requirements (like VCS ones, which
        would ordinarily raise HashUnsupported) are allowed.
    )r�N)r1r/r2r0r�r3r)r�r�rZ
only_downloadr�r�r?r?rCr4scCstjj|�S)zJ
    Sanitize the "filename" value from a Content-Disposition header.
    )r�r��basename)r�r?r?rCr6<scCs,tj|�\}}|jd�}|r$t|�}|p*|S)z�
    Parse the "filename" value from a Content-Disposition header, and
    return the default filename if the result is empty.
    r�)�cgiZparse_headerrwr6)�content_dispositionZdefault_filenameZ_typeZparamsr�r?r?rCr5Ds

c
Cs*|jjdd�d}y |j|ddidd�}|j�Wn8tjk
rj}ztjd|jj	|��WYd	d	}~XnX|j
jd
d�}|j}|j
jd�}	|	r�t|	|�}t
|�d}
|
s�tj|�}
|
r�||
7}|
r�|j|jkr�tjj
|j�d}
|
r�||
7}tjj||�}t|d
��}t||||�Wd	QRX||fS)z6Download link url into temp_dir using provided session�#rGrzAccept-EncodingZidentityT)r�r�zHTTP error %s while getting %sNzcontent-typernzcontent-disposition�wb)rsr�rwr�rrr�rrorr�r�r5rr�Zguess_extensionr�r�rZr�r�)
r�r�rr�Z
target_urlr�r�r�r�rr�Z	file_pathr�r?r?rCrSs:

rcCsntjj||j�}tjj|�rjtjd|�|rfy|j|�Wn*tk
rdtj	d|�tj
|�dSX|SdS)z� Check download_dir for previously downloaded file with correct hash
        If a correct file is found return its path else None
    zFile was already downloaded %sz;Previously-downloaded file %s has bad hash. Re-downloading.N)r�r�rZr�r�r�r�rrr�r)r�rr�Z
download_pathr?r?rCr��s
r�)NN)NNN)NN)NFNN)xZ
__future__rrZemail.utilsr�r�rTr�r�r�rV�rer�rWr�rdrb�ImportErrorZpip._vendor.six.moves.urllibrrqrr�r7Zpip.exceptionsrrZ
pip.modelsrZ	pip.utilsrr	r
rrr
rrrrZpip.utils.encodingrZpip.utils.filesystemrZpip.utils.loggingrZpip.utils.setuptools_buildrZpip.utils.glibcrZpip.utils.uirrZ
pip.locationsrZpip.vcsrr\rrZpip._vendor.requests.adaptersrrZpip._vendor.requests.authrr Zpip._vendor.requests.modelsr!r"Zpip._vendor.requests.utilsr#Zpip._vendor.requests.structuresr$r%Zpip._vendor.cachecontrolr&Zpip._vendor.cachecontrol.cachesr'Zpip._vendor.lockfiler(Zpip._vendor.six.movesr)�__all__Z	getLoggerr�r�rgrhr�r�r�ZSessionr�r*�compile�Ir�r�r+r,r-r.r/r�r1r2r�r�r�r�r3r0rr
rr4r6r5rr�r?r?r?rC�<module>s�
0
BR!BH
)
`
&
0$
'8site-packages/pip/__pycache__/wheel.cpython-36.opt-1.pyc000064400000052064147511334620016772 0ustar003

���e~�@s$dZddlmZddlZddlZddlZddlZddlZddlZddl	Z	ddl
Z	ddlZddlZddl
Z
ddlZddlZddlZddlmZddlmZddlmZddlZddlmZddlmZmZdd	lmZmZm Z dd
l!m"Z"m#Z#ddlm$Z$ddl%m&Z&m'Z'm(Z(m)Z)m*Z*dd
l+m,Z,ddl-m.Z.ddl/m0Z0ddl1m2Z2ddl3m4Z4ddl5m6Z6ddl7m8Z8dZ9d9Z:ej;e<�Z=Gdd�de>�Z?dd�Z@dd�ZAd;dd�ZBd d!�ZCd"d#�ZDejEd$ejF�ZGd%d&�ZHd'd(�ZId<d+d,�ZJd-d.�ZKeKd/d0��ZLd1d2�ZMd3d4�ZNGd5d6�d6e>�ZOGd7d8�d8e>�ZPdS)=zH
Support for installing and building the "wheel" binary package format.
�)�absolute_importN)�urlsafe_b64encode)�Parser)�StringIO)�
expanduser)�path_to_url�
unpack_url)�InstallationError�InvalidWheelFilename�UnsupportedWheel)�distutils_scheme�PIP_DELETE_MARKER_FILENAME)�
pep425tags)�call_subprocess�
ensure_dir�captured_stdout�rmtree�read_chunks)�open_spinner)�
indent_log)�SETUPTOOLS_SHIM)�ScriptMaker)�
pkg_resources)�canonicalize_name)�configparserz.whl�c@s eZdZdZdd�Zdd�ZdS)�
WheelCachez&A cache of wheels for future installs.cCs|rt|�nd|_||_dS)z�Create a wheel cache.

        :param cache_dir: The root of the cache.
        :param format_control: A pip.index.FormatControl object to limit
            binaries being read from the cache.
        N)r�
_cache_dir�_format_control)�self�	cache_dir�format_control�r"�/usr/lib/python3.6/wheel.py�__init__8szWheelCache.__init__cCst|j||j|�S)N)�cached_wheelrr)r�link�package_namer"r"r#r%BszWheelCache.cached_wheelN)�__name__�
__module__�__qualname__�__doc__r$r%r"r"r"r#r5s
rcCs�|jg}|jdk	r4|jdk	r4|jdj|j|jg��dj|�}tj|j��j�}|dd�|dd�|dd�|dd�g}t	j
j|df|��S)a�
    Return a directory to store cached wheels in for link.

    Because there are M wheels for any one sdist, we provide a directory
    to cache them in, and then consult that directory when looking up
    cache hits.

    We only insert things into the cache if they have plausible version
    numbers, so that we don't contaminate the cache with things that were not
    unique. E.g. ./package might have dozens of installs done for it and build
    a version of 0.0...and if we built and cached a wheel, we'd end up using
    the same wheel even if the source has been edited.

    :param cache_dir: The cache_dir being used by pip.
    :param link: The link of the sdist for which this will cache wheels.
    N�=�#���Zwheels)Zurl_without_fragmentZ	hash_name�hash�append�join�hashlibZsha224�encodeZ	hexdigest�os�path)r r&Z	key_partsZkey_urlZhashed�partsr"r"r#�_cache_for_linkGs
,r9c
Cs,|s|S|s|S|jr|S|js$|S|s,|St|�}tjj||�}d|krN|St||�}ytj|�}Wn:t	k
r�}z|j
t
jt
jfkr�|S�WYdd}~XnXg}	xL|D]D}
yt
|
�}Wntk
r�w�YnX|j�s�q�|	j|j�|
f�q�W|	�s�|S|	j�tjj||	dd�}tjjt|��S)N�binaryrr)�is_wheel�is_artifactr�pip�index�fmt_ctl_formatsr9r6�listdir�OSError�errno�ENOENT�ENOTDIR�Wheelr
�	supportedr2�support_index_min�sortr7r3�Linkr)
r r&r!r'Zcanonical_nameZformats�rootZwheel_names�eZ
candidates�
wheel_name�wheelr7r"r"r#r%psF

r%�sha256�cCsttj|�}d}t|d��2}x*t||d�D]}|t|�7}|j|�q(WWdQRXdt|j��jd�j	d�}||fS)z6Return (hash, length) for path using hashlib.new(algo)r�rb)�sizeNzsha256=�latin1r,)
r4�new�openr�len�updater�digest�decode�rstrip)r7ZalgoZ	blocksize�hZlength�f�blockrWr"r"r#�rehash�s

r]cCs6tjddkri}d}nddi}d}t|||f|�S)Nr��b�newline�)�sys�version_inforT)�name�mode�nl�binr"r"r#�open_for_csv�srhcCs�tjj|�r�t|d��H}|j�}|jd�s.dStjjtj	��}d|tj
jd�}|j�}WdQRXt|d��}|j|�|j|�WdQRXdSdS)	zLReplace #!python with #!/path/to/python
    Return True if file was changed.rPs#!pythonFs#!�asciiN�wbT)
r6r7�isfilerT�readline�
startswithrb�
executabler5�getfilesystemencoding�linesep�read�write)r7Zscript�	firstlineZexename�restr"r"r#�
fix_script�s

ruzZ^(?P<namever>(?P<name>.+?)(-(?P<ver>\d.+?))?)
                                \.dist-info$cCs�|jdd�}xttj|�D]f}tj|�}|r|jd�|krttjj||d���,}x$|D]}|j	�j
�}|dkrTdSqTWWdQRXqWdS)	zP
    Return True if the extracted wheel in wheeldir should go into purelib.
    �-�_rd�WHEELzroot-is-purelib: trueTNF)�replacer6r@�dist_info_re�match�grouprTr7r3�lowerrY)rd�wheeldirZname_folded�itemr{rM�liner"r"r#�root_is_purelib�s

r�c
Cs�tjj|�siifSt|��<}t�}x$|D]}|j|j��|jd�q*W|jd�WdQRXtj	�}dd�|_
|j|�i}i}|jd�r�t
|jd��}|jd�r�t
|jd��}||fS)N�
rcSs|S)Nr")Zoptionr"r"r#�<lambda>�sz!get_entrypoints.<locals>.<lambda>Zconsole_scriptsZgui_scripts)r6r7�existsrTrrr�strip�seekrZRawConfigParserZoptionxformZreadfpZhas_section�dict�items)�filename�fp�datar�Zcp�console�guir"r"r#�get_entrypoints�s$





r�FTc+)s|st||||||	d�}t|��r,|d�n|d�g�g��jtjj�tjj}i�t��g}|r�t��4}
tj	�� tj
d�tj|ddd�WdQRXWdQRXt
j|
j��dd	��d2�����fdd�	�	d3���	�
fd
d�	}||�d�tjj�dd�}t|�\����fdd�}xv�D]n}d}d}x^tjtjj�|��D]F}d}|dk�r^t}|}tjj�||�}||}|||d
||d��qDW�q"Wtd|d��d�_td4��_d�_��
fdd�}|�_d�_�jdd�}|�r�dtjk�rd|}|j�j|��tjjdd�dk�rBdtj dd�|f}|j�j|��dtj dd�|f}|j�j|��d d!��D�}x|D]}�|=�q|W�jd"d�}|�rdtjk�r�d#|}|j�j|��d$tj dd�|f}|j�j|��d%d!��D�}x|D]}�|=�q�Wt!��dk�r8|j�j"d&d!��j#�D���t!��dk�rj|j�j"d'd!��j#�D�d(di��tjj�dd)�}tjj�dd*�}t$|d+��}|j%d,�WdQRXt&j'||�|j(|�tjj�dd-�} tjj�dd.�}!t)| d/���}"t)|!d0���}#t*j+|"�}$t*j,|#�}%xV|$D]N}&�j|&d|&d�|&d<|&d�k�r^t-|&d�\|&d<|&d1<|%j.|&��qWx`|D]X}'t-|'�\}(})�|'��}*|
�r�|*j/|
��r�tjjtjtjj0|*|
��}*|%j.|*|(|)f��qtWx"�D]}'|%j.�|'ddf��q�WWdQRXWdQRXt&j'|!| �dS)5zInstall a wheel)�user�homerJ�isolated�prefix�purelib�platlib�ignoreT)�force�quietNcSstjj||�jtjjd�S)N�/)r6r7�relpathry�sep)�src�pr"r"r#�normpathsz"move_wheel_files.<locals>.normpathFcs.�|��}�|��}|�|<|r*�j|�dS)z6Map archive RECORD paths to installation RECORD paths.N)�add)�srcfile�destfileZmodifiedZoldpath�newpath)�changed�	installed�lib_dirr�r~r"r#�record_installeds


z*move_wheel_files.<locals>.record_installedcs�t|��x�tj|�D�]�\}}}|t|�d�jtjj�}tjj||�}	|rj|jtjjd�dj	d�rjqxl|D]d}
tjj|||
�}|r�|dkr�|j	d�r��j
|
�qpqp|rp|
j	d�rpt|
�jt�j
��rp�j
|�qpWx�|D]�}|r�||�r�q�tjj||�}
tjj|||�}t|	�tj|
|�tj|
�}ttd��rLtj||j|jf�tj|
tj��r�tj|
�}|jtjBtjBtjB}tj||�d}|�r�||�}�|
||�q�WqWdS)Nrrz.dataraz
.dist-info�utimeF)rr6�walkrU�lstripr7r�r3�split�endswithr2rrmrd�shutilZcopyfile�stat�hasattrr��st_atime�st_mtime�access�X_OK�st_mode�S_IXUSR�S_IXGRP�S_IXOTH�chmod)�source�destZis_base�fixer�filter�dirZsubdirs�filesZbasedirZdestdir�sZ
destsubdirr[r�r��stZpermissionsr�)�	data_dirs�info_dirr��reqr"r#�clobbersD





z!move_wheel_files.<locals>.clobberrzentry_points.txtcsh|j�jd�r|dd�}n<|j�jd�r8|dd�}n |j�jd�rT|dd�}n|}|�kpf|�kS)	Nz.exer/z
-script.py�
z.pya���i����r�)r}r�)rdZ	matchname)r�r�r"r#�is_entrypoint_wrapperasz/move_wheel_files.<locals>.is_entrypoint_wrapper�scripts)r�r�racs<|jdkrtd|�f���j|j|jjd�d|jd�S)Nz�Invalid script entry point: %s for req: %s - A callable suffix is required. Cf https://packaging.python.org/en/latest/distributing.html#console-scripts for more information.�.r)�moduleZimport_name�func)�suffixr	�script_templater�r�)�entry)�makerr�r"r#�_get_script_text�s
z*move_wheel_files.<locals>._get_script_textz�# -*- coding: utf-8 -*-
import re
import sys

from %(module)s import %(import_name)s

if __name__ == '__main__':
    sys.argv[0] = re.sub(r'(-script\.pyw?|\.exe)?$', '', sys.argv[0])
    sys.exit(%(func)s())
r=ZENSUREPIP_OPTIONSzpip = Z
altinstallz
pip%s = %srr^cSsg|]}tjd|�r|�qS)zpip(\d(\.\d)?)?$)�rer{)�.0�kr"r"r#�
<listcomp>�sz$move_wheel_files.<locals>.<listcomp>Zeasy_installzeasy_install = zeasy_install-%s = %scSsg|]}tjd|�r|�qS)zeasy_install(-\d\.\d)?$)r�r{)r�r�r"r"r#r��scSsg|]}d|�qS)z%s = %sr")r��kvr"r"r#r��scSsg|]}d|�qS)z%s = %sr")r�r�r"r"r#r��sr�Z	INSTALLERz
INSTALLER.piprjspip
�RECORDz
RECORD.pip�rzw+r.)F)NN)ra)1rr�rYr6r7r��setr�warnings�catch_warnings�filterwarnings�
compileall�compile_dir�logger�debug�getvaluer3r�r@rurr�ZvariantsZset_moder�r��pop�environ�extendZmake�getrb�versionrUZ
make_multipler�rTrrr��mover2rh�csv�reader�writerr]Zwriterowrmr�)+rdr�r~r�r�rJZ	pycompile�schemer�r�Zstrip_file_prefixr�Z	generated�stdoutr�Zep_filer�Zdatadirr�r�Zsubdirr�r�Z
pip_script�specZpip_epr�Zeasy_install_scriptZeasy_install_epZ	installerZtemp_installerZinstaller_file�recordZtemp_recordZ	record_inZ
record_outr�r��rowr[rZ�lZ
final_pathr")r�r�r�r�r�r�r�r�r�r�r�r~r#�move_wheel_files�s�




$;



#









.r�cstj���fdd��}|S)Nc?s6t�}x*�||�D]}||kr|j|�|VqWdS)N)r�r�)�args�kw�seenr)�fnr"r#�uniques

z_unique.<locals>.unique)�	functools�wraps)r�r�r")r�r#�_uniquesr�ccs�ddlm}tj||jd���}xd|D]\}tjj|j|d�}|V|j	d�r&tjj
|�\}}|dd�}tjj||d�}|Vq&WdS)	a
    Yield all the uninstallation paths for dist based on RECORD-without-.pyc

    Yield paths to all the files in RECORD. For each .py file in RECORD, add
    the .pyc in the same directory.

    UninstallPathSet.add() takes care of the __pycache__ .pyc.
    r)�FakeFiler�z.pyNr^z.pyc���)�	pip.utilsr�r�r�Zget_metadata_linesr6r7r3�locationr�r�)�distr�r�r�r7Zdnr��baser"r"r#�uninstallation_paths"s


r�cCsdyTdd�tjd|�D�d}|jd�}t�j|�}|dj�}ttt|j	d���}|SdSdS)	z�
    Return the Wheel-Version of an extracted wheel, if possible.

    Otherwise, return False if we couldn't parse / extract it.
    cSsg|]}|�qSr"r")r��dr"r"r#r�?sz!wheel_version.<locals>.<listcomp>Nrrxz
Wheel-Versionr�F)
rZfind_on_pathZget_metadatarZparsestrr��tuple�map�intr�)�
source_dirr�Z
wheel_datar�r"r"r#�
wheel_version8s
rcCsb|std|��|dtdkr>td|djtt|��f��n |tkr^tjddjtt|���dS)a�
    Raises errors or warns if called with an incompatible Wheel-Version.

    Pip should refuse to install a Wheel-Version that's a major series
    ahead of what it's compatible with (e.g 2.0 > 1.1); and warn when
    installing a version only minor version ahead (e.g 1.2 > 1.1).

    version: a 2-tuple representing a Wheel-Version (Major, Minor)
    name: name of wheel or package to raise exception about

    :raises UnsupportedWheel: when an incompatible Wheel-Version is given
    z(%s is in an unsupported or invalid wheelrzB%s's Wheel-Version (%s) is not compatible with this version of pipr�z*Installing from a newer Wheel-Version (%s)N)r�VERSION_COMPATIBLEr3r�strr��warning)r�rdr"r"r#�check_compatibilityKs

rc@s:eZdZdZejdej�Zdd�Zd
dd�Z	ddd	�Z
dS)rEzA wheel filez�^(?P<namever>(?P<name>.+?)-(?P<ver>\d.*?))
        ((-(?P<build>\d.*?))?-(?P<pyver>.+?)-(?P<abi>.+?)-(?P<plat>.+?)
        \.whl|\.dist-info)$cs��jj|�}|std|��|�_|jd�jdd��_|jd�jdd��_|jd�jd��_	|jd�jd��_
|jd	�jd��_t�fd
d��j	D���_
dS)
zX
        :raises InvalidWheelFilename: when the filename is invalid for a wheel
        z!%s is not a valid wheel filename.rdrwrvZverZpyverr��abiZplatc3s0|](}�jD]}�jD]}|||fVqqqdS)N)�abis�plats)r��x�y�z)rr"r#�	<genexpr>�sz!Wheel.__init__.<locals>.<genexpr>N)�
wheel_file_rer{r
r�r|ryrdr�r�Z
pyversionsr	r
r��	file_tags)rr�Z
wheel_infor")rr#r$ts
zWheel.__init__Ncs2�dkrtj��fdd�|jD�}|r.t|�SdS)a"
        Return the lowest index that one of the wheel's file_tag combinations
        achieves in the supported_tags list e.g. if there are 8 supported tags,
        and one of the file tags is first in the list, then return 0.  Returns
        None is the wheel is not supported.
        Ncsg|]}|�kr�j|��qSr")r>)r��c)�tagsr"r#r��sz+Wheel.support_index_min.<locals>.<listcomp>)r�supported_tagsr�min)rrZindexesr")rr#rG�szWheel.support_index_mincCs"|dkrtj}tt|�j|j��S)z'Is this wheel supported on this system?N)rr�boolr��intersectionr)rrr"r"r#rF�szWheel.supported)N)N)r(r)r*r+r��compile�VERBOSErr$rGrFr"r"r"r#rEhs
rEc@sHeZdZdZddd�Zddd�Zdd�Zdd	d
�Zdd�Zddd�Z	dS)�WheelBuilderz#Build wheels from a RequirementSet.NcCs6||_||_|jj|_|j|_|p$g|_|p.g|_dS)N)	�requirement_set�finderZ_wheel_cacher�_cache_rootZwheel_download_dir�
_wheel_dir�
build_options�global_options)rrrrrr"r"r#r$�s

zWheelBuilder.__init__cCs�tjd�}zn|j|||d�rlyBtj|�d}tjj||�}tjtjj||�|�t	j
d|�|SYnX|j|�dSt|�XdS)ziBuild one wheel.

        :return: The filename of the built wheel, or None if the build failed.
        z
pip-wheel-)�
python_tagrzStored in directory: %sN)
�tempfileZmkdtemp�_WheelBuilder__build_oner6r@r7r3r�r�r��info�
_clean_oner)rr��
output_dirr �tempdrLZ
wheel_pathr"r"r#�
_build_one�s

zWheelBuilder._build_onecCstjddt|jgt|j�S)Nz-uz-c)rbrnrZsetup_py�listr)rr�r"r"r#�_base_setup_args�s
zWheelBuilder._base_setup_argscCs�|j|�}d|jf}t|��t}tjd|�|dd|g|j}|dk	rT|d|g7}yt||jd|d�dS|jd	�tj	d
|j�dSWdQRXdS)Nz#Running setup.py bdist_wheel for %szDestination directory: %sZbdist_wheelz-dz--python-tagF)�cwd�show_stdout�spinnerT�errorzFailed building wheel for %s)
r)rdrr�r�rrZsetup_py_dirZfinishr-)rr�r&r �	base_argsZspin_messager,Z
wheel_argsr"r"r#Z__build_one�s



zWheelBuilder.__build_onecCsV|j|�}tjd|j�|ddg}yt||jdd�dStjd|j�dSdS)NzRunning setup.py clean for %sZcleanz--allF)r*r+Tz Failed cleaning build dir for %s)r)r�r#rdrrr-)rr�r.Z
clean_argsr"r"r#r$�s
zWheelBuilder._clean_oneFcCs�|jj|j�|jjj�}g}x�|D]�}|jr0q$|jrJ|s�tjd|j	�q$|rV|j
rVq$|rl|jrl|jjrlq$|rz|j
rzq$|r�|j}|j�\}}tjj|d|�dkr�q$dtjj|jjt|j	��kr�tjd|j	�q$|j|�q$W|s�dStjddjdd	�|D���t���<gg}}	�x(|D�]}d}
|�r�tj}
t|j|j�}yt|�WnBtk
�r�}z$tjd
|j	|�|	j|��w WYdd}~XnXn|j}|j |||
d�}
|
�r4|j|�|�r>|j
�r�t!j"j#t!j"j|j
t$���r�t%d��|j&�|j'|jj(�|_
tjj)t*|
��|_t+|j|j
dd
|jj,d�n
|	j|��q WWdQRX|�rptjddjdd	�|D���|	�r�tjddjdd	�|	D���t-|	�dkS)z�Build wheels.

        :param unpack: If True, replace the sdist we built from with the
            newly built wheel, in preparation for installation.
        :return: True if all the wheels built correctly.
        z(Skipping %s, due to already being wheel.Nr:zCSkipping bdist_wheel for %s, due to binaries being disabled for it.Tz*Building wheels for collected packages: %sz, cSsg|]
}|j�qSr")rd)r�r�r"r"r#r�sz&WheelBuilder.build.<locals>.<listcomp>z Building wheel for %s failed: %s)r zbad source dir - missing markerF)�sessionzSuccessfully built %s� cSsg|]
}|j�qSr")rd)r�r�r"r"r#r�QszFailed to build %scSsg|]
}|j�qSr")rd)r�r�r"r"r#r�Vsr).rZ
prepare_filesrZrequirements�valuesZ
constraintr;r�r#rdZeditabler&r<r�splitextr=r>Zegg_info_matchesr?r!rr2r3rrZimplementation_tagr9rrrArrr'r6r7r�r
�AssertionErrorZremove_temporary_sourceZbuild_locationZ	build_dirrIrrr/rU)rZautobuildingZreqsetZbuildsetr�r&r�ZextZ
build_successZ
build_failurer r%rKZ
wheel_filer"r"r#�build�s�	






zWheelBuilder.build)NN)N)N)F)
r(r)r*r+r$r'r)r"r$r4r"r"r"r#r�s


r)rr�)rNr5)FNNTNFNN)Qr+Z
__future__rr�r�rBr�r4Zloggingr6Zos.pathr�r�r�rbr!r��base64rZemail.parserrZpip._vendor.sixrr=Z
pip.compatrZpip.downloadrrZpip.exceptionsr	r
rZ
pip.locationsrr
rr�rrrrrZpip.utils.uirZpip.utils.loggingrZpip.utils.setuptools_buildrZpip._vendor.distlib.scriptsrZpip._vendorrZpip._vendor.packaging.utilsrZpip._vendor.six.movesrZ	wheel_extrZ	getLoggerr(r��objectrr9r%r]rhrurrrzr�r�r�r�r�rrrErr"r"r"r#�<module>sn
)'



'7site-packages/pip/__pycache__/locations.cpython-36.pyc000064400000007420147511334620016716 0ustar003

���e��
@sdZddlmZddlZddlZddlZddlZddlmZddl	m
Z
mZddlm
Z
mZddlmZejd�Zd	Zd
Zdd�Zd
d�Zdd�Ze�r�ejjejd�Zn6yejjej�d�ZWnek
r�ejd�YnXejje�Zej �Z!ej"Z#ed�Z$e
�rtejjejd�Z%ejje#d�Z&ejj'e%��sRejjejd�Z%ejje#d�Z&dZ(ejje$d�Z)ejje)e(�Z*njejjejd�Z%ejje#d�Z&dZ(ejje$d�Z)ejje)e(�Z*ej+dd�dk�r�ejdd�dk�r�dZ%dd�ej,d�D�Z-d#d!d"�Z.dS)$z7Locations where we look for configs, install stuff, etc�)�absolute_importN)�	sysconfig)�install�SCHEME_KEYS)�WINDOWS�
expanduser)�appdirsZpipz�This file is placed here by pip to indicate the source was put
here by pip.

Once this package is successfully installed this source code will be
deleted (unless you remove this file).
zpip-delete-this-directory.txtc	Cs2tjj|t�}t|d��}|jt�WdQRXdS)z?
    Write the pip delete marker file into this directory.
    �wN)�os�path�join�PIP_DELETE_MARKER_FILENAME�open�write�DELETE_MARKER_MESSAGE)Z	directory�filepathZ	marker_fp�r�/usr/lib/python3.6/locations.py�write_delete_marker_filesrcCs*ttd�rdStjttdtj�kr&dSdS)zM
    Return True if we're running inside a virtualenv, False otherwise.

    Zreal_prefixT�base_prefixF)�hasattr�sys�prefix�getattrrrrr�running_under_virtualenv's

rcCs>tjjtjjtj��}tjj|d�}t�r:tjj|�r:dSdS)z?
    Return True if in a venv and no system site packages.
    zno-global-site-packages.txtTN)	r
r�dirname�abspath�site�__file__rr�isfile)Zsite_mod_dirZno_global_filerrr�virtualenv_no_global4sr �srcz=The folder you are executing pip from can no longer be found.�~ZScripts�binzpip.inizpip.confz.pip��darwin�z/System/Library/z/usr/local/bincCsg|]}tjj|t��qSr)r
rr�config_basename)�.0rrrr�
<listcomp>wsr)FcCshddlm}i}|r ddgi}ni}d|i}	|	j|�||	�}
|
j�|
jddd�}|oZ|sntd	j||���|pv|j|_|r�d
|_|p�|j|_|p�|j	|_	|p�|j
|_
|j�xtD]}t
|d|�||<q�Wd|
jd�kr�|jt|j|jd
��t��rdtjjtjdddtjdd�|�|d<|dk	�rdtjjtjj|d��d}
tjj||
dd��|d<|S)z+
    Return a distutils install scheme
    r)�DistributionZscript_argsz
--no-user-cfg�namerT)Zcreatezuser={0} prefix={1}�Zinstall_�install_lib)�purelib�platlib�includer�pythonN�Zheaders�)Zdistutils.distr*�updateZparse_config_filesZget_command_obj�AssertionError�format�userr�home�rootZfinalize_optionsrrZget_option_dict�dictr-rr
rrr�version�
splitdriver)Z	dist_namer7r8r9�isolatedrr*�schemeZextra_dist_argsZ	dist_args�d�i�keyZ
path_no_driverrr�distutils_scheme|sH



rB)FNNFN)/�__doc__Z
__future__rr
Zos.pathrrZ	distutilsrZdistutils.command.installrrZ
pip.compatrrZ	pip.utilsrZuser_cache_dirZUSER_CACHE_DIRrr
rrr rrrZ
src_prefix�getcwd�OSError�exitrZget_python_libZ
site_packages�	USER_SITE�	user_siteZuser_dirZbin_pyZbin_user�existsr'Zlegacy_storage_dirZlegacy_config_file�platformZsite_config_dirsZsite_config_filesrBrrrr�<module>sd
		
(site-packages/pip/__pycache__/cmdoptions.cpython-36.pyc000064400000031164147511334620017104 0ustar003

���eZ@�@sndZddlmZddlmZddlmZmZmZddl	Z	ddl
mZmZm
Z
mZddlmZddlmZmZdd	lmZd
d�Zdd
�Zd�dd�Zeedddddd�Zeedddddd�Zeeddddded�Zeeddd d!dd"d�Zeed#d$d%dd&d�Zeed'd(d)d!dd*d�Zeed+d,d-d.d/d0d1�Zeed2d3dded�Z eed4d5d6d7d8d9�Z!eed:d;d<d=d>d9�Z"eed?d@dAdBdCdDdEdF�Z#eedGdHd6d7ed9�Z$eedIdJd6d7ed9�Z%dKdL�Z&eedMdNd6d/dOdP�Z'eedQdRd6dd/dSdT�Z(eedUdVdWdXdYej)dZd[�Z*d\d]�Z+eed^d_ddd`d�Z,dadb�Z-dcdd�Z.eededfdded�Z/dgdh�Z0eedidfdjded�Z1dkdl�Z2eedmdndjded�Z3eedodpdddqd�Z4drds�Z5dtdu�Z6dvdw�Z7eedxdydzd{d|d}ed~d[�	Z8eedd�dd�ed�Z9eed�d�djd�d�d�Z:d�d��Z;d�d��Z<d�d��Z=d�d��Z>d�d��Z?eed�d�ed}d�d��Z@eed�d�djd�d�ZAeed�d�d�ddd�d�ZBeed�d�d�d�d�d}d�d1�ZCeed�d�dd�d�ZDeed�d�d�d�d�d��ZEeed�d�d�d�d�d��ZFeed�ddd�d��ZGeed�ddd�d��ZHeed�d�ddd�d�ZIeed�d�d�ded�ZJd�d��ZKeed�d�d�eKd�d�d��ZLeed�d�ddd�d�ZMd�eeeeeeee e!e"e#e$e%e&e0e'e(e@eAeIgd��ZNd�e*e+e,e-e4gd��ZOd�eOd�e.e/e1e2e3gd��ZPdS)�aD
shared options and groups

The principle here is to define options once, but *not* instantiate them
globally. One reason being that options with action='append' can carry state
between parses. pip parses general options twice internally, and shouldn't
pass on state. To be consistent, all options will follow this design.

�)�absolute_import)�partial)�OptionGroup�
SUPPRESS_HELP�OptionN)�
FormatControl�fmt_ctl_handle_mutual_exclude�fmt_ctl_no_binary�fmt_ctl_no_use_wheel)�PyPI)�USER_CACHE_DIR�
src_prefix)�
STRONG_HASHEScCs0t||d�}x|dD]}|j|��qW|S)z�
    Return an OptionGroup object
    group  -- assumed to be dict with 'name' and 'options' keys
    parser -- an optparse Parser
    �name�options)rZ
add_option)�group�parserZoption_group�option�r� /usr/lib/python3.6/cmdoptions.py�make_option_groupsrcCs|js|j}t|�dS)N)�	use_wheel�format_controlr
)r�controlrrr�resolve_wheel_no_use_binary$srcsP�dkr|��fdd�}dddg}tt||��rL|j}t|�tjddd	�dS)
z�Disable wheels if per-setup.py call options are set.

    :param options: The OptionParser options to update.
    :param check_options: The options to check, if not supplied defaults to
        options.
    Ncst�|d�S)N)�getattr)�n)�
check_optionsrr�getname4sz+check_install_build_global.<locals>.getnameZ
build_options�global_options�install_optionszeDisabling all use of wheels due to the use of --build-options / --global-options / --install-options.�)�
stacklevel)�any�maprr	�warnings�warn)rrr�namesrr)rr�check_install_build_global*s
r(z-hz--help�helpz
Show help.)�dest�actionr)z
--isolated�
isolated_mode�
store_trueFzSRun pip in an isolated mode, ignoring environment variables and user configuration.)r*r+�defaultr)z--require-virtualenvz--require-venvZrequire_venvz-vz	--verbose�verbose�countzDGive more output. Option is additive, and can be used up to 3 times.z-Vz	--version�versionzShow version and exit.z-qz--quiet�quietz�Give less output. Option is additive, and can be used up to 3 times (corresponding to WARNING, ERROR, and CRITICAL logging levels).z--logz
--log-filez--local-log�log�pathz Path to a verbose appending log.)r*�metavarr)z
--no-input�no_inputz--proxy�proxy�str�z<Specify a proxy in the form [user:passwd@]proxy.server:port.)r*�typer.r)z	--retries�retries�int�zRMaximum number of retries each connection should attempt (default %default times).z	--timeoutz--default-timeoutZsec�timeout�float�z2Set the socket timeout (default %default seconds).)r5r*r:r.r)z
--default-vcs�default_vcsz--skip-requirements-regex�skip_requirements_regexc
Cs"tddddddddggd	d
dd�S)
Nz--exists-action�
exists_actionZchoice�s�i�w�b�a�appendr+zYDefault action when a path already exists: (s)witch, (i)gnore, (w)ipe, (b)ackup, (a)bort.)r*r:�choicesr.r+r5r))rrrrrrC�srCz--cert�certzPath to alternate CA bundle.)r*r:r5r)z
--client-cert�client_certzkPath to SSL client certificate, a single file containing the private key and the certificate in PEM format.)r*r:r.r5r)z-iz--index-urlz
--pypi-url�	index_url�URLz�Base URL of Python Package Index (default %default). This should point to a repository compliant with PEP 503 (the simple repository API) or a local directory laid out in the same format.)r*r5r.r)cCstddddgdd�S)Nz--extra-index-urlZextra_index_urlsrNrIzmExtra URLs of package indexes to use in addition to --index-url. Should follow the same rules as --index-url.)r*r5r+r.r))rrrrr�extra_index_url�srOz
--no-index�no_indexzAIgnore package index (only looking at --find-links URLs instead).c	Cstddddgddd�S)Nz-fz--find-links�
find_linksrIZurlz�If a url or path to an html file, then parse for links to archives. If a local path or file:// url that's a directory, then look for archives in the directory listing.)r*r+r.r5r))rrrrrrQ�srQcCstdddgdtd�S)Nz--allow-external�allow_externalrI�PACKAGE)r*r+r.r5r))rrrrrrrRsrRz--allow-all-external�allow_all_externalcCstddddgdd�S)Nz--trusted-hostZ
trusted_hostsrIZHOSTNAMEzKMark this host as trusted, even though it does not have valid or any HTTPS.)r*r+r5r.r))rrrrr�trusted_hostsrUz--no-allow-externalZstore_falsec	Cstddddgdtd�S)Nz--allow-unverifiedz--allow-insecureZallow_unverifiedrIrS)r*r+r.r5r))rrrrrr�allow_unsafe3srVz--no-allow-insecureZallow_all_insecurez--process-dependency-links�process_dependency_linksz*Enable the processing of dependency links.c	Cstddddgddd�S)Nz-cz--constraint�constraintsrI�filez\Constrain versions using the given constraints file. This option can be used multiple times.)r*r+r.r5r))rrrrrrXRsrXc	Cstddddgddd�S)Nz-rz
--requirement�requirementsrIrYzQInstall from the given requirements file. This option can be used multiple times.)r*r+r.r5r))rrrrrrZ]srZc	Cstddddgddd�S)Nz-ez
--editableZ	editablesrIzpath/urlzkInstall a project in editable mode (i.e. setuptools "develop mode") from a local project path or a VCS url.)r*r+r.r5r))rrrrr�editablehsr[z--srcz--sourcez--source-dirz--source-directoryZsrc_dir�dirz�Directory to check out editable projects into. The default in a virtualenv is "<venv path>/src". The default for global installs is "<current dir>/src".z--use-wheelrTz--no-use-wheelz{Do not Find and prefer wheel archives when searching indexes and find-links locations. DEPRECATED in favour of --no-binary.cCst||j�S)zGet a format_control object.)rr*)�valuesrrrr�_get_format_control�sr^cCs"t|j|j�}t||j|j�dS)N)rr]r*r�	no_binary�only_binary)r�opt_str�valuer�existingrrr�_handle_no_binary�srdcCs"t|j|j�}t||j|j�dS)N)rr]r*rr`r_)rrarbrrcrrr�_handle_only_binary�srec	Cs tdddtdtt�t��dd�S)Nz--no-binaryr�callbackr8aRDo not use binary packages. Can be supplied multiple times, and each time adds to the existing value. Accepts either :all: to disable all binary packages, :none: to empty the set, or one or more package names with commas between them. Note that some packages are tricky to compile and may fail to install when this option is used on them.)r*r+rfr:r.r))rrdr�setrrrrr_�s
r_c	Cs tdddtdtt�t��dd�S)Nz
--only-binaryrrfr8aGDo not use source packages. Can be supplied multiple times, and each time adds to the existing value. Accepts either :all: to disable all source packages, :none: to empty the set, or one or more package names with commas between them. Packages without binary distributions will fail to install when this option is used on them.)r*r+rfr:r.r))rrerrgrrrrr`�s
r`z--cache-dir�	cache_dirzStore the cache data in <dir>.)r*r.r5r)z--no-cache-dirzDisable the cache.z	--no-depsz--no-dependenciesZignore_dependenciesz#Don't install package dependencies.z-bz--buildz--build-dirz--build-directory�	build_dirz/Directory to unpack packages into and build in.z--ignore-requires-python�ignore_requires_pythonz'Ignore the Requires-Python information.z--install-optionr rIra"Extra arguments to be supplied to the setup.py install command (use like --install-option="--install-scripts=/usr/local/bin"). Use multiple --install-option options to pass multiple options to setup.py install. If you are using an option with a directory path, be sure to use absolute path.)r*r+r5r)z--global-optionrzTExtra global options to be supplied to the setup.py call before the install command.z
--no-cleanz!Don't clean up build directories.)r+r.r)z--prezYInclude pre-release and development versions. By default, pip only finds stable versions.z--disable-pip-version-check�disable_pip_version_checkz{Don't periodically check PyPI to determine whether a new version of pip is available for download. Implied with --no-index.z-Zz--always-unzip�always_unzipc
Cs�|jjsi|j_y|jdd�\}}Wn"tk
rF|jd|�YnX|tkrh|jd|djt�f�|jjj|g�j|�dS)zkGiven a value spelled "algo:digest", append the digest to a list
    pointed to in a dict by the algo name.�:�zTArguments to %s must be a hash name followed by a value, like --hash=sha256:abcde...z&Allowed hash algorithms for %s are %s.z, N)	r]�hashes�split�
ValueError�errorr�join�
setdefaultrI)rrarbrZalgoZdigestrrr�_merge_hashsruz--hashrorf�stringzgVerify that the package's archive matches this hash before installing. Example: --hash=sha256:abcdef...)r*r+rfr:r)z--require-hashes�require_hashesz�Require a hash to check each requirement against, for repeatable installs. This option is implied when any package in a requirements file has a --hash option.zGeneral Options)rrzPackage Index Optionsz4Package Index Options (including deprecated options))N)Q�__doc__Z
__future__r�	functoolsrZoptparserrrr%Z	pip.indexrrr	r
Z
pip.modelsrZ
pip.locationsrr
Zpip.utils.hashesrrrr(Zhelp_r,Zrequire_virtualenvr/r1r2r3r6r7r;r>rArBrCrKrLZ
simple_urlrMrOrPrQrRrTrUZno_allow_externalrVZno_allow_unsaferWrXrZr[�srcrZno_use_wheelr^rdrer_r`rhZno_cacheZno_depsrirjr rZno_cleanZprerkrlru�hashrwZ
general_groupZnon_deprecated_index_groupZindex_grouprrrr�<module>	sr







site-packages/pip/__pycache__/baseparser.cpython-36.pyc000064400000022052147511334620017050 0ustar003

[Compiled CPython 3.6 bytecode ("Base option parser setup") — binary contents omitted.
Recoverable docstrings describe four classes:

  * PrettyHelpFormatter: "A prettier/less verbose help formatter for optparse"; builds a
    comma-separated list of option strings and metavars, and ensures there is only one
    newline between usage and the first heading when there is no description.
  * UpdatingDefaultsHelpFormatter: custom help formatter for ConfigOptionParser that
    updates the defaults before expanding them, so they show up correctly in the help
    listing.
  * CustomOptionParser: can insert an OptionGroup at a given position and list all
    options, including those in option groups.
  * ConfigOptionParser: "Custom option parser which updates its defaults by checking the
    configuration files and environmental variables"; it reads PIP_CONFIG_FILE, the
    site/user config files and all environment variables with the PIP_ prefix, and
    normalizes keys regardless of whether they were specified in environment variables or
    in config files.]
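Based on the recovered ConfigOptionParser docstrings and constant strings (the ^PIP_
prefix regex, the "_" to "-" and "--" normalization), a rough sketch of how environment
variables become option defaults — an assumption-laden illustration, not pip's verbatim
source:

    import os
    import re

    _environ_prefix_re = re.compile(r"^PIP_", re.I)

    def get_environ_vars():
        # "Returns a generator with all environmental vars with prefix PIP_"
        for key, val in os.environ.items():
            if _environ_prefix_re.search(key):
                yield _environ_prefix_re.sub("", key).lower(), val

    def normalize_keys(items):
        # Same normalization whether the key came from the environment or a config file.
        normalized = {}
        for key, val in items:
            key = key.replace("_", "-")
            if not key.startswith("--"):
                key = "--%s" % key   # only long option names are supported
            normalized[key] = val
        return normalized

    # e.g. PIP_NO_CACHE_DIR=1 in the environment yields {"--no-cache-dir": "1"}
    print(normalize_keys(get_environ_vars()))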
site-packages/pip/__pycache__/pep425tags.cpython-36.opt-1.pyc000064400000016433147511334620017564 0ustar003

[Compiled CPython 3.6 bytecode ("Generate and work with PEP 425 Compatibility Tags.") —
binary contents omitted. Recoverable docstrings describe the helpers:

  * get_abbr_impl: "Return abbreviated implementation name" (cp, pp, jy, ip).
  * get_impl_ver / get_impl_version_info: return the implementation version and a
    sys.version_info-like tuple used for decrementing the minor version.
  * get_impl_tag: "Returns the Tag for this specific implementation."
  * get_flag / get_abi_tag: "Return the ABI tag based on SOABI (if available) or emulate
    SOABI (CPython 2, PyPy)", with a fallback when the needed config var is unset or
    unavailable.
  * get_platform: "Return our platform name 'win32', 'linux_x86_64'" (with macosx and
    32-bit handling), plus is_manylinux1_compatible.
  * get_darwin_arches: "Return a list of supported arches (including group arches) for the
    given major, minor and machine architecture of an macOS machine."
  * get_supported: "Return a list of supported tags for each version specified in
    `versions`", taking optional versions, platform, impl and abi arguments; the
    module-level supported_tags, supported_tags_noarch and implementation_tag values are
    computed from it.]
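For illustration only (component values simplified and not taken from this bytecode): a
PEP 425 tag is a (python, abi, platform) triple, and the helpers above each compute one
component:

    import sys
    import distutils.util

    def get_abbr_impl():
        # "cp" for CPython; the real helper also distinguishes PyPy, Jython, IronPython.
        return "pp" if hasattr(sys, "pypy_version_info") else "cp"

    def get_impl_ver():
        return "%d%d" % sys.version_info[:2]

    def get_platform():
        # e.g. "linux_x86_64" or "win32"
        return distutils.util.get_platform().replace(".", "_").replace("-", "_")

    # ABI component simplified to "none" here; get_abi_tag() derives it from SOABI.
    tag = (get_abbr_impl() + get_impl_ver(), "none", get_platform())
    print(tag)   # e.g. ('cp36', 'none', 'linux_x86_64')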



"=
^
site-packages/pip/__pycache__/download.cpython-36.opt-1.pyc000064400000050324147511334620017472 0ustar003

���eO��@s^ddlmZddlZddlZddlZddlZddlZddlZddl	Z	ddl
Z
ddlZddlZddl
[Compiled CPython 3.6 bytecode for pip's download module — binary contents omitted.
Recoverable strings show it exports get_file_content, is_url, url_to_path, path_to_url,
is_archive_file, unpack_vcs_link, unpack_file_url, is_vcs_url, is_file_url,
unpack_http_url, unpack_url, parse_content_disposition and sanitize_content_filename, and
defines:

  * user_agent(): "Return a string representing the user agent" (pip version, Python
    implementation, OS/libc details, OpenSSL version).
  * MultiDomainBasicAuth, LocalFSAdapter, InsecureHTTPAdapter and PipSession (a
    requests.Session with retries, a cache adapter and per-host insecure mounts).
  * SafeFileCache: "A file based cache which is safe to use even when the target directory
    may not be accessible or writable."
  * get_file_content(): "Gets the content of a file; it may be a filename, file: URL, or
    http: URL. Returns (location, content). Content is unicode."
  * unpack_url() / unpack_file_url() / unpack_http_url(): unpack a link into a location,
    optionally copying the file into download_dir and verifying hashes.
  * PipXmlrpcTransport: "Provide a `xmlrpclib.Transport` implementation via a `PipSession`
    object."
  * _download_http_url() and _check_download_dir(): download a link into a temp dir and
    re-use a previously downloaded file when its hash still matches.]
rrrrZpip.utils.encodingrZpip.utils.filesystemrZpip.utils.loggingrZpip.utils.setuptools_buildrZpip.utils.glibcrZpip.utils.uirrZ
pip.locationsrZpip.vcsrr\rrZpip._vendor.requests.adaptersrrZpip._vendor.requests.authrr Zpip._vendor.requests.modelsr!r"Zpip._vendor.requests.utilsr#Zpip._vendor.requests.structuresr$r%Zpip._vendor.cachecontrolr&Zpip._vendor.cachecontrol.cachesr'Zpip._vendor.lockfiler(Zpip._vendor.six.movesr)�__all__Z	getLoggerr�r�rgrhr�r�r�ZSessionr�r*�compile�Ir�r�r+r,r-r.r/r�r1r2r�r�r�r�r3r0r
rrr4r6r5r�r�r?r?r?rC�<module>s�
0
BR!BH
)
`
&
0$
'8site-packages/pip/__pycache__/wheel.cpython-36.pyc000064400000052356147511334620016037 0ustar003

���e~�@s$dZddlmZddlZddlZddlZddlZddlZddlZddl	Z	ddl
Z	ddlZddlZddl
Z
[Compiled CPython 3.6 bytecode for pip's wheel module ("Support for installing and
building the 'wheel' binary package format") — binary contents omitted. Recoverable
docstrings describe:

  * WheelCache: "A cache of wheels for future installs"; cached_wheel() consults a
    per-link directory under <cache_dir>/wheels derived from a hash of the sdist link, so
    that only plausibly versioned builds are re-used.
  * rehash(): "Return (hash, length) for path using hashlib.new(algo)"; fix_script():
    "Replace #!python with #!/path/to/python"; root_is_purelib(); get_entrypoints().
  * move_wheel_files(): installs an unpacked wheel into the target scheme, rewriting
    scripts and the RECORD file.
  * wheel_version() / check_compatibility(): read Wheel-Version from the extracted wheel
    and refuse (or warn about) incompatible major/minor versions.
  * Wheel: "A wheel file"; parses the filename into name, version, build and
    python/abi/platform tags and exposes support_index_min()/supported() against the
    running interpreter's tags.
  * WheelBuilder: "Build wheels from a RequirementSet" by running setup.py bdist_wheel and
    storing the result in the wheel cache.]
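The recovered _cache_for_link docstring explains that each sdist link gets its own
wheel-cache directory derived from a hash of the link. A sketch of that layout (the cache
path and URL below are made-up examples, not pip's exact code):

    import hashlib
    import os

    def cache_dir_for_link(cache_dir, link_url):
        # Hash the link key and split the digest into nested directories under "wheels".
        hashed = hashlib.sha224(link_url.encode()).hexdigest()
        parts = [hashed[:2], hashed[2:4], hashed[4:6], hashed[6:]]
        return os.path.join(cache_dir, "wheels", *parts)

    print(cache_dir_for_link("/tmp/pip-cache", "https://example.invalid/pkg-1.0.tar.gz"))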
rZ
pip.locationsrr
rr�rrrrrZpip.utils.uirZpip.utils.loggingrZpip.utils.setuptools_buildrZpip._vendor.distlib.scriptsrZpip._vendorrZpip._vendor.packaging.utilsrZpip._vendor.six.movesrZ	wheel_extrZ	getLoggerr(r��objectrr9r%r]rhrurrrzr�r�r�r�r�rrrErr"r"r"r#�<module>sn
)'



'7site-packages/pip/__pycache__/exceptions.cpython-36.opt-1.pyc000064400000024346147511334620020051 0ustar003

���e��@sddZddlmZddlmZmZmZddlmZGdd�de	�Z
Gdd�de
�ZGd	d
[Compiled CPython 3.6 bytecode for pip's exceptions module ("Exceptions used throughout
package") — binary contents omitted. Recoverable docstrings list the hierarchy:

  * PipError ("Base pip exception"), InstallationError, UninstallationError,
    DistributionNotFound, RequirementsFileParseError, BestVersionAlreadyInstalled,
    BadCommand, CommandError, PreviousBuildDirError, InvalidWheelFilename,
    UnsupportedWheel, UnsupportedPythonVersion.
  * HashErrors: "Multiple HashError instances rolled into one for reporting."
  * HashError and its subclasses VcsHashUnsupported, DirectoryUrlHashUnsupported,
    HashMissing, HashUnpinned and HashMismatch, each carrying an `order` and a `head`
    message used to group and print hash failures (e.g. "Hashes are required in
    --require-hashes mode..." and "THESE PACKAGES DO NOT MATCH THE HASHES FROM THE
    REQUIREMENTS FILE...").]
rrr(rr#rErrrr
rA�s
rAc@seZdZdZdS)�UnsupportedPythonVersionzMUnsupported python version according to Requires-Python package
    metadata.N)rr	r
rrrrr
rJ�srJN)rZ
__future__r�	itertoolsrrrZpip._vendor.sixr�	Exceptionrrrrrrrrrrrrr1r6r7r9r?rArJrrrr
�<module>s,,		$	8site-packages/pip/__pycache__/__init__.cpython-36.opt-1.pyc000064400000020444147511334620017422 0ustar003

���e�.�@s�ddlmZddlZddlZddlZddlZddlZddlZddlZddl	m
Z
ejde
d�yddlZWne
[Compiled CPython 3.6 bytecode for pip/__init__.py (pip 9.0.3) — binary contents omitted.
Recoverable strings describe:

  * autocomplete(): "Command and option completion for the main option parser (and
    options) and its subcommands (and options). Enable by sourcing one of the completion
    shell scripts (bash, zsh or fish)."
  * create_main_parser(), parseopts(), check_isolated() and main(): build the
    ConfigOptionParser, resolve the subcommand (suggesting similar names for unknown
    commands) and dispatch to it.
  * FrozenRequirement: renders an installed distribution back into a requirements-file
    line, handling editable VCS checkouts via "%s@%s#egg=%s" style URLs.]
�__name__�
__module__�__qualname__rqr�compilerxrz�classmethodr~�staticmethodr|r�rrrrrl�s

Arl�__main__)N)9Z
__future__rrdZloggingr7r$�warningsr<rZpip._vendor.urllib3.exceptionsr�filterwarningsZssl�ImportError�platform�getattrZpip._vendor.urllib3.contribr�OSErrorZinject_into_urllib3Zpip.exceptionsrr	r
Z	pip.utilsrrr
rrvrrrrZpip.baseparserrrZpip.commandsrrrrZpip.cmdoptionsZpiprRrPZ	getLoggerr�rhrEr;r_rarS�objectrlr=rrrr�<module>sR


I*	
[
site-packages/pip/__pycache__/index.cpython-36.opt-1.pyc000064400000074036147511334620017000 0ustar003

���eW��@sdZddlmZddlZddlZddlmZddlZddlZddl	Z	ddl
Z
ddlZddlZddl
Z
ddlmZddlmZddlmZddlmZmZmZmZmZdd	lmZdd
lmZddlm Z ddl!m"Z"m#Z#m$Z$m%Z%dd
l&m'Z'm(Z(m)Z)m*Z*ddl+m,Z,m-Z-ddl.m/Z/ddl0m1Z1m2Z2m3Z3ddl4mZ5ddl6m7Z7ddl8m9Z9ddl:m;Z;ddl<m=Z=dddgZ>d3d4d5d6d7d8gZ?ej@eA�ZBGdd �d eC�ZDGd!d�deC�ZEe
jFd"e
jG�fd#d$�ZHGd%d&�d&eC�ZIGd'd(�d(eC�ZJedd)�ZKd*d�ZLd+d,�ZMd-d.�ZNd/d0�ZOed1d2�ZPdS)9z!Routines related to PyPI, indexes�)�absolute_importN)�
namedtuple)�parse)�request)�	ipaddress)�cached_property�splitext�normalize_path�ARCHIVE_EXTENSIONS�SUPPORTED_EXTENSIONS)�RemovedInPip10Warning)�
indent_log)�check_requires_python)�DistributionNotFound�BestVersionAlreadyInstalled�InvalidWheelFilename�UnsupportedWheel)�HAS_TLS�is_url�path_to_url�url_to_path)�Wheel�	wheel_ext)�
get_supported)�html5lib�requests�six)�canonicalize_name)�
specifiers)�SSLError)�unescape�
FormatControl�fmt_ctl_handle_mutual_exclude�
PackageFinder�https�*�	localhost�127.0.0.0/8�::1/128�file�sshc@s\eZdZdd�Zdd�Zdd�Zdd�Zd	d
�Zdd�Zd
d�Z	dd�Z
dd�Zdd�ZdS)�InstallationCandidatecCs,||_t|�|_||_|j|j|jf|_dS)N)�project�
parse_version�version�location�_key)�selfr,r.r/�r2�/usr/lib/python3.6/index.py�__init__>s
zInstallationCandidate.__init__cCsdj|j|j|j�S)Nz,<InstallationCandidate({0!r}, {1!r}, {2!r})>)�formatr,r.r/)r1r2r2r3�__repr__DszInstallationCandidate.__repr__cCs
t|j�S)N)�hashr0)r1r2r2r3�__hash__IszInstallationCandidate.__hash__cCs|j|dd��S)NcSs||kS)Nr2)�s�or2r2r3�<lambda>Msz.InstallationCandidate.__lt__.<locals>.<lambda>)�_compare)r1�otherr2r2r3�__lt__LszInstallationCandidate.__lt__cCs|j|dd��S)NcSs||kS)Nr2)r9r:r2r2r3r;Psz.InstallationCandidate.__le__.<locals>.<lambda>)r<)r1r=r2r2r3�__le__OszInstallationCandidate.__le__cCs|j|dd��S)NcSs||kS)Nr2)r9r:r2r2r3r;Ssz.InstallationCandidate.__eq__.<locals>.<lambda>)r<)r1r=r2r2r3�__eq__RszInstallationCandidate.__eq__cCs|j|dd��S)NcSs||kS)Nr2)r9r:r2r2r3r;Vsz.InstallationCandidate.__ge__.<locals>.<lambda>)r<)r1r=r2r2r3�__ge__UszInstallationCandidate.__ge__cCs|j|dd��S)NcSs||kS)Nr2)r9r:r2r2r3r;Ysz.InstallationCandidate.__gt__.<locals>.<lambda>)r<)r1r=r2r2r3�__gt__XszInstallationCandidate.__gt__cCs|j|dd��S)NcSs||kS)Nr2)r9r:r2r2r3r;\sz.InstallationCandidate.__ne__.<locals>.<lambda>)r<)r1r=r2r2r3�__ne__[szInstallationCandidate.__ne__cCst|t�stS||j|j�S)N)�
isinstancer+�NotImplementedr0)r1r=�methodr2r2r3r<^s
zInstallationCandidate._compareN)
�__name__�
__module__�__qualname__r4r6r8r>r?r@rArBrCr<r2r2r2r3r+<sr+c	@s�eZdZdZd!dd�Zdd�Zed"dd	��Zd
d�Zdd
�Z	dd�Z
dd�Zdd�Zdd�Z
ejd�Zdd�Zdd�Zdd�Zdd�Zdd �ZdS)#r#z�This finds packages.

    This is meant to match easy_install's technique for looking for
    packages, by reading pages and looking for appropriate links.
    FNcCs�|dkrtd��g|_x:|D]2}|jd�rBt|�}
tjj|
�rB|
}|jj|�qW||_g|_	t
�|_|pvtt
�t
��|_
dd�|r�|ngD�|_||_||_||_t|	||
|d�|_ts�x8tj|j|j�D]$}tj|�}|jdkr�tjd�Pq�WdS)	a�Create a PackageFinder.

        :param format_control: A FormatControl object or None. Used to control
            the selection of source packages / binary packages when consulting
            the index and links.
        :param platform: A string or None. If None, searches for packages
            that are supported by the current system. Otherwise, will find
            packages that can be built on the platform passed in. These
            packages will only be downloaded for distribution: they will
            not be built locally.
        :param versions: A list of strings or None. This is passed directly
            to pep425tags.py in the get_supported() method.
        :param abi: A string or None. This is passed directly
            to pep425tags.py in the get_supported() method.
        :param implementation: A string or None. This is passed directly
            to pep425tags.py in the get_supported() method.
        Nz>PackageFinder() missing 1 required keyword argument: 'session'�~cSsg|]}d|df�qS)r%r2)�.0�hostr2r2r3�
<listcomp>�sz*PackageFinder.__init__.<locals>.<listcomp>)�versions�platform�abi�implr$zipip is configured with locations that require TLS/SSL, however the ssl module in Python is not available.)�	TypeError�
find_links�
startswithr	�os�path�exists�append�
index_urls�dependency_links�set�logged_linksr!�format_control�secure_origins�allow_all_prereleases�process_dependency_links�sessionr�
valid_tagsr�	itertools�chain�urllib_parse�urlparse�scheme�logger�warning)r1rSrYr_Z
trusted_hostsr`rar]rOrNrP�implementation�linkZnew_link�parsedr2r2r3r4ls>	




zPackageFinder.__init__cCs"|jrtjdt�|jj|�dS)NzXDependency Links processing has been deprecated and will be removed in a future release.)r`�warnings�warnrrZ�extend)r1�linksr2r2r3�add_dependency_links�s
z"PackageFinder.add_dependency_linkscs�g�g���fdd�}x�|D]�}tjj|�}|jd�}|s>|r�|rH|}nt|�}tjj|�r�|r�tjj|�}x4tj|�D]}|tjj||��qxWq�|rƈj	|�q�tjj
|�r�||�q�tjd|�qt
|�r܈j	|�qtjd|�qW��fS)zt
        Sort locations into "files" (archives) and "urls", and return
        a pair of lists (files,urls)
        cs8t|�}tj|dd�ddkr*�j|�n
�j|�dS)NF)�strictrz	text/html)r�	mimetypesZ
guess_typerX)rV�url)�files�urlsr2r3�	sort_path�sz0PackageFinder._sort_locations.<locals>.sort_pathzfile:z:Url '%s' is ignored: it is neither a file nor a directory.zQUrl '%s' is ignored. It is either a non-existing path or lacks a specific scheme.)rUrVrWrTr�isdir�realpath�listdir�joinrX�isfilerhrir)�	locations�
expand_dirrwrtZ
is_local_pathZis_file_urlrV�itemr2)rurvr3�_sort_locations�s8



zPackageFinder._sort_locationscCsXt|j�}|jjrHt|jj�}|j|j�s8td|j��|j|j�}n|}|j	|fS)a[
        Function used to generate link sort key for link tuples.
        The greater the return value, the more preferred it is.
        If not finding wheels, then sorted by version only.
        If finding wheels, then the sort order is by version, then:
          1. existing installs
          2. wheels ordered via Wheel.support_index_min(self.valid_tags)
          3. source archives
        Note: it was considered to embed this logic into the Link
              comparison operators, but then different sdist links
              with the same version, would have to be considered equal
        zB%s is not a supported wheel for this platform. It can't be sorted.)
�lenrbr/�is_wheelr�filename�	supportedrZsupport_index_minr.)r1�	candidateZsupport_num�wheelZprir2r2r3�_candidate_sort_key�s

z!PackageFinder._candidate_sort_keyc	Csltjt|��}|j|j|jf}|djdd�d
}�x t|jD�]}||dkr`|ddkr`q@yht	j
t|dtj
�s�|ddkr�|dn|djd��}t	jt|dtj
�r�|dn|djd��}WnJtk
�r|d�r|dj�|dj�k�r|ddk�rw@YnX||k�r q@|d|dk�rP|ddk�rP|ddk	�rPq@dSW|jd|j|j�d	S)Nr�+�r%�utf8�Tz�The repository located at %s is not a trusted or secure host and is being ignored. If this repository is available via HTTPS it is recommended to use HTTPS instead, otherwise you may silence this warning and allow it anyways with '--trusted-host %s'.F���)rerf�strrgZhostnameZport�rsplit�SECURE_ORIGINSr^rZ
ip_addressrDrZ	text_type�decodeZ
ip_network�
ValueError�lowerri)	r1rhr/rl�originZprotocolZ
secure_originZaddrZnetworkr2r2r3�_validate_secure_origins>

z%PackageFinder._validate_secure_origincs �fdd���fdd�|jD�S)z�Returns the locations found via self.index_urls

        Checks the url_name on the main (first in the list) index and
        uses this url_name to produce all locations
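# Sketch of the URL construction outlined above: each configured index URL
# gets the quoted project name appended and a trailing slash enforced.
# Lower-casing stands in for pip's full name canonicalisation and is an
# assumption of this sketch.
import posixpath
from urllib.parse import quote

def index_urls_locations(index_urls, project_name):
    locations = []
    for url in index_urls:
        loc = posixpath.join(url, quote(project_name.lower()))
        if not loc.endswith('/'):
            loc += '/'
        locations.append(loc)
    return locations

print(index_urls_locations(['https://pypi.python.org/simple'], 'Requests'))
# -> ['https://pypi.python.org/simple/requests/']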
        cs,tj|tjt����}|jd�s(|d}|S)N�/)�	posixpathr{reZquoter�endswith)rt�loc)�project_namer2r3�mkurl_pypi_urlhs
z?PackageFinder._get_index_urls_locations.<locals>.mkurl_pypi_urlcsg|]}�|��qSr2r2)rKrt)r�r2r3rMusz;PackageFinder._get_index_urls_locations.<locals>.<listcomp>)rY)r1r�r2)r�r�r3�_get_index_urls_locationsas
z'PackageFinder._get_index_urls_locationscs��j|�}�j|�\}}�j�jdd�\}}�j�j�\}}dd�tj|||�D�}	�fdd�tjdd�|D�dd�|D�d	d�|D��D�}
tjd
t|
�|�x|
D]}tjd|�q�Wt	|�}t
�j|�}
t|||
�}�j
dd��jD�|�}g}xJ�j|
|�D]:}tjd
|j�t��|j�j
|j|��WdQRX�qW�j
dd��jD�|�}|�r|tjddjdd�|D����j
|	|�}|�r�|jdd�tjddjdd�|D���||||S)aFind all available InstallationCandidate for project_name

        This checks index_urls, find_links and dependency_links.
        All versions found are returned as an InstallationCandidate list.

        See _link_package_versions for details on which files are accepted
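# Hedged usage sketch for find_all_candidates as documented above; it assumes
# network access to the index and pip 9.x internal names, and the candidate
# attributes (project, version, location) are read from this same module.
from pip.download import PipSession
from pip.index import PackageFinder

finder = PackageFinder(find_links=[], index_urls=["https://pypi.python.org/simple"],
                       session=PipSession())
for candidate in finder.find_all_candidates("requests"):
    print(candidate.project, candidate.version, candidate.location)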
        T)r~css|]}t|�VqdS)N)�Link)rKrtr2r2r3�	<genexpr>�sz4PackageFinder.find_all_candidates.<locals>.<genexpr>csg|]}�jt|�r|�qSr2)r�rh)rKrk)r1r2r3rM�sz5PackageFinder.find_all_candidates.<locals>.<listcomp>css|]}t|�VqdS)N)r�)rKrtr2r2r3r��scss|]}t|�VqdS)N)r�)rKrtr2r2r3r��scss|]}t|�VqdS)N)r�)rKrtr2r2r3r��sz,%d location(s) to search for versions of %s:z* %scss|]}t|d�VqdS)z-fN)r�)rKrtr2r2r3r��szAnalyzing links from page %sNcss|]}t|�VqdS)N)r�)rKrtr2r2r3r��szdependency_links found: %sz, cSsg|]}|jj�qSr2)r/rt)rKr.r2r2r3rM�s)�reversezLocal files found: %scSsg|]}t|jj��qSr2)rr/rt)rKr�r2r2r3rM�s)r�r�rSrZrcrdrh�debugr�r�fmt_ctl_formatsr]�Search�_package_versions�
_get_pagesrtr
rorpr{�sort)r1r�Zindex_locationsZindex_file_locZ
index_url_locZfl_file_locZ
fl_url_locZdep_file_locZdep_url_locZfile_locationsZ
url_locationsr/�canonical_name�formats�searchZfind_links_versionsZ
page_versions�pageZdependency_versionsZ
file_versionsr2)r1r3�find_all_candidateswsX


 
z!PackageFinder.find_all_candidatesc
s�|j|j�}t|jjdd�|D�|jr,|jndd����fdd�|D�}|r�t||jd�}t|j	dd�r�d	d�|D�}t|�r�t||jd�}q�d
j
|j|j|j	�}|j	j
r�|dj
|j	j
�7}tj|�nd}|jdk	r�t|jj�}nd}|dko�|dk�r0tjd|d
jttdd�|D��td���td|��d}	|�rT|dk�sP|j|k�rTd}	|�r�|dk	�r�|	�rztjd|�ntjd||j�dS|	�r�tjd|d
jt�td���p�d�t�tjd|jd
jt�td���|j	S)z�Try to find a Link matching req

        Expects req, an InstallRequirement, and upgrade, a boolean.
        Returns a Link if found;
        raises DistributionNotFound or BestVersionAlreadyInstalled otherwise.
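# Hedged sketch of the contract documented above (the pip 9.x internal import
# paths are assumptions): DistributionNotFound signals that no candidate
# satisfied the requirement, while a successful lookup yields a Link.
from pip.download import PipSession
from pip.exceptions import DistributionNotFound
from pip.index import PackageFinder
from pip.req import InstallRequirement

finder = PackageFinder(find_links=[], index_urls=["https://pypi.python.org/simple"],
                       session=PipSession())
req = InstallRequirement.from_line("requests>=2.0")
try:
    link = finder.find_requirement(req, upgrade=False)
    print(link.url)
except DistributionNotFound:
    print("no matching distribution found")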
        cSsg|]}t|j��qSr2)r�r.)rK�cr2r2r3rM�sz2PackageFinder.find_requirement.<locals>.<listcomp>N)Zprereleasescsg|]}t|j��kr|�qSr2)r�r.)rKr�)�compatible_versionsr2r3rM�s)�key�yankedFcSsg|]}t|jdd�s|�qS)r�F)�getattrr/)rKr�r2r2r3rM�sznWARNING: The candidate selected for download or install is a yanked version: '{}' candidate (version {} at {})z
Reason for being yanked: {}zNCould not find a version that satisfies the requirement %s (from versions: %s)z, css|]}t|j�VqdS)N)r�r.)rKr�r2r2r3r�sz1PackageFinder.find_requirement.<locals>.<genexpr>z%No matching distribution found for %sTzLExisting installed version (%s) is most up-to-date and satisfies requirementzUExisting installed version (%s) satisfies requirement (most up-to-date version is %s)z=Installed version (%s) is most up-to-date (past versions: %s)Znonez)Using version %s (newest of versions: %s))r��namer[Z	specifier�filterr_�maxr�r�r/r5r,r.�
yanked_reasonrhriZsatisfied_byr-Zcriticalr{�sortedrr�r)
r1ZreqZupgradeZall_candidatesZapplicable_candidatesZbest_candidateZnonyanked_candidatesZwarning_messageZinstalled_versionZbest_installedr2)r�r3�find_requirement�sx



zPackageFinder.find_requirementccsFt�}x:|D]2}||krq|j|�|j|�}|dkr8q|VqWdS)zp
        Yields (page, page_url) from the given locations, skipping
        locations that have errors.
        N)r[�add�	_get_page)r1r}r��seenr/r�r2r2r3r�Bs


zPackageFinder._get_pagesz-py([123]\.?[0-9]?)$cCsTgg}}t�}x:|D]2}||kr|j|�|jr>|j|�q|j|�qW||S)z�
        Returns elements of links in order, non-egg links first, egg links
        second, while eliminating duplicates
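# Self-contained sketch of the ordering described above: keep the first
# occurrence of every link and push links carrying an '#egg=' fragment behind
# the rest; real pip works on Link objects rather than plain strings.
def sort_links(links):
    eggs, no_eggs, seen = [], [], set()
    for link in links:
        if link in seen:
            continue
        seen.add(link)
        (eggs if '#egg=' in link else no_eggs).append(link)
    return no_eggs + eggs

print(sort_links(['a-1.0.tar.gz', 'repo#egg=pkg', 'a-1.0.tar.gz', 'b-1.0.whl']))
# -> ['a-1.0.tar.gz', 'b-1.0.whl', 'repo#egg=pkg']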
        )r[r��egg_fragmentrX)r1rpZeggsZno_eggsr�rkr2r2r3�_sort_linksUs


zPackageFinder._sort_linkscCs:g}x0|j|�D]"}|j||�}|dk	r|j|�qW|S)N)r��_link_package_versionsrX)r1rpr��resultrk�vr2r2r3r�eszPackageFinder._package_versionscCs(||jkr$tjd||�|jj|�dS)NzSkipping link %s; %s)r\rhr�r�)r1rk�reasonr2r2r3�_log_skipped_linkms
zPackageFinder._log_skipped_linkc
CsJd}|jr|j}|j}�n|j�\}}|s:|j|d�dS|tkrV|j|d|�dSd|jkr~|tkr~|j|d|j�dSd|jkr�|dkr�|j|d�dS|tk�r&yt	|j
�}Wn tk
r�|j|d	�dSXt|j
�|jk�r|j|d
|j�dS|j|j��s |j|d�dS|j}d|jk�rR|tk�rR|j|d
|j�dS|�sft||j|�}|dk�r�|j|d
|j�dS|jj|�}|�r�|d|j��}|jd�}|tjdd�k�r�|j|d�dSyt|j�}	Wn.tjk
�rtjd|j
|j�d}	YnX|	�s.tjd||j�dStjd||�t|j||�S)z'Return an InstallationCandidate or NoneNz
not a filezunsupported archive format: %s�binaryzNo binaries permitted for %sZmacosx10z.zipzmacosx10 onezinvalid wheel filenamezwrong project name (not %s)z%it is not compatible with this Python�sourcezNo sources permitted for %sr��zPython version is incorrectz3Package %s has an invalid Requires-Python entry: %sTz_The package %s is incompatible with the pythonversion in use. Acceptable python versions are:%szFound link %s, version: %s)r��extrr�rr�rZsuppliedrVrr�rrr�Z	canonicalr�rbr.�egg_info_matches�_py_version_rer��start�group�sysr�requires_pythonrZInvalidSpecifierrhr�r+)
r1rkr�r.�egg_infor�r��match�
py_versionZsupport_this_pythonr2r2r3r�rs�





z$PackageFinder._link_package_versionscCstj||jd�S)N)ra)�HTMLPage�get_pagera)r1rkr2r2r3r��szPackageFinder._get_page)	FNFNNNNNN)F)rGrHrI�__doc__r4rq�staticmethodr�r�r�r�r�r�r��re�compiler�r�r�r�r�r�r2r2r2r3r#es(
Q
1GSx
Mz([a-z0-9_.]+)-([a-z0-9_.!+-]+)cCs�|j|�}|stjd|�dS|dkrB|jd�}||jd�d�S|jd�j�}|jdd�}|j�d}|j|�r�|jd�t|�d�SdSdS)axPull the version part out of a string.

    :param egg_info: The string to parse. E.g. foo-2.1
    :param search_name: The name of the package this belongs to. None to
        infer the name. Note that this cannot unambiguously parse strings
        like foo-2-2 which might be foo, 2-2 or foo-2, 2.
    :param link: The link the string came from, for logging on failure.
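# Sketch of the parsing described above, reusing the egg-info pattern visible
# in this dump; real pip also anchors on an explicit search_name when one is
# supplied, which this simplified version omits.
import re

_egg_info_re = re.compile(r'([a-z0-9_.]+)-([a-z0-9_.!+-]+)')

def egg_info_version(egg_info):
    match = _egg_info_re.search(egg_info.lower())
    if not match:
        return None
    # Without a package name to anchor on, everything after the first dash is
    # taken as the version -- hence the foo-2-2 ambiguity noted above.
    full = match.group(0)
    return full[full.index('-') + 1:]

print(egg_info_version('foo-2.1'))   # -> '2.1'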
    z%Could not parse version from link: %sNr�-�_)	r�rhr�r��indexr��replacerTr�)r�Zsearch_namerkZ_egg_info_rer�Z
full_matchr�Zlook_forr2r2r3r��s


r�c@sxeZdZdZddd�Zdd�Zeddd	��Zedd
d��Z	edd
��Z
edd��Ze
dd��Zejdej�Zdd�ZdS)r�z'Represents one page, along with its URLNcCs\d}|r2d|kr2tj|d�\}}d|kr2|d}||_tj|j|dd�|_||_||_dS)NzContent-Type�charsetF)Ztransport_encodingZnamespaceHTMLElements)�cgiZparse_header�contentrrrlrt�headers)r1r�rtr��encoding�content_type�paramsr2r2r3r4�s
zHTMLPage.__init__cCs|jS)N)rt)r1r2r2r3�__str__�szHTMLPage.__str__TcCsl|dkrtd��|j}|jdd�d}ddlm}x>|jD]4}|j�j|�r:|t|�dkr:t	j
d||�dSq:W�y"|r�|j}xHtD]@}|j
|�r�|j||d�}	|	j�jd	�r�Pq�t	j
d
||	�dSq�Wt	j
d|�tj|�\}}
}}}
}|dk�r6tjjtj|���r6|j
d
��s|d
7}tj|d�}t	j
d|�|j|d	dd�d�}|j�|jjdd�}	|	j�jd	��s�t	j
d
||	�dS||j|j|j�}Wn�tjk
�r�}z|j|||�WYdd}~Xn�tk
�r}z"d|}|j|||t	jd�WYdd}~Xn`tj k
�r>}z|j|d||�WYdd}~Xn*tj!k
�rb|j|d|�YnX|SdS)Nz9get_page() missing 1 required keyword argument: 'session'�#r�r)�
VcsSupportz+:zCannot look at %s URL %s)raz	text/htmlz,Skipping page %s because of Content-Type: %szGetting page %sr)r�z
index.htmlz# file: URL is directory, getting %szmax-age=600)ZAcceptz
Cache-Control)r�zContent-Type�unknownz6There was a problem confirming the ssl certificate: %s)�methzconnection error: %sz	timed out)"rRrt�split�pip.vcsr�Zschemesr�rTr�rhr�r�r
r��_get_content_typererfrUrVrx�urllib_requestZurl2pathname�urljoin�get�raise_for_statusr�r�rZ	HTTPError�_handle_failr�info�ConnectionErrorZTimeout)�clsrkZ
skip_archivesrartr�rgr�Zbad_extr��netlocrVr��query�fragment�respZinst�excr�r2r2r3r��sp



$"zHTMLPage.get_pagecCs|dkrtj}|d||�dS)Nz%Could not fetch URL %s: %s - skipping)rhr�)rkr�rtr�r2r2r3r�NszHTMLPage._handle_failcCsDtj|�\}}}}}|dkr dS|j|dd�}|j�|jjdd�S)z;Get the Content-Type of the given url, using a HEAD request�httpr$�T)Zallow_redirectszContent-Type)r�r$)re�urlsplit�headr�r�r�)rtrargr�rVr�r�r�r2r2r3r�UszHTMLPage._get_content_typecCs@dd�|jjd�D�}|r6|djd�r6|djd�S|jSdS)NcSsg|]}|jd�dk	r|�qS)�hrefN)r�)rK�xr2r2r3rMfsz%HTMLPage.base_url.<locals>.<listcomp>z.//baserr�)rl�findallr�rt)r1�basesr2r2r3�base_urlcszHTMLPage.base_urlccs�x�|jjd�D]v}|jd�r|jd�}|jtj|j|��}|jd�}|rPt|�nd}|jddd�}|dk	rrt|�}t||||d�VqWdS)zYields all links in the pagez.//ar�zdata-requires-pythonNzdata-yanked)�default)r�r�)	rlr�r��
clean_linkrer�r�r r�)r1Zanchorr�rtZ	pyrequirer�r2r2r3rpns


zHTMLPage.linksz[^a-z0-9$&+,/:;=?@.#%_\\|-]cCs|jjdd�|�S)z�Makes sure a link is fully encoded.  That is, if a ' ' shows up in
        the link, it will be rewritten to %20 (while not over-quoting
        % or other characters).cSsdt|jd��S)Nz%%%2xr)�ordr�)r�r2r2r3r;�sz%HTMLPage.clean_link.<locals>.<lambda>)�	_clean_re�sub)r1rtr2r2r3r��szHTMLPage.clean_link)N)TN)N)rGrHrIr�r4r��classmethodr�r�r�r�rr��propertyrpr�r��Ir�r�r2r2r2r3r��s
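# Sketch of the percent-encoding step described above, reusing the character
# class visible in this dump; only characters outside the allowed set are
# rewritten, so '%' and already-encoded sequences are left untouched.
import re

_clean_re = re.compile(r'[^a-z0-9$&+,/:;=?@.#%_\\|-]', re.I)

def clean_link(url):
    return _clean_re.sub(lambda match: '%%%2x' % ord(match.group(0)), url)

print(clean_link('https://host/path/some file.tar.gz'))   # the space becomes %20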
Ur�c@s eZdZd5dd�Zdd�Zdd�Zdd	�Zd
d�Zdd
�Zdd�Z	dd�Z
dd�Zdd�Ze
dd��Ze
dd��Ze
dd��Ze
dd��Zdd�Ze
d d!��Ze
d"d#��Zejd$�Ze
d%d&��Zejd'�Ze
d(d)��Zejd*�Ze
d+d,��Ze
d-d.��Ze
d/d0��Ze
d1d2��Ze
d3d4��Z dS)6r�NcCs@|jd�rt|�}||_||_|r&|nd|_||_|dk	|_dS)a�
        Object representing a parsed link from https://pypi.python.org/simple/*

        url:
            url of the resource pointed to (href of the link)
        comes_from:
            instance of HTMLPage where the link was found, or string.
        requires_python:
            String containing the `Requires-Python` metadata field, specified
            in PEP 345. This may be specified by a data-requires-python
            attribute in the HTML link tag, as described in PEP 503.
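# Self-contained sketch of the fragment handling this class documents: the egg
# name and an optional hash both travel in the URL fragment.  The patterns are
# copied from this dump; the example URL is invented for illustration.
import re

_egg_fragment_re = re.compile(r'[#&]egg=([^&]*)')
_hash_re = re.compile(r'(sha1|sha224|sha384|sha256|sha512|md5)=([a-f0-9]+)')

url = 'https://host/pkg-1.0.tar.gz#egg=pkg&sha256=abc123'
egg = _egg_fragment_re.search(url)
digest = _hash_re.search(url)
print(egg.group(1), digest.group(1), digest.group(2))   # pkg sha256 abc123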
        z\\N)rTrrt�
comes_fromr�r�r�)r1rtrr�r�r2r2r3r4�s
z
Link.__init__cCs<|jrd|j}nd}|jr.d|j|j|fSt|j�SdS)Nz (requires-python:%s)r�z%s (from %s)%s)r�rrtr�)r1Zrpr2r2r3r��szLink.__str__cCsd|S)Nz	<Link %s>r2)r1r2r2r3r6�sz
Link.__repr__cCst|t�stS|j|jkS)N)rDr�rErt)r1r=r2r2r3r@�s
zLink.__eq__cCst|t�stS|j|jkS)N)rDr�rErt)r1r=r2r2r3rC�s
zLink.__ne__cCst|t�stS|j|jkS)N)rDr�rErt)r1r=r2r2r3r>�s
zLink.__lt__cCst|t�stS|j|jkS)N)rDr�rErt)r1r=r2r2r3r?�s
zLink.__le__cCst|t�stS|j|jkS)N)rDr�rErt)r1r=r2r2r3rB�s
zLink.__gt__cCst|t�stS|j|jkS)N)rDr�rErt)r1r=r2r2r3rA�s
zLink.__ge__cCs
t|j�S)N)r7rt)r1r2r2r3r8�sz
Link.__hash__cCs8tj|j�\}}}}}tj|jd��p(|}tj|�}|S)Nr�)rer�rtr��basename�rstrip�unquote)r1r�r�rVr�r2r2r3r��s
z
Link.filenamecCstj|j�dS)Nr)rer�rt)r1r2r2r3rg�szLink.schemecCstj|j�dS)Nr�)rer�rt)r1r2r2r3r��szLink.netloccCstjtj|j�d�S)Nr�)rerr�rt)r1r2r2r3rV�sz	Link.pathcCsttj|jjd���S)Nr�)rr�rrVr)r1r2r2r3r�sz
Link.splitextcCs|j�dS)Nr�)r)r1r2r2r3r��szLink.extcCs*tj|j�\}}}}}tj||||df�S)N)rer�rtZ
urlunsplit)r1rgr�rVr�r�r2r2r3�url_without_fragment�szLink.url_without_fragmentz[#&]egg=([^&]*)cCs |jj|j�}|sdS|jd�S)Nr�)�_egg_fragment_rer�rtr�)r1r�r2r2r3r��szLink.egg_fragmentz[#&]subdirectory=([^&]*)cCs |jj|j�}|sdS|jd�S)Nr�)�_subdirectory_fragment_rer�rtr�)r1r�r2r2r3�subdirectory_fragment�szLink.subdirectory_fragmentz2(sha1|sha224|sha384|sha256|sha512|md5)=([a-f0-9]+)cCs |jj|j�}|r|jd�SdS)Nr�)�_hash_rer�rtr�)r1r�r2r2r3r7s
z	Link.hashcCs |jj|j�}|r|jd�SdS)Nr�)rr�rtr�)r1r�r2r2r3�	hash_names
zLink.hash_namecCs$tj|jjdd�djdd�d�S)Nr�r�r�?)r�rrtr�)r1r2r2r3�show_urlsz
Link.show_urlcCs
|jtkS)N)r�r)r1r2r2r3r�sz
Link.is_wheelcCs ddlm}|j|jkrdSdS)z�
        Determines if this points to an actual artifact (e.g. a tarball) or if
        it points to an "abstract" thing like a path or a VCS location.
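# Sketch of the artifact check described above: a link whose scheme belongs to
# a version-control system is "abstract" rather than a downloadable artifact.
# The scheme list is an illustrative subset, not pip's authoritative registry.
VCS_SCHEMES = {'git', 'git+https', 'git+ssh', 'hg', 'svn', 'bzr'}

def is_artifact(url):
    scheme = url.split(':', 1)[0].lower()
    return scheme not in VCS_SCHEMES

print(is_artifact('https://host/pkg-1.0.tar.gz'))   # True
print(is_artifact('git+https://host/repo.git'))     # False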
        r)�vcsFT)r�rrgZall_schemes)r1rr2r2r3�is_artifactszLink.is_artifact)NNN)!rGrHrIr4r�r6r@rCr>r?rBrAr8rr�rgr�rVrr�rr�r�rr�r	r
rr7rrr�rr2r2r2r3r��s8



r�zno_binary only_binarycCs�|jd�}xFd|krP|j�|j�|jd�|d|jd�d�=d|krdSqWx:|D]2}|dkrn|j�qXt|�}|j|�|j|�qXWdS)N�,z:all:r�z:none:)r��clearr�r�r�discard)�value�targetr=�newr�r2r2r3r"5s




cCsjtddg�}||jkr"|jd�n@||jkr8|jd�n*d|jkrN|jd�nd|jkrb|jd�t|�S)Nr�r�z:all:)r[�only_binaryr�	no_binary�	frozenset)�fmt_ctlr�r�r2r2r3r�Hs




r�cCstd|j|j�dS)Nz:all:)r"rr)rr2r2r3�fmt_ctl_no_binaryUsrcCst|�tjdtdd�dS)Nzf--no-use-wheel is deprecated and will be removed in the future.  Please use --no-binary :all: instead.r�)�
stacklevel)rrmrnr)rr2r2r3�fmt_ctl_no_use_wheelZs
rr�zsupplied canonical formats)r$r%r%)r%r&r%)r%r'r%)r%r(r%)r)r%N)r*r%r%)Qr�Z
__future__rZloggingr��collectionsrrcr�rUr�rsr�rmZpip._vendor.six.moves.urllibrrerr�Z
pip.compatrZ	pip.utilsrrr	r
rZpip.utils.deprecationrZpip.utils.loggingr
Zpip.utils.packagingrZpip.exceptionsrrrrZpip.downloadrrrrZ	pip.wheelrrZpip.pep425tagsrZpip._vendorrrrZpip._vendor.packaging.versionr-Zpip._vendor.packaging.utilsrZpip._vendor.packagingrZpip._vendor.requests.exceptionsrZpip._vendor.distlib.compatr �__all__r�Z	getLoggerrGrh�objectr+r#r�rr�r�r�r!r"r�rrr�r2r2r2r3�<module>sl

)d*#



site-packages/pip/__pycache__/baseparser.cpython-36.opt-1.pyc000064400000022016147511334620020007 0ustar003

���e�(�@s�dZddlmZddlZddlZddlZddlZddlZddlm	Z	ddl
mZddlm
Z
ddlmZmZmZmZddlmZmZejd	ej�ZGd
d�dej�ZGdd
�d
e�ZGdd�dej�ZGdd�de�ZdS)zBase option parser setup�)�absolute_importN)�	strtobool)�string_types)�configparser)�legacy_config_file�config_basename�running_under_virtualenv�site_config_files)�appdirs�get_terminal_sizez^PIP_c@sReZdZdZdd�Zdd�Zddd	�Zd
d�Zdd
�Zdd�Z	dd�Z
dd�ZdS)�PrettyHelpFormatterz4A prettier/less verbose help formatter for optparse.cOs:d|d<d|d<t�dd|d<tjj|f|�|�dS)N�Zmax_help_position�Zindent_incrementr��width)r�optparse�IndentedHelpFormatter�__init__)�self�args�kwargs�r� /usr/lib/python3.6/baseparser.pyrszPrettyHelpFormatter.__init__cCs|j|dd�S)Nz <%s>z, )�_format_option_strings)r�optionrrr�format_option_strings!sz)PrettyHelpFormatter.format_option_strings� <%s>�, cCs|g}|jr|j|jd�|jr0|j|jd�t|�dkrH|jd|�|j�rr|jp^|jj�}|j||j��dj	|�S)a
        Return a comma-separated list of option strings and metavars.

        :param option:  tuple of (short opt, long opt), e.g: ('-f', '--format')
        :param mvarfmt: metavar format string - evaluated as mvarfmt % metavar
        :param optsep:  separator
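# Self-contained sketch of the formatting described above: short and long
# option strings are joined with optsep and the metavar is appended through
# the mvarfmt template, matching the "-f, --format <format>" style this help
# formatter produces.
def format_option_strings(short_opts, long_opts, metavar, mvarfmt=' <%s>', optsep=', '):
    opts = []
    if short_opts:
        opts.append(short_opts[0])
    if long_opts:
        opts.append(long_opts[0])
    if len(opts) > 1:
        opts.insert(1, optsep)
    if metavar:
        opts.append(mvarfmt % metavar.lower())
    return ''.join(opts)

print(format_option_strings(['-f'], ['--format'], 'FORMAT'))   # -f, --format <format>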
        rr�)
Z_short_opts�appendZ
_long_opts�len�insertZtakes_value�metavar�dest�lower�join)rrZmvarfmtZoptsepZoptsr"rrrr$sz*PrettyHelpFormatter._format_option_stringscCs|dkrdS|dS)NZOptionsrz:
r)rZheadingrrr�format_heading;sz"PrettyHelpFormatter.format_headingcCsd|jtj|�d�}|S)zz
        Ensure there is only one newline between usage and the first heading
        if there is no description.
        z
Usage: %s
z  )�indent_lines�textwrap�dedent)rZusage�msgrrr�format_usage@sz PrettyHelpFormatter.format_usagecCsV|rNt|jd�rd}nd}|jd�}|j�}|jtj|�d�}d||f}|SdSdS)N�mainZCommandsZDescription�
z  z%s:
%s
r)�hasattr�parser�lstrip�rstripr'r(r))r�descriptionZlabelrrr�format_descriptionHs
z&PrettyHelpFormatter.format_descriptioncCs|r|SdSdS)Nrr)rZepilogrrr�
format_epilogZsz!PrettyHelpFormatter.format_epilogcs"�fdd�|jd�D�}dj|�S)Ncsg|]}�|�qSrr)�.0�line)�indentrr�
<listcomp>bsz4PrettyHelpFormatter.indent_lines.<locals>.<listcomp>r-)�splitr%)r�textr7Z	new_linesr)r7rr'asz PrettyHelpFormatter.indent_linesN)rr)�__name__�
__module__�__qualname__�__doc__rrrr&r+r3r4r'rrrrrs
rc@seZdZdZdd�ZdS)�UpdatingDefaultsHelpFormatterz�Custom help formatter for use in ConfigOptionParser.

    This updates the defaults before expanding them, allowing
    them to show up correctly in the help listing.
    cCs(|jdk	r|jj|jj�tjj||�S)N)r/�_update_defaults�defaultsrr�expand_default)rrrrrrBms
z,UpdatingDefaultsHelpFormatter.expand_defaultN)r;r<r=r>rBrrrrr?fsr?c@s eZdZdd�Zedd��ZdS)�CustomOptionParsercOs(|j||�}|jj�|jj||�|S)z*Insert an OptionGroup at a given position.)Zadd_option_group�
option_groups�popr!)r�idxrr�grouprrr�insert_option_groupus
z&CustomOptionParser.insert_option_groupcCs.|jdd�}x|jD]}|j|j�qW|S)z<Get a list of all options, including those in option groups.N)Zoption_listrD�extend)r�res�irrr�option_list_all~sz"CustomOptionParser.option_list_allN)r;r<r=rH�propertyrLrrrrrCss	rCc@s\eZdZdZdZdd�Zdd�Zdd�Zd	d
�Zdd�Z	d
d�Z
dd�Zdd�Zdd�Z
dS)�ConfigOptionParserzsCustom option parser which updates its defaults by checking the
    configuration files and environmental variablesFcOsZtj�|_|jd�|_|jdd�|_|j�|_|jrB|jj|j�t	j
j|f|�|�dS)N�name�isolatedF)rZRawConfigParser�configrErOrP�get_config_files�files�readr�OptionParserr)rrrrrrr�s

zConfigOptionParser.__init__cCs�tjjdd�}|tjkrgStt�}|jsj|rFtjj|�rF|j	|�n$|j	t
�|j	tjjtj
d�t��t�r�tjjtjt�}tjj|�r�|j	|�|S)NZPIP_CONFIG_FILEFZpip)�os�environ�get�devnull�listr	rP�path�existsrrr%r
Zuser_config_dirrr�sys�prefix)rZconfig_filerSZvenv_config_filerrrrR�s&


z#ConfigOptionParser.get_config_filescCsLy|j||�Stjk
rF}ztd|�tjd�WYdd}~XnXdS)Nz*An error occurred during configuration: %s�)�check_valuerZOptionValueError�printr]�exit)rr�key�val�excrrr�
check_default�s
z ConfigOptionParser.check_defaultc	sji}x(d�jfD]}|j�j�j|���qW�jsH|j�j�j���tj�j��_	t
�}x�|j�D]�\�}|stqf�j����dkr�qf�j
d
kr�t|�}n��j
dkr�|j�}���fdd�|D�}nl�j
d	k�r$|j�j��j�}�j||�}�j�p�f}�j�pi}�j�||�f|�|�n�j��|�}||�j<qfWx|D]�t�j	��|�<�qFWd�_	|S)z�Updates the given defaults with values from the config files and
        the environ. Does a little special handling for certain types of
        options (lists).�globalN�
store_true�store_false�countrcsg|]}�j��|��qSr)rf)r5�v)rcrrrrr8�sz7ConfigOptionParser._update_defaults.<locals>.<listcomp>�callback)rhrirj)rO�update�normalize_keys�get_config_sectionrP�get_environ_varsr�ValuesrA�values�set�itemsZ
get_option�actionrr9�addr#�get_opt_stringZ
convert_valueZ
callback_argsZcallback_kwargsrlrf�getattr)	rrArQZsectionZ	late_evalrd�opt_strrrr)rcrrrr@�s@




z#ConfigOptionParser._update_defaultscCs@i}x6|D].\}}|jdd�}|jd�s0d|}|||<q
W|S)z�Return a config dictionary with normalized keys regardless of
        whether the keys were specified in environment variables or in config
        files�_�-z--z--%s)�replace�
startswith)rrtZ
normalizedrcrdrrrrn�s
z!ConfigOptionParser.normalize_keyscCs|jj|�r|jj|�SgS)z Get a section of a configuration)rQZhas_sectionrt)rrOrrrrosz%ConfigOptionParser.get_config_sectionccs<x6tjj�D](\}}tj|�rtjd|�j�|fVqWdS)z@Returns a generator with all environmental vars with prefix PIP_rN)rVrWrt�_environ_prefix_re�search�subr$)rrcrdrrrrps
z#ConfigOptionParser.get_environ_varscCsn|jstj|j�S|j|jj��}x@|j�D]4}|j|j�}t	|t
�r,|j�}|j||�||j<q,Wtj|�S)z�Overriding to make updating the defaults after instantiation of
        the option parser possible, _update_defaults() does the dirty work.)
Zprocess_default_valuesrrqrAr@�copyZ_get_all_optionsrXr#�
isinstancerrwr`)rrAr�defaultryrrr�get_default_valuess
z%ConfigOptionParser.get_default_valuescCs |jtj�|jdd|�dS)Nrz%s
)Zprint_usager]�stderrrb)rr*rrr�error#szConfigOptionParser.errorN)r;r<r=r>rPrrRrfr@rnrorpr�r�rrrrrN�s
(5rN)r>Z
__future__rr]rrV�rer(Zdistutils.utilrZpip._vendor.sixrZpip._vendor.six.movesrZ
pip.locationsrrrr	Z	pip.utilsr
r�compile�Ir~rrr?rUrCrNrrrr�<module>s O
site-packages/pip/__pycache__/__init__.cpython-36.pyc000064400000020671147511334620016465 0ustar003

���e�.�@s�ddlmZddlZddlZddlZddlZddlZddlZddlZddl	m
Z
ejde
d�yddlZWne
k
r~YnNXejdkr�eedd�dkr�ydd	lmZWne
efk
r�Yn
Xej�dd
lmZmZmZddlmZmZddlmZmZdd
lmZmZm Z m!Z!ddl"m#Z#m$Z$ddl%m&Z&m'Z'ddl%m(Z(ddl	m)Z)ddl*Z+e+j,Z,dZ-ej.e/�Z0ejde)d�dd�Z1dd�Z2dd�Z3dd�Z4d dd�Z5Gdd�de6�Z7e/dk�r�ej8e5��dS)!�)�absolute_importN)�DependencyWarning�ignore)�category�darwinZOPENSSL_VERSION_NUMBERi)�securetransport)�InstallationError�CommandError�PipError)�get_installed_distributions�get_prog)�deprecation�dist_is_editable)�git�	mercurial�
subversion�bazaar)�ConfigOptionParser�UpdatingDefaultsHelpFormatter)�
get_summaries�get_similar_commands)�
commands_dict)�InsecureRequestWarningz9.0.3csZdtjkrdStjdj�dd�}ttjd�}y||d�Wntk
rZd�YnXdd�t�D��g}y�fd	d�|D�d
}Wntk
r�d}YnXt�}|�r�|dkr�tjd�|dkoԈj	d
��rJg}�j
�}x<tdd�D].}|jj	|�r�|j|dd�kr�|j
|j�q�W|�rJx|D]}t|��q.Wtjd�t|�}|dd�|jjD�7}dd�|d|d�D���fdd�|D�}�fdd�|D�}x�|D](}	|	d
}
|	d�r�|
d7}
t|
��q�Wnp�j	d
��s�j	d��r0dd�|jD�}|j
|j�dd�|D�}�dd�|D�7�tdj�fdd��D���tjd�dS)z�Command and option completion for the main option parser (and options)
    and its subcommands (and options).

    Enable by sourcing one of the completion shell scripts (bash, zsh or fish).
    ZPIP_AUTO_COMPLETENZ
COMP_WORDS�Z
COMP_CWORD�cSsg|]\}}|�qS�r)�.0�cmdZsummaryrr�/usr/lib/python3.6/__init__.py�
<listcomp>Usz autocomplete.<locals>.<listcomp>csg|]}|�kr|�qSrr)r�w)�subcommandsrrrYsr�helpZ	uninstall�-T)Z
local_onlycSs&g|]}|jtjkr|j�|jf�qSr)r"�optparse�
SUPPRESS_HELP�get_opt_string�nargs)r�optrrrrqscSsg|]}|jd�d�qS)�=r)�split)r�xrrrrvscs g|]\}}|�kr||f�qSrr)rr+�v)�	prev_optsrrrwscs"g|]\}}|j��r||f�qSr)�
startswith)r�kr,)�currentrrrysr)z--cSsg|]
}|j�qSr)�option_list)r�irrrr�scss|]}|D]
}|Vq
qdS)Nr)r�it�orrr�	<genexpr>�szautocomplete.<locals>.<genexpr>cSs g|]}|jtjkr|j��qSr)r"r$r%r&)rr2rrrr�s� csg|]}|j��r|�qSr)r.)rr+)r0rrr�s)�os�environr*�int�
IndexErrorr�create_main_parser�sys�exitr.�lowerr�key�append�printr�parserZoption_list_allZ
option_groupsr1�join)ZcwordsZcwordZoptionsZsubcommand_namerBZ	installedZlc�distZ
subcommandZoptionZ	opt_labelZoptsr)r0r-r!r�autocompleteEs\








rEcCs�ddt�dt�d�}tf|�}|j�tjjtjjtjjt���}dt	|t
jdd�f|_tj
tj|�}|j|�d|_t�}dgd	d
�|D�}dj|�|_|S)Nz
%prog <command> [options]F�global)ZusageZadd_help_optionZ	formatter�name�progzpip %s from %s (python %s)�TrcSsg|]\}}d||f�qS)z%-27s %sr)rr2�jrrrr�sz&create_main_parser.<locals>.<listcomp>�
)rrrZdisable_interspersed_argsr7�path�dirname�abspath�__file__�__version__r<�version�
cmdoptionsZmake_option_groupZ
general_groupZadd_option_group�mainrrC�description)Z	parser_kwrBZpip_pkg_dirZgen_optsZcommand_summariesrTrrrr;�s"


r;cCs�t�}|j|�\}}|jr>tjj|j�tjjtj�tj�|s\|ddkrlt	|�dkrl|j
�tj�|d}|tkr�t|�}d|g}|r�|j
d|�tdj|���|dd�}|j|�||fS)Nrr"rzunknown command "%s"zmaybe you meant "%s"z - )r;�
parse_argsrQr<�stdout�writer7�linesepr=�lenZ
print_helprrr@r	rC�remove)�argsrBZgeneral_optionsZ	args_else�cmd_nameZguess�msg�cmd_argsrrr�	parseopts�s&	

r_cCsd}d|krd}|S)NFz
--isolatedTr)r[�isolatedrrr�check_isolated�sracCs�|dkrtjdd�}tj�t�yt|�\}}WnJtk
r~}z.tjjd|�tjjt	j
�tjd�WYdd}~XnXytj
tjd�Wn0tjk
r�}ztjd|�WYdd}~XnXt|t|�d�}|j|�S)Nrz	ERROR: %srz%Ignoring error %s when setting locale)r`)r<�argvr
Zinstall_warning_loggerrEr_r
�stderrrWr7rXr=�locale�	setlocale�LC_ALL�Error�logger�debugrrarS)r[r\r^�exc�eZcommandrrrrS�s rSc@sLeZdZffdd�Zejd�Zejd�Zedd��Z	e
dd��Zd	d
�ZdS)�FrozenRequirementcCs||_||_||_||_dS)N)rG�req�editable�comments)�selfrGrmrnrorrr�__init__�szFrozenRequirement.__init__z-r(\d+)$z-(20\d\d\d\d\d\d)$cCs�tjjtjj|j��}g}ddlm}m}t|�r�|j	|�r�d}y|||�}Wn2t
k
r�}	ztjd|	�d}WYdd}	~	XnX|dkr�tjd|�|j
d�|j�}d}n�d}|j�}|j}
t|
�dkr�|
dddks�td|
|f��|
dd}|jj|�}|jj|�}
|�s|
�r�|jd�}|�r:|�j||�}|�sXtjd
|�|j
d�nF|j
d|�|�rx|jd�}nd|
jd�}d}d|||j|�f}||j|||�S)Nr)�vcs�get_src_requirementTzYError when trying to get requirement for VCS system %s, falling back to uneditable formatz-Could not determine repository location of %sz-## !! Could not determine repository locationFr�==�===z5Expected 1 spec with == or ===; specs = %r; dist = %rZsvnz(Warning: cannot find svn location for %szF## FIXME: could not find svn URL in dependency_links for this package:z3# Installing as editable to satisfy requirement %s:z{%s}z%s@%s#egg=%s)rtru)r7rL�normcaserN�location�pip.vcsrrrsrZget_backend_namerrhZwarningr@Zas_requirement�specsrY�AssertionError�_rev_re�search�_date_reZget_backendZget_location�group�egg_nameZproject_name)�clsrDZdependency_linksrwrorrrsrnrmrjryrQZ	ver_matchZ
date_matchZsvn_backendZsvn_locationZrevrrr�	from_distsf
zFrozenRequirement.from_distcCs,|j�}tjd|�}|r(|d|j��}|S)Nz
-py\d\.\d$)r�rer|�start)rDrG�matchrrrrIs
zFrozenRequirement.egg_namecCs2|j}|jrd|}djt|j�t|�g�dS)Nz-e %srK)rmrnrC�listro�str)rprmrrr�__str__QszFrozenRequirement.__str__N)
�__name__�
__module__�__qualname__rqr��compiler{r}�classmethodr��staticmethodrr�rrrrrl�s

Arl�__main__)N)9Z
__future__rrdZloggingr7r$�warningsr<r�Zpip._vendor.urllib3.exceptionsr�filterwarningsZssl�ImportError�platform�getattrZpip._vendor.urllib3.contribr�OSErrorZinject_into_urllib3Zpip.exceptionsrr	r
Z	pip.utilsrrr
rrxrrrrZpip.baseparserrrZpip.commandsrrrrZpip.cmdoptionsZpiprRrPZ	getLoggerr�rhrEr;r_rarS�objectrlr=rrrr�<module>sR


I*	
[
site-packages/pip/__pycache__/basecommand.cpython-36.pyc000064400000016035147511334620017176 0ustar003

���e�.�@s,dZddlmZddlZddlZddlZddlZddlZddlm	Z	ddl
mZddlm
Z
ddlmZddlmZmZmZmZmZdd	lmZdd
lmZmZddlmZmZddlmZm Z m!Z!m"Z"m#Z#dd
l$m%Z%m&Z&m'Z'ddl(m)Z)ddl*m+Z+dgZ,ej-e.�Z/Gdd�de0�Z1Gdd�de1�Z2dS)z(Base Command class, and related routines�)�absolute_importN)�
cmdoptions)�
PackageFinder)�running_under_virtualenv)�
PipSession)�
BadCommand�InstallationError�UninstallationError�CommandError�PreviousBuildDirError)�logging_dictConfig)�ConfigOptionParser�UpdatingDefaultsHelpFormatter)�InstallRequirement�parse_requirements)�SUCCESS�ERROR�
UNKNOWN_ERROR�VIRTUALENV_NOT_FOUND�PREVIOUS_BUILD_DIR_ERROR)�deprecation�get_prog�normalize_path)�IndentingFormatter)�pip_version_check�Commandc@s@eZdZdZdZdZd
Zddd�Zddd�Zd	d
�Z	dd�Z
dS)rNF�ext://sys.stdout�ext://sys.stderrcCsr|jdt�|jft�d|j|j|d�}tf|�|_d|jj�}tj	|j|�|_
tjtj
|j�}|jj|�dS)Nz%s %sF)�usage�prog�	formatterZadd_help_option�name�description�isolatedz
%s Options)rrr!r�__doc__r
�parser�
capitalize�optparseZOptionGroupZcmd_optsrZmake_option_groupZ
general_groupZadd_option_group)�selfr#Z	parser_kwZ
optgroup_nameZgen_opts�r)�!/usr/lib/python3.6/basecommand.py�__init__)szCommand.__init__cCs�t|jrttjj|jd��nd|dk	r*|n|j|jd�}|jrF|j|_	|j
rT|j
|_|js^|rr|dk	rj|n|j|_|jr�|j|jd�|_
|j|j_|S)N�http)�cache�retriesZinsecure_hosts)r,Zhttps)r�	cache_dirr�os�path�joinr.�
trusted_hostsZcertZverifyZclient_cert�timeout�proxyZproxies�no_inputZauthZ	prompting)r(�optionsr.r4�sessionr)r)r*�_build_sessionAs

zCommand._build_sessioncCs|jj|�S)N)r%�
parse_args)r(�argsr)r)r*r:eszCommand.parse_argscs�|j|�\}}|jr8|jdkr"d�|jdkr2d�qHd�n|jrDd�nd��}|jrVd}tddd	d
tjd�idtd
d�i�d|jdd	gdd�dd|jddd�dd|jp�dddd�d�|t	t
ddd|jr�dndg��d�t�fdd�d2D��d"��tj
dd�d3k�rtjd$tj�|j�r(d%tjd&<|j�rBd'j|j�tjd(<|j�rft��sftjd)�tjt��z$y"|j||�}t|t��r�|SW�n�t k
�r�}z tjt!|��tj"d*dd+�t#Sd}~Xn�t$t%t&fk
�r}z tjt!|��tj"d*dd+�t'Sd}~Xn~t(k
�rF}ztjd,|�tj"d*dd+�t'Sd}~XnDt)k
�rrtjd-�tj"d*dd+�t'Stjd.dd+�t*SWd|j+�r�t,|d/d��r�|j-|dt.d0|j/�d1��}t0|�WdQRXXt1S)4N��WARNING�rZCRITICAL�DEBUG�INFOFZexclude_warningsz pip.utils.logging.MaxLevelFilter)z()�level�indentz%(message)s)z()�formatz(pip.utils.logging.ColorizedStreamHandlerr)rA�class�stream�filtersr )rArDrEr z+pip.utils.logging.BetterRotatingFileHandlerz	/dev/nullT)rArD�filenameZdelayr )�console�console_errors�user_logrHrIrJ)rA�handlersc3s&|]}|d�dkrdndifVqdS)rAr@rr=r?N)r@rr))�.0r!)rAr)r*�	<genexpr>�s
zCommand.main.<locals>.<genexpr>�pip._vendor�distlib�requests�urllib3)�versionZdisable_existing_loggersrFZ
formattersrK�rootZloggers�z�Python 2.6 is no longer supported by the Python core team, please upgrade your Python. A future version of pip will drop support for Python 2.6�1ZPIP_NO_INPUT� ZPIP_EXISTS_ACTIONz2Could not find an activated virtualenv (required).zException information:)�exc_infoz	ERROR: %szOperation cancelled by userz
Exception:�no_index�)r.r4)rNrOrPrQ)r>rT)2r:�quiet�verbose�logr�loggingr=r�log_streams�list�filter�dict�sys�version_info�warnings�warnrZPython26DeprecationWarningr6r0�environZ
exists_actionr2Zrequire_venvr�loggerZcritical�exitrZrun�
isinstance�intr�str�debugrrr	rrr
�KeyboardInterruptrZdisable_pip_version_check�getattrr9�minr4rr)r(r;r7Z
root_levelZstatus�excr8r))rAr*�mainis�










zCommand.main)rr)F)NN)�__name__�
__module__�__qualname__r!rZhiddenr^r+r9r:rqr)r)r)r*r#s

$c@s"eZdZedd��Zddd�ZdS)�RequirementCommandc	Cs"x6|jD],}x&t|d||||d�D]}|j|�q"WqWx&|D]}|jtj|d|j|d��q>Wx*|jD] }|jtj||j|j|d��qhWd}	x8|j	D].}x(t|||||d�D]}d}	|j|�q�Wq�W|j
|_
|p�|jp�|	�sd|i}
|j�rd	t|
d
j
|j�d�}nd|
}tj|�dS)
z?
        Marshal cmd line args into a requirement set.
        T)Z
constraint�finderr7r8�wheel_cacheN)r#rw)�default_vcsr#rwF)rvr7r8rwr!z^You must give at least one requirement to %(name)s (maybe you meant "pip %(name)s %(links)s"?)rV)ZlinkszLYou must give at least one requirement to %(name)s (see "pip help %(name)s"))ZconstraintsrZadd_requirementrZ	from_lineZ
isolated_modeZ	editablesZ
from_editablerxZrequirementsZrequire_hashes�
find_linksrar2rgZwarning)Zrequirement_setr;r7rvr8r!rwrGZreqZfound_req_in_fileZopts�msgr)r)r*�populate_requirement_setsF
z+RequirementCommand.populate_requirement_setNc
CsR|jg|j}|jr*tjddj|��g}t|j|j||j	|j
|j|||||d�S)zR
        Create a package finder appropriate to this requirement command.
        zIgnoring indexes: %s�,)ry�format_control�
index_urlsr3Zallow_all_prereleases�process_dependency_linksr8�platformZversions�abi�implementation)Z	index_urlZextra_index_urlsrXrgrlr2rryr}r3Zprer)r(r7r8r�Zpython_versionsr�r�r~r)r)r*�_build_package_finder:s z(RequirementCommand._build_package_finder)NNNN)rrrsrt�staticmethodr{r�r)r)r)r*rus8ru)3r$Z
__future__rr]r0rbr'rdZpiprZ	pip.indexrZ
pip.locationsrZpip.downloadrZpip.exceptionsrrr	r
rZ
pip.compatrZpip.baseparserr
rZpip.reqrrZpip.status_codesrrrrrZ	pip.utilsrrrZpip.utils.loggingrZpip.utils.outdatedr�__all__Z	getLoggerrrrg�objectrrur)r)r)r*�<module>s.
_site-packages/pip/__pycache__/status_codes.cpython-36.opt-1.pyc000064400000000505147511334620020357 0ustar003

���e��@s(ddlmZdZdZdZdZdZdZdS)�)�absolute_import�����N)Z
__future__r�SUCCESSZERRORZ
UNKNOWN_ERRORZVIRTUALENV_NOT_FOUNDZPREVIOUS_BUILD_DIR_ERRORZNO_MATCHES_FOUND�r	r	�"/usr/lib/python3.6/status_codes.py�<module>ssite-packages/pip/__pycache__/index.cpython-36.pyc000064400000074137147511334620016043 0ustar003

���eW��@sdZddlmZddlZddlZddlmZddlZddlZddl	Z	ddl
Z
ddlZddlZddl
Z
ddlmZddlmZddlmZddlmZmZmZmZmZdd	lmZdd
lmZddlm Z ddl!m"Z"m#Z#m$Z$m%Z%dd
l&m'Z'm(Z(m)Z)m*Z*ddl+m,Z,m-Z-ddl.m/Z/ddl0m1Z1m2Z2m3Z3ddl4mZ5ddl6m7Z7ddl8m9Z9ddl:m;Z;ddl<m=Z=dddgZ>d3d4d5d6d7d8gZ?ej@eA�ZBGdd �d eC�ZDGd!d�deC�ZEe
jFd"e
jG�fd#d$�ZHGd%d&�d&eC�ZIGd'd(�d(eC�ZJedd)�ZKd*d�ZLd+d,�ZMd-d.�ZNd/d0�ZOed1d2�ZPdS)9z!Routines related to PyPI, indexes�)�absolute_importN)�
namedtuple)�parse)�request)�	ipaddress)�cached_property�splitext�normalize_path�ARCHIVE_EXTENSIONS�SUPPORTED_EXTENSIONS)�RemovedInPip10Warning)�
indent_log)�check_requires_python)�DistributionNotFound�BestVersionAlreadyInstalled�InvalidWheelFilename�UnsupportedWheel)�HAS_TLS�is_url�path_to_url�url_to_path)�Wheel�	wheel_ext)�
get_supported)�html5lib�requests�six)�canonicalize_name)�
specifiers)�SSLError)�unescape�
FormatControl�fmt_ctl_handle_mutual_exclude�
PackageFinder�https�*�	localhost�127.0.0.0/8�::1/128�file�sshc@s\eZdZdd�Zdd�Zdd�Zdd�Zd	d
�Zdd�Zd
d�Z	dd�Z
dd�Zdd�ZdS)�InstallationCandidatecCs,||_t|�|_||_|j|j|jf|_dS)N)�project�
parse_version�version�location�_key)�selfr,r.r/�r2�/usr/lib/python3.6/index.py�__init__>s
zInstallationCandidate.__init__cCsdj|j|j|j�S)Nz,<InstallationCandidate({0!r}, {1!r}, {2!r})>)�formatr,r.r/)r1r2r2r3�__repr__DszInstallationCandidate.__repr__cCs
t|j�S)N)�hashr0)r1r2r2r3�__hash__IszInstallationCandidate.__hash__cCs|j|dd��S)NcSs||kS)Nr2)�s�or2r2r3�<lambda>Msz.InstallationCandidate.__lt__.<locals>.<lambda>)�_compare)r1�otherr2r2r3�__lt__LszInstallationCandidate.__lt__cCs|j|dd��S)NcSs||kS)Nr2)r9r:r2r2r3r;Psz.InstallationCandidate.__le__.<locals>.<lambda>)r<)r1r=r2r2r3�__le__OszInstallationCandidate.__le__cCs|j|dd��S)NcSs||kS)Nr2)r9r:r2r2r3r;Ssz.InstallationCandidate.__eq__.<locals>.<lambda>)r<)r1r=r2r2r3�__eq__RszInstallationCandidate.__eq__cCs|j|dd��S)NcSs||kS)Nr2)r9r:r2r2r3r;Vsz.InstallationCandidate.__ge__.<locals>.<lambda>)r<)r1r=r2r2r3�__ge__UszInstallationCandidate.__ge__cCs|j|dd��S)NcSs||kS)Nr2)r9r:r2r2r3r;Ysz.InstallationCandidate.__gt__.<locals>.<lambda>)r<)r1r=r2r2r3�__gt__XszInstallationCandidate.__gt__cCs|j|dd��S)NcSs||kS)Nr2)r9r:r2r2r3r;\sz.InstallationCandidate.__ne__.<locals>.<lambda>)r<)r1r=r2r2r3�__ne__[szInstallationCandidate.__ne__cCst|t�stS||j|j�S)N)�
isinstancer+�NotImplementedr0)r1r=�methodr2r2r3r<^s
zInstallationCandidate._compareN)
�__name__�
__module__�__qualname__r4r6r8r>r?r@rArBrCr<r2r2r2r3r+<sr+c	@s�eZdZdZd!dd�Zdd�Zed"dd	��Zd
d�Zdd
�Z	dd�Z
dd�Zdd�Zdd�Z
ejd�Zdd�Zdd�Zdd�Zdd�Zdd �ZdS)#r#z�This finds packages.

    This is meant to match easy_install's technique for looking for
    packages, by reading pages and looking for appropriate links.
    FNcCs�|dkrtd��g|_x:|D]2}|jd�rBt|�}
tjj|
�rB|
}|jj|�qW||_g|_	t
�|_|pvtt
�t
��|_
dd�|r�|ngD�|_||_||_||_t|	||
|d�|_ts�x8tj|j|j�D]$}tj|�}|jdkr�tjd�Pq�WdS)	a�Create a PackageFinder.

        :param format_control: A FormatControl object or None. Used to control
            the selection of source packages / binary packages when consulting
            the index and links.
        :param platform: A string or None. If None, searches for packages
            that are supported by the current system. Otherwise, will find
            packages that can be built on the platform passed in. These
            packages will only be downloaded for distribution: they will
            not be built locally.
        :param versions: A list of strings or None. This is passed directly
            to pep425tags.py in the get_supported() method.
        :param abi: A string or None. This is passed directly
            to pep425tags.py in the get_supported() method.
        :param implementation: A string or None. This is passed directly
            to pep425tags.py in the get_supported() method.
        Nz>PackageFinder() missing 1 required keyword argument: 'session'�~cSsg|]}d|df�qS)r%r2)�.0�hostr2r2r3�
<listcomp>�sz*PackageFinder.__init__.<locals>.<listcomp>)�versions�platform�abi�implr$zipip is configured with locations that require TLS/SSL, however the ssl module in Python is not available.)�	TypeError�
find_links�
startswithr	�os�path�exists�append�
index_urls�dependency_links�set�logged_linksr!�format_control�secure_origins�allow_all_prereleases�process_dependency_links�sessionr�
valid_tagsr�	itertools�chain�urllib_parse�urlparse�scheme�logger�warning)r1rSrYr_Z
trusted_hostsr`rar]rOrNrP�implementation�linkZnew_link�parsedr2r2r3r4ls>	




zPackageFinder.__init__cCs"|jrtjdt�|jj|�dS)NzXDependency Links processing has been deprecated and will be removed in a future release.)r`�warnings�warnrrZ�extend)r1�linksr2r2r3�add_dependency_links�s
z"PackageFinder.add_dependency_linkscs�g�g���fdd�}x�|D]�}tjj|�}|jd�}|s>|r�|rH|}nt|�}tjj|�r�|r�tjj|�}x4tj|�D]}|tjj||��qxWq�|rƈj	|�q�tjj
|�r�||�q�tjd|�qt
|�r܈j	|�qtjd|�qW��fS)zt
        Sort locations into "files" (archives) and "urls", and return
        a pair of lists (files,urls)
        cs8t|�}tj|dd�ddkr*�j|�n
�j|�dS)NF)�strictrz	text/html)r�	mimetypesZ
guess_typerX)rV�url)�files�urlsr2r3�	sort_path�sz0PackageFinder._sort_locations.<locals>.sort_pathzfile:z:Url '%s' is ignored: it is neither a file nor a directory.zQUrl '%s' is ignored. It is either a non-existing path or lacks a specific scheme.)rUrVrWrTr�isdir�realpath�listdir�joinrX�isfilerhrir)�	locations�
expand_dirrwrtZ
is_local_pathZis_file_urlrV�itemr2)rurvr3�_sort_locations�s8



zPackageFinder._sort_locationscCsXt|j�}|jjrHt|jj�}|j|j�s8td|j��|j|j�}n|}|j	|fS)a[
        Function used to generate link sort key for link tuples.
        The greater the return value, the more preferred it is.
        If not finding wheels, then sorted by version only.
        If finding wheels, then the sort order is by version, then:
          1. existing installs
          2. wheels ordered via Wheel.support_index_min(self.valid_tags)
          3. source archives
        Note: it was considered to embed this logic into the Link
              comparison operators, but then different sdist links
              with the same version, would have to be considered equal
        zB%s is not a supported wheel for this platform. It can't be sorted.)
�lenrbr/�is_wheelr�filename�	supportedrZsupport_index_minr.)r1�	candidateZsupport_num�wheelZprir2r2r3�_candidate_sort_key�s

z!PackageFinder._candidate_sort_keyc	Csltjt|��}|j|j|jf}|djdd�d
}�x t|jD�]}||dkr`|ddkr`q@yht	j
t|dtj
�s�|ddkr�|dn|djd��}t	jt|dtj
�r�|dn|djd��}WnJtk
�r|d�r|dj�|dj�k�r|ddk�rw@YnX||k�r q@|d|dk�rP|ddk�rP|ddk	�rPq@dSW|jd|j|j�d	S)Nr�+�r%�utf8�Tz�The repository located at %s is not a trusted or secure host and is being ignored. If this repository is available via HTTPS it is recommended to use HTTPS instead, otherwise you may silence this warning and allow it anyways with '--trusted-host %s'.F���)rerf�strrgZhostnameZport�rsplit�SECURE_ORIGINSr^rZ
ip_addressrDrZ	text_type�decodeZ
ip_network�
ValueError�lowerri)	r1rhr/rl�originZprotocolZ
secure_originZaddrZnetworkr2r2r3�_validate_secure_origins>

z%PackageFinder._validate_secure_origincs �fdd���fdd�|jD�S)z�Returns the locations found via self.index_urls

        Checks the url_name on the main (first in the list) index and
        uses this url_name to produce all locations
        cs,tj|tjt����}|jd�s(|d}|S)N�/)�	posixpathr{reZquoter�endswith)rt�loc)�project_namer2r3�mkurl_pypi_urlhs
z?PackageFinder._get_index_urls_locations.<locals>.mkurl_pypi_urlcsg|]}�|��qSr2r2)rKrt)r�r2r3rMusz;PackageFinder._get_index_urls_locations.<locals>.<listcomp>)rY)r1r�r2)r�r�r3�_get_index_urls_locationsas
z'PackageFinder._get_index_urls_locationscs��j|�}�j|�\}}�j�jdd�\}}�j�j�\}}dd�tj|||�D�}	�fdd�tjdd�|D�dd�|D�d	d�|D��D�}
tjd
t|
�|�x|
D]}tjd|�q�Wt	|�}t
�j|�}
t|||
�}�j
dd��jD�|�}g}xJ�j|
|�D]:}tjd
|j�t��|j�j
|j|��WdQRX�qW�j
dd��jD�|�}|�r|tjddjdd�|D����j
|	|�}|�r�|jdd�tjddjdd�|D���||||S)aFind all available InstallationCandidate for project_name

        This checks index_urls, find_links and dependency_links.
        All versions found are returned as an InstallationCandidate list.

        See _link_package_versions for details on which files are accepted
        T)r~css|]}t|�VqdS)N)�Link)rKrtr2r2r3�	<genexpr>�sz4PackageFinder.find_all_candidates.<locals>.<genexpr>csg|]}�jt|�r|�qSr2)r�rh)rKrk)r1r2r3rM�sz5PackageFinder.find_all_candidates.<locals>.<listcomp>css|]}t|�VqdS)N)r�)rKrtr2r2r3r��scss|]}t|�VqdS)N)r�)rKrtr2r2r3r��scss|]}t|�VqdS)N)r�)rKrtr2r2r3r��sz,%d location(s) to search for versions of %s:z* %scss|]}t|d�VqdS)z-fN)r�)rKrtr2r2r3r��szAnalyzing links from page %sNcss|]}t|�VqdS)N)r�)rKrtr2r2r3r��szdependency_links found: %sz, cSsg|]}|jj�qSr2)r/rt)rKr.r2r2r3rM�s)�reversezLocal files found: %scSsg|]}t|jj��qSr2)rr/rt)rKr�r2r2r3rM�s)r�r�rSrZrcrdrh�debugr�r�fmt_ctl_formatsr]�Search�_package_versions�
_get_pagesrtr
rorpr{�sort)r1r�Zindex_locationsZindex_file_locZ
index_url_locZfl_file_locZ
fl_url_locZdep_file_locZdep_url_locZfile_locationsZ
url_locationsr/�canonical_name�formats�searchZfind_links_versionsZ
page_versions�pageZdependency_versionsZ
file_versionsr2)r1r3�find_all_candidateswsX


 
z!PackageFinder.find_all_candidatesc
s�|j|j�}t|jjdd�|D�|jr,|jndd����fdd�|D�}|r�t||jd�}t|j	dd�r�d	d�|D�}t|�r�t||jd�}q�d
j
|j|j|j	�}|j	j
r�|dj
|j	j
�7}tj|�nd}|jdk	r�t|jj�}nd}|dko�|dk�r0tjd|d
jttdd�|D��td���td|��d}	|�rT|dk�sP|j|k�rTd}	|�r�|dk	�r�|	�rztjd|�ntjd||j�dS|	�r�tjd|d
jt�td���p�d�t�tjd|jd
jt�td���|j	S)z�Try to find a Link matching req

        Expects req, an InstallRequirement, and upgrade, a boolean.
        Returns a Link if found;
        raises DistributionNotFound or BestVersionAlreadyInstalled otherwise.
        cSsg|]}t|j��qSr2)r�r.)rK�cr2r2r3rM�sz2PackageFinder.find_requirement.<locals>.<listcomp>N)Zprereleasescsg|]}t|j��kr|�qSr2)r�r.)rKr�)�compatible_versionsr2r3rM�s)�key�yankedFcSsg|]}t|jdd�s|�qS)r�F)�getattrr/)rKr�r2r2r3rM�sznWARNING: The candidate selected for download or install is a yanked version: '{}' candidate (version {} at {})z
Reason for being yanked: {}zNCould not find a version that satisfies the requirement %s (from versions: %s)z, css|]}t|j�VqdS)N)r�r.)rKr�r2r2r3r�sz1PackageFinder.find_requirement.<locals>.<genexpr>z%No matching distribution found for %sTzLExisting installed version (%s) is most up-to-date and satisfies requirementzUExisting installed version (%s) satisfies requirement (most up-to-date version is %s)z=Installed version (%s) is most up-to-date (past versions: %s)Znonez)Using version %s (newest of versions: %s))r��namer[Z	specifier�filterr_�maxr�r�r/r5r,r.�
yanked_reasonrhriZsatisfied_byr-Zcriticalr{�sortedrr�r)
r1ZreqZupgradeZall_candidatesZapplicable_candidatesZbest_candidateZnonyanked_candidatesZwarning_messageZinstalled_versionZbest_installedr2)r�r3�find_requirement�sx



zPackageFinder.find_requirementccsFt�}x:|D]2}||krq|j|�|j|�}|dkr8q|VqWdS)zp
        Yields (page, page_url) from the given locations, skipping
        locations that have errors.
        N)r[�add�	_get_page)r1r}r��seenr/r�r2r2r3r�Bs


zPackageFinder._get_pagesz-py([123]\.?[0-9]?)$cCsTgg}}t�}x:|D]2}||kr|j|�|jr>|j|�q|j|�qW||S)z�
        Returns elements of links in order, non-egg links first, egg links
        second, while eliminating duplicates
        )r[r��egg_fragmentrX)r1rpZeggsZno_eggsr�rkr2r2r3�_sort_linksUs


zPackageFinder._sort_linkscCs:g}x0|j|�D]"}|j||�}|dk	r|j|�qW|S)N)r��_link_package_versionsrX)r1rpr��resultrk�vr2r2r3r�eszPackageFinder._package_versionscCs(||jkr$tjd||�|jj|�dS)NzSkipping link %s; %s)r\rhr�r�)r1rk�reasonr2r2r3�_log_skipped_linkms
zPackageFinder._log_skipped_linkc
CsJd}|jr|j}|j}�n|j�\}}|s:|j|d�dS|tkrV|j|d|�dSd|jkr~|tkr~|j|d|j�dSd|jkr�|dkr�|j|d�dS|tk�r&yt	|j
�}Wn tk
r�|j|d	�dSXt|j
�|jk�r|j|d
|j�dS|j|j��s |j|d�dS|j}d|jk�rR|tk�rR|j|d
|j�dS|�sft||j|�}|dk�r�|j|d
|j�dS|jj|�}|�r�|d|j��}|jd�}|tjdd�k�r�|j|d�dSyt|j�}	Wn.tjk
�rtjd|j
|j�d}	YnX|	�s.tjd||j�dStjd||�t|j||�S)z'Return an InstallationCandidate or NoneNz
not a filezunsupported archive format: %s�binaryzNo binaries permitted for %sZmacosx10z.zipzmacosx10 onezinvalid wheel filenamezwrong project name (not %s)z%it is not compatible with this Python�sourcezNo sources permitted for %sr��zPython version is incorrectz3Package %s has an invalid Requires-Python entry: %sTz_The package %s is incompatible with the pythonversion in use. Acceptable python versions are:%szFound link %s, version: %s)r��extrr�rr�rZsuppliedrVrr�rrr�Z	canonicalr�rbr.�egg_info_matches�_py_version_rer��start�group�sysr�requires_pythonrZInvalidSpecifierrhr�r+)
r1rkr�r.�egg_infor�r��match�
py_versionZsupport_this_pythonr2r2r3r�rs�





z$PackageFinder._link_package_versionscCstj||jd�S)N)ra)�HTMLPage�get_pagera)r1rkr2r2r3r��szPackageFinder._get_page)	FNFNNNNNN)F)rGrHrI�__doc__r4rq�staticmethodr�r�r�r�r�r�r��re�compiler�r�r�r�r�r�r2r2r2r3r#es(
Q
1GSx
Mz([a-z0-9_.]+)-([a-z0-9_.!+-]+)cCs�|j|�}|stjd|�dS|dkrB|jd�}||jd�d�S|jd�j�}|jdd�}|j�d}|j|�r�|jd�t|�d�SdSdS)axPull the version part out of a string.

    :param egg_info: The string to parse. E.g. foo-2.1
    :param search_name: The name of the package this belongs to. None to
        infer the name. Note that this cannot unambiguously parse strings
        like foo-2-2 which might be foo, 2-2 or foo-2, 2.
    :param link: The link the string came from, for logging on failure.
    z%Could not parse version from link: %sNr�-�_)	r�rhr�r��indexr��replacerTr�)r�Zsearch_namerkZ_egg_info_rer�Z
full_matchr�Zlook_forr2r2r3r��s


r�c@sxeZdZdZddd�Zdd�Zeddd	��Zedd
d��Z	edd
��Z
edd��Ze
dd��Zejdej�Zdd�ZdS)r�z'Represents one page, along with its URLNcCs\d}|r2d|kr2tj|d�\}}d|kr2|d}||_tj|j|dd�|_||_||_dS)NzContent-Type�charsetF)Ztransport_encodingZnamespaceHTMLElements)�cgiZparse_header�contentrrrlrt�headers)r1r�rtr��encoding�content_type�paramsr2r2r3r4�s
zHTMLPage.__init__cCs|jS)N)rt)r1r2r2r3�__str__�szHTMLPage.__str__TcCsl|dkrtd��|j}|jdd�d}ddlm}x>|jD]4}|j�j|�r:|t|�dkr:t	j
d||�dSq:W�y"|r�|j}xHtD]@}|j
|�r�|j||d�}	|	j�jd	�r�Pq�t	j
d
||	�dSq�Wt	j
d|�tj|�\}}
}}}
}|dk�r6tjjtj|���r6|j
d
��s|d
7}tj|d�}t	j
d|�|j|d	dd�d�}|j�|jjdd�}	|	j�jd	��s�t	j
d
||	�dS||j|j|j�}Wn�tjk
�r�}z|j|||�WYdd}~Xn�tk
�r}z"d|}|j|||t	jd�WYdd}~Xn`tj k
�r>}z|j|d||�WYdd}~Xn*tj!k
�rb|j|d|�YnX|SdS)Nz9get_page() missing 1 required keyword argument: 'session'�#r�r)�
VcsSupportz+:zCannot look at %s URL %s)raz	text/htmlz,Skipping page %s because of Content-Type: %szGetting page %sr)r�z
index.htmlz# file: URL is directory, getting %szmax-age=600)ZAcceptz
Cache-Control)r�zContent-Type�unknownz6There was a problem confirming the ssl certificate: %s)�methzconnection error: %sz	timed out)"rRrt�split�pip.vcsr�Zschemesr�rTr�rhr�r�r
r��_get_content_typererfrUrVrx�urllib_requestZurl2pathname�urljoin�get�raise_for_statusr�r�rZ	HTTPError�_handle_failr�info�ConnectionErrorZTimeout)�clsrkZ
skip_archivesrartr�rgr�Zbad_extr��netlocrVr��query�fragment�respZinst�excr�r2r2r3r��sp



$"zHTMLPage.get_pagecCs|dkrtj}|d||�dS)Nz%Could not fetch URL %s: %s - skipping)rhr�)rkr�rtr�r2r2r3r�NszHTMLPage._handle_failcCsDtj|�\}}}}}|dkr dS|j|dd�}|j�|jjdd�S)z;Get the Content-Type of the given url, using a HEAD request�httpr$�T)Zallow_redirectszContent-Type)r�r$)re�urlsplit�headr�r�r�)rtrargr�rVr�r�r�r2r2r3r�UszHTMLPage._get_content_typecCs@dd�|jjd�D�}|r6|djd�r6|djd�S|jSdS)NcSsg|]}|jd�dk	r|�qS)�hrefN)r�)rK�xr2r2r3rMfsz%HTMLPage.base_url.<locals>.<listcomp>z.//baserr�)rl�findallr�rt)r1�basesr2r2r3�base_urlcszHTMLPage.base_urlccs�x�|jjd�D]v}|jd�r|jd�}|jtj|j|��}|jd�}|rPt|�nd}|jddd�}|dk	rrt|�}t||||d�VqWdS)zYields all links in the pagez.//ar�zdata-requires-pythonNzdata-yanked)�default)r�r�)	rlr�r��
clean_linkrer�r�r r�)r1Zanchorr�rtZ	pyrequirer�r2r2r3rpns


zHTMLPage.linksz[^a-z0-9$&+,/:;=?@.#%_\\|-]cCs|jjdd�|�S)z�Makes sure a link is fully encoded.  That is, if a ' ' shows up in
        the link, it will be rewritten to %20 (while not over-quoting
        % or other characters).cSsdt|jd��S)Nz%%%2xr)�ordr�)r�r2r2r3r;�sz%HTMLPage.clean_link.<locals>.<lambda>)�	_clean_re�sub)r1rtr2r2r3r��szHTMLPage.clean_link)N)TN)N)rGrHrIr�r4r��classmethodr�r�r�r�rr��propertyrpr�r��Ir�r�r2r2r2r3r��s
Ur�c@s eZdZd5dd�Zdd�Zdd�Zdd	�Zd
d�Zdd
�Zdd�Z	dd�Z
dd�Zdd�Ze
dd��Ze
dd��Ze
dd��Ze
dd��Zdd�Ze
d d!��Ze
d"d#��Zejd$�Ze
d%d&��Zejd'�Ze
d(d)��Zejd*�Ze
d+d,��Ze
d-d.��Ze
d/d0��Ze
d1d2��Ze
d3d4��Z dS)6r�NcCs@|jd�rt|�}||_||_|r&|nd|_||_|dk	|_dS)a�
        Object representing a parsed link from https://pypi.python.org/simple/*

        url:
            url of the resource pointed to (href of the link)
        comes_from:
            instance of HTMLPage where the link was found, or string.
        requires_python:
            String containing the `Requires-Python` metadata field, specified
            in PEP 345. This may be specified by a data-requires-python
            attribute in the HTML link tag, as described in PEP 503.
        z\\N)rTrrt�
comes_fromr�r�r�)r1rtrr�r�r2r2r3r4�s
z
Link.__init__cCs<|jrd|j}nd}|jr.d|j|j|fSt|j�SdS)Nz (requires-python:%s)r�z%s (from %s)%s)r�rrtr�)r1Zrpr2r2r3r��szLink.__str__cCsd|S)Nz	<Link %s>r2)r1r2r2r3r6�sz
Link.__repr__cCst|t�stS|j|jkS)N)rDr�rErt)r1r=r2r2r3r@�s
zLink.__eq__cCst|t�stS|j|jkS)N)rDr�rErt)r1r=r2r2r3rC�s
zLink.__ne__cCst|t�stS|j|jkS)N)rDr�rErt)r1r=r2r2r3r>�s
zLink.__lt__cCst|t�stS|j|jkS)N)rDr�rErt)r1r=r2r2r3r?�s
zLink.__le__cCst|t�stS|j|jkS)N)rDr�rErt)r1r=r2r2r3rB�s
zLink.__gt__cCst|t�stS|j|jkS)N)rDr�rErt)r1r=r2r2r3rA�s
zLink.__ge__cCs
t|j�S)N)r7rt)r1r2r2r3r8�sz
Link.__hash__cCsJtj|j�\}}}}}tj|jd��p(|}tj|�}|sFtd|j��|S)Nr�zURL %r produced no filename)rer�rtr��basename�rstrip�unquote�AssertionError)r1r�r�rVr�r2r2r3r��s

z
Link.filenamecCstj|j�dS)Nr)rer�rt)r1r2r2r3rg�szLink.schemecCstj|j�dS)Nr�)rer�rt)r1r2r2r3r��szLink.netloccCstjtj|j�d�S)Nr�)rerr�rt)r1r2r2r3rV�sz	Link.pathcCsttj|jjd���S)Nr�)rr�rrVr)r1r2r2r3r�sz
Link.splitextcCs|j�dS)Nr�)r)r1r2r2r3r��szLink.extcCs*tj|j�\}}}}}tj||||df�S)N)rer�rtZ
urlunsplit)r1rgr�rVr�r�r2r2r3�url_without_fragment�szLink.url_without_fragmentz[#&]egg=([^&]*)cCs |jj|j�}|sdS|jd�S)Nr�)�_egg_fragment_rer�rtr�)r1r�r2r2r3r��szLink.egg_fragmentz[#&]subdirectory=([^&]*)cCs |jj|j�}|sdS|jd�S)Nr�)�_subdirectory_fragment_rer�rtr�)r1r�r2r2r3�subdirectory_fragment�szLink.subdirectory_fragmentz2(sha1|sha224|sha384|sha256|sha512|md5)=([a-f0-9]+)cCs |jj|j�}|r|jd�SdS)Nr�)�_hash_rer�rtr�)r1r�r2r2r3r7s
z	Link.hashcCs |jj|j�}|r|jd�SdS)Nr�)rr�rtr�)r1r�r2r2r3�	hash_names
zLink.hash_namecCs$tj|jjdd�djdd�d�S)Nr�r�r�?)r�rrtr�)r1r2r2r3�show_urlsz
Link.show_urlcCs
|jtkS)N)r�r)r1r2r2r3r�sz
Link.is_wheelcCs ddlm}|j|jkrdSdS)z�
        Determines if this points to an actual artifact (e.g. a tarball) or if
        it points to an "abstract" thing like a path or a VCS location.
        r)�vcsFT)r�rrgZall_schemes)r1rr2r2r3�is_artifactszLink.is_artifact)NNN)!rGrHrIr4r�r6r@rCr>r?rBrAr8rr�rgr�rVrr�rr�r�r	r�r
rrr7r
rr�rr2r2r2r3r��s8



r�zno_binary only_binarycCs�|jd�}xFd|krP|j�|j�|jd�|d|jd�d�=d|krdSqWx:|D]2}|dkrn|j�qXt|�}|j|�|j|�qXWdS)N�,z:all:r�z:none:)r��clearr�r�r�discard)�value�targetr=�newr�r2r2r3r"5s




cCsjtddg�}||jkr"|jd�n@||jkr8|jd�n*d|jkrN|jd�nd|jkrb|jd�t|�S)Nr�r�z:all:)r[�only_binaryr�	no_binary�	frozenset)�fmt_ctlr�r�r2r2r3r�Hs




r�cCstd|j|j�dS)Nz:all:)r"rr)rr2r2r3�fmt_ctl_no_binaryUsrcCst|�tjdtdd�dS)Nzf--no-use-wheel is deprecated and will be removed in the future.  Please use --no-binary :all: instead.r�)�
stacklevel)rrmrnr)rr2r2r3�fmt_ctl_no_use_wheelZs
rr�zsupplied canonical formats)r$r%r%)r%r&r%)r%r'r%)r%r(r%)r)r%N)r*r%r%)Qr�Z
__future__rZloggingr��collectionsrrcr�rUr�rsr�rmZpip._vendor.six.moves.urllibrrerr�Z
pip.compatrZ	pip.utilsrrr	r
rZpip.utils.deprecationrZpip.utils.loggingr
Zpip.utils.packagingrZpip.exceptionsrrrrZpip.downloadrrrrZ	pip.wheelrrZpip.pep425tagsrZpip._vendorrrrZpip._vendor.packaging.versionr-Zpip._vendor.packaging.utilsrZpip._vendor.packagingrZpip._vendor.requests.exceptionsrZpip._vendor.distlib.compatr �__all__r�Z	getLoggerrGrh�objectr+r#r�rr�r�r�r!r"r�rrr�r2r2r2r3�<module>sl

)d*#



site-packages/pip/__pycache__/cmdoptions.cpython-36.opt-1.pyc000064400000031164147511334620020043 0ustar003

���eZ@�@sndZddlmZddlmZddlmZmZmZddl	Z	ddl
mZmZm
Z
mZddlmZddlmZmZdd	lmZd
d�Zdd
�Zd�dd�Zeedddddd�Zeedddddd�Zeeddddded�Zeeddd d!dd"d�Zeed#d$d%dd&d�Zeed'd(d)d!dd*d�Zeed+d,d-d.d/d0d1�Zeed2d3dded�Z eed4d5d6d7d8d9�Z!eed:d;d<d=d>d9�Z"eed?d@dAdBdCdDdEdF�Z#eedGdHd6d7ed9�Z$eedIdJd6d7ed9�Z%dKdL�Z&eedMdNd6d/dOdP�Z'eedQdRd6dd/dSdT�Z(eedUdVdWdXdYej)dZd[�Z*d\d]�Z+eed^d_ddd`d�Z,dadb�Z-dcdd�Z.eededfdded�Z/dgdh�Z0eedidfdjded�Z1dkdl�Z2eedmdndjded�Z3eedodpdddqd�Z4drds�Z5dtdu�Z6dvdw�Z7eedxdydzd{d|d}ed~d[�	Z8eedd�dd�ed�Z9eed�d�djd�d�d�Z:d�d��Z;d�d��Z<d�d��Z=d�d��Z>d�d��Z?eed�d�ed}d�d��Z@eed�d�djd�d�ZAeed�d�d�ddd�d�ZBeed�d�d�d�d�d}d�d1�ZCeed�d�dd�d�ZDeed�d�d�d�d�d��ZEeed�d�d�d�d�d��ZFeed�ddd�d��ZGeed�ddd�d��ZHeed�d�ddd�d�ZIeed�d�d�ded�ZJd�d��ZKeed�d�d�eKd�d�d��ZLeed�d�ddd�d�ZMd�eeeeeeee e!e"e#e$e%e&e0e'e(e@eAeIgd��ZNd�e*e+e,e-e4gd��ZOd�eOd�e.e/e1e2e3gd��ZPdS)�aD
shared options and groups

The principle here is to define options once, but *not* instantiate them
globally. One reason being that options with action='append' can carry state
between parses. pip parses general options twice internally, and shouldn't
pass on state. To be consistent, all options will follow this design.

�)�absolute_import)�partial)�OptionGroup�
SUPPRESS_HELP�OptionN)�
FormatControl�fmt_ctl_handle_mutual_exclude�fmt_ctl_no_binary�fmt_ctl_no_use_wheel)�PyPI)�USER_CACHE_DIR�
src_prefix)�
STRONG_HASHEScCs0t||d�}x|dD]}|j|��qW|S)z�
    Return an OptionGroup object
    group  -- assumed to be dict with 'name' and 'options' keys
    parser -- an optparse Parser
    �name�options)rZ
add_option)�group�parserZoption_group�option�r� /usr/lib/python3.6/cmdoptions.py�make_option_groupsrcCs|js|j}t|�dS)N)�	use_wheel�format_controlr
)r�controlrrr�resolve_wheel_no_use_binary$srcsP�dkr|��fdd�}dddg}tt||��rL|j}t|�tjddd	�dS)
z�Disable wheels if per-setup.py call options are set.

    :param options: The OptionParser options to update.
    :param check_options: The options to check, if not supplied defaults to
        options.
    Ncst�|d�S)N)�getattr)�n)�
check_optionsrr�getname4sz+check_install_build_global.<locals>.getnameZ
build_options�global_options�install_optionszeDisabling all use of wheels due to the use of --build-options / --global-options / --install-options.�)�
stacklevel)�any�maprr	�warnings�warn)rrr�namesrr)rr�check_install_build_global*s
r(z-hz--help�helpz
Show help.)�dest�actionr)z
--isolated�
isolated_mode�
store_trueFzSRun pip in an isolated mode, ignoring environment variables and user configuration.)r*r+�defaultr)z--require-virtualenvz--require-venvZrequire_venvz-vz	--verbose�verbose�countzDGive more output. Option is additive, and can be used up to 3 times.z-Vz	--version�versionzShow version and exit.z-qz--quiet�quietz�Give less output. Option is additive, and can be used up to 3 times (corresponding to WARNING, ERROR, and CRITICAL logging levels).z--logz
--log-filez--local-log�log�pathz Path to a verbose appending log.)r*�metavarr)z
--no-input�no_inputz--proxy�proxy�str�z<Specify a proxy in the form [user:passwd@]proxy.server:port.)r*�typer.r)z	--retries�retries�int�zRMaximum number of retries each connection should attempt (default %default times).z	--timeoutz--default-timeoutZsec�timeout�float�z2Set the socket timeout (default %default seconds).)r5r*r:r.r)z
--default-vcs�default_vcsz--skip-requirements-regex�skip_requirements_regexc
Cs"tddddddddggd	d
dd�S)
Nz--exists-action�
exists_actionZchoice�s�i�w�b�a�appendr+zYDefault action when a path already exists: (s)witch, (i)gnore, (w)ipe, (b)ackup, (a)bort.)r*r:�choicesr.r+r5r))rrrrrrC�srCz--cert�certzPath to alternate CA bundle.)r*r:r5r)z
--client-cert�client_certzkPath to SSL client certificate, a single file containing the private key and the certificate in PEM format.)r*r:r.r5r)z-iz--index-urlz
--pypi-url�	index_url�URLz�Base URL of Python Package Index (default %default). This should point to a repository compliant with PEP 503 (the simple repository API) or a local directory laid out in the same format.)r*r5r.r)cCstddddgdd�S)Nz--extra-index-urlZextra_index_urlsrNrIzmExtra URLs of package indexes to use in addition to --index-url. Should follow the same rules as --index-url.)r*r5r+r.r))rrrrr�extra_index_url�srOz
[binary data: the remainder of the preceding compiled bytecode file is omitted. It is the compiled form of pip's command-line option definitions (--no-index, --find-links, --trusted-host, --constraint, --requirement, --editable, --no-binary, --only-binary, --cache-dir, --build, --install-option, --global-option, --pre, --hash, --require-hashes and the related option groups); CPython 3.6 bytecode cannot be reproduced as text.]

[binary data omitted for the following compiled bytecode files; where a matching .py source file exists it appears in full later in this archive:]
site-packages/pip/__pycache__/locations.cpython-36.opt-1.pyc
site-packages/pip/__pycache__/pep425tags.cpython-36.pyc
site-packages/pip/__pycache__/__main__.cpython-36.opt-1.pyc
site-packages/pip/__pycache__/exceptions.cpython-36.pyc
site-packages/pip/models/__pycache__/__init__.cpython-36.pyc
site-packages/pip/models/__pycache__/__init__.cpython-36.opt-1.pyc
site-packages/pip/models/__pycache__/index.cpython-36.pyc
site-packages/pip/models/__pycache__/index.cpython-36.opt-1.pyc

site-packages/pip/models/index.py
from pip._vendor.six.moves.urllib import parse as urllib_parse


class Index(object):
    def __init__(self, url):
        self.url = url
        self.netloc = urllib_parse.urlsplit(url).netloc
        self.simple_url = self.url_to_path('simple')
        self.pypi_url = self.url_to_path('pypi')
        self.pip_json_url = self.url_to_path('pypi/pip/json')

    def url_to_path(self, path):
        return urllib_parse.urljoin(self.url, path)


PyPI = Index('https://pypi.python.org/')
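[Illustrative note: a minimal sketch of how the Index helper above composes URLs via urljoin; the printed values are derived from the constants in this file, and the 'requests' project path is a hypothetical example, not captured output.]

    from pip.models import PyPI

    print(PyPI.simple_url)    # https://pypi.python.org/simple
    print(PyPI.pypi_url)      # https://pypi.python.org/pypi
    print(PyPI.url_to_path('pypi/requests/json'))  # hypothetical project JSON endpoint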
site-packages/pip/models/__init__.py
from pip.models.index import Index, PyPI


__all__ = ["Index", "PyPI"]
site-packages/pip/__main__.py
from __future__ import absolute_import

import os
import sys

# If we are running from a wheel, add the wheel to sys.path
# This allows the usage python pip-*.whl/pip install pip-*.whl
if __package__ == '':
    # __file__ is pip-*.whl/pip/__main__.py
    # first dirname call strips off '/__main__.py', second strips off '/pip'
    # Resulting path is the name of the wheel itself
    # Add that to sys.path so we can import pip
    path = os.path.dirname(os.path.dirname(__file__))
    sys.path.insert(0, path)

import pip  # noqa

if __name__ == '__main__':
    sys.exit(pip.main())
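[Illustrative note: a small sketch of the path arithmetic the comments above describe, using a hypothetical wheel location; invoking pip this way looks like `python pip-*.whl/pip install <package>` as stated in the file's comment.]

    import os.path

    # two dirname() calls walk from .../pip/__main__.py back up to the wheel itself
    p = '/tmp/pip-9.0.1-py2.py3-none-any.whl/pip/__main__.py'   # hypothetical __file__
    print(os.path.dirname(os.path.dirname(p)))
    # -> /tmp/pip-9.0.1-py2.py3-none-any.whl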
site-packages/pip/operations/check.py

def check_requirements(installed_dists):
    missing_reqs_dict = {}
    incompatible_reqs_dict = {}

    for dist in installed_dists:
        key = '%s==%s' % (dist.project_name, dist.version)

        missing_reqs = list(get_missing_reqs(dist, installed_dists))
        if missing_reqs:
            missing_reqs_dict[key] = missing_reqs

        incompatible_reqs = list(get_incompatible_reqs(
            dist, installed_dists))
        if incompatible_reqs:
            incompatible_reqs_dict[key] = incompatible_reqs

    return (missing_reqs_dict, incompatible_reqs_dict)


def get_missing_reqs(dist, installed_dists):
    """Return all of the requirements of `dist` that aren't present in
    `installed_dists`.

    """
    installed_names = set(d.project_name.lower() for d in installed_dists)
    missing_requirements = set()

    for requirement in dist.requires():
        if requirement.project_name.lower() not in installed_names:
            missing_requirements.add(requirement)
            yield requirement


def get_incompatible_reqs(dist, installed_dists):
    """Return all of the requirements of `dist` that are present in
    `installed_dists`, but have incompatible versions.

    """
    installed_dists_by_name = {}
    for installed_dist in installed_dists:
        installed_dists_by_name[installed_dist.project_name] = installed_dist

    for requirement in dist.requires():
        present_dist = installed_dists_by_name.get(requirement.project_name)

        if present_dist and present_dist not in requirement:
            yield (requirement, present_dist)
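[Illustrative note: a minimal sketch of driving check_requirements() with the currently installed distributions, similar in spirit to `pip check`; feeding it pkg_resources.working_set is an assumption, any iterable of distributions works.]

    import pkg_resources
    from pip.operations.check import check_requirements

    missing, incompatible = check_requirements(list(pkg_resources.working_set))
    for package, reqs in missing.items():
        print('%s requires %s, which is not installed'
              % (package, ', '.join(str(r) for r in reqs)))
    for package, pairs in incompatible.items():
        for requirement, present in pairs:
            print('%s requires %s but %s is installed'
                  % (package, requirement, present.version))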
site-packages/pip/operations/__init__.py    [empty file]
site-packages/pip/operations/freeze.py
from __future__ import absolute_import

import logging
import re

import pip
from pip.req import InstallRequirement
from pip.req.req_file import COMMENT_RE
from pip.utils import get_installed_distributions
from pip._vendor import pkg_resources
from pip._vendor.packaging.utils import canonicalize_name
from pip._vendor.pkg_resources import RequirementParseError


logger = logging.getLogger(__name__)


def freeze(
        requirement=None,
        find_links=None, local_only=None, user_only=None, skip_regex=None,
        default_vcs=None,
        isolated=False,
        wheel_cache=None,
        skip=()):
    find_links = find_links or []
    skip_match = None

    if skip_regex:
        skip_match = re.compile(skip_regex).search

    dependency_links = []

    for dist in pkg_resources.working_set:
        if dist.has_metadata('dependency_links.txt'):
            dependency_links.extend(
                dist.get_metadata_lines('dependency_links.txt')
            )
    for link in find_links:
        if '#egg=' in link:
            dependency_links.append(link)
    for link in find_links:
        yield '-f %s' % link
    installations = {}
    for dist in get_installed_distributions(local_only=local_only,
                                            skip=(),
                                            user_only=user_only):
        try:
            req = pip.FrozenRequirement.from_dist(
                dist,
                dependency_links
            )
        except RequirementParseError:
            logger.warning(
                "Could not parse requirement: %s",
                dist.project_name
            )
            continue
        installations[req.name] = req

    if requirement:
        # the options that don't get turned into an InstallRequirement
        # should only be emitted once, even if the same option is in multiple
        # requirements files, so we need to keep track of what has been emitted
        # so that we don't emit it again if it's seen again
        emitted_options = set()
        for req_file_path in requirement:
            with open(req_file_path) as req_file:
                for line in req_file:
                    if (not line.strip() or
                            line.strip().startswith('#') or
                            (skip_match and skip_match(line)) or
                            line.startswith((
                                '-r', '--requirement',
                                '-Z', '--always-unzip',
                                '-f', '--find-links',
                                '-i', '--index-url',
                                '--pre',
                                '--trusted-host',
                                '--process-dependency-links',
                                '--extra-index-url'))):
                        line = line.rstrip()
                        if line not in emitted_options:
                            emitted_options.add(line)
                            yield line
                        continue

                    if line.startswith('-e') or line.startswith('--editable'):
                        if line.startswith('-e'):
                            line = line[2:].strip()
                        else:
                            line = line[len('--editable'):].strip().lstrip('=')
                        line_req = InstallRequirement.from_editable(
                            line,
                            default_vcs=default_vcs,
                            isolated=isolated,
                            wheel_cache=wheel_cache,
                        )
                    else:
                        line_req = InstallRequirement.from_line(
                            COMMENT_RE.sub('', line).strip(),
                            isolated=isolated,
                            wheel_cache=wheel_cache,
                        )

                    if not line_req.name:
                        logger.info(
                            "Skipping line in requirement file [%s] because "
                            "it's not clear what it would install: %s",
                            req_file_path, line.strip(),
                        )
                        logger.info(
                            "  (add #egg=PackageName to the URL to avoid"
                            " this warning)"
                        )
                    elif line_req.name not in installations:
                        logger.warning(
                            "Requirement file [%s] contains %s, but that "
                            "package is not installed",
                            req_file_path, COMMENT_RE.sub('', line).strip(),
                        )
                    else:
                        yield str(installations[line_req.name]).rstrip()
                        del installations[line_req.name]

        yield(
            '## The following requirements were added by '
            'pip freeze:'
        )
    for installation in sorted(
            installations.values(), key=lambda x: x.name.lower()):
        if canonicalize_name(installation.name) not in skip:
            yield str(installation).rstrip()
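[Illustrative note: a minimal sketch of calling the freeze() generator above directly; the skip tuple shown is an assumption, pip's own `freeze` command passes its configured values for these parameters.]

    from pip.operations.freeze import freeze

    for line in freeze(local_only=True, skip=('pip', 'setuptools', 'wheel')):
        print(line)   # one frozen requirement per line, e.g. name==version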
[binary data omitted for the following compiled bytecode files; the corresponding .py sources appear in full elsewhere in this archive:]
site-packages/pip/operations/__pycache__/check.cpython-36.pyc
site-packages/pip/operations/__pycache__/check.cpython-36.opt-1.pyc
site-packages/pip/operations/__pycache__/freeze.cpython-36.opt-1.pyc
site-packages/pip/operations/__pycache__/freeze.cpython-36.pyc
site-packages/pip/operations/__pycache__/__init__.cpython-36.opt-1.pyc
site-packages/pip/operations/__pycache__/__init__.cpython-36.pyc

site-packages/pip/locations.py
"""Locations where we look for configs, install stuff, etc"""
from __future__ import absolute_import

import os
import os.path
import site
import sys

from distutils import sysconfig
from distutils.command.install import install, SCHEME_KEYS  # noqa

from pip.compat import WINDOWS, expanduser
from pip.utils import appdirs


# Application Directories
USER_CACHE_DIR = appdirs.user_cache_dir("pip")


DELETE_MARKER_MESSAGE = '''\
This file is placed here by pip to indicate the source was put
here by pip.

Once this package is successfully installed this source code will be
deleted (unless you remove this file).
'''
PIP_DELETE_MARKER_FILENAME = 'pip-delete-this-directory.txt'


def write_delete_marker_file(directory):
    """
    Write the pip delete marker file into this directory.
    """
    filepath = os.path.join(directory, PIP_DELETE_MARKER_FILENAME)
    with open(filepath, 'w') as marker_fp:
        marker_fp.write(DELETE_MARKER_MESSAGE)


def running_under_virtualenv():
    """
    Return True if we're running inside a virtualenv, False otherwise.

    """
    if hasattr(sys, 'real_prefix'):
        return True
    elif sys.prefix != getattr(sys, "base_prefix", sys.prefix):
        return True

    return False


def virtualenv_no_global():
    """
    Return True if in a venv and no system site packages.
    """
    # this mirrors the logic in virtualenv.py for locating the
    # no-global-site-packages.txt file
    site_mod_dir = os.path.dirname(os.path.abspath(site.__file__))
    no_global_file = os.path.join(site_mod_dir, 'no-global-site-packages.txt')
    if running_under_virtualenv() and os.path.isfile(no_global_file):
        return True


if running_under_virtualenv():
    src_prefix = os.path.join(sys.prefix, 'src')
else:
    # FIXME: keep src in cwd for now (it is not a temporary folder)
    try:
        src_prefix = os.path.join(os.getcwd(), 'src')
    except OSError:
        # In case the current working directory has been renamed or deleted
        sys.exit(
            "The folder you are executing pip from can no longer be found."
        )

# under macOS + virtualenv sys.prefix is not properly resolved
# it is something like /path/to/python/bin/..
# Note: using realpath due to tmp dirs on OSX being symlinks
src_prefix = os.path.abspath(src_prefix)

# FIXME doesn't account for venv linked to global site-packages

site_packages = sysconfig.get_python_lib()
user_site = site.USER_SITE
user_dir = expanduser('~')
if WINDOWS:
    bin_py = os.path.join(sys.prefix, 'Scripts')
    bin_user = os.path.join(user_site, 'Scripts')
    # buildout uses 'bin' on Windows too?
    if not os.path.exists(bin_py):
        bin_py = os.path.join(sys.prefix, 'bin')
        bin_user = os.path.join(user_site, 'bin')

    config_basename = 'pip.ini'

    legacy_storage_dir = os.path.join(user_dir, 'pip')
    legacy_config_file = os.path.join(
        legacy_storage_dir,
        config_basename,
    )
else:
    bin_py = os.path.join(sys.prefix, 'bin')
    bin_user = os.path.join(user_site, 'bin')

    config_basename = 'pip.conf'

    legacy_storage_dir = os.path.join(user_dir, '.pip')
    legacy_config_file = os.path.join(
        legacy_storage_dir,
        config_basename,
    )

    # Forcing to use /usr/local/bin for standard macOS framework installs
    # Also log to ~/Library/Logs/ for use with the Console.app log viewer
    if sys.platform[:6] == 'darwin' and sys.prefix[:16] == '/System/Library/':
        bin_py = '/usr/local/bin'

site_config_files = [
    os.path.join(path, config_basename)
    for path in appdirs.site_config_dirs('pip')
]


def distutils_scheme(dist_name, user=False, home=None, root=None,
                     isolated=False, prefix=None):
    """
    Return a distutils install scheme
    """
    from distutils.dist import Distribution

    scheme = {}

    if isolated:
        extra_dist_args = {"script_args": ["--no-user-cfg"]}
    else:
        extra_dist_args = {}
    dist_args = {'name': dist_name}
    dist_args.update(extra_dist_args)

    d = Distribution(dist_args)
    d.parse_config_files()
    i = d.get_command_obj('install', create=True)
    # NOTE: setting user or home has the side-effect of creating the home dir
    # or user base for installations during finalize_options()
    # ideally, we'd prefer a scheme class that has no side-effects.
    assert not (user and prefix), "user={0} prefix={1}".format(user, prefix)
    i.user = user or i.user
    if user:
        i.prefix = ""
    i.prefix = prefix or i.prefix
    i.home = home or i.home
    i.root = root or i.root
    i.finalize_options()
    for key in SCHEME_KEYS:
        scheme[key] = getattr(i, 'install_' + key)

    # install_lib specified in setup.cfg should install *everything*
    # into there (i.e. it takes precedence over both purelib and
    # platlib).  Note, i.install_lib is *always* set after
    # finalize_options(); we only want to override here if the user
    # has explicitly requested it hence going back to the config
    if 'install_lib' in d.get_option_dict('install'):
        scheme.update(dict(purelib=i.install_lib, platlib=i.install_lib))

    if running_under_virtualenv():
        scheme['headers'] = os.path.join(
            sys.prefix,
            'include',
            'site',
            'python' + sys.version[:3],
            dist_name,
        )

        if root is not None:
            path_no_drive = os.path.splitdrive(
                os.path.abspath(scheme["headers"]))[1]
            scheme["headers"] = os.path.join(
                root,
                path_no_drive[1:],
            )

    return scheme
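[Illustrative note: a minimal sketch of asking distutils_scheme() where a project would be installed under the current interpreter's configuration; 'example-project' is a placeholder name.]

    from pip.locations import distutils_scheme

    scheme = distutils_scheme('example-project')
    for key in ('purelib', 'platlib', 'headers', 'scripts', 'data'):
        print(key, '->', scheme[key])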
site-packages/pip/req/req_set.py
from __future__ import absolute_import

from collections import defaultdict
from itertools import chain
import logging
import os

from pip._vendor import pkg_resources
from pip._vendor import requests

from pip.compat import expanduser
from pip.download import (is_file_url, is_dir_url, is_vcs_url, url_to_path,
                          unpack_url)
from pip.exceptions import (InstallationError, BestVersionAlreadyInstalled,
                            DistributionNotFound, PreviousBuildDirError,
                            HashError, HashErrors, HashUnpinned,
                            DirectoryUrlHashUnsupported, VcsHashUnsupported,
                            UnsupportedPythonVersion)
from pip.req.req_install import InstallRequirement
from pip.utils import (
    display_path, dist_in_usersite, dist_in_install_path, ensure_dir,
    normalize_path)
from pip.utils.hashes import MissingHashes
from pip.utils.logging import indent_log
from pip.utils.packaging import check_dist_requires_python
from pip.vcs import vcs
from pip.wheel import Wheel

logger = logging.getLogger(__name__)


class Requirements(object):

    def __init__(self):
        self._keys = []
        self._dict = {}

    def keys(self):
        return self._keys

    def values(self):
        return [self._dict[key] for key in self._keys]

    def __contains__(self, item):
        return item in self._keys

    def __setitem__(self, key, value):
        if key not in self._keys:
            self._keys.append(key)
        self._dict[key] = value

    def __getitem__(self, key):
        return self._dict[key]

    def __repr__(self):
        values = ['%s: %s' % (repr(k), repr(self[k])) for k in self.keys()]
        return 'Requirements({%s})' % ', '.join(values)
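# Illustrative sketch (editor's comment, not part of pip): the class above is an
# insertion-ordered mapping, so iteration follows the order requirements were added:
#
#     reqs = Requirements()
#     reqs['flask'] = flask_install_req      # hypothetical InstallRequirement
#     reqs['jinja2'] = jinja2_install_req    # hypothetical InstallRequirement
#     reqs.keys()      ->  ['flask', 'jinja2']   (insertion order preserved)
#     'flask' in reqs  ->  True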


class DistAbstraction(object):
    """Abstracts out the wheel vs non-wheel prepare_files logic.

    The requirements for anything installable are as follows:
     - we must be able to determine the requirement name
       (or we can't correctly handle the non-upgrade case).
     - we must be able to generate a list of run-time dependencies
       without installing any additional packages (or we would
       have to either burn time by doing temporary isolated installs
       or alternatively violate pip's 'don't start installing unless
       all requirements are available' rule - neither of which are
       desirable).
     - for packages with setup requirements, we must also be able
       to determine their requirements without installing additional
       packages (for the same reason as run-time dependencies)
     - we must be able to create a Distribution object exposing the
       above metadata.
    """

    def __init__(self, req_to_install):
        self.req_to_install = req_to_install

    def dist(self, finder):
        """Return a setuptools Dist object."""
        raise NotImplementedError(self.dist)

    def prep_for_dist(self):
        """Ensure that we can get a Dist for this requirement."""
        raise NotImplementedError(self.dist)


def make_abstract_dist(req_to_install):
    """Factory to make an abstract dist object.

    Preconditions: Either an editable req with a source_dir, or satisfied_by or
    a wheel link, or a non-editable req with a source_dir.

    :return: A concrete DistAbstraction.
    """
    if req_to_install.editable:
        return IsSDist(req_to_install)
    elif req_to_install.link and req_to_install.link.is_wheel:
        return IsWheel(req_to_install)
    else:
        return IsSDist(req_to_install)
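# Illustrative dispatch summary (editor's comment, not part of pip): given an
# InstallRequirement `req`, make_abstract_dist(req) returns
#   IsSDist(req)  when req.editable is true or the link is not a wheel
#                 (prep_for_dist runs egg_info and checks the source version), and
#   IsWheel(req)  when req.link points at a .whl file
#                 (prep_for_dist is a no-op; metadata is read from the unpacked wheel).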


class IsWheel(DistAbstraction):

    def dist(self, finder):
        return list(pkg_resources.find_distributions(
            self.req_to_install.source_dir))[0]

    def prep_for_dist(self):
        # FIXME:https://github.com/pypa/pip/issues/1112
        pass


class IsSDist(DistAbstraction):

    def dist(self, finder):
        dist = self.req_to_install.get_dist()
        # FIXME: shouldn't be globally added:
        if dist.has_metadata('dependency_links.txt'):
            finder.add_dependency_links(
                dist.get_metadata_lines('dependency_links.txt')
            )
        return dist

    def prep_for_dist(self):
        self.req_to_install.run_egg_info()
        self.req_to_install.assert_source_matches_version()


class Installed(DistAbstraction):

    def dist(self, finder):
        return self.req_to_install.satisfied_by

    def prep_for_dist(self):
        pass


class RequirementSet(object):

    def __init__(self, build_dir, src_dir, download_dir, upgrade=False,
                 upgrade_strategy=None, ignore_installed=False, as_egg=False,
                 target_dir=None, ignore_dependencies=False,
                 force_reinstall=False, use_user_site=False, session=None,
                 pycompile=True, isolated=False, wheel_download_dir=None,
                 wheel_cache=None, require_hashes=False,
                 ignore_requires_python=False):
        """Create a RequirementSet.

        :param wheel_download_dir: Where still-packed .whl files should be
            written to. If None they are written to the download_dir parameter.
            Separate to download_dir to permit only keeping wheel archives for
            pip wheel.
        :param download_dir: Where still packed archives should be written to.
            If None they are not saved, and are deleted immediately after
            unpacking.
        :param wheel_cache: The pip wheel cache, for passing to
            InstallRequirement.
        """
        if session is None:
            raise TypeError(
                "RequirementSet() missing 1 required keyword argument: "
                "'session'"
            )

        self.build_dir = build_dir
        self.src_dir = src_dir
        # XXX: download_dir and wheel_download_dir overlap semantically and may
        # be combined if we're willing to have non-wheel archives present in
        # the wheelhouse output by 'pip wheel'.
        self.download_dir = download_dir
        self.upgrade = upgrade
        self.upgrade_strategy = upgrade_strategy
        self.ignore_installed = ignore_installed
        self.force_reinstall = force_reinstall
        self.requirements = Requirements()
        # Mapping of alias: real_name
        self.requirement_aliases = {}
        self.unnamed_requirements = []
        self.ignore_dependencies = ignore_dependencies
        self.ignore_requires_python = ignore_requires_python
        self.successfully_downloaded = []
        self.successfully_installed = []
        self.reqs_to_cleanup = []
        self.as_egg = as_egg
        self.use_user_site = use_user_site
        self.target_dir = target_dir  # set from --target option
        self.session = session
        self.pycompile = pycompile
        self.isolated = isolated
        if wheel_download_dir:
            wheel_download_dir = normalize_path(wheel_download_dir)
        self.wheel_download_dir = wheel_download_dir
        self._wheel_cache = wheel_cache
        self.require_hashes = require_hashes
        # Maps from install_req -> dependencies_of_install_req
        self._dependencies = defaultdict(list)

    def __str__(self):
        reqs = [req for req in self.requirements.values()
                if not req.comes_from]
        reqs.sort(key=lambda req: req.name.lower())
        return ' '.join([str(req.req) for req in reqs])

    def __repr__(self):
        reqs = [req for req in self.requirements.values()]
        reqs.sort(key=lambda req: req.name.lower())
        reqs_str = ', '.join([str(req.req) for req in reqs])
        return ('<%s object; %d requirement(s): %s>'
                % (self.__class__.__name__, len(reqs), reqs_str))

    def add_requirement(self, install_req, parent_req_name=None,
                        extras_requested=None):
        """Add install_req as a requirement to install.

        :param parent_req_name: The name of the requirement that needed this
            added. The name is used because when multiple unnamed requirements
            resolve to the same name, we could otherwise end up with dependency
            links that point outside the Requirements set. parent_req must
            already be added. Note that None implies that this is a user
            supplied requirement, vs an inferred one.
        :param extras_requested: an iterable of extras used to evaluate the
            environment markers.
        :return: Additional requirements to scan. That is either [] if
            the requirement is not applicable, or [install_req] if the
            requirement is applicable and has just been added.
        """
        name = install_req.name
        if not install_req.match_markers(extras_requested):
            logger.warning("Ignoring %s: markers '%s' don't match your "
                           "environment", install_req.name,
                           install_req.markers)
            return []

        # This check has to come after we filter requirements with the
        # environment markers.
        if install_req.link and install_req.link.is_wheel:
            wheel = Wheel(install_req.link.filename)
            if not wheel.supported():
                raise InstallationError(
                    "%s is not a supported wheel on this platform." %
                    wheel.filename
                )

        install_req.as_egg = self.as_egg
        install_req.use_user_site = self.use_user_site
        install_req.target_dir = self.target_dir
        install_req.pycompile = self.pycompile
        install_req.is_direct = (parent_req_name is None)

        if not name:
            # url or path requirement w/o an egg fragment
            self.unnamed_requirements.append(install_req)
            return [install_req]
        else:
            try:
                existing_req = self.get_requirement(name)
            except KeyError:
                existing_req = None
            if (parent_req_name is None and existing_req and not
                    existing_req.constraint and
                    existing_req.extras == install_req.extras and not
                    existing_req.req.specifier == install_req.req.specifier):
                raise InstallationError(
                    'Double requirement given: %s (already in %s, name=%r)'
                    % (install_req, existing_req, name))
            if not existing_req:
                # Add requirement
                self.requirements[name] = install_req
                # FIXME: what about other normalizations?  E.g., _ vs. -?
                if name.lower() != name:
                    self.requirement_aliases[name.lower()] = name
                result = [install_req]
            else:
                # Assume there's no need to scan, and that we've already
                # encountered this for scanning.
                result = []
                if not install_req.constraint and existing_req.constraint:
                    if (install_req.link and not (existing_req.link and
                       install_req.link.path == existing_req.link.path)):
                        self.reqs_to_cleanup.append(install_req)
                        raise InstallationError(
                            "Could not satisfy constraints for '%s': "
                            "installation from path or url cannot be "
                            "constrained to a version" % name)
                    # If we're now installing a constraint, mark the existing
                    # object for real installation.
                    existing_req.constraint = False
                    existing_req.extras = tuple(
                        sorted(set(existing_req.extras).union(
                               set(install_req.extras))))
                    logger.debug("Setting %s extras to: %s",
                                 existing_req, existing_req.extras)
                    # And now we need to scan this.
                    result = [existing_req]
                # Canonicalise to the already-added object for the backref
                # check below.
                install_req = existing_req
            if parent_req_name:
                parent_req = self.get_requirement(parent_req_name)
                self._dependencies[parent_req].append(install_req)
            return result

    def has_requirement(self, project_name):
        name = project_name.lower()
        if (name in self.requirements and
           not self.requirements[name].constraint or
           name in self.requirement_aliases and
           not self.requirements[self.requirement_aliases[name]].constraint):
            return True
        return False

    @property
    def has_requirements(self):
        return list(req for req in self.requirements.values() if not
                    req.constraint) or self.unnamed_requirements

    @property
    def is_download(self):
        if self.download_dir:
            self.download_dir = expanduser(self.download_dir)
            if os.path.exists(self.download_dir):
                return True
            else:
                logger.critical('Could not find download directory')
                raise InstallationError(
                    "Could not find or access download directory '%s'"
                    % display_path(self.download_dir))
        return False

    def get_requirement(self, project_name):
        for name in project_name, project_name.lower():
            if name in self.requirements:
                return self.requirements[name]
            if name in self.requirement_aliases:
                return self.requirements[self.requirement_aliases[name]]
        raise KeyError("No project with the name %r" % project_name)

    def uninstall(self, auto_confirm=False):
        for req in self.requirements.values():
            if req.constraint:
                continue
            req.uninstall(auto_confirm=auto_confirm)
            req.commit_uninstall()

    def prepare_files(self, finder):
        """
        Prepare process. Create temp directories, download and/or unpack files.
        """
        # make the wheelhouse
        if self.wheel_download_dir:
            ensure_dir(self.wheel_download_dir)

        # If any top-level requirement has a hash specified, enter
        # hash-checking mode, which requires hashes from all.
        root_reqs = self.unnamed_requirements + self.requirements.values()
        require_hashes = (self.require_hashes or
                          any(req.has_hash_options for req in root_reqs))
        if require_hashes and self.as_egg:
            raise InstallationError(
                '--egg is not allowed with --require-hashes mode, since it '
                'delegates dependency resolution to setuptools and could thus '
                'result in installation of unhashed packages.')

        # Actually prepare the files, and collect any exceptions. Most hash
        # exceptions cannot be checked ahead of time, because
        # req.populate_link() needs to be called before we can make decisions
        # based on link type.
        discovered_reqs = []
        hash_errors = HashErrors()
        for req in chain(root_reqs, discovered_reqs):
            try:
                discovered_reqs.extend(self._prepare_file(
                    finder,
                    req,
                    require_hashes=require_hashes,
                    ignore_dependencies=self.ignore_dependencies))
            except HashError as exc:
                exc.req = req
                hash_errors.append(exc)

        if hash_errors:
            raise hash_errors

    def _is_upgrade_allowed(self, req):
        return self.upgrade and (
            self.upgrade_strategy == "eager" or (
                self.upgrade_strategy == "only-if-needed" and req.is_direct
            )
        )

    def _check_skip_installed(self, req_to_install, finder):
        """Check if req_to_install should be skipped.

        This will check if the req is installed, and whether we should upgrade
        or reinstall it, taking into account all the relevant user options.

        After calling this req_to_install will only have satisfied_by set to
        None if the req_to_install is to be upgraded/reinstalled etc. Any
        other value will be a dist recording the current thing installed that
        satisfies the requirement.

        Note that for vcs urls and the like we can't assess skipping in this
        routine - we simply identify that we need to pull the thing down,
        then later on it is pulled down and introspected to assess upgrade/
        reinstalls etc.

        :return: A text reason for why it was skipped, or None.
        """
        # Check whether to upgrade/reinstall this req or not.
        req_to_install.check_if_exists()
        if req_to_install.satisfied_by:
            upgrade_allowed = self._is_upgrade_allowed(req_to_install)

            # Is the best version already installed?
            best_installed = False

            if upgrade_allowed:
                # For link-based requirements we have to pull the tree down
                # and inspect it to assess the version number, so that case
                # is handled further down.
                if not (self.force_reinstall or req_to_install.link):
                    try:
                        finder.find_requirement(
                            req_to_install, upgrade_allowed)
                    except BestVersionAlreadyInstalled:
                        best_installed = True
                    except DistributionNotFound:
                        # No distribution found, so we squash the
                        # error - it will be raised again later when we
                        # re-try the install.
                        # Why don't we just raise here?
                        pass

                if not best_installed:
                    # don't uninstall conflict if user install and
                    # conflict is not user install or conflict lives
                    # in a different path (/usr/lib vs /usr/local/lib/)
                    if not (self.use_user_site and not
                            dist_in_usersite(req_to_install.satisfied_by) or not
                            dist_in_install_path(req_to_install.satisfied_by)):
                        req_to_install.conflicts_with = \
                            req_to_install.satisfied_by
                    req_to_install.satisfied_by = None

            # Figure out a nice message to say why we're skipping this.
            if best_installed:
                skip_reason = 'already up-to-date'
            elif self.upgrade_strategy == "only-if-needed":
                skip_reason = 'not upgraded as not directly required'
            else:
                skip_reason = 'already satisfied'

            return skip_reason
        else:
            return None
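
    # Summary of the possible outcomes (illustrative, derived from the code
    # above): for an already-installed, non-link requirement this returns one of
    #   'already up-to-date'                    - upgrade allowed and the finder
    #                                             reports the best version is in
    #   'not upgraded as not directly required' - "only-if-needed" strategy and
    #                                             the req is just a dependency
    #   'already satisfied'                     - no upgrade was requested
    # and it returns None when nothing installed satisfies the requirement, in
    # which case _prepare_file() goes on to fetch and unpack it.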

    def _prepare_file(self,
                      finder,
                      req_to_install,
                      require_hashes=False,
                      ignore_dependencies=False):
        """Prepare a single requirements file.

        :return: A list of additional InstallRequirements to also install.
        """
        # Tell user what we are doing for this requirement:
        # obtain (editable), skipping, processing (local url), collecting
        # (remote url or package name)
        if req_to_install.constraint or req_to_install.prepared:
            return []

        req_to_install.prepared = True

        # ###################### #
        # # print log messages # #
        # ###################### #
        if req_to_install.editable:
            logger.info('Obtaining %s', req_to_install)
        else:
            # satisfied_by is only evaluated by calling _check_skip_installed,
            # so it must be None here.
            assert req_to_install.satisfied_by is None
            if not self.ignore_installed:
                skip_reason = self._check_skip_installed(
                    req_to_install, finder)

            if req_to_install.satisfied_by:
                assert skip_reason is not None, (
                    '_check_skip_installed returned None but '
                    'req_to_install.satisfied_by is set to %r'
                    % (req_to_install.satisfied_by,))
                logger.info(
                    'Requirement %s: %s', skip_reason,
                    req_to_install)
            else:
                if (req_to_install.link and
                        req_to_install.link.scheme == 'file'):
                    path = url_to_path(req_to_install.link.url)
                    logger.info('Processing %s', display_path(path))
                else:
                    logger.info('Collecting %s', req_to_install)

        with indent_log():
            # ################################ #
            # # vcs update or unpack archive # #
            # ################################ #
            if req_to_install.editable:
                if require_hashes:
                    raise InstallationError(
                        'The editable requirement %s cannot be installed when '
                        'requiring hashes, because there is no single file to '
                        'hash.' % req_to_install)
                req_to_install.ensure_has_source_dir(self.src_dir)
                req_to_install.update_editable(not self.is_download)
                abstract_dist = make_abstract_dist(req_to_install)
                abstract_dist.prep_for_dist()
                if self.is_download:
                    req_to_install.archive(self.download_dir)
                req_to_install.check_if_exists()
            elif req_to_install.satisfied_by:
                if require_hashes:
                    logger.debug(
                        'Since it is already installed, we are trusting this '
                        'package without checking its hash. To ensure a '
                        'completely repeatable environment, install into an '
                        'empty virtualenv.')
                abstract_dist = Installed(req_to_install)
            else:
                # @@ if filesystem packages are not marked
                # editable in a req, a non-deterministic error
                # occurs when the script attempts to unpack the
                # build directory
                req_to_install.ensure_has_source_dir(self.build_dir)
                # If a checkout exists, it's unwise to keep going.  Version
                # inconsistencies are logged later, but do not fail the
                # installation.
                # FIXME: this won't upgrade when there's an existing
                # package unpacked in `req_to_install.source_dir`
                if os.path.exists(
                        os.path.join(req_to_install.source_dir, 'setup.py')):
                    raise PreviousBuildDirError(
                        "pip can't proceed with requirements '%s' due to a"
                        " pre-existing build directory (%s). This is "
                        "likely due to a previous installation that failed"
                        ". pip is being responsible and not assuming it "
                        "can delete this. Please delete it and try again."
                        % (req_to_install, req_to_install.source_dir)
                    )
                req_to_install.populate_link(
                    finder,
                    self._is_upgrade_allowed(req_to_install),
                    require_hashes
                )
                # We can't hit this spot and have populate_link return None.
                # req_to_install.satisfied_by is None here (because we're
                # guarded) and upgrade has no impact except when satisfied_by
                # is not None.
                # Then, inside find_requirement, existing_applicable -> False.
                # If no new versions are found, DistributionNotFound is raised;
                # otherwise a result is guaranteed.
                assert req_to_install.link
                link = req_to_install.link

                # Now that we have the real link, we can tell what kind of
                # requirements we have and raise some more informative errors
                # than otherwise. (For example, we can raise VcsHashUnsupported
                # for a VCS URL rather than HashMissing.)
                if require_hashes:
                    # We could check these first 2 conditions inside
                    # unpack_url and save repetition of conditions, but then
                    # we would report less-useful error messages for
                    # unhashable requirements, complaining that there's no
                    # hash provided.
                    if is_vcs_url(link):
                        raise VcsHashUnsupported()
                    elif is_file_url(link) and is_dir_url(link):
                        raise DirectoryUrlHashUnsupported()
                    if (not req_to_install.original_link and
                            not req_to_install.is_pinned):
                        # Unpinned packages are asking for trouble when a new
                        # version is uploaded. This isn't a security check, but
                        # it saves users a surprising hash mismatch in the
                        # future.
                        #
                        # file:/// URLs aren't pinnable, so don't complain
                        # about them not being pinned.
                        raise HashUnpinned()
                hashes = req_to_install.hashes(
                    trust_internet=not require_hashes)
                if require_hashes and not hashes:
                    # Known-good hashes are missing for this requirement, so
                    # shim it with a facade object that will provoke hash
                    # computation and then raise a HashMissing exception
                    # showing the user what the hash should be.
                    hashes = MissingHashes()

                try:
                    download_dir = self.download_dir
                    # We always delete unpacked sdists after pip runs.
                    autodelete_unpacked = True
                    if req_to_install.link.is_wheel \
                            and self.wheel_download_dir:
                        # when doing 'pip wheel' we download wheels to a
                        # dedicated dir.
                        download_dir = self.wheel_download_dir
                    if req_to_install.link.is_wheel:
                        if download_dir:
                            # When downloading, we only unpack wheels to get
                            # metadata.
                            autodelete_unpacked = True
                        else:
                            # When installing a wheel, we use the unpacked
                            # wheel.
                            autodelete_unpacked = False
                    unpack_url(
                        req_to_install.link, req_to_install.source_dir,
                        download_dir, autodelete_unpacked,
                        session=self.session, hashes=hashes)
                except requests.HTTPError as exc:
                    logger.critical(
                        'Could not install requirement %s because '
                        'of error %s',
                        req_to_install,
                        exc,
                    )
                    raise InstallationError(
                        'Could not install requirement %s because '
                        'of HTTP error %s for URL %s' %
                        (req_to_install, exc, req_to_install.link)
                    )
                abstract_dist = make_abstract_dist(req_to_install)
                abstract_dist.prep_for_dist()
                if self.is_download:
                    # Make a .zip of the source_dir we already created.
                    if req_to_install.link.scheme in vcs.all_schemes:
                        req_to_install.archive(self.download_dir)
                # req_to_install.req is only available after unpack for URL
                # packages; repeat check_if_exists to uninstall-on-upgrade
                # (#14)
                if not self.ignore_installed:
                    req_to_install.check_if_exists()
                if req_to_install.satisfied_by:
                    if self.upgrade or self.ignore_installed:
                        # don't uninstall conflict if user install and
                        # conflict is not user install or conflict lives
                        # in a different path (/usr/lib vs /usr/local/lib/)
                        if not (self.use_user_site and not
                                dist_in_usersite(
                                    req_to_install.satisfied_by) or not
                                dist_in_install_path(req_to_install.satisfied_by)):
                            req_to_install.conflicts_with = \
                                req_to_install.satisfied_by
                        req_to_install.satisfied_by = None
                    else:
                        logger.info(
                            'Requirement already satisfied (use '
                            '--upgrade to upgrade): %s',
                            req_to_install,
                        )

            # ###################### #
            # # parse dependencies # #
            # ###################### #
            dist = abstract_dist.dist(finder)
            try:
                check_dist_requires_python(dist)
            except UnsupportedPythonVersion as e:
                if self.ignore_requires_python:
                    logger.warning(e.args[0])
                else:
                    req_to_install.remove_temporary_source()
                    raise
            more_reqs = []

            def add_req(subreq, extras_requested):
                sub_install_req = InstallRequirement(
                    str(subreq),
                    req_to_install,
                    isolated=self.isolated,
                    wheel_cache=self._wheel_cache,
                )
                more_reqs.extend(self.add_requirement(
                    sub_install_req, req_to_install.name,
                    extras_requested=extras_requested))

            # We add req_to_install before its dependencies, so that we
            # can refer to it when adding dependencies.
            if not self.has_requirement(req_to_install.name):
                # 'unnamed' requirements will get added here
                self.add_requirement(req_to_install, None)

            if not ignore_dependencies:
                if (req_to_install.extras):
                    logger.debug(
                        "Installing extra requirements: %r",
                        ','.join(req_to_install.extras),
                    )
                missing_requested = sorted(
                    set(req_to_install.extras) - set(dist.extras)
                )
                for missing in missing_requested:
                    logger.warning(
                        '%s does not provide the extra \'%s\'',
                        dist, missing
                    )

                available_requested = sorted(
                    set(dist.extras) & set(req_to_install.extras)
                )
                for subreq in dist.requires(available_requested):
                    add_req(subreq, extras_requested=available_requested)

            # cleanup tmp src
            self.reqs_to_cleanup.append(req_to_install)

            if not req_to_install.editable and not req_to_install.satisfied_by:
                # XXX: --no-install leads this to report 'Successfully
                # downloaded' for only non-editable reqs, even though we took
                # action on them.
                self.successfully_downloaded.append(req_to_install)

        return more_reqs

    def cleanup_files(self):
        """Clean up files, remove builds."""
        logger.debug('Cleaning up...')
        with indent_log():
            for req in self.reqs_to_cleanup:
                req.remove_temporary_source()

    def _to_install(self):
        """Create the installation order.

        The installation order is topological - requirements are installed
        before the requiring thing. We break cycles at an arbitrary point,
        and make no other guarantees.
        """
        # The current implementation, which we may change at any point
        # installs the user specified things in the order given, except when
        # dependencies must come earlier to achieve topological order.
        order = []
        ordered_reqs = set()

        def schedule(req):
            if req.satisfied_by or req in ordered_reqs:
                return
            if req.constraint:
                return
            ordered_reqs.add(req)
            for dep in self._dependencies[req]:
                schedule(dep)
            order.append(req)
        for install_req in self.requirements.values():
            schedule(install_req)
        return order
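
    # Worked example (hypothetical requirement names, for illustration only):
    # with self._dependencies == {A: [C], B: [], C: [B]} and user requirements
    # [A, B], schedule(A) recurses into C and then B before appending A, so
    # _to_install() returns [B, C, A] - every dependency precedes whatever
    # requires it, and a dependency cycle is simply cut at whichever member
    # happens to be visited first.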

    def install(self, install_options, global_options=(), *args, **kwargs):
        """
        Install everything in this set (after having downloaded and unpacked
        the packages)
        """
        to_install = self._to_install()

        if to_install:
            logger.info(
                'Installing collected packages: %s',
                ', '.join([req.name for req in to_install]),
            )

        with indent_log():
            for requirement in to_install:
                if requirement.conflicts_with:
                    logger.info(
                        'Found existing installation: %s',
                        requirement.conflicts_with,
                    )
                    with indent_log():
                        requirement.uninstall(auto_confirm=True)
                try:
                    requirement.install(
                        install_options,
                        global_options,
                        *args,
                        **kwargs
                    )
                except:
                    # if install did not succeed, rollback previous uninstall
                    if (requirement.conflicts_with and not
                            requirement.install_succeeded):
                        requirement.rollback_uninstall()
                    raise
                else:
                    if (requirement.conflicts_with and
                            requirement.install_succeeded):
                        requirement.commit_uninstall()
                requirement.remove_temporary_source()

        self.successfully_installed = to_install
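

# Illustrative end-to-end usage of RequirementSet (a sketch only - the real
# callers live in pip's command classes, and the argument values below are
# assumptions, not defaults):
#
#   from pip.download import PipSession
#   from pip.index import PackageFinder
#   from pip.req import InstallRequirement, RequirementSet
#
#   session = PipSession()
#   finder = PackageFinder(find_links=[], index_urls=['https://pypi.org/simple'],
#                          session=session)
#   requirement_set = RequirementSet(build_dir='/tmp/build', src_dir='/tmp/src',
#                                    download_dir=None, session=session)
#   requirement_set.add_requirement(InstallRequirement.from_line('requests'))
#   requirement_set.prepare_files(finder)        # download, unpack, resolve deps
#   requirement_set.install(install_options=[])  # topological install order
#   requirement_set.cleanup_files()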
site-packages/pip/req/__init__.py000064400000000424147511334620012762 0ustar00from __future__ import absolute_import

from .req_install import InstallRequirement
from .req_set import RequirementSet, Requirements
from .req_file import parse_requirements

__all__ = [
    "RequirementSet", "Requirements", "InstallRequirement",
    "parse_requirements",
]
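
# Typical consumption of these re-exports (sketch; PipSession comes from
# pip.download, and a session argument is mandatory for parse_requirements):
#
#   from pip.download import PipSession
#   from pip.req import parse_requirements
#
#   install_reqs = parse_requirements('requirements.txt', session=PipSession())
#   names = [str(r.req) for r in install_reqs]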
site-packages/pip/req/req_file.py000064400000027226147511334620013022 0ustar00"""
Requirements file parsing
"""

from __future__ import absolute_import

import os
import re
import shlex
import sys
import optparse
import warnings

from pip._vendor.six.moves.urllib import parse as urllib_parse
from pip._vendor.six.moves import filterfalse

import pip
from pip.download import get_file_content
from pip.req.req_install import InstallRequirement
from pip.exceptions import (RequirementsFileParseError)
from pip.utils.deprecation import RemovedInPip10Warning
from pip import cmdoptions

__all__ = ['parse_requirements']

SCHEME_RE = re.compile(r'^(http|https|file):', re.I)
COMMENT_RE = re.compile(r'(^|\s)+#.*$')

SUPPORTED_OPTIONS = [
    cmdoptions.constraints,
    cmdoptions.editable,
    cmdoptions.requirements,
    cmdoptions.no_index,
    cmdoptions.index_url,
    cmdoptions.find_links,
    cmdoptions.extra_index_url,
    cmdoptions.allow_external,
    cmdoptions.allow_all_external,
    cmdoptions.no_allow_external,
    cmdoptions.allow_unsafe,
    cmdoptions.no_allow_unsafe,
    cmdoptions.use_wheel,
    cmdoptions.no_use_wheel,
    cmdoptions.always_unzip,
    cmdoptions.no_binary,
    cmdoptions.only_binary,
    cmdoptions.pre,
    cmdoptions.process_dependency_links,
    cmdoptions.trusted_host,
    cmdoptions.require_hashes,
]

# options to be passed to requirements
SUPPORTED_OPTIONS_REQ = [
    cmdoptions.install_options,
    cmdoptions.global_options,
    cmdoptions.hash,
]

# the 'dest' string values
SUPPORTED_OPTIONS_REQ_DEST = [o().dest for o in SUPPORTED_OPTIONS_REQ]


def parse_requirements(filename, finder=None, comes_from=None, options=None,
                       session=None, constraint=False, wheel_cache=None):
    """Parse a requirements file and yield InstallRequirement instances.

    :param filename:    Path or url of requirements file.
    :param finder:      Instance of pip.index.PackageFinder.
    :param comes_from:  Origin description of requirements.
    :param options:     cli options.
    :param session:     Instance of pip.download.PipSession.
    :param constraint:  If true, parsing a constraint file rather than
        requirements file.
    :param wheel_cache: Instance of pip.wheel.WheelCache
    """
    if session is None:
        raise TypeError(
            "parse_requirements() missing 1 required keyword argument: "
            "'session'"
        )

    _, content = get_file_content(
        filename, comes_from=comes_from, session=session
    )

    lines_enum = preprocess(content, options)

    for line_number, line in lines_enum:
        req_iter = process_line(line, filename, line_number, finder,
                                comes_from, options, session, wheel_cache,
                                constraint=constraint)
        for req in req_iter:
            yield req
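
# Example (hypothetical requirements-file content): given a file containing
#
#   --index-url https://example.org/simple
#   requests>=2.0 --hash=sha256:0123456789abcdef
#   -r more-requirements.txt
#
# parse_requirements() yields one InstallRequirement for "requests>=2.0"
# (carrying its per-requirement --hash option), recurses into the nested
# "-r" file, and applies the --index-url line to the finder rather than
# yielding anything for it.  The URL, hash and file names above are made up
# purely for illustration.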


def preprocess(content, options):
    """Split, filter, and join lines, and return a line iterator

    :param content: the content of the requirements file
    :param options: cli options
    """
    lines_enum = enumerate(content.splitlines(), start=1)
    lines_enum = join_lines(lines_enum)
    lines_enum = ignore_comments(lines_enum)
    lines_enum = skip_regex(lines_enum, options)
    return lines_enum
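
# Sketch of the preprocessing pipeline (hypothetical input): the raw content
#
#   requests \
#       >=2.0   # loosely pinned
#
# is enumerated starting at line 1, join_lines() folds the backslash
# continuation into a single logical line that keeps the first line's number,
# ignore_comments() strips the trailing "# loosely pinned" part and drops any
# blank lines, and skip_regex() finally filters out lines matching
# --skip-requirements-regex (when that option is set) before process_line()
# ever sees them.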


def process_line(line, filename, line_number, finder=None, comes_from=None,
                 options=None, session=None, wheel_cache=None,
                 constraint=False):
    """Process a single requirements line; This can result in creating/yielding
    requirements, or updating the finder.

    For lines that contain requirements, the only options that have an effect
    are from SUPPORTED_OPTIONS_REQ, and they are scoped to the
    requirement. Other options from SUPPORTED_OPTIONS may be present, but are
    ignored.

    For lines that do not contain requirements, the only options that have an
    effect are from SUPPORTED_OPTIONS. Options from SUPPORTED_OPTIONS_REQ may
    be present, but are ignored. These lines may contain multiple options
    (although our docs imply only one is supported), and all are parsed and
    affect the finder.

    :param constraint: If True, parsing a constraints file.
    :param options: OptionParser options that we may update
    """
    parser = build_parser()
    defaults = parser.get_default_values()
    defaults.index_url = None
    if finder:
        # `finder.format_control` will be updated during parsing
        defaults.format_control = finder.format_control
    args_str, options_str = break_args_options(line)
    if sys.version_info < (2, 7, 3):
        # Prior to 2.7.3, shlex cannot deal with unicode entries
        options_str = options_str.encode('utf8')
    opts, _ = parser.parse_args(shlex.split(options_str), defaults)

    # preserve for the nested code path
    line_comes_from = '%s %s (line %s)' % (
        '-c' if constraint else '-r', filename, line_number)

    # yield a line requirement
    if args_str:
        isolated = options.isolated_mode if options else False
        if options:
            cmdoptions.check_install_build_global(options, opts)
        # get the options that apply to requirements
        req_options = {}
        for dest in SUPPORTED_OPTIONS_REQ_DEST:
            if dest in opts.__dict__ and opts.__dict__[dest]:
                req_options[dest] = opts.__dict__[dest]
        yield InstallRequirement.from_line(
            args_str, line_comes_from, constraint=constraint,
            isolated=isolated, options=req_options, wheel_cache=wheel_cache
        )

    # yield an editable requirement
    elif opts.editables:
        isolated = options.isolated_mode if options else False
        default_vcs = options.default_vcs if options else None
        yield InstallRequirement.from_editable(
            opts.editables[0], comes_from=line_comes_from,
            constraint=constraint, default_vcs=default_vcs, isolated=isolated,
            wheel_cache=wheel_cache
        )

    # parse a nested requirements file
    elif opts.requirements or opts.constraints:
        if opts.requirements:
            req_path = opts.requirements[0]
            nested_constraint = False
        else:
            req_path = opts.constraints[0]
            nested_constraint = True
        # original file is over http
        if SCHEME_RE.search(filename):
            # do a url join so relative paths work
            req_path = urllib_parse.urljoin(filename, req_path)
        # original file and nested file are paths
        elif not SCHEME_RE.search(req_path):
            # do a join so relative paths work
            req_path = os.path.join(os.path.dirname(filename), req_path)
        # TODO: Why not use `comes_from='-r {} (line {})'` here as well?
        parser = parse_requirements(
            req_path, finder, comes_from, options, session,
            constraint=nested_constraint, wheel_cache=wheel_cache
        )
        for req in parser:
            yield req

    # percolate hash-checking option upward
    elif opts.require_hashes:
        options.require_hashes = opts.require_hashes

    # set finder options
    elif finder:
        if opts.allow_external:
            warnings.warn(
                "--allow-external has been deprecated and will be removed in "
                "the future. Due to changes in the repository protocol, it no "
                "longer has any effect.",
                RemovedInPip10Warning,
            )

        if opts.allow_all_external:
            warnings.warn(
                "--allow-all-external has been deprecated and will be removed "
                "in the future. Due to changes in the repository protocol, it "
                "no longer has any effect.",
                RemovedInPip10Warning,
            )

        if opts.allow_unverified:
            warnings.warn(
                "--allow-unverified has been deprecated and will be removed "
                "in the future. Due to changes in the repository protocol, it "
                "no longer has any effect.",
                RemovedInPip10Warning,
            )

        if opts.index_url:
            finder.index_urls = [opts.index_url]
        if opts.use_wheel is False:
            finder.use_wheel = False
            pip.index.fmt_ctl_no_use_wheel(finder.format_control)
        if opts.no_index is True:
            finder.index_urls = []
        if opts.extra_index_urls:
            finder.index_urls.extend(opts.extra_index_urls)
        if opts.find_links:
            # FIXME: it would be nice to keep track of the source
            # of the find_links: support a find-links local path
            # relative to a requirements file.
            value = opts.find_links[0]
            req_dir = os.path.dirname(os.path.abspath(filename))
            relative_to_reqs_file = os.path.join(req_dir, value)
            if os.path.exists(relative_to_reqs_file):
                value = relative_to_reqs_file
            finder.find_links.append(value)
        if opts.pre:
            finder.allow_all_prereleases = True
        if opts.process_dependency_links:
            finder.process_dependency_links = True
        if opts.trusted_hosts:
            finder.secure_origins.extend(
                ("*", host, "*") for host in opts.trusted_hosts)


def break_args_options(line):
    """Break up the line into an args and options string.  We only want to shlex
    (and then optparse) the options, not the args.  args can contain markers
    which are corrupted by shlex.
    """
    tokens = line.split(' ')
    args = []
    options = tokens[:]
    for token in tokens:
        if token.startswith('-') or token.startswith('--'):
            break
        else:
            args.append(token)
            options.pop(0)
    return ' '.join(args), ' '.join(options)
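
# For instance (illustrative values):
#
#   break_args_options("SomeProject==1.0 --hash=sha256:abc --global-option=x")
#
# returns ("SomeProject==1.0", "--hash=sha256:abc --global-option=x"), so only
# the option part is ever run through shlex/optparse while the requirement
# specifier itself - which may contain environment markers - is left untouched.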


def build_parser():
    """
    Return a parser for parsing requirement lines
    """
    parser = optparse.OptionParser(add_help_option=False)

    option_factories = SUPPORTED_OPTIONS + SUPPORTED_OPTIONS_REQ
    for option_factory in option_factories:
        option = option_factory()
        parser.add_option(option)

    # By default optparse sys.exits on parsing errors. We want to wrap
    # that in our own exception.
    def parser_exit(self, msg):
        raise RequirementsFileParseError(msg)
    parser.exit = parser_exit

    return parser


def join_lines(lines_enum):
    """Joins a line ending in '\' with the previous line (except when following
    comments).  The joined line takes on the index of the first line.
    """
    primary_line_number = None
    new_line = []
    for line_number, line in lines_enum:
        if not line.endswith('\\') or COMMENT_RE.match(line):
            if COMMENT_RE.match(line):
                # this ensures comments are always matched later
                line = ' ' + line
            if new_line:
                new_line.append(line)
                yield primary_line_number, ''.join(new_line)
                new_line = []
            else:
                yield line_number, line
        else:
            if not new_line:
                primary_line_number = line_number
            new_line.append(line.strip('\\'))

    # last line contains \
    if new_line:
        yield primary_line_number, ''.join(new_line)

    # TODO: handle space after '\'.
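
# Example of the joining behaviour (sketch): the two physical lines
#
#   1: "SomeProject \"
#   2: "    --hash=sha256:abc"
#
# come out as the single logical line (1, "SomeProject     --hash=sha256:abc"):
# the trailing backslash is stripped and the continuation keeps the first
# line's number.  A comment line is never treated as a continuation starter,
# even if it happens to end in a backslash.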


def ignore_comments(lines_enum):
    """
    Strips comments and filters empty lines.
    """
    for line_number, line in lines_enum:
        line = COMMENT_RE.sub('', line)
        line = line.strip()
        if line:
            yield line_number, line


def skip_regex(lines_enum, options):
    """
    Skip lines that match '--skip-requirements-regex' pattern

    Note: the regex pattern is only built once
    """
    skip_regex = options.skip_requirements_regex if options else None
    if skip_regex:
        pattern = re.compile(skip_regex)
        lines_enum = filterfalse(
            lambda e: pattern.search(e[1]),
            lines_enum)
    return lines_enum
site-packages/pip/req/__pycache__/req_file.cpython-36.pyc000064400000020322147511334620017274 0ustar00[compiled CPython 3.6 bytecode for req_file.py - binary content omitted]
site-packages/pip/req/__pycache__/req_set.cpython-36.pyc000064400000051035147511334620017155 0ustar00[compiled CPython 3.6 bytecode for req_set.py - binary content omitted]
site-packages/pip/req/__pycache__/req_file.cpython-36.opt-1.pyc000064400000020322147511334620020233 0ustar00[compiled CPython 3.6 optimized bytecode for req_file.py - binary content omitted]
site-packages/pip/req/__pycache__/__init__.cpython-36.opt-1.pyc000064400000000603147511334620020204 0ustar00[compiled CPython 3.6 optimized bytecode for __init__.py - binary content omitted]
site-packages/pip/req/__pycache__/req_install.cpython-36.pyc000064400000072766147511334620020042 0ustar00[compiled CPython 3.6 bytecode for req_install.py - binary content omitted]
isinstancer�string_typesrr
�osr<�sep�any�	operators�	traceback�
format_excrrFrJ�
comes_from�
constraint�
source_dir�editable�_wheel_cache�link�
original_link�as_egg�markers�marker�_egg_info_path�satisfied_by�conflicts_with�_temp_build_dir�_ideal_build_dir�update�install_succeeded�uninstalled�nothing_to_uninstall�
use_user_site�
target_dir�options�	pycompileZprepared�isolated)�selfrJrSrUrVrXrZrbrir[rjrh�wheel_cacherT�add_msgr?)rJr@�__init__KsN zInstallRequirement.__init__cCspddlm}t||�\}	}
}|
jd�r0t|
�}nd}||	||d||
�|||rP|ni|d�	}
|dk	rlt|�|
_|
S)Nr)�Linkzfile:T)rUrVrXrTrjrhrl)�	pip.indexro�parse_editable�
startswithrrFr>)�cls�editable_reqrS�default_vcsrjrhrlrTro�name�urlZextras_overriderU�resr?r?r@�
from_editable�s 



z InstallRequirement.from_editablec
Cs�ddlm}t|�rd}nd}||krR|j|d�\}}	|	j�}	|	sHd}	qVt|	�}	nd}	|j�}d}
tjjtjj	|��}d}d}
t|�r�||�}n�t
|�\}}
tjj|�r�tjj|ks�|j
d�r�t|�s�td|��|t|��}n0t|��rtjj|��stjd	|�|t|��}|�r||jd
k�rPtjd|j��rP|ttjjtjj	|j����}|j�rtt|j�}d|j|jf}
n|j}
n|}
|�r�|ni}||
|||	||||d
�}|
�r�tt d|
�j!�|_!|S)z�Creates an InstallRequirement from a name, which might be a
        requirement, directory containing 'setup.py', filename, or URL.
        r)roz; �;r7N�.z;Directory %r is not installable. File 'setup.py' not found.zARequirement %r looks like a filename, but the file does not exist�filez\.\./z%s==%s)rXr[rjrhrlrT�placeholder)"rpror�split�stripr	rMr<�normpath�abspathrA�isdirrNrrr!rrr�isfile�logger�warning�schemer9�searchrw�is_wheelr6�filenamerv�version�egg_fragmentrFrr>)rsrvrSrjrhrlrTroZ
marker_sepr[rJr<rXr>�p�wheelrxr?r?r@�	from_line�sb





zInstallRequirement.from_linecCs�|jr(t|j�}|jr:|d|jj7}n|jr6|jjnd}|jdk	rX|dt|jj�7}|jr�t|jt	j
�rt|j}n
|jj�}|r�|d|7}|S)Nz from %sz in %sz
 (from %s))rJ�strrXrwr^r�locationrSrKrrL�	from_path)rk�srSr?r?r@�__str__�s


zInstallRequirement.__str__cCsd|jjt|�|jfS)Nz<%s object: %s editable=%r>)�	__class__�__name__r�rV)rkr?r?r@�__repr__szInstallRequirement.__repr__cCs^|jdkr|j||�|_|jdk	rZ|rZ|j}|jj|j|j�|_||jkrZtjd|j�dS)aEnsure that if a link can be found for this, that it is found.

        Note that self.link may still be None - if Upgrade is False and the
        requirement is already installed.

        If require_hashes is True, don't use the wheel cache, because cached
        wheels, always built locally, have different hashes than the files
        downloaded from the index server and thus throw false hash mismatches.
        Furthermore, cached wheels at present have undeterministic contents due
        to file modification times.
        NzUsing cached wheel link: %s)rXZfind_requirementrWZcached_wheelrvr��debug)rk�finderZupgradeZrequire_hashesZold_linkr?r?r@�
populate_link	s

z InstallRequirement.populate_linkcCs|jjS)N)rJ�	specifier)rkr?r?r@r�szInstallRequirement.specifiercCs$|j}t|�dko"tt|��jdkS)z�Return whether I am pinned to an exact version.

        For example, some-package==1.2 is pinned; some-package>1.2 is not.
        r7�==�===)r�r�)r��len�next�iter�operator)rkrr?r?r@�	is_pinned!szInstallRequirement.is_pinnedcCsR|jdkrdSt|j�}|jrNt|jtj�r4|j}n
|jj�}|rN|d|7}|S)Nz->)rJr�rSrKrrLr�)rkr�rSr?r?r@r�+s


zInstallRequirement.from_pathcCs�|jdk	r|jS|jdkr<tjjtjdd��|_||_|jS|jrN|j	j
�}n|j	}tjj|�sttj
d|�t|�tjj||�S)Nz-buildzpip-zCreating directory %s)r`rJrMr<�realpath�tempfile�mkdtemprarVrv�lower�existsr�r�r)�join)rk�	build_dirrvr?r?r@�build_location8s

z!InstallRequirement.build_locationcCs�|jdk	rdS|jdk	st�|js&t�|js0t�|j}d|_|j|j�}tjj|�rdt	dt
|���tjd|t
|�t
|��t
j||�||_d|_||_d|_dS)a�Move self._temp_build_dir to self._ideal_build_dir/self.req.name

        For some requirements (e.g. a path to a directory), the name of the
        package is not available until we run egg_info, so the build_location
        will return a temporary directory and store the _ideal_build_dir.

        This is only called by self.egg_info_path to fix the temporary build
        directory.
        Nz<A package already exists in %s; please remove it to continuez,Moving package %s from %s to new location %s)rUrJ�AssertionErrorr`rar�rMr<r�rrr�r��shutil�mover])rkZold_locationZnew_locationr?r?r@�_correct_build_locationSs(



z*InstallRequirement._correct_build_locationcCs |jdkrdSttj|jj��S)N)rJrrZ	safe_namerv)rkr?r?r@rvss
zInstallRequirement.namecCstjj|j|jr|jjpd�S)N�)rMr<r�rUrXZsubdirectory_fragment)rkr?r?r@�setup_py_diryszInstallRequirement.setup_py_dircCs�|jstd|��yddl}Wn:tk
rXtd�dkr@d}ntj�}td|��YnXtj	j
|jd�}tj
r�t|tj�r�|jtj��}|S)NzNo source dir for %sr�
setuptoolszPlease install setuptools.zWCould not import setuptools which is required to install from a source distribution.
%szsetup.py)rUr�r��ImportErrorr+rQrRrrMr<r�r�r�PY2rKZ	text_type�encode�sys�getfilesystemencoding)rkr�rm�setup_pyr?r?r@r�szInstallRequirement.setup_pyc
CsZ|js
t�|jr$tjd|j|j�ntjd|j|j�t��xt|j}t	j
d|g}|jrd|dg7}|dg}|jrzg}n t
jj|jd�}t|�ddg}t|||jdd	d
�WdQRX|j�stt|j�d�t�r�d}nd
}tdj|j�d||j�dg��|_|j�nDt|j�d�}t|jj�|k�rVtjd|j|j||j�t|�|_dS)Nz2Running setup.py (path:%s) egg_info for package %sz7Running setup.py (path:%s) egg_info for package from %sz-cz
--no-user-cfg�egg_infozpip-egg-infoz
--egg-baseFzpython setup.py egg_info)�cwd�show_stdoutZcommand_descr
z==z===r��NamezuRunning setup.py (path:%s) egg_info for package %s produced metadata for project name %s. Fix your #egg=%s fragments.)rUr�rvr�r�r�rXr0r1r��
executablerjrVrMr<r�r�r*r&rJrK�
parse_version�pkg_infor
rr�rr�)rk�scriptZbase_cmdZegg_info_cmdZegg_base_option�egg_info_dirrIZ
metadata_namer?r?r@�run_egg_info�sP





zInstallRequirement.run_egg_infocCsV|jdk	r&|jj|�sdS|jj|�S|js0t�|j|�}tjj|�sJdSt	|�}|S)N)
r^�has_metadata�get_metadatarUr��
egg_info_pathrMr<r�r')rkr��datar?r?r@�
egg_info_data�s


z InstallRequirement.egg_info_datacs||jdk�rl|jr|j}ntjj|jd�}tj|�}|j�rg}x�tj|�D]�\�}}x t	j
D]}||kr^|j|�q^Wxjt|�D]^}tjj
tjj�|dd��s�tjjtjj�|dd��r�|j|�q�|dks�|dkr�|j|�q�W|j�fdd	�|D��qLWd
d	�|D�}|�s$td||f��|�s:td||f��t|�dk�rX|jd
d�d�tjj||d�|_tjj|j|�S)Nzpip-egg-info�bin�pythonZScriptsz
Python.exeZtestZtestscsg|]}tjj�|��qSr?)rMr<r�)rB�dir)�rootr?r@�
<listcomp>�sz4InstallRequirement.egg_info_path.<locals>.<listcomp>cSsg|]}|jd�r|�qS)z	.egg-info)�endswith)rB�fr?r?r@r��sz$No files/directories in %s (from %s)r7cSs(|jtjj�tjjr"|jtjj�p$dS)Nr)�countrMr<rN�altsep)�xr?r?r@�<lambda>
sz2InstallRequirement.egg_info_path.<locals>.<lambda>)�keyr)r]rVrUrMr<r�r��listdir�walkr4�dirnames�remove�list�lexistsr��extendrr�r��sort)rkr��base�	filenames�dirs�filesr�r?)r�r@r��sB
z InstallRequirement.egg_info_pathcCs@t�}|jd�}|s*tjdt|jd���|j|p4d�|j�S)NzPKG-INFOzNo PKG-INFO file found in %sr�)rr�r�r�rr�Zfeed�close)rkr�r�r?r?r@r�s
zInstallRequirement.pkg_infoz	\[(.*?)\]cCs
t|j�S)N)r+rv)rkr?r?r@�installed_version sz$InstallRequirement.installed_versioncCsV|js
t�|j�d}|jjr<||jjkr<tjd||j�ntjdt	|j�||�dS)Nr�z'Requested %s, but installing version %sz;Source in %s has version %s, which satisfies requirement %s)
rUr�r�rJr�r�r�r�r�r)rkr�r?r?r@�assert_source_matches_version$s

z0InstallRequirement.assert_source_matches_versioncCs�|jstjd|j�dS|js"t�|js,t�|jjdkr<dSd|jjksXtd|jj��|jsbdS|jjj	dd�\}}t
j|�}|r�||jj�}|r�|j|j�q�|j
|j�nds�td|j|f��dS)Nz>Cannot update repository at %s; repository location is unknownr|�+zbad url: %rr7rz+Unexpected version control type (in %s): %s)rXr�r�rUrVr�r�rwrbr~r4�get_backend�obtainZexport)rkr��vc_typerw�backendZvcs_backendr?r?r@�update_editable5s,


z"InstallRequirement.update_editablecsx|j�std|jf��|jp"|j}t|j�}t|�sTtj	d|j
|tj�d|_
dS|t�krxtj	d|j
|�d|_
dSt|�}t|�}djtj|j��}|jo�tjj|j�}t|jdd�}|o�|jjd�o�|jj|��r�|j|j�|jd	��r2x�|jd	�j�D](}	tjj tjj!|j|	��}
|j|
��qWn�|jd
��r|jd��rV|jd��ng�xj�fdd
�|jd
�j�D�D]J}tjj!|j|�}
|j|
�|j|
d�|j|
d�|j|
d��qxW�nH|�r�t"j#dj|j�t$�|j|��n |jjd��rH|j|j�tjj%|j�d}tjj!tjj&|j�d�}
|j'|
d|�n�|�r�|jjd��r�x�t(j)j*|�D]}
|j|
��qjWn�|�rt+|d��}tjj,|j-�j.��}WdQRX||jk�s�t/d||j|jf��|j|�tjj!tjj&|�d�}
|j'|
|j�ntj0d||j�|jd��r�|j1d��r�xZ|j2d�D]L}t3|��rJt4}nt5}|jtjj!||��t6�r6|jtjj!||�d��q6W|jd��rdt7j8�r�i}ndd#i}t9j:f|�}|j;t<|j=d���|j>d��rdx�|j?d�D]�\}}t3|��r�t4}nt5}|jtjj!||��t6�r�|jtjj!||�d �|jtjj!||�d!�|jtjj!||�d"��q�W|j@|�||_AdS)$a�
        Uninstall the distribution currently satisfying this requirement.

        Prompts before removing or modifying files unless
        ``auto_confirm`` is True.

        Refuses to delete or modify files outside of ``sys.prefix`` -
        thus uninstallation within a virtual environment can only
        modify that virtual environment, even if the virtualenv is
        linked to global site-packages.

        z.Cannot uninstall requirement %s, not installedz1Not uninstalling %s at %s, outside environment %sTNz<Not uninstalling %s at %s, as it is in the standard library.z{0}.egg-infor<z	.egg-infozinstalled-files.txtz
top_level.txtznamespace_packages.txtcsg|]}|r|�kr|�qSr?r?)rBr�)�
namespacesr?r@r��sz0InstallRequirement.uninstall.<locals>.<listcomp>z.pyz.pycz.pyoz�Uninstalling a distutils installed project ({0}) has been deprecated and will be removed in a future version. This is due to the fact that uninstalling a distutils project will only partially uninstall the project.z.eggr7zeasy-install.pthz./z
.dist-info�rz;Egg-link %s does not match installed location of %s (at %s)z)Not sure how to uninstall: %s - Check: %s�scriptsz.batzentry_points.txtZ
delimitersrHZconsole_scriptsz.exez
.exe.manifestz
-script.py)rH)B�check_if_existsrrvr^r_r,r�r-r��infor�r��prefixrerr3r%�formatrZto_filename�project_namer�rMr<r��getattrZ	_providerr��addr�r��
splitlinesr�r��warnings�warnr/r~�dirnameZadd_pth�pipr�Zuninstallation_paths�open�normcase�readlinerr�r�Zmetadata_isdirZmetadata_listdirr"rrrrr�rZSafeConfigParserZreadfpr(Zget_metadata_linesZhas_section�itemsr�rd)rkZauto_confirmZdistZ	dist_pathZpaths_to_removeZdevelop_egg_linkZdevelop_egg_link_egg_infoZegg_info_existsZdistutils_egg_infoZinstalled_filer<Z
top_level_pkgZeasy_install_eggZeasy_install_pthZfhZlink_pointerr�Zbin_dirrh�configrv�valuer?)r�r@�	uninstallRs�









zInstallRequirement.uninstallcCs$|jr|jj�ntjd|j�dS)Nz'Can't rollback %s, nothing uninstalled.)rdZrollbackr��errorrv)rkr?r?r@�rollback_uninstall�sz%InstallRequirement.rollback_uninstallcCs*|jr|jj�n|js&tjd|j�dS)Nz%Can't commit %s, nothing uninstalled.)rdZcommitrer�r�rv)rkr?r?r@�commit_uninstall�s
z#InstallRequirement.commit_uninstallcCs�|js
t�d}d|j|j�df}tjj||�}tjj|�r�tdt	|�d�}|dkr^d	}nj|dkr�t
jd
t	|��tj|�nF|dkr�t
|�}t
jdt	|�t	|��tj||�n|dkr�tjd�|�r�tj|dtjdd
�}tjjtjj|j��}x�tj|�D]�\}	}
}d|
k�r"|
jd�xR|
D]J}tjj|	|�}|j||�}
tj|jd|
d�}d|_|j|d��q(WxL|D]D}|tk�r��q|tjj|	|�}|j||�}
|j||jd|
��q|W�qW|j�t
j dt	|��dS)NTz	%s-%s.zipr�z8The file %s exists. (i)gnore, (w)ipe, (b)ackup, (a)bort �i�w�b�aFzDeleting %szBacking up %s to %sr7)Z
allowZip64zpip-egg-info�/i��r�zSaved %s)rrrr���i�)!rUr�rvr�rMr<r�r�rrr�r�r�r r�r�r��exit�zipfileZZipFileZZIP_DEFLATEDr�r�r�r��_clean_zip_nameZZipInfoZ
external_attrZwritestrr�writer�r�)rkr�Zcreate_archiveZarchive_nameZarchive_pathZresponseZ	dest_file�zipr��dirpathr�r�r�rvZzipdirr�r?r?r@�archivesX







"zInstallRequirement.archivecCsJ|j|tjj�s"td||f��|t|�dd�}|jtjjd�}|S)Nz$name %r doesn't start with prefix %rr7r)rrrMr<rNr�r��replace)rkrvr�r?r?r@r	5s
z"InstallRequirement._clean_zip_namecs0|sd}�jdk	r(t�fdd�|D��SdSdS)Nr�c3s|]}�jjd|i�VqdS)rCN)r[Zevaluate)rBrC)rkr?r@rDDsz3InstallRequirement.match_markers.<locals>.<genexpr>T)r�)r[rO)rkZextras_requestedr?)rkr@�
match_markers=s


z InstallRequirement.match_markersc,s`|jr|j|||d�dS|jr\tjj|j�}tjj||j�|j	|j�||d�d|_
dS||jjdg�7}||jjdg�7}|j
r�t|�dg}tjdd�}tjj|d	�}�z�|j||�|�}	d
|jf}
t|
��.}t��t|	||jd|d�WdQRXWdQRXtjj|��s(tjd
|�dSd|_
|j�r:dS�fdd�}t|��H}
x@|
D](}tjj|�}|jd��rV||�}P�qVWtj d|�dSWdQRXg}t|��P}
xH|
D]@}|j!�}tjj"|��r�|tjj#7}|j$tjj%||�|���q�WWdQRXtjj|d�}t|d��}
|
j&dj|�d�WdQRXWdtjj|��rRtj'|�t(|�XdS)N)r�)r�r��strip_file_prefixT�global_options�install_optionsz
--no-user-cfgz-recordzpip-zinstall-record.txtzRunning setup.py install for %sF)r�r��spinnerzRecord file %s not foundcs(�dkstjj|�r|St�|�SdS)N)rMr<�isabsr)r<)r�r?r@�prepend_root~sz0InstallRequirement.install.<locals>.prepend_rootz	.egg-infoz;Could not find .egg-info directory in install record for %szinstalled-files.txtr�
))rV�install_editabler�r�r�Z
wheel_versionrUZcheck_compatibilityrvr5rcrh�getrjr�r�r�rMr<r��get_install_argsr2r0r&r�r�r�r�rZr�r�r�r�rr�rN�append�relpathr
r�r)rkrrr�r�rr�Z
temp_location�record_filename�install_args�msgrrr��lineZ	directoryr�Z	new_linesr�Zinst_files_pathr?)r�r@�installIs~




"
zInstallRequirement.installcCs|jdkr|j|�|_|jS)aAEnsure that a source_dir is set.

        This will create a temporary build dir if the name of the requirement
        isn't known yet.

        :param parent_dir: The ideal pip parent_dir for the source_dir.
            Generally src_dir for editables and build_dir for sdists.
        :return: self.source_dir
        N)rUr�)rkZ
parent_dirr?r?r@�ensure_has_source_dir�s

z(InstallRequirement.ensure_has_source_dircCs�tjdg}|jd�|jt|j�|t|�dd|g7}|jsJ|dg7}|dk	r^|d|g7}|dk	rr|d|g7}|jr�|dg7}n
|d	g7}t�r�d
t	j
�}|dtjj
tjdd
||j�g7}|S)Nz-uz-cr z--recordz#--single-version-externally-managedz--rootz--prefixz	--compilez--no-compiler�z--install-headers�includeZsite)r�r�rr1r�r�rZrirr�get_python_versionrMr<r�r�rv)rkrrr�r�rZ
py_ver_strr?r?r@r�s(



z#InstallRequirement.get_install_argscCsd|jr6tjjtjj|jt��r6tjd|j�t|j�d|_|j	rZtjj|j	�rZt|j	�d|_	dS)zVRemove the source files from this requirement, if they are marked
        for deletionzRemoving source in %sN)
rUrMr<r�r�rr�r�rr`)rkr?r?r@�remove_temporary_source�s

z*InstallRequirement.remove_temporary_sourcecCs�tjd|j�|jr"t|�dg}|r>dj|�g}t|�|}t��<ttj	dt
|jgt|�ddgt|�|jdd�WdQRXd	|_
dS)
NzRunning setup.py develop for %sz
--no-user-cfgz--prefix={0}z-cZdevelopz	--no-depsF)r�r�T)r�r�rvrjr�r�r0r&r�r�r1r�r�rc)rkrrr�Zprefix_paramr?r?r@r�s z#InstallRequirement.install_editablecCs�|jdkrdSyFtt|j��}d|_tjt|��|_|jrR|jrR|j|_d|_dSWn�tj	k
rjdStj
k
r�tj|jj�}|jr�t
|�r�||_q�t�r�t|�r�td|j|jf��nt|�r�||_YnXdS)z�Find an installed distribution that satisfies or conflicts
        with this requirement, and set self.satisfied_by or
        self.conflicts_with appropriately.
        NFTzVWill not install to the user site because it will lack sys.path precedence to %s in %s)rJrr�r\rZget_distributionr^rVr_ZDistributionNotFoundZVersionConflictrvrfr"rr#rr�r�r$)rkZ	no_markerZ
existing_distr?r?r@r��s4

z"InstallRequirement.check_if_existscCs|jo|jjS)N)rXr�)rkr?r?r@r� szInstallRequirement.is_wheelcCs,t|j|j||j|j|||j|j|d�
dS)N)�user�homer�r�rirjr)r5rvrJrfrgrirj)rkZwheeldirr�r�rr?r?r@r5$s
z#InstallRequirement.move_wheel_filescCsX|jd�jd�}tjj|�}tj||�}tjjtjj|��d}tj	tjj|�||d�S)zAReturn a pkg_resources.Distribution built from self.egg_info_pathr�rr)r��metadata)
r��rstriprMr<r�rZPathMetadata�splitext�basenameZDistribution)rkr�Zbase_dirr'Z	dist_namer?r?r@�get_dist0s
zInstallRequirement.get_distcCst|jjdi��S)z�Return whether any known-good hashes are specified as options.

        These activate --require-hashes mode; hashes specified as part of a
        URL do not.

        �hashes)�boolrhr)rkr?r?r@�has_hash_options;sz#InstallRequirement.has_hash_optionscCsJ|jjdi�j�}|r|jn|j}|rB|jrB|j|jg�j|j�t	|�S)a�Return a hash-comparer that considers my option- and URL-based
        hashes to be known-good.

        Hashes in URLs--ones embedded in the requirements file, not ones
        downloaded from an index server--are almost peers with ones from
        flags. They satisfy --require-hashes (whether it was implicitly or
        explicitly activated) but do not activate it. md5 and sha224 are not
        allowed in flags, which should nudge people toward good algos. We
        always OR all hashes together, even ones from URLs.

        :param trust_internet: Whether to trust URL-based (#md5=...) hashes
            downloaded from the internet, as by populate_link()

        r,)
rhr�copyrXrY�hash�
setdefaultZ	hash_namerr.)rkZtrust_internetZgood_hashesrXr?r?r@r,Es

zInstallRequirement.hashes)NFNFTTNFNNF)NNFNNF)NFNNF)T)F)N)NNN)T).r��
__module__�__qualname__rn�classmethodryr�r�r�r��propertyr�r�r�r�r�rvr�r�r�r�r�r�r9�compileZ_requirements_section_rer�r�r�r�r�r�r
r	rr r!rr$rr�r�r5r+r.r,r?r?r?r@rGIs`
;M

 :6


$0
\
)

rGcCstjd|�}|r|jd�}|S)z2
        Strip req postfix ( -dev, 0.2, etc )
    z^(.*?)(?:-dev|-\d.*)$r7)r9r�r;)rJr:r?r?r@�_strip_postfix[s
r7cCs�ddlm}|}d}tjd|�}|r:|jd�}|jd�}n|}tjj|�rttjjtjj	|d��slt
d|��t|�}|j�j
d	�r�||�j}|r�||td
|j��jfS||dfSx,tD]$}|j�j
d|�r�d||f}Pq�Wd
|k�r|�r
tjdt�|d
|}nt
d|��|jd
d�dj�}	tj|	��s`d|dj	dd�tjD��d}
t
|
��||�j}|�sxt
d��|�s�t
d|��t|�|dfS)aParses an editable requirement into:
        - a requirement name
        - an URL
        - extras
        - editable options
    Accepted requirements:
        svn+http://blahblah@rev#egg=Foobar[baz]&subdirectory=version_subdir
        .[some_extra]
    r)roNz^(.+)(\[[^\]]+\])$r7r8zsetup.pyz;Directory %r is not installable. File 'setup.py' not found.zfile:r}z%s:z%s+%sr�zD--default-vcs has been deprecated and will be removed in the future.zb%s should either be a path to a local project or a VCS url beginning with svn+, git+, hg+, or bzr+zFor --editable=%s only z, cSsg|]}|jd�qS)z+URL)rv)rBr�r?r?r@r��sz"parse_editable.<locals>.<listcomp>z is currently supportedz@Could not detect requirement name, please specify one with #egg=z@--editable=%s is not the right format; it must have #egg=Package)rpror9r:r;rMr<r�r�r�rrr�rrr�rr>r4r�r�r/r~r�Zbackendsr7)rtrurorwr>r=Z
url_no_extrasZpackage_nameZversion_controlr�Z
error_messager?r?r@rqgs`





rq)N)eZ
__future__rZloggingrMr9r�r�r�rQr�rZ	distutilsrZdistutils.utilrZemail.parserrZpip._vendorrrZpip._vendor.packagingrZpip._vendor.packaging.markersr	Z"pip._vendor.packaging.requirementsr
rZpip._vendor.packaging.utilsrZpip._vendor.packaging.versionr
rr�Zpip._vendor.six.movesrZ	pip.wheelr�Z
pip.compatrrrZpip.downloadrrrrZpip.exceptionsrrZ
pip.locationsrrrrZ	pip.utilsrrrr r!r"r#r$r%r&r'r(r)r*r+r,r-Zpip.utils.hashesr.Zpip.utils.deprecationr/Zpip.utils.loggingr0Zpip.utils.setuptools_buildr1Zpip.utils.uir2Zpip.req.req_uninstallr3Zpip.vcsr4r5r6Z	getLoggerr�r�Z	SpecifierZ
_operators�keysrPrArF�objectrGr7rqr?r?r?r@�<module>s`L
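The is_pinned docstring and the extras regex quoted above are small enough to show directly. This sketch is reconstructed from them alone; it is not pip's code, and for convenience it imports the standalone packaging distribution (inside pip the same classes live under pip._vendor.packaging):

import re
from packaging.requirements import Requirement

_EXTRAS_RE = re.compile(r'^(.+)(\[[^\]]+\])$')   # the regex recovered from the bytecode

def strip_extras(path):
    # Split "name[extra1,extra2]" (or a path with extras) into (name, "[extra1,extra2]").
    match = _EXTRAS_RE.match(path)
    if match:
        return match.group(1), match.group(2)
    return path, None

def is_pinned(req):
    # "some-package==1.2" is pinned; "some-package>1.2" is not: exactly one
    # specifier, whose operator is == or ===.
    specifiers = list(req.specifier)
    return len(specifiers) == 1 and specifiers[0].operator in ('==', '===')

assert strip_extras('requests[socks]') == ('requests', '[socks]')
assert is_pinned(Requirement('some-package==1.2'))
assert not is_pinned(Requirement('some-package>1.2'))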
site-packages/pip/req/__pycache__/req_install.cpython-36.opt-1.pyc
[binary data: the optimization-level-1 build of the same pip/req/req_install.py module as
 the entry above; assert statements are compiled out, but the recoverable docstrings
 (InstallRequirement, parse_editable, _strip_postfix, ...) are identical.]
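The parse_editable docstring present in both builds lists the accepted editable forms. The helper below is a rough, hypothetical illustration of pulling out the pieces it names (VCS scheme, #egg= name, extras, subdirectory) from such a spec; it is not pip's parser:

from urllib.parse import urlparse, parse_qs

VCS_PREFIXES = ('svn', 'git', 'hg', 'bzr')   # the backends named in the recoverable error text

def describe_editable(spec):
    url = urlparse(spec)
    vcs, _, _transport = url.scheme.partition('+')   # "svn+http" -> ("svn", "http")
    fragment = parse_qs(url.fragment)                # "#egg=Foobar[baz]&subdirectory=..."
    egg = fragment.get('egg', [None])[0]
    name, extras = egg, None
    if egg and '[' in egg:
        name, _, rest = egg.partition('[')
        extras = '[' + rest
    return {
        'vcs': vcs if vcs in VCS_PREFIXES else None,
        'name': name,
        'extras': extras,
        'subdirectory': fragment.get('subdirectory', [None])[0],
    }

print(describe_editable(
    'svn+http://blahblah@rev#egg=Foobar[baz]&subdirectory=version_subdir'))
# {'vcs': 'svn', 'name': 'Foobar', 'extras': '[baz]', 'subdirectory': 'version_subdir'}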
site-packages/pip/req/__pycache__/req_uninstall.cpython-36.pyc
[binary data: compiled bytecode for pip/req/req_uninstall.py. Recoverable docstrings:

  UninstallPathSet     -- "A set of file paths to be removed in the uninstallation of a
                          requirement."  Methods include _permitted ("Return True if the
                          given path is one we are permitted to remove/modify, False
                          otherwise."), add, add_pth, compact ("Compact a path set to
                          contain the minimal number of paths necessary to contain all
                          paths in the set. If /a/path/ and /a/path/to/a/file.txt are both
                          in the set, leave only the shorter path."), remove ("Remove paths
                          in self.paths with confirmation (unless auto_confirm is True)."),
                          rollback ("Rollback the changes previously made by remove().")
                          and commit ("Remove temporary save dir: rollback will no longer
                          be possible.").
  UninstallPthEntries  -- adds, removes and rolls back individual entries inside a .pth
                          file ("Removing pth entries from %s", "Rolling %s back to
                          previous state").]
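The compact() docstring above describes a small, self-contained algorithm. A minimal sketch written from that docstring alone (not pip's exact code):

import os

def compact(paths):
    # Keep only the shortest paths needed to cover the whole set: a path is dropped
    # when an already-kept (shorter) path is one of its parent directories.
    kept = set()
    for path in sorted(paths, key=len):
        covered = any(
            path.startswith(short.rstrip(os.path.sep) + os.path.sep)
            for short in kept
        )
        if not covered:
            kept.add(path)
    return kept

print(sorted(compact({'/a/path/', '/a/path/to/a/file.txt', '/b/other.txt'})))
# ['/a/path/', '/b/other.txt']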
site-packages/pip/req/__pycache__/req_uninstall.cpython-36.opt-1.pyc
[binary data: the optimization-level-1 build of the same pip/req/req_uninstall.py module as
 the entry above; the recoverable docstrings are identical. A rough sketch of the .pth-entry
 handling both builds describe follows.]
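This sketch assumes only what the recoverable strings show (read the file, remember its original lines for rollback, drop the matching entries, write the rest back); the class name and details are stand-ins, not pip's implementation:

import os

class PthEntries:
    # Hypothetical stand-in for the pth-editing helper the bytecode strings describe.

    def __init__(self, pth_file):
        self.file = pth_file
        self.entries = set()
        self._saved_lines = None    # original raw lines, kept so rollback() can restore them

    def add(self, entry):
        self.entries.add(os.path.normcase(entry))

    def remove(self):
        with open(self.file, 'rb') as fh:
            lines = fh.readlines()
        self._saved_lines = list(lines)
        # Match the file's existing line endings when looking for entries to drop.
        endline = '\r\n' if any(b'\r\n' in line for line in lines) else '\n'
        for entry in self.entries:
            try:
                lines.remove((entry + endline).encode('utf-8'))
            except ValueError:
                pass    # this entry was not present in the file
        with open(self.file, 'wb') as fh:
            fh.writelines(lines)

    def rollback(self):
        if self._saved_lines is None:
            return False            # nothing was removed, nothing to roll back
        with open(self.file, 'wb') as fh:
            fh.writelines(self._saved_lines)
        return True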
site-packages/pip/req/__pycache__/req_set.cpython-36.opt-1.pyc
[binary data: compiled bytecode for pip/req/req_set.py (optimization level 1); the entry
 continues past the end of this excerpt. Recoverable docstrings:

  Requirements        -- a small insertion-ordered mapping of requirement name to
                         InstallRequirement (keys / values / __contains__ / __setitem__ /
                         __getitem__ / __repr__); see the sketch after this entry.
  DistAbstraction     -- "Abstracts out the wheel vs non-wheel prepare_files logic. The
                         requirements for anything installable are as follows: we must be
                         able to determine the requirement name ...; we must be able to
                         generate a list of run-time dependencies without installing any
                         additional packages ...; we must be able to create a Distribution
                         object exposing the above metadata."  Concrete subclasses: IsWheel,
                         IsSDist, Installed.
  make_abstract_dist  -- "Factory to make an abstract dist object.  Preconditions: Either an
                         editable req with a source_dir, or satisfied_by or a wheel link, or
                         a non-editable req with a source_dir."
  RequirementSet      -- holds the build/src/download directories and option flags and
                         exposes add_requirement, has_requirement, uninstall, prepare_files
                         ("Prepare process. Create temp directories, download and/or unpack
                         files."), _check_skip_installed, _prepare_file, cleanup_files
                         ("Clean up files, remove builds.") and an install-order routine
                         whose docstring ("Create the installation order.  The installation
                         order is topological - requirements are installed ...") is cut off
                         at the end of this excerpt.]
        before the requiring thing. We break cycles at an arbitrary point,
        and make no other guarantees.
        csP|js|�krdS|jrdS�j|�x�j|D]}�|�q2W�j|�dS)N)rKry�addrfr0)rhZdep)�order�ordered_reqs�scheduler%r&r'r��s
z,RequirementSet._to_install.<locals>.schedule)r~rUr-)r%r�r&)r�r�r�r%r'�_to_install�s
	zRequirementSet._to_installcOs�|j�}|r(tjddjdd�|D���t���x�|D]�}|jrltjd|j�t��|jdd�WdQRXy|j||f|�|�Wn$|jr�|jr�|j	��YnX|jr�|jr�|j
�|j�q6WWdQRX||_dS)	zl
        Install everything in this set (after having downloaded and unpacked
        the packages)
        z!Installing collected packages: %sz, cSsg|]
}|j�qSr&)ri)r*rhr&r&r'r,sz*RequirementSet.install.<locals>.<listcomp>zFound existing installation: %sT)r�N)
r�rsr�r6rr�r��installZinstall_succeededZrollback_uninstallr�r�r[)r%Zinstall_optionsZglobal_optionsr��kwargsZ
to_installZrequirementr&r&r'r��s:

zRequirementSet.install)FNFFNFFFNTFNNFF)NN)F)FF)r8r9r:r(rpr7r�r��propertyr�r�rwr�r�r�r�r�r�r�r�r&r&r&r'rL�s4
4
[	

'E
	rL);Z
__future__r�collectionsr�	itertoolsrZloggingr�Zpip._vendorrrZ
pip.compatrZpip.downloadrr	r
rrZpip.exceptionsr
rrrrrrrrrZpip.req.req_installrZ	pip.utilsrrrrrZpip.utils.hashesrZpip.utils.loggingrZpip.utils.packagingrZpip.vcsr Z	pip.wheelr!Z	getLoggerr8rs�objectr"r;rGrFrCrJrLr&r&r&r'�<module>s00
	site-packages/pip/req/__pycache__/__init__.cpython-36.pyc000064400000000603147511334620017245 0ustar003

���e�@sDddlmZddlmZddlmZmZddlmZdddd	gZ	d
S)�)�absolute_import�)�InstallRequirement)�RequirementSet�Requirements)�parse_requirementsrrrrN)
Z
__future__rZreq_installrZreq_setrrZreq_filer�__all__�r	r	�/usr/lib/python3.6/__init__.py�<module>s
site-packages/pip/req/req_uninstall.py000064400000015361147511334620014111 0ustar00from __future__ import absolute_import

import logging
import os
import tempfile

from pip.compat import uses_pycache, WINDOWS, cache_from_source
from pip.exceptions import UninstallationError
from pip.utils import rmtree, ask, is_local, renames, normalize_path
from pip.utils.logging import indent_log


logger = logging.getLogger(__name__)


class UninstallPathSet(object):
    """A set of file paths to be removed in the uninstallation of a
    requirement."""
    def __init__(self, dist):
        self.paths = set()
        self._refuse = set()
        self.pth = {}
        self.dist = dist
        self.save_dir = None
        self._moved_paths = []

    def _permitted(self, path):
        """
        Return True if the given path is one we are permitted to
        remove/modify, False otherwise.

        """
        return is_local(path)

    def add(self, path):
        head, tail = os.path.split(path)

        # we normalize the head to resolve parent directory symlinks, but not
        # the tail, since we only want to uninstall symlinks, not their targets
        path = os.path.join(normalize_path(head), os.path.normcase(tail))

        if not os.path.exists(path):
            return
        if self._permitted(path):
            self.paths.add(path)
        else:
            self._refuse.add(path)

        # __pycache__ files can show up after 'installed-files.txt' is created,
        # due to imports
        if os.path.splitext(path)[1] == '.py' and uses_pycache:
            self.add(cache_from_source(path))

    def add_pth(self, pth_file, entry):
        pth_file = normalize_path(pth_file)
        if self._permitted(pth_file):
            if pth_file not in self.pth:
                self.pth[pth_file] = UninstallPthEntries(pth_file)
            self.pth[pth_file].add(entry)
        else:
            self._refuse.add(pth_file)

    def compact(self, paths):
        """Compact a path set to contain the minimal number of paths
        necessary to contain all paths in the set. If /a/path/ and
        /a/path/to/a/file.txt are both in the set, leave only the
        shorter path."""
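        # Illustrative example (not part of the original source; the paths
        # below are hypothetical):
        #   compact({'/a/path/', '/a/path/to/a/file.txt', '/b/other.txt'})
        #   -> {'/a/path/', '/b/other.txt'}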
        short_paths = set()
        for path in sorted(paths, key=len):
            if not any([
                    (path.startswith(shortpath) and
                     path[len(shortpath.rstrip(os.path.sep))] == os.path.sep)
                    for shortpath in short_paths]):
                short_paths.add(path)
        return short_paths

    def _stash(self, path):
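        # Rough sketch of the mapping this produces (example values are
        # hypothetical): on a POSIX system with save_dir
        # '/tmp/pip-XYZ-uninstall', a path such as
        # '/usr/lib/python3.6/site-packages/foo.py' is stashed at
        # '/tmp/pip-XYZ-uninstall/usr/lib/python3.6/site-packages/foo.py'.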
        return os.path.join(
            self.save_dir, os.path.splitdrive(path)[1].lstrip(os.path.sep))

    def remove(self, auto_confirm=False):
        """Remove paths in ``self.paths`` with confirmation (unless
        ``auto_confirm`` is True)."""
        if not self.paths:
            logger.info(
                "Can't uninstall '%s'. No files were found to uninstall.",
                self.dist.project_name,
            )
            return
        logger.info(
            'Uninstalling %s-%s:',
            self.dist.project_name, self.dist.version
        )

        with indent_log():
            paths = sorted(self.compact(self.paths))

            if auto_confirm:
                response = 'y'
            else:
                for path in paths:
                    logger.info(path)
                response = ask('Proceed (y/n)? ', ('y', 'n'))
            if self._refuse:
                logger.info('Not removing or modifying (outside of prefix):')
                for path in self.compact(self._refuse):
                    logger.info(path)
            if response == 'y':
                self.save_dir = tempfile.mkdtemp(suffix='-uninstall',
                                                 prefix='pip-')
                for path in paths:
                    new_path = self._stash(path)
                    logger.debug('Removing file or directory %s', path)
                    self._moved_paths.append(path)
                    renames(path, new_path)
                for pth in self.pth.values():
                    pth.remove()
                logger.info(
                    'Successfully uninstalled %s-%s',
                    self.dist.project_name, self.dist.version
                )

    def rollback(self):
        """Rollback the changes previously made by remove()."""
        if self.save_dir is None:
            logger.error(
                "Can't roll back %s; was not uninstalled",
                self.dist.project_name,
            )
            return False
        logger.info('Rolling back uninstall of %s', self.dist.project_name)
        for path in self._moved_paths:
            tmp_path = self._stash(path)
            logger.debug('Replacing %s', path)
            renames(tmp_path, path)
        for pth in self.pth.values():
            pth.rollback()

    def commit(self):
        """Remove temporary save dir: rollback will no longer be possible."""
        if self.save_dir is not None:
            rmtree(self.save_dir)
            self.save_dir = None
            self._moved_paths = []


class UninstallPthEntries(object):
    def __init__(self, pth_file):
        if not os.path.isfile(pth_file):
            raise UninstallationError(
                "Cannot remove entries from nonexistent file %s" % pth_file
            )
        self.file = pth_file
        self.entries = set()
        self._saved_lines = None

    def add(self, entry):
        entry = os.path.normcase(entry)
        # On Windows, os.path.normcase converts the entry to use
        # backslashes.  This is correct for entries that describe absolute
        # paths outside of site-packages, but all the others use forward
        # slashes.
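        # Illustrative, with hypothetical entries: on Windows, 'C:\pkgs\foo'
        # keeps its backslashes because it carries a drive letter, while a
        # relative entry normalized to 'some\pkg' is rewritten to 'some/pkg'.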
        if WINDOWS and not os.path.splitdrive(entry)[0]:
            entry = entry.replace('\\', '/')
        self.entries.add(entry)

    def remove(self):
        logger.debug('Removing pth entries from %s:', self.file)
        with open(self.file, 'rb') as fh:
            # windows uses '\r\n' with py3k, but uses '\n' with py2.x
            lines = fh.readlines()
            self._saved_lines = lines
        if any(b'\r\n' in line for line in lines):
            endline = '\r\n'
        else:
            endline = '\n'
        for entry in self.entries:
            try:
                logger.debug('Removing entry: %s', entry)
                lines.remove((entry + endline).encode("utf-8"))
            except ValueError:
                pass
        with open(self.file, 'wb') as fh:
            fh.writelines(lines)

    def rollback(self):
        if self._saved_lines is None:
            logger.error(
                'Cannot roll back changes to %s, none were made', self.file
            )
            return False
        logger.debug('Rolling %s back to previous state', self.file)
        with open(self.file, 'wb') as fh:
            fh.writelines(self._saved_lines)
        return True
site-packages/pip/req/req_install.py000064400000133225147511334620013546 0ustar00from __future__ import absolute_import

import logging
import os
import re
import shutil
import sys
import tempfile
import traceback
import warnings
import zipfile

from distutils import sysconfig
from distutils.util import change_root
from email.parser import FeedParser

from pip._vendor import pkg_resources, six
from pip._vendor.packaging import specifiers
from pip._vendor.packaging.markers import Marker
from pip._vendor.packaging.requirements import InvalidRequirement, Requirement
from pip._vendor.packaging.utils import canonicalize_name
from pip._vendor.packaging.version import Version, parse as parse_version
from pip._vendor.six.moves import configparser

import pip.wheel

from pip.compat import native_str, get_stdlib, WINDOWS
from pip.download import is_url, url_to_path, path_to_url, is_archive_file
from pip.exceptions import (
    InstallationError, UninstallationError,
)
from pip.locations import (
    bin_py, running_under_virtualenv, PIP_DELETE_MARKER_FILENAME, bin_user,
)
from pip.utils import (
    display_path, rmtree, ask_path_exists, backup_dir, is_installable_dir,
    dist_in_usersite, dist_in_site_packages, dist_in_install_path, egg_link_path,
    call_subprocess, read_text_file, FakeFile, _make_build_dir, ensure_dir,
    get_installed_version, normalize_path, dist_is_local,
)

from pip.utils.hashes import Hashes
from pip.utils.deprecation import RemovedInPip10Warning
from pip.utils.logging import indent_log
from pip.utils.setuptools_build import SETUPTOOLS_SHIM
from pip.utils.ui import open_spinner
from pip.req.req_uninstall import UninstallPathSet
from pip.vcs import vcs
from pip.wheel import move_wheel_files, Wheel


logger = logging.getLogger(__name__)

operators = specifiers.Specifier._operators.keys()
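# For reference (an assumption based on PEP 440 specifiers, not spelled out in
# this file): these are the version comparison operators, i.e. values such as
# '==', '!=', '<=', '>=', '<', '>', '~=' and '==='.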


def _strip_extras(path):
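    # Splits a "path[extras]" style string into (path, extras). A hypothetical
    # example: './mypkg[dev,test]' -> ('./mypkg', '[dev,test]'); a path with
    # no extras comes back unchanged with extras set to None.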
    m = re.match(r'^(.+)(\[[^\]]+\])$', path)
    extras = None
    if m:
        path_no_extras = m.group(1)
        extras = m.group(2)
    else:
        path_no_extras = path

    return path_no_extras, extras


def _safe_extras(extras):
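    # pkg_resources.safe_extra() lowercases and sanitizes extra names; as a
    # hypothetical example, 'Dev Tools' becomes 'dev_tools'.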
    return set(pkg_resources.safe_extra(extra) for extra in extras)


class InstallRequirement(object):

    def __init__(self, req, comes_from, source_dir=None, editable=False,
                 link=None, as_egg=False, update=True,
                 pycompile=True, markers=None, isolated=False, options=None,
                 wheel_cache=None, constraint=False):
        self.extras = ()
        if isinstance(req, six.string_types):
            try:
                req = Requirement(req)
            except InvalidRequirement:
                if os.path.sep in req:
                    add_msg = "It looks like a path. Does it exist ?"
                elif '=' in req and not any(op in req for op in operators):
                    add_msg = "= is not a valid operator. Did you mean == ?"
                else:
                    add_msg = traceback.format_exc()
                raise InstallationError(
                    "Invalid requirement: '%s'\n%s" % (req, add_msg))
            self.extras = _safe_extras(req.extras)

        self.req = req
        self.comes_from = comes_from
        self.constraint = constraint
        self.source_dir = source_dir
        self.editable = editable

        self._wheel_cache = wheel_cache
        self.link = self.original_link = link
        self.as_egg = as_egg
        if markers is not None:
            self.markers = markers
        else:
            self.markers = req and req.marker
        self._egg_info_path = None
        # This holds the pkg_resources.Distribution object if this requirement
        # is already available:
        self.satisfied_by = None
        # This hold the pkg_resources.Distribution object if this requirement
        # conflicts with another installed distribution:
        self.conflicts_with = None
        # Temporary build location
        self._temp_build_dir = None
        # Used to store the global directory where the _temp_build_dir should
        # have been created. Cf _correct_build_location method.
        self._ideal_build_dir = None
        # True if the editable should be updated:
        self.update = update
        # Set to True after successful installation
        self.install_succeeded = None
        # UninstallPathSet of uninstalled distribution (for possible rollback)
        self.uninstalled = None
        # Set True if a legitimate do-nothing-on-uninstall has happened - e.g.
        # system site packages, stdlib packages.
        self.nothing_to_uninstall = False
        self.use_user_site = False
        self.target_dir = None
        self.options = options if options else {}
        self.pycompile = pycompile
        # Set to True after successful preparation of this requirement
        self.prepared = False

        self.isolated = isolated

    @classmethod
    def from_editable(cls, editable_req, comes_from=None, default_vcs=None,
                      isolated=False, options=None, wheel_cache=None,
                      constraint=False):
        from pip.index import Link

        name, url, extras_override = parse_editable(
            editable_req, default_vcs)
        if url.startswith('file:'):
            source_dir = url_to_path(url)
        else:
            source_dir = None

        res = cls(name, comes_from, source_dir=source_dir,
                  editable=True,
                  link=Link(url),
                  constraint=constraint,
                  isolated=isolated,
                  options=options if options else {},
                  wheel_cache=wheel_cache)

        if extras_override is not None:
            res.extras = _safe_extras(extras_override)

        return res

    @classmethod
    def from_line(
            cls, name, comes_from=None, isolated=False, options=None,
            wheel_cache=None, constraint=False):
        """Creates an InstallRequirement from a name, which might be a
        requirement, directory containing 'setup.py', filename, or URL.
        """
        from pip.index import Link

        if is_url(name):
            marker_sep = '; '
        else:
            marker_sep = ';'
        if marker_sep in name:
            name, markers = name.split(marker_sep, 1)
            markers = markers.strip()
            if not markers:
                markers = None
            else:
                markers = Marker(markers)
        else:
            markers = None
        name = name.strip()
        req = None
        path = os.path.normpath(os.path.abspath(name))
        link = None
        extras = None

        if is_url(name):
            link = Link(name)
        else:
            p, extras = _strip_extras(path)
            if (os.path.isdir(p) and
                    (os.path.sep in name or name.startswith('.'))):

                if not is_installable_dir(p):
                    raise InstallationError(
                        "Directory %r is not installable. File 'setup.py' "
                        "not found." % name
                    )
                link = Link(path_to_url(p))
            elif is_archive_file(p):
                if not os.path.isfile(p):
                    logger.warning(
                        'Requirement %r looks like a filename, but the '
                        'file does not exist',
                        name
                    )
                link = Link(path_to_url(p))

        # it's a local file, dir, or url
        if link:
            # Handle relative file URLs
            if link.scheme == 'file' and re.search(r'\.\./', link.url):
                link = Link(
                    path_to_url(os.path.normpath(os.path.abspath(link.path))))
            # wheel file
            if link.is_wheel:
                wheel = Wheel(link.filename)  # can raise InvalidWheelFilename
                req = "%s==%s" % (wheel.name, wheel.version)
            else:
                # set the req to the egg fragment.  when it's not there, this
                # will become an 'unnamed' requirement
                req = link.egg_fragment

        # a requirement specifier
        else:
            req = name

        options = options if options else {}
        res = cls(req, comes_from, link=link, markers=markers,
                  isolated=isolated, options=options,
                  wheel_cache=wheel_cache, constraint=constraint)

        if extras:
            res.extras = _safe_extras(
                Requirement('placeholder' + extras).extras)

        return res

    def __str__(self):
        if self.req:
            s = str(self.req)
            if self.link:
                s += ' from %s' % self.link.url
        else:
            s = self.link.url if self.link else None
        if self.satisfied_by is not None:
            s += ' in %s' % display_path(self.satisfied_by.location)
        if self.comes_from:
            if isinstance(self.comes_from, six.string_types):
                comes_from = self.comes_from
            else:
                comes_from = self.comes_from.from_path()
            if comes_from:
                s += ' (from %s)' % comes_from
        return s

    def __repr__(self):
        return '<%s object: %s editable=%r>' % (
            self.__class__.__name__, str(self), self.editable)

    def populate_link(self, finder, upgrade, require_hashes):
        """Ensure that if a link can be found for this, that it is found.

        Note that self.link may still be None - if Upgrade is False and the
        requirement is already installed.

        If require_hashes is True, don't use the wheel cache, because cached
        wheels, always built locally, have different hashes than the files
        downloaded from the index server and thus throw false hash mismatches.
        Furthermore, cached wheels at present have nondeterministic contents due
        to file modification times.
        """
        if self.link is None:
            self.link = finder.find_requirement(self, upgrade)
        if self._wheel_cache is not None and not require_hashes:
            old_link = self.link
            self.link = self._wheel_cache.cached_wheel(self.link, self.name)
            if old_link != self.link:
                logger.debug('Using cached wheel link: %s', self.link)

    @property
    def specifier(self):
        return self.req.specifier

    @property
    def is_pinned(self):
        """Return whether I am pinned to an exact version.

        For example, some-package==1.2 is pinned; some-package>1.2 is not.
        """
        specifiers = self.specifier
        return (len(specifiers) == 1 and
                next(iter(specifiers)).operator in ('==', '==='))

    def from_path(self):
        if self.req is None:
            return None
        s = str(self.req)
        if self.comes_from:
            if isinstance(self.comes_from, six.string_types):
                comes_from = self.comes_from
            else:
                comes_from = self.comes_from.from_path()
            if comes_from:
                s += '->' + comes_from
        return s

    def build_location(self, build_dir):
        if self._temp_build_dir is not None:
            return self._temp_build_dir
        if self.req is None:
            # for requirement via a path to a directory: the name of the
            # package is not available yet so we create a temp directory
            # Once run_egg_info has run, we'll be able
            # to fix it via _correct_build_location
            # Some systems have /tmp as a symlink which confuses custom
            # builds (such as numpy). Thus, we ensure that the real path
            # is returned.
            self._temp_build_dir = os.path.realpath(
                tempfile.mkdtemp('-build', 'pip-')
            )
            self._ideal_build_dir = build_dir
            return self._temp_build_dir
        if self.editable:
            name = self.name.lower()
        else:
            name = self.name
        # FIXME: Is there a better place to create the build_dir? (hg and bzr
        # need this)
        if not os.path.exists(build_dir):
            logger.debug('Creating directory %s', build_dir)
            _make_build_dir(build_dir)
        return os.path.join(build_dir, name)

    def _correct_build_location(self):
        """Move self._temp_build_dir to self._ideal_build_dir/self.req.name

        For some requirements (e.g. a path to a directory), the name of the
        package is not available until we run egg_info, so the build_location
        will return a temporary directory and store the _ideal_build_dir.

        This is only called by self.egg_info_path to fix the temporary build
        directory.
        """
        if self.source_dir is not None:
            return
        assert self.req is not None
        assert self._temp_build_dir
        assert self._ideal_build_dir
        old_location = self._temp_build_dir
        self._temp_build_dir = None
        new_location = self.build_location(self._ideal_build_dir)
        if os.path.exists(new_location):
            raise InstallationError(
                'A package already exists in %s; please remove it to continue'
                % display_path(new_location))
        logger.debug(
            'Moving package %s from %s to new location %s',
            self, display_path(old_location), display_path(new_location),
        )
        shutil.move(old_location, new_location)
        self._temp_build_dir = new_location
        self._ideal_build_dir = None
        self.source_dir = new_location
        self._egg_info_path = None

    @property
    def name(self):
        if self.req is None:
            return None
        return native_str(pkg_resources.safe_name(self.req.name))

    @property
    def setup_py_dir(self):
        return os.path.join(
            self.source_dir,
            self.link and self.link.subdirectory_fragment or '')

    @property
    def setup_py(self):
        assert self.source_dir, "No source dir for %s" % self
        try:
            import setuptools  # noqa
        except ImportError:
            if get_installed_version('setuptools') is None:
                add_msg = "Please install setuptools."
            else:
                add_msg = traceback.format_exc()
            # Setuptools is not available
            raise InstallationError(
                "Could not import setuptools which is required to "
                "install from a source distribution.\n%s" % add_msg
            )

        setup_py = os.path.join(self.setup_py_dir, 'setup.py')

        # Python2 __file__ should not be unicode
        if six.PY2 and isinstance(setup_py, six.text_type):
            setup_py = setup_py.encode(sys.getfilesystemencoding())

        return setup_py

    def run_egg_info(self):
        assert self.source_dir
        if self.name:
            logger.debug(
                'Running setup.py (path:%s) egg_info for package %s',
                self.setup_py, self.name,
            )
        else:
            logger.debug(
                'Running setup.py (path:%s) egg_info for package from %s',
                self.setup_py, self.link,
            )

        with indent_log():
            script = SETUPTOOLS_SHIM % self.setup_py
            base_cmd = [sys.executable, '-c', script]
            if self.isolated:
                base_cmd += ["--no-user-cfg"]
            egg_info_cmd = base_cmd + ['egg_info']
            # We can't put the .egg-info files at the root, because then the
            # source code will be mistaken for an installed egg, causing
            # problems
            if self.editable:
                egg_base_option = []
            else:
                egg_info_dir = os.path.join(self.setup_py_dir, 'pip-egg-info')
                ensure_dir(egg_info_dir)
                egg_base_option = ['--egg-base', 'pip-egg-info']
            call_subprocess(
                egg_info_cmd + egg_base_option,
                cwd=self.setup_py_dir,
                show_stdout=False,
                command_desc='python setup.py egg_info')

        if not self.req:
            if isinstance(parse_version(self.pkg_info()["Version"]), Version):
                op = "=="
            else:
                op = "==="
            self.req = Requirement(
                "".join([
                    self.pkg_info()["Name"],
                    op,
                    self.pkg_info()["Version"],
                ])
            )
            self._correct_build_location()
        else:
            metadata_name = canonicalize_name(self.pkg_info()["Name"])
            if canonicalize_name(self.req.name) != metadata_name:
                logger.warning(
                    'Running setup.py (path:%s) egg_info for package %s '
                    'produced metadata for project name %s. Fix your '
                    '#egg=%s fragments.',
                    self.setup_py, self.name, metadata_name, self.name
                )
                self.req = Requirement(metadata_name)

    def egg_info_data(self, filename):
        if self.satisfied_by is not None:
            if not self.satisfied_by.has_metadata(filename):
                return None
            return self.satisfied_by.get_metadata(filename)
        assert self.source_dir
        filename = self.egg_info_path(filename)
        if not os.path.exists(filename):
            return None
        data = read_text_file(filename)
        return data

    def egg_info_path(self, filename):
        if self._egg_info_path is None:
            if self.editable:
                base = self.source_dir
            else:
                base = os.path.join(self.setup_py_dir, 'pip-egg-info')
            filenames = os.listdir(base)
            if self.editable:
                filenames = []
                for root, dirs, files in os.walk(base):
                    for dir in vcs.dirnames:
                        if dir in dirs:
                            dirs.remove(dir)
                    # Iterate over a copy of ``dirs``, since mutating
                    # a list while iterating over it can cause trouble.
                    # (See https://github.com/pypa/pip/pull/462.)
                    for dir in list(dirs):
                        # Don't search in anything that looks like a virtualenv
                        # environment
                        if (
                                os.path.lexists(
                                    os.path.join(root, dir, 'bin', 'python')
                                ) or
                                os.path.exists(
                                    os.path.join(
                                        root, dir, 'Scripts', 'Python.exe'
                                    )
                                )):
                            dirs.remove(dir)
                        # Also don't search through tests
                        elif dir == 'test' or dir == 'tests':
                            dirs.remove(dir)
                    filenames.extend([os.path.join(root, dir)
                                     for dir in dirs])
                filenames = [f for f in filenames if f.endswith('.egg-info')]

            if not filenames:
                raise InstallationError(
                    'No files/directories in %s (from %s)' % (base, filename)
                )
            assert filenames, \
                "No files/directories in %s (from %s)" % (base, filename)

            # if we have more than one match, we pick the toplevel one.  This
            # can easily be the case if there is a dist folder which contains
            # an extracted tarball for testing purposes.
            if len(filenames) > 1:
                filenames.sort(
                    key=lambda x: x.count(os.path.sep) +
                    (os.path.altsep and x.count(os.path.altsep) or 0)
                )
            self._egg_info_path = os.path.join(base, filenames[0])
        return os.path.join(self._egg_info_path, filename)

    def pkg_info(self):
        p = FeedParser()
        data = self.egg_info_data('PKG-INFO')
        if not data:
            logger.warning(
                'No PKG-INFO file found in %s',
                display_path(self.egg_info_path('PKG-INFO')),
            )
        p.feed(data or '')
        return p.close()

    _requirements_section_re = re.compile(r'\[(.*?)\]')

    @property
    def installed_version(self):
        return get_installed_version(self.name)

    def assert_source_matches_version(self):
        assert self.source_dir
        version = self.pkg_info()['version']
        if self.req.specifier and version not in self.req.specifier:
            logger.warning(
                'Requested %s, but installing version %s',
                self,
                self.installed_version,
            )
        else:
            logger.debug(
                'Source in %s has version %s, which satisfies requirement %s',
                display_path(self.source_dir),
                version,
                self,
            )

    def update_editable(self, obtain=True):
        if not self.link:
            logger.debug(
                "Cannot update repository at %s; repository location is "
                "unknown",
                self.source_dir,
            )
            return
        assert self.editable
        assert self.source_dir
        if self.link.scheme == 'file':
            # Static paths don't get updated
            return
        assert '+' in self.link.url, "bad url: %r" % self.link.url
        if not self.update:
            return
        vc_type, url = self.link.url.split('+', 1)
        backend = vcs.get_backend(vc_type)
        if backend:
            vcs_backend = backend(self.link.url)
            if obtain:
                vcs_backend.obtain(self.source_dir)
            else:
                vcs_backend.export(self.source_dir)
        else:
            assert 0, (
                'Unexpected version control type (in %s): %s'
                % (self.link, vc_type))

    def uninstall(self, auto_confirm=False):
        """
        Uninstall the distribution currently satisfying this requirement.

        Prompts before removing or modifying files unless
        ``auto_confirm`` is True.

        Refuses to delete or modify files outside of ``sys.prefix`` -
        thus uninstallation within a virtual environment can only
        modify that virtual environment, even if the virtualenv is
        linked to global site-packages.

        """
        if not self.check_if_exists():
            raise UninstallationError(
                "Cannot uninstall requirement %s, not installed" % (self.name,)
            )
        dist = self.satisfied_by or self.conflicts_with

        dist_path = normalize_path(dist.location)
        if not dist_is_local(dist):
            logger.info(
                "Not uninstalling %s at %s, outside environment %s",
                dist.key,
                dist_path,
                sys.prefix,
            )
            self.nothing_to_uninstall = True
            return

        if dist_path in get_stdlib():
            logger.info(
                "Not uninstalling %s at %s, as it is in the standard library.",
                dist.key,
                dist_path,
            )
            self.nothing_to_uninstall = True
            return

        paths_to_remove = UninstallPathSet(dist)
        develop_egg_link = egg_link_path(dist)
        develop_egg_link_egg_info = '{0}.egg-info'.format(
            pkg_resources.to_filename(dist.project_name))
        egg_info_exists = dist.egg_info and os.path.exists(dist.egg_info)
        # Special case for distutils installed package
        distutils_egg_info = getattr(dist._provider, 'path', None)

        # The order of these uninstall cases matters: with two installs of the
        # same package, pip needs to uninstall the currently detected version.
        if (egg_info_exists and dist.egg_info.endswith('.egg-info') and
                not dist.egg_info.endswith(develop_egg_link_egg_info)):
            # if dist.egg_info.endswith(develop_egg_link_egg_info), we
            # are in fact in the develop_egg_link case
            paths_to_remove.add(dist.egg_info)
            if dist.has_metadata('installed-files.txt'):
                for installed_file in dist.get_metadata(
                        'installed-files.txt').splitlines():
                    path = os.path.normpath(
                        os.path.join(dist.egg_info, installed_file)
                    )
                    paths_to_remove.add(path)
            # FIXME: need a test for this elif block
            # occurs with --single-version-externally-managed/--record outside
            # of pip
            elif dist.has_metadata('top_level.txt'):
                if dist.has_metadata('namespace_packages.txt'):
                    namespaces = dist.get_metadata('namespace_packages.txt')
                else:
                    namespaces = []
                for top_level_pkg in [
                        p for p
                        in dist.get_metadata('top_level.txt').splitlines()
                        if p and p not in namespaces]:
                    path = os.path.join(dist.location, top_level_pkg)
                    paths_to_remove.add(path)
                    paths_to_remove.add(path + '.py')
                    paths_to_remove.add(path + '.pyc')
                    paths_to_remove.add(path + '.pyo')

        elif distutils_egg_info:
            warnings.warn(
                "Uninstalling a distutils installed project ({0}) has been "
                "deprecated and will be removed in a future version. This is "
                "due to the fact that uninstalling a distutils project will "
                "only partially uninstall the project.".format(self.name),
                RemovedInPip10Warning,
            )
            paths_to_remove.add(distutils_egg_info)

        elif dist.location.endswith('.egg'):
            # package installed by easy_install
            # We cannot match on dist.egg_name because it can slightly vary
            # i.e. setuptools-0.6c11-py2.6.egg vs setuptools-0.6rc11-py2.6.egg
            paths_to_remove.add(dist.location)
            easy_install_egg = os.path.split(dist.location)[1]
            easy_install_pth = os.path.join(os.path.dirname(dist.location),
                                            'easy-install.pth')
            paths_to_remove.add_pth(easy_install_pth, './' + easy_install_egg)

        elif egg_info_exists and dist.egg_info.endswith('.dist-info'):
            for path in pip.wheel.uninstallation_paths(dist):
                paths_to_remove.add(path)

        elif develop_egg_link:
            # develop egg
            with open(develop_egg_link, 'r') as fh:
                link_pointer = os.path.normcase(fh.readline().strip())
            assert (link_pointer == dist.location), (
                'Egg-link %s does not match installed location of %s '
                '(at %s)' % (link_pointer, self.name, dist.location)
            )
            paths_to_remove.add(develop_egg_link)
            easy_install_pth = os.path.join(os.path.dirname(develop_egg_link),
                                            'easy-install.pth')
            paths_to_remove.add_pth(easy_install_pth, dist.location)

        else:
            logger.debug(
                'Not sure how to uninstall: %s - Check: %s',
                dist, dist.location)

        # find distutils scripts= scripts
        if dist.has_metadata('scripts') and dist.metadata_isdir('scripts'):
            for script in dist.metadata_listdir('scripts'):
                if dist_in_usersite(dist):
                    bin_dir = bin_user
                else:
                    bin_dir = bin_py
                paths_to_remove.add(os.path.join(bin_dir, script))
                if WINDOWS:
                    paths_to_remove.add(os.path.join(bin_dir, script) + '.bat')

        # find console_scripts
        if dist.has_metadata('entry_points.txt'):
            if six.PY2:
                options = {}
            else:
                options = {"delimiters": ('=', )}
            config = configparser.SafeConfigParser(**options)
            config.readfp(
                FakeFile(dist.get_metadata_lines('entry_points.txt'))
            )
            if config.has_section('console_scripts'):
                for name, value in config.items('console_scripts'):
                    if dist_in_usersite(dist):
                        bin_dir = bin_user
                    else:
                        bin_dir = bin_py
                    paths_to_remove.add(os.path.join(bin_dir, name))
                    if WINDOWS:
                        paths_to_remove.add(
                            os.path.join(bin_dir, name) + '.exe'
                        )
                        paths_to_remove.add(
                            os.path.join(bin_dir, name) + '.exe.manifest'
                        )
                        paths_to_remove.add(
                            os.path.join(bin_dir, name) + '-script.py'
                        )

        paths_to_remove.remove(auto_confirm)
        self.uninstalled = paths_to_remove

    def rollback_uninstall(self):
        if self.uninstalled:
            self.uninstalled.rollback()
        else:
            logger.error(
                "Can't rollback %s, nothing uninstalled.", self.name,
            )

    def commit_uninstall(self):
        if self.uninstalled:
            self.uninstalled.commit()
        elif not self.nothing_to_uninstall:
            logger.error(
                "Can't commit %s, nothing uninstalled.", self.name,
            )

    def archive(self, build_dir):
        assert self.source_dir
        create_archive = True
        archive_name = '%s-%s.zip' % (self.name, self.pkg_info()["version"])
        archive_path = os.path.join(build_dir, archive_name)
        if os.path.exists(archive_path):
            response = ask_path_exists(
                'The file %s exists. (i)gnore, (w)ipe, (b)ackup, (a)bort ' %
                display_path(archive_path), ('i', 'w', 'b', 'a'))
            if response == 'i':
                create_archive = False
            elif response == 'w':
                logger.warning('Deleting %s', display_path(archive_path))
                os.remove(archive_path)
            elif response == 'b':
                dest_file = backup_dir(archive_path)
                logger.warning(
                    'Backing up %s to %s',
                    display_path(archive_path),
                    display_path(dest_file),
                )
                shutil.move(archive_path, dest_file)
            elif response == 'a':
                sys.exit(-1)
        if create_archive:
            zip = zipfile.ZipFile(
                archive_path, 'w', zipfile.ZIP_DEFLATED,
                allowZip64=True
            )
            dir = os.path.normcase(os.path.abspath(self.setup_py_dir))
            for dirpath, dirnames, filenames in os.walk(dir):
                if 'pip-egg-info' in dirnames:
                    dirnames.remove('pip-egg-info')
                for dirname in dirnames:
                    dirname = os.path.join(dirpath, dirname)
                    name = self._clean_zip_name(dirname, dir)
                    zipdir = zipfile.ZipInfo(self.name + '/' + name + '/')
                    zipdir.external_attr = 0x1ED << 16  # 0o755
                    zip.writestr(zipdir, '')
                for filename in filenames:
                    if filename == PIP_DELETE_MARKER_FILENAME:
                        continue
                    filename = os.path.join(dirpath, filename)
                    name = self._clean_zip_name(filename, dir)
                    zip.write(filename, self.name + '/' + name)
            zip.close()
            logger.info('Saved %s', display_path(archive_path))

    def _clean_zip_name(self, name, prefix):
        assert name.startswith(prefix + os.path.sep), (
            "name %r doesn't start with prefix %r" % (name, prefix)
        )
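        # A hypothetical example: name '/build/pkg/sub/mod.py' with prefix
        # '/build/pkg' yields 'sub/mod.py' (the prefix plus separator is cut
        # off and OS path separators become '/').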
        name = name[len(prefix) + 1:]
        name = name.replace(os.path.sep, '/')
        return name

    def match_markers(self, extras_requested=None):
        if not extras_requested:
            # Provide an extra to safely evaluate the markers
            # without matching any extra
            extras_requested = ('',)
        if self.markers is not None:
            return any(
                self.markers.evaluate({'extra': extra})
                for extra in extras_requested)
        else:
            return True

    def install(self, install_options, global_options=[], root=None, prefix=None, strip_file_prefix=None):
        if self.editable:
            self.install_editable(
                install_options, global_options, prefix=prefix)
            return
        if self.is_wheel:
            version = pip.wheel.wheel_version(self.source_dir)
            pip.wheel.check_compatibility(version, self.name)

            self.move_wheel_files(
                self.source_dir,
                root=root,
                prefix=prefix,
                strip_file_prefix=strip_file_prefix
            )
            self.install_succeeded = True
            return

        # Extend the list of global and install options passed on to
        # the setup.py call with the ones from the requirements file.
        # Options specified in requirements file override those
        # specified on the command line, since the last option given
        # to setup.py is the one that is used.
        global_options += self.options.get('global_options', [])
        install_options += self.options.get('install_options', [])

        if self.isolated:
            global_options = list(global_options) + ["--no-user-cfg"]

        temp_location = tempfile.mkdtemp('-record', 'pip-')
        record_filename = os.path.join(temp_location, 'install-record.txt')
        try:
            install_args = self.get_install_args(
                global_options, record_filename, root, prefix)
            msg = 'Running setup.py install for %s' % (self.name,)
            with open_spinner(msg) as spinner:
                with indent_log():
                    call_subprocess(
                        install_args + install_options,
                        cwd=self.setup_py_dir,
                        show_stdout=False,
                        spinner=spinner,
                    )

            if not os.path.exists(record_filename):
                logger.debug('Record file %s not found', record_filename)
                return
            self.install_succeeded = True
            if self.as_egg:
                # there's no --always-unzip option we can pass to the install
                # command, so we are unable to save installed-files.txt
                return

            def prepend_root(path):
                if root is None or not os.path.isabs(path):
                    return path
                else:
                    return change_root(root, path)

            with open(record_filename) as f:
                for line in f:
                    directory = os.path.dirname(line)
                    if directory.endswith('.egg-info'):
                        egg_info_dir = prepend_root(directory)
                        break
                else:
                    logger.warning(
                        'Could not find .egg-info directory in install record'
                        ' for %s',
                        self,
                    )
                    # FIXME: put the record somewhere
                    # FIXME: should this be an error?
                    return
            new_lines = []
            with open(record_filename) as f:
                for line in f:
                    filename = line.strip()
                    if os.path.isdir(filename):
                        filename += os.path.sep
                    new_lines.append(
                        os.path.relpath(
                            prepend_root(filename), egg_info_dir)
                    )
            inst_files_path = os.path.join(egg_info_dir, 'installed-files.txt')
            with open(inst_files_path, 'w') as f:
                f.write('\n'.join(new_lines) + '\n')
        finally:
            if os.path.exists(record_filename):
                os.remove(record_filename)
            rmtree(temp_location)

    def ensure_has_source_dir(self, parent_dir):
        """Ensure that a source_dir is set.

        This will create a temporary build dir if the name of the requirement
        isn't known yet.

        :param parent_dir: The ideal pip parent_dir for the source_dir.
            Generally src_dir for editables and build_dir for sdists.
        :return: self.source_dir
        """
        if self.source_dir is None:
            self.source_dir = self.build_location(parent_dir)
        return self.source_dir

    def get_install_args(self, global_options, record_filename, root, prefix):
        install_args = [sys.executable, "-u"]
        install_args.append('-c')
        install_args.append(SETUPTOOLS_SHIM % self.setup_py)
        install_args += list(global_options) + \
            ['install', '--record', record_filename]

        if not self.as_egg:
            install_args += ['--single-version-externally-managed']

        if root is not None:
            install_args += ['--root', root]
        if prefix is not None:
            install_args += ['--prefix', prefix]

        if self.pycompile:
            install_args += ["--compile"]
        else:
            install_args += ["--no-compile"]

        if running_under_virtualenv():
            py_ver_str = 'python' + sysconfig.get_python_version()
            install_args += ['--install-headers',
                             os.path.join(sys.prefix, 'include', 'site',
                                          py_ver_str, self.name)]

        return install_args

    def remove_temporary_source(self):
        """Remove the source files from this requirement, if they are marked
        for deletion"""
        if self.source_dir and os.path.exists(
                os.path.join(self.source_dir, PIP_DELETE_MARKER_FILENAME)):
            logger.debug('Removing source in %s', self.source_dir)
            rmtree(self.source_dir)
        self.source_dir = None
        if self._temp_build_dir and os.path.exists(self._temp_build_dir):
            rmtree(self._temp_build_dir)
        self._temp_build_dir = None

    def install_editable(self, install_options,
                         global_options=(), prefix=None):
        logger.info('Running setup.py develop for %s', self.name)

        if self.isolated:
            global_options = list(global_options) + ["--no-user-cfg"]

        if prefix:
            prefix_param = ['--prefix={0}'.format(prefix)]
            install_options = list(install_options) + prefix_param

        with indent_log():
            # FIXME: should we do --install-headers here too?
            call_subprocess(
                [
                    sys.executable,
                    '-c',
                    SETUPTOOLS_SHIM % self.setup_py
                ] +
                list(global_options) +
                ['develop', '--no-deps'] +
                list(install_options),

                cwd=self.setup_py_dir,
                show_stdout=False)

        self.install_succeeded = True

    def check_if_exists(self):
        """Find an installed distribution that satisfies or conflicts
        with this requirement, and set self.satisfied_by or
        self.conflicts_with appropriately.
        """
        if self.req is None:
            return False
        try:
            # get_distribution() will resolve the entire list of requirements
            # anyway, and we've already determined that we need the requirement
            # in question, so strip the marker so that we don't try to
            # evaluate it.
            no_marker = Requirement(str(self.req))
            no_marker.marker = None
            self.satisfied_by = pkg_resources.get_distribution(str(no_marker))
            if self.editable and self.satisfied_by:
                self.conflicts_with = self.satisfied_by
                # when installing editables, nothing pre-existing should ever
                # satisfy
                self.satisfied_by = None
                return True
        except pkg_resources.DistributionNotFound:
            return False
        except pkg_resources.VersionConflict:
            existing_dist = pkg_resources.get_distribution(
                self.req.name
            )
            if self.use_user_site:
                if dist_in_usersite(existing_dist):
                    self.conflicts_with = existing_dist
                elif (running_under_virtualenv() and
                        dist_in_site_packages(existing_dist)):
                    raise InstallationError(
                        "Will not install to the user site because it will "
                        "lack sys.path precedence to %s in %s" %
                        (existing_dist.project_name, existing_dist.location)
                    )
            elif dist_in_install_path(existing_dist):
                self.conflicts_with = existing_dist
        return True

    @property
    def is_wheel(self):
        return self.link and self.link.is_wheel

    def move_wheel_files(self, wheeldir, root=None, prefix=None, strip_file_prefix=None):
        move_wheel_files(
            self.name, self.req, wheeldir,
            user=self.use_user_site,
            home=self.target_dir,
            root=root,
            prefix=prefix,
            pycompile=self.pycompile,
            isolated=self.isolated,
            strip_file_prefix=strip_file_prefix,
        )

    def get_dist(self):
        """Return a pkg_resources.Distribution built from self.egg_info_path"""
        egg_info = self.egg_info_path('').rstrip('/')
        base_dir = os.path.dirname(egg_info)
        metadata = pkg_resources.PathMetadata(base_dir, egg_info)
        dist_name = os.path.splitext(os.path.basename(egg_info))[0]
        return pkg_resources.Distribution(
            os.path.dirname(egg_info),
            project_name=dist_name,
            metadata=metadata)

    @property
    def has_hash_options(self):
        """Return whether any known-good hashes are specified as options.

        These activate --require-hashes mode; hashes specified as part of a
        URL do not.

        """
        return bool(self.options.get('hashes', {}))

    def hashes(self, trust_internet=True):
        """Return a hash-comparer that considers my option- and URL-based
        hashes to be known-good.

        Hashes in URLs--ones embedded in the requirements file, not ones
        downloaded from an index server--are almost peers with ones from
        flags. They satisfy --require-hashes (whether it was implicitly or
        explicitly activated) but do not activate it. md5 and sha224 are not
        allowed in flags, which should nudge people toward good algos. We
        always OR all hashes together, even ones from URLs.

        :param trust_internet: Whether to trust URL-based (#md5=...) hashes
            downloaded from the internet, as by populate_link()

        """
        good_hashes = self.options.get('hashes', {}).copy()
        link = self.link if trust_internet else self.original_link
        if link and link.hash:
            good_hashes.setdefault(link.hash_name, []).append(link.hash)
        return Hashes(good_hashes)


def _strip_postfix(req):
    """
        Strip req postfix (-dev, -0.2, etc.)
    """
    # FIXME: use package_to_requirement?
    match = re.search(r'^(.*?)(?:-dev|-\d.*)$', req)
    if match:
        # Strip off -dev, -0.2, etc.
        req = match.group(1)
    return req


def parse_editable(editable_req, default_vcs=None):
    """Parses an editable requirement into:
        - a requirement name
        - a URL
        - extras
        - editable options
    Accepted requirements:
        svn+http://blahblah@rev#egg=Foobar[baz]&subdirectory=version_subdir
        .[some_extra]
    """

    from pip.index import Link

    url = editable_req
    extras = None

    # If a file path is specified with extras, strip off the extras.
    m = re.match(r'^(.+)(\[[^\]]+\])$', url)
    if m:
        url_no_extras = m.group(1)
        extras = m.group(2)
    else:
        url_no_extras = url

    if os.path.isdir(url_no_extras):
        if not os.path.exists(os.path.join(url_no_extras, 'setup.py')):
            raise InstallationError(
                "Directory %r is not installable. File 'setup.py' not found." %
                url_no_extras
            )
        # Treating it as code that has already been checked out
        url_no_extras = path_to_url(url_no_extras)

    if url_no_extras.lower().startswith('file:'):
        package_name = Link(url_no_extras).egg_fragment
        if extras:
            return (
                package_name,
                url_no_extras,
                Requirement("placeholder" + extras.lower()).extras,
            )
        else:
            return package_name, url_no_extras, None

    for version_control in vcs:
        if url.lower().startswith('%s:' % version_control):
            url = '%s+%s' % (version_control, url)
            break

    if '+' not in url:
        if default_vcs:
            warnings.warn(
                "--default-vcs has been deprecated and will be removed in "
                "the future.",
                RemovedInPip10Warning,
            )
            url = default_vcs + '+' + url
        else:
            raise InstallationError(
                '%s should either be a path to a local project or a VCS url '
                'beginning with svn+, git+, hg+, or bzr+' %
                editable_req
            )

    vc_type = url.split('+', 1)[0].lower()

    if not vcs.get_backend(vc_type):
        error_message = 'For --editable=%s only ' % editable_req + \
            ', '.join([backend.name + '+URL' for backend in vcs.backends]) + \
            ' is currently supported'
        raise InstallationError(error_message)

    package_name = Link(url).egg_fragment
    if not package_name:
        raise InstallationError(
            "Could not detect requirement name, please specify one with #egg="
        )
    return _strip_postfix(package_name), url, None
site-packages/pip/wheel.py000064400000077037147511334620011556 0ustar00"""
Support for installing and building the "wheel" binary package format.
"""
from __future__ import absolute_import

import compileall
import csv
import errno
import functools
import hashlib
import logging
import os
import os.path
import re
import shutil
import stat
import sys
import tempfile
import warnings

from base64 import urlsafe_b64encode
from email.parser import Parser

from pip._vendor.six import StringIO

import pip
from pip.compat import expanduser
from pip.download import path_to_url, unpack_url
from pip.exceptions import (
    InstallationError, InvalidWheelFilename, UnsupportedWheel)
from pip.locations import distutils_scheme, PIP_DELETE_MARKER_FILENAME
from pip import pep425tags
from pip.utils import (
    call_subprocess, ensure_dir, captured_stdout, rmtree, read_chunks,
)
from pip.utils.ui import open_spinner
from pip.utils.logging import indent_log
from pip.utils.setuptools_build import SETUPTOOLS_SHIM
from pip._vendor.distlib.scripts import ScriptMaker
from pip._vendor import pkg_resources
from pip._vendor.packaging.utils import canonicalize_name
from pip._vendor.six.moves import configparser


wheel_ext = '.whl'

VERSION_COMPATIBLE = (1, 0)
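# The highest Wheel-Version (major, minor) this module can install;
# check_compatibility() below raises for a newer major version and only warns
# for a newer minor version.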


logger = logging.getLogger(__name__)


class WheelCache(object):
    """A cache of wheels for future installs."""

    def __init__(self, cache_dir, format_control):
        """Create a wheel cache.

        :param cache_dir: The root of the cache.
        :param format_control: A pip.index.FormatControl object to limit
            binaries being read from the cache.
        """
        self._cache_dir = expanduser(cache_dir) if cache_dir else None
        self._format_control = format_control

    def cached_wheel(self, link, package_name):
        return cached_wheel(
            self._cache_dir, link, self._format_control, package_name)


def _cache_for_link(cache_dir, link):
    """
    Return a directory to store cached wheels in for link.

    Because there are M wheels for any one sdist, we provide a directory
    to cache them in, and then consult that directory when looking up
    cache hits.

    We only insert things into the cache if they have plausible version
    numbers, so that we don't contaminate the cache with things that were not
    unique. E.g. ./package might have dozens of installs done for it and build
    a version of 0.0...and if we built and cached a wheel, we'd end up using
    the same wheel even if the source has been edited.

    :param cache_dir: The cache_dir being used by pip.
    :param link: The link of the sdist for which this will cache wheels.
    """

    # We want to generate a URL to use as our cache key; we don't want to just
    # re-use the link's URL because it might have other items in the fragment
    # and we don't care about those.
    key_parts = [link.url_without_fragment]
    if link.hash_name is not None and link.hash is not None:
        key_parts.append("=".join([link.hash_name, link.hash]))
    key_url = "#".join(key_parts)

    # Encode our key URL with sha224; we use it because it has security
    # properties similar to sha256, but a shorter total output (and is thus
    # less secure). That difference doesn't matter for our use case here.
    hashed = hashlib.sha224(key_url.encode()).hexdigest()

    # We nest the directories to avoid creating a huge number of top-level
    # directories, where we might run out of subdirectories on some filesystems.
    parts = [hashed[:2], hashed[2:4], hashed[4:6], hashed[6:]]
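    # Illustrative layout (hypothetical digest): a hash starting "77f2b1..."
    # maps to <cache_dir>/wheels/77/f2/b1/<remaining 50 hex characters>/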

    # Inside of the base location for cached wheels, expand our parts and join
    # them all together.
    return os.path.join(cache_dir, "wheels", *parts)


def cached_wheel(cache_dir, link, format_control, package_name):
    if not cache_dir:
        return link
    if not link:
        return link
    if link.is_wheel:
        return link
    if not link.is_artifact:
        return link
    if not package_name:
        return link
    canonical_name = canonicalize_name(package_name)
    formats = pip.index.fmt_ctl_formats(format_control, canonical_name)
    if "binary" not in formats:
        return link
    root = _cache_for_link(cache_dir, link)
    try:
        wheel_names = os.listdir(root)
    except OSError as e:
        if e.errno in (errno.ENOENT, errno.ENOTDIR):
            return link
        raise
    candidates = []
    for wheel_name in wheel_names:
        try:
            wheel = Wheel(wheel_name)
        except InvalidWheelFilename:
            continue
        if not wheel.supported():
            # Built for a different python/arch/etc
            continue
        candidates.append((wheel.support_index_min(), wheel_name))
    if not candidates:
        return link
    candidates.sort()
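    # Each candidate is (support_index_min, filename); sorting puts the wheel
    # whose tags rank best in pep425tags.supported_tags first.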
    path = os.path.join(root, candidates[0][1])
    return pip.index.Link(path_to_url(path))


def rehash(path, algo='sha256', blocksize=1 << 20):
    """Return (hash, length) for path using hashlib.new(algo)"""
    h = hashlib.new(algo)
    length = 0
    with open(path, 'rb') as f:
        for block in read_chunks(f, size=blocksize):
            length += len(block)
            h.update(block)
    digest = 'sha256=' + urlsafe_b64encode(
        h.digest()
    ).decode('latin1').rstrip('=')
    return (digest, length)


def open_for_csv(name, mode):
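    # The csv module wants files opened in binary mode on Python 2 but in text
    # mode with newline='' on Python 3; pick the right flags for this interpreter.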
    if sys.version_info[0] < 3:
        nl = {}
        bin = 'b'
    else:
        nl = {'newline': ''}
        bin = ''
    return open(name, mode + bin, **nl)


def fix_script(path):
    """Replace #!python with #!/path/to/python
    Return True if file was changed."""
    # XXX RECORD hashes will need to be updated
    if os.path.isfile(path):
        with open(path, 'rb') as script:
            firstline = script.readline()
            if not firstline.startswith(b'#!python'):
                return False
            exename = sys.executable.encode(sys.getfilesystemencoding())
            firstline = b'#!' + exename + os.linesep.encode("ascii")
            rest = script.read()
        with open(path, 'wb') as script:
            script.write(firstline)
            script.write(rest)
        return True

dist_info_re = re.compile(r"""^(?P<namever>(?P<name>.+?)(-(?P<ver>\d.+?))?)
                                \.dist-info$""", re.VERBOSE)
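# Illustrative match (not from pip's docs): "requests-2.18.4.dist-info" yields
# name "requests" and ver "2.18.4".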


def root_is_purelib(name, wheeldir):
    """
    Return True if the extracted wheel in wheeldir should go into purelib.
    """
    name_folded = name.replace("-", "_")
    for item in os.listdir(wheeldir):
        match = dist_info_re.match(item)
        if match and match.group('name') == name_folded:
            with open(os.path.join(wheeldir, item, 'WHEEL')) as wheel:
                for line in wheel:
                    line = line.lower().rstrip()
                    if line == "root-is-purelib: true":
                        return True
    return False


def get_entrypoints(filename):
    if not os.path.exists(filename):
        return {}, {}

    # This is done because you can pass a string to entry_points wrappers which
    # means that they may or may not be valid INI files. The attempt here is to
    # strip leading and trailing whitespace in order to make them valid INI
    # files.
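    # A typical entry_points.txt looks like (illustrative contents):
    #   [console_scripts]
    #   pip = pip:main
    #   [gui_scripts]
    #   some-gui-tool = sometool.gui:main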
    with open(filename) as fp:
        data = StringIO()
        for line in fp:
            data.write(line.strip())
            data.write("\n")
        data.seek(0)

    cp = configparser.RawConfigParser()
    cp.optionxform = lambda option: option
    cp.readfp(data)

    console = {}
    gui = {}
    if cp.has_section('console_scripts'):
        console = dict(cp.items('console_scripts'))
    if cp.has_section('gui_scripts'):
        gui = dict(cp.items('gui_scripts'))
    return console, gui


def move_wheel_files(name, req, wheeldir, user=False, home=None, root=None,
                     pycompile=True, scheme=None, isolated=False, prefix=None,
                     strip_file_prefix=None):
    """Install a wheel"""

    if not scheme:
        scheme = distutils_scheme(
            name, user=user, home=home, root=root, isolated=isolated,
            prefix=prefix,
        )

    if root_is_purelib(name, wheeldir):
        lib_dir = scheme['purelib']
    else:
        lib_dir = scheme['platlib']

    info_dir = []
    data_dirs = []
    source = wheeldir.rstrip(os.path.sep) + os.path.sep

    # Record details of the files moved
    #   installed = files copied from the wheel to the destination
    #   changed = files changed while installing (scripts #! line typically)
    #   generated = files newly generated during the install (script wrappers)
    installed = {}
    changed = set()
    generated = []

    # Compile all of the pyc files that we're going to be installing
    if pycompile:
        with captured_stdout() as stdout:
            with warnings.catch_warnings():
                warnings.filterwarnings('ignore')
                compileall.compile_dir(source, force=True, quiet=True)
        logger.debug(stdout.getvalue())

    def normpath(src, p):
        return os.path.relpath(src, p).replace(os.path.sep, '/')

    def record_installed(srcfile, destfile, modified=False):
        """Map archive RECORD paths to installation RECORD paths."""
        oldpath = normpath(srcfile, wheeldir)
        newpath = normpath(destfile, lib_dir)
        installed[oldpath] = newpath
        if modified:
            changed.add(destfile)

    def clobber(source, dest, is_base, fixer=None, filter=None):
        ensure_dir(dest)  # common for the 'include' path

        for dir, subdirs, files in os.walk(source):
            basedir = dir[len(source):].lstrip(os.path.sep)
            destdir = os.path.join(dest, basedir)
            if is_base and basedir.split(os.path.sep, 1)[0].endswith('.data'):
                continue
            for s in subdirs:
                destsubdir = os.path.join(dest, basedir, s)
                if is_base and basedir == '' and destsubdir.endswith('.data'):
                    data_dirs.append(s)
                    continue
                elif (is_base and
                        s.endswith('.dist-info') and
                        canonicalize_name(s).startswith(
                            canonicalize_name(req.name))):
                    assert not info_dir, ('Multiple .dist-info directories: ' +
                                          destsubdir + ', ' +
                                          ', '.join(info_dir))
                    info_dir.append(destsubdir)
            for f in files:
                # Skip unwanted files
                if filter and filter(f):
                    continue
                srcfile = os.path.join(dir, f)
                destfile = os.path.join(dest, basedir, f)
                # directory creation is lazy and after the file filtering above
                # to ensure we don't install empty dirs; empty dirs can't be
                # uninstalled.
                ensure_dir(destdir)

                # We use copyfile (not move, copy, or copy2) to be extra sure
                # that we are not moving directories over (copyfile fails for
                # directories) as well as to ensure that we are not copying
                # over any metadata because we want more control over what
                # metadata we actually copy over.
                shutil.copyfile(srcfile, destfile)

                # Copy over the metadata for the file, currently this only
                # includes the atime and mtime.
                st = os.stat(srcfile)
                if hasattr(os, "utime"):
                    os.utime(destfile, (st.st_atime, st.st_mtime))

                # If our file is executable, then make our destination file
                # executable.
                if os.access(srcfile, os.X_OK):
                    st = os.stat(srcfile)
                    permissions = (
                        st.st_mode | stat.S_IXUSR | stat.S_IXGRP | stat.S_IXOTH
                    )
                    os.chmod(destfile, permissions)

                changed = False
                if fixer:
                    changed = fixer(destfile)
                record_installed(srcfile, destfile, changed)

    clobber(source, lib_dir, True)

    assert info_dir, "%s .dist-info directory not found" % req

    # Get the defined entry points
    ep_file = os.path.join(info_dir[0], 'entry_points.txt')
    console, gui = get_entrypoints(ep_file)

    def is_entrypoint_wrapper(name):
        # EP, EP.exe and EP-script.py are scripts generated for
        # entry point EP by setuptools
        if name.lower().endswith('.exe'):
            matchname = name[:-4]
        elif name.lower().endswith('-script.py'):
            matchname = name[:-10]
        elif name.lower().endswith(".pya"):
            matchname = name[:-4]
        else:
            matchname = name
        # Ignore setuptools-generated scripts
        return (matchname in console or matchname in gui)

    for datadir in data_dirs:
        fixer = None
        filter = None
        for subdir in os.listdir(os.path.join(wheeldir, datadir)):
            fixer = None
            if subdir == 'scripts':
                fixer = fix_script
                filter = is_entrypoint_wrapper
            source = os.path.join(wheeldir, datadir, subdir)
            dest = scheme[subdir]
            clobber(source, dest, False, fixer=fixer, filter=filter)

    maker = ScriptMaker(None, scheme['scripts'])

    # Ensure old scripts are overwritten.
    # See https://github.com/pypa/pip/issues/1800
    maker.clobber = True

    # Ensure we don't generate any variants for scripts because this is almost
    # never what somebody wants.
    # See https://bitbucket.org/pypa/distlib/issue/35/
    maker.variants = set(('', ))

    # This is required because otherwise distlib creates scripts that are not
    # executable.
    # See https://bitbucket.org/pypa/distlib/issue/32/
    maker.set_mode = True

    # Simplify the script and fix the fact that the default script swallows
    # every single stack trace.
    # See https://bitbucket.org/pypa/distlib/issue/34/
    # See https://bitbucket.org/pypa/distlib/issue/33/
    def _get_script_text(entry):
        if entry.suffix is None:
            raise InstallationError(
                "Invalid script entry point: %s for req: %s - A callable "
                "suffix is required. Cf https://packaging.python.org/en/"
                "latest/distributing.html#console-scripts for more "
                "information." % (entry, req)
            )
        return maker.script_template % {
            "module": entry.prefix,
            "import_name": entry.suffix.split(".")[0],
            "func": entry.suffix,
        }

    maker._get_script_text = _get_script_text
    maker.script_template = """# -*- coding: utf-8 -*-
import re
import sys

from %(module)s import %(import_name)s

if __name__ == '__main__':
    sys.argv[0] = re.sub(r'(-script\.pyw?|\.exe)?$', '', sys.argv[0])
    sys.exit(%(func)s())
"""

    # Special case pip and setuptools to generate versioned wrappers
    #
    # The issue is that some projects (specifically, pip and setuptools) use
    # code in setup.py to create "versioned" entry points - pip2.7 on Python
    # 2.7, pip3.3 on Python 3.3, etc. But these entry points are baked into
    # the wheel metadata at build time, and so if the wheel is installed with
    # a *different* version of Python the entry points will be wrong. The
    # correct fix for this is to enhance the metadata to be able to describe
    # such versioned entry points, but that won't happen till Metadata 2.0 is
    # available.
    # In the meantime, projects using versioned entry points will either have
    # incorrect versioned entry points, or they will not be able to distribute
    # "universal" wheels (i.e., they will need a wheel per Python version).
    #
    # Because setuptools and pip are bundled with _ensurepip and virtualenv,
    # we need to use universal wheels. So, as a stopgap until Metadata 2.0, we
    # override the versioned entry points in the wheel and generate the
    # correct ones. This code is purely a short-term measure until Metadata 2.0
    # is available.
    #
    # To add to the level of hack in this section of code: in order to support
    # ensurepip, this code looks for an ``ENSUREPIP_OPTIONS`` environment
    # variable which controls which versioned scripts get installed.
    #
    # ENSUREPIP_OPTIONS=altinstall
    #   - Only pipX.Y and easy_install-X.Y will be generated and installed
    # ENSUREPIP_OPTIONS=install
    #   - pipX.Y, pipX, easy_install-X.Y will be generated and installed. Note
    #     that this option is technically if ENSUREPIP_OPTIONS is set and is
    #     not altinstall
    # DEFAULT
    #   - The default behavior is to install pip, pipX, pipX.Y, easy_install
    #     and easy_install-X.Y.
    pip_script = console.pop('pip', None)
    if pip_script:
        if "ENSUREPIP_OPTIONS" not in os.environ:
            spec = 'pip = ' + pip_script
            generated.extend(maker.make(spec))

        if os.environ.get("ENSUREPIP_OPTIONS", "") != "altinstall":
            spec = 'pip%s = %s' % (sys.version[:1], pip_script)
            generated.extend(maker.make(spec))

        spec = 'pip%s = %s' % (sys.version[:3], pip_script)
        generated.extend(maker.make(spec))
        # Delete any other versioned pip entry points
        pip_ep = [k for k in console if re.match(r'pip(\d(\.\d)?)?$', k)]
        for k in pip_ep:
            del console[k]
    easy_install_script = console.pop('easy_install', None)
    if easy_install_script:
        if "ENSUREPIP_OPTIONS" not in os.environ:
            spec = 'easy_install = ' + easy_install_script
            generated.extend(maker.make(spec))

        spec = 'easy_install-%s = %s' % (sys.version[:3], easy_install_script)
        generated.extend(maker.make(spec))
        # Delete any other versioned easy_install entry points
        easy_install_ep = [
            k for k in console if re.match(r'easy_install(-\d\.\d)?$', k)
        ]
        for k in easy_install_ep:
            del console[k]

    # Generate the console and GUI entry points specified in the wheel
    if len(console) > 0:
        generated.extend(
            maker.make_multiple(['%s = %s' % kv for kv in console.items()])
        )
    if len(gui) > 0:
        generated.extend(
            maker.make_multiple(
                ['%s = %s' % kv for kv in gui.items()],
                {'gui': True}
            )
        )

    # Record pip as the installer
    installer = os.path.join(info_dir[0], 'INSTALLER')
    temp_installer = os.path.join(info_dir[0], 'INSTALLER.pip')
    with open(temp_installer, 'wb') as installer_file:
        installer_file.write(b'pip\n')
    shutil.move(temp_installer, installer)
    generated.append(installer)

    # Record details of all files installed
    record = os.path.join(info_dir[0], 'RECORD')
    temp_record = os.path.join(info_dir[0], 'RECORD.pip')
    with open_for_csv(record, 'r') as record_in:
        with open_for_csv(temp_record, 'w+') as record_out:
            reader = csv.reader(record_in)
            writer = csv.writer(record_out)
            for row in reader:
                row[0] = installed.pop(row[0], row[0])
                if row[0] in changed:
                    row[1], row[2] = rehash(row[0])
                writer.writerow(row)
            for f in generated:
                h, l = rehash(f)
                final_path = normpath(f, lib_dir)
                if strip_file_prefix and final_path.startswith(strip_file_prefix):
                    final_path = os.path.join(os.sep,
                            os.path.relpath(final_path, strip_file_prefix))
                writer.writerow((final_path, h, l))
            for f in installed:
                writer.writerow((installed[f], '', ''))
    shutil.move(temp_record, record)


def _unique(fn):
    @functools.wraps(fn)
    def unique(*args, **kw):
        seen = set()
        for item in fn(*args, **kw):
            if item not in seen:
                seen.add(item)
                yield item
    return unique


# TODO: this goes somewhere besides the wheel module
@_unique
def uninstallation_paths(dist):
    """
    Yield all the uninstallation paths for dist based on RECORD-without-.pyc

    Yield paths to all the files in RECORD. For each .py file in RECORD, add
    the .pyc in the same directory.

    UninstallPathSet.add() takes care of the __pycache__ .pyc.
    """
    from pip.utils import FakeFile  # circular import
    r = csv.reader(FakeFile(dist.get_metadata_lines('RECORD')))
    for row in r:
        path = os.path.join(dist.location, row[0])
        yield path
        if path.endswith('.py'):
            dn, fn = os.path.split(path)
            base = fn[:-3]
            path = os.path.join(dn, base + '.pyc')
            yield path


def wheel_version(source_dir):
    """
    Return the Wheel-Version of an extracted wheel, if possible.

    Otherwise, return False if we couldn't parse / extract it.
    """
    try:
        dist = [d for d in pkg_resources.find_on_path(None, source_dir)][0]

        wheel_data = dist.get_metadata('WHEEL')
        wheel_data = Parser().parsestr(wheel_data)

        version = wheel_data['Wheel-Version'].strip()
        version = tuple(map(int, version.split('.')))
        return version
    except:
        return False


def check_compatibility(version, name):
    """
    Raises errors or warns if called with an incompatible Wheel-Version.

    Pip should refuse to install a Wheel-Version that's a major series
    ahead of what it's compatible with (e.g. 2.0 > 1.1), and warn when
    installing a version that is only a minor version ahead (e.g. 1.2 > 1.1).

    version: a 2-tuple representing a Wheel-Version (Major, Minor)
    name: name of wheel or package to raise exception about

    :raises UnsupportedWheel: when an incompatible Wheel-Version is given
    """
    if not version:
        raise UnsupportedWheel(
            "%s is in an unsupported or invalid wheel" % name
        )
    if version[0] > VERSION_COMPATIBLE[0]:
        raise UnsupportedWheel(
            "%s's Wheel-Version (%s) is not compatible with this version "
            "of pip" % (name, '.'.join(map(str, version)))
        )
    elif version > VERSION_COMPATIBLE:
        logger.warning(
            'Installing from a newer Wheel-Version (%s)',
            '.'.join(map(str, version)),
        )


class Wheel(object):
    """A wheel file"""

    # TODO: maybe move the install code into this class

    wheel_file_re = re.compile(
        r"""^(?P<namever>(?P<name>.+?)-(?P<ver>\d.*?))
        ((-(?P<build>\d.*?))?-(?P<pyver>.+?)-(?P<abi>.+?)-(?P<plat>.+?)
        \.whl|\.dist-info)$""",
        re.VERBOSE
    )
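    # Illustrative parse (not from pip's docs): "pip-9.0.1-py2.py3-none-any.whl"
    # yields name "pip", ver "9.0.1", pyver "py2.py3" (split into ['py2', 'py3']),
    # abi "none" and plat "any".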

    def __init__(self, filename):
        """
        :raises InvalidWheelFilename: when the filename is invalid for a wheel
        """
        wheel_info = self.wheel_file_re.match(filename)
        if not wheel_info:
            raise InvalidWheelFilename(
                "%s is not a valid wheel filename." % filename
            )
        self.filename = filename
        self.name = wheel_info.group('name').replace('_', '-')
        # we'll assume "_" means "-" due to wheel naming scheme
        # (https://github.com/pypa/pip/issues/1150)
        self.version = wheel_info.group('ver').replace('_', '-')
        self.pyversions = wheel_info.group('pyver').split('.')
        self.abis = wheel_info.group('abi').split('.')
        self.plats = wheel_info.group('plat').split('.')

        # All the tag combinations from this file
        self.file_tags = set(
            (x, y, z) for x in self.pyversions
            for y in self.abis for z in self.plats
        )

    def support_index_min(self, tags=None):
        """
        Return the lowest index that one of the wheel's file_tag combinations
        achieves in the supported_tags list, e.g. if there are 8 supported tags
        and one of the file tags is first in the list, then return 0.  Returns
        None if the wheel is not supported.
        """
        if tags is None:  # for mock
            tags = pep425tags.supported_tags
        indexes = [tags.index(c) for c in self.file_tags if c in tags]
        return min(indexes) if indexes else None

    def supported(self, tags=None):
        """Is this wheel supported on this system?"""
        if tags is None:  # for mock
            tags = pep425tags.supported_tags
        return bool(set(tags).intersection(self.file_tags))


class WheelBuilder(object):
    """Build wheels from a RequirementSet."""

    def __init__(self, requirement_set, finder, build_options=None,
                 global_options=None):
        self.requirement_set = requirement_set
        self.finder = finder
        self._cache_root = requirement_set._wheel_cache._cache_dir
        self._wheel_dir = requirement_set.wheel_download_dir
        self.build_options = build_options or []
        self.global_options = global_options or []

    def _build_one(self, req, output_dir, python_tag=None):
        """Build one wheel.

        :return: The filename of the built wheel, or None if the build failed.
        """
        tempd = tempfile.mkdtemp('pip-wheel-')
        try:
            if self.__build_one(req, tempd, python_tag=python_tag):
                try:
                    wheel_name = os.listdir(tempd)[0]
                    wheel_path = os.path.join(output_dir, wheel_name)
                    shutil.move(os.path.join(tempd, wheel_name), wheel_path)
                    logger.info('Stored in directory: %s', output_dir)
                    return wheel_path
                except:
                    pass
            # Ignore return, we can't do anything else useful.
            self._clean_one(req)
            return None
        finally:
            rmtree(tempd)

    def _base_setup_args(self, req):
        return [
            sys.executable, "-u", '-c',
            SETUPTOOLS_SHIM % req.setup_py
        ] + list(self.global_options)

    def __build_one(self, req, tempd, python_tag=None):
        base_args = self._base_setup_args(req)

        spin_message = 'Running setup.py bdist_wheel for %s' % (req.name,)
        with open_spinner(spin_message) as spinner:
            logger.debug('Destination directory: %s', tempd)
            wheel_args = base_args + ['bdist_wheel', '-d', tempd] \
                + self.build_options

            if python_tag is not None:
                wheel_args += ["--python-tag", python_tag]

            try:
                call_subprocess(wheel_args, cwd=req.setup_py_dir,
                                show_stdout=False, spinner=spinner)
                return True
            except:
                spinner.finish("error")
                logger.error('Failed building wheel for %s', req.name)
                return False

    def _clean_one(self, req):
        base_args = self._base_setup_args(req)

        logger.info('Running setup.py clean for %s', req.name)
        clean_args = base_args + ['clean', '--all']
        try:
            call_subprocess(clean_args, cwd=req.source_dir, show_stdout=False)
            return True
        except:
            logger.error('Failed cleaning build dir for %s', req.name)
            return False

    def build(self, autobuilding=False):
        """Build wheels.

        :param unpack: If True, replace the sdist we built from with the
            newly built wheel, in preparation for installation.
        :return: True if all the wheels built correctly.
        """
        assert self._wheel_dir or (autobuilding and self._cache_root)
        # unpack sdists and construct the requirement set
        self.requirement_set.prepare_files(self.finder)

        reqset = self.requirement_set.requirements.values()

        buildset = []
        for req in reqset:
            if req.constraint:
                continue
            if req.is_wheel:
                if not autobuilding:
                    logger.info(
                        'Skipping %s, due to already being wheel.', req.name)
            elif autobuilding and req.editable:
                pass
            elif autobuilding and req.link and not req.link.is_artifact:
                pass
            elif autobuilding and not req.source_dir:
                pass
            else:
                if autobuilding:
                    link = req.link
                    base, ext = link.splitext()
                    if pip.index.egg_info_matches(base, None, link) is None:
                        # Doesn't look like a package - don't autobuild a wheel
                        # because we'll have no way to look up the result sanely
                        continue
                    if "binary" not in pip.index.fmt_ctl_formats(
                            self.finder.format_control,
                            canonicalize_name(req.name)):
                        logger.info(
                            "Skipping bdist_wheel for %s, due to binaries "
                            "being disabled for it.", req.name)
                        continue
                buildset.append(req)

        if not buildset:
            return True

        # Build the wheels.
        logger.info(
            'Building wheels for collected packages: %s',
            ', '.join([req.name for req in buildset]),
        )
        with indent_log():
            build_success, build_failure = [], []
            for req in buildset:
                python_tag = None
                if autobuilding:
                    python_tag = pep425tags.implementation_tag
                    output_dir = _cache_for_link(self._cache_root, req.link)
                    try:
                        ensure_dir(output_dir)
                    except OSError as e:
                        logger.warning("Building wheel for %s failed: %s",
                                       req.name, e)
                        build_failure.append(req)
                        continue
                else:
                    output_dir = self._wheel_dir
                wheel_file = self._build_one(
                    req, output_dir,
                    python_tag=python_tag,
                )
                if wheel_file:
                    build_success.append(req)
                    if autobuilding:
                        # XXX: This is mildly duplicative with prepare_files,
                        # but not close enough to pull out to a single common
                        # method.
                        # The code below assumes temporary source dirs -
                        # prevent it doing bad things.
                        if req.source_dir and not os.path.exists(os.path.join(
                                req.source_dir, PIP_DELETE_MARKER_FILENAME)):
                            raise AssertionError(
                                "bad source dir - missing marker")
                        # Delete the source we built the wheel from
                        req.remove_temporary_source()
                        # set the build directory again - name is known from
                        # the work prepare_files did.
                        req.source_dir = req.build_location(
                            self.requirement_set.build_dir)
                        # Update the link for this.
                        req.link = pip.index.Link(
                            path_to_url(wheel_file))
                        assert req.link.is_wheel
                        # extract the wheel into the dir
                        unpack_url(
                            req.link, req.source_dir, None, False,
                            session=self.requirement_set.session)
                else:
                    build_failure.append(req)

        # notify success/failure
        if build_success:
            logger.info(
                'Successfully built %s',
                ' '.join([req.name for req in build_success]),
            )
        if build_failure:
            logger.info(
                'Failed to build %s',
                ' '.join([req.name for req in build_failure]),
            )
        # Return True if all builds were successful
        return len(build_failure) == 0
site-packages/syspurpose-1.28.42-py3.6.egg-info/top_level.txt000064400000000013147511334620017433 0ustar00syspurpose
site-packages/syspurpose-1.28.42-py3.6.egg-info/entry_points.txt000064400000000065147511334620020206 0ustar00[console_scripts]
syspurpose = syspurpose.main:main

site-packages/syspurpose-1.28.42-py3.6.egg-info/PKG-INFO000064400000000367147511334620016012 0ustar00Metadata-Version: 1.0
Name: syspurpose
Version: 1.28.42
Summary: Manage Red Hat System Purpose
Home-page: http://www.candlepinproject.org
Author: Chris Snyder
Author-email: chainsaw@redhat.com
License: GPLv2
Description: UNKNOWN
Platform: UNKNOWN
site-packages/syspurpose-1.28.42-py3.6.egg-info/SOURCES.txt000064400000000575147511334620016602 0ustar00README.md
setup.cfg
setup.py
man/syspurpose.8
src/syspurpose/__init__.py
src/syspurpose/cli.py
src/syspurpose/files.py
src/syspurpose/i18n.py
src/syspurpose/main.py
src/syspurpose/utils.py
src/syspurpose.egg-info/PKG-INFO
src/syspurpose.egg-info/SOURCES.txt
src/syspurpose.egg-info/dependency_links.txt
src/syspurpose.egg-info/entry_points.txt
src/syspurpose.egg-info/top_level.txtsite-packages/syspurpose-1.28.42-py3.6.egg-info/dependency_links.txt000064400000000001147511334620020755 0ustar00
site-packages/configobj.py000064400000257033147511334620011616 0ustar00# configobj.py
# A config file reader/writer that supports nested sections in config files.
# Copyright (C) 2005-2014:
# (name) : (email)
# Michael Foord: fuzzyman AT voidspace DOT org DOT uk
# Nicola Larosa: nico AT tekNico DOT net
# Rob Dennis: rdennis AT gmail DOT com
# Eli Courtwright: eli AT courtwright DOT org

# This software is licensed under the terms of the BSD license.
# http://opensource.org/licenses/BSD-3-Clause

# ConfigObj 5 - main repository for documentation and issue tracking:
# https://github.com/DiffSK/configobj

import os
import re
import sys

from codecs import BOM_UTF8, BOM_UTF16, BOM_UTF16_BE, BOM_UTF16_LE

import six
from _version import __version__

# imported lazily to avoid startup performance hit if it isn't used
compiler = None

# A dictionary mapping BOM to
# the encoding to decode with, and what to set the
# encoding attribute to.
BOMS = {
    BOM_UTF8: ('utf_8', None),
    BOM_UTF16_BE: ('utf16_be', 'utf_16'),
    BOM_UTF16_LE: ('utf16_le', 'utf_16'),
    BOM_UTF16: ('utf_16', 'utf_16'),
    }
# All legal variants of the BOM codecs.
# TODO: the list of aliases is not meant to be exhaustive, is there a
#   better way ?
BOM_LIST = {
    'utf_16': 'utf_16',
    'u16': 'utf_16',
    'utf16': 'utf_16',
    'utf-16': 'utf_16',
    'utf16_be': 'utf16_be',
    'utf_16_be': 'utf16_be',
    'utf-16be': 'utf16_be',
    'utf16_le': 'utf16_le',
    'utf_16_le': 'utf16_le',
    'utf-16le': 'utf16_le',
    'utf_8': 'utf_8',
    'u8': 'utf_8',
    'utf': 'utf_8',
    'utf8': 'utf_8',
    'utf-8': 'utf_8',
    }

# Map of encodings to the BOM to write.
BOM_SET = {
    'utf_8': BOM_UTF8,
    'utf_16': BOM_UTF16,
    'utf16_be': BOM_UTF16_BE,
    'utf16_le': BOM_UTF16_LE,
    None: BOM_UTF8
    }


def match_utf8(encoding):
    return BOM_LIST.get(encoding.lower()) == 'utf_8'


# Quote strings used for writing values
squot = "'%s'"
dquot = '"%s"'
noquot = "%s"
wspace_plus = ' \r\n\v\t\'"'
tsquot = '"""%s"""'
tdquot = "'''%s'''"

# Sentinel for use in getattr calls to replace hasattr
MISSING = object()

__all__ = (
    'DEFAULT_INDENT_TYPE',
    'DEFAULT_INTERPOLATION',
    'ConfigObjError',
    'NestingError',
    'ParseError',
    'DuplicateError',
    'ConfigspecError',
    'ConfigObj',
    'SimpleVal',
    'InterpolationError',
    'InterpolationLoopError',
    'MissingInterpolationOption',
    'RepeatSectionError',
    'ReloadError',
    'UnreprError',
    'UnknownType',
    'flatten_errors',
    'get_extra_values'
)

DEFAULT_INTERPOLATION = 'configparser'
DEFAULT_INDENT_TYPE = '    '
MAX_INTERPOL_DEPTH = 10

OPTION_DEFAULTS = {
    'interpolation': True,
    'raise_errors': False,
    'list_values': True,
    'create_empty': False,
    'file_error': False,
    'configspec': None,
    'stringify': True,
    # option may be set to one of ('', ' ', '\t')
    'indent_type': None,
    'encoding': None,
    'default_encoding': None,
    'unrepr': False,
    'write_empty_values': False,
}

# this could be replaced if six is used for compatibility, or there are no
# more assertions about items being a string


def getObj(s):
    global compiler
    if compiler is None:
        import compiler
    s = "a=" + s
    p = compiler.parse(s)
    return p.getChildren()[1].getChildren()[0].getChildren()[1]


class UnknownType(Exception):
    pass


class Builder(object):
    
    def build(self, o):
        m = getattr(self, 'build_' + o.__class__.__name__, None)
        if m is None:
            raise UnknownType(o.__class__.__name__)
        return m(o)
    
    def build_List(self, o):
        return list(map(self.build, o.getChildren()))
    
    def build_Const(self, o):
        return o.value
    
    def build_Dict(self, o):
        d = {}
        i = iter(map(self.build, o.getChildren()))
        for el in i:
            d[el] = next(i)
        return d
    
    def build_Tuple(self, o):
        return tuple(self.build_List(o))
    
    def build_Name(self, o):
        if o.name == 'None':
            return None
        if o.name == 'True':
            return True
        if o.name == 'False':
            return False
        
        # An undefined Name
        raise UnknownType('Undefined Name')
    
    def build_Add(self, o):
        real, imag = list(map(self.build_Const, o.getChildren()))
        try:
            real = float(real)
        except TypeError:
            raise UnknownType('Add')
        if not isinstance(imag, complex) or imag.real != 0.0:
            raise UnknownType('Add')
        return real+imag
    
    def build_Getattr(self, o):
        parent = self.build(o.expr)
        return getattr(parent, o.attrname)
    
    def build_UnarySub(self, o):
        return -self.build_Const(o.getChildren()[0])
    
    def build_UnaryAdd(self, o):
        return self.build_Const(o.getChildren()[0])


_builder = Builder()


def unrepr(s):
    if not s:
        return s
    
    # this is supposed to be safe
    import ast
    return ast.literal_eval(s)


class ConfigObjError(SyntaxError):
    """
    This is the base class for all errors that ConfigObj raises.
    It is a subclass of SyntaxError.
    """
    def __init__(self, message='', line_number=None, line=''):
        self.line = line
        self.line_number = line_number
        SyntaxError.__init__(self, message)


class NestingError(ConfigObjError):
    """
    This error indicates a level of nesting that doesn't match.
    """


class ParseError(ConfigObjError):
    """
    This error indicates that a line is badly written.
    It is neither a valid ``key = value`` line,
    nor a valid section marker line.
    """


class ReloadError(IOError):
    """
    A 'reload' operation failed.
    This exception is a subclass of ``IOError``.
    """
    def __init__(self):
        IOError.__init__(self, 'reload failed, filename is not set.')


class DuplicateError(ConfigObjError):
    """
    The keyword or section specified already exists.
    """


class ConfigspecError(ConfigObjError):
    """
    An error occurred whilst parsing a configspec.
    """


class InterpolationError(ConfigObjError):
    """Base class for the two interpolation errors."""


class InterpolationLoopError(InterpolationError):
    """Maximum interpolation depth exceeded in string interpolation."""

    def __init__(self, option):
        InterpolationError.__init__(
            self,
            'interpolation loop detected in value "%s".' % option)


class RepeatSectionError(ConfigObjError):
    """
    This error indicates additional sections in a section with a
    ``__many__`` (repeated) section.
    """


class MissingInterpolationOption(InterpolationError):
    """A value specified for interpolation was missing."""
    def __init__(self, option):
        msg = 'missing option "%s" in interpolation.' % option
        InterpolationError.__init__(self, msg)


class UnreprError(ConfigObjError):
    """An error parsing in unrepr mode."""



class InterpolationEngine(object):
    """
    A helper class that performs string interpolation.

    This class is an abstract base class; its descendants perform
    the actual work.
    """

    # compiled regexp to use in self.interpolate()
    _KEYCRE = re.compile(r"%\(([^)]*)\)s")
    _cookie = '%'

    def __init__(self, section):
        # the Section instance that "owns" this engine
        self.section = section


    def interpolate(self, key, value):
        # short-cut
        if not self._cookie in value:
            return value
        
        def recursive_interpolate(key, value, section, backtrail):
            """The function that does the actual work.

            ``value``: the string we're trying to interpolate.
            ``section``: the section in which that string was found
            ``backtrail``: a dict to keep track of where we've been,
            to detect and prevent infinite recursion loops

            This is similar to a depth-first-search algorithm.
            """
            # Have we been here already?
            if (key, section.name) in backtrail:
                # Yes - infinite loop detected
                raise InterpolationLoopError(key)
            # Place a marker on our backtrail so we won't come back here again
            backtrail[(key, section.name)] = 1

            # Now start the actual work
            match = self._KEYCRE.search(value)
            while match:
                # The actual parsing of the match is implementation-dependent,
                # so delegate to our helper function
                k, v, s = self._parse_match(match)
                if k is None:
                    # That's the signal that no further interpolation is needed
                    replacement = v
                else:
                    # Further interpolation may be needed to obtain final value
                    replacement = recursive_interpolate(k, v, s, backtrail)
                # Replace the matched string with its final value
                start, end = match.span()
                value = ''.join((value[:start], replacement, value[end:]))
                new_search_start = start + len(replacement)
                # Pick up the next interpolation key, if any, for next time
                # through the while loop
                match = self._KEYCRE.search(value, new_search_start)

            # Now safe to come back here again; remove marker from backtrail
            del backtrail[(key, section.name)]

            return value

        # Back in interpolate(), all we have to do is kick off the recursive
        # function with appropriate starting values
        value = recursive_interpolate(key, value, self.section, {})
        return value


    def _fetch(self, key):
        """Helper function to fetch values from owning section.

        Returns a 2-tuple: the value, and the section where it was found.
        """
        # switch off interpolation before we try and fetch anything !
        save_interp = self.section.main.interpolation
        self.section.main.interpolation = False

        # Start at section that "owns" this InterpolationEngine
        current_section = self.section
        while True:
            # try the current section first
            val = current_section.get(key)
            if val is not None and not isinstance(val, Section):
                break
            # try "DEFAULT" next
            val = current_section.get('DEFAULT', {}).get(key)
            if val is not None and not isinstance(val, Section):
                break
            # move up to parent and try again
            # top-level's parent is itself
            if current_section.parent is current_section:
                # reached top level, time to give up
                break
            current_section = current_section.parent

        # restore interpolation to previous value before returning
        self.section.main.interpolation = save_interp
        if val is None:
            raise MissingInterpolationOption(key)
        return val, current_section


    def _parse_match(self, match):
        """Implementation-dependent helper function.

        Will be passed a match object corresponding to the interpolation
        key we just found (e.g., "%(foo)s" or "$foo"). Should look up that
        key in the appropriate config file section (using the ``_fetch()``
        helper function) and return a 3-tuple: (key, value, section)

        ``key`` is the name of the key we're looking for
        ``value`` is the value found for that key
        ``section`` is a reference to the section where it was found

        ``key`` and ``section`` should be None if no further
        interpolation should be performed on the resulting value
        (e.g., if we interpolated "$$" and returned "$").
        """
        raise NotImplementedError()
    


class ConfigParserInterpolation(InterpolationEngine):
    """Behaves like ConfigParser."""
    _cookie = '%'
    _KEYCRE = re.compile(r"%\(([^)]*)\)s")
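    # e.g. a value of "%(user)s@%(host)s" is filled in from the 'user' and
    # 'host' keys found via _fetch() (this section, DEFAULT, then parents).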

    def _parse_match(self, match):
        key = match.group(1)
        value, section = self._fetch(key)
        return key, value, section



class TemplateInterpolation(InterpolationEngine):
    """Behaves like string.Template."""
    _cookie = '$'
    _delimiter = '$'
    _KEYCRE = re.compile(r"""
        \$(?:
          (?P<escaped>\$)              |   # Two $ signs
          (?P<named>[_a-z][_a-z0-9]*)  |   # $name format
          {(?P<braced>[^}]*)}              # ${name} format
        )
        """, re.IGNORECASE | re.VERBOSE)

    def _parse_match(self, match):
        # Valid name (in or out of braces): fetch value from section
        key = match.group('named') or match.group('braced')
        if key is not None:
            value, section = self._fetch(key)
            return key, value, section
        # Escaped delimiter (e.g., $$): return single delimiter
        if match.group('escaped') is not None:
            # Return None for key and section to indicate it's time to stop
            return None, self._delimiter, None
        # Anything else: ignore completely, just return it unchanged
        return None, match.group(), None


interpolation_engines = {
    'configparser': ConfigParserInterpolation,
    'template': TemplateInterpolation,
}


def __newobj__(cls, *args):
    # Hack for pickle
    return cls.__new__(cls, *args) 

class Section(dict):
    """
    A dictionary-like object that represents a section in a config file.
    
    It does string interpolation if the 'interpolation' attribute
    of the 'main' object is set to True.
    
    Interpolation is tried first from this object, then from the 'DEFAULT'
    section of this object, next from the parent and its 'DEFAULT' section,
    and so on until the main object is reached.
    
    A Section will behave like an ordered dictionary - following the
    order of the ``scalars`` and ``sections`` attributes.
    You can use this to change the order of members.
    
    Iteration follows the order: scalars, then sections.
    """

    
    def __setstate__(self, state):
        dict.update(self, state[0])
        self.__dict__.update(state[1])

    def __reduce__(self):
        state = (dict(self), self.__dict__)
        return (__newobj__, (self.__class__,), state)
    
    
    def __init__(self, parent, depth, main, indict=None, name=None):
        """
        * parent is the section above
        * depth is the depth level of this section
        * main is the main ConfigObj
        * indict is a dictionary to initialise the section with
        """
        if indict is None:
            indict = {}
        dict.__init__(self)
        # used for nesting level *and* interpolation
        self.parent = parent
        # used for the interpolation attribute
        self.main = main
        # level of nesting depth of this Section
        self.depth = depth
        # purely for information
        self.name = name
        #
        self._initialise()
        # we do this explicitly so that __setitem__ is used properly
        # (rather than just passing to ``dict.__init__``)
        for entry, value in indict.items():
            self[entry] = value
            
            
    def _initialise(self):
        # the sequence of scalar values in this Section
        self.scalars = []
        # the sequence of sections in this Section
        self.sections = []
        # for comments :-)
        self.comments = {}
        self.inline_comments = {}
        # the configspec
        self.configspec = None
        # for defaults
        self.defaults = []
        self.default_values = {}
        self.extra_values = []
        self._created = False


    def _interpolate(self, key, value):
        try:
            # do we already have an interpolation engine?
            engine = self._interpolation_engine
        except AttributeError:
            # not yet: first time running _interpolate(), so pick the engine
            name = self.main.interpolation
            if name == True:  # note that "if name:" would be incorrect here
                # backwards-compatibility: interpolation=True means use default
                name = DEFAULT_INTERPOLATION
            name = name.lower()  # so that "Template", "template", etc. all work
            class_ = interpolation_engines.get(name, None)
            if class_ is None:
                # invalid value for self.main.interpolation
                self.main.interpolation = False
                return value
            else:
                # save reference to engine so we don't have to do this again
                engine = self._interpolation_engine = class_(self)
        # let the engine do the actual work
        return engine.interpolate(key, value)


    def __getitem__(self, key):
        """Fetch the item and do string interpolation."""
        val = dict.__getitem__(self, key)
        if self.main.interpolation: 
            if isinstance(val, six.string_types):
                return self._interpolate(key, val)
            if isinstance(val, list):
                def _check(entry):
                    if isinstance(entry, six.string_types):
                        return self._interpolate(key, entry)
                    return entry
                new = [_check(entry) for entry in val]
                if new != val:
                    return new
        return val


    def __setitem__(self, key, value, unrepr=False):
        """
        Correctly set a value.
        
        Making dictionary values Section instances.
        (We have to special case 'Section' instances - which are also dicts)
        
        Keys must be strings.
        Values need only be strings (or lists of strings) if
        ``main.stringify`` is set.
        
        ``unrepr`` must be set when setting a value to a dictionary, without
        creating a new sub-section.
        """
        if not isinstance(key, six.string_types):
            raise ValueError('The key "%s" is not a string.' % key)
        
        # add the comment
        if key not in self.comments:
            self.comments[key] = []
            self.inline_comments[key] = ''
        # remove the entry from defaults
        if key in self.defaults:
            self.defaults.remove(key)
        #
        if isinstance(value, Section):
            if key not in self:
                self.sections.append(key)
            dict.__setitem__(self, key, value)
        elif isinstance(value, dict) and not unrepr:
            # First create the new depth level,
            # then create the section
            if key not in self:
                self.sections.append(key)
            new_depth = self.depth + 1
            dict.__setitem__(
                self,
                key,
                Section(
                    self,
                    new_depth,
                    self.main,
                    indict=value,
                    name=key))
        else:
            if key not in self:
                self.scalars.append(key)
            if not self.main.stringify:
                if isinstance(value, six.string_types):
                    pass
                elif isinstance(value, (list, tuple)):
                    for entry in value:
                        if not isinstance(entry, six.string_types):
                            raise TypeError('Value is not a string "%s".' % entry)
                else:
                    raise TypeError('Value is not a string "%s".' % value)
            dict.__setitem__(self, key, value)


    def __delitem__(self, key):
        """Remove items from the sequence when deleting."""
        dict. __delitem__(self, key)
        if key in self.scalars:
            self.scalars.remove(key)
        else:
            self.sections.remove(key)
        del self.comments[key]
        del self.inline_comments[key]


    def get(self, key, default=None):
        """A version of ``get`` that doesn't bypass string interpolation."""
        try:
            return self[key]
        except KeyError:
            return default


    def update(self, indict):
        """
        A version of update that uses our ``__setitem__``.
        """
        for entry in indict:
            self[entry] = indict[entry]


    def pop(self, key, default=MISSING):
        """
        D.pop(k[,d]) -> v, remove specified key and return the corresponding value.
        If key is not found, d is returned if given, otherwise KeyError is raised.
        """
        try:
            val = self[key]
        except KeyError:
            if default is MISSING:
                raise
            val = default
        else:
            del self[key]
        return val


    def popitem(self):
        """Pops the first (key,val)"""
        sequence = (self.scalars + self.sections)
        if not sequence:
            raise KeyError(": 'popitem(): dictionary is empty'")
        key = sequence[0]
        val =  self[key]
        del self[key]
        return key, val


    def clear(self):
        """
        A version of clear that also affects scalars/sections
        Also clears comments and configspec.
        
        Leaves other attributes alone :
            depth/main/parent are not affected
        """
        dict.clear(self)
        self.scalars = []
        self.sections = []
        self.comments = {}
        self.inline_comments = {}
        self.configspec = None
        self.defaults = []
        self.extra_values = []


    def setdefault(self, key, default=None):
        """A version of setdefault that sets sequence if appropriate."""
        try:
            return self[key]
        except KeyError:
            self[key] = default
            return self[key]


    def items(self):
        """D.items() -> list of D's (key, value) pairs, as 2-tuples"""
        return list(zip((self.scalars + self.sections), list(self.values())))


    def keys(self):
        """D.keys() -> list of D's keys"""
        return (self.scalars + self.sections)


    def values(self):
        """D.values() -> list of D's values"""
        return [self[key] for key in (self.scalars + self.sections)]


    def iteritems(self):
        """D.iteritems() -> an iterator over the (key, value) items of D"""
        return iter(list(self.items()))


    def iterkeys(self):
        """D.iterkeys() -> an iterator over the keys of D"""
        return iter((self.scalars + self.sections))

    __iter__ = iterkeys


    def itervalues(self):
        """D.itervalues() -> an iterator over the values of D"""
        return iter(list(self.values()))


    def __repr__(self):
        """x.__repr__() <==> repr(x)"""
        def _getval(key):
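            # fall back to the raw stored value if interpolation fails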
            try:
                return self[key]
            except MissingInterpolationOption:
                return dict.__getitem__(self, key)
        return '{%s}' % ', '.join([('%s: %s' % (repr(key), repr(_getval(key))))
            for key in (self.scalars + self.sections)])

    __str__ = __repr__
    __str__.__doc__ = "x.__str__() <==> str(x)"


    # Extra methods - not in a normal dictionary

    def dict(self):
        """
        Return a deepcopy of self as a dictionary.
        
        All members that are ``Section`` instances are recursively turned to
        ordinary dictionaries - by calling their ``dict`` method.
        
        >>> n = a.dict()
        >>> n == a
        True
        >>> n is a
        False
        """
        newdict = {}
        for entry in self:
            this_entry = self[entry]
            if isinstance(this_entry, Section):
                this_entry = this_entry.dict()
            elif isinstance(this_entry, list):
                # create a copy rather than a reference
                this_entry = list(this_entry)
            elif isinstance(this_entry, tuple):
                # create a copy rather than a reference
                this_entry = tuple(this_entry)
            newdict[entry] = this_entry
        return newdict


    def merge(self, indict):
        """
        A recursive update - useful for merging config files.
        
        >>> a = '''[section1]
        ...     option1 = True
        ...     [[subsection]]
        ...     more_options = False
        ...     # end of file'''.splitlines()
        >>> b = '''# File is user.ini
        ...     [section1]
        ...     option1 = False
        ...     # end of file'''.splitlines()
        >>> c1 = ConfigObj(b)
        >>> c2 = ConfigObj(a)
        >>> c2.merge(c1)
        >>> c2
        ConfigObj({'section1': {'option1': 'False', 'subsection': {'more_options': 'False'}}})
        """
        for key, val in list(indict.items()):
            if (key in self and isinstance(self[key], dict) and
                                isinstance(val, dict)):
                self[key].merge(val)
            else:   
                self[key] = val


    def rename(self, oldkey, newkey):
        """
        Change a keyname to another, without changing position in sequence.
        
        Implemented so that transformations can be made on keys,
        as well as on values. (used by encode and decode)
        
        Also renames comments.
        """
        if oldkey in self.scalars:
            the_list = self.scalars
        elif oldkey in self.sections:
            the_list = self.sections
        else:
            raise KeyError('Key "%s" not found.' % oldkey)
        pos = the_list.index(oldkey)
        #
        val = self[oldkey]
        dict.__delitem__(self, oldkey)
        dict.__setitem__(self, newkey, val)
        the_list.remove(oldkey)
        the_list.insert(pos, newkey)
        comm = self.comments[oldkey]
        inline_comment = self.inline_comments[oldkey]
        del self.comments[oldkey]
        del self.inline_comments[oldkey]
        self.comments[newkey] = comm
        self.inline_comments[newkey] = inline_comment


    def walk(self, function, raise_errors=True,
            call_on_sections=False, **keywargs):
        """
        Walk every member and call a function on the keyword and value.
        
        Return a dictionary of the return values
        
        If the function raises an exception, raise the error
        unless ``raise_errors=False``, in which case set the return value to
        ``False``.
        
        Any unrecognised keyword arguments you pass to walk will be passed on
        to the function you pass in.
        
        Note: if ``call_on_sections`` is ``True`` then, on encountering a
        subsection, the function is *first* called for the *whole* subsection
        and then walk recurses into its members. This means your function must
        be able to handle strings, dictionaries and lists. This allows you
        to change the keys of subsections as well as of ordinary members. The
        return value from calling the function on a whole subsection is
        discarded.
        
        See the encode and decode methods for examples, including functions.
        
        .. admonition:: caution
        
            You can use ``walk`` to transform the names of members of a section
            but you mustn't add or delete members.
        
        >>> config = '''[XXXXsection]
        ... XXXXkey = XXXXvalue'''.splitlines()
        >>> cfg = ConfigObj(config)
        >>> cfg
        ConfigObj({'XXXXsection': {'XXXXkey': 'XXXXvalue'}})
        >>> def transform(section, key):
        ...     val = section[key]
        ...     newkey = key.replace('XXXX', 'CLIENT1')
        ...     section.rename(key, newkey)
        ...     if isinstance(val, (tuple, list, dict)):
        ...         pass
        ...     else:
        ...         val = val.replace('XXXX', 'CLIENT1')
        ...         section[newkey] = val
        >>> cfg.walk(transform, call_on_sections=True)
        {'CLIENT1section': {'CLIENT1key': None}}
        >>> cfg
        ConfigObj({'CLIENT1section': {'CLIENT1key': 'CLIENT1value'}})
        """
        out = {}
        # scalars first
        for i in range(len(self.scalars)):
            entry = self.scalars[i]
            try:
                val = function(self, entry, **keywargs)
                # bound again in case name has changed
                entry = self.scalars[i]
                out[entry] = val
            except Exception:
                if raise_errors:
                    raise
                else:
                    entry = self.scalars[i]
                    out[entry] = False
        # then sections
        for i in range(len(self.sections)):
            entry = self.sections[i]
            if call_on_sections:
                try:
                    function(self, entry, **keywargs)
                except Exception:
                    if raise_errors:
                        raise
                    else:
                        entry = self.sections[i]
                        out[entry] = False
                # bound again in case name has changed
                entry = self.sections[i]
            # previous result is discarded
            out[entry] = self[entry].walk(
                function,
                raise_errors=raise_errors,
                call_on_sections=call_on_sections,
                **keywargs)
        return out


    def as_bool(self, key):
        """
        Accepts a key as input. The corresponding value must be a string or
        the objects (``True`` or 1) or (``False`` or 0). We allow 0 and 1 to
        retain compatibility with Python 2.2.
        
        If the string is one of  ``True``, ``On``, ``Yes``, or ``1`` it returns 
        ``True``.
        
        If the string is one of  ``False``, ``Off``, ``No``, or ``0`` it returns 
        ``False``.
        
        ``as_bool`` is not case sensitive.
        
        Any other input will raise a ``ValueError``.
        
        >>> a = ConfigObj()
        >>> a['a'] = 'fish'
        >>> a.as_bool('a')
        Traceback (most recent call last):
        ValueError: Value "fish" is neither True nor False
        >>> a['b'] = 'True'
        >>> a.as_bool('b')
        True
        >>> a['b'] = 'off'
        >>> a.as_bool('b')
        False
        """
        val = self[key]
        if val == True:
            return True
        elif val == False:
            return False
        else:
            try:
                if not isinstance(val, six.string_types):
                    # a non-string value cannot be looked up in ``_bools``, so
                    # raise KeyError and let it be converted to ValueError below
                    raise KeyError()
                else:
                    return self.main._bools[val.lower()]
            except KeyError:
                raise ValueError('Value "%s" is neither True nor False' % val)


    def as_int(self, key):
        """
        A convenience method which coerces the specified value to an integer.
        
        If the value is an invalid literal for ``int``, a ``ValueError`` will
        be raised.
        
        >>> a = ConfigObj()
        >>> a['a'] = 'fish'
        >>> a.as_int('a')
        Traceback (most recent call last):
        ValueError: invalid literal for int() with base 10: 'fish'
        >>> a['b'] = '1'
        >>> a.as_int('b')
        1
        >>> a['b'] = '3.2'
        >>> a.as_int('b')
        Traceback (most recent call last):
        ValueError: invalid literal for int() with base 10: '3.2'
        """
        return int(self[key])


    def as_float(self, key):
        """
        A convenience method which coerces the specified value to a float.
        
        If the value is an invalid literal for ``float``, a ``ValueError`` will
        be raised.
        
        >>> a = ConfigObj()
        >>> a['a'] = 'fish'
        >>> a.as_float('a')  #doctest: +IGNORE_EXCEPTION_DETAIL
        Traceback (most recent call last):
        ValueError: invalid literal for float(): fish
        >>> a['b'] = '1'
        >>> a.as_float('b')
        1.0
        >>> a['b'] = '3.2'
        >>> a.as_float('b')  #doctest: +ELLIPSIS
        3.2...
        """
        return float(self[key])
    
    
    def as_list(self, key):
        """
        A convenience method which fetches the specified value, guaranteeing
        that it is a list.
        
        >>> a = ConfigObj()
        >>> a['a'] = 1
        >>> a.as_list('a')
        [1]
        >>> a['a'] = (1,)
        >>> a.as_list('a')
        [1]
        >>> a['a'] = [1]
        >>> a.as_list('a')
        [1]
        """
        result = self[key]
        if isinstance(result, (tuple, list)):
            return list(result)
        return [result]
        

    def restore_default(self, key):
        """
        Restore (and return) default value for the specified key.
        
        This method will only work for a ConfigObj that was created
        with a configspec and has been validated.
        
        If there is no default value for this key, ``KeyError`` is raised.
        """
        default = self.default_values[key]
        dict.__setitem__(self, key, default)
        if key not in self.defaults:
            self.defaults.append(key)
        return default

    
    def restore_defaults(self):
        """
        Recursively restore default values to all members
        that have them.
        
        This method will only work for a ConfigObj that was created
        with a configspec and has been validated.
        
        It doesn't delete or modify entries without default values.
        """
        for key in self.default_values:
            self.restore_default(key)
            
        for section in self.sections:
            self[section].restore_defaults()


class ConfigObj(Section):
    """An object to read, create, and write config files."""

    _keyword = re.compile(r'''^ # line start
        (\s*)                   # indentation
        (                       # keyword
            (?:".*?")|          # double quotes
            (?:'.*?')|          # single quotes
            (?:[^'"=].*?)       # no quotes
        )
        \s*=\s*                 # divider
        (.*)                    # value (including list values and comments)
        $   # line end
        ''',
        re.VERBOSE)
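
    # For illustration (not part of the original source): for a line like
    #     option = value  # inline comment
    # ``_keyword`` captures ('', 'option', 'value  # inline comment'); the
    # inline comment is separated out later by ``_handle_value``.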

    _sectionmarker = re.compile(r'''^
        (\s*)                     # 1: indentation
        ((?:\[\s*)+)              # 2: section marker open
        (                         # 3: section name open
            (?:"\s*\S.*?\s*")|    # at least one non-space with double quotes
            (?:'\s*\S.*?\s*')|    # at least one non-space with single quotes
            (?:[^'"\s].*?)        # at least one non-space unquoted
        )                         # section name close
        ((?:\s*\])+)              # 4: section marker close
        \s*(\#.*)?                # 5: optional comment
        $''',
        re.VERBOSE)
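
    # For illustration (not part of the original source): a line like
    #     [[database]]  # settings
    # yields the groups ('', '[[', 'database', ']]', '# settings'); the
    # bracket count gives the nesting depth.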

    # this regexp pulls list values out as a single string
    # or single values and comments
    # FIXME: this regex adds a '' to the end of comma terminated lists
    #   workaround in ``_handle_value``
    _valueexp = re.compile(r'''^
        (?:
            (?:
                (
                    (?:
                        (?:
                            (?:".*?")|              # double quotes
                            (?:'.*?')|              # single quotes
                            (?:[^'",\#][^,\#]*?)    # unquoted
                        )
                        \s*,\s*                     # comma
                    )*      # match all list items ending in a comma (if any)
                )
                (
                    (?:".*?")|                      # double quotes
                    (?:'.*?')|                      # single quotes
                    (?:[^'",\#\s][^,]*?)|           # unquoted
                    (?:(?<!,))                      # Empty value
                )?          # last item in a list - or string value
            )|
            (,)             # alternatively a single comma - empty list
        )
        \s*(\#.*)?          # optional comment
        $''',
        re.VERBOSE)
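
    # For illustration (not part of the original source): '1, 2, 3  # nums'
    # matches with '1, 2, ' as the list prefix, '3' as the final item and
    # '# nums' as the comment, while a bare ',' matches the empty-list
    # alternative.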

    # use findall to get the members of a list value
    _listvalueexp = re.compile(r'''
        (
            (?:".*?")|          # double quotes
            (?:'.*?')|          # single quotes
            (?:[^'",\#]?.*?)       # unquoted
        )
        \s*,\s*                 # comma
        ''',
        re.VERBOSE)

    # this regexp is used for the value
    # when lists are switched off
    _nolistvalue = re.compile(r'''^
        (
            (?:".*?")|          # double quotes
            (?:'.*?')|          # single quotes
            (?:[^'"\#].*?)|     # unquoted
            (?:)                # Empty value
        )
        \s*(\#.*)?              # optional comment
        $''',
        re.VERBOSE)

    # regexes for finding triple quoted values on one line
    _single_line_single = re.compile(r"^'''(.*?)'''\s*(#.*)?$")
    _single_line_double = re.compile(r'^"""(.*?)"""\s*(#.*)?$')
    _multi_line_single = re.compile(r"^(.*?)'''\s*(#.*)?$")
    _multi_line_double = re.compile(r'^(.*?)"""\s*(#.*)?$')

    _triple_quote = {
        "'''": (_single_line_single, _multi_line_single),
        '"""': (_single_line_double, _multi_line_double),
    }

    # Used by the ``istrue`` Section method
    _bools = {
        'yes': True, 'no': False,
        'on': True, 'off': False,
        '1': True, '0': False,
        'true': True, 'false': False,
        }


    def __init__(self, infile=None, options=None, configspec=None, encoding=None,
                 interpolation=True, raise_errors=False, list_values=True,
                 create_empty=False, file_error=False, stringify=True,
                 indent_type=None, default_encoding=None, unrepr=False,
                 write_empty_values=False, _inspec=False):
        """
        Parse a config file or create a config file object.
        
        ``ConfigObj(infile=None, configspec=None, encoding=None,
                    interpolation=True, raise_errors=False, list_values=True,
                    create_empty=False, file_error=False, stringify=True,
                    indent_type=None, default_encoding=None, unrepr=False,
                    write_empty_values=False, _inspec=False)``
        """
        self._inspec = _inspec
        # init the superclass
        Section.__init__(self, self, 0, self)
        
        infile = infile or []
        
        _options = {'configspec': configspec,
                    'encoding': encoding, 'interpolation': interpolation,
                    'raise_errors': raise_errors, 'list_values': list_values,
                    'create_empty': create_empty, 'file_error': file_error,
                    'stringify': stringify, 'indent_type': indent_type,
                    'default_encoding': default_encoding, 'unrepr': unrepr,
                    'write_empty_values': write_empty_values}

        if options is None:
            options = _options
        else:
            import warnings
            warnings.warn('Passing in an options dictionary to ConfigObj() is '
                          'deprecated. Use **options instead.',
                          DeprecationWarning, stacklevel=2)
            
            # TODO: check the values too.
            for entry in options:
                if entry not in OPTION_DEFAULTS:
                    raise TypeError('Unrecognised option "%s".' % entry)
            for entry, value in list(OPTION_DEFAULTS.items()):
                if entry not in options:
                    options[entry] = value
                keyword_value = _options[entry]
                if value != keyword_value:
                    options[entry] = keyword_value
        
        # XXXX this ignores an explicit list_values = True in combination
        # with _inspec. The user should *never* do that anyway, but still...
        if _inspec:
            options['list_values'] = False
        
        self._initialise(options)
        configspec = options['configspec']
        self._original_configspec = configspec
        self._load(infile, configspec)
        
        
    def _load(self, infile, configspec):
        if isinstance(infile, six.string_types):
            self.filename = infile
            if os.path.isfile(infile):
                with open(infile, 'rb') as h:
                    content = h.readlines() or []
            elif self.file_error:
                # raise an error if the file doesn't exist
                raise IOError('Config file not found: "%s".' % self.filename)
            else:
                # file doesn't already exist
                if self.create_empty:
                    # this is a good test that the filename specified
                    # isn't impossible - like on a non-existent device
                    with open(infile, 'w') as h:
                        h.write('')
                content = []
                
        elif isinstance(infile, (list, tuple)):
            content = list(infile)
            
        elif isinstance(infile, dict):
            # initialise self
            # the Section class handles creating subsections
            if isinstance(infile, ConfigObj):
                # get a copy of our ConfigObj
                def set_section(in_section, this_section):
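                    # recursively copy scalars, then subsections, from the
                    # source ConfigObj into this instance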
                    for entry in in_section.scalars:
                        this_section[entry] = in_section[entry]
                    for section in in_section.sections:
                        this_section[section] = {}
                        set_section(in_section[section], this_section[section])
                set_section(infile, self)
                
            else:
                for entry in infile:
                    self[entry] = infile[entry]
            del self._errors
            
            if configspec is not None:
                self._handle_configspec(configspec)
            else:
                self.configspec = None
            return
        
        elif getattr(infile, 'read', MISSING) is not MISSING:
            # This supports file like objects
            content = infile.read() or []
            # needs splitting into lines - but needs doing *after* decoding
            # in case it's not an 8 bit encoding
        else:
            raise TypeError('infile must be a filename, file like object, or list of lines.')

        if content:
            # don't do it for the empty ConfigObj
            content = self._handle_bom(content)
            # infile is now *always* a list
            #
            # Set the newlines attribute (first line ending it finds)
            # and strip trailing '\n' or '\r' from lines
            for line in content:
                if (not line) or (line[-1] not in ('\r', '\n')):
                    continue
                for end in ('\r\n', '\n', '\r'):
                    if line.endswith(end):
                        self.newlines = end
                        break
                break

        assert all(isinstance(line, six.string_types) for line in content), repr(content)
        content = [line.rstrip('\r\n') for line in content]
            
        self._parse(content)
        # if we had any errors, now is the time to raise them
        if self._errors:
            info = "at line %s." % self._errors[0].line_number
            if len(self._errors) > 1:
                msg = "Parsing failed with several errors.\nFirst error %s" % info
                error = ConfigObjError(msg)
            else:
                error = self._errors[0]
            # set the errors attribute; it's a list of tuples:
            # (error_type, message, line_number)
            error.errors = self._errors
            # set the config attribute
            error.config = self
            raise error
        # delete private attributes
        del self._errors
        
        if configspec is None:
            self.configspec = None
        else:
            self._handle_configspec(configspec)
    
    
    def _initialise(self, options=None):
        if options is None:
            options = OPTION_DEFAULTS
            
        # initialise a few variables
        self.filename = None
        self._errors = []
        self.raise_errors = options['raise_errors']
        self.interpolation = options['interpolation']
        self.list_values = options['list_values']
        self.create_empty = options['create_empty']
        self.file_error = options['file_error']
        self.stringify = options['stringify']
        self.indent_type = options['indent_type']
        self.encoding = options['encoding']
        self.default_encoding = options['default_encoding']
        self.BOM = False
        self.newlines = None
        self.write_empty_values = options['write_empty_values']
        self.unrepr = options['unrepr']
        
        self.initial_comment = []
        self.final_comment = []
        self.configspec = None
        
        if self._inspec:
            self.list_values = False
        
        # Clear section attributes as well
        Section._initialise(self)
        
        
    def __repr__(self):
        def _getval(key):
            try:
                return self[key]
            except MissingInterpolationOption:
                return dict.__getitem__(self, key)
        return ('ConfigObj({%s})' % 
                ', '.join([('%s: %s' % (repr(key), repr(_getval(key)))) 
                for key in (self.scalars + self.sections)]))
    
    
    def _handle_bom(self, infile):
        """
        Handle any BOM, and decode if necessary.
        
        If an encoding is specified, that *must* be used - but the BOM should
        still be removed (and the BOM attribute set).
        
        (If the encoding is wrongly specified, then a BOM for an alternative
        encoding won't be discovered or removed.)
        
        If an encoding is not specified, UTF8 or UTF16 BOM will be detected and
        removed. The BOM attribute will be set. UTF16 will be decoded to
        unicode.
        
        NOTE: This method must not be called with an empty ``infile``.
        
        Specifying the *wrong* encoding is likely to cause a
        ``UnicodeDecodeError``.
        
        ``infile`` must always be returned as a list of lines, but may be
        passed in as a single string.
        """

        if ((self.encoding is not None) and
            (self.encoding.lower() not in BOM_LIST)):
            # No need to check for a BOM
            # the encoding specified doesn't have one
            # just decode
            return self._decode(infile, self.encoding)
        
        if isinstance(infile, (list, tuple)):
            line = infile[0]
        else:
            line = infile

        if isinstance(line, six.text_type):
            # it's already decoded and there's no need to do anything
            # else, just use the _decode utility method to handle
            # listifying appropriately
            return self._decode(infile, self.encoding)

        if self.encoding is not None:
            # encoding explicitly supplied
            # And it could have an associated BOM
            # TODO: if encoding is just UTF16 - we ought to check for both
            # TODO: big endian and little endian versions.
            enc = BOM_LIST[self.encoding.lower()]
            if enc == 'utf_16':
                # For UTF16 we try big endian and little endian
                for BOM, (encoding, final_encoding) in list(BOMS.items()):
                    if not final_encoding:
                        # skip UTF8
                        continue
                    if line.startswith(BOM):
                        ### BOM discovered
                        ##self.BOM = True
                        # Don't need to remove BOM
                        return self._decode(infile, encoding)
                    
                # If we get this far, decoding will *probably* raise a
                # UnicodeDecodeError, as the text doesn't appear to start with a BOM
                return self._decode(infile, self.encoding)
            
            # Must be UTF8
            BOM = BOM_SET[enc]
            if not line.startswith(BOM):
                return self._decode(infile, self.encoding)
            
            newline = line[len(BOM):]
            
            # BOM removed
            if isinstance(infile, (list, tuple)):
                infile[0] = newline
            else:
                infile = newline
            self.BOM = True
            return self._decode(infile, self.encoding)
        
        # No encoding specified - so we need to check for UTF8/UTF16
        for BOM, (encoding, final_encoding) in list(BOMS.items()):
            if not isinstance(line, six.binary_type) or not line.startswith(BOM):
                # the line doesn't start with this BOM, or it isn't a bytestring
                continue
            else:
                # BOM discovered
                self.encoding = final_encoding
                if not final_encoding:
                    self.BOM = True
                    # UTF8
                    # remove BOM
                    newline = line[len(BOM):]
                    if isinstance(infile, (list, tuple)):
                        infile[0] = newline
                    else:
                        infile = newline
                    # UTF-8
                    if isinstance(infile, six.text_type):
                        return infile.splitlines(True)
                    elif isinstance(infile, six.binary_type):
                        return infile.decode('utf-8').splitlines(True)
                    else:
                        return self._decode(infile, 'utf-8')
                # UTF16 - have to decode
                return self._decode(infile, encoding)
            

        if six.PY2 and isinstance(line, str):
            # don't actually do any decoding, since we're on python 2 and
            # returning a bytestring is fine
            return self._decode(infile, None)
        # No BOM discovered and no encoding specified, default to UTF-8
        if isinstance(infile, six.binary_type):
            return infile.decode('utf-8').splitlines(True)
        else:
            return self._decode(infile, 'utf-8')


    def _a_to_u(self, aString):
        """Decode ASCII strings to unicode if a self.encoding is specified."""
        if isinstance(aString, six.binary_type) and self.encoding:
            return aString.decode(self.encoding)
        else:
            return aString


    def _decode(self, infile, encoding):
        """
        Decode infile to unicode, using the specified encoding.
        
        If infile is a string, it also needs converting to a list of lines.
        """
        if isinstance(infile, six.string_types):
            return infile.splitlines(True)
        if isinstance(infile, six.binary_type):
            # NOTE: Could raise a ``UnicodeDecodeError``
            if encoding:
                return infile.decode(encoding).splitlines(True)
            else:
                return infile.splitlines(True)

        if encoding:
            for i, line in enumerate(infile):
                if isinstance(line, six.binary_type):
                    # NOTE: The isinstance test here handles mixed lists of unicode/string
                    # NOTE: But the decode will break on any non-string values
                    # NOTE: Or could raise a ``UnicodeDecodeError``
                    infile[i] = line.decode(encoding)
        return infile


    def _decode_element(self, line):
        """Decode element to unicode if necessary."""
        if isinstance(line, six.binary_type) and self.default_encoding:
            return line.decode(self.default_encoding)
        else:
            return line


    # TODO: this may need to be modified
    def _str(self, value):
        """
        Used by ``stringify`` within validate, to turn non-string values
        into strings.
        """
        if not isinstance(value, six.string_types):
            # intentionally 'str' because it's just whatever the "normal"
            # string type is for the python version we're dealing with
            return str(value)
        else:
            return value


    def _parse(self, infile):
        """Actually parse the config file."""
        temp_list_values = self.list_values
        if self.unrepr:
            self.list_values = False
            
        comment_list = []
        done_start = False
        this_section = self
        maxline = len(infile) - 1
        cur_index = -1
        reset_comment = False
        
        while cur_index < maxline:
            if reset_comment:
                comment_list = []
            cur_index += 1
            line = infile[cur_index]
            sline = line.strip()
            # do we have anything on the line ?
            if not sline or sline.startswith('#'):
                reset_comment = False
                comment_list.append(line)
                continue
            
            if not done_start:
                # preserve initial comment
                self.initial_comment = comment_list
                comment_list = []
                done_start = True
                
            reset_comment = True
            # first we check if it's a section marker
            mat = self._sectionmarker.match(line)
            if mat is not None:
                # is a section line
                (indent, sect_open, sect_name, sect_close, comment) = mat.groups()
                if indent and (self.indent_type is None):
                    self.indent_type = indent
                cur_depth = sect_open.count('[')
                if cur_depth != sect_close.count(']'):
                    self._handle_error("Cannot compute the section depth",
                                       NestingError, infile, cur_index)
                    continue
                
                if cur_depth < this_section.depth:
                    # the new section is dropping back to a previous level
                    try:
                        parent = self._match_depth(this_section,
                                                   cur_depth).parent
                    except SyntaxError:
                        self._handle_error("Cannot compute nesting level",
                                           NestingError, infile, cur_index)
                        continue
                elif cur_depth == this_section.depth:
                    # the new section is a sibling of the current section
                    parent = this_section.parent
                elif cur_depth == this_section.depth + 1:
                    # the new section is a child of the current section
                    parent = this_section
                else:
                    self._handle_error("Section too nested",
                                       NestingError, infile, cur_index)
                    continue
                    
                sect_name = self._unquote(sect_name)
                if sect_name in parent:
                    self._handle_error('Duplicate section name',
                                       DuplicateError, infile, cur_index)
                    continue
                
                # create the new section
                this_section = Section(
                    parent,
                    cur_depth,
                    self,
                    name=sect_name)
                parent[sect_name] = this_section
                parent.inline_comments[sect_name] = comment
                parent.comments[sect_name] = comment_list
                continue
            #
            # it's not a section marker,
            # so it should be a valid ``key = value`` line
            mat = self._keyword.match(line)
            if mat is None:
                self._handle_error(
                    'Invalid line ({0!r}) (matched as neither section nor keyword)'.format(line),
                    ParseError, infile, cur_index)
            else:
                # is a keyword value
                # value will include any inline comment
                (indent, key, value) = mat.groups()
                if indent and (self.indent_type is None):
                    self.indent_type = indent
                # check for a multiline value
                if value[:3] in ['"""', "'''"]:
                    try:
                        value, comment, cur_index = self._multiline(
                            value, infile, cur_index, maxline)
                    except SyntaxError:
                        self._handle_error(
                            'Parse error in multiline value',
                            ParseError, infile, cur_index)
                        continue
                    else:
                        if self.unrepr:
                            comment = ''
                            try:
                                value = unrepr(value)
                            except Exception as e:
                                if type(e) == UnknownType:
                                    msg = 'Unknown name or type in value'
                                else:
                                    msg = 'Parse error from unrepr-ing multiline value'
                                self._handle_error(msg, UnreprError, infile,
                                    cur_index)
                                continue
                else:
                    if self.unrepr:
                        comment = ''
                        try:
                            value = unrepr(value)
                        except Exception as e:
                            if isinstance(e, UnknownType):
                                msg = 'Unknown name or type in value'
                            else:
                                msg = 'Parse error from unrepr-ing value'
                            self._handle_error(msg, UnreprError, infile,
                                cur_index)
                            continue
                    else:
                        # extract comment and lists
                        try:
                            (value, comment) = self._handle_value(value)
                        except SyntaxError:
                            self._handle_error(
                                'Parse error in value',
                                ParseError, infile, cur_index)
                            continue
                #
                key = self._unquote(key)
                if key in this_section:
                    self._handle_error(
                        'Duplicate keyword name',
                        DuplicateError, infile, cur_index)
                    continue
                # add the key.
                # we set unrepr because if we have got this far we will never
                # be creating a new section
                this_section.__setitem__(key, value, unrepr=True)
                this_section.inline_comments[key] = comment
                this_section.comments[key] = comment_list
                continue
        #
        if self.indent_type is None:
            # no indentation used, set the type accordingly
            self.indent_type = ''

        # preserve the final comment
        if not self and not self.initial_comment:
            self.initial_comment = comment_list
        elif not reset_comment:
            self.final_comment = comment_list
        self.list_values = temp_list_values


    def _match_depth(self, sect, depth):
        """
        Given a section and a depth level, walk back through the section's
        parents to see if the depth level matches a previous section.
        
        Return a reference to the right section,
        or raise a SyntaxError.
        """
        while depth < sect.depth:
            if sect is sect.parent:
                # we've reached the top level already
                raise SyntaxError()
            sect = sect.parent
        if sect.depth == depth:
            return sect
        # shouldn't get here
        raise SyntaxError()
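
    # For illustration (not part of the original source): when a depth-1
    # ``[section]`` follows a depth-3 ``[[[subsection]]]``, ``_match_depth``
    # walks back up the parents until it reaches the depth-1 ancestor, whose
    # parent then becomes the new section's parent.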


    def _handle_error(self, text, ErrorClass, infile, cur_index):
        """
        Handle an error according to the error settings.
        
        Either raise the error or store it.
        The error will have occurred at ``cur_index``.
        """
        line = infile[cur_index]
        cur_index += 1
        message = '{0} at line {1}.'.format(text, cur_index)
        error = ErrorClass(message, cur_index, line)
        if self.raise_errors:
            # raise the error - parsing stops here
            raise error
        # store the error
        # reraise when parsing has finished
        self._errors.append(error)


    def _unquote(self, value):
        """Return an unquoted version of a value"""
        if not value:
            # should only happen during parsing of lists
            raise SyntaxError
        if (value[0] == value[-1]) and (value[0] in ('"', "'")):
            value = value[1:-1]
        return value
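
    # For illustration (not part of the original source):
    #     _unquote('"hello"')  ->  'hello'
    #     _unquote('plain')    ->  'plain'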


    def _quote(self, value, multiline=True):
        """
        Return a safely quoted version of a value.
        
        Raise a ConfigObjError if the value cannot be safely quoted.
        If multiline is ``True`` (default) then use triple quotes
        if necessary.
        
        * Don't quote values that don't need it.
        * Recursively quote members of a list and return a comma joined list.
        * Multiline is ``False`` for lists.
        * Obey list syntax for empty and single member lists.
        
        If ``list_values=False`` then the value is only quoted if it contains
        a ``\\n`` (is multiline) or '#'.
        
        If ``write_empty_values`` is set, and the value is an empty string, it
        won't be quoted.
        """
        if multiline and self.write_empty_values and value == '':
            # Only if multiline is set, so that this applies to values (not
            # keys) and not to values that are part of a list
            return ''
        
        if multiline and isinstance(value, (list, tuple)):
            if not value:
                return ','
            elif len(value) == 1:
                return self._quote(value[0], multiline=False) + ','
            return ', '.join([self._quote(val, multiline=False)
                for val in value])
        if not isinstance(value, six.string_types):
            if self.stringify:
                # intentionally 'str' because it's just whatever the "normal"
                # string type is for the python version we're dealing with
                value = str(value)
            else:
                raise TypeError('Value "%s" is not a string.' % value)

        if not value:
            return '""'
        
        no_lists_no_quotes = not self.list_values and '\n' not in value and '#' not in value
        need_triple = multiline and ((("'" in value) and ('"' in value)) or ('\n' in value ))
        hash_triple_quote = multiline and not need_triple and ("'" in value) and ('"' in value) and ('#' in value)
        check_for_single = (no_lists_no_quotes or not need_triple) and not hash_triple_quote
        
        if check_for_single:
            if not self.list_values:
                # we don't quote if ``list_values=False``
                quot = noquot
            # for normal values either single or double quotes will do
            elif '\n' in value:
                # will only happen if multiline is off - e.g. '\n' in key
                raise ConfigObjError('Value "%s" cannot be safely quoted.' % value)
            elif ((value[0] not in wspace_plus) and
                    (value[-1] not in wspace_plus) and
                    (',' not in value)):
                quot = noquot
            else:
                quot = self._get_single_quote(value)
        else:
            # if value has '\n' or "'" *and* '"', it will need triple quotes
            quot = self._get_triple_quote(value)
        
        if quot == noquot and '#' in value and self.list_values:
            quot = self._get_single_quote(value)
                
        return quot % value
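
    # Illustrative behaviour (not part of the original source), with default
    # options:
    #     _quote('simple')      -> 'simple'          (no quoting needed)
    #     _quote('has, comma')  -> '"has, comma"'    (comma forces quoting)
    #     _quote(['a', 'b'])    -> 'a, b'            (lists are comma-joined)
    #     a value containing both ' and " is given triple quotes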
    
    
    def _get_single_quote(self, value):
        if ("'" in value) and ('"' in value):
            raise ConfigObjError('Value "%s" cannot be safely quoted.' % value)
        elif '"' in value:
            quot = squot
        else:
            quot = dquot
        return quot
    
    
    def _get_triple_quote(self, value):
        if (value.find('"""') != -1) and (value.find("'''") != -1):
            raise ConfigObjError('Value "%s" cannot be safely quoted.' % value)
        if value.find('"""') == -1:
            quot = tdquot
        else:
            quot = tsquot 
        return quot


    def _handle_value(self, value):
        """
        Given a value string, unquote, remove comment,
        handle lists. (including empty and single member lists)
        """
        if self._inspec:
            # Parsing a configspec so don't handle comments
            return (value, '')
        # do we look for lists in values ?
        if not self.list_values:
            mat = self._nolistvalue.match(value)
            if mat is None:
                raise SyntaxError()
            # NOTE: we don't unquote here
            return mat.groups()
        #
        mat = self._valueexp.match(value)
        if mat is None:
            # the value is badly constructed, probably badly quoted,
            # or an invalid list
            raise SyntaxError()
        (list_values, single, empty_list, comment) = mat.groups()
        if (list_values == '') and (single is None):
            # change this if you want to accept empty values
            raise SyntaxError()
        # NOTE: there is no error handling from here on if the regex
        # is wrong: incorrect values will slip through
        if empty_list is not None:
            # the single comma - meaning an empty list
            return ([], comment)
        if single is not None:
            # handle empty values
            if list_values and not single:
                # FIXME: the '' is a workaround because our regex now matches
                #   '' at the end of a list if it has a trailing comma
                single = None
            else:
                single = single or '""'
                single = self._unquote(single)
        if list_values == '':
            # not a list value
            return (single, comment)
        the_list = self._listvalueexp.findall(list_values)
        the_list = [self._unquote(val) for val in the_list]
        if single is not None:
            the_list += [single]
        return (the_list, comment)
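
    # Illustrative results (not part of the original source), with
    # ``list_values`` enabled:
    #     'a, b, c # note'  ->  (['a', 'b', 'c'], '# note')
    #     'single'          ->  ('single', None)
    #     ','               ->  ([], None)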


    def _multiline(self, value, infile, cur_index, maxline):
        """Extract the value, where we are in a multiline situation."""
        quot = value[:3]
        newvalue = value[3:]
        single_line = self._triple_quote[quot][0]
        multi_line = self._triple_quote[quot][1]
        mat = single_line.match(value)
        if mat is not None:
            retval = list(mat.groups())
            retval.append(cur_index)
            return retval
        elif newvalue.find(quot) != -1:
            # somehow the triple quote is missing
            raise SyntaxError()
        #
        while cur_index < maxline:
            cur_index += 1
            newvalue += '\n'
            line = infile[cur_index]
            if line.find(quot) == -1:
                newvalue += line
            else:
                # end of multiline, process it
                break
        else:
            # we've got to the end of the config, oops...
            raise SyntaxError()
        mat = multi_line.match(line)
        if mat is None:
            # a badly formed line
            raise SyntaxError()
        (value, comment) = mat.groups()
        return (newvalue + value, comment, cur_index)
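
    # Illustrative case (not part of the original source): for
    #     key = '''first line
    #     second line'''  # note
    # ``_multiline`` returns roughly ('first line\nsecond line', '# note',
    # <index of the closing line>).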


    def _handle_configspec(self, configspec):
        """Parse the configspec."""
        # FIXME: Should we check that the configspec was created with the 
        #        correct settings ? (i.e. ``list_values=False``)
        if not isinstance(configspec, ConfigObj):
            try:
                configspec = ConfigObj(configspec,
                                       raise_errors=True,
                                       file_error=True,
                                       _inspec=True)
            except ConfigObjError as e:
                # FIXME: Should these errors have a reference
                #        to the already parsed ConfigObj ?
                raise ConfigspecError('Parsing configspec failed: %s' % e)
            except IOError as e:
                raise IOError('Reading configspec failed: %s' % e)
        
        self.configspec = configspec
            

        
    def _set_configspec(self, section, copy):
        """
        Called by validate. Handles setting the configspec on subsections
        including sections to be validated by __many__
        """
        configspec = section.configspec
        many = configspec.get('__many__')
        if isinstance(many, dict):
            for entry in section.sections:
                if entry not in configspec:
                    section[entry].configspec = many
                    
        for entry in configspec.sections:
            if entry == '__many__':
                continue
            if entry not in section:
                section[entry] = {}
                section[entry]._created = True
                if copy:
                    # copy comments
                    section.comments[entry] = configspec.comments.get(entry, [])
                    section.inline_comments[entry] = configspec.inline_comments.get(entry, '')
                
            # Could be a scalar when we expect a section
            if isinstance(section[entry], Section):
                section[entry].configspec = configspec[entry]
                        

    def _write_line(self, indent_string, entry, this_entry, comment):
        """Write an individual line, for the write method"""
        # NOTE: the calls to self._quote here handle non-string values.
        if not self.unrepr:
            val = self._decode_element(self._quote(this_entry))
        else:
            val = repr(this_entry)
        return '%s%s%s%s%s' % (indent_string,
                               self._decode_element(self._quote(entry, multiline=False)),
                               self._a_to_u(' = '),
                               val,
                               self._decode_element(comment))


    def _write_marker(self, indent_string, depth, entry, comment):
        """Write a section marker line"""
        return '%s%s%s%s%s' % (indent_string,
                               self._a_to_u('[' * depth),
                               self._quote(self._decode_element(entry), multiline=False),
                               self._a_to_u(']' * depth),
                               self._decode_element(comment))


    def _handle_comment(self, comment):
        """Deal with a comment."""
        if not comment:
            return ''
        start = self.indent_type
        if not comment.startswith('#'):
            start += self._a_to_u(' # ')
        return (start + comment)
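
    # Illustrative behaviour (not part of the original source):
    #     _handle_comment('note')    -> indent + ' # note'
    #     _handle_comment('# note')  -> indent + '# note'
    #     _handle_comment('')        -> ''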


    # Public methods

    def write(self, outfile=None, section=None):
        """
        Write the current ConfigObj as a file
        
        tekNico: FIXME: use StringIO instead of real files
        
        >>> filename = a.filename
        >>> a.filename = 'test.ini'
        >>> a.write()
        >>> a.filename = filename
        >>> a == ConfigObj('test.ini', raise_errors=True)
        True
        >>> import os
        >>> os.remove('test.ini')
        """
        if self.indent_type is None:
            # this can be true if initialised from a dictionary
            self.indent_type = DEFAULT_INDENT_TYPE
            
        out = []
        cs = self._a_to_u('#')
        csp = self._a_to_u('# ')
        if section is None:
            int_val = self.interpolation
            self.interpolation = False
            section = self
            for line in self.initial_comment:
                line = self._decode_element(line)
                stripped_line = line.strip()
                if stripped_line and not stripped_line.startswith(cs):
                    line = csp + line
                out.append(line)
                
        indent_string = self.indent_type * section.depth
        for entry in (section.scalars + section.sections):
            if entry in section.defaults:
                # don't write out default values
                continue
            for comment_line in section.comments[entry]:
                comment_line = self._decode_element(comment_line.lstrip())
                if comment_line and not comment_line.startswith(cs):
                    comment_line = csp + comment_line
                out.append(indent_string + comment_line)
            this_entry = section[entry]
            comment = self._handle_comment(section.inline_comments[entry])
            
            if isinstance(this_entry, Section):
                # a section
                out.append(self._write_marker(
                    indent_string,
                    this_entry.depth,
                    entry,
                    comment))
                out.extend(self.write(section=this_entry))
            else:
                out.append(self._write_line(
                    indent_string,
                    entry,
                    this_entry,
                    comment))
                
        if section is self:
            for line in self.final_comment:
                line = self._decode_element(line)
                stripped_line = line.strip()
                if stripped_line and not stripped_line.startswith(cs):
                    line = csp + line
                out.append(line)
            self.interpolation = int_val
            
        if section is not self:
            return out
        
        if (self.filename is None) and (outfile is None):
            # output a list of lines
            # might need to encode
            # NOTE: This will *screw* UTF16, each line will start with the BOM
            if self.encoding:
                out = [l.encode(self.encoding) for l in out]
            if (self.BOM and ((self.encoding is None) or
                (BOM_LIST.get(self.encoding.lower()) == 'utf_8'))):
                # Add the UTF8 BOM
                if not out:
                    out.append('')
                out[0] = BOM_UTF8 + out[0]
            return out
        
        # Turn the list to a string, joined with correct newlines
        newline = self.newlines or os.linesep
        if (getattr(outfile, 'mode', None) is not None and outfile.mode == 'w'
            and sys.platform == 'win32' and newline == '\r\n'):
            # Windows specific hack to avoid writing '\r\r\n'
            newline = '\n'
        output = self._a_to_u(newline).join(out)
        if not output.endswith(newline):
            output += newline

        if isinstance(output, six.binary_type):
            output_bytes = output
        else:
            output_bytes = output.encode(self.encoding or
                                         self.default_encoding or
                                         'ascii')

        if self.BOM and ((self.encoding is None) or match_utf8(self.encoding)):
            # Add the UTF8 BOM
            output_bytes = BOM_UTF8 + output_bytes

        if outfile is not None:
            outfile.write(output_bytes)
        else:
            with open(self.filename, 'wb') as h:
                h.write(output_bytes)

    def validate(self, validator, preserve_errors=False, copy=False,
                 section=None):
        """
        Test the ConfigObj against a configspec.
        
        It uses the ``validator`` object from *validate.py*.
        
        To run ``validate`` on the current ConfigObj, call: ::
        
            test = config.validate(validator)
        
        (Normally the configspec will have been passed in when the ConfigObj
        was created - though you can dynamically assign a dictionary of checks
        to the ``configspec`` attribute of a section.)
        
        It returns ``True`` if everything passes, or a dictionary of
        pass/fails (True/False). If every member of a subsection passes, it
        will just have the value ``True``. (It also returns ``False`` if all
        members fail).
        
        In addition, it converts the values from strings to their native
        types if their checks pass (and ``stringify`` is set).
        
        If ``preserve_errors`` is ``True`` (``False`` is default) then instead
        of marking a failure with ``False``, it will preserve the actual
        exception object. This can contain info about the reason for failure.
        For example the ``VdtValueTooSmallError`` indicates that the value
        supplied was too small. If a value (or section) is missing it will
        still be marked as ``False``.
        
        You must have the validate module to use ``preserve_errors=True``.
        
        You can then use the ``flatten_errors`` function to turn your nested
        results dictionary into a flattened list of failures - useful for
        displaying meaningful error messages.
        """
        if section is None:
            if self.configspec is None:
                raise ValueError('No configspec supplied.')
            if preserve_errors:
                # We do this once to remove a top level dependency on the validate module
                # Which makes importing configobj faster
                from validate import VdtMissingValue
                self._vdtMissingValue = VdtMissingValue
                
            section = self

            if copy:
                section.initial_comment = section.configspec.initial_comment
                section.final_comment = section.configspec.final_comment
                section.encoding = section.configspec.encoding
                section.BOM = section.configspec.BOM
                section.newlines = section.configspec.newlines
                section.indent_type = section.configspec.indent_type
            
        #
        # section.default_values.clear() #??
        configspec = section.configspec
        self._set_configspec(section, copy)

        
        def validate_entry(entry, spec, val, missing, ret_true, ret_false):
            section.default_values.pop(entry, None)
                
            try:
                section.default_values[entry] = validator.get_default_value(configspec[entry])
            except (KeyError, AttributeError, validator.baseErrorClass):
                # No default, bad default or validator has no 'get_default_value'
                # (e.g. SimpleVal)
                pass
            
            try:
                check = validator.check(spec,
                                        val,
                                        missing=missing
                                        )
            except validator.baseErrorClass as e:
                if not preserve_errors or isinstance(e, self._vdtMissingValue):
                    out[entry] = False
                else:
                    # preserve the error
                    out[entry] = e
                    ret_false = False
                ret_true = False
            else:
                ret_false = False
                out[entry] = True
                if self.stringify or missing:
                    # if we are doing type conversion
                    # or the value is a supplied default
                    if not self.stringify:
                        if isinstance(check, (list, tuple)):
                            # preserve lists
                            check = [self._str(item) for item in check]
                        elif missing and check is None:
                            # convert the None from a default to a ''
                            check = ''
                        else:
                            check = self._str(check)
                    if (check != val) or missing:
                        section[entry] = check
                if not copy and missing and entry not in section.defaults:
                    section.defaults.append(entry)
            return ret_true, ret_false
        
        #
        out = {}
        ret_true = True
        ret_false = True
        
        unvalidated = [k for k in section.scalars if k not in configspec]
        incorrect_sections = [k for k in configspec.sections if k in section.scalars]        
        incorrect_scalars = [k for k in configspec.scalars if k in section.sections]
        
        for entry in configspec.scalars:
            if entry in ('__many__', '___many___'):
                # reserved names
                continue
            if (not entry in section.scalars) or (entry in section.defaults):
                # missing entries
                # or entries from defaults
                missing = True
                val = None
                if copy and entry not in section.scalars:
                    # copy comments
                    section.comments[entry] = (
                        configspec.comments.get(entry, []))
                    section.inline_comments[entry] = (
                        configspec.inline_comments.get(entry, ''))
                #
            else:
                missing = False
                val = section[entry]
            
            ret_true, ret_false = validate_entry(entry, configspec[entry], val, 
                                                 missing, ret_true, ret_false)
        
        many = None
        if '__many__' in configspec.scalars:
            many = configspec['__many__']
        elif '___many___' in configspec.scalars:
            many = configspec['___many___']
        
        if many is not None:
            for entry in unvalidated:
                val = section[entry]
                ret_true, ret_false = validate_entry(entry, many, val, False,
                                                     ret_true, ret_false)
            unvalidated = []

        for entry in incorrect_scalars:
            ret_true = False
            if not preserve_errors:
                out[entry] = False
            else:
                ret_false = False
                msg = 'Value %r was provided as a section' % entry
                out[entry] = validator.baseErrorClass(msg)
        for entry in incorrect_sections:
            ret_true = False
            if not preserve_errors:
                out[entry] = False
            else:
                ret_false = False
                msg = 'Section %r was provided as a single value' % entry
                out[entry] = validator.baseErrorClass(msg)
                
        # Missing sections will have been created as empty ones when the
        # configspec was read.
        for entry in section.sections:
            # FIXME: this means DEFAULT is not copied in copy mode
            if section is self and entry == 'DEFAULT':
                continue
            if section[entry].configspec is None:
                unvalidated.append(entry)
                continue
            if copy:
                section.comments[entry] = configspec.comments.get(entry, [])
                section.inline_comments[entry] = configspec.inline_comments.get(entry, '')
            check = self.validate(validator, preserve_errors=preserve_errors, copy=copy, section=section[entry])
            out[entry] = check
            if check == False:
                ret_true = False
            elif check == True:
                ret_false = False
            else:
                ret_true = False
        
        section.extra_values = unvalidated
        if preserve_errors and not section._created:
            # If the section wasn't created (i.e. it wasn't missing)
            # then we can't return False, we need to preserve errors
            ret_false = False
        #
        if ret_false and preserve_errors and out:
            # If we are preserving errors, but all
            # the failures are from missing sections / values
            # then we can return False. Otherwise there is a
            # real failure that we need to preserve.
            ret_false = not any(out.values())
        if ret_true:
            return True
        elif ret_false:
            return False
        return out
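
    # Illustrative sketch (not part of the method above): a typical call and
    # how to interpret its result. The 'app.ini' / 'app.spec' file names are
    # placeholders.
    #
    #     from configobj import ConfigObj
    #     from validate import Validator
    #
    #     config = ConfigObj('app.ini', configspec='app.spec')
    #     result = config.validate(Validator(), preserve_errors=True)
    #     if result is True:
    #         pass    # every entry passed and was converted to its native type
    #     elif result is False:
    #         pass    # every entry failed (or was missing)
    #     else:
    #         pass    # nested dict of True/False/exception per entry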


    def reset(self):
        """Clear ConfigObj instance and restore to 'freshly created' state."""
        self.clear()
        self._initialise()
        # FIXME: Should be done by '_initialise', but ConfigObj constructor (and reload)
        #        requires an empty dictionary
        self.configspec = None
        # Just to be sure ;-)
        self._original_configspec = None
        
        
    def reload(self):
        """
        Reload a ConfigObj from file.
        
        This method raises a ``ReloadError`` if the ConfigObj doesn't have
        a filename attribute pointing to a file.
        """
        if not isinstance(self.filename, six.string_types):
            raise ReloadError()

        filename = self.filename
        current_options = {}
        for entry in OPTION_DEFAULTS:
            if entry == 'configspec':
                continue
            current_options[entry] = getattr(self, entry)
            
        configspec = self._original_configspec
        current_options['configspec'] = configspec
            
        self.clear()
        self._initialise(current_options)
        self._load(filename, configspec)
        


class SimpleVal(object):
    """
    A simple validator.
    Can be used to check that all members expected are present.
    
    To use it, provide a configspec with all your members in (the value given
    will be ignored). Pass an instance of ``SimpleVal`` to the ``validate``
    method of your ``ConfigObj``. ``validate`` will return ``True`` if all
    members are present, or a dictionary with True/False meaning
    present/missing. (Whole missing sections will be replaced with ``False``)
    """
    
    def __init__(self):
        self.baseErrorClass = ConfigObjError
    
    def check(self, check, member, missing=False):
        """A dummy check method, always returns the value unchanged."""
        if missing:
            raise self.baseErrorClass()
        return member
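
# Illustrative sketch: using SimpleVal to check only that expected members are
# present. 'app.ini' and 'app.spec' are placeholder file names; the values in
# the configspec are ignored, only its keys matter.
#
#     config = ConfigObj('app.ini', configspec='app.spec')
#     result = config.validate(SimpleVal())
#     if result is not True:
#         # result is a (possibly nested) dict of True/False per member;
#         # flatten_errors() below turns it into a flat list of failures.
#         print(flatten_errors(config, result))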


def flatten_errors(cfg, res, levels=None, results=None):
    """
    An example function that will turn a nested dictionary of results
    (as returned by ``ConfigObj.validate``) into a flat list.
    
    ``cfg`` is the ConfigObj instance being checked, ``res`` is the results
    dictionary returned by ``validate``.
    
    (This is a recursive function, so you shouldn't use the ``levels`` or
    ``results`` arguments - they are used by the function.)
    
    Returns a list of keys that failed. Each member of the list is a tuple::
    
        ([list of sections...], key, result)
    
    If ``validate`` was called with ``preserve_errors=False`` (the default)
    then ``result`` will always be ``False``.

    *list of sections* is a flattened list of sections that the key was found
    in.
    
    If the section was missing (or a section was expected and a scalar provided
    - or vice-versa) then key will be ``None``.
    
    If the value (or section) was missing then ``result`` will be ``False``.
    
    If ``validate`` was called with ``preserve_errors=True`` and a value
    was present, but failed the check, then ``result`` will be the exception
    object returned. You can use this as a string that describes the failure.
    
    For example *The value "3" is of the wrong type*.
    """
    if levels is None:
        # first time called
        levels = []
        results = []
    if res == True:
        return sorted(results)
    if res == False or isinstance(res, Exception):
        results.append((levels[:], None, res))
        if levels:
            levels.pop()
        return sorted(results)
    for (key, val) in list(res.items()):
        if val == True:
            continue
        if isinstance(cfg.get(key), dict):
            # Go down one level
            levels.append(key)
            flatten_errors(cfg[key], val, levels, results)
            continue
        results.append((levels[:], key, val))
    #
    # Go up one level
    if levels:
        levels.pop()
    #
    return sorted(results)
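
# Illustrative sketch: reporting failures from a validate() call. ``config``
# and ``result`` are assumed to come from a prior
# ``config.validate(validator, preserve_errors=True)``.
#
#     for sections, key, error in flatten_errors(config, result):
#         name = '.'.join(sections + [key if key is not None else '[section]'])
#         if error is False:
#             print('%s is missing' % name)
#         else:
#             print('%s failed validation: %s' % (name, error))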


def get_extra_values(conf, _prepend=()):
    """
    Find all the values and sections not in the configspec from a validated
    ConfigObj.
    
    ``get_extra_values`` returns a list of tuples where each tuple represents
    either an extra section, or an extra value.
    
    The tuples contain two values, a tuple representing the section the value 
    is in and the name of the extra values. For extra values in the top level
    section the first member will be an empty tuple. For values in the 'foo'
    section the first member will be ``('foo',)``. For members in the 'bar'
    subsection of the 'foo' section the first member will be ``('foo', 'bar')``.
    
    NOTE: If you call ``get_extra_values`` on a ConfigObj instance that hasn't
    been validated it will return an empty list.
    """
    out = []
    
    out.extend([(_prepend, name) for name in conf.extra_values])
    for name in conf.sections:
        if name not in conf.extra_values:
            out.extend(get_extra_values(conf[name], _prepend + (name,)))
    return out
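
# Illustrative sketch: listing entries a validated ConfigObj contains that the
# configspec does not cover. ``config`` is assumed to have been validated.
#
#     for section_path, name in get_extra_values(config):
#         # e.g. ((), 'stray_key') for a top level value, or
#         #      (('foo', 'bar'), 'extra') for a value in the 'bar' subsection
#         #      of the 'foo' section
#         print(section_path, name)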


"""*A programming language is a medium of expression.* - Paul Graham"""
site-packages/nftables/__init__.py000064400000000030147511334620013172 0ustar00from .nftables import *
site-packages/nftables/nftables.py000064400000034135147511334620013246 0ustar00#!/usr/bin/python
# Copyright(C) 2018 Phil Sutter <phil@nwl.cc>

# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, version 2 of the License.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software
# Foundation, Inc., 59 Temple Place - Suite 330, Boston, MA 02111-1307, USA.

import json
from ctypes import *
import sys
import os

NFTABLES_VERSION = "0.1"

class SchemaValidator:
    """Libnftables JSON validator using jsonschema"""

    def __init__(self):
        schema_path = os.path.join(os.path.dirname(__file__), "schema.json")
        with open(schema_path, 'r') as schema_file:
            self.schema = json.load(schema_file)
        import jsonschema
        self.jsonschema = jsonschema

    def validate(self, json):
        self.jsonschema.validate(instance=json, schema=self.schema)

class Nftables:
    """A class representing libnftables interface"""

    debug_flags = {
        "scanner":   0x1,
        "parser":    0x2,
        "eval":      0x4,
        "netlink":   0x8,
        "mnl":       0x10,
        "proto-ctx": 0x20,
        "segtree":   0x40,
    }

    output_flags = {
        "reversedns":     (1 << 0),
        "service":        (1 << 1),
        "stateless":      (1 << 2),
        "handle":         (1 << 3),
        "json":           (1 << 4),
        "echo":           (1 << 5),
        "guid":           (1 << 6),
        "numeric_proto":  (1 << 7),
        "numeric_prio":   (1 << 8),
        "numeric_symbol": (1 << 9),
        "numeric_time":   (1 << 10),
        "terse":          (1 << 11),
    }

    validator = None

    def __init__(self, sofile="libnftables.so.1.1.0"):
        """Instantiate a new Nftables class object.

        Accepts a shared object file to open; by default the standard search
        path is searched for a file named 'libnftables.so.1.1.0'.

        After loading the library using ctypes module, a new nftables context
        is requested from the library and buffering of output and error streams
        is turned on.
        """
        lib = cdll.LoadLibrary(sofile)

        ### API function definitions

        self.nft_ctx_new = lib.nft_ctx_new
        self.nft_ctx_new.restype = c_void_p
        self.nft_ctx_new.argtypes = [c_int]

        self.nft_ctx_output_get_flags = lib.nft_ctx_output_get_flags
        self.nft_ctx_output_get_flags.restype = c_uint
        self.nft_ctx_output_get_flags.argtypes = [c_void_p]

        self.nft_ctx_output_set_flags = lib.nft_ctx_output_set_flags
        self.nft_ctx_output_set_flags.argtypes = [c_void_p, c_uint]

        self.nft_ctx_output_get_debug = lib.nft_ctx_output_get_debug
        self.nft_ctx_output_get_debug.restype = c_int
        self.nft_ctx_output_get_debug.argtypes = [c_void_p]

        self.nft_ctx_output_set_debug = lib.nft_ctx_output_set_debug
        self.nft_ctx_output_set_debug.argtypes = [c_void_p, c_int]

        self.nft_ctx_buffer_output = lib.nft_ctx_buffer_output
        self.nft_ctx_buffer_output.restype = c_int
        self.nft_ctx_buffer_output.argtypes = [c_void_p]

        self.nft_ctx_get_output_buffer = lib.nft_ctx_get_output_buffer
        self.nft_ctx_get_output_buffer.restype = c_char_p
        self.nft_ctx_get_output_buffer.argtypes = [c_void_p]

        self.nft_ctx_buffer_error = lib.nft_ctx_buffer_error
        self.nft_ctx_buffer_error.restype = c_int
        self.nft_ctx_buffer_error.argtypes = [c_void_p]

        self.nft_ctx_get_error_buffer = lib.nft_ctx_get_error_buffer
        self.nft_ctx_get_error_buffer.restype = c_char_p
        self.nft_ctx_get_error_buffer.argtypes = [c_void_p]

        self.nft_run_cmd_from_buffer = lib.nft_run_cmd_from_buffer
        self.nft_run_cmd_from_buffer.restype = c_int
        self.nft_run_cmd_from_buffer.argtypes = [c_void_p, c_char_p]

        self.nft_ctx_free = lib.nft_ctx_free
        lib.nft_ctx_free.argtypes = [c_void_p]

        # initialize libnftables context
        self.__ctx = self.nft_ctx_new(0)
        self.nft_ctx_buffer_output(self.__ctx)
        self.nft_ctx_buffer_error(self.__ctx)

    def __del__(self):
        self.nft_ctx_free(self.__ctx)

    def __get_output_flag(self, name):
        flag = self.output_flags[name]
        return self.nft_ctx_output_get_flags(self.__ctx) & flag

    def __set_output_flag(self, name, val):
        flag = self.output_flags[name]
        flags = self.nft_ctx_output_get_flags(self.__ctx)
        if val:
            new_flags = flags | flag
        else:
            new_flags = flags & ~flag
        self.nft_ctx_output_set_flags(self.__ctx, new_flags)
        return flags & flag

    def get_reversedns_output(self):
        """Get the current state of reverse DNS output.

        Returns a boolean indicating whether reverse DNS lookups are performed
        for IP addresses in output.
        """
        return self.__get_output_flag("reversedns")

    def set_reversedns_output(self, val):
        """Enable or disable reverse DNS output.

        Accepts a boolean turning reverse DNS lookups in output on or off.

        Returns the previous value.
        """
        return self.__set_output_flag("reversedns", val)

    def get_service_output(self):
        """Get the current state of service name output.

        Returns a boolean indicating whether service names are used for port
        numbers in output or not.
        """
        return self.__get_output_flag("service")

    def set_service_output(self, val):
        """Enable or disable service name output.

        Accepts a boolean turning service names for port numbers in output on
        or off.

        Returns the previous value.
        """
        return self.__set_output_flag("service", val)

    def get_stateless_output(self):
        """Get the current state of stateless output.

        Returns a boolean indicating whether stateless output is active or not.
        """
        return self.__get_output_flag("stateless")

    def set_stateless_output(self, val):
        """Enable or disable stateless output.

        Accepts a boolean turning stateless output either on or off.

        Returns the previous value.
        """
        return self.__set_output_flag("stateless", val)

    def get_handle_output(self):
        """Get the current state of handle output.

        Returns a boolean indicating whether handle output is active or not.
        """
        return self.__get_output_flag("handle")

    def set_handle_output(self, val):
        """Enable or disable handle output.

        Accepts a boolean turning handle output on or off.

        Returns the previous value.
        """
        return self.__set_output_flag("handle", val)

    def get_json_output(self):
        """Get the current state of JSON output.

        Returns a boolean indicating whether JSON output is active or not.
        """
        return self.__get_output_flag("json")

    def set_json_output(self, val):
        """Enable or disable JSON output.

        Accepts a boolean turning JSON output either on or off.

        Returns the previous value.
        """
        return self.__set_output_flag("json", val)

    def get_echo_output(self):
        """Get the current state of echo output.

        Returns a boolean indicating whether echo output is active or not.
        """
        return self.__get_output_flag("echo")

    def set_echo_output(self, val):
        """Enable or disable echo output.

        Accepts a boolean turning echo output on or off.

        Returns the previous value.
        """
        return self.__set_output_flag("echo", val)

    def get_guid_output(self):
        """Get the current state of GID/UID output.

        Returns a boolean indicating whether names for group/user IDs are used
        in output or not.
        """
        return self.__get_output_flag("guid")

    def set_guid_output(self, val):
        """Enable or disable GID/UID output.

        Accepts a boolean turning names for group/user IDs on or off.

        Returns the previous value.
        """
        return self.__set_output_flag("guid", val)

    def get_numeric_proto_output(self):
        """Get current status of numeric protocol output flag.

        Returns a boolean value indicating the status.
        """
        return self.__get_output_flag("numeric_proto")

    def set_numeric_proto_output(self, val):
        """Set numeric protocol output flag.

        Accepts a boolean turning numeric protocol output either on or off.

        Returns the previous value.
        """
        return self.__set_output_flag("numeric_proto", val)

    def get_numeric_prio_output(self):
        """Get current status of numeric chain priority output flag.

        Returns a boolean value indicating the status.
        """
        return self.__get_output_flag("numeric_prio")

    def set_numeric_prio_output(self, val):
        """Set numeric chain priority output flag.

        Accepts a boolean turning numeric chain priority output either on or
        off.

        Returns the previous value.
        """
        return self.__set_output_flag("numeric_prio", val)

    def get_numeric_symbol_output(self):
        """Get current status of numeric symbols output flag.

        Returns a boolean value indicating the status.
        """
        return self.__get_output_flag("numeric_symbol")

    def set_numeric_symbol_output(self, val):
        """Set numeric symbols output flag.

        Accepts a boolean turning numeric representation of symbolic constants
        in output either on or off.

        Returns the previous value.
        """
        return self.__set_output_flag("numeric_symbol", val)

    def get_numeric_time_output(self):
        """Get current status of numeric times output flag.

        Returns a boolean value indicating the status.
        """
        return self.__get_output_flag("numeric_time")

    def set_numeric_time_output(self, val):
        """Set numeric times output flag.

        Accepts a boolean turning numeric representation of time values
        in output either on or off.

        Returns the previous value.
        """
        return self.__set_output_flag("numeric_time", val)

    def get_terse_output(self):
        """Get the current state of terse output.

        Returns a boolean indicating whether terse output is active or not.
        """
        return self.__get_output_flag("terse")

    def set_terse_output(self, val):
        """Enable or disable terse output.

        Accepts a boolean turning terse output either on or off.

        Returns the previous value.
        """
        return self.__set_output_flag("terse", val)

    def get_debug(self):
        """Get currently active debug flags.

        Returns a set of flag names. See set_debug() for details.
        """
        val = self.nft_ctx_output_get_debug(self.__ctx)

        names = []
        for n,v in self.debug_flags.items():
            if val & v:
                names.append(n)
                val &= ~v
        if val:
            names.append(val)

        return names

    def set_debug(self, values):
        """Set debug output flags.

        Accepts either a single flag or a set of flags. Each flag might be
        given either as string or integer value as shown in the following
        table:

        Name      | Value (hex)
        -----------------------
        scanner   | 0x1
        parser    | 0x2
        eval      | 0x4
        netlink   | 0x8
        mnl       | 0x10
        proto-ctx | 0x20
        segtree   | 0x40

        Returns a set of previously active debug flags, as returned by
        get_debug() method.
        """
        old = self.get_debug()

        if type(values) in [str, int]:
            values = [values]

        val = 0
        for v in values:
            if type(v) is str:
                v = self.debug_flags[v]
            val |= v

        self.nft_ctx_output_set_debug(self.__ctx, val)

        return old
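
    # Illustrative sketch (assumes an existing ``nft`` Nftables instance):
    # flags may be given as names or raw integers, and the previously active
    # set is returned so it can be restored later.
    #
    #     previous = nft.set_debug(["netlink", "mnl"])
    #     ...                                # run some commands with debugging
    #     nft.set_debug(previous)            # restore the earlier flags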

    def cmd(self, cmdline):
        """Run a simple nftables command via libnftables.

        Accepts a string containing an nftables command just like what one
        would enter into an interactive nftables (nft -i) session.

        Returns a tuple (rc, output, error):
        rc     -- return code as returned by nft_run_cmd_from_buffer() function
        output -- a string containing output written to stdout
        error  -- a string containing output written to stderr
        """
        cmdline_is_unicode = False
        if not isinstance(cmdline, bytes):
            cmdline_is_unicode = True
            cmdline = cmdline.encode("utf-8")
        rc = self.nft_run_cmd_from_buffer(self.__ctx, cmdline)
        output = self.nft_ctx_get_output_buffer(self.__ctx)
        error = self.nft_ctx_get_error_buffer(self.__ctx)
        if cmdline_is_unicode:
            output = output.decode("utf-8")
            error = error.decode("utf-8")

        return (rc, output, error)

    def json_cmd(self, json_root):
        """Run an nftables command in JSON syntax via libnftables.

        Accepts a hash object as input.

        Returns a tuple (rc, output, error):
        rc     -- return code as returned by nft_run_cmd_from_buffer() function
        output -- a hash object containing library standard output
        error  -- a string containing output written to stderr
        """
        json_out_old = self.set_json_output(True)
        rc, output, error = self.cmd(json.dumps(json_root))
        if not json_out_old:
            self.set_json_output(json_out_old)
        if len(output):
            output = json.loads(output)
        return (rc, output, error)

    def json_validate(self, json_root):
        """Validate JSON object against libnftables schema.

        Accepts a hash object as input.

        Returns True if JSON is valid, raises an exception otherwise.
        """
        if not self.validator:
            self.validator = SchemaValidator()

        self.validator.validate(json_root)
        return True
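
# Illustrative sketch: basic use of the wrapper above. Running it needs
# libnftables installed and normally root privileges; the "inet"/"filter"
# table below is only an example.
#
#     nft = Nftables()
#     rc, output, error = nft.cmd("list ruleset")
#
#     cmd = {"nftables": [{"add": {"table": {"family": "inet", "name": "filter"}}}]}
#     nft.json_validate(cmd)                 # raises if the schema check fails
#     rc, output, error = nft.json_cmd(cmd)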
site-packages/nftables/schema.json000064400000000417147511334620013225 0ustar00{
	"$schema": "http://json-schema.org/schema#",
	"description": "libnftables JSON API schema",

	"type": "object",
        "properties": {
		"nftables": {
			"type": "array",
			"minitems": 0,
			"items": {
				"type": "object"
			}
		}
	},
	"required": [ "nftables" ]
}
site-packages/certifi/core.py000064400000001420147511334620012216 0ustar00# -*- coding: utf-8 -*-

"""
certifi.py
~~~~~~~~~~

This module returns the installation location of cacert.pem.
"""
import os
import warnings


class DeprecatedBundleWarning(DeprecationWarning):
    """
    The weak security bundle is being deprecated. Please bother your service
    provider to get them to stop using cross-signed roots.
    """


def where():
    return '/etc/pki/tls/certs/ca-bundle.crt'


def old_where():
    warnings.warn(
        "The weak security bundle has been removed. certifi.old_where() is now an alias "
        "of certifi.where(). Please update your code to use certifi.where() instead. "
        "certifi.old_where() will be removed in 2018.",
        DeprecatedBundleWarning
    )
    return where()

if __name__ == '__main__':
    print(where())
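
# Illustrative sketch: the path returned by where() can be used anywhere a CA
# bundle file is expected, e.g. with the standard library ssl module.
#
#     import ssl
#     context = ssl.create_default_context(cafile=where())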
site-packages/certifi/__main__.py000064400000000051147511334620013005 0ustar00from certifi import where
print(where())
site-packages/certifi/__init__.py000064400000000077147511334620013034 0ustar00from .core import where, old_where

__version__ = "2018.10.15"
site-packages/slip.dbus-0.6.4-py3.6.egg-info000064400000000415147511334620014132 0ustar00Metadata-Version: 1.1
Name: slip.dbus
Version: 0.6.4
Summary: UNKNOWN
Home-page: UNKNOWN
Author: UNKNOWN
Author-email: UNKNOWN
License: UNKNOWN
Description: UNKNOWN
Platform: UNKNOWN
Requires: dbus
Requires: decorator
Requires: StringIO
Requires: xml.etree.ElementTree
site-packages/slip/dbus/__init__.py000064400000000336147511334620013311 0ustar00# -*- coding: utf-8 -*-

from __future__ import absolute_import

from . import bus
from .bus import SessionBus, SystemBus, StarterBus
from . import proxies
from . import service
from . import polkit
from . import mainloop
site-packages/slip/dbus/proxies.py000064400000003540147511334620013243 0ustar00# -*- coding: utf-8 -*-

# slip.dbus.proxies -- slightly augmented dbus proxy classes
#
# Copyright © 2005-2007 Collabora Ltd. <http://www.collabora.co.uk/>
# Copyright © 2009, 2011 Red Hat, Inc.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program.  If not, see <http://www.gnu.org/licenses/>.
#
# Authors:
# Nils Philippsen <nils@redhat.com>

"""This module contains D-Bus proxy classes which implement the default
timeout of the augmented bus classes in slip.dbus.bus."""

from __future__ import absolute_import

import dbus.proxies

from . import constants


class _ProxyMethod(dbus.proxies._ProxyMethod):

    _connections_default_timeouts = {}

    @property
    def default_timeout(self):
        if self._connection not in self._connections_default_timeouts:
            dt = getattr(self._proxy._bus, "default_timeout", None)
            if dt is None:
                dt = constants.method_call_no_timeout
            self._connections_default_timeouts[self._connection] = dt
        return self._connections_default_timeouts[self._connection]

    def __call__(self, *args, **kwargs):
        if kwargs.get('timeout') is None:
            kwargs["timeout"] = self.default_timeout

        return dbus.proxies._ProxyMethod.__call__(self, *args, **kwargs)


class ProxyObject(dbus.proxies.ProxyObject):

    ProxyMethodClass = _ProxyMethod
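
# Illustrative sketch: the bus factories in slip.dbus.bus install ProxyObject
# as their ProxyObjectClass, so method calls made through them pick up the
# long default timeout automatically. The bus name, object path and method
# below are placeholders.
#
#     from slip.dbus import SystemBus
#
#     bus = SystemBus()
#     obj = bus.get_object("org.example.Service", "/org/example/Object")
#     obj.SomeMethod()                   # no explicit timeout needed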
site-packages/slip/dbus/constants.py000064400000002700147511334620013563 0ustar00# -*- coding: utf-8 -*-

# slip.dbus.constants -- constant values
#
# Copyright © 2011 Red Hat, Inc.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program.  If not, see <http://www.gnu.org/licenses/>.
#
# Authors:
# Nils Philippsen <nils@redhat.com>

"""This module contains some constant values."""

# The maximum value of a 32bit signed integer is the magic value to indicate an
# infinite timeout for dbus. Unlike the C interface which deals with
# milliseconds as integers, the python interface uses seconds as floats for the
# timeout. Therefore we need to use the Python float (C double) value that
# gives 0x7FFFFFFF if multiplied by 1000.0 and cast into an integer.
#
# This calculation should be precise enough to get a value of 0x7FFFFFFF on the
# C side. If not, it will still amount to a very long time (not quite 25 days)
# which should be enough for all intents and purposes.
method_call_no_timeout = 0x7FFFFFFF / 1000.0
site-packages/slip/dbus/bus.py000064400000002531147511334620012342 0ustar00# -*- coding: utf-8 -*-

# slip.dbus.bus -- augmented dbus buses
#
# Copyright © 2009, 2011 Red Hat, Inc.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program.  If not, see <http://www.gnu.org/licenses/>.
#
# Authors:
# Nils Philippsen <nils@redhat.com>

"""This module contains functions which create monkey-patched/augmented D-Bus
buses."""

from __future__ import absolute_import

import dbus
from . import proxies
from . import constants

for name in ("Bus", "SystemBus", "SessionBus", "StarterBus"):
    exec(
        """def %(name)s(*args, **kwargs):
    busobj = dbus.%(name)s(*args, **kwargs)
    busobj.ProxyObjectClass = proxies.ProxyObject
    busobj.default_timeout = %(default_timeout)s
    return busobj
""" % {
        "name": name, "modname": __name__,
        "default_timeout": constants.method_call_no_timeout})
site-packages/slip/dbus/introspection.py000064400000010330147511334620014445 0ustar00# -*- coding: utf-8 -*-

# slip.dbus.introspection -- access dbus introspection data
#
# Copyright © 2011 Red Hat, Inc.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program.  If not, see <http://www.gnu.org/licenses/>.
#
# Authors:
# Nils Philippsen <nils@redhat.com>

"""Classes and functions to easily access DBus introspection data."""

from __future__ import absolute_import

from xml.etree.ElementTree import ElementTree
from io import StringIO
from six import with_metaclass


class IElemMeta(type):
    """Metaclass for introspection elements.

    Sets elemname class member automatically from class name if not set
    explicitly. Registers classes for their element names."""

    elemnames_to_classes = {}

    @classmethod
    def clsname_to_elemname(cls, clsname):
        elemname = ""
        for c in clsname:
            c_lower = c.lower()
            if c_lower != c:
                if len(elemname):
                    elemname += "_"
            elemname += c_lower
        return elemname
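
    # For example (illustrative): clsname_to_elemname("FooBar") returns
    # "foo_bar", so a class named "IElemFooBar" registers itself for
    # introspection elements with the tag "foo_bar".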

    def __new__(cls, name, bases, dct):
        if name == "IElem":
            return type.__new__(cls, name, bases, dct)

        if 'elemname' not in dct:
            if not name.startswith("IElem"):
                raise TypeError(
                    "Class '%s' needs to set elemname (or be called "
                    "'IElem...')" % name)
            dct['elemname'] = IElemMeta.clsname_to_elemname(name[5:])

        elemname = dct['elemname']

        if elemname in IElemMeta.elemnames_to_classes:
            raise TypeError(
                "Class '%s' tries to register duplicate elemname '%s'" %
                (name, elemname))

        kls = type.__new__(cls, name, bases, dct)

        IElemMeta.elemnames_to_classes[elemname] = kls

        return kls


class IElem(with_metaclass(IElemMeta, object)):
    """Base class for introspection elements."""

    def __new__(cls, elem, parent=None):
        kls = IElemMeta.elemnames_to_classes.get(
            elem.tag, IElemMeta.elemnames_to_classes[None])
        # object.__new__() must not be given extra arguments on Python 3;
        # elem and parent are handled by __init__().
        return super(IElem, cls).__new__(kls)

    def __init__(self, elem, parent=None):
        self.elem = elem
        self.parent = parent
        self.child_elements = [IElem(c, parent=self) for c in elem]

    def __str__(self):
        s = "%s %r" % (self.elemname if self.elemname else "unknown:%s" %
            self.elem.tag, self.attrib)
        for c in self.child_elements:
            for cc in str(c).split("\n"):
                s += "\n  %s" % (cc)
        return s

    @property
    def attrib(self):
        return self.elem.attrib


class IElemUnknown(IElem):
    """Catch-all for unknown introspection elements."""

    elemname = None


class IElemNameMixin(object):
    """Mixin for introspection elements with names."""

    @property
    def name(self):
        return self.attrib['name']


class IElemNode(IElem, IElemNameMixin):
    """Introspection node."""

    def __init__(self, elem, parent=None):
        super(IElemNode, self).__init__(elem, parent)

        self.child_nodes = [
            c for c in self.child_elements if isinstance(c, IElemNode)]


class IElemInterface(IElem):
    """Introspection interface."""


class IElemMethod(IElem):
    """Introspection interface method."""


class IElemArg(IElem):
    """Introspection method argument."""


class IElemSignal(IElem, IElemNameMixin):
    """Introspection interface signal."""


def introspect(string_or_file):
    tree = ElementTree()
    # assume string if read() method doesn't exist, works for string, unicode,
    # dbus.String
    if not hasattr(string_or_file, "read"):
        string_or_file = StringIO(string_or_file)
    xml_root = tree.parse(string_or_file)
    elem_root = IElem(xml_root)
    return elem_root
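
# Illustrative usage (a sketch, not part of the original module; it assumes a
# reachable system bus). The well-known D-Bus names below are real, the rest
# is only an example:
#
#     import dbus
#     from slip.dbus import introspection
#
#     bus = dbus.SystemBus()
#     obj = bus.get_object("org.freedesktop.DBus", "/org/freedesktop/DBus")
#     xml = obj.Introspect(
#         dbus_interface="org.freedesktop.DBus.Introspectable")
#     root = introspection.introspect(xml)
#     print(root)   # prints the node / interface / method / signal tree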
site-packages/slip/dbus/mainloop.py000064400000006443147511334620013375 0ustar00# -*- coding: utf-8 -*-

# slip.dbus.mainloop -- mainloop wrappers
#
# Copyright © 2009, 2012 Red Hat, Inc.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program.  If not, see <http://www.gnu.org/licenses/>.
#
# Authors:
# Nils Philippsen <nils@redhat.com>

"""This module contains mainloop wrappers.

Currently only glib main loops are supported."""

from __future__ import absolute_import

__all__ = ("MainLoop", "set_type")


class MainLoop(object):
    """An abstract main loop wrapper class and factory.

    Use MainLoop() to get a main loop wrapper object for a main loop type
    previously registered with set_type(). Defaults to glib main loops.

    Actual main loop wrapper classes are derived from this class."""

    __mainloop_class = None

    def __new__(cls, *args, **kwargs):
        if MainLoop.__mainloop_class is None:
            MainLoop.set_type("glib")
        # object.__new__() takes no extra arguments on Python 3; any arguments
        # are passed on to the wrapper class's __init__() instead.
        return super(MainLoop, cls).__new__(MainLoop.__mainloop_class)

    @classmethod
    def set_type(cls, mltype):
        """Set a main loop type for non-blocking interfaces.

        mltype: "glib" (currently only glib main loops are supported)"""

        if MainLoop.__mainloop_class is not None:
            raise RuntimeError("The main loop type can only be set once.")

        ml_type_class = {"glib": GlibMainLoop}

        if mltype in ml_type_class:
            MainLoop.__mainloop_class = ml_type_class[mltype]
        else:
            raise ValueError(
                "'%s' is not one of the valid main loop types:\n%s" %
                (mltype, ", ".join(ml_type_class)))

    def pending(self):
        """Returns if there are pending events."""

        raise NotImplementedError()

    def iterate(self):
        """Iterates over one pending event."""

        raise NotImplementedError()

    def iterate_over_pending_events(self):
        """Iterates over all pending events."""

        while self.pending():
            self.iterate()

    def run(self):
        """Runs the main loop."""

        raise NotImplementedError()

    def quit(self):
        """Quits the main loop."""

        raise NotImplementedError()


class GlibMainLoop(MainLoop):

    def __init__(self):
        from .._wrappers import _glib
        ml = _glib.MainLoop()
        ctx = ml.get_context()

        self._mainloop = ml
        self.pending = ctx.pending
        self.iterate = ctx.iteration
        self.run = ml.run
        self.quit = ml.quit


def set_type(mltype):
    """Set a main loop type for non-blocking interfaces.

    mltype: "glib" (currently only glib main loops are supported)

    Deprecated, use MainLoop.set_type() instead."""

    from warnings import warn

    warn("use MainLoop.set_type() instead", DeprecationWarning)

    MainLoop.set_type(mltype)
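
# Illustrative usage (a sketch, not part of the original module):
#
#     from slip.dbus import mainloop
#
#     loop = mainloop.MainLoop()          # defaults to the "glib" wrapper
#     loop.iterate_over_pending_events()  # drain whatever is currently queued
#     # loop.run() blocks until loop.quit() is called from a callback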
site-packages/slip/dbus/service.py000064400000017640147511334620013220 0ustar00# -*- coding: utf-8 -*-

# slip.dbus.service -- convenience functions for using dbus-activated
# services
#
# Copyright © 2008, 2009, 2015 Red Hat, Inc.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program.  If not, see <http://www.gnu.org/licenses/>.
#
# Authors:
# Nils Philippsen <nils@redhat.com>

"This module contains convenience functions for using dbus-activated services."

from __future__ import absolute_import

import dbus
import dbus.service
from six import with_metaclass

from .._wrappers import _glib as GLib

from . import polkit

__all__ = ["Object", "InterfaceType", "set_mainloop"]

__mainloop__ = None


def __glib_quit_cb__():
    global __mainloop__

    # assume a Glib mainloop

    __mainloop__.quit()


__quit_cb__ = __glib_quit_cb__


def set_mainloop(mainloop):
    global __mainloop__
    __mainloop__ = mainloop


def set_quit_cb(quit_cb):
    global __quit_cb__
    __quit_cb__ = quit_cb


def quit_cb():
    global __quit_cb__
    __quit_cb__()


SENDER_KEYWORD = "__slip_dbus_service_sender__"
ASYNC_CALLBACKS = ("__slip_dbus_service_reply_cb__",
                   "__slip_dbus_service_error_cb__")


def wrap_method(method):
    global SENDER_KEYWORD
    global ASYNC_CALLBACKS

    if method._dbus_sender_keyword is not None:
        sender_keyword = method._dbus_sender_keyword
        hide_sender_keyword = False
    else:
        sender_keyword = SENDER_KEYWORD
        hide_sender_keyword = True

    if method._dbus_async_callbacks is not None:
        async_callbacks = method._dbus_async_callbacks
        method_is_async = True
    else:
        async_callbacks = ASYNC_CALLBACKS
        method_is_async = False
    hide_async_callbacks = not method_is_async

    def wrapped_method(self, *p, **k):
        sender = k.get(sender_keyword)
        if sender is not None:
            # i.e. called over the bus, not locally
            reply_cb = k[async_callbacks[0]]
            error_cb = k[async_callbacks[1]]

            if hide_sender_keyword:
                del k[sender_keyword]

            if hide_async_callbacks:
                del k[async_callbacks[0]]
                del k[async_callbacks[1]]

            self.sender_seen(sender)

        action_id = getattr(method, "_slip_polkit_auth_required",
                            getattr(self, "default_polkit_auth_required",
                                    None))

        if sender is not None and action_id:

            def reply_handler(is_auth):
                if is_auth:
                    if method_is_async:

                        # k contains async callbacks, simply pass on reply_cb
                        # and error_cb

                        method(self, *p, **k)
                    else:

                        # execute the synchronous method ...

                        error = None
                        try:
                            result = method(self, *p, **k)
                        except Exception as e:
                            error = e

                        # ... and call the reply or error callback

                        if error:
                            error_cb(error)
                        else:

                            # reply_cb((None,)) != reply_cb()

                            if result is None:
                                reply_cb()
                            else:
                                reply_cb(result)
                else:
                    error_cb(polkit.NotAuthorizedException(action_id))
                self.timeout_restart()

            def error_handler(error):
                error_cb(error)
                self.timeout_restart()

            polkit.IsSystemBusNameAuthorizedAsync(
                sender, action_id,
                reply_handler=reply_handler, error_handler=error_handler)
        else:
            # no action id, or run locally, no need to do anything fancy
            retval = method(self, *p, **k)
            self.timeout_restart()
            return retval

    for attr in (x for x in dir(method) if x[:6] == "_dbus_"):
        if attr == "_dbus_sender_keyword":
            wrapped_method._dbus_sender_keyword = sender_keyword
        elif attr == "_dbus_async_callbacks":
            wrapped_method._dbus_async_callbacks = async_callbacks
        else:
            setattr(wrapped_method, attr, getattr(method, attr))

        # delattr (method, attr)

    wrapped_method.__name__ = method.__name__

    return wrapped_method


class InterfaceType(dbus.service.InterfaceType):

    def __new__(cls, name, bases, dct):

        for (attrname, attr) in dct.items():
            if getattr(attr, "_dbus_is_method", False):
                dct[attrname] = wrap_method(attr)
        return super(InterfaceType, cls).__new__(cls, name, bases, dct)


class Object(with_metaclass(InterfaceType, dbus.service.Object)):

    # timeout & persistence

    persistent = False
    default_duration = 5
    duration = default_duration
    current_source = None
    senders = set()
    connections_senders = {}
    connections_smobjs = {}

    # PolicyKit

    default_polkit_auth_required = None

    def __init__(
        self, conn=None, object_path=None, bus_name=None, persistent=None):

        super(Object, self).__init__(conn, object_path, bus_name)
        if persistent is None:
            self.persistent = self.__class__.persistent
        else:
            self.persistent = persistent

    def _timeout_cb(self):
        if not self.persistent and len(Object.senders) == 0:
            quit_cb()
            return False

        Object.current_source = None
        Object.duration = self.default_duration

        return False

    def _name_owner_changed(self, name, old_owner, new_owner):

        conn = self.connection

        if not new_owner and (old_owner, conn) in Object.senders:
            Object.senders.remove((old_owner, conn))
            Object.connections_senders[conn].remove(old_owner)

            if len(Object.connections_senders[conn]) == 0:
                Object.connections_smobjs[conn].remove()
                del Object.connections_senders[conn]
                del Object.connections_smobjs[conn]

            if not self.persistent and len(Object.senders) == 0 and \
                    Object.current_source is None:
                quit_cb()

    def timeout_restart(self, duration=None):
        if not duration:
            duration = self.__class__.default_duration
        if not Object.duration or duration > Object.duration:
            Object.duration = duration
        if not self.persistent or len(Object.senders) == 0:
            if Object.current_source:
                GLib.source_remove(Object.current_source)
            Object.current_source = \
                GLib.timeout_add(Object.duration * 1000,
                                 self._timeout_cb)

    def sender_seen(self, sender):
        if (sender, self.connection) not in Object.senders:
            Object.senders.add((sender, self.connection))
            if self.connection not in Object.connections_senders:
                Object.connections_senders[self.connection] = set()
                Object.connections_smobjs[self.connection] = \
                    self.connection.add_signal_receiver(
                        handler_function=self._name_owner_changed,
                        signal_name='NameOwnerChanged',
                        dbus_interface='org.freedesktop.DBus',
                        arg1=sender)
            Object.connections_senders[self.connection].add(sender)
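
# Illustrative sketch of a D-Bus activated service object (not part of the
# original module; the interface name and PolicyKit action ids are made up).
# require_auth() must be applied on top of dbus.service.method():
#
#     import dbus.service
#     from slip.dbus import polkit, service
#
#     class ExampleObject(service.Object):
#         default_polkit_auth_required = "org.example.read"
#
#         @polkit.require_auth("org.example.write")
#         @dbus.service.method("org.example.Interface", in_signature="s")
#         def Write(self, text):
#             pass
#
#     # set_mainloop() tells the idle-timeout handling which main loop to quit
#     # once the service has no callers left, e.g. the daemon's GLib.MainLoop().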
[site-packages/slip/dbus/__pycache__/*.pyc omitted: compiled CPython 3.6
bytecode (introspection, proxies, polkit, service, __init__, bus, constants,
mainloop; .cpython-36.pyc and .cpython-36.opt-1.pyc variants) duplicating the
.py sources in this directory -- the binary content is not reproducible as
text]
site-packages/slip/dbus/polkit.py000064400000022111147511334620013047 0ustar00# -*- coding: utf-8 -*-

# slip.dbus.polkit -- convenience decorators and functions for using PolicyKit
# with dbus services and clients
#
# Copyright © 2008, 2009, 2012, 2013, 2015 Red Hat, Inc.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program.  If not, see <http://www.gnu.org/licenses/>.
#
# Authors:
# Nils Philippsen <nils@redhat.com>

"""This module contains convenience decorators and functions for using
PolicyKit with dbus services and clients."""

from __future__ import absolute_import

import collections
import dbus
from decorator import decorator
from functools import reduce

from .constants import method_call_no_timeout

__all__ = ["require_auth", "enable_proxy", "AUTHFAIL_DONTCATCH",
           "NotAuthorizedException", "AreAuthorizationsObtainable",
           "IsSystemBusNameAuthorizedAsync"]


def require_auth(polkit_auth):
    """Decorator for DBus service methods.

    Specify that a user needs a specific PolicyKit authorization `polkit_auth´
    to execute it."""

    def require_auth_decorator(method):
        assert hasattr(method, "_dbus_is_method")

        setattr(method, "_slip_polkit_auth_required", polkit_auth)
        return method

    return require_auth_decorator


AUTH_EXC_PREFIX = \
    "org.fedoraproject.slip.dbus.service.PolKit.NotAuthorizedException."


class AUTHFAIL_DONTCATCH(object):
    pass


def enable_proxy(
    func=None, authfail_result=AUTHFAIL_DONTCATCH, authfail_exception=None,
    authfail_callback=None):

    """Decorator for DBus proxy methods.

    Lets you (optionally) specify either a result value or an exception type
    and a callback which is returned, thrown or called respectively if a
    PolicyKit authorization doesn't exist or can't be obtained in the DBus
    mechanism, i.e. an appropriate DBus exception is thrown.

    An exception constructor may and a callback must accept an `action_id´
    parameter which will be set to the id of the PolicyKit action for which
    authorization could not be obtained.

    Examples:

    1) Return `False´ in the event of an authorization problem, and call
    `error_handler´:

        def error_handler(action_id=None):
            print "Authorization problem:", action_id

        class MyProxy(object):
            @polkit.enable_proxy(authfail_result=False,
                                 authfail_callback=error_handler)
            def some_method(self, ...):
                ...

    2) Throw a `MyAuthError´ instance in the event of an authorization problem:

        class MyAuthError(Exception):
            def __init__(self, *args, **kwargs):
                action_id = kwargs.pop("action_id")
                super(MyAuthError, self).__init__(*args, **kwargs)
                self.action_id = action_id

        class MyProxy(object):
            @polkit.enable_proxy(authfail_exception=MyAuthError)
            def some_method(self, ...):
                ..."""

    assert(func is None or isinstance(func, collections.Callable))

    assert(
        authfail_result in (None, AUTHFAIL_DONTCATCH) or
        authfail_exception is None)
    assert(
        authfail_callback is None or
        isinstance(authfail_callback, collections.Callable))
    assert(
        authfail_exception is None or
        issubclass(authfail_exception, Exception))

    def _enable_proxy(func, *p, **k):

        try:
            return func(*p, **k)
        except dbus.DBusException as e:
            exc_name = e.get_dbus_name()

            if not exc_name.startswith(AUTH_EXC_PREFIX):
                raise

            action_id = exc_name[len(AUTH_EXC_PREFIX):]

            if authfail_callback is not None:
                authfail_callback(action_id=action_id)

            if authfail_exception is not None:
                try:
                    af_exc = authfail_exception(action_id=action_id)
                except:
                    af_exc = authfail_exception()
                raise af_exc

            if authfail_result is AUTHFAIL_DONTCATCH:
                raise

            return authfail_result

    if func is not None:
        return decorator(_enable_proxy, func)
    else:
        def decorate(func):
            return decorator(_enable_proxy, func)
        return decorate


class NotAuthorizedException(dbus.DBusException):

    """Exception which a DBus service method throws if an authorization
    required for executing it can't be obtained."""

    _dbus_error_name = \
        "org.fedoraproject.slip.dbus.service.PolKit.NotAuthorizedException"

    def __init__(self, action_id, *p, **k):

        self._dbus_error_name = self.__class__._dbus_error_name + "." +\
            action_id
        super(NotAuthorizedException, self).__init__(*p, **k)


class PolKit(object):

    """Convenience wrapper around polkit."""

    _dbus_name = 'org.freedesktop.PolicyKit1'
    _dbus_path = '/org/freedesktop/PolicyKit1/Authority'
    _dbus_interface = 'org.freedesktop.PolicyKit1.Authority'

    __interface = None
    __bus = None
    __bus_name = None
    __signal_receiver = None

    @classmethod
    def _on_name_owner_changed(cls, name, old_owner, new_owner):
        if name == cls._dbus_name and PolKit.__bus:
            PolKit.__bus.remove_signal_receiver(PolKit.__signal_receiver)
            PolKit.__bus = None
            PolKit.__signal_receiver = None
            PolKit.__interface = None

    @property
    def _bus(self):
        if not PolKit.__bus:
            PolKit.__bus = dbus.SystemBus()
            PolKit.__signal_receiver = PolKit.__bus.add_signal_receiver(
                handler_function=self._on_name_owner_changed,
                signal_name='NameOwnerChanged',
                dbus_interface='org.freedesktop.DBus',
                arg0=self._dbus_name)
        return PolKit.__bus

    @property
    def _bus_name(self):
        if not PolKit.__bus_name:
            PolKit.__bus_name = self._bus.get_unique_name()
        return PolKit.__bus_name

    @property
    def _interface(self):
        if not PolKit.__interface:
            try:
                PolKit.__interface = dbus.Interface(self._bus.get_object(
                    self._dbus_name, self._dbus_path),
                    self._dbus_interface)
            except dbus.DBusException:
                pass
        return PolKit.__interface

    @property
    def _polkit_present(self):
        return bool(self._interface)

    def __dbus_system_bus_name_uid(self, system_bus_name):
        bus_object = self._bus.get_object(
            'org.freedesktop.DBus', '/org/freedesktop/DBus')
        bus_interface = dbus.Interface(bus_object, 'org.freedesktop.DBus')
        try:
            uid = bus_interface.GetConnectionUnixUser(system_bus_name)
        except:
            uid = None
        return uid

    def __authorization_is_obtainable(self, authorization):
        if not self._polkit_present:
            return True

        (is_authorized, is_challenge, details) = \
            self._interface.CheckAuthorization(
                ("system-bus-name", {"name": self._bus_name}),
                authorization, {}, 0, "")

        return is_authorized or is_challenge

    def AreAuthorizationsObtainable(self, authorizations):
        if not self._polkit_present:
            return True

        if not isinstance(authorizations, (tuple, list, set)):
            authorizations = (authorizations,)

        obtainable = \
            reduce(
                lambda x, y: x and self.__authorization_is_obtainable(y),
                authorizations, True)

        return obtainable

    def IsSystemBusNameAuthorizedAsync(
        self, system_bus_name, action_id, reply_handler, error_handler,
        challenge=True, details={}):

        if not self._polkit_present:
            return reply_handler(action_id is None or
                self.__dbus_system_bus_name_uid(system_bus_name) == 0)

        flags = 0
        if challenge:
            flags |= 0x1

        def reply_cb(args):
            (is_authorized, is_challenge, details) = args
            reply_handler(is_authorized)

        self._interface.CheckAuthorization(
            ("system-bus-name", {"name": system_bus_name}),
            action_id, details, flags, "",
            reply_handler=reply_cb, error_handler=error_handler,
            timeout=method_call_no_timeout)


__polkit = PolKit()


def AreAuthorizationsObtainable(authorizations):
    return __polkit.AreAuthorizationsObtainable(authorizations)


def IsSystemBusNameAuthorizedAsync(
    system_bus_name, action_id, reply_handler, error_handler, challenge=True,
    details={}):

    return __polkit.IsSystemBusNameAuthorizedAsync(
        system_bus_name, action_id, reply_handler, error_handler, challenge,
        details)
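
# Illustrative client-side sketch (not part of the original module; the service
# name, object path and interface are made up). enable_proxy() turns the
# "not authorized" D-Bus error raised by the service into a plain False here:
#
#     import slip.dbus
#     from slip.dbus import polkit
#
#     class ExampleClient(object):
#         def __init__(self):
#             bus = slip.dbus.SystemBus()
#             self._obj = bus.get_object("org.example.Service", "/org/example")
#
#         @polkit.enable_proxy(authfail_result=False)
#         def write(self, text):
#             return self._obj.Write(
#                 text, dbus_interface="org.example.Interface")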
[site-packages/slip/util/__pycache__/*.pyc omitted: compiled CPython 3.6
bytecode (files.cpython-36.opt-1.pyc, hookable.cpython-36.pyc) -- the binary
content is not reproducible as text]
�
methodwrapper9sz/HookableType.wrap_method.<locals>.methodwrapper)�getattr�__name__)rrrrr	)rr
r5s
zHookableType.wrap_methodN)r�
__module__�__qualname__r�classmethodrr	r	r	r
r src@s6eZdZddd�Zdd�Zdd�Zdd	�Zd
d�ZdS)
�
_HookEntryNcCs�t|tj�st�t|t�st�xFt|�D]:\}}yt|�Wq(tk
r`td||f��Yq(Xq(WxF|j�D]:\}}yt|�Wqptk
r�td||f��YqpXqpWt|t	�s�t	|�}||_
||_||_||_
d|_dS)Nz*Positional argument %d is not hashable: %rz'Keyword argument %r is not hashable: %r)�
isinstance�collections�Callable�AssertionErrorr�	enumerate�hashr
�items�tuple�_HookEntry__hook�_HookEntry__args�_HookEntry__kwargs�_HookEntry__hookable�_HookEntry__hash)r�hook�args�kwargs�hookable�nrrr	r	r
�__init__Ds.
z_HookEntry.__init__cCs$|j|jko"|j|jko"|j|jkS)N)r+r,r-)r�objr	r	r
�__cmp__csz_HookEntry.__cmp__cCs|js|j�|_|jS)N)r/�
_compute_hash)rr	r	r
�__hash__is
z_HookEntry.__hash__cCs>t|j�}t|�t|j�A}t|�ttt|jj����A}|S)N)r(r+r,r*�sortedr-r))r�	hashvaluer	r	r
r8ns

z_HookEntry._compute_hashcCs4|jr |j|jf|j�|j�n|j|j|j�dS)N)r.r+r,r-)rr	r	r
�runusz_HookEntry.run)N)rrr r5r7r9r8r<r	r	r	r
r"Bs

r"c@s�eZdZdZedd��Zdd�Zdd�Zeee�Zdd	�Z	d
d�Z
ee	e
�Zdd
�Zdd�Z
dd�Zdd�Zdd�Zdd�Zdd�ZdS)rz2An object which calls registered hooks on changes.cOst|d�st�|_|jS)N�__real_hooks__)�hasattr�setr=)rrrr	r	r
�	__hooks__�s
zHookable.__hooks__cCst|d�sd|_|jS)N�__hooks_enabled__T)r>rA)rr	r	r
�_get_hooks_enabled�s
zHookable._get_hooks_enabledcCs
||_dS)N)rA)rZenabledr	r	r
�_set_hooks_enabled�szHookable._set_hooks_enabledcCst|d�sd|_|jS)N�__hooks_frozen__F)r>rD)rr	r	r
�_get_hooks_frozen�s
zHookable._get_hooks_frozencCsB||jkrdS||_|r"t�|_nx|jD]}|j�q*W|`dS)N)�hooks_frozenrDr?�__hooks_frozen_entries__r<)rZfreeze�	hookentryr	r	r
�_set_hooks_frozen�s

zHookable._set_hooks_frozencCs
d|_dS)NT)rF)rr	r	r
�freeze_hooks�szHookable.freeze_hookscCs
d|_dS)NF)rF)rr	r	r
�
thaw_hooks�szHookable.thaw_hookscOs|j|df|�|�dS)N)�_Hookable__add_hook)rr0r1r2r	r	r
�add_hook�szHookable.add_hookcOs|j||f|�|�dS)N)rL)rr0r1r2r	r	r
�add_hook_hookable�szHookable.add_hook_hookablecOs>t|tj�st�t|t�st�t||||d�}|jj|�dS)N)r3)r#r$r%r&rr"r@�add)rr0Z	_hookabler1r2rHr	r	r
Z
__add_hook�szHookable.__add_hookcOs|jjt|||��dS)N)r@�remover")rr0r1r2r	r	r
�remove_hook�szHookable.remove_hookcCs8|jr4|js&x&|jD]}|j�qWn|jj|j�dS)N)�
hooks_enabledrFr@r<rG�update)rrHr	r	r
r�s
zHookable._run_hooksN)rrr �__doc__�propertyr@rBrCrRrErIrFrJrKrMrNrLrQrr	r	r	r
r|s

c	@seZdZdZdZdd�Zd
S)rz5A set object which calls registered hooks on changes.rO�clear�difference_update�discard�intersection_update�poprP�symmetric_difference_updaterScCstj|�}t�|_|S)N)r?�copyr=)rr6r	r	r
r\�s
zHookableSet.copyN)	rOrVrWrXrYrZrPr[rS)rrr rTrr\r	r	r	r
r�s)rTr$Zsixr�__all__rr�objectr"rr?rr	r	r	r
�<module>s":Gsite-packages/slip/util/__pycache__/files.cpython-36.pyc000064400000011525147511334620017162 0ustar003

yS5c��@s�dZddlmZdee�kr eZdddddgZdd	lZdd	l	Z	dd	l
Z
dd	lZdd	lZd
Z
dd�Zgfd
d�Zdd�Zddd�Zddd�Zddd�Zddd�Zd	S)z=This module contains helper functions for dealing with files.�)�absolute_import�xrange�
issamefile�linkfile�copyfile�linkorcopyfile�overwrite_safelyNicCs"tj|�}tj|�}tjj||�S)N)�os�stat�path�samestat)�path1�path2�s1�s2�r�/usr/lib/python3.6/files.py�_issamefile+s

rcCs0|dkrt}y
t||�S|k
r*dSXdS)zECheck whether two paths point to the same file (i.e. are hardlinked).TFN)�	Exceptionr)r
r�catch_stat_exceptionsrrrr2s
cCs�t||td�rdStjj|�}tjj|�}tjj|�}d}xpttj	�D]b}tj
|tj|d�}ytj||�Wn2tk
r�}z|j
t
jkr�n�WYdd}~XqFXd}PqFW|r�tj||�dS)zUHardlink srcpath to dstpath.

    Attempt to atomically replace dstpath if it exists.)rNF)�prefix�dirT)r�OSErrorr	r�abspath�dirname�basename�range�tempfile�TMP_MAX�mktemp�extsep�link�errno�EEXIST�rename)�srcpath�dstpath�dstdname�dstbnameZ
hardlinked�attempt�_dsttmp�errrr>s$Tc
Cs8t||td�rdStjj|�}tjj|�}tjj|�}t|d�}tj	|tjj
|dd�}tj|�}|r�ytj|�}Wntk
r�YnXtj|j
�tj|j��d}	xP|	dkr�|jt�}	y|j|	�Wq�|j�|j�tj|j��Yq�Xq�W|j�|j�tj|j|�|�r4tj�dk�r4tj|�dS)z�Copy srcpath to dstpath.

    Abort operation if e.g. not enough space is available.  Attempt to
    atomically replace dstpath if it exists.)rN�rbF)rr�delete�r)rrr	rrrr�openrZNamedTemporaryFiler r
�fchmod�fileno�S_IMODE�st_mode�read�	BLOCKSIZE�write�close�unlink�namer$�selinux�is_selinux_enabled�
restorecon)
r%r&�copy_mode_from_dst�run_restoreconr'r(ZsrcfileZ
dsttmpfile�s�datarrrr_s<



cCs^yt||�dStk
rJ}z |jtjtjtjfkr:�nWYdd}~XnXt||||�dS)ztFirst attempt to hardlink srcpath to dstpath, if hardlinking isn't
    possible, attempt copying srcpath to dstpath.N)rrr"ZEMLINKZEPERMZEXDEVr)r%r&r=r>r+rrrr�s
Fc
Cs�tjj|�}tjj|�}d}d}|r6tj�dkr6d}n^y&tj|�\}}|dkrZtd|��Wn6tk
r�}	z|	j	t	j
kr�d}n�WYdd}	~	XnX|s�tj||�|r�tj|�n�d}
xtt
tj�D]f}tj|tj|d�}ytj||�Wn6tk
�r"}	z|	j	t	jk�rwĂWYdd}	~	Xq�X|}
Pq�W|
dk�rDtt	jd��|�r^|�r^tj|
|�ytj|
|�Wntj|
��YnX|�r�tj|�dS)zpCreate a symlink, optionally replacing dstpath atomically, optionally
    setting or preserving SELinux context.FNrzgetfilecon(%r) failedT)rrz/No suitable temporary symlink could be created.)r	rrrr:r;Zlgetfilecon�RuntimeErrorrr"�ENOENT�symlinkr<rrrrr r#�IOErrorZlsetfileconr$�remove)
r%r&�force�preserve_contextr'r(r>�ctx�retr+Zdsttmpr)r*rrr�symlink_atomically�sV

rJcCs~tjj|�}tjj|�}tjj|�}d}d}d}	tjj|�}
|rPtj�dkrPd}z�tj	|tjj
|d�\}}	|
r�tj|�}|r�tj||j
|j�|r�tj|tj|j��|r�tj|�\}}
|dkr�td|��tj|d�}d}|j|�|j�d}tj|	|�|�r$|
�rtj||
�n
tj|�Wd|�r8|j�n|�rHtj|�|	�rxtjj|	��rxytj|	�WnYnXXdS)z�Safely overwrite a file by creating a temporary file in the same
    directory, writing it, moving it over the original file, eventually
    preserving file mode, SELinux context and ownership.NrF)rrzgetfilecon(%r) failed�w)r	r�realpathrr�existsr:r;rZmkstempr r
�fchown�st_uid�st_gidr0r2r3Z
getfileconrA�fdopenr6r7r$Z
setfileconr<�isfiler8)rZcontentZ
preserve_moderGZpreserve_ownershipZdir_�base�fd�fZtmpnamerMr?rIrHrrrr�sR



)TT)TT)FT)TTT)�__doc__Z
__future__rr�__builtins__rr�__all__r	r:rr"r
r5rrrrrrJrrrrr�<module>s&!
5

?site-packages/slip/util/__pycache__/__init__.cpython-36.opt-1.pyc000064400000000345147511334620020554 0ustar003

yS5cl�@s(ddlmZddlmZddlmZdS)�)�absolute_import�)�hookable)�filesN)Z
__future__r�rr�rr�/usr/lib/python3.6/__init__.py�<module>ssite-packages/slip/util/__pycache__/hookable.cpython-36.opt-1.pyc000064400000013376147511334620020611 0ustar003

yS5c^�@sldZddlZddlmZddgZGdd�de�ZGdd	�d	e�ZGd
d�deee��Z	Gdd�de
e	�ZdS)z[This module contains variants of certain base types which call registered
hooks on changes.�N)�with_metaclass�Hookable�HookableSetc@s eZdZdd�Zedd��ZdS)�HookableTypec
Cs�d|kr�y|d}WnJtk
r^d}x0dd�|D�D]}|rRtdt|���q8|}q8WYnXx |dD]}tj||�||<qjWtj||||�S)N�_hookable_change_methodsZ_hookable_base_classcss|]}|tkr|VqdS)N)r)�.0�x�r	�/usr/lib/python3.6/hookable.py�	<genexpr>)sz'HookableType.__new__.<locals>.<genexpr>ztoo many base classes: %s)�KeyError�	TypeError�strr�wrap_method�type�__new__)�cls�name�basesZdct�baseZbase_candidate�
methodnamer	r	r
r"szHookableType.__new__cs t||���fdd�}||_|S)Ncs�|f|�|�}|j�|S)N)�
_run_hooks)�self�p�kZretval)�funcr	r
�
methodwrapper9sz/HookableType.wrap_method.<locals>.methodwrapper)�getattr�__name__)rrrrr	)rr
r5s
zHookableType.wrap_methodN)r�
__module__�__qualname__r�classmethodrr	r	r	r
r src@s6eZdZddd�Zdd�Zdd�Zdd	�Zd
d�ZdS)
�
_HookEntryNcCs�xFt|�D]:\}}yt|�Wq
tk
rBtd||f��Yq
Xq
WxF|j�D]:\}}yt|�WqRtk
r�td||f��YqRXqRWt|t�s�t|�}||_||_||_||_	d|_
dS)Nz*Positional argument %d is not hashable: %rz'Keyword argument %r is not hashable: %r)�	enumerate�hashr
�items�
isinstance�tuple�_HookEntry__hook�_HookEntry__args�_HookEntry__kwargs�_HookEntry__hookable�_HookEntry__hash)r�hook�args�kwargs�hookable�nrrr	r	r
�__init__Ds*
z_HookEntry.__init__cCs$|j|jko"|j|jko"|j|jkS)N)r(r)r*)r�objr	r	r
�__cmp__csz_HookEntry.__cmp__cCs|js|j�|_|jS)N)r,�
_compute_hash)rr	r	r
�__hash__is
z_HookEntry.__hash__cCs>t|j�}t|�t|j�A}t|�ttt|jj����A}|S)N)r$r(r)r'�sortedr*r%)r�	hashvaluer	r	r
r5ns

z_HookEntry._compute_hashcCs4|jr |j|jf|j�|j�n|j|j|j�dS)N)r+r(r)r*)rr	r	r
�runusz_HookEntry.run)N)rrr r2r4r6r5r9r	r	r	r
r"Bs

r"c@s�eZdZdZedd��Zdd�Zdd�Zeee�Zdd	�Z	d
d�Z
ee	e
�Zdd
�Zdd�Z
dd�Zdd�Zdd�Zdd�Zdd�ZdS)rz2An object which calls registered hooks on changes.cOst|d�st�|_|jS)N�__real_hooks__)�hasattr�setr:)rrrr	r	r
�	__hooks__�s
zHookable.__hooks__cCst|d�sd|_|jS)N�__hooks_enabled__T)r;r>)rr	r	r
�_get_hooks_enabled�s
zHookable._get_hooks_enabledcCs
||_dS)N)r>)rZenabledr	r	r
�_set_hooks_enabled�szHookable._set_hooks_enabledcCst|d�sd|_|jS)N�__hooks_frozen__F)r;rA)rr	r	r
�_get_hooks_frozen�s
zHookable._get_hooks_frozencCsB||jkrdS||_|r"t�|_nx|jD]}|j�q*W|`dS)N)�hooks_frozenrAr<�__hooks_frozen_entries__r9)rZfreeze�	hookentryr	r	r
�_set_hooks_frozen�s

zHookable._set_hooks_frozencCs
d|_dS)NT)rC)rr	r	r
�freeze_hooks�szHookable.freeze_hookscCs
d|_dS)NF)rC)rr	r	r
�
thaw_hooks�szHookable.thaw_hookscOs|j|df|�|�dS)N)�_Hookable__add_hook)rr-r.r/r	r	r
�add_hook�szHookable.add_hookcOs|j||f|�|�dS)N)rI)rr-r.r/r	r	r
�add_hook_hookable�szHookable.add_hook_hookablecOs t||||d�}|jj|�dS)N)r0)r"r=�add)rr-Z	_hookabler.r/rEr	r	r
Z
__add_hook�szHookable.__add_hookcOs|jjt|||��dS)N)r=�remover")rr-r.r/r	r	r
�remove_hook�szHookable.remove_hookcCs8|jr4|js&x&|jD]}|j�qWn|jj|j�dS)N)�
hooks_enabledrCr=r9rD�update)rrEr	r	r
r�s
zHookable._run_hooksN)rrr �__doc__�propertyr=r?r@rOrBrFrCrGrHrJrKrIrNrr	r	r	r
r|s

c	@seZdZdZdZdd�Zd
S)rz5A set object which calls registered hooks on changes.rL�clear�difference_update�discard�intersection_update�poprM�symmetric_difference_updaterPcCstj|�}t�|_|S)N)r<�copyr:)rr3r	r	r
rY�s
zHookableSet.copyN)	rLrSrTrUrVrWrMrXrP)rrr rQrrYr	r	r	r
r�s)rQ�collectionsZsixr�__all__rr�objectr"rr<rr	r	r	r
�<module>s":Gsite-packages/slip/util/__pycache__/__init__.cpython-36.pyc000064400000000345147511334620017615 0ustar003

yS5cl�@s(ddlmZddlmZddlmZdS)�)�absolute_import�)�hookable)�filesN)Z
__future__r�rr�rr�/usr/lib/python3.6/__init__.py�<module>ssite-packages/slip/util/hookable.py000064400000014136147511334620013361 0ustar00# -*- coding: utf-8 -*-

# slip.util.hookable -- run hooks on changes in objects
#
# Copyright © 2008 Red Hat, Inc.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program.  If not, see <http://www.gnu.org/licenses/>.
#
# Authors:
# Nils Philippsen <nils@redhat.com>

"""This module contains variants of certain base types which call registered
hooks on changes."""

import collections
from six import with_metaclass

__all__ = ["Hookable", "HookableSet"]


class HookableType(type):

    def __new__(cls, name, bases, dct):

        if '_hookable_change_methods' in dct:
            try:
                base = dct["_hookable_base_class"]
            except KeyError:
                base = None
                for base_candidate in (x for x in bases if x != Hookable):
                    if base:
                        raise TypeError(
                            "too many base classes: %s" % str(bases))
                    else:
                        base = base_candidate

            for methodname in dct["_hookable_change_methods"]:
                dct[methodname] = HookableType.wrap_method(base, methodname)

        return type.__new__(cls, name, bases, dct)

    @classmethod
    def wrap_method(cls, base, methodname):
        func = getattr(base, methodname)

        def methodwrapper(self, *p, **k):
            retval = func(self, *p, **k)
            self._run_hooks()
            return retval

        methodwrapper.__name__ = methodname
        return methodwrapper


class _HookEntry(object):

    def __init__(self, hook, args, kwargs, hookable=None):

        assert(isinstance(hook, collections.Callable))
        assert(isinstance(hookable, Hookable))

        for n, x in enumerate(args):
            try:
                hash(x)
            except TypeError:
                raise TypeError(
                        "Positional argument %d is not hashable: %r" %
                        (n, x))

        for k, x in kwargs.items():
            try:
                hash(x)
            except TypeError:
                raise TypeError(
                        "Keyword argument %r is not hashable: %r" %
                        (k, x))

        if not isinstance(args, tuple):
            args = tuple(args)

        self.__hook = hook
        self.__args = args
        self.__kwargs = kwargs
        self.__hookable = hookable

        self.__hash = None

    def __eq__(self, obj):
        # Python 3 ignores __cmp__; equality must be provided via __eq__ so
        # that equivalent entries compare equal inside the hook set.
        return (
            self.__hook == obj.__hook and
            self.__args == obj.__args and
            self.__kwargs == obj.__kwargs)

    def __hash__(self):
        if not self.__hash:
            self.__hash = self._compute_hash()
        return self.__hash

    def _compute_hash(self):
        hashvalue = hash(self.__hook)
        hashvalue = hash(hashvalue) ^ hash(self.__args)
        hashvalue = hash(hashvalue) ^ hash(
                tuple(sorted(self.__kwargs.items())))
        return hashvalue

    def run(self):
        if self.__hookable:
            self.__hook(self.__hookable, *self.__args, **self.__kwargs)
        else:
            self.__hook(*self.__args, **self.__kwargs)


class Hookable(with_metaclass(HookableType, object)):

    """An object which calls registered hooks on changes."""

    @property
    def __hooks__(self, *p, **k):
        if not hasattr(self, "__real_hooks__"):
            self.__real_hooks__ = set()
        return self.__real_hooks__

    def _get_hooks_enabled(self):
        if not hasattr(self, "__hooks_enabled__"):
            self.__hooks_enabled__ = True
        return self.__hooks_enabled__

    def _set_hooks_enabled(self, enabled):
        self.__hooks_enabled__ = enabled

    hooks_enabled = property(_get_hooks_enabled, _set_hooks_enabled)

    def _get_hooks_frozen(self):
        if not hasattr(self, "__hooks_frozen__"):
            self.__hooks_frozen__ = False
        return self.__hooks_frozen__

    def _set_hooks_frozen(self, freeze):
        if freeze == self.hooks_frozen:
            return

        self.__hooks_frozen__ = freeze

        if freeze:
            self.__hooks_frozen_entries__ = set()
        else:
            for hookentry in self.__hooks_frozen_entries__:
                hookentry.run()
            del self.__hooks_frozen_entries__

    hooks_frozen = property(_get_hooks_frozen, _set_hooks_frozen)

    def freeze_hooks(self):
        self.hooks_frozen = True

    def thaw_hooks(self):
        self.hooks_frozen = False

    def add_hook(self, hook, *args, **kwargs):
        self.__add_hook(hook, None, *args, **kwargs)

    def add_hook_hookable(self, hook, *args, **kwargs):
        self.__add_hook(hook, self, *args, **kwargs)

    def __add_hook(self, hook, _hookable, *args, **kwargs):
        assert isinstance(hook, collections.Callable)
        assert isinstance(_hookable, Hookable)
        hookentry = _HookEntry(hook, args, kwargs, hookable=_hookable)
        self.__hooks__.add(hookentry)

    def remove_hook(self, hook, *args, **kwargs):

        self.__hooks__.remove(_HookEntry(hook, args, kwargs))

    def _run_hooks(self):
        if self.hooks_enabled:
            if not self.hooks_frozen:
                for hookentry in self.__hooks__:
                    hookentry.run()
            else:
                self.__hooks_frozen_entries__.update(self.__hooks__)


class HookableSet(set, Hookable):

    """A set object which calls registered hooks on changes."""

    _hookable_change_methods = (
        "add", "clear", "difference_update", "discard", "intersection_update",
        "pop", "remove", "symmetric_difference_update", "update")

    def copy(self):
        obj = set.copy(self)
        obj.__real_hooks__ = set()
        return obj
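
# A minimal usage sketch: hooks registered with add_hook_hookable() receive the
# set itself and run after every mutating method listed in
# _hookable_change_methods.  The tag names below are arbitrary placeholders.
if __name__ == "__main__":
    def report(changed_set):
        print("set changed, now:", sorted(changed_set))

    tags = HookableSet()
    tags.add_hook_hookable(report)

    tags.add("audio")      # report() runs here
    tags.add("video")      # and here
    tags.discard("audio")  # and here

    # Freeze hooks around a bulk update; the queued hook runs once on thaw.
    tags.freeze_hooks()
    tags.update({"input", "block"})
    tags.thaw_hooks()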
site-packages/slip/util/__init__.py000064400000000154147511334620013327 0ustar00# -*- coding: utf-8 -*-

from __future__ import absolute_import

from . import hookable
from . import files
site-packages/slip/util/files.py000064400000017257147511334620012706 0ustar00# -*- coding: utf-8 -*-

# slip.util.files -- file helper functions
#
# Copyright © 2009, 2010, 2012, 2015 Red Hat, Inc.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program.  If not, see <http://www.gnu.org/licenses/>.
#
# Authors:
# Nils Philippsen <nils@redhat.com>

"""This module contains helper functions for dealing with files."""

from __future__ import absolute_import

# ensure range() returns a generator
if 'xrange' in dir(__builtins__):
    range = xrange

__all__ = ["issamefile", "linkfile", "copyfile", "linkorcopyfile",
           "overwrite_safely"]

import os
import selinux
import tempfile
import errno
import stat

BLOCKSIZE = 1024


def _issamefile(path1, path2):
    s1 = os.stat(path1)
    s2 = os.stat(path2)

    return os.path.samestat(s1, s2)


def issamefile(path1, path2, catch_stat_exceptions=[]):
    """Check whether two paths point to the same file (i.e. are hardlinked)."""

    if catch_stat_exceptions is True:
        catch_stat_exceptions = Exception

    try:
        return _issamefile(path1, path2)
    except catch_stat_exceptions:
        return False


def linkfile(srcpath, dstpath):
    """Hardlink srcpath to dstpath.

    Attempt to atomically replace dstpath if it exists."""

    if issamefile(srcpath, dstpath, catch_stat_exceptions=OSError):
        return

    dstpath = os.path.abspath(dstpath)
    dstdname = os.path.dirname(dstpath)
    dstbname = os.path.basename(dstpath)

    hardlinked = False
    for attempt in range(tempfile.TMP_MAX):
        _dsttmp = tempfile.mktemp(prefix=dstbname + os.extsep, dir=dstdname)
        try:
            os.link(srcpath, _dsttmp)
        except OSError as e:
            if e.errno == errno.EEXIST:

                # try another name

                pass
            else:
                raise
        else:
            hardlinked = True
            break

    if hardlinked:
        os.rename(_dsttmp, dstpath)


def copyfile(srcpath, dstpath, copy_mode_from_dst=True, run_restorecon=True):
    """Copy srcpath to dstpath.

    Abort operation if e.g. not enough space is available.  Attempt to
    atomically replace dstpath if it exists."""

    if issamefile(srcpath, dstpath, catch_stat_exceptions=OSError):
        return

    dstpath = os.path.abspath(dstpath)
    dstdname = os.path.dirname(dstpath)
    dstbname = os.path.basename(dstpath)

    srcfile = open(srcpath, "rb")
    dsttmpfile = tempfile.NamedTemporaryFile(
        prefix=dstbname + os.path.extsep, dir=dstdname, delete=False)

    s = os.stat(srcpath)

    if copy_mode_from_dst:

        # attempt to copy mode from destination file (if it exists,
        # otherwise fall back to copying it from the source file below)

        try:
            s = os.stat(dstpath)
        except OSError:
            pass

    os.fchmod(dsttmpfile.fileno(), stat.S_IMODE(s.st_mode))

    data = None

    while data != b"":  # srcfile is opened in binary mode, so EOF yields b""
        data = srcfile.read(BLOCKSIZE)
        try:
            dsttmpfile.write(data)
        except:
            srcfile.close()
            dsttmpfile.close()
            os.unlink(dsttmpfile.name)
            raise

    srcfile.close()
    dsttmpfile.close()

    os.rename(dsttmpfile.name, dstpath)

    if run_restorecon and selinux.is_selinux_enabled() > 0:
        selinux.restorecon(dstpath)


def linkorcopyfile(
    srcpath, dstpath, copy_mode_from_dst=True, run_restorecon=True):

    """First attempt to hardlink srcpath to dstpath, if hardlinking isn't
    possible, attempt copying srcpath to dstpath."""

    try:
        linkfile(srcpath, dstpath)
        return
    except OSError as e:
        if e.errno not in (errno.EMLINK, errno.EPERM, errno.EXDEV):

            # don't bother copying

            raise
        else:

            # try copying

            pass

    copyfile(srcpath, dstpath, copy_mode_from_dst, run_restorecon)


def symlink_atomically(srcpath, dstpath, force=False, preserve_context=True):
    """Create a symlink, optionally replacing dstpath atomically, optionally
    setting or preserving SELinux context."""

    dstdname = os.path.dirname(dstpath)
    dstbname = os.path.basename(dstpath)

    run_restorecon = False
    ctx = None

    if preserve_context and selinux.is_selinux_enabled() <= 0:
        preserve_context = False
    else:
        try:
            ret, ctx = selinux.lgetfilecon(dstpath)
            if ret < 0:
                raise RuntimeError("getfilecon(%r) failed" % dstpath)
        except OSError as e:
            if e.errno == errno.ENOENT:
                run_restorecon = True
            else:
                raise

    if not force:
        os.symlink(srcpath, dstpath)
        if preserve_context:
            selinux.restorecon(dstpath)
    else:
        dsttmp = None
        for attempt in range(tempfile.TMP_MAX):
            _dsttmp = tempfile.mktemp(
                prefix=dstbname + os.extsep, dir=dstdname)
            try:
                os.symlink(srcpath, _dsttmp)
            except OSError as e:
                if e.errno == errno.EEXIST:
                    # try again
                    continue
                raise
            else:
                dsttmp = _dsttmp
                break

        if dsttmp is None:
            raise IOError(
                errno.EEXIST,
                "No suitable temporary symlink could be created.")

        if preserve_context and not run_restorecon:
            selinux.lsetfilecon(dsttmp, ctx)

        try:
            os.rename(dsttmp, dstpath)
        except:
            # clean up
            os.remove(dsttmp)
            raise

        if run_restorecon:
            selinux.restorecon(dstpath)


def overwrite_safely(
        path, content, preserve_mode=True, preserve_context=True,
        preserve_ownership=True):
    """Safely overwrite a file by creating a temporary file in the same
    directory, writing it, moving it over the original file, optionally
    preserving file mode, SELinux context and ownership."""

    path = os.path.realpath(path)
    dir_ = os.path.dirname(path)
    base = os.path.basename(path)

    fd = None
    f = None
    tmpname = None

    exists = os.path.exists(path)

    if preserve_context and selinux.is_selinux_enabled() <= 0:
        preserve_context = False

    try:
        fd, tmpname = tempfile.mkstemp(prefix=base + os.path.extsep,
                                       dir=dir_)

        if exists:
            s = os.stat(path)

            if preserve_ownership:
                os.fchown(fd, s.st_uid, s.st_gid)

            if preserve_mode:
                os.fchmod(fd, stat.S_IMODE(s.st_mode))

            if preserve_context:
                ret, ctx = selinux.getfilecon(path)
                if ret < 0:
                    raise RuntimeError("getfilecon(%r) failed" % path)

        f = os.fdopen(fd, "w")
        fd = None

        f.write(content)

        f.close()
        f = None

        os.rename(tmpname, path)

        if preserve_context:
            if exists:
                selinux.setfilecon(path, ctx)
            else:
                selinux.restorecon(path)

    finally:
        if f:
            f.close()
        elif fd:
            os.close(fd)
        if tmpname and os.path.isfile(tmpname):
            try:
                os.unlink(tmpname)
            except:
                pass
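
# A minimal usage sketch: os and tempfile are already imported at the top of
# this module; the paths below are illustrative only, and a system with the
# selinux bindings importable is assumed.
if __name__ == "__main__":
    workdir = tempfile.mkdtemp()
    original = os.path.join(workdir, "example.conf")
    backup = os.path.join(workdir, "example.conf.bak")

    # Atomically (re)write a file, preserving mode/context/ownership of any
    # existing version.
    overwrite_safely(original, "key = value\n")

    # Hardlink if possible, otherwise fall back to copying.
    linkorcopyfile(original, backup)

    print(issamefile(original, backup, catch_stat_exceptions=OSError))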
site-packages/slip/__pycache__/__init__.cpython-36.opt-1.pyc  [compiled CPython 3.6 bytecode omitted]
site-packages/slip/__pycache__/__init__.cpython-36.pyc  [compiled CPython 3.6 bytecode omitted]
site-packages/slip/_wrappers/__pycache__/_glib.cpython-36.pyc  [compiled CPython 3.6 bytecode omitted]
site-packages/slip/_wrappers/__pycache__/__init__.cpython-36.pyc  [compiled CPython 3.6 bytecode omitted]
site-packages/slip/_wrappers/__pycache__/_glib.cpython-36.opt-1.pyc  [compiled CPython 3.6 bytecode omitted]
site-packages/slip/_wrappers/__pycache__/__init__.cpython-36.opt-1.pyc  [compiled CPython 3.6 bytecode omitted]
site-packages/slip/_wrappers/_glib.py000064400000003134147511334620013672 0ustar00# -*- coding: utf-8 -*-

# slip._wrappers._glib -- abstract (some) differences between glib and
# gi.repository.GLib
#
# Copyright © 2012, 2015 Red Hat, Inc.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program.  If not, see <http://www.gnu.org/licenses/>.
#
# Authors:
# Nils Philippsen <nils@redhat.com>

"""This module lets some other slip modules cooperate with either the glib
or the gi.repository.GLib modules."""

from __future__ import absolute_import

import sys

__all__ = ['MainLoop', 'source_remove', 'timeout_add']

_self = sys.modules[__name__]

_mod = None

while _mod is None:
    if 'gi.repository.GLib' in sys.modules:
        _mod = sys.modules['gi.repository.GLib']
    elif 'glib' in sys.modules:
        _mod = sys.modules['glib']
    # if not yet imported, try to import glib first, then
    # gi.repository.GLib ...
    if _mod is None:
        try:
            import glib
        except ImportError:
            import gi.repository.GLib
    # ... then repeat.

for what in __all__:
    if what not in dir(_self):
        setattr(_self, what, getattr(_mod, what))
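
# A minimal usage sketch, assuming either gi.repository.GLib or the legacy glib
# bindings are installed: the names re-exported above can then be used without
# caring which backend provides them.
if __name__ == "__main__":
    loop = MainLoop()

    def say_hello():
        print("hello from the GLib main loop")
        loop.quit()
        return False  # do not reschedule this timeout

    timeout_add(100, say_hello)  # interval in milliseconds
    loop.run()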
site-packages/slip/_wrappers/__init__.py000064400000000000147511334620014342 0ustar00site-packages/slip/__init__.py000064400000000000147511334620012340 0ustar00site-packages/pyudev/_ctypeslib/__init__.py000064400000001717147511334620015062 0ustar00# -*- coding: utf-8 -*-
# Copyright (C) 2015 mulhern <amulhern@redhat.com>

# This library is free software; you can redistribute it and/or modify it
# under the terms of the GNU Lesser General Public License as published by the
# Free Software Foundation; either version 2.1 of the License, or (at your
# option) any later version.

# This library is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or
# FITNESS FOR A PARTICULAR PURPOSE.  See the GNU Lesser General Public License
# for more details.

# You should have received a copy of the GNU Lesser General Public License
# along with this library; if not, write to the Free Software Foundation,
# Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110-1301 USA

"""
    pyudev._ctypeslib
    =================

    Wrappers for libraries.

    .. moduleauthor::  mulhern  <amulhern@redhat.com>
"""

from . import libc
from . import libudev
site-packages/pyudev/_ctypeslib/utils.py000064400000004651147511334620014463 0ustar00# -*- coding: utf-8 -*-
# Copyright (C) 2013 Sebastian Wiesner <lunaryorn@gmail.com>

# This library is free software; you can redistribute it and/or modify it
# under the terms of the GNU Lesser General Public License as published by the
# Free Software Foundation; either version 2.1 of the License, or (at your
# option) any later version.

# This library is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or
# FITNESS FOR A PARTICULAR PURPOSE.  See the GNU Lesser General Public License
# for more details.

# You should have received a copy of the GNU Lesser General Public License
# along with this library; if not, write to the Free Software Foundation,
# Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110-1301 USA

"""
    pyudev._ctypeslib.utils
    =======================

    Utilities for loading ctypeslib.

    .. moduleauthor::  Anne Mulhern  <amulhern@redhat.com>
"""

from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
from __future__ import unicode_literals

from ctypes import CDLL
from ctypes.util import find_library


def load_ctypes_library(name, signatures, error_checkers):
    """
    Load library ``name`` and return a :class:`ctypes.CDLL` object for it.

    :param str name: the library name
    :param signatures: signatures of methods
    :type signatures: dict of str * (tuple of (list of type) * type)
    :param error_checkers: error checkers for methods
    :type error_checkers: dict of str * ((int * ptr * arglist) -> int)

    The library has errno handling enabled.
    Important functions are given proper signatures and return types to support
    type checking and argument conversion.

    :returns: a loaded library
    :rtype: ctypes.CDLL
    :raises ImportError: if the library is not found
    """
    library_name = find_library(name)
    if not library_name:
        raise ImportError('No library named %s' % name)
    lib = CDLL(library_name, use_errno=True)
    # Add function signatures
    for funcname, signature in signatures.items():
        function = getattr(lib, funcname, None)
        if function:
            argtypes, restype = signature
            function.argtypes = argtypes
            function.restype = restype
            errorchecker = error_checkers.get(funcname)
            if errorchecker:
                function.errcheck = errorchecker
    return lib
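
# A minimal usage sketch: loading librt and calling clock_gettime() through
# load_ctypes_library().  The signature table below is an illustrative
# assumption for this example and is not part of pyudev itself.
if __name__ == "__main__":
    from ctypes import POINTER, Structure, byref, c_int, c_long

    class timespec(Structure):
        _fields_ = [("tv_sec", c_long), ("tv_nsec", c_long)]

    signatures = {"clock_gettime": ([c_int, POINTER(timespec)], c_int)}

    librt = load_ctypes_library("rt", signatures, {})
    ts = timespec()
    librt.clock_gettime(0, byref(ts))  # 0 == CLOCK_REALTIME
    print(ts.tv_sec, ts.tv_nsec)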
site-packages/pyudev/_ctypeslib/libc.py000064400000002474147511334620014235 0ustar00# -*- coding: utf-8 -*-
# Copyright (C) 2013 Sebastian Wiesner <lunaryorn@gmail.com>

# This library is free software; you can redistribute it and/or modify it
# under the terms of the GNU Lesser General Public License as published by the
# Free Software Foundation; either version 2.1 of the License, or (at your
# option) any later version.

# This library is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or
# FITNESS FOR A PARTICULAR PURPOSE.  See the GNU Lesser General Public License
# for more details.

# You should have received a copy of the GNU Lesser General Public License
# along with this library; if not, write to the Free Software Foundation,
# Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110-1301 USA

"""
    pyudev._ctypeslib.libc
    ======================

    Wrappers for libc.

    .. moduleauthor::  Sebastian Wiesner  <lunaryorn@gmail.com>
"""

from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
from __future__ import unicode_literals

from ctypes import c_int

from ._errorcheckers import check_errno_on_nonzero_return


fd_pair = c_int * 2


SIGNATURES = dict(
   pipe2=([fd_pair, c_int], c_int),
)

ERROR_CHECKERS = dict(
   pipe2=check_errno_on_nonzero_return,
)
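
# A minimal usage sketch, assuming the sibling utils module: the tables above
# are meant to be passed to load_ctypes_library() to obtain a typed libc handle
# with errno-aware error checking.
if __name__ == "__main__":
    import os

    from pyudev._ctypeslib.utils import load_ctypes_library

    libc = load_ctypes_library("c", SIGNATURES, ERROR_CHECKERS)

    fds = fd_pair()
    libc.pipe2(fds, os.O_CLOEXEC)  # a failure would raise via the error checker
    print("pipe fds:", fds[0], fds[1])
    os.close(fds[0])
    os.close(fds[1])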
site-packages/pyudev/_ctypeslib/__pycache__/libudev.cpython-36.opt-1.pyc  [compiled CPython 3.6 bytecode omitted]
site-packages/pyudev/_ctypeslib/__pycache__/libudev.cpython-36.pyc  [compiled CPython 3.6 bytecode omitted]
site-packages/pyudev/_ctypeslib/__pycache__/utils.cpython-36.pyc  [compiled CPython 3.6 bytecode omitted]
site-packages/pyudev/_ctypeslib/__pycache__/libc.cpython-36.opt-1.pyc  [compiled CPython 3.6 bytecode omitted]
site-packages/pyudev/_ctypeslib/__pycache__/__init__.cpython-36.opt-1.pyc  [compiled CPython 3.6 bytecode omitted]
site-packages/pyudev/_ctypeslib/__pycache__/_errorcheckers.cpython-36.opt-1.pyc  [compiled CPython 3.6 bytecode omitted]
site-packages/pyudev/_ctypeslib/__pycache__/_errorcheckers.cpython-36.pyc  [compiled CPython 3.6 bytecode omitted]
site-packages/pyudev/_ctypeslib/__pycache__/__init__.cpython-36.pyc  [compiled CPython 3.6 bytecode omitted]
site-packages/pyudev/_ctypeslib/__pycache__/utils.cpython-36.opt-1.pyc  [compiled CPython 3.6 bytecode omitted]
site-packages/pyudev/_ctypeslib/__pycache__/libc.cpython-36.pyc  [compiled CPython 3.6 bytecode omitted]
site-packages/pyudev/_ctypeslib/_errorcheckers.py000064400000006241147511334620016320 0ustar00# -*- coding: utf-8 -*-
# Copyright (C) 2013 Sebastian Wiesner <lunaryorn@gmail.com>

# This library is free software; you can redistribute it and/or modify it
# under the terms of the GNU Lesser General Public License as published by the
# Free Software Foundation; either version 2.1 of the License, or (at your
# option) any later version.

# This library is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or
# FITNESS FOR A PARTICULAR PURPOSE.  See the GNU Lesser General Public License
# for more details.

# You should have received a copy of the GNU Lesser General Public License
# along with this library; if not, write to the Free Software Foundation,
# Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110-1301 USA

"""
    pyudev._ctypeslib._errorcheckers
    ================================

    Error checkers for ctypes wrappers.
"""

from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
from __future__ import unicode_literals


import os
import errno
from ctypes import get_errno


ERRNO_EXCEPTIONS = {
    errno.ENOMEM: MemoryError,
    errno.EOVERFLOW: OverflowError,
    errno.EINVAL: ValueError
}


def exception_from_errno(errnum):
    """Create an exception from ``errnum``.

    ``errnum`` is an integral error number.

    Return an exception object appropriate to ``errnum``.

    """
    exception = ERRNO_EXCEPTIONS.get(errnum)
    errorstr = os.strerror(errnum)
    if exception is not None:
        return exception(errorstr)
    else:
        return EnvironmentError(errnum, errorstr)


def check_negative_errorcode(result, func, *args):
    """Error checker for funtions, which return negative error codes.

    If ``result`` is smaller than ``0``, it is interpreted as negative error
    code, and an appropriate exception is raised:

    - ``-ENOMEM`` raises a :exc:`~exceptions.MemoryError`
    - ``-EOVERFLOW`` raises a :exc:`~exceptions.OverflowError`
    - all other error codes raise :exc:`~exceptions.EnvironmentError`

    If result is greater or equal to ``0``, it is returned unchanged.

    """
    if result < 0:
        # udev returns the *negative* errno code at this point
        errnum = -result
        raise exception_from_errno(errnum)
    else:
        return result


def check_errno_on_nonzero_return(result, func, *args):
    """Error checker to check the system ``errno`` as returned by
    :func:`ctypes.get_errno()`.

    If ``result`` is not ``0``, an exception according to this errno is raised.
    Otherwise nothing happens.

    """
    if result != 0:
        errnum = get_errno()
        if errnum != 0:
            raise exception_from_errno(errnum)
    return result


def check_errno_on_null_pointer_return(result, func, *args):
    """Error checker to check the system ``errno`` as returned by
    :func:`ctypes.get_errno()`.

    If ``result`` is a null pointer, an exception according to this errno is
    raised.  Otherwise nothing happens.

    """
    # pylint: disable=invalid-name
    if not result:
        errnum = get_errno()
        if errnum != 0:
            raise exception_from_errno(errnum)
    return result
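
# A minimal usage sketch: how the checkers map C-style return codes onto Python
# exceptions.  The direct calls below are for illustration only; in pyudev
# these functions are normally installed as ctypes errcheck callbacks.
if __name__ == "__main__":
    # A negative return value is interpreted as -errno ...
    try:
        check_negative_errorcode(-errno.ENOMEM, None)
    except MemoryError as err:
        print("mapped -ENOMEM to:", repr(err))

    # ... while a non-negative value passes through unchanged.
    print(check_negative_errorcode(42, None))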
site-packages/pyudev/_ctypeslib/libudev.py000064400000025053147511334620014754 0ustar00# -*- coding: utf-8 -*-
# Copyright (C) 2010, 2011, 2012, 2013 Sebastian Wiesner <lunaryorn@gmail.com>

# This library is free software; you can redistribute it and/or modify it
# under the terms of the GNU Lesser General Public License as published by the
# Free Software Foundation; either version 2.1 of the License, or (at your
# option) any later version.

# This library is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or
# FITNESS FOR A PARTICULAR PURPOSE.  See the GNU Lesser General Public License
# for more details.

# You should have received a copy of the GNU Lesser General Public License
# along with this library; if not, write to the Free Software Foundation,
# Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110-1301 USA


"""
    pyudev._ctypeslib.libudev
    =========================

    Wrapper types for libudev, together with the ``SIGNATURES`` and
    ``ERROR_CHECKERS`` tables used to configure the loaded libudev functions.

    .. moduleauthor::  Sebastian Wiesner  <lunaryorn@gmail.com>
"""

from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
from __future__ import unicode_literals

from ctypes import c_char
from ctypes import c_char_p
from ctypes import c_int
from ctypes import c_uint
from ctypes import c_ulonglong
from ctypes import CDLL
from ctypes import Structure
from ctypes import POINTER

from ctypes.util import find_library

from ._errorcheckers import check_errno_on_nonzero_return
from ._errorcheckers import check_errno_on_null_pointer_return
from ._errorcheckers import check_negative_errorcode


class udev(Structure): # pylint: disable=invalid-name
    """
    Dummy for ``udev`` structure.
    """
    # pylint: disable=too-few-public-methods
    pass

udev_p = POINTER(udev) # pylint: disable=invalid-name


class udev_enumerate(Structure): # pylint: disable=invalid-name
    """
    Dummy for ``udev_enumerate`` structure.
    """
    # pylint: disable=too-few-public-methods
    pass

udev_enumerate_p = POINTER(udev_enumerate) # pylint: disable=invalid-name


class udev_list_entry(Structure): # pylint: disable=invalid-name
    """
    Dummy for ``udev_list_entry`` structure.
    """
    # pylint: disable=too-few-public-methods
    pass

udev_list_entry_p = POINTER(udev_list_entry) # pylint: disable=invalid-name


class udev_device(Structure): # pylint: disable=invalid-name
    """
    Dummy for ``udev_device`` structure.
    """
    # pylint: disable=too-few-public-methods
    pass

udev_device_p = POINTER(udev_device) # pylint: disable=invalid-name


class udev_monitor(Structure): # pylint: disable=invalid-name
    """
    Dummy for ``udev_monitor`` structure.
    """
    # pylint: disable=too-few-public-methods
    pass

udev_monitor_p = POINTER(udev_monitor) # pylint: disable=invalid-name

class udev_hwdb(Structure): # pylint: disable=invalid-name
    """
    Dummy for ``udev_hwdb`` structure.
    """
    # pylint: disable=too-few-public-methods
    pass

udev_hwdb_p = POINTER(udev_hwdb) # pylint: disable=invalid-name


dev_t = c_ulonglong # pylint: disable=invalid-name


SIGNATURES = dict(
   # context
   udev_new=([], udev_p),
   udev_unref=([udev_p], None),
   udev_ref=([udev_p], udev_p),
   udev_get_sys_path=([udev_p], c_char_p),
   udev_get_dev_path=([udev_p], c_char_p),
   udev_get_run_path=([udev_p], c_char_p),
   udev_get_log_priority=([udev_p], c_int),
   udev_set_log_priority=([udev_p, c_int], None),
   udev_enumerate_new=([udev_p], udev_enumerate_p),
   udev_enumerate_ref=([udev_enumerate_p], udev_enumerate_p),
   udev_enumerate_unref=([udev_enumerate_p], None),
   udev_enumerate_add_match_subsystem=([udev_enumerate_p, c_char_p], c_int),
   udev_enumerate_add_nomatch_subsystem=([udev_enumerate_p, c_char_p], c_int),
   udev_enumerate_add_match_property=(
      [udev_enumerate_p, c_char_p, c_char_p],
      c_int
   ),
   udev_enumerate_add_match_sysattr=(
      [udev_enumerate_p, c_char_p, c_char_p],
      c_int
   ),
   udev_enumerate_add_nomatch_sysattr=(
      [udev_enumerate_p, c_char_p, c_char_p],
      c_int
   ),
   udev_enumerate_add_match_tag=([udev_enumerate_p, c_char_p], c_int),
   udev_enumerate_add_match_sysname=([udev_enumerate_p, c_char_p], c_int),
   udev_enumerate_add_match_parent=([udev_enumerate_p, udev_device_p], c_int),
   udev_enumerate_add_match_is_initialized=([udev_enumerate_p], c_int),
   udev_enumerate_scan_devices=([udev_enumerate_p], c_int),
   udev_enumerate_get_list_entry=([udev_enumerate_p], udev_list_entry_p),
   # list entries
   udev_list_entry_get_next=([udev_list_entry_p], udev_list_entry_p),
   udev_list_entry_get_name=([udev_list_entry_p], c_char_p),
   udev_list_entry_get_value=([udev_list_entry_p], c_char_p),
   # devices
   udev_device_ref=([udev_device_p], udev_device_p),
   udev_device_unref=([udev_device_p], None),
   udev_device_new_from_syspath=([udev_p, c_char_p], udev_device_p),
   udev_device_new_from_subsystem_sysname=(
      [udev_p, c_char_p, c_char_p],
      udev_device_p
   ),
   udev_device_new_from_devnum=([udev_p, c_char, dev_t], udev_device_p),
   udev_device_new_from_device_id=([udev_p, c_char_p], udev_device_p),
   udev_device_new_from_environment=([udev_p], udev_device_p),
   udev_device_get_parent=([udev_device_p], udev_device_p),
   udev_device_get_parent_with_subsystem_devtype=(
      [udev_device_p, c_char_p, c_char_p],
      udev_device_p
   ),
   udev_device_get_devpath=([udev_device_p], c_char_p),
   udev_device_get_subsystem=([udev_device_p], c_char_p),
   udev_device_get_syspath=([udev_device_p], c_char_p),
   udev_device_get_sysnum=([udev_device_p], c_char_p),
   udev_device_get_sysname=([udev_device_p], c_char_p),
   udev_device_get_driver=([udev_device_p], c_char_p),
   udev_device_get_devtype=([udev_device_p], c_char_p),
   udev_device_get_devnode=([udev_device_p], c_char_p),
   udev_device_get_property_value=([udev_device_p, c_char_p], c_char_p),
   udev_device_get_sysattr_value=([udev_device_p, c_char_p], c_char_p),
   udev_device_get_devnum=([udev_device_p], dev_t),
   udev_device_get_action=([udev_device_p], c_char_p),
   udev_device_get_seqnum=([udev_device_p], c_ulonglong),
   udev_device_get_is_initialized=([udev_device_p], c_int),
   udev_device_get_usec_since_initialized=([udev_device_p], c_ulonglong),
   udev_device_get_devlinks_list_entry=([udev_device_p], udev_list_entry_p),
   udev_device_get_tags_list_entry=([udev_device_p], udev_list_entry_p),
   udev_device_get_properties_list_entry=([udev_device_p], udev_list_entry_p),
   udev_device_get_sysattr_list_entry=([udev_device_p], udev_list_entry_p),
   udev_device_set_sysattr_value=([udev_device_p, c_char_p, c_char_p], c_int),
   udev_device_has_tag=([udev_device_p, c_char_p], c_int),
   # monitoring
   udev_monitor_ref=([udev_monitor_p], udev_monitor_p),
   udev_monitor_unref=([udev_monitor_p], None),
   udev_monitor_new_from_netlink=([udev_p, c_char_p], udev_monitor_p),
   udev_monitor_enable_receiving=([udev_monitor_p], c_int),
   udev_monitor_set_receive_buffer_size=([udev_monitor_p, c_int], c_int),
   udev_monitor_get_fd=([udev_monitor_p], c_int),
   udev_monitor_receive_device=([udev_monitor_p], udev_device_p),
   udev_monitor_filter_add_match_subsystem_devtype=(
           [udev_monitor_p, c_char_p, c_char_p], c_int),
   udev_monitor_filter_add_match_tag=([udev_monitor_p, c_char_p], c_int),
   udev_monitor_filter_update=([udev_monitor_p], c_int),
   udev_monitor_filter_remove=([udev_monitor_p], c_int),
   # hwdb
   udev_hwdb_ref=([udev_hwdb_p], udev_hwdb_p),
   udev_hwdb_unref=([udev_hwdb_p], None),
   udev_hwdb_new=([udev_p], udev_hwdb_p),
   udev_hwdb_get_properties_list_entry=(
      [udev_hwdb_p, c_char_p, c_uint],
      udev_list_entry_p
   )
)


ERROR_CHECKERS = dict(
   udev_device_get_action=None,
   udev_device_get_devlinks_list_entry=None,
   udev_device_get_devnode=None,
   udev_device_get_devnum=None,
   udev_device_get_devpath=None,
   udev_device_get_devtype=None,
   udev_device_get_driver=None,
   udev_device_get_is_initialized=None,
   udev_device_get_parent=None,
   udev_device_get_parent_with_subsystem_devtype=None,
   udev_device_get_properties_list_entry=None,
   udev_device_get_property_value=None,
   udev_device_get_seqnum=None,
   udev_device_get_subsystem=None,
   udev_device_get_sysattr_list_entry=None,
   udev_device_get_sysattr_value=None,
   udev_device_get_sysname=None,
   udev_device_get_sysnum=None,
   udev_device_get_syspath=None,
   udev_device_get_tags_list_entry=None,
   udev_device_get_usec_since_initialized=None,
   udev_device_has_tag=None,
   udev_device_new_from_device_id=None,
   udev_device_new_from_devnum=None,
   udev_device_new_from_environment=None,
   udev_device_new_from_subsystem_sysname=None,
   udev_device_new_from_syspath=None,
   udev_device_ref=None,
   udev_device_unref=None,
   udev_device_set_sysattr_value=check_negative_errorcode,
   udev_enumerate_add_match_parent=check_negative_errorcode,
   udev_enumerate_add_match_subsystem=check_negative_errorcode,
   udev_enumerate_add_nomatch_subsystem=check_negative_errorcode,
   udev_enumerate_add_match_property=check_negative_errorcode,
   udev_enumerate_add_match_sysattr=check_negative_errorcode,
   udev_enumerate_add_nomatch_sysattr=check_negative_errorcode,
   udev_enumerate_add_match_tag=check_negative_errorcode,
   udev_enumerate_add_match_sysname=check_negative_errorcode,
   udev_enumerate_add_match_is_initialized=check_negative_errorcode,
   udev_enumerate_get_list_entry=None,
   udev_enumerate_new=None,
   udev_enumerate_ref=None,
   udev_enumerate_scan_devices=None,
   udev_enumerate_unref=None,
   udev_get_dev_path=None,
   udev_get_log_priority=None,
   udev_get_run_path=None,
   udev_get_sys_path=None,
   udev_hwdb_get_properties_list_entry=None,
   udev_hwdb_new=None,
   udev_hwdb_ref=None,
   udev_hwdb_unref=None,
   udev_list_entry_get_name=None,
   udev_list_entry_get_next=None,
   udev_list_entry_get_value=None,
   udev_monitor_set_receive_buffer_size=check_errno_on_nonzero_return,
   # The libudev documentation says enable_receiving returns a negative errno,
   # but tests show that this is not reliable, so query the real error code.
   udev_monitor_enable_receiving=check_errno_on_nonzero_return,
   udev_monitor_receive_device=check_errno_on_null_pointer_return,
   udev_monitor_ref=None,
   udev_monitor_filter_add_match_subsystem_devtype=check_negative_errorcode,
   udev_monitor_filter_add_match_tag=check_negative_errorcode,
   udev_monitor_filter_update=check_errno_on_nonzero_return,
   udev_monitor_filter_remove=check_errno_on_nonzero_return,
   udev_monitor_get_fd=None,
   udev_monitor_new_from_netlink=None,
   udev_monitor_unref=None,
   udev_new=None,
   udev_ref=None,
   udev_set_log_priority=None,
   udev_unref=None
)
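

# Illustrative sketch added for clarity -- not part of the original module.
# ``SIGNATURES`` and ``ERROR_CHECKERS`` are plain data; a loader is expected
# to apply them to a ctypes handle roughly as below.  The helper name is an
# assumption for demonstration only, not pyudev's actual loader API.
def _example_load_libudev():
    """Load libudev and apply the tables above (illustrative only)."""
    lib = CDLL(find_library('udev'), use_errno=True)
    for name, (argtypes, restype) in SIGNATURES.items():
        func = getattr(lib, name, None)
        if func is None:
            # The symbol is missing in older libudev releases; skip it.
            continue
        func.argtypes = argtypes
        func.restype = restype
        errcheck = ERROR_CHECKERS.get(name)
        if errcheck is not None:
            func.errcheck = errcheck
    return lib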
site-packages/pyudev/_qt_base.py000064400000015553147511334620012746 0ustar00# -*- coding: utf-8 -*-
# Copyright (C) 2010, 2011, 2012, 2013 Sebastian Wiesner <lunaryorn@gmail.com>

# This library is free software; you can redistribute it and/or modify it
# under the terms of the GNU Lesser General Public License as published by the
# Free Software Foundation; either version 2.1 of the License, or (at your
# option) any later version.

# This library is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or
# FITNESS FOR A PARTICULAR PURPOSE.  See the GNU Lesser General Public License
# for more details.

# You should have received a copy of the GNU Lesser General Public License
# along with this library; if not, write to the Free Software Foundation,
# Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110-1301 USA


"""
    pyudev._qt_base
    ===============

    Base mixin class for Qt4 and Qt5 support.

    .. moduleauthor::  Sebastian Wiesner  <lunaryorn@gmail.com>
"""

from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
from __future__ import unicode_literals

import six

from pyudev.device import Device

class MonitorObserverMixin(object):
    """
    Base mixin for PyQt monitor observers.
    """
    # pylint: disable=too-few-public-methods

    def _setup_notifier(self, monitor, notifier_class):
        self.monitor = monitor
        self.notifier = notifier_class(
            monitor.fileno(), notifier_class.Read, self)
        self.notifier.activated[int].connect(self._process_udev_event)

    @property
    def enabled(self):
        """
        Whether this observer is enabled or not.

        If ``True`` (the default), this observer is enabled and emits events.
        Otherwise it is disabled and does not emit any events.  This merely
        reflects the state of the ``enabled`` property of the underlying
        :attr:`notifier`.

        .. versionadded:: 0.14
        """
        return self.notifier.isEnabled()

    @enabled.setter
    def enabled(self, value):
        self.notifier.setEnabled(value)

    def _process_udev_event(self):
        """
        Attempt to receive a single device event from the monitor, process
        the event and emit corresponding signals.

        Called by ``QSocketNotifier``, if data is available on the udev
        monitoring socket.
        """
        device = self.monitor.poll(timeout=0)
        if device is not None:
            self._emit_event(device)

    def _emit_event(self, device):
        self.deviceEvent.emit(device)
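

# Illustrative sketch added for clarity -- not part of the original module.
# Event delivery of an observer built from :class:`MonitorObserverMixin` can
# be paused and resumed through ``enabled`` without tearing down the monitor.
def _example_pause_observer(observer):
    """Temporarily suppress device event signals (illustrative only)."""
    observer.enabled = False    # the QSocketNotifier stops firing callbacks
    try:
        pass                    # do work without device event signals
    finally:
        observer.enabled = True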


class QUDevMonitorObserverMixin(MonitorObserverMixin):
    """
    Obsolete monitor observer mixin.
    """
    # pylint: disable=too-few-public-methods

    def _setup_notifier(self, monitor, notifier_class):
        MonitorObserverMixin._setup_notifier(self, monitor, notifier_class)
        self._action_signal_map = {
            'add': self.deviceAdded, 'remove': self.deviceRemoved,
            'change': self.deviceChanged, 'move': self.deviceMoved,
        }
        import warnings
        warnings.warn('Will be removed in 1.0. '
                      'Use pyudev.pyqt4.MonitorObserver instead.',
                      DeprecationWarning)

    def _emit_event(self, device):
        self.deviceEvent.emit(device.action, device)
        signal = self._action_signal_map.get(device.action)
        if signal is not None:
            signal.emit(device)

def make_init(qobject, socket_notifier):
    """
    Generates an initializer to observe the given ``monitor``
    (a :class:`~pyudev.Monitor`):

    ``parent`` is the parent :class:`~PyQt{4,5}.QtCore.QObject` of this
    object.  It is passed unchanged to the inherited constructor of
    :class:`~PyQt{4,5}.QtCore.QObject`.
    """

    def __init__(self, monitor, parent=None):
        qobject.__init__(self, parent)
        # pylint: disable=protected-access
        self._setup_notifier(monitor, socket_notifier)

    return __init__


class MonitorObserverGenerator(object):
    """
    Class to generate a MonitorObserver class.
    """
    # pylint: disable=too-few-public-methods

    @staticmethod
    def make_monitor_observer(qobject, signal, socket_notifier):
        """Generates an observer for device events integrating into the
        PyQt{4,5} mainloop.

        This class inherits :class:`~PyQt{4,5}.QtCore.QObject` to turn device
        events into Qt signals:

        >>> from pyudev import Context, Monitor
        >>> from pyudev.pyqt4 import MonitorObserver
        >>> context = Context()
        >>> monitor = Monitor.from_netlink(context)
        >>> monitor.filter_by(subsystem='input')
        >>> observer = MonitorObserver(monitor)
        >>> def device_event(device):
        ...     print('event {0} on device {1}'.format(device.action, device))
        >>> observer.deviceEvent.connect(device_event)
        >>> monitor.start()

        This class is a child of :class:`~{PySide, PyQt{4,5}}.QtCore.QObject`.

        """
        return type(
           str("MonitorObserver"),
           (qobject, MonitorObserverMixin),
           {
              str("__init__") : make_init(qobject, socket_notifier),
              str("deviceEvent") : signal(Device)
           }
        )



class QUDevMonitorObserverGenerator(object):
    """
    Class to generate a MonitorObserver class.
    """
    # pylint: disable=too-few-public-methods

    @staticmethod
    def make_monitor_observer(qobject, signal, socket_notifier):
        """Generates an observer for device events integrating into the
        PyQt{4,5} mainloop.

        This class inherits :class:`~PyQt{4,5}.QtCore.QObject` to turn device
        events into Qt signals:

        >>> from pyudev import Context, Monitor
        >>> from pyudev.pyqt4 import MonitorObserver
        >>> context = Context()
        >>> monitor = Monitor.from_netlink(context)
        >>> monitor.filter_by(subsystem='input')
        >>> observer = MonitorObserver(monitor)
        >>> def device_event(device):
        ...     print('event {0} on device {1}'.format(device.action, device))
        >>> observer.deviceEvent.connect(device_event)
        >>> monitor.start()

        This class is a child of :class:`~{PyQt{4,5}, PySide}.QtCore.QObject`.

        """
        return type(
           str("QUDevMonitorObserver"),
           (qobject, QUDevMonitorObserverMixin),
           {
              str("__init__") : make_init(qobject, socket_notifier),
              #: emitted upon arbitrary device events
              str("deviceEvent") : signal(six.text_type, Device),
              #: emitted if a device was added
              str("deviceAdded") : signal(Device),
              #: emitted if a device was removed
              str("deviceRemoved") : signal(Device),
              #: emitted if a device was changed
              str("deviceChanged") : signal(Device),
              #: emitted if a device was moved
              str("deviceMoved") : signal(Device)
           }
        )
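

# Illustrative sketch added for clarity -- not part of the original module.
# A Qt binding module such as ``pyudev.pyqt4`` is expected to build its public
# observer classes by passing the toolkit's QObject, signal factory and socket
# notifier into the generators above; the PyQt4 import is an assumption.
def _example_build_pyqt4_observers():
    """Build observer classes for PyQt4 (illustrative only)."""
    from PyQt4.QtCore import QObject, QSocketNotifier, pyqtSignal

    monitor_observer = MonitorObserverGenerator.make_monitor_observer(
        QObject, pyqtSignal, QSocketNotifier)
    qudev_monitor_observer = QUDevMonitorObserverGenerator.make_monitor_observer(
        QObject, pyqtSignal, QSocketNotifier)
    return monitor_observer, qudev_monitor_observer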
site-packages/pyudev/__pycache__/version.cpython-36.pyc [compiled CPython 3.6 bytecode; binary contents omitted -- module docstring: "pyudev.version -- Version information."]
site-packages/pyudev/__pycache__/_compat.cpython-36.opt-1.pyc [compiled CPython 3.6 bytecode; binary contents omitted -- module docstring: "pyudev._compat -- Compatibility for Python versions that lack certain functions."]
site-packages/pyudev/__pycache__/_compat.cpython-36.pyc [compiled CPython 3.6 bytecode; binary contents omitted -- same module as above, non-optimized build]
site-packages/pyudev/__pycache__/monitor.cpython-36.opt-1.pyc [compiled CPython 3.6 bytecode; binary contents omitted -- module docstring: "pyudev.monitor -- Monitor implementation."]
site-packages/pyudev/__pycache__/_util.cpython-36.opt-1.pyc [compiled CPython 3.6 bytecode; binary contents omitted -- module docstring: "pyudev._util -- Internal utilities."]
site-packages/pyudev/__pycache__/discover.cpython-36.opt-1.pyc [compiled CPython 3.6 bytecode; binary contents omitted -- module docstring: "pyudev.discover -- Tools to discover a device given limited information."]
site-packages/pyudev/__pycache__/discover.cpython-36.pyc [compiled CPython 3.6 bytecode; binary contents omitted -- same module as above, non-optimized build; listing truncated here in the source dump]
__future__rrrrr!rr(r&ZsixZ
pyudev.devicerrrZ
add_metaclass�ABCMeta�objectrr#r8r9rBrOrrrr�<module>s&=K/Msite-packages/pyudev/__pycache__/version.cpython-36.opt-1.pyc000064400000001143147511334620020067 0ustar003
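The recovered hypotheses are tried in turn by Discovery.get_devices().  A minimal
sketch of that flow, assuming the package is importable; the "8:0" value and the
printed attributes are illustrative only:

from pyudev import Context, Discovery

context = Context()
discovery = Discovery()
discovery.setup(context)  # optional; pre-computes device-link directories (expensive)

# The value is tried as a device number, a device path, a device name and a
# fragment of a device-file name; all matches are returned together.
for device in discovery.get_devices(context, "8:0"):
    print(device.sys_path, device.device_node)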

site-packages/pyudev/__pycache__/version.cpython-36.opt-1.pyc
[CPython 3.6 bytecode; binary data not shown. Recoverable docstring:]

    pyudev.version
    ==============

    Version information.

    .. moduleauthor::  mulhern  <amulhern@redhat.com>

[Defines __version_info__ and __version__, the dotted-string form of the version tuple.]

site-packages/pyudev/__pycache__/_qt_base.cpython-36.opt-1.pyc
[CPython 3.6 bytecode; binary data not shown. Recoverable docstrings:]

    pyudev._qt_base
    ===============

    Base mixin class for Qt4,Qt5 support.

    .. moduleauthor::  Sebastian Wiesner  <lunaryorn@gmail.com>

    MonitorObserverMixin
        Base mixin for pyqt monitor observers; wires the monitor file descriptor
        to a socket notifier and re-emits received devices as the ``deviceEvent``
        signal.  The ``enabled`` property reflects the state of the underlying
        notifier (added in 0.14).
    QUDevMonitorObserverMixin
        Obsolete monitor observer mixin (deprecated in favour of
        pyudev.pyqt4.MonitorObserver); maps actions to the deviceAdded,
        deviceRemoved, deviceChanged and deviceMoved signals.
    make_init(qobject, socket_notifier)
        Generates an initializer to observe the given :class:`~pyudev.Monitor`;
        ``parent`` is passed unchanged to the inherited QObject constructor.
    MonitorObserverGenerator / QUDevMonitorObserverGenerator
        Generate the MonitorObserver / QUDevMonitorObserver classes that turn
        device events into Qt signals for the PyQt4, PyQt5 and PySide bindings.
        Recovered usage example:

        >>> from pyudev import Context, Monitor
        >>> from pyudev.pyqt4 import MonitorObserver
        >>> context = Context()
        >>> monitor = Monitor.from_netlink(context)
        >>> monitor.filter_by(subsystem='input')
        >>> observer = MonitorObserver(monitor)
        >>> def device_event(device):
        ...     print('event {0} on device {1}'.format(device.action, device))
        >>> observer.deviceEvent.connect(device_event)
        >>> monitor.start()

site-packages/pyudev/__pycache__/__init__.cpython-36.pyc
[CPython 3.6 bytecode; binary data not shown. Recoverable docstring:]

    pyudev
    ======

    A binding to libudev.

    The :class:`Context` provides the connection to the udev device database
    and enumerates devices.  Individual devices are represented by the
    :class:`Device` class.

    Device monitoring is provided by :class:`Monitor` and
    :class:`MonitorObserver`.  With :mod:`pyudev.pyqt4`, :mod:`pyudev.pyside`,
    :mod:`pyudev.glib` and :mod:`pyudev.wx` device monitoring can be integrated
    into the event loop of various GUI toolkits.

    .. moduleauthor::  Sebastian Wiesner  <lunaryorn@gmail.com>

[Re-exports: Attributes, Device, Devices, Tags, the DeviceNotFound* error
classes, the discover hypotheses and Discovery, Context, Enumerator, Monitor,
MonitorObserver, __version__, __version_info__ and udev_version.]
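A quick sketch of the package-level entry points listed above; the sysfs path is
only an example, and Devices.from_path raises DeviceNotFoundAtPathError if no
such device exists:

from pyudev import Context, Devices

context = Context()
# Look up a single device by its sysfs path (example path; adjust to your system).
device = Devices.from_path(context, '/sys/class/net/lo')
print(device.sys_name, device.subsystem, device.device_type)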

site-packages/pyudev/__pycache__/monitor.cpython-36.pyc
[CPython 3.6 bytecode; binary data not shown. Recoverable docstrings:]

    pyudev.monitor
    ==============

    Monitor implementation.

    .. moduleauthor::  Sebastian Wiesner  <lunaryorn@gmail.com>

    Monitor
        A synchronous device event monitor.  Connects to the udev daemon and
        listens for changes to the device list.  Created with
        ``Monitor.from_netlink(context, source='udev')``; the 'kernel' source
        delivers events before udev has configured the device and is not
        recommended.  ``filter_by(subsystem, device_type=None)`` and
        ``filter_by_tag(tag)`` install kernel-side filters, ``remove_filter()``
        clears them (broken in udev up to 181).  ``start()`` begins receiving,
        ``poll(timeout=None)`` returns the next :class:`Device` or ``None`` on
        timeout, and ``fileno()`` exposes a selectable file descriptor.
        ``set_receive_buffer_size(size)`` needs the CAP_NET_ADMIN capability.
        ``enable_receiving()``, ``receive_device()`` and direct iteration are
        deprecated since 0.16 in favour of ``start()`` and ``poll()``.
    MonitorObserver
        An asynchronous observer: a daemon :class:`~threading.Thread` that polls
        a monitor in the background and invokes ``callback(device)`` for every
        event.  ``send_stop()`` signals the thread to exit; ``stop()``
        additionally joins it and may be called from the observer thread.
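A minimal sketch of the synchronous polling loop the recovered Monitor
docstrings describe; the 'block' filter and the three-second timeout are
illustrative only:

from functools import partial

from pyudev import Context, Monitor

context = Context()
monitor = Monitor.from_netlink(context)
monitor.filter_by('block')  # kernel-side filter; optional

# poll() implicitly calls start(); it returns None once the timeout expires,
# which ends the iter() loop below.
for device in iter(partial(monitor.poll, 3), None):
    print('{0}: {1}'.format(device.action, device.device_path))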

site-packages/pyudev/__pycache__/core.cpython-36.pyc
[CPython 3.6 bytecode; binary data not shown. Recoverable docstrings:]

    pyudev.core
    ===========

    Core types and functions of :mod:`pyudev`.

    .. moduleauthor::  Sebastian Wiesner  <lunaryorn@gmail.com>

    Context
        A device database connection, and the central object for accessing udev.
        Exposes ``sys_path`` (default ``/sys``), ``device_path`` (default
        ``/dev``), ``run_path`` (default ``/run/udev``), the ``log_priority``
        property (standard :mod:`syslog` priorities, e.g.
        ``context.log_priority = syslog.LOG_DEBUG``) and ``list_devices()``,
        whose keyword arguments are passed straight to
        :meth:`Enumerator.match()`.
    Enumerator
        A filtered iterable of devices.  Filters of the same kind are combined
        with OR, filters of different kinds with AND; for instance::

            devices.match_subsystem('block').match_property(
                'ID_TYPE', 'disk').match_property('DEVTYPE', 'disk')

        means ``subsystem == 'block' and (ID_TYPE == 'disk' or DEVTYPE ==
        'disk')``.  Provides match(), match_subsystem(), match_sys_name(),
        match_property(), match_attribute(), match_tag(),
        match_is_initialized() and match_parent(); once added, a filter cannot
        be removed.  Iterating yields :class:`Device` objects.
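Putting the recovered Context and Enumerator documentation together, a minimal
sketch of filtered enumeration; the subsystem and property names are
illustrative only:

from pyudev import Context

context = Context()

# Keyword arguments to list_devices() are forwarded to Enumerator.match():
# 'subsystem' goes to match_subsystem(), other keywords to match_property().
for device in context.list_devices(subsystem='block', DEVTYPE='disk'):
    print(device.device_node, device.get('ID_MODEL'))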

site-packages/pyudev/__pycache__/core.cpython-36.opt-1.pyc
[CPython 3.6 bytecode; optimized variant of core.cpython-36.pyc above. The
recoverable docstrings are identical and are not repeated here.]

site-packages/pyudev/__pycache__/_util.cpython-36.pyc
[CPython 3.6 bytecode; binary data not shown. Recoverable docstrings:]

    pyudev._util
    ============

    Internal utilities

    .. moduleauthor::  Sebastian Wiesner  <lunaryorn@gmail.com>

    ensure_byte_string(value) / ensure_unicode_string(value)
        Encode or decode with the filesystem encoding
        (sys.getfilesystemencoding()).
    property_value_to_bytes(value)
        Booleans become b'1' or b'0', byte strings pass through, anything else
        is converted through a unicode string.
    string_to_bool(value)
        '1' gives True, '0' gives False, any other value raises ValueError.
    udev_list_iterate(libudev, entry)
        Yield (name, value) bytestring pairs from a udev list entry.
    get_device_type(filename)
        Return 'char' or 'block' for a device file, raise ValueError for other
        files and EnvironmentError if the file is inaccessible.
    eintr_retry_call(func, *args, **kwargs)
        Retry an interruptible system call while it raises EINTR (pre-PEP 475
        behaviour).
    udev_version()
        Run ``udevadm --version`` and return the single integer release number
        of the underlying udev library; may raise ValueError,
        EnvironmentError or subprocess.CalledProcessError.

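Of these helpers, udev_version() is also re-exported at package level.  A small
sketch of calling it, with the error cases the recovered docstring mentions:

from subprocess import CalledProcessError

from pyudev import udev_version

try:
    print('udev release:', udev_version())  # single integer parsed from `udevadm --version`
except (EnvironmentError, CalledProcessError, ValueError) as err:
    print('could not determine udev version:', err)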
site-packages/pyudev/__pycache__/_qt_base.cpython-36.pyc
[CPython 3.6 bytecode; unoptimized counterpart of _qt_base.cpython-36.opt-1.pyc
above. The recoverable docstrings are identical and are not repeated here.]

N�#W�	�@s@dZddlmZddlmZddlmZddlmZddlmZddlmZddlm	Z	dd	lm
Z
dd
lmZddlmZddlm
Z
dd
lmZddlmZddlmZddlmZddlmZddlmZddlmZddlmZddlmZddlmZddlmZddlmZddlmZddlmZddl m!Z!dS)a.
    pyudev
    ======

    A binding to libudev.

    The :class:`Context` provides the connection to the udev device database
    and enumerates devices.  Individual devices are represented by the
    :class:`Device` class.

    Device monitoring is provided by :class:`Monitor` and
    :class:`MonitorObserver`.  With :mod:`pyudev.pyqt4`, :mod:`pyudev.pyside`,
    :mod:`pyudev.glib` and :mod:`pyudev.wx` device monitoring can be integrated
    into the event loop of various GUI toolkits.

    .. moduleauthor::  Sebastian Wiesner  <lunaryorn@gmail.com>
�)�absolute_import)�division)�print_function)�unicode_literals)�
Attributes)�Device)�Devices)�DeviceNotFoundAtPathError)�DeviceNotFoundByFileError)�DeviceNotFoundByNameError)�DeviceNotFoundByNumberError)�DeviceNotFoundError)� DeviceNotFoundInEnvironmentError)�Tags)�DeviceFileHypothesis)�DeviceNameHypothesis)�DeviceNumberHypothesis)�DevicePathHypothesis)�	Discovery)�Context)�
Enumerator)�Monitor)�MonitorObserver)�__version__)�__version_info__)�udev_versionN)"�__doc__Z
__future__rrrrZ
pyudev.devicerrrr	r
rrr
rrZpyudev.discoverrrrrrZpyudev.corerrZpyudev.monitorrrZpyudev.versionrrZpyudev._utilr�rr�/usr/lib/python3.6/__init__.py�<module>#s4site-packages/pyudev/version.py000064400000002115147511334620012644 0ustar00# -*- coding: utf-8 -*-
# Copyright (C) 2015 mulhern <amulhern@redhat.com>

# This library is free software; you can redistribute it and/or modify it
# under the terms of the GNU Lesser General Public License as published by the
# Free Software Foundation; either version 2.1 of the License, or (at your
# option) any later version.

# This library is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or
# FITNESS FOR A PARTICULAR PURPOSE.  See the GNU Lesser General Public License
# for more details.

# You should have received a copy of the GNU Lesser General Public License
# along with this library; if not, write to the Free Software Foundation,
# Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110-1301 USA


"""
    pyudev.version
    ==============

    Version information.

    .. moduleauthor::  mulhern  <amulhern@redhat.com>
"""

__version_info__ = (0, 21, 0, '')
__version__ = "%s%s" % \
   (
      ".".join(str(x) for x in __version_info__[:3]),
      "".join(str(x) for x in __version_info__[3:])
   )
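
# For example, with __version_info__ == (0, 21, 0, ''), the expression above
# yields __version__ == "0.21.0".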
site-packages/pyudev/discover.py000064400000026364147511334620013011 0ustar00# -*- coding: utf-8 -*-
# Copyright (C) 2015 mulhern <amulhern@redhat.com>

# This library is free software; you can redistribute it and/or modify it
# under the terms of the GNU Lesser General Public License as published by the
# Free Software Foundation; either version 2.1 of the License, or (at your
# option) any later version.

# This library is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or
# FITNESS FOR A PARTICULAR PURPOSE.  See the GNU Lesser General Public License
# for more details.

# You should have received a copy of the GNU Lesser General Public License
# along with this library; if not, write to the Free Software Foundation,
# Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110-1301 USA


"""
    pyudev.discover
    ===============

    Tools to discover a device given limited information.

    .. moduleauthor::  mulhern <amulhern@redhat.com>
"""

from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
from __future__ import unicode_literals

import abc
import functools
import os
import re
import six

from pyudev.device import Devices
from pyudev.device import DeviceNotFoundError


def wrap_exception(func):
    """
    Allow Device discovery methods to return None instead of raising an
    exception.
    """

    @functools.wraps(func)
    def the_func(*args, **kwargs):
        """
        Returns result of calling ``func`` on ``args``, ``kwargs``.
        Returns None if ``func`` raises :exc:`DeviceNotFoundError`.
        """
        try:
            return func(*args, **kwargs)
        except DeviceNotFoundError:
            return None

    return the_func
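
# A minimal usage sketch (illustrative, not part of the library): wrapping
# ``Devices.from_path`` turns a raising lookup into one that returns ``None``
# when no device is found.
#
#     >>> from pyudev import Context, Devices
#     >>> context = Context()
#     >>> lookup = wrap_exception(Devices.from_path)
#     >>> lookup(context, '/sys/no/such/device') is None
#     True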

@six.add_metaclass(abc.ABCMeta)
class Hypothesis(object):
    """
    Represents a hypothesis about the meaning of the device identifier.
    """

    @classmethod
    @abc.abstractmethod
    def match(cls, value): # pragma: no cover
        """
        Match the given string according to the hypothesis.

        The purpose of this method is to obtain a value corresponding to
        ``value`` if that is possible. It may use a regular expression, but
        in general it should just return ``value`` and let the lookup method
        sort out the rest.

        :param str value: the string to inspect
        :returns: the matched thing or None if unmatched
        :rtype: the type of lookup's key parameter or NoneType
        """
        raise NotImplementedError()

    @classmethod
    @abc.abstractmethod
    def lookup(cls, context, key): # pragma: no cover
        """
        Lookup the given string according to the hypothesis.

        :param Context context: the pyudev context
        :param key: a key with which to lookup the device
        :type key: the type of match's return value if not None
        :returns: a list of Devices obtained
        :rtype: frozenset of :class:`Device`
        """
        raise NotImplementedError()

    @classmethod
    def setup(cls, context):
        """
        A potentially expensive method that may allow an :class:`Hypothesis`
        to find devices more rapidly or to find a device that it would
        otherwise miss.

        :param Context context: the pyudev context
        """
        pass

    @classmethod
    def get_devices(cls, context, value):
        """
        Get any devices that may correspond to the given string.

        :param Context context: the pyudev context
        :param str value: the value to look for
        :returns: a list of devices obtained
        :rtype: set of :class:`Device`
        """
        key = cls.match(value)
        return cls.lookup(context, key) if key is not None else frozenset()


class DeviceNumberHypothesis(Hypothesis):
    """
    Represents the hypothesis that the device is a device number.

    The device number may be given as a major/minor pair or as a single
    composite number.
    """

    @classmethod
    def _match_major_minor(cls, value):
        """
        Match the number under the assumption that it is a major,minor pair.

        :param str value: value to match
        :returns: the device number or None
        :rtype: int or NoneType
        """
        major_minor_re = re.compile(
           r'^(?P<major>\d+)(\D+)(?P<minor>\d+)$'
        )
        match = major_minor_re.match(value)
        return match and os.makedev(
           int(match.group('major')),
           int(match.group('minor'))
        )
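        # For example (illustrative): '8:0' and '8-0' both match and yield
        # os.makedev(8, 0), while a plain name such as 'sda' yields None.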

    @classmethod
    def _match_number(cls, value):
        """
        Match the number under the assumption that it is a single number.

        :param str value: value to match
        :returns: the device number or None
        :rtype: int or NoneType
        """
        number_re = re.compile(r'^(?P<number>\d+)$')
        match = number_re.match(value)
        return match and int(match.group('number'))

    @classmethod
    def match(cls, value):
        """
        Match the number under the assumption that it is a device number.

        :returns: the device number or None
        :rtype: int or NoneType
        """
        return cls._match_major_minor(value) or cls._match_number(value)

    @classmethod
    def find_subsystems(cls, context):
        """
        Find subsystems in /sys/dev.

        :param Context context: the context
        :returns: a list of available subsystems
        :rtype: list of str
        """
        sys_path = context.sys_path
        return os.listdir(os.path.join(sys_path, 'dev'))

    @classmethod
    def lookup(cls, context, key):
        """
        Lookup by the device number.

        :param Context context: the context
        :param int key: the device number
        :returns: a list of matching devices
        :rtype: frozenset of :class:`Device`
        """
        func = wrap_exception(Devices.from_device_number)
        res = (func(context, s, key) for s in cls.find_subsystems(context))
        return frozenset(r for r in res if r is not None)


class DevicePathHypothesis(Hypothesis):
    """
    Discover the device assuming the identifier is a device path.
    """

    @classmethod
    def match(cls, value):
        """
        Match ``value`` under the assumption that it is a device path.

        :returns: the device path or None
        :rtype: str or NoneType
        """
        return value

    @classmethod
    def lookup(cls, context, key):
        """
        Lookup by the path.

        :param Context context: the context
        :param str key: the device path
        :returns: a list of matching devices
        :rtype: frozenset of :class:`Device`
        """
        res = wrap_exception(Devices.from_path)(context, key)
        return frozenset((res,)) if res is not None else frozenset()


class DeviceNameHypothesis(Hypothesis):
    """
    Discover the device assuming the input is a device name.

    Try every available subsystem.
    """

    @classmethod
    def find_subsystems(cls, context):
        """
        Find all subsystems in sysfs.

        :param Context context: the context
        :rtype: frozenset
        :returns: subsystems in sysfs
        """
        sys_path = context.sys_path
        dirnames = ('bus', 'class', 'subsystem')
        absnames = (os.path.join(sys_path, name) for name in dirnames)
        realnames = (d for d in absnames if os.path.isdir(d))
        return frozenset(n for d in realnames for n in os.listdir(d))

    @classmethod
    def match(cls, value):
        """
        Match ``value`` under the assumption that it is a device name.

        :returns: the device path or None
        :rtype: str or NoneType
        """
        return value

    @classmethod
    def lookup(cls, context, key):
        """
        Lookup by the device name.

        :param Context context: the context
        :param str key: the device name
        :returns: a list of matching devices
        :rtype: frozenset of :class:`Device`
        """
        func = wrap_exception(Devices.from_name)
        res = (func(context, s, key) for s in cls.find_subsystems(context))
        return frozenset(r for r in res if r is not None)


class DeviceFileHypothesis(Hypothesis):
    """
    Discover the device assuming the value is some portion of a device file.

    The device file may be a link to a device node.
    """

    _LINK_DIRS = [
       '/dev',
       '/dev/disk/by-id',
       '/dev/disk/by-label',
       '/dev/disk/by-partlabel',
       '/dev/disk/by-partuuid',
       '/dev/disk/by-path',
       '/dev/disk/by-uuid',
       '/dev/input/by-path',
       '/dev/mapper',
       '/dev/md',
       '/dev/vg'
    ]

    @classmethod
    def get_link_dirs(cls, context):
        """
        Get all directories that may contain links to device nodes.

        This method checks the device links of every device, so it is very
        expensive.

        :param Context context: the context
        :returns: a sorted list of directories that contain device links
        :rtype: list
        """
        devices = context.list_devices()
        devices_with_links = (d for d in devices if list(d.device_links))
        links = (l for d in devices_with_links for l in d.device_links)
        return sorted(set(os.path.dirname(l) for l in links))

    @classmethod
    def setup(cls, context):
        """
        Set the link directories to be used when discovering by file.

        Uses `get_link_dirs`, so it is just as expensive as that method.

        :param Context context: the context
        """
        cls._LINK_DIRS = cls.get_link_dirs(context)

    @classmethod
    def match(cls, value):
        return value

    @classmethod
    def lookup(cls, context, key):
        """
        Lookup the device under the assumption that the key is part of
        the name of a device file.

        :param Context context: the context
        :param str key: a portion of the device file name

        It is assumed that either it is the whole name of the device file
        or it is the basename.

        A device file may be a device node or a device link.
        """
        func = wrap_exception(Devices.from_device_file)
        if '/' in key:
            device = func(context, key)
            return frozenset((device,)) if device is not None else frozenset()
        else:
            files = (os.path.join(ld, key) for ld in cls._LINK_DIRS)
            devices = (func(context, f) for f in files)
            return frozenset(d for d in devices if d is not None)


class Discovery(object):
    # pylint: disable=too-few-public-methods
    """
    Provides discovery methods for devices.
    """

    _HYPOTHESES = [
       DeviceFileHypothesis,
       DeviceNameHypothesis,
       DeviceNumberHypothesis,
       DevicePathHypothesis
    ]

    def __init__(self):
        self._hypotheses = self._HYPOTHESES

    def setup(self, context):
        """
        Set up individual hypotheses.

        May be an expensive call.

        :param Context context: the context
        """
        for hyp in self._hypotheses:
            hyp.setup(context)

    def get_devices(self, context, value):
        """
        Get the devices corresponding to value.

        :param Context context: the context
        :param str value: some identifier of the device
        :returns: a list of corresponding devices
        :rtype: frozenset of :class:`Device`
        """
        return frozenset(
           d for h in self._hypotheses for d in h.get_devices(context, value)
        )
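
# A usage sketch (illustrative; 'sda' is a hypothetical identifier):
#
#     >>> from pyudev import Context
#     >>> context = Context()
#     >>> discovery = Discovery()
#     >>> discovery.setup(context)   # optional and potentially expensive
#     >>> devices = discovery.get_devices(context, 'sda')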
site-packages/pyudev/_os/__init__.py000064400000001741147511334620013502 0ustar00# -*- coding: utf-8 -*-
# Copyright (C) 2015 mulhern <amulhern@redhat.com>

# This library is free software; you can redistribute it and/or modify it
# under the terms of the GNU Lesser General Public License as published by the
# Free Software Foundation; either version 2.1 of the License, or (at your
# option) any later version.

# This library is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or
# FITNESS FOR A PARTICULAR PURPOSE.  See the GNU Lesser General Public License
# for more details.

# You should have received a copy of the GNU Lesser General Public License
# along with this library; if not, write to the Free Software Foundation,
# Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110-1301 USA

"""
    pyudev._os
    ==========

    Extras to compensate for deficiencies in the Python os module.

    .. moduleauthor::  mulhern  <amulhern@redhat.com>
"""

from . import pipe
from . import poll
site-packages/pyudev/_os/pipe.py000064400000010703147511334620012676 0ustar00# -*- coding: utf-8 -*-
# Copyright (C) 2013 Sebastian Wiesner <lunaryorn@gmail.com>

# This library is free software; you can redistribute it and/or modify it
# under the terms of the GNU Lesser General Public License as published by the
# Free Software Foundation; either version 2.1 of the License, or (at your
# option) any later version.

# This library is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or
# FITNESS FOR A PARTICULAR PURPOSE.  See the GNU Lesser General Public License
# for more details.

# You should have received a copy of the GNU Lesser General Public License
# along with this library; if not, write to the Free Software Foundation,
# Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110-1301 USA


"""
    pyudev._os.pipe
    ===============

    Fallback implementations for pipe.

    1. pipe2 from python os module
    2. pipe2 from libc
    3. pipe from python os module

    The Pipe class wraps the chosen implementation.

    .. moduleauthor:: Sebastian Wiesner  <lunaryorn@gmail.com>
"""

from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
from __future__ import unicode_literals

import os
import fcntl
from functools import partial

from pyudev._ctypeslib.libc import fd_pair
from pyudev._ctypeslib.libc import ERROR_CHECKERS
from pyudev._ctypeslib.libc import SIGNATURES
from pyudev._ctypeslib.utils import load_ctypes_library

# Define O_CLOEXEC, if not present in os already
O_CLOEXEC = getattr(os, 'O_CLOEXEC', 0o2000000)


def _pipe2_ctypes(libc, flags):
    """A ``pipe2`` implementation using ``pipe2`` from ctypes.

    ``libc`` is a :class:`ctypes.CDLL` object for libc.  ``flags`` is an
    integer providing the flags to ``pipe2``.

    Return a pair of file descriptors ``(r, w)``.

    """
    fds = fd_pair()
    libc.pipe2(fds, flags)
    return fds[0], fds[1]


def _pipe2_by_pipe(flags):
    """A ``pipe2`` implementation using :func:`os.pipe`.

    ``flags`` is an integer providing the flags to ``pipe2``.

    .. warning::

       This implementation is not atomic!

    Return a pair of file descriptors ``(r, w)``.

    """
    fds = os.pipe()
    if flags & os.O_NONBLOCK != 0:
        for fd in fds:
            set_fd_status_flag(fd, os.O_NONBLOCK)
    if flags & O_CLOEXEC != 0:
        for fd in fds:
            set_fd_flag(fd, O_CLOEXEC)
    return fds


def _get_pipe2_implementation():
    """Find the appropriate implementation for ``pipe2``.

    Return a function implementing ``pipe2``.
    """
    if hasattr(os, 'pipe2'):
        return os.pipe2 # pylint: disable=no-member
    else:
        try:
            libc = load_ctypes_library("libc", SIGNATURES, ERROR_CHECKERS)
            return (partial(_pipe2_ctypes, libc)
                    if hasattr(libc, 'pipe2') else
                    _pipe2_by_pipe)
        except ImportError:
            return _pipe2_by_pipe


_PIPE2 = _get_pipe2_implementation()


def set_fd_flag(fd, flag):
    """Set a flag on a file descriptor.

    ``fd`` is the file descriptor or file object, ``flag`` the flag as integer.

    """
    flags = fcntl.fcntl(fd, fcntl.F_GETFD, 0)
    fcntl.fcntl(fd, fcntl.F_SETFD, flags | flag)


def set_fd_status_flag(fd, flag):
    """Set a status flag on a file descriptor.

    ``fd`` is the file descriptor or file object, ``flag`` the flag as integer.

    """
    flags = fcntl.fcntl(fd, fcntl.F_GETFL, 0)
    fcntl.fcntl(fd, fcntl.F_SETFL, flags | flag)


class Pipe(object):
    """A unix pipe.

    A pipe object provides two file objects: :attr:`source` is a readable file
    object, and :attr:`sink` a writable one.  Bytes written to :attr:`sink` appear
    at :attr:`source`.

    Open a pipe with :meth:`open()`.

    """

    @classmethod
    def open(cls):
        """Open and return a new :class:`Pipe`.

        The pipe uses non-blocking IO."""
        source, sink = _PIPE2(os.O_NONBLOCK | O_CLOEXEC)
        return cls(source, sink)

    def __init__(self, source_fd, sink_fd):
        """Create a new pipe object from the given file descriptors.

        ``source_fd`` is a file descriptor for the readable side of the pipe,
        ``sink_fd`` is a file descriptor for the writeable side."""
        self.source = os.fdopen(source_fd, 'rb', 0)
        self.sink = os.fdopen(sink_fd, 'wb', 0)

    def close(self):
        """Closes both sides of the pipe."""
        try:
            self.source.close()
        finally:
            self.sink.close()
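
# A usage sketch (illustrative): bytes written to the sink become readable on
# the source; both ends use non-blocking IO.
#
#     >>> p = Pipe.open()
#     >>> _ = p.sink.write(b'ping')
#     >>> p.source.read(4)
#     b'ping'
#     >>> p.close()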
site-packages/pyudev/_os/poll.py000064400000010033147511334620012703 0ustar00# -*- coding: utf-8 -*-
# Copyright (C) 2013 Sebastian Wiesner <lunaryorn@gmail.com>

# This library is free software; you can redistribute it and/or modify it
# under the terms of the GNU Lesser General Public License as published by the
# Free Software Foundation; either version 2.1 of the License, or (at your
# option) any later version.

# This library is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or
# FITNESS FOR A PARTICULAR PURPOSE.  See the GNU Lesser General Public License
# for more details.

# You should have received a copy of the GNU Lesser General Public License
# along with this library; if not, write to the Free Software Foundation,
# Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110-1301 USA


"""
    pyudev._os.poll
    ===============

    Operating system interface for pyudev.

    .. moduleauthor:: Sebastian Wiesner  <lunaryorn@gmail.com>
"""

from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
from __future__ import unicode_literals

import select

from pyudev._util import eintr_retry_call


class Poll(object):
    """A poll object.

    This object essentially provides a more convenient interface around
    :class:`select.poll`.

    """

    _EVENT_TO_MASK = {'r': select.POLLIN,
                      'w': select.POLLOUT}

    @staticmethod
    def _has_event(events, event):
        return events & event != 0

    @classmethod
    def for_events(cls, *events):
        """Listen for ``events``.

        ``events`` is a list of ``(fd, event)`` pairs, where ``fd`` is a file
        descriptor or file object and ``event`` either ``'r'`` or ``'w'``.  If
        ``'r'``, listen for whether the channel is ready to be read.  If ``'w'``,
        listen for whether the channel is ready to be written to.

        """
        notifier = eintr_retry_call(select.poll)
        for fd, event in events:
            mask = cls._EVENT_TO_MASK.get(event)
            if not mask:
                raise ValueError('Unknown event type: {0!r}'.format(event))
            notifier.register(fd, mask)
        return cls(notifier)

    def __init__(self, notifier):
        """Create a poll object for the given ``notifier``.

        ``notifier`` is the :class:`select.poll` object wrapped by the new poll
        object.

        """
        self._notifier = notifier

    def poll(self, timeout=None):
        """Poll for events.

        ``timeout`` is an integer specifying how long to wait for events (in
        milliseconds).  If omitted, ``None`` or negative, wait until an event
        occurs.

        Return a list of all events that occurred before ``timeout``, where
        each event is a pair ``(fd, event)``. ``fd`` is the integral file
        descriptor, and ``event`` a string indicating the event type.  If
        ``'r'``, there is data to read from ``fd``.  If ``'w'``, ``fd`` is
        writable without blocking now.  If ``'h'``, the file descriptor was
        hung up (i.e. the remote side of a pipe was closed).

        """
        # Return a list to allow clients to determine whether there are any
        # events at all with a simple truthiness test.
        return list(self._parse_events(eintr_retry_call(self._notifier.poll, timeout)))

    def _parse_events(self, events):
        """Parse ``events``.

        ``events`` is a list of events as returned by
        :meth:`select.poll.poll()`.

        Yield all parsed events.

        """
        for fd, event_mask in events:
            if self._has_event(event_mask, select.POLLNVAL):
                raise IOError('File descriptor not open: {0!r}'.format(fd))
            elif self._has_event(event_mask, select.POLLERR):
                raise IOError('Error while polling fd: {0!r}'.format(fd))

            if self._has_event(event_mask, select.POLLIN):
                yield fd, 'r'
            if self._has_event(event_mask, select.POLLOUT):
                yield fd, 'w'
            if self._has_event(event_mask, select.POLLHUP):
                yield fd, 'h'
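
# A usage sketch (illustrative; reuses the Pipe class from pyudev._os.pipe):
#
#     >>> from pyudev._os.pipe import Pipe
#     >>> p = Pipe.open()
#     >>> poller = Poll.for_events((p.source, 'r'))
#     >>> _ = p.sink.write(b'x')
#     >>> poller.poll(timeout=100)   # roughly [(p.source.fileno(), 'r')]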
site-packages/pyudev/_os/__pycache__/__init__.cpython-36.pyc
site-packages/pyudev/_os/__pycache__/__init__.cpython-36.opt-1.pyc
site-packages/pyudev/_os/__pycache__/poll.cpython-36.pyc
site-packages/pyudev/_os/__pycache__/poll.cpython-36.opt-1.pyc
site-packages/pyudev/_os/__pycache__/pipe.cpython-36.pyc
site-packages/pyudev/_os/__pycache__/pipe.cpython-36.opt-1.pyc
    [compiled bytecode for the pyudev._os package; binary data omitted -- the
    corresponding sources appear above]

site-packages/pyudev/core.py
# -*- coding: utf-8 -*-
# Copyright (C) 2010, 2011 Sebastian Wiesner <lunaryorn@gmail.com>

# This library is free software; you can redistribute it and/or modify it
# under the terms of the GNU Lesser General Public License as published by the
# Free Software Foundation; either version 2.1 of the License, or (at your
# option) any later version.

# This library is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or
# FITNESS FOR A PARTICULAR PURPOSE.  See the GNU Lesser General Public License
# for more details.

# You should have received a copy of the GNU Lesser General Public License
# along with this library; if not, write to the Free Software Foundation,
# Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110-1301 USA


"""
    pyudev.core
    ===========

    Core types and functions of :mod:`pyudev`.

    .. moduleauthor::  Sebastian Wiesner  <lunaryorn@gmail.com>
"""


from __future__ import (print_function, division, unicode_literals,
                        absolute_import)

from pyudev.device import Devices
from pyudev.device._errors import DeviceNotFoundAtPathError
from pyudev._ctypeslib.libudev import ERROR_CHECKERS
from pyudev._ctypeslib.libudev import SIGNATURES
from pyudev._ctypeslib.utils import load_ctypes_library

from pyudev._util import ensure_byte_string
from pyudev._util import ensure_unicode_string
from pyudev._util import property_value_to_bytes
from pyudev._util import udev_list_iterate


class Context(object):
    """
    A device database connection.

    This class represents a connection to the udev device database, and is
    really *the* central object to access udev.  You need an instance of this
    class for almost anything else in pyudev.

    This class itself gives access to various udev configuration data (e.g.
    :attr:`sys_path`, :attr:`device_path`), and provides device enumeration
    (:meth:`list_devices()`).

    Instances of this class can directly be given as ``udev *`` to functions
    wrapped through :mod:`ctypes`.
    """

    def __init__(self):
        """
        Create a new context.
        """
        self._libudev = load_ctypes_library('udev', SIGNATURES, ERROR_CHECKERS)
        self._as_parameter_ = self._libudev.udev_new()

    def __del__(self):
        self._libudev.udev_unref(self)

    @property
    def sys_path(self):
        """
        The ``sysfs`` mount point defaulting to ``/sys`` as a unicode string.
        """
        if hasattr(self._libudev, 'udev_get_sys_path'):
            return ensure_unicode_string(self._libudev.udev_get_sys_path(self))
        else:
            # Fixed path since udev 183
            return '/sys'

    @property
    def device_path(self):
        """
        The device directory path defaulting to ``/dev`` as a unicode string.
        """
        if hasattr(self._libudev, 'udev_get_dev_path'):
            return ensure_unicode_string(self._libudev.udev_get_dev_path(self))
        else:
            # Fixed path since udev 183
            return '/dev'

    @property
    def run_path(self):
        """
        The udev runtime directory path defaulting to ``/run/udev`` as a
        unicode string.

        .. udevversion:: 167

        .. versionadded:: 0.10
        """
        if hasattr(self._libudev, 'udev_get_run_path'):
            return ensure_unicode_string(self._libudev.udev_get_run_path(self))
        else:
            return '/run/udev'

    @property
    def log_priority(self):
        """
        The logging priority of the internal logging facility of udev as an
        integer with a standard :mod:`syslog` priority.  Assign to this
        property to change the logging priority.

        UDev uses the standard :mod:`syslog` priorities.  Constants for these
        priorities are defined in the :mod:`syslog` module in the standard
        library:

        >>> import syslog
        >>> context = pyudev.Context()
        >>> context.log_priority = syslog.LOG_DEBUG

        .. versionadded:: 0.9
        """
        return self._libudev.udev_get_log_priority(self)

    @log_priority.setter
    def log_priority(self, value):
        """
        Set the log priority.

        :param int value: the log priority.
        """
        self._libudev.udev_set_log_priority(self, value)

    def list_devices(self, **kwargs):
        """
        List all available devices.

        The arguments of this method are the same as for
        :meth:`Enumerator.match()`.  In fact, the arguments are simply passed
        straight to :meth:`~Enumerator.match()`.

        This function creates and returns an :class:`Enumerator` object,
        that can be used to filter the list of devices, and eventually
        retrieve :class:`Device` objects representing matching devices.

        .. versionchanged:: 0.8
           Accept keyword arguments now for easy matching.
        """
        return Enumerator(self).match(**kwargs)
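
    # A usage sketch (illustrative; the property names follow udev conventions
    # and are not exhaustive):
    #
    #     >>> context = Context()
    #     >>> for device in context.list_devices(subsystem='block',
    #     ...                                    DEVTYPE='partition'):
    #     ...     print(device.device_node)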


class Enumerator(object):
    """
    A filtered iterable of devices.

    To retrieve devices, simply iterate over an instance of this class.
    This operation yields :class:`Device` objects representing the available
    devices.

    Before iteration the device list can be filtered by subsystem or by
    property values using :meth:`match_subsystem` and
    :meth:`match_property`.  Multiple subsystem (property) filters are
    combined using a logical OR, filters of different types are combined
    using a logical AND.  The following filter for instance::

        devices.match_subsystem('block').match_property(
            'ID_TYPE', 'disk').match_property('DEVTYPE', 'disk')

    means the following::

        subsystem == 'block' and (ID_TYPE == 'disk' or DEVTYPE == 'disk')

    Once added, a filter cannot be removed anymore.  Create a new object
    instead.

    Instances of this class can directly be given as ``udev_enumerate *``
    to functions wrapped through :mod:`ctypes`.
    """

    def __init__(self, context):
        """
        Create a new enumerator with the given ``context`` (a
        :class:`Context` instance).

        While you can create objects of this class directly, this is not
        recommended.  Call :meth:`Context.list_devices()` instead.
        """
        if not isinstance(context, Context):
            raise TypeError('Invalid context object')
        self.context = context
        self._as_parameter_ = context._libudev.udev_enumerate_new(context)
        self._libudev = context._libudev

    def __del__(self):
        self._libudev.udev_enumerate_unref(self)

    def match(self, **kwargs):
        """
        Include devices according to the rules defined by the keyword
        arguments.  These keyword arguments are interpreted as follows:

        - The value for the keyword argument ``subsystem`` is forwarded to
          :meth:`match_subsystem()`.
        - The value for the keyword argument ``sys_name`` is forwarded to
          :meth:`match_sys_name()`.
        - The value for the keyword argument ``tag`` is forwarded to
          :meth:`match_tag()`.
        - The value for the keyword argument ``parent`` is forwarded to
          :meth:`match_parent()`.
        - All other keyword arguments are forwarded one by one to
          :meth:`match_property()`.  The keyword argument itself is interpreted
          as the property name, the value of the keyword argument as the
          property value.

        All keyword arguments are optional; calling this method without any
        arguments at all is simply a noop.

        Return the instance again.

        .. versionadded:: 0.8

        .. versionchanged:: 0.13
           Add ``parent`` keyword.
        """
        subsystem = kwargs.pop('subsystem', None)
        if subsystem is not None:
            self.match_subsystem(subsystem)
        sys_name = kwargs.pop('sys_name', None)
        if sys_name is not None:
            self.match_sys_name(sys_name)
        tag = kwargs.pop('tag', None)
        if tag is not None:
            self.match_tag(tag)
        parent = kwargs.pop('parent', None)
        if parent is not None:
            self.match_parent(parent)
        for prop, value in kwargs.items():
            self.match_property(prop, value)
        return self

    def match_subsystem(self, subsystem, nomatch=False):
        """
        Include all devices, which are part of the given ``subsystem``.

        ``subsystem`` is either a unicode string or a byte string, containing
        the name of the subsystem.  If ``nomatch`` is ``True`` (default is
        ``False``), the match is inverted:  A device is only included if it is
        *not* part of the given ``subsystem``.

        Note that a device without a subsystem is never included, regardless
        of the value of ``nomatch``.

        Return the instance again.
        """
        match = self._libudev.udev_enumerate_add_nomatch_subsystem \
           if nomatch else self._libudev.udev_enumerate_add_match_subsystem
        match(self, ensure_byte_string(subsystem))
        return self

    def match_sys_name(self, sys_name):
        """
        Include all devices with the given name.

        ``sys_name`` is a byte or unicode string containing the device name.

        Return the instance again.

        .. versionadded:: 0.8
        """
        self._libudev.udev_enumerate_add_match_sysname(
            self, ensure_byte_string(sys_name))
        return self

    def match_property(self, prop, value):
        """
        Include all devices, whose ``prop`` has the given ``value``.

        ``prop`` is either a unicode string or a byte string, containing
        the name of the property to match.  ``value`` is a property value,
        being one of the following types:

        - :func:`int`
        - :func:`bool`
        - A byte string
        - Anything convertible to a unicode string (including a unicode string
          itself)

        Return the instance again.
        """
        self._libudev.udev_enumerate_add_match_property(
            self, ensure_byte_string(prop), property_value_to_bytes(value))
        return self

    def match_attribute(self, attribute, value, nomatch=False):
        """
        Include all devices, whose ``attribute`` has the given ``value``.

        ``attribute`` is either a unicode string or a byte string, containing
        the name of a sys attribute to match.  ``value`` is an attribute value,
        being one of the following types:

        - :func:`int`
        - :func:`bool`
        - A byte string
        - Anything convertible to a unicode string (including a unicode string
          itself)

        If ``nomatch`` is ``True`` (default is ``False``), the match is
        inverted:  A device is included if the ``attribute`` does *not* match
        the given ``value``.

        .. note::

           If ``nomatch`` is ``True``, devices which do not have the given
           ``attribute`` at all are also included.  In other words, with
           ``nomatch=True`` the given ``attribute`` is *not* guaranteed to
           exist on all returned devices.

        Return the instance again.
        """
        match = (self._libudev.udev_enumerate_add_match_sysattr
                 if not nomatch else
                 self._libudev.udev_enumerate_add_nomatch_sysattr)
        match(self, ensure_byte_string(attribute),
              property_value_to_bytes(value))
        return self

    def match_tag(self, tag):
        """
        Include all devices, which have the given ``tag`` attached.

        ``tag`` is a byte or unicode string containing the tag name.

        Return the instance again.

        .. udevversion:: 154

        .. versionadded:: 0.6
        """
        self._libudev.udev_enumerate_add_match_tag(self, ensure_byte_string(tag))
        return self

    def match_is_initialized(self):
        """
        Include only devices, which are initialized.

        Initialized devices have properly set device node permissions and
        context, and are (in case of network devices) fully renamed.

        Currently this will not affect devices which do not have device nodes
        and are not network interfaces.

        Return the instance again.

        .. seealso:: :attr:`Device.is_initialized`

        .. udevversion:: 165

        .. versionadded:: 0.8
        """
        self._libudev.udev_enumerate_add_match_is_initialized(self)
        return self

    def match_parent(self, parent):
        """
        Include all devices on the subtree of the given ``parent`` device.

        The ``parent`` device itself is also included.

        ``parent`` is a :class:`~pyudev.Device`.

        Return the instance again.

        .. udevversion:: 172

        .. versionadded:: 0.13
        """
        self._libudev.udev_enumerate_add_match_parent(self, parent)
        return self

    def __iter__(self):
        """
        Iterate over all matching devices.

        Yield :class:`Device` objects.
        """
        self._libudev.udev_enumerate_scan_devices(self)
        entry = self._libudev.udev_enumerate_get_list_entry(self)
        for name, _ in udev_list_iterate(self._libudev, entry):
            try:
                yield Devices.from_sys_path(self.context, name)
            except DeviceNotFoundAtPathError:
                continue
site-packages/pyudev/__init__.py000064400000004676147511334620012734 0ustar00# -*- coding: utf-8 -*-
# Copyright (C) 2010, 2011, 2012 Sebastian Wiesner <lunaryorn@gmail.com>

# This library is free software; you can redistribute it and/or modify it
# under the terms of the GNU Lesser General Public License as published by the
# Free Software Foundation; either version 2.1 of the License, or (at your
# option) any later version.

# This library is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or
# FITNESS FOR A PARTICULAR PURPOSE.  See the GNU Lesser General Public License
# for more details.

# You should have received a copy of the GNU Lesser General Public License
# along with this library; if not, write to the Free Software Foundation,
# Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110-1301 USA


"""
    pyudev
    ======

    A binding to libudev.

    The :class:`Context` provides the connection to the udev device database
    and enumerates devices.  Individual devices are represented by the
    :class:`Device` class.

    Device monitoring is provided by :class:`Monitor` and
    :class:`MonitorObserver`.  With :mod:`pyudev.pyqt4`, :mod:`pyudev.pyside`,
    :mod:`pyudev.glib` and :mod:`pyudev.wx` device monitoring can be integrated
    into the event loop of various GUI toolkits.

    .. moduleauthor::  Sebastian Wiesner  <lunaryorn@gmail.com>
"""

from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
from __future__ import unicode_literals


from pyudev.device import Attributes
from pyudev.device import Device
from pyudev.device import Devices
from pyudev.device import DeviceNotFoundAtPathError
from pyudev.device import DeviceNotFoundByFileError
from pyudev.device import DeviceNotFoundByNameError
from pyudev.device import DeviceNotFoundByNumberError
from pyudev.device import DeviceNotFoundError
from pyudev.device import DeviceNotFoundInEnvironmentError
from pyudev.device import Tags

from pyudev.discover import DeviceFileHypothesis
from pyudev.discover import DeviceNameHypothesis
from pyudev.discover import DeviceNumberHypothesis
from pyudev.discover import DevicePathHypothesis
from pyudev.discover import Discovery

from pyudev.core import Context
from pyudev.core import Enumerator

from pyudev.monitor import Monitor
from pyudev.monitor import MonitorObserver

from pyudev.version import __version__
from pyudev.version import __version_info__

from pyudev._util import udev_version
site-packages/pyudev/monitor.py000064400000050730147511334620012654 0ustar00# -*- coding: utf-8 -*-
# Copyright (C) 2010, 2011, 2012, 2013 Sebastian Wiesner <lunaryorn@gmail.com>

# This library is free software; you can redistribute it and/or modify it
# under the terms of the GNU Lesser General Public License as published by the
# Free Software Foundation; either version 2.1 of the License, or (at your
# option) any later version.

# This library is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or
# FITNESS FOR A PARTICULAR PURPOSE.  See the GNU Lesser General Public License
# for more details.

# You should have received a copy of the GNU Lesser General Public License
# along with this library; if not, write to the Free Software Foundation,
# Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110-1301 USA


"""
    pyudev.monitor
    ==============

    Monitor implementation.

    .. moduleauthor::  Sebastian Wiesner  <lunaryorn@gmail.com>
"""


from __future__ import (print_function, division, unicode_literals,
                        absolute_import)

import os
import errno
from threading import Thread
from functools import partial

from pyudev.device import Device

from pyudev._util import eintr_retry_call
from pyudev._util import ensure_byte_string

from pyudev._os import pipe
from pyudev._os import poll


class Monitor(object):
    """
    A synchronous device event monitor.

    A :class:`Monitor` object connects to the udev daemon and listens for
    changes to the device list.  A monitor is created by connecting to the
    kernel daemon through netlink (see :meth:`from_netlink`):

    >>> from pyudev import Context, Monitor
    >>> context = Context()
    >>> monitor = Monitor.from_netlink(context)

    Once the monitor is created, you can add a filter using :meth:`filter_by()`
    or :meth:`filter_by_tag()` to drop incoming events in subsystems, which are
    not of interest to the application:

    >>> monitor.filter_by('input')

    When the monitor is eventually set up, you can either poll for events
    synchronously:

    >>> device = monitor.poll(timeout=3)
    >>> if device:
    ...     print('{0.action}: {0}'.format(device))
    ...

    Or you can monitor events asynchronously with :class:`MonitorObserver`.

    To integrate into various event processing frameworks, the monitor provides
    a :func:`selectable <select.select>` file descriptor by :meth:`fileno()`.
    However, do *not* read or write directly on this file descriptor.

    Instances of this class can directly be given as ``udev_monitor *`` to
    functions wrapped through :mod:`ctypes`.

    .. versionchanged:: 0.16
       Remove :meth:`from_socket()` which is deprecated, and even removed in
       recent udev versions.
    """

    def __init__(self, context, monitor_p):
        self.context = context
        self._as_parameter_ = monitor_p
        self._libudev = context._libudev
        self._started = False

    def __del__(self):
        self._libudev.udev_monitor_unref(self)

    @classmethod
    def from_netlink(cls, context, source='udev'):
        """
        Create a monitor by connecting to the kernel daemon through netlink.

        ``context`` is the :class:`Context` to use.  ``source`` is a string
        describing the event source.  Two sources are available:

        ``'udev'`` (the default)
          Events emitted after udev has registered and configured the device.
          This is the recommended source for applications.

        ``'kernel'``
          Events emitted directly after the kernel has seen the device.  The
          device has not yet been configured by udev and might not be usable
          at all.  **Never** use this, unless you know what you are doing.

        Return a new :class:`Monitor` object, which is connected to the
        given source.  Raise :exc:`~exceptions.ValueError`, if an invalid
        source has been specified.  Raise
        :exc:`~exceptions.EnvironmentError`, if the creation of the monitor
        failed.
        """
        if source not in ('kernel', 'udev'):
            raise ValueError('Invalid source: {0!r}. Must be one of "udev" '
                             'or "kernel"'.format(source))
        monitor = context._libudev.udev_monitor_new_from_netlink(
            context, ensure_byte_string(source))
        if not monitor:
            raise EnvironmentError('Could not create udev monitor')
        return cls(context, monitor)

    @property
    def started(self):
        """
        ``True``, if this monitor was started, ``False`` otherwise. Readonly.

        .. seealso:: :meth:`start()`
        .. versionadded:: 0.16
        """
        return self._started

    def fileno(self):
        # pylint: disable=anomalous-backslash-in-string
        """
        Return the file descriptor associated with this monitor as integer.

        This is really a real file descriptor ;), which can be watched and
        :func:`select.select`\ ed.
        """
        return self._libudev.udev_monitor_get_fd(self)

    def filter_by(self, subsystem, device_type=None):
        """
        Filter incoming events.

        ``subsystem`` is a byte or unicode string with the name of a
        subsystem (e.g. ``'input'``).  Only events originating from the
        given subsystem pass the filter and are handed to the caller.

        If given, ``device_type`` is a byte or unicode string specifying the
        device type.  Only devices with the given device type are propagated
        to the caller.  If ``device_type`` is not given, no additional
        filter for a specific device type is installed.

        These filters are executed inside the kernel, and client processes
        will usually not be woken up for devices that do not match these
        filters.

        .. versionchanged:: 0.15
           This method can also be called after :meth:`start()` now.
        """
        subsystem = ensure_byte_string(subsystem)
        if device_type is not None:
            device_type = ensure_byte_string(device_type)
        self._libudev.udev_monitor_filter_add_match_subsystem_devtype(
            self, subsystem, device_type)
        self._libudev.udev_monitor_filter_update(self)

    def filter_by_tag(self, tag):
        """
        Filter incoming events by the given ``tag``.

        ``tag`` is a byte or unicode string with the name of a tag.  Only
        events for devices which have this tag attached pass the filter and are
        handed to the caller.

        Like with :meth:`filter_by` this filter is also executed inside the
        kernel, so that client processes are usually not woken up for devices
        without the given ``tag``.

        .. udevversion:: 154

        .. versionadded:: 0.9

        .. versionchanged:: 0.15
           This method can also be called after :meth:`start()` now.
        """
        self._libudev.udev_monitor_filter_add_match_tag(
            self, ensure_byte_string(tag))
        self._libudev.udev_monitor_filter_update(self)
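
    # Editor's illustration (hypothetical usage, not part of the original
    # source): tag filtering works the same way as the subsystem filter above
    # and is likewise applied inside the kernel, e.g.
    #
    #   >>> monitor.filter_by_tag('systemd')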

    def remove_filter(self):
        """
        Remove any filters installed with :meth:`filter_by()` or
        :meth:`filter_by_tag()` from this monitor.

        .. warning::

           Up to udev 181 (and possibly even later versions) the underlying
           ``udev_monitor_filter_remove()`` seems to be broken.  If used with
           affected versions this method always raises
           :exc:`~exceptions.ValueError`.

        Raise :exc:`~exceptions.EnvironmentError` if removal of installed
        filters failed.

        .. versionadded:: 0.15
        """
        self._libudev.udev_monitor_filter_remove(self)
        self._libudev.udev_monitor_filter_update(self)

    def enable_receiving(self):
        """
        Switch the monitor into listening mode.

        Connect to the event source and receive incoming events.  Only after
        calling this method does the monitor listen for incoming events.

        .. note::

           This method is implicitly called by :meth:`__iter__`.  You don't
           need to call it explicitly, if you are iterating over the
           monitor.

        .. deprecated:: 0.16
           Will be removed in 1.0. Use :meth:`start()` instead.
        """
        import warnings
        warnings.warn('Will be removed in 1.0. Use Monitor.start() instead.',
                      DeprecationWarning)
        self.start()

    def start(self):
        """
        Start this monitor.

        The monitor will not receive events until this method is called. This
        method does nothing if called on an already started :class:`Monitor`.

        .. note::

           Typically you don't need to call this method. It is implicitly
           called by :meth:`poll()` and :meth:`__iter__()`.

        .. seealso:: :attr:`started`
        .. versionchanged:: 0.16
           This method does nothing if the :class:`Monitor` was already
           started.
        """
        if not self._started:
            self._libudev.udev_monitor_enable_receiving(self)
            # Force monitor FD into non-blocking mode
            pipe.set_fd_status_flag(self, os.O_NONBLOCK)
            self._started = True

    def set_receive_buffer_size(self, size):
        """
        Set the receive buffer ``size``.

        ``size`` is the requested buffer size in bytes, as integer.

        .. note::

           The CAP_NET_ADMIN capability must be contained in the effective
           capability set of the caller for this method to succeed.  Otherwise
           :exc:`~exceptions.EnvironmentError` will be raised, with ``errno``
           set to :data:`~errno.EPERM`.  Unprivileged processes typically lack
           this capability.  You can check the capabilities of the current
           process with the python-prctl_ module:

           >>> import prctl
           >>> prctl.cap_effective.net_admin

        Raise :exc:`~exceptions.EnvironmentError`, if the buffer size could not
        be set.

        .. versionadded:: 0.13

        .. _python-prctl: http://packages.python.org/python-prctl
        """
        self._libudev.udev_monitor_set_receive_buffer_size(self, size)

    def _receive_device(self):
        """Receive a single device from the monitor.

        Return the received :class:`Device`, or ``None`` if no device could be
        received.

        """
        while True:
            try:
                device_p = self._libudev.udev_monitor_receive_device(self)
                return Device(self.context, device_p) if device_p else None
            except EnvironmentError as error:
                if error.errno in (errno.EAGAIN, errno.EWOULDBLOCK):
                    # No data available
                    return None
                elif error.errno == errno.EINTR:
                    # Try again if our system call was interrupted
                    continue
                else:
                    raise

    def poll(self, timeout=None):
        """
        Poll for a device event.

        You can use this method together with :func:`iter()` to synchronously
        monitor events in the current thread::

           for device in iter(monitor.poll, None):
               print('{0.action} on {0.device_path}'.format(device))

        Since this method will never return ``None`` if no ``timeout`` is
        specified, this is effectively an endless loop. With
        :func:`functools.partial()` you can also create a loop that only waits
        for a specified time::

           for device in iter(partial(monitor.poll, 3), None):
               print('{0.action} on {0.device_path}'.format(device))

        This loop will only wait three seconds for a new device event. If no
        device event occurred after three seconds, the loop will exit.

        ``timeout`` is a floating point number that specifies a time-out in
        seconds. If omitted or ``None``, this method blocks until a device
        event is available. If ``0``, this method just polls and will never
        block.

        .. note::

           This method implicitly calls :meth:`start()`.

        Return the received :class:`Device`, or ``None`` if a timeout
        occurred. Raise :exc:`~exceptions.EnvironmentError` if event retrieval
        failed.

        .. seealso::

           :attr:`Device.action`
              The action that created this event.

           :attr:`Device.sequence_number`
              The sequence number of this event.

        .. versionadded:: 0.16
        """
        if timeout is not None and timeout > 0:
            # .poll() takes timeout in milliseconds
            timeout = int(timeout * 1000)
        self.start()
        if eintr_retry_call(poll.Poll.for_events((self, 'r')).poll, timeout):
            return self._receive_device()
        else:
            return None

    def receive_device(self):
        """
        Receive a single device from the monitor.

        .. warning::

           You *must* call :meth:`start()` before calling this method.

        The caller must make sure that there are events available in the
        event queue.  The call blocks until a device is available.

        If a device was available, return ``(action, device)``.  ``device``
        is the :class:`Device` object describing the device.  ``action`` is
        a string describing the action.  Usual actions are:

        ``'add'``
          A device has been added (e.g. a USB device was plugged in)
        ``'remove'``
          A device has been removed (e.g. a USB device was unplugged)
        ``'change'``
          Something about the device changed (e.g. a device property)
        ``'online'``
          The device is online now
        ``'offline'``
          The device is offline now

        Raise :exc:`~exceptions.EnvironmentError`, if no device could be
        read.

        .. deprecated:: 0.16
           Will be removed in 1.0. Use :meth:`Monitor.poll()` instead.
        """
        import warnings
        warnings.warn('Will be removed in 1.0. Use Monitor.poll() instead.',
                      DeprecationWarning)
        device = self.poll()
        return device.action, device

    def __iter__(self):
        """
        Wait for incoming events and receive them upon arrival.

        This method implicitly calls :meth:`start()`, and starts polling the
        :meth:`fileno` of this monitor.  If an event comes in, it receives the
        corresponding device and yields it to the caller.

        The returned iterator is endless, and continues receiving devices
        without ever stopping.

        Yields ``(action, device)`` (see :meth:`receive_device` for a
        description).

        .. deprecated:: 0.16
           Will be removed in 1.0. Use an explicit loop over :meth:`poll()`
           instead, or monitor asynchronously with :class:`MonitorObserver`.
        """
        import warnings
        warnings.warn('Will be removed in 1.0. Use an explicit loop over '
                      '"poll()" instead, or monitor asynchronously with '
                      '"MonitorObserver".', DeprecationWarning)
        self.start()
        while True:
            device = self.poll()
            if device is not None:
                yield device.action, device
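
# ---------------------------------------------------------------------------
# Illustrative sketch (editor's addition, not part of the pyudev API): the
# docstrings above describe the synchronous workflow -- create a monitor from
# a Context, install a filter, then poll in a loop.  The helper below shows
# that flow end to end; the 'block' subsystem and the 3-second timeout are
# arbitrary example values, and the function name is purely hypothetical.
def _example_synchronous_monitoring():  # pragma: no cover
    """Poll for block device events until none arrives for 3 seconds."""
    from pyudev import Context

    context = Context()
    monitor = Monitor.from_netlink(context)
    monitor.filter_by('block')
    # poll() implicitly calls start() and returns None on timeout.
    for device in iter(partial(monitor.poll, 3), None):
        print('{0.action} on {0.device_path}'.format(device))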


class MonitorObserver(Thread):
    """
    An asynchronous observer for device events.

    This class subclasses :class:`~threading.Thread` class to asynchronously
    observe a :class:`Monitor` in a background thread:

    >>> from pyudev import Context, Monitor, MonitorObserver
    >>> context = Context()
    >>> monitor = Monitor.from_netlink(context)
    >>> monitor.filter_by(subsystem='input')
    >>> def print_device_event(device):
    ...     print('background event {0.action}: {0.device_path}'.format(device))
    >>> observer = MonitorObserver(monitor, callback=print_device_event, name='monitor-observer')
    >>> observer.daemon
    True
    >>> observer.start()

    In the above example, input device events will be printed in the
    background until :meth:`stop()` is called on ``observer``.

    .. note::

       Instances of this class are always created as daemon thread.  If you do
       not want to use daemon threads for monitoring, you need to explicitly set
       :attr:`~threading.Thread.daemon` to ``False`` before invoking
       :meth:`~threading.Thread.start()`.

    .. seealso::

       :attr:`Device.action`
          The action that created this event.

       :attr:`Device.sequence_number`
          The sequence number of this event.

    .. versionadded:: 0.14

    .. versionchanged:: 0.15
       :meth:`Monitor.start()` is implicitly called when the thread is started.
    """

    def __init__(self, monitor, event_handler=None, callback=None, *args,
                 **kwargs):
        """
        Create a new observer for the given ``monitor``.

        ``monitor`` is the :class:`Monitor` to observe. ``callback`` is the
        callable to invoke on events, with the signature ``callback(device)``
        where ``device`` is the :class:`Device` that caused the event.

        .. warning::

           ``callback`` is invoked in the observer thread, hence the observer
           is blocked while callback executes.

        ``args`` and ``kwargs`` are passed unchanged to the constructor of
        :class:`~threading.Thread`.

        .. deprecated:: 0.16
           The ``event_handler`` argument will be removed in 1.0. Use
           the ``callback`` argument instead.
        .. versionchanged:: 0.16
           Add ``callback`` argument.
        """
        if callback is None and event_handler is None:
            raise ValueError('callback missing')
        elif callback is not None and event_handler is not None:
            raise ValueError('Use either callback or event handler')

        Thread.__init__(self, *args, **kwargs)
        self.monitor = monitor
        # observer threads should not keep the interpreter alive
        self.daemon = True
        self._stop_event = None
        if event_handler is not None:
            import warnings
            warnings.warn('"event_handler" argument will be removed in 1.0. '
                          'Use Monitor.poll() instead.', DeprecationWarning)
            callback = lambda d: event_handler(d.action, d)
        self._callback = callback

    def start(self):
        """Start the observer thread."""
        if not self.is_alive():
            self._stop_event = pipe.Pipe.open()
        Thread.start(self)

    def run(self):
        self.monitor.start()
        notifier = poll.Poll.for_events(
            (self.monitor, 'r'), (self._stop_event.source, 'r'))
        while True:
            for file_descriptor, event in eintr_retry_call(notifier.poll):
                if file_descriptor == self._stop_event.source.fileno():
                    # in case of a stop event, close our pipe side, and
                    # return from the thread
                    self._stop_event.source.close()
                    return
                elif file_descriptor == self.monitor.fileno() and event == 'r':
                    read_device = partial(eintr_retry_call, self.monitor.poll, timeout=0)
                    for device in iter(read_device, None):
                        self._callback(device)
                else:
                    raise EnvironmentError('Observed monitor hung up')

    def send_stop(self):
        """
        Send a stop signal to the background thread.

        The background thread will eventually exit, but it may still be running
        when this method returns.  This method is essentially the asynchronous
        equivalent to :meth:`stop()`.

        .. note::

           The underlying :attr:`monitor` is *not* stopped.
        """
        if self._stop_event is None:
            return
        with self._stop_event.sink:
            # emit a stop event to the thread
            eintr_retry_call(self._stop_event.sink.write, b'\x01')
            self._stop_event.sink.flush()

    def stop(self):
        """
        Synchronously stop the background thread.

        .. note::

           This method can safely be called from the observer thread. In this
           case it is equivalent to :meth:`send_stop()`.

        Send a stop signal to the background thread (see :meth:`send_stop`),
        and wait for the background thread to exit (see
        :meth:`~threading.Thread.join`) if the current thread is *not* the
        observer thread.

        After this method returns in a thread *that is not the observer
        thread*, the ``callback`` is guaranteed not to be invoked again.

        .. note::

           The underlying :attr:`monitor` is *not* stopped.

        .. versionchanged:: 0.16
           This method can be called from the observer thread.
        """
        self.send_stop()
        try:
            self.join()
        except RuntimeError:
            pass
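

# ---------------------------------------------------------------------------
# Illustrative sketch (editor's addition, not part of the pyudev API):
# MonitorObserver.run() above polls both the monitor and an internal stop
# pipe; stop() writes to that pipe and joins the thread.  The helper below
# shows the intended lifecycle.  The 'usb' subsystem, the callback body and
# the 10-second sleep are arbitrary example values; the function name is
# hypothetical.
def _example_observer_lifecycle():  # pragma: no cover
    """Observe USB events in a background thread for roughly ten seconds."""
    import time

    from pyudev import Context

    def on_event(device):
        print('{0.action}: {0.device_path}'.format(device))

    context = Context()
    monitor = Monitor.from_netlink(context)
    monitor.filter_by('usb')
    observer = MonitorObserver(monitor, callback=on_event)
    observer.start()   # spawns the daemon thread and starts the monitor
    time.sleep(10)     # ... the application does its own work here ...
    observer.stop()    # signals the stop pipe and joins the thread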
site-packages/pyudev/device/_errors.py
# -*- coding: utf-8 -*-
# Copyright (C) 2015 mulhern <amulhern@redhat.com>

# This library is free software; you can redistribute it and/or modify it
# under the terms of the GNU Lesser General Public License as published by the
# Free Software Foundation; either version 2.1 of the License, or (at your
# option) any later version.

# This library is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or
# FITNESS FOR A PARTICULAR PURPOSE.  See the GNU Lesser General Public License
# for more details.

# You should have received a copy of the GNU Lesser General Public License
# along with this library; if not, write to the Free Software Foundation,
# Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110-1301 USA

"""
    pyudev.device._errors
    =====================

    Errors raised by Device methods.

    .. moduleauthor:: Sebastian Wiesner <lunaryorn@gmail.com>
"""

from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
from __future__ import unicode_literals

import abc

from six import add_metaclass

@add_metaclass(abc.ABCMeta)
class DeviceError(Exception):
    """
    Any error raised when working with or trying to discover devices.
    """


@add_metaclass(abc.ABCMeta)
class DeviceNotFoundError(DeviceError):
    """
    An exception indicating that no :class:`Device` was found.

    .. versionchanged:: 0.5
       Rename from ``NoSuchDeviceError`` to its current name.
    """


class DeviceNotFoundAtPathError(DeviceNotFoundError):
    """
    A :exc:`DeviceNotFoundError` indicating that no :class:`Device` was
    found at a given path.
    """

    def __init__(self, sys_path):
        DeviceNotFoundError.__init__(self, sys_path)

    @property
    def sys_path(self):
        """
        The path that caused this error as string.
        """
        return self.args[0]

    def __str__(self):
        return 'No device at {0!r}'.format(self.sys_path)


class DeviceNotFoundByFileError(DeviceNotFoundError):
    """
    A :exc:`DeviceNotFoundError` indicating that no :class:`Device` was
    found from the given filename.
    """

class DeviceNotFoundByInterfaceIndexError(DeviceNotFoundError):
    """
    A :exc:`DeviceNotFoundError` indicating that no :class:`Device` was found
    from the given interface index.
    """

class DeviceNotFoundByKernelDeviceError(DeviceNotFoundError):
    """
    A :exc:`DeviceNotFoundError` indicating that no :class:`Device` was found
    from the given kernel device string.

    The format of the kernel device string is defined in the
    systemd.journal-fields man pages.
    """


class DeviceNotFoundByNameError(DeviceNotFoundError):
    """
    A :exc:`DeviceNotFoundError` indicating that no :class:`Device` was
    found with a given name.
    """

    def __init__(self, subsystem, sys_name):
        DeviceNotFoundError.__init__(self, subsystem, sys_name)

    @property
    def subsystem(self):
        """
        The subsystem that caused this error as string.
        """
        return self.args[0]

    @property
    def sys_name(self):
        """
        The sys name that caused this error as string.
        """
        return self.args[1]

    def __str__(self):
        return 'No device {0.sys_name!r} in {0.subsystem!r}'.format(self)


class DeviceNotFoundByNumberError(DeviceNotFoundError):
    """
    A :exc:`DeviceNotFoundError` indicating that no :class:`Device` was found
    for a given device number.
    """

    def __init__(self, typ, number):
        DeviceNotFoundError.__init__(self, typ, number)

    @property
    def device_type(self):
        """
        The device type causing this error as string.  Either ``'char'`` or
        ``'block'``.
        """
        return self.args[0]

    @property
    def device_number(self):
        """
        The device number causing this error as integer.
        """
        return self.args[1]

    def __str__(self):
        return ('No {0.device_type} device with number '
                '{0.device_number}'.format(self))


class DeviceNotFoundInEnvironmentError(DeviceNotFoundError):
    """
    A :exc:`DeviceNotFoundError` indicating that no :class:`Device` could
    be constructed from the process environment.
    """

    def __str__(self):
        return 'No device found in environment'


class DeviceValueError(DeviceError):
    """
    Raised when a parameter has an unacceptable value.

    May also be raised when the parameter has an unacceptable type.
    """

    _FMT_STR = "value '%s' for parameter %s is unacceptable"

    def __init__(self, value, param, msg=None):
        """ Initializer.

            :param object value: the value
            :param str param: the parameter
            :param str msg: an explanatory message
        """
        # pylint: disable=super-init-not-called
        self._value = value
        self._param = param
        self._msg = msg

    def __str__(self):
        if self._msg:
            fmt_str = self._FMT_STR + ": %s"
            return fmt_str % (self._value, self._param, self._msg)
        else:
            return self._FMT_STR % (self._value, self._param)
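

# ---------------------------------------------------------------------------
# Illustrative sketch (editor's addition, not part of the pyudev API):
# DeviceValueError builds its message from the stored value, the parameter
# name and the optional explanation.  The helper below merely demonstrates
# that formatting; its name and the example values are hypothetical.
def _example_device_value_error():  # pragma: no cover
    """Show how DeviceValueError renders with and without a message."""
    plain = DeviceValueError(-1, 'ifindex')
    assert str(plain) == "value '-1' for parameter ifindex is unacceptable"

    explained = DeviceValueError(-1, 'ifindex', 'must be non-negative')
    assert str(explained) == ("value '-1' for parameter ifindex is "
                              "unacceptable: must be non-negative")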
site-packages/pyudev/device/__init__.py
# -*- coding: utf-8 -*-
# Copyright (C) 2015 mulhern <amulhern@redhat.com>

# This library is free software; you can redistribute it and/or modify it
# under the terms of the GNU Lesser General Public License as published by the
# Free Software Foundation; either version 2.1 of the License, or (at your
# option) any later version.

# This library is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or
# FITNESS FOR A PARTICULAR PURPOSE.  See the GNU Lesser General Public License
# for more details.

# You should have received a copy of the GNU Lesser General Public License
# along with this library; if not, write to the Free Software Foundation,
# Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110-1301 USA

"""
    pyudev.device
    =============

    Device class implementation of :mod:`pyudev`.

    .. moduleauthor::  Sebastian Wiesner  <lunaryorn@gmail.com>
"""


from ._device import Attributes
from ._device import Device
from ._device import Devices
from ._device import Tags
from ._errors import DeviceNotFoundAtPathError
from ._errors import DeviceNotFoundByFileError
from ._errors import DeviceNotFoundByNameError
from ._errors import DeviceNotFoundByNumberError
from ._errors import DeviceNotFoundError
from ._errors import DeviceNotFoundInEnvironmentError
site-packages/pyudev/device/_device.py
# -*- coding: utf-8 -*-
# Copyright (C) 2011, 2012 Sebastian Wiesner <lunaryorn@gmail.com>

# This library is free software; you can redistribute it and/or modify it
# under the terms of the GNU Lesser General Public License as published by the
# Free Software Foundation; either version 2.1 of the License, or (at your
# option) any later version.

# This library is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or
# FITNESS FOR A PARTICULAR PURPOSE.  See the GNU Lesser General Public License
# for more details.

# You should have received a copy of the GNU Lesser General Public License
# along with this library; if not, write to the Free Software Foundation,
# Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110-1301 USA


"""
    pyudev.device._device
    =====================

    Device class implementation of :mod:`pyudev`.

    .. moduleauthor::  Sebastian Wiesner  <lunaryorn@gmail.com>
"""

from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
from __future__ import unicode_literals

import os
import re
from collections import Container
from collections import Iterable
from collections import Mapping
from datetime import timedelta

from pyudev.device._errors import DeviceNotFoundAtPathError
from pyudev.device._errors import DeviceNotFoundByFileError
from pyudev.device._errors import DeviceNotFoundByInterfaceIndexError
from pyudev.device._errors import DeviceNotFoundByKernelDeviceError
from pyudev.device._errors import DeviceNotFoundByNameError
from pyudev.device._errors import DeviceNotFoundByNumberError
from pyudev.device._errors import DeviceNotFoundInEnvironmentError
from pyudev._util import ensure_byte_string
from pyudev._util import ensure_unicode_string
from pyudev._util import get_device_type
from pyudev._util import string_to_bool
from pyudev._util import udev_list_iterate

# pylint: disable=too-many-lines

class Devices(object):
    """
    Class for constructing :class:`Device` objects from various kinds of data.
    """

    @classmethod
    def from_path(cls, context, path):
        """
        Create a device from a device ``path``.  The ``path`` may or may not
        start with the ``sysfs`` mount point:

        >>> from pyudev import Context, Device
        >>> context = Context()
        >>> Devices.from_path(context, '/devices/platform')
        Device(u'/sys/devices/platform')
        >>> Devices.from_path(context, '/sys/devices/platform')
        Device(u'/sys/devices/platform')

        ``context`` is the :class:`Context` in which to search the device.
        ``path`` is a device path as unicode or byte string.

        Return a :class:`Device` object for the device.  Raise
        :exc:`DeviceNotFoundAtPathError`, if no device was found for ``path``.

        .. versionadded:: 0.18
        """
        if not path.startswith(context.sys_path):
            path = os.path.join(context.sys_path, path.lstrip(os.sep))
        return cls.from_sys_path(context, path)

    @classmethod
    def from_sys_path(cls, context, sys_path):
        """
        Create a new device from a given ``sys_path``:

        >>> from pyudev import Context, Device
        >>> context = Context()
        >>> Devices.from_sys_path(context, '/sys/devices/platform')
        Device(u'/sys/devices/platform')

        ``context`` is the :class:`Context` in which to search the device.
        ``sys_path`` is a unicode or byte string containing the path of the
        device inside ``sysfs`` with the mount point included.

        Return a :class:`Device` object for the device.  Raise
        :exc:`DeviceNotFoundAtPathError`, if no device was found for
        ``sys_path``.

        .. versionadded:: 0.18
        """
        device = context._libudev.udev_device_new_from_syspath(
            context, ensure_byte_string(sys_path))
        if not device:
            raise DeviceNotFoundAtPathError(sys_path)
        return Device(context, device)

    @classmethod
    def from_name(cls, context, subsystem, sys_name):
        """
        Create a new device from a given ``subsystem`` and a given
        ``sys_name``:

        >>> from pyudev import Context, Device
        >>> context = Context()
        >>> sda = Devices.from_name(context, 'block', 'sda')
        >>> sda
        Device(u'/sys/devices/pci0000:00/0000:00:1f.2/host0/target0:0:0/0:0:0:0/block/sda')
        >>> sda == Devices.from_path(context, '/block/sda')

        ``context`` is the :class:`Context` in which to search the device.
        ``subsystem`` and ``sys_name`` are byte or unicode strings, which
        denote the subsystem and the name of the device to create.

        Return a :class:`Device` object for the device.  Raise
        :exc:`DeviceNotFoundByNameError`, if no device was found with the given
        name.

        .. versionadded:: 0.18
        """
        sys_name = sys_name.replace("/", "!")
        device = context._libudev.udev_device_new_from_subsystem_sysname(
            context, ensure_byte_string(subsystem),
            ensure_byte_string(sys_name))
        if not device:
            raise DeviceNotFoundByNameError(subsystem, sys_name)
        return Device(context, device)

    @classmethod
    def from_device_number(cls, context, typ, number):
        """
        Create a new device from a device ``number`` with the given device
        ``type``:

        >>> import os
        >>> from pyudev import Context, Device
        >>> context = Context()
        >>> major, minor = 8, 0
        >>> device = Devices.from_device_number(context, 'block',
        ...     os.makedev(major, minor))
        >>> device
        Device(u'/sys/devices/pci0000:00/0000:00:11.0/host0/target0:0:0/0:0:0:0/block/sda')
        >>> os.major(device.device_number), os.minor(device.device_number)
        (8, 0)

        Use :func:`os.makedev` to construct a device number from a major and a
        minor device number, as shown in the example above.

        .. warning::

           Device numbers are not unique across different device types.
           Passing a correct number with a wrong type may silently yield a
           wrong device object, so make sure to pass the correct device type.

        ``context`` is the :class:`Context`, in which to search the device.
        ``type`` is either ``'char'`` or ``'block'``, according to whether the
        device is a character or block device.  ``number`` is the device number
        as integer.

        Return a :class:`Device` object for the device with the given device
        ``number``.  Raise :exc:`DeviceNotFoundByNumberError`, if no device was
        found with the given device type and number.

        .. versionadded:: 0.18
        """
        device = context._libudev.udev_device_new_from_devnum(
            context, ensure_byte_string(typ[0]), number)
        if not device:
            raise DeviceNotFoundByNumberError(typ, number)
        return Device(context, device)

    @classmethod
    def from_device_file(cls, context, filename):
        """
        Create a new device from the given device file:

        >>> from pyudev import Context, Device
        >>> context = Context()
        >>> device = Devices.from_device_file(context, '/dev/sda')
        >>> device
        Device(u'/sys/devices/pci0000:00/0000:00:0d.0/host2/target2:0:0/2:0:0:0/block/sda')
        >>> device.device_node
        u'/dev/sda'

        .. warning::

           Though the example seems to suggest that ``device.device_node ==
           filename`` holds with ``device = Devices.from_device_file(context,
           filename)``, this is only true in a majority of cases.  There *can*
           be devices for which this relation is actually false!  Thus, do
           *not* expect :attr:`~Device.device_node` to be equal to the given
           ``filename`` for the returned :class:`Device`.  Especially, use
           :attr:`~Device.device_node` if you need the device file of a
           :class:`Device` created with this method afterwards.

        ``context`` is the :class:`Context` in which to search the device.
        ``filename`` is a string containing the path of a device file.

        Return a :class:`Device` representing the given device file.  Raise
        :exc:`DeviceNotFoundByFileError` if ``filename`` is no device file
        at all or if ``filename`` does not exist or if its metadata was
        inaccessible.

        .. versionadded:: 0.18
        """
        try:
            device_type = get_device_type(filename)
            device_number = os.stat(filename).st_rdev
        except (EnvironmentError, ValueError) as err:
            raise DeviceNotFoundByFileError(err)

        return cls.from_device_number(context, device_type, device_number)


    @classmethod
    def from_interface_index(cls, context, ifindex):
        """
        Locate a device based on the interface index.

        :param `Context` context: the libudev context
        :param int ifindex: the interface index
        :returns: the device corresponding to the interface index
        :rtype: `Device`

        This method is only appropriate for network devices.
        """
        network_devices = context.list_devices(subsystem='net')
        dev = next(
           (d for d in network_devices if \
              d.attributes.get('ifindex') == ifindex),
           None
        )
        if dev is not None:
            return dev
        else:
            raise DeviceNotFoundByInterfaceIndexError(ifindex)


    @classmethod
    def from_kernel_device(cls, context, kernel_device):
        """
        Locate a device based on the kernel device.

        :param `Context` context: the libudev context
        :param str kernel_device: the kernel device
        :returns: the device corresponding to ``kernel_device``
        :rtype: `Device`
        """
        switch_char = kernel_device[0]
        rest = kernel_device[1:]
        if switch_char in ('b', 'c'):
            number_re = re.compile(r'^(?P<major>\d+):(?P<minor>\d+)$')
            match = number_re.match(rest)
            if match:
                number = os.makedev(
                   int(match.group('major')),
                   int(match.group('minor'))
                )
                return cls.from_device_number(context, switch_char, number)
            else:
                raise DeviceNotFoundByKernelDeviceError(kernel_device)
        elif switch_char == 'n':
            return cls.from_interface_index(context, rest)
        elif switch_char == '+':
            (subsystem, _, kernel_device_name) = rest.partition(':')
            if kernel_device_name and subsystem:
                return cls.from_name(context, subsystem, kernel_device_name)
            else:
                raise DeviceNotFoundByKernelDeviceError(kernel_device)
        else:
            raise DeviceNotFoundByKernelDeviceError(kernel_device)
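
    # Editor's note (hypothetical example strings, not taken from the man
    # page itself): the kernel device string parsed above encodes the lookup
    # type in its first character, for instance
    #
    #   'b8:0'               -> block device with major 8, minor 0
    #   'c1:3'               -> character device with major 1, minor 3
    #   'n2'                 -> network device with interface index 2
    #   '+pci:0000:00:1f.2'  -> device '0000:00:1f.2' in the 'pci' subsystem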


    @classmethod
    def from_environment(cls, context):
        """
        Create a new device from the process environment (as in
        :data:`os.environ`).

        This only works reliably if the current process is called from a
        udev rule, and is usually used for tools executed from ``IMPORT=``
        rules.  Use this method to create device objects in Python scripts
        called from udev rules.

        ``context`` is the library :class:`Context`.

        Return a :class:`Device` object constructed from the environment.
        Raise :exc:`DeviceNotFoundInEnvironmentError`, if no device could be
        created from the environment.

        .. udevversion:: 152

        .. versionadded:: 0.18
        """
        device = context._libudev.udev_device_new_from_environment(context)
        if not device:
            raise DeviceNotFoundInEnvironmentError()
        return Device(context, device)

    @classmethod
    def METHODS(cls): # pylint: disable=invalid-name
        """
        Return methods that obtain a :class:`Device` from a variety of
        different data.

        :return: a list of from_* methods.
        :rtype: list of class methods

        .. versionadded:: 0.18
        """
        return [ #pragma: no cover
           cls.from_device_file,
           cls.from_device_number,
           cls.from_name,
           cls.from_path,
           cls.from_sys_path
        ]
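

# ---------------------------------------------------------------------------
# Illustrative sketch (editor's addition, not part of the pyudev API): the
# alternative constructors above typically resolve to the same underlying
# device.  The helper below contrasts four lookups for one block device;
# '/dev/sda', 'sda', major/minor 8:0 and 'b8:0' are arbitrary example
# identifiers that need not exist on a given machine, and the function name
# is hypothetical.
def _example_device_lookups():  # pragma: no cover
    """Look up the same block device through several Devices constructors."""
    from pyudev import Context

    context = Context()
    by_file = Devices.from_device_file(context, '/dev/sda')
    by_name = Devices.from_name(context, 'block', 'sda')
    by_number = Devices.from_device_number(context, 'block', os.makedev(8, 0))
    by_kernel = Devices.from_kernel_device(context, 'b8:0')
    assert by_file == by_name == by_number == by_kernel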


class Device(Mapping):
    # pylint: disable=too-many-public-methods
    """
    A single device with attached attributes and properties.

    This class subclasses the ``Mapping`` ABC, providing a read-only
    dictionary mapping property names to the corresponding values.
    Therefore all well-known dictionary methods and operators
    (e.g. ``.keys()``, ``.items()``, ``in``) are available to access device
    properties.

    Aside from the properties, a device also has a set of udev-specific
    attributes like the path inside ``sysfs``.

    :class:`Device` objects compare equal and unequal to other devices and
    to strings (based on :attr:`device_path`).  However, there is no
    ordering on :class:`Device` objects, and the corresponding operators
    ``>``, ``<``, ``<=`` and ``>=`` raise :exc:`~exceptions.TypeError`.

    .. warning::

       **Never** use object identity (``is`` operator) to compare
       :class:`Device` objects.  :mod:`pyudev` may create multiple
       :class:`Device` objects for the same device.  Instead compare
       devices by value using ``==`` or ``!=``.

    :class:`Device` objects are hashable and can therefore be used as keys
    in dictionaries and sets.

    They can also be given directly as ``udev_device *`` to functions wrapped
    through :mod:`ctypes`.
    """

    @classmethod
    def from_path(cls, context, path): #pragma: no cover
        """
        .. versionadded:: 0.4
        .. deprecated:: 0.18
           Use :class:`Devices.from_path` instead.
        """
        import warnings
        warnings.warn(
           'Will be removed in 1.0. Use equivalent Devices method instead.',
           DeprecationWarning,
           stacklevel=2
        )
        return Devices.from_path(context, path)

    @classmethod
    def from_sys_path(cls, context, sys_path): #pragma: no cover
        """
        .. versionchanged:: 0.4
           Raise :exc:`NoSuchDeviceError` instead of returning ``None``, if
           no device was found for ``sys_path``.
        .. versionchanged:: 0.5
           Raise :exc:`DeviceNotFoundAtPathError` instead of
           :exc:`NoSuchDeviceError`.
        .. deprecated:: 0.18
           Use :class:`Devices.from_sys_path` instead.
        """
        import warnings
        warnings.warn(
           'Will be removed in 1.0. Use equivalent Devices method instead.',
           DeprecationWarning,
           stacklevel=2
        )
        return Devices.from_sys_path(context, sys_path)

    @classmethod
    def from_name(cls, context, subsystem, sys_name): #pragma: no cover
        """
        .. versionadded:: 0.5
        .. deprecated:: 0.18
           Use :class:`Devices.from_name` instead.
        """
        import warnings
        warnings.warn(
           'Will be removed in 1.0. Use equivalent Devices method instead.',
           DeprecationWarning,
           stacklevel=2
        )
        return Devices.from_name(context, subsystem, sys_name)

    @classmethod
    def from_device_number(cls, context, typ, number): #pragma: no cover
        """
        .. versionadded:: 0.11
        .. deprecated:: 0.18
           Use :class:`Devices.from_device_number` instead.
        """
        import warnings
        warnings.warn(
           'Will be removed in 1.0. Use equivalent Devices method instead.',
           DeprecationWarning,
           stacklevel=2
        )
        return Devices.from_device_number(context, typ, number)

    @classmethod
    def from_device_file(cls, context, filename): #pragma: no cover
        """
        .. versionadded:: 0.15
        .. deprecated:: 0.18
           Use :class:`Devices.from_device_file` instead.
        """
        import warnings
        warnings.warn(
           'Will be removed in 1.0. Use equivalent Devices method instead.',
           DeprecationWarning,
           stacklevel=2
        )
        return Devices.from_device_file(context, filename)

    @classmethod
    def from_environment(cls, context): #pragma: no cover
        """
        .. versionadded:: 0.6
        .. deprecated:: 0.18
           Use :class:`Devices.from_environment` instead.
        """
        import warnings
        warnings.warn(
           'Will be removed in 1.0. Use equivalent Devices method instead.',
           DeprecationWarning,
           stacklevel=2
        )
        return Devices.from_environment(context)

    def __init__(self, context, _device):
        self.context = context
        self._as_parameter_ = _device
        self._libudev = context._libudev

    def __del__(self):
        self._libudev.udev_device_unref(self)

    def __repr__(self):
        return 'Device({0.sys_path!r})'.format(self)

    @property
    def parent(self):
        """
        The parent :class:`Device` or ``None``, if there is no parent
        device.
        """
        parent = self._libudev.udev_device_get_parent(self)
        if not parent:
            return None
        # the parent device is not referenced, thus forcibly acquire a
        # reference
        return Device(self.context, self._libudev.udev_device_ref(parent))

    @property
    def children(self):
        """
        Yield all direct children of this device.

        .. note::

           In udev, parent-child relationships are generally ambiguous, i.e.
           a parent can have multiple children, *and* a child can have multiple
           parents. Hence, `child.parent == parent` does generally *not* hold
           for all `child` objects in `parent.children`. In other words,
           the :attr:`parent` of a device in this property can be different
           from this device!

        .. note::

           As the underlying library does not provide any means to directly
           query the children of a device, this property performs a linear
           search through all devices.

        Return an iterable yielding a :class:`Device` object for each direct
        child of this device.

        .. udevversion:: 172

        .. versionchanged:: 0.13
           Requires udev version 172 now.
        """
        for device in self.context.list_devices().match_parent(self):
            if device != self:
                yield device

    @property
    def ancestors(self):
        """
        Yield all ancestors of this device from bottom to top.

        Return an iterator yielding a :class:`Device` object for each
        ancestor of this device from bottom to top.

        .. versionadded:: 0.16
        """
        parent = self.parent
        while parent is not None:
            yield parent
            parent = parent.parent

    def find_parent(self, subsystem, device_type=None):
        """
        Find the parent device with the given ``subsystem`` and
        ``device_type``.

        ``subsystem`` is a byte or unicode string containing the name of the
        subsystem, in which to search for the parent.  ``device_type`` is a
        byte or unicode string holding the expected device type of the parent.
        It can be ``None`` (the default), which means that no specific device
        type is expected.

        Return a parent :class:`Device` within the given ``subsystem`` and, if
        ``device_type`` is not ``None``, with the given ``device_type``, or
        ``None``, if this device has no parent device matching these
        constraints.

        .. versionadded:: 0.9
        """
        subsystem = ensure_byte_string(subsystem)
        if device_type is not None:
            device_type = ensure_byte_string(device_type)
        parent = self._libudev.udev_device_get_parent_with_subsystem_devtype(
            self, subsystem, device_type)
        if not parent:
            return None
        # parent device is not referenced, thus forcibly acquire a reference
        return Device(self.context, self._libudev.udev_device_ref(parent))
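
    # Editor's illustration (hypothetical device paths): find_parent() walks
    # up the ancestor chain and returns the first device matching the given
    # subsystem (and device type), e.g. from a partition to its disk or to
    # the USB device it is attached to:
    #
    #   >>> partition = Devices.from_name(context, 'block', 'sda1')
    #   >>> partition.find_parent('block', 'disk')
    #   Device(u'/sys/devices/.../block/sda')
    #   >>> partition.find_parent('usb', 'usb_device')
    #   Device(u'/sys/devices/.../usb1/1-1')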

    def traverse(self):
        """
        Traverse all parent devices of this device from bottom to top.

        Return an iterable yielding all parent devices as :class:`Device`
        objects, *not* including the current device.  The last yielded
        :class:`Device` is the top of the device hierarchy.

        .. deprecated:: 0.16
           Will be removed in 1.0. Use :attr:`ancestors` instead.
        """
        import warnings
        warnings.warn(
           'Will be removed in 1.0. Use Device.ancestors instead.',
           DeprecationWarning,
           stacklevel=2
        )
        return self.ancestors

    @property
    def sys_path(self):
        """
        Absolute path of this device in ``sysfs`` including the ``sysfs``
        mount point as unicode string.
        """
        return ensure_unicode_string(
            self._libudev.udev_device_get_syspath(self))

    @property
    def device_path(self):
        """
        Kernel device path as unicode string.  This path uniquely identifies
        a single device.

        Unlike :attr:`sys_path`, this path does not contain the ``sysfs``
        mount point.  However, the path is absolute and starts with a slash
        ``'/'``.
        """
        return ensure_unicode_string(
            self._libudev.udev_device_get_devpath(self))

    @property
    def subsystem(self):
        """
        Name of the subsystem this device is part of as unicode string.

        :returns: name of subsystem if found, else None
        :rtype: unicode string or NoneType
        """
        subsys = self._libudev.udev_device_get_subsystem(self)
        return None if subsys is None else ensure_unicode_string(subsys)

    @property
    def sys_name(self):
        """
        Device file name inside ``sysfs`` as unicode string.
        """
        return ensure_unicode_string(
            self._libudev.udev_device_get_sysname(self))

    @property
    def sys_number(self):
        """
        The trailing number of the :attr:`sys_name` as unicode string, or
        ``None``, if the device has no trailing number in its name.

        .. note::

           The number is returned as unicode string to preserve the exact
           format of the number, especially any leading zeros:

           >>> from pyudev import Context, Device
           >>> context = Context()
           >>> device = Devices.from_path(context, '/sys/devices/LNXSYSTM:00')
           >>> device.sys_number
           u'00'

           To work with numbers, explicitly convert them to ints:

           >>> int(device.sys_number)
           0

        .. versionadded:: 0.11
        """
        number = self._libudev.udev_device_get_sysnum(self)
        return ensure_unicode_string(number) if number is not None else None

    @property
    def device_type(self):
        """
        Device type as unicode string, or ``None``, if the device type is
        unknown.

        >>> from pyudev import Context
        >>> context = Context()
        >>> for device in context.list_devices(subsystem='net'):
        ...     '{0} - {1}'.format(device.sys_name, device.device_type or 'ethernet')
        ...
        u'eth0 - ethernet'
        u'wlan0 - wlan'
        u'lo - ethernet'
        u'vboxnet0 - ethernet'

        .. versionadded:: 0.10
        """
        device_type = self._libudev.udev_device_get_devtype(self)
        if device_type is not None:
            return ensure_unicode_string(device_type)
        else:
            return device_type

    @property
    def driver(self):
        """
        The driver name as unicode string, or ``None``, if there is no
        driver for this device.

        .. versionadded:: 0.5
        """
        driver = self._libudev.udev_device_get_driver(self)
        return ensure_unicode_string(driver) if driver is not None else None

    @property
    def device_node(self):
        """
        Absolute path to the device node of this device as unicode string or
        ``None``, if this device doesn't have a device node.  The path
        includes the device directory (see :attr:`Context.device_path`).

        This path always points to the actual device node associated with
        this device, and never to any symbolic links to this device node.
        See :attr:`device_links` to get a list of symbolic links to this
        device node.

        .. warning::

           For devices created with :meth:`from_device_file()`, the value of
           this property is not necessarily equal to the ``filename`` given to
           :meth:`from_device_file()`.
        """
        node = self._libudev.udev_device_get_devnode(self)
        return ensure_unicode_string(node) if node is not None else None

    @property
    def device_number(self):
        """
        The device number of the associated device as integer, or ``0``, if no
        device number is associated.

        Use :func:`os.major` and :func:`os.minor` to decompose the device
        number into its major and minor number:

        >>> import os
        >>> from pyudev import Context, Device
        >>> context = Context()
        >>> sda = Devices.from_name(context, 'block', 'sda')
        >>> sda.device_number
        2048L
        >>> (os.major(sda.device_number), os.minor(sda.device_number))
        (8, 0)

        For devices with an associated :attr:`device_node`, this is the same as
        the ``st_rdev`` field of the stat result of the :attr:`device_node`:

        >>> os.stat(sda.device_node).st_rdev
        2048

        .. versionadded:: 0.11
        """
        return self._libudev.udev_device_get_devnum(self)

    @property
    def is_initialized(self):
        """
        ``True``, if the device is initialized, ``False`` otherwise.

        A device is initialized, if udev has already handled this device and
        has set up device node permissions and context, or renamed a network
        device.

        Consequently, this property is only implemented for devices with a
        device node or for network devices.  On all other devices this property
        is always ``True``.

        It is *not* recommended that you use uninitialized devices.

        .. seealso:: :attr:`time_since_initialized`

        .. udevversion:: 165

        .. versionadded:: 0.8
        """
        return bool(self._libudev.udev_device_get_is_initialized(self))

    @property
    def time_since_initialized(self):
        """
        The time elapsed since initialization as :class:`~datetime.timedelta`.

        This property is only implemented on devices which need to store
        properties in the udev database.  On all other devices this property is
        simply zero :class:`~datetime.timedelta`.

        .. seealso:: :attr:`is_initialized`

        .. udevversion:: 165

        .. versionadded:: 0.8
        """
        microseconds = self._libudev.udev_device_get_usec_since_initialized(
            self)
        return timedelta(microseconds=microseconds)

    @property
    def device_links(self):
        """
        An iterator, which yields the absolute paths (including the device
        directory, see :attr:`Context.device_path`) of all symbolic links
        pointing to the :attr:`device_node` of this device.  The paths are
        unicode strings.

        UDev can create symlinks to the original device node (see
        :attr:`device_node`) inside the device directory.  This is often
        used to assign a constant, fixed device node to devices like
        removable media, which technically do not have a constant device
        node, or to map a single device into multiple device hierarchies.
        The property provides access to all such symbolic links, which were
        created by UDev for this device.

        .. warning::

           Links are not necessarily resolved by
           :meth:`Devices.from_device_file()`. Hence do *not* rely on
           ``Devices.from_device_file(context, link).device_path ==
           device.device_path`` from any ``link`` in ``device.device_links``.
        """
        devlinks = self._libudev.udev_device_get_devlinks_list_entry(self)
        for name, _ in udev_list_iterate(self._libudev, devlinks):
            yield ensure_unicode_string(name)

    @property
    def action(self):
        """
        The device event action as string, or ``None``, if this device was not
        received from a :class:`Monitor`.

        Usual actions are:

        ``'add'``
          A device has been added (e.g. a USB device was plugged in)
        ``'remove'``
          A device has been removed (e.g. a USB device was unplugged)
        ``'change'``
          Something about the device changed (e.g. a device property)
        ``'online'``
          The device is online now
        ``'offline'``
          The device is offline now

        .. warning::

           Though the actions listed above are the most common, this property
           *may* return other values, too, so be prepared to handle unknown
           actions!

        .. versionadded:: 0.16
        """
        action = self._libudev.udev_device_get_action(self)
        return ensure_unicode_string(action) if action is not None else None

    @property
    def sequence_number(self):
        """
        The device event sequence number as integer, or ``0`` if this device
        has no sequence number, i.e. was not received from a :class:`Monitor`.

        .. versionadded:: 0.16
        """
        return self._libudev.udev_device_get_seqnum(self)

    @property
    def attributes(self):
        """
        The system attributes of this device as read-only
        :class:`Attributes` mapping.

        System attributes are basically normal files inside the device
        directory.  These files contain all sorts of information about the
        device, which may not be reflected by properties.  These attributes
        are commonly used for matching in udev rules, and can be printed
        using ``udevadm info --attribute-walk``.

        The values of these attributes are not always proper strings, and
        can contain arbitrary bytes.

        .. versionadded:: 0.5
        """
        # do *not* cache the created object in an attribute of this class.
        # Doing so creates an uncollectable reference cycle between Device and
        # Attributes, because Attributes refers to this object through
        # Attributes.device.
        return Attributes(self)

    @property
    def properties(self):
        """
        The udev properties of this object as read-only Properties mapping.

        .. versionadded:: 0.21
        """
        return Properties(self)

    @property
    def tags(self):
        """
        A :class:`Tags` object representing the tags attached to this device.

        The :class:`Tags` object supports a test for a single tag as well as
        iteration over all tags:

        >>> from pyudev import Context
        >>> context = Context()
        >>> device = next(iter(context.list_devices(tag='systemd')))
        >>> 'systemd' in device.tags
        True
        >>> list(device.tags)
        [u'seat', u'systemd', u'uaccess']

        Tags are arbitrary classifiers that can be attached to devices by udev
        scripts and daemons.  For instance, systemd_ uses tags for multi-seat_
        support.

        .. _systemd: http://freedesktop.org/wiki/Software/systemd
        .. _multi-seat: http://www.freedesktop.org/wiki/Software/systemd/multiseat

        .. udevversion:: 154

        .. versionadded:: 0.6

        .. versionchanged:: 0.13
           Return a :class:`Tags` object now.
        """
        return Tags(self)

    def __iter__(self):
        """
        Iterate over the names of all properties defined for this device.

        Return a generator yielding the names of all properties of this
        device as unicode strings.

        .. deprecated:: 0.21
           Will be removed in 1.0. Access properties with Device.properties.
        """
        import warnings
        warnings.warn(
           'Will be removed in 1.0. Access properties with Device.properties.',
           DeprecationWarning,
           stacklevel=2
        )
        return self.properties.__iter__()

    def __len__(self):
        """
        Return the number of properties defined for this device as an integer.

        .. deprecated:: 0.21
           Will be removed in 1.0. Access properties with Device.properties.
        """
        import warnings
        warnings.warn(
           'Will be removed in 1.0. Access properties with Device.properties.',
           DeprecationWarning,
           stacklevel=2
        )
        return self.properties.__len__()

    def __getitem__(self, prop):
        """
        Get the given property from this device.

        ``prop`` is a unicode or byte string containing the name of the
        property.

        Return the property value as unicode string, or raise a
        :exc:`~exceptions.KeyError`, if the given property is not defined
        for this device.

        .. deprecated:: 0.21
           Will be removed in 1.0. Access properties with Device.properties.
        """
        import warnings
        warnings.warn(
           'Will be removed in 1.0. Access properties with Device.properties.',
           DeprecationWarning,
           stacklevel=2
        )
        return self.properties.__getitem__(prop)

    def asint(self, prop):
        """
        Get the given property from this device as integer.

        ``prop`` is a unicode or byte string containing the name of the
        property.

        Return the property value as integer. Raise a
        :exc:`~exceptions.KeyError`, if the given property is not defined
        for this device, or a :exc:`~exceptions.ValueError`, if the property
        value cannot be converted to an integer.

        .. deprecated:: 0.21
           Will be removed in 1.0. Use Device.properties.asint() instead.
        """
        import warnings
        warnings.warn(
           'Will be removed in 1.0. Use Device.properties.asint instead.',
           DeprecationWarning,
           stacklevel=2
        )
        return self.properties.asint(prop)

    def asbool(self, prop):
        """
        Get the given property from this device as boolean.

        A boolean property has either a value of ``'1'`` or of ``'0'``,
        where ``'1'`` stands for ``True``, and ``'0'`` for ``False``.  Any
        other value causes a :exc:`~exceptions.ValueError` to be raised.

        ``prop`` is a unicode or byte string containing the name of the
        property.

        Return ``True``, if the property value is ``'1'`` and ``False``, if
        the property value is ``'0'``.  Any other value raises a
        :exc:`~exceptions.ValueError`.  Raise a :exc:`~exceptions.KeyError`,
        if the given property is not defined for this device.

        .. deprecated:: 0.21
           Will be removed in 1.0. Use Device.properties.asbool() instead.
        """
        import warnings
        warnings.warn(
           'Will be removed in 1.0. Use Device.properties.asbool instead.',
           DeprecationWarning,
           stacklevel=2
        )
        return self.properties.asbool(prop)

    def __hash__(self):
        return hash(self.device_path)

    def __eq__(self, other):
        if isinstance(other, Device):
            return self.device_path == other.device_path
        else:
            return self.device_path == other

    def __ne__(self, other):
        if isinstance(other, Device):
            return self.device_path != other.device_path
        else:
            return self.device_path != other

    def __gt__(self, other):
        raise TypeError('Device not orderable')

    def __lt__(self, other):
        raise TypeError('Device not orderable')

    def __le__(self, other):
        raise TypeError('Device not orderable')

    def __ge__(self, other):
        raise TypeError('Device not orderable')


class Properties(Mapping):
    """
    udev properties of :class:`Device` objects.
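
    Since this class subclasses the ``Mapping`` ABC, the usual read-only
    dictionary operations are available.  For example (illustrative; property
    names and values vary per device):

    >>> from pyudev import Context, Devices
    >>> context = Context()
    >>> sda = Devices.from_name(context, 'block', 'sda')
    >>> 'SUBSYSTEM' in sda.properties
    True
    >>> sda.properties['SUBSYSTEM']
    u'block'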

    .. versionadded:: 0.21
    """

    def __init__(self, device):
        self.device = device
        self._libudev = device._libudev

    def __iter__(self):
        """
        Iterate over the names of all properties defined for the device.

        Return a generator yielding the names of all properties of this
        device as unicode strings.
        """
        properties = \
           self._libudev.udev_device_get_properties_list_entry(self.device)
        for name, _ in udev_list_iterate(self._libudev, properties):
            yield ensure_unicode_string(name)

    def __len__(self):
        """
        Return the number of properties defined for this device as an integer.
        """
        properties = \
           self._libudev.udev_device_get_properties_list_entry(self.device)
        return sum(1 for _ in udev_list_iterate(self._libudev, properties))

    def __getitem__(self, prop):
        """
        Get the given property from this device.

        ``prop`` is a unicode or byte string containing the name of the
        property.

        Return the property value as unicode string, or raise a
        :exc:`~exceptions.KeyError`, if the given property is not defined
        for this device.
        """
        value = self._libudev.udev_device_get_property_value(
           self.device,
           ensure_byte_string(prop)
        )
        if value is None:
            raise KeyError(prop)
        return ensure_unicode_string(value)

    def asint(self, prop):
        """
        Get the given property from this device as integer.

        ``prop`` is a unicode or byte string containing the name of the
        property.

        Return the property value as integer. Raise a
        :exc:`~exceptions.KeyError`, if the given property is not defined
        for this device, or a :exc:`~exceptions.ValueError`, if the property
        value cannot be converted to an integer.
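
        For instance (illustrative, continuing the example from the class
        docstring; ``MAJOR`` is only set for devices with a device node):

        >>> sda.properties.asint('MAJOR')
        8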
        """
        return int(self[prop])

    def asbool(self, prop):
        """
        Get the given property from this device as boolean.

        A boolean property has either a value of ``'1'`` or of ``'0'``,
        where ``'1'`` stands for ``True``, and ``'0'`` for ``False``.  Any
        other value causes a :exc:`~exceptions.ValueError` to be raised.

        ``prop`` is a unicode or byte string containing the name of the
        property.

        Return ``True``, if the property value is ``'1'`` and ``False``, if
        the property value is ``'0'``.  Any other value raises a
        :exc:`~exceptions.ValueError`.  Raise a :exc:`~exceptions.KeyError`,
        if the given property is not defined for this device.
        """
        return string_to_bool(self[prop])


class Attributes(object):
    """
    udev attributes for :class:`Device` objects.
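
    For example (illustrative; :meth:`get` falls back to the supplied default
    when an attribute is missing):

    >>> from pyudev import Context, Devices
    >>> context = Context()
    >>> sda = Devices.from_name(context, 'block', 'sda')
    >>> sda.attributes.get('no_such_attribute', b'fallback')
    b'fallback'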

    .. versionadded:: 0.5
    """

    def __init__(self, device):
        self.device = device
        self._libudev = device._libudev

    @property
    def available_attributes(self):
        """
        Yield the ``available`` attributes for the device.

        It is not guaranteed that a key in this list will have a value.
        It is not guaranteed that a key not in this list will not have a value.

        It is guaranteed that the keys in this list are the keys that libudev
        considers to be "available" attributes.

        If the libudev version does not define
        ``udev_device_get_sysattr_list_entry()``, this property yields nothing.

        See rhbz#1267584.
        """
        if not hasattr(self._libudev, 'udev_device_get_sysattr_list_entry'):
            return # pragma: no cover
        attrs = self._libudev.udev_device_get_sysattr_list_entry(self.device)
        for attribute, _ in udev_list_iterate(self._libudev, attrs):
            yield ensure_unicode_string(attribute)

    def _get(self, attribute):
        """
        Get the given system ``attribute`` for the device.

        :param attribute: the key for an attribute value
        :type attribute: unicode or byte string
        :returns: the value corresponding to ``attribute``
        :rtype: an arbitrary sequence of bytes
        :raises KeyError: if no value found
        """
        value = self._libudev.udev_device_get_sysattr_value(
           self.device,
           ensure_byte_string(attribute)
        )
        if value is None:
            raise KeyError(attribute)
        return value

    def get(self, attribute, default=None):
        """
        Get the given system ``attribute`` for the device.

        :param attribute: the key for an attribute value
        :type attribute: unicode or byte string
        :param default: a default if no corresponding value found
        :type default: a sequence of bytes
        :returns: the value corresponding to ``attribute`` or ``default``
        :rtype: object
        """
        try:
            return self._get(attribute)
        except KeyError:
            return default

    def asstring(self, attribute):
        """
        Get the given ``attribute`` for the device as unicode string.

        :param attribute: the key for an attribute value
        :type attribute: unicode or byte string
        :returns: the value corresponding to ``attribute``, as unicode
        :rtype: unicode
        :raises KeyError: if no value found for ``attribute``
        :raises UnicodeDecodeError: if value is not convertible
        """
        return ensure_unicode_string(self._get(attribute))

    def asint(self, attribute):
        """
        Get the given ``attribute`` as an int.

        :param attribute: the key for an attribute value
        :type attribute: unicode or byte string
        :returns: the value corresponding to ``attribute``, as an int
        :rtype: int
        :raises KeyError: if no value found for ``attribute``
        :raises UnicodeDecodeError: if value is not convertible to unicode
        :raises ValueError: if unicode value can not be converted to an int
        """
        return int(self.asstring(attribute))

    def asbool(self, attribute):
        """
        Get the given ``attribute`` from this device as a bool.

        :param attribute: the key for an attribute value
        :type attribute: unicode or byte string
        :returns: the value corresponding to ``attribute``, as bool
        :rtype: bool
        :raises KeyError: if no value found for ``attribute``
        :raises UnicodeDecodeError: if value is not convertible to unicode
        :raises ValueError: if unicode value can not be converted to a bool

        A boolean attribute has either a value of ``'1'`` or of ``'0'``,
        where ``'1'`` stands for ``True``, and ``'0'`` for ``False``.  Any
        other value causes a :exc:`~exceptions.ValueError` to be raised.
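
        For instance (illustrative, continuing the example from the class
        docstring; ``removable`` is a common block device attribute):

        >>> sda.attributes.asbool('removable')
        False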
        """
        return string_to_bool(self.asstring(attribute))


class Tags(Iterable, Container):
    """
    An iterable over :class:`Device` tags.

    Subclasses the ``Container`` and ``Iterable`` ABCs.
    """

    # pylint: disable=too-few-public-methods

    def __init__(self, device):
        self.device = device
        self._libudev = device._libudev

    def _has_tag(self, tag):
        """
            Whether ``tag`` exists.

            :param tag: unicode string with name of tag
            :rtype: bool
        """
        if hasattr(self._libudev, 'udev_device_has_tag'):
            return bool(self._libudev.udev_device_has_tag(
                self.device, ensure_byte_string(tag)))
        else: # pragma: no cover
            return any(t == tag for t in self)

    def __contains__(self, tag):
        """
        Check for existence of ``tag``.

        ``tag`` is a tag as unicode string.

        Return ``True``, if ``tag`` is attached to the device, ``False``
        otherwise.
        """
        return self._has_tag(tag)

    def __iter__(self):
        """
        Iterate over all tags.

        Yield each tag as unicode string.
        """
        tags = self._libudev.udev_device_get_tags_list_entry(self.device)
        for tag, _ in udev_list_iterate(self._libudev, tags):
            yield ensure_unicode_string(tag)
site-packages/pyudev/device/__pycache__/__init__.cpython-36.opt-1.pyc000064400000001307147511334620021402 0ustar003

N�#WO�@s�dZddlmZddlmZddlmZddlmZddlmZddlmZddlm	Z	dd	lm
Z
dd
lmZddlmZdS)
z�
    pyudev.device
    =============

    Device class implementation of :mod:`pyudev`.

    .. moduleauthor::  Sebastian Wiesner  <lunaryorn@gmail.com>
�)�
Attributes)�Device)�Devices)�Tags)�DeviceNotFoundAtPathError)�DeviceNotFoundByFileError)�DeviceNotFoundByNameError)�DeviceNotFoundByNumberError)�DeviceNotFoundError)� DeviceNotFoundInEnvironmentErrorN)
�__doc__Z_devicerrrrZ_errorsrrrr	r
r�r
r
�/usr/lib/python3.6/__init__.py�<module>ssite-packages/pyudev/device/__pycache__/_device.cpython-36.pyc000064400000124641147511334620020311 0ustar003

u1�Wɭ�@sZdZddlmZddlmZddlmZddlmZddlZddlZddlm	Z	ddlm
Z
dd	lmZdd
lm
Z
ddlmZddlmZdd
lmZddlmZddlmZddlmZddlmZddlmZddlmZddlmZddlmZddlmZGdd�de�ZGdd�de�ZGdd�de�ZGdd�de�Z Gdd �d e
e	�Z!dS)!z�
    pyudev.device._device
    =====================

    Device class implementation of :mod:`pyudev`.

    .. moduleauthor::  Sebastian Wiesner  <lunaryorn@gmail.com>
�)�absolute_import)�division)�print_function)�unicode_literalsN)�	Container)�Iterable)�Mapping)�	timedelta)�DeviceNotFoundAtPathError)�DeviceNotFoundByFileError)�#DeviceNotFoundByInterfaceIndexError)�!DeviceNotFoundByKernelDeviceError)�DeviceNotFoundByNameError)�DeviceNotFoundByNumberError)� DeviceNotFoundInEnvironmentError)�ensure_byte_string)�ensure_unicode_string)�get_device_type)�string_to_bool)�udev_list_iteratec@s|eZdZdZedd��Zedd��Zedd��Zedd	��Zed
d��Z	edd
��Z
edd��Zedd��Zedd��Z
dS)�DeviceszT
    Class for constructing :class:`Device` objects from various kinds of data.
    cCs0|j|j�s$tjj|j|jtj��}|j||�S)a�
        Create a device from a device ``path``.  The ``path`` may or may not
        start with the ``sysfs`` mount point:

        >>> from pyudev import Context, Device
        >>> context = Context()
        >>> Devices.from_path(context, '/devices/platform')
        Device(u'/sys/devices/platform')
        >>> Devices.from_path(context, '/sys/devices/platform')
        Device(u'/sys/devices/platform')

        ``context`` is the :class:`Context` in which to search the device.
        ``path`` is a device path as unicode or byte string.

        Return a :class:`Device` object for the device.  Raise
        :exc:`DeviceNotFoundAtPathError`, if no device was found for ``path``.

        .. versionadded:: 0.18
        )�
startswith�sys_path�os�path�join�lstrip�sep�
from_sys_path)�cls�contextr�r!�/usr/lib/python3.6/_device.py�	from_path<szDevices.from_pathcCs(|jj|t|��}|st|��t||�S)a�
        Create a new device from a given ``sys_path``:

        >>> from pyudev import Context, Device
        >>> context = Context()
        >>> Devices.from_sys_path(context, '/sys/devices/platform')
        Device(u'/sys/devices/platform')

        ``context`` is the :class:`Context` in which to search the device.
        ``sys_path`` is a unicode or byte string containing the path of the
        device inside ``sysfs`` with the mount point included.

        Return a :class:`Device` object for the device.  Raise
        :exc:`DeviceNotFoundAtPathError`, if no device was found for
        ``sys_path``.

        .. versionadded:: 0.18
        )�_libudevZudev_device_new_from_syspathrr
�Device)rr r�devicer!r!r"rUs
zDevices.from_sys_pathcCs<|jdd�}|jj|t|�t|��}|s2t||��t||�S)a.
        Create a new device from a given ``subsystem`` and a given
        ``sys_name``:

        >>> from pyudev import Context, Device
        >>> context = Context()
        >>> sda = Devices.from_name(context, 'block', 'sda')
        >>> sda
        Device(u'/sys/devices/pci0000:00/0000:00:1f.2/host0/target0:0:0/0:0:0:0/block/sda')
        >>> sda == Devices.from_path(context, '/block/sda')

        ``context`` is the :class:`Context` in which to search the device.
        ``subsystem`` and ``sys_name`` are byte or unicode strings, which
        denote the subsystem and the name of the device to create.

        Return a :class:`Device` object for the device.  Raise
        :exc:`DeviceNotFoundByNameError`, if no device was found with the given
        name.

        .. versionadded:: 0.18
        �/�!)�replacer$Z&udev_device_new_from_subsystem_sysnamerrr%)rr �	subsystem�sys_namer&r!r!r"�	from_nameos

zDevices.from_namecCs0|jj|t|d�|�}|s&t||��t||�S)a�
        Create a new device from a device ``number`` with the given device
        ``type``:

        >>> import os
        >>> from pyudev import Context, Device
        >>> ctx = Context()
        >>> major, minor = 8, 0
        >>> device = Devices.from_device_number(context, 'block',
        ...     os.makedev(major, minor))
        >>> device
        Device(u'/sys/devices/pci0000:00/0000:00:11.0/host0/target0:0:0/0:0:0:0/block/sda')
        >>> os.major(device.device_number), os.minor(device.device_number)
        (8, 0)

        Use :func:`os.makedev` to construct a device number from a major and a
        minor device number, as shown in the example above.

        .. warning::

           Device numbers are not unique across different device types.
           Passing a correct number with a wrong type may silently yield a
           wrong device object, so make sure to pass the correct device type.

        ``context`` is the :class:`Context`, in which to search the device.
        ``type`` is either ``'char'`` or ``'block'``, according to whether the
        device is a character or block device.  ``number`` is the device number
        as integer.

        Return a :class:`Device` object for the device with the given device
        ``number``.  Raise :exc:`DeviceNotFoundByNumberError`, if no device was
        found with the given device type and number.

        .. versionadded:: 0.18
        r)r$Zudev_device_new_from_devnumrrr%)rr �typ�numberr&r!r!r"�from_device_number�s
%
zDevices.from_device_numbercCsVyt|�}tj|�j}Wn.ttfk
rF}zt|��WYdd}~XnX|j|||�S)a�
        Create a new device from the given device file:

        >>> from pyudev import Context, Device
        >>> context = Context()
        >>> device = Devices.from_device_file(context, '/dev/sda')
        >>> device
        Device(u'/sys/devices/pci0000:00/0000:00:0d.0/host2/target2:0:0/2:0:0:0/block/sda')
        >>> device.device_node
        u'/dev/sda'

        .. warning::

           Though the example seems to suggest that ``device.device_node ==
           filename`` holds with ``device = Devices.from_device_file(context,
           filename)``, this is only true in a majority of cases.  There *can*
           be devices, for which this relation is actually false!  Thus, do
           *not* expect :attr:`~Device.device_node` to be equal to the given
           ``filename`` for the returned :class:`Device`.  Especially, use
           :attr:`~Device.device_node` if you need the device file of a
           :class:`Device` created with this method afterwards.

        ``context`` is the :class:`Context` in which to search the device.
        ``filename`` is a string containing the path of a device file.

        Return a :class:`Device` representing the given device file.  Raise
        :exc:`DeviceNotFoundByFileError` if ``filename`` is no device file
        at all or if ``filename`` does not exist or if its metadata was
        inaccessible.

        .. versionadded:: 0.18
        N)rr�stat�st_rdev�EnvironmentError�
ValueErrorrr/)rr �filename�device_type�
device_number�errr!r!r"�from_device_file�s"zDevices.from_device_filecs<|jdd�}t�fdd�|D�d�}|dk	r0|St���dS)a?
        Locate a device based on the interface index.

        :param `Context` context: the libudev context
        :param int ifindex: the interface index
        :returns: the device corresponding to the interface index
        :rtype: `Device`

        This method is only appropriate for network devices.
        Znet)r*c3s"|]}|jjd��kr|VqdS)�ifindexN)�
attributes�get)�.0�d)r9r!r"�	<genexpr>�sz/Devices.from_interface_index.<locals>.<genexpr>N)�list_devices�nextr)rr r9Znetwork_devicesZdevr!)r9r"�from_interface_index�szDevices.from_interface_indexcCs�|d}|dd�}|dkrltjd�}|j|�}|rbtjt|jd��t|jd���}|j|||�St|��nT|d	kr�|j	||�S|d
kr�|j
d�\}}	}
|
r�|r�|j|||
�St|��nt|��dS)
a
        Locate a device based on the kernel device.

        :param `Context` context: the libudev context
        :param str kernel_device: the kernel device
        :returns: the device corresponding to ``kernel_device``
        :rtype: `Device`
        r�N�b�cz^(?P<major>\d+):(?P<minor>\d+)$�major�minor�n�+�:)rCrD)�re�compile�matchr�makedev�int�groupr/r
rA�	partitionr,)rr Z
kernel_deviceZswitch_char�restZ	number_rerLr.r*�_Zkernel_device_namer!r!r"�from_kernel_device�s&




zDevices.from_kernel_devicecCs |jj|�}|st��t||�S)a�
        Create a new device from the process environment (as in
        :data:`os.environ`).

        This only works reliable, if the current process is called from an
        udev rule, and is usually used for tools executed from ``IMPORT=``
        rules.  Use this method to create device objects in Python scripts
        called from udev rules.

        ``context`` is the library :class:`Context`.

        Return a :class:`Device` object constructed from the environment.
        Raise :exc:`DeviceNotFoundInEnvironmentError`, if no device could be
        created from the environment.

        .. udevversion:: 152

        .. versionadded:: 0.18
        )r$Z udev_device_new_from_environmentrr%)rr r&r!r!r"�from_environmentszDevices.from_environmentcCs|j|j|j|j|jgS)z�
        Return methods that obtain a :class:`Device` from a variety of
        different data.

        :return: a list of from_* methods.
        :rtype: list of class methods

        .. versionadded:: 0.18
        )r8r/r,r#r)rr!r!r"�METHODS9s
zDevices.METHODSN)�__name__�
__module__�__qualname__�__doc__�classmethodr#rr,r/r8rArSrTrUr!r!r!r"r7s++#rc@s�eZdZdZedd��Zedd��Zedd��Zedd	��Zed
d��Z	edd
��Z
dd�Zdd�Zdd�Z
edd��Zedd��Zedd��ZdYdd�Zdd�Zedd ��Zed!d"��Zed#d$��Zed%d&��Zed'd(��Zed)d*��Zed+d,��Zed-d.��Zed/d0��Zed1d2��Zed3d4��Zed5d6��Zed7d8��Z ed9d:��Z!ed;d<��Z"ed=d>��Z#ed?d@��Z$dAdB�Z%dCdD�Z&dEdF�Z'dGdH�Z(dIdJ�Z)dKdL�Z*dMdN�Z+dOdP�Z,dQdR�Z-dSdT�Z.dUdV�Z/dWdX�Z0dS)Zr%a�
    A single device with attached attributes and properties.

    This class subclasses the ``Mapping`` ABC, providing a read-only
    dictionary mapping property names to the corresponding values.
    Therefore all well-known dicitionary methods and operators
    (e.g. ``.keys()``, ``.items()``, ``in``) are available to access device
    properties.

    Aside of the properties, a device also has a set of udev-specific
    attributes like the path inside ``sysfs``.

    :class:`Device` objects compare equal and unequal to other devices and
    to strings (based on :attr:`device_path`).  However, there is no
    ordering on :class:`Device` objects, and the corresponding operators
    ``>``, ``<``, ``<=`` and ``>=`` raise :exc:`~exceptions.TypeError`.

    .. warning::

       **Never** use object identity (``is`` operator) to compare
       :class:`Device` objects.  :mod:`pyudev` may create multiple
       :class:`Device` objects for the same device.  Instead compare
       devices by value using ``==`` or ``!=``.

    :class:`Device` objects are hashable and can therefore be used as keys
    in dictionaries and sets.

    They can also be given directly as ``udev_device *`` to functions wrapped
    through :mod:`ctypes`.
    cCs$ddl}|jdtdd�tj||�S)zw
        .. versionadded:: 0.4
        .. deprecated:: 0.18
           Use :class:`Devices.from_path` instead.
        rNz>Will be removed in 1.0. Use equivalent Devices method instead.�)�
stacklevel)�warnings�warn�DeprecationWarningrr#)rr rr]r!r!r"r#nszDevice.from_pathcCs$ddl}|jdtdd�tj||�S)a|
        .. versionchanged:: 0.4
           Raise :exc:`NoSuchDeviceError` instead of returning ``None``, if
           no device was found for ``sys_path``.
        .. versionchanged:: 0.5
           Raise :exc:`DeviceNotFoundAtPathError` instead of
           :exc:`NoSuchDeviceError`.
        .. deprecated:: 0.18
           Use :class:`Devices.from_sys_path` instead.
        rNz>Will be removed in 1.0. Use equivalent Devices method instead.r[)r\)r]r^r_rr)rr rr]r!r!r"r}szDevice.from_sys_pathcCs&ddl}|jdtdd�tj|||�S)zw
        .. versionadded:: 0.5
        .. deprecated:: 0.18
           Use :class:`Devices.from_name` instead.
        rNz>Will be removed in 1.0. Use equivalent Devices method instead.r[)r\)r]r^r_rr,)rr r*r+r]r!r!r"r,�szDevice.from_namecCs&ddl}|jdtdd�tj|||�S)z�
        .. versionadded:: 0.11
        .. deprecated:: 0.18
           Use :class:`Devices.from_device_number` instead.
        rNz>Will be removed in 1.0. Use equivalent Devices method instead.r[)r\)r]r^r_rr/)rr r-r.r]r!r!r"r/�szDevice.from_device_numbercCs$ddl}|jdtdd�tj||�S)z
        .. versionadded:: 0.15
        .. deprecated:: 0.18
           Use :class:`Devices.from_device_file` instead.
        rNz>Will be removed in 1.0. Use equivalent Devices method instead.r[)r\)r]r^r_rr8)rr r4r]r!r!r"r8�szDevice.from_device_filecCs"ddl}|jdtdd�tj|�S)z~
        .. versionadded:: 0.6
        .. deprecated:: 0.18
           Use :class:`Devices.from_environment` instead.
        rNz>Will be removed in 1.0. Use equivalent Devices method instead.r[)r\)r]r^r_rrT)rr r]r!r!r"rT�szDevice.from_environmentcCs||_||_|j|_dS)N)r Z_as_parameter_r$)�selfr Z_devicer!r!r"�__init__�szDevice.__init__cCs|jj|�dS)N)r$Zudev_device_unref)r`r!r!r"�__del__�szDevice.__del__cCs
dj|�S)NzDevice({0.sys_path!r}))�format)r`r!r!r"�__repr__�szDevice.__repr__cCs(|jj|�}|sdSt|j|jj|��S)z_
        The parent :class:`Device` or ``None``, if there is no parent
        device.
        N)r$Zudev_device_get_parentr%r �udev_device_ref)r`�parentr!r!r"rf�sz
Device.parentccs,x&|jj�j|�D]}||kr|VqWdS)a�
        Yield all direct children of this device.

        .. note::

           In udev, parent-child relationships are generally ambiguous, i.e.
           a parent can have multiple children, *and* a child can have multiple
           parents. Hence, `child.parent == parent` does generally *not* hold
           for all `child` objects in `parent.children`. In other words,
           the :attr:`parent` of a device in this property can be different
           from this device!

        .. note::

           As the underlying library does not provide any means to directly
           query the children of a device, this property performs a linear
           search through all devices.

        Return an iterable yielding a :class:`Device` object for each direct
        child of this device.

        .. udevversion:: 172

        .. versionchanged:: 0.13
           Requires udev version 172 now.
        N)r r?Zmatch_parent)r`r&r!r!r"�children�szDevice.childrenccs$|j}x|dk	r|V|j}qWdS)z�
        Yield all ancestors of this device from bottom to top.

        Return an iterator yielding a :class:`Device` object for each
        ancestor of this device from bottom to top.

        .. versionadded:: 0.16
        N)rf)r`rfr!r!r"�	ancestorss

zDevice.ancestorsNcCsDt|�}|dk	rt|�}|jj|||�}|s0dSt|j|jj|��S)a�
        Find the parent device with the given ``subsystem`` and
        ``device_type``.

        ``subsystem`` is a byte or unicode string containing the name of the
        subsystem, in which to search for the parent.  ``device_type`` is a
        byte or unicode string holding the expected device type of the parent.
        It can be ``None`` (the default), which means, that no specific device
        type is expected.

        Return a parent :class:`Device` within the given ``subsystem`` and, if
        ``device_type`` is not ``None``, with the given ``device_type``, or
        ``None``, if this device has no parent device matching these
        constraints.

        .. versionadded:: 0.9
        N)rr$Z-udev_device_get_parent_with_subsystem_devtyper%r re)r`r*r5rfr!r!r"�find_parents
zDevice.find_parentcCsddl}|jdtdd�|jS)a~
        Traverse all parent devices of this device from bottom to top.

        Return an iterable yielding all parent devices as :class:`Device`
        objects, *not* including the current device.  The last yielded
        :class:`Device` is the top of the device hierarchy.

        .. deprecated:: 0.16
           Will be removed in 1.0. Use :attr:`ancestors` instead.
        rNz5Will be removed in 1.0. Use Device.ancestors instead.r[)r\)r]r^r_rh)r`r]r!r!r"�traverse0szDevice.traversecCst|jj|��S)zz
        Absolute path of this device in ``sysfs`` including the ``sysfs``
        mount point as unicode string.
        )rr$Zudev_device_get_syspath)r`r!r!r"rCszDevice.sys_pathcCst|jj|��S)a
        Kernel device path as unicode string.  This path uniquely identifies
        a single device.

        Unlike :attr:`sys_path`, this path does not contain the ``sysfs``
        mount point.  However, the path is absolute and starts with a slash
        ``'/'``.
        )rr$Zudev_device_get_devpath)r`r!r!r"�device_pathLs
zDevice.device_pathcCs |jj|�}|dkrdSt|�S)z�
        Name of the subsystem this device is part of as unicode string.

        :returns: name of subsystem if found, else None
        :rtype: unicode string or NoneType
        N)r$Zudev_device_get_subsystemr)r`Zsubsysr!r!r"r*YszDevice.subsystemcCst|jj|��S)zF
        Device file name inside ``sysfs`` as unicode string.
        )rr$Zudev_device_get_sysname)r`r!r!r"r+dszDevice.sys_namecCs |jj|�}|dk	rt|�SdS)a�
        The trailing number of the :attr:`sys_name` as unicode string, or
        ``None``, if the device has no trailing number in its name.

        .. note::

           The number is returned as unicode string to preserve the exact
           format of the number, especially any leading zeros:

           >>> from pyudev import Context, Device
           >>> context = Context()
           >>> device = Devices.from_path(context, '/sys/devices/LNXSYSTM:00')
           >>> device.sys_number
           u'00'

           To work with numbers, explicitly convert them to ints:

           >>> int(device.sys_number)
           0

        .. versionadded:: 0.11
        N)r$Zudev_device_get_sysnumr)r`r.r!r!r"�
sys_numberlszDevice.sys_numbercCs$|jj|�}|dk	rt|�S|SdS)a�
        Device type as unicode string, or ``None``, if the device type is
        unknown.

        >>> from pyudev import Context
        >>> context = Context()
        >>> for device in context.list_devices(subsystem='net'):
        ...     '{0} - {1}'.format(device.sys_name, device.device_type or 'ethernet')
        ...
        u'eth0 - ethernet'
        u'wlan0 - wlan'
        u'lo - ethernet'
        u'vboxnet0 - ethernet'

        .. versionadded:: 0.10
        N)r$Zudev_device_get_devtyper)r`r5r!r!r"r5�szDevice.device_typecCs |jj|�}|dk	rt|�SdS)z�
        The driver name as unicode string, or ``None``, if there is no
        driver for this device.

        .. versionadded:: 0.5
        N)r$Zudev_device_get_driverr)r`�driverr!r!r"rm�sz
Device.drivercCs |jj|�}|dk	rt|�SdS)a�
        Absolute path to the device node of this device as unicode string or
        ``None``, if this device doesn't have a device node.  The path
        includes the device directory (see :attr:`Context.device_path`).

        This path always points to the actual device node associated with
        this device, and never to any symbolic links to this device node.
        See :attr:`device_links` to get a list of symbolic links to this
        device node.

        .. warning::

           For devices created with :meth:`from_device_file()`, the value of
           this property is not necessary equal to the ``filename`` given to
           :meth:`from_device_file()`.
        N)r$Zudev_device_get_devnoder)r`Znoder!r!r"�device_node�szDevice.device_nodecCs|jj|�S)a
        The device number of the associated device as integer, or ``0``, if no
        device number is associated.

        Use :func:`os.major` and :func:`os.minor` to decompose the device
        number into its major and minor number:

        >>> import os
        >>> from pyudev import Context, Device
        >>> context = Context()
        >>> sda = Devices.from_name(context, 'block', 'sda')
        >>> sda.device_number
        2048L
        >>> (os.major(sda.device_number), os.minor(sda.device_number))
        (8, 0)

        For devices with an associated :attr:`device_node`, this is the same as
        the ``st_rdev`` field of the stat result of the :attr:`device_node`:

        >>> os.stat(sda.device_node).st_rdev
        2048

        .. versionadded:: 0.11
        )r$Zudev_device_get_devnum)r`r!r!r"r6�szDevice.device_numbercCst|jj|��S)ai
        ``True``, if the device is initialized, ``False`` otherwise.

        A device is initialized, if udev has already handled this device and
        has set up device node permissions and context, or renamed a network
        device.

        Consequently, this property is only implemented for devices with a
        device node or for network devices.  On all other devices this property
        is always ``True``.

        It is *not* recommended, that you use uninitialized devices.

        .. seealso:: :attr:`time_since_initialized`

        .. udevversion:: 165

        .. versionadded:: 0.8
        )�boolr$Zudev_device_get_is_initialized)r`r!r!r"�is_initialized�szDevice.is_initializedcCs|jj|�}t|d�S)a�
        The time elapsed since initialization as :class:`~datetime.timedelta`.

        This property is only implemented on devices, which need to store
        properties in the udev database.  On all other devices this property is
        simply zero :class:`~datetime.timedelta`.

        .. seealso:: :attr:`is_initialized`

        .. udevversion:: 165

        .. versionadded:: 0.8
        )�microseconds)r$Z&udev_device_get_usec_since_initializedr	)r`rqr!r!r"�time_since_initialized�szDevice.time_since_initializedccs4|jj|�}x"t|j|�D]\}}t|�VqWdS)a�
        An iterator, which yields the absolute paths (including the device
        directory, see :attr:`Context.device_path`) of all symbolic links
        pointing to the :attr:`device_node` of this device.  The paths are
        unicode strings.

        UDev can create symlinks to the original device node (see
        :attr:`device_node`) inside the device directory.  This is often
        used to assign a constant, fixed device node to devices like
        removeable media, which technically do not have a constant device
        node, or to map a single device into multiple device hierarchies.
        The property provides access to all such symbolic links, which were
        created by UDev for this device.

        .. warning::

           Links are not necessarily resolved by
           :meth:`Devices.from_device_file()`. Hence do *not* rely on
           ``Devices.from_device_file(context, link).device_path ==
           device.device_path`` from any ``link`` in ``device.device_links``.
        N)r$Z#udev_device_get_devlinks_list_entryrr)r`Zdevlinks�namerRr!r!r"�device_linksszDevice.device_linkscCs |jj|�}|dk	rt|�SdS)a
        The device event action as string, or ``None``, if this device was not
        received from a :class:`Monitor`.

        Usual actions are:

        ``'add'``
          A device has been added (e.g. a USB device was plugged in)
        ``'remove'``
          A device has been removed (e.g. a USB device was unplugged)
        ``'change'``
          Something about the device changed (e.g. a device property)
        ``'online'``
          The device is online now
        ``'offline'``
          The device is offline now

        .. warning::

           Though the actions listed above are the most common, this property
           *may* return other values, too, so be prepared to handle unknown
           actions!

        .. versionadded:: 0.16
        N)r$Zudev_device_get_actionr)r`�actionr!r!r"ru sz
Device.actioncCs|jj|�S)z�
        The device event sequence number as integer, or ``0`` if this device
        has no sequence number, i.e. was not received from a :class:`Monitor`.

        .. versionadded:: 0.16
        )r$Zudev_device_get_seqnum)r`r!r!r"�sequence_number>szDevice.sequence_numbercCst|�S)aT
        The system attributes of this device as read-only
        :class:`Attributes` mapping.

        System attributes are basically normal files inside the the device
        directory.  These files contain all sorts of information about the
        device, which may not be reflected by properties.  These attributes
        are commonly used for matching in udev rules, and can be printed
        using ``udevadm info --attribute-walk``.

        The values of these attributes are not always proper strings, and
        can contain arbitrary bytes.

        .. versionadded:: 0.5
        )�
Attributes)r`r!r!r"r:HszDevice.attributescCst|�S)zu
        The udev properties of this object as read-only Properties mapping.

        .. versionadded:: 0.21
        )�
Properties)r`r!r!r"�
properties_szDevice.propertiescCst|�S)a�
        A :class:`Tags` object representing the tags attached to this device.

        The :class:`Tags` object supports a test for a single tag as well as
        iteration over all tags:

        >>> from pyudev import Context
        >>> context = Context()
        >>> device = next(iter(context.list_devices(tag='systemd')))
        >>> 'systemd' in device.tags
        True
        >>> list(device.tags)
        [u'seat', u'systemd', u'uaccess']

        Tags are arbitrary classifiers that can be attached to devices by udev
        scripts and daemons.  For instance, systemd_ uses tags for multi-seat_
        support.

        .. _systemd: http://freedesktop.org/wiki/Software/systemd
        .. _multi-seat: http://www.freedesktop.org/wiki/Software/systemd/multiseat

        .. udevversion:: 154

        .. versionadded:: 0.6

        .. versionchanged:: 0.13
           Return a :class:`Tags` object now.
        )�Tags)r`r!r!r"�tagshszDevice.tagscCs"ddl}|jdtdd�|jj�S)a*
        Iterate over the names of all properties defined for this device.

        Return a generator yielding the names of all properties of this
        device as unicode strings.

        .. deprecated:: 0.21
           Will be removed in 1.0. Access properties with Device.properties.
        rNzAWill be removed in 1.0. Access properties with Device.properties.r[)r\)r]r^r_ry�__iter__)r`r]r!r!r"r|�s
zDevice.__iter__cCs"ddl}|jdtdd�|jj�S)z�
        Return the amount of properties defined for this device as integer.

        .. deprecated:: 0.21
           Will be removed in 1.0. Access properties with Device.properties.
        rNzAWill be removed in 1.0. Access properties with Device.properties.r[)r\)r]r^r_ry�__len__)r`r]r!r!r"r}�szDevice.__len__cCs$ddl}|jdtdd�|jj|�S)a�
        Get the given property from this device.

        ``prop`` is a unicode or byte string containing the name of the
        property.

        Return the property value as unicode string, or raise a
        :exc:`~exceptions.KeyError`, if the given property is not defined
        for this device.

        .. deprecated:: 0.21
           Will be removed in 1.0. Access properties with Device.properties.
        rNzAWill be removed in 1.0. Access properties with Device.properties.r[)r\)r]r^r_ry�__getitem__)r`�propr]r!r!r"r~�szDevice.__getitem__cCs$ddl}|jdtdd�|jj|�S)a
        Get the given property from this device as integer.

        ``prop`` is a unicode or byte string containing the name of the
        property.

        Return the property value as integer. Raise a
        :exc:`~exceptions.KeyError`, if the given property is not defined
        for this device, or a :exc:`~exceptions.ValueError`, if the property
        value cannot be converted to an integer.

        .. deprecated:: 0.21
           Will be removed in 1.0. Use Device.properties.asint() instead.
        rNz<Will be removed in 1.0. Use Device.properties.asint instead.r[)r\)r]r^r_ry�asint)r`rr]r!r!r"r��szDevice.asintcCs$ddl}|jdtdd�|jj|�S)a�
        Get the given property from this device as boolean.

        A boolean property has either a value of ``'1'`` or of ``'0'``,
        where ``'1'`` stands for ``True``, and ``'0'`` for ``False``.  Any
        other value causes a :exc:`~exceptions.ValueError` to be raised.

        ``prop`` is a unicode or byte string containing the name of the
        property.

        Return ``True``, if the property value is ``'1'`` and ``False``, if
        the property value is ``'0'``.  Any other value raises a
        :exc:`~exceptions.ValueError`.  Raise a :exc:`~exceptions.KeyError`,
        if the given property is not defined for this device.

        .. deprecated:: 0.21
           Will be removed in 1.0. Use Device.properties.asbool() instead.
        rNz=Will be removed in 1.0. Use Device.properties.asbool instead.r[)r\)r]r^r_ry�asbool)r`rr]r!r!r"r��sz
Device.asboolcCs
t|j�S)N)�hashrk)r`r!r!r"�__hash__�szDevice.__hash__cCs$t|t�r|j|jkS|j|kSdS)N)�
isinstancer%rk)r`�otherr!r!r"�__eq__�s
z
Device.__eq__cCs$t|t�r|j|jkS|j|kSdS)N)r�r%rk)r`r�r!r!r"�__ne__�s
z
Device.__ne__cCstd��dS)NzDevice not orderable)�	TypeError)r`r�r!r!r"�__gt__sz
Device.__gt__cCstd��dS)NzDevice not orderable)r�)r`r�r!r!r"�__lt__sz
Device.__lt__cCstd��dS)NzDevice not orderable)r�)r`r�r!r!r"�__le__sz
Device.__le__cCstd��dS)NzDevice not orderable)r�)r`r�r!r!r"�__ge__	sz
Device.__ge__)N)1rVrWrXrYrZr#rr,r/r8rTrarbrd�propertyrfrgrhrirjrrkr*r+rlr5rmrnr6rprrrtrurvr:ryr{r|r}r~r�r�r�r�r�r�r�r�r�r!r!r!r"r%MsX
 
	

	 r%c@s@eZdZdZdd�Zdd�Zdd�Zdd	�Zd
d�Zdd
�Z	dS)rxzN
    udev properties :class:`Device` objects.

    .. versionadded:: 0.21
    cCs||_|j|_dS)N)r&r$)r`r&r!r!r"raszProperties.__init__ccs6|jj|j�}x"t|j|�D]\}}t|�VqWdS)z�
        Iterate over the names of all properties defined for the device.

        Return a generator yielding the names of all properties of this
        device as unicode strings.
        N)r$�%udev_device_get_properties_list_entryr&rr)r`ryrsrRr!r!r"r|szProperties.__iter__cCs(|jj|j�}tdd�t|j|�D��S)zU
        Return the amount of properties defined for this device as integer.
        css|]
}dVqdS)rBNr!)r<rRr!r!r"r>*sz%Properties.__len__.<locals>.<genexpr>)r$r�r&�sumr)r`ryr!r!r"r}$szProperties.__len__cCs,|jj|jt|��}|dkr$t|��t|�S)a9
        Get the given property from this device.

        ``prop`` is a unicode or byte string containing the name of the
        property.

        Return the property value as unicode string, or raise a
        :exc:`~exceptions.KeyError`, if the given property is not defined
        for this device.
        N)r$Zudev_device_get_property_valuer&r�KeyErrorr)r`r�valuer!r!r"r~,s
zProperties.__getitem__cCst||�S)a�
        Get the given property from this device as integer.

        ``prop`` is a unicode or byte string containing the name of the
        property.

        Return the property value as integer. Raise a
        :exc:`~exceptions.KeyError`, if the given property is not defined
        for this device, or a :exc:`~exceptions.ValueError`, if the property
        value cannot be converted to an integer.
        )rN)r`rr!r!r"r�?szProperties.asintcCst||�S)a�
        Get the given property from this device as boolean.

        A boolean property has either a value of ``'1'`` or of ``'0'``,
        where ``'1'`` stands for ``True``, and ``'0'`` for ``False``.  Any
        other value causes a :exc:`~exceptions.ValueError` to be raised.

        ``prop`` is a unicode or byte string containing the name of the
        property.

        Return ``True``, if the property value is ``'1'`` and ``False``, if
        the property value is ``'0'``.  Any other value raises a
        :exc:`~exceptions.ValueError`.  Raise a :exc:`~exceptions.KeyError`,
        if the given property is not defined for this device.
        )r)r`rr!r!r"r�MszProperties.asboolN)
rVrWrXrYrar|r}r~r�r�r!r!r!r"rx
srxc@sNeZdZdZdd�Zedd��Zdd�Zdd	d
�Zdd�Z	d
d�Z
dd�ZdS)rwzQ
    udev attributes for :class:`Device` objects.

    .. versionadded:: 0.5
    cCs||_|j|_dS)N)r&r$)r`r&r!r!r"ragszAttributes.__init__ccsFt|jd�sdS|jj|j�}x"t|j|�D]\}}t|�Vq,WdS)a�
        Yield the ``available`` attributes for the device.

        It is not guaranteed that a key in this list will have a value.
        It is not guaranteed that a key not in this list will not have a value.

        It is guaranteed that the keys in this list are the keys that libudev
        considers to be "available" attributes.

        If libudev version does not define udev_device_get_sysattr_list_entry()
        yields nothing.

        See rhbz#1267584.
        �"udev_device_get_sysattr_list_entryN)�hasattrr$r�r&rr)r`Zattrs�	attributerRr!r!r"�available_attributesks
zAttributes.available_attributescCs(|jj|jt|��}|dkr$t|��|S)aD
        Get the given system ``attribute`` for the device.

        :param attribute: the key for an attribute value
        :type attribute: unicode or byte string
        :returns: the value corresponding to ``attribute``
        :rtype: an arbitrary sequence of bytes
        :raises KeyError: if no value found
        N)r$Zudev_device_get_sysattr_valuer&rr�)r`r�r�r!r!r"�_get�s

zAttributes._getNcCs$y
|j|�Stk
r|SXdS)a|
        Get the given system ``attribute`` for the device.

        :param attribute: the key for an attribute value
        :type attribute: unicode or byte string
        :param default: a default if no corresponding value found
        :type default: a sequence of bytes
        :returns: the value corresponding to ``attribute`` or ``default``
        :rtype: object
        N)r�r�)r`r��defaultr!r!r"r;�s
zAttributes.getcCst|j|��S)a�
        Get the given ``attribute`` for the device as unicode string.

        :param attribute: the key for an attribute value
        :type attribute: unicode or byte string
        :returns: the value corresponding to ``attribute``, as unicode
        :rtype: unicode
        :raises KeyError: if no value found for ``attribute``
        :raises UnicodeDecodeError: if value is not convertible
        )rr�)r`r�r!r!r"�asstring�szAttributes.asstringcCst|j|��S)a�
        Get the given ``attribute`` as an int.

        :param attribute: the key for an attribute value
        :type attribute: unicode or byte string
        :returns: the value corresponding to ``attribute``, as an int
        :rtype: int
        :raises KeyError: if no value found for ``attribute``
        :raises UnicodeDecodeError: if value is not convertible to unicode
        :raises ValueError: if unicode value can not be converted to an int
        )rNr�)r`r�r!r!r"r��szAttributes.asintcCst|j|��S)a�
        Get the given ``attribute`` from this device as a bool.

        :param attribute: the key for an attribute value
        :type attribute: unicode or byte string
        :returns: the value corresponding to ``attribute``, as bool
        :rtype: bool
        :raises KeyError: if no value found for ``attribute``
        :raises UnicodeDecodeError: if value is not convertible to unicode
        :raises ValueError: if unicode value can not be converted to a bool

        A boolean attribute has either a value of ``'1'`` or of ``'0'``,
        where ``'1'`` stands for ``True``, and ``'0'`` for ``False``.  Any
        other value causes a :exc:`~exceptions.ValueError` to be raised.
        )rr�)r`r�r!r!r"r��szAttributes.asbool)N)rVrWrXrYrar�r�r�r;r�r�r�r!r!r!r"rw`s

rwc@s0eZdZdZdd�Zdd�Zdd�Zdd	�Zd
S)rzzk
    A iterable over :class:`Device` tags.

    Subclasses the ``Container`` and the ``Iterable`` ABC.
    cCs||_|j|_dS)N)r&r$)r`r&r!r!r"ra�sz
Tags.__init__cs>t|jd�r$t|jj|jt����St�fdd�|D��SdS)z
            Whether ``tag`` exists.

            :param tag: unicode string with name of tag
            :rtype: bool
        �udev_device_has_tagc3s|]}|�kVqdS)Nr!)r<�t)�tagr!r"r>�sz Tags._has_tag.<locals>.<genexpr>N)r�r$ror�r&r�any)r`r�r!)r�r"�_has_tag�sz
Tags._has_tagcCs
|j|�S)z�
        Check for existence of ``tag``.

        ``tag`` is a tag as unicode string.

        Return ``True``, if ``tag`` is attached to the device, ``False``
        otherwise.
        )r�)r`r�r!r!r"�__contains__�s	zTags.__contains__ccs6|jj|j�}x"t|j|�D]\}}t|�VqWdS)zS
        Iterate over all tags.

        Yield each tag as unicode string.
        N)r$Zudev_device_get_tags_list_entryr&rr)r`r{r�rRr!r!r"r|�sz
Tags.__iter__N)rVrWrXrYrar�r�r|r!r!r!r"rz�s

rz)"rYZ
__future__rrrrrrJ�collectionsrrrZdatetimer	Zpyudev.device._errorsr
rrr
rrrZpyudev._utilrrrrr�objectrr%rxrwrzr!r!r!r"�<module>sDESqsite-packages/pyudev/device/__pycache__/_errors.cpython-36.pyc000064400000014370147511334620020363 0ustar003

5�Vl�@sdZddlmZddlmZddlmZddlmZddlZddlmZeej	�Gdd	�d	e
��Zeej	�Gd
d�de��ZGdd
�d
e�Z
Gdd�de�ZGdd�de�ZGdd�de�ZGdd�de�ZGdd�de�ZGdd�de�ZGdd�de�ZdS)z�
    pyudev.device._errors
    =====================

    Errors raised by Device methods.

    .. moduleauthor:: Sebastian Wiesner <lunaryorn@gmail.com>
�)�absolute_import)�division)�print_function)�unicode_literalsN)�
add_metaclassc@seZdZdZdS)�DeviceErrorzP
    Any error raised when messing around w/ or trying to discover devices.
    N)�__name__�
__module__�__qualname__�__doc__�rr�/usr/lib/python3.6/_errors.pyr$src@seZdZdZdS)�DeviceNotFoundErrorz�
    An exception indicating that no :class:`Device` was found.

    .. versionchanged:: 0.5
       Rename from ``NoSuchDeviceError`` to its current name.
    N)rr	r
rrrrr
r+src@s,eZdZdZdd�Zedd��Zdd�ZdS)	�DeviceNotFoundAtPathErrorzh
    A :exc:`DeviceNotFoundError` indicating that no :class:`Device` was
    found at a given path.
    cCstj||�dS)N)r�__init__)�self�sys_pathrrr
r;sz"DeviceNotFoundAtPathError.__init__cCs
|jdS)z<
        The path that caused this error as string.
        r)�args)rrrr
r>sz"DeviceNotFoundAtPathError.sys_pathcCsdj|j�S)NzNo device at {0!r})�formatr)rrrr
�__str__Esz!DeviceNotFoundAtPathError.__str__N)rr	r
rr�propertyrrrrrr
r5src@seZdZdZdS)�DeviceNotFoundByFileErrorzp
    A :exc:`DeviceNotFoundError` indicating that no :class:`Device` was
    found from the given filename.
    N)rr	r
rrrrr
rIsrc@seZdZdZdS)�#DeviceNotFoundByInterfaceIndexErrorzw
    A :exc:`DeviceNotFoundError` indicating that no :class:`Device` was found
    from the given interface index.
    N)rr	r
rrrrr
rOsrc@seZdZdZdS)�!DeviceNotFoundByKernelDeviceErrorz�
    A :exc:`DeviceNotFoundError` indicating that no :class:`Device` was found
    from the given kernel device string.

    The format of the kernel device string is defined in the
    systemd.journal-fields man pagees.
    N)rr	r
rrrrr
rUsrc@s8eZdZdZdd�Zedd��Zedd��Zdd	�Zd
S)�DeviceNotFoundByNameErrorzj
    A :exc:`DeviceNotFoundError` indicating that no :class:`Device` was
    found with a given name.
    cCstj|||�dS)N)rr)r�	subsystem�sys_namerrr
resz"DeviceNotFoundByNameError.__init__cCs
|jdS)zA
        The subsystem that caused this error as string.
        r)r)rrrr
rhsz#DeviceNotFoundByNameError.subsystemcCs
|jdS)z@
        The sys name that caused this error as string.
        �)r)rrrr
rosz"DeviceNotFoundByNameError.sys_namecCs
dj|�S)Nz+No device {0.sys_name!r} in {0.subsystem!r})r)rrrr
rvsz!DeviceNotFoundByNameError.__str__N)	rr	r
rrrrrrrrrr
r_s
rc@s8eZdZdZdd�Zedd��Zedd��Zdd	�Zd
S)�DeviceNotFoundByNumberErrorzs
    A :exc:`DeviceNotFoundError` indicating, that no :class:`Device` was found
    for a given device number.
    cCstj|||�dS)N)rr)r�typZnumberrrr
r�sz$DeviceNotFoundByNumberError.__init__cCs
|jdS)zj
        The device type causing this error as string.  Either ``'char'`` or
        ``'block'``.
        r)r)rrrr
�device_type�sz'DeviceNotFoundByNumberError.device_typecCs
|jdS)zB
        The device number causing this error as integer.
        r)r)rrrr
�
device_number�sz)DeviceNotFoundByNumberError.device_numbercCs
dj|�S)Nz7No {0.device_type} device with number {0.device_number})r)rrrr
r�sz#DeviceNotFoundByNumberError.__str__N)	rr	r
rrrr r!rrrrr
rzs
rc@seZdZdZdd�ZdS)� DeviceNotFoundInEnvironmentErrorz�
    A :exc:`DeviceNotFoundError` indicating, that no :class:`Device` could
    be constructed from the process environment.
    cCsdS)NzNo device found in environmentr)rrrr
r�sz(DeviceNotFoundInEnvironmentError.__str__N)rr	r
rrrrrr
r"�sr"c@s&eZdZdZdZddd�Zdd�ZdS)	�DeviceValueErrorz�
    Raised when a parameter has an unacceptable value.

    May also be raised when the parameter has an unacceptable type.
    z+value '%s' for parameter %s is unacceptableNcCs||_||_||_dS)z� Initializer.

            :param object value: the value
            :param str param: the parameter
            :param str msg: an explanatory message
        N)�_value�_param�_msg)r�valueZparam�msgrrr
r�szDeviceValueError.__init__cCs:|jr$|jd}||j|j|jfS|j|j|jfSdS)Nz: %s)r&�_FMT_STRr$r%)rZfmt_strrrr
r�s
zDeviceValueError.__str__)N)rr	r
rr)rrrrrr
r#�s
r#)rZ
__future__rrrr�abcZsixr�ABCMeta�	Exceptionrrrrrrrrr"r#rrrr
�<module>s$	

site-packages/pyudev/device/__pycache__/__init__.cpython-36.pyc000064400000001307147511334620020443 0ustar003

N�#WO�@s�dZddlmZddlmZddlmZddlmZddlmZddlmZddlm	Z	dd	lm
Z
dd
lmZddlmZdS)
z�
    pyudev.device
    =============

    Device class implementation of :mod:`pyudev`.

    .. moduleauthor::  Sebastian Wiesner  <lunaryorn@gmail.com>
�)�
Attributes)�Device)�Devices)�Tags)�DeviceNotFoundAtPathError)�DeviceNotFoundByFileError)�DeviceNotFoundByNameError)�DeviceNotFoundByNumberError)�DeviceNotFoundError)� DeviceNotFoundInEnvironmentErrorN)
�__doc__Z_devicerrrrZ_errorsrrrr	r
r�r
r
�/usr/lib/python3.6/__init__.py�<module>ssite-packages/pyudev/device/__pycache__/_errors.cpython-36.opt-1.pyc000064400000014370147511334620021322 0ustar003

[binary CPython 3.6 bytecode omitted; pyudev.device._errors defines the exception hierarchy DeviceError, DeviceNotFoundError and its subclasses DeviceNotFoundAtPathError, DeviceNotFoundByFileError, DeviceNotFoundByInterfaceIndexError, DeviceNotFoundByKernelDeviceError, DeviceNotFoundByNameError, DeviceNotFoundByNumberError, DeviceNotFoundInEnvironmentError, plus DeviceValueError]

site-packages/pyudev/device/__pycache__/_device.cpython-36.opt-1.pyc
[binary CPython 3.6 bytecode omitted; pyudev.device._device implements the Devices factory class (from_path, from_sys_path, from_name, from_device_number, from_device_file, from_interface_index, from_kernel_device, from_environment), the Device mapping class and its Properties, Attributes and Tags helpers]

site-packages/pyudev/_compat.py
# -*- coding: utf-8 -*-
# Copyright (C) 2011 Sebastian Wiesner <lunaryorn@gmail.com>

# This library is free software; you can redistribute it and/or modify it
# under the terms of the GNU Lesser General Public License as published by the
# Free Software Foundation; either version 2.1 of the License, or (at your
# option) any later version.

# This library is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or
# FITNESS FOR A PARTICULAR PURPOSE.  See the GNU Lesser General Public License
# for more details.

# You should have received a copy of the GNU Lesser General Public License
# along with this library; if not, write to the Free Software Foundation,
# Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110-1301 USA


"""
    pyudev._compat
    ==============

    Compatibility for Python versions, that lack certain functions.

    .. moduleauthor::  Sebastian Wiesner  <lunaryorn@gmail.com>
"""


from __future__ import (print_function, division, unicode_literals,
                        absolute_import)

from subprocess import Popen, CalledProcessError, PIPE


def check_output(command):
    """
    Compatibility with :func:`subprocess.check_output` from Python 2.7 and
    upwards.
    """
    proc = Popen(command, stdout=PIPE)
    output = proc.communicate()[0]
    if proc.returncode != 0:
        raise CalledProcessError(proc.returncode, command)
    return output
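
# Editor's note: a minimal usage sketch for the check_output() fallback above.
# It is not part of the original pyudev._compat module; the command is only an
# illustration and assumes an "echo" binary is on PATH.
def _example_check_output():
    # Returns the raw bytes written to stdout, e.g. b'hello\n'.
    return check_output(['echo', 'hello'])
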
site-packages/pyudev/_util.py
# -*- coding: utf-8 -*-
# Copyright (C) 2010, 2011, 2012 Sebastian Wiesner <lunaryorn@gmail.com>

# This library is free software; you can redistribute it and/or modify it
# under the terms of the GNU Lesser General Public License as published by the
# Free Software Foundation; either version 2.1 of the License, or (at your
# option) any later version.

# This library is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or
# FITNESS FOR A PARTICULAR PURPOSE.  See the GNU Lesser General Public License
# for more details.

# You should have received a copy of the GNU Lesser General Public License
# along with this library; if not, write to the Free Software Foundation,
# Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110-1301 USA


"""
    pyudev._util
    ============

    Internal utilities

    .. moduleauthor::  Sebastian Wiesner  <lunaryorn@gmail.com>
"""


from __future__ import (print_function, division, unicode_literals,
                        absolute_import)

try:
    from subprocess import check_output
except ImportError:
    from pyudev._compat import check_output

import os
import sys
import stat
import errno

import six


def ensure_byte_string(value):
    """
    Return the given ``value`` as bytestring.

    If the given ``value`` is not a byte string, but a real unicode string, it
    is encoded with the filesystem encoding (as in
    :func:`sys.getfilesystemencoding()`).
    """
    if not isinstance(value, bytes):
        value = value.encode(sys.getfilesystemencoding())
    return value


def ensure_unicode_string(value):
    """
    Return the given ``value`` as unicode string.

    If the given ``value`` is not a unicode string, but a byte string, it is
    decoded with the filesystem encoding (as in
    :func:`sys.getfilesystemencoding()`).
    """
    if not isinstance(value, six.text_type):
        value = value.decode(sys.getfilesystemencoding())
    return value
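

# Editor's note: a small round-trip sketch for the two helpers above, not part
# of the original pyudev._util module.  It only assumes text that is
# representable in the filesystem encoding.
def _example_string_helpers():
    raw = ensure_byte_string('sda')        # b'sda' on most systems
    text = ensure_unicode_string(raw)      # back to u'sda'
    return raw, text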


def property_value_to_bytes(value):
    """
    Return a byte string, which represents the given ``value`` in a way
    suitable as raw value of an udev property.

    If ``value`` is a boolean object, it is converted to ``'1'`` or ``'0'``,
    depending on whether ``value`` is ``True`` or ``False``.  If ``value`` is a
    byte string already, it is returned unchanged.  Anything else is simply
    converted to a unicode string, and then passed to
    :func:`ensure_byte_string`.
    """
    # udev represents boolean values as 1 or 0, therefore an explicit
    # conversion to int is required for boolean values
    if isinstance(value, bool):
        value = int(value)
    if isinstance(value, bytes):
        return value
    else:
        return ensure_byte_string(six.text_type(value))
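

# Editor's note: usage sketch for property_value_to_bytes() above, not part of
# the original module.  It just spells out the documented conversions.
def _example_property_value_to_bytes():
    assert property_value_to_bytes(True) == b'1'      # booleans become '1'/'0'
    assert property_value_to_bytes(b'sda') == b'sda'  # bytes pass through
    assert property_value_to_bytes(42) == b'42'       # anything else via text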


def string_to_bool(value):
    """
    Convert the given unicode string ``value`` to a boolean object.

    If ``value`` is ``'1'``, ``True`` is returned.  If ``value`` is ``'0'``,
    ``False`` is returned.  Any other value raises a
    :exc:`~exceptions.ValueError`.
    """
    if value not in ('1', '0'):
        raise ValueError('Not a boolean value: {0!r}'.format(value))
    return value == '1'
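

# Editor's note: behaviour sketch for string_to_bool() above, not part of the
# original module.  Only '1' and '0' are accepted.
def _example_string_to_bool():
    assert string_to_bool('1') is True
    assert string_to_bool('0') is False
    try:
        string_to_bool('yes')
    except ValueError:
        pass  # any value other than '1' or '0' raises ValueError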


def udev_list_iterate(libudev, entry):
    """
    Iteration helper for udev list entry objects.

    Yield a tuple ``(name, value)``.  ``name`` and ``value`` are bytestrings
    containing the name and the value of the list entry.  The exact contents
    depend on the list iterated over.
    """
    while entry:
        name = libudev.udev_list_entry_get_name(entry)
        value = libudev.udev_list_entry_get_value(entry)
        yield (name, value)
        entry = libudev.udev_list_entry_get_next(entry)
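

# Editor's note: sketch of how udev_list_iterate() is typically consumed, not
# part of the original module.  ``libudev`` is assumed to be the ctypes binding
# used elsewhere in pyudev and ``device`` a ``udev_device *`` handle supplied
# by the caller.
def _example_udev_list_iterate(libudev, device):
    # Collect the device's tags, mirroring how Device.tags walks the list.
    entry = libudev.udev_device_get_tags_list_entry(device)
    return [ensure_unicode_string(name)
            for name, _value in udev_list_iterate(libudev, entry)]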


def get_device_type(filename):
    """
    Get the device type of a device file.

    ``filename`` is a string containing the path of a device file.

    Return ``'char'`` if ``filename`` is a character device, or ``'block'`` if
    ``filename`` is a block device.  Raise :exc:`~exceptions.ValueError` if
    ``filename`` is no device file at all.  Raise
    :exc:`~exceptions.EnvironmentError` if ``filename`` does not exist or if
    its metadata was inaccessible.

    .. versionadded:: 0.15
    """
    mode = os.stat(filename).st_mode
    if stat.S_ISCHR(mode):
        return 'char'
    elif stat.S_ISBLK(mode):
        return 'block'
    else:
        raise ValueError('not a device file: {0!r}'.format(filename))
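

# Editor's note: usage sketch for get_device_type() above, not part of the
# original module.  '/dev/null' is only an example path; on a typical Linux
# host it is a character device, so 'char' is returned.
def _example_get_device_type():
    return get_device_type('/dev/null')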


def eintr_retry_call(func, *args, **kwargs):
    """
    Handle interruptions to an interruptible system call.

    Run an interruptible system call in a loop and retry if it raises EINTR.
    The signal calls that may raise EINTR prior to Python 3.5 are listed in
    PEP 0475.  Any calls to these functions must be wrapped in eintr_retry_call
    in order to handle EINTR returns in older versions of Python.

    This function is safe to use under Python 3.5 and newer since the wrapped
    function will simply return without raising EINTR.

    This function is based on _eintr_retry_call in python's subprocess.py.
    """

    # select.error inherits from Exception instead of OSError in Python 2
    import select

    while True:
        try:
            return func(*args, **kwargs)
        except (OSError, IOError, select.error) as err:
            # If this is not an IOError or OSError, it's the old select.error
            # type, which means that the errno is only accessible via subscript
            if isinstance(err, (OSError, IOError)):
                error_code = err.errno
            else:
                error_code = err.args[0]

            if error_code == errno.EINTR:
                continue
            raise
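
# Editor's note: sketch of guarding an interruptible call with
# eintr_retry_call(), not part of the original module.  ``sock`` is a
# hypothetical socket supplied by the caller.
def _example_eintr_retry_call(sock, timeout=5.0):
    import select
    # select() may raise EINTR before Python 3.5; the wrapper retries it.
    return eintr_retry_call(select.select, [sock], [], [], timeout)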

def udev_version():
    """
    Get the version of the underlying udev library.

    udev doesn't use a standard major-minor versioning scheme, but instead
    labels releases with a single consecutive number.  Consequently, the
    version number returned by this function is a single integer, and not a
    tuple (like for instance the interpreter version in
    :data:`sys.version_info`).

    As libudev itself does not provide a function to query the version number,
    this function calls the ``udevadm`` utility, so be prepared to catch
    :exc:`~exceptions.EnvironmentError` and
    :exc:`~subprocess.CalledProcessError` if you call this function.

    Return the version number as single integer.  Raise
    :exc:`~exceptions.ValueError`, if the version number retrieved from udev
    could not be converted to an integer.  Raise
    :exc:`~exceptions.EnvironmentError`, if ``udevadm`` was not found, or could
    not be executed.  Raise :exc:`subprocess.CalledProcessError`, if
    ``udevadm`` returned a non-zero exit code.  On Python 2.7 or newer, the
    ``output`` attribute of this exception is correctly set.

    .. versionadded:: 0.8
    """
    output = ensure_unicode_string(check_output(['udevadm', '--version']))
    return int(output.strip())
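

# Editor's note: usage sketch for udev_version() above, not part of the
# original module.  It guards against the documented failure modes when
# ``udevadm`` is missing or returns something unexpected.
def _example_udev_version():
    import subprocess
    try:
        return udev_version()
    except (EnvironmentError, ValueError, subprocess.CalledProcessError):
        return None
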
site-packages/python_dateutil-2.6.1-py3.6.egg-info/PKG-INFO
Metadata-Version: 1.1
Name: python-dateutil
Version: 2.6.1
Summary: Extensions to the standard Python datetime module
Home-page: https://dateutil.readthedocs.io
Author: Paul Ganssle
Author-email: dateutil@python.org
License: Simplified BSD
Description: 
        The dateutil module provides powerful extensions to the
        datetime module available in the Python standard library.
        
Platform: UNKNOWN
Classifier: Development Status :: 5 - Production/Stable
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: BSD License
Classifier: Programming Language :: Python
Classifier: Programming Language :: Python :: 2
Classifier: Programming Language :: Python :: 2.6
Classifier: Programming Language :: Python :: 2.7
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.2
Classifier: Programming Language :: Python :: 3.3
Classifier: Programming Language :: Python :: 3.4
Classifier: Programming Language :: Python :: 3.5
Classifier: Programming Language :: Python :: 3.6
Classifier: Topic :: Software Development :: Libraries
Requires: six
site-packages/python_dateutil-2.6.1-py3.6.egg-info/SOURCES.txt
LICENSE
MANIFEST.in
NEWS
README.rst
setup.cfg
setup.py
updatezinfo.py
zonefile_metadata.json
dateutil/__init__.py
dateutil/_common.py
dateutil/_version.py
dateutil/easter.py
dateutil/parser.py
dateutil/relativedelta.py
dateutil/rrule.py
dateutil/tzwin.py
dateutil/test/__init__.py
dateutil/test/_common.py
dateutil/test/test_easter.py
dateutil/test/test_imports.py
dateutil/test/test_parser.py
dateutil/test/test_relativedelta.py
dateutil/test/test_rrule.py
dateutil/test/test_tz.py
dateutil/tz/__init__.py
dateutil/tz/_common.py
dateutil/tz/tz.py
dateutil/tz/win.py
dateutil/zoneinfo/__init__.py
dateutil/zoneinfo/dateutil-zoneinfo.tar.gz
dateutil/zoneinfo/rebuild.py
python_dateutil.egg-info/PKG-INFO
python_dateutil.egg-info/SOURCES.txt
python_dateutil.egg-info/dependency_links.txt
python_dateutil.egg-info/requires.txt
python_dateutil.egg-info/top_level.txt
python_dateutil.egg-info/zip-safe
site-packages/python_dateutil-2.6.1-py3.6.egg-info/requires.txt
six>=1.5
site-packages/python_dateutil-2.6.1-py3.6.egg-info/zip-safe
site-packages/python_dateutil-2.6.1-py3.6.egg-info/dependency_links.txt
site-packages/python_dateutil-2.6.1-py3.6.egg-info/top_level.txt
dateutil
site-packages/setuptools/_vendor/__init__.py
site-packages/setuptools/_vendor/pyparsing.py
# module pyparsing.py
#
# Copyright (c) 2003-2016  Paul T. McGuire
#
# Permission is hereby granted, free of charge, to any person obtaining
# a copy of this software and associated documentation files (the
# "Software"), to deal in the Software without restriction, including
# without limitation the rights to use, copy, modify, merge, publish,
# distribute, sublicense, and/or sell copies of the Software, and to
# permit persons to whom the Software is furnished to do so, subject to
# the following conditions:
#
# The above copyright notice and this permission notice shall be
# included in all copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
# EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
# MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
# IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY
# CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT,
# TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE
# SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
#

__doc__ = \
"""
pyparsing module - Classes and methods to define and execute parsing grammars

The pyparsing module is an alternative approach to creating and executing simple grammars,
vs. the traditional lex/yacc approach, or the use of regular expressions.  With pyparsing, you
don't need to learn a new syntax for defining grammars or matching expressions - the parsing module
provides a library of classes that you use to construct the grammar directly in Python.

Here is a program to parse "Hello, World!" (or any greeting of the form 
C{"<salutation>, <addressee>!"}), built up using L{Word}, L{Literal}, and L{And} elements 
(L{'+'<ParserElement.__add__>} operator gives L{And} expressions, strings are auto-converted to
L{Literal} expressions)::

    from pyparsing import Word, alphas

    # define grammar of a greeting
    greet = Word(alphas) + "," + Word(alphas) + "!"

    hello = "Hello, World!"
    print (hello, "->", greet.parseString(hello))

The program outputs the following::

    Hello, World! -> ['Hello', ',', 'World', '!']

The Python representation of the grammar is quite readable, owing to the self-explanatory
class names, and the use of '+', '|' and '^' operators.

The L{ParseResults} object returned from L{ParserElement.parseString<ParserElement.parseString>} can be accessed as a nested list, a dictionary, or an
object with named attributes.

The pyparsing module handles some of the problems that are typically vexing when writing text parsers:
 - extra or missing whitespace (the above program will also handle "Hello,World!", "Hello  ,  World  !", etc.)
 - quoted strings
 - embedded comments
"""

__version__ = "2.1.10"
__versionTime__ = "07 Oct 2016 01:31 UTC"
__author__ = "Paul McGuire <ptmcg@users.sourceforge.net>"

import string
from weakref import ref as wkref
import copy
import sys
import warnings
import re
import sre_constants
import collections
import pprint
import traceback
import types
from datetime import datetime

try:
    from _thread import RLock
except ImportError:
    from threading import RLock

try:
    from collections import OrderedDict as _OrderedDict
except ImportError:
    try:
        from ordereddict import OrderedDict as _OrderedDict
    except ImportError:
        _OrderedDict = None

#~ sys.stderr.write( "testing pyparsing module, version %s, %s\n" % (__version__,__versionTime__ ) )

__all__ = [
'And', 'CaselessKeyword', 'CaselessLiteral', 'CharsNotIn', 'Combine', 'Dict', 'Each', 'Empty',
'FollowedBy', 'Forward', 'GoToColumn', 'Group', 'Keyword', 'LineEnd', 'LineStart', 'Literal',
'MatchFirst', 'NoMatch', 'NotAny', 'OneOrMore', 'OnlyOnce', 'Optional', 'Or',
'ParseBaseException', 'ParseElementEnhance', 'ParseException', 'ParseExpression', 'ParseFatalException',
'ParseResults', 'ParseSyntaxException', 'ParserElement', 'QuotedString', 'RecursiveGrammarException',
'Regex', 'SkipTo', 'StringEnd', 'StringStart', 'Suppress', 'Token', 'TokenConverter', 
'White', 'Word', 'WordEnd', 'WordStart', 'ZeroOrMore',
'alphanums', 'alphas', 'alphas8bit', 'anyCloseTag', 'anyOpenTag', 'cStyleComment', 'col',
'commaSeparatedList', 'commonHTMLEntity', 'countedArray', 'cppStyleComment', 'dblQuotedString',
'dblSlashComment', 'delimitedList', 'dictOf', 'downcaseTokens', 'empty', 'hexnums',
'htmlComment', 'javaStyleComment', 'line', 'lineEnd', 'lineStart', 'lineno',
'makeHTMLTags', 'makeXMLTags', 'matchOnlyAtCol', 'matchPreviousExpr', 'matchPreviousLiteral',
'nestedExpr', 'nullDebugAction', 'nums', 'oneOf', 'opAssoc', 'operatorPrecedence', 'printables',
'punc8bit', 'pythonStyleComment', 'quotedString', 'removeQuotes', 'replaceHTMLEntity', 
'replaceWith', 'restOfLine', 'sglQuotedString', 'srange', 'stringEnd',
'stringStart', 'traceParseAction', 'unicodeString', 'upcaseTokens', 'withAttribute',
'indentedBlock', 'originalTextFor', 'ungroup', 'infixNotation','locatedExpr', 'withClass',
'CloseMatch', 'tokenMap', 'pyparsing_common',
]

system_version = tuple(sys.version_info)[:3]
PY_3 = system_version[0] == 3
if PY_3:
    _MAX_INT = sys.maxsize
    basestring = str
    unichr = chr
    _ustr = str

    # build list of single arg builtins, that can be used as parse actions
    singleArgBuiltins = [sum, len, sorted, reversed, list, tuple, set, any, all, min, max]

else:
    _MAX_INT = sys.maxint
    range = xrange

    def _ustr(obj):
        """Drop-in replacement for str(obj) that tries to be Unicode friendly. It first tries
           str(obj). If that fails with a UnicodeEncodeError, then it tries unicode(obj). It
           then < returns the unicode object | encodes it with the default encoding | ... >.
        """
        if isinstance(obj,unicode):
            return obj

        try:
            # If this works, then _ustr(obj) has the same behaviour as str(obj), so
            # it won't break any existing code.
            return str(obj)

        except UnicodeEncodeError:
            # Else encode it
            ret = unicode(obj).encode(sys.getdefaultencoding(), 'xmlcharrefreplace')
            xmlcharref = Regex('&#\d+;')
            xmlcharref.setParseAction(lambda t: '\\u' + hex(int(t[0][2:-1]))[2:])
            return xmlcharref.transformString(ret)

    # build list of single arg builtins, tolerant of Python version, that can be used as parse actions
    singleArgBuiltins = []
    import __builtin__
    for fname in "sum len sorted reversed list tuple set any all min max".split():
        try:
            singleArgBuiltins.append(getattr(__builtin__,fname))
        except AttributeError:
            continue
            
_generatorType = type((y for y in range(1)))
 
def _xml_escape(data):
    """Escape &, <, >, ", ', etc. in a string of data."""

    # ampersand must be replaced first
    from_symbols = '&><"\''
    to_symbols = ('&'+s+';' for s in "amp gt lt quot apos".split())
    for from_,to_ in zip(from_symbols, to_symbols):
        data = data.replace(from_, to_)
    return data
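
# Editor's note: quick sketch of _xml_escape(), not part of the original
# pyparsing module.
def _example_xml_escape():
    # '<a href="x">' becomes '&lt;a href=&quot;x&quot;&gt;'
    return _xml_escape('<a href="x">')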

class _Constants(object):
    pass

alphas     = string.ascii_uppercase + string.ascii_lowercase
nums       = "0123456789"
hexnums    = nums + "ABCDEFabcdef"
alphanums  = alphas + nums
_bslash    = chr(92)
printables = "".join(c for c in string.printable if c not in string.whitespace)

class ParseBaseException(Exception):
    """base exception class for all parsing runtime exceptions"""
    # Performance tuning: we construct a *lot* of these, so keep this
    # constructor as small and fast as possible
    def __init__( self, pstr, loc=0, msg=None, elem=None ):
        self.loc = loc
        if msg is None:
            self.msg = pstr
            self.pstr = ""
        else:
            self.msg = msg
            self.pstr = pstr
        self.parserElement = elem
        self.args = (pstr, loc, msg)

    @classmethod
    def _from_exception(cls, pe):
        """
        internal factory method to simplify creating one type of ParseException 
        from another - avoids having __init__ signature conflicts among subclasses
        """
        return cls(pe.pstr, pe.loc, pe.msg, pe.parserElement)

    def __getattr__( self, aname ):
        """supported attributes by name are:
            - lineno - returns the line number of the exception text
            - col - returns the column number of the exception text
            - line - returns the line containing the exception text
        """
        if( aname == "lineno" ):
            return lineno( self.loc, self.pstr )
        elif( aname in ("col", "column") ):
            return col( self.loc, self.pstr )
        elif( aname == "line" ):
            return line( self.loc, self.pstr )
        else:
            raise AttributeError(aname)

    def __str__( self ):
        return "%s (at char %d), (line:%d, col:%d)" % \
                ( self.msg, self.loc, self.lineno, self.column )
    def __repr__( self ):
        return _ustr(self)
    def markInputline( self, markerString = ">!<" ):
        """Extracts the exception line from the input string, and marks
           the location of the exception with a special symbol.
        """
        line_str = self.line
        line_column = self.column - 1
        if markerString:
            line_str = "".join((line_str[:line_column],
                                markerString, line_str[line_column:]))
        return line_str.strip()
    def __dir__(self):
        return "lineno col line".split() + dir(type(self))

class ParseException(ParseBaseException):
    """
    Exception thrown when a parse expression doesn't match the input text;
    supported attributes by name are:
     - lineno - returns the line number of the exception text
     - col - returns the column number of the exception text
     - line - returns the line containing the exception text
        
    Example::
        try:
            Word(nums).setName("integer").parseString("ABC")
        except ParseException as pe:
            print(pe)
            print("column: {}".format(pe.col))
            
    prints::
        Expected integer (at char 0), (line:1, col:1)
        column: 1
    """
    pass

class ParseFatalException(ParseBaseException):
    """user-throwable exception thrown when inconsistent parse content
       is found; stops all parsing immediately"""
    pass

class ParseSyntaxException(ParseFatalException):
    """just like L{ParseFatalException}, but thrown internally when an
       L{ErrorStop<And._ErrorStop>} ('-' operator) indicates that parsing is to stop 
       immediately because an unbacktrackable syntax error has been found"""
    pass

#~ class ReparseException(ParseBaseException):
    #~ """Experimental class - parse actions can raise this exception to cause
       #~ pyparsing to reparse the input string:
        #~ - with a modified input string, and/or
        #~ - with a modified start location
       #~ Set the values of the ReparseException in the constructor, and raise the
       #~ exception in a parse action to cause pyparsing to use the new string/location.
       #~ Setting the values as None causes no change to be made.
       #~ """
    #~ def __init_( self, newstring, restartLoc ):
        #~ self.newParseText = newstring
        #~ self.reparseLoc = restartLoc

class RecursiveGrammarException(Exception):
    """exception thrown by L{ParserElement.validate} if the grammar could be improperly recursive"""
    def __init__( self, parseElementList ):
        self.parseElementTrace = parseElementList

    def __str__( self ):
        return "RecursiveGrammarException: %s" % self.parseElementTrace

class _ParseResultsWithOffset(object):
    def __init__(self,p1,p2):
        self.tup = (p1,p2)
    def __getitem__(self,i):
        return self.tup[i]
    def __repr__(self):
        return repr(self.tup[0])
    def setOffset(self,i):
        self.tup = (self.tup[0],i)

class ParseResults(object):
    """
    Structured parse results, to provide multiple means of access to the parsed data:
       - as a list (C{len(results)})
       - by list index (C{results[0], results[1]}, etc.)
       - by attribute (C{results.<resultsName>} - see L{ParserElement.setResultsName})

    Example::
        integer = Word(nums)
        date_str = (integer.setResultsName("year") + '/' 
                        + integer.setResultsName("month") + '/' 
                        + integer.setResultsName("day"))
        # equivalent form:
        # date_str = integer("year") + '/' + integer("month") + '/' + integer("day")

        # parseString returns a ParseResults object
        result = date_str.parseString("1999/12/31")

        def test(s, fn=repr):
            print("%s -> %s" % (s, fn(eval(s))))
        test("list(result)")
        test("result[0]")
        test("result['month']")
        test("result.day")
        test("'month' in result")
        test("'minutes' in result")
        test("result.dump()", str)
    prints::
        list(result) -> ['1999', '/', '12', '/', '31']
        result[0] -> '1999'
        result['month'] -> '12'
        result.day -> '31'
        'month' in result -> True
        'minutes' in result -> False
        result.dump() -> ['1999', '/', '12', '/', '31']
        - day: 31
        - month: 12
        - year: 1999
    """
    def __new__(cls, toklist=None, name=None, asList=True, modal=True ):
        if isinstance(toklist, cls):
            return toklist
        retobj = object.__new__(cls)
        retobj.__doinit = True
        return retobj

    # Performance tuning: we construct a *lot* of these, so keep this
    # constructor as small and fast as possible
    def __init__( self, toklist=None, name=None, asList=True, modal=True, isinstance=isinstance ):
        if self.__doinit:
            self.__doinit = False
            self.__name = None
            self.__parent = None
            self.__accumNames = {}
            self.__asList = asList
            self.__modal = modal
            if toklist is None:
                toklist = []
            if isinstance(toklist, list):
                self.__toklist = toklist[:]
            elif isinstance(toklist, _generatorType):
                self.__toklist = list(toklist)
            else:
                self.__toklist = [toklist]
            self.__tokdict = dict()

        if name is not None and name:
            if not modal:
                self.__accumNames[name] = 0
            if isinstance(name,int):
                name = _ustr(name) # will always return a str, but use _ustr for consistency
            self.__name = name
            if not (isinstance(toklist, (type(None), basestring, list)) and toklist in (None,'',[])):
                if isinstance(toklist,basestring):
                    toklist = [ toklist ]
                if asList:
                    if isinstance(toklist,ParseResults):
                        self[name] = _ParseResultsWithOffset(toklist.copy(),0)
                    else:
                        self[name] = _ParseResultsWithOffset(ParseResults(toklist[0]),0)
                    self[name].__name = name
                else:
                    try:
                        self[name] = toklist[0]
                    except (KeyError,TypeError,IndexError):
                        self[name] = toklist

    def __getitem__( self, i ):
        if isinstance( i, (int,slice) ):
            return self.__toklist[i]
        else:
            if i not in self.__accumNames:
                return self.__tokdict[i][-1][0]
            else:
                return ParseResults([ v[0] for v in self.__tokdict[i] ])

    def __setitem__( self, k, v, isinstance=isinstance ):
        if isinstance(v,_ParseResultsWithOffset):
            self.__tokdict[k] = self.__tokdict.get(k,list()) + [v]
            sub = v[0]
        elif isinstance(k,(int,slice)):
            self.__toklist[k] = v
            sub = v
        else:
            self.__tokdict[k] = self.__tokdict.get(k,list()) + [_ParseResultsWithOffset(v,0)]
            sub = v
        if isinstance(sub,ParseResults):
            sub.__parent = wkref(self)

    def __delitem__( self, i ):
        if isinstance(i,(int,slice)):
            mylen = len( self.__toklist )
            del self.__toklist[i]

            # convert int to slice
            if isinstance(i, int):
                if i < 0:
                    i += mylen
                i = slice(i, i+1)
            # get removed indices
            removed = list(range(*i.indices(mylen)))
            removed.reverse()
            # fixup indices in token dictionary
            for name,occurrences in self.__tokdict.items():
                for j in removed:
                    for k, (value, position) in enumerate(occurrences):
                        occurrences[k] = _ParseResultsWithOffset(value, position - (position > j))
        else:
            del self.__tokdict[i]

    def __contains__( self, k ):
        return k in self.__tokdict

    def __len__( self ): return len( self.__toklist )
    def __bool__(self): return ( not not self.__toklist )
    __nonzero__ = __bool__
    def __iter__( self ): return iter( self.__toklist )
    def __reversed__( self ): return iter( self.__toklist[::-1] )
    def _iterkeys( self ):
        if hasattr(self.__tokdict, "iterkeys"):
            return self.__tokdict.iterkeys()
        else:
            return iter(self.__tokdict)

    def _itervalues( self ):
        return (self[k] for k in self._iterkeys())
            
    def _iteritems( self ):
        return ((k, self[k]) for k in self._iterkeys())

    if PY_3:
        keys = _iterkeys       
        """Returns an iterator of all named result keys (Python 3.x only)."""

        values = _itervalues
        """Returns an iterator of all named result values (Python 3.x only)."""

        items = _iteritems
        """Returns an iterator of all named result key-value tuples (Python 3.x only)."""

    else:
        iterkeys = _iterkeys
        """Returns an iterator of all named result keys (Python 2.x only)."""

        itervalues = _itervalues
        """Returns an iterator of all named result values (Python 2.x only)."""

        iteritems = _iteritems
        """Returns an iterator of all named result key-value tuples (Python 2.x only)."""

        def keys( self ):
            """Returns all named result keys (as a list in Python 2.x, as an iterator in Python 3.x)."""
            return list(self.iterkeys())

        def values( self ):
            """Returns all named result values (as a list in Python 2.x, as an iterator in Python 3.x)."""
            return list(self.itervalues())
                
        def items( self ):
            """Returns all named result key-values (as a list of tuples in Python 2.x, as an iterator in Python 3.x)."""
            return list(self.iteritems())

    def haskeys( self ):
        """Since keys() returns an iterator, this method is helpful in bypassing
           code that looks for the existence of any defined results names."""
        return bool(self.__tokdict)
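    # Illustrative use of haskeys (a sketch, not from the original source):
    #   Word(nums)("num").parseString("42").haskeys()   # -> True  (has the 'num' results name)
    #   Word(nums).parseString("42").haskeys()          # -> False (no results names defined)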
        
    def pop( self, *args, **kwargs):
        """
        Removes and returns item at specified index (default=C{last}).
        Supports both C{list} and C{dict} semantics for C{pop()}. If passed no
        argument or an integer argument, it will use C{list} semantics
        and pop tokens from the list of parsed tokens. If passed a 
        non-integer argument (most likely a string), it will use C{dict}
        semantics and pop the corresponding value from any defined 
        results names. A second default return value argument is 
        supported, just as in C{dict.pop()}.

        Example::
            def remove_first(tokens):
                tokens.pop(0)
            print(OneOrMore(Word(nums)).parseString("0 123 321")) # -> ['0', '123', '321']
            print(OneOrMore(Word(nums)).addParseAction(remove_first).parseString("0 123 321")) # -> ['123', '321']

            label = Word(alphas)
            patt = label("LABEL") + OneOrMore(Word(nums))
            print(patt.parseString("AAB 123 321").dump())

            # Use pop() in a parse action to remove named result (note that corresponding value is not
            # removed from list form of results)
            def remove_LABEL(tokens):
                tokens.pop("LABEL")
                return tokens
            patt.addParseAction(remove_LABEL)
            print(patt.parseString("AAB 123 321").dump())
        prints::
            ['AAB', '123', '321']
            - LABEL: AAB

            ['AAB', '123', '321']
        """
        if not args:
            args = [-1]
        for k,v in kwargs.items():
            if k == 'default':
                args = (args[0], v)
            else:
                raise TypeError("pop() got an unexpected keyword argument '%s'" % k)
        if (isinstance(args[0], int) or 
                        len(args) == 1 or 
                        args[0] in self):
            index = args[0]
            ret = self[index]
            del self[index]
            return ret
        else:
            defaultvalue = args[1]
            return defaultvalue

    def get(self, key, defaultValue=None):
        """
        Returns named result matching the given key, or if there is no
        such name, then returns the given C{defaultValue} or C{None} if no
        C{defaultValue} is specified.

        Similar to C{dict.get()}.
        
        Example::
            integer = Word(nums)
            date_str = integer("year") + '/' + integer("month") + '/' + integer("day")           

            result = date_str.parseString("1999/12/31")
            print(result.get("year")) # -> '1999'
            print(result.get("hour", "not specified")) # -> 'not specified'
            print(result.get("hour")) # -> None
        """
        if key in self:
            return self[key]
        else:
            return defaultValue

    def insert( self, index, insStr ):
        """
        Inserts new element at location index in the list of parsed tokens.
        
        Similar to C{list.insert()}.

        Example::
            print(OneOrMore(Word(nums)).parseString("0 123 321")) # -> ['0', '123', '321']

            # use a parse action to insert the parse location in the front of the parsed results
            def insert_locn(locn, tokens):
                tokens.insert(0, locn)
            print(OneOrMore(Word(nums)).addParseAction(insert_locn).parseString("0 123 321")) # -> [0, '0', '123', '321']
        """
        self.__toklist.insert(index, insStr)
        # fixup indices in token dictionary
        for name,occurrences in self.__tokdict.items():
            for k, (value, position) in enumerate(occurrences):
                occurrences[k] = _ParseResultsWithOffset(value, position + (position > index))

    def append( self, item ):
        """
        Add single element to end of ParseResults list of elements.

        Example::
            print(OneOrMore(Word(nums)).parseString("0 123 321")) # -> ['0', '123', '321']
            
            # use a parse action to compute the sum of the parsed integers, and add it to the end
            def append_sum(tokens):
                tokens.append(sum(map(int, tokens)))
            print(OneOrMore(Word(nums)).addParseAction(append_sum).parseString("0 123 321")) # -> ['0', '123', '321', 444]
        """
        self.__toklist.append(item)

    def extend( self, itemseq ):
        """
        Add sequence of elements to end of ParseResults list of elements.

        Example::
            patt = OneOrMore(Word(alphas))
            
            # use a parse action to append the reverse of the matched strings, to make a palindrome
            def make_palindrome(tokens):
                tokens.extend(reversed([t[::-1] for t in tokens]))
                return ''.join(tokens)
            print(patt.addParseAction(make_palindrome).parseString("lskdj sdlkjf lksd")) # -> 'lskdjsdlkjflksddsklfjkldsjdksl'
        """
        if isinstance(itemseq, ParseResults):
            self += itemseq
        else:
            self.__toklist.extend(itemseq)

    def clear( self ):
        """
        Clear all elements and results names.
        """
        del self.__toklist[:]
        self.__tokdict.clear()

    def __getattr__( self, name ):
        try:
            return self[name]
        except KeyError:
            return ""

    def __add__( self, other ):
        ret = self.copy()
        ret += other
        return ret

    def __iadd__( self, other ):
        if other.__tokdict:
            offset = len(self.__toklist)
            addoffset = lambda a: offset if a<0 else a+offset
            otheritems = other.__tokdict.items()
            otherdictitems = [(k, _ParseResultsWithOffset(v[0],addoffset(v[1])) )
                                for (k,vlist) in otheritems for v in vlist]
            for k,v in otherdictitems:
                self[k] = v
                if isinstance(v[0],ParseResults):
                    v[0].__parent = wkref(self)
            
        self.__toklist += other.__toklist
        self.__accumNames.update( other.__accumNames )
        return self

    def __radd__(self, other):
        if isinstance(other,int) and other == 0:
            # useful for merging many ParseResults using sum() builtin
            return self.copy()
        else:
            # this may raise a TypeError - so be it
            return other + self
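    # Illustrative use of __radd__ (a sketch, not from the original source):
    # sum() starts from 0, and 0 + ParseResults is routed here, so many results
    # can be merged with the builtin:
    #   results = [Word(nums)("n").parseString(s) for s in ("1", "2", "3")]
    #   merged = sum(results)   # a single ParseResults combining all three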
        
    def __repr__( self ):
        return "(%s, %s)" % ( repr( self.__toklist ), repr( self.__tokdict ) )

    def __str__( self ):
        return '[' + ', '.join(_ustr(i) if isinstance(i, ParseResults) else repr(i) for i in self.__toklist) + ']'

    def _asStringList( self, sep='' ):
        out = []
        for item in self.__toklist:
            if out and sep:
                out.append(sep)
            if isinstance( item, ParseResults ):
                out += item._asStringList()
            else:
                out.append( _ustr(item) )
        return out

    def asList( self ):
        """
        Returns the parse results as a nested list of matching tokens.

        Example::
            patt = OneOrMore(Word(alphas))
            result = patt.parseString("sldkj lsdkj sldkj")
            # even though the result prints in string-like form, it is actually a pyparsing ParseResults
            print(type(result), result) # -> <class 'pyparsing.ParseResults'> ['sldkj', 'lsdkj', 'sldkj']
            
            # Use asList() to create an actual list
            result_list = result.asList()
            print(type(result_list), result_list) # -> <class 'list'> ['sldkj', 'lsdkj', 'sldkj']
        """
        return [res.asList() if isinstance(res,ParseResults) else res for res in self.__toklist]

    def asDict( self ):
        """
        Returns the named parse results as a nested dictionary.

        Example::
            integer = Word(nums)
            date_str = integer("year") + '/' + integer("month") + '/' + integer("day")
            
            result = date_str.parseString('12/31/1999')
            print(type(result), repr(result)) # -> <class 'pyparsing.ParseResults'> (['12', '/', '31', '/', '1999'], {'day': [('1999', 4)], 'year': [('12', 0)], 'month': [('31', 2)]})
            
            result_dict = result.asDict()
            print(type(result_dict), repr(result_dict)) # -> <class 'dict'> {'day': '1999', 'year': '12', 'month': '31'}

            # even though a ParseResults supports dict-like access, sometimes you just need to have a dict
            import json
            print(json.dumps(result)) # -> Exception: TypeError: ... is not JSON serializable
            print(json.dumps(result.asDict())) # -> {"month": "31", "day": "1999", "year": "12"}
        """
        if PY_3:
            item_fn = self.items
        else:
            item_fn = self.iteritems
            
        def toItem(obj):
            if isinstance(obj, ParseResults):
                if obj.haskeys():
                    return obj.asDict()
                else:
                    return [toItem(v) for v in obj]
            else:
                return obj
                
        return dict((k,toItem(v)) for k,v in item_fn())

    def copy( self ):
        """
        Returns a new copy of a C{ParseResults} object.
        """
        ret = ParseResults( self.__toklist )
        ret.__tokdict = self.__tokdict.copy()
        ret.__parent = self.__parent
        ret.__accumNames.update( self.__accumNames )
        ret.__name = self.__name
        return ret

    def asXML( self, doctag=None, namedItemsOnly=False, indent="", formatted=True ):
        """
        (Deprecated) Returns the parse results as XML. Tags are created for tokens and lists that have defined results names.
        """
        nl = "\n"
        out = []
        namedItems = dict((v[1],k) for (k,vlist) in self.__tokdict.items()
                                                            for v in vlist)
        nextLevelIndent = indent + "  "

        # collapse out indents if formatting is not desired
        if not formatted:
            indent = ""
            nextLevelIndent = ""
            nl = ""

        selfTag = None
        if doctag is not None:
            selfTag = doctag
        else:
            if self.__name:
                selfTag = self.__name

        if not selfTag:
            if namedItemsOnly:
                return ""
            else:
                selfTag = "ITEM"

        out += [ nl, indent, "<", selfTag, ">" ]

        for i,res in enumerate(self.__toklist):
            if isinstance(res,ParseResults):
                if i in namedItems:
                    out += [ res.asXML(namedItems[i],
                                        namedItemsOnly and doctag is None,
                                        nextLevelIndent,
                                        formatted)]
                else:
                    out += [ res.asXML(None,
                                        namedItemsOnly and doctag is None,
                                        nextLevelIndent,
                                        formatted)]
            else:
                # individual token, see if there is a name for it
                resTag = None
                if i in namedItems:
                    resTag = namedItems[i]
                if not resTag:
                    if namedItemsOnly:
                        continue
                    else:
                        resTag = "ITEM"
                xmlBodyText = _xml_escape(_ustr(res))
                out += [ nl, nextLevelIndent, "<", resTag, ">",
                                                xmlBodyText,
                                                "</", resTag, ">" ]

        out += [ nl, indent, "</", selfTag, ">" ]
        return "".join(out)

    def __lookup(self,sub):
        for k,vlist in self.__tokdict.items():
            for v,loc in vlist:
                if sub is v:
                    return k
        return None

    def getName(self):
        """
        Returns the results name for this token expression. Useful when several 
        different expressions might match at a particular location.

        Example::
            integer = Word(nums)
            ssn_expr = Regex(r"\d\d\d-\d\d-\d\d\d\d")
            house_number_expr = Suppress('#') + Word(nums, alphanums)
            user_data = (Group(house_number_expr)("house_number") 
                        | Group(ssn_expr)("ssn")
                        | Group(integer)("age"))
            user_info = OneOrMore(user_data)
            
            result = user_info.parseString("22 111-22-3333 #221B")
            for item in result:
                print(item.getName(), ':', item[0])
        prints::
            age : 22
            ssn : 111-22-3333
            house_number : 221B
        """
        if self.__name:
            return self.__name
        elif self.__parent:
            par = self.__parent()
            if par:
                return par.__lookup(self)
            else:
                return None
        elif (len(self) == 1 and
               len(self.__tokdict) == 1 and
               next(iter(self.__tokdict.values()))[0][1] in (0,-1)):
            return next(iter(self.__tokdict.keys()))
        else:
            return None

    def dump(self, indent='', depth=0, full=True):
        """
        Diagnostic method for listing out the contents of a C{ParseResults}.
        Accepts an optional C{indent} argument so that this string can be embedded
        in a nested display of other data.

        Example::
            integer = Word(nums)
            date_str = integer("year") + '/' + integer("month") + '/' + integer("day")
            
            result = date_str.parseString('12/31/1999')
            print(result.dump())
        prints::
            ['12', '/', '31', '/', '1999']
            - day: 1999
            - month: 31
            - year: 12
        """
        out = []
        NL = '\n'
        out.append( indent+_ustr(self.asList()) )
        if full:
            if self.haskeys():
                items = sorted((str(k), v) for k,v in self.items())
                for k,v in items:
                    if out:
                        out.append(NL)
                    out.append( "%s%s- %s: " % (indent,('  '*depth), k) )
                    if isinstance(v,ParseResults):
                        if v:
                            out.append( v.dump(indent,depth+1) )
                        else:
                            out.append(_ustr(v))
                    else:
                        out.append(repr(v))
            elif any(isinstance(vv,ParseResults) for vv in self):
                v = self
                for i,vv in enumerate(v):
                    if isinstance(vv,ParseResults):
                        out.append("\n%s%s[%d]:\n%s%s%s" % (indent,('  '*(depth)),i,indent,('  '*(depth+1)),vv.dump(indent,depth+1) ))
                    else:
                        out.append("\n%s%s[%d]:\n%s%s%s" % (indent,('  '*(depth)),i,indent,('  '*(depth+1)),_ustr(vv)))
            
        return "".join(out)

    def pprint(self, *args, **kwargs):
        """
        Pretty-printer for parsed results as a list, using the C{pprint} module.
        Accepts additional positional or keyword args as defined for the 
        C{pprint.pprint} method. (U{http://docs.python.org/3/library/pprint.html#pprint.pprint})

        Example::
            ident = Word(alphas, alphanums)
            num = Word(nums)
            func = Forward()
            term = ident | num | Group('(' + func + ')')
            func <<= ident + Group(Optional(delimitedList(term)))
            result = func.parseString("fna a,b,(fnb c,d,200),100")
            result.pprint(width=40)
        prints::
            ['fna',
             ['a',
              'b',
              ['(', 'fnb', ['c', 'd', '200'], ')'],
              '100']]
        """
        pprint.pprint(self.asList(), *args, **kwargs)

    # add support for pickle protocol
    def __getstate__(self):
        return ( self.__toklist,
                 ( self.__tokdict.copy(),
                   self.__parent is not None and self.__parent() or None,
                   self.__accumNames,
                   self.__name ) )

    def __setstate__(self,state):
        self.__toklist = state[0]
        (self.__tokdict,
         par,
         inAccumNames,
         self.__name) = state[1]
        self.__accumNames = {}
        self.__accumNames.update(inAccumNames)
        if par is not None:
            self.__parent = wkref(par)
        else:
            self.__parent = None

    def __getnewargs__(self):
        return self.__toklist, self.__name, self.__asList, self.__modal

    def __dir__(self):
        return (dir(type(self)) + list(self.keys()))

collections.MutableMapping.register(ParseResults)

def col (loc,strg):
    """Returns current column within a string, counting newlines as line separators.
   The first column is number 1.

   Note: the default parsing behavior is to expand tabs in the input string
   before starting the parsing process.  See L{I{ParserElement.parseString}<ParserElement.parseString>} for more information
   on parsing strings containing C{<TAB>}s, and suggested methods to maintain a
   consistent view of the parsed string, the parse location, and line and column
   positions within the parsed string.
   """
    s = strg
    return 1 if 0<loc<len(s) and s[loc-1] == '\n' else loc - s.rfind("\n", 0, loc)

def lineno(loc,strg):
    """Returns current line number within a string, counting newlines as line separators.
   The first line is number 1.

   Note: the default parsing behavior is to expand tabs in the input string
   before starting the parsing process.  See L{I{ParserElement.parseString}<ParserElement.parseString>} for more information
   on parsing strings containing C{<TAB>}s, and suggested methods to maintain a
   consistent view of the parsed string, the parse location, and line and column
   positions within the parsed string.
   """
    return strg.count("\n",0,loc) + 1

def line( loc, strg ):
    """Returns the line of text containing loc within a string, counting newlines as line separators.
       """
    lastCR = strg.rfind("\n", 0, loc)
    nextCR = strg.find("\n", loc)
    if nextCR >= 0:
        return strg[lastCR+1:nextCR]
    else:
        return strg[lastCR+1:]
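# Illustrative values for the location helpers (a sketch, not from the original source):
#   s = "ab\ncd"
#   lineno(4, s)   # -> 2    ('d' is on the second line)
#   col(4, s)      # -> 2    ('d' is the second column of that line)
#   line(4, s)     # -> 'cd' (the full text of the line containing loc 4)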

def _defaultStartDebugAction( instring, loc, expr ):
    print (("Match " + _ustr(expr) + " at loc " + _ustr(loc) + "(%d,%d)" % ( lineno(loc,instring), col(loc,instring) )))

def _defaultSuccessDebugAction( instring, startloc, endloc, expr, toks ):
    print ("Matched " + _ustr(expr) + " -> " + str(toks.asList()))

def _defaultExceptionDebugAction( instring, loc, expr, exc ):
    print ("Exception raised:" + _ustr(exc))

def nullDebugAction(*args):
    """'Do-nothing' debug action, to suppress debugging output during parsing."""
    pass

# Only works on Python 3.x - nonlocal is toxic to Python 2 installs
#~ 'decorator to trim function calls to match the arity of the target'
#~ def _trim_arity(func, maxargs=3):
    #~ if func in singleArgBuiltins:
        #~ return lambda s,l,t: func(t)
    #~ limit = 0
    #~ foundArity = False
    #~ def wrapper(*args):
        #~ nonlocal limit,foundArity
        #~ while 1:
            #~ try:
                #~ ret = func(*args[limit:])
                #~ foundArity = True
                #~ return ret
            #~ except TypeError:
                #~ if limit == maxargs or foundArity:
                    #~ raise
                #~ limit += 1
                #~ continue
    #~ return wrapper

# this version is Python 2.x-3.x cross-compatible
# decorator to trim function calls to match the arity of the target
def _trim_arity(func, maxargs=2):
    if func in singleArgBuiltins:
        return lambda s,l,t: func(t)
    limit = [0]
    foundArity = [False]
    
    # traceback return data structure changed in Py3.5 - normalize back to plain tuples
    if system_version[:2] >= (3,5):
        def extract_stack(limit=0):
            # special handling for Python 3.5.0 - extra deep call stack by 1
            offset = -3 if system_version == (3,5,0) else -2
            frame_summary = traceback.extract_stack(limit=-offset+limit-1)[offset]
            return [(frame_summary.filename, frame_summary.lineno)]
        def extract_tb(tb, limit=0):
            frames = traceback.extract_tb(tb, limit=limit)
            frame_summary = frames[-1]
            return [(frame_summary.filename, frame_summary.lineno)]
    else:
        extract_stack = traceback.extract_stack
        extract_tb = traceback.extract_tb
    
    # synthesize what would be returned by traceback.extract_stack at the call to 
    # user's parse action 'func', so that we don't incur call penalty at parse time
    
    LINE_DIFF = 6
    # IF ANY CODE CHANGES, EVEN JUST COMMENTS OR BLANK LINES, BETWEEN THE NEXT LINE AND 
    # THE CALL TO FUNC INSIDE WRAPPER, LINE_DIFF MUST BE MODIFIED!!!!
    this_line = extract_stack(limit=2)[-1]
    pa_call_line_synth = (this_line[0], this_line[1]+LINE_DIFF)

    def wrapper(*args):
        while 1:
            try:
                ret = func(*args[limit[0]:])
                foundArity[0] = True
                return ret
            except TypeError:
                # re-raise TypeErrors if they did not come from our arity testing
                if foundArity[0]:
                    raise
                else:
                    try:
                        tb = sys.exc_info()[-1]
                        if not extract_tb(tb, limit=2)[-1][:2] == pa_call_line_synth:
                            raise
                    finally:
                        del tb

                if limit[0] <= maxargs:
                    limit[0] += 1
                    continue
                raise

    # copy func name to wrapper for sensible debug output
    func_name = "<parse action>"
    try:
        func_name = getattr(func, '__name__', 
                            getattr(func, '__class__').__name__)
    except Exception:
        func_name = str(func)
    wrapper.__name__ = func_name

    return wrapper
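# Illustrative effect of _trim_arity (a sketch, not from the original source):
# a parse action written with fewer than three parameters is retried with a
# shorter argument list until the call succeeds:
#   wrapped = _trim_arity(lambda toks: toks[0].upper())
#   wrapped("abc", 0, ParseResults(["abc"]))   # -> 'ABC'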

class ParserElement(object):
    """Abstract base level parser element class."""
    DEFAULT_WHITE_CHARS = " \n\t\r"
    verbose_stacktrace = False

    @staticmethod
    def setDefaultWhitespaceChars( chars ):
        r"""
        Overrides the default whitespace chars

        Example::
            # default whitespace chars are space, <TAB> and newline
            OneOrMore(Word(alphas)).parseString("abc def\nghi jkl")  # -> ['abc', 'def', 'ghi', 'jkl']
            
            # change to just treat newline as significant
            ParserElement.setDefaultWhitespaceChars(" \t")
            OneOrMore(Word(alphas)).parseString("abc def\nghi jkl")  # -> ['abc', 'def']
        """
        ParserElement.DEFAULT_WHITE_CHARS = chars

    @staticmethod
    def inlineLiteralsUsing(cls):
        """
        Set class to be used for inclusion of string literals into a parser.
        
        Example::
            # default literal class used is Literal
            integer = Word(nums)
            date_str = integer("year") + '/' + integer("month") + '/' + integer("day")           

            date_str.parseString("1999/12/31")  # -> ['1999', '/', '12', '/', '31']


            # change to Suppress
            ParserElement.inlineLiteralsUsing(Suppress)
            date_str = integer("year") + '/' + integer("month") + '/' + integer("day")           

            date_str.parseString("1999/12/31")  # -> ['1999', '12', '31']
        """
        ParserElement._literalStringClass = cls

    def __init__( self, savelist=False ):
        self.parseAction = list()
        self.failAction = None
        #~ self.name = "<unknown>"  # don't define self.name, let subclasses try/except upcall
        self.strRepr = None
        self.resultsName = None
        self.saveAsList = savelist
        self.skipWhitespace = True
        self.whiteChars = ParserElement.DEFAULT_WHITE_CHARS
        self.copyDefaultWhiteChars = True
        self.mayReturnEmpty = False # used when checking for left-recursion
        self.keepTabs = False
        self.ignoreExprs = list()
        self.debug = False
        self.streamlined = False
        self.mayIndexError = True # used to optimize exception handling for subclasses that don't advance parse index
        self.errmsg = ""
        self.modalResults = True # used to mark results names as modal (report only last) or cumulative (list all)
        self.debugActions = ( None, None, None ) #custom debug actions
        self.re = None
        self.callPreparse = True # used to avoid redundant calls to preParse
        self.callDuringTry = False

    def copy( self ):
        """
        Make a copy of this C{ParserElement}.  Useful for defining different parse actions
        for the same parsing pattern, using copies of the original parse element.
        
        Example::
            integer = Word(nums).setParseAction(lambda toks: int(toks[0]))
            integerK = integer.copy().addParseAction(lambda toks: toks[0]*1024) + Suppress("K")
            integerM = integer.copy().addParseAction(lambda toks: toks[0]*1024*1024) + Suppress("M")
            
            print(OneOrMore(integerK | integerM | integer).parseString("5K 100 640K 256M"))
        prints::
            [5120, 100, 655360, 268435456]
        Equivalent form of C{expr.copy()} is just C{expr()}::
            integerM = integer().addParseAction(lambda toks: toks[0]*1024*1024) + Suppress("M")
        """
        cpy = copy.copy( self )
        cpy.parseAction = self.parseAction[:]
        cpy.ignoreExprs = self.ignoreExprs[:]
        if self.copyDefaultWhiteChars:
            cpy.whiteChars = ParserElement.DEFAULT_WHITE_CHARS
        return cpy

    def setName( self, name ):
        """
        Define name for this expression, to make debugging and exception messages clearer.
        
        Example::
            Word(nums).parseString("ABC")  # -> Exception: Expected W:(0123...) (at char 0), (line:1, col:1)
            Word(nums).setName("integer").parseString("ABC")  # -> Exception: Expected integer (at char 0), (line:1, col:1)
        """
        self.name = name
        self.errmsg = "Expected " + self.name
        if hasattr(self,"exception"):
            self.exception.msg = self.errmsg
        return self

    def setResultsName( self, name, listAllMatches=False ):
        """
        Define name for referencing matching tokens as a nested attribute
        of the returned parse results.
        NOTE: this returns a *copy* of the original C{ParserElement} object;
        this is so that the client can define a basic element, such as an
        integer, and reference it in multiple places with different names.

        You can also set results names using the abbreviated syntax,
        C{expr("name")} in place of C{expr.setResultsName("name")} - 
        see L{I{__call__}<__call__>}.

        Example::
            date_str = (integer.setResultsName("year") + '/' 
                        + integer.setResultsName("month") + '/' 
                        + integer.setResultsName("day"))

            # equivalent form:
            date_str = integer("year") + '/' + integer("month") + '/' + integer("day")
        """
        newself = self.copy()
        if name.endswith("*"):
            name = name[:-1]
            listAllMatches=True
        newself.resultsName = name
        newself.modalResults = not listAllMatches
        return newself

    def setBreak(self,breakFlag = True):
        """Method to invoke the Python pdb debugger when this element is
           about to be parsed. Set C{breakFlag} to True to enable, False to
           disable.
        """
        if breakFlag:
            _parseMethod = self._parse
            def breaker(instring, loc, doActions=True, callPreParse=True):
                import pdb
                pdb.set_trace()
                return _parseMethod( instring, loc, doActions, callPreParse )
            breaker._originalParseMethod = _parseMethod
            self._parse = breaker
        else:
            if hasattr(self._parse,"_originalParseMethod"):
                self._parse = self._parse._originalParseMethod
        return self
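    # Illustrative use of setBreak (a sketch, not from the original source):
    #   expr = Word(alphas).setBreak()
    #   expr.parseString("abc")   # drops into pdb just before 'expr' attempts its match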

    def setParseAction( self, *fns, **kwargs ):
        """
        Define action to perform when successfully matching parse element definition.
        Parse action fn is a callable method with 0-3 arguments, called as C{fn(s,loc,toks)},
        C{fn(loc,toks)}, C{fn(toks)}, or just C{fn()}, where:
         - s   = the original string being parsed (see note below)
         - loc = the location of the matching substring
         - toks = a list of the matched tokens, packaged as a C{L{ParseResults}} object
        If the functions in fns modify the tokens, they can return them as the return
        value from fn, and the modified list of tokens will replace the original.
        Otherwise, fn does not need to return any value.

        Optional keyword arguments:
         - callDuringTry = (default=C{False}) indicate if parse action should be run during lookaheads and alternate testing

        Note: the default parsing behavior is to expand tabs in the input string
        before starting the parsing process.  See L{I{parseString}<parseString>} for more information
        on parsing strings containing C{<TAB>}s, and suggested methods to maintain a
        consistent view of the parsed string, the parse location, and line and column
        positions within the parsed string.
        
        Example::
            integer = Word(nums)
            date_str = integer + '/' + integer + '/' + integer

            date_str.parseString("1999/12/31")  # -> ['1999', '/', '12', '/', '31']

            # use parse action to convert to ints at parse time
            integer = Word(nums).setParseAction(lambda toks: int(toks[0]))
            date_str = integer + '/' + integer + '/' + integer

            # note that integer fields are now ints, not strings
            date_str.parseString("1999/12/31")  # -> [1999, '/', 12, '/', 31]
        """
        self.parseAction = list(map(_trim_arity, list(fns)))
        self.callDuringTry = kwargs.get("callDuringTry", False)
        return self

    def addParseAction( self, *fns, **kwargs ):
        """
        Add parse action to expression's list of parse actions. See L{I{setParseAction}<setParseAction>}.
        
        See examples in L{I{copy}<copy>}.
        """
        self.parseAction += list(map(_trim_arity, list(fns)))
        self.callDuringTry = self.callDuringTry or kwargs.get("callDuringTry", False)
        return self

    def addCondition(self, *fns, **kwargs):
        """Add a boolean predicate function to expression's list of parse actions. See 
        L{I{setParseAction}<setParseAction>} for function call signatures. Unlike C{setParseAction}, 
        functions passed to C{addCondition} need to return boolean success/fail of the condition.

        Optional keyword arguments:
         - message = define a custom message to be used in the raised exception
         - fatal   = if True, will raise ParseFatalException to stop parsing immediately; otherwise will raise ParseException
         
        Example::
            integer = Word(nums).setParseAction(lambda toks: int(toks[0]))
            year_int = integer.copy()
            year_int.addCondition(lambda toks: toks[0] >= 2000, message="Only support years 2000 and later")
            date_str = year_int + '/' + integer + '/' + integer

            result = date_str.parseString("1999/12/31")  # -> Exception: Only support years 2000 and later (at char 0), (line:1, col:1)
        """
        msg = kwargs.get("message", "failed user-defined condition")
        exc_type = ParseFatalException if kwargs.get("fatal", False) else ParseException
        for fn in fns:
            def pa(s,l,t):
                if not bool(_trim_arity(fn)(s,l,t)):
                    raise exc_type(s,l,msg)
            self.parseAction.append(pa)
        self.callDuringTry = self.callDuringTry or kwargs.get("callDuringTry", False)
        return self

    def setFailAction( self, fn ):
        """Define action to perform if parsing fails at this expression.
           Fail action fn is a callable function that takes the arguments
           C{fn(s,loc,expr,err)} where:
            - s = string being parsed
            - loc = location where expression match was attempted and failed
            - expr = the parse expression that failed
            - err = the exception thrown
           The function returns no value.  It may throw C{L{ParseFatalException}}
           if it is desired to stop parsing immediately."""
        self.failAction = fn
        return self
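    # Illustrative use of setFailAction (a sketch; 'report_failure' is a hypothetical helper):
    #   def report_failure(s, loc, expr, err):
    #       print("failed to match %s at loc %d: %s" % (expr, loc, err))
    #   integer = Word(nums).setName("integer").setFailAction(report_failure)
    #   integer.parseString("ABC")   # prints the failure report, then raises ParseException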

    def _skipIgnorables( self, instring, loc ):
        exprsFound = True
        while exprsFound:
            exprsFound = False
            for e in self.ignoreExprs:
                try:
                    while 1:
                        loc,dummy = e._parse( instring, loc )
                        exprsFound = True
                except ParseException:
                    pass
        return loc

    def preParse( self, instring, loc ):
        if self.ignoreExprs:
            loc = self._skipIgnorables( instring, loc )

        if self.skipWhitespace:
            wt = self.whiteChars
            instrlen = len(instring)
            while loc < instrlen and instring[loc] in wt:
                loc += 1

        return loc

    def parseImpl( self, instring, loc, doActions=True ):
        return loc, []

    def postParse( self, instring, loc, tokenlist ):
        return tokenlist

    #~ @profile
    def _parseNoCache( self, instring, loc, doActions=True, callPreParse=True ):
        debugging = ( self.debug ) #and doActions )

        if debugging or self.failAction:
            #~ print ("Match",self,"at loc",loc,"(%d,%d)" % ( lineno(loc,instring), col(loc,instring) ))
            if (self.debugActions[0] ):
                self.debugActions[0]( instring, loc, self )
            if callPreParse and self.callPreparse:
                preloc = self.preParse( instring, loc )
            else:
                preloc = loc
            tokensStart = preloc
            try:
                try:
                    loc,tokens = self.parseImpl( instring, preloc, doActions )
                except IndexError:
                    raise ParseException( instring, len(instring), self.errmsg, self )
            except ParseBaseException as err:
                #~ print ("Exception raised:", err)
                if self.debugActions[2]:
                    self.debugActions[2]( instring, tokensStart, self, err )
                if self.failAction:
                    self.failAction( instring, tokensStart, self, err )
                raise
        else:
            if callPreParse and self.callPreparse:
                preloc = self.preParse( instring, loc )
            else:
                preloc = loc
            tokensStart = preloc
            if self.mayIndexError or loc >= len(instring):
                try:
                    loc,tokens = self.parseImpl( instring, preloc, doActions )
                except IndexError:
                    raise ParseException( instring, len(instring), self.errmsg, self )
            else:
                loc,tokens = self.parseImpl( instring, preloc, doActions )

        tokens = self.postParse( instring, loc, tokens )

        retTokens = ParseResults( tokens, self.resultsName, asList=self.saveAsList, modal=self.modalResults )
        if self.parseAction and (doActions or self.callDuringTry):
            if debugging:
                try:
                    for fn in self.parseAction:
                        tokens = fn( instring, tokensStart, retTokens )
                        if tokens is not None:
                            retTokens = ParseResults( tokens,
                                                      self.resultsName,
                                                      asList=self.saveAsList and isinstance(tokens,(ParseResults,list)),
                                                      modal=self.modalResults )
                except ParseBaseException as err:
                    #~ print "Exception raised in user parse action:", err
                    if (self.debugActions[2] ):
                        self.debugActions[2]( instring, tokensStart, self, err )
                    raise
            else:
                for fn in self.parseAction:
                    tokens = fn( instring, tokensStart, retTokens )
                    if tokens is not None:
                        retTokens = ParseResults( tokens,
                                                  self.resultsName,
                                                  asList=self.saveAsList and isinstance(tokens,(ParseResults,list)),
                                                  modal=self.modalResults )

        if debugging:
            #~ print ("Matched",self,"->",retTokens.asList())
            if (self.debugActions[1] ):
                self.debugActions[1]( instring, tokensStart, loc, self, retTokens )

        return loc, retTokens

    def tryParse( self, instring, loc ):
        try:
            return self._parse( instring, loc, doActions=False )[0]
        except ParseFatalException:
            raise ParseException( instring, loc, self.errmsg, self)
    
    def canParseNext(self, instring, loc):
        try:
            self.tryParse(instring, loc)
        except (ParseException, IndexError):
            return False
        else:
            return True
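    # Illustrative use of canParseNext (a sketch, not from the original source):
    #   Word(nums).canParseNext("abc123", 3)   # -> True  (digits start at loc 3)
    #   Word(nums).canParseNext("abc123", 0)   # -> False (no digits at loc 0)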

    class _UnboundedCache(object):
        def __init__(self):
            cache = {}
            self.not_in_cache = not_in_cache = object()

            def get(self, key):
                return cache.get(key, not_in_cache)

            def set(self, key, value):
                cache[key] = value

            def clear(self):
                cache.clear()

            self.get = types.MethodType(get, self)
            self.set = types.MethodType(set, self)
            self.clear = types.MethodType(clear, self)

    if _OrderedDict is not None:
        class _FifoCache(object):
            def __init__(self, size):
                self.not_in_cache = not_in_cache = object()

                cache = _OrderedDict()

                def get(self, key):
                    return cache.get(key, not_in_cache)

                def set(self, key, value):
                    cache[key] = value
                    if len(cache) > size:
                        cache.popitem(False)

                def clear(self):
                    cache.clear()

                self.get = types.MethodType(get, self)
                self.set = types.MethodType(set, self)
                self.clear = types.MethodType(clear, self)

    else:
        class _FifoCache(object):
            def __init__(self, size):
                self.not_in_cache = not_in_cache = object()

                cache = {}
                key_fifo = collections.deque([], size)

                def get(self, key):
                    return cache.get(key, not_in_cache)

                def set(self, key, value):
                    cache[key] = value
                    if len(cache) > size:
                        cache.pop(key_fifo.popleft(), None)
                    key_fifo.append(key)

                def clear(self):
                    cache.clear()
                    key_fifo.clear()

                self.get = types.MethodType(get, self)
                self.set = types.MethodType(set, self)
                self.clear = types.MethodType(clear, self)
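    # Illustrative behaviour of the packrat caches (a sketch, not from the original source):
    #   c = ParserElement._FifoCache(2)
    #   c.set('a', 1); c.set('b', 2); c.set('c', 3)   # oldest entry 'a' is evicted
    #   c.get('a') is c.not_in_cache                  # -> True
    # _UnboundedCache behaves the same way but never evicts entries.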

    # argument cache for optimizing repeated calls when backtracking through recursive expressions
    packrat_cache = {} # this is set later by enablePackrat(); this is here so that resetCache() doesn't fail
    packrat_cache_lock = RLock()
    packrat_cache_stats = [0, 0]

    # this method gets repeatedly called during backtracking with the same arguments -
    # we can cache these arguments and save ourselves the trouble of re-parsing the contained expression
    def _parseCache( self, instring, loc, doActions=True, callPreParse=True ):
        HIT, MISS = 0, 1
        lookup = (self, instring, loc, callPreParse, doActions)
        with ParserElement.packrat_cache_lock:
            cache = ParserElement.packrat_cache
            value = cache.get(lookup)
            if value is cache.not_in_cache:
                ParserElement.packrat_cache_stats[MISS] += 1
                try:
                    value = self._parseNoCache(instring, loc, doActions, callPreParse)
                except ParseBaseException as pe:
                    # cache a copy of the exception, without the traceback
                    cache.set(lookup, pe.__class__(*pe.args))
                    raise
                else:
                    cache.set(lookup, (value[0], value[1].copy()))
                    return value
            else:
                ParserElement.packrat_cache_stats[HIT] += 1
                if isinstance(value, Exception):
                    raise value
                return (value[0], value[1].copy())

    _parse = _parseNoCache

    @staticmethod
    def resetCache():
        ParserElement.packrat_cache.clear()
        ParserElement.packrat_cache_stats[:] = [0] * len(ParserElement.packrat_cache_stats)

    _packratEnabled = False
    @staticmethod
    def enablePackrat(cache_size_limit=128):
        """Enables "packrat" parsing, which adds memoizing to the parsing logic.
           Repeated parse attempts at the same string location (which happens
           often in many complex grammars) can immediately return a cached value,
           instead of re-executing parsing/validating code.  Memoizing is done for
           both valid results and parsing exceptions.
           
           Parameters:
            - cache_size_limit - (default=C{128}) - if an integer value is provided
              will limit the size of the packrat cache; if None is passed, then
              the cache size will be unbounded; if 0 is passed, the cache will
              be effectively disabled.
            
           This speedup may break existing programs that use parse actions that
           have side-effects.  For this reason, packrat parsing is disabled when
           you first import pyparsing.  To activate the packrat feature, your
           program must call the class method C{ParserElement.enablePackrat()}.  If
           your program uses C{psyco} to "compile as you go", you must call
           C{enablePackrat} before calling C{psyco.full()}.  If you do not do this,
           Python will crash.  For best results, call C{enablePackrat()} immediately
           after importing pyparsing.
           
           Example::
               import pyparsing
               pyparsing.ParserElement.enablePackrat()
        """
        if not ParserElement._packratEnabled:
            ParserElement._packratEnabled = True
            if cache_size_limit is None:
                ParserElement.packrat_cache = ParserElement._UnboundedCache()
            else:
                ParserElement.packrat_cache = ParserElement._FifoCache(cache_size_limit)
            ParserElement._parse = ParserElement._parseCache

    def parseString( self, instring, parseAll=False ):
        """
        Execute the parse expression with the given string.
        This is the main interface to the client code, once the complete
        expression has been built.

        If you want the grammar to require that the entire input string be
        successfully parsed, then set C{parseAll} to True (equivalent to ending
        the grammar with C{L{StringEnd()}}).

        Note: C{parseString} implicitly calls C{expandtabs()} on the input string,
        in order to report proper column numbers in parse actions.
        If the input string contains tabs and
        the grammar uses parse actions that use the C{loc} argument to index into the
        string being parsed, you can ensure you have a consistent view of the input
        string by:
         - calling C{parseWithTabs} on your grammar before calling C{parseString}
           (see L{I{parseWithTabs}<parseWithTabs>})
         - define your parse action using the full C{(s,loc,toks)} signature, and
           reference the input string using the parse action's C{s} argument
         - explicitly expand the tabs in your input string before calling
           C{parseString}
        
        Example::
            Word('a').parseString('aaaaabaaa')  # -> ['aaaaa']
            Word('a').parseString('aaaaabaaa', parseAll=True)  # -> Exception: Expected end of text
        """
        ParserElement.resetCache()
        if not self.streamlined:
            self.streamline()
            #~ self.saveAsList = True
        for e in self.ignoreExprs:
            e.streamline()
        if not self.keepTabs:
            instring = instring.expandtabs()
        try:
            loc, tokens = self._parse( instring, 0 )
            if parseAll:
                loc = self.preParse( instring, loc )
                se = Empty() + StringEnd()
                se._parse( instring, loc )
        except ParseBaseException as exc:
            if ParserElement.verbose_stacktrace:
                raise
            else:
                # catch and re-raise exception from here, clears out pyparsing internal stack trace
                raise exc
        else:
            return tokens

    def scanString( self, instring, maxMatches=_MAX_INT, overlap=False ):
        """
        Scan the input string for expression matches.  Each match will return the
        matching tokens, start location, and end location.  May be called with optional
        C{maxMatches} argument, to clip scanning after 'n' matches are found.  If
        C{overlap} is specified, then overlapping matches will be reported.

        Note that the start and end locations are reported relative to the string
        being parsed.  See L{I{parseString}<parseString>} for more information on parsing
        strings with embedded tabs.

        Example::
            source = "sldjf123lsdjjkf345sldkjf879lkjsfd987"
            print(source)
            for tokens,start,end in Word(alphas).scanString(source):
                print(' '*start + '^'*(end-start))
                print(' '*start + tokens[0])
        
        prints::
        
            sldjf123lsdjjkf345sldkjf879lkjsfd987
            ^^^^^
            sldjf
                    ^^^^^^^
                    lsdjjkf
                              ^^^^^^
                              sldkjf
                                       ^^^^^^
                                       lkjsfd
        """
        if not self.streamlined:
            self.streamline()
        for e in self.ignoreExprs:
            e.streamline()

        if not self.keepTabs:
            instring = _ustr(instring).expandtabs()
        instrlen = len(instring)
        loc = 0
        preparseFn = self.preParse
        parseFn = self._parse
        ParserElement.resetCache()
        matches = 0
        try:
            while loc <= instrlen and matches < maxMatches:
                try:
                    preloc = preparseFn( instring, loc )
                    nextLoc,tokens = parseFn( instring, preloc, callPreParse=False )
                except ParseException:
                    loc = preloc+1
                else:
                    if nextLoc > loc:
                        matches += 1
                        yield tokens, preloc, nextLoc
                        if overlap:
                            nextloc = preparseFn( instring, loc )
                            if nextloc > loc:
                                loc = nextLoc
                            else:
                                loc += 1
                        else:
                            loc = nextLoc
                    else:
                        loc = preloc+1
        except ParseBaseException as exc:
            if ParserElement.verbose_stacktrace:
                raise
            else:
                # catch and re-raise exception from here, clears out pyparsing internal stack trace
                raise exc

    def transformString( self, instring ):
        """
        Extension to C{L{scanString}}, to modify matching text with modified tokens that may
        be returned from a parse action.  To use C{transformString}, define a grammar and
        attach a parse action to it that modifies the returned token list.
        Invoking C{transformString()} on a target string will then scan for matches,
        and replace the matched text patterns according to the logic in the parse
        action.  C{transformString()} returns the resulting transformed string.
        
        Example::
            wd = Word(alphas)
            wd.setParseAction(lambda toks: toks[0].title())
            
            print(wd.transformString("now is the winter of our discontent made glorious summer by this sun of york."))
        Prints::
            Now Is The Winter Of Our Discontent Made Glorious Summer By This Sun Of York.
        """
        out = []
        lastE = 0
        # force preservation of <TAB>s, to minimize unwanted transformation of string, and to
        # keep string locs straight between transformString and scanString
        self.keepTabs = True
        try:
            for t,s,e in self.scanString( instring ):
                out.append( instring[lastE:s] )
                if t:
                    if isinstance(t,ParseResults):
                        out += t.asList()
                    elif isinstance(t,list):
                        out += t
                    else:
                        out.append(t)
                lastE = e
            out.append(instring[lastE:])
            out = [o for o in out if o]
            return "".join(map(_ustr,_flatten(out)))
        except ParseBaseException as exc:
            if ParserElement.verbose_stacktrace:
                raise
            else:
                # catch and re-raise exception from here, clears out pyparsing internal stack trace
                raise exc

    def searchString( self, instring, maxMatches=_MAX_INT ):
        """
        Another extension to C{L{scanString}}, simplifying the access to the tokens found
        to match the given parse expression.  May be called with optional
        C{maxMatches} argument, to clip searching after 'n' matches are found.
        
        Example::
            # a capitalized word starts with an uppercase letter, followed by zero or more lowercase letters
            cap_word = Word(alphas.upper(), alphas.lower())
            
            print(cap_word.searchString("More than Iron, more than Lead, more than Gold I need Electricity"))
        prints::
            ['More', 'Iron', 'Lead', 'Gold', 'I']
        """
        try:
            return ParseResults([ t for t,s,e in self.scanString( instring, maxMatches ) ])
        except ParseBaseException as exc:
            if ParserElement.verbose_stacktrace:
                raise
            else:
                # catch and re-raise exception from here, clears out pyparsing internal stack trace
                raise exc

    def split(self, instring, maxsplit=_MAX_INT, includeSeparators=False):
        """
        Generator method to split a string using the given expression as a separator.
        May be called with optional C{maxsplit} argument, to limit the number of splits;
        and the optional C{includeSeparators} argument (default=C{False}), indicating whether
        the separating matching text should be included in the split results.
        
        Example::        
            punc = oneOf(list(".,;:/-!?"))
            print(list(punc.split("This, this?, this sentence, is badly punctuated!")))
        prints::
            ['This', ' this', '', ' this sentence', ' is badly punctuated', '']
        """
        splits = 0
        last = 0
        for t,s,e in self.scanString(instring, maxMatches=maxsplit):
            yield instring[last:s]
            if includeSeparators:
                yield t[0]
            last = e
        yield instring[last:]

    def __add__(self, other ):
        """
        Implementation of + operator - returns C{L{And}}. Adding strings to a ParserElement
        converts them to L{Literal}s by default.
        
        Example::
            greet = Word(alphas) + "," + Word(alphas) + "!"
            hello = "Hello, World!"
            print (hello, "->", greet.parseString(hello))
        Prints::
            Hello, World! -> ['Hello', ',', 'World', '!']
        """
        if isinstance( other, basestring ):
            other = ParserElement._literalStringClass( other )
        if not isinstance( other, ParserElement ):
            warnings.warn("Cannot combine element of type %s with ParserElement" % type(other),
                    SyntaxWarning, stacklevel=2)
            return None
        return And( [ self, other ] )

    def __radd__(self, other ):
        """
        Implementation of + operator when left operand is not a C{L{ParserElement}}
        """
        if isinstance( other, basestring ):
            other = ParserElement._literalStringClass( other )
        if not isinstance( other, ParserElement ):
            warnings.warn("Cannot combine element of type %s with ParserElement" % type(other),
                    SyntaxWarning, stacklevel=2)
            return None
        return other + self

    def __sub__(self, other):
        """
        Implementation of - operator, returns C{L{And}} with error stop
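
        A minimal sketch (hypothetical grammar); once the expression to the left of
        C{-} has matched, any later failure is reported as a fatal error rather than
        allowing backtracking to other alternatives::

            assignment = Keyword("set") - Word(alphas) + "=" + Word(nums)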
        """
        if isinstance( other, basestring ):
            other = ParserElement._literalStringClass( other )
        if not isinstance( other, ParserElement ):
            warnings.warn("Cannot combine element of type %s with ParserElement" % type(other),
                    SyntaxWarning, stacklevel=2)
            return None
        return And( [ self, And._ErrorStop(), other ] )

    def __rsub__(self, other ):
        """
        Implementation of - operator when left operand is not a C{L{ParserElement}}
        """
        if isinstance( other, basestring ):
            other = ParserElement._literalStringClass( other )
        if not isinstance( other, ParserElement ):
            warnings.warn("Cannot combine element of type %s with ParserElement" % type(other),
                    SyntaxWarning, stacklevel=2)
            return None
        return other - self

    def __mul__(self,other):
        """
        Implementation of * operator, allows use of C{expr * 3} in place of
        C{expr + expr + expr}.  Expressions may also be multiplied by a 2-integer
        tuple, similar to C{{min,max}} multipliers in regular expressions.  Tuples
        may also include C{None} as in:
         - C{expr*(n,None)} or C{expr*(n,)} is equivalent
              to C{expr*n + L{ZeroOrMore}(expr)}
              (read as "at least n instances of C{expr}")
         - C{expr*(None,n)} is equivalent to C{expr*(0,n)}
              (read as "0 to n instances of C{expr}")
         - C{expr*(None,None)} is equivalent to C{L{ZeroOrMore}(expr)}
         - C{expr*(1,None)} is equivalent to C{L{OneOrMore}(expr)}

        Note that C{expr*(None,n)} does not raise an exception if
        more than n exprs exist in the input stream; that is,
        C{expr*(None,n)} does not enforce a maximum number of expr
        occurrences.  If this behavior is desired, then write
        C{expr*(None,n) + ~expr}
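
        A minimal sketch (hypothetical expressions)::

            Word(nums) * 3          # exactly three numeric tokens
            Word(nums) * (2, 4)     # from two to four numeric tokens
            Word(nums) * (3, None)  # three or more numeric tokens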
        """
        if isinstance(other,int):
            minElements, optElements = other,0
        elif isinstance(other,tuple):
            other = (other + (None, None))[:2]
            if other[0] is None:
                other = (0, other[1])
            if isinstance(other[0],int) and other[1] is None:
                if other[0] == 0:
                    return ZeroOrMore(self)
                if other[0] == 1:
                    return OneOrMore(self)
                else:
                    return self*other[0] + ZeroOrMore(self)
            elif isinstance(other[0],int) and isinstance(other[1],int):
                minElements, optElements = other
                optElements -= minElements
            else:
                raise TypeError("cannot multiply 'ParserElement' and ('%s','%s') objects", type(other[0]),type(other[1]))
        else:
            raise TypeError("cannot multiply 'ParserElement' and '%s' objects", type(other))

        if minElements < 0:
            raise ValueError("cannot multiply ParserElement by negative value")
        if optElements < 0:
            raise ValueError("second tuple value must be greater or equal to first tuple value")
        if minElements == optElements == 0:
            raise ValueError("cannot multiply ParserElement by 0 or (0,0)")

        if (optElements):
            def makeOptionalList(n):
                if n>1:
                    return Optional(self + makeOptionalList(n-1))
                else:
                    return Optional(self)
            if minElements:
                if minElements == 1:
                    ret = self + makeOptionalList(optElements)
                else:
                    ret = And([self]*minElements) + makeOptionalList(optElements)
            else:
                ret = makeOptionalList(optElements)
        else:
            if minElements == 1:
                ret = self
            else:
                ret = And([self]*minElements)
        return ret

    def __rmul__(self, other):
        return self.__mul__(other)

    def __or__(self, other ):
        """
        Implementation of | operator - returns C{L{MatchFirst}}
        """
        if isinstance( other, basestring ):
            other = ParserElement._literalStringClass( other )
        if not isinstance( other, ParserElement ):
            warnings.warn("Cannot combine element of type %s with ParserElement" % type(other),
                    SyntaxWarning, stacklevel=2)
            return None
        return MatchFirst( [ self, other ] )

    def __ror__(self, other ):
        """
        Implementation of | operator when left operand is not a C{L{ParserElement}}
        """
        if isinstance( other, basestring ):
            other = ParserElement._literalStringClass( other )
        if not isinstance( other, ParserElement ):
            warnings.warn("Cannot combine element of type %s with ParserElement" % type(other),
                    SyntaxWarning, stacklevel=2)
            return None
        return other | self

    def __xor__(self, other ):
        """
        Implementation of ^ operator - returns C{L{Or}}
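
        A minimal sketch (hypothetical grammar); C{^} evaluates all alternatives and
        selects the longest match, whereas C{|} stops at the first alternative that
        matches::

            number = Word(nums) ^ Combine(Word(nums) + '.' + Word(nums))
            number.parseString("3.1416")  # -> ['3.1416']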
        """
        if isinstance( other, basestring ):
            other = ParserElement._literalStringClass( other )
        if not isinstance( other, ParserElement ):
            warnings.warn("Cannot combine element of type %s with ParserElement" % type(other),
                    SyntaxWarning, stacklevel=2)
            return None
        return Or( [ self, other ] )

    def __rxor__(self, other ):
        """
        Implementation of ^ operator when left operand is not a C{L{ParserElement}}
        """
        if isinstance( other, basestring ):
            other = ParserElement._literalStringClass( other )
        if not isinstance( other, ParserElement ):
            warnings.warn("Cannot combine element of type %s with ParserElement" % type(other),
                    SyntaxWarning, stacklevel=2)
            return None
        return other ^ self

    def __and__(self, other ):
        """
        Implementation of & operator - returns C{L{Each}}
        """
        if isinstance( other, basestring ):
            other = ParserElement._literalStringClass( other )
        if not isinstance( other, ParserElement ):
            warnings.warn("Cannot combine element of type %s with ParserElement" % type(other),
                    SyntaxWarning, stacklevel=2)
            return None
        return Each( [ self, other ] )

    def __rand__(self, other ):
        """
        Implementation of & operator when left operand is not a C{L{ParserElement}}
        """
        if isinstance( other, basestring ):
            other = ParserElement._literalStringClass( other )
        if not isinstance( other, ParserElement ):
            warnings.warn("Cannot combine element of type %s with ParserElement" % type(other),
                    SyntaxWarning, stacklevel=2)
            return None
        return other & self

    def __invert__( self ):
        """
        Implementation of ~ operator - returns C{L{NotAny}}
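
        A minimal sketch (hypothetical grammar) using negative lookahead to keep a
        reserved word from matching as an ordinary identifier::

            identifier = ~Keyword("end") + Word(alphas)
            identifier.parseString("total")  # -> ['total']
            identifier.parseString("end")    # -> Exception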
        """
        return NotAny( self )

    def __call__(self, name=None):
        """
        Shortcut for C{L{setResultsName}}, with C{listAllMatches=False}.
        
        If C{name} is given with a trailing C{'*'} character, then C{listAllMatches} will be
        passed as C{True}.
           
        If C{name} is omitted, same as calling C{L{copy}}.

        Example::
            # these are equivalent
            userdata = Word(alphas).setResultsName("name") + Word(nums+"-").setResultsName("socsecno")
            userdata = Word(alphas)("name") + Word(nums+"-")("socsecno")             
        """
        if name is not None:
            return self.setResultsName(name)
        else:
            return self.copy()

    def suppress( self ):
        """
        Suppresses the output of this C{ParserElement}; useful to keep punctuation from
        cluttering up returned output.
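
        A minimal sketch (hypothetical grammar)::

            greet = Word(alphas) + Literal(",").suppress() + Word(alphas)
            greet.parseString("Hello, World")  # -> ['Hello', 'World']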
        """
        return Suppress( self )

    def leaveWhitespace( self ):
        """
        Disables the skipping of whitespace before matching the characters in the
        C{ParserElement}'s defined pattern.  This is normally only used internally by
        the pyparsing module, but may be needed in some whitespace-sensitive grammars.
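
        A minimal sketch (hypothetical grammar); the second literal must immediately
        follow the first, with no intervening whitespace::

            adjacent = Literal("a") + Literal("b").leaveWhitespace()
            adjacent.parseString("ab")   # -> ['a', 'b']
            adjacent.parseString("a b")  # -> Exception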
        """
        self.skipWhitespace = False
        return self

    def setWhitespaceChars( self, chars ):
        """
        Overrides the default whitespace characters that are skipped before matching.
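
        A minimal sketch (hypothetical grammar) that skips only spaces and tabs, so
        newlines remain significant::

            word_on_line = Word(alphas).setWhitespaceChars(" \\t")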
        """
        self.skipWhitespace = True
        self.whiteChars = chars
        self.copyDefaultWhiteChars = False
        return self

    def parseWithTabs( self ):
        """
        Overrides default behavior to expand C{<TAB>}s to spaces before parsing the input string.
        Must be called before C{parseString} when the input grammar contains elements that
        match C{<TAB>} characters.
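
        A minimal sketch (hypothetical grammar) matching a literal tab between words::

            fields = Word(alphas) + White("\\t") + Word(alphas)
            fields.parseWithTabs().parseString("abc\\tdef")  # -> ['abc', '\\t', 'def']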
        """
        self.keepTabs = True
        return self

    def ignore( self, other ):
        """
        Define expression to be ignored (e.g., comments) while doing pattern
        matching; may be called repeatedly, to define multiple comment or other
        ignorable patterns.
        
        Example::
            patt = OneOrMore(Word(alphas))
            patt.parseString('ablaj /* comment */ lskjd') # -> ['ablaj']
            
            patt.ignore(cStyleComment)
            patt.parseString('ablaj /* comment */ lskjd') # -> ['ablaj', 'lskjd']
        """
        if isinstance(other, basestring):
            other = Suppress(other)

        if isinstance( other, Suppress ):
            if other not in self.ignoreExprs:
                self.ignoreExprs.append(other)
        else:
            self.ignoreExprs.append( Suppress( other.copy() ) )
        return self

    def setDebugActions( self, startAction, successAction, exceptionAction ):
        """
        Enable display of debugging messages while doing pattern matching.
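
        A minimal sketch (hypothetical actions); passing C{None} for an action keeps
        the corresponding default, so this silences only the start-of-match message::

            wd = Word(alphas).setDebugActions(lambda *args: None, None, None)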
        """
        self.debugActions = (startAction or _defaultStartDebugAction,
                             successAction or _defaultSuccessDebugAction,
                             exceptionAction or _defaultExceptionDebugAction)
        self.debug = True
        return self

    def setDebug( self, flag=True ):
        """
        Enable display of debugging messages while doing pattern matching.
        Set C{flag} to True to enable, False to disable.

        Example::
            wd = Word(alphas).setName("alphaword")
            integer = Word(nums).setName("numword")
            term = wd | integer
            
            # turn on debugging for wd
            wd.setDebug()

            OneOrMore(term).parseString("abc 123 xyz 890")
        
        prints::
            Match alphaword at loc 0(1,1)
            Matched alphaword -> ['abc']
            Match alphaword at loc 3(1,4)
            Exception raised:Expected alphaword (at char 4), (line:1, col:5)
            Match alphaword at loc 7(1,8)
            Matched alphaword -> ['xyz']
            Match alphaword at loc 11(1,12)
            Exception raised:Expected alphaword (at char 12), (line:1, col:13)
            Match alphaword at loc 15(1,16)
            Exception raised:Expected alphaword (at char 15), (line:1, col:16)

        The output shown is that produced by the default debug actions - custom debug actions can be
        specified using L{setDebugActions}. Prior to attempting
        to match the C{wd} expression, the debugging message C{"Match <exprname> at loc <n>(<line>,<col>)"}
        is shown. Then if the parse succeeds, a C{"Matched"} message is shown, or an C{"Exception raised"}
        message is shown. Also note the use of L{setName} to assign a human-readable name to the expression,
        which makes debugging and exception messages easier to understand - for instance, the default
        name created for the C{Word} expression without calling C{setName} is C{"W:(ABCD...)"}.
        """
        if flag:
            self.setDebugActions( _defaultStartDebugAction, _defaultSuccessDebugAction, _defaultExceptionDebugAction )
        else:
            self.debug = False
        return self

    def __str__( self ):
        return self.name

    def __repr__( self ):
        return _ustr(self)

    def streamline( self ):
        self.streamlined = True
        self.strRepr = None
        return self

    def checkRecursion( self, parseElementList ):
        pass

    def validate( self, validateTrace=[] ):
        """
        Check defined expressions for valid structure, check for infinite recursive definitions.
        """
        self.checkRecursion( [] )

    def parseFile( self, file_or_filename, parseAll=False ):
        """
        Execute the parse expression on the given file or filename.
        If a filename is specified (instead of a file object),
        the entire file is opened, read, and closed before parsing.
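
        A minimal sketch (the file name is hypothetical)::

            word_list = OneOrMore(Word(alphas))
            results = word_list.parseFile("words.txt", parseAll=True)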
        """
        try:
            file_contents = file_or_filename.read()
        except AttributeError:
            with open(file_or_filename, "r") as f:
                file_contents = f.read()
        try:
            return self.parseString(file_contents, parseAll)
        except ParseBaseException as exc:
            if ParserElement.verbose_stacktrace:
                raise
            else:
                # catch and re-raise exception from here, clears out pyparsing internal stack trace
                raise exc

    def __eq__(self,other):
        if isinstance(other, ParserElement):
            return self is other or vars(self) == vars(other)
        elif isinstance(other, basestring):
            return self.matches(other)
        else:
            return super(ParserElement,self)==other

    def __ne__(self,other):
        return not (self == other)

    def __hash__(self):
        return hash(id(self))

    def __req__(self,other):
        return self == other

    def __rne__(self,other):
        return not (self == other)

    def matches(self, testString, parseAll=True):
        """
        Method for quick testing of a parser against a test string. Good for simple
        inline microtests of sub-expressions while building up a larger parser.
           
        Parameters:
         - testString - to test against this expression for a match
         - parseAll - (default=C{True}) - flag to pass to C{L{parseString}} when running tests
            
        Example::
            expr = Word(nums)
            assert expr.matches("100")
        """
        try:
            self.parseString(_ustr(testString), parseAll=parseAll)
            return True
        except ParseBaseException:
            return False
                
    def runTests(self, tests, parseAll=True, comment='#', fullDump=True, printResults=True, failureTests=False):
        """
        Execute the parse expression on a series of test strings, showing each
        test, and either the parsed results or where the parse failed. A quick and
        easy way to run a parse expression against a list of sample strings.
           
        Parameters:
         - tests - a list of separate test strings, or a multiline string of test strings
         - parseAll - (default=C{True}) - flag to pass to C{L{parseString}} when running tests           
         - comment - (default=C{'#'}) - expression for indicating embedded comments in the test 
              string; pass None to disable comment filtering
         - fullDump - (default=C{True}) - dump results as list followed by results names in nested outline;
              if False, only dump nested list
         - printResults - (default=C{True}) prints test output to stdout
         - failureTests - (default=C{False}) indicates if these tests are expected to fail parsing

        Returns: a (success, results) tuple, where success indicates that all tests succeeded
        (or failed if C{failureTests} is True), and results is a list of (test string, result)
        tuples, where result is the C{ParseResults} from a successful parse, or the exception
        raised for a failing test
        
        Example::
            number_expr = pyparsing_common.number.copy()

            result = number_expr.runTests('''
                # unsigned integer
                100
                # negative integer
                -100
                # float with scientific notation
                6.02e23
                # integer with scientific notation
                1e-12
                ''')
            print("Success" if result[0] else "Failed!")

            result = number_expr.runTests('''
                # stray character
                100Z
                # missing leading digit before '.'
                -.100
                # too many '.'
                3.14.159
                ''', failureTests=True)
            print("Success" if result[0] else "Failed!")
        prints::
            # unsigned integer
            100
            [100]

            # negative integer
            -100
            [-100]

            # float with scientific notation
            6.02e23
            [6.02e+23]

            # integer with scientific notation
            1e-12
            [1e-12]

            Success
            
            # stray character
            100Z
               ^
            FAIL: Expected end of text (at char 3), (line:1, col:4)

            # missing leading digit before '.'
            -.100
            ^
            FAIL: Expected {real number with scientific notation | real number | signed integer} (at char 0), (line:1, col:1)

            # too many '.'
            3.14.159
                ^
            FAIL: Expected end of text (at char 4), (line:1, col:5)

            Success

        Each test string must be on a single line. If you want to test a string that spans multiple
        lines, create a test like this::

            expr.runTest(r"this is a test\\n of strings that spans \\n 3 lines")
        
        (Note that this is a raw string literal, you must include the leading 'r'.)
        """
        if isinstance(tests, basestring):
            tests = list(map(str.strip, tests.rstrip().splitlines()))
        if isinstance(comment, basestring):
            comment = Literal(comment)
        allResults = []
        comments = []
        success = True
        for t in tests:
            if (comment is not None and comment.matches(t, False)) or (comments and not t):
                comments.append(t)
                continue
            if not t:
                continue
            out = ['\n'.join(comments), t]
            comments = []
            try:
                t = t.replace(r'\n','\n')
                result = self.parseString(t, parseAll=parseAll)
                out.append(result.dump(full=fullDump))
                success = success and not failureTests
            except ParseBaseException as pe:
                fatal = "(FATAL)" if isinstance(pe, ParseFatalException) else ""
                if '\n' in t:
                    out.append(line(pe.loc, t))
                    out.append(' '*(col(pe.loc,t)-1) + '^' + fatal)
                else:
                    out.append(' '*pe.loc + '^' + fatal)
                out.append("FAIL: " + str(pe))
                success = success and failureTests
                result = pe
            except Exception as exc:
                out.append("FAIL-EXCEPTION: " + str(exc))
                success = success and failureTests
                result = exc

            if printResults:
                if fullDump:
                    out.append('')
                print('\n'.join(out))

            allResults.append((t, result))
        
        return success, allResults

        
class Token(ParserElement):
    """
    Abstract C{ParserElement} subclass, for defining atomic matching patterns.
    """
    def __init__( self ):
        super(Token,self).__init__( savelist=False )


class Empty(Token):
    """
    An empty token, will always match.
    """
    def __init__( self ):
        super(Empty,self).__init__()
        self.name = "Empty"
        self.mayReturnEmpty = True
        self.mayIndexError = False


class NoMatch(Token):
    """
    A token that will never match.
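
    A minimal sketch (hypothetical use as a failing placeholder)::

        placeholder = NoMatch()
        placeholder.parseString("anything")  # -> Exception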
    """
    def __init__( self ):
        super(NoMatch,self).__init__()
        self.name = "NoMatch"
        self.mayReturnEmpty = True
        self.mayIndexError = False
        self.errmsg = "Unmatchable token"

    def parseImpl( self, instring, loc, doActions=True ):
        raise ParseException(instring, loc, self.errmsg, self)


class Literal(Token):
    """
    Token to exactly match a specified string.
    
    Example::
        Literal('blah').parseString('blah')  # -> ['blah']
        Literal('blah').parseString('blahfooblah')  # -> ['blah']
        Literal('blah').parseString('bla')  # -> Exception: Expected "blah"
    
    For case-insensitive matching, use L{CaselessLiteral}.
    
    For keyword matching (force word break before and after the matched string),
    use L{Keyword} or L{CaselessKeyword}.
    """
    def __init__( self, matchString ):
        super(Literal,self).__init__()
        self.match = matchString
        self.matchLen = len(matchString)
        try:
            self.firstMatchChar = matchString[0]
        except IndexError:
            warnings.warn("null string passed to Literal; use Empty() instead",
                            SyntaxWarning, stacklevel=2)
            self.__class__ = Empty
        self.name = '"%s"' % _ustr(self.match)
        self.errmsg = "Expected " + self.name
        self.mayReturnEmpty = False
        self.mayIndexError = False

    # Performance tuning: this routine gets called a *lot*
    # if this is a single character match string  and the first character matches,
    # short-circuit as quickly as possible, and avoid calling startswith
    #~ @profile
    def parseImpl( self, instring, loc, doActions=True ):
        if (instring[loc] == self.firstMatchChar and
            (self.matchLen==1 or instring.startswith(self.match,loc)) ):
            return loc+self.matchLen, self.match
        raise ParseException(instring, loc, self.errmsg, self)
_L = Literal
ParserElement._literalStringClass = Literal

class Keyword(Token):
    """
    Token to exactly match a specified string as a keyword, that is, it must be
    immediately followed by a non-keyword character.  Compare with C{L{Literal}}:
     - C{Literal("if")} will match the leading C{'if'} in C{'ifAndOnlyIf'}.
     - C{Keyword("if")} will not; it will only match the leading C{'if'} in C{'if x=1'}, or C{'if(y==2)'}
    Accepts two optional constructor arguments in addition to the keyword string:
     - C{identChars} is a string of characters that would be valid identifier characters,
          defaulting to all alphanumerics + "_" and "$"
     - C{caseless} allows case-insensitive matching, default is C{False}.
       
    Example::
        Keyword("start").parseString("start")  # -> ['start']
        Keyword("start").parseString("starting")  # -> Exception

    For case-insensitive matching, use L{CaselessKeyword}.
    """
    DEFAULT_KEYWORD_CHARS = alphanums+"_$"

    def __init__( self, matchString, identChars=None, caseless=False ):
        super(Keyword,self).__init__()
        if identChars is None:
            identChars = Keyword.DEFAULT_KEYWORD_CHARS
        self.match = matchString
        self.matchLen = len(matchString)
        try:
            self.firstMatchChar = matchString[0]
        except IndexError:
            warnings.warn("null string passed to Keyword; use Empty() instead",
                            SyntaxWarning, stacklevel=2)
        self.name = '"%s"' % self.match
        self.errmsg = "Expected " + self.name
        self.mayReturnEmpty = False
        self.mayIndexError = False
        self.caseless = caseless
        if caseless:
            self.caselessmatch = matchString.upper()
            identChars = identChars.upper()
        self.identChars = set(identChars)

    def parseImpl( self, instring, loc, doActions=True ):
        if self.caseless:
            if ( (instring[ loc:loc+self.matchLen ].upper() == self.caselessmatch) and
                 (loc >= len(instring)-self.matchLen or instring[loc+self.matchLen].upper() not in self.identChars) and
                 (loc == 0 or instring[loc-1].upper() not in self.identChars) ):
                return loc+self.matchLen, self.match
        else:
            if (instring[loc] == self.firstMatchChar and
                (self.matchLen==1 or instring.startswith(self.match,loc)) and
                (loc >= len(instring)-self.matchLen or instring[loc+self.matchLen] not in self.identChars) and
                (loc == 0 or instring[loc-1] not in self.identChars) ):
                return loc+self.matchLen, self.match
        raise ParseException(instring, loc, self.errmsg, self)

    def copy(self):
        c = super(Keyword,self).copy()
        c.identChars = Keyword.DEFAULT_KEYWORD_CHARS
        return c

    @staticmethod
    def setDefaultKeywordChars( chars ):
        """Overrides the default Keyword chars
        """
        Keyword.DEFAULT_KEYWORD_CHARS = chars

class CaselessLiteral(Literal):
    """
    Token to match a specified string, ignoring case of letters.
    Note: the matched results will always be in the case of the given
    match string, NOT the case of the input text.

    Example::
        OneOrMore(CaselessLiteral("CMD")).parseString("cmd CMD Cmd10") # -> ['CMD', 'CMD', 'CMD']
        
    (Contrast with example for L{CaselessKeyword}.)
    """
    def __init__( self, matchString ):
        super(CaselessLiteral,self).__init__( matchString.upper() )
        # Preserve the defining literal.
        self.returnString = matchString
        self.name = "'%s'" % self.returnString
        self.errmsg = "Expected " + self.name

    def parseImpl( self, instring, loc, doActions=True ):
        if instring[ loc:loc+self.matchLen ].upper() == self.match:
            return loc+self.matchLen, self.returnString
        raise ParseException(instring, loc, self.errmsg, self)

class CaselessKeyword(Keyword):
    """
    Caseless version of L{Keyword}.

    Example::
        OneOrMore(CaselessKeyword("CMD")).parseString("cmd CMD Cmd10") # -> ['CMD', 'CMD']
        
    (Contrast with example for L{CaselessLiteral}.)
    """
    def __init__( self, matchString, identChars=None ):
        super(CaselessKeyword,self).__init__( matchString, identChars, caseless=True )

    def parseImpl( self, instring, loc, doActions=True ):
        if ( (instring[ loc:loc+self.matchLen ].upper() == self.caselessmatch) and
             (loc >= len(instring)-self.matchLen or instring[loc+self.matchLen].upper() not in self.identChars) ):
            return loc+self.matchLen, self.match
        raise ParseException(instring, loc, self.errmsg, self)

class CloseMatch(Token):
    """
    A variation on L{Literal} which matches "close" matches, that is, 
    strings with at most 'n' mismatching characters. C{CloseMatch} takes parameters:
     - C{match_string} - string to be matched
     - C{maxMismatches} - (default=C{1}) maximum number of mismatches allowed to count as a match
    
    The results from a successful parse will contain the matched text from the input string and the following named results:
     - C{mismatches} - a list of the positions within the match_string where mismatches were found
     - C{original} - the original match_string used to compare against the input string
    
    If C{mismatches} is an empty list, then the match was an exact match.
    
    Example::
        patt = CloseMatch("ATCATCGAATGGA")
        patt.parseString("ATCATCGAAXGGA") # -> (['ATCATCGAAXGGA'], {'mismatches': [[9]], 'original': ['ATCATCGAATGGA']})
        patt.parseString("ATCAXCGAAXGGA") # -> Exception: Expected 'ATCATCGAATGGA' (with up to 1 mismatches) (at char 0), (line:1, col:1)

        # exact match
        patt.parseString("ATCATCGAATGGA") # -> (['ATCATCGAATGGA'], {'mismatches': [[]], 'original': ['ATCATCGAATGGA']})

        # close match allowing up to 2 mismatches
        patt = CloseMatch("ATCATCGAATGGA", maxMismatches=2)
        patt.parseString("ATCAXCGAAXGGA") # -> (['ATCAXCGAAXGGA'], {'mismatches': [[4, 9]], 'original': ['ATCATCGAATGGA']})
    """
    def __init__(self, match_string, maxMismatches=1):
        super(CloseMatch,self).__init__()
        self.name = match_string
        self.match_string = match_string
        self.maxMismatches = maxMismatches
        self.errmsg = "Expected %r (with up to %d mismatches)" % (self.match_string, self.maxMismatches)
        self.mayIndexError = False
        self.mayReturnEmpty = False

    def parseImpl( self, instring, loc, doActions=True ):
        start = loc
        instrlen = len(instring)
        maxloc = start + len(self.match_string)

        if maxloc <= instrlen:
            match_string = self.match_string
            match_stringloc = 0
            mismatches = []
            maxMismatches = self.maxMismatches

            for match_stringloc,s_m in enumerate(zip(instring[loc:maxloc], self.match_string)):
                src,mat = s_m
                if src != mat:
                    mismatches.append(match_stringloc)
                    if len(mismatches) > maxMismatches:
                        break
            else:
                loc = start + match_stringloc + 1
                results = ParseResults([instring[start:loc]])
                results['original'] = self.match_string
                results['mismatches'] = mismatches
                return loc, results

        raise ParseException(instring, loc, self.errmsg, self)


class Word(Token):
    """
    Token for matching words composed of allowed character sets.
    Defined with string containing all allowed initial characters,
    an optional string containing allowed body characters (if omitted,
    defaults to the initial character set), and an optional minimum,
    maximum, and/or exact length.  The default value for C{min} is 1 (a
    minimum value < 1 is not valid); the default values for C{max} and C{exact}
    are 0, meaning no maximum or exact length restriction. An optional
    C{excludeChars} parameter can list characters that might be found in 
    the input C{bodyChars} string; useful to define a word of all printables
    except for one or two characters, for instance.
    
    L{srange} is useful for defining custom character set strings for defining 
    C{Word} expressions, using range notation from regular expression character sets.
    
    A common mistake is to use C{Word} to match a specific literal string, as in 
    C{Word("Address")}. Remember that C{Word} uses the string argument to define
    I{sets} of matchable characters. This expression would match "Add", "AAA",
    "dAred", or any other word made up of the characters 'A', 'd', 'r', 'e', and 's'.
    To match an exact literal string, use L{Literal} or L{Keyword}.

    pyparsing includes helper strings for building Words:
     - L{alphas}
     - L{nums}
     - L{alphanums}
     - L{hexnums}
     - L{alphas8bit} (alphabetic characters in the Latin-1 range 128-255 - accented, tilded, umlauted, etc.)
     - L{punc8bit} (non-alphabetic characters in the Latin-1 range 128-255 - currency, symbols, superscripts, diacriticals, etc.)
     - L{printables} (any non-whitespace character)

    Example::
        # a word composed of digits
        integer = Word(nums) # equivalent to Word("0123456789") or Word(srange("0-9"))
        
        # a word with a leading capital, and zero or more lowercase
        capital_word = Word(alphas.upper(), alphas.lower())

        # hostnames are alphanumeric, with leading alpha, and '-'
        hostname = Word(alphas, alphanums+'-')
        
        # roman numeral (not a strict parser, accepts invalid mix of characters)
        roman = Word("IVXLCDM")
        
        # any string of non-whitespace characters, except for ','
        csv_value = Word(printables, excludeChars=",")
    """
    def __init__( self, initChars, bodyChars=None, min=1, max=0, exact=0, asKeyword=False, excludeChars=None ):
        super(Word,self).__init__()
        if excludeChars:
            initChars = ''.join(c for c in initChars if c not in excludeChars)
            if bodyChars:
                bodyChars = ''.join(c for c in bodyChars if c not in excludeChars)
        self.initCharsOrig = initChars
        self.initChars = set(initChars)
        if bodyChars :
            self.bodyCharsOrig = bodyChars
            self.bodyChars = set(bodyChars)
        else:
            self.bodyCharsOrig = initChars
            self.bodyChars = set(initChars)

        self.maxSpecified = max > 0

        if min < 1:
            raise ValueError("cannot specify a minimum length < 1; use Optional(Word()) if zero-length word is permitted")

        self.minLen = min

        if max > 0:
            self.maxLen = max
        else:
            self.maxLen = _MAX_INT

        if exact > 0:
            self.maxLen = exact
            self.minLen = exact

        self.name = _ustr(self)
        self.errmsg = "Expected " + self.name
        self.mayIndexError = False
        self.asKeyword = asKeyword

        if ' ' not in self.initCharsOrig+self.bodyCharsOrig and (min==1 and max==0 and exact==0):
            if self.bodyCharsOrig == self.initCharsOrig:
                self.reString = "[%s]+" % _escapeRegexRangeChars(self.initCharsOrig)
            elif len(self.initCharsOrig) == 1:
                self.reString = "%s[%s]*" % \
                                      (re.escape(self.initCharsOrig),
                                      _escapeRegexRangeChars(self.bodyCharsOrig),)
            else:
                self.reString = "[%s][%s]*" % \
                                      (_escapeRegexRangeChars(self.initCharsOrig),
                                      _escapeRegexRangeChars(self.bodyCharsOrig),)
            if self.asKeyword:
                self.reString = r"\b"+self.reString+r"\b"
            try:
                self.re = re.compile( self.reString )
            except Exception:
                self.re = None

    def parseImpl( self, instring, loc, doActions=True ):
        if self.re:
            result = self.re.match(instring,loc)
            if not result:
                raise ParseException(instring, loc, self.errmsg, self)

            loc = result.end()
            return loc, result.group()

        if instring[loc] not in self.initChars:
            raise ParseException(instring, loc, self.errmsg, self)

        start = loc
        loc += 1
        instrlen = len(instring)
        bodychars = self.bodyChars
        maxloc = start + self.maxLen
        maxloc = min( maxloc, instrlen )
        while loc < maxloc and instring[loc] in bodychars:
            loc += 1

        throwException = False
        if loc - start < self.minLen:
            throwException = True
        if self.maxSpecified and loc < instrlen and instring[loc] in bodychars:
            throwException = True
        if self.asKeyword:
            if (start>0 and instring[start-1] in bodychars) or (loc<instrlen and instring[loc] in bodychars):
                throwException = True

        if throwException:
            raise ParseException(instring, loc, self.errmsg, self)

        return loc, instring[start:loc]

    def __str__( self ):
        try:
            return super(Word,self).__str__()
        except Exception:
            pass


        if self.strRepr is None:

            def charsAsStr(s):
                if len(s)>4:
                    return s[:4]+"..."
                else:
                    return s

            if ( self.initCharsOrig != self.bodyCharsOrig ):
                self.strRepr = "W:(%s,%s)" % ( charsAsStr(self.initCharsOrig), charsAsStr(self.bodyCharsOrig) )
            else:
                self.strRepr = "W:(%s)" % charsAsStr(self.initCharsOrig)

        return self.strRepr


class Regex(Token):
    """
    Token for matching strings that match a given regular expression.
    Defined with a string specifying the regular expression, in a form recognized by the built-in Python C{re} module.
    If the given regex contains named groups (defined using C{(?P<name>...)}), these will be preserved as 
    named parse results.

    Example::
        realnum = Regex(r"[+-]?\d+\.\d*")
        date = Regex(r'(?P<year>\d{4})-(?P<month>\d\d?)-(?P<day>\d\d?)')
        # ref: http://stackoverflow.com/questions/267399/how-do-you-match-only-valid-roman-numerals-with-a-regular-expression
        roman = Regex(r"M{0,4}(CM|CD|D?C{0,3})(XC|XL|L?X{0,3})(IX|IV|V?I{0,3})")
    """
    compiledREtype = type(re.compile("[A-Z]"))
    def __init__( self, pattern, flags=0):
        """The parameters C{pattern} and C{flags} are passed to the C{re.compile()} function as-is. See the Python C{re} module for an explanation of the acceptable patterns and flags."""
        super(Regex,self).__init__()

        if isinstance(pattern, basestring):
            if not pattern:
                warnings.warn("null string passed to Regex; use Empty() instead",
                        SyntaxWarning, stacklevel=2)

            self.pattern = pattern
            self.flags = flags

            try:
                self.re = re.compile(self.pattern, self.flags)
                self.reString = self.pattern
            except sre_constants.error:
                warnings.warn("invalid pattern (%s) passed to Regex" % pattern,
                    SyntaxWarning, stacklevel=2)
                raise

        elif isinstance(pattern, Regex.compiledREtype):
            self.re = pattern
            self.pattern = \
            self.reString = str(pattern)
            self.flags = flags
            
        else:
            raise ValueError("Regex may only be constructed with a string or a compiled RE object")

        self.name = _ustr(self)
        self.errmsg = "Expected " + self.name
        self.mayIndexError = False
        self.mayReturnEmpty = True

    def parseImpl( self, instring, loc, doActions=True ):
        result = self.re.match(instring,loc)
        if not result:
            raise ParseException(instring, loc, self.errmsg, self)

        loc = result.end()
        d = result.groupdict()
        ret = ParseResults(result.group())
        if d:
            for k in d:
                ret[k] = d[k]
        return loc,ret

    def __str__( self ):
        try:
            return super(Regex,self).__str__()
        except Exception:
            pass

        if self.strRepr is None:
            self.strRepr = "Re:(%s)" % repr(self.pattern)

        return self.strRepr


class QuotedString(Token):
    r"""
    Token for matching strings that are delimited by quoting characters.
    
    Defined with the following parameters:
        - quoteChar - string of one or more characters defining the quote delimiting string
        - escChar - character to escape quotes, typically backslash (default=C{None})
        - escQuote - special quote sequence to escape an embedded quote string (such as SQL's "" to escape an embedded ") (default=C{None})
        - multiline - boolean indicating whether quotes can span multiple lines (default=C{False})
        - unquoteResults - boolean indicating whether the matched text should be unquoted (default=C{True})
        - endQuoteChar - string of one or more characters defining the end of the quote delimited string (default=C{None} => same as quoteChar)
        - convertWhitespaceEscapes - convert escaped whitespace (C{'\t'}, C{'\n'}, etc.) to actual whitespace (default=C{True})

    Example::
        qs = QuotedString('"')
        print(qs.searchString('lsjdf "This is the quote" sldjf'))
        complex_qs = QuotedString('{{', endQuoteChar='}}')
        print(complex_qs.searchString('lsjdf {{This is the "quote"}} sldjf'))
        sql_qs = QuotedString('"', escQuote='""')
        print(sql_qs.searchString('lsjdf "This is the quote with ""embedded"" quotes" sldjf'))
    prints::
        [['This is the quote']]
        [['This is the "quote"']]
        [['This is the quote with "embedded" quotes']]
    """
    def __init__( self, quoteChar, escChar=None, escQuote=None, multiline=False, unquoteResults=True, endQuoteChar=None, convertWhitespaceEscapes=True):
        super(QuotedString,self).__init__()

        # remove white space from quote chars - won't work anyway
        quoteChar = quoteChar.strip()
        if not quoteChar:
            warnings.warn("quoteChar cannot be the empty string",SyntaxWarning,stacklevel=2)
            raise SyntaxError()

        if endQuoteChar is None:
            endQuoteChar = quoteChar
        else:
            endQuoteChar = endQuoteChar.strip()
            if not endQuoteChar:
                warnings.warn("endQuoteChar cannot be the empty string",SyntaxWarning,stacklevel=2)
                raise SyntaxError()

        self.quoteChar = quoteChar
        self.quoteCharLen = len(quoteChar)
        self.firstQuoteChar = quoteChar[0]
        self.endQuoteChar = endQuoteChar
        self.endQuoteCharLen = len(endQuoteChar)
        self.escChar = escChar
        self.escQuote = escQuote
        self.unquoteResults = unquoteResults
        self.convertWhitespaceEscapes = convertWhitespaceEscapes

        if multiline:
            self.flags = re.MULTILINE | re.DOTALL
            self.pattern = r'%s(?:[^%s%s]' % \
                ( re.escape(self.quoteChar),
                  _escapeRegexRangeChars(self.endQuoteChar[0]),
                  (escChar is not None and _escapeRegexRangeChars(escChar) or '') )
        else:
            self.flags = 0
            self.pattern = r'%s(?:[^%s\n\r%s]' % \
                ( re.escape(self.quoteChar),
                  _escapeRegexRangeChars(self.endQuoteChar[0]),
                  (escChar is not None and _escapeRegexRangeChars(escChar) or '') )
        if len(self.endQuoteChar) > 1:
            self.pattern += (
                '|(?:' + ')|(?:'.join("%s[^%s]" % (re.escape(self.endQuoteChar[:i]),
                                               _escapeRegexRangeChars(self.endQuoteChar[i]))
                                    for i in range(len(self.endQuoteChar)-1,0,-1)) + ')'
                )
        if escQuote:
            self.pattern += (r'|(?:%s)' % re.escape(escQuote))
        if escChar:
            self.pattern += (r'|(?:%s.)' % re.escape(escChar))
            self.escCharReplacePattern = re.escape(self.escChar)+"(.)"
        self.pattern += (r')*%s' % re.escape(self.endQuoteChar))

        try:
            self.re = re.compile(self.pattern, self.flags)
            self.reString = self.pattern
        except sre_constants.error:
            warnings.warn("invalid pattern (%s) passed to Regex" % self.pattern,
                SyntaxWarning, stacklevel=2)
            raise

        self.name = _ustr(self)
        self.errmsg = "Expected " + self.name
        self.mayIndexError = False
        self.mayReturnEmpty = True

    def parseImpl( self, instring, loc, doActions=True ):
        result = instring[loc] == self.firstQuoteChar and self.re.match(instring,loc) or None
        if not result:
            raise ParseException(instring, loc, self.errmsg, self)

        loc = result.end()
        ret = result.group()

        if self.unquoteResults:

            # strip off quotes
            ret = ret[self.quoteCharLen:-self.endQuoteCharLen]

            if isinstance(ret,basestring):
                # replace escaped whitespace
                if '\\' in ret and self.convertWhitespaceEscapes:
                    ws_map = {
                        r'\t' : '\t',
                        r'\n' : '\n',
                        r'\f' : '\f',
                        r'\r' : '\r',
                    }
                    for wslit,wschar in ws_map.items():
                        ret = ret.replace(wslit, wschar)

                # replace escaped characters
                if self.escChar:
                    ret = re.sub(self.escCharReplacePattern, r"\g<1>", ret)

                # replace escaped quotes
                if self.escQuote:
                    ret = ret.replace(self.escQuote, self.endQuoteChar)

        return loc, ret

    def __str__( self ):
        try:
            return super(QuotedString,self).__str__()
        except Exception:
            pass

        if self.strRepr is None:
            self.strRepr = "quoted string, starting with %s ending with %s" % (self.quoteChar, self.endQuoteChar)

        return self.strRepr


class CharsNotIn(Token):
    """
    Token for matching words composed of characters I{not} in a given set (will
    include whitespace in matched characters if not listed in the provided exclusion set - see example).
    Defined with string containing all disallowed characters, and an optional
    minimum, maximum, and/or exact length.  The default value for C{min} is 1 (a
    minimum value < 1 is not valid); the default values for C{max} and C{exact}
    are 0, meaning no maximum or exact length restriction.

    Example::
        # define a comma-separated-value as anything that is not a ','
        csv_value = CharsNotIn(',')
        print(delimitedList(csv_value).parseString("dkls,lsdkjf,s12 34,@!#,213"))
    prints::
        ['dkls', 'lsdkjf', 's12 34', '@!#', '213']
    """
    def __init__( self, notChars, min=1, max=0, exact=0 ):
        super(CharsNotIn,self).__init__()
        self.skipWhitespace = False
        self.notChars = notChars

        if min < 1:
            raise ValueError("cannot specify a minimum length < 1; use Optional(CharsNotIn()) if zero-length char group is permitted")

        self.minLen = min

        if max > 0:
            self.maxLen = max
        else:
            self.maxLen = _MAX_INT

        if exact > 0:
            self.maxLen = exact
            self.minLen = exact

        self.name = _ustr(self)
        self.errmsg = "Expected " + self.name
        self.mayReturnEmpty = ( self.minLen == 0 )
        self.mayIndexError = False

    def parseImpl( self, instring, loc, doActions=True ):
        if instring[loc] in self.notChars:
            raise ParseException(instring, loc, self.errmsg, self)

        start = loc
        loc += 1
        notchars = self.notChars
        maxlen = min( start+self.maxLen, len(instring) )
        while loc < maxlen and \
              (instring[loc] not in notchars):
            loc += 1

        if loc - start < self.minLen:
            raise ParseException(instring, loc, self.errmsg, self)

        return loc, instring[start:loc]

    def __str__( self ):
        try:
            return super(CharsNotIn, self).__str__()
        except Exception:
            pass

        if self.strRepr is None:
            if len(self.notChars) > 4:
                self.strRepr = "!W:(%s...)" % self.notChars[:4]
            else:
                self.strRepr = "!W:(%s)" % self.notChars

        return self.strRepr

class White(Token):
    """
    Special matching class for matching whitespace.  Normally, whitespace is ignored
    by pyparsing grammars.  This class is included when some whitespace structures
    are significant.  Define with a string containing the whitespace characters to be
    matched; default is C{" \\t\\r\\n"}.  Also takes optional C{min}, C{max}, and C{exact} arguments,
    as defined for the C{L{Word}} class.
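
    A minimal sketch (hypothetical grammar) capturing the run of spaces between two
    words as its own token::

        expr = Word(alphas) + White(" ") + Word(alphas)
        expr.parseString("abc   def")  # -> ['abc', '   ', 'def']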
    """
    whiteStrs = {
        " " : "<SPC>",
        "\t": "<TAB>",
        "\n": "<LF>",
        "\r": "<CR>",
        "\f": "<FF>",
        }
    def __init__(self, ws=" \t\r\n", min=1, max=0, exact=0):
        super(White,self).__init__()
        self.matchWhite = ws
        self.setWhitespaceChars( "".join(c for c in self.whiteChars if c not in self.matchWhite) )
        #~ self.leaveWhitespace()
        self.name = ("".join(White.whiteStrs[c] for c in self.matchWhite))
        self.mayReturnEmpty = True
        self.errmsg = "Expected " + self.name

        self.minLen = min

        if max > 0:
            self.maxLen = max
        else:
            self.maxLen = _MAX_INT

        if exact > 0:
            self.maxLen = exact
            self.minLen = exact

    def parseImpl( self, instring, loc, doActions=True ):
        if not(instring[ loc ] in self.matchWhite):
            raise ParseException(instring, loc, self.errmsg, self)
        start = loc
        loc += 1
        maxloc = start + self.maxLen
        maxloc = min( maxloc, len(instring) )
        while loc < maxloc and instring[loc] in self.matchWhite:
            loc += 1

        if loc - start < self.minLen:
            raise ParseException(instring, loc, self.errmsg, self)

        return loc, instring[start:loc]


class _PositionToken(Token):
    def __init__( self ):
        super(_PositionToken,self).__init__()
        self.name=self.__class__.__name__
        self.mayReturnEmpty = True
        self.mayIndexError = False

class GoToColumn(_PositionToken):
    """
    Token to advance to a specific column of input text; useful for tabular report scraping.
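
    Example::
        # a minimal sketch: scrape the field that starts in column 20 of a fixed-width report line
        # (LineStart and restOfLine are the usual pyparsing names defined elsewhere in this module)
        col20_field = LineStart() + GoToColumn(20).suppress() + restOfLine("field")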
    """
    def __init__( self, colno ):
        super(GoToColumn,self).__init__()
        self.col = colno

    def preParse( self, instring, loc ):
        if col(loc,instring) != self.col:
            instrlen = len(instring)
            if self.ignoreExprs:
                loc = self._skipIgnorables( instring, loc )
            while loc < instrlen and instring[loc].isspace() and col( loc, instring ) != self.col :
                loc += 1
        return loc

    def parseImpl( self, instring, loc, doActions=True ):
        thiscol = col( loc, instring )
        if thiscol > self.col:
            raise ParseException( instring, loc, "Text not in expected column", self )
        newloc = loc + self.col - thiscol
        ret = instring[ loc: newloc ]
        return newloc, ret


class LineStart(_PositionToken):
    """
    Matches if current position is at the beginning of a line within the parse string
    
    Example::
    
        test = '''\
        AAA this line
        AAA and this line
          AAA but not this one
        B AAA and definitely not this one
        '''

        for t in (LineStart() + 'AAA' + restOfLine).searchString(test):
            print(t)
    
    Prints::
        ['AAA', ' this line']
        ['AAA', ' and this line']    

    """
    def __init__( self ):
        super(LineStart,self).__init__()
        self.errmsg = "Expected start of line"

    def parseImpl( self, instring, loc, doActions=True ):
        if col(loc, instring) == 1:
            return loc, []
        raise ParseException(instring, loc, self.errmsg, self)

class LineEnd(_PositionToken):
    """
    Matches if current position is at the end of a line within the parse string
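
    Example::
        # a minimal sketch: a shell-style comment that runs to the end of the current line
        # (Literal and SkipTo are defined elsewhere in this module)
        comment = Literal("#") + SkipTo(LineEnd())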
    """
    def __init__( self ):
        super(LineEnd,self).__init__()
        self.setWhitespaceChars( ParserElement.DEFAULT_WHITE_CHARS.replace("\n","") )
        self.errmsg = "Expected end of line"

    def parseImpl( self, instring, loc, doActions=True ):
        if loc<len(instring):
            if instring[loc] == "\n":
                return loc+1, "\n"
            else:
                raise ParseException(instring, loc, self.errmsg, self)
        elif loc == len(instring):
            return loc+1, []
        else:
            raise ParseException(instring, loc, self.errmsg, self)

class StringStart(_PositionToken):
    """
    Matches if current position is at the beginning of the parse string
    """
    def __init__( self ):
        super(StringStart,self).__init__()
        self.errmsg = "Expected start of text"

    def parseImpl( self, instring, loc, doActions=True ):
        if loc != 0:
            # see if entire string up to here is just whitespace and ignoreables
            if loc != self.preParse( instring, 0 ):
                raise ParseException(instring, loc, self.errmsg, self)
        return loc, []

class StringEnd(_PositionToken):
    """
    Matches if current position is at the end of the parse string
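
    Example::
        # a minimal sketch: require the grammar to consume the entire input string
        # (delimitedList, Word and nums are defined elsewhere in this module)
        csv_ints = delimitedList(Word(nums)) + StringEnd()
        csv_ints.parseString("1, 2, 3")     # -> ['1', '2', '3']
        # csv_ints.parseString("1, 2, x")   # raises ParseException: Expected end of text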
    """
    def __init__( self ):
        super(StringEnd,self).__init__()
        self.errmsg = "Expected end of text"

    def parseImpl( self, instring, loc, doActions=True ):
        if loc < len(instring):
            raise ParseException(instring, loc, self.errmsg, self)
        elif loc == len(instring):
            return loc+1, []
        elif loc > len(instring):
            return loc, []
        else:
            raise ParseException(instring, loc, self.errmsg, self)

class WordStart(_PositionToken):
    """
    Matches if the current position is at the beginning of a Word, and
    is not preceded by any character in a given set of C{wordChars}
    (default=C{printables}). To emulate the C{\\b} behavior of regular expressions,
    use C{WordStart(alphanums)}. C{WordStart} will also match at the beginning of
    the string being parsed, or at the beginning of a line.
    """
    def __init__(self, wordChars = printables):
        super(WordStart,self).__init__()
        self.wordChars = set(wordChars)
        self.errmsg = "Not at the start of a word"

    def parseImpl(self, instring, loc, doActions=True ):
        if loc != 0:
            if (instring[loc-1] in self.wordChars or
                instring[loc] not in self.wordChars):
                raise ParseException(instring, loc, self.errmsg, self)
        return loc, []

class WordEnd(_PositionToken):
    """
    Matches if the current position is at the end of a Word, and
    is not followed by any character in a given set of C{wordChars}
    (default=C{printables}). To emulate the C{\\b} behavior of regular expressions,
    use C{WordEnd(alphanums)}. C{WordEnd} will also match at the end of
    the string being parsed, or at the end of a line.
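
    Example::
        # a minimal sketch: match 'cat' only as a whole word, mimicking word-boundary regex anchors
        # (Literal and alphanums are the usual pyparsing names defined elsewhere in this module)
        whole_cat = WordStart(alphanums) + Literal("cat") + WordEnd(alphanums)
        print(whole_cat.searchString("cat catalog concat bobcat"))  # -> [['cat']]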
    """
    def __init__(self, wordChars = printables):
        super(WordEnd,self).__init__()
        self.wordChars = set(wordChars)
        self.skipWhitespace = False
        self.errmsg = "Not at the end of a word"

    def parseImpl(self, instring, loc, doActions=True ):
        instrlen = len(instring)
        if instrlen>0 and loc<instrlen:
            if (instring[loc] in self.wordChars or
                instring[loc-1] not in self.wordChars):
                raise ParseException(instring, loc, self.errmsg, self)
        return loc, []


class ParseExpression(ParserElement):
    """
    Abstract subclass of ParserElement, for combining and post-processing parsed tokens.
    """
    def __init__( self, exprs, savelist = False ):
        super(ParseExpression,self).__init__(savelist)
        if isinstance( exprs, _generatorType ):
            exprs = list(exprs)

        if isinstance( exprs, basestring ):
            self.exprs = [ ParserElement._literalStringClass( exprs ) ]
        elif isinstance( exprs, collections.Iterable ):
            exprs = list(exprs)
            # if sequence of strings provided, wrap with Literal
            if all(isinstance(expr, basestring) for expr in exprs):
                exprs = map(ParserElement._literalStringClass, exprs)
            self.exprs = list(exprs)
        else:
            try:
                self.exprs = list( exprs )
            except TypeError:
                self.exprs = [ exprs ]
        self.callPreparse = False

    def __getitem__( self, i ):
        return self.exprs[i]

    def append( self, other ):
        self.exprs.append( other )
        self.strRepr = None
        return self

    def leaveWhitespace( self ):
        """Extends C{leaveWhitespace} defined in base class, and also invokes C{leaveWhitespace} on
           all contained expressions."""
        self.skipWhitespace = False
        self.exprs = [ e.copy() for e in self.exprs ]
        for e in self.exprs:
            e.leaveWhitespace()
        return self

    def ignore( self, other ):
        if isinstance( other, Suppress ):
            if other not in self.ignoreExprs:
                super( ParseExpression, self).ignore( other )
                for e in self.exprs:
                    e.ignore( self.ignoreExprs[-1] )
        else:
            super( ParseExpression, self).ignore( other )
            for e in self.exprs:
                e.ignore( self.ignoreExprs[-1] )
        return self

    def __str__( self ):
        try:
            return super(ParseExpression,self).__str__()
        except Exception:
            pass

        if self.strRepr is None:
            self.strRepr = "%s:(%s)" % ( self.__class__.__name__, _ustr(self.exprs) )
        return self.strRepr

    def streamline( self ):
        super(ParseExpression,self).streamline()

        for e in self.exprs:
            e.streamline()

        # collapse nested And's of the form And( And( And( a,b), c), d) to And( a,b,c,d )
        # but only if there are no parse actions or resultsNames on the nested And's
        # (likewise for Or's and MatchFirst's)
        if ( len(self.exprs) == 2 ):
            other = self.exprs[0]
            if ( isinstance( other, self.__class__ ) and
                  not(other.parseAction) and
                  other.resultsName is None and
                  not other.debug ):
                self.exprs = other.exprs[:] + [ self.exprs[1] ]
                self.strRepr = None
                self.mayReturnEmpty |= other.mayReturnEmpty
                self.mayIndexError  |= other.mayIndexError

            other = self.exprs[-1]
            if ( isinstance( other, self.__class__ ) and
                  not(other.parseAction) and
                  other.resultsName is None and
                  not other.debug ):
                self.exprs = self.exprs[:-1] + other.exprs[:]
                self.strRepr = None
                self.mayReturnEmpty |= other.mayReturnEmpty
                self.mayIndexError  |= other.mayIndexError

        self.errmsg = "Expected " + _ustr(self)
        
        return self

    def setResultsName( self, name, listAllMatches=False ):
        ret = super(ParseExpression,self).setResultsName(name,listAllMatches)
        return ret

    def validate( self, validateTrace=[] ):
        tmp = validateTrace[:]+[self]
        for e in self.exprs:
            e.validate(tmp)
        self.checkRecursion( [] )
        
    def copy(self):
        ret = super(ParseExpression,self).copy()
        ret.exprs = [e.copy() for e in self.exprs]
        return ret

class And(ParseExpression):
    """
    Requires all given C{ParseExpression}s to be found in the given order.
    Expressions may be separated by whitespace.
    May be constructed using the C{'+'} operator.
    May also be constructed using the C{'-'} operator, which will suppress backtracking.

    Example::
        integer = Word(nums)
        name_expr = OneOrMore(Word(alphas))

        expr = And([integer("id"),name_expr("name"),integer("age")])
        # more easily written as:
        expr = integer("id") + name_expr("name") + integer("age")
    """

    class _ErrorStop(Empty):
        def __init__(self, *args, **kwargs):
            super(And._ErrorStop,self).__init__(*args, **kwargs)
            self.name = '-'
            self.leaveWhitespace()

    def __init__( self, exprs, savelist = True ):
        super(And,self).__init__(exprs, savelist)
        self.mayReturnEmpty = all(e.mayReturnEmpty for e in self.exprs)
        self.setWhitespaceChars( self.exprs[0].whiteChars )
        self.skipWhitespace = self.exprs[0].skipWhitespace
        self.callPreparse = True

    def parseImpl( self, instring, loc, doActions=True ):
        # pass False as last arg to _parse for first element, since we already
        # pre-parsed the string as part of our And pre-parsing
        loc, resultlist = self.exprs[0]._parse( instring, loc, doActions, callPreParse=False )
        errorStop = False
        for e in self.exprs[1:]:
            if isinstance(e, And._ErrorStop):
                errorStop = True
                continue
            if errorStop:
                try:
                    loc, exprtokens = e._parse( instring, loc, doActions )
                except ParseSyntaxException:
                    raise
                except ParseBaseException as pe:
                    pe.__traceback__ = None
                    raise ParseSyntaxException._from_exception(pe)
                except IndexError:
                    raise ParseSyntaxException(instring, len(instring), self.errmsg, self)
            else:
                loc, exprtokens = e._parse( instring, loc, doActions )
            if exprtokens or exprtokens.haskeys():
                resultlist += exprtokens
        return loc, resultlist

    def __iadd__(self, other ):
        if isinstance( other, basestring ):
            other = ParserElement._literalStringClass( other )
        return self.append( other ) #And( [ self, other ] )

    def checkRecursion( self, parseElementList ):
        subRecCheckList = parseElementList[:] + [ self ]
        for e in self.exprs:
            e.checkRecursion( subRecCheckList )
            if not e.mayReturnEmpty:
                break

    def __str__( self ):
        if hasattr(self,"name"):
            return self.name

        if self.strRepr is None:
            self.strRepr = "{" + " ".join(_ustr(e) for e in self.exprs) + "}"

        return self.strRepr


class Or(ParseExpression):
    """
    Requires that at least one C{ParseExpression} is found.
    If two expressions match, the expression that matches the longest string will be used.
    May be constructed using the C{'^'} operator.

    Example::
        # construct Or using '^' operator
        
        number = Word(nums) ^ Combine(Word(nums) + '.' + Word(nums))
        print(number.searchString("123 3.1416 789"))
    prints::
        [['123'], ['3.1416'], ['789']]
    """
    def __init__( self, exprs, savelist = False ):
        super(Or,self).__init__(exprs, savelist)
        if self.exprs:
            self.mayReturnEmpty = any(e.mayReturnEmpty for e in self.exprs)
        else:
            self.mayReturnEmpty = True

    def parseImpl( self, instring, loc, doActions=True ):
        maxExcLoc = -1
        maxException = None
        matches = []
        for e in self.exprs:
            try:
                loc2 = e.tryParse( instring, loc )
            except ParseException as err:
                err.__traceback__ = None
                if err.loc > maxExcLoc:
                    maxException = err
                    maxExcLoc = err.loc
            except IndexError:
                if len(instring) > maxExcLoc:
                    maxException = ParseException(instring,len(instring),e.errmsg,self)
                    maxExcLoc = len(instring)
            else:
                # save match among all matches, to retry longest to shortest
                matches.append((loc2, e))

        if matches:
            matches.sort(key=lambda x: -x[0])
            for _,e in matches:
                try:
                    return e._parse( instring, loc, doActions )
                except ParseException as err:
                    err.__traceback__ = None
                    if err.loc > maxExcLoc:
                        maxException = err
                        maxExcLoc = err.loc

        if maxException is not None:
            maxException.msg = self.errmsg
            raise maxException
        else:
            raise ParseException(instring, loc, "no defined alternatives to match", self)


    def __ixor__(self, other ):
        if isinstance( other, basestring ):
            other = ParserElement._literalStringClass( other )
        return self.append( other ) #Or( [ self, other ] )

    def __str__( self ):
        if hasattr(self,"name"):
            return self.name

        if self.strRepr is None:
            self.strRepr = "{" + " ^ ".join(_ustr(e) for e in self.exprs) + "}"

        return self.strRepr

    def checkRecursion( self, parseElementList ):
        subRecCheckList = parseElementList[:] + [ self ]
        for e in self.exprs:
            e.checkRecursion( subRecCheckList )


class MatchFirst(ParseExpression):
    """
    Requires that at least one C{ParseExpression} is found.
    If two expressions match, the first one listed is the one that will match.
    May be constructed using the C{'|'} operator.

    Example::
        # construct MatchFirst using '|' operator
        
        # watch the order of expressions to match
        number = Word(nums) | Combine(Word(nums) + '.' + Word(nums))
        print(number.searchString("123 3.1416 789")) #  Fail! -> [['123'], ['3'], ['1416'], ['789']]

        # put more selective expression first
        number = Combine(Word(nums) + '.' + Word(nums)) | Word(nums)
        print(number.searchString("123 3.1416 789")) #  Better -> [['123'], ['3.1416'], ['789']]
    """
    def __init__( self, exprs, savelist = False ):
        super(MatchFirst,self).__init__(exprs, savelist)
        if self.exprs:
            self.mayReturnEmpty = any(e.mayReturnEmpty for e in self.exprs)
        else:
            self.mayReturnEmpty = True

    def parseImpl( self, instring, loc, doActions=True ):
        maxExcLoc = -1
        maxException = None
        for e in self.exprs:
            try:
                ret = e._parse( instring, loc, doActions )
                return ret
            except ParseException as err:
                if err.loc > maxExcLoc:
                    maxException = err
                    maxExcLoc = err.loc
            except IndexError:
                if len(instring) > maxExcLoc:
                    maxException = ParseException(instring,len(instring),e.errmsg,self)
                    maxExcLoc = len(instring)

        # only got here if no expression matched, raise exception for match that made it the furthest
        else:
            if maxException is not None:
                maxException.msg = self.errmsg
                raise maxException
            else:
                raise ParseException(instring, loc, "no defined alternatives to match", self)

    def __ior__(self, other ):
        if isinstance( other, basestring ):
            other = ParserElement._literalStringClass( other )
        return self.append( other ) #MatchFirst( [ self, other ] )

    def __str__( self ):
        if hasattr(self,"name"):
            return self.name

        if self.strRepr is None:
            self.strRepr = "{" + " | ".join(_ustr(e) for e in self.exprs) + "}"

        return self.strRepr

    def checkRecursion( self, parseElementList ):
        subRecCheckList = parseElementList[:] + [ self ]
        for e in self.exprs:
            e.checkRecursion( subRecCheckList )


class Each(ParseExpression):
    """
    Requires all given C{ParseExpression}s to be found, but in any order.
    Expressions may be separated by whitespace.
    May be constructed using the C{'&'} operator.

    Example::
        color = oneOf("RED ORANGE YELLOW GREEN BLUE PURPLE BLACK WHITE BROWN")
        shape_type = oneOf("SQUARE CIRCLE TRIANGLE STAR HEXAGON OCTAGON")
        integer = Word(nums)
        shape_attr = "shape:" + shape_type("shape")
        posn_attr = "posn:" + Group(integer("x") + ',' + integer("y"))("posn")
        color_attr = "color:" + color("color")
        size_attr = "size:" + integer("size")

        # use Each (using operator '&') to accept attributes in any order 
        # (shape and posn are required, color and size are optional)
        shape_spec = shape_attr & posn_attr & Optional(color_attr) & Optional(size_attr)

        shape_spec.runTests('''
            shape: SQUARE color: BLACK posn: 100, 120
            shape: CIRCLE size: 50 color: BLUE posn: 50,80
            color:GREEN size:20 shape:TRIANGLE posn:20,40
            '''
            )
    prints::
        shape: SQUARE color: BLACK posn: 100, 120
        ['shape:', 'SQUARE', 'color:', 'BLACK', 'posn:', ['100', ',', '120']]
        - color: BLACK
        - posn: ['100', ',', '120']
          - x: 100
          - y: 120
        - shape: SQUARE


        shape: CIRCLE size: 50 color: BLUE posn: 50,80
        ['shape:', 'CIRCLE', 'size:', '50', 'color:', 'BLUE', 'posn:', ['50', ',', '80']]
        - color: BLUE
        - posn: ['50', ',', '80']
          - x: 50
          - y: 80
        - shape: CIRCLE
        - size: 50


        color: GREEN size: 20 shape: TRIANGLE posn: 20,40
        ['color:', 'GREEN', 'size:', '20', 'shape:', 'TRIANGLE', 'posn:', ['20', ',', '40']]
        - color: GREEN
        - posn: ['20', ',', '40']
          - x: 20
          - y: 40
        - shape: TRIANGLE
        - size: 20
    """
    def __init__( self, exprs, savelist = True ):
        super(Each,self).__init__(exprs, savelist)
        self.mayReturnEmpty = all(e.mayReturnEmpty for e in self.exprs)
        self.skipWhitespace = True
        self.initExprGroups = True

    def parseImpl( self, instring, loc, doActions=True ):
        if self.initExprGroups:
            self.opt1map = dict((id(e.expr),e) for e in self.exprs if isinstance(e,Optional))
            opt1 = [ e.expr for e in self.exprs if isinstance(e,Optional) ]
            opt2 = [ e for e in self.exprs if e.mayReturnEmpty and not isinstance(e,Optional)]
            self.optionals = opt1 + opt2
            self.multioptionals = [ e.expr for e in self.exprs if isinstance(e,ZeroOrMore) ]
            self.multirequired = [ e.expr for e in self.exprs if isinstance(e,OneOrMore) ]
            self.required = [ e for e in self.exprs if not isinstance(e,(Optional,ZeroOrMore,OneOrMore)) ]
            self.required += self.multirequired
            self.initExprGroups = False
        tmpLoc = loc
        tmpReqd = self.required[:]
        tmpOpt  = self.optionals[:]
        matchOrder = []

        keepMatching = True
        while keepMatching:
            tmpExprs = tmpReqd + tmpOpt + self.multioptionals + self.multirequired
            failed = []
            for e in tmpExprs:
                try:
                    tmpLoc = e.tryParse( instring, tmpLoc )
                except ParseException:
                    failed.append(e)
                else:
                    matchOrder.append(self.opt1map.get(id(e),e))
                    if e in tmpReqd:
                        tmpReqd.remove(e)
                    elif e in tmpOpt:
                        tmpOpt.remove(e)
            if len(failed) == len(tmpExprs):
                keepMatching = False

        if tmpReqd:
            missing = ", ".join(_ustr(e) for e in tmpReqd)
            raise ParseException(instring,loc,"Missing one or more required elements (%s)" % missing )

        # add any unmatched Optionals, in case they have default values defined
        matchOrder += [e for e in self.exprs if isinstance(e,Optional) and e.expr in tmpOpt]

        resultlist = []
        for e in matchOrder:
            loc,results = e._parse(instring,loc,doActions)
            resultlist.append(results)

        finalResults = sum(resultlist, ParseResults([]))
        return loc, finalResults

    def __str__( self ):
        if hasattr(self,"name"):
            return self.name

        if self.strRepr is None:
            self.strRepr = "{" + " & ".join(_ustr(e) for e in self.exprs) + "}"

        return self.strRepr

    def checkRecursion( self, parseElementList ):
        subRecCheckList = parseElementList[:] + [ self ]
        for e in self.exprs:
            e.checkRecursion( subRecCheckList )


class ParseElementEnhance(ParserElement):
    """
    Abstract subclass of C{ParserElement}, for combining and post-processing parsed tokens.
    """
    def __init__( self, expr, savelist=False ):
        super(ParseElementEnhance,self).__init__(savelist)
        if isinstance( expr, basestring ):
            if issubclass(ParserElement._literalStringClass, Token):
                expr = ParserElement._literalStringClass(expr)
            else:
                expr = ParserElement._literalStringClass(Literal(expr))
        self.expr = expr
        self.strRepr = None
        if expr is not None:
            self.mayIndexError = expr.mayIndexError
            self.mayReturnEmpty = expr.mayReturnEmpty
            self.setWhitespaceChars( expr.whiteChars )
            self.skipWhitespace = expr.skipWhitespace
            self.saveAsList = expr.saveAsList
            self.callPreparse = expr.callPreparse
            self.ignoreExprs.extend(expr.ignoreExprs)

    def parseImpl( self, instring, loc, doActions=True ):
        if self.expr is not None:
            return self.expr._parse( instring, loc, doActions, callPreParse=False )
        else:
            raise ParseException("",loc,self.errmsg,self)

    def leaveWhitespace( self ):
        self.skipWhitespace = False
        self.expr = self.expr.copy()
        if self.expr is not None:
            self.expr.leaveWhitespace()
        return self

    def ignore( self, other ):
        if isinstance( other, Suppress ):
            if other not in self.ignoreExprs:
                super( ParseElementEnhance, self).ignore( other )
                if self.expr is not None:
                    self.expr.ignore( self.ignoreExprs[-1] )
        else:
            super( ParseElementEnhance, self).ignore( other )
            if self.expr is not None:
                self.expr.ignore( self.ignoreExprs[-1] )
        return self

    def streamline( self ):
        super(ParseElementEnhance,self).streamline()
        if self.expr is not None:
            self.expr.streamline()
        return self

    def checkRecursion( self, parseElementList ):
        if self in parseElementList:
            raise RecursiveGrammarException( parseElementList+[self] )
        subRecCheckList = parseElementList[:] + [ self ]
        if self.expr is not None:
            self.expr.checkRecursion( subRecCheckList )

    def validate( self, validateTrace=[] ):
        tmp = validateTrace[:]+[self]
        if self.expr is not None:
            self.expr.validate(tmp)
        self.checkRecursion( [] )

    def __str__( self ):
        try:
            return super(ParseElementEnhance,self).__str__()
        except Exception:
            pass

        if self.strRepr is None and self.expr is not None:
            self.strRepr = "%s:(%s)" % ( self.__class__.__name__, _ustr(self.expr) )
        return self.strRepr


class FollowedBy(ParseElementEnhance):
    """
    Lookahead matching of the given parse expression.  C{FollowedBy}
    does I{not} advance the parsing position within the input string, it only
    verifies that the specified parse expression matches at the current
    position.  C{FollowedBy} always returns a null token list.

    Example::
        # use FollowedBy to match a label only if it is followed by a ':'
        data_word = Word(alphas)
        label = data_word + FollowedBy(':')
        attr_expr = Group(label + Suppress(':') + OneOrMore(data_word, stopOn=label).setParseAction(' '.join))
        
        OneOrMore(attr_expr).parseString("shape: SQUARE color: BLACK posn: upper left").pprint()
    prints::
        [['shape', 'SQUARE'], ['color', 'BLACK'], ['posn', 'upper left']]
    """
    def __init__( self, expr ):
        super(FollowedBy,self).__init__(expr)
        self.mayReturnEmpty = True

    def parseImpl( self, instring, loc, doActions=True ):
        self.expr.tryParse( instring, loc )
        return loc, []


class NotAny(ParseElementEnhance):
    """
    Lookahead to disallow matching with the given parse expression.  C{NotAny}
    does I{not} advance the parsing position within the input string, it only
    verifies that the specified parse expression does I{not} match at the current
    position.  Also, C{NotAny} does I{not} skip over leading whitespace. C{NotAny}
    always returns a null token list.  May be constructed using the '~' operator.

    Example::
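        # a minimal sketch: keep reserved words from being matched as identifiers
        # (CaselessKeyword, Word, alphas and alphanums are the usual pyparsing names in this module)
        AND, OR, NOT = map(CaselessKeyword, "AND OR NOT".split())
        keyword = AND | OR | NOT
        ident = ~keyword + Word(alphas, alphanums + "_")   # '~' constructs a NotAny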
        
    """
    def __init__( self, expr ):
        super(NotAny,self).__init__(expr)
        #~ self.leaveWhitespace()
        self.skipWhitespace = False  # do NOT use self.leaveWhitespace(), don't want to propagate to exprs
        self.mayReturnEmpty = True
        self.errmsg = "Found unwanted token, "+_ustr(self.expr)

    def parseImpl( self, instring, loc, doActions=True ):
        if self.expr.canParseNext(instring, loc):
            raise ParseException(instring, loc, self.errmsg, self)
        return loc, []

    def __str__( self ):
        if hasattr(self,"name"):
            return self.name

        if self.strRepr is None:
            self.strRepr = "~{" + _ustr(self.expr) + "}"

        return self.strRepr

class _MultipleMatch(ParseElementEnhance):
    def __init__( self, expr, stopOn=None):
        super(_MultipleMatch, self).__init__(expr)
        self.saveAsList = True
        ender = stopOn
        if isinstance(ender, basestring):
            ender = ParserElement._literalStringClass(ender)
        self.not_ender = ~ender if ender is not None else None

    def parseImpl( self, instring, loc, doActions=True ):
        self_expr_parse = self.expr._parse
        self_skip_ignorables = self._skipIgnorables
        check_ender = self.not_ender is not None
        if check_ender:
            try_not_ender = self.not_ender.tryParse
        
        # must be at least one (but first see if we are the stopOn sentinel;
        # if so, fail)
        if check_ender:
            try_not_ender(instring, loc)
        loc, tokens = self_expr_parse( instring, loc, doActions, callPreParse=False )
        try:
            hasIgnoreExprs = (not not self.ignoreExprs)
            while 1:
                if check_ender:
                    try_not_ender(instring, loc)
                if hasIgnoreExprs:
                    preloc = self_skip_ignorables( instring, loc )
                else:
                    preloc = loc
                loc, tmptokens = self_expr_parse( instring, preloc, doActions )
                if tmptokens or tmptokens.haskeys():
                    tokens += tmptokens
        except (ParseException,IndexError):
            pass

        return loc, tokens
        
class OneOrMore(_MultipleMatch):
    """
    Repetition of one or more of the given expression.
    
    Parameters:
     - expr - expression that must match one or more times
     - stopOn - (default=C{None}) - expression for a terminating sentinel
          (only required if the sentinel would ordinarily match the repetition 
          expression)          

    Example::
        data_word = Word(alphas)
        label = data_word + FollowedBy(':')
        attr_expr = Group(label + Suppress(':') + OneOrMore(data_word).setParseAction(' '.join))

        text = "shape: SQUARE posn: upper left color: BLACK"
        OneOrMore(attr_expr).parseString(text).pprint()  # Fail! read 'color' as data instead of next label -> [['shape', 'SQUARE color']]

        # use stopOn attribute for OneOrMore to avoid reading label string as part of the data
        attr_expr = Group(label + Suppress(':') + OneOrMore(data_word, stopOn=label).setParseAction(' '.join))
        OneOrMore(attr_expr).parseString(text).pprint() # Better -> [['shape', 'SQUARE'], ['posn', 'upper left'], ['color', 'BLACK']]
        
        # could also be written as
        (attr_expr * (1,)).parseString(text).pprint()
    """

    def __str__( self ):
        if hasattr(self,"name"):
            return self.name

        if self.strRepr is None:
            self.strRepr = "{" + _ustr(self.expr) + "}..."

        return self.strRepr

class ZeroOrMore(_MultipleMatch):
    """
    Optional repetition of zero or more of the given expression.
    
    Parameters:
     - expr - expression that must match zero or more times
     - stopOn - (default=C{None}) - expression for a terminating sentinel
          (only required if the sentinel would ordinarily match the repetition 
          expression)          

    Example: similar to L{OneOrMore}
    """
    def __init__( self, expr, stopOn=None):
        super(ZeroOrMore,self).__init__(expr, stopOn=stopOn)
        self.mayReturnEmpty = True
        
    def parseImpl( self, instring, loc, doActions=True ):
        try:
            return super(ZeroOrMore, self).parseImpl(instring, loc, doActions)
        except (ParseException,IndexError):
            return loc, []

    def __str__( self ):
        if hasattr(self,"name"):
            return self.name

        if self.strRepr is None:
            self.strRepr = "[" + _ustr(self.expr) + "]..."

        return self.strRepr

class _NullToken(object):
    def __bool__(self):
        return False
    __nonzero__ = __bool__
    def __str__(self):
        return ""

_optionalNotMatched = _NullToken()
class Optional(ParseElementEnhance):
    """
    Optional matching of the given expression.

    Parameters:
     - expr - expression that must match zero or more times
     - default (optional) - value to be returned if the optional expression is not found.

    Example::
        # US postal code can be a 5-digit zip, plus optional 4-digit qualifier
        zip = Combine(Word(nums, exact=5) + Optional('-' + Word(nums, exact=4)))
        zip.runTests('''
            # traditional ZIP code
            12345
            
            # ZIP+4 form
            12101-0001
            
            # invalid ZIP
            98765-
            ''')
    prints::
        # traditional ZIP code
        12345
        ['12345']

        # ZIP+4 form
        12101-0001
        ['12101-0001']

        # invalid ZIP
        98765-
             ^
        FAIL: Expected end of text (at char 5), (line:1, col:6)
    """
    def __init__( self, expr, default=_optionalNotMatched ):
        super(Optional,self).__init__( expr, savelist=False )
        self.saveAsList = self.expr.saveAsList
        self.defaultValue = default
        self.mayReturnEmpty = True

    def parseImpl( self, instring, loc, doActions=True ):
        try:
            loc, tokens = self.expr._parse( instring, loc, doActions, callPreParse=False )
        except (ParseException,IndexError):
            if self.defaultValue is not _optionalNotMatched:
                if self.expr.resultsName:
                    tokens = ParseResults([ self.defaultValue ])
                    tokens[self.expr.resultsName] = self.defaultValue
                else:
                    tokens = [ self.defaultValue ]
            else:
                tokens = []
        return loc, tokens

    def __str__( self ):
        if hasattr(self,"name"):
            return self.name

        if self.strRepr is None:
            self.strRepr = "[" + _ustr(self.expr) + "]"

        return self.strRepr

class SkipTo(ParseElementEnhance):
    """
    Token for skipping over all undefined text until the matched expression is found.

    Parameters:
     - expr - target expression marking the end of the data to be skipped
     - include - (default=C{False}) if True, the target expression is also parsed 
          (the skipped text and target expression are returned as a 2-element list).
     - ignore - (default=C{None}) used to define grammars (typically quoted strings and 
          comments) that might contain false matches to the target expression
     - failOn - (default=C{None}) define expressions that are not allowed to be 
          included in the skipped text; if found before the target expression is found, 
          the SkipTo is not a match

    Example::
        report = '''
            Outstanding Issues Report - 1 Jan 2000

               # | Severity | Description                               |  Days Open
            -----+----------+-------------------------------------------+-----------
             101 | Critical | Intermittent system crash                 |          6
              94 | Cosmetic | Spelling error on Login ('log|n')         |         14
              79 | Minor    | System slow when running too many reports |         47
            '''
        integer = Word(nums)
        SEP = Suppress('|')
        # use SkipTo to simply match everything up until the next SEP
        # - ignore quoted strings, so that a '|' character inside a quoted string does not match
        # - parse action will call token.strip() for each matched token, i.e., the description body
        string_data = SkipTo(SEP, ignore=quotedString)
        string_data.setParseAction(tokenMap(str.strip))
        ticket_expr = (integer("issue_num") + SEP 
                      + string_data("sev") + SEP 
                      + string_data("desc") + SEP 
                      + integer("days_open"))
        
        for tkt in ticket_expr.searchString(report):
            print(tkt.dump())
    prints::
        ['101', 'Critical', 'Intermittent system crash', '6']
        - days_open: 6
        - desc: Intermittent system crash
        - issue_num: 101
        - sev: Critical
        ['94', 'Cosmetic', "Spelling error on Login ('log|n')", '14']
        - days_open: 14
        - desc: Spelling error on Login ('log|n')
        - issue_num: 94
        - sev: Cosmetic
        ['79', 'Minor', 'System slow when running too many reports', '47']
        - days_open: 47
        - desc: System slow when running too many reports
        - issue_num: 79
        - sev: Minor
    """
    def __init__( self, other, include=False, ignore=None, failOn=None ):
        super( SkipTo, self ).__init__( other )
        self.ignoreExpr = ignore
        self.mayReturnEmpty = True
        self.mayIndexError = False
        self.includeMatch = include
        self.asList = False
        if isinstance(failOn, basestring):
            self.failOn = ParserElement._literalStringClass(failOn)
        else:
            self.failOn = failOn
        self.errmsg = "No match found for "+_ustr(self.expr)

    def parseImpl( self, instring, loc, doActions=True ):
        startloc = loc
        instrlen = len(instring)
        expr = self.expr
        expr_parse = self.expr._parse
        self_failOn_canParseNext = self.failOn.canParseNext if self.failOn is not None else None
        self_ignoreExpr_tryParse = self.ignoreExpr.tryParse if self.ignoreExpr is not None else None
        
        tmploc = loc
        while tmploc <= instrlen:
            if self_failOn_canParseNext is not None:
                # break if failOn expression matches
                if self_failOn_canParseNext(instring, tmploc):
                    break
                    
            if self_ignoreExpr_tryParse is not None:
                # advance past ignore expressions
                while 1:
                    try:
                        tmploc = self_ignoreExpr_tryParse(instring, tmploc)
                    except ParseBaseException:
                        break
            
            try:
                expr_parse(instring, tmploc, doActions=False, callPreParse=False)
            except (ParseException, IndexError):
                # no match, advance loc in string
                tmploc += 1
            else:
                # matched skipto expr, done
                break

        else:
            # ran off the end of the input string without matching skipto expr, fail
            raise ParseException(instring, loc, self.errmsg, self)

        # build up return values
        loc = tmploc
        skiptext = instring[startloc:loc]
        skipresult = ParseResults(skiptext)
        
        if self.includeMatch:
            loc, mat = expr_parse(instring,loc,doActions,callPreParse=False)
            skipresult += mat

        return loc, skipresult

class Forward(ParseElementEnhance):
    """
    Forward declaration of an expression to be defined later -
    used for recursive grammars, such as algebraic infix notation.
    When the expression is known, it is assigned to the C{Forward} variable using the '<<' operator.

    Note: take care when assigning to C{Forward} not to overlook precedence of operators.
    Specifically, '|' has a lower precedence than '<<', so that::
        fwdExpr << a | b | c
    will actually be evaluated as::
        (fwdExpr << a) | b | c
    thereby leaving b and c out as parseable alternatives.  It is recommended that you
    explicitly group the values inserted into the C{Forward}::
        fwdExpr << (a | b | c)
    Converting to use the '<<=' operator instead will avoid this problem.

    See L{ParseResults.pprint} for an example of a recursive parser created using
    C{Forward}.
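
    Example::
        # a minimal sketch of a recursive grammar: nested, parenthesized lists of integers
        # (Suppress, Group, ZeroOrMore, Word and nums are defined elsewhere in this module)
        LPAR, RPAR = map(Suppress, "()")
        item = Forward()
        nested = Group(LPAR + ZeroOrMore(item) + RPAR)
        item <<= Word(nums) | nested
        print(item.parseString("(1 (2 3) 4)"))   # -> [['1', ['2', '3'], '4']]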
    """
    def __init__( self, other=None ):
        super(Forward,self).__init__( other, savelist=False )

    def __lshift__( self, other ):
        if isinstance( other, basestring ):
            other = ParserElement._literalStringClass(other)
        self.expr = other
        self.strRepr = None
        self.mayIndexError = self.expr.mayIndexError
        self.mayReturnEmpty = self.expr.mayReturnEmpty
        self.setWhitespaceChars( self.expr.whiteChars )
        self.skipWhitespace = self.expr.skipWhitespace
        self.saveAsList = self.expr.saveAsList
        self.ignoreExprs.extend(self.expr.ignoreExprs)
        return self
        
    def __ilshift__(self, other):
        return self << other
    
    def leaveWhitespace( self ):
        self.skipWhitespace = False
        return self

    def streamline( self ):
        if not self.streamlined:
            self.streamlined = True
            if self.expr is not None:
                self.expr.streamline()
        return self

    def validate( self, validateTrace=[] ):
        if self not in validateTrace:
            tmp = validateTrace[:]+[self]
            if self.expr is not None:
                self.expr.validate(tmp)
        self.checkRecursion([])

    def __str__( self ):
        if hasattr(self,"name"):
            return self.name
        return self.__class__.__name__ + ": ..."

        # stubbed out for now - creates awful memory and perf issues
        self._revertClass = self.__class__
        self.__class__ = _ForwardNoRecurse
        try:
            if self.expr is not None:
                retString = _ustr(self.expr)
            else:
                retString = "None"
        finally:
            self.__class__ = self._revertClass
        return self.__class__.__name__ + ": " + retString

    def copy(self):
        if self.expr is not None:
            return super(Forward,self).copy()
        else:
            ret = Forward()
            ret <<= self
            return ret

class _ForwardNoRecurse(Forward):
    def __str__( self ):
        return "..."

class TokenConverter(ParseElementEnhance):
    """
    Abstract subclass of C{ParseExpression}, for converting parsed results.
    """
    def __init__( self, expr, savelist=False ):
        super(TokenConverter,self).__init__( expr )#, savelist )
        self.saveAsList = False

class Combine(TokenConverter):
    """
    Converter to concatenate all matching tokens to a single string.
    By default, the matching patterns must also be contiguous in the input string;
    this can be disabled by specifying C{'adjacent=False'} in the constructor.

    Example::
        real = Word(nums) + '.' + Word(nums)
        print(real.parseString('3.1416')) # -> ['3', '.', '1416']
        # will also erroneously match the following
        print(real.parseString('3. 1416')) # -> ['3', '.', '1416']

        real = Combine(Word(nums) + '.' + Word(nums))
        print(real.parseString('3.1416')) # -> ['3.1416']
        # no match when there are internal spaces
        print(real.parseString('3. 1416')) # -> Exception: Expected W:(0123...)
    """
    def __init__( self, expr, joinString="", adjacent=True ):
        super(Combine,self).__init__( expr )
        # suppress whitespace-stripping in contained parse expressions, but re-enable it on the Combine itself
        if adjacent:
            self.leaveWhitespace()
        self.adjacent = adjacent
        self.skipWhitespace = True
        self.joinString = joinString
        self.callPreparse = True

    def ignore( self, other ):
        if self.adjacent:
            ParserElement.ignore(self, other)
        else:
            super( Combine, self).ignore( other )
        return self

    def postParse( self, instring, loc, tokenlist ):
        retToks = tokenlist.copy()
        del retToks[:]
        retToks += ParseResults([ "".join(tokenlist._asStringList(self.joinString)) ], modal=self.modalResults)

        if self.resultsName and retToks.haskeys():
            return [ retToks ]
        else:
            return retToks

class Group(TokenConverter):
    """
    Converter to return the matched tokens as a list - useful for returning tokens of C{L{ZeroOrMore}} and C{L{OneOrMore}} expressions.

    Example::
        ident = Word(alphas)
        num = Word(nums)
        term = ident | num
        func = ident + Optional(delimitedList(term))
        print(func.parseString("fn a,b,100"))  # -> ['fn', 'a', 'b', '100']

        func = ident + Group(Optional(delimitedList(term)))
        print(func.parseString("fn a,b,100"))  # -> ['fn', ['a', 'b', '100']]
    """
    def __init__( self, expr ):
        super(Group,self).__init__( expr )
        self.saveAsList = True

    def postParse( self, instring, loc, tokenlist ):
        return [ tokenlist ]

class Dict(TokenConverter):
    """
    Converter to return a repetitive expression as a list, but also as a dictionary.
    Each element can also be referenced using the first token in the expression as its key.
    Useful for tabular report scraping when the first column can be used as an item key.

    Example::
        data_word = Word(alphas)
        label = data_word + FollowedBy(':')
        attr_expr = Group(label + Suppress(':') + OneOrMore(data_word).setParseAction(' '.join))

        text = "shape: SQUARE posn: upper left color: light blue texture: burlap"
        attr_expr = (label + Suppress(':') + OneOrMore(data_word, stopOn=label).setParseAction(' '.join))
        
        # print attributes as plain groups
        print(OneOrMore(attr_expr).parseString(text).dump())
        
        # instead of OneOrMore(expr), parse using Dict(OneOrMore(Group(expr))) - Dict will auto-assign names
        result = Dict(OneOrMore(Group(attr_expr))).parseString(text)
        print(result.dump())
        
        # access named fields as dict entries, or output as dict
        print(result['shape'])        
        print(result.asDict())
    prints::
        ['shape', 'SQUARE', 'posn', 'upper left', 'color', 'light blue', 'texture', 'burlap']

        [['shape', 'SQUARE'], ['posn', 'upper left'], ['color', 'light blue'], ['texture', 'burlap']]
        - color: light blue
        - posn: upper left
        - shape: SQUARE
        - texture: burlap
        SQUARE
        {'color': 'light blue', 'posn': 'upper left', 'texture': 'burlap', 'shape': 'SQUARE'}
    See more examples at L{ParseResults} of accessing fields by results name.
    """
    def __init__( self, expr ):
        super(Dict,self).__init__( expr )
        self.saveAsList = True

    def postParse( self, instring, loc, tokenlist ):
        for i,tok in enumerate(tokenlist):
            if len(tok) == 0:
                continue
            ikey = tok[0]
            if isinstance(ikey,int):
                ikey = _ustr(tok[0]).strip()
            if len(tok)==1:
                tokenlist[ikey] = _ParseResultsWithOffset("",i)
            elif len(tok)==2 and not isinstance(tok[1],ParseResults):
                tokenlist[ikey] = _ParseResultsWithOffset(tok[1],i)
            else:
                dictvalue = tok.copy() #ParseResults(i)
                del dictvalue[0]
                if len(dictvalue)!= 1 or (isinstance(dictvalue,ParseResults) and dictvalue.haskeys()):
                    tokenlist[ikey] = _ParseResultsWithOffset(dictvalue,i)
                else:
                    tokenlist[ikey] = _ParseResultsWithOffset(dictvalue[0],i)

        if self.resultsName:
            return [ tokenlist ]
        else:
            return tokenlist


class Suppress(TokenConverter):
    """
    Converter for ignoring the results of a parsed expression.

    Example::
        source = "a, b, c,d"
        wd = Word(alphas)
        wd_list1 = wd + ZeroOrMore(',' + wd)
        print(wd_list1.parseString(source))

        # often, delimiters that are useful during parsing are just in the
        # way afterward - use Suppress to keep them out of the parsed output
        wd_list2 = wd + ZeroOrMore(Suppress(',') + wd)
        print(wd_list2.parseString(source))
    prints::
        ['a', ',', 'b', ',', 'c', ',', 'd']
        ['a', 'b', 'c', 'd']
    (See also L{delimitedList}.)
    """
    def postParse( self, instring, loc, tokenlist ):
        return []

    def suppress( self ):
        return self


class OnlyOnce(object):
    """
    Wrapper for parse actions, to ensure they are only called once.
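
    Example::
        # a minimal sketch: guard a side-effect parse action so it fires at most once
        # (Word and alphas are the usual pyparsing names defined elsewhere in this module)
        def announce(s, l, t):
            print("matched %r at char %d" % (t[0], l))
        announce_once = OnlyOnce(announce)
        greeting = Word(alphas).setParseAction(announce_once)
        # a later match raises ParseException; call announce_once.reset() to re-arm the action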
    """
    def __init__(self, methodCall):
        self.callable = _trim_arity(methodCall)
        self.called = False
    def __call__(self,s,l,t):
        if not self.called:
            results = self.callable(s,l,t)
            self.called = True
            return results
        raise ParseException(s,l,"")
    def reset(self):
        self.called = False

def traceParseAction(f):
    """
    Decorator for debugging parse actions. 
    
    When the parse action is called, this decorator will print C{">> entering I{method-name}(line:I{current_source_line}, I{parse_location}, I{matched_tokens})"}.
    When the parse action completes, the decorator will print C{"<<"} followed by the returned value, or any exception that the parse action raised.

    Example::
        wd = Word(alphas)

        @traceParseAction
        def remove_duplicate_chars(tokens):
            return ''.join(sorted(set(''.join(tokens))))

        wds = OneOrMore(wd).setParseAction(remove_duplicate_chars)
        print(wds.parseString("slkdjs sld sldd sdlf sdljf"))
    prints::
        >>entering remove_duplicate_chars(line: 'slkdjs sld sldd sdlf sdljf', 0, (['slkdjs', 'sld', 'sldd', 'sdlf', 'sdljf'], {}))
        <<leaving remove_duplicate_chars (ret: 'dfjkls')
        ['dfjkls']
    """
    f = _trim_arity(f)
    def z(*paArgs):
        thisFunc = f.__name__
        s,l,t = paArgs[-3:]
        if len(paArgs)>3:
            thisFunc = paArgs[0].__class__.__name__ + '.' + thisFunc
        sys.stderr.write( ">>entering %s(line: '%s', %d, %r)\n" % (thisFunc,line(l,s),l,t) )
        try:
            ret = f(*paArgs)
        except Exception as exc:
            sys.stderr.write( "<<leaving %s (exception: %s)\n" % (thisFunc,exc) )
            raise
        sys.stderr.write( "<<leaving %s (ret: %r)\n" % (thisFunc,ret) )
        return ret
    try:
        z.__name__ = f.__name__
    except AttributeError:
        pass
    return z

#
# global helpers
#
def delimitedList( expr, delim=",", combine=False ):
    """
    Helper to define a delimited list of expressions - the delimiter defaults to ','.
    By default, the list elements and delimiters can have intervening whitespace, and
    comments, but this can be overridden by passing C{combine=True} in the constructor.
    If C{combine} is set to C{True}, the matching tokens are returned as a single token
    string, with the delimiters included; otherwise, the matching tokens are returned
    as a list of tokens, with the delimiters suppressed.

    Example::
        delimitedList(Word(alphas)).parseString("aa,bb,cc") # -> ['aa', 'bb', 'cc']
        delimitedList(Word(hexnums), delim=':', combine=True).parseString("AA:BB:CC:DD:EE") # -> ['AA:BB:CC:DD:EE']
    """
    dlName = _ustr(expr)+" ["+_ustr(delim)+" "+_ustr(expr)+"]..."
    if combine:
        return Combine( expr + ZeroOrMore( delim + expr ) ).setName(dlName)
    else:
        return ( expr + ZeroOrMore( Suppress( delim ) + expr ) ).setName(dlName)

def countedArray( expr, intExpr=None ):
    """
    Helper to define a counted list of expressions.
    This helper defines a pattern of the form::
        integer expr expr expr...
    where the leading integer tells how many expr expressions follow.
    The matched tokens are returned as a list of expr tokens - the leading count token is suppressed.
    
    If C{intExpr} is specified, it should be a pyparsing expression that produces an integer value.

    Example::
        countedArray(Word(alphas)).parseString('2 ab cd ef')  # -> ['ab', 'cd']

        # in this parser, the leading integer value is given in binary,
        # '10' indicating that 2 values are in the array
        binaryConstant = Word('01').setParseAction(lambda t: int(t[0], 2))
        countedArray(Word(alphas), intExpr=binaryConstant).parseString('10 ab cd ef')  # -> ['ab', 'cd']
    """
    arrayExpr = Forward()
    def countFieldParseAction(s,l,t):
        n = t[0]
        arrayExpr << (n and Group(And([expr]*n)) or Group(empty))
        return []
    if intExpr is None:
        intExpr = Word(nums).setParseAction(lambda t:int(t[0]))
    else:
        intExpr = intExpr.copy()
    intExpr.setName("arrayLen")
    intExpr.addParseAction(countFieldParseAction, callDuringTry=True)
    return ( intExpr + arrayExpr ).setName('(len) ' + _ustr(expr) + '...')

def _flatten(L):
    ret = []
    for i in L:
        if isinstance(i,list):
            ret.extend(_flatten(i))
        else:
            ret.append(i)
    return ret

def matchPreviousLiteral(expr):
    """
    Helper to define an expression that is indirectly defined from
    the tokens matched in a previous expression, that is, it looks
    for a 'repeat' of a previous expression.  For example::
        first = Word(nums)
        second = matchPreviousLiteral(first)
        matchExpr = first + ":" + second
    will match C{"1:1"}, but not C{"1:2"}.  Because this matches a
    previous literal, will also match the leading C{"1:1"} in C{"1:10"}.
    If this is not desired, use C{matchPreviousExpr}.
    Do I{not} use with packrat parsing enabled.
    """
    rep = Forward()
    def copyTokenToRepeater(s,l,t):
        if t:
            if len(t) == 1:
                rep << t[0]
            else:
                # flatten t tokens
                tflat = _flatten(t.asList())
                rep << And(Literal(tt) for tt in tflat)
        else:
            rep << Empty()
    expr.addParseAction(copyTokenToRepeater, callDuringTry=True)
    rep.setName('(prev) ' + _ustr(expr))
    return rep

def matchPreviousExpr(expr):
    """
    Helper to define an expression that is indirectly defined from
    the tokens matched in a previous expression, that is, it looks
    for a 'repeat' of a previous expression.  For example::
        first = Word(nums)
        second = matchPreviousExpr(first)
        matchExpr = first + ":" + second
    will match C{"1:1"}, but not C{"1:2"}.  Because this matches by
    expressions, will I{not} match the leading C{"1:1"} in C{"1:10"};
    the expressions are evaluated first, and then compared, so
    C{"1"} is compared with C{"10"}.
    Do I{not} use with packrat parsing enabled.
    """
    rep = Forward()
    e2 = expr.copy()
    rep <<= e2
    def copyTokenToRepeater(s,l,t):
        matchTokens = _flatten(t.asList())
        def mustMatchTheseTokens(s,l,t):
            theseTokens = _flatten(t.asList())
            if  theseTokens != matchTokens:
                raise ParseException("",0,"")
        rep.setParseAction( mustMatchTheseTokens, callDuringTry=True )
    expr.addParseAction(copyTokenToRepeater, callDuringTry=True)
    rep.setName('(prev) ' + _ustr(expr))
    return rep

def _escapeRegexRangeChars(s):
    #~  escape these chars: ^-]
    for c in r"\^-]":
        s = s.replace(c,_bslash+c)
    s = s.replace("\n",r"\n")
    s = s.replace("\t",r"\t")
    return _ustr(s)

def oneOf( strs, caseless=False, useRegex=True ):
    """
    Helper to quickly define a set of alternative Literals, and makes sure to do
    longest-first testing when there is a conflict, regardless of the input order,
    but returns a C{L{MatchFirst}} for best performance.

    Parameters:
     - strs - a string of space-delimited literals, or a collection of string literals
     - caseless - (default=C{False}) - treat all literals as caseless
     - useRegex - (default=C{True}) - as an optimization, will generate a Regex
          object; otherwise, will generate a C{MatchFirst} object (if C{caseless=True}, or
          if creating a C{Regex} raises an exception)

    Example::
        comp_oper = oneOf("< = > <= >= !=")
        var = Word(alphas)
        number = Word(nums)
        term = var | number
        comparison_expr = term + comp_oper + term
        print(comparison_expr.searchString("B = 12  AA=23 B<=AA AA>12"))
    prints::
        [['B', '=', '12'], ['AA', '=', '23'], ['B', '<=', 'AA'], ['AA', '>', '12']]
    """
    if caseless:
        isequal = ( lambda a,b: a.upper() == b.upper() )
        masks = ( lambda a,b: b.upper().startswith(a.upper()) )
        parseElementClass = CaselessLiteral
    else:
        isequal = ( lambda a,b: a == b )
        masks = ( lambda a,b: b.startswith(a) )
        parseElementClass = Literal

    symbols = []
    if isinstance(strs,basestring):
        symbols = strs.split()
    elif isinstance(strs, collections.Iterable):
        symbols = list(strs)
    else:
        warnings.warn("Invalid argument to oneOf, expected string or iterable",
                SyntaxWarning, stacklevel=2)
    if not symbols:
        return NoMatch()

    i = 0
    while i < len(symbols)-1:
        cur = symbols[i]
        for j,other in enumerate(symbols[i+1:]):
            if ( isequal(other, cur) ):
                del symbols[i+j+1]
                break
            elif ( masks(cur, other) ):
                del symbols[i+j+1]
                symbols.insert(i,other)
                cur = other
                break
        else:
            i += 1

    if not caseless and useRegex:
        #~ print (strs,"->", "|".join( [ _escapeRegexChars(sym) for sym in symbols] ))
        try:
            if len(symbols)==len("".join(symbols)):
                return Regex( "[%s]" % "".join(_escapeRegexRangeChars(sym) for sym in symbols) ).setName(' | '.join(symbols))
            else:
                return Regex( "|".join(re.escape(sym) for sym in symbols) ).setName(' | '.join(symbols))
        except Exception:
            warnings.warn("Exception creating Regex for oneOf, building MatchFirst",
                    SyntaxWarning, stacklevel=2)


    # last resort, just use MatchFirst
    return MatchFirst(parseElementClass(sym) for sym in symbols).setName(' | '.join(symbols))

def dictOf( key, value ):
    """
    Helper to easily and clearly define a dictionary by specifying the respective patterns
    for the key and value.  Takes care of defining the C{L{Dict}}, C{L{ZeroOrMore}}, and C{L{Group}} tokens
    in the proper order.  The key pattern can include delimiting markers or punctuation,
    as long as they are suppressed, thereby leaving the significant key text.  The value
    pattern can include named results, so that the C{Dict} results can include named token
    fields.

    Example::
        text = "shape: SQUARE posn: upper left color: light blue texture: burlap"
        attr_expr = (label + Suppress(':') + OneOrMore(data_word, stopOn=label).setParseAction(' '.join))
        print(OneOrMore(attr_expr).parseString(text).dump())
        
        attr_label = label
        attr_value = Suppress(':') + OneOrMore(data_word, stopOn=label).setParseAction(' '.join)

        # similar to Dict, but simpler call format
        result = dictOf(attr_label, attr_value).parseString(text)
        print(result.dump())
        print(result['shape'])
        print(result.shape)  # object attribute access works too
        print(result.asDict())
    prints::
        [['shape', 'SQUARE'], ['posn', 'upper left'], ['color', 'light blue'], ['texture', 'burlap']]
        - color: light blue
        - posn: upper left
        - shape: SQUARE
        - texture: burlap
        SQUARE
        SQUARE
        {'color': 'light blue', 'shape': 'SQUARE', 'posn': 'upper left', 'texture': 'burlap'}
    """
    return Dict( ZeroOrMore( Group ( key + value ) ) )

def originalTextFor(expr, asString=True):
    """
    Helper to return the original, untokenized text for a given expression.  Useful to
    restore the parsed fields of an HTML start tag into the raw tag text itself, or to
    revert separate tokens with intervening whitespace back to the original matching
    input text. By default, returns a string containing the original parsed text.  
       
    If the optional C{asString} argument is passed as C{False}, then the return value is a 
    C{L{ParseResults}} containing any results names that were originally matched, and a 
    single token containing the original matched text from the input string.  So if 
    the expression passed to C{L{originalTextFor}} contains expressions with defined
    results names, you must set C{asString} to C{False} if you want to preserve those
    results name values.

    Example::
        src = "this is test <b> bold <i>text</i> </b> normal text "
        for tag in ("b","i"):
            opener,closer = makeHTMLTags(tag)
            patt = originalTextFor(opener + SkipTo(closer) + closer)
            print(patt.searchString(src)[0])
    prints::
        ['<b> bold <i>text</i> </b>']
        ['<i>text</i>']
    """
    locMarker = Empty().setParseAction(lambda s,loc,t: loc)
    endlocMarker = locMarker.copy()
    endlocMarker.callPreparse = False
    matchExpr = locMarker("_original_start") + expr + endlocMarker("_original_end")
    if asString:
        extractText = lambda s,l,t: s[t._original_start:t._original_end]
    else:
        def extractText(s,l,t):
            t[:] = [s[t.pop('_original_start'):t.pop('_original_end')]]
    matchExpr.setParseAction(extractText)
    matchExpr.ignoreExprs = expr.ignoreExprs
    return matchExpr

def ungroup(expr): 
    """
    Helper to undo pyparsing's default grouping of And expressions, even
    if all but one are non-empty.
    """
    return TokenConverter(expr).setParseAction(lambda t:t[0])
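# Example (illustrative usage sketch) - ungroup can flatten a single Group'ed
# result back to its inner tokens::
#
#   grouped = Group(Word(nums))
#   print(grouped.parseString("42"))           # -> [['42']]
#   print(ungroup(grouped).parseString("42"))  # -> ['42']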

def locatedExpr(expr):
    """
    Helper to decorate a returned token with its starting and ending locations in the input string.
    This helper adds the following results names:
     - locn_start = location where matched expression begins
     - locn_end = location where matched expression ends
     - value = the actual parsed results

    Be careful if the input text contains C{<TAB>} characters, you may want to call
    C{L{ParserElement.parseWithTabs}}

    Example::
        wd = Word(alphas)
        for match in locatedExpr(wd).searchString("ljsdf123lksdjjf123lkkjj1222"):
            print(match)
    prints::
        [[0, 'ljsdf', 5]]
        [[8, 'lksdjjf', 15]]
        [[18, 'lkkjj', 23]]
    """
    locator = Empty().setParseAction(lambda s,l,t: l)
    return Group(locator("locn_start") + expr("value") + locator.copy().leaveWhitespace()("locn_end"))


# convenience constants for positional expressions
empty       = Empty().setName("empty")
lineStart   = LineStart().setName("lineStart")
lineEnd     = LineEnd().setName("lineEnd")
stringStart = StringStart().setName("stringStart")
stringEnd   = StringEnd().setName("stringEnd")

_escapedPunc = Word( _bslash, r"\[]-*.$+^?()~ ", exact=2 ).setParseAction(lambda s,l,t:t[0][1])
_escapedHexChar = Regex(r"\\0?[xX][0-9a-fA-F]+").setParseAction(lambda s,l,t:unichr(int(t[0].lstrip(r'\0x'),16)))
_escapedOctChar = Regex(r"\\0[0-7]+").setParseAction(lambda s,l,t:unichr(int(t[0][1:],8)))
_singleChar = _escapedPunc | _escapedHexChar | _escapedOctChar | Word(printables, excludeChars=r'\]', exact=1) | Regex(r"\w", re.UNICODE)
_charRange = Group(_singleChar + Suppress("-") + _singleChar)
_reBracketExpr = Literal("[") + Optional("^").setResultsName("negate") + Group( OneOrMore( _charRange | _singleChar ) ).setResultsName("body") + "]"

def srange(s):
    r"""
    Helper to easily define string ranges for use in Word construction.  Borrows
    syntax from regexp '[]' string range definitions::
        srange("[0-9]")   -> "0123456789"
        srange("[a-z]")   -> "abcdefghijklmnopqrstuvwxyz"
        srange("[a-z$_]") -> "abcdefghijklmnopqrstuvwxyz$_"
    The input string must be enclosed in []'s, and the returned string is the expanded
    character set joined into a single string.
    The values enclosed in the []'s may be:
     - a single character
     - an escaped character with a leading backslash (such as C{\-} or C{\]})
     - an escaped hex character with a leading C{'\x'} (C{\x21}, which is a C{'!'} character) 
         (C{\0x##} is also supported for backwards compatibility) 
     - an escaped octal character with a leading C{'\0'} (C{\041}, which is a C{'!'} character)
     - a range of any of the above, separated by a dash (C{'a-z'}, etc.)
     - any combination of the above (C{'aeiouy'}, C{'a-zA-Z0-9_$'}, etc.)
    """
    _expanded = lambda p: p if not isinstance(p,ParseResults) else ''.join(unichr(c) for c in range(ord(p[0]),ord(p[1])+1))
    try:
        return "".join(_expanded(part) for part in _reBracketExpr.parseString(s).body)
    except Exception:
        return ""

def matchOnlyAtCol(n):
    """
    Helper method for defining parse actions that require matching at a specific
    column in the input text.
    """
    def verifyCol(strg,locn,toks):
        if col(locn,strg) != n:
            raise ParseException(strg,locn,"matched token not at column %d" % n)
    return verifyCol
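# Example (illustrative usage sketch) - accept a word only when it starts in
# column 1 of its line::
#
#   col1_word = Word(alphas).addParseAction(matchOnlyAtCol(1))
#   print(col1_word.searchString("abc def\nghi"))  # -> [['abc'], ['ghi']]  ('def' is skipped)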

def replaceWith(replStr):
    """
    Helper method for common parse actions that simply return a literal value.  Especially
    useful when used with C{L{transformString<ParserElement.transformString>}()}.

    Example::
        num = Word(nums).setParseAction(lambda toks: int(toks[0]))
        na = oneOf("N/A NA").setParseAction(replaceWith(math.nan))
        term = na | num
        
        OneOrMore(term).parseString("324 234 N/A 234") # -> [324, 234, nan, 234]
    """
    return lambda s,l,t: [replStr]

def removeQuotes(s,l,t):
    """
    Helper parse action for removing quotation marks from parsed quoted strings.

    Example::
        # by default, quotation marks are included in parsed results
        quotedString.parseString("'Now is the Winter of our Discontent'") # -> ["'Now is the Winter of our Discontent'"]

        # use removeQuotes to strip quotation marks from parsed results
        quotedString.setParseAction(removeQuotes)
        quotedString.parseString("'Now is the Winter of our Discontent'") # -> ["Now is the Winter of our Discontent"]
    """
    return t[0][1:-1]

def tokenMap(func, *args):
    """
    Helper to define a parse action by mapping a function to all elements of a ParseResults list. If any additional 
    args are passed, they are forwarded to the given function as additional arguments after
    the token, as in C{hex_integer = Word(hexnums).setParseAction(tokenMap(int, 16))}, which will convert the
    parsed data to an integer using base 16.

    Example (compare the last example to the one in L{ParserElement.transformString})::
        hex_ints = OneOrMore(Word(hexnums)).setParseAction(tokenMap(int, 16))
        hex_ints.runTests('''
            00 11 22 aa FF 0a 0d 1a
            ''')
        
        upperword = Word(alphas).setParseAction(tokenMap(str.upper))
        OneOrMore(upperword).runTests('''
            my kingdom for a horse
            ''')

        wd = Word(alphas).setParseAction(tokenMap(str.title))
        OneOrMore(wd).setParseAction(' '.join).runTests('''
            now is the winter of our discontent made glorious summer by this sun of york
            ''')
    prints::
        00 11 22 aa FF 0a 0d 1a
        [0, 17, 34, 170, 255, 10, 13, 26]

        my kingdom for a horse
        ['MY', 'KINGDOM', 'FOR', 'A', 'HORSE']

        now is the winter of our discontent made glorious summer by this sun of york
        ['Now Is The Winter Of Our Discontent Made Glorious Summer By This Sun Of York']
    """
    def pa(s,l,t):
        return [func(tokn, *args) for tokn in t]

    try:
        func_name = getattr(func, '__name__', 
                            getattr(func, '__class__').__name__)
    except Exception:
        func_name = str(func)
    pa.__name__ = func_name

    return pa

upcaseTokens = tokenMap(lambda t: _ustr(t).upper())
"""(Deprecated) Helper parse action to convert tokens to upper case. Deprecated in favor of L{pyparsing_common.upcaseTokens}"""

downcaseTokens = tokenMap(lambda t: _ustr(t).lower())
"""(Deprecated) Helper parse action to convert tokens to lower case. Deprecated in favor of L{pyparsing_common.downcaseTokens}"""
    
def _makeTags(tagStr, xml):
    """Internal helper to construct opening and closing tag expressions, given a tag name"""
    if isinstance(tagStr,basestring):
        resname = tagStr
        tagStr = Keyword(tagStr, caseless=not xml)
    else:
        resname = tagStr.name

    tagAttrName = Word(alphas,alphanums+"_-:")
    if (xml):
        tagAttrValue = dblQuotedString.copy().setParseAction( removeQuotes )
        openTag = Suppress("<") + tagStr("tag") + \
                Dict(ZeroOrMore(Group( tagAttrName + Suppress("=") + tagAttrValue ))) + \
                Optional("/",default=[False]).setResultsName("empty").setParseAction(lambda s,l,t:t[0]=='/') + Suppress(">")
    else:
        printablesLessRAbrack = "".join(c for c in printables if c not in ">")
        tagAttrValue = quotedString.copy().setParseAction( removeQuotes ) | Word(printablesLessRAbrack)
        openTag = Suppress("<") + tagStr("tag") + \
                Dict(ZeroOrMore(Group( tagAttrName.setParseAction(downcaseTokens) + \
                Optional( Suppress("=") + tagAttrValue ) ))) + \
                Optional("/",default=[False]).setResultsName("empty").setParseAction(lambda s,l,t:t[0]=='/') + Suppress(">")
    closeTag = Combine(_L("</") + tagStr + ">")

    openTag = openTag.setResultsName("start"+"".join(resname.replace(":"," ").title().split())).setName("<%s>" % resname)
    closeTag = closeTag.setResultsName("end"+"".join(resname.replace(":"," ").title().split())).setName("</%s>" % resname)
    openTag.tag = resname
    closeTag.tag = resname
    return openTag, closeTag

def makeHTMLTags(tagStr):
    """
    Helper to construct opening and closing tag expressions for HTML, given a tag name. Matches
    tags in either upper or lower case, attributes with namespaces and with quoted or unquoted values.

    Example::
        text = '<td>More info at the <a href="http://pyparsing.wikispaces.com">pyparsing</a> wiki page</td>'
        # makeHTMLTags returns pyparsing expressions for the opening and closing tags as a 2-tuple
        a,a_end = makeHTMLTags("A")
        link_expr = a + SkipTo(a_end)("link_text") + a_end
        
        for link in link_expr.searchString(text):
            # attributes in the <A> tag (like "href" shown here) are also accessible as named results
            print(link.link_text, '->', link.href)
    prints::
        pyparsing -> http://pyparsing.wikispaces.com
    """
    return _makeTags( tagStr, False )

def makeXMLTags(tagStr):
    """
    Helper to construct opening and closing tag expressions for XML, given a tag name. Matches
    tags only in the given upper/lower case.

    Example: similar to L{makeHTMLTags}
    """
    return _makeTags( tagStr, True )
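# Example (illustrative usage sketch) - same pattern as the L{makeHTMLTags}
# example, but with case-sensitive XML tags::
#
#   body, body_end = makeXMLTags("body")
#   text_expr = body + SkipTo(body_end)("text") + body_end
#   print(text_expr.searchString('<body lang="en">hello</body>')[0].text)  # -> 'hello'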

def withAttribute(*args,**attrDict):
    """
    Helper to create a validating parse action to be used with start tags created
    with C{L{makeXMLTags}} or C{L{makeHTMLTags}}. Use C{withAttribute} to qualify a starting tag
    with a required attribute value, to avoid false matches on common tags such as
    C{<TD>} or C{<DIV>}.

    Call C{withAttribute} with a series of attribute names and values. Specify the list
    of filter attributes names and values as:
     - keyword arguments, as in C{(align="right")}, or
     - as an explicit dict with C{**} operator, when an attribute name is also a Python
          reserved word, as in C{**{"class":"Customer", "align":"right"}}
     - a list of name-value tuples, as in ( ("ns1:class", "Customer"), ("ns2:align","right") )
    For attribute names with a namespace prefix, you must use the second form.  Attribute
    names are matched insensitive to upper/lower case.
       
    If just testing for C{class} (with or without a namespace), use C{L{withClass}}.

    To verify that the attribute exists, but without specifying a value, pass
    C{withAttribute.ANY_VALUE} as the value.

    Example::
        html = '''
            <div>
            Some text
            <div type="grid">1 4 0 1 0</div>
            <div type="graph">1,3 2,3 1,1</div>
            <div>this has no type</div>
            </div>
                
        '''
        div,div_end = makeHTMLTags("div")

        # only match div tag having a type attribute with value "grid"
        div_grid = div().setParseAction(withAttribute(type="grid"))
        grid_expr = div_grid + SkipTo(div | div_end)("body")
        for grid_header in grid_expr.searchString(html):
            print(grid_header.body)
        
        # construct a match with any div tag having a type attribute, regardless of the value
        div_any_type = div().setParseAction(withAttribute(type=withAttribute.ANY_VALUE))
        div_expr = div_any_type + SkipTo(div | div_end)("body")
        for div_header in div_expr.searchString(html):
            print(div_header.body)
    prints::
        1 4 0 1 0

        1 4 0 1 0
        1,3 2,3 1,1
    """
    if args:
        attrs = args[:]
    else:
        attrs = attrDict.items()
    attrs = [(k,v) for k,v in attrs]
    def pa(s,l,tokens):
        for attrName,attrValue in attrs:
            if attrName not in tokens:
                raise ParseException(s,l,"no matching attribute " + attrName)
            if attrValue != withAttribute.ANY_VALUE and tokens[attrName] != attrValue:
                raise ParseException(s,l,"attribute '%s' has value '%s', must be '%s'" %
                                            (attrName, tokens[attrName], attrValue))
    return pa
withAttribute.ANY_VALUE = object()

def withClass(classname, namespace=''):
    """
    Simplified version of C{L{withAttribute}} when matching on a div class - made
    difficult because C{class} is a reserved word in Python.

    Example::
        html = '''
            <div>
            Some text
            <div class="grid">1 4 0 1 0</div>
            <div class="graph">1,3 2,3 1,1</div>
            <div>this &lt;div&gt; has no class</div>
            </div>
                
        '''
        div,div_end = makeHTMLTags("div")
        div_grid = div().setParseAction(withClass("grid"))
        
        grid_expr = div_grid + SkipTo(div | div_end)("body")
        for grid_header in grid_expr.searchString(html):
            print(grid_header.body)
        
        div_any_type = div().setParseAction(withClass(withAttribute.ANY_VALUE))
        div_expr = div_any_type + SkipTo(div | div_end)("body")
        for div_header in div_expr.searchString(html):
            print(div_header.body)
    prints::
        1 4 0 1 0

        1 4 0 1 0
        1,3 2,3 1,1
    """
    classattr = "%s:class" % namespace if namespace else "class"
    return withAttribute(**{classattr : classname})        

opAssoc = _Constants()
opAssoc.LEFT = object()
opAssoc.RIGHT = object()

def infixNotation( baseExpr, opList, lpar=Suppress('('), rpar=Suppress(')') ):
    """
    Helper method for constructing grammars of expressions made up of
    operators working in a precedence hierarchy.  Operators may be unary or
    binary, left- or right-associative.  Parse actions can also be attached
    to operator expressions. The generated parser will also recognize the use 
    of parentheses to override operator precedences (see example below).
    
    Note: if you define a deep operator list, you may see performance issues
    when using infixNotation. See L{ParserElement.enablePackrat} for a
    mechanism to potentially improve your parser performance.

    Parameters:
     - baseExpr - expression representing the most basic element of the nested expression grammar
     - opList - list of tuples, one for each operator precedence level in the
      expression grammar; each tuple is of the form
      (opExpr, numTerms, rightLeftAssoc, parseAction), where:
       - opExpr is the pyparsing expression for the operator;
          may also be a string, which will be converted to a Literal;
          if numTerms is 3, opExpr is a tuple of two expressions, for the
          two operators separating the 3 terms
       - numTerms is the number of terms for this operator (must
          be 1, 2, or 3)
       - rightLeftAssoc is the indicator whether the operator is
          right or left associative, using the pyparsing-defined
          constants C{opAssoc.RIGHT} and C{opAssoc.LEFT}.
       - parseAction is the parse action to be associated with
          expressions matching this operator expression (the
          parse action tuple member may be omitted)
     - lpar - expression for matching left-parentheses (default=C{Suppress('(')})
     - rpar - expression for matching right-parentheses (default=C{Suppress(')')})

    Example::
        # simple example of four-function arithmetic with ints and variable names
        integer = pyparsing_common.signed_integer
        varname = pyparsing_common.identifier 
        
        arith_expr = infixNotation(integer | varname,
            [
            ('-', 1, opAssoc.RIGHT),
            (oneOf('* /'), 2, opAssoc.LEFT),
            (oneOf('+ -'), 2, opAssoc.LEFT),
            ])
        
        arith_expr.runTests('''
            5+3*6
            (5+3)*6
            -2--11
            ''', fullDump=False)
    prints::
        5+3*6
        [[5, '+', [3, '*', 6]]]

        (5+3)*6
        [[[5, '+', 3], '*', 6]]

        -2--11
        [[['-', 2], '-', ['-', 11]]]
    """
    ret = Forward()
    lastExpr = baseExpr | ( lpar + ret + rpar )
    for i,operDef in enumerate(opList):
        opExpr,arity,rightLeftAssoc,pa = (operDef + (None,))[:4]
        termName = "%s term" % opExpr if arity < 3 else "%s%s term" % opExpr
        if arity == 3:
            if opExpr is None or len(opExpr) != 2:
                raise ValueError("if numterms=3, opExpr must be a tuple or list of two expressions")
            opExpr1, opExpr2 = opExpr
        thisExpr = Forward().setName(termName)
        if rightLeftAssoc == opAssoc.LEFT:
            if arity == 1:
                matchExpr = FollowedBy(lastExpr + opExpr) + Group( lastExpr + OneOrMore( opExpr ) )
            elif arity == 2:
                if opExpr is not None:
                    matchExpr = FollowedBy(lastExpr + opExpr + lastExpr) + Group( lastExpr + OneOrMore( opExpr + lastExpr ) )
                else:
                    matchExpr = FollowedBy(lastExpr+lastExpr) + Group( lastExpr + OneOrMore(lastExpr) )
            elif arity == 3:
                matchExpr = FollowedBy(lastExpr + opExpr1 + lastExpr + opExpr2 + lastExpr) + \
                            Group( lastExpr + opExpr1 + lastExpr + opExpr2 + lastExpr )
            else:
                raise ValueError("operator must be unary (1), binary (2), or ternary (3)")
        elif rightLeftAssoc == opAssoc.RIGHT:
            if arity == 1:
                # try to avoid LR with this extra test
                if not isinstance(opExpr, Optional):
                    opExpr = Optional(opExpr)
                matchExpr = FollowedBy(opExpr.expr + thisExpr) + Group( opExpr + thisExpr )
            elif arity == 2:
                if opExpr is not None:
                    matchExpr = FollowedBy(lastExpr + opExpr + thisExpr) + Group( lastExpr + OneOrMore( opExpr + thisExpr ) )
                else:
                    matchExpr = FollowedBy(lastExpr + thisExpr) + Group( lastExpr + OneOrMore( thisExpr ) )
            elif arity == 3:
                matchExpr = FollowedBy(lastExpr + opExpr1 + thisExpr + opExpr2 + thisExpr) + \
                            Group( lastExpr + opExpr1 + thisExpr + opExpr2 + thisExpr )
            else:
                raise ValueError("operator must be unary (1), binary (2), or ternary (3)")
        else:
            raise ValueError("operator must indicate right or left associativity")
        if pa:
            matchExpr.setParseAction( pa )
        thisExpr <<= ( matchExpr.setName(termName) | lastExpr )
        lastExpr = thisExpr
    ret <<= lastExpr
    return ret

operatorPrecedence = infixNotation
"""(Deprecated) Former name of C{L{infixNotation}}, will be dropped in a future release."""

dblQuotedString = Combine(Regex(r'"(?:[^"\n\r\\]|(?:"")|(?:\\(?:[^x]|x[0-9a-fA-F]+)))*')+'"').setName("string enclosed in double quotes")
sglQuotedString = Combine(Regex(r"'(?:[^'\n\r\\]|(?:'')|(?:\\(?:[^x]|x[0-9a-fA-F]+)))*")+"'").setName("string enclosed in single quotes")
quotedString = Combine(Regex(r'"(?:[^"\n\r\\]|(?:"")|(?:\\(?:[^x]|x[0-9a-fA-F]+)))*')+'"'|
                       Regex(r"'(?:[^'\n\r\\]|(?:'')|(?:\\(?:[^x]|x[0-9a-fA-F]+)))*")+"'").setName("quotedString using single or double quotes")
unicodeString = Combine(_L('u') + quotedString.copy()).setName("unicode string literal")

def nestedExpr(opener="(", closer=")", content=None, ignoreExpr=quotedString.copy()):
    """
    Helper method for defining nested lists enclosed in opening and closing
    delimiters ("(" and ")" are the default).

    Parameters:
     - opener - opening character for a nested list (default=C{"("}); can also be a pyparsing expression
     - closer - closing character for a nested list (default=C{")"}); can also be a pyparsing expression
     - content - expression for items within the nested lists (default=C{None})
     - ignoreExpr - expression for ignoring opening and closing delimiters (default=C{quotedString})

    If an expression is not provided for the content argument, the nested
    expression will capture all whitespace-delimited content between delimiters
    as a list of separate values.

    Use the C{ignoreExpr} argument to define expressions that may contain
    opening or closing characters that should not be treated as opening
    or closing characters for nesting, such as quotedString or a comment
    expression.  Specify multiple expressions using an C{L{Or}} or C{L{MatchFirst}}.
    The default is L{quotedString}, but if no expressions are to be ignored,
    then pass C{None} for this argument.

    Example::
        data_type = oneOf("void int short long char float double")
        decl_data_type = Combine(data_type + Optional(Word('*')))
        ident = Word(alphas+'_', alphanums+'_')
        number = pyparsing_common.number
        arg = Group(decl_data_type + ident)
        LPAR,RPAR = map(Suppress, "()")

        code_body = nestedExpr('{', '}', ignoreExpr=(quotedString | cStyleComment))

        c_function = (decl_data_type("type") 
                      + ident("name")
                      + LPAR + Optional(delimitedList(arg), [])("args") + RPAR 
                      + code_body("body"))
        c_function.ignore(cStyleComment)
        
        source_code = '''
            int is_odd(int x) { 
                return (x%2); 
            }
                
            int dec_to_hex(char hchar) { 
                if (hchar >= '0' && hchar <= '9') { 
                    return (ord(hchar)-ord('0')); 
                } else { 
                    return (10+ord(hchar)-ord('A'));
                } 
            }
        '''
        for func in c_function.searchString(source_code):
            print("%(name)s (%(type)s) args: %(args)s" % func)

    prints::
        is_odd (int) args: [['int', 'x']]
        dec_to_hex (int) args: [['char', 'hchar']]
    """
    if opener == closer:
        raise ValueError("opening and closing strings cannot be the same")
    if content is None:
        if isinstance(opener,basestring) and isinstance(closer,basestring):
            if len(opener) == 1 and len(closer)==1:
                if ignoreExpr is not None:
                    content = (Combine(OneOrMore(~ignoreExpr +
                                    CharsNotIn(opener+closer+ParserElement.DEFAULT_WHITE_CHARS,exact=1))
                                ).setParseAction(lambda t:t[0].strip()))
                else:
                    content = (empty.copy()+CharsNotIn(opener+closer+ParserElement.DEFAULT_WHITE_CHARS
                                ).setParseAction(lambda t:t[0].strip()))
            else:
                if ignoreExpr is not None:
                    content = (Combine(OneOrMore(~ignoreExpr + 
                                    ~Literal(opener) + ~Literal(closer) +
                                    CharsNotIn(ParserElement.DEFAULT_WHITE_CHARS,exact=1))
                                ).setParseAction(lambda t:t[0].strip()))
                else:
                    content = (Combine(OneOrMore(~Literal(opener) + ~Literal(closer) +
                                    CharsNotIn(ParserElement.DEFAULT_WHITE_CHARS,exact=1))
                                ).setParseAction(lambda t:t[0].strip()))
        else:
            raise ValueError("opening and closing arguments must be strings if no content expression is given")
    ret = Forward()
    if ignoreExpr is not None:
        ret <<= Group( Suppress(opener) + ZeroOrMore( ignoreExpr | ret | content ) + Suppress(closer) )
    else:
        ret <<= Group( Suppress(opener) + ZeroOrMore( ret | content )  + Suppress(closer) )
    ret.setName('nested %s%s expression' % (opener,closer))
    return ret

def indentedBlock(blockStatementExpr, indentStack, indent=True):
    """
    Helper method for defining space-delimited indentation blocks, such as
    those used to define block statements in Python source code.

    Parameters:
     - blockStatementExpr - expression defining syntax of statement that
            is repeated within the indented block
     - indentStack - list created by caller to manage indentation stack
            (multiple statementWithIndentedBlock expressions within a single grammar
            should share a common indentStack)
     - indent - boolean indicating whether block must be indented beyond
            the current level; set to False for block of left-most statements
            (default=C{True})

    A valid block must contain at least one C{blockStatement}.

    Example::
        data = '''
        def A(z):
          A1
          B = 100
          G = A2
          A2
          A3
        B
        def BB(a,b,c):
          BB1
          def BBA():
            bba1
            bba2
            bba3
        C
        D
        def spam(x,y):
             def eggs(z):
                 pass
        '''


        indentStack = [1]
        stmt = Forward()

        identifier = Word(alphas, alphanums)
        funcDecl = ("def" + identifier + Group( "(" + Optional( delimitedList(identifier) ) + ")" ) + ":")
        func_body = indentedBlock(stmt, indentStack)
        funcDef = Group( funcDecl + func_body )

        rvalue = Forward()
        funcCall = Group(identifier + "(" + Optional(delimitedList(rvalue)) + ")")
        rvalue << (funcCall | identifier | Word(nums))
        assignment = Group(identifier + "=" + rvalue)
        stmt << ( funcDef | assignment | identifier )

        module_body = OneOrMore(stmt)

        parseTree = module_body.parseString(data)
        parseTree.pprint()
    prints::
        [['def',
          'A',
          ['(', 'z', ')'],
          ':',
          [['A1'], [['B', '=', '100']], [['G', '=', 'A2']], ['A2'], ['A3']]],
         'B',
         ['def',
          'BB',
          ['(', 'a', 'b', 'c', ')'],
          ':',
          [['BB1'], [['def', 'BBA', ['(', ')'], ':', [['bba1'], ['bba2'], ['bba3']]]]]],
         'C',
         'D',
         ['def',
          'spam',
          ['(', 'x', 'y', ')'],
          ':',
          [[['def', 'eggs', ['(', 'z', ')'], ':', [['pass']]]]]]] 
    """
    def checkPeerIndent(s,l,t):
        if l >= len(s): return
        curCol = col(l,s)
        if curCol != indentStack[-1]:
            if curCol > indentStack[-1]:
                raise ParseFatalException(s,l,"illegal nesting")
            raise ParseException(s,l,"not a peer entry")

    def checkSubIndent(s,l,t):
        curCol = col(l,s)
        if curCol > indentStack[-1]:
            indentStack.append( curCol )
        else:
            raise ParseException(s,l,"not a subentry")

    def checkUnindent(s,l,t):
        if l >= len(s): return
        curCol = col(l,s)
        if not(indentStack and curCol < indentStack[-1] and curCol <= indentStack[-2]):
            raise ParseException(s,l,"not an unindent")
        indentStack.pop()

    NL = OneOrMore(LineEnd().setWhitespaceChars("\t ").suppress())
    INDENT = (Empty() + Empty().setParseAction(checkSubIndent)).setName('INDENT')
    PEER   = Empty().setParseAction(checkPeerIndent).setName('')
    UNDENT = Empty().setParseAction(checkUnindent).setName('UNINDENT')
    if indent:
        smExpr = Group( Optional(NL) +
            #~ FollowedBy(blockStatementExpr) +
            INDENT + (OneOrMore( PEER + Group(blockStatementExpr) + Optional(NL) )) + UNDENT)
    else:
        smExpr = Group( Optional(NL) +
            (OneOrMore( PEER + Group(blockStatementExpr) + Optional(NL) )) )
    blockStatementExpr.ignore(_bslash + LineEnd())
    return smExpr.setName('indented block')

alphas8bit = srange(r"[\0xc0-\0xd6\0xd8-\0xf6\0xf8-\0xff]")
punc8bit = srange(r"[\0xa1-\0xbf\0xd7\0xf7]")

anyOpenTag,anyCloseTag = makeHTMLTags(Word(alphas,alphanums+"_:").setName('any tag'))
_htmlEntityMap = dict(zip("gt lt amp nbsp quot apos".split(),'><& "\''))
commonHTMLEntity = Regex('&(?P<entity>' + '|'.join(_htmlEntityMap.keys()) +");").setName("common HTML entity")
def replaceHTMLEntity(t):
    """Helper parser action to replace common HTML entities with their special characters"""
    return _htmlEntityMap.get(t.entity)
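# Example (illustrative usage sketch) - decode entities while transforming a string::
#
#   deref_entity = commonHTMLEntity.setParseAction(replaceHTMLEntity)
#   print(deref_entity.transformString("x &lt; y &amp;&amp; y &gt; z"))  # -> 'x < y && y > z'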

# it's easy to get these comment structures wrong - they're very common, so may as well make them available
cStyleComment = Combine(Regex(r"/\*(?:[^*]|\*(?!/))*") + '*/').setName("C style comment")
"Comment of the form C{/* ... */}"

htmlComment = Regex(r"<!--[\s\S]*?-->").setName("HTML comment")
"Comment of the form C{<!-- ... -->}"

restOfLine = Regex(r".*").leaveWhitespace().setName("rest of line")
dblSlashComment = Regex(r"//(?:\\\n|[^\n])*").setName("// comment")
"Comment of the form C{// ... (to end of line)}"

cppStyleComment = Combine(Regex(r"/\*(?:[^*]|\*(?!/))*") + '*/'| dblSlashComment).setName("C++ style comment")
"Comment of either form C{L{cStyleComment}} or C{L{dblSlashComment}}"

javaStyleComment = cppStyleComment
"Same as C{L{cppStyleComment}}"

pythonStyleComment = Regex(r"#.*").setName("Python style comment")
"Comment of the form C{# ... (to end of line)}"

_commasepitem = Combine(OneOrMore(Word(printables, excludeChars=',') +
                                  Optional( Word(" \t") +
                                            ~Literal(",") + ~LineEnd() ) ) ).streamline().setName("commaItem")
commaSeparatedList = delimitedList( Optional( quotedString.copy() | _commasepitem, default="") ).setName("commaSeparatedList")
"""(Deprecated) Predefined expression of 1 or more printable words or quoted strings, separated by commas.
   This expression is deprecated in favor of L{pyparsing_common.comma_separated_list}."""

# some other useful expressions - using lower-case class name since we are really using this as a namespace
class pyparsing_common:
    """
    Here are some common low-level expressions that may be useful in jump-starting parser development:
     - numeric forms (L{integers<integer>}, L{reals<real>}, L{scientific notation<sci_real>})
     - common L{programming identifiers<identifier>}
     - network addresses (L{MAC<mac_address>}, L{IPv4<ipv4_address>}, L{IPv6<ipv6_address>})
     - ISO8601 L{dates<iso8601_date>} and L{datetime<iso8601_datetime>}
     - L{UUID<uuid>}
     - L{comma-separated list<comma_separated_list>}
    Parse actions:
     - C{L{convertToInteger}}
     - C{L{convertToFloat}}
     - C{L{convertToDate}}
     - C{L{convertToDatetime}}
     - C{L{stripHTMLTags}}
     - C{L{upcaseTokens}}
     - C{L{downcaseTokens}}

    Example::
        pyparsing_common.number.runTests('''
            # any int or real number, returned as the appropriate type
            100
            -100
            +100
            3.14159
            6.02e23
            1e-12
            ''')

        pyparsing_common.fnumber.runTests('''
            # any int or real number, returned as float
            100
            -100
            +100
            3.14159
            6.02e23
            1e-12
            ''')

        pyparsing_common.hex_integer.runTests('''
            # hex numbers
            100
            FF
            ''')

        pyparsing_common.fraction.runTests('''
            # fractions
            1/2
            -3/4
            ''')

        pyparsing_common.mixed_integer.runTests('''
            # mixed fractions
            1
            1/2
            -3/4
            1-3/4
            ''')

        import uuid
        pyparsing_common.uuid.setParseAction(tokenMap(uuid.UUID))
        pyparsing_common.uuid.runTests('''
            # uuid
            12345678-1234-5678-1234-567812345678
            ''')
    prints::
        # any int or real number, returned as the appropriate type
        100
        [100]

        -100
        [-100]

        +100
        [100]

        3.14159
        [3.14159]

        6.02e23
        [6.02e+23]

        1e-12
        [1e-12]

        # any int or real number, returned as float
        100
        [100.0]

        -100
        [-100.0]

        +100
        [100.0]

        3.14159
        [3.14159]

        6.02e23
        [6.02e+23]

        1e-12
        [1e-12]

        # hex numbers
        100
        [256]

        FF
        [255]

        # fractions
        1/2
        [0.5]

        -3/4
        [-0.75]

        # mixed fractions
        1
        [1]

        1/2
        [0.5]

        -3/4
        [-0.75]

        1-3/4
        [1.75]

        # uuid
        12345678-1234-5678-1234-567812345678
        [UUID('12345678-1234-5678-1234-567812345678')]
    """

    convertToInteger = tokenMap(int)
    """
    Parse action for converting parsed integers to Python int
    """

    convertToFloat = tokenMap(float)
    """
    Parse action for converting parsed numbers to Python float
    """

    integer = Word(nums).setName("integer").setParseAction(convertToInteger)
    """expression that parses an unsigned integer, returns an int"""

    hex_integer = Word(hexnums).setName("hex integer").setParseAction(tokenMap(int,16))
    """expression that parses a hexadecimal integer, returns an int"""

    signed_integer = Regex(r'[+-]?\d+').setName("signed integer").setParseAction(convertToInteger)
    """expression that parses an integer with optional leading sign, returns an int"""

    fraction = (signed_integer().setParseAction(convertToFloat) + '/' + signed_integer().setParseAction(convertToFloat)).setName("fraction")
    """fractional expression of an integer divided by an integer, returns a float"""
    fraction.addParseAction(lambda t: t[0]/t[-1])

    mixed_integer = (fraction | signed_integer + Optional(Optional('-').suppress() + fraction)).setName("fraction or mixed integer-fraction")
    """mixed integer of the form 'integer - fraction', with optional leading integer, returns float"""
    mixed_integer.addParseAction(sum)

    real = Regex(r'[+-]?\d+\.\d*').setName("real number").setParseAction(convertToFloat)
    """expression that parses a floating point number and returns a float"""

    sci_real = Regex(r'[+-]?\d+([eE][+-]?\d+|\.\d*([eE][+-]?\d+)?)').setName("real number with scientific notation").setParseAction(convertToFloat)
    """expression that parses a floating point number with optional scientific notation and returns a float"""

    # streamlining this expression makes the docs nicer-looking
    number = (sci_real | real | signed_integer).streamline()
    """any numeric expression, returns the corresponding Python type"""

    fnumber = Regex(r'[+-]?\d+\.?\d*([eE][+-]?\d+)?').setName("fnumber").setParseAction(convertToFloat)
    """any int or real number, returned as float"""
    
    identifier = Word(alphas+'_', alphanums+'_').setName("identifier")
    """typical code identifier (leading alpha or '_', followed by 0 or more alphas, nums, or '_')"""
    
    ipv4_address = Regex(r'(25[0-5]|2[0-4][0-9]|1?[0-9]{1,2})(\.(25[0-5]|2[0-4][0-9]|1?[0-9]{1,2})){3}').setName("IPv4 address")
    "IPv4 address (C{0.0.0.0 - 255.255.255.255})"

    _ipv6_part = Regex(r'[0-9a-fA-F]{1,4}').setName("hex_integer")
    _full_ipv6_address = (_ipv6_part + (':' + _ipv6_part)*7).setName("full IPv6 address")
    _short_ipv6_address = (Optional(_ipv6_part + (':' + _ipv6_part)*(0,6)) + "::" + Optional(_ipv6_part + (':' + _ipv6_part)*(0,6))).setName("short IPv6 address")
    _short_ipv6_address.addCondition(lambda t: sum(1 for tt in t if pyparsing_common._ipv6_part.matches(tt)) < 8)
    _mixed_ipv6_address = ("::ffff:" + ipv4_address).setName("mixed IPv6 address")
    ipv6_address = Combine((_full_ipv6_address | _mixed_ipv6_address | _short_ipv6_address).setName("IPv6 address")).setName("IPv6 address")
    "IPv6 address (long, short, or mixed form)"
    
    mac_address = Regex(r'[0-9a-fA-F]{2}([:.-])[0-9a-fA-F]{2}(?:\1[0-9a-fA-F]{2}){4}').setName("MAC address")
    "MAC address xx:xx:xx:xx:xx (may also have '-' or '.' delimiters)"

    @staticmethod
    def convertToDate(fmt="%Y-%m-%d"):
        """
        Helper to create a parse action for converting parsed date string to Python datetime.date

        Params -
         - fmt - format to be passed to datetime.strptime (default=C{"%Y-%m-%d"})

        Example::
            date_expr = pyparsing_common.iso8601_date.copy()
            date_expr.setParseAction(pyparsing_common.convertToDate())
            print(date_expr.parseString("1999-12-31"))
        prints::
            [datetime.date(1999, 12, 31)]
        """
        def cvt_fn(s,l,t):
            try:
                return datetime.strptime(t[0], fmt).date()
            except ValueError as ve:
                raise ParseException(s, l, str(ve))
        return cvt_fn

    @staticmethod
    def convertToDatetime(fmt="%Y-%m-%dT%H:%M:%S.%f"):
        """
        Helper to create a parse action for converting parsed datetime string to Python datetime.datetime

        Params -
         - fmt - format to be passed to datetime.strptime (default=C{"%Y-%m-%dT%H:%M:%S.%f"})

        Example::
            dt_expr = pyparsing_common.iso8601_datetime.copy()
            dt_expr.setParseAction(pyparsing_common.convertToDatetime())
            print(dt_expr.parseString("1999-12-31T23:59:59.999"))
        prints::
            [datetime.datetime(1999, 12, 31, 23, 59, 59, 999000)]
        """
        def cvt_fn(s,l,t):
            try:
                return datetime.strptime(t[0], fmt)
            except ValueError as ve:
                raise ParseException(s, l, str(ve))
        return cvt_fn

    iso8601_date = Regex(r'(?P<year>\d{4})(?:-(?P<month>\d\d)(?:-(?P<day>\d\d))?)?').setName("ISO8601 date")
    "ISO8601 date (C{yyyy-mm-dd})"

    iso8601_datetime = Regex(r'(?P<year>\d{4})-(?P<month>\d\d)-(?P<day>\d\d)[T ](?P<hour>\d\d):(?P<minute>\d\d)(:(?P<second>\d\d(\.\d*)?)?)?(?P<tz>Z|[+-]\d\d:?\d\d)?').setName("ISO8601 datetime")
    "ISO8601 datetime (C{yyyy-mm-ddThh:mm:ss.s(Z|+-00:00)}) - trailing seconds, milliseconds, and timezone optional; accepts separating C{'T'} or C{' '}"

    uuid = Regex(r'[0-9a-fA-F]{8}(-[0-9a-fA-F]{4}){3}-[0-9a-fA-F]{12}').setName("UUID")
    "UUID (C{xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx})"

    _html_stripper = anyOpenTag.suppress() | anyCloseTag.suppress()
    @staticmethod
    def stripHTMLTags(s, l, tokens):
        """
        Parse action to remove HTML tags from web page HTML source

        Example::
            # strip HTML links from normal text 
            text = '<td>More info at the <a href="http://pyparsing.wikispaces.com">pyparsing</a> wiki page</td>'
            td,td_end = makeHTMLTags("TD")
            table_text = td + SkipTo(td_end).setParseAction(pyparsing_common.stripHTMLTags)("body") + td_end
            
            print(table_text.parseString(text).body) # -> 'More info at the pyparsing wiki page'
        """
        return pyparsing_common._html_stripper.transformString(tokens[0])

    _commasepitem = Combine(OneOrMore(~Literal(",") + ~LineEnd() + Word(printables, excludeChars=',') 
                                        + Optional( White(" \t") ) ) ).streamline().setName("commaItem")
    comma_separated_list = delimitedList( Optional( quotedString.copy() | _commasepitem, default="") ).setName("comma separated list")
    """Predefined expression of 1 or more printable words or quoted strings, separated by commas."""

    upcaseTokens = staticmethod(tokenMap(lambda t: _ustr(t).upper()))
    """Parse action to convert tokens to upper case."""

    downcaseTokens = staticmethod(tokenMap(lambda t: _ustr(t).lower()))
    """Parse action to convert tokens to lower case."""


if __name__ == "__main__":

    selectToken    = CaselessLiteral("select")
    fromToken      = CaselessLiteral("from")

    ident          = Word(alphas, alphanums + "_$")

    columnName     = delimitedList(ident, ".", combine=True).setParseAction(upcaseTokens)
    columnNameList = Group(delimitedList(columnName)).setName("columns")
    columnSpec     = ('*' | columnNameList)

    tableName      = delimitedList(ident, ".", combine=True).setParseAction(upcaseTokens)
    tableNameList  = Group(delimitedList(tableName)).setName("tables")
    
    simpleSQL      = selectToken("command") + columnSpec("columns") + fromToken + tableNameList("tables")

    # demo runTests method, including embedded comments in test string
    simpleSQL.runTests("""
        # '*' as column list and dotted table name
        select * from SYS.XYZZY

        # caseless match on "SELECT", and casts back to "select"
        SELECT * from XYZZY, ABC

        # list of column names, and mixed case SELECT keyword
        Select AA,BB,CC from Sys.dual

        # multiple tables
        Select A, B, C from Sys.dual, Table2

        # invalid SELECT keyword - should fail
        Xelect A, B, C from Sys.dual

        # incomplete command - should fail
        Select

        # invalid column name - should fail
        Select ^^^ frox Sys.dual

        """)

    pyparsing_common.number.runTests("""
        100
        -100
        +100
        3.14159
        6.02e23
        1e-12
        """)

    # any int or real number, returned as float
    pyparsing_common.fnumber.runTests("""
        100
        -100
        +100
        3.14159
        6.02e23
        1e-12
        """)

    pyparsing_common.hex_integer.runTests("""
        100
        FF
        """)

    import uuid
    pyparsing_common.uuid.setParseAction(tokenMap(uuid.UUID))
    pyparsing_common.uuid.runTests("""
        12345678-1234-5678-1234-567812345678
        """)
site-packages/setuptools/_vendor/packaging/_compat.py
# This file is dual licensed under the terms of the Apache License, Version
# 2.0, and the BSD License. See the LICENSE file in the root of this repository
# for complete details.
from __future__ import absolute_import, division, print_function

import sys


PY2 = sys.version_info[0] == 2
PY3 = sys.version_info[0] == 3

# flake8: noqa

if PY3:
    string_types = str,
else:
    string_types = basestring,


def with_metaclass(meta, *bases):
    """
    Create a base class with a metaclass.
    """
    # This requires a bit of explanation: the basic idea is to make a dummy
    # metaclass for one level of class instantiation that replaces itself with
    # the actual metaclass.
    class metaclass(meta):
        def __new__(cls, name, this_bases, d):
            return meta(name, bases, d)
    return type.__new__(metaclass, 'temporary_class', (), {})
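# Example (illustrative usage sketch) - declare a class whose metaclass is applied
# identically under Python 2 and Python 3 (Meta and Base are hypothetical names):
#
#   class Meta(type):
#       pass
#
#   class Base(with_metaclass(Meta)):
#       pass
#
#   # type(Base) is Meta on both interpreters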
site-packages/setuptools/_vendor/packaging/requirements.py
# This file is dual licensed under the terms of the Apache License, Version
# 2.0, and the BSD License. See the LICENSE file in the root of this repository
# for complete details.
from __future__ import absolute_import, division, print_function

import string
import re

from setuptools.extern.pyparsing import stringStart, stringEnd, originalTextFor, ParseException
from setuptools.extern.pyparsing import ZeroOrMore, Word, Optional, Regex, Combine
from setuptools.extern.pyparsing import Literal as L  # noqa
from setuptools.extern.six.moves.urllib import parse as urlparse

from .markers import MARKER_EXPR, Marker
from .specifiers import LegacySpecifier, Specifier, SpecifierSet


class InvalidRequirement(ValueError):
    """
    An invalid requirement was found, users should refer to PEP 508.
    """


ALPHANUM = Word(string.ascii_letters + string.digits)

LBRACKET = L("[").suppress()
RBRACKET = L("]").suppress()
LPAREN = L("(").suppress()
RPAREN = L(")").suppress()
COMMA = L(",").suppress()
SEMICOLON = L(";").suppress()
AT = L("@").suppress()

PUNCTUATION = Word("-_.")
IDENTIFIER_END = ALPHANUM | (ZeroOrMore(PUNCTUATION) + ALPHANUM)
IDENTIFIER = Combine(ALPHANUM + ZeroOrMore(IDENTIFIER_END))

NAME = IDENTIFIER("name")
EXTRA = IDENTIFIER

URI = Regex(r'[^ ]+')("url")
URL = (AT + URI)

EXTRAS_LIST = EXTRA + ZeroOrMore(COMMA + EXTRA)
EXTRAS = (LBRACKET + Optional(EXTRAS_LIST) + RBRACKET)("extras")

VERSION_PEP440 = Regex(Specifier._regex_str, re.VERBOSE | re.IGNORECASE)
VERSION_LEGACY = Regex(LegacySpecifier._regex_str, re.VERBOSE | re.IGNORECASE)

VERSION_ONE = VERSION_PEP440 ^ VERSION_LEGACY
VERSION_MANY = Combine(VERSION_ONE + ZeroOrMore(COMMA + VERSION_ONE),
                       joinString=",", adjacent=False)("_raw_spec")
_VERSION_SPEC = Optional(((LPAREN + VERSION_MANY + RPAREN) | VERSION_MANY))
_VERSION_SPEC.setParseAction(lambda s, l, t: t._raw_spec or '')

VERSION_SPEC = originalTextFor(_VERSION_SPEC)("specifier")
VERSION_SPEC.setParseAction(lambda s, l, t: t[1])

MARKER_EXPR = originalTextFor(MARKER_EXPR())("marker")
MARKER_EXPR.setParseAction(
    lambda s, l, t: Marker(s[t._original_start:t._original_end])
)
MARKER_SEPERATOR = SEMICOLON
MARKER = MARKER_SEPERATOR + MARKER_EXPR

VERSION_AND_MARKER = VERSION_SPEC + Optional(MARKER)
URL_AND_MARKER = URL + Optional(MARKER)

NAMED_REQUIREMENT = \
    NAME + Optional(EXTRAS) + (URL_AND_MARKER | VERSION_AND_MARKER)

REQUIREMENT = stringStart + NAMED_REQUIREMENT + stringEnd


class Requirement(object):
    """Parse a requirement.

    Parse a given requirement string into its parts, such as name, specifier,
    URL, and extras. Raises InvalidRequirement on a badly-formed requirement
    string.
    """

    # TODO: Can we test whether something is contained within a requirement?
    #       If so how do we do that? Do we need to test against the _name_ of
    #       the thing as well as the version? What about the markers?
    # TODO: Can we normalize the name and extra name?

    def __init__(self, requirement_string):
        try:
            req = REQUIREMENT.parseString(requirement_string)
        except ParseException as e:
            raise InvalidRequirement(
                "Invalid requirement, parse error at \"{0!r}\"".format(
                    requirement_string[e.loc:e.loc + 8]))

        self.name = req.name
        if req.url:
            parsed_url = urlparse.urlparse(req.url)
            if not (parsed_url.scheme and parsed_url.netloc) or (
                    not parsed_url.scheme and not parsed_url.netloc):
                raise InvalidRequirement("Invalid URL given")
            self.url = req.url
        else:
            self.url = None
        self.extras = set(req.extras.asList() if req.extras else [])
        self.specifier = SpecifierSet(req.specifier)
        self.marker = req.marker if req.marker else None

    def __str__(self):
        parts = [self.name]

        if self.extras:
            parts.append("[{0}]".format(",".join(sorted(self.extras))))

        if self.specifier:
            parts.append(str(self.specifier))

        if self.url:
            parts.append("@ {0}".format(self.url))

        if self.marker:
            parts.append("; {0}".format(self.marker))

        return "".join(parts)

    def __repr__(self):
        return "<Requirement({0!r})>".format(str(self))
site-packages/setuptools/_vendor/packaging/_structures.py
# This file is dual licensed under the terms of the Apache License, Version
# 2.0, and the BSD License. See the LICENSE file in the root of this repository
# for complete details.
from __future__ import absolute_import, division, print_function


class Infinity(object):

    def __repr__(self):
        return "Infinity"

    def __hash__(self):
        return hash(repr(self))

    def __lt__(self, other):
        return False

    def __le__(self, other):
        return False

    def __eq__(self, other):
        return isinstance(other, self.__class__)

    def __ne__(self, other):
        return not isinstance(other, self.__class__)

    def __gt__(self, other):
        return True

    def __ge__(self, other):
        return True

    def __neg__(self):
        return NegativeInfinity

Infinity = Infinity()


class NegativeInfinity(object):

    def __repr__(self):
        return "-Infinity"

    def __hash__(self):
        return hash(repr(self))

    def __lt__(self, other):
        return True

    def __le__(self, other):
        return True

    def __eq__(self, other):
        return isinstance(other, self.__class__)

    def __ne__(self, other):
        return not isinstance(other, self.__class__)

    def __gt__(self, other):
        return False

    def __ge__(self, other):
        return False

    def __neg__(self):
        return Infinity

NegativeInfinity = NegativeInfinity()
site-packages/setuptools/_vendor/packaging/__about__.py
# This file is dual licensed under the terms of the Apache License, Version
# 2.0, and the BSD License. See the LICENSE file in the root of this repository
# for complete details.
from __future__ import absolute_import, division, print_function

__all__ = [
    "__title__", "__summary__", "__uri__", "__version__", "__author__",
    "__email__", "__license__", "__copyright__",
]

__title__ = "packaging"
__summary__ = "Core utilities for Python packages"
__uri__ = "https://github.com/pypa/packaging"

__version__ = "16.8"

__author__ = "Donald Stufft and individual contributors"
__email__ = "donald@stufft.io"

__license__ = "BSD or Apache License, Version 2.0"
__copyright__ = "Copyright 2014-2016 %s" % __author__
site-packages/setuptools/_vendor/packaging/__pycache__/
[ binary content omitted: the following tar members are compiled CPython 3.6 bytecode (.pyc) files and are not reproducible as text ]
    __about__.cpython-36.pyc        __about__.cpython-36.opt-1.pyc
    __init__.cpython-36.pyc         __init__.cpython-36.opt-1.pyc
    _compat.cpython-36.pyc          _compat.cpython-36.opt-1.pyc
    _structures.cpython-36.pyc      _structures.cpython-36.opt-1.pyc
    markers.cpython-36.pyc          markers.cpython-36.opt-1.pyc
    requirements.cpython-36.pyc     requirements.cpython-36.opt-1.pyc
    specifiers.cpython-36.pyc       specifiers.cpython-36.opt-1.pyc
    utils.cpython-36.pyc            utils.cpython-36.opt-1.pyc
    version.cpython-36.pyc          version.cpython-36.opt-1.pyc
site-packages/setuptools/_vendor/packaging/specifiers.py
# This file is dual licensed under the terms of the Apache License, Version
# 2.0, and the BSD License. See the LICENSE file in the root of this repository
# for complete details.
from __future__ import absolute_import, division, print_function

import abc
import functools
import itertools
import re

from ._compat import string_types, with_metaclass
from .version import Version, LegacyVersion, parse


class InvalidSpecifier(ValueError):
    """
    An invalid specifier was found, users should refer to PEP 440.
    """


class BaseSpecifier(with_metaclass(abc.ABCMeta, object)):

    @abc.abstractmethod
    def __str__(self):
        """
        Returns the str representation of this Specifier like object. This
        should be representative of the Specifier itself.
        """

    @abc.abstractmethod
    def __hash__(self):
        """
        Returns a hash value for this Specifier like object.
        """

    @abc.abstractmethod
    def __eq__(self, other):
        """
        Returns a boolean representing whether or not the two Specifier like
        objects are equal.
        """

    @abc.abstractmethod
    def __ne__(self, other):
        """
        Returns a boolean representing whether or not the two Specifier like
        objects are not equal.
        """

    @abc.abstractproperty
    def prereleases(self):
        """
        Returns whether or not pre-releases as a whole are allowed by this
        specifier.
        """

    @prereleases.setter
    def prereleases(self, value):
        """
        Sets whether or not pre-releases as a whole are allowed by this
        specifier.
        """

    @abc.abstractmethod
    def contains(self, item, prereleases=None):
        """
        Determines if the given item is contained within this specifier.
        """

    @abc.abstractmethod
    def filter(self, iterable, prereleases=None):
        """
        Takes an iterable of items and filters them so that only items which
        are contained within this specifier are allowed in it.
        """


class _IndividualSpecifier(BaseSpecifier):

    _operators = {}

    def __init__(self, spec="", prereleases=None):
        match = self._regex.search(spec)
        if not match:
            raise InvalidSpecifier("Invalid specifier: '{0}'".format(spec))

        self._spec = (
            match.group("operator").strip(),
            match.group("version").strip(),
        )

        # Store whether or not this Specifier should accept prereleases
        self._prereleases = prereleases

    def __repr__(self):
        pre = (
            ", prereleases={0!r}".format(self.prereleases)
            if self._prereleases is not None
            else ""
        )

        return "<{0}({1!r}{2})>".format(
            self.__class__.__name__,
            str(self),
            pre,
        )

    def __str__(self):
        return "{0}{1}".format(*self._spec)

    def __hash__(self):
        return hash(self._spec)

    def __eq__(self, other):
        if isinstance(other, string_types):
            try:
                other = self.__class__(other)
            except InvalidSpecifier:
                return NotImplemented
        elif not isinstance(other, self.__class__):
            return NotImplemented

        return self._spec == other._spec

    def __ne__(self, other):
        if isinstance(other, string_types):
            try:
                other = self.__class__(other)
            except InvalidSpecifier:
                return NotImplemented
        elif not isinstance(other, self.__class__):
            return NotImplemented

        return self._spec != other._spec

    def _get_operator(self, op):
        return getattr(self, "_compare_{0}".format(self._operators[op]))

    def _coerce_version(self, version):
        if not isinstance(version, (LegacyVersion, Version)):
            version = parse(version)
        return version

    @property
    def operator(self):
        return self._spec[0]

    @property
    def version(self):
        return self._spec[1]

    @property
    def prereleases(self):
        return self._prereleases

    @prereleases.setter
    def prereleases(self, value):
        self._prereleases = value

    def __contains__(self, item):
        return self.contains(item)

    def contains(self, item, prereleases=None):
        # Determine if prereleases are to be allowed or not.
        if prereleases is None:
            prereleases = self.prereleases

        # Normalize item to a Version or LegacyVersion; this allows us to have
        # a shortcut for ``"2.0" in Specifier(">=2")``.
        item = self._coerce_version(item)

        # Determine if we should be supporting prereleases in this specifier
        # or not. If we do not support prereleases, then we can short circuit
        # the logic if this version is a prerelease.
        if item.is_prerelease and not prereleases:
            return False

        # Actually do the comparison to determine if this item is contained
        # within this Specifier or not.
        return self._get_operator(self.operator)(item, self.version)

    def filter(self, iterable, prereleases=None):
        yielded = False
        found_prereleases = []

        kw = {"prereleases": prereleases if prereleases is not None else True}

        # Attempt to iterate over all the values in the iterable and if any of
        # them match, yield them.
        for version in iterable:
            parsed_version = self._coerce_version(version)

            if self.contains(parsed_version, **kw):
                # If our version is a prerelease, and we were not set to allow
                # prereleases, then we'll store it for later in case nothing
                # else matches this specifier.
                if (parsed_version.is_prerelease and not
                        (prereleases or self.prereleases)):
                    found_prereleases.append(version)
                # Either this is not a prerelease, or we should have been
                # accepting prereleases from the beginning.
                else:
                    yielded = True
                    yield version

        # Now that we've iterated over everything, determine if we've yielded
        # any values, and if we have not and we have any prereleases stored up
        # then we will go ahead and yield the prereleases.
        if not yielded and found_prereleases:
            for version in found_prereleases:
                yield version
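
    # Illustrative behaviour of filter() (a minimal doctest-style sketch using
    # the ``Specifier`` subclass defined below): pre-releases are only yielded
    # when no final release matches.
    #
    #     >>> list(Specifier(">=1.0").filter(["1.0", "2.0a1"]))
    #     ['1.0']
    #     >>> list(Specifier(">=1.0").filter(["2.0a1"]))
    #     ['2.0a1']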


class LegacySpecifier(_IndividualSpecifier):

    _regex_str = (
        r"""
        (?P<operator>(==|!=|<=|>=|<|>))
        \s*
        (?P<version>
            [^,;\s)]* # Since this is a "legacy" specifier, and the version
                      # string can be just about anything, we match everything
                      # except for whitespace, a semi-colon for marker support,
                      # a closing paren since versions can be enclosed in
                      # them, and a comma since it's a version separator.
        )
        """
    )

    _regex = re.compile(
        r"^\s*" + _regex_str + r"\s*$", re.VERBOSE | re.IGNORECASE)

    _operators = {
        "==": "equal",
        "!=": "not_equal",
        "<=": "less_than_equal",
        ">=": "greater_than_equal",
        "<": "less_than",
        ">": "greater_than",
    }

    def _coerce_version(self, version):
        if not isinstance(version, LegacyVersion):
            version = LegacyVersion(str(version))
        return version

    def _compare_equal(self, prospective, spec):
        return prospective == self._coerce_version(spec)

    def _compare_not_equal(self, prospective, spec):
        return prospective != self._coerce_version(spec)

    def _compare_less_than_equal(self, prospective, spec):
        return prospective <= self._coerce_version(spec)

    def _compare_greater_than_equal(self, prospective, spec):
        return prospective >= self._coerce_version(spec)

    def _compare_less_than(self, prospective, spec):
        return prospective < self._coerce_version(spec)

    def _compare_greater_than(self, prospective, spec):
        return prospective > self._coerce_version(spec)


def _require_version_compare(fn):
    @functools.wraps(fn)
    def wrapped(self, prospective, spec):
        if not isinstance(prospective, Version):
            return False
        return fn(self, prospective, spec)
    return wrapped


class Specifier(_IndividualSpecifier):

    _regex_str = (
        r"""
        (?P<operator>(~=|==|!=|<=|>=|<|>|===))
        (?P<version>
            (?:
                # The identity operators allow for an escape hatch that will
                # do an exact string match of the version you wish to install.
                # This will not be parsed by PEP 440 and we cannot determine
                # any semantic meaning from it. This operator is discouraged
                # but included entirely as an escape hatch.
                (?<====)  # Only match for the identity operator
                \s*
                [^\s]*    # We just match everything, except for whitespace
                          # since we are only testing for strict identity.
            )
            |
            (?:
                # The (non)equality operators allow for wild card and local
                # versions to be specified so we have to define these two
                # operators separately to enable that.
                (?<===|!=)            # Only match for equals and not equals

                \s*
                v?
                (?:[0-9]+!)?          # epoch
                [0-9]+(?:\.[0-9]+)*   # release
                (?:                   # pre release
                    [-_\.]?
                    (a|b|c|rc|alpha|beta|pre|preview)
                    [-_\.]?
                    [0-9]*
                )?
                (?:                   # post release
                    (?:-[0-9]+)|(?:[-_\.]?(post|rev|r)[-_\.]?[0-9]*)
                )?

                # You cannot use a wild card and a dev or local version
                # together so group them with a | and make them optional.
                (?:
                    (?:[-_\.]?dev[-_\.]?[0-9]*)?         # dev release
                    (?:\+[a-z0-9]+(?:[-_\.][a-z0-9]+)*)? # local
                    |
                    \.\*  # Wild card syntax of .*
                )?
            )
            |
            (?:
                # The compatible operator requires at least two digits in the
                # release segment.
                (?<=~=)               # Only match for the compatible operator

                \s*
                v?
                (?:[0-9]+!)?          # epoch
                [0-9]+(?:\.[0-9]+)+   # release  (We have a + instead of a *)
                (?:                   # pre release
                    [-_\.]?
                    (a|b|c|rc|alpha|beta|pre|preview)
                    [-_\.]?
                    [0-9]*
                )?
                (?:                                   # post release
                    (?:-[0-9]+)|(?:[-_\.]?(post|rev|r)[-_\.]?[0-9]*)
                )?
                (?:[-_\.]?dev[-_\.]?[0-9]*)?          # dev release
            )
            |
            (?:
                # All other operators only allow a subset of what the
                # (non)equality operators do. Specifically they do not allow
                # local versions to be specified nor do they allow the prefix
                # matching wild cards.
                (?<!==|!=|~=)         # We have special cases for these
                                      # operators so we want to make sure they
                                      # don't match here.

                \s*
                v?
                (?:[0-9]+!)?          # epoch
                [0-9]+(?:\.[0-9]+)*   # release
                (?:                   # pre release
                    [-_\.]?
                    (a|b|c|rc|alpha|beta|pre|preview)
                    [-_\.]?
                    [0-9]*
                )?
                (?:                                   # post release
                    (?:-[0-9]+)|(?:[-_\.]?(post|rev|r)[-_\.]?[0-9]*)
                )?
                (?:[-_\.]?dev[-_\.]?[0-9]*)?          # dev release
            )
        )
        """
    )

    _regex = re.compile(
        r"^\s*" + _regex_str + r"\s*$", re.VERBOSE | re.IGNORECASE)

    _operators = {
        "~=": "compatible",
        "==": "equal",
        "!=": "not_equal",
        "<=": "less_than_equal",
        ">=": "greater_than_equal",
        "<": "less_than",
        ">": "greater_than",
        "===": "arbitrary",
    }

    @_require_version_compare
    def _compare_compatible(self, prospective, spec):
        # Compatible releases have an equivalent combination of >= and ==. That
        # is, ~=2.2 is equivalent to >=2.2,==2.*. This allows us to
        # implement this in terms of the other specifiers instead of
        # implementing it ourselves. The only thing we need to do is construct
        # the other specifiers.

        # We want everything but the last item in the version, but we want to
        # ignore post and dev releases and we want to treat the pre-release as
        # its own separate segment.
        prefix = ".".join(
            list(
                itertools.takewhile(
                    lambda x: (not x.startswith("post") and not
                               x.startswith("dev")),
                    _version_split(spec),
                )
            )[:-1]
        )

        # Add the prefix notation to the end of our string
        prefix += ".*"

        return (self._get_operator(">=")(prospective, spec) and
                self._get_operator("==")(prospective, prefix))
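
    # A minimal doctest-style sketch of the equivalence described above:
    #
    #     >>> Specifier("~=2.2").contains("2.5")
    #     True
    #     >>> Specifier("~=2.2").contains("3.0")
    #     False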

    @_require_version_compare
    def _compare_equal(self, prospective, spec):
        # We need special logic to handle prefix matching
        if spec.endswith(".*"):
            # In the case of prefix matching we want to ignore local segment.
            prospective = Version(prospective.public)
            # Split the spec out by dots, and pretend that there is an implicit
            # dot in between a release segment and a pre-release segment.
            spec = _version_split(spec[:-2])  # Remove the trailing .*

            # Split the prospective version out by dots, and pretend that there
            # is an implicit dot in between a release segment and a pre-release
            # segment.
            prospective = _version_split(str(prospective))

            # Shorten the prospective version to be the same length as the spec
            # so that we can determine if the specifier is a prefix of the
            # prospective version or not.
            prospective = prospective[:len(spec)]

            # Pad out our two sides with zeros so that they both equal the same
            # length.
            spec, prospective = _pad_version(spec, prospective)
        else:
            # Convert our spec string into a Version
            spec = Version(spec)

            # If the specifier does not have a local segment, then we want to
            # act as if the prospective version also does not have a local
            # segment.
            if not spec.local:
                prospective = Version(prospective.public)

        return prospective == spec

    @_require_version_compare
    def _compare_not_equal(self, prospective, spec):
        return not self._compare_equal(prospective, spec)

    @_require_version_compare
    def _compare_less_than_equal(self, prospective, spec):
        return prospective <= Version(spec)

    @_require_version_compare
    def _compare_greater_than_equal(self, prospective, spec):
        return prospective >= Version(spec)

    @_require_version_compare
    def _compare_less_than(self, prospective, spec):
        # Convert our spec to a Version instance, since we'll want to work with
        # it as a version.
        spec = Version(spec)

        # Check to see if the prospective version is less than the spec
        # version. If it's not we can short circuit and just return False now
        # instead of doing extra unneeded work.
        if not prospective < spec:
            return False

        # This special case is here so that, unless the specifier itself
        # includes a pre-release version, we do not accept pre-release
        # versions for the version mentioned in the specifier (e.g. <3.1 should
        # not match 3.1.dev0, but should match 3.0.dev0).
        if not spec.is_prerelease and prospective.is_prerelease:
            if Version(prospective.base_version) == Version(spec.base_version):
                return False

        # If we've gotten to here, it means that the prospective version is
        # both less than the spec version *and* it's not a pre-release of the
        # same version in the spec.
        return True

    @_require_version_compare
    def _compare_greater_than(self, prospective, spec):
        # Convert our spec to a Version instance, since we'll want to work with
        # it as a version.
        spec = Version(spec)

        # Check to see if the prospective version is greater than the spec
        # version. If it's not we can short circuit and just return False now
        # instead of doing extra unneeded work.
        if not prospective > spec:
            return False

        # This special case is here so that, unless the specifier itself
        # includes a post-release version, we do not accept
        # post-release versions for the version mentioned in the specifier
        # (e.g. >3.1 should not match 3.0.post0, but should match 3.2.post0).
        if not spec.is_postrelease and prospective.is_postrelease:
            if Version(prospective.base_version) == Version(spec.base_version):
                return False

        # Ensure that we do not allow a local version of the version mentioned
        # in the specifier, which is technically greater than, to match.
        if prospective.local is not None:
            if Version(prospective.base_version) == Version(spec.base_version):
                return False

        # If we've gotten to here, it means that the prospective version is
        # both greater than the spec version *and* it's not a post-release or
        # local version of the same version in the spec.
        return True

    def _compare_arbitrary(self, prospective, spec):
        return str(prospective).lower() == str(spec).lower()

    @property
    def prereleases(self):
        # If there is an explicit prereleases set for this, then we'll just
        # blindly use that.
        if self._prereleases is not None:
            return self._prereleases

        # Look at all of our specifiers and determine if they are inclusive
        # operators, and if they are if they are including an explicit
        # prerelease.
        operator, version = self._spec
        if operator in ["==", ">=", "<=", "~=", "==="]:
            # The == specifier can include a trailing .*; if it does, we
            # want to remove it before parsing.
            if operator == "==" and version.endswith(".*"):
                version = version[:-2]

            # Parse the version, and if it is a pre-release than this
            # specifier allows pre-releases.
            if parse(version).is_prerelease:
                return True

        return False
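
    # A minimal sketch of the rule above: pinning a pre-release with an
    # inclusive operator implicitly enables pre-releases for that specifier.
    #
    #     >>> Specifier(">=1.0a1").prereleases
    #     True
    #     >>> Specifier(">=1.0").prereleases
    #     False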

    @prereleases.setter
    def prereleases(self, value):
        self._prereleases = value


_prefix_regex = re.compile(r"^([0-9]+)((?:a|b|c|rc)[0-9]+)$")


def _version_split(version):
    result = []
    for item in version.split("."):
        match = _prefix_regex.search(item)
        if match:
            result.extend(match.groups())
        else:
            result.append(item)
    return result
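
# A minimal sketch of the splitting, including the implicit dot between a
# release segment and a pre-release segment:
#
#     >>> _version_split("2.1rc1")
#     ['2', '1', 'rc1']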


def _pad_version(left, right):
    left_split, right_split = [], []

    # Get the release segment of our versions
    left_split.append(list(itertools.takewhile(lambda x: x.isdigit(), left)))
    right_split.append(list(itertools.takewhile(lambda x: x.isdigit(), right)))

    # Get the rest of our versions
    left_split.append(left[len(left_split[0]):])
    right_split.append(right[len(right_split[0]):])

    # Insert our padding
    left_split.insert(
        1,
        ["0"] * max(0, len(right_split[0]) - len(left_split[0])),
    )
    right_split.insert(
        1,
        ["0"] * max(0, len(left_split[0]) - len(right_split[0])),
    )

    return (
        list(itertools.chain(*left_split)),
        list(itertools.chain(*right_split)),
    )
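
# A minimal sketch of the padding: the shorter release segment is padded with
# zeros to the length of the longer one.
#
#     >>> _pad_version(["1", "2"], ["1", "2", "3"])
#     (['1', '2', '0'], ['1', '2', '3'])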


class SpecifierSet(BaseSpecifier):

    def __init__(self, specifiers="", prereleases=None):
        # Split on , to break each individual specifier into its own item, and
        # strip each item to remove leading/trailing whitespace.
        specifiers = [s.strip() for s in specifiers.split(",") if s.strip()]

        # Parse each individual specifier, attempting first to make it a
        # Specifier and falling back to a LegacySpecifier.
        parsed = set()
        for specifier in specifiers:
            try:
                parsed.add(Specifier(specifier))
            except InvalidSpecifier:
                parsed.add(LegacySpecifier(specifier))

        # Turn our parsed specifiers into a frozen set and save them for later.
        self._specs = frozenset(parsed)

        # Store our prereleases value so we can use it later to determine if
        # we accept prereleases or not.
        self._prereleases = prereleases

    def __repr__(self):
        pre = (
            ", prereleases={0!r}".format(self.prereleases)
            if self._prereleases is not None
            else ""
        )

        return "<SpecifierSet({0!r}{1})>".format(str(self), pre)

    def __str__(self):
        return ",".join(sorted(str(s) for s in self._specs))

    def __hash__(self):
        return hash(self._specs)

    def __and__(self, other):
        if isinstance(other, string_types):
            other = SpecifierSet(other)
        elif not isinstance(other, SpecifierSet):
            return NotImplemented

        specifier = SpecifierSet()
        specifier._specs = frozenset(self._specs | other._specs)

        if self._prereleases is None and other._prereleases is not None:
            specifier._prereleases = other._prereleases
        elif self._prereleases is not None and other._prereleases is None:
            specifier._prereleases = self._prereleases
        elif self._prereleases == other._prereleases:
            specifier._prereleases = self._prereleases
        else:
            raise ValueError(
                "Cannot combine SpecifierSets with True and False prerelease "
                "overrides."
            )

        return specifier
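
    # A minimal sketch: intersecting two sets unions their individual
    # specifiers.
    #
    #     >>> str(SpecifierSet(">=1.0") & SpecifierSet("<2.0"))
    #     '<2.0,>=1.0'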

    def __eq__(self, other):
        if isinstance(other, string_types):
            other = SpecifierSet(other)
        elif isinstance(other, _IndividualSpecifier):
            other = SpecifierSet(str(other))
        elif not isinstance(other, SpecifierSet):
            return NotImplemented

        return self._specs == other._specs

    def __ne__(self, other):
        if isinstance(other, string_types):
            other = SpecifierSet(other)
        elif isinstance(other, _IndividualSpecifier):
            other = SpecifierSet(str(other))
        elif not isinstance(other, SpecifierSet):
            return NotImplemented

        return self._specs != other._specs

    def __len__(self):
        return len(self._specs)

    def __iter__(self):
        return iter(self._specs)

    @property
    def prereleases(self):
        # If we have been given an explicit prerelease modifier, then we'll
        # pass that through here.
        if self._prereleases is not None:
            return self._prereleases

        # If we don't have any specifiers, and we don't have a forced value,
        # then we'll just return None since we don't know if this should have
        # pre-releases or not.
        if not self._specs:
            return None

        # Otherwise we'll see if any of the given specifiers accept
        # prereleases, if any of them do we'll return True, otherwise False.
        return any(s.prereleases for s in self._specs)

    @prereleases.setter
    def prereleases(self, value):
        self._prereleases = value

    def __contains__(self, item):
        return self.contains(item)

    def contains(self, item, prereleases=None):
        # Ensure that our item is a Version or LegacyVersion instance.
        if not isinstance(item, (LegacyVersion, Version)):
            item = parse(item)

        # Determine if we're forcing a prerelease or not, if we're not forcing
        # one for this particular filter call, then we'll use whatever the
        # SpecifierSet thinks for whether or not we should support prereleases.
        if prereleases is None:
            prereleases = self.prereleases

        # We can determine if we're going to allow pre-releases by looking to
        # see if any of the underlying items supports them. If none of them do
        # and this item is a pre-release then we do not allow it and we can
        # short circuit that here.
        # Note: This means that 1.0.dev1 would not be contained in something
        #       like >=1.0.devabc, however it would be in >=1.0.devabc,>0.0.dev0
        if not prereleases and item.is_prerelease:
            return False

        # We simply dispatch to the underlying specs here to make sure that the
        # given version is contained within all of them.
        # Note: This use of all() here means that an empty set of specifiers
        #       will always return True, this is an explicit design decision.
        return all(
            s.contains(item, prereleases=prereleases)
            for s in self._specs
        )
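
    # A minimal sketch: a version is contained only if it satisfies every
    # specifier in the set.
    #
    #     >>> "1.5" in SpecifierSet(">=1.0,<2.0")
    #     True
    #     >>> "2.5" in SpecifierSet(">=1.0,<2.0")
    #     False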

    def filter(self, iterable, prereleases=None):
        # Determine if we're forcing a prerelease or not, if we're not forcing
        # one for this particular filter call, then we'll use whatever the
        # SpecifierSet thinks for whether or not we should support prereleases.
        if prereleases is None:
            prereleases = self.prereleases

        # If we have any specifiers, then we want to wrap our iterable in the
        # filter method for each one, this will act as a logical AND amongst
        # each specifier.
        if self._specs:
            for spec in self._specs:
                iterable = spec.filter(iterable, prereleases=bool(prereleases))
            return iterable
        # If we do not have any specifiers, then we need to have a rough filter
        # which will filter out any pre-releases, unless there are no final
        # releases, and which will filter out LegacyVersion in general.
        else:
            filtered = []
            found_prereleases = []

            for item in iterable:
                # Ensure that we have some kind of Version class for this item.
                if not isinstance(item, (LegacyVersion, Version)):
                    parsed_version = parse(item)
                else:
                    parsed_version = item

                # Filter out any item which is parsed as a LegacyVersion
                if isinstance(parsed_version, LegacyVersion):
                    continue

                # Store any item which is a pre-release for later unless we've
                # already found a final version or we are accepting prereleases
                if parsed_version.is_prerelease and not prereleases:
                    if not filtered:
                        found_prereleases.append(item)
                else:
                    filtered.append(item)

            # If we've found no items except for pre-releases, then we'll go
            # ahead and use the pre-releases
            if not filtered and found_prereleases and prereleases is None:
                return found_prereleases

            return filtered
site-packages/setuptools/_vendor/packaging/utils.py
# This file is dual licensed under the terms of the Apache License, Version
# 2.0, and the BSD License. See the LICENSE file in the root of this repository
# for complete details.
from __future__ import absolute_import, division, print_function

import re


_canonicalize_regex = re.compile(r"[-_.]+")


def canonicalize_name(name):
    # This is taken from PEP 503.
    return _canonicalize_regex.sub("-", name).lower()
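
# A minimal sketch: runs of ``-``, ``_`` and ``.`` collapse to a single ``-``
# and the result is lower-cased.
#
#     >>> canonicalize_name("Foo_Bar.baz")
#     'foo-bar-baz'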
site-packages/setuptools/_vendor/packaging/version.py
# This file is dual licensed under the terms of the Apache License, Version
# 2.0, and the BSD License. See the LICENSE file in the root of this repository
# for complete details.
from __future__ import absolute_import, division, print_function

import collections
import itertools
import re

from ._structures import Infinity


__all__ = [
    "parse", "Version", "LegacyVersion", "InvalidVersion", "VERSION_PATTERN"
]


_Version = collections.namedtuple(
    "_Version",
    ["epoch", "release", "dev", "pre", "post", "local"],
)


def parse(version):
    """
    Parse the given version string and return either a :class:`Version` object
    or a :class:`LegacyVersion` object depending on if the given version is
    a valid PEP 440 version or a legacy version.
    """
    try:
        return Version(version)
    except InvalidVersion:
        return LegacyVersion(version)
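
# A minimal sketch of the fallback described above:
#
#     >>> parse("1.0.post1")
#     <Version('1.0.post1')>
#     >>> parse("french toast")
#     <LegacyVersion('french toast')>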


class InvalidVersion(ValueError):
    """
    An invalid version was found, users should refer to PEP 440.
    """


class _BaseVersion(object):

    def __hash__(self):
        return hash(self._key)

    def __lt__(self, other):
        return self._compare(other, lambda s, o: s < o)

    def __le__(self, other):
        return self._compare(other, lambda s, o: s <= o)

    def __eq__(self, other):
        return self._compare(other, lambda s, o: s == o)

    def __ge__(self, other):
        return self._compare(other, lambda s, o: s >= o)

    def __gt__(self, other):
        return self._compare(other, lambda s, o: s > o)

    def __ne__(self, other):
        return self._compare(other, lambda s, o: s != o)

    def _compare(self, other, method):
        if not isinstance(other, _BaseVersion):
            return NotImplemented

        return method(self._key, other._key)


class LegacyVersion(_BaseVersion):

    def __init__(self, version):
        self._version = str(version)
        self._key = _legacy_cmpkey(self._version)

    def __str__(self):
        return self._version

    def __repr__(self):
        return "<LegacyVersion({0})>".format(repr(str(self)))

    @property
    def public(self):
        return self._version

    @property
    def base_version(self):
        return self._version

    @property
    def local(self):
        return None

    @property
    def is_prerelease(self):
        return False

    @property
    def is_postrelease(self):
        return False


_legacy_version_component_re = re.compile(
    r"(\d+ | [a-z]+ | \.| -)", re.VERBOSE,
)

_legacy_version_replacement_map = {
    "pre": "c", "preview": "c", "-": "final-", "rc": "c", "dev": "@",
}


def _parse_version_parts(s):
    for part in _legacy_version_component_re.split(s):
        part = _legacy_version_replacement_map.get(part, part)

        if not part or part == ".":
            continue

        if part[:1] in "0123456789":
            # pad for numeric comparison
            yield part.zfill(8)
        else:
            yield "*" + part

    # ensure that alpha/beta/candidate are before final
    yield "*final"


def _legacy_cmpkey(version):
    # We hardcode an epoch of -1 here. A PEP 440 version can only have an
    # epoch greater than or equal to 0. This will effectively sort the
    # LegacyVersion, which uses the de facto standard originally implemented
    # by setuptools, before all PEP 440 versions.
    epoch = -1

    # This scheme is taken from pkg_resources.parse_version of setuptools
    # prior to its adoption of the packaging library.
    parts = []
    for part in _parse_version_parts(version.lower()):
        if part.startswith("*"):
            # remove "-" before a prerelease tag
            if part < "*final":
                while parts and parts[-1] == "*final-":
                    parts.pop()

            # remove trailing zeros from each series of numeric parts
            while parts and parts[-1] == "00000000":
                parts.pop()

        parts.append(part)
    parts = tuple(parts)

    return epoch, parts

# Deliberately not anchored to the start and end of the string, to make it
# easier for 3rd party code to reuse
VERSION_PATTERN = r"""
    v?
    (?:
        (?:(?P<epoch>[0-9]+)!)?                           # epoch
        (?P<release>[0-9]+(?:\.[0-9]+)*)                  # release segment
        (?P<pre>                                          # pre-release
            [-_\.]?
            (?P<pre_l>(a|b|c|rc|alpha|beta|pre|preview))
            [-_\.]?
            (?P<pre_n>[0-9]+)?
        )?
        (?P<post>                                         # post release
            (?:-(?P<post_n1>[0-9]+))
            |
            (?:
                [-_\.]?
                (?P<post_l>post|rev|r)
                [-_\.]?
                (?P<post_n2>[0-9]+)?
            )
        )?
        (?P<dev>                                          # dev release
            [-_\.]?
            (?P<dev_l>dev)
            [-_\.]?
            (?P<dev_n>[0-9]+)?
        )?
    )
    (?:\+(?P<local>[a-z0-9]+(?:[-_\.][a-z0-9]+)*))?       # local version
"""


class Version(_BaseVersion):

    _regex = re.compile(
        r"^\s*" + VERSION_PATTERN + r"\s*$",
        re.VERBOSE | re.IGNORECASE,
    )

    def __init__(self, version):
        # Validate the version and parse it into pieces
        match = self._regex.search(version)
        if not match:
            raise InvalidVersion("Invalid version: '{0}'".format(version))

        # Store the parsed out pieces of the version
        self._version = _Version(
            epoch=int(match.group("epoch")) if match.group("epoch") else 0,
            release=tuple(int(i) for i in match.group("release").split(".")),
            pre=_parse_letter_version(
                match.group("pre_l"),
                match.group("pre_n"),
            ),
            post=_parse_letter_version(
                match.group("post_l"),
                match.group("post_n1") or match.group("post_n2"),
            ),
            dev=_parse_letter_version(
                match.group("dev_l"),
                match.group("dev_n"),
            ),
            local=_parse_local_version(match.group("local")),
        )

        # Generate a key which will be used for sorting
        self._key = _cmpkey(
            self._version.epoch,
            self._version.release,
            self._version.pre,
            self._version.post,
            self._version.dev,
            self._version.local,
        )

    def __repr__(self):
        return "<Version({0})>".format(repr(str(self)))

    def __str__(self):
        parts = []

        # Epoch
        if self._version.epoch != 0:
            parts.append("{0}!".format(self._version.epoch))

        # Release segment
        parts.append(".".join(str(x) for x in self._version.release))

        # Pre-release
        if self._version.pre is not None:
            parts.append("".join(str(x) for x in self._version.pre))

        # Post-release
        if self._version.post is not None:
            parts.append(".post{0}".format(self._version.post[1]))

        # Development release
        if self._version.dev is not None:
            parts.append(".dev{0}".format(self._version.dev[1]))

        # Local version segment
        if self._version.local is not None:
            parts.append(
                "+{0}".format(".".join(str(x) for x in self._version.local))
            )

        return "".join(parts)

    @property
    def public(self):
        return str(self).split("+", 1)[0]

    @property
    def base_version(self):
        parts = []

        # Epoch
        if self._version.epoch != 0:
            parts.append("{0}!".format(self._version.epoch))

        # Release segment
        parts.append(".".join(str(x) for x in self._version.release))

        return "".join(parts)

    @property
    def local(self):
        version_string = str(self)
        if "+" in version_string:
            return version_string.split("+", 1)[1]

    @property
    def is_prerelease(self):
        return bool(self._version.dev or self._version.pre)

    @property
    def is_postrelease(self):
        return bool(self._version.post)


def _parse_letter_version(letter, number):
    if letter:
        # We consider there to be an implicit 0 in a pre-release if there is
        # not a numeral associated with it.
        if number is None:
            number = 0

        # We normalize any letters to their lower case form
        letter = letter.lower()

        # We consider some words to be alternate spellings of other words and
        # in those cases we want to normalize the spellings to our preferred
        # spelling.
        if letter == "alpha":
            letter = "a"
        elif letter == "beta":
            letter = "b"
        elif letter in ["c", "pre", "preview"]:
            letter = "rc"
        elif letter in ["rev", "r"]:
            letter = "post"

        return letter, int(number)
    if not letter and number:
        # We assume if we are given a number, but we are not given a letter
        # then this is using the implicit post release syntax (e.g. 1.0-1)
        letter = "post"

        return letter, int(number)
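
# A minimal sketch: spellings are normalized, and a bare number is treated as
# an implicit post release.
#
#     >>> _parse_letter_version("alpha", None)
#     ('a', 0)
#     >>> _parse_letter_version(None, "1")
#     ('post', 1)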


_local_version_separators = re.compile(r"[\._-]")


def _parse_local_version(local):
    """
    Takes a string like abc.1.twelve and turns it into ("abc", 1, "twelve").
    """
    if local is not None:
        return tuple(
            part.lower() if not part.isdigit() else int(part)
            for part in _local_version_separators.split(local)
        )


def _cmpkey(epoch, release, pre, post, dev, local):
    # When we compare a release version, we want to compare it with all of the
    # trailing zeros removed. So we'll reverse the list, drop all the now
    # leading zeros until we come to something non-zero, then take the rest,
    # re-reverse it back into the correct order, make it a tuple, and use
    # that for our sorting key.
    release = tuple(
        reversed(list(
            itertools.dropwhile(
                lambda x: x == 0,
                reversed(release),
            )
        ))
    )

    # We need to "trick" the sorting algorithm to put 1.0.dev0 before 1.0a0.
    # We'll do this by abusing the pre segment, but we _only_ want to do this
    # if there is not a pre or a post segment. If we have one of those then
    # the normal sorting rules will handle this case correctly.
    if pre is None and post is None and dev is not None:
        pre = -Infinity
    # Versions without a pre-release (except as noted above) should sort after
    # those with one.
    elif pre is None:
        pre = Infinity

    # Versions without a post segment should sort before those with one.
    if post is None:
        post = -Infinity

    # Versions without a development segment should sort after those with one.
    if dev is None:
        dev = Infinity

    if local is None:
        # Versions without a local segment should sort before those with one.
        local = -Infinity
    else:
        # Versions with a local segment need that segment parsed to implement
        # the sorting rules in PEP440.
        # - Alpha numeric segments sort before numeric segments
        # - Alpha numeric segments sort lexicographically
        # - Numeric segments sort numerically
        # - Shorter versions sort before longer versions when the prefixes
        #   match exactly
        local = tuple(
            (i, "") if isinstance(i, int) else (-Infinity, i)
            for i in local
        )

    return epoch, release, pre, post, dev, local
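
# A consequence of the key construction above is the expected PEP 440
# ordering, for example:
#
#     >>> Version("1.0.dev0") < Version("1.0a1") < Version("1.0") < Version("1.0.post1")
#     True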
site-packages/setuptools/_vendor/packaging/__init__.py
# This file is dual licensed under the terms of the Apache License, Version
# 2.0, and the BSD License. See the LICENSE file in the root of this repository
# for complete details.
from __future__ import absolute_import, division, print_function

from .__about__ import (
    __author__, __copyright__, __email__, __license__, __summary__, __title__,
    __uri__, __version__
)

__all__ = [
    "__title__", "__summary__", "__uri__", "__version__", "__author__",
    "__email__", "__license__", "__copyright__",
]
site-packages/setuptools/_vendor/packaging/markers.py
# This file is dual licensed under the terms of the Apache License, Version
# 2.0, and the BSD License. See the LICENSE file in the root of this repository
# for complete details.
from __future__ import absolute_import, division, print_function

import operator
import os
import platform
import sys

from setuptools.extern.pyparsing import ParseException, ParseResults, stringStart, stringEnd
from setuptools.extern.pyparsing import ZeroOrMore, Group, Forward, QuotedString
from setuptools.extern.pyparsing import Literal as L  # noqa

from ._compat import string_types
from .specifiers import Specifier, InvalidSpecifier


__all__ = [
    "InvalidMarker", "UndefinedComparison", "UndefinedEnvironmentName",
    "Marker", "default_environment",
]


class InvalidMarker(ValueError):
    """
    An invalid marker was found, users should refer to PEP 508.
    """


class UndefinedComparison(ValueError):
    """
    An invalid operation was attempted on a value that doesn't support it.
    """


class UndefinedEnvironmentName(ValueError):
    """
    A name was attempted to be used that does not exist inside of the
    environment.
    """


class Node(object):

    def __init__(self, value):
        self.value = value

    def __str__(self):
        return str(self.value)

    def __repr__(self):
        return "<{0}({1!r})>".format(self.__class__.__name__, str(self))

    def serialize(self):
        raise NotImplementedError


class Variable(Node):

    def serialize(self):
        return str(self)


class Value(Node):

    def serialize(self):
        return '"{0}"'.format(self)


class Op(Node):

    def serialize(self):
        return str(self)


VARIABLE = (
    L("implementation_version") |
    L("platform_python_implementation") |
    L("implementation_name") |
    L("python_full_version") |
    L("platform_release") |
    L("platform_version") |
    L("platform_machine") |
    L("platform_system") |
    L("python_version") |
    L("sys_platform") |
    L("os_name") |
    L("os.name") |  # PEP-345
    L("sys.platform") |  # PEP-345
    L("platform.version") |  # PEP-345
    L("platform.machine") |  # PEP-345
    L("platform.python_implementation") |  # PEP-345
    L("python_implementation") |  # undocumented setuptools legacy
    L("extra")
)
ALIASES = {
    'os.name': 'os_name',
    'sys.platform': 'sys_platform',
    'platform.version': 'platform_version',
    'platform.machine': 'platform_machine',
    'platform.python_implementation': 'platform_python_implementation',
    'python_implementation': 'platform_python_implementation'
}
VARIABLE.setParseAction(lambda s, l, t: Variable(ALIASES.get(t[0], t[0])))

VERSION_CMP = (
    L("===") |
    L("==") |
    L(">=") |
    L("<=") |
    L("!=") |
    L("~=") |
    L(">") |
    L("<")
)

MARKER_OP = VERSION_CMP | L("not in") | L("in")
MARKER_OP.setParseAction(lambda s, l, t: Op(t[0]))

MARKER_VALUE = QuotedString("'") | QuotedString('"')
MARKER_VALUE.setParseAction(lambda s, l, t: Value(t[0]))

BOOLOP = L("and") | L("or")

MARKER_VAR = VARIABLE | MARKER_VALUE

MARKER_ITEM = Group(MARKER_VAR + MARKER_OP + MARKER_VAR)
MARKER_ITEM.setParseAction(lambda s, l, t: tuple(t[0]))

LPAREN = L("(").suppress()
RPAREN = L(")").suppress()

MARKER_EXPR = Forward()
MARKER_ATOM = MARKER_ITEM | Group(LPAREN + MARKER_EXPR + RPAREN)
MARKER_EXPR << MARKER_ATOM + ZeroOrMore(BOOLOP + MARKER_EXPR)

MARKER = stringStart + MARKER_EXPR + stringEnd


def _coerce_parse_result(results):
    if isinstance(results, ParseResults):
        return [_coerce_parse_result(i) for i in results]
    else:
        return results
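
# A minimal sketch: a parsed marker is a nested structure of Variable, Op and
# Value nodes.
#
#     >>> _coerce_parse_result(MARKER.parseString('os_name == "posix"'))
#     [(<Variable('os_name')>, <Op('==')>, <Value('posix')>)]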


def _format_marker(marker, first=True):
    assert isinstance(marker, (list, tuple, string_types))

    # Sometimes we have a structure like [[...]] which is a single item list
    # where the single item is itself its own list. In that case we want to skip
    # the rest of this function so that we don't get extraneous () on the
    # outside.
    if (isinstance(marker, list) and len(marker) == 1 and
            isinstance(marker[0], (list, tuple))):
        return _format_marker(marker[0])

    if isinstance(marker, list):
        inner = (_format_marker(m, first=False) for m in marker)
        if first:
            return " ".join(inner)
        else:
            return "(" + " ".join(inner) + ")"
    elif isinstance(marker, tuple):
        return " ".join([m.serialize() for m in marker])
    else:
        return marker


_operators = {
    "in": lambda lhs, rhs: lhs in rhs,
    "not in": lambda lhs, rhs: lhs not in rhs,
    "<": operator.lt,
    "<=": operator.le,
    "==": operator.eq,
    "!=": operator.ne,
    ">=": operator.ge,
    ">": operator.gt,
}


def _eval_op(lhs, op, rhs):
    try:
        spec = Specifier("".join([op.serialize(), rhs]))
    except InvalidSpecifier:
        pass
    else:
        return spec.contains(lhs)

    oper = _operators.get(op.serialize())
    if oper is None:
        raise UndefinedComparison(
            "Undefined {0!r} on {1!r} and {2!r}.".format(op, lhs, rhs)
        )

    return oper(lhs, rhs)


_undefined = object()


def _get_env(environment, name):
    value = environment.get(name, _undefined)

    if value is _undefined:
        raise UndefinedEnvironmentName(
            "{0!r} does not exist in evaluation environment.".format(name)
        )

    return value


def _evaluate_markers(markers, environment):
    groups = [[]]

    for marker in markers:
        assert isinstance(marker, (list, tuple, string_types))

        if isinstance(marker, list):
            groups[-1].append(_evaluate_markers(marker, environment))
        elif isinstance(marker, tuple):
            lhs, op, rhs = marker

            if isinstance(lhs, Variable):
                lhs_value = _get_env(environment, lhs.value)
                rhs_value = rhs.value
            else:
                lhs_value = lhs.value
                rhs_value = _get_env(environment, rhs.value)

            groups[-1].append(_eval_op(lhs_value, op, rhs_value))
        else:
            assert marker in ["and", "or"]
            if marker == "or":
                groups.append([])

    return any(all(item) for item in groups)


def format_full_version(info):
    version = '{0.major}.{0.minor}.{0.micro}'.format(info)
    kind = info.releaselevel
    if kind != 'final':
        version += kind[0] + str(info.serial)
    return version


def default_environment():
    if hasattr(sys, 'implementation'):
        iver = format_full_version(sys.implementation.version)
        implementation_name = sys.implementation.name
    else:
        iver = '0'
        implementation_name = ''

    return {
        "implementation_name": implementation_name,
        "implementation_version": iver,
        "os_name": os.name,
        "platform_machine": platform.machine(),
        "platform_release": platform.release(),
        "platform_system": platform.system(),
        "platform_version": platform.version(),
        "python_full_version": platform.python_version(),
        "platform_python_implementation": platform.python_implementation(),
        "python_version": platform.python_version()[:3],
        "sys_platform": sys.platform,
    }


class Marker(object):

    def __init__(self, marker):
        try:
            self._markers = _coerce_parse_result(MARKER.parseString(marker))
        except ParseException as e:
            err_str = "Invalid marker: {0!r}, parse error at {1!r}".format(
                marker, marker[e.loc:e.loc + 8])
            raise InvalidMarker(err_str)

    def __str__(self):
        return _format_marker(self._markers)

    def __repr__(self):
        return "<Marker({0!r})>".format(str(self))

    def evaluate(self, environment=None):
        """Evaluate a marker.

        Return the boolean from evaluating the given marker against the
        environment. environment is an optional argument to override all or
        part of the determined environment.

        The environment is determined from the current Python process.
        """
        current_environment = default_environment()
        if environment is not None:
            current_environment.update(environment)

        return _evaluate_markers(self._markers, current_environment)
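
    # A minimal sketch (on a Python 3 interpreter): the override only needs to
    # supply the keys it changes; everything else comes from
    # default_environment().
    #
    #     >>> Marker('python_version >= "2.7"').evaluate()
    #     True
    #     >>> Marker('os_name == "nt"').evaluate({"os_name": "posix"})
    #     False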
site-packages/setuptools/_vendor/six.py
"""Utilities for writing code that runs on Python 2 and 3"""

# Copyright (c) 2010-2015 Benjamin Peterson
#
# Permission is hereby granted, free of charge, to any person obtaining a copy
# of this software and associated documentation files (the "Software"), to deal
# in the Software without restriction, including without limitation the rights
# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
# copies of the Software, and to permit persons to whom the Software is
# furnished to do so, subject to the following conditions:
#
# The above copyright notice and this permission notice shall be included in all
# copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
# SOFTWARE.

from __future__ import absolute_import

import functools
import itertools
import operator
import sys
import types

__author__ = "Benjamin Peterson <benjamin@python.org>"
__version__ = "1.10.0"


# Useful for very coarse version differentiation.
PY2 = sys.version_info[0] == 2
PY3 = sys.version_info[0] == 3
PY34 = sys.version_info[0:2] >= (3, 4)

if PY3:
    string_types = str,
    integer_types = int,
    class_types = type,
    text_type = str
    binary_type = bytes

    MAXSIZE = sys.maxsize
else:
    string_types = basestring,
    integer_types = (int, long)
    class_types = (type, types.ClassType)
    text_type = unicode
    binary_type = str

    if sys.platform.startswith("java"):
        # Jython always uses 32 bits.
        MAXSIZE = int((1 << 31) - 1)
    else:
        # It's possible to have sizeof(long) != sizeof(Py_ssize_t).
        class X(object):

            def __len__(self):
                return 1 << 31
        try:
            len(X())
        except OverflowError:
            # 32-bit
            MAXSIZE = int((1 << 31) - 1)
        else:
            # 64-bit
            MAXSIZE = int((1 << 63) - 1)
        del X


def _add_doc(func, doc):
    """Add documentation to a function."""
    func.__doc__ = doc


def _import_module(name):
    """Import module, returning the module after the last dot."""
    __import__(name)
    return sys.modules[name]


class _LazyDescr(object):

    def __init__(self, name):
        self.name = name

    def __get__(self, obj, tp):
        result = self._resolve()
        setattr(obj, self.name, result)  # Invokes __set__.
        try:
            # This is a bit ugly, but it avoids running this again by
            # removing this descriptor.
            delattr(obj.__class__, self.name)
        except AttributeError:
            pass
        return result


class MovedModule(_LazyDescr):

    def __init__(self, name, old, new=None):
        super(MovedModule, self).__init__(name)
        if PY3:
            if new is None:
                new = name
            self.mod = new
        else:
            self.mod = old

    def _resolve(self):
        return _import_module(self.mod)

    def __getattr__(self, attr):
        _module = self._resolve()
        value = getattr(_module, attr)
        setattr(self, attr, value)
        return value


class _LazyModule(types.ModuleType):

    def __init__(self, name):
        super(_LazyModule, self).__init__(name)
        self.__doc__ = self.__class__.__doc__

    def __dir__(self):
        attrs = ["__doc__", "__name__"]
        attrs += [attr.name for attr in self._moved_attributes]
        return attrs

    # Subclasses should override this
    _moved_attributes = []


class MovedAttribute(_LazyDescr):

    def __init__(self, name, old_mod, new_mod, old_attr=None, new_attr=None):
        super(MovedAttribute, self).__init__(name)
        if PY3:
            if new_mod is None:
                new_mod = name
            self.mod = new_mod
            if new_attr is None:
                if old_attr is None:
                    new_attr = name
                else:
                    new_attr = old_attr
            self.attr = new_attr
        else:
            self.mod = old_mod
            if old_attr is None:
                old_attr = name
            self.attr = old_attr

    def _resolve(self):
        module = _import_module(self.mod)
        return getattr(module, self.attr)


class _SixMetaPathImporter(object):

    """
    A meta path importer to import six.moves and its submodules.

    This class implements a PEP 302 finder and loader. It should be compatible
    with Python 2.5 and all existing versions of Python 3.
    """

    def __init__(self, six_module_name):
        self.name = six_module_name
        self.known_modules = {}

    def _add_module(self, mod, *fullnames):
        for fullname in fullnames:
            self.known_modules[self.name + "." + fullname] = mod

    def _get_module(self, fullname):
        return self.known_modules[self.name + "." + fullname]

    def find_module(self, fullname, path=None):
        if fullname in self.known_modules:
            return self
        return None

    def __get_module(self, fullname):
        try:
            return self.known_modules[fullname]
        except KeyError:
            raise ImportError("This loader does not know module " + fullname)

    def load_module(self, fullname):
        try:
            # in case of a reload
            return sys.modules[fullname]
        except KeyError:
            pass
        mod = self.__get_module(fullname)
        if isinstance(mod, MovedModule):
            mod = mod._resolve()
        else:
            mod.__loader__ = self
        sys.modules[fullname] = mod
        return mod

    def is_package(self, fullname):
        """
        Return true if the named module is a package.

        We need this method to get correct spec objects with
        Python 3.4 (see PEP 451)
        """
        return hasattr(self.__get_module(fullname), "__path__")

    def get_code(self, fullname):
        """Return None

        Required, if is_package is implemented"""
        self.__get_module(fullname)  # eventually raises ImportError
        return None
    get_source = get_code  # same as get_code

_importer = _SixMetaPathImporter(__name__)


class _MovedItems(_LazyModule):

    """Lazy loading of moved objects"""
    __path__ = []  # mark as package


_moved_attributes = [
    MovedAttribute("cStringIO", "cStringIO", "io", "StringIO"),
    MovedAttribute("filter", "itertools", "builtins", "ifilter", "filter"),
    MovedAttribute("filterfalse", "itertools", "itertools", "ifilterfalse", "filterfalse"),
    MovedAttribute("input", "__builtin__", "builtins", "raw_input", "input"),
    MovedAttribute("intern", "__builtin__", "sys"),
    MovedAttribute("map", "itertools", "builtins", "imap", "map"),
    MovedAttribute("getcwd", "os", "os", "getcwdu", "getcwd"),
    MovedAttribute("getcwdb", "os", "os", "getcwd", "getcwdb"),
    MovedAttribute("range", "__builtin__", "builtins", "xrange", "range"),
    MovedAttribute("reload_module", "__builtin__", "importlib" if PY34 else "imp", "reload"),
    MovedAttribute("reduce", "__builtin__", "functools"),
    MovedAttribute("shlex_quote", "pipes", "shlex", "quote"),
    MovedAttribute("StringIO", "StringIO", "io"),
    MovedAttribute("UserDict", "UserDict", "collections"),
    MovedAttribute("UserList", "UserList", "collections"),
    MovedAttribute("UserString", "UserString", "collections"),
    MovedAttribute("xrange", "__builtin__", "builtins", "xrange", "range"),
    MovedAttribute("zip", "itertools", "builtins", "izip", "zip"),
    MovedAttribute("zip_longest", "itertools", "itertools", "izip_longest", "zip_longest"),
    MovedModule("builtins", "__builtin__"),
    MovedModule("configparser", "ConfigParser"),
    MovedModule("copyreg", "copy_reg"),
    MovedModule("dbm_gnu", "gdbm", "dbm.gnu"),
    MovedModule("_dummy_thread", "dummy_thread", "_dummy_thread"),
    MovedModule("http_cookiejar", "cookielib", "http.cookiejar"),
    MovedModule("http_cookies", "Cookie", "http.cookies"),
    MovedModule("html_entities", "htmlentitydefs", "html.entities"),
    MovedModule("html_parser", "HTMLParser", "html.parser"),
    MovedModule("http_client", "httplib", "http.client"),
    MovedModule("email_mime_multipart", "email.MIMEMultipart", "email.mime.multipart"),
    MovedModule("email_mime_nonmultipart", "email.MIMENonMultipart", "email.mime.nonmultipart"),
    MovedModule("email_mime_text", "email.MIMEText", "email.mime.text"),
    MovedModule("email_mime_base", "email.MIMEBase", "email.mime.base"),
    MovedModule("BaseHTTPServer", "BaseHTTPServer", "http.server"),
    MovedModule("CGIHTTPServer", "CGIHTTPServer", "http.server"),
    MovedModule("SimpleHTTPServer", "SimpleHTTPServer", "http.server"),
    MovedModule("cPickle", "cPickle", "pickle"),
    MovedModule("queue", "Queue"),
    MovedModule("reprlib", "repr"),
    MovedModule("socketserver", "SocketServer"),
    MovedModule("_thread", "thread", "_thread"),
    MovedModule("tkinter", "Tkinter"),
    MovedModule("tkinter_dialog", "Dialog", "tkinter.dialog"),
    MovedModule("tkinter_filedialog", "FileDialog", "tkinter.filedialog"),
    MovedModule("tkinter_scrolledtext", "ScrolledText", "tkinter.scrolledtext"),
    MovedModule("tkinter_simpledialog", "SimpleDialog", "tkinter.simpledialog"),
    MovedModule("tkinter_tix", "Tix", "tkinter.tix"),
    MovedModule("tkinter_ttk", "ttk", "tkinter.ttk"),
    MovedModule("tkinter_constants", "Tkconstants", "tkinter.constants"),
    MovedModule("tkinter_dnd", "Tkdnd", "tkinter.dnd"),
    MovedModule("tkinter_colorchooser", "tkColorChooser",
                "tkinter.colorchooser"),
    MovedModule("tkinter_commondialog", "tkCommonDialog",
                "tkinter.commondialog"),
    MovedModule("tkinter_tkfiledialog", "tkFileDialog", "tkinter.filedialog"),
    MovedModule("tkinter_font", "tkFont", "tkinter.font"),
    MovedModule("tkinter_messagebox", "tkMessageBox", "tkinter.messagebox"),
    MovedModule("tkinter_tksimpledialog", "tkSimpleDialog",
                "tkinter.simpledialog"),
    MovedModule("urllib_parse", __name__ + ".moves.urllib_parse", "urllib.parse"),
    MovedModule("urllib_error", __name__ + ".moves.urllib_error", "urllib.error"),
    MovedModule("urllib", __name__ + ".moves.urllib", __name__ + ".moves.urllib"),
    MovedModule("urllib_robotparser", "robotparser", "urllib.robotparser"),
    MovedModule("xmlrpc_client", "xmlrpclib", "xmlrpc.client"),
    MovedModule("xmlrpc_server", "SimpleXMLRPCServer", "xmlrpc.server"),
]
# Add windows specific modules.
if sys.platform == "win32":
    _moved_attributes += [
        MovedModule("winreg", "_winreg"),
    ]

for attr in _moved_attributes:
    setattr(_MovedItems, attr.name, attr)
    if isinstance(attr, MovedModule):
        _importer._add_module(attr, "moves." + attr.name)
del attr

_MovedItems._moved_attributes = _moved_attributes

moves = _MovedItems(__name__ + ".moves")
_importer._add_module(moves, "moves")
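# Usage sketch (comments only, illustrative): once the _SixMetaPathImporter
# defined above is appended to sys.meta_path (done at the bottom of this
# module), the registered moves become importable under the six namespace on
# both Python 2 and Python 3, e.g.:
#
#     from six.moves import range, queue
#     from six.moves.urllib.parse import urlparse
#
# Each name is resolved lazily through MovedAttribute/MovedModule, so the
# underlying module is only imported on first access.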


class Module_six_moves_urllib_parse(_LazyModule):

    """Lazy loading of moved objects in six.moves.urllib_parse"""


_urllib_parse_moved_attributes = [
    MovedAttribute("ParseResult", "urlparse", "urllib.parse"),
    MovedAttribute("SplitResult", "urlparse", "urllib.parse"),
    MovedAttribute("parse_qs", "urlparse", "urllib.parse"),
    MovedAttribute("parse_qsl", "urlparse", "urllib.parse"),
    MovedAttribute("urldefrag", "urlparse", "urllib.parse"),
    MovedAttribute("urljoin", "urlparse", "urllib.parse"),
    MovedAttribute("urlparse", "urlparse", "urllib.parse"),
    MovedAttribute("urlsplit", "urlparse", "urllib.parse"),
    MovedAttribute("urlunparse", "urlparse", "urllib.parse"),
    MovedAttribute("urlunsplit", "urlparse", "urllib.parse"),
    MovedAttribute("quote", "urllib", "urllib.parse"),
    MovedAttribute("quote_plus", "urllib", "urllib.parse"),
    MovedAttribute("unquote", "urllib", "urllib.parse"),
    MovedAttribute("unquote_plus", "urllib", "urllib.parse"),
    MovedAttribute("urlencode", "urllib", "urllib.parse"),
    MovedAttribute("splitquery", "urllib", "urllib.parse"),
    MovedAttribute("splittag", "urllib", "urllib.parse"),
    MovedAttribute("splituser", "urllib", "urllib.parse"),
    MovedAttribute("uses_fragment", "urlparse", "urllib.parse"),
    MovedAttribute("uses_netloc", "urlparse", "urllib.parse"),
    MovedAttribute("uses_params", "urlparse", "urllib.parse"),
    MovedAttribute("uses_query", "urlparse", "urllib.parse"),
    MovedAttribute("uses_relative", "urlparse", "urllib.parse"),
]
for attr in _urllib_parse_moved_attributes:
    setattr(Module_six_moves_urllib_parse, attr.name, attr)
del attr

Module_six_moves_urllib_parse._moved_attributes = _urllib_parse_moved_attributes

_importer._add_module(Module_six_moves_urllib_parse(__name__ + ".moves.urllib_parse"),
                      "moves.urllib_parse", "moves.urllib.parse")


class Module_six_moves_urllib_error(_LazyModule):

    """Lazy loading of moved objects in six.moves.urllib_error"""


_urllib_error_moved_attributes = [
    MovedAttribute("URLError", "urllib2", "urllib.error"),
    MovedAttribute("HTTPError", "urllib2", "urllib.error"),
    MovedAttribute("ContentTooShortError", "urllib", "urllib.error"),
]
for attr in _urllib_error_moved_attributes:
    setattr(Module_six_moves_urllib_error, attr.name, attr)
del attr

Module_six_moves_urllib_error._moved_attributes = _urllib_error_moved_attributes

_importer._add_module(Module_six_moves_urllib_error(__name__ + ".moves.urllib.error"),
                      "moves.urllib_error", "moves.urllib.error")


class Module_six_moves_urllib_request(_LazyModule):

    """Lazy loading of moved objects in six.moves.urllib_request"""


_urllib_request_moved_attributes = [
    MovedAttribute("urlopen", "urllib2", "urllib.request"),
    MovedAttribute("install_opener", "urllib2", "urllib.request"),
    MovedAttribute("build_opener", "urllib2", "urllib.request"),
    MovedAttribute("pathname2url", "urllib", "urllib.request"),
    MovedAttribute("url2pathname", "urllib", "urllib.request"),
    MovedAttribute("getproxies", "urllib", "urllib.request"),
    MovedAttribute("Request", "urllib2", "urllib.request"),
    MovedAttribute("OpenerDirector", "urllib2", "urllib.request"),
    MovedAttribute("HTTPDefaultErrorHandler", "urllib2", "urllib.request"),
    MovedAttribute("HTTPRedirectHandler", "urllib2", "urllib.request"),
    MovedAttribute("HTTPCookieProcessor", "urllib2", "urllib.request"),
    MovedAttribute("ProxyHandler", "urllib2", "urllib.request"),
    MovedAttribute("BaseHandler", "urllib2", "urllib.request"),
    MovedAttribute("HTTPPasswordMgr", "urllib2", "urllib.request"),
    MovedAttribute("HTTPPasswordMgrWithDefaultRealm", "urllib2", "urllib.request"),
    MovedAttribute("AbstractBasicAuthHandler", "urllib2", "urllib.request"),
    MovedAttribute("HTTPBasicAuthHandler", "urllib2", "urllib.request"),
    MovedAttribute("ProxyBasicAuthHandler", "urllib2", "urllib.request"),
    MovedAttribute("AbstractDigestAuthHandler", "urllib2", "urllib.request"),
    MovedAttribute("HTTPDigestAuthHandler", "urllib2", "urllib.request"),
    MovedAttribute("ProxyDigestAuthHandler", "urllib2", "urllib.request"),
    MovedAttribute("HTTPHandler", "urllib2", "urllib.request"),
    MovedAttribute("HTTPSHandler", "urllib2", "urllib.request"),
    MovedAttribute("FileHandler", "urllib2", "urllib.request"),
    MovedAttribute("FTPHandler", "urllib2", "urllib.request"),
    MovedAttribute("CacheFTPHandler", "urllib2", "urllib.request"),
    MovedAttribute("UnknownHandler", "urllib2", "urllib.request"),
    MovedAttribute("HTTPErrorProcessor", "urllib2", "urllib.request"),
    MovedAttribute("urlretrieve", "urllib", "urllib.request"),
    MovedAttribute("urlcleanup", "urllib", "urllib.request"),
    MovedAttribute("URLopener", "urllib", "urllib.request"),
    MovedAttribute("FancyURLopener", "urllib", "urllib.request"),
    MovedAttribute("proxy_bypass", "urllib", "urllib.request"),
]
for attr in _urllib_request_moved_attributes:
    setattr(Module_six_moves_urllib_request, attr.name, attr)
del attr

Module_six_moves_urllib_request._moved_attributes = _urllib_request_moved_attributes

_importer._add_module(Module_six_moves_urllib_request(__name__ + ".moves.urllib.request"),
                      "moves.urllib_request", "moves.urllib.request")


class Module_six_moves_urllib_response(_LazyModule):

    """Lazy loading of moved objects in six.moves.urllib_response"""


_urllib_response_moved_attributes = [
    MovedAttribute("addbase", "urllib", "urllib.response"),
    MovedAttribute("addclosehook", "urllib", "urllib.response"),
    MovedAttribute("addinfo", "urllib", "urllib.response"),
    MovedAttribute("addinfourl", "urllib", "urllib.response"),
]
for attr in _urllib_response_moved_attributes:
    setattr(Module_six_moves_urllib_response, attr.name, attr)
del attr

Module_six_moves_urllib_response._moved_attributes = _urllib_response_moved_attributes

_importer._add_module(Module_six_moves_urllib_response(__name__ + ".moves.urllib.response"),
                      "moves.urllib_response", "moves.urllib.response")


class Module_six_moves_urllib_robotparser(_LazyModule):

    """Lazy loading of moved objects in six.moves.urllib_robotparser"""


_urllib_robotparser_moved_attributes = [
    MovedAttribute("RobotFileParser", "robotparser", "urllib.robotparser"),
]
for attr in _urllib_robotparser_moved_attributes:
    setattr(Module_six_moves_urllib_robotparser, attr.name, attr)
del attr

Module_six_moves_urllib_robotparser._moved_attributes = _urllib_robotparser_moved_attributes

_importer._add_module(Module_six_moves_urllib_robotparser(__name__ + ".moves.urllib.robotparser"),
                      "moves.urllib_robotparser", "moves.urllib.robotparser")


class Module_six_moves_urllib(types.ModuleType):

    """Create a six.moves.urllib namespace that resembles the Python 3 namespace"""
    __path__ = []  # mark as package
    parse = _importer._get_module("moves.urllib_parse")
    error = _importer._get_module("moves.urllib_error")
    request = _importer._get_module("moves.urllib_request")
    response = _importer._get_module("moves.urllib_response")
    robotparser = _importer._get_module("moves.urllib_robotparser")

    def __dir__(self):
        return ['parse', 'error', 'request', 'response', 'robotparser']

_importer._add_module(Module_six_moves_urllib(__name__ + ".moves.urllib"),
                      "moves.urllib")


def add_move(move):
    """Add an item to six.moves."""
    setattr(_MovedItems, move.name, move)


def remove_move(name):
    """Remove item from six.moves."""
    try:
        delattr(_MovedItems, name)
    except AttributeError:
        try:
            del moves.__dict__[name]
        except KeyError:
            raise AttributeError("no such move, %r" % (name,))
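# Usage sketch (comments only, illustrative): client code can extend six.moves
# at runtime; the "mock" example below assumes the third-party mock package is
# installed on Python 2:
#
#     import six
#     six.add_move(six.MovedModule("mock", "mock", "unittest.mock"))
#     from six.moves import mock     # mock on Py2, unittest.mock on Py3
#     six.remove_move("mock")        # undo the registration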


if PY3:
    _meth_func = "__func__"
    _meth_self = "__self__"

    _func_closure = "__closure__"
    _func_code = "__code__"
    _func_defaults = "__defaults__"
    _func_globals = "__globals__"
else:
    _meth_func = "im_func"
    _meth_self = "im_self"

    _func_closure = "func_closure"
    _func_code = "func_code"
    _func_defaults = "func_defaults"
    _func_globals = "func_globals"


try:
    advance_iterator = next
except NameError:
    def advance_iterator(it):
        return it.next()
next = advance_iterator


try:
    callable = callable
except NameError:
    def callable(obj):
        return any("__call__" in klass.__dict__ for klass in type(obj).__mro__)


if PY3:
    def get_unbound_function(unbound):
        return unbound

    create_bound_method = types.MethodType

    def create_unbound_method(func, cls):
        return func

    Iterator = object
else:
    def get_unbound_function(unbound):
        return unbound.im_func

    def create_bound_method(func, obj):
        return types.MethodType(func, obj, obj.__class__)

    def create_unbound_method(func, cls):
        return types.MethodType(func, None, cls)

    class Iterator(object):

        def next(self):
            return type(self).__next__(self)

    callable = callable
_add_doc(get_unbound_function,
         """Get the function out of a possibly unbound function""")


get_method_function = operator.attrgetter(_meth_func)
get_method_self = operator.attrgetter(_meth_self)
get_function_closure = operator.attrgetter(_func_closure)
get_function_code = operator.attrgetter(_func_code)
get_function_defaults = operator.attrgetter(_func_defaults)
get_function_globals = operator.attrgetter(_func_globals)
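# Usage sketch (comments only, illustrative; 'f' and 'C' are placeholder names):
#
#     def f(x=1):
#         return x
#
#     get_function_code(f)        # f.func_code on Py2, f.__code__ on Py3
#     get_function_defaults(f)    # (1,) on both versions
#
#     class C(object):
#         def m(self):
#             return self
#
#     get_method_function(C().m)  # the underlying plain function
#     get_method_self(C().m)      # the instance the method is bound to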


if PY3:
    def iterkeys(d, **kw):
        return iter(d.keys(**kw))

    def itervalues(d, **kw):
        return iter(d.values(**kw))

    def iteritems(d, **kw):
        return iter(d.items(**kw))

    def iterlists(d, **kw):
        return iter(d.lists(**kw))

    viewkeys = operator.methodcaller("keys")

    viewvalues = operator.methodcaller("values")

    viewitems = operator.methodcaller("items")
else:
    def iterkeys(d, **kw):
        return d.iterkeys(**kw)

    def itervalues(d, **kw):
        return d.itervalues(**kw)

    def iteritems(d, **kw):
        return d.iteritems(**kw)

    def iterlists(d, **kw):
        return d.iterlists(**kw)

    viewkeys = operator.methodcaller("viewkeys")

    viewvalues = operator.methodcaller("viewvalues")

    viewitems = operator.methodcaller("viewitems")

_add_doc(iterkeys, "Return an iterator over the keys of a dictionary.")
_add_doc(itervalues, "Return an iterator over the values of a dictionary.")
_add_doc(iteritems,
         "Return an iterator over the (key, value) pairs of a dictionary.")
_add_doc(iterlists,
         "Return an iterator over the (key, [values]) pairs of a dictionary.")


if PY3:
    def b(s):
        return s.encode("latin-1")

    def u(s):
        return s
    unichr = chr
    import struct
    int2byte = struct.Struct(">B").pack
    del struct
    byte2int = operator.itemgetter(0)
    indexbytes = operator.getitem
    iterbytes = iter
    import io
    StringIO = io.StringIO
    BytesIO = io.BytesIO
    _assertCountEqual = "assertCountEqual"
    if sys.version_info[1] <= 1:
        _assertRaisesRegex = "assertRaisesRegexp"
        _assertRegex = "assertRegexpMatches"
    else:
        _assertRaisesRegex = "assertRaisesRegex"
        _assertRegex = "assertRegex"
else:
    def b(s):
        return s
    # Workaround for standalone backslash

    def u(s):
        return unicode(s.replace(r'\\', r'\\\\'), "unicode_escape")
    unichr = unichr
    int2byte = chr

    def byte2int(bs):
        return ord(bs[0])

    def indexbytes(buf, i):
        return ord(buf[i])
    iterbytes = functools.partial(itertools.imap, ord)
    import StringIO
    StringIO = BytesIO = StringIO.StringIO
    _assertCountEqual = "assertItemsEqual"
    _assertRaisesRegex = "assertRaisesRegexp"
    _assertRegex = "assertRegexpMatches"
_add_doc(b, """Byte literal""")
_add_doc(u, """Text literal""")
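# Usage sketch (comments only, illustrative): b() and u() stand in for the
# b'' / u'' literal prefixes that are not accepted by every supported release:
#
#     data = b("\x00\x01")    # bytes on Python 3, str on Python 2
#     text = u("caf\u00e9")   # text type on both (escape decoded on Py2 via unicode_escape)
#     byte2int(data)          # 0 on both, despite the bytes-indexing difference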


def assertCountEqual(self, *args, **kwargs):
    return getattr(self, _assertCountEqual)(*args, **kwargs)


def assertRaisesRegex(self, *args, **kwargs):
    return getattr(self, _assertRaisesRegex)(*args, **kwargs)


def assertRegex(self, *args, **kwargs):
    return getattr(self, _assertRegex)(*args, **kwargs)
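# Usage sketch (comments only, illustrative): inside a unittest.TestCase method
# these wrappers dispatch to whichever assertion name the running Python has:
#
#     assertCountEqual(self, [1, 2, 2], [2, 1, 2])
#     assertRaisesRegex(self, ValueError, "invalid literal", int, "x")
#     assertRegex(self, "abc-123", r"\d+")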


if PY3:
    exec_ = getattr(moves.builtins, "exec")

    def reraise(tp, value, tb=None):
        if value is None:
            value = tp()
        if value.__traceback__ is not tb:
            raise value.with_traceback(tb)
        raise value

else:
    def exec_(_code_, _globs_=None, _locs_=None):
        """Execute code in a namespace."""
        if _globs_ is None:
            frame = sys._getframe(1)
            _globs_ = frame.f_globals
            if _locs_ is None:
                _locs_ = frame.f_locals
            del frame
        elif _locs_ is None:
            _locs_ = _globs_
        exec("""exec _code_ in _globs_, _locs_""")

    exec_("""def reraise(tp, value, tb=None):
    raise tp, value, tb
""")


if sys.version_info[:2] == (3, 2):
    exec_("""def raise_from(value, from_value):
    if from_value is None:
        raise value
    raise value from from_value
""")
elif sys.version_info[:2] > (3, 2):
    exec_("""def raise_from(value, from_value):
    raise value from from_value
""")
else:
    def raise_from(value, from_value):
        raise value
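# Usage sketch (comments only, illustrative): reraise() and raise_from() give a
# portable spelling for traceback preservation and exception chaining:
#
#     try:
#         {}["missing"]
#     except KeyError as exc:
#         raise_from(ValueError("bad configuration"), exc)  # 'raise ... from' on Py3
#
#     # re-raise the active exception with its original traceback:
#     #     reraise(*sys.exc_info())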


print_ = getattr(moves.builtins, "print", None)
if print_ is None:
    def print_(*args, **kwargs):
        """The new-style print function for Python 2.4 and 2.5."""
        fp = kwargs.pop("file", sys.stdout)
        if fp is None:
            return

        def write(data):
            if not isinstance(data, basestring):
                data = str(data)
            # If the file has an encoding, encode unicode with it.
            if (isinstance(fp, file) and
                    isinstance(data, unicode) and
                    fp.encoding is not None):
                errors = getattr(fp, "errors", None)
                if errors is None:
                    errors = "strict"
                data = data.encode(fp.encoding, errors)
            fp.write(data)
        want_unicode = False
        sep = kwargs.pop("sep", None)
        if sep is not None:
            if isinstance(sep, unicode):
                want_unicode = True
            elif not isinstance(sep, str):
                raise TypeError("sep must be None or a string")
        end = kwargs.pop("end", None)
        if end is not None:
            if isinstance(end, unicode):
                want_unicode = True
            elif not isinstance(end, str):
                raise TypeError("end must be None or a string")
        if kwargs:
            raise TypeError("invalid keyword arguments to print()")
        if not want_unicode:
            for arg in args:
                if isinstance(arg, unicode):
                    want_unicode = True
                    break
        if want_unicode:
            newline = unicode("\n")
            space = unicode(" ")
        else:
            newline = "\n"
            space = " "
        if sep is None:
            sep = space
        if end is None:
            end = newline
        for i, arg in enumerate(args):
            if i:
                write(sep)
            write(arg)
        write(end)
if sys.version_info[:2] < (3, 3):
    _print = print_

    def print_(*args, **kwargs):
        fp = kwargs.get("file", sys.stdout)
        flush = kwargs.pop("flush", False)
        _print(*args, **kwargs)
        if flush and fp is not None:
            fp.flush()
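# Usage sketch (comments only, illustrative): print_() mirrors the Python 3
# print() builtin, including the flush keyword added in 3.3:
#
#     print_("answer:", 42, sep=" ", end="\n", file=sys.stderr, flush=True)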

_add_doc(reraise, """Reraise an exception.""")

if sys.version_info[0:2] < (3, 4):
    def wraps(wrapped, assigned=functools.WRAPPER_ASSIGNMENTS,
              updated=functools.WRAPPER_UPDATES):
        def wrapper(f):
            f = functools.wraps(wrapped, assigned, updated)(f)
            f.__wrapped__ = wrapped
            return f
        return wrapper
else:
    wraps = functools.wraps


def with_metaclass(meta, *bases):
    """Create a base class with a metaclass."""
    # This requires a bit of explanation: the basic idea is to make a dummy
    # metaclass for one level of class instantiation that replaces itself with
    # the actual metaclass.
    class metaclass(meta):

        def __new__(cls, name, this_bases, d):
            return meta(name, bases, d)
    return type.__new__(metaclass, 'temporary_class', (), {})
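# Usage sketch (comments only, illustrative; 'Meta' is a placeholder name):
#
#     class Meta(type):
#         pass
#
#     class MyClass(with_metaclass(Meta, object)):
#         pass
#
# type(MyClass) is Meta on both Python 2 and Python 3, without writing either
# the __metaclass__ attribute or the 'metaclass=' keyword.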


def add_metaclass(metaclass):
    """Class decorator for creating a class with a metaclass."""
    def wrapper(cls):
        orig_vars = cls.__dict__.copy()
        slots = orig_vars.get('__slots__')
        if slots is not None:
            if isinstance(slots, str):
                slots = [slots]
            for slots_var in slots:
                orig_vars.pop(slots_var)
        orig_vars.pop('__dict__', None)
        orig_vars.pop('__weakref__', None)
        return metaclass(cls.__name__, cls.__bases__, orig_vars)
    return wrapper
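# Usage sketch (comments only, illustrative): the decorator form of the same
# idea; note that the class is re-created with the given metaclass:
#
#     @add_metaclass(Meta)
#     class MyOtherClass(object):
#         pass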


def python_2_unicode_compatible(klass):
    """
    A decorator that defines __unicode__ and __str__ methods under Python 2.
    Under Python 3 it does nothing.

    To support Python 2 and 3 with a single code base, define a __str__ method
    returning text and apply this decorator to the class.
    """
    if PY2:
        if '__str__' not in klass.__dict__:
            raise ValueError("@python_2_unicode_compatible cannot be applied "
                             "to %s because it doesn't define __str__()." %
                             klass.__name__)
        klass.__unicode__ = klass.__str__
        klass.__str__ = lambda self: self.__unicode__().encode('utf-8')
    return klass
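# Usage sketch (comments only, illustrative; 'Person' is a placeholder name):
#
#     @python_2_unicode_compatible
#     class Person(object):
#         def __str__(self):
#             return u"n\u00e4me"
#
# On Python 2 the decorator copies __str__ to __unicode__ and installs a
# UTF-8-encoding __str__; on Python 3 the class is returned unchanged.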


# Complete the moves implementation.
# This code is at the end of this module to speed up module loading.
# Turn this module into a package.
__path__ = []  # required for PEP 302 and PEP 451
__package__ = __name__  # see PEP 366 @ReservedAssignment
if globals().get("__spec__") is not None:
    __spec__.submodule_search_locations = []  # PEP 451 @UndefinedVariable
# Remove other six meta path importers, since they cause problems. This can
# happen if six is removed from sys.modules and then reloaded. (Setuptools does
# this for some reason.)
if sys.meta_path:
    for i, importer in enumerate(sys.meta_path):
        # Here's some real nastiness: Another "instance" of the six module might
        # be floating around. Therefore, we can't use isinstance() to check for
        # the six meta path importer, since the other six instance will have
        # inserted an importer with different class.
        if (type(importer).__name__ == "_SixMetaPathImporter" and
                importer.name == __name__):
            del sys.meta_path[i]
            break
    del i, importer
# Finally, add the importer to the meta path import hook.
sys.meta_path.append(_importer)
site-packages/setuptools/_vendor/__pycache__/pyparsing.cpython-36.pyc000064400000610513147511334630022030 0ustar00
[binary data: compiled CPython 3.6 bytecode for pyparsing 2.1.10 (Paul McGuire, 07 Oct 2016); the embedded module docstring describes pyparsing as a pure-Python alternative to lex/yacc and regular expressions for defining and executing grammars; the remaining marshalled bytecode is not representable as text]
            # change to just treat newline as significant
            ParserElement.setDefaultWhitespaceChars(" \t")
            OneOrMore(Word(alphas)).parseString("abc def\nghi jkl")  # -> ['abc', 'def']
        N)r$�DEFAULT_WHITE_CHARS)�charsrwrwrx�setDefaultWhitespaceChars=s
z'ParserElement.setDefaultWhitespaceCharscCs
|t_dS)a�
        Set class to be used for inclusion of string literals into a parser.
        
        Example::
            # default literal class used is Literal
            integer = Word(nums)
            date_str = integer("year") + '/' + integer("month") + '/' + integer("day")           

            date_str.parseString("1999/12/31")  # -> ['1999', '/', '12', '/', '31']


            # change to Suppress
            ParserElement.inlineLiteralsUsing(Suppress)
            date_str = integer("year") + '/' + integer("month") + '/' + integer("day")           

            date_str.parseString("1999/12/31")  # -> ['1999', '12', '31']
        N)r$�_literalStringClass)r�rwrwrx�inlineLiteralsUsingLsz!ParserElement.inlineLiteralsUsingcCs�t�|_d|_d|_d|_||_d|_tj|_	d|_
d|_d|_t�|_
d|_d|_d|_d|_d|_d|_d|_d|_d|_dS)NTFr�)NNN)r��parseAction�
failAction�strRepr�resultsName�
saveAsList�skipWhitespacer$rN�
whiteChars�copyDefaultWhiteChars�mayReturnEmpty�keepTabs�ignoreExprs�debug�streamlined�
mayIndexError�errmsg�modalResults�debugActions�re�callPreparse�
callDuringTry)r��savelistrwrwrxr�as(zParserElement.__init__cCs<tj|�}|jdd�|_|jdd�|_|jr8tj|_|S)a$
        Make a copy of this C{ParserElement}.  Useful for defining different parse actions
        for the same parsing pattern, using copies of the original parse element.
        
        Example::
            integer = Word(nums).setParseAction(lambda toks: int(toks[0]))
            integerK = integer.copy().addParseAction(lambda toks: toks[0]*1024) + Suppress("K")
            integerM = integer.copy().addParseAction(lambda toks: toks[0]*1024*1024) + Suppress("M")
            
            print(OneOrMore(integerK | integerM | integer).parseString("5K 100 640K 256M"))
        prints::
            [5120, 100, 655360, 268435456]
        Equivalent form of C{expr.copy()} is just C{expr()}::
            integerM = integer().addParseAction(lambda toks: toks[0]*1024*1024) + Suppress("M")
        N)r�rSr]rZr$rNrY)r�Zcpyrwrwrxr�xs
zParserElement.copycCs*||_d|j|_t|d�r&|j|j_|S)af
        Define name for this expression, makes debugging and exception messages clearer.
        
        Example::
            Word(nums).parseString("ABC")  # -> Exception: Expected W:(0123...) (at char 0), (line:1, col:1)
            Word(nums).setName("integer").parseString("ABC")  # -> Exception: Expected integer (at char 0), (line:1, col:1)
        z	Expected �	exception)r�rar�rhr�)r�r�rwrwrx�setName�s


zParserElement.setNamecCs4|j�}|jd�r"|dd�}d}||_||_|S)aP
        Define name for referencing matching tokens as a nested attribute
        of the returned parse results.
        NOTE: this returns a *copy* of the original C{ParserElement} object;
        this is so that the client can define a basic element, such as an
        integer, and reference it in multiple places with different names.

        You can also set results names using the abbreviated syntax,
        C{expr("name")} in place of C{expr.setResultsName("name")} - 
        see L{I{__call__}<__call__>}.

        Example::
            date_str = (integer.setResultsName("year") + '/' 
                        + integer.setResultsName("month") + '/' 
                        + integer.setResultsName("day"))

            # equivalent form:
            date_str = integer("year") + '/' + integer("month") + '/' + integer("day")
        �*NrrTrs)r��endswithrVrb)r�r��listAllMatchesZnewselfrwrwrx�setResultsName�s
zParserElement.setResultsNameTcs@|r&|j�d�fdd�	}�|_||_nt|jd�r<|jj|_|S)z�Method to invoke the Python pdb debugger when this element is
           about to be parsed. Set C{breakFlag} to True to enable, False to
           disable.
        Tcsddl}|j��||||�S)Nr)�pdbZ	set_trace)r-r��	doActions�callPreParsern)�_parseMethodrwrx�breaker�sz'ParserElement.setBreak.<locals>.breaker�_originalParseMethod)TT)�_parsersr�)r�Z	breakFlagrrrw)rqrx�setBreak�s
zParserElement.setBreakcOs&tttt|���|_|jdd�|_|S)a
        Define action to perform when successfully matching parse element definition.
        Parse action fn is a callable method with 0-3 arguments, called as C{fn(s,loc,toks)},
        C{fn(loc,toks)}, C{fn(toks)}, or just C{fn()}, where:
         - s   = the original string being parsed (see note below)
         - loc = the location of the matching substring
         - toks = a list of the matched tokens, packaged as a C{L{ParseResults}} object
        If the functions in fns modify the tokens, they can return them as the return
        value from fn, and the modified list of tokens will replace the original.
        Otherwise, fn does not need to return any value.

        Optional keyword arguments:
         - callDuringTry = (default=C{False}) indicate if parse action should be run during lookaheads and alternate testing

        Note: the default parsing behavior is to expand tabs in the input string
        before starting the parsing process.  See L{I{parseString}<parseString>} for more information
        on parsing strings containing C{<TAB>}s, and suggested methods to maintain a
        consistent view of the parsed string, the parse location, and line and column
        positions within the parsed string.
        
        Example::
            integer = Word(nums)
            date_str = integer + '/' + integer + '/' + integer

            date_str.parseString("1999/12/31")  # -> ['1999', '/', '12', '/', '31']

            # use parse action to convert to ints at parse time
            integer = Word(nums).setParseAction(lambda toks: int(toks[0]))
            date_str = integer + '/' + integer + '/' + integer

            # note that integer fields are now ints, not strings
            date_str.parseString("1999/12/31")  # -> [1999, '/', 12, '/', 31]
        rfF)r��maprMrSr�rf)r��fnsr�rwrwrxr��s"zParserElement.setParseActioncOs4|jtttt|���7_|jp,|jdd�|_|S)z�
        Add parse action to expression's list of parse actions. See L{I{setParseAction}<setParseAction>}.
        
        See examples in L{I{copy}<copy>}.
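
        Example (an illustrative sketch added here, assuming the usual top-level
        pyparsing names)::

            from pyparsing import Word, nums

            integer = Word(nums).setParseAction(lambda toks: int(toks[0]))
            integer.addParseAction(lambda toks: toks[0] * 2)   # runs after the first action

            integer.parseString("21")   # -> [42]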
        rfF)rSr�rvrMrfr�)r�rwr�rwrwrx�addParseAction�szParserElement.addParseActioncsb|jdd��|jdd�rtnt�x(|D] ����fdd�}|jj|�q&W|jpZ|jdd�|_|S)a�Add a boolean predicate function to expression's list of parse actions. See 
        L{I{setParseAction}<setParseAction>} for function call signatures. Unlike C{setParseAction}, 
        functions passed to C{addCondition} need to return boolean success/fail of the condition.

        Optional keyword arguments:
         - message = define a custom message to be used in the raised exception
         - fatal   = if True, will raise ParseFatalException to stop parsing immediately; otherwise will raise ParseException
         
        Example::
            integer = Word(nums).setParseAction(lambda toks: int(toks[0]))
            year_int = integer.copy()
            year_int.addCondition(lambda toks: toks[0] >= 2000, message="Only support years 2000 and later")
            date_str = year_int + '/' + integer + '/' + integer

            result = date_str.parseString("1999/12/31")  # -> Exception: Only support years 2000 and later (at char 0), (line:1, col:1)
        �messagezfailed user-defined condition�fatalFcs$tt��|||��s �||���dS)N)r�rM)r�r5rv)�exc_type�fnr�rwrx�pasz&ParserElement.addCondition.<locals>.parf)r�r!rrSr�rf)r�rwr�r}rw)r{r|r�rx�addCondition�s
zParserElement.addConditioncCs
||_|S)aDefine action to perform if parsing fails at this expression.
           Fail action fn is a callable function that takes the arguments
           C{fn(s,loc,expr,err)} where:
            - s = string being parsed
            - loc = location where expression match was attempted and failed
            - expr = the parse expression that failed
            - err = the exception thrown
           The function returns no value.  It may throw C{L{ParseFatalException}}
           if it is desired to stop parsing immediately.)rT)r�r|rwrwrx�
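
           Example (an illustrative sketch added here, assuming the usual top-level
           pyparsing names)::

               from pyparsing import Word, nums, ParseException

               def report_failure(s, loc, expr, err):
                   print("no integer at column %d" % (loc + 1))

               integer = Word(nums).setFailAction(report_failure)
               try:
                   integer.parseString("abc")
               except ParseException:
                   pass   # the fail action has already printed its report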
setFailActions
zParserElement.setFailActioncCsZd}xP|rTd}xB|jD]8}yx|j||�\}}d}qWWqtk
rLYqXqWqW|S)NTF)r]rtr)r�r-r�Z
exprsFound�eZdummyrwrwrx�_skipIgnorables#szParserElement._skipIgnorablescCsL|jr|j||�}|jrH|j}t|�}x ||krF|||krF|d7}q(W|S)Nrr)r]r�rXrYr�)r�r-r�Zwt�instrlenrwrwrx�preParse0szParserElement.preParsecCs|gfS)Nrw)r�r-r�rorwrwrx�	parseImpl<szParserElement.parseImplcCs|S)Nrw)r�r-r��	tokenlistrwrwrx�	postParse?szParserElement.postParsec"Cs�|j}|s|jr�|jdr,|jd|||�|rD|jrD|j||�}n|}|}yDy|j|||�\}}Wn(tk
r�t|t|�|j	|��YnXWnXt
k
r�}	z<|jdr�|jd||||	�|jr�|j||||	��WYdd}	~	XnXn�|o�|j�r|j||�}n|}|}|j�s$|t|�k�rhy|j|||�\}}Wn*tk
�rdt|t|�|j	|��YnXn|j|||�\}}|j|||�}t
||j|j|jd�}
|j�r�|�s�|j�r�|�rVyRxL|jD]B}||||
�}|dk	�r�t
||j|j�o�t|t
tf�|jd�}
�q�WWnFt
k
�rR}	z(|jd�r@|jd||||	��WYdd}	~	XnXnNxL|jD]B}||||
�}|dk	�r^t
||j|j�o�t|t
tf�|jd�}
�q^W|�r�|jd�r�|jd|||||
�||
fS)Nrrq)r�r�rr)r^rTrcrer�r�r�rr�rarr`r�r"rVrWrbrSrfrzr�)r�r-r�rorpZ	debugging�prelocZtokensStart�tokens�errZ	retTokensr|rwrwrx�
_parseNoCacheCsp





zParserElement._parseNoCachecCs>y|j||dd�dStk
r8t|||j|��YnXdS)NF)ror)rtr!rra)r�r-r�rwrwrx�tryParse�szParserElement.tryParsecCs2y|j||�Wnttfk
r(dSXdSdS)NFT)r�rr�)r�r-r�rwrwrx�canParseNext�s
zParserElement.canParseNextc@seZdZdd�ZdS)zParserElement._UnboundedCachecsdi�t�|_���fdd�}�fdd�}�fdd�}tj||�|_tj||�|_tj||�|_dS)Ncs�j|��S)N)r�)r�r�)�cache�not_in_cacherwrxr��sz3ParserElement._UnboundedCache.__init__.<locals>.getcs|�|<dS)Nrw)r�r�r�)r�rwrx�set�sz3ParserElement._UnboundedCache.__init__.<locals>.setcs�j�dS)N)r�)r�)r�rwrxr��sz5ParserElement._UnboundedCache.__init__.<locals>.clear)r�r��types�
MethodTyper�r�r�)r�r�r�r�rw)r�r�rxr��sz&ParserElement._UnboundedCache.__init__N)r�r�r�r�rwrwrwrx�_UnboundedCache�sr�Nc@seZdZdd�ZdS)zParserElement._FifoCachecsht�|_�t����fdd�}��fdd�}�fdd�}tj||�|_tj||�|_tj||�|_dS)Ncs�j|��S)N)r�)r�r�)r�r�rwrxr��sz.ParserElement._FifoCache.__init__.<locals>.getcs"|�|<t���kr�jd�dS)NF)r��popitem)r�r�r�)r��sizerwrxr��sz.ParserElement._FifoCache.__init__.<locals>.setcs�j�dS)N)r�)r�)r�rwrxr��sz0ParserElement._FifoCache.__init__.<locals>.clear)r�r��_OrderedDictr�r�r�r�r�)r�r�r�r�r�rw)r�r�r�rxr��sz!ParserElement._FifoCache.__init__N)r�r�r�r�rwrwrwrx�
_FifoCache�sr�c@seZdZdd�ZdS)zParserElement._FifoCachecsvt�|_�i�tjg�����fdd�}���fdd�}��fdd�}tj||�|_tj||�|_tj||�|_dS)Ncs�j|��S)N)r�)r�r�)r�r�rwrxr��sz.ParserElement._FifoCache.__init__.<locals>.getcs2|�|<t���kr$�j�j�d��j|�dS)N)r�r��popleftr�)r�r�r�)r��key_fifor�rwrxr��sz.ParserElement._FifoCache.__init__.<locals>.setcs�j��j�dS)N)r�)r�)r�r�rwrxr��sz0ParserElement._FifoCache.__init__.<locals>.clear)	r�r��collections�dequer�r�r�r�r�)r�r�r�r�r�rw)r�r�r�r�rxr��sz!ParserElement._FifoCache.__init__N)r�r�r�r�rwrwrwrxr��srcCs�d\}}|||||f}tj��tj}|j|�}	|	|jkr�tj|d7<y|j||||�}	Wn8tk
r�}
z|j||
j	|
j
���WYdd}
~
Xq�X|j||	d|	dj�f�|	Sn4tj|d7<t|	t
�r�|	�|	d|	dj�fSWdQRXdS)Nrrr)rrr)r$�packrat_cache_lock�
packrat_cacher�r��packrat_cache_statsr�rr�rHr�r�rzrK)r�r-r�rorpZHITZMISS�lookupr�r�r�rwrwrx�_parseCache�s$


zParserElement._parseCachecCs(tjj�dgttj�tjdd�<dS)Nr)r$r�r�r�r�rwrwrwrx�
resetCache�s
zParserElement.resetCache�cCs8tjs4dt_|dkr tj�t_ntj|�t_tjt_dS)a�Enables "packrat" parsing, which adds memoizing to the parsing logic.
           Repeated parse attempts at the same string location (which happens
           often in many complex grammars) can immediately return a cached value,
           instead of re-executing parsing/validating code.  Memoization is applied to
           both valid results and parsing exceptions.
           
           Parameters:
            - cache_size_limit - (default=C{128}) - if an integer value is provided
              will limit the size of the packrat cache; if None is passed, then
              the cache size will be unbounded; if 0 is passed, the cache will
              be effectively disabled.
            
           This speedup may break existing programs that use parse actions that
           have side-effects.  For this reason, packrat parsing is disabled when
           you first import pyparsing.  To activate the packrat feature, your
           program must call the class method C{ParserElement.enablePackrat()}.  If
           your program uses C{psyco} to "compile as you go", you must call
           C{enablePackrat} before calling C{psyco.full()}.  If you do not do this,
           Python will crash.  For best results, call C{enablePackrat()} immediately
           after importing pyparsing.
           
           Example::
               import pyparsing
               pyparsing.ParserElement.enablePackrat()
        TN)r$�_packratEnabledr�r�r�r�rt)Zcache_size_limitrwrwrx�
enablePackratszParserElement.enablePackratcCs�tj�|js|j�x|jD]}|j�qW|js<|j�}y<|j|d�\}}|rv|j||�}t	�t
�}|j||�Wn0tk
r�}ztjr��n|�WYdd}~XnX|SdS)aB
        Execute the parse expression with the given string.
        This is the main interface to the client code, once the complete
        expression has been built.

        If you want the grammar to require that the entire input string be
        successfully parsed, then set C{parseAll} to True (equivalent to ending
        the grammar with C{L{StringEnd()}}).

        Note: C{parseString} implicitly calls C{expandtabs()} on the input string,
        in order to report proper column numbers in parse actions.
        If the input string contains tabs and
        the grammar uses parse actions that use the C{loc} argument to index into the
        string being parsed, you can ensure you have a consistent view of the input
        string by:
         - calling C{parseWithTabs} on your grammar before calling C{parseString}
           (see L{I{parseWithTabs}<parseWithTabs>})
         - define your parse action using the full C{(s,loc,toks)} signature, and
           reference the input string using the parse action's C{s} argument
          - explicitly expand the tabs in your input string before calling
           C{parseString}
        
        Example::
            Word('a').parseString('aaaaabaaa')  # -> ['aaaaa']
            Word('a').parseString('aaaaabaaa', parseAll=True)  # -> Exception: Expected end of text
        rN)
r$r�r_�
streamliner]r\�
expandtabsrtr�r
r)r�verbose_stacktrace)r�r-�parseAllr�r�r�Zser3rwrwrx�parseString#s$zParserElement.parseStringccs@|js|j�x|jD]}|j�qW|js8t|�j�}t|�}d}|j}|j}t	j
�d}	y�x�||kon|	|k�ry |||�}
|||
dd�\}}Wntk
r�|
d}Yq`X||kr�|	d7}	||
|fV|r�|||�}
|
|kr�|}q�|d7}n|}q`|
d}q`WWn4tk
�r:}zt	j
�r&�n|�WYdd}~XnXdS)a�
        Scan the input string for expression matches.  Each match will return the
        matching tokens, start location, and end location.  May be called with optional
        C{maxMatches} argument, to clip scanning after 'n' matches are found.  If
        C{overlap} is specified, then overlapping matches will be reported.

        Note that the start and end locations are reported relative to the string
        being parsed.  See L{I{parseString}<parseString>} for more information on parsing
        strings with embedded tabs.

        Example::
            source = "sldjf123lsdjjkf345sldkjf879lkjsfd987"
            print(source)
            for tokens,start,end in Word(alphas).scanString(source):
                print(' '*start + '^'*(end-start))
                print(' '*start + tokens[0])
        
        prints::
        
            sldjf123lsdjjkf345sldkjf879lkjsfd987
            ^^^^^
            sldjf
                    ^^^^^^^
                    lsdjjkf
                              ^^^^^^
                              sldkjf
                                       ^^^^^^
                                       lkjsfd
        rF)rprrN)r_r�r]r\r�r�r�r�rtr$r�rrr�)r�r-�
maxMatchesZoverlapr�r�r�Z
preparseFnZparseFn�matchesr�ZnextLocr�Znextlocr3rwrwrx�
scanStringUsB


zParserElement.scanStringcCs�g}d}d|_y�xh|j|�D]Z\}}}|j|||��|rrt|t�rT||j�7}nt|t�rh||7}n
|j|�|}qW|j||d��dd�|D�}djtt	t
|���Stk
r�}ztj
rȂn|�WYdd}~XnXdS)af
        Extension to C{L{scanString}}, to modify matching text with modified tokens that may
        be returned from a parse action.  To use C{transformString}, define a grammar and
        attach a parse action to it that modifies the returned token list.
        Invoking C{transformString()} on a target string will then scan for matches,
        and replace the matched text patterns according to the logic in the parse
        action.  C{transformString()} returns the resulting transformed string.
        
        Example::
            wd = Word(alphas)
            wd.setParseAction(lambda toks: toks[0].title())
            
            print(wd.transformString("now is the winter of our discontent made glorious summer by this sun of york."))
        Prints::
            Now Is The Winter Of Our Discontent Made Glorious Summer By This Sun Of York.
        rTNcSsg|]}|r|�qSrwrw)r��orwrwrxr��sz1ParserElement.transformString.<locals>.<listcomp>r�)r\r�r�rzr"r�r�r�rvr��_flattenrr$r�)r�r-rZlastErvr�r�r3rwrwrxr��s(



zParserElement.transformStringcCsPytdd�|j||�D��Stk
rJ}ztjr6�n|�WYdd}~XnXdS)a~
        Another extension to C{L{scanString}}, simplifying the access to the tokens found
        to match the given parse expression.  May be called with optional
        C{maxMatches} argument, to clip searching after 'n' matches are found.
        
        Example::
            # a capitalized word starts with an uppercase letter, followed by zero or more lowercase letters
            cap_word = Word(alphas.upper(), alphas.lower())
            
            print(cap_word.searchString("More than Iron, more than Lead, more than Gold I need Electricity"))
        prints::
            ['More', 'Iron', 'Lead', 'Gold', 'I']
        cSsg|]\}}}|�qSrwrw)r�rvr�r�rwrwrxr��sz.ParserElement.searchString.<locals>.<listcomp>N)r"r�rr$r�)r�r-r�r3rwrwrx�searchString�szParserElement.searchStringc	csXd}d}x<|j||d�D]*\}}}|||�V|r>|dV|}qW||d�VdS)a[
        Generator method to split a string using the given expression as a separator.
        May be called with optional C{maxsplit} argument, to limit the number of splits;
        and the optional C{includeSeparators} argument (default=C{False}), if the separating
        matching text should be included in the split results.
        
        Example::        
            punc = oneOf(list(".,;:/-!?"))
            print(list(punc.split("This, this?, this sentence, is badly punctuated!")))
        prints::
            ['This', ' this', '', ' this sentence', ' is badly punctuated', '']
        r)r�N)r�)	r�r-�maxsplitZincludeSeparatorsZsplitsZlastrvr�r�rwrwrxr��s

zParserElement.splitcCsFt|t�rtj|�}t|t�s:tjdt|�tdd�dSt||g�S)a�
        Implementation of + operator - returns C{L{And}}. Adding strings to a ParserElement
        converts them to L{Literal}s by default.
        
        Example::
            greet = Word(alphas) + "," + Word(alphas) + "!"
            hello = "Hello, World!"
            print (hello, "->", greet.parseString(hello))
        Prints::
            Hello, World! -> ['Hello', ',', 'World', '!']
        z4Cannot combine element of type %s with ParserElementrq)�
stacklevelN)	rzr�r$rQ�warnings�warnr��
SyntaxWarningr)r�r�rwrwrxr�s



zParserElement.__add__cCsBt|t�rtj|�}t|t�s:tjdt|�tdd�dS||S)z]
        Implementation of + operator when left operand is not a C{L{ParserElement}}
        z4Cannot combine element of type %s with ParserElementrq)r�N)rzr�r$rQr�r�r�r�)r�r�rwrwrxrs



zParserElement.__radd__cCsLt|t�rtj|�}t|t�s:tjdt|�tdd�dSt|tj	�|g�S)zQ
        Implementation of - operator, returns C{L{And}} with error stop
        z4Cannot combine element of type %s with ParserElementrq)r�N)
rzr�r$rQr�r�r�r�r�
_ErrorStop)r�r�rwrwrx�__sub__s



zParserElement.__sub__cCsBt|t�rtj|�}t|t�s:tjdt|�tdd�dS||S)z]
        Implementation of - operator when left operand is not a C{L{ParserElement}}
        z4Cannot combine element of type %s with ParserElementrq)r�N)rzr�r$rQr�r�r�r�)r�r�rwrwrx�__rsub__ s



zParserElement.__rsub__cs�t|t�r|d}}n�t|t�r�|ddd�}|ddkrHd|df}t|dt�r�|ddkr�|ddkrvt��S|ddkr�t��S�|dt��SnJt|dt�r�t|dt�r�|\}}||8}ntdt|d�t|d���ntdt|���|dk�rtd��|dk�rtd��||k�o2dkn�rBtd	��|�r���fd
d��|�r�|dk�rt��|�}nt�g|��|�}n�|�}n|dk�r��}nt�g|�}|S)
a�
        Implementation of * operator, allows use of C{expr * 3} in place of
        C{expr + expr + expr}.  Expressions may also be multiplied by a 2-integer
        tuple, similar to C{{min,max}} multipliers in regular expressions.  Tuples
        may also include C{None} as in:
         - C{expr*(n,None)} or C{expr*(n,)} is equivalent
              to C{expr*n + L{ZeroOrMore}(expr)}
              (read as "at least n instances of C{expr}")
         - C{expr*(None,n)} is equivalent to C{expr*(0,n)}
              (read as "0 to n instances of C{expr}")
         - C{expr*(None,None)} is equivalent to C{L{ZeroOrMore}(expr)}
         - C{expr*(1,None)} is equivalent to C{L{OneOrMore}(expr)}

        Note that C{expr*(None,n)} does not raise an exception if
        more than n exprs exist in the input stream; that is,
        C{expr*(None,n)} does not enforce a maximum number of expr
        occurrences.  If this behavior is desired, then write
        C{expr*(None,n) + ~expr}
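
        Example (an illustrative sketch added here, assuming the usual top-level
        pyparsing names)::

            from pyparsing import Word, nums

            digit = Word(nums, exact=1)
            two_to_four = digit * (2, 4)

            print(two_to_four.parseString("1 2 3"))   # -> ['1', '2', '3']
            # two_to_four.parseString("9") raises ParseException (fewer than 2 digits)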
        rNrqrrz7cannot multiply 'ParserElement' and ('%s','%s') objectsz0cannot multiply 'ParserElement' and '%s' objectsz/cannot multiply ParserElement by negative valuez@second tuple value must be greater or equal to first tuple valuez+cannot multiply ParserElement by 0 or (0,0)cs(|dkrt��|d��St��SdS)Nrr)r)�n)�makeOptionalListr�rwrxr�]sz/ParserElement.__mul__.<locals>.makeOptionalList)NN)	rzru�tupler2rr�r��
ValueErrorr)r�r�ZminElementsZoptElementsr�rw)r�r�rx�__mul__,sD







zParserElement.__mul__cCs
|j|�S)N)r�)r�r�rwrwrx�__rmul__pszParserElement.__rmul__cCsFt|t�rtj|�}t|t�s:tjdt|�tdd�dSt||g�S)zI
        Implementation of | operator - returns C{L{MatchFirst}}
        z4Cannot combine element of type %s with ParserElementrq)r�N)	rzr�r$rQr�r�r�r�r)r�r�rwrwrx�__or__ss



zParserElement.__or__cCsBt|t�rtj|�}t|t�s:tjdt|�tdd�dS||BS)z]
        Implementation of | operator when left operand is not a C{L{ParserElement}}
        z4Cannot combine element of type %s with ParserElementrq)r�N)rzr�r$rQr�r�r�r�)r�r�rwrwrx�__ror__s



zParserElement.__ror__cCsFt|t�rtj|�}t|t�s:tjdt|�tdd�dSt||g�S)zA
        Implementation of ^ operator - returns C{L{Or}}
        z4Cannot combine element of type %s with ParserElementrq)r�N)	rzr�r$rQr�r�r�r�r)r�r�rwrwrx�__xor__�s



zParserElement.__xor__cCsBt|t�rtj|�}t|t�s:tjdt|�tdd�dS||AS)z]
        Implementation of ^ operator when left operand is not a C{L{ParserElement}}
        z4Cannot combine element of type %s with ParserElementrq)r�N)rzr�r$rQr�r�r�r�)r�r�rwrwrx�__rxor__�s



zParserElement.__rxor__cCsFt|t�rtj|�}t|t�s:tjdt|�tdd�dSt||g�S)zC
        Implementation of & operator - returns C{L{Each}}
        z4Cannot combine element of type %s with ParserElementrq)r�N)	rzr�r$rQr�r�r�r�r)r�r�rwrwrx�__and__�s



zParserElement.__and__cCsBt|t�rtj|�}t|t�s:tjdt|�tdd�dS||@S)z]
        Implementation of & operator when left operand is not a C{L{ParserElement}}
        z4Cannot combine element of type %s with ParserElementrq)r�N)rzr�r$rQr�r�r�r�)r�r�rwrwrx�__rand__�s



zParserElement.__rand__cCst|�S)zE
        Implementation of ~ operator - returns C{L{NotAny}}
        )r)r�rwrwrx�
__invert__�szParserElement.__invert__cCs|dk	r|j|�S|j�SdS)a

        Shortcut for C{L{setResultsName}}, with C{listAllMatches=False}.
        
        If C{name} is given with a trailing C{'*'} character, then C{listAllMatches} will be
        passed as C{True}.
           
        If C{name} is omitted, same as calling C{L{copy}}.

        Example::
            # these are equivalent
            userdata = Word(alphas).setResultsName("name") + Word(nums+"-").setResultsName("socsecno")
            userdata = Word(alphas)("name") + Word(nums+"-")("socsecno")             
        N)rmr�)r�r�rwrwrx�__call__�s
zParserElement.__call__cCst|�S)z�
        Suppresses the output of this C{ParserElement}; useful to keep punctuation from
        cluttering up returned output.
        )r+)r�rwrwrx�suppress�szParserElement.suppresscCs
d|_|S)a
        Disables the skipping of whitespace before matching the characters in the
        C{ParserElement}'s defined pattern.  This is normally only used internally by
        the pyparsing module, but may be needed in some whitespace-sensitive grammars.
        F)rX)r�rwrwrx�leaveWhitespace�szParserElement.leaveWhitespacecCsd|_||_d|_|S)z8
        Overrides the default whitespace chars
        TF)rXrYrZ)r�rOrwrwrx�setWhitespaceChars�sz ParserElement.setWhitespaceCharscCs
d|_|S)z�
        Overrides default behavior to expand C{<TAB>}s to spaces before parsing the input string.
        Must be called before C{parseString} when the input grammar contains elements that
        match C{<TAB>} characters.
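
        Example (an illustrative sketch added here, assuming the usual top-level
        pyparsing names)::

            from pyparsing import Literal, Word, alphas

            tab = Literal("\t").leaveWhitespace()
            row = Word(alphas) + tab + Word(alphas)
            row.parseWithTabs()                    # keep the literal tab in the input string
            print(row.parseString("abc\tdef"))     # -> ['abc', '\t', 'def']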
        T)r\)r�rwrwrx�
parseWithTabs�szParserElement.parseWithTabscCsLt|t�rt|�}t|t�r4||jkrH|jj|�n|jjt|j���|S)a�
        Define expression to be ignored (e.g., comments) while doing pattern
        matching; may be called repeatedly, to define multiple comment or other
        ignorable patterns.
        
        Example::
            patt = OneOrMore(Word(alphas))
            patt.parseString('ablaj /* comment */ lskjd') # -> ['ablaj']
            
            patt.ignore(cStyleComment)
            patt.parseString('ablaj /* comment */ lskjd') # -> ['ablaj', 'lskjd']
        )rzr�r+r]r�r�)r�r�rwrwrx�ignore�s


zParserElement.ignorecCs"|pt|pt|ptf|_d|_|S)zT
        Enable display of debugging messages while doing pattern matching.
        T)r/r2r4rcr^)r�ZstartActionZ
successActionZexceptionActionrwrwrx�setDebugActions
s
zParserElement.setDebugActionscCs|r|jttt�nd|_|S)a�
        Enable display of debugging messages while doing pattern matching.
        Set C{flag} to True to enable, False to disable.

        Example::
            wd = Word(alphas).setName("alphaword")
            integer = Word(nums).setName("numword")
            term = wd | integer
            
            # turn on debugging for wd
            wd.setDebug()

            OneOrMore(term).parseString("abc 123 xyz 890")
        
        prints::
            Match alphaword at loc 0(1,1)
            Matched alphaword -> ['abc']
            Match alphaword at loc 3(1,4)
            Exception raised:Expected alphaword (at char 4), (line:1, col:5)
            Match alphaword at loc 7(1,8)
            Matched alphaword -> ['xyz']
            Match alphaword at loc 11(1,12)
            Exception raised:Expected alphaword (at char 12), (line:1, col:13)
            Match alphaword at loc 15(1,16)
            Exception raised:Expected alphaword (at char 15), (line:1, col:16)

        The output shown is that produced by the default debug actions - custom debug actions can be
        specified using L{setDebugActions}. Prior to attempting
        to match the C{wd} expression, the debugging message C{"Match <exprname> at loc <n>(<line>,<col>)"}
        is shown. Then if the parse succeeds, a C{"Matched"} message is shown, or an C{"Exception raised"}
        message is shown. Also note the use of L{setName} to assign a human-readable name to the expression,
        which makes debugging and exception messages easier to understand - for instance, the default
        name created for the C{Word} expression without calling C{setName} is C{"W:(ABCD...)"}.
        F)r�r/r2r4r^)r��flagrwrwrx�setDebugs#zParserElement.setDebugcCs|jS)N)r�)r�rwrwrxr�@szParserElement.__str__cCst|�S)N)r�)r�rwrwrxr�CszParserElement.__repr__cCsd|_d|_|S)NT)r_rU)r�rwrwrxr�FszParserElement.streamlinecCsdS)Nrw)r�r�rwrwrx�checkRecursionKszParserElement.checkRecursioncCs|jg�dS)zj
        Check defined expressions for valid structure, check for infinite recursive definitions.
        N)r�)r��
validateTracerwrwrx�validateNszParserElement.validatecCs�y|j�}Wn2tk
r>t|d��}|j�}WdQRXYnXy|j||�Stk
r|}ztjrh�n|�WYdd}~XnXdS)z�
        Execute the parse expression on the given file or filename.
        If a filename is specified (instead of a file object),
        the entire file is opened, read, and closed before parsing.
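
        Example (an illustrative sketch added here; "numbers.txt" is a placeholder
        path whose contents start with an integer)::

            from pyparsing import Word, nums

            integer = Word(nums)
            print(integer.parseFile("numbers.txt"))   # parses the leading integer in the file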
        �rN)�readr��openr�rr$r�)r�Zfile_or_filenamer�Z
file_contents�fr3rwrwrx�	parseFileTszParserElement.parseFilecsHt|t�r"||kp t|�t|�kSt|t�r6|j|�Stt|�|kSdS)N)rzr$�varsr�r��super)r�r�)rHrwrx�__eq__hs



zParserElement.__eq__cCs
||kS)Nrw)r�r�rwrwrx�__ne__pszParserElement.__ne__cCstt|��S)N)�hash�id)r�rwrwrx�__hash__sszParserElement.__hash__cCs||kS)Nrw)r�r�rwrwrx�__req__vszParserElement.__req__cCs
||kS)Nrw)r�r�rwrwrx�__rne__yszParserElement.__rne__cCs0y|jt|�|d�dStk
r*dSXdS)a�
        Method for quick testing of a parser against a test string. Good for simple 
        inline microtests of sub expressions while building up larger parser.
           
        Parameters:
         - testString - to test against this expression for a match
         - parseAll - (default=C{True}) - flag to pass to C{L{parseString}} when running tests
            
        Example::
            expr = Word(nums)
            assert expr.matches("100")
        )r�TFN)r�r�r)r�Z
testStringr�rwrwrxr�|s

zParserElement.matches�#cCs�t|t�r"tttj|j�j���}t|t�r4t|�}g}g}d}	�x�|D�]�}
|dk	rb|j	|
d�sl|rx|
rx|j
|
�qH|
s~qHdj|�|
g}g}y:|
jdd�}
|j
|
|d�}|j
|j|d��|	o�|}	Wn�tk
�rx}
z�t|
t�r�dnd	}d|
k�r0|j
t|
j|
��|j
d
t|
j|
�dd|�n|j
d
|
jd|�|j
d
t|
��|	�ob|}	|
}WYdd}
~
XnDtk
�r�}z&|j
dt|��|	�o�|}	|}WYdd}~XnX|�r�|�r�|j
d	�tdj|��|j
|
|f�qHW|	|fS)a3
        Execute the parse expression on a series of test strings, showing each
        test, the parsed results or where the parse failed. Quick and easy way to
        run a parse expression against a list of sample strings.
           
        Parameters:
         - tests - a list of separate test strings, or a multiline string of test strings
         - parseAll - (default=C{True}) - flag to pass to C{L{parseString}} when running tests           
         - comment - (default=C{'#'}) - expression for indicating embedded comments in the test 
              string; pass None to disable comment filtering
         - fullDump - (default=C{True}) - dump results as list followed by results names in nested outline;
              if False, only dump nested list
         - printResults - (default=C{True}) prints test output to stdout
         - failureTests - (default=C{False}) indicates if these tests are expected to fail parsing

        Returns: a (success, results) tuple, where success indicates that all tests succeeded
        (or failed if C{failureTests} is True), and the results contain a list of lines of each 
        test's output
        
        Example::
            number_expr = pyparsing_common.number.copy()

            result = number_expr.runTests('''
                # unsigned integer
                100
                # negative integer
                -100
                # float with scientific notation
                6.02e23
                # integer with scientific notation
                1e-12
                ''')
            print("Success" if result[0] else "Failed!")

            result = number_expr.runTests('''
                # stray character
                100Z
                # missing leading digit before '.'
                -.100
                # too many '.'
                3.14.159
                ''', failureTests=True)
            print("Success" if result[0] else "Failed!")
        prints::
            # unsigned integer
            100
            [100]

            # negative integer
            -100
            [-100]

            # float with scientific notation
            6.02e23
            [6.02e+23]

            # integer with scientific notation
            1e-12
            [1e-12]

            Success
            
            # stray character
            100Z
               ^
            FAIL: Expected end of text (at char 3), (line:1, col:4)

            # missing leading digit before '.'
            -.100
            ^
            FAIL: Expected {real number with scientific notation | real number | signed integer} (at char 0), (line:1, col:1)

            # too many '.'
            3.14.159
                ^
            FAIL: Expected end of text (at char 4), (line:1, col:5)

            Success

        Each test string must be on a single line. If you want to test a string that spans multiple
        lines, create a test like this::

            expr.runTests(r"this is a test\n of strings that spans \n 3 lines")
        
        (Note that this is a raw string literal, you must include the leading 'r'.)
        TNFrz\n)r�)r z(FATAL)r�� rr�^zFAIL: zFAIL-EXCEPTION: )rzr�r�rvr{r��rstrip�
splitlinesrr�r�r�r�r�rrr!rGr�r9rKr,)r�Ztestsr�ZcommentZfullDumpZprintResultsZfailureTestsZ
allResultsZcomments�successrvr�resultr�rzr3rwrwrx�runTests�sNW



$


zParserElement.runTests)F)F)T)T)TT)TT)r�)F)N)T)F)T)Tr�TTF)Or�r�r�r�rNr��staticmethodrPrRr�r�rirmrur�rxr~rr�r�r�r�r�r�r�r�r�r�r�r�rr�r�r�rtr�r�r�r��_MAX_INTr�r�r�r�rrr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r��
__classcell__rwrw)rHrxr$8s�


&




H
"
2G+D
			

)

cs eZdZdZ�fdd�Z�ZS)r,zT
    Abstract C{ParserElement} subclass, for defining atomic matching patterns.
    cstt|�jdd�dS)NF)rg)r�r,r�)r�)rHrwrxr�	szToken.__init__)r�r�r�r�r�r�rwrw)rHrxr,	scs eZdZdZ�fdd�Z�ZS)r
z,
    An empty token, will always match.
    cs$tt|�j�d|_d|_d|_dS)Nr
TF)r�r
r�r�r[r`)r�)rHrwrxr�	szEmpty.__init__)r�r�r�r�r�r�rwrw)rHrxr
	scs*eZdZdZ�fdd�Zddd�Z�ZS)rz(
    A token that will never match.
    cs*tt|�j�d|_d|_d|_d|_dS)NrTFzUnmatchable token)r�rr�r�r[r`ra)r�)rHrwrxr�*	s
zNoMatch.__init__TcCst|||j|��dS)N)rra)r�r-r�rorwrwrxr�1	szNoMatch.parseImpl)T)r�r�r�r�r�r�r�rwrw)rHrxr&	scs*eZdZdZ�fdd�Zddd�Z�ZS)ra�
    Token to exactly match a specified string.
    
    Example::
        Literal('blah').parseString('blah')  # -> ['blah']
        Literal('blah').parseString('blahfooblah')  # -> ['blah']
        Literal('blah').parseString('bla')  # -> Exception: Expected "blah"
    
    For case-insensitive matching, use L{CaselessLiteral}.
    
    For keyword matching (force word break before and after the matched string),
    use L{Keyword} or L{CaselessKeyword}.
    cs�tt|�j�||_t|�|_y|d|_Wn*tk
rVtj	dt
dd�t|_YnXdt
|j�|_d|j|_d|_d|_dS)Nrz2null string passed to Literal; use Empty() insteadrq)r�z"%s"z	Expected F)r�rr��matchr��matchLen�firstMatchCharr�r�r�r�r
rHr�r�rar[r`)r��matchString)rHrwrxr�C	s

zLiteral.__init__TcCsJ|||jkr6|jdks&|j|j|�r6||j|jfSt|||j|��dS)Nrr)r�r��
startswithr�rra)r�r-r�rorwrwrxr�V	szLiteral.parseImpl)T)r�r�r�r�r�r�r�rwrw)rHrxr5	s
csLeZdZdZedZd�fdd�	Zddd	�Z�fd
d�Ze	dd
��Z
�ZS)ra\
    Token to exactly match a specified string as a keyword, that is, it must be
    immediately followed by a non-keyword character.  Compare with C{L{Literal}}:
     - C{Literal("if")} will match the leading C{'if'} in C{'ifAndOnlyIf'}.
     - C{Keyword("if")} will not; it will only match the leading C{'if'} in C{'if x=1'}, or C{'if(y==2)'}
    Accepts two optional constructor arguments in addition to the keyword string:
     - C{identChars} is a string of characters that would be valid identifier characters,
          defaulting to all alphanumerics + "_" and "$"
     - C{caseless} allows case-insensitive matching, default is C{False}.
       
    Example::
        Keyword("start").parseString("start")  # -> ['start']
        Keyword("start").parseString("starting")  # -> Exception

    For case-insensitive matching, use L{CaselessKeyword}.
    z_$NFcs�tt|�j�|dkrtj}||_t|�|_y|d|_Wn$tk
r^t	j
dtdd�YnXd|j|_d|j|_
d|_d|_||_|r�|j�|_|j�}t|�|_dS)Nrz2null string passed to Keyword; use Empty() insteadrq)r�z"%s"z	Expected F)r�rr��DEFAULT_KEYWORD_CHARSr�r�r�r�r�r�r�r�r�rar[r`�caseless�upper�
caselessmatchr��
identChars)r�r�r�r�)rHrwrxr�q	s&

zKeyword.__init__TcCs|jr|||||j�j�|jkr�|t|�|jksL|||jj�|jkr�|dksj||dj�|jkr�||j|jfSnv|||jkr�|jdks�|j|j|�r�|t|�|jks�|||j|jkr�|dks�||d|jkr�||j|jfSt	|||j
|��dS)Nrrr)r�r�r�r�r�r�r�r�r�rra)r�r-r�rorwrwrxr��	s*&zKeyword.parseImplcstt|�j�}tj|_|S)N)r�rr�r�r�)r�r�)rHrwrxr��	szKeyword.copycCs
|t_dS)z,Overrides the default Keyword chars
        N)rr�)rOrwrwrx�setDefaultKeywordChars�	szKeyword.setDefaultKeywordChars)NF)T)r�r�r�r�r3r�r�r�r�r�r�r�rwrw)rHrxr^	s
cs*eZdZdZ�fdd�Zddd�Z�ZS)ral
    Token to match a specified string, ignoring case of letters.
    Note: the matched results will always be in the case of the given
    match string, NOT the case of the input text.

    Example::
        OneOrMore(CaselessLiteral("CMD")).parseString("cmd CMD Cmd10") # -> ['CMD', 'CMD', 'CMD']
        
    (Contrast with example for L{CaselessKeyword}.)
    cs6tt|�j|j��||_d|j|_d|j|_dS)Nz'%s'z	Expected )r�rr�r��returnStringr�ra)r�r�)rHrwrxr��	szCaselessLiteral.__init__TcCs@||||j�j�|jkr,||j|jfSt|||j|��dS)N)r�r�r�r�rra)r�r-r�rorwrwrxr��	szCaselessLiteral.parseImpl)T)r�r�r�r�r�r�r�rwrw)rHrxr�	s
cs,eZdZdZd�fdd�	Zd	dd�Z�ZS)
rz�
    Caseless version of L{Keyword}.

    Example::
        OneOrMore(CaselessKeyword("CMD")).parseString("cmd CMD Cmd10") # -> ['CMD', 'CMD']
        
    (Contrast with example for L{CaselessLiteral}.)
    Ncstt|�j||dd�dS)NT)r�)r�rr�)r�r�r�)rHrwrxr��	szCaselessKeyword.__init__TcCsj||||j�j�|jkrV|t|�|jksF|||jj�|jkrV||j|jfSt|||j|��dS)N)r�r�r�r�r�r�rra)r�r-r�rorwrwrxr��	s*zCaselessKeyword.parseImpl)N)T)r�r�r�r�r�r�r�rwrw)rHrxr�	scs,eZdZdZd�fdd�	Zd	dd�Z�ZS)
rlax
    A variation on L{Literal} which matches "close" matches, that is, 
    strings with at most 'n' mismatching characters. C{CloseMatch} takes parameters:
     - C{match_string} - string to be matched
     - C{maxMismatches} - (C{default=1}) maximum number of mismatches allowed to count as a match
    
    The results from a successful parse will contain the matched text from the input string and the following named results:
     - C{mismatches} - a list of the positions within the match_string where mismatches were found
     - C{original} - the original match_string used to compare against the input string
    
    If C{mismatches} is an empty list, then the match was an exact match.
    
    Example::
        patt = CloseMatch("ATCATCGAATGGA")
        patt.parseString("ATCATCGAAXGGA") # -> (['ATCATCGAAXGGA'], {'mismatches': [[9]], 'original': ['ATCATCGAATGGA']})
        patt.parseString("ATCAXCGAAXGGA") # -> Exception: Expected 'ATCATCGAATGGA' (with up to 1 mismatches) (at char 0), (line:1, col:1)

        # exact match
        patt.parseString("ATCATCGAATGGA") # -> (['ATCATCGAATGGA'], {'mismatches': [[]], 'original': ['ATCATCGAATGGA']})

        # close match allowing up to 2 mismatches
        patt = CloseMatch("ATCATCGAATGGA", maxMismatches=2)
        patt.parseString("ATCAXCGAAXGGA") # -> (['ATCAXCGAAXGGA'], {'mismatches': [[4, 9]], 'original': ['ATCATCGAATGGA']})
    rrcsBtt|�j�||_||_||_d|j|jf|_d|_d|_dS)Nz&Expected %r (with up to %d mismatches)F)	r�rlr�r��match_string�
maxMismatchesrar`r[)r�r�r�)rHrwrxr��	szCloseMatch.__init__TcCs�|}t|�}|t|j�}||kr�|j}d}g}	|j}
x�tt|||�|j��D]0\}}|\}}
||
krP|	j|�t|	�|
krPPqPW|d}t|||�g�}|j|d<|	|d<||fSt|||j|��dS)Nrrr�original�
mismatches)	r�r�r�r�r�r�r"rra)r�r-r�ro�startr��maxlocr�Zmatch_stringlocr�r�Zs_m�src�mat�resultsrwrwrxr��	s("

zCloseMatch.parseImpl)rr)T)r�r�r�r�r�r�r�rwrw)rHrxrl�	s	cs8eZdZdZd
�fdd�	Zdd	d
�Z�fdd�Z�ZS)r/a	
    Token for matching words composed of allowed character sets.
    Defined with string containing all allowed initial characters,
    an optional string containing allowed body characters (if omitted,
    defaults to the initial character set), and an optional minimum,
    maximum, and/or exact length.  The default value for C{min} is 1 (a
    minimum value < 1 is not valid); the default values for C{max} and C{exact}
    are 0, meaning no maximum or exact length restriction. An optional
    C{excludeChars} parameter can list characters that might be found in 
    the input C{bodyChars} string; useful to define a word of all printables
    except for one or two characters, for instance.
    
    L{srange} is useful for defining custom character set strings for defining 
    C{Word} expressions, using range notation from regular expression character sets.
    
    A common mistake is to use C{Word} to match a specific literal string, as in 
    C{Word("Address")}. Remember that C{Word} uses the string argument to define
    I{sets} of matchable characters. This expression would match "Add", "AAA",
    "dAred", or any other word made up of the characters 'A', 'd', 'r', 'e', and 's'.
    To match an exact literal string, use L{Literal} or L{Keyword}.

    pyparsing includes helper strings for building Words:
     - L{alphas}
     - L{nums}
     - L{alphanums}
     - L{hexnums}
     - L{alphas8bit} (alphabetic characters in ASCII range 128-255 - accented, tilded, umlauted, etc.)
     - L{punc8bit} (non-alphabetic characters in ASCII range 128-255 - currency, symbols, superscripts, diacriticals, etc.)
     - L{printables} (any non-whitespace character)

    Example::
        # a word composed of digits
        integer = Word(nums) # equivalent to Word("0123456789") or Word(srange("0-9"))
        
        # a word with a leading capital, and zero or more lowercase
        capital_word = Word(alphas.upper(), alphas.lower())

        # hostnames are alphanumeric, with leading alpha, and '-'
        hostname = Word(alphas, alphanums+'-')
        
        # roman numeral (not a strict parser, accepts invalid mix of characters)
        roman = Word("IVXLCDM")
        
        # any string of non-whitespace characters, except for ','
        csv_value = Word(printables, excludeChars=",")
    NrrrFcs�tt|�j��rFdj�fdd�|D��}|rFdj�fdd�|D��}||_t|�|_|rl||_t|�|_n||_t|�|_|dk|_	|dkr�t
d��||_|dkr�||_nt
|_|dkr�||_||_t|�|_d|j|_d	|_||_d
|j|jk�r�|dk�r�|dk�r�|dk�r�|j|jk�r8dt|j�|_nHt|j�dk�rfdtj|j�t|j�f|_nd
t|j�t|j�f|_|j�r�d|jd|_ytj|j�|_Wntk
�r�d|_YnXdS)Nr�c3s|]}|�kr|VqdS)Nrw)r�r�)�excludeCharsrwrxr�7
sz Word.__init__.<locals>.<genexpr>c3s|]}|�kr|VqdS)Nrw)r�r�)r�rwrxr�9
srrrzZcannot specify a minimum length < 1; use Optional(Word()) if zero-length word is permittedz	Expected Fr�z[%s]+z%s[%s]*z	[%s][%s]*z\b)r�r/r�r��
initCharsOrigr��	initChars�
bodyCharsOrig�	bodyChars�maxSpecifiedr��minLen�maxLenr�r�r�rar`�	asKeyword�_escapeRegexRangeChars�reStringr�rd�escape�compilerK)r�rr�min�max�exactrr�)rH)r�rxr�4
sT



0
z
Word.__init__Tc
CsD|jr<|jj||�}|s(t|||j|��|j�}||j�fS|||jkrZt|||j|��|}|d7}t|�}|j}||j	}t
||�}x ||kr�|||kr�|d7}q�Wd}	|||jkr�d}	|jr�||kr�|||kr�d}	|j
�r|dk�r||d|k�s||k�r|||k�rd}	|	�r4t|||j|��||||�fS)NrrFTr)rdr�rra�end�grouprr�rrrrrr)
r�r-r�ror�r�r�Z	bodycharsr�ZthrowExceptionrwrwrxr�j
s6

4zWord.parseImplcstytt|�j�Stk
r"YnX|jdkrndd�}|j|jkr^d||j�||j�f|_nd||j�|_|jS)NcSs$t|�dkr|dd�dS|SdS)N�z...)r�)r�rwrwrx�
charsAsStr�
sz Word.__str__.<locals>.charsAsStrz	W:(%s,%s)zW:(%s))r�r/r�rKrUr�r)r�r)rHrwrxr��
s
zWord.__str__)NrrrrFN)T)r�r�r�r�r�r�r�r�rwrw)rHrxr/
s.6
#csFeZdZdZeejd��Zd�fdd�	Zddd�Z	�fd	d
�Z
�ZS)
r'a�
    Token for matching strings that match a given regular expression.
    Defined with string specifying the regular expression in a form recognized by the inbuilt Python re module.
    If the given regex contains named groups (defined using C{(?P<name>...)}), these will be preserved as 
    named parse results.

    Example::
        realnum = Regex(r"[+-]?\d+\.\d*")
        date = Regex(r'(?P<year>\d{4})-(?P<month>\d\d?)-(?P<day>\d\d?)')
        # ref: http://stackoverflow.com/questions/267399/how-do-you-match-only-valid-roman-numerals-with-a-regular-expression
        roman = Regex(r"M{0,4}(CM|CD|D?C{0,3})(XC|XL|L?X{0,3})(IX|IV|V?I{0,3})")
    z[A-Z]rcs�tt|�j�t|t�r�|s,tjdtdd�||_||_	yt
j|j|j	�|_
|j|_Wq�t
jk
r�tjd|tdd��Yq�Xn2t|tj�r�||_
t|�|_|_||_	ntd��t|�|_d|j|_d|_d|_d	S)
z�The parameters C{pattern} and C{flags} are passed to the C{re.compile()} function as-is. See the Python C{re} module for an explanation of the acceptable patterns and flags.z0null string passed to Regex; use Empty() insteadrq)r�z$invalid pattern (%s) passed to RegexzCRegex may only be constructed with a string or a compiled RE objectz	Expected FTN)r�r'r�rzr�r�r�r��pattern�flagsrdr
r�
sre_constants�error�compiledREtyper{r�r�r�rar`r[)r�rr)rHrwrxr��
s.





zRegex.__init__TcCsd|jj||�}|s"t|||j|��|j�}|j�}t|j��}|r\x|D]}||||<qHW||fS)N)rdr�rrar�	groupdictr"r)r�r-r�ror��dr�r�rwrwrxr��
s
zRegex.parseImplcsDytt|�j�Stk
r"YnX|jdkr>dt|j�|_|jS)NzRe:(%s))r�r'r�rKrUr�r)r�)rHrwrxr��
s
z
Regex.__str__)r)T)r�r�r�r�r�rdr
rr�r�r�r�rwrw)rHrxr'�
s
"

cs8eZdZdZd�fdd�	Zddd�Z�fd	d
�Z�ZS)
r%a�
    Token for matching strings that are delimited by quoting characters.
    
    Defined with the following parameters:
        - quoteChar - string of one or more characters defining the quote delimiting string
        - escChar - character to escape quotes, typically backslash (default=C{None})
        - escQuote - special quote sequence to escape an embedded quote string (such as SQL's "" to escape an embedded ") (default=C{None})
        - multiline - boolean indicating whether quotes can span multiple lines (default=C{False})
        - unquoteResults - boolean indicating whether the matched text should be unquoted (default=C{True})
        - endQuoteChar - string of one or more characters defining the end of the quote delimited string (default=C{None} => same as quoteChar)
        - convertWhitespaceEscapes - convert escaped whitespace (C{'\t'}, C{'\n'}, etc.) to actual whitespace (default=C{True})

    Example::
        qs = QuotedString('"')
        print(qs.searchString('lsjdf "This is the quote" sldjf'))
        complex_qs = QuotedString('{{', endQuoteChar='}}')
        print(complex_qs.searchString('lsjdf {{This is the "quote"}} sldjf'))
        sql_qs = QuotedString('"', escQuote='""')
        print(sql_qs.searchString('lsjdf "This is the quote with ""embedded"" quotes" sldjf'))
    prints::
        [['This is the quote']]
        [['This is the "quote"']]
        [['This is the quote with "embedded" quotes']]
    NFTcsNtt��j�|j�}|s0tjdtdd�t��|dkr>|}n"|j�}|s`tjdtdd�t��|�_t	|��_
|d�_|�_t	|��_
|�_|�_|�_|�_|r�tjtjB�_dtj�j�t�jd�|dk	r�t|�p�df�_n<d�_dtj�j�t�jd�|dk	�rt|��pdf�_t	�j�d	k�rp�jd
dj�fdd
�tt	�j�d	dd�D��d7_|�r��jdtj|�7_|�r��jdtj|�7_tj�j�d�_�jdtj�j�7_ytj�j�j��_�j�_Wn0tjk
�r&tjd�jtdd��YnXt ���_!d�j!�_"d�_#d�_$dS)Nz$quoteChar cannot be the empty stringrq)r�z'endQuoteChar cannot be the empty stringrz%s(?:[^%s%s]r�z%s(?:[^%s\n\r%s]rrz|(?:z)|(?:c3s4|],}dtj�jd|��t�j|�fVqdS)z%s[^%s]N)rdr	�endQuoteCharr)r�r�)r�rwrxr�/sz(QuotedString.__init__.<locals>.<genexpr>�)z|(?:%s)z|(?:%s.)z(.)z)*%sz$invalid pattern (%s) passed to Regexz	Expected FTrs)%r�r%r�r�r�r�r��SyntaxError�	quoteCharr��quoteCharLen�firstQuoteCharr�endQuoteCharLen�escChar�escQuote�unquoteResults�convertWhitespaceEscapesrd�	MULTILINE�DOTALLrr	rrr�r��escCharReplacePatternr
rrrr�r�rar`r[)r�rr r!Z	multiliner"rr#)rH)r�rxr�sf




6

zQuotedString.__init__c	Cs�|||jkr|jj||�pd}|s4t|||j|��|j�}|j�}|jr�||j|j	�}t
|t�r�d|kr�|jr�ddddd�}x |j
�D]\}}|j||�}q�W|jr�tj|jd|�}|jr�|j|j|j�}||fS)N�\�	r��
)z\tz\nz\fz\rz\g<1>)rrdr�rrarrr"rrrzr�r#r�r�r r�r&r!r)	r�r-r�ror�r�Zws_mapZwslitZwscharrwrwrxr�Gs( 
zQuotedString.parseImplcsFytt|�j�Stk
r"YnX|jdkr@d|j|jf|_|jS)Nz.quoted string, starting with %s ending with %s)r�r%r�rKrUrr)r�)rHrwrxr�js
zQuotedString.__str__)NNFTNT)T)r�r�r�r�r�r�r�r�rwrw)rHrxr%�
sA
#cs8eZdZdZd�fdd�	Zddd�Z�fd	d
�Z�ZS)
r	a�
    Token for matching words composed of characters I{not} in a given set (will
    include whitespace in matched characters if not listed in the provided exclusion set - see example).
    Defined with string containing all disallowed characters, and an optional
    minimum, maximum, and/or exact length.  The default value for C{min} is 1 (a
    minimum value < 1 is not valid); the default values for C{max} and C{exact}
    are 0, meaning no maximum or exact length restriction.

    Example::
        # define a comma-separated-value as anything that is not a ','
        csv_value = CharsNotIn(',')
        print(delimitedList(csv_value).parseString("dkls,lsdkjf,s12 34,@!#,213"))
    prints::
        ['dkls', 'lsdkjf', 's12 34', '@!#', '213']
    rrrcs�tt|�j�d|_||_|dkr*td��||_|dkr@||_nt|_|dkrZ||_||_t	|�|_
d|j
|_|jdk|_d|_
dS)NFrrzfcannot specify a minimum length < 1; use Optional(CharsNotIn()) if zero-length char group is permittedrz	Expected )r�r	r�rX�notCharsr�rrr�r�r�rar[r`)r�r+rrr
)rHrwrxr��s 
zCharsNotIn.__init__TcCs�|||jkrt|||j|��|}|d7}|j}t||jt|��}x ||krd|||krd|d7}qFW|||jkr�t|||j|��||||�fS)Nrr)r+rrarrr�r)r�r-r�ror�Znotchars�maxlenrwrwrxr��s
zCharsNotIn.parseImplcsdytt|�j�Stk
r"YnX|jdkr^t|j�dkrRd|jdd�|_nd|j|_|jS)Nrz
!W:(%s...)z!W:(%s))r�r	r�rKrUr�r+)r�)rHrwrxr��s
zCharsNotIn.__str__)rrrr)T)r�r�r�r�r�r�r�r�rwrw)rHrxr	vs
cs<eZdZdZdddddd�Zd�fdd�	Zddd�Z�ZS)r.a�
    Special matching class for matching whitespace.  Normally, whitespace is ignored
    by pyparsing grammars.  This class is included when some whitespace structures
    are significant.  Define with a string containing the whitespace characters to be
    matched; default is C{" \t\r\n"}.  Also takes optional C{min}, C{max}, and C{exact} arguments,
    as defined for the C{L{Word}} class.
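
    Example (an illustrative sketch added here, assuming the usual top-level
    pyparsing names)::

        from pyparsing import White, Word, alphas

        indent = White(" ")                      # match the leading run of spaces
        line = indent + Word(alphas)
        print(line.parseString("    indented"))  # -> ['    ', 'indented']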
    z<SPC>z<TAB>z<LF>z<CR>z<FF>)r�r(rr*r)� 	
rrrcs�tt��j�|�_�jdj�fdd��jD���djdd��jD���_d�_d�j�_	|�_
|dkrt|�_nt�_|dkr�|�_|�_
dS)Nr�c3s|]}|�jkr|VqdS)N)�
matchWhite)r�r�)r�rwrxr��sz!White.__init__.<locals>.<genexpr>css|]}tj|VqdS)N)r.�	whiteStrs)r�r�rwrwrxr��sTz	Expected r)
r�r.r�r.r�r�rYr�r[rarrr�)r�Zwsrrr
)rH)r�rxr��s zWhite.__init__TcCs�|||jkrt|||j|��|}|d7}||j}t|t|��}x"||krd|||jkrd|d7}qDW|||jkr�t|||j|��||||�fS)Nrr)r.rrarrr�r)r�r-r�ror�r�rwrwrxr��s
zWhite.parseImpl)r-rrrr)T)r�r�r�r�r/r�r�r�rwrw)rHrxr.�scseZdZ�fdd�Z�ZS)�_PositionTokencs(tt|�j�|jj|_d|_d|_dS)NTF)r�r0r�rHr�r�r[r`)r�)rHrwrxr��s
z_PositionToken.__init__)r�r�r�r�r�rwrw)rHrxr0�sr0cs2eZdZdZ�fdd�Zdd�Zd	dd�Z�ZS)
rzb
    Token to advance to a specific column of input text; useful for tabular report scraping.
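
    Example (an illustrative sketch added here, assuming the usual top-level
    pyparsing names; the data field always begins at column 11)::

        from pyparsing import GoToColumn, Word, alphas, restOfLine

        report_line = Word(alphas) + GoToColumn(11).suppress() + restOfLine
        print(report_line.parseString("Name      Alice Smith"))  # -> ['Name', 'Alice Smith']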
    cstt|�j�||_dS)N)r�rr�r9)r��colno)rHrwrxr��szGoToColumn.__init__cCs`t||�|jkr\t|�}|jr*|j||�}x0||krZ||j�rZt||�|jkrZ|d7}q,W|S)Nrr)r9r�r]r��isspace)r�r-r�r�rwrwrxr��s&zGoToColumn.preParseTcCsDt||�}||jkr"t||d|��||j|}|||�}||fS)NzText not in expected column)r9r)r�r-r�roZthiscolZnewlocr�rwrwrxr�s

zGoToColumn.parseImpl)T)r�r�r�r�r�r�r�r�rwrw)rHrxr�s	cs*eZdZdZ�fdd�Zddd�Z�ZS)ra�
    Matches if current position is at the beginning of a line within the parse string
    
    Example::
    
        test = '''        AAA this line
        AAA and this line
          AAA but not this one
        B AAA and definitely not this one
        '''

        for t in (LineStart() + 'AAA' + restOfLine).searchString(test):
            print(t)
    
    Prints::
        ['AAA', ' this line']
        ['AAA', ' and this line']    

    cstt|�j�d|_dS)NzExpected start of line)r�rr�ra)r�)rHrwrxr�&szLineStart.__init__TcCs*t||�dkr|gfSt|||j|��dS)Nrr)r9rra)r�r-r�rorwrwrxr�*szLineStart.parseImpl)T)r�r�r�r�r�r�r�rwrw)rHrxrscs*eZdZdZ�fdd�Zddd�Z�ZS)rzU
    Matches if current position is at the end of a line within the parse string
    cs,tt|�j�|jtjjdd��d|_dS)Nrr�zExpected end of line)r�rr�r�r$rNr�ra)r�)rHrwrxr�3szLineEnd.__init__TcCsb|t|�kr6||dkr$|ddfSt|||j|��n(|t|�krN|dgfSt|||j|��dS)Nrrr)r�rra)r�r-r�rorwrwrxr�8szLineEnd.parseImpl)T)r�r�r�r�r�r�r�rwrw)rHrxr/scs*eZdZdZ�fdd�Zddd�Z�ZS)r*zM
    Matches if current position is at the beginning of the parse string
    cstt|�j�d|_dS)NzExpected start of text)r�r*r�ra)r�)rHrwrxr�GszStringStart.__init__TcCs0|dkr(||j|d�kr(t|||j|��|gfS)Nr)r�rra)r�r-r�rorwrwrxr�KszStringStart.parseImpl)T)r�r�r�r�r�r�r�rwrw)rHrxr*Cscs*eZdZdZ�fdd�Zddd�Z�ZS)r)zG
    Matches if current position is at the end of the parse string
    cstt|�j�d|_dS)NzExpected end of text)r�r)r�ra)r�)rHrwrxr�VszStringEnd.__init__TcCs^|t|�krt|||j|��n<|t|�kr6|dgfS|t|�krJ|gfSt|||j|��dS)Nrr)r�rra)r�r-r�rorwrwrxr�ZszStringEnd.parseImpl)T)r�r�r�r�r�r�r�rwrw)rHrxr)Rscs.eZdZdZef�fdd�	Zddd�Z�ZS)r1ap
    Matches if the current position is at the beginning of a Word, and
    is not preceded by any character in a given set of C{wordChars}
    (default=C{printables}). To emulate the C{\b} behavior of regular expressions,
    use C{WordStart(alphanums)}. C{WordStart} will also match at the beginning of
    the string being parsed, or at the beginning of a line.
    cs"tt|�j�t|�|_d|_dS)NzNot at the start of a word)r�r1r�r��	wordCharsra)r�r3)rHrwrxr�ls
zWordStart.__init__TcCs@|dkr8||d|jks(|||jkr8t|||j|��|gfS)Nrrr)r3rra)r�r-r�rorwrwrxr�qs
zWordStart.parseImpl)T)r�r�r�r�rVr�r�r�rwrw)rHrxr1dscs.eZdZdZef�fdd�	Zddd�Z�ZS)r0aZ
    Matches if the current position is at the end of a Word, and
    is not followed by any character in a given set of C{wordChars}
    (default=C{printables}). To emulate the C{\b} behavior of regular expressions,
    use C{WordEnd(alphanums)}. C{WordEnd} will also match at the end of
    the string being parsed, or at the end of a line.
    cs(tt|�j�t|�|_d|_d|_dS)NFzNot at the end of a word)r�r0r�r�r3rXra)r�r3)rHrwrxr��s
zWordEnd.__init__TcCsPt|�}|dkrH||krH|||jks8||d|jkrHt|||j|��|gfS)Nrrr)r�r3rra)r�r-r�ror�rwrwrxr��szWordEnd.parseImpl)T)r�r�r�r�rVr�r�r�rwrw)rHrxr0xscs�eZdZdZd�fdd�	Zdd�Zdd�Zd	d
�Z�fdd�Z�fd
d�Z	�fdd�Z
d�fdd�	Zgfdd�Z�fdd�Z
�ZS)r z^
    Abstract subclass of ParserElement, for combining and post-processing parsed tokens.
    Fcs�tt|�j|�t|t�r"t|�}t|t�r<tj|�g|_	njt|t
j�rzt|�}tdd�|D��rnt
tj|�}t|�|_	n,yt|�|_	Wntk
r�|g|_	YnXd|_dS)Ncss|]}t|t�VqdS)N)rzr�)r�r.rwrwrxr��sz+ParseExpression.__init__.<locals>.<genexpr>F)r�r r�rzr�r�r�r$rQ�exprsr��Iterable�allrvr�re)r�r4rg)rHrwrxr��s

zParseExpression.__init__cCs
|j|S)N)r4)r�r�rwrwrxr��szParseExpression.__getitem__cCs|jj|�d|_|S)N)r4r�rU)r�r�rwrwrxr��szParseExpression.appendcCs4d|_dd�|jD�|_x|jD]}|j�q W|S)z~Extends C{leaveWhitespace} defined in base class, and also invokes C{leaveWhitespace} on
           all contained expressions.FcSsg|]}|j��qSrw)r�)r�r�rwrwrxr��sz3ParseExpression.leaveWhitespace.<locals>.<listcomp>)rXr4r�)r�r�rwrwrxr��s
zParseExpression.leaveWhitespacecszt|t�rF||jkrvtt|�j|�xP|jD]}|j|jd�q,Wn0tt|�j|�x|jD]}|j|jd�q^W|S)Nrrrsrs)rzr+r]r�r r�r4)r�r�r�)rHrwrxr��s

zParseExpression.ignorecsLytt|�j�Stk
r"YnX|jdkrFd|jjt|j�f|_|jS)Nz%s:(%s))	r�r r�rKrUrHr�r�r4)r�)rHrwrxr��s
zParseExpression.__str__cs0tt|�j�x|jD]}|j�qWt|j�dk�r|jd}t||j�r�|jr�|jdkr�|j	r�|jdd�|jdg|_d|_
|j|jO_|j|jO_|jd}t||j�o�|jo�|jdko�|j	�r|jdd�|jdd�|_d|_
|j|jO_|j|jO_dt
|�|_|S)Nrqrrrz	Expected rsrs)r�r r�r4r�rzrHrSrVr^rUr[r`r�ra)r�r�r�)rHrwrxr��s0




zParseExpression.streamlinecstt|�j||�}|S)N)r�r rm)r�r�rlr�)rHrwrxrm�szParseExpression.setResultsNamecCs:|dd�|g}x|jD]}|j|�qW|jg�dS)N)r4r�r�)r�r��tmpr�rwrwrxr��szParseExpression.validatecs$tt|�j�}dd�|jD�|_|S)NcSsg|]}|j��qSrw)r�)r�r�rwrwrxr��sz(ParseExpression.copy.<locals>.<listcomp>)r�r r�r4)r�r�)rHrwrxr��szParseExpression.copy)F)F)r�r�r�r�r�r�r�r�r�r�r�rmr�r�r�rwrw)rHrxr �s	
"csTeZdZdZGdd�de�Zd�fdd�	Zddd�Zd	d
�Zdd�Z	d
d�Z
�ZS)ra

    Requires all given C{ParseExpression}s to be found in the given order.
    Expressions may be separated by whitespace.
    May be constructed using the C{'+'} operator.
    May also be constructed using the C{'-'} operator, which will suppress backtracking.

    Example::
        integer = Word(nums)
        name_expr = OneOrMore(Word(alphas))

        expr = And([integer("id"),name_expr("name"),integer("age")])
        # more easily written as:
        expr = integer("id") + name_expr("name") + integer("age")
    cseZdZ�fdd�Z�ZS)zAnd._ErrorStopcs&ttj|�j||�d|_|j�dS)N�-)r�rr�r�r�r�)r�r�r�)rHrwrxr�
szAnd._ErrorStop.__init__)r�r�r�r�r�rwrw)rHrxr�
sr�TcsRtt|�j||�tdd�|jD��|_|j|jdj�|jdj|_d|_	dS)Ncss|]}|jVqdS)N)r[)r�r�rwrwrxr�
szAnd.__init__.<locals>.<genexpr>rT)
r�rr�r6r4r[r�rYrXre)r�r4rg)rHrwrxr�
s
zAnd.__init__c	Cs|jdj|||dd�\}}d}x�|jdd�D]�}t|tj�rFd}q0|r�y|j|||�\}}Wq�tk
rv�Yq�tk
r�}zd|_tj|��WYdd}~Xq�t	k
r�t|t
|�|j|��Yq�Xn|j|||�\}}|s�|j�r0||7}q0W||fS)NrF)rprrT)
r4rtrzrr�r#r�
__traceback__r�r�r�rar�)	r�r-r�ro�
resultlistZ	errorStopr�Z
exprtokensr�rwrwrxr�
s(z
And.parseImplcCst|t�rtj|�}|j|�S)N)rzr�r$rQr�)r�r�rwrwrxr5
s

zAnd.__iadd__cCs8|dd�|g}x |jD]}|j|�|jsPqWdS)N)r4r�r[)r�r��subRecCheckListr�rwrwrxr�:
s

zAnd.checkRecursioncCs@t|d�r|jS|jdkr:ddjdd�|jD��d|_|jS)Nr��{r�css|]}t|�VqdS)N)r�)r�r�rwrwrxr�F
szAnd.__str__.<locals>.<genexpr>�})r�r�rUr�r4)r�rwrwrxr�A
s


 zAnd.__str__)T)T)r�r�r�r�r
r�r�r�rr�r�r�rwrw)rHrxr�s
csDeZdZdZd�fdd�	Zddd�Zdd	�Zd
d�Zdd
�Z�Z	S)ra�
    Requires that at least one C{ParseExpression} is found.
    If two expressions match, the expression that matches the longest string will be used.
    May be constructed using the C{'^'} operator.

    Example::
        # construct Or using '^' operator
        
        number = Word(nums) ^ Combine(Word(nums) + '.' + Word(nums))
        print(number.searchString("123 3.1416 789"))
    prints::
        [['123'], ['3.1416'], ['789']]
    Fcs:tt|�j||�|jr0tdd�|jD��|_nd|_dS)Ncss|]}|jVqdS)N)r[)r�r�rwrwrxr�\
szOr.__init__.<locals>.<genexpr>T)r�rr�r4rr[)r�r4rg)rHrwrxr�Y
szOr.__init__TcCsTd}d}g}x�|jD]�}y|j||�}Wnvtk
rd}	z d|	_|	j|krT|	}|	j}WYdd}	~	Xqtk
r�t|�|kr�t|t|�|j|�}t|�}YqX|j||f�qW|�r*|j	dd�d�x`|D]X\}
}y|j
|||�Stk
�r$}	z"d|	_|	j|k�r|	}|	j}WYdd}	~	Xq�Xq�W|dk	�rB|j|_|�nt||d|��dS)NrrcSs
|dS)Nrrw)�xrwrwrxryu
szOr.parseImpl.<locals>.<lambda>)r�z no defined alternatives to matchrs)r4r�rr9r�r�r�rar��sortrtr�)r�r-r�ro�	maxExcLoc�maxExceptionr�r�Zloc2r��_rwrwrxr�`
s<

zOr.parseImplcCst|t�rtj|�}|j|�S)N)rzr�r$rQr�)r�r�rwrwrx�__ixor__�
s

zOr.__ixor__cCs@t|d�r|jS|jdkr:ddjdd�|jD��d|_|jS)Nr�r<z ^ css|]}t|�VqdS)N)r�)r�r�rwrwrxr��
szOr.__str__.<locals>.<genexpr>r=)r�r�rUr�r4)r�rwrwrxr��
s


 z
Or.__str__cCs0|dd�|g}x|jD]}|j|�qWdS)N)r4r�)r�r�r;r�rwrwrxr��
szOr.checkRecursion)F)T)
r�r�r�r�r�r�rCr�r�r�rwrw)rHrxrK
s

&	csDeZdZdZd�fdd�	Zddd�Zdd	�Zd
d�Zdd
�Z�Z	S)ra�
    Requires that at least one C{ParseExpression} is found.
    If two expressions match, the first one listed is the one that will match.
    May be constructed using the C{'|'} operator.

    Example::
        # construct MatchFirst using '|' operator
        
        # watch the order of expressions to match
        number = Word(nums) | Combine(Word(nums) + '.' + Word(nums))
        print(number.searchString("123 3.1416 789")) #  Fail! -> [['123'], ['3'], ['1416'], ['789']]

        # put more selective expression first
        number = Combine(Word(nums) + '.' + Word(nums)) | Word(nums)
        print(number.searchString("123 3.1416 789")) #  Better -> [['123'], ['3.1416'], ['789']]
    Fcs:tt|�j||�|jr0tdd�|jD��|_nd|_dS)Ncss|]}|jVqdS)N)r[)r�r�rwrwrxr��
sz&MatchFirst.__init__.<locals>.<genexpr>T)r�rr�r4rr[)r�r4rg)rHrwrxr��
szMatchFirst.__init__Tc	Cs�d}d}x�|jD]�}y|j|||�}|Stk
r\}z|j|krL|}|j}WYdd}~Xqtk
r�t|�|kr�t|t|�|j|�}t|�}YqXqW|dk	r�|j|_|�nt||d|��dS)Nrrz no defined alternatives to matchrs)r4rtrr�r�r�rar�)	r�r-r�ror@rAr�r�r�rwrwrxr��
s$
zMatchFirst.parseImplcCst|t�rtj|�}|j|�S)N)rzr�r$rQr�)r�r�rwrwrx�__ior__�
s

zMatchFirst.__ior__cCs@t|d�r|jS|jdkr:ddjdd�|jD��d|_|jS)Nr�r<z | css|]}t|�VqdS)N)r�)r�r�rwrwrxr��
sz%MatchFirst.__str__.<locals>.<genexpr>r=)r�r�rUr�r4)r�rwrwrxr��
s


 zMatchFirst.__str__cCs0|dd�|g}x|jD]}|j|�qWdS)N)r4r�)r�r�r;r�rwrwrxr��
szMatchFirst.checkRecursion)F)T)
r�r�r�r�r�r�rDr�r�r�rwrw)rHrxr�
s
	cs<eZdZdZd�fdd�	Zddd�Zdd�Zd	d
�Z�ZS)
ram
    Requires all given C{ParseExpression}s to be found, but in any order.
    Expressions may be separated by whitespace.
    May be constructed using the C{'&'} operator.

    Example::
        color = oneOf("RED ORANGE YELLOW GREEN BLUE PURPLE BLACK WHITE BROWN")
        shape_type = oneOf("SQUARE CIRCLE TRIANGLE STAR HEXAGON OCTAGON")
        integer = Word(nums)
        shape_attr = "shape:" + shape_type("shape")
        posn_attr = "posn:" + Group(integer("x") + ',' + integer("y"))("posn")
        color_attr = "color:" + color("color")
        size_attr = "size:" + integer("size")

        # use Each (using operator '&') to accept attributes in any order 
        # (shape and posn are required, color and size are optional)
        shape_spec = shape_attr & posn_attr & Optional(color_attr) & Optional(size_attr)

        shape_spec.runTests('''
            shape: SQUARE color: BLACK posn: 100, 120
            shape: CIRCLE size: 50 color: BLUE posn: 50,80
            color:GREEN size:20 shape:TRIANGLE posn:20,40
            '''
            )
    prints::
        shape: SQUARE color: BLACK posn: 100, 120
        ['shape:', 'SQUARE', 'color:', 'BLACK', 'posn:', ['100', ',', '120']]
        - color: BLACK
        - posn: ['100', ',', '120']
          - x: 100
          - y: 120
        - shape: SQUARE


        shape: CIRCLE size: 50 color: BLUE posn: 50,80
        ['shape:', 'CIRCLE', 'size:', '50', 'color:', 'BLUE', 'posn:', ['50', ',', '80']]
        - color: BLUE
        - posn: ['50', ',', '80']
          - x: 50
          - y: 80
        - shape: CIRCLE
        - size: 50


        color: GREEN size: 20 shape: TRIANGLE posn: 20,40
        ['color:', 'GREEN', 'size:', '20', 'shape:', 'TRIANGLE', 'posn:', ['20', ',', '40']]
        - color: GREEN
        - posn: ['20', ',', '40']
          - x: 20
          - y: 40
        - shape: TRIANGLE
        - size: 20
    Tcs8tt|�j||�tdd�|jD��|_d|_d|_dS)Ncss|]}|jVqdS)N)r[)r�r�rwrwrxr�sz Each.__init__.<locals>.<genexpr>T)r�rr�r6r4r[rX�initExprGroups)r�r4rg)rHrwrxr�sz
Each.__init__c
s�|jr�tdd�|jD��|_dd�|jD�}dd�|jD�}|||_dd�|jD�|_dd�|jD�|_dd�|jD�|_|j|j7_d	|_|}|jdd�}|jdd��g}d
}	x�|	�rp|�|j|j}
g}x~|
D]v}y|j||�}Wn t	k
�r|j
|�Yq�X|j
|jjt|�|��||k�rD|j
|�q�|�kr�j
|�q�Wt|�t|
�kr�d	}	q�W|�r�djdd�|D��}
t	||d
|
��|�fdd�|jD�7}g}x*|D]"}|j|||�\}}|j
|��q�Wt|tg��}||fS)Ncss&|]}t|t�rt|j�|fVqdS)N)rzrr�r.)r�r�rwrwrxr�sz!Each.parseImpl.<locals>.<genexpr>cSsg|]}t|t�r|j�qSrw)rzrr.)r�r�rwrwrxr�sz"Each.parseImpl.<locals>.<listcomp>cSs"g|]}|jrt|t�r|�qSrw)r[rzr)r�r�rwrwrxr�scSsg|]}t|t�r|j�qSrw)rzr2r.)r�r�rwrwrxr� scSsg|]}t|t�r|j�qSrw)rzrr.)r�r�rwrwrxr�!scSs g|]}t|tttf�s|�qSrw)rzrr2r)r�r�rwrwrxr�"sFTz, css|]}t|�VqdS)N)r�)r�r�rwrwrxr�=sz*Missing one or more required elements (%s)cs$g|]}t|t�r|j�kr|�qSrw)rzrr.)r�r�)�tmpOptrwrxr�As)rEr�r4Zopt1mapZ	optionalsZmultioptionalsZ
multirequiredZrequiredr�rr�r�r��remover�r�rt�sumr")r�r-r�roZopt1Zopt2ZtmpLocZtmpReqdZ
matchOrderZkeepMatchingZtmpExprsZfailedr�Zmissingr:r�ZfinalResultsrw)rFrxr�sP



zEach.parseImplcCs@t|d�r|jS|jdkr:ddjdd�|jD��d|_|jS)Nr�r<z & css|]}t|�VqdS)N)r�)r�r�rwrwrxr�PszEach.__str__.<locals>.<genexpr>r=)r�r�rUr�r4)r�rwrwrxr�Ks


 zEach.__str__cCs0|dd�|g}x|jD]}|j|�qWdS)N)r4r�)r�r�r;r�rwrwrxr�TszEach.checkRecursion)T)T)	r�r�r�r�r�r�r�r�r�rwrw)rHrxr�
s
5
1	csleZdZdZd�fdd�	Zddd�Zdd	�Z�fd
d�Z�fdd
�Zdd�Z	gfdd�Z
�fdd�Z�ZS)rza
    Abstract subclass of C{ParserElement}, for combining and post-processing parsed tokens.
    Fcs�tt|�j|�t|t�r@ttjt�r2tj|�}ntjt	|��}||_
d|_|dk	r�|j|_|j
|_
|j|j�|j|_|j|_|j|_|jj|j�dS)N)r�rr�rzr��
issubclassr$rQr,rr.rUr`r[r�rYrXrWrer]r�)r�r.rg)rHrwrxr�^s
zParseElementEnhance.__init__TcCs2|jdk	r|jj|||dd�Std||j|��dS)NF)rpr�)r.rtrra)r�r-r�rorwrwrxr�ps
zParseElementEnhance.parseImplcCs*d|_|jj�|_|jdk	r&|jj�|S)NF)rXr.r�r�)r�rwrwrxr�vs


z#ParseElementEnhance.leaveWhitespacecsrt|t�rB||jkrntt|�j|�|jdk	rn|jj|jd�n,tt|�j|�|jdk	rn|jj|jd�|S)Nrrrsrs)rzr+r]r�rr�r.)r�r�)rHrwrxr�}s



zParseElementEnhance.ignorecs&tt|�j�|jdk	r"|jj�|S)N)r�rr�r.)r�)rHrwrxr��s

zParseElementEnhance.streamlinecCsB||krt||g��|dd�|g}|jdk	r>|jj|�dS)N)r&r.r�)r�r�r;rwrwrxr��s

z"ParseElementEnhance.checkRecursioncCs6|dd�|g}|jdk	r(|jj|�|jg�dS)N)r.r�r�)r�r�r7rwrwrxr��s
zParseElementEnhance.validatecsVytt|�j�Stk
r"YnX|jdkrP|jdk	rPd|jjt|j�f|_|jS)Nz%s:(%s))	r�rr�rKrUr.rHr�r�)r�)rHrwrxr��szParseElementEnhance.__str__)F)T)
r�r�r�r�r�r�r�r�r�r�r�r�r�rwrw)rHrxrZs
cs*eZdZdZ�fdd�Zddd�Z�ZS)ra�
    Lookahead matching of the given parse expression.  C{FollowedBy}
    does I{not} advance the parsing position within the input string, it only
    verifies that the specified parse expression matches at the current
    position.  C{FollowedBy} always returns a null token list.

    Example::
        # use FollowedBy to match a label only if it is followed by a ':'
        data_word = Word(alphas)
        label = data_word + FollowedBy(':')
        attr_expr = Group(label + Suppress(':') + OneOrMore(data_word, stopOn=label).setParseAction(' '.join))
        
        OneOrMore(attr_expr).parseString("shape: SQUARE color: BLACK posn: upper left").pprint()
    prints::
        [['shape', 'SQUARE'], ['color', 'BLACK'], ['posn', 'upper left']]
    cstt|�j|�d|_dS)NT)r�rr�r[)r�r.)rHrwrxr��szFollowedBy.__init__TcCs|jj||�|gfS)N)r.r�)r�r-r�rorwrwrxr��szFollowedBy.parseImpl)T)r�r�r�r�r�r�r�rwrw)rHrxr�scs2eZdZdZ�fdd�Zd	dd�Zdd�Z�ZS)
ra�
    Lookahead to disallow matching with the given parse expression.  C{NotAny}
    does I{not} advance the parsing position within the input string, it only
    verifies that the specified parse expression does I{not} match at the current
    position.  Also, C{NotAny} does I{not} skip over leading whitespace. C{NotAny}
    always returns a null token list.  May be constructed using the '~' operator.

    Example::
        
    cs0tt|�j|�d|_d|_dt|j�|_dS)NFTzFound unwanted token, )r�rr�rXr[r�r.ra)r�r.)rHrwrxr��szNotAny.__init__TcCs&|jj||�rt|||j|��|gfS)N)r.r�rra)r�r-r�rorwrwrxr��szNotAny.parseImplcCs4t|d�r|jS|jdkr.dt|j�d|_|jS)Nr�z~{r=)r�r�rUr�r.)r�rwrwrxr��s


zNotAny.__str__)T)r�r�r�r�r�r�r�r�rwrw)rHrxr�s

cs(eZdZd�fdd�	Zddd�Z�ZS)	�_MultipleMatchNcsFtt|�j|�d|_|}t|t�r.tj|�}|dk	r<|nd|_dS)NT)	r�rJr�rWrzr�r$rQ�	not_ender)r�r.�stopOnZender)rHrwrxr��s

z_MultipleMatch.__init__TcCs�|jj}|j}|jdk	}|r$|jj}|r2|||�||||dd�\}}yZ|j}	xJ|rb|||�|	rr|||�}
n|}
|||
|�\}}|s�|j�rT||7}qTWWnttfk
r�YnX||fS)NF)rp)	r.rtr�rKr�r]r�rr�)r�r-r�roZself_expr_parseZself_skip_ignorablesZcheck_enderZ
try_not_enderr�ZhasIgnoreExprsr�Z	tmptokensrwrwrxr��s,



z_MultipleMatch.parseImpl)N)T)r�r�r�r�r�r�rwrw)rHrxrJ�srJc@seZdZdZdd�ZdS)ra�
    Repetition of one or more of the given expression.
    
    Parameters:
     - expr - expression that must match one or more times
     - stopOn - (default=C{None}) - expression for a terminating sentinel
          (only required if the sentinel would ordinarily match the repetition 
          expression)          

    Example::
        data_word = Word(alphas)
        label = data_word + FollowedBy(':')
        attr_expr = Group(label + Suppress(':') + OneOrMore(data_word).setParseAction(' '.join))

        text = "shape: SQUARE posn: upper left color: BLACK"
        OneOrMore(attr_expr).parseString(text).pprint()  # Fail! read 'color' as data instead of next label -> [['shape', 'SQUARE color']]

        # use stopOn attribute for OneOrMore to avoid reading label string as part of the data
        attr_expr = Group(label + Suppress(':') + OneOrMore(data_word, stopOn=label).setParseAction(' '.join))
        OneOrMore(attr_expr).parseString(text).pprint() # Better -> [['shape', 'SQUARE'], ['posn', 'upper left'], ['color', 'BLACK']]
        
        # could also be written as
        (attr_expr * (1,)).parseString(text).pprint()
    cCs4t|d�r|jS|jdkr.dt|j�d|_|jS)Nr�r<z}...)r�r�rUr�r.)r�rwrwrxr�!s


zOneOrMore.__str__N)r�r�r�r�r�rwrwrwrxrscs8eZdZdZd
�fdd�	Zd�fdd�	Zdd	�Z�ZS)r2aw
    Optional repetition of zero or more of the given expression.
    
    Parameters:
     - expr - expression that must match zero or more times
     - stopOn - (default=C{None}) - expression for a terminating sentinel
          (only required if the sentinel would ordinarily match the repetition 
          expression)          

    Example: similar to L{OneOrMore}
    Ncstt|�j||d�d|_dS)N)rLT)r�r2r�r[)r�r.rL)rHrwrxr�6szZeroOrMore.__init__Tcs6ytt|�j|||�Sttfk
r0|gfSXdS)N)r�r2r�rr�)r�r-r�ro)rHrwrxr�:szZeroOrMore.parseImplcCs4t|d�r|jS|jdkr.dt|j�d|_|jS)Nr�rz]...)r�r�rUr�r.)r�rwrwrxr�@s


zZeroOrMore.__str__)N)T)r�r�r�r�r�r�r�r�rwrw)rHrxr2*sc@s eZdZdd�ZeZdd�ZdS)�
_NullTokencCsdS)NFrw)r�rwrwrxr�Jsz_NullToken.__bool__cCsdS)Nr�rw)r�rwrwrxr�Msz_NullToken.__str__N)r�r�r�r�r'r�rwrwrwrxrMIsrMcs6eZdZdZef�fdd�	Zd	dd�Zdd�Z�ZS)
raa
    Optional matching of the given expression.

    Parameters:
     - expr - expression that must match zero or more times
     - default (optional) - value to be returned if the optional expression is not found.

    Example::
        # US postal code can be a 5-digit zip, plus optional 4-digit qualifier
        zip = Combine(Word(nums, exact=5) + Optional('-' + Word(nums, exact=4)))
        zip.runTests('''
            # traditional ZIP code
            12345
            
            # ZIP+4 form
            12101-0001
            
            # invalid ZIP
            98765-
            ''')
    prints::
        # traditional ZIP code
        12345
        ['12345']

        # ZIP+4 form
        12101-0001
        ['12101-0001']

        # invalid ZIP
        98765-
             ^
        FAIL: Expected end of text (at char 5), (line:1, col:6)
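    An illustrative sketch (assumed usage, not from the original docstring) of the
    C{default} argument, which supplies a stand-in token when the optional
    expression is absent::

        from pyparsing import Word, Optional, nums

        zip_code = Word(nums, exact=5) + Optional('-' + Word(nums, exact=4), default='-0000')
        print(zip_code.parseString("12345"))        # -> ['12345', '-0000']
        print(zip_code.parseString("12101-0001"))   # -> ['12101', '-', '0001']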
    cs.tt|�j|dd�|jj|_||_d|_dS)NF)rgT)r�rr�r.rWr�r[)r�r.r�)rHrwrxr�ts
zOptional.__init__TcCszy|jj|||dd�\}}WnTttfk
rp|jtk	rh|jjr^t|jg�}|j||jj<ql|jg}ng}YnX||fS)NF)rp)r.rtrr�r��_optionalNotMatchedrVr")r�r-r�ror�rwrwrxr�zs


zOptional.parseImplcCs4t|d�r|jS|jdkr.dt|j�d|_|jS)Nr�rr	)r�r�rUr�r.)r�rwrwrxr��s


zOptional.__str__)T)	r�r�r�r�rNr�r�r�r�rwrw)rHrxrQs"
cs,eZdZdZd	�fdd�	Zd
dd�Z�ZS)r(a�	
    Token for skipping over all undefined text until the matched expression is found.

    Parameters:
     - expr - target expression marking the end of the data to be skipped
     - include - (default=C{False}) if True, the target expression is also parsed 
          (the skipped text and target expression are returned as a 2-element list).
     - ignore - (default=C{None}) used to define grammars (typically quoted strings and 
          comments) that might contain false matches to the target expression
     - failOn - (default=C{None}) define expressions that are not allowed to be 
          included in the skipped text; if found before the target expression is found, 
          the SkipTo is not a match

    Example::
        report = '''
            Outstanding Issues Report - 1 Jan 2000

               # | Severity | Description                               |  Days Open
            -----+----------+-------------------------------------------+-----------
             101 | Critical | Intermittent system crash                 |          6
              94 | Cosmetic | Spelling error on Login ('log|n')         |         14
              79 | Minor    | System slow when running too many reports |         47
            '''
        integer = Word(nums)
        SEP = Suppress('|')
        # use SkipTo to simply match everything up until the next SEP
        # - ignore quoted strings, so that a '|' character inside a quoted string does not match
        # - parse action will call token.strip() for each matched token, i.e., the description body
        string_data = SkipTo(SEP, ignore=quotedString)
        string_data.setParseAction(tokenMap(str.strip))
        ticket_expr = (integer("issue_num") + SEP 
                      + string_data("sev") + SEP 
                      + string_data("desc") + SEP 
                      + integer("days_open"))
        
        for tkt in ticket_expr.searchString(report):
            print(tkt.dump())
    prints::
        ['101', 'Critical', 'Intermittent system crash', '6']
        - days_open: 6
        - desc: Intermittent system crash
        - issue_num: 101
        - sev: Critical
        ['94', 'Cosmetic', "Spelling error on Login ('log|n')", '14']
        - days_open: 14
        - desc: Spelling error on Login ('log|n')
        - issue_num: 94
        - sev: Cosmetic
        ['79', 'Minor', 'System slow when running too many reports', '47']
        - days_open: 47
        - desc: System slow when running too many reports
        - issue_num: 79
        - sev: Minor
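    An illustrative sketch (assumed usage) of the C{include} and C{failOn}
    arguments described above::

        from pyparsing import SkipTo, Literal, LineEnd

        # include=True also returns the matched target after the skipped text
        rest_until_semi = SkipTo(Literal(';'), include=True)
        print(rest_until_semi.parseString("abc def;"))   # -> ['abc def', ';']

        # failOn aborts the skip if the disallowed expression is seen first
        field = SkipTo(Literal(';'), failOn=LineEnd())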
    FNcs`tt|�j|�||_d|_d|_||_d|_t|t	�rFt
j|�|_n||_dt
|j�|_dS)NTFzNo match found for )r�r(r��
ignoreExprr[r`�includeMatchr�rzr�r$rQ�failOnr�r.ra)r�r��includer�rQ)rHrwrxr��s
zSkipTo.__init__TcCs,|}t|�}|j}|jj}|jdk	r,|jjnd}|jdk	rB|jjnd}	|}
x�|
|kr�|dk	rh|||
�rhP|	dk	r�x*y|	||
�}
Wqrtk
r�PYqrXqrWy|||
ddd�Wn tt	fk
r�|
d7}
YqLXPqLWt|||j
|��|
}|||�}t|�}|j�r$||||dd�\}}
||
7}||fS)NF)rorprr)rp)
r�r.rtrQr�rOr�rrr�rar"rP)r�r-r�ror0r�r.Z
expr_parseZself_failOn_canParseNextZself_ignoreExpr_tryParseZtmplocZskiptextZ
skipresultr�rwrwrxr��s<

zSkipTo.parseImpl)FNN)T)r�r�r�r�r�r�r�rwrw)rHrxr(�s6
csbeZdZdZd�fdd�	Zdd�Zdd�Zd	d
�Zdd�Zgfd
d�Z	dd�Z
�fdd�Z�ZS)raK
    Forward declaration of an expression to be defined later -
    used for recursive grammars, such as algebraic infix notation.
    When the expression is known, it is assigned to the C{Forward} variable using the '<<' operator.

    Note: take care when assigning to C{Forward} not to overlook precedence of operators.
    Specifically, '|' has a lower precedence than '<<', so that::
        fwdExpr << a | b | c
    will actually be evaluated as::
        (fwdExpr << a) | b | c
    thereby leaving b and c out as parseable alternatives.  It is recommended that you
    explicitly group the values inserted into the C{Forward}::
        fwdExpr << (a | b | c)
    Converting to use the '<<=' operator instead will avoid this problem.

    See L{ParseResults.pprint} for an example of a recursive parser created using
    C{Forward}.
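    An illustrative sketch (not from the original docstring) of a small recursive
    grammar defined with the '<<=' operator::

        from pyparsing import Forward, Word, nums, Suppress, Group, delimitedList

        LPAR, RPAR = map(Suppress, "()")
        expr = Forward()
        atom = Word(nums) | Group(LPAR + expr + RPAR)
        expr <<= delimitedList(atom, delim='+')
        print(expr.parseString("1+(2+3)"))      # -> ['1', ['2', '3']]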
    Ncstt|�j|dd�dS)NF)rg)r�rr�)r�r�)rHrwrxr�szForward.__init__cCsjt|t�rtj|�}||_d|_|jj|_|jj|_|j|jj	�|jj
|_
|jj|_|jj
|jj�|S)N)rzr�r$rQr.rUr`r[r�rYrXrWr]r�)r�r�rwrwrx�
__lshift__s





zForward.__lshift__cCs||>S)Nrw)r�r�rwrwrx�__ilshift__'szForward.__ilshift__cCs
d|_|S)NF)rX)r�rwrwrxr�*szForward.leaveWhitespacecCs$|js d|_|jdk	r |jj�|S)NT)r_r.r�)r�rwrwrxr�.s


zForward.streamlinecCs>||kr0|dd�|g}|jdk	r0|jj|�|jg�dS)N)r.r�r�)r�r�r7rwrwrxr�5s

zForward.validatecCs>t|d�r|jS|jjdSd}Wd|j|_X|jjd|S)Nr�z: ...�Nonez: )r�r�rHr�Z_revertClass�_ForwardNoRecurser.r�)r�Z	retStringrwrwrxr�<s

zForward.__str__cs.|jdk	rtt|�j�St�}||K}|SdS)N)r.r�rr�)r�r�)rHrwrxr�Ms

zForward.copy)N)
r�r�r�r�r�rSrTr�r�r�r�r�r�rwrw)rHrxrs
c@seZdZdd�ZdS)rVcCsdS)Nz...rw)r�rwrwrxr�Vsz_ForwardNoRecurse.__str__N)r�r�r�r�rwrwrwrxrVUsrVcs"eZdZdZd�fdd�	Z�ZS)r-zQ
    Abstract subclass of C{ParseExpression}, for converting parsed results.
    Fcstt|�j|�d|_dS)NF)r�r-r�rW)r�r.rg)rHrwrxr�]szTokenConverter.__init__)F)r�r�r�r�r�r�rwrw)rHrxr-Yscs6eZdZdZd
�fdd�	Z�fdd�Zdd	�Z�ZS)r
a�
    Converter to concatenate all matching tokens to a single string.
    By default, the matching patterns must also be contiguous in the input string;
    this can be disabled by specifying C{'adjacent=False'} in the constructor.

    Example::
        real = Word(nums) + '.' + Word(nums)
        print(real.parseString('3.1416')) # -> ['3', '.', '1416']
        # will also erroneously match the following
        print(real.parseString('3. 1416')) # -> ['3', '.', '1416']

        real = Combine(Word(nums) + '.' + Word(nums))
        print(real.parseString('3.1416')) # -> ['3.1416']
        # no match when there are internal spaces
        print(real.parseString('3. 1416')) # -> Exception: Expected W:(0123...)
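    An illustrative sketch (assumed usage) of the C{adjacent} and C{joinString}
    constructor arguments mentioned above::

        from pyparsing import Combine, Word, alphas

        # adjacent=False permits intervening whitespace; joinString glues the pieces
        dotted = Combine(Word(alphas) + Word(alphas), joinString='.', adjacent=False)
        print(dotted.parseString("abc def"))    # -> ['abc.def']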
    r�Tcs8tt|�j|�|r|j�||_d|_||_d|_dS)NT)r�r
r�r��adjacentrX�
joinStringre)r�r.rXrW)rHrwrxr�rszCombine.__init__cs(|jrtj||�ntt|�j|�|S)N)rWr$r�r�r
)r�r�)rHrwrxr�|szCombine.ignorecCsP|j�}|dd�=|tdj|j|j��g|jd�7}|jrH|j�rH|gS|SdS)Nr�)r�)r�r"r�r
rXrbrVr�)r�r-r�r�ZretToksrwrwrxr��s
"zCombine.postParse)r�T)r�r�r�r�r�r�r�r�rwrw)rHrxr
as
cs(eZdZdZ�fdd�Zdd�Z�ZS)ra�
    Converter to return the matched tokens as a list - useful for returning tokens of C{L{ZeroOrMore}} and C{L{OneOrMore}} expressions.

    Example::
        ident = Word(alphas)
        num = Word(nums)
        term = ident | num
        func = ident + Optional(delimitedList(term))
        print(func.parseString("fn a,b,100"))  # -> ['fn', 'a', 'b', '100']

        func = ident + Group(Optional(delimitedList(term)))
        print(func.parseString("fn a,b,100"))  # -> ['fn', ['a', 'b', '100']]
    cstt|�j|�d|_dS)NT)r�rr�rW)r�r.)rHrwrxr��szGroup.__init__cCs|gS)Nrw)r�r-r�r�rwrwrxr��szGroup.postParse)r�r�r�r�r�r�r�rwrw)rHrxr�s
cs(eZdZdZ�fdd�Zdd�Z�ZS)raW
    Converter to return a repetitive expression as a list, but also as a dictionary.
    Each element can also be referenced using the first token in the expression as its key.
    Useful for tabular report scraping when the first column can be used as an item key.

    Example::
        data_word = Word(alphas)
        label = data_word + FollowedBy(':')
        attr_expr = Group(label + Suppress(':') + OneOrMore(data_word).setParseAction(' '.join))

        text = "shape: SQUARE posn: upper left color: light blue texture: burlap"
        attr_expr = (label + Suppress(':') + OneOrMore(data_word, stopOn=label).setParseAction(' '.join))
        
        # print attributes as plain groups
        print(OneOrMore(attr_expr).parseString(text).dump())
        
        # instead of OneOrMore(expr), parse using Dict(OneOrMore(Group(expr))) - Dict will auto-assign names
        result = Dict(OneOrMore(Group(attr_expr))).parseString(text)
        print(result.dump())
        
        # access named fields as dict entries, or output as dict
        print(result['shape'])        
        print(result.asDict())
    prints::
        ['shape', 'SQUARE', 'posn', 'upper left', 'color', 'light blue', 'texture', 'burlap']

        [['shape', 'SQUARE'], ['posn', 'upper left'], ['color', 'light blue'], ['texture', 'burlap']]
        - color: light blue
        - posn: upper left
        - shape: SQUARE
        - texture: burlap
        SQUARE
        {'color': 'light blue', 'posn': 'upper left', 'texture': 'burlap', 'shape': 'SQUARE'}
    See more examples at L{ParseResults} of accessing fields by results name.
    cstt|�j|�d|_dS)NT)r�rr�rW)r�r.)rHrwrxr��sz
Dict.__init__cCs�x�t|�D]�\}}t|�dkr q
|d}t|t�rBt|d�j�}t|�dkr^td|�||<q
t|�dkr�t|dt�r�t|d|�||<q
|j�}|d=t|�dks�t|t�r�|j	�r�t||�||<q
t|d|�||<q
W|j
r�|gS|SdS)Nrrrr�rq)r�r�rzrur�r�r�r"r�r�rV)r�r-r�r�r��tokZikeyZ	dictvaluerwrwrxr��s$
zDict.postParse)r�r�r�r�r�r�r�rwrw)rHrxr�s#c@s eZdZdZdd�Zdd�ZdS)r+aV
    Converter for ignoring the results of a parsed expression.

    Example::
        source = "a, b, c,d"
        wd = Word(alphas)
        wd_list1 = wd + ZeroOrMore(',' + wd)
        print(wd_list1.parseString(source))

        # often, delimiters that are useful during parsing are just in the
        # way afterward - use Suppress to keep them out of the parsed output
        wd_list2 = wd + ZeroOrMore(Suppress(',') + wd)
        print(wd_list2.parseString(source))
    prints::
        ['a', ',', 'b', ',', 'c', ',', 'd']
        ['a', 'b', 'c', 'd']
    (See also L{delimitedList}.)
    cCsgS)Nrw)r�r-r�r�rwrwrxr��szSuppress.postParsecCs|S)Nrw)r�rwrwrxr��szSuppress.suppressN)r�r�r�r�r�r�rwrwrwrxr+�sc@s(eZdZdZdd�Zdd�Zdd�ZdS)	rzI
    Wrapper for parse actions, to ensure they are only called once.
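    An illustrative sketch (assumed usage) - the wrapped action raises a
    C{ParseException} if it is invoked a second time, and C{reset()} re-arms it::

        from pyparsing import Word, alphas, OnlyOnce

        def announce(tokens):
            print("matched:", tokens)

        once = OnlyOnce(announce)
        wd = Word(alphas).setParseAction(once)
        wd.parseString("hello")     # action runs
        once.reset()                # re-arm so a later parse may run the action again
        wd.parseString("world")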
    cCst|�|_d|_dS)NF)rM�callable�called)r�Z
methodCallrwrwrxr�s
zOnlyOnce.__init__cCs.|js|j|||�}d|_|St||d��dS)NTr�)r[rZr)r�r�r5rvr�rwrwrxr�s
zOnlyOnce.__call__cCs
d|_dS)NF)r[)r�rwrwrx�reset
szOnlyOnce.resetN)r�r�r�r�r�r�r\rwrwrwrxr�scs:t����fdd�}y�j|_Wntk
r4YnX|S)as
    Decorator for debugging parse actions. 
    
    When the parse action is called, this decorator will print C{">> entering I{method-name}(line:I{current_source_line}, I{parse_location}, I{matched_tokens})".}
    When the parse action completes, the decorator will print C{"<<"} followed by the returned value, or any exception that the parse action raised.

    Example::
        wd = Word(alphas)

        @traceParseAction
        def remove_duplicate_chars(tokens):
            return ''.join(sorted(set(''.join(tokens))))

        wds = OneOrMore(wd).setParseAction(remove_duplicate_chars)
        print(wds.parseString("slkdjs sld sldd sdlf sdljf"))
    prints::
        >>entering remove_duplicate_chars(line: 'slkdjs sld sldd sdlf sdljf', 0, (['slkdjs', 'sld', 'sldd', 'sdlf', 'sdljf'], {}))
        <<leaving remove_duplicate_chars (ret: 'dfjkls')
        ['dfjkls']
    cs��j}|dd�\}}}t|�dkr8|djjd|}tjjd|t||�||f�y�|�}Wn8tk
r�}ztjjd||f��WYdd}~XnXtjjd||f�|S)Nror�.z">>entering %s(line: '%s', %d, %r)
z<<leaving %s (exception: %s)
z<<leaving %s (ret: %r)
r9)r�r�rHr~�stderr�writerGrK)ZpaArgsZthisFuncr�r5rvr�r3)r�rwrx�z#sztraceParseAction.<locals>.z)rMr�r�)r�r`rw)r�rxrb
s
�,FcCs`t|�dt|�dt|�d}|rBt|t||��j|�S|tt|�|�j|�SdS)a�
    Helper to define a delimited list of expressions - the delimiter defaults to ','.
    By default, the list elements and delimiters can have intervening whitespace, and
    comments, but this can be overridden by passing C{combine=True} in the constructor.
    If C{combine} is set to C{True}, the matching tokens are returned as a single token
    string, with the delimiters included; otherwise, the matching tokens are returned
    as a list of tokens, with the delimiters suppressed.

    Example::
        delimitedList(Word(alphas)).parseString("aa,bb,cc") # -> ['aa', 'bb', 'cc']
        delimitedList(Word(hexnums), delim=':', combine=True).parseString("AA:BB:CC:DD:EE") # -> ['AA:BB:CC:DD:EE']
    z [r�z]...N)r�r
r2rir+)r.Zdelim�combineZdlNamerwrwrxr@9s
$csjt����fdd�}|dkr0tt�jdd��}n|j�}|jd�|j|dd�|�jd	t��d
�S)a:
    Helper to define a counted list of expressions.
    This helper defines a pattern of the form::
        integer expr expr expr...
    where the leading integer tells how many expr expressions follow.
    The matched tokens returns the array of expr tokens as a list - the leading count token is suppressed.
    
    If C{intExpr} is specified, it should be a pyparsing expression that produces an integer value.

    Example::
        countedArray(Word(alphas)).parseString('2 ab cd ef')  # -> ['ab', 'cd']

        # in this parser, the leading integer value is given in binary,
        # '10' indicating that 2 values are in the array
        binaryConstant = Word('01').setParseAction(lambda t: int(t[0], 2))
        countedArray(Word(alphas), intExpr=binaryConstant).parseString('10 ab cd ef')  # -> ['ab', 'cd']
    cs.|d}�|r tt�g|��p&tt�>gS)Nr)rrrC)r�r5rvr�)�	arrayExprr.rwrx�countFieldParseAction_s"z+countedArray.<locals>.countFieldParseActionNcSst|d�S)Nr)ru)rvrwrwrxrydszcountedArray.<locals>.<lambda>ZarrayLenT)rfz(len) z...)rr/rRr�r�rirxr�)r.ZintExprrdrw)rcr.rxr<Ls
cCs:g}x0|D](}t|t�r(|jt|��q
|j|�q
W|S)N)rzr�r�r�r�)�Lr�r�rwrwrxr�ks

r�cs6t���fdd�}|j|dd��jdt|���S)a*
    Helper to define an expression that is indirectly defined from
    the tokens matched in a previous expression, that is, it looks
    for a 'repeat' of a previous expression.  For example::
        first = Word(nums)
        second = matchPreviousLiteral(first)
        matchExpr = first + ":" + second
    will match C{"1:1"}, but not C{"1:2"}.  Because this matches a
    previous literal, will also match the leading C{"1:1"} in C{"1:10"}.
    If this is not desired, use C{matchPreviousExpr}.
    Do I{not} use with packrat parsing enabled.
    csP|rBt|�dkr�|d>qLt|j��}�tdd�|D��>n
�t�>dS)Nrrrcss|]}t|�VqdS)N)r)r��ttrwrwrxr��szDmatchPreviousLiteral.<locals>.copyTokenToRepeater.<locals>.<genexpr>)r�r�r�rr
)r�r5rvZtflat)�reprwrx�copyTokenToRepeater�sz1matchPreviousLiteral.<locals>.copyTokenToRepeaterT)rfz(prev) )rrxrir�)r.rhrw)rgrxrOts


csFt��|j�}�|K��fdd�}|j|dd��jdt|���S)aS
    Helper to define an expression that is indirectly defined from
    the tokens matched in a previous expression, that is, it looks
    for a 'repeat' of a previous expression.  For example::
        first = Word(nums)
        second = matchPreviousExpr(first)
        matchExpr = first + ":" + second
    will match C{"1:1"}, but not C{"1:2"}.  Because this matches by
    expressions, will I{not} match the leading C{"1:1"} in C{"1:10"};
    the expressions are evaluated first, and then compared, so
    C{"1"} is compared with C{"10"}.
    Do I{not} use with packrat parsing enabled.
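    An illustrative sketch (assumed usage) of the behavior described above::

        from pyparsing import Word, nums, matchPreviousExpr

        first = Word(nums)
        second = matchPreviousExpr(first)
        matcher = first + ":" + second
        print(matcher.parseString("12:12"))     # -> ['12', ':', '12']
        # matcher.parseString("12:129") raises ParseException ('129' != '12')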
    cs*t|j����fdd�}�j|dd�dS)Ncs$t|j��}|�kr tddd��dS)Nr�r)r�r�r)r�r5rvZtheseTokens)�matchTokensrwrx�mustMatchTheseTokens�szLmatchPreviousExpr.<locals>.copyTokenToRepeater.<locals>.mustMatchTheseTokensT)rf)r�r�r�)r�r5rvrj)rg)rirxrh�sz.matchPreviousExpr.<locals>.copyTokenToRepeaterT)rfz(prev) )rr�rxrir�)r.Ze2rhrw)rgrxrN�scCs>xdD]}|j|t|�}qW|jdd�}|jdd�}t|�S)Nz\^-]rz\nr(z\t)r��_bslashr�)r�r�rwrwrxr�s

rTc
s�|rdd�}dd�}t�ndd�}dd�}t�g}t|t�rF|j�}n&t|tj�r\t|�}ntj	dt
dd�|svt�Sd	}x�|t|�d
k�r||}xnt
||d
d��D]N\}}	||	|�r�|||d
=Pq�|||	�r�|||d
=|j||	�|	}Pq�W|d
7}q|W|�r�|�r�yht|�tdj|��k�rZtd
djdd�|D���jdj|��Stdjdd�|D���jdj|��SWn&tk
�r�tj	dt
dd�YnXt�fdd�|D��jdj|��S)a�
    Helper to quickly define a set of alternative Literals, and makes sure to do
    longest-first testing when there is a conflict, regardless of the input order,
    but returns a C{L{MatchFirst}} for best performance.

    Parameters:
     - strs - a string of space-delimited literals, or a collection of string literals
     - caseless - (default=C{False}) - treat all literals as caseless
     - useRegex - (default=C{True}) - as an optimization, will generate a Regex
          object; otherwise, will generate a C{MatchFirst} object (if C{caseless=True}, or
          if creating a C{Regex} raises an exception)

    Example::
        comp_oper = oneOf("< = > <= >= !=")
        var = Word(alphas)
        number = Word(nums)
        term = var | number
        comparison_expr = term + comp_oper + term
        print(comparison_expr.searchString("B = 12  AA=23 B<=AA AA>12"))
    prints::
        [['B', '=', '12'], ['AA', '=', '23'], ['B', '<=', 'AA'], ['AA', '>', '12']]
    cSs|j�|j�kS)N)r�)r�brwrwrxry�szoneOf.<locals>.<lambda>cSs|j�j|j��S)N)r�r�)rrlrwrwrxry�scSs||kS)Nrw)rrlrwrwrxry�scSs
|j|�S)N)r�)rrlrwrwrxry�sz6Invalid argument to oneOf, expected string or iterablerq)r�rrrNr�z[%s]css|]}t|�VqdS)N)r)r��symrwrwrxr��szoneOf.<locals>.<genexpr>z | �|css|]}tj|�VqdS)N)rdr	)r�rmrwrwrxr��sz7Exception creating Regex for oneOf, building MatchFirstc3s|]}�|�VqdS)Nrw)r�rm)�parseElementClassrwrxr��s)rrrzr�r�r�r5r�r�r�r�rr�r�r�r�r'rirKr)
Zstrsr�ZuseRegexZisequalZmasksZsymbolsr�Zcurr�r�rw)rorxrS�sL





((cCsttt||���S)a�
    Helper to easily and clearly define a dictionary by specifying the respective patterns
    for the key and value.  Takes care of defining the C{L{Dict}}, C{L{ZeroOrMore}}, and C{L{Group}} tokens
    in the proper order.  The key pattern can include delimiting markers or punctuation,
    as long as they are suppressed, thereby leaving the significant key text.  The value
    pattern can include named results, so that the C{Dict} results can include named token
    fields.

    Example::
        text = "shape: SQUARE posn: upper left color: light blue texture: burlap"
        attr_expr = (label + Suppress(':') + OneOrMore(data_word, stopOn=label).setParseAction(' '.join))
        print(OneOrMore(attr_expr).parseString(text).dump())
        
        attr_label = label
        attr_value = Suppress(':') + OneOrMore(data_word, stopOn=label).setParseAction(' '.join)

        # similar to Dict, but simpler call format
        result = dictOf(attr_label, attr_value).parseString(text)
        print(result.dump())
        print(result['shape'])
        print(result.shape)  # object attribute access works too
        print(result.asDict())
    prints::
        [['shape', 'SQUARE'], ['posn', 'upper left'], ['color', 'light blue'], ['texture', 'burlap']]
        - color: light blue
        - posn: upper left
        - shape: SQUARE
        - texture: burlap
        SQUARE
        SQUARE
        {'color': 'light blue', 'shape': 'SQUARE', 'posn': 'upper left', 'texture': 'burlap'}
    )rr2r)r�r�rwrwrxrA�s!cCs^t�jdd��}|j�}d|_|d�||d�}|r@dd�}ndd�}|j|�|j|_|S)	a�
    Helper to return the original, untokenized text for a given expression.  Useful to
    restore the parsed fields of an HTML start tag into the raw tag text itself, or to
    revert separate tokens with intervening whitespace back to the original matching
    input text. By default, returns a string containing the original parsed text.  
       
    If the optional C{asString} argument is passed as C{False}, then the return value is a 
    C{L{ParseResults}} containing any results names that were originally matched, and a 
    single token containing the original matched text from the input string.  So if 
    the expression passed to C{L{originalTextFor}} contains expressions with defined
    results names, you must set C{asString} to C{False} if you want to preserve those
    results name values.

    Example::
        src = "this is test <b> bold <i>text</i> </b> normal text "
        for tag in ("b","i"):
            opener,closer = makeHTMLTags(tag)
            patt = originalTextFor(opener + SkipTo(closer) + closer)
            print(patt.searchString(src)[0])
    prints::
        ['<b> bold <i>text</i> </b>']
        ['<i>text</i>']
    cSs|S)Nrw)r�r�rvrwrwrxry8sz!originalTextFor.<locals>.<lambda>F�_original_start�
_original_endcSs||j|j�S)N)rprq)r�r5rvrwrwrxry=scSs&||jd�|jd��g|dd�<dS)Nrprq)r�)r�r5rvrwrwrx�extractText?sz$originalTextFor.<locals>.extractText)r
r�r�rer])r.ZasStringZ	locMarkerZendlocMarker�	matchExprrrrwrwrxrg s

cCst|�jdd��S)zp
    Helper to undo pyparsing's default grouping of And expressions, even
    if all but one are non-empty.
    cSs|dS)Nrrw)rvrwrwrxryJszungroup.<locals>.<lambda>)r-r�)r.rwrwrxrhEscCs4t�jdd��}t|d�|d�|j�j�d��S)a�
    Helper to decorate a returned token with its starting and ending locations in the input string.
    This helper adds the following results names:
     - locn_start = location where matched expression begins
     - locn_end = location where matched expression ends
     - value = the actual parsed results

    Be careful if the input text contains C{<TAB>} characters, you may want to call
    C{L{ParserElement.parseWithTabs}}

    Example::
        wd = Word(alphas)
        for match in locatedExpr(wd).searchString("ljsdf123lksdjjf123lkkjj1222"):
            print(match)
    prints::
        [[0, 'ljsdf', 5]]
        [[8, 'lksdjjf', 15]]
        [[18, 'lkkjj', 23]]
    cSs|S)Nrw)r�r5rvrwrwrxry`szlocatedExpr.<locals>.<lambda>Z
locn_startr�Zlocn_end)r
r�rr�r�)r.ZlocatorrwrwrxrjLsz\[]-*.$+^?()~ )r
cCs|ddS)Nrrrrw)r�r5rvrwrwrxryksryz\\0?[xX][0-9a-fA-F]+cCstt|djd�d��S)Nrz\0x�)�unichrru�lstrip)r�r5rvrwrwrxrylsz	\\0[0-7]+cCstt|ddd�d��S)Nrrr�)ruru)r�r5rvrwrwrxrymsz\])r�r
z\wr8rr�Znegate�bodyr	csBdd��y dj�fdd�tj|�jD��Stk
r<dSXdS)a�
    Helper to easily define string ranges for use in Word construction.  Borrows
    syntax from regexp '[]' string range definitions::
        srange("[0-9]")   -> "0123456789"
        srange("[a-z]")   -> "abcdefghijklmnopqrstuvwxyz"
        srange("[a-z$_]") -> "abcdefghijklmnopqrstuvwxyz$_"
    The input string must be enclosed in []'s, and the returned string is the expanded
    character set joined into a single string.
    The values enclosed in the []'s may be:
     - a single character
     - an escaped character with a leading backslash (such as C{\-} or C{\]})
     - an escaped hex character with a leading C{'\x'} (C{\x21}, which is a C{'!'} character) 
         (C{\0x##} is also supported for backwards compatibility) 
     - an escaped octal character with a leading C{'\0'} (C{\041}, which is a C{'!'} character)
     - a range of any of the above, separated by a dash (C{'a-z'}, etc.)
     - any combination of the above (C{'aeiouy'}, C{'a-zA-Z0-9_$'}, etc.)
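    An illustrative sketch (assumed usage) pairing C{srange} with C{Word}::

        from pyparsing import srange, Word

        hex_word = Word(srange("[0-9a-fA-F]"))
        print(hex_word.parseString("deadBEEF"))  # -> ['deadBEEF']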
    cSs<t|t�s|Sdjdd�tt|d�t|d�d�D��S)Nr�css|]}t|�VqdS)N)ru)r�r�rwrwrxr��sz+srange.<locals>.<lambda>.<locals>.<genexpr>rrr)rzr"r�r��ord)�prwrwrxry�szsrange.<locals>.<lambda>r�c3s|]}�|�VqdS)Nrw)r��part)�	_expandedrwrxr��szsrange.<locals>.<genexpr>N)r��_reBracketExprr�rxrK)r�rw)r|rxr_rs
 cs�fdd�}|S)zt
    Helper method for defining parse actions that require matching at a specific
    column in the input text.
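    Since no example is given above, here is an illustrative sketch (assumed usage)::

        from pyparsing import Word, nums, matchOnlyAtCol

        # only accept a number that begins in column 1 of the input
        leading_num = Word(nums).setParseAction(matchOnlyAtCol(1))
        print(leading_num.parseString("42 is the answer"))   # -> ['42']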
    cs"t||��krt||d���dS)Nzmatched token not at column %d)r9r)r)Zlocnr1)r�rwrx�	verifyCol�sz!matchOnlyAtCol.<locals>.verifyColrw)r�r~rw)r�rxrM�scs�fdd�S)a�
    Helper method for common parse actions that simply return a literal value.  Especially
    useful when used with C{L{transformString<ParserElement.transformString>}()}.

    Example::
        num = Word(nums).setParseAction(lambda toks: int(toks[0]))
        na = oneOf("N/A NA").setParseAction(replaceWith(math.nan))
        term = na | num
        
        OneOrMore(term).parseString("324 234 N/A 234") # -> [324, 234, nan, 234]
    cs�gS)Nrw)r�r5rv)�replStrrwrxry�szreplaceWith.<locals>.<lambda>rw)rrw)rrxr\�scCs|ddd�S)a
    Helper parse action for removing quotation marks from parsed quoted strings.

    Example::
        # by default, quotation marks are included in parsed results
        quotedString.parseString("'Now is the Winter of our Discontent'") # -> ["'Now is the Winter of our Discontent'"]

        # use removeQuotes to strip quotation marks from parsed results
        quotedString.setParseAction(removeQuotes)
        quotedString.parseString("'Now is the Winter of our Discontent'") # -> ["Now is the Winter of our Discontent"]
    rrrrsrw)r�r5rvrwrwrxrZ�scsN��fdd�}yt�dt�d�j�}Wntk
rBt��}YnX||_|S)aG
    Helper to define a parse action by mapping a function to all elements of a ParseResults list. If any additional 
    args are passed, they are forwarded to the given function as additional arguments after
    the token, as in C{hex_integer = Word(hexnums).setParseAction(tokenMap(int, 16))}, which will convert the
    parsed data to an integer using base 16.

    Example (compare the last example to the one in L{ParserElement.transformString})::
        hex_ints = OneOrMore(Word(hexnums)).setParseAction(tokenMap(int, 16))
        hex_ints.runTests('''
            00 11 22 aa FF 0a 0d 1a
            ''')
        
        upperword = Word(alphas).setParseAction(tokenMap(str.upper))
        OneOrMore(upperword).runTests('''
            my kingdom for a horse
            ''')

        wd = Word(alphas).setParseAction(tokenMap(str.title))
        OneOrMore(wd).setParseAction(' '.join).runTests('''
            now is the winter of our discontent made glorious summer by this sun of york
            ''')
    prints::
        00 11 22 aa FF 0a 0d 1a
        [0, 17, 34, 170, 255, 10, 13, 26]

        my kingdom for a horse
        ['MY', 'KINGDOM', 'FOR', 'A', 'HORSE']

        now is the winter of our discontent made glorious summer by this sun of york
        ['Now Is The Winter Of Our Discontent Made Glorious Summer By This Sun Of York']
    cs��fdd�|D�S)Ncsg|]}�|f����qSrwrw)r�Ztokn)r�r6rwrxr��sz(tokenMap.<locals>.pa.<locals>.<listcomp>rw)r�r5rv)r�r6rwrxr}�sztokenMap.<locals>.par�rH)rJr�rKr{)r6r�r}rLrw)r�r6rxrm�s cCst|�j�S)N)r�r�)rvrwrwrxry�scCst|�j�S)N)r��lower)rvrwrwrxry�scCs�t|t�r|}t||d�}n|j}tttd�}|r�tj�j	t
�}td�|d�tt
t|td�|���tddgd�jd	�j	d
d��td�}n�d
jdd�tD��}tj�j	t
�t|�B}td�|d�tt
t|j	t�ttd�|����tddgd�jd	�j	dd��td�}ttd�|d�}|jdd
j|jdd�j�j���jd|�}|jdd
j|jdd�j�j���jd|�}||_||_||fS)zRInternal helper to construct opening and closing tag expressions, given a tag name)r�z_-:r�tag�=�/F)r�rCcSs|ddkS)Nrr�rw)r�r5rvrwrwrxry�sz_makeTags.<locals>.<lambda>rr�css|]}|dkr|VqdS)rNrw)r�r�rwrwrxr��sz_makeTags.<locals>.<genexpr>cSs|ddkS)Nrr�rw)r�r5rvrwrwrxry�sz</r��:r�z<%s>rz</%s>)rzr�rr�r/r4r3r>r�r�rZr+rr2rrrmr�rVrYrBr
�_Lr��titler�rir�)�tagStrZxmlZresnameZtagAttrNameZtagAttrValueZopenTagZprintablesLessRAbrackZcloseTagrwrwrx�	_makeTags�s"
T\..r�cCs
t|d�S)a 
    Helper to construct opening and closing tag expressions for HTML, given a tag name. Matches
    tags in either upper or lower case, attributes with namespaces and with quoted or unquoted values.

    Example::
        text = '<td>More info at the <a href="http://pyparsing.wikispaces.com">pyparsing</a> wiki page</td>'
        # makeHTMLTags returns pyparsing expressions for the opening and closing tags as a 2-tuple
        a,a_end = makeHTMLTags("A")
        link_expr = a + SkipTo(a_end)("link_text") + a_end
        
        for link in link_expr.searchString(text):
            # attributes in the <A> tag (like "href" shown here) are also accessible as named results
            print(link.link_text, '->', link.href)
    prints::
        pyparsing -> http://pyparsing.wikispaces.com
    F)r�)r�rwrwrxrK�scCs
t|d�S)z�
    Helper to construct opening and closing tag expressions for XML, given a tag name. Matches
    tags only in the given upper/lower case.

    Example: similar to L{makeHTMLTags}
    T)r�)r�rwrwrxrLscs8|r|dd��n|j��dd��D���fdd�}|S)a<
    Helper to create a validating parse action to be used with start tags created
    with C{L{makeXMLTags}} or C{L{makeHTMLTags}}. Use C{withAttribute} to qualify a starting tag
    with a required attribute value, to avoid false matches on common tags such as
    C{<TD>} or C{<DIV>}.

    Call C{withAttribute} with a series of attribute names and values. Specify the list
    of filter attribute names and values as:
     - keyword arguments, as in C{(align="right")}, or
     - as an explicit dict with C{**} operator, when an attribute name is also a Python
          reserved word, as in C{**{"class":"Customer", "align":"right"}}
     - a list of name-value tuples, as in ( ("ns1:class", "Customer"), ("ns2:align","right") )
    For attribute names with a namespace prefix, you must use the second form.  Attribute
    names are matched insensitive to upper/lower case.
       
    If just testing for C{class} (with or without a namespace), use C{L{withClass}}.

    To verify that the attribute exists, but without specifying a value, pass
    C{withAttribute.ANY_VALUE} as the value.

    Example::
        html = '''
            <div>
            Some text
            <div type="grid">1 4 0 1 0</div>
            <div type="graph">1,3 2,3 1,1</div>
            <div>this has no type</div>
            </div>
                
        '''
        div,div_end = makeHTMLTags("div")

        # only match div tag having a type attribute with value "grid"
        div_grid = div().setParseAction(withAttribute(type="grid"))
        grid_expr = div_grid + SkipTo(div | div_end)("body")
        for grid_header in grid_expr.searchString(html):
            print(grid_header.body)
        
        # construct a match with any div tag having a type attribute, regardless of the value
        div_any_type = div().setParseAction(withAttribute(type=withAttribute.ANY_VALUE))
        div_expr = div_any_type + SkipTo(div | div_end)("body")
        for div_header in div_expr.searchString(html):
            print(div_header.body)
    prints::
        1 4 0 1 0

        1 4 0 1 0
        1,3 2,3 1,1
    NcSsg|]\}}||f�qSrwrw)r�r�r�rwrwrxr�Qsz!withAttribute.<locals>.<listcomp>cs^xX�D]P\}}||kr&t||d|��|tjkr|||krt||d||||f��qWdS)Nzno matching attribute z+attribute '%s' has value '%s', must be '%s')rre�	ANY_VALUE)r�r5r�ZattrNameZ	attrValue)�attrsrwrxr}RszwithAttribute.<locals>.pa)r�)r�ZattrDictr}rw)r�rxres2cCs|rd|nd}tf||i�S)a�
    Simplified version of C{L{withAttribute}} when matching on a div class - made
    difficult because C{class} is a reserved word in Python.

    Example::
        html = '''
            <div>
            Some text
            <div class="grid">1 4 0 1 0</div>
            <div class="graph">1,3 2,3 1,1</div>
            <div>this &lt;div&gt; has no class</div>
            </div>
                
        '''
        div,div_end = makeHTMLTags("div")
        div_grid = div().setParseAction(withClass("grid"))
        
        grid_expr = div_grid + SkipTo(div | div_end)("body")
        for grid_header in grid_expr.searchString(html):
            print(grid_header.body)
        
        div_any_type = div().setParseAction(withClass(withAttribute.ANY_VALUE))
        div_expr = div_any_type + SkipTo(div | div_end)("body")
        for div_header in div_expr.searchString(html):
            print(div_header.body)
    prints::
        1 4 0 1 0

        1 4 0 1 0
        1,3 2,3 1,1
    z%s:class�class)re)Z	classname�	namespaceZ	classattrrwrwrxrk\s �(rcCs�t�}||||B}�x`t|�D�]R\}}|ddd�\}}	}
}|	dkrTd|nd|}|	dkr�|dksxt|�dkr�td��|\}
}t�j|�}|
tjk�rd|	dkr�t||�t|t	|��}n�|	dk�r|dk	�rt|||�t|t	||��}nt||�t|t	|��}nD|	dk�rZt||
|||�t||
|||�}ntd	��n�|
tj
k�rH|	dk�r�t|t��s�t|�}t|j
|�t||�}n�|	dk�r|dk	�r�t|||�t|t	||��}nt||�t|t	|��}nD|	dk�r>t||
|||�t||
|||�}ntd	��ntd
��|�r`|j|�||j|�|BK}|}q"W||K}|S)a�	
    Helper method for constructing grammars of expressions made up of
    operators working in a precedence hierarchy.  Operators may be unary or
    binary, left- or right-associative.  Parse actions can also be attached
    to operator expressions. The generated parser will also recognize the use 
    of parentheses to override operator precedences (see example below).
    
    Note: if you define a deep operator list, you may see performance issues
    when using infixNotation. See L{ParserElement.enablePackrat} for a
    mechanism to potentially improve your parser performance.

    Parameters:
     - baseExpr - expression representing the most basic element for the nested expression
     - opList - list of tuples, one for each operator precedence level in the
      expression grammar; each tuple is of the form
      (opExpr, numTerms, rightLeftAssoc, parseAction), where:
       - opExpr is the pyparsing expression for the operator;
          may also be a string, which will be converted to a Literal;
          if numTerms is 3, opExpr is a tuple of two expressions, for the
          two operators separating the 3 terms
       - numTerms is the number of terms for this operator (must
          be 1, 2, or 3)
       - rightLeftAssoc is the indicator whether the operator is
          right or left associative, using the pyparsing-defined
          constants C{opAssoc.RIGHT} and C{opAssoc.LEFT}.
       - parseAction is the parse action to be associated with
          expressions matching this operator expression (the
          parse action tuple member may be omitted)
     - lpar - expression for matching left-parentheses (default=C{Suppress('(')})
     - rpar - expression for matching right-parentheses (default=C{Suppress(')')})

    Example::
        # simple example of four-function arithmetic with ints and variable names
        integer = pyparsing_common.signed_integer
        varname = pyparsing_common.identifier 
        
        arith_expr = infixNotation(integer | varname,
            [
            ('-', 1, opAssoc.RIGHT),
            (oneOf('* /'), 2, opAssoc.LEFT),
            (oneOf('+ -'), 2, opAssoc.LEFT),
            ])
        
        arith_expr.runTests('''
            5+3*6
            (5+3)*6
            -2--11
            ''', fullDump=False)
    prints::
        5+3*6
        [[5, '+', [3, '*', 6]]]

        (5+3)*6
        [[[5, '+', 3], '*', 6]]

        -2--11
        [[['-', 2], '-', ['-', 11]]]
    Nrroz%s termz	%s%s termrqz@if numterms=3, opExpr must be a tuple or list of two expressionsrrz6operator must be unary (1), binary (2), or ternary (3)z2operator must indicate right or left associativity)N)rr�r�r�rirT�LEFTrrr�RIGHTrzrr.r�)ZbaseExprZopListZlparZrparr�ZlastExprr�ZoperDefZopExprZarityZrightLeftAssocr}ZtermNameZopExpr1ZopExpr2ZthisExprrsrwrwrxri�sR;

&




&


z4"(?:[^"\n\r\\]|(?:"")|(?:\\(?:[^x]|x[0-9a-fA-F]+)))*�"z string enclosed in double quotesz4'(?:[^'\n\r\\]|(?:'')|(?:\\(?:[^x]|x[0-9a-fA-F]+)))*�'z string enclosed in single quotesz*quotedString using single or double quotes�uzunicode string literalcCs�||krtd��|dk�r(t|t�o,t|t��r t|�dkr�t|�dkr�|dk	r�tt|t||tjdd���j	dd��}n$t
j�t||tj�j	dd��}nx|dk	r�tt|t|�t|�ttjdd���j	dd��}n4ttt|�t|�ttjdd���j	d	d��}ntd
��t
�}|dk	�rb|tt|�t||B|B�t|��K}n$|tt|�t||B�t|��K}|jd||f�|S)a~	
    Helper method for defining nested lists enclosed in opening and closing
    delimiters ("(" and ")" are the default).

    Parameters:
     - opener - opening character for a nested list (default=C{"("}); can also be a pyparsing expression
     - closer - closing character for a nested list (default=C{")"}); can also be a pyparsing expression
     - content - expression for items within the nested lists (default=C{None})
     - ignoreExpr - expression for ignoring opening and closing delimiters (default=C{quotedString})

    If an expression is not provided for the content argument, the nested
    expression will capture all whitespace-delimited content between delimiters
    as a list of separate values.

    Use the C{ignoreExpr} argument to define expressions that may contain
    opening or closing characters that should not be treated as opening
    or closing characters for nesting, such as quotedString or a comment
    expression.  Specify multiple expressions using an C{L{Or}} or C{L{MatchFirst}}.
    The default is L{quotedString}, but if no expressions are to be ignored,
    then pass C{None} for this argument.

    Example::
        data_type = oneOf("void int short long char float double")
        decl_data_type = Combine(data_type + Optional(Word('*')))
        ident = Word(alphas+'_', alphanums+'_')
        number = pyparsing_common.number
        arg = Group(decl_data_type + ident)
        LPAR,RPAR = map(Suppress, "()")

        code_body = nestedExpr('{', '}', ignoreExpr=(quotedString | cStyleComment))

        c_function = (decl_data_type("type") 
                      + ident("name")
                      + LPAR + Optional(delimitedList(arg), [])("args") + RPAR 
                      + code_body("body"))
        c_function.ignore(cStyleComment)
        
        source_code = '''
            int is_odd(int x) { 
                return (x%2); 
            }
                
            int dec_to_hex(char hchar) { 
                if (hchar >= '0' && hchar <= '9') { 
                    return (ord(hchar)-ord('0')); 
                } else { 
                    return (10+ord(hchar)-ord('A'));
                } 
            }
        '''
        for func in c_function.searchString(source_code):
            print("%(name)s (%(type)s) args: %(args)s" % func)

    prints::
        is_odd (int) args: [['int', 'x']]
        dec_to_hex (int) args: [['char', 'hchar']]
    z.opening and closing strings cannot be the sameNrr)r
cSs|dj�S)Nr)r�)rvrwrwrxry9sznestedExpr.<locals>.<lambda>cSs|dj�S)Nr)r�)rvrwrwrxry<scSs|dj�S)Nr)r�)rvrwrwrxryBscSs|dj�S)Nr)r�)rvrwrwrxryFszOopening and closing arguments must be strings if no content expression is givenznested %s%s expression)r�rzr�r�r
rr	r$rNr�rCr�rrrr+r2ri)�openerZcloserZcontentrOr�rwrwrxrP�s4:

*$cs��fdd�}�fdd�}�fdd�}tt�jd�j��}t�t�j|�jd�}t�j|�jd	�}t�j|�jd
�}	|r�tt|�|t|t|�t|��|	�}
n$tt|�t|t|�t|���}
|j	t
t��|
jd�S)a
	
    Helper method for defining space-delimited indentation blocks, such as
    those used to define block statements in Python source code.

    Parameters:
     - blockStatementExpr - expression defining syntax of statement that
            is repeated within the indented block
     - indentStack - list created by caller to manage indentation stack
            (multiple statementWithIndentedBlock expressions within a single grammar
            should share a common indentStack)
     - indent - boolean indicating whether block must be indented beyond the
            current level; set to False for block of left-most statements
            (default=C{True})

    A valid block must contain at least one C{blockStatement}.

    Example::
        data = '''
        def A(z):
          A1
          B = 100
          G = A2
          A2
          A3
        B
        def BB(a,b,c):
          BB1
          def BBA():
            bba1
            bba2
            bba3
        C
        D
        def spam(x,y):
             def eggs(z):
                 pass
        '''


        indentStack = [1]
        stmt = Forward()

        identifier = Word(alphas, alphanums)
        funcDecl = ("def" + identifier + Group( "(" + Optional( delimitedList(identifier) ) + ")" ) + ":")
        func_body = indentedBlock(stmt, indentStack)
        funcDef = Group( funcDecl + func_body )

        rvalue = Forward()
        funcCall = Group(identifier + "(" + Optional(delimitedList(rvalue)) + ")")
        rvalue << (funcCall | identifier | Word(nums))
        assignment = Group(identifier + "=" + rvalue)
        stmt << ( funcDef | assignment | identifier )

        module_body = OneOrMore(stmt)

        parseTree = module_body.parseString(data)
        parseTree.pprint()
    prints::
        [['def',
          'A',
          ['(', 'z', ')'],
          ':',
          [['A1'], [['B', '=', '100']], [['G', '=', 'A2']], ['A2'], ['A3']]],
         'B',
         ['def',
          'BB',
          ['(', 'a', 'b', 'c', ')'],
          ':',
          [['BB1'], [['def', 'BBA', ['(', ')'], ':', [['bba1'], ['bba2'], ['bba3']]]]]],
         'C',
         'D',
         ['def',
          'spam',
          ['(', 'x', 'y', ')'],
          ':',
          [[['def', 'eggs', ['(', 'z', ')'], ':', [['pass']]]]]]] 
    csN|t|�krdSt||�}|�dkrJ|�dkr>t||d��t||d��dS)Nrrzillegal nestingznot a peer entryrsrs)r�r9r!r)r�r5rv�curCol)�indentStackrwrx�checkPeerIndent�s
z&indentedBlock.<locals>.checkPeerIndentcs2t||�}|�dkr"�j|�nt||d��dS)Nrrznot a subentryrs)r9r�r)r�r5rvr�)r�rwrx�checkSubIndent�s
z%indentedBlock.<locals>.checkSubIndentcsN|t|�krdSt||�}�o4|�dko4|�dksBt||d���j�dS)Nrrrqznot an unindentrsr:)r�r9rr�)r�r5rvr�)r�rwrx�
checkUnindent�s
z$indentedBlock.<locals>.checkUnindentz	 �INDENTr�ZUNINDENTzindented block)rrr�r�r
r�rirrr�rk)ZblockStatementExprr�rr�r�r�r!r�ZPEERZUNDENTZsmExprrw)r�rxrfQsN,z#[\0xc0-\0xd6\0xd8-\0xf6\0xf8-\0xff]z[\0xa1-\0xbf\0xd7\0xf7]z_:zany tagzgt lt amp nbsp quot aposz><& "'z&(?P<entity>rnz);zcommon HTML entitycCstj|j�S)zRHelper parser action to replace common HTML entities with their special characters)�_htmlEntityMapr�Zentity)rvrwrwrxr[�sz/\*(?:[^*]|\*(?!/))*z*/zC style commentz<!--[\s\S]*?-->zHTML commentz.*zrest of linez//(?:\\\n|[^\n])*z
// commentzC++ style commentz#.*zPython style comment)r�z 	�	commaItem)r�c@s�eZdZdZee�Zee�Ze	e
�jd�je�Z
e	e�jd�jeed��Zed�jd�je�Ze�je�de�je�jd�Zejd	d
��eeeed�j�e�Bjd�Zeje�ed
�jd�je�Zed�jd�je�ZeeBeBj�Zed�jd�je�Ze	eded�jd�Zed�jd�Z ed�jd�Z!e!de!djd�Z"ee!de!d>�dee!de!d?�jd�Z#e#j$d d
��d!e jd"�Z%e&e"e%Be#Bjd#��jd#�Z'ed$�jd%�Z(e)d@d'd(��Z*e)dAd*d+��Z+ed,�jd-�Z,ed.�jd/�Z-ed0�jd1�Z.e/j�e0j�BZ1e)d2d3��Z2e&e3e4d4�e5�e	e6d4d5�ee7d6����j�jd7�Z8e9ee:j;�e8Bd8d9��jd:�Z<e)ed;d
���Z=e)ed<d
���Z>d=S)Brna�

    Here are some common low-level expressions that may be useful in jump-starting parser development:
     - numeric forms (L{integers<integer>}, L{reals<real>}, L{scientific notation<sci_real>})
     - common L{programming identifiers<identifier>}
     - network addresses (L{MAC<mac_address>}, L{IPv4<ipv4_address>}, L{IPv6<ipv6_address>})
     - ISO8601 L{dates<iso8601_date>} and L{datetime<iso8601_datetime>}
     - L{UUID<uuid>}
     - L{comma-separated list<comma_separated_list>}
    Parse actions:
     - C{L{convertToInteger}}
     - C{L{convertToFloat}}
     - C{L{convertToDate}}
     - C{L{convertToDatetime}}
     - C{L{stripHTMLTags}}
     - C{L{upcaseTokens}}
     - C{L{downcaseTokens}}

    Example::
        pyparsing_common.number.runTests('''
            # any int or real number, returned as the appropriate type
            100
            -100
            +100
            3.14159
            6.02e23
            1e-12
            ''')

        pyparsing_common.fnumber.runTests('''
            # any int or real number, returned as float
            100
            -100
            +100
            3.14159
            6.02e23
            1e-12
            ''')

        pyparsing_common.hex_integer.runTests('''
            # hex numbers
            100
            FF
            ''')

        pyparsing_common.fraction.runTests('''
            # fractions
            1/2
            -3/4
            ''')

        pyparsing_common.mixed_integer.runTests('''
            # mixed fractions
            1
            1/2
            -3/4
            1-3/4
            ''')

        import uuid
        pyparsing_common.uuid.setParseAction(tokenMap(uuid.UUID))
        pyparsing_common.uuid.runTests('''
            # uuid
            12345678-1234-5678-1234-567812345678
            ''')
    prints::
        # any int or real number, returned as the appropriate type
        100
        [100]

        -100
        [-100]

        +100
        [100]

        3.14159
        [3.14159]

        6.02e23
        [6.02e+23]

        1e-12
        [1e-12]

        # any int or real number, returned as float
        100
        [100.0]

        -100
        [-100.0]

        +100
        [100.0]

        3.14159
        [3.14159]

        6.02e23
        [6.02e+23]

        1e-12
        [1e-12]

        # hex numbers
        100
        [256]

        FF
        [255]

        # fractions
        1/2
        [0.5]

        -3/4
        [-0.75]

        # mixed fractions
        1
        [1]

        1/2
        [0.5]

        -3/4
        [-0.75]

        1-3/4
        [1.75]

        # uuid
        12345678-1234-5678-1234-567812345678
        [UUID('12345678-1234-5678-1234-567812345678')]
    �integerzhex integerrtz[+-]?\d+zsigned integerr��fractioncCs|d|dS)Nrrrrsrw)rvrwrwrxry�szpyparsing_common.<lambda>r8z"fraction or mixed integer-fractionz
[+-]?\d+\.\d*zreal numberz+[+-]?\d+([eE][+-]?\d+|\.\d*([eE][+-]?\d+)?)z$real number with scientific notationz[+-]?\d+\.?\d*([eE][+-]?\d+)?�fnumberrB�
identifierzK(25[0-5]|2[0-4][0-9]|1?[0-9]{1,2})(\.(25[0-5]|2[0-4][0-9]|1?[0-9]{1,2})){3}zIPv4 addressz[0-9a-fA-F]{1,4}�hex_integerr��zfull IPv6 addressrrBz::zshort IPv6 addresscCstdd�|D��dkS)Ncss|]}tjj|�rdVqdS)rrN)rn�
_ipv6_partr�)r�rfrwrwrxr��sz,pyparsing_common.<lambda>.<locals>.<genexpr>rw)rH)rvrwrwrxry�sz::ffff:zmixed IPv6 addresszIPv6 addressz:[0-9a-fA-F]{2}([:.-])[0-9a-fA-F]{2}(?:\1[0-9a-fA-F]{2}){4}zMAC address�%Y-%m-%dcs�fdd�}|S)a�
        Helper to create a parse action for converting parsed date string to Python datetime.date

        Params -
         - fmt - format to be passed to datetime.strptime (default=C{"%Y-%m-%d"})

        Example::
            date_expr = pyparsing_common.iso8601_date.copy()
            date_expr.setParseAction(pyparsing_common.convertToDate())
            print(date_expr.parseString("1999-12-31"))
        prints::
            [datetime.date(1999, 12, 31)]
        csLytj|d��j�Stk
rF}zt||t|���WYdd}~XnXdS)Nr)r�strptimeZdater�rr{)r�r5rv�ve)�fmtrwrx�cvt_fn�sz.pyparsing_common.convertToDate.<locals>.cvt_fnrw)r�r�rw)r�rx�
convertToDate�szpyparsing_common.convertToDate�%Y-%m-%dT%H:%M:%S.%fcs�fdd�}|S)a
        Helper to create a parse action for converting parsed datetime string to Python datetime.datetime

        Params -
         - fmt - format to be passed to datetime.strptime (default=C{"%Y-%m-%dT%H:%M:%S.%f"})

        Example::
            dt_expr = pyparsing_common.iso8601_datetime.copy()
            dt_expr.setParseAction(pyparsing_common.convertToDatetime())
            print(dt_expr.parseString("1999-12-31T23:59:59.999"))
        prints::
            [datetime.datetime(1999, 12, 31, 23, 59, 59, 999000)]
        csHytj|d��Stk
rB}zt||t|���WYdd}~XnXdS)Nr)rr�r�rr{)r�r5rvr�)r�rwrxr��sz2pyparsing_common.convertToDatetime.<locals>.cvt_fnrw)r�r�rw)r�rx�convertToDatetime�sz"pyparsing_common.convertToDatetimez7(?P<year>\d{4})(?:-(?P<month>\d\d)(?:-(?P<day>\d\d))?)?zISO8601 datez�(?P<year>\d{4})-(?P<month>\d\d)-(?P<day>\d\d)[T ](?P<hour>\d\d):(?P<minute>\d\d)(:(?P<second>\d\d(\.\d*)?)?)?(?P<tz>Z|[+-]\d\d:?\d\d)?zISO8601 datetimez2[0-9a-fA-F]{8}(-[0-9a-fA-F]{4}){3}-[0-9a-fA-F]{12}�UUIDcCstjj|d�S)a
        Parse action to remove HTML tags from web page HTML source

        Example::
            # strip HTML links from normal text 
            text = '<td>More info at the <a href="http://pyparsing.wikispaces.com">pyparsing</a> wiki page</td>'
            td,td_end = makeHTMLTags("TD")
            table_text = td + SkipTo(td_end).setParseAction(pyparsing_common.stripHTMLTags)("body") + td_end
            
            print(table_text.parseString(text).body) # -> 'More info at the pyparsing wiki page'
        r)rn�_html_stripperr�)r�r5r�rwrwrx�
stripHTMLTags�s
zpyparsing_common.stripHTMLTagsra)r�z 	r�r�)r�zcomma separated listcCst|�j�S)N)r�r�)rvrwrwrxry�scCst|�j�S)N)r�r�)rvrwrwrxry�sN)rrB)rrB)r�)r�)?r�r�r�r�rmruZconvertToInteger�floatZconvertToFloatr/rRrir�r�rDr�r'Zsigned_integerr�rxrr�Z
mixed_integerrH�realZsci_realr��numberr�r4r3r�Zipv4_addressr�Z_full_ipv6_addressZ_short_ipv6_addressr~Z_mixed_ipv6_addressr
Zipv6_addressZmac_addressr�r�r�Ziso8601_dateZiso8601_datetime�uuidr7r6r�r�rrrrVr.�
_commasepitemr@rYr�Zcomma_separated_listrdrBrwrwrwrxrn�sN""
28�__main__Zselect�fromz_$r])rb�columnsrjZtablesZcommandaK
        # '*' as column list and dotted table name
        select * from SYS.XYZZY

        # caseless match on "SELECT", and casts back to "select"
        SELECT * from XYZZY, ABC

        # list of column names, and mixed case SELECT keyword
        Select AA,BB,CC from Sys.dual

        # multiple tables
        Select A, B, C from Sys.dual, Table2

        # invalid SELECT keyword - should fail
        Xelect A, B, C from Sys.dual

        # incomplete command - should fail
        Select

        # invalid column name - should fail
        Select ^^^ frox Sys.dual

        z]
        100
        -100
        +100
        3.14159
        6.02e23
        1e-12
        z 
        100
        FF
        z6
        12345678-1234-5678-1234-567812345678
        )rq)raF)N)FT)T)r�)T)�r��__version__Z__versionTime__�
__author__r��weakrefrr�r�r~r�rdrr�r"r<r�r�_threadr�ImportErrorZ	threadingrr�Zordereddict�__all__r��version_infor;r�maxsizer�r{r��chrrur�rHr�r�reversedr�r�rr6rrrIZmaxintZxranger�Z__builtin__r�Zfnamer�rJr�r�r�r�r�r�Zascii_uppercaseZascii_lowercaser4rRrDr3rkr�Z	printablerVrKrrr!r#r&r�r"�MutableMapping�registerr9rJrGr/r2r4rQrMr$r,r
rrr�rQrrrrlr/r'r%r	r.r0rrrr*r)r1r0r rrrrrrrrJrr2rMrNrr(rrVr-r
rrr+rrbr@r<r�rOrNrrSrArgrhrjrirCrIrHrar`r�Z_escapedPuncZ_escapedHexCharZ_escapedOctChar�UNICODEZ_singleCharZ
_charRangermr}r_rMr\rZrmrdrBr�rKrLrer�rkrTr�r�rirUr>r^rYrcrPrfr5rWr7r6r�r�r�r�r;r[r8rEr�r]r?r=rFrXr�r�r:rnr�ZselectTokenZ	fromTokenZidentZ
columnNameZcolumnNameListZ
columnSpecZ	tableNameZ
tableNameListZ	simpleSQLr�r�r�r�r�r�rwrwrwrx�<module>=s�









8


@d&A= I
G3pLOD|M &#@sQ,A,	I#%&0
,	?#kZr

 (
 0 


"site-packages/setuptools/_vendor/__pycache__/__init__.cpython-36.pyc000064400000000161147511334630021543 0ustar003

��f�@sdS)N�rrr�/usr/lib/python3.6/__init__.py�<module>ssite-packages/setuptools/_vendor/__pycache__/__init__.cpython-36.opt-1.pyc000064400000000161147511334630022502 0ustar003

��f�@sdS)N�rrr�/usr/lib/python3.6/__init__.py�<module>ssite-packages/setuptools/_vendor/__pycache__/pyparsing.cpython-36.opt-1.pyc000064400000610513147511334630022767 0ustar003

��f��@s�dZdZdZdZddlZddlmZddlZddl	Z	ddl
Z
ddlZddlZddl
Z
ddlZddlZddlZddlmZyddlmZWn ek
r�ddlmZYnXydd	l
mZWn>ek
r�ydd	lmZWnek
r�dZYnXYnXd
ddd
ddddddddddddddddddd d!d"d#d$d%d&d'd(d)d*d+d,d-d.d/d0d1d2d3d4d5d6d7d8d9d:d;d<d=d>d?d@dAdBdCdDdEdFdGdHdIdJdKdLdMdNdOdPdQdRdSdTdUdVdWdXdYdZd[d\d]d^d_d`dadbdcdddedfdgdhdidjdkdldmdndodpdqdrgiZee	j�dds�ZeddskZe�r"e	jZe Z!e"Z#e Z$e%e&e'e(e)ee*e+e,e-e.gZ/nbe	j0Ze1Z2dtdu�Z$gZ/ddl3Z3xBdvj4�D]6Z5ye/j6e7e3e5��Wne8k
�r|�wJYnX�qJWe9dwdx�e2dy�D��Z:dzd{�Z;Gd|d}�d}e<�Z=ej>ej?Z@d~ZAeAdZBe@eAZCe"d��ZDd�jEd�dx�ejFD��ZGGd�d!�d!eH�ZIGd�d#�d#eI�ZJGd�d%�d%eI�ZKGd�d'�d'eK�ZLGd�d*�d*eH�ZMGd�d��d�e<�ZNGd�d&�d&e<�ZOe
jPjQeO�d�d=�ZRd�dN�ZSd�dK�ZTd�d��ZUd�d��ZVd�d��ZWd�dU�ZX�d/d�d��ZYGd�d(�d(e<�ZZGd�d0�d0eZ�Z[Gd�d�de[�Z\Gd�d�de[�Z]Gd�d�de[�Z^e^Z_e^eZ_`Gd�d�de[�ZaGd�d�de^�ZbGd�d�dea�ZcGd�dp�dpe[�ZdGd�d3�d3e[�ZeGd�d+�d+e[�ZfGd�d)�d)e[�ZgGd�d
�d
e[�ZhGd�d2�d2e[�ZiGd�d��d�e[�ZjGd�d�dej�ZkGd�d�dej�ZlGd�d�dej�ZmGd�d.�d.ej�ZnGd�d-�d-ej�ZoGd�d5�d5ej�ZpGd�d4�d4ej�ZqGd�d$�d$eZ�ZrGd�d
�d
er�ZsGd�d �d er�ZtGd�d�der�ZuGd�d�der�ZvGd�d"�d"eZ�ZwGd�d�dew�ZxGd�d�dew�ZyGd�d��d�ew�ZzGd�d�dez�Z{Gd�d6�d6ez�Z|Gd�d��d�e<�Z}e}�Z~Gd�d�dew�ZGd�d,�d,ew�Z�Gd�d�dew�Z�Gd�d��d�e��Z�Gd�d1�d1ew�Z�Gd�d�de��Z�Gd�d�de��Z�Gd�d�de��Z�Gd�d/�d/e��Z�Gd�d�de<�Z�d�df�Z��d0d�dD�Z��d1d�d@�Z�d�d΄Z�d�dS�Z�d�dR�Z�d�d҄Z��d2d�dW�Z�d�dE�Z��d3d�dk�Z�d�dl�Z�d�dn�Z�e\�j�dG�Z�el�j�dM�Z�em�j�dL�Z�en�j�de�Z�eo�j�dd�Z�eeeDd�d�dڍj�d�d܄�Z�efd݃j�d�d܄�Z�efd߃j�d�d܄�Z�e�e�Be�BeeeGd�dyd�Befd�ej��BZ�e�e�e�d�e��Z�e^d�ed�j�d�e�e{e�e�B��j�d�d�Z�d�dc�Z�d�dQ�Z�d�d`�Z�d�d^�Z�d�dq�Z�e�d�d܄�Z�e�d�d܄�Z�d�d�Z�d�dO�Z�d�dP�Z�d�di�Z�e<�e�_��d4d�do�Z�e=�Z�e<�e�_�e<�e�_�e�d��e�d��fd�dm�Z�e�Z�e�efd��d��j�d��Z�e�efd��d��j�d��Z�e�efd��d�efd��d�B�j��d�Z�e�e_�d�e�j��j��d�Z�d�d�de�j�f�ddT�Z��d5�ddj�Z�e��d�Z�e��d�Z�e�eee@eC�d�j��d��\Z�Z�e�e��d	j4��d
��Z�ef�d�djEe�j��d
�j��d�ZĐdd_�Z�e�ef�d��d�j��d�Z�ef�d�j��d�Z�ef�d�jȃj��d�Z�ef�d�j��d�Z�e�ef�d��de�B�j��d�Z�e�Z�ef�d�j��d�Z�e�e{eeeGdɐd�eee�d�e^dɃem����j΃j��d�Z�e�ee�j�e�Bd��d��j�d>�Z�G�d dr�dr�Z�eҐd!k�r�eb�d"�Z�eb�d#�Z�eee@eC�d$�Z�e�eՐd%dӐd&�j�e��Z�e�e�eփ�j��d'�Zאd(e�BZ�e�eՐd%dӐd&�j�e��Z�e�e�eك�j��d)�Z�eӐd*�eؐd'�e�eڐd)�Z�e�jܐd+�e�j�jܐd,�e�j�jܐd,�e�j�jܐd-�ddl�Z�e�j�j�e�e�j��e�j�jܐd.�dS(6aS
pyparsing module - Classes and methods to define and execute parsing grammars

The pyparsing module is an alternative approach to creating and executing simple grammars,
vs. the traditional lex/yacc approach, or the use of regular expressions.  With pyparsing, you
don't need to learn a new syntax for defining grammars or matching expressions - the parsing module
provides a library of classes that you use to construct the grammar directly in Python.

Here is a program to parse "Hello, World!" (or any greeting of the form 
C{"<salutation>, <addressee>!"}), built up using L{Word}, L{Literal}, and L{And} elements 
(L{'+'<ParserElement.__add__>} operator gives L{And} expressions, strings are auto-converted to
L{Literal} expressions)::

    from pyparsing import Word, alphas

    # define grammar of a greeting
    greet = Word(alphas) + "," + Word(alphas) + "!"

    hello = "Hello, World!"
    print (hello, "->", greet.parseString(hello))

The program outputs the following::

    Hello, World! -> ['Hello', ',', 'World', '!']

The Python representation of the grammar is quite readable, owing to the self-explanatory
class names, and the use of '+', '|' and '^' operators.

The L{ParseResults} object returned from L{ParserElement.parseString<ParserElement.parseString>} can be accessed as a nested list, a dictionary, or an
object with named attributes.

The pyparsing module handles some of the problems that are typically vexing when writing text parsers:
 - extra or missing whitespace (the above program will also handle "Hello,World!", "Hello  ,  World  !", etc.)
 - quoted strings
 - embedded comments
z2.1.10z07 Oct 2016 01:31 UTCz*Paul McGuire <ptmcg@users.sourceforge.net>�N)�ref)�datetime)�RLock)�OrderedDict�And�CaselessKeyword�CaselessLiteral�
CharsNotIn�Combine�Dict�Each�Empty�
FollowedBy�Forward�
GoToColumn�Group�Keyword�LineEnd�	LineStart�Literal�
MatchFirst�NoMatch�NotAny�	OneOrMore�OnlyOnce�Optional�Or�ParseBaseException�ParseElementEnhance�ParseException�ParseExpression�ParseFatalException�ParseResults�ParseSyntaxException�
ParserElement�QuotedString�RecursiveGrammarException�Regex�SkipTo�	StringEnd�StringStart�Suppress�Token�TokenConverter�White�Word�WordEnd�	WordStart�
ZeroOrMore�	alphanums�alphas�
alphas8bit�anyCloseTag�
anyOpenTag�
cStyleComment�col�commaSeparatedList�commonHTMLEntity�countedArray�cppStyleComment�dblQuotedString�dblSlashComment�
delimitedList�dictOf�downcaseTokens�empty�hexnums�htmlComment�javaStyleComment�line�lineEnd�	lineStart�lineno�makeHTMLTags�makeXMLTags�matchOnlyAtCol�matchPreviousExpr�matchPreviousLiteral�
nestedExpr�nullDebugAction�nums�oneOf�opAssoc�operatorPrecedence�
printables�punc8bit�pythonStyleComment�quotedString�removeQuotes�replaceHTMLEntity�replaceWith�
restOfLine�sglQuotedString�srange�	stringEnd�stringStart�traceParseAction�
unicodeString�upcaseTokens�
withAttribute�
indentedBlock�originalTextFor�ungroup�
infixNotation�locatedExpr�	withClass�
CloseMatch�tokenMap�pyparsing_common�cCs`t|t�r|Syt|�Stk
rZt|�jtj�d�}td�}|jdd��|j	|�SXdS)aDrop-in replacement for str(obj) that tries to be Unicode friendly. It first tries
           str(obj). If that fails with a UnicodeEncodeError, then it tries unicode(obj). It
           then < returns the unicode object | encodes it with the default encoding | ... >.
        �xmlcharrefreplacez&#\d+;cSs$dtt|ddd���dd�S)Nz\ur�����)�hex�int)�t�rw�/usr/lib/python3.6/pyparsing.py�<lambda>�sz_ustr.<locals>.<lambda>N)
�
isinstanceZunicode�str�UnicodeEncodeError�encode�sys�getdefaultencodingr'�setParseAction�transformString)�obj�retZ
xmlcharrefrwrwrx�_ustr�s
r�z6sum len sorted reversed list tuple set any all min maxccs|]
}|VqdS)Nrw)�.0�yrwrwrx�	<genexpr>�sr�rrcCs>d}dd�dj�D�}x"t||�D]\}}|j||�}q"W|S)z/Escape &, <, >, ", ', etc. in a string of data.z&><"'css|]}d|dVqdS)�&�;Nrw)r��srwrwrxr��sz_xml_escape.<locals>.<genexpr>zamp gt lt quot apos)�split�zip�replace)�dataZfrom_symbolsZ
to_symbolsZfrom_Zto_rwrwrx�_xml_escape�s
r�c@seZdZdS)�
_ConstantsN)�__name__�
__module__�__qualname__rwrwrwrxr��sr��
0123456789ZABCDEFabcdef�\�ccs|]}|tjkr|VqdS)N)�stringZ
whitespace)r��crwrwrxr��sc@sPeZdZdZddd�Zedd��Zdd	�Zd
d�Zdd
�Z	ddd�Z
dd�ZdS)rz7base exception class for all parsing runtime exceptionsrNcCs>||_|dkr||_d|_n||_||_||_|||f|_dS)Nr�)�loc�msg�pstr�
parserElement�args)�selfr�r�r��elemrwrwrx�__init__�szParseBaseException.__init__cCs||j|j|j|j�S)z�
        internal factory method to simplify creating one type of ParseException 
        from another - avoids having __init__ signature conflicts among subclasses
        )r�r�r�r�)�cls�perwrwrx�_from_exception�sz"ParseBaseException._from_exceptioncCsN|dkrt|j|j�S|dkr,t|j|j�S|dkrBt|j|j�St|��dS)z�supported attributes by name are:
            - lineno - returns the line number of the exception text
            - col - returns the column number of the exception text
            - line - returns the line containing the exception text
        rJr9�columnrGN)r9r�)rJr�r�r9rG�AttributeError)r�Zanamerwrwrx�__getattr__�szParseBaseException.__getattr__cCsd|j|j|j|jfS)Nz"%s (at char %d), (line:%d, col:%d))r�r�rJr�)r�rwrwrx�__str__�szParseBaseException.__str__cCst|�S)N)r�)r�rwrwrx�__repr__�szParseBaseException.__repr__�>!<cCs<|j}|jd}|r4dj|d|�|||d�f�}|j�S)z�Extracts the exception line from the input string, and marks
           the location of the exception with a special symbol.
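A small usage sketch of this exception helper (the marker string defaults to ">!<" and is inserted at the failing column):

    from pyparsing import Word, nums, ParseException

    try:
        Word(nums).setName("integer").parseString("ABC")
    except ParseException as pe:
        print(pe.markInputline())    # -> >!<ABC
        print(pe.lineno, pe.col)     # -> 1 1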
        rrr�N)rGr��join�strip)r�ZmarkerStringZline_strZline_columnrwrwrx�
markInputline�s
z ParseBaseException.markInputlinecCsdj�tt|��S)Nzlineno col line)r��dir�type)r�rwrwrx�__dir__�szParseBaseException.__dir__)rNN)r�)r�r�r��__doc__r��classmethodr�r�r�r�r�r�rwrwrwrxr�s


c@seZdZdZdS)raN
    Exception thrown when parse expressions don't match the input string;
    supported attributes by name are:
     - lineno - returns the line number of the exception text
     - col - returns the column number of the exception text
     - line - returns the line containing the exception text
        
    Example::
        try:
            Word(nums).setName("integer").parseString("ABC")
        except ParseException as pe:
            print(pe)
            print("column: {}".format(pe.col))
            
    prints::
       Expected integer (at char 0), (line:1, col:1)
        column: 1
    N)r�r�r�r�rwrwrwrxr�sc@seZdZdZdS)r!znuser-throwable exception thrown when inconsistent parse content
       is found; stops all parsing immediatelyN)r�r�r�r�rwrwrwrxr!sc@seZdZdZdS)r#z�just like L{ParseFatalException}, but thrown internally when an
       L{ErrorStop<And._ErrorStop>} ('-' operator) indicates that parsing is to stop 
       immediately because an unbacktrackable syntax error has been foundN)r�r�r�r�rwrwrwrxr#sc@s eZdZdZdd�Zdd�ZdS)r&zZexception thrown by L{ParserElement.validate} if the grammar could be improperly recursivecCs
||_dS)N)�parseElementTrace)r��parseElementListrwrwrxr�sz"RecursiveGrammarException.__init__cCs
d|jS)NzRecursiveGrammarException: %s)r�)r�rwrwrxr� sz!RecursiveGrammarException.__str__N)r�r�r�r�r�r�rwrwrwrxr&sc@s,eZdZdd�Zdd�Zdd�Zdd�Zd	S)
�_ParseResultsWithOffsetcCs||f|_dS)N)�tup)r�Zp1Zp2rwrwrxr�$sz _ParseResultsWithOffset.__init__cCs
|j|S)N)r�)r��irwrwrx�__getitem__&sz#_ParseResultsWithOffset.__getitem__cCst|jd�S)Nr)�reprr�)r�rwrwrxr�(sz _ParseResultsWithOffset.__repr__cCs|jd|f|_dS)Nr)r�)r�r�rwrwrx�	setOffset*sz!_ParseResultsWithOffset.setOffsetN)r�r�r�r�r�r�r�rwrwrwrxr�#sr�c@s�eZdZdZd[dd�Zddddefdd�Zdd	�Zefd
d�Zdd
�Z	dd�Z
dd�Zdd�ZeZ
dd�Zdd�Zdd�Zdd�Zdd�Zer�eZeZeZn$eZeZeZdd�Zd d!�Zd"d#�Zd$d%�Zd&d'�Zd\d(d)�Zd*d+�Zd,d-�Zd.d/�Zd0d1�Z d2d3�Z!d4d5�Z"d6d7�Z#d8d9�Z$d:d;�Z%d<d=�Z&d]d?d@�Z'dAdB�Z(dCdD�Z)dEdF�Z*d^dHdI�Z+dJdK�Z,dLdM�Z-d_dOdP�Z.dQdR�Z/dSdT�Z0dUdV�Z1dWdX�Z2dYdZ�Z3dS)`r"aI
    Structured parse results, to provide multiple means of access to the parsed data:
       - as a list (C{len(results)})
       - by list index (C{results[0], results[1]}, etc.)
       - by attribute (C{results.<resultsName>} - see L{ParserElement.setResultsName})

    Example::
        integer = Word(nums)
        date_str = (integer.setResultsName("year") + '/' 
                        + integer.setResultsName("month") + '/' 
                        + integer.setResultsName("day"))
        # equivalent form:
        # date_str = integer("year") + '/' + integer("month") + '/' + integer("day")

        # parseString returns a ParseResults object
        result = date_str.parseString("1999/12/31")

        def test(s, fn=repr):
            print("%s -> %s" % (s, fn(eval(s))))
        test("list(result)")
        test("result[0]")
        test("result['month']")
        test("result.day")
        test("'month' in result")
        test("'minutes' in result")
        test("result.dump()", str)
    prints::
        list(result) -> ['1999', '/', '12', '/', '31']
        result[0] -> '1999'
        result['month'] -> '12'
        result.day -> '31'
        'month' in result -> True
        'minutes' in result -> False
        result.dump() -> ['1999', '/', '12', '/', '31']
        - day: 31
        - month: 12
        - year: 1999
    NTcCs"t||�r|Stj|�}d|_|S)NT)rz�object�__new__�_ParseResults__doinit)r��toklist�name�asList�modalZretobjrwrwrxr�Ts


zParseResults.__new__c
Cs`|jrvd|_d|_d|_i|_||_||_|dkr6g}||t�rP|dd�|_n||t�rft|�|_n|g|_t	�|_
|dk	o�|�r\|s�d|j|<||t�r�t|�}||_||t
d�ttf�o�|ddgfk�s\||t�r�|g}|�r&||t��rt|j�d�||<ntt|d�d�||<|||_n6y|d||<Wn$tttfk
�rZ|||<YnXdS)NFrr�)r��_ParseResults__name�_ParseResults__parent�_ParseResults__accumNames�_ParseResults__asList�_ParseResults__modal�list�_ParseResults__toklist�_generatorType�dict�_ParseResults__tokdictrur�r��
basestringr"r��copy�KeyError�	TypeError�
IndexError)r�r�r�r�r�rzrwrwrxr�]sB



$
zParseResults.__init__cCsPt|ttf�r|j|S||jkr4|j|ddStdd�|j|D��SdS)NrrrcSsg|]}|d�qS)rrw)r��vrwrwrx�
<listcomp>�sz,ParseResults.__getitem__.<locals>.<listcomp>rs)rzru�slicer�r�r�r")r�r�rwrwrxr��s


zParseResults.__getitem__cCs�||t�r0|jj|t��|g|j|<|d}nD||ttf�rN||j|<|}n&|jj|t��t|d�g|j|<|}||t�r�t|�|_	dS)Nr)
r�r��getr�rur�r�r"�wkrefr�)r��kr�rz�subrwrwrx�__setitem__�s


"
zParseResults.__setitem__c
Cs�t|ttf�r�t|j�}|j|=t|t�rH|dkr:||7}t||d�}tt|j|���}|j�x^|j	j
�D]F\}}x<|D]4}x.t|�D]"\}\}}	t||	|	|k�||<q�Wq|WqnWn|j	|=dS)Nrrr)
rzrur��lenr�r��range�indices�reverser��items�	enumerater�)
r�r�ZmylenZremovedr��occurrences�jr��value�positionrwrwrx�__delitem__�s


$zParseResults.__delitem__cCs
||jkS)N)r�)r�r�rwrwrx�__contains__�szParseResults.__contains__cCs
t|j�S)N)r�r�)r�rwrwrx�__len__�szParseResults.__len__cCs
|jS)N)r�)r�rwrwrx�__bool__�szParseResults.__bool__cCs
t|j�S)N)�iterr�)r�rwrwrx�__iter__�szParseResults.__iter__cCst|jddd��S)Nrrrs)r�r�)r�rwrwrx�__reversed__�szParseResults.__reversed__cCs$t|jd�r|jj�St|j�SdS)N�iterkeys)�hasattrr�r�r�)r�rwrwrx�	_iterkeys�s
zParseResults._iterkeyscs�fdd��j�D�S)Nc3s|]}�|VqdS)Nrw)r�r�)r�rwrxr��sz+ParseResults._itervalues.<locals>.<genexpr>)r�)r�rw)r�rx�_itervalues�szParseResults._itervaluescs�fdd��j�D�S)Nc3s|]}|�|fVqdS)Nrw)r�r�)r�rwrxr��sz*ParseResults._iteritems.<locals>.<genexpr>)r�)r�rw)r�rx�
_iteritems�szParseResults._iteritemscCst|j��S)zVReturns all named result keys (as a list in Python 2.x, as an iterator in Python 3.x).)r�r�)r�rwrwrx�keys�szParseResults.keyscCst|j��S)zXReturns all named result values (as a list in Python 2.x, as an iterator in Python 3.x).)r��
itervalues)r�rwrwrx�values�szParseResults.valuescCst|j��S)zfReturns all named result key-values (as a list of tuples in Python 2.x, as an iterator in Python 3.x).)r��	iteritems)r�rwrwrxr��szParseResults.itemscCs
t|j�S)z�Since keys() returns an iterator, this method is helpful in bypassing
           code that looks for the existence of any defined results names.)�boolr�)r�rwrwrx�haskeys�szParseResults.haskeyscOs�|s
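A quick illustrative sketch of this check:

    from pyparsing import Word, alphas, nums

    result = (Word(alphas)("word") + Word(nums)).parseString("abc 123")
    print(result.haskeys())                         # -> True, "word" is a defined results name
    print(Word(nums).parseString("42").haskeys())   # -> False, no results names defined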
dg}x6|j�D]*\}}|dkr2|d|f}qtd|��qWt|dt�sht|�dksh|d|kr�|d}||}||=|S|d}|SdS)a�
        Removes and returns item at specified index (default=C{last}).
        Supports both C{list} and C{dict} semantics for C{pop()}. If passed no
        argument or an integer argument, it will use C{list} semantics
        and pop tokens from the list of parsed tokens. If passed a 
        non-integer argument (most likely a string), it will use C{dict}
        semantics and pop the corresponding value from any defined 
        results names. A second default return value argument is 
        supported, just as in C{dict.pop()}.

        Example::
            def remove_first(tokens):
                tokens.pop(0)
            print(OneOrMore(Word(nums)).parseString("0 123 321")) # -> ['0', '123', '321']
            print(OneOrMore(Word(nums)).addParseAction(remove_first).parseString("0 123 321")) # -> ['123', '321']

            label = Word(alphas)
            patt = label("LABEL") + OneOrMore(Word(nums))
            print(patt.parseString("AAB 123 321").dump())

            # Use pop() in a parse action to remove named result (note that corresponding value is not
            # removed from list form of results)
            def remove_LABEL(tokens):
                tokens.pop("LABEL")
                return tokens
            patt.addParseAction(remove_LABEL)
            print(patt.parseString("AAB 123 321").dump())
        prints::
            ['AAB', '123', '321']
            - LABEL: AAB

            ['AAB', '123', '321']
        rr�defaultrz-pop() got an unexpected keyword argument '%s'Nrs)r�r�rzrur�)r�r��kwargsr�r��indexr�Zdefaultvaluerwrwrx�pop�s"zParseResults.popcCs||kr||S|SdS)ai
        Returns named result matching the given key, or if there is no
        such name, then returns the given C{defaultValue} or C{None} if no
        C{defaultValue} is specified.

        Similar to C{dict.get()}.
        
        Example::
            integer = Word(nums)
            date_str = integer("year") + '/' + integer("month") + '/' + integer("day")           

            result = date_str.parseString("1999/12/31")
            print(result.get("year")) # -> '1999'
            print(result.get("hour", "not specified")) # -> 'not specified'
            print(result.get("hour")) # -> None
        Nrw)r��key�defaultValuerwrwrxr�szParseResults.getcCsZ|jj||�xF|jj�D]8\}}x.t|�D]"\}\}}t||||k�||<q,WqWdS)a
        Inserts new element at location index in the list of parsed tokens.
        
        Similar to C{list.insert()}.

        Example::
            print(OneOrMore(Word(nums)).parseString("0 123 321")) # -> ['0', '123', '321']

            # use a parse action to insert the parse location in the front of the parsed results
            def insert_locn(locn, tokens):
                tokens.insert(0, locn)
            print(OneOrMore(Word(nums)).addParseAction(insert_locn).parseString("0 123 321")) # -> [0, '0', '123', '321']
        N)r��insertr�r�r�r�)r�r�ZinsStrr�r�r�r�r�rwrwrxr�2szParseResults.insertcCs|jj|�dS)a�
        Add single element to end of ParseResults list of elements.

        Example::
            print(OneOrMore(Word(nums)).parseString("0 123 321")) # -> ['0', '123', '321']
            
            # use a parse action to compute the sum of the parsed integers, and add it to the end
            def append_sum(tokens):
                tokens.append(sum(map(int, tokens)))
            print(OneOrMore(Word(nums)).addParseAction(append_sum).parseString("0 123 321")) # -> ['0', '123', '321', 444]
        N)r��append)r��itemrwrwrxr�FszParseResults.appendcCs$t|t�r||7}n|jj|�dS)a
        Add sequence of elements to end of ParseResults list of elements.

        Example::
            patt = OneOrMore(Word(alphas))
            
            # use a parse action to append the reverse of the matched strings, to make a palindrome
            def make_palindrome(tokens):
                tokens.extend(reversed([t[::-1] for t in tokens]))
                return ''.join(tokens)
            print(patt.addParseAction(make_palindrome).parseString("lskdj sdlkjf lksd")) # -> 'lskdjsdlkjflksddsklfjkldsjdksl'
        N)rzr"r��extend)r�Zitemseqrwrwrxr�Ts

zParseResults.extendcCs|jdd�=|jj�dS)z7
        Clear all elements and results names.
        N)r�r��clear)r�rwrwrxr�fszParseResults.clearcCsfy||Stk
rdSX||jkr^||jkrD|j|ddStdd�|j|D��SndSdS)Nr�rrrcSsg|]}|d�qS)rrw)r�r�rwrwrxr�wsz,ParseResults.__getattr__.<locals>.<listcomp>rs)r�r�r�r")r�r�rwrwrxr�ms

zParseResults.__getattr__cCs|j�}||7}|S)N)r�)r��otherr�rwrwrx�__add__{szParseResults.__add__cs�|jrnt|j���fdd��|jj�}�fdd�|D�}x4|D],\}}|||<t|dt�r>t|�|d_q>W|j|j7_|jj	|j�|S)Ncs|dkr�S|�S)Nrrw)�a)�offsetrwrxry�sz'ParseResults.__iadd__.<locals>.<lambda>c	s4g|],\}}|D]}|t|d�|d��f�qqS)rrr)r�)r�r��vlistr�)�	addoffsetrwrxr��sz)ParseResults.__iadd__.<locals>.<listcomp>r)
r�r�r�r�rzr"r�r�r��update)r�r�Z
otheritemsZotherdictitemsr�r�rw)rrrx�__iadd__�s


zParseResults.__iadd__cCs&t|t�r|dkr|j�S||SdS)Nr)rzrur�)r�r�rwrwrx�__radd__�szParseResults.__radd__cCsdt|j�t|j�fS)Nz(%s, %s))r�r�r�)r�rwrwrxr��szParseResults.__repr__cCsddjdd�|jD��dS)N�[z, css(|] }t|t�rt|�nt|�VqdS)N)rzr"r�r�)r�r�rwrwrxr��sz'ParseResults.__str__.<locals>.<genexpr>�])r�r�)r�rwrwrxr��szParseResults.__str__r�cCsPg}xF|jD]<}|r"|r"|j|�t|t�r:||j�7}q|jt|��qW|S)N)r�r�rzr"�
_asStringListr�)r��sep�outr�rwrwrxr
�s

zParseResults._asStringListcCsdd�|jD�S)a�
        Returns the parse results as a nested list of matching tokens, all converted to strings.

        Example::
            patt = OneOrMore(Word(alphas))
            result = patt.parseString("sldkj lsdkj sldkj")
            # even though the result prints in string-like form, it is actually a pyparsing ParseResults
            print(type(result), result) # -> <class 'pyparsing.ParseResults'> ['sldkj', 'lsdkj', 'sldkj']
            
            # Use asList() to create an actual list
            result_list = result.asList()
            print(type(result_list), result_list) # -> <class 'list'> ['sldkj', 'lsdkj', 'sldkj']
        cSs"g|]}t|t�r|j�n|�qSrw)rzr"r�)r��resrwrwrxr��sz'ParseResults.asList.<locals>.<listcomp>)r�)r�rwrwrxr��szParseResults.asListcs6tr|j}n|j}�fdd��t�fdd�|�D��S)a�
        Returns the named parse results as a nested dictionary.

        Example::
            integer = Word(nums)
            date_str = integer("year") + '/' + integer("month") + '/' + integer("day")
            
            result = date_str.parseString('12/31/1999')
            print(type(result), repr(result)) # -> <class 'pyparsing.ParseResults'> (['12', '/', '31', '/', '1999'], {'day': [('1999', 4)], 'year': [('12', 0)], 'month': [('31', 2)]})
            
            result_dict = result.asDict()
            print(type(result_dict), repr(result_dict)) # -> <class 'dict'> {'day': '1999', 'year': '12', 'month': '31'}

            # even though a ParseResults supports dict-like access, sometimes you just need to have a dict
            import json
            print(json.dumps(result)) # -> Exception: TypeError: ... is not JSON serializable
            print(json.dumps(result.asDict())) # -> {"month": "31", "day": "1999", "year": "12"}
        cs6t|t�r.|j�r|j�S�fdd�|D�Sn|SdS)Ncsg|]}�|��qSrwrw)r�r�)�toItemrwrxr��sz7ParseResults.asDict.<locals>.toItem.<locals>.<listcomp>)rzr"r��asDict)r�)rrwrxr�s

z#ParseResults.asDict.<locals>.toItemc3s|]\}}|�|�fVqdS)Nrw)r�r�r�)rrwrxr��sz&ParseResults.asDict.<locals>.<genexpr>)�PY_3r�r�r�)r�Zitem_fnrw)rrxr�s
	zParseResults.asDictcCs8t|j�}|jj�|_|j|_|jj|j�|j|_|S)zA
        Returns a new copy of a C{ParseResults} object.
        )r"r�r�r�r�r�rr�)r�r�rwrwrxr��s
zParseResults.copyFcCsPd}g}tdd�|jj�D��}|d}|s8d}d}d}d}	|dk	rJ|}	n|jrV|j}	|	sf|rbdSd}	|||d|	d	g7}x�t|j�D]�\}
}t|t�r�|
|kr�||j||
|o�|dk||�g7}n||jd|o�|dk||�g7}q�d}|
|kr�||
}|�s
|�rq�nd}t	t
|��}
|||d|d	|
d
|d	g	7}q�W|||d
|	d	g7}dj|�S)z�
        (Deprecated) Returns the parse results as XML. Tags are created for tokens and lists that have defined results names.
        �
css(|] \}}|D]}|d|fVqqdS)rrNrw)r�r�rr�rwrwrxr��sz%ParseResults.asXML.<locals>.<genexpr>z  r�NZITEM�<�>z</)r�r�r�r�r�r�rzr"�asXMLr�r�r�)r�ZdoctagZnamedItemsOnly�indentZ	formatted�nlrZ
namedItemsZnextLevelIndentZselfTagr�r
ZresTagZxmlBodyTextrwrwrxr�sT


zParseResults.asXMLcCs:x4|jj�D]&\}}x|D]\}}||kr|SqWqWdS)N)r�r�)r�r�r�rr�r�rwrwrxZ__lookup$s
zParseResults.__lookupcCs�|jr|jS|jr.|j�}|r(|j|�SdSnNt|�dkrxt|j�dkrxtt|jj���dddkrxtt|jj���SdSdS)a(
        Returns the results name for this token expression. Useful when several 
        different expressions might match at a particular location.

        Example::
            integer = Word(nums)
            ssn_expr = Regex(r"\d\d\d-\d\d-\d\d\d\d")
            house_number_expr = Suppress('#') + Word(nums, alphanums)
            user_data = (Group(house_number_expr)("house_number") 
                        | Group(ssn_expr)("ssn")
                        | Group(integer)("age"))
            user_info = OneOrMore(user_data)
            
            result = user_info.parseString("22 111-22-3333 #221B")
            for item in result:
                print(item.getName(), ':', item[0])
        prints::
            age : 22
            ssn : 111-22-3333
            house_number : 221B
        Nrrrrs)rrs)	r�r��_ParseResults__lookupr�r��nextr�r�r�)r��parrwrwrx�getName+s
zParseResults.getNamercCsbg}d}|j|t|j���|�rX|j�r�tdd�|j�D��}xz|D]r\}}|r^|j|�|jd|d||f�t|t�r�|r�|j|j||d��q�|jt|��qH|jt	|��qHWn�t
dd�|D���rX|}x~t|�D]r\}	}
t|
t��r*|jd|d||	|d|d|
j||d�f�q�|jd|d||	|d|dt|
�f�q�Wd	j|�S)
aH
        Diagnostic method for listing out the contents of a C{ParseResults}.
        Accepts an optional C{indent} argument so that this string can be embedded
        in a nested display of other data.

        Example::
            integer = Word(nums)
            date_str = integer("year") + '/' + integer("month") + '/' + integer("day")
            
            result = date_str.parseString('12/31/1999')
            print(result.dump())
        prints::
            ['12', '/', '31', '/', '1999']
            - day: 1999
            - month: 31
            - year: 12
        rcss|]\}}t|�|fVqdS)N)r{)r�r�r�rwrwrxr�gsz$ParseResults.dump.<locals>.<genexpr>z
%s%s- %s: z  rrcss|]}t|t�VqdS)N)rzr")r��vvrwrwrxr�ssz
%s%s[%d]:
%s%s%sr�)
r�r�r�r��sortedr�rzr"�dumpr��anyr�r�)r�r�depth�fullr�NLr�r�r�r�rrwrwrxrPs,

4.zParseResults.dumpcOstj|j�f|�|�dS)a�
        Pretty-printer for parsed results as a list, using the C{pprint} module.
        Accepts additional positional or keyword args as defined for the 
        C{pprint.pprint} method. (U{http://docs.python.org/3/library/pprint.html#pprint.pprint})

        Example::
            ident = Word(alphas, alphanums)
            num = Word(nums)
            func = Forward()
            term = ident | num | Group('(' + func + ')')
            func <<= ident + Group(Optional(delimitedList(term)))
            result = func.parseString("fna a,b,(fnb c,d,200),100")
            result.pprint(width=40)
        prints::
            ['fna',
             ['a',
              'b',
              ['(', 'fnb', ['c', 'd', '200'], ')'],
              '100']]
        N)�pprintr�)r�r�r�rwrwrxr"}szParseResults.pprintcCs.|j|jj�|jdk	r|j�p d|j|jffS)N)r�r�r�r�r�r�)r�rwrwrx�__getstate__�s
zParseResults.__getstate__cCsN|d|_|d\|_}}|_i|_|jj|�|dk	rDt|�|_nd|_dS)Nrrr)r�r�r�r�rr�r�)r��staterZinAccumNamesrwrwrx�__setstate__�s
zParseResults.__setstate__cCs|j|j|j|jfS)N)r�r�r�r�)r�rwrwrx�__getnewargs__�szParseResults.__getnewargs__cCstt|��t|j��S)N)r�r�r�r�)r�rwrwrxr��szParseResults.__dir__)NNTT)N)r�)NFr�T)r�rT)4r�r�r�r�r�rzr�r�r�r�r�r�r��__nonzero__r�r�r�r�r�rr�r�r�r�r�r�r�r�r�r�r�r�r�r�rrrr�r�r
r�rr�rrrrr"r#r%r&r�rwrwrwrxr"-sh&
	'	
4

#
=%
-
cCsF|}d|kot|�knr4||ddkr4dS||jdd|�S)aReturns current column within a string, counting newlines as line separators.
   The first column is number 1.

   Note: the default parsing behavior is to expand tabs in the input string
   before starting the parsing process.  See L{I{ParserElement.parseString}<ParserElement.parseString>} for more information
   on parsing strings containing C{<TAB>}s, and suggested methods to maintain a
   consistent view of the parsed string, the parse location, and line and column
   positions within the parsed string.
   rrrr)r��rfind)r��strgr�rwrwrxr9�s
cCs|jdd|�dS)aReturns current line number within a string, counting newlines as line separators.
   The first line is number 1.

   Note: the default parsing behavior is to expand tabs in the input string
   before starting the parsing process.  See L{I{ParserElement.parseString}<ParserElement.parseString>} for more information
   on parsing strings containing C{<TAB>}s, and suggested methods to maintain a
   consistent view of the parsed string, the parse location, and line and column
   positions within the parsed string.
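An illustrative sketch of these helpers inside a parse action (hypothetical grammar; col and lineno are the module-level functions documented above):

    from pyparsing import Word, alphas, col, lineno

    def report(s, loc, toks):
        # s is the original string being parsed, loc the match location
        print(toks[0], "at line", lineno(loc, s), "column", col(loc, s))

    Word(alphas).setParseAction(report).searchString("abc\n  def")
    # -> abc at line 1 column 1
    # -> def at line 2 column 3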
   rrrr)�count)r�r)rwrwrxrJ�s
cCsF|jdd|�}|jd|�}|dkr2||d|�S||dd�SdS)zfReturns the line of text containing loc within a string, counting newlines as line separators.
       rrrrN)r(�find)r�r)ZlastCRZnextCRrwrwrxrG�s
cCs8tdt|�dt|�dt||�t||�f�dS)NzMatch z at loc z(%d,%d))�printr�rJr9)�instringr��exprrwrwrx�_defaultStartDebugAction�sr/cCs$tdt|�dt|j���dS)NzMatched z -> )r,r�r{r�)r-�startlocZendlocr.�toksrwrwrx�_defaultSuccessDebugAction�sr2cCstdt|��dS)NzException raised:)r,r�)r-r�r.�excrwrwrx�_defaultExceptionDebugAction�sr4cGsdS)zG'Do-nothing' debug action, to suppress debugging output during parsing.Nrw)r�rwrwrxrQ�srqcs��tkr�fdd�Sdg�dg�tdd�dkrFddd	�}dd
d��ntj}tj�d}|dd
�d}|d|d|f�������fdd�}d}yt�dt�d�j�}Wntk
r�t��}YnX||_|S)Ncs�|�S)Nrw)r��lrv)�funcrwrxry�sz_trim_arity.<locals>.<lambda>rFrqro�cSs8tdkrdnd	}tj||dd�|}|j|jfgS)
Nror7rrqrr)�limit)ror7r������)�system_version�	traceback�
extract_stack�filenamerJ)r8r�
frame_summaryrwrwrxr=sz"_trim_arity.<locals>.extract_stackcSs$tj||d�}|d}|j|jfgS)N)r8rrrs)r<�
extract_tbr>rJ)�tbr8Zframesr?rwrwrxr@sz_trim_arity.<locals>.extract_tb�)r8rrcs�x�y �|�dd��}d�d<|Stk
r��dr>�n4z.tj�d}�|dd�ddd��ksj�Wd~X�d�kr��dd7<w�YqXqWdS)NrTrrrq)r8rsrs)r�r~�exc_info)r�r�rA)r@�
foundArityr6r8�maxargs�pa_call_line_synthrwrx�wrappers"z_trim_arity.<locals>.wrapperz<parse action>r��	__class__)ror7)r)rrs)	�singleArgBuiltinsr;r<r=r@�getattrr��	Exceptionr{)r6rEr=Z	LINE_DIFFZ	this_linerG�	func_namerw)r@rDr6r8rErFrx�_trim_arity�s*
rMcs�eZdZdZdZdZedd��Zedd��Zd�dd	�Z	d
d�Z
dd
�Zd�dd�Zd�dd�Z
dd�Zdd�Zdd�Zdd�Zdd�Zdd�Zd�dd �Zd!d"�Zd�d#d$�Zd%d&�Zd'd(�ZGd)d*�d*e�Zed+k	r�Gd,d-�d-e�ZnGd.d-�d-e�ZiZe�Zd/d/gZ d�d0d1�Z!eZ"ed2d3��Z#dZ$ed�d5d6��Z%d�d7d8�Z&e'dfd9d:�Z(d;d<�Z)e'fd=d>�Z*e'dfd?d@�Z+dAdB�Z,dCdD�Z-dEdF�Z.dGdH�Z/dIdJ�Z0dKdL�Z1dMdN�Z2dOdP�Z3dQdR�Z4dSdT�Z5dUdV�Z6dWdX�Z7dYdZ�Z8d�d[d\�Z9d]d^�Z:d_d`�Z;dadb�Z<dcdd�Z=dedf�Z>dgdh�Z?d�didj�Z@dkdl�ZAdmdn�ZBdodp�ZCdqdr�ZDgfdsdt�ZEd�dudv�ZF�fdwdx�ZGdydz�ZHd{d|�ZId}d~�ZJdd��ZKd�d�d��ZLd�d�d��ZM�ZNS)�r$z)Abstract base level parser element class.z 
	
FcCs
|t_dS)a�
        Overrides the default whitespace chars

        Example::
            # default whitespace chars are space, <TAB> and newline
            OneOrMore(Word(alphas)).parseString("abc def\nghi jkl")  # -> ['abc', 'def', 'ghi', 'jkl']
            
            # change to just treat newline as significant
            ParserElement.setDefaultWhitespaceChars(" \t")
            OneOrMore(Word(alphas)).parseString("abc def\nghi jkl")  # -> ['abc', 'def']
        N)r$�DEFAULT_WHITE_CHARS)�charsrwrwrx�setDefaultWhitespaceChars=s
z'ParserElement.setDefaultWhitespaceCharscCs
|t_dS)a�
        Set class to be used for inclusion of string literals into a parser.
        
        Example::
            # default literal class used is Literal
            integer = Word(nums)
            date_str = integer("year") + '/' + integer("month") + '/' + integer("day")           

            date_str.parseString("1999/12/31")  # -> ['1999', '/', '12', '/', '31']


            # change to Suppress
            ParserElement.inlineLiteralsUsing(Suppress)
            date_str = integer("year") + '/' + integer("month") + '/' + integer("day")           

            date_str.parseString("1999/12/31")  # -> ['1999', '12', '31']
        N)r$�_literalStringClass)r�rwrwrx�inlineLiteralsUsingLsz!ParserElement.inlineLiteralsUsingcCs�t�|_d|_d|_d|_||_d|_tj|_	d|_
d|_d|_t�|_
d|_d|_d|_d|_d|_d|_d|_d|_d|_dS)NTFr�)NNN)r��parseAction�
failAction�strRepr�resultsName�
saveAsList�skipWhitespacer$rN�
whiteChars�copyDefaultWhiteChars�mayReturnEmpty�keepTabs�ignoreExprs�debug�streamlined�
mayIndexError�errmsg�modalResults�debugActions�re�callPreparse�
callDuringTry)r��savelistrwrwrxr�as(zParserElement.__init__cCs<tj|�}|jdd�|_|jdd�|_|jr8tj|_|S)a$
        Make a copy of this C{ParserElement}.  Useful for defining different parse actions
        for the same parsing pattern, using copies of the original parse element.
        
        Example::
            integer = Word(nums).setParseAction(lambda toks: int(toks[0]))
            integerK = integer.copy().addParseAction(lambda toks: toks[0]*1024) + Suppress("K")
            integerM = integer.copy().addParseAction(lambda toks: toks[0]*1024*1024) + Suppress("M")
            
            print(OneOrMore(integerK | integerM | integer).parseString("5K 100 640K 256M"))
        prints::
            [5120, 100, 655360, 268435456]
        Equivalent form of C{expr.copy()} is just C{expr()}::
            integerM = integer().addParseAction(lambda toks: toks[0]*1024*1024) + Suppress("M")
        N)r�rSr]rZr$rNrY)r�Zcpyrwrwrxr�xs
zParserElement.copycCs*||_d|j|_t|d�r&|j|j_|S)af
        Define name for this expression, makes debugging and exception messages clearer.
        
        Example::
            Word(nums).parseString("ABC")  # -> Exception: Expected W:(0123...) (at char 0), (line:1, col:1)
            Word(nums).setName("integer").parseString("ABC")  # -> Exception: Expected integer (at char 0), (line:1, col:1)
        z	Expected �	exception)r�rar�rhr�)r�r�rwrwrx�setName�s


zParserElement.setNamecCs4|j�}|jd�r"|dd�}d}||_||_|S)aP
        Define name for referencing matching tokens as a nested attribute
        of the returned parse results.
        NOTE: this returns a *copy* of the original C{ParserElement} object;
        this is so that the client can define a basic element, such as an
        integer, and reference it in multiple places with different names.

        You can also set results names using the abbreviated syntax,
        C{expr("name")} in place of C{expr.setResultsName("name")} - 
        see L{I{__call__}<__call__>}.

        Example::
            date_str = (integer.setResultsName("year") + '/' 
                        + integer.setResultsName("month") + '/' 
                        + integer.setResultsName("day"))

            # equivalent form:
            date_str = integer("year") + '/' + integer("month") + '/' + integer("day")
        �*NrrTrs)r��endswithrVrb)r�r��listAllMatchesZnewselfrwrwrx�setResultsName�s
zParserElement.setResultsNameTcs@|r&|j�d�fdd�	}�|_||_nt|jd�r<|jj|_|S)z�Method to invoke the Python pdb debugger when this element is
           about to be parsed. Set C{breakFlag} to True to enable, False to
           disable.
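A minimal sketch of this debugging aid (only useful in an interactive session, since it drops into pdb):

    from pyparsing import Word, alphas, nums

    total = Word(nums).setName("total").setBreak()
    (Word(alphas) + total).parseString("abc 123")   # pdb prompt opens just before 'total' is matched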
        Tcsddl}|j��||||�S)Nr)�pdbZ	set_trace)r-r��	doActions�callPreParsern)�_parseMethodrwrx�breaker�sz'ParserElement.setBreak.<locals>.breaker�_originalParseMethod)TT)�_parsersr�)r�Z	breakFlagrrrw)rqrx�setBreak�s
zParserElement.setBreakcOs&tttt|���|_|jdd�|_|S)a
        Define action to perform when successfully matching parse element definition.
        Parse action fn is a callable method with 0-3 arguments, called as C{fn(s,loc,toks)},
        C{fn(loc,toks)}, C{fn(toks)}, or just C{fn()}, where:
         - s   = the original string being parsed (see note below)
         - loc = the location of the matching substring
         - toks = a list of the matched tokens, packaged as a C{L{ParseResults}} object
        If the functions in fns modify the tokens, they can return them as the return
        value from fn, and the modified list of tokens will replace the original.
        Otherwise, fn does not need to return any value.

        Optional keyword arguments:
         - callDuringTry = (default=C{False}) indicate if parse action should be run during lookaheads and alternate testing

        Note: the default parsing behavior is to expand tabs in the input string
        before starting the parsing process.  See L{I{parseString}<parseString>} for more information
        on parsing strings containing C{<TAB>}s, and suggested methods to maintain a
        consistent view of the parsed string, the parse location, and line and column
        positions within the parsed string.
        
        Example::
            integer = Word(nums)
            date_str = integer + '/' + integer + '/' + integer

            date_str.parseString("1999/12/31")  # -> ['1999', '/', '12', '/', '31']

            # use parse action to convert to ints at parse time
            integer = Word(nums).setParseAction(lambda toks: int(toks[0]))
            date_str = integer + '/' + integer + '/' + integer

            # note that integer fields are now ints, not strings
            date_str.parseString("1999/12/31")  # -> [1999, '/', 12, '/', 31]
        rfF)r��maprMrSr�rf)r��fnsr�rwrwrxr��s"zParserElement.setParseActioncOs4|jtttt|���7_|jp,|jdd�|_|S)z�
        Add parse action to expression's list of parse actions. See L{I{setParseAction}<setParseAction>}.
        
        See examples in L{I{copy}<copy>}.
        rfF)rSr�rvrMrfr�)r�rwr�rwrwrx�addParseAction�szParserElement.addParseActioncsb|jdd��|jdd�rtnt�x(|D] ����fdd�}|jj|�q&W|jpZ|jdd�|_|S)a�Add a boolean predicate function to expression's list of parse actions. See 
        L{I{setParseAction}<setParseAction>} for function call signatures. Unlike C{setParseAction}, 
        functions passed to C{addCondition} need to return boolean success/fail of the condition.

        Optional keyword arguments:
         - message = define a custom message to be used in the raised exception
         - fatal   = if True, will raise ParseFatalException to stop parsing immediately; otherwise will raise ParseException
         
        Example::
            integer = Word(nums).setParseAction(lambda toks: int(toks[0]))
            year_int = integer.copy()
            year_int.addCondition(lambda toks: toks[0] >= 2000, message="Only support years 2000 and later")
            date_str = year_int + '/' + integer + '/' + integer

            result = date_str.parseString("1999/12/31")  # -> Exception: Only support years 2000 and later (at char 0), (line:1, col:1)
        �messagezfailed user-defined condition�fatalFcs$tt��|||��s �||���dS)N)r�rM)r�r5rv)�exc_type�fnr�rwrx�pasz&ParserElement.addCondition.<locals>.parf)r�r!rrSr�rf)r�rwr�r}rw)r{r|r�rx�addCondition�s
zParserElement.addConditioncCs
||_|S)aDefine action to perform if parsing fails at this expression.
           Fail action fn is a callable function that takes the arguments
           C{fn(s,loc,expr,err)} where:
            - s = string being parsed
            - loc = location where expression match was attempted and failed
            - expr = the parse expression that failed
            - err = the exception thrown
           The function returns no value.  It may throw C{L{ParseFatalException}}
           if it is desired to stop parsing immediately.)rT)r�r|rwrwrx�
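A hedged sketch of a fail action with the signature described above, attached to a simple integer expression:

    from pyparsing import Word, nums, ParseException

    def report_failure(s, loc, expr, err):
        # called when 'expr' fails to match at 'loc'; the original exception is in 'err'
        print("failed to match %s at loc %d: %s" % (expr, loc, err))

    integer = Word(nums).setName("integer").setFailAction(report_failure)
    try:
        integer.parseString("abc")
    except ParseException:
        pass        # the fail action has already printed its report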
setFailActions
zParserElement.setFailActioncCsZd}xP|rTd}xB|jD]8}yx|j||�\}}d}qWWqtk
rLYqXqWqW|S)NTF)r]rtr)r�r-r�Z
exprsFound�eZdummyrwrwrx�_skipIgnorables#szParserElement._skipIgnorablescCsL|jr|j||�}|jrH|j}t|�}x ||krF|||krF|d7}q(W|S)Nrr)r]r�rXrYr�)r�r-r�Zwt�instrlenrwrwrx�preParse0szParserElement.preParsecCs|gfS)Nrw)r�r-r�rorwrwrx�	parseImpl<szParserElement.parseImplcCs|S)Nrw)r�r-r��	tokenlistrwrwrx�	postParse?szParserElement.postParsec"Cs�|j}|s|jr�|jdr,|jd|||�|rD|jrD|j||�}n|}|}yDy|j|||�\}}Wn(tk
r�t|t|�|j	|��YnXWnXt
k
r�}	z<|jdr�|jd||||	�|jr�|j||||	��WYdd}	~	XnXn�|o�|j�r|j||�}n|}|}|j�s$|t|�k�rhy|j|||�\}}Wn*tk
�rdt|t|�|j	|��YnXn|j|||�\}}|j|||�}t
||j|j|jd�}
|j�r�|�s�|j�r�|�rVyRxL|jD]B}||||
�}|dk	�r�t
||j|j�o�t|t
tf�|jd�}
�q�WWnFt
k
�rR}	z(|jd�r@|jd||||	��WYdd}	~	XnXnNxL|jD]B}||||
�}|dk	�r^t
||j|j�o�t|t
tf�|jd�}
�q^W|�r�|jd�r�|jd|||||
�||
fS)Nrrq)r�r�rr)r^rTrcrer�r�r�rr�rarr`r�r"rVrWrbrSrfrzr�)r�r-r�rorpZ	debugging�prelocZtokensStart�tokens�errZ	retTokensr|rwrwrx�
_parseNoCacheCsp





zParserElement._parseNoCachecCs>y|j||dd�dStk
r8t|||j|��YnXdS)NF)ror)rtr!rra)r�r-r�rwrwrx�tryParse�szParserElement.tryParsecCs2y|j||�Wnttfk
r(dSXdSdS)NFT)r�rr�)r�r-r�rwrwrx�canParseNext�s
zParserElement.canParseNextc@seZdZdd�ZdS)zParserElement._UnboundedCachecsdi�t�|_���fdd�}�fdd�}�fdd�}tj||�|_tj||�|_tj||�|_dS)Ncs�j|��S)N)r�)r�r�)�cache�not_in_cacherwrxr��sz3ParserElement._UnboundedCache.__init__.<locals>.getcs|�|<dS)Nrw)r�r�r�)r�rwrx�set�sz3ParserElement._UnboundedCache.__init__.<locals>.setcs�j�dS)N)r�)r�)r�rwrxr��sz5ParserElement._UnboundedCache.__init__.<locals>.clear)r�r��types�
MethodTyper�r�r�)r�r�r�r�rw)r�r�rxr��sz&ParserElement._UnboundedCache.__init__N)r�r�r�r�rwrwrwrx�_UnboundedCache�sr�Nc@seZdZdd�ZdS)zParserElement._FifoCachecsht�|_�t����fdd�}��fdd�}�fdd�}tj||�|_tj||�|_tj||�|_dS)Ncs�j|��S)N)r�)r�r�)r�r�rwrxr��sz.ParserElement._FifoCache.__init__.<locals>.getcs"|�|<t���kr�jd�dS)NF)r��popitem)r�r�r�)r��sizerwrxr��sz.ParserElement._FifoCache.__init__.<locals>.setcs�j�dS)N)r�)r�)r�rwrxr��sz0ParserElement._FifoCache.__init__.<locals>.clear)r�r��_OrderedDictr�r�r�r�r�)r�r�r�r�r�rw)r�r�r�rxr��sz!ParserElement._FifoCache.__init__N)r�r�r�r�rwrwrwrx�
_FifoCache�sr�c@seZdZdd�ZdS)zParserElement._FifoCachecsvt�|_�i�tjg�����fdd�}���fdd�}��fdd�}tj||�|_tj||�|_tj||�|_dS)Ncs�j|��S)N)r�)r�r�)r�r�rwrxr��sz.ParserElement._FifoCache.__init__.<locals>.getcs2|�|<t���kr$�j�j�d��j|�dS)N)r�r��popleftr�)r�r�r�)r��key_fifor�rwrxr��sz.ParserElement._FifoCache.__init__.<locals>.setcs�j��j�dS)N)r�)r�)r�r�rwrxr��sz0ParserElement._FifoCache.__init__.<locals>.clear)	r�r��collections�dequer�r�r�r�r�)r�r�r�r�r�rw)r�r�r�r�rxr��sz!ParserElement._FifoCache.__init__N)r�r�r�r�rwrwrwrxr��srcCs�d\}}|||||f}tj��tj}|j|�}	|	|jkr�tj|d7<y|j||||�}	Wn8tk
r�}
z|j||
j	|
j
���WYdd}
~
Xq�X|j||	d|	dj�f�|	Sn4tj|d7<t|	t
�r�|	�|	d|	dj�fSWdQRXdS)Nrrr)rrr)r$�packrat_cache_lock�
packrat_cacher�r��packrat_cache_statsr�rr�rHr�r�rzrK)r�r-r�rorpZHITZMISS�lookupr�r�r�rwrwrx�_parseCache�s$


zParserElement._parseCachecCs(tjj�dgttj�tjdd�<dS)Nr)r$r�r�r�r�rwrwrwrx�
resetCache�s
zParserElement.resetCache�cCs8tjs4dt_|dkr tj�t_ntj|�t_tjt_dS)a�Enables "packrat" parsing, which adds memoizing to the parsing logic.
           Repeated parse attempts at the same string location (which happens
           often in many complex grammars) can immediately return a cached value,
           instead of re-executing parsing/validating code.  Memoizing is done of
           both valid results and parsing exceptions.
           
           Parameters:
            - cache_size_limit - (default=C{128}) - if an integer value is provided
              will limit the size of the packrat cache; if None is passed, then
              the cache size will be unbounded; if 0 is passed, the cache will
              be effectively disabled.
            
           This speedup may break existing programs that use parse actions that
           have side-effects.  For this reason, packrat parsing is disabled when
           you first import pyparsing.  To activate the packrat feature, your
           program must call the class method C{ParserElement.enablePackrat()}.  If
           your program uses C{psyco} to "compile as you go", you must call
           C{enablePackrat} before calling C{psyco.full()}.  If you do not do this,
           Python will crash.  For best results, call C{enablePackrat()} immediately
           after importing pyparsing.
           
           Example::
               import pyparsing
               pyparsing.ParserElement.enablePackrat()
        TN)r$�_packratEnabledr�r�r�r�rt)Zcache_size_limitrwrwrx�
enablePackratszParserElement.enablePackratcCs�tj�|js|j�x|jD]}|j�qW|js<|j�}y<|j|d�\}}|rv|j||�}t	�t
�}|j||�Wn0tk
r�}ztjr��n|�WYdd}~XnX|SdS)aB
        Execute the parse expression with the given string.
        This is the main interface to the client code, once the complete
        expression has been built.

        If you want the grammar to require that the entire input string be
        successfully parsed, then set C{parseAll} to True (equivalent to ending
        the grammar with C{L{StringEnd()}}).

        Note: C{parseString} implicitly calls C{expandtabs()} on the input string,
        in order to report proper column numbers in parse actions.
        If the input string contains tabs and
        the grammar uses parse actions that use the C{loc} argument to index into the
        string being parsed, you can ensure you have a consistent view of the input
        string by:
         - calling C{parseWithTabs} on your grammar before calling C{parseString}
           (see L{I{parseWithTabs}<parseWithTabs>})
         - define your parse action using the full C{(s,loc,toks)} signature, and
           reference the input string using the parse action's C{s} argument
         - explicitly expand the tabs in your input string before calling
           C{parseString}
        
        Example::
            Word('a').parseString('aaaaabaaa')  # -> ['aaaaa']
            Word('a').parseString('aaaaabaaa', parseAll=True)  # -> Exception: Expected end of text
        rN)
r$r�r_�
streamliner]r\�
expandtabsrtr�r
r)r�verbose_stacktrace)r�r-�parseAllr�r�r�Zser3rwrwrx�parseString#s$zParserElement.parseStringccs@|js|j�x|jD]}|j�qW|js8t|�j�}t|�}d}|j}|j}t	j
�d}	y�x�||kon|	|k�ry |||�}
|||
dd�\}}Wntk
r�|
d}Yq`X||kr�|	d7}	||
|fV|r�|||�}
|
|kr�|}q�|d7}n|}q`|
d}q`WWn4tk
�r:}zt	j
�r&�n|�WYdd}~XnXdS)a�
        Scan the input string for expression matches.  Each match will return the
        matching tokens, start location, and end location.  May be called with optional
        C{maxMatches} argument, to clip scanning after 'n' matches are found.  If
        C{overlap} is specified, then overlapping matches will be reported.

        Note that the start and end locations are reported relative to the string
        being parsed.  See L{I{parseString}<parseString>} for more information on parsing
        strings with embedded tabs.

        Example::
            source = "sldjf123lsdjjkf345sldkjf879lkjsfd987"
            print(source)
            for tokens,start,end in Word(alphas).scanString(source):
                print(' '*start + '^'*(end-start))
                print(' '*start + tokens[0])
        
        prints::
        
            sldjf123lsdjjkf345sldkjf879lkjsfd987
            ^^^^^
            sldjf
                    ^^^^^^^
                    lsdjjkf
                              ^^^^^^
                              sldkjf
                                       ^^^^^^
                                       lkjsfd
        rF)rprrN)r_r�r]r\r�r�r�r�rtr$r�rrr�)r�r-�
maxMatchesZoverlapr�r�r�Z
preparseFnZparseFn�matchesr�ZnextLocr�Znextlocr3rwrwrx�
scanStringUsB


zParserElement.scanStringcCs�g}d}d|_y�xh|j|�D]Z\}}}|j|||��|rrt|t�rT||j�7}nt|t�rh||7}n
|j|�|}qW|j||d��dd�|D�}djtt	t
|���Stk
r�}ztj
rȂn|�WYdd}~XnXdS)af
        Extension to C{L{scanString}}, to modify matching text with modified tokens that may
        be returned from a parse action.  To use C{transformString}, define a grammar and
        attach a parse action to it that modifies the returned token list.
        Invoking C{transformString()} on a target string will then scan for matches,
        and replace the matched text patterns according to the logic in the parse
        action.  C{transformString()} returns the resulting transformed string.
        
        Example::
            wd = Word(alphas)
            wd.setParseAction(lambda toks: toks[0].title())
            
            print(wd.transformString("now is the winter of our discontent made glorious summer by this sun of york."))
        Prints::
            Now Is The Winter Of Our Discontent Made Glorious Summer By This Sun Of York.
        rTNcSsg|]}|r|�qSrwrw)r��orwrwrxr��sz1ParserElement.transformString.<locals>.<listcomp>r�)r\r�r�rzr"r�r�r�rvr��_flattenrr$r�)r�r-rZlastErvr�r�r3rwrwrxr��s(



zParserElement.transformStringcCsPytdd�|j||�D��Stk
rJ}ztjr6�n|�WYdd}~XnXdS)a~
        Another extension to C{L{scanString}}, simplifying the access to the tokens found
        to match the given parse expression.  May be called with optional
        C{maxMatches} argument, to clip searching after 'n' matches are found.
        
        Example::
            # a capitalized word starts with an uppercase letter, followed by zero or more lowercase letters
            cap_word = Word(alphas.upper(), alphas.lower())
            
            print(cap_word.searchString("More than Iron, more than Lead, more than Gold I need Electricity"))
        prints::
            ['More', 'Iron', 'Lead', 'Gold', 'I']
        cSsg|]\}}}|�qSrwrw)r�rvr�r�rwrwrxr��sz.ParserElement.searchString.<locals>.<listcomp>N)r"r�rr$r�)r�r-r�r3rwrwrx�searchString�szParserElement.searchStringc	csXd}d}x<|j||d�D]*\}}}|||�V|r>|dV|}qW||d�VdS)a[
        Generator method to split a string using the given expression as a separator.
        May be called with optional C{maxsplit} argument, to limit the number of splits;
        and the optional C{includeSeparators} argument (default=C{False}), if the separating
        matching text should be included in the split results.
        
        Example::        
            punc = oneOf(list(".,;:/-!?"))
            print(list(punc.split("This, this?, this sentence, is badly punctuated!")))
        prints::
            ['This', ' this', '', ' this sentence', ' is badly punctuated', '']
        r)r�N)r�)	r�r-�maxsplitZincludeSeparatorsZsplitsZlastrvr�r�rwrwrxr��s

zParserElement.splitcCsFt|t�rtj|�}t|t�s:tjdt|�tdd�dSt||g�S)a�
        Implementation of + operator - returns C{L{And}}. Adding strings to a ParserElement
        converts them to L{Literal}s by default.
        
        Example::
            greet = Word(alphas) + "," + Word(alphas) + "!"
            hello = "Hello, World!"
            print (hello, "->", greet.parseString(hello))
        Prints::
            Hello, World! -> ['Hello', ',', 'World', '!']
        z4Cannot combine element of type %s with ParserElementrq)�
stacklevelN)	rzr�r$rQ�warnings�warnr��
SyntaxWarningr)r�r�rwrwrxr�s



zParserElement.__add__cCsBt|t�rtj|�}t|t�s:tjdt|�tdd�dS||S)z]
        Implementation of + operator when left operand is not a C{L{ParserElement}}
        z4Cannot combine element of type %s with ParserElementrq)r�N)rzr�r$rQr�r�r�r�)r�r�rwrwrxrs



zParserElement.__radd__cCsLt|t�rtj|�}t|t�s:tjdt|�tdd�dSt|tj	�|g�S)zQ
        Implementation of - operator, returns C{L{And}} with error stop
        z4Cannot combine element of type %s with ParserElementrq)r�N)
rzr�r$rQr�r�r�r�r�
_ErrorStop)r�r�rwrwrx�__sub__s



zParserElement.__sub__cCsBt|t�rtj|�}t|t�s:tjdt|�tdd�dS||S)z]
        Implementation of - operator when left operand is not a C{L{ParserElement}}
        z4Cannot combine element of type %s with ParserElementrq)r�N)rzr�r$rQr�r�r�r�)r�r�rwrwrx�__rsub__ s



zParserElement.__rsub__cs�t|t�r|d}}n�t|t�r�|ddd�}|ddkrHd|df}t|dt�r�|ddkr�|ddkrvt��S|ddkr�t��S�|dt��SnJt|dt�r�t|dt�r�|\}}||8}ntdt|d�t|d���ntdt|���|dk�rtd��|dk�rtd��||k�o2dkn�rBtd	��|�r���fd
d��|�r�|dk�rt��|�}nt�g|��|�}n�|�}n|dk�r��}nt�g|�}|S)
a�
        Implementation of * operator, allows use of C{expr * 3} in place of
        C{expr + expr + expr}.  Expressions may also be multiplied by a 2-integer
        tuple, similar to C{{min,max}} multipliers in regular expressions.  Tuples
        may also include C{None} as in:
         - C{expr*(n,None)} or C{expr*(n,)} is equivalent
              to C{expr*n + L{ZeroOrMore}(expr)}
              (read as "at least n instances of C{expr}")
         - C{expr*(None,n)} is equivalent to C{expr*(0,n)}
              (read as "0 to n instances of C{expr}")
         - C{expr*(None,None)} is equivalent to C{L{ZeroOrMore}(expr)}
         - C{expr*(1,None)} is equivalent to C{L{OneOrMore}(expr)}

        Note that C{expr*(None,n)} does not raise an exception if
        more than n exprs exist in the input stream; that is,
        C{expr*(None,n)} does not enforce a maximum number of expr
        occurrences.  If this behavior is desired, then write
        C{expr*(None,n) + ~expr}
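A short illustrative sketch of the multiplication forms described above (hypothetical expressions):

    from pyparsing import Word, nums

    digit_pair  = Word(nums) * 2          # exactly two integers
    two_to_four = Word(nums) * (2, 4)     # two, three, or four integers
    at_least_2  = Word(nums) * (2, None)  # two or more integers

    digit_pair.parseString("12 34")        # -> ['12', '34']
    two_to_four.parseString("1 2 3 4 5")   # -> ['1', '2', '3', '4'] (the fifth is left unparsed)
    at_least_2.parseString("7 8 9")        # -> ['7', '8', '9']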
        rNrqrrz7cannot multiply 'ParserElement' and ('%s','%s') objectsz0cannot multiply 'ParserElement' and '%s' objectsz/cannot multiply ParserElement by negative valuez@second tuple value must be greater or equal to first tuple valuez+cannot multiply ParserElement by 0 or (0,0)cs(|dkrt��|d��St��SdS)Nrr)r)�n)�makeOptionalListr�rwrxr�]sz/ParserElement.__mul__.<locals>.makeOptionalList)NN)	rzru�tupler2rr�r��
ValueErrorr)r�r�ZminElementsZoptElementsr�rw)r�r�rx�__mul__,sD







zParserElement.__mul__cCs
|j|�S)N)r�)r�r�rwrwrx�__rmul__pszParserElement.__rmul__cCsFt|t�rtj|�}t|t�s:tjdt|�tdd�dSt||g�S)zI
        Implementation of | operator - returns C{L{MatchFirst}}
        z4Cannot combine element of type %s with ParserElementrq)r�N)	rzr�r$rQr�r�r�r�r)r�r�rwrwrx�__or__ss



zParserElement.__or__cCsBt|t�rtj|�}t|t�s:tjdt|�tdd�dS||BS)z]
        Implementation of | operator when left operand is not a C{L{ParserElement}}
        z4Cannot combine element of type %s with ParserElementrq)r�N)rzr�r$rQr�r�r�r�)r�r�rwrwrx�__ror__s



zParserElement.__ror__cCsFt|t�rtj|�}t|t�s:tjdt|�tdd�dSt||g�S)zA
        Implementation of ^ operator - returns C{L{Or}}
        z4Cannot combine element of type %s with ParserElementrq)r�N)	rzr�r$rQr�r�r�r�r)r�r�rwrwrx�__xor__�s



zParserElement.__xor__cCsBt|t�rtj|�}t|t�s:tjdt|�tdd�dS||AS)z]
        Implementation of ^ operator when left operand is not a C{L{ParserElement}}
        z4Cannot combine element of type %s with ParserElementrq)r�N)rzr�r$rQr�r�r�r�)r�r�rwrwrx�__rxor__�s



zParserElement.__rxor__cCsFt|t�rtj|�}t|t�s:tjdt|�tdd�dSt||g�S)zC
        Implementation of & operator - returns C{L{Each}}
        z4Cannot combine element of type %s with ParserElementrq)r�N)	rzr�r$rQr�r�r�r�r)r�r�rwrwrx�__and__�s



zParserElement.__and__cCsBt|t�rtj|�}t|t�s:tjdt|�tdd�dS||@S)z]
        Implementation of & operator when left operand is not a C{L{ParserElement}}
        z4Cannot combine element of type %s with ParserElementrq)r�N)rzr�r$rQr�r�r�r�)r�r�rwrwrx�__rand__�s



zParserElement.__rand__cCst|�S)zE
        Implementation of ~ operator - returns C{L{NotAny}}
        )r)r�rwrwrx�
__invert__�szParserElement.__invert__cCs|dk	r|j|�S|j�SdS)a

        Shortcut for C{L{setResultsName}}, with C{listAllMatches=False}.
        
        If C{name} is given with a trailing C{'*'} character, then C{listAllMatches} will be
        passed as C{True}.
           
        If C{name} is omitted, same as calling C{L{copy}}.

        Example::
            # these are equivalent
            userdata = Word(alphas).setResultsName("name") + Word(nums+"-").setResultsName("socsecno")
            userdata = Word(alphas)("name") + Word(nums+"-")("socsecno")             
        N)rmr�)r�r�rwrwrx�__call__�s
zParserElement.__call__cCst|�S)z�
        Suppresses the output of this C{ParserElement}; useful to keep punctuation from
        cluttering up returned output.
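A brief sketch of the effect, using a hypothetical comma-separated list:

    from pyparsing import Word, Literal, ZeroOrMore, alphas

    term  = Word(alphas)
    comma = Literal(',').suppress()        # suppressed: matched but dropped from the results

    (term + ZeroOrMore(',' + term)).parseString("a, b, c")     # -> ['a', ',', 'b', ',', 'c']
    (term + ZeroOrMore(comma + term)).parseString("a, b, c")   # -> ['a', 'b', 'c']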
        )r+)r�rwrwrx�suppress�szParserElement.suppresscCs
d|_|S)a
        Disables the skipping of whitespace before matching the characters in the
        C{ParserElement}'s defined pattern.  This is normally only used internally by
        the pyparsing module, but may be needed in some whitespace-sensitive grammars.
        F)rX)r�rwrwrx�leaveWhitespace�szParserElement.leaveWhitespacecCsd|_||_d|_|S)z8
        Overrides the default whitespace chars
        TF)rXrYrZ)r�rOrwrwrx�setWhitespaceChars�sz ParserElement.setWhitespaceCharscCs
d|_|S)z�
        Overrides default behavior to expand C{<TAB>}s to spaces before parsing the input string.
        Must be called before C{parseString} when the input grammar contains elements that
        match C{<TAB>} characters.
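A hedged sketch of when this matters, using a grammar that must see literal tab characters:

    from pyparsing import Word, White, alphas

    # White('\t') matches an actual tab between the two words
    tsv_pair = Word(alphas) + White('\t').suppress() + Word(alphas)

    # by default parseString expands tabs to spaces, so the tab would never be seen;
    # parseWithTabs keeps the input intact
    tsv_pair.parseWithTabs().parseString("key\tvalue")   # -> ['key', 'value']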
        T)r\)r�rwrwrx�
parseWithTabs�szParserElement.parseWithTabscCsLt|t�rt|�}t|t�r4||jkrH|jj|�n|jjt|j���|S)a�
        Define expression to be ignored (e.g., comments) while doing pattern
        matching; may be called repeatedly, to define multiple comment or other
        ignorable patterns.
        
        Example::
            patt = OneOrMore(Word(alphas))
            patt.parseString('ablaj /* comment */ lskjd') # -> ['ablaj']
            
            patt.ignore(cStyleComment)
            patt.parseString('ablaj /* comment */ lskjd') # -> ['ablaj', 'lskjd']
        )rzr�r+r]r�r�)r�r�rwrwrx�ignore�s


zParserElement.ignorecCs"|pt|pt|ptf|_d|_|S)zT
        Enable display of debugging messages while doing pattern matching.
        T)r/r2r4rcr^)r�ZstartActionZ
successActionZexceptionActionrwrwrx�setDebugActions
s
zParserElement.setDebugActionscCs|r|jttt�nd|_|S)a�
        Enable display of debugging messages while doing pattern matching.
        Set C{flag} to True to enable, False to disable.

        Example::
            wd = Word(alphas).setName("alphaword")
            integer = Word(nums).setName("numword")
            term = wd | integer
            
            # turn on debugging for wd
            wd.setDebug()

            OneOrMore(term).parseString("abc 123 xyz 890")
        
        prints::
            Match alphaword at loc 0(1,1)
            Matched alphaword -> ['abc']
            Match alphaword at loc 3(1,4)
            Exception raised:Expected alphaword (at char 4), (line:1, col:5)
            Match alphaword at loc 7(1,8)
            Matched alphaword -> ['xyz']
            Match alphaword at loc 11(1,12)
            Exception raised:Expected alphaword (at char 12), (line:1, col:13)
            Match alphaword at loc 15(1,16)
            Exception raised:Expected alphaword (at char 15), (line:1, col:16)

        The output shown is that produced by the default debug actions - custom debug actions can be
        specified using L{setDebugActions}. Prior to attempting
        to match the C{wd} expression, the debugging message C{"Match <exprname> at loc <n>(<line>,<col>)"}
        is shown. Then if the parse succeeds, a C{"Matched"} message is shown, or an C{"Exception raised"}
        message is shown. Also note the use of L{setName} to assign a human-readable name to the expression,
        which makes debugging and exception messages easier to understand - for instance, the default
        name created for the C{Word} expression without calling C{setName} is C{"W:(ABCD...)"}.
        F)r�r/r2r4r^)r��flagrwrwrx�setDebugs#zParserElement.setDebugcCs|jS)N)r�)r�rwrwrxr�@szParserElement.__str__cCst|�S)N)r�)r�rwrwrxr�CszParserElement.__repr__cCsd|_d|_|S)NT)r_rU)r�rwrwrxr�FszParserElement.streamlinecCsdS)Nrw)r�r�rwrwrx�checkRecursionKszParserElement.checkRecursioncCs|jg�dS)zj
        Check defined expressions for valid structure, check for infinite recursive definitions.
        N)r�)r��
validateTracerwrwrx�validateNszParserElement.validatecCs�y|j�}Wn2tk
r>t|d��}|j�}WdQRXYnXy|j||�Stk
r|}ztjrh�n|�WYdd}~XnXdS)z�
        Execute the parse expression on the given file or filename.
        If a filename is specified (instead of a file object),
        the entire file is opened, read, and closed before parsing.
        �rN)�readr��openr�rr$r�)r�Zfile_or_filenamer�Z
file_contents�fr3rwrwrx�	parseFileTszParserElement.parseFilecsHt|t�r"||kp t|�t|�kSt|t�r6|j|�Stt|�|kSdS)N)rzr$�varsr�r��super)r�r�)rHrwrx�__eq__hs



zParserElement.__eq__cCs
||kS)Nrw)r�r�rwrwrx�__ne__pszParserElement.__ne__cCstt|��S)N)�hash�id)r�rwrwrx�__hash__sszParserElement.__hash__cCs||kS)Nrw)r�r�rwrwrx�__req__vszParserElement.__req__cCs
||kS)Nrw)r�r�rwrwrx�__rne__yszParserElement.__rne__cCs0y|jt|�|d�dStk
r*dSXdS)a�
        Method for quick testing of a parser against a test string. Good for simple 
        inline microtests of sub expressions while building up a larger parser.
           
        Parameters:
         - testString - to test against this expression for a match
         - parseAll - (default=C{True}) - flag to pass to C{L{parseString}} when running tests
            
        Example::
            expr = Word(nums)
            assert expr.matches("100")
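            # illustrative addition (not in the original docstring): a failing match
            assert not expr.matches("abc")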
        )r�TFN)r�r�r)r�Z
testStringr�rwrwrxr�|s

zParserElement.matches�#cCs�t|t�r"tttj|j�j���}t|t�r4t|�}g}g}d}	�x�|D�]�}
|dk	rb|j	|
d�sl|rx|
rx|j
|
�qH|
s~qHdj|�|
g}g}y:|
jdd�}
|j
|
|d�}|j
|j|d��|	o�|}	Wn�tk
�rx}
z�t|
t�r�dnd	}d|
k�r0|j
t|
j|
��|j
d
t|
j|
�dd|�n|j
d
|
jd|�|j
d
t|
��|	�ob|}	|
}WYdd}
~
XnDtk
�r�}z&|j
dt|��|	�o�|}	|}WYdd}~XnX|�r�|�r�|j
d	�tdj|��|j
|
|f�qHW|	|fS)a3
        Execute the parse expression on a series of test strings, showing each
        test, the parsed results or where the parse failed. Quick and easy way to
        run a parse expression against a list of sample strings.
           
        Parameters:
         - tests - a list of separate test strings, or a multiline string of test strings
         - parseAll - (default=C{True}) - flag to pass to C{L{parseString}} when running tests           
         - comment - (default=C{'#'}) - expression for indicating embedded comments in the test 
              string; pass None to disable comment filtering
         - fullDump - (default=C{True}) - dump results as list followed by results names in nested outline;
              if False, only dump nested list
         - printResults - (default=C{True}) prints test output to stdout
         - failureTests - (default=C{False}) indicates if these tests are expected to fail parsing

        Returns: a (success, results) tuple, where success indicates that all tests succeeded
        (or failed if C{failureTests} is True), and the results contain a list of lines of each 
        test's output
        
        Example::
            number_expr = pyparsing_common.number.copy()

            result = number_expr.runTests('''
                # unsigned integer
                100
                # negative integer
                -100
                # float with scientific notation
                6.02e23
                # integer with scientific notation
                1e-12
                ''')
            print("Success" if result[0] else "Failed!")

            result = number_expr.runTests('''
                # stray character
                100Z
                # missing leading digit before '.'
                -.100
                # too many '.'
                3.14.159
                ''', failureTests=True)
            print("Success" if result[0] else "Failed!")
        prints::
            # unsigned integer
            100
            [100]

            # negative integer
            -100
            [-100]

            # float with scientific notation
            6.02e23
            [6.02e+23]

            # integer with scientific notation
            1e-12
            [1e-12]

            Success
            
            # stray character
            100Z
               ^
            FAIL: Expected end of text (at char 3), (line:1, col:4)

            # missing leading digit before '.'
            -.100
            ^
            FAIL: Expected {real number with scientific notation | real number | signed integer} (at char 0), (line:1, col:1)

            # too many '.'
            3.14.159
                ^
            FAIL: Expected end of text (at char 4), (line:1, col:5)

            Success

        Each test string must be on a single line. If you want to test a string that spans multiple
        lines, create a test like this::

            expr.runTests(r"this is a test\n of strings that spans \n 3 lines")
        
        (Note that this is a raw string literal, you must include the leading 'r'.)
        TNFrz\n)r�)r z(FATAL)r�� rr�^zFAIL: zFAIL-EXCEPTION: )rzr�r�rvr{r��rstrip�
splitlinesrr�r�r�r�r�rrr!rGr�r9rKr,)r�Ztestsr�ZcommentZfullDumpZprintResultsZfailureTestsZ
allResultsZcomments�successrvr�resultr�rzr3rwrwrx�runTests�sNW



$


zParserElement.runTests)F)F)T)T)TT)TT)r�)F)N)T)F)T)Tr�TTF)Or�r�r�r�rNr��staticmethodrPrRr�r�rirmrur�rxr~rr�r�r�r�r�r�r�r�r�r�r�r�rr�r�r�rtr�r�r�r��_MAX_INTr�r�r�r�rrr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r��
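# Illustrative sketch (not part of the original source): consuming the
# (success, results) tuple returned by runTests, as described in the docstring above.
#
#     success, report = Word(nums).runTests('''
#         # simple integers
#         100
#         42
#         ''', printResults=False)
#     assert success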
__classcell__rwrw)rHrxr$8s�


&




H
"
2G+D
			

)

cs eZdZdZ�fdd�Z�ZS)r,zT
    Abstract C{ParserElement} subclass, for defining atomic matching patterns.
    cstt|�jdd�dS)NF)rg)r�r,r�)r�)rHrwrxr�	szToken.__init__)r�r�r�r�r�r�rwrw)rHrxr,	scs eZdZdZ�fdd�Z�ZS)r
z,
    An empty token, will always match.
    cs$tt|�j�d|_d|_d|_dS)Nr
TF)r�r
r�r�r[r`)r�)rHrwrxr�	szEmpty.__init__)r�r�r�r�r�r�rwrw)rHrxr
	scs*eZdZdZ�fdd�Zddd�Z�ZS)rz(
    A token that will never match.
    cs*tt|�j�d|_d|_d|_d|_dS)NrTFzUnmatchable token)r�rr�r�r[r`ra)r�)rHrwrxr�*	s
zNoMatch.__init__TcCst|||j|��dS)N)rra)r�r-r�rorwrwrxr�1	szNoMatch.parseImpl)T)r�r�r�r�r�r�r�rwrw)rHrxr&	scs*eZdZdZ�fdd�Zddd�Z�ZS)ra�
    Token to exactly match a specified string.
    
    Example::
        Literal('blah').parseString('blah')  # -> ['blah']
        Literal('blah').parseString('blahfooblah')  # -> ['blah']
        Literal('blah').parseString('bla')  # -> Exception: Expected "blah"
    
    For case-insensitive matching, use L{CaselessLiteral}.
    
    For keyword matching (force word break before and after the matched string),
    use L{Keyword} or L{CaselessKeyword}.
    cs�tt|�j�||_t|�|_y|d|_Wn*tk
rVtj	dt
dd�t|_YnXdt
|j�|_d|j|_d|_d|_dS)Nrz2null string passed to Literal; use Empty() insteadrq)r�z"%s"z	Expected F)r�rr��matchr��matchLen�firstMatchCharr�r�r�r�r
rHr�r�rar[r`)r��matchString)rHrwrxr�C	s

zLiteral.__init__TcCsJ|||jkr6|jdks&|j|j|�r6||j|jfSt|||j|��dS)Nrr)r�r��
startswithr�rra)r�r-r�rorwrwrxr�V	szLiteral.parseImpl)T)r�r�r�r�r�r�r�rwrw)rHrxr5	s
csLeZdZdZedZd�fdd�	Zddd	�Z�fd
d�Ze	dd
��Z
�ZS)ra\
    Token to exactly match a specified string as a keyword, that is, it must be
    immediately followed by a non-keyword character.  Compare with C{L{Literal}}:
     - C{Literal("if")} will match the leading C{'if'} in C{'ifAndOnlyIf'}.
     - C{Keyword("if")} will not; it will only match the leading C{'if'} in C{'if x=1'}, or C{'if(y==2)'}
    Accepts two optional constructor arguments in addition to the keyword string:
     - C{identChars} is a string of characters that would be valid identifier characters,
          defaulting to all alphanumerics + "_" and "$"
     - C{caseless} allows case-insensitive matching, default is C{False}.
       
    Example::
        Keyword("start").parseString("start")  # -> ['start']
        Keyword("start").parseString("starting")  # -> Exception
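        # illustrative additions (not in the original docstring):
        Keyword("start", caseless=True).parseString("START")       # -> ['start']
        Keyword("start", identChars=alphas).parseString("start1")  # -> ['start']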

    For case-insensitive matching, use L{CaselessKeyword}.
    z_$NFcs�tt|�j�|dkrtj}||_t|�|_y|d|_Wn$tk
r^t	j
dtdd�YnXd|j|_d|j|_
d|_d|_||_|r�|j�|_|j�}t|�|_dS)Nrz2null string passed to Keyword; use Empty() insteadrq)r�z"%s"z	Expected F)r�rr��DEFAULT_KEYWORD_CHARSr�r�r�r�r�r�r�r�r�rar[r`�caseless�upper�
caselessmatchr��
identChars)r�r�r�r�)rHrwrxr�q	s&

zKeyword.__init__TcCs|jr|||||j�j�|jkr�|t|�|jksL|||jj�|jkr�|dksj||dj�|jkr�||j|jfSnv|||jkr�|jdks�|j|j|�r�|t|�|jks�|||j|jkr�|dks�||d|jkr�||j|jfSt	|||j
|��dS)Nrrr)r�r�r�r�r�r�r�r�r�rra)r�r-r�rorwrwrxr��	s*&zKeyword.parseImplcstt|�j�}tj|_|S)N)r�rr�r�r�)r�r�)rHrwrxr��	szKeyword.copycCs
|t_dS)z,Overrides the default Keyword chars
        N)rr�)rOrwrwrx�setDefaultKeywordChars�	szKeyword.setDefaultKeywordChars)NF)T)r�r�r�r�r3r�r�r�r�r�r�r�rwrw)rHrxr^	s
cs*eZdZdZ�fdd�Zddd�Z�ZS)ral
    Token to match a specified string, ignoring case of letters.
    Note: the matched results will always be in the case of the given
    match string, NOT the case of the input text.

    Example::
        OneOrMore(CaselessLiteral("CMD")).parseString("cmd CMD Cmd10") # -> ['CMD', 'CMD', 'CMD']
        
    (Contrast with example for L{CaselessKeyword}.)
    cs6tt|�j|j��||_d|j|_d|j|_dS)Nz'%s'z	Expected )r�rr�r��returnStringr�ra)r�r�)rHrwrxr��	szCaselessLiteral.__init__TcCs@||||j�j�|jkr,||j|jfSt|||j|��dS)N)r�r�r�r�rra)r�r-r�rorwrwrxr��	szCaselessLiteral.parseImpl)T)r�r�r�r�r�r�r�rwrw)rHrxr�	s
cs,eZdZdZd�fdd�	Zd	dd�Z�ZS)
rz�
    Caseless version of L{Keyword}.

    Example::
        OneOrMore(CaselessKeyword("CMD")).parseString("cmd CMD Cmd10") # -> ['CMD', 'CMD']
        
    (Contrast with example for L{CaselessLiteral}.)
    Ncstt|�j||dd�dS)NT)r�)r�rr�)r�r�r�)rHrwrxr��	szCaselessKeyword.__init__TcCsj||||j�j�|jkrV|t|�|jksF|||jj�|jkrV||j|jfSt|||j|��dS)N)r�r�r�r�r�r�rra)r�r-r�rorwrwrxr��	s*zCaselessKeyword.parseImpl)N)T)r�r�r�r�r�r�r�rwrw)rHrxr�	scs,eZdZdZd�fdd�	Zd	dd�Z�ZS)
rlax
    A variation on L{Literal} which matches "close" matches, that is, 
    strings with at most 'n' mismatching characters. C{CloseMatch} takes parameters:
     - C{match_string} - string to be matched
     - C{maxMismatches} - (C{default=1}) maximum number of mismatches allowed to count as a match
    
    The results from a successful parse will contain the matched text from the input string and the following named results:
     - C{mismatches} - a list of the positions within the match_string where mismatches were found
     - C{original} - the original match_string used to compare against the input string
    
    If C{mismatches} is an empty list, then the match was an exact match.
    
    Example::
        patt = CloseMatch("ATCATCGAATGGA")
        patt.parseString("ATCATCGAAXGGA") # -> (['ATCATCGAAXGGA'], {'mismatches': [[9]], 'original': ['ATCATCGAATGGA']})
        patt.parseString("ATCAXCGAAXGGA") # -> Exception: Expected 'ATCATCGAATGGA' (with up to 1 mismatches) (at char 0), (line:1, col:1)

        # exact match
        patt.parseString("ATCATCGAATGGA") # -> (['ATCATCGAATGGA'], {'mismatches': [[]], 'original': ['ATCATCGAATGGA']})

        # close match allowing up to 2 mismatches
        patt = CloseMatch("ATCATCGAATGGA", maxMismatches=2)
        patt.parseString("ATCAXCGAAXGGA") # -> (['ATCAXCGAAXGGA'], {'mismatches': [[4, 9]], 'original': ['ATCATCGAATGGA']})
    rrcsBtt|�j�||_||_||_d|j|jf|_d|_d|_dS)Nz&Expected %r (with up to %d mismatches)F)	r�rlr�r��match_string�
maxMismatchesrar`r[)r�r�r�)rHrwrxr��	szCloseMatch.__init__TcCs�|}t|�}|t|j�}||kr�|j}d}g}	|j}
x�tt|||�|j��D]0\}}|\}}
||
krP|	j|�t|	�|
krPPqPW|d}t|||�g�}|j|d<|	|d<||fSt|||j|��dS)Nrrr�original�
mismatches)	r�r�r�r�r�r�r"rra)r�r-r�ro�startr��maxlocr�Zmatch_stringlocr�r�Zs_m�src�mat�resultsrwrwrxr��	s("

zCloseMatch.parseImpl)rr)T)r�r�r�r�r�r�r�rwrw)rHrxrl�	s	cs8eZdZdZd
�fdd�	Zdd	d
�Z�fdd�Z�ZS)r/a	
    Token for matching words composed of allowed character sets.
    Defined with string containing all allowed initial characters,
    an optional string containing allowed body characters (if omitted,
    defaults to the initial character set), and an optional minimum,
    maximum, and/or exact length.  The default value for C{min} is 1 (a
    minimum value < 1 is not valid); the default values for C{max} and C{exact}
    are 0, meaning no maximum or exact length restriction. An optional
    C{excludeChars} parameter can list characters that might be found in 
    the input C{bodyChars} string; useful to define a word of all printables
    except for one or two characters, for instance.
    
    L{srange} is useful for defining custom character set strings for defining 
    C{Word} expressions, using range notation from regular expression character sets.
    
    A common mistake is to use C{Word} to match a specific literal string, as in 
    C{Word("Address")}. Remember that C{Word} uses the string argument to define
    I{sets} of matchable characters. This expression would match "Add", "AAA",
    "dAred", or any other word made up of the characters 'A', 'd', 'r', 'e', and 's'.
    To match an exact literal string, use L{Literal} or L{Keyword}.

    pyparsing includes helper strings for building Words:
     - L{alphas}
     - L{nums}
     - L{alphanums}
     - L{hexnums}
     - L{alphas8bit} (alphabetic characters in ASCII range 128-255 - accented, tilded, umlauted, etc.)
     - L{punc8bit} (non-alphabetic characters in ASCII range 128-255 - currency, symbols, superscripts, diacriticals, etc.)
     - L{printables} (any non-whitespace character)

    Example::
        # a word composed of digits
        integer = Word(nums) # equivalent to Word("0123456789") or Word(srange("0-9"))
        
        # a word with a leading capital, and zero or more lowercase
        capital_word = Word(alphas.upper(), alphas.lower())

        # hostnames are alphanumeric, with leading alpha, and '-'
        hostname = Word(alphas, alphanums+'-')
        
        # roman numeral (not a strict parser, accepts invalid mix of characters)
        roman = Word("IVXLCDM")
        
        # any string of non-whitespace characters, except for ','
        csv_value = Word(printables, excludeChars=",")
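
        # illustrative additions (not in the original docstring): the min/max/exact
        # length arguments described above
        Word(nums, exact=4).parseString("20231231")    # -> ['2023']
        Word(nums, min=2, max=3).parseString("98765")  # -> ['987']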
    NrrrFcs�tt|�j��rFdj�fdd�|D��}|rFdj�fdd�|D��}||_t|�|_|rl||_t|�|_n||_t|�|_|dk|_	|dkr�t
d��||_|dkr�||_nt
|_|dkr�||_||_t|�|_d|j|_d	|_||_d
|j|jk�r�|dk�r�|dk�r�|dk�r�|j|jk�r8dt|j�|_nHt|j�dk�rfdtj|j�t|j�f|_nd
t|j�t|j�f|_|j�r�d|jd|_ytj|j�|_Wntk
�r�d|_YnXdS)Nr�c3s|]}|�kr|VqdS)Nrw)r�r�)�excludeCharsrwrxr�7
sz Word.__init__.<locals>.<genexpr>c3s|]}|�kr|VqdS)Nrw)r�r�)r�rwrxr�9
srrrzZcannot specify a minimum length < 1; use Optional(Word()) if zero-length word is permittedz	Expected Fr�z[%s]+z%s[%s]*z	[%s][%s]*z\b)r�r/r�r��
initCharsOrigr��	initChars�
bodyCharsOrig�	bodyChars�maxSpecifiedr��minLen�maxLenr�r�r�rar`�	asKeyword�_escapeRegexRangeChars�reStringr�rd�escape�compilerK)r�rr�min�max�exactrr�)rH)r�rxr�4
sT



0
z
Word.__init__Tc
CsD|jr<|jj||�}|s(t|||j|��|j�}||j�fS|||jkrZt|||j|��|}|d7}t|�}|j}||j	}t
||�}x ||kr�|||kr�|d7}q�Wd}	|||jkr�d}	|jr�||kr�|||kr�d}	|j
�r|dk�r||d|k�s||k�r|||k�rd}	|	�r4t|||j|��||||�fS)NrrFTr)rdr�rra�end�grouprr�rrrrrr)
r�r-r�ror�r�r�Z	bodycharsr�ZthrowExceptionrwrwrxr�j
s6

4zWord.parseImplcstytt|�j�Stk
r"YnX|jdkrndd�}|j|jkr^d||j�||j�f|_nd||j�|_|jS)NcSs$t|�dkr|dd�dS|SdS)N�z...)r�)r�rwrwrx�
charsAsStr�
sz Word.__str__.<locals>.charsAsStrz	W:(%s,%s)zW:(%s))r�r/r�rKrUr�r)r�r)rHrwrxr��
s
zWord.__str__)NrrrrFN)T)r�r�r�r�r�r�r�r�rwrw)rHrxr/
s.6
#csFeZdZdZeejd��Zd�fdd�	Zddd�Z	�fd	d
�Z
�ZS)
r'a�
    Token for matching strings that match a given regular expression.
    Defined with string specifying the regular expression in a form recognized by the inbuilt Python re module.
    If the given regex contains named groups (defined using C{(?P<name>...)}), these will be preserved as 
    named parse results.

    Example::
        realnum = Regex(r"[+-]?\d+\.\d*")
        date = Regex(r'(?P<year>\d{4})-(?P<month>\d\d?)-(?P<day>\d\d?)')
        # ref: http://stackoverflow.com/questions/267399/how-do-you-match-only-valid-roman-numerals-with-a-regular-expression
        roman = Regex(r"M{0,4}(CM|CD|D?C{0,3})(XC|XL|L?X{0,3})(IX|IV|V?I{0,3})")
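
        # illustrative addition (not in the original docstring): named groups become
        # named results on the returned ParseResults
        result = date.parseString("1999-12-31")  # -> ['1999-12-31']
        result['year']   # -> '1999'
        result['month']  # -> '12'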
    z[A-Z]rcs�tt|�j�t|t�r�|s,tjdtdd�||_||_	yt
j|j|j	�|_
|j|_Wq�t
jk
r�tjd|tdd��Yq�Xn2t|tj�r�||_
t|�|_|_||_	ntd��t|�|_d|j|_d|_d|_d	S)
z�The parameters C{pattern} and C{flags} are passed to the C{re.compile()} function as-is. See the Python C{re} module for an explanation of the acceptable patterns and flags.z0null string passed to Regex; use Empty() insteadrq)r�z$invalid pattern (%s) passed to RegexzCRegex may only be constructed with a string or a compiled RE objectz	Expected FTN)r�r'r�rzr�r�r�r��pattern�flagsrdr
r�
sre_constants�error�compiledREtyper{r�r�r�rar`r[)r�rr)rHrwrxr��
s.





zRegex.__init__TcCsd|jj||�}|s"t|||j|��|j�}|j�}t|j��}|r\x|D]}||||<qHW||fS)N)rdr�rrar�	groupdictr"r)r�r-r�ror��dr�r�rwrwrxr��
s
zRegex.parseImplcsDytt|�j�Stk
r"YnX|jdkr>dt|j�|_|jS)NzRe:(%s))r�r'r�rKrUr�r)r�)rHrwrxr��
s
z
Regex.__str__)r)T)r�r�r�r�r�rdr
rr�r�r�r�rwrw)rHrxr'�
s
"

cs8eZdZdZd�fdd�	Zddd�Z�fd	d
�Z�ZS)
r%a�
    Token for matching strings that are delimited by quoting characters.
    
    Defined with the following parameters:
        - quoteChar - string of one or more characters defining the quote delimiting string
        - escChar - character to escape quotes, typically backslash (default=C{None})
        - escQuote - special quote sequence to escape an embedded quote string (such as SQL's "" to escape an embedded ") (default=C{None})
        - multiline - boolean indicating whether quotes can span multiple lines (default=C{False})
        - unquoteResults - boolean indicating whether the matched text should be unquoted (default=C{True})
        - endQuoteChar - string of one or more characters defining the end of the quote delimited string (default=C{None} => same as quoteChar)
        - convertWhitespaceEscapes - convert escaped whitespace (C{'\t'}, C{'\n'}, etc.) to actual whitespace (default=C{True})

    Example::
        qs = QuotedString('"')
        print(qs.searchString('lsjdf "This is the quote" sldjf'))
        complex_qs = QuotedString('{{', endQuoteChar='}}')
        print(complex_qs.searchString('lsjdf {{This is the "quote"}} sldjf'))
        sql_qs = QuotedString('"', escQuote='""')
        print(sql_qs.searchString('lsjdf "This is the quote with ""embedded"" quotes" sldjf'))
    prints::
        [['This is the quote']]
        [['This is the "quote"']]
        [['This is the quote with "embedded" quotes']]
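
    An illustrative addition (not part of the original docstring), using an escape
    character to protect embedded quotes::
        esc_qs = QuotedString('"', escChar='\\')
        print(esc_qs.searchString(r'lsjdf "a \"quoted\" word" sldjf'))  # -> [['a "quoted" word']]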
    NFTcsNtt��j�|j�}|s0tjdtdd�t��|dkr>|}n"|j�}|s`tjdtdd�t��|�_t	|��_
|d�_|�_t	|��_
|�_|�_|�_|�_|r�tjtjB�_dtj�j�t�jd�|dk	r�t|�p�df�_n<d�_dtj�j�t�jd�|dk	�rt|��pdf�_t	�j�d	k�rp�jd
dj�fdd
�tt	�j�d	dd�D��d7_|�r��jdtj|�7_|�r��jdtj|�7_tj�j�d�_�jdtj�j�7_ytj�j�j��_�j�_Wn0tjk
�r&tjd�jtdd��YnXt ���_!d�j!�_"d�_#d�_$dS)Nz$quoteChar cannot be the empty stringrq)r�z'endQuoteChar cannot be the empty stringrz%s(?:[^%s%s]r�z%s(?:[^%s\n\r%s]rrz|(?:z)|(?:c3s4|],}dtj�jd|��t�j|�fVqdS)z%s[^%s]N)rdr	�endQuoteCharr)r�r�)r�rwrxr�/sz(QuotedString.__init__.<locals>.<genexpr>�)z|(?:%s)z|(?:%s.)z(.)z)*%sz$invalid pattern (%s) passed to Regexz	Expected FTrs)%r�r%r�r�r�r�r��SyntaxError�	quoteCharr��quoteCharLen�firstQuoteCharr�endQuoteCharLen�escChar�escQuote�unquoteResults�convertWhitespaceEscapesrd�	MULTILINE�DOTALLrr	rrr�r��escCharReplacePatternr
rrrr�r�rar`r[)r�rr r!Z	multiliner"rr#)rH)r�rxr�sf




6

zQuotedString.__init__c	Cs�|||jkr|jj||�pd}|s4t|||j|��|j�}|j�}|jr�||j|j	�}t
|t�r�d|kr�|jr�ddddd�}x |j
�D]\}}|j||�}q�W|jr�tj|jd|�}|jr�|j|j|j�}||fS)N�\�	r��
)z\tz\nz\fz\rz\g<1>)rrdr�rrarrr"rrrzr�r#r�r�r r�r&r!r)	r�r-r�ror�r�Zws_mapZwslitZwscharrwrwrxr�Gs( 
zQuotedString.parseImplcsFytt|�j�Stk
r"YnX|jdkr@d|j|jf|_|jS)Nz.quoted string, starting with %s ending with %s)r�r%r�rKrUrr)r�)rHrwrxr�js
zQuotedString.__str__)NNFTNT)T)r�r�r�r�r�r�r�r�rwrw)rHrxr%�
sA
#cs8eZdZdZd�fdd�	Zddd�Z�fd	d
�Z�ZS)
r	a�
    Token for matching words composed of characters I{not} in a given set (will
    include whitespace in matched characters if not listed in the provided exclusion set - see example).
    Defined with string containing all disallowed characters, and an optional
    minimum, maximum, and/or exact length.  The default value for C{min} is 1 (a
    minimum value < 1 is not valid); the default values for C{max} and C{exact}
    are 0, meaning no maximum or exact length restriction.

    Example::
        # define a comma-separated-value as anything that is not a ','
        csv_value = CharsNotIn(',')
        print(delimitedList(csv_value).parseString("dkls,lsdkjf,s12 34,@!#,213"))
    prints::
        ['dkls', 'lsdkjf', 's12 34', '@!#', '213']
    rrrcs�tt|�j�d|_||_|dkr*td��||_|dkr@||_nt|_|dkrZ||_||_t	|�|_
d|j
|_|jdk|_d|_
dS)NFrrzfcannot specify a minimum length < 1; use Optional(CharsNotIn()) if zero-length char group is permittedrz	Expected )r�r	r�rX�notCharsr�rrr�r�r�rar[r`)r�r+rrr
)rHrwrxr��s 
zCharsNotIn.__init__TcCs�|||jkrt|||j|��|}|d7}|j}t||jt|��}x ||krd|||krd|d7}qFW|||jkr�t|||j|��||||�fS)Nrr)r+rrarrr�r)r�r-r�ror�Znotchars�maxlenrwrwrxr��s
zCharsNotIn.parseImplcsdytt|�j�Stk
r"YnX|jdkr^t|j�dkrRd|jdd�|_nd|j|_|jS)Nrz
!W:(%s...)z!W:(%s))r�r	r�rKrUr�r+)r�)rHrwrxr��s
zCharsNotIn.__str__)rrrr)T)r�r�r�r�r�r�r�r�rwrw)rHrxr	vs
cs<eZdZdZdddddd�Zd�fdd�	Zddd�Z�ZS)r.a�
    Special matching class for matching whitespace.  Normally, whitespace is ignored
    by pyparsing grammars.  This class is included when some whitespace structures
    are significant.  Define with a string containing the whitespace characters to be
    matched; default is C{" \t\r\n"}.  Also takes optional C{min}, C{max}, and C{exact} arguments,
    as defined for the C{L{Word}} class.
    z<SPC>z<TAB>z<LF>z<CR>z<FF>)r�r(rr*r)� 	
rrrcs�tt��j�|�_�jdj�fdd��jD���djdd��jD���_d�_d�j�_	|�_
|dkrt|�_nt�_|dkr�|�_|�_
dS)Nr�c3s|]}|�jkr|VqdS)N)�
matchWhite)r�r�)r�rwrxr��sz!White.__init__.<locals>.<genexpr>css|]}tj|VqdS)N)r.�	whiteStrs)r�r�rwrwrxr��sTz	Expected r)
r�r.r�r.r�r�rYr�r[rarrr�)r�Zwsrrr
)rH)r�rxr��s zWhite.__init__TcCs�|||jkrt|||j|��|}|d7}||j}t|t|��}x"||krd|||jkrd|d7}qDW|||jkr�t|||j|��||||�fS)Nrr)r.rrarrr�r)r�r-r�ror�r�rwrwrxr��s
zWhite.parseImpl)r-rrrr)T)r�r�r�r�r/r�r�r�rwrw)rHrxr.�scseZdZ�fdd�Z�ZS)�_PositionTokencs(tt|�j�|jj|_d|_d|_dS)NTF)r�r0r�rHr�r�r[r`)r�)rHrwrxr��s
z_PositionToken.__init__)r�r�r�r�r�rwrw)rHrxr0�sr0cs2eZdZdZ�fdd�Zdd�Zd	dd�Z�ZS)
rzb
    Token to advance to a specific column of input text; useful for tabular report scraping.
    cstt|�j�||_dS)N)r�rr�r9)r��colno)rHrwrxr��szGoToColumn.__init__cCs`t||�|jkr\t|�}|jr*|j||�}x0||krZ||j�rZt||�|jkrZ|d7}q,W|S)Nrr)r9r�r]r��isspace)r�r-r�r�rwrwrxr��s&zGoToColumn.preParseTcCsDt||�}||jkr"t||d|��||j|}|||�}||fS)NzText not in expected column)r9r)r�r-r�roZthiscolZnewlocr�rwrwrxr�s

zGoToColumn.parseImpl)T)r�r�r�r�r�r�r�r�rwrw)rHrxr�s	cs*eZdZdZ�fdd�Zddd�Z�ZS)ra�
    Matches if current position is at the beginning of a line within the parse string
    
    Example::
    
        test = '''        AAA this line
        AAA and this line
          AAA but not this one
        B AAA and definitely not this one
        '''

        for t in (LineStart() + 'AAA' + restOfLine).searchString(test):
            print(t)
    
    Prints::
        ['AAA', ' this line']
        ['AAA', ' and this line']    

    cstt|�j�d|_dS)NzExpected start of line)r�rr�ra)r�)rHrwrxr�&szLineStart.__init__TcCs*t||�dkr|gfSt|||j|��dS)Nrr)r9rra)r�r-r�rorwrwrxr�*szLineStart.parseImpl)T)r�r�r�r�r�r�r�rwrw)rHrxrscs*eZdZdZ�fdd�Zddd�Z�ZS)rzU
    Matches if current position is at the end of a line within the parse string
    cs,tt|�j�|jtjjdd��d|_dS)Nrr�zExpected end of line)r�rr�r�r$rNr�ra)r�)rHrwrxr�3szLineEnd.__init__TcCsb|t|�kr6||dkr$|ddfSt|||j|��n(|t|�krN|dgfSt|||j|��dS)Nrrr)r�rra)r�r-r�rorwrwrxr�8szLineEnd.parseImpl)T)r�r�r�r�r�r�r�rwrw)rHrxr/scs*eZdZdZ�fdd�Zddd�Z�ZS)r*zM
    Matches if current position is at the beginning of the parse string
    cstt|�j�d|_dS)NzExpected start of text)r�r*r�ra)r�)rHrwrxr�GszStringStart.__init__TcCs0|dkr(||j|d�kr(t|||j|��|gfS)Nr)r�rra)r�r-r�rorwrwrxr�KszStringStart.parseImpl)T)r�r�r�r�r�r�r�rwrw)rHrxr*Cscs*eZdZdZ�fdd�Zddd�Z�ZS)r)zG
    Matches if current position is at the end of the parse string
    cstt|�j�d|_dS)NzExpected end of text)r�r)r�ra)r�)rHrwrxr�VszStringEnd.__init__TcCs^|t|�krt|||j|��n<|t|�kr6|dgfS|t|�krJ|gfSt|||j|��dS)Nrr)r�rra)r�r-r�rorwrwrxr�ZszStringEnd.parseImpl)T)r�r�r�r�r�r�r�rwrw)rHrxr)Rscs.eZdZdZef�fdd�	Zddd�Z�ZS)r1ap
    Matches if the current position is at the beginning of a Word, and
    is not preceded by any character in a given set of C{wordChars}
    (default=C{printables}). To emulate the C{} behavior of regular expressions,
    use C{WordStart(alphanums)}. C{WordStart} will also match at the beginning of
    the string being parsed, or at the beginning of a line.
    cs"tt|�j�t|�|_d|_dS)NzNot at the start of a word)r�r1r�r��	wordCharsra)r�r3)rHrwrxr�ls
zWordStart.__init__TcCs@|dkr8||d|jks(|||jkr8t|||j|��|gfS)Nrrr)r3rra)r�r-r�rorwrwrxr�qs
zWordStart.parseImpl)T)r�r�r�r�rVr�r�r�rwrw)rHrxr1dscs.eZdZdZef�fdd�	Zddd�Z�ZS)r0aZ
    Matches if the current position is at the end of a Word, and
    is not followed by any character in a given set of C{wordChars}
    (default=C{printables}). To emulate the C{} behavior of regular expressions,
    use C{WordEnd(alphanums)}. C{WordEnd} will also match at the end of
    the string being parsed, or at the end of a line.
    cs(tt|�j�t|�|_d|_d|_dS)NFzNot at the end of a word)r�r0r�r�r3rXra)r�r3)rHrwrxr��s
zWordEnd.__init__TcCsPt|�}|dkrH||krH|||jks8||d|jkrHt|||j|��|gfS)Nrrr)r�r3rra)r�r-r�ror�rwrwrxr��szWordEnd.parseImpl)T)r�r�r�r�rVr�r�r�rwrw)rHrxr0xscs�eZdZdZd�fdd�	Zdd�Zdd�Zd	d
�Z�fdd�Z�fd
d�Z	�fdd�Z
d�fdd�	Zgfdd�Z�fdd�Z
�ZS)r z^
    Abstract subclass of ParserElement, for combining and post-processing parsed tokens.
    Fcs�tt|�j|�t|t�r"t|�}t|t�r<tj|�g|_	njt|t
j�rzt|�}tdd�|D��rnt
tj|�}t|�|_	n,yt|�|_	Wntk
r�|g|_	YnXd|_dS)Ncss|]}t|t�VqdS)N)rzr�)r�r.rwrwrxr��sz+ParseExpression.__init__.<locals>.<genexpr>F)r�r r�rzr�r�r�r$rQ�exprsr��Iterable�allrvr�re)r�r4rg)rHrwrxr��s

zParseExpression.__init__cCs
|j|S)N)r4)r�r�rwrwrxr��szParseExpression.__getitem__cCs|jj|�d|_|S)N)r4r�rU)r�r�rwrwrxr��szParseExpression.appendcCs4d|_dd�|jD�|_x|jD]}|j�q W|S)z~Extends C{leaveWhitespace} defined in base class, and also invokes C{leaveWhitespace} on
           all contained expressions.FcSsg|]}|j��qSrw)r�)r�r�rwrwrxr��sz3ParseExpression.leaveWhitespace.<locals>.<listcomp>)rXr4r�)r�r�rwrwrxr��s
zParseExpression.leaveWhitespacecszt|t�rF||jkrvtt|�j|�xP|jD]}|j|jd�q,Wn0tt|�j|�x|jD]}|j|jd�q^W|S)Nrrrsrs)rzr+r]r�r r�r4)r�r�r�)rHrwrxr��s

zParseExpression.ignorecsLytt|�j�Stk
r"YnX|jdkrFd|jjt|j�f|_|jS)Nz%s:(%s))	r�r r�rKrUrHr�r�r4)r�)rHrwrxr��s
zParseExpression.__str__cs0tt|�j�x|jD]}|j�qWt|j�dk�r|jd}t||j�r�|jr�|jdkr�|j	r�|jdd�|jdg|_d|_
|j|jO_|j|jO_|jd}t||j�o�|jo�|jdko�|j	�r|jdd�|jdd�|_d|_
|j|jO_|j|jO_dt
|�|_|S)Nrqrrrz	Expected rsrs)r�r r�r4r�rzrHrSrVr^rUr[r`r�ra)r�r�r�)rHrwrxr��s0




zParseExpression.streamlinecstt|�j||�}|S)N)r�r rm)r�r�rlr�)rHrwrxrm�szParseExpression.setResultsNamecCs:|dd�|g}x|jD]}|j|�qW|jg�dS)N)r4r�r�)r�r��tmpr�rwrwrxr��szParseExpression.validatecs$tt|�j�}dd�|jD�|_|S)NcSsg|]}|j��qSrw)r�)r�r�rwrwrxr��sz(ParseExpression.copy.<locals>.<listcomp>)r�r r�r4)r�r�)rHrwrxr��szParseExpression.copy)F)F)r�r�r�r�r�r�r�r�r�r�r�rmr�r�r�rwrw)rHrxr �s	
"csTeZdZdZGdd�de�Zd�fdd�	Zddd�Zd	d
�Zdd�Z	d
d�Z
�ZS)ra

    Requires all given C{ParseExpression}s to be found in the given order.
    Expressions may be separated by whitespace.
    May be constructed using the C{'+'} operator.
    May also be constructed using the C{'-'} operator, which will suppress backtracking.

    Example::
        integer = Word(nums)
        name_expr = OneOrMore(Word(alphas))

        expr = And([integer("id"),name_expr("name"),integer("age")])
        # more easily written as:
        expr = integer("id") + name_expr("name") + integer("age")
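
        # illustrative addition (not in the original docstring):
        expr.parseString("42 Bob Smith 21")  # -> ['42', 'Bob', 'Smith', '21']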
    cseZdZ�fdd�Z�ZS)zAnd._ErrorStopcs&ttj|�j||�d|_|j�dS)N�-)r�rr�r�r�r�)r�r�r�)rHrwrxr�
szAnd._ErrorStop.__init__)r�r�r�r�r�rwrw)rHrxr�
sr�TcsRtt|�j||�tdd�|jD��|_|j|jdj�|jdj|_d|_	dS)Ncss|]}|jVqdS)N)r[)r�r�rwrwrxr�
szAnd.__init__.<locals>.<genexpr>rT)
r�rr�r6r4r[r�rYrXre)r�r4rg)rHrwrxr�
s
zAnd.__init__c	Cs|jdj|||dd�\}}d}x�|jdd�D]�}t|tj�rFd}q0|r�y|j|||�\}}Wq�tk
rv�Yq�tk
r�}zd|_tj|��WYdd}~Xq�t	k
r�t|t
|�|j|��Yq�Xn|j|||�\}}|s�|j�r0||7}q0W||fS)NrF)rprrT)
r4rtrzrr�r#r�
__traceback__r�r�r�rar�)	r�r-r�ro�
resultlistZ	errorStopr�Z
exprtokensr�rwrwrxr�
s(z
And.parseImplcCst|t�rtj|�}|j|�S)N)rzr�r$rQr�)r�r�rwrwrxr5
s

zAnd.__iadd__cCs8|dd�|g}x |jD]}|j|�|jsPqWdS)N)r4r�r[)r�r��subRecCheckListr�rwrwrxr�:
s

zAnd.checkRecursioncCs@t|d�r|jS|jdkr:ddjdd�|jD��d|_|jS)Nr��{r�css|]}t|�VqdS)N)r�)r�r�rwrwrxr�F
szAnd.__str__.<locals>.<genexpr>�})r�r�rUr�r4)r�rwrwrxr�A
s


 zAnd.__str__)T)T)r�r�r�r�r
r�r�r�rr�r�r�rwrw)rHrxr�s
csDeZdZdZd�fdd�	Zddd�Zdd	�Zd
d�Zdd
�Z�Z	S)ra�
    Requires that at least one C{ParseExpression} is found.
    If two expressions match, the expression that matches the longest string will be used.
    May be constructed using the C{'^'} operator.

    Example::
        # construct Or using '^' operator
        
        number = Word(nums) ^ Combine(Word(nums) + '.' + Word(nums))
        print(number.searchString("123 3.1416 789"))
    prints::
        [['123'], ['3.1416'], ['789']]
    Fcs:tt|�j||�|jr0tdd�|jD��|_nd|_dS)Ncss|]}|jVqdS)N)r[)r�r�rwrwrxr�\
szOr.__init__.<locals>.<genexpr>T)r�rr�r4rr[)r�r4rg)rHrwrxr�Y
szOr.__init__TcCsTd}d}g}x�|jD]�}y|j||�}Wnvtk
rd}	z d|	_|	j|krT|	}|	j}WYdd}	~	Xqtk
r�t|�|kr�t|t|�|j|�}t|�}YqX|j||f�qW|�r*|j	dd�d�x`|D]X\}
}y|j
|||�Stk
�r$}	z"d|	_|	j|k�r|	}|	j}WYdd}	~	Xq�Xq�W|dk	�rB|j|_|�nt||d|��dS)NrrcSs
|dS)Nrrw)�xrwrwrxryu
szOr.parseImpl.<locals>.<lambda>)r�z no defined alternatives to matchrs)r4r�rr9r�r�r�rar��sortrtr�)r�r-r�ro�	maxExcLoc�maxExceptionr�r�Zloc2r��_rwrwrxr�`
s<

zOr.parseImplcCst|t�rtj|�}|j|�S)N)rzr�r$rQr�)r�r�rwrwrx�__ixor__�
s

zOr.__ixor__cCs@t|d�r|jS|jdkr:ddjdd�|jD��d|_|jS)Nr�r<z ^ css|]}t|�VqdS)N)r�)r�r�rwrwrxr��
szOr.__str__.<locals>.<genexpr>r=)r�r�rUr�r4)r�rwrwrxr��
s


 z
Or.__str__cCs0|dd�|g}x|jD]}|j|�qWdS)N)r4r�)r�r�r;r�rwrwrxr��
szOr.checkRecursion)F)T)
r�r�r�r�r�r�rCr�r�r�rwrw)rHrxrK
s

&	csDeZdZdZd�fdd�	Zddd�Zdd	�Zd
d�Zdd
�Z�Z	S)ra�
    Requires that at least one C{ParseExpression} is found.
    If two expressions match, the first one listed is the one that will match.
    May be constructed using the C{'|'} operator.

    Example::
        # construct MatchFirst using '|' operator
        
        # watch the order of expressions to match
        number = Word(nums) | Combine(Word(nums) + '.' + Word(nums))
        print(number.searchString("123 3.1416 789")) #  Fail! -> [['123'], ['3'], ['1416'], ['789']]

        # put more selective expression first
        number = Combine(Word(nums) + '.' + Word(nums)) | Word(nums)
        print(number.searchString("123 3.1416 789")) #  Better -> [['123'], ['3.1416'], ['789']]
    Fcs:tt|�j||�|jr0tdd�|jD��|_nd|_dS)Ncss|]}|jVqdS)N)r[)r�r�rwrwrxr��
sz&MatchFirst.__init__.<locals>.<genexpr>T)r�rr�r4rr[)r�r4rg)rHrwrxr��
szMatchFirst.__init__Tc	Cs�d}d}x�|jD]�}y|j|||�}|Stk
r\}z|j|krL|}|j}WYdd}~Xqtk
r�t|�|kr�t|t|�|j|�}t|�}YqXqW|dk	r�|j|_|�nt||d|��dS)Nrrz no defined alternatives to matchrs)r4rtrr�r�r�rar�)	r�r-r�ror@rAr�r�r�rwrwrxr��
s$
zMatchFirst.parseImplcCst|t�rtj|�}|j|�S)N)rzr�r$rQr�)r�r�rwrwrx�__ior__�
s

zMatchFirst.__ior__cCs@t|d�r|jS|jdkr:ddjdd�|jD��d|_|jS)Nr�r<z | css|]}t|�VqdS)N)r�)r�r�rwrwrxr��
sz%MatchFirst.__str__.<locals>.<genexpr>r=)r�r�rUr�r4)r�rwrwrxr��
s


 zMatchFirst.__str__cCs0|dd�|g}x|jD]}|j|�qWdS)N)r4r�)r�r�r;r�rwrwrxr��
szMatchFirst.checkRecursion)F)T)
r�r�r�r�r�r�rDr�r�r�rwrw)rHrxr�
s
	cs<eZdZdZd�fdd�	Zddd�Zdd�Zd	d
�Z�ZS)
ram
    Requires all given C{ParseExpression}s to be found, but in any order.
    Expressions may be separated by whitespace.
    May be constructed using the C{'&'} operator.

    Example::
        color = oneOf("RED ORANGE YELLOW GREEN BLUE PURPLE BLACK WHITE BROWN")
        shape_type = oneOf("SQUARE CIRCLE TRIANGLE STAR HEXAGON OCTAGON")
        integer = Word(nums)
        shape_attr = "shape:" + shape_type("shape")
        posn_attr = "posn:" + Group(integer("x") + ',' + integer("y"))("posn")
        color_attr = "color:" + color("color")
        size_attr = "size:" + integer("size")

        # use Each (using operator '&') to accept attributes in any order 
        # (shape and posn are required, color and size are optional)
        shape_spec = shape_attr & posn_attr & Optional(color_attr) & Optional(size_attr)

        shape_spec.runTests('''
            shape: SQUARE color: BLACK posn: 100, 120
            shape: CIRCLE size: 50 color: BLUE posn: 50,80
            color:GREEN size:20 shape:TRIANGLE posn:20,40
            '''
            )
    prints::
        shape: SQUARE color: BLACK posn: 100, 120
        ['shape:', 'SQUARE', 'color:', 'BLACK', 'posn:', ['100', ',', '120']]
        - color: BLACK
        - posn: ['100', ',', '120']
          - x: 100
          - y: 120
        - shape: SQUARE


        shape: CIRCLE size: 50 color: BLUE posn: 50,80
        ['shape:', 'CIRCLE', 'size:', '50', 'color:', 'BLUE', 'posn:', ['50', ',', '80']]
        - color: BLUE
        - posn: ['50', ',', '80']
          - x: 50
          - y: 80
        - shape: CIRCLE
        - size: 50


        color: GREEN size: 20 shape: TRIANGLE posn: 20,40
        ['color:', 'GREEN', 'size:', '20', 'shape:', 'TRIANGLE', 'posn:', ['20', ',', '40']]
        - color: GREEN
        - posn: ['20', ',', '40']
          - x: 20
          - y: 40
        - shape: TRIANGLE
        - size: 20
    Tcs8tt|�j||�tdd�|jD��|_d|_d|_dS)Ncss|]}|jVqdS)N)r[)r�r�rwrwrxr�sz Each.__init__.<locals>.<genexpr>T)r�rr�r6r4r[rX�initExprGroups)r�r4rg)rHrwrxr�sz
Each.__init__c
s�|jr�tdd�|jD��|_dd�|jD�}dd�|jD�}|||_dd�|jD�|_dd�|jD�|_dd�|jD�|_|j|j7_d	|_|}|jdd�}|jdd��g}d
}	x�|	�rp|�|j|j}
g}x~|
D]v}y|j||�}Wn t	k
�r|j
|�Yq�X|j
|jjt|�|��||k�rD|j
|�q�|�kr�j
|�q�Wt|�t|
�kr�d	}	q�W|�r�djdd�|D��}
t	||d
|
��|�fdd�|jD�7}g}x*|D]"}|j|||�\}}|j
|��q�Wt|tg��}||fS)Ncss&|]}t|t�rt|j�|fVqdS)N)rzrr�r.)r�r�rwrwrxr�sz!Each.parseImpl.<locals>.<genexpr>cSsg|]}t|t�r|j�qSrw)rzrr.)r�r�rwrwrxr�sz"Each.parseImpl.<locals>.<listcomp>cSs"g|]}|jrt|t�r|�qSrw)r[rzr)r�r�rwrwrxr�scSsg|]}t|t�r|j�qSrw)rzr2r.)r�r�rwrwrxr� scSsg|]}t|t�r|j�qSrw)rzrr.)r�r�rwrwrxr�!scSs g|]}t|tttf�s|�qSrw)rzrr2r)r�r�rwrwrxr�"sFTz, css|]}t|�VqdS)N)r�)r�r�rwrwrxr�=sz*Missing one or more required elements (%s)cs$g|]}t|t�r|j�kr|�qSrw)rzrr.)r�r�)�tmpOptrwrxr�As)rEr�r4Zopt1mapZ	optionalsZmultioptionalsZ
multirequiredZrequiredr�rr�r�r��remover�r�rt�sumr")r�r-r�roZopt1Zopt2ZtmpLocZtmpReqdZ
matchOrderZkeepMatchingZtmpExprsZfailedr�Zmissingr:r�ZfinalResultsrw)rFrxr�sP



zEach.parseImplcCs@t|d�r|jS|jdkr:ddjdd�|jD��d|_|jS)Nr�r<z & css|]}t|�VqdS)N)r�)r�r�rwrwrxr�PszEach.__str__.<locals>.<genexpr>r=)r�r�rUr�r4)r�rwrwrxr�Ks


 zEach.__str__cCs0|dd�|g}x|jD]}|j|�qWdS)N)r4r�)r�r�r;r�rwrwrxr�TszEach.checkRecursion)T)T)	r�r�r�r�r�r�r�r�r�rwrw)rHrxr�
s
5
1	csleZdZdZd�fdd�	Zddd�Zdd	�Z�fd
d�Z�fdd
�Zdd�Z	gfdd�Z
�fdd�Z�ZS)rza
    Abstract subclass of C{ParserElement}, for combining and post-processing parsed tokens.
    Fcs�tt|�j|�t|t�r@ttjt�r2tj|�}ntjt	|��}||_
d|_|dk	r�|j|_|j
|_
|j|j�|j|_|j|_|j|_|jj|j�dS)N)r�rr�rzr��
issubclassr$rQr,rr.rUr`r[r�rYrXrWrer]r�)r�r.rg)rHrwrxr�^s
zParseElementEnhance.__init__TcCs2|jdk	r|jj|||dd�Std||j|��dS)NF)rpr�)r.rtrra)r�r-r�rorwrwrxr�ps
zParseElementEnhance.parseImplcCs*d|_|jj�|_|jdk	r&|jj�|S)NF)rXr.r�r�)r�rwrwrxr�vs


z#ParseElementEnhance.leaveWhitespacecsrt|t�rB||jkrntt|�j|�|jdk	rn|jj|jd�n,tt|�j|�|jdk	rn|jj|jd�|S)Nrrrsrs)rzr+r]r�rr�r.)r�r�)rHrwrxr�}s



zParseElementEnhance.ignorecs&tt|�j�|jdk	r"|jj�|S)N)r�rr�r.)r�)rHrwrxr��s

zParseElementEnhance.streamlinecCsB||krt||g��|dd�|g}|jdk	r>|jj|�dS)N)r&r.r�)r�r�r;rwrwrxr��s

z"ParseElementEnhance.checkRecursioncCs6|dd�|g}|jdk	r(|jj|�|jg�dS)N)r.r�r�)r�r�r7rwrwrxr��s
zParseElementEnhance.validatecsVytt|�j�Stk
r"YnX|jdkrP|jdk	rPd|jjt|j�f|_|jS)Nz%s:(%s))	r�rr�rKrUr.rHr�r�)r�)rHrwrxr��szParseElementEnhance.__str__)F)T)
r�r�r�r�r�r�r�r�r�r�r�r�r�rwrw)rHrxrZs
cs*eZdZdZ�fdd�Zddd�Z�ZS)ra�
    Lookahead matching of the given parse expression.  C{FollowedBy}
    does I{not} advance the parsing position within the input string, it only
    verifies that the specified parse expression matches at the current
    position.  C{FollowedBy} always returns a null token list.

    Example::
        # use FollowedBy to match a label only if it is followed by a ':'
        data_word = Word(alphas)
        label = data_word + FollowedBy(':')
        attr_expr = Group(label + Suppress(':') + OneOrMore(data_word, stopOn=label).setParseAction(' '.join))
        
        OneOrMore(attr_expr).parseString("shape: SQUARE color: BLACK posn: upper left").pprint()
    prints::
        [['shape', 'SQUARE'], ['color', 'BLACK'], ['posn', 'upper left']]
    cstt|�j|�d|_dS)NT)r�rr�r[)r�r.)rHrwrxr��szFollowedBy.__init__TcCs|jj||�|gfS)N)r.r�)r�r-r�rorwrwrxr��szFollowedBy.parseImpl)T)r�r�r�r�r�r�r�rwrw)rHrxr�scs2eZdZdZ�fdd�Zd	dd�Zdd�Z�ZS)
ra�
    Lookahead to disallow matching with the given parse expression.  C{NotAny}
    does I{not} advance the parsing position within the input string, it only
    verifies that the specified parse expression does I{not} match at the current
    position.  Also, C{NotAny} does I{not} skip over leading whitespace. C{NotAny}
    always returns a null token list.  May be constructed using the '~' operator.

    Example::
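        # illustrative sketch (the original docstring leaves this example empty):
        # match an identifier only if it is not a reserved keyword
        keyword = Keyword("if") | Keyword("else")
        identifier = ~keyword + Word(alphas)
        identifier.parseString("count")  # -> ['count']
        identifier.parseString("else")   # -> Exception: Found unwanted token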
        
    cs0tt|�j|�d|_d|_dt|j�|_dS)NFTzFound unwanted token, )r�rr�rXr[r�r.ra)r�r.)rHrwrxr��szNotAny.__init__TcCs&|jj||�rt|||j|��|gfS)N)r.r�rra)r�r-r�rorwrwrxr��szNotAny.parseImplcCs4t|d�r|jS|jdkr.dt|j�d|_|jS)Nr�z~{r=)r�r�rUr�r.)r�rwrwrxr��s


zNotAny.__str__)T)r�r�r�r�r�r�r�r�rwrw)rHrxr�s

cs(eZdZd�fdd�	Zddd�Z�ZS)	�_MultipleMatchNcsFtt|�j|�d|_|}t|t�r.tj|�}|dk	r<|nd|_dS)NT)	r�rJr�rWrzr�r$rQ�	not_ender)r�r.�stopOnZender)rHrwrxr��s

z_MultipleMatch.__init__TcCs�|jj}|j}|jdk	}|r$|jj}|r2|||�||||dd�\}}yZ|j}	xJ|rb|||�|	rr|||�}
n|}
|||
|�\}}|s�|j�rT||7}qTWWnttfk
r�YnX||fS)NF)rp)	r.rtr�rKr�r]r�rr�)r�r-r�roZself_expr_parseZself_skip_ignorablesZcheck_enderZ
try_not_enderr�ZhasIgnoreExprsr�Z	tmptokensrwrwrxr��s,



z_MultipleMatch.parseImpl)N)T)r�r�r�r�r�r�rwrw)rHrxrJ�srJc@seZdZdZdd�ZdS)ra�
    Repetition of one or more of the given expression.
    
    Parameters:
     - expr - expression that must match one or more times
     - stopOn - (default=C{None}) - expression for a terminating sentinel
          (only required if the sentinel would ordinarily match the repetition 
          expression)          

    Example::
        data_word = Word(alphas)
        label = data_word + FollowedBy(':')
        attr_expr = Group(label + Suppress(':') + OneOrMore(data_word).setParseAction(' '.join))

        text = "shape: SQUARE posn: upper left color: BLACK"
        OneOrMore(attr_expr).parseString(text).pprint()  # Fail! read 'color' as data instead of next label -> [['shape', 'SQUARE color']]

        # use stopOn attribute for OneOrMore to avoid reading label string as part of the data
        attr_expr = Group(label + Suppress(':') + OneOrMore(data_word, stopOn=label).setParseAction(' '.join))
        OneOrMore(attr_expr).parseString(text).pprint() # Better -> [['shape', 'SQUARE'], ['posn', 'upper left'], ['color', 'BLACK']]
        
        # could also be written as
        (attr_expr * (1,)).parseString(text).pprint()
    cCs4t|d�r|jS|jdkr.dt|j�d|_|jS)Nr�r<z}...)r�r�rUr�r.)r�rwrwrxr�!s


zOneOrMore.__str__N)r�r�r�r�r�rwrwrwrxrscs8eZdZdZd
�fdd�	Zd�fdd�	Zdd	�Z�ZS)r2aw
    Optional repetition of zero or more of the given expression.
    
    Parameters:
     - expr - expression that must match zero or more times
     - stopOn - (default=C{None}) - expression for a terminating sentinel
          (only required if the sentinel would ordinarily match the repetition 
          expression)          

    Example: similar to L{OneOrMore}
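
    A small illustrative sketch (not part of the original docstring)::
        greeting = Word(alphas) + ZeroOrMore(Suppress(',') + Word(alphas))
        greeting.parseString("hello")         # -> ['hello']
        greeting.parseString("hello,hi,hey")  # -> ['hello', 'hi', 'hey']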
    Ncstt|�j||d�d|_dS)N)rLT)r�r2r�r[)r�r.rL)rHrwrxr�6szZeroOrMore.__init__Tcs6ytt|�j|||�Sttfk
r0|gfSXdS)N)r�r2r�rr�)r�r-r�ro)rHrwrxr�:szZeroOrMore.parseImplcCs4t|d�r|jS|jdkr.dt|j�d|_|jS)Nr�rz]...)r�r�rUr�r.)r�rwrwrxr�@s


zZeroOrMore.__str__)N)T)r�r�r�r�r�r�r�r�rwrw)rHrxr2*sc@s eZdZdd�ZeZdd�ZdS)�
_NullTokencCsdS)NFrw)r�rwrwrxr�Jsz_NullToken.__bool__cCsdS)Nr�rw)r�rwrwrxr�Msz_NullToken.__str__N)r�r�r�r�r'r�rwrwrwrxrMIsrMcs6eZdZdZef�fdd�	Zd	dd�Zdd�Z�ZS)
raa
    Optional matching of the given expression.

    Parameters:
     - expr - expression that must match zero or more times
     - default (optional) - value to be returned if the optional expression is not found.

    Example::
        # US postal code can be a 5-digit zip, plus optional 4-digit qualifier
        zip = Combine(Word(nums, exact=5) + Optional('-' + Word(nums, exact=4)))
        zip.runTests('''
            # traditional ZIP code
            12345
            
            # ZIP+4 form
            12101-0001
            
            # invalid ZIP
            98765-
            ''')
    prints::
        # traditional ZIP code
        12345
        ['12345']

        # ZIP+4 form
        12101-0001
        ['12101-0001']

        # invalid ZIP
        98765-
             ^
        FAIL: Expected end of text (at char 5), (line:1, col:6)
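
    An illustrative addition (not part of the original docstring), using the C{default}
    argument::
        expr = Word(nums) + Optional(Word(alphas), default="n/a")
        expr.parseString("100")      # -> ['100', 'n/a']
        expr.parseString("100 kHz")  # -> ['100', 'kHz']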
    cs.tt|�j|dd�|jj|_||_d|_dS)NF)rgT)r�rr�r.rWr�r[)r�r.r�)rHrwrxr�ts
zOptional.__init__TcCszy|jj|||dd�\}}WnTttfk
rp|jtk	rh|jjr^t|jg�}|j||jj<ql|jg}ng}YnX||fS)NF)rp)r.rtrr�r��_optionalNotMatchedrVr")r�r-r�ror�rwrwrxr�zs


zOptional.parseImplcCs4t|d�r|jS|jdkr.dt|j�d|_|jS)Nr�rr	)r�r�rUr�r.)r�rwrwrxr��s


zOptional.__str__)T)	r�r�r�r�rNr�r�r�r�rwrw)rHrxrQs"
cs,eZdZdZd	�fdd�	Zd
dd�Z�ZS)r(a�	
    Token for skipping over all undefined text until the matched expression is found.

    Parameters:
     - expr - target expression marking the end of the data to be skipped
     - include - (default=C{False}) if True, the target expression is also parsed 
          (the skipped text and target expression are returned as a 2-element list).
     - ignore - (default=C{None}) used to define grammars (typically quoted strings and 
          comments) that might contain false matches to the target expression
     - failOn - (default=C{None}) define expressions that are not allowed to be 
          included in the skipped text; if found before the target expression is found, 
          the SkipTo is not a match

    Example::
        report = '''
            Outstanding Issues Report - 1 Jan 2000

               # | Severity | Description                               |  Days Open
            -----+----------+-------------------------------------------+-----------
             101 | Critical | Intermittent system crash                 |          6
              94 | Cosmetic | Spelling error on Login ('log|n')         |         14
              79 | Minor    | System slow when running too many reports |         47
            '''
        integer = Word(nums)
        SEP = Suppress('|')
        # use SkipTo to simply match everything up until the next SEP
        # - ignore quoted strings, so that a '|' character inside a quoted string does not match
        # - parse action will call token.strip() for each matched token, i.e., the description body
        string_data = SkipTo(SEP, ignore=quotedString)
        string_data.setParseAction(tokenMap(str.strip))
        ticket_expr = (integer("issue_num") + SEP 
                      + string_data("sev") + SEP 
                      + string_data("desc") + SEP 
                      + integer("days_open"))
        
        for tkt in ticket_expr.searchString(report):
            print(tkt.dump())
    prints::
        ['101', 'Critical', 'Intermittent system crash', '6']
        - days_open: 6
        - desc: Intermittent system crash
        - issue_num: 101
        - sev: Critical
        ['94', 'Cosmetic', "Spelling error on Login ('log|n')", '14']
        - days_open: 14
        - desc: Spelling error on Login ('log|n')
        - issue_num: 94
        - sev: Cosmetic
        ['79', 'Minor', 'System slow when running too many reports', '47']
        - days_open: 47
        - desc: System slow when running too many reports
        - issue_num: 79
        - sev: Minor
    FNcs`tt|�j|�||_d|_d|_||_d|_t|t	�rFt
j|�|_n||_dt
|j�|_dS)NTFzNo match found for )r�r(r��
ignoreExprr[r`�includeMatchr�rzr�r$rQ�failOnr�r.ra)r�r��includer�rQ)rHrwrxr��s
zSkipTo.__init__TcCs,|}t|�}|j}|jj}|jdk	r,|jjnd}|jdk	rB|jjnd}	|}
x�|
|kr�|dk	rh|||
�rhP|	dk	r�x*y|	||
�}
Wqrtk
r�PYqrXqrWy|||
ddd�Wn tt	fk
r�|
d7}
YqLXPqLWt|||j
|��|
}|||�}t|�}|j�r$||||dd�\}}
||
7}||fS)NF)rorprr)rp)
r�r.rtrQr�rOr�rrr�rar"rP)r�r-r�ror0r�r.Z
expr_parseZself_failOn_canParseNextZself_ignoreExpr_tryParseZtmplocZskiptextZ
skipresultr�rwrwrxr��s<

zSkipTo.parseImpl)FNN)T)r�r�r�r�r�r�r�rwrw)rHrxr(�s6
csbeZdZdZd�fdd�	Zdd�Zdd�Zd	d
�Zdd�Zgfd
d�Z	dd�Z
�fdd�Z�ZS)raK
    Forward declaration of an expression to be defined later -
    used for recursive grammars, such as algebraic infix notation.
    When the expression is known, it is assigned to the C{Forward} variable using the '<<' operator.

    Note: take care when assigning to C{Forward} not to overlook precedence of operators.
    Specifically, '|' has a lower precedence than '<<', so that::
        fwdExpr << a | b | c
    will actually be evaluated as::
        (fwdExpr << a) | b | c
    thereby leaving b and c out as parseable alternatives.  It is recommended that you
    explicitly group the values inserted into the C{Forward}::
        fwdExpr << (a | b | c)
    Converting to use the '<<=' operator instead will avoid this problem.

    See L{ParseResults.pprint} for an example of a recursive parser created using
    C{Forward}.
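
    A minimal illustrative sketch (not part of the original docstring) of a recursive
    grammar built with C{Forward}::
        expr = Forward()
        atom = Word(nums) | Group(Suppress('(') + expr + Suppress(')'))
        expr <<= atom + ZeroOrMore('+' + atom)
        expr.parseString("(1+2)+3")  # -> [['1', '+', '2'], '+', '3']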
    Ncstt|�j|dd�dS)NF)rg)r�rr�)r�r�)rHrwrxr�szForward.__init__cCsjt|t�rtj|�}||_d|_|jj|_|jj|_|j|jj	�|jj
|_
|jj|_|jj
|jj�|S)N)rzr�r$rQr.rUr`r[r�rYrXrWr]r�)r�r�rwrwrx�
__lshift__s





zForward.__lshift__cCs||>S)Nrw)r�r�rwrwrx�__ilshift__'szForward.__ilshift__cCs
d|_|S)NF)rX)r�rwrwrxr�*szForward.leaveWhitespacecCs$|js d|_|jdk	r |jj�|S)NT)r_r.r�)r�rwrwrxr�.s


zForward.streamlinecCs>||kr0|dd�|g}|jdk	r0|jj|�|jg�dS)N)r.r�r�)r�r�r7rwrwrxr�5s

zForward.validatecCs>t|d�r|jS|jjdSd}Wd|j|_X|jjd|S)Nr�z: ...�Nonez: )r�r�rHr�Z_revertClass�_ForwardNoRecurser.r�)r�Z	retStringrwrwrxr�<s

zForward.__str__cs.|jdk	rtt|�j�St�}||K}|SdS)N)r.r�rr�)r�r�)rHrwrxr�Ms

zForward.copy)N)
r�r�r�r�r�rSrTr�r�r�r�r�r�rwrw)rHrxrs
c@seZdZdd�ZdS)rVcCsdS)Nz...rw)r�rwrwrxr�Vsz_ForwardNoRecurse.__str__N)r�r�r�r�rwrwrwrxrVUsrVcs"eZdZdZd�fdd�	Z�ZS)r-zQ
    Abstract subclass of C{ParseExpression}, for converting parsed results.
    Fcstt|�j|�d|_dS)NF)r�r-r�rW)r�r.rg)rHrwrxr�]szTokenConverter.__init__)F)r�r�r�r�r�r�rwrw)rHrxr-Yscs6eZdZdZd
�fdd�	Z�fdd�Zdd	�Z�ZS)r
a�
    Converter to concatenate all matching tokens to a single string.
    By default, the matching patterns must also be contiguous in the input string;
    this can be disabled by specifying C{'adjacent=False'} in the constructor.

    Example::
        real = Word(nums) + '.' + Word(nums)
        print(real.parseString('3.1416')) # -> ['3', '.', '1416']
        # will also erroneously match the following
        print(real.parseString('3. 1416')) # -> ['3', '.', '1416']

        real = Combine(Word(nums) + '.' + Word(nums))
        print(real.parseString('3.1416')) # -> ['3.1416']
        # no match when there are internal spaces
        print(real.parseString('3. 1416')) # -> Exception: Expected W:(0123...)
    r�Tcs8tt|�j|�|r|j�||_d|_||_d|_dS)NT)r�r
r�r��adjacentrX�
joinStringre)r�r.rXrW)rHrwrxr�rszCombine.__init__cs(|jrtj||�ntt|�j|�|S)N)rWr$r�r�r
)r�r�)rHrwrxr�|szCombine.ignorecCsP|j�}|dd�=|tdj|j|j��g|jd�7}|jrH|j�rH|gS|SdS)Nr�)r�)r�r"r�r
rXrbrVr�)r�r-r�r�ZretToksrwrwrxr��s
"zCombine.postParse)r�T)r�r�r�r�r�r�r�r�rwrw)rHrxr
as
cs(eZdZdZ�fdd�Zdd�Z�ZS)ra�
    Converter to return the matched tokens as a list - useful for returning tokens of C{L{ZeroOrMore}} and C{L{OneOrMore}} expressions.

    Example::
        ident = Word(alphas)
        num = Word(nums)
        term = ident | num
        func = ident + Optional(delimitedList(term))
        print(func.parseString("fn a,b,100"))  # -> ['fn', 'a', 'b', '100']

        func = ident + Group(Optional(delimitedList(term)))
        print(func.parseString("fn a,b,100"))  # -> ['fn', ['a', 'b', '100']]
    cstt|�j|�d|_dS)NT)r�rr�rW)r�r.)rHrwrxr��szGroup.__init__cCs|gS)Nrw)r�r-r�r�rwrwrxr��szGroup.postParse)r�r�r�r�r�r�r�rwrw)rHrxr�s
cs(eZdZdZ�fdd�Zdd�Z�ZS)raW
    Converter to return a repetitive expression as a list, but also as a dictionary.
    Each element can also be referenced using the first token in the expression as its key.
    Useful for tabular report scraping when the first column can be used as an item key.

    Example::
        data_word = Word(alphas)
        label = data_word + FollowedBy(':')
        attr_expr = Group(label + Suppress(':') + OneOrMore(data_word).setParseAction(' '.join))

        text = "shape: SQUARE posn: upper left color: light blue texture: burlap"
        attr_expr = (label + Suppress(':') + OneOrMore(data_word, stopOn=label).setParseAction(' '.join))
        
        # print attributes as plain groups
        print(OneOrMore(attr_expr).parseString(text).dump())
        
        # instead of OneOrMore(expr), parse using Dict(OneOrMore(Group(expr))) - Dict will auto-assign names
        result = Dict(OneOrMore(Group(attr_expr))).parseString(text)
        print(result.dump())
        
        # access named fields as dict entries, or output as dict
        print(result['shape'])        
        print(result.asDict())
    prints::
        ['shape', 'SQUARE', 'posn', 'upper left', 'color', 'light blue', 'texture', 'burlap']

        [['shape', 'SQUARE'], ['posn', 'upper left'], ['color', 'light blue'], ['texture', 'burlap']]
        - color: light blue
        - posn: upper left
        - shape: SQUARE
        - texture: burlap
        SQUARE
        {'color': 'light blue', 'posn': 'upper left', 'texture': 'burlap', 'shape': 'SQUARE'}
    See more examples at L{ParseResults} of accessing fields by results name.
    cstt|�j|�d|_dS)NT)r�rr�rW)r�r.)rHrwrxr��sz
Dict.__init__cCs�x�t|�D]�\}}t|�dkr q
|d}t|t�rBt|d�j�}t|�dkr^td|�||<q
t|�dkr�t|dt�r�t|d|�||<q
|j�}|d=t|�dks�t|t�r�|j	�r�t||�||<q
t|d|�||<q
W|j
r�|gS|SdS)Nrrrr�rq)r�r�rzrur�r�r�r"r�r�rV)r�r-r�r�r��tokZikeyZ	dictvaluerwrwrxr��s$
zDict.postParse)r�r�r�r�r�r�r�rwrw)rHrxr�s#c@s eZdZdZdd�Zdd�ZdS)r+aV
    Converter for ignoring the results of a parsed expression.

    Example::
        source = "a, b, c,d"
        wd = Word(alphas)
        wd_list1 = wd + ZeroOrMore(',' + wd)
        print(wd_list1.parseString(source))

        # often, delimiters that are useful during parsing are just in the
        # way afterward - use Suppress to keep them out of the parsed output
        wd_list2 = wd + ZeroOrMore(Suppress(',') + wd)
        print(wd_list2.parseString(source))
    prints::
        ['a', ',', 'b', ',', 'c', ',', 'd']
        ['a', 'b', 'c', 'd']
    (See also L{delimitedList}.)
    cCsgS)Nrw)r�r-r�r�rwrwrxr��szSuppress.postParsecCs|S)Nrw)r�rwrwrxr��szSuppress.suppressN)r�r�r�r�r�r�rwrwrwrxr+�sc@s(eZdZdZdd�Zdd�Zdd�ZdS)	rzI
    Wrapper for parse actions, to ensure they are only called once.
    cCst|�|_d|_dS)NF)rM�callable�called)r�Z
methodCallrwrwrxr�s
zOnlyOnce.__init__cCs.|js|j|||�}d|_|St||d��dS)NTr�)r[rZr)r�r�r5rvr�rwrwrxr�s
zOnlyOnce.__call__cCs
d|_dS)NF)r[)r�rwrwrx�reset
szOnlyOnce.resetN)r�r�r�r�r�r�r\rwrwrwrxr�scs:t����fdd�}y�j|_Wntk
r4YnX|S)as
    Decorator for debugging parse actions. 
    
    When the parse action is called, this decorator will print C{">> entering I{method-name}(line:I{current_source_line}, I{parse_location}, I{matched_tokens})".}
    When the parse action completes, the decorator will print C{"<<"} followed by the returned value, or any exception that the parse action raised.

    Example::
        wd = Word(alphas)

        @traceParseAction
        def remove_duplicate_chars(tokens):
            return ''.join(sorted(set(''.join(tokens))))

        wds = OneOrMore(wd).setParseAction(remove_duplicate_chars)
        print(wds.parseString("slkdjs sld sldd sdlf sdljf"))
    prints::
        >>entering remove_duplicate_chars(line: 'slkdjs sld sldd sdlf sdljf', 0, (['slkdjs', 'sld', 'sldd', 'sdlf', 'sdljf'], {}))
        <<leaving remove_duplicate_chars (ret: 'dfjkls')
        ['dfjkls']
    cs��j}|dd�\}}}t|�dkr8|djjd|}tjjd|t||�||f�y�|�}Wn8tk
r�}ztjjd||f��WYdd}~XnXtjjd||f�|S)Nror�.z">>entering %s(line: '%s', %d, %r)
z<<leaving %s (exception: %s)
z<<leaving %s (ret: %r)
r9)r�r�rHr~�stderr�writerGrK)ZpaArgsZthisFuncr�r5rvr�r3)r�rwrx�z#sztraceParseAction.<locals>.z)rMr�r�)r�r`rw)r�rxrb
s
�,FcCs`t|�dt|�dt|�d}|rBt|t||��j|�S|tt|�|�j|�SdS)a�
    Helper to define a delimited list of expressions - the delimiter defaults to ','.
    By default, the list elements and delimiters can have intervening whitespace, and
    comments, but this can be overridden by passing C{combine=True} in the constructor.
    If C{combine} is set to C{True}, the matching tokens are returned as a single token
    string, with the delimiters included; otherwise, the matching tokens are returned
    as a list of tokens, with the delimiters suppressed.

    Example::
        delimitedList(Word(alphas)).parseString("aa,bb,cc") # -> ['aa', 'bb', 'cc']
        delimitedList(Word(hexnums), delim=':', combine=True).parseString("AA:BB:CC:DD:EE") # -> ['AA:BB:CC:DD:EE']
    z [r�z]...N)r�r
r2rir+)r.Zdelim�combineZdlNamerwrwrxr@9s
$csjt����fdd�}|dkr0tt�jdd��}n|j�}|jd�|j|dd�|�jd	t��d
�S)a:
    Helper to define a counted list of expressions.
    This helper defines a pattern of the form::
        integer expr expr expr...
    where the leading integer tells how many expr expressions follow.
    The matched tokens returns the array of expr tokens as a list - the leading count token is suppressed.
    
    If C{intExpr} is specified, it should be a pyparsing expression that produces an integer value.

    Example::
        countedArray(Word(alphas)).parseString('2 ab cd ef')  # -> ['ab', 'cd']

        # in this parser, the leading integer value is given in binary,
        # '10' indicating that 2 values are in the array
        binaryConstant = Word('01').setParseAction(lambda t: int(t[0], 2))
        countedArray(Word(alphas), intExpr=binaryConstant).parseString('10 ab cd ef')  # -> ['ab', 'cd']
    cs.|d}�|r tt�g|��p&tt�>gS)Nr)rrrC)r�r5rvr�)�	arrayExprr.rwrx�countFieldParseAction_s"z+countedArray.<locals>.countFieldParseActionNcSst|d�S)Nr)ru)rvrwrwrxrydszcountedArray.<locals>.<lambda>ZarrayLenT)rfz(len) z...)rr/rRr�r�rirxr�)r.ZintExprrdrw)rcr.rxr<Ls
cCs:g}x0|D](}t|t�r(|jt|��q
|j|�q
W|S)N)rzr�r�r�r�)�Lr�r�rwrwrxr�ks

r�cs6t���fdd�}|j|dd��jdt|���S)a*
    Helper to define an expression that is indirectly defined from
    the tokens matched in a previous expression, that is, it looks
    for a 'repeat' of a previous expression.  For example::
        first = Word(nums)
        second = matchPreviousLiteral(first)
        matchExpr = first + ":" + second
    will match C{"1:1"}, but not C{"1:2"}.  Because this matches a
    previous literal, will also match the leading C{"1:1"} in C{"1:10"}.
    If this is not desired, use C{matchPreviousExpr}.
    Do I{not} use with packrat parsing enabled.
    csP|rBt|�dkr�|d>qLt|j��}�tdd�|D��>n
�t�>dS)Nrrrcss|]}t|�VqdS)N)r)r��ttrwrwrxr��szDmatchPreviousLiteral.<locals>.copyTokenToRepeater.<locals>.<genexpr>)r�r�r�rr
)r�r5rvZtflat)�reprwrx�copyTokenToRepeater�sz1matchPreviousLiteral.<locals>.copyTokenToRepeaterT)rfz(prev) )rrxrir�)r.rhrw)rgrxrOts


csFt��|j�}�|K��fdd�}|j|dd��jdt|���S)aS
    Helper to define an expression that is indirectly defined from
    the tokens matched in a previous expression, that is, it looks
    for a 'repeat' of a previous expression.  For example::
        first = Word(nums)
        second = matchPreviousExpr(first)
        matchExpr = first + ":" + second
    will match C{"1:1"}, but not C{"1:2"}.  Because this matches by
    expressions, will I{not} match the leading C{"1:1"} in C{"1:10"};
    the expressions are evaluated first, and then compared, so
    C{"1"} is compared with C{"10"}.
    Do I{not} use with packrat parsing enabled.
    cs*t|j����fdd�}�j|dd�dS)Ncs$t|j��}|�kr tddd��dS)Nr�r)r�r�r)r�r5rvZtheseTokens)�matchTokensrwrx�mustMatchTheseTokens�szLmatchPreviousExpr.<locals>.copyTokenToRepeater.<locals>.mustMatchTheseTokensT)rf)r�r�r�)r�r5rvrj)rg)rirxrh�sz.matchPreviousExpr.<locals>.copyTokenToRepeaterT)rfz(prev) )rr�rxrir�)r.Ze2rhrw)rgrxrN�scCs>xdD]}|j|t|�}qW|jdd�}|jdd�}t|�S)Nz\^-]rz\nr(z\t)r��_bslashr�)r�r�rwrwrxr�s

rTc
s�|rdd�}dd�}t�ndd�}dd�}t�g}t|t�rF|j�}n&t|tj�r\t|�}ntj	dt
dd�|svt�Sd	}x�|t|�d
k�r||}xnt
||d
d��D]N\}}	||	|�r�|||d
=Pq�|||	�r�|||d
=|j||	�|	}Pq�W|d
7}q|W|�r�|�r�yht|�tdj|��k�rZtd
djdd�|D���jdj|��Stdjdd�|D���jdj|��SWn&tk
�r�tj	dt
dd�YnXt�fdd�|D��jdj|��S)a�
    Helper to quickly define a set of alternative Literals, and makes sure to do
    longest-first testing when there is a conflict, regardless of the input order,
    but returns a C{L{MatchFirst}} for best performance.

    Parameters:
     - strs - a string of space-delimited literals, or a collection of string literals
     - caseless - (default=C{False}) - treat all literals as caseless
     - useRegex - (default=C{True}) - as an optimization, will generate a Regex
          object; otherwise, will generate a C{MatchFirst} object (if C{caseless=True}, or
          if creating a C{Regex} raises an exception)

    Example::
        comp_oper = oneOf("< = > <= >= !=")
        var = Word(alphas)
        number = Word(nums)
        term = var | number
        comparison_expr = term + comp_oper + term
        print(comparison_expr.searchString("B = 12  AA=23 B<=AA AA>12"))
    prints::
        [['B', '=', '12'], ['AA', '=', '23'], ['B', '<=', 'AA'], ['AA', '>', '12']]
    cSs|j�|j�kS)N)r�)r�brwrwrxry�szoneOf.<locals>.<lambda>cSs|j�j|j��S)N)r�r�)rrlrwrwrxry�scSs||kS)Nrw)rrlrwrwrxry�scSs
|j|�S)N)r�)rrlrwrwrxry�sz6Invalid argument to oneOf, expected string or iterablerq)r�rrrNr�z[%s]css|]}t|�VqdS)N)r)r��symrwrwrxr��szoneOf.<locals>.<genexpr>z | �|css|]}tj|�VqdS)N)rdr	)r�rmrwrwrxr��sz7Exception creating Regex for oneOf, building MatchFirstc3s|]}�|�VqdS)Nrw)r�rm)�parseElementClassrwrxr��s)rrrzr�r�r�r5r�r�r�r�rr�r�r�r�r'rirKr)
Zstrsr�ZuseRegexZisequalZmasksZsymbolsr�Zcurr�r�rw)rorxrS�sL





((cCsttt||���S)a�
    Helper to easily and clearly define a dictionary by specifying the respective patterns
    for the key and value.  Takes care of defining the C{L{Dict}}, C{L{ZeroOrMore}}, and C{L{Group}} tokens
    in the proper order.  The key pattern can include delimiting markers or punctuation,
    as long as they are suppressed, thereby leaving the significant key text.  The value
    pattern can include named results, so that the C{Dict} results can include named token
    fields.

    Example::
        text = "shape: SQUARE posn: upper left color: light blue texture: burlap"
        attr_expr = (label + Suppress(':') + OneOrMore(data_word, stopOn=label).setParseAction(' '.join))
        print(OneOrMore(attr_expr).parseString(text).dump())
        
        attr_label = label
        attr_value = Suppress(':') + OneOrMore(data_word, stopOn=label).setParseAction(' '.join)

        # similar to Dict, but simpler call format
        result = dictOf(attr_label, attr_value).parseString(text)
        print(result.dump())
        print(result['shape'])
        print(result.shape)  # object attribute access works too
        print(result.asDict())
    prints::
        [['shape', 'SQUARE'], ['posn', 'upper left'], ['color', 'light blue'], ['texture', 'burlap']]
        - color: light blue
        - posn: upper left
        - shape: SQUARE
        - texture: burlap
        SQUARE
        SQUARE
        {'color': 'light blue', 'shape': 'SQUARE', 'posn': 'upper left', 'texture': 'burlap'}
    )rr2r)r�r�rwrwrxrA�s!cCs^t�jdd��}|j�}d|_|d�||d�}|r@dd�}ndd�}|j|�|j|_|S)	a�
    Helper to return the original, untokenized text for a given expression.  Useful to
    restore the parsed fields of an HTML start tag into the raw tag text itself, or to
    revert separate tokens with intervening whitespace back to the original matching
    input text. By default, returns a string containing the original parsed text.
       
    If the optional C{asString} argument is passed as C{False}, then the return value is a 
    C{L{ParseResults}} containing any results names that were originally matched, and a 
    single token containing the original matched text from the input string.  So if 
    the expression passed to C{L{originalTextFor}} contains expressions with defined
    results names, you must set C{asString} to C{False} if you want to preserve those
    results name values.

    Example::
        src = "this is test <b> bold <i>text</i> </b> normal text "
        for tag in ("b","i"):
            opener,closer = makeHTMLTags(tag)
            patt = originalTextFor(opener + SkipTo(closer) + closer)
            print(patt.searchString(src)[0])
    prints::
        ['<b> bold <i>text</i> </b>']
        ['<i>text</i>']
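    A rough sketch of the C{asString=False} form described above (illustrative
    only; it reuses the names from the example)::

        opener, closer = makeHTMLTags("b")
        patt = originalTextFor(opener + SkipTo(closer) + closer, asString=False)
        result = patt.searchString(src)[0]
        print(result[0])   # -> '<b> bold <i>text</i> </b>'
        # results names defined inside the wrapped expression are preserved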
    cSs|S)Nrw)r�r�rvrwrwrxry8sz!originalTextFor.<locals>.<lambda>F�_original_start�
_original_endcSs||j|j�S)N)rprq)r�r5rvrwrwrxry=scSs&||jd�|jd��g|dd�<dS)Nrprq)r�)r�r5rvrwrwrx�extractText?sz$originalTextFor.<locals>.extractText)r
r�r�rer])r.ZasStringZ	locMarkerZendlocMarker�	matchExprrrrwrwrxrg s

cCst|�jdd��S)zp
    Helper to undo pyparsing's default grouping of And expressions, even
    if all but one are non-empty.
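    A small sketch of what this undoes (illustrative only)::

        grouped = Group(Word(alphas) + Word(nums))
        print(grouped.parseString("abc 123"))            # -> [['abc', '123']]
        print(ungroup(grouped).parseString("abc 123"))   # -> ['abc', '123']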
    cSs|dS)Nrrw)rvrwrwrxryJszungroup.<locals>.<lambda>)r-r�)r.rwrwrxrhEscCs4t�jdd��}t|d�|d�|j�j�d��S)a�
    Helper to decorate a returned token with its starting and ending locations in the input string.
    This helper adds the following results names:
     - locn_start = location where matched expression begins
     - locn_end = location where matched expression ends
     - value = the actual parsed results

    Be careful if the input text contains C{<TAB>} characters, you may want to call
    C{L{ParserElement.parseWithTabs}}

    Example::
        wd = Word(alphas)
        for match in locatedExpr(wd).searchString("ljsdf123lksdjjf123lkkjj1222"):
            print(match)
    prints::
        [[0, 'ljsdf', 5]]
        [[8, 'lksdjjf', 15]]
        [[18, 'lkkjj', 23]]
    cSs|S)Nrw)r�r5rvrwrwrxry`szlocatedExpr.<locals>.<lambda>Z
locn_startr�Zlocn_end)r
r�rr�r�)r.ZlocatorrwrwrxrjLsz\[]-*.$+^?()~ )r
cCs|ddS)Nrrrrw)r�r5rvrwrwrxryksryz\\0?[xX][0-9a-fA-F]+cCstt|djd�d��S)Nrz\0x�)�unichrru�lstrip)r�r5rvrwrwrxrylsz	\\0[0-7]+cCstt|ddd�d��S)Nrrr�)ruru)r�r5rvrwrwrxrymsz\])r�r
z\wr8rr�Znegate�bodyr	csBdd��y dj�fdd�tj|�jD��Stk
r<dSXdS)a�
    Helper to easily define string ranges for use in Word construction.  Borrows
    syntax from regexp '[]' string range definitions::
        srange("[0-9]")   -> "0123456789"
        srange("[a-z]")   -> "abcdefghijklmnopqrstuvwxyz"
        srange("[a-z$_]") -> "abcdefghijklmnopqrstuvwxyz$_"
    The input string must be enclosed in []'s, and the returned string is the expanded
    character set joined into a single string.
    The values enclosed in the []'s may be:
     - a single character
     - an escaped character with a leading backslash (such as C{\-} or C{\]})
     - an escaped hex character with a leading C{'\x'} (C{\x21}, which is a C{'!'} character) 
         (C{\0x##} is also supported for backwards compatibility) 
     - an escaped octal character with a leading C{'\0'} (C{\041}, which is a C{'!'} character)
     - a range of any of the above, separated by a dash (C{'a-z'}, etc.)
     - any combination of the above (C{'aeiouy'}, C{'a-zA-Z0-9_$'}, etc.)
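    A character set built this way is typically handed straight to C{Word}; for
    instance (a sketch, not part of the original docstring)::

        hex_chars = srange("[0-9a-fA-F]")        # -> "0123456789abcdefABCDEF"
        hex_word = Word(hex_chars)
        print(hex_word.parseString("DEADbeef42"))   # -> ['DEADbeef42']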
    cSs<t|t�s|Sdjdd�tt|d�t|d�d�D��S)Nr�css|]}t|�VqdS)N)ru)r�r�rwrwrxr��sz+srange.<locals>.<lambda>.<locals>.<genexpr>rrr)rzr"r�r��ord)�prwrwrxry�szsrange.<locals>.<lambda>r�c3s|]}�|�VqdS)Nrw)r��part)�	_expandedrwrxr��szsrange.<locals>.<genexpr>N)r��_reBracketExprr�rxrK)r�rw)r|rxr_rs
 cs�fdd�}|S)zt
    Helper method for defining parse actions that require matching at a specific
    column in the input text.
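    A minimal sketch (not from the original docstring), using pyparsing's
    1-based column numbering::

        # accept a number only if it starts in column 5 of its line
        data = "    1 2 3"
        num_at_col5 = Word(nums).setParseAction(matchOnlyAtCol(5))
        print(num_at_col5.searchString(data))   # -> [['1']] -- the 2 and 3 are skipped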
    cs"t||��krt||d���dS)Nzmatched token not at column %d)r9r)r)Zlocnr1)r�rwrx�	verifyCol�sz!matchOnlyAtCol.<locals>.verifyColrw)r�r~rw)r�rxrM�scs�fdd�S)a�
    Helper method for common parse actions that simply return a literal value.  Especially
    useful when used with C{L{transformString<ParserElement.transformString>}()}.

    Example::
        num = Word(nums).setParseAction(lambda toks: int(toks[0]))
        na = oneOf("N/A NA").setParseAction(replaceWith(math.nan))
        term = na | num
        
        OneOrMore(term).parseString("324 234 N/A 234") # -> [324, 234, nan, 234]
    cs�gS)Nrw)r�r5rv)�replStrrwrxry�szreplaceWith.<locals>.<lambda>rw)rrw)rrxr\�scCs|ddd�S)a
    Helper parse action for removing quotation marks from parsed quoted strings.

    Example::
        # by default, quotation marks are included in parsed results
        quotedString.parseString("'Now is the Winter of our Discontent'") # -> ["'Now is the Winter of our Discontent'"]

        # use removeQuotes to strip quotation marks from parsed results
        quotedString.setParseAction(removeQuotes)
        quotedString.parseString("'Now is the Winter of our Discontent'") # -> ["Now is the Winter of our Discontent"]
    rrrrsrw)r�r5rvrwrwrxrZ�scsN��fdd�}yt�dt�d�j�}Wntk
rBt��}YnX||_|S)aG
    Helper to define a parse action by mapping a function to all elements of a ParseResults list. If any additional
    args are passed, they are forwarded to the given function as additional arguments after
    the token, as in C{hex_integer = Word(hexnums).setParseAction(tokenMap(int, 16))}, which will convert the
    parsed data to an integer using base 16.

    Example (compare the last to the example in L{ParserElement.transformString})::
        hex_ints = OneOrMore(Word(hexnums)).setParseAction(tokenMap(int, 16))
        hex_ints.runTests('''
            00 11 22 aa FF 0a 0d 1a
            ''')
        
        upperword = Word(alphas).setParseAction(tokenMap(str.upper))
        OneOrMore(upperword).runTests('''
            my kingdom for a horse
            ''')

        wd = Word(alphas).setParseAction(tokenMap(str.title))
        OneOrMore(wd).setParseAction(' '.join).runTests('''
            now is the winter of our discontent made glorious summer by this sun of york
            ''')
    prints::
        00 11 22 aa FF 0a 0d 1a
        [0, 17, 34, 170, 255, 10, 13, 26]

        my kingdom for a horse
        ['MY', 'KINGDOM', 'FOR', 'A', 'HORSE']

        now is the winter of our discontent made glorious summer by this sun of york
        ['Now Is The Winter Of Our Discontent Made Glorious Summer By This Sun Of York']
    cs��fdd�|D�S)Ncsg|]}�|f����qSrwrw)r�Ztokn)r�r6rwrxr��sz(tokenMap.<locals>.pa.<locals>.<listcomp>rw)r�r5rv)r�r6rwrxr}�sztokenMap.<locals>.par�rH)rJr�rKr{)r6r�r}rLrw)r�r6rxrm�s cCst|�j�S)N)r�r�)rvrwrwrxry�scCst|�j�S)N)r��lower)rvrwrwrxry�scCs�t|t�r|}t||d�}n|j}tttd�}|r�tj�j	t
�}td�|d�tt
t|td�|���tddgd�jd	�j	d
d��td�}n�d
jdd�tD��}tj�j	t
�t|�B}td�|d�tt
t|j	t�ttd�|����tddgd�jd	�j	dd��td�}ttd�|d�}|jdd
j|jdd�j�j���jd|�}|jdd
j|jdd�j�j���jd|�}||_||_||fS)zRInternal helper to construct opening and closing tag expressions, given a tag name)r�z_-:r�tag�=�/F)r�rCcSs|ddkS)Nrr�rw)r�r5rvrwrwrxry�sz_makeTags.<locals>.<lambda>rr�css|]}|dkr|VqdS)rNrw)r�r�rwrwrxr��sz_makeTags.<locals>.<genexpr>cSs|ddkS)Nrr�rw)r�r5rvrwrwrxry�sz</r��:r�z<%s>rz</%s>)rzr�rr�r/r4r3r>r�r�rZr+rr2rrrmr�rVrYrBr
�_Lr��titler�rir�)�tagStrZxmlZresnameZtagAttrNameZtagAttrValueZopenTagZprintablesLessRAbrackZcloseTagrwrwrx�	_makeTags�s"
T\..r�cCs
t|d�S)a 
    Helper to construct opening and closing tag expressions for HTML, given a tag name. Matches
    tags in either upper or lower case, attributes with namespaces and with quoted or unquoted values.

    Example::
        text = '<td>More info at the <a href="http://pyparsing.wikispaces.com">pyparsing</a> wiki page</td>'
        # makeHTMLTags returns pyparsing expressions for the opening and closing tags as a 2-tuple
        a,a_end = makeHTMLTags("A")
        link_expr = a + SkipTo(a_end)("link_text") + a_end
        
        for link in link_expr.searchString(text):
            # attributes in the <A> tag (like "href" shown here) are also accessible as named results
            print(link.link_text, '->', link.href)
    prints::
        pyparsing -> http://pyparsing.wikispaces.com
    F)r�)r�rwrwrxrK�scCs
t|d�S)z�
    Helper to construct opening and closing tag expressions for XML, given a tag name. Matches
    tags only in the given upper/lower case.

    Example: similar to L{makeHTMLTags}
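    A concrete sketch (the tag and attribute names are made up for illustration)::

        text = '<book id="b1">Parsing Made Easy</book>'
        book, book_end = makeXMLTags("book")
        expr = book + SkipTo(book_end)("contents") + book_end
        for t in expr.searchString(text):
            print(t.id, '->', t.contents)   # -> b1 -> Parsing Made Easy
        # unlike makeHTMLTags, only the exact case given ("book") is matched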
    T)r�)r�rwrwrxrLscs8|r|dd��n|j��dd��D���fdd�}|S)a<
    Helper to create a validating parse action to be used with start tags created
    with C{L{makeXMLTags}} or C{L{makeHTMLTags}}. Use C{withAttribute} to qualify a starting tag
    with a required attribute value, to avoid false matches on common tags such as
    C{<TD>} or C{<DIV>}.

    Call C{withAttribute} with a series of attribute names and values. Specify the list
    of filter attribute names and values as:
     - keyword arguments, as in C{(align="right")}, or
     - as an explicit dict with C{**} operator, when an attribute name is also a Python
          reserved word, as in C{**{"class":"Customer", "align":"right"}}
     - a list of name-value tuples, as in ( ("ns1:class", "Customer"), ("ns2:align","right") )
    For attribute names with a namespace prefix, you must use the second form.  Attribute
    names are matched insensitive to upper/lower case.
       
    If just testing for C{class} (with or without a namespace), use C{L{withClass}}.

    To verify that the attribute exists, but without specifying a value, pass
    C{withAttribute.ANY_VALUE} as the value.

    Example::
        html = '''
            <div>
            Some text
            <div type="grid">1 4 0 1 0</div>
            <div type="graph">1,3 2,3 1,1</div>
            <div>this has no type</div>
            </div>
                
        '''
        div,div_end = makeHTMLTags("div")

        # only match div tag having a type attribute with value "grid"
        div_grid = div().setParseAction(withAttribute(type="grid"))
        grid_expr = div_grid + SkipTo(div | div_end)("body")
        for grid_header in grid_expr.searchString(html):
            print(grid_header.body)
        
        # construct a match with any div tag having a type attribute, regardless of the value
        div_any_type = div().setParseAction(withAttribute(type=withAttribute.ANY_VALUE))
        div_expr = div_any_type + SkipTo(div | div_end)("body")
        for div_header in div_expr.searchString(html):
            print(div_header.body)
    prints::
        1 4 0 1 0

        1 4 0 1 0
        1,3 2,3 1,1
    NcSsg|]\}}||f�qSrwrw)r�r�r�rwrwrxr�Qsz!withAttribute.<locals>.<listcomp>cs^xX�D]P\}}||kr&t||d|��|tjkr|||krt||d||||f��qWdS)Nzno matching attribute z+attribute '%s' has value '%s', must be '%s')rre�	ANY_VALUE)r�r5r�ZattrNameZ	attrValue)�attrsrwrxr}RszwithAttribute.<locals>.pa)r�)r�ZattrDictr}rw)r�rxres2cCs|rd|nd}tf||i�S)a�
    Simplified version of C{L{withAttribute}} when matching on a div class - made
    difficult because C{class} is a reserved word in Python.

    Example::
        html = '''
            <div>
            Some text
            <div class="grid">1 4 0 1 0</div>
            <div class="graph">1,3 2,3 1,1</div>
            <div>this &lt;div&gt; has no class</div>
            </div>
                
        '''
        div,div_end = makeHTMLTags("div")
        div_grid = div().setParseAction(withClass("grid"))
        
        grid_expr = div_grid + SkipTo(div | div_end)("body")
        for grid_header in grid_expr.searchString(html):
            print(grid_header.body)
        
        div_any_type = div().setParseAction(withClass(withAttribute.ANY_VALUE))
        div_expr = div_any_type + SkipTo(div | div_end)("body")
        for div_header in div_expr.searchString(html):
            print(div_header.body)
    prints::
        1 4 0 1 0

        1 4 0 1 0
        1,3 2,3 1,1
    z%s:class�class)re)Z	classname�	namespaceZ	classattrrwrwrxrk\s �(rcCs�t�}||||B}�x`t|�D�]R\}}|ddd�\}}	}
}|	dkrTd|nd|}|	dkr�|dksxt|�dkr�td��|\}
}t�j|�}|
tjk�rd|	dkr�t||�t|t	|��}n�|	dk�r|dk	�rt|||�t|t	||��}nt||�t|t	|��}nD|	dk�rZt||
|||�t||
|||�}ntd	��n�|
tj
k�rH|	dk�r�t|t��s�t|�}t|j
|�t||�}n�|	dk�r|dk	�r�t|||�t|t	||��}nt||�t|t	|��}nD|	dk�r>t||
|||�t||
|||�}ntd	��ntd
��|�r`|j|�||j|�|BK}|}q"W||K}|S)a�	
    Helper method for constructing grammars of expressions made up of
    operators working in a precedence hierarchy.  Operators may be unary or
    binary, left- or right-associative.  Parse actions can also be attached
    to operator expressions. The generated parser will also recognize the use 
    of parentheses to override operator precedences (see example below).
    
    Note: if you define a deep operator list, you may see performance issues
    when using infixNotation. See L{ParserElement.enablePackrat} for a
    mechanism to potentially improve your parser performance.

    Parameters:
     - baseExpr - expression representing the most basic element for the nested
     - opList - list of tuples, one for each operator precedence level in the
      expression grammar; each tuple is of the form
      (opExpr, numTerms, rightLeftAssoc, parseAction), where:
       - opExpr is the pyparsing expression for the operator;
          may also be a string, which will be converted to a Literal;
          if numTerms is 3, opExpr is a tuple of two expressions, for the
          two operators separating the 3 terms
       - numTerms is the number of terms for this operator (must
          be 1, 2, or 3)
       - rightLeftAssoc is the indicator whether the operator is
          right or left associative, using the pyparsing-defined
          constants C{opAssoc.RIGHT} and C{opAssoc.LEFT}.
       - parseAction is the parse action to be associated with
          expressions matching this operator expression (the
          parse action tuple member may be omitted)
     - lpar - expression for matching left-parentheses (default=C{Suppress('(')})
     - rpar - expression for matching right-parentheses (default=C{Suppress(')')})

    Example::
        # simple example of four-function arithmetic with ints and variable names
        integer = pyparsing_common.signed_integer
        varname = pyparsing_common.identifier 
        
        arith_expr = infixNotation(integer | varname,
            [
            ('-', 1, opAssoc.RIGHT),
            (oneOf('* /'), 2, opAssoc.LEFT),
            (oneOf('+ -'), 2, opAssoc.LEFT),
            ])
        
        arith_expr.runTests('''
            5+3*6
            (5+3)*6
            -2--11
            ''', fullDump=False)
    prints::
        5+3*6
        [[5, '+', [3, '*', 6]]]

        (5+3)*6
        [[[5, '+', 3], '*', 6]]

        -2--11
        [[['-', 2], '-', ['-', 11]]]
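    The optional parse action member of each operator tuple can reduce terms as
    they are parsed; a rough sketch (the helper below is illustrative, not part
    of the original docstring)::

        integer = Word(nums).setParseAction(lambda t: int(t[0]))

        def multiply(tokens):
            # tokens[0] holds the grouped [operand, '*', operand, ...] list
            t = tokens[0]
            product = t[0]
            for factor in t[2::2]:
                product *= factor
            return product

        arith = infixNotation(integer,
            [
            ('*', 2, opAssoc.LEFT, multiply),
            ('+', 2, opAssoc.LEFT),
            ])
        print(arith.parseString("2*3+4"))   # -> [[6, '+', 4]]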
    Nrroz%s termz	%s%s termrqz@if numterms=3, opExpr must be a tuple or list of two expressionsrrz6operator must be unary (1), binary (2), or ternary (3)z2operator must indicate right or left associativity)N)rr�r�r�rirT�LEFTrrr�RIGHTrzrr.r�)ZbaseExprZopListZlparZrparr�ZlastExprr�ZoperDefZopExprZarityZrightLeftAssocr}ZtermNameZopExpr1ZopExpr2ZthisExprrsrwrwrxri�sR;

&




&


z4"(?:[^"\n\r\\]|(?:"")|(?:\\(?:[^x]|x[0-9a-fA-F]+)))*�"z string enclosed in double quotesz4'(?:[^'\n\r\\]|(?:'')|(?:\\(?:[^x]|x[0-9a-fA-F]+)))*�'z string enclosed in single quotesz*quotedString using single or double quotes�uzunicode string literalcCs�||krtd��|dk�r(t|t�o,t|t��r t|�dkr�t|�dkr�|dk	r�tt|t||tjdd���j	dd��}n$t
j�t||tj�j	dd��}nx|dk	r�tt|t|�t|�ttjdd���j	dd��}n4ttt|�t|�ttjdd���j	d	d��}ntd
��t
�}|dk	�rb|tt|�t||B|B�t|��K}n$|tt|�t||B�t|��K}|jd||f�|S)a~	
    Helper method for defining nested lists enclosed in opening and closing
    delimiters ("(" and ")" are the default).

    Parameters:
     - opener - opening character for a nested list (default=C{"("}); can also be a pyparsing expression
     - closer - closing character for a nested list (default=C{")"}); can also be a pyparsing expression
     - content - expression for items within the nested lists (default=C{None})
     - ignoreExpr - expression for ignoring opening and closing delimiters (default=C{quotedString})

    If an expression is not provided for the content argument, the nested
    expression will capture all whitespace-delimited content between delimiters
    as a list of separate values.

    Use the C{ignoreExpr} argument to define expressions that may contain
    opening or closing characters that should not be treated as opening
    or closing characters for nesting, such as quotedString or a comment
    expression.  Specify multiple expressions using an C{L{Or}} or C{L{MatchFirst}}.
    The default is L{quotedString}, but if no expressions are to be ignored,
    then pass C{None} for this argument.

    Example::
        data_type = oneOf("void int short long char float double")
        decl_data_type = Combine(data_type + Optional(Word('*')))
        ident = Word(alphas+'_', alphanums+'_')
        number = pyparsing_common.number
        arg = Group(decl_data_type + ident)
        LPAR,RPAR = map(Suppress, "()")

        code_body = nestedExpr('{', '}', ignoreExpr=(quotedString | cStyleComment))

        c_function = (decl_data_type("type") 
                      + ident("name")
                      + LPAR + Optional(delimitedList(arg), [])("args") + RPAR 
                      + code_body("body"))
        c_function.ignore(cStyleComment)
        
        source_code = '''
            int is_odd(int x) { 
                return (x%2); 
            }
                
            int dec_to_hex(char hchar) { 
                if (hchar >= '0' && hchar <= '9') { 
                    return (ord(hchar)-ord('0')); 
                } else { 
                    return (10+ord(hchar)-ord('A'));
                } 
            }
        '''
        for func in c_function.searchString(source_code):
            print("%(name)s (%(type)s) args: %(args)s" % func)

    prints::
        is_odd (int) args: [['int', 'x']]
        dec_to_hex (int) args: [['char', 'hchar']]
    z.opening and closing strings cannot be the sameNrr)r
cSs|dj�S)Nr)r�)rvrwrwrxry9sznestedExpr.<locals>.<lambda>cSs|dj�S)Nr)r�)rvrwrwrxry<scSs|dj�S)Nr)r�)rvrwrwrxryBscSs|dj�S)Nr)r�)rvrwrwrxryFszOopening and closing arguments must be strings if no content expression is givenznested %s%s expression)r�rzr�r�r
rr	r$rNr�rCr�rrrr+r2ri)�openerZcloserZcontentrOr�rwrwrxrP�s4:

*$cs��fdd�}�fdd�}�fdd�}tt�jd�j��}t�t�j|�jd�}t�j|�jd	�}t�j|�jd
�}	|r�tt|�|t|t|�t|��|	�}
n$tt|�t|t|�t|���}
|j	t
t��|
jd�S)a
	
    Helper method for defining space-delimited indentation blocks, such as
    those used to define block statements in Python source code.

    Parameters:
     - blockStatementExpr - expression defining syntax of statement that
            is repeated within the indented block
     - indentStack - list created by caller to manage indentation stack
            (multiple statementWithIndentedBlock expressions within a single grammar
            should share a common indentStack)
     - indent - boolean indicating whether block must be indented beyond
            the current level; set to False for block of left-most statements
            (default=C{True})

    A valid block must contain at least one C{blockStatement}.

    Example::
        data = '''
        def A(z):
          A1
          B = 100
          G = A2
          A2
          A3
        B
        def BB(a,b,c):
          BB1
          def BBA():
            bba1
            bba2
            bba3
        C
        D
        def spam(x,y):
             def eggs(z):
                 pass
        '''


        indentStack = [1]
        stmt = Forward()

        identifier = Word(alphas, alphanums)
        funcDecl = ("def" + identifier + Group( "(" + Optional( delimitedList(identifier) ) + ")" ) + ":")
        func_body = indentedBlock(stmt, indentStack)
        funcDef = Group( funcDecl + func_body )

        rvalue = Forward()
        funcCall = Group(identifier + "(" + Optional(delimitedList(rvalue)) + ")")
        rvalue << (funcCall | identifier | Word(nums))
        assignment = Group(identifier + "=" + rvalue)
        stmt << ( funcDef | assignment | identifier )

        module_body = OneOrMore(stmt)

        parseTree = module_body.parseString(data)
        parseTree.pprint()
    prints::
        [['def',
          'A',
          ['(', 'z', ')'],
          ':',
          [['A1'], [['B', '=', '100']], [['G', '=', 'A2']], ['A2'], ['A3']]],
         'B',
         ['def',
          'BB',
          ['(', 'a', 'b', 'c', ')'],
          ':',
          [['BB1'], [['def', 'BBA', ['(', ')'], ':', [['bba1'], ['bba2'], ['bba3']]]]]],
         'C',
         'D',
         ['def',
          'spam',
          ['(', 'x', 'y', ')'],
          ':',
          [[['def', 'eggs', ['(', 'z', ')'], ':', [['pass']]]]]]] 
    csN|t|�krdSt||�}|�dkrJ|�dkr>t||d��t||d��dS)Nrrzillegal nestingznot a peer entryrsrs)r�r9r!r)r�r5rv�curCol)�indentStackrwrx�checkPeerIndent�s
z&indentedBlock.<locals>.checkPeerIndentcs2t||�}|�dkr"�j|�nt||d��dS)Nrrznot a subentryrs)r9r�r)r�r5rvr�)r�rwrx�checkSubIndent�s
z%indentedBlock.<locals>.checkSubIndentcsN|t|�krdSt||�}�o4|�dko4|�dksBt||d���j�dS)Nrrrqznot an unindentrsr:)r�r9rr�)r�r5rvr�)r�rwrx�
checkUnindent�s
z$indentedBlock.<locals>.checkUnindentz	 �INDENTr�ZUNINDENTzindented block)rrr�r�r
r�rirrr�rk)ZblockStatementExprr�rr�r�r�r!r�ZPEERZUNDENTZsmExprrw)r�rxrfQsN,z#[\0xc0-\0xd6\0xd8-\0xf6\0xf8-\0xff]z[\0xa1-\0xbf\0xd7\0xf7]z_:zany tagzgt lt amp nbsp quot aposz><& "'z&(?P<entity>rnz);zcommon HTML entitycCstj|j�S)zRHelper parser action to replace common HTML entities with their special characters)�_htmlEntityMapr�Zentity)rvrwrwrxr[�sz/\*(?:[^*]|\*(?!/))*z*/zC style commentz<!--[\s\S]*?-->zHTML commentz.*zrest of linez//(?:\\\n|[^\n])*z
// commentzC++ style commentz#.*zPython style comment)r�z 	�	commaItem)r�c@s�eZdZdZee�Zee�Ze	e
�jd�je�Z
e	e�jd�jeed��Zed�jd�je�Ze�je�de�je�jd�Zejd	d
��eeeed�j�e�Bjd�Zeje�ed
�jd�je�Zed�jd�je�ZeeBeBj�Zed�jd�je�Ze	eded�jd�Zed�jd�Z ed�jd�Z!e!de!djd�Z"ee!de!d>�dee!de!d?�jd�Z#e#j$d d
��d!e jd"�Z%e&e"e%Be#Bjd#��jd#�Z'ed$�jd%�Z(e)d@d'd(��Z*e)dAd*d+��Z+ed,�jd-�Z,ed.�jd/�Z-ed0�jd1�Z.e/j�e0j�BZ1e)d2d3��Z2e&e3e4d4�e5�e	e6d4d5�ee7d6����j�jd7�Z8e9ee:j;�e8Bd8d9��jd:�Z<e)ed;d
���Z=e)ed<d
���Z>d=S)Brna�

    Here are some common low-level expressions that may be useful in jump-starting parser development:
     - numeric forms (L{integers<integer>}, L{reals<real>}, L{scientific notation<sci_real>})
     - common L{programming identifiers<identifier>}
     - network addresses (L{MAC<mac_address>}, L{IPv4<ipv4_address>}, L{IPv6<ipv6_address>})
     - ISO8601 L{dates<iso8601_date>} and L{datetime<iso8601_datetime>}
     - L{UUID<uuid>}
     - L{comma-separated list<comma_separated_list>}
    Parse actions:
     - C{L{convertToInteger}}
     - C{L{convertToFloat}}
     - C{L{convertToDate}}
     - C{L{convertToDatetime}}
     - C{L{stripHTMLTags}}
     - C{L{upcaseTokens}}
     - C{L{downcaseTokens}}

    Example::
        pyparsing_common.number.runTests('''
            # any int or real number, returned as the appropriate type
            100
            -100
            +100
            3.14159
            6.02e23
            1e-12
            ''')

        pyparsing_common.fnumber.runTests('''
            # any int or real number, returned as float
            100
            -100
            +100
            3.14159
            6.02e23
            1e-12
            ''')

        pyparsing_common.hex_integer.runTests('''
            # hex numbers
            100
            FF
            ''')

        pyparsing_common.fraction.runTests('''
            # fractions
            1/2
            -3/4
            ''')

        pyparsing_common.mixed_integer.runTests('''
            # mixed fractions
            1
            1/2
            -3/4
            1-3/4
            ''')

        import uuid
        pyparsing_common.uuid.setParseAction(tokenMap(uuid.UUID))
        pyparsing_common.uuid.runTests('''
            # uuid
            12345678-1234-5678-1234-567812345678
            ''')
    prints::
        # any int or real number, returned as the appropriate type
        100
        [100]

        -100
        [-100]

        +100
        [100]

        3.14159
        [3.14159]

        6.02e23
        [6.02e+23]

        1e-12
        [1e-12]

        # any int or real number, returned as float
        100
        [100.0]

        -100
        [-100.0]

        +100
        [100.0]

        3.14159
        [3.14159]

        6.02e23
        [6.02e+23]

        1e-12
        [1e-12]

        # hex numbers
        100
        [256]

        FF
        [255]

        # fractions
        1/2
        [0.5]

        -3/4
        [-0.75]

        # mixed fractions
        1
        [1]

        1/2
        [0.5]

        -3/4
        [-0.75]

        1-3/4
        [1.75]

        # uuid
        12345678-1234-5678-1234-567812345678
        [UUID('12345678-1234-5678-1234-567812345678')]
    �integerzhex integerrtz[+-]?\d+zsigned integerr��fractioncCs|d|dS)Nrrrrsrw)rvrwrwrxry�szpyparsing_common.<lambda>r8z"fraction or mixed integer-fractionz
[+-]?\d+\.\d*zreal numberz+[+-]?\d+([eE][+-]?\d+|\.\d*([eE][+-]?\d+)?)z$real number with scientific notationz[+-]?\d+\.?\d*([eE][+-]?\d+)?�fnumberrB�
identifierzK(25[0-5]|2[0-4][0-9]|1?[0-9]{1,2})(\.(25[0-5]|2[0-4][0-9]|1?[0-9]{1,2})){3}zIPv4 addressz[0-9a-fA-F]{1,4}�hex_integerr��zfull IPv6 addressrrBz::zshort IPv6 addresscCstdd�|D��dkS)Ncss|]}tjj|�rdVqdS)rrN)rn�
_ipv6_partr�)r�rfrwrwrxr��sz,pyparsing_common.<lambda>.<locals>.<genexpr>rw)rH)rvrwrwrxry�sz::ffff:zmixed IPv6 addresszIPv6 addressz:[0-9a-fA-F]{2}([:.-])[0-9a-fA-F]{2}(?:\1[0-9a-fA-F]{2}){4}zMAC address�%Y-%m-%dcs�fdd�}|S)a�
        Helper to create a parse action for converting parsed date string to Python datetime.date

        Params -
         - fmt - format to be passed to datetime.strptime (default=C{"%Y-%m-%d"})

        Example::
            date_expr = pyparsing_common.iso8601_date.copy()
            date_expr.setParseAction(pyparsing_common.convertToDate())
            print(date_expr.parseString("1999-12-31"))
        prints::
            [datetime.date(1999, 12, 31)]
        csLytj|d��j�Stk
rF}zt||t|���WYdd}~XnXdS)Nr)r�strptimeZdater�rr{)r�r5rv�ve)�fmtrwrx�cvt_fn�sz.pyparsing_common.convertToDate.<locals>.cvt_fnrw)r�r�rw)r�rx�
convertToDate�szpyparsing_common.convertToDate�%Y-%m-%dT%H:%M:%S.%fcs�fdd�}|S)a
        Helper to create a parse action for converting parsed datetime string to Python datetime.datetime

        Params -
         - fmt - format to be passed to datetime.strptime (default=C{"%Y-%m-%dT%H:%M:%S.%f"})

        Example::
            dt_expr = pyparsing_common.iso8601_datetime.copy()
            dt_expr.setParseAction(pyparsing_common.convertToDatetime())
            print(dt_expr.parseString("1999-12-31T23:59:59.999"))
        prints::
            [datetime.datetime(1999, 12, 31, 23, 59, 59, 999000)]
        csHytj|d��Stk
rB}zt||t|���WYdd}~XnXdS)Nr)rr�r�rr{)r�r5rvr�)r�rwrxr��sz2pyparsing_common.convertToDatetime.<locals>.cvt_fnrw)r�r�rw)r�rx�convertToDatetime�sz"pyparsing_common.convertToDatetimez7(?P<year>\d{4})(?:-(?P<month>\d\d)(?:-(?P<day>\d\d))?)?zISO8601 datez�(?P<year>\d{4})-(?P<month>\d\d)-(?P<day>\d\d)[T ](?P<hour>\d\d):(?P<minute>\d\d)(:(?P<second>\d\d(\.\d*)?)?)?(?P<tz>Z|[+-]\d\d:?\d\d)?zISO8601 datetimez2[0-9a-fA-F]{8}(-[0-9a-fA-F]{4}){3}-[0-9a-fA-F]{12}�UUIDcCstjj|d�S)a
        Parse action to remove HTML tags from web page HTML source

        Example::
            # strip HTML links from normal text 
            text = '<td>More info at the <a href="http://pyparsing.wikispaces.com">pyparsing</a> wiki page</td>'
            td,td_end = makeHTMLTags("TD")
            table_text = td + SkipTo(td_end).setParseAction(pyparsing_common.stripHTMLTags)("body") + td_end
            
            print(table_text.parseString(text).body) # -> 'More info at the pyparsing wiki page'
        r)rn�_html_stripperr�)r�r5r�rwrwrx�
stripHTMLTags�s
zpyparsing_common.stripHTMLTagsra)r�z 	r�r�)r�zcomma separated listcCst|�j�S)N)r�r�)rvrwrwrxry�scCst|�j�S)N)r�r�)rvrwrwrxry�sN)rrB)rrB)r�)r�)?r�r�r�r�rmruZconvertToInteger�floatZconvertToFloatr/rRrir�r�rDr�r'Zsigned_integerr�rxrr�Z
mixed_integerrH�realZsci_realr��numberr�r4r3r�Zipv4_addressr�Z_full_ipv6_addressZ_short_ipv6_addressr~Z_mixed_ipv6_addressr
Zipv6_addressZmac_addressr�r�r�Ziso8601_dateZiso8601_datetime�uuidr7r6r�r�rrrrVr.�
_commasepitemr@rYr�Zcomma_separated_listrdrBrwrwrwrxrn�sN""
28�__main__Zselect�fromz_$r])rb�columnsrjZtablesZcommandaK
        # '*' as column list and dotted table name
        select * from SYS.XYZZY

        # caseless match on "SELECT", and casts back to "select"
        SELECT * from XYZZY, ABC

        # list of column names, and mixed case SELECT keyword
        Select AA,BB,CC from Sys.dual

        # multiple tables
        Select A, B, C from Sys.dual, Table2

        # invalid SELECT keyword - should fail
        Xelect A, B, C from Sys.dual

        # incomplete command - should fail
        Select

        # invalid column name - should fail
        Select ^^^ frox Sys.dual

        z]
        100
        -100
        +100
        3.14159
        6.02e23
        1e-12
        z 
        100
        FF
        z6
        12345678-1234-5678-1234-567812345678
        )rq)raF)N)FT)T)r�)T)�r��__version__Z__versionTime__�
__author__r��weakrefrr�r�r~r�rdrr�r"r<r�r�_threadr�ImportErrorZ	threadingrr�Zordereddict�__all__r��version_infor;r�maxsizer�r{r��chrrur�rHr�r�reversedr�r�rr6rrrIZmaxintZxranger�Z__builtin__r�Zfnamer�rJr�r�r�r�r�r�Zascii_uppercaseZascii_lowercaser4rRrDr3rkr�Z	printablerVrKrrr!r#r&r�r"�MutableMapping�registerr9rJrGr/r2r4rQrMr$r,r
rrr�rQrrrrlr/r'r%r	r.r0rrrr*r)r1r0r rrrrrrrrJrr2rMrNrr(rrVr-r
rrr+rrbr@r<r�rOrNrrSrArgrhrjrirCrIrHrar`r�Z_escapedPuncZ_escapedHexCharZ_escapedOctChar�UNICODEZ_singleCharZ
_charRangermr}r_rMr\rZrmrdrBr�rKrLrer�rkrTr�r�rirUr>r^rYrcrPrfr5rWr7r6r�r�r�r�r;r[r8rEr�r]r?r=rFrXr�r�r:rnr�ZselectTokenZ	fromTokenZidentZ
columnNameZcolumnNameListZ
columnSpecZ	tableNameZ
tableNameListZ	simpleSQLr�r�r�r�r�r�rwrwrwrx�<module>=s�









8


@d&A= I
G3pLOD|M &#@sQ,A,	I#%&0
,	?#kZr

 (
 0 


"site-packages/setuptools/_vendor/__pycache__/six.cpython-36.pyc000064400000057532147511334630020625 0ustar003

��f�u�I@srdZddlmZddlZddlZddlZddlZddlZdZdZ	ej
ddkZej
ddkZej
dd��dzkZ
er�efZefZefZeZeZejZn�efZeefZeejfZeZeZejjd	�r�e�d|�ZnLGdd
�d
e�Z ye!e ��Wn e"k
�re�d~�ZYnXe�d��Z[ dd�Z#dd�Z$Gdd�de�Z%Gdd�de%�Z&Gdd�dej'�Z(Gdd�de%�Z)Gdd�de�Z*e*e+�Z,Gdd�de(�Z-e)ddd d!�e)d"d#d$d%d"�e)d&d#d#d'd&�e)d(d)d$d*d(�e)d+d)d,�e)d-d#d$d.d-�e)d/d0d0d1d/�e)d2d0d0d/d2�e)d3d)d$d4d3�e)d5d)e
�rd6nd7d8�e)d9d)d:�e)d;d<d=d>�e)d!d!d �e)d?d?d@�e)dAdAd@�e)dBdBd@�e)d4d)d$d4d3�e)dCd#d$dDdC�e)dEd#d#dFdE�e&d$d)�e&dGdH�e&dIdJ�e&dKdLdM�e&dNdOdN�e&dPdQdR�e&dSdTdU�e&dVdWdX�e&dYdZd[�e&d\d]d^�e&d_d`da�e&dbdcdd�e&dedfdg�e&dhdidj�e&dkdkdl�e&dmdmdl�e&dndndl�e&dododp�e&dqdr�e&dsdt�e&dudv�e&dwdxdw�e&dydz�e&d{d|d}�e&d~dd��e&d�d�d��e&d�d�d��e&d�d�d��e&d�d�d��e&d�d�d��e&d�d�d��e&d�d�d��e&d�d�d��e&d�d�d��e&d�d�d��e&d�d�d��e&d�d�d��e&d�e+d�d��e&d�e+d�d��e&d�e+d�e+d��e&d�d�d��e&d�d�d��e&d�d�d��g>Z.ejd�k�rZe.e&d�d��g7Z.x:e.D]2Z/e0e-e/j1e/�e2e/e&��r`e,j3e/d�e/j1��q`W[/e.e-_.e-e+d��Z4e,j3e4d��Gd�d��d�e(�Z5e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d>d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��gZ6xe6D]Z/e0e5e/j1e/��q�W[/e6e5_.e,j3e5e+d��d�dӃGd�dՄd�e(�Z7e)d�d�d��e)d�d�d��e)d�d�d��gZ8xe8D]Z/e0e7e/j1e/��q$W[/e8e7_.e,j3e7e+d��d�d܃Gd�dބd�e(�Z9e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)�dd�d�g!Z:xe:D]Z/e0e9e/j1e/��q�W[/e:e9_.e,j3e9e+�d��d�d�G�d�d��de(�Z;e)�dd��d�e)�dd��d�e)�d	d��d�e)�d
d��d�gZ<xe<D]Z/e0e;e/j1e/��qTW[/e<e;_.e,j3e;e+�d��d�d
�G�d�d��de(�Z=e)�dd�d��gZ>xe>D]Z/e0e=e/j1e/��q�W[/e>e=_.e,j3e=e+�d��d�d�G�d�d��dej'�Z?e,j3e?e+d���d��d�d�Z@�d�d�ZAe�	rj�dZB�dZC�dZD�dZE�dZF�d ZGn$�d!ZB�d"ZC�d#ZD�d$ZE�d%ZF�d&ZGyeHZIWn"eJk
�	r��d'�d(�ZIYnXeIZHyeKZKWn"eJk
�	r��d)�d*�ZKYnXe�
r�d+�d,�ZLejMZN�d-�d.�ZOeZPn>�d/�d,�ZL�d0�d1�ZN�d2�d.�ZOG�d3�d4��d4e�ZPeKZKe#eL�d5�ejQeB�ZRejQeC�ZSejQeD�ZTejQeE�ZUejQeF�ZVejQeG�ZWe�
r��d6�d7�ZX�d8�d9�ZY�d:�d;�ZZ�d<�d=�Z[ej\�d>�Z]ej\�d?�Z^ej\�d@�Z_nT�dA�d7�ZX�dB�d9�ZY�dC�d;�ZZ�dD�d=�Z[ej\�dE�Z]ej\�dF�Z^ej\�dG�Z_e#eX�dH�e#eY�dI�e#eZ�dJ�e#e[�dK�e�r�dL�dM�Z`�dN�dO�ZaebZcddldZdedje�dP�jfZg[dejhd�ZiejjZkelZmddlnZnenjoZoenjpZp�dQZqej
d
d
k�r�dRZr�dSZsn�dTZr�dUZsnj�dV�dM�Z`�dW�dO�ZaecZcebZg�dX�dY�Zi�dZ�d[�Zkejtejuev�ZmddloZoeojoZoZp�d\Zq�dRZr�dSZse#e`�d]�e#ea�d^��d_�dQ�Zw�d`�dT�Zx�da�dU�Zye�r�eze4j{�db�Z|�d��dc�dd�Z}n�d��de�df�Z|e|�dg�ej
dd��d�k�
re|�dh�n.ej
dd��d�k�
r8e|�di�n�dj�dk�Z~eze4j{�dld�Zedk�
rj�dm�dn�Zej
dd��d�k�
r�eZ��do�dn�Ze#e}�dp�ej
dd��d�k�
r�ej�ej�f�dq�dr�Z�nej�Z��ds�dt�Z��du�dv�Z��dw�dx�Z�gZ�e+Z�e��j��dy�dk	�rge�_�ej��rbx>e�ej��D]0\Z�Z�ee��j+dk�r*e�j1e+k�r*ej�e�=P�q*W[�[�ej�j�e,�dS(�z6Utilities for writing code that runs on Python 2 and 3�)�absolute_importNz'Benjamin Peterson <benjamin@python.org>z1.10.0����java��c@seZdZdd�ZdS)�XcCsdS)Nrrl�)�selfr
r
�/usr/lib/python3.6/six.py�__len__>sz	X.__len__N)�__name__�
__module__�__qualname__r
r
r
r
rr	<sr	�?cCs
||_dS)z Add documentation to a function.N)�__doc__)�func�docr
r
r�_add_docKsrcCst|�tj|S)z7Import module, returning the module after the last dot.)�
__import__�sys�modules)�namer
r
r�_import_modulePsrc@seZdZdd�Zdd�ZdS)�
_LazyDescrcCs
||_dS)N)r)rrr
r
r�__init__Xsz_LazyDescr.__init__cCsB|j�}t||j|�yt|j|j�Wntk
r<YnX|S)N)�_resolve�setattrr�delattr�	__class__�AttributeError)r�obj�tp�resultr
r
r�__get__[sz_LazyDescr.__get__N)rrrrr%r
r
r
rrVsrcs.eZdZd�fdd�	Zdd�Zdd�Z�ZS)	�MovedModuleNcs2tt|�j|�tr(|dkr |}||_n||_dS)N)�superr&r�PY3�mod)rr�old�new)r r
rriszMovedModule.__init__cCs
t|j�S)N)rr))rr
r
rrrszMovedModule._resolvecCs"|j�}t||�}t|||�|S)N)r�getattrr)r�attr�_module�valuer
r
r�__getattr__us
zMovedModule.__getattr__)N)rrrrrr0�
__classcell__r
r
)r rr&gs	r&cs(eZdZ�fdd�Zdd�ZgZ�ZS)�_LazyModulecstt|�j|�|jj|_dS)N)r'r2rr r)rr)r r
rr~sz_LazyModule.__init__cCs ddg}|dd�|jD�7}|S)NrrcSsg|]
}|j�qSr
)r)�.0r-r
r
r�
<listcomp>�sz'_LazyModule.__dir__.<locals>.<listcomp>)�_moved_attributes)rZattrsr
r
r�__dir__�sz_LazyModule.__dir__)rrrrr6r5r1r
r
)r rr2|sr2cs&eZdZd�fdd�	Zdd�Z�ZS)�MovedAttributeNcsdtt|�j|�trH|dkr |}||_|dkr@|dkr<|}n|}||_n||_|dkrZ|}||_dS)N)r'r7rr(r)r-)rrZold_modZnew_modZold_attrZnew_attr)r r
rr�szMovedAttribute.__init__cCst|j�}t||j�S)N)rr)r,r-)r�moduler
r
rr�s
zMovedAttribute._resolve)NN)rrrrrr1r
r
)r rr7�sr7c@sVeZdZdZdd�Zdd�Zdd�Zdd	d
�Zdd�Zd
d�Z	dd�Z
dd�ZeZdS)�_SixMetaPathImporterz�
    A meta path importer to import six.moves and its submodules.

    This class implements a PEP302 finder and loader. It should be compatible
    with Python 2.5 and all existing versions of Python3
    cCs||_i|_dS)N)r�
known_modules)rZsix_module_namer
r
rr�sz_SixMetaPathImporter.__init__cGs&x |D]}||j|jd|<qWdS)N�.)r:r)rr)Z	fullnames�fullnamer
r
r�_add_module�s
z _SixMetaPathImporter._add_modulecCs|j|jd|S)Nr;)r:r)rr<r
r
r�_get_module�sz _SixMetaPathImporter._get_moduleNcCs||jkr|SdS)N)r:)rr<�pathr
r
r�find_module�s
z _SixMetaPathImporter.find_modulecCs0y
|j|Stk
r*td|��YnXdS)Nz!This loader does not know module )r:�KeyError�ImportError)rr<r
r
rZ__get_module�s
z!_SixMetaPathImporter.__get_modulecCsRy
tj|Stk
rYnX|j|�}t|t�r>|j�}n||_|tj|<|S)N)rrrA� _SixMetaPathImporter__get_module�
isinstancer&r�
__loader__)rr<r)r
r
r�load_module�s




z _SixMetaPathImporter.load_modulecCst|j|�d�S)z�
        Return true, if the named module is a package.

        We need this method to get correct spec objects with
        Python 3.4 (see PEP451)
        �__path__)�hasattrrC)rr<r
r
r�
is_package�sz_SixMetaPathImporter.is_packagecCs|j|�dS)z;Return None

        Required, if is_package is implementedN)rC)rr<r
r
r�get_code�s
z_SixMetaPathImporter.get_code)N)
rrrrrr=r>r@rCrFrIrJ�
get_sourcer
r
r
rr9�s
	r9c@seZdZdZgZdS)�_MovedItemszLazy loading of moved objectsN)rrrrrGr
r
r
rrL�srLZ	cStringIO�io�StringIO�filter�	itertools�builtinsZifilter�filterfalseZifilterfalse�inputZ__builtin__Z	raw_input�internr�map�imap�getcwd�osZgetcwdu�getcwdb�rangeZxrangeZ
reload_module�	importlibZimp�reload�reduce�	functoolsZshlex_quoteZpipesZshlexZquote�UserDict�collections�UserList�
UserString�zipZizip�zip_longestZizip_longestZconfigparserZConfigParser�copyregZcopy_regZdbm_gnuZgdbmzdbm.gnuZ
_dummy_threadZdummy_threadZhttp_cookiejarZ	cookielibzhttp.cookiejarZhttp_cookiesZCookiezhttp.cookiesZ
html_entitiesZhtmlentitydefsz
html.entitiesZhtml_parserZ
HTMLParserzhtml.parserZhttp_clientZhttplibzhttp.clientZemail_mime_multipartzemail.MIMEMultipartzemail.mime.multipartZemail_mime_nonmultipartzemail.MIMENonMultipartzemail.mime.nonmultipartZemail_mime_textzemail.MIMETextzemail.mime.textZemail_mime_basezemail.MIMEBasezemail.mime.baseZBaseHTTPServerzhttp.serverZ
CGIHTTPServerZSimpleHTTPServerZcPickle�pickleZqueueZQueue�reprlib�reprZsocketserverZSocketServer�_threadZthreadZtkinterZTkinterZtkinter_dialogZDialogztkinter.dialogZtkinter_filedialogZ
FileDialogztkinter.filedialogZtkinter_scrolledtextZScrolledTextztkinter.scrolledtextZtkinter_simpledialogZSimpleDialogztkinter.simpledialogZtkinter_tixZTixztkinter.tixZtkinter_ttkZttkztkinter.ttkZtkinter_constantsZTkconstantsztkinter.constantsZtkinter_dndZTkdndztkinter.dndZtkinter_colorchooserZtkColorChooserztkinter.colorchooserZtkinter_commondialogZtkCommonDialogztkinter.commondialogZtkinter_tkfiledialogZtkFileDialogZtkinter_fontZtkFontztkinter.fontZtkinter_messageboxZtkMessageBoxztkinter.messageboxZtkinter_tksimpledialogZtkSimpleDialogZurllib_parsez.moves.urllib_parsezurllib.parseZurllib_errorz.moves.urllib_errorzurllib.errorZurllibz
.moves.urllibZurllib_robotparser�robotparserzurllib.robotparserZ
xmlrpc_clientZ	xmlrpclibz
xmlrpc.clientZ
xmlrpc_serverZSimpleXMLRPCServerz
xmlrpc.serverZwin32�winreg�_winregzmoves.z.moves�movesc@seZdZdZdS)�Module_six_moves_urllib_parsez7Lazy loading of moved objects in six.moves.urllib_parseN)rrrrr
r
r
rrn@srnZParseResultZurlparseZSplitResultZparse_qsZ	parse_qslZ	urldefragZurljoinZurlsplitZ
urlunparseZ
urlunsplitZ
quote_plusZunquoteZunquote_plusZ	urlencodeZ
splitqueryZsplittagZ	splituserZ
uses_fragmentZuses_netlocZuses_paramsZ
uses_queryZ
uses_relativezmoves.urllib_parsezmoves.urllib.parsec@seZdZdZdS)�Module_six_moves_urllib_errorz7Lazy loading of moved objects in six.moves.urllib_errorN)rrrrr
r
r
rrohsroZURLErrorZurllib2Z	HTTPErrorZContentTooShortErrorz.moves.urllib.errorzmoves.urllib_errorzmoves.urllib.errorc@seZdZdZdS)�Module_six_moves_urllib_requestz9Lazy loading of moved objects in six.moves.urllib_requestN)rrrrr
r
r
rrp|srpZurlopenzurllib.requestZinstall_openerZbuild_openerZpathname2urlZurl2pathnameZ
getproxiesZRequestZOpenerDirectorZHTTPDefaultErrorHandlerZHTTPRedirectHandlerZHTTPCookieProcessorZProxyHandlerZBaseHandlerZHTTPPasswordMgrZHTTPPasswordMgrWithDefaultRealmZAbstractBasicAuthHandlerZHTTPBasicAuthHandlerZProxyBasicAuthHandlerZAbstractDigestAuthHandlerZHTTPDigestAuthHandlerZProxyDigestAuthHandlerZHTTPHandlerZHTTPSHandlerZFileHandlerZ
FTPHandlerZCacheFTPHandlerZUnknownHandlerZHTTPErrorProcessorZurlretrieveZ
urlcleanupZ	URLopenerZFancyURLopenerZproxy_bypassz.moves.urllib.requestzmoves.urllib_requestzmoves.urllib.requestc@seZdZdZdS)� Module_six_moves_urllib_responsez:Lazy loading of moved objects in six.moves.urllib_responseN)rrrrr
r
r
rrq�srqZaddbasezurllib.responseZaddclosehookZaddinfoZ
addinfourlz.moves.urllib.responsezmoves.urllib_responsezmoves.urllib.responsec@seZdZdZdS)�#Module_six_moves_urllib_robotparserz=Lazy loading of moved objects in six.moves.urllib_robotparserN)rrrrr
r
r
rrr�srrZRobotFileParserz.moves.urllib.robotparserzmoves.urllib_robotparserzmoves.urllib.robotparserc@sNeZdZdZgZejd�Zejd�Zejd�Z	ejd�Z
ejd�Zdd�Zd	S)
�Module_six_moves_urllibzICreate a six.moves.urllib namespace that resembles the Python 3 namespacezmoves.urllib_parsezmoves.urllib_errorzmoves.urllib_requestzmoves.urllib_responsezmoves.urllib_robotparsercCsdddddgS)N�parse�error�request�responserjr
)rr
r
rr6�szModule_six_moves_urllib.__dir__N)
rrrrrG�	_importerr>rtrurvrwrjr6r
r
r
rrs�s




rszmoves.urllibcCstt|j|�dS)zAdd an item to six.moves.N)rrLr)Zmover
r
r�add_move�srycCsXytt|�WnDtk
rRytj|=Wn"tk
rLtd|f��YnXYnXdS)zRemove item from six.moves.zno such move, %rN)rrLr!rm�__dict__rA)rr
r
r�remove_move�sr{�__func__�__self__�__closure__�__code__�__defaults__�__globals__�im_funcZim_selfZfunc_closureZ	func_codeZ
func_defaultsZfunc_globalscCs|j�S)N)�next)�itr
r
r�advance_iteratorsr�cCstdd�t|�jD��S)Ncss|]}d|jkVqdS)�__call__N)rz)r3�klassr
r
r�	<genexpr>szcallable.<locals>.<genexpr>)�any�type�__mro__)r"r
r
r�callablesr�cCs|S)Nr
)�unboundr
r
r�get_unbound_functionsr�cCs|S)Nr
)r�clsr
r
r�create_unbound_methodsr�cCs|jS)N)r�)r�r
r
rr�"scCstj|||j�S)N)�types�
MethodTyper )rr"r
r
r�create_bound_method%sr�cCstj|d|�S)N)r�r�)rr�r
r
rr�(sc@seZdZdd�ZdS)�IteratorcCst|�j|�S)N)r��__next__)rr
r
rr�-sz
Iterator.nextN)rrrr�r
r
r
rr�+sr�z3Get the function out of a possibly unbound functioncKst|jf|��S)N)�iter�keys)�d�kwr
r
r�iterkeys>sr�cKst|jf|��S)N)r��values)r�r�r
r
r�
itervaluesAsr�cKst|jf|��S)N)r��items)r�r�r
r
r�	iteritemsDsr�cKst|jf|��S)N)r�Zlists)r�r�r
r
r�	iterlistsGsr�r�r�r�cKs|jf|�S)N)r�)r�r�r
r
rr�PscKs|jf|�S)N)r�)r�r�r
r
rr�SscKs|jf|�S)N)r�)r�r�r
r
rr�VscKs|jf|�S)N)r�)r�r�r
r
rr�Ys�viewkeys�
viewvalues�	viewitemsz1Return an iterator over the keys of a dictionary.z3Return an iterator over the values of a dictionary.z?Return an iterator over the (key, value) pairs of a dictionary.zBReturn an iterator over the (key, [values]) pairs of a dictionary.cCs
|jd�S)Nzlatin-1)�encode)�sr
r
r�bksr�cCs|S)Nr
)r�r
r
r�unsr�z>B�assertCountEqualZassertRaisesRegexpZassertRegexpMatches�assertRaisesRegex�assertRegexcCs|S)Nr
)r�r
r
rr��scCst|jdd�d�S)Nz\\z\\\\Zunicode_escape)�unicode�replace)r�r
r
rr��scCst|d�S)Nr)�ord)Zbsr
r
r�byte2int�sr�cCst||�S)N)r�)Zbuf�ir
r
r�
indexbytes�sr�ZassertItemsEqualzByte literalzText literalcOst|t�||�S)N)r,�_assertCountEqual)r�args�kwargsr
r
rr��scOst|t�||�S)N)r,�_assertRaisesRegex)rr�r�r
r
rr��scOst|t�||�S)N)r,�_assertRegex)rr�r�r
r
rr��s�execcCs*|dkr|�}|j|k	r"|j|��|�dS)N)�
__traceback__�with_traceback)r#r/�tbr
r
r�reraise�s


r�cCsB|dkr*tjd�}|j}|dkr&|j}~n|dkr6|}td�dS)zExecute code in a namespace.Nrzexec _code_ in _globs_, _locs_)r�	_getframe�	f_globals�f_localsr�)Z_code_Z_globs_Z_locs_�framer
r
r�exec_�s
r�z9def reraise(tp, value, tb=None):
    raise tp, value, tb
zrdef raise_from(value, from_value):
    if from_value is None:
        raise value
    raise value from from_value
zCdef raise_from(value, from_value):
    raise value from from_value
cCs|�dS)Nr
)r/Z
from_valuer
r
r�
raise_from�sr��printc
s6|jdtj���dkrdS�fdd�}d}|jdd�}|dk	r`t|t�rNd}nt|t�s`td��|jd	d�}|dk	r�t|t�r�d}nt|t�s�td
��|r�td��|s�x|D]}t|t�r�d}Pq�W|r�td�}td
�}nd}d
}|dkr�|}|dk�r�|}x,t|�D] \}	}|	�r||�||��qW||�dS)z4The new-style print function for Python 2.4 and 2.5.�fileNcsdt|t�st|�}t�t�rVt|t�rV�jdk	rVt�dd�}|dkrHd}|j�j|�}�j|�dS)N�errors�strict)	rD�
basestring�strr�r��encodingr,r��write)�datar�)�fpr
rr��s



zprint_.<locals>.writeF�sepTzsep must be None or a string�endzend must be None or a stringz$invalid keyword arguments to print()�
� )�popr�stdoutrDr�r��	TypeError�	enumerate)
r�r�r�Zwant_unicoder�r��arg�newlineZspacer�r
)r�r�print_�sL







r�cOs<|jdtj�}|jdd�}t||�|r8|dk	r8|j�dS)Nr��flushF)�getrr�r��_printr�)r�r�r�r�r
r
rr�s

zReraise an exception.cs���fdd�}|S)Ncstj����|�}�|_|S)N)r^�wraps�__wrapped__)�f)�assigned�updated�wrappedr
r�wrapperszwraps.<locals>.wrapperr
)r�r�r�r�r
)r�r�r�rr�sr�cs&G��fdd�d��}tj|dfi�S)z%Create a base class with a metaclass.cseZdZ��fdd�ZdS)z!with_metaclass.<locals>.metaclasscs�|�|�S)Nr
)r�rZ
this_basesr�)�bases�metar
r�__new__'sz)with_metaclass.<locals>.metaclass.__new__N)rrrr�r
)r�r�r
r�	metaclass%sr�Ztemporary_class)r�r�)r�r�r�r
)r�r�r�with_metaclass sr�cs�fdd�}|S)z6Class decorator for creating a class with a metaclass.csl|jj�}|jd�}|dk	rDt|t�r,|g}x|D]}|j|�q2W|jdd�|jdd��|j|j|�S)N�	__slots__rz�__weakref__)rz�copyr�rDr�r�r�	__bases__)r�Z	orig_vars�slotsZ	slots_var)r�r
rr�.s



zadd_metaclass.<locals>.wrapperr
)r�r�r
)r�r�
add_metaclass,sr�cCs2tr.d|jkrtd|j��|j|_dd�|_|S)a
    A decorator that defines __unicode__ and __str__ methods under Python 2.
    Under Python 3 it does nothing.

    To support Python 2 and 3 with a single code base, define a __str__ method
    returning text and apply this decorator to the class.
    �__str__zY@python_2_unicode_compatible cannot be applied to %s because it doesn't define __str__().cSs|j�jd�S)Nzutf-8)�__unicode__r�)rr
r
r�<lambda>Jsz-python_2_unicode_compatible.<locals>.<lambda>)�PY2rz�
ValueErrorrr�r�)r�r
r
r�python_2_unicode_compatible<s


r��__spec__)rrli���li���ll����)N)NN)rr)rr)rr)rr)�rZ
__future__rr^rP�operatorrr��
__author__�__version__�version_infor�r(ZPY34r�Zstring_types�intZ
integer_typesr�Zclass_typesZ	text_type�bytesZbinary_type�maxsizeZMAXSIZEr�ZlongZ	ClassTyper��platform�
startswith�objectr	�len�
OverflowErrorrrrr&�
ModuleTyper2r7r9rrxrLr5r-rrrDr=rmrnZ_urllib_parse_moved_attributesroZ_urllib_error_moved_attributesrpZ _urllib_request_moved_attributesrqZ!_urllib_response_moved_attributesrrZ$_urllib_robotparser_moved_attributesrsryr{Z
_meth_funcZ
_meth_selfZ
_func_closureZ
_func_codeZ_func_defaultsZ
_func_globalsr�r��	NameErrorr�r�r�r�r�r��
attrgetterZget_method_functionZget_method_selfZget_function_closureZget_function_codeZget_function_defaultsZget_function_globalsr�r�r�r��methodcallerr�r�r�r�r��chrZunichr�struct�Struct�packZint2byte�
itemgetterr��getitemr�r�Z	iterbytesrMrN�BytesIOr�r�r��partialrVr�r�r�r�r,rQr�r�r�r�r��WRAPPER_ASSIGNMENTS�WRAPPER_UPDATESr�r�r�r�rG�__package__�globalsr�r��submodule_search_locations�	meta_pathr�r�Zimporter�appendr
r
r
r�<module>s�

>












































































































z _SixMetaPathImporter.load_modulecCst|j|�d�S)z�
        Return true, if the named module is a package.

        We need this method to get correct spec objects with
        Python 3.4 (see PEP451)
        �__path__)�hasattrrC)rr<r
r
r�
is_package�sz_SixMetaPathImporter.is_packagecCs|j|�dS)z;Return None

        Required, if is_package is implementedN)rC)rr<r
r
r�get_code�s
z_SixMetaPathImporter.get_code)N)
rrrrrr=r>r@rCrFrIrJ�
get_sourcer
r
r
rr9�s
	r9c@seZdZdZgZdS)�_MovedItemszLazy loading of moved objectsN)rrrrrGr
r
r
rrL�srLZ	cStringIO�io�StringIO�filter�	itertools�builtinsZifilter�filterfalseZifilterfalse�inputZ__builtin__Z	raw_input�internr�map�imap�getcwd�osZgetcwdu�getcwdb�rangeZxrangeZ
reload_module�	importlibZimp�reload�reduce�	functoolsZshlex_quoteZpipesZshlexZquote�UserDict�collections�UserList�
UserString�zipZizip�zip_longestZizip_longestZconfigparserZConfigParser�copyregZcopy_regZdbm_gnuZgdbmzdbm.gnuZ
_dummy_threadZdummy_threadZhttp_cookiejarZ	cookielibzhttp.cookiejarZhttp_cookiesZCookiezhttp.cookiesZ
html_entitiesZhtmlentitydefsz
html.entitiesZhtml_parserZ
HTMLParserzhtml.parserZhttp_clientZhttplibzhttp.clientZemail_mime_multipartzemail.MIMEMultipartzemail.mime.multipartZemail_mime_nonmultipartzemail.MIMENonMultipartzemail.mime.nonmultipartZemail_mime_textzemail.MIMETextzemail.mime.textZemail_mime_basezemail.MIMEBasezemail.mime.baseZBaseHTTPServerzhttp.serverZ
CGIHTTPServerZSimpleHTTPServerZcPickle�pickleZqueueZQueue�reprlib�reprZsocketserverZSocketServer�_threadZthreadZtkinterZTkinterZtkinter_dialogZDialogztkinter.dialogZtkinter_filedialogZ
FileDialogztkinter.filedialogZtkinter_scrolledtextZScrolledTextztkinter.scrolledtextZtkinter_simpledialogZSimpleDialogztkinter.simpledialogZtkinter_tixZTixztkinter.tixZtkinter_ttkZttkztkinter.ttkZtkinter_constantsZTkconstantsztkinter.constantsZtkinter_dndZTkdndztkinter.dndZtkinter_colorchooserZtkColorChooserztkinter.colorchooserZtkinter_commondialogZtkCommonDialogztkinter.commondialogZtkinter_tkfiledialogZtkFileDialogZtkinter_fontZtkFontztkinter.fontZtkinter_messageboxZtkMessageBoxztkinter.messageboxZtkinter_tksimpledialogZtkSimpleDialogZurllib_parsez.moves.urllib_parsezurllib.parseZurllib_errorz.moves.urllib_errorzurllib.errorZurllibz
.moves.urllibZurllib_robotparser�robotparserzurllib.robotparserZ
xmlrpc_clientZ	xmlrpclibz
xmlrpc.clientZ
xmlrpc_serverZSimpleXMLRPCServerz
xmlrpc.serverZwin32�winreg�_winregzmoves.z.moves�movesc@seZdZdZdS)�Module_six_moves_urllib_parsez7Lazy loading of moved objects in six.moves.urllib_parseN)rrrrr
r
r
rrn@srnZParseResultZurlparseZSplitResultZparse_qsZ	parse_qslZ	urldefragZurljoinZurlsplitZ
urlunparseZ
urlunsplitZ
quote_plusZunquoteZunquote_plusZ	urlencodeZ
splitqueryZsplittagZ	splituserZ
uses_fragmentZuses_netlocZuses_paramsZ
uses_queryZ
uses_relativezmoves.urllib_parsezmoves.urllib.parsec@seZdZdZdS)�Module_six_moves_urllib_errorz7Lazy loading of moved objects in six.moves.urllib_errorN)rrrrr
r
r
rrohsroZURLErrorZurllib2Z	HTTPErrorZContentTooShortErrorz.moves.urllib.errorzmoves.urllib_errorzmoves.urllib.errorc@seZdZdZdS)�Module_six_moves_urllib_requestz9Lazy loading of moved objects in six.moves.urllib_requestN)rrrrr
r
r
rrp|srpZurlopenzurllib.requestZinstall_openerZbuild_openerZpathname2urlZurl2pathnameZ
getproxiesZRequestZOpenerDirectorZHTTPDefaultErrorHandlerZHTTPRedirectHandlerZHTTPCookieProcessorZProxyHandlerZBaseHandlerZHTTPPasswordMgrZHTTPPasswordMgrWithDefaultRealmZAbstractBasicAuthHandlerZHTTPBasicAuthHandlerZProxyBasicAuthHandlerZAbstractDigestAuthHandlerZHTTPDigestAuthHandlerZProxyDigestAuthHandlerZHTTPHandlerZHTTPSHandlerZFileHandlerZ
FTPHandlerZCacheFTPHandlerZUnknownHandlerZHTTPErrorProcessorZurlretrieveZ
urlcleanupZ	URLopenerZFancyURLopenerZproxy_bypassz.moves.urllib.requestzmoves.urllib_requestzmoves.urllib.requestc@seZdZdZdS)� Module_six_moves_urllib_responsez:Lazy loading of moved objects in six.moves.urllib_responseN)rrrrr
r
r
rrq�srqZaddbasezurllib.responseZaddclosehookZaddinfoZ
addinfourlz.moves.urllib.responsezmoves.urllib_responsezmoves.urllib.responsec@seZdZdZdS)�#Module_six_moves_urllib_robotparserz=Lazy loading of moved objects in six.moves.urllib_robotparserN)rrrrr
r
r
rrr�srrZRobotFileParserz.moves.urllib.robotparserzmoves.urllib_robotparserzmoves.urllib.robotparserc@sNeZdZdZgZejd�Zejd�Zejd�Z	ejd�Z
ejd�Zdd�Zd	S)
�Module_six_moves_urllibzICreate a six.moves.urllib namespace that resembles the Python 3 namespacezmoves.urllib_parsezmoves.urllib_errorzmoves.urllib_requestzmoves.urllib_responsezmoves.urllib_robotparsercCsdddddgS)N�parse�error�request�responserjr
)rr
r
rr6�szModule_six_moves_urllib.__dir__N)
rrrrrG�	_importerr>rtrurvrwrjr6r
r
r
rrs�s




rszmoves.urllibcCstt|j|�dS)zAdd an item to six.moves.N)rrLr)Zmover
r
r�add_move�srycCsXytt|�WnDtk
rRytj|=Wn"tk
rLtd|f��YnXYnXdS)zRemove item from six.moves.zno such move, %rN)rrLr!rm�__dict__rA)rr
r
r�remove_move�sr{�__func__�__self__�__closure__�__code__�__defaults__�__globals__�im_funcZim_selfZfunc_closureZ	func_codeZ
func_defaultsZfunc_globalscCs|j�S)N)�next)�itr
r
r�advance_iteratorsr�cCstdd�t|�jD��S)Ncss|]}d|jkVqdS)�__call__N)rz)r3�klassr
r
r�	<genexpr>szcallable.<locals>.<genexpr>)�any�type�__mro__)r"r
r
r�callablesr�cCs|S)Nr
)�unboundr
r
r�get_unbound_functionsr�cCs|S)Nr
)r�clsr
r
r�create_unbound_methodsr�cCs|jS)N)r�)r�r
r
rr�"scCstj|||j�S)N)�types�
MethodTyper )rr"r
r
r�create_bound_method%sr�cCstj|d|�S)N)r�r�)rr�r
r
rr�(sc@seZdZdd�ZdS)�IteratorcCst|�j|�S)N)r��__next__)rr
r
rr�-sz
Iterator.nextN)rrrr�r
r
r
rr�+sr�z3Get the function out of a possibly unbound functioncKst|jf|��S)N)�iter�keys)�d�kwr
r
r�iterkeys>sr�cKst|jf|��S)N)r��values)r�r�r
r
r�
itervaluesAsr�cKst|jf|��S)N)r��items)r�r�r
r
r�	iteritemsDsr�cKst|jf|��S)N)r�Zlists)r�r�r
r
r�	iterlistsGsr�r�r�r�cKs|jf|�S)N)r�)r�r�r
r
rr�PscKs|jf|�S)N)r�)r�r�r
r
rr�SscKs|jf|�S)N)r�)r�r�r
r
rr�VscKs|jf|�S)N)r�)r�r�r
r
rr�Ys�viewkeys�
viewvalues�	viewitemsz1Return an iterator over the keys of a dictionary.z3Return an iterator over the values of a dictionary.z?Return an iterator over the (key, value) pairs of a dictionary.zBReturn an iterator over the (key, [values]) pairs of a dictionary.cCs
|jd�S)Nzlatin-1)�encode)�sr
r
r�bksr�cCs|S)Nr
)r�r
r
r�unsr�z>B�assertCountEqualZassertRaisesRegexpZassertRegexpMatches�assertRaisesRegex�assertRegexcCs|S)Nr
)r�r
r
rr��scCst|jdd�d�S)Nz\\z\\\\Zunicode_escape)�unicode�replace)r�r
r
rr��scCst|d�S)Nr)�ord)Zbsr
r
r�byte2int�sr�cCst||�S)N)r�)Zbuf�ir
r
r�
indexbytes�sr�ZassertItemsEqualzByte literalzText literalcOst|t�||�S)N)r,�_assertCountEqual)r�args�kwargsr
r
rr��scOst|t�||�S)N)r,�_assertRaisesRegex)rr�r�r
r
rr��scOst|t�||�S)N)r,�_assertRegex)rr�r�r
r
rr��s�execcCs*|dkr|�}|j|k	r"|j|��|�dS)N)�
__traceback__�with_traceback)r#r/�tbr
r
r�reraise�s


r�cCsB|dkr*tjd�}|j}|dkr&|j}~n|dkr6|}td�dS)zExecute code in a namespace.Nrzexec _code_ in _globs_, _locs_)r�	_getframe�	f_globals�f_localsr�)Z_code_Z_globs_Z_locs_�framer
r
r�exec_�s
r�z9def reraise(tp, value, tb=None):
    raise tp, value, tb
zrdef raise_from(value, from_value):
    if from_value is None:
        raise value
    raise value from from_value
zCdef raise_from(value, from_value):
    raise value from from_value
cCs|�dS)Nr
)r/Z
from_valuer
r
r�
raise_from�sr��printc
s6|jdtj���dkrdS�fdd�}d}|jdd�}|dk	r`t|t�rNd}nt|t�s`td��|jd	d�}|dk	r�t|t�r�d}nt|t�s�td
��|r�td��|s�x|D]}t|t�r�d}Pq�W|r�td�}td
�}nd}d
}|dkr�|}|dk�r�|}x,t|�D] \}	}|	�r||�||��qW||�dS)z4The new-style print function for Python 2.4 and 2.5.�fileNcsdt|t�st|�}t�t�rVt|t�rV�jdk	rVt�dd�}|dkrHd}|j�j|�}�j|�dS)N�errors�strict)	rD�
basestring�strr�r��encodingr,r��write)�datar�)�fpr
rr��s



zprint_.<locals>.writeF�sepTzsep must be None or a string�endzend must be None or a stringz$invalid keyword arguments to print()�
� )�popr�stdoutrDr�r��	TypeError�	enumerate)
r�r�r�Zwant_unicoder�r��arg�newlineZspacer�r
)r�r�print_�sL







r�cOs<|jdtj�}|jdd�}t||�|r8|dk	r8|j�dS)Nr��flushF)�getrr�r��_printr�)r�r�r�r�r
r
rr�s

zReraise an exception.cs���fdd�}|S)Ncstj����|�}�|_|S)N)r^�wraps�__wrapped__)�f)�assigned�updated�wrappedr
r�wrapperszwraps.<locals>.wrapperr
)r�r�r�r�r
)r�r�r�rr�sr�cs&G��fdd�d��}tj|dfi�S)z%Create a base class with a metaclass.cseZdZ��fdd�ZdS)z!with_metaclass.<locals>.metaclasscs�|�|�S)Nr
)r�rZ
this_basesr�)�bases�metar
r�__new__'sz)with_metaclass.<locals>.metaclass.__new__N)rrrr�r
)r�r�r
r�	metaclass%sr�Ztemporary_class)r�r�)r�r�r�r
)r�r�r�with_metaclass sr�cs�fdd�}|S)z6Class decorator for creating a class with a metaclass.csl|jj�}|jd�}|dk	rDt|t�r,|g}x|D]}|j|�q2W|jdd�|jdd��|j|j|�S)N�	__slots__rz�__weakref__)rz�copyr�rDr�r�r�	__bases__)r�Z	orig_vars�slotsZ	slots_var)r�r
rr�.s



zadd_metaclass.<locals>.wrapperr
)r�r�r
)r�r�
add_metaclass,sr�cCs2tr.d|jkrtd|j��|j|_dd�|_|S)a
    A decorator that defines __unicode__ and __str__ methods under Python 2.
    Under Python 3 it does nothing.

    To support Python 2 and 3 with a single code base, define a __str__ method
    returning text and apply this decorator to the class.
    �__str__zY@python_2_unicode_compatible cannot be applied to %s because it doesn't define __str__().cSs|j�jd�S)Nzutf-8)�__unicode__r�)rr
r
r�<lambda>Jsz-python_2_unicode_compatible.<locals>.<lambda>)�PY2rz�
ValueErrorrr�r�)r�r
r
r�python_2_unicode_compatible<s


r��__spec__)rrli���li���ll����)N)NN)rr)rr)rr)rr)�rZ
__future__rr^rP�operatorrr��
__author__�__version__�version_infor�r(ZPY34r�Zstring_types�intZ
integer_typesr�Zclass_typesZ	text_type�bytesZbinary_type�maxsizeZMAXSIZEr�ZlongZ	ClassTyper��platform�
startswith�objectr	�len�
OverflowErrorrrrr&�
ModuleTyper2r7r9rrxrLr5r-rrrDr=rmrnZ_urllib_parse_moved_attributesroZ_urllib_error_moved_attributesrpZ _urllib_request_moved_attributesrqZ!_urllib_response_moved_attributesrrZ$_urllib_robotparser_moved_attributesrsryr{Z
_meth_funcZ
_meth_selfZ
_func_closureZ
_func_codeZ_func_defaultsZ
_func_globalsr�r��	NameErrorr�r�r�r�r�r��
attrgetterZget_method_functionZget_method_selfZget_function_closureZget_function_codeZget_function_defaultsZget_function_globalsr�r�r�r��methodcallerr�r�r�r�r��chrZunichr�struct�Struct�packZint2byte�
itemgetterr��getitemr�r�Z	iterbytesrMrN�BytesIOr�r�r��partialrVr�r�r�r�r,rQr�r�r�r�r��WRAPPER_ASSIGNMENTS�WRAPPER_UPDATESr�r�r�r�rG�__package__�globalsr�r��submodule_search_locations�	meta_pathr�r�Zimporter�appendr
r
r
r�<module>s�

>












































































































5site-packages/setuptools/monkey.py000064400000012215147511334630013371 0ustar00"""
Monkey patching of distutils.
"""

import sys
import distutils.filelist
import platform
import types
import functools
from importlib import import_module
import inspect

from setuptools.extern import six

import setuptools

__all__ = []
"""
Everything is private. Contact the project team
if you think you need this functionality.
"""


def _get_mro(cls):
    """
    Returns the base classes for cls sorted by the MRO.

    Works around an issue on Jython where inspect.getmro will not return all
    base classes if multiple classes share the same name. Instead, this
    function will return a tuple containing the class itself, and the contents
    of cls.__bases__. See https://github.com/pypa/setuptools/issues/1024.
    """
    if platform.python_implementation() == "Jython":
        return (cls,) + cls.__bases__
    return inspect.getmro(cls)


def get_unpatched(item):
    lookup = (
        get_unpatched_class if isinstance(item, six.class_types) else
        get_unpatched_function if isinstance(item, types.FunctionType) else
        lambda item: None
    )
    return lookup(item)


def get_unpatched_class(cls):
    """Protect against re-patching the distutils if reloaded

    Also ensures that no other distutils extension monkeypatched the distutils
    first.
    """
    external_bases = (
        cls
        for cls in _get_mro(cls)
        if not cls.__module__.startswith('setuptools')
    )
    base = next(external_bases)
    if not base.__module__.startswith('distutils'):
        msg = "distutils has already been patched by %r" % cls
        raise AssertionError(msg)
    return base


def patch_all():
    # we can't patch distutils.cmd, alas
    distutils.core.Command = setuptools.Command

    has_issue_12885 = sys.version_info <= (3, 5, 3)

    if has_issue_12885:
        # fix findall bug in distutils (http://bugs.python.org/issue12885)
        distutils.filelist.findall = setuptools.findall

    needs_warehouse = (
        sys.version_info < (2, 7, 13)
        or
        (3, 0) < sys.version_info < (3, 3, 7)
        or
        (3, 4) < sys.version_info < (3, 4, 6)
        or
        (3, 5) < sys.version_info <= (3, 5, 3)
    )

    if needs_warehouse:
        warehouse = 'https://upload.pypi.org/legacy/'
        distutils.config.PyPIRCCommand.DEFAULT_REPOSITORY = warehouse

    _patch_distribution_metadata_write_pkg_file()

    # Install Distribution throughout the distutils
    for module in distutils.dist, distutils.core, distutils.cmd:
        module.Distribution = setuptools.dist.Distribution

    # Install the patched Extension
    distutils.core.Extension = setuptools.extension.Extension
    distutils.extension.Extension = setuptools.extension.Extension
    if 'distutils.command.build_ext' in sys.modules:
        sys.modules['distutils.command.build_ext'].Extension = (
            setuptools.extension.Extension
        )

    patch_for_msvc_specialized_compiler()


def _patch_distribution_metadata_write_pkg_file():
    """Patch write_pkg_file to also write Requires-Python/Requires-External"""
    distutils.dist.DistributionMetadata.write_pkg_file = (
        setuptools.dist.write_pkg_file
    )


def patch_func(replacement, target_mod, func_name):
    """
    Patch func_name in target_mod with replacement

    Important - original must be resolved by name to avoid
    patching an already patched function.
    """
    original = getattr(target_mod, func_name)

    # set the 'unpatched' attribute on the replacement to
    # point to the original.
    vars(replacement).setdefault('unpatched', original)

    # replace the function in the original module
    setattr(target_mod, func_name, replacement)
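

# Illustrative sketch (hypothetical names, never called): how patch_func()
# installs a replacement while keeping the original reachable through the
# 'unpatched' attribute.  Not part of setuptools itself.
def _patch_func_example():
    fake_mod = types.ModuleType('fake_mod')
    fake_mod.greet = lambda: 'original'

    def patched_greet():
        return 'patched'

    patch_func(patched_greet, fake_mod, 'greet')
    assert fake_mod.greet() == 'patched'
    assert fake_mod.greet.unpatched() == 'original'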


def get_unpatched_function(candidate):
    return getattr(candidate, 'unpatched')


def patch_for_msvc_specialized_compiler():
    """
    Patch functions in distutils to use standalone Microsoft Visual C++
    compilers.
    """
    # import late to avoid circular imports on Python < 3.5
    msvc = import_module('setuptools.msvc')

    if platform.system() != 'Windows':
        # Compilers are only available on Microsoft Windows
        return

    def patch_params(mod_name, func_name):
        """
        Prepare the parameters for patch_func to patch indicated function.
        """
        repl_prefix = 'msvc9_' if 'msvc9' in mod_name else 'msvc14_'
        repl_name = repl_prefix + func_name.lstrip('_')
        repl = getattr(msvc, repl_name)
        mod = import_module(mod_name)
        if not hasattr(mod, func_name):
            raise ImportError(func_name)
        return repl, mod, func_name
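
    # Illustrative sketch: for the call made below,
    # patch_params('distutils.msvc9compiler', 'find_vcvarsall') resolves
    # repl_name to 'msvc9_find_vcvarsall' and returns
    #   (msvc.msvc9_find_vcvarsall, <module 'distutils.msvc9compiler'>,
    #    'find_vcvarsall'),
    # i.e. exactly the (replacement, target_mod, func_name) arguments
    # expected by patch_func().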

    # Python 2.7 to 3.4
    msvc9 = functools.partial(patch_params, 'distutils.msvc9compiler')

    # Python 3.5+
    msvc14 = functools.partial(patch_params, 'distutils._msvccompiler')

    try:
        # Patch distutils.msvc9compiler
        patch_func(*msvc9('find_vcvarsall'))
        patch_func(*msvc9('query_vcvarsall'))
    except ImportError:
        pass

    try:
        # Patch distutils._msvccompiler._get_vc_env
        patch_func(*msvc14('_get_vc_env'))
    except ImportError:
        pass

    try:
        # Patch distutils._msvccompiler.gen_lib_options for Numpy
        patch_func(*msvc14('gen_lib_options'))
    except ImportError:
        pass
site-packages/setuptools/namespaces.py000064400000006177147511334630014220 0ustar00import os
from distutils import log
import itertools

from setuptools.extern.six.moves import map


flatten = itertools.chain.from_iterable


class Installer:

    nspkg_ext = '-nspkg.pth'

    def install_namespaces(self):
        nsp = self._get_all_ns_packages()
        if not nsp:
            return
        filename, ext = os.path.splitext(self._get_target())
        filename += self.nspkg_ext
        self.outputs.append(filename)
        log.info("Installing %s", filename)
        lines = map(self._gen_nspkg_line, nsp)

        if self.dry_run:
            # always generate the lines, even in dry run
            list(lines)
            return

        with open(filename, 'wt') as f:
            f.writelines(lines)

    def uninstall_namespaces(self):
        filename, ext = os.path.splitext(self._get_target())
        filename += self.nspkg_ext
        if not os.path.exists(filename):
            return
        log.info("Removing %s", filename)
        os.remove(filename)

    def _get_target(self):
        return self.target

    _nspkg_tmpl = (
        "import sys, types, os",
        "has_mfs = sys.version_info > (3, 5)",
        "p = os.path.join(%(root)s, *%(pth)r)",
        "importlib = has_mfs and __import__('importlib.util')",
        "has_mfs and __import__('importlib.machinery')",
        "m = has_mfs and "
            "sys.modules.setdefault(%(pkg)r, "
                "importlib.util.module_from_spec("
                    "importlib.machinery.PathFinder.find_spec(%(pkg)r, "
                        "[os.path.dirname(p)])))",
        "m = m or "
            "sys.modules.setdefault(%(pkg)r, types.ModuleType(%(pkg)r))",
        "mp = (m or []) and m.__dict__.setdefault('__path__',[])",
        "(p not in mp) and mp.append(p)",
    )
    "lines for the namespace installer"

    _nspkg_tmpl_multi = (
        'm and setattr(sys.modules[%(parent)r], %(child)r, m)',
    )
    "additional line(s) when a parent package is indicated"

    def _get_root(self):
        return "sys._getframe(1).f_locals['sitedir']"

    def _gen_nspkg_line(self, pkg):
        # ensure pkg is not a unicode string under Python 2.7
        pkg = str(pkg)
        pth = tuple(pkg.split('.'))
        root = self._get_root()
        tmpl_lines = self._nspkg_tmpl
        parent, sep, child = pkg.rpartition('.')
        if parent:
            tmpl_lines += self._nspkg_tmpl_multi
        return ';'.join(tmpl_lines) % locals() + '\n'
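
    # Illustrative sketch (hypothetical package name): for the namespace
    # package 'a.b', _gen_nspkg_line() renders _nspkg_tmpl plus
    # _nspkg_tmpl_multi (because 'a.b' has a parent) into a single
    # ';'-joined line for the generated -nspkg.pth file, roughly:
    #   import sys, types, os;has_mfs = sys.version_info > (3, 5);
    #   p = os.path.join(sys._getframe(1).f_locals['sitedir'], *('a', 'b'));
    #   ...;m and setattr(sys.modules['a'], 'b', m)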

    def _get_all_ns_packages(self):
        """Return sorted list of all package namespaces"""
        pkgs = self.distribution.namespace_packages or []
        return sorted(flatten(map(self._pkg_names, pkgs)))

    @staticmethod
    def _pkg_names(pkg):
        """
        Given a namespace package, yield the components of that
        package.

        >>> names = Installer._pkg_names('a.b.c')
        >>> set(names) == set(['a', 'a.b', 'a.b.c'])
        True
        """
        parts = pkg.split('.')
        while parts:
            yield '.'.join(parts)
            parts.pop()


class DevelopInstaller(Installer):
    def _get_root(self):
        return repr(str(self.egg_path))

    def _get_target(self):
        return self.egg_link
site-packages/setuptools/wheel.py000064400000017142147511334630013177 0ustar00'''Wheels support.'''

from distutils.util import get_platform
import email
import itertools
import os
import posixpath
import re
import zipfile

from pkg_resources import Distribution, PathMetadata, parse_version
from setuptools.extern.packaging.utils import canonicalize_name
from setuptools.extern.six import PY3
from setuptools import Distribution as SetuptoolsDistribution
from setuptools import pep425tags
from setuptools.command.egg_info import write_requirements


WHEEL_NAME = re.compile(
    r"""^(?P<project_name>.+?)-(?P<version>\d.*?)
    ((-(?P<build>\d.*?))?-(?P<py_version>.+?)-(?P<abi>.+?)-(?P<platform>.+?)
    )\.whl$""",
re.VERBOSE).match
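
# Illustrative sketch (hypothetical filename, never called): how WHEEL_NAME
# splits a wheel filename into its components.  Not part of setuptools itself.
def _wheel_name_example():
    match = WHEEL_NAME('example_pkg-1.2.3-py2.py3-none-any.whl')
    assert match.groupdict() == {
        'project_name': 'example_pkg', 'version': '1.2.3', 'build': None,
        'py_version': 'py2.py3', 'abi': 'none', 'platform': 'any',
    }
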

NAMESPACE_PACKAGE_INIT = '''\
try:
    __import__('pkg_resources').declare_namespace(__name__)
except ImportError:
    __path__ = __import__('pkgutil').extend_path(__path__, __name__)
'''


def unpack(src_dir, dst_dir):
    '''Move everything under `src_dir` to `dst_dir`, and delete the former.'''
    for dirpath, dirnames, filenames in os.walk(src_dir):
        subdir = os.path.relpath(dirpath, src_dir)
        for f in filenames:
            src = os.path.join(dirpath, f)
            dst = os.path.join(dst_dir, subdir, f)
            os.renames(src, dst)
        for n, d in reversed(list(enumerate(dirnames))):
            src = os.path.join(dirpath, d)
            dst = os.path.join(dst_dir, subdir, d)
            if not os.path.exists(dst):
                # Directory does not exist in destination,
                # rename it and prune it from os.walk list.
                os.renames(src, dst)
                del dirnames[n]
    # Cleanup.
    for dirpath, dirnames, filenames in os.walk(src_dir, topdown=True):
        assert not filenames
        os.rmdir(dirpath)


class Wheel(object):

    def __init__(self, filename):
        match = WHEEL_NAME(os.path.basename(filename))
        if match is None:
            raise ValueError('invalid wheel name: %r' % filename)
        self.filename = filename
        for k, v in match.groupdict().items():
            setattr(self, k, v)

    def tags(self):
        '''List tags (py_version, abi, platform) supported by this wheel.'''
        return itertools.product(self.py_version.split('.'),
                                 self.abi.split('.'),
                                 self.platform.split('.'))
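
    # Illustrative sketch: a wheel tagged 'py2.py3-none-any' yields the two
    # tag triples ('py2', 'none', 'any') and ('py3', 'none', 'any'), since
    # each dotted field is split and the cartesian product is taken.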

    def is_compatible(self):
        '''Is the wheel compatible with the current platform?'''
        supported_tags = pep425tags.get_supported()
        return next((True for t in self.tags() if t in supported_tags), False)

    def egg_name(self):
        return Distribution(
            project_name=self.project_name, version=self.version,
            platform=(None if self.platform == 'any' else get_platform()),
        ).egg_name() + '.egg'

    def get_dist_info(self, zf):
        # find the correct name of the .dist-info dir in the wheel file
        for member in zf.namelist():
            dirname = posixpath.dirname(member)
            if (dirname.endswith('.dist-info') and
                    canonicalize_name(dirname).startswith(
                        canonicalize_name(self.project_name))):
                return dirname
        raise ValueError("unsupported wheel format. .dist-info not found")

    def install_as_egg(self, destination_eggdir):
        '''Install wheel as an egg directory.'''
        with zipfile.ZipFile(self.filename) as zf:
            dist_basename = '%s-%s' % (self.project_name, self.version)
            dist_info = self.get_dist_info(zf)
            dist_data = '%s.data' % dist_basename
            def get_metadata(name):
                with zf.open(posixpath.join(dist_info, name)) as fp:
                    value = fp.read().decode('utf-8') if PY3 else fp.read()
                    return email.parser.Parser().parsestr(value)
            wheel_metadata = get_metadata('WHEEL')
            dist_metadata = get_metadata('METADATA')
            # Check wheel format version is supported.
            wheel_version = parse_version(wheel_metadata.get('Wheel-Version'))
            if not parse_version('1.0') <= wheel_version < parse_version('2.0dev0'):
                raise ValueError('unsupported wheel format version: %s' % wheel_version)
            # Extract to target directory.
            os.mkdir(destination_eggdir)
            zf.extractall(destination_eggdir)
            # Convert metadata.
            dist_info = os.path.join(destination_eggdir, dist_info)
            dist = Distribution.from_location(
                destination_eggdir, dist_info,
                metadata=PathMetadata(destination_eggdir, dist_info)
            )
            # Note: we need to evaluate and strip markers now,
            # as we can't easily convert back from the syntax:
            # foobar; "linux" in sys_platform and extra == 'test'
            def raw_req(req):
                req.marker = None
                return str(req)
            install_requires = list(sorted(map(raw_req, dist.requires())))
            extras_require = {
                extra: list(sorted(
                    req
                    for req in map(raw_req, dist.requires((extra,)))
                    if req not in install_requires
                ))
                for extra in dist.extras
            }
            egg_info = os.path.join(destination_eggdir, 'EGG-INFO')
            os.rename(dist_info, egg_info)
            os.rename(os.path.join(egg_info, 'METADATA'),
                      os.path.join(egg_info, 'PKG-INFO'))
            setup_dist = SetuptoolsDistribution(attrs=dict(
                install_requires=install_requires,
                extras_require=extras_require,
            ))
            write_requirements(setup_dist.get_command_obj('egg_info'),
                               None, os.path.join(egg_info, 'requires.txt'))
            # Move data entries to their correct location.
            dist_data = os.path.join(destination_eggdir, dist_data)
            dist_data_scripts = os.path.join(dist_data, 'scripts')
            if os.path.exists(dist_data_scripts):
                egg_info_scripts = os.path.join(destination_eggdir,
                                                'EGG-INFO', 'scripts')
                os.mkdir(egg_info_scripts)
                for entry in os.listdir(dist_data_scripts):
                    # Remove bytecode, as it's not properly handled
                    # during easy_install scripts install phase.
                    if entry.endswith('.pyc'):
                        os.unlink(os.path.join(dist_data_scripts, entry))
                    else:
                        os.rename(os.path.join(dist_data_scripts, entry),
                                  os.path.join(egg_info_scripts, entry))
                os.rmdir(dist_data_scripts)
            for subdir in filter(os.path.exists, (
                os.path.join(dist_data, d)
                for d in ('data', 'headers', 'purelib', 'platlib')
            )):
                unpack(subdir, destination_eggdir)
            if os.path.exists(dist_data):
                os.rmdir(dist_data)
            # Fix namespace packages.
            namespace_packages = os.path.join(egg_info, 'namespace_packages.txt')
            if os.path.exists(namespace_packages):
                with open(namespace_packages) as fp:
                    namespace_packages = fp.read().split()
                for mod in namespace_packages:
                    mod_dir = os.path.join(destination_eggdir, *mod.split('.'))
                    mod_init = os.path.join(mod_dir, '__init__.py')
                    if os.path.exists(mod_dir) and not os.path.exists(mod_init):
                        with open(mod_init, 'w') as fp:
                            fp.write(NAMESPACE_PACKAGE_INIT)
site-packages/setuptools/msvc.py000064400000117655147511334630013055 0ustar00"""
Improved support for Microsoft Visual C++ compilers.

Known supported compilers:
--------------------------
Microsoft Visual C++ 9.0:
    Microsoft Visual C++ Compiler for Python 2.7 (x86, amd64)
    Microsoft Windows SDK 6.1 (x86, x64, ia64)
    Microsoft Windows SDK 7.0 (x86, x64, ia64)

Microsoft Visual C++ 10.0:
    Microsoft Windows SDK 7.1 (x86, x64, ia64)

Microsoft Visual C++ 14.0:
    Microsoft Visual C++ Build Tools 2015 (x86, x64, arm)
    Microsoft Visual Studio 2017 (x86, x64, arm, arm64)
    Microsoft Visual Studio Build Tools 2017 (x86, x64, arm, arm64)
"""

import os
import sys
import platform
import itertools
import distutils.errors
from setuptools.extern.packaging.version import LegacyVersion

from setuptools.extern.six.moves import filterfalse

from .monkey import get_unpatched

if platform.system() == 'Windows':
    from setuptools.extern.six.moves import winreg
    safe_env = os.environ
else:
    """
    Mock winreg and environ so the module can be imported
    on this platform.
    """

    class winreg:
        HKEY_USERS = None
        HKEY_CURRENT_USER = None
        HKEY_LOCAL_MACHINE = None
        HKEY_CLASSES_ROOT = None

    safe_env = dict()

_msvc9_suppress_errors = (
    # msvc9compiler isn't available on some platforms
    ImportError,

    # msvc9compiler raises DistutilsPlatformError in some
    # environments. See #1118.
    distutils.errors.DistutilsPlatformError,
)

try:
    from distutils.msvc9compiler import Reg
except _msvc9_suppress_errors:
    pass


def msvc9_find_vcvarsall(version):
    """
    Patched "distutils.msvc9compiler.find_vcvarsall" to use the standalone
    compiler build for Python (VCForPython). Fall back to original behavior
    when the standalone compiler is not available.

    Redirect the path of "vcvarsall.bat".

    Known supported compilers
    -------------------------
    Microsoft Visual C++ 9.0:
        Microsoft Visual C++ Compiler for Python 2.7 (x86, amd64)

    Parameters
    ----------
    version: float
        Required Microsoft Visual C++ version.

    Return
    ------
    vcvarsall.bat path: str
    """
    VC_BASE = r'Software\%sMicrosoft\DevDiv\VCForPython\%0.1f'
    key = VC_BASE % ('', version)
    try:
        # Per-user installs register the compiler path here
        productdir = Reg.get_value(key, "installdir")
    except KeyError:
        try:
            # All-user installs on a 64-bit system register here
            key = VC_BASE % ('Wow6432Node\\', version)
            productdir = Reg.get_value(key, "installdir")
        except KeyError:
            productdir = None

    if productdir:
        vcvarsall = os.path.join(productdir, "vcvarsall.bat")
        if os.path.isfile(vcvarsall):
            return vcvarsall

    return get_unpatched(msvc9_find_vcvarsall)(version)
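
# Illustrative sketch: for version 9.0 the two registry keys probed above are
#   'Software\Microsoft\DevDiv\VCForPython\9.0'              (per-user install)
#   'Software\Wow6432Node\Microsoft\DevDiv\VCForPython\9.0'  (all-user, 64-bit)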


def msvc9_query_vcvarsall(ver, arch='x86', *args, **kwargs):
    """
    Patched "distutils.msvc9compiler.query_vcvarsall" for support extra
    compilers.

    Set environment without use of "vcvarsall.bat".

    Known supported compilers
    -------------------------
    Microsoft Visual C++ 9.0:
        Microsoft Visual C++ Compiler for Python 2.7 (x86, amd64)
        Microsoft Windows SDK 6.1 (x86, x64, ia64)
        Microsoft Windows SDK 7.0 (x86, x64, ia64)

    Microsoft Visual C++ 10.0:
        Microsoft Windows SDK 7.1 (x86, x64, ia64)

    Parameters
    ----------
    ver: float
        Required Microsoft Visual C++ version.
    arch: str
        Target architecture.

    Return
    ------
    environment: dict
    """
    # Try to get environment from vcvarsall.bat (Classical way)
    try:
        orig = get_unpatched(msvc9_query_vcvarsall)
        return orig(ver, arch, *args, **kwargs)
    except distutils.errors.DistutilsPlatformError:
        # Pass error if Vcvarsall.bat is missing
        pass
    except ValueError:
        # Pass error if environment not set after executing vcvarsall.bat
        pass

    # If error, try to set environment directly
    try:
        return EnvironmentInfo(arch, ver).return_env()
    except distutils.errors.DistutilsPlatformError as exc:
        _augment_exception(exc, ver, arch)
        raise


def msvc14_get_vc_env(plat_spec):
    """
    Patched "distutils._msvccompiler._get_vc_env" for support extra
    compilers.

    Set environment without use of "vcvarsall.bat".

    Known supported compilers
    -------------------------
    Microsoft Visual C++ 14.0:
        Microsoft Visual C++ Build Tools 2015 (x86, x64, arm)
        Microsoft Visual Studio 2017 (x86, x64, arm, arm64)
        Microsoft Visual Studio Build Tools 2017 (x86, x64, arm, arm64)

    Parameters
    ----------
    plat_spec: str
        Target architecture.

    Return
    ------
    environment: dict
    """
    # Try to get environment from vcvarsall.bat (Classical way)
    try:
        return get_unpatched(msvc14_get_vc_env)(plat_spec)
    except distutils.errors.DistutilsPlatformError:
        # Pass error if Vcvarsall.bat is missing
        pass

    # If error, try to set environment directly
    try:
        return EnvironmentInfo(plat_spec, vc_min_ver=14.0).return_env()
    except distutils.errors.DistutilsPlatformError as exc:
        _augment_exception(exc, 14.0)
        raise


def msvc14_gen_lib_options(*args, **kwargs):
    """
    Patched "distutils._msvccompiler.gen_lib_options" for fix
    compatibility between "numpy.distutils" and "distutils._msvccompiler"
    (for Numpy < 1.11.2)
    """
    if "numpy.distutils" in sys.modules:
        import numpy as np
        if LegacyVersion(np.__version__) < LegacyVersion('1.11.2'):
            return np.distutils.ccompiler.gen_lib_options(*args, **kwargs)
    return get_unpatched(msvc14_gen_lib_options)(*args, **kwargs)


def _augment_exception(exc, version, arch=''):
    """
    Add details to the exception message to help guide the user
    as to what action will resolve it.
    """
    # Error if MSVC++ directory not found or environment not set
    message = exc.args[0]

    if "vcvarsall" in message.lower() or "visual c" in message.lower():
        # Special error message if MSVC++ not installed
        tmpl = 'Microsoft Visual C++ {version:0.1f} is required.'
        message = tmpl.format(**locals())
        msdownload = 'www.microsoft.com/download/details.aspx?id=%d'
        if version == 9.0:
            if arch.lower().find('ia64') > -1:
                # For VC++ 9.0, if IA64 support is needed, redirect user
                # to Windows SDK 7.0
                message += ' Get it with "Microsoft Windows SDK 7.0": '
                message += msdownload % 3138
            else:
                # For VC++ 9.0 redirect user to Vc++ for Python 2.7 :
                # This redirection link is maintained by Microsoft.
                # Contact vspython@microsoft.com if it needs updating.
                message += ' Get it from http://aka.ms/vcpython27'
        elif version == 10.0:
            # For VC++ 10.0 Redirect user to Windows SDK 7.1
            message += ' Get it with "Microsoft Windows SDK 7.1": '
            message += msdownload % 8279
        elif version >= 14.0:
            # For VC++ 14.0 Redirect user to Visual C++ Build Tools
            message += (' Get it with "Microsoft Visual C++ Build Tools": '
                        r'http://landinghub.visualstudio.com/'
                        'visual-cpp-build-tools')

    exc.args = (message, )


class PlatformInfo:
    """
    Current and target architecture information.

    Parameters
    ----------
    arch: str
        Target architecture.
    """
    current_cpu = safe_env.get('processor_architecture', '').lower()

    def __init__(self, arch):
        self.arch = arch.lower().replace('x64', 'amd64')

    @property
    def target_cpu(self):
        return self.arch[self.arch.find('_') + 1:]

    def target_is_x86(self):
        return self.target_cpu == 'x86'

    def current_is_x86(self):
        return self.current_cpu == 'x86'

    def current_dir(self, hidex86=False, x64=False):
        """
        Current platform specific subfolder.

        Parameters
        ----------
        hidex86: bool
            return '' and not '\x86' if architecture is x86.
        x64: bool
            return '\x64' and not '\amd64' if architecture is amd64.

        Return
        ------
        subfolder: str
            '\target', or '' (see hidex86 parameter)
        """
        return (
            '' if (self.current_cpu == 'x86' and hidex86) else
            r'\x64' if (self.current_cpu == 'amd64' and x64) else
            r'\%s' % self.current_cpu
        )

    def target_dir(self, hidex86=False, x64=False):
        r"""
        Target platform specific subfolder.

        Parameters
        ----------
        hidex86: bool
            return '' and not '\x86' if architecture is x86.
        x64: bool
            return '\x64' and not '\amd64' if architecture is amd64.

        Return
        ------
        subfolder: str
            '\target', or '' (see hidex86 parameter)
        """
        return (
            '' if (self.target_cpu == 'x86' and hidex86) else
            r'\x64' if (self.target_cpu == 'amd64' and x64) else
            r'\%s' % self.target_cpu
        )

    def cross_dir(self, forcex86=False):
        r"""
        Cross platform specific subfolder.

        Parameters
        ----------
        forcex86: bool
            Use 'x86' as current architecture even if the current architecture is
            not x86.

        Return
        ------
        subfolder: str
            '' if target architecture is current architecture,
            '\current_target' if not.
        """
        current = 'x86' if forcex86 else self.current_cpu
        return (
            '' if self.target_cpu == current else
            self.target_dir().replace('\\', '\\%s_' % current)
        )
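
    # Illustrative sketch (hypothetical cross-compilation case): assuming the
    # current CPU is 'x86', PlatformInfo('x86_amd64') gives
    #   target_cpu           -> 'amd64'
    #   current_dir()        -> '\x86'   (or '' with hidex86=True)
    #   target_dir(x64=True) -> '\x64'
    #   cross_dir()          -> '\x86_amd64'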


class RegistryInfo:
    """
    Microsoft Visual Studio related registry information.

    Parameters
    ----------
    platform_info: PlatformInfo
        "PlatformInfo" instance.
    """
    HKEYS = (winreg.HKEY_USERS,
             winreg.HKEY_CURRENT_USER,
             winreg.HKEY_LOCAL_MACHINE,
             winreg.HKEY_CLASSES_ROOT)

    def __init__(self, platform_info):
        self.pi = platform_info

    @property
    def visualstudio(self):
        """
        Microsoft Visual Studio root registry key.
        """
        return 'VisualStudio'

    @property
    def sxs(self):
        """
        Microsoft Visual Studio SxS registry key.
        """
        return os.path.join(self.visualstudio, 'SxS')

    @property
    def vc(self):
        """
        Microsoft Visual C++ VC7 registry key.
        """
        return os.path.join(self.sxs, 'VC7')

    @property
    def vs(self):
        """
        Microsoft Visual Studio VS7 registry key.
        """
        return os.path.join(self.sxs, 'VS7')

    @property
    def vc_for_python(self):
        """
        Microsoft Visual C++ for Python registry key.
        """
        return r'DevDiv\VCForPython'

    @property
    def microsoft_sdk(self):
        """
        Microsoft SDK registry key.
        """
        return 'Microsoft SDKs'

    @property
    def windows_sdk(self):
        """
        Microsoft Windows/Platform SDK registry key.
        """
        return os.path.join(self.microsoft_sdk, 'Windows')

    @property
    def netfx_sdk(self):
        """
        Microsoft .NET Framework SDK registry key.
        """
        return os.path.join(self.microsoft_sdk, 'NETFXSDK')

    @property
    def windows_kits_roots(self):
        """
        Microsoft Windows Kits Roots registry key.
        """
        return r'Windows Kits\Installed Roots'

    def microsoft(self, key, x86=False):
        """
        Return key in Microsoft software registry.

        Parameters
        ----------
        key: str
            Registry key path where to look.
        x86: bool
            Force x86 software registry.

        Return
        ------
        str: value
        """
        node64 = '' if self.pi.current_is_x86() or x86 else 'Wow6432Node'
        return os.path.join('Software', node64, 'Microsoft', key)
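
    # Illustrative sketch: on a 64-bit host (current_is_x86() is False) and
    # with x86=False, microsoft(self.vc) resolves, on Windows, to
    #   'Software\Wow6432Node\Microsoft\VisualStudio\SxS\VC7'
    # while an x86 host (or x86=True) drops the 'Wow6432Node' component.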

    def lookup(self, key, name):
        """
        Look for values in registry in Microsoft software registry.

        Parameters
        ----------
        key: str
            Registry key path where to look.
        name: str
            Value name to find.

        Return
        ------
        str: value
        """
        KEY_READ = winreg.KEY_READ
        openkey = winreg.OpenKey
        ms = self.microsoft
        for hkey in self.HKEYS:
            try:
                bkey = openkey(hkey, ms(key), 0, KEY_READ)
            except (OSError, IOError):
                if not self.pi.current_is_x86():
                    try:
                        bkey = openkey(hkey, ms(key, True), 0, KEY_READ)
                    except (OSError, IOError):
                        continue
                else:
                    continue
            try:
                return winreg.QueryValueEx(bkey, name)[0]
            except (OSError, IOError):
                pass


class SystemInfo:
    """
    Microsoft Windows and Visual Studio related system information.

    Parameters
    ----------
    registry_info: RegistryInfo
        "RegistryInfo" instance.
    vc_ver: float
        Required Microsoft Visual C++ version.
    """

    # Variables and properties in this class use the original CamelCase variable
    # names from Microsoft source files for easier comparison.
    WinDir = safe_env.get('WinDir', '')
    ProgramFiles = safe_env.get('ProgramFiles', '')
    ProgramFilesx86 = safe_env.get('ProgramFiles(x86)', ProgramFiles)

    def __init__(self, registry_info, vc_ver=None):
        self.ri = registry_info
        self.pi = self.ri.pi
        self.vc_ver = vc_ver or self._find_latest_available_vc_ver()

    def _find_latest_available_vc_ver(self):
        try:
            return self.find_available_vc_vers()[-1]
        except IndexError:
            err = 'No Microsoft Visual C++ version found'
            raise distutils.errors.DistutilsPlatformError(err)

    def find_available_vc_vers(self):
        """
        Find all available Microsoft Visual C++ versions.
        """
        ms = self.ri.microsoft
        vckeys = (self.ri.vc, self.ri.vc_for_python, self.ri.vs)
        vc_vers = []
        for hkey in self.ri.HKEYS:
            for key in vckeys:
                try:
                    bkey = winreg.OpenKey(hkey, ms(key), 0, winreg.KEY_READ)
                except (OSError, IOError):
                    continue
                subkeys, values, _ = winreg.QueryInfoKey(bkey)
                for i in range(values):
                    try:
                        ver = float(winreg.EnumValue(bkey, i)[0])
                        if ver not in vc_vers:
                            vc_vers.append(ver)
                    except ValueError:
                        pass
                for i in range(subkeys):
                    try:
                        ver = float(winreg.EnumKey(bkey, i))
                        if ver not in vc_vers:
                            vc_vers.append(ver)
                    except ValueError:
                        pass
        return sorted(vc_vers)

    @property
    def VSInstallDir(self):
        """
        Microsoft Visual Studio directory.
        """
        # Default path
        name = 'Microsoft Visual Studio %0.1f' % self.vc_ver
        default = os.path.join(self.ProgramFilesx86, name)

        # Try to get path from registry, if fail use default path
        return self.ri.lookup(self.ri.vs, '%0.1f' % self.vc_ver) or default

    @property
    def VCInstallDir(self):
        """
        Microsoft Visual C++ directory.
        """
        self.VSInstallDir

        guess_vc = self._guess_vc() or self._guess_vc_legacy()

        # Try to get "VC++ for Python" path from registry as default path
        reg_path = os.path.join(self.ri.vc_for_python, '%0.1f' % self.vc_ver)
        python_vc = self.ri.lookup(reg_path, 'installdir')
        default_vc = os.path.join(python_vc, 'VC') if python_vc else guess_vc

        # Try to get path from registry, if fail use default path
        path = self.ri.lookup(self.ri.vc, '%0.1f' % self.vc_ver) or default_vc

        if not os.path.isdir(path):
            msg = 'Microsoft Visual C++ directory not found'
            raise distutils.errors.DistutilsPlatformError(msg)

        return path

    def _guess_vc(self):
        """
        Locate Visual C for 2017
        """
        if self.vc_ver <= 14.0:
            return

        default = r'VC\Tools\MSVC'
        guess_vc = os.path.join(self.VSInstallDir, default)
        # Subdir with VC exact version as name
        try:
            vc_exact_ver = os.listdir(guess_vc)[-1]
            return os.path.join(guess_vc, vc_exact_ver)
        except (OSError, IOError, IndexError):
            pass

    def _guess_vc_legacy(self):
        """
        Locate Visual C for versions prior to 2017
        """
        default = r'Microsoft Visual Studio %0.1f\VC' % self.vc_ver
        return os.path.join(self.ProgramFilesx86, default)

    @property
    def WindowsSdkVersion(self):
        """
        Microsoft Windows SDK versions for specified MSVC++ version.
        """
        if self.vc_ver <= 9.0:
            return ('7.0', '6.1', '6.0a')
        elif self.vc_ver == 10.0:
            return ('7.1', '7.0a')
        elif self.vc_ver == 11.0:
            return ('8.0', '8.0a')
        elif self.vc_ver == 12.0:
            return ('8.1', '8.1a')
        elif self.vc_ver >= 14.0:
            return ('10.0', '8.1')

    @property
    def WindowsSdkLastVersion(self):
        """
        Microsoft Windows SDK last version
        """
        return self._use_last_dir_name(os.path.join(
            self.WindowsSdkDir, 'lib'))

    @property
    def WindowsSdkDir(self):
        """
        Microsoft Windows SDK directory.
        """
        sdkdir = ''
        for ver in self.WindowsSdkVersion:
            # Try to get it from registry
            loc = os.path.join(self.ri.windows_sdk, 'v%s' % ver)
            sdkdir = self.ri.lookup(loc, 'installationfolder')
            if sdkdir:
                break
        if not sdkdir or not os.path.isdir(sdkdir):
            # Try to get "VC++ for Python" version from registry
            path = os.path.join(self.ri.vc_for_python, '%0.1f' % self.vc_ver)
            install_base = self.ri.lookup(path, 'installdir')
            if install_base:
                sdkdir = os.path.join(install_base, 'WinSDK')
        if not sdkdir or not os.path.isdir(sdkdir):
            # If fail, use default new path
            for ver in self.WindowsSdkVersion:
                intver = ver[:ver.rfind('.')]
                path = r'Microsoft SDKs\Windows Kits\%s' % (intver)
                d = os.path.join(self.ProgramFiles, path)
                if os.path.isdir(d):
                    sdkdir = d
        if not sdkdir or not os.path.isdir(sdkdir):
            # If fail, use default old path
            for ver in self.WindowsSdkVersion:
                path = r'Microsoft SDKs\Windows\v%s' % ver
                d = os.path.join(self.ProgramFiles, path)
                if os.path.isdir(d):
                    sdkdir = d
        if not sdkdir:
            # If fail, use Platform SDK
            sdkdir = os.path.join(self.VCInstallDir, 'PlatformSDK')
        return sdkdir

    @property
    def WindowsSDKExecutablePath(self):
        """
        Microsoft Windows SDK executable directory.
        """
        # Find WinSDK NetFx Tools registry dir name
        if self.vc_ver <= 11.0:
            netfxver = 35
            arch = ''
        else:
            netfxver = 40
            hidex86 = True if self.vc_ver <= 12.0 else False
            arch = self.pi.current_dir(x64=True, hidex86=hidex86)
        fx = 'WinSDK-NetFx%dTools%s' % (netfxver, arch.replace('\\', '-'))

        # List all possible registry paths
        regpaths = []
        if self.vc_ver >= 14.0:
            for ver in self.NetFxSdkVersion:
                regpaths += [os.path.join(self.ri.netfx_sdk, ver, fx)]

        for ver in self.WindowsSdkVersion:
            regpaths += [os.path.join(self.ri.windows_sdk, 'v%sA' % ver, fx)]

        # Return installation folder from the most recent path
        for path in regpaths:
            execpath = self.ri.lookup(path, 'installationfolder')
            if execpath:
                break
        return execpath

    @property
    def FSharpInstallDir(self):
        """
        Microsoft Visual F# directory.
        """
        path = r'%0.1f\Setup\F#' % self.vc_ver
        path = os.path.join(self.ri.visualstudio, path)
        return self.ri.lookup(path, 'productdir') or ''

    @property
    def UniversalCRTSdkDir(self):
        """
        Microsoft Universal CRT SDK directory.
        """
        # Set Kit Roots versions for specified MSVC++ version
        if self.vc_ver >= 14.0:
            vers = ('10', '81')
        else:
            vers = ()

        # Find path of the most recent Kit
        for ver in vers:
            sdkdir = self.ri.lookup(self.ri.windows_kits_roots,
                                    'kitsroot%s' % ver)
            if sdkdir:
                break
        return sdkdir or ''

    @property
    def UniversalCRTSdkLastVersion(self):
        """
        Microsoft Universal C Runtime SDK last version
        """
        return self._use_last_dir_name(os.path.join(
            self.UniversalCRTSdkDir, 'lib'))

    @property
    def NetFxSdkVersion(self):
        """
        Microsoft .NET Framework SDK versions.
        """
        # Set FxSdk versions for specified MSVC++ version
        if self.vc_ver >= 14.0:
            return ('4.6.1', '4.6')
        else:
            return ()

    @property
    def NetFxSdkDir(self):
        """
        Microsoft .NET Framework SDK directory.
        """
        for ver in self.NetFxSdkVersion:
            loc = os.path.join(self.ri.netfx_sdk, ver)
            sdkdir = self.ri.lookup(loc, 'kitsinstallationfolder')
            if sdkdir:
                break
        return sdkdir or ''

    @property
    def FrameworkDir32(self):
        """
        Microsoft .NET Framework 32bit directory.
        """
        # Default path
        guess_fw = os.path.join(self.WinDir, r'Microsoft.NET\Framework')

        # Try to get path from registry, if fail use default path
        return self.ri.lookup(self.ri.vc, 'frameworkdir32') or guess_fw

    @property
    def FrameworkDir64(self):
        """
        Microsoft .NET Framework 64bit directory.
        """
        # Default path
        guess_fw = os.path.join(self.WinDir, r'Microsoft.NET\Framework64')

        # Try to get path from registry, if fail use default path
        return self.ri.lookup(self.ri.vc, 'frameworkdir64') or guess_fw

    @property
    def FrameworkVersion32(self):
        """
        Microsoft .NET Framework 32bit versions.
        """
        return self._find_dot_net_versions(32)

    @property
    def FrameworkVersion64(self):
        """
        Microsoft .NET Framework 64bit versions.
        """
        return self._find_dot_net_versions(64)

    def _find_dot_net_versions(self, bits):
        """
        Find Microsoft .NET Framework versions.

        Parameters
        ----------
        bits: int
            Platform number of bits: 32 or 64.
        """
        # Find actual .NET version in registry
        reg_ver = self.ri.lookup(self.ri.vc, 'frameworkver%d' % bits)
        dot_net_dir = getattr(self, 'FrameworkDir%d' % bits)
        ver = reg_ver or self._use_last_dir_name(dot_net_dir, 'v') or ''

        # Set .NET versions for specified MSVC++ version
        if self.vc_ver >= 12.0:
            frameworkver = (ver, 'v4.0')
        elif self.vc_ver >= 10.0:
            frameworkver = ('v4.0.30319' if ver.lower()[:2] != 'v4' else ver,
                            'v3.5')
        elif self.vc_ver == 9.0:
            frameworkver = ('v3.5', 'v2.0.50727')
        elif self.vc_ver == 8.0:
            frameworkver = ('v3.0', 'v2.0.50727')
        return frameworkver
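
    # Illustrative sketch (hypothetical registry value): with vc_ver 14.0 and a
    # registered framework version of 'v4.7.2' this returns ('v4.7.2', 'v4.0');
    # with vc_ver 9.0 it returns ('v3.5', 'v2.0.50727').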

    def _use_last_dir_name(self, path, prefix=''):
        """
        Return name of the last dir in path or '' if no dir found.

        Parameters
        ----------
        path: str
            Use dirs in this path
        prefix: str
            Use only dirs starting with this prefix
        """
        matching_dirs = (
            dir_name
            for dir_name in reversed(os.listdir(path))
            if os.path.isdir(os.path.join(path, dir_name)) and
            dir_name.startswith(prefix)
        )
        return next(matching_dirs, None) or ''
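
    # Illustrative sketch (hypothetical listing): if os.listdir(path) returns
    # ['v2.0.50727', 'v3.5', 'v4.0.30319'] and each entry is a directory,
    # _use_last_dir_name(path, 'v') returns 'v4.0.30319', because the listing
    # is scanned in reverse order for the first match.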


class EnvironmentInfo:
    """
    Return environment variables for specified Microsoft Visual C++ version
    and platform: Lib, Include, Path and libpath.

    This class is compatible with Microsoft Visual C++ 9.0 to 14.0.

    Script created by analysing Microsoft environment configuration files like
    "vcvars[...].bat", "SetEnv.Cmd", "vcbuildtools.bat", ...

    Parameters
    ----------
    arch: str
        Target architecture.
    vc_ver: float
        Required Microsoft Visual C++ version. If not set, autodetect the last
        version.
    vc_min_ver: float
        Minimum Microsoft Visual C++ version.
    """

    # Variables and properties in this class use the original CamelCase variable
    # names from Microsoft source files for easier comparison.

    def __init__(self, arch, vc_ver=None, vc_min_ver=0):
        self.pi = PlatformInfo(arch)
        self.ri = RegistryInfo(self.pi)
        self.si = SystemInfo(self.ri, vc_ver)

        if self.vc_ver < vc_min_ver:
            err = 'No suitable Microsoft Visual C++ version found'
            raise distutils.errors.DistutilsPlatformError(err)

    @property
    def vc_ver(self):
        """
        Microsoft Visual C++ version.
        """
        return self.si.vc_ver

    @property
    def VSTools(self):
        """
        Microsoft Visual Studio Tools
        """
        paths = [r'Common7\IDE', r'Common7\Tools']

        if self.vc_ver >= 14.0:
            arch_subdir = self.pi.current_dir(hidex86=True, x64=True)
            paths += [r'Common7\IDE\CommonExtensions\Microsoft\TestWindow']
            paths += [r'Team Tools\Performance Tools']
            paths += [r'Team Tools\Performance Tools%s' % arch_subdir]

        return [os.path.join(self.si.VSInstallDir, path) for path in paths]

    @property
    def VCIncludes(self):
        """
        Microsoft Visual C++ & Microsoft Foundation Class Includes
        """
        return [os.path.join(self.si.VCInstallDir, 'Include'),
                os.path.join(self.si.VCInstallDir, r'ATLMFC\Include')]

    @property
    def VCLibraries(self):
        """
        Microsoft Visual C++ & Microsoft Foundation Class Libraries
        """
        if self.vc_ver >= 15.0:
            arch_subdir = self.pi.target_dir(x64=True)
        else:
            arch_subdir = self.pi.target_dir(hidex86=True)
        paths = ['Lib%s' % arch_subdir, r'ATLMFC\Lib%s' % arch_subdir]

        if self.vc_ver >= 14.0:
            paths += [r'Lib\store%s' % arch_subdir]

        return [os.path.join(self.si.VCInstallDir, path) for path in paths]

    @property
    def VCStoreRefs(self):
        """
        Microsoft Visual C++ store references Libraries
        """
        if self.vc_ver < 14.0:
            return []
        return [os.path.join(self.si.VCInstallDir, r'Lib\store\references')]

    @property
    def VCTools(self):
        """
        Microsoft Visual C++ Tools
        """
        si = self.si
        tools = [os.path.join(si.VCInstallDir, 'VCPackages')]

        forcex86 = True if self.vc_ver <= 10.0 else False
        arch_subdir = self.pi.cross_dir(forcex86)
        if arch_subdir:
            tools += [os.path.join(si.VCInstallDir, 'Bin%s' % arch_subdir)]

        if self.vc_ver == 14.0:
            path = 'Bin%s' % self.pi.current_dir(hidex86=True)
            tools += [os.path.join(si.VCInstallDir, path)]

        elif self.vc_ver >= 15.0:
            host_dir = (r'bin\HostX86%s' if self.pi.current_is_x86() else
                        r'bin\HostX64%s')
            tools += [os.path.join(
                si.VCInstallDir, host_dir % self.pi.target_dir(x64=True))]

            if self.pi.current_cpu != self.pi.target_cpu:
                tools += [os.path.join(
                    si.VCInstallDir, host_dir % self.pi.current_dir(x64=True))]

        else:
            tools += [os.path.join(si.VCInstallDir, 'Bin')]

        return tools

    @property
    def OSLibraries(self):
        """
        Microsoft Windows SDK Libraries
        """
        if self.vc_ver <= 10.0:
            arch_subdir = self.pi.target_dir(hidex86=True, x64=True)
            return [os.path.join(self.si.WindowsSdkDir, 'Lib%s' % arch_subdir)]

        else:
            arch_subdir = self.pi.target_dir(x64=True)
            lib = os.path.join(self.si.WindowsSdkDir, 'lib')
            libver = self._sdk_subdir
            return [os.path.join(lib, '%sum%s' % (libver, arch_subdir))]

    @property
    def OSIncludes(self):
        """
        Microsoft Windows SDK Include
        """
        include = os.path.join(self.si.WindowsSdkDir, 'include')

        if self.vc_ver <= 10.0:
            return [include, os.path.join(include, 'gl')]

        else:
            if self.vc_ver >= 14.0:
                sdkver = self._sdk_subdir
            else:
                sdkver = ''
            return [os.path.join(include, '%sshared' % sdkver),
                    os.path.join(include, '%sum' % sdkver),
                    os.path.join(include, '%swinrt' % sdkver)]

    @property
    def OSLibpath(self):
        """
        Microsoft Windows SDK Libraries Paths
        """
        ref = os.path.join(self.si.WindowsSdkDir, 'References')
        libpath = []

        if self.vc_ver <= 9.0:
            libpath += self.OSLibraries

        if self.vc_ver >= 11.0:
            libpath += [os.path.join(ref, r'CommonConfiguration\Neutral')]

        if self.vc_ver >= 14.0:
            libpath += [
                ref,
                os.path.join(self.si.WindowsSdkDir, 'UnionMetadata'),
                os.path.join(
                    ref,
                    'Windows.Foundation.UniversalApiContract',
                    '1.0.0.0',
                ),
                os.path.join(
                    ref,
                    'Windows.Foundation.FoundationContract',
                    '1.0.0.0',
                ),
                os.path.join(
                    ref,
                    'Windows.Networking.Connectivity.WwanContract',
                    '1.0.0.0',
                ),
                os.path.join(
                    self.si.WindowsSdkDir,
                    'ExtensionSDKs',
                    'Microsoft.VCLibs',
                    '%0.1f' % self.vc_ver,
                    'References',
                    'CommonConfiguration',
                    'neutral',
                ),
            ]
        return libpath

    @property
    def SdkTools(self):
        """
        Microsoft Windows SDK Tools
        """
        return list(self._sdk_tools())

    def _sdk_tools(self):
        """
        Microsoft Windows SDK Tools paths generator
        """
        if self.vc_ver < 15.0:
            bin_dir = 'Bin' if self.vc_ver <= 11.0 else r'Bin\x86'
            yield os.path.join(self.si.WindowsSdkDir, bin_dir)

        if not self.pi.current_is_x86():
            arch_subdir = self.pi.current_dir(x64=True)
            path = 'Bin%s' % arch_subdir
            yield os.path.join(self.si.WindowsSdkDir, path)

        if self.vc_ver == 10.0 or self.vc_ver == 11.0:
            if self.pi.target_is_x86():
                arch_subdir = ''
            else:
                arch_subdir = self.pi.current_dir(hidex86=True, x64=True)
            path = r'Bin\NETFX 4.0 Tools%s' % arch_subdir
            yield os.path.join(self.si.WindowsSdkDir, path)

        elif self.vc_ver >= 15.0:
            path = os.path.join(self.si.WindowsSdkDir, 'Bin')
            arch_subdir = self.pi.current_dir(x64=True)
            sdkver = self.si.WindowsSdkLastVersion
            yield os.path.join(path, '%s%s' % (sdkver, arch_subdir))

        if self.si.WindowsSDKExecutablePath:
            yield self.si.WindowsSDKExecutablePath

    @property
    def _sdk_subdir(self):
        """
        Microsoft Windows SDK version subdir
        """
        ucrtver = self.si.WindowsSdkLastVersion
        return ('%s\\' % ucrtver) if ucrtver else ''

    @property
    def SdkSetup(self):
        """
        Microsoft Windows SDK Setup
        """
        if self.vc_ver > 9.0:
            return []

        return [os.path.join(self.si.WindowsSdkDir, 'Setup')]

    @property
    def FxTools(self):
        """
        Microsoft .NET Framework Tools
        """
        pi = self.pi
        si = self.si

        if self.vc_ver <= 10.0:
            include32 = True
            include64 = not pi.target_is_x86() and not pi.current_is_x86()
        else:
            include32 = pi.target_is_x86() or pi.current_is_x86()
            include64 = pi.current_cpu == 'amd64' or pi.target_cpu == 'amd64'

        tools = []
        if include32:
            tools += [os.path.join(si.FrameworkDir32, ver)
                      for ver in si.FrameworkVersion32]
        if include64:
            tools += [os.path.join(si.FrameworkDir64, ver)
                      for ver in si.FrameworkVersion64]
        return tools

    @property
    def NetFxSDKLibraries(self):
        """
        Microsoft .Net Framework SDK Libraries
        """
        if self.vc_ver < 14.0 or not self.si.NetFxSdkDir:
            return []

        arch_subdir = self.pi.target_dir(x64=True)
        return [os.path.join(self.si.NetFxSdkDir, r'lib\um%s' % arch_subdir)]

    @property
    def NetFxSDKIncludes(self):
        """
        Microsoft .Net Framework SDK Includes
        """
        if self.vc_ver < 14.0 or not self.si.NetFxSdkDir:
            return []

        return [os.path.join(self.si.NetFxSdkDir, r'include\um')]

    @property
    def VsTDb(self):
        """
        Microsoft Visual Studio Team System Database
        """
        return [os.path.join(self.si.VSInstallDir, r'VSTSDB\Deploy')]

    @property
    def MSBuild(self):
        """
        Microsoft Build Engine
        """
        if self.vc_ver < 12.0:
            return []
        elif self.vc_ver < 15.0:
            base_path = self.si.ProgramFilesx86
            arch_subdir = self.pi.current_dir(hidex86=True)
        else:
            base_path = self.si.VSInstallDir
            arch_subdir = ''

        path = r'MSBuild\%0.1f\bin%s' % (self.vc_ver, arch_subdir)
        build = [os.path.join(base_path, path)]

        if self.vc_ver >= 15.0:
            # Add Roslyn C# & Visual Basic Compiler
            build += [os.path.join(base_path, path, 'Roslyn')]

        return build

    @property
    def HTMLHelpWorkshop(self):
        """
        Microsoft HTML Help Workshop
        """
        if self.vc_ver < 11.0:
            return []

        return [os.path.join(self.si.ProgramFilesx86, 'HTML Help Workshop')]

    @property
    def UCRTLibraries(self):
        """
        Microsoft Universal C Runtime SDK Libraries
        """
        if self.vc_ver < 14.0:
            return []

        arch_subdir = self.pi.target_dir(x64=True)
        lib = os.path.join(self.si.UniversalCRTSdkDir, 'lib')
        ucrtver = self._ucrt_subdir
        return [os.path.join(lib, '%sucrt%s' % (ucrtver, arch_subdir))]

    @property
    def UCRTIncludes(self):
        """
        Microsoft Universal C Runtime SDK Include
        """
        if self.vc_ver < 14.0:
            return []

        include = os.path.join(self.si.UniversalCRTSdkDir, 'include')
        return [os.path.join(include, '%sucrt' % self._ucrt_subdir)]

    @property
    def _ucrt_subdir(self):
        """
        Microsoft Universal C Runtime SDK version subdir
        """
        ucrtver = self.si.UniversalCRTSdkLastVersion
        return ('%s\\' % ucrtver) if ucrtver else ''

    @property
    def FSharp(self):
        """
        Microsoft Visual F#
        """
        # Visual F# is only bundled with the VC++ 11.0 and 12.0 toolsets
        if not (11.0 <= self.vc_ver <= 12.0):
            return []

        return [self.si.FSharpInstallDir]

    @property
    def VCRuntimeRedist(self):
        """
        Microsoft Visual C++ runtime redistributable DLL
        """
        arch_subdir = self.pi.target_dir(x64=True)
        if self.vc_ver < 15:
            redist_path = self.si.VCInstallDir
            vcruntime = 'redist%s\\Microsoft.VC%d0.CRT\\vcruntime%d0.dll'
        else:
            redist_path = self.si.VCInstallDir.replace('\\Tools', '\\Redist')
            vcruntime = 'onecore%s\\Microsoft.VC%d0.CRT\\vcruntime%d0.dll'

        # Visual Studio 2017 is still Visual C++ 14.0
        dll_ver = 14.0 if self.vc_ver == 15 else self.vc_ver

        vcruntime = vcruntime % (arch_subdir, self.vc_ver, dll_ver)
        return os.path.join(redist_path, vcruntime)

    def return_env(self, exists=True):
        """
        Return environment dict.

        Parameters
        ----------
        exists: bool
            If True, only return existing paths.
        """
        env = dict(
            include=self._build_paths('include',
                                      [self.VCIncludes,
                                       self.OSIncludes,
                                       self.UCRTIncludes,
                                       self.NetFxSDKIncludes],
                                      exists),
            lib=self._build_paths('lib',
                                  [self.VCLibraries,
                                   self.OSLibraries,
                                   self.FxTools,
                                   self.UCRTLibraries,
                                   self.NetFxSDKLibraries],
                                  exists),
            libpath=self._build_paths('libpath',
                                      [self.VCLibraries,
                                       self.FxTools,
                                       self.VCStoreRefs,
                                       self.OSLibpath],
                                      exists),
            path=self._build_paths('path',
                                   [self.VCTools,
                                    self.VSTools,
                                    self.VsTDb,
                                    self.SdkTools,
                                    self.SdkSetup,
                                    self.FxTools,
                                    self.MSBuild,
                                    self.HTMLHelpWorkshop,
                                    self.FSharp],
                                   exists),
        )
        if self.vc_ver >= 14 and os.path.isfile(self.VCRuntimeRedist):
            env['py_vcruntime_redist'] = self.VCRuntimeRedist
        return env

    def _build_paths(self, name, spec_path_lists, exists):
        """
        Given an environment variable name and specified paths,
        return a pathsep-separated string of paths containing
        unique, extant directories from those paths and from
        the environment variable. Raise an error if no paths
        are resolved.
        """
        # flatten spec_path_lists
        spec_paths = itertools.chain.from_iterable(spec_path_lists)
        env_paths = safe_env.get(name, '').split(os.pathsep)
        paths = itertools.chain(spec_paths, env_paths)
        extant_paths = list(filter(os.path.isdir, paths)) if exists else paths
        if not extant_paths:
            msg = "%s environment variable is empty" % name.upper()
            raise distutils.errors.DistutilsPlatformError(msg)
        unique_paths = self._unique_everseen(extant_paths)
        return os.pathsep.join(unique_paths)

    # from Python docs
    def _unique_everseen(self, iterable, key=None):
        """
        List unique elements, preserving order.
        Remember all elements ever seen.

        _unique_everseen('AAAABBBCCDAABBB') --> A B C D

        _unique_everseen('ABBCcAD', str.lower) --> A B C D
        """
        seen = set()
        seen_add = seen.add
        if key is None:
            for element in filterfalse(seen.__contains__, iterable):
                seen_add(element)
                yield element
        else:
            for element in iterable:
                k = key(element)
                if k not in seen:
                    seen_add(k)
                    yield element
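
    # Usage sketch (illustrative addition, not part of the upstream module).
    # It assumes this class is setuptools.msvc.EnvironmentInfo (defined
    # earlier in this file) and that a matching Visual C++ toolchain is
    # actually installed; otherwise construction raises an error.
    #
    #     from setuptools.msvc import EnvironmentInfo
    #
    #     env = EnvironmentInfo('x86_amd64').return_env()  # existing paths only
    #     print(env['include'])  # os.pathsep-joined INCLUDE directories
    #     print(env['path'])     # PATH entries for the compiler, SDK tools, etc.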
site-packages/setuptools/script.tmpl000064400000000212147511334630013711 0ustar00# EASY-INSTALL-SCRIPT: %(spec)r,%(script_name)r
__requires__ = %(spec)r
__import__('pkg_resources').run_script(%(spec)r, %(script_name)r)
site-packages/setuptools/pep425tags.py000064400000025171147511334630013772 0ustar00# This file originally from pip:
# https://github.com/pypa/pip/blob/8f4f15a5a95d7d5b511ceaee9ed261176c181970/src/pip/_internal/pep425tags.py
"""Generate and work with PEP 425 Compatibility Tags."""
from __future__ import absolute_import

import distutils.util
from distutils import log
import platform
import re
import sys
import sysconfig
import warnings
from collections import OrderedDict

from . import glibc

_osx_arch_pat = re.compile(r'(.+)_(\d+)_(\d+)_(.+)')


def get_config_var(var):
    try:
        return sysconfig.get_config_var(var)
    except IOError as e:  # Issue #1074
        warnings.warn("{}".format(e), RuntimeWarning)
        return None


def get_abbr_impl():
    """Return abbreviated implementation name."""
    if hasattr(sys, 'pypy_version_info'):
        pyimpl = 'pp'
    elif sys.platform.startswith('java'):
        pyimpl = 'jy'
    elif sys.platform == 'cli':
        pyimpl = 'ip'
    else:
        pyimpl = 'cp'
    return pyimpl


def get_impl_ver():
    """Return implementation version."""
    impl_ver = get_config_var("py_version_nodot")
    if not impl_ver or get_abbr_impl() == 'pp':
        impl_ver = ''.join(map(str, get_impl_version_info()))
    return impl_ver


def get_impl_version_info():
    """Return sys.version_info-like tuple for use in decrementing the minor
    version."""
    if get_abbr_impl() == 'pp':
        # as per https://github.com/pypa/pip/issues/2882
        return (sys.version_info[0], sys.pypy_version_info.major,
                sys.pypy_version_info.minor)
    else:
        return sys.version_info[0], sys.version_info[1]


def get_impl_tag():
    """
    Returns the Tag for this specific implementation.
    """
    return "{}{}".format(get_abbr_impl(), get_impl_ver())


def get_flag(var, fallback, expected=True, warn=True):
    """Use a fallback method for determining SOABI flags if the needed config
    var is unset or unavailable."""
    val = get_config_var(var)
    if val is None:
        if warn:
            log.debug("Config variable '%s' is unset, Python ABI tag may "
                      "be incorrect", var)
        return fallback()
    return val == expected


def get_abi_tag():
    """Return the ABI tag based on SOABI (if available) or emulate SOABI
    (CPython 2, PyPy)."""
    soabi = get_config_var('SOABI')
    impl = get_abbr_impl()
    if not soabi and impl in {'cp', 'pp'} and hasattr(sys, 'maxunicode'):
        d = ''
        m = ''
        u = ''
        if get_flag('Py_DEBUG',
                    lambda: hasattr(sys, 'gettotalrefcount'),
                    warn=(impl == 'cp')):
            d = 'd'
        if get_flag('WITH_PYMALLOC',
                    lambda: impl == 'cp',
                    warn=(impl == 'cp')):
            m = 'm'
        if get_flag('Py_UNICODE_SIZE',
                    lambda: sys.maxunicode == 0x10ffff,
                    expected=4,
                    warn=(impl == 'cp' and
                          sys.version_info < (3, 3))) \
                and sys.version_info < (3, 3):
            u = 'u'
        abi = '%s%s%s%s%s' % (impl, get_impl_ver(), d, m, u)
    elif soabi and soabi.startswith('cpython-'):
        abi = 'cp' + soabi.split('-')[1]
    elif soabi:
        abi = soabi.replace('.', '_').replace('-', '_')
    else:
        abi = None
    return abi


def _is_running_32bit():
    return sys.maxsize == 2147483647


def get_platform():
    """Return our platform name 'win32', 'linux_x86_64'"""
    if sys.platform == 'darwin':
        # distutils.util.get_platform() returns the release based on the value
        # of MACOSX_DEPLOYMENT_TARGET on which Python was built, which may
        # be significantly older than the user's current machine.
        release, _, machine = platform.mac_ver()
        split_ver = release.split('.')

        if machine == "x86_64" and _is_running_32bit():
            machine = "i386"
        elif machine == "ppc64" and _is_running_32bit():
            machine = "ppc"

        return 'macosx_{}_{}_{}'.format(split_ver[0], split_ver[1], machine)

    # XXX remove distutils dependency
    result = distutils.util.get_platform().replace('.', '_').replace('-', '_')
    if result == "linux_x86_64" and _is_running_32bit():
        # 32 bit Python program (running on a 64 bit Linux): pip should only
        # install and run 32 bit compiled extensions in that case.
        result = "linux_i686"

    return result


def is_manylinux1_compatible():
    # Only Linux, and only x86-64 / i686
    if get_platform() not in {"linux_x86_64", "linux_i686"}:
        return False

    # Check for presence of _manylinux module
    try:
        import _manylinux
        return bool(_manylinux.manylinux1_compatible)
    except (ImportError, AttributeError):
        # Fall through to heuristic check below
        pass

    # Check glibc version. CentOS 5 uses glibc 2.5.
    return glibc.have_compatible_glibc(2, 5)


def get_darwin_arches(major, minor, machine):
    """Return a list of supported arches (including group arches) for
    the given major, minor and machine architecture of a macOS machine.
    """
    arches = []

    def _supports_arch(major, minor, arch):
        # Looking at the application support for macOS versions in the chart
        # provided by https://en.wikipedia.org/wiki/OS_X#Versions it appears
        # our timeline looks roughly like:
        #
        # 10.0 - Introduces ppc support.
        # 10.4 - Introduces ppc64, i386, and x86_64 support, however the ppc64
        #        and x86_64 support is CLI only, and cannot be used for GUI
        #        applications.
        # 10.5 - Extends ppc64 and x86_64 support to cover GUI applications.
        # 10.6 - Drops support for ppc64
        # 10.7 - Drops support for ppc
        #
        # Given that we do not know if we're installing a CLI or a GUI
        # application, we must be conservative and assume it might be a GUI
        # application and behave as if ppc64 and x86_64 support did not occur
        # until 10.5.
        #
        # Note: The above information is taken from the "Application support"
        #       column in the chart, not the "Processor support" column, since
        #       I believe we care about what instruction sets an application
        #       can use, not which processors the OS supports.
        if arch == 'ppc':
            return (major, minor) <= (10, 5)
        if arch == 'ppc64':
            return (major, minor) == (10, 5)
        if arch == 'i386':
            return (major, minor) >= (10, 4)
        if arch == 'x86_64':
            return (major, minor) >= (10, 5)
        if arch in groups:
            for garch in groups[arch]:
                if _supports_arch(major, minor, garch):
                    return True
        return False

    groups = OrderedDict([
        ("fat", ("i386", "ppc")),
        ("intel", ("x86_64", "i386")),
        ("fat64", ("x86_64", "ppc64")),
        ("fat32", ("x86_64", "i386", "ppc")),
    ])

    if _supports_arch(major, minor, machine):
        arches.append(machine)

    for garch in groups:
        if machine in groups[garch] and _supports_arch(major, minor, garch):
            arches.append(garch)

    arches.append('universal')

    return arches

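# Illustrative example (added; values follow from the rules above): on a
# 64-bit build running on macOS 10.9, get_darwin_arches(10, 9, 'x86_64')
# returns the machine arch first, then each supported group arch that
# contains it, then the catch-all tag:
#
#     ['x86_64', 'intel', 'fat64', 'fat32', 'universal']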

def get_supported(versions=None, noarch=False, platform=None,
                  impl=None, abi=None):
    """Return a list of supported tags for each version specified in
    `versions`.

    :param versions: a list of string versions, of the form ["33", "32"],
        or None. The first version will be assumed to support our ABI.
    :param platform: specify the exact platform you want valid
        tags for, or None. If None, use the local system platform.
    :param impl: specify the exact implementation you want valid
        tags for, or None. If None, use the local interpreter impl.
    :param abi: specify the exact abi you want valid
        tags for, or None. If None, use the local interpreter abi.
    """
    supported = []

    # Versions must be given with respect to the preference
    if versions is None:
        versions = []
        version_info = get_impl_version_info()
        major = version_info[:-1]
        # Support all previous minor Python versions.
        for minor in range(version_info[-1], -1, -1):
            versions.append(''.join(map(str, major + (minor,))))

    impl = impl or get_abbr_impl()

    abis = []

    abi = abi or get_abi_tag()
    if abi:
        abis[0:0] = [abi]

    abi3s = set()
    import imp
    for suffix in imp.get_suffixes():
        if suffix[0].startswith('.abi'):
            abi3s.add(suffix[0].split('.', 2)[1])

    abis.extend(sorted(list(abi3s)))

    abis.append('none')

    if not noarch:
        arch = platform or get_platform()
        if arch.startswith('macosx'):
            # support macosx-10.6-intel on macosx-10.9-x86_64
            match = _osx_arch_pat.match(arch)
            if match:
                name, major, minor, actual_arch = match.groups()
                tpl = '{}_{}_%i_%s'.format(name, major)
                arches = []
                for m in reversed(range(int(minor) + 1)):
                    for a in get_darwin_arches(int(major), m, actual_arch):
                        arches.append(tpl % (m, a))
            else:
                # arch pattern didn't match (?!)
                arches = [arch]
        elif platform is None and is_manylinux1_compatible():
            arches = [arch.replace('linux', 'manylinux1'), arch]
        else:
            arches = [arch]

        # Current version, current API (built specifically for our Python):
        for abi in abis:
            for arch in arches:
                supported.append(('%s%s' % (impl, versions[0]), abi, arch))

        # abi3 modules compatible with older version of Python
        for version in versions[1:]:
            # abi3 was introduced in Python 3.2
            if version in {'31', '30'}:
                break
            for abi in abi3s:   # empty set if not Python 3
                for arch in arches:
                    supported.append(("%s%s" % (impl, version), abi, arch))

        # Has binaries, does not use the Python API:
        for arch in arches:
            supported.append(('py%s' % (versions[0][0]), 'none', arch))

    # No abi / arch, but requires our implementation:
    supported.append(('%s%s' % (impl, versions[0]), 'none', 'any'))
    # Tagged specifically as being cross-version compatible
    # (with just the major version specified)
    supported.append(('%s%s' % (impl, versions[0][0]), 'none', 'any'))

    # No abi / arch, generic Python
    for i, version in enumerate(versions):
        supported.append(('py%s' % (version,), 'none', 'any'))
        if i == 0:
            supported.append(('py%s' % (version[0]), 'none', 'any'))

    return supported


implementation_tag = get_impl_tag()
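
# Usage sketch (illustrative addition; the output depends on the interpreter
# and platform it runs on):
#
#     >>> from setuptools.pep425tags import get_impl_tag, get_supported
#     >>> get_impl_tag()      # e.g. 'cp36' on CPython 3.6
#     'cp36'
#     >>> get_supported()[0]  # most specific tag first, e.g. on 64-bit Linux
#     ('cp36', 'cp36m', 'manylinux1_x86_64')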
site-packages/setuptools/script (dev).tmpl000064400000000311147511334630014571 0ustar00# EASY-INSTALL-DEV-SCRIPT: %(spec)r,%(script_name)r
__requires__ = %(spec)r
__import__('pkg_resources').require(%(spec)r)
__file__ = %(dev_path)r
exec(compile(open(__file__).read(), __file__, 'exec'))
site-packages/setuptools/__init__.py000064400000013104147511334630013624 0ustar00"""Extensions to the 'distutils' for large or complex distributions"""

import os
import functools
import distutils.core
import distutils.filelist
from distutils.util import convert_path
from fnmatch import fnmatchcase

from setuptools.extern.six.moves import filter, map

import setuptools.version
from setuptools.extension import Extension
from setuptools.dist import Distribution, Feature
from setuptools.depends import Require
from . import monkey

__all__ = [
    'setup', 'Distribution', 'Feature', 'Command', 'Extension', 'Require',
    'find_packages',
]

__version__ = setuptools.version.__version__

bootstrap_install_from = None

# If we run 2to3 on .py files, should we also convert docstrings?
# Default: yes; assume that we can detect doctests reliably
run_2to3_on_doctests = True
# Standard package names for fixer packages
lib2to3_fixer_packages = ['lib2to3.fixes']


class PackageFinder(object):
    """
    Generate a list of all Python packages found within a directory
    """

    @classmethod
    def find(cls, where='.', exclude=(), include=('*',)):
        """Return a list all Python packages found within directory 'where'

        'where' is the root directory which will be searched for packages.  It
        should be supplied as a "cross-platform" (i.e. URL-style) path; it will
        be converted to the appropriate local path syntax.

        'exclude' is a sequence of package names to exclude; '*' can be used
        as a wildcard in the names, such that 'foo.*' will exclude all
        subpackages of 'foo' (but not 'foo' itself).

        'include' is a sequence of package names to include.  If it's
        specified, only the named packages will be included.  If it's not
        specified, all found packages will be included.  'include' can contain
        shell style wildcard patterns just like 'exclude'.
        """

        return list(cls._find_packages_iter(
            convert_path(where),
            cls._build_filter('ez_setup', '*__pycache__', *exclude),
            cls._build_filter(*include)))

    @classmethod
    def _find_packages_iter(cls, where, exclude, include):
        """
        All the packages found in 'where' that pass the 'include' filter, but
        not the 'exclude' filter.
        """
        for root, dirs, files in os.walk(where, followlinks=True):
            # Copy dirs to iterate over it, then empty dirs.
            all_dirs = dirs[:]
            dirs[:] = []

            for dir in all_dirs:
                full_path = os.path.join(root, dir)
                rel_path = os.path.relpath(full_path, where)
                package = rel_path.replace(os.path.sep, '.')

                # Skip directory trees that are not valid packages
                if ('.' in dir or not cls._looks_like_package(full_path)):
                    continue

                # Should this package be included?
                if include(package) and not exclude(package):
                    yield package

                # Keep searching subdirectories, as there may be more packages
                # down there, even if the parent was excluded.
                dirs.append(dir)

    @staticmethod
    def _looks_like_package(path):
        """Does a directory look like a package?"""
        return os.path.isfile(os.path.join(path, '__init__.py'))

    @staticmethod
    def _build_filter(*patterns):
        """
        Given a list of patterns, return a callable that will be true only if
        the input matches at least one of the patterns.
        """
        return lambda name: any(fnmatchcase(name, pat=pat) for pat in patterns)
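
    # Illustrative example (added): _build_filter('foo', 'bar.*') yields a
    # callable that is True for 'foo' and 'bar.baz' but False for 'foobar',
    # since fnmatchcase patterns without wildcards must match exactly.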


class PEP420PackageFinder(PackageFinder):
    @staticmethod
    def _looks_like_package(path):
        return True


find_packages = PackageFinder.find


def _install_setup_requires(attrs):
    # Note: do not use `setuptools.Distribution` directly, as
    # our PEP 517 backend patches `distutils.core.Distribution`.
    dist = distutils.core.Distribution(dict(
        (k, v) for k, v in attrs.items()
        if k in ('dependency_links', 'setup_requires')
    ))
    # Honor setup.cfg's options.
    dist.parse_config_files(ignore_option_errors=True)
    if dist.setup_requires:
        dist.fetch_build_eggs(dist.setup_requires)


def setup(**attrs):
    # Make sure we have any requirements needed to interpret 'attrs'.
    _install_setup_requires(attrs)
    return distutils.core.setup(**attrs)

setup.__doc__ = distutils.core.setup.__doc__


_Command = monkey.get_unpatched(distutils.core.Command)


class Command(_Command):
    __doc__ = _Command.__doc__

    command_consumes_arguments = False

    def __init__(self, dist, **kw):
        """
        Construct the command for dist, updating
        vars(self) with any keyword parameters.
        """
        _Command.__init__(self, dist)
        vars(self).update(kw)

    def reinitialize_command(self, command, reinit_subcommands=0, **kw):
        cmd = _Command.reinitialize_command(self, command, reinit_subcommands)
        vars(cmd).update(kw)
        return cmd


def _find_all_simple(path):
    """
    Find all files under 'path'
    """
    results = (
        os.path.join(base, file)
        for base, dirs, files in os.walk(path, followlinks=True)
        for file in files
    )
    return filter(os.path.isfile, results)


def findall(dir=os.curdir):
    """
    Find all files under 'dir' and return the list of full filenames.
    Unless dir is '.', return full filenames with dir prepended.
    """
    files = _find_all_simple(dir)
    if dir == os.curdir:
        make_rel = functools.partial(os.path.relpath, start=dir)
        files = map(make_rel, files)
    return list(files)


monkey.patch_all()
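
# Usage sketch for the package discovery helpers above (illustrative addition;
# the directory layout and output are hypothetical):
#
#     >>> from setuptools import find_packages
#     >>> find_packages('src', exclude=['tests', 'tests.*'])
#     ['mypkg', 'mypkg.subpkg']
#
# PEP420PackageFinder.find accepts the same arguments but also reports
# namespace packages, i.e. directories without an __init__.py file.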
site-packages/setuptools/command/install_egg_info.py000064400000004233147511334630017011 0ustar00from distutils import log, dir_util
import os

from setuptools import Command
from setuptools import namespaces
from setuptools.archive_util import unpack_archive
import pkg_resources


class install_egg_info(namespaces.Installer, Command):
    """Install an .egg-info directory for the package"""

    description = "Install an .egg-info directory for the package"

    user_options = [
        ('install-dir=', 'd', "directory to install to"),
    ]

    def initialize_options(self):
        self.install_dir = None

    def finalize_options(self):
        self.set_undefined_options('install_lib',
                                   ('install_dir', 'install_dir'))
        ei_cmd = self.get_finalized_command("egg_info")
        basename = pkg_resources.Distribution(
            None, None, ei_cmd.egg_name, ei_cmd.egg_version
        ).egg_name() + '.egg-info'
        self.source = ei_cmd.egg_info
        self.target = os.path.join(self.install_dir, basename)
        self.outputs = []

    def run(self):
        self.run_command('egg_info')
        if os.path.isdir(self.target) and not os.path.islink(self.target):
            dir_util.remove_tree(self.target, dry_run=self.dry_run)
        elif os.path.exists(self.target):
            self.execute(os.unlink, (self.target,), "Removing " + self.target)
        if not self.dry_run:
            pkg_resources.ensure_directory(self.target)
        self.execute(
            self.copytree, (), "Copying %s to %s" % (self.source, self.target)
        )
        self.install_namespaces()

    def get_outputs(self):
        return self.outputs

    def copytree(self):
        # Copy the .egg-info tree to site-packages
        def skimmer(src, dst):
            # filter out source-control directories; note that 'src' is always
            # a '/'-separated path, regardless of platform.  'dst' is a
            # platform-specific path.
            for skip in '.svn/', 'CVS/':
                if src.startswith(skip) or '/' + skip in src:
                    return None
            self.outputs.append(dst)
            log.debug("Copying %s to %s", src, dst)
            return dst

        unpack_archive(self.source, self.target, skimmer)
site-packages/setuptools/command/launcher manifest.xml000064400000001164147511334630017246 0ustar00<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<assembly xmlns="urn:schemas-microsoft-com:asm.v1" manifestVersion="1.0">
    <assemblyIdentity version="1.0.0.0"
                      processorArchitecture="X86"
                      name="%(name)s"
                      type="win32"/>
    <!-- Identify the application security requirements. -->
    <trustInfo xmlns="urn:schemas-microsoft-com:asm.v3">
        <security>
            <requestedPrivileges>
                <requestedExecutionLevel level="asInvoker" uiAccess="false"/>
            </requestedPrivileges>
        </security>
    </trustInfo>
</assembly>
site-packages/setuptools/command/__pycache__/dist_info.cpython-36.opt-1.pyc [binary .pyc bytecode omitted]
site-packages/setuptools/command/__pycache__/register.cpython-36.pyc [binary .pyc bytecode omitted]
site-packages/setuptools/command/__pycache__/upload_docs.cpython-36.pyc [binary .pyc bytecode omitted]
site-packages/setuptools/command/__pycache__/sdist.cpython-36.opt-1.pyc [binary .pyc bytecode omitted]
site-packages/setuptools/command/__pycache__/test.cpython-36.opt-1.pyc [binary .pyc bytecode omitted]
site-packages/setuptools/command/__pycache__/setopt.cpython-36.opt-1.pyc [binary .pyc bytecode omitted]
site-packages/setuptools/command/__pycache__/upload_docs.cpython-36.opt-1.pyc [binary .pyc bytecode omitted]
site-packages/setuptools/command/__pycache__/bdist_wininst.cpython-36.pyc [binary .pyc bytecode omitted]
site-packages/setuptools/command/__pycache__/bdist_egg.cpython-36.pyc [binary .pyc bytecode omitted]
�	sysconfigrrr�ImportErrorZdistutils.sysconfigrrr$r-r.r�r��splitr�r�r�r{r�r�r�r�rfr�rrrr�<module>sL
"$
site-packages/setuptools/command/__pycache__/__init__.cpython-36.pyc000064400000001230147511334630021523 0ustar003

��fR�@szdddddddddd	d
ddd
dddddddddgZddlmZddlZddlmZdejkrrdejd<ejjd�[[dS)�alias�	bdist_eggZ	bdist_rpmZ	build_extZbuild_pyZdevelopZeasy_installZegg_infoZinstallZinstall_lib�rotateZsaveoptsZsdistZsetoptZtestZinstall_egg_info�install_scripts�registerZ
bdist_wininstZupload_docsZuploadZ
build_clibZ	dist_info�)�bdistN)rZegg�Python .egg file)rr)	�__all__Zdistutils.command.bdistr�sysZsetuptools.commandrZformat_commandsZformat_command�append�rr�/usr/lib/python3.6/__init__.py�<module>s



site-packages/setuptools/command/__pycache__/egg_info.cpython-36.pyc000064400000050633147511334630021554 0ustar003

��f�`�@s�dZddlmZddlmZddlmZddlm	Z	ddlZddlZddl
Z
ddlZddlZddl
Z
ddlZddlZddlZddlmZddlmZdd	lmZdd
lmZddlmZddlmZdd
lmZddlmZm Z m!Z!m"Z"m#Z#m$Z$m%Z%m&Z&ddl'j(Z(ddl)m*Z*ddlm+Z+dd�Z,Gdd�de�Z-Gdd�de�ZGdd�de�Z.dd�Z/dd�Z0dd�Z1dd �Z2d!d"�Z3d#d$�Z4d%d&�Z5d'd(�Z6d0d*d+�Z7d,d-�Z8d.d/�Z9dS)1zUsetuptools.command.egg_info

Create a distribution's .egg-info directory and contents�)�FileList)�DistutilsInternalError)�convert_path)�logN)�six)�map)�Command)�sdist)�walk_revctrl)�edit_config)�	bdist_egg)�parse_requirements�	safe_name�
parse_version�safe_version�yield_lines�
EntryPoint�iter_entry_points�to_filename)�glob)�	packagingcCs�d}|jtjj�}tjtj�}d|f}�x�t|�D�]�\}}|t|�dk}|dkrv|rd|d7}q4|d||f7}q4d}t|�}	�x:||	k�r�||}
|
dkr�||d7}�n|
d	kr�||7}n�|
d
k�r�|d}||	kr�||dkr�|d}||	k�r||dk�r|d}x&||	k�r6||dk�r6|d}�qW||	k�rR|tj|
�7}nR||d|�}d}
|ddk�r�d
}
|dd�}|
tj|�7}
|d|
f7}|}n|tj|
�7}|d7}q�W|s4||7}q4W|d7}tj|tj	tj
Bd�S)z�
    Translate a file path glob like '*.txt' in to a regular expression.
    This differs from fnmatch.translate which allows wildcards to match
    directory separators. It also knows about '**/' which matches any number of
    directories.
    �z[^%s]�z**z.*z
(?:%s+%s)*r�*�?�[�!�]�^Nz[%s]z\Z)�flags)�split�os�path�sep�re�escape�	enumerate�len�compile�	MULTILINE�DOTALL)rZpatZchunksr#Z
valid_char�c�chunkZ
last_chunk�iZ	chunk_len�charZinner_i�innerZ
char_class�r0�/usr/lib/python3.6/egg_info.py�translate_pattern$sV




r2c@s�eZdZdZd)d*d+d,gZdgZddiZdd�Zedd��Z	e	j
dd��Z	dd�Zdd�Zd-dd�Z
dd�Zdd�Zdd�Zd d!�Zd"d#�Zd$d%�Zd&d'�Zd(S).�egg_infoz+create a distribution's .egg-info directory�	egg-base=�e�Ldirectory containing .egg-info directories (default: top of the source tree)�tag-date�d�0Add date stamp (e.g. 20050528) to version number�
tag-build=�b�-Specify explicit tag to add to version number�no-date�D�"Don't include date stamp [default]cCs4d|_d|_d|_d|_d|_d|_d|_d|_dS)NrF)�egg_name�egg_version�egg_baser3�	tag_build�tag_date�broken_egg_info�vtags)�selfr0r0r1�initialize_options�szegg_info.initialize_optionscCsdS)Nr0)rGr0r0r1�tag_svn_revision�szegg_info.tag_svn_revisioncCsdS)Nr0)rG�valuer0r0r1rI�scCs0tj�}|j�|d<d|d<t|t|d��dS)z�
        Materialize the value of date into the
        build tag. Install build keys in a deterministic order
        to avoid arbitrary reordering on subsequent builds.
        rCrrD)r3N)�collections�OrderedDict�tagsr�dict)rG�filenamer3r0r0r1�save_version_info�szegg_info.save_version_infocCsVt|jj��|_|j�|_|j�|_t|j�}y6t	|t
jj�}|rFdnd}t
t||j|jf��Wn,tk
r�tjjd|j|jf��YnX|jdkr�|jj}|p�ijdtj�|_|jd�t|j�d|_|jtjkr�tjj|j|j�|_d|jk�r|j�|j|jj_|jj}|dk	�rR|j |jj!�k�rR|j|_"t|j�|_#d|j_dS)Nz%s==%sz%s===%sz2Invalid distribution name or version syntax: %s-%srrBz	.egg-info�-)$r�distributionZget_namer@rMrF�tagged_versionrAr�
isinstancer�versionZVersion�listr
�
ValueError�	distutils�errorsZDistutilsOptionErrorrBZpackage_dir�getr!�curdirZensure_dirnamerr3r"�join�check_broken_egg_info�metadataZ
_patched_dist�key�lowerZ_versionZ_parsed_version)rGZparsed_versionZ
is_version�spec�dirsZpdr0r0r1�finalize_options�s8




zegg_info.finalize_optionsFcCsN|r|j|||�n6tjj|�rJ|dkr@|r@tjd||�dS|j|�dS)a�Write `data` to `filename` or delete if empty

        If `data` is non-empty, this routine is the same as ``write_file()``.
        If `data` is empty but not ``None``, this is the same as calling
        ``delete_file(filename)`.  If `data` is ``None``, then this is a no-op
        unless `filename` exists, in which case a warning is issued about the
        orphaned file (if `force` is false), or deleted (if `force` is true).
        Nz$%s not set in setup(), but %s exists)�
write_filer!r"�existsr�warn�delete_file)rG�whatrO�data�forcer0r0r1�write_or_delete_file�s	
zegg_info.write_or_delete_filecCsDtjd||�tjr|jd�}|js@t|d�}|j|�|j�dS)z�Write `data` to `filename` (if not a dry run) after announcing it

        `what` is used in a log message to identify what is being written
        to the file.
        zwriting %s to %szutf-8�wbN)	r�inforZPY3�encode�dry_run�open�write�close)rGrhrOri�fr0r0r1rd�s


zegg_info.write_filecCs tjd|�|jstj|�dS)z8Delete `filename` (if not a dry run) after announcing itzdeleting %sN)rrmror!�unlink)rGrOr0r0r1rg�szegg_info.delete_filecCs2|jj�}|jr$|j|j�r$t|�St||j�S)N)rRZget_versionrF�endswithr)rGrUr0r0r1rSs
zegg_info.tagged_versioncCs�|j|j�|jj}x@td�D]4}|j|d�|j�}|||jtj	j
|j|j��qWtj	j
|jd�}tj	j|�r||j|�|j
�dS)Nzegg_info.writers)�	installerznative_libs.txt)Zmkpathr3rRZfetch_build_eggrZrequireZresolve�namer!r"r\rerg�find_sources)rGrv�ep�writer�nlr0r0r1�run	s 
zegg_info.runcCs,d}|jr||j7}|jr(|tjd�7}|S)Nrz-%Y%m%d)rCrD�timeZstrftime)rGrUr0r0r1rMs
z
egg_info.tagscCs4tjj|jd�}t|j�}||_|j�|j|_dS)z"Generate SOURCES.txt manifest filezSOURCES.txtN)	r!r"r\r3�manifest_makerrR�manifestr|�filelist)rGZmanifest_filenameZmmr0r0r1rx s

zegg_info.find_sourcescCsd|jd}|jtjkr&tjj|j|�}tjj|�r`tjddddd||j	�|j	|_
||_	dS)Nz	.egg-inforQ�Nz�
Note: Your current .egg-info directory has a '-' in its name;
this will not work correctly with "setup.py develop".

Please rename %s to %s to correct this problem.
)r@rBr!r[r"r\rerrfr3rE)rGZbeir0r0r1r](s

zegg_info.check_broken_egg_infoN)r4r5r6)r7r8r9)r:r;r<)r=r>r?)F)�__name__�
__module__�__qualname__�descriptionZuser_optionsZboolean_optionsZnegative_optrH�propertyrI�setterrPrcrkrdrgrSr|rMrxr]r0r0r0r1r3ws(

/
r3c@s|eZdZdd�Zdd�Zdd�Zdd�Zd	d
�Zdd�Zd
d�Z	dd�Z
dd�Zdd�Zdd�Z
dd�Zdd�Zdd�ZdS)rcCs<|j|�\}}}}|dkrV|jddj|��x"|D]}|j|�s4tjd|�q4W�n�|dkr�|jddj|��x"|D]}|j|�sxtjd|�qxW�n�|dkr�|jd	dj|��x"|D]}|j|�s�tjd
|�q�W�nZ|dk�r(|jddj|��x&|D]}|j|��stjd
|��qW�n|dk�rx|jd|dj|�f�x�|D]"}|j	||��sPtjd||��qPWn�|dk�r�|jd|dj|�f�x�|D]"}|j
||��s�tjd||��q�Wnp|dk�r�|jd|�|j|��s8tjd|�n>|dk�r,|jd|�|j|��s8tjd|�nt
d|��dS)N�includezinclude � z%warning: no files found matching '%s'�excludezexclude z9warning: no previously-included files found matching '%s'zglobal-includezglobal-include z>warning: no files found matching '%s' anywhere in distributionzglobal-excludezglobal-exclude zRwarning: no previously-included files matching '%s' found anywhere in distributionzrecursive-includezrecursive-include %s %sz:warning: no files found matching '%s' under directory '%s'zrecursive-excludezrecursive-exclude %s %szNwarning: no previously-included files matching '%s' found under directory '%s'�graftzgraft z+warning: no directories found matching '%s'�prunezprune z6no previously-included directories found matching '%s'z'this cannot happen: invalid action '%s')Z_parse_template_line�debug_printr\r�rrfr��global_include�global_exclude�recursive_include�recursive_excluder�r�r)rG�line�actionZpatterns�dirZdir_pattern�patternr0r0r1�process_template_line;sd













zFileList.process_template_linecCsVd}xLtt|j�ddd�D]2}||j|�r|jd|j|�|j|=d}qW|S)z�
        Remove all files from the file list that match the predicate.
        Return True if any matching files were removed
        Frz
 removing T���r�)�ranger'�filesr�)rGZ	predicate�foundr-r0r0r1�
_remove_files�szFileList._remove_filescCs$dd�t|�D�}|j|�t|�S)z#Include files that match 'pattern'.cSsg|]}tjj|�s|�qSr0)r!r"�isdir)�.0rsr0r0r1�
<listcomp>�sz$FileList.include.<locals>.<listcomp>)r�extend�bool)rGr�r�r0r0r1r��s
zFileList.includecCst|�}|j|j�S)z#Exclude files that match 'pattern'.)r2r��match)rGr�r�r0r0r1r��szFileList.excludecCs8tjj|d|�}dd�t|dd�D�}|j|�t|�S)zN
        Include all files anywhere in 'dir/' that match the pattern.
        z**cSsg|]}tjj|�s|�qSr0)r!r"r�)r�rsr0r0r1r��sz.FileList.recursive_include.<locals>.<listcomp>T)�	recursive)r!r"r\rr�r�)rGr�r�Zfull_patternr�r0r0r1r��s
zFileList.recursive_includecCs ttjj|d|��}|j|j�S)zM
        Exclude any file anywhere in 'dir/' that match the pattern.
        z**)r2r!r"r\r�r�)rGr�r�r�r0r0r1r��szFileList.recursive_excludecCs$dd�t|�D�}|j|�t|�S)zInclude all files from 'dir/'.cSs"g|]}tjj|�D]}|�qqSr0)rXr��findall)r�Z	match_dir�itemr0r0r1r��sz"FileList.graft.<locals>.<listcomp>)rr�r�)rGr�r�r0r0r1r��s
zFileList.graftcCsttjj|d��}|j|j�S)zFilter out files from 'dir/'.z**)r2r!r"r\r�r�)rGr�r�r0r0r1r��szFileList.prunecsJ|jdkr|j�ttjjd|����fdd�|jD�}|j|�t|�S)z�
        Include all files anywhere in the current directory that match the
        pattern. This is very inefficient on large file trees.
        Nz**csg|]}�j|�r|�qSr0)r�)r�rs)r�r0r1r��sz+FileList.global_include.<locals>.<listcomp>)Zallfilesr�r2r!r"r\r�r�)rGr�r�r0)r�r1r��s

zFileList.global_includecCsttjjd|��}|j|j�S)zD
        Exclude all files anywhere that match the pattern.
        z**)r2r!r"r\r�r�)rGr�r�r0r0r1r��szFileList.global_excludecCs8|jd�r|dd�}t|�}|j|�r4|jj|�dS)N�
rr�)rur�
_safe_pathr��append)rGr�r"r0r0r1r��s


zFileList.appendcCs|jjt|j|��dS)N)r�r��filterr�)rG�pathsr0r0r1r��szFileList.extendcCstt|j|j��|_dS)z�
        Replace self.files with only safe paths

        Because some owners of FileList manipulate the underlying
        ``files`` attribute directly, this method must be called to
        repair those paths.
        N)rVr�r�r�)rGr0r0r1�_repair�szFileList._repaircCs�d}tj|�}|dkr(tjd|�dStj|d�}|dkrNtj||d�dSy tjj|�shtjj|�rldSWn&tk
r�tj||t	j
��YnXdS)Nz!'%s' not %s encodable -- skippingz''%s' in unexpected encoding -- skippingFzutf-8T)�
unicode_utils�filesys_decoderrfZ
try_encoder!r"re�UnicodeEncodeError�sys�getfilesystemencoding)rGr"Zenc_warnZu_pathZ	utf8_pathr0r0r1r��s
zFileList._safe_pathN)r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r0r0r0r1r8sI



rc@s\eZdZdZdd�Zdd�Zdd�Zdd	�Zd
d�Zdd
�Z	e
dd��Zdd�Zdd�Z
dS)r~zMANIFEST.incCsd|_d|_d|_d|_dS)Nr)Zuse_defaultsr�Z
manifest_onlyZforce_manifest)rGr0r0r1rH�sz!manifest_maker.initialize_optionscCsdS)Nr0)rGr0r0r1rcszmanifest_maker.finalize_optionscCsdt�|_tjj|j�s|j�|j�tjj|j�r<|j	�|j
�|jj�|jj�|j�dS)N)
rr�r!r"rer�write_manifest�add_defaults�templateZ
read_template�prune_file_list�sortZremove_duplicates)rGr0r0r1r|s

zmanifest_maker.runcCstj|�}|jtjd�S)N�/)r�r��replacer!r#)rGr"r0r0r1�_manifest_normalizes
z"manifest_maker._manifest_normalizecsB�jj��fdd��jjD�}d�j}�jt�j|f|�dS)zo
        Write the file list in 'self.filelist' to the manifest file
        named by 'self.manifest'.
        csg|]}�j|��qSr0)r�)r�rs)rGr0r1r� sz1manifest_maker.write_manifest.<locals>.<listcomp>zwriting manifest file '%s'N)r�r�r�rZexecuterd)rGr��msgr0)rGr1r�s

zmanifest_maker.write_manifestcCs|j|�stj||�dS)N)�_should_suppress_warningr	rf)rGr�r0r0r1rf$s
zmanifest_maker.warncCstjd|�S)z;
        suppress missing-file warnings from sdist
        zstandard file .*not found)r$r�)r�r0r0r1r�(sz'manifest_maker._should_suppress_warningcCsttj|�|jj|j�|jj|j�tt��}|rB|jj|�nt	j
j|j�rX|j�|j
d�}|jj|j�dS)Nr3)r	r�r�r�r�rrVr
r�r!r"reZ
read_manifest�get_finalized_commandr�r3)rGZrcfilesZei_cmdr0r0r1r�/s


zmanifest_maker.add_defaultscCsZ|jd�}|jj�}|jj|j�|jj|�tjtj	�}|jj
d|d|dd�dS)N�buildz(^|z)(RCS|CVS|\.svn)r)Zis_regex)r�rRZget_fullnamer�r�Z
build_baser$r%r!r#Zexclude_pattern)rGr�Zbase_dirr#r0r0r1r�;s

zmanifest_maker.prune_file_listN)r�r�r�r�rHrcr|r�r�rf�staticmethodr�r�r�r0r0r0r1r~�sr~c	Cs8dj|�}|jd�}t|d��}|j|�WdQRXdS)z{Create a file with the specified name and write 'contents' (a
    sequence of strings without line terminators) to it.
    �
zutf-8rlN)r\rnrprq)rO�contentsrsr0r0r1rdEs

rdcCs|tjd|�|jsx|jj}|j|j|_}|j|j|_}z|j	|j
�Wd|||_|_Xt|jdd�}tj
|j
|�dS)Nz
writing %sZzip_safe)rrmrorRr^rArUr@rw�write_pkg_infor3�getattrrZwrite_safety_flag)�cmd�basenamerOr^ZoldverZoldnameZsafer0r0r1r�Rsr�cCstjj|�rtjd�dS)NzsWARNING: 'depends.txt' is not used by setuptools 0.6!
Use the install_requires/extras_require setup() args instead.)r!r"rerrf)r�r�rOr0r0r1�warn_depends_obsoleteesr�cCs,t|pf�}dd�}t||�}|j|�dS)NcSs|dS)Nr�r0)r�r0r0r1�<lambda>osz%_write_requirements.<locals>.<lambda>)rr�
writelines)�streamZreqs�linesZ	append_crr0r0r1�_write_requirementsms
r�cCsn|j}tj�}t||j�|jp"i}x2t|�D]&}|jdjft	���t|||�q.W|j
d||j��dS)Nz
[{extra}]
Zrequirements)rRr�StringIOr�Zinstall_requires�extras_require�sortedrq�format�varsrk�getvalue)r�r�rOZdistrir�Zextrar0r0r1�write_requirementsts
r�cCs,tj�}t||jj�|jd||j��dS)Nzsetup-requirements)�ior�r�rRZsetup_requiresrkr�)r�r�rOrir0r0r1�write_setup_requirementssr�cCs:tjdd�|jj�D��}|jd|djt|��d�dS)NcSsg|]}|jdd�d�qS)�.rr)r )r��kr0r0r1r��sz(write_toplevel_names.<locals>.<listcomp>ztop-level namesr�)rN�fromkeysrRZiter_distribution_namesrdr\r�)r�r�rOZpkgsr0r0r1�write_toplevel_names�sr�cCst|||d�dS)NT)�	write_arg)r�r�rOr0r0r1�
overwrite_arg�sr�FcCsHtjj|�d}t|j|d�}|dk	r4dj|�d}|j||||�dS)Nrr�)r!r"�splitextr�rRr\rk)r�r�rOrjZargnamerJr0r0r1r��s
r�cCs�|jj}t|tj�s|dkr"|}nr|dk	r�g}xZt|j��D]J\}}t|tj�sttj||�}dj	tt
t|j����}|j
d||f�q<Wdj	|�}|jd||d�dS)Nr�z	[%s]
%s

rzentry pointsT)rRZentry_pointsrTrZstring_typesr��itemsrZparse_groupr\r�str�valuesr�rk)r�r�rOryriZsectionr�r0r0r1�
write_entries�s
r�cCs^tjdt�tjjd�rZtjd��2}x*|D]"}tj	d|�}|r*t
|jd��Sq*WWdQRXdS)zd
    Get a -r### off of PKG-INFO Version in case this is an sdist of
    a subversion revision.
    z$get_pkg_info_revision is deprecated.zPKG-INFOzVersion:.*-r(\d+)\s*$rNr)�warningsrf�DeprecationWarningr!r"rer�rpr$r��int�group)rsr�r�r0r0r1�get_pkg_info_revision�s
r�)F):�__doc__Zdistutils.filelistrZ	_FileListZdistutils.errorsrZdistutils.utilrrXrr!r$r�r�r�r}rKZsetuptools.externrZsetuptools.extern.six.movesrZ
setuptoolsrZsetuptools.command.sdistr	r
Zsetuptools.command.setoptrZsetuptools.commandrZ
pkg_resourcesr
rrrrrrrZsetuptools.unicode_utilsr�Zsetuptools.globrrr2r3r~rdr�r�r�r�r�r�r�r�r�r�r0r0r0r1�<module>sR(
SBEI


site-packages/setuptools/command/__pycache__/test.cpython-36.pyc000064400000017625147511334630020762 0ustar003

��f�#�@s�ddlZddlZddlZddlZddlZddlZddlmZmZddl	m
Z
ddlmZddlm
Z
ddlmZmZddlmZmZmZmZmZmZmZmZmZddlmZGd	d
�d
e�ZGdd�de�ZGd
d�de�Z dS)�N)�DistutilsError�DistutilsOptionError)�log)�
TestLoader)�six)�map�filter)	�resource_listdir�resource_exists�normalize_path�working_set�_namespace_packages�evaluate_marker�add_activation_listener�require�
EntryPoint)�Commandc@seZdZdd�Zddd�ZdS)�ScanningLoadercCstj|�t�|_dS)N)r�__init__�set�_visited)�self�r�/usr/lib/python3.6/test.pyrs
zScanningLoader.__init__NcCs�||jkrdS|jj|�g}|jtj||��t|d�rH|j|j��t|d�r�xpt|jd�D]`}|j	d�r�|dkr�|jd|dd�}n"t
|j|d	�r`|jd|}nq`|j|j|��q`Wt|�d
kr�|j
|�S|dSdS)
aReturn a suite of all tests cases contained in the given module

        If the module is a package, load tests from all the modules in it.
        If the module has an ``additional_tests`` function, call it and add
        the return value to the tests.
        N�additional_tests�__path__�z.pyz__init__.py�.�z/__init__.py�r���)r�add�appendr�loadTestsFromModule�hasattrrr	�__name__�endswithr
ZloadTestsFromName�lenZ
suiteClass)r�module�patternZtests�fileZ	submodulerrrr#s$



z"ScanningLoader.loadTestsFromModule)N)r%�
__module__�__qualname__rr#rrrrrsrc@seZdZdd�Zddd�ZdS)�NonDataPropertycCs
||_dS)N)�fget)rr.rrrr>szNonDataProperty.__init__NcCs|dkr|S|j|�S)N)r.)r�objZobjtyperrr�__get__AszNonDataProperty.__get__)N)r%r+r,rr0rrrrr-=sr-c@s�eZdZdZdZd%d&d'gZdd
�Zdd�Zedd��Z	dd�Z
dd�Zej
gfdd��Zeej
dd���Zedd��Zdd�Zdd�Zed d!��Zed"d#��Zd$S)(�testz.Command to run unit tests after in-place buildz#run unit tests after in-place build�test-module=�m�$Run 'test_suite' in specified module�test-suite=�s�9Run single test, case or suite (e.g. 'module.test_suite')�test-runner=�r�Test runner to usecCsd|_d|_d|_d|_dS)N)�
test_suite�test_module�test_loader�test_runner)rrrr�initialize_optionsSsztest.initialize_optionscCs�|jr|jrd}t|��|jdkrD|jdkr8|jj|_n|jd|_|jdkr^t|jdd�|_|jdkrnd|_|jdkr�t|jdd�|_dS)Nz1You may specify a module or a suite, but not bothz.test_suiter=z&setuptools.command.test:ScanningLoaderr>)r;r<r�distributionr=�getattrr>)r�msgrrr�finalize_optionsYs




ztest.finalize_optionscCst|j��S)N)�list�
_test_args)rrrr�	test_argslsztest.test_argsccs6|jrtjdkrdV|jr$dV|jr2|jVdS)N��Zdiscoverz	--verbose)rGrH)r;�sys�version_info�verbose)rrrrrEpsztest._test_argsc	Cs|j��|�WdQRXdS)zI
        Backward compatibility for project_on_sys_path context.
        N)�project_on_sys_path)r�funcrrr�with_project_on_sys_pathxs
ztest.with_project_on_sys_pathc	csPtjot|jdd�}|rv|jddd�|jd�|jd�}t|j�}|jd|d�|jd�|jddd�|jd�n"|jd�|jdd	d�|jd�|jd�}t	j
dd�}t	jj�}zbt|j
�}t	j
jd|�tj�td
d��td|j|jf�|j|g��dVWdQRXWd|t	j
dd�<t	jj�t	jj|�tj�XdS)
N�use_2to3FZbuild_pyr)ZinplaceZegg_info)�egg_baseZ	build_extrcSs|j�S)N)Zactivate)�distrrr�<lambda>�sz*test.project_on_sys_path.<locals>.<lambda>z%s==%s)r�PY3rAr@Zreinitialize_commandZrun_commandZget_finalized_commandrZ	build_librI�path�modules�copyrP�insertrrrrZegg_nameZegg_version�paths_on_pythonpath�clear�update)	rZ
include_distsZ	with_2to3Zbpy_cmdZ
build_pathZei_cmdZold_pathZold_modulesZproject_pathrrrrLs8









ztest.project_on_sys_pathccs�t�}tjjd|�}tjjdd�}z>tjj|�}td||g�}tjj|�}|rX|tjd<dVWd||krztjjdd�n
|tjd<XdS)z�
        Add the indicated paths to the head of the PYTHONPATH environment
        variable so that subprocesses will also see the packages at
        these paths.

        Do this in a context that restores the value on exit.
        �
PYTHONPATHrN)�object�os�environ�get�pathsep�joinr�pop)�pathsZnothingZorig_pythonpathZcurrent_pythonpath�prefixZto_join�new_pathrrrrX�s


ztest.paths_on_pythonpathcCsD|j|j�}|j|jpg�}|jdd�|jj�D��}tj|||�S)z�
        Install the requirements indicated by self.distribution and
        return an iterable of the dists that were built.
        css0|](\}}|jd�rt|dd��r|VqdS)�:rN)�
startswithr)�.0�k�vrrr�	<genexpr>�sz%test.install_dists.<locals>.<genexpr>)Zfetch_build_eggsZinstall_requiresZ
tests_requireZextras_require�items�	itertools�chain)rQZir_dZtr_dZer_drrr�
install_dists�s
ztest.install_distscCs�|j|j�}dj|j�}|jr0|jd|�dS|jd|�ttjd�|�}|j	|��"|j
��|j�WdQRXWdQRXdS)N� zskipping "%s" (dry run)zrunning "%s"�location)ror@ra�_argvZdry_run�announcer�operator�
attrgetterrXrL�	run_tests)rZinstalled_dists�cmdrcrrr�run�s
ztest.runcCs�tjr�t|jdd�r�|jjd�d}|tkr�g}|tjkrD|j	|�|d7}x"tjD]}|j
|�rT|j	|�qTWtttjj
|��tjdd|j|j|j�|j|j�dd�}|jj�s�d|j}|j|tj�t|��dS)NrOFrr)Z
testLoaderZ
testRunner�exitzTest failed: %s)rrSrAr@r;�splitr
rIrUr"rgrDr�__delitem__�unittest�mainrr�_resolve_as_epr=r>�resultZ
wasSuccessfulrsrZERRORr)rr(Zdel_modules�namer1rBrrrrv�s(






ztest.run_testscCsdg|jS)Nr|)rF)rrrrrr�sz
test._argvcCs$|dkrdStjd|�}|j��S)zu
        Load the indicated attribute value, called, as a as if it were
        specified as an entry point.
        Nzx=)r�parseZresolve)�valZparsedrrrr~sztest._resolve_as_epN)r2r3r4)r5r6r7)r8r9r:)r%r+r,�__doc__�descriptionZuser_optionsr?rCr-rFrErN�
contextlib�contextmanagerrL�staticmethodrXrorxrv�propertyrrr~rrrrr1Gs(-r1)!r]rtrIr�rmr|Zdistutils.errorsrrZ	distutilsrrZsetuptools.externrZsetuptools.extern.six.movesrrZ
pkg_resourcesr	r
rrr
rrrrZ
setuptoolsrrr\r-r1rrrr�<module>s,)
site-packages/setuptools/command/__pycache__/saveopts.cpython-36.pyc000064400000001520147511334630021632 0ustar003

��f��@s$ddlmZmZGdd�de�ZdS)�)�edit_config�option_basec@seZdZdZdZdd�ZdS)�saveoptsz#Save command-line options to a filez7save supplied options to setup.cfg or other config filecCsp|j}i}xP|jD]F}|dkr qx6|j|�j�D]$\}\}}|dkr0||j|i�|<q0WqWt|j||j�dS)Nrzcommand line)ZdistributionZcommand_optionsZget_option_dict�items�
setdefaultr�filenameZdry_run)�selfZdistZsettings�cmd�opt�src�val�r
�/usr/lib/python3.6/saveopts.py�run	szsaveopts.runN)�__name__�
__module__�__qualname__�__doc__�descriptionrr
r
r
rrsrN)Zsetuptools.command.setoptrrrr
r
r
r�<module>ssite-packages/setuptools/command/__pycache__/install_scripts.cpython-36.pyc000064400000004232147511334630023206 0ustar003

��f�	�@sRddlmZddljjZddlZddlZddlm	Z	m
Z
mZGdd�dej�ZdS)�)�logN)�Distribution�PathMetadata�ensure_directoryc@s*eZdZdZdd�Zdd�Zd
dd�Zd	S)�install_scriptsz;Do normal script install, plus any egg_info wrapper scriptscCstjj|�d|_dS)NF)�origr�initialize_options�no_ep)�self�r�%/usr/lib/python3.6/install_scripts.pyrsz"install_scripts.initialize_optionscCs�ddljj}|jd�|jjr,tjj|�ng|_	|j
r<dS|jd�}t|j
t|j
|j�|j|j�}|jd�}t|dd�}|jd�}t|dd�}|j}|r�d}|j}|tjkr�|g}|j�}|jj�j|�}	x"|j||	j��D]}
|j|
�q�WdS)	Nr�egg_infoZ
build_scripts�
executableZ
bdist_wininstZ_is_runningFz
python.exe)�setuptools.command.easy_install�commandZeasy_installZrun_commandZdistribution�scriptsrr�run�outfilesr	Zget_finalized_commandrZegg_baserr
Zegg_nameZegg_version�getattrZScriptWriterZWindowsScriptWriter�sysrZbestZcommand_spec_classZ
from_paramZget_argsZ	as_header�write_script)r
ZeiZei_cmdZdistZbs_cmdZ
exec_paramZbw_cmdZ
is_wininst�writer�cmd�argsrrrrs2




zinstall_scripts.run�tc
Gs�ddlm}m}tjd||j�tjj|j|�}|j	j
|�|�}|js~t|�t
|d|�}	|	j|�|	j�||d|�dS)z1Write an executable file to the scripts directoryr)�chmod�
current_umaskzInstalling %s script to %s�wi�N)rrrr�infoZinstall_dir�os�path�joinr�appendZdry_runr�open�write�close)
r
Zscript_name�contents�modeZignoredrr�target�mask�frrrr3s
zinstall_scripts.write_scriptN)r)�__name__�
__module__�__qualname__�__doc__rrrrrrrr	s#r)Z	distutilsrZ!distutils.command.install_scriptsrrrrrZ
pkg_resourcesrrrrrrr�<module>s
site-packages/setuptools/command/__pycache__/py36compat.cpython-36.opt-1.pyc000064400000010703147511334630022735 0ustar003

��fz�@sdddlZddlmZddlmZddlmZddlmZGdd�d�Ze	ejd�r`Gd	d�d�ZdS)
�N)�glob)�convert_path)�sdist)�filterc@s\eZdZdZdd�Zedd��Zdd�Zdd	�Zd
d�Z	dd
�Z
dd�Zdd�Zdd�Z
dS)�sdist_add_defaultsz�
    Mix-in providing forward-compatibility for functionality as found in
    distutils on Python 3.7.

    Do not edit the code in this class except to update functionality
    as implemented in distutils. Instead, override in the subclass.
    cCs<|j�|j�|j�|j�|j�|j�|j�dS)a9Add all the default files to self.filelist:
          - README or README.txt
          - setup.py
          - test/test*.py
          - all pure Python modules mentioned in setup script
          - all files pointed by package_data (build_py)
          - all files defined in data_files.
          - all files defined as scripts.
          - all C sources listed as part of extensions or C libraries
            in the setup script (doesn't catch C headers!)
        Warns if (README or README.txt) or setup.py are missing; everything
        else is optional.
        N)�_add_defaults_standards�_add_defaults_optional�_add_defaults_python�_add_defaults_data_files�_add_defaults_ext�_add_defaults_c_libs�_add_defaults_scripts)�self�r� /usr/lib/python3.6/py36compat.py�add_defaultsszsdist_add_defaults.add_defaultscCs:tjj|�sdStjj|�}tjj|�\}}|tj|�kS)z�
        Case-sensitive path existence check

        >>> sdist_add_defaults._cs_path_exists(__file__)
        True
        >>> sdist_add_defaults._cs_path_exists(__file__.upper())
        False
        F)�os�path�exists�abspath�split�listdir)�fspathrZ	directory�filenamerrr�_cs_path_exists(s

z"sdist_add_defaults._cs_path_existscCs�|j|jjg}x�|D]�}t|t�rn|}d}x(|D] }|j|�r0d}|jj|�Pq0W|s�|jddj	|��q|j|�r�|jj|�q|jd|�qWdS)NFTz,standard file not found: should have one of z, zstandard file '%s' not found)
ZREADMES�distributionZscript_name�
isinstance�tupler�filelist�append�warn�join)rZ	standards�fnZaltsZgot_itrrrr9s 




z*sdist_add_defaults._add_defaults_standardscCs8ddg}x*|D]"}ttjjt|��}|jj|�qWdS)Nz
test/test*.pyz	setup.cfg)rrr�isfilerr�extend)rZoptional�pattern�filesrrrrNs
z)sdist_add_defaults._add_defaults_optionalcCsd|jd�}|jj�r$|jj|j��x:|jD]0\}}}}x"|D]}|jjtj	j
||��q>Wq,WdS)N�build_py)�get_finalized_commandrZhas_pure_modulesrr$�get_source_files�
data_filesrrrr!)rr'ZpkgZsrc_dirZ	build_dir�	filenamesrrrrr	Ts


z'sdist_add_defaults._add_defaults_pythoncCs�|jj�r~xr|jjD]f}t|t�rDt|�}tjj|�rz|j	j
|�q|\}}x,|D]$}t|�}tjj|�rR|j	j
|�qRWqWdS)N)rZhas_data_filesr*r�strrrrr#rr)r�item�dirnamer+�frrrr
ds


z+sdist_add_defaults._add_defaults_data_filescCs(|jj�r$|jd�}|jj|j��dS)N�	build_ext)rZhas_ext_modulesr(rr$r))rr0rrrrus

z$sdist_add_defaults._add_defaults_extcCs(|jj�r$|jd�}|jj|j��dS)N�
build_clib)rZhas_c_librariesr(rr$r))rr1rrrrzs

z'sdist_add_defaults._add_defaults_c_libscCs(|jj�r$|jd�}|jj|j��dS)N�
build_scripts)rZhas_scriptsr(rr$r))rr2rrrr
s

z(sdist_add_defaults._add_defaults_scriptsN)�__name__�
__module__�__qualname__�__doc__r�staticmethodrrrr	r
rrr
rrrrr	srrc@seZdZdS)rN)r3r4r5rrrrr�s)
rrZdistutils.utilrZdistutils.commandrZsetuptools.extern.six.movesrr�hasattrrrrr�<module>s|site-packages/setuptools/command/__pycache__/alias.cpython-36.pyc000064400000004465147511334630021072 0ustar003

��fz	�@sPddlmZddlmZddlmZmZmZdd�ZGdd�de�Z	dd	�Z
d
S)�)�DistutilsOptionError)�map)�edit_config�option_base�config_filecCs8xdD]}||krt|�SqW|j�|gkr4t|�S|S)z4Quote an argument for later parsing by shlex.split()�"�'�\�#)rrr	r
)�repr�split)�arg�c�r�/usr/lib/python3.6/alias.py�shquotes
rc@sHeZdZdZdZdZdgejZejdgZdd�Z	d	d
�Z
dd�Zd
S)�aliasz3Define a shortcut that invokes one or more commandsz0define a shortcut to invoke one or more commandsT�remove�r�remove (unset) the aliascCstj|�d|_d|_dS)N)r�initialize_options�argsr)�selfrrrrs
zalias.initialize_optionscCs*tj|�|jr&t|j�dkr&td��dS)N�zFMust specify exactly one argument (the alias name) when using --remove)r�finalize_optionsr�lenrr)rrrrr#s
zalias.finalize_optionscCs�|jjd�}|jsDtd�td�x|D]}tdt||��q(WdSt|j�dkr�|j\}|jrfd}q�||kr�tdt||��dStd|�dSn$|jd}djtt	|jdd���}t
|jd||ii|j�dS)	N�aliaseszCommand Aliasesz---------------zsetup.py aliasrz No alias definition found for %rr� )
ZdistributionZget_option_dictr�print�format_aliasrr�joinrrr�filenameZdry_run)rrr�commandrrr�run+s&

z	alias.runN)rrr)�__name__�
__module__�__qualname__�__doc__�descriptionZcommand_consumes_argumentsrZuser_optionsZboolean_optionsrrr#rrrrrsrcCsZ||\}}|td�krd}n,|td�kr0d}n|td�krBd}nd|}||d|S)	N�globalz--global-config �userz--user-config Zlocal�z
--filename=%rr)r)�namer�sourcer"rrrrFsrN)Zdistutils.errorsrZsetuptools.extern.six.movesrZsetuptools.command.setoptrrrrrrrrrr�<module>s

4site-packages/setuptools/command/__pycache__/upload.cpython-36.opt-1.pyc000064400000002443147511334630022216 0ustar003

��f��@s*ddlZddlmZGdd�dej�ZdS)�N)�uploadc@s(eZdZdZdd�Zdd�Zdd�ZdS)	rza
    Override default upload behavior to obtain password
    in a variety of different ways.
    cCs8tjj|�|jptj�|_|jp0|j�p0|j�|_dS)N)	�origr�finalize_options�username�getpassZgetuserZpassword�_load_password_from_keyring�_prompt_for_password)�self�r
�/usr/lib/python3.6/upload.pyrs
zupload.finalize_optionscCs2ytd�}|j|j|j�Stk
r,YnXdS)zM
        Attempt to load password from keyring. Suppress Exceptions.
        �keyringN)�
__import__Zget_passwordZ
repositoryr�	Exception)r	rr
r
rrs
z"upload._load_password_from_keyringcCs&ytj�Sttfk
r YnXdS)zH
        Prompt for a password on the tty. Suppress Exceptions.
        N)rr�KeyboardInterrupt)r	r
r
rr#szupload._prompt_for_passwordN)�__name__�
__module__�__qualname__�__doc__rrrr
r
r
rrs
r)rZdistutils.commandrrr
r
r
r�<module>ssite-packages/setuptools/command/__pycache__/install_lib.cpython-36.opt-1.pyc000064400000007603147511334630023231 0ustar003

��f�@sBddlZddlZddlmZmZddljjZGdd�dej�ZdS)�N)�product�starmapc@sZeZdZdZdd�Zdd�Zdd�Zedd	��Zd
d�Z	edd
��Z
ddd�Zdd�ZdS)�install_libz9Don't add compiled flags to filenames of non-Python filescCs&|j�|j�}|dk	r"|j|�dS)N)Zbuild�installZbyte_compile)�self�outfiles�r�!/usr/lib/python3.6/install_lib.py�run
szinstall_lib.runcs4�fdd��j�D�}t|�j��}tt�j|��S)z�
        Return a collections.Sized collections.Container of paths to be
        excluded for single_version_externally_managed installations.
        c3s"|]}�j|�D]
}|VqqdS)N)�
_all_packages)�.0Zns_pkg�pkg)rrr	�	<genexpr>sz-install_lib.get_exclusions.<locals>.<genexpr>)�_get_SVEM_NSPsr�_gen_exclusion_paths�setr�_exclude_pkg_path)rZall_packagesZ
excl_specsr)rr	�get_exclusionss
zinstall_lib.get_exclusionscCs$|jd�|g}tjj|jf|��S)zw
        Given a package name and exclusion path within that package,
        compute the full exclusion path.
        �.)�split�os�path�joinZinstall_dir)rr
Zexclusion_path�partsrrr	rszinstall_lib._exclude_pkg_pathccs$x|r|V|jd�\}}}qWdS)zn
        >>> list(install_lib._all_packages('foo.bar.baz'))
        ['foo.bar.baz', 'foo.bar', 'foo']
        rN)�
rpartition)Zpkg_name�sepZchildrrr	r'szinstall_lib._all_packagescCs,|jjsgS|jd�}|j}|r(|jjSgS)z�
        Get namespace packages (list) but only for
        single_version_externally_managed installations and empty otherwise.
        r)ZdistributionZnamespace_packagesZget_finalized_commandZ!single_version_externally_managed)rZinstall_cmdZsvemrrr	r1s

zinstall_lib._get_SVEM_NSPsccsbdVdVdVttd�s dStjjddtj��}|dV|d	V|d
V|dVdS)zk
        Generate file paths to be excluded for namespace packages (bytecode
        cache files).
        z__init__.pyz__init__.pycz__init__.pyo�get_tagN�__pycache__z	__init__.z.pycz.pyoz
.opt-1.pycz
.opt-2.pyc)�hasattr�imprrrr)�baserrr	rAs



z install_lib._gen_exclusion_paths�rc	sX|j���stjj|||�Sddlm}ddlm�g����fdd�}||||��S)Nr)�unpack_directory)�logcs<|�kr�jd|�dS�jd|tjj|���j|�|S)Nz/Skipping installation of %s (namespace package)Fzcopying %s -> %s)�warn�inforr�dirname�append)�src�dst)�excluder#rrr	�pfgs
z!install_lib.copy_tree.<locals>.pf)r�origr�	copy_treeZsetuptools.archive_utilr"Z	distutilsr#)	rZinfileZoutfileZ
preserve_modeZpreserve_timesZpreserve_symlinks�levelr"r+r)r*r#rr	r-Vs
zinstall_lib.copy_treecs.tjj|�}|j���r*�fdd�|D�S|S)Ncsg|]}|�kr|�qSrr)r�f)r*rr	�
<listcomp>xsz+install_lib.get_outputs.<locals>.<listcomp>)r,r�get_outputsr)rZoutputsr)r*r	r1ts
zinstall_lib.get_outputsN)r!r!rr!)
�__name__�
__module__�__qualname__�__doc__r
rr�staticmethodrrrr-r1rrrr	rs

r)	rr�	itertoolsrrZdistutils.command.install_libZcommandrr,rrrr	�<module>ssite-packages/setuptools/command/__pycache__/sdist.cpython-36.pyc000064400000014250147511334630021120 0ustar003

��f7�@s~ddlmZddljjZddlZddlZddlZddl	Z	ddl
mZddlm
Z
ddlZeZddd�ZGd	d
�d
e
ej�ZdS)�)�logN)�six�)�sdist_add_defaults�ccs4x.tjd�D] }x|j�|�D]
}|VqWqWdS)z%Find all files under revision controlzsetuptools.file_findersN)�
pkg_resourcesZiter_entry_points�load)�dirnameZep�item�r�/usr/lib/python3.6/sdist.py�walk_revctrlsr
cs�eZdZdZd0d2d3gZiZdd
ddgZedd�eD��Zdd�Z	dd�Z
dd�Zdd�Ze
ejdd���Zdd�Zejd4kp�d5ejko�d6knp�d7ejko�d8knZer�eZd$d%�Z�fd&d'�Zd(d)�Zd*d+�Zd,d-�Zd.d/�Z�ZS)9�sdistz=Smart sdist that finds anything supported by revision control�formats=N�6formats for source distribution (comma-separated list)�	keep-temp�kz1keep the distribution tree around after creating zarchive file(s)�	dist-dir=�d�Fdirectory to put the source distribution archive(s) in [default: dist]rz.rstz.txtz.mdccs|]}dj|�VqdS)z	README{0}N)�format)�.0Zextrrr�	<genexpr>)szsdist.<genexpr>cCs�|jd�|jd�}|j|_|jjtjj|jd��|j�x|j	�D]}|j|�qFW|j
�t|jdg�}x*|j
D] }dd|f}||krv|j|�qvWdS)N�egg_infozSOURCES.txt�
dist_filesrr)Zrun_command�get_finalized_command�filelist�append�os�path�joinr�check_readmeZget_sub_commands�make_distribution�getattr�distributionZ
archive_files)�selfZei_cmdZcmd_namer�file�datarrr�run+s


z	sdist.runcCstjj|�|j�dS)N)�origr�initialize_options�_default_to_gztar)r%rrrr*>szsdist.initialize_optionscCstjdkrdSdg|_dS)N��r�betarZgztar)r,r-rr.r)�sys�version_infoZformats)r%rrrr+Cs
zsdist._default_to_gztarc	Cs$|j��tjj|�WdQRXdS)z%
        Workaround for #516
        N)�_remove_os_linkr)rr")r%rrrr"Is
zsdist.make_distributionccs^Gdd�d�}ttd|�}yt`Wntk
r6YnXz
dVWd||k	rXttd|�XdS)zG
        In a context, remove and restore os.link if it exists
        c@seZdZdS)z&sdist._remove_os_link.<locals>.NoValueN)�__name__�
__module__�__qualname__rrrr�NoValueWsr5�linkN)r#rr6�	Exception�setattr)r5Zorig_valrrrr1Ps
zsdist._remove_os_linkcCsLytjj|�Wn6tk
rFtj�\}}}|jjjdj	��YnXdS)N�template)
r)r�
read_templater7r/�exc_info�tb_next�tb_frame�f_locals�close)r%�_�tbrrrZ__read_template_hackeszsdist.__read_template_hack��r,rr�csb|jj�r^|jd�}|jj|j��|jjs^x0|jD]&\}�}}|jj�fdd�|D��q4WdS)zgetting python files�build_pycsg|]}tjj�|��qSr)rrr )r�filename)�src_dirrr�
<listcomp>�sz.sdist._add_defaults_python.<locals>.<listcomp>N)r$Zhas_pure_modulesrr�extendZget_source_filesZinclude_package_dataZ
data_files)r%rEr@�	filenamesr)rGr�_add_defaults_python|s

zsdist._add_defaults_pythoncsDy tjrtj|�n
t�j�Wntk
r>tjd�YnXdS)Nz&data_files contains unexpected objects)rZPY2r�_add_defaults_data_files�super�	TypeErrorr�warn)r%)�	__class__rrrL�szsdist._add_defaults_data_filescCs:x4|jD]}tjj|�rdSqW|jddj|j��dS)Nz,standard file not found: should have one of z, )�READMESrr�existsrOr )r%�frrrr!�szsdist.check_readmecCs^tjj|||�tjj|d�}ttd�rJtjj|�rJtj|�|j	d|�|j
d�j|�dS)Nz	setup.cfgr6r)r)r�make_release_treerrr �hasattrrR�unlinkZ	copy_filerZsave_version_info)r%Zbase_dir�files�destrrrrT�s
zsdist.make_release_treec	Cs@tjj|j�sdStj|jd��}|j�}WdQRX|dj�kS)NF�rbz+# file GENERATED by distutils, do NOT edit
)rr�isfile�manifest�io�open�readline�encode)r%�fpZ
first_linerrr�_manifest_is_not_generated�sz sdist._manifest_is_not_generatedcCs�tjd|j�t|jd�}xl|D]d}tjr^y|jd�}Wn$tk
r\tjd|�w YnX|j	�}|j
d�s |rxq |jj|�q W|j
�dS)z�Read the manifest file (named by 'self.manifest') and use it to
        fill in 'self.filelist', the list of files to include in the source
        distribution.
        zreading manifest file '%s'rYzUTF-8z"%r not UTF-8 decodable -- skipping�#N)r�infor[r]rZPY3�decode�UnicodeDecodeErrorrO�strip�
startswithrrr?)r%r[�linerrr�
read_manifest�s
zsdist.read_manifest)rNr�@keep the distribution tree around after creating archive file(s))rrrj)rrr)rBrCrB)r,r)r,rrD)r,rB)r,rBr)r2r3r4�__doc__Zuser_optionsZnegative_optZREADME_EXTENSIONS�tuplerQr(r*r+r"�staticmethod�
contextlib�contextmanagerr1Z_sdist__read_template_hackr/r0Zhas_leaky_handler:rKrLr!rTrari�
__classcell__rr)rPrrs:
	


r)r)Z	distutilsrZdistutils.command.sdistZcommandrr)rr/r\rnZsetuptools.externrZ
py36compatrr�listZ_default_revctrlr
rrrr�<module>s
site-packages/setuptools/command/__pycache__/register.cpython-36.opt-1.pyc000064400000001005147511334630022547 0ustar003

��f�@s"ddljjZGdd�dej�ZdS)�Nc@seZdZejjZdd�ZdS)�registercCs|jd�tjj|�dS)NZegg_info)Zrun_command�origr�run)�self�r�/usr/lib/python3.6/register.pyrs
zregister.runN)�__name__�
__module__�__qualname__rr�__doc__rrrrrrsr)Zdistutils.command.registerZcommandrrrrrr�<module>ssite-packages/setuptools/command/__pycache__/dist_info.cpython-36.pyc000064400000002445147511334630021753 0ustar003

��f��@s8dZddlZddlmZddlmZGdd�de�ZdS)zD
Create a dist_info directory
As defined in the wheel specification
�N)�Command)�logc@s.eZdZdZdgZdd�Zdd�Zd	d
�ZdS)
�	dist_infozcreate a .dist-info directory�	egg-base=�e�Ldirectory containing .egg-info directories (default: top of the source tree)cCs
d|_dS)N)�egg_base)�self�r
�/usr/lib/python3.6/dist_info.py�initialize_optionsszdist_info.initialize_optionscCsdS)Nr
)r	r
r
r�finalize_optionsszdist_info.finalize_optionscCsn|jd�}|j|_|j�|j�|jdtd��d}tjdjt	j
j|���|jd�}|j|j|�dS)N�egg_infoz	.egg-infoz
.dist-infoz
creating '{}'�bdist_wheel)
Zget_finalized_commandrr
�runr�lenr�info�format�os�path�abspathZegg2dist)r	rZ
dist_info_dirrr
r
rrs

z
dist_info.runN)rrr)�__name__�
__module__�__qualname__�descriptionZuser_optionsrr
rr
r
r
rrs
r)�__doc__rZdistutils.corerZ	distutilsrrr
r
r
r�<module>ssite-packages/setuptools/command/__pycache__/build_clib.cpython-36.opt-1.pyc000064400000004504147511334630023022 0ustar003

��f��@sFddljjZddlmZddlmZddlm	Z	Gdd�dej�ZdS)�N)�DistutilsSetupError)�log)�newer_pairwise_groupc@seZdZdZdd�ZdS)�
build_clibav
    Override the default build_clib behaviour to do the following:

    1. Implement a rudimentary timestamp-based dependency system
       so 'compile()' doesn't run every time.
    2. Add more keys to the 'build_info' dictionary:
        * obj_deps - specify dependencies for each object compiled.
                     this should be a dictionary mapping a key
                     with the source filename to a list of
                     dependencies. Use an empty string for global
                     dependencies.
        * cflags   - specify a list of additional flags to pass to
                     the compiler.
    c	Cs~�xv|D�]l\}}|jd�}|dks4t|ttf�r@td|��t|�}tjd|�|jdt��}t|t�sxtd|��g}|jdt��}t|ttf�s�td|��xX|D]P}|g}	|	j|�|j|t��}
t|
ttf�s�td|��|	j|
�|j	|	�q�W|j
j||jd�}t
||�ggfk�r^|jd�}|jd	�}
|jd
�}|j
j||j||
||jd�}|j
j|||j|jd�qWdS)
N�sourceszfin 'libraries' option (library '%s'), 'sources' must be present and must be a list of source filenameszbuilding '%s' library�obj_depsz\in 'libraries' option (library '%s'), 'obj_deps' must be a dictionary of type 'source: list'�)�
output_dir�macros�include_dirs�cflags)r	r
rZextra_postargs�debug)r	r
)�get�
isinstance�list�tuplerr�info�dict�extend�appendZcompilerZobject_filenamesZ
build_tempr�compiler
Zcreate_static_libr)�selfZ	librariesZlib_nameZ
build_inforrZdependenciesZglobal_deps�sourceZsrc_depsZ
extra_depsZexpected_objectsr
rrZobjects�r� /usr/lib/python3.6/build_clib.py�build_librariess`









zbuild_clib.build_librariesN)�__name__�
__module__�__qualname__�__doc__rrrrrrsr)
Zdistutils.command.build_clibZcommandrZorigZdistutils.errorsrZ	distutilsrZsetuptools.dep_utilrrrrr�<module>ssite-packages/setuptools/command/__pycache__/bdist_wininst.cpython-36.opt-1.pyc000064400000001605147511334630023611 0ustar003

��f}�@s"ddljjZGdd�dej�ZdS)�Nc@seZdZddd�Zdd�ZdS)�
bdist_wininstrcCs |jj||�}|dkrd|_|S)zj
        Supplement reinitialize_command to work around
        http://bugs.python.org/issue20819
        �install�install_libN)rr)Zdistribution�reinitialize_commandr)�self�commandZreinit_subcommands�cmd�r	�#/usr/lib/python3.6/bdist_wininst.pyrs
z"bdist_wininst.reinitialize_commandcCs$d|_ztjj|�Wdd|_XdS)NTF)Z_is_running�origr�run)rr	r	r
rszbdist_wininst.runN)r)�__name__�
__module__�__qualname__rrr	r	r	r
rs
r)Zdistutils.command.bdist_wininstrrrr	r	r	r
�<module>ssite-packages/setuptools/command/__pycache__/bdist_egg.cpython-36.opt-1.pyc000064400000034002147511334630022655 0ustar003

��f	G�@sxdZddlmZddlmZmZddlmZddlm	Z	ddl
Z
ddlZddlZddl
Z
ddlZddlmZddlmZmZmZdd	lmZdd
lmZddlmZyddlmZmZd
d�ZWn,ek
r�ddlm Z mZdd�ZYnXdd�Z!dd�Z"dd�Z#Gdd�de�Z$e%j&dj'��Z(dd�Z)dd�Z*dd�Z+d d!d"�Z,d#d$�Z-d%d&�Z.d'd(�Z/d)d*d+d,gZ0d1d/d0�Z1dS)2z6setuptools.command.bdist_egg

Build .egg distributions�)�DistutilsSetupError)�remove_tree�mkpath)�log)�CodeTypeN)�six)�get_build_platform�Distribution�ensure_directory)�
EntryPoint)�Library)�Command)�get_path�get_python_versioncCstd�S)N�purelib)r�rr�/usr/lib/python3.6/bdist_egg.py�_get_purelibsr)�get_python_librcCstd�S)NF)rrrrrrscCs2d|krtjj|�d}|jd�r.|dd�}|S)N�.r�module�i����)�os�path�splitext�endswith)�filenamerrr�strip_module#s

rccs:x4tj|�D]&\}}}|j�|j�|||fVqWdS)zbDo os.walk in a reproducible way,
    independent of indeterministic filesystem readdir order
    N)r�walk�sort)�dir�base�dirs�filesrrr�sorted_walk+sr$c
Cs6tjd�j�}t|d��}|j||�WdQRXdS)NaR
        def __bootstrap__():
            global __bootstrap__, __loader__, __file__
            import sys, pkg_resources, imp
            __file__ = pkg_resources.resource_filename(__name__, %r)
            __loader__ = None; del __bootstrap__, __loader__
            imp.load_dynamic(__name__,__file__)
        __bootstrap__()
        �w)�textwrap�dedent�lstrip�open�write)Zresource�pyfileZ_stub_template�frrr�
write_stub5s
r-c@s�eZdZdZd*ddde�fd+d-d.d/gZdddgZdd�Zdd�Zdd�Z	dd�Z
dd�Zdd�Zd d!�Z
d"d#�Zd$d%�Zd&d'�Zd(d)�Zd	S)0�	bdist_eggzcreate an "egg" distribution�
bdist-dir=�b�1temporary directory for creating the distributionz
plat-name=�pz;platform name to embed in generated filenames (default: %s)�exclude-source-filesN�+remove all .py files from the generated egg�	keep-temp�kz/keep the pseudo-installation tree around after z!creating the distribution archive�	dist-dir=�d�-directory to put final built distributions in�
skip-build�2skip rebuilding everything (for testing/debugging)cCs.d|_d|_d|_d|_d|_d|_d|_dS)Nr)�	bdist_dir�	plat_name�	keep_temp�dist_dir�
skip_build�
egg_output�exclude_source_files)�selfrrr�initialize_optionsZszbdist_egg.initialize_optionscCs�|jd�}|_|j|_|jdkr>|jd�j}tjj|d�|_|jdkrPt	�|_|j
dd�|jdkr�tdd|j
|jt�|jj�o�|j�j
�}tjj|j|d�|_dS)N�egg_infoZbdistZeggr?z.egg)r?r?)�get_finalized_command�ei_cmdrEr<�
bdist_baserr�joinr=rZset_undefined_optionsrAr	Zegg_nameZegg_versionr�distribution�has_ext_modulesr?)rCrGrH�basenamerrr�finalize_optionscs


zbdist_egg.finalize_optionscCs�|j|jd�_tjjtjjt���}|jj	g}|j_	x�|D]�}t
|t�r�t|�dkr�tjj
|d�r�tjj|d�}tjj|�}||ks�|j|tj�r�|t|�dd�|df}|jj	j|�q<Wz"tjd|j�|jdddd�Wd||j_	XdS)N�install�r�zinstalling package data to %s�install_data)�force�root)r<rF�install_librr�normcase�realpathrrJ�
data_files�
isinstance�tuple�len�isabs�
startswith�sep�appendr�info�call_command)rCZ
site_packages�old�itemrVZ
normalizedrrr�do_install_data{s 
zbdist_egg.do_install_datacCs|jgS)N)rA)rCrrr�get_outputs�szbdist_egg.get_outputscKsTxtD]}|j||j�qW|jd|j�|jd|j�|j|f|�}|j|�|S)z8Invoke reinitialized command `cmdname` with keyword argsr@�dry_run)�INSTALL_DIRECTORY_ATTRS�
setdefaultr<r@reZreinitialize_command�run_command)rCZcmdname�kw�dirname�cmdrrrr`�s

zbdist_egg.call_commandcCs�|jd�tjd|j�|jd�}|j}d|_|jj�rJ|jrJ|jd�|j	ddd�}||_|j
�\}}g|_g}x�t|�D]|\}}t
jj|�\}	}
t
jj|jt|	�d�}|jj|�tjd	|�|js�tt
jj|�|�|j|�|jt
jd
�||<q~W|�r|j|�|jj�r |j�|j}t
jj|d�}
|j|
�|jj�rrt
jj|
d�}tjd
|�|j	d|dd�|j|
�t
jj|
d�}|�r�tjd|�|j�st|�t|d�}|j dj|��|j d�|j!�n,t
jj"|��rtjd|�|j�st
j#|�t$t
jj|d�|j%��t
jj&t
jj|j'd���rBtj(d�|j)�rR|j*�t+|j,||j-|j|j.�d�|j/�s�t0|j|jd�t1|jdg�jdt2�|j,f�dS)NrEzinstalling library code to %srNZ
build_clibrTr)Zwarn_dirz.pyzcreating stub loader for %s�/zEGG-INFO�scriptszinstalling scripts to %sZinstall_scriptsrP)�install_dirZno_epznative_libs.txtz
writing %s�wt�
zremoving %szdepends.txtzxWARNING: 'depends.txt' will not be used by setuptools 0.6!
Use the install_requires/extras_require setup() args instead.)�verbosere�mode)reZ
dist_filesr.)3rhrr_r<rFrSrJZhas_c_librariesr@r`�get_ext_outputs�stubs�	enumeraterrrrIrr^rer-rL�replacer]Zbyte_compilerWrcrrm�copy_metadata_tor
r)r*�close�isfile�unlink�write_safety_flag�zip_safe�existsrE�warnrB�zap_pyfiles�make_zipfilerArq�
gen_headerr>r�getattrr)rCZinstcmdZold_rootrk�all_outputs�ext_outputsZ
to_compiler2Zext_namer�extr+Zarchive_rootrEZ
script_dirZnative_libsZ	libs_filerrr�run�sz












z
bdist_egg.runc

Cs�tjd�x�t|j�D]�\}}}x�|D]�}tjj||�}|jd�rXtjd|�tj	|�|jd�r&|}d}t
j||�}tjj|tj|j
d�d�}	tjd||	f�ytj|	�Wntk
r�YnXtj||	�q&WqWdS)	Nz+Removing .py files from temporary directoryz.pyzDeleting %s�__pycache__z#(?P<name>.+)\.(?P<magic>[^.]+)\.pyc�namez.pyczRenaming file from [%s] to [%s])rr_�walk_eggr<rrrIr�debugrz�re�match�pardir�group�remove�OSError�rename)
rCr!r"r#r�rZpath_old�pattern�mZpath_newrrrr�s*




zbdist_egg.zap_pyfilescCs2t|jdd�}|dk	r|Stjd�t|j|j�S)Nr|z4zip_safe flag not set; analyzing archive contents...)r�rJrr~�analyze_eggr<rt)rC�saferrrr|s

zbdist_egg.zip_safec
Cs�tj|jjpd�}|jdi�jd�}|dkr0dS|js>|jrLtd|f��tj	dd�}|j
}dj|j�}|jd}tj
j|j�}d	t�}|js�ttj
j|j�|jd
�t|jd�}	|	j|�|	j�dS)N�zsetuptools.installationZeggsecutabler%zGeggsecutable entry point (%r) cannot have 'extras' or refer to a module�rraH#!/bin/sh
if [ `basename $0` = "%(basename)s" ]
then exec python%(pyver)s -c "import sys, os; sys.path.insert(0, os.path.abspath('$0')); from %(pkg)s import %(base)s; sys.exit(%(full)s())" "$@"
else
  echo $0 is not the correct name for this egg file.
  echo Please rename it back to %(basename)s and try again.
  exec false
fi
)re�a)rZ	parse_maprJZentry_points�getZattrsZextrasr�sys�versionZmodule_namerIrrrLrA�localsrerrjr)r*rx)
rCZepmZepZpyver�pkgZfullr!rL�headerr,rrrr�s*


zbdist_egg.gen_headercCsltjj|j�}tjj|d�}xJ|jjjD]<}|j|�r(tjj||t	|�d��}t
|�|j||�q(WdS)z*Copy metadata (egg info) to the target_dirr�N)rr�normpathrErIrGZfilelistr#r\rZr
Z	copy_file)rCZ
target_dirZ
norm_egg_info�prefixr�targetrrrrw:s
zbdist_egg.copy_metadata_tocCsg}g}|jdi}x|t|j�D]n\}}}x6|D].}tjj|�dj�tkr.|j|||�q.Wx*|D]"}|||d|tjj||�<qfWqW|j	j
��r|jd�}xd|jD]Z}	t
|	t�r�q�|j|	j�}
|j|
�}tjj|�jd�s�tjjtjj|j|��r�|j|�q�W||fS)zAGet a list of relative paths to C extensions in the output distror�rPrlZ	build_extzdl-)r<r$rrr�lower�NATIVE_EXTENSIONSr^rIrJrKrF�
extensionsrXrZget_ext_fullnamer�Zget_ext_filenamerLr\r})rCr�r��pathsr!r"r#rZ	build_cmdr��fullnamerrrrsFs(


&


zbdist_egg.get_ext_outputs)r/r0r1)r3Nr4�Pkeep the pseudo-installation tree around after creating the distribution archive)r5r6r�)r7r8r9)r:Nr;)�__name__�
__module__�__qualname__�descriptionrZuser_optionsZboolean_optionsrDrMrcrdr`r�rr|r�rwrsrrrrr.Cs4
	
Q'r.z.dll .so .dylib .pydccsLt|�}t|�\}}}d|kr(|jd�|||fVx|D]
}|Vq:WdS)z@Walk an unpacked egg's contents, skipping the metadata directoryzEGG-INFON)r$�nextr�)�egg_dirZwalkerr!r"r#Zbdfrrrr�fs

r�c	Cs�x0tj�D]$\}}tjjtjj|d|��r
|Sq
Wt�s<dSd}xbt|�D]V\}}}xJ|D]B}|jd�sZ|jd�rvqZqZ|jd�s�|jd�rZt	||||�o�|}qZWqJW|S)NzEGG-INFOFTz.pyz.pywz.pycz.pyo)
�safety_flags�itemsrrr}rI�can_scanr�r�scan_module)	r�rt�flag�fnr�r!r"r#r�rrrr�qs
r�cCs�x~tj�D]r\}}tjj||�}tjj|�rL|dks@t|�|kr|tj|�q
|dk	r
t|�|kr
t|d�}|j	d�|j
�q
WdS)Nrorp)r�r�rrrIr}�boolrzr)r*rx)r�r�r�r�r,rrrr{�s

r{zzip-safeznot-zip-safe)TFc
Cstjj||�}|dd�|kr"dS|t|�dd�jtjd�}||rJdpLdtjj|�d}tjdkrpd}ntjd kr�d
}nd}t	|d�}|j
|�tj|�}	|j
�d}
tjt|	��}x&d!D]}||kr�tjd||�d}
q�Wd|k�rx*d"D]"}||k�r�tjd||�d}
�q�W|
S)#z;Check whether module possibly uses unsafe-for-zipfile stuffNrPTrr�rr������rb�__file__�__path__z%s: module references %sF�inspect�	getsource�
getabsfile�
getsourcefile�getfilegetsourcelines�
findsource�getcomments�getframeinfo�getinnerframes�getouterframes�stack�tracez"%s: module MAY be using inspect.%s���)r�r�)r�r�)r�r�)r�r�r�r�r�r�r�r�r�r�r�)rrrIrZrvr]rr��version_infor)�read�marshal�loadrx�dict�fromkeys�iter_symbolsrr~)
r�r!r�rtrr�r�skipr,�coder�ZsymbolsZbadrrrr��s: 








r�ccs`x|jD]
}|VqWxD|jD]:}t|tj�r6|Vqt|t�rxt|�D]
}|VqJWqWdS)zBYield names and strings used by `code` and its nested code objectsN)�co_names�	co_constsrXrZstring_typesrr�)r�r��constrrrr��s

r�cCs4tjjd�rtjdkrdStjd�tjd�dS)N�javaZcliTz1Unable to analyze compiled code on this platform.zfPlease ask the author to include a 'zip_safe' setting (either True or False) in the package's setup.py)r��platformr\rr~rrrrr��s
r�rTrnrQZinstall_baseTr%c
s�ddl}ttjj|��d�tjd|����fdd�}|rB|jn|j}�s�|j	|||d�}	x"t
��D]\}
}}||	|
|�qfW|	j�n$x"t
��D]\}
}}|d|
|�q�W|S)aqCreate a zip file from all the files under 'base_dir'.  The output
    zip file will be named 'base_dir' + ".zip".  Uses either the "zipfile"
    Python module (if available) or the InfoZIP "zip" utility (if installed
    and found on the default search path).  If neither tool is available,
    raises DistutilsExecError.  Returns the name of the output zip file.
    rN)rez#creating '%s' and adding '%s' to itcsdx^|D]V}tjjtjj||��}tjj|�r|t��dd�}�sP|j||�tjd|�qWdS)NrPzadding '%s')	rrr�rIryrZr*rr�)�zrj�namesr�rr2)�base_dirrerr�visit�s
zmake_zipfile.<locals>.visit)�compression)�zipfilerrrrjrr_ZZIP_DEFLATEDZ
ZIP_STOREDZZipFiler$rx)
Zzip_filenamer�rqre�compressrrr�r�r�r�rjr"r#r)r�rerr��s	
r�)rrTr%)2�__doc__Zdistutils.errorsrZdistutils.dir_utilrrZ	distutilsr�typesrr�rr�r&r�Zsetuptools.externrZ
pkg_resourcesrr	r
rZsetuptools.extensionrZ
setuptoolsr
�	sysconfigrrr�ImportErrorZdistutils.sysconfigrrr$r-r.r�r��splitr�r�r�r{r�r�r�r�rfr�rrrr�<module>sL
"$
site-packages/setuptools/command/__pycache__/develop.cpython-36.pyc000064400000014316147511334630021433 0ustar003

��fn�@s�ddlmZddlmZddlmZmZddlZddlZddl	Z	ddl
mZddlm
Z
mZmZddlmZddlmZddlZGd	d
�d
eje�ZGdd�de�ZdS)
�)�convert_path)�log)�DistutilsError�DistutilsOptionErrorN)�six)�Distribution�PathMetadata�normalize_path)�easy_install)�
namespacesc@sveZdZdZdZejddgZejdgZd	Zd
d�Z	dd
�Z
dd�Zedd��Z
dd�Zdd�Zdd�Zdd�ZdS)�developzSet up package for developmentz%install package in 'development mode'�	uninstall�u�Uninstall this source package�	egg-path=N�-Set the path to be used in the .egg-link fileFcCs2|jrd|_|j�|j�n|j�|j�dS)NT)r
Z
multi_version�uninstall_linkZuninstall_namespaces�install_for_developmentZwarn_deprecated_options)�self�r�/usr/lib/python3.6/develop.py�runs
zdevelop.runcCs&d|_d|_tj|�d|_d|_dS)N�.)r
�egg_pathr
�initialize_options�
setup_pathZalways_copy_from)rrrrr's

zdevelop.initialize_optionscCs|jd�}|jr,d}|j|jf}t||��|jg|_tj|�|j�|j	�|j
jtjd��|jd}t
jj|j|�|_|j|_|jdkr�t
jj|j�|_t|j�}tt
jj|j|j��}||kr�td|��t|t|t
jj|j��|jd�|_|j|j|j|j�|_dS)N�egg_infoz-Please rename %r to %r before using 'develop'z*.eggz	.egg-linkzA--egg-path must be a relative path from the install directory to )�project_name)�get_finalized_commandZbroken_egg_inforrZegg_name�argsr
�finalize_optionsZexpand_basedirsZexpand_dirsZ
package_index�scan�glob�os�path�join�install_dir�egg_link�egg_baser�abspathr	rrr�dist�_resolve_setup_pathr)rZei�templaterZegg_link_fn�targetrrrrr .s<






zdevelop.finalize_optionscCsh|jtjd�jd�}|tjkr0d|jd�d}ttjj|||��}|ttj�krdt	d|ttj���|S)z�
        Generate a path from egg_base back to '.' where the
        setup script resides and ensure that path points to the
        setup path from $install_dir/$egg_path.
        �/z../�zGCan't get a consistent path to setup script from installation directory)
�replacer#�sep�rstrip�curdir�countr	r$r%r)r(r&rZ
path_to_setupZresolvedrrrr+Xs
zdevelop._resolve_setup_pathcCsDtjr�t|jdd�r�|jddd�|jd�|jd�}t|j�}|jd|d�|jd�|jddd�|jd�|jd�}||_	||j
_t||j
�|j
_n"|jd�|jdd	d�|jd�|j�tjr�|jtj�dt_|j�tjd
|j|j�|j�s,t|jd��}|j|j	d|j�WdQRX|jd|j
|j�dS)
NZuse_2to3FZbuild_pyr)Zinplacer)r(Z	build_extr/zCreating %s (link to %s)�w�
)rZPY3�getattr�distributionZreinitialize_commandZrun_commandrr	Z	build_librr*�locationrrZ	_providerZinstall_site_py�
setuptoolsZbootstrap_install_fromr
Zinstall_namespacesr�infor'r(�dry_run�open�writerZprocess_distributionZno_deps)rZbpy_cmdZ
build_pathZei_cmd�frrrrks4







 zdevelop.install_for_developmentcCs�tjj|j�rztjd|j|j�t|j�}dd�|D�}|j�||j	g|j	|j
gfkrhtjd|�dS|jsztj
|j�|js�|j|j�|jjr�tjd�dS)NzRemoving %s (link to %s)cSsg|]}|j��qSr)r2)�.0�linerrr�
<listcomp>�sz*develop.uninstall_link.<locals>.<listcomp>z$Link points to %s: uninstall abortedz5Note: you must uninstall or replace scripts manually!)r#r$�existsr'rr;r(r=�closerr�warnr<�unlinkZ
update_pthr*r8�scripts)rZ
egg_link_file�contentsrrrr�s
zdevelop.uninstall_linkc
Cs�||jk	rtj||�S|j|�x^|jjp,gD]N}tjjt	|��}tjj
|�}tj|��}|j
�}WdQRX|j||||�q.WdS)N)r*r
�install_egg_scripts�install_wrapper_scriptsr8rGr#r$r)r�basename�ior=�readZinstall_script)rr*Zscript_nameZscript_pathZstrmZscript_textrrrrI�s

zdevelop.install_egg_scriptscCst|�}tj||�S)N)�VersionlessRequirementr
rJ)rr*rrrrJ�szdevelop.install_wrapper_scripts)r
rr)rNr)�__name__�
__module__�__qualname__�__doc__�descriptionr
Zuser_optionsZboolean_optionsZcommand_consumes_argumentsrrr �staticmethodr+rrrIrJrrrrrs	*/rc@s(eZdZdZdd�Zdd�Zdd�ZdS)	rNaz
    Adapt a pkg_resources.Distribution to simply return the project
    name as the 'requirement' so that scripts will work across
    multiple versions.

    >>> dist = Distribution(project_name='foo', version='1.0')
    >>> str(dist.as_requirement())
    'foo==1.0'
    >>> adapted_dist = VersionlessRequirement(dist)
    >>> str(adapted_dist.as_requirement())
    'foo'
    cCs
||_dS)N)�_VersionlessRequirement__dist)rr*rrr�__init__�szVersionlessRequirement.__init__cCst|j|�S)N)r7rU)r�namerrr�__getattr__�sz"VersionlessRequirement.__getattr__cCs|jS)N)r)rrrr�as_requirement�sz%VersionlessRequirement.as_requirementN)rOrPrQrRrVrXrYrrrrrN�srN)Zdistutils.utilrZ	distutilsrZdistutils.errorsrrr#r"rLZsetuptools.externrZ
pkg_resourcesrrr	Zsetuptools.command.easy_installr
r:rZDevelopInstallerr�objectrNrrrr�<module>s4site-packages/setuptools/command/__pycache__/setopt.cpython-36.pyc000064400000010656147511334630021316 0ustar003

��f��@s�ddlmZddlmZddlmZddlZddlZddlmZddl	m
Z
ddd	d
gZddd�Zddd�Z
Gdd	�d	e
�ZGdd
�d
e�ZdS)�)�convert_path)�log)�DistutilsOptionErrorN)�configparser)�Command�config_file�edit_config�option_base�setopt�localcCsh|dkrdS|dkr,tjjtjjtj�d�S|dkrZtjdkrBdpDd}tjjtd	|��St	d
|��dS)z�Get the filename of the distutils, local, global, or per-user config

    `kind` must be one of "local", "global", or "user"
    rz	setup.cfg�globalz
distutils.cfg�user�posix�.�z~/%spydistutils.cfgz7config_file() type must be 'local', 'global', or 'user'N)
�os�path�join�dirname�	distutils�__file__�name�
expanduserr�
ValueError)Zkind�dot�r�/usr/lib/python3.6/setopt.pyrsFc		Cs.tjd|�tj�}|j|g�x�|j�D]�\}}|dkrTtjd||�|j|�q*|j|�svtjd||�|j	|�x||j�D]p\}}|dkr�tjd|||�|j
||�|j|�s�tjd||�|j|�q�tjd||||�|j|||�q�Wq*Wtjd|�|�s*t
|d	��}|j|�WdQRXdS)
aYEdit a configuration file to include `settings`

    `settings` is a dictionary of dictionaries or ``None`` values, keyed by
    command/section name.  A ``None`` value means to delete the entire section,
    while a dictionary lists settings to be changed or deleted in that section.
    A setting of ``None`` means to delete that setting.
    zReading configuration from %sNzDeleting section [%s] from %szAdding new section [%s] to %szDeleting %s.%s from %sz#Deleting empty [%s] section from %szSetting %s.%s to %r in %sz
Writing %s�w)r�debugrZRawConfigParser�read�items�infoZremove_sectionZhas_sectionZadd_sectionZ
remove_option�options�set�open�write)	�filenameZsettings�dry_runZoptsZsectionr"�option�value�frrrr!s8



c@s2eZdZdZdddgZddgZdd�Zd
d�ZdS)r	z<Abstract base class for commands that mess with config files�
global-config�g�0save options to the site-wide distutils.cfg file�user-config�u�7save options to the current user's pydistutils.cfg file�	filename=r*�-configuration file to use (default=setup.cfg)cCsd|_d|_d|_dS)N)�
global_config�user_configr&)�selfrrr�initialize_options\szoption_base.initialize_optionscCsvg}|jr|jtd��|jr,|jtd��|jdk	rB|j|j�|sT|jtd��t|�dkrjtd|��|\|_dS)Nrr
r�z/Must specify only one configuration file option)r3�appendrr4r&�lenr)r5�	filenamesrrr�finalize_optionsas
zoption_base.finalize_optionsN)r+r,r-)r.r/r0)r1r*r2)�__name__�
__module__�__qualname__�__doc__�user_options�boolean_optionsr6r;rrrrr	Lsc@sJeZdZdZdZddddgejZejdgZdd�Zdd�Z	dd�Z
dS)r
z#Save command-line options to a filez1set an option in setup.cfg or another config file�command=�c�command to set an option for�option=�o�
option to set�
set-value=�s�value of the option�remove�r�remove (unset) the valuecCs&tj|�d|_d|_d|_d|_dS)N)r	r6�commandr(�	set_valuerK)r5rrrr6�s

zsetopt.initialize_optionscCsDtj|�|jdks|jdkr&td��|jdkr@|jr@td��dS)Nz%Must specify --command *and* --optionz$Must specify --set-value or --remove)r	r;rNr(rrOrK)r5rrrr;�s

zsetopt.finalize_optionscCs*t|j|j|jjdd�|jii|j�dS)N�-�_)rr&rNr(�replacerOr')r5rrr�run�sz
setopt.runN)rBrCrD)rErFrG)rHrIrJ)rKrLrM)r<r=r>r?�descriptionr	r@rAr6r;rSrrrrr
ss)r)F)Zdistutils.utilrrrZdistutils.errorsrrZsetuptools.extern.six.movesrZ
setuptoolsr�__all__rrr	r
rrrr�<module>s

+'site-packages/setuptools/command/__pycache__/py36compat.cpython-36.pyc000064400000010703147511334630021776 0ustar003

��fz�@sdddlZddlmZddlmZddlmZddlmZGdd�d�Ze	ejd�r`Gd	d�d�ZdS)
�N)�glob)�convert_path)�sdist)�filterc@s\eZdZdZdd�Zedd��Zdd�Zdd	�Zd
d�Z	dd
�Z
dd�Zdd�Zdd�Z
dS)�sdist_add_defaultsz�
    Mix-in providing forward-compatibility for functionality as found in
    distutils on Python 3.7.

    Do not edit the code in this class except to update functionality
    as implemented in distutils. Instead, override in the subclass.
    cCs<|j�|j�|j�|j�|j�|j�|j�dS)a9Add all the default files to self.filelist:
          - README or README.txt
          - setup.py
          - test/test*.py
          - all pure Python modules mentioned in setup script
          - all files pointed by package_data (build_py)
          - all files defined in data_files.
          - all files defined as scripts.
          - all C sources listed as part of extensions or C libraries
            in the setup script (doesn't catch C headers!)
        Warns if (README or README.txt) or setup.py are missing; everything
        else is optional.
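
    Since the mix-in is meant to be overridden rather than edited, a minimal
    sketch of hooking one of its _add_defaults_* methods from a subclass
    (the *.rst pattern is only an example, and it assumes setuptools' own
    sdist command mixes this class in):

        from glob import glob
        from setuptools.command.sdist import sdist

        class sdist_with_docs(sdist):
            def _add_defaults_optional(self):
                super()._add_defaults_optional()
                self.filelist.extend(glob('*.rst'))  # also ship top-level reST docs
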
        N)�_add_defaults_standards�_add_defaults_optional�_add_defaults_python�_add_defaults_data_files�_add_defaults_ext�_add_defaults_c_libs�_add_defaults_scripts)�self�r� /usr/lib/python3.6/py36compat.py�add_defaultsszsdist_add_defaults.add_defaultscCs:tjj|�sdStjj|�}tjj|�\}}|tj|�kS)z�
        Case-sensitive path existence check

        >>> sdist_add_defaults._cs_path_exists(__file__)
        True
        >>> sdist_add_defaults._cs_path_exists(__file__.upper())
        False
        F)�os�path�exists�abspath�split�listdir)�fspathrZ	directory�filenamerrr�_cs_path_exists(s

z"sdist_add_defaults._cs_path_existscCs�|j|jjg}x�|D]�}t|t�rn|}d}x(|D] }|j|�r0d}|jj|�Pq0W|s�|jddj	|��q|j|�r�|jj|�q|jd|�qWdS)NFTz,standard file not found: should have one of z, zstandard file '%s' not found)
ZREADMES�distributionZscript_name�
isinstance�tupler�filelist�append�warn�join)rZ	standards�fnZaltsZgot_itrrrr9s 




z*sdist_add_defaults._add_defaults_standardscCs8ddg}x*|D]"}ttjjt|��}|jj|�qWdS)Nz
test/test*.pyz	setup.cfg)rrr�isfilerr�extend)rZoptional�pattern�filesrrrrNs
z)sdist_add_defaults._add_defaults_optionalcCsd|jd�}|jj�r$|jj|j��x:|jD]0\}}}}x"|D]}|jjtj	j
||��q>Wq,WdS)N�build_py)�get_finalized_commandrZhas_pure_modulesrr$�get_source_files�
data_filesrrrr!)rr'ZpkgZsrc_dirZ	build_dir�	filenamesrrrrr	Ts


z'sdist_add_defaults._add_defaults_pythoncCs�|jj�r~xr|jjD]f}t|t�rDt|�}tjj|�rz|j	j
|�q|\}}x,|D]$}t|�}tjj|�rR|j	j
|�qRWqWdS)N)rZhas_data_filesr*r�strrrrr#rr)r�item�dirnamer+�frrrr
ds


z+sdist_add_defaults._add_defaults_data_filescCs(|jj�r$|jd�}|jj|j��dS)N�	build_ext)rZhas_ext_modulesr(rr$r))rr0rrrrus

z$sdist_add_defaults._add_defaults_extcCs(|jj�r$|jd�}|jj|j��dS)N�
build_clib)rZhas_c_librariesr(rr$r))rr1rrrrzs

z'sdist_add_defaults._add_defaults_c_libscCs(|jj�r$|jd�}|jj|j��dS)N�
build_scripts)rZhas_scriptsr(rr$r))rr2rrrr
s

z(sdist_add_defaults._add_defaults_scriptsN)�__name__�
__module__�__qualname__�__doc__r�staticmethodrrrr	r
rrr
rrrrr	srrc@seZdZdS)rN)r3r4r5rrrrr�s)
rrZdistutils.utilrZdistutils.commandrZsetuptools.extern.six.movesrr�hasattrrrrr�<module>s|site-packages/setuptools/command/__pycache__/install.cpython-36.opt-1.pyc000064400000007471147511334630022406 0ustar003

��fK�@svddlmZddlZddlZddlZddlZddljjZ	ddl
Z
e	jZGdd�de	j�Zdd�e	jjD�ej
e_dS)�)�DistutilsArgErrorNc@s�eZdZdZejjddgZejjddgZddd	�fd
dd	�fgZe	e�Z
dd
�Zdd�Zdd�Z
dd�Zedd��Zdd�ZdS)�installz7Use easy_install to install the package, w/dependencies�old-and-unmanageableN�Try not to use this!�!single-version-externally-managed�5used by system package builders to create 'flat' eggsZinstall_egg_infocCsdS)NT�)�selfrr�/usr/lib/python3.6/install.py�<lambda>szinstall.<lambda>Zinstall_scriptscCsdS)NTr)r	rrr
rscCstjj|�d|_d|_dS)N)�origr�initialize_options�old_and_unmanageable�!single_version_externally_managed)r	rrr
r
 szinstall.initialize_optionscCs<tjj|�|jrd|_n|jr8|jr8|jr8td��dS)NTzAYou must specify --record or --root when building system packages)rr�finalize_options�rootr�recordr)r	rrr
r%szinstall.finalize_optionscCs(|js|jrtjj|�Sd|_d|_dS)N�)rrrr�handle_extra_pathZ	path_fileZ
extra_dirs)r	rrr
r0szinstall.handle_extra_pathcCs@|js|jrtjj|�S|jtj��s4tjj|�n|j�dS)N)	rrrr�run�_called_from_setup�inspectZcurrentframe�do_egg_install)r	rrr
r:s
zinstall.runcCsz|dkr4d}tj|�tj�dkr0d}tj|�dStj|�d}|dd�\}tj|�}|jjdd	�}|d
kox|j	dkS)a�
        Attempt to detect whether run() was called from setup() or by another
        command.  If called by setup(), the parent caller will be the
        'run_command' method in 'distutils.dist', and *its* caller will be
        the 'run_commands' method.  If called any other way, the
        immediate caller *might* be 'run_command', but it won't have been
        called by 'run_commands'. Return True in that case or if a call stack
        is unavailable. Return False otherwise.
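
    A stand-alone sketch of that frame-walking check, assuming a call stack
    is available; it mirrors the described logic but is not the command's
    literal code:

        import inspect

        def called_via_run_commands(run_frame):
            outer = inspect.getouterframes(run_frame)
            if len(outer) < 3:
                return True  # no usable call stack; assume the normal setup() path
            grandparent = outer[2]  # run() <- run_command() <- run_commands()
            module = grandparent.frame.f_globals.get('__name__', '')
            return module == 'distutils.dist' and grandparent.function == 'run_commands'
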
        Nz4Call stack not available. bdist_* commands may fail.Z
IronPythonz6For best results, pass -X:Frames to enable call stack.T���__name__rzdistutils.distZrun_commands)
�warnings�warn�platformZpython_implementationrZgetouterframesZgetframeinfo�	f_globals�getZfunction)Z	run_frame�msg�resZcaller�infoZ
caller_modulerrr
rEs


zinstall._called_from_setupcCs�|jjd�}||jd|j|jd�}|j�d|_|jjtjd��|j	d�|jj
d�jg}tj
rp|jdtj
�||_|j�dt_
dS)N�easy_install�x)�argsrr�.z*.eggZ	bdist_eggr)ZdistributionZget_command_classrrZensure_finalizedZalways_copy_fromZ
package_index�scan�globZrun_commandZget_command_objZ
egg_output�
setuptoolsZbootstrap_install_from�insertr&r)r	r$�cmdr&rrr
r`s
zinstall.do_egg_install)rNr)rNr)r�
__module__�__qualname__�__doc__rrZuser_optionsZboolean_options�new_commands�dict�_ncr
rrr�staticmethodrrrrrr
rs 


rcCsg|]}|dtjkr|�qS)r)rr2)�.0r,rrr
�
<listcomp>{sr5)Zdistutils.errorsrrr)rrZdistutils.command.installZcommandrrr*�_installZsub_commandsr0rrrr
�<module>slsite-packages/setuptools/command/__pycache__/easy_install.cpython-36.pyc000064400000176663147511334640022503 0ustar003

��f�T�@sdZddlmZddlmZddlmZmZddlmZmZm	Z	m
Z
ddlmZm
Z
ddlmZmZddlmZdd	lmZdd
lZdd
lZdd
lZdd
lZdd
lZdd
lZdd
lZdd
lZdd
lZdd
lZdd
lZdd
l Z dd
l!Z!dd
l"Z"dd
l#Z#dd
l$Z$dd
l%Z%ddl&m'Z'ddl(m)Z)m*Z*dd
l+m,Z,ddl-m.Z.ddl/m0Z0m1Z1ddl2m3Z3ddl4m5Z5ddl6m7Z7ddl8m9Z9m:Z:m;Z;ddl4m<Z<m=Z=ddl>m?Z?ddl@mAZAmBZBmCZCmDZDmEZEmFZFmGZGmHZHmIZImJZJmKZKmLZLmMZMmNZNmOZOdd
lPZ@ejQde@jRd�ddddddgZSdd �ZTd!d�ZUe'jV�r2d"d#�ZWd$d%�ZXnd&d#�ZWd'd%�ZXd(d)�ZYGd*d�de,�ZZd+d,�Z[d-d.�Z\d/d0�Z]d1d�Z^d2d�Z_Gd3d�deG�Z`Gd4d5�d5e`�Zaejbjcd6d7�d8k�r�eaZ`d9d:�Zdd;d<�Zed=d>�Zfd?d@�ZgdpdAdB�ZhdCdD�ZidEdF�ZjdGejkk�rejZlndHdI�ZldqdKdL�ZmdMdN�ZndOdP�ZodQdR�ZpyddSlmqZrWnesk
�r^dTdU�ZrYnXdVdW�ZqGdXdY�dYet�Zueujv�ZwGdZd[�d[eu�ZxGd\d]�d]ey�ZzGd^d_�d_ez�Z{Gd`da�dae{�Z|ezj}Z}ezj~Z~dbdc�Zddde�Z�dfeefdgdh�Z�didj�Z�dkdl�Z�drdmd�Z�e"j�dndo��Z�d
S)sa%
Easy Install
------------

A tool for doing automatic download/extract/build of distutils-based Python
packages.  For detailed documentation, see the accompanying EasyInstall.txt
file, or visit the `EasyInstall home page`__.

__ https://setuptools.readthedocs.io/en/latest/easy_install.html

�)�glob)�get_platform)�convert_path�
subst_vars)�DistutilsArgError�DistutilsOptionError�DistutilsError�DistutilsPlatformError)�INSTALL_SCHEMES�SCHEME_KEYS)�log�dir_util)�
first_line_re)�find_executableN)�six)�configparser�map)�Command)�	run_setup)�get_path�get_config_vars)�rmtree_safe)�setopt)�unpack_archive)�PackageIndex�parse_requirement_arg�
URL_SCHEME)�	bdist_egg�egg_info)�Wheel)�yield_lines�normalize_path�resource_string�ensure_directory�get_distribution�find_distributions�Environment�Requirement�Distribution�PathMetadata�EggMetadata�
WorkingSet�DistributionNotFound�VersionConflict�DEVELOP_DIST�default)�category�samefile�easy_install�PthDistributions�extract_wininst_cfg�main�get_exe_prefixescCstjd�dkS)N�P�)�struct�calcsize�r;r;�"/usr/lib/python3.6/easy_install.py�is_64bitIsr=cCsjtjj|�otjj|�}ttjd�o&|}|r:tjj||�Stjjtjj|��}tjjtjj|��}||kS)z�
    Determine if two paths reference the same file.

    Augments os.path.samefile to work on Windows and
    suppresses errors if the path doesn't exist.
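
    An illustrative restatement of that behavior (not the module's literal
    code): defer to os.path.samefile when both paths exist, otherwise fall
    back to comparing normalized paths:

        import os

        def same_file(p1, p2):
            both_exist = os.path.exists(p1) and os.path.exists(p2)
            if hasattr(os.path, 'samefile') and both_exist:
                return os.path.samefile(p1, p2)
            # Fallback (e.g. a path that does not exist yet): lexical comparison.
            return (os.path.normcase(os.path.normpath(p1)) ==
                    os.path.normcase(os.path.normpath(p2)))
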
    r1)�os�path�exists�hasattrr1�normpath�normcase)Zp1Zp2Z
both_existZuse_samefileZnorm_p1Znorm_p2r;r;r<r1MscCs|S)Nr;)�sr;r;r<�	_to_ascii_srEcCs*ytj|d�dStk
r$dSXdS)N�asciiTF)rZ	text_type�UnicodeError)rDr;r;r<�isasciibs
rHcCs
|jd�S)NrF)�encode)rDr;r;r<rEjscCs(y|jd�dStk
r"dSXdS)NrFTF)rIrG)rDr;r;r<rHms

cCstj|�j�jdd�S)N�
z; )�textwrap�dedent�strip�replace)�textr;r;r<�<lambda>usrPc@s�eZdZdZdZdZd�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�gZdd
dd
dd0d3d9d<g	Zej	�r�d@ej
ZejdAdef�ejdA�d*diZ
eZdBdC�ZdDdE�ZdFdG�ZedHdI��ZdJdK�ZdLdM�ZdNdO�ZdPdQ�ZdRdS�ZdTdU�ZdVdW�ZdXdY�ZdZd[�Zejd\�j �Z!ejd]�j �Z"ejd^�j �Z#d_d`�Z$dadb�Z%dcdd�Z&dedf�Z'dgdh�Z(didj�Z)e*j+dkdl��Z,d�dndo�Z-d�dpdq�Z.drds�Z/d�dtdu�Z0dvdw�Z1dxdy�Z2dzd{�Z3d�d|d}�Z4ed~d��Z5d�ffd�d��Z6d�d��Z7d�d��Z8d�d��Z9d�d��Z:d�d��Z;d�d��Z<ejd��j �Z=ejd��Z>d�d�d��Z?ejd��j �Z@d�d��ZAd�d��ZBd�d��ZCd�d��ZDd�d��ZEd�d��ZFd�d��ZGd�d��ZHejd��j �ZId�d��ZJd�d��ZKd�d��ZLeMeMd�d�d��d��ZNeMd�d�d��ZOd�d��ZPdS)�r2z'Manage a download/build/install processz Find/get/install Python packagesT�prefix=N�installation prefix�zip-ok�z�install package as a zipfile�
multi-version�m�%make apps have to require() a version�upgrade�U�1force upgrade (searches PyPI for latest versions)�install-dir=�d�install package to DIR�script-dir=rD�install scripts to DIR�exclude-scripts�x�Don't install scripts�always-copy�a�'Copy all needed packages to install dir�
index-url=�i� base URL of Python Package Index�find-links=�f�(additional URL(s) to search for packages�build-directory=�b�/download/extract/build in DIR; keep the results�	optimize=�O�lalso compile with optimization: -O1 for "python -O", -O2 for "python -OO", and -O0 to disable [default: -O0]�record=�3filename in which to record list of installed files�always-unzip�Z�*don't install as a zipfile, no matter what�
site-dirs=�S�)list of directories where .pth files work�editable�e�+Install specified packages in editable form�no-deps�N�don't install dependencies�allow-hosts=�H�$pattern(s) that hostnames must match�local-snapshots-ok�l�(allow building eggs from local checkouts�version�"print version information and exit�
no-find-links�9Don't load find-links defined in packages being installedz!install in user site-package '%s'�usercCs,d|_d|_|_d|_|_|_d|_d|_d|_d|_	d|_
|_d|_|_
|_d|_|_|_d|_|_|_d|_d|_d|_d|_d|_d|_d|_d|_d|_tjr�tj |_!tj"|_#nd|_!d|_#d|_$d|_%d|_&|_'d|_(i|_)d|_*d|_+|j,j-|_-|j,j.||j,j/d��dS)NrFr2)0r��zip_ok�local_snapshots_ok�install_dir�
script_dir�exclude_scripts�	index_url�
find_links�build_directory�args�optimize�recordrY�always_copy�
multi_versionr{�no_deps�allow_hosts�root�prefix�	no_reportr��install_purelib�install_platlib�install_headers�install_lib�install_scripts�install_data�install_base�install_platbase�site�ENABLE_USER_SITE�	USER_BASE�install_userbase�	USER_SITE�install_usersite�
no_find_links�
package_index�pth_file�always_copy_from�	site_dirs�installed_projects�sitepy_installedZ_dry_run�distribution�verboseZ_set_command_options�get_option_dict)�selfr;r;r<�initialize_options�sF

zeasy_install.initialize_optionscCs"dd�|D�}tt|j|��dS)Ncss*|]"}tjj|�stjj|�r|VqdS)N)r>r?r@�islink)�.0�filenamer;r;r<�	<genexpr>�sz/easy_install.delete_blockers.<locals>.<genexpr>)�listr�_delete_path)r��blockersZextant_blockersr;r;r<�delete_blockers�szeasy_install.delete_blockerscCsJtjd|�|jrdStjj|�o.tjj|�}|r8tntj}||�dS)NzDeleting %s)	r�info�dry_runr>r?�isdirr��rmtree�unlink)r�r?Zis_treeZremoverr;r;r<r��szeasy_install._delete_pathcCs6tjdd�}td�}d}t|jft���t��dS)zT
        Render the Setuptools version and installation details, then exit.
        N��
setuptoolsz=setuptools {dist.version} from {dist.location} (Python {ver}))�sysr�r$�print�format�locals�
SystemExit)Zver�dist�tmplr;r;r<�_render_version�s
zeasy_install._render_versionc	Cst|jo|j�tjj�d}tdd�\}}|jj�|jj�|jj�||dd�|d|d||||t	tdd�d�|_
tjr�|j
|j
d	<|j|j
d
<|j�|j�|j�|jddd
d�|jdkr�|j|_|jdkr�d|_|jdd!�|jdd"�|j�r|j�r|j|_|j|_|jdd#�tttj�}t�|_|jdk	�r�dd�|jjd�D�}xV|D]N}t jj!|��s~t"j#d|�n,t|�|k�r�t$|d��n|jj%t|���q^W|j&�s�|j'�|j(�p�d|_(|jdd�|_)x4|jt|j�fD] }||j)k�r�|j)j*d|��q�W|j+dk	�r8dd�|j+jd�D�}ndg}|j,dk�r`|j-|j(|j)|d�|_,t.|j)tj�|_/|j0dk	�r�t1|j0t2j3��r�|j0j�|_0ng|_0|j4�r�|j,j5|j)tj�|j�s�|j,j6|j0�|jdd$�t1|j7t8��s@y2t8|j7�|_7d|j7k�odkn�st9�Wnt9k
�r>t$d��YnX|j&�rZ|j:�rZt;d��|j<�sjt;d ��g|_=dS)%Nrr��exec_prefixr���abiflags�)Z	dist_nameZdist_versionZ
dist_fullname�
py_version�py_version_short�py_version_nodotZ
sys_prefixr�Zsys_exec_prefixr�r��userbaseZusersiter�r�r�r�Fr�r��installr�cSsg|]}tjj|j���qSr;)r>r?�
expanduserrM)r�rDr;r;r<�
<listcomp>3sz1easy_install.finalize_options.<locals>.<listcomp>�,z"%s (in --site-dirs) does not existz$ (in --site-dirs) is not on sys.pathzhttps://pypi.org/simple/cSsg|]}|j��qSr;)rM)r�rDr;r;r<r�Hs�*)Zsearch_path�hostsr�z--optimize must be 0, 1, or 2z9Must specify a build directory (-b) when using --editablez:No urls, filenames, or requirements specified (see --help))r�r�)r�r�)r�r�)r�r�)>r�r�r��splitrr�Zget_nameZget_versionZget_fullname�getattr�config_varsr�r�r�r��_fix_install_dir_for_user_site�expand_basedirs�expand_dirs�_expandr�r�r�Zset_undefined_optionsr�r�r�rr!r?�
get_site_dirs�
all_site_dirsr�r>r�r�warnr�appendr{�check_site_dirr��shadow_path�insertr�r��create_indexr&�local_indexr��
isinstancerZstring_typesr�Zscan_egg_links�add_find_linksr��int�
ValueErrorr�rr��outputs)	r�r�r�r�rBr�r]Z	path_itemr�r;r;r<�finalize_options�s�



zeasy_install.finalize_optionscCs`|jstjrdS|j�|jdkr2d}t|��|j|_|_tj	j
dd�d}|j|�dS)z;
        Fix the install_dir if "--user" was used.
        Nz$User base directory is not specified�posixZunixZ_user)r�r�r��create_home_pathr�r	r�r�r>�namerN�
select_scheme)r��msgZscheme_namer;r;r<r�ms
z+easy_install._fix_install_dir_for_user_sitecCs\xV|D]N}t||�}|dk	rtjdks0tjdkr<tjj|�}t||j�}t|||�qWdS)Nr��nt)r�r>r�r?r�rr��setattr)r��attrs�attr�valr;r;r<�
_expand_attrs|s

zeasy_install._expand_attrscCs|jdddg�dS)zNCalls `os.path.expanduser` on install_base, install_platbase and
        root.r�r�r�N)r�)r�r;r;r<r��szeasy_install.expand_basedirscCsddddddg}|j|�dS)z+Calls `os.path.expanduser` on install dirs.r�r�r�r�r�r�N)r�)r��dirsr;r;r<r��szeasy_install.expand_dirscCs�|j|jjkrtj|j�z�x|jD]}|j||j�q$W|jr�|j}|j	r�t
|j	�}x(tt
|��D]}|||d�||<qfWddlm
}|j|j|j|fd|j�|j�Wdtj|jj�XdS)Nr)�	file_utilz'writing list of installed files to '%s')r�r�r�
set_verbosityr�r2r�r�r�r��len�range�	distutilsr��executeZ
write_file�warn_deprecated_options)r��specr�Zroot_lenZcounterr�r;r;r<�run�s$

zeasy_install.runcCsDytj�}Wn"tk
r.tjdtj�}YnXtjj|j	d|�S)z�Return a pseudo-tempname base in the install directory.
        This code is intentionally naive; if a malicious party can write to
        the target directory you're already in deep doodoo.
        rztest-easy-install-%s)
r>�getpid�	Exception�randomZrandintr��maxsizer?�joinr�)r��pidr;r;r<�pseudo_tempname�s
zeasy_install.pseudo_tempnamecCsdS)Nr;)r�r;r;r<r�sz$easy_install.warn_deprecated_optionscCsdt|j�}tjj|d�}tjj|�sTytj|�Wn ttfk
rR|j	�YnX||j
k}|rv|jrv|j�}nd|j
�d}tjj|�}y*|r�tj|�t|d�j�tj|�Wn ttfk
r�|j	�YnX|r�|jr�t|j���|�r|jdk�rt||j
�|_nd|_|ttt��k�r6d|_n$|j�rZtjj|��rZd|_d|_||_dS)z;Verify that self.install_dir is .pth-capable dir, if neededzeasy-install.pthz.write-test�wNT)r!r�r>r?rr@�makedirs�OSError�IOError�cant_write_to_targetr�r��check_pth_processingrr��open�closer�no_default_version_msgr�r3r�_pythonpathr�)r��instdirr�Zis_site_dirZtestfileZtest_existsr;r;r<r��s>



zeasy_install.check_site_diraS
        can't create or remove files in install directory

        The following error occurred while trying to add or remove files in the
        installation directory:

            %s

        The installation directory you specified (via --install-dir, --prefix, or
        the distutils default setting) was:

            %s
        z�
        This directory does not currently exist.  Please create it and try again, or
        choose a different installation directory (using the -d or --install-dir
        option).
        a�
        Perhaps your account does not have write access to this directory?  If the
        installation directory is a system-owned directory, you may need to sign in
        as the administrator or "root" account.  If you do not have administrative
        access to this machine, you may wish to choose a different installation
        directory, preferably one that is listed in your PYTHONPATH environment
        variable.

        For information on other options, you may wish to consult the
        documentation at:

          https://setuptools.readthedocs.io/en/latest/easy_install.html

        Please make the appropriate changes for your system and try again.
        cCsP|jtj�d|jf}tjj|j�s6|d|j7}n|d|j7}t	|��dS)N�rJ)
�_easy_install__cant_write_msgr��exc_infor�r>r?r@�_easy_install__not_exists_id�_easy_install__access_msgr)r�r�r;r;r<rs
z!easy_install.cant_write_to_targetc
Cs�|j}tjd|�|j�d}|d}tjj|�}td�d}y8|rNtj|�tjj	|�}t
jj|dd�t
|d�}Wn ttfk
r�|j�Yn�Xz�|j|jft���|j�d	}tj}tjd
k�rtjj|�\}}	tjj|d�}
|	j�dk�otjj|
�}|�r|
}d
dlm}||dddgd
�tjj|��rJtjd|�dSWd	|�r\|j�tjj|��rttj|�tjj|��r�tj|�X|j�s�tjd|�dS)z@Empirically verify whether .pth files are supported in inst. dirz Checking .pth file support in %sz.pthz.okzz
            import os
            f = open({ok_file!r}, 'w')
            f.write('OK')
            f.close()
            rJT)�exist_okrNr�zpythonw.exez
python.exer)�spawnz-Ez-c�passz-TEST PASSED: %s appears to support .pth filesz+TEST FAILED: %s does NOT support .pth filesF)r�rr�rr>r?r@�
_one_linerr��dirname�
pkg_resourcesZ
py31compatrrrrr�writer�r�rr��
executabler�r�r�lower�distutils.spawnr r�r�)
r�rr�Zok_fileZ	ok_existsr�r#rkr&�basenameZaltZuse_altr r;r;r<rsV


z!easy_install.check_pth_processingcCs\|jrN|jd�rNx:|jd�D],}|jd|�r2q|j|||jd|��qW|j|�dS)z=Write all the scripts for `dist`, unless scripts are excluded�scriptszscripts/N)r�Zmetadata_isdirZmetadata_listdir�install_scriptZget_metadata�install_wrapper_scripts)r�r��script_namer;r;r<�install_egg_scriptsSsz easy_install.install_egg_scriptscCs\tjj|�rLxJtj|�D].\}}}x"|D]}|jjtjj||��q(WqWn|jj|�dS)N)r>r?r��walkr�r�r)r�r?�baser��filesr�r;r;r<�
add_outputas

 zeasy_install.add_outputcCs|jrtd|f��dS)NzjInvalid argument %r: you can't use filenames or URLs with --editable (except via the --find-links option).)r{r)r�rr;r;r<�not_editableiszeasy_install.not_editablecCs<|js
dStjjtjj|j|j��r8td|j|jf��dS)Nz2%r already exists in %s; can't do a checkout there)r{r>r?r@rr��keyr)r�rr;r;r<�check_editableqszeasy_install.check_editableccs@tjtjd�d�}zt|�VWdtjj|�o8tt	|��XdS)Nz
easy_install-)r�)
�tempfile�mkdtempr�u�strr>r?r@r�r)r��tmpdirr;r;r<�_tmpdir{szeasy_install._tmpdirFcCs|js|j�|j���}t|t�s�t|�rT|j|�|jj||�}|j	d|||d�St
jj|�r||j|�|j	d|||d�St
|�}|j|�|jj|||j|j|j|j�}|dkr�d|}|jr�|d7}t|��n0|jtkr�|j|||d�|S|j	||j||�SWdQRXdS)NTz+Could not find suitable distribution for %rz2 (--always-copy skips system and development eggs)�Using)r{�install_site_pyr;r�r'rr3r��download�install_itemr>r?r@rr5Zfetch_distributionrYr�r�rZ
precedencer.�process_distribution�location)r�r�depsr:�dlr�r�r;r;r<r2�s2






zeasy_install.easy_installcCs|p|j}|ptjj|�|k}|p,|jd�}|pT|jdk	oTtjjt|��t|j�k}|r�|r�x$|j|jD]}|j	|krnPqnWd}t
jdtjj|��|r�|j
|||�}x<|D]}|j|||�q�Wn |j|�g}|j||d|d�|dk	�rx|D]}||kr�|Sq�WdS)Nz.eggTz
Processing %srr<)r�r>r?r#�endswithr�r!r��project_namerArr�r)�install_eggsr@�egg_distribution)r�rr>r:rBZinstall_neededr�Zdistsr;r;r<r?�s.






zeasy_install.install_itemcCs@t|}x2tD]*}d|}t||�dkrt||||�qWdS)z=Sets the install directories by applying the install schemes.Zinstall_N)r
rr�r�)r�r��schemer4Zattrnamer;r;r<r��s

zeasy_install.select_schemecGs�|j|�|jj|�||j|jkr2|jj|�|jj|�|j|�||j|j<tj	|j
||f|���|jd�r�|jr�|jj
|jd��|r�|jr�dS|dk	r�|j|jkr�tjd|�dS|dks�||kr�|j�}tt|��}tj	d|�ytg�j|g|j|j�}Wn^tk
�rB}ztt|���WYdd}~Xn0tk
�rp}zt|j���WYdd}~XnX|j�s�|j�r�x*|D]"}|j|jk�r�|j|j���q�Wtj	d|�dS)Nzdependency_links.txtzSkipping dependencies for %szProcessing dependencies for %sz'Finished processing dependencies for %s)�
update_pthr��addr�r4�remover.r�rr��installation_report�has_metadatar�r�Zget_metadata_linesr�r��as_requirementr'r9r+Zresolver2r,rr-Zreportr�)r�Zrequirementr�rBr�ZdistreqZdistrosr|r;r;r<r@�sB



z!easy_install.process_distributioncCs2|jdk	r|jS|jd�r dS|jd�s.dSdS)Nznot-zip-safeTzzip-safeF)r�rM)r�r�r;r;r<�should_unzip�s


zeasy_install.should_unzipcCs�tjj|j|j�}tjj|�r:d}tj||j|j|�|Stjj|�rL|}nRtjj	|�|krftj
|�tj|�}t|�dkr�tjj||d�}tjj|�r�|}t
|�tj||�|S)Nz<%r already exists in %s; build directory %s will not be keptrr)r>r?rr�r4r@rr�r�r#r��listdirrr#�shutil�move)r�r�
dist_filename�
setup_base�dstr��contentsr;r;r<�
maybe_moves"

zeasy_install.maybe_movecCs0|jr
dSx tj�j|�D]}|j|�qWdS)N)r��ScriptWriter�best�get_args�write_script)r�r�r�r;r;r<r,sz$easy_install.install_wrapper_scriptscCsNt|j��}t||�}|r8|j|�t�}tj|�|}|j|t|�d�dS)z/Generate a legacy script wrapper and install itrnN)	r9rN�is_python_script�_load_templater�rX�
get_headerr[rE)r�r�r-�script_text�dev_pathrZ	is_scriptZbodyr;r;r<r+"s
zeasy_install.install_scriptcCs(d}|r|jdd�}td|�}|jd�S)z�
        There are a couple of template scripts in the package. This
        function loads one of them and prepares it for use.
        zscript.tmplz.tmplz (dev).tmplr�zutf-8)rNr"�decode)r`r�Z	raw_bytesr;r;r<r],s

zeasy_install._load_template�tcs��j�fdd�|D��tjd|�j�tjj�j|�}�j|��jrLdSt	�}t
|�tjj|�rptj|�t
|d|��}|j|�WdQRXt|d|�dS)z1Write an executable file to the scripts directorycsg|]}tjj�j|��qSr;)r>r?rr�)r�rb)r�r;r<r�>sz-easy_install.write_script.<locals>.<listcomp>zInstalling %s script to %sNri�)r�rr�r�r>r?rr2r��
current_umaskr#r@r�rr%�chmod)r�r-rV�moder��target�maskrkr;)r�r<r[;s

zeasy_install.write_scriptcCs`|j�jd�r|j||�gS|j�jd�r8|j||�gS|j�jd�rT|j||�gS|}tjj|�r�|jd�r�t|||j	�ntjj
|�r�tjj|�}|j|�r�|j
r�|dk	r�|j|||�}tjj|d�}tjj|��s2ttjj|dd��}|�stdtjj|���t|�dk�r*td	tjj|���|d
}|j�rPtj|j||��gS|j||�SdS)Nz.eggz.exez.whlz.pyzsetup.pyr�z"Couldn't find a setup script in %srzMultiple setup scripts in %sr)r'rD�install_egg�install_exe�
install_wheelr>r?�isfiler�unpack_progressr��abspath�
startswithr�rWrr@rrrr{rr��report_editable�build_and_install)r�rrSr:rT�setup_scriptZsetupsr;r;r<rFOs<
zeasy_install.install_eggscCs>tjj|�r"t|tjj|d��}nttj|��}tj	||d�S)NzEGG-INFO)�metadata)
r>r?r�r)rr*�	zipimport�zipimporterr(Z
from_filename)r��egg_pathrrr;r;r<rG{s

zeasy_install.egg_distributionc
Cs�tjj|jtjj|��}tjj|�}|js2t|�|j|�}t	||��s|tjj
|�rttjj|�rttj
||jd�n"tjj|�r�|jtj|fd|�y�d}tjj
|�r�|j|�r�tjd}}ntjd}}nL|j|�r�|j|�|jd}}n*d}|j|��rtjd}}ntjd}}|j|||f|dtjj|�tjj|�f�t||d	�Wn$tk
�rzt|dd	��YnX|j|�|j|�S)
N)r�z	Removing FZMovingZCopyingZ
ExtractingTz	 %s to %s)�fix_zipimporter_caches)r>r?rr�r)rmr�r#rGr1r�r�r
�remove_treer@rr�rnrQrRZcopytreerOZmkpath�unpack_and_compileZcopy2r#�update_dist_cachesr	r2)r�rur:�destinationr�Znew_dist_is_zippedrkrWr;r;r<rh�sT






zeasy_install.install_eggcsTt|�}|dkrtd|��td|jdd�|jdd�t�d�}tjj||j�d�}||_	|d}tjj|d�}tjj|d	�}t
|�t||�|_|j
||�tjj|��st|d
�}	|	jd�x<|jd�D].\}
}|
dkr�|	jd
|
jdd�j�|f�q�W|	j�tjj|d��|j�fdd�tj|�D��tj|||j|jd�|j||�S)Nz(%s is not a valid distutils Windows .exerrr�r�)rEr��platformz.eggz.tmpzEGG-INFOzPKG-INFOrzMetadata-Version: 1.0
�target_versionz%s: %s
�_�-r*csg|]}tjj�|d��qS)r)r>r?r)r�r�)r�r;r<r��sz,easy_install.install_exe.<locals>.<listcomp>)r�r�)r4rr(�getrr>r?r�egg_namerAr#r)Z	_provider�
exe_to_eggr@rr%�itemsrN�titlerr�rXrZrZmake_zipfiler�r�rh)r�rSr:�cfgr�ru�egg_tmpZ	_egg_infoZpkg_infrk�k�vr;)r�r<ri�s<



"
zeasy_install.install_execs>t|��g�g�i������fdd�}t|�|�g}xt�D]l}|j�jd�r>|jd�}|d}tj|d�d|d<tjj	�f|��}�j
|�|j
|�tj||�q>W|j��tj
tjj	�d�tj�|��xbdD]Z}	t�|	r�tjj	�d|	d
�}
tjj|
�s�t|
d�}|jdj	t�|	�d�|j�q�Wd
S)z;Extract a bdist_wininst to the directories an egg would usecs�|j�}xԈD]�\}}|j|�r||t|�d�}|jd�}tjj�f|��}|j�}|jd�sl|jd�r�tj	|d
�|d<d�tjj
|d�d<�j|�n4|jd�r�|dkr�d�tjj
|d�d<�j|�|SqW|jd�s�tj
d	|�dS)N�/z.pydz.dllrrz.pyzSCRIPTS/z.pthzWARNING: can't process %s���r�)r'rnrr�r>r?rrDr�strip_module�splitextr�rr�)�srcrUrD�old�new�partsrC)r��native_libs�prefixes�
to_compile�	top_levelr;r<�process�s$



z(easy_install.exe_to_egg.<locals>.processz.pydr�rz.pyzEGG-INFOr�r�z.txtrrJNr�r�r�)r�r�)r6rr'rDr�rr�r>r?rr�Z
write_stub�byte_compileZwrite_safety_flagZanalyze_eggr�r@rr%r)r�rSr�r�Zstubs�resr�ZresourceZpyfiler�Ztxtrkr;)r�r�r�r�r�r<r��s6







zeasy_install.exe_to_eggc
Cs�t|�}|j�st�tjj|j|j��}tjj|�}|j	sBt
|�tjj|�rntjj|�rnt
j||j	d�n"tjj|�r�|jtj|fd|�z.|j|j|fdtjj|�tjj|�f�Wdt|dd�X|j|�|j|�S)N)r�z	Removing zInstalling %s to %sF)rv)rZ
is_compatible�AssertionErrorr>r?rr�r�rmr�r#r�r�r
rwr@rr�Zinstall_as_eggr)r#ryr2rG)r�Z
wheel_pathr:Zwheelrzr;r;r<rjs.


zeasy_install.install_wheela(
        Because this distribution was installed --multi-version, before you can
        import modules from this package in an application, you will need to
        'import pkg_resources' and then use a 'require()' call similar to one of
        these examples, in order to select the desired version:

            pkg_resources.require("%(name)s")  # latest installed version
            pkg_resources.require("%(name)s==%(version)s")  # this exact version
            pkg_resources.require("%(name)s>=%(version)s")  # this version or higher
        z�
        Note also that the installation directory must be on sys.path at runtime for
        this to work.  (e.g. by being the application's script directory, by being on
        PYTHONPATH, or by being added to sys.path by your code.)
        �	Installedc	Cs`d}|jr@|jr@|d|j7}|jtttj�kr@|d|j7}|j	}|j
}|j}d}|t�S)z9Helpful installation message for display to package usersz
%(what)s %(eggloc)s%(extras)srJr�)
r�r��_easy_install__mv_warningr�rr!r�r?�_easy_install__id_warningrArEr�r�)	r�Zreqr�Zwhatr�Zegglocr�r�Zextrasr;r;r<rLIsz easy_install.installation_reportaR
        Extracted editable version of %(spec)s to %(dirname)s

        If it uses setuptools in its setup script, you can activate it in
        "development" mode by going to that directory and running::

            %(python)s setup.py develop

        See the setuptools documentation for the "develop" command for more info.
        cCs"tjj|�}tj}d|jt�S)NrJ)r>r?r#r�r&�_easy_install__editable_msgr�)r�rrqr#�pythonr;r;r<robszeasy_install.report_editablecCs�tjjdt�tjjdt�t|�}|jdkrNd|jd}|jdd|�n|jdkrd|jdd�|jrv|jdd	�t	j
d
|t|�dd�dj|��yt
||�Wn6tk
r�}ztd|jdf��WYdd}~XnXdS)
Nzdistutils.command.bdist_eggzdistutils.command.egg_infor�r�rrr~z-qz-nz
Running %s %s� zSetup script exited with %s)r��modules�
setdefaultrrr�r�r�r�rr�rrrr�rr�)r�rqrTr�r�r;r;r<rgs 

 zeasy_install.run_setupc	Cs�ddg}tjdtjj|�d�}z�|jtjj|��|j|�|j|||�t|g�}g}x2|D]*}x$||D]}|j|j	|j
|��qlWq^W|r�|jr�tj
d|�|St|�tj|j�XdS)Nrz
--dist-dirz
egg-dist-tmp-)r��dirz+No eggs found in %s (setup script problem?))r6r7r>r?r#�_set_fetcher_optionsr�rr&rhrAr�rr�r�rr�)	r�rqrTr�Zdist_dirZall_eggsZeggsr4r�r;r;r<rp{s$


zeasy_install.build_and_installc	Cst|jjd�j�}d
}i}x2|j�D]&\}}||kr4q"|d||jdd	�<q"Wt|d
�}tjj|d�}t	j
||�dS)a
        When easy_install is about to run bdist_egg on a source dist, that
        source dist might have 'setup_requires' directives, requiring
        additional fetching. Ensure the fetcher options given to easy_install
        are available to that command as well.
        r2r�r�r�r�r�rr}r~)r2z	setup.cfgN)r�r�r�r�r�r�)r�r��copyr�rN�dictr>r?rrZedit_config)	r�r0Zei_optsZfetch_directivesZ
fetch_optionsr4r�ZsettingsZcfg_filenamer;r;r<r��s	
z!easy_install._set_fetcher_optionscCs0|jdkrdSxX|j|jD]H}|js2|j|jkrtjd|�|jj|�|j|jkr|jj|j�qW|js�|j|jjkr�tjd|�n2tjd|�|jj	|�|j|jkr�|jj
|j�|j�s,|jj�|jdk�r,t
jj|jd�}t
jj|��rt
j|�t|d�}|j|jj|j�d�|j�dS)Nz&Removing %s from easy-install.pth filez4%s is already the active version in easy-install.pthz"Adding %s to easy-install.pth filer�zsetuptools.pth�wtrJ)r�r4r�rArr�rKr��pathsrJr�r��saver>r?rr�r�r�rr%�
make_relativer)r�r�r]r�rkr;r;r<rI�s4



zeasy_install.update_pthcCstjd||�|S)NzUnpacking %s to %s)r�debug)r�r�rUr;r;r<rl�szeasy_install.unpack_progresscshg�g����fdd�}t|||��j���jsdx.�D]&}tj|�tjdBd@}t||�q:WdS)Ncs\|jd�r"|jd�r"�j|�n|jd�s6|jd�r@�j|��j||��jrX|pZdS)Nz.pyz	EGG-INFO/z.dllz.so)rDrnr�rlr�)r�rU)r��to_chmodr�r;r<�pf�s
z+easy_install.unpack_and_compile.<locals>.pfimi�)rr�r�r>�stat�ST_MODErd)r�rurzr�rkrer;)r�r�r�r<rx�s

zeasy_install.unpack_and_compilecCsjtjr
dSddlm}z@tj|jd�||dd|jd�|jrT|||jd|jd�Wdtj|j�XdS)Nr)r�r)r��forcer�)	r��dont_write_bytecode�distutils.utilr�rrr�r�r�)r�r�r�r;r;r<r��szeasy_install.byte_compilea�
        bad install directory or PYTHONPATH

        You are attempting to install a package to a directory that is not
        on PYTHONPATH and which Python does not read ".pth" files from.  The
        installation directory you specified (via --install-dir, --prefix, or
        the distutils default setting) was:

            %s

        and your PYTHONPATH environment variable currently contains:

            %r

        Here are some of your options for correcting the problem:

        * You can choose a different installation directory, i.e., one that is
          on PYTHONPATH or supports .pth files

        * You can add the installation directory to the PYTHONPATH environment
          variable.  (It must then also be on PYTHONPATH whenever you run
          Python and want to use the package(s) you are installing.)

        * You can set up the installation directory to support ".pth" files by
          using one of the approaches described here:

          https://setuptools.readthedocs.io/en/latest/easy_install.html#custom-installation-locations


        Please make the appropriate changes for your system and try again.cCs|j}||jtjjdd�fS)N�
PYTHONPATHr�)�_easy_install__no_default_msgr�r>�environr)r��templater;r;r<rsz#easy_install.no_default_version_msgcCs�|jr
dStjj|jd�}tdd�}|jd�}d}tjj|�r�tj	d|j�t
j|��}|j�}WdQRX|j
d�s�td	|��||kr�tjd
|�|js�t|�t
j|ddd��}|j|�WdQRX|j|g�d
|_dS)z8Make sure there's a site.py in the target dir, if neededNzsite.pyr�z
site-patch.pyzutf-8r�zChecking existing site.py in %sz
def __boot():z;%s is not a setuptools-generated site.py; please remove it.zCreating %sr)�encodingT)r�r>r?rr�r"rar@rr��ior�readrnrr�r�r#r%r�)r�Zsitepy�sourceZcurrentZstrmr;r;r<r=s,


zeasy_install.install_site_pycCsj|js
dSttjjd��}xJtj|j�D]:\}}|j|�r(tjj	|�r(|j
d|�tj|d�q(WdS)zCreate directories under ~.N�~zos.makedirs('%s', 0o700)i�)r�rr>r?r�rZ	iteritemsr�rnr�Zdebug_printr)r��homer�r?r;r;r<r�>szeasy_install.create_home_pathz/$base/lib/python$py_version_short/site-packagesz	$base/bin)r�r�)r�z$base/Lib/site-packagesz
$base/ScriptscGs�|jd�j}|jrh|j�}|j|d<|jjtj|j�}x0|j	�D]$\}}t
||d�dkr@t|||�q@Wddlm
}xJ|D]B}t
||�}|dk	rz|||�}tjdkr�tjj|�}t|||�qzWdS)Nr�r0r)rr�)Zget_finalized_commandr�r�r�r
rr>r��DEFAULT_SCHEMEr�r�r�r�rr?r�)r�r�r�rHr�r�rr;r;r<r�Ts 




zeasy_install._expand)rQNrR)rSrTrU)rVrWrX)rYrZr[)r\r]r^)r_rDr`)rarbrc)rdrerf)rgrhri)rjrkrl)rmrnro)rprqrr)rsNrt)rurvrw)rxryrz)r{r|r})r~rr�)r�r�r�)r�r�r�)r�Nr�)r�Nr�)F)F)T)N)r�)Q�__name__�
__module__�__qualname__�__doc__�descriptionZcommand_consumes_argumentsZuser_optionsZboolean_optionsr�r�r�Zhelp_msgr�Znegative_optrr�r�r�r��staticmethodr�r�r�r�r�r�rrrr�rKrL�lstriprrrrrr.r2r3r5�
contextlib�contextmanagerr;r2r?r�r@rOrWr,r+r]r[rFrGrhrir�rjr�r�rLr�rorrpr�rIrlrxr�r�rr=r�r�r
r�r�r;r;r;r<r2xs�



0	z	0


	;
	
$
$	
'	

,6-5	

	
%
 
cCs tjjdd�jtj�}td|�S)Nr�r�)r>r�rr��pathsep�filter)r�r;r;r<rksrcCs�g}|jt��tjg}tjtjkr0|jtj�x�|D]�}|r6tjdkr`|jtjj	|dd��n\tj
dkr�|jtjj	|ddtjdd	�d�tjj	|dd
�g�n|j|tjj	|dd�g�tjdkr6d|kr6tjj
d
�}|r6tjj	|ddtjdd	�d�}|j|�q6Wtd�td�f}x"|D]}||k�r |j|��q Wtj�rR|jtj�y|jtj��Wntk
�rzYnXttt|��}|S)z&
    Return a list of 'site' dirs
    �os2emx�riscosZLibz
site-packagesr��libr�Nr�zsite-python�darwinzPython.framework�HOME�Library�Python�purelib�platlib)r�r�)�extendrr�r�r�r�r{r>r?r�sepr�r�rrr�r�r��getsitepackages�AttributeErrorr�rr!)�sitedirsr�r�r�Zhome_spZ	lib_pathsZsite_libr;r;r<r�psV





r�ccs�i}x�|D]�}t|�}||kr q
d||<tjj|�s6q
tj|�}||fVx�|D]�}|jd�s`qP|dkrjqPttjj||��}tt	|��}|j
�xP|D]H}|jd�s�t|j��}||kr�d||<tjj|�s�q�|tj|�fVq�WqPWq
WdS)zBYield sys.path directories that might contain "old-style" packagesrz.pth�easy-install.pth�setuptools.pth�importN)r�r�)
r!r>r?r�rPrDrrr�r rrn�rstrip)Zinputs�seenr#r1r�rk�lines�liner;r;r<�expand_paths�s4






r�cCs&t|d�}�z
tj|�}|dkr$dS|d|d|d}|dkrHdS|j|d�tjd|jd��\}}}|dkrzdS|j|d|�d
d
d�}tj|�}y<|j|�}	|	j	dd
�d}
|
j
tj��}
|j
tj|
��Wntjk
r�dSX|jd��s|jd��rdS|S|j�XdS)znExtract configuration data from a bdist_wininst .exe

    Returns a configparser.RawConfigParser, or None
    �rbN�	���z<iii�zV4�{V4r�)r�r|�rrrrZSetup)r�r�)r�zipfileZ_EndRecData�seekr9�unpackr�rZRawConfigParserr�rar��getfilesystemencodingZreadfpr�StringIO�ErrorZhas_sectionr)rSrkZendrecZ	prepended�tagZcfglenZbmlenZinitr��part�configr;r;r<r4�s4




c
CsJdddddg}tj|�}�z�x�|j�D]�}|j}|jd�}t|�d	kr�|d
dkr�|djd
�r�|jddj|dd
��df�Pt|�d
ks(|jd�r�q(|jd�r�q(|dj	�dkr(|j
|�}tjr�|j
�}xDt|�D]8}|j�jdd�}|jd�s�|jd|d|fdf�q�Wq(WWd|j�Xdd�|D�}|j�|j�|S) z4Get exe->egg path translations for a given .exe file�PURELIB/r��PLATLIB/pywin32_system32�PLATLIB/�SCRIPTS/�EGG-INFO/scripts/�DATA/lib/site-packagesr�r�r�zPKG-INFOrz	.egg-inforNz	EGG-INFO/z.pthz
-nspkg.pth�PURELIB�PLATLIB�\r�z%s/%s/cSsg|]\}}|j�|f�qSr;)r')r�rb�yr;r;r<r�$sz$get_exe_prefixes.<locals>.<listcomp>)r�r�)r�r�)r�r�)r�r�)r�r�)r�r�)r�ZZipFileZinfolistr�r�rrDr�r�upperr�rZPY3rar rMrNrnr�r�sort�reverse)Zexe_filenamer�rTr�r�r�rVZpthr;r;r<r6s>




&
c@sTeZdZdZdZffdd�Zdd�Zdd�Zed	d
��Z	dd�Z
d
d�Zdd�ZdS)r3z)A .pth file with Distribution paths in itFcCsp||_ttt|��|_ttjj|j��|_|j	�t
j|gdd�x(t|j
�D]}tt|jt|d���qNWdS)NT)r�r�rr!r�r>r?r#�basedir�_loadr&�__init__r r�rJr%)r�r�r�r?r;r;r<r�/szPthDistributions.__init__cCsg|_d}tj|j�}tjj|j�r�t|jd�}x�|D]�}|j	d�rJd}q6|j
�}|jj|�|j�s6|j�j	d�rxq6t
tjj|j|��}|jd<tjj|�s�||kr�|jj�d|_q6d||<q6W|j�|jr�|r�d|_x&|jo�|jdj��r
|jj�q�WdS)	NFZrtr�T�#rr�r�)r�r��fromkeysr�r>r?rkr�rrnr�r�rMr!rr�r@�pop�dirtyr)r�Z
saw_importr�rkr�r?r;r;r<r�8s2


zPthDistributions._loadc	Cs�|js
dStt|j|j��}|r�tjd|j�|j|�}dj	|�d}t
jj|j�r`t
j
|j�t|jd��}|j|�WdQRXn(t
jj|j�r�tjd|j�t
j
|j�d|_dS)z$Write changed .pth file back to diskNz	Saving %srJr�zDeleting empty %sF)rr�rr�r�rr�r��_wrap_linesrr>r?r�r�rr%r@)r�Z	rel_pathsr��datarkr;r;r<r�Ws
zPthDistributions.savecCs|S)Nr;)r�r;r;r<rmszPthDistributions._wrap_linescCsN|j|jko$|j|jkp$|jtj�k}|r>|jj|j�d|_tj||�dS)z"Add `dist` to the distribution mapTN)	rAr�r�r>�getcwdr�rr&rJ)r�r��new_pathr;r;r<rJqszPthDistributions.addcCs6x$|j|jkr$|jj|j�d|_qWtj||�dS)z'Remove `dist` from the distribution mapTN)rAr�rKrr&)r�r�r;r;r<rKs
zPthDistributions.removecCs�tjjt|��\}}t|j�}|g}tjdkr2dp6tj}xVt|�|kr�||jkrn|jtj	�|j
�|j|�Stjj|�\}}|j|�q:W|SdS)Nr�)r>r?r�r!rr��altsepr�r��curdirr�r)r�r?ZnpathZlastZbaselenr�r�r;r;r<r��s


zPthDistributions.make_relativeN)
r�r�r�r�rr�r�r�r�rrJrKr�r;r;r;r<r3*s	c@s(eZdZedd��Zed�Zed�ZdS)�RewritePthDistributionsccs(|jVx|D]
}|VqW|jVdS)N)�prelude�postlude)�clsr�r�r;r;r<r�s

z#RewritePthDistributions._wrap_linesz?
        import sys
        sys.__plen = len(sys.path)
        z�
        import sys
        new = sys.path[sys.__plen:]
        del sys.path[sys.__plen:]
        p = getattr(sys, '__egginsert', 0)
        sys.path[p:p] = new
        sys.__egginsert = p + len(new)
        N)r�r�r��classmethodrr"rr	r;r;r;r<r�s
rZSETUPTOOLS_SYS_PATH_TECHNIQUE�rawZrewritecCs ttjt�rtStjtjj��S)z_
    Return a regular expression based on first_line_re suitable for matching
    strings.
    )r�r�patternr9�re�compilerar;r;r;r<�_first_line_re�srcCsd|tjtjgkr.tjdkr.t|tj�||�Stj�\}}}t	j
||d|dd||ff�dS)Nr�rrz %s %s)r>r�rKr�rdr��S_IWRITEr�rrZreraise)�func�arg�excZetZevr}r;r;r<�
auto_chmod�s
rcCs.t|�}t|tj�|r"t|�nt|�dS)aa

    Fix any globally cached `dist_path` related data

    `dist_path` should be a path of a newly installed egg distribution (zipped
    or unzipped).

    sys.path_importer_cache contains finder objects that have been cached when
    importing data from the original distribution. Any such finders need to be
    cleared since the replacement distribution might be packaged differently,
    e.g. a zipped egg distribution might get replaced with an unzipped egg
    folder or vice versa. Having the old finders cached may then cause Python
    to attempt loading modules from the replacement distribution using an
    incorrect loader.

    zipimport.zipimporter objects are Python loaders charged with importing
    data packaged inside zip archives. If stale loaders referencing the
    original distribution, are left behind, they can fail to load modules from
    the replacement distribution. E.g. if an old zipimport.zipimporter instance
    is used to load data from a new zipped egg archive, it may cause the
    operation to attempt to locate the requested data in the wrong location -
    one indicated by the original distribution's zip archive directory
    information. Such an operation may then fail outright, e.g. report having
    read a 'bad local file header', or even worse, it may fail silently &
    return invalid data.

    zipimport._zip_directory_cache contains cached zip archive directory
    information for all existing zipimport.zipimporter instances and all such
    instances connected to the same archive share the same cached directory
    information.

    If asked, and the underlying Python implementation allows it, we can fix
    all existing zipimport.zipimporter instances instead of having to track
    them down and remove them one by one, by updating their shared cached zip
    archive directory information. This, of course, assumes that the
    replacement distribution is packaged as a zipped egg.

    If not asked to fix existing zipimport.zipimporter instances, we still do
    our best to clear any remaining zipimport.zipimporter related cached data
    that might somehow later get used when attempting to load data from the new
    distribution and thus cause such load operations to fail. Note that when
    tracking down such remaining stale data, we can not catch every conceivable
    usage from here, and we clear only those that we know of and have found to
    cause problems if left alive. Any remaining caches should be updated by
    whomever is in charge of maintaining them, i.e. they should be ready to
    handle us replacing their zip archives with new distributions at runtime.

    N)r!�_uncacher��path_importer_cache�!_replace_zip_directory_cache_data�*_remove_and_clear_zip_directory_cache_data)Z	dist_pathrv�normalized_pathr;r;r<ry�s
<
rycCsTg}t|�}xB|D]:}t|�}|j|�r|||d�tjdfkr|j|�qW|S)ap
    Return zipimporter cache entry keys related to a given normalized path.

    Alternative path spellings (e.g. those using different character case or
    those using alternative path separators) related to the same path are
    included. Any sub-path entries are included as well, i.e. those
    corresponding to zip archives embedded in other zip archives.

    rr�)rr!rnr>r�r�)r�cache�resultZ
prefix_len�pZnpr;r;r<�"_collect_zipimporter_cache_entriess


rcCsDx>t||�D]0}||}||=|o*|||�}|dk	r|||<qWdS)a�
    Update zipimporter cache data for a given normalized path.

    Any sub-path entries are processed as well, i.e. those corresponding to zip
    archives embedded in other zip archives.

    Given updater is a callable taking a cache entry key and the original entry
    (after already removing the entry from the cache), and expected to update
    the entry and possibly return a new one to be inserted in its place.
    Returning None indicates that the entry should not be replaced with a new
    one. If no updater is given, the cache entries are simply removed without
    any additional processing, the same as if the updater simply returned None.

    N)r)rr�updaterr�	old_entryZ	new_entryr;r;r<�_update_zipimporter_cache*s
r!cCst||�dS)N)r!)rrr;r;r<rJsrcCsdd�}t|tj|d�dS)NcSs|j�dS)N)�clear)r?r r;r;r<�2clear_and_remove_cached_zip_archive_directory_dataOszf_remove_and_clear_zip_directory_cache_data.<locals>.clear_and_remove_cached_zip_archive_directory_data)r)r!rs�_zip_directory_cache)rr#r;r;r<rNsrZ__pypy__cCsdd�}t|tj|d�dS)NcSs&|j�tj|�|jtj|�|S)N)r"rsrt�updater$)r?r r;r;r<�)replace_cached_zip_archive_directory_dataes
zT_replace_zip_directory_cache_data.<locals>.replace_cached_zip_archive_directory_data)r)r!rsr$)rr&r;r;r<rds
r�<string>cCs2yt||d�Wnttfk
r(dSXdSdS)z%Is this string a valid Python script?�execFTN)r�SyntaxError�	TypeError)rOr�r;r;r<�	is_pythonws
r+cCsJy(tj|dd��}|jd�}WdQRXWnttfk
r@|SX|dkS)zCDetermine if the specified executable is a .sh (contains a #! line)zlatin-1)r�r�Nz#!)r�rr�rr)r&�fp�magicr;r;r<�is_sh�sr.cCstj|g�S)z@Quote a command line argument according to Windows parsing rules)�
subprocess�list2cmdline)rr;r;r<�nt_quote_arg�sr1cCsH|jd�s|jd�rdSt||�r&dS|jd�rDd|j�dj�kSdS)zMIs this text, as a whole, a Python script? (as opposed to shell/bat/etc.
    z.pyz.pywTz#!r�rF)rDr+rn�
splitlinesr')r_r�r;r;r<r\�s

r\)rdcGsdS)Nr;)r�r;r;r<�_chmod�sr3cCsRtjd||�yt||�Wn0tjk
rL}ztjd|�WYdd}~XnXdS)Nzchanging mode of %s to %ozchmod failed: %s)rr�r3r>�error)r?rer|r;r;r<rd�s
rdc@s�eZdZdZgZe�Zedd��Zedd��Z	edd��Z
edd	��Zed
d��Zdd
�Z
edd��Zdd�Zedd��Zedd��ZdS)�CommandSpeczm
    A command spec for a #! header, specified as a list of arguments akin to
    those passed to Popen.
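
    A hedged sketch of turning such a spec into a #! header (the interpreter
    path is illustrative; from_string and as_header are assumed to behave as
    shown):

        from setuptools.command.easy_install import CommandSpec

        spec = CommandSpec.from_string('/usr/bin/python3.6 -E')
        print(spec.as_header())   # e.g. '#!/usr/bin/python3.6 -E\n'
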
    cCs|S)zV
        Choose the best CommandSpec class based on environmental conditions.
        r;)r
r;r;r<rY�szCommandSpec.bestcCstjjtj�}tjjd|�S)N�__PYVENV_LAUNCHER__)r>r?rBr�r&r�r)r
Z_defaultr;r;r<�_sys_executable�szCommandSpec._sys_executablecCs:t||�r|St|t�r ||�S|dkr0|j�S|j|�S)zg
        Construct a CommandSpec from a parameter to build_scripts, which may
        be None.
        N)r�r��from_environment�from_string)r
Zparamr;r;r<�
from_param�s

zCommandSpec.from_paramcCs||j�g�S)N)r7)r
r;r;r<r8�szCommandSpec.from_environmentcCstj|f|j�}||�S)z}
        Construct a command spec from a simple string representing a command
        line parseable by shlex.split.
        )�shlexr��
split_args)r
�stringr�r;r;r<r9�szCommandSpec.from_stringcCs8tj|j|��|_tj|�}t|�s4dg|jdd�<dS)Nz-xr)r;r��_extract_options�optionsr/r0rH)r�r_�cmdliner;r;r<�install_options�s
zCommandSpec.install_optionscCs:|dj�d}t�j|�}|r.|jd�p0dnd}|j�S)zH
        Extract any options from the first line of the script.
        rJrrr�)r2r�match�grouprM)Zorig_script�firstrBr?r;r;r<r>�szCommandSpec._extract_optionscCs|j|t|j��S)N)�_renderr�r?)r�r;r;r<�	as_header�szCommandSpec.as_headercCs6d}x,|D]$}|j|�r
|j|�r
|dd�Sq
W|S)Nz"'rr�)rnrD)�itemZ_QUOTES�qr;r;r<�
_strip_quotes�s

zCommandSpec._strip_quotescCs tjdd�|D��}d|dS)Ncss|]}tj|j��VqdS)N)r5rIrM)r�rGr;r;r<r��sz&CommandSpec._render.<locals>.<genexpr>z#!rJ)r/r0)r�r@r;r;r<rE�szCommandSpec._renderN)r�r�r�r�r?r�r<rrYr7r:r8r9rAr�r>rFrIrEr;r;r;r<r5�s	
r5c@seZdZedd�ZdS)�WindowsCommandSpecF)r�N)r�r�r�r�r<r;r;r;r<rJsrJc@s�eZdZdZejd�j�ZeZ	e
ddd��Ze
ddd��Ze
dd	d
��Z
edd��Ze
d
d��Ze
dd��Ze
dd��Ze
ddd��ZdS)rXz`
    Encapsulates behavior around writing entry point scripts for console and
    gui apps.
    a�
        # EASY-INSTALL-ENTRY-SCRIPT: %(spec)r,%(group)r,%(name)r
        __requires__ = %(spec)r
        import re
        import sys
        from pkg_resources import load_entry_point

        if __name__ == '__main__':
            sys.argv[0] = re.sub(r'(-script\.pyw?|\.exe)?$', '', sys.argv[0])
            sys.exit(
                load_entry_point(%(spec)r, %(group)r, %(name)r)()
            )
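
        The template above is filled in once per entry point; a hedged sketch
        of driving the writer for an installed distribution (any distribution
        that defines console_scripts will do):

            from pkg_resources import get_distribution
            from setuptools.command.easy_install import ScriptWriter

            dist = get_distribution('setuptools')
            for args in ScriptWriter.best().get_args(dist):
                print('would write wrapper script:', args[0])  # remaining items feed write_script()
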
    NFcCs6tjdt�|rtntj�}|jd||�}|j||�S)NzUse get_argsr�)�warningsr��DeprecationWarning�WindowsScriptWriterrXrY�get_script_headerrZ)r
r�r&�wininst�writer�headerr;r;r<�get_script_argsszScriptWriter.get_script_argscCs6tjdt�|rd}|jj�j|�}|j|�|j�S)NzUse get_headerz
python.exe)rKr�rL�command_spec_classrYr:rArF)r
r_r&rO�cmdr;r;r<rN's
zScriptWriter.get_script_headerccs�|dkr|j�}t|j��}xjdD]b}|d}xT|j|�j�D]B\}}|j|�|jt�}|j||||�}	x|	D]
}
|
VqrWq>Wq"WdS)z�
        Yield write_script() argument tuples for a distribution's
        console_scripts and gui_scripts entry points.
        N�console�guiZ_scripts)rUrV)	r^r9rNZ
get_entry_mapr��_ensure_safe_namer�r��_get_script_args)r
r�rQr�type_rCr�Zepr_r�r�r;r;r<rZ1s


zScriptWriter.get_argscCstjd|�}|rtd��dS)z?
        Prevent paths in *_scripts entry point names.
        z[\\/]z+Path separators not allowed in script namesN)r�searchr�)r�Zhas_path_sepr;r;r<rWCszScriptWriter._ensure_safe_namecCs tjdt�|rtj�S|j�S)NzUse best)rKr�rLrMrY)r
Z
force_windowsr;r;r<�
get_writerLszScriptWriter.get_writercCs.tjdkstjdkr&tjdkr&tj�S|SdS)zD
        Select the best ScriptWriter for this environment.
        �win32�javar�N)r�r{r>r��_namerMrY)r
r;r;r<rYRszScriptWriter.bestccs|||fVdS)Nr;)r
rYr�rQr_r;r;r<rX\szScriptWriter._get_script_argsr�cCs"|jj�j|�}|j|�|j�S)z;Create a #! line, getting options (if any) from script_text)rSrYr:rArF)r
r_r&rTr;r;r<r^as
zScriptWriter.get_header)NF)NF)N)r�N)r�r�r�r�rKrLr�r�r5rSrrRrNrZr�rWr[rYrXr^r;r;r;r<rX	s 
		
rXc@sLeZdZeZedd��Zedd��Zedd��Zedd��Z	e
d	d
��ZdS)rMcCstjdt�|j�S)NzUse best)rKr�rLrY)r
r;r;r<r[lszWindowsScriptWriter.get_writercCs"tt|d�}tjjdd�}||S)zC
        Select the best ScriptWriter suitable for Windows
        )r&ZnaturalZSETUPTOOLS_LAUNCHERr&)r��WindowsExecutableLauncherWriterr>r�r)r
Z
writer_lookupZlauncherr;r;r<rYrs
zWindowsScriptWriter.bestc	#s�tddd�|}|tjdj�jd�krBdjft��}tj|t	�dddd	d
ddg}|j
|�|j||�}�fdd
�|D�}�|||d|fVdS)z For Windows, add a .py extensionz.pyaz.pyw)rUrVZPATHEXT�;zK{ext} not listed in PATHEXT; scripts will not be recognized as executables.z.pyz
-script.pyz.pycz.pyoz.execsg|]}�|�qSr;r;)r�rb)r�r;r<r��sz8WindowsScriptWriter._get_script_args.<locals>.<listcomp>rbN)r�r>r�r'r�r�r�rKr��UserWarningrK�_adjust_header)	r
rYr�rQr_�extr�r�r�r;)r�r<rXs
z$WindowsScriptWriter._get_script_argscCsNd}d}|dkr||}}tjtj|�tj�}|j||d�}|j|�rJ|S|S)z�
        Make sure 'pythonw' is used for gui and 'python' is used for
        console (regardless of what sys.executable is).
        zpythonw.exez
python.exerV)r=�repl)rr�escape�
IGNORECASE�sub�_use_header)r
rYZorig_headerr
rdZ
pattern_ob�
new_headerr;r;r<rb�s
z"WindowsScriptWriter._adjust_headercCs$|dd�jd�}tjdkp"t|�S)z�
        Should _adjust_header use the replaced header?

        On non-windows systems, always use. On
        Windows systems, only use the replaced header if it resolves
        to an executable on the system.
        r�r�"r\r�)rMr�r{r)riZclean_headerr;r;r<rh�s	zWindowsScriptWriter._use_headerN)r�r�r�rJrSrr[rYrXrbr�rhr;r;r;r<rMis
rMc@seZdZedd��ZdS)r_c#s�|dkrd}d}dg}nd}d}dddg}|j||�}�fd	d
�|D�}	�|||d|	fV�dt|�d
fVt�s��d}
|
t��dfVdS)zG
        For Windows, add a .py extension and an .exe launcher
        rVz-script.pywz.pywZcliz
-script.pyz.pyz.pycz.pyocsg|]}�|�qSr;r;)r�rb)r�r;r<r��szDWindowsExecutableLauncherWriter._get_script_args.<locals>.<listcomp>rbz.exernz
.exe.manifestN)rb�get_win_launcherr=�load_launcher_manifest)r
rYr�rQr_Z
launcher_typercr�Zhdrr�Zm_namer;)r�r<rX�s
z0WindowsExecutableLauncherWriter._get_script_argsN)r�r�r�rrXr;r;r;r<r_�sr_cCs2d|}t�r|jdd�}n|jdd�}td|�S)z�
    Load the Windows launcher (executable) suitable for launching a script.

    `type` should be either 'cli' or 'gui'

    Returns the executable as a byte string.
    z%s.exe�.z-64.z-32.r�)r=rNr")�typeZlauncher_fnr;r;r<rk�s
rkcCs0tjtd�}tjr|t�S|jd�t�SdS)Nzlauncher manifest.xmlzutf-8)r$r"r�r�PY2�varsra)r�Zmanifestr;r;r<rl�s
rlFcCstj|||�S)N)rQr�)r?�
ignore_errors�onerrorr;r;r<r��sr�cCstjd�}tj|�|S)N�)r>�umask)Ztmpr;r;r<rc�s

rccCs:ddl}tjj|jd�}|tjd<tjj|�t�dS)Nr)	r�r>r?r#�__path__r��argvr�r5)r�Zargv0r;r;r<�	bootstrap�s

rwc
s�ddlm}ddlm�G�fdd�d��}|dkrBtjdd�}t��0|fddd	g|tjdpfd|d
�|��WdQRXdS)Nr)�setup)r(cseZdZdZ�fdd�ZdS)z-main.<locals>.DistributionWithoutHelpCommandsr�c
s(t���j|f|�|�WdQRXdS)N)�_patch_usage�
_show_help)r�r��kw)r(r;r<rz	sz8main.<locals>.DistributionWithoutHelpCommands._show_helpN)r�r�r�Zcommon_usagerzr;)r(r;r<�DistributionWithoutHelpCommands�sr|rz-qr2z-v)Zscript_argsr-Z	distclass)r�rxZsetuptools.distr(r�rvry)rvr{rxr|r;)r(r<r5�sc#sLddl}tjd�j���fdd�}|jj}||j_z
dVWd||j_XdS)Nrze
        usage: %(script)s [options] requirement_or_url ...
           or: %(script)s --help
        cs�ttjj|�d�S)N)Zscript)r�r>r?r))r-)�USAGEr;r<�	gen_usage	sz_patch_usage.<locals>.gen_usage)Zdistutils.corerKrLr�Zcorer~)rr~Zsavedr;)r}r<ry	s

ry)N)r')N)�r�rr�rrrZdistutils.errorsrrrr	Zdistutils.command.installr
rrrr
Zdistutils.command.build_scriptsrr(rr�r>rsrQr6r�rr�r
rKrKr�r9r�r/r;r�Zsetuptools.externrZsetuptools.extern.six.movesrrr�rZsetuptools.sandboxrZsetuptools.py31compatrrZsetuptools.py27compatrZsetuptools.commandrZsetuptools.archive_utilrZsetuptools.package_indexrrrrrZsetuptools.wheelrr$r r!r"r#r$r%r&r'r(r)r*r+r,r-r.Zpkg_resources.py31compat�filterwarningsZ
PEP440Warning�__all__r=r1rorErHr"r2rr�r�r4r6r3rr�rrrryrr!rr�builtin_module_namesrr+r.r1r\rdr3�ImportErrorr�r5r7Zsys_executablerJ�objectrXrMr_rRrNrkrlr�rcrwr5r�ryr;r;r;r<�<module>s�D
|A))'lR
 


T`A 

site-packages/setuptools/command/__pycache__/build_py.cpython-36.opt-1.pyc000064400000020460147511334640022541 0ustar003

��f|%�
@s�ddlmZddlmZddljjZddlZddlZddl	Z	ddl
Z
ddlZddl
Z
ddlmZddlmZmZmZyddlmZWn"ek
r�Gdd�d�ZYnXGd	d
�d
eje�Zddd�Zd
d�ZdS)�)�glob)�convert_pathN)�six)�map�filter�filterfalse)�	Mixin2to3c@seZdZddd�ZdS)rTcCsdS)z
do nothingN�)�self�filesZdoctestsr	r	�/usr/lib/python3.6/build_py.py�run_2to3szMixin2to3.run_2to3N)T)�__name__�
__module__�__qualname__r
r	r	r	rrsrc@s�eZdZdZdd�Zdd�Zdd�Zdd	�Zd
d�Zdd
�Z	dd�Z
dd�Zdd�Zdd�Z
dd�Zdd�Zdd�Zdd�Zedd��Zd S)!�build_pyaXEnhanced 'build_py' command that includes data files with packages

    The data files are specified via a 'package_data' argument to 'setup()'.
    See 'setuptools.dist.Distribution' for more details.

    Also, this version of the 'build_py' command allows you to specify both
    'py_modules' and 'packages' in the same setup operation.
    cCsFtjj|�|jj|_|jjp i|_d|jkr6|jd=g|_g|_dS)N�
data_files)	�origr�finalize_options�distribution�package_data�exclude_package_data�__dict__�_build_py__updated_files�_build_py__doctests_2to3)r
r	r	rr!s

zbuild_py.finalize_optionscCs||jr|jrdS|jr"|j�|jr8|j�|j�|j|jd�|j|jd�|j|jd�|jt	j
j|dd��dS)z?Build modules, packages, and copy data files to build directoryNFTr)Zinclude_bytecode)Z
py_modules�packagesZ
build_modulesZbuild_packages�build_package_datar
rrZbyte_compilerrZget_outputs)r
r	r	r�run+szbuild_py.runcCs&|dkr|j�|_|jStjj||�S)zlazily compute data filesr)�_get_data_filesrrr�__getattr__)r
�attrr	r	rr?s
zbuild_py.__getattr__cCsJtjrt|tj�r|jd�}tjj||||�\}}|rB|jj	|�||fS)N�.)
rZPY2�
isinstanceZstring_types�splitrr�build_moduler�append)r
�moduleZmodule_file�packageZoutfile�copiedr	r	rr$Fs

zbuild_py.build_modulecCs|j�tt|j|jpf��S)z?Generate list of '(package,src_dir,build_dir,filenames)' tuples)�analyze_manifest�listr�_get_pkg_data_filesr)r
r	r	rrPszbuild_py._get_data_filescsJ|j|��tjj|jg|jd��}�fdd�|j|��D�}|�||fS)Nr!csg|]}tjj|���qSr	)�os�path�relpath)�.0�file)�src_dirr	r�
<listcomp>^sz0build_py._get_pkg_data_files.<locals>.<listcomp>)�get_package_dirr,r-�joinZ	build_libr#�find_data_files)r
r'�	build_dir�	filenamesr	)r1rr+Us


zbuild_py._get_pkg_data_filescCsX|j|j||�}tt|�}tjj|�}ttj	j
|�}tj|jj|g�|�}|j
|||�S)z6Return filenames for package's data files in 'src_dir')�_get_platform_patternsrrr�	itertools�chain�
from_iterablerr,r-�isfile�manifest_files�get�exclude_data_files)r
r'r1�patternsZglobs_expandedZ
globs_matchesZ
glob_filesrr	r	rr5cs
zbuild_py.find_data_filesc
Cs�x�|jD]�\}}}}xr|D]j}tjj||�}|jtjj|��tjj||�}|j||�\}}	tjj|�}|	r||jj	kr|j
j|�qWqWdS)z$Copy data files into build directoryN)rr,r-r4Zmkpath�dirnameZ	copy_file�abspathrZconvert_2to3_doctestsrr%)
r
r'r1r6r7�filename�targetZsrcfileZoutfr(r	r	rrts
zbuild_py.build_package_datacCs�i|_}|jjsdSi}x$|jp$fD]}||t|j|��<q&W|jd�|jd�}x�|jj	D]�}t
jjt|��\}}d}|}	x:|r�||kr�||kr�|}t
jj|�\}}
t
jj
|
|�}q�W||kr^|jd�r�||	kr�q^|j||g�j|�q^WdS)NZegg_infoz.py)r=rZinclude_package_datar�assert_relativer3Zrun_commandZget_finalized_commandZfilelistrr,r-r#r4�endswith�
setdefaultr%)r
ZmfZsrc_dirsr'Zei_cmdr-�d�f�prevZoldfZdfr	r	rr)�s(


zbuild_py.analyze_manifestcCsdS)Nr	)r
r	r	r�get_data_files�szbuild_py.get_data_filescCs�y
|j|Stk
rYnXtjj|||�}||j|<|sJ|jjrN|Sx,|jjD]}||ksr|j|d�rXPqXW|Stj	|d��}|j
�}WdQRXd|kr�tjj
d|f��|S)z8Check namespace packages' __init__ for declare_namespacer!�rbNsdeclare_namespacez�Namespace package problem: %s is a namespace package, but its
__init__.py does not call declare_namespace()! Please fix it.
(See the setuptools manual under "Namespace Packages" for details.)
")�packages_checked�KeyErrorrr�
check_packagerZnamespace_packages�
startswith�io�open�read�	distutils�errorsZDistutilsError)r
r'Zpackage_dirZinit_pyZpkgrI�contentsr	r	rrO�s&


zbuild_py.check_packagecCsi|_tjj|�dS)N)rMrr�initialize_options)r
r	r	rrW�szbuild_py.initialize_optionscCs0tjj||�}|jjdk	r,tjj|jj|�S|S)N)rrr3rZsrc_rootr,r-r4)r
r'�resr	r	rr3�szbuild_py.get_package_dircs\t���|j|j||�}�fdd�|D�}tjj|�}t|���fdd��D�}tt|��S)z6Filter filenames for package's data files in 'src_dir'c3s|]}tj�|�VqdS)N)�fnmatchr)r/�pattern)rr	r�	<genexpr>�sz.build_py.exclude_data_files.<locals>.<genexpr>c3s|]}|�kr|VqdS)Nr	)r/�fn)�badr	rr[�s)r*r8rr9r:r;�set�_unique_everseen)r
r'r1rr@Zmatch_groupsZmatchesZkeepersr	)r]rrr?�s

zbuild_py.exclude_data_filescs.tj|jdg�|j|g��}�fdd�|D�S)z�
        yield platform-specific path patterns (suitable for glob
        or fn_match) from a glob-based spec (such as
        self.package_data or self.exclude_package_data)
        matching package in src_dir.
        �c3s |]}tjj�t|��VqdS)N)r,r-r4r)r/rZ)r1r	rr[�sz2build_py._get_platform_patterns.<locals>.<genexpr>)r9r:r>)�specr'r1Zraw_patternsr	)r1rr8�s


zbuild_py._get_platform_patternsN)rrr�__doc__rrrr$rr+r5rr)rKrOrWr3r?�staticmethodr8r	r	r	rrs 


rccsjt�}|j}|dkr:xPt|j|�D]}||�|Vq"Wn,x*|D]"}||�}||kr@||�|Vq@WdS)zHList unique elements, preserving order. Remember all elements ever seen.N)r^�addr�__contains__)�iterable�key�seenZseen_add�element�kr	r	rr_�s
r_cCs:tjj|�s|Sddlm}tjd�j�|}||��dS)Nr)�DistutilsSetupErrorz�
        Error: setup script specifies an absolute path:

            %s

        setup() arguments must *always* be /-separated paths relative to the
        setup.py directory, *never* absolute paths.
        )r,r-�isabs�distutils.errorsrk�textwrap�dedent�lstrip)r-rk�msgr	r	rrEsrE)N)rZdistutils.utilrZdistutils.command.build_pyZcommandrrr,rYrnrQrmrTr9Zsetuptools.externrZsetuptools.extern.six.movesrrrZsetuptools.lib2to3_exr�ImportErrorr_rEr	r	r	r�<module>s$Y
site-packages/setuptools/command/__pycache__/install.cpython-36.pyc
[compiled CPython 3.6 bytecode -- binary contents omitted. Recoverable embedded text:
    install: "Use easy_install to install the package, w/dependencies"
    options: 'old-and-unmanageable' ("Try not to use this!"),
             'single-version-externally-managed' ("used by system package builders to create 'flat' eggs")
    install._called_from_setup: "Attempt to detect whether run() was called from setup() or by
        another command.  If called by setup(), the parent caller will be the 'run_command'
        method in 'distutils.dist', and *its* caller will be the 'run_commands' method.  If
        called any other way, the immediate caller *might* be 'run_command', but it won't have
        been called by 'run_commands'. Return True in that case or if a call stack is
        unavailable. Return False otherwise."
    warnings: "Call stack not available. bdist_* commands may fail.",
              "For best results, pass -X:Frames to enable call stack." (IronPython) ]
�<module>slsite-packages/setuptools/command/__pycache__/develop.cpython-36.opt-1.pyc000064400000014316147511334640022373 0ustar003

��fn�@s�ddlmZddlmZddlmZmZddlZddlZddl	Z	ddl
mZddlm
Z
mZmZddlmZddlmZddlZGd	d
�d
eje�ZGdd�de�ZdS)
�)�convert_path)�log)�DistutilsError�DistutilsOptionErrorN)�six)�Distribution�PathMetadata�normalize_path)�easy_install)�
namespacesc@sveZdZdZdZejddgZejdgZd	Zd
d�Z	dd
�Z
dd�Zedd��Z
dd�Zdd�Zdd�Zdd�ZdS)�developzSet up package for developmentz%install package in 'development mode'�	uninstall�u�Uninstall this source package�	egg-path=N�-Set the path to be used in the .egg-link fileFcCs2|jrd|_|j�|j�n|j�|j�dS)NT)r
Z
multi_version�uninstall_linkZuninstall_namespaces�install_for_developmentZwarn_deprecated_options)�self�r�/usr/lib/python3.6/develop.py�runs
zdevelop.runcCs&d|_d|_tj|�d|_d|_dS)N�.)r
�egg_pathr
�initialize_options�
setup_pathZalways_copy_from)rrrrr's

zdevelop.initialize_optionscCs|jd�}|jr,d}|j|jf}t||��|jg|_tj|�|j�|j	�|j
jtjd��|jd}t
jj|j|�|_|j|_|jdkr�t
jj|j�|_t|j�}tt
jj|j|j��}||kr�td|��t|t|t
jj|j��|jd�|_|j|j|j|j�|_dS)N�egg_infoz-Please rename %r to %r before using 'develop'z*.eggz	.egg-linkzA--egg-path must be a relative path from the install directory to )�project_name)�get_finalized_commandZbroken_egg_inforrZegg_name�argsr
�finalize_optionsZexpand_basedirsZexpand_dirsZ
package_index�scan�glob�os�path�join�install_dir�egg_link�egg_baser�abspathr	rrr�dist�_resolve_setup_pathr)rZei�templaterZegg_link_fn�targetrrrrr .s<






zdevelop.finalize_optionscCsh|jtjd�jd�}|tjkr0d|jd�d}ttjj|||��}|ttj�krdt	d|ttj���|S)z�
        Generate a path from egg_base back to '.' where the
        setup script resides and ensure that path points to the
        setup path from $install_dir/$egg_path.
        �/z../�zGCan't get a consistent path to setup script from installation directory)
�replacer#�sep�rstrip�curdir�countr	r$r%r)r(r&rZ
path_to_setupZresolvedrrrr+Xs
zdevelop._resolve_setup_pathcCsDtjr�t|jdd�r�|jddd�|jd�|jd�}t|j�}|jd|d�|jd�|jddd�|jd�|jd�}||_	||j
_t||j
�|j
_n"|jd�|jdd	d�|jd�|j�tjr�|jtj�dt_|j�tjd
|j|j�|j�s,t|jd��}|j|j	d|j�WdQRX|jd|j
|j�dS)
NZuse_2to3FZbuild_pyr)Zinplacer)r(Z	build_extr/zCreating %s (link to %s)�w�
)rZPY3�getattr�distributionZreinitialize_commandZrun_commandrr	Z	build_librr*�locationrrZ	_providerZinstall_site_py�
setuptoolsZbootstrap_install_fromr
Zinstall_namespacesr�infor'r(�dry_run�open�writerZprocess_distributionZno_deps)rZbpy_cmdZ
build_pathZei_cmd�frrrrks4







 zdevelop.install_for_developmentcCs�tjj|j�rztjd|j|j�t|j�}dd�|D�}|j�||j	g|j	|j
gfkrhtjd|�dS|jsztj
|j�|js�|j|j�|jjr�tjd�dS)NzRemoving %s (link to %s)cSsg|]}|j��qSr)r2)�.0�linerrr�
<listcomp>�sz*develop.uninstall_link.<locals>.<listcomp>z$Link points to %s: uninstall abortedz5Note: you must uninstall or replace scripts manually!)r#r$�existsr'rr;r(r=�closerr�warnr<�unlinkZ
update_pthr*r8�scripts)rZ
egg_link_file�contentsrrrr�s
zdevelop.uninstall_linkc
Cs�||jk	rtj||�S|j|�x^|jjp,gD]N}tjjt	|��}tjj
|�}tj|��}|j
�}WdQRX|j||||�q.WdS)N)r*r
�install_egg_scripts�install_wrapper_scriptsr8rGr#r$r)r�basename�ior=�readZinstall_script)rr*Zscript_nameZscript_pathZstrmZscript_textrrrrI�s

zdevelop.install_egg_scriptscCst|�}tj||�S)N)�VersionlessRequirementr
rJ)rr*rrrrJ�szdevelop.install_wrapper_scripts)r
rr)rNr)�__name__�
__module__�__qualname__�__doc__�descriptionr
Zuser_optionsZboolean_optionsZcommand_consumes_argumentsrrr �staticmethodr+rrrIrJrrrrrs	*/rc@s(eZdZdZdd�Zdd�Zdd�ZdS)	rNaz
    Adapt a pkg_resources.Distribution to simply return the project
    name as the 'requirement' so that scripts will work across
    multiple versions.

    >>> dist = Distribution(project_name='foo', version='1.0')
    >>> str(dist.as_requirement())
    'foo==1.0'
    >>> adapted_dist = VersionlessRequirement(dist)
    >>> str(adapted_dist.as_requirement())
    'foo'
    cCs
||_dS)N)�_VersionlessRequirement__dist)rr*rrr�__init__�szVersionlessRequirement.__init__cCst|j|�S)N)r7rU)r�namerrr�__getattr__�sz"VersionlessRequirement.__getattr__cCs|jS)N)r)rrrr�as_requirement�sz%VersionlessRequirement.as_requirementN)rOrPrQrRrVrXrYrrrrrN�srN)Zdistutils.utilrZ	distutilsrZdistutils.errorsrrr#r"rLZsetuptools.externrZ
pkg_resourcesrrr	Zsetuptools.command.easy_installr
r:rZDevelopInstallerr�objectrNrrrr�<module>s4site-packages/setuptools/command/__pycache__/__init__.cpython-36.opt-1.pyc000064400000001230147511334640022463 0ustar003

��fR�@szdddddddddd	d
ddd
dddddddddgZddlmZddlZddlmZdejkrrdejd<ejjd�[[dS)�alias�	bdist_eggZ	bdist_rpmZ	build_extZbuild_pyZdevelopZeasy_installZegg_infoZinstallZinstall_lib�rotateZsaveoptsZsdistZsetoptZtestZinstall_egg_info�install_scripts�registerZ
bdist_wininstZupload_docsZuploadZ
build_clibZ	dist_info�)�bdistN)rZegg�Python .egg file)rr)	�__all__Zdistutils.command.bdistr�sysZsetuptools.commandrZformat_commandsZformat_command�append�rr�/usr/lib/python3.6/__init__.py�<module>s



site-packages/setuptools/command/__pycache__/upload.cpython-36.pyc
[compiled CPython 3.6 bytecode -- binary contents omitted. Recoverable embedded text:
    upload: "Override default upload behavior to obtain password in a variety of different ways."
    _load_password_from_keyring: "Attempt to load password from keyring. Suppress Exceptions."
    _prompt_for_password: "Prompt for a password on the tty. Suppress Exceptions." ]
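The docstrings above describe a fallback chain: an explicitly configured password, then the
system keyring, then a tty prompt. A minimal sketch of that chain (the function name and the
arguments are placeholders; this is an illustration of the idea, not the exact setuptools code):

    import getpass

    def resolve_password(repository, username):
        # 1. Try the system keyring, if the optional 'keyring' package is installed.
        try:
            import keyring
            stored = keyring.get_password(repository, username)
            if stored:
                return stored
        except Exception:
            pass
        # 2. Fall back to prompting on the terminal, suppressing errors like the original.
        try:
            return getpass.getpass('Password: ')
        except (Exception, KeyboardInterrupt):
            return None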
site-packages/setuptools/command/__pycache__/alias.cpython-36.opt-1.pyc
[compiled CPython 3.6 bytecode -- binary contents omitted. Recoverable embedded text:
    alias: "Define a shortcut that invokes one or more commands" /
           "define a shortcut to invoke one or more commands"
    option: 'remove' (-r, "remove (unset) the alias")
    shquote: "Quote an argument for later parsing by shlex.split()"
    messages: "Command Aliases", "setup.py alias", "No alias definition found for %r",
              "Must specify exactly one argument (the alias name) when using --remove" ]
site-packages/setuptools/command/__pycache__/build_py.cpython-36.pyc
[compiled CPython 3.6 bytecode -- binary contents omitted. Recoverable embedded text:
    build_py: "Enhanced 'build_py' command that includes data files with packages

        The data files are specified via a 'package_data' argument to 'setup()'.
        See 'setuptools.dist.Distribution' for more details.

        Also, this version of the 'build_py' command allows you to specify both
        'py_modules' and 'packages' in the same setup operation."
    check_package: "Check namespace packages' __init__ for declare_namespace"
    _unique_everseen: "List unique elements, preserving order. Remember all elements ever seen."
    assert_relative: "Error: setup script specifies an absolute path: %s -- setup() arguments
        must *always* be /-separated paths relative to the setup.py directory, *never*
        absolute paths." ]
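The build_py docstring above says data files are declared through the 'package_data' argument to
setup(). A minimal, hypothetical setup.py showing that argument (package and file names are
illustrative only, not taken from this archive):

    from setuptools import setup, find_packages

    setup(
        name='examplepkg',
        version='0.1',
        packages=find_packages(),
        # Ship non-Python files that live inside the package directory.
        package_data={'examplepkg': ['data/*.json', 'templates/*.txt']},
        # This build_py allows py_modules and packages in the same setup() call.
        py_modules=['standalone_helper'],
    )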
site-packages/setuptools/command/__pycache__/bdist_rpm.cpython-36.pyc
[compiled CPython 3.6 bytecode -- binary contents omitted. Recoverable embedded text:
    bdist_rpm: "Override the default bdist_rpm behavior to do the following:

        1. Run egg_info to ensure the name and version are properly calculated.
        2. Always run 'install' using --single-version-externally-managed to
           disable eggs in RPM distributions.
        3. Replace dash with underscore in the version numbers for better RPM
           compatibility."
    The spec-file rewrite maps "setup.py install" to
    "setup.py install --single-version-externally-managed" and uses
    %{unmangled_version} in the Source0 and %setup lines. ]
site-packages/setuptools/command/__pycache__/install_scripts.cpython-36.opt-1.pyc
[compiled CPython 3.6 bytecode -- binary contents omitted. Recoverable embedded text:
    install_scripts: "Do normal script install, plus any egg_info wrapper scripts"
    write_script: "Write an executable file to the scripts directory"
    log message: "Installing %s script to %s" ]
site-packages/setuptools/command/__pycache__/rotate.cpython-36.pyc
[compiled CPython 3.6 bytecode -- binary contents omitted. Recoverable embedded text:
    rotate: "Delete older distributions" / "delete older distributions, keeping N newest files"
    options: 'match=' (-m, "patterns to match (required)"),
             'dist-dir=' (-d, "directory where the distributions are"),
             'keep=' (-k, "number of matching distributions to keep")
    errors: "Must specify one or more (comma-separated) match patterns (e.g. '.zip' or '.egg')",
            "Must specify number of files to keep", "--keep must be an integer"
    log messages: "%d file(s) matching %s", "Deleting %s" ]
site-packages/setuptools/command/__pycache__/install_egg_info.cpython-36.pyc
[compiled CPython 3.6 bytecode -- binary contents omitted. Recoverable embedded text:
    install_egg_info: "Install an .egg-info directory for the package"
    option: 'install-dir=' (-d, "directory to install to")
    The copytree helper skips '.svn/' and 'CVS/' entries while copying
    ("Copying %s to %s"). ]
site-packages/setuptools/command/__pycache__/build_clib.cpython-36.pyc
[compiled CPython 3.6 bytecode -- binary contents omitted. Recoverable embedded text:
    build_clib: "Override the default build_clib behaviour to do the following:

        1. Implement a rudimentary timestamp-based dependency system
           so 'compile()' doesn't run every time.
        2. Add more keys to the 'build_info' dictionary:
            * obj_deps - specify dependencies for each object compiled.
                         this should be a dictionary mapping a key
                         with the source filename to a list of
                         dependencies. Use an empty string for global
                         dependencies.
            * cflags   - specify a list of additional flags to pass to
                         the compiler."
    error: "in 'libraries' option (library '%s'), 'sources' must be present and must be
        a list of source filenames" ]
site-packages/setuptools/command/__pycache__/install_lib.cpython-36.pyc
[compiled CPython 3.6 bytecode -- binary contents omitted. Recoverable embedded text:
    install_lib: "Don't add compiled flags to filenames of non-Python files"
    get_exclusions: "Return a collections.Sized collections.Container of paths to be
        excluded for single_version_externally_managed installations."
    _all_packages:
        >>> list(install_lib._all_packages('foo.bar.baz'))
        ['foo.bar.baz', 'foo.bar', 'foo']
    copy_tree skip message: "Skipping installation of %s (namespace package)" ]
r)	rr�	itertoolsrrZdistutils.command.install_libZcommandrr-rrrr	�<module>ssite-packages/setuptools/command/__pycache__/saveopts.cpython-36.opt-1.pyc000064400000001520147511334640022572 0ustar003

��f��@s$ddlmZmZGdd�de�ZdS)�)�edit_config�option_basec@seZdZdZdZdd�ZdS)�saveoptsz#Save command-line options to a filez7save supplied options to setup.cfg or other config filecCsp|j}i}xP|jD]F}|dkr qx6|j|�j�D]$\}\}}|dkr0||j|i�|<q0WqWt|j||j�dS)Nrzcommand line)ZdistributionZcommand_optionsZget_option_dict�items�
setdefaultr�filenameZdry_run)�selfZdistZsettings�cmd�opt�src�val�r
�/usr/lib/python3.6/saveopts.py�run	szsaveopts.runN)�__name__�
__module__�__qualname__�__doc__�descriptionrr
r
r
rrsrN)Zsetuptools.command.setoptrrrr
r
r
site-packages/setuptools/command/__pycache__/build_ext.cpython-36.opt-1.pyc
[compiled CPython 3.6 bytecode -- binary contents omitted. Recoverable embedded text:
    get_abi3_suffix: "Return the file extension for an abi3-compliant Extension()"
    build_ext.run: "Build extensions in build directory, then copy if --inplace"
    links_to_dynamic: "Return true if 'ext' links to a dynamic lib in the same package"
    write_stub log message: "writing stub loader for %s to %s"
    _get_config_var_837: "In https://github.com/pypa/setuptools/pull/837, we discovered
        Python 3.3.0 exposes the extension suffix under the name 'SO'." ]
site-packages/setuptools/command/__pycache__/rotate.cpython-36.opt-1.pyc
[compiled CPython 3.6 bytecode -- binary contents omitted; same 'rotate' command as the
 non-optimized rotate.cpython-36.pyc entry above: "delete older distributions, keeping N
 newest files" with the 'match=', 'dist-dir=' and 'keep=' options.]
site-packages/setuptools/command/__pycache__/egg_info.cpython-36.opt-1.pyc
[compiled CPython 3.6 bytecode -- binary contents omitted. Recoverable embedded text:
    module docstring: "setuptools.command.egg_info -- Create a distribution's .egg-info
        directory and contents"
    translate_pattern: "Translate a file path glob like '*.txt' in to a regular expression.
        This differs from fnmatch.translate which allows wildcards to match directory
        separators. It also knows about '**/' which matches any number of directories."
    egg_info: "create a distribution's .egg-info directory"
    options: 'egg-base=' (-e, "directory containing .egg-info directories (default: top of
             the source tree)"), 'tag-date' (-d, "Add date stamp (e.g. 20050528) to version
             number"), 'tag-build=' (-b, "Specify explicit tag to add to version number"),
             'no-date' (-D, "Don't include date stamp [default]")
    manifest_maker / FileList messages: "warning: no files found matching '%s'",
        "warning: no previously-included files found matching '%s'",
        "writing manifest file '%s'", "standard file .*not found"
    get_pkg_info_revision: "Get a -r### off of PKG-INFO Version in case this is an sdist of
        a subversion revision." (marked deprecated) ]
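The translate_pattern docstring above distinguishes ordinary wildcards, which stop at directory
separators, from the '**/' form, which spans any number of directories. The standard library's
glob module illustrates the same distinction; this snippet is only an illustration of the pattern
semantics, not the setuptools helper itself, and the file names are hypothetical:

    import glob

    # '*.txt' matches text files in the current directory only.
    flat = glob.glob('*.txt')

    # '**/*.txt' with recursive=True also descends into subdirectories,
    # which is the behaviour the '**/' form provides in manifest patterns.
    nested = glob.glob('**/*.txt', recursive=True)

    print(flat, nested)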
site-packages/setuptools/command/__pycache__/build_ext.cpython-36.pyc
[compiled CPython 3.6 bytecode -- binary contents omitted; same 'build_ext' command as the
 .opt-1 entry above (abi3 suffix handling, shared-library support, stub loader generation
 and the PR 837 'SO'/'EXT_SUFFIX' workaround).]
site-packages/setuptools/command/__pycache__/install_egg_info.cpython-36.opt-1.pyc
[compiled CPython 3.6 bytecode -- binary contents omitted; same 'install_egg_info' command as
 the non-optimized entry above: "Install an .egg-info directory for the package".]
site-packages/setuptools/command/__pycache__/easy_install.cpython-36.opt-1.pyc
[compiled CPython 3.6 bytecode -- binary contents truncated here; the remainder of this
 archive member continues below. Recoverable embedded text:
    module docstring: "Easy Install -- A tool for doing automatic download/extract/build of
        distutils-based Python packages. For detailed documentation, see the accompanying
        EasyInstall.txt file, or visit the EasyInstall home page:
        https://setuptools.readthedocs.io/en/latest/easy_install.html"
    easy_install: "Manage a download/build/install process" /
                  "Find/get/install Python packages"
    options include: prefix= ("installation prefix"), zip-ok ("install package as a zipfile"),
        multi-version ("make apps have to require() a version"), upgrade ("force upgrade
        (searches PyPI for latest versions)"), install-dir=, script-dir=, exclude-scripts,
        always-copy, index-url=, find-links=, build-directory=, optimize=, record=,
        always-unzip, site-dirs=, editable, no-deps, allow-hosts=, local-snapshots-ok,
        version, no-find-links. ]
Easy Install
------------

A tool for doing automatic download/extract/build of distutils-based Python
packages.  For detailed documentation, see the accompanying EasyInstall.txt
file, or visit the `EasyInstall home page`__.

__ https://setuptools.readthedocs.io/en/latest/easy_install.html

�)�glob)�get_platform)�convert_path�
subst_vars)�DistutilsArgError�DistutilsOptionError�DistutilsError�DistutilsPlatformError)�INSTALL_SCHEMES�SCHEME_KEYS)�log�dir_util)�
first_line_re)�find_executableN)�six)�configparser�map)�Command)�	run_setup)�get_path�get_config_vars)�rmtree_safe)�setopt)�unpack_archive)�PackageIndex�parse_requirement_arg�
URL_SCHEME)�	bdist_egg�egg_info)�Wheel)�yield_lines�normalize_path�resource_string�ensure_directory�get_distribution�find_distributions�Environment�Requirement�Distribution�PathMetadata�EggMetadata�
WorkingSet�DistributionNotFound�VersionConflict�DEVELOP_DIST�default)�category�samefile�easy_install�PthDistributions�extract_wininst_cfg�main�get_exe_prefixescCstjd�dkS)N�P�)�struct�calcsize�r;r;�"/usr/lib/python3.6/easy_install.py�is_64bitIsr=cCsjtjj|�otjj|�}ttjd�o&|}|r:tjj||�Stjjtjj|��}tjjtjj|��}||kS)z�
    Determine if two paths reference the same file.

    Augments os.path.samefile to work on Windows and
    suppresses errors if the path doesn't exist.
    r1)�os�path�exists�hasattrr1�normpath�normcase)Zp1Zp2Z
both_existZuse_samefileZnorm_p1Znorm_p2r;r;r<r1MscCs|S)Nr;)�sr;r;r<�	_to_ascii_srEcCs*ytj|d�dStk
r$dSXdS)N�asciiTF)rZ	text_type�UnicodeError)rDr;r;r<�isasciibs
rHcCs
|jd�S)NrF)�encode)rDr;r;r<rEjscCs(y|jd�dStk
r"dSXdS)NrFTF)rIrG)rDr;r;r<rHms

cCstj|�j�jdd�S)N�
z; )�textwrap�dedent�strip�replace)�textr;r;r<�<lambda>usrPc@s�eZdZdZdZdZd�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�d�gZdd
dd
dd0d3d9d<g	Zej	�r�d@ej
ZejdAdef�ejdA�d*diZ
eZdBdC�ZdDdE�ZdFdG�ZedHdI��ZdJdK�ZdLdM�ZdNdO�ZdPdQ�ZdRdS�ZdTdU�ZdVdW�ZdXdY�ZdZd[�Zejd\�j �Z!ejd]�j �Z"ejd^�j �Z#d_d`�Z$dadb�Z%dcdd�Z&dedf�Z'dgdh�Z(didj�Z)e*j+dkdl��Z,d�dndo�Z-d�dpdq�Z.drds�Z/d�dtdu�Z0dvdw�Z1dxdy�Z2dzd{�Z3d�d|d}�Z4ed~d��Z5d�ffd�d��Z6d�d��Z7d�d��Z8d�d��Z9d�d��Z:d�d��Z;d�d��Z<ejd��j �Z=ejd��Z>d�d�d��Z?ejd��j �Z@d�d��ZAd�d��ZBd�d��ZCd�d��ZDd�d��ZEd�d��ZFd�d��ZGd�d��ZHejd��j �ZId�d��ZJd�d��ZKd�d��ZLeMeMd�d�d��d��ZNeMd�d�d��ZOd�d��ZPdS)�r2z'Manage a download/build/install processz Find/get/install Python packagesT�prefix=N�installation prefix�zip-ok�z�install package as a zipfile�
multi-version�m�%make apps have to require() a version�upgrade�U�1force upgrade (searches PyPI for latest versions)�install-dir=�d�install package to DIR�script-dir=rD�install scripts to DIR�exclude-scripts�x�Don't install scripts�always-copy�a�'Copy all needed packages to install dir�
index-url=�i� base URL of Python Package Index�find-links=�f�(additional URL(s) to search for packages�build-directory=�b�/download/extract/build in DIR; keep the results�	optimize=�O�lalso compile with optimization: -O1 for "python -O", -O2 for "python -OO", and -O0 to disable [default: -O0]�record=�3filename in which to record list of installed files�always-unzip�Z�*don't install as a zipfile, no matter what�
site-dirs=�S�)list of directories where .pth files work�editable�e�+Install specified packages in editable form�no-deps�N�don't install dependencies�allow-hosts=�H�$pattern(s) that hostnames must match�local-snapshots-ok�l�(allow building eggs from local checkouts�version�"print version information and exit�
no-find-links�9Don't load find-links defined in packages being installedz!install in user site-package '%s'�usercCs,d|_d|_|_d|_|_|_d|_d|_d|_d|_	d|_
|_d|_|_
|_d|_|_|_d|_|_|_d|_d|_d|_d|_d|_d|_d|_d|_d|_tjr�tj |_!tj"|_#nd|_!d|_#d|_$d|_%d|_&|_'d|_(i|_)d|_*d|_+|j,j-|_-|j,j.||j,j/d��dS)NrFr2)0r��zip_ok�local_snapshots_ok�install_dir�
script_dir�exclude_scripts�	index_url�
find_links�build_directory�args�optimize�recordrY�always_copy�
multi_versionr{�no_deps�allow_hosts�root�prefix�	no_reportr��install_purelib�install_platlib�install_headers�install_lib�install_scripts�install_data�install_base�install_platbase�site�ENABLE_USER_SITE�	USER_BASE�install_userbase�	USER_SITE�install_usersite�
no_find_links�
package_index�pth_file�always_copy_from�	site_dirs�installed_projects�sitepy_installedZ_dry_run�distribution�verboseZ_set_command_options�get_option_dict)�selfr;r;r<�initialize_options�sF

zeasy_install.initialize_optionscCs"dd�|D�}tt|j|��dS)Ncss*|]"}tjj|�stjj|�r|VqdS)N)r>r?r@�islink)�.0�filenamer;r;r<�	<genexpr>�sz/easy_install.delete_blockers.<locals>.<genexpr>)�listr�_delete_path)r��blockersZextant_blockersr;r;r<�delete_blockers�szeasy_install.delete_blockerscCsJtjd|�|jrdStjj|�o.tjj|�}|r8tntj}||�dS)NzDeleting %s)	r�info�dry_runr>r?�isdirr��rmtree�unlink)r�r?Zis_treeZremoverr;r;r<r��szeasy_install._delete_pathcCs6tjdd�}td�}d}t|jft���t��dS)zT
        Render the Setuptools version and installation details, then exit.
        N��
setuptoolsz=setuptools {dist.version} from {dist.location} (Python {ver}))�sysr�r$�print�format�locals�
SystemExit)Zver�dist�tmplr;r;r<�_render_version�s
zeasy_install._render_versionc	Cst|jo|j�tjj�d}tdd�\}}|jj�|jj�|jj�||dd�|d|d||||t	tdd�d�|_
tjr�|j
|j
d	<|j|j
d
<|j�|j�|j�|jddd
d�|jdkr�|j|_|jdkr�d|_|jdd!�|jdd"�|j�r|j�r|j|_|j|_|jdd#�tttj�}t�|_|jdk	�r�dd�|jjd�D�}xV|D]N}t jj!|��s~t"j#d|�n,t|�|k�r�t$|d��n|jj%t|���q^W|j&�s�|j'�|j(�p�d|_(|jdd�|_)x4|jt|j�fD] }||j)k�r�|j)j*d|��q�W|j+dk	�r8dd�|j+jd�D�}ndg}|j,dk�r`|j-|j(|j)|d�|_,t.|j)tj�|_/|j0dk	�r�t1|j0t2j3��r�|j0j�|_0ng|_0|j4�r�|j,j5|j)tj�|j�s�|j,j6|j0�|jdd$�t1|j7t8��s@y2t8|j7�|_7d|j7k�odkn�st9�Wnt9k
�r>t$d��YnX|j&�rZ|j:�rZt;d��|j<�sjt;d ��g|_=dS)%Nrr��exec_prefixr���abiflags�)Z	dist_nameZdist_versionZ
dist_fullname�
py_version�py_version_short�py_version_nodotZ
sys_prefixr�Zsys_exec_prefixr�r��userbaseZusersiter�r�r�r�Fr�r��installr�cSsg|]}tjj|j���qSr;)r>r?�
expanduserrM)r�rDr;r;r<�
<listcomp>3sz1easy_install.finalize_options.<locals>.<listcomp>�,z"%s (in --site-dirs) does not existz$ (in --site-dirs) is not on sys.pathzhttps://pypi.org/simple/cSsg|]}|j��qSr;)rM)r�rDr;r;r<r�Hs�*)Zsearch_path�hostsr�z--optimize must be 0, 1, or 2z9Must specify a build directory (-b) when using --editablez:No urls, filenames, or requirements specified (see --help))r�r�)r�r�)r�r�)r�r�)>r�r�r��splitrr�Zget_nameZget_versionZget_fullname�getattr�config_varsr�r�r�r��_fix_install_dir_for_user_site�expand_basedirs�expand_dirs�_expandr�r�r�Zset_undefined_optionsr�r�r�rr!r?�
get_site_dirs�
all_site_dirsr�r>r�r�warnr�appendr{�check_site_dirr��shadow_path�insertr�r��create_indexr&�local_indexr��
isinstancerZstring_typesr�Zscan_egg_links�add_find_linksr��int�
ValueErrorr�rr��outputs)	r�r�r�r�rBr�r]Z	path_itemr�r;r;r<�finalize_options�s�



zeasy_install.finalize_optionscCs`|jstjrdS|j�|jdkr2d}t|��|j|_|_tj	j
dd�d}|j|�dS)z;
        Fix the install_dir if "--user" was used.
        Nz$User base directory is not specified�posixZunixZ_user)r�r�r��create_home_pathr�r	r�r�r>�namerN�
select_scheme)r��msgZscheme_namer;r;r<r�ms
z+easy_install._fix_install_dir_for_user_sitecCs\xV|D]N}t||�}|dk	rtjdks0tjdkr<tjj|�}t||j�}t|||�qWdS)Nr��nt)r�r>r�r?r�rr��setattr)r��attrs�attr�valr;r;r<�
_expand_attrs|s

zeasy_install._expand_attrscCs|jdddg�dS)zNCalls `os.path.expanduser` on install_base, install_platbase and
        root.r�r�r�N)r�)r�r;r;r<r��szeasy_install.expand_basedirscCsddddddg}|j|�dS)z+Calls `os.path.expanduser` on install dirs.r�r�r�r�r�r�N)r�)r��dirsr;r;r<r��szeasy_install.expand_dirscCs�|j|jjkrtj|j�z�x|jD]}|j||j�q$W|jr�|j}|j	r�t
|j	�}x(tt
|��D]}|||d�||<qfWddlm
}|j|j|j|fd|j�|j�Wdtj|jj�XdS)Nr)�	file_utilz'writing list of installed files to '%s')r�r�r�
set_verbosityr�r2r�r�r�r��len�range�	distutilsr��executeZ
write_file�warn_deprecated_options)r��specr�Zroot_lenZcounterr�r;r;r<�run�s$

zeasy_install.runcCsDytj�}Wn"tk
r.tjdtj�}YnXtjj|j	d|�S)z�Return a pseudo-tempname base in the install directory.
        This code is intentionally naive; if a malicious party can write to
        the target directory you're already in deep doodoo.
        rztest-easy-install-%s)
r>�getpid�	Exception�randomZrandintr��maxsizer?�joinr�)r��pidr;r;r<�pseudo_tempname�s
zeasy_install.pseudo_tempnamecCsdS)Nr;)r�r;r;r<r�sz$easy_install.warn_deprecated_optionscCsdt|j�}tjj|d�}tjj|�sTytj|�Wn ttfk
rR|j	�YnX||j
k}|rv|jrv|j�}nd|j
�d}tjj|�}y*|r�tj|�t|d�j�tj|�Wn ttfk
r�|j	�YnX|r�|jr�t|j���|�r|jdk�rt||j
�|_nd|_|ttt��k�r6d|_n$|j�rZtjj|��rZd|_d|_||_dS)z;Verify that self.install_dir is .pth-capable dir, if neededzeasy-install.pthz.write-test�wNT)r!r�r>r?rr@�makedirs�OSError�IOError�cant_write_to_targetr�r��check_pth_processingrr��open�closer�no_default_version_msgr�r3r�_pythonpathr�)r��instdirr�Zis_site_dirZtestfileZtest_existsr;r;r<r��s>



zeasy_install.check_site_diraS
        can't create or remove files in install directory

        The following error occurred while trying to add or remove files in the
        installation directory:

            %s

        The installation directory you specified (via --install-dir, --prefix, or
        the distutils default setting) was:

            %s
        z�
        This directory does not currently exist.  Please create it and try again, or
        choose a different installation directory (using the -d or --install-dir
        option).
        a�
        Perhaps your account does not have write access to this directory?  If the
        installation directory is a system-owned directory, you may need to sign in
        as the administrator or "root" account.  If you do not have administrative
        access to this machine, you may wish to choose a different installation
        directory, preferably one that is listed in your PYTHONPATH environment
        variable.

        For information on other options, you may wish to consult the
        documentation at:

          https://setuptools.readthedocs.io/en/latest/easy_install.html

        Please make the appropriate changes for your system and try again.
        cCsP|jtj�d|jf}tjj|j�s6|d|j7}n|d|j7}t	|��dS)N�rJ)
�_easy_install__cant_write_msgr��exc_infor�r>r?r@�_easy_install__not_exists_id�_easy_install__access_msgr)r�r�r;r;r<rs
z!easy_install.cant_write_to_targetc
Cs�|j}tjd|�|j�d}|d}tjj|�}td�d}y8|rNtj|�tjj	|�}t
jj|dd�t
|d�}Wn ttfk
r�|j�Yn�Xz�|j|jft���|j�d	}tj}tjd
k�rtjj|�\}}	tjj|d�}
|	j�dk�otjj|
�}|�r|
}d
dlm}||dddgd
�tjj|��rJtjd|�dSWd	|�r\|j�tjj|��rttj|�tjj|��r�tj|�X|j�s�tjd|�dS)z@Empirically verify whether .pth files are supported in inst. dirz Checking .pth file support in %sz.pthz.okzz
            import os
            f = open({ok_file!r}, 'w')
            f.write('OK')
            f.close()
            rJT)�exist_okrNr�zpythonw.exez
python.exer)�spawnz-Ez-c�passz-TEST PASSED: %s appears to support .pth filesz+TEST FAILED: %s does NOT support .pth filesF)r�rr�rr>r?r@�
_one_linerr��dirname�
pkg_resourcesZ
py31compatrrrrr�writer�r�rr��
executabler�r�r�lower�distutils.spawnr r�r�)
r�rr�Zok_fileZ	ok_existsr�r#rkr&�basenameZaltZuse_altr r;r;r<rsV


z!easy_install.check_pth_processingcCs\|jrN|jd�rNx:|jd�D],}|jd|�r2q|j|||jd|��qW|j|�dS)z=Write all the scripts for `dist`, unless scripts are excluded�scriptszscripts/N)r�Zmetadata_isdirZmetadata_listdir�install_scriptZget_metadata�install_wrapper_scripts)r�r��script_namer;r;r<�install_egg_scriptsSsz easy_install.install_egg_scriptscCs\tjj|�rLxJtj|�D].\}}}x"|D]}|jjtjj||��q(WqWn|jj|�dS)N)r>r?r��walkr�r�r)r�r?�baser��filesr�r;r;r<�
add_outputas

 zeasy_install.add_outputcCs|jrtd|f��dS)NzjInvalid argument %r: you can't use filenames or URLs with --editable (except via the --find-links option).)r{r)r�rr;r;r<�not_editableiszeasy_install.not_editablecCs<|js
dStjjtjj|j|j��r8td|j|jf��dS)Nz2%r already exists in %s; can't do a checkout there)r{r>r?r@rr��keyr)r�rr;r;r<�check_editableqszeasy_install.check_editableccs@tjtjd�d�}zt|�VWdtjj|�o8tt	|��XdS)Nz
easy_install-)r�)
�tempfile�mkdtempr�u�strr>r?r@r�r)r��tmpdirr;r;r<�_tmpdir{szeasy_install._tmpdirFcCs|js|j�|j���}t|t�s�t|�rT|j|�|jj||�}|j	d|||d�St
jj|�r||j|�|j	d|||d�St
|�}|j|�|jj|||j|j|j|j�}|dkr�d|}|jr�|d7}t|��n0|jtkr�|j|||d�|S|j	||j||�SWdQRXdS)NTz+Could not find suitable distribution for %rz2 (--always-copy skips system and development eggs)�Using)r{�install_site_pyr;r�r'rr3r��download�install_itemr>r?r@rr5Zfetch_distributionrYr�r�rZ
precedencer.�process_distribution�location)r�r�depsr:�dlr�r�r;r;r<r2�s2






zeasy_install.easy_installcCs|p|j}|ptjj|�|k}|p,|jd�}|pT|jdk	oTtjjt|��t|j�k}|r�|r�x$|j|jD]}|j	|krnPqnWd}t
jdtjj|��|r�|j
|||�}x<|D]}|j|||�q�Wn |j|�g}|j||d|d�|dk	�rx|D]}||kr�|Sq�WdS)Nz.eggTz
Processing %srr<)r�r>r?r#�endswithr�r!r��project_namerArr�r)�install_eggsr@�egg_distribution)r�rr>r:rBZinstall_neededr�Zdistsr;r;r<r?�s.






zeasy_install.install_itemcCs@t|}x2tD]*}d|}t||�dkrt||||�qWdS)z=Sets the install directories by applying the install schemes.Zinstall_N)r
rr�r�)r�r��schemer4Zattrnamer;r;r<r��s

zeasy_install.select_schemecGs�|j|�|jj|�||j|jkr2|jj|�|jj|�|j|�||j|j<tj	|j
||f|���|jd�r�|jr�|jj
|jd��|r�|jr�dS|dk	r�|j|jkr�tjd|�dS|dks�||kr�|j�}tt|��}tj	d|�ytg�j|g|j|j�}Wn^tk
�rB}ztt|���WYdd}~Xn0tk
�rp}zt|j���WYdd}~XnX|j�s�|j�r�x*|D]"}|j|jk�r�|j|j���q�Wtj	d|�dS)Nzdependency_links.txtzSkipping dependencies for %szProcessing dependencies for %sz'Finished processing dependencies for %s)�
update_pthr��addr�r4�remover.r�rr��installation_report�has_metadatar�r�Zget_metadata_linesr�r��as_requirementr'r9r+Zresolver2r,rr-Zreportr�)r�Zrequirementr�rBr�ZdistreqZdistrosr|r;r;r<r@�sB



z!easy_install.process_distributioncCs2|jdk	r|jS|jd�r dS|jd�s.dSdS)Nznot-zip-safeTzzip-safeF)r�rM)r�r�r;r;r<�should_unzip�s


zeasy_install.should_unzipcCs�tjj|j|j�}tjj|�r:d}tj||j|j|�|Stjj|�rL|}nRtjj	|�|krftj
|�tj|�}t|�dkr�tjj||d�}tjj|�r�|}t
|�tj||�|S)Nz<%r already exists in %s; build directory %s will not be keptrr)r>r?rr�r4r@rr�r�r#r��listdirrr#�shutil�move)r�r�
dist_filename�
setup_base�dstr��contentsr;r;r<�
maybe_moves"

zeasy_install.maybe_movecCs0|jr
dSx tj�j|�D]}|j|�qWdS)N)r��ScriptWriter�best�get_args�write_script)r�r�r�r;r;r<r,sz$easy_install.install_wrapper_scriptscCsNt|j��}t||�}|r8|j|�t�}tj|�|}|j|t|�d�dS)z/Generate a legacy script wrapper and install itrnN)	r9rN�is_python_script�_load_templater�rX�
get_headerr[rE)r�r�r-�script_text�dev_pathrZ	is_scriptZbodyr;r;r<r+"s
zeasy_install.install_scriptcCs(d}|r|jdd�}td|�}|jd�S)z�
        There are a couple of template scripts in the package. This
        function loads one of them and prepares it for use.
        zscript.tmplz.tmplz (dev).tmplr�zutf-8)rNr"�decode)r`r�Z	raw_bytesr;r;r<r],s

zeasy_install._load_template�tcs��j�fdd�|D��tjd|�j�tjj�j|�}�j|��jrLdSt	�}t
|�tjj|�rptj|�t
|d|��}|j|�WdQRXt|d|�dS)z1Write an executable file to the scripts directorycsg|]}tjj�j|��qSr;)r>r?rr�)r�rb)r�r;r<r�>sz-easy_install.write_script.<locals>.<listcomp>zInstalling %s script to %sNri�)r�rr�r�r>r?rr2r��
current_umaskr#r@r�rr%�chmod)r�r-rV�moder��target�maskrkr;)r�r<r[;s

zeasy_install.write_scriptcCs`|j�jd�r|j||�gS|j�jd�r8|j||�gS|j�jd�rT|j||�gS|}tjj|�r�|jd�r�t|||j	�ntjj
|�r�tjj|�}|j|�r�|j
r�|dk	r�|j|||�}tjj|d�}tjj|��s2ttjj|dd��}|�stdtjj|���t|�dk�r*td	tjj|���|d
}|j�rPtj|j||��gS|j||�SdS)Nz.eggz.exez.whlz.pyzsetup.pyr�z"Couldn't find a setup script in %srzMultiple setup scripts in %sr)r'rD�install_egg�install_exe�
install_wheelr>r?�isfiler�unpack_progressr��abspath�
startswithr�rWrr@rrrr{rr��report_editable�build_and_install)r�rrSr:rT�setup_scriptZsetupsr;r;r<rFOs<
zeasy_install.install_eggscCs>tjj|�r"t|tjj|d��}nttj|��}tj	||d�S)NzEGG-INFO)�metadata)
r>r?r�r)rr*�	zipimport�zipimporterr(Z
from_filename)r��egg_pathrrr;r;r<rG{s

zeasy_install.egg_distributionc
Cs�tjj|jtjj|��}tjj|�}|js2t|�|j|�}t	||��s|tjj
|�rttjj|�rttj
||jd�n"tjj|�r�|jtj|fd|�y�d}tjj
|�r�|j|�r�tjd}}ntjd}}nL|j|�r�|j|�|jd}}n*d}|j|��rtjd}}ntjd}}|j|||f|dtjj|�tjj|�f�t||d	�Wn$tk
�rzt|dd	��YnX|j|�|j|�S)
N)r�z	Removing FZMovingZCopyingZ
ExtractingTz	 %s to %s)�fix_zipimporter_caches)r>r?rr�r)rmr�r#rGr1r�r�r
�remove_treer@rr�rnrQrRZcopytreerOZmkpath�unpack_and_compileZcopy2r#�update_dist_cachesr	r2)r�rur:�destinationr�Znew_dist_is_zippedrkrWr;r;r<rh�sT






zeasy_install.install_eggcsTt|�}|dkrtd|��td|jdd�|jdd�t�d�}tjj||j�d�}||_	|d}tjj|d�}tjj|d	�}t
|�t||�|_|j
||�tjj|��st|d
�}	|	jd�x<|jd�D].\}
}|
dkr�|	jd
|
jdd�j�|f�q�W|	j�tjj|d��|j�fdd�tj|�D��tj|||j|jd�|j||�S)Nz(%s is not a valid distutils Windows .exerrr�r�)rEr��platformz.eggz.tmpzEGG-INFOzPKG-INFOrzMetadata-Version: 1.0
�target_versionz%s: %s
�_�-r*csg|]}tjj�|d��qS)r)r>r?r)r�r�)r�r;r<r��sz,easy_install.install_exe.<locals>.<listcomp>)r�r�)r4rr(�getrr>r?r�egg_namerAr#r)Z	_provider�
exe_to_eggr@rr%�itemsrN�titlerr�rXrZrZmake_zipfiler�r�rh)r�rSr:�cfgr�ru�egg_tmpZ	_egg_infoZpkg_infrk�k�vr;)r�r<ri�s<



"
zeasy_install.install_execs>t|��g�g�i������fdd�}t|�|�g}xt�D]l}|j�jd�r>|jd�}|d}tj|d�d|d<tjj	�f|��}�j
|�|j
|�tj||�q>W|j��tj
tjj	�d�tj�|��xbdD]Z}	t�|	r�tjj	�d|	d
�}
tjj|
�s�t|
d�}|jdj	t�|	�d�|j�q�Wd
S)z;Extract a bdist_wininst to the directories an egg would usecs�|j�}xԈD]�\}}|j|�r||t|�d�}|jd�}tjj�f|��}|j�}|jd�sl|jd�r�tj	|d
�|d<d�tjj
|d�d<�j|�n4|jd�r�|dkr�d�tjj
|d�d<�j|�|SqW|jd�s�tj
d	|�dS)N�/z.pydz.dllrrz.pyzSCRIPTS/z.pthzWARNING: can't process %s���r�)r'rnrr�r>r?rrDr�strip_module�splitextr�rr�)�srcrUrD�old�new�partsrC)r��native_libs�prefixes�
to_compile�	top_levelr;r<�process�s$



z(easy_install.exe_to_egg.<locals>.processz.pydr�rz.pyzEGG-INFOr�r�z.txtrrJNr�r�r�)r�r�)r6rr'rDr�rr�r>r?rr�Z
write_stub�byte_compileZwrite_safety_flagZanalyze_eggr�r@rr%r)r�rSr�r�Zstubs�resr�ZresourceZpyfiler�Ztxtrkr;)r�r�r�r�r�r<r��s6







zeasy_install.exe_to_eggc
Cs�t|�}tjj|j|j��}tjj|�}|js6t|�tjj	|�rbtjj
|�rbtj||jd�n"tjj
|�r�|jtj|fd|�z.|j|j|fdtjj|�tjj|�f�Wdt|dd�X|j|�|j|�S)N)r�z	Removing zInstalling %s to %sF)rv)rr>r?rr�r�rmr�r#r�r�r
rwr@rr�Zinstall_as_eggr)r#ryr2rG)r�Z
wheel_pathr:Zwheelrzr;r;r<rjs,


zeasy_install.install_wheela(
        Because this distribution was installed --multi-version, before you can
        import modules from this package in an application, you will need to
        'import pkg_resources' and then use a 'require()' call similar to one of
        these examples, in order to select the desired version:

            pkg_resources.require("%(name)s")  # latest installed version
            pkg_resources.require("%(name)s==%(version)s")  # this exact version
            pkg_resources.require("%(name)s>=%(version)s")  # this version or higher
        z�
        Note also that the installation directory must be on sys.path at runtime for
        this to work.  (e.g. by being the application's script directory, by being on
        PYTHONPATH, or by being added to sys.path by your code.)
        �	Installedc	Cs`d}|jr@|jr@|d|j7}|jtttj�kr@|d|j7}|j	}|j
}|j}d}|t�S)z9Helpful installation message for display to package usersz
%(what)s %(eggloc)s%(extras)srJr�)
r�r��_easy_install__mv_warningr�rr!r�r?�_easy_install__id_warningrArEr�r�)	r�Zreqr�Zwhatr�Zegglocr�r�Zextrasr;r;r<rLIsz easy_install.installation_reportaR
        Extracted editable version of %(spec)s to %(dirname)s

        If it uses setuptools in its setup script, you can activate it in
        "development" mode by going to that directory and running::

            %(python)s setup.py develop

        See the setuptools documentation for the "develop" command for more info.
        cCs"tjj|�}tj}d|jt�S)NrJ)r>r?r#r�r&�_easy_install__editable_msgr�)r�rrqr#�pythonr;r;r<robszeasy_install.report_editablecCs�tjjdt�tjjdt�t|�}|jdkrNd|jd}|jdd|�n|jdkrd|jdd�|jrv|jdd	�t	j
d
|t|�dd�dj|��yt
||�Wn6tk
r�}ztd|jdf��WYdd}~XnXdS)
Nzdistutils.command.bdist_eggzdistutils.command.egg_infor�r�rrr~z-qz-nz
Running %s %s� zSetup script exited with %s)r��modules�
setdefaultrrr�r�r�r�rr�rrrr�rr�)r�rqrTr�r�r;r;r<rgs 

 zeasy_install.run_setupc	Cs�ddg}tjdtjj|�d�}z�|jtjj|��|j|�|j|||�t|g�}g}x2|D]*}x$||D]}|j|j	|j
|��qlWq^W|r�|jr�tj
d|�|St|�tj|j�XdS)Nrz
--dist-dirz
egg-dist-tmp-)r��dirz+No eggs found in %s (setup script problem?))r6r7r>r?r#�_set_fetcher_optionsr�rr&rhrAr�rr�r�rr�)	r�rqrTr�Zdist_dirZall_eggsZeggsr4r�r;r;r<rp{s$


zeasy_install.build_and_installc	Cst|jjd�j�}d
}i}x2|j�D]&\}}||kr4q"|d||jdd	�<q"Wt|d
�}tjj|d�}t	j
||�dS)a
        When easy_install is about to run bdist_egg on a source dist, that
        source dist might have 'setup_requires' directives, requiring
        additional fetching. Ensure the fetcher options given to easy_install
        are available to that command as well.
        r2r�r�r�r�r�rr}r~)r2z	setup.cfgN)r�r�r�r�r�r�)r�r��copyr�rN�dictr>r?rrZedit_config)	r�r0Zei_optsZfetch_directivesZ
fetch_optionsr4r�ZsettingsZcfg_filenamer;r;r<r��s	
z!easy_install._set_fetcher_optionscCs0|jdkrdSxX|j|jD]H}|js2|j|jkrtjd|�|jj|�|j|jkr|jj|j�qW|js�|j|jjkr�tjd|�n2tjd|�|jj	|�|j|jkr�|jj
|j�|j�s,|jj�|jdk�r,t
jj|jd�}t
jj|��rt
j|�t|d�}|j|jj|j�d�|j�dS)Nz&Removing %s from easy-install.pth filez4%s is already the active version in easy-install.pthz"Adding %s to easy-install.pth filer�zsetuptools.pth�wtrJ)r�r4r�rArr�rKr��pathsrJr�r��saver>r?rr�r�r�rr%�
make_relativer)r�r�r]r�rkr;r;r<rI�s4



zeasy_install.update_pthcCstjd||�|S)NzUnpacking %s to %s)r�debug)r�r�rUr;r;r<rl�szeasy_install.unpack_progresscshg�g����fdd�}t|||��j���jsdx.�D]&}tj|�tjdBd@}t||�q:WdS)Ncs\|jd�r"|jd�r"�j|�n|jd�s6|jd�r@�j|��j||��jrX|pZdS)Nz.pyz	EGG-INFO/z.dllz.so)rDrnr�rlr�)r�rU)r��to_chmodr�r;r<�pf�s
z+easy_install.unpack_and_compile.<locals>.pfimi�)rr�r�r>�stat�ST_MODErd)r�rurzr�rkrer;)r�r�r�r<rx�s

zeasy_install.unpack_and_compilecCsjtjr
dSddlm}z@tj|jd�||dd|jd�|jrT|||jd|jd�Wdtj|j�XdS)Nr)r�r)r��forcer�)	r��dont_write_bytecode�distutils.utilr�rrr�r�r�)r�r�r�r;r;r<r��szeasy_install.byte_compilea�
        bad install directory or PYTHONPATH

        You are attempting to install a package to a directory that is not
        on PYTHONPATH and which Python does not read ".pth" files from.  The
        installation directory you specified (via --install-dir, --prefix, or
        the distutils default setting) was:

            %s

        and your PYTHONPATH environment variable currently contains:

            %r

        Here are some of your options for correcting the problem:

        * You can choose a different installation directory, i.e., one that is
          on PYTHONPATH or supports .pth files

        * You can add the installation directory to the PYTHONPATH environment
          variable.  (It must then also be on PYTHONPATH whenever you run
          Python and want to use the package(s) you are installing.)

        * You can set up the installation directory to support ".pth" files by
          using one of the approaches described here:

          https://setuptools.readthedocs.io/en/latest/easy_install.html#custom-installation-locations


        Please make the appropriate changes for your system and try again.cCs|j}||jtjjdd�fS)N�
PYTHONPATHr�)�_easy_install__no_default_msgr�r>�environr)r��templater;r;r<rsz#easy_install.no_default_version_msgcCs�|jr
dStjj|jd�}tdd�}|jd�}d}tjj|�r�tj	d|j�t
j|��}|j�}WdQRX|j
d�s�td	|��||kr�tjd
|�|js�t|�t
j|ddd��}|j|�WdQRX|j|g�d
|_dS)z8Make sure there's a site.py in the target dir, if neededNzsite.pyr�z
site-patch.pyzutf-8r�zChecking existing site.py in %sz
def __boot():z;%s is not a setuptools-generated site.py; please remove it.zCreating %sr)�encodingT)r�r>r?rr�r"rar@rr��ior�readrnrr�r�r#r%r�)r�Zsitepy�sourceZcurrentZstrmr;r;r<r=s,


zeasy_install.install_site_pycCsj|js
dSttjjd��}xJtj|j�D]:\}}|j|�r(tjj	|�r(|j
d|�tj|d�q(WdS)zCreate directories under ~.N�~zos.makedirs('%s', 0o700)i�)r�rr>r?r�rZ	iteritemsr�rnr�Zdebug_printr)r��homer�r?r;r;r<r�>szeasy_install.create_home_pathz/$base/lib/python$py_version_short/site-packagesz	$base/bin)r�r�)r�z$base/Lib/site-packagesz
$base/ScriptscGs�|jd�j}|jrh|j�}|j|d<|jjtj|j�}x0|j	�D]$\}}t
||d�dkr@t|||�q@Wddlm
}xJ|D]B}t
||�}|dk	rz|||�}tjdkr�tjj|�}t|||�qzWdS)Nr�r0r)rr�)Zget_finalized_commandr�r�r�r
rr>r��DEFAULT_SCHEMEr�r�r�r�rr?r�)r�r�r�rHr�r�rr;r;r<r�Ts 




zeasy_install._expand)rQNrR)rSrTrU)rVrWrX)rYrZr[)r\r]r^)r_rDr`)rarbrc)rdrerf)rgrhri)rjrkrl)rmrnro)rprqrr)rsNrt)rurvrw)rxryrz)r{r|r})r~rr�)r�r�r�)r�r�r�)r�Nr�)r�Nr�)F)F)T)N)r�)Q�__name__�
__module__�__qualname__�__doc__�descriptionZcommand_consumes_argumentsZuser_optionsZboolean_optionsr�r�r�Zhelp_msgr�Znegative_optrr�r�r�r��staticmethodr�r�r�r�r�r�rrrr�rKrL�lstriprrrrrr.r2r3r5�
contextlib�contextmanagerr;r2r?r�r@rOrWr,r+r]r[rFrGrhrir�rjr�r�rLr�rorrpr�rIrlrxr�r�rr=r�r�r
r�r�r;r;r;r<r2xs�



0	z	0


	;
	
$
$	
'	

,6-5	

	
%
 
cCs tjjdd�jtj�}td|�S)Nr�r�)r>r�rr��pathsep�filter)r�r;r;r<rksrcCs�g}|jt��tjg}tjtjkr0|jtj�x�|D]�}|r6tjdkr`|jtjj	|dd��n\tj
dkr�|jtjj	|ddtjdd	�d�tjj	|dd
�g�n|j|tjj	|dd�g�tjdkr6d|kr6tjj
d
�}|r6tjj	|ddtjdd	�d�}|j|�q6Wtd�td�f}x"|D]}||k�r |j|��q Wtj�rR|jtj�y|jtj��Wntk
�rzYnXttt|��}|S)z&
    Return a list of 'site' dirs
    �os2emx�riscosZLibz
site-packagesr��libr�Nr�zsite-python�darwinzPython.framework�HOME�Library�Python�purelib�platlib)r�r�)�extendrr�r�r�r�r{r>r?r�sepr�r�rrr�r�r��getsitepackages�AttributeErrorr�rr!)�sitedirsr�r�r�Zhome_spZ	lib_pathsZsite_libr;r;r<r�psV





r�ccs�i}x�|D]�}t|�}||kr q
d||<tjj|�s6q
tj|�}||fVx�|D]�}|jd�s`qP|dkrjqPttjj||��}tt	|��}|j
�xP|D]H}|jd�s�t|j��}||kr�d||<tjj|�s�q�|tj|�fVq�WqPWq
WdS)zBYield sys.path directories that might contain "old-style" packagesrz.pth�easy-install.pth�setuptools.pth�importN)r�r�)
r!r>r?r�rPrDrrr�r rrn�rstrip)Zinputs�seenr#r1r�rk�lines�liner;r;r<�expand_paths�s4






r�cCs&t|d�}�z
tj|�}|dkr$dS|d|d|d}|dkrHdS|j|d�tjd|jd��\}}}|dkrzdS|j|d|�d
d
d�}tj|�}y<|j|�}	|	j	dd
�d}
|
j
tj��}
|j
tj|
��Wntjk
r�dSX|jd��s|jd��rdS|S|j�XdS)znExtract configuration data from a bdist_wininst .exe

    Returns a configparser.RawConfigParser, or None
    �rbN�	���z<iii�zV4�{V4r�)r�r|�rrrrZSetup)r�r�)r�zipfileZ_EndRecData�seekr9�unpackr�rZRawConfigParserr�rar��getfilesystemencodingZreadfpr�StringIO�ErrorZhas_sectionr)rSrkZendrecZ	prepended�tagZcfglenZbmlenZinitr��part�configr;r;r<r4�s4




c
CsJdddddg}tj|�}�z�x�|j�D]�}|j}|jd�}t|�d	kr�|d
dkr�|djd
�r�|jddj|dd
��df�Pt|�d
ks(|jd�r�q(|jd�r�q(|dj	�dkr(|j
|�}tjr�|j
�}xDt|�D]8}|j�jdd�}|jd�s�|jd|d|fdf�q�Wq(WWd|j�Xdd�|D�}|j�|j�|S) z4Get exe->egg path translations for a given .exe file�PURELIB/r��PLATLIB/pywin32_system32�PLATLIB/�SCRIPTS/�EGG-INFO/scripts/�DATA/lib/site-packagesr�r�r�zPKG-INFOrz	.egg-inforNz	EGG-INFO/z.pthz
-nspkg.pth�PURELIB�PLATLIB�\r�z%s/%s/cSsg|]\}}|j�|f�qSr;)r')r�rb�yr;r;r<r�$sz$get_exe_prefixes.<locals>.<listcomp>)r�r�)r�r�)r�r�)r�r�)r�r�)r�r�)r�ZZipFileZinfolistr�r�rrDr�r�upperr�rZPY3rar rMrNrnr�r�sort�reverse)Zexe_filenamer�rTr�r�r�rVZpthr;r;r<r6s>




&
c@sTeZdZdZdZffdd�Zdd�Zdd�Zed	d
��Z	dd�Z
d
d�Zdd�ZdS)r3z)A .pth file with Distribution paths in itFcCsp||_ttt|��|_ttjj|j��|_|j	�t
j|gdd�x(t|j
�D]}tt|jt|d���qNWdS)NT)r�r�rr!r�r>r?r#�basedir�_loadr&�__init__r r�rJr%)r�r�r�r?r;r;r<r�/szPthDistributions.__init__cCsg|_d}tj|j�}tjj|j�r�t|jd�}x�|D]�}|j	d�rJd}q6|j
�}|jj|�|j�s6|j�j	d�rxq6t
tjj|j|��}|jd<tjj|�s�||kr�|jj�d|_q6d||<q6W|j�|jr�|r�d|_x&|jo�|jdj��r
|jj�q�WdS)	NFZrtr�T�#rr�r�)r�r��fromkeysr�r>r?rkr�rrnr�r�rMr!rr�r@�pop�dirtyr)r�Z
saw_importr�rkr�r?r;r;r<r�8s2


zPthDistributions._loadc	Cs�|js
dStt|j|j��}|r�tjd|j�|j|�}dj	|�d}t
jj|j�r`t
j
|j�t|jd��}|j|�WdQRXn(t
jj|j�r�tjd|j�t
j
|j�d|_dS)z$Write changed .pth file back to diskNz	Saving %srJr�zDeleting empty %sF)r�r�rr�r�rr�r��_wrap_linesrr>r?r�r�rr%r@)r�Z	rel_pathsr��datarkr;r;r<r�Ws
zPthDistributions.savecCs|S)Nr;)r�r;r;r<rmszPthDistributions._wrap_linescCsN|j|jko$|j|jkp$|jtj�k}|r>|jj|j�d|_tj||�dS)z"Add `dist` to the distribution mapTN)	rAr�r�r>�getcwdr�r�r&rJ)r�r��new_pathr;r;r<rJqszPthDistributions.addcCs6x$|j|jkr$|jj|j�d|_qWtj||�dS)z'Remove `dist` from the distribution mapTN)rAr�rKr�r&)r�r�r;r;r<rKs
zPthDistributions.removecCs�tjjt|��\}}t|j�}|g}tjdkr2dp6tj}xVt|�|kr�||jkrn|jtj	�|j
�|j|�Stjj|�\}}|j|�q:W|SdS)Nr�)r>r?r�r!rr��altsepr�r��curdirr�r)r�r?ZnpathZlastZbaselenr�r�r;r;r<r��s


zPthDistributions.make_relativeN)
r�r�r�r�r�r�r�r�r�rrJrKr�r;r;r;r<r3*s	c@s(eZdZedd��Zed�Zed�ZdS)�RewritePthDistributionsccs(|jVx|D]
}|VqW|jVdS)N)�prelude�postlude)�clsr�r�r;r;r<r�s

z#RewritePthDistributions._wrap_linesz?
        import sys
        sys.__plen = len(sys.path)
        z�
        import sys
        new = sys.path[sys.__plen:]
        del sys.path[sys.__plen:]
        p = getattr(sys, '__egginsert', 0)
        sys.path[p:p] = new
        sys.__egginsert = p + len(new)
        N)r�r�r��classmethodrr"rrr;r;r;r<r�s
rZSETUPTOOLS_SYS_PATH_TECHNIQUE�rawZrewritecCs ttjt�rtStjtjj��S)z_
    Return a regular expression based on first_line_re suitable for matching
    strings.
    )r�r�patternr9�re�compilerar;r;r;r<�_first_line_re�srcCsd|tjtjgkr.tjdkr.t|tj�||�Stj�\}}}t	j
||d|dd||ff�dS)Nr�rrz %s %s)r>r�rKr�rdr��S_IWRITEr�rrZreraise)�func�arg�excZetZevr}r;r;r<�
auto_chmod�s
rcCs.t|�}t|tj�|r"t|�nt|�dS)aa

    Fix any globally cached `dist_path` related data

    `dist_path` should be a path of a newly installed egg distribution (zipped
    or unzipped).

    sys.path_importer_cache contains finder objects that have been cached when
    importing data from the original distribution. Any such finders need to be
    cleared since the replacement distribution might be packaged differently,
    e.g. a zipped egg distribution might get replaced with an unzipped egg
    folder or vice versa. Having the old finders cached may then cause Python
    to attempt loading modules from the replacement distribution using an
    incorrect loader.

    zipimport.zipimporter objects are Python loaders charged with importing
    data packaged inside zip archives. If stale loaders referencing the
    original distribution, are left behind, they can fail to load modules from
    the replacement distribution. E.g. if an old zipimport.zipimporter instance
    is used to load data from a new zipped egg archive, it may cause the
    operation to attempt to locate the requested data in the wrong location -
    one indicated by the original distribution's zip archive directory
    information. Such an operation may then fail outright, e.g. report having
    read a 'bad local file header', or even worse, it may fail silently &
    return invalid data.

    zipimport._zip_directory_cache contains cached zip archive directory
    information for all existing zipimport.zipimporter instances and all such
    instances connected to the same archive share the same cached directory
    information.

    If asked, and the underlying Python implementation allows it, we can fix
    all existing zipimport.zipimporter instances instead of having to track
    them down and remove them one by one, by updating their shared cached zip
    archive directory information. This, of course, assumes that the
    replacement distribution is packaged as a zipped egg.

    If not asked to fix existing zipimport.zipimporter instances, we still do
    our best to clear any remaining zipimport.zipimporter related cached data
    that might somehow later get used when attempting to load data from the new
    distribution and thus cause such load operations to fail. Note that when
    tracking down such remaining stale data, we can not catch every conceivable
    usage from here, and we clear only those that we know of and have found to
    cause problems if left alive. Any remaining caches should be updated by
    whomever is in charge of maintaining them, i.e. they should be ready to
    handle us replacing their zip archives with new distributions at runtime.

    N)r!�_uncacher��path_importer_cache�!_replace_zip_directory_cache_data�*_remove_and_clear_zip_directory_cache_data)Z	dist_pathrv�normalized_pathr;r;r<ry�s
<
rycCsTg}t|�}xB|D]:}t|�}|j|�r|||d�tjdfkr|j|�qW|S)ap
    Return zipimporter cache entry keys related to a given normalized path.

    Alternative path spellings (e.g. those using different character case or
    those using alternative path separators) related to the same path are
    included. Any sub-path entries are included as well, i.e. those
    corresponding to zip archives embedded in other zip archives.

    rr�)rr!rnr>r�r�)r�cache�resultZ
prefix_len�pZnpr;r;r<�"_collect_zipimporter_cache_entriess


rcCsDx>t||�D]0}||}||=|o*|||�}|dk	r|||<qWdS)a�
    Update zipimporter cache data for a given normalized path.

    Any sub-path entries are processed as well, i.e. those corresponding to zip
    archives embedded in other zip archives.

    Given updater is a callable taking a cache entry key and the original entry
    (after already removing the entry from the cache), and expected to update
    the entry and possibly return a new one to be inserted in its place.
    Returning None indicates that the entry should not be replaced with a new
    one. If no updater is given, the cache entries are simply removed without
    any additional processing, the same as if the updater simply returned None.

    N)r)rr�updaterr�	old_entryZ	new_entryr;r;r<�_update_zipimporter_cache*s
r cCst||�dS)N)r )rrr;r;r<rJsrcCsdd�}t|tj|d�dS)NcSs|j�dS)N)�clear)r?rr;r;r<�2clear_and_remove_cached_zip_archive_directory_dataOszf_remove_and_clear_zip_directory_cache_data.<locals>.clear_and_remove_cached_zip_archive_directory_data)r)r rs�_zip_directory_cache)rr"r;r;r<rNsrZ__pypy__cCsdd�}t|tj|d�dS)NcSs&|j�tj|�|jtj|�|S)N)r!rsrt�updater#)r?rr;r;r<�)replace_cached_zip_archive_directory_dataes
zT_replace_zip_directory_cache_data.<locals>.replace_cached_zip_archive_directory_data)r)r rsr#)rr%r;r;r<rds
r�<string>cCs2yt||d�Wnttfk
r(dSXdSdS)z%Is this string a valid Python script?�execFTN)r�SyntaxError�	TypeError)rOr�r;r;r<�	is_pythonws
r*cCsJy(tj|dd��}|jd�}WdQRXWnttfk
r@|SX|dkS)zCDetermine if the specified executable is a .sh (contains a #! line)zlatin-1)r�r�Nz#!)r�rr�rr)r&�fp�magicr;r;r<�is_sh�sr-cCstj|g�S)z@Quote a command line argument according to Windows parsing rules)�
subprocess�list2cmdline)rr;r;r<�nt_quote_arg�sr0cCsH|jd�s|jd�rdSt||�r&dS|jd�rDd|j�dj�kSdS)zMIs this text, as a whole, a Python script? (as opposed to shell/bat/etc.
    z.pyz.pywTz#!r�rF)rDr*rn�
splitlinesr')r_r�r;r;r<r\�s

r\)rdcGsdS)Nr;)r�r;r;r<�_chmod�sr2cCsRtjd||�yt||�Wn0tjk
rL}ztjd|�WYdd}~XnXdS)Nzchanging mode of %s to %ozchmod failed: %s)rr�r2r>�error)r?rer|r;r;r<rd�s
rdc@s�eZdZdZgZe�Zedd��Zedd��Z	edd��Z
edd	��Zed
d��Zdd
�Z
edd��Zdd�Zedd��Zedd��ZdS)�CommandSpeczm
    A command spec for a #! header, specified as a list of arguments akin to
    those passed to Popen.
    cCs|S)zV
        Choose the best CommandSpec class based on environmental conditions.
        r;)r	r;r;r<rY�szCommandSpec.bestcCstjjtj�}tjjd|�S)N�__PYVENV_LAUNCHER__)r>r?rBr�r&r�r)r	Z_defaultr;r;r<�_sys_executable�szCommandSpec._sys_executablecCs:t||�r|St|t�r ||�S|dkr0|j�S|j|�S)zg
        Construct a CommandSpec from a parameter to build_scripts, which may
        be None.
        N)r�r��from_environment�from_string)r	Zparamr;r;r<�
from_param�s

zCommandSpec.from_paramcCs||j�g�S)N)r6)r	r;r;r<r7�szCommandSpec.from_environmentcCstj|f|j�}||�S)z}
        Construct a command spec from a simple string representing a command
        line parseable by shlex.split.
        )�shlexr��
split_args)r	�stringr�r;r;r<r8�szCommandSpec.from_stringcCs8tj|j|��|_tj|�}t|�s4dg|jdd�<dS)Nz-xr)r:r��_extract_options�optionsr.r/rH)r�r_�cmdliner;r;r<�install_options�s
zCommandSpec.install_optionscCs:|dj�d}t�j|�}|r.|jd�p0dnd}|j�S)zH
        Extract any options from the first line of the script.
        rJrrr�)r1r�match�grouprM)Zorig_script�firstrAr>r;r;r<r=�szCommandSpec._extract_optionscCs|j|t|j��S)N)�_renderr�r>)r�r;r;r<�	as_header�szCommandSpec.as_headercCs6d}x,|D]$}|j|�r
|j|�r
|dd�Sq
W|S)Nz"'rr�)rnrD)�itemZ_QUOTES�qr;r;r<�
_strip_quotes�s

zCommandSpec._strip_quotescCs tjdd�|D��}d|dS)Ncss|]}tj|j��VqdS)N)r4rHrM)r�rFr;r;r<r��sz&CommandSpec._render.<locals>.<genexpr>z#!rJ)r.r/)r�r?r;r;r<rD�szCommandSpec._renderN)r�r�r�r�r>r�r;r
rYr6r9r7r8r@r�r=rErHrDr;r;r;r<r4�s	
r4c@seZdZedd�ZdS)�WindowsCommandSpecF)r�N)r�r�r�r�r;r;r;r;r<rIsrIc@s�eZdZdZejd�j�ZeZ	e
ddd��Ze
ddd��Ze
dd	d
��Z
edd��Ze
d
d��Ze
dd��Ze
dd��Ze
ddd��ZdS)rXz`
    Encapsulates behavior around writing entry point scripts for console and
    gui apps.
    a�
        # EASY-INSTALL-ENTRY-SCRIPT: %(spec)r,%(group)r,%(name)r
        __requires__ = %(spec)r
        import re
        import sys
        from pkg_resources import load_entry_point

        if __name__ == '__main__':
            sys.argv[0] = re.sub(r'(-script\.pyw?|\.exe)?$', '', sys.argv[0])
            sys.exit(
                load_entry_point(%(spec)r, %(group)r, %(name)r)()
            )
    NFcCs6tjdt�|rtntj�}|jd||�}|j||�S)NzUse get_argsr�)�warningsr��DeprecationWarning�WindowsScriptWriterrXrY�get_script_headerrZ)r	r�r&�wininst�writer�headerr;r;r<�get_script_argsszScriptWriter.get_script_argscCs6tjdt�|rd}|jj�j|�}|j|�|j�S)NzUse get_headerz
python.exe)rJr�rK�command_spec_classrYr9r@rE)r	r_r&rN�cmdr;r;r<rM's
zScriptWriter.get_script_headerccs�|dkr|j�}t|j��}xjdD]b}|d}xT|j|�j�D]B\}}|j|�|jt�}|j||||�}	x|	D]
}
|
VqrWq>Wq"WdS)z�
        Yield write_script() argument tuples for a distribution's
        console_scripts and gui_scripts entry points.
        N�console�guiZ_scripts)rTrU)	r^r9rNZ
get_entry_mapr��_ensure_safe_namer�r��_get_script_args)r	r�rPr�type_rBr�Zepr_r�r�r;r;r<rZ1s


zScriptWriter.get_argscCstjd|�}|rtd��dS)z?
        Prevent paths in *_scripts entry point names.
        z[\\/]z+Path separators not allowed in script namesN)r
�searchr�)r�Zhas_path_sepr;r;r<rVCszScriptWriter._ensure_safe_namecCs tjdt�|rtj�S|j�S)NzUse best)rJr�rKrLrY)r	Z
force_windowsr;r;r<�
get_writerLszScriptWriter.get_writercCs.tjdkstjdkr&tjdkr&tj�S|SdS)zD
        Select the best ScriptWriter for this environment.
        �win32�javar�N)r�r{r>r��_namerLrY)r	r;r;r<rYRszScriptWriter.bestccs|||fVdS)Nr;)r	rXr�rPr_r;r;r<rW\szScriptWriter._get_script_argsr�cCs"|jj�j|�}|j|�|j�S)z;Create a #! line, getting options (if any) from script_text)rRrYr9r@rE)r	r_r&rSr;r;r<r^as
zScriptWriter.get_header)NF)NF)N)r�N)r�r�r�r�rKrLr�r�r4rRr
rQrMrZr�rVrZrYrWr^r;r;r;r<rX	s 
		
rXc@sLeZdZeZedd��Zedd��Zedd��Zedd��Z	e
d	d
��ZdS)rLcCstjdt�|j�S)NzUse best)rJr�rKrY)r	r;r;r<rZlszWindowsScriptWriter.get_writercCs"tt|d�}tjjdd�}||S)zC
        Select the best ScriptWriter suitable for Windows
        )r&ZnaturalZSETUPTOOLS_LAUNCHERr&)r��WindowsExecutableLauncherWriterr>r�r)r	Z
writer_lookupZlauncherr;r;r<rYrs
zWindowsScriptWriter.bestc	#s�tddd�|}|tjdj�jd�krBdjft��}tj|t	�dddd	d
ddg}|j
|�|j||�}�fdd
�|D�}�|||d|fVdS)z For Windows, add a .py extensionz.pyaz.pyw)rTrUZPATHEXT�;zK{ext} not listed in PATHEXT; scripts will not be recognized as executables.z.pyz
-script.pyz.pycz.pyoz.execsg|]}�|�qSr;r;)r�rb)r�r;r<r��sz8WindowsScriptWriter._get_script_args.<locals>.<listcomp>rbN)r�r>r�r'r�r�r�rJr��UserWarningrK�_adjust_header)	r	rXr�rPr_�extr�r�r�r;)r�r<rWs
z$WindowsScriptWriter._get_script_argscCsNd}d}|dkr||}}tjtj|�tj�}|j||d�}|j|�rJ|S|S)z�
        Make sure 'pythonw' is used for gui and and 'python' is used for
        console (regardless of what sys.executable is).
        zpythonw.exez
python.exerU)r<�repl)r
r�escape�
IGNORECASE�sub�_use_header)r	rXZorig_headerrrcZ
pattern_ob�
new_headerr;r;r<ra�s
z"WindowsScriptWriter._adjust_headercCs$|dd�jd�}tjdkp"t|�S)z�
        Should _adjust_header use the replaced header?

        On non-windows systems, always use. On
        Windows systems, only use the replaced header if it resolves
        to an executable on the system.
        r�r�"r[r�)rMr�r{r)rhZclean_headerr;r;r<rg�s	zWindowsScriptWriter._use_headerN)r�r�r�rIrRr
rZrYrWrar�rgr;r;r;r<rLis
rLc@seZdZedd��ZdS)r^c#s�|dkrd}d}dg}nd}d}dddg}|j||�}�fd	d
�|D�}	�|||d|	fV�dt|�d
fVt�s��d}
|
t��dfVdS)zG
        For Windows, add a .py extension and an .exe launcher
        rUz-script.pywz.pywZcliz
-script.pyz.pyz.pycz.pyocsg|]}�|�qSr;r;)r�rb)r�r;r<r��szDWindowsExecutableLauncherWriter._get_script_args.<locals>.<listcomp>rbz.exernz
.exe.manifestN)ra�get_win_launcherr=�load_launcher_manifest)r	rXr�rPr_Z
launcher_typerbr�Zhdrr�Zm_namer;)r�r<rW�s
z0WindowsExecutableLauncherWriter._get_script_argsN)r�r�r�r
rWr;r;r;r<r^�sr^cCs2d|}t�r|jdd�}n|jdd�}td|�S)z�
    Load the Windows launcher (executable) suitable for launching a script.

    `type` should be either 'cli' or 'gui'

    Returns the executable as a byte string.
    z%s.exe�.z-64.z-32.r�)r=rNr")�typeZlauncher_fnr;r;r<rj�s
rjcCs0tjtd�}tjr|t�S|jd�t�SdS)Nzlauncher manifest.xmlzutf-8)r$r"r�r�PY2�varsra)r�Zmanifestr;r;r<rk�s
rkFcCstj|||�S)N)rQr�)r?�
ignore_errors�onerrorr;r;r<r��sr�cCstjd�}tj|�|S)N�)r>�umask)Ztmpr;r;r<rc�s

rccCs:ddl}tjj|jd�}|tjd<tjj|�t�dS)Nr)	r�r>r?r#�__path__r��argvr�r5)r�Zargv0r;r;r<�	bootstrap�s

rvc
s�ddlm}ddlm�G�fdd�d��}|dkrBtjdd�}t��0|fddd	g|tjdpfd|d
�|��WdQRXdS)Nr)�setup)r(cseZdZdZ�fdd�ZdS)z-main.<locals>.DistributionWithoutHelpCommandsr�c
s(t���j|f|�|�WdQRXdS)N)�_patch_usage�
_show_help)r�r��kw)r(r;r<ry	sz8main.<locals>.DistributionWithoutHelpCommands._show_helpN)r�r�r�Zcommon_usageryr;)r(r;r<�DistributionWithoutHelpCommands�sr{rz-qr2z-v)Zscript_argsr-Z	distclass)r�rwZsetuptools.distr(r�rurx)rurzrwr{r;)r(r<r5�sc#sLddl}tjd�j���fdd�}|jj}||j_z
dVWd||j_XdS)Nrze
        usage: %(script)s [options] requirement_or_url ...
           or: %(script)s --help
        cs�ttjj|�d�S)N)Zscript)r�r>r?r))r-)�USAGEr;r<�	gen_usage	sz_patch_usage.<locals>.gen_usage)Zdistutils.corerKrLr�Zcorer})rr}Zsavedr;)r|r<rx	s

rx)N)r&)N)�r�rr�rrrZdistutils.errorsrrrr	Zdistutils.command.installr
rrrr
Zdistutils.command.build_scriptsrr(rr�r>rsrQr6r�r
r�r
rKrJr�r9r�r.r:r�Zsetuptools.externrZsetuptools.extern.six.movesrrr�rZsetuptools.sandboxrZsetuptools.py31compatrrZsetuptools.py27compatrZsetuptools.commandrZsetuptools.archive_utilrZsetuptools.package_indexrrrrrZsetuptools.wheelrr$r r!r"r#r$r%r&r'r(r)r*r+r,r-r.Zpkg_resources.py31compat�filterwarningsZ
PEP440Warning�__all__r=r1rnrErHr"r2rr�r�r4r6r3rr�rrrryrr rr�builtin_module_namesrr*r-r0r\rdr2�ImportErrorr�r4r6Zsys_executablerI�objectrXrLr^rQrMrjrkr�rcrvr5r�rxr;r;r;r<�<module>s�D
|A))'lR
 


T`A 

site-packages/setuptools/command/__pycache__/bdist_rpm.cpython-36.opt-1.pyc000064400000003244147511334640022716 0ustar003

[binary bytecode: setuptools.command.bdist_rpm; recoverable class docstring:

    Override the default bdist_rpm behavior to do the following:

    1. Run egg_info to ensure the name and version are properly calculated.
    2. Always run 'install' using --single-version-externally-managed to
       disable eggs in RPM distributions.
    3. Replace dash with underscore in the version numbers for better RPM
       compatibility.

The remaining content is unreadable compiled data.]
__module__�__qualname__�__doc__rrrrrrrs	r)Zdistutils.command.bdist_rpmZcommandrrrrrr�<module>ssite-packages/setuptools/command/install_lib.py000064400000007400147511334640016002 0ustar00import os
import imp
from itertools import product, starmap
import distutils.command.install_lib as orig


class install_lib(orig.install_lib):
    """Don't add compiled flags to filenames of non-Python files"""

    def run(self):
        self.build()
        outfiles = self.install()
        if outfiles is not None:
            # always compile, in case we have any extension stubs to deal with
            self.byte_compile(outfiles)

    def get_exclusions(self):
        """
        Return a collections.Sized collections.Container of paths to be
        excluded for single_version_externally_managed installations.
        """
        all_packages = (
            pkg
            for ns_pkg in self._get_SVEM_NSPs()
            for pkg in self._all_packages(ns_pkg)
        )

        excl_specs = product(all_packages, self._gen_exclusion_paths())
        return set(starmap(self._exclude_pkg_path, excl_specs))

    def _exclude_pkg_path(self, pkg, exclusion_path):
        """
        Given a package name and exclusion path within that package,
        compute the full exclusion path.
        """
        parts = pkg.split('.') + [exclusion_path]
        return os.path.join(self.install_dir, *parts)
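    # Illustrative sketch, not part of the original source: assuming `cmd` is an
    # install_lib instance with install_dir set to '/site-packages' (hypothetical
    # values), the namespace package 'foo.bar' combines with '__init__.py' as:
    #
    #   cmd._exclude_pkg_path('foo.bar', '__init__.py')
    #   -> '/site-packages/foo/bar/__init__.py'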

    @staticmethod
    def _all_packages(pkg_name):
        """
        >>> list(install_lib._all_packages('foo.bar.baz'))
        ['foo.bar.baz', 'foo.bar', 'foo']
        """
        while pkg_name:
            yield pkg_name
            pkg_name, sep, child = pkg_name.rpartition('.')

    def _get_SVEM_NSPs(self):
        """
        Get namespace packages (list) but only for
        single_version_externally_managed installations and empty otherwise.
        """
        # TODO: is it necessary to short-circuit here? i.e. what's the cost
        # if get_finalized_command is called even when namespace_packages is
        # False?
        if not self.distribution.namespace_packages:
            return []

        install_cmd = self.get_finalized_command('install')
        svem = install_cmd.single_version_externally_managed

        return self.distribution.namespace_packages if svem else []

    @staticmethod
    def _gen_exclusion_paths():
        """
        Generate file paths to be excluded for namespace packages (bytecode
        cache files).
        """
        # always exclude the package module itself
        yield '__init__.py'

        yield '__init__.pyc'
        yield '__init__.pyo'

        if not hasattr(imp, 'get_tag'):
            return

        base = os.path.join('__pycache__', '__init__.' + imp.get_tag())
        yield base + '.pyc'
        yield base + '.pyo'
        yield base + '.opt-1.pyc'
        yield base + '.opt-2.pyc'

    def copy_tree(
            self, infile, outfile,
            preserve_mode=1, preserve_times=1, preserve_symlinks=0, level=1
    ):
        assert preserve_mode and preserve_times and not preserve_symlinks
        exclude = self.get_exclusions()

        if not exclude:
            return orig.install_lib.copy_tree(self, infile, outfile)

        # Exclude namespace package __init__.py* files from the output

        from setuptools.archive_util import unpack_directory
        from distutils import log

        outfiles = []

        def pf(src, dst):
            if dst in exclude:
                log.warn("Skipping installation of %s (namespace package)",
                         dst)
                return False

            log.info("copying %s -> %s", src, os.path.dirname(dst))
            outfiles.append(dst)
            return dst

        unpack_directory(infile, outfile, pf)
        return outfiles

    def get_outputs(self):
        outputs = orig.install_lib.get_outputs(self)
        exclude = self.get_exclusions()
        if exclude:
            return [f for f in outputs if f not in exclude]
        return outputs
site-packages/setuptools/command/alias.py000064400000004572147511334640014606 0ustar00from distutils.errors import DistutilsOptionError

from setuptools.extern.six.moves import map

from setuptools.command.setopt import edit_config, option_base, config_file


def shquote(arg):
    """Quote an argument for later parsing by shlex.split()"""
    for c in '"', "'", "\\", "#":
        if c in arg:
            return repr(arg)
    if arg.split() != [arg]:
        return repr(arg)
    return arg
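# Behaviour sketch, not part of the original source: arguments that
# shlex.split() would already read back as a single token pass through
# unchanged; anything else is repr()-quoted so it survives a round trip.
#
#   >>> shquote('pytest')
#   'pytest'
#   >>> shquote('run some tests')
#   "'run some tests'"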


class alias(option_base):
    """Define a shortcut that invokes one or more commands"""

    description = "define a shortcut to invoke one or more commands"
    command_consumes_arguments = True

    user_options = [
        ('remove', 'r', 'remove (unset) the alias'),
    ] + option_base.user_options

    boolean_options = option_base.boolean_options + ['remove']

    def initialize_options(self):
        option_base.initialize_options(self)
        self.args = None
        self.remove = None

    def finalize_options(self):
        option_base.finalize_options(self)
        if self.remove and len(self.args) != 1:
            raise DistutilsOptionError(
                "Must specify exactly one argument (the alias name) when "
                "using --remove"
            )

    def run(self):
        aliases = self.distribution.get_option_dict('aliases')

        if not self.args:
            print("Command Aliases")
            print("---------------")
            for alias in aliases:
                print("setup.py alias", format_alias(alias, aliases))
            return

        elif len(self.args) == 1:
            alias, = self.args
            if self.remove:
                command = None
            elif alias in aliases:
                print("setup.py alias", format_alias(alias, aliases))
                return
            else:
                print("No alias definition found for %r" % alias)
                return
        else:
            alias = self.args[0]
            command = ' '.join(map(shquote, self.args[1:]))

        edit_config(self.filename, {'aliases': {alias: command}}, self.dry_run)


def format_alias(name, aliases):
    source, command = aliases[name]
    if source == config_file('global'):
        source = '--global-config '
    elif source == config_file('user'):
        source = '--user-config '
    elif source == config_file('local'):
        source = ''
    else:
        source = '--filename=%r' % source
    return source + name + ' ' + command
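# Usage sketch (illustrative, not part of the original file): defining an alias
# and then invoking it might look like
#
#   python setup.py alias daily egg_info sdist
#   python setup.py daily
#
# The first call stores 'daily = egg_info sdist' under the [aliases] section of
# the chosen config file (the local setup.cfg by default); the second expands
# the alias before running the listed commands.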
site-packages/setuptools/command/rotate.py000064400000004164147511334640015010 0ustar00from distutils.util import convert_path
from distutils import log
from distutils.errors import DistutilsOptionError
import os
import shutil

from setuptools.extern import six

from setuptools import Command


class rotate(Command):
    """Delete older distributions"""

    description = "delete older distributions, keeping N newest files"
    user_options = [
        ('match=', 'm', "patterns to match (required)"),
        ('dist-dir=', 'd', "directory where the distributions are"),
        ('keep=', 'k', "number of matching distributions to keep"),
    ]

    boolean_options = []

    def initialize_options(self):
        self.match = None
        self.dist_dir = None
        self.keep = None

    def finalize_options(self):
        if self.match is None:
            raise DistutilsOptionError(
                "Must specify one or more (comma-separated) match patterns "
                "(e.g. '.zip' or '.egg')"
            )
        if self.keep is None:
            raise DistutilsOptionError("Must specify number of files to keep")
        try:
            self.keep = int(self.keep)
        except ValueError:
            raise DistutilsOptionError("--keep must be an integer")
        if isinstance(self.match, six.string_types):
            self.match = [
                convert_path(p.strip()) for p in self.match.split(',')
            ]
        self.set_undefined_options('bdist', ('dist_dir', 'dist_dir'))

    def run(self):
        self.run_command("egg_info")
        from glob import glob

        for pattern in self.match:
            pattern = self.distribution.get_name() + '*' + pattern
            files = glob(os.path.join(self.dist_dir, pattern))
            files = [(os.path.getmtime(f), f) for f in files]
            files.sort()
            files.reverse()

            log.info("%d file(s) matching %s", len(files), pattern)
            files = files[self.keep:]
            for (t, f) in files:
                log.info("Deleting %s", f)
                if not self.dry_run:
                    if os.path.isdir(f):
                        shutil.rmtree(f)
                    else:
                        os.unlink(f)
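    # Usage sketch (illustrative, not part of the original file): keep only the
    # two newest .egg files for this project in the bdist dist directory,
    # deleting any older ones:
    #
    #   python setup.py rotate --match=.egg --keep=2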
site-packages/setuptools/command/egg_info.py000064400000060340147511334640015265 0ustar00"""setuptools.command.egg_info

Create a distribution's .egg-info directory and contents"""

from distutils.filelist import FileList as _FileList
from distutils.errors import DistutilsInternalError
from distutils.util import convert_path
from distutils import log
import distutils.errors
import distutils.filelist
import os
import re
import sys
import io
import warnings
import time
import collections

from setuptools.extern import six
from setuptools.extern.six.moves import map

from setuptools import Command
from setuptools.command.sdist import sdist
from setuptools.command.sdist import walk_revctrl
from setuptools.command.setopt import edit_config
from setuptools.command import bdist_egg
from pkg_resources import (
    parse_requirements, safe_name, parse_version,
    safe_version, yield_lines, EntryPoint, iter_entry_points, to_filename)
import setuptools.unicode_utils as unicode_utils
from setuptools.glob import glob

from setuptools.extern import packaging


def translate_pattern(glob):
    """
    Translate a file path glob like '*.txt' into a regular expression.
    This differs from fnmatch.translate which allows wildcards to match
    directory separators. It also knows about '**/' which matches any number of
    directories.
    """
    pat = ''

    # This will split on '/' within [character classes]. This is deliberate.
    chunks = glob.split(os.path.sep)

    sep = re.escape(os.sep)
    valid_char = '[^%s]' % (sep,)

    for c, chunk in enumerate(chunks):
        last_chunk = c == len(chunks) - 1

        # Chunks that are a literal ** are globstars. They match anything.
        if chunk == '**':
            if last_chunk:
                # Match anything if this is the last component
                pat += '.*'
            else:
                # Match '(name/)*'
                pat += '(?:%s+%s)*' % (valid_char, sep)
            continue  # The whole path component has been handled; move on to the next chunk

        # Find any special characters in the remainder
        i = 0
        chunk_len = len(chunk)
        while i < chunk_len:
            char = chunk[i]
            if char == '*':
                # Match any number of name characters
                pat += valid_char + '*'
            elif char == '?':
                # Match a name character
                pat += valid_char
            elif char == '[':
                # Character class
                inner_i = i + 1
                # Skip initial !/] chars
                if inner_i < chunk_len and chunk[inner_i] == '!':
                    inner_i = inner_i + 1
                if inner_i < chunk_len and chunk[inner_i] == ']':
                    inner_i = inner_i + 1

                # Loop till the closing ] is found
                while inner_i < chunk_len and chunk[inner_i] != ']':
                    inner_i = inner_i + 1

                if inner_i >= chunk_len:
                    # Got to the end of the string without finding a closing ]
                    # Do not treat this as a matching group, but as a literal [
                    pat += re.escape(char)
                else:
                    # Grab the insides of the [brackets]
                    inner = chunk[i + 1:inner_i]
                    char_class = ''

                    # Class negation
                    if inner[0] == '!':
                        char_class = '^'
                        inner = inner[1:]

                    char_class += re.escape(inner)
                    pat += '[%s]' % (char_class,)

                    # Skip to the end ]
                    i = inner_i
            else:
                pat += re.escape(char)
            i += 1

        # Join each chunk with the dir separator
        if not last_chunk:
            pat += sep

    pat += r'\Z'
    return re.compile(pat, flags=re.MULTILINE|re.DOTALL)
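# Illustrative behaviour (added sketch, not part of the original module),
# assuming a POSIX path separator:
#
#   translate_pattern('*.txt')    matches 'notes.txt' but not 'docs/notes.txt'
#   translate_pattern('**/*.txt') matches 'notes.txt', 'docs/notes.txt' and
#                                 'docs/sub/notes.txt'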


class egg_info(Command):
    description = "create a distribution's .egg-info directory"

    user_options = [
        ('egg-base=', 'e', "directory containing .egg-info directories"
                           " (default: top of the source tree)"),
        ('tag-date', 'd', "Add date stamp (e.g. 20050528) to version number"),
        ('tag-build=', 'b', "Specify explicit tag to add to version number"),
        ('no-date', 'D', "Don't include date stamp [default]"),
    ]

    boolean_options = ['tag-date']
    negative_opt = {
        'no-date': 'tag-date',
    }

    def initialize_options(self):
        self.egg_name = None
        self.egg_version = None
        self.egg_base = None
        self.egg_info = None
        self.tag_build = None
        self.tag_date = 0
        self.broken_egg_info = False
        self.vtags = None

    ####################################
    # allow the 'tag_svn_revision' to be detected and
    # set, supporting sdists built on older Setuptools.
    @property
    def tag_svn_revision(self):
        pass

    @tag_svn_revision.setter
    def tag_svn_revision(self, value):
        pass
    ####################################

    def save_version_info(self, filename):
        """
        Materialize the value of date into the
        build tag. Install build keys in a deterministic order
        to avoid arbitrary reordering on subsequent builds.
        """
        egg_info = collections.OrderedDict()
        # follow the order these keys would have been added
        # when PYTHONHASHSEED=0
        egg_info['tag_build'] = self.tags()
        egg_info['tag_date'] = 0
        edit_config(filename, dict(egg_info=egg_info))

    def finalize_options(self):
        self.egg_name = safe_name(self.distribution.get_name())
        self.vtags = self.tags()
        self.egg_version = self.tagged_version()

        parsed_version = parse_version(self.egg_version)

        try:
            is_version = isinstance(parsed_version, packaging.version.Version)
            spec = (
                "%s==%s" if is_version else "%s===%s"
            )
            list(
                parse_requirements(spec % (self.egg_name, self.egg_version))
            )
        except ValueError:
            raise distutils.errors.DistutilsOptionError(
                "Invalid distribution name or version syntax: %s-%s" %
                (self.egg_name, self.egg_version)
            )

        if self.egg_base is None:
            dirs = self.distribution.package_dir
            self.egg_base = (dirs or {}).get('', os.curdir)

        self.ensure_dirname('egg_base')
        self.egg_info = to_filename(self.egg_name) + '.egg-info'
        if self.egg_base != os.curdir:
            self.egg_info = os.path.join(self.egg_base, self.egg_info)
        if '-' in self.egg_name:
            self.check_broken_egg_info()

        # Set package version for the benefit of dumber commands
        # (e.g. sdist, bdist_wininst, etc.)
        #
        self.distribution.metadata.version = self.egg_version

        # If we bootstrapped around the lack of a PKG-INFO, as might be the
        # case in a fresh checkout, make sure that any special tags get added
        # to the version info
        #
        pd = self.distribution._patched_dist
        if pd is not None and pd.key == self.egg_name.lower():
            pd._version = self.egg_version
            pd._parsed_version = parse_version(self.egg_version)
            self.distribution._patched_dist = None

    def write_or_delete_file(self, what, filename, data, force=False):
        """Write `data` to `filename` or delete if empty

        If `data` is non-empty, this routine is the same as ``write_file()``.
        If `data` is empty but not ``None``, this is the same as calling
        ``delete_file(filename)``.  If `data` is ``None``, then this is a no-op
        unless `filename` exists, in which case a warning is issued about the
        orphaned file (if `force` is false), or the file is deleted (if `force`
        is true).
        """
        if data:
            self.write_file(what, filename, data)
        elif os.path.exists(filename):
            if data is None and not force:
                log.warn(
                    "%s not set in setup(), but %s exists", what, filename
                )
                return
            else:
                self.delete_file(filename)

    def write_file(self, what, filename, data):
        """Write `data` to `filename` (if not a dry run) after announcing it

        `what` is used in a log message to identify what is being written
        to the file.
        """
        log.info("writing %s to %s", what, filename)
        if six.PY3:
            data = data.encode("utf-8")
        if not self.dry_run:
            f = open(filename, 'wb')
            f.write(data)
            f.close()

    def delete_file(self, filename):
        """Delete `filename` (if not a dry run) after announcing it"""
        log.info("deleting %s", filename)
        if not self.dry_run:
            os.unlink(filename)

    def tagged_version(self):
        version = self.distribution.get_version()
        # egg_info may be called more than once for a distribution,
        # in which case the version string already contains all tags.
        if self.vtags and version.endswith(self.vtags):
            return safe_version(version)
        return safe_version(version + self.vtags)

    def run(self):
        self.mkpath(self.egg_info)
        installer = self.distribution.fetch_build_egg
        for ep in iter_entry_points('egg_info.writers'):
            ep.require(installer=installer)
            writer = ep.resolve()
            writer(self, ep.name, os.path.join(self.egg_info, ep.name))

        # Get rid of native_libs.txt if it was put there by older bdist_egg
        nl = os.path.join(self.egg_info, "native_libs.txt")
        if os.path.exists(nl):
            self.delete_file(nl)

        self.find_sources()

    def tags(self):
        version = ''
        if self.tag_build:
            version += self.tag_build
        if self.tag_date:
            version += time.strftime("-%Y%m%d")
        return version

    def find_sources(self):
        """Generate SOURCES.txt manifest file"""
        manifest_filename = os.path.join(self.egg_info, "SOURCES.txt")
        mm = manifest_maker(self.distribution)
        mm.manifest = manifest_filename
        mm.run()
        self.filelist = mm.filelist

    def check_broken_egg_info(self):
        bei = self.egg_name + '.egg-info'
        if self.egg_base != os.curdir:
            bei = os.path.join(self.egg_base, bei)
        if os.path.exists(bei):
            log.warn(
                "-" * 78 + '\n'
                "Note: Your current .egg-info directory has a '-' in its name;"
                '\nthis will not work correctly with "setup.py develop".\n\n'
                'Please rename %s to %s to correct this problem.\n' + '-' * 78,
                bei, self.egg_info
            )
            self.broken_egg_info = self.egg_info
            self.egg_info = bei  # make it work for now
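

# Illustrative sketch (not part of setuptools): how the tags() method of the
# egg_info command above composes the suffix that tagged_version() appends.
# The option values and base version are hypothetical.
if __name__ == '__main__':  # pragma: no cover
    _tag_build, _tag_date = '.dev', 1
    _suffix = (_tag_build or '') + (time.strftime("-%Y%m%d") if _tag_date else '')
    print(_suffix)                        # e.g. '.dev-20170601'
    print(safe_version('1.0' + _suffix))  # normalized PEP 440 form of the tagged version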


class FileList(_FileList):
    # Implementations of the various MANIFEST.in commands

    def process_template_line(self, line):
        # Parse the line: split it up, make sure the right number of words
        # is there, and return the relevant words.  'action' is always
        # defined: it's the first word of the line.  Which of the other
        # three are defined depends on the action; it'll be either
        # patterns, (dir and patterns), or (dir_pattern).
        (action, patterns, dir, dir_pattern) = self._parse_template_line(line)

        # OK, now we know that the action is valid and we have the
        # right number of words on the line for that action -- so we
        # can proceed with minimal error-checking.
        if action == 'include':
            self.debug_print("include " + ' '.join(patterns))
            for pattern in patterns:
                if not self.include(pattern):
                    log.warn("warning: no files found matching '%s'", pattern)

        elif action == 'exclude':
            self.debug_print("exclude " + ' '.join(patterns))
            for pattern in patterns:
                if not self.exclude(pattern):
                    log.warn(("warning: no previously-included files "
                              "found matching '%s'"), pattern)

        elif action == 'global-include':
            self.debug_print("global-include " + ' '.join(patterns))
            for pattern in patterns:
                if not self.global_include(pattern):
                    log.warn(("warning: no files found matching '%s' "
                              "anywhere in distribution"), pattern)

        elif action == 'global-exclude':
            self.debug_print("global-exclude " + ' '.join(patterns))
            for pattern in patterns:
                if not self.global_exclude(pattern):
                    log.warn(("warning: no previously-included files matching "
                              "'%s' found anywhere in distribution"),
                             pattern)

        elif action == 'recursive-include':
            self.debug_print("recursive-include %s %s" %
                             (dir, ' '.join(patterns)))
            for pattern in patterns:
                if not self.recursive_include(dir, pattern):
                    log.warn(("warning: no files found matching '%s' "
                              "under directory '%s'"),
                             pattern, dir)

        elif action == 'recursive-exclude':
            self.debug_print("recursive-exclude %s %s" %
                             (dir, ' '.join(patterns)))
            for pattern in patterns:
                if not self.recursive_exclude(dir, pattern):
                    log.warn(("warning: no previously-included files matching "
                              "'%s' found under directory '%s'"),
                             pattern, dir)

        elif action == 'graft':
            self.debug_print("graft " + dir_pattern)
            if not self.graft(dir_pattern):
                log.warn("warning: no directories found matching '%s'",
                         dir_pattern)

        elif action == 'prune':
            self.debug_print("prune " + dir_pattern)
            if not self.prune(dir_pattern):
                log.warn(("no previously-included directories found "
                          "matching '%s'"), dir_pattern)

        else:
            raise DistutilsInternalError(
                "this cannot happen: invalid action '%s'" % action)

    def _remove_files(self, predicate):
        """
        Remove all files from the file list that match the predicate.
        Return True if any matching files were removed
        """
        found = False
        for i in range(len(self.files) - 1, -1, -1):
            if predicate(self.files[i]):
                self.debug_print(" removing " + self.files[i])
                del self.files[i]
                found = True
        return found

    def include(self, pattern):
        """Include files that match 'pattern'."""
        found = [f for f in glob(pattern) if not os.path.isdir(f)]
        self.extend(found)
        return bool(found)

    def exclude(self, pattern):
        """Exclude files that match 'pattern'."""
        match = translate_pattern(pattern)
        return self._remove_files(match.match)

    def recursive_include(self, dir, pattern):
        """
        Include all files anywhere in 'dir/' that match the pattern.
        """
        full_pattern = os.path.join(dir, '**', pattern)
        found = [f for f in glob(full_pattern, recursive=True)
                 if not os.path.isdir(f)]
        self.extend(found)
        return bool(found)

    def recursive_exclude(self, dir, pattern):
        """
        Exclude any file anywhere in 'dir/' that matches the pattern.
        """
        match = translate_pattern(os.path.join(dir, '**', pattern))
        return self._remove_files(match.match)

    def graft(self, dir):
        """Include all files from 'dir/'."""
        found = [
            item
            for match_dir in glob(dir)
            for item in distutils.filelist.findall(match_dir)
        ]
        self.extend(found)
        return bool(found)

    def prune(self, dir):
        """Filter out files from 'dir/'."""
        match = translate_pattern(os.path.join(dir, '**'))
        return self._remove_files(match.match)

    def global_include(self, pattern):
        """
        Include all files anywhere in the current directory that match the
        pattern. This is very inefficient on large file trees.
        """
        if self.allfiles is None:
            self.findall()
        match = translate_pattern(os.path.join('**', pattern))
        found = [f for f in self.allfiles if match.match(f)]
        self.extend(found)
        return bool(found)

    def global_exclude(self, pattern):
        """
        Exclude all files anywhere that match the pattern.
        """
        match = translate_pattern(os.path.join('**', pattern))
        return self._remove_files(match.match)

    def append(self, item):
        if item.endswith('\r'):  # Fix older sdists built on Windows
            item = item[:-1]
        path = convert_path(item)

        if self._safe_path(path):
            self.files.append(path)

    def extend(self, paths):
        self.files.extend(filter(self._safe_path, paths))

    def _repair(self):
        """
        Replace self.files with only safe paths

        Because some owners of FileList manipulate the underlying
        ``files`` attribute directly, this method must be called to
        repair those paths.
        """
        self.files = list(filter(self._safe_path, self.files))

    def _safe_path(self, path):
        enc_warn = "'%s' not %s encodable -- skipping"

        # To avoid accidental transcoding errors, convert the path to unicode first
        u_path = unicode_utils.filesys_decode(path)
        if u_path is None:
            log.warn("'%s' in unexpected encoding -- skipping" % path)
            return False

        # Must ensure utf-8 encodability
        utf8_path = unicode_utils.try_encode(u_path, "utf-8")
        if utf8_path is None:
            log.warn(enc_warn, path, 'utf-8')
            return False

        try:
            # accept the path if either form checks out on the filesystem
            if os.path.exists(u_path) or os.path.exists(utf8_path):
                return True
        # catch any UnicodeEncodeError raised while encoding u_path for the filesystem
        except UnicodeEncodeError:
            log.warn(enc_warn, path, sys.getfilesystemencoding())
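

# Illustrative sketch (not part of setuptools): how a MANIFEST.in line such as
# 'recursive-exclude docs/build *' maps onto the FileList methods above.  The
# file names are hypothetical, nothing is read from disk, and POSIX-style
# separators are assumed.
if __name__ == '__main__':  # pragma: no cover
    _fl = FileList()
    _fl.files = ['README.rst', 'docs/conf.py', 'docs/build/html/index.html']
    _fl.recursive_exclude('docs/build', '*')
    print(_fl.files)  # ['README.rst', 'docs/conf.py']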


class manifest_maker(sdist):
    template = "MANIFEST.in"

    def initialize_options(self):
        self.use_defaults = 1
        self.prune = 1
        self.manifest_only = 1
        self.force_manifest = 1

    def finalize_options(self):
        pass

    def run(self):
        self.filelist = FileList()
        if not os.path.exists(self.manifest):
            self.write_manifest()  # it must exist so it'll get in the list
        self.add_defaults()
        if os.path.exists(self.template):
            self.read_template()
        self.prune_file_list()
        self.filelist.sort()
        self.filelist.remove_duplicates()
        self.write_manifest()

    def _manifest_normalize(self, path):
        path = unicode_utils.filesys_decode(path)
        return path.replace(os.sep, '/')

    def write_manifest(self):
        """
        Write the file list in 'self.filelist' to the manifest file
        named by 'self.manifest'.
        """
        self.filelist._repair()

        # _repair() has now ensured the paths are encodable, but they may not be unicode
        files = [self._manifest_normalize(f) for f in self.filelist.files]
        msg = "writing manifest file '%s'" % self.manifest
        self.execute(write_file, (self.manifest, files), msg)

    def warn(self, msg):
        if not self._should_suppress_warning(msg):
            sdist.warn(self, msg)

    @staticmethod
    def _should_suppress_warning(msg):
        """
        suppress missing-file warnings from sdist
        """
        return re.match(r"standard file .*not found", msg)

    def add_defaults(self):
        sdist.add_defaults(self)
        self.filelist.append(self.template)
        self.filelist.append(self.manifest)
        rcfiles = list(walk_revctrl())
        if rcfiles:
            self.filelist.extend(rcfiles)
        elif os.path.exists(self.manifest):
            self.read_manifest()
        ei_cmd = self.get_finalized_command('egg_info')
        self.filelist.graft(ei_cmd.egg_info)

    def prune_file_list(self):
        build = self.get_finalized_command('build')
        base_dir = self.distribution.get_fullname()
        self.filelist.prune(build.build_base)
        self.filelist.prune(base_dir)
        sep = re.escape(os.sep)
        self.filelist.exclude_pattern(r'(^|' + sep + r')(RCS|CVS|\.svn)' + sep,
                                      is_regex=1)


def write_file(filename, contents):
    """Create a file with the specified name and write 'contents' (a
    sequence of strings without line terminators) to it.
    """
    contents = "\n".join(contents)

    # assuming the contents has been vetted for utf-8 encoding
    contents = contents.encode("utf-8")

    with open(filename, "wb") as f:  # always write POSIX-style manifest
        f.write(contents)
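

# Illustrative sketch (not part of setuptools): write_file() joins the entries
# with newlines and writes UTF-8 bytes.  The temporary path is hypothetical.
if __name__ == '__main__':  # pragma: no cover
    import tempfile
    _manifest = os.path.join(tempfile.mkdtemp(), 'SOURCES.txt')
    write_file(_manifest, ['setup.py', 'README.rst'])
    with open(_manifest, 'rb') as _f:
        print(_f.read())  # b'setup.py\nREADME.rst'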


def write_pkg_info(cmd, basename, filename):
    log.info("writing %s", filename)
    if not cmd.dry_run:
        metadata = cmd.distribution.metadata
        metadata.version, oldver = cmd.egg_version, metadata.version
        metadata.name, oldname = cmd.egg_name, metadata.name

        try:
            # write unescaped data to PKG-INFO, so older pkg_resources
            # can still parse it
            metadata.write_pkg_info(cmd.egg_info)
        finally:
            metadata.name, metadata.version = oldname, oldver

        safe = getattr(cmd.distribution, 'zip_safe', None)

        bdist_egg.write_safety_flag(cmd.egg_info, safe)


def warn_depends_obsolete(cmd, basename, filename):
    if os.path.exists(filename):
        log.warn(
            "WARNING: 'depends.txt' is not used by setuptools 0.6!\n"
            "Use the install_requires/extras_require setup() args instead."
        )


def _write_requirements(stream, reqs):
    lines = yield_lines(reqs or ())
    append_cr = lambda line: line + '\n'
    lines = map(append_cr, lines)
    stream.writelines(lines)


def write_requirements(cmd, basename, filename):
    dist = cmd.distribution
    data = six.StringIO()
    _write_requirements(data, dist.install_requires)
    extras_require = dist.extras_require or {}
    for extra in sorted(extras_require):
        data.write('\n[{extra}]\n'.format(**vars()))
        _write_requirements(data, extras_require[extra])
    cmd.write_or_delete_file("requirements", filename, data.getvalue())


def write_setup_requirements(cmd, basename, filename):
    data = io.StringIO()
    _write_requirements(data, cmd.distribution.setup_requires)
    cmd.write_or_delete_file("setup-requirements", filename, data.getvalue())


def write_toplevel_names(cmd, basename, filename):
    pkgs = dict.fromkeys(
        [
            k.split('.', 1)[0]
            for k in cmd.distribution.iter_distribution_names()
        ]
    )
    cmd.write_file("top-level names", filename, '\n'.join(sorted(pkgs)) + '\n')


def overwrite_arg(cmd, basename, filename):
    write_arg(cmd, basename, filename, True)


def write_arg(cmd, basename, filename, force=False):
    argname = os.path.splitext(basename)[0]
    value = getattr(cmd.distribution, argname, None)
    if value is not None:
        value = '\n'.join(value) + '\n'
    cmd.write_or_delete_file(argname, filename, value, force)


def write_entries(cmd, basename, filename):
    ep = cmd.distribution.entry_points

    if isinstance(ep, six.string_types) or ep is None:
        data = ep
    else:
        data = []
        for section, contents in sorted(ep.items()):
            if not isinstance(contents, six.string_types):
                contents = EntryPoint.parse_group(section, contents)
                contents = '\n'.join(sorted(map(str, contents.values())))
            data.append('[%s]\n%s\n\n' % (section, contents))
        data = ''.join(data)

    cmd.write_or_delete_file('entry points', filename, data, True)
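

# Illustrative sketch (not part of setuptools): how a dict-style entry_points
# declaration is rendered by write_entries() above via EntryPoint.parse_group().
# The script and module names are hypothetical.
if __name__ == '__main__':  # pragma: no cover
    _group = EntryPoint.parse_group('console_scripts', ['mytool = mypkg.cli:main'])
    print('[console_scripts]')
    print('\n'.join(sorted(map(str, _group.values()))))  # mytool = mypkg.cli:main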


def get_pkg_info_revision():
    """
    Get a -r### off of PKG-INFO Version in case this is an sdist of
    a subversion revision.
    """
    warnings.warn("get_pkg_info_revision is deprecated.", DeprecationWarning)
    if os.path.exists('PKG-INFO'):
        with io.open('PKG-INFO') as f:
            for line in f:
                match = re.match(r"Version:.*-r(\d+)\s*$", line)
                if match:
                    return int(match.group(1))
    return 0
site-packages/setuptools/command/upload_docs.py
# -*- coding: utf-8 -*-
"""upload_docs

Implements a Distutils 'upload_docs' subcommand (upload documentation to
PyPI's pythonhosted.org).
"""

from base64 import standard_b64encode
from distutils import log
from distutils.errors import DistutilsOptionError
import os
import socket
import zipfile
import tempfile
import shutil
import itertools
import functools

from setuptools.extern import six
from setuptools.extern.six.moves import http_client, urllib

from pkg_resources import iter_entry_points
from .upload import upload


def _encode(s):
    errors = 'surrogateescape' if six.PY3 else 'strict'
    return s.encode('utf-8', errors)


class upload_docs(upload):
    # override the default repository as upload_docs isn't
    # supported by Warehouse (and won't be).
    DEFAULT_REPOSITORY = 'https://pypi.python.org/pypi/'

    description = 'Upload documentation to PyPI'

    user_options = [
        ('repository=', 'r',
         "url of repository [default: %s]" % upload.DEFAULT_REPOSITORY),
        ('show-response', None,
         'display full response text from server'),
        ('upload-dir=', None, 'directory to upload'),
    ]
    boolean_options = upload.boolean_options

    def has_sphinx(self):
        if self.upload_dir is None:
            for ep in iter_entry_points('distutils.commands', 'build_sphinx'):
                return True

    sub_commands = [('build_sphinx', has_sphinx)]

    def initialize_options(self):
        upload.initialize_options(self)
        self.upload_dir = None
        self.target_dir = None

    def finalize_options(self):
        upload.finalize_options(self)
        if self.upload_dir is None:
            if self.has_sphinx():
                build_sphinx = self.get_finalized_command('build_sphinx')
                self.target_dir = build_sphinx.builder_target_dir
            else:
                build = self.get_finalized_command('build')
                self.target_dir = os.path.join(build.build_base, 'docs')
        else:
            self.ensure_dirname('upload_dir')
            self.target_dir = self.upload_dir
        if 'pypi.python.org' in self.repository:
            log.warn("Upload_docs command is deprecated. Use RTD instead.")
        self.announce('Using upload directory %s' % self.target_dir)

    def create_zipfile(self, filename):
        zip_file = zipfile.ZipFile(filename, "w")
        try:
            self.mkpath(self.target_dir)  # just in case
            for root, dirs, files in os.walk(self.target_dir):
                if root == self.target_dir and not files:
                    tmpl = "no files found in upload directory '%s'"
                    raise DistutilsOptionError(tmpl % self.target_dir)
                for name in files:
                    full = os.path.join(root, name)
                    relative = root[len(self.target_dir):].lstrip(os.path.sep)
                    dest = os.path.join(relative, name)
                    zip_file.write(full, dest)
        finally:
            zip_file.close()

    def run(self):
        # Run sub commands
        for cmd_name in self.get_sub_commands():
            self.run_command(cmd_name)

        tmp_dir = tempfile.mkdtemp()
        name = self.distribution.metadata.get_name()
        zip_file = os.path.join(tmp_dir, "%s.zip" % name)
        try:
            self.create_zipfile(zip_file)
            self.upload_file(zip_file)
        finally:
            shutil.rmtree(tmp_dir)

    @staticmethod
    def _build_part(item, sep_boundary):
        key, values = item
        title = '\nContent-Disposition: form-data; name="%s"' % key
        # handle multiple entries for the same name
        if not isinstance(values, list):
            values = [values]
        for value in values:
            if isinstance(value, tuple):
                title += '; filename="%s"' % value[0]
                value = value[1]
            else:
                value = _encode(value)
            yield sep_boundary
            yield _encode(title)
            yield b"\n\n"
            yield value
            if value and value[-1:] == b'\r':
                yield b'\n'  # write an extra newline (lurve Macs)

    @classmethod
    def _build_multipart(cls, data):
        """
        Build up the MIME payload for the POST data
        """
        boundary = b'--------------GHSKFJDLGDS7543FJKLFHRE75642756743254'
        sep_boundary = b'\n--' + boundary
        end_boundary = sep_boundary + b'--'
        end_items = end_boundary, b"\n",
        builder = functools.partial(
            cls._build_part,
            sep_boundary=sep_boundary,
        )
        part_groups = map(builder, data.items())
        parts = itertools.chain.from_iterable(part_groups)
        body_items = itertools.chain(parts, end_items)
        content_type = 'multipart/form-data; boundary=%s' % boundary.decode('ascii')
        return b''.join(body_items), content_type

    def upload_file(self, filename):
        with open(filename, 'rb') as f:
            content = f.read()
        meta = self.distribution.metadata
        data = {
            ':action': 'doc_upload',
            'name': meta.get_name(),
            'content': (os.path.basename(filename), content),
        }
        # set up the authentication
        credentials = _encode(self.username + ':' + self.password)
        credentials = standard_b64encode(credentials)
        if six.PY3:
            credentials = credentials.decode('ascii')
        auth = "Basic " + credentials

        body, ct = self._build_multipart(data)

        msg = "Submitting documentation to %s" % (self.repository)
        self.announce(msg, log.INFO)

        # build the Request
        # We can't use urllib2 since we need to send the Basic
        # auth right with the first request
        schema, netloc, url, params, query, fragments = \
            urllib.parse.urlparse(self.repository)
        assert not params and not query and not fragments
        if schema == 'http':
            conn = http_client.HTTPConnection(netloc)
        elif schema == 'https':
            conn = http_client.HTTPSConnection(netloc)
        else:
            raise AssertionError("unsupported schema " + schema)

        data = ''
        try:
            conn.connect()
            conn.putrequest("POST", url)
            content_type = ct
            conn.putheader('Content-type', content_type)
            conn.putheader('Content-length', str(len(body)))
            conn.putheader('Authorization', auth)
            conn.endheaders()
            conn.send(body)
        except socket.error as e:
            self.announce(str(e), log.ERROR)
            return

        r = conn.getresponse()
        if r.status == 200:
            msg = 'Server response (%s): %s' % (r.status, r.reason)
            self.announce(msg, log.INFO)
        elif r.status == 301:
            location = r.getheader('Location')
            if location is None:
                location = 'https://pythonhosted.org/%s/' % meta.get_name()
            msg = 'Upload successful. Visit %s' % location
            self.announce(msg, log.INFO)
        else:
            msg = 'Upload failed (%s): %s' % (r.status, r.reason)
            self.announce(msg, log.ERROR)
        if self.show_response:
            print('-' * 75, r.read(), '-' * 75)
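

# Illustrative sketch (not part of setuptools): what _build_multipart() above
# returns for a tiny form.  The field values and the zip payload are
# hypothetical.
if __name__ == '__main__':  # pragma: no cover
    _body, _ct = upload_docs._build_multipart({
        ':action': 'doc_upload',
        'name': 'mypkg',
        'content': ('docs.zip', b'PK\x03\x04'),
    })
    print(_ct)         # multipart/form-data; boundary=--------------GHSK...
    print(len(_body))  # length in bytes of the encoded POST body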
site-packages/setuptools/command/sdist.py
from distutils import log
import distutils.command.sdist as orig
import os
import sys
import io
import contextlib

from setuptools.extern import six

from .py36compat import sdist_add_defaults

import pkg_resources

_default_revctrl = list


def walk_revctrl(dirname=''):
    """Find all files under revision control"""
    for ep in pkg_resources.iter_entry_points('setuptools.file_finders'):
        for item in ep.load()(dirname):
            yield item


class sdist(sdist_add_defaults, orig.sdist):
    """Smart sdist that finds anything supported by revision control"""

    user_options = [
        ('formats=', None,
         "formats for source distribution (comma-separated list)"),
        ('keep-temp', 'k',
         "keep the distribution tree around after creating " +
         "archive file(s)"),
        ('dist-dir=', 'd',
         "directory to put the source distribution archive(s) in "
         "[default: dist]"),
    ]

    negative_opt = {}

    README_EXTENSIONS = ['', '.rst', '.txt', '.md']
    READMES = tuple('README{0}'.format(ext) for ext in README_EXTENSIONS)

    def run(self):
        self.run_command('egg_info')
        ei_cmd = self.get_finalized_command('egg_info')
        self.filelist = ei_cmd.filelist
        self.filelist.append(os.path.join(ei_cmd.egg_info, 'SOURCES.txt'))
        self.check_readme()

        # Run sub commands
        for cmd_name in self.get_sub_commands():
            self.run_command(cmd_name)

        self.make_distribution()

        dist_files = getattr(self.distribution, 'dist_files', [])
        for file in self.archive_files:
            data = ('sdist', '', file)
            if data not in dist_files:
                dist_files.append(data)

    def initialize_options(self):
        orig.sdist.initialize_options(self)

        self._default_to_gztar()

    def _default_to_gztar(self):
        # only needed on Python prior to 3.6.
        if sys.version_info >= (3, 6, 0, 'beta', 1):
            return
        self.formats = ['gztar']

    def make_distribution(self):
        """
        Workaround for #516
        """
        with self._remove_os_link():
            orig.sdist.make_distribution(self)

    @staticmethod
    @contextlib.contextmanager
    def _remove_os_link():
        """
        In a context, remove and restore os.link if it exists
        """

        class NoValue:
            pass

        orig_val = getattr(os, 'link', NoValue)
        try:
            del os.link
        except Exception:
            pass
        try:
            yield
        finally:
            if orig_val is not NoValue:
                setattr(os, 'link', orig_val)

    def __read_template_hack(self):
        # This grody hack closes the template file (MANIFEST.in) if an
        #  exception occurs during read_template.
        # Doing so prevents an error when easy_install attempts to delete the
        #  file.
        try:
            orig.sdist.read_template(self)
        except Exception:
            _, _, tb = sys.exc_info()
            tb.tb_next.tb_frame.f_locals['template'].close()
            raise

    # Beginning with Python 2.7.2, 3.1.4, and 3.2.1, this leaky file handle
    #  has been fixed, so only override the method if we're using an earlier
    #  Python.
    has_leaky_handle = (
        sys.version_info < (2, 7, 2)
        or (3, 0) <= sys.version_info < (3, 1, 4)
        or (3, 2) <= sys.version_info < (3, 2, 1)
    )
    if has_leaky_handle:
        read_template = __read_template_hack

    def _add_defaults_python(self):
        """getting python files"""
        if self.distribution.has_pure_modules():
            build_py = self.get_finalized_command('build_py')
            self.filelist.extend(build_py.get_source_files())
            # This functionality is incompatible with include_package_data, and
            # will in fact create an infinite recursion if include_package_data
            # is True.  Use of include_package_data will imply that
            # distutils-style automatic handling of package_data is disabled
            if not self.distribution.include_package_data:
                for _, src_dir, _, filenames in build_py.data_files:
                    self.filelist.extend([os.path.join(src_dir, filename)
                                          for filename in filenames])

    def _add_defaults_data_files(self):
        try:
            if six.PY2:
                sdist_add_defaults._add_defaults_data_files(self)
            else:
                super()._add_defaults_data_files()
        except TypeError:
            log.warn("data_files contains unexpected objects")

    def check_readme(self):
        for f in self.READMES:
            if os.path.exists(f):
                return
        else:
            self.warn(
                "standard file not found: should have one of " +
                ', '.join(self.READMES)
            )

    def make_release_tree(self, base_dir, files):
        orig.sdist.make_release_tree(self, base_dir, files)

        # Save any egg_info command line options used to create this sdist
        dest = os.path.join(base_dir, 'setup.cfg')
        if hasattr(os, 'link') and os.path.exists(dest):
            # unlink and re-copy, since it might be hard-linked, and
            # we don't want to change the source version
            os.unlink(dest)
            self.copy_file('setup.cfg', dest)

        self.get_finalized_command('egg_info').save_version_info(dest)

    def _manifest_is_not_generated(self):
        # check for special comment used in 2.7.1 and higher
        if not os.path.isfile(self.manifest):
            return False

        with io.open(self.manifest, 'rb') as fp:
            first_line = fp.readline()
        return (first_line !=
                '# file GENERATED by distutils, do NOT edit\n'.encode())

    def read_manifest(self):
        """Read the manifest file (named by 'self.manifest') and use it to
        fill in 'self.filelist', the list of files to include in the source
        distribution.
        """
        log.info("reading manifest file '%s'", self.manifest)
        manifest = open(self.manifest, 'rb')
        for line in manifest:
            # The manifest must contain UTF-8. See #303.
            if six.PY3:
                try:
                    line = line.decode('UTF-8')
                except UnicodeDecodeError:
                    log.warn("%r not UTF-8 decodable -- skipping" % line)
                    continue
            # ignore comments and blank lines
            line = line.strip()
            if line.startswith('#') or not line:
                continue
            self.filelist.append(line)
        manifest.close()
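

# Illustrative sketch (not part of setuptools): the effect of the
# _remove_os_link() context manager used by make_distribution() above.
if __name__ == '__main__':  # pragma: no cover
    with sdist._remove_os_link():
        print(hasattr(os, 'link'))  # False while the workaround is active
    print(hasattr(os, 'link'))      # restored afterwards, where it existed before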
site-packages/setuptools/command/build_ext.py
import os
import sys
import itertools
import imp
from distutils.command.build_ext import build_ext as _du_build_ext
from distutils.file_util import copy_file
from distutils.ccompiler import new_compiler
from distutils.sysconfig import customize_compiler, get_config_var
from distutils.errors import DistutilsError
from distutils import log

from setuptools.extension import Library
from setuptools.extern import six

try:
    # Attempt to use Cython for building extensions, if available
    from Cython.Distutils.build_ext import build_ext as _build_ext
    # Additionally, assert that the compiler module will load
    # also. Ref #1229.
    __import__('Cython.Compiler.Main')
except ImportError:
    _build_ext = _du_build_ext

# make sure _config_vars is initialized
get_config_var("LDSHARED")
from distutils.sysconfig import _config_vars as _CONFIG_VARS


def _customize_compiler_for_shlib(compiler):
    if sys.platform == "darwin":
        # building .dylib requires additional compiler flags on OSX; here we
        # temporarily substitute the pyconfig.h variables so that distutils'
        # 'customize_compiler' uses them before we build the shared libraries.
        tmp = _CONFIG_VARS.copy()
        try:
            # XXX Help!  I don't have any idea whether these are right...
            _CONFIG_VARS['LDSHARED'] = (
                "gcc -Wl,-x -dynamiclib -undefined dynamic_lookup")
            _CONFIG_VARS['CCSHARED'] = " -dynamiclib"
            _CONFIG_VARS['SO'] = ".dylib"
            customize_compiler(compiler)
        finally:
            _CONFIG_VARS.clear()
            _CONFIG_VARS.update(tmp)
    else:
        customize_compiler(compiler)


have_rtld = False
use_stubs = False
libtype = 'shared'

if sys.platform == "darwin":
    use_stubs = True
elif os.name != 'nt':
    try:
        import dl
        use_stubs = have_rtld = hasattr(dl, 'RTLD_NOW')
    except ImportError:
        pass

if_dl = lambda s: s if have_rtld else ''


def get_abi3_suffix():
    """Return the file extension for an abi3-compliant Extension()"""
    for suffix, _, _ in (s for s in imp.get_suffixes() if s[2] == imp.C_EXTENSION):
        if '.abi3' in suffix:  # Unix
            return suffix
        elif suffix == '.pyd':  # Windows
            return suffix
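

# Illustrative sketch (not part of setuptools): get_abi3_suffix() above reports
# the stable-ABI extension suffix for the running interpreter.
if __name__ == '__main__':  # pragma: no cover
    # e.g. '.abi3.so' on CPython 3 for Linux, '.pyd' on Windows, None on Python 2
    print(get_abi3_suffix())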


class build_ext(_build_ext):
    def run(self):
        """Build extensions in build directory, then copy if --inplace"""
        old_inplace, self.inplace = self.inplace, 0
        _build_ext.run(self)
        self.inplace = old_inplace
        if old_inplace:
            self.copy_extensions_to_source()

    def copy_extensions_to_source(self):
        build_py = self.get_finalized_command('build_py')
        for ext in self.extensions:
            fullname = self.get_ext_fullname(ext.name)
            filename = self.get_ext_filename(fullname)
            modpath = fullname.split('.')
            package = '.'.join(modpath[:-1])
            package_dir = build_py.get_package_dir(package)
            dest_filename = os.path.join(package_dir,
                                         os.path.basename(filename))
            src_filename = os.path.join(self.build_lib, filename)

            # Always copy, even if source is older than destination, to ensure
            # that the right extensions for the current Python/platform are
            # used.
            copy_file(
                src_filename, dest_filename, verbose=self.verbose,
                dry_run=self.dry_run
            )
            if ext._needs_stub:
                self.write_stub(package_dir or os.curdir, ext, True)

    def get_ext_filename(self, fullname):
        filename = _build_ext.get_ext_filename(self, fullname)
        if fullname in self.ext_map:
            ext = self.ext_map[fullname]
            use_abi3 = (
                six.PY3
                and getattr(ext, 'py_limited_api')
                and get_abi3_suffix()
            )
            if use_abi3:
                so_ext = _get_config_var_837('EXT_SUFFIX')
                filename = filename[:-len(so_ext)]
                filename = filename + get_abi3_suffix()
            if isinstance(ext, Library):
                fn, ext = os.path.splitext(filename)
                return self.shlib_compiler.library_filename(fn, libtype)
            elif use_stubs and ext._links_to_dynamic:
                d, fn = os.path.split(filename)
                return os.path.join(d, 'dl-' + fn)
        return filename

    def initialize_options(self):
        _build_ext.initialize_options(self)
        self.shlib_compiler = None
        self.shlibs = []
        self.ext_map = {}

    def finalize_options(self):
        _build_ext.finalize_options(self)
        self.extensions = self.extensions or []
        self.check_extensions_list(self.extensions)
        self.shlibs = [ext for ext in self.extensions
                       if isinstance(ext, Library)]
        if self.shlibs:
            self.setup_shlib_compiler()
        for ext in self.extensions:
            ext._full_name = self.get_ext_fullname(ext.name)
        for ext in self.extensions:
            fullname = ext._full_name
            self.ext_map[fullname] = ext

            # distutils 3.1 will also ask for module names
            # XXX what to do with conflicts?
            self.ext_map[fullname.split('.')[-1]] = ext

            ltd = self.shlibs and self.links_to_dynamic(ext) or False
            ns = ltd and use_stubs and not isinstance(ext, Library)
            ext._links_to_dynamic = ltd
            ext._needs_stub = ns
            filename = ext._file_name = self.get_ext_filename(fullname)
            libdir = os.path.dirname(os.path.join(self.build_lib, filename))
            if ltd and libdir not in ext.library_dirs:
                ext.library_dirs.append(libdir)
            if ltd and use_stubs and os.curdir not in ext.runtime_library_dirs:
                ext.runtime_library_dirs.append(os.curdir)

    def setup_shlib_compiler(self):
        compiler = self.shlib_compiler = new_compiler(
            compiler=self.compiler, dry_run=self.dry_run, force=self.force
        )
        _customize_compiler_for_shlib(compiler)

        if self.include_dirs is not None:
            compiler.set_include_dirs(self.include_dirs)
        if self.define is not None:
            # 'define' option is a list of (name,value) tuples
            for (name, value) in self.define:
                compiler.define_macro(name, value)
        if self.undef is not None:
            for macro in self.undef:
                compiler.undefine_macro(macro)
        if self.libraries is not None:
            compiler.set_libraries(self.libraries)
        if self.library_dirs is not None:
            compiler.set_library_dirs(self.library_dirs)
        if self.rpath is not None:
            compiler.set_runtime_library_dirs(self.rpath)
        if self.link_objects is not None:
            compiler.set_link_objects(self.link_objects)

        # hack so distutils' build_extension() builds a library instead
        compiler.link_shared_object = link_shared_object.__get__(compiler)

    def get_export_symbols(self, ext):
        if isinstance(ext, Library):
            return ext.export_symbols
        return _build_ext.get_export_symbols(self, ext)

    def build_extension(self, ext):
        ext._convert_pyx_sources_to_lang()
        _compiler = self.compiler
        try:
            if isinstance(ext, Library):
                self.compiler = self.shlib_compiler
            _build_ext.build_extension(self, ext)
            if ext._needs_stub:
                cmd = self.get_finalized_command('build_py').build_lib
                self.write_stub(cmd, ext)
        finally:
            self.compiler = _compiler

    def links_to_dynamic(self, ext):
        """Return true if 'ext' links to a dynamic lib in the same package"""
        # XXX this should check to ensure the lib is actually being built
        # XXX as dynamic, and not just using a locally-found version or a
        # XXX static-compiled version
        libnames = dict.fromkeys([lib._full_name for lib in self.shlibs])
        pkg = '.'.join(ext._full_name.split('.')[:-1] + [''])
        return any(pkg + libname in libnames for libname in ext.libraries)

    def get_outputs(self):
        return _build_ext.get_outputs(self) + self.__get_stubs_outputs()

    def __get_stubs_outputs(self):
        # assemble the base name for each extension that needs a stub
        ns_ext_bases = (
            os.path.join(self.build_lib, *ext._full_name.split('.'))
            for ext in self.extensions
            if ext._needs_stub
        )
        # pair each base with the extension
        pairs = itertools.product(ns_ext_bases, self.__get_output_extensions())
        return list(base + fnext for base, fnext in pairs)

    def __get_output_extensions(self):
        yield '.py'
        yield '.pyc'
        if self.get_finalized_command('build_py').optimize:
            yield '.pyo'

    def write_stub(self, output_dir, ext, compile=False):
        log.info("writing stub loader for %s to %s", ext._full_name,
                 output_dir)
        stub_file = (os.path.join(output_dir, *ext._full_name.split('.')) +
                     '.py')
        if compile and os.path.exists(stub_file):
            raise DistutilsError(stub_file + " already exists! Please delete.")
        if not self.dry_run:
            f = open(stub_file, 'w')
            f.write(
                '\n'.join([
                    "def __bootstrap__():",
                    "   global __bootstrap__, __file__, __loader__",
                    "   import sys, os, pkg_resources, imp" + if_dl(", dl"),
                    "   __file__ = pkg_resources.resource_filename"
                    "(__name__,%r)"
                    % os.path.basename(ext._file_name),
                    "   del __bootstrap__",
                    "   if '__loader__' in globals():",
                    "       del __loader__",
                    if_dl("   old_flags = sys.getdlopenflags()"),
                    "   old_dir = os.getcwd()",
                    "   try:",
                    "     os.chdir(os.path.dirname(__file__))",
                    if_dl("     sys.setdlopenflags(dl.RTLD_NOW)"),
                    "     imp.load_dynamic(__name__,__file__)",
                    "   finally:",
                    if_dl("     sys.setdlopenflags(old_flags)"),
                    "     os.chdir(old_dir)",
                    "__bootstrap__()",
                    ""  # terminal \n
                ])
            )
            f.close()
        if compile:
            from distutils.util import byte_compile

            byte_compile([stub_file], optimize=0,
                         force=True, dry_run=self.dry_run)
            optimize = self.get_finalized_command('install_lib').optimize
            if optimize > 0:
                byte_compile([stub_file], optimize=optimize,
                             force=True, dry_run=self.dry_run)
            if os.path.exists(stub_file) and not self.dry_run:
                os.unlink(stub_file)


if use_stubs or os.name == 'nt':
    # Build shared libraries
    #
    def link_shared_object(
            self, objects, output_libname, output_dir=None, libraries=None,
            library_dirs=None, runtime_library_dirs=None, export_symbols=None,
            debug=0, extra_preargs=None, extra_postargs=None, build_temp=None,
            target_lang=None):
        self.link(
            self.SHARED_LIBRARY, objects, output_libname,
            output_dir, libraries, library_dirs, runtime_library_dirs,
            export_symbols, debug, extra_preargs, extra_postargs,
            build_temp, target_lang
        )
else:
    # Build static libraries everywhere else
    libtype = 'static'

    def link_shared_object(
            self, objects, output_libname, output_dir=None, libraries=None,
            library_dirs=None, runtime_library_dirs=None, export_symbols=None,
            debug=0, extra_preargs=None, extra_postargs=None, build_temp=None,
            target_lang=None):
        # XXX we need to either disallow these attrs on Library instances,
        # or warn/abort here if set, or something...
        # libraries=None, library_dirs=None, runtime_library_dirs=None,
        # export_symbols=None, extra_preargs=None, extra_postargs=None,
        # build_temp=None

        assert output_dir is None  # distutils build_ext doesn't pass this
        output_dir, filename = os.path.split(output_libname)
        basename, ext = os.path.splitext(filename)
        if self.library_filename("x").startswith('lib'):
            # strip 'lib' prefix; this is kludgy if some platform uses
            # a different prefix
            basename = basename[3:]

        self.create_static_lib(
            objects, basename, output_dir, debug, target_lang
        )


def _get_config_var_837(name):
    """
    In https://github.com/pypa/setuptools/pull/837, we discovered
    Python 3.3.0 exposes the extension suffix under the name 'SO'.
    """
    if sys.version_info < (3, 3, 1):
        name = 'SO'
    return get_config_var(name)
site-packages/setuptools/command/bdist_egg.py
"""setuptools.command.bdist_egg

Build .egg distributions"""

from distutils.errors import DistutilsSetupError
from distutils.dir_util import remove_tree, mkpath
from distutils import log
from types import CodeType
import sys
import os
import re
import textwrap
import marshal

from setuptools.extern import six

from pkg_resources import get_build_platform, Distribution, ensure_directory
from pkg_resources import EntryPoint
from setuptools.extension import Library
from setuptools import Command

try:
    # Python 2.7 or >=3.2
    from sysconfig import get_path, get_python_version

    def _get_purelib():
        return get_path("purelib")
except ImportError:
    from distutils.sysconfig import get_python_lib, get_python_version

    def _get_purelib():
        return get_python_lib(False)


def strip_module(filename):
    if '.' in filename:
        filename = os.path.splitext(filename)[0]
    if filename.endswith('module'):
        filename = filename[:-6]
    return filename
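

# Illustrative sketch (not part of setuptools): strip_module() above drops the
# extension suffix and a trailing 'module' from a filename.  The paths are
# hypothetical.
if __name__ == '__main__':  # pragma: no cover
    print(strip_module('foo/barmodule.so'))  # 'foo/bar'
    print(strip_module('foo/baz.so'))        # 'foo/baz'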


def sorted_walk(dir):
    """Do os.walk in a reproducible way,
    independent of indeterministic filesystem readdir order
    """
    for base, dirs, files in os.walk(dir):
        dirs.sort()
        files.sort()
        yield base, dirs, files


def write_stub(resource, pyfile):
    _stub_template = textwrap.dedent("""
        def __bootstrap__():
            global __bootstrap__, __loader__, __file__
            import sys, pkg_resources, imp
            __file__ = pkg_resources.resource_filename(__name__, %r)
            __loader__ = None; del __bootstrap__, __loader__
            imp.load_dynamic(__name__,__file__)
        __bootstrap__()
        """).lstrip()
    with open(pyfile, 'w') as f:
        f.write(_stub_template % resource)
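

# Illustrative sketch (not part of setuptools): the stub loader emitted by
# write_stub() above for a compiled extension.  The resource name and the
# temporary path are hypothetical.
if __name__ == '__main__':  # pragma: no cover
    import tempfile
    _stub = os.path.join(tempfile.mkdtemp(), 'myext.py')
    write_stub('myext.cpython-36m-x86_64-linux-gnu.so', _stub)
    with open(_stub) as _f:
        print(_f.read())  # prints the generated __bootstrap__() loader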


class bdist_egg(Command):
    description = "create an \"egg\" distribution"

    user_options = [
        ('bdist-dir=', 'b',
         "temporary directory for creating the distribution"),
        ('plat-name=', 'p', "platform name to embed in generated filenames "
                            "(default: %s)" % get_build_platform()),
        ('exclude-source-files', None,
         "remove all .py files from the generated egg"),
        ('keep-temp', 'k',
         "keep the pseudo-installation tree around after " +
         "creating the distribution archive"),
        ('dist-dir=', 'd',
         "directory to put final built distributions in"),
        ('skip-build', None,
         "skip rebuilding everything (for testing/debugging)"),
    ]

    boolean_options = [
        'keep-temp', 'skip-build', 'exclude-source-files'
    ]

    def initialize_options(self):
        self.bdist_dir = None
        self.plat_name = None
        self.keep_temp = 0
        self.dist_dir = None
        self.skip_build = 0
        self.egg_output = None
        self.exclude_source_files = None

    def finalize_options(self):
        ei_cmd = self.ei_cmd = self.get_finalized_command("egg_info")
        self.egg_info = ei_cmd.egg_info

        if self.bdist_dir is None:
            bdist_base = self.get_finalized_command('bdist').bdist_base
            self.bdist_dir = os.path.join(bdist_base, 'egg')

        if self.plat_name is None:
            self.plat_name = get_build_platform()

        self.set_undefined_options('bdist', ('dist_dir', 'dist_dir'))

        if self.egg_output is None:

            # Compute filename of the output egg
            basename = Distribution(
                None, None, ei_cmd.egg_name, ei_cmd.egg_version,
                get_python_version(),
                self.distribution.has_ext_modules() and self.plat_name
            ).egg_name()

            self.egg_output = os.path.join(self.dist_dir, basename + '.egg')

    def do_install_data(self):
        # Hack for packages that install data to install's --install-lib
        self.get_finalized_command('install').install_lib = self.bdist_dir

        site_packages = os.path.normcase(os.path.realpath(_get_purelib()))
        old, self.distribution.data_files = self.distribution.data_files, []

        for item in old:
            if isinstance(item, tuple) and len(item) == 2:
                if os.path.isabs(item[0]):
                    realpath = os.path.realpath(item[0])
                    normalized = os.path.normcase(realpath)
                    if normalized == site_packages or normalized.startswith(
                        site_packages + os.sep
                    ):
                        item = realpath[len(site_packages) + 1:], item[1]
                        # XXX else: raise ???
            self.distribution.data_files.append(item)

        try:
            log.info("installing package data to %s", self.bdist_dir)
            self.call_command('install_data', force=0, root=None)
        finally:
            self.distribution.data_files = old

    def get_outputs(self):
        return [self.egg_output]

    def call_command(self, cmdname, **kw):
        """Invoke reinitialized command `cmdname` with keyword args"""
        for dirname in INSTALL_DIRECTORY_ATTRS:
            kw.setdefault(dirname, self.bdist_dir)
        kw.setdefault('skip_build', self.skip_build)
        kw.setdefault('dry_run', self.dry_run)
        cmd = self.reinitialize_command(cmdname, **kw)
        self.run_command(cmdname)
        return cmd

    def run(self):
        # Generate metadata first
        self.run_command("egg_info")
        # We run install_lib before install_data, because some data hacks
        # pull their data path from the install_lib command.
        log.info("installing library code to %s", self.bdist_dir)
        instcmd = self.get_finalized_command('install')
        old_root = instcmd.root
        instcmd.root = None
        if self.distribution.has_c_libraries() and not self.skip_build:
            self.run_command('build_clib')
        cmd = self.call_command('install_lib', warn_dir=0)
        instcmd.root = old_root

        all_outputs, ext_outputs = self.get_ext_outputs()
        self.stubs = []
        to_compile = []
        for (p, ext_name) in enumerate(ext_outputs):
            filename, ext = os.path.splitext(ext_name)
            pyfile = os.path.join(self.bdist_dir, strip_module(filename) +
                                  '.py')
            self.stubs.append(pyfile)
            log.info("creating stub loader for %s", ext_name)
            if not self.dry_run:
                write_stub(os.path.basename(ext_name), pyfile)
            to_compile.append(pyfile)
            ext_outputs[p] = ext_name.replace(os.sep, '/')

        if to_compile:
            cmd.byte_compile(to_compile)
        if self.distribution.data_files:
            self.do_install_data()

        # Make the EGG-INFO directory
        archive_root = self.bdist_dir
        egg_info = os.path.join(archive_root, 'EGG-INFO')
        self.mkpath(egg_info)
        if self.distribution.scripts:
            script_dir = os.path.join(egg_info, 'scripts')
            log.info("installing scripts to %s", script_dir)
            self.call_command('install_scripts', install_dir=script_dir,
                              no_ep=1)

        self.copy_metadata_to(egg_info)
        native_libs = os.path.join(egg_info, "native_libs.txt")
        if all_outputs:
            log.info("writing %s", native_libs)
            if not self.dry_run:
                ensure_directory(native_libs)
                libs_file = open(native_libs, 'wt')
                libs_file.write('\n'.join(all_outputs))
                libs_file.write('\n')
                libs_file.close()
        elif os.path.isfile(native_libs):
            log.info("removing %s", native_libs)
            if not self.dry_run:
                os.unlink(native_libs)

        write_safety_flag(
            os.path.join(archive_root, 'EGG-INFO'), self.zip_safe()
        )

        if os.path.exists(os.path.join(self.egg_info, 'depends.txt')):
            log.warn(
                "WARNING: 'depends.txt' will not be used by setuptools 0.6!\n"
                "Use the install_requires/extras_require setup() args instead."
            )

        if self.exclude_source_files:
            self.zap_pyfiles()

        # Make the archive
        make_zipfile(self.egg_output, archive_root, verbose=self.verbose,
                     dry_run=self.dry_run, mode=self.gen_header())
        if not self.keep_temp:
            remove_tree(self.bdist_dir, dry_run=self.dry_run)

        # Add to 'Distribution.dist_files' so that the "upload" command works
        getattr(self.distribution, 'dist_files', []).append(
            ('bdist_egg', get_python_version(), self.egg_output))

    def zap_pyfiles(self):
        log.info("Removing .py files from temporary directory")
        for base, dirs, files in walk_egg(self.bdist_dir):
            for name in files:
                path = os.path.join(base, name)

                if name.endswith('.py'):
                    log.debug("Deleting %s", path)
                    os.unlink(path)

                if base.endswith('__pycache__'):
                    path_old = path

                    pattern = r'(?P<name>.+)\.(?P<magic>[^.]+)\.pyc'
                    m = re.match(pattern, name)
                    path_new = os.path.join(
                        base, os.pardir, m.group('name') + '.pyc')
                    log.info(
                        "Renaming file from [%s] to [%s]"
                        % (path_old, path_new))
                    try:
                        os.remove(path_new)
                    except OSError:
                        pass
                    os.rename(path_old, path_new)

    def zip_safe(self):
        safe = getattr(self.distribution, 'zip_safe', None)
        if safe is not None:
            return safe
        log.warn("zip_safe flag not set; analyzing archive contents...")
        return analyze_egg(self.bdist_dir, self.stubs)

    def gen_header(self):
        epm = EntryPoint.parse_map(self.distribution.entry_points or '')
        ep = epm.get('setuptools.installation', {}).get('eggsecutable')
        if ep is None:
            return 'w'  # not an eggsecutable, do it the usual way.

        if not ep.attrs or ep.extras:
            raise DistutilsSetupError(
                "eggsecutable entry point (%r) cannot have 'extras' "
                "or refer to a module" % (ep,)
            )

        pyver = sys.version[:3]
        pkg = ep.module_name
        full = '.'.join(ep.attrs)
        base = ep.attrs[0]
        basename = os.path.basename(self.egg_output)

        header = (
            "#!/bin/sh\n"
            'if [ `basename $0` = "%(basename)s" ]\n'
            'then exec python%(pyver)s -c "'
            "import sys, os; sys.path.insert(0, os.path.abspath('$0')); "
            "from %(pkg)s import %(base)s; sys.exit(%(full)s())"
            '" "$@"\n'
            'else\n'
            '  echo $0 is not the correct name for this egg file.\n'
            '  echo Please rename it back to %(basename)s and try again.\n'
            '  exec false\n'
            'fi\n'
        ) % locals()

        if not self.dry_run:
            mkpath(os.path.dirname(self.egg_output), dry_run=self.dry_run)
            f = open(self.egg_output, 'w')
            f.write(header)
            f.close()
        return 'a'
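    # Hedged sketch of the setup() declaration gen_header() consumes, an
    # 'eggsecutable' entry point (package and function names are made up):
    #
    #     entry_points={
    #         'setuptools.installation': [
    #             'eggsecutable = my_pkg.cli:main',
    #         ],
    #     }
    #
    # With that declared, the generated egg is prefixed with the shell
    # header above so it can be executed directly.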

    def copy_metadata_to(self, target_dir):
        "Copy metadata (egg info) to the target_dir"
        # normalize the path (so that a forward-slash in egg_info will
        # match using startswith below)
        norm_egg_info = os.path.normpath(self.egg_info)
        prefix = os.path.join(norm_egg_info, '')
        for path in self.ei_cmd.filelist.files:
            if path.startswith(prefix):
                target = os.path.join(target_dir, path[len(prefix):])
                ensure_directory(target)
                self.copy_file(path, target)

    def get_ext_outputs(self):
        """Get a list of relative paths to C extensions in the output distro"""

        all_outputs = []
        ext_outputs = []

        paths = {self.bdist_dir: ''}
        for base, dirs, files in sorted_walk(self.bdist_dir):
            for filename in files:
                if os.path.splitext(filename)[1].lower() in NATIVE_EXTENSIONS:
                    all_outputs.append(paths[base] + filename)
            for filename in dirs:
                paths[os.path.join(base, filename)] = (paths[base] +
                                                       filename + '/')

        if self.distribution.has_ext_modules():
            build_cmd = self.get_finalized_command('build_ext')
            for ext in build_cmd.extensions:
                if isinstance(ext, Library):
                    continue
                fullname = build_cmd.get_ext_fullname(ext.name)
                filename = build_cmd.get_ext_filename(fullname)
                if not os.path.basename(filename).startswith('dl-'):
                    if os.path.exists(os.path.join(self.bdist_dir, filename)):
                        ext_outputs.append(filename)

        return all_outputs, ext_outputs


NATIVE_EXTENSIONS = dict.fromkeys('.dll .so .dylib .pyd'.split())


def walk_egg(egg_dir):
    """Walk an unpacked egg's contents, skipping the metadata directory"""
    walker = sorted_walk(egg_dir)
    base, dirs, files = next(walker)
    if 'EGG-INFO' in dirs:
        dirs.remove('EGG-INFO')
    yield base, dirs, files
    for bdf in walker:
        yield bdf


def analyze_egg(egg_dir, stubs):
    # check for existing flag in EGG-INFO
    for flag, fn in safety_flags.items():
        if os.path.exists(os.path.join(egg_dir, 'EGG-INFO', fn)):
            return flag
    if not can_scan():
        return False
    safe = True
    for base, dirs, files in walk_egg(egg_dir):
        for name in files:
            if name.endswith('.py') or name.endswith('.pyw'):
                continue
            elif name.endswith('.pyc') or name.endswith('.pyo'):
                # always scan, even if we already know we're not safe
                safe = scan_module(egg_dir, base, name, stubs) and safe
    return safe


def write_safety_flag(egg_dir, safe):
    # Write or remove zip safety flag file(s)
    for flag, fn in safety_flags.items():
        fn = os.path.join(egg_dir, fn)
        if os.path.exists(fn):
            if safe is None or bool(safe) != flag:
                os.unlink(fn)
        elif safe is not None and bool(safe) == flag:
            f = open(fn, 'wt')
            f.write('\n')
            f.close()


safety_flags = {
    True: 'zip-safe',
    False: 'not-zip-safe',
}


def scan_module(egg_dir, base, name, stubs):
    """Check whether module possibly uses unsafe-for-zipfile stuff"""

    filename = os.path.join(base, name)
    if filename[:-1] in stubs:
        return True  # Extension module
    pkg = base[len(egg_dir) + 1:].replace(os.sep, '.')
    module = pkg + (pkg and '.' or '') + os.path.splitext(name)[0]
    if sys.version_info < (3, 3):
        skip = 8  # skip magic & date
    elif sys.version_info < (3, 7):
        skip = 12  # skip magic & date & file size
    else:
        skip = 16  # skip magic & flags (PEP 552) & date & file size
    f = open(filename, 'rb')
    f.read(skip)
    code = marshal.load(f)
    f.close()
    safe = True
    symbols = dict.fromkeys(iter_symbols(code))
    for bad in ['__file__', '__path__']:
        if bad in symbols:
            log.warn("%s: module references %s", module, bad)
            safe = False
    if 'inspect' in symbols:
        for bad in [
            'getsource', 'getabsfile', 'getsourcefile', 'getfile',
            'getsourcelines', 'findsource', 'getcomments', 'getframeinfo',
            'getinnerframes', 'getouterframes', 'stack', 'trace'
        ]:
            if bad in symbols:
                log.warn("%s: module MAY be using inspect.%s", module, bad)
                safe = False
    return safe


def iter_symbols(code):
    """Yield names and strings used by `code` and its nested code objects"""
    for name in code.co_names:
        yield name
    for const in code.co_consts:
        if isinstance(const, six.string_types):
            yield const
        elif isinstance(const, CodeType):
            for name in iter_symbols(const):
                yield name
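

def _demo_iter_symbols():
    # Hypothetical helper (not part of setuptools): shows the kind of names
    # scan_module() flags.  The compiled snippet references __file__ and
    # imports inspect, so both turn up among the yielded symbols.
    code = compile("import inspect\nprint(__file__)\n", "<demo>", "exec")
    return sorted(set(iter_symbols(code)))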


def can_scan():
    if not sys.platform.startswith('java') and sys.platform != 'cli':
        # CPython, PyPy, etc.
        return True
    log.warn("Unable to analyze compiled code on this platform.")
    log.warn("Please ask the author to include a 'zip_safe'"
             " setting (either True or False) in the package's setup.py")


# Attribute names of options for commands that might need to be convinced to
# install to the egg build directory

INSTALL_DIRECTORY_ATTRS = [
    'install_lib', 'install_dir', 'install_data', 'install_base'
]


def make_zipfile(zip_filename, base_dir, verbose=0, dry_run=0, compress=True,
                 mode='w'):
    """Create a zip file from all the files under 'base_dir'.  The output
    zip file is written to 'zip_filename' (whose parent directory is created
    if needed) using the standard-library "zipfile" module.  Returns the
    name of the output zip file.
    """
    import zipfile

    mkpath(os.path.dirname(zip_filename), dry_run=dry_run)
    log.info("creating '%s' and adding '%s' to it", zip_filename, base_dir)

    def visit(z, dirname, names):
        for name in names:
            path = os.path.normpath(os.path.join(dirname, name))
            if os.path.isfile(path):
                p = path[len(base_dir) + 1:]
                if not dry_run:
                    z.write(path, p)
                log.debug("adding '%s'", p)

    compression = zipfile.ZIP_DEFLATED if compress else zipfile.ZIP_STORED
    if not dry_run:
        z = zipfile.ZipFile(zip_filename, mode, compression=compression)
        for dirname, dirs, files in sorted_walk(base_dir):
            visit(z, dirname, files)
        z.close()
    else:
        for dirname, dirs, files in sorted_walk(base_dir):
            visit(None, dirname, files)
    return zip_filename
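

def _demo_make_zipfile(tmp_dir):
    # Hypothetical helper (not part of setuptools): archives everything under
    # tmp_dir into tmp_dir + '.egg' using make_zipfile() above and returns
    # the archive's member list for inspection.
    import zipfile
    out = make_zipfile(tmp_dir + '.egg', tmp_dir)
    with zipfile.ZipFile(out) as zf:
        return zf.namelist()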


# site-packages/setuptools/command/build_clib.py

import distutils.command.build_clib as orig
from distutils.errors import DistutilsSetupError
from distutils import log
from setuptools.dep_util import newer_pairwise_group


class build_clib(orig.build_clib):
    """
    Override the default build_clib behaviour to do the following:

    1. Implement a rudimentary timestamp-based dependency system
       so 'compile()' doesn't run every time.
    2. Add more keys to the 'build_info' dictionary:
        * obj_deps - specify dependencies for each object compiled.
                     This should be a dictionary mapping each source
                     filename to a list of its dependencies; use an
                     empty string as the key for global dependencies.
        * cflags   - specify a list of additional flags to pass to
                     the compiler.

    A sketch of such a 'build_info' entry appears at the end of this module.
    """

    def build_libraries(self, libraries):
        for (lib_name, build_info) in libraries:
            sources = build_info.get('sources')
            if sources is None or not isinstance(sources, (list, tuple)):
                raise DistutilsSetupError(
                       "in 'libraries' option (library '%s'), "
                       "'sources' must be present and must be "
                       "a list of source filenames" % lib_name)
            sources = list(sources)

            log.info("building '%s' library", lib_name)

            # Make sure everything is the correct type.
            # obj_deps should be a dictionary of keys as sources
            # and a list/tuple of files that are its dependencies.
            obj_deps = build_info.get('obj_deps', dict())
            if not isinstance(obj_deps, dict):
                raise DistutilsSetupError(
                       "in 'libraries' option (library '%s'), "
                       "'obj_deps' must be a dictionary of "
                       "type 'source: list'" % lib_name)
            dependencies = []

            # Get the global dependencies that are specified by the '' key.
            # These will go into every source's dependency list.
            global_deps = obj_deps.get('', list())
            if not isinstance(global_deps, (list, tuple)):
                raise DistutilsSetupError(
                       "in 'libraries' option (library '%s'), "
                       "'obj_deps' must be a dictionary of "
                       "type 'source: list'" % lib_name)

            # Build the list to be used by newer_pairwise_group
            # each source will be auto-added to its dependencies.
            for source in sources:
                src_deps = [source]
                src_deps.extend(global_deps)
                extra_deps = obj_deps.get(source, list())
                if not isinstance(extra_deps, (list, tuple)):
                    raise DistutilsSetupError(
                           "in 'libraries' option (library '%s'), "
                           "'obj_deps' must be a dictionary of "
                           "type 'source: list'" % lib_name)
                src_deps.extend(extra_deps)
                dependencies.append(src_deps)

            expected_objects = self.compiler.object_filenames(
                    sources,
                    output_dir=self.build_temp
                    )

            if newer_pairwise_group(dependencies, expected_objects) != ([], []):
                # First, compile the source code to object files in the library
                # directory.  (This should probably change to putting object
                # files in a temporary build directory.)
                macros = build_info.get('macros')
                include_dirs = build_info.get('include_dirs')
                cflags = build_info.get('cflags')
                objects = self.compiler.compile(
                        sources,
                        output_dir=self.build_temp,
                        macros=macros,
                        include_dirs=include_dirs,
                        extra_postargs=cflags,
                        debug=self.debug
                        )

            # Now "link" the object files together into a static library.
            # (On Unix at least, this isn't really linking -- it just
            # builds an archive.  Whatever.)
            self.compiler.create_static_lib(
                    expected_objects,
                    lib_name,
                    output_dir=self.build_clib,
                    debug=self.debug
                    )
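

def _example_build_info():
    # Hedged sketch (library and file names are made up) of a 'libraries'
    # entry exercising the extra 'build_info' keys handled above.
    return [
        ('foo', {
            'sources': ['src/foo.c', 'src/bar.c'],
            'obj_deps': {
                '': ['src/common.h'],        # global dependency ('' key)
                'src/bar.c': ['src/bar.h'],  # per-source dependency
            },
            'cflags': ['-O2'],
        }),
    ]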


# site-packages/setuptools/command/bdist_wininst.py

import distutils.command.bdist_wininst as orig


class bdist_wininst(orig.bdist_wininst):
    def reinitialize_command(self, command, reinit_subcommands=0):
        """
        Supplement reinitialize_command to work around
        http://bugs.python.org/issue20819
        """
        cmd = self.distribution.reinitialize_command(
            command, reinit_subcommands)
        if command in ('install', 'install_lib'):
            cmd.install_lib = None
        return cmd

    def run(self):
        self._is_running = True
        try:
            orig.bdist_wininst.run(self)
        finally:
            self._is_running = False


# site-packages/setuptools/command/install_scripts.py

from distutils import log
import distutils.command.install_scripts as orig
import os
import sys

from pkg_resources import Distribution, PathMetadata, ensure_directory


class install_scripts(orig.install_scripts):
    """Do normal script install, plus any egg_info wrapper scripts"""

    def initialize_options(self):
        orig.install_scripts.initialize_options(self)
        self.no_ep = False

    def run(self):
        import setuptools.command.easy_install as ei

        self.run_command("egg_info")
        if self.distribution.scripts:
            orig.install_scripts.run(self)  # run first to set up self.outfiles
        else:
            self.outfiles = []
        if self.no_ep:
            # don't install entry point scripts into .egg file!
            return

        ei_cmd = self.get_finalized_command("egg_info")
        dist = Distribution(
            ei_cmd.egg_base, PathMetadata(ei_cmd.egg_base, ei_cmd.egg_info),
            ei_cmd.egg_name, ei_cmd.egg_version,
        )
        bs_cmd = self.get_finalized_command('build_scripts')
        exec_param = getattr(bs_cmd, 'executable', None)
        bw_cmd = self.get_finalized_command("bdist_wininst")
        is_wininst = getattr(bw_cmd, '_is_running', False)
        writer = ei.ScriptWriter
        if is_wininst:
            exec_param = "python.exe"
            writer = ei.WindowsScriptWriter
        if exec_param == sys.executable:
            # In case the path to the Python executable contains a space, wrap
            # it so it's not split up.
            exec_param = [exec_param]
        # resolve the writer to the environment
        writer = writer.best()
        cmd = writer.command_spec_class.best().from_param(exec_param)
        for args in writer.get_args(dist, cmd.as_header()):
            self.write_script(*args)

    def write_script(self, script_name, contents, mode="t", *ignored):
        """Write an executable file to the scripts directory"""
        from setuptools.command.easy_install import chmod, current_umask

        log.info("Installing %s script to %s", script_name, self.install_dir)
        target = os.path.join(self.install_dir, script_name)
        self.outfiles.append(target)

        mask = current_umask()
        if not self.dry_run:
            ensure_directory(target)
            f = open(target, "w" + mode)
            f.write(contents)
            f.close()
            chmod(target, 0o777 - mask)


# site-packages/setuptools/command/__init__.py

__all__ = [
    'alias', 'bdist_egg', 'bdist_rpm', 'build_ext', 'build_py', 'develop',
    'easy_install', 'egg_info', 'install', 'install_lib', 'rotate', 'saveopts',
    'sdist', 'setopt', 'test', 'install_egg_info', 'install_scripts',
    'register', 'bdist_wininst', 'upload_docs', 'upload', 'build_clib',
    'dist_info',
]

from distutils.command.bdist import bdist
import sys

from setuptools.command import install_scripts

if 'egg' not in bdist.format_commands:
    bdist.format_command['egg'] = ('bdist_egg', "Python .egg file")
    bdist.format_commands.append('egg')

del bdist, sys


# site-packages/setuptools/command/saveopts.py

from setuptools.command.setopt import edit_config, option_base


class saveopts(option_base):
    """Save command-line options to a file"""

    description = "save supplied options to setup.cfg or other config file"

    def run(self):
        dist = self.distribution
        settings = {}

        for cmd in dist.command_options:

            if cmd == 'saveopts':
                continue  # don't save our own options!

            for opt, (src, val) in dist.get_option_dict(cmd).items():
                if src == "command line":
                    settings.setdefault(cmd, {})[opt] = val

        edit_config(self.filename, settings, self.dry_run)
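

# Hedged usage sketch: running, for example,
#
#     python setup.py build_ext --inplace saveopts
#
# persists the command-line-supplied 'inplace' option under [build_ext] in
# setup.cfg (or in the file selected via the option_base options
# --filename / --global-config / --user-config).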


# site-packages/setuptools/command/register.py

import distutils.command.register as orig


class register(orig.register):
    __doc__ = orig.register.__doc__

    def run(self):
        # Make sure that we are using valid current name/version info
        self.run_command('egg_info')
        orig.register.run(self)


# site-packages/setuptools/command/test.py

import os
import operator
import sys
import contextlib
import itertools
import unittest
from distutils.errors import DistutilsError, DistutilsOptionError
from distutils import log
from unittest import TestLoader

from setuptools.extern import six
from setuptools.extern.six.moves import map, filter

from pkg_resources import (resource_listdir, resource_exists, normalize_path,
                           working_set, _namespace_packages, evaluate_marker,
                           add_activation_listener, require, EntryPoint)
from setuptools import Command


class ScanningLoader(TestLoader):

    def __init__(self):
        TestLoader.__init__(self)
        self._visited = set()

    def loadTestsFromModule(self, module, pattern=None):
        """Return a suite of all test cases contained in the given module

        If the module is a package, load tests from all the modules in it.
        If the module has an ``additional_tests`` function, call it and add
        the return value to the tests.
        """
        if module in self._visited:
            return None
        self._visited.add(module)

        tests = []
        tests.append(TestLoader.loadTestsFromModule(self, module))

        if hasattr(module, "additional_tests"):
            tests.append(module.additional_tests())

        if hasattr(module, '__path__'):
            for file in resource_listdir(module.__name__, ''):
                if file.endswith('.py') and file != '__init__.py':
                    submodule = module.__name__ + '.' + file[:-3]
                else:
                    if resource_exists(module.__name__, file + '/__init__.py'):
                        submodule = module.__name__ + '.' + file
                    else:
                        continue
                tests.append(self.loadTestsFromName(submodule))

        if len(tests) != 1:
            return self.suiteClass(tests)
        else:
            return tests[0]  # don't create a nested suite for only one return


# adapted from jaraco.classes.properties:NonDataProperty
class NonDataProperty(object):
    def __init__(self, fget):
        self.fget = fget

    def __get__(self, obj, objtype=None):
        if obj is None:
            return self
        return self.fget(obj)


class test(Command):
    """Command to run unit tests after in-place build"""

    description = "run unit tests after in-place build"

    user_options = [
        ('test-module=', 'm', "Run 'test_suite' in specified module"),
        ('test-suite=', 's',
         "Run single test, case or suite (e.g. 'module.test_suite')"),
        ('test-runner=', 'r', "Test runner to use"),
    ]

    def initialize_options(self):
        self.test_suite = None
        self.test_module = None
        self.test_loader = None
        self.test_runner = None

    def finalize_options(self):

        if self.test_suite and self.test_module:
            msg = "You may specify a module or a suite, but not both"
            raise DistutilsOptionError(msg)

        if self.test_suite is None:
            if self.test_module is None:
                self.test_suite = self.distribution.test_suite
            else:
                self.test_suite = self.test_module + ".test_suite"

        if self.test_loader is None:
            self.test_loader = getattr(self.distribution, 'test_loader', None)
        if self.test_loader is None:
            self.test_loader = "setuptools.command.test:ScanningLoader"
        if self.test_runner is None:
            self.test_runner = getattr(self.distribution, 'test_runner', None)

    @NonDataProperty
    def test_args(self):
        return list(self._test_args())

    def _test_args(self):
        if not self.test_suite and sys.version_info >= (2, 7):
            yield 'discover'
        if self.verbose:
            yield '--verbose'
        if self.test_suite:
            yield self.test_suite

    def with_project_on_sys_path(self, func):
        """
        Backward compatibility for project_on_sys_path context.
        """
        with self.project_on_sys_path():
            func()

    @contextlib.contextmanager
    def project_on_sys_path(self, include_dists=[]):
        with_2to3 = six.PY3 and getattr(self.distribution, 'use_2to3', False)

        if with_2to3:
            # If we run 2to3 we cannot do this in place:

            # Ensure metadata is up-to-date
            self.reinitialize_command('build_py', inplace=0)
            self.run_command('build_py')
            bpy_cmd = self.get_finalized_command("build_py")
            build_path = normalize_path(bpy_cmd.build_lib)

            # Build extensions
            self.reinitialize_command('egg_info', egg_base=build_path)
            self.run_command('egg_info')

            self.reinitialize_command('build_ext', inplace=0)
            self.run_command('build_ext')
        else:
            # Without 2to3, building in place works fine:
            self.run_command('egg_info')

            # Build extensions in-place
            self.reinitialize_command('build_ext', inplace=1)
            self.run_command('build_ext')

        ei_cmd = self.get_finalized_command("egg_info")

        old_path = sys.path[:]
        old_modules = sys.modules.copy()

        try:
            project_path = normalize_path(ei_cmd.egg_base)
            sys.path.insert(0, project_path)
            working_set.__init__()
            add_activation_listener(lambda dist: dist.activate())
            require('%s==%s' % (ei_cmd.egg_name, ei_cmd.egg_version))
            with self.paths_on_pythonpath([project_path]):
                yield
        finally:
            sys.path[:] = old_path
            sys.modules.clear()
            sys.modules.update(old_modules)
            working_set.__init__()

    @staticmethod
    @contextlib.contextmanager
    def paths_on_pythonpath(paths):
        """
        Add the indicated paths to the head of the PYTHONPATH environment
        variable so that subprocesses will also see the packages at
        these paths.

        Do this in a context that restores the value on exit.
        """
        nothing = object()
        orig_pythonpath = os.environ.get('PYTHONPATH', nothing)
        current_pythonpath = os.environ.get('PYTHONPATH', '')
        try:
            prefix = os.pathsep.join(paths)
            to_join = filter(None, [prefix, current_pythonpath])
            new_path = os.pathsep.join(to_join)
            if new_path:
                os.environ['PYTHONPATH'] = new_path
            yield
        finally:
            if orig_pythonpath is nothing:
                os.environ.pop('PYTHONPATH', None)
            else:
                os.environ['PYTHONPATH'] = orig_pythonpath

    @staticmethod
    def install_dists(dist):
        """
        Install the requirements indicated by `dist` and
        return an iterable of the dists that were built.
        """
        ir_d = dist.fetch_build_eggs(dist.install_requires)
        tr_d = dist.fetch_build_eggs(dist.tests_require or [])
        er_d = dist.fetch_build_eggs(
            v for k, v in dist.extras_require.items()
            if k.startswith(':') and evaluate_marker(k[1:])
        )
        return itertools.chain(ir_d, tr_d, er_d)

    def run(self):
        installed_dists = self.install_dists(self.distribution)

        cmd = ' '.join(self._argv)
        if self.dry_run:
            self.announce('skipping "%s" (dry run)' % cmd)
            return

        self.announce('running "%s"' % cmd)

        paths = map(operator.attrgetter('location'), installed_dists)
        with self.paths_on_pythonpath(paths):
            with self.project_on_sys_path():
                self.run_tests()

    def run_tests(self):
        # Purge modules under test from sys.modules. The test loader will
        # re-import them from the build location. Required when 2to3 is used
        # with namespace packages.
        if six.PY3 and getattr(self.distribution, 'use_2to3', False):
            module = self.test_suite.split('.')[0]
            if module in _namespace_packages:
                del_modules = []
                if module in sys.modules:
                    del_modules.append(module)
                module += '.'
                for name in sys.modules:
                    if name.startswith(module):
                        del_modules.append(name)
                list(map(sys.modules.__delitem__, del_modules))

        test = unittest.main(
            None, None, self._argv,
            testLoader=self._resolve_as_ep(self.test_loader),
            testRunner=self._resolve_as_ep(self.test_runner),
            exit=False,
        )
        if not test.result.wasSuccessful():
            msg = 'Test failed: %s' % test.result
            self.announce(msg, log.ERROR)
            raise DistutilsError(msg)

    @property
    def _argv(self):
        return ['unittest'] + self.test_args

    @staticmethod
    def _resolve_as_ep(val):
        """
        Load the indicated attribute value, called, as if it were
        specified as an entry point.
        """
        if val is None:
            return
        parsed = EntryPoint.parse("x=" + val)
        return parsed.resolve()()
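

def _demo_resolve_loader():
    # Hypothetical helper (not part of setuptools): the --test-loader and
    # --test-runner option values are resolved like entry point references;
    # the default loader string resolves to a ScanningLoader instance.
    return test._resolve_as_ep('setuptools.command.test:ScanningLoader')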


# site-packages/setuptools/command/build_py.py

from glob import glob
from distutils.util import convert_path
import distutils.command.build_py as orig
import os
import fnmatch
import textwrap
import io
import distutils.errors
import itertools

from setuptools.extern import six
from setuptools.extern.six.moves import map, filter, filterfalse

try:
    from setuptools.lib2to3_ex import Mixin2to3
except ImportError:

    class Mixin2to3:
        def run_2to3(self, files, doctests=True):
            "do nothing"


class build_py(orig.build_py, Mixin2to3):
    """Enhanced 'build_py' command that includes data files with packages

    The data files are specified via a 'package_data' argument to 'setup()'.
    See 'setuptools.dist.Distribution' for more details.

    Also, this version of the 'build_py' command allows you to specify both
    'py_modules' and 'packages' in the same setup operation.
    """

    def finalize_options(self):
        orig.build_py.finalize_options(self)
        self.package_data = self.distribution.package_data
        self.exclude_package_data = (self.distribution.exclude_package_data or
                                     {})
        if 'data_files' in self.__dict__:
            del self.__dict__['data_files']
        self.__updated_files = []
        self.__doctests_2to3 = []

    def run(self):
        """Build modules, packages, and copy data files to build directory"""
        if not self.py_modules and not self.packages:
            return

        if self.py_modules:
            self.build_modules()

        if self.packages:
            self.build_packages()
            self.build_package_data()

        self.run_2to3(self.__updated_files, False)
        self.run_2to3(self.__updated_files, True)
        self.run_2to3(self.__doctests_2to3, True)

        # Only compile actual .py files, using our base class' idea of what our
        # output files are.
        self.byte_compile(orig.build_py.get_outputs(self, include_bytecode=0))

    def __getattr__(self, attr):
        "lazily compute data files"
        if attr == 'data_files':
            self.data_files = self._get_data_files()
            return self.data_files
        return orig.build_py.__getattr__(self, attr)

    def build_module(self, module, module_file, package):
        if six.PY2 and isinstance(package, six.string_types):
            # avoid errors on Python 2 when unicode is passed (#190)
            package = package.split('.')
        outfile, copied = orig.build_py.build_module(self, module, module_file,
                                                     package)
        if copied:
            self.__updated_files.append(outfile)
        return outfile, copied

    def _get_data_files(self):
        """Generate list of '(package,src_dir,build_dir,filenames)' tuples"""
        self.analyze_manifest()
        return list(map(self._get_pkg_data_files, self.packages or ()))

    def _get_pkg_data_files(self, package):
        # Locate package source directory
        src_dir = self.get_package_dir(package)

        # Compute package build directory
        build_dir = os.path.join(*([self.build_lib] + package.split('.')))

        # Strip directory from globbed filenames
        filenames = [
            os.path.relpath(file, src_dir)
            for file in self.find_data_files(package, src_dir)
        ]
        return package, src_dir, build_dir, filenames

    def find_data_files(self, package, src_dir):
        """Return filenames for package's data files in 'src_dir'"""
        patterns = self._get_platform_patterns(
            self.package_data,
            package,
            src_dir,
        )
        globs_expanded = map(glob, patterns)
        # flatten the expanded globs into an iterable of matches
        globs_matches = itertools.chain.from_iterable(globs_expanded)
        glob_files = filter(os.path.isfile, globs_matches)
        files = itertools.chain(
            self.manifest_files.get(package, []),
            glob_files,
        )
        return self.exclude_data_files(package, src_dir, files)

    def build_package_data(self):
        """Copy data files into build directory"""
        for package, src_dir, build_dir, filenames in self.data_files:
            for filename in filenames:
                target = os.path.join(build_dir, filename)
                self.mkpath(os.path.dirname(target))
                srcfile = os.path.join(src_dir, filename)
                outf, copied = self.copy_file(srcfile, target)
                srcfile = os.path.abspath(srcfile)
                if (copied and
                        srcfile in self.distribution.convert_2to3_doctests):
                    self.__doctests_2to3.append(outf)

    def analyze_manifest(self):
        self.manifest_files = mf = {}
        if not self.distribution.include_package_data:
            return
        src_dirs = {}
        for package in self.packages or ():
            # Locate package source directory
            src_dirs[assert_relative(self.get_package_dir(package))] = package

        self.run_command('egg_info')
        ei_cmd = self.get_finalized_command('egg_info')
        for path in ei_cmd.filelist.files:
            d, f = os.path.split(assert_relative(path))
            prev = None
            oldf = f
            while d and d != prev and d not in src_dirs:
                prev = d
                d, df = os.path.split(d)
                f = os.path.join(df, f)
            if d in src_dirs:
                if path.endswith('.py') and f == oldf:
                    continue  # it's a module, not data
                mf.setdefault(src_dirs[d], []).append(path)

    def get_data_files(self):
        pass  # Lazily compute data files in _get_data_files() function.

    def check_package(self, package, package_dir):
        """Check namespace packages' __init__ for declare_namespace"""
        try:
            return self.packages_checked[package]
        except KeyError:
            pass

        init_py = orig.build_py.check_package(self, package, package_dir)
        self.packages_checked[package] = init_py

        if not init_py or not self.distribution.namespace_packages:
            return init_py

        for pkg in self.distribution.namespace_packages:
            if pkg == package or pkg.startswith(package + '.'):
                break
        else:
            return init_py

        with io.open(init_py, 'rb') as f:
            contents = f.read()
        if b'declare_namespace' not in contents:
            raise distutils.errors.DistutilsError(
                "Namespace package problem: %s is a namespace package, but "
                "its\n__init__.py does not call declare_namespace()! Please "
                'fix it.\n(See the setuptools manual under '
                '"Namespace Packages" for details.)\n' % (package,)
            )
        return init_py

    def initialize_options(self):
        self.packages_checked = {}
        orig.build_py.initialize_options(self)

    def get_package_dir(self, package):
        res = orig.build_py.get_package_dir(self, package)
        if self.distribution.src_root is not None:
            return os.path.join(self.distribution.src_root, res)
        return res

    def exclude_data_files(self, package, src_dir, files):
        """Filter filenames for package's data files in 'src_dir'"""
        files = list(files)
        patterns = self._get_platform_patterns(
            self.exclude_package_data,
            package,
            src_dir,
        )
        match_groups = (
            fnmatch.filter(files, pattern)
            for pattern in patterns
        )
        # flatten the groups of matches into an iterable of matches
        matches = itertools.chain.from_iterable(match_groups)
        bad = set(matches)
        keepers = (
            fn
            for fn in files
            if fn not in bad
        )
        # ditch dupes
        return list(_unique_everseen(keepers))

    @staticmethod
    def _get_platform_patterns(spec, package, src_dir):
        """
        yield platform-specific path patterns (suitable for glob
        or fnmatch) from a glob-based spec (such as
        self.package_data or self.exclude_package_data)
        matching package in src_dir.
        """
        raw_patterns = itertools.chain(
            spec.get('', []),
            spec.get(package, []),
        )
        return (
            # Each pattern has to be converted to a platform-specific path
            os.path.join(src_dir, convert_path(pattern))
            for pattern in raw_patterns
        )


# from Python docs
def _unique_everseen(iterable, key=None):
    "List unique elements, preserving order. Remember all elements ever seen."
    # unique_everseen('AAAABBBCCDAABBB') --> A B C D
    # unique_everseen('ABBCcAD', str.lower) --> A B C D
    seen = set()
    seen_add = seen.add
    if key is None:
        for element in filterfalse(seen.__contains__, iterable):
            seen_add(element)
            yield element
    else:
        for element in iterable:
            k = key(element)
            if k not in seen:
                seen_add(k)
                yield element


def assert_relative(path):
    if not os.path.isabs(path):
        return path
    from distutils.errors import DistutilsSetupError

    msg = textwrap.dedent("""
        Error: setup script specifies an absolute path:

            %s

        setup() arguments must *always* be /-separated paths relative to the
        setup.py directory, *never* absolute paths.
        """).lstrip() % path
    raise DistutilsSetupError(msg)
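

def _demo_platform_patterns():
    # Hypothetical helper (not part of setuptools): shows a package_data
    # style spec and the glob patterns _get_platform_patterns() derives from
    # it for a package whose sources live under 'src/pkg'.
    spec = {'': ['*.txt'], 'pkg': ['data/*.json']}
    return list(build_py._get_platform_patterns(spec, 'pkg', 'src/pkg'))
    # on POSIX this yields ['src/pkg/*.txt', 'src/pkg/data/*.json']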


# site-packages/setuptools/command/setopt.py

from distutils.util import convert_path
from distutils import log
from distutils.errors import DistutilsOptionError
import distutils
import os

from setuptools.extern.six.moves import configparser

from setuptools import Command

__all__ = ['config_file', 'edit_config', 'option_base', 'setopt']


def config_file(kind="local"):
    """Get the filename of the distutils, local, global, or per-user config

    `kind` must be one of "local", "global", or "user"
    """
    if kind == 'local':
        return 'setup.cfg'
    if kind == 'global':
        return os.path.join(
            os.path.dirname(distutils.__file__), 'distutils.cfg'
        )
    if kind == 'user':
        dot = os.name == 'posix' and '.' or ''
        return os.path.expanduser(convert_path("~/%spydistutils.cfg" % dot))
    raise ValueError(
        "config_file() type must be 'local', 'global', or 'user'", kind
    )


def edit_config(filename, settings, dry_run=False):
    """Edit a configuration file to include `settings`

    `settings` is a dictionary of dictionaries or ``None`` values, keyed by
    command/section name.  A ``None`` value means to delete the entire section,
    while a dictionary lists settings to be changed or deleted in that section.
    A setting of ``None`` means to delete that setting.
    """
    log.debug("Reading configuration from %s", filename)
    opts = configparser.RawConfigParser()
    opts.read([filename])
    for section, options in settings.items():
        if options is None:
            log.info("Deleting section [%s] from %s", section, filename)
            opts.remove_section(section)
        else:
            if not opts.has_section(section):
                log.debug("Adding new section [%s] to %s", section, filename)
                opts.add_section(section)
            for option, value in options.items():
                if value is None:
                    log.debug(
                        "Deleting %s.%s from %s",
                        section, option, filename
                    )
                    opts.remove_option(section, option)
                    if not opts.options(section):
                        log.info("Deleting empty [%s] section from %s",
                                 section, filename)
                        opts.remove_section(section)
                else:
                    log.debug(
                        "Setting %s.%s to %r in %s",
                        section, option, value, filename
                    )
                    opts.set(section, option, value)

    log.info("Writing %s", filename)
    if not dry_run:
        with open(filename, 'w') as f:
            opts.write(f)
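

def _demo_edit_config():
    # Hypothetical helper (not part of setuptools): shows the 'settings'
    # shape edit_config() expects.  dry_run=True, so nothing is written.
    edit_config('setup.cfg', {
        'easy_install': {'index_url': 'https://example.org/simple'},
        'bdist_egg': None,  # a None section value deletes that section
    }, dry_run=True)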


class option_base(Command):
    """Abstract base class for commands that mess with config files"""

    user_options = [
        ('global-config', 'g',
         "save options to the site-wide distutils.cfg file"),
        ('user-config', 'u',
         "save options to the current user's pydistutils.cfg file"),
        ('filename=', 'f',
         "configuration file to use (default=setup.cfg)"),
    ]

    boolean_options = [
        'global-config', 'user-config',
    ]

    def initialize_options(self):
        self.global_config = None
        self.user_config = None
        self.filename = None

    def finalize_options(self):
        filenames = []
        if self.global_config:
            filenames.append(config_file('global'))
        if self.user_config:
            filenames.append(config_file('user'))
        if self.filename is not None:
            filenames.append(self.filename)
        if not filenames:
            filenames.append(config_file('local'))
        if len(filenames) > 1:
            raise DistutilsOptionError(
                "Must specify only one configuration file option",
                filenames
            )
        self.filename, = filenames


class setopt(option_base):
    """Set an option in setup.cfg or another config file"""

    description = "set an option in setup.cfg or another config file"

    user_options = [
        ('command=', 'c', 'command to set an option for'),
        ('option=', 'o', 'option to set'),
        ('set-value=', 's', 'value of the option'),
        ('remove', 'r', 'remove (unset) the value'),
    ] + option_base.user_options

    boolean_options = option_base.boolean_options + ['remove']

    def initialize_options(self):
        option_base.initialize_options(self)
        self.command = None
        self.option = None
        self.set_value = None
        self.remove = None

    def finalize_options(self):
        option_base.finalize_options(self)
        if self.command is None or self.option is None:
            raise DistutilsOptionError("Must specify --command *and* --option")
        if self.set_value is None and not self.remove:
            raise DistutilsOptionError("Must specify --set-value or --remove")

    def run(self):
        edit_config(
            self.filename, {
                self.command: {self.option.replace('-', '_'): self.set_value}
            },
            self.dry_run
        )


# site-packages/setuptools/command/easy_install.py

"""
Easy Install
------------

A tool for doing automatic download/extract/build of distutils-based Python
packages.  For detailed documentation, see the accompanying EasyInstall.txt
file, or visit the `EasyInstall home page`__.

__ https://setuptools.readthedocs.io/en/latest/easy_install.html

"""

from glob import glob
from distutils.util import get_platform
from distutils.util import convert_path, subst_vars
from distutils.errors import (
    DistutilsArgError, DistutilsOptionError,
    DistutilsError, DistutilsPlatformError,
)
from distutils.command.install import INSTALL_SCHEMES, SCHEME_KEYS
from distutils import log, dir_util
from distutils.command.build_scripts import first_line_re
from distutils.spawn import find_executable
import sys
import os
import zipimport
import shutil
import tempfile
import zipfile
import re
import stat
import random
import textwrap
import warnings
import site
import struct
import contextlib
import subprocess
import shlex
import io

from setuptools.extern import six
from setuptools.extern.six.moves import configparser, map

from setuptools import Command
from setuptools.sandbox import run_setup
from setuptools.py31compat import get_path, get_config_vars
from setuptools.py27compat import rmtree_safe
from setuptools.command import setopt
from setuptools.archive_util import unpack_archive
from setuptools.package_index import (
    PackageIndex, parse_requirement_arg, URL_SCHEME,
)
from setuptools.command import bdist_egg, egg_info
from setuptools.wheel import Wheel
from pkg_resources import (
    yield_lines, normalize_path, resource_string, ensure_directory,
    get_distribution, find_distributions, Environment, Requirement,
    Distribution, PathMetadata, EggMetadata, WorkingSet, DistributionNotFound,
    VersionConflict, DEVELOP_DIST,
)
import pkg_resources.py31compat

# Turn on PEP440Warnings
warnings.filterwarnings("default", category=pkg_resources.PEP440Warning)

__all__ = [
    'samefile', 'easy_install', 'PthDistributions', 'extract_wininst_cfg',
    'main', 'get_exe_prefixes',
]


def is_64bit():
    return struct.calcsize("P") == 8


def samefile(p1, p2):
    """
    Determine if two paths reference the same file.

    Augments os.path.samefile to work on Windows and
    suppresses errors if the path doesn't exist.
    """
    both_exist = os.path.exists(p1) and os.path.exists(p2)
    use_samefile = hasattr(os.path, 'samefile') and both_exist
    if use_samefile:
        return os.path.samefile(p1, p2)
    norm_p1 = os.path.normpath(os.path.normcase(p1))
    norm_p2 = os.path.normpath(os.path.normcase(p2))
    return norm_p1 == norm_p2


if six.PY2:

    def _to_ascii(s):
        return s

    def isascii(s):
        try:
            six.text_type(s, 'ascii')
            return True
        except UnicodeError:
            return False
else:

    def _to_ascii(s):
        return s.encode('ascii')

    def isascii(s):
        try:
            s.encode('ascii')
            return True
        except UnicodeError:
            return False


_one_liner = lambda text: textwrap.dedent(text).strip().replace('\n', '; ')
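# e.g. _one_liner("""
#     import sys
#     sys.exit(0)
# """) == 'import sys; sys.exit(0)'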


class easy_install(Command):
    """Manage a download/build/install process"""
    description = "Find/get/install Python packages"
    command_consumes_arguments = True

    user_options = [
        ('prefix=', None, "installation prefix"),
        ("zip-ok", "z", "install package as a zipfile"),
        ("multi-version", "m", "make apps have to require() a version"),
        ("upgrade", "U", "force upgrade (searches PyPI for latest versions)"),
        ("install-dir=", "d", "install package to DIR"),
        ("script-dir=", "s", "install scripts to DIR"),
        ("exclude-scripts", "x", "Don't install scripts"),
        ("always-copy", "a", "Copy all needed packages to install dir"),
        ("index-url=", "i", "base URL of Python Package Index"),
        ("find-links=", "f", "additional URL(s) to search for packages"),
        ("build-directory=", "b",
         "download/extract/build in DIR; keep the results"),
        ('optimize=', 'O',
         "also compile with optimization: -O1 for \"python -O\", "
         "-O2 for \"python -OO\", and -O0 to disable [default: -O0]"),
        ('record=', None,
         "filename in which to record list of installed files"),
        ('always-unzip', 'Z', "don't install as a zipfile, no matter what"),
        ('site-dirs=', 'S', "list of directories where .pth files work"),
        ('editable', 'e', "Install specified packages in editable form"),
        ('no-deps', 'N', "don't install dependencies"),
        ('allow-hosts=', 'H', "pattern(s) that hostnames must match"),
        ('local-snapshots-ok', 'l',
         "allow building eggs from local checkouts"),
        ('version', None, "print version information and exit"),
        ('no-find-links', None,
         "Don't load find-links defined in packages being installed")
    ]
    boolean_options = [
        'zip-ok', 'multi-version', 'exclude-scripts', 'upgrade', 'always-copy',
        'editable',
        'no-deps', 'local-snapshots-ok', 'version'
    ]

    if site.ENABLE_USER_SITE:
        help_msg = "install in user site-package '%s'" % site.USER_SITE
        user_options.append(('user', None, help_msg))
        boolean_options.append('user')

    negative_opt = {'always-unzip': 'zip-ok'}
    create_index = PackageIndex

    def initialize_options(self):
        # the --user option seems to be an opt-in one,
        # so the default should be False.
        self.user = 0
        self.zip_ok = self.local_snapshots_ok = None
        self.install_dir = self.script_dir = self.exclude_scripts = None
        self.index_url = None
        self.find_links = None
        self.build_directory = None
        self.args = None
        self.optimize = self.record = None
        self.upgrade = self.always_copy = self.multi_version = None
        self.editable = self.no_deps = self.allow_hosts = None
        self.root = self.prefix = self.no_report = None
        self.version = None
        self.install_purelib = None  # for pure module distributions
        self.install_platlib = None  # non-pure (dists w/ extensions)
        self.install_headers = None  # for C/C++ headers
        self.install_lib = None  # set to either purelib or platlib
        self.install_scripts = None
        self.install_data = None
        self.install_base = None
        self.install_platbase = None
        if site.ENABLE_USER_SITE:
            self.install_userbase = site.USER_BASE
            self.install_usersite = site.USER_SITE
        else:
            self.install_userbase = None
            self.install_usersite = None
        self.no_find_links = None

        # Options not specifiable via command line
        self.package_index = None
        self.pth_file = self.always_copy_from = None
        self.site_dirs = None
        self.installed_projects = {}
        self.sitepy_installed = False
        # Always read easy_install options, even if we are subclassed, or have
        # an independent instance created.  This ensures that defaults will
        # always come from the standard configuration file(s)' "easy_install"
        # section, even if this is a "develop" or "install" command, or some
        # other embedding.
        self._dry_run = None
        self.verbose = self.distribution.verbose
        self.distribution._set_command_options(
            self, self.distribution.get_option_dict('easy_install')
        )

    def delete_blockers(self, blockers):
        extant_blockers = (
            filename for filename in blockers
            if os.path.exists(filename) or os.path.islink(filename)
        )
        list(map(self._delete_path, extant_blockers))

    def _delete_path(self, path):
        log.info("Deleting %s", path)
        if self.dry_run:
            return

        is_tree = os.path.isdir(path) and not os.path.islink(path)
        remover = rmtree if is_tree else os.unlink
        remover(path)

    @staticmethod
    def _render_version():
        """
        Render the Setuptools version and installation details, then exit.
        """
        ver = sys.version[:3]
        dist = get_distribution('setuptools')
        tmpl = 'setuptools {dist.version} from {dist.location} (Python {ver})'
        print(tmpl.format(**locals()))
        raise SystemExit()

    def finalize_options(self):
        self.version and self._render_version()

        py_version = sys.version.split()[0]
        prefix, exec_prefix = get_config_vars('prefix', 'exec_prefix')

        self.config_vars = {
            'dist_name': self.distribution.get_name(),
            'dist_version': self.distribution.get_version(),
            'dist_fullname': self.distribution.get_fullname(),
            'py_version': py_version,
            'py_version_short': py_version[0:3],
            'py_version_nodot': py_version[0] + py_version[2],
            'sys_prefix': prefix,
            'prefix': prefix,
            'sys_exec_prefix': exec_prefix,
            'exec_prefix': exec_prefix,
            # Only python 3.2+ has abiflags
            'abiflags': getattr(sys, 'abiflags', ''),
        }

        if site.ENABLE_USER_SITE:
            self.config_vars['userbase'] = self.install_userbase
            self.config_vars['usersite'] = self.install_usersite

        self._fix_install_dir_for_user_site()

        self.expand_basedirs()
        self.expand_dirs()

        self._expand(
            'install_dir', 'script_dir', 'build_directory',
            'site_dirs',
        )
        # If a non-default installation directory was specified, default the
        # script directory to match it.
        if self.script_dir is None:
            self.script_dir = self.install_dir

        if self.no_find_links is None:
            self.no_find_links = False

        # Let install_dir get set by install_lib command, which in turn
        # gets its info from the install command, and takes into account
        # --prefix and --home and all that other crud.
        self.set_undefined_options(
            'install_lib', ('install_dir', 'install_dir')
        )
        # Likewise, set default script_dir from 'install_scripts.install_dir'
        self.set_undefined_options(
            'install_scripts', ('install_dir', 'script_dir')
        )

        if self.user and self.install_purelib:
            self.install_dir = self.install_purelib
            self.script_dir = self.install_scripts
        # default --record from the install command
        self.set_undefined_options('install', ('record', 'record'))
        # Should this be moved to the if statement below? It's not used
        # elsewhere
        normpath = map(normalize_path, sys.path)
        self.all_site_dirs = get_site_dirs()
        if self.site_dirs is not None:
            site_dirs = [
                os.path.expanduser(s.strip()) for s in
                self.site_dirs.split(',')
            ]
            for d in site_dirs:
                if not os.path.isdir(d):
                    log.warn("%s (in --site-dirs) does not exist", d)
                elif normalize_path(d) not in normpath:
                    raise DistutilsOptionError(
                        d + " (in --site-dirs) is not on sys.path"
                    )
                else:
                    self.all_site_dirs.append(normalize_path(d))
        if not self.editable:
            self.check_site_dir()
        self.index_url = self.index_url or "https://pypi.org/simple/"
        self.shadow_path = self.all_site_dirs[:]
        for path_item in self.install_dir, normalize_path(self.script_dir):
            if path_item not in self.shadow_path:
                self.shadow_path.insert(0, path_item)

        if self.allow_hosts is not None:
            hosts = [s.strip() for s in self.allow_hosts.split(',')]
        else:
            hosts = ['*']
        if self.package_index is None:
            self.package_index = self.create_index(
                self.index_url, search_path=self.shadow_path, hosts=hosts,
            )
        self.local_index = Environment(self.shadow_path + sys.path)

        if self.find_links is not None:
            if isinstance(self.find_links, six.string_types):
                self.find_links = self.find_links.split()
        else:
            self.find_links = []
        if self.local_snapshots_ok:
            self.package_index.scan_egg_links(self.shadow_path + sys.path)
        if not self.no_find_links:
            self.package_index.add_find_links(self.find_links)
        self.set_undefined_options('install_lib', ('optimize', 'optimize'))
        if not isinstance(self.optimize, int):
            try:
                self.optimize = int(self.optimize)
                if not (0 <= self.optimize <= 2):
                    raise ValueError
            except ValueError:
                raise DistutilsOptionError("--optimize must be 0, 1, or 2")

        if self.editable and not self.build_directory:
            raise DistutilsArgError(
                "Must specify a build directory (-b) when using --editable"
            )
        if not self.args:
            raise DistutilsArgError(
                "No urls, filenames, or requirements specified (see --help)")

        self.outputs = []

    def _fix_install_dir_for_user_site(self):
        """
        Fix the install_dir if "--user" was used.
        """
        if not self.user or not site.ENABLE_USER_SITE:
            return

        self.create_home_path()
        if self.install_userbase is None:
            msg = "User base directory is not specified"
            raise DistutilsPlatformError(msg)
        self.install_base = self.install_platbase = self.install_userbase
        scheme_name = os.name.replace('posix', 'unix') + '_user'
        self.select_scheme(scheme_name)

    def _expand_attrs(self, attrs):
        for attr in attrs:
            val = getattr(self, attr)
            if val is not None:
                if os.name == 'posix' or os.name == 'nt':
                    val = os.path.expanduser(val)
                val = subst_vars(val, self.config_vars)
                setattr(self, attr, val)

    def expand_basedirs(self):
        """Calls `os.path.expanduser` on install_base, install_platbase and
        root."""
        self._expand_attrs(['install_base', 'install_platbase', 'root'])

    def expand_dirs(self):
        """Calls `os.path.expanduser` on install dirs."""
        dirs = [
            'install_purelib',
            'install_platlib',
            'install_lib',
            'install_headers',
            'install_scripts',
            'install_data',
        ]
        self._expand_attrs(dirs)
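
    # Hedged example: values expanded via _expand_attrs() go through
    # subst_vars() with self.config_vars, so an (illustrative) setting like
    #     install_data = '$prefix/share/$dist_name'
    # has '$prefix' and '$dist_name' filled in from the mapping built in
    # finalize_options().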

    def run(self):
        if self.verbose != self.distribution.verbose:
            log.set_verbosity(self.verbose)
        try:
            for spec in self.args:
                self.easy_install(spec, not self.no_deps)
            if self.record:
                outputs = self.outputs
                if self.root:  # strip any package prefix
                    root_len = len(self.root)
                    for counter in range(len(outputs)):
                        outputs[counter] = outputs[counter][root_len:]
                from distutils import file_util

                self.execute(
                    file_util.write_file, (self.record, outputs),
                    "writing list of installed files to '%s'" %
                    self.record
                )
            self.warn_deprecated_options()
        finally:
            log.set_verbosity(self.distribution.verbose)

    def pseudo_tempname(self):
        """Return a pseudo-tempname base in the install directory.
        This code is intentionally naive; if a malicious party can write to
        the target directory you're already in deep doodoo.
        """
        try:
            pid = os.getpid()
        except Exception:
            pid = random.randint(0, sys.maxsize)
        return os.path.join(self.install_dir, "test-easy-install-%s" % pid)

    def warn_deprecated_options(self):
        pass

    def check_site_dir(self):
        """Verify that self.install_dir is .pth-capable dir, if needed"""

        instdir = normalize_path(self.install_dir)
        pth_file = os.path.join(instdir, 'easy-install.pth')

        if not os.path.exists(instdir):
            try:
                os.makedirs(instdir)
            except (OSError, IOError):
                self.cant_write_to_target()

        # Is it a configured, PYTHONPATH, implicit, or explicit site dir?
        is_site_dir = instdir in self.all_site_dirs

        if not is_site_dir and not self.multi_version:
            # No?  Then directly test whether it does .pth file processing
            is_site_dir = self.check_pth_processing()
        else:
            # make sure we can write to target dir
            testfile = self.pseudo_tempname() + '.write-test'
            test_exists = os.path.exists(testfile)
            try:
                if test_exists:
                    os.unlink(testfile)
                open(testfile, 'w').close()
                os.unlink(testfile)
            except (OSError, IOError):
                self.cant_write_to_target()

        if not is_site_dir and not self.multi_version:
            # Can't install non-multi to non-site dir
            raise DistutilsError(self.no_default_version_msg())

        if is_site_dir:
            if self.pth_file is None:
                self.pth_file = PthDistributions(pth_file, self.all_site_dirs)
        else:
            self.pth_file = None

        if instdir not in map(normalize_path, _pythonpath()):
            # only PYTHONPATH dirs need a site.py, so pretend it's there
            self.sitepy_installed = True
        elif self.multi_version and not os.path.exists(pth_file):
            self.sitepy_installed = True  # don't need site.py in this case
            self.pth_file = None  # and don't create a .pth file
        self.install_dir = instdir

    __cant_write_msg = textwrap.dedent("""
        can't create or remove files in install directory

        The following error occurred while trying to add or remove files in the
        installation directory:

            %s

        The installation directory you specified (via --install-dir, --prefix, or
        the distutils default setting) was:

            %s
        """).lstrip()

    __not_exists_id = textwrap.dedent("""
        This directory does not currently exist.  Please create it and try again, or
        choose a different installation directory (using the -d or --install-dir
        option).
        """).lstrip()

    __access_msg = textwrap.dedent("""
        Perhaps your account does not have write access to this directory?  If the
        installation directory is a system-owned directory, you may need to sign in
        as the administrator or "root" account.  If you do not have administrative
        access to this machine, you may wish to choose a different installation
        directory, preferably one that is listed in your PYTHONPATH environment
        variable.

        For information on other options, you may wish to consult the
        documentation at:

          https://setuptools.readthedocs.io/en/latest/easy_install.html

        Please make the appropriate changes for your system and try again.
        """).lstrip()

    def cant_write_to_target(self):
        msg = self.__cant_write_msg % (sys.exc_info()[1], self.install_dir,)

        if not os.path.exists(self.install_dir):
            msg += '\n' + self.__not_exists_id
        else:
            msg += '\n' + self.__access_msg
        raise DistutilsError(msg)

    def check_pth_processing(self):
        """Empirically verify whether .pth files are supported in inst. dir"""
        instdir = self.install_dir
        log.info("Checking .pth file support in %s", instdir)
        pth_file = self.pseudo_tempname() + ".pth"
        ok_file = pth_file + '.ok'
        ok_exists = os.path.exists(ok_file)
        tmpl = _one_liner("""
            import os
            f = open({ok_file!r}, 'w')
            f.write('OK')
            f.close()
            """) + '\n'
        try:
            if ok_exists:
                os.unlink(ok_file)
            dirname = os.path.dirname(ok_file)
            pkg_resources.py31compat.makedirs(dirname, exist_ok=True)
            f = open(pth_file, 'w')
        except (OSError, IOError):
            self.cant_write_to_target()
        else:
            try:
                f.write(tmpl.format(**locals()))
                f.close()
                f = None
                executable = sys.executable
                if os.name == 'nt':
                    dirname, basename = os.path.split(executable)
                    alt = os.path.join(dirname, 'pythonw.exe')
                    use_alt = (
                        basename.lower() == 'python.exe' and
                        os.path.exists(alt)
                    )
                    if use_alt:
                        # use pythonw.exe to avoid opening a console window
                        executable = alt

                from distutils.spawn import spawn

                spawn([executable, '-E', '-c', 'pass'], 0)

                if os.path.exists(ok_file):
                    log.info(
                        "TEST PASSED: %s appears to support .pth files",
                        instdir
                    )
                    return True
            finally:
                if f:
                    f.close()
                if os.path.exists(ok_file):
                    os.unlink(ok_file)
                if os.path.exists(pth_file):
                    os.unlink(pth_file)
        if not self.multi_version:
            log.warn("TEST FAILED: %s does NOT support .pth files", instdir)
        return False

    def install_egg_scripts(self, dist):
        """Write all the scripts for `dist`, unless scripts are excluded"""
        if not self.exclude_scripts and dist.metadata_isdir('scripts'):
            for script_name in dist.metadata_listdir('scripts'):
                if dist.metadata_isdir('scripts/' + script_name):
                    # The "script" is a directory, likely a Python 3
                    # __pycache__ directory, so skip it.
                    continue
                self.install_script(
                    dist, script_name,
                    dist.get_metadata('scripts/' + script_name)
                )
        self.install_wrapper_scripts(dist)

    def add_output(self, path):
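        # Record an installed path in self.outputs; directories are walked so
        # that every contained file is listed individually (these entries feed
        # e.g. the --record file written in run()).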
        if os.path.isdir(path):
            for base, dirs, files in os.walk(path):
                for filename in files:
                    self.outputs.append(os.path.join(base, filename))
        else:
            self.outputs.append(path)

    def not_editable(self, spec):
        if self.editable:
            raise DistutilsArgError(
                "Invalid argument %r: you can't use filenames or URLs "
                "with --editable (except via the --find-links option)."
                % (spec,)
            )

    def check_editable(self, spec):
        if not self.editable:
            return

        if os.path.exists(os.path.join(self.build_directory, spec.key)):
            raise DistutilsArgError(
                "%r already exists in %s; can't do a checkout there" %
                (spec.key, self.build_directory)
            )

    @contextlib.contextmanager
    def _tmpdir(self):
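        # Context manager: create a temporary working directory and remove it
        # again when the block exits (if it still exists).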
        tmpdir = tempfile.mkdtemp(prefix=six.u("easy_install-"))
        try:
            # cast to str as workaround for #709 and #710 and #712
            yield str(tmpdir)
        finally:
            if os.path.exists(tmpdir):
                rmtree(rmtree_safe(tmpdir))

    def easy_install(self, spec, deps=False):
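        # Main per-requirement entry point: `spec` may be a URL, an existing
        # file or directory, or a requirement string/object; locate it and
        # install it, following dependencies when `deps` is true.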
        if not self.editable:
            self.install_site_py()

        with self._tmpdir() as tmpdir:
            if not isinstance(spec, Requirement):
                if URL_SCHEME(spec):
                    # It's a url, download it to tmpdir and process
                    self.not_editable(spec)
                    dl = self.package_index.download(spec, tmpdir)
                    return self.install_item(None, dl, tmpdir, deps, True)

                elif os.path.exists(spec):
                    # Existing file or directory, just process it directly
                    self.not_editable(spec)
                    return self.install_item(None, spec, tmpdir, deps, True)
                else:
                    spec = parse_requirement_arg(spec)

            self.check_editable(spec)
            dist = self.package_index.fetch_distribution(
                spec, tmpdir, self.upgrade, self.editable,
                not self.always_copy, self.local_index
            )
            if dist is None:
                msg = "Could not find suitable distribution for %r" % spec
                if self.always_copy:
                    msg += " (--always-copy skips system and development eggs)"
                raise DistutilsError(msg)
            elif dist.precedence == DEVELOP_DIST:
                # .egg-info dists don't need installing, just process deps
                self.process_distribution(spec, dist, deps, "Using")
                return dist
            else:
                return self.install_item(spec, dist.location, tmpdir, deps)

    def install_item(self, spec, download, tmpdir, deps, install_needed=False):
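        # Decide whether `download` (the local path located for `spec`) needs
        # to be built/copied into the install directory or can be used where
        # it is; install as needed and return the dist satisfying `spec`.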

        # Installation is also needed if the file is in tmpdir or is not an egg
        install_needed = install_needed or self.always_copy
        install_needed = install_needed or os.path.dirname(download) == tmpdir
        install_needed = install_needed or not download.endswith('.egg')
        install_needed = install_needed or (
            self.always_copy_from is not None and
            os.path.dirname(normalize_path(download)) ==
            normalize_path(self.always_copy_from)
        )

        if spec and not install_needed:
            # at this point, we know it's a local .egg, we just don't know if
            # it's already installed.
            for dist in self.local_index[spec.project_name]:
                if dist.location == download:
                    break
            else:
                install_needed = True  # it's not in the local index

        log.info("Processing %s", os.path.basename(download))

        if install_needed:
            dists = self.install_eggs(spec, download, tmpdir)
            for dist in dists:
                self.process_distribution(spec, dist, deps)
        else:
            dists = [self.egg_distribution(download)]
            self.process_distribution(spec, dists[0], deps, "Using")

        if spec is not None:
            for dist in dists:
                if dist in spec:
                    return dist

    def select_scheme(self, name):
        """Sets the install directories by applying the install schemes."""
        # it's the caller's problem if they supply a bad name!
        scheme = INSTALL_SCHEMES[name]
        for key in SCHEME_KEYS:
            attrname = 'install_' + key
            if getattr(self, attrname) is None:
                setattr(self, attrname, scheme[key])

    def process_distribution(self, requirement, dist, deps=True, *info):
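        # Register the freshly installed `dist` (update the .pth file, the
        # package and local indexes, and its scripts) and, unless told
        # otherwise, resolve and install its dependencies.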
        self.update_pth(dist)
        self.package_index.add(dist)
        if dist in self.local_index[dist.key]:
            self.local_index.remove(dist)
        self.local_index.add(dist)
        self.install_egg_scripts(dist)
        self.installed_projects[dist.key] = dist
        log.info(self.installation_report(requirement, dist, *info))
        if (dist.has_metadata('dependency_links.txt') and
                not self.no_find_links):
            self.package_index.add_find_links(
                dist.get_metadata_lines('dependency_links.txt')
            )
        if not deps and not self.always_copy:
            return
        elif requirement is not None and dist.key != requirement.key:
            log.warn("Skipping dependencies for %s", dist)
            return  # XXX this is not the distribution we were looking for
        elif requirement is None or dist not in requirement:
            # if we wound up with a different version, resolve what we've got
            distreq = dist.as_requirement()
            requirement = Requirement(str(distreq))
        log.info("Processing dependencies for %s", requirement)
        try:
            distros = WorkingSet([]).resolve(
                [requirement], self.local_index, self.easy_install
            )
        except DistributionNotFound as e:
            raise DistutilsError(str(e))
        except VersionConflict as e:
            raise DistutilsError(e.report())
        if self.always_copy or self.always_copy_from:
            # Force all the relevant distros to be copied or activated
            for dist in distros:
                if dist.key not in self.installed_projects:
                    self.easy_install(dist.as_requirement())
        log.info("Finished processing dependencies for %s", requirement)

    def should_unzip(self, dist):
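        # An explicit zip_ok setting wins; otherwise extract eggs that are
        # marked 'not-zip-safe' or that carry no zip-safety metadata at all.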
        if self.zip_ok is not None:
            return not self.zip_ok
        if dist.has_metadata('not-zip-safe'):
            return True
        if not dist.has_metadata('zip-safe'):
            return True
        return False

    def maybe_move(self, spec, dist_filename, setup_base):
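        # When a build directory (-b) was requested, move the unpacked source
        # tree into build_directory/<project key> so it is kept after the
        # install; return the directory the setup script should be run from.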
        dst = os.path.join(self.build_directory, spec.key)
        if os.path.exists(dst):
            msg = (
                "%r already exists in %s; build directory %s will not be kept"
            )
            log.warn(msg, spec.key, self.build_directory, setup_base)
            return setup_base
        if os.path.isdir(dist_filename):
            setup_base = dist_filename
        else:
            if os.path.dirname(dist_filename) == setup_base:
                os.unlink(dist_filename)  # get it out of the tmp dir
            contents = os.listdir(setup_base)
            if len(contents) == 1:
                dist_filename = os.path.join(setup_base, contents[0])
                if os.path.isdir(dist_filename):
                    # if the only thing there is a directory, move it instead
                    setup_base = dist_filename
        ensure_directory(dst)
        shutil.move(setup_base, dst)
        return dst

    def install_wrapper_scripts(self, dist):
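        # Generate and install the entry-point wrapper scripts declared by
        # `dist`, unless scripts are excluded.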
        if self.exclude_scripts:
            return
        for args in ScriptWriter.best().get_args(dist):
            self.write_script(*args)

    def install_script(self, dist, script_name, script_text, dev_path=None):
        """Generate a legacy script wrapper and install it"""
        spec = str(dist.as_requirement())
        is_script = is_python_script(script_text, script_name)

        if is_script:
            body = self._load_template(dev_path) % locals()
            script_text = ScriptWriter.get_header(script_text) + body
        self.write_script(script_name, _to_ascii(script_text), 'b')

    @staticmethod
    def _load_template(dev_path):
        """
        There are a couple of template scripts in the package. This
        function loads one of them and prepares it for use.
        """
        # See https://github.com/pypa/setuptools/issues/134 for info
        # on script file naming and downstream issues with SVR4
        name = 'script.tmpl'
        if dev_path:
            name = name.replace('.tmpl', ' (dev).tmpl')

        raw_bytes = resource_string('setuptools', name)
        return raw_bytes.decode('utf-8')

    def write_script(self, script_name, contents, mode="t", blockers=()):
        """Write an executable file to the scripts directory"""
        self.delete_blockers(  # clean up old .py/.pyw w/o a script
            [os.path.join(self.script_dir, x) for x in blockers]
        )
        log.info("Installing %s script to %s", script_name, self.script_dir)
        target = os.path.join(self.script_dir, script_name)
        self.add_output(target)

        if self.dry_run:
            return

        mask = current_umask()
        ensure_directory(target)
        if os.path.exists(target):
            os.unlink(target)
        with open(target, "w" + mode) as f:
            f.write(contents)
        chmod(target, 0o777 - mask)

    def install_eggs(self, spec, dist_filename, tmpdir):
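        # Turn `dist_filename` (an egg, wininst .exe, wheel, archive, or
        # source directory) into one or more installed eggs.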
        # .egg dirs or files are already built, so just return them
        if dist_filename.lower().endswith('.egg'):
            return [self.install_egg(dist_filename, tmpdir)]
        elif dist_filename.lower().endswith('.exe'):
            return [self.install_exe(dist_filename, tmpdir)]
        elif dist_filename.lower().endswith('.whl'):
            return [self.install_wheel(dist_filename, tmpdir)]

        # Anything else, try to extract and build
        setup_base = tmpdir
        if os.path.isfile(dist_filename) and not dist_filename.endswith('.py'):
            unpack_archive(dist_filename, tmpdir, self.unpack_progress)
        elif os.path.isdir(dist_filename):
            setup_base = os.path.abspath(dist_filename)

        if (setup_base.startswith(tmpdir)  # something we downloaded
                and self.build_directory and spec is not None):
            setup_base = self.maybe_move(spec, dist_filename, setup_base)

        # Find the setup.py file
        setup_script = os.path.join(setup_base, 'setup.py')

        if not os.path.exists(setup_script):
            setups = glob(os.path.join(setup_base, '*', 'setup.py'))
            if not setups:
                raise DistutilsError(
                    "Couldn't find a setup script in %s" %
                    os.path.abspath(dist_filename)
                )
            if len(setups) > 1:
                raise DistutilsError(
                    "Multiple setup scripts in %s" %
                    os.path.abspath(dist_filename)
                )
            setup_script = setups[0]

        # Now run it, and return the result
        if self.editable:
            log.info(self.report_editable(spec, setup_script))
            return []
        else:
            return self.build_and_install(setup_script, setup_base)

    def egg_distribution(self, egg_path):
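        # Build a pkg_resources Distribution for an egg file or directory,
        # reading its metadata from EGG-INFO (or from the zipped egg).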
        if os.path.isdir(egg_path):
            metadata = PathMetadata(egg_path, os.path.join(egg_path,
                                                           'EGG-INFO'))
        else:
            metadata = EggMetadata(zipimport.zipimporter(egg_path))
        return Distribution.from_filename(egg_path, metadata=metadata)

    def install_egg(self, egg_path, tmpdir):
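        # Copy, move, or extract a built egg into the install directory and
        # return a Distribution describing the installed copy.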
        destination = os.path.join(
            self.install_dir,
            os.path.basename(egg_path),
        )
        destination = os.path.abspath(destination)
        if not self.dry_run:
            ensure_directory(destination)

        dist = self.egg_distribution(egg_path)
        if not samefile(egg_path, destination):
            if os.path.isdir(destination) and not os.path.islink(destination):
                dir_util.remove_tree(destination, dry_run=self.dry_run)
            elif os.path.exists(destination):
                self.execute(
                    os.unlink,
                    (destination,),
                    "Removing " + destination,
                )
            try:
                new_dist_is_zipped = False
                if os.path.isdir(egg_path):
                    if egg_path.startswith(tmpdir):
                        f, m = shutil.move, "Moving"
                    else:
                        f, m = shutil.copytree, "Copying"
                elif self.should_unzip(dist):
                    self.mkpath(destination)
                    f, m = self.unpack_and_compile, "Extracting"
                else:
                    new_dist_is_zipped = True
                    if egg_path.startswith(tmpdir):
                        f, m = shutil.move, "Moving"
                    else:
                        f, m = shutil.copy2, "Copying"
                self.execute(
                    f,
                    (egg_path, destination),
                    (m + " %s to %s") % (
                        os.path.basename(egg_path),
                        os.path.dirname(destination)
                    ),
                )
                update_dist_caches(
                    destination,
                    fix_zipimporter_caches=new_dist_is_zipped,
                )
            except Exception:
                update_dist_caches(destination, fix_zipimporter_caches=False)
                raise

        self.add_output(destination)
        return self.egg_distribution(destination)

    def install_exe(self, dist_filename, tmpdir):
        # See if it's valid, get data
        cfg = extract_wininst_cfg(dist_filename)
        if cfg is None:
            raise DistutilsError(
                "%s is not a valid distutils Windows .exe" % dist_filename
            )
        # Create a dummy distribution object until we build the real distro
        dist = Distribution(
            None,
            project_name=cfg.get('metadata', 'name'),
            version=cfg.get('metadata', 'version'), platform=get_platform(),
        )

        # Convert the .exe to an unpacked egg
        egg_path = os.path.join(tmpdir, dist.egg_name() + '.egg')
        dist.location = egg_path
        egg_tmp = egg_path + '.tmp'
        _egg_info = os.path.join(egg_tmp, 'EGG-INFO')
        pkg_inf = os.path.join(_egg_info, 'PKG-INFO')
        ensure_directory(pkg_inf)  # make sure EGG-INFO dir exists
        dist._provider = PathMetadata(egg_tmp, _egg_info)  # XXX
        self.exe_to_egg(dist_filename, egg_tmp)

        # Write EGG-INFO/PKG-INFO
        if not os.path.exists(pkg_inf):
            f = open(pkg_inf, 'w')
            f.write('Metadata-Version: 1.0\n')
            for k, v in cfg.items('metadata'):
                if k != 'target_version':
                    f.write('%s: %s\n' % (k.replace('_', '-').title(), v))
            f.close()
        script_dir = os.path.join(_egg_info, 'scripts')
        # delete entry-point scripts to avoid duping
        self.delete_blockers([
            os.path.join(script_dir, args[0])
            for args in ScriptWriter.get_args(dist)
        ])
        # Build .egg file from tmpdir
        bdist_egg.make_zipfile(
            egg_path, egg_tmp, verbose=self.verbose, dry_run=self.dry_run,
        )
        # install the .egg
        return self.install_egg(egg_path, tmpdir)

    def exe_to_egg(self, dist_filename, egg_tmp):
        """Extract a bdist_wininst to the directories an egg would use"""
        # Check for .pth file and set up prefix translations
        prefixes = get_exe_prefixes(dist_filename)
        to_compile = []
        native_libs = []
        top_level = {}

        def process(src, dst):
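            # unpack_archive filter: translate bdist_wininst paths into the
            # egg layout, recording native extensions and .py files that will
            # need byte-compiling.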
            s = src.lower()
            for old, new in prefixes:
                if s.startswith(old):
                    src = new + src[len(old):]
                    parts = src.split('/')
                    dst = os.path.join(egg_tmp, *parts)
                    dl = dst.lower()
                    if dl.endswith('.pyd') or dl.endswith('.dll'):
                        parts[-1] = bdist_egg.strip_module(parts[-1])
                        top_level[os.path.splitext(parts[0])[0]] = 1
                        native_libs.append(src)
                    elif dl.endswith('.py') and old != 'SCRIPTS/':
                        top_level[os.path.splitext(parts[0])[0]] = 1
                        to_compile.append(dst)
                    return dst
            if not src.endswith('.pth'):
                log.warn("WARNING: can't process %s", src)
            return None

        # extract, tracking .pyd/.dll->native_libs and .py -> to_compile
        unpack_archive(dist_filename, egg_tmp, process)
        stubs = []
        for res in native_libs:
            if res.lower().endswith('.pyd'):  # create stubs for .pyd's
                parts = res.split('/')
                resource = parts[-1]
                parts[-1] = bdist_egg.strip_module(parts[-1]) + '.py'
                pyfile = os.path.join(egg_tmp, *parts)
                to_compile.append(pyfile)
                stubs.append(pyfile)
                bdist_egg.write_stub(resource, pyfile)
        self.byte_compile(to_compile)  # compile .py's
        bdist_egg.write_safety_flag(
            os.path.join(egg_tmp, 'EGG-INFO'),
            bdist_egg.analyze_egg(egg_tmp, stubs))  # write zip-safety flag

        for name in 'top_level', 'native_libs':
            if locals()[name]:
                txt = os.path.join(egg_tmp, 'EGG-INFO', name + '.txt')
                if not os.path.exists(txt):
                    f = open(txt, 'w')
                    f.write('\n'.join(locals()[name]) + '\n')
                    f.close()

    def install_wheel(self, wheel_path, tmpdir):
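        # Install a wheel by converting it to an unpacked egg directory in
        # the install directory; return the resulting Distribution.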
        wheel = Wheel(wheel_path)
        assert wheel.is_compatible()
        destination = os.path.join(self.install_dir, wheel.egg_name())
        destination = os.path.abspath(destination)
        if not self.dry_run:
            ensure_directory(destination)
        if os.path.isdir(destination) and not os.path.islink(destination):
            dir_util.remove_tree(destination, dry_run=self.dry_run)
        elif os.path.exists(destination):
            self.execute(
                os.unlink,
                (destination,),
                "Removing " + destination,
            )
        try:
            self.execute(
                wheel.install_as_egg,
                (destination,),
                ("Installing %s to %s") % (
                    os.path.basename(wheel_path),
                    os.path.dirname(destination)
                ),
            )
        finally:
            update_dist_caches(destination, fix_zipimporter_caches=False)
        self.add_output(destination)
        return self.egg_distribution(destination)

    __mv_warning = textwrap.dedent("""
        Because this distribution was installed --multi-version, before you can
        import modules from this package in an application, you will need to
        'import pkg_resources' and then use a 'require()' call similar to one of
        these examples, in order to select the desired version:

            pkg_resources.require("%(name)s")  # latest installed version
            pkg_resources.require("%(name)s==%(version)s")  # this exact version
            pkg_resources.require("%(name)s>=%(version)s")  # this version or higher
        """).lstrip()

    __id_warning = textwrap.dedent("""
        Note also that the installation directory must be on sys.path at runtime for
        this to work.  (e.g. by being the application's script directory, by being on
        PYTHONPATH, or by being added to sys.path by your code.)
        """)

    def installation_report(self, req, dist, what="Installed"):
        """Helpful installation message for display to package users"""
        msg = "\n%(what)s %(eggloc)s%(extras)s"
        if self.multi_version and not self.no_report:
            msg += '\n' + self.__mv_warning
            if self.install_dir not in map(normalize_path, sys.path):
                msg += '\n' + self.__id_warning

        eggloc = dist.location
        name = dist.project_name
        version = dist.version
        extras = ''  # TODO: self.report_extras(req, dist)
        return msg % locals()

    __editable_msg = textwrap.dedent("""
        Extracted editable version of %(spec)s to %(dirname)s

        If it uses setuptools in its setup script, you can activate it in
        "development" mode by going to that directory and running::

            %(python)s setup.py develop

        See the setuptools documentation for the "develop" command for more info.
        """).lstrip()

    def report_editable(self, spec, setup_script):
        dirname = os.path.dirname(setup_script)
        python = sys.executable
        return '\n' + self.__editable_msg % locals()

    def run_setup(self, setup_script, setup_base, args):
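        # Run the target's setup script with the given arguments, propagating
        # our verbosity and dry-run settings.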
        sys.modules.setdefault('distutils.command.bdist_egg', bdist_egg)
        sys.modules.setdefault('distutils.command.egg_info', egg_info)

        args = list(args)
        if self.verbose > 2:
            v = 'v' * (self.verbose - 1)
            args.insert(0, '-' + v)
        elif self.verbose < 2:
            args.insert(0, '-q')
        if self.dry_run:
            args.insert(0, '-n')
        log.info(
            "Running %s %s", setup_script[len(setup_base) + 1:], ' '.join(args)
        )
        try:
            run_setup(setup_script, args)
        except SystemExit as v:
            raise DistutilsError("Setup script exited with %s" % (v.args[0],))

    def build_and_install(self, setup_script, setup_base):
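        # Build the source tree with 'bdist_egg' into a temporary dist
        # directory, then install every egg produced there.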
        args = ['bdist_egg', '--dist-dir']

        dist_dir = tempfile.mkdtemp(
            prefix='egg-dist-tmp-', dir=os.path.dirname(setup_script)
        )
        try:
            self._set_fetcher_options(os.path.dirname(setup_script))
            args.append(dist_dir)

            self.run_setup(setup_script, setup_base, args)
            all_eggs = Environment([dist_dir])
            eggs = []
            for key in all_eggs:
                for dist in all_eggs[key]:
                    eggs.append(self.install_egg(dist.location, setup_base))
            if not eggs and not self.dry_run:
                log.warn("No eggs found in %s (setup script problem?)",
                         dist_dir)
            return eggs
        finally:
            rmtree(dist_dir)
            log.set_verbosity(self.verbose)  # restore our log verbosity

    def _set_fetcher_options(self, base):
        """
        When easy_install is about to run bdist_egg on a source dist, that
        source dist might have 'setup_requires' directives, requiring
        additional fetching. Ensure the fetcher options given to easy_install
        are available to that command as well.
        """
        # find the fetch options from easy_install and write them out
        # to the setup.cfg file.
        ei_opts = self.distribution.get_option_dict('easy_install').copy()
        fetch_directives = (
            'find_links', 'site_dirs', 'index_url', 'optimize',
            'allow_hosts',
        )
        fetch_options = {}
        for key, val in ei_opts.items():
            if key not in fetch_directives:
                continue
            fetch_options[key.replace('_', '-')] = val[1]
        # create a settings dictionary suitable for `edit_config`
        settings = dict(easy_install=fetch_options)
        cfg_filename = os.path.join(base, 'setup.cfg')
        setopt.edit_config(cfg_filename, settings)

    def update_pth(self, dist):
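        # Keep the easy-install.pth file in sync: drop stale entries for this
        # project and, unless installing multi-version, add the new
        # distribution's location so it becomes the active version.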
        if self.pth_file is None:
            return

        for d in self.pth_file[dist.key]:  # drop old entries
            if self.multi_version or d.location != dist.location:
                log.info("Removing %s from easy-install.pth file", d)
                self.pth_file.remove(d)
                if d.location in self.shadow_path:
                    self.shadow_path.remove(d.location)

        if not self.multi_version:
            if dist.location in self.pth_file.paths:
                log.info(
                    "%s is already the active version in easy-install.pth",
                    dist,
                )
            else:
                log.info("Adding %s to easy-install.pth file", dist)
                self.pth_file.add(dist)  # add new entry
                if dist.location not in self.shadow_path:
                    self.shadow_path.append(dist.location)

        if not self.dry_run:

            self.pth_file.save()

            if dist.key == 'setuptools':
                # Ensure that setuptools itself never becomes unavailable!
                # XXX should this check for latest version?
                filename = os.path.join(self.install_dir, 'setuptools.pth')
                if os.path.islink(filename):
                    os.unlink(filename)
                f = open(filename, 'wt')
                f.write(self.pth_file.make_relative(dist.location) + '\n')
                f.close()

    def unpack_progress(self, src, dst):
        # Progress filter for unpacking
        log.debug("Unpacking %s to %s", src, dst)
        return dst  # only unpack-and-compile skips files for dry run

    def unpack_and_compile(self, egg_path, destination):
        to_compile = []
        to_chmod = []

        def pf(src, dst):
            if dst.endswith('.py') and not src.startswith('EGG-INFO/'):
                to_compile.append(dst)
            elif dst.endswith('.dll') or dst.endswith('.so'):
                to_chmod.append(dst)
            self.unpack_progress(src, dst)
            return dst if not self.dry_run else None

        unpack_archive(egg_path, destination, pf)
        self.byte_compile(to_compile)
        if not self.dry_run:
            for f in to_chmod:
                mode = ((os.stat(f)[stat.ST_MODE]) | 0o555) & 0o7755
                chmod(f, mode)

    def byte_compile(self, to_compile):
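        # Byte-compile the installed .py files (and optimize them when
        # --optimize was given), unless bytecode writing is disabled.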
        if sys.dont_write_bytecode:
            return

        from distutils.util import byte_compile

        try:
            # try to make the byte compile messages quieter
            log.set_verbosity(self.verbose - 1)

            byte_compile(to_compile, optimize=0, force=1, dry_run=self.dry_run)
            if self.optimize:
                byte_compile(
                    to_compile, optimize=self.optimize, force=1,
                    dry_run=self.dry_run,
                )
        finally:
            log.set_verbosity(self.verbose)  # restore original verbosity

    __no_default_msg = textwrap.dedent("""
        bad install directory or PYTHONPATH

        You are attempting to install a package to a directory that is not
        on PYTHONPATH and which Python does not read ".pth" files from.  The
        installation directory you specified (via --install-dir, --prefix, or
        the distutils default setting) was:

            %s

        and your PYTHONPATH environment variable currently contains:

            %r

        Here are some of your options for correcting the problem:

        * You can choose a different installation directory, i.e., one that is
          on PYTHONPATH or supports .pth files

        * You can add the installation directory to the PYTHONPATH environment
          variable.  (It must then also be on PYTHONPATH whenever you run
          Python and want to use the package(s) you are installing.)

        * You can set up the installation directory to support ".pth" files by
          using one of the approaches described here:

          https://setuptools.readthedocs.io/en/latest/easy_install.html#custom-installation-locations


        Please make the appropriate changes for your system and try again.""").lstrip()

    def no_default_version_msg(self):
        template = self.__no_default_msg
        return template % (self.install_dir, os.environ.get('PYTHONPATH', ''))

    def install_site_py(self):
        """Make sure there's a site.py in the target dir, if needed"""

        if self.sitepy_installed:
            return  # already did it, or don't need to

        sitepy = os.path.join(self.install_dir, "site.py")
        source = resource_string("setuptools", "site-patch.py")
        source = source.decode('utf-8')
        current = ""

        if os.path.exists(sitepy):
            log.debug("Checking existing site.py in %s", self.install_dir)
            with io.open(sitepy) as strm:
                current = strm.read()

            if not current.startswith('def __boot():'):
                raise DistutilsError(
                    "%s is not a setuptools-generated site.py; please"
                    " remove it." % sitepy
                )

        if current != source:
            log.info("Creating %s", sitepy)
            if not self.dry_run:
                ensure_directory(sitepy)
                with io.open(sitepy, 'w', encoding='utf-8') as strm:
                    strm.write(source)
            self.byte_compile([sitepy])

        self.sitepy_installed = True

    def create_home_path(self):
        """Create directories under ~."""
        if not self.user:
            return
        home = convert_path(os.path.expanduser("~"))
        for name, path in six.iteritems(self.config_vars):
            if path.startswith(home) and not os.path.isdir(path):
                self.debug_print("os.makedirs('%s', 0o700)" % path)
                os.makedirs(path, 0o700)

    INSTALL_SCHEMES = dict(
        posix=dict(
            install_dir='$base/lib/python$py_version_short/site-packages',
            script_dir='$base/bin',
        ),
    )

    DEFAULT_SCHEME = dict(
        install_dir='$base/Lib/site-packages',
        script_dir='$base/Scripts',
    )

    def _expand(self, *attrs):
        config_vars = self.get_finalized_command('install').config_vars

        if self.prefix:
            # Set default install_dir/scripts from --prefix
            config_vars = config_vars.copy()
            config_vars['base'] = self.prefix
            scheme = self.INSTALL_SCHEMES.get(os.name, self.DEFAULT_SCHEME)
            for attr, val in scheme.items():
                if getattr(self, attr, None) is None:
                    setattr(self, attr, val)

        from distutils.util import subst_vars

        for attr in attrs:
            val = getattr(self, attr)
            if val is not None:
                val = subst_vars(val, config_vars)
                if os.name == 'posix':
                    val = os.path.expanduser(val)
                setattr(self, attr, val)


def _pythonpath():
    items = os.environ.get('PYTHONPATH', '').split(os.pathsep)
    return filter(None, items)


def get_site_dirs():
    """
    Return a list of 'site' dirs
    """

    sitedirs = []

    # start with PYTHONPATH
    sitedirs.extend(_pythonpath())

    prefixes = [sys.prefix]
    if sys.exec_prefix != sys.prefix:
        prefixes.append(sys.exec_prefix)
    for prefix in prefixes:
        if prefix:
            if sys.platform in ('os2emx', 'riscos'):
                sitedirs.append(os.path.join(prefix, "Lib", "site-packages"))
            elif os.sep == '/':
                sitedirs.extend([
                    os.path.join(
                        prefix,
                        "lib",
                        "python" + sys.version[:3],
                        "site-packages",
                    ),
                    os.path.join(prefix, "lib", "site-python"),
                ])
            else:
                sitedirs.extend([
                    prefix,
                    os.path.join(prefix, "lib", "site-packages"),
                ])
            if sys.platform == 'darwin':
                # for framework builds *only* we add the standard Apple
                # locations. Currently only per-user, but /Library and
                # /Network/Library could be added too
                if 'Python.framework' in prefix:
                    home = os.environ.get('HOME')
                    if home:
                        home_sp = os.path.join(
                            home,
                            'Library',
                            'Python',
                            sys.version[:3],
                            'site-packages',
                        )
                        sitedirs.append(home_sp)
    lib_paths = get_path('purelib'), get_path('platlib')
    for site_lib in lib_paths:
        if site_lib not in sitedirs:
            sitedirs.append(site_lib)

    if site.ENABLE_USER_SITE:
        sitedirs.append(site.USER_SITE)

    try:
        sitedirs.extend(site.getsitepackages())
    except AttributeError:
        pass

    sitedirs = list(map(normalize_path, sitedirs))

    return sitedirs


def expand_paths(inputs):
    """Yield sys.path directories that might contain "old-style" packages"""

    seen = {}

    for dirname in inputs:
        dirname = normalize_path(dirname)
        if dirname in seen:
            continue

        seen[dirname] = 1
        if not os.path.isdir(dirname):
            continue

        files = os.listdir(dirname)
        yield dirname, files

        for name in files:
            if not name.endswith('.pth'):
                # We only care about the .pth files
                continue
            if name in ('easy-install.pth', 'setuptools.pth'):
                # Ignore .pth files that we control
                continue

            # Read the .pth file
            f = open(os.path.join(dirname, name))
            lines = list(yield_lines(f))
            f.close()

            # Yield existing non-dupe, non-import directory lines from it
            for line in lines:
                if not line.startswith("import"):
                    line = normalize_path(line.rstrip())
                    if line not in seen:
                        seen[line] = 1
                        if not os.path.isdir(line):
                            continue
                        yield line, os.listdir(line)


def extract_wininst_cfg(dist_filename):
    """Extract configuration data from a bdist_wininst .exe

    Returns a configparser.RawConfigParser, or None
    """
    f = open(dist_filename, 'rb')
    try:
        endrec = zipfile._EndRecData(f)
        if endrec is None:
            return None

        prepended = (endrec[9] - endrec[5]) - endrec[6]
        if prepended < 12:  # no wininst data here
            return None
        f.seek(prepended - 12)

        tag, cfglen, bmlen = struct.unpack("<iii", f.read(12))
        if tag not in (0x1234567A, 0x1234567B):
            return None  # not a valid tag

        f.seek(prepended - (12 + cfglen))
        init = {'version': '', 'target_version': ''}
        cfg = configparser.RawConfigParser(init)
        try:
            part = f.read(cfglen)
            # Read up to the first null byte.
            config = part.split(b'\0', 1)[0]
            # Now the config is in bytes, but for RawConfigParser, it should
            #  be text, so decode it.
            config = config.decode(sys.getfilesystemencoding())
            cfg.readfp(six.StringIO(config))
        except configparser.Error:
            return None
        if not cfg.has_section('metadata') or not cfg.has_section('Setup'):
            return None
        return cfg

    finally:
        f.close()


def get_exe_prefixes(exe_filename):
    """Get exe->egg path translations for a given .exe file"""

    prefixes = [
        ('PURELIB/', ''),
        ('PLATLIB/pywin32_system32', ''),
        ('PLATLIB/', ''),
        ('SCRIPTS/', 'EGG-INFO/scripts/'),
        ('DATA/lib/site-packages', ''),
    ]
    z = zipfile.ZipFile(exe_filename)
    try:
        for info in z.infolist():
            name = info.filename
            parts = name.split('/')
            if len(parts) == 3 and parts[2] == 'PKG-INFO':
                if parts[1].endswith('.egg-info'):
                    prefixes.insert(0, ('/'.join(parts[:2]), 'EGG-INFO/'))
                    break
            if len(parts) != 2 or not name.endswith('.pth'):
                continue
            if name.endswith('-nspkg.pth'):
                continue
            if parts[0].upper() in ('PURELIB', 'PLATLIB'):
                contents = z.read(name)
                if six.PY3:
                    contents = contents.decode()
                for pth in yield_lines(contents):
                    pth = pth.strip().replace('\\', '/')
                    if not pth.startswith('import'):
                        prefixes.append((('%s/%s/' % (parts[0], pth)), ''))
    finally:
        z.close()
    prefixes = [(x.lower(), y) for x, y in prefixes]
    prefixes.sort()
    prefixes.reverse()
    return prefixes


class PthDistributions(Environment):
    """A .pth file with Distribution paths in it"""

    dirty = False

    def __init__(self, filename, sitedirs=()):
        self.filename = filename
        self.sitedirs = list(map(normalize_path, sitedirs))
        self.basedir = normalize_path(os.path.dirname(self.filename))
        self._load()
        Environment.__init__(self, [], None, None)
        for path in yield_lines(self.paths):
            list(map(self.add, find_distributions(path, True)))

    def _load(self):
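        # Parse the existing .pth file (if any): note whether it already has
        # import lines, normalize each path entry, and drop non-existent or
        # duplicate paths as well as paths already covered by the site dirs.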
        self.paths = []
        saw_import = False
        seen = dict.fromkeys(self.sitedirs)
        if os.path.isfile(self.filename):
            f = open(self.filename, 'rt')
            for line in f:
                if line.startswith('import'):
                    saw_import = True
                    continue
                path = line.rstrip()
                self.paths.append(path)
                if not path.strip() or path.strip().startswith('#'):
                    continue
                # skip non-existent paths, in case somebody deleted a package
                # manually, and duplicate paths as well
                path = self.paths[-1] = normalize_path(
                    os.path.join(self.basedir, path)
                )
                if not os.path.exists(path) or path in seen:
                    self.paths.pop()  # skip it
                    self.dirty = True  # we cleaned up, so we're dirty now :)
                    continue
                seen[path] = 1
            f.close()

        if self.paths and not saw_import:
            self.dirty = True  # ensure anything we touch has import wrappers
        while self.paths and not self.paths[-1].strip():
            self.paths.pop()

    def save(self):
        """Write changed .pth file back to disk"""
        if not self.dirty:
            return

        rel_paths = list(map(self.make_relative, self.paths))
        if rel_paths:
            log.debug("Saving %s", self.filename)
            lines = self._wrap_lines(rel_paths)
            data = '\n'.join(lines) + '\n'

            if os.path.islink(self.filename):
                os.unlink(self.filename)
            with open(self.filename, 'wt') as f:
                f.write(data)

        elif os.path.exists(self.filename):
            log.debug("Deleting empty %s", self.filename)
            os.unlink(self.filename)

        self.dirty = False

    @staticmethod
    def _wrap_lines(lines):
        return lines

    def add(self, dist):
        """Add `dist` to the distribution map"""
        new_path = (
            dist.location not in self.paths and (
                dist.location not in self.sitedirs or
                # account for '.' being in PYTHONPATH
                dist.location == os.getcwd()
            )
        )
        if new_path:
            self.paths.append(dist.location)
            self.dirty = True
        Environment.add(self, dist)

    def remove(self, dist):
        """Remove `dist` from the distribution map"""
        while dist.location in self.paths:
            self.paths.remove(dist.location)
            self.dirty = True
        Environment.remove(self, dist)

    def make_relative(self, path):
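        # Express `path` relative to the directory holding the .pth file when
        # it lies beneath that directory; otherwise return it unchanged.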
        npath, last = os.path.split(normalize_path(path))
        baselen = len(self.basedir)
        parts = [last]
        sep = '/' if os.altsep == '/' else os.sep
        while len(npath) >= baselen:
            if npath == self.basedir:
                parts.append(os.curdir)
                parts.reverse()
                return sep.join(parts)
            npath, last = os.path.split(npath)
            parts.append(last)
        else:
            return path


class RewritePthDistributions(PthDistributions):
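    # PthDistributions variant whose saved .pth file wraps the path entries
    # in a prelude/postlude that re-insert the new sys.path entries at a
    # controlled position.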
    @classmethod
    def _wrap_lines(cls, lines):
        yield cls.prelude
        for line in lines:
            yield line
        yield cls.postlude

    prelude = _one_liner("""
        import sys
        sys.__plen = len(sys.path)
        """)
    postlude = _one_liner("""
        import sys
        new = sys.path[sys.__plen:]
        del sys.path[sys.__plen:]
        p = getattr(sys, '__egginsert', 0)
        sys.path[p:p] = new
        sys.__egginsert = p + len(new)
        """)


if os.environ.get('SETUPTOOLS_SYS_PATH_TECHNIQUE', 'raw') == 'rewrite':
    PthDistributions = RewritePthDistributions


def _first_line_re():
    """
    Return a regular expression based on first_line_re suitable for matching
    strings.
    """
    if isinstance(first_line_re.pattern, str):
        return first_line_re

    # first_line_re in Python >=3.1.4 and >=3.2.1 is a bytes pattern.
    return re.compile(first_line_re.pattern.decode())


def auto_chmod(func, arg, exc):
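    # Error handler for directory-tree removal (e.g. shutil.rmtree's onerror):
    # on Windows, make the file writable and retry the failed removal;
    # otherwise re-raise with extra context about the failing call.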
    if func in [os.unlink, os.remove] and os.name == 'nt':
        chmod(arg, stat.S_IWRITE)
        return func(arg)
    et, ev, _ = sys.exc_info()
    six.reraise(et, (ev[0], ev[1] + (" %s %s" % (func, arg))))


def update_dist_caches(dist_path, fix_zipimporter_caches):
    """
    Fix any globally cached `dist_path` related data

    `dist_path` should be a path of a newly installed egg distribution (zipped
    or unzipped).

    sys.path_importer_cache contains finder objects that have been cached when
    importing data from the original distribution. Any such finders need to be
    cleared since the replacement distribution might be packaged differently,
    e.g. a zipped egg distribution might get replaced with an unzipped egg
    folder or vice versa. Having the old finders cached may then cause Python
    to attempt loading modules from the replacement distribution using an
    incorrect loader.

    zipimport.zipimporter objects are Python loaders charged with importing
    data packaged inside zip archives. If stale loaders referencing the
    original distribution are left behind, they can fail to load modules from
    the replacement distribution. E.g. if an old zipimport.zipimporter instance
    is used to load data from a new zipped egg archive, it may cause the
    operation to attempt to locate the requested data in the wrong location -
    one indicated by the original distribution's zip archive directory
    information. Such an operation may then fail outright, e.g. report having
    read a 'bad local file header', or even worse, it may fail silently &
    return invalid data.

    zipimport._zip_directory_cache contains cached zip archive directory
    information for all existing zipimport.zipimporter instances and all such
    instances connected to the same archive share the same cached directory
    information.

    If asked, and the underlying Python implementation allows it, we can fix
    all existing zipimport.zipimporter instances instead of having to track
    them down and remove them one by one, by updating their shared cached zip
    archive directory information. This, of course, assumes that the
    replacement distribution is packaged as a zipped egg.

    If not asked to fix existing zipimport.zipimporter instances, we still do
    our best to clear any remaining zipimport.zipimporter related cached data
    that might somehow later get used when attempting to load data from the new
    distribution and thus cause such load operations to fail. Note that when
    tracking down such remaining stale data, we can not catch every conceivable
    usage from here, and we clear only those that we know of and have found to
    cause problems if left alive. Any remaining caches should be updated by
    whoever is in charge of maintaining them, i.e. they should be ready to
    handle us replacing their zip archives with new distributions at runtime.

    """
    # There are several other known sources of stale zipimport.zipimporter
    # instances that we do not clear here, but might if ever given a reason to
    # do so:
    # * Global setuptools pkg_resources.working_set (a.k.a. 'master working
    #   set') may contain distributions which may in turn contain their
    #   zipimport.zipimporter loaders.
    # * Several zipimport.zipimporter loaders held by local variables further
    #   up the function call stack when running the setuptools installation.
    # * Already loaded modules may have their __loader__ attribute set to the
    #   exact loader instance used when importing them. Python 3.4 docs state
    #   that this information is intended mostly for introspection and so is
    #   not expected to cause us problems.
    normalized_path = normalize_path(dist_path)
    _uncache(normalized_path, sys.path_importer_cache)
    if fix_zipimporter_caches:
        _replace_zip_directory_cache_data(normalized_path)
    else:
        # Here, even though we do not want to fix existing and now stale
        # zipimporter cache information, we still want to remove it. Related to
        # Python's zip archive directory information cache, we clear each of
        # its stale entries in two phases:
        #   1. Clear the entry so attempting to access zip archive information
        #      via any existing stale zipimport.zipimporter instances fails.
        #   2. Remove the entry from the cache so any newly constructed
        #      zipimport.zipimporter instances do not end up using old stale
        #      zip archive directory information.
        # This whole stale data removal step does not seem strictly necessary,
        # but has been left in because it was done before we started replacing
        # the zip archive directory information cache content if possible, and
        # there are no relevant unit tests that we can depend on to tell us if
        # this is really needed.
        _remove_and_clear_zip_directory_cache_data(normalized_path)


def _collect_zipimporter_cache_entries(normalized_path, cache):
    """
    Return zipimporter cache entry keys related to a given normalized path.

    Alternative path spellings (e.g. those using different character case or
    those using alternative path separators) related to the same path are
    included. Any sub-path entries are included as well, i.e. those
    corresponding to zip archives embedded in other zip archives.

    """
    result = []
    prefix_len = len(normalized_path)
    for p in cache:
        np = normalize_path(p)
        if (np.startswith(normalized_path) and
                np[prefix_len:prefix_len + 1] in (os.sep, '')):
            result.append(p)
    return result


def _update_zipimporter_cache(normalized_path, cache, updater=None):
    """
    Update zipimporter cache data for a given normalized path.

    Any sub-path entries are processed as well, i.e. those corresponding to zip
    archives embedded in other zip archives.

    Given updater is a callable taking a cache entry key and the original entry
    (after already removing the entry from the cache), and expected to update
    the entry and possibly return a new one to be inserted in its place.
    Returning None indicates that the entry should not be replaced with a new
    one. If no updater is given, the cache entries are simply removed without
    any additional processing, the same as if the updater simply returned None.

    """
    for p in _collect_zipimporter_cache_entries(normalized_path, cache):
        # N.B. pypy's custom zipimport._zip_directory_cache implementation does
        # not support the complete dict interface:
        # * Does not support item assignment, thus not allowing this function
        #   to be used only for removing existing cache entries.
        # * Does not support the dict.pop() method, forcing us to use the
        #   get/del patterns instead. For more detailed information see the
        #   following links:
        #     https://github.com/pypa/setuptools/issues/202#issuecomment-202913420
        #     http://bit.ly/2h9itJX
        old_entry = cache[p]
        del cache[p]
        new_entry = updater and updater(p, old_entry)
        if new_entry is not None:
            cache[p] = new_entry


def _uncache(normalized_path, cache):
    _update_zipimporter_cache(normalized_path, cache)


def _remove_and_clear_zip_directory_cache_data(normalized_path):
    def clear_and_remove_cached_zip_archive_directory_data(path, old_entry):
        old_entry.clear()

    _update_zipimporter_cache(
        normalized_path, zipimport._zip_directory_cache,
        updater=clear_and_remove_cached_zip_archive_directory_data)


# PyPy Python implementation does not allow directly writing to the
# zipimport._zip_directory_cache and so prevents us from attempting to correct
# its content. The best we can do there is clear the problematic cache content
# and have PyPy repopulate it as needed. The downside is that if there are any
# stale zipimport.zipimporter instances laying around, attempting to use them
# will fail due to not having its zip archive directory information available
# instead of being automatically corrected to use the new correct zip archive
# directory information.
if '__pypy__' in sys.builtin_module_names:
    _replace_zip_directory_cache_data = \
        _remove_and_clear_zip_directory_cache_data
else:

    def _replace_zip_directory_cache_data(normalized_path):
        def replace_cached_zip_archive_directory_data(path, old_entry):
            # N.B. In theory, we could load the zip directory information just
            # once for all updated path spellings, and then copy it locally and
            # update its contained path strings to contain the correct
            # spelling, but that seems like a way too invasive move (this cache
            # structure is not officially documented anywhere and could in
            # theory change with new Python releases) for no significant
            # benefit.
            old_entry.clear()
            zipimport.zipimporter(path)
            old_entry.update(zipimport._zip_directory_cache[path])
            return old_entry

        _update_zipimporter_cache(
            normalized_path, zipimport._zip_directory_cache,
            updater=replace_cached_zip_archive_directory_data)


def is_python(text, filename='<string>'):
    "Is this string a valid Python script?"
    try:
        compile(text, filename, 'exec')
    except (SyntaxError, TypeError):
        return False
    else:
        return True


def is_sh(executable):
    """Determine if the specified executable is a .sh (contains a #! line)"""
    try:
        with io.open(executable, encoding='latin-1') as fp:
            magic = fp.read(2)
    except (OSError, IOError):
        return executable
    return magic == '#!'


def nt_quote_arg(arg):
    """Quote a command line argument according to Windows parsing rules"""
    return subprocess.list2cmdline([arg])
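# Illustrative example (not part of the original source):
# nt_quote_arg('C:\\Program Files\\pythonw.exe') should come back as
# '"C:\\Program Files\\pythonw.exe"', since list2cmdline wraps arguments
# containing spaces in double quotes per the MS C runtime parsing rules.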


def is_python_script(script_text, filename):
    """Is this text, as a whole, a Python script? (as opposed to shell/bat/etc.
    """
    if filename.endswith('.py') or filename.endswith('.pyw'):
        return True  # extension says it's Python
    if is_python(script_text, filename):
        return True  # it's syntactically valid Python
    if script_text.startswith('#!'):
        # It begins with a '#!' line, so check if 'python' is in it somewhere
        return 'python' in script_text.splitlines()[0].lower()

    return False  # Not any Python I can recognize


try:
    from os import chmod as _chmod
except ImportError:
    # Jython compatibility
    def _chmod(*args):
        pass


def chmod(path, mode):
    log.debug("changing mode of %s to %o", path, mode)
    try:
        _chmod(path, mode)
    except os.error as e:
        log.debug("chmod failed: %s", e)


class CommandSpec(list):
    """
    A command spec for a #! header, specified as a list of arguments akin to
    those passed to Popen.
    """

    options = []
    split_args = dict()

    @classmethod
    def best(cls):
        """
        Choose the best CommandSpec class based on environmental conditions.
        """
        return cls

    @classmethod
    def _sys_executable(cls):
        _default = os.path.normpath(sys.executable)
        return os.environ.get('__PYVENV_LAUNCHER__', _default)

    @classmethod
    def from_param(cls, param):
        """
        Construct a CommandSpec from a parameter to build_scripts, which may
        be None.
        """
        if isinstance(param, cls):
            return param
        if isinstance(param, list):
            return cls(param)
        if param is None:
            return cls.from_environment()
        # otherwise, assume it's a string.
        return cls.from_string(param)

    @classmethod
    def from_environment(cls):
        return cls([cls._sys_executable()])

    @classmethod
    def from_string(cls, string):
        """
        Construct a command spec from a simple string representing a command
        line parseable by shlex.split.
        """
        items = shlex.split(string, **cls.split_args)
        return cls(items)
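    # Illustrative example (not part of the original source):
    # CommandSpec.from_string('/usr/bin/env python3 -E') is expected to behave
    # like CommandSpec(['/usr/bin/env', 'python3', '-E']); WindowsCommandSpec
    # below splits with posix=False so backslashes survive intact.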

    def install_options(self, script_text):
        self.options = shlex.split(self._extract_options(script_text))
        cmdline = subprocess.list2cmdline(self)
        if not isascii(cmdline):
            self.options[:0] = ['-x']

    @staticmethod
    def _extract_options(orig_script):
        """
        Extract any options from the first line of the script.
        """
        first = (orig_script + '\n').splitlines()[0]
        match = _first_line_re().match(first)
        options = match.group(1) or '' if match else ''
        return options.strip()

    def as_header(self):
        return self._render(self + list(self.options))

    @staticmethod
    def _strip_quotes(item):
        _QUOTES = '"\''
        for q in _QUOTES:
            if item.startswith(q) and item.endswith(q):
                return item[1:-1]
        return item

    @staticmethod
    def _render(items):
        cmdline = subprocess.list2cmdline(
            CommandSpec._strip_quotes(item.strip()) for item in items)
        return '#!' + cmdline + '\n'
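    # Illustrative example (not part of the original source): for
    # CommandSpec(['/usr/bin/python3']) with no extracted options, as_header()
    # should produce the shebang line '#!/usr/bin/python3\n'.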


# For pbr compat; will be removed in a future version.
sys_executable = CommandSpec._sys_executable()


class WindowsCommandSpec(CommandSpec):
    split_args = dict(posix=False)


class ScriptWriter(object):
    """
    Encapsulates behavior around writing entry point scripts for console and
    gui apps.
    """

    template = textwrap.dedent(r"""
        # EASY-INSTALL-ENTRY-SCRIPT: %(spec)r,%(group)r,%(name)r
        __requires__ = %(spec)r
        import re
        import sys
        from pkg_resources import load_entry_point

        if __name__ == '__main__':
            sys.argv[0] = re.sub(r'(-script\.pyw?|\.exe)?$', '', sys.argv[0])
            sys.exit(
                load_entry_point(%(spec)r, %(group)r, %(name)r)()
            )
    """).lstrip()

    command_spec_class = CommandSpec

    @classmethod
    def get_script_args(cls, dist, executable=None, wininst=False):
        # for backward compatibility
        warnings.warn("Use get_args", DeprecationWarning)
        writer = (WindowsScriptWriter if wininst else ScriptWriter).best()
        header = cls.get_script_header("", executable, wininst)
        return writer.get_args(dist, header)

    @classmethod
    def get_script_header(cls, script_text, executable=None, wininst=False):
        # for backward compatibility
        warnings.warn("Use get_header", DeprecationWarning)
        if wininst:
            executable = "python.exe"
        cmd = cls.command_spec_class.best().from_param(executable)
        cmd.install_options(script_text)
        return cmd.as_header()

    @classmethod
    def get_args(cls, dist, header=None):
        """
        Yield write_script() argument tuples for a distribution's
        console_scripts and gui_scripts entry points.
        """
        if header is None:
            header = cls.get_header()
        spec = str(dist.as_requirement())
        for type_ in 'console', 'gui':
            group = type_ + '_scripts'
            for name, ep in dist.get_entry_map(group).items():
                cls._ensure_safe_name(name)
                script_text = cls.template % locals()
                args = cls._get_script_args(type_, name, header, script_text)
                for res in args:
                    yield res

    @staticmethod
    def _ensure_safe_name(name):
        """
        Prevent paths in *_scripts entry point names.
        """
        has_path_sep = re.search(r'[\\/]', name)
        if has_path_sep:
            raise ValueError("Path separators not allowed in script names")

    @classmethod
    def get_writer(cls, force_windows):
        # for backward compatibility
        warnings.warn("Use best", DeprecationWarning)
        return WindowsScriptWriter.best() if force_windows else cls.best()

    @classmethod
    def best(cls):
        """
        Select the best ScriptWriter for this environment.
        """
        if sys.platform == 'win32' or (os.name == 'java' and os._name == 'nt'):
            return WindowsScriptWriter.best()
        else:
            return cls

    @classmethod
    def _get_script_args(cls, type_, name, header, script_text):
        # Simply write the stub with no extension.
        yield (name, header + script_text)
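    # Illustrative note (not part of the original source): for an entry point
    # named 'mytool' this default writer yields a single ('mytool', text) pair,
    # where text is the #! header followed by the generated stub; the Windows
    # writers below yield extra elements (write mode and 'blocker' filenames).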

    @classmethod
    def get_header(cls, script_text="", executable=None):
        """Create a #! line, getting options (if any) from script_text"""
        cmd = cls.command_spec_class.best().from_param(executable)
        cmd.install_options(script_text)
        return cmd.as_header()


class WindowsScriptWriter(ScriptWriter):
    command_spec_class = WindowsCommandSpec

    @classmethod
    def get_writer(cls):
        # for backward compatibility
        warnings.warn("Use best", DeprecationWarning)
        return cls.best()

    @classmethod
    def best(cls):
        """
        Select the best ScriptWriter suitable for Windows
        """
        writer_lookup = dict(
            executable=WindowsExecutableLauncherWriter,
            natural=cls,
        )
        # for compatibility, use the executable launcher by default
        launcher = os.environ.get('SETUPTOOLS_LAUNCHER', 'executable')
        return writer_lookup[launcher]

    @classmethod
    def _get_script_args(cls, type_, name, header, script_text):
        "For Windows, add a .py extension"
        ext = dict(console='.pya', gui='.pyw')[type_]
        if ext not in os.environ['PATHEXT'].lower().split(';'):
            msg = (
                "{ext} not listed in PATHEXT; scripts will not be "
                "recognized as executables."
            ).format(**locals())
            warnings.warn(msg, UserWarning)
        old = ['.pya', '.py', '-script.py', '.pyc', '.pyo', '.pyw', '.exe']
        old.remove(ext)
        header = cls._adjust_header(type_, header)
        blockers = [name + x for x in old]
        yield name + ext, header + script_text, 't', blockers

    @classmethod
    def _adjust_header(cls, type_, orig_header):
        """
        Make sure 'pythonw' is used for gui and 'python' is used for
        console (regardless of what sys.executable is).
        """
        pattern = 'pythonw.exe'
        repl = 'python.exe'
        if type_ == 'gui':
            pattern, repl = repl, pattern
        pattern_ob = re.compile(re.escape(pattern), re.IGNORECASE)
        new_header = pattern_ob.sub(string=orig_header, repl=repl)
        return new_header if cls._use_header(new_header) else orig_header
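    # Illustrative example (not part of the original source): for a 'gui'
    # script, a header such as '#!C:\Python36\python.exe\n' is rewritten to use
    # pythonw.exe, but only when _use_header() accepts the result (on Windows,
    # only if the rewritten interpreter actually resolves to an executable).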

    @staticmethod
    def _use_header(new_header):
        """
        Should _adjust_header use the replaced header?

        On non-windows systems, always use. On
        Windows systems, only use the replaced header if it resolves
        to an executable on the system.
        """
        clean_header = new_header[2:-1].strip('"')
        return sys.platform != 'win32' or find_executable(clean_header)


class WindowsExecutableLauncherWriter(WindowsScriptWriter):
    @classmethod
    def _get_script_args(cls, type_, name, header, script_text):
        """
        For Windows, add a .py extension and an .exe launcher
        """
        if type_ == 'gui':
            launcher_type = 'gui'
            ext = '-script.pyw'
            old = ['.pyw']
        else:
            launcher_type = 'cli'
            ext = '-script.py'
            old = ['.py', '.pyc', '.pyo']
        hdr = cls._adjust_header(type_, header)
        blockers = [name + x for x in old]
        yield (name + ext, hdr + script_text, 't', blockers)
        yield (
            name + '.exe', get_win_launcher(launcher_type),
            'b'  # write in binary mode
        )
        if not is_64bit():
            # install a manifest for the launcher to prevent Windows
            # from detecting it as an installer (which it will for
            # launchers like easy_install.exe). Consider only
            # adding a manifest for launchers detected as installers.
            # See Distribute #143 for details.
            m_name = name + '.exe.manifest'
            yield (m_name, load_launcher_manifest(name), 't')


# for backward-compatibility
get_script_args = ScriptWriter.get_script_args
get_script_header = ScriptWriter.get_script_header


def get_win_launcher(type):
    """
    Load the Windows launcher (executable) suitable for launching a script.

    `type` should be either 'cli' or 'gui'

    Returns the executable as a byte string.
    """
    launcher_fn = '%s.exe' % type
    if is_64bit():
        launcher_fn = launcher_fn.replace(".", "-64.")
    else:
        launcher_fn = launcher_fn.replace(".", "-32.")
    return resource_string('setuptools', launcher_fn)
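# Illustrative example (not part of the original source): get_win_launcher('cli')
# returns the bytes of the bundled 'cli-64.exe' resource on a 64-bit Python and
# 'cli-32.exe' otherwise; likewise 'gui' maps to gui-64.exe / gui-32.exe.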


def load_launcher_manifest(name):
    manifest = pkg_resources.resource_string(__name__, 'launcher manifest.xml')
    if six.PY2:
        return manifest % vars()
    else:
        return manifest.decode('utf-8') % vars()


def rmtree(path, ignore_errors=False, onerror=auto_chmod):
    return shutil.rmtree(path, ignore_errors, onerror)


def current_umask():
    tmp = os.umask(0o022)
    os.umask(tmp)
    return tmp
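# Illustrative note (not part of the original source): the umask can only be
# read by setting it, so this sets a throwaway value (0o022), captures the
# previous mask that os.umask() returns, restores it, and returns the original.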


def bootstrap():
    # This function is called when setuptools*.egg is run using /bin/sh
    import setuptools

    argv0 = os.path.dirname(setuptools.__path__[0])
    sys.argv[0] = argv0
    sys.argv.append(argv0)
    main()


def main(argv=None, **kw):
    from setuptools import setup
    from setuptools.dist import Distribution

    class DistributionWithoutHelpCommands(Distribution):
        common_usage = ""

        def _show_help(self, *args, **kw):
            with _patch_usage():
                Distribution._show_help(self, *args, **kw)

    if argv is None:
        argv = sys.argv[1:]

    with _patch_usage():
        setup(
            script_args=['-q', 'easy_install', '-v'] + argv,
            script_name=sys.argv[0] or 'easy_install',
            distclass=DistributionWithoutHelpCommands,
            **kw
        )


@contextlib.contextmanager
def _patch_usage():
    import distutils.core
    USAGE = textwrap.dedent("""
        usage: %(script)s [options] requirement_or_url ...
           or: %(script)s --help
        """).lstrip()

    def gen_usage(script_name):
        return USAGE % dict(
            script=os.path.basename(script_name),
        )

    saved = distutils.core.gen_usage
    distutils.core.gen_usage = gen_usage
    try:
        yield
    finally:
        distutils.core.gen_usage = saved
site-packages/setuptools/command/install.py000064400000011113147511334640015150 0ustar00from distutils.errors import DistutilsArgError
import inspect
import glob
import warnings
import platform
import distutils.command.install as orig

import setuptools

# Prior to numpy 1.9, NumPy relies on the '_install' name, so provide it for
# now. See https://github.com/pypa/setuptools/issues/199/
_install = orig.install


class install(orig.install):
    """Use easy_install to install the package, w/dependencies"""

    user_options = orig.install.user_options + [
        ('old-and-unmanageable', None, "Try not to use this!"),
        ('single-version-externally-managed', None,
         "used by system package builders to create 'flat' eggs"),
    ]
    boolean_options = orig.install.boolean_options + [
        'old-and-unmanageable', 'single-version-externally-managed',
    ]
    new_commands = [
        ('install_egg_info', lambda self: True),
        ('install_scripts', lambda self: True),
    ]
    _nc = dict(new_commands)

    def initialize_options(self):
        orig.install.initialize_options(self)
        self.old_and_unmanageable = None
        self.single_version_externally_managed = None

    def finalize_options(self):
        orig.install.finalize_options(self)
        if self.root:
            self.single_version_externally_managed = True
        elif self.single_version_externally_managed:
            if not self.root and not self.record:
                raise DistutilsArgError(
                    "You must specify --record or --root when building system"
                    " packages"
                )

    def handle_extra_path(self):
        if self.root or self.single_version_externally_managed:
            # explicit backward-compatibility mode, allow extra_path to work
            return orig.install.handle_extra_path(self)

        # Ignore extra_path when installing an egg (or being run by another
        # command without --root or --single-version-externally-managed)
        self.path_file = None
        self.extra_dirs = ''

    def run(self):
        # Explicit request for old-style install?  Just do it
        if self.old_and_unmanageable or self.single_version_externally_managed:
            return orig.install.run(self)

        if not self._called_from_setup(inspect.currentframe()):
            # Run in backward-compatibility mode to support bdist_* commands.
            orig.install.run(self)
        else:
            self.do_egg_install()

    @staticmethod
    def _called_from_setup(run_frame):
        """
        Attempt to detect whether run() was called from setup() or by another
        command.  If called by setup(), the parent caller will be the
        'run_command' method in 'distutils.dist', and *its* caller will be
        the 'run_commands' method.  If called any other way, the
        immediate caller *might* be 'run_command', but it won't have been
        called by 'run_commands'. Return True when the setup() call chain is
        detected, or when no call stack is available; return False otherwise.
        """
        if run_frame is None:
            msg = "Call stack not available. bdist_* commands may fail."
            warnings.warn(msg)
            if platform.python_implementation() == 'IronPython':
                msg = "For best results, pass -X:Frames to enable call stack."
                warnings.warn(msg)
            return True
        res = inspect.getouterframes(run_frame)[2]
        caller, = res[:1]
        info = inspect.getframeinfo(caller)
        caller_module = caller.f_globals.get('__name__', '')
        return (
            caller_module == 'distutils.dist'
            and info.function == 'run_commands'
        )

    def do_egg_install(self):

        easy_install = self.distribution.get_command_class('easy_install')

        cmd = easy_install(
            self.distribution, args="x", root=self.root, record=self.record,
        )
        cmd.ensure_finalized()  # finalize before bdist_egg munges install cmd
        cmd.always_copy_from = '.'  # make sure local-dir eggs get installed

        # pick up setup-dir .egg files only: no .egg-info
        cmd.package_index.scan(glob.glob('*.egg'))

        self.run_command('bdist_egg')
        args = [self.distribution.get_command_obj('bdist_egg').egg_output]

        if setuptools.bootstrap_install_from:
            # Bootstrap self-installation of setuptools
            args.insert(0, setuptools.bootstrap_install_from)

        cmd.args = args
        cmd.run()
        setuptools.bootstrap_install_from = None


# XXX Python 3.1 doesn't see _nc if this is inside the class
install.sub_commands = (
    [cmd for cmd in orig.install.sub_commands if cmd[0] not in install._nc] +
    install.new_commands
)
site-packages/setuptools/command/dist_info.py000064400000001700147511334640015461 0ustar00"""
Create a dist_info directory
As defined in the wheel specification
"""

import os

from distutils.core import Command
from distutils import log


class dist_info(Command):

    description = 'create a .dist-info directory'

    user_options = [
        ('egg-base=', 'e', "directory containing .egg-info directories"
                           " (default: top of the source tree)"),
    ]

    def initialize_options(self):
        self.egg_base = None

    def finalize_options(self):
        pass

    def run(self):
        egg_info = self.get_finalized_command('egg_info')
        egg_info.egg_base = self.egg_base
        egg_info.finalize_options()
        egg_info.run()
        dist_info_dir = egg_info.egg_info[:-len('.egg-info')] + '.dist-info'
        log.info("creating '{}'".format(os.path.abspath(dist_info_dir)))

        bdist_wheel = self.get_finalized_command('bdist_wheel')
        bdist_wheel.egg2dist(egg_info.egg_info, dist_info_dir)
site-packages/setuptools/command/bdist_rpm.py000064400000002744147511334640015477 0ustar00import distutils.command.bdist_rpm as orig


class bdist_rpm(orig.bdist_rpm):
    """
    Override the default bdist_rpm behavior to do the following:

    1. Run egg_info to ensure the name and version are properly calculated.
    2. Always run 'install' using --single-version-externally-managed to
       disable eggs in RPM distributions.
    3. Replace dash with underscore in the version numbers for better RPM
       compatibility.
    """

    def run(self):
        # ensure distro name is up-to-date
        self.run_command('egg_info')

        orig.bdist_rpm.run(self)

    def _make_spec_file(self):
        version = self.distribution.get_version()
        rpmversion = version.replace('-', '_')
        spec = orig.bdist_rpm._make_spec_file(self)
        line23 = '%define version ' + version
        line24 = '%define version ' + rpmversion
        spec = [
            line.replace(
                "Source0: %{name}-%{version}.tar",
                "Source0: %{name}-%{unmangled_version}.tar"
            ).replace(
                "setup.py install ",
                "setup.py install --single-version-externally-managed "
            ).replace(
                "%setup",
                "%setup -n %{name}-%{unmangled_version}"
            ).replace(line23, line24)
            for line in spec
        ]
        insert_loc = spec.index(line24) + 1
        unmangled_version = "%define unmangled_version " + version
        spec.insert(insert_loc, unmangled_version)
        return spec
site-packages/setuptools/command/py36compat.py000064400000011572147511334640015520 0ustar00import os
from glob import glob
from distutils.util import convert_path
from distutils.command import sdist

from setuptools.extern.six.moves import filter


class sdist_add_defaults:
    """
    Mix-in providing forward-compatibility for functionality as found in
    distutils on Python 3.7.

    Do not edit the code in this class except to update functionality
    as implemented in distutils. Instead, override in the subclass.
    """

    def add_defaults(self):
        """Add all the default files to self.filelist:
          - README or README.txt
          - setup.py
          - test/test*.py
          - all pure Python modules mentioned in setup script
          - all files pointed to by package_data (build_py)
          - all files defined in data_files.
          - all files defined as scripts.
          - all C sources listed as part of extensions or C libraries
            in the setup script (doesn't catch C headers!)
        Warns if (README or README.txt) or setup.py are missing; everything
        else is optional.
        """
        self._add_defaults_standards()
        self._add_defaults_optional()
        self._add_defaults_python()
        self._add_defaults_data_files()
        self._add_defaults_ext()
        self._add_defaults_c_libs()
        self._add_defaults_scripts()

    @staticmethod
    def _cs_path_exists(fspath):
        """
        Case-sensitive path existence check

        >>> sdist_add_defaults._cs_path_exists(__file__)
        True
        >>> sdist_add_defaults._cs_path_exists(__file__.upper())
        False
        """
        if not os.path.exists(fspath):
            return False
        # make absolute so we always have a directory
        abspath = os.path.abspath(fspath)
        directory, filename = os.path.split(abspath)
        return filename in os.listdir(directory)

    def _add_defaults_standards(self):
        standards = [self.READMES, self.distribution.script_name]
        for fn in standards:
            if isinstance(fn, tuple):
                alts = fn
                got_it = False
                for fn in alts:
                    if self._cs_path_exists(fn):
                        got_it = True
                        self.filelist.append(fn)
                        break

                if not got_it:
                    self.warn("standard file not found: should have one of " +
                              ', '.join(alts))
            else:
                if self._cs_path_exists(fn):
                    self.filelist.append(fn)
                else:
                    self.warn("standard file '%s' not found" % fn)

    def _add_defaults_optional(self):
        optional = ['test/test*.py', 'setup.cfg']
        for pattern in optional:
            files = filter(os.path.isfile, glob(pattern))
            self.filelist.extend(files)

    def _add_defaults_python(self):
        # build_py is used to get:
        #  - python modules
        #  - files defined in package_data
        build_py = self.get_finalized_command('build_py')

        # getting python files
        if self.distribution.has_pure_modules():
            self.filelist.extend(build_py.get_source_files())

        # getting package_data files
        # (computed in build_py.data_files by build_py.finalize_options)
        for pkg, src_dir, build_dir, filenames in build_py.data_files:
            for filename in filenames:
                self.filelist.append(os.path.join(src_dir, filename))

    def _add_defaults_data_files(self):
        # getting distribution.data_files
        if self.distribution.has_data_files():
            for item in self.distribution.data_files:
                if isinstance(item, str):
                    # plain file
                    item = convert_path(item)
                    if os.path.isfile(item):
                        self.filelist.append(item)
                else:
                    # a (dirname, filenames) tuple
                    dirname, filenames = item
                    for f in filenames:
                        f = convert_path(f)
                        if os.path.isfile(f):
                            self.filelist.append(f)

    def _add_defaults_ext(self):
        if self.distribution.has_ext_modules():
            build_ext = self.get_finalized_command('build_ext')
            self.filelist.extend(build_ext.get_source_files())

    def _add_defaults_c_libs(self):
        if self.distribution.has_c_libraries():
            build_clib = self.get_finalized_command('build_clib')
            self.filelist.extend(build_clib.get_source_files())

    def _add_defaults_scripts(self):
        if self.distribution.has_scripts():
            build_scripts = self.get_finalized_command('build_scripts')
            self.filelist.extend(build_scripts.get_source_files())


if hasattr(sdist.sdist, '_add_defaults_standards'):
    # disable the functionality already available upstream
    class sdist_add_defaults:
        pass
site-packages/setuptools/command/develop.py000064400000017556147511334640015161 0ustar00from distutils.util import convert_path
from distutils import log
from distutils.errors import DistutilsError, DistutilsOptionError
import os
import glob
import io

from setuptools.extern import six

from pkg_resources import Distribution, PathMetadata, normalize_path
from setuptools.command.easy_install import easy_install
from setuptools import namespaces
import setuptools


class develop(namespaces.DevelopInstaller, easy_install):
    """Set up package for development"""

    description = "install package in 'development mode'"

    user_options = easy_install.user_options + [
        ("uninstall", "u", "Uninstall this source package"),
        ("egg-path=", None, "Set the path to be used in the .egg-link file"),
    ]

    boolean_options = easy_install.boolean_options + ['uninstall']

    command_consumes_arguments = False  # override base

    def run(self):
        if self.uninstall:
            self.multi_version = True
            self.uninstall_link()
            self.uninstall_namespaces()
        else:
            self.install_for_development()
        self.warn_deprecated_options()

    def initialize_options(self):
        self.uninstall = None
        self.egg_path = None
        easy_install.initialize_options(self)
        self.setup_path = None
        self.always_copy_from = '.'  # always copy eggs installed in curdir

    def finalize_options(self):
        ei = self.get_finalized_command("egg_info")
        if ei.broken_egg_info:
            template = "Please rename %r to %r before using 'develop'"
            args = ei.egg_info, ei.broken_egg_info
            raise DistutilsError(template % args)
        self.args = [ei.egg_name]

        easy_install.finalize_options(self)
        self.expand_basedirs()
        self.expand_dirs()
        # pick up setup-dir .egg files only: no .egg-info
        self.package_index.scan(glob.glob('*.egg'))

        egg_link_fn = ei.egg_name + '.egg-link'
        self.egg_link = os.path.join(self.install_dir, egg_link_fn)
        self.egg_base = ei.egg_base
        if self.egg_path is None:
            self.egg_path = os.path.abspath(ei.egg_base)

        target = normalize_path(self.egg_base)
        egg_path = normalize_path(os.path.join(self.install_dir,
                                               self.egg_path))
        if egg_path != target:
            raise DistutilsOptionError(
                "--egg-path must be a relative path from the install"
                " directory to " + target
            )

        # Make a distribution for the package's source
        self.dist = Distribution(
            target,
            PathMetadata(target, os.path.abspath(ei.egg_info)),
            project_name=ei.egg_name
        )

        self.setup_path = self._resolve_setup_path(
            self.egg_base,
            self.install_dir,
            self.egg_path,
        )

    @staticmethod
    def _resolve_setup_path(egg_base, install_dir, egg_path):
        """
        Generate a path from egg_base back to '.' where the
        setup script resides and ensure that path points to the
        setup path from $install_dir/$egg_path.
        """
        path_to_setup = egg_base.replace(os.sep, '/').rstrip('/')
        if path_to_setup != os.curdir:
            path_to_setup = '../' * (path_to_setup.count('/') + 1)
        resolved = normalize_path(
            os.path.join(install_dir, egg_path, path_to_setup)
        )
        if resolved != normalize_path(os.curdir):
            raise DistutilsOptionError(
                "Can't get a consistent path to setup script from"
                " installation directory", resolved, normalize_path(os.curdir))
        return path_to_setup
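    # Illustrative example (not part of the original source): with
    # egg_base='src' the computed path_to_setup is '../', and with
    # egg_base='src/pkg' it is '../../'; the normalize_path() check above then
    # verifies that $install_dir/$egg_path/<path_to_setup> is the current dir.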

    def install_for_development(self):
        if six.PY3 and getattr(self.distribution, 'use_2to3', False):
            # If we run 2to3 we can not do this inplace:

            # Ensure metadata is up-to-date
            self.reinitialize_command('build_py', inplace=0)
            self.run_command('build_py')
            bpy_cmd = self.get_finalized_command("build_py")
            build_path = normalize_path(bpy_cmd.build_lib)

            # Build extensions
            self.reinitialize_command('egg_info', egg_base=build_path)
            self.run_command('egg_info')

            self.reinitialize_command('build_ext', inplace=0)
            self.run_command('build_ext')

            # Fixup egg-link and easy-install.pth
            ei_cmd = self.get_finalized_command("egg_info")
            self.egg_path = build_path
            self.dist.location = build_path
            # XXX
            self.dist._provider = PathMetadata(build_path, ei_cmd.egg_info)
        else:
            # Without 2to3 inplace works fine:
            self.run_command('egg_info')

            # Build extensions in-place
            self.reinitialize_command('build_ext', inplace=1)
            self.run_command('build_ext')

        self.install_site_py()  # ensure that target dir is site-safe
        if setuptools.bootstrap_install_from:
            self.easy_install(setuptools.bootstrap_install_from)
            setuptools.bootstrap_install_from = None

        self.install_namespaces()

        # create an .egg-link in the installation dir, pointing to our egg
        log.info("Creating %s (link to %s)", self.egg_link, self.egg_base)
        if not self.dry_run:
            with open(self.egg_link, "w") as f:
                f.write(self.egg_path + "\n" + self.setup_path)
        # postprocess the installed distro, fixing up .pth, installing scripts,
        # and handling requirements
        self.process_distribution(None, self.dist, not self.no_deps)

    def uninstall_link(self):
        if os.path.exists(self.egg_link):
            log.info("Removing %s (link to %s)", self.egg_link, self.egg_base)
            egg_link_file = open(self.egg_link)
            contents = [line.rstrip() for line in egg_link_file]
            egg_link_file.close()
            if contents not in ([self.egg_path],
                                [self.egg_path, self.setup_path]):
                log.warn("Link points to %s: uninstall aborted", contents)
                return
            if not self.dry_run:
                os.unlink(self.egg_link)
        if not self.dry_run:
            self.update_pth(self.dist)  # remove any .pth link to us
        if self.distribution.scripts:
            # XXX should also check for entry point scripts!
            log.warn("Note: you must uninstall or replace scripts manually!")

    def install_egg_scripts(self, dist):
        if dist is not self.dist:
            # Installing a dependency, so fall back to normal behavior
            return easy_install.install_egg_scripts(self, dist)

        # create wrapper scripts in the script dir, pointing to dist.scripts

        # new-style...
        self.install_wrapper_scripts(dist)

        # ...and old-style
        for script_name in self.distribution.scripts or []:
            script_path = os.path.abspath(convert_path(script_name))
            script_name = os.path.basename(script_path)
            with io.open(script_path) as strm:
                script_text = strm.read()
            self.install_script(dist, script_name, script_text, script_path)

    def install_wrapper_scripts(self, dist):
        dist = VersionlessRequirement(dist)
        return easy_install.install_wrapper_scripts(self, dist)


class VersionlessRequirement(object):
    """
    Adapt a pkg_resources.Distribution to simply return the project
    name as the 'requirement' so that scripts will work across
    multiple versions.

    >>> dist = Distribution(project_name='foo', version='1.0')
    >>> str(dist.as_requirement())
    'foo==1.0'
    >>> adapted_dist = VersionlessRequirement(dist)
    >>> str(adapted_dist.as_requirement())
    'foo'
    """

    def __init__(self, dist):
        self.__dist = dist

    def __getattr__(self, name):
        return getattr(self.__dist, name)

    def as_requirement(self):
        return self.project_name
site-packages/setuptools/command/upload.py000064400000002224147511334640014771 0ustar00import getpass
from distutils.command import upload as orig


class upload(orig.upload):
    """
    Override default upload behavior to obtain password
    in a variety of different ways.
    """

    def finalize_options(self):
        orig.upload.finalize_options(self)
        self.username = (
            self.username or
            getpass.getuser()
        )
        # Attempt to obtain password. Short circuit evaluation at the first
        # sign of success.
        self.password = (
            self.password or
            self._load_password_from_keyring() or
            self._prompt_for_password()
        )

    def _load_password_from_keyring(self):
        """
        Attempt to load password from keyring. Suppress Exceptions.
        """
        try:
            keyring = __import__('keyring')
            return keyring.get_password(self.repository, self.username)
        except Exception:
            pass

    def _prompt_for_password(self):
        """
        Prompt for a password on the tty. Suppress Exceptions.
        """
        try:
            return getpass.getpass()
        except (Exception, KeyboardInterrupt):
            pass
site-packages/setuptools/site-patch.py000064400000004403147511334640014131 0ustar00def __boot():
    import sys
    import os
    PYTHONPATH = os.environ.get('PYTHONPATH')
    if PYTHONPATH is None or (sys.platform == 'win32' and not PYTHONPATH):
        PYTHONPATH = []
    else:
        PYTHONPATH = PYTHONPATH.split(os.pathsep)

    pic = getattr(sys, 'path_importer_cache', {})
    stdpath = sys.path[len(PYTHONPATH):]
    mydir = os.path.dirname(__file__)

    for item in stdpath:
        if item == mydir or not item:
            continue  # skip if current dir. on Windows, or my own directory
        importer = pic.get(item)
        if importer is not None:
            loader = importer.find_module('site')
            if loader is not None:
                # This should actually reload the current module
                loader.load_module('site')
                break
        else:
            try:
                import imp  # Avoid import loop in Python >= 3.3
                stream, path, descr = imp.find_module('site', [item])
            except ImportError:
                continue
            if stream is None:
                continue
            try:
                # This should actually reload the current module
                imp.load_module('site', stream, path, descr)
            finally:
                stream.close()
            break
    else:
        raise ImportError("Couldn't find the real 'site' module")

    known_paths = dict([(makepath(item)[1], 1) for item in sys.path])  # 2.2 comp

    oldpos = getattr(sys, '__egginsert', 0)  # save old insertion position
    sys.__egginsert = 0  # and reset the current one

    for item in PYTHONPATH:
        addsitedir(item)

    sys.__egginsert += oldpos  # restore effective old position

    d, nd = makepath(stdpath[0])
    insert_at = None
    new_path = []

    for item in sys.path:
        p, np = makepath(item)

        if np == nd and insert_at is None:
            # We've hit the first 'system' path entry, so added entries go here
            insert_at = len(new_path)

        if np in known_paths or insert_at is None:
            new_path.append(item)
        else:
            # new path after the insert point, back-insert it
            new_path.insert(insert_at, item)
            insert_at += 1

    sys.path[:] = new_path


if __name__ == 'site':
    __boot()
    del __boot
site-packages/setuptools/py33compat.py000064400000002236147511334640014074 0ustar00import dis
import array
import collections

try:
    import html
except ImportError:
    html = None

from setuptools.extern import six
from setuptools.extern.six.moves import html_parser


OpArg = collections.namedtuple('OpArg', 'opcode arg')


class Bytecode_compat(object):
    def __init__(self, code):
        self.code = code

    def __iter__(self):
        """Yield '(op,arg)' pair for each operation in code object 'code'"""

        bytes = array.array('b', self.code.co_code)
        eof = len(self.code.co_code)

        ptr = 0
        extended_arg = 0

        while ptr < eof:

            op = bytes[ptr]

            if op >= dis.HAVE_ARGUMENT:

                arg = bytes[ptr + 1] + bytes[ptr + 2] * 256 + extended_arg
                ptr += 3

                if op == dis.EXTENDED_ARG:
                    long_type = six.integer_types[-1]
                    extended_arg = arg * long_type(65536)
                    continue

            else:
                arg = None
                ptr += 1

            yield OpArg(op, arg)


Bytecode = getattr(dis, 'Bytecode', Bytecode_compat)


unescape = getattr(html, 'unescape', html_parser.HTMLParser().unescape)
site-packages/setuptools/extern/__init__.py000064400000004703147511334640015137 0ustar00import sys


class VendorImporter:
    """
    A PEP 302 meta path importer for finding optionally-vendored
    or otherwise naturally-installed packages from root_name.
    """

    def __init__(self, root_name, vendored_names=(), vendor_pkg=None):
        self.root_name = root_name
        self.vendored_names = set(vendored_names)
        self.vendor_pkg = vendor_pkg or root_name.replace('extern', '_vendor')

    @property
    def search_path(self):
        """
        Search first the vendor package then as a natural package.
        """
        yield self.vendor_pkg + '.'
        yield ''

    def find_module(self, fullname, path=None):
        """
        Return self when fullname starts with root_name and the
        target module is one vendored through this importer.
        """
        root, base, target = fullname.partition(self.root_name + '.')
        if root:
            return
        if not any(map(target.startswith, self.vendored_names)):
            return
        return self

    def load_module(self, fullname):
        """
        Iterate over the search path to locate and load fullname.
        """
        root, base, target = fullname.partition(self.root_name + '.')
        for prefix in self.search_path:
            try:
                extant = prefix + target
                __import__(extant)
                mod = sys.modules[extant]
                sys.modules[fullname] = mod
                # mysterious hack:
                # Remove the reference to the extant package/module
                # on later Python versions to cause relative imports
                # in the vendor package to resolve the same modules
                # as those going through this importer.
                if sys.version_info > (3, 3):
                    del sys.modules[extant]
                return mod
            except ImportError:
                pass
        else:
            raise ImportError(
                "The '{target}' package is required; "
                "normally this is bundled with this package so if you get "
                "this warning, consult the packager of your "
                "distribution.".format(**locals())
            )

    def install(self):
        """
        Install this importer into sys.meta_path if not already present.
        """
        if self not in sys.meta_path:
            sys.meta_path.append(self)


names = 'six', 'packaging', 'pyparsing',
VendorImporter(__name__, names, 'setuptools._vendor').install()
site-packages/setuptools/lib2to3_ex.py000064400000003735147511334640014051 0ustar00"""
Customized Mixin2to3 support:

 - adds support for converting doctests


This module raises an ImportError on Python 2.
"""

from distutils.util import Mixin2to3 as _Mixin2to3
from distutils import log
from lib2to3.refactor import RefactoringTool, get_fixers_from_package

import setuptools


class DistutilsRefactoringTool(RefactoringTool):
    def log_error(self, msg, *args, **kw):
        log.error(msg, *args)

    def log_message(self, msg, *args):
        log.info(msg, *args)

    def log_debug(self, msg, *args):
        log.debug(msg, *args)


class Mixin2to3(_Mixin2to3):
    def run_2to3(self, files, doctests=False):
        # See if the distribution option has been set, otherwise check the
        # setuptools default.
        if self.distribution.use_2to3 is not True:
            return
        if not files:
            return
        log.info("Fixing " + " ".join(files))
        self.__build_fixer_names()
        self.__exclude_fixers()
        if doctests:
            if setuptools.run_2to3_on_doctests:
                r = DistutilsRefactoringTool(self.fixer_names)
                r.refactor(files, write=True, doctests_only=True)
        else:
            _Mixin2to3.run_2to3(self, files)

    def __build_fixer_names(self):
        if self.fixer_names:
            return
        self.fixer_names = []
        for p in setuptools.lib2to3_fixer_packages:
            self.fixer_names.extend(get_fixers_from_package(p))
        if self.distribution.use_2to3_fixers is not None:
            for p in self.distribution.use_2to3_fixers:
                self.fixer_names.extend(get_fixers_from_package(p))

    def __exclude_fixers(self):
        excluded_fixers = getattr(self, 'exclude_fixers', [])
        if self.distribution.use_2to3_exclude_fixers is not None:
            excluded_fixers.extend(self.distribution.use_2to3_exclude_fixers)
        for fixer_name in excluded_fixers:
            if fixer_name in self.fixer_names:
                self.fixer_names.remove(fixer_name)
site-packages/setuptools/ssl_support.py000064400000020454147511334640014471 0ustar00import os
import socket
import atexit
import re
import functools

from setuptools.extern.six.moves import urllib, http_client, map, filter

from pkg_resources import ResolutionError, ExtractionError

try:
    import ssl
except ImportError:
    ssl = None

__all__ = [
    'VerifyingHTTPSHandler', 'find_ca_bundle', 'is_available', 'cert_paths',
    'opener_for'
]

cert_paths = """
/etc/pki/tls/certs/ca-bundle.crt
/etc/ssl/certs/ca-certificates.crt
/usr/share/ssl/certs/ca-bundle.crt
/usr/local/share/certs/ca-root.crt
/etc/ssl/cert.pem
/System/Library/OpenSSL/certs/cert.pem
/usr/local/share/certs/ca-root-nss.crt
/etc/ssl/ca-bundle.pem
""".strip().split()

try:
    HTTPSHandler = urllib.request.HTTPSHandler
    HTTPSConnection = http_client.HTTPSConnection
except AttributeError:
    HTTPSHandler = HTTPSConnection = object

is_available = ssl is not None and object not in (HTTPSHandler, HTTPSConnection)


try:
    from ssl import CertificateError, match_hostname
except ImportError:
    try:
        from backports.ssl_match_hostname import CertificateError
        from backports.ssl_match_hostname import match_hostname
    except ImportError:
        CertificateError = None
        match_hostname = None

if not CertificateError:

    class CertificateError(ValueError):
        pass


if not match_hostname:

    def _dnsname_match(dn, hostname, max_wildcards=1):
        """Matching according to RFC 6125, section 6.4.3

        http://tools.ietf.org/html/rfc6125#section-6.4.3
        """
        pats = []
        if not dn:
            return False

        # Ported from python3-syntax:
        # leftmost, *remainder = dn.split(r'.')
        parts = dn.split(r'.')
        leftmost = parts[0]
        remainder = parts[1:]

        wildcards = leftmost.count('*')
        if wildcards > max_wildcards:
            # Issue #17980: avoid denials of service by refusing more
            # than one wildcard per fragment.  A survey of established
            # policy among SSL implementations showed it to be a
            # reasonable choice.
            raise CertificateError(
                "too many wildcards in certificate DNS name: " + repr(dn))

        # speed up common case w/o wildcards
        if not wildcards:
            return dn.lower() == hostname.lower()

        # RFC 6125, section 6.4.3, subitem 1.
        # The client SHOULD NOT attempt to match a presented identifier in which
        # the wildcard character comprises a label other than the left-most label.
        if leftmost == '*':
            # When '*' is a fragment by itself, it matches a non-empty dotless
            # fragment.
            pats.append('[^.]+')
        elif leftmost.startswith('xn--') or hostname.startswith('xn--'):
            # RFC 6125, section 6.4.3, subitem 3.
            # The client SHOULD NOT attempt to match a presented identifier
            # where the wildcard character is embedded within an A-label or
            # U-label of an internationalized domain name.
            pats.append(re.escape(leftmost))
        else:
            # Otherwise, '*' matches any dotless string, e.g. www*
            pats.append(re.escape(leftmost).replace(r'\*', '[^.]*'))

        # add the remaining fragments, ignore any wildcards
        for frag in remainder:
            pats.append(re.escape(frag))

        pat = re.compile(r'\A' + r'\.'.join(pats) + r'\Z', re.IGNORECASE)
        return pat.match(hostname)
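    # Illustrative examples (not part of the original source):
    #   _dnsname_match('*.example.com', 'www.example.com')    -> match
    #   _dnsname_match('*.example.com', 'a.b.example.com')    -> no match
    # because the wildcard only covers a single non-empty, dotless label.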

    def match_hostname(cert, hostname):
        """Verify that *cert* (in decoded format as returned by
        SSLSocket.getpeercert()) matches the *hostname*.  RFC 2818 and RFC 6125
        rules are followed, but IP addresses are not accepted for *hostname*.

        CertificateError is raised on failure. On success, the function
        returns nothing.
        """
        if not cert:
            raise ValueError("empty or no certificate")
        dnsnames = []
        san = cert.get('subjectAltName', ())
        for key, value in san:
            if key == 'DNS':
                if _dnsname_match(value, hostname):
                    return
                dnsnames.append(value)
        if not dnsnames:
            # The subject is only checked when there is no dNSName entry
            # in subjectAltName
            for sub in cert.get('subject', ()):
                for key, value in sub:
                    # XXX according to RFC 2818, the most specific Common Name
                    # must be used.
                    if key == 'commonName':
                        if _dnsname_match(value, hostname):
                            return
                        dnsnames.append(value)
        if len(dnsnames) > 1:
            raise CertificateError("hostname %r "
                "doesn't match either of %s"
                % (hostname, ', '.join(map(repr, dnsnames))))
        elif len(dnsnames) == 1:
            raise CertificateError("hostname %r "
                "doesn't match %r"
                % (hostname, dnsnames[0]))
        else:
            raise CertificateError("no appropriate commonName or "
                "subjectAltName fields were found")


class VerifyingHTTPSHandler(HTTPSHandler):
    """Simple verifying handler: no auth, subclasses, timeouts, etc."""

    def __init__(self, ca_bundle):
        self.ca_bundle = ca_bundle
        HTTPSHandler.__init__(self)

    def https_open(self, req):
        return self.do_open(
            lambda host, **kw: VerifyingHTTPSConn(host, self.ca_bundle, **kw), req
        )


class VerifyingHTTPSConn(HTTPSConnection):
    """Simple verifying connection: no auth, subclasses, timeouts, etc."""

    def __init__(self, host, ca_bundle, **kw):
        HTTPSConnection.__init__(self, host, **kw)
        self.ca_bundle = ca_bundle

    def connect(self):
        sock = socket.create_connection(
            (self.host, self.port), getattr(self, 'source_address', None)
        )

        # Handle the socket if a (proxy) tunnel is present
        if hasattr(self, '_tunnel') and getattr(self, '_tunnel_host', None):
            self.sock = sock
            self._tunnel()
            # http://bugs.python.org/issue7776: Python>=3.4.1 and >=2.7.7
            # change self.host to mean the proxy server host when tunneling is
            # being used. Adapt, since we are interested in the destination
            # host for the match_hostname() comparison.
            actual_host = self._tunnel_host
        else:
            actual_host = self.host

        if hasattr(ssl, 'create_default_context'):
            ctx = ssl.create_default_context(cafile=self.ca_bundle)
            self.sock = ctx.wrap_socket(sock, server_hostname=actual_host)
        else:
            # This is for python < 2.7.9 and < 3.4?
            self.sock = ssl.wrap_socket(
                sock, cert_reqs=ssl.CERT_REQUIRED, ca_certs=self.ca_bundle
            )
        try:
            match_hostname(self.sock.getpeercert(), actual_host)
        except CertificateError:
            self.sock.shutdown(socket.SHUT_RDWR)
            self.sock.close()
            raise


def opener_for(ca_bundle=None):
    """Get a urlopen() replacement that uses ca_bundle for verification"""
    return urllib.request.build_opener(
        VerifyingHTTPSHandler(ca_bundle or find_ca_bundle())
    ).open


# from jaraco.functools
def once(func):
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        if not hasattr(func, 'always_returns'):
            func.always_returns = func(*args, **kwargs)
        return func.always_returns
    return wrapper
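# Illustrative note (not part of the original source): the first call to a
# @once-decorated function stores its result on the function object
# (func.always_returns); every later call returns that cached value, even when
# invoked with different arguments.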


@once
def get_win_certfile():
    try:
        import wincertstore
    except ImportError:
        return None

    class CertFile(wincertstore.CertFile):
        def __init__(self):
            super(CertFile, self).__init__()
            atexit.register(self.close)

        def close(self):
            try:
                super(CertFile, self).close()
            except OSError:
                pass

    _wincerts = CertFile()
    _wincerts.addstore('CA')
    _wincerts.addstore('ROOT')
    return _wincerts.name


def find_ca_bundle():
    """Return an existing CA bundle path, or None"""
    extant_cert_paths = filter(os.path.isfile, cert_paths)
    return (
        get_win_certfile()
        or next(extant_cert_paths, None)
        or _certifi_where()
    )


def _certifi_where():
    try:
        return __import__('certifi').where()
    except (ImportError, ResolutionError, ExtractionError):
        pass
site-packages/setuptools/extension.py
import re
import functools
import distutils.core
import distutils.errors
import distutils.extension

from setuptools.extern.six.moves import map

from .monkey import get_unpatched


def _have_cython():
    """
    Return True if Cython can be imported.
    """
    cython_impl = 'Cython.Distutils.build_ext'
    try:
        # from (cython_impl) import build_ext
        __import__(cython_impl, fromlist=['build_ext']).build_ext
        return True
    except Exception:
        pass
    return False


# for compatibility
have_pyrex = _have_cython

_Extension = get_unpatched(distutils.core.Extension)


class Extension(_Extension):
    """Extension that uses '.c' files in place of '.pyx' files"""

    def __init__(self, name, sources, *args, **kw):
        # The *args is needed for compatibility as calls may use positional
        # arguments. py_limited_api may be set only via keyword.
        self.py_limited_api = kw.pop("py_limited_api", False)
        _Extension.__init__(self, name, sources, *args, **kw)

    def _convert_pyx_sources_to_lang(self):
        """
        Replace sources with .pyx extensions to sources with the target
        language extension. This mechanism allows language authors to supply
        pre-converted sources but to prefer the .pyx sources.
        """
        if _have_cython():
            # the build has Cython, so allow it to compile the .pyx files
            return
        lang = self.language or ''
        target_ext = '.cpp' if lang.lower() == 'c++' else '.c'
        sub = functools.partial(re.sub, '.pyx$', target_ext)
        self.sources = list(map(sub, self.sources))


class Library(Extension):
    """Just like a regular Extension, but built as a library instead"""
site-packages/setuptools/archive_util.py
"""Utilities for extracting common archive formats"""

import zipfile
import tarfile
import os
import shutil
import posixpath
import contextlib
from distutils.errors import DistutilsError

from pkg_resources import ensure_directory

__all__ = [
    "unpack_archive", "unpack_zipfile", "unpack_tarfile", "default_filter",
    "UnrecognizedFormat", "extraction_drivers", "unpack_directory",
]


class UnrecognizedFormat(DistutilsError):
    """Couldn't recognize the archive type"""


def default_filter(src, dst):
    """The default progress/filter callback; returns True for all files"""
    return dst


def unpack_archive(filename, extract_dir, progress_filter=default_filter,
        drivers=None):
    """Unpack `filename` to `extract_dir`, or raise ``UnrecognizedFormat``

    `progress_filter` is a function taking two arguments: a source path
    internal to the archive ('/'-separated), and a filesystem path where it
    will be extracted.  The callback must return the desired extract path
    (which may be the same as the one passed in), or else ``None`` to skip
    that file or directory.  The callback can thus be used to report on the
    progress of the extraction, as well as to filter the items extracted or
    alter their extraction paths.

    `drivers`, if supplied, must be a non-empty sequence of functions with the
    same signature as this function (minus the `drivers` argument), that raise
    ``UnrecognizedFormat`` if they do not support extracting the designated
    archive type.  The `drivers` are tried in sequence until one is found that
    does not raise an error, or until all are exhausted (in which case
    ``UnrecognizedFormat`` is raised).  If you do not supply a sequence of
    drivers, the module's ``extraction_drivers`` constant will be used, which
    means that ``unpack_zipfile`` and ``unpack_tarfile`` will be tried, in that
    order.
    """
    for driver in drivers or extraction_drivers:
        try:
            driver(filename, extract_dir, progress_filter)
        except UnrecognizedFormat:
            continue
        else:
            return
    else:
        raise UnrecognizedFormat(
            "Not a recognized archive type: %s" % filename
        )
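

# Illustrative sketch (not from upstream): a progress_filter that reports each
# entry and skips anything under a hypothetical "docs/" prefix; the archive and
# target paths are placeholders.
def _demo_unpack(archive='demo-1.0.tar.gz', target='build/demo'):
    def skip_docs(src, dst):
        print('extracting %s' % src)
        return None if src.startswith('docs/') else dst

    unpack_archive(archive, target, progress_filter=skip_docs)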


def unpack_directory(filename, extract_dir, progress_filter=default_filter):
    """"Unpack" a directory, using the same interface as for archives

    Raises ``UnrecognizedFormat`` if `filename` is not a directory
    """
    if not os.path.isdir(filename):
        raise UnrecognizedFormat("%s is not a directory" % filename)

    paths = {
        filename: ('', extract_dir),
    }
    for base, dirs, files in os.walk(filename):
        src, dst = paths[base]
        for d in dirs:
            paths[os.path.join(base, d)] = src + d + '/', os.path.join(dst, d)
        for f in files:
            target = os.path.join(dst, f)
            target = progress_filter(src + f, target)
            if not target:
                # progress_filter returned None: skip this file
                continue
            ensure_directory(target)
            f = os.path.join(base, f)
            shutil.copyfile(f, target)
            shutil.copystat(f, target)


def unpack_zipfile(filename, extract_dir, progress_filter=default_filter):
    """Unpack zip `filename` to `extract_dir`

    Raises ``UnrecognizedFormat`` if `filename` is not a zipfile (as determined
    by ``zipfile.is_zipfile()``).  See ``unpack_archive()`` for an explanation
    of the `progress_filter` argument.
    """

    if not zipfile.is_zipfile(filename):
        raise UnrecognizedFormat("%s is not a zip file" % (filename,))

    with zipfile.ZipFile(filename) as z:
        for info in z.infolist():
            name = info.filename

            # don't extract absolute paths or ones with .. in them
            if name.startswith('/') or '..' in name.split('/'):
                continue

            target = os.path.join(extract_dir, *name.split('/'))
            target = progress_filter(name, target)
            if not target:
                continue
            if name.endswith('/'):
                # directory
                ensure_directory(target)
            else:
                # file
                ensure_directory(target)
                data = z.read(info.filename)
                with open(target, 'wb') as f:
                    f.write(data)
            unix_attributes = info.external_attr >> 16
            if unix_attributes:
                os.chmod(target, unix_attributes)


def unpack_tarfile(filename, extract_dir, progress_filter=default_filter):
    """Unpack tar/tar.gz/tar.bz2 `filename` to `extract_dir`

    Raises ``UnrecognizedFormat`` if `filename` is not a tarfile (as determined
    by ``tarfile.open()``).  See ``unpack_archive()`` for an explanation
    of the `progress_filter` argument.
    """
    try:
        tarobj = tarfile.open(filename)
    except tarfile.TarError:
        raise UnrecognizedFormat(
            "%s is not a compressed or uncompressed tar file" % (filename,)
        )
    with contextlib.closing(tarobj):
        # don't do any chowning!
        tarobj.chown = lambda *args: None
        for member in tarobj:
            name = member.name
            # don't extract absolute paths or ones with .. in them
            if not name.startswith('/') and '..' not in name.split('/'):
                prelim_dst = os.path.join(extract_dir, *name.split('/'))

                # resolve any links and extract the link targets as normal
                # files
                while member is not None and (member.islnk() or member.issym()):
                    linkpath = member.linkname
                    if member.issym():
                        base = posixpath.dirname(member.name)
                        linkpath = posixpath.join(base, linkpath)
                        linkpath = posixpath.normpath(linkpath)
                    member = tarobj._getmember(linkpath)

                if member is not None and (member.isfile() or member.isdir()):
                    final_dst = progress_filter(name, prelim_dst)
                    if final_dst:
                        if final_dst.endswith(os.sep):
                            final_dst = final_dst[:-1]
                        try:
                            # XXX Ugh
                            tarobj._extract_member(member, final_dst)
                        except tarfile.ExtractError:
                            # chown/chmod/mkfifo/mknod/makedev failed
                            pass
        return True


extraction_drivers = unpack_directory, unpack_zipfile, unpack_tarfile
site-packages/setuptools/depends.py
import sys
import imp
import marshal
from distutils.version import StrictVersion
from imp import PKG_DIRECTORY, PY_COMPILED, PY_SOURCE, PY_FROZEN

from .py33compat import Bytecode


__all__ = [
    'Require', 'find_module', 'get_module_constant', 'extract_constant'
]


class Require:
    """A prerequisite to building or installing a distribution"""

    def __init__(self, name, requested_version, module, homepage='',
            attribute=None, format=None):

        if format is None and requested_version is not None:
            format = StrictVersion

        if format is not None:
            requested_version = format(requested_version)
            if attribute is None:
                attribute = '__version__'

        self.__dict__.update(locals())
        del self.self

    def full_name(self):
        """Return full package/distribution name, w/version"""
        if self.requested_version is not None:
            return '%s-%s' % (self.name, self.requested_version)
        return self.name

    def version_ok(self, version):
        """Is 'version' sufficiently up-to-date?"""
        return self.attribute is None or self.format is None or \
            str(version) != "unknown" and version >= self.requested_version

    def get_version(self, paths=None, default="unknown"):
        """Get version number of installed module, 'None', or 'default'

        Search 'paths' for module.  If not found, return 'None'.  If found,
        return the extracted version attribute, or 'default' if no version
        attribute was specified, or the value cannot be determined without
        importing the module.  The version is formatted according to the
        requirement's version format (if any), unless it is 'None' or the
        supplied 'default'.
        """

        if self.attribute is None:
            try:
                f, p, i = find_module(self.module, paths)
                if f:
                    f.close()
                return default
            except ImportError:
                return None

        v = get_module_constant(self.module, self.attribute, default, paths)

        if v is not None and v is not default and self.format is not None:
            return self.format(v)

        return v

    def is_present(self, paths=None):
        """Return true if dependency is present on 'paths'"""
        return self.get_version(paths) is not None

    def is_current(self, paths=None):
        """Return true if dependency is present and up-to-date on 'paths'"""
        version = self.get_version(paths)
        if version is None:
            return False
        return self.version_ok(version)
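

# Illustrative sketch (not from upstream): declaring a prerequisite and probing
# for it; the project name and version below are hypothetical.
def _demo_require():
    req = Require('Docutils', '0.3', 'docutils')
    # full_name() -> 'Docutils-0.3'; is_current() compares docutils.__version__
    return req.full_name(), req.is_present(), req.is_current()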


def find_module(module, paths=None):
    """Just like 'imp.find_module()', but with package support"""

    parts = module.split('.')

    while parts:
        part = parts.pop(0)
        f, path, (suffix, mode, kind) = info = imp.find_module(part, paths)

        if kind == PKG_DIRECTORY:
            parts = parts or ['__init__']
            paths = [path]

        elif parts:
            raise ImportError("Can't find %r in %s" % (parts, module))

    return info


def get_module_constant(module, symbol, default=-1, paths=None):
    """Find 'module' by searching 'paths', and extract 'symbol'

    Return 'None' if 'module' does not exist on 'paths', or it does not define
    'symbol'.  If the module defines 'symbol' as a constant, return the
    constant.  Otherwise, return 'default'."""

    try:
        f, path, (suffix, mode, kind) = find_module(module, paths)
    except ImportError:
        # Module doesn't exist
        return None

    try:
        if kind == PY_COMPILED:
            f.read(8)  # skip magic & date
            code = marshal.load(f)
        elif kind == PY_FROZEN:
            code = imp.get_frozen_object(module)
        elif kind == PY_SOURCE:
            code = compile(f.read(), path, 'exec')
        else:
            # Not something we can parse; we'll have to import it.  :(
            if module not in sys.modules:
                imp.load_module(module, f, path, (suffix, mode, kind))
            return getattr(sys.modules[module], symbol, None)

    finally:
        if f:
            f.close()

    return extract_constant(code, symbol, default)


def extract_constant(code, symbol, default=-1):
    """Extract the constant value of 'symbol' from 'code'

    If the name 'symbol' is bound to a constant value by the Python code
    object 'code', return that value.  If 'symbol' is bound to an expression,
    return 'default'.  Otherwise, return 'None'.

    Return value is based on the first assignment to 'symbol'.  'symbol' must
    be a global, or at least a non-"fast" local in the code block.  That is,
    only 'STORE_NAME' and 'STORE_GLOBAL' opcodes are checked, and 'symbol'
    must be present in 'code.co_names'.
    """
    if symbol not in code.co_names:
        # name's not there, can't possibly be an assignment
        return None

    name_idx = list(code.co_names).index(symbol)

    STORE_NAME = 90
    STORE_GLOBAL = 97
    LOAD_CONST = 100

    const = default

    for byte_code in Bytecode(code):
        op = byte_code.opcode
        arg = byte_code.arg

        if op == LOAD_CONST:
            const = code.co_consts[arg]
        elif arg == name_idx and (op == STORE_NAME or op == STORE_GLOBAL):
            return const
        else:
            const = default
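

# Illustrative sketch (not from upstream): pulling a version string out of a
# compiled module body without importing it; the source text is hypothetical.
def _demo_extract_version():
    code = compile("__version__ = '1.2.3'\n", '<demo>', 'exec')
    return extract_constant(code, '__version__', default='unknown')  # '1.2.3'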


def _update_globals():
    """
    Patch the globals to remove the objects not available on some platforms.

    XXX it'd be better to test assertions about bytecode instead.
    """

    if not sys.platform.startswith('java') and sys.platform != 'cli':
        return
    incompatible = 'extract_constant', 'get_module_constant'
    for name in incompatible:
        del globals()[name]
        __all__.remove(name)


_update_globals()
site-packages/setuptools/launch.py
"""
Launch the Python script on the command line after
setuptools is bootstrapped via import.
"""

# Note that setuptools gets imported implicitly by the
# invocation of this script using python -m setuptools.launch

import tokenize
import sys


def run():
    """
    Run the script in sys.argv[1] as if it had
    been invoked naturally.
    """
    __builtins__
    script_name = sys.argv[1]
    namespace = dict(
        __file__=script_name,
        __name__='__main__',
        __doc__=None,
    )
    sys.argv[:] = sys.argv[1:]

    open_ = getattr(tokenize, 'open', open)
    script = open_(script_name).read()
    norm_script = script.replace('\\r\\n', '\\n')
    code = compile(norm_script, script_name, 'exec')
    exec(code, namespace)
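

# Illustrative usage (hypothetical paths, not from upstream): run a script
# through this module so setuptools gets imported first, e.g. from a shell:
#
#   python -m setuptools.launch path/to/script.py --some-arg
#
# run() then executes path/to/script.py as __main__ with the remaining args.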


if __name__ == '__main__':
    run()
site-packages/setuptools/py27compat.py
"""
Compatibility Support for Python 2.7 and earlier
"""

import platform

from setuptools.extern import six


def get_all_headers(message, key):
    """
    Given an HTTPMessage, return all headers matching a given key.
    """
    return message.get_all(key)


if six.PY2:
    def get_all_headers(message, key):
        return message.getheaders(key)


linux_py2_ascii = (
    platform.system() == 'Linux' and
    six.PY2
)

rmtree_safe = str if linux_py2_ascii else lambda x: x
"""Workaround for http://bugs.python.org/issue24672"""
site-packages/setuptools/dist.py
# -*- coding: utf-8 -*-
__all__ = ['Distribution']

import re
import os
import warnings
import numbers
import distutils.log
import distutils.core
import distutils.cmd
import distutils.dist
import itertools
from collections import defaultdict
from distutils.errors import (
    DistutilsOptionError, DistutilsPlatformError, DistutilsSetupError,
)
from distutils.util import rfc822_escape
from distutils.version import StrictVersion

from setuptools.extern import six
from setuptools.extern import packaging
from setuptools.extern.six.moves import map, filter, filterfalse

from setuptools.depends import Require
from setuptools import windows_support
from setuptools.monkey import get_unpatched
from setuptools.config import parse_configuration
import pkg_resources
from .py36compat import Distribution_parse_config_files

__import__('setuptools.extern.packaging.specifiers')
__import__('setuptools.extern.packaging.version')


def _get_unpatched(cls):
    warnings.warn("Do not call this function", DeprecationWarning)
    return get_unpatched(cls)


def get_metadata_version(dist_md):
    if dist_md.long_description_content_type or dist_md.provides_extras:
        return StrictVersion('2.1')
    elif (dist_md.maintainer is not None or
          dist_md.maintainer_email is not None or
          getattr(dist_md, 'python_requires', None) is not None):
        return StrictVersion('1.2')
    elif (dist_md.provides or dist_md.requires or dist_md.obsoletes or
            dist_md.classifiers or dist_md.download_url):
        return StrictVersion('1.1')

    return StrictVersion('1.0')
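

# Illustrative sketch (not from upstream): the metadata version is inferred from
# which optional fields are populated; the stand-in object below only carries
# the attributes read above, and its values are hypothetical.
def _demo_metadata_version():
    class _FakeMetadata:
        long_description_content_type = None
        provides_extras = ()
        maintainer = None
        maintainer_email = None
        python_requires = '>=3.6'
        provides = requires = obsoletes = classifiers = ()
        download_url = None
    # python_requires is set, so this reports Metadata-Version 1.2.
    return get_metadata_version(_FakeMetadata())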


# Based on Python 3.5 version
def write_pkg_file(self, file):
    """Write the PKG-INFO format data to a file object.
    """
    version = get_metadata_version(self)

    file.write('Metadata-Version: %s\n' % version)
    file.write('Name: %s\n' % self.get_name())
    file.write('Version: %s\n' % self.get_version())
    file.write('Summary: %s\n' % self.get_description())
    file.write('Home-page: %s\n' % self.get_url())

    if version < StrictVersion('1.2'):
        file.write('Author: %s\n' % self.get_contact())
        file.write('Author-email: %s\n' % self.get_contact_email())
    else:
        optional_fields = (
            ('Author', 'author'),
            ('Author-email', 'author_email'),
            ('Maintainer', 'maintainer'),
            ('Maintainer-email', 'maintainer_email'),
        )

        for field, attr in optional_fields:
            attr_val = getattr(self, attr)
            if six.PY2:
                attr_val = self._encode_field(attr_val)

            if attr_val is not None:
                file.write('%s: %s\n' % (field, attr_val))

    file.write('License: %s\n' % self.get_license())
    if self.download_url:
        file.write('Download-URL: %s\n' % self.download_url)
    for project_url in self.project_urls.items():
        file.write('Project-URL: %s, %s\n' % project_url)

    long_desc = rfc822_escape(self.get_long_description())
    file.write('Description: %s\n' % long_desc)

    keywords = ','.join(self.get_keywords())
    if keywords:
        file.write('Keywords: %s\n' % keywords)

    if version >= StrictVersion('1.2'):
        for platform in self.get_platforms():
            file.write('Platform: %s\n' % platform)
    else:
        self._write_list(file, 'Platform', self.get_platforms())

    self._write_list(file, 'Classifier', self.get_classifiers())

    # PEP 314
    self._write_list(file, 'Requires', self.get_requires())
    self._write_list(file, 'Provides', self.get_provides())
    self._write_list(file, 'Obsoletes', self.get_obsoletes())

    # Setuptools specific for PEP 345
    if hasattr(self, 'python_requires'):
        file.write('Requires-Python: %s\n' % self.python_requires)

    # PEP 566
    if self.long_description_content_type:
        file.write(
            'Description-Content-Type: %s\n' %
            self.long_description_content_type
        )
    if self.provides_extras:
        for extra in self.provides_extras:
            file.write('Provides-Extra: %s\n' % extra)


sequence = tuple, list


def check_importable(dist, attr, value):
    try:
        ep = pkg_resources.EntryPoint.parse('x=' + value)
        assert not ep.extras
    except (TypeError, ValueError, AttributeError, AssertionError):
        raise DistutilsSetupError(
            "%r must be importable 'module:attrs' string (got %r)"
            % (attr, value)
        )


def assert_string_list(dist, attr, value):
    """Verify that value is a string list or None"""
    try:
        assert ''.join(value) != value
    except (TypeError, ValueError, AttributeError, AssertionError):
        raise DistutilsSetupError(
            "%r must be a list of strings (got %r)" % (attr, value)
        )


def check_nsp(dist, attr, value):
    """Verify that namespace packages are valid"""
    ns_packages = value
    assert_string_list(dist, attr, ns_packages)
    for nsp in ns_packages:
        if not dist.has_contents_for(nsp):
            raise DistutilsSetupError(
                "Distribution contains no modules or packages for " +
                "namespace package %r" % nsp
            )
        parent, sep, child = nsp.rpartition('.')
        if parent and parent not in ns_packages:
            distutils.log.warn(
                "WARNING: %r is declared as a package namespace, but %r"
                " is not: please correct this in setup.py", nsp, parent
            )


def check_extras(dist, attr, value):
    """Verify that extras_require mapping is valid"""
    try:
        list(itertools.starmap(_check_extra, value.items()))
    except (TypeError, ValueError, AttributeError):
        raise DistutilsSetupError(
            "'extras_require' must be a dictionary whose values are "
            "strings or lists of strings containing valid project/version "
            "requirement specifiers."
        )


def _check_extra(extra, reqs):
    name, sep, marker = extra.partition(':')
    if marker and pkg_resources.invalid_marker(marker):
        raise DistutilsSetupError("Invalid environment marker: " + marker)
    list(pkg_resources.parse_requirements(reqs))


def assert_bool(dist, attr, value):
    """Verify that value is True, False, 0, or 1"""
    if bool(value) != value:
        tmpl = "{attr!r} must be a boolean value (got {value!r})"
        raise DistutilsSetupError(tmpl.format(attr=attr, value=value))


def check_requirements(dist, attr, value):
    """Verify that install_requires is a valid requirements list"""
    try:
        list(pkg_resources.parse_requirements(value))
        if isinstance(value, (dict, set)):
            raise TypeError("Unordered types are not allowed")
    except (TypeError, ValueError) as error:
        tmpl = (
            "{attr!r} must be a string or list of strings "
            "containing valid project/version requirement specifiers; {error}"
        )
        raise DistutilsSetupError(tmpl.format(attr=attr, error=error))


def check_specifier(dist, attr, value):
    """Verify that value is a valid version specifier"""
    try:
        packaging.specifiers.SpecifierSet(value)
    except packaging.specifiers.InvalidSpecifier as error:
        tmpl = (
            "{attr!r} must be a string "
            "containing valid version specifiers; {error}"
        )
        raise DistutilsSetupError(tmpl.format(attr=attr, error=error))


def check_entry_points(dist, attr, value):
    """Verify that entry_points map is parseable"""
    try:
        pkg_resources.EntryPoint.parse_map(value)
    except ValueError as e:
        raise DistutilsSetupError(e)


def check_test_suite(dist, attr, value):
    if not isinstance(value, six.string_types):
        raise DistutilsSetupError("test_suite must be a string")


def check_package_data(dist, attr, value):
    """Verify that value is a dictionary of package names to glob lists"""
    if isinstance(value, dict):
        for k, v in value.items():
            if not isinstance(k, str):
                break
            try:
                iter(v)
            except TypeError:
                break
        else:
            return
    raise DistutilsSetupError(
        attr + " must be a dictionary mapping package names to lists of "
        "wildcard patterns"
    )


def check_packages(dist, attr, value):
    for pkgname in value:
        if not re.match(r'\w+(\.\w+)*', pkgname):
            distutils.log.warn(
                "WARNING: %r not a valid package name; please use only "
                ".-separated package names in setup.py", pkgname
            )


_Distribution = get_unpatched(distutils.core.Distribution)


class Distribution(Distribution_parse_config_files, _Distribution):
    """Distribution with support for features, tests, and package data

    This is an enhanced version of 'distutils.dist.Distribution' that
    effectively adds the following new optional keyword arguments to 'setup()':

     'install_requires' -- a string or sequence of strings specifying project
        versions that the distribution requires when installed, in the format
        used by 'pkg_resources.require()'.  They will be installed
        automatically when the package is installed.  If you wish to use
        packages that are not available in PyPI, or want to give your users an
        alternate download location, you can add a 'find_links' option to the
        '[easy_install]' section of your project's 'setup.cfg' file, and then
        setuptools will scan the listed web pages for links that satisfy the
        requirements.

     'extras_require' -- a dictionary mapping names of optional "extras" to the
        additional requirement(s) that using those extras incurs. For example,
        this::

            extras_require = dict(reST = ["docutils>=0.3", "reSTedit"])

        indicates that the distribution can optionally provide an extra
        capability called "reST", but it can only be used if docutils and
        reSTedit are installed.  If the user installs your package using
        EasyInstall and requests one of your extras, the corresponding
        additional requirements will be installed if needed.

     'features' **deprecated** -- a dictionary mapping option names to
        'setuptools.Feature'
        objects.  Features are a portion of the distribution that can be
        included or excluded based on user options, inter-feature dependencies,
        and availability on the current system.  Excluded features are omitted
        from all setup commands, including source and binary distributions, so
        you can create multiple distributions from the same source tree.
        Feature names should be valid Python identifiers, except that they may
        contain the '-' (minus) sign.  Features can be included or excluded
        via the command line options '--with-X' and '--without-X', where 'X' is
        the name of the feature.  Whether a feature is included by default, and
        whether you are allowed to control this from the command line, is
        determined by the Feature object.  See the 'Feature' class for more
        information.

     'test_suite' -- the name of a test suite to run for the 'test' command.
        If the user runs 'python setup.py test', the package will be installed,
        and the named test suite will be run.  The format is the same as
        would be used on a 'unittest.py' command line.  That is, it is the
        dotted name of an object to import and call to generate a test suite.

     'package_data' -- a dictionary mapping package names to lists of filenames
        or globs to use to find data files contained in the named packages.
        If the dictionary has filenames or globs listed under '""' (the empty
        string), those names will be searched for in every package, in addition
        to any names for the specific package.  Data files found using these
        names/globs will be installed along with the package, in the same
        location as the package.  Note that globs are allowed to reference
        the contents of non-package subdirectories, as long as you use '/' as
        a path separator.  (Globs are automatically converted to
        platform-specific paths at runtime.)

    In addition to these new keywords, this class also has several new methods
    for manipulating the distribution's contents.  For example, the 'include()'
    and 'exclude()' methods can be thought of as in-place add and subtract
    commands that add or remove packages, modules, extensions, and so on from
    the distribution.  They are used by the feature subsystem to configure the
    distribution for the included and excluded features.
    """

    _DISTUTILS_UNSUPPORTED_METADATA = {
        'long_description_content_type': None,
        'project_urls': dict,
        'provides_extras': set,
    }

    _patched_dist = None

    def patch_missing_pkg_info(self, attrs):
        # Fake up a replacement for the data that would normally come from
        # PKG-INFO, but which might not yet be built if this is a fresh
        # checkout.
        #
        if not attrs or 'name' not in attrs or 'version' not in attrs:
            return
        key = pkg_resources.safe_name(str(attrs['name'])).lower()
        dist = pkg_resources.working_set.by_key.get(key)
        if dist is not None and not dist.has_metadata('PKG-INFO'):
            dist._version = pkg_resources.safe_version(str(attrs['version']))
            self._patched_dist = dist

    def __init__(self, attrs=None):
        have_package_data = hasattr(self, "package_data")
        if not have_package_data:
            self.package_data = {}
        attrs = attrs or {}
        if 'features' in attrs or 'require_features' in attrs:
            Feature.warn_deprecated()
        self.require_features = []
        self.features = {}
        self.dist_files = []
        # Filter-out setuptools' specific options.
        self.src_root = attrs.pop("src_root", None)
        self.patch_missing_pkg_info(attrs)
        self.dependency_links = attrs.pop('dependency_links', [])
        self.setup_requires = attrs.pop('setup_requires', [])
        for ep in pkg_resources.iter_entry_points('distutils.setup_keywords'):
            vars(self).setdefault(ep.name, None)
        _Distribution.__init__(self, {
            k: v for k, v in attrs.items()
            if k not in self._DISTUTILS_UNSUPPORTED_METADATA
        })

        # Fill-in missing metadata fields not supported by distutils.
        # Note some fields may have been set by other tools (e.g. pbr)
        # above; they are taken preferentially to setup() arguments
        for option, default in self._DISTUTILS_UNSUPPORTED_METADATA.items():
            for source in self.metadata.__dict__, attrs:
                if option in source:
                    value = source[option]
                    break
            else:
                value = default() if default else None
            setattr(self.metadata, option, value)

        if isinstance(self.metadata.version, numbers.Number):
            # Some people apparently take "version number" too literally :)
            self.metadata.version = str(self.metadata.version)

        if self.metadata.version is not None:
            try:
                ver = packaging.version.Version(self.metadata.version)
                normalized_version = str(ver)
                if self.metadata.version != normalized_version:
                    warnings.warn(
                        "Normalizing '%s' to '%s'" % (
                            self.metadata.version,
                            normalized_version,
                        )
                    )
                    self.metadata.version = normalized_version
            except (packaging.version.InvalidVersion, TypeError):
                warnings.warn(
                    "The version specified (%r) is an invalid version, this "
                    "may not work as expected with newer versions of "
                    "setuptools, pip, and PyPI. Please see PEP 440 for more "
                    "details." % self.metadata.version
                )
        self._finalize_requires()

    def _finalize_requires(self):
        """
        Set `metadata.python_requires` and fix environment markers
        in `install_requires` and `extras_require`.
        """
        if getattr(self, 'python_requires', None):
            self.metadata.python_requires = self.python_requires

        if getattr(self, 'extras_require', None):
            for extra in self.extras_require.keys():
                # Since this gets called multiple times at points where the
                # keys have become 'converted' extras, ensure that we are only
                # truly adding extras we haven't seen before here.
                extra = extra.split(':')[0]
                if extra:
                    self.metadata.provides_extras.add(extra)

        self._convert_extras_requirements()
        self._move_install_requirements_markers()

    def _convert_extras_requirements(self):
        """
        Convert requirements in `extras_require` of the form
        `"extra": ["barbazquux; {marker}"]` to
        `"extra:{marker}": ["barbazquux"]`.
        """
        spec_ext_reqs = getattr(self, 'extras_require', None) or {}
        self._tmp_extras_require = defaultdict(list)
        for section, v in spec_ext_reqs.items():
            # Do not strip empty sections.
            self._tmp_extras_require[section]
            for r in pkg_resources.parse_requirements(v):
                suffix = self._suffix_for(r)
                self._tmp_extras_require[section + suffix].append(r)

    @staticmethod
    def _suffix_for(req):
        """
        For a requirement, return the 'extras_require' suffix for
        that requirement.
        """
        return ':' + str(req.marker) if req.marker else ''

    def _move_install_requirements_markers(self):
        """
        Move requirements in `install_requires` that are using environment
        markers to `extras_require`.
        """

        # divide the install_requires into two sets, simple ones still
        # handled by install_requires and more complex ones handled
        # by extras_require.

        def is_simple_req(req):
            return not req.marker

        spec_inst_reqs = getattr(self, 'install_requires', None) or ()
        inst_reqs = list(pkg_resources.parse_requirements(spec_inst_reqs))
        simple_reqs = filter(is_simple_req, inst_reqs)
        complex_reqs = filterfalse(is_simple_req, inst_reqs)
        self.install_requires = list(map(str, simple_reqs))

        for r in complex_reqs:
            self._tmp_extras_require[':' + str(r.marker)].append(r)
        self.extras_require = dict(
            (k, [str(r) for r in map(self._clean_req, v)])
            for k, v in self._tmp_extras_require.items()
        )

    def _clean_req(self, req):
        """
        Given a Requirement, remove environment markers and return it.
        """
        req.marker = None
        return req

    def parse_config_files(self, filenames=None, ignore_option_errors=False):
        """Parses configuration files from various levels
        and loads configuration.

        """
        _Distribution.parse_config_files(self, filenames=filenames)

        parse_configuration(self, self.command_options,
                            ignore_option_errors=ignore_option_errors)
        self._finalize_requires()

    def parse_command_line(self):
        """Process features after parsing command line options"""
        result = _Distribution.parse_command_line(self)
        if self.features:
            self._finalize_features()
        return result

    def _feature_attrname(self, name):
        """Convert feature name to corresponding option attribute name"""
        return 'with_' + name.replace('-', '_')

    def fetch_build_eggs(self, requires):
        """Resolve pre-setup requirements"""
        resolved_dists = pkg_resources.working_set.resolve(
            pkg_resources.parse_requirements(requires),
            installer=self.fetch_build_egg,
            replace_conflicting=True,
        )
        for dist in resolved_dists:
            pkg_resources.working_set.add(dist, replace=True)
        return resolved_dists

    def finalize_options(self):
        _Distribution.finalize_options(self)
        if self.features:
            self._set_global_opts_from_features()

        for ep in pkg_resources.iter_entry_points('distutils.setup_keywords'):
            value = getattr(self, ep.name, None)
            if value is not None:
                ep.require(installer=self.fetch_build_egg)
                ep.load()(self, ep.name, value)
        if getattr(self, 'convert_2to3_doctests', None):
            # XXX may convert to set here when we can rely on set being builtin
            self.convert_2to3_doctests = [
                os.path.abspath(p)
                for p in self.convert_2to3_doctests
            ]
        else:
            self.convert_2to3_doctests = []

    def get_egg_cache_dir(self):
        egg_cache_dir = os.path.join(os.curdir, '.eggs')
        if not os.path.exists(egg_cache_dir):
            os.mkdir(egg_cache_dir)
            windows_support.hide_file(egg_cache_dir)
            readme_txt_filename = os.path.join(egg_cache_dir, 'README.txt')
            with open(readme_txt_filename, 'w') as f:
                f.write('This directory contains eggs that were downloaded '
                        'by setuptools to build, test, and run plug-ins.\n\n')
                f.write('This directory caches those eggs to prevent '
                        'repeated downloads.\n\n')
                f.write('However, it is safe to delete this directory.\n\n')

        return egg_cache_dir

    def fetch_build_egg(self, req):
        """Fetch an egg needed for building"""
        from setuptools.command.easy_install import easy_install
        dist = self.__class__({'script_args': ['easy_install']})
        opts = dist.get_option_dict('easy_install')
        opts.clear()
        opts.update(
            (k, v)
            for k, v in self.get_option_dict('easy_install').items()
            if k in (
                # don't use any other settings
                'find_links', 'site_dirs', 'index_url',
                'optimize', 'site_dirs', 'allow_hosts',
            ))
        if self.dependency_links:
            links = self.dependency_links[:]
            if 'find_links' in opts:
                links = opts['find_links'][1] + links
            opts['find_links'] = ('setup', links)
        install_dir = self.get_egg_cache_dir()
        cmd = easy_install(
            dist, args=["x"], install_dir=install_dir,
            exclude_scripts=True,
            always_copy=False, build_directory=None, editable=False,
            upgrade=False, multi_version=True, no_report=True, user=False
        )
        cmd.ensure_finalized()
        return cmd.easy_install(req)

    def _set_global_opts_from_features(self):
        """Add --with-X/--without-X options based on optional features"""

        go = []
        no = self.negative_opt.copy()

        for name, feature in self.features.items():
            self._set_feature(name, None)
            feature.validate(self)

            if feature.optional:
                descr = feature.description
                incdef = ' (default)'
                excdef = ''
                if not feature.include_by_default():
                    excdef, incdef = incdef, excdef

                new = (
                    ('with-' + name, None, 'include ' + descr + incdef),
                    ('without-' + name, None, 'exclude ' + descr + excdef),
                )
                go.extend(new)
                no['without-' + name] = 'with-' + name

        self.global_options = self.feature_options = go + self.global_options
        self.negative_opt = self.feature_negopt = no

    def _finalize_features(self):
        """Add/remove features and resolve dependencies between them"""

        # First, flag all the enabled items (and thus their dependencies)
        for name, feature in self.features.items():
            enabled = self.feature_is_included(name)
            if enabled or (enabled is None and feature.include_by_default()):
                feature.include_in(self)
                self._set_feature(name, 1)

        # Then disable the rest, so that off-by-default features don't
        # get flagged as errors when they're required by an enabled feature
        for name, feature in self.features.items():
            if not self.feature_is_included(name):
                feature.exclude_from(self)
                self._set_feature(name, 0)

    def get_command_class(self, command):
        """Pluggable version of get_command_class()"""
        if command in self.cmdclass:
            return self.cmdclass[command]

        eps = pkg_resources.iter_entry_points('distutils.commands', command)
        for ep in eps:
            ep.require(installer=self.fetch_build_egg)
            self.cmdclass[command] = cmdclass = ep.load()
            return cmdclass
        else:
            return _Distribution.get_command_class(self, command)

    def print_commands(self):
        for ep in pkg_resources.iter_entry_points('distutils.commands'):
            if ep.name not in self.cmdclass:
                # don't require extras as the commands won't be invoked
                cmdclass = ep.resolve()
                self.cmdclass[ep.name] = cmdclass
        return _Distribution.print_commands(self)

    def get_command_list(self):
        for ep in pkg_resources.iter_entry_points('distutils.commands'):
            if ep.name not in self.cmdclass:
                # don't require extras as the commands won't be invoked
                cmdclass = ep.resolve()
                self.cmdclass[ep.name] = cmdclass
        return _Distribution.get_command_list(self)

    def _set_feature(self, name, status):
        """Set feature's inclusion status"""
        setattr(self, self._feature_attrname(name), status)

    def feature_is_included(self, name):
        """Return 1 if feature is included, 0 if excluded, 'None' if unknown"""
        return getattr(self, self._feature_attrname(name))

    def include_feature(self, name):
        """Request inclusion of feature named 'name'"""

        if self.feature_is_included(name) == 0:
            descr = self.features[name].description
            raise DistutilsOptionError(
                descr + " is required, but was excluded or is not available"
            )
        self.features[name].include_in(self)
        self._set_feature(name, 1)

    def include(self, **attrs):
        """Add items to distribution that are named in keyword arguments

        For example, 'dist.include(py_modules=["x"])' would add 'x' to
        the distribution's 'py_modules' attribute, if it was not already
        there.

        Currently, this method only supports inclusion for attributes that are
        lists or tuples.  If you need to add support for adding to other
        attributes in this or a subclass, you can add an '_include_X' method,
        where 'X' is the name of the attribute.  The method will be called with
        the value passed to 'include()'.  So, 'dist.include(foo={"bar":"baz"})'
        will try to call 'dist._include_foo({"bar":"baz"})', which can then
        handle whatever special inclusion logic is needed.
        """
        for k, v in attrs.items():
            include = getattr(self, '_include_' + k, None)
            if include:
                include(v)
            else:
                self._include_misc(k, v)

    def exclude_package(self, package):
        """Remove packages, modules, and extensions in named package"""

        pfx = package + '.'
        if self.packages:
            self.packages = [
                p for p in self.packages
                if p != package and not p.startswith(pfx)
            ]

        if self.py_modules:
            self.py_modules = [
                p for p in self.py_modules
                if p != package and not p.startswith(pfx)
            ]

        if self.ext_modules:
            self.ext_modules = [
                p for p in self.ext_modules
                if p.name != package and not p.name.startswith(pfx)
            ]

    def has_contents_for(self, package):
        """Return true if 'exclude_package(package)' would do something"""

        pfx = package + '.'

        for p in self.iter_distribution_names():
            if p == package or p.startswith(pfx):
                return True

    def _exclude_misc(self, name, value):
        """Handle 'exclude()' for list/tuple attrs without a special handler"""
        if not isinstance(value, sequence):
            raise DistutilsSetupError(
                "%s: setting must be a list or tuple (%r)" % (name, value)
            )
        try:
            old = getattr(self, name)
        except AttributeError:
            raise DistutilsSetupError(
                "%s: No such distribution setting" % name
            )
        if old is not None and not isinstance(old, sequence):
            raise DistutilsSetupError(
                name + ": this setting cannot be changed via include/exclude"
            )
        elif old:
            setattr(self, name, [item for item in old if item not in value])

    def _include_misc(self, name, value):
        """Handle 'include()' for list/tuple attrs without a special handler"""

        if not isinstance(value, sequence):
            raise DistutilsSetupError(
                "%s: setting must be a list (%r)" % (name, value)
            )
        try:
            old = getattr(self, name)
        except AttributeError:
            raise DistutilsSetupError(
                "%s: No such distribution setting" % name
            )
        if old is None:
            setattr(self, name, value)
        elif not isinstance(old, sequence):
            raise DistutilsSetupError(
                name + ": this setting cannot be changed via include/exclude"
            )
        else:
            new = [item for item in value if item not in old]
            setattr(self, name, old + new)

    def exclude(self, **attrs):
        """Remove items from distribution that are named in keyword arguments

        For example, 'dist.exclude(py_modules=["x"])' would remove 'x' from
        the distribution's 'py_modules' attribute.  Excluding packages uses
        the 'exclude_package()' method, so all of the package's contained
        packages, modules, and extensions are also excluded.

        Currently, this method only supports exclusion from attributes that are
        lists or tuples.  If you need to add support for excluding from other
        attributes in this or a subclass, you can add an '_exclude_X' method,
        where 'X' is the name of the attribute.  The method will be called with
        the value passed to 'exclude()'.  So, 'dist.exclude(foo={"bar":"baz"})'
        will try to call 'dist._exclude_foo({"bar":"baz"})', which can then
        handle whatever special exclusion logic is needed.
        """
        for k, v in attrs.items():
            exclude = getattr(self, '_exclude_' + k, None)
            if exclude:
                exclude(v)
            else:
                self._exclude_misc(k, v)

    def _exclude_packages(self, packages):
        if not isinstance(packages, sequence):
            raise DistutilsSetupError(
                "packages: setting must be a list or tuple (%r)" % (packages,)
            )
        list(map(self.exclude_package, packages))

    def _parse_command_opts(self, parser, args):
        # Remove --with-X/--without-X options when processing command args
        self.global_options = self.__class__.global_options
        self.negative_opt = self.__class__.negative_opt

        # First, expand any aliases
        command = args[0]
        aliases = self.get_option_dict('aliases')
        while command in aliases:
            src, alias = aliases[command]
            del aliases[command]  # ensure each alias can expand only once!
            import shlex
            args[:1] = shlex.split(alias, True)
            command = args[0]

        nargs = _Distribution._parse_command_opts(self, parser, args)

        # Handle commands that want to consume all remaining arguments
        cmd_class = self.get_command_class(command)
        if getattr(cmd_class, 'command_consumes_arguments', None):
            self.get_option_dict(command)['args'] = ("command line", nargs)
            if nargs is not None:
                return []

        return nargs

    def get_cmdline_options(self):
        """Return a '{cmd: {opt:val}}' map of all command-line options

        Option names are all long, but do not include the leading '--', and
        contain dashes rather than underscores.  If the option doesn't take
        an argument (e.g. '--quiet'), the 'val' is 'None'.

        Note that options provided by config files are intentionally excluded.
        """

        d = {}

        for cmd, opts in self.command_options.items():

            for opt, (src, val) in opts.items():

                if src != "command line":
                    continue

                opt = opt.replace('_', '-')

                if val == 0:
                    cmdobj = self.get_command_obj(cmd)
                    neg_opt = self.negative_opt.copy()
                    neg_opt.update(getattr(cmdobj, 'negative_opt', {}))
                    for neg, pos in neg_opt.items():
                        if pos == opt:
                            opt = neg
                            val = None
                            break
                    else:
                        raise AssertionError("Shouldn't be able to get here")

                elif val == 1:
                    val = None

                d.setdefault(cmd, {})[opt] = val

        return d

    def iter_distribution_names(self):
        """Yield all packages, modules, and extension names in distribution"""

        for pkg in self.packages or ():
            yield pkg

        for module in self.py_modules or ():
            yield module

        for ext in self.ext_modules or ():
            if isinstance(ext, tuple):
                name, buildinfo = ext
            else:
                name = ext.name
            if name.endswith('module'):
                name = name[:-6]
            yield name

    def handle_display_options(self, option_order):
        """If there were any non-global "display-only" options
        (--help-commands or the metadata display options) on the command
        line, display the requested info and return true; else return
        false.
        """
        import sys

        if six.PY2 or self.help_commands:
            return _Distribution.handle_display_options(self, option_order)

        # Stdout may be StringIO (e.g. in tests)
        import io
        if not isinstance(sys.stdout, io.TextIOWrapper):
            return _Distribution.handle_display_options(self, option_order)

        # Don't wrap stdout if utf-8 is already the encoding. Provides
        #  workaround for #334.
        if sys.stdout.encoding.lower() in ('utf-8', 'utf8'):
            return _Distribution.handle_display_options(self, option_order)

        # Print metadata in UTF-8 no matter the platform
        encoding = sys.stdout.encoding
        errors = sys.stdout.errors
        newline = sys.platform != 'win32' and '\n' or None
        line_buffering = sys.stdout.line_buffering

        sys.stdout = io.TextIOWrapper(
            sys.stdout.detach(), 'utf-8', errors, newline, line_buffering)
        try:
            return _Distribution.handle_display_options(self, option_order)
        finally:
            sys.stdout = io.TextIOWrapper(
                sys.stdout.detach(), encoding, errors, newline, line_buffering)


class Feature:
    """
    **deprecated** -- The `Feature` facility was never completely implemented
    or supported, `has reported issues
    <https://github.com/pypa/setuptools/issues/58>`_ and will be removed in
    a future version.

    A subset of the distribution that can be excluded if unneeded/wanted

    Features are created using these keyword arguments:

      'description' -- a short, human readable description of the feature, to
         be used in error messages, and option help messages.

      'standard' -- if true, the feature is included by default if it is
         available on the current system.  Otherwise, the feature is only
         included if requested via a command line '--with-X' option, or if
         another included feature requires it.  The default setting is 'False'.

      'available' -- if true, the feature is available for installation on the
         current system.  The default setting is 'True'.

      'optional' -- if true, the feature's inclusion can be controlled from the
         command line, using the '--with-X' or '--without-X' options.  If
         false, the feature's inclusion status is determined automatically,
         based on 'available', 'standard', and whether any other feature
         requires it.  The default setting is 'True'.

      'require_features' -- a string or sequence of strings naming features
         that should also be included if this feature is included.  Defaults to
         empty list.  May also contain 'Require' objects that should be
         added/removed from the distribution.

      'remove' -- a string or list of strings naming packages to be removed
         from the distribution if this feature is *not* included.  If the
         feature *is* included, this argument is ignored.  This argument exists
         to support removing features that "crosscut" a distribution, such as
         defining a 'tests' feature that removes all the 'tests' subpackages
         provided by other features.  The default for this argument is an empty
         list.  (Note: the named package(s) or modules must exist in the base
         distribution when the 'setup()' function is initially called.)

      other keywords -- any other keyword arguments are saved, and passed to
         the distribution's 'include()' and 'exclude()' methods when the
         feature is included or excluded, respectively.  So, for example, you
         could pass 'packages=["a","b"]' to cause packages 'a' and 'b' to be
         added or removed from the distribution as appropriate.

    A feature must include at least one 'requires', 'remove', or other
    keyword argument.  Otherwise, it can't affect the distribution in any way.
    Note also that you can subclass 'Feature' to create your own specialized
    feature types that modify the distribution in other ways when included or
    excluded.  See the docstrings for the various methods here for more detail.
    Aside from the methods, the only feature attributes that distributions look
    at are 'description' and 'optional'.
    """

    @staticmethod
    def warn_deprecated():
        msg = (
            "Features are deprecated and will be removed in a future "
            "version. See https://github.com/pypa/setuptools/issues/65."
        )
        warnings.warn(msg, DeprecationWarning, stacklevel=3)

    def __init__(
            self, description, standard=False, available=True,
            optional=True, require_features=(), remove=(), **extras):
        self.warn_deprecated()

        self.description = description
        self.standard = standard
        self.available = available
        self.optional = optional
        if isinstance(require_features, (str, Require)):
            require_features = require_features,

        self.require_features = [
            r for r in require_features if isinstance(r, str)
        ]
        er = [r for r in require_features if not isinstance(r, str)]
        if er:
            extras['require_features'] = er

        if isinstance(remove, str):
            remove = remove,
        self.remove = remove
        self.extras = extras

        if not remove and not require_features and not extras:
            raise DistutilsSetupError(
                "Feature %s: must define 'require_features', 'remove', or "
                "at least one of 'packages', 'py_modules', etc." % description
            )

    def include_by_default(self):
        """Should this feature be included by default?"""
        return self.available and self.standard

    def include_in(self, dist):
        """Ensure feature and its requirements are included in distribution

        You may override this in a subclass to perform additional operations on
        the distribution.  Note that this method may be called more than once
        per feature, and so should be idempotent.

        """

        if not self.available:
            raise DistutilsPlatformError(
                self.description + " is required, "
                "but is not available on this platform"
            )

        dist.include(**self.extras)

        for f in self.require_features:
            dist.include_feature(f)

    def exclude_from(self, dist):
        """Ensure feature is excluded from distribution

        You may override this in a subclass to perform additional operations on
        the distribution.  This method will be called at most once per
        feature, and only after all included features have been asked to
        include themselves.
        """

        dist.exclude(**self.extras)

        if self.remove:
            for item in self.remove:
                dist.exclude_package(item)

    def validate(self, dist):
        """Verify that feature makes sense in context of distribution

        This method is called by the distribution just before it parses its
        command line.  It checks to ensure that the 'remove' attribute, if any,
        contains only valid package/module names that are present in the base
        distribution when 'setup()' is called.  You may override it in a
        subclass to perform any other required validation of the feature
        against a target distribution.
        """

        for item in self.remove:
            if not dist.has_contents_for(item):
                raise DistutilsSetupError(
                    "%s wants to be able to remove %s, but the distribution"
                    " doesn't contain any packages or modules under %s"
                    % (self.description, item, item)
                )
site-packages/setuptools/package_index.py000064400000116305147511334640014657 0ustar00"""PyPI and direct package downloading"""
import subprocess
import sys
import os
import re
import shutil
import socket
import base64
import hashlib
import itertools
from functools import wraps

from setuptools.extern import six
from setuptools.extern.six.moves import urllib, http_client, configparser, map

import setuptools
from pkg_resources import (
    CHECKOUT_DIST, Distribution, BINARY_DIST, normalize_path, SOURCE_DIST,
    Environment, find_distributions, safe_name, safe_version,
    to_filename, Requirement, DEVELOP_DIST, EGG_DIST,
)
from setuptools import ssl_support
from distutils import log
from distutils.errors import DistutilsError
from fnmatch import translate
from setuptools.py27compat import get_all_headers
from setuptools.py33compat import unescape
from setuptools.wheel import Wheel

EGG_FRAGMENT = re.compile(r'^egg=([-A-Za-z0-9_.+!]+)$')
HREF = re.compile("""href\\s*=\\s*['"]?([^'"> ]+)""", re.I)
# this is here to fix emacs' cruddy broken syntax highlighting
PYPI_MD5 = re.compile(
    '<a href="([^"#]+)">([^<]+)</a>\n\\s+\\(<a (?:title="MD5 hash"\n\\s+)'
    'href="[^?]+\\?:action=show_md5&amp;digest=([0-9a-f]{32})">md5</a>\\)'
)
URL_SCHEME = re.compile('([-+.a-z0-9]{2,}):', re.I).match
EXTENSIONS = ".tar.gz .tar.bz2 .tar .zip .tgz".split()

__all__ = [
    'PackageIndex', 'distros_for_url', 'parse_bdist_wininst',
    'interpret_distro_name',
]

_SOCKET_TIMEOUT = 15

_tmpl = "setuptools/{setuptools.__version__} Python-urllib/{py_major}"
user_agent = _tmpl.format(py_major=sys.version[:3], setuptools=setuptools)


def parse_requirement_arg(spec):
    try:
        return Requirement.parse(spec)
    except ValueError:
        raise DistutilsError(
            "Not a URL, existing file, or requirement spec: %r" % (spec,)
        )


def parse_bdist_wininst(name):
    """Return (base,pyversion) or (None,None) for possible .exe name"""

    lower = name.lower()
    base, py_ver, plat = None, None, None

    if lower.endswith('.exe'):
        if lower.endswith('.win32.exe'):
            base = name[:-10]
            plat = 'win32'
        elif lower.startswith('.win32-py', -16):
            py_ver = name[-7:-4]
            base = name[:-16]
            plat = 'win32'
        elif lower.endswith('.win-amd64.exe'):
            base = name[:-14]
            plat = 'win-amd64'
        elif lower.startswith('.win-amd64-py', -20):
            py_ver = name[-7:-4]
            base = name[:-20]
            plat = 'win-amd64'
    return base, py_ver, plat
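
# Illustrative examples (filenames are hypothetical), following the suffix
# handling above:
#
#   parse_bdist_wininst('foo-1.0.win32.exe')
#       -> ('foo-1.0', None, 'win32')
#   parse_bdist_wininst('foo-1.0.win32-py2.7.exe')
#       -> ('foo-1.0', '2.7', 'win32')
#   parse_bdist_wininst('foo-1.0.win-amd64-py3.6.exe')
#       -> ('foo-1.0', '3.6', 'win-amd64')
#   parse_bdist_wininst('foo-1.0.tar.gz')
#       -> (None, None, None)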


def egg_info_for_url(url):
    parts = urllib.parse.urlparse(url)
    scheme, server, path, parameters, query, fragment = parts
    base = urllib.parse.unquote(path.split('/')[-1])
    if server == 'sourceforge.net' and base == 'download':  # XXX Yuck
        base = urllib.parse.unquote(path.split('/')[-2])
    if '#' in base:
        base, fragment = base.split('#', 1)
    return base, fragment
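
# Illustrative example (URL is hypothetical): the last path component becomes
# the base name and the URL fragment is returned separately:
#
#   egg_info_for_url('https://example.com/dist/Foo-1.0.tar.gz#md5=0123abcd')
#       -> ('Foo-1.0.tar.gz', 'md5=0123abcd')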


def distros_for_url(url, metadata=None):
    """Yield egg or source distribution objects that might be found at a URL"""
    base, fragment = egg_info_for_url(url)
    for dist in distros_for_location(url, base, metadata):
        yield dist
    if fragment:
        match = EGG_FRAGMENT.match(fragment)
        if match:
            for dist in interpret_distro_name(
                url, match.group(1), metadata, precedence=CHECKOUT_DIST
            ):
                yield dist


def distros_for_location(location, basename, metadata=None):
    """Yield egg or source distribution objects based on basename"""
    if basename.endswith('.egg.zip'):
        basename = basename[:-4]  # strip the .zip
    if basename.endswith('.egg') and '-' in basename:
        # only one, unambiguous interpretation
        return [Distribution.from_location(location, basename, metadata)]
    if basename.endswith('.whl') and '-' in basename:
        wheel = Wheel(basename)
        if not wheel.is_compatible():
            return []
        return [Distribution(
            location=location,
            project_name=wheel.project_name,
            version=wheel.version,
            # Increase priority over eggs.
            precedence=EGG_DIST + 1,
        )]
    if basename.endswith('.exe'):
        win_base, py_ver, platform = parse_bdist_wininst(basename)
        if win_base is not None:
            return interpret_distro_name(
                location, win_base, metadata, py_ver, BINARY_DIST, platform
            )
    # Try source distro extensions (.zip, .tgz, etc.)
    #
    for ext in EXTENSIONS:
        if basename.endswith(ext):
            basename = basename[:-len(ext)]
            return interpret_distro_name(location, basename, metadata)
    return []  # no extension matched
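
# Illustrative examples (basenames are hypothetical): an '.egg' or compatible
# '.whl' basename yields a single Distribution, source archives are handed to
# interpret_distro_name() and may yield several candidate splits, and anything
# without a recognized extension yields nothing:
#
#   distros_for_location(some_url, 'Foo-1.0-py2.7.egg')  # one egg Distribution
#   distros_for_location(some_url, 'Foo-1.0.tar.gz')     # sdist interpretations
#   distros_for_location(some_url, 'README.txt')         # -> []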


def distros_for_filename(filename, metadata=None):
    """Yield possible egg or source distribution objects based on a filename"""
    return distros_for_location(
        normalize_path(filename), os.path.basename(filename), metadata
    )


def interpret_distro_name(
        location, basename, metadata, py_version=None, precedence=SOURCE_DIST,
        platform=None
):
    """Generate alternative interpretations of a source distro name

    Note: if `location` is a filesystem filename, you should call
    ``pkg_resources.normalize_path()`` on it before passing it to this
    routine!
    """
    # Generate alternative interpretations of a source distro name
    # Because some packages are ambiguous as to name/versions split
    # e.g. "adns-python-1.1.0", "egenix-mx-commercial", etc.
    # So, we generate each possible interpretation (e.g. "adns, python-1.1.0"
    # "adns-python, 1.1.0", and "adns-python-1.1.0, no version").  In practice,
    # the spurious interpretations should be ignored, because in the event
    # there's also an "adns" package, the spurious "python-1.1.0" version will
    # compare lower than any numeric version number, and is therefore unlikely
    # to match a request for it.  It's still a potential problem, though, and
    # in the long run PyPI and the distutils should go for "safe" names and
    # versions in distribution archive names (sdist and bdist).

    parts = basename.split('-')
    if not py_version and any(re.match(r'py\d\.\d$', p) for p in parts[2:]):
        # it is a bdist_dumb, not an sdist -- bail out
        return

    for p in range(1, len(parts) + 1):
        yield Distribution(
            location, metadata, '-'.join(parts[:p]), '-'.join(parts[p:]),
            py_version=py_version, precedence=precedence,
            platform=platform
        )
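
# Illustrative example: for the basename 'adns-python-1.1.0' (hypothetical),
# the generator above yields one Distribution per candidate (name, version)
# split:
#
#   ('adns', 'python-1.1.0'), ('adns-python', '1.1.0'),
#   ('adns-python-1.1.0', '')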


# From Python 2.7 docs
def unique_everseen(iterable, key=None):
    "List unique elements, preserving order. Remember all elements ever seen."
    # unique_everseen('AAAABBBCCDAABBB') --> A B C D
    # unique_everseen('ABBCcAD', str.lower) --> A B C D
    seen = set()
    seen_add = seen.add
    if key is None:
        for element in six.moves.filterfalse(seen.__contains__, iterable):
            seen_add(element)
            yield element
    else:
        for element in iterable:
            k = key(element)
            if k not in seen:
                seen_add(k)
                yield element


def unique_values(func):
    """
    Wrap a function returning an iterable such that the resulting iterable
    only ever yields unique items.
    """

    @wraps(func)
    def wrapper(*args, **kwargs):
        return unique_everseen(func(*args, **kwargs))

    return wrapper


REL = re.compile(r"""<([^>]*\srel\s{0,10}=\s{0,10}['"]?([^'" >]+)[^>]*)>""", re.I)
# this line is here to fix emacs' cruddy broken syntax highlighting


@unique_values
def find_external_links(url, page):
    """Find rel="homepage" and rel="download" links in `page`, yielding URLs"""

    for match in REL.finditer(page):
        tag, rel = match.groups()
        rels = set(map(str.strip, rel.lower().split(',')))
        if 'homepage' in rels or 'download' in rels:
            for match in HREF.finditer(tag):
                yield urllib.parse.urljoin(url, htmldecode(match.group(1)))

    for tag in ("<th>Home Page", "<th>Download URL"):
        pos = page.find(tag)
        if pos != -1:
            match = HREF.search(page, pos)
            if match:
                yield urllib.parse.urljoin(url, htmldecode(match.group(1)))


class ContentChecker(object):
    """
    A null content checker that defines the interface for checking content
    """

    def feed(self, block):
        """
        Feed a block of data to the hash.
        """
        return

    def is_valid(self):
        """
        Check the hash. Return False if validation fails.
        """
        return True

    def report(self, reporter, template):
        """
        Call reporter with information about the checker (hash name)
        substituted into the template.
        """
        return


class HashChecker(ContentChecker):
    pattern = re.compile(
        r'(?P<hash_name>sha1|sha224|sha384|sha256|sha512|md5)='
        r'(?P<expected>[a-f0-9]+)'
    )

    def __init__(self, hash_name, expected):
        self.hash_name = hash_name
        self.hash = hashlib.new(hash_name)
        self.expected = expected

    @classmethod
    def from_url(cls, url):
        "Construct a (possibly null) ContentChecker from a URL"
        fragment = urllib.parse.urlparse(url)[-1]
        if not fragment:
            return ContentChecker()
        match = cls.pattern.search(fragment)
        if not match:
            return ContentChecker()
        return cls(**match.groupdict())

    def feed(self, block):
        self.hash.update(block)

    def is_valid(self):
        return self.hash.hexdigest() == self.expected

    def report(self, reporter, template):
        msg = template % self.hash_name
        return reporter(msg)
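
# Illustrative example (digest value is hypothetical): a download URL carrying
# a '#<hash_name>=<hexdigest>' fragment produces a real HashChecker, while URLs
# without a recognized fragment fall back to the null ContentChecker:
#
#   checker = HashChecker.from_url(
#       'https://example.com/Foo-1.0.tar.gz#sha256=' + 'ab' * 32)
#   checker.feed(b'...downloaded bytes...')
#   checker.is_valid()  # True only if the digest matches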


class PackageIndex(Environment):
    """A distribution index that scans web pages for download URLs"""

    def __init__(
            self, index_url="https://pypi.org/simple/", hosts=('*',),
            ca_bundle=None, verify_ssl=True, *args, **kw
    ):
        Environment.__init__(self, *args, **kw)
        self.index_url = index_url + ("" if index_url.endswith('/') else "/")
        self.scanned_urls = {}
        self.fetched_urls = {}
        self.package_pages = {}
        self.allows = re.compile('|'.join(map(translate, hosts))).match
        self.to_scan = []
        use_ssl = (
            verify_ssl
            and ssl_support.is_available
            and (ca_bundle or ssl_support.find_ca_bundle())
        )
        if use_ssl:
            self.opener = ssl_support.opener_for(ca_bundle)
        else:
            self.opener = urllib.request.urlopen

    def process_url(self, url, retrieve=False):
        """Evaluate a URL as a possible download, and maybe retrieve it"""
        if url in self.scanned_urls and not retrieve:
            return
        self.scanned_urls[url] = True
        if not URL_SCHEME(url):
            self.process_filename(url)
            return
        else:
            dists = list(distros_for_url(url))
            if dists:
                if not self.url_ok(url):
                    return
                self.debug("Found link: %s", url)

        if dists or not retrieve or url in self.fetched_urls:
            list(map(self.add, dists))
            return  # don't need the actual page

        if not self.url_ok(url):
            self.fetched_urls[url] = True
            return

        self.info("Reading %s", url)
        self.fetched_urls[url] = True  # prevent multiple fetch attempts
        tmpl = "Download error on %s: %%s -- Some packages may not be found!"
        f = self.open_url(url, tmpl % url)
        if f is None:
            return
        self.fetched_urls[f.url] = True
        if 'html' not in f.headers.get('content-type', '').lower():
            f.close()  # not html, we can't process it
            return

        base = f.url  # handle redirects
        page = f.read()
        if not isinstance(page, str):
            # Under Python 3 we got bytes, but we want str.
            if isinstance(f, urllib.error.HTTPError):
                # Errors have no charset, assume latin1:
                charset = 'latin-1'
            else:
                charset = f.headers.get_param('charset') or 'latin-1'
            page = page.decode(charset, "ignore")
        f.close()
        for match in HREF.finditer(page):
            link = urllib.parse.urljoin(base, htmldecode(match.group(1)))
            self.process_url(link)
        if url.startswith(self.index_url) and getattr(f, 'code', None) != 404:
            page = self.process_index(url, page)

    def process_filename(self, fn, nested=False):
        # process filenames or directories
        if not os.path.exists(fn):
            self.warn("Not found: %s", fn)
            return

        if os.path.isdir(fn) and not nested:
            path = os.path.realpath(fn)
            for item in os.listdir(path):
                self.process_filename(os.path.join(path, item), True)

        dists = distros_for_filename(fn)
        if dists:
            self.debug("Found: %s", fn)
            list(map(self.add, dists))

    def url_ok(self, url, fatal=False):
        s = URL_SCHEME(url)
        is_file = s and s.group(1).lower() == 'file'
        if is_file or self.allows(urllib.parse.urlparse(url)[1]):
            return True
        msg = (
            "\nNote: Bypassing %s (disallowed host; see "
            "http://bit.ly/2hrImnY for details).\n")
        if fatal:
            raise DistutilsError(msg % url)
        else:
            self.warn(msg, url)

    def scan_egg_links(self, search_path):
        dirs = filter(os.path.isdir, search_path)
        egg_links = (
            (path, entry)
            for path in dirs
            for entry in os.listdir(path)
            if entry.endswith('.egg-link')
        )
        list(itertools.starmap(self.scan_egg_link, egg_links))

    def scan_egg_link(self, path, entry):
        with open(os.path.join(path, entry)) as raw_lines:
            # filter non-empty lines
            lines = list(filter(None, map(str.strip, raw_lines)))

        if len(lines) != 2:
            # format is not recognized; punt
            return

        egg_path, setup_path = lines

        for dist in find_distributions(os.path.join(path, egg_path)):
            dist.location = os.path.join(path, *lines)
            dist.precedence = SOURCE_DIST
            self.add(dist)
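
    # Illustrative .egg-link contents (paths are hypothetical): exactly two
    # non-empty lines are expected, the egg/project directory followed by the
    # setup directory, e.g.:
    #
    #   /home/user/src/foo
    #   .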

    def process_index(self, url, page):
        """Process the contents of a PyPI page"""

        def scan(link):
            # Process a URL to see if it's for a package page
            if link.startswith(self.index_url):
                parts = list(map(
                    urllib.parse.unquote, link[len(self.index_url):].split('/')
                ))
                if len(parts) == 2 and '#' not in parts[1]:
                    # it's a package page, sanitize and index it
                    pkg = safe_name(parts[0])
                    ver = safe_version(parts[1])
                    self.package_pages.setdefault(pkg.lower(), {})[link] = True
                    return to_filename(pkg), to_filename(ver)
            return None, None

        # process an index page into the package-page index
        for match in HREF.finditer(page):
            try:
                scan(urllib.parse.urljoin(url, htmldecode(match.group(1))))
            except ValueError:
                pass

        pkg, ver = scan(url)  # ensure this page is in the page index
        if pkg:
            # process individual package page
            for new_url in find_external_links(url, page):
                # Process the found URL
                base, frag = egg_info_for_url(new_url)
                if base.endswith('.py') and not frag:
                    if ver:
                        new_url += '#egg=%s-%s' % (pkg, ver)
                    else:
                        self.need_version_info(url)
                self.scan_url(new_url)

            return PYPI_MD5.sub(
                lambda m: '<a href="%s#md5=%s">%s</a>' % m.group(1, 3, 2), page
            )
        else:
            return ""  # no sense double-scanning non-package pages

    def need_version_info(self, url):
        self.scan_all(
            "Page at %s links to .py file(s) without version info; an index "
            "scan is required.", url
        )

    def scan_all(self, msg=None, *args):
        if self.index_url not in self.fetched_urls:
            if msg:
                self.warn(msg, *args)
            self.info(
                "Scanning index of all packages (this may take a while)"
            )
        self.scan_url(self.index_url)

    def find_packages(self, requirement):
        self.scan_url(self.index_url + requirement.unsafe_name + '/')

        if not self.package_pages.get(requirement.key):
            # Fall back to safe version of the name
            self.scan_url(self.index_url + requirement.project_name + '/')

        if not self.package_pages.get(requirement.key):
            # We couldn't find the target package, so search the index page too
            self.not_found_in_index(requirement)

        for url in list(self.package_pages.get(requirement.key, ())):
            # scan each page that might be related to the desired package
            self.scan_url(url)

    def obtain(self, requirement, installer=None):
        self.prescan()
        self.find_packages(requirement)
        for dist in self[requirement.key]:
            if dist in requirement:
                return dist
            self.debug("%s does not match %s", requirement, dist)
        return super(PackageIndex, self).obtain(requirement, installer)

    def check_hash(self, checker, filename, tfp):
        """
        checker is a ContentChecker
        """
        checker.report(
            self.debug,
            "Validating %%s checksum for %s" % filename)
        if not checker.is_valid():
            tfp.close()
            os.unlink(filename)
            raise DistutilsError(
                "%s validation failed for %s; "
                "possible download problem?"
                % (checker.hash.name, os.path.basename(filename))
            )

    def add_find_links(self, urls):
        """Add `urls` to the list that will be prescanned for searches"""
        for url in urls:
            if (
                self.to_scan is None  # if we have already "gone online"
                or not URL_SCHEME(url)  # or it's a local file/directory
                or url.startswith('file:')
                or list(distros_for_url(url))  # or a direct package link
            ):
                # then go ahead and process it now
                self.scan_url(url)
            else:
                # otherwise, defer retrieval till later
                self.to_scan.append(url)

    def prescan(self):
        """Scan urls scheduled for prescanning (e.g. --find-links)"""
        if self.to_scan:
            list(map(self.scan_url, self.to_scan))
        self.to_scan = None  # from now on, go ahead and process immediately

    def not_found_in_index(self, requirement):
        if self[requirement.key]:  # we've seen at least one distro
            meth, msg = self.info, "Couldn't retrieve index page for %r"
        else:  # no distros seen for this name, might be misspelled
            meth, msg = (
                self.warn,
                "Couldn't find index page for %r (maybe misspelled?)")
        meth(msg, requirement.unsafe_name)
        self.scan_all()

    def download(self, spec, tmpdir):
        """Locate and/or download `spec` to `tmpdir`, returning a local path

        `spec` may be a ``Requirement`` object, or a string containing a URL,
        an existing local filename, or a project/version requirement spec
        (i.e. the string form of a ``Requirement`` object).  If it is the URL
        of a .py file with an unambiguous ``#egg=name-version`` tag (i.e., one
        that escapes ``-`` as ``_`` throughout), a trivial ``setup.py`` is
        automatically created alongside the downloaded file.

        If `spec` is a ``Requirement`` object or a string containing a
        project/version requirement spec, this method returns the location of
        a matching distribution (possibly after downloading it to `tmpdir`).
        If `spec` is a locally existing file or directory name, it is simply
        returned unchanged.  If `spec` is a URL, it is downloaded to a subpath
        of `tmpdir`, and the local filename is returned.  Various errors may be
        raised if a problem occurs during downloading.
        """
        if not isinstance(spec, Requirement):
            scheme = URL_SCHEME(spec)
            if scheme:
                # It's a url, download it to tmpdir
                found = self._download_url(scheme.group(1), spec, tmpdir)
                base, fragment = egg_info_for_url(spec)
                if base.endswith('.py'):
                    found = self.gen_setup(found, fragment, tmpdir)
                return found
            elif os.path.exists(spec):
                # Existing file or directory, just return it
                return spec
            else:
                spec = parse_requirement_arg(spec)
        return getattr(self.fetch_distribution(spec, tmpdir), 'location', None)
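
    # Illustrative usage (URL, spec, and directory are hypothetical):
    #
    #   pi = PackageIndex()
    #   path = pi.download('https://example.com/Foo-1.0.tar.gz', '/tmp/build')
    #   # ...or resolve a project/version requirement spec instead of a URL:
    #   path = pi.download('Foo>=1.0', '/tmp/build')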

    def fetch_distribution(
            self, requirement, tmpdir, force_scan=False, source=False,
            develop_ok=False, local_index=None):
        """Obtain a distribution suitable for fulfilling `requirement`

        `requirement` must be a ``pkg_resources.Requirement`` instance.
        If necessary, or if the `force_scan` flag is set, the requirement is
        searched for in the (online) package index as well as the locally
        installed packages.  If a distribution matching `requirement` is found,
        the returned distribution's ``location`` is the value you would have
        gotten from calling the ``download()`` method with the matching
        distribution's URL or filename.  If no matching distribution is found,
        ``None`` is returned.

        If the `source` flag is set, only source distributions and source
        checkout links will be considered.  Unless the `develop_ok` flag is
        set, development and system eggs (i.e., those using the ``.egg-info``
        format) will be ignored.
        """
        # process a Requirement
        self.info("Searching for %s", requirement)
        skipped = {}
        dist = None

        def find(req, env=None):
            if env is None:
                env = self
            # Find a matching distribution; may be called more than once

            for dist in env[req.key]:

                if dist.precedence == DEVELOP_DIST and not develop_ok:
                    if dist not in skipped:
                        self.warn(
                            "Skipping development or system egg: %s", dist,
                        )
                        skipped[dist] = 1
                    continue

                test = (
                    dist in req
                    and (dist.precedence <= SOURCE_DIST or not source)
                )
                if test:
                    loc = self.download(dist.location, tmpdir)
                    dist.download_location = loc
                    if os.path.exists(dist.download_location):
                        return dist

        if force_scan:
            self.prescan()
            self.find_packages(requirement)
            dist = find(requirement)

        if not dist and local_index is not None:
            dist = find(requirement, local_index)

        if dist is None:
            if self.to_scan is not None:
                self.prescan()
            dist = find(requirement)

        if dist is None and not force_scan:
            self.find_packages(requirement)
            dist = find(requirement)

        if dist is None:
            self.warn(
                "No local packages or working download links found for %s%s",
                (source and "a source distribution of " or ""),
                requirement,
            )
        else:
            self.info("Best match: %s", dist)
            return dist.clone(location=dist.download_location)

    def fetch(self, requirement, tmpdir, force_scan=False, source=False):
        """Obtain a file suitable for fulfilling `requirement`

        DEPRECATED; use the ``fetch_distribution()`` method now instead.  For
        backward compatibility, this routine is identical but returns the
        ``location`` of the downloaded distribution instead of a distribution
        object.
        """
        dist = self.fetch_distribution(requirement, tmpdir, force_scan, source)
        if dist is not None:
            return dist.location
        return None

    def gen_setup(self, filename, fragment, tmpdir):
        match = EGG_FRAGMENT.match(fragment)
        dists = match and [
            d for d in
            interpret_distro_name(filename, match.group(1), None) if d.version
        ] or []

        if len(dists) == 1:  # unambiguous ``#egg`` fragment
            basename = os.path.basename(filename)

            # Make sure the file has been downloaded to the temp dir.
            if os.path.dirname(filename) != tmpdir:
                dst = os.path.join(tmpdir, basename)
                from setuptools.command.easy_install import samefile
                if not samefile(filename, dst):
                    shutil.copy2(filename, dst)
                    filename = dst

            with open(os.path.join(tmpdir, 'setup.py'), 'w') as file:
                file.write(
                    "from setuptools import setup\n"
                    "setup(name=%r, version=%r, py_modules=[%r])\n"
                    % (
                        dists[0].project_name, dists[0].version,
                        os.path.splitext(basename)[0]
                    )
                )
            return filename

        elif match:
            raise DistutilsError(
                "Can't unambiguously interpret project/version identifier %r; "
                "any dashes in the name or version should be escaped using "
                "underscores. %r" % (fragment, dists)
            )
        else:
            raise DistutilsError(
                "Can't process plain .py files without an '#egg=name-version'"
                " suffix to enable automatic setup script generation."
            )

    dl_blocksize = 8192

    def _download_to(self, url, filename):
        self.info("Downloading %s", url)
        # Download the file
        fp = None
        try:
            checker = HashChecker.from_url(url)
            fp = self.open_url(url)
            if isinstance(fp, urllib.error.HTTPError):
                raise DistutilsError(
                    "Can't download %s: %s %s" % (url, fp.code, fp.msg)
                )
            headers = fp.info()
            blocknum = 0
            bs = self.dl_blocksize
            size = -1
            if "content-length" in headers:
                # Some servers return multiple Content-Length headers :(
                sizes = get_all_headers(headers, 'Content-Length')
                size = max(map(int, sizes))
                self.reporthook(url, filename, blocknum, bs, size)
            with open(filename, 'wb') as tfp:
                while True:
                    block = fp.read(bs)
                    if block:
                        checker.feed(block)
                        tfp.write(block)
                        blocknum += 1
                        self.reporthook(url, filename, blocknum, bs, size)
                    else:
                        break
                self.check_hash(checker, filename, tfp)
            return headers
        finally:
            if fp:
                fp.close()

    def reporthook(self, url, filename, blocknum, blksize, size):
        pass  # no-op

    def open_url(self, url, warning=None):
        if url.startswith('file:'):
            return local_open(url)
        try:
            return open_with_auth(url, self.opener)
        except (ValueError, http_client.InvalidURL) as v:
            msg = ' '.join([str(arg) for arg in v.args])
            if warning:
                self.warn(warning, msg)
            else:
                raise DistutilsError('%s %s' % (url, msg))
        except urllib.error.HTTPError as v:
            return v
        except urllib.error.URLError as v:
            if warning:
                self.warn(warning, v.reason)
            else:
                raise DistutilsError("Download error for %s: %s"
                                     % (url, v.reason))
        except http_client.BadStatusLine as v:
            if warning:
                self.warn(warning, v.line)
            else:
                raise DistutilsError(
                    '%s returned a bad status line. The server might be '
                    'down, %s' %
                    (url, v.line)
                )
        except (http_client.HTTPException, socket.error) as v:
            if warning:
                self.warn(warning, v)
            else:
                raise DistutilsError("Download error for %s: %s"
                                     % (url, v))

    def _download_url(self, scheme, url, tmpdir):
        # Determine download filename
        #
        name, fragment = egg_info_for_url(url)
        if name:
            while '..' in name:
                name = name.replace('..', '.').replace('\\', '_')
        else:
            name = "__downloaded__"  # default if URL has no path contents

        if name.endswith('.egg.zip'):
            name = name[:-4]  # strip the extra .zip before download

        filename = os.path.join(tmpdir, name)

        # Download the file
        #
        if scheme == 'svn' or scheme.startswith('svn+'):
            return self._download_svn(url, filename)
        elif scheme == 'git' or scheme.startswith('git+'):
            return self._download_git(url, filename)
        elif scheme.startswith('hg+'):
            return self._download_hg(url, filename)
        elif scheme == 'file':
            return urllib.request.url2pathname(urllib.parse.urlparse(url)[2])
        else:
            self.url_ok(url, True)  # raises error if not allowed
            return self._attempt_download(url, filename)

    def scan_url(self, url):
        self.process_url(url, True)

    def _attempt_download(self, url, filename):
        headers = self._download_to(url, filename)
        if 'html' in headers.get('content-type', '').lower():
            return self._download_html(url, headers, filename)
        else:
            return filename

    def _download_html(self, url, headers, filename):
        file = open(filename)
        for line in file:
            if line.strip():
                # Check for a subversion index page
                if re.search(r'<title>([^- ]+ - )?Revision \d+:', line):
                    # it's a subversion index page:
                    file.close()
                    os.unlink(filename)
                    return self._download_svn(url, filename)
                break  # not an index page
        file.close()
        os.unlink(filename)
        raise DistutilsError("Unexpected HTML page found at " + url)

    def _download_svn(self, url, filename):
        url = url.split('#', 1)[0]  # remove any fragment for svn's sake
        creds = []
        if url.lower().startswith('svn:') and '@' in url:
            scheme, netloc, path, p, q, f = urllib.parse.urlparse(url)
            if not netloc and path.startswith('//') and '/' in path[2:]:
                netloc, path = path[2:].split('/', 1)
                auth, host = urllib.parse.splituser(netloc)
                if auth:
                    if ':' in auth:
                        user, pw = auth.split(':', 1)
                        creds = ["--username=" + user, "--password=" + pw]
                    else:
                        creds = ["--username=" + auth]
                    netloc = host
                    parts = scheme, netloc, url, p, q, f
                    url = urllib.parse.urlunparse(parts)
        self.info("Doing subversion checkout from %s to %s", url, filename)
        subprocess.check_call(["svn", "checkout"] + creds + ["-q", url, filename])
        return filename

    @staticmethod
    def _vcs_split_rev_from_url(url, pop_prefix=False):
        scheme, netloc, path, query, frag = urllib.parse.urlsplit(url)

        scheme = scheme.split('+', 1)[-1]

        # urlsplit() can leave the fragment embedded in the path for some
        # schemes, so strip anything after '#' explicitly.
        path = path.split('#', 1)[0]

        rev = None
        if '@' in path:
            path, rev = path.rsplit('@', 1)

        # Rebuild the URL without the fragment (and without any '@rev' suffix).
        url = urllib.parse.urlunsplit((scheme, netloc, path, query, ''))

        return url, rev

    def _download_git(self, url, filename):
        filename = filename.split('#', 1)[0]
        url, rev = self._vcs_split_rev_from_url(url, pop_prefix=True)

        self.info("Doing git clone from %s to %s", url, filename)
        subprocess.check_call(["git", "clone", "--quiet", url, filename])

        if rev is not None:
            self.info("Checking out %s", rev)
            subprocess.check_call(["git", "-C", filename, "checkout", "--quiet", rev])

        return filename

    def _download_hg(self, url, filename):
        filename = filename.split('#', 1)[0]
        url, rev = self._vcs_split_rev_from_url(url, pop_prefix=True)

        self.info("Doing hg clone from %s to %s", url, filename)
        subprocess.check_call(["hg", "clone", "--quiet", url, filename])

        if rev is not None:
            self.info("Updating to %s", rev)
            subprocess.check_call(["hg", "--cwd", filename, "up", "-C", "-r", rev, "-q"])

        return filename

    def debug(self, msg, *args):
        log.debug(msg, *args)

    def info(self, msg, *args):
        log.info(msg, *args)

    def warn(self, msg, *args):
        log.warn(msg, *args)


# This pattern matches a character entity reference (a decimal numeric
# reference, a hexadecimal numeric reference, or a named reference).
entity_sub = re.compile(r'&(#(\d+|x[\da-fA-F]+)|[\w.:-]+);?').sub


def decode_entity(match):
    what = match.group(1)
    return unescape(what)


def htmldecode(text):
    """Decode HTML entities in the given text."""
    return entity_sub(decode_entity, text)


def socket_timeout(timeout=15):
    def _socket_timeout(func):
        def _socket_timeout(*args, **kwargs):
            old_timeout = socket.getdefaulttimeout()
            socket.setdefaulttimeout(timeout)
            try:
                return func(*args, **kwargs)
            finally:
                socket.setdefaulttimeout(old_timeout)

        return _socket_timeout

    return _socket_timeout
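
# Illustrative usage (the wrapped function is hypothetical): the decorator
# installs a default socket timeout for the duration of the call and then
# restores the previous value.
#
#   @socket_timeout(30)
#   def fetch_index(url):
#       return urllib.request.urlopen(url).read()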


def _encode_auth(auth):
    """
    A function compatible with Python 2.3-3.3 that encodes the auth
    (``user:password``) portion of a URL into a value suitable for an
    HTTP Basic Authorization header.
    >>> str(_encode_auth('username%3Apassword'))
    'dXNlcm5hbWU6cGFzc3dvcmQ='

    Long auth strings should not cause a newline to be inserted.
    >>> long_auth = 'username:' + 'password'*10
    >>> chr(10) in str(_encode_auth(long_auth))
    False
    """
    auth_s = urllib.parse.unquote(auth)
    # convert to bytes
    auth_bytes = auth_s.encode()
    # use the legacy interface for Python 2.3 support
    encoded_bytes = base64.encodestring(auth_bytes)
    # convert back to a string
    encoded = encoded_bytes.decode()
    # strip the trailing carriage return
    return encoded.replace('\n', '')


class Credential(object):
    """
    A username/password pair. Use like a namedtuple.
    """

    def __init__(self, username, password):
        self.username = username
        self.password = password

    def __iter__(self):
        yield self.username
        yield self.password

    def __str__(self):
        return '%(username)s:%(password)s' % vars(self)


class PyPIConfig(configparser.RawConfigParser):
    def __init__(self):
        """
        Load from ~/.pypirc
        """
        defaults = dict.fromkeys(['username', 'password', 'repository'], '')
        configparser.RawConfigParser.__init__(self, defaults)

        rc = os.path.join(os.path.expanduser('~'), '.pypirc')
        if os.path.exists(rc):
            self.read(rc)

    @property
    def creds_by_repository(self):
        sections_with_repositories = [
            section for section in self.sections()
            if self.get(section, 'repository').strip()
        ]

        return dict(map(self._get_repo_cred, sections_with_repositories))

    def _get_repo_cred(self, section):
        repo = self.get(section, 'repository').strip()
        return repo, Credential(
            self.get(section, 'username').strip(),
            self.get(section, 'password').strip(),
        )

    def find_credential(self, url):
        """
        If the URL indicated appears to be a repository defined in this
        config, return the credential for that repository.
        """
        for repository, cred in self.creds_by_repository.items():
            if url.startswith(repository):
                return cred
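
# Illustrative ~/.pypirc contents (credentials are hypothetical); any section
# that defines a 'repository' is mapped to a Credential by the class above:
#
#   [pypi]
#   repository = https://upload.pypi.org/legacy/
#   username = example-user
#   password = example-password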


def open_with_auth(url, opener=urllib.request.urlopen):
    """Open a urllib2 request, handling HTTP authentication"""

    scheme, netloc, path, params, query, frag = urllib.parse.urlparse(url)

    # Double scheme does not raise on Mac OS X as revealed by a
    # failing test. We would expect "nonnumeric port". Refs #20.
    if netloc.endswith(':'):
        raise http_client.InvalidURL("nonnumeric port: ''")

    if scheme in ('http', 'https'):
        auth, host = urllib.parse.splituser(netloc)
    else:
        auth = None

    if not auth:
        cred = PyPIConfig().find_credential(url)
        if cred:
            auth = str(cred)
            info = cred.username, url
            log.info('Authenticating as %s for %s (from .pypirc)', *info)

    if auth:
        auth = "Basic " + _encode_auth(auth)
        parts = scheme, host, path, params, query, frag
        new_url = urllib.parse.urlunparse(parts)
        request = urllib.request.Request(new_url)
        request.add_header("Authorization", auth)
    else:
        request = urllib.request.Request(url)

    request.add_header('User-Agent', user_agent)
    fp = opener(request)

    if auth:
        # Put authentication info back into request URL if same host,
        # so that links found on the page will work
        s2, h2, path2, param2, query2, frag2 = urllib.parse.urlparse(fp.url)
        if s2 == scheme and h2 == host:
            parts = s2, netloc, path2, param2, query2, frag2
            fp.url = urllib.parse.urlunparse(parts)

    return fp


# adding a timeout to avoid freezing package_index
open_with_auth = socket_timeout(_SOCKET_TIMEOUT)(open_with_auth)


def fix_sf_url(url):
    return url  # backward compatibility


def local_open(url):
    """Read a local path, with special support for directories"""
    scheme, server, path, param, query, frag = urllib.parse.urlparse(url)
    filename = urllib.request.url2pathname(path)
    if os.path.isfile(filename):
        return urllib.request.urlopen(url)
    elif path.endswith('/') and os.path.isdir(filename):
        files = []
        for f in os.listdir(filename):
            filepath = os.path.join(filename, f)
            if f == 'index.html':
                with open(filepath, 'r') as fp:
                    body = fp.read()
                break
            elif os.path.isdir(filepath):
                f += '/'
            files.append('<a href="{name}">{name}</a>'.format(name=f))
        else:
            tmpl = (
                "<html><head><title>{url}</title>"
                "</head><body>{files}</body></html>")
            body = tmpl.format(url=url, files='\n'.join(files))
        status, message = 200, "OK"
    else:
        status, message, body = 404, "Path not found", "Not found"

    headers = {'content-type': 'text/html'}
    body_stream = six.StringIO(body)
    return urllib.error.HTTPError(url, status, message, headers, body_stream)
site-packages/setuptools/glob.py000064400000012127147511334640013015 0ustar00"""
Filename globbing utility. Mostly a copy of `glob` from Python 3.5.

Changes include:
 * `yield from` and PEP3102 `*` removed.
 * `bytes` changed to `six.binary_type`.
 * Hidden files are not ignored.
"""

import os
import re
import fnmatch
from setuptools.extern.six import binary_type

__all__ = ["glob", "iglob", "escape"]


def glob(pathname, recursive=False):
    """Return a list of paths matching a pathname pattern.

    The pattern may contain simple shell-style wildcards a la
    fnmatch. Unlike the standard library ``glob``, filenames starting
    with a dot are *not* treated specially here: hidden files are
    matched by '*' and '?' patterns (see the module docstring above).

    If recursive is true, the pattern '**' will match any files and
    zero or more directories and subdirectories.
    """
    return list(iglob(pathname, recursive=recursive))
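
# Illustrative usage (paths are hypothetical):
#
#   glob('src/*.py')                      # non-recursive match
#   glob('src/**/*.py', recursive=True)   # '**' crosses directory levels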


def iglob(pathname, recursive=False):
    """Return an iterator which yields the paths matching a pathname pattern.

    The pattern may contain simple shell-style wildcards a la
    fnmatch. Unlike the standard library ``glob``, filenames starting
    with a dot are *not* treated specially here: hidden files are
    matched by '*' and '?' patterns (see the module docstring above).

    If recursive is true, the pattern '**' will match any files and
    zero or more directories and subdirectories.
    """
    it = _iglob(pathname, recursive)
    if recursive and _isrecursive(pathname):
        s = next(it)  # skip empty string
        assert not s
    return it


def _iglob(pathname, recursive):
    dirname, basename = os.path.split(pathname)
    if not has_magic(pathname):
        if basename:
            if os.path.lexists(pathname):
                yield pathname
        else:
            # Patterns ending with a slash should match only directories
            if os.path.isdir(dirname):
                yield pathname
        return
    if not dirname:
        if recursive and _isrecursive(basename):
            for x in glob2(dirname, basename):
                yield x
        else:
            for x in glob1(dirname, basename):
                yield x
        return
    # `os.path.split()` returns the argument itself as a dirname if it is a
    # drive or UNC path.  Prevent an infinite recursion if a drive or UNC path
    # contains magic characters (i.e. r'\\?\C:').
    if dirname != pathname and has_magic(dirname):
        dirs = _iglob(dirname, recursive)
    else:
        dirs = [dirname]
    if has_magic(basename):
        if recursive and _isrecursive(basename):
            glob_in_dir = glob2
        else:
            glob_in_dir = glob1
    else:
        glob_in_dir = glob0
    for dirname in dirs:
        for name in glob_in_dir(dirname, basename):
            yield os.path.join(dirname, name)


# These 2 helper functions non-recursively glob inside a literal directory.
# They return a list of basenames. `glob1` accepts a pattern while `glob0`
# takes a literal basename (so it only has to check for its existence).


def glob1(dirname, pattern):
    if not dirname:
        if isinstance(pattern, binary_type):
            dirname = os.curdir.encode('ASCII')
        else:
            dirname = os.curdir
    try:
        names = os.listdir(dirname)
    except OSError:
        return []
    return fnmatch.filter(names, pattern)


def glob0(dirname, basename):
    if not basename:
        # `os.path.split()` returns an empty basename for paths ending with a
        # directory separator.  'q*x/' should match only directories.
        if os.path.isdir(dirname):
            return [basename]
    else:
        if os.path.lexists(os.path.join(dirname, basename)):
            return [basename]
    return []


# This helper function recursively yields relative pathnames inside a literal
# directory.


def glob2(dirname, pattern):
    assert _isrecursive(pattern)
    yield pattern[:0]
    for x in _rlistdir(dirname):
        yield x


# Recursively yields relative pathnames inside a literal directory.
def _rlistdir(dirname):
    if not dirname:
        if isinstance(dirname, binary_type):
            dirname = binary_type(os.curdir, 'ASCII')
        else:
            dirname = os.curdir
    try:
        names = os.listdir(dirname)
    except os.error:
        return
    for x in names:
        yield x
        path = os.path.join(dirname, x) if dirname else x
        for y in _rlistdir(path):
            yield os.path.join(x, y)


magic_check = re.compile('([*?[])')
magic_check_bytes = re.compile(b'([*?[])')


def has_magic(s):
    if isinstance(s, binary_type):
        match = magic_check_bytes.search(s)
    else:
        match = magic_check.search(s)
    return match is not None


def _isrecursive(pattern):
    if isinstance(pattern, binary_type):
        return pattern == b'**'
    else:
        return pattern == '**'


def escape(pathname):
    """Escape all special characters.
    """
    # Escaping is done by wrapping any of "*?[" between square brackets.
    # Metacharacters do not work in the drive part and shouldn't be escaped.
    drive, pathname = os.path.splitdrive(pathname)
    if isinstance(pathname, binary_type):
        pathname = magic_check_bytes.sub(br'[\1]', pathname)
    else:
        pathname = magic_check.sub(r'[\1]', pathname)
    return drive + pathname
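
# Illustrative example (path is hypothetical): each of '*', '?' and '[' in the
# non-drive part is wrapped in square brackets.
#
#   escape('downloads/foo[1]*.txt')  ->  'downloads/foo[[]1][*].txt'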
site-packages/setuptools/__pycache__/launch.cpython-36.pyc000064400000001421147511334640017623 0ustar003

��f�@s.dZddlZddlZdd�Zedkr*e�dS)z[
Launch the Python script on the command line after
setuptools is bootstrapped via import.
�NcCsrttjd}t|ddd�}tjdd�tjdd�<ttdt�}||�j�}|jdd�}t	||d�}t
||�dS)	zP
    Run the script in sys.argv[1] as if it had
    been invoked naturally.
    ��__main__N)�__file__�__name__�__doc__�openz\r\nz\n�exec)�__builtins__�sys�argv�dict�getattr�tokenizer�read�replace�compiler)Zscript_name�	namespaceZopen_ZscriptZnorm_script�code�r�/usr/lib/python3.6/launch.py�run
s
rr)rrr
rrrrrr�<module>s
site-packages/setuptools/__pycache__/namespaces.cpython-36.opt-1.pyc000064400000007031147511334640021432 0ustar003

��f�@sRddlZddlmZddlZddlmZejjZGdd�d�Z	Gdd�de	�Z
dS)�N)�log)�mapc	@sTeZdZdZdd�Zdd�Zdd�ZdZdZdd�Z	dd�Z
dd�Zedd��Z
dS)�	Installerz
-nspkg.pthc	Cs�|j�}|sdStjj|j��\}}||j7}|jj|�tj	d|�t
|j|�}|jrdt
|�dSt|d��}|j|�WdQRXdS)Nz
Installing %sZwt)�_get_all_ns_packages�os�path�splitext�_get_target�	nspkg_extZoutputs�appendr�infor�_gen_nspkg_lineZdry_run�list�open�
writelines)�selfZnsp�filename�ext�lines�f�r� /usr/lib/python3.6/namespaces.py�install_namespacess
zInstaller.install_namespacescCsHtjj|j��\}}||j7}tjj|�s.dStjd|�tj|�dS)NzRemoving %s)	rrrr	r
�existsrr�remove)rrrrrr�uninstall_namespaces!s
zInstaller.uninstall_namespacescCs|jS)N)�target)rrrrr	)szInstaller._get_target�import sys, types, os�#has_mfs = sys.version_info > (3, 5)�$p = os.path.join(%(root)s, *%(pth)r)�4importlib = has_mfs and __import__('importlib.util')�-has_mfs and __import__('importlib.machinery')��m = has_mfs and sys.modules.setdefault(%(pkg)r, importlib.util.module_from_spec(importlib.machinery.PathFinder.find_spec(%(pkg)r, [os.path.dirname(p)])))�Cm = m or sys.modules.setdefault(%(pkg)r, types.ModuleType(%(pkg)r))�7mp = (m or []) and m.__dict__.setdefault('__path__',[])�(p not in mp) and mp.append(p)�4m and setattr(sys.modules[%(parent)r], %(child)r, m)cCsdS)Nz$sys._getframe(1).f_locals['sitedir']r)rrrr�	_get_rootCszInstaller._get_rootcCsVt|�}t|jd��}|j�}|j}|jd�\}}}|rB||j7}dj|�t�dS)N�.�;�
)	�str�tuple�splitr'�_nspkg_tmpl�
rpartition�_nspkg_tmpl_multi�join�locals)r�pkgZpth�rootZ
tmpl_lines�parent�sepZchildrrrr
Fs
zInstaller._gen_nspkg_linecCs |jjp
g}ttt|j|���S)z,Return sorted list of all package namespaces)ZdistributionZnamespace_packages�sorted�flattenr�
_pkg_names)rZpkgsrrrrQszInstaller._get_all_ns_packagesccs,|jd�}x|r&dj|�V|j�qWdS)z�
        Given a namespace package, yield the components of that
        package.

        >>> names = Installer._pkg_names('a.b.c')
        >>> set(names) == set(['a', 'a.b', 'a.b.c'])
        True
        r(N)r-r1�pop)r3�partsrrrr9Vs

zInstaller._pkg_namesN)	rrrr r!r"r#r$r%)r&)�__name__�
__module__�__qualname__r
rrr	r.r0r'r
r�staticmethodr9rrrrrs$rc@seZdZdd�Zdd�ZdS)�DevelopInstallercCstt|j��S)N)�reprr+Zegg_path)rrrrr'gszDevelopInstaller._get_rootcCs|jS)N)Zegg_link)rrrrr	jszDevelopInstaller._get_targetN)r<r=r>r'r	rrrrr@fsr@)rZ	distutilsr�	itertoolsZsetuptools.extern.six.movesr�chain�
from_iterabler8rr@rrrr�<module>s[site-packages/setuptools/__pycache__/config.cpython-36.pyc000064400000035217147511334640017630 0ustar003

��fVF�@s�ddlmZmZddlZddlZddlZddlmZddlm	Z	ddl
mZddlm
Z
mZddlmZmZddlmZdd
d�Zdd
�Zddd�ZGdd�de�ZGdd�de�ZGdd�de�ZdS)�)�absolute_import�unicode_literalsN)�defaultdict)�partial)�
import_module)�DistutilsOptionError�DistutilsFileError)�
LegacyVersion�parse)�string_typesFc	Cs�ddlm}m}tjj|�}tjj|�s4td|��tj�}tj	tjj
|��zJ|�}|rb|j�ng}||krx|j|�|j
||d�t||j|d�}Wdtj	|�Xt|�S)a,Read given configuration file and returns options from it as a dict.

    :param str|unicode filepath: Path to configuration file
        to get options from.

    :param bool find_others: Whether to search for other configuration files
        which could be on in various places.

    :param bool ignore_option_errors: Whether to silently ignore
        options, values of which could not be resolved (e.g. due to exceptions
        in directives such as file:, attr:, etc.).
        If False exceptions are propagated as expected.

    :rtype: dict
    r)�Distribution�
_Distributionz%Configuration file %s does not exist.)�	filenames)�ignore_option_errorsN)Zsetuptools.distrr
�os�path�abspath�isfiler�getcwd�chdir�dirnameZfind_config_files�appendZparse_config_files�parse_configuration�command_options�configuration_to_dict)	�filepathZfind_othersrrr
Zcurrent_directoryZdistr�handlers�r�/usr/lib/python3.6/config.py�read_configurations$

rcCsltt�}x^|D]V}|j}|j}xD|jD]:}t|d|d�}|dkrNt||�}n|�}||||<q&WqW|S)z�Returns configuration data gathered by given handlers as a dict.

    :param list[ConfigHandler] handlers: Handlers list,
        usually from parse_configuration()

    :rtype: dict
    zget_%sN)r�dict�section_prefix�
target_obj�set_options�getattr)rZconfig_dictZhandlerZ	obj_aliasr"Zoption�getter�valuerrrr=s
rcCs6t|||�}|j�t|j|||j�}|j�||fS)a�Performs additional parsing of configuration options
    for a distribution.

    Returns a list of used option handlers.

    :param Distribution distribution:
    :param dict command_options:
    :param bool ignore_option_errors: Whether to silently ignore
        options, values of which could not be resolved (e.g. due to exceptions
        in directives such as file:, attr:, etc.).
        If False exceptions are propagated as expected.
    :rtype: list

[The remainder of this entry is the compiled bytecode of setuptools/config.py (config.cpython-36.pyc) and cannot be reproduced as text. The docstrings preserved in the dump indicate its contents:

* ConfigHandler -- "Handles metadata supplied in configuration files", with helpers _parse_list (value split either by a separator, comma by default, or by lines), _parse_dict ("key = value" lines), _parse_bool (1/true/yes), _parse_file (the `file:` directive, sandboxed so it cannot reach anything outside the directory with setup.py), _parse_attr (the `attr:` directive, e.g. attr: package.module.attr), _get_parser_compound, _parse_section_to_dict, parse_section and parse.
* ConfigMetadataHandler -- maps metadata items (keywords, classifiers, license, description, platforms, long_description, version, project_urls, ...) to parsers; _parse_version checks that a version loaded from a file or attribute complies with PEP 440.
* ConfigOptionsHandler -- parses the options sections: packages (including the `find:` directive backed by the packages.find section), entry_points, package_data, exclude_package_data, extras_require, install_requires, setup_requires, tests_require, py_modules and related options.]
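
As a rough illustration of what those parsing helpers do (a minimal sketch based only on the docstrings above, not the code contained in the bytecode):

# Hypothetical re-implementation of the list/dict/bool parsing behaviour
# described by the ConfigHandler docstrings.
def parse_list(value, separator=','):
    # Split on lines when the value spans several lines, otherwise on the separator.
    chunks = value.splitlines() if '\n' in value else value.split(separator)
    return [chunk.strip() for chunk in chunks if chunk.strip()]

def parse_dict(value):
    result = {}
    for line in parse_list(value):
        key, sep, val = line.partition('=')
        if not sep:
            raise ValueError('Unable to parse option value to dict: %s' % line)
        result[key.strip()] = val.strip()
    return result

def parse_bool(value):
    return value.lower() in ('1', 'true', 'yes')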
site-packages/setuptools/__pycache__/monkey.cpython-36.opt-1.pyc

[Compiled bytecode of setuptools/monkey.py ("Monkey patching of distutils."); not reproducible as text. Preserved docstrings: _get_mro returns the base classes of a class sorted by the MRO, working around a Jython issue where inspect.getmro does not return all base classes; get_unpatched and get_unpatched_class protect against re-patching distutils if it is reloaded; patch_all patches the distutils Command, Distribution and Extension classes, write_pkg_file (so Requires-Python/Requires-External are written) and the default upload repository; patch_func patches func_name in target_mod with a replacement, noting that the "original must be resolved by name to avoid patching an already patched function"; patch_for_msvc_specialized_compiler patches distutils to use standalone Microsoft Visual C++ compilers.]
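
The patch_func/get_unpatched docstrings describe a name-based patching pattern; a minimal sketch of that idea (hypothetical helper names, not the setuptools implementation):

# Resolve the original by name, stash it on the replacement, then swap it in.
def patch_func(replacement, target_mod, func_name):
    original = getattr(target_mod, func_name)
    # Remember the unpatched function so it can be looked up later.
    vars(replacement).setdefault('unpatched', original)
    setattr(target_mod, func_name, replacement)

def get_unpatched_function(candidate):
    # Fall back to the candidate itself if it was never patched.
    return getattr(candidate, 'unpatched', candidate)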
site-packages/setuptools/__pycache__/extension.cpython-36.pyc

[Compiled bytecode of setuptools/extension.py; not reproducible as text. Preserved docstrings: _have_cython returns True if Cython can be imported; Extension is an "Extension that uses '.c' files in place of '.pyx' files" -- its _convert_pyx_sources_to_lang replaces .pyx sources with the target language extension (.cpp for C++, .c otherwise) so that pre-converted sources are preferred when Cython is unavailable; Library is "Just like a regular Extension, but built as a library instead".]
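
A short sketch of the source-substitution behaviour that docstring describes (an approximation for illustration, not the bytecode's exact logic):

import functools
import re

def convert_pyx_sources_to_lang(sources, language=None, have_cython=False):
    # Keep the .pyx sources when Cython is available; otherwise fall back to
    # the pre-converted .c/.cpp files shipped alongside them.
    if have_cython:
        return list(sources)
    target_ext = '.cpp' if (language or '').lower() == 'c++' else '.c'
    swap = functools.partial(re.sub, '.pyx$', target_ext)
    return [swap(source) for source in sources]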
��fŜ�@s�dZddlZddlZddlZddlZddlZddlZddlZddlZddl	Z	ddl
mZddlm
Z
ddlmZmZmZmZddlZddlmZmZmZmZmZmZmZmZmZmZmZm Z m!Z!ddlm"Z"ddl#m$Z$dd	l%m&Z&dd
l'm(Z(ddl)m*Z*ddl+m,Z,dd
l-m.Z.ej/d�Z0ej/dej1�Z2ej/d�Z3ej/dej1�j4Z5dj6�Z7ddddgZ8dZ9dZ:e:j;ej<dd�ed�Z=dd�Z>dd�Z?dd�Z@dEd d�ZAdFd!d"�ZBdGd#d$�ZCdedfd%d�ZDdHd&d'�ZEd(d)�ZFej/d*ej1�ZGeFd+d,��ZHGd-d.�d.eI�ZJGd/d0�d0eJ�ZKGd1d�de�ZLej/d2�jMZNd3d4�ZOd5d6�ZPdId7d8�ZQd9d:�ZRGd;d<�d<eI�ZSGd=d>�d>ejT�ZUejVjWfd?d@�ZXeQe9�eX�ZXdAdB�ZYdCdD�ZZdS)Jz#PyPI and direct package downloading�N)�wraps)�six)�urllib�http_client�configparser�map)
�
CHECKOUT_DIST�Distribution�BINARY_DIST�normalize_path�SOURCE_DIST�Environment�find_distributions�	safe_name�safe_version�to_filename�Requirement�DEVELOP_DIST�EGG_DIST)�ssl_support)�log)�DistutilsError)�	translate)�get_all_headers)�unescape)�Wheelz^egg=([-A-Za-z0-9_.+!]+)$zhref\s*=\s*['"]?([^'"> ]+)z�<a href="([^"#]+)">([^<]+)</a>
\s+\(<a (?:title="MD5 hash"
\s+)href="[^?]+\?:action=show_md5&amp;digest=([0-9a-f]{32})">md5</a>\)z([-+.a-z0-9]{2,}):z.tar.gz .tar.bz2 .tar .zip .tgz�PackageIndex�distros_for_url�parse_bdist_wininst�interpret_distro_name�z<setuptools/{setuptools.__version__} Python-urllib/{py_major}�)Zpy_major�
setuptoolscCs2y
tj|�Stk
r,td|f��YnXdS)Nz1Not a URL, existing file, or requirement spec: %r)r�parse�
ValueErrorr)�spec�r&�#/usr/lib/python3.6/package_index.py�parse_requirement_arg3s

r(cCs�|j�}d\}}}|jd�r�|jd�r8|dd�}d}nn|jdd�rb|dd�}|dd�}d}nD|jd
�r~|dd�}d}n(|jd
d�r�|dd�}|dd�}d}|||fS)z=Return (base,pyversion) or (None,None) for possible .exe nameNz.exez
.win32.exe�
Zwin32z	.win32-py���z.win-amd64.exe�z	win-amd64z
.win-amd64-py�)NNNi����i�i�������i�i�i��i����r/i��)�lower�endswith�
startswith)�namer0�base�py_verZplatr&r&r'r<s$



c	Csxtjj|�}|\}}}}}}tjj|jd�d�}|dkrX|dkrXtjj|jd�d�}d|krp|jdd�\}}||fS)	N�/�zsourceforge.net�download��#������)rr#�urlparse�unquote�split)	�url�parts�scheme�server�pathZ
parameters�query�fragmentr4r&r&r'�egg_info_for_urlTsrGccsdt|�\}}xt|||�D]
}|VqW|r`tj|�}|r`x$t||jd�|td�D]
}|VqRWdS)zEYield egg or source distribution objects that might be found at a URLr7)�
precedenceN)rG�distros_for_location�EGG_FRAGMENT�matchr�groupr)r@�metadatar4rF�distrKr&r&r'r_s

cCs�|jd�r|dd
�}|jd�r8d|kr8tj|||�gS|jd�rxd|krxt|�}|j�s^gSt||j|jtdd�gS|jd	�r�t|�\}}}|dk	r�t	||||t
|�Sx4tD],}|j|�r�|dt|��}t	|||�Sq�WgS)z:Yield egg or source distribution objects based on basenamez.egg.zipNr,z.egg�-z.whlr7)�location�project_name�versionrHz.exer/)
r1r	Z
from_locationrZ
is_compatiblerQrRrrrr
�
EXTENSIONS�len)rP�basenamerMZwheelZwin_baser5�platformZextr&r&r'rIms.



rIcCstt|�tjj|�|�S)zEYield possible egg or source distribution objects based on a filename)rIr�osrDrU)�filenamerMr&r&r'�distros_for_filename�srYc
cs�|jd�}|r.tdd�|dd�D��r.dSxNtdt|�d�D]8}t||dj|d|��dj||d��|||d�VqBWdS)z�Generate alternative interpretations of a source distro name

    Note: if `location` is a filesystem filename, you should call
    ``pkg_resources.normalize_path()`` on it before passing it to this
    routine!
    rOcss|]}tjd|�VqdS)z	py\d\.\d$N)�rerK)�.0�pr&r&r'�	<genexpr>�sz(interpret_distro_name.<locals>.<genexpr>r9Nr7)�
py_versionrHrV)r?�any�rangerTr	�join)rPrUrMr^rHrVrAr\r&r&r'r�s
 $ccsnt�}|j}|dkr>xTtjj|j|�D]}||�|Vq&Wn,x*|D]"}||�}||krD||�|VqDWdS)zHList unique elements, preserving order. Remember all elements ever seen.N)�set�addrZmoves�filterfalse�__contains__)�iterable�key�seenZseen_add�element�kr&r&r'�unique_everseen�s
rkcst���fdd��}|S)zs
    Wrap a function returning an iterable such that the resulting iterable
    only ever yields unique items.
    cst�||��S)N)rk)�args�kwargs)�funcr&r'�wrapper�szunique_values.<locals>.wrapper)r)rnror&)rnr'�
unique_values�srpz3<([^>]*\srel\s{0,10}=\s{0,10}['"]?([^'" >]+)[^>]*)>ccs�xvtj|�D]h}|j�\}}tttj|j�jd���}d|ksFd|krx,t	j|�D]}t
jj|t
|jd���VqRWqWxHdD]@}|j|�}|d	kr~t	j||�}|r~t
jj|t
|jd���Vq~WdS)
zEFind rel="homepage" and rel="download" links in `page`, yielding URLs�,Zhomepager8r7�
<th>Home Page�<th>Download URLN)rrrsr;)�REL�finditer�groupsrbr�str�stripr0r?�HREFrr#�urljoin�
htmldecoderL�find�search)r@�pagerK�tagZrelZrels�posr&r&r'�find_external_links�s"

r�c@s(eZdZdZdd�Zdd�Zdd�ZdS)	�ContentCheckerzP
    A null content checker that defines the interface for checking content
    cCsdS)z3
        Feed a block of data to the hash.
        Nr&)�self�blockr&r&r'�feed�szContentChecker.feedcCsdS)zC
        Check the hash. Return False if validation fails.
        Tr&)r�r&r&r'�is_valid�szContentChecker.is_validcCsdS)zu
        Call reporter with information about the checker (hash name)
        substituted into the template.
        Nr&)r��reporter�templater&r&r'�reportszContentChecker.reportN)�__name__�
__module__�__qualname__�__doc__r�r�r�r&r&r&r'r��sr�c@sBeZdZejd�Zdd�Zedd��Zdd�Z	dd	�Z
d
d�ZdS)
�HashCheckerzK(?P<hash_name>sha1|sha224|sha384|sha256|sha512|md5)=(?P<expected>[a-f0-9]+)cCs||_tj|�|_||_dS)N)�	hash_name�hashlib�new�hash�expected)r�r�r�r&r&r'�__init__szHashChecker.__init__cCs>tjj|�d}|st�S|jj|�}|s0t�S|f|j��S)z5Construct a (possibly null) ContentChecker from a URLr7r;)rr#r=r��patternr}�	groupdict)�clsr@rFrKr&r&r'�from_urlszHashChecker.from_urlcCs|jj|�dS)N)r��update)r�r�r&r&r'r�szHashChecker.feedcCs|jj�|jkS)N)r�Z	hexdigestr�)r�r&r&r'r�!szHashChecker.is_validcCs||j}||�S)N)r�)r�r�r��msgr&r&r'r�$s
zHashChecker.reportN)r�r�r�rZ�compiler�r��classmethodr�r�r�r�r&r&r&r'r�sr�cs<eZdZdZdKdd�ZdLd	d
�ZdMdd�ZdNd
d�Zdd�Zdd�Z	dd�Z
dd�ZdOdd�Zdd�Z
dP�fdd�	Zdd�Zdd �Zd!d"�Zd#d$�Zd%d&�ZdQd'd(�ZdRd)d*�Zd+d,�Zd-Zd.d/�Zd0d1�ZdSd2d3�Zd4d5�Zd6d7�Zd8d9�Zd:d;�Zd<d=�Ze dTd>d?��Z!d@dA�Z"dBdC�Z#dDdE�Z$dFdG�Z%dHdI�Z&�Z'S)Urz;A distribution index that scans web pages for download URLs�https://pypi.org/simple/�*NTcOs�tj|f|�|�|dd|jd��|_i|_i|_i|_tjdj	t
t|���j|_
g|_|ortjor|prtj�}|r�tj|�|_n
tjj|_dS)Nr6�|)r
r�r1�	index_url�scanned_urls�fetched_urls�
package_pagesrZr�rarrrK�allows�to_scanrZis_availableZfind_ca_bundleZ
opener_for�openerr�request�urlopen)r�r�ZhostsZ	ca_bundleZ
verify_sslrl�kwZuse_sslr&r&r'r�,szPackageIndex.__init__FcCs�||jkr|rdSd|j|<t|�s4|j|�dStt|��}|r^|j|�sRdS|jd|�|sr|sr||jkr�tt|j	|��dS|j|�s�d|j|<dS|j
d|�d|j|<d}|j|||�}|dkr�dSd|j|j<d|j
jdd�j�k�r|j�dS|j}|j�}t|t��sRt|tjj��r4d	}n|j
jd
��pDd	}|j|d�}|j�x6tj|�D](}	tjj|t|	jd���}
|j|
��qfW|j |j!��r�t"|d
d�dk�r�|j#||�}dS)z<Evaluate a URL as a possible download, and maybe retrieve itNTzFound link: %sz
Reading %sz<Download error on %s: %%s -- Some packages may not be found!�htmlzcontent-type�zlatin-1�charset�ignorer7�codei�)$r��
URL_SCHEME�process_filename�listr�url_ok�debugr�rrc�info�open_urlr@�headers�getr0�close�read�
isinstancerwr�error�	HTTPErrorZ	get_param�decoderyrur#rzr{rL�process_urlr2r��getattr�
process_index)r�r@Zretrieve�dists�tmpl�fr4r~r�rK�linkr&r&r'r�AsP





 zPackageIndex.process_urlcCs�tjj|�s|jd|�dStjj|�rd|rdtjj|�}x(tj|�D]}|jtjj||�d�qFWt	|�}|r�|j
d|�tt|j
|��dS)Nz
Not found: %sTz	Found: %s)rWrD�exists�warn�isdir�realpath�listdirr�rarYr�r�rrc)r��fn�nestedrD�itemr�r&r&r'r�tszPackageIndex.process_filenamecCsbt|�}|o|jd�j�dk}|s8|jtjj|�d�r<dSd}|rRt||��n|j||�dS)Nr7�fileTzN
Note: Bypassing %s (disallowed host; see http://bit.ly/2hrImnY for details).
)	r�rLr0r�rr#r=rr�)r�r@Zfatal�s�is_filer�r&r&r'r��szPackageIndex.url_okcCs2ttjj|�}dd�|D�}ttj|j|��dS)Ncss0|](}tj|�D]}|jd�r||fVqqdS)z	.egg-linkN)rWr�r1)r[rD�entryr&r&r'r]�sz.PackageIndex.scan_egg_links.<locals>.<genexpr>)�filterrWrDr�r��	itertools�starmap�
scan_egg_link)r�Zsearch_path�dirsZ	egg_linksr&r&r'�scan_egg_links�szPackageIndex.scan_egg_linksc
Cs�ttjj||���}ttdttj|���}WdQRXt	|�dkrDdS|\}}x>t
tjj||��D](}tjj|f|��|_t|_
|j|�q`WdS)Nr9)�openrWrDrar�r�rrwrxrTrrPrrHrc)r�rDr�Z	raw_lines�linesZegg_pathZ
setup_pathrNr&r&r'r��s zPackageIndex.scan_egg_linkc

s��fdd�}xHtj|�D]:}y |tjj|t|jd����Wqtk
rPYqXqW||�\}}|r�xXt||�D]J}t	|�\}}	|j
d�r�|	r�|r�|d||f7}n
�j|��j|�qrWt
jdd�|�SdSd	S)
z#Process the contents of a PyPI pagecs�|j�j�r�tttjj|t�j�d�jd���}t|�dkr�d|dkr�t	|d�}t
|d�}d�jj|j
�i�|<t|�t|�fSdS)Nr6r9r:r7rT)NN)r2r�r�rrr#r>rTr?rrr��
setdefaultr0r)r�rA�pkg�ver)r�r&r'�scan�s"z(PackageIndex.process_index.<locals>.scanr7z.pyz
#egg=%s-%scSsd|jddd�S)Nz<a href="%s#md5=%s">%s</a>r7r!r9)rL)�mr&r&r'�<lambda>�sz,PackageIndex.process_index.<locals>.<lambda>r�N)ryrurr#rzr{rLr$r�rGr1�need_version_info�scan_url�PYPI_MD5�sub)
r�r@r~r�rKr�r��new_urlr4�fragr&)r�r'r��s$ 

zPackageIndex.process_indexcCs|jd|�dS)NzPPage at %s links to .py file(s) without version info; an index scan is required.)�scan_all)r�r@r&r&r'r��szPackageIndex.need_version_infocGs:|j|jkr*|r |j|f|��|jd�|j|j�dS)Nz6Scanning index of all packages (this may take a while))r�r�r�r�r�)r�r�rlr&r&r'r��szPackageIndex.scan_allcCs~|j|j|jd�|jj|j�s:|j|j|jd�|jj|j�sR|j|�x&t|jj|jf��D]}|j|�qhWdS)Nr6)	r�r��unsafe_namer�r�rgrQ�not_found_in_indexr�)r��requirementr@r&r&r'�
find_packages�s
zPackageIndex.find_packagescsR|j�|j|�x,||jD]}||kr.|S|jd||�qWtt|�j||�S)Nz%s does not match %s)�prescanr�rgr��superr�obtain)r�r�Z	installerrN)�	__class__r&r'r��s
zPackageIndex.obtaincCsL|j|jd|�|j�sH|j�tj|�td|jjtj	j
|�f��dS)z-
        checker is a ContentChecker
        zValidating %%s checksum for %sz7%s validation failed for %s; possible download problem?N)r�r�r�r�rW�unlinkrr�r3rDrU)r��checkerrX�tfpr&r&r'�
check_hash�s

zPackageIndex.check_hashcCsTxN|D]F}|jdks4t|�s4|jd�s4tt|��r@|j|�q|jj|�qWdS)z;Add `urls` to the list that will be prescanned for searchesNzfile:)r�r�r2r�rr��append)r�Zurlsr@r&r&r'�add_find_links
s



zPackageIndex.add_find_linkscCs"|jrtt|j|j��d|_dS)z7Scan urls scheduled for prescanning (e.g. --find-links)N)r�r�rr�)r�r&r&r'r�szPackageIndex.prescancCs<||jr|jd}}n|jd}}|||j�|j�dS)Nz#Couldn't retrieve index page for %rz3Couldn't find index page for %r (maybe misspelled?))rgr�r�r�r�)r�r��methr�r&r&r'r�"s
zPackageIndex.not_found_in_indexcCs~t|t�sjt|�}|rR|j|jd�||�}t|�\}}|jd�rN|j|||�}|Stj	j
|�rb|St|�}t|j
||�dd�S)aLocate and/or download `spec` to `tmpdir`, returning a local path

        `spec` may be a ``Requirement`` object, or a string containing a URL,
        an existing local filename, or a project/version requirement spec
        (i.e. the string form of a ``Requirement`` object).  If it is the URL
        of a .py file with an unambiguous ``#egg=name-version`` tag (i.e., one
        that escapes ``-`` as ``_`` throughout), a trivial ``setup.py`` is
        automatically created alongside the downloaded file.

        If `spec` is a ``Requirement`` object or a string containing a
        project/version requirement spec, this method returns the location of
        a matching distribution (possibly after downloading it to `tmpdir`).
        If `spec` is a locally existing file or directory name, it is simply
        returned unchanged.  If `spec` is a URL, it is downloaded to a subpath
        of `tmpdir`, and the local filename is returned.  Various errors may be
        raised if a problem occurs during downloading.
        r7z.pyrPN)r�rr��
_download_urlrLrGr1�	gen_setuprWrDr�r(r��fetch_distribution)r�r%�tmpdirrB�foundr4rFr&r&r'r8,s

zPackageIndex.downloadc	s��jd|�i�d}d
�����fdd�	}|rH�j��j|�||�}|r`|dk	r`|||�}|dkr��jdk	rz�j�||�}|dkr�|r��j|�||�}|dkrˆjd�r�dp�d|�n�jd|�|j|jd	�SdS)a|Obtain a distribution suitable for fulfilling `requirement`

        `requirement` must be a ``pkg_resources.Requirement`` instance.
        If necessary, or if the `force_scan` flag is set, the requirement is
        searched for in the (online) package index as well as the locally
        installed packages.  If a distribution matching `requirement` is found,
        the returned distribution's ``location`` is the value you would have
        gotten from calling the ``download()`` method with the matching
        distribution's URL or filename.  If no matching distribution is found,
        ``None`` is returned.

        If the `source` flag is set, only source distributions and source
        checkout links will be considered.  Unless the `develop_ok` flag is
        set, development and system eggs (i.e., those using the ``.egg-info``
        format) will be ignored.
        zSearching for %sNcs�|dkr�}x�||jD]t}|jtkrJ�rJ|�kr�jd|�d�|<q||ko`|jtkp`�}|r�j|j��}||_tj	j
|j�r|SqWdS)Nz&Skipping development or system egg: %sr7)rgrHrr�rr8rP�download_locationrWrDr�)Zreq�envrNZtest�loc)�
develop_okr��skipped�sourcer�r&r'r|fs z-PackageIndex.fetch_distribution.<locals>.findz:No local packages or working download links found for %s%sza source distribution of r�zBest match: %s)rP)N)r�r�r�r�r��cloner�)	r�r�r��
force_scanr�r�Zlocal_indexrNr|r&)r�r�r�r�r�r'r�Ns0




zPackageIndex.fetch_distributioncCs"|j||||�}|dk	r|jSdS)a3Obtain a file suitable for fulfilling `requirement`

        DEPRECATED; use the ``fetch_distribution()`` method now instead.  For
        backward compatibility, this routine is identical but returns the
        ``location`` of the downloaded distribution instead of a distribution
        object.
        N)r�rP)r�r�r�rr�rNr&r&r'�fetch�szPackageIndex.fetchc

Cs�tj|�}|r*dd�t||jd�d�D�p,g}t|�dkr�tjj|�}tjj|�|kr�tjj	||�}ddl
m}|||�s�tj
||�|}ttjj	|d�d��2}	|	jd|dj|djtjj|�df�WdQRX|S|r�td	||f��ntd
��dS)NcSsg|]}|jr|�qSr&)rR)r[�dr&r&r'�
<listcomp>�sz*PackageIndex.gen_setup.<locals>.<listcomp>r7r)�samefilezsetup.py�wzIfrom setuptools import setup
setup(name=%r, version=%r, py_modules=[%r])
z�Can't unambiguously interpret project/version identifier %r; any dashes in the name or version should be escaped using underscores. %rzpCan't process plain .py files without an '#egg=name-version' suffix to enable automatic setup script generation.)rJrKrrLrTrWrDrU�dirnameraZsetuptools.command.easy_installr�shutilZcopy2r��writerQrR�splitextr)
r�rXrFr�rKr�rU�dstrr�r&r&r'r��s2

 zPackageIndex.gen_setupi cCs|jd|�d}z�tj|�}|j|�}t|tjj�rJtd||j	|j
f��|j�}d}|j}d}d|kr�t|d�}	t
tt|	��}|j|||||�t|d��Z}
xD|j|�}|r�|j|�|
j|�|d7}|j|||||�q�Pq�W|j|||
�WdQRX|S|�r|j�XdS)	NzDownloading %szCan't download %s: %s %srr7zcontent-lengthzContent-Length�wbr;)r�r�r�r�r�rr�r�rr�r��dl_blocksizer�maxr�int�
reporthookr�r�r�r	r�r�)r�r@rX�fpr�r��blocknumZbs�sizeZsizesr�r�r&r&r'�_download_to�s:





zPackageIndex._download_tocCsdS)Nr&)r�r@rXrZblksizerr&r&r'r�szPackageIndex.reporthookcCs�|jd�rt|�Syt||j�Sttjfk
r�}z>djdd�|jD��}|r^|j	||�nt
d||f��WYdd}~X�ntjj
k
r�}z|Sd}~Xn�tjjk
r�}z,|r�|j	||j�nt
d||jf��WYdd}~Xn�tjk
�r8}z.|�r|j	||j�nt
d||jf��WYdd}~XnPtjtjfk
�r�}z*|�rf|j	||�nt
d||f��WYdd}~XnXdS)Nzfile:� cSsg|]}t|��qSr&)rw)r[�argr&r&r'r�sz)PackageIndex.open_url.<locals>.<listcomp>z%s %szDownload error for %s: %sz;%s returned a bad status line. The server might be down, %s)r2�
local_open�open_with_authr�r$r�
InvalidURLrarlr�rrr�r�ZURLError�reasonZ
BadStatusLine�lineZ
HTTPException�socket)r�r@Zwarning�vr�r&r&r'r��s6
"zPackageIndex.open_urlcCs�t|�\}}|r4x&d|kr0|jdd�jdd�}qWnd}|jd�rN|dd�}tjj||�}|dksn|jd	�rz|j||�S|d
ks�|jd�r�|j||�S|jd�r�|j	||�S|d
kr�t
jjt
j
j|�d�S|j|d�|j||�SdS)Nz..�.�\�_Z__downloaded__z.egg.zipr,�svnzsvn+�gitzgit+zhg+r�r9Tr/)rG�replacer1rWrDrar2�
_download_svn�
_download_git�_download_hgrr��url2pathnamer#r=r��_attempt_download)r�rBr@r�r3rFrXr&r&r'r�s$


zPackageIndex._download_urlcCs|j|d�dS)NT)r�)r�r@r&r&r'r�9szPackageIndex.scan_urlcCs6|j||�}d|jdd�j�kr.|j|||�S|SdS)Nr�zcontent-typer�)rr�r0�_download_html)r�r@rXr�r&r&r'r(<szPackageIndex._attempt_downloadcCslt|�}x@|D]8}|j�rtjd|�rD|j�tj|�|j||�SPqW|j�tj|�td|��dS)Nz <title>([^- ]+ - )?Revision \d+:zUnexpected HTML page found at )	r�rxrZr}r�rWr�r$r)r�r@r�rXr�rr&r&r'r)Cs


zPackageIndex._download_htmlcCs|jdd�d}g}|j�jd�r�d|kr�tjj|�\}}}}}}	|r�|jd�r�d|dd�kr�|dd�jdd�\}}tjj|�\}
}|
r�d	|
kr�|
jd	d�\}}
d
|d|
g}n
d
|
g}|}||||||	f}tjj|�}|jd||�t	j
d
dg|d||g�|S)Nr:r7rzsvn:�@z//r6r9�:z--username=z--password=z'Doing subversion checkout from %s to %sr!�checkoutz-q)r?r0r2rr#r=�	splituser�
urlunparser��
subprocess�
check_call)r�r@rXZcredsrB�netlocrDr\�qr��auth�host�userZpwrAr&r&r'r$Rs$ 
zPackageIndex._download_svncCsptjj|�\}}}}}|jdd�d}|jdd�d}d}d|krR|jdd�\}}tjj||||df�}||fS)N�+r7r:rr*r�r;)rr#Zurlsplitr?�rsplitZ
urlunsplit)r@�
pop_prefixrBr1rDrEr��revr&r&r'�_vcs_split_rev_from_urlgsz$PackageIndex._vcs_split_rev_from_urlcCsr|jdd�d}|j|dd�\}}|jd||�tjddd	||g�|dk	rn|jd
|�tjdd|dd	|g�|S)
Nr:r7rT)r8zDoing git clone from %s to %sr"rz--quietzChecking out %sz-Cr,)r?r:r�r/r0)r�r@rXr9r&r&r'r%yszPackageIndex._download_gitc	Csv|jdd�d}|j|dd�\}}|jd||�tjddd	||g�|dk	rr|jd
|�tjdd|dd
d|dg�|S)Nr:r7rT)r8zDoing hg clone from %s to %sZhgrz--quietzUpdating to %sz--cwdZupz-Cz-rz-q)r?r:r�r/r0)r�r@rXr9r&r&r'r&�szPackageIndex._download_hgcGstj|f|��dS)N)rr�)r�r�rlr&r&r'r��szPackageIndex.debugcGstj|f|��dS)N)rr�)r�r�rlr&r&r'r��szPackageIndex.infocGstj|f|��dS)N)rr�)r�r�rlr&r&r'r��szPackageIndex.warn�r�)r�r;NT)F)F)F)N)N)FFFN)FF)N)F)(r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r8r�rr�r
rrr�r�r�r(r)r$�staticmethodr:r%r&r�r�r��
__classcell__r&r&)r�r'r)sL

3



+
		
#
J

)$
#

z!&(#(\d+|x[\da-fA-F]+)|[\w.:-]+);?cCs|jd�}t|�S)Nr7)rLr)rKZwhatr&r&r'�
decode_entity�s
r>cCs
tt|�S)z'Decode HTML entities in the given text.)�
entity_subr>)�textr&r&r'r{�sr{cs�fdd�}|S)Ncs��fdd�}|S)Ncs.tj�}tj��z
�||�Stj|�XdS)N)rZgetdefaulttimeoutZsetdefaulttimeout)rlrmZold_timeout)rn�timeoutr&r'�_socket_timeout�s


z@socket_timeout.<locals>._socket_timeout.<locals>._socket_timeoutr&)rnrB)rA)rnr'rB�sz'socket_timeout.<locals>._socket_timeoutr&)rArBr&)rAr'�socket_timeout�srCcCs2tjj|�}|j�}tj|�}|j�}|jdd�S)aq
    A function compatible with Python 2.3-3.3 that will encode
    auth from a URL suitable for an HTTP header.
    >>> str(_encode_auth('username%3Apassword'))
    'dXNlcm5hbWU6cGFzc3dvcmQ='

    Long auth strings should not cause a newline to be inserted.
    >>> long_auth = 'username:' + 'password'*10
    >>> chr(10) in str(_encode_auth(long_auth))
    False
    �
r�)rr#r>�encode�base64Zencodestringr�r#)r3Zauth_sZ
auth_bytesZ
encoded_bytesZencodedr&r&r'�_encode_auth�s

rGc@s(eZdZdZdd�Zdd�Zdd�ZdS)	�
Credentialz:
    A username/password pair. Use like a namedtuple.
    cCs||_||_dS)N)�username�password)r�rIrJr&r&r'r��szCredential.__init__ccs|jV|jVdS)N)rIrJ)r�r&r&r'�__iter__�szCredential.__iter__cCsdt|�S)Nz%(username)s:%(password)s)�vars)r�r&r&r'�__str__�szCredential.__str__N)r�r�r�r�r�rKrMr&r&r&r'rH�srHc@s0eZdZdd�Zedd��Zdd�Zdd�Zd	S)
�
PyPIConfigcCsPtjdddgd�}tjj||�tjjtjjd�d�}tjj	|�rL|j
|�dS)z%
        Load from ~/.pypirc
        rIrJ�
repositoryr��~z.pypircN)�dict�fromkeysr�RawConfigParserr�rWrDra�
expanduserr�r�)r�ZdefaultsZrcr&r&r'r��s
zPyPIConfig.__init__cs&�fdd��j�D�}tt�j|��S)Ncs g|]}�j|d�j�r|�qS)rO)r�rx)r[�section)r�r&r'r�sz2PyPIConfig.creds_by_repository.<locals>.<listcomp>)ZsectionsrQr�_get_repo_cred)r�Zsections_with_repositoriesr&)r�r'�creds_by_repository�szPyPIConfig.creds_by_repositorycCs6|j|d�j�}|t|j|d�j�|j|d�j��fS)NrOrIrJ)r�rxrH)r�rUZrepor&r&r'rV�szPyPIConfig._get_repo_credcCs*x$|jj�D]\}}|j|�r|SqWdS)z�
        If the URL indicated appears to be a repository defined in this
        config, return the credential for that repository.
        N)rW�itemsr2)r�r@rO�credr&r&r'�find_credential�s
zPyPIConfig.find_credentialN)r�r�r�r��propertyrWrVrZr&r&r&r'rN�s	rNcCs:tjj|�\}}}}}}|jd�r,tjd��|d
krFtjj|�\}}	nd}|s~t�j|�}
|
r~t	|
�}|
j
|f}tjd|��|r�dt
|�}||	||||f}tjj|�}
tjj|
�}|jd|�ntjj|�}|jd	t�||�}|�r6tjj|j�\}}}}}}||k�r6||	k�r6||||||f}tjj|�|_|S)z4Open a urllib2 request, handling HTTP authenticationr+znonnumeric port: ''�http�httpsN�*Authenticating as %s for %s (from .pypirc)zBasic Z
Authorizationz
User-Agent)r\r])r^)rr#r=r1rrr-rNrZrwrIrr�rGr.r�ZRequestZ
add_header�
user_agentr@)r@r�rBr1rDZparamsrEr�r3r4rYr�rAr�r�r�s2Zh2Zpath2Zparam2Zquery2Zfrag2r&r&r'r	s6


rcCs|S)Nr&)r@r&r&r'�
fix_sf_url:sracCstjj|�\}}}}}}tjj|�}tjj|�r<tjj|�S|j	d�r�tjj
|�r�g}x�tj|�D]b}	tjj||	�}
|	dkr�t
|
d��}|j�}WdQRXPntjj
|
�r�|	d7}	|jdj|	d��qbWd}
|
j|dj|�d	�}d\}}n
d\}}}ddi}tj|�}tjj|||||�S)z7Read a local path, with special support for directoriesr6z
index.html�rNz<a href="{name}">{name}</a>)r3zB<html><head><title>{url}</title></head><body>{files}</body></html>rD)r@�files���OK��Path not found�	Not foundzcontent-typez	text/html)rdre)rfrgrh)rr#r=r�r'rWrD�isfiler�r1r�r�rar�r�r��formatr�StringIOr�r�)r@rBrCrDZparamrEr�rXrcr��filepathrZbodyr�Zstatus�messager�Zbody_streamr&r&r'r>s,


r)N)N)N)N)r )[r�r/�sysrWrZrrrFr�r��	functoolsrZsetuptools.externrZsetuptools.extern.six.movesrrrrr"Z
pkg_resourcesrr	r
rrr
rrrrrrrrZ	distutilsrZdistutils.errorsrZfnmatchrZsetuptools.py27compatrZsetuptools.py33compatrZsetuptools.wheelrr�rJ�Iryr�rKr�r?rS�__all__Z_SOCKET_TIMEOUTZ_tmplrjrRr_r(rrGrrIrYrrkrprtr��objectr�r�rr�r?r>r{rCrGrHrSrNr�r�rrarr&r&r&r'�<module>s|<
	

!
"

!z
&.site-packages/setuptools/__pycache__/glob.cpython-36.pyc000064400000007276147511334640017312 0ustar003

��fW�@s�dZddlZddlZddlZddlmZdddgZddd�Zdd	d�Zd
d�Z	dd
�Z
dd�Zdd�Zdd�Z
ejd�Zejd�Zdd�Zdd�Zdd�ZdS)z�
Filename globbing utility. Mostly a copy of `glob` from Python 3.5.

Changes include:
 * `yield from` and PEP3102 `*` removed.
 * `bytes` changed to `six.binary_type`.
 * Hidden files are not ignored.
�N)�binary_type�glob�iglob�escapeFcCstt||d��S)ayReturn a list of paths matching a pathname pattern.

    The pattern may contain simple shell-style wildcards a la
    fnmatch. However, unlike fnmatch, filenames starting with a
    dot are special cases that are not matched by '*' and '?'
    patterns.

    If recursive is true, the pattern '**' will match any files and
    zero or more directories and subdirectories.
    )�	recursive)�listr)�pathnamer�r	�/usr/lib/python3.6/glob.pyrscCs,t||�}|r(t|�r(t|�}|s(t�|S)a�Return an iterator which yields the paths matching a pathname pattern.

    The pattern may contain simple shell-style wildcards a la
    fnmatch. However, unlike fnmatch, filenames starting with a
    dot are special cases that are not matched by '*' and '?'
    patterns.

    If recursive is true, the pattern '**' will match any files and
    zero or more directories and subdirectories.
    )�_iglob�_isrecursive�next�AssertionError)rr�it�sr	r	r
r s


ccstjj|�\}}t|�sF|r0tjj|�rB|Vntjj|�rB|VdS|s�|rrt|�rrx4t||�D]
}|VqbWnxt||�D]
}|Vq~WdS||kr�t|�r�t	||�}n|g}t|�r�|r�t|�r�t}q�t}nt
}x0|D](}x"|||�D]}tjj||�Vq�Wq�WdS)N)�os�path�split�	has_magic�lexists�isdirr�glob2�glob1r�glob0�join)rr�dirname�basename�x�dirsZglob_in_dir�namer	r	r
r2s4

rcCsR|s"t|t�rtjjd�}ntj}ytj|�}Wntk
rDgSXtj||�S)N�ASCII)	�
isinstancerr�curdir�encode�listdir�OSError�fnmatch�filter)r�pattern�namesr	r	r
r]s
rcCs8|stjj|�r4|gSntjjtjj||��r4|gSgS)N)rrrrr)rrr	r	r
rjsrccs6t|�st�|dd�Vxt|�D]
}|Vq$WdS)Nr)rr�	_rlistdir)rr(rr	r	r
rzsrc
cs�|s"t|t�rttjd�}ntj}ytj|�}Wntjk
rFdSXxJ|D]B}|V|rjtjj||�n|}x t|�D]}tjj||�VqxWqNWdS)Nr )	r!rrr"r$�errorrrr*)rr)rr�yr	r	r
r*�s

r*z([*?[])s([*?[])cCs(t|t�rtj|�}n
tj|�}|dk	S)N)r!r�magic_check_bytes�search�magic_check)r�matchr	r	r
r�s

rcCst|t�r|dkS|dkSdS)Ns**z**)r!r)r(r	r	r
r�s
rcCs<tjj|�\}}t|t�r(tjd|�}ntjd|�}||S)z#Escape all special characters.
    s[\1]z[\1])rr�
splitdriver!rr-�subr/)rZdriver	r	r
r�s

)F)F)�__doc__r�rer&Zsetuptools.extern.sixr�__all__rrrrrrr*�compiler/r-rrrr	r	r	r
�<module>s"


+


site-packages/setuptools/__pycache__/namespaces.cpython-36.pyc000064400000007031147511334640020473 0ustar003

��f�@sRddlZddlmZddlZddlmZejjZGdd�d�Z	Gdd�de	�Z
dS)�N)�log)�mapc	@sTeZdZdZdd�Zdd�Zdd�ZdZdZdd�Z	dd�Z
dd�Zedd��Z
dS)�	Installerz
-nspkg.pthc	Cs�|j�}|sdStjj|j��\}}||j7}|jj|�tj	d|�t
|j|�}|jrdt
|�dSt|d��}|j|�WdQRXdS)Nz
Installing %sZwt)�_get_all_ns_packages�os�path�splitext�_get_target�	nspkg_extZoutputs�appendr�infor�_gen_nspkg_lineZdry_run�list�open�
writelines)�selfZnsp�filename�ext�lines�f�r� /usr/lib/python3.6/namespaces.py�install_namespacess
zInstaller.install_namespacescCsHtjj|j��\}}||j7}tjj|�s.dStjd|�tj|�dS)NzRemoving %s)	rrrr	r
�existsrr�remove)rrrrrr�uninstall_namespaces!s
zInstaller.uninstall_namespacescCs|jS)N)�target)rrrrr	)szInstaller._get_target�import sys, types, os�#has_mfs = sys.version_info > (3, 5)�$p = os.path.join(%(root)s, *%(pth)r)�4importlib = has_mfs and __import__('importlib.util')�-has_mfs and __import__('importlib.machinery')��m = has_mfs and sys.modules.setdefault(%(pkg)r, importlib.util.module_from_spec(importlib.machinery.PathFinder.find_spec(%(pkg)r, [os.path.dirname(p)])))�Cm = m or sys.modules.setdefault(%(pkg)r, types.ModuleType(%(pkg)r))�7mp = (m or []) and m.__dict__.setdefault('__path__',[])�(p not in mp) and mp.append(p)�4m and setattr(sys.modules[%(parent)r], %(child)r, m)cCsdS)Nz$sys._getframe(1).f_locals['sitedir']r)rrrr�	_get_rootCszInstaller._get_rootcCsVt|�}t|jd��}|j�}|j}|jd�\}}}|rB||j7}dj|�t�dS)N�.�;�
)	�str�tuple�splitr'�_nspkg_tmpl�
rpartition�_nspkg_tmpl_multi�join�locals)r�pkgZpth�rootZ
tmpl_lines�parent�sepZchildrrrr
Fs
zInstaller._gen_nspkg_linecCs |jjp
g}ttt|j|���S)z,Return sorted list of all package namespaces)ZdistributionZnamespace_packages�sorted�flattenr�
_pkg_names)rZpkgsrrrrQszInstaller._get_all_ns_packagesccs,|jd�}x|r&dj|�V|j�qWdS)z�
        Given a namespace package, yield the components of that
        package.

        >>> names = Installer._pkg_names('a.b.c')
        >>> set(names) == set(['a', 'a.b', 'a.b.c'])
        True
        r(N)r-r1�pop)r3�partsrrrr9Vs

zInstaller._pkg_namesN)	rrrr r!r"r#r$r%)r&)�__name__�
__module__�__qualname__r
rrr	r.r0r'r
r�staticmethodr9rrrrrs$rc@seZdZdd�Zdd�ZdS)�DevelopInstallercCstt|j��S)N)�reprr+Zegg_path)rrrrr'gszDevelopInstaller._get_rootcCs|jS)N)Zegg_link)rrrrr	jszDevelopInstaller._get_targetN)r<r=r>r'r	rrrrr@fsr@)rZ	distutilsr�	itertoolsZsetuptools.extern.six.movesr�chain�
from_iterabler8rr@rrrr�<module>s[site-packages/setuptools/__pycache__/msvc.cpython-36.pyc000064400000103247147511334640017332 0ustar003

��f���@sdZddlZddlZddlZddlZddlZddlmZddl	m
Z
ddlmZej
�dkrpddl	mZejZnGd	d
�d
�Ze�ZeejjfZyddlmZWnek
r�YnXdd
�Zd dd�Zdd�Zdd�Zd!dd�ZGdd�d�ZGdd�d�ZGdd�d�ZGdd�d�Z dS)"a@
Improved support for Microsoft Visual C++ compilers.

Known supported compilers:
--------------------------
Microsoft Visual C++ 9.0:
    Microsoft Visual C++ Compiler for Python 2.7 (x86, amd64)
    Microsoft Windows SDK 6.1 (x86, x64, ia64)
    Microsoft Windows SDK 7.0 (x86, x64, ia64)

Microsoft Visual C++ 10.0:
    Microsoft Windows SDK 7.1 (x86, x64, ia64)

Microsoft Visual C++ 14.0:
    Microsoft Visual C++ Build Tools 2015 (x86, x64, arm)
    Microsoft Visual Studio 2017 (x86, x64, arm, arm64)
    Microsoft Visual Studio Build Tools 2017 (x86, x64, arm, arm64)
�N)�
LegacyVersion)�filterfalse�)�
get_unpatched�Windows)�winregc@seZdZdZdZdZdZdS)rN)�__name__�
__module__�__qualname__�
HKEY_USERS�HKEY_CURRENT_USER�HKEY_LOCAL_MACHINE�HKEY_CLASSES_ROOT�rr�/usr/lib/python3.6/msvc.pyr(sr)�RegcCs�d}|d|f}ytj|d�}WnJtk
rjy|d|f}tj|d�}Wntk
rdd}YnXYnX|r�tjjjj|d�}tjj|�r�|Stt�|�S)a+
    Patched "distutils.msvc9compiler.find_vcvarsall" to use the standalone
    compiler build for Python (VCForPython). Fall back to original behavior
    when the standalone compiler is not available.

    Redirect the path of "vcvarsall.bat".

    Known supported compilers
    -------------------------
    Microsoft Visual C++ 9.0:
        Microsoft Visual C++ Compiler for Python 2.7 (x86, amd64)

    Parameters
    ----------
    version: float
        Required Microsoft Visual C++ version.

    Return
    ------
    vcvarsall.bat path: str
    z-Software\%sMicrosoft\DevDiv\VCForPython\%0.1f��
installdirzWow6432Node\Nz
vcvarsall.bat)	rZ	get_value�KeyError�os�path�join�isfiler�msvc9_find_vcvarsall)�versionZVC_BASE�key�
productdir�	vcvarsallrrrr?sr�x86cOs�ytt�}|||f|�|�Stjjk
r2Yntk
rDYnXyt||�j�Stjjk
r�}zt|||��WYdd}~XnXdS)a�
    Patched "distutils.msvc9compiler.query_vcvarsall" for support extra
    compilers.

    Set environment without use of "vcvarsall.bat".

    Known supported compilers
    -------------------------
    Microsoft Visual C++ 9.0:
        Microsoft Visual C++ Compiler for Python 2.7 (x86, amd64)
        Microsoft Windows SDK 6.1 (x86, x64, ia64)
        Microsoft Windows SDK 7.0 (x86, x64, ia64)

    Microsoft Visual C++ 10.0:
        Microsoft Windows SDK 7.1 (x86, x64, ia64)

    Parameters
    ----------
    ver: float
        Required Microsoft Visual C++ version.
    arch: str
        Target architecture.

    Return
    ------
    environment: dict
    N)	r�msvc9_query_vcvarsall�	distutils�errors�DistutilsPlatformError�
ValueError�EnvironmentInfo�
return_env�_augment_exception)�ver�arch�args�kwargsZorig�excrrrrjsrcCsnytt�|�Stjjk
r$YnXyt|dd�j�Stjjk
rh}zt|d��WYdd}~XnXdS)a'
    Patched "distutils._msvccompiler._get_vc_env" for support extra
    compilers.

    Set environment without use of "vcvarsall.bat".

    Known supported compilers
    -------------------------
    Microsoft Visual C++ 14.0:
        Microsoft Visual C++ Build Tools 2015 (x86, x64, arm)
        Microsoft Visual Studio 2017 (x86, x64, arm, arm64)
        Microsoft Visual Studio Build Tools 2017 (x86, x64, arm, arm64)

    Parameters
    ----------
    plat_spec: str
        Target architecture.

    Return
    ------
    environment: dict
    g,@)�
vc_min_verN)r�msvc14_get_vc_envr r!r"r$r%r&)Z	plat_specr+rrrr-�s
r-cOsBdtjkr4ddl}t|j�td�kr4|jjj||�Stt	�||�S)z�
    Patched "distutils._msvccompiler.gen_lib_options" for fix
    compatibility between "numpy.distutils" and "distutils._msvccompiler"
    (for Numpy < 1.11.2)
    znumpy.distutilsrNz1.11.2)
�sys�modulesZnumpyr�__version__r Z	ccompilerZgen_lib_optionsr�msvc14_gen_lib_options)r)r*Znprrrr1�s

r1rcCs�|jd}d|j�ks"d|j�kr�d}|jft��}d}|dkrr|j�jd�dkrh|d	7}||d
7}q�|d7}n.|dkr�|d
7}||d7}n|dkr�|d7}|f|_dS)zl
    Add details to the exception message to help guide the user
    as to what action will resolve it.
    rrzvisual cz0Microsoft Visual C++ {version:0.1f} is required.z-www.microsoft.com/download/details.aspx?id=%dg"@Zia64rz* Get it with "Microsoft Windows SDK 7.0": iBz% Get it from http://aka.ms/vcpython27g$@z* Get it with "Microsoft Windows SDK 7.1": iW g,@zj Get it with "Microsoft Visual C++ Build Tools": http://landinghub.visualstudio.com/visual-cpp-build-toolsN���)r)�lower�format�locals�find)r+rr(�messageZtmplZ
msdownloadrrrr&�s 

r&c@sbeZdZdZejdd�j�Zdd�Ze	dd��Z
dd	�Zd
d�Zdd
d�Z
ddd�Zddd�ZdS)�PlatformInfoz�
    Current and Target Architectures informations.

    Parameters
    ----------
    arch: str
        Target architecture.
    Zprocessor_architecturercCs|j�jdd�|_dS)N�x64�amd64)r3�replacer()�selfr(rrr�__init__�szPlatformInfo.__init__cCs|j|jjd�dd�S)N�_r)r(r6)r<rrr�
target_cpu�szPlatformInfo.target_cpucCs
|jdkS)Nr)r?)r<rrr�
target_is_x86szPlatformInfo.target_is_x86cCs
|jdkS)Nr)�current_cpu)r<rrr�current_is_x86szPlatformInfo.current_is_x86FcCs.|jdkr|rdS|jdkr$|r$dSd|jS)uk
        Current platform specific subfolder.

        Parameters
        ----------
        hidex86: bool
            return '' and not '†' if architecture is x86.
        x64: bool
            return 'd' and not 'md64' if architecture is amd64.

        Return
        ------
        subfolder: str
            '	arget', or '' (see hidex86 parameter)
        rrr:z\x64z\%s)rA)r<�hidex86r9rrr�current_dir	szPlatformInfo.current_dircCs.|jdkr|rdS|jdkr$|r$dSd|jS)ar
        Target platform specific subfolder.

        Parameters
        ----------
        hidex86: bool
            return '' and not '\x86' if architecture is x86.
        x64: bool
            return '\x64' and not '\amd64' if architecture is amd64.

        Return
        ------
        subfolder: str
            '\current', or '' (see hidex86 parameter)
        rrr:z\x64z\%s)r?)r<rCr9rrr�
target_dirszPlatformInfo.target_dircCs0|rdn|j}|j|krdS|j�jdd|�S)ao
        Cross platform specific subfolder.

        Parameters
        ----------
        forcex86: bool
            Use 'x86' as current architecture even if current acritecture is
            not x86.

        Return
        ------
        subfolder: str
            '' if target architecture is current architecture,
            '\current_target' if not.
        rr�\z\%s_)rAr?rEr;)r<�forcex86Zcurrentrrr�	cross_dir5szPlatformInfo.cross_dirN)FF)FF)F)rr	r
�__doc__�safe_env�getr3rAr=�propertyr?r@rBrDrErHrrrrr8�s

r8c@s�eZdZdZejejejejfZ	dd�Z
edd��Zedd��Z
edd	��Zed
d��Zedd
��Zedd��Zedd��Zedd��Zedd��Zddd�Zdd�ZdS)�RegistryInfoz�
    Microsoft Visual Studio related registry informations.

    Parameters
    ----------
    platform_info: PlatformInfo
        "PlatformInfo" instance.
    cCs
||_dS)N)�pi)r<Z
platform_inforrrr=ZszRegistryInfo.__init__cCsdS)z<
        Microsoft Visual Studio root registry key.
        ZVisualStudior)r<rrr�visualstudio]szRegistryInfo.visualstudiocCstjj|jd�S)z;
        Microsoft Visual Studio SxS registry key.
        ZSxS)rrrrO)r<rrr�sxsdszRegistryInfo.sxscCstjj|jd�S)z8
        Microsoft Visual C++ VC7 registry key.
        ZVC7)rrrrP)r<rrr�vckszRegistryInfo.vccCstjj|jd�S)z;
        Microsoft Visual Studio VS7 registry key.
        ZVS7)rrrrP)r<rrr�vsrszRegistryInfo.vscCsdS)z?
        Microsoft Visual C++ for Python registry key.
        zDevDiv\VCForPythonr)r<rrr�
vc_for_pythonyszRegistryInfo.vc_for_pythoncCsdS)z-
        Microsoft SDK registry key.
        zMicrosoft SDKsr)r<rrr�
microsoft_sdk�szRegistryInfo.microsoft_sdkcCstjj|jd�S)z>
        Microsoft Windows/Platform SDK registry key.
        r)rrrrT)r<rrr�windows_sdk�szRegistryInfo.windows_sdkcCstjj|jd�S)z<
        Microsoft .NET Framework SDK registry key.
        ZNETFXSDK)rrrrT)r<rrr�	netfx_sdk�szRegistryInfo.netfx_sdkcCsdS)z<
        Microsoft Windows Kits Roots registry key.
        zWindows Kits\Installed Rootsr)r<rrr�windows_kits_roots�szRegistryInfo.windows_kits_rootsFcCs(|jj�s|rdnd}tjjd|d|�S)a

        Return key in Microsoft software registry.

        Parameters
        ----------
        key: str
            Registry key path where look.
        x86: str
            Force x86 software registry.

        Return
        ------
        str: value
        rZWow6432NodeZSoftwareZ	Microsoft)rNrBrrr)r<rrZnode64rrr�	microsoft�szRegistryInfo.microsoftcCs�tj}tj}|j}x�|jD]�}y||||�d|�}WnZttfk
r�|jj�s�y||||d�d|�}Wq�ttfk
r�wYq�XnwYnXytj	||�dSttfk
r�YqXqWdS)a
        Look for values in registry in Microsoft software registry.

        Parameters
        ----------
        key: str
            Registry key path where look.
        name: str
            Value name to find.

        Return
        ------
        str: value
        rTN)
r�KEY_READ�OpenKeyrX�HKEYS�OSError�IOErrorrNrBZQueryValueEx)r<r�namerYZopenkey�ms�hkey�bkeyrrr�lookup�s"

zRegistryInfo.lookupN)F)rr	r
rIrrrr
rr[r=rLrOrPrQrRrSrTrUrVrWrXrbrrrrrMLs"
rMc@s$eZdZdZejdd�Zejdd�Zejde�Zd3dd�Z	d	d
�Z
dd�Zed
d��Z
edd��Zdd�Zdd�Zedd��Zedd��Zedd��Zedd��Zedd��Zedd ��Zed!d"��Zed#d$��Zed%d&��Zed'd(��Zed)d*��Zed+d,��Zed-d.��Zd/d0�Zd4d1d2�ZdS)5�
SystemInfoz�
    Microsoft Windows and Visual Studio related system inormations.

    Parameters
    ----------
    registry_info: RegistryInfo
        "RegistryInfo" instance.
    vc_ver: float
        Required Microsoft Visual C++ version.
    �WinDirr�ProgramFileszProgramFiles(x86)NcCs"||_|jj|_|p|j�|_dS)N)�rirN�_find_latest_available_vc_ver�vc_ver)r<Z
registry_inforhrrrr=�s
zSystemInfo.__init__cCs6y|j�dStk
r0d}tjj|��YnXdS)Nrz%No Microsoft Visual C++ version foundr2)�find_available_vc_vers�
IndexErrorr r!r")r<�errrrrrg�s
z(SystemInfo._find_latest_available_vc_vercCs6|jj}|jj|jj|jjf}g}�x|jjD]�}x�|D]�}ytj|||�dtj�}Wnt	t
fk
rpw8YnXtj|�\}}}	xPt|�D]D}
y*t
tj||
�d�}||kr�|j|�Wq�tk
r�Yq�Xq�WxPt|�D]D}
y(t
tj||
��}||k�r|j|�Wq�tk
�r Yq�Xq�Wq8Wq.Wt|�S)zC
        Find all available Microsoft Visual C++ versions.
        r)rfrXrQrSrRr[rrZrYr\r]ZQueryInfoKey�range�floatZ	EnumValue�appendr#ZEnumKey�sorted)r<r_ZvckeysZvc_versr`rraZsubkeys�valuesr>�ir'rrrri�s2


z!SystemInfo.find_available_vc_verscCs6d|j}tjj|j|�}|jj|jjd|j�p4|S)z4
        Microsoft Visual Studio directory.
        zMicrosoft Visual Studio %0.1fz%0.1f)rhrrr�ProgramFilesx86rfrbrR)r<r^�defaultrrr�VSInstallDir
s
zSystemInfo.VSInstallDircCs�|j|j�p|j�}tjj|jjd|j�}|jj	|d�}|rNtjj|d�n|}|jj	|jj
d|j�pl|}tjj|�s�d}tj
j|��|S)z1
        Microsoft Visual C++ directory.
        z%0.1frZVCz(Microsoft Visual C++ directory not found)rt�	_guess_vc�_guess_vc_legacyrrrrfrSrhrbrQ�isdirr r!r")r<�guess_vcZreg_pathZ	python_vcZ
default_vcr�msgrrr�VCInstallDirszSystemInfo.VCInstallDirc
Cs^|jdkrdSd}tjj|j|�}ytj|�d}tjj||�Stttfk
rXYnXdS)z*
        Locate Visual C for 2017
        g,@Nz
VC\Tools\MSVCrr2)	rhrrrrt�listdirr\r]rj)r<rsrxZvc_exact_verrrrru0s
zSystemInfo._guess_vccCsd|j}tjj|j|�S)z<
        Locate Visual C for versions prior to 2017
        z Microsoft Visual Studio %0.1f\VC)rhrrrrr)r<rsrrrrv@s
zSystemInfo._guess_vc_legacycCsJ|jdkrdS|jdkrdS|jdkr*dS|jdkr8dS|jdkrFdSdS)zN
        Microsoft Windows SDK versions for specified MSVC++ version.
        g"@�7.0�6.1�6.0ag$@�7.1�7.0ag&@�8.0�8.0ag(@�8.1�8.1ag,@�10.0N)r|r}r~)rr�)r�r�)r�r�)r�r�)rh)r<rrr�WindowsSdkVersionGs




zSystemInfo.WindowsSdkVersioncCs|jtjj|jd��S)z4
        Microsoft Windows SDK last version
        �lib)�_use_last_dir_namerrr�
WindowsSdkDir)r<rrr�WindowsSdkLastVersionWs
z SystemInfo.WindowsSdkLastVersioncCsTd}x8|jD].}tjj|jjd|�}|jj|d�}|rPqW|sRtjj|�r�tjj|jjd|j	�}|jj|d�}|r�tjj|d�}|s�tjj|�r�xH|jD]>}|d|j
d��}d	|}tjj|j|�}tjj|�r�|}q�W|s�tjj|��r:x:|jD]0}d
|}tjj|j|�}tjj|��r|}�qW|�sPtjj|jd�}|S)z2
        Microsoft Windows SDK directory.
        rzv%s�installationfolderz%0.1frZWinSDKN�.zMicrosoft SDKs\Windows Kits\%szMicrosoft SDKs\Windows\v%sZPlatformSDK)
r�rrrrfrUrbrwrSrh�rfindrerz)r<�sdkdirr'�locrZinstall_baseZintver�drrrr�_s6
zSystemInfo.WindowsSdkDirc	Cs�|jdkrd}d}n&d}|jdkr&dnd}|jjd|d�}d	||jd
d�f}g}|jdkr�x(|jD]}|tjj|jj	||�g7}qdWx,|j
D]"}|tjj|jjd
||�g7}q�Wx |D]}|jj|d�}|r�Pq�W|S)z=
        Microsoft Windows SDK executable directory.
        g&@�#r�(g(@TF)r9rCzWinSDK-NetFx%dTools%srF�-g,@zv%sAr�)
rhrNrDr;�NetFxSdkVersionrrrrfrVr�rUrb)	r<Znetfxverr(rCZfxZregpathsr'rZexecpathrrr�WindowsSDKExecutablePath�s$

"
z#SystemInfo.WindowsSDKExecutablePathcCs.d|j}tjj|jj|�}|jj|d�p,dS)z0
        Microsoft Visual F# directory.
        z%0.1f\Setup\F#rr)rhrrrrfrOrb)r<rrrr�FSharpInstallDir�s
zSystemInfo.FSharpInstallDircCsF|jdkrd}nf}x(|D] }|jj|jjd|�}|rPqW|pDdS)z8
        Microsoft Universal CRT SDK directory.
        g,@�10�81z
kitsroot%sr)r�r�)rhrfrbrW)r<Zversr'r�rrr�UniversalCRTSdkDir�s


zSystemInfo.UniversalCRTSdkDircCs|jtjj|jd��S)z@
        Microsoft Universal C Runtime SDK last version
        r�)r�rrrr�)r<rrr�UniversalCRTSdkLastVersion�s
z%SystemInfo.UniversalCRTSdkLastVersioncCs|jdkrdSfSdS)z8
        Microsoft .NET Framework SDK versions.
        g,@�4.6.1�4.6N)r�r�)rh)r<rrrr��s
zSystemInfo.NetFxSdkVersioncCs>x4|jD]*}tjj|jj|�}|jj|d�}|rPqW|p<dS)z9
        Microsoft .NET Framework SDK directory.
        Zkitsinstallationfolderr)r�rrrrfrVrb)r<r'r�r�rrr�NetFxSdkDir�szSystemInfo.NetFxSdkDircCs&tjj|jd�}|jj|jjd�p$|S)z;
        Microsoft .NET Framework 32bit directory.
        zMicrosoft.NET\FrameworkZframeworkdir32)rrrrdrfrbrQ)r<�guess_fwrrr�FrameworkDir32�szSystemInfo.FrameworkDir32cCs&tjj|jd�}|jj|jjd�p$|S)z;
        Microsoft .NET Framework 64bit directory.
        zMicrosoft.NET\Framework64Zframeworkdir64)rrrrdrfrbrQ)r<r�rrr�FrameworkDir64�szSystemInfo.FrameworkDir64cCs
|jd�S)z:
        Microsoft .NET Framework 32bit versions.
        � )�_find_dot_net_versions)r<rrr�FrameworkVersion32�szSystemInfo.FrameworkVersion32cCs
|jd�S)z:
        Microsoft .NET Framework 64bit versions.
        �@)r�)r<rrr�FrameworkVersion64�szSystemInfo.FrameworkVersion64cCs�|jj|jjd|�}t|d|�}|p6|j|d�p6d}|jdkrL|df}n:|jdkrx|j�dd	�d
krndn|df}n|jd
kr�d}|jdkr�d}|S)z�
        Find Microsoft .NET Framework versions.

        Parameters
        ----------
        bits: int
            Platform number of bits: 32 or 64.
        zframeworkver%dzFrameworkDir%d�vrg(@zv4.0g$@N�Zv4z
v4.0.30319�v3.5g"@�
v2.0.50727g @�v3.0)r�r�)r�r�)rfrbrQ�getattrr�rhr3)r<�bitsZreg_verZdot_net_dirr'Zframeworkverrrrr�s





z!SystemInfo._find_dot_net_versionscs,��fdd�ttj���D�}t|d�p*dS)z�
        Return name of the last dir in path or '' if no dir found.

        Parameters
        ----------
        path: str
            Use dirs in this path
        prefix: str
            Use only dirs startings by this prefix
        c3s2|]*}tjjtjj�|��r|j��r|VqdS)N)rrrwr�
startswith)�.0Zdir_name)r�prefixrr�	<genexpr>)sz0SystemInfo._use_last_dir_name.<locals>.<genexpr>Nr)�reversedrr{�next)r<rr�Z
matching_dirsr)rr�rr�szSystemInfo._use_last_dir_name)N)r) rr	r
rIrJrKrdrerrr=rgrirLrtrzrurvr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrrrrc�s4

&	rcc@sReZdZdZd=dd�Zedd��Zedd	��Zed
d��Zedd
��Z	edd��Z
edd��Zedd��Zedd��Z
edd��Zedd��Zdd�Zedd��Zedd��Zed d!��Zed"d#��Zed$d%��Zed&d'��Zed(d)��Zed*d+��Zed,d-��Zed.d/��Zed0d1��Zed2d3��Zed4d5��Zd>d7d8�Zd9d:�Zd?d;d<�Z dS)@r$aY
    Return environment variables for specified Microsoft Visual C++ version
    and platform : Lib, Include, Path and libpath.

    This function is compatible with Microsoft Visual C++ 9.0 to 14.0.

    Script created by analysing Microsoft environment configuration files like
    "vcvars[...].bat", "SetEnv.Cmd", "vcbuildtools.bat", ...

    Parameters
    ----------
    arch: str
        Target architecture.
    vc_ver: float
        Required Microsoft Visual C++ version. If not set, autodetect the last
        version.
    vc_min_ver: float
        Minimum Microsoft Visual C++ version.
    NrcCsBt|�|_t|j�|_t|j|�|_|j|kr>d}tjj	|��dS)Nz.No suitable Microsoft Visual C++ version found)
r8rNrMrfrc�sirhr r!r")r<r(rhr,rkrrrr=Is

zEnvironmentInfo.__init__cCs|jjS)z/
        Microsoft Visual C++ version.
        )r�rh)r<rrrrhRszEnvironmentInfo.vc_vercsVddg}�jdkrD�jjddd�}|dg7}|dg7}|d|g7}�fd	d
�|D�S)z/
        Microsoft Visual Studio Tools
        zCommon7\IDEz
Common7\Toolsg,@T)rCr9z1Common7\IDE\CommonExtensions\Microsoft\TestWindowzTeam Tools\Performance ToolszTeam Tools\Performance Tools%scsg|]}tjj�jj|��qSr)rrrr�rt)r�r)r<rr�
<listcomp>fsz+EnvironmentInfo.VSTools.<locals>.<listcomp>)rhrNrD)r<�paths�arch_subdirr)r<r�VSToolsYs


zEnvironmentInfo.VSToolscCs$tjj|jjd�tjj|jjd�gS)zL
        Microsoft Visual C++ & Microsoft Foundation Class Includes
        ZIncludezATLMFC\Include)rrrr�rz)r<rrr�
VCIncludeshszEnvironmentInfo.VCIncludescsb�jdkr�jjdd�}n�jjdd�}d|d|g}�jdkrP|d|g7}�fd	d
�|D�S)zM
        Microsoft Visual C++ & Microsoft Foundation Class Libraries
        g.@T)r9)rCzLib%szATLMFC\Lib%sg,@zLib\store%scsg|]}tjj�jj|��qSr)rrrr�rz)r�r)r<rrr�~sz/EnvironmentInfo.VCLibraries.<locals>.<listcomp>)rhrNrE)r<r�r�r)r<r�VCLibrariesps

zEnvironmentInfo.VCLibrariescCs"|jdkrgStjj|jjd�gS)zA
        Microsoft Visual C++ store references Libraries
        g,@zLib\store\references)rhrrrr�rz)r<rrr�VCStoreRefs�s
zEnvironmentInfo.VCStoreRefscCs|j}tjj|jd�g}|jdkr&dnd}|jj|�}|rT|tjj|jd|�g7}|jdkr�d|jjdd�}|tjj|j|�g7}n�|jdkr�|jj	�r�d	nd
}|tjj|j||jj
dd��g7}|jj|jjkr�|tjj|j||jjdd��g7}n|tjj|jd�g7}|S)
z,
        Microsoft Visual C++ Tools
        Z
VCPackagesg$@TFzBin%sg,@)rCg.@z
bin\HostX86%sz
bin\HostX64%s)r9�Bin)
r�rrrrzrhrNrHrDrBrErAr?)r<r��toolsrGr�rZhost_dirrrr�VCTools�s&

zEnvironmentInfo.VCToolscCst|jdkr2|jjddd�}tjj|jjd|�gS|jjdd�}tjj|jjd�}|j}tjj|d||f�gSdS)	z1
        Microsoft Windows SDK Libraries
        g$@T)rCr9zLib%s)r9r�z%sum%sN)	rhrNrErrrr�r��_sdk_subdir)r<r�r�Zlibverrrr�OSLibraries�s
zEnvironmentInfo.OSLibrariescCs|tjj|jjd�}|jdkr.|tjj|d�gS|jdkr@|j}nd}tjj|d|�tjj|d|�tjj|d|�gSd	S)
z/
        Microsoft Windows SDK Include
        �includeg$@Zglg,@rz%ssharedz%sumz%swinrtN)rrrr�r�rhr�)r<r��sdkverrrr�
OSIncludes�s

zEnvironmentInfo.OSIncludescCs�tjj|jjd�}g}|jdkr*||j7}|jdkrH|tjj|d�g7}|jdkr�||tjj|jjd�tjj|dd�tjj|d	d�tjj|d
d�tjj|jjddd
|jddd�g7}|S)z7
        Microsoft Windows SDK Libraries Paths
        Z
Referencesg"@g&@zCommonConfiguration\Neutralg,@Z
UnionMetadataz'Windows.Foundation.UniversalApiContractz1.0.0.0z%Windows.Foundation.FoundationContractz,Windows.Networking.Connectivity.WwanContractZ
ExtensionSDKszMicrosoft.VCLibsz%0.1fZCommonConfigurationZneutral)rrrr�r�rhr�)r<�ref�libpathrrr�	OSLibpath�s>




zEnvironmentInfo.OSLibpathcCst|j��S)z-
        Microsoft Windows SDK Tools
        )�list�
_sdk_tools)r<rrr�SdkTools�szEnvironmentInfo.SdkToolsccs|jdkr0|jdkrdnd}tjj|jj|�V|jj�sd|jjdd�}d|}tjj|jj|�V|jdksx|jdkr�|jj	�r�d	}n|jjddd
�}d|}tjj|jj|�VnL|jdk�rtjj|jjd�}|jjdd�}|jj
}tjj|d||f�V|jj�r|jjVd
S)z=
[compiled bytecode continues: setuptools MSVC support (EnvironmentInfo). Recoverable docstrings: Microsoft Windows SDK Tools paths generator; Microsoft Windows SDK version subdir; Microsoft Windows SDK Setup; Microsoft .NET Framework Tools; Microsoft .NET Framework SDK Libraries; Microsoft .NET Framework SDK Includes; Microsoft Visual Studio Team System Database; Microsoft Build Engine; Microsoft HTML Help Workshop; Microsoft Universal C Runtime SDK Libraries, Includes, and version subdir; Microsoft Visual F#; Microsoft Visual C++ runtime redistributable DLL. return_env(exists=True) returns the environment dict of include/lib/path entries; if exists is True, only existing paths are returned. _build_paths(name, spec_path_lists, exists): given an environment variable name and specified paths, return a pathsep-separated string of paths containing unique, extant directories from those paths and from the environment variable, raising an error if no paths are resolved. _unique_everseen(iterable, key=None): list unique elements, preserving order, remembering all elements ever seen; _unique_everseen('AAAABBBCCDAABBB') --> A B C D; _unique_everseen('ABBCcAD', str.lower) --> A B C D.]
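The _unique_everseen and _build_paths docstrings above describe ordinary pure-Python helpers. A minimal sketch of the same idea against the standard library only (the names here are illustrative, not the exact private API):

import itertools
import os


def unique_everseen(iterable, key=None):
    # List unique elements, preserving order; remember all elements ever seen.
    seen = set()
    for element in iterable:
        k = element if key is None else key(element)
        if k not in seen:
            seen.add(k)
            yield element


def build_paths(env_var, spec_path_lists):
    # Merge candidate directories with the current environment variable and
    # keep only unique, existing directories, joined with os.pathsep.
    spec_paths = itertools.chain.from_iterable(spec_path_lists)
    env_paths = os.environ.get(env_var, '').split(os.pathsep)
    candidates = itertools.chain(spec_paths, env_paths)
    extant = (p for p in candidates if p and os.path.isdir(p))
    return os.pathsep.join(unique_everseen(extant))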
site-packages/setuptools/__pycache__/py36compat.cpython-36.opt-1.pyc
[compiled bytecode. Recoverable docstring: Distribution_parse_config_files -- a mix-in providing forward-compatibility for functionality to be included by default on Python 3.7; do not edit the code in this class except to update functionality as implemented in distutils. parse_config_files() reads the config files with ConfigParser; when sys.prefix differs from sys.base_prefix (a virtual environment) it ignores install-base, install-platbase, install-lib, install-platlib, install-purelib, install-headers, install-scripts, install-data, prefix, exec-prefix, home, user and root, then applies the remaining options to the distribution, honoring negative options and the global verbose/dry_run flags.]
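As a rough illustration of the option handling that docstring describes -- reading setup.cfg-style sections into per-command option dicts while skipping install-location options in a virtual environment -- here is a small standard-library sketch (the file name and return structure are assumptions for the example, not the actual setuptools code):

from configparser import ConfigParser

IGNORED = {
    'install-base', 'install-platbase', 'install-lib', 'install-platlib',
    'install-purelib', 'install-headers', 'install-scripts', 'install-data',
    'prefix', 'exec-prefix', 'home', 'user', 'root',
}


def read_config(filename='setup.cfg', in_virtualenv=False):
    # Collect {section: {option: value}}, skipping install-location options
    # when running inside a virtual environment.
    parser = ConfigParser()
    parser.read(filename)
    options = {}
    for section in parser.sections():
        for opt in parser.options(section):
            if in_virtualenv and opt in IGNORED:
                continue
            options.setdefault(section, {})[opt.replace('-', '_')] = parser.get(section, opt)
    return options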
site-packages/setuptools/__pycache__/wheel.cpython-36.opt-1.pyc
[compiled bytecode for setuptools/wheel.py -- "Wheels support." Recoverable pieces: a WHEEL_NAME regex that parses a .whl filename into project_name, version, optional build tag, py_version, abi and platform; a NAMESPACE_PACKAGE_INIT stub that declares a namespace package via pkg_resources or pkgutil; unpack(src_dir, dst_dir) -- "Move everything under `src_dir` to `dst_dir`, and delete the former."; class Wheel with tags() ("List tags (py_version, abi, platform) supported by this wheel"), is_compatible() ("Is the wheel compatible with the current platform?", checked against pep425tags.get_supported()), egg_name(), get_dist_info() (locates the .dist-info directory, raising "unsupported wheel format" if missing) and install_as_egg() ("Install wheel as an egg directory", supporting wheel format versions from 1.0 up to, but not including, 2.0dev0).]
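The wheel-name pattern above is fully recoverable from the dump. A small, self-contained sketch of parsing a wheel filename with that pattern and expanding its compressed tag sets (illustrative only; it does not reproduce the rest of the Wheel class):

import itertools
import re

WHEEL_NAME = re.compile(
    r"""^(?P<project_name>.+?)-(?P<version>\d.*?)
    ((-(?P<build>\d.*?))?-(?P<py_version>.+?)-(?P<abi>.+?)-(?P<platform>.+?)
    )\.whl$""",
    re.VERBOSE,
).match


def wheel_tags(filename):
    # Parse the filename and yield every (py_version, abi, platform) combination.
    match = WHEEL_NAME(filename)
    if match is None:
        raise ValueError('invalid wheel name: %r' % filename)
    parts = match.groupdict()
    return itertools.product(
        parts['py_version'].split('.'),
        parts['abi'].split('.'),
        parts['platform'].split('.'),
    )


# Example: list(wheel_tags('demo-1.0-py2.py3-none-any.whl'))
# -> [('py2', 'none', 'any'), ('py3', 'none', 'any')]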
site-packages/setuptools/__pycache__/py33compat.cpython-36.pyc
[compiled bytecode. Recoverable pieces: an OpArg namedtuple ("opcode arg"); Bytecode_compat, whose __iter__ "Yield[s] '(op, arg)' pair for each operation in code object 'code'", handling HAVE_ARGUMENT and EXTENDED_ARG; the module exposes Bytecode (dis.Bytecode where available, otherwise the compat class) and an unescape helper taken from html or HTMLParser.]
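On Python 3.4 and later the same '(op, arg)' iteration is available directly from the dis module, which is what the Bytecode alias above prefers. A brief example of inspecting a code object that way (the sample function is invented for illustration):

import dis


def sample(x):
    return x + 1


# dis.Bytecode yields Instruction objects; opname/arg mirror the (op, arg)
# pairs described in the Bytecode_compat docstring.
for instruction in dis.Bytecode(sample.__code__):
    print(instruction.opname, instruction.arg)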
"site-packages/setuptools/__pycache__/dist.cpython-36.pyc000064400000107706147511334640017331 0ustar003
[compiled bytecode for setuptools/dist.py. Recoverable docstrings:
write_pkg_file -- "Write the PKG-INFO format data to a file object" (Metadata-Version, Name, Version, Summary, Home-page, Author, Author-email, Maintainer, Maintainer-email, License, Download-URL, Project-URL, Description, Keywords, Platform, Classifier, Requires, Provides, Obsoletes, Requires-Python, Description-Content-Type, Provides-Extra).
Validators registered for setup() keywords: check_importable ("must be importable 'module:attrs' string"), assert_string_list ("must be a list of strings"), check_nsp ("Verify that namespace packages are valid"), check_extras ("Verify that extras_require mapping is valid"), assert_bool ("must be a boolean value"), check_requirements ("install_requires must be a string or list of strings containing valid project/version requirement specifiers"), check_specifier ("must be a string containing valid version specifiers"), check_entry_points ("Verify that entry_points map is parseable"), check_test_suite ("test_suite must be a string"), check_package_data ("must be a dictionary mapping package names to lists of wildcard patterns"), and check_packages.
class Distribution -- "Distribution with support for features, tests, and package data"; it adds setup() keywords install_requires (project versions required at install time, in pkg_resources.require() format), extras_require (a mapping of optional "extras" to their additional requirements), features (**deprecated** mapping of option names to setuptools.Feature objects, controlled from the command line via --with-X / --without-X), test_suite (dotted name run by 'python setup.py test'), and package_data (mapping of package names to lists of filenames or globs for data files installed alongside the package; entries under '""' apply to every package). It also provides include()/exclude() methods for adding or removing packages, modules and extensions, fetch_build_eggs() for resolving setup_requires, and environment-marker handling that moves marked install_requires entries into extras_require.
class Feature -- "**deprecated** -- The `Feature` facility was never completely implemented or supported, has reported issues and will be removed in a future version." A Feature describes a subset of the distribution selectable via --with-X / --without-X, created with the keywords description, standard, available, optional, require_features and remove, and carrying include_in(), exclude_from() and validate() methods.]
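To make the Distribution keywords above concrete, here is a hedged, minimal setup.py sketch using only options named in the recovered docstring (the package names, versions and globs are invented placeholders):

from setuptools import setup, find_packages

setup(
    name='example',                      # placeholder project name
    version='0.1',
    packages=find_packages(),
    install_requires=['requests>=2.0'],  # installed automatically with the package
    extras_require={
        # optional "reST" capability, only usable if these are installed
        'reST': ['docutils>=0.3'],
    },
    package_data={
        # globs are searched inside the named package; '' applies to all packages
        'example': ['data/*.json'],
    },
    test_suite='example.tests',
)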
site-packages/setuptools/__pycache__/unicode_utils.cpython-36.pyc
[compiled bytecode. Recoverable pieces: decompose(path) NFD-normalizes a path, decoding UTF-8 text first when needed; filesys_decode(path) -- "Ensure that the given path is decoded, None when no expected encoding works" -- tries sys.getfilesystemencoding() and then utf-8; try_encode(string, enc) -- "turn unicode encoding into a functional routine" -- returns None on UnicodeEncodeError.]
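A small sketch of the decode-with-fallback idea the filesys_decode docstring describes (an approximation of the behaviour, not the exact setuptools code):

import sys


def filesys_decode(path):
    # Return a decoded str, or None when no expected encoding works.
    if isinstance(path, str):
        return path
    fs_enc = sys.getfilesystemencoding() or 'utf-8'
    for enc in (fs_enc, 'utf-8'):
        try:
            return path.decode(enc)
        except UnicodeDecodeError:
            continue
    return None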
site-packages/setuptools/__pycache__/launch.cpython-36.opt-1.pyc
[compiled bytecode. Recoverable docstring: "Launch the Python script on the command line after setuptools is bootstrapped via import." run() -- "Run the script in sys.argv[1] as if it had been invoked naturally" -- reads the script with tokenize.open, normalizes line endings, compiles it and executes it in a fresh __main__ namespace.]
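A hedged sketch of that "run a script as __main__" pattern using only the standard library (simplified; the real module also shifts sys.argv and bootstraps setuptools first):

import tokenize


def run_script(script_name):
    # Execute script_name as if it had been invoked directly.
    namespace = dict(
        __file__=script_name,
        __name__='__main__',
        __doc__=None,
    )
    with tokenize.open(script_name) as f:
        code = compile(f.read(), script_name, 'exec')
    exec(code, namespace)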
site-packages/setuptools/__pycache__/wheel.cpython-36.pyc
[compiled bytecode -- the non-optimized build of the same setuptools/wheel.py module dumped above as wheel.cpython-36.opt-1.pyc; the recoverable docstrings are identical.]
site-packages/setuptools/__pycache__/archive_util.cpython-36.opt-1.pyc
[compiled bytecode for "Utilities for extracting common archive formats"; exports unpack_archive, unpack_zipfile, unpack_tarfile, unpack_directory, default_filter ("The default progress/filter callback; returns True for all files"), UnrecognizedFormat ("Couldn't recognize the archive type") and extraction_drivers. unpack_archive(filename, extract_dir, progress_filter, drivers): progress_filter takes a '/'-separated source path internal to the archive and the filesystem path it will be extracted to, and returns the desired extract path (possibly changed) or None to skip the item; the drivers are tried in sequence until one succeeds, otherwise UnrecognizedFormat is raised. unpack_directory "unpacks" a directory using the same interface; unpack_zipfile and unpack_tarfile raise UnrecognizedFormat when the file is not a zip or tar archive.]
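The progress_filter contract described above is easy to demonstrate. A hedged usage sketch (the archive name and destination are placeholders, and the import path assumes the setuptools layout shown in this listing):

from setuptools.archive_util import unpack_archive


def progress_filter(src, dst):
    # src is the '/'-separated name inside the archive, dst the target path.
    # Returning None skips the member; returning a path extracts it there.
    print('extracting %s -> %s' % (src, dst))
    if src.endswith('.pyc'):
        return None          # skip compiled files, as an example policy
    return dst


unpack_archive('example-1.0.tar.gz', 'build/example', progress_filter)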
site-packages/setuptools/__pycache__/depends.cpython-36.opt-1.pyc
[compiled bytecode. Recoverable docstrings: class Require -- "A prerequisite to building or installing a distribution" -- with full_name() ("Return full package/distribution name, w/version"), version_ok() ("Is 'version' sufficiently up-to-date?"), get_version() ("Get version number of installed module, 'None', or 'default'"), is_present() ("Return true if dependency is present on 'paths'") and is_current() ("Return true if dependency is present and up-to-date on 'paths'"); find_module ("Just like 'imp.find_module()', but with package support"); get_module_constant ("Find 'module' by searching 'paths', and extract 'symbol'"); extract_constant ("Extract the constant value of 'symbol' from 'code'" -- only simple STORE_NAME/STORE_GLOBAL assignments of constants are recognized, and 'symbol' must appear in code.co_names); _update_globals removes objects not available on some platforms.]
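A brief usage sketch of the Require helper described above, checking for an installed module by its version attribute (the module name and version numbers are illustrative):

from setuptools.depends import Require

# Declare a dependency on a module that exposes __version__.
dep = Require('Example', '1.2', 'example', attribute='__version__')

print(dep.full_name())      # 'Example-1.2'
print(dep.is_present())     # True if the 'example' module can be found
print(dep.is_current())     # True if its version is at least the requested one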

��f�7�@s�ddlZddlZddlZddlZddlZddlZddlZddlZddlZddl	Z	ddl
mZddlm
Z
mZddlZejjd�r�ddljjjjZnejejZyeZWnek
r�dZYnXeZddlm Z ddlm!Z!ddd	d
gZ"d-dd�Z#ej$d.d
d��Z%ej$dd��Z&ej$dd��Z'ej$dd��Z(Gdd�de)�Z*Gdd�d�Z+ej$dd��Z,dd�Z-ej$dd��Z.ej$dd ��Z/d!d"�Z0d#d$�Z1d%d
�Z2Gd&d�d�Z3e4ed'��r�ej5gZ6ngZ6Gd(d�de3�Z7ej8ej9d)d*�d+j:�D��Z;Gd,d	�d	e �Z<dS)/�N)�six)�builtins�map�java)�DistutilsError)�working_set�AbstractSandbox�DirectorySandbox�SandboxViolation�	run_setupcCsJd}t||��}|j�}WdQRX|dkr.|}t||d�}t|||�dS)z.
    Python 3 implementation of execfile.
    �rbN�exec)�open�read�compiler
)�filename�globals�locals�mode�streamZscript�code�r�/usr/lib/python3.6/sandbox.py�	_execfile#src
csDtjdd�}|dk	r$|tjdd�<z
|VWd|tjdd�<XdS)N)�sys�argv)�repl�savedrrr�	save_argv0s
rc
cs.tjdd�}z
|VWd|tjdd�<XdS)N)r�path)rrrr�	save_path;s
r ccs4tjj|dd�tj}|t_z
dVWd|t_XdS)zL
    Monkey-patch tempfile.tempdir with replacement, ensuring it exists
    T)�exist_okN)�
pkg_resourcesZ
py31compat�makedirs�tempfileZtempdir)Zreplacementrrrr�
override_tempDs
r%ccs.tj�}tj|�z
|VWdtj|�XdS)N)�os�getcwd�chdir)�targetrrrr�pushdUs


r*c@seZdZdZedd��ZdS)�UnpickleableExceptionzP
    An exception representing another Exception that could not be pickled.
    cCsJytj|�tj|�fStk
rDddlm}|j||t|���SXdS)z�
        Always return a dumped (pickled) type and exc. If exc can't be pickled,
        wrap it in UnpickleableException first.
        r)r+N)�pickle�dumps�	Exception�setuptools.sandboxr+�dump�repr)�type�exc�clsrrrr0ds
zUnpickleableException.dumpN)�__name__�
__module__�__qualname__�__doc__�staticmethodr0rrrrr+_sr+c@s(eZdZdZdd�Zdd�Zdd�ZdS)	�ExceptionSaverz^
    A Context Manager that will save an exception, serialized, and restore it
    later.
    cCs|S)Nr)�selfrrr�	__enter__xszExceptionSaver.__enter__cCs |sdStj||�|_||_dS)NT)r+r0�_saved�_tb)r;r2r3�tbrrr�__exit__{s
zExceptionSaver.__exit__cCs6dt|�krdSttj|j�\}}tj|||j�dS)z"restore and re-raise any exceptionr=N)�varsrr,�loadsr=rZreraiser>)r;r2r3rrr�resume�szExceptionSaver.resumeN)r5r6r7r8r<r@rCrrrrr:rsr:c
#sVtjj��t��}�VWdQRXtjj���fdd�tjD�}t|�|j�dS)z�
    Context in which imported modules are saved.

    Translates exceptions internal to the context into the equivalent exception
    outside the context.
    Nc3s&|]}|�kr|jd�r|VqdS)z
encodings.N)�
startswith)�.0�mod_name)rrr�	<genexpr>�szsave_modules.<locals>.<genexpr>)r�modules�copyr:�update�_clear_modulesrC)�	saved_excZdel_modulesr)rr�save_modules�s
rMcCsxt|�D]}tj|=q
WdS)N)�listrrH)Zmodule_namesrFrrrrK�srKccs$tj�}z
|VWdtj|�XdS)N)r"�__getstate__�__setstate__)rrrr�save_pkg_resources_state�s
rQc,cs�tjj|d�}t��xt��ft�t��Nt��<t|��(t	|��t
d�dVWdQRXWdQRXWdQRXWdQRXWdQRXWdQRXdS)NZtempZ
setuptools)r&r�joinrQrM�hide_setuptoolsr rr%r*�
__import__)�	setup_dirZtemp_dirrrr�
setup_context�s

rVcCstjd�}t|j|��S)aH
    >>> _needs_hiding('setuptools')
    True
    >>> _needs_hiding('pkg_resources')
    True
    >>> _needs_hiding('setuptools_plugin')
    False
    >>> _needs_hiding('setuptools.__init__')
    True
    >>> _needs_hiding('distutils')
    True
    >>> _needs_hiding('os')
    False
    >>> _needs_hiding('Cython')
    True
    z1(setuptools|pkg_resources|distutils|Cython)(\.|$))�rer�bool�match)rF�patternrrr�
_needs_hiding�s
r[cCstttj�}t|�dS)a%
    Remove references to setuptools' modules from sys.modules to allow the
    invocation to import the most appropriate setuptools. This technique is
    necessary to avoid issues such as #315 where setuptools upgrading itself
    would fail to find a function declared in the metadata.
    N)�filterr[rrHrK)rHrrrrS�srScCs�tjjtjj|��}t|���y�|gt|�tjdd�<tjjd|�t	j
�t	jjdd��t
|t�rl|n|jtj��}t|��t|dd�}t||�WdQRXWn4tk
r�}z|jr�|jdrʂWYdd}~XnXWdQRXdS)z8Run a distutils setup script, sandboxed in its directoryNrcSs|j�S)N)Zactivate)Zdistrrr�<lambda>�szrun_setup.<locals>.<lambda>�__main__)�__file__r5)r&r�abspath�dirnamerVrNrr�insertr�__init__Z	callbacks�append�
isinstance�str�encode�getfilesystemencodingr	�dictr�
SystemExit�args)Zsetup_scriptrkrUZdunder_file�ns�vrrrr�s

c@s2eZdZdZdZdd�Zdd�Zdd�Zd	d
�Zdd�Z	d
d�Z
x$d9D]Zee
e�rFe
e�e�e<qFWd:dd�Zer~ede�Zede�Zx$d;D]Zee
e�r�ee�e�e<q�Wd)d*�Zx$d<D]Zee
e�r�ee�e�e<q�Wd-d.�Zx(d=D] Zee
e��r�ee�e�e<�q�Wd1d2�Zd3d4�Zd5d6�Zd7d8�ZdS)>rzDWrap 'os' module and 'open()' builtin for virtualizing setup scriptsFcs�fdd�tt�D��_dS)Ncs&g|]}|jd�rt�|�r|�qS)�_)rD�hasattr)rE�name)r;rr�
<listcomp>sz,AbstractSandbox.__init__.<locals>.<listcomp>)�dir�_os�_attrs)r;r)r;rrcszAbstractSandbox.__init__cCs&x |jD]}tt|t||��qWdS)N)rt�setattrr&�getattr)r;�sourcerprrr�_copyszAbstractSandbox._copycCs(|j|�tr|jt_|jt_d|_dS)NT)rx�_filer�file�_openr�_active)r;rrrr<s

zAbstractSandbox.__enter__cCs$d|_trtt_tt_|jt�dS)NF)r|ryrrzr{rrxrs)r;�exc_type�	exc_value�	tracebackrrrr@s
zAbstractSandbox.__exit__c	Cs|�|�SQRXdS)zRun 'func' under os sandboxingNr)r;�funcrrr�runszAbstractSandbox.runcstt�����fdd�}|S)Ncs2|jr |j�||f|�|�\}}�||f|�|�S)N)r|�_remap_pair)r;�src�dstrk�kw)rp�originalrr�wrap&sz3AbstractSandbox._mk_dual_path_wrapper.<locals>.wrap)rvrs)rpr�r)rpr�r�_mk_dual_path_wrapper#s
[compiled CPython 3.6 bytecode; binary data omitted. This is the tail of a
setuptools sandbox module: an AbstractSandbox that wraps path-taking os
functions (stat, listdir, chmod, chown, mkdir, remove, unlink, rmdir, rename,
link, symlink, and similar), a DirectorySandbox described as "Restrict
operations to a single subdirectory - pseudo-chroot", the write-flag set
O_WRONLY O_RDWR O_APPEND O_CREAT O_TRUNC O_TEMPORARY, and the SandboxViolation
message template:

    SandboxViolation: {cmd}{args!r} {kwargs}

    The package setup script has attempted to modify files on your system
    that are not within the EasyInstall build area, and has been aborted.

    This package cannot be safely installed by EasyInstall, and may not
    support alternate installation locations even if you run its setup
    script by hand.  Please inform the package's author and the EasyInstall
    maintainers to find out if a fix or workaround is available.]
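The sandbox fragments above are easier to follow next to a short usage sketch.
The following is not part of the archive; it assumes the standard
setuptools.sandbox API, and the 'demo_pkg' path is a hypothetical example.

    # Run a packaging command with filesystem writes confined to the setup
    # directory; writes that stray outside it raise the SandboxViolation
    # whose message template is quoted above.
    from setuptools.sandbox import run_setup, SandboxViolation

    try:
        run_setup('demo_pkg/setup.py', ['egg_info'])
    except SandboxViolation as exc:
        print(exc)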
site-packages/setuptools/__pycache__/version.cpython-36.opt-1.pyc
site-packages/setuptools/__pycache__/dist.cpython-36.opt-1.pyc
site-packages/setuptools/__pycache__/pep425tags.cpython-36.pyc
site-packages/setuptools/__pycache__/windows_support.cpython-36.pyc
site-packages/setuptools/__pycache__/version.cpython-36.pyc
site-packages/setuptools/__pycache__/site-patch.cpython-36.pyc
site-packages/setuptools/__pycache__/dep_util.cpython-36.pyc
site-packages/setuptools/__pycache__/glob.cpython-36.opt-1.pyc
site-packages/setuptools/__pycache__/__init__.cpython-36.opt-1.pyc
site-packages/setuptools/__pycache__/lib2to3_ex.cpython-36.opt-1.pyc
site-packages/setuptools/__pycache__/site-patch.cpython-36.opt-1.pyc
site-packages/setuptools/__pycache__/config.cpython-36.opt-1.pyc
site-packages/setuptools/__pycache__/build_meta.cpython-36.opt-1.pyc
site-packages/setuptools/__pycache__/dep_util.cpython-36.opt-1.pyc
site-packages/setuptools/__pycache__/ssl_support.cpython-36.pyc

[compiled CPython 3.6 bytecode for the archive members listed above; the
binary data is omitted. The readable fragments are docstrings: dist.py
documents the extra setup() keywords (install_requires, extras_require,
test_suite, package_data) and the deprecated Feature class; config.py
documents parsing declarative configuration files into setup() options;
build_meta.py documents the PEP 517 hooks (build_wheel, build_sdist,
get_requires_for_build_wheel, prepare_metadata_for_build_wheel);
pep425tags.py generates PEP 425 compatibility tags; glob.py is noted as a
copy of glob from Python 3.5; ssl_support.py carries candidate CA-bundle
paths such as /etc/pki/tls/certs/ca-bundle.crt and /etc/ssl/cert.pem. The
dump ends below with the truncated tail of the ssl_support bytecode.]

��f,!�"@s�ddlZddlZddlZddlZddlZddlmZmZmZm	Z	ddl
mZmZyddl
Z
Wnek
rtdZ
YnXdddddgZd	j�j�ZyejjZejZWnek
r�eZZYnXe
dk	o�eeefkZydd
l
mZmZWnRek
�r:yddlmZddlmZWnek
�r4dZdZYnXYnXe�sRGd
d�de�Ze�sjddd�Zdd�ZGdd�de�ZGdd�de�Zd dd�Z dd�Z!e!dd��Z"dd�Z#dd�Z$dS)!�N)�urllib�http_client�map�filter)�ResolutionError�ExtractionError�VerifyingHTTPSHandler�find_ca_bundle�is_available�
cert_paths�
opener_fora
/etc/pki/tls/certs/ca-bundle.crt
/etc/ssl/certs/ca-certificates.crt
/usr/share/ssl/certs/ca-bundle.crt
/usr/local/share/certs/ca-root.crt
/etc/ssl/cert.pem
/System/Library/OpenSSL/certs/cert.pem
/usr/local/share/certs/ca-root-nss.crt
/etc/ssl/ca-bundle.pem
)�CertificateError�match_hostname)r
)rc@seZdZdS)r
N)�__name__�
__module__�__qualname__�rr�!/usr/lib/python3.6/ssl_support.pyr
5sr
�c
Cs�g}|sdS|jd�}|d}|dd�}|jd�}||krLtdt|���|s`|j�|j�kS|dkrt|jd�n>|jd	�s�|jd	�r�|jtj|��n|jtj|�j	d
d��x|D]}|jtj|��q�Wtj
dd
j|�dtj�}	|	j
|�S)zpMatching according to RFC 6125, section 6.4.3

        http://tools.ietf.org/html/rfc6125#section-6.4.3
        F�.rrN�*z,too many wildcards in certificate DNS name: z[^.]+zxn--z\*z[^.]*z\Az\.z\Z)�split�countr
�repr�lower�append�
startswith�re�escape�replace�compile�join�
IGNORECASE�match)
Zdn�hostnameZ
max_wildcardsZpats�partsZleftmostZ	remainderZ	wildcardsZfragZpatrrr�_dnsname_match;s*


r&cCs�|std��g}|jdf�}x0|D](\}}|dkr"t||�r@dS|j|�q"W|s�xF|jdf�D]6}x0|D](\}}|dkrjt||�r�dS|j|�qjWq`Wt|�dkr�td|d	jtt|��f��n*t|�dkr�td
||df��ntd��dS)
a=Verify that *cert* (in decoded format as returned by
        SSLSocket.getpeercert()) matches the *hostname*.  RFC 2818 and RFC 6125
        rules are followed, but IP addresses are not accepted for *hostname*.

        CertificateError is raised on failure. On success, the function
        returns nothing.
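
A small sketch of the check described above (illustrative; the certificate dict and hostnames are invented, shaped like SSLSocket.getpeercert() output):

from setuptools.ssl_support import CertificateError, match_hostname

cert = {"subjectAltName": (("DNS", "example.org"), ("DNS", "*.example.org"))}

match_hostname(cert, "www.example.org")   # success: returns None
try:
    match_hostname(cert, "evil.test")     # failure: raises CertificateError
except CertificateError as exc:
    print("rejected:", exc)
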
        zempty or no certificateZsubjectAltNameZDNSNZsubjectZ
commonNamerz&hostname %r doesn't match either of %sz, zhostname %r doesn't match %rrz=no appropriate commonName or subjectAltName fields were found)	�
ValueError�getr&r�lenr
r!rr)Zcertr$ZdnsnamesZsan�key�value�subrrrros.

rc@s eZdZdZdd�Zdd�ZdS)rz=Simple verifying handler: no auth, subclasses, timeouts, etc.cCs||_tj|�dS)N)�	ca_bundle�HTTPSHandler�__init__)�selfr-rrrr/�szVerifyingHTTPSHandler.__init__cs�j�fdd�|�S)Ncst|�jf|�S)N)�VerifyingHTTPSConnr-)�host�kw)r0rr�<lambda>�sz2VerifyingHTTPSHandler.https_open.<locals>.<lambda>)Zdo_open)r0Zreqr)r0r�
https_open�sz VerifyingHTTPSHandler.https_openN)rrr�__doc__r/r5rrrrr�sc@s eZdZdZdd�Zdd�ZdS)r1z@Simple verifying connection: no auth, subclasses, timeouts, etc.cKstj||f|�||_dS)N)�HTTPSConnectionr/r-)r0r2r-r3rrrr/�szVerifyingHTTPSConn.__init__cCs�tj|j|jft|dd��}t|d�rHt|dd�rH||_|j�|j}n|j}tt	d�rxt	j
|jd�}|j||d�|_nt	j|t	j
|jd�|_yt|jj�|�Wn.tk
r�|jjtj�|jj��YnXdS)NZsource_address�_tunnel�_tunnel_host�create_default_context)Zcafile)Zserver_hostname)Z	cert_reqsZca_certs)�socketZcreate_connectionr2Zport�getattr�hasattr�sockr8r9�sslr:r-Zwrap_socketZ
CERT_REQUIREDrZgetpeercertr
ZshutdownZ	SHUT_RDWR�close)r0r>Zactual_hostZctxrrr�connect�s$

zVerifyingHTTPSConn.connectN)rrrr6r/rArrrrr1�sr1cCstjjt|pt���jS)z@Get a urlopen() replacement that uses ca_bundle for verification)r�requestZbuild_openerrr	�open)r-rrrr�scstj���fdd��}|S)Ncst�d�s�||��_�jS)N�always_returns)r=rD)�args�kwargs)�funcrr�wrapper�s
zonce.<locals>.wrapper)�	functools�wraps)rGrHr)rGr�once�srKcsXyddl}Wntk
r dSXG�fdd�d|j����}|jd�|jd�|jS)Nrcs,eZdZ��fdd�Z��fdd�Z�ZS)z"get_win_certfile.<locals>.CertFilecst�|�j�tj|j�dS)N)�superr/�atexit�registerr@)r0)�CertFile�	__class__rrr/�sz+get_win_certfile.<locals>.CertFile.__init__cs,yt�|�j�Wntk
r&YnXdS)N)rLr@�OSError)r0)rOrPrrr@�sz(get_win_certfile.<locals>.CertFile.close)rrrr/r@�
__classcell__r)rO)rPrrO�srOZCAZROOT)�wincertstore�ImportErrorrOZaddstore�name)rSZ	_wincertsr)rOr�get_win_certfile�s

rVcCs$ttjjt�}t�p"t|d�p"t�S)z*Return an existing CA bundle path, or NoneN)r�os�path�isfilerrV�next�_certifi_where)Zextant_cert_pathsrrrr	�s
c
Cs,ytd�j�Stttfk
r&YnXdS)NZcertifi)�
__import__�whererTrrrrrrr[sr[)r)N)%rWr;rMrrIZsetuptools.extern.six.movesrrrrZ
pkg_resourcesrrr?rT�__all__�striprrrBr.r7�AttributeError�objectr
r
rZbackports.ssl_match_hostnamer'r&rr1rrKrVr	r[rrrr�<module>sP


4)
(
	
site-packages/setuptools/__pycache__/extension.cpython-36.opt-1.pyc000064400000003562147511334640021334 0ustar003

��f��@s|ddlZddlZddlZddlZddlZddlmZddlm	Z	dd�Z
e
Ze	ejj
�ZGdd�de�Z
Gd	d
�d
e
�ZdS)�N)�map�)�
get_unpatchedcCs2d}yt|dgd�jdStk
r,YnXdS)z0
    Return True if Cython can be imported.
    zCython.Distutils.build_ext�	build_ext)�fromlistTF)�
__import__r�	Exception)Zcython_impl�r	�/usr/lib/python3.6/extension.py�_have_cythonsrc@s eZdZdZdd�Zdd�ZdS)�	Extensionz7Extension that uses '.c' files in place of '.pyx' filescOs(|jdd�|_tj|||f|�|�dS)N�py_limited_apiF)�popr
�
_Extension�__init__)�self�name�sources�args�kwr	r	r
r#szExtension.__init__cCsNt�r
dS|jpd}|j�dkr$dnd}tjtjd|�}tt||j	��|_	dS)z�
        Replace sources with .pyx extensions to sources with the target
        language extension. This mechanism allows language authors to supply
        pre-converted sources but to prefer the .pyx sources.
        N�zc++z.cppz.cz.pyx$)
rZlanguage�lower�	functools�partial�re�sub�listrr)rZlangZ
target_extrr	r	r
�_convert_pyx_sources_to_lang)s
z&Extension._convert_pyx_sources_to_langN)�__name__�
__module__�__qualname__�__doc__rrr	r	r	r
r src@seZdZdZdS)�Libraryz=Just like a regular Extension, but built as a library insteadN)rrr r!r	r	r	r
r"8sr")rrZdistutils.coreZ	distutilsZdistutils.errorsZdistutils.extensionZsetuptools.extern.six.movesrZmonkeyrrZ
have_pyrexZcorerrr"r	r	r	r
�<module>ssite-packages/setuptools/__pycache__/archive_util.cpython-36.pyc000064400000011713147511334640021034 0ustar003

��f��@s�dZddlZddlZddlZddlZddlZddlZddlmZddl	m
Z
ddddd	d
dgZGdd	�d	e�Zd
d�Z
e
dfdd�Ze
fdd�Ze
fdd�Ze
fdd�ZeeefZdS)z/Utilities for extracting common archive formats�N)�DistutilsError)�ensure_directory�unpack_archive�unpack_zipfile�unpack_tarfile�default_filter�UnrecognizedFormat�extraction_drivers�unpack_directoryc@seZdZdZdS)rz#Couldn't recognize the archive typeN)�__name__�
__module__�__qualname__�__doc__�rr�"/usr/lib/python3.6/archive_util.pyrscCs|S)z@The default progress/filter callback; returns True for all filesr)�src�dstrrrrscCsNxH|ptD]0}y||||�Wntk
r4w
Yq
XdSq
Wtd|��dS)a�Unpack `filename` to `extract_dir`, or raise ``UnrecognizedFormat``

    `progress_filter` is a function taking two arguments: a source path
    internal to the archive ('/'-separated), and a filesystem path where it
    will be extracted.  The callback must return the desired extract path
    (which may be the same as the one passed in), or else ``None`` to skip
    that file or directory.  The callback can thus be used to report on the
    progress of the extraction, as well as to filter the items extracted or
    alter their extraction paths.

    `drivers`, if supplied, must be a non-empty sequence of functions with the
    same signature as this function (minus the `drivers` argument), that raise
    ``UnrecognizedFormat`` if they do not support extracting the designated
    archive type.  The `drivers` are tried in sequence until one is found that
    does not raise an error, or until all are exhausted (in which case
    ``UnrecognizedFormat`` is raised).  If you do not supply a sequence of
    drivers, the module's ``extraction_drivers`` constant will be used, which
    means that ``unpack_zipfile`` and ``unpack_tarfile`` will be tried, in that
    order.
    Nz!Not a recognized archive type: %s)r	r)�filename�extract_dir�progress_filterZdriversZdriverrrrrscCs�tjj|�std|��|d|fi}x�tj|�D]�\}}}||\}}x4|D],}	||	dtjj||	�f|tjj||	�<qLWx\|D]T}
tjj||
�}|||
|�}|s�q�t|�tjj||
�}
tj|
|�tj	|
|�q�Wq0WdS)z�"Unpack" a directory, using the same interface as for archives

    Raises ``UnrecognizedFormat`` if `filename` is not a directory
    z%s is not a directory��/N)
�os�path�isdirr�walk�joinr�shutilZcopyfileZcopystat)rrr�paths�base�dirs�filesrr�d�f�targetrrrr
?s 
,
c
Cs�tj|�std|f��tj|���}x�|j�D]�}|j}|jd�s.d|jd�krRq.tj	j
|f|jd���}|||�}|szq.|jd�r�t|�n4t|�|j
|j�}t|d��}|j|�WdQRX|jd?}	|	r.tj||	�q.WWdQRXdS)z�Unpack zip `filename` to `extract_dir`

    Raises ``UnrecognizedFormat`` if `filename` is not a zipfile (as determined
    by ``zipfile.is_zipfile()``).  See ``unpack_archive()`` for an explanation
    of the `progress_filter` argument.
    z%s is not a zip filerz..�wbN�)�zipfileZ
is_zipfilerZZipFileZinfolistr�
startswith�splitrrr�endswithr�read�open�writeZ
external_attr�chmod)
rrr�z�info�namer$�datar#Zunix_attributesrrrrZs(




c
Cshytj|�}Wn$tjk
r2td|f��YnXtj|���dd�|_�x
|D�]}|j}|jd�oxd|j	d�krTt
jj|f|j	d���}xV|dk	r�|j
�s�|j�r�|j}|j�r�tj|j�}tj||�}tj|�}|j|�}q�W|dk	rT|j��s|j�rT|||�}	|	rT|	jt
j��r,|	dd	�}	y|j||	�WqTtjk
�rTYqTXqTWdSQRXdS)
z�Unpack tar/tar.gz/tar.bz2 `filename` to `extract_dir`

    Raises ``UnrecognizedFormat`` if `filename` is not a tarfile (as determined
    by ``tarfile.open()``).  See ``unpack_archive()`` for an explanation
    of the `progress_filter` argument.
    z/%s is not a compressed or uncompressed tar filecWsdS)Nr)�argsrrr�<lambda>�sz unpack_tarfile.<locals>.<lambda>rz..N�T���)�tarfiler,ZTarErrorr�
contextlib�closing�chownr1r(r)rrrZislnkZissymZlinkname�	posixpath�dirname�normpathZ
_getmember�isfilerr*�sepZ_extract_memberZExtractError)
rrrZtarobj�memberr1Z
prelim_dstZlinkpathrZ	final_dstrrrrs8



)rr'r7rrr;r8Zdistutils.errorsrZ
pkg_resourcesr�__all__rrrr
rrr	rrrr�<module>s$
"%.site-packages/setuptools/__pycache__/py31compat.cpython-36.opt-1.pyc000064400000002711147511334640021313 0ustar003

��f��@s�ddgZyddlmZmZWn,ek
rHddlmZmZdd�ZYnXyddlmZWn4ek
r�ddl	Z	ddlZGdd	�d	e
�ZYnXdS)
�get_config_vars�get_path�)rr)r�get_python_libcCs|dkrtd��t|dk�S)N�platlib�purelibzName must be purelib or platlib)rr)�
ValueErrorr)�name�r	� /usr/lib/python3.6/py31compat.pyr	s)�TemporaryDirectoryNc@s(eZdZdZdd�Zdd�Zdd�ZdS)	rz�
        Very simple temporary directory context manager.
        Will try to delete afterward, but will also ignore OS and similar
        errors on deletion.
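
Typical use of this fallback context manager (a sketch; on newer Pythons the real tempfile.TemporaryDirectory is imported instead):

import os
from setuptools.py31compat import TemporaryDirectory

with TemporaryDirectory() as tmp:
    path = os.path.join(tmp, "scratch.txt")
    with open(path, "w") as f:
        f.write("temporary data")
# on exit the directory is removed; OS errors during cleanup are ignored
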
        cCsd|_tj�|_dS)N)r�tempfileZmkdtemp)�selfr	r	r
�__init__szTemporaryDirectory.__init__cCs|jS)N)r)r
r	r	r
�	__enter__!szTemporaryDirectory.__enter__cCs2ytj|jd�Wntk
r&YnXd|_dS)NT)�shutilZrmtreer�OSError)r
�exctypeZexcvalueZexctracer	r	r
�__exit__$s
zTemporaryDirectory.__exit__N)�__name__�
__module__�__qualname__�__doc__rrrr	r	r	r
rsr)�__all__�	sysconfigrr�ImportErrorZdistutils.sysconfigrrrr�objectr	r	r	r
�<module>ssite-packages/setuptools/__pycache__/py31compat.cpython-36.pyc000064400000002711147511334640020354 0ustar003

��f��@s�ddgZyddlmZmZWn,ek
rHddlmZmZdd�ZYnXyddlmZWn4ek
r�ddl	Z	ddlZGdd	�d	e
�ZYnXdS)
�get_config_vars�get_path�)rr)r�get_python_libcCs|dkrtd��t|dk�S)N�platlib�purelibzName must be purelib or platlib)rr)�
ValueErrorr)�name�r	� /usr/lib/python3.6/py31compat.pyr	s)�TemporaryDirectoryNc@s(eZdZdZdd�Zdd�Zdd�ZdS)	rz�
        Very simple temporary directory context manager.
        Will try to delete afterward, but will also ignore OS and similar
        errors on deletion.
        cCsd|_tj�|_dS)N)r�tempfileZmkdtemp)�selfr	r	r
�__init__szTemporaryDirectory.__init__cCs|jS)N)r)r
r	r	r
�	__enter__!szTemporaryDirectory.__enter__cCs2ytj|jd�Wntk
r&YnXd|_dS)NT)�shutilZrmtreer�OSError)r
�exctypeZexcvalueZexctracer	r	r
�__exit__$s
zTemporaryDirectory.__exit__N)�__name__�
__module__�__qualname__�__doc__rrrr	r	r	r
rsr)�__all__�	sysconfigrr�ImportErrorZdistutils.sysconfigrrrr�objectr	r	r	r
�<module>ssite-packages/setuptools/__pycache__/monkey.cpython-36.pyc000064400000010773147511334640017665 0ustar003

��f��@s�dZddlZddlZddlZddlZddlZddlmZddl	Z	ddl
mZddlZgZ
dd�Zdd�Zd	d
�Zdd�Zd
d�Zdd�Zdd�Zdd�ZdS)z
Monkey patching of distutils.
�N)�
import_module)�sixcCs"tj�dkr|f|jStj|�S)am
    Returns the bases classes for cls sorted by the MRO.

    Works around an issue on Jython where inspect.getmro will not return all
    base classes if multiple classes share the same name. Instead, this
    function will return a tuple containing the class itself, and the contents
    of cls.__bases__. See https://github.com/pypa/setuptools/issues/1024.
    ZJython)�platformZpython_implementation�	__bases__�inspectZgetmro)�cls�r�/usr/lib/python3.6/monkey.py�_get_mros	r
cCs0t|tj�rtnt|tj�r tndd�}||�S)NcSsdS)Nr)�itemrrr	�<lambda>*szget_unpatched.<locals>.<lambda>)�
isinstancerZclass_types�get_unpatched_class�types�FunctionType�get_unpatched_function)r�lookuprrr	�
get_unpatched&srcCs:dd�t|�D�}t|�}|jjd�s6d|}t|��|S)z�Protect against re-patching the distutils if reloaded

    Also ensures that no other distutils extension monkeypatched the distutils
    first.
    css|]}|jjd�s|VqdS)�
setuptoolsN)�
__module__�
startswith)�.0rrrr	�	<genexpr>6sz&get_unpatched_class.<locals>.<genexpr>�	distutilsz(distutils has already been patched by %r)r
�nextrr�AssertionError)rZexternal_bases�base�msgrrr	r/srcCs�tjtj_tjdk}|r"tjtj_tjdkpxd
tjko@dknpxdtjkoZdknpxdtjkotdkn}|r�d	}|tjj	_
t�x"tjtjtj
fD]}tjj|_q�Wtjjtj_tjjtj_d
tjk�r�tjjtjd
_t�dS)N�����
r��zhttps://upload.pypi.org/legacy/zdistutils.command.build_ext)rrr)r r!r")rr)rrr!)rr#)rr#r$)rr)rrr)rZCommandrZcore�sys�version_info�findallZfilelist�configZ
PyPIRCCommandZDEFAULT_REPOSITORY�+_patch_distribution_metadata_write_pkg_file�dist�cmdZDistribution�	extensionZ	Extension�modules�#patch_for_msvc_specialized_compiler)Zhas_issue_12885Zneeds_warehouseZ	warehouse�modulerrr	�	patch_allAs&




r0cCstjjtjj_dS)zDPatch write_pkg_file to also write Requires-Python/Requires-ExternalN)rr*Zwrite_pkg_filerZDistributionMetadatarrrr	r)jsr)cCs*t||�}t|�jd|�t|||�dS)z�
    Patch func_name in target_mod with replacement

    Important - original must be resolved by name to avoid
    patching an already patched function.
    �	unpatchedN)�getattr�vars�
setdefault�setattr)ZreplacementZ
target_mod�	func_name�originalrrr	�
patch_funcqs
r8cCs
t|d�S)Nr1)r2)�	candidaterrr	r�srcs�td��tj�dkrdS�fdd�}tj|d�}tj|d�}yt|d��t|d	��Wntk
rlYnXyt|d
��Wntk
r�YnXyt|d��Wntk
r�YnXdS)z\
    Patch functions in distutils to use standalone Microsoft Visual C++
    compilers.
    zsetuptools.msvcZWindowsNcsLd|krdnd}||jd�}t�|�}t|�}t||�sBt|��|||fS)zT
        Prepare the parameters for patch_func to patch indicated function.
        �msvc9Zmsvc9_Zmsvc14_�_)�lstripr2r�hasattr�ImportError)Zmod_namer6Zrepl_prefixZ	repl_name�repl�mod)�msvcrr	�patch_params�s

z9patch_for_msvc_specialized_compiler.<locals>.patch_paramszdistutils.msvc9compilerzdistutils._msvccompilerZfind_vcvarsallZquery_vcvarsallZ_get_vc_envZgen_lib_options)rr�system�	functools�partialr8r>)rBr:Zmsvc14r)rAr	r.�s&
r.)�__doc__r%Zdistutils.filelistrrrrD�	importlibrrZsetuptools.externrr�__all__r
rrr0r)r8rr.rrrr	�<module>s$	)site-packages/setuptools/__pycache__/glibc.cpython-36.opt-1.pyc000064400000002704147511334640020375 0ustar003

��fJ�@sHddlmZddlZddlZddlZdd�Zdd�Zdd�Zd	d
�ZdS)�)�absolute_importNcCsPtjd�}y
|j}Wntk
r(dSXtj|_|�}t|t�sL|jd�}|S)z9Returns glibc version string, or None if not using glibc.N�ascii)	�ctypesZCDLL�gnu_get_libc_version�AttributeErrorZc_char_pZrestype�
isinstance�str�decode)Zprocess_namespacer�version_str�r�/usr/lib/python3.6/glibc.py�glibc_version_string
s



r
cCsHtjd|�}|s$tjd|t�dSt|jd��|koFt|jd��|kS)Nz$(?P<major>[0-9]+)\.(?P<minor>[0-9]+)z=Expected glibc version with 2 components major.minor, got: %sF�major�minor)�re�match�warnings�warn�RuntimeWarning�int�group)r
�required_major�
minimum_minor�mrrr�check_glibc_version$s
rcCst�}|dkrdSt|||�S)NF)r
r)rrr
rrr�have_compatible_glibc4srcCst�}|dkrdSd|fSdS)z�Try to determine the glibc version

    Returns a tuple of strings (lib, version) which default to empty strings
    in case the lookup fails.
    N�Zglibc)rr)r
)Z
glibc_versionrrr�libc_verLsr)	Z
__future__rrrrr
rrrrrrr�<module>ssite-packages/setuptools/__pycache__/sandbox.cpython-36.opt-1.pyc000064400000036446147511334640020765 0ustar003

��f�7�@s�ddlZddlZddlZddlZddlZddlZddlZddlZddlZddl	Z	ddl
mZddlm
Z
mZddlZejjd�r�ddljjjjZnejejZyeZWnek
r�dZYnXeZddlm Z ddlm!Z!ddd	d
gZ"d-dd�Z#ej$d.d
d��Z%ej$dd��Z&ej$dd��Z'ej$dd��Z(Gdd�de)�Z*Gdd�d�Z+ej$dd��Z,dd�Z-ej$dd��Z.ej$dd ��Z/d!d"�Z0d#d$�Z1d%d
�Z2Gd&d�d�Z3e4ed'��r�ej5gZ6ngZ6Gd(d�de3�Z7ej8ej9d)d*�d+j:�D��Z;Gd,d	�d	e �Z<dS)/�N)�six)�builtins�map�java)�DistutilsError)�working_set�AbstractSandbox�DirectorySandbox�SandboxViolation�	run_setupcCsJd}t||��}|j�}WdQRX|dkr.|}t||d�}t|||�dS)z.
    Python 3 implementation of execfile.
    �rbN�exec)�open�read�compiler
)�filename�globals�locals�mode�streamZscript�code�r�/usr/lib/python3.6/sandbox.py�	_execfile#src
csDtjdd�}|dk	r$|tjdd�<z
|VWd|tjdd�<XdS)N)�sys�argv)�repl�savedrrr�	save_argv0s
rc
cs.tjdd�}z
|VWd|tjdd�<XdS)N)r�path)rrrr�	save_path;s
r ccs4tjj|dd�tj}|t_z
dVWd|t_XdS)zL
    Monkey-patch tempfile.tempdir with replacement, ensuring it exists
    T)�exist_okN)�
pkg_resourcesZ
py31compat�makedirs�tempfileZtempdir)Zreplacementrrrr�
override_tempDs
r%ccs.tj�}tj|�z
|VWdtj|�XdS)N)�os�getcwd�chdir)�targetrrrr�pushdUs


r*c@seZdZdZedd��ZdS)�UnpickleableExceptionzP
    An exception representing another Exception that could not be pickled.
    cCsJytj|�tj|�fStk
rDddlm}|j||t|���SXdS)z�
        Always return a dumped (pickled) type and exc. If exc can't be pickled,
        wrap it in UnpickleableException first.
        r)r+N)�pickle�dumps�	Exception�setuptools.sandboxr+�dump�repr)�type�exc�clsrrrr0ds
zUnpickleableException.dumpN)�__name__�
__module__�__qualname__�__doc__�staticmethodr0rrrrr+_sr+c@s(eZdZdZdd�Zdd�Zdd�ZdS)	�ExceptionSaverz^
    A Context Manager that will save an exception, serialized, and restore it
    later.
    cCs|S)Nr)�selfrrr�	__enter__xszExceptionSaver.__enter__cCs |sdStj||�|_||_dS)NT)r+r0�_saved�_tb)r;r2r3�tbrrr�__exit__{s
zExceptionSaver.__exit__cCs6dt|�krdSttj|j�\}}tj|||j�dS)z"restore and re-raise any exceptionr=N)�varsrr,�loadsr=rZreraiser>)r;r2r3rrr�resume�szExceptionSaver.resumeN)r5r6r7r8r<r@rCrrrrr:rsr:c
#sVtjj��t��}�VWdQRXtjj���fdd�tjD�}t|�|j�dS)z�
    Context in which imported modules are saved.

    Translates exceptions internal to the context into the equivalent exception
    outside the context.
    Nc3s&|]}|�kr|jd�r|VqdS)z
encodings.N)�
startswith)�.0�mod_name)rrr�	<genexpr>�szsave_modules.<locals>.<genexpr>)r�modules�copyr:�update�_clear_modulesrC)�	saved_excZdel_modulesr)rr�save_modules�s
rMcCsxt|�D]}tj|=q
WdS)N)�listrrH)Zmodule_namesrFrrrrK�srKccs$tj�}z
|VWdtj|�XdS)N)r"�__getstate__�__setstate__)rrrr�save_pkg_resources_state�s
rQc,cs�tjj|d�}t��xt��ft�t��Nt��<t|��(t	|��t
d�dVWdQRXWdQRXWdQRXWdQRXWdQRXWdQRXdS)NZtempZ
setuptools)r&r�joinrQrM�hide_setuptoolsr rr%r*�
__import__)�	setup_dirZtemp_dirrrr�
setup_context�s

rVcCstjd�}t|j|��S)aH
    >>> _needs_hiding('setuptools')
    True
    >>> _needs_hiding('pkg_resources')
    True
    >>> _needs_hiding('setuptools_plugin')
    False
    >>> _needs_hiding('setuptools.__init__')
    True
    >>> _needs_hiding('distutils')
    True
    >>> _needs_hiding('os')
    False
    >>> _needs_hiding('Cython')
    True
    z1(setuptools|pkg_resources|distutils|Cython)(\.|$))�rer�bool�match)rF�patternrrr�
_needs_hiding�s
r[cCstttj�}t|�dS)a%
    Remove references to setuptools' modules from sys.modules to allow the
    invocation to import the most appropriate setuptools. This technique is
    necessary to avoid issues such as #315 where setuptools upgrading itself
    would fail to find a function declared in the metadata.
    N)�filterr[rrHrK)rHrrrrS�srScCs�tjjtjj|��}t|���y�|gt|�tjdd�<tjjd|�t	j
�t	jjdd��t
|t�rl|n|jtj��}t|��t|dd�}t||�WdQRXWn4tk
r�}z|jr�|jdrʂWYdd}~XnXWdQRXdS)z8Run a distutils setup script, sandboxed in its directoryNrcSs|j�S)N)Zactivate)Zdistrrr�<lambda>�szrun_setup.<locals>.<lambda>�__main__)�__file__r5)r&r�abspath�dirnamerVrNrr�insertr�__init__Z	callbacks�append�
isinstance�str�encode�getfilesystemencodingr	�dictr�
SystemExit�args)Zsetup_scriptrkrUZdunder_file�ns�vrrrr�s

c@s2eZdZdZdZdd�Zdd�Zdd�Zd	d
�Zdd�Z	d
d�Z
x$d9D]Zee
e�rFe
e�e�e<qFWd:dd�Zer~ede�Zede�Zx$d;D]Zee
e�r�ee�e�e<q�Wd)d*�Zx$d<D]Zee
e�r�ee�e�e<q�Wd-d.�Zx(d=D] Zee
e��r�ee�e�e<�q�Wd1d2�Zd3d4�Zd5d6�Zd7d8�ZdS)>rzDWrap 'os' module and 'open()' builtin for virtualizing setup scriptsFcs�fdd�tt�D��_dS)Ncs&g|]}|jd�rt�|�r|�qS)�_)rD�hasattr)rE�name)r;rr�
<listcomp>sz,AbstractSandbox.__init__.<locals>.<listcomp>)�dir�_os�_attrs)r;r)r;rrcszAbstractSandbox.__init__cCs&x |jD]}tt|t||��qWdS)N)rt�setattrr&�getattr)r;�sourcerprrr�_copyszAbstractSandbox._copycCs(|j|�tr|jt_|jt_d|_dS)NT)rx�_filer�file�_openr�_active)r;rrrr<s

zAbstractSandbox.__enter__cCs$d|_trtt_tt_|jt�dS)NF)r|ryrrzr{rrxrs)r;�exc_type�	exc_value�	tracebackrrrr@s
zAbstractSandbox.__exit__c	Cs|�|�SQRXdS)zRun 'func' under os sandboxingNr)r;�funcrrr�runszAbstractSandbox.runcstt�����fdd�}|S)Ncs2|jr |j�||f|�|�\}}�||f|�|�S)N)r|�_remap_pair)r;�src�dstrk�kw)rp�originalrr�wrap&sz3AbstractSandbox._mk_dual_path_wrapper.<locals>.wrap)rvrs)rpr�r)rpr�r�_mk_dual_path_wrapper#s
z%AbstractSandbox._mk_dual_path_wrapper�rename�link�symlinkNcs �ptt�����fdd�}|S)Ncs*|jr|j�|f|�|�}�|f|�|�S)N)r|�_remap_input)r;rrkr�)rpr�rrr�4sz5AbstractSandbox._mk_single_path_wrapper.<locals>.wrap)rvrs)rpr�r�r)rpr�r�_mk_single_path_wrapper1sz'AbstractSandbox._mk_single_path_wrapperrzr�stat�listdirr(�chmod�chown�mkdir�remove�unlink�rmdir�utime�lchown�chroot�lstat�	startfile�mkfifo�mknod�pathconf�accesscstt�����fdd�}|S)NcsB|jr2|j�|f|�|�}|j��|f|�|��S�|f|�|�S)N)r|r��
_remap_output)r;rrkr�)rpr�rrr�Isz4AbstractSandbox._mk_single_with_return.<locals>.wrap)rvrs)rpr�r)rpr�r�_mk_single_with_returnFs
z&AbstractSandbox._mk_single_with_return�readlink�tempnamcstt�����fdd�}|S)Ncs �||�}|jr|j�|�S|S)N)r|r�)r;rkr�Zretval)rpr�rrr�Xs
z'AbstractSandbox._mk_query.<locals>.wrap)rvrs)rpr�r)rpr�r�	_mk_queryUs
zAbstractSandbox._mk_queryr'�tmpnamcCs|S)z=Called to remap or validate any path, whether input or outputr)r;rrrr�_validate_pathdszAbstractSandbox._validate_pathcOs
|j|�S)zCalled for path inputs)r�)r;�	operationrrkr�rrrr�hszAbstractSandbox._remap_inputcCs
|j|�S)zCalled for path outputs)r�)r;r�rrrrr�lszAbstractSandbox._remap_outputcOs0|j|d|f|�|�|j|d|f|�|�fS)z?Called for path pairs like rename, link, and symlink operationsz-fromz-to)r�)r;r�r�r�rkr�rrrr�pszAbstractSandbox._remap_pair)r�r�r�)N)r�r�r(rr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�)r�r�)r'r�)r5r6r7r8r|rcrxr<r@r�r�rprorsrr�ryr{r�r�r�r�r�r�rrrrrsB










�devnullc@s�eZdZdZejdddddddd	d
ddd
dg
�ZdgZefdd�Z	dd�Z
erXd'dd�Zd(dd�Zdd�Z
dd�Zdd�Zdd �Zd!d"�Zd)d$d%�Zd&S)*r	z<Restrict operations to a single subdirectory - pseudo-chrootrr�r�r�r�r�r�r�r�r�r�r�r�z.*lib2to3.*\.pickle$cCsFtjjtjj|��|_tjj|jd�|_dd�|D�|_tj	|�dS)N�cSs g|]}tjjtjj|���qSr)r&r�normcase�realpath)rErrrrrq�sz-DirectorySandbox.__init__.<locals>.<listcomp>)
r&rr�r��_sandboxrR�_prefix�_exceptionsrrc)r;Zsandbox�
exceptionsrrrrc�s

zDirectorySandbox.__init__cOsddlm}||||��dS)Nr)r
)r/r
)r;r�rkr�r
rrr�
_violation�szDirectorySandbox._violation�rcOs<|dkr*|j|�r*|jd||f|�|�t||f|�|�S)Nr��rtr�rU�Urz)r�r�rr�r�)�_okr�ry)r;rrrkr�rrrry�szDirectorySandbox._filecOs<|dkr*|j|�r*|jd||f|�|�t||f|�|�S)Nr�r�rr�r�r)r�r�rr�r�)r�r�r{)r;rrrkr�rrrr{�szDirectorySandbox._opencCs|jd�dS)Nr�)r�)r;rrrr��szDirectorySandbox.tmpnamcCsN|j}z:d|_tjjtjj|��}|j|�p@||jkp@|j|j�S||_XdS)NF)	r|r&rr�r��	_exemptedr�rDr�)r;rZactiver�rrrr��s

zDirectorySandbox._okcs<�fdd�|jD�}�fdd�|jD�}tj||�}t|�S)Nc3s|]}�j|�VqdS)N)rD)rEZ	exception)�filepathrrrG�sz-DirectorySandbox._exempted.<locals>.<genexpr>c3s|]}tj|��VqdS)N)rWrY)rErZ)r�rrrG�s)r��_exception_patterns�	itertools�chain�any)r;r�Z
start_matchesZpattern_matchesZ
candidatesr)r�rr��s



zDirectorySandbox._exemptedcOs6||jkr2|j|�r2|j|tjj|�f|�|�|S)zCalled for path inputs)�	write_opsr�r�r&rr�)r;r�rrkr�rrrr��szDirectorySandbox._remap_inputcOs6|j|�s|j|�r.|j|||f|�|�||fS)z?Called for path pairs like rename, link, and symlink operations)r�r�)r;r�r�r�rkr�rrrr��szDirectorySandbox._remap_pair�cOsB|t@r,|j|�r,|jd|||f|�|�tj|||f|�|�S)zCalled for low-level os.open()zos.open)�WRITE_FLAGSr�r�rsr)r;rz�flagsrrkr�rrrr�szDirectorySandbox.openN)r�)r�)r�)r5r6r7r8ri�fromkeysr�r��_EXCEPTIONSrcr�ryr{r�r�r�r�r�rrrrrr	~s 	


cCsg|]}tt|d��qS)r)rvrs)rE�arrrrq�srqz4O_WRONLY O_RDWR O_APPEND O_CREAT O_TRUNC O_TEMPORARYc@s&eZdZdZejd�j�Zdd�ZdS)r
zEA setup script attempted to modify the filesystem outside the sandboxa
        SandboxViolation: {cmd}{args!r} {kwargs}

        The package setup script has attempted to modify files on your system
        that are not within the EasyInstall build area, and has been aborted.

        This package cannot be safely installed by EasyInstall, and may not
        support alternate installation locations even if you run its setup
        script by hand.  Please inform the package's author and the EasyInstall
        maintainers to find out if a fix or workaround is available.
        cCs|j\}}}|jjft��S)N)rk�tmpl�formatr)r;�cmdrk�kwargsrrr�__str__�szSandboxViolation.__str__N)	r5r6r7r8�textwrap�dedent�lstripr�r�rrrrr
�s

)N)N)=r&rr$�operator�	functoolsr�rW�
contextlibr,r�Zsetuptools.externrZsetuptools.extern.six.movesrrZpkg_resources.py31compatr"�platformrDZ$org.python.modules.posix.PosixModule�pythonrH�posixZPosixModulersrprzry�	NameErrorrr{Zdistutils.errorsrr�__all__r�contextmanagerrr r%r*r.r+r:rMrKrQrVr[rSrrror�r�r	�reduce�or_�splitr�r
rrrr�<module>s^



	
	w
V
site-packages/setuptools/__pycache__/glibc.cpython-36.pyc000064400000002704147511334640017436 0ustar003

��fJ�@sHddlmZddlZddlZddlZdd�Zdd�Zdd�Zd	d
�ZdS)�)�absolute_importNcCsPtjd�}y
|j}Wntk
r(dSXtj|_|�}t|t�sL|jd�}|S)z9Returns glibc version string, or None if not using glibc.N�ascii)	�ctypesZCDLL�gnu_get_libc_version�AttributeErrorZc_char_pZrestype�
isinstance�str�decode)Zprocess_namespacer�version_str�r�/usr/lib/python3.6/glibc.py�glibc_version_string
s



r
cCsHtjd|�}|s$tjd|t�dSt|jd��|koFt|jd��|kS)Nz$(?P<major>[0-9]+)\.(?P<minor>[0-9]+)z=Expected glibc version with 2 components major.minor, got: %sF�major�minor)�re�match�warnings�warn�RuntimeWarning�int�group)r
�required_major�
minimum_minor�mrrr�check_glibc_version$s
rcCst�}|dkrdSt|||�S)NF)r
r)rrr
rrr�have_compatible_glibc4srcCst�}|dkrdSd|fSdS)z�Try to determine the glibc version

    Returns a tuple of strings (lib, version) which default to empty strings
    in case the lookup fails.
    N�Zglibc)rr)r
)Z
glibc_versionrrr�libc_verLsr)	Z
__future__rrrrr
rrrrrrr�<module>ssite-packages/setuptools/__pycache__/depends.cpython-36.pyc000064400000012134147511334640017776 0ustar003

��f��@s�ddlZddlZddlZddlmZddlmZmZmZmZddl	m
Z
dddd	gZGd
d�d�Zddd�Z
ddd�Zdd
d	�Zdd�Ze�dS)�N)�
StrictVersion)�
PKG_DIRECTORY�PY_COMPILED�	PY_SOURCE�	PY_FROZEN�)�Bytecode�Require�find_module�get_module_constant�extract_constantc@sHeZdZdZddd�Zdd�Zdd	�Zddd�Zdd
d�Zddd�Z	dS)r	z7A prerequisite to building or installing a distribution�NcCsF|dkr|dk	rt}|dk	r0||�}|dkr0d}|jjt��|`dS)N�__version__)r�__dict__�update�locals�self)r�name�requested_version�moduleZhomepage�	attribute�format�r�/usr/lib/python3.6/depends.py�__init__szRequire.__init__cCs |jdk	rd|j|jfS|jS)z0Return full package/distribution name, w/versionNz%s-%s)rr)rrrr�	full_name s
zRequire.full_namecCs*|jdkp(|jdkp(t|�dko(||jkS)z%Is 'version' sufficiently up-to-date?N�unknown)rr�strr)r�versionrrr�
version_ok&szRequire.version_okrc
Cs||jdkrBy"t|j|�\}}}|r*|j�|Stk
r@dSXt|j|j||�}|dk	rx||k	rx|jdk	rx|j|�S|S)a�Get version number of installed module, 'None', or 'default'

        Search 'paths' for module.  If not found, return 'None'.  If found,
        return the extracted version attribute, or 'default' if no version
        attribute was specified, or the value cannot be determined without
        importing the module.  The version is formatted according to the
        requirement's version format (if any), unless it is 'None' or the
        supplied 'default'.
        N)rr
r�close�ImportErrorrr)r�paths�default�f�p�i�vrrr�get_version+s

zRequire.get_versioncCs|j|�dk	S)z/Return true if dependency is present on 'paths'N)r()rr"rrr�
is_presentFszRequire.is_presentcCs |j|�}|dkrdS|j|�S)z>Return true if dependency is present and up-to-date on 'paths'NF)r(r)rr"rrrr�
is_currentJs
zRequire.is_current)r
NN)Nr)N)N)
�__name__�
__module__�__qualname__�__doc__rrrr(r)r*rrrrr	s



c
Csl|jd�}x\|rf|jd�}tj||�\}}\}}}}	|tkrP|pFdg}|g}q|rtd||f��qW|	S)z7Just like 'imp.find_module()', but with package support�.rrzCan't find %r in %s)�split�pop�impr
rr!)
rr"�parts�partr$�path�suffix�mode�kind�inforrrr
Rs


c
Cs�yt||�\}}\}}}Wntk
r.dSXz�|tkrP|jd�tj|�}	n`|tkrdtj|�}	nL|t	kr~t
|j�|d�}	n2|tjkr�tj
||||||f�ttj||d�SWd|r�|j�Xt|	||�S)z�Find 'module' by searching 'paths', and extract 'symbol'

    Return 'None' if 'module' does not exist on 'paths', or it does not define
    'symbol'.  If the module defines 'symbol' as a constant, return the
    constant.  Otherwise, return 'default'.N��exec)r
r!r�read�marshal�loadrr2�get_frozen_objectr�compile�sys�modules�load_module�getattrr r)
r�symbolr#r"r$r5r6r7r8�coderrrres$


cCs�||jkrdSt|j�j|�}d}d}d}|}xPt|�D]D}|j}	|j}
|	|kr\|j|
}q8|
|krx|	|kst|	|krx|S|}q8WdS)aExtract the constant value of 'symbol' from 'code'

    If the name 'symbol' is bound to a constant value by the Python code
    object 'code', return that value.  If 'symbol' is bound to an expression,
    return 'default'.  Otherwise, return 'None'.

    Return value is based on the first assignment to 'symbol'.  'symbol' must
    be a global, or at least a non-"fast" local in the code block.  That is,
    only 'STORE_NAME' and 'STORE_GLOBAL' opcodes are checked, and 'symbol'
    must be present in 'code.co_names'.
    N�Z�a�d)�co_names�list�indexrZopcode�arg�	co_consts)rFrEr#Zname_idxZ
STORE_NAMEZSTORE_GLOBALZ
LOAD_CONST�constZ	byte_code�oprMrrrr�s
cCsDtjjd�rtjdkrdSd}x|D]}t�|=tj|�q&WdS)z�
    Patch the globals to remove the objects not available on some platforms.

    XXX it'd be better to test assertions about bytecode instead.
    �javaZcliNrr)rr)rA�platform�
startswith�globals�__all__�remove)Zincompatiblerrrr�_update_globals�s
rW)N���)rXNrX)rX)rAr2r=Zdistutils.versionrrrrrZ
py33compatrrUr	r
rrrWrrrr�<module>sC

"
$site-packages/setuptools/__pycache__/py33compat.cpython-36.opt-1.pyc000064400000002466147511334640021324 0ustar003

��f��@s�ddlZddlZddlZyddlZWnek
r<dZYnXddlmZddlmZej	dd�Z
Gdd�de�Ze
ede�Ze
ed	ej�j�ZdS)
�N)�six)�html_parser�OpArgz
opcode argc@seZdZdd�Zdd�ZdS)�Bytecode_compatcCs
||_dS)N)�code)�selfr�r� /usr/lib/python3.6/py33compat.py�__init__szBytecode_compat.__init__ccs�tjd|jj�}t|jj�}d}d}x�||kr�||}|tjkr�||d||dd|}|d7}|tjkr�tjd	}||d�}q&nd}|d7}t	||�Vq&WdS)
z>Yield '(op,arg)' pair for each operation in code object 'code'�br����iN���)
�arrayr�co_code�len�disZ
HAVE_ARGUMENTZEXTENDED_ARGrZ
integer_typesr)r�bytes�eofZptrZextended_arg�op�argZ	long_typerrr	�__iter__s 

 

zBytecode_compat.__iter__N)�__name__�
__module__�__qualname__r
rrrrr	rsr�Bytecode�unescape)rr�collectionsZhtml�ImportErrorZsetuptools.externrZsetuptools.extern.six.movesr�
namedtupler�objectr�getattrrZ
HTMLParserrrrrr	�<module>s
"site-packages/setuptools/__pycache__/msvc.cpython-36.opt-1.pyc000064400000103247147511334640020271 0ustar003

��f���@sdZddlZddlZddlZddlZddlZddlmZddl	m
Z
ddlmZej
�dkrpddl	mZejZnGd	d
�d
�Ze�ZeejjfZyddlmZWnek
r�YnXdd
�Zd dd�Zdd�Zdd�Zd!dd�ZGdd�d�ZGdd�d�ZGdd�d�ZGdd�d�Z dS)"a@
Improved support for Microsoft Visual C++ compilers.

Known supported compilers:
--------------------------
Microsoft Visual C++ 9.0:
    Microsoft Visual C++ Compiler for Python 2.7 (x86, amd64)
    Microsoft Windows SDK 6.1 (x86, x64, ia64)
    Microsoft Windows SDK 7.0 (x86, x64, ia64)

Microsoft Visual C++ 10.0:
    Microsoft Windows SDK 7.1 (x86, x64, ia64)

Microsoft Visual C++ 14.0:
    Microsoft Visual C++ Build Tools 2015 (x86, x64, arm)
    Microsoft Visual Studio 2017 (x86, x64, arm, arm64)
    Microsoft Visual Studio Build Tools 2017 (x86, x64, arm, arm64)
�N)�
LegacyVersion)�filterfalse�)�
get_unpatched�Windows)�winregc@seZdZdZdZdZdZdS)rN)�__name__�
__module__�__qualname__�
HKEY_USERS�HKEY_CURRENT_USER�HKEY_LOCAL_MACHINE�HKEY_CLASSES_ROOT�rr�/usr/lib/python3.6/msvc.pyr(sr)�RegcCs�d}|d|f}ytj|d�}WnJtk
rjy|d|f}tj|d�}Wntk
rdd}YnXYnX|r�tjjjj|d�}tjj|�r�|Stt�|�S)a+
    Patched "distutils.msvc9compiler.find_vcvarsall" to use the standalone
    compiler build for Python (VCForPython). Fall back to original behavior
    when the standalone compiler is not available.

    Redirect the path of "vcvarsall.bat".

    Known supported compilers
    -------------------------
    Microsoft Visual C++ 9.0:
        Microsoft Visual C++ Compiler for Python 2.7 (x86, amd64)

    Parameters
    ----------
    version: float
        Required Microsoft Visual C++ version.

    Return
    ------
    vcvarsall.bat path: str
    z-Software\%sMicrosoft\DevDiv\VCForPython\%0.1f��
installdirzWow6432Node\Nz
vcvarsall.bat)	rZ	get_value�KeyError�os�path�join�isfiler�msvc9_find_vcvarsall)�versionZVC_BASE�key�
productdir�	vcvarsallrrrr?sr�x86cOs�ytt�}|||f|�|�Stjjk
r2Yntk
rDYnXyt||�j�Stjjk
r�}zt|||��WYdd}~XnXdS)a�
    Patched "distutils.msvc9compiler.query_vcvarsall" for support extra
    compilers.

    Set environment without use of "vcvarsall.bat".

    Known supported compilers
    -------------------------
    Microsoft Visual C++ 9.0:
        Microsoft Visual C++ Compiler for Python 2.7 (x86, amd64)
        Microsoft Windows SDK 6.1 (x86, x64, ia64)
        Microsoft Windows SDK 7.0 (x86, x64, ia64)

    Microsoft Visual C++ 10.0:
        Microsoft Windows SDK 7.1 (x86, x64, ia64)

    Parameters
    ----------
    ver: float
        Required Microsoft Visual C++ version.
    arch: str
        Target architecture.

    Return
    ------
    environment: dict
    N)	r�msvc9_query_vcvarsall�	distutils�errors�DistutilsPlatformError�
ValueError�EnvironmentInfo�
return_env�_augment_exception)�ver�arch�args�kwargsZorig�excrrrrjsrcCsnytt�|�Stjjk
r$YnXyt|dd�j�Stjjk
rh}zt|d��WYdd}~XnXdS)a'
    Patched "distutils._msvccompiler._get_vc_env" for support extra
    compilers.

    Set environment without use of "vcvarsall.bat".

    Known supported compilers
    -------------------------
    Microsoft Visual C++ 14.0:
        Microsoft Visual C++ Build Tools 2015 (x86, x64, arm)
        Microsoft Visual Studio 2017 (x86, x64, arm, arm64)
        Microsoft Visual Studio Build Tools 2017 (x86, x64, arm, arm64)

    Parameters
    ----------
    plat_spec: str
        Target architecture.

    Return
    ------
    environment: dict
    g,@)�
vc_min_verN)r�msvc14_get_vc_envr r!r"r$r%r&)Z	plat_specr+rrrr-�s
r-cOsBdtjkr4ddl}t|j�td�kr4|jjj||�Stt	�||�S)z�
    Patched "distutils._msvccompiler.gen_lib_options" for fix
    compatibility between "numpy.distutils" and "distutils._msvccompiler"
    (for Numpy < 1.11.2)
    znumpy.distutilsrNz1.11.2)
�sys�modulesZnumpyr�__version__r Z	ccompilerZgen_lib_optionsr�msvc14_gen_lib_options)r)r*Znprrrr1�s

r1rcCs�|jd}d|j�ks"d|j�kr�d}|jft��}d}|dkrr|j�jd�dkrh|d	7}||d
7}q�|d7}n.|dkr�|d
7}||d7}n|dkr�|d7}|f|_dS)zl
    Add details to the exception message to help guide the user
    as to what action will resolve it.
    rrzvisual cz0Microsoft Visual C++ {version:0.1f} is required.z-www.microsoft.com/download/details.aspx?id=%dg"@Zia64rz* Get it with "Microsoft Windows SDK 7.0": iBz% Get it from http://aka.ms/vcpython27g$@z* Get it with "Microsoft Windows SDK 7.1": iW g,@zj Get it with "Microsoft Visual C++ Build Tools": http://landinghub.visualstudio.com/visual-cpp-build-toolsN���)r)�lower�format�locals�find)r+rr(�messageZtmplZ
msdownloadrrrr&�s 

r&c@sbeZdZdZejdd�j�Zdd�Ze	dd��Z
dd	�Zd
d�Zdd
d�Z
ddd�Zddd�ZdS)�PlatformInfoz�
    Current and Target Architectures informations.

    Parameters
    ----------
    arch: str
        Target architecture.
    Zprocessor_architecturercCs|j�jdd�|_dS)N�x64�amd64)r3�replacer()�selfr(rrr�__init__�szPlatformInfo.__init__cCs|j|jjd�dd�S)N�_r)r(r6)r<rrr�
target_cpu�szPlatformInfo.target_cpucCs
|jdkS)Nr)r?)r<rrr�
target_is_x86szPlatformInfo.target_is_x86cCs
|jdkS)Nr)�current_cpu)r<rrr�current_is_x86szPlatformInfo.current_is_x86FcCs.|jdkr|rdS|jdkr$|r$dSd|jS)uk
        Current platform specific subfolder.

        Parameters
        ----------
        hidex86: bool
            return '' and not '†' if architecture is x86.
        x64: bool
            return 'd' and not 'md64' if architecture is amd64.

        Return
        ------
        subfolder: str
            '	arget', or '' (see hidex86 parameter)
        rrr:z\x64z\%s)rA)r<�hidex86r9rrr�current_dir	szPlatformInfo.current_dircCs.|jdkr|rdS|jdkr$|r$dSd|jS)ar
        Target platform specific subfolder.

        Parameters
        ----------
        hidex86: bool
            return '' and not '\x86' if architecture is x86.
        x64: bool
            return '\x64' and not '\amd64' if architecture is amd64.

        Return
        ------
        subfolder: str
            '\current', or '' (see hidex86 parameter)
        rrr:z\x64z\%s)r?)r<rCr9rrr�
target_dirszPlatformInfo.target_dircCs0|rdn|j}|j|krdS|j�jdd|�S)ao
        Cross platform specific subfolder.

        Parameters
        ----------
        forcex86: bool
            Use 'x86' as current architecture even if current acritecture is
            not x86.

        Return
        ------
        subfolder: str
            '' if target architecture is current architecture,
            '\current_target' if not.
        rr�\z\%s_)rAr?rEr;)r<�forcex86Zcurrentrrr�	cross_dir5szPlatformInfo.cross_dirN)FF)FF)F)rr	r
�__doc__�safe_env�getr3rAr=�propertyr?r@rBrDrErHrrrrr8�s

r8c@s�eZdZdZejejejejfZ	dd�Z
edd��Zedd��Z
edd	��Zed
d��Zedd
��Zedd��Zedd��Zedd��Zedd��Zddd�Zdd�ZdS)�RegistryInfoz�
    Microsoft Visual Studio related registry informations.

    Parameters
    ----------
    platform_info: PlatformInfo
        "PlatformInfo" instance.
    cCs
||_dS)N)�pi)r<Z
platform_inforrrr=ZszRegistryInfo.__init__cCsdS)z<
        Microsoft Visual Studio root registry key.
        ZVisualStudior)r<rrr�visualstudio]szRegistryInfo.visualstudiocCstjj|jd�S)z;
        Microsoft Visual Studio SxS registry key.
        ZSxS)rrrrO)r<rrr�sxsdszRegistryInfo.sxscCstjj|jd�S)z8
        Microsoft Visual C++ VC7 registry key.
        ZVC7)rrrrP)r<rrr�vckszRegistryInfo.vccCstjj|jd�S)z;
        Microsoft Visual Studio VS7 registry key.
        ZVS7)rrrrP)r<rrr�vsrszRegistryInfo.vscCsdS)z?
        Microsoft Visual C++ for Python registry key.
        zDevDiv\VCForPythonr)r<rrr�
vc_for_pythonyszRegistryInfo.vc_for_pythoncCsdS)z-
        Microsoft SDK registry key.
        zMicrosoft SDKsr)r<rrr�
microsoft_sdk�szRegistryInfo.microsoft_sdkcCstjj|jd�S)z>
        Microsoft Windows/Platform SDK registry key.
        r)rrrrT)r<rrr�windows_sdk�szRegistryInfo.windows_sdkcCstjj|jd�S)z<
        Microsoft .NET Framework SDK registry key.
        ZNETFXSDK)rrrrT)r<rrr�	netfx_sdk�szRegistryInfo.netfx_sdkcCsdS)z<
        Microsoft Windows Kits Roots registry key.
        zWindows Kits\Installed Rootsr)r<rrr�windows_kits_roots�szRegistryInfo.windows_kits_rootsFcCs(|jj�s|rdnd}tjjd|d|�S)a

        Return key in Microsoft software registry.

        Parameters
        ----------
        key: str
            Registry key path where look.
        x86: str
            Force x86 software registry.

        Return
        ------
        str: value
        rZWow6432NodeZSoftwareZ	Microsoft)rNrBrrr)r<rrZnode64rrr�	microsoft�szRegistryInfo.microsoftcCs�tj}tj}|j}x�|jD]�}y||||�d|�}WnZttfk
r�|jj�s�y||||d�d|�}Wq�ttfk
r�wYq�XnwYnXytj	||�dSttfk
r�YqXqWdS)a
        Look for values in registry in Microsoft software registry.

        Parameters
        ----------
        key: str
            Registry key path where look.
        name: str
            Value name to find.

        Return
        ------
        str: value
        rTN)
r�KEY_READ�OpenKeyrX�HKEYS�OSError�IOErrorrNrBZQueryValueEx)r<r�namerYZopenkey�ms�hkey�bkeyrrr�lookup�s"

zRegistryInfo.lookupN)F)rr	r
rIrrrr
rr[r=rLrOrPrQrRrSrTrUrVrWrXrbrrrrrMLs"
rMc@s$eZdZdZejdd�Zejdd�Zejde�Zd3dd�Z	d	d
�Z
dd�Zed
d��Z
edd��Zdd�Zdd�Zedd��Zedd��Zedd��Zedd��Zedd��Zedd ��Zed!d"��Zed#d$��Zed%d&��Zed'd(��Zed)d*��Zed+d,��Zed-d.��Zd/d0�Zd4d1d2�ZdS)5�
SystemInfoz�
    Microsoft Windows and Visual Studio related system inormations.

    Parameters
    ----------
    registry_info: RegistryInfo
        "RegistryInfo" instance.
    vc_ver: float
        Required Microsoft Visual C++ version.
    �WinDirr�ProgramFileszProgramFiles(x86)NcCs"||_|jj|_|p|j�|_dS)N)�rirN�_find_latest_available_vc_ver�vc_ver)r<Z
registry_inforhrrrr=�s
zSystemInfo.__init__cCs6y|j�dStk
r0d}tjj|��YnXdS)Nrz%No Microsoft Visual C++ version foundr2)�find_available_vc_vers�
IndexErrorr r!r")r<�errrrrrg�s
z(SystemInfo._find_latest_available_vc_vercCs6|jj}|jj|jj|jjf}g}�x|jjD]�}x�|D]�}ytj|||�dtj�}Wnt	t
fk
rpw8YnXtj|�\}}}	xPt|�D]D}
y*t
tj||
�d�}||kr�|j|�Wq�tk
r�Yq�Xq�WxPt|�D]D}
y(t
tj||
��}||k�r|j|�Wq�tk
�r Yq�Xq�Wq8Wq.Wt|�S)zC
        Find all available Microsoft Visual C++ versions.
        r)rfrXrQrSrRr[rrZrYr\r]ZQueryInfoKey�range�floatZ	EnumValue�appendr#ZEnumKey�sorted)r<r_ZvckeysZvc_versr`rraZsubkeys�valuesr>�ir'rrrri�s2


z!SystemInfo.find_available_vc_verscCs6d|j}tjj|j|�}|jj|jjd|j�p4|S)z4
        Microsoft Visual Studio directory.
        zMicrosoft Visual Studio %0.1fz%0.1f)rhrrr�ProgramFilesx86rfrbrR)r<r^�defaultrrr�VSInstallDir
s
zSystemInfo.VSInstallDircCs�|j|j�p|j�}tjj|jjd|j�}|jj	|d�}|rNtjj|d�n|}|jj	|jj
d|j�pl|}tjj|�s�d}tj
j|��|S)z1
        Microsoft Visual C++ directory.
        z%0.1frZVCz(Microsoft Visual C++ directory not found)rt�	_guess_vc�_guess_vc_legacyrrrrfrSrhrbrQ�isdirr r!r")r<�guess_vcZreg_pathZ	python_vcZ
default_vcr�msgrrr�VCInstallDirszSystemInfo.VCInstallDirc
Cs^|jdkrdSd}tjj|j|�}ytj|�d}tjj||�Stttfk
rXYnXdS)z*
        Locate Visual C for 2017
        g,@Nz
VC\Tools\MSVCrr2)	rhrrrrt�listdirr\r]rj)r<rsrxZvc_exact_verrrrru0s
zSystemInfo._guess_vccCsd|j}tjj|j|�S)z<
        Locate Visual C for versions prior to 2017
        z Microsoft Visual Studio %0.1f\VC)rhrrrrr)r<rsrrrrv@s
zSystemInfo._guess_vc_legacycCsJ|jdkrdS|jdkrdS|jdkr*dS|jdkr8dS|jdkrFdSdS)zN
        Microsoft Windows SDK versions for specified MSVC++ version.
        g"@�7.0�6.1�6.0ag$@�7.1�7.0ag&@�8.0�8.0ag(@�8.1�8.1ag,@�10.0N)r|r}r~)rr�)r�r�)r�r�)r�r�)rh)r<rrr�WindowsSdkVersionGs




zSystemInfo.WindowsSdkVersioncCs|jtjj|jd��S)z4
        Microsoft Windows SDK last version
        �lib)�_use_last_dir_namerrr�
WindowsSdkDir)r<rrr�WindowsSdkLastVersionWs
z SystemInfo.WindowsSdkLastVersioncCsTd}x8|jD].}tjj|jjd|�}|jj|d�}|rPqW|sRtjj|�r�tjj|jjd|j	�}|jj|d�}|r�tjj|d�}|s�tjj|�r�xH|jD]>}|d|j
d��}d	|}tjj|j|�}tjj|�r�|}q�W|s�tjj|��r:x:|jD]0}d
|}tjj|j|�}tjj|��r|}�qW|�sPtjj|jd�}|S)z2
        Microsoft Windows SDK directory.
        rzv%s�installationfolderz%0.1frZWinSDKN�.zMicrosoft SDKs\Windows Kits\%szMicrosoft SDKs\Windows\v%sZPlatformSDK)
r�rrrrfrUrbrwrSrh�rfindrerz)r<�sdkdirr'�locrZinstall_baseZintver�drrrr�_s6
zSystemInfo.WindowsSdkDirc	Cs�|jdkrd}d}n&d}|jdkr&dnd}|jjd|d�}d	||jd
d�f}g}|jdkr�x(|jD]}|tjj|jj	||�g7}qdWx,|j
D]"}|tjj|jjd
||�g7}q�Wx |D]}|jj|d�}|r�Pq�W|S)z=
        Microsoft Windows SDK executable directory.
        g&@�#r�(g(@TF)r9rCzWinSDK-NetFx%dTools%srF�-g,@zv%sAr�)
rhrNrDr;�NetFxSdkVersionrrrrfrVr�rUrb)	r<Znetfxverr(rCZfxZregpathsr'rZexecpathrrr�WindowsSDKExecutablePath�s$

"
z#SystemInfo.WindowsSDKExecutablePathcCs.d|j}tjj|jj|�}|jj|d�p,dS)z0
        Microsoft Visual F# directory.
        z%0.1f\Setup\F#rr)rhrrrrfrOrb)r<rrrr�FSharpInstallDir�s
zSystemInfo.FSharpInstallDircCsF|jdkrd}nf}x(|D] }|jj|jjd|�}|rPqW|pDdS)z8
        Microsoft Universal CRT SDK directory.
        g,@�10�81z
kitsroot%sr)r�r�)rhrfrbrW)r<Zversr'r�rrr�UniversalCRTSdkDir�s


zSystemInfo.UniversalCRTSdkDircCs|jtjj|jd��S)z@
        Microsoft Universal C Runtime SDK last version
        r�)r�rrrr�)r<rrr�UniversalCRTSdkLastVersion�s
z%SystemInfo.UniversalCRTSdkLastVersioncCs|jdkrdSfSdS)z8
        Microsoft .NET Framework SDK versions.
        g,@�4.6.1�4.6N)r�r�)rh)r<rrrr��s
zSystemInfo.NetFxSdkVersioncCs>x4|jD]*}tjj|jj|�}|jj|d�}|rPqW|p<dS)z9
        Microsoft .NET Framework SDK directory.
        Zkitsinstallationfolderr)r�rrrrfrVrb)r<r'r�r�rrr�NetFxSdkDir�szSystemInfo.NetFxSdkDircCs&tjj|jd�}|jj|jjd�p$|S)z;
        Microsoft .NET Framework 32bit directory.
        zMicrosoft.NET\FrameworkZframeworkdir32)rrrrdrfrbrQ)r<�guess_fwrrr�FrameworkDir32�szSystemInfo.FrameworkDir32cCs&tjj|jd�}|jj|jjd�p$|S)z;
        Microsoft .NET Framework 64bit directory.
        zMicrosoft.NET\Framework64Zframeworkdir64)rrrrdrfrbrQ)r<r�rrr�FrameworkDir64�szSystemInfo.FrameworkDir64cCs
|jd�S)z:
        Microsoft .NET Framework 32bit versions.
        � )�_find_dot_net_versions)r<rrr�FrameworkVersion32�szSystemInfo.FrameworkVersion32cCs
|jd�S)z:
        Microsoft .NET Framework 64bit versions.
        �@)r�)r<rrr�FrameworkVersion64�szSystemInfo.FrameworkVersion64cCs�|jj|jjd|�}t|d|�}|p6|j|d�p6d}|jdkrL|df}n:|jdkrx|j�dd	�d
krndn|df}n|jd
kr�d}|jdkr�d}|S)z�
        Find Microsoft .NET Framework versions.

        Parameters
        ----------
        bits: int
            Platform number of bits: 32 or 64.
        zframeworkver%dzFrameworkDir%d�vrg(@zv4.0g$@N�Zv4z
v4.0.30319�v3.5g"@�
v2.0.50727g @�v3.0)r�r�)r�r�)rfrbrQ�getattrr�rhr3)r<�bitsZreg_verZdot_net_dirr'Zframeworkverrrrr�s





z!SystemInfo._find_dot_net_versionscs,��fdd�ttj���D�}t|d�p*dS)z�
        Return name of the last dir in path or '' if no dir found.

        Parameters
        ----------
        path: str
            Use dirs in this path
        prefix: str
            Use only dirs startings by this prefix
        c3s2|]*}tjjtjj�|��r|j��r|VqdS)N)rrrwr�
startswith)�.0Zdir_name)r�prefixrr�	<genexpr>)sz0SystemInfo._use_last_dir_name.<locals>.<genexpr>Nr)�reversedrr{�next)r<rr�Z
matching_dirsr)rr�rr�szSystemInfo._use_last_dir_name)N)r) rr	r
rIrJrKrdrerrr=rgrirLrtrzrurvr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrrrrc�s4

&	rcc@sReZdZdZd=dd�Zedd��Zedd	��Zed
d��Zedd
��Z	edd��Z
edd��Zedd��Zedd��Z
edd��Zedd��Zdd�Zedd��Zedd��Zed d!��Zed"d#��Zed$d%��Zed&d'��Zed(d)��Zed*d+��Zed,d-��Zed.d/��Zed0d1��Zed2d3��Zed4d5��Zd>d7d8�Zd9d:�Zd?d;d<�Z dS)@r$aY
    Return environment variables for specified Microsoft Visual C++ version
    and platform : Lib, Include, Path and libpath.

    This function is compatible with Microsoft Visual C++ 9.0 to 14.0.

    Script created by analysing Microsoft environment configuration files like
    "vcvars[...].bat", "SetEnv.Cmd", "vcbuildtools.bat", ...

    Parameters
    ----------
    arch: str
        Target architecture.
    vc_ver: float
        Required Microsoft Visual C++ version. If not set, autodetect the last
        version.
    vc_min_ver: float
        Minimum Microsoft Visual C++ version.
    NrcCsBt|�|_t|j�|_t|j|�|_|j|kr>d}tjj	|��dS)Nz.No suitable Microsoft Visual C++ version found)
r8rNrMrfrc�sirhr r!r")r<r(rhr,rkrrrr=Is

zEnvironmentInfo.__init__cCs|jjS)z/
        Microsoft Visual C++ version.
        )r�rh)r<rrrrhRszEnvironmentInfo.vc_vercsVddg}�jdkrD�jjddd�}|dg7}|dg7}|d|g7}�fd	d
�|D�S)z/
        Microsoft Visual Studio Tools
        zCommon7\IDEz
Common7\Toolsg,@T)rCr9z1Common7\IDE\CommonExtensions\Microsoft\TestWindowzTeam Tools\Performance ToolszTeam Tools\Performance Tools%scsg|]}tjj�jj|��qSr)rrrr�rt)r�r)r<rr�
<listcomp>fsz+EnvironmentInfo.VSTools.<locals>.<listcomp>)rhrNrD)r<�paths�arch_subdirr)r<r�VSToolsYs


zEnvironmentInfo.VSToolscCs$tjj|jjd�tjj|jjd�gS)zL
        Microsoft Visual C++ & Microsoft Foundation Class Includes
        ZIncludezATLMFC\Include)rrrr�rz)r<rrr�
VCIncludeshszEnvironmentInfo.VCIncludescsb�jdkr�jjdd�}n�jjdd�}d|d|g}�jdkrP|d|g7}�fd	d
�|D�S)zM
        Microsoft Visual C++ & Microsoft Foundation Class Libraries
        g.@T)r9)rCzLib%szATLMFC\Lib%sg,@zLib\store%scsg|]}tjj�jj|��qSr)rrrr�rz)r�r)r<rrr�~sz/EnvironmentInfo.VCLibraries.<locals>.<listcomp>)rhrNrE)r<r�r�r)r<r�VCLibrariesps

zEnvironmentInfo.VCLibrariescCs"|jdkrgStjj|jjd�gS)zA
        Microsoft Visual C++ store references Libraries
        g,@zLib\store\references)rhrrrr�rz)r<rrr�VCStoreRefs�s
zEnvironmentInfo.VCStoreRefscCs|j}tjj|jd�g}|jdkr&dnd}|jj|�}|rT|tjj|jd|�g7}|jdkr�d|jjdd�}|tjj|j|�g7}n�|jdkr�|jj	�r�d	nd
}|tjj|j||jj
dd��g7}|jj|jjkr�|tjj|j||jjdd��g7}n|tjj|jd�g7}|S)
z,
        Microsoft Visual C++ Tools
        Z
VCPackagesg$@TFzBin%sg,@)rCg.@z
bin\HostX86%sz
bin\HostX64%s)r9�Bin)
r�rrrrzrhrNrHrDrBrErAr?)r<r��toolsrGr�rZhost_dirrrr�VCTools�s&

zEnvironmentInfo.VCToolscCst|jdkr2|jjddd�}tjj|jjd|�gS|jjdd�}tjj|jjd�}|j}tjj|d||f�gSdS)	z1
        Microsoft Windows SDK Libraries
        g$@T)rCr9zLib%s)r9r�z%sum%sN)	rhrNrErrrr�r��_sdk_subdir)r<r�r�Zlibverrrr�OSLibraries�s
zEnvironmentInfo.OSLibrariescCs|tjj|jjd�}|jdkr.|tjj|d�gS|jdkr@|j}nd}tjj|d|�tjj|d|�tjj|d|�gSd	S)
z/
        Microsoft Windows SDK Include
        �includeg$@Zglg,@rz%ssharedz%sumz%swinrtN)rrrr�r�rhr�)r<r��sdkverrrr�
OSIncludes�s

zEnvironmentInfo.OSIncludescCs�tjj|jjd�}g}|jdkr*||j7}|jdkrH|tjj|d�g7}|jdkr�||tjj|jjd�tjj|dd�tjj|d	d�tjj|d
d�tjj|jjddd
|jddd�g7}|S)z7
        Microsoft Windows SDK Libraries Paths
        Z
Referencesg"@g&@zCommonConfiguration\Neutralg,@Z
UnionMetadataz'Windows.Foundation.UniversalApiContractz1.0.0.0z%Windows.Foundation.FoundationContractz,Windows.Networking.Connectivity.WwanContractZ
ExtensionSDKszMicrosoft.VCLibsz%0.1fZCommonConfigurationZneutral)rrrr�r�rhr�)r<�ref�libpathrrr�	OSLibpath�s>




zEnvironmentInfo.OSLibpathcCst|j��S)z-
        Microsoft Windows SDK Tools
        )�list�
_sdk_tools)r<rrr�SdkTools�szEnvironmentInfo.SdkToolsccs|jdkr0|jdkrdnd}tjj|jj|�V|jj�sd|jjdd�}d|}tjj|jj|�V|jdksx|jdkr�|jj	�r�d	}n|jjddd
�}d|}tjj|jj|�VnL|jdk�rtjj|jjd�}|jjdd�}|jj
}tjj|d||f�V|jj�r|jjVd
S)z=
        Microsoft Windows SDK Tools paths generator
        g.@g&@r�zBin\x86T)r9zBin%sg$@r)rCr9zBin\NETFX 4.0 Tools%sz%s%sN)rhrrrr�r�rNrBrDr@r�r�)r<Zbin_dirr�rr�rrrr��s(



zEnvironmentInfo._sdk_toolscCs|jj}|rd|SdS)z6
        Microsoft Windows SDK version subdir
        z%s\r)r�r�)r<�ucrtverrrrr�szEnvironmentInfo._sdk_subdircCs"|jdkrgStjj|jjd�gS)z-
        Microsoft Windows SDK Setup
        g"@ZSetup)rhrrrr�r�)r<rrr�SdkSetup%s
zEnvironmentInfo.SdkSetupcs�|j}|j�|jdkr0d}|j�o,|j�}n$|j�p>|j�}|jdkpR|jdk}g}|rt|�fdd��jD�7}|r�|�fdd��jD�7}|S)z0
        Microsoft .NET Framework Tools
        g$@Tr:csg|]}tjj�j|��qSr)rrrr�)r�r')r�rrr�@sz+EnvironmentInfo.FxTools.<locals>.<listcomp>csg|]}tjj�j|��qSr)rrrr�)r�r')r�rrr�Cs)	rNr�rhr@rBrAr?r�r�)r<rNZ	include32Z	include64r�r)r�r�FxTools/s
zEnvironmentInfo.FxToolscCs>|jdks|jjrgS|jjdd�}tjj|jjd|�gS)z8
        Microsoft .Net Framework SDK Libraries
        g,@T)r9zlib\um%s)rhr�r�rNrErrr)r<r�rrr�NetFxSDKLibrariesGsz!EnvironmentInfo.NetFxSDKLibrariescCs,|jdks|jjrgStjj|jjd�gS)z7
        Microsoft .Net Framework SDK Includes
        g,@z
include\um)rhr�r�rrr)r<rrr�NetFxSDKIncludesRsz EnvironmentInfo.NetFxSDKIncludescCstjj|jjd�gS)z>
        Microsoft Visual Studio Team System Database
        z
VSTSDB\Deploy)rrrr�rt)r<rrr�VsTDb\szEnvironmentInfo.VsTDbcCs~|jdkrgS|jdkr0|jj}|jjdd�}n|jj}d}d|j|f}tjj||�g}|jdkrz|tjj||d�g7}|S)z(
        Microsoft Build Engine
        g(@g.@T)rCrzMSBuild\%0.1f\bin%sZRoslyn)	rhr�rrrNrDrtrrr)r<�	base_pathr�rZbuildrrr�MSBuildcs


[ binary data omitted: CPython 3.6 compiled bytecode (.pyc) entries, not recoverable
  as text; only the archive entry names are preserved below. The span opened with the
  tail of a preceding compiled module (the setuptools EnvironmentInfo / MSVC helper). ]

site-packages/setuptools/__pycache__/py27compat.cpython-36.opt-1.pyc
site-packages/setuptools/__pycache__/ssl_support.cpython-36.opt-1.pyc
site-packages/setuptools/__pycache__/py36compat.cpython-36.pyc
site-packages/setuptools/__pycache__/package_index.cpython-36.opt-1.pyc
site-packages/setuptools/__pycache__/py27compat.cpython-36.pyc
site-packages/setuptools/__pycache__/pep425tags.cpython-36.opt-1.pyc
site-packages/setuptools/__pycache__/unicode_utils.cpython-36.opt-1.pyc
site-packages/setuptools/__pycache__/lib2to3_ex.cpython-36.pyc
site-packages/setuptools/__pycache__/windows_support.cpython-36.opt-1.pyc
site-packages/setuptools/__pycache__/build_meta.cpython-36.pyc
site-packages/setuptools/__pycache__/__init__.cpython-36.pyc
site-packages/setuptools/glibc.py000064400000006112147511334650013150 0ustar00# This file originally from pip:
# https://github.com/pypa/pip/blob/8f4f15a5a95d7d5b511ceaee9ed261176c181970/src/pip/_internal/utils/glibc.py
from __future__ import absolute_import

import ctypes
import re
import warnings


def glibc_version_string():
    "Returns glibc version string, or None if not using glibc."

    # ctypes.CDLL(None) internally calls dlopen(NULL), and as the dlopen
    # manpage says, "If filename is NULL, then the returned handle is for the
    # main program". This way we can let the linker do the work to figure out
    # which libc our process is actually using.
    process_namespace = ctypes.CDLL(None)
    try:
        gnu_get_libc_version = process_namespace.gnu_get_libc_version
    except AttributeError:
        # Symbol doesn't exist -> therefore, we are not linked to
        # glibc.
        return None

    # Call gnu_get_libc_version, which returns a string like "2.5"
    gnu_get_libc_version.restype = ctypes.c_char_p
    version_str = gnu_get_libc_version()
    # py2 / py3 compatibility:
    if not isinstance(version_str, str):
        version_str = version_str.decode("ascii")

    return version_str


# Separated out from have_compatible_glibc for easier unit testing
def check_glibc_version(version_str, required_major, minimum_minor):
    # Parse string and check against requested version.
    #
    # We use a regexp instead of str.split because we want to discard any
    # random junk that might come after the minor version -- this might happen
    # in patched/forked versions of glibc (e.g. Linaro's version of glibc
    # uses version strings like "2.20-2014.11"). See gh-3588.
    m = re.match(r"(?P<major>[0-9]+)\.(?P<minor>[0-9]+)", version_str)
    if not m:
        warnings.warn("Expected glibc version with 2 components major.minor,"
                      " got: %s" % version_str, RuntimeWarning)
        return False
    return (int(m.group("major")) == required_major and
            int(m.group("minor")) >= minimum_minor)
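
# Example (illustrative, values hypothetical): the regexp above tolerates
# vendor suffixes after the minor version, so a Linaro-style string still
# parses:
#
#   check_glibc_version("2.20-2014.11", 2, 17)  # -> True  (20 >= 17)
#   check_glibc_version("2.5", 2, 17)           # -> False (5 < 17)
#   check_glibc_version("oops", 2, 17)          # -> False, plus a RuntimeWarning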


def have_compatible_glibc(required_major, minimum_minor):
    version_str = glibc_version_string()
    if version_str is None:
        return False
    return check_glibc_version(version_str, required_major, minimum_minor)


# platform.libc_ver regularly returns completely nonsensical glibc
# versions. E.g. on my computer, platform says:
#
#   ~$ python2.7 -c 'import platform; print(platform.libc_ver())'
#   ('glibc', '2.7')
#   ~$ python3.5 -c 'import platform; print(platform.libc_ver())'
#   ('glibc', '2.9')
#
# But the truth is:
#
#   ~$ ldd --version
#   ldd (Debian GLIBC 2.22-11) 2.22
#
# This is unfortunate, because it means that the linehaul data on libc
# versions that was generated by pip 8.1.2 and earlier is useless and
# misleading. Solution: instead of using platform, use our code that actually
# works.
def libc_ver():
    """Try to determine the glibc version

    Returns a tuple of strings (lib, version) which default to empty strings
    in case the lookup fails.
    """
    glibc_version = glibc_version_string()
    if glibc_version is None:
        return ("", "")
    else:
        return ("glibc", glibc_version)
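

# Illustrative usage sketch (assumes a glibc-based Linux; printed values vary
# by machine): this is the drop-in replacement for platform.libc_ver()
# described in the comment above.
if __name__ == "__main__":
    lib, version = libc_ver()
    if lib:
        print("linked against %s %s" % (lib, version))
        # glibc >= 2.5 is the manylinux1-era baseline checked elsewhere
        print("manylinux1-era glibc:", have_compatible_glibc(2, 5))
    else:
        print("no glibc detected")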
site-packages/setuptools/dep_util.py000064400000001647147511334650013705 0ustar00from distutils.dep_util import newer_group

# yes, this was almost entirely copy-pasted from
# 'newer_pairwise()', this is just another convenience
# function.
def newer_pairwise_group(sources_groups, targets):
    """Walk both arguments in parallel, testing if each source group is newer
    than its corresponding target. Returns a pair of lists (sources_groups,
    targets) where sources is newer than target, according to the semantics
    of 'newer_group()'.
    """
    if len(sources_groups) != len(targets):
        raise ValueError("'sources_groups' and 'targets' must be the same length")

    # build a pair of lists (sources_groups, targets) where source is newer
    n_sources = []
    n_targets = []
    for i in range(len(sources_groups)):
        if newer_group(sources_groups[i], targets[i]):
            n_sources.append(sources_groups[i])
            n_targets.append(targets[i])

    return n_sources, n_targets
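

# Illustrative sketch (paths hypothetical): usage mirrors distutils'
# newer_pairwise(), but each *group* of sources maps to one target.
#
#   groups  = [['a.c', 'a.h'], ['b.c', 'b.h']]
#   objects = ['a.o', 'b.o']
#   stale_groups, stale_objects = newer_pairwise_group(groups, objects)
#   # Only the (group, object) pairs where some source is newer than the
#   # object file -- per newer_group() -- are returned.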
site-packages/setuptools/py36compat.py000064400000005513147511334650014101 0ustar00import sys
from distutils.errors import DistutilsOptionError
from distutils.util import strtobool
from distutils.debug import DEBUG


class Distribution_parse_config_files:
    """
    Mix-in providing forward-compatibility for functionality to be
    included by default on Python 3.7.

    Do not edit the code in this class except to update functionality
    as implemented in distutils.
    """
    def parse_config_files(self, filenames=None):
        from configparser import ConfigParser

        # Ignore install directory options if we have a venv
        if sys.prefix != sys.base_prefix:
            ignore_options = [
                'install-base', 'install-platbase', 'install-lib',
                'install-platlib', 'install-purelib', 'install-headers',
                'install-scripts', 'install-data', 'prefix', 'exec-prefix',
                'home', 'user', 'root']
        else:
            ignore_options = []

        ignore_options = frozenset(ignore_options)

        if filenames is None:
            filenames = self.find_config_files()

        if DEBUG:
            self.announce("Distribution.parse_config_files():")

        parser = ConfigParser(interpolation=None)
        for filename in filenames:
            if DEBUG:
                self.announce("  reading %s" % filename)
            parser.read(filename)
            for section in parser.sections():
                options = parser.options(section)
                opt_dict = self.get_option_dict(section)

                for opt in options:
                    if opt != '__name__' and opt not in ignore_options:
                        val = parser.get(section,opt)
                        opt = opt.replace('-', '_')
                        opt_dict[opt] = (filename, val)

            # Make the ConfigParser forget everything (so we retain
            # the original filenames that options come from)
            parser.__init__()

        # If there was a "global" section in the config file, use it
        # to set Distribution options.

        if 'global' in self.command_options:
            for (opt, (src, val)) in self.command_options['global'].items():
                alias = self.negative_opt.get(opt)
                try:
                    if alias:
                        setattr(self, alias, not strtobool(val))
                    elif opt in ('verbose', 'dry_run'): # ugh!
                        setattr(self, opt, strtobool(val))
                    else:
                        setattr(self, opt, val)
                except ValueError as msg:
                    raise DistutilsOptionError(msg)


if sys.version_info < (3,):
    # Python 2 behavior is sufficient
    class Distribution_parse_config_files:
        pass


if False:
    # When updated behavior is available upstream,
    # disable override here.
    class Distribution_parse_config_files:
        pass
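

# Illustrative sketch (class names hypothetical): a mix-in like the one above
# is consumed by listing it *before* the distutils base class, so its
# parse_config_files() wins in the MRO on Python 3.6 and earlier:
#
#   from distutils.core import Distribution as _Distribution
#
#   class Distribution(Distribution_parse_config_files, _Distribution):
#       pass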
site-packages/setuptools/config.py000064400000043126147511334650013343 0ustar00from __future__ import absolute_import, unicode_literals
import io
import os
import sys
from collections import defaultdict
from functools import partial
from importlib import import_module

from distutils.errors import DistutilsOptionError, DistutilsFileError
from setuptools.extern.packaging.version import LegacyVersion, parse
from setuptools.extern.six import string_types


def read_configuration(
        filepath, find_others=False, ignore_option_errors=False):
    """Read the given configuration file and return options from it as a dict.

    :param str|unicode filepath: Path to configuration file
        to get options from.

    :param bool find_others: Whether to search for other configuration files
        which could be in various places.

    :param bool ignore_option_errors: Whether to silently ignore
        options, values of which could not be resolved (e.g. due to exceptions
        in directives such as file:, attr:, etc.).
        If False exceptions are propagated as expected.

    :rtype: dict
    """
    from setuptools.dist import Distribution, _Distribution

    filepath = os.path.abspath(filepath)

    if not os.path.isfile(filepath):
        raise DistutilsFileError(
            'Configuration file %s does not exist.' % filepath)

    current_directory = os.getcwd()
    os.chdir(os.path.dirname(filepath))

    try:
        dist = Distribution()

        filenames = dist.find_config_files() if find_others else []
        if filepath not in filenames:
            filenames.append(filepath)

        _Distribution.parse_config_files(dist, filenames=filenames)

        handlers = parse_configuration(
            dist, dist.command_options,
            ignore_option_errors=ignore_option_errors)

    finally:
        os.chdir(current_directory)

    return configuration_to_dict(handlers)
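

def _example_read_configuration(path='setup.cfg'):
    """Illustrative sketch only: pull a couple of common values out of the
    dict returned by read_configuration().  Assumes ``path`` names an existing
    file with [metadata] and [options] sections."""
    conf = read_configuration(path)
    name = conf['metadata'].get('name')
    install_requires = conf['options'].get('install_requires', [])
    return name, install_requires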


def configuration_to_dict(handlers):
    """Returns configuration data gathered by given handlers as a dict.

    :param list[ConfigHandler] handlers: Handlers list,
        usually from parse_configuration()

    :rtype: dict
    """
    config_dict = defaultdict(dict)

    for handler in handlers:

        obj_alias = handler.section_prefix
        target_obj = handler.target_obj

        for option in handler.set_options:
            getter = getattr(target_obj, 'get_%s' % option, None)

            if getter is None:
                value = getattr(target_obj, option)

            else:
                value = getter()

            config_dict[obj_alias][option] = value

    return config_dict


def parse_configuration(
        distribution, command_options, ignore_option_errors=False):
    """Performs additional parsing of configuration options
    for a distribution.

    Returns a list of used option handlers.

    :param Distribution distribution:
    :param dict command_options:
    :param bool ignore_option_errors: Whether to silently ignore
        options, values of which could not be resolved (e.g. due to exceptions
        in directives such as file:, attr:, etc.).
        If False exceptions are propagated as expected.
    :rtype: list
    """
    options = ConfigOptionsHandler(
        distribution, command_options, ignore_option_errors)
    options.parse()

    meta = ConfigMetadataHandler(
        distribution.metadata, command_options, ignore_option_errors, distribution.package_dir)
    meta.parse()

    return meta, options


class ConfigHandler(object):
    """Handles metadata supplied in configuration files."""

    section_prefix = None
    """Prefix for config sections handled by this handler.
    Must be provided by class heirs.

    """

    aliases = {}
    """Options aliases.
    For compatibility with various packages. E.g.: d2to1 and pbr.
    Note: `-` in keys is replaced with `_` by config parser.

    """

    def __init__(self, target_obj, options, ignore_option_errors=False):
        sections = {}

        section_prefix = self.section_prefix
        for section_name, section_options in options.items():
            if not section_name.startswith(section_prefix):
                continue

            section_name = section_name.replace(section_prefix, '').strip('.')
            sections[section_name] = section_options

        self.ignore_option_errors = ignore_option_errors
        self.target_obj = target_obj
        self.sections = sections
        self.set_options = []

    @property
    def parsers(self):
        """Metadata item name to parser function mapping."""
        raise NotImplementedError(
            '%s must provide .parsers property' % self.__class__.__name__)

    def __setitem__(self, option_name, value):
        unknown = tuple()
        target_obj = self.target_obj

        # Translate alias into real name.
        option_name = self.aliases.get(option_name, option_name)

        current_value = getattr(target_obj, option_name, unknown)

        if current_value is unknown:
            raise KeyError(option_name)

        if current_value:
            # Already inhabited. Skipping.
            return

        skip_option = False
        parser = self.parsers.get(option_name)
        if parser:
            try:
                value = parser(value)

            except Exception:
                skip_option = True
                if not self.ignore_option_errors:
                    raise

        if skip_option:
            return

        setter = getattr(target_obj, 'set_%s' % option_name, None)
        if setter is None:
            setattr(target_obj, option_name, value)
        else:
            setter(value)

        self.set_options.append(option_name)

    @classmethod
    def _parse_list(cls, value, separator=','):
        """Represents value as a list.

        Value is split either by separator (defaults to comma) or by lines.

        :param value:
        :param separator: List items separator character.
        :rtype: list
        """
        if isinstance(value, list):  # _get_parser_compound case
            return value

        if '\n' in value:
            value = value.splitlines()
        else:
            value = value.split(separator)

        return [chunk.strip() for chunk in value if chunk.strip()]
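
        # Examples (illustrative): comma-separated and line-separated forms
        # both work, and empty chunks are dropped:
        #
        #   _parse_list('one, two, three')    -> ['one', 'two', 'three']
        #   _parse_list('one\ntwo\n\nthree')  -> ['one', 'two', 'three']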

    @classmethod
    def _parse_dict(cls, value):
        """Represents value as a dict.

        :param value:
        :rtype: dict
        """
        separator = '='
        result = {}
        for line in cls._parse_list(value):
            key, sep, val = line.partition(separator)
            if sep != separator:
                raise DistutilsOptionError(
                    'Unable to parse option value to dict: %s' % value)
            result[key.strip()] = val.strip()

        return result
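
        # Example (illustrative): items are split on '=' after list parsing,
        # so both forms below yield {'docs': 'docs', 'src': 'src/pkg'}:
        #
        #   _parse_dict('docs = docs\nsrc = src/pkg')
        #   _parse_dict('docs=docs, src=src/pkg')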

    @classmethod
    def _parse_bool(cls, value):
        """Represents value as boolean.

        :param value:
        :rtype: bool
        """
        value = value.lower()
        return value in ('1', 'true', 'yes')

    @classmethod
    def _parse_file(cls, value):
        """Represents value as a string, allowing including text
        from nearest files using `file:` directive.

        Directive is sandboxed and won't reach anything outside
        directory with setup.py.

        Examples:
            file: LICENSE
            file: README.rst, CHANGELOG.md, src/file.txt

        :param str value:
        :rtype: str
        """
        include_directive = 'file:'

        if not isinstance(value, string_types):
            return value

        if not value.startswith(include_directive):
            return value

        spec = value[len(include_directive):]
        filepaths = (os.path.abspath(path.strip()) for path in spec.split(','))
        return '\n'.join(
            cls._read_file(path)
            for path in filepaths
            if (cls._assert_local(path) or True)
            and os.path.isfile(path)
        )
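
    # Illustrative setup.cfg usage of the `file:` directive (the file names
    # below are assumptions):
    #
    #   [metadata]
    #   long_description = file: README.rst, CHANGELOG.md
    #
    # Each listed file is read as UTF-8 and the contents are joined with
    # newlines; paths pointing outside the setup.py directory raise an error.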

    @staticmethod
    def _assert_local(filepath):
        if not filepath.startswith(os.getcwd()):
            raise DistutilsOptionError(
                '`file:` directive can not access %s' % filepath)

    @staticmethod
    def _read_file(filepath):
        with io.open(filepath, encoding='utf-8') as f:
            return f.read()

    @classmethod
    def _parse_attr(cls, value, package_dir=None):
        """Represents value as a module attribute.

        Examples:
            attr: package.attr
            attr: package.module.attr

        :param str value:
        :rtype: str
        """
        attr_directive = 'attr:'
        if not value.startswith(attr_directive):
            return value

        attrs_path = value.replace(attr_directive, '').strip().split('.')
        attr_name = attrs_path.pop()

        module_name = '.'.join(attrs_path)
        module_name = module_name or '__init__'

        parent_path = os.getcwd()
        if package_dir:
            if attrs_path[0] in package_dir:
                # A custom path was specified for the module we want to import
                custom_path = package_dir[attrs_path[0]]
                parts = custom_path.rsplit('/', 1)
                if len(parts) > 1:
                    parent_path = os.path.join(os.getcwd(), parts[0])
                    module_name = parts[1]
                else:
                    module_name = custom_path
            elif '' in package_dir:
                # A custom parent directory was specified for all root modules
                parent_path = os.path.join(os.getcwd(), package_dir[''])
        sys.path.insert(0, parent_path)
        try:
            module = import_module(module_name)
            value = getattr(module, attr_name)

        finally:
            sys.path = sys.path[1:]

        return value
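
    # Illustrative setup.cfg usage of the `attr:` directive (package and
    # attribute names are assumptions):
    #
    #   [metadata]
    #   version = attr: mypackage.__version__
    #
    # This imports `mypackage` (honouring any package_dir remapping) and
    # returns its `__version__` attribute.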

    @classmethod
    def _get_parser_compound(cls, *parse_methods):
        """Returns parser function to represents value as a list.

        Parses a value applying given methods one after another.

        :param parse_methods:
        :rtype: callable
        """
        def parse(value):
            parsed = value

            for method in parse_methods:
                parsed = method(parsed)

            return parsed

        return parse

    @classmethod
    def _parse_section_to_dict(cls, section_options, values_parser=None):
        """Parses section options into a dictionary.

        Optionally applies a given parser to values.

        :param dict section_options:
        :param callable values_parser:
        :rtype: dict
        """
        value = {}
        values_parser = values_parser or (lambda val: val)
        for key, (_, val) in section_options.items():
            value[key] = values_parser(val)
        return value

    def parse_section(self, section_options):
        """Parses configuration file section.

        :param dict section_options:
        """
        for (name, (_, value)) in section_options.items():
            try:
                self[name] = value

            except KeyError:
                pass  # Keep silent, as a new option may appear at any time.

    def parse(self):
        """Parses configuration file items from one
        or more related sections.

        """
        for section_name, section_options in self.sections.items():

            method_postfix = ''
            if section_name:  # [section.option] variant
                method_postfix = '_%s' % section_name

            section_parser_method = getattr(
                self,
                # Dots in section names are translated into double underscores.
                ('parse_section%s' % method_postfix).replace('.', '__'),
                None)

            if section_parser_method is None:
                raise DistutilsOptionError(
                    'Unsupported distribution option section: [%s.%s]' % (
                        self.section_prefix, section_name))

            section_parser_method(section_options)


class ConfigMetadataHandler(ConfigHandler):

    section_prefix = 'metadata'

    aliases = {
        'home_page': 'url',
        'summary': 'description',
        'classifier': 'classifiers',
        'platform': 'platforms',
    }

    strict_mode = False
    """We need to keep it loose, to be partially compatible with
    `pbr` and `d2to1` packages which also uses `metadata` section.

    """

    def __init__(self, target_obj, options, ignore_option_errors=False,
                 package_dir=None):
        super(ConfigMetadataHandler, self).__init__(target_obj, options,
                                                    ignore_option_errors)
        self.package_dir = package_dir

    @property
    def parsers(self):
        """Metadata item name to parser function mapping."""
        parse_list = self._parse_list
        parse_file = self._parse_file
        parse_dict = self._parse_dict

        return {
            'platforms': parse_list,
            'keywords': parse_list,
            'provides': parse_list,
            'requires': parse_list,
            'obsoletes': parse_list,
            'classifiers': self._get_parser_compound(parse_file, parse_list),
            'license': parse_file,
            'description': parse_file,
            'long_description': parse_file,
            'version': self._parse_version,
            'project_urls': parse_dict,
        }

    def _parse_version(self, value):
        """Parses `version` option value.

        :param value:
        :rtype: str

        """
        version = self._parse_file(value)

        if version != value:
            version = version.strip()
            # Be strict about versions loaded from file because it's easy to
            # accidentally include newlines and other unintended content
            if isinstance(parse(version), LegacyVersion):
                raise DistutilsOptionError('Version loaded from %s does not comply with PEP 440: %s' % (
                    value, version
                ))
            return version

        version = self._parse_attr(value, self.package_dir)

        if callable(version):
            version = version()

        if not isinstance(version, string_types):
            if hasattr(version, '__iter__'):
                version = '.'.join(map(str, version))
            else:
                version = '%s' % version

        return version
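
    # Illustrative setup.cfg usage of the `version` option (both supported
    # directives shown; names are assumptions):
    #
    #   [metadata]
    #   version = attr: mypackage.__version__
    #   # or, reading a PEP 440 compliant version string from a file:
    #   version = file: VERSION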


class ConfigOptionsHandler(ConfigHandler):

    section_prefix = 'options'

    @property
    def parsers(self):
        """Metadata item name to parser function mapping."""
        parse_list = self._parse_list
        parse_list_semicolon = partial(self._parse_list, separator=';')
        parse_bool = self._parse_bool
        parse_dict = self._parse_dict

        return {
            'zip_safe': parse_bool,
            'use_2to3': parse_bool,
            'include_package_data': parse_bool,
            'package_dir': parse_dict,
            'use_2to3_fixers': parse_list,
            'use_2to3_exclude_fixers': parse_list,
            'convert_2to3_doctests': parse_list,
            'scripts': parse_list,
            'eager_resources': parse_list,
            'dependency_links': parse_list,
            'namespace_packages': parse_list,
            'install_requires': parse_list_semicolon,
            'setup_requires': parse_list_semicolon,
            'tests_require': parse_list_semicolon,
            'packages': self._parse_packages,
            'entry_points': self._parse_file,
            'py_modules': parse_list,
        }

    def _parse_packages(self, value):
        """Parses `packages` option value.

        :param value:
        :rtype: list
        """
        find_directive = 'find:'

        if not value.startswith(find_directive):
            return self._parse_list(value)

        # Read function arguments from a dedicated section.
        find_kwargs = self.parse_section_packages__find(
            self.sections.get('packages.find', {}))

        from setuptools import find_packages

        return find_packages(**find_kwargs)

    def parse_section_packages__find(self, section_options):
        """Parses `packages.find` configuration file section.

        To be used in conjunction with _parse_packages().

        :param dict section_options:
        """
        section_data = self._parse_section_to_dict(
            section_options, self._parse_list)

        valid_keys = ['where', 'include', 'exclude']

        find_kwargs = dict(
            [(k, v) for k, v in section_data.items() if k in valid_keys and v])

        where = find_kwargs.get('where')
        if where is not None:
            find_kwargs['where'] = where[0]  # cast list to single val

        return find_kwargs
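
    # Illustrative setup.cfg usage of the `find:` directive (directory names
    # are assumptions):
    #
    #   [options]
    #   packages = find:
    #
    #   [options.packages.find]
    #   where = src
    #   exclude = tests
    #
    # With `find:`, the keyword arguments for setuptools.find_packages() are
    # read from [options.packages.find]; only `where`, `include` and
    # `exclude` are honoured.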

    def parse_section_entry_points(self, section_options):
        """Parses `entry_points` configuration file section.

        :param dict section_options:
        """
        parsed = self._parse_section_to_dict(section_options, self._parse_list)
        self['entry_points'] = parsed

    def _parse_package_data(self, section_options):
        parsed = self._parse_section_to_dict(section_options, self._parse_list)

        root = parsed.get('*')
        if root:
            parsed[''] = root
            del parsed['*']

        return parsed

    def parse_section_package_data(self, section_options):
        """Parses `package_data` configuration file section.

        :param dict section_options:
        """
        self['package_data'] = self._parse_package_data(section_options)

    def parse_section_exclude_package_data(self, section_options):
        """Parses `exclude_package_data` configuration file section.

        :param dict section_options:
        """
        self['exclude_package_data'] = self._parse_package_data(
            section_options)

    def parse_section_extras_require(self, section_options):
        """Parses `extras_require` configuration file section.

        :param dict section_options:
        """
        parse_list = partial(self._parse_list, separator=';')
        self['extras_require'] = self._parse_section_to_dict(
            section_options, parse_list)

site-packages/setuptools/unicode_utils.py

import unicodedata
import sys

from setuptools.extern import six


# HFS Plus uses decomposed UTF-8
def decompose(path):
    if isinstance(path, six.text_type):
        return unicodedata.normalize('NFD', path)
    try:
        path = path.decode('utf-8')
        path = unicodedata.normalize('NFD', path)
        path = path.encode('utf-8')
    except UnicodeError:
        pass  # Not UTF-8
    return path


def filesys_decode(path):
    """
    Ensure that the given path is decoded,
    returning None when no expected encoding works.
    """

    if isinstance(path, six.text_type):
        return path

    fs_enc = sys.getfilesystemencoding() or 'utf-8'
    candidates = fs_enc, 'utf-8'

    for enc in candidates:
        try:
            return path.decode(enc)
        except UnicodeDecodeError:
            continue


def try_encode(string, enc):
    "turn unicode encoding into a functional routine"
    try:
        return string.encode(enc)
    except UnicodeEncodeError:
        return None

site-packages/setuptools/sandbox.py

import os
import sys
import tempfile
import operator
import functools
import itertools
import re
import contextlib
import pickle
import textwrap

from setuptools.extern import six
from setuptools.extern.six.moves import builtins, map

import pkg_resources.py31compat

if sys.platform.startswith('java'):
    import org.python.modules.posix.PosixModule as _os
else:
    _os = sys.modules[os.name]
try:
    _file = file
except NameError:
    _file = None
_open = open
from distutils.errors import DistutilsError
from pkg_resources import working_set


__all__ = [
    "AbstractSandbox", "DirectorySandbox", "SandboxViolation", "run_setup",
]


def _execfile(filename, globals, locals=None):
    """
    Python 3 implementation of execfile.
    """
    mode = 'rb'
    with open(filename, mode) as stream:
        script = stream.read()
    if locals is None:
        locals = globals
    code = compile(script, filename, 'exec')
    exec(code, globals, locals)


@contextlib.contextmanager
def save_argv(repl=None):
    saved = sys.argv[:]
    if repl is not None:
        sys.argv[:] = repl
    try:
        yield saved
    finally:
        sys.argv[:] = saved


@contextlib.contextmanager
def save_path():
    saved = sys.path[:]
    try:
        yield saved
    finally:
        sys.path[:] = saved


@contextlib.contextmanager
def override_temp(replacement):
    """
    Monkey-patch tempfile.tempdir with replacement, ensuring it exists
    """
    pkg_resources.py31compat.makedirs(replacement, exist_ok=True)

    saved = tempfile.tempdir

    tempfile.tempdir = replacement

    try:
        yield
    finally:
        tempfile.tempdir = saved


@contextlib.contextmanager
def pushd(target):
    saved = os.getcwd()
    os.chdir(target)
    try:
        yield saved
    finally:
        os.chdir(saved)


class UnpickleableException(Exception):
    """
    An exception representing another Exception that could not be pickled.
    """

    @staticmethod
    def dump(type, exc):
        """
        Always return a dumped (pickled) type and exc. If exc can't be pickled,
        wrap it in UnpickleableException first.
        """
        try:
            return pickle.dumps(type), pickle.dumps(exc)
        except Exception:
            # get UnpickleableException inside the sandbox
            from setuptools.sandbox import UnpickleableException as cls
            return cls.dump(cls, cls(repr(exc)))


class ExceptionSaver:
    """
    A Context Manager that will save an exception, serialized, and restore it
    later.
    """

    def __enter__(self):
        return self

    def __exit__(self, type, exc, tb):
        if not exc:
            return

        # dump the exception
        self._saved = UnpickleableException.dump(type, exc)
        self._tb = tb

        # suppress the exception
        return True

    def resume(self):
        "restore and re-raise any exception"

        if '_saved' not in vars(self):
            return

        type, exc = map(pickle.loads, self._saved)
        six.reraise(type, exc, self._tb)
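
# Illustrative sketch of the intended usage pattern (the callables below are
# hypothetical placeholders; save_modules() further down is the real user):
# exceptions raised inside the `with` block are pickled and suppressed, then
# re-raised by resume() once the surrounding state has been restored.
#
#   with ExceptionSaver() as saved_exc:
#       run_sandboxed_code()   # hypothetical
#   restore_state()            # hypothetical
#   saved_exc.resume()         # re-raises the saved exception, if any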


@contextlib.contextmanager
def save_modules():
    """
    Context in which imported modules are saved.

    Translates exceptions internal to the context into the equivalent exception
    outside the context.
    """
    saved = sys.modules.copy()
    with ExceptionSaver() as saved_exc:
        yield saved

    sys.modules.update(saved)
    # remove any modules imported since
    del_modules = (
        mod_name for mod_name in sys.modules
        if mod_name not in saved
        # exclude any encodings modules. See #285
        and not mod_name.startswith('encodings.')
    )
    _clear_modules(del_modules)

    saved_exc.resume()


def _clear_modules(module_names):
    for mod_name in list(module_names):
        del sys.modules[mod_name]


@contextlib.contextmanager
def save_pkg_resources_state():
    saved = pkg_resources.__getstate__()
    try:
        yield saved
    finally:
        pkg_resources.__setstate__(saved)


@contextlib.contextmanager
def setup_context(setup_dir):
    temp_dir = os.path.join(setup_dir, 'temp')
    with save_pkg_resources_state():
        with save_modules():
            hide_setuptools()
            with save_path():
                with save_argv():
                    with override_temp(temp_dir):
                        with pushd(setup_dir):
                            # ensure setuptools commands are available
                            __import__('setuptools')
                            yield


def _needs_hiding(mod_name):
    """
    >>> _needs_hiding('setuptools')
    True
    >>> _needs_hiding('pkg_resources')
    True
    >>> _needs_hiding('setuptools_plugin')
    False
    >>> _needs_hiding('setuptools.__init__')
    True
    >>> _needs_hiding('distutils')
    True
    >>> _needs_hiding('os')
    False
    >>> _needs_hiding('Cython')
    True
    """
    pattern = re.compile(r'(setuptools|pkg_resources|distutils|Cython)(\.|$)')
    return bool(pattern.match(mod_name))


def hide_setuptools():
    """
    Remove references to setuptools' modules from sys.modules to allow the
    invocation to import the most appropriate setuptools. This technique is
    necessary to avoid issues such as #315 where setuptools upgrading itself
    would fail to find a function declared in the metadata.
    """
    modules = filter(_needs_hiding, sys.modules)
    _clear_modules(modules)


def run_setup(setup_script, args):
    """Run a distutils setup script, sandboxed in its directory"""
    setup_dir = os.path.abspath(os.path.dirname(setup_script))
    with setup_context(setup_dir):
        try:
            sys.argv[:] = [setup_script] + list(args)
            sys.path.insert(0, setup_dir)
            # reset to include setup dir, w/clean callback list
            working_set.__init__()
            working_set.callbacks.append(lambda dist: dist.activate())

            # __file__ should be a byte string on Python 2 (#712)
            dunder_file = (
                setup_script
                if isinstance(setup_script, str) else
                setup_script.encode(sys.getfilesystemencoding())
            )

            with DirectorySandbox(setup_dir):
                ns = dict(__file__=dunder_file, __name__='__main__')
                _execfile(setup_script, ns)
        except SystemExit as v:
            if v.args and v.args[0]:
                raise
            # Normal exit, just return


class AbstractSandbox:
    """Wrap 'os' module and 'open()' builtin for virtualizing setup scripts"""

    _active = False

    def __init__(self):
        self._attrs = [
            name for name in dir(_os)
            if not name.startswith('_') and hasattr(self, name)
        ]

    def _copy(self, source):
        for name in self._attrs:
            setattr(os, name, getattr(source, name))

    def __enter__(self):
        self._copy(self)
        if _file:
            builtins.file = self._file
        builtins.open = self._open
        self._active = True

    def __exit__(self, exc_type, exc_value, traceback):
        self._active = False
        if _file:
            builtins.file = _file
        builtins.open = _open
        self._copy(_os)

    def run(self, func):
        """Run 'func' under os sandboxing"""
        with self:
            return func()

    def _mk_dual_path_wrapper(name):
        original = getattr(_os, name)

        def wrap(self, src, dst, *args, **kw):
            if self._active:
                src, dst = self._remap_pair(name, src, dst, *args, **kw)
            return original(src, dst, *args, **kw)

        return wrap

    for name in ["rename", "link", "symlink"]:
        if hasattr(_os, name):
            locals()[name] = _mk_dual_path_wrapper(name)

    def _mk_single_path_wrapper(name, original=None):
        original = original or getattr(_os, name)

        def wrap(self, path, *args, **kw):
            if self._active:
                path = self._remap_input(name, path, *args, **kw)
            return original(path, *args, **kw)

        return wrap

    if _file:
        _file = _mk_single_path_wrapper('file', _file)
    _open = _mk_single_path_wrapper('open', _open)
    for name in [
        "stat", "listdir", "chdir", "open", "chmod", "chown", "mkdir",
        "remove", "unlink", "rmdir", "utime", "lchown", "chroot", "lstat",
        "startfile", "mkfifo", "mknod", "pathconf", "access"
    ]:
        if hasattr(_os, name):
            locals()[name] = _mk_single_path_wrapper(name)

    def _mk_single_with_return(name):
        original = getattr(_os, name)

        def wrap(self, path, *args, **kw):
            if self._active:
                path = self._remap_input(name, path, *args, **kw)
                return self._remap_output(name, original(path, *args, **kw))
            return original(path, *args, **kw)

        return wrap

    for name in ['readlink', 'tempnam']:
        if hasattr(_os, name):
            locals()[name] = _mk_single_with_return(name)

    def _mk_query(name):
        original = getattr(_os, name)

        def wrap(self, *args, **kw):
            retval = original(*args, **kw)
            if self._active:
                return self._remap_output(name, retval)
            return retval

        return wrap

    for name in ['getcwd', 'tmpnam']:
        if hasattr(_os, name):
            locals()[name] = _mk_query(name)

    def _validate_path(self, path):
        """Called to remap or validate any path, whether input or output"""
        return path

    def _remap_input(self, operation, path, *args, **kw):
        """Called for path inputs"""
        return self._validate_path(path)

    def _remap_output(self, operation, path):
        """Called for path outputs"""
        return self._validate_path(path)

    def _remap_pair(self, operation, src, dst, *args, **kw):
        """Called for path pairs like rename, link, and symlink operations"""
        return (
            self._remap_input(operation + '-from', src, *args, **kw),
            self._remap_input(operation + '-to', dst, *args, **kw)
        )


if hasattr(os, 'devnull'):
    _EXCEPTIONS = [os.devnull,]
else:
    _EXCEPTIONS = []


class DirectorySandbox(AbstractSandbox):
    """Restrict operations to a single subdirectory - pseudo-chroot"""

    write_ops = dict.fromkeys([
        "open", "chmod", "chown", "mkdir", "remove", "unlink", "rmdir",
        "utime", "lchown", "chroot", "mkfifo", "mknod", "tempnam",
    ])

    _exception_patterns = [
        # Allow lib2to3 to attempt to save a pickled grammar object (#121)
        r'.*lib2to3.*\.pickle$',
    ]
    "exempt writing to paths that match the pattern"

    def __init__(self, sandbox, exceptions=_EXCEPTIONS):
        self._sandbox = os.path.normcase(os.path.realpath(sandbox))
        self._prefix = os.path.join(self._sandbox, '')
        self._exceptions = [
            os.path.normcase(os.path.realpath(path))
            for path in exceptions
        ]
        AbstractSandbox.__init__(self)

    def _violation(self, operation, *args, **kw):
        from setuptools.sandbox import SandboxViolation
        raise SandboxViolation(operation, args, kw)

    if _file:

        def _file(self, path, mode='r', *args, **kw):
            if mode not in ('r', 'rt', 'rb', 'rU', 'U') and not self._ok(path):
                self._violation("file", path, mode, *args, **kw)
            return _file(path, mode, *args, **kw)

    def _open(self, path, mode='r', *args, **kw):
        if mode not in ('r', 'rt', 'rb', 'rU', 'U') and not self._ok(path):
            self._violation("open", path, mode, *args, **kw)
        return _open(path, mode, *args, **kw)

    def tmpnam(self):
        self._violation("tmpnam")

    def _ok(self, path):
        active = self._active
        try:
            self._active = False
            realpath = os.path.normcase(os.path.realpath(path))
            return (
                self._exempted(realpath)
                or realpath == self._sandbox
                or realpath.startswith(self._prefix)
            )
        finally:
            self._active = active

    def _exempted(self, filepath):
        start_matches = (
            filepath.startswith(exception)
            for exception in self._exceptions
        )
        pattern_matches = (
            re.match(pattern, filepath)
            for pattern in self._exception_patterns
        )
        candidates = itertools.chain(start_matches, pattern_matches)
        return any(candidates)

    def _remap_input(self, operation, path, *args, **kw):
        """Called for path inputs"""
        if operation in self.write_ops and not self._ok(path):
            self._violation(operation, os.path.realpath(path), *args, **kw)
        return path

    def _remap_pair(self, operation, src, dst, *args, **kw):
        """Called for path pairs like rename, link, and symlink operations"""
        if not self._ok(src) or not self._ok(dst):
            self._violation(operation, src, dst, *args, **kw)
        return (src, dst)

    def open(self, file, flags, mode=0o777, *args, **kw):
        """Called for low-level os.open()"""
        if flags & WRITE_FLAGS and not self._ok(file):
            self._violation("os.open", file, flags, mode, *args, **kw)
        return _os.open(file, flags, mode, *args, **kw)


WRITE_FLAGS = functools.reduce(
    operator.or_, [getattr(_os, a, 0) for a in
        "O_WRONLY O_RDWR O_APPEND O_CREAT O_TRUNC O_TEMPORARY".split()]
)


class SandboxViolation(DistutilsError):
    """A setup script attempted to modify the filesystem outside the sandbox"""

    tmpl = textwrap.dedent("""
        SandboxViolation: {cmd}{args!r} {kwargs}

        The package setup script has attempted to modify files on your system
        that are not within the EasyInstall build area, and has been aborted.

        This package cannot be safely installed by EasyInstall, and may not
        support alternate installation locations even if you run its setup
        script by hand.  Please inform the package's author and the EasyInstall
        maintainers to find out if a fix or workaround is available.
        """).lstrip()

    def __str__(self):
        cmd, args, kwargs = self.args
        return self.tmpl.format(**locals())

site-packages/setuptools/build_meta.py

"""A PEP 517 interface to setuptools

Previously, when a user or a command line tool (let's call it a "frontend")
needed to make a request of setuptools to take a certain action, for
example, generating a list of installation requirements, the frontend would
would call "setup.py egg_info" or "setup.py bdist_wheel" on the command line.

PEP 517 defines a different method of interfacing with setuptools. Rather
than calling "setup.py" directly, the frontend should:

  1. Set the current directory to the directory with a setup.py file
  2. Import this module into a safe python interpreter (one in which
     setuptools can potentially set global variables or crash hard).
  3. Call one of the functions defined in PEP 517.

What each function does is defined in PEP 517. However, here is a "casual"
definition of the functions (this definition should not be relied on for
bug reports or API stability):

  - `build_wheel`: build a wheel in the folder and return the basename
  - `get_requires_for_build_wheel`: get the `setup_requires` to build
  - `prepare_metadata_for_build_wheel`: get the `install_requires`
  - `build_sdist`: build an sdist in the folder and return the basename
  - `get_requires_for_build_sdist`: get the `setup_requires` to build

Again, this is not a formal definition! Just a "taste" of the module.
"""

import os
import sys
import tokenize
import shutil
import contextlib

import setuptools
import distutils


class SetupRequirementsError(BaseException):
    def __init__(self, specifiers):
        self.specifiers = specifiers


class Distribution(setuptools.dist.Distribution):
    def fetch_build_eggs(self, specifiers):
        raise SetupRequirementsError(specifiers)

    @classmethod
    @contextlib.contextmanager
    def patch(cls):
        """
        Replace distutils.core.Distribution with this class
        for the duration of this context.
        """
        orig = distutils.core.Distribution
        distutils.core.Distribution = cls
        try:
            yield
        finally:
            distutils.core.Distribution = orig


def _run_setup(setup_script='setup.py'):
    # Note that we can reuse our build directory between calls
    # Correctness comes first, then optimization later
    __file__ = setup_script
    __name__ = '__main__'
    f = getattr(tokenize, 'open', open)(__file__)
    code = f.read().replace('\\r\\n', '\\n')
    f.close()
    exec(compile(code, __file__, 'exec'), locals())


def _fix_config(config_settings):
    config_settings = config_settings or {}
    config_settings.setdefault('--global-option', [])
    return config_settings


def _get_build_requires(config_settings):
    config_settings = _fix_config(config_settings)
    requirements = ['setuptools', 'wheel']

    sys.argv = sys.argv[:1] + ['egg_info'] + \
        config_settings["--global-option"]
    try:
        with Distribution.patch():
            _run_setup()
    except SetupRequirementsError as e:
        requirements += e.specifiers

    return requirements


def _get_immediate_subdirectories(a_dir):
    return [name for name in os.listdir(a_dir)
            if os.path.isdir(os.path.join(a_dir, name))]


def get_requires_for_build_wheel(config_settings=None):
    config_settings = _fix_config(config_settings)
    return _get_build_requires(config_settings)


def get_requires_for_build_sdist(config_settings=None):
    config_settings = _fix_config(config_settings)
    return _get_build_requires(config_settings)


def prepare_metadata_for_build_wheel(metadata_directory, config_settings=None):
    sys.argv = sys.argv[:1] + ['dist_info', '--egg-base', metadata_directory]
    _run_setup()
    
    dist_info_directory = metadata_directory
    while True:    
        dist_infos = [f for f in os.listdir(dist_info_directory)
                      if f.endswith('.dist-info')]

        if len(dist_infos) == 0 and \
                len(_get_immediate_subdirectories(dist_info_directory)) == 1:
            dist_info_directory = os.path.join(
                dist_info_directory, os.listdir(dist_info_directory)[0])
            continue

        assert len(dist_infos) == 1
        break

    # PEP 517 requires that the .dist-info directory be placed in the
    # metadata_directory. To comply, we MUST copy the directory to the root
    if dist_info_directory != metadata_directory:
        shutil.move(
            os.path.join(dist_info_directory, dist_infos[0]),
            metadata_directory)
        shutil.rmtree(dist_info_directory, ignore_errors=True)

    return dist_infos[0]


def build_wheel(wheel_directory, config_settings=None,
                metadata_directory=None):
    config_settings = _fix_config(config_settings)
    wheel_directory = os.path.abspath(wheel_directory)
    sys.argv = sys.argv[:1] + ['bdist_wheel'] + \
        config_settings["--global-option"]
    _run_setup()
    if wheel_directory != 'dist':
        shutil.rmtree(wheel_directory)
        shutil.copytree('dist', wheel_directory)

    wheels = [f for f in os.listdir(wheel_directory)
              if f.endswith('.whl')]

    assert len(wheels) == 1
    return wheels[0]


def build_sdist(sdist_directory, config_settings=None):
    config_settings = _fix_config(config_settings)
    sdist_directory = os.path.abspath(sdist_directory)
    sys.argv = sys.argv[:1] + ['sdist'] + \
        config_settings["--global-option"]
    _run_setup()
    if sdist_directory != 'dist':
        shutil.rmtree(sdist_directory)
        shutil.copytree('dist', sdist_directory)

    sdists = [f for f in os.listdir(sdist_directory)
              if f.endswith('.tar.gz')]

    assert len(sdists) == 1
    return sdists[0]

site-packages/setuptools/windows_support.py

import platform
import ctypes


def windows_only(func):
    if platform.system() != 'Windows':
        return lambda *args, **kwargs: None
    return func


@windows_only
def hide_file(path):
    """
    Set the hidden attribute on a file or directory.

    From http://stackoverflow.com/questions/19622133/

    `path` must be text.
    """
    __import__('ctypes.wintypes')
    SetFileAttributes = ctypes.windll.kernel32.SetFileAttributesW
    SetFileAttributes.argtypes = ctypes.wintypes.LPWSTR, ctypes.wintypes.DWORD
    SetFileAttributes.restype = ctypes.wintypes.BOOL

    FILE_ATTRIBUTE_HIDDEN = 0x02

    ret = SetFileAttributes(path, FILE_ATTRIBUTE_HIDDEN)
    if not ret:
        raise ctypes.WinError()
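
# Illustrative call (the path is an assumption); on non-Windows platforms the
# windows_only decorator above turns hide_file() into a no-op:
#
#   hide_file(u'C:\\build\\tempdir')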

site-packages/setuptools/version.py

import pkg_resources

try:
    __version__ = pkg_resources.get_distribution('setuptools').version
except Exception:
    __version__ = 'unknown'

site-packages/setuptools/py31compat.py

__all__ = ['get_config_vars', 'get_path']

try:
    # Python 2.7 or >=3.2
    from sysconfig import get_config_vars, get_path
except ImportError:
    from distutils.sysconfig import get_config_vars, get_python_lib

    def get_path(name):
        if name not in ('platlib', 'purelib'):
            raise ValueError("Name must be purelib or platlib")
        return get_python_lib(name == 'platlib')


try:
    # Python >=3.2
    from tempfile import TemporaryDirectory
except ImportError:
    import shutil
    import tempfile

    class TemporaryDirectory(object):
        """
        Very simple temporary directory context manager.
        Will try to delete afterward, but will also ignore OS and similar
        errors on deletion.
        """

        def __init__(self):
            self.name = None  # Handle mkdtemp raising an exception
            self.name = tempfile.mkdtemp()

        def __enter__(self):
            return self.name

        def __exit__(self, exctype, excvalue, exctrace):
            try:
                shutil.rmtree(self.name, True)
            except OSError:  # removal errors are not the only possible
                pass
            self.name = None

site-packages/dnf/logging.py

# logging.py
# DNF Logging Subsystem.
#
# Copyright (C) 2013-2016 Red Hat, Inc.
#
# This copyrighted material is made available to anyone wishing to use,
# modify, copy, or redistribute it subject to the terms and conditions of
# the GNU General Public License v.2, or (at your option) any later version.
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY expressed or implied, including the implied warranties of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General
# Public License for more details.  You should have received a copy of the
# GNU General Public License along with this program; if not, write to the
# Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
# 02110-1301, USA.  Any Red Hat trademarks that are incorporated in the
# source code or documentation are not subject to the GNU General Public
# License and may only be used or replicated with the express permission of
# Red Hat, Inc.
#

from __future__ import absolute_import
from __future__ import unicode_literals
import dnf.exceptions
import dnf.const
import dnf.lock
import dnf.util
import libdnf.repo
import logging
import logging.handlers
import os
import sys
import time
import warnings
import gzip

# :api loggers are: 'dnf', 'dnf.plugin', 'dnf.rpm'

SUPERCRITICAL = 100 # do not use this for logging
CRITICAL = logging.CRITICAL
ERROR = logging.ERROR
WARNING = logging.WARNING
INFO = logging.INFO
DEBUG = logging.DEBUG
DDEBUG = 8  # used by anaconda (pyanaconda/payload/dnfpayload.py)
SUBDEBUG = 6
TRACE = 4
ALL = 2

def only_once(func):
    """Method decorator turning the method into noop on second or later calls."""
    def noop(*_args, **_kwargs):
        pass
    def swan_song(self, *args, **kwargs):
        func(self, *args, **kwargs)
        setattr(self, func.__name__, noop)
    return swan_song
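
# Illustrative sketch of the decorator (the class below is hypothetical):
# the decorated method runs on the first call only and becomes a no-op on
# that instance afterwards.
#
#   class Example(object):
#       @only_once
#       def setup(self):
#           print("configured")
#
#   e = Example()
#   e.setup()   # prints "configured"
#   e.setup()   # no-op; the method was replaced on the instance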

class _MaxLevelFilter(object):
    def __init__(self, max_level):
        self.max_level = max_level

    def filter(self, record):
        if record.levelno >= self.max_level:
            return 0
        return 1

_VERBOSE_VAL_MAPPING = {
    0 : SUPERCRITICAL,
    1 : logging.INFO,
    2 : logging.INFO, # the default
    3 : logging.DEBUG,
    4 : logging.DEBUG,
    5 : logging.DEBUG,
    6 : logging.DEBUG, # verbose value
    7 : DDEBUG,
    8 : SUBDEBUG,
    9 : TRACE,
    10: ALL,   # more verbose librepo and hawkey
    }

def _cfg_verbose_val2level(cfg_errval):
    assert 0 <= cfg_errval <= 10
    return _VERBOSE_VAL_MAPPING.get(cfg_errval, TRACE)


# Both the DNF default and the verbose default are WARNING. Note that ERROR has
# no specific level.
_ERR_VAL_MAPPING = {
    0: SUPERCRITICAL,
    1: logging.CRITICAL,
    2: logging.ERROR
    }

def _cfg_err_val2level(cfg_errval):
    assert 0 <= cfg_errval <= 10
    return _ERR_VAL_MAPPING.get(cfg_errval, logging.WARNING)


def compression_namer(name):
    return name + ".gz"


CHUNK_SIZE = 128 * 1024 # 128 KB


def compression_rotator(source, dest):
    with open(source, "rb") as sf:
        with gzip.open(dest, 'wb') as wf:
            while True:
                data = sf.read(CHUNK_SIZE)
                if not data:
                    break
                wf.write(data)
    os.remove(source)
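
# Illustrative wiring of these hooks into a rotating handler (the log path
# and size limits are assumptions): once `rotator` and `namer` are set,
# rotated files are gzip-compressed and given a ".gz" suffix.
#
#   handler = logging.handlers.RotatingFileHandler(
#       "/var/log/example.log", maxBytes=1024 * 1024, backupCount=4)
#   handler.rotator = compression_rotator
#   handler.namer = compression_namer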


class MultiprocessRotatingFileHandler(logging.handlers.RotatingFileHandler):
    def __init__(self, filename, mode='a', maxBytes=0, backupCount=0, encoding=None, delay=False):
        super(MultiprocessRotatingFileHandler, self).__init__(
            filename, mode, maxBytes, backupCount, encoding, delay)
        self.rotate_lock = dnf.lock.build_log_lock("/var/log/", True)

    def emit(self, record):
        while True:
            try:
                if self.shouldRollover(record):
                    with self.rotate_lock:
                        # Do rollover while preserving the mode of the new log file
                        mode = os.stat(self.baseFilename).st_mode
                        self.doRollover()
                        os.chmod(self.baseFilename, mode)
                logging.FileHandler.emit(self, record)
                return
            except (dnf.exceptions.ProcessLockError, dnf.exceptions.ThreadLockError):
                time.sleep(0.01)
            except Exception:
                self.handleError(record)
                return


def _create_filehandler(logfile, log_size, log_rotate, log_compress):
    if not os.path.exists(logfile):
        dnf.util.ensure_dir(os.path.dirname(logfile))
        dnf.util.touch(logfile)
    handler = MultiprocessRotatingFileHandler(logfile, maxBytes=log_size, backupCount=log_rotate)
    formatter = logging.Formatter("%(asctime)s %(levelname)s %(message)s",
                                  "%Y-%m-%dT%H:%M:%S%z")
    formatter.converter = time.localtime
    handler.setFormatter(formatter)
    if log_compress:
        handler.rotator = compression_rotator
        handler.namer = compression_namer
    return handler

def _paint_mark(logger):
    logger.log(INFO, dnf.const.LOG_MARKER)


class Logging(object):
    def __init__(self):
        self.stdout_handler = self.stderr_handler = None
        logging.addLevelName(DDEBUG, "DDEBUG")
        logging.addLevelName(SUBDEBUG, "SUBDEBUG")
        logging.addLevelName(TRACE, "TRACE")
        logging.addLevelName(ALL, "ALL")
        logging.captureWarnings(True)
        logging.raiseExceptions = False

    @only_once
    def _presetup(self):
        logger_dnf = logging.getLogger("dnf")
        logger_dnf.setLevel(TRACE)

        # setup stdout
        stdout = logging.StreamHandler(sys.stdout)
        stdout.setLevel(INFO)
        stdout.addFilter(_MaxLevelFilter(logging.WARNING))
        logger_dnf.addHandler(stdout)
        self.stdout_handler = stdout

        # setup stderr
        stderr = logging.StreamHandler(sys.stderr)
        stderr.setLevel(WARNING)
        logger_dnf.addHandler(stderr)
        self.stderr_handler = stderr

    @only_once
    def _setup_file_loggers(self, logfile_level, logdir, log_size, log_rotate, log_compress):
        logger_dnf = logging.getLogger("dnf")
        logger_dnf.setLevel(TRACE)

        # setup file logger
        logfile = os.path.join(logdir, dnf.const.LOG)
        handler = _create_filehandler(logfile, log_size, log_rotate, log_compress)
        handler.setLevel(logfile_level)
        logger_dnf.addHandler(handler)

        # setup Python warnings
        logger_warnings = logging.getLogger("py.warnings")
        logger_warnings.addHandler(handler)

        logger_librepo = logging.getLogger("librepo")
        logger_librepo.setLevel(TRACE)
        logfile = os.path.join(logdir, dnf.const.LOG_LIBREPO)
        handler = _create_filehandler(logfile, log_size, log_rotate, log_compress)
        logger_librepo.addHandler(handler)
        libdnf.repo.LibrepoLog.addHandler(logfile, logfile_level <= ALL)

        # setup RPM callbacks logger
        logger_rpm = logging.getLogger("dnf.rpm")
        logger_rpm.propagate = False
        logger_rpm.setLevel(SUBDEBUG)
        logfile = os.path.join(logdir, dnf.const.LOG_RPM)
        handler = _create_filehandler(logfile, log_size, log_rotate, log_compress)
        logger_rpm.addHandler(handler)

    @only_once
    def _setup(self, verbose_level, error_level, logfile_level, logdir, log_size, log_rotate, log_compress):
        self._presetup()

        self._setup_file_loggers(logfile_level, logdir, log_size, log_rotate, log_compress)

        logger_warnings = logging.getLogger("py.warnings")
        logger_warnings.addHandler(self.stderr_handler)

        # setup RPM callbacks logger
        logger_rpm = logging.getLogger("dnf.rpm")
        logger_rpm.addHandler(self.stdout_handler)
        logger_rpm.addHandler(self.stderr_handler)

        logger_dnf = logging.getLogger("dnf")
        # temporarily turn off stdout/stderr handlers:
        self.stdout_handler.setLevel(WARNING)
        self.stderr_handler.setLevel(WARNING)
        _paint_mark(logger_dnf)
        _paint_mark(logger_rpm)
        # bring std handlers to the preferred level
        self.stdout_handler.setLevel(verbose_level)
        self.stderr_handler.setLevel(error_level)

    def _setup_from_dnf_conf(self, conf, file_loggers_only=False):
        verbose_level_r = _cfg_verbose_val2level(conf.debuglevel)
        error_level_r = _cfg_err_val2level(conf.errorlevel)
        logfile_level_r = _cfg_verbose_val2level(conf.logfilelevel)
        logdir = conf.logdir
        log_size = conf.log_size
        log_rotate = conf.log_rotate
        log_compress = conf.log_compress
        if file_loggers_only:
            return self._setup_file_loggers(logfile_level_r, logdir, log_size, log_rotate, log_compress)
        else:
            return self._setup(
                verbose_level_r, error_level_r, logfile_level_r, logdir, log_size, log_rotate, log_compress)


class Timer(object):
    def __init__(self, what):
        self.what = what
        self.start = time.time()

    def __call__(self):
        diff = time.time() - self.start
        msg = 'timer: %s: %d ms' % (self.what, diff * 1000)
        logging.getLogger("dnf").log(DDEBUG, msg)


_LIBDNF_TO_DNF_LOGLEVEL_MAPPING = {
    libdnf.utils.Logger.Level_CRITICAL: CRITICAL,
    libdnf.utils.Logger.Level_ERROR: ERROR,
    libdnf.utils.Logger.Level_WARNING: WARNING,
    libdnf.utils.Logger.Level_NOTICE: INFO,
    libdnf.utils.Logger.Level_INFO: INFO,
    libdnf.utils.Logger.Level_DEBUG: DEBUG,
    libdnf.utils.Logger.Level_TRACE: TRACE
}


class LibdnfLoggerCB(libdnf.utils.Logger):
    def __init__(self):
        super(LibdnfLoggerCB, self).__init__()
        self._dnf_logger = logging.getLogger("dnf")
        self._librepo_logger = logging.getLogger("librepo")

    def write(self, source, *args):
        """Log message.

        source -- integer, defines origin (libdnf, librepo, ...) of message, 0 - unknown
        """
        if len(args) == 2:
            level, message = args
        elif len(args) == 4:
            time, pid, level, message = args
        if source == libdnf.utils.Logger.LOG_SOURCE_LIBREPO:
            self._librepo_logger.log(_LIBDNF_TO_DNF_LOGLEVEL_MAPPING[level], message)
        else:
            self._dnf_logger.log(_LIBDNF_TO_DNF_LOGLEVEL_MAPPING[level], message)


libdnfLoggerCB = LibdnfLoggerCB()
libdnf.utils.Log.setLogger(libdnfLoggerCB)

site-packages/dnf/base.py

# Copyright 2005 Duke University
# Copyright (C) 2012-2018 Red Hat, Inc.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU Library General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.

"""
Supplies the Base class.
"""

from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
from __future__ import unicode_literals

import argparse
import dnf
import libdnf.transaction

from copy import deepcopy
from dnf.comps import CompsQuery
from dnf.i18n import _, P_, ucd
from dnf.util import _parse_specs
from dnf.db.history import SwdbInterface
from dnf.yum import misc
try:
    from collections.abc import Sequence
except ImportError:
    from collections import Sequence
import datetime
import dnf.callback
import dnf.comps
import dnf.conf
import dnf.conf.read
import dnf.crypto
import dnf.dnssec
import dnf.drpm
import dnf.exceptions
import dnf.goal
import dnf.history
import dnf.lock
import dnf.logging
# WITH_MODULES is used by ansible (lib/ansible/modules/packaging/os/dnf.py)
try:
    import dnf.module.module_base
    WITH_MODULES = True
except ImportError:
    WITH_MODULES = False
import dnf.persistor
import dnf.plugin
import dnf.query
import dnf.repo
import dnf.repodict
import dnf.rpm.connection
import dnf.rpm.miscutils
import dnf.rpm.transaction
import dnf.sack
import dnf.selector
import dnf.subject
import dnf.transaction
import dnf.util
import dnf.yum.rpmtrans
import functools
import gc
import hawkey
import itertools
import logging
import math
import os
import operator
import re
import rpm
import time
import shutil


logger = logging.getLogger("dnf")


class Base(object):

    def __init__(self, conf=None):
        # :api
        self._closed = False
        self._conf = conf or self._setup_default_conf()
        self._goal = None
        self._repo_persistor = None
        self._sack = None
        self._transaction = None
        self._priv_ts = None
        self._comps = None
        self._comps_trans = dnf.comps.TransactionBunch()
        self._history = None
        self._tempfiles = set()
        self._trans_tempfiles = set()
        self._ds_callback = dnf.callback.Depsolve()
        self._logging = dnf.logging.Logging()
        self._repos = dnf.repodict.RepoDict()
        self._rpm_probfilter = set([rpm.RPMPROB_FILTER_OLDPACKAGE])
        self._plugins = dnf.plugin.Plugins()
        self._trans_success = False
        self._trans_install_set = False
        self._tempfile_persistor = None
        #  self._update_security_filters is used by ansible
        self._update_security_filters = []
        self._update_security_options = {}
        self._allow_erasing = False
        self._repo_set_imported_gpg_keys = set()
        self.output = None

    def __enter__(self):
        return self

    def __exit__(self, *exc_args):
        self.close()

    def __del__(self):
        self.close()

    def _add_tempfiles(self, files):
        if self._transaction:
            self._trans_tempfiles.update(files)
        elif self.conf.destdir:
            pass
        else:
            self._tempfiles.update(files)

    def _add_repo_to_sack(self, repo):
        repo.load()
        mdload_flags = dict(load_filelists=True,
                            load_presto=repo.deltarpm,
                            load_updateinfo=True)
        if repo.load_metadata_other:
            mdload_flags["load_other"] = True
        try:
            self._sack.load_repo(repo._repo, build_cache=True, **mdload_flags)
        except hawkey.Exception as e:
            logger.debug(_("loading repo '{}' failure: {}").format(repo.id, e))
            raise dnf.exceptions.RepoError(
                _("Loading repository '{}' has failed").format(repo.id))

    @staticmethod
    def _setup_default_conf():
        conf = dnf.conf.Conf()
        subst = conf.substitutions
        if 'releasever' not in subst:
            subst['releasever'] = \
                dnf.rpm.detect_releasever(conf.installroot)
        return conf

    def _setup_modular_excludes(self):
        hot_fix_repos = [i.id for i in self.repos.iter_enabled() if i.module_hotfixes]
        try:
            solver_errors = self.sack.filter_modules(
                self._moduleContainer, hot_fix_repos, self.conf.installroot,
                self.conf.module_platform_id, update_only=False, debugsolver=self.conf.debug_solver,
                module_obsoletes=self.conf.module_obsoletes)
        except hawkey.Exception as e:
            raise dnf.exceptions.Error(ucd(e))
        if solver_errors:
            logger.warning(
                dnf.module.module_base.format_modular_solver_errors(solver_errors[0]))

    def _setup_excludes_includes(self, only_main=False):
        disabled = set(self.conf.disable_excludes)
        if 'all' in disabled and WITH_MODULES:
            self._setup_modular_excludes()
            return
        repo_includes = []
        repo_excludes = []
        # first evaluate repo specific includes/excludes
        if not only_main:
            for r in self.repos.iter_enabled():
                if r.id in disabled:
                    continue
                if len(r.includepkgs) > 0:
                    incl_query = self.sack.query().filterm(empty=True)
                    for incl in set(r.includepkgs):
                        subj = dnf.subject.Subject(incl)
                        incl_query = incl_query.union(subj.get_best_query(
                            self.sack, with_nevra=True, with_provides=False, with_filenames=False))
                    incl_query.filterm(reponame=r.id)
                    repo_includes.append((incl_query.apply(), r.id))
                excl_query = self.sack.query().filterm(empty=True)
                for excl in set(r.excludepkgs):
                    subj = dnf.subject.Subject(excl)
                    excl_query = excl_query.union(subj.get_best_query(
                        self.sack, with_nevra=True, with_provides=False, with_filenames=False))
                excl_query.filterm(reponame=r.id)
                if excl_query:
                    repo_excludes.append((excl_query, r.id))

        # then main (global) includes/excludes because they can mask
        # repo specific settings
        if 'main' not in disabled:
            include_query = self.sack.query().filterm(empty=True)
            if len(self.conf.includepkgs) > 0:
                for incl in set(self.conf.includepkgs):
                    subj = dnf.subject.Subject(incl)
                    include_query = include_query.union(subj.get_best_query(
                        self.sack, with_nevra=True, with_provides=False, with_filenames=False))
            exclude_query = self.sack.query().filterm(empty=True)
            for excl in set(self.conf.excludepkgs):
                subj = dnf.subject.Subject(excl)
                exclude_query = exclude_query.union(subj.get_best_query(
                    self.sack, with_nevra=True, with_provides=False, with_filenames=False))
            if len(self.conf.includepkgs) > 0:
                self.sack.add_includes(include_query)
                self.sack.set_use_includes(True)
            if exclude_query:
                self.sack.add_excludes(exclude_query)

        if repo_includes:
            for query, repoid in repo_includes:
                self.sack.add_includes(query)
                self.sack.set_use_includes(True, repoid)

        if repo_excludes:
            for query, repoid in repo_excludes:
                self.sack.add_excludes(query)

        if not only_main and WITH_MODULES:
            self._setup_modular_excludes()

    def _store_persistent_data(self):
        if self._repo_persistor and not self.conf.cacheonly:
            expired = [r.id for r in self.repos.iter_enabled()
                       if (r.metadata and r._repo.isExpired())]
            self._repo_persistor.expired_to_add.update(expired)
            self._repo_persistor.save()

        if self._tempfile_persistor:
            self._tempfile_persistor.save()

    @property
    def comps(self):
        # :api
        if self._comps is None:
            self.read_comps(arch_filter=True)
        return self._comps

    @property
    def conf(self):
        # :api
        return self._conf

    @property
    def repos(self):
        # :api
        return self._repos

    @repos.deleter
    def repos(self):
        # :api
        self._repos = None

    @property
    @dnf.util.lazyattr("_priv_rpmconn")
    def _rpmconn(self):
        return dnf.rpm.connection.RpmConnection(self.conf.installroot)

    @property
    def sack(self):
        # :api
        return self._sack

    @property
    def _moduleContainer(self):
        if self.sack is None:
            raise dnf.exceptions.Error("Sack was not initialized")
        if self.sack._moduleContainer is None:
            self.sack._moduleContainer = libdnf.module.ModulePackageContainer(
                False, self.conf.installroot, self.conf.substitutions["arch"], self.conf.persistdir)
        return self.sack._moduleContainer

    @property
    def transaction(self):
        # :api
        return self._transaction

    @transaction.setter
    def transaction(self, value):
        # :api
        if self._transaction:
            raise ValueError('transaction already set')
        self._transaction = value

    def _activate_persistor(self):
        self._repo_persistor = dnf.persistor.RepoPersistor(self.conf.cachedir)

    def init_plugins(self, disabled_glob=(), enable_plugins=(), cli=None):
        # :api
        """Load plugins and run their __init__()."""
        if self.conf.plugins:
            self._plugins._load(self.conf, disabled_glob, enable_plugins)
        self._plugins._run_init(self, cli)

    def pre_configure_plugins(self):
        # :api
        """Run plugins pre_configure() method."""
        self._plugins._run_pre_config()

    def configure_plugins(self):
        # :api
        """Run plugins configure() method."""
        self._plugins._run_config()

    def unload_plugins(self):
        # :api
        """Run plugins unload() method."""
        self._plugins._unload()

    def update_cache(self, timer=False):
        # :api

        period = self.conf.metadata_timer_sync
        if self._repo_persistor is None:
            self._activate_persistor()
        persistor = self._repo_persistor
        if timer:
            if dnf.util.on_metered_connection():
                msg = _('Metadata timer caching disabled '
                        'when running on metered connection.')
                logger.info(msg)
                return False
            if dnf.util.on_ac_power() is False:
                msg = _('Metadata timer caching disabled '
                        'when running on a battery.')
                logger.info(msg)
                return False
            if period <= 0:
                msg = _('Metadata timer caching disabled.')
                logger.info(msg)
                return False
            since_last_makecache = persistor.since_last_makecache()
            if since_last_makecache is not None and since_last_makecache < period:
                logger.info(_('Metadata cache refreshed recently.'))
                return False
            for repo in self.repos.values():
                repo._repo.setMaxMirrorTries(1)

        if not self.repos._any_enabled():
            logger.info(_('There are no enabled repositories in "{}".').format(
                '", "'.join(self.conf.reposdir)))
            return False

        for r in self.repos.iter_enabled():
            (is_cache, expires_in) = r._metadata_expire_in()
            if expires_in is None:
                logger.info(_('%s: will never be expired and will not be refreshed.'), r.id)
            elif not is_cache or expires_in <= 0:
                logger.debug(_('%s: has expired and will be refreshed.'), r.id)
                r._repo.expire()
            elif timer and expires_in < period:
                # expires within the checking period:
                msg = _("%s: metadata will expire after %d seconds and will be refreshed now")
                logger.debug(msg, r.id, expires_in)
                r._repo.expire()
            else:
                logger.debug(_('%s: will expire after %d seconds.'), r.id,
                             expires_in)

        if timer:
            persistor.reset_last_makecache = True
        self.fill_sack(load_system_repo=False, load_available_repos=True)  # performs the md sync
        logger.info(_('Metadata cache created.'))
        return True
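    # Usage sketch for update_cache() (illustrative only, not upstream code).
    # Assumes `base` is a dnf.Base whose configuration and repositories have
    # already been read:
    #
    #   if base.update_cache(timer=True):
    #       print("metadata cache created")
    #   else:
    #       print("refresh skipped (metered link, battery, recent cache, or disabled)")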

    def fill_sack(self, load_system_repo=True, load_available_repos=True):
        # :api
        """Prepare the Sack and the Goal objects. """
        timer = dnf.logging.Timer('sack setup')
        self.reset(sack=True, goal=True)
        self._sack = dnf.sack._build_sack(self)
        lock = dnf.lock.build_metadata_lock(self.conf.cachedir, self.conf.exit_on_lock)
        with lock:
            if load_system_repo is not False:
                try:
                    # FIXME: If build_cache=True, @System.solv is incorrectly updated in install-
                    # remove loops
                    self._sack.load_system_repo(build_cache=False)
                except IOError:
                    if load_system_repo != 'auto':
                        raise
            if load_available_repos:
                error_repos = []
                mts = 0
                age = time.time()
                # Iterate over installed GPG keys and check their validity using DNSSEC
                if self.conf.gpgkey_dns_verification:
                    dnf.dnssec.RpmImportedKeys.check_imported_keys_validity()
                for r in self.repos.iter_enabled():
                    try:
                        self._add_repo_to_sack(r)
                        if r._repo.getTimestamp() > mts:
                            mts = r._repo.getTimestamp()
                        if r._repo.getAge() < age:
                            age = r._repo.getAge()
                        logger.debug(_("%s: using metadata from %s."), r.id,
                                     dnf.util.normalize_time(
                                         r._repo.getMaxTimestamp()))
                    except dnf.exceptions.RepoError as e:
                        r._repo.expire()
                        if r.skip_if_unavailable is False:
                            raise
                        logger.warning("Error: %s", e)
                        error_repos.append(r.id)
                        r.disable()
                if error_repos:
                    logger.warning(
                        _("Ignoring repositories: %s"), ', '.join(error_repos))
                if self.repos._any_enabled():
                    if age != 0 and mts != 0:
                        logger.info(_("Last metadata expiration check: %s ago on %s."),
                                    datetime.timedelta(seconds=int(age)),
                                    dnf.util.normalize_time(mts))
            else:
                self.repos.all().disable()
        conf = self.conf
        self._sack._configure(conf.installonlypkgs, conf.installonly_limit, conf.allow_vendor_change)
        self._setup_excludes_includes()
        timer()
        self._goal = dnf.goal.Goal(self._sack)
        self._goal.protect_running_kernel = conf.protect_running_kernel
        self._plugins.run_sack()
        return self._sack
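    # Usage sketch for fill_sack() (illustrative only, not upstream code).
    # Assumes a freshly constructed dnf.Base named `base`:
    #
    #   base.read_all_repos()
    #   base.fill_sack(load_system_repo='auto')   # 'auto' tolerates a missing rpmdb
    #   available = base.sack.query().available().filterm(name='bash')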

    def fill_sack_from_repos_in_cache(self, load_system_repo=True):
        # :api
        """
        Prepare the Sack and Goal objects and load all enabled repositories from cache
        only; it doesn't download anything and it doesn't check whether metadata are
        expired. If there is not enough metadata present (repomd.xml, or both primary.xml
        and the solv file, are missing), the given repo is either skipped or a RepoError
        exception is raised, depending on the skip_if_unavailable configuration.
        """
        timer = dnf.logging.Timer('sack setup')
        self.reset(sack=True, goal=True)
        self._sack = dnf.sack._build_sack(self)
        lock = dnf.lock.build_metadata_lock(self.conf.cachedir, self.conf.exit_on_lock)
        with lock:
            if load_system_repo is not False:
                try:
                    # FIXME: If build_cache=True, @System.solv is incorrectly updated in install-
                    # remove loops
                    self._sack.load_system_repo(build_cache=False)
                except IOError:
                    if load_system_repo != 'auto':
                        raise

            error_repos = []
            # Iterate over installed GPG keys and check their validity using DNSSEC
            if self.conf.gpgkey_dns_verification:
                dnf.dnssec.RpmImportedKeys.check_imported_keys_validity()
            for repo in self.repos.iter_enabled():
                try:
                    repo._repo.loadCache(throwExcept=True, ignoreMissing=True)
                    mdload_flags = dict(load_filelists=True,
                                        load_presto=repo.deltarpm,
                                        load_updateinfo=True)
                    if repo.load_metadata_other:
                        mdload_flags["load_other"] = True

                    self._sack.load_repo(repo._repo, **mdload_flags)

                    logger.debug(_("%s: using metadata from %s."), repo.id,
                                 dnf.util.normalize_time(
                                     repo._repo.getMaxTimestamp()))
                except (RuntimeError, hawkey.Exception) as e:
                    if repo.skip_if_unavailable is False:
                        raise dnf.exceptions.RepoError(
                            _("loading repo '{}' failure: {}").format(repo.id, e))
                    else:
                        logger.debug(_("loading repo '{}' failure: {}").format(repo.id, e))
                    error_repos.append(repo.id)
                    repo.disable()
            if error_repos:
                logger.warning(
                    _("Ignoring repositories: %s"), ', '.join(error_repos))

        conf = self.conf
        self._sack._configure(conf.installonlypkgs, conf.installonly_limit, conf.allow_vendor_change)
        self._setup_excludes_includes()
        timer()
        self._goal = dnf.goal.Goal(self._sack)
        self._goal.protect_running_kernel = conf.protect_running_kernel
        self._plugins.run_sack()
        return self._sack
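    # Usage sketch for fill_sack_from_repos_in_cache() (illustrative only, not
    # upstream code). Useful when metadata has already been downloaded (e.g. by
    # an earlier makecache run) and no network access is wanted:
    #
    #   base.read_all_repos()
    #   base.fill_sack_from_repos_in_cache(load_system_repo=False)
    #   pkgs = base.sack.query().available().run()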

    def _finalize_base(self):
        self._tempfile_persistor = dnf.persistor.TempfilePersistor(
            self.conf.cachedir)

        if not self.conf.keepcache:
            self._clean_packages(self._tempfiles)
            if self._trans_success:
                self._trans_tempfiles.update(
                    self._tempfile_persistor.get_saved_tempfiles())
                self._tempfile_persistor.empty()
                if self._trans_install_set:
                    self._clean_packages(self._trans_tempfiles)
            else:
                self._tempfile_persistor.tempfiles_to_add.update(
                    self._trans_tempfiles)

        if self._tempfile_persistor.tempfiles_to_add:
            logger.info(_("The downloaded packages were saved in cache "
                          "until the next successful transaction."))
            logger.info(_("You can remove cached packages by executing "
                          "'%s'."), "{prog} clean packages".format(prog=dnf.util.MAIN_PROG))

        # Do not trigger the lazy creation:
        if self._history is not None:
            self.history.close()
        self._store_persistent_data()
        self._closeRpmDB()
        self._trans_success = False

    def close(self):
        # :api
        """Close all potential handles and clean cache.

        Typically the handles are to data sources and sinks.

        """

        if self._closed:
            return
        logger.log(dnf.logging.DDEBUG, 'Cleaning up.')
        self._closed = True
        self._finalize_base()
        self.reset(sack=True, repos=True, goal=True)
        self._plugins = None

    def read_all_repos(self, opts=None):
        # :api
        """Read repositories from the main conf file and from .repo files."""

        reader = dnf.conf.read.RepoReader(self.conf, opts)
        for repo in reader:
            try:
                self.repos.add(repo)
            except dnf.exceptions.ConfigError as e:
                logger.warning(e)

    def reset(self, sack=False, repos=False, goal=False):
        # :api
        """Make the Base object forget about various things."""
        if sack:
            self._sack = None
        if repos:
            self._repos = dnf.repodict.RepoDict()
        if goal:
            self._goal = None
            if self._sack is not None:
                self._goal = dnf.goal.Goal(self._sack)
                self._goal.protect_running_kernel = self.conf.protect_running_kernel
            if self._sack and self._moduleContainer:
                # sack must be set to enable operations on moduleContainer
                self._moduleContainer.rollback()
            if self._history is not None:
                self.history.close()
            self._comps_trans = dnf.comps.TransactionBunch()
            self._transaction = None
        self._update_security_filters = []
        if sack and goal:
            # We've just done this, above:
            #
            #      _sack                     _goal
            #         |                        |
            #    -- [CUT] --              -- [CUT] --
            #         |                        |
            #         v                |       v
            #    +----------------+   [C]  +-------------+
            #    | DnfSack object | <-[U]- | Goal object |
            #    +----------------+   [T]  +-------------+
            #      |^    |^    |^      |
            #      ||    ||    ||
            #      ||    ||    ||         |
            #   +--||----||----||---+    [C]
            #   |  v|    v|    v|   | <--[U]-- _transaction
            #   | Pkg1  Pkg2  PkgN  |    [T]
            #   |                   |     |
            #   | Transaction object|
            #   +-------------------+
            #
            # At this point, the DnfSack object would be released only
            # eventually, by Python's generational garbage collector, due to the
            # cyclic references DnfSack<->Pkg1 ... DnfSack<->PkgN.
            #
            # The delayed release is a problem: the DnfSack object may
            # (indirectly) own "page file" file descriptors in libsolv, via
            # libdnf. For example,
            #
            #   sack->priv->pool->repos[1]->repodata[1]->store.pagefd = 7
            #   sack->priv->pool->repos[1]->repodata[2]->store.pagefd = 8
            #
            # These file descriptors are closed when the DnfSack object is
            # eventually released, that is, when dnf_sack_finalize() (in libdnf)
            # calls pool_free() (in libsolv).
            #
            # We need that to happen right now, as callers may want to unmount
            # the filesystems which those file descriptors refer to immediately
            # after reset() returns. Therefore, force a garbage collection here.
            gc.collect()
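            # Caller-side sketch (illustrative only, not upstream code): after
            # the collection above, the libsolv page-file descriptors are
            # closed, so a hypothetical caller can safely unmount its
            # installroot right away:
            #
            #   base.reset(sack=True, repos=True, goal=True)
            #   subprocess.check_call(['umount', installroot_path])  # hypothetical caller code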

    def _closeRpmDB(self):
        """Closes down the instances of rpmdb that could be open."""
        del self._ts

    _TS_FLAGS_TO_RPM = {'noscripts': rpm.RPMTRANS_FLAG_NOSCRIPTS,
                        'notriggers': rpm.RPMTRANS_FLAG_NOTRIGGERS,
                        'nodocs': rpm.RPMTRANS_FLAG_NODOCS,
                        'test': rpm.RPMTRANS_FLAG_TEST,
                        'justdb': rpm.RPMTRANS_FLAG_JUSTDB,
                        'nocontexts': rpm.RPMTRANS_FLAG_NOCONTEXTS,
                        'nocrypto': rpm.RPMTRANS_FLAG_NOFILEDIGEST}
    if hasattr(rpm, 'RPMTRANS_FLAG_NOCAPS'):
        # Introduced in rpm-4.14
        _TS_FLAGS_TO_RPM['nocaps'] = rpm.RPMTRANS_FLAG_NOCAPS

    _TS_VSFLAGS_TO_RPM = {'nocrypto': rpm._RPMVSF_NOSIGNATURES |
                          rpm._RPMVSF_NODIGESTS}

    @property
    def goal(self):
        return self._goal

    @property
    def _ts(self):
        """Set up the RPM transaction set that will be used
           for all the work."""
        if self._priv_ts is not None:
            return self._priv_ts
        self._priv_ts = dnf.rpm.transaction.TransactionWrapper(
            self.conf.installroot)
        self._priv_ts.setFlags(0)  # reset everything.
        for flag in self.conf.tsflags:
            rpm_flag = self._TS_FLAGS_TO_RPM.get(flag)
            if rpm_flag is None:
                logger.critical(_('Invalid tsflag in config file: %s'), flag)
                continue
            self._priv_ts.addTsFlag(rpm_flag)
            vs_flag = self._TS_VSFLAGS_TO_RPM.get(flag)
            if vs_flag is not None:
                self._priv_ts.pushVSFlags(vs_flag)

        if not self.conf.diskspacecheck:
            self._rpm_probfilter.add(rpm.RPMPROB_FILTER_DISKSPACE)

        if self.conf.ignorearch:
            self._rpm_probfilter.add(rpm.RPMPROB_FILTER_IGNOREARCH)

        probfilter = functools.reduce(operator.or_, self._rpm_probfilter, 0)
        self._priv_ts.setProbFilter(probfilter)
        return self._priv_ts

    @_ts.deleter
    def _ts(self):
        """Releases the RPM transaction set. """
        if self._priv_ts is None:
            return
        self._priv_ts.close()
        del self._priv_ts
        self._priv_ts = None

    def read_comps(self, arch_filter=False):
        # :api
        """Create the groups object to access the comps metadata."""
        timer = dnf.logging.Timer('loading comps')
        self._comps = dnf.comps.Comps()

        logger.log(dnf.logging.DDEBUG, 'Getting group metadata')
        for repo in self.repos.iter_enabled():
            if not repo.enablegroups:
                continue
            if not repo.metadata:
                continue
            comps_fn = repo._repo.getCompsFn()
            if not comps_fn:
                continue

            logger.log(dnf.logging.DDEBUG,
                       'Adding group file from repository: %s', repo.id)
            if repo._repo.getSyncStrategy() == dnf.repo.SYNC_ONLY_CACHE:
                decompressed = misc.calculate_repo_gen_dest(comps_fn,
                                                            'groups.xml')
                if not os.path.exists(decompressed):
                    # root privileges are needed for comps decompression
                    continue
            else:
                decompressed = misc.repo_gen_decompress(comps_fn, 'groups.xml')

            try:
                self._comps._add_from_xml_filename(decompressed)
            except dnf.exceptions.CompsError as e:
                msg = _('Failed to add groups file for repository: %s - %s')
                logger.critical(msg, repo.id, e)

        if arch_filter:
            self._comps._i.arch_filter(
                [self._conf.substitutions['basearch']])
        timer()
        return self._comps

    def _getHistory(self):
        """auto create the history object that to access/append the transaction
           history information. """
        if self._history is None:
            releasever = self.conf.releasever
            self._history = SwdbInterface(self.conf.persistdir, releasever=releasever)
        return self._history

    history = property(fget=lambda self: self._getHistory(),
                       fset=lambda self, value: setattr(
                           self, "_history", value),
                       fdel=lambda self: setattr(self, "_history", None),
                       doc="DNF SWDB Interface Object")

    def _goal2transaction(self, goal):
        ts = self.history.rpm
        all_obsoleted = set(goal.list_obsoleted())
        installonly_query = self._get_installonly_query()
        installonly_query.apply()
        installonly_query_installed = installonly_query.installed().apply()

        for pkg in goal.list_downgrades():
            obs = goal.obsoleted_by_package(pkg)
            downgraded = obs[0]
            self._ds_callback.pkg_added(downgraded, 'dd')
            self._ds_callback.pkg_added(pkg, 'd')
            ts.add_downgrade(pkg, downgraded, obs[1:])
        for pkg in goal.list_reinstalls():
            self._ds_callback.pkg_added(pkg, 'r')
            obs = goal.obsoleted_by_package(pkg)
            nevra_pkg = str(pkg)
            # reinstall could obsolete multiple packages with the same NEVRA or different NEVRA
            # Set the package with the same NEVRA as reinstalled
            obsoletes = []
            for obs_pkg in obs:
                if str(obs_pkg) == nevra_pkg:
                    obsoletes.insert(0, obs_pkg)
                else:
                    obsoletes.append(obs_pkg)
            reinstalled = obsoletes[0]
            ts.add_reinstall(pkg, reinstalled, obsoletes[1:])
        for pkg in goal.list_installs():
            self._ds_callback.pkg_added(pkg, 'i')
            obs = goal.obsoleted_by_package(pkg)
            # Skip obsoleted packages that are not part of all_obsoleted,
            # they are handled as upgrades/downgrades.
            # Also keep RPMs with the same name - they're not always in all_obsoleted.
            obs = [i for i in obs if i in all_obsoleted or i.name == pkg.name]

            reason = goal.get_reason(pkg)

            #  Inherit the reason if the package is installonly and a package with the
            #  same name is installed. Use the same logic as for upgrade:
            #  an upgrade of installonly packages results in an install, or an install
            #  and remove, step.
            if pkg in installonly_query and installonly_query_installed.filter(name=pkg.name):
                reason = ts.get_reason(pkg)

            # inherit the best reason from obsoleted packages
            for obsolete in obs:
                reason_obsolete = ts.get_reason(obsolete)
                if libdnf.transaction.TransactionItemReasonCompare(reason, reason_obsolete) == -1:
                    reason = reason_obsolete

            ts.add_install(pkg, obs, reason)
            cb = lambda pkg: self._ds_callback.pkg_added(pkg, 'od')
            dnf.util.mapall(cb, obs)
        for pkg in goal.list_upgrades():
            obs = goal.obsoleted_by_package(pkg)
            upgraded = None
            for i in obs:
                # try to find a package with matching name as the upgrade
                if i.name == pkg.name:
                    upgraded = i
                    break
            if upgraded is None:
                # no matching name -> pick the first one
                upgraded = obs.pop(0)
            else:
                obs.remove(upgraded)
            # Skip obsoleted packages that are not part of all_obsoleted,
            # they are handled as upgrades/downgrades.
            # Also keep RPMs with the same name - they're not always in all_obsoleted.
            obs = [i for i in obs if i in all_obsoleted or i.name == pkg.name]

            cb = lambda pkg: self._ds_callback.pkg_added(pkg, 'od')
            dnf.util.mapall(cb, obs)
            if pkg in installonly_query:
                ts.add_install(pkg, obs)
            else:
                ts.add_upgrade(pkg, upgraded, obs)
                self._ds_callback.pkg_added(upgraded, 'ud')
            self._ds_callback.pkg_added(pkg, 'u')
        erasures = goal.list_erasures()
        if erasures:
            remaining_installed_query = self.sack.query(flags=hawkey.IGNORE_EXCLUDES).installed()
            remaining_installed_query.filterm(pkg__neq=erasures)
            for pkg in erasures:
                if remaining_installed_query.filter(name=pkg.name):
                    remaining = remaining_installed_query[0]
                    ts.get_reason(remaining)
                    self.history.set_reason(remaining, ts.get_reason(remaining))
                self._ds_callback.pkg_added(pkg, 'e')
                reason = goal.get_reason(pkg)
                ts.add_erase(pkg, reason)
        return ts

    def _query_matches_installed(self, q):
        """ See what packages in the query match packages (also in older
            versions, but always same architecture) that are already installed.

            Unlike in case of _sltr_matches_installed(), it is practical here
            to know even the packages in the original query that can still be
            installed.
        """
        inst = q.installed()
        inst_per_arch = inst._na_dict()
        avail_per_arch = q.available()._na_dict()
        avail_l = []
        inst_l = []
        for na in avail_per_arch:
            if na in inst_per_arch:
                inst_l.append(inst_per_arch[na][0])
            else:
                avail_l.append(avail_per_arch[na])
        return inst_l, avail_l

    def _sltr_matches_installed(self, sltr):
        """ See if sltr matches a patches that is (in older version or different
            architecture perhaps) already installed.
        """
        inst = self.sack.query().installed().filterm(pkg=sltr.matches())
        return list(inst)

    def iter_userinstalled(self):
        """Get iterator over the packages installed by the user."""
        return (pkg for pkg in self.sack.query().installed()
                if self.history.user_installed(pkg))

    def _run_hawkey_goal(self, goal, allow_erasing):
        ret = goal.run(
            allow_uninstall=allow_erasing, force_best=self.conf.best,
            ignore_weak_deps=(not self.conf.install_weak_deps))
        if self.conf.debug_solver:
            goal.write_debugdata('./debugdata/rpms')
        return ret

    def resolve(self, allow_erasing=False):
        # :api
        """Build the transaction set."""
        exc = None
        self._finalize_comps_trans()

        timer = dnf.logging.Timer('depsolve')
        self._ds_callback.start()
        goal = self._goal
        if goal.req_has_erase():
            goal.push_userinstalled(self.sack.query().installed(),
                                    self.history)
        elif not self.conf.upgrade_group_objects_upgrade:
            # Exclude packages installed from groups.
            # These packages would otherwise be marked for installation,
            # which could prevent their upgrade or downgrade.
            # To prevent a "conflicting job" error, this is not applied
            # to the "remove" and "reinstall" commands.

            solver = self._build_comps_solver()
            solver._exclude_packages_from_installed_groups(self)

        goal.add_protected(self.sack.query().filterm(
            name=self.conf.protected_packages))
        if not self._run_hawkey_goal(goal, allow_erasing):
            if self.conf.debuglevel >= 6:
                goal.log_decisions()
            msg = dnf.util._format_resolve_problems(goal.problem_rules())
            exc = dnf.exceptions.DepsolveError(msg)
        else:
            self._transaction = self._goal2transaction(goal)

        self._ds_callback.end()
        timer()

        got_transaction = self._transaction is not None and \
            len(self._transaction) > 0
        if got_transaction:
            msg = self._transaction._rpm_limitations()
            if msg:
                exc = dnf.exceptions.Error(msg)

        if exc is not None:
            raise exc

        self._plugins.run_resolved()

        # auto-enable module streams based on installed RPMs
        new_pkgs = self._goal.list_installs()
        new_pkgs += self._goal.list_upgrades()
        new_pkgs += self._goal.list_downgrades()
        new_pkgs += self._goal.list_reinstalls()
        self.sack.set_modules_enabled_by_pkgset(self._moduleContainer, new_pkgs)

        return got_transaction

    def do_transaction(self, display=()):
        # :api
        if not isinstance(display, Sequence):
            display = [display]
        display = \
            [dnf.yum.rpmtrans.LoggingTransactionDisplay()] + list(display)

        if not self.transaction:
            # packages are not changed, but comps and module changes need to be committed
            self._moduleContainer.save()
            self._moduleContainer.updateFailSafeData()
            if self._history and (self._history.group or self._history.env):
                cmdline = None
                if hasattr(self, 'args') and self.args:
                    cmdline = ' '.join(self.args)
                elif hasattr(self, 'cmds') and self.cmds:
                    cmdline = ' '.join(self.cmds)
                old = self.history.last()
                if old is None:
                    rpmdb_version = self.sack._rpmdb_version()
                else:
                    rpmdb_version = old.end_rpmdb_version

                self.history.beg(rpmdb_version, [], [], cmdline)
                self.history.end(rpmdb_version)
            self._plugins.run_pre_transaction()
            self._plugins.run_transaction()
            self._trans_success = True
            return

        tid = None
        logger.info(_('Running transaction check'))
        lock = dnf.lock.build_rpmdb_lock(self.conf.persistdir,
                                         self.conf.exit_on_lock)
        with lock:
            self.transaction._populate_rpm_ts(self._ts)

            msgs = self._run_rpm_check()
            if msgs:
                msg = _('Error: transaction check vs depsolve:')
                logger.error(msg)
                for msg in msgs:
                    logger.error(msg)
                raise dnf.exceptions.TransactionCheckError(msg)

            logger.info(_('Transaction check succeeded.'))

            timer = dnf.logging.Timer('transaction test')
            logger.info(_('Running transaction test'))

            self._ts.order()  # order the transaction
            self._ts.clean()  # release memory not needed beyond this point

            testcb = dnf.yum.rpmtrans.RPMTransaction(self, test=True)
            tserrors = self._ts.test(testcb)

            if len(tserrors) > 0:
                for msg in testcb.messages():
                    logger.critical(_('RPM: {}').format(msg))
                errstring = _('Transaction test error:') + '\n'
                for descr in tserrors:
                    errstring += '  %s\n' % ucd(descr)

                summary = self._trans_error_summary(errstring)
                if summary:
                    errstring += '\n' + summary

                raise dnf.exceptions.Error(errstring)
            del testcb

            logger.info(_('Transaction test succeeded.'))
            #  With RPMTRANS_FLAG_TEST return just before anything is stored permanently
            if self._ts.isTsFlagSet(rpm.RPMTRANS_FLAG_TEST):
                return
            timer()

            # save module states on disk right before entering rpm transaction,
            # because we want system in recoverable state if transaction gets interrupted
            self._moduleContainer.save()
            self._moduleContainer.updateFailSafeData()

            # unset the sigquit handler
            timer = dnf.logging.Timer('transaction')
            # setup our rpm ts callback
            cb = dnf.yum.rpmtrans.RPMTransaction(self, displays=display)
            if self.conf.debuglevel < 2:
                for display_ in cb.displays:
                    display_.output = False

            self._plugins.run_pre_transaction()

            logger.info(_('Running transaction'))
            tid = self._run_transaction(cb=cb)
        timer()
        self._plugins.unload_removed_plugins(self.transaction)
        self._plugins.run_transaction()

        # log post transaction summary
        def _pto_callback(action, tsis):
            msgs = []
            for tsi in tsis:
                msgs.append('{}: {}'.format(action, str(tsi)))
            return msgs
        for msg in dnf.util._post_transaction_output(self, self.transaction, _pto_callback):
            logger.debug(msg)

        return tid
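    # End-to-end usage sketch for resolve()/do_transaction() (illustrative only,
    # not upstream code). install() belongs to the same Base :api surface but is
    # not shown in this excerpt; `base` is assumed to have its sack filled:
    #
    #   base.install('bash')                      # queue a request on the goal
    #   if base.resolve(allow_erasing=False):
    #       base.download_packages(base.transaction.install_set)
    #       base.do_transaction()
    #   base.close()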

    def _trans_error_summary(self, errstring):
        """Parse the error string for 'interesting' errors which can
        be grouped, such as disk space issues.

        :param errstring: the error string
        :return: a string containing a summary of the errors
        """
        summary = ''
        # do disk space report first
        p = re.compile(r'needs (\d+)(K|M)B(?: more space)? on the (\S+) filesystem')
        disk = {}
        for m in p.finditer(errstring):
            size_in_mb = int(m.group(1)) if m.group(2) == 'M' else math.ceil(
                int(m.group(1)) / 1024.0)
            if m.group(3) not in disk:
                disk[m.group(3)] = size_in_mb
            if disk[m.group(3)] < size_in_mb:
                disk[m.group(3)] = size_in_mb

        if disk:
            summary += _('Disk Requirements:') + "\n"
            for k in disk:
                summary += "   " + P_(
                    'At least {0}MB more space needed on the {1} filesystem.',
                    'At least {0}MB more space needed on the {1} filesystem.',
                    disk[k]).format(disk[k], k) + '\n'

        if not summary:
            return None

        summary = _('Error Summary') + '\n-------------\n' + summary

        return summary
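    # Example of the kind of rpm problem string the regex above is meant to match
    # (illustrative; the exact wording comes from rpm's diskspace checks):
    #
    #   "installing package foo-1.0-1.x86_64 needs 25MB more space on the / filesystem"
    #
    # group(1) -> "25", group(2) -> "M", group(3) -> "/"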

    def _record_history(self):
        return self.conf.history_record and \
            not self._ts.isTsFlagSet(rpm.RPMTRANS_FLAG_TEST)

    def _run_transaction(self, cb):
        """
        Perform the RPM transaction.

        :return: history database transaction ID or None
        """

        tid = None
        if self._record_history():
            using_pkgs_pats = list(self.conf.history_record_packages)
            installed_query = self.sack.query().installed()
            using_pkgs = installed_query.filter(name=using_pkgs_pats).run()
            rpmdbv = self.sack._rpmdb_version()
            lastdbv = self.history.last()
            if lastdbv is not None:
                lastdbv = lastdbv.end_rpmdb_version

            if lastdbv is None or rpmdbv != lastdbv:
                logger.debug(_("RPMDB altered outside of {prog}.").format(
                    prog=dnf.util.MAIN_PROG_UPPER))

            cmdline = None
            if hasattr(self, 'args') and self.args:
                cmdline = ' '.join(self.args)
            elif hasattr(self, 'cmds') and self.cmds:
                cmdline = ' '.join(self.cmds)

            comment = self.conf.comment if self.conf.comment else ""
            tid = self.history.beg(rpmdbv, using_pkgs, [], cmdline, comment)

        if self.conf.reset_nice:
            onice = os.nice(0)
            if onice:
                try:
                    os.nice(-onice)
                except:
                    onice = 0

        logger.log(dnf.logging.DDEBUG, 'RPM transaction start.')
        errors = self._ts.run(cb.callback, '')
        logger.log(dnf.logging.DDEBUG, 'RPM transaction over.')
        # ts.run() exit codes are, hmm, "creative": None means all ok, empty
        # list means some errors happened in the transaction and non-empty
        # list that there were errors preventing the ts from starting...
        if self.conf.reset_nice:
            try:
                os.nice(onice)
            except:
                pass
        dnf.util._sync_rpm_trans_with_swdb(self._ts, self._transaction)

        if errors is None:
            pass
        elif len(errors) == 0:
            # If there is no failing element it means that some "global" error
            # occurred (like rpm failed to obtain the transaction lock). Just pass
            # the rpm logs on to the user and raise an Error.
            # If there are failing elements, the problem is related to those
            # elements and the Error is raised later, after saving the failure
            # to the history and printing out the transaction table to the user.
            failed = [el for el in self._ts if el.Failed()]
            if not failed:
                for msg in cb.messages():
                    logger.critical(_('RPM: {}').format(msg))
                msg = _('Could not run transaction.')
                raise dnf.exceptions.Error(msg)
        else:
            logger.critical(_("Transaction couldn't start:"))
            for e in errors:
                logger.critical(ucd(e[0]))
            if self._record_history() and not self._ts.isTsFlagSet(rpm.RPMTRANS_FLAG_TEST):
                self.history.end(rpmdbv)
            msg = _("Could not run transaction.")
            raise dnf.exceptions.Error(msg)

        for i in ('ts_all_fn', 'ts_done_fn'):
            if hasattr(cb, i):
                fn = getattr(cb, i)
                try:
                    misc.unlink_f(fn)
                except (IOError, OSError):
                    msg = _('Failed to remove transaction file %s')
                    logger.critical(msg, fn)

        # keep install_set status because _verify_transaction will clean it
        self._trans_install_set = bool(self._transaction.install_set)

        # sync up what just happened versus what is in the rpmdb
        if not self._ts.isTsFlagSet(rpm.RPMTRANS_FLAG_TEST):
            self._verify_transaction(cb.verify_tsi_package)

        return tid

    def _verify_transaction(self, verify_pkg_cb=None):
        transaction_items = [
            tsi for tsi in self.transaction
            if tsi.action != libdnf.transaction.TransactionItemAction_REASON_CHANGE]
        total = len(transaction_items)

        def display_banner(pkg, count):
            count += 1
            if verify_pkg_cb is not None:
                verify_pkg_cb(pkg, count, total)
            return count

        timer = dnf.logging.Timer('verify transaction')
        count = 0

        rpmdb_sack = dnf.sack.rpmdb_sack(self)

        # mark group packages that are installed on the system as installed in the db
        q = rpmdb_sack.query().installed()
        names = set([i.name for i in q])
        for ti in self.history.group:
            g = ti.getCompsGroupItem()
            for p in g.getPackages():
                if p.getName() in names:
                    p.setInstalled(True)
                    p.save()

        # TODO: installed groups in environments

        # Post-transaction verification is no longer needed,
        # because DNF trusts error codes returned by RPM.
        # Verification banner is displayed to preserve UX.
        # TODO: drop in future DNF
        for tsi in transaction_items:
            count = display_banner(tsi.pkg, count)

        rpmdbv = rpmdb_sack._rpmdb_version()
        self.history.end(rpmdbv)

        timer()
        self._trans_success = True

    def _download_remote_payloads(self, payloads, drpm, progress, callback_total, fail_fast=True):
        lock = dnf.lock.build_download_lock(self.conf.cachedir, self.conf.exit_on_lock)
        with lock:
            beg_download = time.time()
            est_remote_size = sum(pload.download_size for pload in payloads)
            total_drpm = len(
                [payload for payload in payloads if isinstance(payload, dnf.drpm.DeltaPayload)])
            # compatibility part for tools that do not accept total_drpms keyword
            if progress.start.__code__.co_argcount == 4:
                progress.start(len(payloads), est_remote_size, total_drpms=total_drpm)
            else:
                progress.start(len(payloads), est_remote_size)
            errors = dnf.repo._download_payloads(payloads, drpm, fail_fast)

            if errors._irrecoverable():
                raise dnf.exceptions.DownloadError(errors._irrecoverable())

            remote_size = sum(errors._bandwidth_used(pload)
                              for pload in payloads)
            saving = dnf.repo._update_saving((0, 0), payloads,
                                             errors._recoverable)

            retries = self.conf.retries
            forever = retries == 0
            while errors._recoverable and (forever or retries > 0):
                if retries > 0:
                    retries -= 1

                msg = _("Some packages were not downloaded. Retrying.")
                logger.info(msg)

                remaining_pkgs = [pkg for pkg in errors._recoverable]
                payloads = \
                    [dnf.repo._pkg2payload(pkg, progress, dnf.repo.RPMPayload)
                     for pkg in remaining_pkgs]
                est_remote_size = sum(pload.download_size
                                      for pload in payloads)
                progress.start(len(payloads), est_remote_size)
                errors = dnf.repo._download_payloads(payloads, drpm, fail_fast)

                if errors._irrecoverable():
                    raise dnf.exceptions.DownloadError(errors._irrecoverable())

                remote_size += \
                    sum(errors._bandwidth_used(pload) for pload in payloads)
                saving = dnf.repo._update_saving(saving, payloads, {})

            if errors._recoverable:
                msg = dnf.exceptions.DownloadError.errmap2str(
                    errors._recoverable)
                logger.info(msg)

        if callback_total is not None:
            callback_total(remote_size, beg_download)

        (real, full) = saving
        if real != full:
            if real < full:
                msg = _("Delta RPMs reduced %.1f MB of updates to %.1f MB "
                        "(%d.1%% saved)")
            elif real > full:
                msg = _("Failed Delta RPMs increased %.1f MB of updates to %.1f MB "
                        "(%d.1%% wasted)")
            percent = 100 - real / full * 100
            logger.info(msg, full / 1024 ** 2, real / 1024 ** 2, percent)

    def download_packages(self, pkglist, progress=None, callback_total=None):
        # :api
        """Download the packages specified by the given list of packages.

        `pkglist` is a list of packages to download, `progress` is an optional
         DownloadProgress instance, `callback_total` an optional callback to
         output messages about the download operation.

        """
        remote_pkgs, local_pkgs = self._select_remote_pkgs(pkglist)
        if remote_pkgs:
            if progress is None:
                progress = dnf.callback.NullDownloadProgress()
            drpm = dnf.drpm.DeltaInfo(self.sack.query().installed(),
                                      progress, self.conf.deltarpm_percentage)
            self._add_tempfiles([pkg.localPkg() for pkg in remote_pkgs])
            payloads = [dnf.repo._pkg2payload(pkg, progress, drpm.delta_factory,
                                              dnf.repo.RPMPayload)
                        for pkg in remote_pkgs]
            self._download_remote_payloads(payloads, drpm, progress, callback_total)

        if self.conf.destdir:
            for pkg in local_pkgs:
                if pkg.baseurl:
                    location = os.path.join(pkg.get_local_baseurl(),
                                            pkg.location.lstrip("/"))
                else:
                    location = os.path.join(pkg.repo.pkgdir, pkg.location.lstrip("/"))
                shutil.copy(location, self.conf.destdir)

    def add_remote_rpms(self, path_list, strict=True, progress=None):
        # :api
        pkgs = []
        if not path_list:
            return pkgs
        if self._goal.req_length():
            raise dnf.exceptions.Error(
                _("Cannot add local packages, because transaction job already exists"))
        pkgs_error = []
        for path in path_list:
            if not os.path.exists(path) and '://' in path:
                # download remote rpm to a tempfile
                path = dnf.util._urlopen_progress(path, self.conf, progress)
                self._add_tempfiles([path])
            try:
                pkgs.append(self.sack.add_cmdline_package(path))
            except IOError as e:
                logger.warning(e)
                pkgs_error.append(path)
        self._setup_excludes_includes(only_main=True)
        if pkgs_error and strict:
            raise IOError(_("Could not open: {}").format(' '.join(pkgs_error)))
        return pkgs
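    # Usage sketch for add_remote_rpms() (illustrative only, not upstream code).
    # package_install() is assumed here as the usual way to queue the returned
    # packages; it is part of the Base :api but not shown in this excerpt:
    #
    #   pkgs = base.add_remote_rpms(['/tmp/foo.rpm',
    #                                'https://example.com/bar.rpm'])
    #   for pkg in pkgs:
    #       base.package_install(pkg)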

    def _sig_check_pkg(self, po):
        """Verify the GPG signature of the given package object.

        :param po: the package object to verify the signature of
        :return: (result, error_string)
           where result is::

              0 = GPG signature verifies ok or verification is not required.
              1 = GPG verification failed but installation of the right GPG key
                    might help.
              2 = Fatal GPG verification error, give up.
        """
        if po._from_cmdline:
            check = self.conf.localpkg_gpgcheck
            hasgpgkey = 0
        else:
            repo = self.repos[po.repoid]
            check = repo.gpgcheck
            hasgpgkey = not not repo.gpgkey

        if check:
            root = self.conf.installroot
            ts = dnf.rpm.transaction.initReadOnlyTransaction(root)
            sigresult = dnf.rpm.miscutils.checkSig(ts, po.localPkg())
            localfn = os.path.basename(po.localPkg())
            del ts
            if sigresult == 0:
                result = 0
                msg = ''

            elif sigresult == 1:
                if hasgpgkey:
                    result = 1
                else:
                    result = 2
                msg = _('Public key for %s is not installed') % localfn

            elif sigresult == 2:
                result = 2
                msg = _('Problem opening package %s') % localfn

            elif sigresult == 3:
                if hasgpgkey:
                    result = 1
                else:
                    result = 2
                msg = _('Public key for %s is not trusted') % localfn

            elif sigresult == 4:
                result = 2
                msg = _('Package %s is not signed') % localfn

        else:
            result = 0
            msg = ''

        return result, msg

    def package_signature_check(self, pkg):
        # :api
        """Verify the GPG signature of the given package object.

        :param pkg: the package object to verify the signature of
        :return: (result, error_string)
           where result is::

              0 = GPG signature verifies ok or verification is not required.
              1 = GPG verification failed but installation of the right GPG key
                    might help.
              2 = Fatal GPG verification error, give up.
        """
        return self._sig_check_pkg(pkg)
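    # Usage sketch for package_signature_check() (illustrative only, not
    # upstream code):
    #
    #   result, msg = base.package_signature_check(pkg)
    #   if result == 0:
    #       pass                                  # ok, or checking not required
    #   elif result == 1:
    #       pass                                  # importing the right GPG key might help
    #   else:
    #       raise dnf.exceptions.Error(msg)       # fatal verification error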

    def _clean_packages(self, packages):
        for fn in packages:
            if not os.path.exists(fn):
                continue
            try:
                misc.unlink_f(fn)
            except OSError:
                logger.warning(_('Cannot remove %s'), fn)
                continue
            else:
                logger.log(dnf.logging.DDEBUG,
                           _('%s removed'), fn)

    def _do_package_lists(self, pkgnarrow='all', patterns=None, showdups=None,
                       ignore_case=False, reponame=None):
        """Return a :class:`misc.GenericHolder` containing
        lists of package objects.  The contents of the lists are
        specified in various ways by the arguments.

        :param pkgnarrow: a string specifying which type of package
           list to produce, such as updates, installed, available,
           etc.
        :param patterns: a list of names or wildcards specifying
           packages to list
        :param showdups: whether to include duplicate packages in the
           lists
        :param ignore_case: whether to ignore case when searching by
           package names
        :param reponame: limit packages list to the given repository
        :return: a :class:`misc.GenericHolder` instance with the
           following lists defined::

             available = list of packageObjects
             installed = list of packageObjects
             upgrades = tuples of packageObjects (updating, installed)
             extras = list of packageObjects
             obsoletes = tuples of packageObjects (obsoleting, installed)
             recent = list of packageObjects
        """
        if showdups is None:
            showdups = self.conf.showdupesfromrepos
        if patterns is None:
            return self._list_pattern(
                pkgnarrow, patterns, showdups, ignore_case, reponame)

        assert not dnf.util.is_string_type(patterns)
        list_fn = functools.partial(
            self._list_pattern, pkgnarrow, showdups=showdups,
            ignore_case=ignore_case, reponame=reponame)
        if patterns is None or len(patterns) == 0:
            return list_fn(None)
        yghs = map(list_fn, patterns)
        return functools.reduce(lambda a, b: a.merge_lists(b), yghs)
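    # Usage sketch for _do_package_lists() (illustrative only, not upstream
    # code; the method is internal, the holder attributes are as documented
    # above):
    #
    #   holder = base._do_package_lists('upgrades', patterns=['kernel*'])
    #   for pkg in holder.updates:
    #       print(pkg)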

    def _list_pattern(self, pkgnarrow, pattern, showdups, ignore_case,
                      reponame=None):
        def is_from_repo(package):
            """Test whether given package originates from the repository."""
            if reponame is None:
                return True
            return self.history.repo(package) == reponame

        def pkgs_from_repo(packages):
            """Filter out the packages which do not originate from the repo."""
            return (package for package in packages if is_from_repo(package))

        def query_for_repo(query):
            """Filter out the packages which do not originate from the repo."""
            if reponame is None:
                return query
            return query.filter(reponame=reponame)

        ygh = misc.GenericHolder(iter=pkgnarrow)

        installed = []
        available = []
        reinstall_available = []
        old_available = []
        updates = []
        obsoletes = []
        obsoletesTuples = []
        recent = []
        extras = []
        autoremove = []

        # do the initial pre-selection
        ic = ignore_case
        q = self.sack.query()
        if pattern is not None:
            subj = dnf.subject.Subject(pattern, ignore_case=ic)
            q = subj.get_best_query(self.sack, with_provides=False)

        # list all packages - those installed and available:
        if pkgnarrow == 'all':
            dinst = {}
            ndinst = {}  # Newest versions by name.arch
            for po in q.installed():
                dinst[po.pkgtup] = po
                if showdups:
                    continue
                key = (po.name, po.arch)
                if key not in ndinst or po > ndinst[key]:
                    ndinst[key] = po
            installed = list(pkgs_from_repo(dinst.values()))

            avail = query_for_repo(q.available())
            if not showdups:
                avail = avail.filterm(latest_per_arch_by_priority=True)
            for pkg in avail:
                if showdups:
                    if pkg.pkgtup in dinst:
                        reinstall_available.append(pkg)
                    else:
                        available.append(pkg)
                else:
                    key = (pkg.name, pkg.arch)
                    if pkg.pkgtup in dinst:
                        reinstall_available.append(pkg)
                    elif key not in ndinst or pkg.evr_gt(ndinst[key]):
                        available.append(pkg)
                    else:
                        old_available.append(pkg)

        # produce the updates list of tuples
        elif pkgnarrow == 'upgrades':
            updates = query_for_repo(q).filterm(upgrades_by_priority=True)
            # reduce a query to security upgrades if they are specified
            updates = self._merge_update_filters(updates, upgrade=True)
            # reduce a query to remove src RPMs
            updates.filterm(arch__neq=['src', 'nosrc'])
            # reduce a query to latest packages
            updates = updates.latest().run()

        # installed only
        elif pkgnarrow == 'installed':
            installed = list(pkgs_from_repo(q.installed()))

        # available in a repository
        elif pkgnarrow == 'available':
            if showdups:
                avail = query_for_repo(q).available()
                installed_dict = q.installed()._na_dict()
                for avail_pkg in avail:
                    key = (avail_pkg.name, avail_pkg.arch)
                    installed_pkgs = installed_dict.get(key, [])
                    same_ver = [pkg for pkg in installed_pkgs
                                if pkg.evr == avail_pkg.evr]
                    if len(same_ver) > 0:
                        reinstall_available.append(avail_pkg)
                    else:
                        available.append(avail_pkg)
            else:
                # we will only look at the latest versions of packages:
                available_dict = query_for_repo(
                    q).available().filterm(latest_per_arch_by_priority=True)._na_dict()
                installed_dict = q.installed().latest()._na_dict()
                for (name, arch) in available_dict:
                    avail_pkg = available_dict[(name, arch)][0]
                    inst_pkg = installed_dict.get((name, arch), [None])[0]
                    if not inst_pkg or avail_pkg.evr_gt(inst_pkg):
                        available.append(avail_pkg)
                    elif avail_pkg.evr_eq(inst_pkg):
                        reinstall_available.append(avail_pkg)
                    else:
                        old_available.append(avail_pkg)

        # packages to be removed by autoremove
        elif pkgnarrow == 'autoremove':
            autoremove_q = query_for_repo(q)._unneeded(self.history.swdb)
            autoremove = autoremove_q.run()

        # not in a repo but installed
        elif pkgnarrow == 'extras':
            extras = [pkg for pkg in q.extras() if is_from_repo(pkg)]

        # obsoleting packages (and what they obsolete)
        elif pkgnarrow == 'obsoletes':
            inst = q.installed()
            obsoletes = query_for_repo(
                self.sack.query()).filter(obsoletes_by_priority=inst)
            # reduce a query to security upgrades if they are specified
            obsoletes = self._merge_update_filters(obsoletes, warning=False, upgrade=True)
            # reduce a query to remove src RPMs
            obsoletes.filterm(arch__neq=['src', 'nosrc'])
            obsoletesTuples = []
            for new in obsoletes:
                obsoleted_reldeps = new.obsoletes
                obsoletesTuples.extend(
                    [(new, old) for old in
                     inst.filter(provides=obsoleted_reldeps)])

        # packages recently added to the repositories
        elif pkgnarrow == 'recent':
            avail = q.available()
            if not showdups:
                avail = avail.filterm(latest_per_arch_by_priority=True)
            recent = query_for_repo(avail)._recent(self.conf.recent)

        ygh.installed = installed
        ygh.available = available
        ygh.reinstall_available = reinstall_available
        ygh.old_available = old_available
        ygh.updates = updates
        ygh.obsoletes = obsoletes
        ygh.obsoletesTuples = obsoletesTuples
        ygh.recent = recent
        ygh.extras = extras
        ygh.autoremove = autoremove

        return ygh

    def _add_comps_trans(self, trans):
        self._comps_trans += trans
        return len(trans)

    def _remove_if_unneeded(self, query):
        """
        Mark for removal packages that are not required by any user-installed package
        (reason group or user).
        :param query: dnf.query.Query() object
        """
        query = query.installed()
        if not query:
            return

        unneeded_pkgs = query._safe_to_remove(self.history.swdb, debug_solver=False)
        unneeded_pkgs_history = query.filter(
            pkg=[i for i in query if self.history.group.is_removable_pkg(i.name)])
        pkg_with_dependent_pkgs = unneeded_pkgs_history.difference(unneeded_pkgs)

        # mark packages with dependent packages as a dependency to allow removal with dependent
        # package
        for pkg in pkg_with_dependent_pkgs:
            self.history.set_reason(pkg, libdnf.transaction.TransactionItemReason_DEPENDENCY)
        unneeded_pkgs = unneeded_pkgs.intersection(unneeded_pkgs_history)

        remove_packages = query.intersection(unneeded_pkgs)
        if remove_packages:
            for pkg in remove_packages:
                self._goal.erase(pkg, clean_deps=self.conf.clean_requirements_on_remove)

    def _finalize_comps_trans(self):
        trans = self._comps_trans
        basearch = self.conf.substitutions['basearch']

        def trans_upgrade(query, remove_query, comps_pkg):
            sltr = dnf.selector.Selector(self.sack)
            sltr.set(pkg=query)
            self._goal.upgrade(select=sltr)
            return remove_query

        def trans_install(query, remove_query, comps_pkg, strict):
            if self.conf.multilib_policy == "all":
                if not comps_pkg.requires:
                    self._install_multiarch(query, strict=strict)
                else:
                    # it installs only one arch for conditional packages
                    installed_query = query.installed().apply()
                    self._report_already_installed(installed_query)
                    sltr = dnf.selector.Selector(self.sack)
                    sltr.set(provides="({} if {})".format(comps_pkg.name, comps_pkg.requires))
                    self._goal.install(select=sltr, optional=not strict)

            else:
                sltr = dnf.selector.Selector(self.sack)
                if comps_pkg.requires:
                    sltr.set(provides="({} if {})".format(comps_pkg.name, comps_pkg.requires))
                else:
                    if self.conf.obsoletes:
                        query = query.union(self.sack.query().filterm(obsoletes=query))
                    sltr.set(pkg=query)
                self._goal.install(select=sltr, optional=not strict)
            return remove_query

        def trans_remove(query, remove_query, comps_pkg):
            remove_query = remove_query.union(query)
            return remove_query

        remove_query = self.sack.query().filterm(empty=True)
        attr_fn = ((trans.install, functools.partial(trans_install, strict=True)),
                   (trans.install_opt, functools.partial(trans_install, strict=False)),
                   (trans.upgrade, trans_upgrade),
                   (trans.remove, trans_remove))

        for (attr, fn) in attr_fn:
            for comps_pkg in attr:
                query_args = {'name': comps_pkg.name}
                if (comps_pkg.basearchonly):
                    query_args.update({'arch': basearch})
                q = self.sack.query().filterm(**query_args).apply()
                q.filterm(arch__neq=["src", "nosrc"])
                if not q:
                    package_string = comps_pkg.name
                    if comps_pkg.basearchonly:
                        package_string += '.' + basearch
                    logger.warning(_('No match for group package "{}"').format(package_string))
                    continue
                remove_query = fn(q, remove_query, comps_pkg)
                self._goal.group_members.add(comps_pkg.name)

        self._remove_if_unneeded(remove_query)

    def _build_comps_solver(self):
        def reason_fn(pkgname):
            q = self.sack.query().installed().filterm(name=pkgname)
            if not q:
                return None
            try:
                return self.history.rpm.get_reason(q[0])
            except AttributeError:
                return libdnf.transaction.TransactionItemReason_UNKNOWN

        return dnf.comps.Solver(self.history, self._comps, reason_fn)

    def environment_install(self, env_id, types, exclude=None, strict=True, exclude_groups=None):
        # :api
        """Installs packages of environment group identified by env_id.
        :param types: Types of packages to install. Either an integer as a
            logical conjunction of CompsPackageType ids or a list of string
            package type ids (conditional, default, mandatory, optional).
        """
        assert dnf.util.is_string_type(env_id)
        solver = self._build_comps_solver()

        if not isinstance(types, int):
            types = libdnf.transaction.listToCompsPackageType(types)

        trans = solver._environment_install(env_id, types, exclude or set(), strict, exclude_groups)
        if not trans:
            return 0
        return self._add_comps_trans(trans)
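
    # Usage sketch (illustrative only): installing an environment group by its id
    # with a list of string package types. Assumes an initialized dnf.Base whose
    # repositories, sack and comps data are already loaded; the environment id
    # "minimal-environment" is a hypothetical example value.
    #
    #   base.read_all_repos()
    #   base.fill_sack()
    #   base.read_comps()
    #   base.environment_install("minimal-environment", ['mandatory', 'default'],
    #                            strict=False)
    #   base.resolve()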

    def environment_remove(self, env_id):
        # :api
        assert dnf.util.is_string_type(env_id)
        solver = self._build_comps_solver()
        trans = solver._environment_remove(env_id)
        return self._add_comps_trans(trans)

    def group_install(self, grp_id, pkg_types, exclude=None, strict=True):
        # :api
        """Installs packages of selected group
        :param pkg_types: Types of packages to install. Either an integer as a
            logical conjunction of CompsPackageType ids or a list of string
            package type ids (conditional, default, mandatory, optional).
        :param exclude: list of package name glob patterns
            that will be excluded from install set
        :param strict: boolean indicating whether group packages that
            exist but are non-installable due to e.g. dependency
            issues should be skipped (False) or cause transaction to
            fail to resolve (True)
        """
        def _pattern_to_pkgname(pattern):
            if dnf.util.is_glob_pattern(pattern):
                q = self.sack.query().filterm(name__glob=pattern)
                return map(lambda p: p.name, q)
            else:
                return (pattern,)

        assert dnf.util.is_string_type(grp_id)
        exclude_pkgnames = None
        if exclude:
            nested_excludes = [_pattern_to_pkgname(p) for p in exclude]
            exclude_pkgnames = itertools.chain.from_iterable(nested_excludes)

        solver = self._build_comps_solver()

        if not isinstance(pkg_types, int):
            pkg_types = libdnf.transaction.listToCompsPackageType(pkg_types)

        trans = solver._group_install(grp_id, pkg_types, exclude_pkgnames, strict)
        if not trans:
            return 0
        if strict:
            instlog = trans.install
        else:
            instlog = trans.install_opt
        logger.debug(_("Adding packages from group '%s': %s"),
                     grp_id, instlog)
        return self._add_comps_trans(trans)
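
    # Usage sketch (illustrative only): the group id "core" and the exclude
    # pattern are hypothetical; assumes repos, sack and comps data are loaded.
    #
    #   base.group_install("core", ['mandatory', 'default'],
    #                      exclude=['*-doc'], strict=False)
    #   base.resolve()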

    def env_group_install(self, patterns, types, strict=True, exclude=None, exclude_groups=None):
        q = CompsQuery(self.comps, self.history, CompsQuery.ENVIRONMENTS | CompsQuery.GROUPS,
                       CompsQuery.AVAILABLE)
        cnt = 0
        done = True
        for pattern in patterns:
            try:
                res = q.get(pattern)
            except dnf.exceptions.CompsError as err:
                logger.error(ucd(err))
                done = False
                continue
            for group_id in res.groups:
                if not exclude_groups or group_id not in exclude_groups:
                    cnt += self.group_install(group_id, types, exclude=exclude, strict=strict)
            for env_id in res.environments:
                cnt += self.environment_install(env_id, types, exclude=exclude, strict=strict,
                                                exclude_groups=exclude_groups)
        if not done and strict:
            raise dnf.exceptions.Error(_('Nothing to do.'))
        return cnt

    def group_remove(self, grp_id):
        # :api
        assert dnf.util.is_string_type(grp_id)
        solver = self._build_comps_solver()
        trans = solver._group_remove(grp_id)
        return self._add_comps_trans(trans)

    def env_group_remove(self, patterns):
        q = CompsQuery(self.comps, self.history,
                       CompsQuery.ENVIRONMENTS | CompsQuery.GROUPS,
                       CompsQuery.INSTALLED)
        try:
            res = q.get(*patterns)
        except dnf.exceptions.CompsError as err:
            logger.error("Warning: %s", ucd(err))
            raise dnf.exceptions.Error(_('No groups marked for removal.'))
        cnt = 0
        for env in res.environments:
            cnt += self.environment_remove(env)
        for grp in res.groups:
            cnt += self.group_remove(grp)
        return cnt

    def env_group_upgrade(self, patterns):
        q = CompsQuery(self.comps, self.history,
                       CompsQuery.GROUPS | CompsQuery.ENVIRONMENTS,
                       CompsQuery.INSTALLED)
        group_upgraded = False
        for pattern in patterns:
            try:
                res = q.get(pattern)
            except dnf.exceptions.CompsError as err:
                logger.error(ucd(err))
                continue
            for env in res.environments:
                try:
                    self.environment_upgrade(env)
                    group_upgraded = True
                except dnf.exceptions.CompsError as err:
                    logger.error(ucd(err))
                    continue
            for grp in res.groups:
                try:
                    self.group_upgrade(grp)
                    group_upgraded = True
                except dnf.exceptions.CompsError as err:
                    logger.error(ucd(err))
                    continue
        if not group_upgraded:
            msg = _('No group marked for upgrade.')
            raise dnf.cli.CliError(msg)

    def environment_upgrade(self, env_id):
        # :api
        assert dnf.util.is_string_type(env_id)
        solver = self._build_comps_solver()
        trans = solver._environment_upgrade(env_id)
        return self._add_comps_trans(trans)

    def group_upgrade(self, grp_id):
        # :api
        assert dnf.util.is_string_type(grp_id)
        solver = self._build_comps_solver()
        trans = solver._group_upgrade(grp_id)
        return self._add_comps_trans(trans)

    def _gpg_key_check(self):
        """Checks for the presence of GPG keys in the rpmdb.

        :return: 0 if there are no GPG keys in the rpmdb, and 1 if
           there are keys
        """
        gpgkeyschecked = self.conf.cachedir + '/.gpgkeyschecked.yum'
        if os.path.exists(gpgkeyschecked):
            return 1

        installroot = self.conf.installroot
        myts = dnf.rpm.transaction.initReadOnlyTransaction(root=installroot)
        myts.pushVSFlags(~(rpm._RPMVSF_NOSIGNATURES | rpm._RPMVSF_NODIGESTS))
        idx = myts.dbMatch('name', 'gpg-pubkey')
        keys = len(idx)
        del idx
        del myts

        if keys == 0:
            return 0
        else:
            mydir = os.path.dirname(gpgkeyschecked)
            if not os.path.exists(mydir):
                os.makedirs(mydir)

            fo = open(gpgkeyschecked, 'w')
            fo.close()
            del fo
            return 1

    def _install_multiarch(self, query, reponame=None, strict=True):
        already_inst, available = self._query_matches_installed(query)
        self._report_already_installed(already_inst)
        for packages in available:
            sltr = dnf.selector.Selector(self.sack)
            q = self.sack.query().filterm(pkg=packages)
            if self.conf.obsoletes:
                q = q.union(self.sack.query().filterm(obsoletes=q))
            sltr = sltr.set(pkg=q)
            if reponame is not None:
                sltr = sltr.set(reponame=reponame)
            self._goal.install(select=sltr, optional=(not strict))
        return len(available)

    def _categorize_specs(self, install, exclude):
        """
        Categorize the 'install' and 'exclude' lists into two groups each (packages and groups)

        :param install: list of specs, whether packages ('foo') or groups/modules ('@bar')
        :param exclude: list of specs, whether packages ('foo') or groups/modules ('@bar')
        :return: categorized install and exclude specs (stored in argparse.Namespace class)

        To access packages use: specs.pkg_specs,
        to access groups use: specs.grp_specs
        """
        install_specs = argparse.Namespace()
        exclude_specs = argparse.Namespace()
        _parse_specs(install_specs, install)
        _parse_specs(exclude_specs, exclude)

        return install_specs, exclude_specs

    def _exclude_package_specs(self, exclude_specs):
        glob_excludes = [exclude for exclude in exclude_specs.pkg_specs
                         if dnf.util.is_glob_pattern(exclude)]
        excludes = [exclude for exclude in exclude_specs.pkg_specs
                    if exclude not in glob_excludes]

        exclude_query = self.sack.query().filter(name=excludes)
        glob_exclude_query = self.sack.query().filter(name__glob=glob_excludes)

        self.sack.add_excludes(exclude_query)
        self.sack.add_excludes(glob_exclude_query)

    def _expand_groups(self, group_specs):
        groups = set()
        q = CompsQuery(self.comps, self.history,
                       CompsQuery.ENVIRONMENTS | CompsQuery.GROUPS,
                       CompsQuery.AVAILABLE | CompsQuery.INSTALLED)

        for pattern in group_specs:
            try:
                res = q.get(pattern)
            except dnf.exceptions.CompsError as err:
                logger.error("Warning: Module or %s", ucd(err))
                continue

            groups.update(res.groups)
            groups.update(res.environments)

            for environment_id in res.environments:
                environment = self.comps._environment_by_id(environment_id)
                for group in environment.groups_iter():
                    groups.add(group.id)

        return list(groups)

    def _install_groups(self, group_specs, excludes, skipped, strict=True):
        for group_spec in group_specs:
            try:
                types = self.conf.group_package_types

                if '/' in group_spec:
                    split = group_spec.split('/')
                    group_spec = split[0]
                    types = split[1].split(',')

                self.env_group_install([group_spec], types, strict, excludes.pkg_specs,
                                       excludes.grp_specs)
            except dnf.exceptions.Error:
                skipped.append("@" + group_spec)

    def install_specs(self, install, exclude=None, reponame=None, strict=True, forms=None):
        # :api
        if exclude is None:
            exclude = []
        no_match_group_specs = []
        error_group_specs = []
        no_match_pkg_specs = []
        error_pkg_specs = []
        install_specs, exclude_specs = self._categorize_specs(install, exclude)

        self._exclude_package_specs(exclude_specs)
        for spec in install_specs.pkg_specs:
            try:
                self.install(spec, reponame=reponame, strict=strict, forms=forms)
            except dnf.exceptions.MarkingError as e:
                logger.error(str(e))
                no_match_pkg_specs.append(spec)
        no_match_module_specs = []
        module_depsolv_errors = ()
        if WITH_MODULES and install_specs.grp_specs:
            try:
                module_base = dnf.module.module_base.ModuleBase(self)
                module_base.install(install_specs.grp_specs, strict)
            except dnf.exceptions.MarkingErrors as e:
                if e.no_match_group_specs:
                    for e_spec in e.no_match_group_specs:
                        no_match_module_specs.append(e_spec)
                if e.error_group_specs:
                    for e_spec in e.error_group_specs:
                        error_group_specs.append("@" + e_spec)
                module_depsolv_errors = e.module_depsolv_errors

        else:
            no_match_module_specs = install_specs.grp_specs

        if no_match_module_specs:
            exclude_specs.grp_specs = self._expand_groups(exclude_specs.grp_specs)
            self._install_groups(no_match_module_specs, exclude_specs, no_match_group_specs, strict)

        if no_match_group_specs or error_group_specs or no_match_pkg_specs or error_pkg_specs \
                or module_depsolv_errors:
            raise dnf.exceptions.MarkingErrors(no_match_group_specs=no_match_group_specs,
                                               error_group_specs=error_group_specs,
                                               no_match_pkg_specs=no_match_pkg_specs,
                                               error_pkg_specs=error_pkg_specs,
                                               module_depsolv_errors=module_depsolv_errors)
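
    # Usage sketch (illustrative only): package and group specs mixed in one call;
    # the names are hypothetical. Group/module specs carry the '@' prefix, plain
    # package specs do not. MarkingErrors aggregates everything that could not be
    # marked for the transaction.
    #
    #   try:
    #       base.install_specs(['vim-enhanced', '@development-tools'],
    #                          exclude=['emacs'], strict=False)
    #   except dnf.exceptions.MarkingErrors as e:
    #       print(e)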

    def install(self, pkg_spec, reponame=None, strict=True, forms=None):
        # :api
        """Mark package(s) given by pkg_spec and reponame for installation."""

        subj = dnf.subject.Subject(pkg_spec)
        solution = subj.get_best_solution(self.sack, forms=forms, with_src=False)

        if self.conf.multilib_policy == "all" or subj._is_arch_specified(solution):
            q = solution['query']
            if reponame is not None:
                q.filterm(reponame=reponame)
            if not q:
                self._raise_package_not_found_error(pkg_spec, forms, reponame)
            return self._install_multiarch(q, reponame=reponame, strict=strict)

        elif self.conf.multilib_policy == "best":
            sltrs = subj._get_best_selectors(self,
                                             forms=forms,
                                             obsoletes=self.conf.obsoletes,
                                             reponame=reponame,
                                             reports=True,
                                             solution=solution)
            if not sltrs:
                self._raise_package_not_found_error(pkg_spec, forms, reponame)

            for sltr in sltrs:
                self._goal.install(select=sltr, optional=(not strict))
            return 1
        return 0
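
    # Usage sketch (illustrative only): the package spec and repository id are
    # hypothetical.
    #
    #   base.install('httpd', reponame='updates', strict=True)
    #   base.resolve()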

    def package_downgrade(self, pkg, strict=False):
        # :api
        if pkg._from_system:
            msg = 'downgrade_package() for an installed package.'
            raise NotImplementedError(msg)

        q = self.sack.query().installed().filterm(name=pkg.name, arch=[pkg.arch, "noarch"])
        if not q:
            msg = _("Package %s not installed, cannot downgrade it.")
            logger.warning(msg, pkg.name)
            raise dnf.exceptions.MarkingError(_('No match for argument: %s') % pkg.location, pkg.name)
        elif sorted(q)[0] > pkg:
            sltr = dnf.selector.Selector(self.sack)
            sltr.set(pkg=[pkg])
            self._goal.install(select=sltr, optional=(not strict))
            return 1
        else:
            msg = _("Package %s of lower version already installed, "
                    "cannot downgrade it.")
            logger.warning(msg, pkg.name)
            return 0

    def package_install(self, pkg, strict=True):
        # :api
        q = self.sack.query()._nevra(pkg.name, pkg.evr, pkg.arch)
        already_inst, available = self._query_matches_installed(q)
        if pkg in already_inst:
            self._report_already_installed([pkg])
        elif pkg not in itertools.chain.from_iterable(available):
            raise dnf.exceptions.PackageNotFoundError(_('No match for argument: %s'), pkg.location)
        else:
            sltr = dnf.selector.Selector(self.sack)
            sltr.set(pkg=[pkg])
            self._goal.install(select=sltr, optional=(not strict))
        return 1

    def package_reinstall(self, pkg):
        if self.sack.query().installed().filterm(name=pkg.name, evr=pkg.evr, arch=pkg.arch):
            self._goal.install(pkg)
            return 1
        msg = _("Package %s not installed, cannot reinstall it.")
        logger.warning(msg, str(pkg))
        raise dnf.exceptions.MarkingError(_('No match for argument: %s') % pkg.location, pkg.name)

    def package_remove(self, pkg):
        self._goal.erase(pkg)
        return 1

    def package_upgrade(self, pkg):
        # :api
        if pkg._from_system:
            msg = 'upgrade_package() for an installed package.'
            raise NotImplementedError(msg)

        if pkg.arch == 'src':
            msg = _("File %s is a source package and cannot be updated, ignoring.")
            logger.info(msg, pkg.location)
            return 0
        installed = self.sack.query().installed().apply()
        if self.conf.obsoletes and self.sack.query().filterm(pkg=[pkg]).filterm(obsoletes=installed):
            sltr = dnf.selector.Selector(self.sack)
            sltr.set(pkg=[pkg])
            self._goal.upgrade(select=sltr)
            return 1
        # do not filter by arch if the package is noarch
        if pkg.arch == "noarch":
            q = installed.filter(name=pkg.name)
        else:
            q = installed.filter(name=pkg.name, arch=[pkg.arch, "noarch"])
        if not q:
            msg = _("Package %s not installed, cannot update it.")
            logger.warning(msg, pkg.name)
            raise dnf.exceptions.MarkingError(
                _('No match for argument: %s') % pkg.location, pkg.name)
        elif sorted(q)[-1] < pkg:
            sltr = dnf.selector.Selector(self.sack)
            sltr.set(pkg=[pkg])
            self._goal.upgrade(select=sltr)
            return 1
        else:
            msg = _("The same or higher version of %s is already installed, "
                    "cannot update it.")
            logger.warning(msg, pkg.name)
            return 0

    def _upgrade_internal(self, query, obsoletes, reponame, pkg_spec=None):
        installed_all = self.sack.query().installed()
        # Add only relevant obsoletes to transaction => installed, upgrades
        q = query.intersection(self.sack.query().filterm(name=[pkg.name for pkg in installed_all]))
        installed_query = q.installed()
        if obsoletes:
            obsoletes = self.sack.query().available().filterm(
                obsoletes=installed_query.union(q.upgrades()))
            # add obsoletes into transaction
            query = query.union(obsoletes)
        if reponame is not None:
            query.filterm(reponame=reponame)
        query = self._merge_update_filters(query, pkg_spec=pkg_spec, upgrade=True)
        if query:
            # Given that we use libsolv's targeted transactions, we need to ensure that the transaction contains both
            # the new targeted version and also the current installed version (for the upgraded package). This is
            # because if it only contained the new version, libsolv would decide to reinstall the package even if it
            # had just a different buildtime or vendor but the same version
            # (https://github.com/openSUSE/libsolv/issues/287)
            #   - In general, the query already contains both the new and installed versions but not always.
            #     If repository-packages command is used, the installed packages are filtered out because they are from
            #     the @system repo. We need to add them back in.
            #   - However we need to add installed versions of just the packages that are being upgraded. We don't want
            #     to add all installed packages because it could increase the number of solutions for the transaction
            #     (especially without --best) and since libsolv prefers the smallest possible upgrade it could result
            #     in no upgrade even if there is one available. This is a problem in general but it's critical with
            #     --security transactions (https://bugzilla.redhat.com/show_bug.cgi?id=2097757)
            #   - We want to add only the latest versions of installed packages, this is specifically for installonly
            #     packages. Otherwise if for example kernel-1 and kernel-3 were installed and present in the
            #     transaction libsolv could decide to install kernel-2 because it is an upgrade for kernel-1 even
            #     though we don't want it because there already is a newer version present.
            query = query.union(installed_all.latest().filter(name=[pkg.name for pkg in query]))
            sltr = dnf.selector.Selector(self.sack)
            sltr.set(pkg=query)
            self._goal.upgrade(select=sltr)
        return 1


    def upgrade(self, pkg_spec, reponame=None):
        # :api
        subj = dnf.subject.Subject(pkg_spec)
        solution = subj.get_best_solution(self.sack)
        q = solution["query"]
        if q:
            wildcard = dnf.util.is_glob_pattern(pkg_spec)
            # a wildcard pattern should not report packages that are not installed;
            # only a solution with nevra.name provides packages with the same name
            if not wildcard and solution['nevra'] and solution['nevra'].name:
                pkg_name = solution['nevra'].name
                installed = self.sack.query().installed().apply()
                obsoleters = q.filter(obsoletes=installed) \
                    if self.conf.obsoletes else self.sack.query().filterm(empty=True)
                if not obsoleters:
                    installed_name = installed.filter(name=pkg_name).apply()
                    if not installed_name:
                        msg = _('Package %s available, but not installed.')
                        logger.warning(msg, pkg_name)
                        raise dnf.exceptions.PackagesNotInstalledError(
                            _('No match for argument: %s') % pkg_spec, pkg_spec)
                    elif solution['nevra'].arch and not dnf.util.is_glob_pattern(solution['nevra'].arch):
                        if not installed_name.filterm(arch=solution['nevra'].arch):
                            msg = _('Package %s available, but installed for different architecture.')
                            logger.warning(msg, "{}.{}".format(pkg_name, solution['nevra'].arch))
            obsoletes = self.conf.obsoletes and solution['nevra'] \
                        and solution['nevra'].has_just_name()
            return self._upgrade_internal(q, obsoletes, reponame, pkg_spec)
        raise dnf.exceptions.MarkingError(_('No match for argument: %s') % pkg_spec, pkg_spec)

    def upgrade_all(self, reponame=None):
        # :api
        # provide only available packages to solver to trigger targeted upgrade
        # possibilities will be ignored
        # usage of selected packages will unify dnf behavior with other upgrade functions
        return self._upgrade_internal(
            self.sack.query(), self.conf.obsoletes, reponame, pkg_spec=None)

    def distro_sync(self, pkg_spec=None):
        if pkg_spec is None:
            self._goal.distupgrade_all()
        else:
            subject = dnf.subject.Subject(pkg_spec)
            solution = subject.get_best_solution(self.sack, with_src=False)
            solution["query"].filterm(reponame__neq=hawkey.SYSTEM_REPO_NAME)
            sltrs = subject._get_best_selectors(self, solution=solution,
                                                obsoletes=self.conf.obsoletes, reports=True)
            if not sltrs:
                logger.info(_('No package %s installed.'), pkg_spec)
                return 0
            for sltr in sltrs:
                self._goal.distupgrade(select=sltr)
        return 1

    def autoremove(self, forms=None, pkg_specs=None, grp_specs=None, filenames=None):
        # :api
        """Removes all 'leaf' packages from the system that were originally
        installed as dependencies of user-installed packages but which are
        no longer required by any such package."""

        if any([grp_specs, pkg_specs, filenames]):
            pkg_specs += filenames
            done = False
            # Remove groups.
            if grp_specs and forms:
                for grp_spec in grp_specs:
                    msg = _('Not a valid form: %s')
                    logger.warning(msg, grp_spec)
            elif grp_specs:
                if self.env_group_remove(grp_specs):
                    done = True

            for pkg_spec in pkg_specs:
                try:
                    self.remove(pkg_spec, forms=forms)
                except dnf.exceptions.MarkingError as e:
                    logger.info(str(e))
                else:
                    done = True

            if not done:
                logger.warning(_('No packages marked for removal.'))

        else:
            pkgs = self.sack.query()._unneeded(self.history.swdb,
                                               debug_solver=self.conf.debug_solver)
            for pkg in pkgs:
                self.package_remove(pkg)
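
    # Usage sketch (illustrative only): with no arguments every unneeded 'leaf'
    # package is marked for removal; removals then have to be allowed when
    # resolving.
    #
    #   base.autoremove()
    #   base.resolve(allow_erasing=True)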

    def remove(self, pkg_spec, reponame=None, forms=None):
        # :api
        """Mark the specified package for removal."""

        matches = dnf.subject.Subject(pkg_spec).get_best_query(self.sack, forms=forms)
        installed = [
            pkg for pkg in matches.installed()
            if reponame is None or
            self.history.repo(pkg) == reponame]
        if not installed:
            self._raise_package_not_installed_error(pkg_spec, forms, reponame)

        clean_deps = self.conf.clean_requirements_on_remove
        for pkg in installed:
            self._goal.erase(pkg, clean_deps=clean_deps)
        return len(installed)
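
    # Usage sketch (illustrative only): the package name is hypothetical;
    # PackagesNotInstalledError is raised when nothing installed matches.
    #
    #   try:
    #       base.remove('httpd')
    #   except dnf.exceptions.PackagesNotInstalledError as e:
    #       print(e)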

    def reinstall(self, pkg_spec, old_reponame=None, new_reponame=None,
                  new_reponame_neq=None, remove_na=False):
        subj = dnf.subject.Subject(pkg_spec)
        q = subj.get_best_query(self.sack)
        installed_pkgs = [
            pkg for pkg in q.installed()
            if old_reponame is None or
            self.history.repo(pkg) == old_reponame]

        available_q = q.available()
        if new_reponame is not None:
            available_q.filterm(reponame=new_reponame)
        if new_reponame_neq is not None:
            available_q.filterm(reponame__neq=new_reponame_neq)
        available_nevra2pkg = dnf.query._per_nevra_dict(available_q)

        if not installed_pkgs:
            raise dnf.exceptions.PackagesNotInstalledError(
                'no package matched', pkg_spec, available_nevra2pkg.values())

        cnt = 0
        clean_deps = self.conf.clean_requirements_on_remove
        for installed_pkg in installed_pkgs:
            try:
                available_pkg = available_nevra2pkg[ucd(installed_pkg)]
            except KeyError:
                if not remove_na:
                    continue
                self._goal.erase(installed_pkg, clean_deps=clean_deps)
            else:
                self._goal.install(available_pkg)
            cnt += 1

        if cnt == 0:
            raise dnf.exceptions.PackagesNotAvailableError(
                'no package matched', pkg_spec, installed_pkgs)

        return cnt

    def downgrade(self, pkg_spec):
        # :api
        """Mark a package to be downgraded.

        This is equivalent to first removing the currently installed package,
        and then installing an older version.

        """
        return self.downgrade_to(pkg_spec)

    def downgrade_to(self, pkg_spec, strict=False):
        """Downgrade to specific version if specified otherwise downgrades
        to one version lower than the package installed.
        """
        subj = dnf.subject.Subject(pkg_spec)
        q = subj.get_best_query(self.sack)
        if not q:
            msg = _('No match for argument: %s') % pkg_spec
            raise dnf.exceptions.PackageNotFoundError(msg, pkg_spec)
        done = 0
        available_pkgs = q.available()
        available_pkg_names = list(available_pkgs._name_dict().keys())
        q_installed = self.sack.query().installed().filterm(name=available_pkg_names)
        if len(q_installed) == 0:
            msg = _('Packages for argument %s available, but not installed.') % pkg_spec
            raise dnf.exceptions.PackagesNotInstalledError(msg, pkg_spec, available_pkgs)
        for pkg_name in q_installed._name_dict().keys():
            downgrade_pkgs = available_pkgs.downgrades().filter(name=pkg_name)
            if not downgrade_pkgs:
                msg = _("Package %s of lowest version already installed, cannot downgrade it.")
                logger.warning(msg, pkg_name)
                continue
            sltr = dnf.selector.Selector(self.sack)
            sltr.set(pkg=downgrade_pkgs)
            self._goal.install(select=sltr, optional=(not strict))
            done = 1
        return done
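
    # Usage sketch (illustrative only): a plain name downgrades to the next lower
    # available version, while a versioned spec (hypothetical here) targets a
    # specific build.
    #
    #   base.downgrade('httpd')
    #   base.downgrade_to('httpd-2.4.57-1.fc38')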

    def provides(self, provides_spec):
        providers = self.sack.query().filterm(file__glob=provides_spec)
        if providers:
            return providers, [provides_spec]
        providers = dnf.query._by_provides(self.sack, provides_spec)
        if providers:
            return providers, [provides_spec]
        if provides_spec.startswith('/bin/') or provides_spec.startswith('/sbin/'):
            # compatibility for packages that didn't do UsrMove
            binary_provides = ['/usr' + provides_spec]
        elif provides_spec.startswith('/'):
            # provides_spec is a file path
            return providers, [provides_spec]
        else:
            # assume that provides_spec is a command name; search the standard binary directories
            binary_provides = [prefix + provides_spec
                               for prefix in ['/bin/', '/sbin/', '/usr/bin/', '/usr/sbin/']]
        return self.sack.query().filterm(file__glob=binary_provides), binary_provides

    def add_security_filters(self, cmp_type, types=(), advisory=(), bugzilla=(), cves=(), severity=()):
        #  :api
        """
        Modify the results of the install, upgrade, and distro_sync methods according to the
        provided filters.

        :param cmp_type: only 'eq' or 'gte' allowed
        :param types: List or tuple of strings. E.g. 'bugfix', 'enhancement', 'newpackage',
        'security'
        :param advisory: List or tuple of strings. E.g. FEDORA-2201-123
        :param bugzilla: List or tuple of strings. Include packages that fix a Bugzilla ID,
        e.g. 123123.
        :param cves: List or tuple of strings. Include packages that fix a CVE
        (Common Vulnerabilities and Exposures) ID, e.g. CVE-2201-0123.
        :param severity: List or tuple of strings. Include packages that provide a fix
        for an issue of the specified severity.
        """
        cmp_dict = {'eq': '__eqg', 'gte': '__eqg__gt'}
        if cmp_type not in cmp_dict:
            raise ValueError("Unsupported value for `cmp_type`")
        cmp = cmp_dict[cmp_type]
        if types:
            key = 'advisory_type' + cmp
            self._update_security_options.setdefault(key, set()).update(types)
        if advisory:
            key = 'advisory' + cmp
            self._update_security_options.setdefault(key, set()).update(advisory)
        if bugzilla:
            key = 'advisory_bug' + cmp
            self._update_security_options.setdefault(key, set()).update(bugzilla)
        if cves:
            key = 'advisory_cve' + cmp
            self._update_security_options.setdefault(key, set()).update(cves)
        if severity:
            key = 'advisory_severity' + cmp
            self._update_security_options.setdefault(key, set()).update(severity)
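
    # Usage sketch (illustrative only): restrict subsequent upgrade operations to
    # security advisories of at least the given severity, or to a particular
    # (hypothetical) CVE id.
    #
    #   base.add_security_filters('gte', severity=['Important'])
    #   base.add_security_filters('eq', cves=['CVE-2201-0123'])
    #   base.upgrade_all()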

    def reset_security_filters(self):
        #  :api
        """
        Reset all security filters
        """
        self._update_security_options = {}

    def _merge_update_filters(self, q, pkg_spec=None, warning=True, upgrade=False):
        """
        Merge the security filter queries and options and return their intersection with the q Query
        @param q: Query
        @return: Query
        """
        if not (self._update_security_options or self._update_security_filters) or not q:
            return q
        merged_queries = self.sack.query().filterm(empty=True)
        if self._update_security_filters:
            for query in self._update_security_filters:
                merged_queries = merged_queries.union(query)

            self._update_security_filters = [merged_queries]
        if self._update_security_options:
            for filter_name, values in self._update_security_options.items():
                if upgrade:
                    filter_name = filter_name + '__upgrade'
                kwargs = {filter_name: values}
                merged_queries = merged_queries.union(q.filter(**kwargs))

        merged_queries = q.intersection(merged_queries)
        if not merged_queries:
            if warning:
                q = q.upgrades()
                count = len(q._name_dict().keys())
                if count > 0:
                    if pkg_spec is None:
                        msg1 = _("No security updates needed, but {} update "
                                 "available").format(count)
                        msg2 = _("No security updates needed, but {} updates "
                                 "available").format(count)
                        logger.warning(P_(msg1, msg2, count))
                    else:
                        msg1 = _('No security updates needed for "{}", but {} '
                                 'update available').format(pkg_spec, count)
                        msg2 = _('No security updates needed for "{}", but {} '
                                 'updates available').format(pkg_spec, count)
                        logger.warning(P_(msg1, msg2, count))
        return merged_queries

    def _get_key_for_package(self, po, askcb=None, fullaskcb=None):
        """Retrieve a key for a package. If needed, use the given
        callback to prompt whether the key should be imported.

        :param po: the package object to retrieve the key of
        :param askcb: Callback function to use to ask permission to
           import a key.  The arguments *askcb* should take are the
           package object, the userid of the key, and the keyid
        :param fullaskcb: Callback function to use to ask permission to
           import a key.  This differs from *askcb* in that it gets
           passed a dictionary so that we can expand the values passed.
        :raises: :class:`dnf.exceptions.Error` if there are errors
           retrieving the keys
        """
        if po._from_cmdline:
            # raise an exception, because po.repoid is not in self.repos
            msg = _('Unable to retrieve a key for a commandline package: %s')
            raise ValueError(msg % po)

        repo = self.repos[po.repoid]
        key_installed = repo.id in self._repo_set_imported_gpg_keys
        keyurls = [] if key_installed else repo.gpgkey

        def _prov_key_data(msg):
            msg += _('. Failing package is: %s') % (po) + '\n '
            msg += _('GPG Keys are configured as: %s') % \
                    (', '.join(repo.gpgkey))
            return msg

        user_cb_fail = False
        self._repo_set_imported_gpg_keys.add(repo.id)
        for keyurl in keyurls:
            keys = dnf.crypto.retrieve(keyurl, repo)

            for info in keys:
                # Check if key is already installed
                if misc.keyInstalled(self._ts, info.rpm_id, info.timestamp) >= 0:
                    msg = _('GPG key at %s (0x%s) is already installed')
                    logger.info(msg, keyurl, info.short_id)
                    continue

                # DNS Extension: create a key object, pass it to the verification class
                # and print its result as an advice to the user.
                if self.conf.gpgkey_dns_verification:
                    dns_input_key = dnf.dnssec.KeyInfo.from_rpm_key_object(info.userid,
                                                                           info.raw_key)
                    dns_result = dnf.dnssec.DNSSECKeyVerification.verify(dns_input_key)
                    logger.info(dnf.dnssec.nice_user_msg(dns_input_key, dns_result))

                # Try installing/updating GPG key
                info.url = keyurl
                if self.conf.gpgkey_dns_verification:
                    dnf.crypto.log_dns_key_import(info, dns_result)
                else:
                    dnf.crypto.log_key_import(info)
                rc = False
                if self.conf.assumeno:
                    rc = False
                elif self.conf.assumeyes:
                    # DNS Extension: We assume, that the key is trusted in case it is valid,
                    # its existence is explicitly denied or in case the domain is not signed
                    # and therefore there is no way to know for sure (this is mainly for
                    # backward compatibility)
                    # FAQ:
                    # * What is PROVEN_NONEXISTENCE?
                    #    In DNSSEC, your domain does not need to be signed, but this state
                    #    (not signed) has to be proven by the upper domain. e.g. when example.com.
                    #    is not signed, com. servers have to sign the message, that example.com.
                    #    does not have any signing key (KSK to be more precise).
                    if self.conf.gpgkey_dns_verification:
                        if dns_result in (dnf.dnssec.Validity.VALID,
                                          dnf.dnssec.Validity.PROVEN_NONEXISTENCE):
                            rc = True
                            logger.info(dnf.dnssec.any_msg(_("The key has been approved.")))
                        else:
                            rc = False
                            logger.info(dnf.dnssec.any_msg(_("The key has been rejected.")))
                    else:
                        rc = True

                # Grab the .sig/.asc for the keyurl, if it exists. If it does,
                # check the signature on the key; if it is signed by one of our
                # ca-keys for this repo or the global one, then rc = True, else
                # ask as normal.

                elif fullaskcb:
                    rc = fullaskcb({"po": po, "userid": info.userid,
                                    "hexkeyid": info.short_id,
                                    "keyurl": keyurl,
                                    "fingerprint": info.fingerprint,
                                    "timestamp": info.timestamp})
                elif askcb:
                    rc = askcb(po, info.userid, info.short_id)

                if not rc:
                    user_cb_fail = True
                    continue

                # Import the key
                # If rpm.RPMTRANS_FLAG_TEST in self._ts, gpg keys cannot be imported successfully
                # therefore the flag was removed for import operation
                test_flag = self._ts.isTsFlagSet(rpm.RPMTRANS_FLAG_TEST)
                if test_flag:
                    orig_flags = self._ts.getTsFlags()
                    self._ts.setFlags(orig_flags - rpm.RPMTRANS_FLAG_TEST)
                result = self._ts.pgpImportPubkey(misc.procgpgkey(info.raw_key))
                if test_flag:
                    self._ts.setFlags(orig_flags)
                if result != 0:
                    msg = _('Key import failed (code %d)') % result
                    raise dnf.exceptions.Error(_prov_key_data(msg))
                logger.info(_('Key imported successfully'))
                key_installed = True

        if not key_installed and user_cb_fail:
            raise dnf.exceptions.Error(_("Didn't install any keys"))

        if not key_installed:
            msg = _('The GPG keys listed for the "%s" repository are '
                    'already installed but they are not correct for this '
                    'package.\n'
                    'Check that the correct key URLs are configured for '
                    'this repository.') % repo.name
            raise dnf.exceptions.Error(_prov_key_data(msg))

        # Check if the newly installed keys helped
        result, errmsg = self._sig_check_pkg(po)
        if result != 0:
            if keyurls:
                msg = _("Import of key(s) didn't help, wrong key(s)?")
                logger.info(msg)
            errmsg = ucd(errmsg)
            raise dnf.exceptions.Error(_prov_key_data(errmsg))

    def package_import_key(self, pkg, askcb=None, fullaskcb=None):
        # :api
        """Retrieve a key for a package. If needed, use the given
        callback to prompt whether the key should be imported.

        :param pkg: the package object to retrieve the key of
        :param askcb: Callback function to use to ask permission to
           import a key.  The arguments *askcb* should take are the
           package object, the userid of the key, and the keyid
        :param fullaskcb: Callback function to use to ask permission to
           import a key.  This differs from *askcb* in that it gets
           passed a dictionary so that we can expand the values passed.
        :raises: :class:`dnf.exceptions.Error` if there are errors
           retrieving the keys
        """
        self._get_key_for_package(pkg, askcb, fullaskcb)
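
    # Usage sketch (illustrative only): a minimal askcb callback that approves
    # every key; its arguments follow the docstring above (package object, key
    # userid, key id). Real code would prompt the user instead.
    #
    #   def approve_key(po, userid, hexkeyid):
    #       return True
    #
    #   base.package_import_key(pkg, askcb=approve_key)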

    def _run_rpm_check(self):
        results = []
        self._ts.check()
        for prob in self._ts.problems():
            #  Newer rpm (4.8.0+) has problem objects, older have just strings.
            #  Should probably move to using the new objects, when we can. For
            # now just be compatible.
            results.append(ucd(prob))

        return results

    def urlopen(self, url, repo=None, mode='w+b', **kwargs):
        # :api
        """
        Open the specified absolute url, return a file object
        which respects proxy setting even for non-repo downloads
        """
        return dnf.util._urlopen(url, self.conf, repo, mode, **kwargs)
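
    # Usage sketch (illustrative only): the URL is hypothetical; the returned
    # file object honours DNF's proxy configuration.
    #
    #   fobj = base.urlopen('https://example.com/keys/RPM-GPG-KEY-example')
    #   data = fobj.read()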

    def _get_installonly_query(self, q=None):
        if q is None:
            q = self._sack.query(flags=hawkey.IGNORE_EXCLUDES)
        installonly = q.filter(provides=self.conf.installonlypkgs)
        return installonly

    def _report_icase_hint(self, pkg_spec):
        subj = dnf.subject.Subject(pkg_spec, ignore_case=True)
        solution = subj.get_best_solution(self.sack, with_nevra=True,
                                          with_provides=False, with_filenames=False)
        if solution['query'] and solution['nevra'] and solution['nevra'].name and \
                pkg_spec != solution['query'][0].name:
            logger.info(_("  * Maybe you meant: {}").format(solution['query'][0].name))

    def _select_remote_pkgs(self, install_pkgs):
        """ Check checksum of packages from local repositories and returns list packages from remote
        repositories that will be downloaded. Packages from commandline are skipped.

        :param install_pkgs: list of packages
        :return: list of remote pkgs
        """
        def _verification_of_packages(pkg_list, logger_msg):
            all_packages_verified = True
            for pkg in pkg_list:
                pkg_successfully_verified = False
                try:
                    pkg_successfully_verified = pkg.verifyLocalPkg()
                except Exception as e:
                    logger.critical(str(e))
                if pkg_successfully_verified is not True:
                    logger.critical(logger_msg.format(pkg, pkg.reponame))
                    all_packages_verified = False

            return all_packages_verified

        remote_pkgs = []
        local_repository_pkgs = []
        for pkg in install_pkgs:
            if pkg._is_local_pkg():
                if pkg.reponame != hawkey.CMDLINE_REPO_NAME:
                    local_repository_pkgs.append(pkg)
            else:
                remote_pkgs.append(pkg)

        msg = _('Package "{}" from local repository "{}" has incorrect checksum')
        if not _verification_of_packages(local_repository_pkgs, msg):
            raise dnf.exceptions.Error(
                _("Some packages from local repository have incorrect checksum"))

        if self.conf.cacheonly:
            msg = _('Package "{}" from repository "{}" has incorrect checksum')
            if not _verification_of_packages(remote_pkgs, msg):
                raise dnf.exceptions.Error(
                    _('Some packages have invalid cache, but cannot be downloaded due to '
                      '"--cacheonly" option'))
            remote_pkgs = []

        return remote_pkgs, local_repository_pkgs

    def _report_already_installed(self, packages):
        for pkg in packages:
            _msg_installed(pkg)

    def _raise_package_not_found_error(self, pkg_spec, forms, reponame):
        all_query = self.sack.query(flags=hawkey.IGNORE_EXCLUDES)
        subject = dnf.subject.Subject(pkg_spec)
        solution = subject.get_best_solution(
            self.sack, forms=forms, with_src=False, query=all_query)
        if reponame is not None:
            solution['query'].filterm(reponame=reponame)
        if not solution['query']:
            raise dnf.exceptions.PackageNotFoundError(_('No match for argument'), pkg_spec)
        else:
            with_regular_query = self.sack.query(flags=hawkey.IGNORE_REGULAR_EXCLUDES)
            with_regular_query = solution['query'].intersection(with_regular_query)
            # Modular filtering is applied on a package set that already has regular excludes
            # filtered out. So if a package wasn't filtered out by regular excludes, it must have
            # been filtered out by modularity.
            if with_regular_query:
                msg = _('All matches were filtered out by exclude filtering for argument')
            else:
                msg = _('All matches were filtered out by modular filtering for argument')
            raise dnf.exceptions.PackageNotFoundError(msg, pkg_spec)

    def _raise_package_not_installed_error(self, pkg_spec, forms, reponame):
        all_query = self.sack.query(flags=hawkey.IGNORE_EXCLUDES).installed()
        subject = dnf.subject.Subject(pkg_spec)
        solution = subject.get_best_solution(
            self.sack, forms=forms, with_src=False, query=all_query)

        if not solution['query']:
            raise dnf.exceptions.PackagesNotInstalledError(_('No match for argument'), pkg_spec)
        if reponame is not None:
            installed = [pkg for pkg in solution['query'] if self.history.repo(pkg) == reponame]
        else:
            installed = solution['query']
        if not installed:
            msg = _('All matches were installed from a different repository for argument')
        else:
            msg = _('All matches were filtered out by exclude filtering for argument')
        raise dnf.exceptions.PackagesNotInstalledError(msg, pkg_spec)

    def setup_loggers(self):
        # :api
        """
        Setup DNF file loggers based on given configuration file. The loggers are set the same
        way as if DNF was run from CLI.
        """
        self._logging._setup_from_dnf_conf(self.conf, file_loggers_only=True)
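
    # Usage sketch (illustrative only): enable DNF's file loggers when driving
    # the API outside the CLI, typically right after the configuration is read.
    #
    #   base = dnf.Base()
    #   base.setup_loggers()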

    def _skipped_packages(self, report_problems, transaction):
        """returns set of conflicting packages and set of packages with broken dependency that would
        be additionally installed when --best and --allowerasing"""
        if self._goal.actions & (hawkey.INSTALL | hawkey.UPGRADE | hawkey.UPGRADE_ALL):
            best = True
        else:
            best = False
        ng = deepcopy(self._goal)
        params = {"allow_uninstall": self._allow_erasing,
                  "force_best": best,
                  "ignore_weak": True}
        ret = ng.run(**params)
        if not ret and report_problems:
            msg = dnf.util._format_resolve_problems(ng.problem_rules())
            logger.warning(msg)
        problem_conflicts = set(ng.problem_conflicts(available=True))
        problem_dependency = set(ng.problem_broken_dependency(available=True)) - problem_conflicts

        def _nevra(item):
            return hawkey.NEVRA(name=item.name, epoch=item.epoch, version=item.version,
                                release=item.release, arch=item.arch)

        # Sometimes, pkg is not in transaction item, therefore, comparing by nevra
        transaction_nevras = [_nevra(tsi) for tsi in transaction]
        skipped_conflicts = set(
            [pkg for pkg in problem_conflicts if _nevra(pkg) not in transaction_nevras])
        skipped_dependency = set(
            [pkg for pkg in problem_dependency if _nevra(pkg) not in transaction_nevras])

        return skipped_conflicts, skipped_dependency


def _msg_installed(pkg):
    name = ucd(pkg)
    msg = _('Package %s is already installed.')
    logger.info(msg, name)
site-packages/dnf/i18n.py
# i18n.py
#
# Copyright (C) 2012-2016 Red Hat, Inc.
#
# This copyrighted material is made available to anyone wishing to use,
# modify, copy, or redistribute it subject to the terms and conditions of
# the GNU General Public License v.2, or (at your option) any later version.
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY expressed or implied, including the implied warranties of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General
# Public License for more details.  You should have received a copy of the
# GNU General Public License along with this program; if not, write to the
# Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
# 02110-1301, USA.  Any Red Hat trademarks that are incorporated in the
# source code or documentation are not subject to the GNU General Public
# License and may only be used or replicated with the express permission of
# Red Hat, Inc.
#

from __future__ import print_function
from __future__ import unicode_literals
from dnf.pycomp import unicode

import dnf
import locale
import os
import signal
import sys
import unicodedata

"""
Centralize i18n stuff here. Must be unittested.
"""

class UnicodeStream(object):
    def __init__(self, stream, encoding):
        self.stream = stream
        self.encoding = encoding

    def write(self, s):
        if not isinstance(s, str):
            s = (s.decode(self.encoding, 'replace') if dnf.pycomp.PY3 else
                 s.encode(self.encoding, 'replace'))
        try:
            self.stream.write(s)
        except UnicodeEncodeError:
            s_bytes = s.encode(self.stream.encoding, 'backslashreplace')
            if hasattr(self.stream, 'buffer'):
                self.stream.buffer.write(s_bytes)
            else:
                s = s_bytes.decode(self.stream.encoding, 'ignore')
                self.stream.write(s)


    def __getattr__(self, name):
        return getattr(self.stream, name)

def _full_ucd_support(encoding):
    """Return true if encoding can express any Unicode character.

    Even if an encoding can express all accented letters in the given language,
    we can't generally settle for it in DNF since sometimes we output special
    characters like the registered trademark symbol (U+00AE) and surprisingly
    many national non-unicode encodings, including e.g. ASCII and ISO-8859-2,
    don't contain it.

    """
    if encoding is None:
        return False
    lower = encoding.lower()
    if lower.startswith('utf-') or lower.startswith('utf_'):
        return True
    return False

def _guess_encoding():
    """ Take the best shot at the current system's string encoding. """
    encoding = locale.getpreferredencoding(False)
    return 'utf-8' if encoding.startswith("ANSI") else encoding

def setup_locale():
    try:
        dnf.pycomp.setlocale(locale.LC_ALL, '')
    except locale.Error:
        # default to C.UTF-8 or C locale if we got a failure.
        try:
            dnf.pycomp.setlocale(locale.LC_ALL, 'C.UTF-8')
            os.environ['LC_ALL'] = 'C.UTF-8'
        except locale.Error:
            dnf.pycomp.setlocale(locale.LC_ALL, 'C')
            os.environ['LC_ALL'] = 'C'
        print('Failed to set locale, defaulting to {}'.format(os.environ['LC_ALL']),
              file=sys.stderr)

def setup_stdout():
    """ Check that stdout is of suitable encoding and handle the situation if
        not.

        Returns True if stdout was of suitable encoding already and no changes
        were needed.
    """
    stdout = sys.stdout
    if not stdout.isatty():
        signal.signal(signal.SIGPIPE, signal.SIG_DFL)
    try:
        encoding = stdout.encoding
    except AttributeError:
        encoding = None
    if not _full_ucd_support(encoding):
        sys.stdout = UnicodeStream(stdout, _guess_encoding())
        return False
    return True


def ucd_input(ucstring):
    # :api, deprecated in 2.0.0, will be erased when python2 is abandoned
    """ It uses print instead of passing the prompt to raw_input.

        raw_input doesn't encode the passed string and the output
        goes into stderr
    """
    print(ucstring, end='')
    return dnf.pycomp.raw_input()


def ucd(obj):
    # :api, deprecated in 2.0.0, will be erased when python2 is abandoned
    """ Like the builtin unicode() but tries to use a reasonable encoding. """
    if dnf.pycomp.PY3:
        if dnf.pycomp.is_py3bytes(obj):
            return str(obj, _guess_encoding(), errors='ignore')
        elif isinstance(obj, str):
            return obj
        return str(obj)
    else:
        if isinstance(obj, dnf.pycomp.unicode):
            return obj
        if hasattr(obj, '__unicode__'):
            # see the doc for the unicode() built-in. The logic here is: if obj
            # implements __unicode__, let it take a crack at it, but handle the
            # situation if it fails:
            try:
                return dnf.pycomp.unicode(obj)
            except UnicodeError:
                pass
        return dnf.pycomp.unicode(str(obj), _guess_encoding(), errors='ignore')


# Functions for formatting output according to terminal width.
# They should be used instead of built-in functions to account for the
# different display widths of Unicode characters.

def _exact_width_char(uchar):
    return 2 if unicodedata.east_asian_width(uchar) in ('W', 'F') else 1


def chop_str(msg, chop=None):
    """ Return the textual width of a Unicode string, chopping it to
        a specified value. This is what you want to use instead of %.*s, as it
        does the "right" thing with regard to different Unicode character width
        Eg. "%.*s" % (10, msg)   <= becomes => "%s" % (chop_str(msg, 10)) """

    if chop is None:
        return exact_width(msg), msg

    width = 0
    chopped_msg = ""
    for char in msg:
        char_width = _exact_width_char(char)
        if width + char_width > chop:
            break
        chopped_msg += char
        width += char_width
    return width, chopped_msg


def exact_width(msg):
    """ Calculates width of char at terminal screen
        (Asian char counts for two) """
    return sum(_exact_width_char(c) for c in msg)


def fill_exact_width(msg, fill, chop=None, left=True, prefix='', suffix=''):
    """ Expand a msg to a specified "width" or chop to same.
        Expansion can be left or right. This is what you want to use instead of
        %*.*s, as it does the "right" thing with regard to different Unicode
        character width.
        prefix and suffix should be used for "invisible" bytes, like
        highlighting.

        Examples:

        ``"%-*.*s" % (10, 20, msg)`` becomes
            ``"%s" % (fill_exact_width(msg, 10, 20))``.

        ``"%20.10s" % (msg)`` becomes
            ``"%s" % (fill_exact_width(msg, 20, 10, left=False))``.

        ``"%s%.10s%s" % (pre, msg, suf)`` becomes
            ``"%s" % (fill_exact_width(msg, 0, 10, prefix=pre, suffix=suf))``.
        """
    width, msg = chop_str(msg, chop)

    if width >= fill:
        if prefix or suffix:
            msg = ''.join([prefix, msg, suffix])
    else:
        extra = " " * (fill - width)
        if left:
            msg = ''.join([prefix, msg, suffix, extra])
        else:
            msg = ''.join([extra, prefix, msg, suffix])

    return msg


def textwrap_fill(text, width=70, initial_indent='', subsequent_indent=''):
    """ Works like we want textwrap.wrap() to work, uses Unicode strings
        and doesn't screw up lists/blocks/etc. """

    def _indent_at_beg(line):
        count = 0
        byte = 'X'
        for byte in line:
            if byte != ' ':
                break
            count += 1
        if byte not in ("-", "*", ".", "o", '\xe2'):
            return count, 0
        list_chr = chop_str(line[count:], 1)[1]
        if list_chr in ("-", "*", ".", "o",
                        "\u2022", "\u2023", "\u2218"):
            nxt = _indent_at_beg(line[count+len(list_chr):])
            nxt = nxt[1] or nxt[0]
            if nxt:
                return count, count + 1 + nxt
        return count, 0

    text = text.rstrip('\n')
    lines = text.replace('\t', ' ' * 8).split('\n')

    ret = []
    indent = initial_indent
    wrap_last = False
    csab = 0
    cspc_indent = 0
    for line in lines:
        line = line.rstrip(' ')
        (lsab, lspc_indent) = (csab, cspc_indent)
        (csab, cspc_indent) = _indent_at_beg(line)
        force_nl = False # We want to stop wrapping under "certain" conditions:
        if wrap_last and cspc_indent:        # if line starts a list or
            force_nl = True
        if wrap_last and csab == len(line):  # is empty line
            force_nl = True
        # if line doesn't continue a list and is "block indented"
        if wrap_last and not lspc_indent:
            if csab >= 4 and csab != lsab:
                force_nl = True
        if force_nl:
            ret.append(indent.rstrip(' '))
            indent = subsequent_indent
            wrap_last = False
        if csab == len(line):  # empty line, remove spaces to make it easier.
            line = ''
        if wrap_last:
            line = line.lstrip(' ')
            cspc_indent = lspc_indent

        if exact_width(indent + line) <= width:
            wrap_last = False
            ret.append(indent + line)
            indent = subsequent_indent
            continue

        wrap_last = True
        words = line.split(' ')
        line = indent
        spcs = cspc_indent
        if not spcs and csab >= 4:
            spcs = csab
        for word in words:
            if (width < exact_width(line + word)) and \
               (exact_width(line) > exact_width(subsequent_indent)):
                ret.append(line.rstrip(' '))
                line = subsequent_indent + ' ' * spcs
            line += word
            line += ' '
        indent = line.rstrip(' ') + ' '
    if wrap_last:
        ret.append(indent.rstrip(' '))

    return '\n'.join(ret)
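
# Illustrative usage (sketch, not part of the original module): unlike
# textwrap.wrap(), textwrap_fill() measures display cells and keeps
# list/block indentation intact, e.g.
#
#     textwrap_fill("one two three four", width=10)
#     # -> "one two\nthree four"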


def select_short_long(width, msg_short, msg_long):
    """ Automatically selects the short (abbreviated) or long (full) message
        depending on whether we have enough screen space to display the full
        message or not. If a caller by mistake passes a long string as
        msg_short and a short string as a msg_long this function recognizes
        the mistake and swaps the arguments. This function is especially useful
        in the i18n context when you cannot predict how long are the translated
        messages.

        Limitations:

        1. If msg_short is longer than width you will still get an overflow.
           This function does not abbreviate the string.
        2. You are not obliged to provide an actually abbreviated string, it is
           perfectly correct to pass the same string twice if you don't want
           any abbreviation. However, if you provide two different strings but
           having the same width this function is unable to recognize which one
           is correct and you should assume that it is unpredictable which one
           is returned.

       Example:

       ``select_short_long (10, _("Repo"), _("Repository"))``

       will return "Repository" in English but the results in other languages
       may be different. """
    width_short = exact_width(msg_short)
    width_long = exact_width(msg_long)
    # If we have two strings of the same width:
    if width_short == width_long:
        return msg_long
    # If the short string is wider than the long string:
    elif width_short > width_long:
        return msg_short if width_short <= width else msg_long
    # The regular case:
    else:
        return msg_long if width_long <= width else msg_short


def translation(name):
    # :api, deprecated in 2.0.0, will be erased when python2 is abandoned
    """ Easy gettext translations setup based on given domain name """

    setup_locale()
    def ucd_wrapper(fnc):
        return lambda *w: ucd(fnc(*w))
    t = dnf.pycomp.gettext.translation(name, fallback=True)
    return map(ucd_wrapper, dnf.pycomp.gettext_setup(t))


def pgettext(context, message):
    result = _(context + chr(4) + message)
    if "\004" in result:
        return message
    else:
        return result
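
# Illustrative note (sketch, not part of the original module): gettext stores
# a context-qualified message under "<context>\x04<msgid>", so e.g.
#
#     pgettext("currently", "Installing")
#     # -> the translation of "currently\x04Installing" when the catalog has
#     #    one, otherwise the plain, untranslated "Installing"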

# setup translations
_, P_ = translation("dnf")
C_ = pgettext
site-packages/dnf/rpm/connection.py000064400000002531147511334650013354 0ustar00# connection.py
# Maintain RPMDB connections.
#
# Copyright (C) 2012-2016 Red Hat, Inc.
#
# This copyrighted material is made available to anyone wishing to use,
# modify, copy, or redistribute it subject to the terms and conditions of
# the GNU General Public License v.2, or (at your option) any later version.
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY expressed or implied, including the implied warranties of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General
# Public License for more details.  You should have received a copy of the
# GNU General Public License along with this program; if not, write to the
# Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
# 02110-1301, USA.  Any Red Hat trademarks that are incorporated in the
# source code or documentation are not subject to the GNU General Public
# License and may only be used or replicated with the express permission of
# Red Hat, Inc.
#

from __future__ import absolute_import
from __future__ import unicode_literals

from .transaction import initReadOnlyTransaction
import dnf.util

class RpmConnection(object):
    def __init__(self, root):
        self.root = root

    @property
    @dnf.util.lazyattr("_readonly_ts")
    def readonly_ts(self):
        return initReadOnlyTransaction(self.root)
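
# Illustrative usage (sketch, not part of the original module):
#
#     conn = RpmConnection('/')
#     ts = conn.readonly_ts          # built lazily on first access
#     same_ts = conn.readonly_ts     # subsequent accesses reuse the cached
#                                    # transaction (dnf.util.lazyattr)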
site-packages/dnf/rpm/__init__.py000064400000011300147511334650012746 0ustar00# __init__.py
#
# Copyright (C) 2012-2016 Red Hat, Inc.
#
# This copyrighted material is made available to anyone wishing to use,
# modify, copy, or redistribute it subject to the terms and conditions of
# the GNU General Public License v.2, or (at your option) any later version.
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY expressed or implied, including the implied warranties of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General
# Public License for more details.  You should have received a copy of the
# GNU General Public License along with this program; if not, write to the
# Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
# 02110-1301, USA.  Any Red Hat trademarks that are incorporated in the
# source code or documentation are not subject to the GNU General Public
# License and may only be used or replicated with the express permission of
# Red Hat, Inc.
#

from __future__ import absolute_import
from __future__ import unicode_literals
from . import transaction
from dnf.pycomp import is_py3bytes
import dnf.const
import dnf.exceptions
import rpm  # used by ansible (dnf.rpm.rpm.labelCompare in lib/ansible/modules/packaging/os/dnf.py)


def detect_releasever(installroot):
    # :api
    """Calculate the release version for the system."""

    ts = transaction.initReadOnlyTransaction(root=installroot)
    ts.pushVSFlags(~(rpm._RPMVSF_NOSIGNATURES | rpm._RPMVSF_NODIGESTS))
    for distroverpkg in dnf.const.DISTROVERPKG:
        if dnf.pycomp.PY3:
            distroverpkg = bytes(distroverpkg, 'utf-8')
        try:
            idx = ts.dbMatch('provides', distroverpkg)
        except (TypeError, rpm.error) as e:
            raise dnf.exceptions.Error('Error: %s' % str(e))
        if not len(idx):
            continue
        try:
            hdr = next(idx)
        except StopIteration:
            msg = 'Error: rpmdb failed to list provides. Try: rpm --rebuilddb'
            raise dnf.exceptions.Error(msg)
        releasever = hdr['version']
        try:
            try:
                # header returns bytes -> look for bytes
                # it may fail because rpm returns a decoded string since 10 Apr 2019
                off = hdr[rpm.RPMTAG_PROVIDENAME].index(distroverpkg)
            except ValueError:
                # header returns a string -> look for a string
                off = hdr[rpm.RPMTAG_PROVIDENAME].index(distroverpkg.decode("utf8"))
            flag = hdr[rpm.RPMTAG_PROVIDEFLAGS][off]
            ver = hdr[rpm.RPMTAG_PROVIDEVERSION][off]
            if flag == rpm.RPMSENSE_EQUAL and ver:
                if hdr['name'] not in (distroverpkg, distroverpkg.decode("utf8")):
                    # override the package version
                    releasever = ver
        except (ValueError, KeyError, IndexError):
            pass

        if is_py3bytes(releasever):
            releasever = str(releasever, "utf-8")
        return releasever
    return None
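
# Illustrative usage (sketch, not part of the original module):
#
#     releasever = detect_releasever('/')
#     # e.g. '34' when a package listed in dnf.const.DISTROVERPKG provides the
#     # release version, or None when no such package is installed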


def _header(path):
    """Return RPM header of the file."""
    ts = transaction.initReadOnlyTransaction()
    with open(path) as package:
        fdno = package.fileno()
        try:
            hdr = ts.hdrFromFdno(fdno)
        except rpm.error as e:
            raise dnf.exceptions.Error("{0}: '{1}'".format(e, path))
        return hdr


def _invert(dct):
    return {v: k for k in dct for v in dct[k]}
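
# Illustrative example (sketch, not part of the original module): _invert()
# flattens the {basearch: (arches, ...)} mapping below into {arch: basearch},
# e.g. _invert({'x86_64': ('x86_64', 'amd64')}) == {'x86_64': 'x86_64',
# 'amd64': 'x86_64'}.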

_BASEARCH_MAP = _invert({
    'aarch64': ('aarch64',),
    'alpha': ('alpha', 'alphaev4', 'alphaev45', 'alphaev5', 'alphaev56',
              'alphaev6', 'alphaev67', 'alphaev68', 'alphaev7', 'alphapca56'),
    'arm': ('armv5tejl', 'armv5tel', 'armv5tl', 'armv6l', 'armv7l', 'armv8l'),
    'armhfp': ('armv6hl', 'armv7hl', 'armv7hnl', 'armv8hl'),
    'i386': ('i386', 'athlon', 'geode', 'i386', 'i486', 'i586', 'i686'),
    'ia64': ('ia64',),
    'mips': ('mips',),
    'mipsel': ('mipsel',),
    'mips64': ('mips64',),
    'mips64el': ('mips64el',),
    'noarch': ('noarch',),
    'ppc': ('ppc',),
    'ppc64': ('ppc64', 'ppc64iseries', 'ppc64p7', 'ppc64pseries'),
    'ppc64le': ('ppc64le',),
    'riscv32' : ('riscv32',),
    'riscv64' : ('riscv64',),
    'riscv128' : ('riscv128',),
    's390': ('s390',),
    's390x': ('s390x',),
    'sh3': ('sh3',),
    'sh4': ('sh4', 'sh4a'),
    'sparc': ('sparc', 'sparc64', 'sparc64v', 'sparcv8', 'sparcv9',
              'sparcv9v'),
    'x86_64': ('x86_64', 'amd64', 'ia32e'),
})


def basearch(arch):
    # :api
    return _BASEARCH_MAP[arch]
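
# Illustrative usage (sketch, not part of the original module):
#
#     basearch('i686')    # -> 'i386'
#     basearch('x86_64')  # -> 'x86_64'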


def getheader(rpm_hdr, key):
    '''
    Returns value of rpm_hdr[key] as a string. Rpm has switched from bytes to str
    and we need to handle both properly.
    '''
    value = rpm_hdr[key]
    if is_py3bytes(value):
        value = str(value, "utf-8")
    return value
site-packages/dnf/rpm/miscutils.py000064400000007433147511334650013237 0ustar00# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU Library General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
# Copyright 2003 Duke University

from __future__ import print_function, absolute_import, unicode_literals

import os
import subprocess
import logging
from shutil import which

from dnf.i18n import _

_logger = logging.getLogger('dnf')
_rpmkeys_binary = None

def _find_rpmkeys_binary():
    global _rpmkeys_binary
    if _rpmkeys_binary is None:
        _rpmkeys_binary = which("rpmkeys")
        _logger.debug(_('Using rpmkeys executable at %s to verify signatures'),
                      _rpmkeys_binary)
    return _rpmkeys_binary

def _process_rpm_output(data):
    # No signatures or digests = corrupt package.
    # There is at least one line for -: and another (empty) entry after the
    # last newline.
    if len(data) < 3 or data[0] != b'-:' or data[-1]:
        return 2
    seen_sig, missing_key, not_trusted, not_signed = False, False, False, False
    for i in data[1:-1]:
        if b': BAD' in i:
            return 2
        elif i.endswith(b': NOKEY'):
            missing_key = True
        elif i.endswith(b': NOTTRUSTED'):
            not_trusted = True
        elif i.endswith(b': NOTFOUND'):
            not_signed = True
        elif not i.endswith(b': OK'):
            return 2
    if not_trusted:
        return 3
    elif missing_key:
        return 1
    elif not_signed:
        return 4
    # we still check return code, so this is safe
    return 0
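
# Illustrative example (sketch, not part of the original module; the lines are
# simplified, not literal rpmkeys output):
#
#     _process_rpm_output([b'-:', b'    Header RSA/SHA256 Signature: OK',
#                          b'    Payload SHA256 digest: OK', b''])          # -> 0
#     _process_rpm_output([b'-:', b'    RSA/SHA256 Signature: NOKEY', b''])  # -> 1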

def _verifyPackageUsingRpmkeys(package, installroot):
    rpmkeys_binary = _find_rpmkeys_binary()
    if rpmkeys_binary is None or not os.path.isfile(rpmkeys_binary):
        _logger.critical(_('Cannot find rpmkeys executable to verify signatures.'))
        return 2

    # "--define=_pkgverify_level signature" enforces signature checking;
    # "--define=_pkgverify_flags 0x0" ensures that all signatures are checked.
    args = ('rpmkeys', '--checksig', '--root', installroot, '--verbose',
            '--define=_pkgverify_level signature', '--define=_pkgverify_flags 0x0',
            '-')
    with subprocess.Popen(
            args=args,
            executable=rpmkeys_binary,
            env={'LC_ALL': 'C'},
            stdout=subprocess.PIPE,
            cwd='/',
            stdin=package) as p:
        data = p.communicate()[0]
    returncode = p.returncode
    if type(returncode) is not int:
        raise AssertionError('Popen set return code to non-int')
    # rpmkeys can return something other than 0 or 1 in the case of a
    # fatal error (OOM, abort() called, SIGSEGV, etc)
    if returncode >= 2 or returncode < 0:
        return 2
    ret = _process_rpm_output(data.split(b'\n'))
    if ret:
        return ret
    return 2 if returncode else 0

def checkSig(ts, package):
    """Takes a transaction set and a package, check it's sigs,
    return 0 if they are all fine
    return 1 if the gpg key can't be found
    return 2 if the header is in someway damaged
    return 3 if the key is not trusted
    return 4 if the pkg is not gpg or pgp signed"""

    fdno = os.open(package, os.O_RDONLY|os.O_NOCTTY|os.O_CLOEXEC)
    try:
        value = _verifyPackageUsingRpmkeys(fdno, ts.ts.rootDir)
    finally:
        os.close(fdno)
    return value
site-packages/dnf/rpm/transaction.py000064400000007353147511334650013551 0ustar00#
# Client code for Update Agent
# Copyright (c) 1999-2002 Red Hat, Inc.  Distributed under GPL.
#
#         Adrian Likins <alikins@redhat.com>
# Some Edits by Seth Vidal <skvidal@phy.duke.edu>
#
# a couple of classes wrapping up transactions so that we
#    can share transactions instead of creating new ones all over
#

from __future__ import absolute_import
from __future__ import unicode_literals
from dnf.i18n import _
import rpm

read_ts = None
ts = None

# wrapper/proxy class for rpm.Transaction so we can
# instrument it, etc easily
class TransactionWrapper(object):
    def __init__(self, root='/'):
        self.ts = rpm.TransactionSet(root)
        self._methods = ['check',
                         'order',
                         'addErase',
                         'addInstall',
                         'addReinstall',
                         'run',
                         'pgpImportPubkey',
                         'pgpPrtPkts',
                         'problems',
                         'setFlags',
                         'setVSFlags',
                         'setProbFilter',
                         'hdrFromFdno',
                         'next',
                         'clean']
        self.tsflags = []
        self.open = True

    def __del__(self):
        # Automatically close the rpm transaction when the reference is lost
        self.close()

    def close(self):
        if self.open:
            self.ts.closeDB()
            self.ts = None
            self.open = False

    def dbMatch(self, *args, **kwds):
        if 'patterns' in kwds:
            patterns = kwds.pop('patterns')
        else:
            patterns = []

        mi = self.ts.dbMatch(*args, **kwds)
        for (tag, tp, pat) in patterns:
            mi.pattern(tag, tp, pat)
        return mi

    def __getattr__(self, attr):
        if attr in self._methods:
            return self.getMethod(attr)
        else:
            raise AttributeError(attr)

    def __iter__(self):
        return self.ts

    def getMethod(self, method):
        # in theory, we can override this with
        # profile/etc info
        return getattr(self.ts, method)

    # push/pop methods so we don't lose the previously
    # set value, and we can potentially debug a bit
    # more easily
    def pushVSFlags(self, flags):
        self.tsflags.append(flags)
        self.ts.setVSFlags(self.tsflags[-1])

    def addTsFlag(self, flag):
        curflags = self.ts.setFlags(0)
        self.ts.setFlags(curflags | flag)

    def getTsFlags(self):
        curflags = self.ts.setFlags(0)
        self.ts.setFlags(curflags)
        return curflags

    def isTsFlagSet(self, flag):
        val = self.getTsFlags()
        return bool(flag & val)

    def setScriptFd(self, fd):
        self.ts.scriptFd = fd.fileno()

    def test(self, cb, conf={}):
        """tests the ts we've setup, takes a callback function and a conf dict
           for flags and what not"""

        origflags = self.getTsFlags()
        self.addTsFlag(rpm.RPMTRANS_FLAG_TEST)
        # FIXME GARBAGE - remove once this is reimplemented elsewhere
        # KEEPING FOR API COMPLIANCE ONLY
        if conf.get('diskspacecheck') == 0:
            self.ts.setProbFilter(rpm.RPMPROB_FILTER_DISKSPACE)
        tserrors = self.ts.run(cb.callback, '')
        self.ts.setFlags(origflags)

        reserrors = []
        if tserrors is not None:
            for (descr, (etype, mount, need)) in tserrors:
                reserrors.append(descr)
            if not reserrors:
                reserrors.append(_('Errors occurred during test transaction.'))

        return reserrors

def initReadOnlyTransaction(root='/'):
    read_ts =  TransactionWrapper(root=root)
    read_ts.pushVSFlags((rpm._RPMVSF_NOSIGNATURES|rpm._RPMVSF_NODIGESTS))
    return read_ts
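
# Illustrative usage (sketch, not part of the original module):
#
#     ts = initReadOnlyTransaction('/')
#     hdr = ts.hdrFromFdno(fdno)   # fdno: descriptor of an open package file;
#                                  # signature/digest checks are disabled here
#     ts.close()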
site-packages/dnf/rpm/error.py000064400000002006147511334650012343 0ustar00# error.py
# RpmUtilsError
#
# Copyright (C) 2012-2016 Red Hat, Inc.
#
# This copyrighted material is made available to anyone wishing to use,
# modify, copy, or redistribute it subject to the terms and conditions of
# the GNU General Public License v.2, or (at your option) any later version.
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY expressed or implied, including the implied warranties of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General
# Public License for more details.  You should have received a copy of the
# GNU General Public License along with this program; if not, write to the
# Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
# 02110-1301, USA.  Any Red Hat trademarks that are incorporated in the
# source code or documentation are not subject to the GNU General Public
# License and may only be used or replicated with the express permission of
# Red Hat, Inc.
#

class RpmUtilsError(Exception):
    pass
site-packages/dnf/transaction.py000064400000010455147511334650012750 0ustar00# -*- coding: utf-8 -*-

# transaction.py
# Managing the transaction to be passed to RPM.
#
# Copyright (C) 2013-2018 Red Hat, Inc.
#
# This copyrighted material is made available to anyone wishing to use,
# modify, copy, or redistribute it subject to the terms and conditions of
# the GNU General Public License v.2, or (at your option) any later version.
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY expressed or implied, including the implied warranties of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General
# Public License for more details.  You should have received a copy of the
# GNU General Public License along with this program; if not, write to the
# Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
# 02110-1301, USA.  Any Red Hat trademarks that are incorporated in the
# source code or documentation are not subject to the GNU General Public
# License and may only be used or replicated with the express permission of
# Red Hat, Inc.
#

from __future__ import absolute_import
from __future__ import unicode_literals

import libdnf.transaction

from dnf.i18n import _, C_

# :api - all action constants are considered an API

# per-package actions - from libdnf
PKG_DOWNGRADE = libdnf.transaction.TransactionItemAction_DOWNGRADE
PKG_DOWNGRADED = libdnf.transaction.TransactionItemAction_DOWNGRADED
PKG_INSTALL = libdnf.transaction.TransactionItemAction_INSTALL
PKG_OBSOLETE = libdnf.transaction.TransactionItemAction_OBSOLETE
PKG_OBSOLETED = libdnf.transaction.TransactionItemAction_OBSOLETED
PKG_REINSTALL = libdnf.transaction.TransactionItemAction_REINSTALL
PKG_REINSTALLED = libdnf.transaction.TransactionItemAction_REINSTALLED
PKG_REMOVE = libdnf.transaction.TransactionItemAction_REMOVE
PKG_UPGRADE = libdnf.transaction.TransactionItemAction_UPGRADE
PKG_UPGRADED = libdnf.transaction.TransactionItemAction_UPGRADED

# compatibility
PKG_ERASE = PKG_REMOVE

# per-package actions - additional
PKG_CLEANUP = 101
PKG_VERIFY = 102
PKG_SCRIPTLET = 103

# transaction-wide actions
TRANS_PREPARATION = 201
TRANS_POST = 202


# packages that appeared on the system
FORWARD_ACTIONS = [
    libdnf.transaction.TransactionItemAction_INSTALL,
    libdnf.transaction.TransactionItemAction_DOWNGRADE,
    libdnf.transaction.TransactionItemAction_OBSOLETE,
    libdnf.transaction.TransactionItemAction_UPGRADE,
    libdnf.transaction.TransactionItemAction_REINSTALL,
]


# packages that got removed from the system
BACKWARD_ACTIONS = [
    libdnf.transaction.TransactionItemAction_DOWNGRADED,
    libdnf.transaction.TransactionItemAction_OBSOLETED,
    libdnf.transaction.TransactionItemAction_UPGRADED,
    libdnf.transaction.TransactionItemAction_REMOVE,
# TODO: REINSTALLED may or may not belong here; the same NEVRA is in FORWARD_ACTIONS already
#    libdnf.transaction.TransactionItemAction_REINSTALLED,
]


ACTIONS = {
    # TRANSLATORS: This is for a single package currently being downgraded.
    PKG_DOWNGRADE: C_('currently', 'Downgrading'),
    PKG_DOWNGRADED: _('Cleanup'),
    # TRANSLATORS: This is for a single package currently being installed.
    PKG_INSTALL: C_('currently', 'Installing'),
    PKG_OBSOLETE: _('Obsoleting'),
    PKG_OBSOLETED: _('Obsoleting'),
    # TRANSLATORS: This is for a single package currently being reinstalled.
    PKG_REINSTALL: C_('currently', 'Reinstalling'),
    PKG_REINSTALLED: _('Cleanup'),
    # TODO: 'Removing'?
    PKG_REMOVE: _('Erasing'),
    # TRANSLATORS: This is for a single package currently being upgraded.
    PKG_UPGRADE: C_('currently', 'Upgrading'),
    PKG_UPGRADED: _('Cleanup'),

    PKG_CLEANUP: _('Cleanup'),
    PKG_VERIFY: _('Verifying'),
    PKG_SCRIPTLET: _('Running scriptlet'),

    TRANS_PREPARATION: _('Preparing'),
    # TODO: TRANS_POST
}


# untranslated strings, logging to /var/log/dnf/dnf.rpm.log
FILE_ACTIONS = {
    PKG_DOWNGRADE: 'Downgrade',
    PKG_DOWNGRADED: 'Downgraded',
    PKG_INSTALL: 'Installed',
    PKG_OBSOLETE: 'Obsolete',
    PKG_OBSOLETED: 'Obsoleted',
    PKG_REINSTALL: 'Reinstall',
    PKG_REINSTALLED: 'Reinstalled',
    # TODO: 'Removed'?
    PKG_REMOVE: 'Erase',
    PKG_UPGRADE: 'Upgrade',
    PKG_UPGRADED: 'Upgraded',

    PKG_CLEANUP: 'Cleanup',
    PKG_VERIFY: 'Verified',
    PKG_SCRIPTLET: 'Running scriptlet',

    TRANS_PREPARATION: 'Preparing',
    # TODO: TRANS_POST
}
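
# Illustrative usage (sketch, not part of the original module): ACTIONS holds
# the translated labels for user-facing progress output, FILE_ACTIONS the
# untranslated strings written to dnf.rpm.log, e.g.
#
#     ACTIONS[PKG_INSTALL]        # -> 'Installing' (translated)
#     FILE_ACTIONS[PKG_INSTALL]   # -> 'Installed'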
site-packages/dnf/query.py000064400000003063147511334650011565 0ustar00# query.py
# Implements Query.
#
# Copyright (C) 2012-2016 Red Hat, Inc.
#
# This copyrighted material is made available to anyone wishing to use,
# modify, copy, or redistribute it subject to the terms and conditions of
# the GNU General Public License v.2, or (at your option) any later version.
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY expressed or implied, including the implied warranties of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General
# Public License for more details.  You should have received a copy of the
# GNU General Public License along with this program; if not, write to the
# Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
# 02110-1301, USA.  Any Red Hat trademarks that are incorporated in the
# source code or documentation are not subject to the GNU General Public
# License and may only be used or replicated with the express permission of
# Red Hat, Inc.
#

from __future__ import absolute_import
from __future__ import unicode_literals
import hawkey

from hawkey import Query
from dnf.i18n import ucd
from dnf.pycomp import basestring



def _by_provides(sack, patterns, ignore_case=False, get_query=False):
    if isinstance(patterns, basestring):
        patterns = [patterns]

    q = sack.query()
    flags = []
    if ignore_case:
        flags.append(hawkey.ICASE)

    q.filterm(*flags, provides__glob=patterns)
    if get_query:
        return q
    return q.run()

def _per_nevra_dict(pkg_list):
    return {ucd(pkg):pkg for pkg in pkg_list}
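
# Illustrative usage (sketch, not part of the original module; 'base' and the
# pattern are hypothetical):
#
#     pkgs = _by_provides(base.sack, 'webserver')   # packages providing 'webserver'
#     nevra_map = _per_nevra_dict(pkgs)             # {'<name-evr.arch>': <package>, ...}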
site-packages/dnf/persistor.py000064400000011157147511334650012455 0ustar00# persistor.py
# Persistence data container.
#
# Copyright (C) 2013-2016 Red Hat, Inc.
#
# This copyrighted material is made available to anyone wishing to use,
# modify, copy, or redistribute it subject to the terms and conditions of
# the GNU General Public License v.2, or (at your option) any later version.
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY expressed or implied, including the implied warranties of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General
# Public License for more details.  You should have received a copy of the
# GNU General Public License along with this program; if not, write to the
# Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
# 02110-1301, USA.  Any Red Hat trademarks that are incorporated in the
# source code or documentation are not subject to the GNU General Public
# License and may only be used or replicated with the express permission of
# Red Hat, Inc.
#

# The current implementation is storing to files in persistdir. Do not depend on
# specific files existing, instead use the persistor API. The underlying
# implementation can change, e.g. for one general file with a serialized dict of
# data etc.

from __future__ import absolute_import
from __future__ import unicode_literals
from dnf.i18n import _
import distutils.version
import dnf.util
import errno
import fnmatch
import json
import logging
import os
import re

logger = logging.getLogger("dnf")


class JSONDB(object):

    def _check_json_db(self, json_path):
        if not os.path.isfile(json_path):
            # initialize new db
            dnf.util.ensure_dir(os.path.dirname(json_path))
            self._write_json_db(json_path, [])

    def _get_json_db(self, json_path, default=[]):
        with open(json_path, 'r') as f:
            content = f.read()
        if content == "":
            # empty file is invalid json format
            logger.warning(_("%s is empty file"), json_path)
            self._write_json_db(json_path, default)
        else:
            try:
                default = json.loads(content)
            except ValueError as e:
                logger.warning(e)
        return default

    @staticmethod
    def _write_json_db(json_path, content):
        with open(json_path, 'w') as f:
            json.dump(content, f)


class RepoPersistor(JSONDB):
    """Persistent data kept for repositories.

    Is arch/releasever specific and stores to cachedir.

    """

    def __init__(self, cachedir):
        self.cachedir = cachedir
        self.db_path = os.path.join(self.cachedir, "expired_repos.json")
        self.expired_to_add = set()
        self.reset_last_makecache = False

    @property
    def _last_makecache_path(self):
        return os.path.join(self.cachedir, "last_makecache")

    def get_expired_repos(self):
        try:
            self._check_json_db(self.db_path)
            return set(self._get_json_db(self.db_path))
        except OSError as e:
            logger.warning(_("Failed to load expired repos cache: %s"), e)
            return None

    def save(self):
        try:
            self._check_json_db(self.db_path)
            self._write_json_db(self.db_path, list(self.expired_to_add))
        except OSError as e:
            logger.warning(_("Failed to store expired repos cache: %s"), e)
            return False
        if self.reset_last_makecache:
            try:
                dnf.util.touch(self._last_makecache_path)
                return True
            except IOError:
                logger.warning(_("Failed storing last makecache time."))
                return False

    def since_last_makecache(self):
        try:
            return int(dnf.util.file_age(self._last_makecache_path))
        except OSError:
            logger.warning(_("Failed determining last makecache time."))
            return None


class TempfilePersistor(JSONDB):

    def __init__(self, cachedir):
        self.db_path = os.path.join(cachedir, "tempfiles.json")
        self.tempfiles_to_add = set()
        self._empty = False

    def get_saved_tempfiles(self):
        self._check_json_db(self.db_path)
        return self._get_json_db(self.db_path)

    def save(self):
        if not self._empty and not self.tempfiles_to_add:
            return
        self._check_json_db(self.db_path)
        if self._empty:
            self._write_json_db(self.db_path, [])
            return
        if self.tempfiles_to_add:
            data = set(self._get_json_db(self.db_path))
            data.update(self.tempfiles_to_add)
            self._write_json_db(self.db_path, list(data))

    def empty(self):
        self._empty = True
site-packages/dnf/history.py000064400000002176147511334650012125 0ustar00# history.py
# Interfaces to the history of transactions.
#
# Copyright (C) 2013-2018 Red Hat, Inc.
#
# This copyrighted material is made available to anyone wishing to use,
# modify, copy, or redistribute it subject to the terms and conditions of
# the GNU General Public License v.2, or (at your option) any later version.
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY expressed or implied, including the implied warranties of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General
# Public License for more details.  You should have received a copy of the
# GNU General Public License along with this program; if not, write to the
# Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
# 02110-1301, USA.  Any Red Hat trademarks that are incorporated in the
# source code or documentation are not subject to the GNU General Public
# License and may only be used or replicated with the express permission of
# Red Hat, Inc.
#

"""Interfaces to the history of transactions."""

from __future__ import absolute_import
from __future__ import unicode_literals

site-packages/dnf/package.py000064400000025704147511334650012021 0ustar00# package.py
# Module defining the dnf.Package class.
#
# Copyright (C) 2012-2016 Red Hat, Inc.
#
# This copyrighted material is made available to anyone wishing to use,
# modify, copy, or redistribute it subject to the terms and conditions of
# the GNU General Public License v.2, or (at your option) any later version.
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY expressed or implied, including the implied warranties of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General
# Public License for more details.  You should have received a copy of the
# GNU General Public License along with this program; if not, write to the
# Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
# 02110-1301, USA.  Any Red Hat trademarks that are incorporated in the
# source code or documentation are not subject to the GNU General Public
# License and may only be used or replicated with the express permission of
# Red Hat, Inc.
#

""" Contains the dnf.Package class. """

from __future__ import absolute_import
from __future__ import unicode_literals

from dnf.i18n import _

import binascii
import dnf.exceptions
import dnf.rpm
import dnf.yum.misc
import hawkey
import libdnf.error
import libdnf.utils
import logging
import os
import rpm

logger = logging.getLogger("dnf")


class Package(hawkey.Package):
    """ Represents a package. #:api """

    DEBUGINFO_SUFFIX = "-debuginfo"  # :api
    DEBUGSOURCE_SUFFIX = "-debugsource"  # :api

    def __init__(self, initobject, base):
        super(Package, self).__init__(initobject)
        self.base = base
        self._priv_chksum = None
        self._repo = None
        self._priv_size = None

    @property
    def _chksum(self):
        if self._priv_chksum:
            return self._priv_chksum
        if self._from_cmdline:
            chksum_type = dnf.yum.misc.get_default_chksum_type()
            try:
                chksum_val = libdnf.utils.checksum_value(chksum_type, self.location)
            except libdnf.error.Error as e:
                raise dnf.exceptions.MiscError(str(e))
            return (hawkey.chksum_type(chksum_type),
                    binascii.unhexlify(chksum_val))
        return super(Package, self).chksum

    @_chksum.setter
    def _chksum(self, val):
        self._priv_chksum = val

    @property
    def _from_cmdline(self):
        return self.reponame == hawkey.CMDLINE_REPO_NAME

    @property
    def _from_system(self):
        return self.reponame == hawkey.SYSTEM_REPO_NAME

    @property
    def _from_repo(self):
        """
        For installed packages returns the id of the repository from which the package was
        installed, prefixed with '@' (if such information is available in the history database).
        Otherwise returns the id of the repository the package belongs to (@System for installed
        packages of unknown origin).
        """
        pkgrepo = None
        if self._from_system:
            pkgrepo = self.base.history.repo(self)
        if pkgrepo:
            return '@' + pkgrepo
        return self.reponame

    @property
    def from_repo(self):
        # :api
        if self._from_system:
            return self.base.history.repo(self)
        return ""

    @property
    def _header(self):
        """
        Returns the header of a locally present rpm package file. As opposed to
        self.get_header(), which retrieves the header of an installed package
        from rpmdb.
        """
        return dnf.rpm._header(self.localPkg())

    @property
    def _size(self):
        if self._priv_size:
            return self._priv_size
        return super(Package, self).size

    @_size.setter
    def _size(self, val):
        self._priv_size = val

    @property
    def _pkgid(self):
        if self.hdr_chksum is None:
            return None
        (_, chksum) = self.hdr_chksum
        return binascii.hexlify(chksum)

    @property
    def source_name(self):
        # :api
        """
        Returns the name of the source package,
        e.g. krb5-libs -> krb5
        """
        if self.sourcerpm is not None:
            # trim suffix first
            srcname = dnf.util.rtrim(self.sourcerpm, ".src.rpm")
            # sourcerpm should be in form of name-version-release now, so we
            # will strip the two rightmost parts separated by dash.
            # Using rtrim with version and release of self is not sufficient
            # because the package can have different version to the source
            # package.
            srcname = srcname.rsplit('-', 2)[0]
        else:
            srcname = None
        return srcname
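
    # Illustrative example (sketch, not part of the original module): for a
    # package whose sourcerpm is "krb5-1.18.2-8.el8.src.rpm" this property
    # yields "krb5": the ".src.rpm" suffix is trimmed first, then the two
    # rightmost dash-separated fields (version and release) are stripped.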

    @property
    def debug_name(self):
        # :api
        """
        Returns name of the debuginfo package for this package.
        If this package is a debuginfo package, returns its name.
        If this package is a debugsource package, returns the debuginfo package
        for the base package.
        e.g. kernel-PAE -> kernel-PAE-debuginfo
        """
        if self.name.endswith(self.DEBUGINFO_SUFFIX):
            return self.name

        name = self.name
        if self.name.endswith(self.DEBUGSOURCE_SUFFIX):
            name = name[:-len(self.DEBUGSOURCE_SUFFIX)]

        return name + self.DEBUGINFO_SUFFIX

    @property
    def debugsource_name(self):
        # :api
        """
        Returns name of the debugsource package for this package.
        e.g. krb5-libs -> krb5-debugsource
        """
        # assuming self.source_name is None only for a source package
        src_name = self.source_name if self.source_name is not None else self.name
        return src_name + self.DEBUGSOURCE_SUFFIX

    def get_header(self):
        """
        Returns the rpm header of the package if it is installed. If not
        installed, returns None. The header is not cached, it is retrieved from
        rpmdb on every call. In case of a failure (e.g. when the rpmdb changes
        between loading the data and calling this method), raises an instance
        of PackageNotFoundError.
        """
        if not self._from_system:
            return None

        try:
            # RPMDBI_PACKAGES stands for the header of the package
            return next(self.base._ts.dbMatch(rpm.RPMDBI_PACKAGES, self.rpmdbid))
        except StopIteration:
            raise dnf.exceptions.PackageNotFoundError("Package not found when attempting to retrieve header", str(self))

    @property
    def source_debug_name(self):
        # :api
        """
        Returns the name of the debuginfo package for the source package of the given package,
        e.g. krb5-libs -> krb5-debuginfo
        """
        # assuming self.source_name is None only for a source package
        src_name = self.source_name if self.source_name is not None else self.name
        return src_name + self.DEBUGINFO_SUFFIX

    @property # yum compatibility attribute
    def idx(self):
        """ Always type it to int, rpm bindings expect it like that. """
        return int(self.rpmdbid)

    @property # yum compatibility attribute
    def repoid(self):
        return self.reponame

    @property # yum compatibility attribute
    def pkgtup(self):
        return (self.name, self.arch, str(self.e), self.v, self.r)

    @property # yum compatibility attribute
    def repo(self):
        if self._repo:
            return self._repo
        return self.base.repos[self.reponame]

    @repo.setter
    def repo(self, val):
        self._repo = val

    @property
    def reason(self):
        if self.repoid != hawkey.SYSTEM_REPO_NAME:
            return None
        return self.base.history.rpm.get_reason_name(self)

    @property # yum compatibility attribute
    def relativepath(self):
        return self.location

    @property # yum compatibility attribute
    def a(self):
        return self.arch

    @property # yum compatibility attribute
    def e(self):
        return self.epoch

    @property # yum compatibility attribute
    def v(self):
        return self.version

    @property # yum compatibility attribute
    def r(self):
        return self.release

    @property # yum compatibility attribute
    def ui_from_repo(self):
        return self.reponame

    # yum compatibility method
    def evr_eq(self, pkg):
        return self.evr_cmp(pkg) == 0

    # yum compatibility method
    def evr_gt(self, pkg):
        return self.evr_cmp(pkg) > 0

    # yum compatibility method
    def evr_lt(self, pkg):
        return self.evr_cmp(pkg) < 0

    # yum compatibility method
    def getDiscNum(self):
        return self.medianr

    # yum compatibility method
    def localPkg(self):
        """ Package's location in the filesystem.

            For packages in remote repo returns where the package will be/has
            been downloaded.
        """
        if self._from_cmdline:
            return self.location
        loc = self.location
        if self.repo._repo.isLocal() and self.baseurl and self.baseurl.startswith('file://'):
            return os.path.join(self.get_local_baseurl(), loc.lstrip("/"))
        if not self._is_local_pkg():
            loc = os.path.basename(loc)
        return os.path.join(self.pkgdir, loc.lstrip("/"))

    def remote_location(self, schemes=('http', 'ftp', 'file', 'https')):
        # :api
        """
        The location from which the package can be downloaded. Returns None for installed and
        commandline packages.

        :param schemes: list of allowed protocols. Default is ('http', 'ftp', 'file', 'https')
        :return: location (string) or None
        """
        if self._from_system or self._from_cmdline:
            return None
        return self.repo.remote_location(self.location, schemes)

    def _is_local_pkg(self):
        if self._from_system:
            return True
        if '://' in self.location and not self.location.startswith('file://'):
            # the package has a remote URL as its location
            return False
        return self._from_cmdline or \
            (self.repo._repo.isLocal() and (not self.baseurl or self.baseurl.startswith('file://')))

    @property
    def pkgdir(self):
        if (self.repo._repo.isLocal() and not self._is_local_pkg()):
            return self.repo.cache_pkgdir()
        else:
            return self.repo.pkgdir

    # yum compatibility method
    def returnIdSum(self):
        """ Return the chksum type and chksum string how the legacy yum expects
            it.
        """
        if self._chksum is None:
            return (None, None)
        (chksum_type, chksum) = self._chksum
        return (hawkey.chksum_name(chksum_type), binascii.hexlify(chksum).decode())

    # yum compatibility method
    def verifyLocalPkg(self):
        if self._from_system:
            raise ValueError("Can not verify an installed package.")
        if self._from_cmdline:
            return True # local package always verifies against itself
        (chksum_type, chksum) = self.returnIdSum()
        try:
            return libdnf.utils.checksum_check(chksum_type, self.localPkg(), chksum)
        except libdnf.error.Error as e:
            raise dnf.exceptions.MiscError(str(e))
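
# Illustrative usage sketch (not part of the original module): how the yum
# compatibility helpers defined above might be exercised from a dnf.Base
# session. The package name 'bash' is only an example.
#
#     import dnf
#
#     with dnf.Base() as base:
#         base.read_all_repos()
#         base.fill_sack()
#         pkg = base.sack.query().available().filter(name='bash').run()[0]
#         print(pkg.remote_location())   # None for installed/commandline packages
#         print(pkg.returnIdSum())       # e.g. ('sha256', '<hex digest>')
#         print(pkg.localPkg())          # path the rpm will be downloaded to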
site-packages/dnf/plugin.py000064400000022526147511334650011723 0ustar00
# plugin.py
# The interface for building DNF plugins.
#
# Copyright (C) 2012-2016 Red Hat, Inc.
#
# This copyrighted material is made available to anyone wishing to use,
# modify, copy, or redistribute it subject to the terms and conditions of
# the GNU General Public License v.2, or (at your option) any later version.
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY expressed or implied, including the implied warranties of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General
# Public License for more details.  You should have received a copy of the
# GNU General Public License along with this program; if not, write to the
# Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
# 02110-1301, USA.  Any Red Hat trademarks that are incorporated in the
# source code or documentation are not subject to the GNU General Public
# License and may only be used or replicated with the express permission of
# Red Hat, Inc.
#

from __future__ import absolute_import
from __future__ import print_function
from __future__ import unicode_literals

import fnmatch
import glob
import importlib
import inspect
import logging
import operator
import os
import sys
import traceback

import libdnf
import dnf.logging
import dnf.pycomp
import dnf.util
from dnf.i18n import _

logger = logging.getLogger('dnf')

DYNAMIC_PACKAGE = 'dnf.plugin.dynamic'


class Plugin(object):
    """The base class custom plugins must derive from. #:api"""

    name = '<invalid>'
    config_name = None

    @classmethod
    def read_config(cls, conf):
        # :api
        parser = libdnf.conf.ConfigParser()
        name = cls.config_name if cls.config_name else cls.name
        files = ['%s/%s.conf' % (path, name) for path in conf.pluginconfpath]
        for file in files:
            if os.path.isfile(file):
                try:
                    parser.read(file)
                except Exception as e:
                    raise dnf.exceptions.ConfigError(_("Parsing file failed: %s") % str(e))
        return parser

    def __init__(self, base, cli):
        # :api
        self.base = base
        self.cli = cli

    def pre_config(self):
        # :api
        pass

    def config(self):
        # :api
        pass

    def resolved(self):
        # :api
        pass

    def sack(self):
        # :api
        pass

    def pre_transaction(self):
        # :api
        pass

    def transaction(self):
        # :api
        pass
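
    # Illustrative sketch (not part of the original module): a minimal plugin
    # built on the base class above. The plugin name 'my_plugin' and the log
    # messages are hypothetical.
    #
    #     import logging
    #     import dnf
    #
    #     logger = logging.getLogger('dnf.plugin')
    #
    #     class MyPlugin(dnf.Plugin):
    #         name = 'my_plugin'
    #
    #         def config(self):
    #             # reads <pluginconfpath>/my_plugin.conf, if present
    #             parser = self.read_config(self.base.conf)
    #             if parser.has_section('main'):
    #                 logger.debug('my_plugin: configuration loaded')
    #
    #         def transaction(self):
    #             logger.debug('my_plugin: transaction hook called')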


class Plugins(object):
    def __init__(self):
        self.plugin_cls = []
        self.plugins = []

    def __del__(self):
        self._unload()

    def _caller(self, method):
        for plugin in self.plugins:
            try:
                getattr(plugin, method)()
            except dnf.exceptions.Error:
                raise
            except Exception:
                exc_type, exc_value, exc_traceback = sys.exc_info()
                except_list = traceback.format_exception(exc_type, exc_value, exc_traceback)
                logger.critical(''.join(except_list))

    def _check_enabled(self, conf, enable_plugins):
        """Checks whether plugins are enabled or disabled in configuration files
           and removes disabled plugins from list"""
        for plug_cls in self.plugin_cls[:]:
            name = plug_cls.name
            if any(fnmatch.fnmatch(name, pattern) for pattern in enable_plugins):
                continue
            parser = plug_cls.read_config(conf)
            # is the plugin explicitly disabled (enabled = False) in its config?
            disabled = (parser.has_section('main')
                        and parser.has_option('main', 'enabled')
                        and not parser.getboolean('main', 'enabled'))
            if disabled:
                self.plugin_cls.remove(plug_cls)
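
    # Illustrative plugin configuration recognized by the check above -- a
    # hypothetical <pluginconfpath>/<plugin_name>.conf:
    #
    #     [main]
    #     enabled = False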

    def _load(self, conf, skips, enable_plugins):
        """Dynamically load relevant plugin modules."""

        if DYNAMIC_PACKAGE in sys.modules:
            raise RuntimeError("load_plugins() called twice")
        sys.modules[DYNAMIC_PACKAGE] = package = dnf.pycomp.ModuleType(DYNAMIC_PACKAGE)
        package.__path__ = []

        files = _get_plugins_files(conf.pluginpath, skips, enable_plugins)
        _import_modules(package, files)
        self.plugin_cls = _plugin_classes()[:]
        self._check_enabled(conf, enable_plugins)
        if len(self.plugin_cls) > 0:
            names = sorted(plugin.name for plugin in self.plugin_cls)
            logger.debug(_('Loaded plugins: %s'), ', '.join(names))

    def _run_pre_config(self):
        self._caller('pre_config')

    def _run_config(self):
        self._caller('config')

    def _run_init(self, base, cli=None):
        for p_cls in self.plugin_cls:
            plugin = p_cls(base, cli)
            self.plugins.append(plugin)

    def run_sack(self):
        self._caller('sack')

    def run_resolved(self):
        self._caller('resolved')

    def run_pre_transaction(self):
        self._caller('pre_transaction')

    def run_transaction(self):
        self._caller('transaction')

    def _unload(self):
        if DYNAMIC_PACKAGE in sys.modules:
            logger.log(dnf.logging.DDEBUG, 'Plugins were unloaded.')
            del sys.modules[DYNAMIC_PACKAGE]

    def unload_removed_plugins(self, transaction):
        """
        Unload plugins that were removed in the `transaction`.
        """
        if not transaction.remove_set:
            return

        # gather all installed plugins and their files
        plugins = dict()
        for plugin in self.plugins:
            plugins[inspect.getfile(plugin.__class__)] = plugin

        # gather all removed files that are plugin files
        plugin_files = set(plugins.keys())
        erased_plugin_files = set()
        for pkg in transaction.remove_set:
            erased_plugin_files.update(plugin_files.intersection(pkg.files))
        if not erased_plugin_files:
            return

        # check whether a removed plugin file is also installed in the same transaction (plugin upgrade)
        for pkg in transaction.install_set:
            erased_plugin_files.difference_update(pkg.files)

        # unload plugins that were removed in transaction
        for plugin_file in erased_plugin_files:
            self.plugins.remove(plugins[plugin_file])


def _plugin_classes():
    return Plugin.__subclasses__()


def _import_modules(package, py_files):
    for fn in py_files:
        path, module = os.path.split(fn)
        package.__path__.append(path)
        (module, ext) = os.path.splitext(module)
        name = '%s.%s' % (package.__name__, module)
        try:
            module = importlib.import_module(name)
        except Exception as e:
            logger.error(_('Failed loading plugin "%s": %s'), module, e)
            logger.log(dnf.logging.SUBDEBUG, '', exc_info=True)


def _get_plugins_files(paths, disable_plugins, enable_plugins):
    plugins = []
    disable_plugins = set(disable_plugins)
    enable_plugins = set(enable_plugins)
    pattern_enable_found = set()
    pattern_disable_found = set()
    for p in paths:
        for fn in glob.glob('%s/*.py' % p):
            (plugin_name, dummy) = os.path.splitext(os.path.basename(fn))
            matched = True
            enable_pattern_tested = False
            for pattern_skip in disable_plugins:
                if _plugin_name_matches_pattern(plugin_name, pattern_skip):
                    pattern_disable_found.add(pattern_skip)
                    matched = False
                    for pattern_enable in enable_plugins:
                        if _plugin_name_matches_pattern(plugin_name, pattern_enable):
                            matched = True
                            pattern_enable_found.add(pattern_enable)
                    enable_pattern_tested = True
            if not enable_pattern_tested:
                for pattern_enable in enable_plugins:
                    if _plugin_name_matches_pattern(plugin_name, pattern_enable):
                        pattern_enable_found.add(pattern_enable)
            if matched:
                plugins.append(fn)
    enable_not_found = enable_plugins.difference(pattern_enable_found)
    if enable_not_found:
        logger.warning(_("No matches found for the following enable plugin patterns: {}").format(
            ", ".join(sorted(enable_not_found))))
    disable_not_found = disable_plugins.difference(pattern_disable_found)
    if disable_not_found:
        logger.warning(_("No matches found for the following disable plugin patterns: {}").format(
            ", ".join(sorted(disable_not_found))))
    return plugins


def _plugin_name_matches_pattern(plugin_name, pattern):
    """
    Checks whether the plugin name matches the pattern.

    If the original name does not match, an alternative name with dashes
    instead of underscores is tried as well

    (see https://bugzilla.redhat.com/show_bug.cgi?id=1980712)
    """

    try_names = set((plugin_name, plugin_name.replace('_', '-')))
    return any(fnmatch.fnmatch(name, pattern) for name in try_names)
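
# Illustrative behaviour of the helper above (the plugin name is hypothetical):
#
#     _plugin_name_matches_pattern('my_plugin', 'my-plugin')   # True (dash form)
#     _plugin_name_matches_pattern('my_plugin', 'my_*')        # True
#     _plugin_name_matches_pattern('my_plugin', 'other*')      # False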


def register_command(command_class):
    # :api
    """A class decorator for automatic command registration."""
    def __init__(self, base, cli):
        if cli:
            cli.register_command(command_class)
    plugin_class = type(str(command_class.__name__ + 'Plugin'),
                        (dnf.Plugin,),
                        {"__init__": __init__,
                         "name": command_class.aliases[0]})
    command_class._plugin = plugin_class
    return command_class
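
# Illustrative sketch (not part of the original module): registering a custom
# command through the decorator above. The command name 'hello' is hypothetical.
#
#     import dnf.cli
#
#     @register_command
#     class HelloCommand(dnf.cli.Command):
#         aliases = ('hello',)
#         summary = 'print a greeting from a plugin-provided command'
#
#         def run(self):
#             print('hello from a DNF plugin command')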
site-packages/dnf/drpm.py000064400000014320147511334650011360 0ustar00
# drpm.py
# Delta RPM support
#
# Copyright (C) 2012-2016 Red Hat, Inc.
#
# This copyrighted material is made available to anyone wishing to use,
# modify, copy, or redistribute it subject to the terms and conditions of
# the GNU General Public License v.2, or (at your option) any later version.
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY expressed or implied, including the implied warranties of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General
# Public License for more details.  You should have received a copy of the
# GNU General Public License along with this program; if not, write to the
# Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
# 02110-1301, USA.  Any Red Hat trademarks that are incorporated in the
# source code or documentation are not subject to the GNU General Public
# License and may only be used or replicated with the express permission of
# Red Hat, Inc.
#

from __future__ import absolute_import
from __future__ import unicode_literals
from binascii import hexlify
from dnf.yum.misc import unlink_f
from dnf.i18n import _

import dnf.callback
import dnf.logging
import dnf.repo
import hawkey
import logging
import libdnf.repo
import os

APPLYDELTA = '/usr/bin/applydeltarpm'

logger = logging.getLogger("dnf")


class DeltaPayload(dnf.repo.PackagePayload):
    def __init__(self, delta_info, delta, pkg, progress):
        super(DeltaPayload, self).__init__(pkg, progress)
        self.delta_info = delta_info
        self.delta = delta

    def __str__(self):
        return os.path.basename(self.delta.location)

    def _end_cb(self, cbdata, lr_status, msg):
        super(DeltaPayload, self)._end_cb(cbdata, lr_status, msg)
        if lr_status != libdnf.repo.PackageTargetCB.TransferStatus_ERROR:
            self.delta_info.enqueue(self)

    def _target_params(self):
        delta = self.delta
        ctype, csum = delta.chksum
        ctype = hawkey.chksum_name(ctype)
        chksum = hexlify(csum).decode()

        ctype_code = libdnf.repo.PackageTarget.checksumType(ctype)
        if ctype_code == libdnf.repo.PackageTarget.ChecksumType_UNKNOWN:
            logger.warning(_("unsupported checksum type: %s"), ctype)

        return {
            'relative_url' : delta.location,
            'checksum_type' : ctype_code,
            'checksum' : chksum,
            'expectedsize' : delta.downloadsize,
            'base_url' : delta.baseurl,
        }

    @property
    def download_size(self):
        return self.delta.downloadsize

    @property
    def _full_size(self):
        return self.pkg.downloadsize

    def localPkg(self):
        location = self.delta.location
        return os.path.join(self.pkg.repo.pkgdir, os.path.basename(location))


class DeltaInfo(object):
    def __init__(self, query, progress, deltarpm_percentage=None):
        '''A delta lookup and rebuild context.

           query -- installed packages to use when looking up deltas
           progress -- progress object used to display finished delta rebuilds
           deltarpm_percentage -- delta size limit as a percentage of the full
                                  package size (default: dnf.conf.Conf value)
        '''
        self.deltarpm_installed = False
        if os.access(APPLYDELTA, os.X_OK):
            self.deltarpm_installed = True
        try:
            self.deltarpm_jobs = os.sysconf('SC_NPROCESSORS_ONLN')
        except (TypeError, ValueError):
            self.deltarpm_jobs = 4
        if deltarpm_percentage is None:
            self.deltarpm_percentage = dnf.conf.Conf().deltarpm_percentage
        else:
            self.deltarpm_percentage = deltarpm_percentage
        self.query = query
        self.progress = progress

        self.queue = []
        self.jobs = {}
        self.err = {}

    def delta_factory(self, po, progress):
        '''Turn a package object into a delta RPM payload, if possible.'''
        if not self.deltarpm_installed:
            # deltarpm is not installed
            return None
        if not po.repo.deltarpm or not self.deltarpm_percentage:
            # drpm disabled
            return None
        if po._is_local_pkg():
            # drpm disabled for local
            return None
        if os.path.exists(po.localPkg()):
            # already there
            return None

        best = po._size * self.deltarpm_percentage / 100
        best_delta = None
        for ipo in self.query.filter(name=po.name, arch=po.arch):
            delta = po.get_delta_from_evr(ipo.evr)
            if delta and delta.downloadsize < best:
                best = delta.downloadsize
                best_delta = delta
        if best_delta:
            return DeltaPayload(self, best_delta, po, progress)
        return None
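
    # Illustrative arithmetic for the size limit above (the numbers are
    # hypothetical): with deltarpm_percentage = 75 and a 40 MB full package,
    # only deltas smaller than 40 MB * 75 / 100 = 30 MB are considered, and
    # the smallest candidate delta wins.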

    def job_done(self, pid, code):
        # handle a finished delta rebuild
        logger.log(dnf.logging.SUBDEBUG, 'drpm: %d: return code: %d, %d', pid,
                   code >> 8, code & 0xff)

        pload = self.jobs.pop(pid)
        pkg = pload.pkg
        if code != 0:
            unlink_f(pload.pkg.localPkg())
            self.err[pkg] = [_('Delta RPM rebuild failed')]
        elif not pload.pkg.verifyLocalPkg():
            self.err[pkg] = [_('Checksum of the delta-rebuilt RPM failed')]
        else:
            os.unlink(pload.localPkg())
            self.progress.end(pload, dnf.callback.STATUS_DRPM, _('done'))

    def start_job(self, pload):
        # spawn a delta rebuild job
        spawn_args = [APPLYDELTA, APPLYDELTA,
                      '-a', pload.pkg.arch,
                      pload.localPkg(), pload.pkg.localPkg()]
        pid = os.spawnl(os.P_NOWAIT, *spawn_args)
        logger.log(dnf.logging.SUBDEBUG, 'drpm: spawned %d: %s', pid,
                   ' '.join(spawn_args[1:]))
        self.jobs[pid] = pload

    def enqueue(self, pload):
        # process finished jobs, start new ones
        while self.jobs:
            pid, code = os.waitpid(-1, os.WNOHANG)
            if not pid:
                break
            self.job_done(pid, code)
        self.queue.append(pload)
        while len(self.jobs) < self.deltarpm_jobs:
            self.start_job(self.queue.pop(0))
            if not self.queue:
                break

    def wait(self):
        '''Wait until all jobs have finished'''
        while self.jobs:
            pid, code = os.wait()
            self.job_done(pid, code)
            if self.queue:
                self.start_job(self.queue.pop(0))
site-packages/dnf/repodict.py000064400000012046147511334650012232 0ustar00
# repodict.py
# Managing repo configuration in DNF.
#
# Copyright (C) 2013-2016 Red Hat, Inc.
#
# This copyrighted material is made available to anyone wishing to use,
# modify, copy, or redistribute it subject to the terms and conditions of
# the GNU General Public License v.2, or (at your option) any later version.
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY expressed or implied, including the implied warranties of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General
# Public License for more details.  You should have received a copy of the
# GNU General Public License along with this program; if not, write to the
# Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
# 02110-1301, USA.  Any Red Hat trademarks that are incorporated in the
# source code or documentation are not subject to the GNU General Public
# License and may only be used or replicated with the express permission of
# Red Hat, Inc.
#

from __future__ import unicode_literals
from dnf.exceptions import ConfigError
from dnf.i18n import _

import dnf.util
import libdnf.conf
import fnmatch
import os

logger = dnf.util.logger


class RepoDict(dict):
    # :api
    def add(self, repo):
        # :api
        id_ = repo.id
        if id_ in self:
            msg = 'Repository %s is listed more than once in the configuration'
            raise ConfigError(msg % id_)
        try:
            repo._repo.verify()
        except RuntimeError as e:
            raise ConfigError("{0}".format(e))
        self[id_] = repo

    def all(self):
        # :api
        return dnf.util.MultiCallList(self.values())

    def _any_enabled(self):
        return not dnf.util.empty(self.iter_enabled())

    def _enable_sub_repos(self, sub_name_fn):
        for repo in self.iter_enabled():
            for found in self.get_matching(sub_name_fn(repo.id)):
                if not found.enabled:
                    logger.info(_('enabling %s repository'), found.id)
                    found.enable()

    def add_new_repo(self, repoid, conf, baseurl=(), **kwargs):
        # :api
        """
        Creates a new repo object and adds it to the RepoDict. Variables in the provided values
        are automatically substituted using conf.substitutions (e.g. $releasever, ...).

        @param repoid: Repo ID - string
        @param conf: dnf Base().conf object
        @param baseurl: List of strings
        @param kwargs: keys and values that will be used to setattr on dnf.repo.Repo() object
        @return: dnf.repo.Repo() object
        """
        def substitute(values):
            if isinstance(values, str):
                return libdnf.conf.ConfigParser.substitute(values, conf.substitutions)
            elif isinstance(values, list) or isinstance(values, tuple):
                substituted = []
                for value in values:
                    if isinstance(value, str):
                        substituted.append(
                            libdnf.conf.ConfigParser.substitute(value, conf.substitutions))
                if substituted:
                    return substituted
            return values

        repo = dnf.repo.Repo(repoid, conf)
        for path in baseurl:
            if '://' not in path:
                path = 'file://{}'.format(os.path.abspath(path))
            repo.baseurl += [substitute(path)]
        for (key, value) in kwargs.items():
            setattr(repo, key, substitute(value))
        self.add(repo)
        logger.info(_("Added %s repo from %s"), repoid, ', '.join(baseurl))
        return repo
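
    # Illustrative usage sketch (not part of the original module); the repo id
    # and URL are hypothetical:
    #
    #     base.repos.add_new_repo('example-repo', base.conf,
    #                             baseurl=['https://example.com/repo/$releasever/'],
    #                             gpgcheck=False)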

    def enable_debug_repos(self):
        # :api
        """enable debug repos corresponding to already enabled binary repos"""

        def debug_name(name):
            return ("{}-debug-rpms".format(name[:-5]) if name.endswith("-rpms")
                    else "{}-debuginfo".format(name))

        self._enable_sub_repos(debug_name)

    def enable_source_repos(self):
        # :api
        """enable source repos corresponding to already enabled binary repos"""

        def source_name(name):
            return ("{}-source-rpms".format(name[:-5]) if name.endswith("-rpms")
                    else "{}-source".format(name))

        self._enable_sub_repos(source_name)
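
    # Illustrative name mapping used by the two helpers above (the repo ids
    # are hypothetical):
    #
    #     'fedora'              -> 'fedora-debuginfo'         / 'fedora-source'
    #     'rhel-8-baseos-rpms'  -> 'rhel-8-baseos-debug-rpms' / 'rhel-8-baseos-source-rpms'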

    def get_matching(self, key):
        # :api
        if dnf.util.is_glob_pattern(key):
            l = [self[k] for k in self if fnmatch.fnmatch(k, key)]
            return dnf.util.MultiCallList(l)
        repo = self.get(key, None)
        if repo is None:
            return dnf.util.MultiCallList([])
        return dnf.util.MultiCallList([repo])

    def iter_enabled(self):
        # :api
        return (r for r in self.values() if r.enabled)

    def items(self):
        """return repos sorted by priority"""
        return (item for item in sorted(super(RepoDict, self).items(),
                                        key=lambda x: (x[1].priority, x[1].cost)))

    def __iter__(self):
        return self.keys()

    def keys(self):
        return (k for k, v in self.items())

    def values(self):
        return (v for k, v in self.items())
site-packages/dnf/__pycache__/repodict.cpython-36.opt-1.pyc000064400000012704147511334650017456 0ustar003

i�-e&�@s`ddlmZddlmZddlmZddlZddlZ	ddl
Z
ddlZejj
Z
Gdd�de�ZdS)�)�unicode_literals)�ConfigError)�_Ncs�eZdZdd�Zdd�Zdd�Zdd�Zffd	d
�Zdd�Zd
d�Z	dd�Z
dd�Z�fdd�Zdd�Z
dd�Zdd�Z�ZS)�RepoDictcCsj|j}||krd}t||��y|jj�Wn0tk
r\}ztdj|���WYdd}~XnX|||<dS)Nz;Repository %s is listed more than once in the configurationz{0})�idrZ_repoZverify�RuntimeError�format)�self�repoZid_�msg�e�r
�/usr/lib/python3.6/repodict.py�add#s zRepoDict.addcCstjj|j��S)N)�dnf�util�
MultiCallList�values)r	r
r
r�all/szRepoDict.allcCstjj|j��S)N)rr�empty�iter_enabled)r	r
r
r�_any_enabled3szRepoDict._any_enabledcCsPxJ|j�D]>}x8|j||j��D]$}|js tjtd�|j�|j�q Wq
WdS)Nzenabling %s repository)r�get_matchingr�enabled�logger�infor�enable)r	Zsub_name_fnr
�foundr
r
r�_enable_sub_repos6s
zRepoDict._enable_sub_reposc
s��fdd�}tjj|��}x:|D]2}d|kr>djtjj|��}|j||�g7_q Wx$|j�D]\}}	t	||||	��q`W|j
|�tjt
d�|dj|��|S)a�
        Creates new repo object and add it into RepoDict. Variables in provided values will be
        automatically substituted using conf.substitutions (like $releasever, ...)

        @param repoid: Repo ID - string
        @param conf: dnf Base().conf object
        @param baseurl: List of strings
        @param kwargs: keys and values that will be used to setattr on dnf.repo.Repo() object
        @return: dnf.repo.Repo() object
        cspt|t�rtjjj|�j�St|t�s0t|t�rlg}x.|D]&}t|t�r:|j	tjjj|�j��q:W|rl|S|S)N)
�
isinstance�str�libdnf�confZConfigParser�
substituteZ
substitutions�list�tuple�append)rZsubstituted�value)r"r
rr#Is


z)RepoDict.add_new_repo.<locals>.substitutez://z	file://{}zAdded %s repo from %sz, )rr
ZRepor�os�path�abspath�baseurl�items�setattrrrrr�join)
r	Zrepoidr"r+�kwargsr#r
r)�keyr'r
)r"r�add_new_repo=s


zRepoDict.add_new_repocCsdd�}|j|�dS)z@enable debug repos corresponding to already enabled binary reposcSs&|jd�rdj|dd��Sdj|�S)Nz-rpmsz
{}-debug-rpms�z{}-debuginfo���)�endswithr)�namer
r
r�
debug_nameesz/RepoDict.enable_debug_repos.<locals>.debug_nameN)r)r	r6r
r
r�enable_debug_reposaszRepoDict.enable_debug_reposcCsdd�}|j|�dS)zAenable source repos corresponding to already enabled binary reposcSs&|jd�rdj|dd��Sdj|�S)Nz-rpmsz{}-source-rpmsr2z	{}-sourcer3)r4r)r5r
r
r�source_nameosz1RepoDict.enable_source_repos.<locals>.source_nameN)r)r	r8r
r
r�enable_source_reposkszRepoDict.enable_source_reposcsZtjj��r,��fdd��D�}tjj|�S�j�d�}|dkrLtjjg�Stjj|g�S)Ncs g|]}tj|��r�|�qSr
)�fnmatch)�.0�k)r0r	r
r�
<listcomp>xsz)RepoDict.get_matching.<locals>.<listcomp>)rrZis_glob_patternr�get)r	r0�lr
r
)r0r	rruszRepoDict.get_matchingcCsdd�|j�D�S)Ncss|]}|jr|VqdS)N)r)r;�rr
r
r�	<genexpr>�sz(RepoDict.iter_enabled.<locals>.<genexpr>)r)r	r
r
rrszRepoDict.iter_enabledcs$dd�ttt|�j�dd�d�D�S)zreturn repos sorted by prioritycss|]
}|VqdS)Nr
)r;�itemr
r
rrA�sz!RepoDict.items.<locals>.<genexpr>cSs|dj|djfS)N�)ZpriorityZcost)�xr
r
r�<lambda>�sz RepoDict.items.<locals>.<lambda>)r0)�sorted�superrr,)r	)�	__class__r
rr,�szRepoDict.itemscCs|j�S)N)�keys)r	r
r
r�__iter__�szRepoDict.__iter__cCsdd�|j�D�S)Ncss|]\}}|VqdS)Nr
)r;r<�vr
r
rrA�sz RepoDict.keys.<locals>.<genexpr>)r,)r	r
r
rrI�sz
RepoDict.keyscCsdd�|j�D�S)Ncss|]\}}|VqdS)Nr
)r;r<rKr
r
rrA�sz"RepoDict.values.<locals>.<genexpr>)r,)r	r
r
rr�szRepoDict.values)�__name__�
__module__�__qualname__rrrrr1r7r9rrr,rJrIr�
__classcell__r
r
)rHrr!s$


r)Z
__future__rZdnf.exceptionsrZdnf.i18nrZdnf.utilrZlibdnf.confr!r:r(rr�dictrr
r
r
r�<module>ssite-packages/dnf/__pycache__/history.cpython-36.pyc000064400000000404147511334650016401 0ustar003

�ft`~�@s dZddlmZddlmZdS)z*Interfaces to the history of transactions.�)�absolute_import)�unicode_literalsN)�__doc__Z
__future__rr�rr�/usr/lib/python3.6/history.py�<module>ssite-packages/dnf/__pycache__/transaction.cpython-36.pyc000064400000003147147511334650017234 0ustar003

�ft`-�@s�ddlmZddlmZddlZddlmZmZejj	Z
ejjZejj
ZejjZejjZejjZejjZejjZejjZejjZeZdZdZdZ dZ!d	Z"ejj
ejj	ejjejjejjgZ#ejjejjejjejjgZ$e
ed
d�eed�eed
d
�eed�eed�eed
d�eed�eed�eed
d�eed�eed�eed�e ed�e!ed�iZ%e
dededededededededededede de!diZ&dS) �)�absolute_import)�unicode_literalsN)�_�C_�e�f�g����Z	currentlyZDowngradingZCleanupZ
InstallingZ
ObsoletingZReinstallingZErasingZ	UpgradingZ	VerifyingzRunning scriptletZ	PreparingZ	DowngradeZ
DowngradedZ	InstalledZObsoleteZ	ObsoletedZ	ReinstallZReinstalledZEraseZUpgradeZUpgradedZVerified)'Z
__future__rrZlibdnf.transactionZlibdnfZdnf.i18nrrZtransactionZTransactionItemAction_DOWNGRADEZ
PKG_DOWNGRADEZ TransactionItemAction_DOWNGRADEDZPKG_DOWNGRADEDZTransactionItemAction_INSTALLZPKG_INSTALLZTransactionItemAction_OBSOLETEZPKG_OBSOLETEZTransactionItemAction_OBSOLETEDZ
PKG_OBSOLETEDZTransactionItemAction_REINSTALLZ
PKG_REINSTALLZ!TransactionItemAction_REINSTALLEDZPKG_REINSTALLEDZTransactionItemAction_REMOVEZ
PKG_REMOVEZTransactionItemAction_UPGRADEZPKG_UPGRADEZTransactionItemAction_UPGRADEDZPKG_UPGRADEDZ	PKG_ERASEZPKG_CLEANUPZ
PKG_VERIFYZ
PKG_SCRIPTLETZTRANS_PREPARATIONZ
TRANS_POSTZFORWARD_ACTIONSZBACKWARD_ACTIONSZACTIONSZFILE_ACTIONS�rr�!/usr/lib/python3.6/transaction.py�<module>sp





site-packages/dnf/__pycache__/package.cpython-36.pyc000064400000023452147511334650016303 0ustar003

�ft`�+�@s�dZddlmZddlmZddlmZddlZddlZddl	Zddl
ZddlZddlZ
ddlZ
ddlZddlZddlZejd�ZGdd�dej�ZdS)	z! Contains the dnf.Package class. �)�absolute_import)�unicode_literals)�_N�dnfcs�eZdZdZdZdZ�fdd�Ze�fdd��Zej	dd��Zed	d
��Z
edd��Zed
d��Zedd��Z
edd��Ze�fdd��Zej	dd��Zedd��Zedd��Zedd��Zedd��Zdd�Zed d!��Zed"d#��Zed$d%��Zed&d'��Zed(d)��Zej	d*d)��Zed+d,��Zed-d.��Zed/d0��Zed1d2��Zed3d4��Zed5d6��Zed7d8��Z d9d:�Z!d;d<�Z"d=d>�Z#d?d@�Z$dAdB�Z%dRdGdH�Z&dIdJ�Z'edKdL��Z(dMdN�Z)dOdP�Z*�Z+S)S�Packagez Represents a package. #:api z
-debuginfoz-debugsourcecs,tt|�j|�||_d|_d|_d|_dS)N)�superr�__init__�base�_priv_chksum�_repo�
_priv_size)�selfZ
initobjectr	)�	__class__��/usr/lib/python3.6/package.pyr0s
zPackage.__init__cs�|jr|jS|jr~tjjj�}ytjj||j	�}Wn6tj
jk
rh}ztjj
t|���WYdd}~XnXtj|�tj|�fStt|�jS)N)r
�
_from_cmdlinerZyumZmiscZget_default_chksum_type�libdnf�utilsZchecksum_value�location�error�Error�
exceptions�	MiscError�str�hawkey�chksum_type�binasciiZ	unhexlifyrr�chksum)r
rZ
chksum_val�e)rrr�_chksum7s"zPackage._chksumcCs
||_dS)N)r
)r
�valrrrrEscCs|jtjkS)N)�reponamerZCMDLINE_REPO_NAME)r
rrrrIszPackage._from_cmdlinecCs|jtjkS)N)r!r�SYSTEM_REPO_NAME)r
rrr�_from_systemMszPackage._from_systemcCs*d}|jr|jjj|�}|r$d|S|jS)a9
        For installed packages returns id of repository from which the package was installed
        prefixed with '@' (if such information is available in the history database). Otherwise
        returns id of repository the package belongs to (@System for installed packages of unknown
        origin)
        N�@)r#r	�history�repor!)r
Zpkgreporrr�
_from_repoQszPackage._from_repocCs|jr|jjj|�SdS)N�)r#r	r%r&)r
rrr�	from_repo`szPackage.from_repocCstjj|j��S)z�
        Returns the header of a locally present rpm package file. As opposed to
        self.get_header(), which retrieves the header of an installed package
        from rpmdb.
        )r�rpm�_header�localPkg)r
rrrr+gszPackage._headercs|jr|jStt|�jS)N)rrr�size)r
)rrr�_sizepsz
Package._sizecCs
||_dS)N)r)r
r rrrr.vscCs"|jdkrdS|j\}}tj|�S)N)Z
hdr_chksumr�hexlify)r
rrrrr�_pkgidzs

zPackage._pkgidcCs4|jdk	r,tjj|jd�}|jdd�d}nd}|S)zO
        returns name of source package
        e.g. krb5-libs -> krb5
        Nz.src.rpm�-�r)Z	sourcerpmr�utilZrtrim�rsplit)r
Zsrcnamerrr�source_name�s

zPackage.source_namecCsF|jj|j�r|jS|j}|jj|j�r<|dt|j��}||jS)a)
        Returns name of the debuginfo package for this package.
        If this package is a debuginfo package, returns its name.
        If this package is a debugsource package, returns the debuginfo package
        for the base package.
        e.g. kernel-PAE -> kernel-PAE-debuginfo
        N)�name�endswith�DEBUGINFO_SUFFIX�DEBUGSOURCE_SUFFIX�len)r
r6rrr�
debug_name�s
zPackage.debug_namecCs |jdk	r|jn|j}||jS)zv
        Returns name of the debugsource package for this package.
        e.g. krb5-libs -> krb5-debugsource
        N)r5r6r9)r
�src_namerrr�debugsource_name�szPackage.debugsource_namecCsN|js
dSyt|jjjtj|j��Stk
rHt	j
jdt|���YnXdS)a`
        Returns the rpm header of the package if it is installed. If not
        installed, returns None. The header is not cached, it is retrieved from
        rpmdb on every call. In case of a failure (e.g. when the rpmdb changes
        between loading the data and calling this method), raises an instance
        of PackageNotFoundError.
        Nz4Package not found when attempting to retrieve header)
r#�nextr	Z_tsZdbMatchr*ZRPMDBI_PACKAGES�rpmdbid�
StopIterationrrZPackageNotFoundErrorr)r
rrr�
get_header�szPackage.get_headercCs |jdk	r|jn|j}||jS)z�
        returns name of debuginfo package for source package of given package
        e.g. krb5-libs -> krb5-debuginfo
        N)r5r6r8)r
r<rrr�source_debug_name�szPackage.source_debug_namecCs
t|j�S)z: Always type it to int, rpm bindings expect it like that. )�intr?)r
rrr�idx�szPackage.idxcCs|jS)N)r!)r
rrr�repoid�szPackage.repoidcCs|j|jt|j�|j|jfS)N)r6�archrr�v�r)r
rrr�pkgtup�szPackage.pkgtupcCs|jr|jS|jj|jS)N)rr	Zreposr!)r
rrrr&�szPackage.repocCs
||_dS)N)r)r
r rrrr&�scCs |jtjkrdS|jjjj|�S)N)rErr"r	r%r*Zget_reason_name)r
rrr�reason�szPackage.reasoncCs|jS)N)r)r
rrr�relativepath�szPackage.relativepathcCs|jS)N)rF)r
rrr�a�sz	Package.acCs|jS)N)Zepoch)r
rrrr�sz	Package.ecCs|jS)N)�version)r
rrrrG�sz	Package.vcCs|jS)N)�release)r
rrrrH�sz	Package.rcCs|jS)N)r!)r
rrr�ui_from_reposzPackage.ui_from_repocCs|j|�dkS)Nr)�evr_cmp)r
�pkgrrr�evr_eqszPackage.evr_eqcCs|j|�dkS)Nr)rP)r
rQrrr�evr_gt	szPackage.evr_gtcCs|j|�dkS)Nr)rP)r
rQrrr�evr_lt
szPackage.evr_ltcCs|jS)N)Zmedianr)r
rrr�
getDiscNumszPackage.getDiscNumcCsr|jr|jS|j}|jjj�rH|jrH|jjd�rHtjj	|j
�|jd��S|j�s\tjj
|�}tjj	|j|jd��S)z� Package's location in the filesystem.

            For packages in remote repo returns where the package will be/has
            been downloaded.
        zfile://�/)rrr&r�isLocal�baseurl�
startswith�os�path�joinZget_local_baseurl�lstrip�
_is_local_pkg�basename�pkgdir)r
�locrrrr,szPackage.localPkg�http�ftp�file�httpscCs |js|jrdS|jj|j|�S)a
        The location from where the package can be downloaded from. Returns None for installed and
        commandline packages.

        :param schemes: list of allowed protocols. Default is ('http', 'ftp', 'file', 'https')
        :return: location (string) or None
        N)r#rr&�remote_locationr)r
Zschemesrrrrf$s	zPackage.remote_locationcCsL|jr
dSd|jkr&|jjd�r&dS|jpJ|jjj�oJ|jpJ|jjd�S)NTz://zfile://F)r#rrYrr&rrWrX)r
rrrr^1szPackage._is_local_pkgcCs,|jjj�r |j�r |jj�S|jjSdS)N)r&rrWr^Zcache_pkgdirr`)r
rrrr`:s
zPackage.pkgdircCs0|jdkrdS|j\}}tj|�tj|�j�fS)z] Return the chksum type and chksum string how the legacy yum expects
            it.
        N)NN)rrZchksum_namerr/�decode)r
rrrrr�returnIdSumBs

zPackage.returnIdSumcCst|jrtd��|jrdS|j�\}}ytjj||j�|�Stjj	k
rn}zt
jjt
|���WYdd}~XnXdS)Nz$Can not verify an installed package.T)r#�
ValueErrorrrhrrZchecksum_checkr,rrrrrr)r
rrrrrr�verifyLocalPkgLszPackage.verifyLocalPkg�rbrcrdre)rk),�__name__�
__module__�__qualname__�__doc__r8r9r�propertyr�setterrr#r'r)r+r.r0r5r;r=rArBrDrErIr&rJrKrLrrGrHrOrRrSrTrUr,rfr^r`rhrj�
__classcell__rr)rrr*sR	

	
r)roZ
__future__rrZdnf.i18nrrZdnf.exceptionsrZdnf.rpmZdnf.yum.miscrZlibdnf.errorrZlibdnf.utilsZloggingrZr*Z	getLoggerZloggerrrrrr�<module>s
site-packages/dnf/__pycache__/query.cpython-36.pyc000064400000001621147511334650016047 0ustar003

�ft`3�@sZddlmZddlmZddlZddlmZddlmZddlmZddd	�Z	d
d�Z
dS)
�)�absolute_import)�unicode_literalsN)�Query)�ucd)�
basestringFcCsLt|t�r|g}|j�}g}|r,|jtj�|j|d|i�|rD|S|j�S)NZprovides__glob)�
isinstancerZquery�append�hawkeyZICASEZfiltermZrun)ZsackZpatternsZignore_caseZ	get_query�q�flags�r�/usr/lib/python3.6/query.py�_by_providess
rcCsdd�|D�S)NcSsi|]}|t|��qSr)r)�.0Zpkgrrr
�
<dictcomp>.sz#_per_nevra_dict.<locals>.<dictcomp>r)Zpkg_listrrr
�_per_nevra_dict-sr)FF)Z
__future__rrr	rZdnf.i18nrZ
dnf.pycomprrrrrrr
�<module>s
site-packages/dnf/__pycache__/exceptions.cpython-36.pyc000064400000015510147511334650017065 0ustar003

�ft`��@spdZddlmZddlmZmZmZddlZddl	Z	ddl
Z
Gdd�de�ZGdd�de�Z
Gd	d
�d
e
�ZGdd�de
�ZGd
d�de
�ZGdd�de
�ZGdd�de
�ZGdd�de
�ZGdd�de
�ZGdd�de
�ZGdd�de
�ZGdd�de
�ZGdd�de�ZGdd �d e�ZGd!d"�d"e�ZGd#d$�d$e�ZGd%d&�d&e
�ZGd'd(�d(e�ZGd)d*�d*e
�ZdS)+z
Core DNF Errors.
�)�unicode_literals)�ucd�_�P_Nc@seZdZdS)�DeprecationWarningN)�__name__�
__module__�__qualname__�r
r
� /usr/lib/python3.6/exceptions.pyrsrcs2eZdZdZd	�fdd�	Zdd�Zdd�Z�ZS)
�ErrorzTBase Error. All other Errors thrown by DNF should inherit from this.

    :api

    Ncs(tt|�j�|dkrdnt|�|_dS)N)�superr�__init__r�value)�selfr)�	__class__r
rr&szError.__init__cCsdj|j�S)Nz{})�formatr)rr
r
r�__str__*sz
Error.__str__cCst|j��S)N)rr)rr
r
r�__unicode__-szError.__unicode__)N)rrr	�__doc__rrr�
__classcell__r
r
)rrrsrc@seZdZdS)�
CompsErrorN)rrr	r
r
r
rr2srcseZdZd�fdd�	Z�ZS)�ConfigErrorNcs*tt|�j|�|dk	r t|�nd|_dS)N)r
rrr�	raw_error)rrr)rr
rr8szConfigError.__init__)NN)rrr	rrr
r
)rrr7src@seZdZdS)�
DatabaseErrorN)rrr	r
r
r
rr=src@seZdZdS)�
DepsolveErrorN)rrr	r
r
r
rrAsrcs0eZdZ�fdd�Zedd��Zdd�Z�ZS)�
DownloadErrorcstt|�j�||_dS)N)r
rr�errmap)rr)rr
rrHszDownloadError.__init__cCsPg}x@|D]8}x2||D]&}|r,d||fnd|}|j|�qWq
Wdj|�S)Nz%s: %sz%s�
)�append�join)rZ
errstrings�key�error�msgr
r
r�
errmap2strLs
zDownloadError.errmap2strcCs|j|j�S)N)r$r)rr
r
rrUszDownloadError.__str__)rrr	r�staticmethodr$rrr
r
)rrrFs	rc@seZdZdS)�	LockErrorN)rrr	r
r
r
rr&Ysr&cs*eZdZd�fdd�	Z�fdd�Z�ZS)�MarkingErrorNcs*tt|�j|�|dkrdnt|�|_dS)z&Initialize the marking error instance.N)r
r'rr�pkg_spec)rrr()rr
rr`szMarkingError.__init__cs&tt|�j�}|jr"|d|j7}|S)Nz: )r
r'rr()r�string)rr
rreszMarkingError.__str__)NN)rrr	rrrr
r
)rrr']sr'cs4eZdZffffff�fdd�	Zedd��Z�ZS)�
MarkingErrorscstd�}|r&|dtd�dj|�7}|rD|dtd�dj|�7}|rb|dtd�dj|�7}|r�|dtd�dj|�7}|r�tjj|d�}|d	tjjjkr�|ddjt	d
dt
|��|g�7}n"|ddjt	dd
t
|��|g�7}tt|�j
|�||_||_||_||_||_dS)z&Initialize the marking error instance.zProblems in request:rzmissing packages: z, zbroken packages: zmissing groups or modules: zbroken groups or modules: r�z)Modular dependency problem with Defaults:z*Modular dependency problems with Defaults:zModular dependency problem:zModular dependency problems:N)rr �dnf�utilZ_format_resolve_problems�libdnf�moduleZModulePackageContainerZ!ModuleErrorType_ERROR_IN_DEFAULTSr�lenr
r*r�no_match_group_specs�error_group_specs�no_match_pkg_specs�error_pkg_specs�module_depsolv_errors)rr1r2r3r4r5r#Zmsg_mod)rr
rrns6zMarkingErrors.__init__cCsd}tj|tdd�|jS)Nz[Attribute module_debsolv_errors is deprecated. Use module_depsolv_errors attribute instead.�)�
stacklevel)�warnings�warnrr5)rr#r
r
r�module_debsolv_errors�sz#MarkingErrors.module_debsolv_errors)rrr	r�propertyr:rr
r
)rrr*lsr*c@seZdZdS)�
MetadataErrorN)rrr	r
r
r
rr<�sr<c@seZdZdS)�	MiscErrorN)rrr	r
r
r
rr=�sr=cseZdZd�fdd�	Z�ZS)�PackagesNotAvailableErrorNcs tt|�j||�|pg|_dS)N)r
r>r�packages)rrr(r?)rr
rr�sz"PackagesNotAvailableError.__init__)NNN)rrr	rrr
r
)rrr>�sr>c@seZdZdS)�PackageNotFoundErrorN)rrr	r
r
r
rr@�sr@cseZdZd�fdd�	Z�ZS)�PackagesNotInstalledErrorNcs tt|�j||�|pg|_dS)N)r
rArr?)rrr(r?)rr
rr�sz"PackagesNotInstalledError.__init__)NNN)rrr	rrr
r
)rrrA�srAcs$eZdZ�fdd�Zdd�Z�ZS)�ProcessLockErrorcstt|�j|�||_dS)N)r
rBr�pid)rrrC)rr
rr�szProcessLockError.__init__cCst|j|jffS)zPickling support.)rBrrC)rr
r
r�
__reduce__�szProcessLockError.__reduce__)rrr	rrDrr
r
)rrrB�srBc@seZdZdS)�	RepoErrorN)rrr	r
r
r
rrE�srEc@seZdZdS)�ThreadLockErrorN)rrr	r
r
r
rrF�srFc@seZdZdS)�TransactionCheckErrorN)rrr	r
r
r
rrG�srG)rZ
__future__rZdnf.i18nrrrZdnf.utilr,r.r8r�	Exceptionrrrrrrr&r'r*r<r=r>r@rArBrErFrGr
r
r
r�<module>s0)
site-packages/dnf/__pycache__/goal.cpython-36.opt-1.pyc000064400000000351147511334650016562 0ustar003

�ft`M�@s(ddlmZddlmZddlmZdS)�)�absolute_import)�unicode_literals)�GoalN)Z
__future__rrZhawkeyr�rr�/usr/lib/python3.6/goal.py�<module>ssite-packages/dnf/__pycache__/history.cpython-36.opt-1.pyc000064400000000404147511334650017340 0ustar003

�ft`~�@s dZddlmZddlmZdS)z*Interfaces to the history of transactions.�)�absolute_import)�unicode_literalsN)�__doc__Z
__future__rr�rr�/usr/lib/python3.6/history.py�<module>ssite-packages/dnf/__pycache__/__init__.cpython-36.opt-1.pyc000064400000001006147511334650017375 0ustar003

�ft`m�@spddlmZddlZddlZejdedd�ddlmZeZ	ddl
ZejjZddl
ZejjZejjjjd�dS)�)�unicode_literalsN�oncez	^dnf\..*$)�category�module)�VERSIONZmedia)Z
__future__r�warningsZ
dnf.pycompZdnf�filterwarnings�DeprecationWarningZ	dnf.constr�__version__Zdnf.base�baseZBaseZ
dnf.pluginZpluginZPluginZpycompZurlparseZ
uses_fragment�append�r
r
�/usr/lib/python3.6/__init__.py�<module>ssite-packages/dnf/__pycache__/dnssec.cpython-36.pyc000064400000021257147511334650016170 0ustar003

�ft`;,�@s�ddlmZddlmZddlmZddlmZddlZddlZddlZddl	Z	ddl
mZddlZ
ddlZ
ejd�ZdZGd	d
�d
e
jj�Zddd
�ZGdd�de�ZGdd�d�ZGdd�d�ZGdd�d�Zdd�Zdd�ZGdd�d�ZdS)�)�print_function)�absolute_import)�unicode_literals)�EnumN)�_�dnf�=c@seZdZdZdd�ZdS)�DnssecErrorz-
    Exception used in the dnssec module
    cCsdj|jdk	r|jnd�S)Nz<DnssecError, value='{}'>z
Not specified)�format�value)�self�r
�/usr/lib/python3.6/dnssec.py�__repr__-szDnssecError.__repr__N)�__name__�
__module__�__qualname__�__doc__rr
r
r
rr	)sr	�_openpgpkeycCs~|jd�}t|�dkr"d}t|��|d}|d}tj�}|j|jd��tj|j	�dd��j
d�j�}|d|d|S)	z�
    Implements RFC 7929, section 3
    https://tools.ietf.org/html/rfc7929#section-3
    :param email_address:
    :param tag:
    :return:
    �@�z0Email address must contain exactly one '@' sign.r�zutf-8��.)�split�lenr	�hashlibZsha256�update�encode�base64Z	b16encode�digest�decode�lower)Z
email_address�tagr�msgZlocalZdomain�hashr r
r
r�email2location2s	

r&c@s(eZdZdZdZdZdZdZdZdZ	dS)	�Validityz�
    Output of the verification algorithm.
    TODO: this type might be simplified in order to less reflect the underlying DNS layer.
    TODO: more specifically the variants from 3 to 5 should have more understandable names
    rr����	N)
rrrr�VALID�REVOKED�PROVEN_NONEXISTENCE�RESULT_NOT_SECURE�BOGUS_RESULT�ERRORr
r
r
rr'Jsr'c@seZdZdZdS)�NoKeyz�
    This class represents an absence of a key in the cache. It is an expression of non-existence
    using the Python's type system.
    N)rrrrr
r
r
rr2Xsr2c@s&eZdZdZddd�Zedd��ZdS)�KeyInfozv
    Wrapper class for email and associated verification key, where both are represented in
    form of a string.
    NcCs||_||_dS)N)�email�key)rr4r5r
r
r�__init__eszKeyInfo.__init__c	Cs�tjd|�}|dkrt�|jd�}|jd�jd�}d}d}x6tdt|��D]$}||dkr^|}||dkrJ|}qJWd	j||d
|d��j	d�}t
||�S)z�
        Since dnf uses different format of the key than the one used in DNS RR, I need to convert
        the former one into the new one.
        z	<(.*@.*)>Nr�ascii�
rz$-----BEGIN PGP PUBLIC KEY BLOCK-----z"-----END PGP PUBLIC KEY BLOCK-----�r)�re�searchr	�groupr!r�ranger�joinrr3)	ZuseridZraw_keyZinput_emailr4r5�start�stop�iZcat_keyr
r
r�from_rpm_key_objectis
 zKeyInfo.from_rpm_key_object)NN)rrrrr6�staticmethodrBr
r
r
rr3`s
r3c@s8eZdZdZiZedd��Zedd��Zedd��ZdS)	�DNSSECKeyVerificationz�
    The main class when it comes to verification itself. It wraps Unbound context and a cache with
    already obtained results.
    cCsZ||krtjd�tjS|tkr0tjd�tjStjdj|��tjdj|��tjSdS)zD
        Compare the key in case it was found in the cache.
        zCache hit, valid keyzCache hit, proven non-existencezKey in cache: {}zInput key   : {}N)�logger�debugr'r,r2r.r
r-)�	key_unionZinput_key_stringr
r
r�
_cache_hit�s

z DNSSECKeyVerification._cache_hitc	Cs�yddl}Wn<tk
rH}z tdj|��}tjj|��WYdd}~XnX|j�}|jdd�dkrlt	j
d�|jdd�dkr�t	j
d	�|j�dkr�t	j
d
�|jd�dkr�t	j
d�|j
t|j�t|j�\}}|dkr�t	j
d
�tjS|jr�t	j
d�tjS|j�st	j
d�tjS|j�r,t	j
d�tjS|j�sDt	j
d�tjS|jj�d}tj|�}||jk�rntj St	j
dj|��t	j
dj|j��tj!SdS)zz
        In case the key was not found in the cache, create an Unbound context and contact the DNS
        system
        rNzLConfiguration option 'gpgkey_dns_verification' requires python3-unbound ({})z
verbosity:�0z(Unbound context: Failed to set verbosityzqname-minimisation:�yesz1Unbound context: Failed to set qname minimisationz+Unbound context: Failed to read resolv.confz/var/lib/unbound/root.keyz0Unbound context: Failed to add trust anchor filez%Communication with DNS servers failedzDNSSEC signatures are wrongz!Result is not secured with DNSSECz1Non-existence of this record was proven by DNSSECz"Unknown error in DNS communicationzKey from DNS: {}zInput key   : {})"�unbound�ImportErrorrr
r�
exceptions�ErrorZub_ctxZ
set_optionrErFZ
resolvconfZadd_ta_fileZresolver&r4�RR_TYPE_OPENPGPKEYZRR_CLASS_INr'r1Zbogusr0Zsecurer/Znxdomainr.Zhavedata�dataZas_raw_datarZ	b64encoder5r,r-)	�	input_keyrK�er$ZctxZstatus�resultrPZdns_data_b64r
r
r�_cache_miss�sN









z!DNSSECKeyVerification._cache_misscCsztjdj|j��tjj|j�}|dk	r6tj||j�Stj	|�}|t
jkrZ|jtj|j<n|t
jkrrt
�tj|j<|SdS)zI
        Public API. Use this method to verify a KeyInfo object.
        z(Running verification for key with id: {}N)rErFr
r4rD�_cache�getrHr5rTr'r,r.r2)rQrGrSr
r
r�verify�s


zDNSSECKeyVerification.verifyN)	rrrrrUrCrHrTrWr
r
r
rrD�s
9rDcCs8td�|jd}|tjkr(|td�S|td�SdS)zE
    Inform the user about key validity in a human readable way.
    zDNSSEC extension: Key for user � z	is valid.zhas unknown status.N)rr4r'r,)Zki�v�prefixr
r
r�
nice_user_msg�s
r[cCstd�|S)z;
    Label any given message with DNSSEC extension tag
    zDNSSEC extension: )r)�mr
r
r�any_msg�sr]c@s(eZdZdZedd��Zedd��ZdS)�RpmImportedKeysaQ
    Wrapper around keys, that are imported in the RPM database.

    The keys are stored in packages with name gpg-pubkey, where the version and
    release is different for each of them. The key content itself is stored as
    an ASCII armored string in the package description, so it needs to be parsed
    before it can be used.
    c	Cs�tjjj�}|jdd�}g}xl|D]d}tjj|d�}tjd|�jd�}tjj|d�}|j	d�dd�}d	j
|�}|t||jd
��g7}q"W|S)N�namez
gpg-pubkey�packagerz	<(.*@.*)>r�descriptionr8r(r9r7���)
rZrpmZtransactionZTransactionWrapperZdbMatchZ	getheaderr:r;r<rr>r3r)	Ztransaction_setZpackagesZreturn_listZpkgr`r4raZ	key_linesZkey_strr
r
r�_query_db_for_gpg_keyss

z&RpmImportedKeys._query_db_for_gpg_keyscCstj�}tjttd���x�|D]�}ytj|�}Wn:tk
rl}ztj	dj
|j|j��w WYdd}~XnX|t
jkr�tjtdj
|j���q |t
jkr�tjtdj
|j���q |t
jkr�tjtdj
|j���q |t
jkr�tjtdj
|j���q tjtdj
|j���q WdS)Nz1Testing already imported keys for their validity.z%DNSSEC extension error (email={}): {}zGPG Key {} is validz,GPG Key {} does not support DNS verificationz�GPG Key {} could not be verified, because DNSSEC signatures are bogus. Possible causes: wrong configuration of the DNS server, MITM attackz=GPG Key {} has been revoked and should be removed immediatelyzGPG Key {} could not be tested)r^rcrE�infor]rrDrWr	Zwarningr
r4rr'r,rFr.r0r-)�keysr5rSrRr
r
r�check_imported_keys_validitys,







z,RpmImportedKeys.check_imported_keys_validityN)rrrrrCrcrfr
r
r
rr^�sr^)r)Z
__future__rrr�enumrrrZloggingr:Zdnf.i18nrZdnf.rpmrZdnf.exceptionsZ	getLoggerrErOrMrNr	r&r'r2r3rDr[r]r^r
r
r
r�<module>s*
	
#gsite-packages/dnf/__pycache__/callback.cpython-36.opt-1.pyc000064400000007067147511334650017407 0ustar003

�ft`��@s
ddlmZddlZddlZejjZejjZejjZejj	Z	ejj
Z
ejjZejjZejj
ZeZ
ejjZejjZejjZejjZejjZejjZejjZdZdZdZdZdZGdd�de�ZGd	d
�d
e�ZGdd�de�ZGd
d�de�ZGdd�de�Z ej!j"j#Z$dS)�)�unicode_literalsN����c@seZdZdd�ZdS)�	KeyImportcCsdS)z+Ask the user if the key should be imported.F�)�self�idZuseridZfingerprintZurlZ	timestamprr�/usr/lib/python3.6/callback.py�_confirm5szKeyImport._confirmN)�__name__�
__module__�__qualname__rrrrrr4src@s(eZdZdd�Zdd�Zedd��ZdS)�PayloadcCs
||_dS)N)�progress)r	rrrr�__init__=szPayload.__init__cCsdS)z)Nice, human-readable representation. :apiNr)r	rrr�__str__@szPayload.__str__cCsdS)z Total size of the download. :apiNr)r	rrr�
download_sizeDszPayload.download_sizeN)r
rrrr�propertyrrrrrr:src@s.eZdZdd�Zdd�Zdd�Zddd	�Zd
S)�DownloadProgresscCsdS)z�Communicate the information that `payload` has finished downloading.

        :api, `status` is a constant denoting the type of outcome, `err_msg` is an
        error message in case the outcome was an error.

        Nr)r	�payloadZstatus�msgrrr�endMszDownloadProgress.endcCsdS)Nr)r	rrrr�messageVszDownloadProgress.messagecCsdS)z�Update the progress display. :api

        `payload` is the payload this call reports progress for, `done` is how
        many bytes of this payload are already downloaded.

        Nr)r	r�donerrrrYszDownloadProgress.progressrcCsdS)z�Start new progress metering. :api

        `total_files` the number of files that will be downloaded,
        `total_size` total size of all files.

        Nr)r	Ztotal_filesZ
total_sizeZtotal_drpmsrrr�startcszDownloadProgress.startN)r)r
rrrrrrrrrrrJs	
rc@seZdZdS)�NullDownloadProgressN)r
rrrrrrrnsrc@s$eZdZdd�Zdd�Zdd�ZdS)�DepsolvecCsdS)Nr)r	rrrrsszDepsolve.startcCsdS)Nr)r	Zpkg�moderrr�	pkg_addedvszDepsolve.pkg_addedcCsdS)Nr)r	rrrryszDepsolve.endN)r
rrrr rrrrrrrsr)%Z
__future__rZdnf.yum.rpmtransZdnfZdnf.transactionZtransactionZ
PKG_DOWNGRADEZPKG_DOWNGRADEDZPKG_INSTALLZPKG_OBSOLETEZ
PKG_OBSOLETEDZ
PKG_REINSTALLZPKG_REINSTALLEDZ	PKG_ERASEZ
PKG_REMOVEZPKG_UPGRADEZPKG_UPGRADEDZPKG_CLEANUPZ
PKG_VERIFYZ
PKG_SCRIPTLETZTRANS_PREPARATIONZ
TRANS_POSTZ	STATUS_OKZ
STATUS_FAILEDZSTATUS_ALREADY_EXISTSZ
STATUS_MIRRORZSTATUS_DRPM�objectrrrrrZyumZrpmtransZTransactionDisplayZTransactionProgressrrrr�<module>s:$site-packages/dnf/__pycache__/exceptions.cpython-36.opt-1.pyc000064400000015510147511334650020024 0ustar003

�ft`��@spdZddlmZddlmZmZmZddlZddl	Z	ddl
Z
Gdd�de�ZGdd�de�Z
Gd	d
�d
e
�ZGdd�de
�ZGd
d�de
�ZGdd�de
�ZGdd�de
�ZGdd�de
�ZGdd�de
�ZGdd�de
�ZGdd�de
�ZGdd�de
�ZGdd�de�ZGdd �d e�ZGd!d"�d"e�ZGd#d$�d$e�ZGd%d&�d&e
�ZGd'd(�d(e�ZGd)d*�d*e
�ZdS)+z
Core DNF Errors.
�)�unicode_literals)�ucd�_�P_Nc@seZdZdS)�DeprecationWarningN)�__name__�
__module__�__qualname__�r
r
� /usr/lib/python3.6/exceptions.pyrsrcs2eZdZdZd	�fdd�	Zdd�Zdd�Z�ZS)
�ErrorzTBase Error. All other Errors thrown by DNF should inherit from this.

    :api

    Ncs(tt|�j�|dkrdnt|�|_dS)N)�superr�__init__r�value)�selfr)�	__class__r
rr&szError.__init__cCsdj|j�S)Nz{})�formatr)rr
r
r�__str__*sz
Error.__str__cCst|j��S)N)rr)rr
r
r�__unicode__-szError.__unicode__)N)rrr	�__doc__rrr�
__classcell__r
r
)rrrsrc@seZdZdS)�
CompsErrorN)rrr	r
r
r
rr2srcseZdZd�fdd�	Z�ZS)�ConfigErrorNcs*tt|�j|�|dk	r t|�nd|_dS)N)r
rrr�	raw_error)rrr)rr
rr8szConfigError.__init__)NN)rrr	rrr
r
)rrr7src@seZdZdS)�
DatabaseErrorN)rrr	r
r
r
rr=src@seZdZdS)�
DepsolveErrorN)rrr	r
r
r
rrAsrcs0eZdZ�fdd�Zedd��Zdd�Z�ZS)�
DownloadErrorcstt|�j�||_dS)N)r
rr�errmap)rr)rr
rrHszDownloadError.__init__cCsPg}x@|D]8}x2||D]&}|r,d||fnd|}|j|�qWq
Wdj|�S)Nz%s: %sz%s�
)�append�join)rZ
errstrings�key�error�msgr
r
r�
errmap2strLs
zDownloadError.errmap2strcCs|j|j�S)N)r$r)rr
r
rrUszDownloadError.__str__)rrr	r�staticmethodr$rrr
r
)rrrFs	rc@seZdZdS)�	LockErrorN)rrr	r
r
r
rr&Ysr&cs*eZdZd�fdd�	Z�fdd�Z�ZS)�MarkingErrorNcs*tt|�j|�|dkrdnt|�|_dS)z&Initialize the marking error instance.N)r
r'rr�pkg_spec)rrr()rr
rr`szMarkingError.__init__cs&tt|�j�}|jr"|d|j7}|S)Nz: )r
r'rr()r�string)rr
rreszMarkingError.__str__)NN)rrr	rrrr
r
)rrr']sr'cs4eZdZffffff�fdd�	Zedd��Z�ZS)�
MarkingErrorscstd�}|r&|dtd�dj|�7}|rD|dtd�dj|�7}|rb|dtd�dj|�7}|r�|dtd�dj|�7}|r�tjj|d�}|d	tjjjkr�|ddjt	d
dt
|��|g�7}n"|ddjt	dd
t
|��|g�7}tt|�j
|�||_||_||_||_||_dS)z&Initialize the marking error instance.zProblems in request:rzmissing packages: z, zbroken packages: zmissing groups or modules: zbroken groups or modules: r�z)Modular dependency problem with Defaults:z*Modular dependency problems with Defaults:zModular dependency problem:zModular dependency problems:N)rr �dnf�utilZ_format_resolve_problems�libdnf�moduleZModulePackageContainerZ!ModuleErrorType_ERROR_IN_DEFAULTSr�lenr
r*r�no_match_group_specs�error_group_specs�no_match_pkg_specs�error_pkg_specs�module_depsolv_errors)rr1r2r3r4r5r#Zmsg_mod)rr
rrns6zMarkingErrors.__init__cCsd}tj|tdd�|jS)Nz[Attribute module_debsolv_errors is deprecated. Use module_depsolv_errors attribute instead.�)�
stacklevel)�warnings�warnrr5)rr#r
r
r�module_debsolv_errors�sz#MarkingErrors.module_debsolv_errors)rrr	r�propertyr:rr
r
)rrr*lsr*c@seZdZdS)�
MetadataErrorN)rrr	r
r
r
rr<�sr<c@seZdZdS)�	MiscErrorN)rrr	r
r
r
rr=�sr=cseZdZd�fdd�	Z�ZS)�PackagesNotAvailableErrorNcs tt|�j||�|pg|_dS)N)r
r>r�packages)rrr(r?)rr
rr�sz"PackagesNotAvailableError.__init__)NNN)rrr	rrr
r
)rrr>�sr>c@seZdZdS)�PackageNotFoundErrorN)rrr	r
r
r
rr@�sr@cseZdZd�fdd�	Z�ZS)�PackagesNotInstalledErrorNcs tt|�j||�|pg|_dS)N)r
rArr?)rrr(r?)rr
rr�sz"PackagesNotInstalledError.__init__)NNN)rrr	rrr
r
)rrrA�srAcs$eZdZ�fdd�Zdd�Z�ZS)�ProcessLockErrorcstt|�j|�||_dS)N)r
rBr�pid)rrrC)rr
rr�szProcessLockError.__init__cCst|j|jffS)zPickling support.)rBrrC)rr
r
r�
__reduce__�szProcessLockError.__reduce__)rrr	rrDrr
r
)rrrB�srBc@seZdZdS)�	RepoErrorN)rrr	r
r
r
rrE�srEc@seZdZdS)�ThreadLockErrorN)rrr	r
r
r
rrF�srFc@seZdZdS)�TransactionCheckErrorN)rrr	r
r
r
rrG�srG)rZ
__future__rZdnf.i18nrrrZdnf.utilr,r.r8r�	Exceptionrrrrrrr&r'r*r<r=r>r@rArBrErFrGr
r
r
r�<module>s0)
site-packages/dnf/__pycache__/callback.cpython-36.pyc000064400000007067147511334650016450 0ustar003

�ft`��@s
ddlmZddlZddlZejjZejjZejjZejj	Z	ejj
Z
ejjZejjZejj
ZeZ
ejjZejjZejjZejjZejjZejjZejjZdZdZdZdZdZGdd�de�ZGd	d
�d
e�ZGdd�de�ZGd
d�de�ZGdd�de�Z ej!j"j#Z$dS)�)�unicode_literalsN����c@seZdZdd�ZdS)�	KeyImportcCsdS)z+Ask the user if the key should be imported.F�)�self�idZuseridZfingerprintZurlZ	timestamprr�/usr/lib/python3.6/callback.py�_confirm5szKeyImport._confirmN)�__name__�
__module__�__qualname__rrrrrr4src@s(eZdZdd�Zdd�Zedd��ZdS)�PayloadcCs
||_dS)N)�progress)r	rrrr�__init__=szPayload.__init__cCsdS)z)Nice, human-readable representation. :apiNr)r	rrr�__str__@szPayload.__str__cCsdS)z Total size of the download. :apiNr)r	rrr�
download_sizeDszPayload.download_sizeN)r
rrrr�propertyrrrrrr:src@s.eZdZdd�Zdd�Zdd�Zddd	�Zd
S)�DownloadProgresscCsdS)z�Communicate the information that `payload` has finished downloading.

        :api, `status` is a constant denoting the type of outcome, `err_msg` is an
        error message in case the outcome was an error.

        Nr)r	�payloadZstatus�msgrrr�endMszDownloadProgress.endcCsdS)Nr)r	rrrr�messageVszDownloadProgress.messagecCsdS)z�Update the progress display. :api

        `payload` is the payload this call reports progress for, `done` is how
        many bytes of this payload are already downloaded.

        Nr)r	r�donerrrrYszDownloadProgress.progressrcCsdS)z�Start new progress metering. :api

        `total_files` the number of files that will be downloaded,
        `total_size` total size of all files.

        Nr)r	Ztotal_filesZ
total_sizeZtotal_drpmsrrr�startcszDownloadProgress.startN)r)r
rrrrrrrrrrrJs	
rc@seZdZdS)�NullDownloadProgressN)r
rrrrrrrnsrc@s$eZdZdd�Zdd�Zdd�ZdS)�DepsolvecCsdS)Nr)r	rrrrsszDepsolve.startcCsdS)Nr)r	Zpkg�moderrr�	pkg_addedvszDepsolve.pkg_addedcCsdS)Nr)r	rrrryszDepsolve.endN)r
rrrr rrrrrrrsr)%Z
__future__rZdnf.yum.rpmtransZdnfZdnf.transactionZtransactionZ
PKG_DOWNGRADEZPKG_DOWNGRADEDZPKG_INSTALLZPKG_OBSOLETEZ
PKG_OBSOLETEDZ
PKG_REINSTALLZPKG_REINSTALLEDZ	PKG_ERASEZ
PKG_REMOVEZPKG_UPGRADEZPKG_UPGRADEDZPKG_CLEANUPZ
PKG_VERIFYZ
PKG_SCRIPTLETZTRANS_PREPARATIONZ
TRANS_POSTZ	STATUS_OKZ
STATUS_FAILEDZSTATUS_ALREADY_EXISTSZ
STATUS_MIRRORZSTATUS_DRPM�objectrrrrrZyumZrpmtransZTransactionDisplayZTransactionProgressrrrr�<module>s:$site-packages/dnf/__pycache__/persistor.cpython-36.pyc000064400000010067147511334650016740 0ustar003

�ft`o�@s�ddlmZddlmZddlmZddlZddlZddl	Z	ddl
Z
ddlZddlZddl
Z
ddlZejd�ZGdd�de�ZGdd	�d	e�ZGd
d�de�ZdS)�)�absolute_import)�unicode_literals)�_N�dnfc@s,eZdZdd�Zgfdd�Zedd��ZdS)�JSONDBcCs0tjj|�s,tjjtjj|��|j|g�dS)N)�os�path�isfiler�utilZ
ensure_dir�dirname�_write_json_db)�self�	json_path�r�/usr/lib/python3.6/persistor.py�_check_json_db+szJSONDB._check_json_dbcCs�t|d��}|j�}WdQRX|dkrDtjtd�|�|j||�n<ytj|�}Wn,tk
r~}ztj|�WYdd}~XnX|S)N�r�z%s is empty file)	�open�read�logger�warningrr�json�loads�
ValueError)r
r�default�f�content�errr�_get_json_db1szJSONDB._get_json_dbc
Cs&t|d��}tj||�WdQRXdS)N�w)rr�dump)rrrrrrr?szJSONDB._write_json_dbN)�__name__�
__module__�__qualname__rr�staticmethodrrrrrr)src@s<eZdZdZdd�Zedd��Zdd�Zdd	�Zd
d�Z	dS)
�
RepoPersistorzePersistent data kept for repositories.

    Is arch/releasever specific and stores to cachedir.

    cCs*||_tjj|jd�|_t�|_d|_dS)Nzexpired_repos.jsonF)�cachedirrr�join�db_path�set�expired_to_add�reset_last_makecache)r
r'rrr�__init__LszRepoPersistor.__init__cCstjj|jd�S)NZlast_makecache)rrr(r')r
rrr�_last_makecache_pathRsz"RepoPersistor._last_makecache_pathcCsRy|j|j�t|j|j��Stk
rL}ztjtd�|�dSd}~XnXdS)Nz&Failed to load expired repos cache: %s)rr)r*r�OSErrorrrr)r
rrrr�get_expired_reposVszRepoPersistor.get_expired_reposcCs�y$|j|j�|j|jt|j��Wn0tk
rT}ztjtd�|�dSd}~XnX|j	r�yt
jj|j
�dStk
r�tjtd��dSXdS)Nz'Failed to store expired repos cache: %sFTz#Failed storing last makecache time.)rr)r�listr+r/rrrr,rr
Ztouchr.�IOError)r
rrrr�save^szRepoPersistor.savecCs:yttjj|j��Stk
r4tjtd��dSXdS)Nz'Failed determining last makecache time.)	�intrr
Zfile_ager.r/rrr)r
rrr�since_last_makecachems
z"RepoPersistor.since_last_makecacheN)
r"r#r$�__doc__r-�propertyr.r0r3r5rrrrr&Esr&c@s,eZdZdd�Zdd�Zdd�Zdd�Zd	S)
�TempfilePersistorcCs"tjj|d�|_t�|_d|_dS)Nztempfiles.jsonF)rrr(r)r*�tempfiles_to_add�_empty)r
r'rrrr-wszTempfilePersistor.__init__cCs|j|j�|j|j�S)N)rr)r)r
rrr�get_saved_tempfiles|sz%TempfilePersistor.get_saved_tempfilescCsp|jr|jrdS|j|j�|jr8|j|jg�dS|jrlt|j|j��}|j|j�|j|jt|��dS)N)	r:r9rr)rr*r�updater1)r
�datarrrr3�szTempfilePersistor.savecCs
d|_dS)NT)r:)r
rrr�empty�szTempfilePersistor.emptyN)r"r#r$r-r;r3r>rrrrr8usr8)Z
__future__rrZdnf.i18nrZdistutils.versionZ	distutilsZdnf.utilr�errnoZfnmatchrZloggingr�reZ	getLoggerr�objectrr&r8rrrr�<module>s
site-packages/dnf/__pycache__/transaction_sr.cpython-36.opt-1.pyc  [compiled CPython 3.6 bytecode for dnf/transaction_sr.py (TransactionError, TransactionReplayError, serialize_transaction, TransactionReplay); binary contents omitted]
site-packages/dnf/__pycache__/comps.cpython-36.opt-1.pyc  [compiled CPython 3.6 bytecode for dnf/comps.py (CompsQuery, Category, Environment, Group, Comps, Solver); binary contents omitted]
site-packages/dnf/__pycache__/match_counter.cpython-36.pyc  [compiled CPython 3.6 bytecode for dnf/match_counter.py (MatchCounter); binary contents omitted]
site-packages/dnf/__pycache__/persistor.cpython-36.opt-1.pyc  [compiled CPython 3.6 bytecode for dnf/persistor.py (JSONDB, RepoPersistor, TempfilePersistor); binary contents omitted]
site-packages/dnf/__pycache__/pycomp.cpython-36.pyc  [compiled CPython 3.6 bytecode for dnf/pycomp.py (Python 2/3 compatibility helpers); binary contents omitted]
site-packages/dnf/__pycache__/repo.cpython-36.pyc  [compiled CPython 3.6 bytecode for dnf/repo.py (Repo, Metadata, package payload and download-callback classes); binary contents omitted]
site-packages/dnf/__pycache__/sack.cpython-36.opt-1.pyc  [compiled CPython 3.6 bytecode for dnf/sack.py (Sack, rpmdb_sack); binary contents omitted]
site-packages/dnf/__pycache__/crypto.cpython-36.opt-1.pyc  [compiled CPython 3.6 bytecode for dnf/crypto.py (GPG key retrieval and import helpers, Key); binary contents omitted]
site-packages/dnf/__pycache__/repodict.cpython-36.pyc  [compiled CPython 3.6 bytecode for dnf/repodict.py (RepoDict); binary contents omitted]
site-packages/dnf/__pycache__/lock.cpython-36.opt-1.pyc  [compiled CPython 3.6 bytecode for dnf/lock.py (ProcessLock and lock-builder helpers); binary contents omitted]
site-packages/dnf/__pycache__/match_counter.cpython-36.opt-1.pyc  [compiled CPython 3.6 bytecode for dnf/match_counter.py (MatchCounter), optimized variant; binary contents omitted]
site-packages/dnf/__pycache__/plugin.cpython-36.pyc  [compiled CPython 3.6 bytecode for dnf/plugin.py (Plugin, Plugins); binary data follows]

i�-eV%�@s�ddlmZddlmZddlmZddlZddlZddlZddlZddlZddl	Z	ddl
Z
ddlZddlZddl
Z
ddlZddlZddlZddlmZejd�ZdZGdd	�d	e�ZGd
d�de�Zdd
�Zdd�Zdd�Zdd�Zdd�ZdS)�)�absolute_import)�print_function)�unicode_literalsN)�_�dnfzdnf.plugin.dynamicc@s\eZdZdZdZdZedd��Zdd�Zdd	�Z	d
d�Z
dd
�Zdd�Zdd�Z
dd�ZdS)�Pluginz5The base class custom plugins must derive from. #:apiz	<invalid>Ncs�tjj�}|jr|jn|j��fdd�|jD�}xb|D]Z}tjj|�r6y|j	|�Wq6t
k
r�}ztjj
td�t|���WYdd}~Xq6Xq6W|S)Ncsg|]}d|�f�qS)z
%s/%s.conf�)�.0�path)�namer�/usr/lib/python3.6/plugin.py�
<listcomp>9sz&Plugin.read_config.<locals>.<listcomp>zParsing file failed: %s)�libdnf�confZConfigParser�config_namerZpluginconfpath�osr
�isfile�read�	Exceptionr�
exceptionsZConfigErrorr�str)�clsr�parser�files�file�er)rr�read_config4s

.zPlugin.read_configcCs||_||_dS)N)�base�cli)�selfrrrrr�__init__BszPlugin.__init__cCsdS)Nr)rrrr�
pre_configGszPlugin.pre_configcCsdS)Nr)rrrr�configKsz
Plugin.configcCsdS)Nr)rrrr�resolvedOszPlugin.resolvedcCsdS)Nr)rrrr�sackSszPlugin.sackcCsdS)Nr)rrrr�pre_transactionWszPlugin.pre_transactioncCsdS)Nr)rrrr�transaction[szPlugin.transaction)�__name__�
__module__�__qualname__�__doc__rr�classmethodrr r!r"r#r$r%r&rrrrr.src@s~eZdZdd�Zdd�Zdd�Zdd�Zd	d
�Zdd�Zd
d�Z	ddd�Z
dd�Zdd�Zdd�Z
dd�Zdd�Zdd�ZdS)�PluginscCsg|_g|_dS)N)�
plugin_cls�plugins)rrrrr aszPlugins.__init__cCs|j�dS)N)�_unload)rrrr�__del__eszPlugins.__del__cCs~xx|jD]n}yt||��Wqtjjk
r6�Yqtk
rttj�\}}}tj	|||�}t
jdj|��YqXqWdS)N�)
r.�getattrrr�Errorr�sys�exc_info�	traceback�format_exception�loggerZcritical�join)r�method�plugin�exc_type�	exc_value�
exc_tracebackZexcept_listrrr�_callerhszPlugins._callercsxxr|jdd�D]`}|j�t�fdd�|D��r2q|j|�}|jd�o^|jdd�o^|jdd�}|r|jj|�qWdS)zwChecks whether plugins are enabled or disabled in configuration files
           and removes disabled plugins from listNc3s|]}tj�|�VqdS)N)�fnmatch)r	�pattern)rrr�	<genexpr>xsz)Plugins._check_enabled.<locals>.<genexpr>�mainZenabled)r-r�anyrZhas_sectionZ
has_optionZ
getboolean�remove)rr�enable_pluginsZplug_clsrZdisabledr)rr�_check_enabledss

zPlugins._check_enabledcCs�ttjkrtd��tjjt�tjt<}g|_t|j	||�}t
||�t�dd�|_|j
||�t|j�dkr�tdd�|jD��}tjtd�dj|��dS)z)Dynamically load relevant plugin modules.zload_plugins() called twiceNrcss|]}|jVqdS)N)r)r	r;rrrrB�sz Plugins._load.<locals>.<genexpr>zLoaded plugins: %sz, )�DYNAMIC_PACKAGEr4�modules�RuntimeErrorrZpycomp�
ModuleType�__path__�_get_plugins_filesZ
pluginpath�_import_modules�_plugin_classesr-rG�len�sortedr8�debugrr9)rrZskipsrF�packager�namesrrr�_load�s

z
Plugins._loadcCs|jd�dS)Nr!)r?)rrrr�_run_pre_config�szPlugins._run_pre_configcCs|jd�dS)Nr")r?)rrrr�_run_config�szPlugins._run_configNcCs*x$|jD]}|||�}|jj|�qWdS)N)r-r.�append)rrrZp_clsr;rrr�	_run_init�s
zPlugins._run_initcCs|jd�dS)Nr$)r?)rrrr�run_sack�szPlugins.run_sackcCs|jd�dS)Nr#)r?)rrrr�run_resolved�szPlugins.run_resolvedcCs|jd�dS)Nr%)r?)rrrr�run_pre_transaction�szPlugins.run_pre_transactioncCs|jd�dS)Nr&)r?)rrrr�run_transaction�szPlugins.run_transactioncCs&ttjkr"tjtjjd�tjt=dS)NzPlugins were unloaded.)rHr4rIr8�logr�loggingZDDEBUG)rrrrr/�s
zPlugins._unloadcCs�|js
dSt�}x|jD]}||tj|j�<qWt|j��}t�}x |jD]}|j|j	|j
��qJW|sldSx|jD]}|j|j
�qtWx|D]}|jj
||�q�WdS)zH
        Unload plugins that were removed in the `transaction`.
        N)Z
remove_set�dictr.�inspectZgetfile�	__class__�set�keys�update�intersectionrZinstall_set�difference_updaterE)rr&r.r;Zplugin_filesZerased_plugin_filesZpkgZplugin_filerrr�unload_removed_plugins�s
zPlugins.unload_removed_plugins)N)r'r(r)r r0r?rGrUrVrWrYrZr[r\r]r/rhrrrrr,`s
r,cCstj�S)N)r�__subclasses__rrrrrO�srOcCs�x�|D]�}tjj|�\}}|jj|�tjj|�\}}d|j|f}ytj|�}Wqt	k
r�}z,t
jtd�||�t
j
tjjddd�WYdd}~XqXqWdS)Nz%s.%szFailed loading plugin "%s": %sr1T)r5)rr
�splitrLrX�splitextr'�	importlib�
import_modulerr8�errorrr^rr_ZSUBDEBUG)rSZpy_files�fnr
�moduleZextrrrrrrN�s
rNcCsJg}t|�}t|�}t�}t�}x�|D]�}x�tjd|�D]�}tjjtjj|��\}}	d}
d}xN|D]F}t||�rd|j|�d}
x$|D]}
t||
�r�d}
|j|
�q�Wd}qdW|s�x |D]}
t||
�r�|j|
�q�W|
r:|j|�q:Wq&W|j	|�}|�rt
jtd�j
djt|����|j	|�}|�rFt
jtd�j
djt|����|S)Nz%s/*.pyTFz=No matches found for the following enable plugin patterns: {}z, z>No matches found for the following disable plugin patterns: {})rc�globrr
rk�basename�_plugin_name_matches_pattern�addrX�
differencer8Zwarningr�formatr9rQ)�pathsZdisable_pluginsrFr.Zpattern_enable_foundZpattern_disable_found�pro�plugin_nameZdummyZmatchedZenable_pattern_testedZpattern_skipZpattern_enableZenable_not_foundZdisable_not_foundrrrrM�sD









rMcs*t||jdd�f�}t�fdd�|D��S)z�
    Checks plugin name matches the pattern.

    The alternative plugin name using dashes instead of underscores is tried
    in case of original name is not matched.

    (see https://bugzilla.redhat.com/show_bug.cgi?id=1980712)
    r�-c3s|]}tj|��VqdS)N)r@)r	r)rArrrBsz/_plugin_name_matches_pattern.<locals>.<genexpr>)rc�replacerD)ryrAZ	try_namesr)rArrss
rscs<�fdd�}tt�jd�tjf|�jdd��}|�_�S)z5A class decorator for automatic command registration.cs|r|j��dS)N)�register_command)rrr)�
command_classrrr sz"register_command.<locals>.__init__rr)r r)�typerr'rr�aliasesZ_plugin)r}r Zplugin_classr)r}rr|sr|)Z
__future__rrrr@rqrlrar_�operatorrr4r6rZdnf.loggingrZ
dnf.pycompZdnf.utilZdnf.i18nrZ	getLoggerr8rH�objectrr,rOrNrMrsr|rrrr�<module>s2
2k
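The MatchCounter docstring above describes a dict keyed by package whose values are lists of (key, needle) matches, ranked by per-attribute weights. This is a simplified sketch of that idea; the concrete weights and tie-breaking rules are assumptions, not the values used by dnf.

# Illustrative weights: the dump only shows that name/summary/description/url are ranked.
WEIGHTS = {'name': 7, 'summary': 4, 'description': 2, 'url': 1}

class MatchCounter(dict):
    """Map packages to which of their attributes matched: pkg -> [(key, needle), ...]."""

    def add(self, pkg, key, needle):
        self.setdefault(pkg, []).append((key, needle))

    def _eval_weights(self, pkg):
        # An exact attribute match counts double, a substring match counts once.
        total = 0
        for key, needle in self[pkg]:
            exact = getattr(pkg, key, None) == needle
            total += (2 if exact else 1) * WEIGHTS[key]
        return total

    def sorted(self, reverse=False):
        return sorted(self.keys(), key=self._eval_weights, reverse=reverse)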
site-packages/dnf/__pycache__/logging.cpython-36.pyc  (ustar member, mode 0644)
    [Compiled CPython 3.6 bytecode for dnf/logging.py; binary payload not reproducible as
     text. Recoverable behaviour:]
      - Extra log levels (SUPERCRITICAL, DDEBUG, SUBDEBUG, TRACE, ALL) plus mappings from the
        configuration's debuglevel/errorlevel values onto them.
      - only_once(): "Method decorator turning the method into noop on second or later calls."
      - compression_name()/compression_rotator(): gzip a rotated log file in chunks and remove
        the original; MultiprocessRotatingFileHandler wraps RotatingFileHandler with a
        /var/log lock so concurrent dnf processes can share one log file.
      - _create_filehandler() creates the log file if missing and formats records as
        "%(asctime)s %(levelname)s %(message)s" with "%Y-%m-%dT%H:%M:%S%z" timestamps.
      - class Logging wires stdout/stderr handlers plus file loggers for "dnf", "py.warnings",
        "librepo" and "dnf.rpm"; Timer logs "timer: %s: %d ms"; LibdnfLoggerCB forwards
        libdnf/librepo messages into Python logging.
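The compression_rotator()/compression_name() strings above point at gzip-compressed log rotation. The standard library exposes exactly the hooks needed for that; the sketch below shows the pattern with example paths and sizes (the real dnf handler additionally takes a cross-process lock, which is omitted here).

import gzip
import logging
import logging.handlers
import os

def _namer(name):
    return name + ".gz"                       # rotated files get a .gz suffix

def _rotator(source, dest):
    with open(source, "rb") as sf, gzip.open(dest, "wb") as df:
        while True:
            chunk = sf.read(128 * 1024)
            if not chunk:
                break
            df.write(chunk)
    os.remove(source)

handler = logging.handlers.RotatingFileHandler(
    "/tmp/example-dnf.log", maxBytes=1024 * 1024, backupCount=4)   # example values
handler.setFormatter(logging.Formatter(
    "%(asctime)s %(levelname)s %(message)s", "%Y-%m-%dT%H:%M:%S%z"))
handler.rotator, handler.namer = _rotator, _namer
logging.getLogger("example").addHandler(handler)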





site-packages/dnf/__pycache__/dnssec.cpython-36.opt-1.pyc  (ustar member, mode 0644)
    [Compiled CPython 3.6 bytecode for dnf/dnssec.py; binary payload not reproducible as text.
     Recoverable behaviour:]
      - RR_TYPE_OPENPGPKEY = 61; DnssecError: "Exception used in the dnssec module".
      - email2location(): "Implements RFC 7929, section 3" -- hashes the local part of the
        address and returns "<hash>._openpgpkey.<domain>"; exactly one '@' sign is required.
      - Validity enum: VALID, REVOKED, PROVEN_NONEXISTENCE, RESULT_NOT_SECURE, BOGUS_RESULT,
        ERROR; NoKey "represents an absence of a key in the cache".
      - KeyInfo.from_rpm_key_object() extracts the e-mail from the RPM userid and the ASCII
        armored block between the "BEGIN/END PGP PUBLIC KEY BLOCK" markers.
      - DNSSECKeyVerification resolves the OPENPGPKEY record through python3-unbound using
        the trust anchor /var/lib/unbound/root.key ("Configuration option
        'gpgkey_dns_verification' requires python3-unbound").
      - RpmImportedKeys reads the gpg-pubkey packages from the RPM database and logs per-key
        results ("GPG Key {} is valid", "... has been revoked and should be removed
        immediately", "... DNSSEC signatures are bogus ...").
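email2location() above cites RFC 7929 section 3, which derives the DNS owner name for an OPENPGPKEY record from an e-mail address. The sketch below assumes the RFC's SHA2-256 digest truncated to 28 octets and hex-encoded; treat the details as a reconstruction rather than dnf's exact code.

import base64
import hashlib

def email2location(email_address, tag="_openpgpkey"):
    local, sep, domain = email_address.partition("@")
    if not sep or "@" in domain:
        raise ValueError("Email address must contain exactly one '@' sign.")
    digest = hashlib.sha256(local.encode("utf-8")).digest()
    owner = base64.b16encode(digest[:28]).decode("ascii").lower()
    return owner + "." + tag + "." + domain

# email2location("user@example.com") -> "<56 hex characters>._openpgpkey.example.com"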
	
#gsite-packages/dnf/__pycache__/util.cpython-36.opt-1.pyc000064400000045565147511334650016635 0ustar003

�ft`�O�@svddlmZddlmZddlmZddlmZmZddlmZm	Z	ddl
Z
ddlZddlZddl
ZddlZddlZddlZddlZddlZddlZddlZddlZddlZddlZddlZddlZddlZddlZddlZejd�Ze
j �j!d	kr�e
j �j!ndZ"e"j#�Z$d
d�Z%d^dd
�Z&d_dd�Z'dd�Z(dd�Z)dd�Z*dd�Z+dd�Z,dd�Z-dd�Z.dd �Z/d!d"�Z0d#d$�Z1d%d&�Z2d'd(�Z3d)d*�Z4d+d,�Z5d-d.�Z6d/d0�Z7d1d2�Z8d3d4�Z9d5d6�Z:d7d8�Z;d9d:�Z<d;d<�Z=d=d>�Z>d?d@�Z?dAdB�Z@dCdD�ZAd`dFdG�ZBdHdIejCfdJdK�ZDdLdM�ZEdNdO�ZFdPdQ�ZGdRdS�ZHGdTdU�dUeI�ZJGdVdW�dWeK�ZLGdXdY�dYeM�ZNdZd[�ZOd\d]�ZPdS)a�)�print_function)�absolute_import)�unicode_literals�)�PY3�
basestring)�_�ucdN�dnfZyumcCs�t|dg�t|dg�t|dg�t�}x�|D]�}||kr>q0|j|�tjjj|�d}|jd�rr|jj|�q0|r�|d
kr�|jj|�q0|j	d
�r�|j
j|dd��q0|jj|�q0WdS)a�
    Categorize :param values list into packages, groups and filenames

    :param namespace: argparse.Namespace, where specs will be stored
    :param values: list of specs, whether packages ('foo') or groups/modules ('@bar')
                   or filenames ('*.rmp', 'http://*', ...)

    To access packages use: specs.pkg_specs,
    to access groups use: specs.grp_specs,
    to access filenames use: specs.filenames
    �	filenames�	grp_specs�	pkg_specsrz.rpm�http�ftp�file�https�@rN)rrrr)�setattr�set�addr
�pycompZurlparse�endswithr�append�
startswithrr
)�	namespace�valuesZtmp_set�valueZschemes�r�/usr/lib/python3.6/util.py�_parse_specs7s 




rcCs�|dkrtjj�}tjj|||�}tjj|j�r6|jSt	|j
g�}|jd|�|j�g}yt
jjjt
jj|�d�WnBtk
r�}z&|jr�tt|���tjt|��WYdd}~XnX|jS)NrT)r
�callbackZNullDownloadProgress�repoZRemoteRPMPayload�os�path�existsZ
local_path�sumZ
download_size�startZ_librepo_target�libdnfZ
PackageTargetZdownloadPackagesZVectorPPackageTarget�RuntimeError�strict�IOError�str�logger�error)�url�confZprogressZploadZest_remote_sizeZtargets�errr�_urlopen_progressWs

 r1�w+bcKs�trd|kr|jdd�tj|f|�}y<|r@|jj||j��n tjj	j
|rR|jnd||j��Wn.tk
r�}zt
t|���WYdd}~XnX|jd�|S)z|
    Open the specified absolute url, return a file object
    which respects proxy setting even for non-repo downloads
    �b�encodingzutf-8Nr)r�
setdefault�tempfileZNamedTemporaryFileZ_repoZdownloadUrl�filenor'r!Z
DownloaderZdownloadURLZ_configr(r*r+�seek)r.r/r!�mode�kwargsZfor0rrr�_urlopenhs$
r;cCs |j|�r|dt|��}|S)N)r�len)�s�rrrr�rtrim|s
r?cCstj�dkS)Nr)r"�geteuidrrrr�	am_i_root�srAcCs.x(tj|�D]}tjj||�}t|�qWdS)zBRemove all files and dirs under `path`

    Also see rm_rf()

    N)r"�listdirr#�join�rm_rf)r#�entryZcontained_pathrrr�	clear_dir�srFcCsXytj|dd�Wn@tk
rR}z$|jtjks>tjj|�rB|�WYdd}~XnXdS)Ni�)r9)r"�makedirs�OSError�errnoZEEXISTr#�isdir)Zdnamer0rrr�
ensure_dir�s
rKcCsJg}|}x<tjj|�\}}|s6|s(|r4|jd|�P|jd|�q
W|S)z`
    Split path by path separators.
    Use os.path.join() to join the path back to string.
    r)r"r#�split�insert)r#�result�head�tailrrr�
split_path�s
rQcCs6yt|�}Wn tk
r,tt|��}YnX|dkS)Nr)r<�	TypeError�list)�iterable�lrrr�empty�s
rVcCs*t|�}yt|�Stk
r$dSXdS)zFReturns the first item from an iterable or None if it has no elements.N)�iter�next�
StopIteration)rT�itrrr�first�s
r[cCs4t|�}ytdd�|D��Stk
r.dSXdS)Ncss|]}|dk	r|VqdS)Nr)�.0�itemrrr�	<genexpr>�sz!first_not_none.<locals>.<genexpr>)rWrXrY)rTrZrrr�first_not_none�s
r_cCstj�t|�S)N)�time�file_timestamp)�fnrrr�file_age�srccCstj|�jS)N)r"�stat�st_mtime)rbrrrra�sracCs4ytjtj��dStk
r.dtj�SXdS)NrzUID: %s)�pwd�getpwuidr"r@�KeyErrorrrrr�get_effective_login�sricCs(x"|D]}|j|�}|dkr|SqW|S)z!Like dict.get() for nested dicts.N)�get)Zdct�keysZ	not_found�krrr�get_in�s


rmcs�fdd�}tj||ggf�S)Ncs|t�|��j|�|S)N)�boolr)Zaccr])rbrr�splitter�sz!group_by_filter.<locals>.splitter)�	functools�reduce)rbrTror)rbr�group_by_filter�srrccs&x |D]}||�r|V|VqWdS)z/Insert an item into an iterable by a condition.Nr)r]rT�	conditionZ
original_itemrrr�	insert_if�s
rtcCs*yt|�Wntk
r dSXdSdS)z&Test whether an iterator is exhausted.TFN)rXrY)�iteratorrrr�is_exhausted�s
rvcCs*t|�r|g}t|t�o(tdd�|D��S)Ncss|]}t|�td�@VqdS)z*[?N)r)r\�prrrr^�sz"is_glob_pattern.<locals>.<genexpr>)�is_string_type�
isinstancerS�any)�patternrrr�is_glob_pattern�sr|cCstrt|t�St|t�SdS)N)rryr+r)�objrrrrx�s
rxcs�fdd�}|S)z�Decorator to get lazy attribute initialization.

    Composes with @property. Force reinitialization by deleting the <attrname>.
    cs��fdd�}|S)Ncs8y
t|��Stk
r2�|�}t|�|�|SXdS)N)�getattr�AttributeErrorr)r}�val)�attrnamerbrr�
cached_getters
z6lazyattr.<locals>.get_decorated.<locals>.cached_getterr)rbr�)r�)rbr�
get_decorated�szlazyattr.<locals>.get_decoratedr)r�r�r)r�r�lazyattr�s	r�cGstt|f|���S)z�Like functools.map(), but return a list instead of an iterator.

    This means all side effects of fn take place even without iterating the
    result.

    )rS�map)rb�seqrrr�mapallsr�cCs8tjdtj|��}tjjs4tj�d}|r4|j|�}|S)z6Convert time into locale aware datetime string object.z%cr)	r`ZstrftimeZ	localtimer
rr�localeZ	getlocale�decode)Z	timestamp�tZcurrent_locale_settingrrr�normalize_times
r�cCszy\d}dd�tj|�D�}t|�dkrZ|d}tdj||���}|j�}t|�dkSQRXdSttfk
rtdSXdS)z�Decide whether we are on line power.

    Returns True if we are on line power, False if not, None if it can not be
    decided.

    z/sys/class/power_supplycSsg|]}|jd�r|�qS)ZAC)r)r\Znoderrr�
<listcomp>&szon_ac_power.<locals>.<listcomp>rz{}/{}/onlinerN)	r"rBr<�open�format�read�intr*�
ValueError)Z	ps_folderZac_nodesZac_nodeZ	ac_status�datarrr�on_ac_powersr�cCs�yddl}Wntk
r dSXy0|j�}|jdd�}|j|d�}|jdd�}Wn|jk
rhdSX|dkrvdS|dkr�d	S|dkr�dStd
|��dS)z�Decide whether we are on metered connection.

    Returns:
      True: if on metered connection
      False: if not
      None: if it can not be decided
    rNzorg.freedesktop.NetworkManagerz/org/freedesktop/NetworkManagerzorg.freedesktop.DBus.PropertiesZMeteredr�T��Fz&Unknown value for metered property: %r)rr�)r�r�)�dbus�ImportErrorZ	SystemBusZ
get_objectZ	InterfaceZGetZ
DBusExceptionr�)r�Zbus�proxyZifaceZmeteredrrr�on_metered_connection1s&r�cCs&tj|�\}}tjj||�t||�fS)z�Use a predicate to partition entries into false entries and true entries.

    Credit: Python library itertools' documentation.

    )�	itertools�teer
r�filterfalse�filter)ZpredrTZt1Zt2rrr�	partitionNsr�cCs(ytj|�Wntk
r"YnXdS)N)�shutilZrmtreerH)r#rrrrDWsrDc#sFt���fdd�}t�||�}||�Vx||�}|s8P|Vq*WdS)z�Split an iterable into tuples by a condition.

    Inserts a separator before each item which meets the condition and then
    cuts the iterable by these separators.

    csttj�fdd�|��S)Ncs|�kS)Nr)r0)�	separatorrr�<lambda>gsz4split_by.<locals>.next_subsequence.<locals>.<lambda>)�tupler��	takewhile)rZ)r�rr�next_subsequencefsz"split_by.<locals>.next_subsequenceN)�objectrt)rTrsr�ZmarkedZsubsequencer)r�r�split_by]s
r�cCs|j|�r|t|�d�SdS)N)rr<)r=�prefixrrr�strip_prefixus
r�Fc	Cs8|stj|tj�rtj|d�St|d��WdQRXdS)z{Create an empty file if it doesn't exist or bump it's timestamps.

    If no_create is True only bumps the timestamps.
    N�a)r"�access�F_OK�utimer�)r#Z	no_createrrr�touch{sr��write�cCs�yh|dkr|j|�nP|dkr(|j�n>|dkrD|j|�|j�n"|dkrZt||d�ntd|��Wn>tk
r�}z"tjdjt|�j	t
|���WYdd}~XnXdS)Nr��flushZwrite_flush�print)rzUnsupported type: z{}: {})r�r�r�r�r*r,�criticalr��type�__name__r	)�tp�msg�outr0rrr�_terminal_messenger�s


r�cCsnd}t|�dk}xXt|dd�D]H\}}|rD|dtd�d|7}n|dtd�d7}|dj|�7}qW|S)	z�
    Format string about problems in resolve

    :param resolve_problems: list with list of strings (output of goal.problem_rules())
    :return: string
    r�r)r&z
 ZProblemz %d: z: z
  - )r<�	enumeraterrC)Zresolve_problemsr�Zcount_problems�iZrsrrr�_format_resolve_problems�sr�cCsX|j�d}|j�dk	r4|j�dkr4||j�d7}||j�d|j�d|j�S)N�-�0�:�.)�N�E�V�R�A)ZteZnevrarrr�	_te_nevra�sr�cCs�tjd�xH|D]@}|j�}d}|dk	r.|j}djt|�|||j��}tj|�qWx:|D]2}djt|�|j|j|j	|j
|j��}tj|�qZWdS)NzLogging transaction elementsz@RPM element: '{}', Key(): '{}', Key state: '{}', Failed() '{}': z^SWDB element: '{}', State: '{}', Action: '{}', From repo: '{}', Reason: '{}', Get reason: '{}')r,�debug�Key�stater�r��Failedr+�actionZ	from_repo�reasonZ
get_reason)�rpm_transaction�swdb_transaction�rpm_el�tsiZ	tsi_stater�rrr�_log_rpm_trans_with_swdb�s



r�c
CsVtjjtjjtjjtjjtjjh}dd�|D�}d}d}x�|D]�}t|�}|j�}|dksft	|d�r�x:|D]2}	|	j
tjjkr�ql|	j|kr�qlt
|	�|krl|	}PqlW|dks�t	|d�r�tjtd�j|��d}q>|j�r�tjj|_
d}q>tjj|_
q>Wx6|D].}|j
tjjkr�tjtd�jt
|���d}q�W|�rBtjtd��|�rRt||�dS)	NcSsg|]}|�qSrr)r\r�rrrr��sz-_sync_rpm_trans_with_swdb.<locals>.<listcomp>FZpkgz%TransactionItem not found for key: {}Tz)TransactionSWDBItem not found for key: {}z#Errors occurred during transaction.)r'�transactionZ TransactionItemAction_DOWNGRADEDZTransactionItemAction_OBSOLETED�TransactionItemAction_REMOVEZTransactionItemAction_UPGRADEDZ!TransactionItemAction_REINSTALLEDr�r��hasattrr�ZTransactionItemState_UNKNOWNr�r+r,r�rr�r��TransactionItemState_ERRORZTransactionItemState_DONEr�r�)
r�r�Zrevert_actionsZ
cached_tsiZel_not_foundr-r�Zte_nevrar�Z
tsi_candidaterrr�_sync_rpm_trans_with_swdb�sH





r�c@s$eZdZdd�Zdd�Zdd�ZdS)�tmpdircCsdtjj}tj|d�|_dS)Nz%s-)r�)r
�constZPREFIXr6Zmkdtempr#)�selfr�rrr�__init__�sztmpdir.__init__cCs|jS)N)r#)r�rrr�	__enter__�sztmpdir.__enter__cCst|j�dS)N)rDr#)r��exc_type�	exc_value�	tracebackrrr�__exit__�sztmpdir.__exit__N)r��
__module__�__qualname__r�r�r�rrrrr��sr�cs(eZdZdZ�fdd�Zdd�Z�ZS)�Bunchz�Dictionary with attribute accessing syntax.

    In DNF, prefer using this over dnf.yum.misc.GenericHolder.

    Credit: Alex Martelli, Doug Hudgeon

    cstt|�j||�||_dS)N)�superr�r��__dict__)r��args�kwds)�	__class__rrr��szBunch.__init__cCst|�S)N)�id)r�rrr�__hash__szBunch.__hash__)r�r�r��__doc__r�r��
__classcell__rr)r�rr��sr�cs,eZdZ�fdd�Zdd�Zdd�Z�ZS)�
MultiCallListcstt|�j�|j|�dS)N)r�r�r��extend)r�rT)r�rrr�szMultiCallList.__init__cs��fdd�}|S)Ncs���fdd�}tt|���S)Ncst|��}|���S)N)r~)�v�method)r�r:�whatrr�	call_what
s
z8MultiCallList.__getattr__.<locals>.fn.<locals>.call_what)rSr�)r�r:r�)r�r�)r�r:rrbsz%MultiCallList.__getattr__.<locals>.fnr)r�r�rbr)r�r�r�__getattr__szMultiCallList.__getattr__cs��fdd�}tt||��S)Ncst|���dS)N)r)r])r�r�rr�settersz)MultiCallList.__setattr__.<locals>.setter)rSr�)r�r�r�r�r)r�r�r�__setattr__szMultiCallList.__setattr__)r�r�r�r�r�r�r�rr)r�rr�sr�c
Csntgggggggggggd��}�xF|D�]<}|jtjjkrJ|jj|�q(|jtjjkrf|j	j|�q(|jtjj
kr�|jtjjkr�|j
j|�nD|jtjjkr�|jj|�n(|jtjjkr�|jj|�n|jj|�q(|jtjjkr�|jj|�q(|jtjjk�rL|jtjjk�r |jj|�n*|jtjjk�r>|jj|�n|jj|�q(|jtjjkr(|jj|�q(W|S)N)�
downgraded�erased�erased_clean�
erased_dep�	installed�installed_group�
installed_dep�installed_weak�reinstalled�upgraded�failed)r�r�r'r�r�r�rr�ZTransactionItemAction_DOWNGRADEr�ZTransactionItemAction_INSTALLr�ZTransactionItemReason_GROUPr�Z TransactionItemReason_DEPENDENCYr�Z%TransactionItemReason_WEAK_DEPENDENCYr�r�ZTransactionItemAction_REINSTALLr�r�ZTransactionItemReason_CLEANr�r�r�ZTransactionItemAction_UPGRADEr�)r�r3r�rrr�_make_listssH
rcs��fdd�}tjj|�}�jd|d�\}}|j|�}g}x�td�|jftd�|jftd�|j|j	|j
|jftd�|jftd	�|ftd
�|j
|j|jftd�|jfgD]&\}	}
|j||	t|
tj|�d���q�W|S)
alReturns a human-readable summary of the results of the
    transaction.

    :param action_callback: function generating output for specific action. It
       takes two parameters - action as a string and list of affected packages for
       this action
    :return: a list of lines containing a human-readable summary of the
       results of the transaction
    cs�|j|jk|j|jk}|dkr$|Stj|j|j|j|j|jd�}tj|j|j|j|j|jd�}|j|�j�}|dkrz|S|j|jk|j|jkS)z�Compares two transaction items or packages by nevra.
           Used as a fallback when tsi does not contain package object.
        r)�name�epoch�version�release�arch)	r�hawkeyZNEVRArrrrZevr_cmpZsack)Zitem1Zitem2�retZnevra1Znevra2)�baserr�_tsi_or_pkg_nevra_cmpPsz7_post_transaction_output.<locals>._tsi_or_pkg_nevra_cmpF)Zreport_problemsr�ZUpgradedZ
DowngradedZ	InstalledZReinstalledZSkippedZRemovedr�)�key)r
�utilrZ_skipped_packages�unionrr�r�r�r�r�r�r�r�r�r�r�r��sortedrp�
cmp_to_key)rr�Zaction_callbackr	Z
list_bunchZskipped_conflictsZskipped_brokenZskippedr�r�Ztsisr)rr�_post_transaction_outputFs(



r)N)NNr2)F)QZ
__future__rrrrrrZdnf.i18nrr	�argparser
Zdnf.callbackZ	dnf.constZ
dnf.pycomprIrprr�r�Zloggingr"rfr��sysr6r`Zlibdnf.repor'Zlibdnf.transactionZ	getLoggerr,�ArgumentParser�progZ	MAIN_PROG�upperZMAIN_PROG_UPPERrr1r;r?rArFrKrQrVr[r_rcrarirmrrrtrvr|rxr�r�r�r�r�r�rDr�r�r��stdoutr�r�r�r�r�r�r��dictr�rSr�rrrrrr�<module>s�
 


						
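A few of the helpers whose docstrings survive in the util member above can be reconstructed almost verbatim; the sketches below follow those docstrings ("Returns the first item ...", "Like dict.get() for nested dicts.", "Decide whether we are on line power.") but the exact signatures are assumptions.

import os

def first(iterable):
    """Return the first item from an iterable, or None if it has no elements."""
    return next(iter(iterable), None)

def get_in(dct, keys, not_found):
    """Like dict.get() for nested dicts: get_in(cfg, ['a', 'b'], default)."""
    for key in keys:
        dct = dct.get(key)
        if dct is None:
            return not_found
    return dct

def on_ac_power():
    """True on line power, False on battery, None if it cannot be decided."""
    try:
        ps_folder = "/sys/class/power_supply"
        ac_nodes = [n for n in os.listdir(ps_folder) if n.startswith("AC")]
        if len(ac_nodes) == 1:
            with open("{}/{}/online".format(ps_folder, ac_nodes[0])) as f:
                return int(f.read()) == 1
        return None
    except (IOError, ValueError):
        return None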
site-packages/dnf/__pycache__/query.cpython-36.opt-1.pyc  (ustar member, mode 0644)
    [Compiled CPython 3.6 bytecode for dnf/query.py; binary payload not reproducible as text.
     Recoverable: _by_provides() builds a hawkey Query filtered by provides__glob (optionally
     with hawkey.ICASE) and _per_nevra_dict() maps each package's NEVRA string to the package.]
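The two helpers in the query member above are small enough to reconstruct from the embedded names; the hawkey calls and keyword spelling are assumptions based on those strings.

import hawkey   # assumed available, as in the dumped module

def _by_provides(sack, patterns, ignore_case=False, get_query=False):
    if isinstance(patterns, str):
        patterns = [patterns]
    q = sack.query()
    flags = [hawkey.ICASE] if ignore_case else []
    q = q.filterm(*flags, provides__glob=patterns)
    return q if get_query else q.run()

def _per_nevra_dict(pkg_list):
    # Map each package's NEVRA string to the package object.
    return {str(pkg): pkg for pkg in pkg_list}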
�<module>s
site-packages/dnf/__pycache__/pycomp.cpython-36.opt-1.pyc  (ustar member, mode 0644)
    [Compiled CPython 3.6 bytecode for dnf/pycomp.py; binary payload not reproducible as text.
     Recoverable: a Python 2/3 compatibility layer keyed on a PY3 flag -- StringIO,
     ConfigParser, queue and urlparse imports, basestring/unicode/long/xrange/raw_input
     aliases, base64 and urllib quoting helpers, gettext_setup(), is_py2str_py3bytes(),
     is_py3bytes(), setlocale(), write_to_file() and email_mime().]



site-packages/dnf/__pycache__/i18n.cpython-36.opt-1.pyc  (ustar member, mode 0644)
    [Compiled CPython 3.6 bytecode for dnf/i18n.py; binary payload not reproducible as text.
     Recoverable docstrings:]
      - UnicodeStream wraps an output stream and re-encodes with "replace"/"backslashreplace".
      - _full_ucd_support(): "Return true if encoding can express any Unicode character";
        _guess_encoding() falls back to UTF-8 for "ANSI*" preferred encodings; setup_locale()
        retries with C.UTF-8 and C ("Failed to set locale, defaulting to {}").
      - ucd(): "Like the builtin unicode() but tries to use a reasonable encoding."
      - Terminal-width helpers: exact_width() ("Asian char counts for two"), chop_str()
        (replacement for "%.*s"), fill_exact_width() (replacement for "%*.*s"),
        textwrap_fill() ("works like we want textwrap.wrap() to work") and
        select_short_long(), which picks the short or long message depending on screen space.
      - translation(): "Easy gettext translations setup based on given domain name";
        pgettext() provides context-aware lookups.
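The width helpers quoted above replace printf-style "%.*s"/"%*.*s" formatting with Unicode-aware column counting; the sketch below follows the quoted docstrings, with East-Asian wide characters counted as two columns.

import unicodedata

def _exact_width_char(uchar):
    return 2 if unicodedata.east_asian_width(uchar) in ("W", "F") else 1

def exact_width(msg):
    """Textual width of the string on a terminal (Asian chars count for two)."""
    return sum(_exact_width_char(c) for c in msg)

def chop_str(msg, chop=None):
    """Return (width, msg chopped to at most `chop` columns); use instead of '%.*s'."""
    if chop is None:
        return exact_width(msg), msg
    width, out = 0, ""
    for char in msg:
        char_width = _exact_width_char(char)
        if width + char_width > chop:
            break
        width += char_width
        out += char
    return width, out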
�<module>s2

"
O'site-packages/dnf/__pycache__/const.cpython-36.pyc000064400000002572147511334650016036 0ustar003

i�-eA	�@s�ddlmZddlZdZdZd&Zd'ZddddddgZdZ	dZ
dZdZdZ
dZdZdZdZdZdZdZd Zej�Zej�Zd!Zd"ejj�Zd#Zd$eZej�Zd%Z dS)(�)�unicode_literalsNz/etc/dnf/dnf.confz/etc/dnf/automatic.conf�system-release(releasever)�system-release� distribution-release(releasever)�distribution-release�redhat-release�suse-release�	mandatory�default�conditionalZkernelz
kernel-PAEzinstallonlypkg(kernel)zinstallonlypkg(kernel-module)zinstallonlypkg(vm)zmultiversion(kernel)zdnf.logz
hawkey.logzdnf.librepo.logz--- logging initialized ---zdnf.rpm.logZDNFz/var/lib/dnfz/var/run/dnf.pidz/runz	/run/userz/var/cache/dnfz	/var/tmp/�z/etc/dnf/pluginsz%s/dnf-pluginsz4.7.0zdnf/%szhttps://bugs.rockylinux.org/)rrrrrr)r	r
r)!Z
__future__rZdistutils.sysconfigZ	distutilsZ
CONF_FILENAMEZCONF_AUTOMATIC_FILENAMEZDISTROVERPKGZGROUP_PACKAGE_TYPESZINSTALLONLYPKGSZLOGZ
LOG_HAWKEYZLOG_LIBREPOZ
LOG_MARKERZLOG_RPM�NAMEZ
PERSISTDIRZPID_FILENAMEZRUNDIRZUSER_RUNDIRZSYSTEM_CACHEDIRZTMPDIRZ
VERBOSE_LEVEL�lowerZPREFIXZPROGRAM_NAMEZPLUGINCONFPATH�	sysconfigZget_python_libZ
PLUGINPATH�VERSIONZ
USER_AGENTZBUGTRACKER_COMPONENTZ
BUGTRACKER�rr�/usr/lib/python3.6/const.py�<module>sBsite-packages/dnf/__pycache__/lock.cpython-36.pyc000064400000010002147511334650015623 0ustar003

�ft`��@s�ddlmZddlmZddlmZmZmZddlmZddl	m
Z
ddlZddl
ZddlZddlZddlZddlZddlZddlZddlZejd�Zdd	�Zd
d�Zdd
�Zdd�Zdd�ZGdd�de�ZdS)�)�absolute_import)�unicode_literals)�ProcessLockError�ThreadLockError�	LockError)�_)�miscN�dnfcCs6tjj�s2tj|jd��j�}tjj	t
j�d|�}|S)Nzutf-8Zlocks)r	�utilZ	am_i_root�hashlibZsha1�encodeZ	hexdigest�os�path�joinrZgetCacheDir)Zdir_Zhexdir�r�/usr/lib/python3.6/lock.py�
_fit_lock_dir&s
rcCsttjjt|�d�d|�S)Nzdownload_lock.pid�cachedir)�ProcessLockr
rrr)r�exit_on_lockrrr�build_download_lock/srcCsttjjt|�d�d|�S)Nzmetadata_lock.pidZmetadata)rr
rrr)rrrrr�build_metadata_lock3srcCsttjjt|�d�d|�S)Nzrpmdb_lock.pidZRPMDB)rr
rrr)Z
persistdirrrrr�build_rpmdb_lock8srcCsttjjt|�d�d|�S)Nzlog_lock.pid�log)rr
rrr)Zlogdirrrrr�build_log_lock=src@s>eZdZddd�Zdd�Zdd�Zdd	�Zd
d�Zdd
�ZdS)rFcCs&||_d|_||_||_tj�|_dS)Nr)�blocking�count�description�target�	threading�RLock�thread_lock)�selfrrrrrr�__init__Cs
zProcessLock.__init__cCs2|jjdd�s d|j}t|��|jd7_dS)NF)rz'%s already locked by a different thread�)r!�acquirerrr)r"�msgrrr�_lock_threadJs
zProcessLock._lock_threadc Cs>tj|jtjtjBd�}�zytj|tjtjB�Wn4t	k
rh}z|j
t
jkrVdS�WYdd}~XnXtj|d�}t
|�dkr�tj|t|�jd��|Syt|�}Wn*tk
r�td�|j}t|��YnX||kr�|Stjd|tj��s*tj|dtj�tj|d�tj|t|�jd��|S|Stj|�XdS)	Ni�r$�rzutf-8z�Malformed lock file found: %s.
Ensure no other dnf/yum process is running and remove the lock file manually or run systemd-tmpfiles --remove dnf.conf.z
/proc/%d/stat���)r
�openr�O_CREAT�O_RDWR�fcntlZflockZLOCK_EXZLOCK_NB�OSError�errnoZEWOULDBLOCK�read�len�write�strr�int�
ValueErrorrr�access�F_OK�lseek�SEEK_SET�	ftruncate�close)r"�pid�fd�eZold_pidr&rrr�	_try_lockPs6zProcessLock._try_lockcCs|jd8_|jj�dS)Nr$)rr!�release)r"rrr�_unlock_threadzszProcessLock._unlock_threadcCs�tjjtjj|j��|j�d}tj�}|j	|�}xp||kr�|dkr�|j
sl|j�d|j|f}t
||��||kr�td�|}tj|�|}tjd�|j	|�}q6WdS)Nr$z%s already locked by %dz*Waiting for process with pid %d to finish.r)r))r	r
Z
ensure_dirr
r�dirnamerr'�getpidr?rrArrr�logger�info�timeZsleep)r"Zprev_pidZmy_pidr<r&rrr�	__enter__~s"




zProcessLock.__enter__cGs"|jdkrtj|j�|j�dS)Nr$)rr
�unlinkrrA)r"Zexc_argsrrr�__exit__�s
zProcessLock.__exit__N)F)	�__name__�
__module__�__qualname__r#r'r?rArGrIrrrrrBs
*r)Z
__future__rrZdnf.exceptionsrrrZdnf.i18nrZdnf.yumrZdnf.loggingr	Zdnf.utilr/r-rZloggingr
rrFZ	getLoggerrDrrrrr�objectrrrrr�<module>s(
	site-packages/dnf/__pycache__/const.cpython-36.opt-1.pyc000064400000002572147511334650016775 0ustar003

i�-eA	�@s�ddlmZddlZdZdZd&Zd'ZddddddgZdZ	dZ
dZdZdZ
dZdZdZdZdZdZdZd Zej�Zej�Zd!Zd"ejj�Zd#Zd$eZej�Zd%Z dS)(�)�unicode_literalsNz/etc/dnf/dnf.confz/etc/dnf/automatic.conf�system-release(releasever)�system-release� distribution-release(releasever)�distribution-release�redhat-release�suse-release�	mandatory�default�conditionalZkernelz
kernel-PAEzinstallonlypkg(kernel)zinstallonlypkg(kernel-module)zinstallonlypkg(vm)zmultiversion(kernel)zdnf.logz
hawkey.logzdnf.librepo.logz--- logging initialized ---zdnf.rpm.logZDNFz/var/lib/dnfz/var/run/dnf.pidz/runz	/run/userz/var/cache/dnfz	/var/tmp/�z/etc/dnf/pluginsz%s/dnf-pluginsz4.7.0zdnf/%szhttps://bugs.rockylinux.org/)rrrrrr)r	r
r)!Z
__future__rZdistutils.sysconfigZ	distutilsZ
CONF_FILENAMEZCONF_AUTOMATIC_FILENAMEZDISTROVERPKGZGROUP_PACKAGE_TYPESZINSTALLONLYPKGSZLOGZ
LOG_HAWKEYZLOG_LIBREPOZ
LOG_MARKERZLOG_RPM�NAMEZ
PERSISTDIRZPID_FILENAMEZRUNDIRZUSER_RUNDIRZSYSTEM_CACHEDIRZTMPDIRZ
VERBOSE_LEVEL�lowerZPREFIXZPROGRAM_NAMEZPLUGINCONFPATH�	sysconfigZget_python_libZ
PLUGINPATH�VERSIONZ
USER_AGENTZBUGTRACKER_COMPONENTZ
BUGTRACKER�rr�/usr/lib/python3.6/const.py�<module>sBsite-packages/dnf/__pycache__/drpm.cpython-36.pyc000064400000012244147511334650015647 0ustar003

�ft`��@s�ddlmZddlmZddlmZddlmZddlmZddl	Z
ddlZ
ddlZ
ddl
Z
ddlZddlZddlZdZejd�ZGd	d
�d
e
jj�ZGdd�de�ZdS)
�)�absolute_import)�unicode_literals)�hexlify)�unlink_f)�_Nz/usr/bin/applydeltarpm�dnfcsXeZdZ�fdd�Zdd�Z�fdd�Zdd�Zed	d
��Zedd��Z	d
d�Z
�ZS)�DeltaPayloadcs"tt|�j||�||_||_dS)N)�superr�__init__�
delta_info�delta)�selfrr�pkg�progress)�	__class__��/usr/lib/python3.6/drpm.pyr
)szDeltaPayload.__init__cCstjj|jj�S)N)�os�path�basenamer�location)r
rrr�__str__.szDeltaPayload.__str__cs2tt|�j|||�|tjjjkr.|jj|�dS)N)	r	r�_end_cb�libdnf�repoZPackageTargetCBZTransferStatus_ERRORr�enqueue)r
ZcbdataZ	lr_status�msg)rrrr1szDeltaPayload._end_cbcCsh|j}|j\}}tj|�}t|�j�}tjjj	|�}|tjjj
krRtjt
d�|�|j|||j|jd�S)Nzunsupported checksum type: %s)Zrelative_urlZ
checksum_typeZchecksumZexpectedsizeZbase_url)r�chksum�hawkeyZchksum_namer�decoderrZ
PackageTargetZchecksumTypeZChecksumType_UNKNOWN�loggerZwarningrr�downloadsizeZbaseurl)r
rZctypeZcsumrZ
ctype_coderrr�_target_params6s

zDeltaPayload._target_paramscCs|jjS)N)rr!)r
rrr�
download_sizeHszDeltaPayload.download_sizecCs|jjS)N)rr!)r
rrr�
_full_sizeLszDeltaPayload._full_sizecCs$|jj}tjj|jjjtjj|��S)N)	rrrr�joinrrZpkgdirr)r
rrrr�localPkgPszDeltaPayload.localPkg)�__name__�
__module__�__qualname__r
rrr"�propertyr#r$r&�
__classcell__rr)rrr(src@s>eZdZddd�Zdd�Zdd�Zdd	�Zd
d�Zdd
�ZdS)�	DeltaInfoNcCs�d|_tjttj�rd|_ytjd�|_Wnttfk
rHd|_YnX|dkrbt	j
j�j|_n||_||_
||_g|_i|_i|_dS)z�A delta lookup and rebuild context
           query -- installed packages to use when looking up deltas
           progress -- progress obj to display finished delta rebuilds
        FT�SC_NPROCESSORS_ONLN�N)�deltarpm_installedr�access�
APPLYDELTA�X_OK�sysconf�
deltarpm_jobs�	TypeError�
ValueErrorrZconfZConf�deltarpm_percentage�queryr�queue�jobs�err)r
r8rr7rrrr
VszDeltaInfo.__init__cCs�|js
dS|jjs|jr dS|j�r,dStjj|j��r@dS|j	|jd}d}x@|j
j|j|j
d�D](}|j|j�}|rj|j|krj|j}|}qjW|r�t||||�SdS)z&Turn a po to Delta RPM po, if possibleN�d)�name�arch)r/rZdeltarpmr7Z
_is_local_pkgrr�existsr&Z_sizer8�filterr=r>Zget_delta_from_evrZevrr!r)r
ZporZbestZ
best_deltaZiporrrr�
delta_factoryms$zDeltaInfo.delta_factorycCs�tjtjjd||d?|d@�|jj|�}|j}|dkrXt|jj	��t
d�g|j|<nB|jj�stt
d�g|j|<n&t
j|j	��|jj|tjjt
d��dS)Nzdrpm: %d: return code: %d, %d��rzDelta RPM rebuild failedz(Checksum of the delta-rebuilt RPM failed�done)r �logr�logging�SUBDEBUGr:�poprrr&rr;ZverifyLocalPkgr�unlinkr�end�callbackZSTATUS_DRPM)r
�pid�code�ploadrrrr�job_done�s
zDeltaInfo.job_donecCs`ttd|jj|j�|jj�g}tjtjf|��}tjt	j
jd|dj|dd���||j
|<dS)Nz-azdrpm: spawned %d: %s� �)r1rr>r&r�spawnl�P_NOWAITr rErrFrGr%r:)r
rNZ
spawn_argsrLrrr�	start_job�szDeltaInfo.start_jobcCspx.|jr.tjdtj�\}}|s P|j||�qW|jj|�x.t|j�|jkrj|j	|jj
d��|js>Pq>WdS)NrQr���)r:r�waitpid�WNOHANGrOr9�append�lenr4rTrH)r
rNrLrMrrrr�szDeltaInfo.enqueuecCs@x:|jr:tj�\}}|j||�|jr|j|jjd��qWdS)z!Wait until all jobs have finishedrN)r:r�waitrOr9rTrH)r
rLrMrrrrZ�s
zDeltaInfo.wait)N)	r'r(r)r
rArOrTrrZrrrrr,Us


r,)Z
__future__rrZbinasciirZdnf.yum.miscrZdnf.i18nrZdnf.callbackrZdnf.loggingZdnf.reporrFZlibdnf.reporrr1Z	getLoggerr rZPackagePayloadr�objectr,rrrr�<module>s
-site-packages/dnf/__pycache__/i18n.cpython-36.pyc000064400000022612147511334650015464 0ustar003

�ft`!0�@s�ddlmZddlmZddlmZddlZddlZddlZddlZddl	Z	ddl
Z
Gdd�de�Zdd�Z
d	d
�Zdd�Zd
d�Zdd�Zdd�Zdd�Zd'dd�Zdd�Zd(dd�Zd)dd�Zd d!�Zd"d#�Zd$d%�Zed&�\ZZeZdS)*�)�print_function)�unicode_literals)�unicodeNc@s$eZdZdd�Zdd�Zdd�ZdS)�
UnicodeStreamcCs||_||_dS)N)�stream�encoding)�selfrr�r	�/usr/lib/python3.6/i18n.py�__init__$szUnicodeStream.__init__cCs�t|t�s.tjjr |j|jd�n|j|jd�}y|jj	|�Wn\t
k
r�|j|jjd�}t|jd�rz|jjj	|�n|j|jjd�}|jj	|�YnXdS)N�replace�backslashreplace�buffer�ignore)
�
isinstance�str�dnf�pycomp�PY3�decoder�encoder�write�UnicodeEncodeError�hasattrr)r�sZs_bytesr	r	r
r(s
zUnicodeStream.writecCst|j|�S)N)�getattrr)r�namer	r	r
�__getattr__7szUnicodeStream.__getattr__N)�__name__�
__module__�__qualname__rrrr	r	r	r
r#srcCs0|dkrdS|j�}|jd�s(|jd�r,dSdS)a�Return true if encoding can express any Unicode character.

    Even if an encoding can express all accented letters in the given language,
    we can't generally settle for it in DNF since sometimes we output special
    characters like the registered trademark symbol (U+00AE) and surprisingly
    many national non-unicode encodings, including e.g. ASCII and ISO-8859-2,
    don't contain it.

    NFzutf-Zutf_T)�lower�
startswith)rr!r	r	r
�_full_ucd_support:s
r#cCstjd�}|jd�rdS|S)z= Take the best shot at the current system's string encoding. FZANSIzutf-8)�locale�getpreferredencodingr")rr	r	r
�_guess_encodingKs
r&cCs�ytjjtjd�Wn�tjk
r�ytjjtjd�dtjd<Wn0tjk
rttjjtjd�dtjd<YnXtdj	tjd�t
jd�YnXdS)N�zC.UTF-8�LC_ALL�Cz&Failed to set locale, defaulting to {})�file)rr�	setlocaler$r(�Error�os�environ�print�format�sys�stderrr	r	r	r
�setup_localePsr3cCs`tj}|j�stjtjtj�y
|j}Wntk
r@d}YnXt|�s\t	|t
��t_dSdS)z� Check that stdout is of suitable encoding and handle the situation if
        not.

        Returns True if stdout was of suitable encoding already and no changes
        were needed.
    NFT)r1�stdout�isatty�signal�SIGPIPE�SIG_DFLr�AttributeErrorr#rr&)r4rr	r	r
�setup_stdout^s

r:cCst|dd�tjj�S)z� It uses print instead of passing the prompt to raw_input.

        raw_input doesn't encode the passed string and the output
        goes into stderr
    r')�end)r/rrZ	raw_input)Zucstringr	r	r
�	ucd_inputrsr<c
Cs�tjjr:tjj|�r$t|t�dd�St|t�r2|St|�St|tjj�rL|St|d�rxytjj|�St	k
rvYnXtjjt|�t�dd�SdS)zD Like the builtin unicode() but tries to use a reasonable encoding. r)�errorsZ__unicode__N)
rrrZis_py3bytesrr&rrr�UnicodeError)�objr	r	r
�ucd}s

r@cCstj|�dkrdSdS)N�W�F��)rArB)�unicodedataZeast_asian_width)Zucharr	r	r
�_exact_width_char�srFcCsX|dkrt|�|fSd}d}x2|D]*}t|�}|||kr<P||7}||7}q"W||fS)a' Return the textual width of a Unicode string, chopping it to
        a specified value. This is what you want to use instead of %.*s, as it
        does the "right" thing with regard to different Unicode character width
        Eg. "%.*s" % (10, msg)   <= becomes => "%s" % (chop_str(msg, 10)) Nrr')�exact_widthrF)�msg�chop�widthZchopped_msg�charZ
char_widthr	r	r
�chop_str�s
rLcCstdd�|D��S)zQ Calculates width of char at terminal screen
        (Asian char counts for two) css|]}t|�VqdS)N)rF)�.0�cr	r	r
�	<genexpr>�szexact_width.<locals>.<genexpr>)�sum)rHr	r	r
rG�srGTr'cCsjt||�\}}||kr0|s|rfdj|||g�}n6d||}|rTdj||||g�}ndj||||g�}|S)a� Expand a msg to a specified "width" or chop to same.
        Expansion can be left or right. This is what you want to use instead of
        %*.*s, as it does the "right" thing with regard to different Unicode
        character width.
        prefix and suffix should be used for "invisible" bytes, like
        highlighting.

        Examples:

        ``"%-*.*s" % (10, 20, msg)`` becomes
            ``"%s" % (fill_exact_width(msg, 10, 20))``.

        ``"%20.10s" % (msg)`` becomes
            ``"%s" % (fill_exact_width(msg, 20, 10, left=False))``.

        ``"%s%.10s%s" % (pre, msg, suf)`` becomes
            ``"%s" % (fill_exact_width(msg, 0, 10, prefix=pre, suffix=suf))``.
        r'� )rL�join)rHZfillrI�left�prefix�suffixrJZextrar	r	r
�fill_exact_width�srV�Fcs��fdd��|jd�}|jdd�jd�}g}|}d}d}d}	�xr|D�]h}
|
jd�}
||	}}�|
�\}}	d}
|rz|	rzd	}
|r�|t|
�kr�d	}
|r�|r�|d
kr�||kr�d	}
|
r�|j|jd��|}d}|t|
�kr�d}
|r�|
jd�}
|}	t||
�|k�rd}|j||
�|}qDd	}|
jd�}|}
|	}|�r@|d
k�r@|}x^|D]V}|t|
|�k�r�t|
�t|�k�r�|j|
jd��|d|}
|
|7}
|
d7}
�qFW|
jd�d}qDW|�r�|j|jd��dj|�S)
zq Works like we want textwrap.wrap() to work, uses Unicode strings
        and doesn't screw up lists/blocks/etc. cs�d}d}x|D]}|dkrP|d7}qW|d
kr8|dfSt||d�d�d}|dkr��||t|�d��}|dp||d}|r�||d|fS|dfS)Nr�XrQrD�-�*�.�o�â�•�‣�∘)rYrZr[r\r])rYrZr[r\r^r_r`)rL�len)�line�countZbyteZlist_chrZnxt)�_indent_at_begr	r
rd�s 
z%textwrap_fill.<locals>._indent_at_beg�
�	rQ�FrT�r'z        )�rstripr�splitra�append�lstriprGrR)�textrJZinitial_indentZsubsequent_indent�lines�ret�indentZ	wrap_lastZcsabZcspc_indentrbZlsabZlspc_indentZforce_nlZwordsZspcsZwordr	)rdr
�
textwrap_fill�sf






rqcCsHt|�}t|�}||kr|S||kr4||kr0|S|S||kr@|S|SdS)a� Automatically selects the short (abbreviated) or long (full) message
        depending on whether we have enough screen space to display the full
        message or not. If a caller by mistake passes a long string as
        msg_short and a short string as a msg_long this function recognizes
        the mistake and swaps the arguments. This function is especially useful
        in the i18n context when you cannot predict how long are the translated
        messages.

        Limitations:

        1. If msg_short is longer than width you will still get an overflow.
           This function does not abbreviate the string.
        2. You are not obliged to provide an actually abbreviated string, it is
           perfectly correct to pass the same string twice if you don't want
           any abbreviation. However, if you provide two different strings but
           having the same width this function is unable to recognize which one
           is correct and you should assume that it is unpredictable which one
           is returned.

       Example:

       ``select_short_long (10, _("Repo"), _("Repository"))``

       will return "Repository" in English but the results in other languages
       may be different. N)rG)rJZ	msg_shortZmsg_longZwidth_shortZ
width_longr	r	r
�select_short_long'srrcCs2t�dd�}tjjj|dd�}t|tjj|��S)z< Easy gettext translations setup based on given domain name cs�fdd�S)Ncst�|��S)N)r@)�w)�fncr	r
�<lambda>Tsz2translation.<locals>.ucd_wrapper.<locals>.<lambda>r	)rtr	)rtr
�ucd_wrapperSsz translation.<locals>.ucd_wrapperT)Zfallback)r3rr�gettext�translation�mapZ
gettext_setup)rrv�tr	r	r
rxNsrxcCs(t|td�|�}d|kr |S|SdS)Nrh�)�_�chr)�context�message�resultr	r	r
�pgettextYsr�r)N)NTr'r')rWr'r')Z
__future__rrZ
dnf.pycomprrr$r-r6r1rE�objectrr#r&r3r:r<r@rFrLrGrVrqrrrxr�r|ZP_ZC_r	r	r	r
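The drpm member above rebuilds full RPMs from deltas by spawning /usr/bin/applydeltarpm in the background and reaping the children with waitpid. A minimal sketch of that spawn/reap pair follows; the argument order and paths come from the visible strings but should be treated as a reconstruction, not dnf's exact code.

import os

APPLYDELTA = "/usr/bin/applydeltarpm"

def start_rebuild(arch, delta_path, target_rpm_path):
    """Spawn applydeltarpm in the background and return the child PID."""
    return os.spawnl(os.P_NOWAIT, APPLYDELTA, APPLYDELTA,
                     "-a", arch, delta_path, target_rpm_path)

def wait_rebuild(pid):
    """Block until the rebuild finishes; True means applydeltarpm exited cleanly."""
    _, status = os.waitpid(pid, 0)
    return os.WIFEXITED(status) and os.WEXITSTATUS(status) == 0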
�<module>s2

"
O'site-packages/dnf/__pycache__/logging.cpython-36.opt-1.pyc000064400000020111147511334650017262 0ustar003

�ft`r(�@s�ddlmZddlmZddlZddlZddlZddlZddlZ	ddl
Z
ddlZ
ddlZddl
Z
ddlZddlZddlZdZe
jZe
jZe
jZe
jZe
jZdZdZdZdZd	d
�ZGdd�de�Zee
je
je
je
je
je
jeeeed
�Zdd�Zee
je
jd�Z dd�Z!dd�Z"d%Z#dd�Z$Gdd�de
[ remainder of a compiled CPython 3.6 bytecode entry (evidently dnf's logging module; its embedded
  source path reads /usr/lib/python3.6/logging.py). The binary contents are not reproducible as
  text. Readable strings indicate custom log levels (SUPERCRITICAL, DDEBUG, SUBDEBUG, TRACE, ALL),
  an only_once method decorator, a _MaxLevelFilter, gzip-based log rotation helpers
  (compression_name / compression_rotator -- a sketch follows below), a lock-guarded
  MultiprocessRotatingFileHandler, the Logging setup class (stdout/stderr plus dnf, py.warnings,
  librepo and dnf.rpm file loggers), a Timer helper, and a LibdnfLoggerCB bridge that forwards
  libdnf/librepo messages into Python logging. ]
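The readable fragments above show the rotation helpers open the rotated log, stream it through
gzip in fixed-size chunks, and remove the original. A minimal sketch consistent with those
fragments (the CHUNK_SIZE value and the exact signatures are assumptions, not recovered from the
bytecode):

    import gzip
    import os

    CHUNK_SIZE = 128 * 1024  # assumed; the real constant is not readable in the dump


    def compression_name(name):
        # Rotated logs get a ".gz" suffix (the ".gz" literal is visible in the bytecode).
        return name + ".gz"


    def compression_rotator(source, dest):
        # Stream-copy the rotated log into a gzip file, then drop the uncompressed original.
        with open(source, "rb") as sf:
            with gzip.open(dest, "wb") as wf:
                while True:
                    data = sf.read(CHUNK_SIZE)
                    if not data:
                        break
                    wf.write(data)
        os.remove(source)

The dump also assigns such helpers to a rotating file handler, which matches the standard
logging.handlers.RotatingFileHandler API: handler.rotator = compression_rotator and
handler.namer = compression_name.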





site-packages/dnf/__pycache__/__init__.cpython-36.pyc000064400000001006147511334650016436 0ustar003

�ft`m�@spddlmZddlZddlZejdedd�ddlmZeZ	ddl
ZejjZddl
ZejjZejjjjd�dS)�)�unicode_literalsN�oncez	^dnf\..*$)�category�module)�VERSIONZmedia)Z
__future__r�warningsZ
dnf.pycompZdnf�filterwarnings�DeprecationWarningZ	dnf.constr�__version__Zdnf.base�baseZBaseZ
dnf.pluginZpluginZPluginZpycompZurlparseZ
uses_fragment�append�r
r
�/usr/lib/python3.6/__init__.py�<module>ssite-packages/dnf/__pycache__/subject.cpython-36.pyc000064400000000424147511334650016341 0ustar003

�ft`~�@s4ddlmZddlmZddlmZddlmZdS)�)�absolute_import)�print_function)�unicode_literals)�SubjectN)Z
__future__rrrZhawkeyr�rr�/usr/lib/python3.6/subject.py�<module>ssite-packages/dnf/__pycache__/crypto.cpython-36.pyc000064400000014714147511334650016231 0ustar003

�ft`��@s<ddlmZddlmZddlmZddlmZddlZddlZddl	Zddl
ZddlZddlZddl
Z
ddlZyddlmZddlmZWn<ek
r�ddlZGdd	�d	e�ZGd
d�de�ZYnXdZejd
�Zdd�Zdd�Zdd�Zdd�Zdd�Zdd�Zejdd��Zdd�Z d"dd�Z!Gd d!�d!e�Z"dS)#�)�print_function)�absolute_import)�unicode_literals)�_N)�Context)�Datac@sVeZdZdd�Zdd�Zdd�Zedd��Zejd	d��Zd
d�Z	dd
�Z
dd�ZdS)rcCstj�|jd<dS)N�ctx)�gpgmer�__dict__)�self�r�/usr/lib/python3.6/crypto.py�__init__*szContext.__init__cCs|S)Nr)rrrr
�	__enter__-szContext.__enter__cCsdS)Nr)r�type�value�tbrrr
�__exit__0szContext.__exit__cCs|jjS)N)r�armor)rrrr
r3sz
Context.armorcCs||j_dS)N)rr)rrrrr
r7scCs$t|t�rtj|�}|jj|�dS)N)�
isinstanceZ
basestring�io�BytesIOr�import_)r�key_forrr
�	op_import;s

zContext.op_importcCs|jj||�dS)N)rZexport)r�pattern�modeZkeydatarrr
�	op_export@szContext.op_exportcCst|j|�S)N)�getattrr)r�namerrr
�__getattr__CszContext.__getattr__N)�__name__�
__module__�__qualname__rrr�propertyr�setterrrr rrrr
r)src@s4eZdZdd�Zdd�Zdd�Zdd�Zd	d
�ZdS)rcCstj�|jd<dS)N�buf)rrr
)rrrr
rHsz
Data.__init__cCs|S)Nr)rrrr
rKszData.__enter__cCsdS)Nr)rrrrrrr
rNsz
Data.__exit__cCs
|jj�S)N)r&�getvalue)rrrr
�readQsz	Data.readcCst|j|�S)N)rr&)rrrrr
r TszData.__getattr__N)r!r"r#rrrr(r rrrr
rGs
rZ	GNUPGHOME�dnfcCstjjdd�|jD��S)Ncss|]}|jr|VqdS)N)Zcan_sign)�.0�subkeyrrr
�	<genexpr>]sz*_extract_signing_subkey.<locals>.<genexpr>)r)�util�firstZsubkeys)�keyrrr
�_extract_signing_subkey\sr0cs(�fdd�tdt��d�D�}dj|�S)Nc3s|]}�||d�VqdS)�Nr)r*�i)�fpr_hexrr
r,asz)_printable_fingerprint.<locals>.<genexpr>rr1� )�range�len�join)r3Zsegmentsr)r3r
�_printable_fingerprint`sr8cCs�|j}t|�}x�|jD]x}xrt||�D]d}|j}||krNtjtd�|j|�q&|j	j
|�s\q&tjj
j|j|j|dd�tjtd�|j|�q&WqWdS)Nzrepo %s: 0x%s already importedF)�gpgdirZmake_ro_copyzrepo %s: imported key 0x%s.)Z_pubring_dir�keyids_from_pubringZgpgkey�retrieve�id_�logger�debugr�idZ_key_importZ_confirmr)ZyumZmiscZimport_key_to_pubring�raw_key�short_id)�repor9Z
known_keys�keyurl�keyinfo�keyidrrr
�import_repo_keyses
rFcCsltjj|�sgSt|��Jt��8}g}x,|j�D] }t|�}|dk	r0|j|j�q0W|SQRXWdQRXdS)N)	�os�path�exists�pubring_dirr�keylistr0�appendrE)r9rZkeyids�kr+rrr
r:vsr:cCs8td�|j|jt|j�|jjdd�f}tjd|�dS)NzLImporting GPG key 0x%s:
 Userid     : "%s"
 Fingerprint: %s
 From       : %szfile://�z%s)	rrA�useridr8�fingerprint�url�replacer=�critical)rD�msgrrr
�log_key_import�s
rUcCs8t|�|tjjjkr&tjtd��ntjtd��dS)Nz0Verified using DNS record with DNSSEC signature.zNOT verified using DNS record.)rUr)ZdnssecZValidityZVALIDr=rSr)rDZ
dns_resultrrr
�log_dns_key_import�srVccsFtjjtd�}|tjt<z
dVWd|dkr6tjt=n
|tjt<XdS)N)rG�environ�get�GPG_HOME_ENV)rJZorigrrr
rJ�s


rJcCs�tj�}g}t|���t���}|j|�x2|j�D]&}t|�}|dkrHq2|jt||��q2Wd|_	xF|D]>}t
��.}|j|jd|�|j
dtj�|j�|_WdQRXqhWWdQRXWdQRXtjj|�|S)NTr)�tempfileZmkdtemprJrrrKr0rL�Keyrrrr<�seekrG�SEEK_SETr(r@r)r-Zrm_rf)rZpb_dir�keyinfosrr/r+�infoZsinkrrr
�rawkey2infos�s"

,r`c
CsZ|jd�rtjtd�|j|�tjj||d��}t|�}WdQRXx|D]
}||_	qHW|S)Nzhttp:z.retrieving repo key for %s unencrypted from %s)rB)
�
startswithr=Zwarningrr?r)r-Z_urlopenr`rQ)rCrBZhandler^rDrrr
r;�s


r;c@s,eZdZdd�Zedd��Zedd��ZdS)r[cCs6|j|_|j|_d|_|j|_d|_|jdj|_	dS)Nr)
rEr<ZfprrPr@Z	timestamprQZuidsZuidrO)rr/r+rrr
r�szKey.__init__cCs&tjjrdnd}|jdd�jd|�S)N�0�0�i����)r)ZpycompZPY3r<�rjust)rZrjrrr
rA�szKey.short_idcCs
|jj�S)N)rA�lower)rrrr
�rpm_id�sz
Key.rpm_idN)r!r"r#rr$rArgrrrr
r[�sr[)N)#Z
__future__rrrZdnf.i18nr�
contextlibZ
dnf.pycompr)Zdnf.utilZdnf.yum.miscrZloggingrGrZZgpgrr�ImportErrorr	�objectrYZ	getLoggerr=r0r8rFr:rUrV�contextmanagerrJr`r;r[rrrr
�<module>s<




site-packages/dnf/__pycache__/base.cpython-36.pyc000064400000233373147511334650015627 0ustar003

i�-e���@sXdZddlmZddlmZddlmZddlmZddlZddlZddlZ	ddl
mZddlm
Z
dd	lmZmZmZdd
lmZddlmZddlmZydd
lmZWn ek
r�dd
lmZYnXddlZddlZddlZddlZddlZddl Zddl!Zddl"Zddl#Zddl$Zddl%Zddl&Zddl'Zyddl(ZdZ)Wnek
�r`dZ)YnXddl*Zddl+Zddl,Zddl-Zddl.Zddl/Zddl0Zddl1Zddl2Zddl3Zddl4Zddl5ZddlZddl6Zddl7Z7ddl8Z8ddl9Z9ddl:Z:ddl;Z;ddl<Z<ddl=Z=ddl>Z>ddl?Z?ddl@Z@ddlAZAddlBZBe;jCd�ZDGdd�deE�ZFdd�ZGdS)z
Supplies the Base class.
�)�absolute_import)�division)�print_function)�unicode_literalsN)�deepcopy)�
CompsQuery)�_�P_�ucd)�_parse_specs)�
SwdbInterface)�misc)�SequenceTF�dnfc@s�eZdZd�dd�Zdd�Zdd�Zdd	�Zd
d�Zdd
�Ze	dd��Z
dd�Zd�dd�Zdd�Z
edd��Zedd��Zedd��Zejdd��Zeejjd�dd ���Zed!d"��Zed#d$��Zed%d&��Zejd'd&��Zd(d)�Zffdfd*d+�Zd,d-�Zd.d/�Zd0d1�Zd�d2d3�Z d�d5d6�Z!d�d7d8�Z"d9d:�Z#d;d<�Z$d�d=d>�Z%d�d?d@�Z&dAdB�Z'e(j)e(j*e(j+e(j,e(j-e(j.e(j/dC�Z0e1e(dD��r�e(j2e0dE<dFe(j3e(j4BiZ5edGdH��Z6edIdJ��Z7e7jdKdJ��Z7d�dLdM�Z8dNdO�Z9edPdQ�dRdQ�dSdQ�dTdU�Z:dVdW�Z;dXdY�Z<dZd[�Z=d\d]�Z>d^d_�Z?d�d`da�Z@ffdbdc�ZAddde�ZBdfdg�ZCdhdi�ZDd�djdk�ZEd�dldm�ZFd�dndo�ZGd�dpdq�ZHdrds�ZIdtdu�ZJdvdw�ZKd�dydz�ZLd�d{d|�ZMd}d~�ZNdd��ZOd�d��ZPd�d��ZQd�d�d��ZRd�d��ZSd�d�d��ZTd�d�d��ZUd�d��ZVd�d��ZWd�d��ZXd�d��ZYd�d��ZZd�d��Z[d�d�d��Z\d�d��Z]d�d��Z^d�d��Z_d�d�d��Z`d�d�d��Zad�d�d��Zbd�d�d��Zcd�d�d��Zdd�d��Zed�d��Zfd�d��Zgd�d�d��Zhd�d�d��Zid�d�d��Zjd�d�d��Zk�dd�d��Zl�dd�d��Zm�dd�d��Znd�d��Zo�dd�d„Zpd�dĄZqffffffd�dƄZrd�dȄZs�dd�dʄZt�dd�d̄Zu�dd�d΄Zvd�dЄZw�dd�dӄZx�dd�dՄZyd�dׄZzd�dلZ{d�dۄZ|d�d݄Z}d�d߄Z~d�d�Zd�d�Z�dS(	�BaseNcCs�d|_|p|j�|_d|_d|_d|_d|_d|_d|_t	j
j�|_d|_
t�|_t�|_t	jj�|_t	jj�|_t	jj�|_ttjg�|_t	jj�|_d|_ d|_!d|_"g|_#i|_$d|_%t�|_&d|_'dS)NF)(�_closed�_setup_default_conf�_conf�_goal�_repo_persistor�_sack�_transaction�_priv_ts�_compsr�comps�TransactionBunch�_comps_trans�_history�set�
_tempfiles�_trans_tempfiles�callbackZDepsolve�_ds_callback�loggingZLogging�_logging�repodict�RepoDict�_repos�rpmZRPMPROB_FILTER_OLDPACKAGE�_rpm_probfilterZpluginZPlugins�_plugins�_trans_success�_trans_install_set�_tempfile_persistor�_update_security_filters�_update_security_options�_allow_erasing�_repo_set_imported_gpg_keys�output)�self�conf�r5�/usr/lib/python3.6/base.py�__init__]s2z
Base.__init__cCs|S)Nr5)r3r5r5r6�	__enter__zszBase.__enter__cGs|j�dS)N)�close)r3Zexc_argsr5r5r6�__exit__}sz
Base.__exit__cCs|j�dS)N)r9)r3r5r5r6�__del__�szBase.__del__cCs.|jr|jj|�n|jjrn|jj|�dS)N)rr �updater4�destdirr)r3�filesr5r5r6�_add_tempfiles�s
zBase._add_tempfilescCs�|j�td|jdd�}|jr&d|d<y|jj|jfddi|��WnTtjk
r�}z6t	j
td�j|j
|��tjjtd�j|j
���WYdd}~XnXdS)NT)�load_filelists�load_presto�load_updateinfo�
load_other�build_cachezloading repo '{}' failure: {}z"Loading repository '{}' has failed)�load�dict�deltarpm�load_metadata_otherr�	load_repo�_repo�hawkey�	Exception�logger�debugr�format�idr�
exceptions�	RepoError)r3�repo�mdload_flags�er5r5r6�_add_repo_to_sack�szBase._add_repo_to_sackcCs.tjj�}|j}d|kr*tjj|j�|d<|S)N�
releasever)rr4ZConf�
substitutionsr(Zdetect_releasever�installroot)r4Zsubstr5r5r6r�s

zBase._setup_default_confcCs�dd�|jj�D�}y0|jj|j||jj|jjd|jj|jj	d�}Wn4t
jk
rx}ztj
jt|���WYdd}~XnX|r�tjtjjj|d��dS)NcSsg|]}|jr|j�qSr5)Zmodule_hotfixesrP)�.0�ir5r5r6�
<listcomp>�sz0Base._setup_modular_excludes.<locals>.<listcomp>F)Zupdate_onlyZdebugsolver�module_obsoletesr)�repos�iter_enabled�sackZfilter_modules�_moduleContainerr4rYZmodule_platform_id�debug_solverr]rKrLrrQ�Errorr
rM�warning�module�module_baseZformat_modular_solver_errors)r3Z
hot_fix_reposZ
solver_errorsrUr5r5r6�_setup_modular_excludes�s"zBase._setup_modular_excludesFc	Cs�t|jj�}d|kr$tr$|j�dSg}g}|�s>�x|jj�D]�}|j|krPq@t|j	�dkr�|j
j�jdd�}x8t|j	�D]*}t
jj|�}|j|j|j
dddd��}q|W|j|jd�|j|j�|jf�|j
j�jdd�}	x8t|j�D]*}
t
jj|
�}|	j|j|j
dddd��}	q�W|	j|jd�|	r@|j|	|jf�q@Wd|k�r6|j
j�jdd�}t|jj	�dk�r�x<t|jj	�D],}t
jj|�}|j|j|j
dddd��}�qzW|j
j�jdd�}x<t|jj�D],}
t
jj|
�}|j|j|j
dddd��}�q�Wt|jj	�dk�r$|j
j|�|j
jd�|�r6|j
j|�|�rjx,|D]$\}
}|j
j|
�|j
jd|��qBW|�r�x|D]\}
}|j
j|
��qvW|�r�t�r�|j�dS)	N�allrT)�emptyF)�
with_nevra�
with_provides�with_filenames)�reponame�main)rr4Zdisable_excludes�WITH_MODULESrgr^r_rP�lenZincludepkgsr`�query�filtermr�subject�Subject�union�get_best_query�append�applyZexcludepkgsZadd_includesZset_use_includes�add_excludes)r3�	only_mainZdisabledZ
repo_includesZ
repo_excludes�rZ
incl_queryZincl�subjZ
excl_queryZexclZ
include_query�
exclude_queryrq�repoidr5r5r6�_setup_excludes_includes�sh

zBase._setup_excludes_includescCsP|jr<|jjr<dd�|jj�D�}|jjj|�|jj�|jrL|jj�dS)NcSs"g|]}|jr|jj�r|j�qSr5)�metadatarJZ	isExpiredrP)rZr{r5r5r6r\�sz/Base._store_persistent_data.<locals>.<listcomp>)	rr4�	cacheonlyr^r_Zexpired_to_addr<�saver-)r3Zexpiredr5r5r6�_store_persistent_data�s
zBase._store_persistent_datacCs|jdkr|jdd�|jS)NT)�arch_filter)r�
read_comps)r3r5r5r6r�s
z
Base.compscCs|jS)N)r)r3r5r5r6r4�sz	Base.confcCs|jS)N)r')r3r5r5r6r^sz
Base.reposcCs
d|_dS)N)r')r3r5r5r6r^sZ
_priv_rpmconncCstjjj|jj�S)N)rr(Z
connectionZ
RpmConnectionr4rY)r3r5r5r6�_rpmconnsz
Base._rpmconncCs|jS)N)r)r3r5r5r6r`sz	Base.sackcCsP|jdkrtjjd��|jjdkrHtjjd|jj	|jj
d|jj�|j_|jjS)NzSack was not initializedF�arch)r`rrQrcra�libdnfreZModulePackageContainerr4rYrX�
persistdir)r3r5r5r6ras
 zBase._moduleContainercCs|jS)N)r)r3r5r5r6�transactionszBase.transactioncCs|jrtd��||_dS)Nztransaction already set)r�
ValueError)r3�valuer5r5r6r�$scCstjj|jj�|_dS)N)r�	persistorZ
RepoPersistorr4�cachedirr)r3r5r5r6�_activate_persistor+szBase._activate_persistorcCs,|jjr|jj|j||�|jj||�dS)z&Load plugins and run their __init__().N)r4Zpluginsr*�_loadZ	_run_init)r3Z
disabled_globZenable_plugins�clir5r5r6�init_plugins.szBase.init_pluginscCs|jj�dS)z#Run plugins pre_configure() method.N)r*Z_run_pre_config)r3r5r5r6�pre_configure_plugins5szBase.pre_configure_pluginscCs|jj�dS)zRun plugins configure() method.N)r*Z_run_config)r3r5r5r6�configure_plugins:szBase.configure_pluginscCs|jj�dS)zRun plugins unload() method.N)r*Z_unload)r3r5r5r6�unload_plugins?szBase.unload_pluginsc
Cs�|jj}|jdkr|j�|j}|r�tjj�rDtd�}tj	|�dStjj
�dkrhtd�}tj	|�dS|dkr�td�}tj	|�dS|j�}|dk	r�||kr�tj	td��dSx|jj
�D]}|jjd�q�W|jj�s�tj	td�jd	j|jj���dSx�|jj�D]�}|j�\}}	|	dk�r6tj	td
�|j�nx|�sH|	dk�rftjtd�|j�|jj�nH|�r�|	|k�r�td�}tj||j|	�|jj�ntjtd
�|j|	��qW|�r�d|_|jddd�tj	td��dS)NzCMetadata timer caching disabled when running on metered connection.Fz:Metadata timer caching disabled when running on a battery.rz Metadata timer caching disabled.z"Metadata cache refreshed recently.�z*There are no enabled repositories in "{}".z", "z4%s: will never be expired and will not be refreshed.z&%s: has expired and will be refreshed.zC%s: metadata will expire after %d seconds and will be refreshed nowz!%s: will expire after %d seconds.T)�load_system_repo�load_available_reposzMetadata cache created.)r4Zmetadata_timer_syncrr�r�utilZon_metered_connectionrrM�infoZon_ac_power�since_last_makecacher^�valuesrJZsetMaxMirrorTries�_any_enabledrO�joinZreposdirr_Z_metadata_expire_inrPrN�expireZreset_last_makecache�	fill_sack)
r3�timerZperiodr��msgr�rSr{Zis_cacheZ
expires_inr5r5r6�update_cacheDsZ






zBase.update_cacheTc CsPtjjd�}|jddd�tjj|�|_tjj|j	j
|j	j�}|���|dk	r�y|jjdd�Wnt
k
r~|dkrz�YnX|�r�g}d}tj�}|j	jr�tjjj�x�|jj�D]�}y`|j|�|jj�|kr�|jj�}|jj�|kr�|jj�}tjtd�|jtjj|jj���Wq�tj j!k
�rz}	z>|jj"�|j#dk�rJ�tj$d	|	�|j%|j�|j&�WYd
d
}	~	Xq�Xq�W|�r�tj$td�dj'|��|jj(��r�|dk�r�|dk�r�tj)td
�t*j+t,|�d�tjj|��n|jj-�j&�Wd
QRX|j	}
|jj.|
j/|
j0|
j1�|j2�|�tj3j4|j�|_5|
j6|j5_6|j7j8�|jS)z'Prepare the Sack and the Goal objects. z
sack setupT)r`�goalF)rD�autorz%s: using metadata from %s.z	Error: %sNzIgnoring repositories: %sz, z-Last metadata expiration check: %s ago on %s.)Zseconds)9rr#�Timer�resetr`�_build_sackr�lock�build_metadata_lockr4r��exit_on_lockr��IOError�time�gpgkey_dns_verification�dnssec�RpmImportedKeys�check_imported_keys_validityr^r_rVrJZgetTimestampZgetAgerMrNrrPr��normalize_time�getMaxTimestamprQrRr��skip_if_unavailablerdrw�disabler�r�r��datetimeZ	timedelta�intrh�
_configure�installonlypkgs�installonly_limit�allow_vendor_changerr��Goalr�protect_running_kernelr*�run_sack)r3r�r�r�r��error_reposZmtsZager{rUr4r5r5r6r�|sf






zBase.fill_sackc	 Cstjjd�}|jddd�tjj|�|_tjj|j	j
|j	j�}|��n|dk	r�y|jjdd�Wnt
k
r~|dkrz�YnXg}|j	jr�tjjj�x�|jj�D]�}yf|jjddd�td|jdd�}|jr�d|d	<|jj|jf|�tjtd
�|jtjj|jj ���Wq�t!t"j#fk
�r�}zZ|j$dk�rPtj%j&td�j'|j|���ntjtd�j'|j|��|j(|j�|j)�WYdd}~Xq�Xq�W|�r�tj*td
�dj+|��WdQRX|j	}|jj,|j-|j.|j/�|j0�|�tj1j2|j�|_3|j4|j3_4|j5j6�|jS)a�
        Prepare Sack and Goal objects and also load all enabled repositories from cache only,
        it doesn't download anything and it doesn't check if metadata are expired.
        If there is not enough metadata present (repond.xml or both primary.xml and solv file
        are missing) given repo is either skipped or it throws a RepoError exception depending
        on skip_if_unavailable configuration.
        z
sack setupT)r`r�F)rDr�)ZthrowExceptZ
ignoreMissing)r@rArBrCz%s: using metadata from %s.zloading repo '{}' failure: {}NzIgnoring repositories: %sz, )7rr#r�r�r`r�rr�r�r4r�r�r�r�r�r�r�r�r^r_rJZ	loadCacherFrGrHrIrMrNrrPr�r�r��RuntimeErrorrKrLr�rQrRrOrwr�rdr�r�r�r�r�rr�r�rr�r*r�)	r3r�r�r�r�rSrTrUr4r5r5r6�fill_sack_from_repos_in_cache�sX	

z"Base.fill_sack_from_repos_in_cachecCs�tjj|jj�|_|jjsl|j|j�|j	r\|j
j|jj��|jj
�|jrl|j|j
�n|jjj|j
�|jjr�tjtd��tjtd�djtjjd��|jdk	r�|jj�|j�|j�d|_	dS)NzRThe downloaded packages were saved in cache until the next successful transaction.z1You can remove cached packages by executing '%s'.z{prog} clean packages)�progF)rr�ZTempfilePersistorr4r�r-Z	keepcache�_clean_packagesrr+r r<Zget_saved_tempfilesrir,Ztempfiles_to_addrMr�rrOr�Z	MAIN_PROGr�historyr9r��_closeRpmDB)r3r5r5r6�_finalize_base�s*



zBase._finalize_basecCsB|jr
dStjtjjd�d|_|j�|jdddd�d|_dS)ztClose all potential handles and clean cache.

        Typically the handles are to data sources and sinks.

        NzCleaning up.T)r`r^r�)	rrM�logrr#�DDEBUGr�r�r*)r3r5r5r6r9sz
Base.closecCsftjjj|j|�}xN|D]F}y|jj|�Wqtjjk
r\}ztj	|�WYdd}~XqXqWdS)z?Read repositories from the main conf file and from .repo files.N)
rr4�readZ
RepoReaderr^�addrQZConfigErrorrMrd)r3Zopts�readerrSrUr5r5r6�read_all_repos"s
zBase.read_all_reposcCs�|r
d|_|rtjj�|_|r�d|_|jdk	rJtjj|j�|_|jj	|j_	|jr`|j
r`|j
j�|jdk	rt|j
j�tjj�|_d|_g|_|r�|r�tj�dS)z1Make the Base object forget about various things.N)rrr%r&r'rr�r�r4r�raZrollbackrr�r9rrrrr.�gcZcollect)r3r`r^r�r5r5r6r�-s$



'z
Base.resetcCs|`dS)z6Closes down the instances of rpmdb that could be open.N)�_ts)r3r5r5r6r�jszBase._closeRpmDB)Z	noscriptsZ
notriggersZnodocs�testZjustdbZ
nocontexts�nocrypto�RPMTRANS_FLAG_NOCAPSZnocapsr�cCs|jS)N)r)r3r5r5r6r�|sz	Base.goalcCs�|jdk	r|jStjjj|jj�|_|jjd�xb|jjD]V}|j	j
|�}|dkrdtjt
d�|�q:|jj|�|jj
|�}|dk	r:|jj|�q:W|jjs�|jjtj�|jjr�|jjtj�tjtj|jd�}|jj|�|jS)zMSet up the RPM transaction set that will be used
           for all the work.Nrz!Invalid tsflag in config file: %s)rrr(r�ZTransactionWrapperr4rY�setFlagsZtsflags�_TS_FLAGS_TO_RPM�getrM�criticalrZ	addTsFlag�_TS_VSFLAGS_TO_RPM�pushVSFlagsZdiskspacecheckr)r�ZRPMPROB_FILTER_DISKSPACEZ
ignorearchZRPMPROB_FILTER_IGNOREARCH�	functools�reduce�operator�or_Z
setProbFilter)r3�flagZrpm_flagZvs_flagZ
probfilterr5r5r6r��s*
zBase._tscCs&|jdkrdS|jj�|`d|_dS)z"Releases the RPM transaction set. N)rr9)r3r5r5r6r��s


cCs$tjjd�}tjj�|_tjtjjd�x�|j	j
�D]�}|js@q4|jsHq4|j
j�}|sXq4tjtjjd|j�|j
j�tjjkr�tj|d�}tjj|�s�q4ntj|d�}y|jj|�Wq4tjjk
r�}ztd�}tj||j|�WYdd}~Xq4Xq4W|�r|jjj|j j!dg�|�|jS)z6Create the groups object to access the comps metadata.z
loading compszGetting group metadataz%Adding group file from repository: %sz
groups.xmlz1Failed to add groups file for repository: %s - %sN�basearch)"rr#r�rZCompsrrMr�r�r^r_Zenablegroupsr�rJZ
getCompsFnrPZgetSyncStrategyrSZSYNC_ONLY_CACHEr
Zcalculate_repo_gen_dest�os�path�existsZrepo_gen_decompressZ_add_from_xml_filenamerQ�
CompsErrorrr�Z_ir�rrX)r3r�r�rSZcomps_fnZdecompressedrUr�r5r5r6r��s:


&zBase.read_compscCs*|jdkr$|jj}t|jj|d�|_|jS)zeauto create the history object that to access/append the transaction
           history information. N)rW)rr4rWrr�)r3rWr5r5r6�_getHistory�s
zBase._getHistorycCs|j�S)N)r�)r3r5r5r6�<lambda>�sz
Base.<lambda>cCst|d|�S)Nr)�setattr)r3r�r5r5r6r��scCst|dd�S)Nr)r�)r3r5r5r6r��szDNF SWDB Interface Object)�fget�fset�fdel�doccsF�jj}t|j����j�}|j�|j�j�}xT|j�D]H�|j��}|d}�j	j
|d��j	j
�d�|j�||dd��q:Wx�|j�D]x��j	j
�d�|j��}t
��}g}x0|D](}	t
|	�|kr�|jd|	�q�|j|	�q�W|d}
|j�|
|dd��q�Wx�|j�D]���j	j
�d�|j��}��fdd�|D�}|j��}�|k�rt|j�jd	��rt|j��}x0|D](}|j|�}
tjj||
�dk�rz|
}�qzW|j�||��fd
d�}tjj||��qWx�|j�D]ȉ|j��}d}x"|D]}|j�jk�r�|}P�q�W|dk�r*|jd�}n
|j|���fdd�|D�}�fd
d�}tjj||��|k�rz|j�|�n|j�||��j	j
|d��j	j
�d��q�W|j �}|�rB�j!j"t#j$d�j�}|j%|d�xh|D]`�|j�jd	��r|d}|j|��jj&||j|���j	j
�d�|j��}|j'�|��q�W|S)NrZdd�dr�r{r[cs$g|]}|�ks|j�jkr|�qSr5)�name)rZr[)�
all_obsoleted�pkgr5r6r\�sz*Base._goal2transaction.<locals>.<listcomp>)r�cs�jj|d�S)N�od)r"�	pkg_added)r�)r3r5r6r�
sz(Base._goal2transaction.<locals>.<lambda>cs$g|]}|�ks|j�jkr|�qSr5)r�)rZr[)r�r�r5r6r\scs�jj|d�S)Nr�)r"r�)r�)r3r5r6r�!sZud�u)�flags)Zpkg__neqrU���)(r�r(rZlist_obsoleted�_get_installonly_queryrx�	installed�list_downgradesZobsoleted_by_packager"r�Z
add_downgrade�list_reinstalls�str�insertrwZ
add_reinstall�
list_installs�
get_reason�filterr�r�r�ZTransactionItemReasonCompareZadd_installrr�Zmapall�
list_upgrades�pop�removeZadd_upgradeZ
list_erasuresr`rqrK�IGNORE_EXCLUDESrr�
set_reasonZ	add_erase)r3r��tsZinstallonly_queryZinstallonly_query_installedZobsZ
downgradedZ	nevra_pkg�	obsoletesZobs_pkgZreinstalled�reasonZobsoleteZreason_obsolete�cbZupgradedr[ZerasuresZremaining_installed_queryZ	remainingr5)r�r�r3r6�_goal2transaction�s�
















zBase._goal2transactioncCsd|j�}|j�}|j�j�}g}g}x6|D].}||krJ|j||d�q*|j||�q*W||fS)aJ See what packages in the query match packages (also in older
            versions, but always same architecture) that are already installed.

            Unlike in case of _sltr_matches_installed(), it is practical here
            to know even the packages in the original query that can still be
            installed.
        r)r��_na_dict�	availablerw)r3�q�instZ
inst_per_archZavail_per_archZavail_lZinst_lZnar5r5r6�_query_matches_installed7s
zBase._query_matches_installedcCs"|jj�j�j|j�d�}t|�S)z� See if sltr matches a patches that is (in older version or different
            architecture perhaps) already installed.
        )r�)r`rqr�rr�matches�list)r3�sltrrr5r5r6�_sltr_matches_installedKszBase._sltr_matches_installedcs�fdd��jj�j�D�S)z5Get iterator over the packages installed by the user.c3s|]}�jj|�r|VqdS)N)r�Zuser_installed)rZr�)r3r5r6�	<genexpr>Tsz*Base.iter_userinstalled.<locals>.<genexpr>)r`rqr�)r3r5)r3r6�iter_userinstalledRszBase.iter_userinstalledcCs0|j||jj|jjd�}|jjr,|jd�|S)N)�allow_uninstall�
force_bestZignore_weak_depsz./debugdata/rpms)�runr4�bestZinstall_weak_depsrbZwrite_debugdata)r3r��
allow_erasing�retr5r5r6�_run_hawkey_goalWs
zBase._run_hawkey_goalc	Cstd}|j�tjjd�}|jj�|j}|j�rJ|j|j	j
�j�|j�n|j
jsd|j�}|j|�|j|j	j
�j|j
jd��|j||�s�|j
jdkr�|j�tjj|j��}tjj|�}n|j|�|_|jj�|�|jdk	o�t|j�dk}|�r|jj �}|�rtjj!|�}|dk	�r"|�|j"j#�|jj$�}||jj%�7}||jj&�7}||jj'�7}|j	j(|j)|�|S)zBuild the transaction set.NZdepsolve)r��r)*�_finalize_comps_transrr#r�r"�startrZ
req_has_eraseZpush_userinstalledr`rqr�r�r4Zupgrade_group_objects_upgrade�_build_comps_solverZ'_exclude_packages_from_installed_groupsZ
add_protectedrrZprotected_packagesr�
debuglevelZ
log_decisionsr��_format_resolve_problems�
problem_rulesrQZ
DepsolveErrorrr�endrpZ_rpm_limitationsrcr*Zrun_resolvedr�r�r�r�Zset_modules_enabled_by_pkgsetra)	r3r�excr�r��solverr�Zgot_transactionZnew_pkgsr5r5r6�resolve_sH








zBase.resolvecCs^t|t�s|g}tjjj�gt|�}|js�|jj	�|jj
�|jr�|jjsV|jj
r�d}t|d�rx|jrxdj|j�}nt|d�r�|jr�dj|j�}|jj�}|dkr�|jj�}n|j}|jj|gg|�|jj|�|jj�|jj�d|_dSd}tjtd��tj j!|j"j#|j"j$�}|���|jj%|j&�|j'�}|�rxtd�}tj(|�x|D]}tj(|��qXWtj)j*|��tjtd��tj+j,d�}	tjtd	��|j&j-�|j&j.�tjjj/|dd
�}
|j&j0|
�}t1|�dk�r\x&|
j2�D]}tj3td�j4|���q�Wtd
�d}x|D]}
|dt5|
�7}�qW|j6|�}|�rP|d|7}tj)j7|��~
tjtd��|j&j8t9j:��r�dS|	�|jj	�|jj
�tj+j,d�}	tjjj/||d�}|j"j;dk�r�x|j<D]}d|_=�q�W|jj�tjtd��|j>|d�}WdQRX|	�|jj?|j�|jj�dd�}x&tj@jA||j|�D]}tjB|��qFW|S)N�args� �cmdsTzRunning transaction checkz%Error: transaction check vs depsolve:zTransaction check succeeded.ztransaction testzRunning transaction test)r�rzRPM: {}zTransaction test error:�
z  %s
zTransaction test succeeded.r�)�displays�FzRunning transaction)rcSs,g}x"|D]}|jdj|t|���q
W|S)Nz{}: {})rwrOr�)�actionZtsis�msgs�tsir5r5r6�
_pto_callback�s
z*Base.do_transaction.<locals>._pto_callback)C�
isinstancerrZyumZrpmtransZLoggingTransactionDisplayrr�rar�ZupdateFailSafeDatar�group�env�hasattrr"r�r$r��lastr`�_rpmdb_version�end_rpmdb_version�begrr*Zrun_pre_transactionZrun_transactionr+rMr�rr�Zbuild_rpmdb_lockr4r�r�Z_populate_rpm_tsr��_run_rpm_check�errorrQZTransactionCheckErrorr#r��orderZcleanZRPMTransactionr�rp�messagesr�rOr
�_trans_error_summaryrc�isTsFlagSetr(�RPMTRANS_FLAG_TESTrr&r2�_run_transactionZunload_removed_pluginsr�Z_post_transaction_outputrN)r3Zdisplay�cmdline�oldZ
rpmdb_version�tidr�r)r�r�ZtestcbZtserrors�	errstringZdescr�summaryrZdisplay_r+r5r5r6�do_transaction�s�
















zBase.do_transactioncCs�d}tjd�}i}x�|j|�D]t}|jd�dkr>t|jd��ntjt|jd��d�}|jd�|krr|||jd�<||jd�|kr|||jd�<qW|r�|td�d	7}x4|D],}|d
tdd||�j	|||�d	7}q�W|s�dStd
�d|}|S)z�Parse the error string for 'interesting' errors which can
        be grouped, such as disk space issues.

        :param errstring: the error string
        :return: a string containing a summary of the errors
        �z9needs (\d+)(K|M)B(?: more space)? on the (\S+) filesystemr'�Mr�g�@�zDisk Requirements:r%z   z7At least {0}MB more space needed on the {1} filesystem.Nz
Error Summaryz
-------------
)
�re�compile�finditerr-r��mathZceilrr	rO)r3r?r@�pZdisk�mZ
size_in_mb�kr5r5r6r8s&
 
*zBase._trans_error_summarycCs|jjo|jjtj�S)N)r4Zhistory_recordr�r9r(r:)r3r5r5r6�_record_history%szBase._record_historycCs�d}|j�r�t|jj�}|jj�j�}|j|d�j�}|jj	�}|j
j�}|dk	rX|j}|dksh||kr�t
jtd�jtjjd��d}t|d�r�|jr�dj|j�}nt|d�r�|jr�dj|j�}|jjr�|jjnd}	|j
j||g||	�}|jj�r$tjd	�}
|
�r$ytj|
�Wnd	}
YnXt
jtjjd
�|j j|j!d�}t
jtjjd�|jj�rzytj|
�WnYnXtjj"|j |j#�|dk�r�n�t$|�d	k�r�dd
�|j D�}|�sfx&|j%�D]}
t
j&td�j|
���q�Wtd�}
tj'j(|
��nlt
j&td��x |D]}t
j&t)|d	���qW|j��rR|j j*t+j,��rR|j
j-|�td�}
tj'j(|
��xbdD]Z}t||��rlt.||�}yt/j0|�Wn.t1t2fk
�r�td�}
t
j&|
|�YnX�qlWt3|j#j4�|_5|j j*t+j,��s�|j6|j7�|S)zh
        Perform the RPM transaction.

        :return: history database transaction ID or None
        N)r�z RPMDB altered outside of {prog}.)r�r"r#r$rBrzRPM transaction start.zRPM transaction over.cSsg|]}|j�r|�qSr5)ZFailed)rZZelr5r5r6r\esz)Base._run_transaction.<locals>.<listcomp>zRPM: {}zCould not run transaction.zTransaction couldn't start:�	ts_all_fn�
ts_done_fnz$Failed to remove transaction file %s)rMrN)8rLrr4Zhistory_record_packagesr`rqr�r�rr1r�r0r2rMrNrrOrr�ZMAIN_PROG_UPPERr/r"r�r$�commentr3Z
reset_nicer��nicer�r#r�r�r!Z_sync_rpm_trans_with_swdbrrpr7r�rQrcr
r9r(r:r�getattrr
�unlink_fr��OSError�boolZinstall_setr,�_verify_transactionZverify_tsi_package)r3rr>Zusing_pkgs_pats�installed_queryZ
using_pkgs�rpmdbvZlastdbvr<rOZonice�errorsZfailedr�rUr[�fnr5r5r6r;)s~









zBase._run_transactioncs�dd�|jD�}t|����fdd�}tjjd�}d}tjj|�}|j�j�}t	dd�|D��}xH|j
jD]<}	|	j�}
x.|
j
�D]"}|j�|kr�|jd�|j�q�WqjWx|D]}||j|�}q�W|j�}
|j
j|
�|�d|_dS)	NcSsg|]}|jtjjkr|�qSr5)r(r�r�Z#TransactionItemAction_REASON_CHANGE)rZr*r5r5r6r\�sz,Base._verify_transaction.<locals>.<listcomp>cs |d7}�dk	r�||��|S)Nr�r5)r��count)�total�
verify_pkg_cbr5r6�display_banner�sz0Base._verify_transaction.<locals>.display_bannerzverify transactionrcSsg|]
}|j�qSr5)r�)rZr[r5r5r6r\�sT)r�rprr#r�r`�
rpmdb_sackrqr�rr�r-ZgetCompsGroupItemZgetPackagesZgetNameZsetInstalledr�r�r1rr+)r3r\Ztransaction_itemsr]r�rZr^r�namesZti�grIr*rWr5)r[r\r6rU�s(

zBase._verify_transactionc
sXtjj|jj|jj�}|���tj�}tdd�|D��}tdd�|D��}	�j	j
jdkrn�j	t|�||	d�n�j	t|�|�tjj
|||���j�r�tjj�j���t�fdd�|D��}
tjjd|�j�}|jj}|dk}
xԈjo�|
s�|dk�r�|dk�r|d	8}td
�}tj|�dd��jD�}�fdd�|D�}td
d�|D��}�j	t|�|�tjj
|||���j��r�tjj�j���|
t�fdd�|D��7}
tjj||i�}q�W�j�r�tjjj�j�}tj|�WdQRX|dk	�r�||
|�|\}}||k�rT||k�rtd�}n||k�r,td�}d||d}tj||d|d|�dS)Ncss|]}|jVqdS)N)�
download_size)rZ�ploadr5r5r6r�sz1Base._download_remote_payloads.<locals>.<genexpr>cSsg|]}t|tjj�r|�qSr5)r,r�drpmZDeltaPayload)rZZpayloadr5r5r6r\�sz2Base._download_remote_payloads.<locals>.<listcomp>�)Ztotal_drpmsc3s|]}�j|�VqdS)N)�_bandwidth_used)rZrb)rXr5r6r�srr�z,Some packages were not downloaded. Retrying.cSsg|]}|�qSr5r5)rZr�r5r5r6r\�scs g|]}tjj|�tjj��qSr5)rrS�_pkg2payload�
RPMPayload)rZr�)�progressr5r6r\�scss|]}|jVqdS)N)ra)rZrbr5r5r6r�sc3s|]}�j|�VqdS)N)re)rZrb)rXr5r6r�sz?Delta RPMs reduced %.1f MB of updates to %.1f MB (%d.1%% saved)zIFailed Delta RPMs increased %.1f MB of updates to %.1f MB (%d.1%% wasted)�dir')rrii)rr�Zbuild_download_lockr4r�r�r��sumrpr�__code__�co_argcountrSZ_download_payloadsZ_irrecoverablerQZ
DownloadErrorZ_update_savingZ_recoverable�retriesrrMr�Z
errmap2str)r3�payloadsrcrh�callback_totalZ	fail_fastr�Zbeg_downloadZest_remote_sizeZ
total_drpmZremote_sizeZsavingrmZforeverr�Zremaining_pkgs�realZfullZpercentr5)rXrhr6�_download_remote_payloads�sb












zBase._download_remote_payloadsc	s�|j|�\}}|rz�dkr$tjj��tjj|jj�j��|j	j
��|jdd�|D����fdd�|D�}|j|��|�|j	j
r�xX|D]P}|jr�tjj|j�|jjd��}ntjj|jj|jjd��}tj||j	j
�q�WdS)aDownload the packages specified by the given list of packages.

        `pkglist` is a list of packages to download, `progress` is an optional
         DownloadProgress instance, `callback_total` an optional callback to
         output messages about the download operation.

        NcSsg|]}|j��qSr5)�localPkg)rZr�r5r5r6r\sz*Base.download_packages.<locals>.<listcomp>cs$g|]}tjj|��jtjj��qSr5)rrSrfZ
delta_factoryrg)rZr�)rcrhr5r6r\s�/)�_select_remote_pkgsrr!ZNullDownloadProgressrcZ	DeltaInfor`rqr�r4Zdeltarpm_percentager?rqr=Zbaseurlr�r�r�Zget_local_baseurl�location�lstriprSZpkgdir�shutil�copy)	r3Zpkglistrhro�remote_pkgsZ
local_pkgsrnr�rur5)rcrhr6�download_packages�s"	

zBase.download_packagescCs�g}|s|S|jj�r&tjjtd���g}x�|D]�}tjj|�rhd|krhtj	j
||j|�}|j|g�y|j
|jj|��Wq0tk
r�}ztj|�|j
|�WYdd}~Xq0Xq0W|jdd�|r�|r�ttd�jdj|����|S)NzACannot add local packages, because transaction job already existsz://T)rzzCould not open: {}r#)rZ
req_lengthrrQrcrr�r�r�r�Z_urlopen_progressr4r?rwr`Zadd_cmdline_packager�rMrdrrOr�)r3�	path_list�strictrh�pkgsZ
pkgs_errorr�rUr5r5r6�add_remote_rpmss(



 zBase.add_remote_rpmscCs|jr|jj}d}n|j|j}|j}|j}|�r|jj}tj	j
j|�}tj	jj
||j��}tjj|j��}~|dkr�d}	d}
n�|dkr�|r�d}	nd}	td�|}
n\|dkr�d}	td�|}
nB|dkr�|r�d}	nd}	d}	td�|}
n|d	k�rd}	td
�|}
nd}	d}
|	|
fS)a�Verify the GPG signature of the given package object.

        :param po: the package object to verify the signature of
        :return: (result, error_string)
           where result is::

              0 = GPG signature verifies ok or verification is not required.
              1 = GPG verification failed but installation of the right GPG key
                    might help.
              2 = Fatal GPG verification error, give up.
        rrBr�r'z"Public key for %s is not installedzProblem opening package %srDz Public key for %s is not trustedrdzPackage %s is not signed)�
_from_cmdliner4Zlocalpkg_gpgcheckr^r~Zgpgcheck�gpgkeyrYrr(r��initReadOnlyTransactionZ	miscutilsZcheckSigrrr�r��basenamer)r3�po�checkZ	hasgpgkeyrS�rootrZ	sigresultZlocalfn�resultr�r5r5r6�_sig_check_pkg(sF

zBase._sig_check_pkgcCs
|j|�S)a�Verify the GPG signature of the given package object.

        :param pkg: the package object to verify the signature of
        :return: (result, error_string)
           where result is::

              0 = GPG signature verifies ok or verification is not required.
              1 = GPG verification failed but installation of the right GPG key
                    might help.
              2 = Fatal GPG verification error, give up.
        )r�)r3r�r5r5r6�package_signature_checkcs
zBase.package_signature_checkc
Cslxf|D]^}tjj|�sqytj|�Wn&tk
rLtjtd�|�wYqXtj	t
jjtd�|�qWdS)NzCannot remove %sz
%s removed)
r�r�r�r
rRrSrMrdrr�rr#r�)r3�packagesrYr5r5r6r�rs

zBase._clean_packagesrhcCs�|dkr|jj}|dkr*|j|||||�Stjj|�s<t�tj|j||||d�}|dksft	|�dkrn|d�St
||�}tjdd�|�S)aRReturn a :class:`misc.GenericHolder` containing
        lists of package objects.  The contents of the lists are
        specified in various ways by the arguments.

        :param pkgnarrow: a string specifying which types of packages
           lists to produces, such as updates, installed, available,
           etc.
        :param patterns: a list of names or wildcards specifying
           packages to list
        :param showdups: whether to include duplicate packages in the
           lists
        :param ignore_case: whether to ignore case when searching by
           package names
        :param reponame: limit packages list to the given repository
        :return: a :class:`misc.GenericHolder` instance with the
           following lists defined::

             available = list of packageObjects
             installed = list of packageObjects
             upgrades = tuples of packageObjects (updating, installed)
             extras = list of packageObjects
             obsoletes = tuples of packageObjects (obsoleting, installed)
             recent = list of packageObjects
        N)�showdups�ignore_casermrcSs
|j|�S)N)Zmerge_lists)�a�br5r5r6r��sz(Base._do_package_lists.<locals>.<lambda>)r4Zshowdupesfromrepos�
_list_patternrr��is_string_type�AssertionErrorr��partialrp�mapr�)r3�	pkgnarrow�patternsr�r�rmZlist_fnZyghsr5r5r6�_do_package_listss

zBase._do_package_listsc&s���fdd���fdd�}�fdd�}tj|d�}g}	g}
g}g}g}
g}g}g}g}g}|}�jj�}|dk	r�tjj||d�}|j�jd	d
�}|dk�r�i}i}xH|j�D]<}|||j	<|r�q�|j
|jf}||ks�|||kr�|||<q�Wt||j
���}	||j��}|�s|jdd
�}x�|D]�}|�rN|j	|k�rB|j|�n
|
j|�nT|j
|jf}|j	|k�rr|j|�n0||k�s�|j||��r�|
j|�n
|j|��q W�n�|dk�r�||�jdd�}
�j|
dd�}
|
jddgd�|
j�j�}
�nP|dk�rt||j���}	�n2|dk�rB|�r�||�j�}|j�j�}x\|D]T��j
�jf}|j|g�}�fdd�|D�}t|�dk�r�|j��n
|
j���q@Wn�||�j�jdd
�j�}|j�j�j�}xz|D]r\} }!|| |!fd�|j| |!fdg�d}"|"�s
�j|"��r|
j��n"�j|"��r.|j��n
|j���q�W�n|dk�rh||�j�jj�}#|#j�}n�|dk�r��fdd�|j�D�}n�|dk�r|j�}$|�jj��j|$d�}�j|d	dd�}|jddgd�g}xl|D],��j}%|j�fdd�|$j|%d �D���q�Wn6|d!k�rD|j�}|�s2|jdd
�}||�j �j!j"�}|	|_|
|_||_#||_$|
|_%||_||_&||_"||_||_'|S)"Ncs�dkrdS�jj|��kS)z:Test whether given package originates from the repository.NT)r�rS)�package)rmr3r5r6�is_from_repo�sz(Base._list_pattern.<locals>.is_from_repocs�fdd�|D�S)z=Filter out the packages which do not originate from the repo.c3s|]}�|�r|VqdS)Nr5)rZr�)r�r5r6r�sz=Base._list_pattern.<locals>.pkgs_from_repo.<locals>.<genexpr>r5)r�)r�r5r6�pkgs_from_repo�sz*Base._list_pattern.<locals>.pkgs_from_repocs�dkr|S|j�d�S)z=Filter out the packages which do not originate from the repo.N)rm)r�)rq)rmr5r6�query_for_repo�sz*Base._list_pattern.<locals>.query_for_repo)�iter)r�F)rkrhT)Zlatest_per_arch_by_priority�upgrades)Zupgrades_by_priority)�upgrade�src�nosrc)�	arch__neqr�rcsg|]}|j�jkr|�qSr5)�evr)rZr�)�	avail_pkgr5r6r\sz&Base._list_pattern.<locals>.<listcomp>r�
autoremove�extrascsg|]}�|�r|�qSr5r5)rZr�)r�r5r6r\sr)Zobsoletes_by_priority)rdr�csg|]}�|f�qSr5r5)rZr=)�newr5r6r\.s)�provides�recent)(r
Z
GenericHolderr`rqrrsrtrvr�Zpkgtupr�r�rr�rrrrwZevr_gt�_merge_update_filters�latestrrr�rpZevr_eq�	_unneededr��swdbr�r�r�extendZ_recentr4r��reinstall_available�
old_available�updates�obsoletesTuplesr�)&r3r��patternr�r�rmr�r�Zyghr�rr�r�r�rr�r�r�r�Zicrr|ZdinstZndinstr��keyZavailr�Zinstalled_dict�installed_pkgsZsame_verZavailable_dictr�r�Zinst_pkgZautoremove_qrZobsoleted_reldepsr5)r�r�r�rmr3r6r��s�














zBase._list_patterncCs|j|7_t|�S)N)rrp)r3�transr5r5r6�_add_comps_transEszBase._add_comps_transcs�|j�}|sdS|j�jjdd�}|j�fdd�|D�d�}|j|�}x|D]}�jj|tjj	�qLW|j
|�}|j
|�}|r�x |D]}�jj|�j
jd�q�WdS)z�
        Mark to remove packages that are not required by any user installed package (reason group
        or user)
        :param query: dnf.query.Query() object
        NF)rbcs g|]}�jjj|j�r|�qSr5)r�r-Zis_removable_pkgr�)rZr[)r3r5r6r\Usz,Base._remove_if_unneeded.<locals>.<listcomp>)r�)�
clean_deps)r�Z_safe_to_remover�r�r��
differencer�r�r�Z TransactionItemReason_DEPENDENCY�intersectionr�eraser4�clean_requirements_on_remove)r3rqZ
unneeded_pkgsZunneeded_pkgs_historyZpkg_with_dependent_pkgsr�Zremove_packagesr5)r3r6�_remove_if_unneededIs




zBase._remove_if_unneededcs>�j}�jjd}�fdd�}�fdd�}dd�}�jj�jdd	�}|jtj|dd
�f|j	tj|dd
�f|j
|f|j|ff}x�|D]�\}}	x�|D]�}
d|
ji}|
j
r�|jd
|i��jj�jf|�j�}|jddgd�|�s|
j}
|
j
r�|
d|7}
tjtd�j|
��q�|	|||
�}�jjj|
j�q�Wq�W�j|�dS)Nr�cs,tjj�j�}|j|d��jj|d�|S)N)r�)�select)r�selector�Selectorr`rrr�)rq�remove_query�	comps_pkgr)r3r5r6�
trans_upgradegsz1Base._finalize_comps_trans.<locals>.trans_upgradecs��jjdkrr|js"�j||d�q�|j�j�}�j|�tjj	�j
�}|jdj|j
|j�d��jj||d�nltjj	�j
�}|jr�|jdj|j
|j�d�n,�jjr�|j�j
j�j|d��}|j|d��jj||d�|S)Nrh)r|z
({} if {}))r�)r��optional)r)r�)r4�multilib_policyZrequires�_install_multiarchr�rx�_report_already_installedrr�r�r`rrOr�r�installrrurqrr)rqr�r�r|rVr)r3r5r6�
trans_installms 
z1Base._finalize_comps_trans.<locals>.trans_installcSs|j|�}|S)N)ru)rqr�r�r5r5r6�trans_remove�s
z0Base._finalize_comps_trans.<locals>.trans_removeT)ri)r|Fr�r�r�r�)r��.zNo match for group package "{}")rr4rXr`rqrrr�r�r��install_optr�r�r�Zbasearchonlyr<rxrMrdrrOrZ
group_membersr�r�)r3r�r�r�r�r�r�Zattr_fn�attrrYr�Z
query_argsrZpackage_stringr5)r3r6rcs4

zBase._finalize_comps_transcs �fdd�}tjj�j�j|�S)NcsN�jj�j�j|d�}|sdSy�jjj|d�Stk
rHtj	j
SXdS)N)r�r)r`rqr�rrr�r(r��AttributeErrorr�r�ZTransactionItemReason_UNKNOWN)Zpkgnamer)r3r5r6�	reason_fn�sz+Base._build_comps_solver.<locals>.reason_fn)rrZSolverr�r)r3r�r5)r3r6r�s	zBase._build_comps_solvercCsXtjj|�st�|j�}t|t�s.tjj	|�}|j
|||p>t�||�}|sNdS|j|�S)a&Installs packages of environment group identified by env_id.
        :param types: Types of packages to install. Either an integer as a
            logical conjunction of CompsPackageType ids or a list of string
            package type ids (conditional, default, mandatory, optional).
        r)
rr�r�r�rr,r�r�r��listToCompsPackageTypeZ_environment_installrr�)r3�env_id�types�excluder|�exclude_groupsr r�r5r5r6�environment_install�s
zBase.environment_installcCs,tjj|�st�|j�}|j|�}|j|�S)N)rr�r�r�rZ_environment_remover�)r3r�r r�r5r5r6�environment_remove�s
zBase.environment_removec
s��fdd��tjj|�st�d}|rB�fdd�|D�}tjj|�}�j�}t|t	�s`t
jj|�}|j
||||�}|sxdS|r�|j}	n|j}	tjtd�||	��j|�S)anInstalls packages of selected group
        :param pkg_types: Types of packages to install. Either an integer as a
            logical conjunction of CompsPackageType ids or a list of string
            package type ids (conditional, default, mandatory, optional).
        :param exclude: list of package name glob patterns
            that will be excluded from install set
        :param strict: boolean indicating whether group packages that
            exist but are non-installable due to e.g. dependency
            issues should be skipped (False) or cause transaction to
            fail to resolve (True)
        cs6tjj|�r,�jj�j|d�}tdd�|�S|fSdS)N)�
name__globcSs|jS)N)r�)rIr5r5r6r��szABase.group_install.<locals>._pattern_to_pkgname.<locals>.<lambda>)rr��is_glob_patternr`rqrrr�)r�r)r3r5r6�_pattern_to_pkgname�sz/Base.group_install.<locals>._pattern_to_pkgnameNcsg|]}�|��qSr5r5)rZrI)r�r5r6r\�sz&Base.group_install.<locals>.<listcomp>rz#Adding packages from group '%s': %s)rr�r�r��	itertools�chain�
from_iterablerr,r�r�r�r�Z_group_installr�r�rMrNrr�)
r3�grp_idZ	pkg_typesr�r|Zexclude_pkgnamesZnested_excludesr r�Zinstlogr5)r�r3r6�
group_install�s$


zBase.group_installcCs�t|j|jtjtjBtj�}d}d}x�|D]�}	y|j|	�}
Wn:tjj	k
rv}zt
jt|��d}w*WYdd}~XnXx2|
j
D](}|s�||kr�||j||||d�7}q�Wx&|
jD]}
||j|
||||d�7}q�Wq*W|r�|r�tjjtd���|S)NrTF)r�r|)r�r|r�zNothing to do.)rrr��ENVIRONMENTS�GROUPS�	AVAILABLEr�rrQr�rMr5r
�groupsr��environmentsr�rcr)r3r�r�r|r�r�r�cnt�doner��res�errZgroup_idr�r5r5r6�env_group_install�s(

zBase.env_group_installcCs,tjj|�st�|j�}|j|�}|j|�S)N)rr�r�r�rZ
_group_remover�)r3r�r r�r5r5r6�group_removes
zBase.group_removecCs�t|j|jtjtjBtj�}y|j|�}WnFtjj	k
rp}z&t
jdt|��tjj
td���WYdd}~XnXd}x|jD]}||j|�7}q~Wx|jD]}||j|�7}q�W|S)NzWarning: %szNo groups marked for removal.r)rrr�r�r��	INSTALLEDr�rrQr�rMr5r
rcrr�r�r�r�)r3r�rr�r�r�r.�grpr5r5r6�env_group_removes

"zBase.env_group_removec
 CsLt|j|jtjtjBtj�}d}�x
|D�]}y|j|�}Wn6tjj	k
rr}zt
jt|��w(WYdd}~XnXxX|j
D]N}y|j|�d}Wq|tjj	k
r�}zt
jt|��w|WYdd}~Xq|Xq|WxZ|jD]P}y|j|�d}Wq�tjj	k
�r$}zt
jt|��w�WYdd}~Xq�Xq�Wq(W|�sHtd�}	tjj|	��dS)NFTzNo group marked for upgrade.)rrr�r�r�r�r�rrQr�rMr5r
r��environment_upgrader��
group_upgraderr�ZCliError)
r3r�rZgroup_upgradedr�r�r�r.r�r�r5r5r6�env_group_upgrades6



zBase.env_group_upgradecCs,tjj|�st�|j�}|j|�}|j|�S)N)rr�r�r�rZ_environment_upgrader�)r3r�r r�r5r5r6r�9s
zBase.environment_upgradecCs,tjj|�st�|j�}|j|�}|j|�S)N)rr�r�r�rZ_group_upgrader�)r3r�r r�r5r5r6r�@s
zBase.group_upgradecCs�|jjd}tjj|�rdS|jj}tjjj	|d�}|j
tjtjB�|j
dd�}t|�}~~|dkrldStjj|�}tjj|�s�tj|�t|d�}|j�~dSdS)	z�Checks for the presence of GPG keys in the rpmdb.

        :return: 0 if there are no GPG keys in the rpmdb, and 1 if
           there are keys
        z/.gpgkeyschecked.yumr�)r�r�z
gpg-pubkeyr�wN)r4r�r�r�r�rYrr(r�r�r��_RPMVSF_NOSIGNATURES�_RPMVSF_NODIGESTSZdbMatchrp�dirname�makedirs�openr9)r3ZgpgkeyscheckedrYZmyts�idx�keysZmydirZfor5r5r6�_gpg_key_checkGs&

zBase._gpg_key_checkc	Cs�|j|�\}}|j|�x~|D]v}tjj|j�}|jj�j|d�}|jj	rb|j
|jj�j|d��}|j|d�}|dk	r�|j|d�}|jj
||d�qWt|�S)N)r�)r)rm)r�r�)r	r�rr�r�r`rqrrr4rrurrr�rp)	r3rqrmr|�already_instrr�rrr5r5r6r�es

zBase._install_multiarchcCs,tj�}tj�}t||�t||�||fS)a�
        Categorize :param install and :param exclude list into two groups each (packages and groups)

        :param install: list of specs, whether packages ('foo') or groups/modules ('@bar')
        :param exclude: list of specs, whether packages ('foo') or groups/modules ('@bar')
        :return: categorized install and exclude specs (stored in argparse.Namespace class)

        To access packages use: specs.pkg_specs,
        to access groups use: specs.grp_specs
        )�argparseZ	Namespacer)r3r�r��
install_specs�
exclude_specsr5r5r6�_categorize_specsss


zBase._categorize_specscsddd�|jD���fdd�|jD�}|jj�j|d�}|jj�j�d�}|jj|�|jj|�dS)NcSsg|]}tjj|�r|�qSr5)rr�r�)rZr�r5r5r6r\�sz/Base._exclude_package_specs.<locals>.<listcomp>csg|]}|�kr|�qSr5r5)rZr�)�
glob_excludesr5r6r\�s)r�)r�)�	pkg_specsr`rqr�ry)r3r��excludesr}Zglob_exclude_queryr5)r�r6�_exclude_package_specs�szBase._exclude_package_specsc
Cs�t�}t|j|jtjtjBtjtjB�}x�|D]�}y|j|�}Wn8t	j
jk
rx}ztj
dt|��w.WYdd}~XnX|j|j�|j|j�x8|jD].}|jj|�}x|j�D]}	|j|	j�q�Wq�Wq.Wt|�S)NzWarning: Module or %s)rrrr�r�r�r�r�r�rrQr�rMr5r
r<r�r�Z_environment_by_idZgroups_iterr�rPr)
r3�group_specsr�rr�r�r�Zenvironment_idZenvironmentr-r5r5r6�_expand_groups�s"


zBase._expand_groupsc
Cs�x�|D]x}yL|jj}d|kr<|jd�}|d}|djd�}|j|g|||j|j�Wqtjjk
r||j	d|�YqXqWdS)Nrsrr��,�@)
r4Zgroup_package_types�splitr�r��	grp_specsrrQrcrw)r3r�r�Zskippedr|Z
group_specr�rr5r5r6�_install_groups�s

zBase._install_groupscCs�|dkrg}g}g}g}g}	|j||�\}
}|j|�xd|
jD]Z}y|j||||d�Wq>tjjk
r�}
ztjt	|
��|j
|�WYdd}
~
Xq>Xq>Wg}f}to�|
j�rLy tj
jj|�}|j|
j|�Wnxtjjk
�rH}
zV|
j�r
x|
jD]}|j
|�q�W|
j�r2x|
jD]}|j
d|��qW|
j}WYdd}
~
XnXn|
j}|�rv|j|j�|_|j||||�|�s�|�s�|�s�|	�s�|�r�tjj||||	|d��dS)N)rmr|�formsr)�no_match_group_specs�error_group_specs�no_match_pkg_specs�error_pkg_specs�module_depsolv_errors)r�r�r�r�rrQ�MarkingErrorrMr5r�rwrorrerfZ
ModuleBaseZ
MarkingErrorsrrrrr)r3r�r�rmr|rrrr	r
r�r��specrUZno_match_module_specsrrfZe_specr5r5r6r��sN
 zBase.install_specsc
Cs�tjj|�}|j|j|dd�}|jjdks4|j|�rr|d}|dk	rP|j|d�|sb|j	|||�|j
|||d�S|jjdkr�|j|||jj|d	|d
�}|s�|j	|||�x|D]}	|j
j|	|d�q�WdSd
S)z@Mark package(s) given by pkg_spec and reponame for installation.F)r�with_srcrhrqN)rm)rmr|rT)rrrm�reports�solution)r�r�r�r)rrsrt�get_best_solutionr`r4r�Z_is_arch_specifiedrr�_raise_package_not_found_errorr��_get_best_selectorsrrr�)
r3�pkg_specrmr|rr|rr�sltrsrr5r5r6r��s,
zBase.installcCs�|jrd}t|��|jj�j�j|j|jdgd�}|shtd�}t	j
||j�tjj
td�|j|j��n\t|�d|kr�tjj|j�}|j|gd�|jj||d�d	Std
�}t	j
||j�dSdS)Nz-downgrade_package() for an installed package.�noarch)r�r�z.Package %s not installed, cannot downgrade it.zNo match for argument: %sr)r�)r�r�r�zCPackage %s of lower version already installed, cannot downgrade it.)�_from_system�NotImplementedErrorr`rqr�rrr�r�rrMrdrrQrru�sortedr�r�rrr�)r3r�r|r�rrr5r5r6�package_downgrades  zBase.package_downgradecCs�|jj�j|j|j|j�}|j|�\}}||kr>|j|g�nT|tj	j
|�krdtjj
td�|j��n.tjj|j�}|j|gd�|jj||d�dS)NzNo match for argument: %s)r�)r�r�r�)r`rq�_nevrar�r�r�r	r�r�r�r�rrQ�PackageNotFoundErrorrrur�r�rrr�)r3r�r|rr�rrr5r5r6�package_installszBase.package_installcCsf|jj�j�j|j|j|jd�r0|jj|�dSt	d�}t
j|t|��t
jjt	d�|j|j��dS)N)r�r�r�r�z.Package %s not installed, cannot reinstall it.zNo match for argument: %s)r`rqr�rrr�r�r�rr�rrMrdr�rrQrru)r3r�r�r5r5r6�package_reinstall(s zBase.package_reinstallcCs|jj|�dS)Nr�)rr�)r3r�r5r5r6�package_remove0szBase.package_removecCs`|jrd}t|��|jdkr6td�}tj||j�dS|jj�j	�j
�}|jjr�|jj�j
|gd�j
|d�r�tjj|j�}|j|gd�|jj|d�dS|jd	kr�|j|jd
�}n|j|j|jd	gd�}|�std�}tj||j�tjjtd
�|j|j��nZt|�d|k�rBtjj|j�}|j|gd�|jj|d�dStd�}tj||j�dSdS)Nz+upgrade_package() for an installed package.r�z<File %s is a source package and cannot be updated, ignoring.r)r�)r)r�r�r)r�)r�r�z+Package %s not installed, cannot update it.zNo match for argument: %szHThe same or higher version of %s is already installed, cannot update it.r�)rrr�rrMr�rur`rqr�rxr4rrrrr�r�rrr�r�r�rdrQrr)r3r�r�r�rrr5r5r6�package_upgrade4s:
$
zBase.package_upgradec	Cs�|jj�j�}|j|jj�jdd�|D�d��}|j�}|rf|jj�j�j|j|j��d�}|j|�}|dk	rz|j|d�|j||dd�}|r�|j|j	�j
dd�|D�d��}tjj
|j�}|j|d	�|jj|d
�dS)NcSsg|]
}|j�qSr5)r�)rZr�r5r5r6r\\sz*Base._upgrade_internal.<locals>.<listcomp>)r�)r)rmT)rr�cSsg|]
}|j�qSr5)r�)rZr�r5r5r6r\xs)r�)r�r�)r`rqr�r�rrrrur�r�r�r�rr�r�rrr�)	r3rqrrmrZ
installed_allrrVrr5r5r6�_upgrade_internalYs "
 zBase._upgrade_internalc
Csttjj|�}|j|j�}|d}|�rZtjj|�}|oH|doH|dj�r*|dj}|jj�j	�j
�}|jjr||j
|d�n|jj�jdd�}	|	�s*|j
|d�j
�}
|
s�td�}tj||�tjjtd�||��nV|djo�tjj|dj��r*|
j|djd	��s*td
�}tj|dj||dj��|jj�oH|d�oH|dj�}|j||||�Stjjtd�||��dS)Nrq�nevra)rT)ri)r�z(Package %s available, but not installed.zNo match for argument: %s)r�z?Package %s available, but installed for different architecture.z{}.{})rrsrtrr`r�r�r�rqr�rxr4rr�rrrrMrdrQ�PackagesNotInstalledErrorr�rOZ
has_just_namer!r)
r3rrmr|rrZwildcard�pkg_namer�Z
obsoletersZinstalled_namer�rr5r5r6r�s0
& zBase.upgradecCs|j|jj�|jj|dd�S)N)r)r!r`rqr4r)r3rmr5r5r6�upgrade_all�szBase.upgrade_allcCs�|dkr|jj�nxtjj|�}|j|jdd�}|djtj	d�|j
|||jjdd�}|spt
jtd�|�dSx|D]}|jj|d	�qvWd
S)NF)rrq)�
reponame__neqT)rrrzNo package %s installed.r)r�r�)rZdistupgrade_allrrsrtrr`rrrKZSYSTEM_REPO_NAMErr4rrMr�rZdistupgrade)r3rrsrrrr5r5r6�distro_sync�s
zBase.distro_synccCs�t|||g�r�||7}d}|rF|rFx4|D]}td�}tj||�q(Wn|rX|j|�rXd}xX|D]P}y|j||d�Wn4tjjk
r�}	ztj	t
|	��WYdd}	~	Xq^Xd}q^W|s�tjtd��n4|jj�j
|jj|jjd�}
x|
D]}|j|�q�WdS)z�Removes all 'leaf' packages from the system that were originally
        installed as dependencies of user-installed packages but which are
        no longer required by any such package.FzNot a valid form: %sT)rNzNo packages marked for removal.)rb)�anyrrMrdr�r�rrQrr�r�r`rqr�r�r�r4rbr)r3rr�r�	filenamesr�Zgrp_specr�rrUr}r�r5r5r6r��s,


 
zBase.autoremovecsptjj|�j�j|d�}��fdd�|j�D�}|sB�j||���jj}x|D]}�j	j
||d�qPWt|�S)z'Mark the specified package for removal.)rcs(g|] }�dks �jj|��kr|�qS)N)r�rS)rZr�)rmr3r5r6r\�szBase.remove.<locals>.<listcomp>)r�)rrsrtrvr`r��"_raise_package_not_installed_errorr4r�rr�rp)r3rrmrr
r�r�r�r5)rmr3r6r��s
zBase.removecstjj|�}|j�j�}��fdd�|j�D�}|j�}	|dk	rL|	j|d�|dk	r`|	j|d�tjj	|	�}
|s�tj
jd||
j���d}�j
j}x\|D]T}
y|
t|
�}Wn*tk
r�|s�w��jj|
|d�YnX�jj|�|d7}q�W|dk�rtj
jd||��|S)	Ncs(g|] }�dks �jj|��kr|�qS)N)r�rS)rZr�)�old_reponamer3r5r6r\�sz"Base.reinstall.<locals>.<listcomp>)rm)r&zno package matchedr)r�r�)rrsrtrvr`r�rrrrqZ_per_nevra_dictrQr#r�r4r�r
�KeyErrorrr�r�ZPackagesNotAvailableError)r3rr+Znew_reponameZnew_reponame_neqZ	remove_nar|rr�Zavailable_qZavailable_nevra2pkgr�r�Z
installed_pkgZ
available_pkgr5)r+r3r6�	reinstall�s6


zBase.reinstallcCs
|j|�S)z�Mark a package to be downgraded.

        This is equivalent to first removing the currently installed package,
        and then installing an older version.

        )�downgrade_to)r3rr5r5r6�	downgrade	szBase.downgradec
Cstjj|�}|j|j�}|s6td�|}tjj||��d}|j�}t	|j
�j��}|jj�j
�j|d�}	t|	�dkr�td�|}tjj|||��xn|	j
�j�D]^}
|j�j|
d�}|s�td�}tj||
�q�tjj|j�}|j|d�|jj||d�d}q�W|S)	z�Downgrade to specific version if specified otherwise downgrades
        to one version lower than the package installed.
        zNo match for argument: %sr)r�z6Packages for argument %s available, but not installed.zDPackage %s of lowest version already installed, cannot downgrade it.)r�)r�r�r�)rrsrtrvr`rrQrrr�
_name_dictr�rqr�rrrpr#Z
downgradesr�rMrdr�r�rrr�)
r3rr|r|rr�r�Zavailable_pkgsZavailable_pkg_namesZq_installedr$Zdowngrade_pkgsrr5r5r6r.	s.zBase.downgrade_tocs�|jj�j�d�}|r |�gfStjj|j��}|r>|�gfS�jd�sR�jd�r^d�g}n&�jd�rr|�gfS�fdd�d
D�}|jj�j|d�|fS)N)Z
file__glob�/bin/�/sbin/z/usrrscsg|]}|��qSr5r5)rZ�prefix)�
provides_specr5r6r\E	sz!Base.provides.<locals>.<listcomp>�	/usr/bin/�
/usr/sbin/)r1r2r5r6)r`rqrrrZ_by_provides�
startswith)r3r4Z	providersZbinary_providesr5)r4r6r�6	s




z
Base.providesc
Cs�ddd�}||krtd��||}|rDd|}	|jj|	t��j|�|rfd|}	|jj|	t��j|�|r�d|}	|jj|	t��j|�|r�d|}	|jj|	t��j|�|r�d	|}	|jj|	t��j|�d
S)a�
        It modifies results of install, upgrade, and distrosync methods according to provided
        filters.

        :param cmp_type: only 'eq' or 'gte' allowed
        :param types: List or tuple with strings. E.g. 'bugfix', 'enhancement', 'newpackage',
        'security'
        :param advisory: List or tuple with strings. E.g.Eg. FEDORA-2201-123
        :param bugzilla: List or tuple with strings. Include packages that fix a Bugzilla ID,
        Eg. 123123.
        :param cves: List or tuple with strings. Include packages that fix a CVE
        (Common Vulnerabilities and Exposures) ID. Eg. CVE-2201-0123
        :param severity: List or tuple with strings. Includes packages that provide a fix
        for an issue of the specified severity.
        Z__eqgZ	__eqg__gt)�eqZgtez Unsupported value for `cmp_type`Z
advisory_type�advisoryZadvisory_bugZadvisory_cveZadvisory_severityN)r�r/�
setdefaultrr<)
r3Zcmp_typer�r9ZbugzillaZcvesZseverityZcmp_dictZcmpr�r5r5r6�add_security_filtersI	s&
zBase.add_security_filterscCs
i|_dS)z,
        Reset all security filters
        N)r/)r3r5r5r6�reset_security_filtersn	szBase.reset_security_filtersc
Cs>|jp
|js|r|S|jj�jdd�}|jrRx|jD]}|j|�}q8W|g|_|jr�x<|jj�D].\}}|rx|d}||i}	|j|jf|	��}qdW|j|�}|�s:|�r:|j	�}t
|j�j��}
|
dk�r:|dk�rt
d�j|
�}t
d�j|
�}tjt|||
��n2t
d�j||
�}t
d	�j||
�}tjt|||
��|S)
z�
        Merge Queries in _update_filters and return intersection with q Query
        @param q: Query
        @return: Query
        T)riZ	__upgraderNz3No security updates needed, but {} update availablez4No security updates needed, but {} updates availablez<No security updates needed for "{}", but {} update availablez=No security updates needed for "{}", but {} updates available)r/r.r`rqrrru�itemsr�r�r�rpr0r�rrOrMrdr	)
r3rrrdr�Zmerged_queriesrqZfilter_namer��kwargsrZZmsg1Zmsg2r5r5r6r�u	s>


zBase._merge_update_filtersc
s�jrtd�}t|���|j�j��j|jk}|r:gn�j}��fdd�}d}|jj�j��x |D�]}	t	j
j|	��}
�x|
D�]�}tj
|j|j|j�dkr�td�}tj||	|j�q�|jjr�t	jjj|j|j�}t	jjj|�}
tjt	jj||
��|	|_|jj�rt	j
j||
�nt	j
j |�d}|jj!�r:d}n�|jj"�r�|jj�r�|
t	jj#j$t	jj#j%fk�r�d}tjt	jj&td���nd}tjt	jj&td	���nd}n<|�r�|�|j|j|	|j'|jd
��}n|�r�|�|j|j�}|�s�d}q�|jj(t)j*�}|�r|jj+�}|jj,|t)j*�|jj-tj.|j��}|�rD|jj,|�|dk�rjtd�|}t	j/j0||���tjtd��d}q�WqhW|�r�|�r�t	j/j0td
���|�s�td��j1}t	j/j0||���|j2��\}}|dk�r|�r�td�}tj|�t3|�}t	j/j0||���dS)a�Retrieve a key for a package. If needed, use the given
        callback to prompt whether the key should be imported.

        :param po: the package object to retrieve the key of
        :param askcb: Callback function to use to ask permission to
           import a key.  The arguments *askcb* should take are the
           package object, the userid of the key, and the keyid
        :param fullaskcb: Callback function to use to ask permission to
           import a key.  This differs from *askcb* in that it gets
           passed a dictionary so that we can expand the values passed.
        :raises: :class:`dnf.exceptions.Error` if there are errors
           retrieving the keys
        z6Unable to retrieve a key for a commandline package: %scs0|td��d7}|td�dj�j�7}|S)Nz. Failing package is: %sz
 zGPG Keys are configured as: %sz, )rr�r�)r�)r�rSr5r6�_prov_key_data�	sz1Base._get_key_for_package.<locals>._prov_key_dataFrz)GPG key at %s (0x%s) is already installedTzThe key has been approved.zThe key has been rejected.)r��useridZhexkeyid�keyurl�fingerprint�	timestampzKey import failed (code %d)zKey imported successfullyzDidn't install any keysz�The GPG keys listed for the "%s" repository are already installed but they are not correct for this package.
Check that the correct key URLs are configured for this repository.z+Import of key(s) didn't help, wrong key(s)?N)4rrr�r^r~rPr1r�r�rZcryptoZretriever
ZkeyInstalledr�Zrpm_idrCrMr�Zshort_idr4r�r�ZKeyInfoZfrom_rpm_key_objectr@Zraw_keyZDNSSECKeyVerificationZverifyZ
nice_user_msg�urlZlog_dns_key_importZlog_key_importZassumenoZ	assumeyesZValidityZVALIDZPROVEN_NONEXISTENCEZany_msgrBr9r(r:Z
getTsFlagsr�ZpgpImportPubkeyZ
procgpgkeyrQrcr�r�r
)r3r��askcb�	fullaskcbr�Z
key_installedZkeyurlsr?Zuser_cb_failrAr�r�Z
dns_input_keyZ
dns_resultZrcZ	test_flagZ
orig_flagsr��errmsgr5)r�rSr6�_get_key_for_package�	s�








zBase._get_key_for_packagecCs|j|||�dS)a�Retrieve a key for a package. If needed, use the given
        callback to prompt whether the key should be imported.

        :param pkg: the package object to retrieve the key of
        :param askcb: Callback function to use to ask permission to
           import a key.  The arguments *askcb* should take are the
           package object, the userid of the key, and the keyid
        :param fullaskcb: Callback function to use to ask permission to
           import a key.  This differs from *askcb* in that it gets
           passed a dictionary so that we can expand the values passed.
        :raises: :class:`dnf.exceptions.Error` if there are errors
           retrieving the keys
        N)rH)r3r�rErFr5r5r6�package_import_key$
szBase.package_import_keycCs4g}|jj�x |jj�D]}|jt|��qW|S)N)r�r�Zproblemsrwr
)r3�resultsZprobr5r5r6r45
s

zBase._run_rpm_check�w+bcKstjj||j||f|�S)z�
        Open the specified absolute url, return a file object
        which respects proxy setting even for non-repo downloads
        )rr�Z_urlopenr4)r3rDrS�moder>r5r5r6�urlopen@
szBase.urlopencCs,|dkr|jjtjd�}|j|jjd�}|S)N)r�)r�)rrqrKr�r�r4r�)r3rZinstallonlyr5r5r6r�H
szBase._get_installonly_querycCsrtjj|dd�}|j|jdddd�}|drn|drn|djrn||ddjkrntjtd�j	|ddj��dS)	NT)r�F)rjrkrlrqr"rz  * Maybe you meant: {})
rrsrtrr`r�rMr�rrO)r3rr|rr5r5r6�_report_icase_hintN
s

zBase._report_icase_hintcCs�dd�}g}g}x6|D].}|j�r:|jtjkrD|j|�q|j|�qWtd�}|||�sjtjjtd���|j	j
r�td�}|||�s�tjjtd���g}||fS)a Check checksum of packages from local repositories and returns list packages from remote
        repositories that will be downloaded. Packages from commandline are skipped.

        :param install_pkgs: list of packages
        :return: list of remote pkgs
        cSsxd}xn|D]f}d}y|j�}Wn0tk
rN}ztjt|��WYdd}~XnX|dk	r
tj|j||j��d}q
W|S)NTF)ZverifyLocalPkgrLrMr�r�rOrm)Zpkg_listZ
logger_msgZall_packages_verifiedr�Zpkg_successfully_verifiedrUr5r5r6�_verification_of_packages]
s
 z;Base._select_remote_pkgs.<locals>._verification_of_packagesz>Package "{}" from local repository "{}" has incorrect checksumz;Some packages from local repository have incorrect checksumz8Package "{}" from repository "{}" has incorrect checksumzVSome packages have invalid cache, but cannot be downloaded due to "--cacheonly" option)Z
_is_local_pkgrmrKZCMDLINE_REPO_NAMErwrrrQrcr4r�)r3Zinstall_pkgsrOryZlocal_repository_pkgsr�r�r5r5r6rtV
s&




zBase._select_remote_pkgscCsx|D]}t|�qWdS)N)�_msg_installed)r3r�r�r5r5r6r��
s
zBase._report_already_installedc	Cs�|jjtjd�}tjj|�}|j|j|d|d�}|dk	rH|dj|d�|dsdtj	j
td�|��nB|jjtjd�}|dj
|�}|r�td�}ntd�}tj	j
||��dS)	N)r�F)rrrqrq)rmzNo match for argumentz?All matches were filtered out by exclude filtering for argumentz?All matches were filtered out by modular filtering for argument)r`rqrKr�rrsrtrrrrQrrZIGNORE_REGULAR_EXCLUDESr�)	r3rrrm�	all_queryrsrZwith_regular_queryr�r5r5r6r�
s
z#Base._raise_package_not_found_errorc	s��jjtjd�j�}tjj|�}|j�j|d|d�}|dsNtj	j
td�|���dk	rp��fdd�|dD�}n|d}|s�td�}ntd	�}tj	j
||��dS)
N)r�F)rrrqrqzNo match for argumentcs g|]}�jj|��kr|�qSr5)r�rS)rZr�)rmr3r5r6r\�
sz;Base._raise_package_not_installed_error.<locals>.<listcomp>zCAll matches were installed from a different repository for argumentz?All matches were filtered out by exclude filtering for argument)r`rqrKr�r�rrsrtrrQr#r)	r3rrrmrQrsrr�r�r5)rmr3r6r*�
s
z'Base._raise_package_not_installed_errorcCs|jj|jdd�dS)z�
        Setup DNF file loggers based on given configuration file. The loggers are set the same
        way as if DNF was run from CLI.
        T)Zfile_loggers_onlyN)r$Z_setup_from_dnf_confr4)r3r5r5r6�
setup_loggers�
szBase.setup_loggerscs�|jjtjtjBtjB@r d}nd}t|j�}|j|dd�}|jf|�}|rl|rlt	j
j|j��}t
j|�t|jdd��}t|jdd��|}	dd���fdd�|D��t��fd	d�|D��}
t��fd
d�|	D��}|
|fS)z�returns set of conflicting packages and set of packages with broken dependency that would
        be additionally installed when --best and --allowerasingTF)rrZignore_weak)rcSstj|j|j|j|j|jd�S)N)r��epoch�version�releaser�)rKZNEVRAr�rSrTrUr�)�itemr5r5r6r�
sz&Base._skipped_packages.<locals>._nevracsg|]}�|��qSr5r5)rZr*)rr5r6r\�
sz*Base._skipped_packages.<locals>.<listcomp>csg|]}�|��kr|�qSr5r5)rZr�)r�transaction_nevrasr5r6r\�
scsg|]}�|��kr|�qSr5r5)rZr�)rrWr5r6r\�
s)rZactionsrK�INSTALLZUPGRADEZUPGRADE_ALLrr0rrr�rrrMrdr�problem_conflictsZproblem_broken_dependency)r3Zreport_problemsr�rZngZparamsrr�rYZproblem_dependencyZskipped_conflictsZskipped_dependencyr5)rrWr6�_skipped_packages�
s(


zBase._skipped_packages)N)F)F)TT)T)N)FFF)F)F)N)T)NN)TN)rhNNFN)N)NTN)NT)TNN)NT)T)NNTN)NTN)F)T)N)N)N)N)NNNN)NN)NNNF)F)NTF)NN)NN)NrK)N)��__name__�
__module__�__qualname__r7r8r:r;r?rV�staticmethodrrgrr��propertyrr4r^�deleterrr�Zlazyattrr�r`rar��setterr�r�r�r�r�r�r�r�r�r9r�r�r�r(ZRPMTRANS_FLAG_NOSCRIPTSZRPMTRANS_FLAG_NOTRIGGERSZRPMTRANS_FLAG_NODOCSr:ZRPMTRANS_FLAG_JUSTDBZRPMTRANS_FLAG_NOCONTEXTSZRPMTRANS_FLAG_NOFILEDIGESTr�r/r�r�r�r�r�r�r�r�r�rr	r
rrr!rAr8rLr;rUrqrzr~r�r�r�r�r�r�r�rrr�r�r�r�r�r�r�r�r�r�r�r�r�rrr�r�rrrrr r!r�r%r'r�r�r-r/r.r�r;r<r�rHrIr4rMr�rNrtr�rr*rRrZr5r5r5r6r[s�
	
=
	
8
;
>

=


	
'\
8l"]
*
B

;

)
=

*



/


%
&


#&
%)-rcCs t|�}td�}tj||�dS)Nz Package %s is already installed.)r
rrMr�)r�r�r�r5r5r6rP�
srP)H�__doc__Z
__future__rrrrr�rZlibdnf.transactionr�rxrZ	dnf.compsrZdnf.i18nrr	r
Zdnf.utilrZdnf.db.historyrZdnf.yumr
�collections.abcr�ImportError�collectionsr�Zdnf.callbackZdnf.confZ
dnf.conf.readZ
dnf.cryptoZ
dnf.dnssecZdnf.drpmZdnf.exceptionsZdnf.goalZdnf.historyZdnf.lockZdnf.loggingZdnf.module.module_baseroZ
dnf.persistorZ
dnf.pluginZ	dnf.queryZdnf.repoZdnf.repodictZdnf.rpm.connectionZdnf.rpm.miscutilsZdnf.rpm.transactionZdnf.sackZdnf.selectorZdnf.subjectZdnf.transactionZdnf.yum.rpmtransr�r�rKr�r#rHr�r�rEr(r�rwZ	getLoggerrM�objectrrPr5r5r5r6�<module>s�

site-packages/dnf/__pycache__/transaction.cpython-36.opt-1.pyc  (tar entry; compiled CPython 3.6 bytecode)
[binary data -- not human-readable. Recoverable content: dnf/transaction.py maps libdnf.transaction
 action constants to the names PKG_DOWNGRADE/PKG_DOWNGRADED, PKG_INSTALL, PKG_OBSOLETE/PKG_OBSOLETED,
 PKG_REINSTALL/PKG_REINSTALLED, PKG_REMOVE, PKG_UPGRADE/PKG_UPGRADED plus the aliases PKG_ERASE,
 PKG_CLEANUP, PKG_VERIFY, PKG_SCRIPTLET, TRANS_PREPARATION, TRANS_POST and the groupings
 FORWARD_ACTIONS / BACKWARD_ACTIONS. It also defines two label tables: ACTIONS with the in-progress
 strings "Downgrading", "Cleanup", "Installing", "Obsoleting", "Reinstalling", "Erasing",
 "Upgrading", "Verifying", "Running scriptlet", "Preparing", and FILE_ACTIONS with the completed
 strings "Downgrade", "Downgraded", "Installed", "Obsolete", "Obsoleted", "Reinstall", "Reinstalled",
 "Erase", "Upgrade", "Upgraded", "Verified".]
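A minimal sketch of reading those label tables through the dnf Python API; this assumes the
python3-dnf package is installed and shows output for an English locale:

    # Look up the progress and completed labels dnf uses for a plain package install.
    import dnf.transaction as tr

    print(tr.ACTIONS[tr.PKG_INSTALL])        # "Installing"
    print(tr.FILE_ACTIONS[tr.PKG_INSTALL])   # "Installed"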
site-packages/dnf/__pycache__/selector.cpython-36.pyc  (tar entry; compiled CPython 3.6 bytecode)
[binary data -- dnf/selector.py is a stub that re-exports hawkey.Selector.]
site-packages/dnf/__pycache__/goal.cpython-36.pyc  (tar entry; compiled CPython 3.6 bytecode)
[binary data -- dnf/goal.py is a stub that re-exports hawkey.Goal.]
site-packages/dnf/__pycache__/repo.cpython-36.opt-1.pyc  (tar entry; compiled CPython 3.6 bytecode)
[binary data: compiled bytecode of dnf/repo.py -- not human-readable. Recoverable docstrings and
 identifiers:
   repo_id_invalid() -- "Return index of an invalid character in the repo ID (if present)."
   helper classes _DownloadErrors, _DetailedLibrepoError, _NullKeyImport, Metadata,
     PackageTargetCallbacks, PackagePayload, RPMPayload, RemoteRPMPayload, MDPayload, RepoCallbacks
   Repo.add_metadata_type_to_download(metadata_type) -- "Ask for additional repository metadata
     type to download. Given metadata_type is appended to the default metadata set when repository
     is downloaded." (example in the source: add_metadata_type_to_download("productid"))
   Repo.remove_metadata_type_from_download(metadata_type) -- stop requesting that extra type
   Repo.get_metadata_path(metadata_type) -- "Return path to the file with downloaded repository
     metadata of given type."
   Repo.get_metadata_content(metadata_type) -- returns the file content, decompressed if needed
   Repo.load() -- "Load the metadata for this repo. Depending on the configuration and the age and
     consistence of data available on the disk cache, either loads the metadata from the cache or
     downloads them from the mirror, baseurl or metalink. ... Returns True if this call to load()
     caused a fresh metadata download."
   Repo._metadata_expire_in() -- seconds until the cached metadata expire
   Repo.get_http_headers() / Repo.set_http_headers(headers) -- "Sets new http headers and rewrites
     existing ones." Example: set_http_headers(["User-Agent: Agent007", "MyFieldName: MyFieldValue"])
   Repo.remote_location(location, schemes=('http', 'ftp', 'file', 'https')) -- absolute URL for a
     path relative to the repository, or None]
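A hedged sketch of driving those Repo methods from the dnf Python API; the repository id "updates"
and the header value below are example data, not taken from this dump:

    # Illustrative only: load one repository's metadata and inspect a few paths/URLs.
    import dnf

    with dnf.Base() as base:
        base.read_all_repos()                               # parse the .repo files
        repo = base.repos["updates"]                        # KeyError if this id is not defined
        repo.set_http_headers(["User-Agent: example-agent/1.0"])
        freshly_downloaded = repo.load()                    # True when metadata were fetched anew
        print("fresh download:", freshly_downloaded)
        print("primary metadata file:", repo.get_metadata_path("primary"))
        print("repomd.xml URL:", repo.remote_location("repodata/repomd.xml"))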
site-packages/dnf/__pycache__/drpm.cpython-36.opt-1.pyc  (tar entry; compiled CPython 3.6 bytecode)
[binary data -- not human-readable. Recoverable content: dnf/drpm.py defines
 APPLYDELTA = "/usr/bin/applydeltarpm", the DeltaPayload download payload, and the DeltaInfo class
 ("A delta lookup and rebuild context: query -- installed packages to use when looking up deltas,
 progress -- progress obj to display finished delta rebuilds") with delta_factory() ("Turn a po to
 Delta RPM po, if possible"), job handling via spawned applydeltarpm processes, wait() ("Wait until
 all jobs have finished"), and the error strings "Delta RPM rebuild failed" and "Checksum of the
 delta-rebuilt RPM failed".]
site-packages/dnf/__pycache__/package.cpython-36.opt-1.pyc  (tar entry; compiled CPython 3.6 bytecode)
[binary data -- not human-readable. Recoverable docstrings ("Contains the dnf.Package class."):
   Package -- "Represents a package. #:api"; DEBUGINFO_SUFFIX "-debuginfo",
     DEBUGSOURCE_SUFFIX "-debugsource"
   from_repo / _from_repo -- "For installed packages returns id of repository from which the
     package was installed prefixed with '@' ... Otherwise returns id of repository the package
     belongs to (@System for installed packages of unknown origin)"
   _header -- "Returns the header of a locally present rpm package file. As opposed to
     self.get_header(), which retrieves the header of an installed package from rpmdb."
   source_name -- "returns name of source package, e.g. krb5-libs -> krb5"
   debug_name -- "Returns name of the debuginfo package for this package ...
     e.g. kernel-PAE -> kernel-PAE-debuginfo"
   debugsource_name -- "Returns name of the debugsource package ... e.g. krb5-libs -> krb5-debugsource"
   source_debug_name -- "returns name of debuginfo package for source package of given package,
     e.g. krb5-libs -> krb5-debuginfo"
   get_header() -- "Returns the rpm header of the package if it is installed. If not installed,
     returns None."
   idx -- "Always type it to int, rpm bindings expect it like that."
   localPkg() -- "Package's location in the filesystem. For packages in remote repo returns where
     the package will be/has been downloaded."
   remote_location(schemes=('http', 'ftp', 'file', 'https')) -- "The location from where the
     package can be downloaded from. Returns None for installed and commandline packages."
   returnIdSum() -- "Return the chksum type and chksum string how the legacy yum expects it.";
     verifyLocalPkg() -- "Can not verify an installed package."]
site-packages/dnf/__pycache__/subject.cpython-36.opt-1.pyc  (tar entry; compiled CPython 3.6 bytecode)
[binary data -- dnf/subject.py is a stub that re-exports hawkey.Subject.]
site-packages/dnf/__pycache__/base.cpython-36.opt-1.pyc  (tar entry; compiled CPython 3.6 bytecode)
[binary data: compiled bytecode of dnf/base.py ("Supplies the Base class.") -- not human-readable.
 Recoverable docstrings for the Base class include:
   init_plugins() / pre_configure_plugins() / configure_plugins() / unload_plugins() -- plugin
     lifecycle ("Load plugins and run their __init__().", "Run plugins pre_configure() method.")
   update_cache(timer=False) -- metadata timer caching ("Metadata cache created.")
   fill_sack() -- "Prepare the Sack and the Goal objects."
   fill_sack_from_repos_in_cache() -- "Prepare Sack and Goal objects and also load all enabled
     repositories from cache only, it doesn't download anything and it doesn't check if metadata
     are expired."
   close() -- "Close all potential handles and clean cache."
   read_all_repos() -- "Read repositories from the main conf file and from .repo files."
   reset() -- "Make the Base object forget about various things."
   read_comps() -- "Create the groups object to access the comps metadata."
   _getHistory() -- transaction history access; iter_userinstalled() -- "Get iterator over the
     packages installed by the user."
   resolve() -- "Build the transaction set."
   do_transaction() / _run_transaction() -- "Perform the RPM transaction"; transaction check and
     test phases ("Running transaction check", "Running transaction test", "Running transaction"),
     RPMDB version bookkeeping, history recording, and _trans_error_summary() ("Parse the error
     string for 'interesting' errors which can be grouped, such as disk space issues.")
   download_packages(pkglist, progress=None, callback_total=None) -- "Download the packages
     specified by the given list of packages."
   package_signature_check(pkg) / _sig_check_pkg(po) -- "Verify the GPG signature of the given
     package object."; _gpg_key_check() -- "Checks for the presence of GPG keys in the rpmdb."
   _do_package_lists() -- installed / available / upgrades / extras / obsoletes / recent lists
   group_install() / group_remove() / group_upgrade(), environment_install() / environment_remove()
     / environment_upgrade(), env_group_install() / env_group_remove() / env_group_upgrade()
   _categorize_specs() -- "Categorize :param install and :param exclude list into two groups each
     (packages and groups)"; install_specs() -- marks package and group specs for the transaction]
MarkingErrorsrrr	r�r)r3r�r�rmr|rrrrrr�r��specrUZno_match_module_specsr	rfZe_specr5r5r6r��sN
 zBase.install_specsc
Cs�tjj|�}|j|j|dd�}|jjdks4|j|�rr|d}|dk	rP|j|d�|sb|j	|||�|j
|||d�S|jjdkr�|j|||jj|d	|d
�}|s�|j	|||�x|D]}	|j
j|	|d�q�WdSd
S)z@Mark package(s) given by pkg_spec and reponame for installation.F)r�with_srcrhrqN)rm)rmr|rT)rrrm�reports�solution)r�r�r�r)rrsrt�get_best_solutionr`r4r�Z_is_arch_specifiedrr�_raise_package_not_found_errorr��_get_best_selectorsrrr�)
r3�pkg_specrmr|rr|rr�sltrsrr5r5r6r��s,
zBase.installcCs�|jrd}t|��|jj�j�j|j|jdgd�}|shtd�}t	j
||j�tjj
td�|j|j��n\t|�d|kr�tjj|j�}|j|gd�|jj||d�d	Std
�}t	j
||j�dSdS)Nz-downgrade_package() for an installed package.�noarch)r�r�z.Package %s not installed, cannot downgrade it.zNo match for argument: %sr)r�)r�r�r�zCPackage %s of lower version already installed, cannot downgrade it.)�_from_system�NotImplementedErrorr`rqr�rrr�r�rrMrdrrQr
ru�sortedr�r�rrr�)r3r�r|r�rrr5r5r6�package_downgrades  zBase.package_downgradecCs�|jj�j|j|j|j�}|j|�\}}||kr>|j|g�nT|tj	j
|�krdtjj
td�|j��n.tjj|j�}|j|gd�|jj||d�dS)NzNo match for argument: %s)r�)r�r�r�)r`rq�_nevrar�r�r�r	r�r�r�r�rrQ�PackageNotFoundErrorrrur�r�rrr�)r3r�r|rr�rrr5r5r6�package_installszBase.package_installcCsf|jj�j�j|j|j|jd�r0|jj|�dSt	d�}t
j|t|��t
jjt	d�|j|j��dS)N)r�r�r�r�z.Package %s not installed, cannot reinstall it.zNo match for argument: %s)r`rqr�rrr�r�r�rr�rrMrdr�rrQr
ru)r3r�r�r5r5r6�package_reinstall(s zBase.package_reinstallcCs|jj|�dS)Nr�)rr�)r3r�r5r5r6�package_remove0szBase.package_removecCs`|jrd}t|��|jdkr6td�}tj||j�dS|jj�j	�j
�}|jjr�|jj�j
|gd�j
|d�r�tjj|j�}|j|gd�|jj|d�dS|jd	kr�|j|jd
�}n|j|j|jd	gd�}|�std�}tj||j�tjjtd
�|j|j��nZt|�d|k�rBtjj|j�}|j|gd�|jj|d�dStd�}tj||j�dSdS)Nz+upgrade_package() for an installed package.r�z<File %s is a source package and cannot be updated, ignoring.r)r�)r)r�r�r)r�)r�r�z+Package %s not installed, cannot update it.zNo match for argument: %szHThe same or higher version of %s is already installed, cannot update it.r�)rrr�rrMr�rur`rqr�rxr4rrrrr�r�rrr�r�r�rdrQr
r)r3r�r�r�rrr5r5r6�package_upgrade4s:
$
zBase.package_upgradec	Cs�|jj�j�}|j|jj�jdd�|D�d��}|j�}|rf|jj�j�j|j|j��d�}|j|�}|dk	rz|j|d�|j||dd�}|r�|j|j	�j
dd�|D�d��}tjj
|j�}|j|d	�|jj|d
�dS)NcSsg|]
}|j�qSr5)r�)rZr�r5r5r6r\\sz*Base._upgrade_internal.<locals>.<listcomp>)r�)r)rmT)rr�cSsg|]
}|j�qSr5)r�)rZr�r5r5r6r\xs)r�)r�r�)r`rqr�r�rrrrur�r�r�r�rr�r�rrr�)	r3rqrrmrZ
installed_allrrVrr5r5r6�_upgrade_internalYs "
 zBase._upgrade_internalc
Csttjj|�}|j|j�}|d}|�rZtjj|�}|oH|doH|dj�r*|dj}|jj�j	�j
�}|jjr||j
|d�n|jj�jdd�}	|	�s*|j
|d�j
�}
|
s�td�}tj||�tjjtd�||��nV|djo�tjj|dj��r*|
j|djd	��s*td
�}tj|dj||dj��|jj�oH|d�oH|dj�}|j||||�Stjjtd�||��dS)Nrq�nevra)rT)ri)r�z(Package %s available, but not installed.zNo match for argument: %s)r�z?Package %s available, but installed for different architecture.z{}.{})rrsrtrr`r�r�r�rqr�rxr4rr�rrrrMrdrQ�PackagesNotInstalledErrorr�rOZ
has_just_namerr
)
r3rrmr|rrZwildcard�pkg_namer�Z
obsoletersZinstalled_namer�rr5r5r6r�s0
& zBase.upgradecCs|j|jj�|jj|dd�S)N)r)rr`rqr4r)r3rmr5r5r6�upgrade_all�szBase.upgrade_allcCs�|dkr|jj�nxtjj|�}|j|jdd�}|djtj	d�|j
|||jjdd�}|spt
jtd�|�dSx|D]}|jj|d	�qvWd
S)NF)rrq)�
reponame__neqT)rrr
zNo package %s installed.r)r�r�)rZdistupgrade_allrrsrtrr`rrrKZSYSTEM_REPO_NAMErr4rrMr�rZdistupgrade)r3rrsrrrr5r5r6�distro_sync�s
zBase.distro_synccCs�t|||g�r�||7}d}|rF|rFx4|D]}td�}tj||�q(Wn|rX|j|�rXd}xX|D]P}y|j||d�Wn4tjjk
r�}	ztj	t
|	��WYdd}	~	Xq^Xd}q^W|s�tjtd��n4|jj�j
|jj|jjd�}
x|
D]}|j|�q�WdS)z�Removes all 'leaf' packages from the system that were originally
        installed as dependencies of user-installed packages but which are
        no longer required by any such package.FzNot a valid form: %sT)rNzNo packages marked for removal.)rb)�anyrrMrdr�r�rrQr
r�r�r`rqr�r�r�r4rbr)r3rr�r�	filenamesr�Zgrp_specr�rrUr}r�r5r5r6r��s,


 
zBase.autoremovecsptjj|�j�j|d�}��fdd�|j�D�}|sB�j||���jj}x|D]}�j	j
||d�qPWt|�S)z'Mark the specified package for removal.)rcs(g|] }�dks �jj|��kr|�qS)N)r�rS)rZr�)rmr3r5r6r\�szBase.remove.<locals>.<listcomp>)r�)rrsrtrvr`r��"_raise_package_not_installed_errorr4r�rr�rp)r3rrmrr
r�r�r�r5)rmr3r6r��s
zBase.removecstjj|�}|j�j�}��fdd�|j�D�}|j�}	|dk	rL|	j|d�|dk	r`|	j|d�tjj	|	�}
|s�tj
jd||
j���d}�j
j}x\|D]T}
y|
t|
�}Wn*tk
r�|s�w��jj|
|d�YnX�jj|�|d7}q�W|dk�rtj
jd||��|S)	Ncs(g|] }�dks �jj|��kr|�qS)N)r�rS)rZr�)�old_reponamer3r5r6r\�sz"Base.reinstall.<locals>.<listcomp>)rm)r$zno package matchedr)r�r�)rrsrtrvr`r�rrrrqZ_per_nevra_dictrQr!r�r4r�r
�KeyErrorrr�r�ZPackagesNotAvailableError)r3rr)Znew_reponameZnew_reponame_neqZ	remove_nar|rr�Zavailable_qZavailable_nevra2pkgr�r�Z
installed_pkgZ
available_pkgr5)r)r3r6�	reinstall�s6


zBase.reinstallcCs
|j|�S)z�Mark a package to be downgraded.

        This is equivalent to first removing the currently installed package,
        and then installing an older version.

        )�downgrade_to)r3rr5r5r6�	downgrade	szBase.downgradec
Cstjj|�}|j|j�}|s6td�|}tjj||��d}|j�}t	|j
�j��}|jj�j
�j|d�}	t|	�dkr�td�|}tjj|||��xn|	j
�j�D]^}
|j�j|
d�}|s�td�}tj||
�q�tjj|j�}|j|d�|jj||d�d}q�W|S)	z�Downgrade to specific version if specified otherwise downgrades
        to one version lower than the package installed.
        zNo match for argument: %sr)r�z6Packages for argument %s available, but not installed.zDPackage %s of lowest version already installed, cannot downgrade it.)r�)r�r�r�)rrsrtrvr`rrQrrr�
_name_dictr�rqr�rrrpr!Z
downgradesr�rMrdr�r�rrr�)
r3rr|r|rr�r�Zavailable_pkgsZavailable_pkg_namesZq_installedr"Zdowngrade_pkgsrr5r5r6r,	s.zBase.downgrade_tocs�|jj�j�d�}|r |�gfStjj|j��}|r>|�gfS�jd�sR�jd�r^d�g}n&�jd�rr|�gfS�fdd�d
D�}|jj�j|d�|fS)N)Z
file__glob�/bin/�/sbin/z/usrrscsg|]}|��qSr5r5)rZ�prefix)�
provides_specr5r6r\E	sz!Base.provides.<locals>.<listcomp>�	/usr/bin/�
/usr/sbin/)r/r0r3r4)r`rqrrrZ_by_provides�
startswith)r3r2Z	providersZbinary_providesr5)r2r6r�6	s




z
Base.providesc
Cs�ddd�}||krtd��||}|rDd|}	|jj|	t��j|�|rfd|}	|jj|	t��j|�|r�d|}	|jj|	t��j|�|r�d|}	|jj|	t��j|�|r�d	|}	|jj|	t��j|�d
S)a�
        It modifies results of install, upgrade, and distrosync methods according to provided
        filters.

        :param cmp_type: only 'eq' or 'gte' allowed
        :param types: List or tuple with strings. E.g. 'bugfix', 'enhancement', 'newpackage',
        'security'
        :param advisory: List or tuple with strings. E.g.Eg. FEDORA-2201-123
        :param bugzilla: List or tuple with strings. Include packages that fix a Bugzilla ID,
        Eg. 123123.
        :param cves: List or tuple with strings. Include packages that fix a CVE
        (Common Vulnerabilities and Exposures) ID. Eg. CVE-2201-0123
        :param severity: List or tuple with strings. Includes packages that provide a fix
        for an issue of the specified severity.
        Z__eqgZ	__eqg__gt)�eqZgtez Unsupported value for `cmp_type`Z
advisory_type�advisoryZadvisory_bugZadvisory_cveZadvisory_severityN)r�r/�
setdefaultrr<)
r3Zcmp_typer�r7ZbugzillaZcvesZseverityZcmp_dictZcmpr�r5r5r6�add_security_filtersI	s&
zBase.add_security_filterscCs
i|_dS)z,
        Reset all security filters
        N)r/)r3r5r5r6�reset_security_filtersn	szBase.reset_security_filtersc
Cs>|jp
|js|r|S|jj�jdd�}|jrRx|jD]}|j|�}q8W|g|_|jr�x<|jj�D].\}}|rx|d}||i}	|j|jf|	��}qdW|j|�}|�s:|�r:|j	�}t
|j�j��}
|
dk�r:|dk�rt
d�j|
�}t
d�j|
�}tjt|||
��n2t
d�j||
�}t
d	�j||
�}tjt|||
��|S)
z�
        Merge Queries in _update_filters and return intersection with q Query
        @param q: Query
        @return: Query
        T)riZ	__upgraderNz3No security updates needed, but {} update availablez4No security updates needed, but {} updates availablez<No security updates needed for "{}", but {} update availablez=No security updates needed for "{}", but {} updates available)r/r.r`rqrrru�itemsr�r�r�rpr.r�rrOrMrdr	)
r3rrrdr�Zmerged_queriesrqZfilter_namer��kwargsrZZmsg1Zmsg2r5r5r6r�u	s>


zBase._merge_update_filtersc
s�jrtd�}t|���|j�j��j|jk}|r:gn�j}��fdd�}d}|jj�j��x |D�]}	t	j
j|	��}
�x|
D�]�}tj
|j|j|j�dkr�td�}tj||	|j�q�|jjr�t	jjj|j|j�}t	jjj|�}
tjt	jj||
��|	|_|jj�rt	j
j||
�nt	j
j |�d}|jj!�r:d}n�|jj"�r�|jj�r�|
t	jj#j$t	jj#j%fk�r�d}tjt	jj&td���nd}tjt	jj&td	���nd}n<|�r�|�|j|j|	|j'|jd
��}n|�r�|�|j|j�}|�s�d}q�|jj(t)j*�}|�r|jj+�}|jj,|t)j*�|jj-tj.|j��}|�rD|jj,|�|dk�rjtd�|}t	j/j0||���tjtd��d}q�WqhW|�r�|�r�t	j/j0td
���|�s�td��j1}t	j/j0||���|j2��\}}|dk�r|�r�td�}tj|�t3|�}t	j/j0||���dS)a�Retrieve a key for a package. If needed, use the given
        callback to prompt whether the key should be imported.

        :param po: the package object to retrieve the key of
        :param askcb: Callback function to use to ask permission to
           import a key.  The arguments *askcb* should take are the
           package object, the userid of the key, and the keyid
        :param fullaskcb: Callback function to use to ask permission to
           import a key.  This differs from *askcb* in that it gets
           passed a dictionary so that we can expand the values passed.
        :raises: :class:`dnf.exceptions.Error` if there are errors
           retrieving the keys
        z6Unable to retrieve a key for a commandline package: %scs0|td��d7}|td�dj�j�7}|S)Nz. Failing package is: %sz
 zGPG Keys are configured as: %sz, )rr�r�)r�)r�rSr5r6�_prov_key_data�	sz1Base._get_key_for_package.<locals>._prov_key_dataFrz)GPG key at %s (0x%s) is already installedTzThe key has been approved.zThe key has been rejected.)r��useridZhexkeyid�keyurl�fingerprint�	timestampzKey import failed (code %d)zKey imported successfullyzDidn't install any keysz�The GPG keys listed for the "%s" repository are already installed but they are not correct for this package.
Check that the correct key URLs are configured for this repository.z+Import of key(s) didn't help, wrong key(s)?N)4rrr�r^r~rPr1r�r�rZcryptoZretriever
ZkeyInstalledr�Zrpm_idrArMr�Zshort_idr4r�r�ZKeyInfoZfrom_rpm_key_objectr>Zraw_keyZDNSSECKeyVerificationZverifyZ
nice_user_msg�urlZlog_dns_key_importZlog_key_importZassumenoZ	assumeyesZValidityZVALIDZPROVEN_NONEXISTENCEZany_msgr@r9r(r:Z
getTsFlagsr�ZpgpImportPubkeyZ
procgpgkeyrQrcr�r�r
)r3r��askcb�	fullaskcbr�Z
key_installedZkeyurlsr=Zuser_cb_failr?r�r�Z
dns_input_keyZ
dns_resultZrcZ	test_flagZ
orig_flagsr��errmsgr5)r�rSr6�_get_key_for_package�	s�








zBase._get_key_for_packagecCs|j|||�dS)a�Retrieve a key for a package. If needed, use the given
        callback to prompt whether the key should be imported.

        :param pkg: the package object to retrieve the key of
        :param askcb: Callback function to use to ask permission to
           import a key.  The arguments *askcb* should take are the
           package object, the userid of the key, and the keyid
        :param fullaskcb: Callback function to use to ask permission to
           import a key.  This differs from *askcb* in that it gets
           passed a dictionary so that we can expand the values passed.
        :raises: :class:`dnf.exceptions.Error` if there are errors
           retrieving the keys
        N)rF)r3r�rCrDr5r5r6�package_import_key$
szBase.package_import_keycCs4g}|jj�x |jj�D]}|jt|��qW|S)N)r�r�Zproblemsrwr
)r3�resultsZprobr5r5r6r45
s

zBase._run_rpm_check�w+bcKstjj||j||f|�S)z�
        Open the specified absolute url, return a file object
        which respects proxy setting even for non-repo downloads
        )rr�Z_urlopenr4)r3rBrS�moder<r5r5r6�urlopen@
szBase.urlopencCs,|dkr|jjtjd�}|j|jjd�}|S)N)r�)r�)rrqrKr�r�r4r�)r3rZinstallonlyr5r5r6r�H
szBase._get_installonly_querycCsrtjj|dd�}|j|jdddd�}|drn|drn|djrn||ddjkrntjtd�j	|ddj��dS)	NT)r�F)rjrkrlrqr rz  * Maybe you meant: {})
rrsrtrr`r�rMr�rrO)r3rr|rr5r5r6�_report_icase_hintN
s

zBase._report_icase_hintcCs�dd�}g}g}x6|D].}|j�r:|jtjkrD|j|�q|j|�qWtd�}|||�sjtjjtd���|j	j
r�td�}|||�s�tjjtd���g}||fS)a Check checksum of packages from local repositories and returns list packages from remote
        repositories that will be downloaded. Packages from commandline are skipped.

        :param install_pkgs: list of packages
        :return: list of remote pkgs
        cSsxd}xn|D]f}d}y|j�}Wn0tk
rN}ztjt|��WYdd}~XnX|dk	r
tj|j||j��d}q
W|S)NTF)ZverifyLocalPkgrLrMr�r�rOrm)Zpkg_listZ
logger_msgZall_packages_verifiedr�Zpkg_successfully_verifiedrUr5r5r6�_verification_of_packages]
s
 z;Base._select_remote_pkgs.<locals>._verification_of_packagesz>Package "{}" from local repository "{}" has incorrect checksumz;Some packages from local repository have incorrect checksumz8Package "{}" from repository "{}" has incorrect checksumzVSome packages have invalid cache, but cannot be downloaded due to "--cacheonly" option)Z
_is_local_pkgrmrKZCMDLINE_REPO_NAMErwrrrQrcr4r�)r3Zinstall_pkgsrMryZlocal_repository_pkgsr�r�r5r5r6rtV
s&




zBase._select_remote_pkgscCsx|D]}t|�qWdS)N)�_msg_installed)r3r�r�r5r5r6r��
s
zBase._report_already_installedc	Cs�|jjtjd�}tjj|�}|j|j|d|d�}|dk	rH|dj|d�|dsdtj	j
td�|��nB|jjtjd�}|dj
|�}|r�td�}ntd�}tj	j
||��dS)	N)r�F)rrrqrq)rmzNo match for argumentz?All matches were filtered out by exclude filtering for argumentz?All matches were filtered out by modular filtering for argument)r`rqrKr�rrsrtrrrrQrrZIGNORE_REGULAR_EXCLUDESr�)	r3rrrm�	all_queryrsrZwith_regular_queryr�r5r5r6r�
s
z#Base._raise_package_not_found_errorc	s��jjtjd�j�}tjj|�}|j�j|d|d�}|dsNtj	j
td�|���dk	rp��fdd�|dD�}n|d}|s�td�}ntd	�}tj	j
||��dS)
N)r�F)rrrqrqzNo match for argumentcs g|]}�jj|��kr|�qSr5)r�rS)rZr�)rmr3r5r6r\�
sz;Base._raise_package_not_installed_error.<locals>.<listcomp>zCAll matches were installed from a different repository for argumentz?All matches were filtered out by exclude filtering for argument)r`rqrKr�r�rrsrtrrQr!r)	r3rrrmrOrsrr�r�r5)rmr3r6r(�
s
z'Base._raise_package_not_installed_errorcCs|jj|jdd�dS)z�
        Setup DNF file loggers based on given configuration file. The loggers are set the same
        way as if DNF was run from CLI.
        T)Zfile_loggers_onlyN)r$Z_setup_from_dnf_confr4)r3r5r5r6�
setup_loggers�
szBase.setup_loggerscs�|jjtjtjBtjB@r d}nd}t|j�}|j|dd�}|jf|�}|rl|rlt	j
j|j��}t
j|�t|jdd��}t|jdd��|}	dd���fdd�|D��t��fd	d�|D��}
t��fd
d�|	D��}|
|fS)z�returns set of conflicting packages and set of packages with broken dependency that would
        be additionally installed when --best and --allowerasingTF)rrZignore_weak)rcSstj|j|j|j|j|jd�S)N)r��epoch�version�releaser�)rKZNEVRAr�rQrRrSr�)�itemr5r5r6r�
sz&Base._skipped_packages.<locals>._nevracsg|]}�|��qSr5r5)rZr*)rr5r6r\�
sz*Base._skipped_packages.<locals>.<listcomp>csg|]}�|��kr|�qSr5r5)rZr�)r�transaction_nevrasr5r6r\�
scsg|]}�|��kr|�qSr5r5)rZr�)rrUr5r6r\�
s)rZactionsrK�INSTALLZUPGRADEZUPGRADE_ALLrr0rrr�rrrMrdr�problem_conflictsZproblem_broken_dependency)r3Zreport_problemsr�rZngZparamsrr�rWZproblem_dependencyZskipped_conflictsZskipped_dependencyr5)rrUr6�_skipped_packages�
s(


zBase._skipped_packages)N)F)F)TT)T)N)FFF)F)F)N)T)NN)TN)rhNNFN)N)NTN)NT)TNN)NT)T)NNTN)NTN)F)T)N)N)N)N)NNNN)NN)NNNF)F)NTF)NN)NN)NrI)N)��__name__�
__module__�__qualname__r7r8r:r;r?rV�staticmethodrrgrr��propertyrr4r^�deleterrr�Zlazyattrr�r`rar��setterr�r�r�r�r�r�r�r�r�r9r�r�r�r(ZRPMTRANS_FLAG_NOSCRIPTSZRPMTRANS_FLAG_NOTRIGGERSZRPMTRANS_FLAG_NODOCSr:ZRPMTRANS_FLAG_JUSTDBZRPMTRANS_FLAG_NOCONTEXTSZRPMTRANS_FLAG_NOFILEDIGESTr�r/r�r�r�r�r�r�r�r�r�rr	r
rrr!rAr8rLr;rUrqrzr~r�r�r�r�r�r�r�rrr�r�r�r�r�r�r�r�r�r�r�r�r�r�rr�r�rrrrrrr�r#r%r�r�r+r-r,r�r9r:r�rFrGr4rKr�rLrtr�rr(rPrXr5r5r5r6r[s�
	
=
	
8
;
>

=


	
'\
8l"]
*
B

;

)
=

*



/


%
&


#&
%)-rcCs t|�}td�}tj||�dS)Nz Package %s is already installed.)r
rrMr�)r�r�r�r5r5r6rN�
srN)H�__doc__Z
__future__rrrrr�rZlibdnf.transactionr�rxrZ	dnf.compsrZdnf.i18nrr	r
Zdnf.utilrZdnf.db.historyrZdnf.yumr
�collections.abcr�ImportError�collectionsr�Zdnf.callbackZdnf.confZ
dnf.conf.readZ
dnf.cryptoZ
dnf.dnssecZdnf.drpmZdnf.exceptionsZdnf.goalZdnf.historyZdnf.lockZdnf.loggingZdnf.module.module_baseroZ
dnf.persistorZ
dnf.pluginZ	dnf.queryZdnf.repoZdnf.repodictZdnf.rpm.connectionZdnf.rpm.miscutilsZdnf.rpm.transactionZdnf.sackZdnf.selectorZdnf.subjectZdnf.transactionZdnf.yum.rpmtransr�r�rKr�r#rHr�r�rEr(r�rwZ	getLoggerrM�objectrrNr5r5r5r6�<module>s�

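The recoverable docstrings above all belong to the public dnf.Base API (marking packages, groups and environments, resolving, and running the transaction). As a rough orientation, the usual way this API is driven from Python is sketched below; the workflow (read_all_repos, fill_sack, install, resolve, do_transaction) is the conventional dnf usage and is not itself visible in this bytecode dump, so treat it as an illustrative sketch rather than a guaranteed reflection of this exact build.

# Illustrative sketch only -- standard dnf.Base workflow (requires the dnf Python
# bindings, and root privileges for do_transaction()).
import dnf

base = dnf.Base()
base.read_all_repos()                 # load the .repo configuration
base.fill_sack()                      # build the sack of installed + available packages

base.install("curl")                  # Base.install(): mark package(s) given by pkg_spec for installation
base.upgrade_all()                    # mark every installed package for upgrade
base.resolve()                        # run the depsolver

base.download_packages(base.transaction.install_set)
base.do_transaction()                 # execute the prepared transaction
base.close()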
site-packages/dnf/__pycache__/comps.cpython-36.pyc

[Binary data: compiled CPython 3.6 bytecode for dnf/comps.py. Not human-readable; the recoverable docstrings and symbols show this is the comps (package group) layer: classes _Langs ("Get all usable abbreviations for the current language"), CompsQuery, Forwarder, Category, Environment, Group, Package ("Represents comps package data. :api"), Comps (with group_by_pattern(), groups_by_pattern(), environment_by_pattern(), categories_iter() and related accessors), CompsTransPkg, TransactionBunch and Solver (group/environment install, remove and upgrade resolution), plus helpers such as _by_pattern() ("Return items from sqn matching either exactly or glob-wise") and install_or_skip().]
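The comps objects above are normally reached through dnf.Base rather than constructed directly. The following is a hedged sketch of typical group inspection, assuming the conventional Base.read_comps() entry point together with the groups_by_pattern() and mandatory_packages members named in the recoverable strings; the group pattern itself is an arbitrary example.

# Illustrative sketch only -- inspecting comps groups through dnf.Base.
import dnf

base = dnf.Base()
base.read_all_repos()
base.fill_sack()
base.read_comps(arch_filter=True)     # populate base.comps from repository group metadata

for group in base.comps.groups_by_pattern("Development*"):
    names = [pkg.name for pkg in group.mandatory_packages]
    print(group.id, group.ui_name, "->", ", ".join(names))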
site-packages/dnf/__pycache__/plugin.cpython-36.opt-1.pyc

[Binary data: compiled CPython 3.6 bytecode for dnf/plugin.py. Not human-readable; the recoverable docstrings cover the Plugin base class ("The base class custom plugins must derive from. #:api") with its read_config() helper and the pre_config / config / resolved / sack / pre_transaction / transaction hooks, the Plugins loader ("Dynamically load relevant plugin modules", "Checks whether plugins are enabled or disabled in configuration files and removes disabled plugins from list", "Unload plugins that were removed in the transaction"), _plugin_name_matches_pattern() ("Checks plugin name matches the pattern") and the register_command() decorator ("A class decorator for automatic command registration").]
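The hook names recovered above (pre_config, config, resolved, sack, pre_transaction, transaction) are the standard DNF plugin callbacks. A minimal plugin built on that base class would look roughly like the sketch below; the plugin name, the printed messages and the config handling are illustrative assumptions, only the hook names come from the dump.

# Illustrative sketch only -- a minimal plugin deriving from the Plugin base class described above.
# Drop the file into a directory on pluginpath (e.g. .../site-packages/dnf-plugins/).
import dnf


class HelloWorld(dnf.Plugin):

    name = "hello-world"              # also selects <pluginconfpath>/hello-world.conf for read_config()

    def __init__(self, base, cli):
        super(HelloWorld, self).__init__(base, cli)

    def config(self):                 # called once configuration has been read
        self._conf = self.read_config(self.base.conf)

    def resolved(self):               # called after the transaction has been depsolved
        print("hello-world: transaction resolved")

    def transaction(self):            # called after the transaction has finished
        print("hello-world: transaction finished")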
site-packages/dnf/__pycache__/sack.cpython-36.pyc

[Binary data: compiled CPython 3.6 bytecode for dnf/sack.py. Not human-readable; the recoverable text shows a Sack class derived from the hawkey sack with _configure(installonly, installonly_limit, allow_vendor_change) (including the warning that allow_vendor_change is not supported for downgrade and distro-sync) and query() ("Factory function returning a DNF Query"), the _build_sack() and _rpmdb_sack() helpers, and rpmdb_sack() ("Returns a new instance of sack containing only installed packages (@System repo). Useful to get list of the installed RPMs after transaction.").]
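A short, hedged sketch of how the sack and its query factory are typically used; the package name and filters are illustrative, and rpmdb_sack() is taken from the docstring recovered above.

# Illustrative sketch only -- querying the sack built by Base.fill_sack().
import dnf
import dnf.sack

base = dnf.Base()
base.read_all_repos()
base.fill_sack()

q = base.sack.query()                     # Sack.query(): factory returning a DNF Query
for pkg in q.installed().filter(name="bash"):
    print(pkg.name, pkg.evr, pkg.arch)

system_only = dnf.sack.rpmdb_sack(base)   # sack limited to the installed (@System) packages
print(len(system_only.query().installed()))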
site-packages/dnf/__pycache__/transaction_sr.cpython-36.pyc

[Binary data: compiled CPython 3.6 bytecode for dnf/transaction_sr.py (transaction store/replay). Not human-readable; the recoverable docstrings cover TransactionError, TransactionReplayError and IncompatibleTransactionVersionError, _check_version(), serialize_transaction() ("Serializes a transaction to a data structure that is equivalent to the stored JSON format") and the TransactionReplay class ("A class that encapsulates replaying a transaction. The transaction data are loaded and stored when the class is initialized. The transaction is run by calling the run() method; after the transaction is created, the post_transaction() method needs to be called to verify no extra packages were pulled in and also to fix the reasons"), including its filename/data, ignore_extras, ignore_installed and skip_unavailable options.]

site-packages/dnf/__pycache__/util.cpython-36.pyc

[Binary data: compiled CPython 3.6 bytecode for dnf/util.py. Not human-readable; the recoverable docstrings name helper functions such as _parse_specs() ("Categorize :param values list into packages, groups and filenames"), _urlopen() ("Open the specified absolute url, return a file object which respects proxy setting even for non-repo downloads"), clear_dir(), ensure_dir(), split_path(), first() ("Returns the first item from an iterable or None if it has no elements"), lazyattr() ("Decorator to get lazy attribute initialization"), normalize_time(), on_ac_power() ("Decide whether we are on line power"), on_metered_connection() ("Decide whether we are on metered connection"), partition(), split_by() ("Split an iterable into tuples by a condition"), touch(), _format_resolve_problems(), _log_rpm_trans_with_swdb() and _sync_rpm_trans_with_swdb().]
cached_tsiZel_not_foundr-r�Zte_nevrar�Z
tsi_candidaterrr�_sync_rpm_trans_with_swdb�sH





r�c@s$eZdZdd�Zdd�Zdd�ZdS)�tmpdircCsdtjj}tj|d�|_dS)Nz%s-)r�)r
�constZPREFIXr6Zmkdtempr#)�selfr�rrr�__init__�sztmpdir.__init__cCs|jS)N)r#)r�rrr�	__enter__�sztmpdir.__enter__cCst|j�dS)N)rDr#)r��exc_type�	exc_value�	tracebackrrr�__exit__�sztmpdir.__exit__N)r��
__module__�__qualname__r�r�r�rrrrr��sr�cs(eZdZdZ�fdd�Zdd�Z�ZS)�Bunchz�Dictionary with attribute accessing syntax.

    In DNF, prefer using this over dnf.yum.misc.GenericHolder.

    Credit: Alex Martelli, Doug Hudgeon

    cstt|�j||�||_dS)N)�superr�r��__dict__)r��args�kwds)�	__class__rrr��szBunch.__init__cCst|�S)N)�id)r�rrr�__hash__szBunch.__hash__)r�r�r��__doc__r�r��
__classcell__rr)r�rr��sr�cs,eZdZ�fdd�Zdd�Zdd�Z�ZS)�
MultiCallListcstt|�j�|j|�dS)N)r�r�r��extend)r�rT)r�rrr�szMultiCallList.__init__cs��fdd�}|S)Ncs���fdd�}tt|���S)Ncst|��}|���S)N)r~)�v�method)r�r:�whatrr�	call_what
s
z8MultiCallList.__getattr__.<locals>.fn.<locals>.call_what)rSr�)r�r:r�)r�r�)r�r:rrbsz%MultiCallList.__getattr__.<locals>.fnr)r�r�rbr)r�r�r�__getattr__szMultiCallList.__getattr__cs��fdd�}tt||��S)Ncst|���dS)N)r)r])r�r�rr�settersz)MultiCallList.__setattr__.<locals>.setter)rSr�)r�r�r�r�r)r�r�r�__setattr__szMultiCallList.__setattr__)r�r�r�r�r�r�r�rr)r�rr�sr�c
Csntgggggggggggd��}�xF|D�]<}|jtjjkrJ|jj|�q(|jtjjkrf|j	j|�q(|jtjj
kr�|jtjjkr�|j
j|�nD|jtjjkr�|jj|�n(|jtjjkr�|jj|�n|jj|�q(|jtjjkr�|jj|�q(|jtjjk�rL|jtjjk�r |jj|�n*|jtjjk�r>|jj|�n|jj|�q(|jtjjkr(|jj|�q(W|S)N)�
downgraded�erased�erased_clean�
erased_dep�	installed�installed_group�
installed_dep�installed_weak�reinstalled�upgraded�failed)r�r�r'r�r�r�rr�ZTransactionItemAction_DOWNGRADEr�ZTransactionItemAction_INSTALLr�ZTransactionItemReason_GROUPr�Z TransactionItemReason_DEPENDENCYr�Z%TransactionItemReason_WEAK_DEPENDENCYr�r�ZTransactionItemAction_REINSTALLr�r�ZTransactionItemReason_CLEANr�r�r�ZTransactionItemAction_UPGRADEr�)r�r3r�rrr�_make_listssH
rcs��fdd�}tjj|�}�jd|d�\}}|j|�}g}x�td�|jftd�|jftd�|j|j	|j
|jftd�|jftd	�|ftd
�|j
|j|jftd�|jfgD]&\}	}
|j||	t|
tj|�d���q�W|S)
alReturns a human-readable summary of the results of the
    transaction.

    :param action_callback: function generating output for specific action. It
       takes two parameters - action as a string and list of affected packages for
       this action
    :return: a list of lines containing a human-readable summary of the
       results of the transaction
    cs�|j|jk|j|jk}|dkr$|Stj|j|j|j|j|jd�}tj|j|j|j|j|jd�}|j|�j�}|dkrz|S|j|jk|j|jkS)z�Compares two transaction items or packages by nevra.
           Used as a fallback when tsi does not contain package object.
        r)�name�epoch�version�release�arch)	r�hawkeyZNEVRArrrrZevr_cmpZsack)Zitem1Zitem2�retZnevra1Znevra2)�baserr�_tsi_or_pkg_nevra_cmpPsz7_post_transaction_output.<locals>._tsi_or_pkg_nevra_cmpF)Zreport_problemsr�ZUpgradedZ
DowngradedZ	InstalledZReinstalledZSkippedZRemovedr�)�key)r
�utilrZ_skipped_packages�unionrr�r�r�r�r�r�r�r�r�r�r�r��sortedrp�
cmp_to_key)rr�Zaction_callbackr	Z
list_bunchZskipped_conflictsZskipped_brokenZskippedr�r�Ztsisr)rr�_post_transaction_outputFs(



r)N)NNr2)F)QZ
__future__rrrrrrZdnf.i18nrr	�argparser
Zdnf.callbackZ	dnf.constZ
dnf.pycomprIrprr�r�Zloggingr"rfr��sysr6r`Zlibdnf.repor'Zlibdnf.transactionZ	getLoggerr,�ArgumentParser�progZ	MAIN_PROG�upperZMAIN_PROG_UPPERrr1r;r?rArFrKrQrVr[r_rcrarirmrrrtrvr|rxr�r�r�r�r�r�rDr�r�r��stdoutr�r�r�r�r�r�r��dictr�rSr�rrrrrr�<module>s�
 


						
(-site-packages/dnf/__pycache__/selector.cpython-36.opt-1.pyc000064400000000361147511334650017461 0ustar003

�ft`e�@s(ddlmZddlmZddlmZdS)�)�absolute_import)�unicode_literals)�SelectorN)Z
__future__rrZhawkeyr�rr�/usr/lib/python3.6/selector.py�<module>ssite-packages/dnf/crypto.py000064400000013753147511334650011747 0ustar00# crypto.py
# Keys and signatures.
#
# Copyright (C) 2014  Red Hat, Inc.
#
# This copyrighted material is made available to anyone wishing to use,
# modify, copy, or redistribute it subject to the terms and conditions of
# the GNU General Public License v.2, or (at your option) any later version.
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY expressed or implied, including the implied warranties of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General
# Public License for more details.  You should have received a copy of the
# GNU General Public License along with this program; if not, write to the
# Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
# 02110-1301, USA.  Any Red Hat trademarks that are incorporated in the
# source code or documentation are not subject to the GNU General Public
# License and may only be used or replicated with the express permission of
# Red Hat, Inc.
#

from __future__ import print_function
from __future__ import absolute_import
from __future__ import unicode_literals
from dnf.i18n import _
import contextlib
import dnf.pycomp
import dnf.util
import dnf.yum.misc
import io
import logging
import os
import tempfile

try:
    from gpg import Context
    from gpg import Data
except ImportError:
    import gpgme


    class Context(object):
        def __init__(self):
            self.__dict__["ctx"] = gpgme.Context()

        def __enter__(self):
            return self

        def __exit__(self, type, value, tb):
            pass

        @property
        def armor(self):
            return self.ctx.armor

        @armor.setter
        def armor(self, value):
            self.ctx.armor = value

        def op_import(self, key_fo):
            if isinstance(key_fo, basestring):
                key_fo = io.BytesIO(key_fo)
            self.ctx.import_(key_fo)

        def op_export(self, pattern, mode, keydata):
            self.ctx.export(pattern, keydata)

        def __getattr__(self, name):
            return getattr(self.ctx, name)


    class Data(object):
        def __init__(self):
            self.__dict__["buf"] = io.BytesIO()

        def __enter__(self):
            return self

        def __exit__(self, type, value, tb):
            pass

        def read(self):
            return self.buf.getvalue()

        def __getattr__(self, name):
            return getattr(self.buf, name)
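# NOTE: the Context and Data classes above are a small compatibility shim --
# when the newer `gpg` bindings cannot be imported, they emulate just the
# pieces of the gpg API this module uses (armor, op_import, op_export, read)
# on top of the older `gpgme` bindings.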


GPG_HOME_ENV = 'GNUPGHOME'
logger = logging.getLogger('dnf')


def _extract_signing_subkey(key):
    return dnf.util.first(subkey for subkey in key.subkeys if subkey.can_sign)


def _printable_fingerprint(fpr_hex):
    segments = (fpr_hex[i:i + 4] for i in range(0, len(fpr_hex), 4))
    return " ".join(segments)


def import_repo_keys(repo):
    gpgdir = repo._pubring_dir
    known_keys = keyids_from_pubring(gpgdir)
    for keyurl in repo.gpgkey:
        for keyinfo in retrieve(keyurl, repo):
            keyid = keyinfo.id_
            if keyid in known_keys:
                logger.debug(_('repo %s: 0x%s already imported'), repo.id, keyid)
                continue
            if not repo._key_import._confirm(keyinfo):
                continue
            dnf.yum.misc.import_key_to_pubring(
                keyinfo.raw_key, keyinfo.short_id, gpgdir=gpgdir,
                make_ro_copy=False)
            logger.debug(_('repo %s: imported key 0x%s.'), repo.id, keyid)


def keyids_from_pubring(gpgdir):
    if not os.path.exists(gpgdir):
        return []

    with pubring_dir(gpgdir), Context() as ctx:
        keyids = []
        for k in ctx.keylist():
            subkey = _extract_signing_subkey(k)
            if subkey is not None:
                keyids.append(subkey.keyid)
        return keyids


def log_key_import(keyinfo):
    msg = (_('Importing GPG key 0x%s:\n'
             ' Userid     : "%s"\n'
             ' Fingerprint: %s\n'
             ' From       : %s') %
           (keyinfo.short_id, keyinfo.userid,
            _printable_fingerprint(keyinfo.fingerprint),
            keyinfo.url.replace("file://", "")))
    logger.critical("%s", msg)


def log_dns_key_import(keyinfo, dns_result):
    log_key_import(keyinfo)
    if dns_result == dnf.dnssec.Validity.VALID:
        logger.critical(_('Verified using DNS record with DNSSEC signature.'))
    else:
        logger.critical(_('NOT verified using DNS record.'))

@contextlib.contextmanager
def pubring_dir(pubring_dir):
    orig = os.environ.get(GPG_HOME_ENV, None)
    os.environ[GPG_HOME_ENV] = pubring_dir
    try:
        yield
    finally:
        if orig is None:
            del os.environ[GPG_HOME_ENV]
        else:
            os.environ[GPG_HOME_ENV] = orig
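# Usage sketch (illustrative only; the directory name is hypothetical). This is
# the pattern keyids_from_pubring() and rawkey2infos() below rely on -- point
# GNUPGHOME at a private keyring directory for the duration of a block:
#
#     with pubring_dir('/var/tmp/my-pubring'), Context() as ctx:
#         keyids = [k.subkeys[0].keyid for k in ctx.keylist()]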


def rawkey2infos(key_fo):
    pb_dir = tempfile.mkdtemp()
    keyinfos = []
    with pubring_dir(pb_dir), Context() as ctx:
        ctx.op_import(key_fo)
        for key in ctx.keylist():
            subkey = _extract_signing_subkey(key)
            if subkey is None:
                continue
            keyinfos.append(Key(key, subkey))
        ctx.armor = True
        for info in keyinfos:
            with Data() as sink:
                ctx.op_export(info.id_, 0, sink)
                sink.seek(0, os.SEEK_SET)
                info.raw_key = sink.read()
    dnf.util.rm_rf(pb_dir)
    return keyinfos


def retrieve(keyurl, repo=None):
    if keyurl.startswith('http:'):
        logger.warning(_("retrieving repo key for %s unencrypted from %s"), repo.id, keyurl)
    with dnf.util._urlopen(keyurl, repo=repo) as handle:
        keyinfos = rawkey2infos(handle)
    for keyinfo in keyinfos:
        keyinfo.url = keyurl
    return keyinfos


class Key(object):
    def __init__(self, key, subkey):
        self.id_ = subkey.keyid
        self.fingerprint = subkey.fpr
        self.raw_key = None
        self.timestamp = subkey.timestamp
        self.url = None
        self.userid = key.uids[0].uid

    @property
    def short_id(self):
        rj = '0' if dnf.pycomp.PY3 else b'0'
        return self.id_[-8:].rjust(8, rj)

    @property
    def rpm_id(self):
        return self.short_id.lower()
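# Illustrative example (assumed key id, not from the original source): for a
# subkey id "0123456789ABCDEF", short_id is the last eight hex digits,
# "89ABCDEF", and rpm_id is the lower-cased form "89abcdef" -- the id rpm uses
# in gpg-pubkey package names.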
site-packages/dnf/module/__pycache__/__init__.cpython-36.pyc
[binary data omitted: compiled CPython 3.6 bytecode (.pyc) entry]

site-packages/dnf/module/__pycache__/exceptions.cpython-36.pyc
[binary data omitted: compiled CPython 3.6 bytecode (.pyc) entry]

site-packages/dnf/module/__pycache__/exceptions.cpython-36.opt-1.pyc
[binary data omitted: compiled CPython 3.6 bytecode (.pyc) entry]

site-packages/dnf/module/__pycache__/module_base.cpython-36.pyc
[binary data omitted: compiled CPython 3.6 bytecode (.pyc) entry]

site-packages/dnf/module/__pycache__/__init__.cpython-36.opt-1.pyc
[binary data omitted: compiled CPython 3.6 bytecode (.pyc) entry]

site-packages/dnf/module/__pycache__/module_base.cpython-36.opt-1.pyc
[binary data omitted: compiled CPython 3.6 bytecode (.pyc) entry]

site-packages/dnf/module/module_base.py
# Copyright (C) 2017-2018  Red Hat, Inc.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU Library General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.

from collections import OrderedDict

import hawkey
import libdnf.smartcols
import libdnf.module
import dnf.selector
import dnf.exceptions

from dnf.module.exceptions import EnableMultipleStreamsException
from dnf.util import logger
from dnf.i18n import _, P_, ucd

import functools

STATE_DEFAULT = libdnf.module.ModulePackageContainer.ModuleState_DEFAULT
STATE_ENABLED = libdnf.module.ModulePackageContainer.ModuleState_ENABLED
STATE_DISABLED = libdnf.module.ModulePackageContainer.ModuleState_DISABLED
STATE_UNKNOWN = libdnf.module.ModulePackageContainer.ModuleState_UNKNOWN
MODULE_TABLE_HINT = _("\n\nHint: [d]efault, [e]nabled, [x]disabled, [i]nstalled")
MODULE_INFO_TABLE_HINT = _("\n\nHint: [d]efault, [e]nabled, [x]disabled, [i]nstalled, [a]ctive")


def _profile_comparison_key(profile):
    return profile.getName()


class ModuleBase(object):
    # :api

    def __init__(self, base):
        # :api
        self.base = base

    def enable(self, module_specs):
        # :api
        no_match_specs, error_specs, solver_errors, module_dicts = \
            self._resolve_specs_enable_update_sack(module_specs)
        for spec, (nsvcap, module_dict) in module_dicts.items():
            if nsvcap.profile:
                logger.info(_("Ignoring unnecessary profile: '{}/{}'").format(
                    nsvcap.name, nsvcap.profile))
        if no_match_specs or error_specs or solver_errors:
            raise dnf.exceptions.MarkingErrors(no_match_group_specs=no_match_specs,
                                               error_group_specs=error_specs,
                                               module_depsolv_errors=solver_errors)
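    # Usage sketch (illustrative only, not part of this module; the module and
    # stream names are hypothetical). The ":api" methods take module spec
    # strings such as "name", "name:stream" or "name:stream/profile":
    #
    #     base = dnf.Base()
    #     base.read_all_repos()
    #     base.fill_sack()
    #     module_base = dnf.module.module_base.ModuleBase(base)
    #     module_base.enable(["nodejs:12"])
    #     base.resolve()
    #     base.do_transaction()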

    def disable(self, module_specs):
        # :api
        no_match_specs, solver_errors = self._modules_reset_or_disable(module_specs, STATE_DISABLED)
        if no_match_specs or solver_errors:
            raise dnf.exceptions.MarkingErrors(no_match_group_specs=no_match_specs,
                                               module_depsolv_errors=solver_errors)

    def install(self, module_specs, strict=True):
        # :api
        no_match_specs, error_specs, solver_errors, module_dicts = \
            self._resolve_specs_enable_update_sack(module_specs)

        # <package_name, set_of_spec>
        fail_safe_repo = hawkey.MODULE_FAIL_SAFE_REPO_NAME
        install_dict = {}
        install_set_artifacts = set()
        fail_safe_repo_used = False
        for spec, (nsvcap, moduledict) in module_dicts.items():
            for name, streamdict in moduledict.items():
                for stream, module_list in streamdict.items():
                    install_module_list = [x for x in module_list
                                           if self.base._moduleContainer.isModuleActive(x.getId())]
                    if not install_module_list:
                        logger.error(_("All matches for argument '{0}' in module '{1}:{2}' are not "
                                       "active").format(spec, name, stream))
                        error_specs.append(spec)
                        continue
                    profiles = []
                    latest_module = self._get_latest(install_module_list)
                    if latest_module.getRepoID() == fail_safe_repo:
                        msg = _(
                            "Installing module '{0}' from Fail-Safe repository {1} is not allowed")
                        logger.critical(msg.format(latest_module.getNameStream(), fail_safe_repo))
                        fail_safe_repo_used = True
                    if nsvcap.profile:
                        profiles.extend(latest_module.getProfiles(nsvcap.profile))
                        if not profiles:
                            available_profiles = latest_module.getProfiles()
                            if available_profiles:
                                profile_names = ", ".join(sorted(
                                    [profile.getName() for profile in available_profiles]))
                                msg = _("Unable to match profile for argument {}. Available "
                                        "profiles for '{}:{}': {}").format(
                                    spec, name, stream, profile_names)
                            else:
                                msg = _("Unable to match profile for argument {}").format(spec)
                            logger.error(msg)
                            no_match_specs.append(spec)
                            continue
                    else:
                        profiles_strings = self.base._moduleContainer.getDefaultProfiles(
                            name, stream)
                        if not profiles_strings:
                            available_profiles = latest_module.getProfiles()
                            if available_profiles:
                                profile_names = ", ".join(sorted(
                                    [profile.getName() for profile in available_profiles]))
                                msg = _("No default profiles for module {}:{}. Available profiles"
                                        ": {}").format(
                                    name, stream, profile_names)
                            else:
                                msg = _("No profiles for module {}:{}").format(name, stream)
                            logger.error(msg)
                            error_specs.append(spec)
                        for profile in set(profiles_strings):
                            module_profiles = latest_module.getProfiles(profile)
                            if not module_profiles:
                                logger.error(
                                    _("Default profile {} not available in module {}:{}").format(
                                        profile, name, stream))
                                error_specs.append(spec)

                            profiles.extend(module_profiles)
                    for profile in profiles:
                        self.base._moduleContainer.install(latest_module, profile.getName())
                        for pkg_name in profile.getContent():
                            install_dict.setdefault(pkg_name, set()).add(spec)
                    for module in install_module_list:
                        install_set_artifacts.update(module.getArtifacts())
        if fail_safe_repo_used:
            raise dnf.exceptions.Error(_(
                "Installing module from Fail-Safe repository is not allowed"))
        __, profiles_errors = self._install_profiles_internal(
            install_set_artifacts, install_dict, strict)
        if profiles_errors:
            error_specs.extend(profiles_errors)

        if no_match_specs or error_specs or solver_errors:
            raise dnf.exceptions.MarkingErrors(no_match_group_specs=no_match_specs,
                                               error_group_specs=error_specs,
                                               module_depsolv_errors=solver_errors)
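    # Summary (descriptive note, not part of the original source): install()
    # first enables the requested streams via _resolve_specs_enable_update_sack(),
    # then marks the packages of the requested (or default) profiles for
    # installation, and reports unmatched or failed specs through
    # dnf.exceptions.MarkingErrors.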

    def switch_to(self, module_specs, strict=True):
        # :api
        no_match_specs, error_specs, module_dicts = self._resolve_specs_enable(module_specs)
        # collect names of artifacts from new modules for distrosync
        new_artifacts_names = set()
        # collect names of artifacts from active modules for distrosync, before the sack update
        active_artifacts_names = set()
        src_arches = {"nosrc", "src"}
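        # Module artifacts are full NEVRA strings.  For a hypothetical artifact
        # such as "nodejs-1:10.21.0-3.module_el8.x86_64", the text after the
        # last "." is the arch and the text before the last two "-" separators
        # is the package name ("nodejs").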
        for spec, (nsvcap, moduledict) in module_dicts.items():
            for name in moduledict.keys():
                for module in self.base._moduleContainer.query(name, "", "", "", ""):
                    if self.base._moduleContainer.isModuleActive(module):
                        for artifact in module.getArtifacts():
                            arch = artifact.rsplit(".", 1)[1]
                            if arch in src_arches:
                                continue
                            pkg_name = artifact.rsplit("-", 2)[0]
                            active_artifacts_names.add(pkg_name)

        solver_errors = self._update_sack()

        dependency_error_spec = self._enable_dependencies(module_dicts)
        if dependency_error_spec:
            error_specs.extend(dependency_error_spec)

        # <package_name, set_of_spec>
        fail_safe_repo = hawkey.MODULE_FAIL_SAFE_REPO_NAME
        install_dict = {}
        install_set_artifacts = set()
        fail_safe_repo_used = False

        # list of name: [profiles] for module profiles being removed
        removed_profiles = self.base._moduleContainer.getRemovedProfiles()

        for spec, (nsvcap, moduledict) in module_dicts.items():
            for name, streamdict in moduledict.items():
                for stream, module_list in streamdict.items():
                    install_module_list = [x for x in module_list
                                           if self.base._moduleContainer.isModuleActive(x.getId())]
                    if not install_module_list:
                        "No active matches for argument '{0}' in module '{1}:{2}'"
                        logger.error(_("No active matches for argument '{0}' in module "
                                       "'{1}:{2}'").format(spec, name, stream))
                        error_specs.append(spec)
                        continue
                    profiles = []
                    latest_module = self._get_latest(install_module_list)
                    if latest_module.getRepoID() == fail_safe_repo:
                        msg = _(
                            "Installing module '{0}' from Fail-Safe repository {1} is not allowed")
                        logger.critical(msg.format(latest_module.getNameStream(), fail_safe_repo))
                        fail_safe_repo_used = True
                    if nsvcap.profile:
                        profiles.extend(latest_module.getProfiles(nsvcap.profile))
                        if not profiles:
                            available_profiles = latest_module.getProfiles()
                            if available_profiles:
                                profile_names = ", ".join(sorted(
                                    [profile.getName() for profile in available_profiles]))
                                msg = _("Unable to match profile for argument {}. Available "
                                        "profiles for '{}:{}': {}").format(
                                    spec, name, stream, profile_names)
                            else:
                                msg = _("Unable to match profile for argument {}").format(spec)
                            logger.error(msg)
                            no_match_specs.append(spec)
                            continue
                    elif name in removed_profiles:
                        for profile in removed_profiles[name]:
                            module_profiles = latest_module.getProfiles(profile)
                            if not module_profiles:
                                logger.warning(
                                    _("Installed profile '{0}' is not available in module "
                                      "'{1}' stream '{2}'").format(profile, name, stream))
                                continue
                            profiles.extend(module_profiles)
                    for profile in profiles:
                        self.base._moduleContainer.install(latest_module, profile.getName())
                        for pkg_name in profile.getContent():
                            install_dict.setdefault(pkg_name, set()).add(spec)
                    for module in install_module_list:
                        artifacts = module.getArtifacts()
                        install_set_artifacts.update(artifacts)
                        for artifact in artifacts:
                            arch = artifact.rsplit(".", 1)[1]
                            if arch in src_arches:
                                continue
                            pkg_name = artifact.rsplit("-", 2)[0]
                            new_artifacts_names.add(pkg_name)
        if fail_safe_repo_used:
            raise dnf.exceptions.Error(_(
                "Installing module from Fail-Safe repository is not allowed"))
        install_base_query, profiles_errors = self._install_profiles_internal(
            install_set_artifacts, install_dict, strict)
        if profiles_errors:
            error_specs.extend(profiles_errors)

        # distro-sync packages belonging to the newly selected and previously active modules
        all_names = set()
        all_names.update(new_artifacts_names)
        all_names.update(active_artifacts_names)
        remove_query = self.base.sack.query().filterm(empty=True)
        base_no_source_query = self.base.sack.query().filterm(arch__neq=['src', 'nosrc']).apply()

        for pkg_name in all_names:
            query = base_no_source_query.filter(name=pkg_name)
            installed = query.installed()
            if not installed:
                continue
            available = query.available()
            if not available:
                logger.warning(_("No packages available to distrosync for package name "
                                 "'{}'").format(pkg_name))
                if pkg_name not in new_artifacts_names:
                    remove_query = remove_query.union(query)
                continue

            only_new_module = query.intersection(install_base_query)
            if only_new_module:
                query = only_new_module
            sltr = dnf.selector.Selector(self.base.sack)
            sltr.set(pkg=query)
            self.base._goal.distupgrade(select=sltr)
        self.base._remove_if_unneeded(remove_query)

        if no_match_specs or error_specs or solver_errors:
            raise dnf.exceptions.MarkingErrors(no_match_group_specs=no_match_specs,
                                               error_group_specs=error_specs,
                                               module_depsolv_errors=solver_errors)

    def reset(self, module_specs):
        # :api
        no_match_specs, solver_errors = self._modules_reset_or_disable(module_specs, STATE_UNKNOWN)
        if no_match_specs:
            raise dnf.exceptions.MarkingErrors(no_match_group_specs=no_match_specs,
                                               module_depsolv_errors=solver_errors)

    def upgrade(self, module_specs):
        # :api
        no_match_specs = []
        fail_safe_repo = hawkey.MODULE_FAIL_SAFE_REPO_NAME
        fail_safe_repo_used = False

        #  Remove source packages because they cannot be installed or upgraded
        base_no_source_query = self.base.sack.query().filterm(arch__neq=['src', 'nosrc']).apply()

        for spec in module_specs:
            module_list, nsvcap = self._get_modules(spec)
            if not module_list:
                no_match_specs.append(spec)
                continue
            update_module_list = [x for x in module_list
                                  if self.base._moduleContainer.isModuleActive(x.getId())]
            if not update_module_list:
                logger.error(_("Unable to resolve argument {}").format(spec))
                continue
            module_dict = self._create_module_dict_and_enable(update_module_list, spec, False)
            upgrade_package_set = set()
            for name, streamdict in module_dict.items():
                for stream, module_list_from_dict in streamdict.items():
                    upgrade_package_set.update(self._get_package_name_set_and_remove_profiles(
                        module_list_from_dict, nsvcap))
                    latest_module = self._get_latest(module_list_from_dict)
                    if latest_module.getRepoID() == fail_safe_repo:
                        msg = _(
                            "Upgrading module '{0}' from Fail-Safe repository {1} is not allowed")
                        logger.critical(msg.format(latest_module.getNameStream(), fail_safe_repo))
                        fail_safe_repo_used = True
                    if nsvcap.profile:
                        profiles_set = latest_module.getProfiles(nsvcap.profile)
                        if not profiles_set:
                            continue
                        for profile in profiles_set:
                            upgrade_package_set.update(profile.getContent())
                    else:
                        for profile in latest_module.getProfiles():
                            upgrade_package_set.update(profile.getContent())
                        for artifact in latest_module.getArtifacts():
                            subj = hawkey.Subject(artifact)
                            for nevra_obj in subj.get_nevra_possibilities(
                                    forms=[hawkey.FORM_NEVRA]):
                                upgrade_package_set.add(nevra_obj.name)

            if not upgrade_package_set:
                logger.error(_("Unable to match profile in argument {}").format(spec))
            query = base_no_source_query.filter(name=upgrade_package_set)
            if query:
                sltr = dnf.selector.Selector(self.base.sack)
                sltr.set(pkg=query)
                self.base._goal.upgrade(select=sltr)
        if fail_safe_repo_used:
            raise dnf.exceptions.Error(_(
                "Upgrading module from Fail-Safe repository is not allowed"))
        return no_match_specs

    def remove(self, module_specs):
        # :api
        no_match_specs = []
        remove_package_set = set()

        for spec in module_specs:
            module_list, nsvcap = self._get_modules(spec)
            if not module_list:
                no_match_specs.append(spec)
                continue
            module_dict = self._create_module_dict_and_enable(module_list, spec, False)
            remove_packages_names = []
            for name, streamdict in module_dict.items():
                for stream, module_list_from_dict in streamdict.items():
                    remove_packages_names.extend(self._get_package_name_set_and_remove_profiles(
                        module_list_from_dict, nsvcap, True))
            if not remove_packages_names:
                logger.error(_("Unable to match profile in argument {}").format(spec))
            remove_package_set.update(remove_packages_names)

        if remove_package_set:
            keep_pkg_names = self.base._moduleContainer.getInstalledPkgNames()
            remove_package_set = remove_package_set.difference(keep_pkg_names)
            if remove_package_set:
                query = self.base.sack.query().installed().filterm(name=remove_package_set)
                if query:
                    self.base._remove_if_unneeded(query)
        return no_match_specs

    def get_modules(self, module_spec):
        # :api
        return self._get_modules(module_spec)

    def _get_modules(self, module_spec):
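        """Resolve a module spec to matching module packages.

        Each NSVCAP interpretation of ``module_spec`` is tried in turn and the
        first one that matches packages in the module container is used.
        Returns a ``(modules, nsvcap)`` tuple, or ``((), None)`` when nothing
        matches.
        """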
        # used by ansible (lib/ansible/modules/packaging/os/dnf.py)
        subj = hawkey.Subject(module_spec)
        for nsvcap in subj.nsvcap_possibilities():
            name = nsvcap.name if nsvcap.name else ""
            stream = nsvcap.stream if nsvcap.stream else ""
            version = ""
            context = nsvcap.context if nsvcap.context else ""
            arch = nsvcap.arch if nsvcap.arch else ""
            if nsvcap.version and nsvcap.version != -1:
                version = str(nsvcap.version)
            modules = self.base._moduleContainer.query(name, stream, version, context, arch)
            if modules:
                return modules, nsvcap
        return (), None

    def _get_latest(self, module_list):
        latest = None
        if module_list:
            latest = module_list[0]
            for module in module_list[1:]:
                if module.getVersionNum() > latest.getVersionNum():
                    latest = module
        return latest

    def _create_module_dict_and_enable(self, module_list, spec, enable=True):
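        """Group ``module_list`` into a ``{name: {stream: [modules]}}`` dict.

        When a name matches several streams, only the enabled or default
        stream is kept and the other streams are dropped; if no such stream
        can be determined, ``EnableMultipleStreamsException`` is raised.  With
        ``enable=True`` the surviving stream is also enabled in the module
        container.
        """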
        moduleDict = {}
        for module in module_list:
            moduleDict.setdefault(
                module.getName(), {}).setdefault(module.getStream(), []).append(module)

        for moduleName, streamDict in moduleDict.items():
            moduleState = self.base._moduleContainer.getModuleState(moduleName)
            if len(streamDict) > 1:
                if moduleState != STATE_DEFAULT and moduleState != STATE_ENABLED \
                        and moduleState != STATE_DISABLED:
                    streams_str = "', '".join(
                        sorted(streamDict.keys(), key=functools.cmp_to_key(self.base.sack.evr_cmp)))
                    msg = _("Argument '{argument}' matches {stream_count} streams ('{streams}') of "
                            "module '{module}', but none of the streams are enabled or "
                            "default").format(
                        argument=spec, stream_count=len(streamDict), streams=streams_str,
                        module=moduleName)
                    raise EnableMultipleStreamsException(moduleName, msg)
                if moduleState == STATE_ENABLED:
                    stream = self.base._moduleContainer.getEnabledStream(moduleName)
                else:
                    stream = self.base._moduleContainer.getDefaultStream(moduleName)
                if not stream or stream not in streamDict:
                    raise EnableMultipleStreamsException(moduleName)
                for key in sorted(streamDict.keys()):
                    if key == stream:
                        if enable:
                            self.base._moduleContainer.enable(moduleName, key)
                        continue
                    del streamDict[key]
            elif enable:
                for key in streamDict.keys():
                    self.base._moduleContainer.enable(moduleName, key)
            assert len(streamDict) == 1
        return moduleDict

    def _resolve_specs_enable(self, module_specs):
        no_match_specs = []
        error_spec = []
        module_dicts = {}
        for spec in module_specs:
            module_list, nsvcap = self._get_modules(spec)
            if not module_list:
                no_match_specs.append(spec)
                continue
            try:
                module_dict = self._create_module_dict_and_enable(module_list, spec, True)
                module_dicts[spec] = (nsvcap, module_dict)
            except (RuntimeError, EnableMultipleStreamsException) as e:
                error_spec.append(spec)
                logger.error(ucd(e))
                logger.error(_("Unable to resolve argument {}").format(spec))
        return no_match_specs, error_spec, module_dicts

    def _update_sack(self):
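        """Re-apply modular filtering to the sack and return any solver errors."""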
        hot_fix_repos = [i.id for i in self.base.repos.iter_enabled() if i.module_hotfixes]
        try:
            solver_errors = self.base.sack.filter_modules(
                self.base._moduleContainer, hot_fix_repos, self.base.conf.installroot,
                self.base.conf.module_platform_id, update_only=True,
                debugsolver=self.base.conf.debug_solver)
        except hawkey.Exception as e:
            raise dnf.exceptions.Error(ucd(e))
        return solver_errors

    def _enable_dependencies(self, module_dicts):
        error_spec = []
        for spec, (nsvcap, moduleDict) in module_dicts.items():
            for streamDict in moduleDict.values():
                for modules in streamDict.values():
                    try:
                        self.base._moduleContainer.enableDependencyTree(
                            libdnf.module.VectorModulePackagePtr(modules))
                    except RuntimeError as e:
                        error_spec.append(spec)
                        logger.error(ucd(e))
                        logger.error(_("Unable to resolve argument {}").format(spec))
        return error_spec

    def _resolve_specs_enable_update_sack(self, module_specs):
        no_match_specs, error_spec, module_dicts = self._resolve_specs_enable(module_specs)

        solver_errors = self._update_sack()

        dependency_error_spec = self._enable_dependencies(module_dicts)
        if dependency_error_spec:
            error_spec.extend(dependency_error_spec)

        return no_match_specs, error_spec, solver_errors, module_dicts

    def _modules_reset_or_disable(self, module_specs, to_state):
        no_match_specs = []
        for spec in module_specs:
            module_list, nsvcap = self._get_modules(spec)
            if not module_list:
                logger.error(_("Unable to resolve argument {}").format(spec))
                no_match_specs.append(spec)
                continue
            if nsvcap.stream or nsvcap.version or nsvcap.context or nsvcap.arch or nsvcap.profile:
                logger.info(_("Only module name is required. "
                              "Ignoring unneeded information in argument: '{}'").format(spec))
            module_names = set()
            for module in module_list:
                module_names.add(module.getName())
            for name in module_names:
                if to_state == STATE_UNKNOWN:
                    self.base._moduleContainer.reset(name)
                if to_state == STATE_DISABLED:
                    self.base._moduleContainer.disable(name)

        solver_errors = self._update_sack()
        return no_match_specs, solver_errors

    def _get_package_name_set_and_remove_profiles(self, module_list, nsvcap, remove=False):
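        """Collect package names from the installed profiles of the latest
        module in ``module_list``, restricted to ``nsvcap.profile`` when one
        was given.  With ``remove=True`` the matched profiles are also marked
        for uninstall in the module container."""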
        package_name_set = set()
        latest_module = self._get_latest(module_list)
        installed_profiles_strings = set(self.base._moduleContainer.getInstalledProfiles(
            latest_module.getName()))
        if not installed_profiles_strings:
            return set()
        if nsvcap.profile:
            profiles_set = latest_module.getProfiles(nsvcap.profile)
            if not profiles_set:
                return set()
            for profile in profiles_set:
                if profile.getName() in installed_profiles_strings:
                    if remove:
                        self.base._moduleContainer.uninstall(latest_module, profile.getName())
                    package_name_set.update(profile.getContent())
        else:
            for profile_string in installed_profiles_strings:
                if remove:
                    self.base._moduleContainer.uninstall(latest_module, profile_string)
                for profile in latest_module.getProfiles(profile_string):
                    package_name_set.update(profile.getContent())
        return package_name_set

    def _get_info_profiles(self, module_specs):
        output = set()
        for module_spec in module_specs:
            module_list, nsvcap = self._get_modules(module_spec)
            if not module_list:
                logger.info(_("Unable to resolve argument {}").format(module_spec))
                continue

            if nsvcap.profile:
                logger.info(_("Ignoring unnecessary profile: '{}/{}'").format(
                    nsvcap.name, nsvcap.profile))
            for module in module_list:

                lines = OrderedDict()
                lines["Name"] = module.getFullIdentifier()

                for profile in sorted(module.getProfiles(), key=_profile_comparison_key):
                    lines[profile.getName()] = "\n".join(
                        [pkgName for pkgName in profile.getContent()])

                output.add(self._create_simple_table(lines).toString())
        return "\n\n".join(sorted(output))

    def _profile_report_formatter(self, modulePackage, default_profiles, enabled_str):
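        """Format the module's profile list, marking default profiles with
        "[d]" and installed profiles with "[i]" (the latter only when the
        stream is enabled)."""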
        installed_profiles = self.base._moduleContainer.getInstalledProfiles(
            modulePackage.getName())
        available_profiles = modulePackage.getProfiles()
        profiles_str = ""
        for profile in sorted(available_profiles, key=_profile_comparison_key):
            profiles_str += "{}{}".format(
                profile.getName(), " [d]" if profile.getName() in default_profiles else "")
            profiles_str += " [i], " if profile.getName() in installed_profiles and enabled_str \
                else ", "
        return profiles_str[:-2]

    def _summary_report_formatter(self, summary):
        return summary.strip().replace("\n", " ")

    def _module_strs_formatter(self, modulePackage, markActive=False):
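        """Build the "[d]" (default), "[e]" (enabled) and "[x]" (disabled)
        stream markers, plus "[a]" for active modules when ``markActive`` is
        set, used when listing modules."""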
        default_str = ""
        enabled_str = ""
        disabled_str = ""
        if modulePackage.getStream() == self.base._moduleContainer.getDefaultStream(
                modulePackage.getName()):
            default_str = " [d]"
        if self.base._moduleContainer.isEnabled(modulePackage):
            if not default_str:
                enabled_str = " "
            enabled_str += "[e]"
        elif self.base._moduleContainer.isDisabled(modulePackage):
            if not default_str:
                disabled_str = " "
            disabled_str += "[x]"
        if markActive and self.base._moduleContainer.isModuleActive(modulePackage):
            if not default_str:
                disabled_str = " "
            disabled_str += "[a]"
        return default_str, enabled_str, disabled_str

    def _get_info(self, module_specs):
        output = set()
        for module_spec in module_specs:
            module_list, nsvcap = self._get_modules(module_spec)
            if not module_list:
                logger.info(_("Unable to resolve argument {}").format(module_spec))
                continue

            if nsvcap.profile:
                logger.info(_("Ignoring unnecessary profile: '{}/{}'").format(
                    nsvcap.name, nsvcap.profile))
            for modulePackage in module_list:
                default_str, enabled_str, disabled_str = self._module_strs_formatter(
                    modulePackage, markActive=True)
                default_profiles = self.base._moduleContainer.getDefaultProfiles(
                    modulePackage.getName(), modulePackage.getStream())

                profiles_str = self._profile_report_formatter(
                    modulePackage, default_profiles, enabled_str)

                lines = OrderedDict()
                lines["Name"] = modulePackage.getName()
                lines["Stream"] = modulePackage.getStream() + default_str + enabled_str + \
                                  disabled_str
                lines["Version"] = modulePackage.getVersion()
                lines["Context"] = modulePackage.getContext()
                lines["Architecture"] = modulePackage.getArch()
                lines["Profiles"] = profiles_str
                lines["Default profiles"] = " ".join(default_profiles)
                lines["Repo"] = modulePackage.getRepoID()
                lines["Summary"] = modulePackage.getSummary()
                lines["Description"] = modulePackage.getDescription()
                req_set = set()
                for req in modulePackage.getModuleDependencies():
                    for require_dict in req.getRequires():
                        for mod_require, stream in require_dict.items():
                            req_set.add("{}:[{}]".format(mod_require, ",".join(stream)))
                lines["Requires"] = "\n".join(sorted(req_set))
                lines["Artifacts"] = "\n".join(sorted(modulePackage.getArtifacts()))
                output.add(self._create_simple_table(lines).toString())
        str_table = "\n\n".join(sorted(output))
        if str_table:
            str_table += MODULE_INFO_TABLE_HINT
        return str_table

    @staticmethod
    def _create_simple_table(lines):
        table = libdnf.smartcols.Table()
        table.enableNoheadings(True)
        table.setColumnSeparator(" : ")

        column_name = table.newColumn("Name")
        column_value = table.newColumn("Value")
        column_value.setWrap(True)
        column_value.setSafechars("\n")
        column_value.setNewlineWrapFunction()

        for line_name, value in lines.items():
            if value is None:
                value = ""
            line = table.newLine()
            line.getColumnCell(column_name).setData(line_name)
            line.getColumnCell(column_value).setData(str(value))

        return table

    def _get_full_info(self, module_specs):
        output = set()
        for module_spec in module_specs:
            module_list, nsvcap = self._get_modules(module_spec)
            if not module_list:
                logger.info(_("Unable to resolve argument {}").format(module_spec))
                continue

            if nsvcap.profile:
                logger.info(_("Ignoring unnecessary profile: '{}/{}'").format(
                    nsvcap.name, nsvcap.profile))
            for modulePackage in module_list:
                info = modulePackage.getYaml()
                if info:
                    output.add(info)
        output_string = "\n\n".join(sorted(output))
        return output_string

    def _what_provides(self, rpm_specs):
        output = set()
        modulePackages = self.base._moduleContainer.getModulePackages()
        baseQuery = self.base.sack.query().filterm(empty=True).apply()
        getBestInitQuery = self.base.sack.query(flags=hawkey.IGNORE_MODULAR_EXCLUDES)

        for spec in rpm_specs:
            subj = dnf.subject.Subject(spec)
            baseQuery = baseQuery.union(subj.get_best_query(
                self.base.sack, with_nevra=True, with_provides=False, with_filenames=False,
                query=getBestInitQuery))

        baseQuery.apply()

        for modulePackage in modulePackages:
            artifacts = modulePackage.getArtifacts()
            if not artifacts:
                continue
            query = baseQuery.filter(nevra_strict=artifacts)
            if query:
                for pkg in query:
                    string_output = ""
                    profiles = []
                    for profile in sorted(modulePackage.getProfiles(), key=_profile_comparison_key):
                        if pkg.name in profile.getContent():
                            profiles.append(profile.getName())
                    lines = OrderedDict()
                    lines["Module"] = modulePackage.getFullIdentifier()
                    lines["Profiles"] = " ".join(sorted(profiles))
                    lines["Repo"] = modulePackage.getRepoID()
                    lines["Summary"] = modulePackage.getSummary()

                    table = self._create_simple_table(lines)

                    string_output += "{}\n".format(self.base.output.term.bold(str(pkg)))
                    string_output += "{}".format(table.toString())
                    output.add(string_output)

        return "\n\n".join(sorted(output))

    def _create_and_fill_table(self, latest):
        table = libdnf.smartcols.Table()
        table.setTermforce(libdnf.smartcols.Table.TermForce_AUTO)
        table.enableMaxout(True)
        column_name = table.newColumn("Name")
        column_stream = table.newColumn("Stream")
        column_profiles = table.newColumn("Profiles")
        column_profiles.setWrap(True)
        column_info = table.newColumn("Summary")
        column_info.setWrap(True)

        if not self.base.conf.verbose:
            column_info.hidden = True

        for latest_per_repo in latest:
            for nameStreamArch in latest_per_repo:
                if len(nameStreamArch) == 1:
                    modulePackage = nameStreamArch[0]
                else:
                    active = [module for module in nameStreamArch
                              if self.base._moduleContainer.isModuleActive(module)]
                    if active:
                        modulePackage = active[0]
                    else:
                        modulePackage = nameStreamArch[0]
                line = table.newLine()
                default_str, enabled_str, disabled_str = self._module_strs_formatter(
                    modulePackage, markActive=False)
                default_profiles = self.base._moduleContainer.getDefaultProfiles(
                    modulePackage.getName(), modulePackage.getStream())
                profiles_str = self._profile_report_formatter(modulePackage, default_profiles,
                                                             enabled_str)
                line.getColumnCell(column_name).setData(modulePackage.getName())
                line.getColumnCell(
                    column_stream).setData(
                    modulePackage.getStream() + default_str + enabled_str + disabled_str)
                line.getColumnCell(column_profiles).setData(profiles_str)
                summary_str = self._summary_report_formatter(modulePackage.getSummary())
                line.getColumnCell(column_info).setData(summary_str)

        return table

    def _get_brief_description(self, module_specs, module_state):
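        """Render a per-repository table of the latest modules in
        ``module_state``: each repository block gets a bold repository name
        and a column header, and the result ends with the table legend
        hint."""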
        modules = []
        if module_specs:
            for spec in module_specs:
                module_list, nsvcap = self._get_modules(spec)
                modules.extend(module_list)
        else:
            modules = self.base._moduleContainer.getModulePackages()
        latest = self.base._moduleContainer.getLatestModulesPerRepo(module_state, modules)
        if not latest:
            return ""

        table = self._create_and_fill_table(latest)
        current_repo_id_index = 0
        already_printed_lines = 0
        try:
            repo_name = self.base.repos[latest[0][0][0].getRepoID()].name
        except KeyError:
            repo_name = latest[0][0][0].getRepoID()
        versions = len(latest[0])
        header = self._format_header(table)
        str_table = self._format_repoid(repo_name)
        str_table += header
        for i in range(0, table.getNumberOfLines()):
            if versions + already_printed_lines <= i:
                already_printed_lines += versions
                current_repo_id_index += 1
                # Fail-Safe repository is not in self.base.repos
                try:
                    repo_name = self.base.repos[
                        latest[current_repo_id_index][0][0].getRepoID()].name
                except KeyError:
                    repo_name = latest[current_repo_id_index][0][0].getRepoID()
                versions = len(latest[current_repo_id_index])
                str_table += "\n"
                str_table += self._format_repoid(repo_name)
                str_table += header

            line = table.getLine(i)
            str_table += table.toString(line, line)
        return str_table + MODULE_TABLE_HINT

    def _format_header(self, table):
        line = table.getLine(0)
        return table.toString(line, line).split('\n', 1)[0] + '\n'

    def _format_repoid(self, repo_name):
        return "{}\n".format(self.base.output.term.bold(repo_name))

    def _install_profiles_internal(self, install_set_artifacts, install_dict, strict):
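        """Mark the packages from ``install_dict`` for installation.

        Candidate packages come from the modular artifacts in
        ``install_set_artifacts`` plus packages from hot-fix repositories;
        names with no modular match fall back to any non-source package.
        Returns ``(install_base_query, error_specs)``.
        """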
        #  Remove source packages because they cannot be installed or upgraded
        base_no_source_query = self.base.sack.query().filterm(arch__neq=['src', 'nosrc']).apply()
        install_base_query = base_no_source_query.filter(nevra_strict=install_set_artifacts)
        error_specs = []

        # add hot-fix packages
        hot_fix_repos = [i.id for i in self.base.repos.iter_enabled() if i.module_hotfixes]
        hotfix_packages = base_no_source_query.filter(
            reponame=hot_fix_repos, name=install_dict.keys())
        install_base_query = install_base_query.union(hotfix_packages)

        for pkg_name, set_specs in install_dict.items():
            query = install_base_query.filter(name=pkg_name)
            if not query:
                # package can also be non-modular or part of another stream
                query = base_no_source_query.filter(name=pkg_name)
                if not query:
                    for spec in set_specs:
                        logger.error(_("Unable to resolve argument {}").format(spec))
                    logger.error(_("No match for package {}").format(pkg_name))
                    error_specs.extend(set_specs)
                    continue
            self.base._goal.group_members.add(pkg_name)
            sltr = dnf.selector.Selector(self.base.sack)
            sltr.set(pkg=query)
            self.base._goal.install(select=sltr, optional=(not strict))
        return install_base_query, error_specs


def format_modular_solver_errors(errors):
    msg = dnf.util._format_resolve_problems(errors)
    return "\n".join(
        [P_('Modular dependency problem:', 'Modular dependency problems:', len(errors)), msg])
site-packages/dnf/module/exceptions.py
# supplies the 'module' command.
#
# Copyright (C) 2014-2017  Red Hat, Inc.
#
# This copyrighted material is made available to anyone wishing to use,
# modify, copy, or redistribute it subject to the terms and conditions of
# the GNU General Public License v.2, or (at your option) any later version.
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY expressed or implied, including the implied warranties of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General
# Public License for more details.  You should have received a copy of the
# GNU General Public License along with this program; if not, write to the
# Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
# 02110-1301, USA.  Any Red Hat trademarks that are incorporated in the
# source code or documentation are not subject to the GNU General Public
# License and may only be used or replicated with the express permission of
# Red Hat, Inc.
#

import dnf
from dnf.module import module_messages, NO_PROFILE_SPECIFIED
from dnf.i18n import _


class NoModuleException(dnf.exceptions.Error):
    def __init__(self, module_spec):
        value = _("No such module: {}").format(module_spec)
        super(NoModuleException, self).__init__(value)


class NoStreamException(dnf.exceptions.Error):
    def __init__(self, stream):
        value = _("No such stream: {}").format(stream)
        super(NoStreamException, self).__init__(value)


class EnabledStreamException(dnf.exceptions.Error):
    def __init__(self, module_spec):
        value = _("No enabled stream for module: {}").format(module_spec)
        super(EnabledStreamException, self).__init__(value)


class EnableMultipleStreamsException(dnf.exceptions.Error):
    def __init__(self, module_spec, value=None):
        if value is None:
            value = _("Cannot enable more streams from module '{}' at the same time").format(module_spec)
        super(EnableMultipleStreamsException, self).__init__(value)


class DifferentStreamEnabledException(dnf.exceptions.Error):
    def __init__(self, module_spec):
        value = _("Different stream enabled for module: {}").format(module_spec)
        super(DifferentStreamEnabledException, self).__init__(value)


class NoProfileException(dnf.exceptions.Error):
    def __init__(self, profile):
        value = _("No such profile: {}").format(profile)
        super(NoProfileException, self).__init__(value)


class ProfileNotInstalledException(dnf.exceptions.Error):
    def __init__(self, module_spec):
        value = _("Specified profile not installed for {}").format(module_spec)
        super(ProfileNotInstalledException, self).__init__(value)


class NoStreamSpecifiedException(dnf.exceptions.Error):
    def __init__(self, module_spec):
        value = _("No stream specified for '{}', please specify stream").format(module_spec)
        super(NoStreamSpecifiedException, self).__init__(value)


class NoProfileSpecifiedException(dnf.exceptions.Error):
    def __init__(self, module_spec):
        value = module_messages[NO_PROFILE_SPECIFIED].format(module_spec)
        super(NoProfileSpecifiedException, self).__init__(value)


class NoProfilesException(dnf.exceptions.Error):
    def __init__(self, module_spec):
        value = _("No such profile: {}. No profiles available").format(module_spec)
        super(NoProfilesException, self).__init__(value)


class NoProfileToRemoveException(dnf.exceptions.Error):
    def __init__(self, module_spec):
        value = _("No profile to remove for '{}'").format(module_spec)
        super(NoProfileToRemoveException, self).__init__(value)
site-packages/dnf/module/__init__.py
# Copyright (C) 2017  Red Hat, Inc.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU Library General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.

from dnf.i18n import _

DIFFERENT_STREAM_INFO = 1
NOTHING_TO_SHOW = 2
INSTALLING_NEWER_VERSION = 4
ENABLED_MODULES = 5
NO_PROFILE_SPECIFIED = 6

module_messages = {
    DIFFERENT_STREAM_INFO: _("Enabling different stream for '{}'."),
    NOTHING_TO_SHOW: _("Nothing to show."),
    INSTALLING_NEWER_VERSION: _("Installing newer version of '{}' than specified. Reason: {}"),
    ENABLED_MODULES: _("Enabled modules: {}."),
    NO_PROFILE_SPECIFIED: _("No profile specified for '{}', please specify profile."),
}
site-packages/dnf/pycomp.py
# pycomp.py
# Python 2 and Python 3 compatibility module
#
# Copyright (C) 2013-2016 Red Hat, Inc.
#
# This copyrighted material is made available to anyone wishing to use,
# modify, copy, or redistribute it subject to the terms and conditions of
# the GNU General Public License v.2, or (at your option) any later version.
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY expressed or implied, including the implied warranties of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General
# Public License for more details.  You should have received a copy of the
# GNU General Public License along with this program; if not, write to the
# Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
# 02110-1301, USA.  Any Red Hat trademarks that are incorporated in the
# source code or documentation are not subject to the GNU General Public
# License and may only be used or replicated with the express permission of
# Red Hat, Inc.
#

from gettext import NullTranslations
from sys import version_info
import base64
import email.mime.text
import gettext
import itertools
import locale
import sys
import types

PY3 = version_info.major >= 3

if PY3:
    from io import StringIO
    from configparser import ConfigParser
    import queue
    import urllib.parse
    import shlex

    # functions renamed in py3
    Queue = queue.Queue
    basestring = unicode = str
    filterfalse = itertools.filterfalse
    long = int
    NullTranslations.ugettext = NullTranslations.gettext
    NullTranslations.ungettext = NullTranslations.ngettext
    xrange = range
    raw_input = input
    base64_decodebytes = base64.decodebytes
    urlparse = urllib.parse
    urllib_quote = urlparse.quote
    shlex_quote = shlex.quote
    sys_maxsize = sys.maxsize


    def gettext_setup(t):
        _ = t.gettext
        P_ = t.ngettext
        return (_, P_)

    # string helpers
    def is_py2str_py3bytes(o):
        return isinstance(o, bytes)
    def is_py3bytes(o):
        return isinstance(o, bytes)

    # functions that don't take unicode arguments in py2
    ModuleType = lambda m: types.ModuleType(m)
    format = locale.format_string
    def setlocale(category, loc=None):
        locale.setlocale(category, loc)
    def write_to_file(f, content):
        f.write(content)
    def email_mime(body):
        return email.mime.text.MIMEText(body)
else:
    # functions renamed in py3
    from __builtin__ import unicode, basestring, long, xrange, raw_input
    from StringIO import StringIO
    from ConfigParser import ConfigParser
    import Queue
    import urllib
    import urlparse
    import pipes

    Queue = Queue.Queue
    filterfalse = itertools.ifilterfalse
    base64_decodebytes = base64.decodestring
    urllib_quote = urllib.quote
    shlex_quote = pipes.quote
    sys_maxsize = sys.maxint

    def gettext_setup(t):
        _ = t.ugettext
        P_ = t.ungettext
        return (_, P_)

    # string helpers
    def is_py2str_py3bytes(o):
        return isinstance(o, str)
    def is_py3bytes(o):
        return False

    # functions that don't take unicode arguments in py2
    ModuleType = lambda m: types.ModuleType(m.encode('utf-8'))
    def format(percent, *args, **kwargs):
        return locale.format(percent.encode('utf-8'), *args, **kwargs)
    def setlocale(category, loc=None):
        locale.setlocale(category, loc.encode('utf-8'))
    def write_to_file(f, content):
        f.write(content.encode('utf-8'))
    def email_mime(body):
        return email.mime.text.MIMEText(body.encode('utf-8'))
site-packages/dnf/cli/__init__.py
# __init__.py
# DNF cli subpackage.
#
# Copyright (C) 2012-2016 Red Hat, Inc.
#
# This copyrighted material is made available to anyone wishing to use,
# modify, copy, or redistribute it subject to the terms and conditions of
# the GNU General Public License v.2, or (at your option) any later version.
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY expressed or implied, including the implied warranties of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General
# Public License for more details.  You should have received a copy of the
# GNU General Public License along with this program; if not, write to the
# Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
# 02110-1301, USA.  Any Red Hat trademarks that are incorporated in the
# source code or documentation are not subject to the GNU General Public
# License and may only be used or replicated with the express permission of
# Red Hat, Inc.
#

from __future__ import absolute_import
import dnf.exceptions


class CliError(dnf.exceptions.Error):
    """CLI Exception. :api"""
    pass


from dnf.cli.cli import Cli  # :api
from dnf.cli.commands import Command  # :api
site-packages/dnf/cli/commands/downgrade.py
# downgrade.py
# Downgrade CLI command.
#
# Copyright (C) 2014-2016 Red Hat, Inc.
#
# This copyrighted material is made available to anyone wishing to use,
# modify, copy, or redistribute it subject to the terms and conditions of
# the GNU General Public License v.2, or (at your option) any later version.
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY expressed or implied, including the implied warranties of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General
# Public License for more details.  You should have received a copy of the
# GNU General Public License along with this program; if not, write to the
# Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
# 02110-1301, USA.  Any Red Hat trademarks that are incorporated in the
# source code or documentation are not subject to the GNU General Public
# License and may only be used or replicated with the express permission of
# Red Hat, Inc.
#

from __future__ import absolute_import
from __future__ import unicode_literals
from dnf.cli import commands
from dnf.cli.option_parser import OptionParser
from dnf.i18n import _


class DowngradeCommand(commands.Command):
    """A class containing methods needed by the cli to execute the
    downgrade command.
    """

    aliases = ('downgrade', 'dg')
    summary = _("Downgrade a package")

    @staticmethod
    def set_argparser(parser):
        parser.add_argument('package', nargs='*', help=_('Package to downgrade'),
                            action=OptionParser.ParseSpecGroupFileCallback)

    def configure(self):
        demands = self.cli.demands
        demands.sack_activation = True
        demands.available_repos = True
        demands.resolving = True
        demands.root_user = True

        commands._checkGPGKey(self.base, self.cli)
        if not self.opts.filenames:
            commands._checkEnabledRepo(self.base)

    def run(self):
        file_pkgs = self.base.add_remote_rpms(self.opts.filenames, strict=False,
                                              progress=self.base.output.progress)
        return self.base.downgradePkgs(
            specs=self.opts.pkg_specs + ['@' + x for x in self.opts.grp_specs],
            file_pkgs=file_pkgs,
            strict=self.base.conf.strict)
site-packages/dnf/cli/commands/group.py
# group.py
# Group CLI command.
#
# Copyright (C) 2012-2016 Red Hat, Inc.
#
# This copyrighted material is made available to anyone wishing to use,
# modify, copy, or redistribute it subject to the terms and conditions of
# the GNU General Public License v.2, or (at your option) any later version.
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY expressed or implied, including the implied warranties of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General
# Public License for more details.  You should have received a copy of the
# GNU General Public License along with this program; if not, write to the
# Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
# 02110-1301, USA.  Any Red Hat trademarks that are incorporated in the
# source code or documentation are not subject to the GNU General Public
# License and may only be used or replicated with the express permission of
# Red Hat, Inc.
#

from __future__ import absolute_import
from __future__ import unicode_literals
from dnf.comps import CompsQuery
from dnf.cli import commands
from dnf.i18n import _, ucd

import libdnf.transaction

import dnf.cli
import dnf.exceptions
import dnf.util
import logging

logger = logging.getLogger("dnf")

class GroupCommand(commands.Command):
    """ Single sub-command interface for most groups interaction. """

    direct_commands = {'grouplist'    : 'list',
                       'groupinstall' : 'install',
                       'groupupdate'  : 'install',
                       'groupremove'  : 'remove',
                       'grouperase'   : 'remove',
                       'groupinfo'    : 'info'}
    aliases = ('group', 'groups', 'grp') + tuple(direct_commands.keys())
    summary = _('display, or use, the groups information')

    _CMD_ALIASES = {'update'     : 'upgrade',
                    'erase'      : 'remove'}
    _MARK_CMDS = ('install', 'remove')
    _GROUP_SUBCOMMANDS = ('summary', 'list', 'info', 'remove', 'install', 'upgrade', 'mark')


    def _canonical(self):
        # were we called with direct command?
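        # e.g. "dnf groupinstall <group>" is canonicalized below to the
        # "dnf group install <group>" form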
        direct = self.direct_commands.get(self.opts.command)
        if direct:
            # canonize subcmd and args
            if self.opts.subcmd is not None:
                self.opts.args.insert(0, self.opts.subcmd)
            self.opts.subcmd = direct
        if self.opts.subcmd is None:
            self.opts.subcmd = 'summary'
        self.opts.subcmd = self._CMD_ALIASES.get(self.opts.subcmd,
                                                 self.opts.subcmd)

    def __init__(self, cli):
        super(GroupCommand, self).__init__(cli)
        self._remark = False

    def _assert_comps(self):
        msg = _('No group data available for configured repositories.')
        if not len(self.base.comps):
            raise dnf.exceptions.CompsError(msg)

    def _environment_lists(self, patterns):
        def available_pred(env):
            env_found = self.base.history.env.get(env.id)
            return not env_found

        self._assert_comps()
        if patterns is None:
            envs = self.base.comps.environments
        else:
            envs = self.base.comps.environments_by_pattern(",".join(patterns))

        return dnf.util.mapall(list, dnf.util.partition(available_pred, envs))

    def _group_lists(self, uservisible, patterns):
        def installed_pred(group):
            group_found = self.base.history.group.get(group.id)
            if group_found:
                return True
            return False
        installed = []
        available = []

        self._assert_comps()

        if patterns is None:
            grps = self.base.comps.groups
        else:
            grps = self.base.comps.groups_by_pattern(",".join(patterns))
        for grp in grps:
            tgt_list = available
            if installed_pred(grp):
                tgt_list = installed
            if not uservisible or grp.uservisible:
                tgt_list.append(grp)

        return installed, available

    def _info(self, userlist):
        for strng in userlist:
            group_matched = False

            for env in self.base.comps.environments_by_pattern(strng):
                self.output.display_groups_in_environment(env)
                group_matched = True

            for group in self.base.comps.groups_by_pattern(strng):
                self.output.display_pkgs_in_groups(group)
                group_matched = True

            if not group_matched:
                logger.error(_('Warning: Group %s does not exist.'), strng)

        return 0, []

    def _list(self, userlist):
        uservisible = 1
        showinstalled = 0
        showavailable = 0
        print_ids = self.base.conf.verbose or self.opts.ids

        while userlist:
            if userlist[0] == 'hidden':
                uservisible = 0
                userlist.pop(0)
            elif userlist[0] == 'installed':
                showinstalled = 1
                userlist.pop(0)
            elif userlist[0] == 'available':
                showavailable = 1
                userlist.pop(0)
            elif userlist[0] == 'ids':
                print_ids = True
                userlist.pop(0)
            else:
                break
        if self.opts.hidden:
            uservisible = 0
        if self.opts.installed:
            showinstalled = 1
        if self.opts.available:
            showavailable = 1
        if not userlist:
            userlist = None # Match everything...

        errs = False
        if userlist is not None:
            for group in userlist:
                comps = self.base.comps
                in_group = len(comps.groups_by_pattern(group)) > 0
                in_environment = len(comps.environments_by_pattern(group)) > 0
                if not in_group and not in_environment:
                    logger.error(_('Warning: No groups match:') + '\n   %s',
                                 group)
                    errs = True
            if errs:
                return 0, []

        env_inst, env_avail = self._environment_lists(userlist)
        installed, available = self._group_lists(uservisible, userlist)

        def _out_grp(sect, group):
            if not done:
                print(sect)
            msg = '   %s' % (group.ui_name if group.ui_name is not None else _("<name-unset>"))
            if print_ids:
                msg += ' (%s)' % group.id
            if group.lang_only:
                msg += ' [%s]' % group.lang_only
            print('{}'.format(msg))

        def _out_env(sect, envs):
            if envs:
                print(sect)
            for e in envs:
                msg = '   %s' % (e.ui_name if e.ui_name is not None else _("<name-unset>"))
                if print_ids:
                    msg += ' (%s)' % e.id
                print(msg)

        if not showinstalled:
            _out_env(_('Available Environment Groups:'), env_avail)
        if not showavailable:
            _out_env(_('Installed Environment Groups:'), env_inst)

        if not showavailable:
            done = False
            for group in installed:
                if group.lang_only:
                    continue
                _out_grp(_('Installed Groups:'), group)
                done = True

            done = False
            for group in installed:
                if not group.lang_only:
                    continue
                _out_grp(_('Installed Language Groups:'), group)
                done = True

        if showinstalled:
            return 0, []

        done = False
        for group in available:
            if group.lang_only:
                continue
            _out_grp(_('Available Groups:'), group)
            done = True

        done = False
        for group in available:
            if not group.lang_only:
                continue
            _out_grp(_('Available Language Groups:'), group)
            done = True

        return 0, []

    def _mark_install(self, patterns):
        q = CompsQuery(self.base.comps, self.base.history,
                       CompsQuery.GROUPS | CompsQuery.ENVIRONMENTS,
                       CompsQuery.AVAILABLE | CompsQuery.INSTALLED)
        solver = self.base._build_comps_solver()
        res = q.get(*patterns)

        if self.opts.with_optional:
            types = tuple(self.base.conf.group_package_types + ['optional'])
        else:
            types = tuple(self.base.conf.group_package_types)
        pkg_types = libdnf.transaction.listToCompsPackageType(types)
        for env_id in res.environments:
            solver._environment_install(env_id, pkg_types)
        for group_id in res.groups:
            solver._group_install(group_id, pkg_types)

    def _mark_remove(self, patterns):
        q = CompsQuery(self.base.comps, self.base.history,
                       CompsQuery.GROUPS | CompsQuery.ENVIRONMENTS,
                       CompsQuery.INSTALLED)
        solver = self.base._build_comps_solver()
        res = q.get(*patterns)
        for env_id in res.environments:
            assert dnf.util.is_string_type(env_id)
            solver._environment_remove(env_id)
        for grp_id in res.groups:
            assert dnf.util.is_string_type(grp_id)
            solver._group_remove(grp_id)

    def _mark_subcmd(self, extcmds):
        if extcmds[0] in self._MARK_CMDS:
            return extcmds[0], extcmds[1:]
        return 'install', extcmds

    def _summary(self, userlist):
        uservisible = 1
        if len(userlist) > 0:
            if userlist[0] == 'hidden':
                uservisible = 0
                userlist.pop(0)
        if self.opts.hidden:
            uservisible = 0
        if not userlist:
            userlist = None # Match everything...

        installed, available = self._group_lists(uservisible, userlist)

        def _out_grp(sect, num):
            if not num:
                return
            logger.info('%s %u', sect, num)
        done = 0
        for group in installed:
            if group.lang_only:
                continue
            done += 1
        _out_grp(_('Installed Groups:'), done)

        done = 0
        for group in installed:
            if not group.lang_only:
                continue
            done += 1
        _out_grp(_('Installed Language Groups:'), done)

        done = 0
        for group in available:
            if group.lang_only:
                continue
            done += 1
        _out_grp(_('Available Groups:'), done)

        done = 0
        for group in available:
            if not group.lang_only:
                continue
            done += 1
        _out_grp(_('Available Language Groups:'), done)

        return 0, []

    @staticmethod
    def set_argparser(parser):
        parser.add_argument('--with-optional', action='store_true',
                            help=_("include optional packages from group"))
        grpparser = parser.add_mutually_exclusive_group()
        grpparser.add_argument('--hidden', action='store_true',
                               help=_("show also hidden groups"))
        grpparser.add_argument('--installed', action='store_true',
                               help=_("show only installed groups"))
        grpparser.add_argument('--available', action='store_true',
                               help=_("show only available groups"))
        grpparser.add_argument('--ids', action='store_true',
                               help=_("show also ID of groups"))
        parser.add_argument('subcmd', nargs='?', metavar='COMMAND',
                            help=_('available subcommands: {} (default), {}').format(
                                GroupCommand._GROUP_SUBCOMMANDS[0],
                                ', '.join(GroupCommand._GROUP_SUBCOMMANDS[1:])))
        parser.add_argument('args', nargs='*', metavar='COMMAND_ARG',
                            help=_('argument for group subcommand'))

    def configure(self):
        self._canonical()

        cmd = self.opts.subcmd
        args = self.opts.args

        if cmd not in self._GROUP_SUBCOMMANDS:
            logger.critical(_('Invalid groups sub-command, use: %s.'),
                            ", ".join(self._GROUP_SUBCOMMANDS))
            raise dnf.cli.CliError
        if cmd in ('install', 'remove', 'mark', 'info') and not args:
            self.cli.optparser.print_help(self)
            raise dnf.cli.CliError

        demands = self.cli.demands
        demands.sack_activation = True
        if cmd in ('install', 'mark', 'remove', 'upgrade'):
            demands.root_user = True
            demands.resolving = True
        if cmd == 'remove':
            demands.allow_erasing = True
            demands.available_repos = False
        else:
            demands.available_repos = True

        if cmd not in ('remove',):
            commands._checkEnabledRepo(self.base)

        if cmd in ('install', 'upgrade'):
            commands._checkGPGKey(self.base, self.cli)

    def run(self):
        cmd = self.opts.subcmd
        extcmds = self.opts.args

        if cmd == 'summary':
            return self._summary(extcmds)
        if cmd == 'list':
            return self._list(extcmds)
        if cmd == 'info':
            return self._info(extcmds)
        if cmd == 'mark':
            (subcmd, extcmds) = self._mark_subcmd(extcmds)
            if subcmd == 'remove':
                return self._mark_remove(extcmds)
            else:
                assert subcmd == 'install'
                return self._mark_install(extcmds)

        if cmd == 'install':
            if self.opts.with_optional:
                types = tuple(self.base.conf.group_package_types + ['optional'])
            else:
                types = tuple(self.base.conf.group_package_types)

            self._remark = True
            try:
                return self.base.env_group_install(extcmds, types,
                                                   self.base.conf.strict)
            except dnf.exceptions.MarkingError as e:
                msg = _('No package %s available.')
                logger.info(msg, self.base.output.term.bold(e))
                raise dnf.exceptions.PackagesNotAvailableError(
                    _("Unable to find a mandatory group package."))
        if cmd == 'upgrade':
            return self.base.env_group_upgrade(extcmds)
        if cmd == 'remove':
            for arg in extcmds:
                try:
                    self.base.env_group_remove([arg])
                except dnf.exceptions.Error:
                    pass

    def run_transaction(self):
        if not self._remark:
            return
        goal = self.base._goal
        history = self.base.history
        names = goal.group_members
        for pkg in self.base.sack.query().installed().filterm(name=names):
            reason = history.rpm.get_reason(pkg)
            history.set_reason(pkg, goal.group_reason(pkg, reason))
site-packages/dnf/cli/commands/alias.py
# alias.py
# Alias CLI command.
#
# Copyright (C) 2018 Red Hat, Inc.
#
# This copyrighted material is made available to anyone wishing to use,
# modify, copy, or redistribute it subject to the terms and conditions of
# the GNU General Public License v.2, or (at your option) any later version.
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY expressed or implied, including the implied warranties of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General
# Public License for more details.  You should have received a copy of the
# GNU General Public License along with this program; if not, write to the
# Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
# 02110-1301, USA.  Any Red Hat trademarks that are incorporated in the
# source code or documentation are not subject to the GNU General Public
# License and may only be used or replicated with the express permission of
# Red Hat, Inc.
#

from __future__ import absolute_import
from __future__ import print_function
from __future__ import unicode_literals

import logging
import os.path

import dnf.cli
import dnf.cli.aliases
from dnf.cli import commands
import dnf.conf
import dnf.exceptions
from dnf.i18n import _

logger = logging.getLogger('dnf')


class AliasCommand(commands.Command):
    aliases = ('alias',)
    summary = _('List or create command aliases')

    @staticmethod
    def set_argparser(parser):
        enable_group = parser.add_mutually_exclusive_group()
        enable_group.add_argument(
            '--enable-resolving', default=False, action='store_true',
            help=_('enable aliases resolving'))
        enable_group.add_argument(
            '--disable-resolving', default=False, action='store_true',
            help=_('disable aliases resolving'))
        parser.add_argument("subcommand", nargs='?', default='list',
                            choices=['add', 'list', 'delete'],
                            help=_("action to do with aliases"))
        parser.add_argument("alias", nargs="*", metavar="command[=result]",
                            help=_("alias definition"))

    def configure(self):
        demands = self.cli.demands
        if self.opts.subcommand in ('add', 'delete'):
            demands.root_user = True
        self.aliases_base = dnf.cli.aliases.Aliases()
        self.aliases_base._load_aliases()
        self.resolving_enabled = self.aliases_base.enabled
        self._update_config_from_options()

    def _update_config_from_options(self):
        enabled = None
        if self.opts.enable_resolving:
            enabled = True
            logger.info(_("Aliases are now enabled"))
        if self.opts.disable_resolving:
            enabled = False
            logger.info(_("Aliases are now disabled"))

        if enabled is not None:
            if not os.path.exists(dnf.cli.aliases.ALIASES_CONF_PATH):
                open(dnf.cli.aliases.ALIASES_CONF_PATH, 'w').close()
            dnf.conf.BaseConfig.write_raw_configfile(
                dnf.cli.aliases.ALIASES_CONF_PATH,
                'main', None, {'enabled': enabled})
            if not self.aliases_base._disabled_by_environ():
                self.aliases_base.enabled = enabled

    def _parse_option_alias(self):
        new_aliases = {}
        for alias in self.opts.alias:
            alias = alias.split('=', 1)
            cmd = alias[0].strip()
            if len(cmd.split()) != 1:
                logger.warning(_("Invalid alias key: %s"), cmd)
                continue
            if cmd.startswith('-'):
                logger.warning(_("Invalid alias key: %s"), cmd)
                continue
            if len(alias) == 1:
                logger.warning(_("Alias argument has no value: %s"), cmd)
                continue
            new_aliases[cmd] = alias[1].split()
        return new_aliases
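
    # Illustration (hypothetical input, added note): with
    # `dnf alias add "in=install --refresh"` this method returns
    # {'in': ['install', '--refresh']}; keys containing spaces, keys starting
    # with '-', and arguments without '=' are skipped with a warning.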

    def _load_user_aliases(self):
        if not os.path.exists(dnf.cli.aliases.ALIASES_USER_PATH):
            open(dnf.cli.aliases.ALIASES_USER_PATH, 'w').close()
        try:
            conf = dnf.cli.aliases.AliasesConfig(
                dnf.cli.aliases.ALIASES_USER_PATH)
        except dnf.exceptions.ConfigError as e:
            logger.warning(_('Config error: %s'), e)
            return None
        return conf

    def _store_user_aliases(self, user_aliases, enabled):
        output = "[main]\n"
        output += "enabled = {}\n\n".format(enabled)
        output += "[aliases]\n"
        for key, value in user_aliases.items():
            output += "{} = {}\n".format(key, ' '.join(value))
        with open(dnf.cli.aliases.ALIASES_USER_PATH, 'w') as fileobj:
            fileobj.write(output)
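
    # Illustration (hypothetical data, added note):
    # _store_user_aliases({'in': ['install']}, True) rewrites ALIASES_USER_PATH as:
    #   [main]
    #   enabled = True
    #
    #   [aliases]
    #   in = install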

    def add_aliases(self, aliases):
        conf = self._load_user_aliases()
        if conf is None:
            return
        user_aliases = conf.aliases
        if user_aliases is None:
            return

        user_aliases.update(aliases)

        self._store_user_aliases(user_aliases, conf.enabled)
        logger.info(_("Aliases added: %s"), ', '.join(aliases.keys()))

    def remove_aliases(self, cmds):
        conf = self._load_user_aliases()
        if conf is None:
            return
        user_aliases = conf.aliases
        if user_aliases is None:
            return

        valid_cmds = []
        for cmd in cmds:
            try:
                del user_aliases[cmd]
                valid_cmds.append(cmd)
            except KeyError:
                logger.info(_("Alias not found: %s"), cmd)

        self._store_user_aliases(user_aliases, conf.enabled)
        logger.info(_("Aliases deleted: %s"), ', '.join(valid_cmds))

    def list_alias(self, cmd):
        args = [cmd]
        try:
            args = self.aliases_base._resolve(args)
        except dnf.exceptions.Error as e:
            logger.error(
                _('%s, alias %s="%s"'), e, cmd, (' ').join(self.aliases_base.aliases[cmd]))
        else:
            print(_("Alias %s='%s'") % (cmd, " ".join(args)))

    def run(self):
        if not self.aliases_base.enabled:
            logger.warning(_("Aliases resolving is disabled."))

        if self.opts.subcommand == 'add':  # Add new alias
            aliases = self._parse_option_alias()
            if not aliases:
                raise dnf.exceptions.Error(_("No aliases specified."))
            self.add_aliases(aliases)
            return

        if self.opts.subcommand == 'delete':  # Remove alias
            cmds = self.opts.alias
            if cmds == []:
                raise dnf.exceptions.Error(_("No alias specified."))
            self.remove_aliases(cmds)
            return

        if not self.opts.alias:  # List all aliases
            if not self.aliases_base.aliases:
                logger.info(_("No aliases defined."))
                return
            for cmd in self.aliases_base.aliases:
                self.list_alias(cmd)
        else:  # List alias by key
            for cmd in self.opts.alias:
                if cmd not in self.aliases_base.aliases:
                    logger.info(_("No match for alias: %s") % cmd)
                    continue
                self.list_alias(cmd)
site-packages/dnf/cli/commands/swap.py
#
# Copyright (C) 2016 Red Hat, Inc.
#
# This copyrighted material is made available to anyone wishing to use,
# modify, copy, or redistribute it subject to the terms and conditions of
# the GNU General Public License v.2, or (at your option) any later version.
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY expressed or implied, including the implied warranties of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General
# Public License for more details.  You should have received a copy of the
# GNU General Public License along with this program; if not, write to the
# Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
# 02110-1301, USA.  Any Red Hat trademarks that are incorporated in the
# source code or documentation are not subject to the GNU General Public
# License and may only be used or replicated with the express permission of
# Red Hat, Inc.
#

from __future__ import absolute_import
from __future__ import unicode_literals
from dnf.i18n import _
from dnf.cli import commands

import dnf.util
import logging

logger = logging.getLogger("dnf")


class SwapCommand(commands.Command):
    """A class containing methods needed by the cli to execute the swap command.
    """

    aliases = ('swap',)
    summary = _('run an interactive {prog} mode for removing and installing one spec').format(
        prog=dnf.util.MAIN_PROG_UPPER)

    @staticmethod
    def set_argparser(parser):
        parser.add_argument('remove_spec', action="store", help=_('The specs that will be removed'))
        parser.add_argument('install_spec', action="store", help=_(
            'The specs that will be installed'))

    def configure(self):
        demands = self.cli.demands
        demands.sack_activation = True
        demands.available_repos = True
        demands.resolving = True
        demands.root_user = True
        commands._checkGPGKey(self.base, self.cli)
        commands._checkEnabledRepo(self.base, [self.opts.install_spec])

    def _perform(self, cmd_str, spec):
        cmd_cls = self.cli.cli_commands.get(cmd_str)
        if cmd_cls is not None:
            cmd = cmd_cls(self.cli)
            self.cli.optparser.parse_command_args(cmd, [cmd_str, spec])
            cmd.run()

    def run(self):
        self._perform('remove', self.opts.remove_spec)
        self._perform('install', self.opts.install_spec)
site-packages/dnf/cli/commands/shell.py
# shell.py
# Shell CLI command.
#
# Copyright (C) 2016 Red Hat, Inc.
#
# This copyrighted material is made available to anyone wishing to use,
# modify, copy, or redistribute it subject to the terms and conditions of
# the GNU General Public License v.2, or (at your option) any later version.
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY expressed or implied, including the implied warranties of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General
# Public License for more details.  You should have received a copy of the
# GNU General Public License along with this program; if not, write to the
# Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
# 02110-1301, USA.  Any Red Hat trademarks that are incorporated in the
# source code or documentation are not subject to the GNU General Public
# License and may only be used or replicated with the express permission of
# Red Hat, Inc.
#

from dnf.cli import commands
from dnf.i18n import _, ucd

import dnf.util
import cmd
import copy
import dnf
import logging
import shlex
import sys


logger = logging.getLogger('dnf')


# only demands we'd like to override
class ShellDemandSheet(object):
    available_repos = True
    resolving = True
    root_user = True
    sack_activation = True


class ShellCommand(commands.Command, cmd.Cmd):

    aliases = ('shell', 'sh')
    summary = _('run an interactive {prog} shell').format(prog=dnf.util.MAIN_PROG_UPPER)

    MAPPING = {'repo': 'repo',
               'repository': 'repo',
               'exit': 'quit',
               'quit': 'quit',
               'run': 'ts_run',
               'ts': 'transaction',
               'transaction': 'transaction',
               'config': 'config',
               'resolvedep': 'resolve',
               'help': 'help'
               }
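
    # Added note: onecmd() routes shell input through this table, so a line such
    # as "repository enable updates" (hypothetical repo id) is dispatched to
    # self._repo(['enable', 'updates']); commands not listed here fall back to
    # the regular dnf command classes.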

    def __init__(self, cli):
        commands.Command.__init__(self, cli)
        cmd.Cmd.__init__(self)
        self.prompt = '> '

    @staticmethod
    def set_argparser(parser):
        parser.add_argument('script', nargs='?', metavar=_('SCRIPT'),
                            help=_('Script to run in {prog} shell').format(
                                prog=dnf.util.MAIN_PROG_UPPER))

    def configure(self):
        # append to ShellDemandSheet missing demands from
        # dnf.cli.demand.DemandSheet with their default values.
        default_demands = self.cli.demands
        self.cli.demands = ShellDemandSheet()
        for attr in dir(default_demands):
            if attr.startswith('__'):
                continue
            try:
                getattr(self.cli.demands, attr)
            except AttributeError:
                setattr(self.cli.demands, attr, getattr(default_demands, attr))

    def run(self):
        if self.opts.script:
            self._run_script(self.opts.script)
        else:
            self.cmdloop()

    def _clean(self):
        self.base._finalize_base()
        self.base._transaction = None
        self.base.fill_sack()

    def onecmd(self, line):
        if not line or line == '\n':
            return
        if line == 'EOF':
            line = 'quit'
        try:
            s_line = shlex.split(line)
        except:
            self._help()
            return
        # reset option parser before each command, keep usage information
        self.cli.optparser.__init__(reset_usage=False)
        opts = self.cli.optparser.parse_main_args(s_line)
        # Disable shell recursion.
        if opts.command == 'shell':
            return
        if opts.command in self.MAPPING:
            getattr(self, '_' + self.MAPPING[opts.command])(s_line[1::])
        else:
            cmd_cls = self.cli.cli_commands.get(opts.command)
            if cmd_cls is not None:
                cmd = cmd_cls(self.cli)
                try:
                    opts = self.cli.optparser.parse_command_args(cmd, s_line)
                except SystemExit:
                    # argparse.ArgumentParser prints usage information and executes
                    # sys.exit() on problems with parsing command line arguments
                    return
                try:
                    cmd.cli.demands = copy.deepcopy(self.cli.demands)
                    cmd.configure()
                    cmd.run()
                except dnf.exceptions.Error as e:
                    logger.error(_("Error:") + " " + ucd(e))
                    return
            else:
                self._help()

    def _config(self, args=None):
        def print_or_set(key, val, conf):
            if val:
                setattr(conf, key, val)
            else:
                try:
                    print('{}: {}'.format(key, getattr(conf, str(key))))
                except:
                    logger.warning(_('Unsupported key value.'))

        if not args or len(args) > 2:
            self._help('config')
            return

        key = args[0]
        val = args[1] if len(args) == 2 else None
        period = key.find('.')
        if period != -1:
            repo_name = key[:period]
            key = key[period+1:]
            repos = self.base.repos.get_matching(repo_name)
            for repo in repos:
                print_or_set(key, val, repo)
            if not repos:
                logger.warning(_('Could not find repository: %s'),
                               repo_name)
        else:
            print_or_set(key, val, self.base.conf)

    def _help(self, args=None):
        """Output help information.

        :param args: the command to output help information about. If
           *args* is empty, general help will be output.
        """
        arg = args[0] if isinstance(args, list) and len(args) > 0 else args
        msg = None

        if arg:
            if arg == 'config':
                msg = _("""{} arg [value]
  arg: debuglevel, errorlevel, obsoletes, gpgcheck, assumeyes, exclude,
        repo_id.gpgcheck, repo_id.exclude
    If no value is given it prints the current value.
    If value is given it sets that value.""").format(arg)

            elif arg == 'help':
                msg = _("""{} [command]
    print help""").format(arg)

            elif arg in ['repo', 'repository']:
                msg = _("""{} arg [option]
  list: lists repositories and their status. option = [all | id | glob]
  enable: enable repositories. option = repository id
  disable: disable repositories. option = repository id""").format(arg)

            elif arg == 'resolvedep':
                msg = _("""{}
    resolve the transaction set""").format(arg)

            elif arg in ['transaction', 'ts']:
                msg = _("""{} arg
  list: lists the contents of the transaction
  reset: reset (zero-out) the transaction
  run: run the transaction""").format(arg)

            elif arg == 'run':
                msg = _("""{}
    run the transaction""").format(arg)

            elif arg in ['exit', 'quit']:
                msg = _("""{}
    exit the shell""").format(arg)

        if not msg:
            self.cli.optparser.print_help()
            msg = _("""Shell specific arguments:

config                   set config options
help                     print help
repository (or repo)     enable, disable or list repositories
resolvedep               resolve the transaction set
transaction (or ts)      list, reset or run the transaction set
run                      resolve and run the transaction set
exit (or quit)           exit the shell""")

        print('\n' + msg)

    def _repo(self, args=None):
        cmd = args[0] if args else None

        if cmd in ['list', None]:
            self.onecmd('repolist ' + ' '.join(args[1:]))

        elif cmd in ['enable', 'disable']:
            repos = self.cli.base.repos
            fill_sack = False
            for repo in args[1::]:
                r = repos.get_matching(repo)
                if r:
                    getattr(r, cmd)()
                    fill_sack = True
                else:
                    logger.critical(_("Error:") + " " + _("Unknown repo: '%s'"),
                                    self.base.output.term.bold(repo))
            if fill_sack:
                self.base.fill_sack()

            # reset base._comps, as it has changed due to changing the repos
            self.base._comps = None

        else:
            self._help('repo')

    def _resolve(self, args=None):
        try:
            self.cli.base.resolve(self.cli.demands.allow_erasing)
        except dnf.exceptions.DepsolveError as e:
            print(e)

    def _run_script(self, file):
        try:
            with open(file, 'r') as fd:
                lines = fd.readlines()
                for line in lines:
                    if not line.startswith('#'):
                        self.onecmd(line)
        except IOError:
            logger.info(_('Error: Cannot open %s for reading'), self.base.output.term.bold(file))
            sys.exit(1)

    def _transaction(self, args=None):
        cmd = args[0] if args else None

        if cmd == 'reset':
            self._clean()
            return

        self._resolve()
        if cmd in ['list', None]:
            if self.base._transaction:
                out = self.base.output.list_transaction(self.base._transaction)
                logger.info(out)

        elif cmd == 'run':
            try:
                self.base.do_transaction()
            except dnf.exceptions.Error as e:
                logger.error(_("Error:") + " " + ucd(e))
            else:
                logger.info(_("Complete!"))
            self._clean()

        else:
            self._help('transaction')

    def _ts_run(self, args=None):
        self._transaction(['run'])

    def _quit(self, args=None):
        logger.info(_('Leaving Shell'))
        sys.exit(0)
site-packages/dnf/cli/commands/search.py
# search.py
# Search CLI command.
#
# Copyright (C) 2012-2016 Red Hat, Inc.
#
# This copyrighted material is made available to anyone wishing to use,
# modify, copy, or redistribute it subject to the terms and conditions of
# the GNU General Public License v.2, or (at your option) any later version.
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY expressed or implied, including the implied warranties of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General
# Public License for more details.  You should have received a copy of the
# GNU General Public License along with this program; if not, write to the
# Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
# 02110-1301, USA.  Any Red Hat trademarks that are incorporated in the
# source code or documentation are not subject to the GNU General Public
# License and may only be used or replicated with the express permission of
# Red Hat, Inc.
#

from __future__ import absolute_import
from __future__ import print_function
from __future__ import unicode_literals

import collections

from dnf.cli import commands
from dnf.cli.option_parser import OptionParser
from dnf.i18n import ucd, _, C_

import dnf.i18n
import dnf.match_counter
import dnf.util
import hawkey
import logging

logger = logging.getLogger('dnf')


class SearchCommand(commands.Command):
    """A class containing methods needed by the cli to execute the
    search command.
    """

    aliases = ('search', 'se')
    summary = _('search package details for the given string')

    @staticmethod
    def set_argparser(parser):
        parser.add_argument('--all', action='store_true',
                            help=_("search also package description and URL"))
        parser.add_argument('query_string', nargs='+', metavar=_('KEYWORD'),
                            choices=['all'], default=None,
                            action=OptionParser.PkgNarrowCallback,
                            help=_("Keyword to search for"))

    def _search(self, args):
        """Search for simple text tags in a package object."""

        TRANS_TBL = collections.OrderedDict((
            ('name', C_('long', 'Name')),
            ('summary', C_('long', 'Summary')),
            ('description', C_('long', 'Description')),
            ('url', _('URL')),
        ))

        def _translate_attr(attr):
            try:
                return TRANS_TBL[attr]
            except:
                return attr

        def _print_section_header(exact_match, attrs, keys):
            trans_attrs = map(_translate_attr, attrs)
            # TRANSLATORS: separator used between package attributes (eg. Name & Summary & URL)
            trans_attrs_str = _(' & ').join(trans_attrs)
            if exact_match:
                # TRANSLATORS: %s  - translated package attributes,
                #              %%s - found keys (in listed attributes)
                section_text = _('%s Exactly Matched: %%s') % trans_attrs_str
            else:
                # TRANSLATORS: %s  - translated package attributes,
                #              %%s - found keys (in listed attributes)
                section_text = _('%s Matched: %%s') % trans_attrs_str
            formatted = self.base.output.fmtSection(section_text % ", ".join(keys))
            print(ucd(formatted))

        counter = dnf.match_counter.MatchCounter()
        for arg in args:
            self._search_counted(counter, 'name', arg)
            self._search_counted(counter, 'summary', arg)

        if self.opts.all:
            for arg in args:
                self._search_counted(counter, 'description', arg)
                self._search_counted(counter, 'url', arg)
        else:
            needles = len(args)
            pkgs = list(counter.keys())
            for pkg in pkgs:
                if len(counter.matched_needles(pkg)) != needles:
                    del counter[pkg]

        used_attrs = None
        matched_needles = None
        exact_match = False
        print_section_header = False
        limit = None
        if not self.base.conf.showdupesfromrepos:
            limit = self.base.sack.query().filterm(pkg=counter.keys()).latest()

        seen = set()
        for pkg in counter.sorted(reverse=True, limit_to=limit):
            if not self.base.conf.showdupesfromrepos:
                if pkg.name + pkg.arch in seen:
                    continue
                seen.add(pkg.name + pkg.arch)

            if used_attrs != counter.matched_keys(pkg):
                used_attrs = counter.matched_keys(pkg)
                print_section_header = True
            if matched_needles != counter.matched_needles(pkg):
                matched_needles = counter.matched_needles(pkg)
                print_section_header = True
            if exact_match != (counter.matched_haystacks(pkg) == matched_needles):
                exact_match = counter.matched_haystacks(pkg) == matched_needles
                print_section_header = True
            if print_section_header:
                _print_section_header(exact_match, used_attrs, matched_needles)
                print_section_header = False
            self.base.output.matchcallback(pkg, counter.matched_haystacks(pkg), args)

        if len(counter) == 0:
            logger.info(_('No matches found.'))

    def _search_counted(self, counter, attr, needle):
        fdict = {'%s__substr' % attr : needle}
        if dnf.util.is_glob_pattern(needle):
            fdict = {'%s__glob' % attr : needle}
        q = self.base.sack.query().filterm(hawkey.ICASE, **fdict)
        for pkg in q.run():
            counter.add(pkg, attr, needle)
        return counter
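
    # Illustration (example needles only): for attr='summary' and needle='editor'
    # the filter dict is {'summary__substr': 'editor'}; a glob needle such as
    # 'edit*' becomes {'summary__glob': 'edit*'}. Matching is case-insensitive
    # via hawkey.ICASE.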

    def pre_configure(self):
        if not self.opts.quiet:
            self.cli.redirect_logger(stdout=logging.WARNING, stderr=logging.INFO)

    def configure(self):
        if not self.opts.quiet:
            self.cli.redirect_repo_progress()
        demands = self.cli.demands
        demands.available_repos = True
        demands.fresh_metadata = False
        demands.sack_activation = True
        self.opts.all = self.opts.all or self.opts.query_string_action

    def run(self):
        logger.debug(_('Searching Packages: '))
        return self._search(self.opts.query_string)
site-packages/dnf/cli/commands/repoquery.py
#
# Copyright (C) 2014 Red Hat, Inc.
#
# This copyrighted material is made available to anyone wishing to use,
# modify, copy, or redistribute it subject to the terms and conditions of
# the GNU General Public License v.2, or (at your option) any later version.
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY expressed or implied, including the implied warranties of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General
# Public License for more details.  You should have received a copy of the
# GNU General Public License along with this program; if not, write to the
# Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
# 02110-1301, USA.  Any Red Hat trademarks that are incorporated in the
# source code or documentation are not subject to the GNU General Public
# License and may only be used or replicated with the express permission of
# Red Hat, Inc.
#

from __future__ import absolute_import
from __future__ import print_function
from __future__ import unicode_literals
from dnf.i18n import _
from dnf.cli import commands
from dnf.cli.option_parser import OptionParser

import argparse
import datetime
import logging
import re
import sys

import dnf
import dnf.cli
import dnf.exceptions
import dnf.subject
import dnf.util
import hawkey

logger = logging.getLogger('dnf')


QFORMAT_DEFAULT = '%{name}-%{epoch}:%{version}-%{release}.%{arch}'
# matches %[-][dd]{attr}
QFORMAT_MATCH = re.compile(r'%(-?\d*?){([:.\w]+?)}')

QUERY_TAGS = """\
name, arch, epoch, version, release, reponame (repoid), from_repo, evr,
debug_name, source_name, source_debug_name,
installtime, buildtime, size, downloadsize, installsize,
provides, requires, obsoletes, conflicts, sourcerpm,
description, summary, license, url, reason"""

OPTS_MAPPING = {
    'conflicts': 'conflicts',
    'enhances': 'enhances',
    'obsoletes': 'obsoletes',
    'provides': 'provides',
    'recommends': 'recommends',
    'requires': 'requires',
    'requires-pre': 'requires_pre',
    'suggests': 'suggests',
    'supplements': 'supplements'
}


def rpm2py_format(queryformat):
    """Convert a rpm like QUERYFMT to an python .format() string."""
    def fmt_repl(matchobj):
        fill = matchobj.groups()[0]
        key = matchobj.groups()[1]
        if fill:
            if fill[0] == '-':
                fill = '>' + fill[1:]
            else:
                fill = '<' + fill
            fill = ':' + fill
        return '{0.' + key.lower() + fill + "}"

    def brackets(txt):
        return txt.replace('{', '{{').replace('}', '}}')

    queryformat = queryformat.replace("\\n", "\n").replace("\\t", "\t")
    for key, value in OPTS_MAPPING.items():
        queryformat = queryformat.replace(key, value)
    fmt = ""
    spos = 0
    for item in QFORMAT_MATCH.finditer(queryformat):
        fmt += brackets(queryformat[spos:item.start()])
        fmt += fmt_repl(item)
        spos = item.end()
    fmt += brackets(queryformat[spos:])
    return fmt
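
# Illustration of the conversion above (example templates only):
#   rpm2py_format('%{name}-%{evr}.%{arch}')  ->  '{0.name}-{0.evr}.{0.arch}'
#   rpm2py_format('%-20{name} %{version}')   ->  '{0.name:>20} {0.version}'
# The resulting string is applied with .format(PackageWrapper(pkg)) in
# build_format_fn().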


class _CommaSplitCallback(OptionParser._SplitCallback):
    SPLITTER = r'\s*,\s*'


class RepoQueryCommand(commands.Command):
    """A class containing methods needed by the cli to execute the repoquery command.
    """
    nevra_forms = {'repoquery-n': hawkey.FORM_NAME,
                   'repoquery-na': hawkey.FORM_NA,
                   'repoquery-nevra': hawkey.FORM_NEVRA}

    aliases = ('repoquery', 'rq') + tuple(nevra_forms.keys())
    summary = _('search for packages matching keyword')

    @staticmethod
    def filter_repo_arch(opts, query):
        """Filter query by repoid and arch options"""
        if opts.repo:
            query.filterm(reponame=opts.repo)
        if opts.arches:
            query.filterm(arch=opts.arches)
        return query
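
    # For example (hypothetical values), with --repo updates --arch x86_64,noarch
    # the query is limited to packages whose reponame matches 'updates' and whose
    # arch is x86_64 or noarch before any further filtering in run().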

    @staticmethod
    def set_argparser(parser):
        parser.add_argument('-a', '--all', dest='queryall', action='store_true',
                            help=_("Query all packages (shorthand for repoquery '*' "
                                   "or repoquery without argument)"))
        parser.add_argument('--show-duplicates', action='store_true',
                            help=_("Query all versions of packages (default)"))
        parser.add_argument('--arch', '--archlist', dest='arches', default=[],
                            action=_CommaSplitCallback, metavar='[arch]',
                            help=_('show only results from this ARCH'))
        parser.add_argument('-f', '--file', metavar='FILE', nargs='+',
                            help=_('show only results that owns FILE'))
        parser.add_argument('--whatconflicts', default=[], action=_CommaSplitCallback,
                            metavar='REQ',
                            help=_('show only results that conflict REQ'))
        parser.add_argument('--whatdepends', default=[], action=_CommaSplitCallback,
                            metavar='REQ',
                            help=_('show results that require, suggest, supplement, enhance, '
                                   'or recommend package provides and files REQ'))
        parser.add_argument('--whatobsoletes', default=[], action=_CommaSplitCallback,
                            metavar='REQ',
                            help=_('show only results that obsolete REQ'))
        parser.add_argument('--whatprovides', default=[], action=_CommaSplitCallback,
                            metavar='REQ',
                            help=_('show only results that provide REQ'))
        parser.add_argument('--whatrequires', default=[], action=_CommaSplitCallback,
                            metavar='REQ',
                            help=_('show results that require package provides and files REQ'))
        parser.add_argument('--whatrecommends', default=[], action=_CommaSplitCallback,
                            metavar='REQ',
                            help=_('show only results that recommend REQ'))
        parser.add_argument('--whatenhances', default=[], action=_CommaSplitCallback,
                            metavar='REQ',
                            help=_('show only results that enhance REQ'))
        parser.add_argument('--whatsuggests', default=[], action=_CommaSplitCallback,
                            metavar='REQ',
                            help=_('show only results that suggest REQ'))
        parser.add_argument('--whatsupplements', default=[], action=_CommaSplitCallback,
                            metavar='REQ',
                            help=_('show only results that supplement REQ'))
        whatrequiresform = parser.add_mutually_exclusive_group()
        whatrequiresform.add_argument("--alldeps", action="store_true",
                                      help=_("check non-explicit dependencies (files and Provides); default"))
        whatrequiresform.add_argument("--exactdeps", action="store_true",
                                      help=_('check dependencies exactly as given, opposite of --alldeps'))
        parser.add_argument("--recursive", action="store_true", help=_(
            'used with --whatrequires or with --requires --resolve, query packages recursively.'))
        parser.add_argument('--deplist', action='store_true', help=_(
            "show a list of all dependencies and what packages provide them"))
        parser.add_argument('--resolve', action='store_true',
                            help=_('resolve capabilities to originating package(s)'))
        parser.add_argument("--tree", action="store_true",
                            help=_('show recursive tree for package(s)'))
        parser.add_argument('--srpm', action='store_true',
                            help=_('operate on corresponding source RPM'))
        parser.add_argument("--latest-limit", dest='latest_limit', type=int,
                             help=_('show N latest packages for a given name.arch'
                                    ' (or latest but N if N is negative)'))
        parser.add_argument("--disable-modular-filtering", action="store_true",
                            help=_("list also packages of inactive module streams"))

        outform = parser.add_mutually_exclusive_group()
        outform.add_argument('-i', "--info", dest='queryinfo',
                             default=False, action='store_true',
                             help=_('show detailed information about the package'))
        outform.add_argument('-l', "--list", dest='queryfilelist',
                             default=False, action='store_true',
                             help=_('show list of files in the package'))
        outform.add_argument('-s', "--source", dest='querysourcerpm',
                             default=False, action='store_true',
                             help=_('show package source RPM name'))
        outform.add_argument('--changelogs', dest='querychangelogs',
                             default=False, action='store_true',
                             help=_('show changelogs of the package'))
        outform.add_argument('--qf', "--queryformat", dest='queryformat',
                             default=QFORMAT_DEFAULT,
                             help=_('display format for listing packages: '
                                    '"%%{name} %%{version} ...", '
                                    'use --querytags to view full tag list'))
        parser.add_argument('--querytags', action='store_true',
                            help=_('show available tags to use with '
                                   '--queryformat'))
        outform.add_argument("--nevra", dest='queryformat', const=QFORMAT_DEFAULT,
                             action='store_const',
                             help=_('use name-epoch:version-release.architecture format for '
                                    'displaying found packages (default)'))
        outform.add_argument("--nvr", dest='queryformat', const='%{name}-%{version}-%{release}',
                             action='store_const', help=_('use name-version-release format for '
                                                          'displaying found packages '
                                                          '(rpm query default)'))
        outform.add_argument("--envra", dest='queryformat',
                             const='%{epoch}:%{name}-%{version}-%{release}.%{arch}',
                             action='store_const',
                             help=_('use epoch:name-version-release.architecture format for '
                                    'displaying found packages'))
        outform.add_argument('--groupmember', action="store_true", help=_(
            'Display the comps groups that the selected packages belong to'))
        pkgfilter = parser.add_mutually_exclusive_group()
        pkgfilter.add_argument("--duplicates", dest='pkgfilter',
                               const='duplicated', action='store_const',
                               help=_('limit the query to installed duplicate '
                                      'packages'))
        pkgfilter.add_argument("--duplicated", dest='pkgfilter',
                               const='duplicated', action='store_const',
                               help=argparse.SUPPRESS)
        pkgfilter.add_argument("--installonly", dest='pkgfilter',
                               const='installonly', action='store_const',
                               help=_('limit the query to installed installonly packages'))
        pkgfilter.add_argument("--unsatisfied", dest='pkgfilter',
                               const='unsatisfied', action='store_const',
                               help=_('limit the query to installed packages with unsatisfied dependencies'))
        parser.add_argument('--location', action='store_true',
                            help=_('show a location from where packages can be downloaded'))
        package_attribute = parser.add_mutually_exclusive_group()
        help_msgs = {
            'conflicts': _('Display capabilities that the package conflicts with.'),
            'depends': _('Display capabilities that the package can depend on, enhance, recommend,'
                         ' suggest, and supplement.'),
            'enhances': _('Display capabilities that the package can enhance.'),
            'provides': _('Display capabilities provided by the package.'),
            'recommends':  _('Display capabilities that the package recommends.'),
            'requires':  _('Display capabilities that the package depends on.'),
            'requires-pre':  _('If the package is not installed display capabilities that it depends on for '
                               'running %%pre and %%post scriptlets. If the package is installed display '
                               'capabilities that it depends on for %%pre, %%post, %%preun and %%postun.'),
            'suggests':  _('Display capabilities that the package suggests.'),
            'supplements':  _('Display capabilities that the package can supplement.')
        }
        for arg, help_msg in help_msgs.items():
            name = '--%s' % arg
            package_attribute.add_argument(name, dest='packageatr', action='store_const',
                                           const=arg, help=help_msg)
        parser.add_argument('--available', action="store_true", help=_('Display only available packages.'))

        help_list = {
            'installed': _('Display only installed packages.'),
            'extras': _('Display only packages that are not present in any of available repositories.'),
            'upgrades': _('Display only packages that provide an upgrade for some already installed package.'),
            'unneeded': _('Display only packages that can be removed by "{prog} autoremove" '
                          'command.').format(prog=dnf.util.MAIN_PROG),
            'userinstalled': _('Display only packages that were installed by user.')
        }
        list_group = parser.add_mutually_exclusive_group()
        for list_arg, help_arg in help_list.items():
            switch = '--%s' % list_arg
            list_group.add_argument(switch, dest='list', action='store_const',
                                    const=list_arg, help=help_arg)

        # make --autoremove hidden compatibility alias for --unneeded
        list_group.add_argument(
            '--autoremove', dest='list', action='store_const',
            const="unneeded", help=argparse.SUPPRESS)
        parser.add_argument('--recent', action="store_true", help=_('Display only recently edited packages'))

        parser.add_argument('key', nargs='*', metavar="KEY",
                            help=_('the key to search for'))

    def pre_configure(self):
        if not self.opts.quiet:
            self.cli.redirect_logger(stdout=logging.WARNING, stderr=logging.INFO)

    def configure(self):
        if not self.opts.quiet:
            self.cli.redirect_repo_progress()
        demands = self.cli.demands

        if self.opts.obsoletes:
            if self.opts.packageatr:
                self.cli._option_conflict("--obsoletes", "--" + self.opts.packageatr)
            else:
                self.opts.packageatr = "obsoletes"

        if self.opts.querytags:
            return

        if self.opts.resolve and not self.opts.packageatr:
            raise dnf.cli.CliError(
                _("Option '--resolve' has to be used together with one of the "
                  "'--conflicts', '--depends', '--enhances', '--provides', '--recommends', "
                  "'--requires', '--requires-pre', '--suggests' or '--supplements' options"))

        if self.opts.recursive:
            if self.opts.exactdeps:
                self.cli._option_conflict("--recursive", "--exactdeps")
            if not any([self.opts.whatrequires,
                        (self.opts.packageatr == "requires" and self.opts.resolve)]):
                raise dnf.cli.CliError(
                    _("Option '--recursive' has to be used with '--whatrequires <REQ>' "
                      "(optionally with '--alldeps', but not with '--exactdeps'), or with "
                      "'--requires <REQ> --resolve'"))

        if self.opts.alldeps or self.opts.exactdeps:
            if not (self.opts.whatrequires or self.opts.whatdepends):
                raise dnf.cli.CliError(
                    _("argument {} requires --whatrequires or --whatdepends option".format(
                        '--alldeps' if self.opts.alldeps else '--exactdeps')))

        if self.opts.srpm:
            self.base.repos.enable_source_repos()

        if (self.opts.list not in ["installed", "userinstalled"] and
           self.opts.pkgfilter != "installonly") or self.opts.available:
            demands.available_repos = True

        demands.sack_activation = True

        if self.opts.querychangelogs:
            demands.changelogs = True

    def build_format_fn(self, opts, pkg):
        if opts.querychangelogs:
            out = []
            out.append('Changelog for %s' % str(pkg))
            for chlog in pkg.changelogs:
                dt = chlog['timestamp']
                out.append('* %s %s\n%s\n' % (dt.strftime("%a %b %d %Y"),
                                              dnf.i18n.ucd(chlog['author']),
                                              dnf.i18n.ucd(chlog['text'])))
            return '\n'.join(out)
        try:
            po = PackageWrapper(pkg)
            if opts.queryinfo:
                return self.base.output.infoOutput(pkg)
            elif opts.queryfilelist:
                filelist = po.files
                if not filelist:
                    print(_('Package {} contains no files').format(pkg), file=sys.stderr)
                return filelist
            elif opts.querysourcerpm:
                return po.sourcerpm
            else:
                return rpm2py_format(opts.queryformat).format(po)
        except AttributeError as e:
            # catch the case where the user has specified attributes
            # that don't exist on the dnf Package object.
            raise dnf.exceptions.Error(str(e))

    def _resolve_nevras(self, nevras, base_query):
        resolved_nevras_query = self.base.sack.query().filterm(empty=True)
        for nevra in nevras:
            resolved_nevras_query = resolved_nevras_query.union(base_query.intersection(
                dnf.subject.Subject(nevra).get_best_query(
                    self.base.sack,
                    with_provides=False,
                    with_filenames=False
                )
            ))

        return resolved_nevras_query

    def _do_recursive_deps(self, query_in, query_select, done=None):
        done = done if done else query_select

        query_required = query_in.filter(requires=query_select)

        query_select = query_required.difference(done)
        done = query_required.union(done)

        if query_select:
            done = self._do_recursive_deps(query_in, query_select, done=done)

        return done

    def by_all_deps(self, names, query, all_dep_types=False):
        # in case of arguments being NEVRAs, resolve them to packages
        resolved_nevras_query = self._resolve_nevras(names, query)

        # filter the arguments directly as reldeps
        depquery = query.filter(requires__glob=names)

        # filter the resolved NEVRAs as packages
        depquery = depquery.union(query.filter(requires=resolved_nevras_query))

        if all_dep_types:
            # TODO this is very inefficient, as it resolves the `names` glob to
            # reldeps four more times, which in a reasonably wide glob like
            # `dnf repoquery --whatdepends "libdnf*"` can take roughly 50% of
            # the total execution time.
            depquery = depquery.union(query.filter(recommends__glob=names))
            depquery = depquery.union(query.filter(enhances__glob=names))
            depquery = depquery.union(query.filter(supplements__glob=names))
            depquery = depquery.union(query.filter(suggests__glob=names))

            depquery = depquery.union(query.filter(recommends=resolved_nevras_query))
            depquery = depquery.union(query.filter(enhances=resolved_nevras_query))
            depquery = depquery.union(query.filter(supplements=resolved_nevras_query))
            depquery = depquery.union(query.filter(suggests=resolved_nevras_query))

        if self.opts.recursive:
            depquery = self._do_recursive_deps(query, depquery)

        return depquery
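
    # Added summary: for e.g. names=['libdnf*'] the returned query holds packages
    # whose requires (and, with all_dep_types, also recommends/enhances/
    # supplements/suggests) match the glob directly as a reldep or match any
    # package the glob resolves to; with --recursive, packages that transitively
    # require those results are included as well.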

    def _get_recursive_providers_query(self, query_in, providers, done=None):
        done = done if done else self.base.sack.query().filterm(empty=True)
        t = self.base.sack.query().filterm(empty=True)
        for pkg in providers.run():
            t = t.union(query_in.filter(provides=pkg.requires))
        query_select = t.difference(done)
        if query_select:
            done = self._get_recursive_providers_query(query_in, query_select, done=t.union(done))
        return t.union(done)

    def _add_add_remote_packages(self):
        rpmnames = []
        remote_packages = []
        for key in self.opts.key:
            schemes = dnf.pycomp.urlparse.urlparse(key)[0]
            if key.endswith('.rpm'):
                rpmnames.append(key)
            elif schemes and schemes in ('http', 'ftp', 'file', 'https'):
                rpmnames.append(key)
        if rpmnames:
            remote_packages = self.base.add_remote_rpms(
                rpmnames, strict=False, progress=self.base.output.progress)
        return remote_packages

    def run(self):
        if self.opts.querytags:
            print(QUERY_TAGS)
            return

        self.cli._populate_update_security_filter(self.opts)

        q = self.base.sack.query(
            flags=hawkey.IGNORE_MODULAR_EXCLUDES
            if self.opts.disable_modular_filtering
            else hawkey.APPLY_EXCLUDES
        )
        if self.opts.key:
            remote_packages = self._add_add_remote_packages()

            kwark = {}
            if self.opts.command in self.nevra_forms:
                kwark["forms"] = [self.nevra_forms[self.opts.command]]
            pkgs = []
            query_results = q.filter(empty=True)

            if remote_packages:
                query_results = query_results.union(
                    self.base.sack.query().filterm(pkg=remote_packages))

            for key in self.opts.key:
                query_results = query_results.union(
                    dnf.subject.Subject(key, ignore_case=True).get_best_query(
                        self.base.sack, with_provides=False, query=q, **kwark))
            q = query_results

        if self.opts.recent:
            q = q._recent(self.base.conf.recent)
        if self.opts.available:
            if self.opts.list and self.opts.list != "installed":
                self.cli.optparser.print_usage()
                raise dnf.exceptions.Error(_("argument {}: not allowed with argument {}".format(
                    "--available", "--" + self.opts.list)))
        elif self.opts.list == "unneeded":
            q = q._unneeded(self.base.history.swdb)
        elif self.opts.list and self.opts.list != 'userinstalled':
            q = getattr(q, self.opts.list)()

        if self.opts.pkgfilter == "duplicated":
            installonly = self.base._get_installonly_query(q)
            q = q.difference(installonly).duplicated()
        elif self.opts.pkgfilter == "installonly":
            q = self.base._get_installonly_query(q)
        elif self.opts.pkgfilter == "unsatisfied":
            rpmdb = dnf.sack.rpmdb_sack(self.base)
            rpmdb._configure(self.base.conf.installonlypkgs, self.base.conf.installonly_limit)
            goal = dnf.goal.Goal(rpmdb)
            goal.protect_running_kernel = False
            solved = goal.run(verify=True)
            if not solved:
                print(dnf.util._format_resolve_problems(goal.problem_rules()))
            return
        elif not self.opts.list:
            # do not show packages from @System repo
            q = q.available()

        # filter repo and arch
        q = self.filter_repo_arch(self.opts, q)
        orquery = q

        if self.opts.file:
            q.filterm(file__glob=self.opts.file)
        if self.opts.whatconflicts:
            rels = q.filter(conflicts__glob=self.opts.whatconflicts)
            q = rels.union(q.filter(conflicts=self._resolve_nevras(self.opts.whatconflicts, q)))
        if self.opts.whatobsoletes:
            q.filterm(obsoletes=self.opts.whatobsoletes)
        if self.opts.whatprovides:
            query_for_provide = q.filter(provides__glob=self.opts.whatprovides)
            if query_for_provide:
                q = query_for_provide
            else:
                q.filterm(file__glob=self.opts.whatprovides)

        if self.opts.whatrequires:
            if (self.opts.exactdeps):
                q.filterm(requires__glob=self.opts.whatrequires)
            else:
                q = self.by_all_deps(self.opts.whatrequires, q)

        if self.opts.whatdepends:
            if (self.opts.exactdeps):
                dependsquery = q.filter(requires__glob=self.opts.whatdepends)
                dependsquery = dependsquery.union(q.filter(recommends__glob=self.opts.whatdepends))
                dependsquery = dependsquery.union(q.filter(enhances__glob=self.opts.whatdepends))
                dependsquery = dependsquery.union(q.filter(supplements__glob=self.opts.whatdepends))
                q = dependsquery.union(q.filter(suggests__glob=self.opts.whatdepends))
            else:
                q = self.by_all_deps(self.opts.whatdepends, q, True)

        if self.opts.whatrecommends:
            rels = q.filter(recommends__glob=self.opts.whatrecommends)
            q = rels.union(q.filter(recommends=self._resolve_nevras(self.opts.whatrecommends, q)))
        if self.opts.whatenhances:
            rels = q.filter(enhances__glob=self.opts.whatenhances)
            q = rels.union(q.filter(enhances=self._resolve_nevras(self.opts.whatenhances, q)))
        if self.opts.whatsupplements:
            rels = q.filter(supplements__glob=self.opts.whatsupplements)
            q = rels.union(q.filter(supplements=self._resolve_nevras(self.opts.whatsupplements, q)))
        if self.opts.whatsuggests:
            rels = q.filter(suggests__glob=self.opts.whatsuggests)
            q = rels.union(q.filter(suggests=self._resolve_nevras(self.opts.whatsuggests, q)))

        if self.opts.latest_limit:
            q = q.latest(self.opts.latest_limit)
        # reduce a query to security upgrades if they are specified
        q = self.base._merge_update_filters(q, warning=False)
        if self.opts.srpm:
            pkg_list = []
            for pkg in q:
                srcname = pkg.source_name
                if srcname is not None:
                    tmp_query = self.base.sack.query().filterm(name=srcname, evr=pkg.evr,
                                                               arch='src')
                    pkg_list += tmp_query.run()
            q = self.base.sack.query().filterm(pkg=pkg_list)
        if self.opts.tree:
            if not self.opts.whatrequires and self.opts.packageatr not in (
                    'conflicts', 'enhances', 'obsoletes', 'provides', 'recommends',
                    'requires', 'suggests', 'supplements'):
                raise dnf.exceptions.Error(
                    _("No valid switch specified\nusage: {prog} repoquery [--conflicts|"
                      "--enhances|--obsoletes|--provides|--recommends|--requires|"
                      "--suggest|--supplements|--whatrequires] [key] [--tree]\n\n"
                      "description:\n  For the given packages print a tree of the"
                      "packages.").format(prog=dnf.util.MAIN_PROG))
            self.tree_seed(q, orquery, self.opts)
            return

        pkgs = set()
        if self.opts.packageatr:
            rels = set()
            for pkg in q.run():
                if self.opts.list != 'userinstalled' or self.base.history.user_installed(pkg):
                    if self.opts.packageatr == 'depends':
                        rels.update(pkg.requires + pkg.enhances + pkg.suggests +
                                    pkg.supplements + pkg.recommends)
                    else:
                        rels.update(getattr(pkg, OPTS_MAPPING[self.opts.packageatr]))
            if self.opts.resolve:
                # find the providing packages and show them
                if self.opts.list == "installed":
                    query = self.filter_repo_arch(self.opts, self.base.sack.query())
                else:
                    query = self.filter_repo_arch(self.opts, self.base.sack.query().available())
                providers = query.filter(provides=rels)
                if self.opts.recursive:
                    providers = providers.union(
                        self._get_recursive_providers_query(query, providers))
                pkgs = set()
                for pkg in providers.latest().run():
                    pkgs.add(self.build_format_fn(self.opts, pkg))
            else:
                pkgs.update(str(rel) for rel in rels)
        elif self.opts.location:
            for pkg in q.run():
                location = pkg.remote_location()
                if location is not None:
                    pkgs.add(location)
        elif self.opts.deplist:
            pkgs = []
            for pkg in sorted(set(q.run())):
                if self.opts.list != 'userinstalled' or self.base.history.user_installed(pkg):
                    deplist_output = []
                    deplist_output.append('package: ' + str(pkg))
                    for req in sorted([str(req) for req in pkg.requires]):
                        deplist_output.append('  dependency: ' + req)
                        subject = dnf.subject.Subject(req)
                        query = subject.get_best_query(self.base.sack)
                        query = self.filter_repo_arch(
                            self.opts, query.available())
                        if not self.opts.verbose:
                            query = query.latest()
                        for provider in query.run():
                            deplist_output.append('   provider: ' + str(provider))
                    pkgs.append('\n'.join(deplist_output))
            if pkgs:
                print('\n\n'.join(pkgs))
            return
        elif self.opts.groupmember:
            self._group_member_report(q)
            return

        else:
            for pkg in q.run():
                if self.opts.list != 'userinstalled' or self.base.history.user_installed(pkg):
                    pkgs.add(self.build_format_fn(self.opts, pkg))

        if pkgs:
            if self.opts.queryinfo:
                print("\n\n".join(sorted(pkgs)))
            else:
                print("\n".join(sorted(pkgs)))

    def _group_member_report(self, query):
        package_conf_dict = {}
        for group in self.base.comps.groups:
            package_conf_dict[group.id] = set([pkg.name for pkg in group.packages_iter()])
        group_package_dict = {}
        pkg_not_in_group = []
        for pkg in query.run():
            group_id_list = []
            for group_id, package_name_set in package_conf_dict.items():
                if pkg.name in package_name_set:
                    group_id_list.append(group_id)
            if group_id_list:
                group_package_dict.setdefault(
                    '$'.join(sorted(group_id_list)), []).append(str(pkg))
            else:
                pkg_not_in_group.append(str(pkg))
        output = []
        for key, package_list in sorted(group_package_dict.items()):
            output.append(
                '\n'.join(sorted(package_list) + sorted(['  @' + id for id in key.split('$')])))
        output.append('\n'.join(sorted(pkg_not_in_group)))
        if output:
            print('\n'.join(output))
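
    # Illustrative shape of the report produced by _group_member_report() above
    # (package and group names are hypothetical): packages that belong to a
    # comps group are listed first, followed by indented "@<group-id>" lines,
    # and packages that belong to no group are printed last without a marker:
    #
    #   bash-5.1.8-2.fc35.x86_64
    #   coreutils-9.0-2.fc35.x86_64
    #     @core
    #   some-local-pkg-1.0-1.noarch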

    def grow_tree(self, level, pkg, opts):
        pkg_string = self.build_format_fn(opts, pkg)
        if level == -1:
            print(pkg_string)
            return
        spacing = " "
        for x in range(0, level):
            spacing += "|   "
        requires = []
        for requirepkg in pkg.requires:
            requires.append(str(requirepkg))
        reqstr = "[" + str(len(requires)) + ": " + ", ".join(requires) + "]"
        print(spacing + r"\_ " + pkg_string + " " + reqstr)

    def tree_seed(self, query, aquery, opts, level=-1, usedpkgs=None):
        for pkg in sorted(set(query.run()), key=lambda p: p.name):
            usedpkgs = set() if usedpkgs is None or level == -1 else usedpkgs
            if pkg.name.startswith("rpmlib") or pkg.name.startswith("solvable"):
                return
            self.grow_tree(level, pkg, opts)
            if pkg not in usedpkgs:
                usedpkgs.add(pkg)
                if opts.packageatr:
                    strpkg = getattr(pkg, opts.packageatr)
                    ar = {}
                    for name in set(strpkg):
                        pkgquery = self.base.sack.query().filterm(provides=name)
                        for querypkg in pkgquery:
                            ar[querypkg.name + "." + querypkg.arch] = querypkg
                    pkgquery = self.base.sack.query().filterm(pkg=list(ar.values()))
                else:
                    pkgquery = self.by_all_deps((pkg.name, ), aquery) if opts.alldeps \
                        else aquery.filter(requires__glob=pkg.name)
                self.tree_seed(pkgquery, aquery, opts, level + 1, usedpkgs)


class PackageWrapper(object):

    """Wrapper for dnf.package.Package, so we can control formatting."""

    def __init__(self, pkg):
        self._pkg = pkg

    def __getattr__(self, attr):
        atr = getattr(self._pkg, attr)
        if atr is None:
            return "(none)"
        if isinstance(atr, list):
            return '\n'.join(sorted({dnf.i18n.ucd(reldep) for reldep in atr}))
        return dnf.i18n.ucd(atr)

    @staticmethod
    def _get_timestamp(timestamp):
        if timestamp > 0:
            dt = datetime.datetime.utcfromtimestamp(timestamp)
            return dt.strftime("%Y-%m-%d %H:%M")
        else:
            return ''

    @property
    def buildtime(self):
        return self._get_timestamp(self._pkg.buildtime)

    @property
    def installtime(self):
        return self._get_timestamp(self._pkg.installtime)
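
# A minimal sketch of the formatting behaviour implemented by
# PackageWrapper.__getattr__ above.  FakePkg is a hypothetical stand-in for a
# dnf.package.Package and exists only for this illustration:
#
#     class FakePkg(object):
#         url = None
#         requires = ["glibc", "bash"]
#
#     wrapped = PackageWrapper(FakePkg())
#     wrapped.url        # -> "(none)"       (None is replaced by a placeholder)
#     wrapped.requires   # -> "bash\nglibc"  (lists are de-duplicated, sorted
#                                             and newline-joined)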
site-packages/dnf/cli/commands/distrosync.py
# distrosync.py
# distro-sync CLI command.
#
# Copyright (C) 2012-2016 Red Hat, Inc.
#
# This copyrighted material is made available to anyone wishing to use,
# modify, copy, or redistribute it subject to the terms and conditions of
# the GNU General Public License v.2, or (at your option) any later version.
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY expressed or implied, including the implied warranties of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General
# Public License for more details.  You should have received a copy of the
# GNU General Public License along with this program; if not, write to the
# Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
# 02110-1301, USA.  Any Red Hat trademarks that are incorporated in the
# source code or documentation are not subject to the GNU General Public
# License and may only be used or replicated with the express permission of
# Red Hat, Inc.
#

from __future__ import absolute_import
from dnf.cli import commands
from dnf.i18n import _


class DistroSyncCommand(commands.Command):
    """A class containing methods needed by the cli to execute the
    distro-sync command.
    """

    aliases = ('distro-sync', 'distrosync', 'distribution-synchronization', 'dsync')
    summary = _('synchronize installed packages to the latest available versions')

    @staticmethod
    def set_argparser(parser):
        parser.add_argument('package', nargs='*', help=_('Package to synchronize'))

    def configure(self):
        demands = self.cli.demands
        demands.sack_activation = True
        demands.available_repos = True
        demands.resolving = True
        demands.root_user = True
        commands._checkGPGKey(self.base, self.cli)
        commands._checkEnabledRepo(self.base, self.opts.package)

    def run(self):
        return self.base.distro_sync_userlist(self.opts.package)
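
# Typical invocations of this command, assuming the standard dnf command-line
# entry point (the package arguments are optional):
#
#   dnf distro-sync              # synchronize every installed package
#   dnf distro-sync vim bash     # limit the synchronization to the named packages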
site-packages/dnf/cli/commands/install.py
# install.py
# Install CLI command.
#
# Copyright (C) 2014-2016 Red Hat, Inc.
#
# This copyrighted material is made available to anyone wishing to use,
# modify, copy, or redistribute it subject to the terms and conditions of
# the GNU General Public License v.2, or (at your option) any later version.
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY expressed or implied, including the implied warranties of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General
# Public License for more details.  You should have received a copy of the
# GNU General Public License along with this program; if not, write to the
# Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
# 02110-1301, USA.  Any Red Hat trademarks that are incorporated in the
# source code or documentation are not subject to the GNU General Public
# License and may only be used or replicated with the express permission of
# Red Hat, Inc.
#

from __future__ import absolute_import
from __future__ import unicode_literals

import logging
from itertools import chain

import hawkey

import dnf.exceptions
from dnf.cli import commands
from dnf.cli.option_parser import OptionParser
from dnf.i18n import _

logger = logging.getLogger('dnf')


class InstallCommand(commands.Command):
    """A class containing methods needed by the cli to execute the
    install command.
    """
    nevra_forms = {'install-n': hawkey.FORM_NAME,
                   'install-na': hawkey.FORM_NA,
                   'install-nevra': hawkey.FORM_NEVRA}
    alternatives_provide = 'alternative-for({})'

    aliases = ('install', 'localinstall', 'in') + tuple(nevra_forms.keys())
    summary = _('install a package or packages on your system')

    @staticmethod
    def set_argparser(parser):
        parser.add_argument('package', nargs='+', metavar=_('PACKAGE'),
                            action=OptionParser.ParseSpecGroupFileCallback,
                            help=_('Package to install'))

    def configure(self):
        """Verify that conditions are met so that this command can run.
        That there are enabled repositories with gpg keys, and that
        this command is called with appropriate arguments.
        """
        demands = self.cli.demands
        demands.sack_activation = True
        demands.available_repos = True
        demands.resolving = True
        demands.root_user = True
        commands._checkGPGKey(self.base, self.cli)
        if not self.opts.filenames:
            commands._checkEnabledRepo(self.base)

    def run(self):
        err_pkgs = []
        errs = []
        error_module_specs = []

        nevra_forms = self._get_nevra_forms_from_command()

        self.cli._populate_update_security_filter(self.opts)
        if self.opts.command == 'localinstall' and (self.opts.grp_specs or self.opts.pkg_specs):
            self._log_not_valid_rpm_file_paths(self.opts.grp_specs)
            if self.base.conf.strict:
                raise dnf.exceptions.Error(_('Nothing to do.'))
        skipped_grp_specs = []
        if self.opts.grp_specs and self.opts.command != 'localinstall':
            if dnf.base.WITH_MODULES:
                try:
                    module_base = dnf.module.module_base.ModuleBase(self.base)
                    module_base.install(self.opts.grp_specs, strict=self.base.conf.strict)
                except dnf.exceptions.MarkingErrors as e:
                    if e.no_match_group_specs:
                        for e_spec in e.no_match_group_specs:
                            skipped_grp_specs.append(e_spec)
                    if e.error_group_specs:
                        for e_spec in e.error_group_specs:
                            error_module_specs.append("@" + e_spec)
                    module_depsolv_errors = e.module_depsolv_errors
                    if module_depsolv_errors:
                        logger.error(dnf.module.module_base.format_modular_solver_errors(
                            module_depsolv_errors[0]))
            else:
                skipped_grp_specs = self.opts.grp_specs
        if self.opts.filenames and nevra_forms:
            self._inform_not_a_valid_combination(self.opts.filenames)
            if self.base.conf.strict:
                raise dnf.exceptions.Error(_('Nothing to do.'))
        else:
            err_pkgs = self._install_files()

        if skipped_grp_specs and nevra_forms:
            self._inform_not_a_valid_combination(skipped_grp_specs)
            if self.base.conf.strict:
                raise dnf.exceptions.Error(_('Nothing to do.'))
        elif skipped_grp_specs and self.opts.command != 'localinstall':
            self._install_groups(skipped_grp_specs)

        if self.opts.command != 'localinstall':
            errs = self._install_packages(nevra_forms)

        if (len(errs) != 0 or len(err_pkgs) != 0 or error_module_specs) and self.base.conf.strict:
            raise dnf.exceptions.PackagesNotAvailableError(_("Unable to find a match"),
                                                           pkg_spec=' '.join(errs),
                                                           packages=err_pkgs)

    def _get_nevra_forms_from_command(self):
        if self.opts.command in self.nevra_forms:
            return [self.nevra_forms[self.opts.command]]
        else:
            return []

    def _log_not_valid_rpm_file_paths(self, grp_specs):
        group_names = map(lambda g: '@' + g, grp_specs)
        for pkg in chain(self.opts.pkg_specs, group_names):
            msg = _('Not a valid rpm file path: %s')
            logger.info(msg, self.base.output.term.bold(pkg))

    def _inform_not_a_valid_combination(self, forms):
        for form in forms:
            msg = _('Not a valid form: %s')
            logger.warning(msg, self.base.output.term.bold(form))

    def _install_files(self):
        err_pkgs = []
        strict = self.base.conf.strict
        for pkg in self.base.add_remote_rpms(self.opts.filenames, strict=strict,
                                             progress=self.base.output.progress):
            try:
                self.base.package_install(pkg, strict=strict)
            except dnf.exceptions.MarkingError:
                msg = _('No match for argument: %s')
                logger.info(msg, self.base.output.term.bold(pkg.location))
                err_pkgs.append(pkg)

        return err_pkgs

    def _install_groups(self, grp_specs):
        try:
            self.base.env_group_install(grp_specs,
                                        tuple(self.base.conf.group_package_types),
                                        strict=self.base.conf.strict)
        except dnf.exceptions.Error:
            if self.base.conf.strict:
                raise

    def _report_alternatives(self, pkg_spec):
        query = self.base.sack.query().filterm(
            provides=self.alternatives_provide.format(pkg_spec))
        if query:
            msg = _('There are the following alternatives for "{0}": {1}')
            logger.info(msg.format(
                pkg_spec,
                ', '.join(sorted(set([alt.name for alt in query])))))

    def _install_packages(self, nevra_forms):
        errs = []
        strict = self.base.conf.strict
        for pkg_spec in self.opts.pkg_specs:
            try:
                self.base.install(pkg_spec, strict=strict, forms=nevra_forms)
            except dnf.exceptions.MarkingError as e:
                msg = '{}: {}'.format(e.value, self.base.output.term.bold(pkg_spec))
                logger.info(msg)
                self.base._report_icase_hint(pkg_spec)
                self._report_alternatives(pkg_spec)
                errs.append(pkg_spec)

        return errs
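
# Sketch of how the command aliases and nevra_forms defined above are typically
# invoked; the package strings are illustrative, not taken from this archive:
#
#   dnf install bash                              # ordinary spec matching
#   dnf install-n bash                            # spec parsed strictly as a name
#   dnf install-na bash.x86_64                    # name.arch form
#   dnf install-nevra bash-0:5.1.8-2.fc35.x86_64  # full NEVRA form
#   dnf localinstall ./hello-1.0-1.x86_64.rpm     # install from a local rpm file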
site-packages/dnf/cli/commands/__pycache__/mark.cpython-36.opt-1.pyc000064400000005545147511334650021154 0ustar003

�ft`�
�@spddlmZddlmZddlZddlmZddlmZddl	Z	ddl
Z
ddlZejd�Z
Gdd�dej�ZdS)	�)�print_function)�unicode_literalsN)�_)�commands�dnfc@sLeZdZdZed�Zedd��Zdd�Zdd�Z	d	d
�Z
dd�Zd
d�ZdS)�MarkCommand�markz7mark or unmark installed packages as installed by user.cCs6|jdddddgtd�d�|jdd	d
td�d�dS)
Nr�Zinstall�remove�groupzhinstall: mark as installed by user
remove: unmark as installed by user
group: mark as installed by group)�nargs�choices�help�package�+ZPACKAGEzPackage specification)r�metavarr)�add_argumentr)�parser�r�/usr/lib/python3.6/mark.py�
set_argparser)s
zMarkCommand.set_argparsercCs,|jjj|tjj�tjtd�t	|��dS)Nz%s marked as user installed.)
�base�history�
set_reason�libdnf�transactionZTransactionItemReason_USER�logger�infor�str)�self�pkgrrr�
_mark_install2szMarkCommand._mark_installcCs,|jjj|tjj�tjtd�t	|��dS)Nz%s unmarked as user installed.)
rrrrrZ TransactionItemReason_DEPENDENCYrrrr)rr rrr�_mark_remove6szMarkCommand._mark_removecCs,|jjj|tjj�tjtd�t	|��dS)Nz%s marked as group installed.)
rrrrrZTransactionItemReason_GROUPrrrr)rr rrr�_mark_group:szMarkCommand._mark_groupcCs$|jj}d|_d|_d|_d|_dS)NTF)�cli�demandsZsack_activationZ	root_userZavailable_reposZ	resolving)rr%rrr�	configure>s
zMarkCommand.configurec
Cs|jjd}|jj}tjt|d|��}g}xR|D]J}tjj|�}|j	|j
j�}x|D]}||�qVWt|�dkr2|j
|�q2W|r�tjtd��x|D]}tjtd�|�q�Wtjj�|j
jj�}|dkr�|jj�}	n|j}	|j
jj|	gg�|j
jj|	�dS)NrZ_mark_zError:zPackage %s is not installed.)Zoptsrr�	functools�partial�getattrrZsubjectZSubjectZget_best_queryrZsack�len�appendr�errorrr$ZCliErrorrZlastZ_rpmdb_versionZend_rpmdb_versionZbeg�end)
r�cmdZpkgsZ	mark_funcZnotfoundr Zsubj�q�oldZ
rpmdb_versionrrr�runEs,


zMarkCommand.runN)r)
�__name__�
__module__�__qualname__�aliasesrZsummary�staticmethodrr!r"r#r&r1rrrrr$s	r)Z
__future__rrZlibdnf.transactionrZdnf.i18nrZdnf.clirrr'ZloggingZ	getLoggerrZCommandrrrrr�<module>s
site-packages/dnf/cli/commands/__pycache__/clean.cpython-36.pyc000064400000010011147511334650020325 0ustar003

�ft`t�@s�ddlmZddlmZddlmZddlmZmZddlm	Z	ddlZ
ddlZ
ddlZ
ddl
Z
ddlZ
ddlZddlZddlZddlZejd�Zdd	d
gdgd	gd
gddd	gd�Zd
d�Zdd�Zdd�Zdd�ZGdd�dej�ZdS)�)�absolute_import)�unicode_literals)�commands)�_�P_)�miscN�dnf�metadata�dbcachezexpire-cache�packages)r	rr
zexpire-cache�allccsVxPtj|�D]B\}}}tjj||�}x(|D] }tjj||�}tjj|�Vq*WqWdS)z:Traverse dirpath recursively and yield relative filenames.N)�os�walk�path�relpath�join�normpath)�dirpath�root�dirs�files�base�fr�r�/usr/lib/python3.6/clean.py�_tree1s

rcs�fdd�|D�S)z5Yield those filenames that match any of the patterns.c3s(|] }�D]}tj||�r
|Vq
qdS)N)�re�match)�.0r�p)�patternsrr�	<genexpr><sz_filter.<locals>.<genexpr>r)rr r)r r�_filter:sr"cCsLd}xB|D]:}tjj||�}tjtjjtd�|�t	j
|�|d7}q
W|S)z(Remove the given filenames from dirpath.rzRemoving file %s�)r
rr�logger�logr�loggingZDDEBUGrrZunlink_f)rr�countrrrrr�_clean?s

r(cs0tjjd��fdd�|D�}tdd�|D��S)z:Return the repo IDs that have some cached metadata around.r	c3s|]}tj�|�VqdS)N)rr)rr)�metapatrrr!Msz _cached_repos.<locals>.<genexpr>css|]}|r|jd�VqdS)ZrepoidN)�group)r�mrrrr!Ns)r�repo�CACHE_FILES�set)rZmatchesr)r)r�
_cached_reposJsr/c@s0eZdZdZd	Zed�Zedd��Zdd�Z	dS)
�CleanCommandzSA class containing methods needed by the cli to execute the
    clean command.
    �cleanzremove cached datacCs|jddtj�td�d�dS)N�type�+zMetadata type to clean)�nargs�choices�help)�add_argument�_CACHE_TYPES�keysr)�parserrrr�
set_argparserYszCleanCommand.set_argparsercCsf|jjj}tjj|d�}tjj|d�}tjj|jjjd�}�x$y�|oJ|oJ|��t	dd�|j
jD��}tt
|��}tjtddj|���d|kr�t|�}|jjjj|�|jd�tjtd��dd	�|D�}t|t||��}	tjtd
d|	�|	�dSQRXWq>tjjk
�r\}
z:|jjj�sHtd�|
j}tj|�tj d
�n|
�WYdd}
~
Xq>Xq>WdS)NTcss |]}t|D]
}|VqqdS)N)r8)r�c�trrrr!gsz#CleanCommand.run.<locals>.<genexpr>zCleaning data: � zexpire-cachezCache was expiredcSsg|]}tjj|�qSr)rr,r-)rr=rrr�
<listcomp>qsz$CleanCommand.run.<locals>.<listcomp>z%d file removedz%d files removedz*Waiting for process with pid %d to finish.�)!rZconf�cachedirr�lockZbuild_metadata_lockZbuild_download_lockZbuild_rpmdb_lockZ
persistdirr.Zoptsr2�listrr$�debugrrr/Z_repo_persistorZexpired_to_add�update�remove�infor(r"r�
exceptionsZ	LockErrorZexit_on_lock�pid�timeZsleep)�selfrAZmd_lockZ
download_lockZ
rpmdb_lock�typesrZexpiredr r'�e�msgrrr�run_s2


zCleanCommand.runN)r1)
�__name__�
__module__�__qualname__�__doc__�aliasesrZsummary�staticmethodr;rOrrrrr0Qs
r0)Z
__future__rrZdnf.clirZdnf.i18nrrZdnf.yumrrZdnf.exceptionsZdnf.lockZdnf.loggingZdnf.repor&r
rrJZ	getLoggerr$r8rr"r(r/ZCommandr0rrrr�<module>s0
	site-packages/dnf/cli/commands/__pycache__/alias.cpython-36.pyc000064400000012322147511334650020343 0ustar003

�ft`��@s�ddlmZddlmZddlmZddlZddlZddlZddl	Zddlm
Z
ddlZddlZddl
mZejd�ZGdd	�d	e
j�ZdS)
�)�absolute_import)�print_function)�unicode_literalsN)�commands)�_�dnfc@sleZdZdZed�Zedd��Zdd�Zdd�Z	d	d
�Z
dd�Zd
d�Zdd�Z
dd�Zdd�Zdd�ZdS)�AliasCommand�aliaszList or create command aliasescCsl|j�}|jdddtd�d�|jdddtd�d�|jdd	d
dd
dgtd
�d�|jdddtd�d�dS)Nz--enable-resolvingF�
store_truezenable aliases resolving)�default�action�helpz--disable-resolvingzdisable aliases resolving�
subcommand�?�list�add�deletezaction to do with aliases)�nargsr�choicesr
r	�*zcommand[=result]zalias definition)r�metavarr
)Zadd_mutually_exclusive_group�add_argumentr)�parserZenable_group�r�/usr/lib/python3.6/alias.py�
set_argparser*s

zAliasCommand.set_argparsercCsH|jj}|jjdkrd|_tjjj�|_|jj	�|jj
|_|j�dS)NrrT)rr)
�cli�demands�optsrZ	root_userr�aliasesZAliases�aliases_baseZ
_load_aliases�enabledZresolving_enabled�_update_config_from_options)�selfrrrr�	configure9s

zAliasCommand.configurecCs�d}|jjrd}tjtd��|jjr8d}tjtd��|dk	r�tjjt	j
jj�sft
t	j
jjd�j�t	jjjt	j
jjddd|i�|jj�s�||j_dS)NTzAliases are now enabledFzAliases are now disabled�w�mainr!)rZenable_resolving�logger�inforZdisable_resolving�os�path�existsrrrZALIASES_CONF_PATH�open�close�confZ
BaseConfigZwrite_raw_configfiler Z_disabled_by_environr!)r#r!rrrr"Bs
z(AliasCommand._update_config_from_optionscCs�i}x�|jjD]�}|jdd�}|dj�}t|j��dkrLtjtd�|�q|jd�rhtjtd�|�qt|�dkr�tjtd�|�q|dj�||<qW|S)N�=�rzInvalid alias key: %s�-zAlias argument has no value: %s)	rr	�split�strip�lenr'�warningr�
startswith)r#Znew_aliasesr	�cmdrrr�_parse_option_aliasTs
z AliasCommand._parse_option_aliascCsxtjjtjjj�s&ttjjjd�j�ytjjj	tjjj�}Wn4tj
jk
rr}ztj
td�|�dSd}~XnX|S)Nr%zConfig error: %s)r)r*r+rrr�ALIASES_USER_PATHr,r-Z
AliasesConfig�
exceptionsZConfigErrorr'r5r)r#r.�errr�_load_user_aliaseseszAliasCommand._load_user_aliasescCsdttjjjd�}d}|dj|�7}|d7}x*|j�D]\}}|dj|dj|��7}q4W|j|�dS)Nr%z[main]
zenabled = {}

z
[aliases]
z{} = {}
� )	r,rrrr9�format�items�join�write)r#�user_aliasesr!Zfileobj�output�key�valuerrr�_store_user_aliasespsz AliasCommand._store_user_aliasescCsP|j�}|j}|dkrdS|j|�|j||j�tjtd�dj|j	���dS)NzAliases added: %sz, )
r<r�updaterFr!r'r(rr@�keys)r#rr.rBrrr�add_aliasesys
zAliasCommand.add_aliasescCs�|j�}|j}|dkrdSg}xF|D]>}y||=|j|�Wq$tk
r`tjtd�|�Yq$Xq$W|j||j�tjtd�dj	|��dS)NzAlias not found: %szAliases deleted: %sz, )
r<r�append�KeyErrorr'r(rrFr!r@)r#�cmdsr.rBZ
valid_cmdsr7rrr�remove_aliases�s
zAliasCommand.remove_aliasescCs~|g}y|jj|�}WnHtjjk
r^}z(tjtd�||dj|jj	|��WYdd}~XnXt
td�|dj|�f�dS)Nz%s, alias %s="%s"r=z
Alias %s='%s')r Z_resolverr:�Errorr'�errorrr@r�print)r#r7�argsr;rrr�
list_alias�s0zAliasCommand.list_aliascCs|jjstjtd��|jjdkrL|j�}|s>tj	j
td���|j|�dS|jjdkr�|jj}|gkrxtj	j
td���|j
|�dS|jjs�|jjs�tjtd��dSxX|jjD]}|j|�q�Wn<x:|jjD].}||jjkr�tjtd�|�q�|j|�q�WdS)NzAliases resolving is disabled.rzNo aliases specified.rzNo alias specified.zNo aliases defined.zNo match for alias: %s)r r!r'r5rrrr8rr:rNrIr	rMrr(rR)r#rrLr7rrr�run�s2

zAliasCommand.runN)r	)�__name__�
__module__�__qualname__rrZsummary�staticmethodrr$r"r8r<rFrIrMrRrSrrrrr&s		
r)Z
__future__rrrZloggingZos.pathr)Zdnf.clirZdnf.cli.aliasesrZdnf.confZdnf.exceptionsZdnf.i18nrZ	getLoggerr'ZCommandrrrrr�<module>s
site-packages/dnf/cli/commands/__pycache__/history.cpython-36.opt-1.pyc000064400000026342147511334650021721 0ustar003

i�-e%F�@s�ddlmZddlmZddlmZddlZddlZddlmZmZddl	m
Z
ddlmZm
Z
ddl	ZddlZddlZddlZddlZddlZddlZejd�ZGd	d
�d
e
j�ZdS)�)�absolute_import)�print_function)�unicode_literalsN)�_�ucd)�commands)�TransactionReplay�serialize_transaction�dnfcs�eZdZdZd+Zed�Zddddd	d
ddgZ�fd
d�Ze	dd��Z
dd�Zdd�Zdd�Z
dd�Zdd�Zdd�Zdd�Zdd �Zd!d"�Zd#d$�Zd%d&�Zd'd(�Zd)d*�Z�ZS),�HistoryCommandzUA class containing methods needed by the cli to execute the
    history command.
    �history�histz(display, or use, the transaction history�list�info�redo�replay�rollback�store�undo�
userinstalledcstt|�j||�d|_dS)NF)�superr�__init__�_require_one_transaction_id)�self�args�kw)�	__class__��/usr/lib/python3.6/history.pyr4szHistoryCommand.__init__c
Cs�|jddddjtjddjtjdd���d�|jd	d
dd�|jd
ddtd�d�|jdd
td�d�|jdd
td�d�|jdd
td�d�|jddddd�|jddddd�dS)N�transactions_action�?ZCOMMANDz$Available commands: {} (default), {}rz, �)�nargs�metavar�helpz	--reverse�
store_truez$display history list output reversed)�actionr$z-oz--outputz<For the store command, file path to store the transaction to)�defaultr$z--ignore-installedzXFor the replay command, don't check for installed packages matching those in transactionz--ignore-extraszRFor the replay command, don't check for extra packages pulled into the transactionz--skip-unavailablezYFor the replay command, skip packages that are not available or have missing dependencies�transactions�*ZTRANSACTIONz�For commands working with history transactions, Transaction ID (<number>, 'last' or 'last-<number>' for one transaction, <transaction-id>..<transaction-id> for a range)�transaction_filenameZTRANSACTION_FILEzEFor the replay command, path to the stored transaction file to replay)�add_argument�formatr�_CMDS�joinr)�parserrrr�
set_argparser9s$



zHistoryCommand.set_argparsercCs.|jjs|jd|j_n0|jj|jkrH|jjjd|jj�|jd|j_td�j|jj�|_|jj	}|jjdk�r|jjs�t
jjtd���t|jj�dkr�t
jjtd���t
jj|jjd�|j_g|j_d|_d|_d|_d|jj_d|jj_t
jjj|j|j�n�|jjd	k�r6d|_|jj�s�t
jjtd
���n�|jjdk�r�d|_d|_d|_d|_|jj�s�td
�}tj|�t
jj|��n,t|jj�dk�r�tj|j�t
jj|j��d|_t
jjj|j|j�nd|_d|_|jjjdk�r*t
j |jjjt
j!��r*td|jjj�}tj|�t
jj|��dS)NrzUFound more than one transaction ID.
'{}' requires one transaction ID or package name.rzNo transaction file name given.r!z6More than one argument given as transaction file name.TFrz(No transaction ID or package name given.rrrz:memory:z+You don't have access to the history DB: %s)rrr)"�optsrr-r(�insertrr,�_require_one_transaction_id_msg�cli�demandsr
�CliError�len�os�path�abspathr*Zavailable_reposZ	resolvingZ	root_user�base�confZclean_requirements_on_removeZinstall_weak_depsrZ_checkGPGKeyr�logger�criticalZfresh_metadataZsack_activationr�access�R_OK)rr5�msgrrr�	configureUsZ




(
zHistoryCommand.configurecCs�t|tjj�rv|jjdkr2|jj\}td�|fS|jjdkrv|jjddkrV|jjn|jjdd�\}td�|fStjj	j
j||�S)	z.Get suggestions for resolving the given error.rzVCannot undo transaction %s, doing so would result in an inconsistent package database.rr�forcer!NzZCannot rollback transaction %s, doing so would result in an inconsistent package database.)�
isinstancer
�
exceptionsZTransactionCheckErrorr1rr(rr4r�Command�get_error_output)r�errorZid_rrrrG�s
zHistoryCommand.get_error_outputcCs:|j|�}t|�}t|j|dd|jjd�|_|jj�dS)NT)�data�ignore_installed�
ignore_extras�skip_unavailable)�_history_get_transactionr	rr;r1rLr�run)r�extcmds�oldrIrrr�
_hcmd_redo�s
zHistoryCommand._hcmd_redocCsD|stjjtd���|jjj|�}|s@tjjtd�j|d���|S)NzNo transaction ID givenzTransaction ID "{0}" not found.r)r
r4r6rr;rrPr,)rrOrPrrr�_history_get_transactions�sz(HistoryCommand._history_get_transactionscCs.|j|�}t|�dkr&tjjtd���|dS)Nr!z#Found more than one transaction ID!r)rRr7r
r4r6r)rrOrPrrrrM�s
z'HistoryCommand._history_get_transactioncCs|j|�}|j|�dS)N)rM�_revert_transaction)rrOrPrrr�
_hcmd_undo�s
zHistoryCommand._hcmd_undocCs�|j|�}|jjj�}d}|j|jkr�x�|jjjtt|jd|jd���D]X}|jrjt	j
td�|j�n|jr�t	j
td�|j�|dkr�t
jjj|�}qL|j|�qLW|j|�dS)Nr!z-Transaction history is incomplete, before %u.z,Transaction history is incomplete, after %u.)rMr;r�last�tidrPr�rangeZaltered_lt_rpmdbr=ZwarningrZaltered_gt_rpmdbr
ZdbZMergedTransactionWrapper�mergerS)rrOrPrUZmerged_trans�transrrr�_hcmd_rollback�s
*zHistoryCommand._hcmd_rollbackc	Cs&dddddddddd	d
d�}t|�}x�dD]�}x�|j|g�D]�}||d|d<|ddkrt|jdd�dkrtd|d<|dd
kr�d|kr�tj|d�}|jtjgd�d}|jjjj	|j
|j|j�dd�}t
jj|�|d<|jd�tjkr<d|d<q<Wq*Wt|j|dd|jjd�|_|jj�dS)N�Removed�Install�
Downgraded�	Downgrade�Upgraded�Upgrade�	Reinstall�Reinstalled�	Obsoletedz
Reason Change)r\r[r`r_r^r]rbrarcZObsoletez
Reason Change�rpms�groups�environmentsr&�reasonZcleanZ
dependency�nevra)Zformsrr!Zrepo_idT)rIrJrKrL)rdrerf)r	�get�hawkeyZSubjectZget_nevra_possibilitiesZ
FORM_NEVRA�outputrZswdbZresolveRPMTransactionItemReason�nameZarch�tids�libdnfZtransactionZTransactionItemReasonToStringZSYSTEM_REPO_NAMErr;r1rLrrN)	rrYZ
action_maprIZcontent_typeZtiZsubjrhrgrrrrS�sD

z"HistoryCommand._revert_transactioncCs:t|jj��}|jj|dd�}|dkr6tjjtd���dS)z&Execute history userinstalled command.zPackages installed by userrhrzNo packages to listN)	�tupler;Ziter_userinstalledrkZlistPkgsr
r4r6r)rZpkgsZn_listedrrr�_hcmd_userinstalledsz"HistoryCommand._hcmd_userinstalledc
s��fdd�}t�}t�}�xĈjjD�]�}d|k�r\y|jdd�\}}Wn0tk
rxtjtd�j|��t	j
j�YnXtd�}y||�}Wn0tk
r�tjt|�j|��t	j
j�YnXy||�}Wn0tk
r�tjt|�j|��t	j
j�YnX�j�r$||k�r$tj�j
�t	j
j�||k�r8||}}|j||f�|jt||d��q$y|j||��Wq$tk
�r��jjj|g�}|�r�|j|�n4td�j|�}	�j�r�tj|	�t	j
j�n
tj|	�Yq$Xq$Wt|d	d
�|fS)z0Convert commandline arguments to transaction idscsJ|dkrd}n|jd�r$|dd�}t|�}|dkrF|�jjj�j7}|S)NrU�0zlast-�r)�
startswith�intrkrrUrV)�sZtransaction_id)rrr�str2transaction_ids
z@HistoryCommand._args2transaction_ids.<locals>.str2transaction_idz..�zWInvalid transaction ID range definition '{}'.
Use '<transaction-id>..<transaction-id>'.zNCan't convert '{}' to transaction ID.
Use '<number>', 'last', 'last-<number>'.r!z8No transaction which manipulates package '{}' was found.T)�reverse)�setr1r(�split�
ValueErrorr=r>rr,r
r4r6rr3�add�updaterWrkr�searchr�sorted)
rrvrm�merged_tids�tZbegin_transaction_idZend_transaction_idZcant_convert_msgZtransact_ids_from_pkgnamerAr)rr�_args2transaction_ids
sV





z$HistoryCommand._args2transaction_idsc
Cs@|jj}|dkrDt|j|jj|jj|jj|jjd�|_|jj	��n�|j
�\}}|dkr~|sf|jjr~|jj
||jjd��n�|dkr�|s�|jjr�|jj||jj|��n�|dkr�|j|��nz|dkr�|j|��nd|dkr�|j|��nN|d	k�r|j��n8|d
k�r<|j|�}t|�}y�|jjdk	�r8|jjnd}|jjj�sV|jjj�r�tjj|��r�td�j|�}|jjj�s�|jjjd
j|�dj|�d��r�ttd�j|��dSt |d��"}t!j"||ddd�|j#d�WdQRXttd�j|��Wn>t$k
�r:}	z t%j&j'td�jt(|	����WYdd}	~	XnXdS)Nr)�filenamerJrKrLr)rxrrrrrrztransaction.jsonz{} exists, overwrite?z
{} [y/N]: z
{} [Y/n]: )rAZdefaultyes_msgzNot overwriting {}, exiting.�wrrT)�indentZ	sort_keys�
zTransaction saved to {}.zError storing transaction: {}))r1rrr;r*rJrKrLrrNr�r(rkZhistoryListCmdrxZhistoryInfoCmdrTrQrZrprMr	r<ZassumenoZ	assumeyesr8r9�isfilerr,Zuserconfirm�print�open�json�dump�write�OSErrorr
r4r6�str)
rZvcmdrmr�rVrIr�rA�f�errrrNMsN


(zHistoryCommand.runcCs|jjdkrdS|jj�dS)Nrrrr)rrrr)r1rrZpost_transaction)rrrr�run_resolvedszHistoryCommand.run_resolvedcCsX|jjdkrdS|jj�}|rTtjtjjt	d��x |D]}tjtjjd|�q8WdS)NrrrrzEWarning, the following problems occurred while running a transaction:z  )rrrr)
r1rrZget_warningsr=�logr
�loggingZWARNINGr)r�warningsr�rrr�run_transaction�s


zHistoryCommand.run_transaction)rr
)�__name__�
__module__�__qualname__�__doc__�aliasesrZsummaryr-r�staticmethodr0rBrGrQrRrMrTrZrSrpr�rNr�r��
__classcell__rr)rrr*s&=	0@2r)Z
__future__rrrrnrjZdnf.i18nrrZdnf.clirZdnf.transaction_srrr	r
Zdnf.exceptionsZdnf.transactionZdnf.utilr�r�r8Z	getLoggerr=rFrrrrr�<module>s 
site-packages/dnf/cli/commands/__pycache__/mark.cpython-36.pyc000064400000005545147511334650020215 0ustar003

�ft`�
�@spddlmZddlmZddlZddlmZddlmZddl	Z	ddl
Z
ddlZejd�Z
Gdd�dej�ZdS)	�)�print_function)�unicode_literalsN)�_)�commands�dnfc@sLeZdZdZed�Zedd��Zdd�Zdd�Z	d	d
�Z
dd�Zd
d�ZdS)�MarkCommand�markz7mark or unmark installed packages as installed by user.cCs6|jdddddgtd�d�|jdd	d
td�d�dS)
Nr�Zinstall�remove�groupzhinstall: mark as installed by user
remove: unmark as installed by user
group: mark as installed by group)�nargs�choices�help�package�+ZPACKAGEzPackage specification)r�metavarr)�add_argumentr)�parser�r�/usr/lib/python3.6/mark.py�
set_argparser)s
zMarkCommand.set_argparsercCs,|jjj|tjj�tjtd�t	|��dS)Nz%s marked as user installed.)
�base�history�
set_reason�libdnf�transactionZTransactionItemReason_USER�logger�infor�str)�self�pkgrrr�
_mark_install2szMarkCommand._mark_installcCs,|jjj|tjj�tjtd�t	|��dS)Nz%s unmarked as user installed.)
rrrrrZ TransactionItemReason_DEPENDENCYrrrr)rr rrr�_mark_remove6szMarkCommand._mark_removecCs,|jjj|tjj�tjtd�t	|��dS)Nz%s marked as group installed.)
rrrrrZTransactionItemReason_GROUPrrrr)rr rrr�_mark_group:szMarkCommand._mark_groupcCs$|jj}d|_d|_d|_d|_dS)NTF)�cli�demandsZsack_activationZ	root_userZavailable_reposZ	resolving)rr%rrr�	configure>s
zMarkCommand.configurec
Cs|jjd}|jj}tjt|d|��}g}xR|D]J}tjj|�}|j	|j
j�}x|D]}||�qVWt|�dkr2|j
|�q2W|r�tjtd��x|D]}tjtd�|�q�Wtjj�|j
jj�}|dkr�|jj�}	n|j}	|j
jj|	gg�|j
jj|	�dS)NrZ_mark_zError:zPackage %s is not installed.)Zoptsrr�	functools�partial�getattrrZsubjectZSubjectZget_best_queryrZsack�len�appendr�errorrr$ZCliErrorrZlastZ_rpmdb_versionZend_rpmdb_versionZbeg�end)
r�cmdZpkgsZ	mark_funcZnotfoundr Zsubj�q�oldZ
rpmdb_versionrrr�runEs,


zMarkCommand.runN)r)
�__name__�
__module__�__qualname__�aliasesrZsummary�staticmethodrr!r"r#r&r1rrrrr$s	r)Z
__future__rrZlibdnf.transactionrZdnf.i18nrZdnf.clirrr'ZloggingZ	getLoggerrZCommandrrrrr�<module>s
site-packages/dnf/cli/commands/__pycache__/remove.cpython-36.opt-1.pyc000064400000007521147511334650021513 0ustar003

�ft`��@s�ddlmZddlmZddlmZddlmZddlmZddl	Z
ddlZddlZddl
Z
ddlZejd�ZGdd	�d	ej�ZdS)
�)�absolute_import)�unicode_literals)�commands)�_)�OptionParserN�dnfc@sbeZdZdZejejejejejejd�Zde	ej
��Zed�Z
edd��Zd	d
�Zdd�Zd
S)�
RemoveCommandzRemove command.)zremove-nz	remove-nazremove-nevrazerase-nzerase-nazerase-nevra�remove�erase�rmz-remove a package or packages from your systemcCsf|j�}|jdddtd�d�|jddtjd�|jddtd	�d�|jd
dtd�tjtd
�d�dS)Nz--duplicates�
store_true�
duplicatedzremove duplicated packages)�action�dest�helpz--duplicated)rrz--oldinstallonlyz*remove installonly packages over the limitZpackages�*zPackage to removeZPACKAGE)�nargsrr�metavar)Zadd_mutually_exclusive_group�add_argumentr�argparseZSUPPRESSrZParseSpecGroupFileCallback)�parserZmgroup�r�/usr/lib/python3.6/remove.py�
set_argparser0s

zRemoveCommand.set_argparsercCs^|jj}d|_d|_d|_|jjr*d|_n0tj	j
rN|jjrNd|_d|_d|_
nd|_
d|_dS)NTF)Zcli�demandsZ	resolvingZ	root_userZsack_activation�optsr
Zavailable_reposr�base�WITH_MODULES�	grp_specsZfresh_metadataZ
allow_erasing)�selfrrrr�	configure?szRemoveCommand.configurecCs\g}|jj|jkr"|j|jjg}|jj|jj7_d}|jj�rD|jjj�}|jj	|j
��}|j�j|�}|s�tj
jtd���x�|j�j�D]�\\}}}t|�dkr�q�|jdd�y|jjt|d��WnHtj
jk
�rd}	td�}
tj|
|jjjjt|d��|	�YnXx"|d	d�D]}|jj|��q&Wq�WdS|jj�r�|jjj�}|jj	|j
��jd�}|jjj�}|dk	�r�|j |j!|j"|j#d
�}
|
�r�|j|
�}|�r�x,|D]}|jj|��q�Wntj
jtd���dS|jj$�r*|�r*x�|jj$D]&}td�}
tj|
|jjjj|���q�Wn�|jj$�r�tjj%�rxtj&j'j(|j�}|j)|jj$�}t|jj$�t|�k�r�d}n|jj$}|�r�xB|D]:}y|jj*|g��r�d}Wntj
jk
�r�YnX�q�Wxx|jjD]l}y|jj)||d
�WnLtj
j+k
�r8}z*dj,|j-|jjjj|��}
tj.|
�WYdd}~XnXd}�q�W|�sXtjtd��dS)NFz)No duplicated packages found for removal.�T)�reverser�z%Installed package %s%s not available.�)�epoch�version�releasez.No old installonly packages found for removal.zNot a valid form: %s)�formsz{}: {}zNo packages marked for removal.���)/rZcommand�nevra_formsZ	pkg_specs�	filenamesr
rZsackZqueryZ_get_installonly_queryZ	installed�
differencer�
exceptions�ErrorrZ_na_dict�items�len�sortZ	reinstall�strZPackagesNotAvailableError�loggerZwarning�outputZtermZboldZpackage_removeZoldinstallonlyZlatestZget_running_kernel�filterr%r&r'rr�module�module_baseZ
ModuleBaser	Zenv_group_removeZMarkingError�format�value�info)rr(�done�qZinstonlyZdups�nameZarchZ	pkgs_listZxmsg�msgZpkgZkernelZrunning_installonlyZgrp_specr7Zskipped_grps�groupZpkg_spec�errr�runPs�
(




 



zRemoveCommand.runN)r	r
r)�__name__�
__module__�__qualname__�__doc__�hawkeyZ	FORM_NAMEZFORM_NAZ
FORM_NEVRAr*�tuple�keys�aliasesrZsummary�staticmethodrr rArrrrr#s
r)Z
__future__rrZdnf.clirZdnf.i18nrZdnf.cli.option_parserrZdnf.baserrrFZdnf.exceptionsZloggingZ	getLoggerr3ZCommandrrrrr�<module>s
site-packages/dnf/cli/commands/__pycache__/group.cpython-36.pyc000064400000024356147511334650020420 0ustar003

i�-e�:�@s�ddlmZddlmZddlmZddlmZddlmZm	Z	ddl
ZddlZddl
ZddlZddlZejd�ZGdd	�d	ej�ZdS)
�)�absolute_import)�unicode_literals)�
CompsQuery)�commands)�_�ucdN�dnfcs�eZdZdZddddddd�Zd-eej��Zed
�Z	ddd�Z
d.Zd/Zdd�Z
�fdd�Zdd�Zdd�Zdd�Zdd�Zdd�Zdd�Zdd �Zd!d"�Zd#d$�Zed%d&��Zd'd(�Zd)d*�Zd+d,�Z�ZS)0�GroupCommandz; Single sub-command interface for most groups interaction. �list�install�remove�info)Z	grouplistZgroupinstallZgroupupdateZgroupremoveZ
grouperaseZ	groupinfo�group�groups�grpz'display, or use, the groups information�upgrade)�updateZerase�summary�markcCsn|jj|jj�}|r<|jjdk	r4|jjjd|jj�||j_|jjdkrPd|j_|jj|jj|jj�|j_dS)Nrr)�direct_commands�get�optsZcommand�subcmd�args�insert�_CMD_ALIASES)�selfZdirect�r�/usr/lib/python3.6/group.py�
_canonical6szGroupCommand._canonicalcstt|�j|�d|_dS)NF)�superr	�__init__�_remark)r�cli)�	__class__rrr!CszGroupCommand.__init__cCs$td�}t|jj�s tjj|��dS)Nz4No group data available for configured repositories.)r�len�base�compsr�
exceptionsZ
CompsError)r�msgrrr�
_assert_compsGszGroupCommand._assert_compscsT�fdd�}�j�|dkr(�jjj}n�jjjdj|��}tjjt	tjj
||��S)Ncs�jjjj|j�}|S)N)r&�history�envr�id)r,Z	env_found)rrr�available_predMsz7GroupCommand._environment_lists.<locals>.available_pred�,)r*r&r'�environments�environments_by_pattern�joinr�utilZmapallr
�	partition)r�patternsr.�envsr)rr�_environment_listsLszGroupCommand._environment_listsc	s��fdd�}g}g}�j�|dkr0�jjj}n�jjjdj|��}x2|D]*}|}||�r^|}|sj|jrJ|j|�qJW||fS)Ncs�jjjj|j�}|rdSdS)NTF)r&r+rrr-)rZgroup_found)rrr�installed_predZsz1GroupCommand._group_lists.<locals>.installed_predr/)r*r&r'r�groups_by_patternr2�uservisible�append)	rr:r5r8�	installed�	availableZgrpsrZtgt_listr)rr�_group_listsYs
zGroupCommand._group_listscCs~xt|D]l}d}x&|jjj|�D]}|jj|�d}qWx&|jjj|�D]}|jj|�d}qFW|stjt	d�|�qWdgfS)NFTz!Warning: Group %s does not exist.r)
r&r'r1�outputZdisplay_groups_in_environmentr9Zdisplay_pkgs_in_groups�logger�errorr)r�userlistZstrngZ
group_matchedr,rrrr�_infoqs
zGroupCommand._infocs�d}d}d}|jjjp|jj�xz|r�|ddkr@d}|jd�q |ddkr\d}|jd�q |ddkrxd}|jd�q |ddkr�d�|jd�q Pq W|jjr�d}|jjr�d}|jjr�d}|s�d}d}|dk	�r@x\|D]T}|jj	}t
|j|��dk}t
|j|��dk}	|r�|	r�t
jtd	�d
|�d}q�W|�r@dgfS|j|�\}
}|j||�\}}
��fdd�}�fd
d�}|�s�|td�|�|�s�|td�|
�|�s
d�x,|D]$}|j�r��q�|td�|�d��q�Wd�x,|D]$}|j�s�q�|td�|�d��q�W|�rdgfSd�x,|
D]$}|j�r2�q"|td�|�d��q"Wd�x,|
D]$}|j�sd�qT|td�|�d��qTWdgfS)N�r�hiddenr<r=�idsTFzWarning: No groups match:z
   %scs`�st|�d|jdk	r|jntd�}�r:|d|j7}|jrN|d|j7}tdj|��dS)Nz   %sz<name-unset>z (%s)z [%s]z{})�print�ui_namerr-�	lang_only�format)�sectrr))�done�	print_idsrr�_out_grp�sz$GroupCommand._list.<locals>._out_grpcsT|rt|�xB|D]:}d|jdk	r(|jntd�}�rD|d|j7}t|�qWdS)Nz   %sz<name-unset>z (%s))rGrHrr-)rKr6�er))rMrr�_out_env�s
z$GroupCommand._list.<locals>._out_envzAvailable Environment Groups:zInstalled Environment Groups:zInstalled Groups:zInstalled Language Groups:zAvailable Groups:zAvailable Language Groups:)r&�conf�verboserrF�poprEr<r=r'r%r9r1r@rArr7r>rI)rrBr:Z
showinstalledZ
showavailableZerrsrr'Zin_groupZin_environmentZenv_instZ	env_availr<r=rNrPr)rLrMr�_list�s�


	







zGroupCommand._listc	Cs�t|jj|jjtjtjBtjtjB�}|jj�}|j	|�}|j
jrXt|jj
jdg�}nt|jj
j�}tjj|�}x|jD]}|j||�qzWx|jD]}|j||�q�WdS)N�optional)rr&r'r+�GROUPS�ENVIRONMENTSZ	AVAILABLE�	INSTALLED�_build_comps_solverrr�
with_optional�tuplerQ�group_package_types�libdnfZtransactionZlistToCompsPackageTyper0Z_environment_installrZ_group_install)	rr5�q�solver�res�typesZ	pkg_types�env_idZgroup_idrrr�
_mark_install�s


zGroupCommand._mark_installcCs�t|jj|jjtjtjBtj�}|jj�}|j|�}x(|j	D]}t
jj|�sPt
�|j|�q<Wx(|jD]}t
jj|�szt
�|j|�qfWdS)N)rr&r'r+rVrWrXrYrr0rr3Zis_string_type�AssertionErrorZ_environment_removerZ
_group_remove)rr5r^r_r`rbZgrp_idrrr�_mark_remove�s


zGroupCommand._mark_removecCs*|d|jkr"|d|dd�fSd|fS)NrrDr)�
_MARK_CMDS)r�extcmdsrrr�_mark_subcmdszGroupCommand._mark_subcmdcCs d}t|�dkr*|ddkr*d}|jd�|jjr6d}|s>d}|j||�\}}dd�}d}x|D]}|jrlq`|d7}q`W|td�|�d}x|D]}|js�q�|d7}q�W|td�|�d}x|D]}|jr�q�|d7}q�W|td	�|�d}x|D]}|j�s�q�|d7}q�W|td
�|�dgfS)NrDrrEcSs|sdStjd||�dS)Nz%s %u)r@r
)rKZnumrrrrNsz'GroupCommand._summary.<locals>._out_grpzInstalled Groups:zInstalled Language Groups:FzAvailable Groups:zAvailable Language Groups:)r%rSrrEr>rIr)rrBr:r<r=rNrLrrrr�_summary
sH




zGroupCommand._summaryc
Cs�|jddtd�d�|j�}|jddtd�d�|jddtd�d�|jd	dtd
�d�|jddtd�d�|jd
ddtd�jtjddjtjdd���d�|jdddtd�d�dS)Nz--with-optional�
store_truez$include optional packages from group)�action�helpz--hiddenzshow also hidden groupsz--installedzshow only installed groupsz--availablezshow only available groupsz--idszshow also ID of groupsr�?ZCOMMANDz'available subcommands: {} (default), {}rz, rD)�nargs�metavarrlr�*ZCOMMAND_ARGzargument for group subcommand)�add_argumentrZadd_mutually_exclusive_grouprJr	�_GROUP_SUBCOMMANDSr2)�parserZ	grpparserrrr�
set_argparser<s"

zGroupCommand.set_argparsercCs�|j�|jj}|jj}||jkrBtjtd�dj|j��t	j
j�|d
krf|rf|j
jj
|�t	j
j�|j
j}d|_|dkr�d|_d|_|dkr�d|_d	|_nd|_|dkr�tj|j�|dkr�tj|j|j
�dS)
Nz$Invalid groups sub-command, use: %s.z, rrrr
TrF)rrrr
)rrrr)rr)rrrrrrr@Zcriticalrr2rr#ZCliErrorZ	optparserZ
print_help�demandsZsack_activationZ	root_userZ	resolvingZ
allow_erasingZavailable_reposrZ_checkEnabledRepor&Z_checkGPGKey)r�cmdrrurrr�	configurePs.

zGroupCommand.configurecCs�|jj}|jj}|dkr"|j|�S|dkr4|j|�S|dkrF|j|�S|dkr�|j|�\}}|dkrn|j|�S|dkszt�|j	|�S|dk�r0|jj
r�t|jj
jdg�}nt|jj
j�}d|_y|jj|||jj
j�Stjjk
�r.}z6td	�}tj||jjjj|��tjjtd
���WYdd}~XnX|dk�rF|jj|�S|dk�r�x<|D]4}y|jj|g�Wntjjk
�r�YnX�qVWdS)Nrr
r
rrrrUTzNo package %s available.z)Unable to find a mandatory group package.r)rrrrirTrCrhrerdrcrZr[r&rQr\r"Zenv_group_install�strictrr(ZMarkingErrorrr@r
r?ZtermZboldZPackagesNotAvailableErrorZenv_group_upgradeZenv_group_remove�Error)rrvrgrrarOr)�argrrr�runosF









zGroupCommand.runcCsf|js
dS|jj}|jj}|j}x@|jjj�j�j|d�D]$}|j	j
|�}|j||j||��q:WdS)N)�name)
r"r&Z_goalr+Z
group_membersZsackZqueryr<ZfiltermZrpmZ
get_reasonZ
set_reasonZgroup_reason)rZgoalr+�namesZpkg�reasonrrr�run_transaction�szGroupCommand.run_transaction)rrr)rr)rr
r
rrrr)�__name__�
__module__�__qualname__�__doc__rr[�keys�aliasesrrrrfrrrr!r*r7r>rCrTrcrerhri�staticmethodrtrwr{r�
__classcell__rr)r$rr	$s8

h
/*r	)Z
__future__rrZ	dnf.compsrZdnf.clirZdnf.i18nrrZlibdnf.transactionr]rZdnf.exceptionsZdnf.utilZloggingZ	getLoggerr@ZCommandr	rrrr�<module>s
site-packages/dnf/cli/commands/__pycache__/clean.cpython-36.opt-1.pyc000064400000010011147511334650021264 0ustar003

�ft`t�@s�ddlmZddlmZddlmZddlmZmZddlm	Z	ddlZ
ddlZ
ddlZ
ddl
Z
ddlZ
ddlZddlZddlZddlZejd�Zdd	d
gdgd	gd
gddd	gd�Zd
d�Zdd�Zdd�Zdd�ZGdd�dej�ZdS)�)�absolute_import)�unicode_literals)�commands)�_�P_)�miscN�dnf�metadata�dbcachezexpire-cache�packages)r	rr
zexpire-cache�allccsVxPtj|�D]B\}}}tjj||�}x(|D] }tjj||�}tjj|�Vq*WqWdS)z:Traverse dirpath recursively and yield relative filenames.N)�os�walk�path�relpath�join�normpath)�dirpath�root�dirs�files�base�fr�r�/usr/lib/python3.6/clean.py�_tree1s

rcs�fdd�|D�S)z5Yield those filenames that match any of the patterns.c3s(|] }�D]}tj||�r
|Vq
qdS)N)�re�match)�.0r�p)�patternsrr�	<genexpr><sz_filter.<locals>.<genexpr>r)rr r)r r�_filter:sr"cCsLd}xB|D]:}tjj||�}tjtjjtd�|�t	j
|�|d7}q
W|S)z(Remove the given filenames from dirpath.rzRemoving file %s�)r
rr�logger�logr�loggingZDDEBUGrrZunlink_f)rr�countrrrrr�_clean?s

r(cs0tjjd��fdd�|D�}tdd�|D��S)z:Return the repo IDs that have some cached metadata around.r	c3s|]}tj�|�VqdS)N)rr)rr)�metapatrrr!Msz _cached_repos.<locals>.<genexpr>css|]}|r|jd�VqdS)ZrepoidN)�group)r�mrrrr!Ns)r�repo�CACHE_FILES�set)rZmatchesr)r)r�
_cached_reposJsr/c@s0eZdZdZd	Zed�Zedd��Zdd�Z	dS)
�CleanCommandzSA class containing methods needed by the cli to execute the
    clean command.
    �cleanzremove cached datacCs|jddtj�td�d�dS)N�type�+zMetadata type to clean)�nargs�choices�help)�add_argument�_CACHE_TYPES�keysr)�parserrrr�
set_argparserYszCleanCommand.set_argparsercCsf|jjj}tjj|d�}tjj|d�}tjj|jjjd�}�x$y�|oJ|oJ|��t	dd�|j
jD��}tt
|��}tjtddj|���d|kr�t|�}|jjjj|�|jd�tjtd��dd	�|D�}t|t||��}	tjtd
d|	�|	�dSQRXWq>tjjk
�r\}
z:|jjj�sHtd�|
j}tj|�tj d
�n|
�WYdd}
~
Xq>Xq>WdS)NTcss |]}t|D]
}|VqqdS)N)r8)r�c�trrrr!gsz#CleanCommand.run.<locals>.<genexpr>zCleaning data: � zexpire-cachezCache was expiredcSsg|]}tjj|�qSr)rr,r-)rr=rrr�
<listcomp>qsz$CleanCommand.run.<locals>.<listcomp>z%d file removedz%d files removedz*Waiting for process with pid %d to finish.�)!rZconf�cachedirr�lockZbuild_metadata_lockZbuild_download_lockZbuild_rpmdb_lockZ
persistdirr.Zoptsr2�listrr$�debugrrr/Z_repo_persistorZexpired_to_add�update�remove�infor(r"r�
exceptionsZ	LockErrorZexit_on_lock�pid�timeZsleep)�selfrAZmd_lockZ
download_lockZ
rpmdb_lock�typesrZexpiredr r'�e�msgrrr�run_s2


zCleanCommand.runN)r1)
�__name__�
__module__�__qualname__�__doc__�aliasesrZsummary�staticmethodr;rOrrrrr0Qs
r0)Z
__future__rrZdnf.clirZdnf.i18nrrZdnf.yumrrZdnf.exceptionsZdnf.lockZdnf.loggingZdnf.repor&r
rrJZ	getLoggerr$r8rr"r(r/ZCommandr0rrrr�<module>s0
	site-packages/dnf/cli/commands/__pycache__/shell.cpython-36.opt-1.pyc000064400000017277147511334650021336 0ustar003

�ft`l&�@s�ddlmZddlmZmZddlZddlZddlZddlZddl	Z	ddl
Z
ddlZe	jd�Z
Gdd�de�ZGdd�dejej�ZdS)	�)�commands)�_�ucdN�dnfc@seZdZdZdZdZdZdS)�ShellDemandSheetTN)�__name__�
__module__�__qualname__Zavailable_reposZ	resolvingZ	root_userZsack_activation�r
r
�/usr/lib/python3.6/shell.pyr%src@s�eZdZd*Zed�jejjd�Z	dddddddd	d
dd�
Z
d
d�Zedd��Z
dd�Zdd�Zdd�Zdd�Zd+dd�Zd,dd�Zd-dd�Zd.d d!�Zd"d#�Zd/d$d%�Zd0d&d'�Zd1d(d)�ZdS)2�ShellCommand�shell�shzrun an interactive {prog} shell)�prog�repo�quitZts_run�transaction�config�resolve�help)
r�
repository�exitr�run�tsrr�
resolvedeprcCs$tjj||�tjj|�d|_dS)Nz> )r�Command�__init__�cmd�Cmd�prompt)�self�clir
r
rr=szShellCommand.__init__cCs*|jddtd�td�jtjjd�d�dS)N�script�?ZSCRIPTzScript to run in {prog} shell)r)�nargs�metavarr)�add_argumentr�formatr�util�MAIN_PROG_UPPER)�parserr
r
r�
set_argparserBszShellCommand.set_argparserc
Csr|jj}t�|j_xZt|�D]N}|jd�r,qyt|jj|�Wqtk
rht|jj|t||��YqXqWdS)N�__)r!�demandsr�dir�
startswith�getattr�AttributeError�setattr)r Zdefault_demands�attrr
r
r�	configureHs

zShellCommand.configurecCs$|jjr|j|jj�n|j�dS)N)�optsr"�_run_scriptZcmdloop)r r
r
rrUszShellCommand.runcCs |jj�d|j_|jj�dS)N)�baseZ_finalize_base�_transaction�	fill_sack)r r
r
r�_clean[s
zShellCommand._cleancCs`|s|dkrdS|dkrd}ytj|�}Wn|j�dS|jjjdd�|jjj|�}|jdkrldS|j|jkr�t	|d|j|j�|dd��n�|jj
j|j�}|dk	�rT||j�}y|jjj||�}Wnt
k
r�dSXy&tj|jj�|j_|j�|j�Wn@tjjk
�rP}ztjtd	�d
t|��dSd}~XnXn|j�dS)N�
ZEOFrF)Zreset_usager
r�zError:� )�shlex�split�_helpr!�	optparserrZparse_main_argsZcommand�MAPPINGr0Zcli_commands�getZparse_command_args�
SystemExit�copy�deepcopyr-r4rr�
exceptions�Error�logger�errorrr)r �lineZs_liner5Zcmd_clsr�er
r
r�onecmd`s<
$

zShellCommand.onecmdNc	Cs�dd�}|st|�dkr(|jd�dS|d}t|�dkrD|dnd}|jd�}|d	kr�|d|�}||dd�}|jjj|�}x|D]}||||�q�W|s�tjtd�|�n||||jj	�dS)
Nc
SsP|rt|||�n:ytdj|t|t|����Wntjtd��YnXdS)Nz{}: {}zUnsupported key value.)r2�printr'r0�strrI�warningr)�key�val�confr
r
r�print_or_set�sz*ShellCommand._config.<locals>.print_or_set�rrr<�.zCould not find repository: %s���)
�lenr@�findr7�repos�get_matchingrIrPrrS)	r �argsrTrQrRZperiodZ	repo_namerZrr
r
r�_config�s"	



zShellCommand._configcCs�t|t�rt|�dkr|dn|}d}|r�|dkrBtd�j|�}n�|dkrZtd�j|�}nv|dkrrtd	�j|�}n^|d
kr�td�j|�}nF|dkr�td�j|�}n.|dkr�td�j|�}n|dkr�td�j|�}|s�|jjj�td�}td|�dS)z�Output help information.

        :param args: the command to output help information about. If
           *args* is an empty, general help will be output.
        rNrz�{} arg [value]
  arg: debuglevel, errorlevel, obsoletes, gpgcheck, assumeyes, exclude,
        repo_id.gpgcheck, repo_id.exclude
    If no value is given it prints the current value.
    If value is given it sets that value.rz{} [command]
    print helprrz�{} arg [option]
  list: lists repositories and their status. option = [all | id | glob]
  enable: enable repositories. option = repository id
  disable: disable repositories. option = repository idrz"{}
    resolve the transaction setrrzy{} arg
  list: lists the contents of the transaction
  reset: reset (zero-out) the transaction
  run: run the transactionrz{}
    run the transactionrrz{}
    exit the shella�Shell specific arguments:

config                   set config options
help                     print help
repository (or repo)     enable, disable or list repositories
resolvedep               resolve the transaction set
transaction (or ts)      list, reset or run the transaction set
run                      resolve and run the transaction set
exit (or quit)           exit the shellr;)rr)rr)rr)	�
isinstance�listrXrr'r!rAZ
print_helprN)r r\�arg�msgr
r
rr@�s:"zShellCommand._helpcCs�|r|dnd}|d
kr6|jddj|dd���n�|dkr�|jjj}d}x\|dd�D]L}|j|�}|r~t||��d	}qZtjt	d
�dt	d�|jj
jj|��qZW|r�|jj
�d|j_n
|jd�dS)Nrr_z	repolist r=r<�enable�disableFTzError:zUnknown repo: '%s'r)r_N)rbrc)rM�joinr!r7rZr[r0rIZcriticalr�output�term�boldr9Z_compsr@)r r\rrZr9r�rr
r
r�_repo�s"



zShellCommand._repocCsLy|jjj|jjj�Wn.tjjk
rF}zt|�WYdd}~XnXdS)N)	r!r7rr-Z
allow_erasingrrGZ
DepsolveErrorrN)r r\rLr
r
r�_resolve�szShellCommand._resolvecCs�yDt|d��0}|j�}x |D]}|jd�s|j|�qWWdQRXWn:tk
r~tjtd�|jj	j
j|��tj
d�YnXdS)Nrh�#z!Error: Cannot open %s for readingr<)�open�	readlinesr/rM�IOErrorrI�inforr7rerfrg�sysr)r �file�fd�linesrKr
r
rr6�s

zShellCommand._run_scriptcCs�|r|dnd}|dkr$|j�dS|j�|d	krZ|jjr�|jjj|jj�}tj|�nz|dkr�y|jj�Wn@t	j
jk
r�}z tjt
d�dt|��WYdd}~XnXtjt
d��|j�n
|jd�dS)
Nr�resetr_rzError:r=z	Complete!r)r_N)r:rjr7r8reZlist_transactionrIroZdo_transactionrrGrHrJrrr@)r r\r�outrLr
r
rr8	s",
zShellCommand._transactioncCs|jdg�dS)Nr)r8)r r\r
r
r�_ts_run"szShellCommand._ts_runcCstjtd��tjd�dS)Nz
Leaving Shellr)rIrorrpr)r r\r
r
r�_quit%szShellCommand._quit)r
r)N)N)N)N)N)N)N)rrr	�aliasesrr'rr(r)ZsummaryrBr�staticmethodr+r4rr:rMr]r@rirjr6r8rvrwr
r
r
rr,s4
&

;



r)Zdnf.clirZdnf.i18nrrZdnf.utilrrrEZloggingr>rpZ	getLoggerrI�objectrrrrr
r
r
r�<module>s
site-packages/dnf/cli/commands/__pycache__/swap.cpython-36.pyc000064400000003537147511334650020234 0ustar003

�ft`s	�@s`ddlmZddlmZddlmZddlmZddlZddl	Z	e	j
d�ZGdd�dej�Z
dS)	�)�absolute_import)�unicode_literals)�_)�commandsN�dnfc@sLeZdZdZdZed�jejj	d�Z
edd��Zdd�Z
d	d
�Zdd�Zd
S)�SwapCommandzNA class containing methods needed by the cli to execute the swap command.
    �swapz=run an interactive {prog} mod for remove and install one spec)�progcCs,|jddtd�d�|jddtd�d�dS)N�remove_specZstorezThe specs that will be removed)�action�help�install_specz The specs that will be installed)�add_argumentr)�parser�r�/usr/lib/python3.6/swap.py�
set_argparser&s
zSwapCommand.set_argparsercCsH|jj}d|_d|_d|_d|_tj|j|j�tj	|j|j
jg�dS)NT)�cli�demandsZsack_activationZavailable_reposZ	resolvingZ	root_userrZ_checkGPGKey�baseZ_checkEnabledRepo�optsr
)�selfrrrr�	configure,szSwapCommand.configurecCs@|jjj|�}|dk	r<||j�}|jjj|||g�|j�dS)N)rZcli_commands�getZ	optparserZparse_command_args�run)rZcmd_str�specZcmd_cls�cmdrrr�_perform5s

zSwapCommand._performcCs$|jd|jj�|jd|jj�dS)N�removeZinstall)rrr
r
)rrrrr<szSwapCommand.runN)r)�__name__�
__module__�__qualname__�__doc__�aliasesr�formatr�utilZMAIN_PROG_UPPERZsummary�staticmethodrrrrrrrrrs	r)Z
__future__rrZdnf.i18nrZdnf.clirZdnf.utilrZloggingZ	getLoggerZloggerZCommandrrrrr�<module>s
site-packages/dnf/cli/commands/__pycache__/shell.cpython-36.pyc000064400000017277147511334650020377 0ustar003

�ft`l&�@s�ddlmZddlmZmZddlZddlZddlZddlZddl	Z	ddl
Z
ddlZe	jd�Z
Gdd�de�ZGdd�dejej�ZdS)	�)�commands)�_�ucdN�dnfc@seZdZdZdZdZdZdS)�ShellDemandSheetTN)�__name__�
__module__�__qualname__Zavailable_reposZ	resolvingZ	root_userZsack_activation�r
r
[remainder of compiled CPython 3.6 bytecode for the dnf interactive shell command, compiled from /usr/lib/python3.6/shell.py -- raw binary omitted; information recoverable from the embedded strings:]
  class ShellCommand (subclass of dnf.cli.commands.Command and cmd.Cmd)
      aliases = ('shell', 'sh')
      summary = "run an interactive {prog} shell"
      prompt  = "> "
      methods visible in the bytecode: set_argparser, configure, run, _clean, onecmd,
          _config, _help, _repo, _resolve, _run_script, _transaction, _ts_run, _quit
      embedded shell help text:
          config                   set config options
          help                     print help
          repository (or repo)     enable, disable or list repositories
          resolvedep               resolve the transaction set
          transaction (or ts)      list, reset or run the transaction set
          run                      resolve and run the transaction set
          exit (or quit)           exit the shell
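The .pyc entries in this part of the archive are binary and unreadable as text. A minimal sketch of how such a file could be inspected offline, assuming a local copy of one of the dumped files and a CPython 3.6 interpreter (marshal only reliably loads code objects written by the same interpreter version); the file name below is only an example:

import dis
import marshal

PYC = "deplist.cpython-36.opt-1.pyc"   # example: a local copy of one of the dumped .pyc entries

with open(PYC, "rb") as fh:
    fh.read(12)                        # CPython 3.6 .pyc header: 4-byte magic + mtime + source size
    code = marshal.load(fh)            # top-level code object of the module

dis.dis(code)                          # print human-readable bytecode instead of raw bytes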
site-packages/dnf/cli/commands/__pycache__/deplist.cpython-36.opt-1.pyc
  [compiled CPython 3.6 bytecode, compiled from /usr/lib/python3.6/deplist.py -- raw binary omitted; recoverable details:]
  class DeplistCommand(RepoQueryCommand) -- "The command is alias for 'dnf repoquery --deplist'"
      aliases = ('deplist',)
      summary = "[deprecated, use repoquery --deplist] List package's dependencies and what packages provide them"
      configure(): delegates to RepoQueryCommand.configure() and sets opts.deplist = True
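This entry is small enough that its source can be reconstructed almost completely from the strings embedded in the bytecode. An approximate reconstruction (not a verbatim copy of the shipped deplist.py):

from __future__ import print_function
from __future__ import absolute_import
from __future__ import unicode_literals

from dnf.i18n import _
from dnf.cli.commands.repoquery import RepoQueryCommand


class DeplistCommand(RepoQueryCommand):
    """
    The command is alias for 'dnf repoquery --deplist'
    """
    aliases = ('deplist',)
    summary = _("[deprecated, use repoquery --deplist] List package's "
                "dependencies and what packages provide them")

    def configure(self):
        # behave exactly like `dnf repoquery --deplist <spec>`
        RepoQueryCommand.configure(self)
        self.opts.deplist = True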
site-packages/dnf/cli/commands/__pycache__/upgrademinimal.cpython-36.pyc
  [compiled CPython 3.6 bytecode, compiled from /usr/lib/python3.6/upgrademinimal.py -- raw binary omitted; recoverable details:]
  class UpgradeMinimalCommand(UpgradeCommand)
      aliases = ('upgrade-minimal', 'update-minimal', 'up-min')
      summary = "upgrade, but only 'newest' package match which fixes a problem that affects your system"
      configure(): calls UpgradeCommand.configure(), enables upgrade_minimal, and falls back to
          all_security when no bugfix/enhancement/newpackage/security/advisory/bugzilla/cves/severity
          filter was given

site-packages/dnf/cli/commands/__pycache__/install.cpython-36.pyc
  [compiled CPython 3.6 bytecode, compiled from /usr/lib/python3.6/install.py -- raw binary omitted; recoverable details:]
  class InstallCommand(commands.Command) -- "A class containing methods needed by the cli to execute the install command."
      aliases = ('install', 'localinstall', 'in') plus the nevra forms 'install-n', 'install-na', 'install-nevra'
      summary = "install a package or packages on your system"
      methods visible in the bytecode: set_argparser, configure, run, _get_nevra_forms_from_command,
          _log_not_valid_rpm_file_paths, _inform_not_a_valid_combination, _install_files,
          _install_groups, _report_alternatives, _install_packages
site-packages/dnf/cli/commands/__pycache__/distrosync.cpython-36.pyc
  [compiled CPython 3.6 bytecode, compiled from /usr/lib/python3.6/distrosync.py -- raw binary omitted; recoverable details:]
  class DistroSyncCommand(commands.Command) -- "A class containing methods needed by the cli to execute the distro-synch command."
      aliases = ('distro-sync', 'distrosync', 'distribution-synchronization', 'dsync')
      summary = "synchronize installed packages to the latest available versions"
      run(): calls base.distro_sync_userlist(opts.package)
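As with deplist above, the distro-sync module is short and its structure is fully visible in the embedded strings. An approximate reconstruction (not a verbatim copy of the shipped distrosync.py):

from __future__ import absolute_import

from dnf.cli import commands
from dnf.i18n import _


class DistroSyncCommand(commands.Command):
    """A class containing methods needed by the cli to execute the
    distro-synch command.
    """

    aliases = ('distro-sync', 'distrosync', 'distribution-synchronization', 'dsync')
    summary = _('synchronize installed packages to the latest available versions')

    @staticmethod
    def set_argparser(parser):
        parser.add_argument('package', nargs='*', help=_('Package to synchronize'))

    def configure(self):
        demands = self.cli.demands
        demands.sack_activation = True
        demands.available_repos = True
        demands.resolving = True
        demands.root_user = True
        commands._checkGPGKey(self.base, self.cli)
        commands._checkEnabledRepo(self.base, self.opts.package)

    def run(self):
        return self.base.distro_sync_userlist(self.opts.package)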

site-packages/dnf/cli/commands/__pycache__/group.cpython-36.opt-1.pyc
  [compiled CPython 3.6 bytecode, compiled from /usr/lib/python3.6/group.py -- raw binary omitted; recoverable details:]
  class GroupCommand(commands.Command) -- "Single sub-command interface for most groups interaction."
      aliases = ('group', 'groups', 'grp')
      summary = "display, or use, the groups information"
      sub-commands visible in the bytecode: summary, list, info, install, upgrade, remove, mark
          (legacy aliases grouplist/groupinstall/groupupdate/groupremove/grouperase/groupinfo,
           plus 'update' -> 'upgrade' and 'erase' -> 'remove')
      helper methods visible: _canonical, _assert_comps, _environment_lists, _group_lists, _info,
          _list, _mark_install, _mark_remove, _mark_subcmd, _summary, set_argparser, configure,
          run, run_transaction
site-packages/dnf/cli/commands/__pycache__/check.cpython-36.opt-1.pyc
  [compiled CPython 3.6 bytecode, compiled from /usr/lib/python3.6/check.py -- raw binary omitted; recoverable details:]
  class CheckCommand(commands.Command) -- "A class containing methods needed by the cli to execute the check command."
      aliases = ('check',)
      summary = "check for problems in the packagedb"
      command-line options visible in the bytecode: --all, --dependencies, --duplicates,
          --obsoleted, --provides (each appended to check_types), plus a hidden positional
          check_yum_types (help suppressed)
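The option names and help texts for 'dnf check' are readable in this entry. A standalone sketch of the equivalent argparse setup, using plain strings instead of the i18n wrappers of the real module and a helper name chosen here for illustration:

import argparse


def set_check_argparser(parser):
    # mirrors the flags visible in CheckCommand.set_argparser()
    for flag, const, text in [
            ('--all', 'all', 'show all problems; default'),
            ('--dependencies', 'dependencies', 'show dependency problems'),
            ('--duplicates', 'duplicates', 'show duplicate problems'),
            ('--obsoleted', 'obsoleted', 'show obsoleted packages'),
            ('--provides', 'provides', 'show problems with provides')]:
        parser.add_argument(flag, dest='check_types', action='append_const',
                            const=const, help=text)
    parser.add_argument('check_yum_types', nargs='*', help=argparse.SUPPRESS)


parser = argparse.ArgumentParser(prog='dnf check')
set_check_argparser(parser)
print(parser.parse_args(['--dependencies', '--duplicates']))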

site-packages/dnf/cli/commands/__pycache__/__init__.cpython-36.opt-1.pyc
  [compiled CPython 3.6 bytecode of dnf/cli/commands/__init__.py -- raw binary omitted; recoverable details:]
  module docstring: "Classes for subcommands of the yum command line interface."
  module-level helpers: _checkGPGKey(base, cli), _checkEnabledRepo(base, possible_local_files)
  classes: Command (abstract base for CLI commands), InfoCommand, ListCommand, ProvidesCommand,
      CheckUpdateCommand, RepoPkgsCommand (with CheckUpdate, Info, Install, List, MoveTo,
      ReinstallOld, Reinstall, RemoveOrDistroSync, RemoveOrReinstall, Remove and Upgrade
      sub-commands) and HelpCommand
site-packages/dnf/cli/commands/__pycache__/autoremove.cpython-36.pyc
  [compiled CPython 3.6 bytecode, compiled from /usr/lib/python3.6/autoremove.py -- raw binary omitted; recoverable details:]
  class AutoremoveCommand(commands.Command)
      aliases = ('autoremove',) plus the nevra forms 'autoremove-n', 'autoremove-na', 'autoremove-nevra'
      summary = "remove all unneeded packages that were originally installed as dependencies"
site-packages/dnf/cli/commands/__pycache__/__init__.cpython-36.pyc
  [compiled CPython 3.6 bytecode of dnf/cli/commands/__init__.py, non-optimized build of the
   .opt-1 entry above (assert statements retained) -- raw binary omitted; same module docstring,
   helpers and command classes as listed for __init__.cpython-36.opt-1.pyc]
9?$ysite-packages/dnf/cli/commands/__pycache__/updateinfo.cpython-36.pyc000064400000033121147511334650021410 0ustar003

�ft`2J�@s�dZddlmZddlmZddlmZddlZddlZddlZddlm	Z	ddl
mZddlm
Z
mZdd	lmZd
d�ZGdd
�d
e	j�ZdS)zUpdateInfo CLI command.�)�absolute_import)�print_function)�unicode_literalsN)�commands)�OptionParser)�_�exact_width)�unicodecCstdd�|D��S)z7Return maximum length of items in a non-empty iterable.css|]}t|�VqdS)N)r)�.0�item�r� /usr/lib/python3.6/updateinfo.py�	<genexpr>&sz_maxlen.<locals>.<genexpr>)�max)�iterablerrr
�_maxlen$srcs.eZdZdZejed�ejed�ejed�ej	ed�ej
ed�iZed�ed�ed	�ed
�d�Zdddd
d
d
dd�Z
dgee
j��Zed�ZdZdddegZ�fdd�Zedd��Zdd�Zdd�Zdd�Zd d!�Zd"d#�Zd$d%�Zd&d'�Zd(d)�Zd*d+�Zd,d-�Z d.d/�Z!d0d1�Z"d2d3�Z#d4d5�Z$�Z%S)6�UpdateInfoCommandz)Implementation of the UpdateInfo command.�bugfix�enhancement�security�unknown�
newpackagez
Critical/Sec.zImportant/Sec.z
Moderate/Sec.zLow/Sec.)�Critical�	Important�Moderate�Low�list�info�summary)zlist-updateinfoz
list-securityzlist-seczinfo-updateinfoz
info-securityzinfo-seczsummary-updateinfoZ
updateinfoz!display advisories about packages�	available�	installed�updates�allcstt|�j|�d|_dS)zInitialize the command.N)�superr�__init__�_installed_query)�self�cli)�	__class__rr
r$CszUpdateInfoCommand.__init__c	Cs|j�}|jddddtd�d�|jddddtd	�d�|jd
dddtd�d�|jd
dddtd�d�dddg}|j�}|jddddtd�d�|jddddtd�d�|jddddtd�d�|jddddtd�d�|jd d!ddtd"�d�|jd#d$d%||d&tjtd'�d(�dS))Nz--available�
_availabilityr�store_constz?advisories about newer versions of installed packages (default))�dest�const�action�helpz--installedr z?advisories about equal and older versions of installed packagesz	--updatesr!zbadvisories about newer versions of those installed packages for which a newer version is availablez--allr"z3advisories about any versions of installed packagesrrrz	--summary�_spec_actionz$show summary of advisories (default)z--listzshow list of advisoriesz--infozshow info of advisoriesz
--with-cve�with_cveF�
store_truez'show only advisories with CVE reference)r+�defaultr-r.z	--with-bz�with_bzz,show only advisories with bugzilla reference�spec�*ZSPECrzPackage specification)�nargs�metavar�choicesr2r-r.)Zadd_mutually_exclusive_group�add_argumentrrZPkgNarrowCallback)�parser�availabilityZcmdsZ
output_formatrrr
�
set_argparserHsD






zUpdateInfoCommand.set_argparsercCs�d|jj_d|jj_|jj|jkr6|j|jj|j_n|jjrJ|jj|j_|jj	r`|jj	|j_
n:|jjs||jjd|jkr�|j
|j_
n|jjjd�|j_
t�|j_|jjr�|jjjtj�|jjr�|jjjtj�|jjr�|jjjtj�|jj�r|jjjtj�|jj�r�|jjjd�}|dk�r:|jjjtj�n�|dk�rV|jjjtj�np|dk�rr|jjjtj�nT|dk�r�|jjjtj�n8|d
k�r�d|j_n$|d
  [binary .pyc data, continuation of the preceding member: compiled CPython 3.6 bytecode for dnf's UpdateInfoCommand. Readable strings name the methods configure, run, _newer_equal_installed, _advisory_matcher, _apackage_advisory_installed, running_kernel_pkgs, available_apkg_adv_insts, installed_apkg_adv_insts, updating_apkg_adv_insts, all_apkg_adv_insts, _summary, display_summary, display_list and display_info, plus output labels such as "Updates Information Summary:", "New Package notice(s)", "Security notice(s)", "Critical/Important/Moderate/Low/Unknown Security notice(s)", "Bugfix notice(s)", "Enhancement notice(s)" and "other notice(s)".]
site-packages/dnf/cli/commands/__pycache__/autoremove.cpython-36.opt-1.pyc
  [binary .pyc member: compiled CPython 3.6 bytecode for AutoremoveCommand (source recorded as /usr/lib/python3.6/autoremove.py). Readable strings: aliases 'autoremove', 'autoremove-n', 'autoremove-na', 'autoremove-nevra'; summary "remove all unneeded packages that were originally installed as dependencies"; positional argument 'packages' (nargs '*', metavar PACKAGE, help "Package to remove").]
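Every member in this __pycache__ directory compiles the same plugin shape: a Command subclass with aliases, a summary, a static set_argparser(), a configure() that sets cli demands, and a run(). The sketch below is a rough reconstruction of that pattern from the strings visible in autoremove.cpython-36.opt-1.pyc; it is not the decompiled source, and everything beyond those strings (method bodies, the exact set of demands) is an assumption:

from dnf.cli import commands
from dnf.i18n import _


class AutoremoveLikeCommand(commands.Command):
    # Names and help text mirror strings visible in the bytecode;
    # the method bodies are illustrative, not the original implementation.
    aliases = ('autoremove',)
    summary = _('remove all unneeded packages that were originally '
                'installed as dependencies')

    @staticmethod
    def set_argparser(parser):
        parser.add_argument('packages', nargs='*', metavar='PACKAGE',
                            help=_('Package to remove'))

    def configure(self):
        demands = self.cli.demands
        demands.resolving = True          # a transaction will be resolved
        demands.root_user = True          # requires root
        demands.sack_activation = True    # load the package sack

    def run(self):
        # The real command marks unneeded packages for removal; this only
        # shows where that hook lives.
        self.base.autoremove()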
site-packages/dnf/cli/commands/__pycache__/upgrade.cpython-36.pyc
  [binary .pyc member: compiled CPython 3.6 bytecode for UpgradeCommand, "A class containing methods needed by the cli to execute the update command." Readable strings: aliases 'upgrade', 'update', 'upgrade-to', 'update-to', 'localupdate', 'up'; summary "upgrade a package or packages on your system"; helper methods _update_modules, _update_files, _update_packages, _update_groups; messages "No packages marked for upgrade." and "No match for argument: %s".]
site-packages/dnf/cli/commands/__pycache__/makecache.cpython-36.opt-1.pyc
  [binary .pyc member: compiled CPython 3.6 bytecode for MakeCacheCommand. Readable strings: aliases 'makecache', 'mc'; summary "generate the metadata cache"; option '--timer'; message "Making cache files for all metadata files.".]
site-packages/dnf/cli/commands/__pycache__/remove.cpython-36.pyc
  [binary .pyc member: compiled CPython 3.6 bytecode for RemoveCommand ("Remove command."). Readable strings: aliases 'remove', 'erase', 'rm', 'remove-n', 'remove-na', 'remove-nevra', 'erase-n', 'erase-na', 'erase-nevra'; summary "remove a package or packages from your system"; options '--duplicates'/'--duplicated' ("remove duplicated packages") and '--oldinstallonly' ("remove installonly packages over the limit"); messages "No duplicated packages found for removal.", "No old installonly packages found for removal.", "Installed package %s%s not available.", "Not a valid form: %s" and "No packages marked for removal.".]
site-packages/dnf/cli/commands/__pycache__/alias.cpython-36.opt-1.pyc
  [binary .pyc member: compiled CPython 3.6 bytecode for AliasCommand. Readable strings: alias 'alias'; summary "List or create command aliases"; options '--enable-resolving' and '--disable-resolving'; subcommands list, add, delete; messages "Aliases are now enabled", "Aliases are now disabled", "Invalid alias key: %s", "Alias argument has no value: %s", "Aliases added: %s", "Alias not found: %s", "Aliases deleted: %s", "Alias %s='%s'", "Aliases resolving is disabled.", "No aliases defined." and "No match for alias: %s"; the user alias file is written with a '[main]' section (an 'enabled' flag) and an '[aliases]' section (one 'key = value' line per alias).]
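The format strings recoverable from alias.cpython-36.opt-1.pyc ("[main]", "enabled = {}", "[aliases]", "{} = {}") suggest the user alias file is written as a small INI-style document. A minimal sketch of producing that layout, with a hypothetical path and hypothetical alias values, could be:

def write_aliases(path, enabled, aliases):
    # Mirrors the layout suggested by the format strings in the bytecode:
    # a [main] section with an 'enabled' flag, then an [aliases] section
    # with one "key = value" line per alias.
    with open(path, 'w') as fileobj:
        output = '[main]\n'
        output += 'enabled = {}\n\n'.format(enabled)
        output += '[aliases]\n'
        for key, value in sorted(aliases.items()):
            output += '{} = {}\n'.format(key, ' '.join(value))
        fileobj.write(output)

# Example with invented values:
write_aliases('/tmp/USER_ALIASES.conf', True, {'in': ['install'], 'rm': ['remove']})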
site-packages/dnf/cli/commands/__pycache__/repolist.cpython-36.pyc
  [binary .pyc member: compiled CPython 3.6 bytecode for RepoListCommand, "A class containing methods needed by the cli to execute the repolist command." Readable strings: aliases 'repolist', 'repoinfo'; summary "display the configured software repositories"; options '--all', '--enabled', '--disabled'; repoinfo output fields Repo-id, Repo-name, Repo-status, Repo-revision, Repo-tags, Repo-distro-tags, Repo-updated, Repo-pkgs, Repo-available-pkgs, Repo-size, Repo-metalink, Repo-mirrors, Repo-baseurl, Repo-expire, Repo-exclude, Repo-include, Repo-excluded, Repo-filename; messages "No repositories available" and "Total packages: {}".]
site-packages/dnf/cli/commands/__pycache__/deplist.cpython-36.pyc
  [binary .pyc member: compiled CPython 3.6 bytecode for DeplistCommand, "The command is alias for 'dnf repoquery --deplist'". Readable strings: alias 'deplist'; summary "[deprecated, use repoquery --deplist] List package's dependencies and what packages provide them".]
site-packages/dnf/cli/commands/__pycache__/reinstall.cpython-36.pyc
  [binary .pyc member: compiled CPython 3.6 bytecode for ReinstallCommand, "A class containing methods needed by the cli to execute the reinstall command." Readable strings: aliases 'reinstall', 'rei'; summary "reinstall a package"; messages "No match for argument: %s", "Package %s available, but not installed.", "Installed package %s%s not available." and "No packages marked for reinstall.".]
site-packages/dnf/cli/commands/__pycache__/downgrade.cpython-36.pyc
  [binary .pyc member: compiled CPython 3.6 bytecode for DowngradeCommand, "A class containing methods needed by the cli to execute the downgrade command." Readable strings: aliases 'downgrade', 'dg'; summary "Downgrade a package"; positional argument 'package' (help "Package to downgrade").]
site-packages/dnf/cli/commands/__pycache__/install.cpython-36.opt-1.pyc
  [binary .pyc member: compiled CPython 3.6 bytecode for InstallCommand, "A class containing methods needed by the cli to execute the install command." Readable strings: aliases 'install', 'localinstall', 'in', 'install-n', 'install-na', 'install-nevra'; summary "install a package or packages on your system"; messages "Nothing to do.", "Not a valid rpm file path: %s", "Not a valid form: %s", "No match for argument: %s", "Unable to find a match" and 'There are following alternatives for "{0}": {1}'.]
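All of these __pycache__ entries are members of the python3.6.tar archive being dumped here. Assuming access to the tarball itself rather than this text rendering (the path below is an assumption), the standard library is enough to enumerate them:

import tarfile

# Hypothetical location of the archive shown in this dump.
with tarfile.open('python3.6.tar') as tar:
    for member in tar.getmembers():
        # Print the size and path of each compiled bytecode member.
        if '/__pycache__/' in member.name and member.name.endswith('.pyc'):
            print('{:>8d}  {}'.format(member.size, member.name))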
site-packages/dnf/cli/commands/__pycache__/history.cpython-36.pyc
  [binary .pyc member: compiled CPython 3.6 bytecode for HistoryCommand, "A class containing methods needed by the cli to execute the history command." Readable strings: aliases 'history', 'hist'; summary "display, or use, the transaction history"; subcommands list, info, redo, replay, rollback, store, undo, userinstalled; options '--reverse', '-o'/'--output', '--ignore-installed', '--ignore-extras', '--skip-unavailable'; messages "No transaction ID given", 'Transaction ID "{0}" not found.', "Found more than one transaction ID!", "No packages to list", "Transaction saved to {}." and "Error storing transaction: {}"; uses TransactionReplay and serialize_transaction from dnf.transaction_sr.]
site-packages/dnf/cli/commands/__pycache__/check.cpython-36.pyc
  [binary .pyc member: compiled CPython 3.6 bytecode for CheckCommand, "A class containing methods needed by the cli to execute the check command." Readable strings: alias 'check'; summary "check for problems in the packagedb"; options '--all', '--dependencies', '--duplicates', '--obsoleted', '--provides'; messages "{} has missing requires of {}", '{} has installed conflict "{}": {}', "{} is a duplicate with {}", "{} is obsoleted by {}", "{} provides {} but it cannot be found" and "Check discovered {} problem(s)".]
site-packages/dnf/cli/commands/__pycache__/module.cpython-36.pyc
  [binary .pyc member: compiled CPython 3.6 bytecode for ModuleCommand and its nested subcommands (ListSubCommand, InfoSubCommand, EnableSubCommand, DisableSubCommand, ResetSubCommand, InstallSubCommand, UpdateSubCommand, RemoveSubCommand, SwitchToSubCommand, ProvidesSubCommand, RepoquerySubCommand). Readable strings: summary "Interact with Modules."; subcommand summaries such as "list all module streams, profiles and states", "enable a module stream", "disable a module with all its streams", "reset a module", "install a module profile including its packages", "update packages associated with an active stream", "remove installed module profiles and their packages", "switch a module to a stream and distrosync rpm packages", "list modular packages" and "list packages belonging to a module"; message "No matching Modules to list".]
site-packages/dnf/cli/commands/__pycache__/updateinfo.cpython-36.opt-1.pyc
  [binary .pyc member: compiled CPython 3.6 bytecode for UpdateInfoCommand ("UpdateInfo CLI command."), an optimized (.opt-1) build of the updateinfo command module. Readable strings: aliases 'updateinfo', 'list-updateinfo', 'list-security', 'list-sec', 'info-updateinfo', 'info-security', 'info-sec', 'summary-updateinfo'; summary "display advisories about packages"; options '--available', '--installed', '--updates', '--all', '--summary', '--list', '--info', '--with-cve', '--with-bz'; severity labels "Critical/Sec.", "Important/Sec.", "Moderate/Sec.", "Low/Sec.".]
site-packages/dnf/cli/commands/__pycache__/swap.cpython-36.opt-1.pyc
  [binary .pyc member: compiled CPython 3.6 bytecode for SwapCommand, "A class containing methods needed by the cli to execute the swap command." Readable strings: alias 'swap'; summary "run an interactive {prog} mod for remove and install one spec"; arguments 'remove_spec' ("The specs that will be removed") and 'install_spec' ("The specs that will be installed").]
site-packages/dnf/cli/commands/__pycache__/repolist.cpython-36.opt-1.pyc
  [binary .pyc member: compiled CPython 3.6 bytecode for RepoListCommand, an optimized (.opt-1) build of the repolist command module listed above, with the same aliases ('repolist', 'repoinfo'), summary and Repo-* output fields (entry truncated).]






"







zRepoListCommand.runN)r+r,)�__name__�
__module__�__qualname__�__doc__�aliasesrZsummary�staticmethodr>rFrMrlrrrrr*Fsr*)Z
__future__rrZdnf.clirZdnf.i18nrrrrZdnf.cli.option_parserr	Zdnf.cli.formatr
Z
dnf.pycompZdnf.utilrr#rCr[Z	getLoggerr_rrrr)ZCommandr*rrrr�<module>s"
site-packages/dnf/cli/commands/__pycache__/reinstall.cpython-36.opt-1.pyc000064400000005625147511334650022216 0ustar003

�ft`]�@slddlmZddlmZddlmZddlmZddlmZddl	Z
ddlZejd�Z
Gdd	�d	ej�ZdS)
�)�absolute_import)�unicode_literals)�commands)�OptionParser)�_N�dnfc@s8eZdZdZdZed�Zedd��Zdd�Z	d	d
�Z
dS)
�ReinstallCommandzSA class containing methods needed by the cli to execute the reinstall command.
    �	reinstall�reizreinstall a packagecCs"|jddtd�tjtd�d�dS)N�packages�+zPackage to reinstallZPACKAGE)�nargs�help�action�metavar)�add_argumentrrZParseSpecGroupFileCallback)�parser�r�/usr/lib/python3.6/reinstall.py�
set_argparser(szReinstallCommand.set_argparsercCsH|jj}d|_d|_d|_d|_tj|j|j�|j	j
sDtj|j�dS)aVerify that conditions are met so that this command can
        run.  These include that the program is being run by the root
        user, that there are enabled repositories with gpg keys, and
        that this command is called with appropriate arguments.
        TN)Zcli�demandsZsack_activationZavailable_reposZ	resolvingZ	root_userrZ_checkGPGKey�base�opts�	filenamesZ_checkEnabledRepo)�selfrrrr�	configure.szReinstallCommand.configurecCs�d}xp|jj|jjd|jjjd�D]P}y|jj|�Wn6tjj	k
rlt
jtd�|jjj
j|j��Yq"Xd}q"W�xD|jjdd�|jjD�D�]$}y|jj|�W�ntjjk
�r}zPx,|jD]"}t
jtd�|jj
j|j��Pq�Wt
jtd�|jjj
j|��WYdd}~Xq�tjjk
�r�}z^xV|jD]L}d}|jjj|�}|�rdtd	�|}td
�}t
j||jjj
j|�|��q<WWYdd}~Xq�tjj	k
�r�Yq�Xd}q�W|�s�tjjtd���dS)NF)�strict�progresszNo match for argument: %sTcSsg|]}d|�qS)�@r)�.0�xrrr�
<listcomp>Lsz(ReinstallCommand.run.<locals>.<listcomp>z(Package %s available, but not installed.�z
 (from %s)z%Installed package %s%s not available.z!No packages marked for reinstall.)rZadd_remote_rpmsrr�outputrZpackage_reinstallr�
exceptionsZMarkingError�logger�inforZtermZbold�locationZ	pkg_specsZ	grp_specsr	ZPackagesNotInstalledErrorr�nameZPackagesNotAvailableError�historyZrepo�Error)r�doneZpkgZpkg_spec�errZxmsgZpkgrepo�msgrrr�run=sB
$

"zReinstallCommand.runN)r	r
)�__name__�
__module__�__qualname__�__doc__�aliasesrZsummary�staticmethodrrr.rrrrr!sr)Z
__future__rrZdnf.clirZdnf.cli.option_parserrZdnf.i18nrZdnf.exceptionsrZloggingZ	getLoggerr%ZCommandrrrrr�<module>s
site-packages/dnf/cli/commands/__pycache__/distrosync.cpython-36.opt-1.pyc000064400000002640147511334650022414 0ustar003

�ft`��@s:ddlmZddlmZddlmZGdd�dej�ZdS)�)�absolute_import)�commands)�_c@s8eZdZdZdZed�Zedd��Zd	d
�Z	dd�Z
d
S)�DistroSyncCommandzZA class containing methods needed by the cli to execute the
    distro-synch command.
    �distro-sync�
distrosync�distribution-synchronization�dsyncz?synchronize installed packages to the latest available versionscCs|jddtd�d�dS)N�package�*zPackage to synchronize)�nargs�help)�add_argumentr)�parser�r� /usr/lib/python3.6/distrosync.py�
set_argparser"szDistroSyncCommand.set_argparsercCsF|jj}d|_d|_d|_d|_tj|j|j�tj	|j|j
j�dS)NT)Zcli�demandsZsack_activationZavailable_reposZ	resolvingZ	root_userrZ_checkGPGKey�baseZ_checkEnabledRepo�optsr
)�selfrrrr�	configure&szDistroSyncCommand.configurecCs|jj|jj�S)N)rZdistro_sync_userlistrr
)rrrr�run/szDistroSyncCommand.runN)rrrr	)�__name__�
__module__�__qualname__�__doc__�aliasesrZsummary�staticmethodrrrrrrrrs	rN)Z
__future__rZdnf.clirZdnf.i18nrZCommandrrrrr�<module>ssite-packages/dnf/cli/commands/__pycache__/upgrademinimal.cpython-36.opt-1.pyc000064400000002117147511334650023210 0ustar003

�ft`�@sDddlmZddlmZddlmZddlmZGdd�de�ZdS)�)�absolute_import)�unicode_literals)�_)�UpgradeCommandc@s$eZdZdZd	Zed�Zdd�ZdS)
�UpgradeMinimalCommandzSA class containing methods needed by the cli to execute the check
    command.
    �upgrade-minimal�update-minimal�up-minzWupgrade, but only 'newest' package match which fixes a problem that affects your systemc	CsRtj|�d|_t|jj|jj|jj|jj|jj	|jj
|jj|jjg�sNd|_
dS)NT)r�	configureZupgrade_minimal�anyZoptsZbugfixZenhancementZ
newpackageZsecurityZadvisoryZbugzillaZcvesZseverityZall_security)�self�r
�$/usr/lib/python3.6/upgrademinimal.pyr
"s
zUpgradeMinimalCommand.configureN)rrr	)�__name__�
__module__�__qualname__�__doc__�aliasesrZsummaryr
r
r
r
rrsrN)Z
__future__rrZdnf.i18nrZdnf.cli.commands.upgraderrr
r
r
r�<module>ssite-packages/dnf/cli/commands/__pycache__/search.cpython-36.opt-1.pyc000064400000010452147511334650021460 0ustar003

�ft`��@s�ddlmZddlmZddlmZddlZddlmZddlmZddl	m
Z
mZmZddl	Z
ddlZ
ddlZ
ddlZddlZejd�ZGd	d
�d
ej�ZdS)�)�absolute_import)�print_function)�unicode_literalsN)�commands)�OptionParser)�ucd�_�C_�dnfc@sPeZdZdZdZed�Zedd��Zdd�Z	d	d
�Z
dd�Zd
d�Zdd�Z
dS)�
SearchCommandzTA class containing methods needed by the cli to execute the
    search command.
    �search�sez+search package details for the given stringc	Cs<|jddtd�d�|jddtd�dgdtjtd	�d
�dS)Nz--all�
store_truez'search also package description and URL)�action�help�query_string�+ZKEYWORD�allzKeyword to search for)�nargs�metavar�choices�defaultrr)�add_argumentrrZPkgNarrowCallback)�parser�r�/usr/lib/python3.6/search.py�
set_argparser0szSearchCommand.set_argparsercs4tjdtdd�fdtdd�fdtdd�fdtd	�ff���fd
d����fdd
�}tjj�}x(|D] }�j|d|��j|d|�qbW�jj	r�xd|D] }�j|d|��j|d|�q�Wn:t
|�}t|j��}x$|D]}t
|j
|��|kr�||=q�Wd}d}	d}
d}d}�jjj�s0�jjj�j|j�d�j�}t�}
x�|jd|d�D]�}�jjj�s~|j|j|
k�rl�qF|
j|j|j�||j|�k�r�|j|�}d}|	|j
|�k�r�|j
|�}	d}|
|j|�|	kk�r�|j|�|	k}
d}|�r�||
||	�d}�jjj||j|�|��qFWt
|�dk�r0tjtd��dS)z0Search for simple text tags in a package object.�nameZlong�Name�summaryZSummary�descriptionZDescriptionZurlZURLc	sy�|S|SdS)Nr)�attr)�	TRANS_TBLrr�_translate_attrCsz.SearchCommand._search.<locals>._translate_attrcs^t�|�}td�j|�}|r*td�|}ntd�|}�jjj|dj|��}tt|��dS)Nz & z%s Exactly Matched: %%sz%s Matched: %%sz, )�mapr�join�base�outputZ
fmtSection�printr)�exact_matchZattrs�keysZtrans_attrsZtrans_attrs_strZsection_textZ	formatted)r#�selfrr�_print_section_headerIs
z4SearchCommand._search.<locals>._print_section_headerNF)�pkgT)�reverseZlimit_torzNo matches found.) �collections�OrderedDictr	rr
Z
match_counterZMatchCounter�_search_counted�optsr�len�listr*�matched_needlesr&ZconfZshowdupesfromrepos�sack�query�filtermZlatest�set�sortedrZarch�addZmatched_keysZmatched_haystacksr'Z
matchcallback�logger�info)r+�argsr,�counter�argZneedlesZpkgsr-Z
used_attrsr5r)Zprint_section_header�limit�seenr)r"r#r+r�_search9s`






zSearchCommand._searchcCs`d||i}tjj|�r$d||i}|jjj�jtjf|�}x|j	�D]}|j
|||�qFW|S)Nz
%s__substrz%s__glob)r
�utilZis_glob_patternr&r6r7r8�hawkeyZICASE�runr;)r+r?r!ZneedleZfdict�qr-rrrr1�szSearchCommand._search_countedcCs |jjs|jjtjtjd�dS)N)�stdout�stderr)r2�quiet�cliZredirect_logger�loggingZWARNING�INFO)r+rrr�
pre_configure�szSearchCommand.pre_configurecCsD|jjs|jj�|jj}d|_d|_d|_|jjp:|jj	|j_dS)NTF)
r2rJrKZredirect_repo_progress�demandsZavailable_reposZfresh_metadataZsack_activationrZquery_string_action)r+rOrrr�	configure�s
zSearchCommand.configurecCstjtd��|j|jj�S)NzSearching Packages: )r<�debugrrCr2r)r+rrrrF�szSearchCommand.runN)rr
)�__name__�
__module__�__qualname__�__doc__�aliasesrr�staticmethodrrCr1rNrPrFrrrrr(s	O		r)Z
__future__rrrr/Zdnf.clirZdnf.cli.option_parserrZdnf.i18nrrr	r
Zdnf.match_counterZdnf.utilrErLZ	getLoggerr<ZCommandrrrrr�<module>s
site-packages/dnf/cli/commands/__pycache__/module.cpython-36.opt-1.pyc000064400000035164147511334650021507 0ustar003

�ft`�A�@s�ddlmZddlmZmZddlmZddlmZddl	m
Z
ddl	ZddlZddl
Z
ddlZddlZddlZddlZGdd�dej�ZdS)	�)�print_function)�commands�CliError)�_)�NoModuleException)�loggerNcs*eZdZGdd�dej�ZGdd�de�ZGdd�de�ZGdd�de�ZGd	d
�d
e�Z	Gdd�de�Z
Gd
d�de�ZGdd�de�ZGdd�de�Z
Gdd�de�ZGdd�de�ZGdd�de�Zeeee	e
eee
eeehZehZd%Zed�Z�fdd�Zdd�Zdd �Zd!d"�Zd#d$�Z�ZS)&�
ModuleCommandcs,eZdZ�fdd�Zdd�Zdd�Z�ZS)zModuleCommand.SubCommandcs(ttj|�j|�tjjj|j�|_dS)N)	�superr�
SubCommand�__init__�dnf�module�module_baseZ
ModuleBase�base)�self�cli)�	__class__��/usr/lib/python3.6/module.pyr(sz!ModuleCommand.SubCommand.__init__c	Cs�t�}x�|jjD]�}|jj|�\}}|dkr.q|jr:|jnd}|jrJ|jnd}|jr^|jdksd|jrxt	j
td�j|��|j
r�|j
nd}|jjj||dd|�}|j|�qW|S)N��zjOnly module name, stream, architecture or profile is used. Ignoring unneeded information in argument: '{}'���)�set�opts�module_specr�_get_modules�name�stream�version�contextr�infor�format�archr�_moduleContainer�query�update)	r�modules_from_specsr�__Znsvcaprrr"�modulesrrr�#_get_modules_from_name_stream_specs,sz<ModuleCommand.SubCommand._get_modules_from_name_stream_specsc	Cs�t�}t�}x0|D](}||kr|jjj|�r|j|j��qWxB|D]:}tj|�}x*|jtj	gd�D]}|j
rd|j|j
�qdWqDW||fS)N)Zforms)rrr#ZisModuleActiver%ZgetArtifacts�hawkeyZSubjectZget_nevra_possibilitiesZ
FORM_NEVRAr�add)	rZuse_modulesZskip_modulesZ	artifactsZ	pkg_namesr
ZartifactZsubjZ	nevra_objrrr�_get_module_artifact_names>s


z3ModuleCommand.SubCommand._get_module_artifact_names)�__name__�
__module__�__qualname__rr)r,�
__classcell__rr)rrr
&sr
c@s(eZdZdZed�Zdd�Zdd�ZdS)	zModuleCommand.ListSubCommand�listz,list all module streams, profiles and statescCs|jj}d|_d|_dS)NT)r�demands�available_repos�sack_activation)rr2rrr�	configureRsz&ModuleCommand.ListSubCommand.configurecCs�|j}|jjr&|j|jjtjjj�}nV|jj	rF|j|jjtjjj
�}n6|jjrf|j|jjtjjj�}n|j|jjtjjj
�}|r�t|�dS|jjr�td�}tjj|��dS)NzNo matching Modules to list)rr�enabledZ_get_brief_descriptionr�libdnfr
�ModulePackageContainerZModuleState_ENABLED�disabledZModuleState_DISABLED�	installedZModuleState_INSTALLEDZModuleState_UNKNOWN�printrr�
exceptions�Error)rZmods�output�msgrrr�
run_on_moduleWs(z*ModuleCommand.ListSubCommand.run_on_moduleN)r1)r-r.r/�aliasesr�summaryr5r@rrrr�ListSubCommandMsrCc@s(eZdZdZed�Zdd�Zdd�ZdS)	zModuleCommand.InfoSubCommandr z)print detailed information about a modulecCs|jj}d|_d|_dS)NT)rr2r3r4)rr2rrrr5tsz&ModuleCommand.InfoSubCommand.configurecCsf|jjr|jj|jj�}n*|jjr4|jj|jj�}n|jj|jj�}|rRt|�nt	j
jtd���dS)NzNo matching Modules to list)
r�verboserZ_get_full_infor�profileZ_get_info_profilesZ	_get_infor;rr<r=r)rr>rrrr@ys
z*ModuleCommand.InfoSubCommand.run_on_moduleN)r )r-r.r/rArrBr5r@rrrr�InfoSubCommandosrFc@s(eZdZdZed�Zdd�Zdd�ZdS)	zModuleCommand.EnableSubCommand�enablezenable a module streamcCs$|jj}d|_d|_d|_d|_dS)NT)rr2r3r4�	resolving�	root_user)rr2rrrr5�s
z(ModuleCommand.EnableSubCommand.configurecCs�y|jj|jj�Wnltjjk
r�}zL|jjj	rb|j
s@|jrD|�|jrb|jdt
jjjkrb|�tjt|��WYdd}~XnXdS)Nr)rrGrrrr<�
MarkingErrorsr�conf�strict�no_match_group_specs�error_group_specs�module_depsolv_errorsr7r
r8�!ModuleErrorType_ERROR_IN_DEFAULTSr�error�str)r�errrr@�s
z,ModuleCommand.EnableSubCommand.run_on_moduleN)rG)r-r.r/rArrBr5r@rrrr�EnableSubCommand�srTc@s(eZdZdZed�Zdd�Zdd�ZdS)	zModuleCommand.DisableSubCommand�disablez%disable a module with all its streamscCs$|jj}d|_d|_d|_d|_dS)NT)rr2r3r4rHrI)rr2rrrr5�s
z)ModuleCommand.DisableSubCommand.configurecCs�y|jj|jj�Wnltjjk
r�}zL|jjj	rb|j
s@|jrD|�|jrb|jdt
jjjkrb|�tjt|��WYdd}~XnXdS)Nr)rrUrrrr<rJrrKrLrMrNrOr7r
r8rPrrQrR)rrSrrrr@�s
z-ModuleCommand.DisableSubCommand.run_on_moduleN)rU)r-r.r/rArrBr5r@rrrr�DisableSubCommand�srVc@s(eZdZdZed�Zdd�Zdd�ZdS)	zModuleCommand.ResetSubCommand�resetzreset a modulecCs$|jj}d|_d|_d|_d|_dS)NT)rr2r3r4rHrI)rr2rrrr5�s
z'ModuleCommand.ResetSubCommand.configurecCsby|jj|jj�WnHtjjk
r\}z(|jjj	r>|j
r>|�tjt
|��WYdd}~XnXdS)N)rrWrrrr<rJrrKrLrMrrQrR)rrSrrrr@�s
z+ModuleCommand.ResetSubCommand.run_on_moduleN)rW)r-r.r/rArrBr5r@rrrr�ResetSubCommand�srXc@s(eZdZdZed�Zdd�Zdd�ZdS)	zModuleCommand.InstallSubCommand�installz/install a module profile including its packagescCs$|jj}d|_d|_d|_d|_dS)NT)rr2r3r4rHrI)rr2rrrr5�s
z)ModuleCommand.InstallSubCommand.configurecCspy|jj|jj|jjj�WnNtjj	k
rj}z.|jjjrL|j
sH|jrL|�tj
t|��WYdd}~XnXdS)N)rrYrrrrKrLrr<rJrMrNrrQrR)rrSrrrr@�s
z-ModuleCommand.InstallSubCommand.run_on_moduleN)rY)r-r.r/rArrBr5r@rrrr�InstallSubCommand�srZc@s(eZdZdZed�Zdd�Zdd�ZdS)	zModuleCommand.UpdateSubCommandr%z0update packages associated with an active streamcCs$|jj}d|_d|_d|_d|_dS)NT)rr2r3r4rHrI)rr2rrrr5�s
z(ModuleCommand.UpdateSubCommand.configurecCs&|jj|jj�}|r"tdj|���dS)Nz, )rZupgraderrr�join)rZmodule_specsrrrr@�sz,ModuleCommand.UpdateSubCommand.run_on_moduleN)r%)r-r.r/rArrBr5r@rrrr�UpdateSubCommand�sr\c@s(eZdZd	Zed�Zdd�Zdd�ZdS)
zModuleCommand.RemoveSubCommand�remove�erasez3remove installed module profiles and their packagescCs0|jj}d|_d|_d|_d|_d|_d|_dS)NTF)rr2Z
allow_erasingr3Zfresh_metadatarHrIr4)rr2rrrr5�sz(ModuleCommand.RemoveSubCommand.configurec
Cs�|jj|jj�}|jjr�|j�}|j|t��\}}|j|jj	j
�|�\}}|jjj�j
�j|d�}|jjj�j
�j|d�}xF|D]>}||kr�td�j|�}	tj|	�q�|jjj||jjjd�q�W|s�dStjtjj|d��dS)N)rz0Package {} belongs to multiple modules, skipping)Z
clean_deps)rM)rr]rr�allr)r,rrr#ZgetModulePackages�sackr$r:�filtermrr!rr Zgoalr^rKZclean_requirements_on_removerQrr<rJ)
rZskipped_groupsr&Zremove_names_from_specr'Z
keep_namesZremove_queryZ
keep_query�pkgr?rrrr@�s&
z,ModuleCommand.RemoveSubCommand.run_on_moduleN)r]r^)r-r.r/rArrBr5r@rrrr�RemoveSubCommand�s	rcc@s(eZdZdZed�Zdd�Zdd�ZdS)	z ModuleCommand.SwitchToSubCommand�	switch-toz7switch a module to a stream and distrosync rpm packagescCs.|jj}d|_d|_d|_d|_d|jj_dS)NT)	rr2r3r4rHrIrrKZmodule_stream_switch)rr2rrrr5sz*ModuleCommand.SwitchToSubCommand.configurecCsry|jj|jj|jjjd�WnNtjj	k
rl}z.|jjjrN|j
sJ|jrN|�tj
t|��WYdd}~XnXdS)N)rL)rZ	switch_torrrrKrLrr<rJrMrNrrQrR)rrSrrrr@"s
z.ModuleCommand.SwitchToSubCommand.run_on_moduleN)rd)r-r.r/rArrBr5r@rrrr�SwitchToSubCommandsrec@s(eZdZdZed�Zdd�Zdd�ZdS)	z ModuleCommand.ProvidesSubCommand�provideszlist modular packagescCs|jj}d|_d|_dS)NT)rr2r3r4)rr2rrrr50sz*ModuleCommand.ProvidesSubCommand.configurecCs |jj|jj�}|rt|�dS)N)rZ_what_providesrrr;)rr>rrrr@5sz.ModuleCommand.ProvidesSubCommand.run_on_moduleN)rf)r-r.r/rArrBr5r@rrrr�ProvidesSubCommand+srgc@s(eZdZdZed�Zdd�Zdd�ZdS)	z!ModuleCommand.RepoquerySubCommand�	repoqueryz#list packages belonging to a modulecCs|jj}d|_d|_dS)NT)rr2r3r4)rr2rrrr5?sz+ModuleCommand.RepoquerySubCommand.configurecCs�t�}x*|jjD]}|jj|�\}}|j|�qW|j|t��\}}t�}|jjs\|jjr�|j	j
j�j�j|d�}x|D]}	|j
t|	��qzW|jjr�|j	j
j�j�j|d�}x|D]}	|j
t|	��q�Wdjt|��}
t|
�dS)N)Znevra_strict)r�
)rrrrrr%r,�	availabler:rr`r$rar+rRr[�sortedr;)rr&rr(r'Znames_from_specZspec_artifactsZpackage_stringsr$rbr>rrrr@Ds"

z/ModuleCommand.RepoquerySubCommand.run_on_moduleN)rh)r-r.r/rArrBr5r@rrrr�RepoquerySubCommand:srlr
zInteract with Modules.cs>tt|�j���fdd�|jD�}d|_dd�|D�|_dS)Nc3s|]}|��VqdS)Nr)�.0�subcmd)rrr�	<genexpr>dsz)ModuleCommand.__init__.<locals>.<genexpr>cSsi|]}|jD]
}||�qqSr)rA)rmrn�aliasrrr�
<dictcomp>fsz*ModuleCommand.__init__.<locals>.<dictcomp>)r	rr�SUBCMDSrn�_subcmd_name2obj)rrZsubcmd_objs)r)rrrbs
zModuleCommand.__init__cCs|j�}|jdddtd�d�|jdddtd�d�|jd	d
dtd�d�|jdd
dtd�d�|jdddtd�d�|jdddtd�d�g}g}xHt|jdd�d�D]2}|j|jd�|jdj|jd|jp�d��q�W|jdd|ddj	|�d�|jd d!d"td#�d$�dS)%Nz	--enabledr6�
store_truezshow only enabled modules)�dest�action�helpz
--disabledr9zshow only disabled modulesz--installedr:z'show only installed modules or packagesz	--profilerEzshow profile contentz--availablerjzshow only available packagesz--allr_zremove all modular packagescSs
|jdS)Nr)rA)�xrrr�<lambda>~sz-ModuleCommand.set_argparser.<locals>.<lambda>)�keyrz{}: {}rrnrz<modular command>ri)�nargs�choices�metavarrwrzmodule-spec�*zModule specification)r}r{rw)
Zadd_mutually_exclusive_group�add_argumentrrkrr�appendrAr!rBr[)r�parserZnarrowsZsubcommand_choicesZsubcommand_helprnrrr�
set_argparseris8
"

zModuleCommand.set_argparsercCsZy|j|jjd|_Wn(ttfk
r@|jjj�t�YnX|j|j_|jj�dS)Nr)	rsrrnr�KeyErrorrZ	optparserZprint_usager5)rrrrr5�s

zModuleCommand.configurecCs|j�|jj�dS)N)�check_required_argumentrnr@)rrrr�run�szModuleCommand.runcCsRdd�|jD�}|jjd|krN|jjsNttd�jtjj	|jj
|jjd���dS)NcSsg|]}|jD]}|�qqSr)rA)rmrnrprrr�
<listcomp>�sz9ModuleCommand.check_required_argument.<locals>.<listcomp>rz{} {} {}: too few arguments)�SUBCMDS_NOT_REQUIRED_ARGrrnrrrr!r�utilZ	MAIN_PROGZcommand)rZnot_required_argumentrrrr��s
z%ModuleCommand.check_required_argument)r
)r-r.r/r�Commandr
rCrFrTrVrXrZr\rcrergrlrrr�rArrBrr�r5r�r�r0rr)rrr%s.'"%	r)Z
__future__rZdnf.clirrZdnf.i18nrZdnf.module.exceptionsrZdnf.utilrr�sys�osr*r7Zdnf.module.module_baseZdnf.exceptionsr�rrrrr�<module>ssite-packages/dnf/cli/commands/__pycache__/downgrade.cpython-36.opt-1.pyc000064400000003413147511334650022164 0ustar003

�ft`	�@sRddlmZddlmZddlmZddlmZddlmZGdd�dej	�Z
dS)	�)�absolute_import)�unicode_literals)�commands)�OptionParser)�_c@s8eZdZdZdZed�Zedd��Zdd�Z	d	d
�Z
dS)
�DowngradeCommandzWA class containing methods needed by the cli to execute the
    downgrade command.
    �	downgrade�dgzDowngrade a packagecCs|jddtd�tjd�dS)N�package�*zPackage to downgrade)�nargs�help�action)�add_argumentrrZParseSpecGroupFileCallback)�parser�r�/usr/lib/python3.6/downgrade.py�
set_argparser$szDowngradeCommand.set_argparsercCsH|jj}d|_d|_d|_d|_tj|j|j�|j	j
sDtj|j�dS)NT)Zcli�demandsZsack_activationZavailable_reposZ	resolvingZ	root_userrZ_checkGPGKey�base�opts�	filenamesZ_checkEnabledRepo)�selfrrrr�	configure)szDowngradeCommand.configurecCsJ|jj|jjd|jjjd�}|jj|jjdd�|jjD�||jj	j
d�S)NF)�strict�progresscSsg|]}d|�qS)�@r)�.0�xrrr�
<listcomp>8sz(DowngradeCommand.run.<locals>.<listcomp>)Zspecs�	file_pkgsr)rZadd_remote_rpmsrr�outputrZ
downgradePkgsZ	pkg_specsZ	grp_specsZconfr)rr rrr�run4szDowngradeCommand.runN)rr	)�__name__�
__module__�__qualname__�__doc__�aliasesrZsummary�staticmethodrrr"rrrrrsrN)Z
__future__rrZdnf.clirZdnf.cli.option_parserrZdnf.i18nrZCommandrrrrr�<module>s
site-packages/dnf/cli/commands/__pycache__/repoquery.cpython-36.pyc000064400000053431147511334650021313 0ustar003

�ft`ن�
@sddlmZddlmZddlmZddlmZddlmZddlm	Z	ddl
Z
ddlZddlZddl
Z
ddlZddlZddlZddlZddlZddlZddlZejd�Zd	Ze
jd
�ZdZdd
dddddddd�	Zdd�ZGdd�de	j�ZGdd�dej�ZGdd�de �Z!dS)�)�absolute_import)�print_function)�unicode_literals)�_)�commands)�OptionParserN�dnfz.%{name}-%{epoch}:%{version}-%{release}.%{arch}z%(-?\d*?){([:.\w]+?)}aname, arch, epoch, version, release, reponame (repoid), from_repo, evr,
debug_name, source_name, source_debug_name,
installtime, buildtime, size, downloadsize, installsize,
provides, requires, obsoletes, conflicts, sourcerpm,
description, summary, license, url, reason�	conflicts�enhances�	obsoletes�provides�
recommends�requiresZrequires_pre�suggests�supplements)	r	r
rrr
rzrequires-prerrcCs�dd�}dd�}|jdd�jdd�}x tj�D]\}}|j||�}q.Wd	}d
}x>tj|�D]0}|||||j���7}|||�7}|j�}qZW||||d��7}|S)z:Convert a rpm like QUERYFMT to an python .format() string.cSs^|j�d}|j�d}|rJ|ddkr:d|dd�}nd|}d|}d|j�|dS)	Nr��-�>�<�:z{0.�})�groups�lower)ZmatchobjZfill�key�r�/usr/lib/python3.6/repoquery.py�fmt_replDszrpm2py_format.<locals>.fmt_replcSs|jdd�jdd�S)N�{z{{rz}})�replace)Ztxtrrr�bracketsOszrpm2py_format.<locals>.bracketsz\n�
z\t�	�rN)r�OPTS_MAPPING�items�
QFORMAT_MATCH�finditer�start�end)�queryformatrrr�valueZfmt�spos�itemrrr�
rpm2py_formatBsr-c@seZdZdZdS)�_CommaSplitCallbackz\s*,\s*N)�__name__�
__module__�__qualname__ZSPLITTERrrrrr._sr.c@s�eZdZdZejejejd�Zd%e	ej
��Zed�Z
edd��Zedd	��Zd
d�Zdd
�Zdd�Zdd�Zd&dd�Zd'dd�Zd(dd�Zdd�Zdd�Zdd�Zd d!�Zd*d#d$�ZdS)+�RepoQueryCommandzSA class containing methods needed by the cli to execute the repoquery command.
    )zrepoquery-nzrepoquery-nazrepoquery-nevra�	repoquery�rqz$search for packages matching keywordcCs,|jr|j|jd�|jr(|j|jd�|S)z'Filter query by repoid and arch options)Zreponame)�arch)Zrepo�filterm�arches)�opts�queryrrr�filter_repo_archms
z!RepoQueryCommand.filter_repo_archc
Cs�|jddddtd�d�|jddtd�d	�|jd
ddgtd
td�d�|jddddtd�d�|jdgtdtd�d�|jdgtdtd�d�|jdgtdtd�d�|jdgtdtd�d�|jd gtdtd!�d�|jd"gtdtd#�d�|jd$gtdtd%�d�|jd&gtdtd'�d�|jd(gtdtd)�d�|j�}|jd*dtd+�d	�|jd,dtd-�d	�|jd.dtd/�d	�|jd0dtd1�d	�|jd2dtd3�d	�|jd4dtd5�d	�|jd6dtd7�d	�|jd8d9ttd:�d;�|jd<dtd=�d	�|j�}|jd>d?d@dAdtdB�dC�|jdDdEdFdAdtdG�dC�|jdHdIdJdAdtdK�dC�|jdLdMdAdtdN�dC�|jdOdPdQttdR�dS�|jdTdtdU�d	�|jdVdQtdWtdX�dY�|jdZdQd[dWtd\�dY�|jd]dQd^dWtd_�dY�|jd`dtda�d	�|j�}|jdbdcdddWtde�dY�|jdfdcdddWtjdY�|jdgdcdhdWtdi�dY�|jdjdcdkdWtdl�dY�|jdmdtdn�d	�|j�}tdo�tdp�tdq�tdr�tds�tdt�tdu�tdv�tdw�dx�	}x2|j�D]&\}}dy|}|j|dzdW||d{��q�W|jd|dtd}�d	�td~�td�td��td��j	t
jjd��td��d��}	|j�}
x2|	j�D]&\}}dy|}
|
j|
d�dW||d{��q4W|
jd�d�dWd�tjd{�|jd�dtd��d	�|jd�d�d�td��d��dS)�Nz-az--allZqueryall�
store_truezNQuery all packages (shorthand for repoquery '*' or repoquery without argument))�dest�action�helpz--show-duplicatesz(Query all versions of packages (default))r=r>z--archz
--archlistr7z[arch]z show only results from this ARCH)r<�defaultr=�metavarr>z-fz--file�FILE�+z show only results that owns FILE)r@�nargsr>z--whatconflictsZREQz#show only results that conflict REQ)r?r=r@r>z
--whatdependszishows results that requires, suggests, supplements, enhances,or recommends package provides and files REQz--whatobsoletesz#show only results that obsolete REQz--whatprovidesz"show only results that provide REQz--whatrequiresz:shows results that requires package provides and files REQz--whatrecommendsz$show only results that recommend REQz--whatenhancesz"show only results that enhance REQz--whatsuggestsz"show only results that suggest REQz--whatsupplementsz%show only results that supplement REQz	--alldepsz=check non-explicit dependencies (files and Provides); defaultz--exactdepsz:check dependencies exactly as given, opposite of --alldepsz--recursivezOused with --whatrequires, and --requires --resolve, query packages recursively.z	--deplistz>show a list of all dependencies and what packages provide themz	--resolvez.resolve capabilities to originating package(s)z--treez"show recursive tree for package(s)z--srpmz#operate on corresponding source RPMz--latest-limit�latest_limitzOshow N latest packages for a given name.arch (or latest but N if N is negative))r<�typer>z--disable-modular-filteringz-list also packages of inactive module streamsz-iz--info�	queryinfoFz+show detailed information about the package)r<r?r=r>z-lz--list�
queryfilelistz!show list of files in the packagez-sz--source�querysourcerpmzshow package source RPM namez--changelogs�querychangelogszshow changelogs of the packagez--qfz
--queryformatr)zfdisplay format for listing packages: "%%{name} %%{version} ...", use --querytags to view full tag list)r<r?r>z--querytagsz-show available tags to use with --queryformatz--nevra�store_constzZuse name-epoch:version-release.architecture format for displaying found packages (default))r<�constr=r>z--nvrz%{name}-%{version}-%{release}zQuse name-version-release format for displaying found packages (rpm query default)z--envraz.%{epoch}:%{name}-%{version}-%{release}.%{arch}zPuse epoch:name-version-release.architecture format for displaying found packagesz
--groupmemberz=Display in which comps groups are presented selected packagesz--duplicates�	pkgfilter�
duplicatedz/limit the query to installed duplicate packagesz--duplicatedz
--installonly�installonlyz1limit the query to installed installonly packagesz
--unsatisfied�unsatisfiedzClimit the query to installed packages with unsatisfied dependenciesz
--locationz5show a location from where packages can be downloadedz5Display capabilities that the package conflicts with.zaDisplay capabilities that the package can depend on, enhance, recommend, suggest, and supplement.z2Display capabilities that the package can enhance.z-Display capabilities provided by the package.z1Display capabilities that the package recommends.z1Display capabilities that the package depends on.z�If the package is not installed display capabilities that it depends on for running %%pre and %%post scriptlets. If the package is installed display capabilities that is depends for %%pre, %%post, %%preun and %%postun.z/Display capabilities that the package suggests.z5Display capabilities that the package can supplement.)	r	�dependsr
rr
rzrequires-prerrz--%s�
packageatr)r<r=rKr>z--availablez Display only available packages.z Display only installed packages.zLDisplay only packages that are not present in any of available repositories.zQDisplay only packages that provide an upgrade for some already installed package.zIDisplay only packages that can be removed by "{prog} autoremove" command.)�progz2Display only packages that were installed by user.)�	installedZextrasZupgrades�unneeded�
userinstalled�listz--autoremoverTz--recentz%Display only recently edited packagesr�*ZKEYzthe key to search for)rCr@r>)
�add_argumentrr.Zadd_mutually_exclusive_group�int�QFORMAT_DEFAULT�argparseZSUPPRESSr$�formatr�util�	MAIN_PROG)�parserZwhatrequiresformZoutformrLZpackage_attributeZ	help_msgs�argZhelp_msg�nameZ	help_listZ
list_groupZlist_argZhelp_argZswitchrrr�
set_argparservs


























zRepoQueryCommand.set_argparsercCs |jjs|jjtjtjd�dS)N)�stdout�stderr)r8�quiet�cliZredirect_logger�loggingZWARNING�INFO)�selfrrr�
pre_configureszRepoQueryCommand.pre_configurecCsj|jjs|jj�|jj}|jjrJ|jjrB|jjdd|jj�nd|j_|jjrVdS|jj	rx|jjrxt
jjtd���|jj
r�|jjr�|jjdd�t|jj|jjdko�|jj	g�s�t
jjtd���|jjs�|jj�r|jjp�|jj�st
jjtd	j|jj�rd
nd����|jj�r$|jjj�|jjdk�r@|jjd
k�sJ|jj�rPd|_d|_|jj�rfd|_dS)Nz--obsoletesz--rz�Option '--resolve' has to be used together with one of the '--conflicts', '--depends', '--enhances', '--provides', '--recommends', '--requires', '--requires-pre', '--suggests' or '--supplements' optionsz--recursivez--exactdepsrz�Option '--recursive' has to be used with '--whatrequires <REQ>' (optionally with '--alldeps', but not with '--exactdeps'), or with '--requires <REQ> --resolve'z;argument {} requires --whatrequires or --whatdepends optionz	--alldepsrSrUrNT)rSrU)r8rerfZredirect_repo_progress�demandsrrQZ_option_conflict�	querytags�resolverZCliErrorr�	recursive�	exactdeps�any�whatrequires�alldeps�whatdependsr\�srpm�baseZreposZenable_source_reposrVrL�	availableZavailable_reposZsack_activationrI�
changelogs)rirkrrr�	configures@




zRepoQueryCommand.configurec	Cs|jrpg}|jdt|��xH|jD]>}|d}|jd|jd�tjj|d�tjj|d�f�q$Wdj|�Syht	|�}|j
r�|jjj
|�S|jr�|j}|s�ttd�j|�tjd	�|S|jr�|jSt|j�j|�SWn4tk
�r}ztjjt|���WYdd}~XnXdS)
NzChangelog for %s�	timestampz* %s %s
%s
z%a %b %d %YZauthor�textr zPackage {} contains no files)�file)rI�append�strrw�strftimer�i18n�ucd�join�PackageWrapperrFru�outputZ
infoOutputrG�files�printrr\�sysrdrHZ	sourcerpmr-r)�AttributeError�
exceptions�Error)	rir8�pkg�outZchlog�dtZpoZfilelist�errr�build_format_fnGs.
z RepoQueryCommand.build_format_fncCsN|jjj�jdd�}x4|D],}|j|jtjj|�j	|jjddd���}qW|S)NT)�emptyF)�
with_providesZwith_filenames)
ru�sackr9r6�union�intersectionr�subject�Subject�get_best_query)riZnevrasZ
base_query�resolved_nevras_queryZnevrarrr�_resolve_nevrascs
z RepoQueryCommand._resolve_nevrasNcCsD|r|n|}|j|d�}|j|�}|j|�}|r@|j|||d�}|S)N)r)�done)�filter�
differencer��_do_recursive_deps)ri�query_in�query_selectr�Zquery_requiredrrrr�ps

z#RepoQueryCommand._do_recursive_depsFcCs�|j||�}|j|d�}|j|j|d��}|r�|j|j|d��}|j|j|d��}|j|j|d��}|j|j|d��}|j|j|d��}|j|j|d��}|j|j|d	��}|j|j|d
��}|jjr�|j||�}|S)N)�requires__glob)r)�recommends__glob)�enhances__glob)�supplements__glob)�suggests__glob)r
)r
)r)r)r�r�r�r8rnr�)ri�namesr9Z
all_dep_typesr�Zdepqueryrrr�by_all_deps}szRepoQueryCommand.by_all_depscCs�|r|n|jjj�jdd�}|jjj�jdd�}x$|j�D]}|j|j|jd��}q:W|j|�}|rz|j	|||j|�d�}|j|�S)NT)r�)r)r�)
rur�r9r6�runr�r�rr��_get_recursive_providers_query)rir��	providersr��tr�r�rrrr��s
z/RepoQueryCommand._get_recursive_providers_querycCsxg}g}xN|jjD]B}tjjj|�d}|jd�r>|j|�q|r|d	kr|j|�qW|rt|jj|d|jj	j
d�}|S)
Nrz.rpm�http�ftpr{�httpsF)�strict�progress)r�r�r{r�)r8rrZpycompZurlparse�endswithr|ruZadd_remote_rpmsr�r�)riZrpmnames�remote_packagesrZschemesrrr�_add_add_remote_packages�s
z)RepoQueryCommand._add_add_remote_packagesc	Cs�|jjrtt�dS|jj|j�|jjj|jj	r8t
jnt
jd�}|jj
r�|j�}i}|jj|jkrx|j|jjg|d<g}|jdd�}|r�|j|jjj�j|d��}x>|jj
D]2}|jtjj|dd�j|jjfd|d�|���}q�W|}|jj�r|j|jjj�}|jj�rX|jj�r�|jjd	k�r�t|jjj��tjj t!d
j"dd|jj����nH|jjd
k�rx|j#|jj$j%�}n(|jj�r�|jjdk�r�t&||jj��}|jj'dk�r�|jj(|�}|j)|�j*�}n�|jj'dk�r�|jj(|�}n�|jj'dk�rVtjj+|j�}|j,|jjj-|jjj.�tj/j0|�}	d|	_1|	j2dd�}
|
�sRttj3j4|	j5���dS|jj�sh|j�}|j6|j|�}|}|jj7�r�|j|jj7d�|jj8�r�|j|jj8d�}|j|j|j9|jj8|�d��}|jj:�r�|j|jj:d�|jj;�r|j|jj;d�}
|
�r|
}n|j|jj;d�|jj<�rR|jj=�rB|j|jj<d�n|j>|jj<|�}|jj?�r�|jj=�r�|j|jj?d�}|j|j|jj?d��}|j|j|jj?d��}|j|j|jj?d��}|j|j|jj?d��}n|j>|jj?|d�}|jj@�r|j|jj@d�}|j|j|j9|jj@|�d��}|jjA�rR|j|jjAd�}|j|j|j9|jjA|�d��}|jjB�r�|j|jjBd�}|j|j|j9|jjB|�d��}|jjC�r�|j|jjCd�}|j|j|j9|jjC|�d ��}|jjD�r�|jE|jjD�}|jjF|dd!�}|jjG�rRg}xD|D]<}|jH}|dk	�r�|jjj�j||jId"d#�}||j2�7}�q�W|jjj�j|d�}|jjJ�r�|jj<�r�|jjKd9k�r�tjj t!d,�j"tj3jLd-���|jM|||j�dStN�}|jjK�r�tN�}x||j2�D]p}|jjdk�s�|jj$jO|��r�|jjKd.k�r|jP|jQ|jR|jS|jT|jU�n|jPt&|tV|jjK���q�W|jjW�r�|jjd	k�rj|j6|j|jjj��}n|j6|j|jjj�j��}|j|d/�}|jjX�r�|j|jY||��}tN�}x@|jE�j2�D]}|jZ|j[|j|���q�Wn|jPd0d1�|D���n�|jj\�r6x.|j2�D]"}|j]�}|dk	�r|jZ|��qW�nv|jj^�rNg}x�t_tN|j2���D]�}|jjdk�sx|jj$jO|��rVg}|j`d2ta|��x�t_d3d4�|jQD��D]x}|j`d5|�tjj|�}|j|jj�}|j6|j|j��}|jjb�s�|jE�}x$|j2�D]}|j`d6ta|���q�W�q�W|j`d7jc|���qVW|�rJtd8jc|��dS|jjd�rf|je|�dSxD|j2�D]8}|jjdk�s�|jj$jO|��rp|jZ|j[|j|���qpW|�r�|jjf�r�td8jct_|���ntd7jct_|���dS):N)�flagsZformsT)r�)r�)Zignore_caseF)r�r9rSz)argument {}: not allowed with argument {}z--availablez--rTrUrMrNrO)Zverify)Z
file__glob)Zconflicts__glob)r	)r)Zprovides__glob)r�)r�)r�)r�)r�)r
)r
)r)r)Zwarning�src)ra�evrr5r	r
rrr
rrrz�No valid switch specified
usage: {prog} repoquery [--conflicts|--enhances|--obsoletes|--provides|--recommends|--requires|--suggest|--supplements|--whatrequires] [key] [--tree]

description:
  For the given packages print a tree of thepackages.)rRrP)rcss|]}t|�VqdS)N)r})�.0Zrelrrr�	<genexpr>Qsz'RepoQueryCommand.run.<locals>.<genexpr>z	package: cSsg|]}t|��qSr)r})r��reqrrr�
<listcomp>]sz(RepoQueryCommand.run.<locals>.<listcomp>z  dependency: z
   provider: r z

)r	r
rrr
rrr)gr8rlr��
QUERY_TAGSrfZ _populate_update_security_filterrur�r9Zdisable_modular_filtering�hawkeyZIGNORE_MODULAR_EXCLUDESZAPPLY_EXCLUDESrr�Zcommand�nevra_formsr�r�r6rr�r�r�ZrecentZ_recentZconfrvrVZ	optparserZprint_usager�r�rr\Z	_unneeded�historyZswdb�getattrrLZ_get_installonly_queryr�rMZ
rpmdb_sackZ
_configureZinstallonlypkgsZinstallonly_limit�goalZGoalZprotect_running_kernelr�r]Z_format_resolve_problemsZ
problem_rulesr:r{Z
whatconflictsr�Z
whatobsoletesZwhatprovidesrqror�rsZwhatrecommendsZwhatenhancesZwhatsupplementsZwhatsuggestsrDZlatestZ_merge_update_filtersrtZsource_namer�ZtreerQr^�	tree_seed�setZuser_installed�updaterr
rrr
r#rmrnr��addr��locationZremote_locationZdeplist�sortedr|r}�verboser�Zgroupmember�_group_member_reportrF)ri�qr�ZkwarkZpkgsZ
query_resultsrrNZrpmdbr�ZsolvedZorqueryZrelsZquery_for_provideZdependsqueryZpkg_listr�ZsrcnameZ	tmp_queryr9r�r�Zdeplist_outputr�r�Zproviderrrrr��sH





















"








zRepoQueryCommand.runc
Cs&i}x.|jjjD] }tdd�|j�D��||j<qWi}g}xr|j�D]f}g}x(|j�D]\}}	|j|	krX|j	|�qXW|r�|j
djt|��g�j	t
|��qF|j	t
|��qFWg}
xDt|j��D]4\}}|
j	djt|�tdd�|jd�D����q�W|
j	djt|���|
�r"tdj|
��dS)NcSsg|]
}|j�qSr)ra)r�r�rrrr�}sz9RepoQueryCommand._group_member_report.<locals>.<listcomp>�$r cSsg|]}d|�qS)z  @r)r��idrrrr��s)ru�compsrr�Z
packages_iterr�r�r$rar|�
setdefaultr�r�r}�splitr�)
rir9Zpackage_conf_dict�groupZgroup_package_dictZpkg_not_in_groupr�Z
group_id_listZgroup_idZpackage_name_setr�rZpackage_listrrrr�zs* 
,z%RepoQueryCommand._group_member_reportc
Cs�|j||�}|d
kr t|�dSd}xtd|�D]}|d7}q0Wg}x|jD]}|jt|��qLWdtt|��ddj|�d}	t|d	|d|	�dS)Nr� rz|   �[z: z, �]z\_ ���)r�r��rangerr|r}�lenr�)
ri�levelr�r8Z
pkg_stringZspacing�xrZ
requirepkgZreqstrrrr�	grow_tree�s"zRepoQueryCommand.grow_treercCs8�x0tt|j��dd�d�D�]}|dks2|dkr8t�n|}|jjd�sT|jjd�rXdS|j|||�||kr|j|�|jr�t||j�}i}xFt|�D]:}	|j	j
j�j|	d�}
x |
D]}|||jd|j
<q�Wq�W|j	j
j�jt|j��d	�}
n&|j�r|j|jf|�n|j|jd
�}
|j|
|||d|�qWdS)NcSs|jS)N)ra)�prrr�<lambda>�sz,RepoQueryCommand.tree_seed.<locals>.<lambda>)rrZrpmlibZsolvable)r�.)r�)r�r�)r�r�r�ra�
startswithr�r�rQr�rur�r9r6r5rV�valuesrrr�r�r�)rir9Zaqueryr8r�Zusedpkgsr�Zstrpkg�arraZpkgqueryZquerypkgrrrr��s$"

zRepoQueryCommand.tree_seed)r3r4)N)F)Nr�)r�N)r/r0r1�__doc__r�Z	FORM_NAMEZFORM_NAZ
FORM_NEVRAr��tuple�keys�aliasesrZsummary�staticmethodr:rbrjrxr�r�r�r�r�r�r�r�r�r�rrrrr2cs,
	0



Hr2c@sDeZdZdZdd�Zdd�Zedd��Zedd	��Z	ed
d��Z
dS)
r�z>Wrapper for dnf.package.Package, so we can control formatting.cCs
||_dS)N)�_pkg)rir�rrr�__init__�szPackageWrapper.__init__cCsFt|j|�}|dkrdSt|t�r:djtdd�|D���Stjj|�S)Nz(none)r cSsh|]}tjj|��qSr)rrr�)r�Zreldeprrr�	<setcomp>�sz-PackageWrapper.__getattr__.<locals>.<setcomp>)	r�r��
isinstancerVr�r�rrr�)ri�attrZatrrrr�__getattr__�s
zPackageWrapper.__getattr__cCs&|dkrtjj|�}|jd�SdSdS)Nrz%Y-%m-%d %H:%Mr")�datetimeZutcfromtimestampr~)ryr�rrr�_get_timestamp�s
zPackageWrapper._get_timestampcCs|j|jj�S)N)r�r��	buildtime)rirrrr��szPackageWrapper.buildtimecCs|j|jj�S)N)r�r��installtime)rirrrr��szPackageWrapper.installtimeN)r/r0r1r�r�r�r�r��propertyr�r�rrrrr��sr�)"Z
__future__rrrZdnf.i18nrZdnf.clirZdnf.cli.option_parserrr[r�rg�rer�rZdnf.exceptionsZdnf.subjectZdnf.utilr�Z	getLoggerZloggerrZ�compiler%r�r#r-Z_SplitCallbackr.ZCommandr2�objectr�rrrr�<module>sJ

Wsite-packages/dnf/cli/commands/__pycache__/repoquery.cpython-36.opt-1.pyc000064400000053431147511334650022252 0ustar003

�ft`ن�
@sddlmZddlmZddlmZddlmZddlmZddlm	Z	ddl
Z
ddlZddlZddl
Z
ddlZddlZddlZddlZddlZddlZddlZejd�Zd	Ze
jd
�ZdZdd
dddddddd�	Zdd�ZGdd�de	j�ZGdd�dej�ZGdd�de �Z!dS)�)�absolute_import)�print_function)�unicode_literals)�_)�commands)�OptionParserN�dnfz.%{name}-%{epoch}:%{version}-%{release}.%{arch}z%(-?\d*?){([:.\w]+?)}aname, arch, epoch, version, release, reponame (repoid), from_repo, evr,
debug_name, source_name, source_debug_name,
installtime, buildtime, size, downloadsize, installsize,
provides, requires, obsoletes, conflicts, sourcerpm,
description, summary, license, url, reason�	conflicts�enhances�	obsoletes�provides�
recommends�requiresZrequires_pre�suggests�supplements)	r	r
rrr
rzrequires-prerrcCs�dd�}dd�}|jdd�jdd�}x tj�D]\}}|j||�}q.Wd	}d
}x>tj|�D]0}|||||j���7}|||�7}|j�}qZW||||d��7}|S)z:Convert a rpm like QUERYFMT to an python .format() string.cSs^|j�d}|j�d}|rJ|ddkr:d|dd�}nd|}d|}d|j�|dS)	Nr��-�>�<�:z{0.�})�groups�lower)ZmatchobjZfill�key�r�/usr/lib/python3.6/repoquery.py�fmt_replDszrpm2py_format.<locals>.fmt_replcSs|jdd�jdd�S)N�{z{{rz}})�replace)Ztxtrrr�bracketsOszrpm2py_format.<locals>.bracketsz\n�
z\t�	�rN)r�OPTS_MAPPING�items�
QFORMAT_MATCH�finditer�start�end)�queryformatrrr�valueZfmt�spos�itemrrr�
rpm2py_formatBsr-c@seZdZdZdS)�_CommaSplitCallbackz\s*,\s*N)�__name__�
__module__�__qualname__ZSPLITTERrrrrr._sr.c@s�eZdZdZejejejd�Zd%e	ej
��Zed�Z
edd��Zedd	��Zd
d�Zdd
�Zdd�Zdd�Zd&dd�Zd'dd�Zd(dd�Zdd�Zdd�Zdd�Zd d!�Zd*d#d$�ZdS)+�RepoQueryCommandzSA class containing methods needed by the cli to execute the repoquery command.
    )zrepoquery-nzrepoquery-nazrepoquery-nevra�	repoquery�rqz$search for packages matching keywordcCs,|jr|j|jd�|jr(|j|jd�|S)z'Filter query by repoid and arch options)Zreponame)�arch)Zrepo�filterm�arches)�opts�queryrrr�filter_repo_archms
z!RepoQueryCommand.filter_repo_archc
Cs�|jddddtd�d�|jddtd�d	�|jd
ddgtd
td�d�|jddddtd�d�|jdgtdtd�d�|jdgtdtd�d�|jdgtdtd�d�|jdgtdtd�d�|jd gtdtd!�d�|jd"gtdtd#�d�|jd$gtdtd%�d�|jd&gtdtd'�d�|jd(gtdtd)�d�|j�}|jd*dtd+�d	�|jd,dtd-�d	�|jd.dtd/�d	�|jd0dtd1�d	�|jd2dtd3�d	�|jd4dtd5�d	�|jd6dtd7�d	�|jd8d9ttd:�d;�|jd<dtd=�d	�|j�}|jd>d?d@dAdtdB�dC�|jdDdEdFdAdtdG�dC�|jdHdIdJdAdtdK�dC�|jdLdMdAdtdN�dC�|jdOdPdQttdR�dS�|jdTdtdU�d	�|jdVdQtdWtdX�dY�|jdZdQd[dWtd\�dY�|jd]dQd^dWtd_�dY�|jd`dtda�d	�|j�}|jdbdcdddWtde�dY�|jdfdcdddWtjdY�|jdgdcdhdWtdi�dY�|jdjdcdkdWtdl�dY�|jdmdtdn�d	�|j�}tdo�tdp�tdq�tdr�tds�tdt�tdu�tdv�tdw�dx�	}x2|j�D]&\}}dy|}|j|dzdW||d{��q�W|jd|dtd}�d	�td~�td�td��td��j	t
jjd��td��d��}	|j�}
x2|	j�D]&\}}dy|}
|
j|
d�dW||d{��q4W|
jd�d�dWd�tjd{�|jd�dtd��d	�|jd�d�d�td��d��dS)�Nz-az--allZqueryall�
store_truezNQuery all packages (shorthand for repoquery '*' or repoquery without argument))�dest�action�helpz--show-duplicatesz(Query all versions of packages (default))r=r>z--archz
--archlistr7z[arch]z show only results from this ARCH)r<�defaultr=�metavarr>z-fz--file�FILE�+z show only results that owns FILE)r@�nargsr>z--whatconflictsZREQz#show only results that conflict REQ)r?r=r@r>z
--whatdependszishows results that requires, suggests, supplements, enhances,or recommends package provides and files REQz--whatobsoletesz#show only results that obsolete REQz--whatprovidesz"show only results that provide REQz--whatrequiresz:shows results that requires package provides and files REQz--whatrecommendsz$show only results that recommend REQz--whatenhancesz"show only results that enhance REQz--whatsuggestsz"show only results that suggest REQz--whatsupplementsz%show only results that supplement REQz	--alldepsz=check non-explicit dependencies (files and Provides); defaultz--exactdepsz:check dependencies exactly as given, opposite of --alldepsz--recursivezOused with --whatrequires, and --requires --resolve, query packages recursively.z	--deplistz>show a list of all dependencies and what packages provide themz	--resolvez.resolve capabilities to originating package(s)z--treez"show recursive tree for package(s)z--srpmz#operate on corresponding source RPMz--latest-limit�latest_limitzOshow N latest packages for a given name.arch (or latest but N if N is negative))r<�typer>z--disable-modular-filteringz-list also packages of inactive module streamsz-iz--info�	queryinfoFz+show detailed information about the package)r<r?r=r>z-lz--list�
queryfilelistz!show list of files in the packagez-sz--source�querysourcerpmzshow package source RPM namez--changelogs�querychangelogszshow changelogs of the packagez--qfz
--queryformatr)zfdisplay format for listing packages: "%%{name} %%{version} ...", use --querytags to view full tag list)r<r?r>z--querytagsz-show available tags to use with --queryformatz--nevra�store_constzZuse name-epoch:version-release.architecture format for displaying found packages (default))r<�constr=r>z--nvrz%{name}-%{version}-%{release}zQuse name-version-release format for displaying found packages (rpm query default)z--envraz.%{epoch}:%{name}-%{version}-%{release}.%{arch}zPuse epoch:name-version-release.architecture format for displaying found packagesz
--groupmemberz=Display in which comps groups are presented selected packagesz--duplicates�	pkgfilter�
duplicatedz/limit the query to installed duplicate packagesz--duplicatedz
--installonly�installonlyz1limit the query to installed installonly packagesz
--unsatisfied�unsatisfiedzClimit the query to installed packages with unsatisfied dependenciesz
--locationz5show a location from where packages can be downloadedz5Display capabilities that the package conflicts with.zaDisplay capabilities that the package can depend on, enhance, recommend, suggest, and supplement.z2Display capabilities that the package can enhance.z-Display capabilities provided by the package.z1Display capabilities that the package recommends.z1Display capabilities that the package depends on.z�If the package is not installed display capabilities that it depends on for running %%pre and %%post scriptlets. If the package is installed display capabilities that is depends for %%pre, %%post, %%preun and %%postun.z/Display capabilities that the package suggests.z5Display capabilities that the package can supplement.)	r	�dependsr
rr
rzrequires-prerrz--%s�
packageatr)r<r=rKr>z--availablez Display only available packages.z Display only installed packages.zLDisplay only packages that are not present in any of available repositories.zQDisplay only packages that provide an upgrade for some already installed package.zIDisplay only packages that can be removed by "{prog} autoremove" command.)�progz2Display only packages that were installed by user.)�	installedZextrasZupgrades�unneeded�
userinstalled�listz--autoremoverTz--recentz%Display only recently edited packagesr�*ZKEYzthe key to search for)rCr@r>)
�add_argumentrr.Zadd_mutually_exclusive_group�int�QFORMAT_DEFAULT�argparseZSUPPRESSr$�formatr�util�	MAIN_PROG)�parserZwhatrequiresformZoutformrLZpackage_attributeZ	help_msgs�argZhelp_msg�nameZ	help_listZ
list_groupZlist_argZhelp_argZswitchrrr�
set_argparservs


























zRepoQueryCommand.set_argparsercCs |jjs|jjtjtjd�dS)N)�stdout�stderr)r8�quiet�cliZredirect_logger�loggingZWARNING�INFO)�selfrrr�
pre_configureszRepoQueryCommand.pre_configurecCsj|jjs|jj�|jj}|jjrJ|jjrB|jjdd|jj�nd|j_|jjrVdS|jj	rx|jjrxt
jjtd���|jj
r�|jjr�|jjdd�t|jj|jjdko�|jj	g�s�t
jjtd���|jjs�|jj�r|jjp�|jj�st
jjtd	j|jj�rd
nd����|jj�r$|jjj�|jjdk�r@|jjd
k�sJ|jj�rPd|_d|_|jj�rfd|_dS)Nz--obsoletesz--rz�Option '--resolve' has to be used together with one of the '--conflicts', '--depends', '--enhances', '--provides', '--recommends', '--requires', '--requires-pre', '--suggests' or '--supplements' optionsz--recursivez--exactdepsrz�Option '--recursive' has to be used with '--whatrequires <REQ>' (optionally with '--alldeps', but not with '--exactdeps'), or with '--requires <REQ> --resolve'z;argument {} requires --whatrequires or --whatdepends optionz	--alldepsrSrUrNT)rSrU)r8rerfZredirect_repo_progress�demandsrrQZ_option_conflict�	querytags�resolverZCliErrorr�	recursive�	exactdeps�any�whatrequires�alldeps�whatdependsr\�srpm�baseZreposZenable_source_reposrVrL�	availableZavailable_reposZsack_activationrI�
changelogs)rirkrrr�	configures@




zRepoQueryCommand.configurec	Cs|jrpg}|jdt|��xH|jD]>}|d}|jd|jd�tjj|d�tjj|d�f�q$Wdj|�Syht	|�}|j
r�|jjj
|�S|jr�|j}|s�ttd�j|�tjd	�|S|jr�|jSt|j�j|�SWn4tk
�r}ztjjt|���WYdd}~XnXdS)
NzChangelog for %s�	timestampz* %s %s
%s
z%a %b %d %YZauthor�textr zPackage {} contains no files)�file)rI�append�strrw�strftimer�i18n�ucd�join�PackageWrapperrFru�outputZ
infoOutputrG�files�printrr\�sysrdrHZ	sourcerpmr-r)�AttributeError�
exceptions�Error)	rir8�pkg�outZchlog�dtZpoZfilelist�errr�build_format_fnGs.
z RepoQueryCommand.build_format_fncCsN|jjj�jdd�}x4|D],}|j|jtjj|�j	|jjddd���}qW|S)NT)�emptyF)�
with_providesZwith_filenames)
ru�sackr9r6�union�intersectionr�subject�Subject�get_best_query)riZnevrasZ
base_query�resolved_nevras_queryZnevrarrr�_resolve_nevrascs
z RepoQueryCommand._resolve_nevrasNcCsD|r|n|}|j|d�}|j|�}|j|�}|r@|j|||d�}|S)N)r)�done)�filter�
differencer��_do_recursive_deps)ri�query_in�query_selectr�Zquery_requiredrrrr�ps

z#RepoQueryCommand._do_recursive_depsFcCs�|j||�}|j|d�}|j|j|d��}|r�|j|j|d��}|j|j|d��}|j|j|d��}|j|j|d��}|j|j|d��}|j|j|d��}|j|j|d	��}|j|j|d
��}|jjr�|j||�}|S)N)�requires__glob)r)�recommends__glob)�enhances__glob)�supplements__glob)�suggests__glob)r
)r
)r)r)r�r�r�r8rnr�)ri�namesr9Z
all_dep_typesr�Zdepqueryrrr�by_all_deps}szRepoQueryCommand.by_all_depscCs�|r|n|jjj�jdd�}|jjj�jdd�}x$|j�D]}|j|j|jd��}q:W|j|�}|rz|j	|||j|�d�}|j|�S)NT)r�)r)r�)
rur�r9r6�runr�r�rr��_get_recursive_providers_query)rir��	providersr��tr�r�rrrr��s
z/RepoQueryCommand._get_recursive_providers_querycCsxg}g}xN|jjD]B}tjjj|�d}|jd�r>|j|�q|r|d	kr|j|�qW|rt|jj|d|jj	j
d�}|S)
Nrz.rpm�http�ftpr{�httpsF)�strict�progress)r�r�r{r�)r8rrZpycompZurlparse�endswithr|ruZadd_remote_rpmsr�r�)riZrpmnames�remote_packagesrZschemesrrr�_add_add_remote_packages�s
z)RepoQueryCommand._add_add_remote_packagesc	Cs�|jjrtt�dS|jj|j�|jjj|jj	r8t
jnt
jd�}|jj
r�|j�}i}|jj|jkrx|j|jjg|d<g}|jdd�}|r�|j|jjj�j|d��}x>|jj
D]2}|jtjj|dd�j|jjfd|d�|���}q�W|}|jj�r|j|jjj�}|jj�rX|jj�r�|jjd	k�r�t|jjj��tjj t!d
j"dd|jj����nH|jjd
k�rx|j#|jj$j%�}n(|jj�r�|jjdk�r�t&||jj��}|jj'dk�r�|jj(|�}|j)|�j*�}n�|jj'dk�r�|jj(|�}n�|jj'dk�rVtjj+|j�}|j,|jjj-|jjj.�tj/j0|�}	d|	_1|	j2dd�}
|
�sRttj3j4|	j5���dS|jj�sh|j�}|j6|j|�}|}|jj7�r�|j|jj7d�|jj8�r�|j|jj8d�}|j|j|j9|jj8|�d��}|jj:�r�|j|jj:d�|jj;�r|j|jj;d�}
|
�r|
}n|j|jj;d�|jj<�rR|jj=�rB|j|jj<d�n|j>|jj<|�}|jj?�r�|jj=�r�|j|jj?d�}|j|j|jj?d��}|j|j|jj?d��}|j|j|jj?d��}|j|j|jj?d��}n|j>|jj?|d�}|jj@�r|j|jj@d�}|j|j|j9|jj@|�d��}|jjA�rR|j|jjAd�}|j|j|j9|jjA|�d��}|jjB�r�|j|jjBd�}|j|j|j9|jjB|�d��}|jjC�r�|j|jjCd�}|j|j|j9|jjC|�d ��}|jjD�r�|jE|jjD�}|jjF|dd!�}|jjG�rRg}xD|D]<}|jH}|dk	�r�|jjj�j||jId"d#�}||j2�7}�q�W|jjj�j|d�}|jjJ�r�|jj<�r�|jjKd9k�r�tjj t!d,�j"tj3jLd-���|jM|||j�dStN�}|jjK�r�tN�}x||j2�D]p}|jjdk�s�|jj$jO|��r�|jjKd.k�r|jP|jQ|jR|jS|jT|jU�n|jPt&|tV|jjK���q�W|jjW�r�|jjd	k�rj|j6|j|jjj��}n|j6|j|jjj�j��}|j|d/�}|jjX�r�|j|jY||��}tN�}x@|jE�j2�D]}|jZ|j[|j|���q�Wn|jPd0d1�|D���n�|jj\�r6x.|j2�D]"}|j]�}|dk	�r|jZ|��qW�nv|jj^�rNg}x�t_tN|j2���D]�}|jjdk�sx|jj$jO|��rVg}|j`d2ta|��x�t_d3d4�|jQD��D]x}|j`d5|�tjj|�}|j|jj�}|j6|j|j��}|jjb�s�|jE�}x$|j2�D]}|j`d6ta|���q�W�q�W|j`d7jc|���qVW|�rJtd8jc|��dS|jjd�rf|je|�dSxD|j2�D]8}|jjdk�s�|jj$jO|��rp|jZ|j[|j|���qpW|�r�|jjf�r�td8jct_|���ntd7jct_|���dS):N)�flagsZformsT)r�)r�)Zignore_caseF)r�r9rSz)argument {}: not allowed with argument {}z--availablez--rTrUrMrNrO)Zverify)Z
file__glob)Zconflicts__glob)r	)r)Zprovides__glob)r�)r�)r�)r�)r�)r
)r
)r)r)Zwarning�src)ra�evrr5r	r
rrr
rrrz�No valid switch specified
usage: {prog} repoquery [--conflicts|--enhances|--obsoletes|--provides|--recommends|--requires|--suggest|--supplements|--whatrequires] [key] [--tree]

description:
  For the given packages print a tree of thepackages.)rRrP)rcss|]}t|�VqdS)N)r})�.0Zrelrrr�	<genexpr>Qsz'RepoQueryCommand.run.<locals>.<genexpr>z	package: cSsg|]}t|��qSr)r})r��reqrrr�
<listcomp>]sz(RepoQueryCommand.run.<locals>.<listcomp>z  dependency: z
   provider: r z

)r	r
rrr
rrr)gr8rlr��
QUERY_TAGSrfZ _populate_update_security_filterrur�r9Zdisable_modular_filtering�hawkeyZIGNORE_MODULAR_EXCLUDESZAPPLY_EXCLUDESrr�Zcommand�nevra_formsr�r�r6rr�r�r�ZrecentZ_recentZconfrvrVZ	optparserZprint_usager�r�rr\Z	_unneeded�historyZswdb�getattrrLZ_get_installonly_queryr�rMZ
rpmdb_sackZ
_configureZinstallonlypkgsZinstallonly_limit�goalZGoalZprotect_running_kernelr�r]Z_format_resolve_problemsZ
problem_rulesr:r{Z
whatconflictsr�Z
whatobsoletesZwhatprovidesrqror�rsZwhatrecommendsZwhatenhancesZwhatsupplementsZwhatsuggestsrDZlatestZ_merge_update_filtersrtZsource_namer�ZtreerQr^�	tree_seed�setZuser_installed�updaterr
rrr
r#rmrnr��addr��locationZremote_locationZdeplist�sortedr|r}�verboser�Zgroupmember�_group_member_reportrF)ri�qr�ZkwarkZpkgsZ
query_resultsrrNZrpmdbr�ZsolvedZorqueryZrelsZquery_for_provideZdependsqueryZpkg_listr�ZsrcnameZ	tmp_queryr9r�r�Zdeplist_outputr�r�Zproviderrrrr��sH





















"








zRepoQueryCommand.runc
Cs&i}x.|jjjD] }tdd�|j�D��||j<qWi}g}xr|j�D]f}g}x(|j�D]\}}	|j|	krX|j	|�qXW|r�|j
djt|��g�j	t
|��qF|j	t
|��qFWg}
xDt|j��D]4\}}|
j	djt|�tdd�|jd�D����q�W|
j	djt|���|
�r"tdj|
��dS)NcSsg|]
}|j�qSr)ra)r�r�rrrr�}sz9RepoQueryCommand._group_member_report.<locals>.<listcomp>�$r cSsg|]}d|�qS)z  @r)r��idrrrr��s)ru�compsrr�Z
packages_iterr�r�r$rar|�
setdefaultr�r�r}�splitr�)
rir9Zpackage_conf_dict�groupZgroup_package_dictZpkg_not_in_groupr�Z
group_id_listZgroup_idZpackage_name_setr�rZpackage_listrrrr�zs* 
,z%RepoQueryCommand._group_member_reportc
Cs�|j||�}|d
kr t|�dSd}xtd|�D]}|d7}q0Wg}x|jD]}|jt|��qLWdtt|��ddj|�d}	t|d	|d|	�dS)Nr� rz|   �[z: z, �]z\_ ���)r�r��rangerr|r}�lenr�)
ri�levelr�r8Z
pkg_stringZspacing�xrZ
requirepkgZreqstrrrr�	grow_tree�s"zRepoQueryCommand.grow_treercCs8�x0tt|j��dd�d�D�]}|dks2|dkr8t�n|}|jjd�sT|jjd�rXdS|j|||�||kr|j|�|jr�t||j�}i}xFt|�D]:}	|j	j
j�j|	d�}
x |
D]}|||jd|j
<q�Wq�W|j	j
j�jt|j��d	�}
n&|j�r|j|jf|�n|j|jd
�}
|j|
|||d|�qWdS)NcSs|jS)N)ra)�prrr�<lambda>�sz,RepoQueryCommand.tree_seed.<locals>.<lambda>)rrZrpmlibZsolvable)r�.)r�)r�r�)r�r�r�ra�
startswithr�r�rQr�rur�r9r6r5rV�valuesrrr�r�r�)rir9Zaqueryr8r�Zusedpkgsr�Zstrpkg�arraZpkgqueryZquerypkgrrrr��s$"

zRepoQueryCommand.tree_seed)r3r4)N)F)Nr�)r�N)r/r0r1�__doc__r�Z	FORM_NAMEZFORM_NAZ
FORM_NEVRAr��tuple�keys�aliasesrZsummary�staticmethodr:rbrjrxr�r�r�r�r�r�r�r�r�r�rrrrr2cs,
	0



Hr2c@sDeZdZdZdd�Zdd�Zedd��Zedd	��Z	ed
d��Z
dS)
r�z>Wrapper for dnf.package.Package, so we can control formatting.cCs
||_dS)N)�_pkg)rir�rrr�__init__�szPackageWrapper.__init__cCsFt|j|�}|dkrdSt|t�r:djtdd�|D���Stjj|�S)Nz(none)r cSsh|]}tjj|��qSr)rrr�)r�Zreldeprrr�	<setcomp>�sz-PackageWrapper.__getattr__.<locals>.<setcomp>)	r�r��
isinstancerVr�r�rrr�)ri�attrZatrrrr�__getattr__�s
zPackageWrapper.__getattr__cCs&|dkrtjj|�}|jd�SdSdS)Nrz%Y-%m-%d %H:%Mr")�datetimeZutcfromtimestampr~)ryr�rrr�_get_timestamp�s
zPackageWrapper._get_timestampcCs|j|jj�S)N)r�r��	buildtime)rirrrr��szPackageWrapper.buildtimecCs|j|jj�S)N)r�r��installtime)rirrrr��szPackageWrapper.installtimeN)r/r0r1r�r�r�r�r��propertyr�r�rrrrr��sr�)"Z
__future__rrrZdnf.i18nrZdnf.clirZdnf.cli.option_parserrr[r�rg�rer�rZdnf.exceptionsZdnf.subjectZdnf.utilr�Z	getLoggerZloggerrZ�compiler%r�r#r-Z_SplitCallbackr.ZCommandr2�objectr�rrrr�<module>sJ

Wsite-packages/dnf/cli/commands/__pycache__/makecache.cpython-36.pyc000064400000002357147511334650021162 0ustar003

�ft`m�@sxddlmZddlmZddlmZddlmZddlZddlZddl	Zddl
ZddlZejd�Z
Gdd�dej�ZdS)	�)�absolute_import)�unicode_literals)�commands)�_N�dnfc@s,eZdZd	Zed�Zedd��Zdd�ZdS)
�MakeCacheCommand�	makecache�mczgenerate the metadata cachecCs,|jdddd�|jdddgdtjd�dS)Nz--timer�
store_true�	timer_opt)�action�dest�timer�?)�nargs�choices�metavar�help)�add_argument�argparseZSUPPRESS)�parser�r�/usr/lib/python3.6/makecache.py�
set_argparser'szMakeCacheCommand.set_argparsercCs2|jjdk	p|jj}td�}tj|�|jj|�S)Nz*Making cache files for all metadata files.)Zoptsrrr�logger�debug�baseZupdate_cache)�selfr�msgrrr�run.s
zMakeCacheCommand.runN)rr	)	�__name__�
__module__�__qualname__�aliasesrZsummary�staticmethodrrrrrrr#sr)Z
__future__rrZdnf.clirZdnf.i18nrrrZdnf.exceptionsZdnf.utilZloggingZ	getLoggerrZCommandrrrrr�<module>s
site-packages/dnf/cli/commands/__pycache__/search.cpython-36.pyc000064400000010452147511334650020521 0ustar003

�ft`��@s�ddlmZddlmZddlmZddlZddlmZddlmZddl	m
Z
mZmZddl	Z
ddlZ
ddlZ
ddlZddlZejd�ZGd	d
�d
ej�ZdS)�)�absolute_import)�print_function)�unicode_literalsN)�commands)�OptionParser)�ucd�_�C_�dnfc@sPeZdZdZdZed�Zedd��Zdd�Z	d	d
�Z
dd�Zd
d�Zdd�Z
dS)�
SearchCommandzTA class containing methods needed by the cli to execute the
    search command.
    �search�sez+search package details for the given stringc	Cs<|jddtd�d�|jddtd�dgdtjtd	�d
�dS)Nz--all�
store_truez'search also package description and URL)�action�help�query_string�+ZKEYWORD�allzKeyword to search for)�nargs�metavar�choices�defaultrr)�add_argumentrrZPkgNarrowCallback)�parser�r�/usr/lib/python3.6/search.py�
set_argparser0szSearchCommand.set_argparsercs4tjdtdd�fdtdd�fdtdd�fdtd	�ff���fd
d����fdd
�}tjj�}x(|D] }�j|d|��j|d|�qbW�jj	r�xd|D] }�j|d|��j|d|�q�Wn:t
|�}t|j��}x$|D]}t
|j
|��|kr�||=q�Wd}d}	d}
d}d}�jjj�s0�jjj�j|j�d�j�}t�}
x�|jd|d�D]�}�jjj�s~|j|j|
k�rl�qF|
j|j|j�||j|�k�r�|j|�}d}|	|j
|�k�r�|j
|�}	d}|
|j|�|	kk�r�|j|�|	k}
d}|�r�||
||	�d}�jjj||j|�|��qFWt
|�dk�r0tjtd��dS)z0Search for simple text tags in a package object.�nameZlong�Name�summaryZSummary�descriptionZDescriptionZurlZURLc	sy�|S|SdS)Nr)�attr)�	TRANS_TBLrr�_translate_attrCsz.SearchCommand._search.<locals>._translate_attrcs^t�|�}td�j|�}|r*td�|}ntd�|}�jjj|dj|��}tt|��dS)Nz & z%s Exactly Matched: %%sz%s Matched: %%sz, )�mapr�join�base�outputZ
fmtSection�printr)�exact_matchZattrs�keysZtrans_attrsZtrans_attrs_strZsection_textZ	formatted)r#�selfrr�_print_section_headerIs
z4SearchCommand._search.<locals>._print_section_headerNF)�pkgT)�reverseZlimit_torzNo matches found.) �collections�OrderedDictr	rr
Z
match_counterZMatchCounter�_search_counted�optsr�len�listr*�matched_needlesr&ZconfZshowdupesfromrepos�sack�query�filtermZlatest�set�sortedrZarch�addZmatched_keysZmatched_haystacksr'Z
matchcallback�logger�info)r+�argsr,�counter�argZneedlesZpkgsr-Z
used_attrsr5r)Zprint_section_header�limit�seenr)r"r#r+r�_search9s`






zSearchCommand._searchcCs`d||i}tjj|�r$d||i}|jjj�jtjf|�}x|j	�D]}|j
|||�qFW|S)Nz
%s__substrz%s__glob)r
�utilZis_glob_patternr&r6r7r8�hawkeyZICASE�runr;)r+r?r!ZneedleZfdict�qr-rrrr1�szSearchCommand._search_countedcCs |jjs|jjtjtjd�dS)N)�stdout�stderr)r2�quiet�cliZredirect_logger�loggingZWARNING�INFO)r+rrr�
pre_configure�szSearchCommand.pre_configurecCsD|jjs|jj�|jj}d|_d|_d|_|jjp:|jj	|j_dS)NTF)
r2rJrKZredirect_repo_progress�demandsZavailable_reposZfresh_metadataZsack_activationrZquery_string_action)r+rOrrr�	configure�s
zSearchCommand.configurecCstjtd��|j|jj�S)NzSearching Packages: )r<�debugrrCr2r)r+rrrrF�szSearchCommand.runN)rr
)�__name__�
__module__�__qualname__�__doc__�aliasesrr�staticmethodrrCr1rNrPrFrrrrr(s	O		r)Z
__future__rrrr/Zdnf.clirZdnf.cli.option_parserrZdnf.i18nrrr	r
Zdnf.match_counterZdnf.utilrErLZ	getLoggerr<ZCommandrrrrr�<module>s
site-packages/dnf/cli/commands/__pycache__/upgrade.cpython-36.opt-1.pyc000064400000007141147511334650021643 0ustar003
[binary data omitted: compiled CPython 3.6 bytecode for the dnf "upgrade" command (upgrade.cpython-36.opt-1.pyc); not human-readable]
site-packages/dnf/cli/commands/repolist.py000064400000031172147511334650014633 0ustar00# repolist.py
# repolist CLI command.
#
# Copyright (C) 2014-2016 Red Hat, Inc.
#
# This copyrighted material is made available to anyone wishing to use,
# modify, copy, or redistribute it subject to the terms and conditions of
# the GNU General Public License v.2, or (at your option) any later version.
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY expressed or implied, including the implied warranties of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General
# Public License for more details.  You should have received a copy of the
# GNU General Public License along with this program; if not, write to the
# Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
# 02110-1301, USA.  Any Red Hat trademarks that are incorporated in the
# source code or documentation are not subject to the GNU General Public
# License and may only be used or replicated with the express permission of
# Red Hat, Inc.
#

from __future__ import absolute_import
from __future__ import unicode_literals
from dnf.cli import commands
from dnf.i18n import _, ucd, fill_exact_width, exact_width
from dnf.cli.option_parser import OptionParser
import dnf.cli.format
import dnf.pycomp
import dnf.util
import fnmatch
import hawkey
import logging
import operator

logger = logging.getLogger('dnf')


def _expire_str(repo, md):
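    # Render the repository's metadata_expire setting together with the time the
    # metadata was last fetched, e.g. "Never (last: ...)", "Instant (last: ...)"
    # or "<n> second(s) (last: ...)".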
    last = dnf.util.normalize_time(repo._repo.getTimestamp()) if md else _("unknown")
    if repo.metadata_expire <= -1:
        return _("Never (last: %s)") % last
    elif not repo.metadata_expire:
        return _("Instant (last: %s)") % last
    else:
        num = _num2ui_num(repo.metadata_expire)
        return _("%s second(s) (last: %s)") % (num, last)


def _num2ui_num(num):
    return ucd(dnf.pycomp.format("%d", num, True))


def _repo_match(repo, patterns):
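    # Matching is effectively case-insensitive: the repo id and name are lower-cased
    # here, and the caller passes already lower-cased shell-style (fnmatch) patterns.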
    rid = repo.id.lower()
    rnm = repo.name.lower()
    for pat in patterns:
        if fnmatch.fnmatch(rid, pat):
            return True
        if fnmatch.fnmatch(rnm, pat):
            return True
    return False


def _repo_size(sack, repo):
    ret = 0
    for pkg in sack.query(flags=hawkey.IGNORE_EXCLUDES).filterm(reponame__eq=repo.id):
        ret += pkg._size
    return dnf.cli.format.format_number(ret)


class RepoListCommand(commands.Command):
    """A class containing methods needed by the cli to execute the
    repolist command.
    """

    aliases = ('repolist', 'repoinfo')
    summary = _('display the configured software repositories')
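    # Illustrative invocations (not part of the original source; derived from the
    # aliases above and the options defined in set_argparser below; the repo
    # pattern is a placeholder):
    #   dnf repolist                # enabled repositories (the default)
    #   dnf repolist --all          # enabled and disabled repositories
    #   dnf repolist --disabled     # disabled repositories only
    #   dnf repoinfo 'updates*'     # verbose per-repository details for matching repos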

    @staticmethod
    def set_argparser(parser):
        repolimit = parser.add_mutually_exclusive_group()
        repolimit.add_argument('--all', dest='_repos_action',
                               action='store_const', const='all', default=None,
                               help=_("show all repos"))
        repolimit.add_argument('--enabled', dest='_repos_action',
                               action='store_const', const='enabled',
                               help=_("show enabled repos (default)"))
        repolimit.add_argument('--disabled', dest='_repos_action',
                               action='store_const', const='disabled',
                               help=_("show disabled repos"))
        parser.add_argument('repos', nargs='*', default='enabled-default', metavar="REPOSITORY",
                            choices=['all', 'enabled', 'disabled'],
                            action=OptionParser.PkgNarrowCallback,
                            help=_("Repository specification"))

    def pre_configure(self):
        if not self.opts.quiet:
            self.cli.redirect_logger(stdout=logging.WARNING, stderr=logging.INFO)

    def configure(self):
        if not self.opts.quiet:
            self.cli.redirect_repo_progress()
        demands = self.cli.demands
        if self.base.conf.verbose or self.opts.command == 'repoinfo':
            demands.available_repos = True
            demands.sack_activation = True

        if self.opts._repos_action:
            self.opts.repos_action = self.opts._repos_action

    def run(self):
        arg = self.opts.repos_action
        extcmds = [x.lower() for x in self.opts.repos]

        verbose = self.base.conf.verbose

        repos = list(self.base.repos.values())
        repos.sort(key=operator.attrgetter('id'))
        term = self.output.term
        on_ehibeg = term.FG_COLOR['green'] + term.MODE['bold']
        on_dhibeg = term.FG_COLOR['red']
        on_hiend = term.MODE['normal']
        tot_num = 0
        cols = []
        if not repos:
            logger.warning(_('No repositories available'))
            return
        include_status = arg == 'all' or (arg == 'enabled-default' and extcmds)
        repoinfo_output = []
        for repo in repos:
            if len(extcmds) and not _repo_match(repo, extcmds):
                continue
            (ehibeg, dhibeg, hiend) = '', '', ''
            ui_enabled = ''
            ui_endis_wid = 0
            ui_excludes_num = ''
            if include_status:
                (ehibeg, dhibeg, hiend) = (on_ehibeg, on_dhibeg, on_hiend)
            if repo.enabled:
                enabled = True
                if arg == 'disabled':
                    continue
                if include_status or verbose or self.opts.command == 'repoinfo':
                    ui_enabled = ehibeg + _('enabled') + hiend
                    ui_endis_wid = exact_width(_('enabled'))
                if verbose or self.opts.command == 'repoinfo':
                    ui_size = _repo_size(self.base.sack, repo)
            else:
                enabled = False
                if arg == 'enabled' or (arg == 'enabled-default' and not extcmds):
                    continue
                ui_enabled = dhibeg + _('disabled') + hiend
                ui_endis_wid = exact_width(_('disabled'))

            if not (verbose or self.opts.command == 'repoinfo'):
                rid = ucd(repo.id)
                cols.append((rid, repo.name, (ui_enabled, ui_endis_wid)))
            else:
                if enabled:
                    md = repo.metadata
                else:
                    md = None
                out = [self.output.fmtKeyValFill(_("Repo-id            : "), repo.id),
                       self.output.fmtKeyValFill(_("Repo-name          : "), repo.name)]

                if include_status:
                    out += [self.output.fmtKeyValFill(_("Repo-status        : "),
                                                      ui_enabled)]
                if md and repo._repo.getRevision():
                    out += [self.output.fmtKeyValFill(_("Repo-revision      : "),
                                                      repo._repo.getRevision())]
                if md and repo._repo.getContentTags():
                    tags = repo._repo.getContentTags()
                    out += [self.output.fmtKeyValFill(_("Repo-tags          : "),
                                                      ", ".join(sorted(tags)))]

                if md and repo._repo.getDistroTags():
                    distroTagsDict = {k: v for (k, v) in repo._repo.getDistroTags()}
                    for (distro, tags) in distroTagsDict.items():
                        out += [self.output.fmtKeyValFill(
                            _("Repo-distro-tags      : "),
                            "[%s]: %s" % (distro, ", ".join(sorted(tags))))]

                if md:
                    num = len(self.base.sack.query(flags=hawkey.IGNORE_EXCLUDES).filterm(
                        reponame__eq=repo.id))
                    num_available = len(self.base.sack.query().filterm(reponame__eq=repo.id))
                    ui_num = _num2ui_num(num)
                    ui_num_available = _num2ui_num(num_available)
                    tot_num += num
                    out += [
                        self.output.fmtKeyValFill(
                            _("Repo-updated       : "),
                            dnf.util.normalize_time(repo._repo.getMaxTimestamp())),
                        self.output.fmtKeyValFill(_("Repo-pkgs          : "), ui_num),
                        self.output.fmtKeyValFill(_("Repo-available-pkgs: "), ui_num_available),
                        self.output.fmtKeyValFill(_("Repo-size          : "), ui_size)]

                if repo.metalink:
                    out += [self.output.fmtKeyValFill(_("Repo-metalink      : "),
                                                      repo.metalink)]
                    if enabled:
                        ts = repo._repo.getTimestamp()
                        out += [self.output.fmtKeyValFill(
                            _("  Updated          : "), dnf.util.normalize_time(ts))]
                elif repo.mirrorlist:
                    out += [self.output.fmtKeyValFill(_("Repo-mirrors       : "),
                                                      repo.mirrorlist)]
                baseurls = repo.baseurl
                if baseurls:
                    out += [self.output.fmtKeyValFill(_("Repo-baseurl       : "),
                                                      ", ".join(baseurls))]
                elif enabled:
                    mirrors = repo._repo.getMirrors()
                    if mirrors:
                        url = "%s (%d more)" % (mirrors[0], len(mirrors) - 1)
                        out += [self.output.fmtKeyValFill(_("Repo-baseurl       : "), url)]

                expire = _expire_str(repo, md)
                out += [self.output.fmtKeyValFill(_("Repo-expire        : "), expire)]

                if repo.excludepkgs:
                    # TRANSLATORS: Packages that are excluded - their names, e.g. (dnf systemd)
                    out += [self.output.fmtKeyValFill(_("Repo-exclude       : "),
                                                      ", ".join(repo.excludepkgs))]

                if repo.includepkgs:
                    out += [self.output.fmtKeyValFill(_("Repo-include       : "),
                                                      ", ".join(repo.includepkgs))]

                if ui_excludes_num:
                    # TRANSLATORS: Number of packages that were excluded (5)
                    out += [self.output.fmtKeyValFill(_("Repo-excluded      : "),
                                                      ui_excludes_num)]

                if repo.repofile:
                    out += [self.output.fmtKeyValFill(_("Repo-filename      : "),
                                                      repo.repofile)]
                repoinfo_output.append("\n".join(map(ucd, out)))

        if repoinfo_output:
            print("\n\n".join(repoinfo_output))
        if not verbose and cols:
            #  Work out the first (id) and last (enabled/disabled/count),
            # then chop the middle (name)...

            id_len = exact_width(_('repo id'))
            nm_len = 0
            st_len = 0

            for (rid, rname, (ui_enabled, ui_endis_wid)) in cols:
                if id_len < exact_width(rid):
                    id_len = exact_width(rid)
                if nm_len < exact_width(rname):
                    nm_len = exact_width(rname)
                if st_len < ui_endis_wid:
                    st_len = ui_endis_wid
                # Need this as well as above for: fill_exact_width()
            if include_status:
                if exact_width(_('status')) > st_len:
                    left = term.columns - (id_len + len(_('status')) + 2)
                else:
                    left = term.columns - (id_len + st_len + 2)
            else:  # Don't output a status column.
                left = term.columns - (id_len + 1)

            if left < nm_len:  # Name gets chopped
                nm_len = left
            else:  # Share the extra...
                left -= nm_len
                id_len += left // 2
                nm_len += left - (left // 2)

            txt_rid = fill_exact_width(_('repo id'), id_len)
            if include_status:
                txt_rnam = fill_exact_width(_('repo name'), nm_len, nm_len)
            else:
                txt_rnam = _('repo name')
            if not include_status:  # Don't output a status column.
                print("%s %s" % (txt_rid, txt_rnam))
            else:
                print("%s %s %s" % (txt_rid, txt_rnam, _('status')))
            for (rid, rname, (ui_enabled, ui_endis_wid)) in cols:
                if not include_status:  # Don't output a status column.
                    print("%s %s" % (fill_exact_width(rid, id_len), rname))
                    continue

                print("%s %s %s" % (fill_exact_width(rid, id_len),
                                    fill_exact_width(rname, nm_len, nm_len),
                                    ui_enabled))
        if verbose or self.opts.command == 'repoinfo':
            msg = _('Total packages: {}')
            print(msg.format(_num2ui_num(tot_num)))
site-packages/dnf/cli/commands/mark.py000064400000006720147511334650013725 0ustar00# mark.py
# Mark CLI command.
#
# Copyright (C) 2015-2016 Red Hat, Inc.
#
# This copyrighted material is made available to anyone wishing to use,
# modify, copy, or redistribute it subject to the terms and conditions of
# the GNU General Public License v.2, or (at your option) any later version.
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY expressed or implied, including the implied warranties of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General
# Public License for more details.  You should have received a copy of the
# GNU General Public License along with this program; if not, write to the
# Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
# 02110-1301, USA.  Any Red Hat trademarks that are incorporated in the
# source code or documentation are not subject to the GNU General Public
# License and may only be used or replicated with the express permission of
# Red Hat, Inc.
#

from __future__ import print_function
from __future__ import unicode_literals

import libdnf.transaction

from dnf.i18n import _
from dnf.cli import commands

import dnf
import functools
import logging

logger = logging.getLogger("dnf")


class MarkCommand(commands.Command):

    aliases = ('mark',)
    summary = _('mark or unmark installed packages as installed by user.')

    @staticmethod
    def set_argparser(parser):
        parser.add_argument('mark', nargs=1, choices=['install', 'remove', 'group'],
                            help=_("install: mark as installed by user\n"
                                   "remove: unmark as installed by user\n"
                                   "group: mark as installed by group"))
        parser.add_argument('package', nargs='+', metavar="PACKAGE",
                            help=_("Package specification"))
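    # Illustrative invocations (not part of the original source; "foo" is a
    # placeholder package name):
    #   dnf mark install foo      # mark "foo" as installed by the user
    #   dnf mark remove foo       # mark "foo" as installed as a dependency
    #   dnf mark group foo        # mark "foo" as installed by a group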

    def _mark_install(self, pkg):
        self.base.history.set_reason(pkg, libdnf.transaction.TransactionItemReason_USER)
        logger.info(_('%s marked as user installed.'), str(pkg))

    def _mark_remove(self, pkg):
        self.base.history.set_reason(pkg, libdnf.transaction.TransactionItemReason_DEPENDENCY)
        logger.info(_('%s unmarked as user installed.'), str(pkg))

    def _mark_group(self, pkg):
        self.base.history.set_reason(pkg, libdnf.transaction.TransactionItemReason_GROUP)
        logger.info(_('%s marked as group installed.'), str(pkg))

    def configure(self):
        demands = self.cli.demands
        demands.sack_activation = True
        demands.root_user = True
        demands.available_repos = False
        demands.resolving = False

    def run(self):
        cmd = self.opts.mark[0]
        pkgs = self.opts.package

        mark_func = functools.partial(getattr(self, '_mark_' + cmd))

        notfound = []
        for pkg in pkgs:
            subj = dnf.subject.Subject(pkg)
            q = subj.get_best_query(self.base.sack)
            for pkg in q:
                mark_func(pkg)
            if len(q) == 0:
                notfound.append(pkg)

        if notfound:
            logger.error(_('Error:'))
            for pkg in notfound:
                logger.error(_('Package %s is not installed.'), pkg)
            raise dnf.cli.CliError

        old = self.base.history.last()
        if old is None:
            rpmdb_version = self.base.sack._rpmdb_version()
        else:
            rpmdb_version = old.end_rpmdb_version

        self.base.history.beg(rpmdb_version, [], [])
        self.base.history.end(rpmdb_version)
site-packages/dnf/cli/commands/check.py000064400000016077147511334650014056 0ustar00#
# Copyright (C) 2016 Red Hat, Inc.
#
# This copyrighted material is made available to anyone wishing to use,
# modify, copy, or redistribute it subject to the terms and conditions of
# the GNU General Public License v.2, or (at your option) any later version.
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY expressed or implied, including the implied warranties of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General
# Public License for more details.  You should have received a copy of the
# GNU General Public License along with this program; if not, write to the
# Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
# 02110-1301, USA.  Any Red Hat trademarks that are incorporated in the
# source code or documentation are not subject to the GNU General Public
# License and may only be used or replicated with the express permission of
# Red Hat, Inc.
#

from __future__ import absolute_import
from __future__ import unicode_literals
from dnf.i18n import _
from dnf.cli import commands

import argparse
import dnf.exceptions
import dnf.goal
import dnf.sack
import dnf.selector


class CheckCommand(commands.Command):
    """A class containing methods needed by the cli to execute the check
    command.
    """

    aliases = ('check',)
    summary = _('check for problems in the packagedb')

    @staticmethod
    def set_argparser(parser):
        parser.add_argument('--all', dest='check_types',
                            action='append_const', const='all',
                            help=_('show all problems; default'))
        parser.add_argument('--dependencies', dest='check_types',
                            action='append_const', const='dependencies',
                            help=_('show dependency problems'))
        parser.add_argument('--duplicates', dest='check_types',
                            action='append_const', const='duplicates',
                            help=_('show duplicate problems'))
        parser.add_argument('--obsoleted', dest='check_types',
                            action='append_const', const='obsoleted',
                            help=_('show obsoleted packages'))
        parser.add_argument('--provides', dest='check_types',
                            action='append_const', const='provides',
                            help=_('show problems with provides'))
        # Add compatibility with yum, but keep this positional argument hidden from help.
        # Including [] in choices allows an empty list when no argument is given; without it parsing fails.
        parser.add_argument('check_yum_types', nargs='*', choices=[
            'all', 'dependencies', 'duplicates', 'obsoleted', 'provides', []],
                            help=argparse.SUPPRESS)
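    # Illustrative invocations (not part of the original source):
    #   dnf check                          # run all checks (the default)
    #   dnf check --dependencies --duplicates
    #   dnf check obsoleted                # yum-compatible positional form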

    def configure(self):
        self.cli.demands.sack_activation = True
        if self.opts.check_yum_types:
            if self.opts.check_types:
                self.opts.check_types = self.opts.check_types + \
                                        self.opts.check_yum_types
            else:
                self.opts.check_types = self.opts.check_yum_types
        if not self.opts.check_types:
            self.opts.check_types = {'all'}
        else:
            self.opts.check_types = set(self.opts.check_types)
        self.base.conf.disable_excludes += ["all"]

    def run(self):
        output_set = set()
        q = self.base.sack.query().installed()

        if self.opts.check_types.intersection({'all', 'dependencies'}):
            sack = None
            for pkg in q:
                for require in set(pkg.regular_requires) | set(set(pkg.requires_pre) - set(pkg.prereq_ignoreinst)):
                    if str(require).startswith('rpmlib'):
                        continue
                    if not len(q.filter(provides=[require])):
                        if str(require).startswith('('):
                            # rich deps can only be tested by the solver
                            if sack is None:
                                sack = dnf.sack.rpmdb_sack(self.base)
                            selector = dnf.selector.Selector(sack)
                            selector.set(provides=str(require))
                            goal = dnf.goal.Goal(sack)
                            goal.protect_running_kernel = self.base.conf.protect_running_kernel
                            goal.install(select=selector, optional=False)
                            solved = goal.run()
                            # only the @system repo is in this sack, therefore the goal resolves
                            # only when the rich dep does not require any additional package
                            if solved:
                                continue
                        msg = _("{} has missing requires of {}")
                        output_set.add(msg.format(
                            self.base.output.term.bold(pkg),
                            self.base.output.term.bold(require)))
                for conflict in pkg.conflicts:
                    conflicted = q.filter(provides=[conflict],
                                          name=str(conflict).split()[0])
                    for conflict_pkg in conflicted:
                        msg = '{} has installed conflict "{}": {}'
                        output_set.add(msg.format(
                            self.base.output.term.bold(pkg),
                            self.base.output.term.bold(conflict),
                            self.base.output.term.bold(conflict_pkg)))

        if self.opts.check_types.intersection({'all', 'duplicates'}):
            installonly = self.base._get_installonly_query(q)
            dups = q.duplicated().difference(installonly)._name_dict()
            for name, pkgs in dups.items():
                pkgs.sort()
                for dup in pkgs[1:]:
                    msg = _("{} is a duplicate with {}").format(
                        self.base.output.term.bold(pkgs[0]),
                        self.base.output.term.bold(dup))
                    output_set.add(msg)

        if self.opts.check_types.intersection({'all', 'obsoleted'}):
            for pkg in q:
                for obsolete in pkg.obsoletes:
                    obsoleted = q.filter(provides=[obsolete],
                                         name=str(obsolete).split()[0])
                    if len(obsoleted):
                        msg = _("{} is obsoleted by {}").format(
                            self.base.output.term.bold(obsoleted[0]),
                            self.base.output.term.bold(pkg))
                        output_set.add(msg)

        if self.opts.check_types.intersection({'all', 'provides'}):
            for pkg in q:
                for provide in pkg.provides:
                    if pkg not in q.filter(provides=[provide]):
                        msg = _("{} provides {} but it cannot be found")
                        output_set.add(msg.format(
                            self.base.output.term.bold(pkg),
                            self.base.output.term.bold(provide)))

        for msg in sorted(output_set):
            print(msg)

        if output_set:
            raise dnf.exceptions.Error(
                'Check discovered {} problem(s)'.format(len(output_set)))
site-packages/dnf/cli/commands/module.py000064400000040731147511334650014260 0ustar00# supplies the 'module' command.
#
# Copyright (C) 2014-2017  Red Hat, Inc.
#
# This copyrighted material is made available to anyone wishing to use,
# modify, copy, or redistribute it subject to the terms and conditions of
# the GNU General Public License v.2, or (at your option) any later version.
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY expressed or implied, including the implied warranties of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General
# Public License for more details.  You should have received a copy of the
# GNU General Public License along with this program; if not, write to the
# Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
# 02110-1301, USA.  Any Red Hat trademarks that are incorporated in the
# source code or documentation are not subject to the GNU General Public
# License and may only be used or replicated with the express permission of
# Red Hat, Inc.
#

from __future__ import print_function

from dnf.cli import commands, CliError
from dnf.i18n import _
from dnf.module.exceptions import NoModuleException
from dnf.util import logger
import dnf.util

import sys
import os

import hawkey
import libdnf
import dnf.module.module_base
import dnf.exceptions


class ModuleCommand(commands.Command):
    class SubCommand(commands.Command):

        def __init__(self, cli):
            super(ModuleCommand.SubCommand, self).__init__(cli)
            self.module_base = dnf.module.module_base.ModuleBase(self.base)

        def _get_modules_from_name_stream_specs(self):
            modules_from_specs = set()
            for module_spec in self.opts.module_spec:
                __, nsvcap = self.module_base._get_modules(module_spec)
                # When there is no match, the problem was already reported by module_base.remove()
                if nsvcap is None:
                    continue
                name = nsvcap.name if nsvcap.name else ""
                stream = nsvcap.stream if nsvcap.stream else ""
                if (nsvcap.version and nsvcap.version != -1) or nsvcap.context:
                    logger.info(_("Only module name, stream, architecture or profile is used. "
                                  "Ignoring unneeded information in argument: '{}'").format(
                        module_spec))
                arch = nsvcap.arch if nsvcap.arch else ""
                modules = self.base._moduleContainer.query(name, stream, "", "", arch)
                modules_from_specs.update(modules)
            return modules_from_specs

        def _get_module_artifact_names(self, use_modules, skip_modules):
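            # Collect artifact NEVRA strings from the active modules in use_modules
            # (excluding any module listed in skip_modules) and derive the plain
            # package names from those artifacts.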
            artifacts = set()
            pkg_names = set()
            for module in use_modules:
                if module not in skip_modules:
                    if self.base._moduleContainer.isModuleActive(module):
                        artifacts.update(module.getArtifacts())
            for artifact in artifacts:
                subj = hawkey.Subject(artifact)
                for nevra_obj in subj.get_nevra_possibilities(
                        forms=[hawkey.FORM_NEVRA]):
                    if nevra_obj.name:
                        pkg_names.add(nevra_obj.name)
            return pkg_names, artifacts

    class ListSubCommand(SubCommand):

        aliases = ('list',)
        summary = _('list all module streams, profiles and states')

        def configure(self):
            demands = self.cli.demands
            demands.available_repos = True
            demands.sack_activation = True

        def run_on_module(self):
            mods = self.module_base

            if self.opts.enabled:
                output = mods._get_brief_description(
                    self.opts.module_spec, libdnf.module.ModulePackageContainer.ModuleState_ENABLED)
            elif self.opts.disabled:
                output = mods._get_brief_description(
                    self.opts.module_spec,
                    libdnf.module.ModulePackageContainer.ModuleState_DISABLED)
            elif self.opts.installed:
                output = mods._get_brief_description(
                    self.opts.module_spec,
                    libdnf.module.ModulePackageContainer.ModuleState_INSTALLED)
            else:
                output = mods._get_brief_description(
                    self.opts.module_spec, libdnf.module.ModulePackageContainer.ModuleState_UNKNOWN)
            if output:
                print(output)
                return
            if self.opts.module_spec:
                msg = _('No matching Modules to list')
                raise dnf.exceptions.Error(msg)

    class InfoSubCommand(SubCommand):

        aliases = ('info',)
        summary = _('print detailed information about a module')

        def configure(self):
            demands = self.cli.demands
            demands.available_repos = True
            demands.sack_activation = True

        def run_on_module(self):
            if self.opts.verbose:
                output = self.module_base._get_full_info(self.opts.module_spec)
            elif self.opts.profile:
                output = self.module_base._get_info_profiles(self.opts.module_spec)
            else:
                output = self.module_base._get_info(self.opts.module_spec)
            if output:
                print(output)
            else:
                raise dnf.exceptions.Error(_('No matching Modules to list'))

    class EnableSubCommand(SubCommand):

        aliases = ('enable',)
        summary = _('enable a module stream')

        def configure(self):
            demands = self.cli.demands
            demands.available_repos = True
            demands.sack_activation = True
            demands.resolving = True
            demands.root_user = True

        def run_on_module(self):
            try:
                self.module_base.enable(self.opts.module_spec)
            except dnf.exceptions.MarkingErrors as e:
                if self.base.conf.strict:
                    if e.no_match_group_specs or e.error_group_specs:
                        raise e
                    if e.module_depsolv_errors and e.module_depsolv_errors[1] != \
                            libdnf.module.ModulePackageContainer.ModuleErrorType_ERROR_IN_DEFAULTS:
                        raise e
                logger.error(str(e))

    class DisableSubCommand(SubCommand):

        aliases = ('disable',)
        summary = _('disable a module with all its streams')

        def configure(self):
            demands = self.cli.demands
            demands.available_repos = True
            demands.sack_activation = True
            demands.resolving = True
            demands.root_user = True

        def run_on_module(self):
            try:
                self.module_base.disable(self.opts.module_spec)
            except dnf.exceptions.MarkingErrors as e:
                if self.base.conf.strict:
                    if e.no_match_group_specs or e.error_group_specs:
                        raise e
                    if e.module_depsolv_errors and e.module_depsolv_errors[1] != \
                            libdnf.module.ModulePackageContainer.ModuleErrorType_ERROR_IN_DEFAULTS:
                        raise e
                logger.error(str(e))

    class ResetSubCommand(SubCommand):

        aliases = ('reset',)
        summary = _('reset a module')

        def configure(self):
            demands = self.cli.demands
            demands.available_repos = True
            demands.sack_activation = True
            demands.resolving = True
            demands.root_user = True

        def run_on_module(self):
            try:
                self.module_base.reset(self.opts.module_spec)
            except dnf.exceptions.MarkingErrors as e:
                if self.base.conf.strict:
                    if e.no_match_group_specs:
                        raise e
                logger.error(str(e))

    class InstallSubCommand(SubCommand):

        aliases = ('install',)
        summary = _('install a module profile including its packages')

        def configure(self):
            demands = self.cli.demands
            demands.available_repos = True
            demands.sack_activation = True
            demands.resolving = True
            demands.root_user = True

        def run_on_module(self):
            try:
                self.module_base.install(self.opts.module_spec, self.base.conf.strict)
            except dnf.exceptions.MarkingErrors as e:
                if self.base.conf.strict:
                    if e.no_match_group_specs or e.error_group_specs:
                        raise e
                logger.error(str(e))

    class UpdateSubCommand(SubCommand):

        aliases = ('update',)
        summary = _('update packages associated with an active stream')

        def configure(self):
            demands = self.cli.demands
            demands.available_repos = True
            demands.sack_activation = True
            demands.resolving = True
            demands.root_user = True

        def run_on_module(self):
            module_specs = self.module_base.upgrade(self.opts.module_spec)
            if module_specs:
                raise NoModuleException(", ".join(module_specs))

    class RemoveSubCommand(SubCommand):

        aliases = ('remove', 'erase',)
        summary = _('remove installed module profiles and their packages')

        def configure(self):
            demands = self.cli.demands
            demands.allow_erasing = True
            demands.available_repos = True
            demands.fresh_metadata = False
            demands.resolving = True
            demands.root_user = True
            demands.sack_activation = True

        def run_on_module(self):
            skipped_groups = self.module_base.remove(self.opts.module_spec)
            if self.opts.all:
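                # With --all, remove every installed package that belongs to the given
                # modules, but keep packages whose name is also provided by a module
                # that is not being removed.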
                modules_from_specs = self._get_modules_from_name_stream_specs()
                remove_names_from_spec, __ = self._get_module_artifact_names(
                    modules_from_specs, set())
                keep_names, __ = self._get_module_artifact_names(
                    self.base._moduleContainer.getModulePackages(), modules_from_specs)
                remove_query = self.base.sack.query().installed().filterm(
                    name=remove_names_from_spec)
                keep_query = self.base.sack.query().installed().filterm(name=keep_names)
                for pkg in remove_query:
                    if pkg in keep_query:
                        msg = _("Package {} belongs to multiple modules, skipping").format(pkg)
                        logger.info(msg)
                    else:
                        self.base.goal.erase(
                            pkg, clean_deps=self.base.conf.clean_requirements_on_remove)
            if not skipped_groups:
                return

            logger.error(dnf.exceptions.MarkingErrors(no_match_group_specs=skipped_groups))

    class SwitchToSubCommand(SubCommand):

        aliases = ('switch-to',)
        summary = _('switch a module to a stream and distrosync rpm packages')

        def configure(self):
            demands = self.cli.demands
            demands.available_repos = True
            demands.sack_activation = True
            demands.resolving = True
            demands.root_user = True
            self.base.conf.module_stream_switch = True

        def run_on_module(self):
            try:
                self.module_base.switch_to(self.opts.module_spec, strict=self.base.conf.strict)
            except dnf.exceptions.MarkingErrors as e:
                if self.base.conf.strict:
                    if e.no_match_group_specs or e.error_group_specs:
                        raise e
                logger.error(str(e))

    class ProvidesSubCommand(SubCommand):

        aliases = ("provides", )
        summary = _('list modular packages')

        def configure(self):
            demands = self.cli.demands
            demands.available_repos = True
            demands.sack_activation = True

        def run_on_module(self):
            output = self.module_base._what_provides(self.opts.module_spec)
            if output:
                print(output)

    class RepoquerySubCommand(SubCommand):

        aliases = ("repoquery", )
        summary = _('list packages belonging to a module')

        def configure(self):
            demands = self.cli.demands
            demands.available_repos = True
            demands.sack_activation = True

        def run_on_module(self):
            modules_from_specs = set()
            for module_spec in self.opts.module_spec:
                modules, __ = self.module_base._get_modules(module_spec)
                modules_from_specs.update(modules)
            names_from_spec, spec_artifacts = self._get_module_artifact_names(
                modules_from_specs, set())
            package_strings = set()
            if self.opts.available or not self.opts.installed:
                query = self.base.sack.query().available().filterm(nevra_strict=spec_artifacts)
                for pkg in query:
                    package_strings.add(str(pkg))
            if self.opts.installed:
                query = self.base.sack.query().installed().filterm(name=names_from_spec)
                for pkg in query:
                    package_strings.add(str(pkg))

            output = "\n".join(sorted(package_strings))
            print(output)


    SUBCMDS = {ListSubCommand, InfoSubCommand, EnableSubCommand,
               DisableSubCommand, ResetSubCommand, InstallSubCommand, UpdateSubCommand,
               RemoveSubCommand, SwitchToSubCommand, ProvidesSubCommand, RepoquerySubCommand}

    SUBCMDS_NOT_REQUIRED_ARG = {ListSubCommand}

    aliases = ("module",)
    summary = _("Interact with Modules.")

    def __init__(self, cli):
        super(ModuleCommand, self).__init__(cli)
        subcmd_objs = (subcmd(cli) for subcmd in self.SUBCMDS)
        self.subcmd = None
        self._subcmd_name2obj = {
            alias: subcmd for subcmd in subcmd_objs for alias in subcmd.aliases}

    def set_argparser(self, parser):
        narrows = parser.add_mutually_exclusive_group()
        narrows.add_argument('--enabled', dest='enabled',
                             action='store_true',
                             help=_("show only enabled modules"))
        narrows.add_argument('--disabled', dest='disabled',
                             action='store_true',
                             help=_("show only disabled modules"))
        narrows.add_argument('--installed', dest='installed',
                             action='store_true',
                             help=_("show only installed modules or packages"))
        narrows.add_argument('--profile', dest='profile',
                             action='store_true',
                             help=_("show profile content"))
        parser.add_argument('--available', dest='available', action='store_true',
                            help=_("show only available packages"))
        narrows.add_argument('--all', dest='all',
                             action='store_true',
                             help=_("remove all modular packages"))
        subcommand_choices = []
        subcommand_help = []
        for subcmd in sorted(self.SUBCMDS, key=lambda x: x.aliases[0]):
            subcommand_choices.append(subcmd.aliases[0])
            subcommand_help.append('{}: {}'.format(subcmd.aliases[0], subcmd.summary or ''))
        parser.add_argument('subcmd', nargs=1, choices=subcommand_choices,
                            metavar='<modular command>',
                            help='\n'.join(subcommand_help))
        parser.add_argument('module_spec', metavar='module-spec', nargs='*',
                            help=_("Module specification"))
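    # Illustrative invocations (not part of the original source; module names,
    # streams and profiles are placeholders):
    #   dnf module list --enabled
    #   dnf module info nodejs
    #   dnf module enable nodejs:12
    #   dnf module install nodejs:12/development
    #   dnf module reset nodejs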

    def configure(self):
        try:
            self.subcmd = self._subcmd_name2obj[self.opts.subcmd[0]]
        except (CliError, KeyError):
            self.cli.optparser.print_usage()
            raise CliError
        self.subcmd.opts = self.opts
        self.subcmd.configure()

    def run(self):
        self.check_required_argument()
        self.subcmd.run_on_module()

    def check_required_argument(self):
        not_required_argument = [alias
                                 for subcmd in self.SUBCMDS_NOT_REQUIRED_ARG
                                 for alias in subcmd.aliases]
        if self.opts.subcmd[0] not in not_required_argument:
            if not self.opts.module_spec:
                raise CliError(
                    _("{} {} {}: too few arguments").format(dnf.util.MAIN_PROG,
                                                            self.opts.command,
                                                            self.opts.subcmd[0]))
site-packages/dnf/cli/commands/autoremove.py000064400000005746147511334650015170 0ustar00# autoremove.py
# Autoremove CLI command.
#
# Copyright (C) 2014-2016 Red Hat, Inc.
#
# This copyrighted material is made available to anyone wishing to use,
# modify, copy, or redistribute it subject to the terms and conditions of
# the GNU General Public License v.2, or (at your option) any later version.
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY expressed or implied, including the implied warranties of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General
# Public License for more details.  You should have received a copy of the
# GNU General Public License along with this program; if not, write to the
# Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
# 02110-1301, USA.  Any Red Hat trademarks that are incorporated in the
# source code or documentation are not subject to the GNU General Public
# License and may only be used or replicated with the express permission of
# Red Hat, Inc.
#

from __future__ import absolute_import
from __future__ import unicode_literals
from dnf.cli import commands
from dnf.cli.option_parser import OptionParser
from dnf.i18n import _

import dnf.exceptions
import hawkey
import logging

logger = logging.getLogger("dnf")


class AutoremoveCommand(commands.Command):

    nevra_forms = {'autoremove-n': hawkey.FORM_NAME,
                   'autoremove-na': hawkey.FORM_NA,
                   'autoremove-nevra': hawkey.FORM_NEVRA}

    aliases = ('autoremove',) + tuple(nevra_forms.keys())
    summary = _('remove all unneeded packages that were originally installed '
                'as dependencies')

    @staticmethod
    def set_argparser(parser):
        parser.add_argument('packages', nargs='*', help=_('Package to remove'),
                            action=OptionParser.ParseSpecGroupFileCallback,
                            metavar=_('PACKAGE'))
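    # Illustrative invocations (not part of the original source; "foo" is a
    # placeholder package name):
    #   dnf autoremove        # remove all unneeded packages originally installed as dependencies
    #   dnf autoremove foo    # remove "foo" together with its now-unneeded dependencies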

    def configure(self):
        demands = self.cli.demands
        demands.resolving = True
        demands.root_user = True
        demands.sack_activation = True

        if any([self.opts.grp_specs, self.opts.pkg_specs, self.opts.filenames]):
            self.base.conf.clean_requirements_on_remove = True
            demands.allow_erasing = True
            # disable all available repos so that the whole dependency tree is removed,
            # instead of replacing removable packages with available ones
            demands.available_repos = False
        else:
            demands.available_repos = True
            demands.fresh_metadata = False

    def run(self):
        if any([self.opts.grp_specs, self.opts.pkg_specs, self.opts.filenames]):
            forms = []
            if self.opts.command in self.nevra_forms:
                forms = [self.nevra_forms[self.opts.command]]

            self.base.autoremove(forms,
                                 self.opts.pkg_specs,
                                 self.opts.grp_specs,
                                 self.opts.filenames)
        else:
            self.base.autoremove()
site-packages/dnf/cli/commands/upgrade.py000064400000011176147511334650014423 0ustar00# upgrade.py
# Upgrade CLI command.
#
# Copyright (C) 2014-2016 Red Hat, Inc.
#
# This copyrighted material is made available to anyone wishing to use,
# modify, copy, or redistribute it subject to the terms and conditions of
# the GNU General Public License v.2, or (at your option) any later version.
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY expressed or implied, including the implied warranties of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General
# Public License for more details.  You should have received a copy of the
# GNU General Public License along with this program; if not, write to the
# Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
# 02110-1301, USA.  Any Red Hat trademarks that are incorporated in the
# source code or documentation are not subject to the GNU General Public
# License and may only be used or replicated with the express permission of
# Red Hat, Inc.
#

from __future__ import absolute_import
from __future__ import unicode_literals

import logging

import dnf.exceptions
import dnf.base
from dnf.cli import commands
from dnf.cli.option_parser import OptionParser
from dnf.i18n import _

logger = logging.getLogger('dnf')


class UpgradeCommand(commands.Command):
    """A class containing methods needed by the cli to execute the
    update command.
    """
    aliases = ('upgrade', 'update', 'upgrade-to', 'update-to', 'localupdate', 'up')
    summary = _('upgrade a package or packages on your system')

    @staticmethod
    def set_argparser(parser):
        parser.add_argument('packages', nargs='*', help=_('Package to upgrade'),
                            action=OptionParser.ParseSpecGroupFileCallback,
                            metavar=_('PACKAGE'))
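    # Illustrative invocations (not part of the original source; package, group and
    # file names are placeholders):
    #   dnf upgrade                         # upgrade every installed package
    #   dnf upgrade foo                     # upgrade the "foo" package
    #   dnf upgrade @core                   # upgrade a package group or module
    #   dnf upgrade ./foo-1.0-2.x86_64.rpm  # upgrade from a local rpm file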

    def configure(self):
        """Verify that conditions are met so that this command can run.

        These include that there are enabled repositories with gpg
        keys, and that this command is being run by the root user.
        """
        demands = self.cli.demands
        demands.sack_activation = True
        demands.available_repos = True
        demands.resolving = True
        demands.root_user = True
        commands._checkGPGKey(self.base, self.cli)
        if not self.opts.filenames:
            commands._checkEnabledRepo(self.base)
        self.upgrade_minimal = None
        self.all_security = None
        self.skipped_grp_specs = None

    def run(self):
        cmp_type = "eq" if self.upgrade_minimal else "gte"
        self.cli._populate_update_security_filter(self.opts, cmp_type=cmp_type,
                                                  all=self.all_security)

        if self.opts.filenames or self.opts.pkg_specs or self.opts.grp_specs:
            result = False
            result |= self._update_modules()
            result |= self._update_files()
            result |= self._update_packages()
            result |= self._update_groups()

            if result:
                return
        else:
            self.base.upgrade_all()
            return

        raise dnf.exceptions.Error(_('No packages marked for upgrade.'))

    def _update_modules(self):
        group_specs_num = len(self.opts.grp_specs)
        if dnf.base.WITH_MODULES:
            module_base = dnf.module.module_base.ModuleBase(self.base)
            self.skipped_grp_specs = module_base.upgrade(self.opts.grp_specs)
        else:
            self.skipped_grp_specs = self.opts.grp_specs

        return len(self.skipped_grp_specs) != group_specs_num

    def _update_files(self):
        success = False
        if self.opts.filenames:
            for pkg in self.base.add_remote_rpms(self.opts.filenames, strict=False,
                                                 progress=self.base.output.progress):
                try:
                    self.base.package_upgrade(pkg)
                    success = True
                except dnf.exceptions.MarkingError as e:
                    logger.info(_('No match for argument: %s'),
                                self.base.output.term.bold(pkg.location))
        return success

    def _update_packages(self):
        success = False
        for pkg_spec in self.opts.pkg_specs:
            try:
                self.base.upgrade(pkg_spec)
                success = True
            except dnf.exceptions.MarkingError as e:
                logger.info(_('No match for argument: %s'),
                            self.base.output.term.bold(pkg_spec))
        return success

    def _update_groups(self):
        if self.skipped_grp_specs:
            self.base.env_group_upgrade(self.skipped_grp_specs)
            return True
        return False
site-packages/dnf/cli/commands/history.py000064400000043045147511334650014475 0ustar00# Copyright 2006 Duke University
# Copyright (C) 2012-2016 Red Hat, Inc.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU Library General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.

from __future__ import absolute_import
from __future__ import print_function
from __future__ import unicode_literals

import libdnf
import hawkey

from dnf.i18n import _, ucd
from dnf.cli import commands
from dnf.transaction_sr import TransactionReplay, serialize_transaction

import dnf.cli
import dnf.exceptions
import dnf.transaction
import dnf.util

import json
import logging
import os


logger = logging.getLogger('dnf')


class HistoryCommand(commands.Command):
    """A class containing methods needed by the cli to execute the
    history command.
    """

    aliases = ('history', 'hist')
    summary = _('display, or use, the transaction history')

    _CMDS = ['list', 'info', 'redo', 'replay', 'rollback', 'store', 'undo', 'userinstalled']

    def __init__(self, *args, **kw):
        super(HistoryCommand, self).__init__(*args, **kw)

        self._require_one_transaction_id = False

    @staticmethod
    def set_argparser(parser):
        parser.add_argument('transactions_action', nargs='?', metavar="COMMAND",
                            help="Available commands: {} (default), {}".format(
                                HistoryCommand._CMDS[0],
                                ", ".join(HistoryCommand._CMDS[1:])))
        parser.add_argument('--reverse', action='store_true',
                            help="display history list output reversed")
        parser.add_argument("-o", "--output", default=None,
                            help=_("For the store command, file path to store the transaction to"))
        parser.add_argument("--ignore-installed", action="store_true",
                            help=_("For the replay command, don't check for installed packages matching "
                            "those in transaction"))
        parser.add_argument("--ignore-extras", action="store_true",
                            help=_("For the replay command, don't check for extra packages pulled "
                            "into the transaction"))
        parser.add_argument("--skip-unavailable", action="store_true",
                            help=_("For the replay command, skip packages that are not available or have "
                            "missing dependencies"))
        parser.add_argument('transactions', nargs='*', metavar="TRANSACTION",
                            help="For commands working with history transactions, "
                                 "Transaction ID (<number>, 'last' or 'last-<number>' "
                                 "for one transaction, <transaction-id>..<transaction-id> "
                                 "for a range)")
        parser.add_argument('transaction_filename', nargs='?', metavar="TRANSACTION_FILE",
                            help="For the replay command, path to the stored "
                                 "transaction file to replay")

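    # Illustrative invocations (not part of the original source; IDs and file names
    # are placeholders):
    #   dnf history                          # same as "dnf history list"
    #   dnf history info 42                  # details of transaction 42
    #   dnf history undo last                # revert the most recent transaction
    #   dnf history store -o trans.json 42   # serialize transaction 42 to a file
    #   dnf history replay trans.json        # replay a stored transaction file
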
    def configure(self):
        if not self.opts.transactions_action:
            # no positional argument given
            self.opts.transactions_action = self._CMDS[0]
        elif self.opts.transactions_action not in self._CMDS:
            # first positional argument is not a command
            self.opts.transactions.insert(0, self.opts.transactions_action)
            self.opts.transactions_action = self._CMDS[0]

        self._require_one_transaction_id_msg = _("Found more than one transaction ID.\n"
                                                 "'{}' requires one transaction ID or package name."
                                                 ).format(self.opts.transactions_action)

        demands = self.cli.demands
        if self.opts.transactions_action == 'replay':
            if not self.opts.transactions:
                raise dnf.cli.CliError(_('No transaction file name given.'))
            if len(self.opts.transactions) > 1:
                raise dnf.cli.CliError(_('More than one argument given as transaction file name.'))

            # in case of replay, copy the file name over to its appropriate variable
            # (the arg parser can't distinguish here)
            self.opts.transaction_filename = os.path.abspath(self.opts.transactions[0])
            self.opts.transactions = []

            demands.available_repos = True
            demands.resolving = True
            demands.root_user = True

            # Override configuration options that affect how the transaction is resolved
            self.base.conf.clean_requirements_on_remove = False
            self.base.conf.install_weak_deps = False

            dnf.cli.commands._checkGPGKey(self.base, self.cli)
        elif self.opts.transactions_action == 'store':
            self._require_one_transaction_id = True
            if not self.opts.transactions:
                raise dnf.cli.CliError(_('No transaction ID or package name given.'))
        elif self.opts.transactions_action in ['redo', 'undo', 'rollback']:
            demands.available_repos = True
            demands.resolving = True
            demands.root_user = True

            self._require_one_transaction_id = True
            if not self.opts.transactions:
                msg = _('No transaction ID or package name given.')
                logger.critical(msg)
                raise dnf.cli.CliError(msg)
            elif len(self.opts.transactions) > 1:
                logger.critical(self._require_one_transaction_id_msg)
                raise dnf.cli.CliError(self._require_one_transaction_id_msg)
            demands.available_repos = True
            dnf.cli.commands._checkGPGKey(self.base, self.cli)
        else:
            demands.fresh_metadata = False
        demands.sack_activation = True
        if self.base.history.path != ":memory:" and not os.access(self.base.history.path, os.R_OK):
            msg = _("You don't have access to the history DB: %s" % self.base.history.path)
            logger.critical(msg)
            raise dnf.cli.CliError(msg)

    def get_error_output(self, error):
        """Get suggestions for resolving the given error."""
        if isinstance(error, dnf.exceptions.TransactionCheckError):
            if self.opts.transactions_action == 'undo':
                id_, = self.opts.transactions
                return (_('Cannot undo transaction %s, doing so would result '
                          'in an inconsistent package database.') % id_,)
            elif self.opts.transactions_action == 'rollback':
                id_, = (self.opts.transactions if self.opts.transactions[0] != 'force'
                        else self.opts.transactions[1:])
                return (_('Cannot rollback transaction %s, doing so would '
                          'result in an inconsistent package database.') % id_,)

        return dnf.cli.commands.Command.get_error_output(self, error)

    def _hcmd_redo(self, extcmds):
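        # 'redo' serializes the stored transaction and replays it as-is through
        # TransactionReplay (no action inversion, unlike undo/rollback)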
        old = self._history_get_transaction(extcmds)
        data = serialize_transaction(old)
        self.replay = TransactionReplay(
            self.base,
            data=data,
            ignore_installed=True,
            ignore_extras=True,
            skip_unavailable=self.opts.skip_unavailable
        )
        self.replay.run()

    def _history_get_transactions(self, extcmds):
        if not extcmds:
            raise dnf.cli.CliError(_('No transaction ID given'))

        old = self.base.history.old(extcmds)
        if not old:
            raise dnf.cli.CliError(_('Transaction ID "{0}" not found.').format(extcmds[0]))
        return old

    def _history_get_transaction(self, extcmds):
        old = self._history_get_transactions(extcmds)
        if len(old) > 1:
            raise dnf.cli.CliError(_('Found more than one transaction ID!'))
        return old[0]

    def _hcmd_undo(self, extcmds):
        old = self._history_get_transaction(extcmds)
        self._revert_transaction(old)

    def _hcmd_rollback(self, extcmds):
        old = self._history_get_transaction(extcmds)
        last = self.base.history.last()

        merged_trans = None
        if old.tid != last.tid:
            # when the rollback target is already the last transaction (the current
            # system state) there is nothing to revert; skipping also avoids calling
            # history.old([]) with an empty range, which would return all transactions
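            # every transaction made after the target is merged into a single wrapper
            # below so that the whole span can be reverted in one replay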
            for trans in self.base.history.old(list(range(old.tid + 1, last.tid + 1))):
                if trans.altered_lt_rpmdb:
                    logger.warning(_('Transaction history is incomplete, before %u.'), trans.tid)
                elif trans.altered_gt_rpmdb:
                    logger.warning(_('Transaction history is incomplete, after %u.'), trans.tid)

                if merged_trans is None:
                    merged_trans = dnf.db.history.MergedTransactionWrapper(trans)
                else:
                    merged_trans.merge(trans)

        self._revert_transaction(merged_trans)

    def _revert_transaction(self, trans):
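        # every recorded action is mapped to its inverse so that replaying the
        # serialized transaction data reverts the original transaction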
        action_map = {
            "Install": "Removed",
            "Removed": "Install",
            "Upgrade": "Downgraded",
            "Upgraded": "Downgrade",
            "Downgrade": "Upgraded",
            "Downgraded": "Upgrade",
            "Reinstalled": "Reinstall",
            "Reinstall": "Reinstalled",
            "Obsoleted": "Install",
            "Obsolete": "Obsoleted",
            "Reason Change": "Reason Change",
        }

        data = serialize_transaction(trans)

        # revert actions in the serialized transaction data to perform rollback/undo
        for content_type in ("rpms", "groups", "environments"):
            for ti in data.get(content_type, []):
                ti["action"] = action_map[ti["action"]]

                if ti["action"] == "Install" and ti.get("reason", None) == "clean":
                    ti["reason"] = "dependency"

                if ti["action"] == "Reason Change" and "nevra" in ti:
                    subj = hawkey.Subject(ti["nevra"])
                    nevra = subj.get_nevra_possibilities(forms=[hawkey.FORM_NEVRA])[0]
                    reason = self.output.history.swdb.resolveRPMTransactionItemReason(
                        nevra.name,
                        nevra.arch,
                        trans.tids()[0] - 1
                    )
                    ti["reason"] = libdnf.transaction.TransactionItemReasonToString(reason)

                if ti.get("repo_id") == hawkey.SYSTEM_REPO_NAME:
                    # erase repo_id, because it's not possible to perform forward actions from the @System repo
                    ti["repo_id"] = None

        self.replay = TransactionReplay(
            self.base,
            data=data,
            ignore_installed=True,
            ignore_extras=True,
            skip_unavailable=self.opts.skip_unavailable
        )
        self.replay.run()

    def _hcmd_userinstalled(self):
        """Execute history userinstalled command."""
        pkgs = tuple(self.base.iter_userinstalled())
        n_listed = self.output.listPkgs(pkgs, 'Packages installed by user', 'nevra')
        if n_listed == 0:
            raise dnf.cli.CliError(_('No packages to list'))

    def _args2transaction_ids(self):
        """Convert commandline arguments to transaction ids"""

        def str2transaction_id(s):
            if s == 'last':
                s = '0'
            elif s.startswith('last-'):
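                # slice off only 'last' so the '-' is kept: 'last-N' becomes -N,
                # which is resolved relative to the newest transaction ID below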
                s = s[4:]
            transaction_id = int(s)
            if transaction_id <= 0:
                transaction_id += self.output.history.last().tid
            return transaction_id

        tids = set()
        merged_tids = set()
        for t in self.opts.transactions:
            if '..' in t:
                try:
                    begin_transaction_id, end_transaction_id = t.split('..', 2)
                except ValueError:
                    logger.critical(
                        _("Invalid transaction ID range definition '{}'.\n"
                          "Use '<transaction-id>..<transaction-id>'."
                          ).format(t))
                    raise dnf.cli.CliError
                cant_convert_msg = _("Can't convert '{}' to transaction ID.\n"
                                     "Use '<number>', 'last', 'last-<number>'.")
                try:
                    begin_transaction_id = str2transaction_id(begin_transaction_id)
                except ValueError:
                    logger.critical(cant_convert_msg.format(begin_transaction_id))
                    raise dnf.cli.CliError
                try:
                    end_transaction_id = str2transaction_id(end_transaction_id)
                except ValueError:
                    logger.critical(cant_convert_msg.format(end_transaction_id))
                    raise dnf.cli.CliError
                if self._require_one_transaction_id and begin_transaction_id != end_transaction_id:
                    logger.critical(self._require_one_transaction_id_msg)
                    raise dnf.cli.CliError
                if begin_transaction_id > end_transaction_id:
                    begin_transaction_id, end_transaction_id = \
                        end_transaction_id, begin_transaction_id
                merged_tids.add((begin_transaction_id, end_transaction_id))
                tids.update(range(begin_transaction_id, end_transaction_id + 1))
            else:
                try:
                    tids.add(str2transaction_id(t))
                except ValueError:
                    # not a transaction id, assume it's package name
                    transact_ids_from_pkgname = self.output.history.search([t])
                    if transact_ids_from_pkgname:
                        tids.update(transact_ids_from_pkgname)
                    else:
                        msg = _("No transaction which manipulates package '{}' was found."
                                ).format(t)
                        if self._require_one_transaction_id:
                            logger.critical(msg)
                            raise dnf.cli.CliError
                        else:
                            logger.info(msg)

        return sorted(tids, reverse=True), merged_tids

    def run(self):
        vcmd = self.opts.transactions_action

        if vcmd == 'replay':
            self.replay = TransactionReplay(
                self.base,
                filename=self.opts.transaction_filename,
                ignore_installed = self.opts.ignore_installed,
                ignore_extras = self.opts.ignore_extras,
                skip_unavailable = self.opts.skip_unavailable
            )
            self.replay.run()
        else:
            tids, merged_tids = self._args2transaction_ids()

            if vcmd == 'list' and (tids or not self.opts.transactions):
                self.output.historyListCmd(tids, reverse=self.opts.reverse)
            elif vcmd == 'info' and (tids or not self.opts.transactions):
                self.output.historyInfoCmd(tids, self.opts.transactions, merged_tids)
            elif vcmd == 'undo':
                self._hcmd_undo(tids)
            elif vcmd == 'redo':
                self._hcmd_redo(tids)
            elif vcmd == 'rollback':
                self._hcmd_rollback(tids)
            elif vcmd == 'userinstalled':
                self._hcmd_userinstalled()
            elif vcmd == 'store':
                tid = self._history_get_transaction(tids)
                data = serialize_transaction(tid)
                try:
                    filename = self.opts.output if self.opts.output is not None else "transaction.json"

                    # it is absolutely possible for both assumeyes and assumeno to be True, go figure
                    if (self.base.conf.assumeno or not self.base.conf.assumeyes) and os.path.isfile(filename):
                        msg = _("{} exists, overwrite?").format(filename)
                        if self.base.conf.assumeno or not self.base.output.userconfirm(
                                msg='\n{} [y/N]: '.format(msg), defaultyes_msg='\n{} [Y/n]: '.format(msg)):
                            print(_("Not overwriting {}, exiting.").format(filename))
                            return

                    with open(filename, "w") as f:
                        json.dump(data, f, indent=4, sort_keys=True)
                        f.write("\n")

                    print(_("Transaction saved to {}.").format(filename))

                except OSError as e:
                    raise dnf.cli.CliError(_('Error storing transaction: {}').format(str(e)))

    def run_resolved(self):
        if self.opts.transactions_action not in ("replay", "redo", "rollback", "undo"):
            return

        self.replay.post_transaction()

    def run_transaction(self):
        if self.opts.transactions_action not in ("replay", "redo", "rollback", "undo"):
            return

        warnings = self.replay.get_warnings()
        if warnings:
            logger.log(
                dnf.logging.WARNING,
                _("Warning, the following problems occurred while running a transaction:")
            )
            for w in warnings:
                logger.log(dnf.logging.WARNING, "  " + w)
site-packages/dnf/cli/commands/deplist.py
#
# Copyright (C) 2016 Red Hat, Inc.
#
# This copyrighted material is made available to anyone wishing to use,
# modify, copy, or redistribute it subject to the terms and conditions of
# the GNU General Public License v.2, or (at your option) any later version.
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY expressed or implied, including the implied warranties of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General
# Public License for more details.  You should have received a copy of the
# GNU General Public License along with this program; if not, write to the
# Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
# 02110-1301, USA.  Any Red Hat trademarks that are incorporated in the
# source code or documentation are not subject to the GNU General Public
# License and may only be used or replicated with the express permission of
# Red Hat, Inc.
#

from __future__ import print_function
from __future__ import absolute_import
from __future__ import unicode_literals
from dnf.i18n import _
from dnf.cli.commands.repoquery import RepoQueryCommand


class DeplistCommand(RepoQueryCommand):
    """
    The command is an alias for 'dnf repoquery --deplist'
    """

    aliases = ('deplist',)
    summary = _("[deprecated, use repoquery --deplist] List package's dependencies and what packages provide them")

    def configure(self):
        RepoQueryCommand.configure(self)
        self.opts.deplist = True
site-packages/dnf/cli/commands/remove.py
# remove_command.py
# Remove CLI command.
#
# Copyright (C) 2012-2016 Red Hat, Inc.
#
# This copyrighted material is made available to anyone wishing to use,
# modify, copy, or redistribute it subject to the terms and conditions of
# the GNU General Public License v.2, or (at your option) any later version.
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY expressed or implied, including the implied warranties of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General
# Public License for more details.  You should have received a copy of the
# GNU General Public License along with this program; if not, write to the
# Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
# 02110-1301, USA.  Any Red Hat trademarks that are incorporated in the
# source code or documentation are not subject to the GNU General Public
# License and may only be used or replicated with the express permission of
# Red Hat, Inc.
#

from __future__ import absolute_import
from __future__ import unicode_literals
from dnf.cli import commands
from dnf.i18n import _
from dnf.cli.option_parser import OptionParser
import dnf.base
import argparse
import hawkey
import dnf.exceptions
import logging

logger = logging.getLogger("dnf")


class RemoveCommand(commands.Command):
    """Remove command."""

    nevra_forms = {'remove-n': hawkey.FORM_NAME,
                   'remove-na': hawkey.FORM_NA,
                   'remove-nevra': hawkey.FORM_NEVRA,
                   'erase-n': hawkey.FORM_NAME,
                   'erase-na': hawkey.FORM_NA,
                   'erase-nevra': hawkey.FORM_NEVRA}

    aliases = ('remove', 'erase', 'rm') + tuple(nevra_forms.keys())
    summary = _('remove a package or packages from your system')

    @staticmethod
    def set_argparser(parser):
        mgroup = parser.add_mutually_exclusive_group()
        mgroup.add_argument('--duplicates', action='store_true',
                            dest='duplicated',
                            help=_('remove duplicated packages'))
        mgroup.add_argument('--duplicated', action='store_true',
                            help=argparse.SUPPRESS)
        mgroup.add_argument('--oldinstallonly', action='store_true',
                            help=_(
                                'remove installonly packages over the limit'))
        parser.add_argument('packages', nargs='*', help=_('Package to remove'),
                            action=OptionParser.ParseSpecGroupFileCallback,
                            metavar=_('PACKAGE'))

    def configure(self):
        demands = self.cli.demands
        # disable all available repos to delete whole dependency tree
        # instead of replacing removable package with available packages
        demands.resolving = True
        demands.root_user = True
        demands.sack_activation = True
        if self.opts.duplicated:
            demands.available_repos = True
        elif dnf.base.WITH_MODULES and self.opts.grp_specs:
            demands.available_repos = True
            demands.fresh_metadata = False
            demands.allow_erasing = True
        else:
            demands.allow_erasing = True
            demands.available_repos = False

    def run(self):

        forms = []
        if self.opts.command in self.nevra_forms:
            forms = [self.nevra_forms[self.opts.command]]

        # local pkgs not supported in erase command
        self.opts.pkg_specs += self.opts.filenames
        done = False

        if self.opts.duplicated:
            q = self.base.sack.query()
            instonly = self.base._get_installonly_query(q.installed())
            dups = q.duplicated().difference(instonly)
            if not dups:
                raise dnf.exceptions.Error(_('No duplicated packages found for removal.'))

            for (name, arch), pkgs_list in dups._na_dict().items():
                if len(pkgs_list) < 2:
                    continue
                pkgs_list.sort(reverse=True)
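                # newest copy first: reinstall it and remove the remaining duplicates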
                try:
                    self.base.reinstall(str(pkgs_list[0]))
                except dnf.exceptions.PackagesNotAvailableError:
                    xmsg = ''
                    msg = _('Installed package %s%s not available.')
                    logger.warning(msg, self.base.output.term.bold(str(pkgs_list[0])), xmsg)

                for pkg in pkgs_list[1:]:
                    self.base.package_remove(pkg)
            return

        if self.opts.oldinstallonly:
            q = self.base.sack.query()
            instonly = self.base._get_installonly_query(q.installed()).latest(-1)
            # also remove running kernel from the set
            kernel = self.base.sack.get_running_kernel()
            if kernel is not None:
                running_installonly = instonly.filter(
                    epoch=kernel.epoch, version=kernel.version, release=kernel.release)
                if running_installonly:
                    instonly = instonly.difference(running_installonly)
            if instonly:
                for pkg in instonly:
                    self.base.package_remove(pkg)
            else:
                raise dnf.exceptions.Error(
                    _('No old installonly packages found for removal.'))
            return

        # Remove groups.
        if self.opts.grp_specs and forms:
            for grp_spec in self.opts.grp_specs:
                msg = _('Not a valid form: %s')
                logger.warning(msg, self.base.output.term.bold(grp_spec))
        elif self.opts.grp_specs:
            if dnf.base.WITH_MODULES:
                module_base = dnf.module.module_base.ModuleBase(self.base)
                skipped_grps = module_base.remove(self.opts.grp_specs)
                if len(self.opts.grp_specs) != len(skipped_grps):
                    done = True
            else:
                skipped_grps = self.opts.grp_specs

            if skipped_grps:
                for group in skipped_grps:
                    try:
                        if self.base.env_group_remove([group]):
                            done = True
                    except dnf.exceptions.Error:
                        pass

        for pkg_spec in self.opts.pkg_specs:
            try:
                self.base.remove(pkg_spec, forms=forms)
            except dnf.exceptions.MarkingError as e:
                msg = '{}: {}'.format(e.value, self.base.output.term.bold(pkg_spec))
                logger.info(msg)
            else:
                done = True

        if not done:
            logger.warning(_('No packages marked for removal.'))
site-packages/dnf/cli/commands/clean.py
# clean.py
# Clean CLI command.
#
# Copyright (C) 2014-2016 Red Hat, Inc.
#
# This copyrighted material is made available to anyone wishing to use,
# modify, copy, or redistribute it subject to the terms and conditions of
# the GNU General Public License v.2, or (at your option) any later version.
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY expressed or implied, including the implied warranties of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General
# Public License for more details.  You should have received a copy of the
# GNU General Public License along with this program; if not, write to the
# Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
# 02110-1301, USA.  Any Red Hat trademarks that are incorporated in the
# source code or documentation are not subject to the GNU General Public
# License and may only be used or replicated with the express permission of
# Red Hat, Inc.
#

from __future__ import absolute_import
from __future__ import unicode_literals
from dnf.cli import commands
from dnf.i18n import _, P_
from dnf.yum import misc

import dnf.cli
import dnf.exceptions
import dnf.lock
import dnf.logging
import dnf.repo
import logging
import os
import re
import time

logger = logging.getLogger("dnf")

# Dict mapping cmdline arguments to actual data types to be cleaned up
_CACHE_TYPES = {
    'metadata': ['metadata', 'dbcache', 'expire-cache'],
    'packages': ['packages'],
    'dbcache': ['dbcache'],
    'expire-cache': ['expire-cache'],
    'all': ['metadata', 'packages', 'dbcache'],
}


def _tree(dirpath):
    """Traverse dirpath recursively and yield relative filenames."""
    for root, dirs, files in os.walk(dirpath):
        base = os.path.relpath(root, dirpath)
        for f in files:
            path = os.path.join(base, f)
            yield os.path.normpath(path)


def _filter(files, patterns):
    """Yield those filenames that match any of the patterns."""
    return (f for f in files for p in patterns if re.match(p, f))


def _clean(dirpath, files):
    """Remove the given filenames from dirpath."""
    count = 0
    for f in files:
        path = os.path.join(dirpath, f)
        logger.log(dnf.logging.DDEBUG, _('Removing file %s'), path)
        misc.unlink_f(path)
        count += 1
    return count


def _cached_repos(files):
    """Return the repo IDs that have some cached metadata around."""
    metapat = dnf.repo.CACHE_FILES['metadata']
    matches = (re.match(metapat, f) for f in files)
    return set(m.group('repoid') for m in matches if m)


class CleanCommand(commands.Command):
    """A class containing methods needed by the cli to execute the
    clean command.
    """

    aliases = ('clean',)
    summary = _('remove cached data')

    @staticmethod
    def set_argparser(parser):
        parser.add_argument('type', nargs='+',
                           choices=_CACHE_TYPES.keys(),
                           help=_('Metadata type to clean'))

    def run(self):
        cachedir = self.base.conf.cachedir
        md_lock = dnf.lock.build_metadata_lock(cachedir, True)
        download_lock = dnf.lock.build_download_lock(cachedir, True)
        rpmdb_lock = dnf.lock.build_rpmdb_lock(self.base.conf.persistdir, True)
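        # hold all three locks while cleaning; if another process owns one of them,
        # either wait and retry or give up, depending on exit_on_lock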
        while True:
            try:
                with md_lock, download_lock, rpmdb_lock:
                    types = set(t for c in self.opts.type for t in _CACHE_TYPES[c])
                    files = list(_tree(cachedir))
                    logger.debug(_('Cleaning data: ') + ' '.join(types))

                    if 'expire-cache' in types:
                        expired = _cached_repos(files)
                        self.base._repo_persistor.expired_to_add.update(expired)
                        types.remove('expire-cache')
                        logger.info(_('Cache was expired'))

                    patterns = [dnf.repo.CACHE_FILES[t] for t in types]
                    count = _clean(cachedir, _filter(files, patterns))
                    logger.info(P_('%d file removed', '%d files removed', count) % count)
                    return
            except dnf.exceptions.LockError as e:
                if not self.base.conf.exit_on_lock:
                    msg = _('Waiting for process with pid %d to finish.') % (e.pid)
                    logger.info(msg)
                    time.sleep(3)
                else:
                    raise e
site-packages/dnf/cli/commands/makecache.py
# makecache.py
# Makecache CLI command.
#
# Copyright (C) 2014-2016 Red Hat, Inc.
#
# This copyrighted material is made available to anyone wishing to use,
# modify, copy, or redistribute it subject to the terms and conditions of
# the GNU General Public License v.2, or (at your option) any later version.
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY expressed or implied, including the implied warranties of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General
# Public License for more details.  You should have received a copy of the
# GNU General Public License along with this program; if not, write to the
# Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
# 02110-1301, USA.  Any Red Hat trademarks that are incorporated in the
# source code or documentation are not subject to the GNU General Public
# License and may only be used or replicated with the express permission of
# Red Hat, Inc.
#

from __future__ import absolute_import
from __future__ import unicode_literals
from dnf.cli import commands
from dnf.i18n import _

import argparse
import dnf.cli
import dnf.exceptions
import dnf.util
import logging

logger = logging.getLogger("dnf")


class MakeCacheCommand(commands.Command):
    aliases = ('makecache', 'mc')
    summary = _('generate the metadata cache')

    @staticmethod
    def set_argparser(parser):
        parser.add_argument('--timer', action='store_true', dest="timer_opt")
        # compatibility with dnf < 2.0
        parser.add_argument('timer', nargs='?', choices=['timer'],
                            metavar='timer', help=argparse.SUPPRESS)

    def run(self):
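        # honor both the legacy positional form ('makecache timer') and --timer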
        timer = self.opts.timer is not None or self.opts.timer_opt
        msg = _("Making cache files for all metadata files.")
        logger.debug(msg)
        return self.base.update_cache(timer)
site-packages/dnf/cli/commands/reinstall.py
# reinstall.py
# Reinstall CLI command.
#
# Copyright (C) 2014-2016 Red Hat, Inc.
#
# This copyrighted material is made available to anyone wishing to use,
# modify, copy, or redistribute it subject to the terms and conditions of
# the GNU General Public License v.2, or (at your option) any later version.
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY expressed or implied, including the implied warranties of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General
# Public License for more details.  You should have received a copy of the
# GNU General Public License along with this program; if not, write to the
# Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
# 02110-1301, USA.  Any Red Hat trademarks that are incorporated in the
# source code or documentation are not subject to the GNU General Public
# License and may only be used or replicated with the express permission of
# Red Hat, Inc.
#

from __future__ import absolute_import
from __future__ import unicode_literals
from dnf.cli import commands
from dnf.cli.option_parser import OptionParser
from dnf.i18n import _

import dnf.exceptions
import logging

logger = logging.getLogger('dnf')


class ReinstallCommand(commands.Command):
    """A class containing methods needed by the cli to execute the reinstall command.
    """

    aliases = ('reinstall', 'rei')
    summary = _('reinstall a package')

    @staticmethod
    def set_argparser(parser):
        parser.add_argument('packages', nargs='+', help=_('Package to reinstall'),
                            action=OptionParser.ParseSpecGroupFileCallback,
                            metavar=_('PACKAGE'))

    def configure(self):
        """Verify that conditions are met so that this command can
        run.  These include that the program is being run by the root
        user, that there are enabled repositories with gpg keys, and
        that this command is called with appropriate arguments.
        """
        demands = self.cli.demands
        demands.sack_activation = True
        demands.available_repos = True
        demands.resolving = True
        demands.root_user = True
        commands._checkGPGKey(self.base, self.cli)
        if not self.opts.filenames:
            commands._checkEnabledRepo(self.base)

    def run(self):

        # Reinstall files.
        done = False
        for pkg in self.base.add_remote_rpms(self.opts.filenames, strict=False,
                                             progress=self.base.output.progress):
            try:
                self.base.package_reinstall(pkg)
            except dnf.exceptions.MarkingError:
                logger.info(_('No match for argument: %s'),
                            self.base.output.term.bold(pkg.location))
            else:
                done = True

        # Reinstall packages.
        for pkg_spec in self.opts.pkg_specs + ['@' + x for x in self.opts.grp_specs]:
            try:
                self.base.reinstall(pkg_spec)
            except dnf.exceptions.PackagesNotInstalledError as err:
                for pkg in err.packages:
                    logger.info(_('Package %s available, but not installed.'),
                                self.output.term.bold(pkg.name))
                    break
                logger.info(_('No match for argument: %s'),
                            self.base.output.term.bold(pkg_spec))
            except dnf.exceptions.PackagesNotAvailableError as err:
                for pkg in err.packages:
                    xmsg = ''
                    pkgrepo = self.base.history.repo(pkg)
                    if pkgrepo:
                        xmsg = _(' (from %s)') % pkgrepo
                    msg = _('Installed package %s%s not available.')
                    logger.info(msg, self.base.output.term.bold(pkg),
                                xmsg)
            except dnf.exceptions.MarkingError:
                assert False, 'Only the above marking errors are expected.'
            else:
                done = True

        if not done:
            raise dnf.exceptions.Error(_('No packages marked for reinstall.'))
site-packages/dnf/cli/commands/updateinfo.py
# updateinfo.py
# UpdateInfo CLI command.
#
# Copyright (C) 2014-2016 Red Hat, Inc.
#
# This copyrighted material is made available to anyone wishing to use,
# modify, copy, or redistribute it subject to the terms and conditions of
# the GNU General Public License v.2, or (at your option) any later version.
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY expressed or implied, including the implied warranties of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General
# Public License for more details.  You should have received a copy of the
# GNU General Public License along with this program; if not, write to the
# Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
# 02110-1301, USA.  Any Red Hat trademarks that are incorporated in the
# source code or documentation are not subject to the GNU General Public
# License and may only be used or replicated with the express permission of
# Red Hat, Inc.
#

"""UpdateInfo CLI command."""
from __future__ import absolute_import
from __future__ import print_function
from __future__ import unicode_literals

import collections
import fnmatch

import hawkey
from dnf.cli import commands
from dnf.cli.option_parser import OptionParser
from dnf.i18n import _, exact_width
from dnf.pycomp import unicode


def _maxlen(iterable):
    """Return maximum length of items in a non-empty iterable."""
    return max(exact_width(item) for item in iterable)


class UpdateInfoCommand(commands.Command):
    """Implementation of the UpdateInfo command."""

    TYPE2LABEL = {hawkey.ADVISORY_BUGFIX: _('bugfix'),
                  hawkey.ADVISORY_ENHANCEMENT: _('enhancement'),
                  hawkey.ADVISORY_SECURITY: _('security'),
                  hawkey.ADVISORY_UNKNOWN: _('unknown'),
                  hawkey.ADVISORY_NEWPACKAGE: _('newpackage')}

    SECURITY2LABEL = {'Critical': _('Critical/Sec.'),
                      'Important': _('Important/Sec.'),
                      'Moderate': _('Moderate/Sec.'),
                      'Low': _('Low/Sec.')}

    direct_commands = {'list-updateinfo'    : 'list',
                       'list-security'      : 'list',
                       'list-sec'           : 'list',
                       'info-updateinfo'    : 'info',
                       'info-security'      : 'info',
                       'info-sec'           : 'info',
                       'summary-updateinfo' : 'summary'}
    aliases = ['updateinfo'] + list(direct_commands.keys())
    summary = _('display advisories about packages')
    availability_default = 'available'
    availabilities = ['installed', 'updates', 'all', availability_default]

    def __init__(self, cli):
        """Initialize the command."""
        super(UpdateInfoCommand, self).__init__(cli)
        self._installed_query = None

    @staticmethod
    def set_argparser(parser):
        availability = parser.add_mutually_exclusive_group()
        availability.add_argument(
            "--available", dest='_availability', const='available', action='store_const',
            help=_("advisories about newer versions of installed packages (default)"))
        availability.add_argument(
            "--installed", dest='_availability', const='installed', action='store_const',
            help=_("advisories about equal and older versions of installed packages"))
        availability.add_argument(
            "--updates", dest='_availability', const='updates', action='store_const',
            help=_("advisories about newer versions of those installed packages "
                   "for which a newer version is available"))
        availability.add_argument(
            "--all", dest='_availability', const='all', action='store_const',
            help=_("advisories about any versions of installed packages"))
        cmds = ['summary', 'list', 'info']
        output_format = parser.add_mutually_exclusive_group()
        output_format.add_argument("--summary", dest='_spec_action', const='summary',
                                   action='store_const',
                                   help=_('show summary of advisories (default)'))
        output_format.add_argument("--list", dest='_spec_action', const='list',
                                   action='store_const',
                                   help=_('show list of advisories'))
        output_format.add_argument("--info", dest='_spec_action', const='info',
                                   action='store_const',
                                   help=_('show info of advisories'))
        parser.add_argument("--with-cve", dest='with_cve', default=False,
                            action='store_true',
                            help=_('show only advisories with CVE reference'))
        parser.add_argument("--with-bz", dest='with_bz', default=False,
                            action='store_true',
                            help=_('show only advisories with bugzilla reference'))
        parser.add_argument('spec', nargs='*', metavar='SPEC',
                            choices=cmds, default=cmds[0],
                            action=OptionParser.PkgNarrowCallback,
                            help=_("Package specification"))

    def configure(self):
        """Do any command-specific configuration based on command arguments."""
        self.cli.demands.available_repos = True
        self.cli.demands.sack_activation = True

        if self.opts.command in self.direct_commands:
            # we were called with direct command
            self.opts.spec_action = self.direct_commands[self.opts.command]
        else:
            if self.opts._spec_action:
                self.opts.spec_action = self.opts._spec_action

        if self.opts._availability:
            self.opts.availability = self.opts._availability
        else:
            # yum compatibility - search for all|available|installed|updates in spec[0]
            if not self.opts.spec or self.opts.spec[0] not in self.availabilities:
                self.opts.availability = self.availability_default
            else:
                self.opts.availability = self.opts.spec.pop(0)

        # filtering by advisory types (security/bugfix/enhancement/newpackage)
        self.opts._advisory_types = set()
        if self.opts.bugfix:
            self.opts._advisory_types.add(hawkey.ADVISORY_BUGFIX)
        if self.opts.enhancement:
            self.opts._advisory_types.add(hawkey.ADVISORY_ENHANCEMENT)
        if self.opts.newpackage:
            self.opts._advisory_types.add(hawkey.ADVISORY_NEWPACKAGE)
        if self.opts.security:
            self.opts._advisory_types.add(hawkey.ADVISORY_SECURITY)

        # yum compatibility - yum accepts types also as positional arguments
        if self.opts.spec:
            spec = self.opts.spec.pop(0)
            if spec == 'bugfix':
                self.opts._advisory_types.add(hawkey.ADVISORY_BUGFIX)
            elif spec == 'enhancement':
                self.opts._advisory_types.add(hawkey.ADVISORY_ENHANCEMENT)
            elif spec in ('security', 'sec'):
                self.opts._advisory_types.add(hawkey.ADVISORY_SECURITY)
            elif spec == 'newpackage':
                self.opts._advisory_types.add(hawkey.ADVISORY_NEWPACKAGE)
            elif spec in ('bugzillas', 'bzs'):
                self.opts.with_bz = True
            elif spec == 'cves':
                self.opts.with_cve = True
            else:
                self.opts.spec.insert(0, spec)

        if self.opts.advisory:
            self.opts.spec.extend(self.opts.advisory)


    def run(self):
        """Execute the command with arguments."""
        if self.opts.availability == 'installed':
            apkg_adv_insts = self.installed_apkg_adv_insts(self.opts.spec)
            description = _('installed')
        elif self.opts.availability == 'updates':
            apkg_adv_insts = self.updating_apkg_adv_insts(self.opts.spec)
            description = _('updates')
        elif self.opts.availability == 'all':
            apkg_adv_insts = self.all_apkg_adv_insts(self.opts.spec)
            description = _('all')
        else:
            apkg_adv_insts = self.available_apkg_adv_insts(self.opts.spec)
            description = _('available')

        if self.opts.spec_action == 'list':
            self.display_list(apkg_adv_insts)
        elif self.opts.spec_action == 'info':
            self.display_info(apkg_adv_insts)
        else:
            self.display_summary(apkg_adv_insts, description)

    def _newer_equal_installed(self, apackage):
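        # True if a package of the same name with an equal or newer EVR is already
        # installed; the installed query is built lazily and reused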
        if self._installed_query is None:
            self._installed_query = self.base.sack.query().installed().apply()
        q = self._installed_query.filter(name=apackage.name, evr__gte=apackage.evr)
        return len(q) > 0

    def _advisory_matcher(self, advisory):
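        # with no filters given every advisory matches; otherwise any single
        # matching filter (type, id glob, severity, bug, CVE, ...) is sufficient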
        if not self.opts._advisory_types \
                and not self.opts.spec \
                and not self.opts.severity \
                and not self.opts.bugzilla \
                and not self.opts.cves \
                and not self.opts.with_cve \
                and not self.opts.with_bz:
            return True
        if advisory.type in self.opts._advisory_types:
            return True
        if any(fnmatch.fnmatchcase(advisory.id, pat) for pat in self.opts.spec):
            return True
        if self.opts.severity and advisory.severity in self.opts.severity:
            return True
        if self.opts.bugzilla and any([advisory.match_bug(bug) for bug in self.opts.bugzilla]):
            return True
        if self.opts.cves and any([advisory.match_cve(cve) for cve in self.opts.cves]):
            return True
        if self.opts.with_cve:
            if any([ref.type == hawkey.REFERENCE_CVE for ref in advisory.references]):
                return True
        if self.opts.with_bz:
            if any([ref.type == hawkey.REFERENCE_BUGZILLA for ref in advisory.references]):
                return True
        return False

    def _apackage_advisory_installed(self, pkgs_query, cmptype, specs):
        """Return (adv. package, advisory, installed) triplets."""
        for apackage in pkgs_query.get_advisory_pkgs(cmptype):
            advisory = apackage.get_advisory(self.base.sack)
            advisory_match = self._advisory_matcher(advisory)
            apackage_match = any(fnmatch.fnmatchcase(apackage.name, pat)
                                 for pat in self.opts.spec)
            if advisory_match or apackage_match:
                installed = self._newer_equal_installed(apackage)
                yield apackage, advisory, installed

    def running_kernel_pkgs(self):
        """Return query containing packages of currently running kernel"""
        sack = self.base.sack
        q = sack.query().filterm(empty=True)
        kernel = sack.get_running_kernel()
        if kernel:
            q = q.union(sack.query().filterm(sourcerpm=kernel.sourcerpm))
        return q

    def available_apkg_adv_insts(self, specs):
        """Return available (adv. package, adv., inst.) triplets"""
        # check advisories for the latest installed packages
        q = self.base.sack.query().installed().latest(1)
        # plus packages of the running kernel
        q = q.union(self.running_kernel_pkgs().installed())
        return self._apackage_advisory_installed(q, hawkey.GT, specs)

    def installed_apkg_adv_insts(self, specs):
        """Return installed (adv. package, adv., inst.) triplets"""
        return self._apackage_advisory_installed(
            self.base.sack.query().installed(), hawkey.LT | hawkey.EQ, specs)

    def updating_apkg_adv_insts(self, specs):
        """Return updating (adv. package, adv., inst.) triplets"""
        return self._apackage_advisory_installed(
            self.base.sack.query().filterm(upgradable=True), hawkey.GT, specs)

    def all_apkg_adv_insts(self, specs):
        """Return installed (adv. package, adv., inst.) triplets"""
        return self._apackage_advisory_installed(
            self.base.sack.query().installed(), hawkey.LT | hawkey.EQ | hawkey.GT, specs)

    def _summary(self, apkg_adv_insts):
        """Make the summary of advisories."""
        # Remove duplicate advisory IDs. We assume that the ID is unique within
        # a repository and two advisories with the same IDs in different
        # repositories must have the same type.
        id2type = {}
        for (apkg, advisory, installed) in apkg_adv_insts:
            id2type[advisory.id] = advisory.type
            if advisory.type == hawkey.ADVISORY_SECURITY:
                id2type[(advisory.id, advisory.severity)] = (advisory.type, advisory.severity)
        return collections.Counter(id2type.values())

    def display_summary(self, apkg_adv_insts, description):
        """Display the summary of advisories."""
        typ2cnt = self._summary(apkg_adv_insts)
        if typ2cnt:
            print(_('Updates Information Summary: ') + description)
            # Convert types to strings and order the entries.
            label_counts = [
                (0, _('New Package notice(s)'), typ2cnt[hawkey.ADVISORY_NEWPACKAGE]),
                (0, _('Security notice(s)'), typ2cnt[hawkey.ADVISORY_SECURITY]),
                (1, _('Critical Security notice(s)'),
                 typ2cnt[(hawkey.ADVISORY_SECURITY, 'Critical')]),
                (1, _('Important Security notice(s)'),
                 typ2cnt[(hawkey.ADVISORY_SECURITY, 'Important')]),
                (1, _('Moderate Security notice(s)'),
                 typ2cnt[(hawkey.ADVISORY_SECURITY, 'Moderate')]),
                (1, _('Low Security notice(s)'),
                 typ2cnt[(hawkey.ADVISORY_SECURITY, 'Low')]),
                (1, _('Unknown Security notice(s)'),
                 typ2cnt[(hawkey.ADVISORY_SECURITY, None)]),
                (0, _('Bugfix notice(s)'), typ2cnt[hawkey.ADVISORY_BUGFIX]),
                (0, _('Enhancement notice(s)'), typ2cnt[hawkey.ADVISORY_ENHANCEMENT]),
                (0, _('other notice(s)'), typ2cnt[hawkey.ADVISORY_UNKNOWN])]
            width = _maxlen(unicode(v[2]) for v in label_counts if v[2])
            for indent, label, count in label_counts:
                if not count:
                    continue
                print('    %*s %s' % (width + 4 * indent, unicode(count), label))
        if self.base.conf.autocheck_running_kernel:
            self.cli._check_running_kernel()

    def display_list(self, apkg_adv_insts):
        """Display the list of advisories."""
        def inst2mark(inst):
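            # the 'i ' (installed) marker is only meaningful with --all, where
            # installed and available advisories are listed together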
            if not self.opts.availability == 'all':
                return ''
            elif inst:
                return 'i '
            else:
                return '  '

        def type2label(typ, sev):
            if typ == hawkey.ADVISORY_SECURITY:
                return self.SECURITY2LABEL.get(sev, _('Unknown/Sec.'))
            else:
                return self.TYPE2LABEL.get(typ, _('unknown'))

        nevra_inst_dict = dict()
        for apkg, advisory, installed in apkg_adv_insts:
            nevra = '%s-%s.%s' % (apkg.name, apkg.evr, apkg.arch)
            if self.opts.with_cve or self.opts.with_bz:
                for ref in advisory.references:
                    if ref.type == hawkey.REFERENCE_BUGZILLA and not self.opts.with_bz:
                        continue
                    elif ref.type == hawkey.REFERENCE_CVE and not self.opts.with_cve:
                        continue
                    nevra_inst_dict.setdefault((nevra, installed, advisory.updated), dict())[ref.id] = (
                            advisory.type, advisory.severity)
            else:
                nevra_inst_dict.setdefault((nevra, installed, advisory.updated), dict())[advisory.id] = (
                        advisory.type, advisory.severity)

        advlist = []
        # convert types to labels, find max len of advisory IDs and types
        idw = tlw = nw = 0
        for (nevra, inst, aupdated), id2type in sorted(nevra_inst_dict.items(), key=lambda x: x[0]):
            nw = max(nw, len(nevra))
            for aid, atypesev in id2type.items():
                idw = max(idw, len(aid))
                label = type2label(*atypesev)
                tlw = max(tlw, len(label))
                advlist.append((inst2mark(inst), aid, label, nevra, aupdated))

        for (inst, aid, label, nevra, aupdated) in advlist:
            if self.base.conf.verbose:
                print('%s%-*s %-*s %-*s %s' % (inst, idw, aid, tlw, label, nw, nevra, aupdated))
            else:
                print('%s%-*s %-*s %s' % (inst, idw, aid, tlw, label, nevra))


    def display_info(self, apkg_adv_insts):
        """Display the details about available advisories."""
        arches = self.base.sack.list_arches()
        verbose = self.base.conf.verbose
        labels = (_('Update ID'), _('Type'), _('Updated'), _('Bugs'),
                  _('CVEs'), _('Description'), _('Severity'), _('Rights'),
                  _('Files'), _('Installed'))

        def advisory2info(advisory, installed):
            attributes = [
                [advisory.id],
                [self.TYPE2LABEL.get(advisory.type, _('unknown'))],
                [unicode(advisory.updated)],
                [],
                [],
                (advisory.description or '').splitlines(),
                [advisory.severity],
                (advisory.rights or '').splitlines(),
                sorted(set(pkg.filename for pkg in advisory.packages
                           if pkg.arch in arches)),
                None]
            for ref in advisory.references:
                if ref.type == hawkey.REFERENCE_BUGZILLA:
                    attributes[3].append('{} - {}'.format(ref.id, ref.title or ''))
                elif ref.type == hawkey.REFERENCE_CVE:
                    attributes[4].append(ref.id)
            attributes[3].sort()
            attributes[4].sort()
            if not verbose:
                attributes[7] = None
                attributes[8] = None
            if self.opts.availability == 'all':
                attributes[9] = [_('true') if installed else _('false')]

            width = _maxlen(labels)
            lines = []
            lines.append('=' * 79)
            lines.append('  ' + advisory.title)
            lines.append('=' * 79)
            for label, atr_lines in zip(labels, attributes):
                if atr_lines in (None, [None]):
                    continue
                for i, line in enumerate(atr_lines):
                    key = label if i == 0 else ''
                    key_padding = width - exact_width(key)
                    lines.append('%*s%s: %s' % (key_padding, "", key, line))
            return '\n'.join(lines)

        advisories = set()
        for apkg, advisory, installed in apkg_adv_insts:
            advisories.add(advisory2info(advisory, installed))

        print("\n\n".join(sorted(advisories, key=lambda x: x.lower())))
site-packages/dnf/cli/commands/upgrademinimal.py
#
# Copyright (C) 2016 Red Hat, Inc.
#
# This copyrighted material is made available to anyone wishing to use,
# modify, copy, or redistribute it subject to the terms and conditions of
# the GNU General Public License v.2, or (at your option) any later version.
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY expressed or implied, including the implied warranties of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General
# Public License for more details.  You should have received a copy of the
# GNU General Public License along with this program; if not, write to the
# Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
# 02110-1301, USA.  Any Red Hat trademarks that are incorporated in the
# source code or documentation are not subject to the GNU General Public
# License and may only be used or replicated with the express permission of
# Red Hat, Inc.
#

from __future__ import absolute_import
from __future__ import unicode_literals
from dnf.i18n import _
from dnf.cli.commands.upgrade import UpgradeCommand


class UpgradeMinimalCommand(UpgradeCommand):
    """A class containing methods needed by the cli to execute the check
    command.
    """

    aliases = ('upgrade-minimal', 'update-minimal', 'up-min')
    summary = _("upgrade, but only 'newest' package match which fixes a problem"
                " that affects your system")

    def configure(self):
        UpgradeCommand.configure(self)

        self.upgrade_minimal = True
        if not any([self.opts.bugfix, self.opts.enhancement,
                   self.opts.newpackage, self.opts.security, self.opts.advisory,
                   self.opts.bugzilla, self.opts.cves, self.opts.severity]):
            self.all_security = True
site-packages/dnf/cli/commands/__init__.py
# Copyright 2006 Duke University
# Copyright (C) 2012-2016 Red Hat, Inc.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU Library General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
#
# Written by Seth Vidal

"""
Classes for subcommands of the yum command line interface.
"""

from __future__ import print_function
from __future__ import unicode_literals

from dnf.cli.option_parser import OptionParser
from dnf.i18n import _

import dnf.cli
import dnf.exceptions
import dnf.pycomp
import dnf.util
import logging
import os

logger = logging.getLogger('dnf')
_RPM_VERIFY = _("To diagnose the problem, try running: '%s'.") % \
    'rpm -Va --nofiles --nodigest'
_RPM_REBUILDDB = _("You probably have corrupted RPMDB, running '%s'"
                   " might fix the issue.") % 'rpm --rebuilddb'

gpg_msg = \
    _("""You have enabled checking of packages via GPG keys. This is a good thing.
However, you do not have any GPG public keys installed. You need to download
the keys for packages you wish to install and install them.
You can do that by running the command:
    rpm --import public.gpg.key


Alternatively you can specify the url to the key you would like to use
for a repository in the 'gpgkey' option in a repository section and {prog}
will install it for you.

For more information contact your distribution or package provider.""")


def _checkGPGKey(base, cli):
    """Verify that there are gpg keys for the enabled repositories in the
    rpm database.

    :param base: a :class:`dnf.Base` object.
    :raises: :class:`cli.CliError`
    """
    if not base.conf.gpgcheck:
        return
    if not base._gpg_key_check():
        for repo in base.repos.iter_enabled():
            if (repo.gpgcheck or repo.repo_gpgcheck) and not repo.gpgkey:
                logger.critical("\n%s\n", gpg_msg.format(prog=dnf.util.MAIN_PROG_UPPER))
                logger.critical(_("Problem repository: %s"), repo)
                raise dnf.cli.CliError


def _checkEnabledRepo(base, possible_local_files=()):
    """Verify that there is at least one enabled repo.

    :param base: a :class:`dnf.Base` object.
    :param possible_local_files: the list of strings that could be a local rpms
    :raises: :class:`cli.CliError`:
    """
    if base.repos._any_enabled():
        return

    for lfile in possible_local_files:
        if lfile.endswith(".rpm") and os.path.exists(lfile):
            return
        scheme = dnf.pycomp.urlparse.urlparse(lfile)[0]
        if scheme in ('http', 'ftp', 'file', 'https'):
            return
    msg = _('There are no enabled repositories in "{}".').format('", "'.join(base.conf.reposdir))
    raise dnf.cli.CliError(msg)


class Command(object):
    """Abstract base class for CLI commands."""

    aliases = [] # :api
    summary = ""  # :api
    opts = None

    def __init__(self, cli):
        # :api
        self.cli = cli

    @property
    def base(self):
        # :api
        return self.cli.base

    @property
    def _basecmd(self):
        return self.aliases[0]

    @property
    def output(self):
        return self.cli.base.output

    def set_argparser(self, parser):
        """Define command specific options and arguments. #:api"""
        pass

    def pre_configure(self):
        # :api
        """Do any command-specific pre-configuration."""
        pass

    def configure(self):
        # :api
        """Do any command-specific configuration."""
        pass

    def get_error_output(self, error):
        """Get suggestions for resolving the given error."""
        if isinstance(error, dnf.exceptions.TransactionCheckError):
            return (_RPM_VERIFY, _RPM_REBUILDDB)
        raise NotImplementedError('error not supported yet: %s' % error)

    def run(self):
        # :api
        """Execute the command."""
        pass

    def run_resolved(self):
        """Finalize operation after resolvement"""
        pass

    def run_transaction(self):
        """Finalize operations post-transaction."""
        pass
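
# Illustrative sketch (not part of dnf itself): a third-party command plugin
# built on the Command base class above would typically override the ":api"
# hooks like this (HelloWorldCommand is a hypothetical example name):
#
#     class HelloWorldCommand(Command):
#         aliases = ('helloworld',)
#         summary = 'print a greeting'
#
#         def configure(self):
#             # this trivial command needs no repos or sack
#             pass
#
#         def run(self):
#             print('Hello from a dnf command plugin')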

class InfoCommand(Command):
    """A class containing methods needed by the cli to execute the
    info command.
    """

    aliases = ('info',)
    summary = _('display details about a package or group of packages')
    DEFAULT_PKGNARROW = 'all'
    pkgnarrows = {'available', 'installed', 'extras', 'updates', 'upgrades',
                  'autoremove', 'recent', 'obsoletes', DEFAULT_PKGNARROW}

    @classmethod
    def set_argparser(cls, parser):
        narrows = parser.add_mutually_exclusive_group()
        narrows.add_argument('--all', dest='_packages_action',
                             action='store_const', const='all', default=None,
                             help=_("show all packages (default)"))
        narrows.add_argument('--available', dest='_packages_action',
                             action='store_const', const='available',
                             help=_("show only available packages"))
        narrows.add_argument('--installed', dest='_packages_action',
                             action='store_const', const='installed',
                             help=_("show only installed packages"))
        narrows.add_argument('--extras', dest='_packages_action',
                             action='store_const', const='extras',
                             help=_("show only extras packages"))
        narrows.add_argument('--updates', dest='_packages_action',
                             action='store_const', const='upgrades',
                             help=_("show only upgrades packages"))
        narrows.add_argument('--upgrades', dest='_packages_action',
                             action='store_const', const='upgrades',
                             help=_("show only upgrades packages"))
        narrows.add_argument('--autoremove', dest='_packages_action',
                             action='store_const', const='autoremove',
                             help=_("show only autoremove packages"))
        narrows.add_argument('--recent', dest='_packages_action',
                             action='store_const', const='recent',
                             help=_("show only recently changed packages"))
        parser.add_argument('packages', nargs='*', metavar=_('PACKAGE'),
                            choices=cls.pkgnarrows, default=cls.DEFAULT_PKGNARROW,
                            action=OptionParser.PkgNarrowCallback,
                            help=_("Package name specification"))

    def configure(self):
        demands = self.cli.demands
        demands.sack_activation = True
        if self.opts._packages_action:
            self.opts.packages_action = self.opts._packages_action
        if self.opts.packages_action != 'installed':
            demands.available_repos = True
        if self.opts.obsoletes:
            if self.opts._packages_action:
                self.cli._option_conflict("--obsoletes", "--" + self.opts._packages_action)
            else:
                self.opts.packages_action = 'obsoletes'
        if self.opts.packages_action == 'updates':
            self.opts.packages_action = 'upgrades'

    def run(self):
        self.cli._populate_update_security_filter(self.opts)
        return self.base.output_packages('info', self.opts.packages_action,
                                         self.opts.packages)


class ListCommand(InfoCommand):
    """A class containing methods needed by the cli to execute the
    list command.
    """

    aliases = ('list', 'ls')
    summary = _('list a package or groups of packages')

    def run(self):
        self.cli._populate_update_security_filter(self.opts)
        return self.base.output_packages('list', self.opts.packages_action,
                                         self.opts.packages)


class ProvidesCommand(Command):
    """A class containing methods needed by the cli to execute the
    provides command.
    """

    aliases = ('provides', 'whatprovides', 'prov')
    summary = _('find what package provides the given value')

    @staticmethod
    def set_argparser(parser):
        parser.add_argument('dependency', nargs='+', metavar=_('PROVIDE'),
                            help=_("Provide specification to search for"))

    def configure(self):
        demands = self.cli.demands
        demands.available_repos = True
        demands.fresh_metadata = False
        demands.sack_activation = True

    def run(self):
        logger.debug(_("Searching Packages: "))
        return self.base.provides(self.opts.dependency)


class CheckUpdateCommand(Command):
    """A class containing methods needed by the cli to execute the
    check-update command.
    """

    aliases = ('check-update', 'check-upgrade')
    summary = _('check for available package upgrades')

    @staticmethod
    def set_argparser(parser):
        parser.add_argument('--changelogs', dest='changelogs',
                            default=False, action='store_true',
                            help=_('show changelogs before update'))
        parser.add_argument('packages', nargs='*', metavar=_('PACKAGE'))

    def configure(self):
        demands = self.cli.demands
        demands.sack_activation = True
        demands.available_repos = True
        demands.plugin_filtering_enabled = True
        if self.opts.changelogs:
            demands.changelogs = True
        _checkEnabledRepo(self.base)

    def run(self):
        self.cli._populate_update_security_filter(self.opts, cmp_type="gte")

        found = self.base.check_updates(self.opts.packages, print_=True,
                                        changelogs=self.opts.changelogs)
        if found:
            self.cli.demands.success_exit_status = 100

        if self.base.conf.autocheck_running_kernel:
            self.cli._check_running_kernel()


class RepoPkgsCommand(Command):
    """Implementation of the repository-packages command."""

    class CheckUpdateSubCommand(Command):
        """Implementation of the info sub-command."""

        aliases = ('check-update',)

        def configure(self):
            demands = self.cli.demands
            demands.available_repos = True
            demands.sack_activation = True

        def run_on_repo(self):
            """Execute the command with respect to given arguments *cli_args*."""
            found = self.base.check_updates(self.opts.pkg_specs,
                                            self.reponame, print_=True)
            if found:
                self.cli.demands.success_exit_status = 100

    class InfoSubCommand(Command):
        """Implementation of the info sub-command."""

        aliases = ('info',)

        def configure(self):
            demands = self.cli.demands
            demands.sack_activation = True
            if self.opts._pkg_specs_action:
                self.opts.pkg_specs_action = self.opts._pkg_specs_action
            if self.opts.pkg_specs_action != 'installed':
                demands.available_repos = True
            if self.opts.obsoletes:
                if self.opts._pkg_specs_action:
                    self.cli._option_conflict("--obsoletes", "--" + self.opts._pkg_specs_action)
                else:
                    self.opts.pkg_specs_action = 'obsoletes'

        def run_on_repo(self):
            """Execute the command with respect to given arguments *cli_args*."""
            self.cli._populate_update_security_filter(self.opts)
            self.base.output_packages('info', self.opts.pkg_specs_action,
                                      self.opts.pkg_specs, self.reponame)

    class InstallSubCommand(Command):
        """Implementation of the install sub-command."""

        aliases = ('install',)

        def configure(self):
            demands = self.cli.demands
            demands.available_repos = True
            demands.sack_activation = True
            demands.resolving = True
            demands.root_user = True

        def run_on_repo(self):
            """Execute the command with respect to given arguments *cli_args*."""
            self.cli._populate_update_security_filter(self.opts)
            _checkGPGKey(self.base, self.cli)

            done = False

            if not self.opts.pkg_specs:
                # Install all packages.
                try:
                    self.base.install('*', self.reponame)
                except dnf.exceptions.MarkingError:
                    logger.info(_('No package available.'))
                else:
                    done = True
            else:
                # Install packages.
                for pkg_spec in self.opts.pkg_specs:
                    try:
                        self.base.install(pkg_spec, self.reponame)
                    except dnf.exceptions.MarkingError as e:
                        msg = '{}: {}'.format(e.value, self.base.output.term.bold(pkg_spec))
                        logger.info(msg)
                    else:
                        done = True

            if not done:
                raise dnf.exceptions.Error(_('No packages marked for install.'))

    class ListSubCommand(InfoSubCommand):
        """Implementation of the list sub-command."""

        aliases = ('list',)

        def run_on_repo(self):
            """Execute the command with respect to given arguments *cli_args*."""
            self.cli._populate_update_security_filter(self.opts)
            self.base.output_packages('list', self.opts.pkg_specs_action,
                                      self.opts.pkg_specs, self.reponame)

    class MoveToSubCommand(Command):
        """Implementation of the move-to sub-command."""

        aliases = ('move-to',)

        def configure(self):
            demands = self.cli.demands
            demands.sack_activation = True
            demands.available_repos = True
            demands.resolving = True
            demands.root_user = True

        def run_on_repo(self):
            """Execute the command with respect to given arguments *cli_args*."""
            _checkGPGKey(self.base, self.cli)

            done = False

            if not self.opts.pkg_specs:
                # Reinstall all packages.
                try:
                    self.base.reinstall('*', new_reponame=self.reponame)
                except dnf.exceptions.PackagesNotInstalledError:
                    logger.info(_('No package installed.'))
                except dnf.exceptions.PackagesNotAvailableError:
                    logger.info(_('No package available.'))
                except dnf.exceptions.MarkingError:
                    assert False, 'Only the above marking errors are expected.'
                else:
                    done = True
            else:
                # Reinstall packages.
                for pkg_spec in self.opts.pkg_specs:
                    try:
                        self.base.reinstall(pkg_spec, new_reponame=self.reponame)
                    except dnf.exceptions.PackagesNotInstalledError:
                        msg = _('No match for argument: %s')
                        logger.info(msg, pkg_spec)
                    except dnf.exceptions.PackagesNotAvailableError as err:
                        for pkg in err.packages:
                            xmsg = ''
                            pkgrepo = self.base.history.repo(pkg)
                            if pkgrepo:
                                xmsg = _(' (from %s)') % pkgrepo
                            msg = _('Installed package %s%s not available.')
                            logger.info(msg, self.output.term.bold(pkg), xmsg)
                    except dnf.exceptions.MarkingError:
                        assert False, \
                               'Only the above marking errors are expected.'
                    else:
                        done = True

            if not done:
                raise dnf.exceptions.Error(_('Nothing to do.'))

    class ReinstallOldSubCommand(Command):
        """Implementation of the reinstall-old sub-command."""

        aliases = ('reinstall-old',)

        def configure(self):
            demands = self.cli.demands
            demands.sack_activation = True
            demands.available_repos = True
            demands.resolving = True
            demands.root_user = True

        def run_on_repo(self):
            """Execute the command with respect to given arguments *cli_args*."""
            _checkGPGKey(self.base, self.cli)

            done = False

            if not self.opts.pkg_specs:
                # Reinstall all packages.
                try:
                    self.base.reinstall('*', self.reponame, self.reponame)
                except dnf.exceptions.PackagesNotInstalledError:
                    msg = _('No package installed from the repository.')
                    logger.info(msg)
                except dnf.exceptions.PackagesNotAvailableError:
                    logger.info(_('No package available.'))
                except dnf.exceptions.MarkingError:
                    assert False, 'Only the above marking errors are expected.'
                else:
                    done = True
            else:
                # Reinstall packages.
                for pkg_spec in self.opts.pkg_specs:
                    try:
                        self.base.reinstall(pkg_spec, self.reponame,
                                            self.reponame)
                    except dnf.exceptions.PackagesNotInstalledError:
                        msg = _('No match for argument: %s')
                        logger.info(msg, pkg_spec)
                    except dnf.exceptions.PackagesNotAvailableError as err:
                        for pkg in err.packages:
                            xmsg = ''
                            pkgrepo = self.base.history.repo(pkg)
                            if pkgrepo:
                                xmsg = _(' (from %s)') % pkgrepo
                            msg = _('Installed package %s%s not available.')
                            logger.info(msg, self.output.term.bold(pkg), xmsg)
                    except dnf.exceptions.MarkingError:
                        assert False, \
                               'Only the above marking errors are expected.'
                    else:
                        done = True

            if not done:
                raise dnf.exceptions.Error(_('Nothing to do.'))

    class ReinstallSubCommand(Command):
        """Implementation of the reinstall sub-command."""

        aliases = ('reinstall',)

        def __init__(self, cli):
            """Initialize the command."""
            super(RepoPkgsCommand.ReinstallSubCommand, self).__init__(cli)
            self.wrapped_commands = (RepoPkgsCommand.ReinstallOldSubCommand(cli),
                                     RepoPkgsCommand.MoveToSubCommand(cli))

        def configure(self):
            self.cli.demands.available_repos = True
            for command in self.wrapped_commands:
                command.opts = self.opts
                command.reponame = self.reponame
                command.configure()

        def run_on_repo(self):
            """Execute the command with respect to given arguments *cli_args*."""
            _checkGPGKey(self.base, self.cli)
            for command in self.wrapped_commands:
                try:
                    command.run_on_repo()
                except dnf.exceptions.Error:
                    continue
                else:
                    break
            else:
                raise dnf.exceptions.Error(_('No packages marked for reinstall.'))

    class RemoveOrDistroSyncSubCommand(Command):
        """Implementation of the remove-or-distro-sync sub-command."""

        aliases = ('remove-or-distro-sync',)

        def configure(self):
            demands = self.cli.demands
            demands.available_repos = True
            demands.sack_activation = True
            demands.resolving = True
            demands.root_user = True

        def _replace(self, pkg_spec, reponame):
            """Synchronize a package with another repository or remove it."""
            self.cli.base.sack.disable_repo(reponame)

            subject = dnf.subject.Subject(pkg_spec)
            matches = subject.get_best_query(self.cli.base.sack)
            history = self.cli.base.history
            installed = [
                pkg for pkg in matches.installed()
                if history.repo(pkg) == reponame]
            if not installed:
                raise dnf.exceptions.PackagesNotInstalledError(
                    'no package matched', pkg_spec)
            available = matches.available()
            clean_deps = self.cli.base.conf.clean_requirements_on_remove
            for package in installed:
                if available.filter(name=package.name, arch=package.arch):
                    self.cli.base._goal.distupgrade(package)
                else:
                    self.cli.base._goal.erase(package, clean_deps=clean_deps)

        def run_on_repo(self):
            """Execute the command with respect to given arguments *cli_args*."""
            _checkGPGKey(self.base, self.cli)

            done = False

            if not self.opts.pkg_specs:
                # Sync all packages.
                try:
                    self._replace('*', self.reponame)
                except dnf.exceptions.PackagesNotInstalledError:
                    msg = _('No package installed from the repository.')
                    logger.info(msg)
                else:
                    done = True
            else:
                # Synchronize or remove the specified packages.
                for pkg_spec in self.opts.pkg_specs:
                    try:
                        self._replace(pkg_spec, self.reponame)
                    except dnf.exceptions.PackagesNotInstalledError:
                        msg = _('No match for argument: %s')
                        logger.info(msg, pkg_spec)
                    else:
                        done = True

            if not done:
                raise dnf.exceptions.Error(_('Nothing to do.'))

    class RemoveOrReinstallSubCommand(Command):
        """Implementation of the remove-or-reinstall sub-command."""

        aliases = ('remove-or-reinstall',)

        def configure(self):
            demands = self.cli.demands
            demands.sack_activation = True
            demands.available_repos = True
            demands.resolving = True
            demands.root_user = True

        def run_on_repo(self):
            """Execute the command with respect to given arguments *cli_args*."""
            _checkGPGKey(self.base, self.cli)

            done = False

            if not self.opts.pkg_specs:
                # Reinstall all packages.
                try:
                    self.base.reinstall('*', old_reponame=self.reponame,
                                        new_reponame_neq=self.reponame,
                                        remove_na=True)
                except dnf.exceptions.PackagesNotInstalledError:
                    msg = _('No package installed from the repository.')
                    logger.info(msg)
                except dnf.exceptions.MarkingError:
                    assert False, 'Only the above marking error is expected.'
                else:
                    done = True
            else:
                # Reinstall (or remove) the specified packages.
                for pkg_spec in self.opts.pkg_specs:
                    try:
                        self.base.reinstall(
                            pkg_spec, old_reponame=self.reponame,
                            new_reponame_neq=self.reponame, remove_na=True)
                    except dnf.exceptions.PackagesNotInstalledError:
                        msg = _('No match for argument: %s')
                        logger.info(msg, pkg_spec)
                    except dnf.exceptions.MarkingError:
                        assert False, 'Only the above marking error is expected.'
                    else:
                        done = True

            if not done:
                raise dnf.exceptions.Error(_('Nothing to do.'))

    class RemoveSubCommand(Command):
        """Implementation of the remove sub-command."""

        aliases = ('remove',)

        def configure(self):
            demands = self.cli.demands
            demands.sack_activation = True
            demands.allow_erasing = True
            demands.available_repos = False
            demands.resolving = True
            demands.root_user = True

        def run_on_repo(self):
            """Execute the command with respect to given arguments *cli_args*."""

            done = False

            if not self.opts.pkg_specs:
                # Remove all packages.
                try:
                    self.base.remove('*', self.reponame)
                except dnf.exceptions.MarkingError:
                    msg = _('No package installed from the repository.')
                    logger.info(msg)
                else:
                    done = True
            else:
                # Remove packages.
                for pkg_spec in self.opts.pkg_specs:
                    try:
                        self.base.remove(pkg_spec, self.reponame)
                    except dnf.exceptions.MarkingError as e:
                        logger.info(str(e))
                    else:
                        done = True

            if not done:
                logger.warning(_('No packages marked for removal.'))

    class UpgradeSubCommand(Command):
        """Implementation of the upgrade sub-command."""

        aliases = ('upgrade', 'upgrade-to')

        def configure(self):
            demands = self.cli.demands
            demands.sack_activation = True
            demands.available_repos = True
            demands.resolving = True
            demands.root_user = True

        def run_on_repo(self):
            """Execute the command with respect to given arguments *cli_args*."""
            _checkGPGKey(self.base, self.cli)

            done = False

            if not self.opts.pkg_specs:
                # Update all packages.
                self.base.upgrade_all(self.reponame)
                done = True
            else:
                # Update packages.
                for pkg_spec in self.opts.pkg_specs:
                    try:
                        self.base.upgrade(pkg_spec, self.reponame)
                    except dnf.exceptions.MarkingError:
                        logger.info(_('No match for argument: %s'), pkg_spec)
                    else:
                        done = True

            if not done:
                raise dnf.exceptions.Error(_('No packages marked for upgrade.'))

    SUBCMDS = {CheckUpdateSubCommand, InfoSubCommand, InstallSubCommand,
               ListSubCommand, MoveToSubCommand, ReinstallOldSubCommand,
               ReinstallSubCommand, RemoveOrDistroSyncSubCommand,
               RemoveOrReinstallSubCommand, RemoveSubCommand,
               UpgradeSubCommand}

    aliases = ('repository-packages',
               'repo-pkgs', 'repo-packages', 'repository-pkgs')
    summary = _('run commands on top of all packages in given repository')

    def __init__(self, cli):
        """Initialize the command."""
        super(RepoPkgsCommand, self).__init__(cli)
        subcmd_objs = (subcmd(cli) for subcmd in self.SUBCMDS)
        self.subcmd = None
        self._subcmd_name2obj = {
            alias: subcmd for subcmd in subcmd_objs for alias in subcmd.aliases}

    def set_argparser(self, parser):
        narrows = parser.add_mutually_exclusive_group()
        narrows.add_argument('--all', dest='_pkg_specs_action',
                             action='store_const', const='all', default=None,
                             help=_("show all packages (default)"))
        narrows.add_argument('--available', dest='_pkg_specs_action',
                             action='store_const', const='available',
                             help=_("show only available packages"))
        narrows.add_argument('--installed', dest='_pkg_specs_action',
                             action='store_const', const='installed',
                             help=_("show only installed packages"))
        narrows.add_argument('--extras', dest='_pkg_specs_action',
                             action='store_const', const='extras',
                             help=_("show only extras packages"))
        narrows.add_argument('--updates', dest='_pkg_specs_action',
                             action='store_const', const='upgrades',
                             help=_("show only upgrades packages"))
        narrows.add_argument('--upgrades', dest='_pkg_specs_action',
                             action='store_const', const='upgrades',
                             help=_("show only upgrades packages"))
        narrows.add_argument('--autoremove', dest='_pkg_specs_action',
                             action='store_const', const='autoremove',
                             help=_("show only autoremove packages"))
        narrows.add_argument('--recent', dest='_pkg_specs_action',
                             action='store_const', const='recent',
                             help=_("show only recently changed packages"))

        parser.add_argument(
            'reponame', nargs=1, action=OptionParser._RepoCallbackEnable,
            metavar=_('REPOID'), help=_("Repository ID"))
        subcommand_choices = [subcmd.aliases[0] for subcmd in self.SUBCMDS]
        subcommand_choices_all = [alias for subcmd in self.SUBCMDS for alias in subcmd.aliases]
        parser.add_argument('subcmd', nargs=1, metavar="SUBCOMMAND",
                            choices=subcommand_choices_all, help=", ".join(subcommand_choices))
        DEFAULT_PKGNARROW = 'all'
        pkgnarrows = {DEFAULT_PKGNARROW, 'installed', 'available',
                      'autoremove', 'extras', 'obsoletes', 'recent',
                      'upgrades'}
        parser.add_argument('pkg_specs', nargs='*', metavar=_('PACKAGE'),
                            choices=pkgnarrows, default=DEFAULT_PKGNARROW,
                            action=OptionParser.PkgNarrowCallback,
                            help=_("Package specification"))

    def configure(self):
        """Verify whether the command can run with given arguments."""
        # Check sub-command.
        try:
            self.subcmd = self._subcmd_name2obj[self.opts.subcmd[0]]
        except (dnf.cli.CliError, KeyError) as e:
            self.cli.optparser.print_usage()
            raise dnf.cli.CliError
        self.subcmd.opts = self.opts
        self.subcmd.reponame = self.opts.reponame[0]
        self.subcmd.configure()

    def run(self):
        """Execute the command with respect to given arguments *extcmds*."""
        self.subcmd.run_on_repo()


class HelpCommand(Command):
    """A class containing methods needed by the cli to execute the
    help command.
    """

    aliases = ('help',)
    summary = _('display a helpful usage message')

    @staticmethod
    def set_argparser(parser):
        parser.add_argument('cmd', nargs='?', metavar=_('COMMAND'),
                            help=_("{prog} command to get help for").format(
                                prog=dnf.util.MAIN_PROG_UPPER))

    def run(self):
        if (not self.opts.cmd
                or self.opts.cmd not in self.cli.cli_commands):
            self.cli.optparser.print_help()
        else:
            command = self.cli.cli_commands[self.opts.cmd]
            self.cli.optparser.print_help(command(self))
site-packages/dnf/cli/utils.py

# Copyright (C) 2016  Red Hat, Inc.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU Library General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.

"""Various utility functions, and a utility class."""

from __future__ import absolute_import
from __future__ import unicode_literals
from dnf.cli.format import format_number
from dnf.i18n import _
import dnf.util
import logging
import os
import time

_USER_HZ = os.sysconf(os.sysconf_names['SC_CLK_TCK'])
logger = logging.getLogger('dnf')

def jiffies_to_seconds(jiffies):
    """Convert a number of jiffies to seconds. How many jiffies are in a second
    is system-dependent, e.g. 100 jiffies = 1 second is common.

    :param jiffies: a number of jiffies
    :return: the equivalent number of seconds
    """
    return int(jiffies) / _USER_HZ
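
# Example (assuming the common USER_HZ value of 100 ticks per second):
#
#     jiffies_to_seconds(250)   # -> 2.5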


def seconds_to_ui_time(seconds):
    """Return a human-readable string representation of the length of
    a time interval given in seconds.

    :param seconds: the length of the time interval in seconds
    :return: a human-readable string representation of the length of
      the time interval
    """
    if seconds >= 60 * 60 * 24:
        return "%d day(s) %d:%02d:%02d" % (seconds // (60 * 60 * 24),
                                           (seconds // (60 * 60)) % 24,
                                           (seconds // 60) % 60,
                                           seconds % 60)
    if seconds >= 60 * 60:
        return "%d:%02d:%02d" % (seconds // (60 * 60), (seconds // 60) % 60,
                                 (seconds % 60))
    return "%02d:%02d" % ((seconds // 60), seconds % 60)


def get_process_info(pid):
    """Return info dict about a process."""

    pid = int(pid)

    # These paths may be missing if /proc isn't mounted, or on non-Linux systems.
    if (not os.path.exists("/proc/%d/status" % pid) or
        not os.path.exists("/proc/stat") or
        not os.path.exists("/proc/%d/stat" % pid)):
        return

    ps = {}
    with open("/proc/%d/status" % pid) as status_file:
        for line in status_file:
            if line[-1] != '\n':
                continue
            data = line[:-1].split(':\t', 1)
            if len(data) < 2:
                continue
            data[1] = dnf.util.rtrim(data[1], ' kB')
            ps[data[0].strip().lower()] = data[1].strip()
    if 'vmrss' not in ps:
        return
    if 'vmsize' not in ps:
        return

    boot_time = None
    with open("/proc/stat") as stat_file:
        for line in stat_file:
            if line.startswith("btime "):
                boot_time = int(line[len("btime "):-1])
                break
    if boot_time is None:
        return

    with open('/proc/%d/stat' % pid) as stat_file:
        ps_stat = stat_file.read().split()
        ps['start_time'] = boot_time + jiffies_to_seconds(ps_stat[21])
        ps['state'] = {'R' : _('Running'),
                       'S' : _('Sleeping'),
                       'D' : _('Uninterruptible'),
                       'Z' : _('Zombie'),
                       'T' : _('Traced/Stopped')
                       }.get(ps_stat[2], _('Unknown'))

    return ps
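
# The returned dict mixes lower-cased fields from /proc/<pid>/status with a few
# computed entries; a sketch of what a caller sees (values illustrative only):
#
#     {'name': 'dnf', 'vmrss': '48000', 'vmsize': '310000',
#      'start_time': 1700000000.0, 'state': 'Sleeping', ...}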


def show_lock_owner(pid):
    """Output information about process holding a lock."""

    ps = get_process_info(pid)
    if not ps:
        msg = _('Unable to find information about the locking process (PID %d)')
        logger.critical(msg, pid)
        return

    msg = _('  The application with PID %d is: %s') % (pid, ps['name'])

    logger.critical("%s", msg)
    logger.critical(_("    Memory : %5s RSS (%5sB VSZ)"),
                    format_number(int(ps['vmrss']) * 1024),
                    format_number(int(ps['vmsize']) * 1024))

    ago = seconds_to_ui_time(int(time.time()) - ps['start_time'])
    logger.critical(_('    Started: %s - %s ago'),
                    dnf.util.normalize_time(ps['start_time']), ago)
    logger.critical(_('    State  : %s'), ps['state'])

    return
site-packages/dnf/cli/option_parser.py

# optparse.py
# CLI options parser.
#
# Copyright (C) 2014-2016 Red Hat, Inc.
#
# This copyrighted material is made available to anyone wishing to use,
# modify, copy, or redistribute it subject to the terms and conditions of
# the GNU General Public License v.2, or (at your option) any later version.
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY expressed or implied, including the implied warranties of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General
# Public License for more details.  You should have received a copy of the
# GNU General Public License along with this program; if not, write to the
# Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
# 02110-1301, USA.  Any Red Hat trademarks that are incorporated in the
# source code or documentation are not subject to the GNU General Public
# License and may only be used or replicated with the express permission of
# Red Hat, Inc.
#

from __future__ import unicode_literals
from dnf.i18n import _
from dnf.util import _parse_specs

import argparse
import dnf.exceptions
import dnf.util
import dnf.rpm
import dnf.yum.misc
import logging
import os.path
import re
import sys

logger = logging.getLogger("dnf")


class MultilineHelpFormatter(argparse.HelpFormatter):
    def _split_lines(self, text, width):
        if '\n' in text:
            return text.splitlines()
        return super(MultilineHelpFormatter, self)._split_lines(text, width)

class OptionParser(argparse.ArgumentParser):
    """ArgumentParser like class to do things the "yum way"."""

    def __init__(self, reset_usage=True):
        super(OptionParser, self).__init__(add_help=False,
                                           formatter_class=MultilineHelpFormatter)
        self.command_positional_parser = None
        self.command_group = None
        self._add_general_options()
        if reset_usage:
            self._cmd_usage = {}      # names, summary for dnf commands, to build usage
            self._cmd_groups = set()  # cmd groups added (main, plugin)

    def error(self, msg):
        """Output an error message, and exit the program.
           This method overrides the standard argparse error handler
           so that error output goes to the logger.

        :param msg: the error message to output
        """
        self.print_usage()
        logger.critical(_("Command line error: %s"), msg)
        sys.exit(1)

    class _RepoCallback(argparse.Action):
        def __call__(self, parser, namespace, values, opt_str):
            operation = 'disable' if opt_str == '--disablerepo' else 'enable'
            l = getattr(namespace, self.dest)
            l.extend((x, operation) for x in re.split(r'\s*[,\s]\s*', values))

    class _RepoCallbackEnable(argparse.Action):
        def __call__(self, parser, namespace, values, opt_str):
            namespace.repos_ed.append((values[0], 'enable'))
            setattr(namespace, 'reponame', values)

    class _SplitCallback(argparse._AppendAction):
        """ Split all strings in seq, at "," and whitespace.
        Returns a new list. """
        SPLITTER = r'\s*[,\s]\s*'

        def __call__(self, parser, namespace, values, opt_str):
            first = True
            for val in re.split(self.SPLITTER, values):
                if first or val:
                    # Empty values are sometimes used to clear existing content of the option.
                    # Only the first value in the parsed string can be empty. Other empty values
                    # are ignored.
                    super(OptionParser._SplitCallback,
                          self).__call__(parser, namespace, val, opt_str)
                first = False
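
    # Example: a single option value is split on commas and whitespace, so a
    # hypothetical `--disableplugin="foo, bar baz"` appends 'foo', 'bar' and
    # 'baz' to namespace.disableplugin.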

    class _SplitExtendDictCallback(argparse.Action):
        """ Split string at "," or whitespace to (key, value).
        Extends dict with {key: value}."""
        def __call__(self, parser, namespace, values, opt_str):
            try:
                key, val = values.split(',')
                if not key or not val:
                    raise ValueError
            except ValueError:
                msg = _('bad format: %s') % values
                raise argparse.ArgumentError(self, msg)
            dct = getattr(namespace, self.dest)
            dct[key] = val
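
    # Example: a hypothetical `--repofrompath=myrepo,/tmp/myrepo` stores
    # {'myrepo': '/tmp/myrepo'} in namespace.repofrompath.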

    class _SetoptsCallback(argparse.Action):
        """ Parse setopts arguments and put them into main_<setopts>
            and repo_<setopts>."""
        def __call__(self, parser, namespace, values, opt_str):
            vals = values.split('=')
            if len(vals) > 2:
                logger.warning(_("Setopt argument has multiple values: %s"), values)
                return
            if len(vals) < 2:
                logger.warning(_("Setopt argument has no value: %s"), values)
                return
            k, v = vals
            period = k.rfind('.')
            if period != -1:
                repo = k[:period]
                k = k[period+1:]
                if hasattr(namespace, 'repo_setopts'):
                    repoopts = namespace.repo_setopts
                else:
                    repoopts = {}
                repoopts.setdefault(repo, {}).setdefault(k, []).append(v)
                setattr(namespace, 'repo_' + self.dest, repoopts)
            else:
                if hasattr(namespace, 'main_setopts'):
                    mainopts = namespace.main_setopts
                else:
                    mainopts = {}
                mainopts.setdefault(k, []).append(v)
                setattr(namespace, 'main_' + self.dest, mainopts)
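
    # Example: `--setopt=install_weak_deps=False` ends up in
    # namespace.main_setopts as {'install_weak_deps': ['False']}, while
    # `--setopt=updates.gpgcheck=0` (hypothetical repo id "updates") ends up in
    # namespace.repo_setopts as {'updates': {'gpgcheck': ['0']}}.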

    class ParseSpecGroupFileCallback(argparse.Action):
        def __call__(self, parser, namespace, values, opt_str):
            _parse_specs(namespace, values)

    class PkgNarrowCallback(argparse.Action):
        def __init__(self, *args, **kwargs):
            self.pkgnarrow = {}
            try:
                for k in ['choices', 'default']:
                    self.pkgnarrow[k] = kwargs[k]
                    del kwargs[k]
            except KeyError as e:
                raise TypeError("%s() missing mandatory argument %s"
                                % (self.__class__.__name__, e))
            kwargs['default'] = []
            super(OptionParser.PkgNarrowCallback, self).__init__(*args, **kwargs)

        def __call__(self, parser, namespace, values, opt_str):
            dest_action = self.dest + '_action'
            if not values or values[0] not in self.pkgnarrow['choices']:
                narrow = self.pkgnarrow['default']
            else:
                narrow = values.pop(0)
            setattr(namespace, dest_action, narrow)
            setattr(namespace, self.dest, values)
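
    # Example: for a hypothetical `dnf list installed vim`, the 'packages'
    # positional uses this action, so namespace.packages_action becomes
    # 'installed' and namespace.packages becomes ['vim']; a plain `dnf list vim`
    # leaves the action at its default ('all').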

    class ForceArchAction(argparse.Action):
        def __call__(self, parser, namespace, values, opt_str):
            namespace.ignorearch = True
            namespace.arch = values

    def _add_general_options(self):
        """ Standard options known to all dnf subcommands. """
        # All defaults need to be None, so we can always tell whether the user
        # has set something or whether we are getting a default.
        general_grp = self.add_argument_group(_('General {prog} options'.format(
            prog=dnf.util.MAIN_PROG_UPPER)))
        general_grp.add_argument("-c", "--config", dest="config_file_path",
                                 default=None, metavar='[config file]',
                                 help=_("config file location"))
        general_grp.add_argument("-q", "--quiet", dest="quiet",
                                 action="store_true", default=None,
                                 help=_("quiet operation"))
        general_grp.add_argument("-v", "--verbose", action="store_true",
                                 default=None, help=_("verbose operation"))
        general_grp.add_argument("--version", action="store_true", default=None,
                                 help=_("show {prog} version and exit").format(
                                     prog=dnf.util.MAIN_PROG_UPPER))
        general_grp.add_argument("--installroot", help=_("set install root"),
                                 metavar='[path]')
        general_grp.add_argument("--nodocs", action="store_const", const=['nodocs'], dest='tsflags',
                                 help=_("do not install documentations"))
        general_grp.add_argument("--noplugins", action="store_false",
                                 default=None, dest='plugins',
                                 help=_("disable all plugins"))
        general_grp.add_argument("--enableplugin", dest="enableplugin",
                                 default=[], action=self._SplitCallback,
                                 help=_("enable plugins by name"),
                                 metavar='[plugin]')
        general_grp.add_argument("--disableplugin", dest="disableplugin",
                                 default=[], action=self._SplitCallback,
                                 help=_("disable plugins by name"),
                                 metavar='[plugin]')
        general_grp.add_argument("--releasever", default=None,
                                 help=_("override the value of $releasever"
                                        " in config and repo files"))
        general_grp.add_argument("--setopt", dest="setopts", default=[],
                                 action=self._SetoptsCallback,
                                 help=_("set arbitrary config and repo options"))
        general_grp.add_argument("--skip-broken", dest="skip_broken", action="store_true",
                                 default=None,
                                 help=_("resolve depsolve problems by skipping packages"))
        general_grp.add_argument('-h', '--help', '--help-cmd',
                                 action="store_true", dest='help',
                                 help=_("show command help"))

        general_grp.add_argument('--allowerasing', action='store_true',
                                 default=None,
                                 help=_('allow erasing of installed packages to '
                                        'resolve dependencies'))
        best_group = general_grp.add_mutually_exclusive_group()
        best_group.add_argument("-b", "--best", action="store_true", dest='best', default=None,
                                help=_("try the best available package versions in transactions."))
        best_group.add_argument("--nobest", action="store_false", dest='best',
                                help=_("do not limit the transaction to the best candidate"))
        general_grp.add_argument("-C", "--cacheonly", dest="cacheonly",
                                 action="store_true", default=None,
                                 help=_("run entirely from system cache, "
                                        "don't update cache"))
        general_grp.add_argument("-R", "--randomwait", dest="sleeptime", type=int,
                                 default=None, metavar='[minutes]',
                                 help=_("maximum command wait time"))
        general_grp.add_argument("-d", "--debuglevel", dest="debuglevel",
                                 metavar='[debug level]', default=None,
                                 help=_("debugging output level"), type=int)
        general_grp.add_argument("--debugsolver",
                                 action="store_true", default=None,
                                 help=_("dumps detailed solving results into"
                                        " files"))
        general_grp.add_argument("--showduplicates", dest="showdupesfromrepos",
                                 action="store_true", default=None,
                                 help=_("show duplicates, in repos, "
                                        "in list/search commands"))
        general_grp.add_argument("-e", "--errorlevel", default=None, type=int,
                                 help=_("error output level"))
        general_grp.add_argument("--obsoletes", default=None, dest="obsoletes",
                                 action="store_true",
                                 help=_("enables {prog}'s obsoletes processing logic "
                                        "for upgrade or display capabilities that "
                                        "the package obsoletes for info, list and "
                                        "repoquery").format(prog=dnf.util.MAIN_PROG))
        general_grp.add_argument("--rpmverbosity", default=None,
                                 help=_("debugging output level for rpm"),
                                 metavar='[debug level name]')
        general_grp.add_argument("-y", "--assumeyes", action="store_true",
                                 default=None, help=_("automatically answer yes"
                                                      " for all questions"))
        general_grp.add_argument("--assumeno", action="store_true",
                                 default=None, help=_("automatically answer no"
                                                      " for all questions"))
        general_grp.add_argument("--enablerepo", action=self._RepoCallback,
                                 dest='repos_ed', default=[], metavar='[repo]',
                                 help=_("Enable additional repositories. List option. "
                                        "Supports globs, can be specified multiple times."))
        repo_group = general_grp.add_mutually_exclusive_group()
        repo_group.add_argument("--disablerepo", action=self._RepoCallback,
                                dest='repos_ed', default=[], metavar='[repo]',
                                help=_("Disable repositories. List option. "
                                       "Supports globs, can be specified multiple times."))
        repo_group.add_argument('--repo', '--repoid', metavar='[repo]', dest='repo',
                                action=self._SplitCallback, default=[],
                                help=_('enable just specific repositories by an id or a glob, '
                                       'can be specified multiple times'))
        enable_group = general_grp.add_mutually_exclusive_group()
        enable_group.add_argument("--enable", default=False,
                                  dest="set_enabled", action="store_true",
                                  help=_("enable repos with config-manager "
                                         "command (automatically saves)"))
        enable_group.add_argument("--disable", default=False,
                                  dest="set_disabled", action="store_true",
                                  help=_("disable repos with config-manager "
                                         "command (automatically saves)"))
        general_grp.add_argument("-x", "--exclude", "--excludepkgs", default=[],
                                 dest='excludepkgs', action=self._SplitCallback,
                                 help=_("exclude packages by name or glob"),
                                 metavar='[package]')
        general_grp.add_argument("--disableexcludes", "--disableexcludepkgs",
                                 default=[], dest="disable_excludes",
                                 action=self._SplitCallback,
                                 help=_("disable excludepkgs"),
                                 metavar='[repo]')
        general_grp.add_argument("--repofrompath", default={},
                                 action=self._SplitExtendDictCallback,
                                 metavar='[repo,path]',
                                 help=_("label and path to an additional repository to use (same "
                                        "path as in a baseurl), can be specified multiple times."))
        general_grp.add_argument("--noautoremove", action="store_false",
                                 default=None, dest='clean_requirements_on_remove',
                                 help=_("disable removal of dependencies that are no longer used"))
        general_grp.add_argument("--nogpgcheck", action="store_false",
                                 default=None, dest='gpgcheck',
                                 help=_("disable gpg signature checking (if RPM policy allows)"))
        general_grp.add_argument("--color", dest="color", default=None,
                                 help=_("control whether color is used"))
        general_grp.add_argument("--refresh", dest="freshest_metadata",
                                 action="store_true",
                                 help=_("set metadata as expired before running"
                                        " the command"))
        general_grp.add_argument("-4", dest="ip_resolve", default=None,
                                 help=_("resolve to IPv4 addresses only"),
                                 action="store_const", const='ipv4')
        general_grp.add_argument("-6", dest="ip_resolve", default=None,
                                 help=_("resolve to IPv6 addresses only"),
                                 action="store_const", const='ipv6')
        general_grp.add_argument("--destdir", "--downloaddir", dest="destdir", default=None,
                                 help=_("set directory to copy packages to"))
        general_grp.add_argument("--downloadonly", dest="downloadonly",
                                 action="store_true", default=False,
                                 help=_("only download packages"))
        general_grp.add_argument("--comment", dest="comment", default=None,
                                 help=_("add a comment to transaction"))
        # Updateinfo options...
        general_grp.add_argument("--bugfix", action="store_true",
                                 help=_("Include bugfix relevant packages, "
                                        "in updates"))
        general_grp.add_argument("--enhancement", action="store_true",
                                 help=_("Include enhancement relevant packages,"
                                        " in updates"))
        general_grp.add_argument("--newpackage", action="store_true",
                                 help=_("Include newpackage relevant packages,"
                                        " in updates"))
        general_grp.add_argument("--security", action="store_true",
                                 help=_("Include security relevant packages, "
                                        "in updates"))
        general_grp.add_argument("--advisory", "--advisories", dest="advisory",
                                 default=[], action=self._SplitCallback,
                                 help=_("Include packages needed to fix the "
                                        "given advisory, in updates"))
        general_grp.add_argument("--bz", "--bzs", default=[], dest="bugzilla",
                                 action=self._SplitCallback, help=_(
                "Include packages needed to fix the given BZ, in updates"))
        general_grp.add_argument("--cve", "--cves", default=[], dest="cves",
                                 action=self._SplitCallback,
                                 help=_("Include packages needed to fix the given CVE, in updates"))
        general_grp.add_argument(
            "--sec-severity", "--secseverity",
            choices=['Critical', 'Important', 'Moderate', 'Low'], default=[],
            dest="severity", action=self._SplitCallback, help=_(
                "Include security relevant packages matching the severity, "
                "in updates"))
        general_grp.add_argument("--forcearch", metavar="ARCH",
                                 dest=argparse.SUPPRESS,
                                 action=self.ForceArchAction,
                                 choices=sorted(dnf.rpm._BASEARCH_MAP.keys()),
                                 help=_("Force the use of an architecture"))
        general_grp.add_argument('command', nargs='?', help=argparse.SUPPRESS)

    def _add_cmd_usage(self, cmd, group):
        """ store usage info about a single dnf command."""
        summary = dnf.i18n.ucd(cmd.summary)
        name = dnf.i18n.ucd(cmd.aliases[0])
        if name not in self._cmd_usage:
            self._cmd_usage[name] = (group, summary)
            self._cmd_groups.add(group)

    def add_commands(self, cli_cmds, group):
        """ store name & summary for dnf commands

        The stored information is used to build the usage information,
        grouped by built-in & plugin commands.
        """
        for cmd in set(cli_cmds.values()):
            self._add_cmd_usage(cmd, group)

    def get_usage(self):
        """ get the usage information to show the user. """
        desc = {'main': _('List of Main Commands:'),
                'plugin': _('List of Plugin Commands:')}
        usage = '%s [options] COMMAND\n' % dnf.util.MAIN_PROG
        for grp in ['main', 'plugin']:
            if grp not in self._cmd_groups:
                # don't add plugin usage if we don't have plugins
                continue
            usage += "\n%s\n\n" % desc[grp]
            for name in sorted(self._cmd_usage.keys()):
                group, summary = self._cmd_usage[name]
                if group == grp:
                    usage += "%-25s %s\n" % (name, summary)
        return usage

    def _add_command_options(self, command):
        self.prog = "%s %s" % (dnf.util.MAIN_PROG, command._basecmd)
        self.description = command.summary
        self.command_positional_parser = argparse.ArgumentParser(self.prog, add_help=False)
        self.command_positional_parser.print_usage = self.print_usage
        self.command_positional_parser._positionals.title = None
        self.command_group = self.add_argument_group(
            '{} command-specific options'.format(command._basecmd.capitalize()))
        self.command_group.add_argument = self.cmd_add_argument
        self.command_group._command = command._basecmd
        command.set_argparser(self.command_group)

    def cmd_add_argument(self, *args, **kwargs):
        if all([(arg[0] in self.prefix_chars) for arg in args]):
            return type(self.command_group).add_argument(self.command_group, *args, **kwargs)
        else:
            return self.command_positional_parser.add_argument(*args, **kwargs)

    def _check_encoding(self, args):
        for arg in args:
            try:
                arg.encode('utf-8')
            except UnicodeEncodeError as e:
                raise dnf.exceptions.ConfigError(
                    _("Cannot encode argument '%s': %s") % (arg, str(e)))

    def parse_main_args(self, args):
        self._check_encoding(args)
        namespace, _unused_args = self.parse_known_args(args)
        return namespace

    def parse_command_args(self, command, args):
        self._add_command_options(command)
        namespace, unused_args = self.parse_known_args(args)
        namespace = self.command_positional_parser.parse_args(unused_args, namespace)
        command.opts = namespace
        return command.opts

    def print_usage(self, file_=None):
        if self.command_positional_parser:
            self._actions += self.command_positional_parser._actions
        super(OptionParser, self).print_usage(file_)

    def print_help(self, command=None):
        # pylint: disable=W0212
        if command:
            if not self.command_group or self.command_group._command != command._basecmd:
                self._add_command_options(command)
            self._actions += self.command_positional_parser._actions
            self._action_groups.append(self.command_positional_parser._positionals)
        else:
            self.usage = self.get_usage()
        super(OptionParser, self).print_help()
site-packages/dnf/cli/aliases.py

# aliases.py
# Resolving aliases in CLI arguments.
#
# Copyright (C) 2018 Red Hat, Inc.
#
# This copyrighted material is made available to anyone wishing to use,
# modify, copy, or redistribute it subject to the terms and conditions of
# the GNU General Public License v.2, or (at your option) any later version.
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY expressed or implied, including the implied warranties of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General
# Public License for more details.  You should have received a copy of the
# GNU General Public License along with this program; if not, write to the
# Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
# 02110-1301, USA.  Any Red Hat trademarks that are incorporated in the
# source code or documentation are not subject to the GNU General Public
# License and may only be used or replicated with the express permission of
# Red Hat, Inc.
#

from __future__ import absolute_import
from __future__ import unicode_literals
from dnf.i18n import _

import collections
import dnf.cli
from dnf.conf.config import PRIO_DEFAULT
import dnf.exceptions
import libdnf.conf
import logging
import os
import os.path

logger = logging.getLogger('dnf')

ALIASES_DROPIN_DIR = '/etc/dnf/aliases.d/'
ALIASES_CONF_PATH = os.path.join(ALIASES_DROPIN_DIR, 'ALIASES.conf')
ALIASES_USER_PATH = os.path.join(ALIASES_DROPIN_DIR, 'USER.conf')


class AliasesConfig(object):
    def __init__(self, path):
        self._path = path
        self._parser = libdnf.conf.ConfigParser()
        self._parser.read(self._path)

    @property
    def enabled(self):
        option = libdnf.conf.OptionBool(True)
        try:
            option.set(PRIO_DEFAULT, self._parser.getData()["main"]["enabled"])
        except IndexError:
            pass
        return option.getValue()

    @property
    def aliases(self):
        result = collections.OrderedDict()
        section = "aliases"
        if not self._parser.hasSection(section):
            return result
        for key in self._parser.options(section):
            value = self._parser.getValue(section, key)
            if not value:
                continue
            result[key] = value.split()
        return result
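    # Illustrative drop-in file (hypothetical content, not shipped by dnf),
    # e.g. /etc/dnf/aliases.d/example.conf, as read by this class:
    #
    #   [main]
    #   enabled = 1
    #
    #   [aliases]
    #   up = upgrade --refresh
    #   rm = remove
    #
    # Each value is whitespace-split, so the 'aliases' property above would
    # return OrderedDict([('up', ['upgrade', '--refresh']), ('rm', ['remove'])]).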


class Aliases(object):
    def __init__(self):
        self.aliases = collections.OrderedDict()
        self.conf = None
        self.enabled = True

        if self._disabled_by_environ():
            self.enabled = False
            return

        self._load_main()

        if not self.enabled:
            return

        self._load_aliases()

    def _disabled_by_environ(self):
        option = libdnf.conf.OptionBool(True)
        try:
            option.set(PRIO_DEFAULT, os.environ['DNF_DISABLE_ALIASES'])
            return option.getValue()
        except KeyError:
            return False
        except RuntimeError:
            logger.warning(
                _('Unexpected value of environment variable: '
                  'DNF_DISABLE_ALIASES=%s'), os.environ['DNF_DISABLE_ALIASES'])
            return True

    def _load_conf(self, path):
        try:
            return AliasesConfig(path)
        except RuntimeError as e:
            raise dnf.exceptions.ConfigError(
                _('Parsing file "%s" failed: %s') % (path, e))
        except IOError as e:
            raise dnf.exceptions.ConfigError(
                _('Cannot read file "%s": %s') % (path, e))

    def _load_main(self):
        try:
            self.conf = self._load_conf(ALIASES_CONF_PATH)
            self.enabled = self.conf.enabled
        except dnf.exceptions.ConfigError as e:
            logger.debug(_('Config error: %s'), e)

    def _load_aliases(self, filenames=None):
        if filenames is None:
            try:
                filenames = self._dropin_dir_filenames()
            except dnf.exceptions.ConfigError:
                return
        for filename in filenames:
            try:
                conf = self._load_conf(filename)
                if conf.enabled:
                    self.aliases.update(conf.aliases)
            except dnf.exceptions.ConfigError as e:
                logger.warning(_('Config error: %s'), e)

    def _dropin_dir_filenames(self):
        # Get default aliases config filenames:
        #   all files from ALIASES_DROPIN_DIR,
        #   and ALIASES_USER_PATH as the last one (-> override all others)
        ignored_filenames = [os.path.basename(ALIASES_CONF_PATH),
                             os.path.basename(ALIASES_USER_PATH)]

        def _ignore_filename(filename):
            return filename in ignored_filenames or\
                filename.startswith('.') or\
                not filename.endswith(('.conf', '.CONF'))

        filenames = []
        try:
            if not os.path.exists(ALIASES_DROPIN_DIR):
                os.mkdir(ALIASES_DROPIN_DIR)
            for fn in sorted(os.listdir(ALIASES_DROPIN_DIR)):
                if _ignore_filename(fn):
                    continue
                filenames.append(os.path.join(ALIASES_DROPIN_DIR, fn))
        except (IOError, OSError) as e:
            raise dnf.exceptions.ConfigError(e)
        if os.path.exists(ALIASES_USER_PATH):
            filenames.append(ALIASES_USER_PATH)
        return filenames

    def _resolve(self, args):
        stack = []
        self.prefix_options = []

        def store_prefix(args):
            num = 0
            for arg in args:
                if arg and arg[0] != '-':
                    break
                num += 1

            self.prefix_options += args[:num]

            return args[num:]

        def subresolve(args):
            suffix = store_prefix(args)

            if (not suffix or  # Current alias on stack is resolved
                    suffix[0] not in self.aliases or  # End resolving
                    suffix[0].startswith('\\')):  # End resolving
                try:
                    stack.pop()

                    # strip the '\' if it exists
                    if suffix[0].startswith('\\'):
                        suffix[0] = suffix[0][1:]
                except IndexError:
                    pass

                return suffix

            if suffix[0] in stack:  # Infinite recursion detected
                raise dnf.exceptions.Error(
                    _('Aliases contain infinite recursion'))

            # Next word must be an alias
            stack.append(suffix[0])
            current_alias_result = subresolve(self.aliases[suffix[0]])
            if current_alias_result:  # We reached non-alias or '\'
                return current_alias_result + suffix[1:]
            else:  # Need to resolve aliases in the rest
                return subresolve(suffix[1:])

        suffix = subresolve(args)
        return self.prefix_options + suffix
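    # A worked example (hypothetical aliases, for illustration only): with
    # self.aliases == {'upgrade-all': ['up', '--refresh'], 'up': ['upgrade', '-y']},
    # _resolve(['-q', 'upgrade-all', 'foo']) strips the leading option '-q' into
    # self.prefix_options, expands 'upgrade-all' -> 'up' -> 'upgrade -y', and
    # returns ['-q', 'upgrade', '-y', '--refresh', 'foo'].  An argument escaped
    # with a leading backslash stops the expansion (the backslash is stripped),
    # and a cycle in the alias definitions raises dnf.exceptions.Error.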

    def resolve(self, args):
        if self.enabled:
            try:
                args = self._resolve(args)
            except dnf.exceptions.Error as e:
                logger.error(_('%s, using original arguments.'), e)
        return args
site-packages/dnf/cli/cli.py
# Copyright 2005 Duke University
# Copyright (C) 2012-2016 Red Hat, Inc.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU Library General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
#
# Written by Seth Vidal

"""
Command line interface yum class and related.
"""

from __future__ import print_function
from __future__ import absolute_import
from __future__ import unicode_literals

try:
    from collections.abc import Sequence
except ImportError:
    from collections import Sequence
import datetime
import logging
import operator
import os
import random
import rpm
import sys
import time

import hawkey
import libdnf.transaction

from . import output
from dnf.cli import CliError
from dnf.i18n import ucd, _
import dnf
import dnf.cli.aliases
import dnf.cli.commands
import dnf.cli.commands.alias
import dnf.cli.commands.autoremove
import dnf.cli.commands.check
import dnf.cli.commands.clean
import dnf.cli.commands.deplist
import dnf.cli.commands.distrosync
import dnf.cli.commands.downgrade
import dnf.cli.commands.group
import dnf.cli.commands.history
import dnf.cli.commands.install
import dnf.cli.commands.makecache
import dnf.cli.commands.mark
import dnf.cli.commands.module
import dnf.cli.commands.reinstall
import dnf.cli.commands.remove
import dnf.cli.commands.repolist
import dnf.cli.commands.repoquery
import dnf.cli.commands.search
import dnf.cli.commands.shell
import dnf.cli.commands.swap
import dnf.cli.commands.updateinfo
import dnf.cli.commands.upgrade
import dnf.cli.commands.upgrademinimal
import dnf.cli.demand
import dnf.cli.format
import dnf.cli.option_parser
import dnf.cli.progress
import dnf.conf
import dnf.conf.substitutions
import dnf.const
import dnf.db.history
import dnf.exceptions
import dnf.logging
import dnf.persistor
import dnf.plugin
import dnf.rpm
import dnf.sack
import dnf.transaction
import dnf.util
import dnf.yum.misc

logger = logging.getLogger('dnf')


def _add_pkg_simple_list_lens(data, pkg, indent=''):
    """ Get the length of each pkg's column. Add that to data.
        This "knows" about simpleList and printVer. """
    na = len(pkg.name) + 1 + len(pkg.arch) + len(indent)
    ver = len(pkg.evr)
    rid = len(pkg._from_repo)
    for (d, v) in (('na', na), ('ver', ver), ('rid', rid)):
        data[d].setdefault(v, 0)
        data[d][v] += 1


def _list_cmd_calc_columns(output, ypl):
    """ Work out the dynamic size of the columns to pass to fmtColumns. """
    data = {'na' : {}, 'ver' : {}, 'rid' : {}}
    for lst in (ypl.installed, ypl.available, ypl.extras, ypl.autoremove,
                ypl.updates, ypl.recent):
        for pkg in lst:
            _add_pkg_simple_list_lens(data, pkg)
    if len(ypl.obsoletes) > 0:
        for (npkg, opkg) in ypl.obsoletesTuples:
            _add_pkg_simple_list_lens(data, npkg)
            _add_pkg_simple_list_lens(data, opkg, indent=" " * 4)

    data = [data['na'], data['ver'], data['rid']]
    columns = output.calcColumns(data, remainder_column=1)
    return (-columns[0], -columns[1], -columns[2])
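# Note: each data[d] above is a histogram {observed column width: package count};
# output.calcColumns condenses the three histograms into three concrete widths,
# which are negated before being handed to listPkgs/fmtColumns below.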


def print_versions(pkgs, base, output):
    def sm_ui_time(x):
        return time.strftime("%c", time.gmtime(x))

    rpmdb_sack = dnf.sack.rpmdb_sack(base)
    done = False
    for pkg in rpmdb_sack.query().installed().filterm(name=pkgs):
        if done:
            print("")
        done = True
        if pkg.epoch == '0':
            ver = '%s-%s.%s' % (pkg.version, pkg.release, pkg.arch)
        else:
            ver = '%s:%s-%s.%s' % (pkg.epoch,
                                   pkg.version, pkg.release, pkg.arch)
        name = output.term.bold(pkg.name)
        print(_("  Installed: %s-%s at %s") %(name, ver,
                                              sm_ui_time(pkg.installtime)))
        print(_("  Built    : %s at %s") % (pkg.packager if pkg.packager else "",
                                            sm_ui_time(pkg.buildtime)))
        # :hawkey, no changelist information yet
        # print(_("  Committed: %s at %s") % (pkg.committer,
        #                                    sm_ui_date(pkg.committime)))


def report_module_switch(switchedModules):
    msg1 = _("The operation would result in switching of module '{0}' stream '{1}' to "
             "stream '{2}'")
    for moduleName, streams in switchedModules.items():
        logger.warning(msg1.format(moduleName, streams[0], streams[1]))


class BaseCli(dnf.Base):
    """This is the base class for yum cli."""

    def __init__(self, conf=None):
        conf = conf or dnf.conf.Conf()
        super(BaseCli, self).__init__(conf=conf)
        self.output = output.Output(self, self.conf)

    def do_transaction(self, display=()):
        """Take care of package downloading, checking, user
        confirmation and actually running the transaction.

        :param display: `rpm.callback.TransactionProgress` object(s)
        :return: history database transaction ID or None
        """
        if dnf.base.WITH_MODULES:
            if not self.conf.module_stream_switch:
                switchedModules = dict(self._moduleContainer.getSwitchedStreams())
                if switchedModules:
                    report_module_switch(switchedModules)
                    msg = _("It is not possible to switch enabled streams of a module unless explicitly "
                            "enabled via configuration option module_stream_switch.\n"
                            "It is recommended to rather remove all installed content from the module, and "
                            "reset the module using '{prog} module reset <module_name>' command. After "
                            "you reset the module, you can install the other stream.").format(
                        prog=dnf.util.MAIN_PROG)
                    raise dnf.exceptions.Error(msg)

        trans = self.transaction
        pkg_str = self.output.list_transaction(trans)
        if pkg_str:
            logger.info(pkg_str)

        if trans:
            # Check which packages have to be downloaded
            install_pkgs = []
            rmpkgs = []
            install_only = True
            for tsi in trans:
                if tsi.action in dnf.transaction.FORWARD_ACTIONS:
                    install_pkgs.append(tsi.pkg)
                elif tsi.action in dnf.transaction.BACKWARD_ACTIONS:
                    install_only = False
                    rmpkgs.append(tsi.pkg)

            # Close the connection to the rpmdb so that rpm doesn't hold the
            # SIGINT handler during the downloads.
            del self._ts

            # report the total download size to the user
            if not install_pkgs:
                self.output.reportRemoveSize(rmpkgs)
            else:
                self.output.reportDownloadSize(install_pkgs, install_only)

        if trans or self._moduleContainer.isChanged() or \
                (self._history and (self._history.group or self._history.env)):
            # confirm with user
            if self.conf.downloadonly:
                logger.info(_("{prog} will only download packages for the transaction.").format(
                    prog=dnf.util.MAIN_PROG_UPPER))
            elif 'test' in self.conf.tsflags:
                logger.info(_("{prog} will only download packages, install gpg keys, and check the "
                              "transaction.").format(prog=dnf.util.MAIN_PROG_UPPER))
            if self._promptWanted():
                if self.conf.assumeno or not self.output.userconfirm():
                    raise CliError(_("Operation aborted."))
        else:
            logger.info(_('Nothing to do.'))
            return

        if trans:
            if install_pkgs:
                logger.info(_('Downloading Packages:'))
                try:
                    total_cb = self.output.download_callback_total_cb
                    self.download_packages(install_pkgs, self.output.progress, total_cb)
                except dnf.exceptions.DownloadError as e:
                    specific = dnf.cli.format.indent_block(ucd(e))
                    errstr = _('Error downloading packages:') + '\n%s' % specific
                    # setting the new line to prevent next chars being eaten up
                    # by carriage returns
                    print()
                    raise dnf.exceptions.Error(errstr)
            # Check GPG signatures
            self.gpgsigcheck(install_pkgs)

        if self.conf.downloadonly:
            return

        if not isinstance(display, Sequence):
            display = [display]
        display = [output.CliTransactionDisplay()] + list(display)
        tid = super(BaseCli, self).do_transaction(display)

        # display last transaction (which was closed during do_transaction())
        if tid is not None:
            trans = self.history.old([tid])[0]
            trans = dnf.db.group.RPMTransaction(self.history, trans._trans)
        else:
            trans = None

        if trans:
            # the post transaction summary is already written to log during
            # Base.do_transaction() so here only print the messages to the
            # user arranged in columns
            print()
            print('\n'.join(self.output.post_transaction_output(trans)))
            print()
            for tsi in trans:
                if tsi.state == libdnf.transaction.TransactionItemState_ERROR:
                    raise dnf.exceptions.Error(_('Transaction failed'))

        return tid

    def gpgsigcheck(self, pkgs):
        """Perform GPG signature verification on the given packages,
        installing keys if possible.

        :param pkgs: a list of package objects to verify the GPG
           signatures of
        :raises: Will raise :class:`Error` if there's a problem
        """
        error_messages = []
        for po in pkgs:
            result, errmsg = self._sig_check_pkg(po)

            if result == 0:
                # Verified ok, or verify not req'd
                continue

            elif result == 1:
                ay = self.conf.assumeyes and not self.conf.assumeno
                if (not sys.stdin or not sys.stdin.isatty()) and not ay:
                    raise dnf.exceptions.Error(_('Refusing to automatically import keys when running ' \
                            'unattended.\nUse "-y" to override.'))

                # the callback here expects to be able to take options which
                # userconfirm really doesn't... so fake it
                fn = lambda x, y, z: self.output.userconfirm()
                try:
                    self._get_key_for_package(po, fn)
                except (dnf.exceptions.Error, ValueError) as e:
                    error_messages.append(str(e))

            else:
                # Fatal error
                error_messages.append(errmsg)

        if error_messages:
            for msg in error_messages:
                logger.critical(msg)
            raise dnf.exceptions.Error(_("GPG check FAILED"))

    def latest_changelogs(self, package):
        """Return list of changelogs for package newer then installed version"""
        newest = None
        # find the date of the newest changelog for installed version of package
        # stored in rpmdb
        for mi in self._rpmconn.readonly_ts.dbMatch('name', package.name):
            changelogtimes = mi[rpm.RPMTAG_CHANGELOGTIME]
            if changelogtimes:
                newest = datetime.date.fromtimestamp(changelogtimes[0])
                break
        chlogs = [chlog for chlog in package.changelogs
                  if newest is None or chlog['timestamp'] > newest]
        return chlogs

    def format_changelog(self, changelog):
        """Return changelog formatted as in spec file"""
        chlog_str = '* %s %s\n%s\n' % (
            changelog['timestamp'].strftime("%a %b %d %X %Y"),
            dnf.i18n.ucd(changelog['author']),
            dnf.i18n.ucd(changelog['text']))
        return chlog_str
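    # Illustrative output (made-up changelog entry):
    #   * Tue Feb 06 12:00:00 2018 Jane Doe <jane@example.com> - 1.0-2
    #   - Fix the frobnicator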

    def print_changelogs(self, packages):
        # group packages by src.rpm to avoid showing duplicate changelogs
        bysrpm = dict()
        for p in packages:
            # some packages have no source_name; fall back to the package name then.
            bysrpm.setdefault(p.source_name or p.name, []).append(p)
        for source_name in sorted(bysrpm.keys()):
            bin_packages = bysrpm[source_name]
            print(_("Changelogs for {}").format(', '.join([str(pkg) for pkg in bin_packages])))
            for chl in self.latest_changelogs(bin_packages[0]):
                print(self.format_changelog(chl))

    def check_updates(self, patterns=(), reponame=None, print_=True, changelogs=False):
        """Check updates matching given *patterns* in selected repository."""
        ypl = self.returnPkgLists('upgrades', patterns, reponame=reponame)
        if self.conf.obsoletes or self.conf.verbose:
            typl = self.returnPkgLists('obsoletes', patterns, reponame=reponame)
            ypl.obsoletes = typl.obsoletes
            ypl.obsoletesTuples = typl.obsoletesTuples

        if print_:
            columns = _list_cmd_calc_columns(self.output, ypl)
            if len(ypl.updates) > 0:
                local_pkgs = {}
                highlight = self.output.term.MODE['bold']
                if highlight:
                    # Do the local/remote split we get in "yum updates"
                    for po in sorted(ypl.updates):
                        local = po.localPkg()
                        if os.path.exists(local) and po.verifyLocalPkg():
                            local_pkgs[(po.name, po.arch)] = po

                cul = self.conf.color_update_local
                cur = self.conf.color_update_remote
                self.output.listPkgs(ypl.updates, '', outputType='list',
                              highlight_na=local_pkgs, columns=columns,
                              highlight_modes={'=' : cul, 'not in' : cur})
                if changelogs:
                    self.print_changelogs(ypl.updates)

            if len(ypl.obsoletes) > 0:
                print(_('Obsoleting Packages'))
                # The tuple is (newPkg, oldPkg) ... so sort by new
                for obtup in sorted(ypl.obsoletesTuples,
                                    key=operator.itemgetter(0)):
                    self.output.updatesObsoletesList(obtup, 'obsoletes',
                                                     columns=columns)

        return ypl.updates or ypl.obsoletes

    def distro_sync_userlist(self, userlist):
        """ Upgrade or downgrade packages to match the latest versions available
            in the enabled repositories.

            :return: (exit_code, [ errors ])

            exit_code is::
                0 = we're done, exit
                1 = we've errored, exit with error string
                2 = we've got work yet to do, onto the next stage
        """
        oldcount = self._goal.req_length()
        if len(userlist) == 0:
            self.distro_sync()
        else:
            for pkg_spec in userlist:
                self.distro_sync(pkg_spec)

        cnt = self._goal.req_length() - oldcount
        if cnt <= 0 and not self._goal.req_has_distupgrade_all():
            msg = _('No packages marked for distribution synchronization.')
            raise dnf.exceptions.Error(msg)

    def downgradePkgs(self, specs=[], file_pkgs=[], strict=False):
        """Attempt to take the user specified list of packages or
        wildcards and downgrade them. If a complete version number is
        specified, attempt to downgrade them to the specified version

        :param specs: a list of names or wildcards specifying packages to downgrade
        :param file_pkgs: a list of pkg objects from local files
        """

        result = False
        for pkg in file_pkgs:
            try:
                self.package_downgrade(pkg, strict=strict)
                result = True
            except dnf.exceptions.MarkingError as e:
                logger.info(_('No match for argument: %s'),
                            self.output.term.bold(pkg.location))

        for arg in specs:
            try:
                self.downgrade_to(arg, strict=strict)
                result = True
            except dnf.exceptions.PackageNotFoundError as err:
                msg = _('No package %s available.')
                logger.info(msg, self.output.term.bold(arg))
            except dnf.exceptions.PackagesNotInstalledError as err:
                logger.info(_('Packages for argument %s available, but not installed.'),
                            self.output.term.bold(err.pkg_spec))
            except dnf.exceptions.MarkingError:
                assert False

        if not result:
            raise dnf.exceptions.Error(_('No packages marked for downgrade.'))

    def output_packages(self, basecmd, pkgnarrow='all', patterns=(), reponame=None):
        """Output selection *pkgnarrow* of packages matching *patterns* and *repoid*."""
        try:
            highlight = self.output.term.MODE['bold']
            ypl = self.returnPkgLists(
                pkgnarrow, patterns, installed_available=highlight, reponame=reponame)
        except dnf.exceptions.Error as e:
            return 1, [str(e)]
        else:
            update_pkgs = {}
            inst_pkgs = {}
            local_pkgs = {}

            columns = None
            if basecmd == 'list':
                # Dynamically size the columns
                columns = _list_cmd_calc_columns(self.output, ypl)

            if highlight and ypl.installed:
                #  If we have installed and available lists, then do the
                # highlighting for the installed packages so you can see what's
                # available to update, an extra, or newer than what we have.
                for pkg in (ypl.hidden_available +
                            ypl.reinstall_available +
                            ypl.old_available):
                    key = (pkg.name, pkg.arch)
                    if key not in update_pkgs or pkg > update_pkgs[key]:
                        update_pkgs[key] = pkg

            if highlight and ypl.available:
                #  If we have installed and available lists, then do the
                # highlighting for the available packages so you can see what's
                # available to install vs. update vs. old.
                for pkg in ypl.hidden_installed:
                    key = (pkg.name, pkg.arch)
                    if key not in inst_pkgs or pkg > inst_pkgs[key]:
                        inst_pkgs[key] = pkg

            if highlight and ypl.updates:
                # Do the local/remote split we get in "yum updates"
                for po in sorted(ypl.updates):
                    if po.reponame != hawkey.SYSTEM_REPO_NAME:
                        local_pkgs[(po.name, po.arch)] = po

            # Output the packages:
            clio = self.conf.color_list_installed_older
            clin = self.conf.color_list_installed_newer
            clir = self.conf.color_list_installed_reinstall
            clie = self.conf.color_list_installed_extra
            rip = self.output.listPkgs(ypl.installed, _('Installed Packages'), basecmd,
                                highlight_na=update_pkgs, columns=columns,
                                highlight_modes={'>' : clio, '<' : clin,
                                                 '=' : clir, 'not in' : clie})
            clau = self.conf.color_list_available_upgrade
            clad = self.conf.color_list_available_downgrade
            clar = self.conf.color_list_available_reinstall
            clai = self.conf.color_list_available_install
            rap = self.output.listPkgs(ypl.available, _('Available Packages'), basecmd,
                                highlight_na=inst_pkgs, columns=columns,
                                highlight_modes={'<' : clau, '>' : clad,
                                                 '=' : clar, 'not in' : clai})
            raep = self.output.listPkgs(ypl.autoremove, _('Autoremove Packages'),
                                basecmd, columns=columns)
            rep = self.output.listPkgs(ypl.extras, _('Extra Packages'), basecmd,
                                columns=columns)
            cul = self.conf.color_update_local
            cur = self.conf.color_update_remote
            rup = self.output.listPkgs(ypl.updates, _('Available Upgrades'), basecmd,
                                highlight_na=local_pkgs, columns=columns,
                                highlight_modes={'=' : cul, 'not in' : cur})

            # XXX put this into the ListCommand at some point
            if len(ypl.obsoletes) > 0 and basecmd == 'list':
            # if we've looked up obsolete lists and it's a list request
                rop = len(ypl.obsoletes)
                print(_('Obsoleting Packages'))
                for obtup in sorted(ypl.obsoletesTuples,
                                    key=operator.itemgetter(0)):
                    self.output.updatesObsoletesList(obtup, 'obsoletes',
                                                     columns=columns)
            else:
                rop = self.output.listPkgs(ypl.obsoletes, _('Obsoleting Packages'),
                                    basecmd, columns=columns)
            rrap = self.output.listPkgs(ypl.recent, _('Recently Added Packages'),
                                 basecmd, columns=columns)
            if len(patterns) and \
                    rrap == 0 and rop == 0 and rup == 0 and rep == 0 and rap == 0 and raep == 0 and rip == 0:
                raise dnf.exceptions.Error(_('No matching Packages to list'))

    def returnPkgLists(self, pkgnarrow='all', patterns=None,
                       installed_available=False, reponame=None):
        """Return a :class:`dnf.yum.misc.GenericHolder` object containing
        lists of package objects that match the given names or wildcards.

        :param pkgnarrow: a string specifying which types of packages
           lists to produce, such as updates, installed, available, etc.
        :param patterns: a list of names or wildcards specifying
           packages to list
        :param installed_available: whether the available package list
           is present as .hidden_available when doing all, available,
           or installed
        :param reponame: limit packages list to the given repository

        :return: a :class:`dnf.yum.misc.GenericHolder` instance with the
           following lists defined::

             available = list of packageObjects
             installed = list of packageObjects
             upgrades = tuples of packageObjects (updating, installed)
             extras = list of packageObjects
             obsoletes = tuples of packageObjects (obsoleting, installed)
             recent = list of packageObjects
        """

        done_hidden_available = False
        done_hidden_installed = False
        if installed_available and pkgnarrow == 'installed':
            done_hidden_available = True
            pkgnarrow = 'all'
        elif installed_available and pkgnarrow == 'available':
            done_hidden_installed = True
            pkgnarrow = 'all'

        ypl = self._do_package_lists(
            pkgnarrow, patterns, ignore_case=True, reponame=reponame)
        if self.conf.showdupesfromrepos:
            for pkg in ypl.reinstall_available:
                if not pkg.installed and not done_hidden_available:
                    ypl.available.append(pkg)

        if installed_available:
            ypl.hidden_available = ypl.available
            ypl.hidden_installed = ypl.installed
        if done_hidden_available:
            ypl.available = []
        if done_hidden_installed:
            ypl.installed = []
        return ypl

    def provides(self, args):
        """Print out a list of packages that provide the given file or
        feature.  This a cli wrapper to the provides methods in the
        rpmdb and pkgsack.

        :param args: the name of a file or feature to search for
        :return: (exit_code, [ errors ])

        exit_code is::

            0 = we're done, exit
            1 = we've errored, exit with error string
            2 = we've got work yet to do, onto the next stage
        """
        # always in showdups mode
        old_sdup = self.conf.showdupesfromrepos
        self.conf.showdupesfromrepos = True

        matches = []
        used_search_strings = []
        for spec in args:
            query, used_search_string = super(BaseCli, self).provides(spec)
            matches.extend(query)
            used_search_strings.extend(used_search_string)
        for pkg in sorted(matches):
            self.output.matchcallback_verbose(pkg, used_search_strings, args)
        self.conf.showdupesfromrepos = old_sdup

        if not matches:
            raise dnf.exceptions.Error(_('No Matches found'))

    def _promptWanted(self):
        # shortcut for the always-off/always-on options
        if self.conf.assumeyes and not self.conf.assumeno:
            return False
        return True


class Cli(object):
    def __init__(self, base):
        self.base = base
        self.cli_commands = {}
        self.command = None
        self.demands = dnf.cli.demand.DemandSheet()  # :api

        self.register_command(dnf.cli.commands.alias.AliasCommand)
        self.register_command(dnf.cli.commands.autoremove.AutoremoveCommand)
        self.register_command(dnf.cli.commands.check.CheckCommand)
        self.register_command(dnf.cli.commands.clean.CleanCommand)
        self.register_command(dnf.cli.commands.distrosync.DistroSyncCommand)
        self.register_command(dnf.cli.commands.deplist.DeplistCommand)
        self.register_command(dnf.cli.commands.downgrade.DowngradeCommand)
        self.register_command(dnf.cli.commands.group.GroupCommand)
        self.register_command(dnf.cli.commands.history.HistoryCommand)
        self.register_command(dnf.cli.commands.install.InstallCommand)
        self.register_command(dnf.cli.commands.makecache.MakeCacheCommand)
        self.register_command(dnf.cli.commands.mark.MarkCommand)
        self.register_command(dnf.cli.commands.module.ModuleCommand)
        self.register_command(dnf.cli.commands.reinstall.ReinstallCommand)
        self.register_command(dnf.cli.commands.remove.RemoveCommand)
        self.register_command(dnf.cli.commands.repolist.RepoListCommand)
        self.register_command(dnf.cli.commands.repoquery.RepoQueryCommand)
        self.register_command(dnf.cli.commands.search.SearchCommand)
        self.register_command(dnf.cli.commands.shell.ShellCommand)
        self.register_command(dnf.cli.commands.swap.SwapCommand)
        self.register_command(dnf.cli.commands.updateinfo.UpdateInfoCommand)
        self.register_command(dnf.cli.commands.upgrade.UpgradeCommand)
        self.register_command(dnf.cli.commands.upgrademinimal.UpgradeMinimalCommand)
        self.register_command(dnf.cli.commands.InfoCommand)
        self.register_command(dnf.cli.commands.ListCommand)
        self.register_command(dnf.cli.commands.ProvidesCommand)
        self.register_command(dnf.cli.commands.CheckUpdateCommand)
        self.register_command(dnf.cli.commands.RepoPkgsCommand)
        self.register_command(dnf.cli.commands.HelpCommand)

    def _configure_repos(self, opts):
        self.base.read_all_repos(opts)
        if opts.repofrompath:
            for label, path in opts.repofrompath.items():
                this_repo = self.base.repos.add_new_repo(label, self.base.conf, baseurl=[path])
                this_repo._configure_from_options(opts)
                # do not let this repo to be disabled
                opts.repos_ed.append((label, "enable"))

        if opts.repo:
            opts.repos_ed.insert(0, ("*", "disable"))
            opts.repos_ed.extend([(r, "enable") for r in opts.repo])

        notmatch = set()

        # Process repo enables and disables in order
        try:
            for (repo, operation) in opts.repos_ed:
                repolist = self.base.repos.get_matching(repo)
                if not repolist:
                    if self.base.conf.strict and operation == "enable":
                        msg = _("Unknown repo: '%s'")
                        raise dnf.exceptions.RepoError(msg % repo)
                    notmatch.add(repo)

                if operation == "enable":
                    repolist.enable()
                else:
                    repolist.disable()
        except dnf.exceptions.ConfigError as e:
            logger.critical(e)
            self.optparser.print_help()
            sys.exit(1)

        for repo in notmatch:
            logger.warning(_("No repository match: %s"), repo)

        expired_repos = self.base._repo_persistor.get_expired_repos()
        if expired_repos is None:
            expired_repos = self.base.repos.keys()
        for rid in expired_repos:
            repo = self.base.repos.get(rid)
            if repo:
                repo._repo.expire()

        # setup the progress bars/callbacks
        (bar, self.base._ds_callback) = self.base.output.setup_progress_callbacks()
        self.base.repos.all().set_progress_bar(bar)
        key_import = output.CliKeyImport(self.base, self.base.output)
        self.base.repos.all()._set_key_import(key_import)

    def _log_essentials(self):
        logger.debug('{prog} version: %s'.format(prog=dnf.util.MAIN_PROG_UPPER),
                     dnf.const.VERSION)
        logger.log(dnf.logging.DDEBUG,
                        'Command: %s', self.cmdstring)
        logger.log(dnf.logging.DDEBUG,
                        'Installroot: %s', self.base.conf.installroot)
        logger.log(dnf.logging.DDEBUG, 'Releasever: %s',
                        self.base.conf.releasever)
        logger.debug("cachedir: %s", self.base.conf.cachedir)

    def _process_demands(self):
        demands = self.demands
        repos = self.base.repos

        if demands.root_user:
            if not dnf.util.am_i_root():
                raise dnf.exceptions.Error(
                    _('This command has to be run with superuser privileges '
                        '(under the root user on most systems).'))

        if demands.changelogs:
            for repo in repos.iter_enabled():
                repo.load_metadata_other = True

        if demands.cacheonly or self.base.conf.cacheonly:
            self.base.conf.cacheonly = True
            for repo in repos.values():
                repo._repo.setSyncStrategy(dnf.repo.SYNC_ONLY_CACHE)
        else:
            if demands.freshest_metadata:
                for repo in repos.iter_enabled():
                    repo._repo.expire()
            elif not demands.fresh_metadata:
                for repo in repos.values():
                    repo._repo.setSyncStrategy(dnf.repo.SYNC_LAZY)

        if demands.sack_activation:
            self.base.fill_sack(
                load_system_repo='auto' if self.demands.load_system_repo else False,
                load_available_repos=self.demands.available_repos)

    def _parse_commands(self, opts, args):
        """Check that the requested CLI command exists."""

        basecmd = opts.command
        command_cls = self.cli_commands.get(basecmd)
        if command_cls is None:
            logger.critical(_('No such command: %s. Please use %s --help'),
                            basecmd, sys.argv[0])
            if self.base.conf.plugins:
                logger.critical(_("It could be a {PROG} plugin command, "
                                  "try: \"{prog} install 'dnf-command(%s)'\"").format(
                    prog=dnf.util.MAIN_PROG, PROG=dnf.util.MAIN_PROG_UPPER), basecmd)
            else:
                logger.critical(_("It could be a {prog} plugin command, "
                                  "but loading of plugins is currently disabled.").format(
                    prog=dnf.util.MAIN_PROG_UPPER))
            raise CliError
        self.command = command_cls(self)

        logger.log(dnf.logging.DDEBUG, 'Base command: %s', basecmd)
        logger.log(dnf.logging.DDEBUG, 'Extra commands: %s', args)

    def configure(self, args, option_parser=None):
        """Parse command line arguments, and set up :attr:`self.base.conf` and
        :attr:`self.cmds`, as well as logger objects in base instance.

        :param args: a list of command line arguments
        :param option_parser: a class for parsing cli options
        """
        aliases = dnf.cli.aliases.Aliases()
        args = aliases.resolve(args)

        self.optparser = dnf.cli.option_parser.OptionParser() \
            if option_parser is None else option_parser
        opts = self.optparser.parse_main_args(args)

        # Just print out the version if that's what the user wanted
        if opts.version:
            print(dnf.const.VERSION)
            print_versions(self.base.conf.history_record_packages, self.base,
                           self.base.output)
            sys.exit(0)

        if opts.quiet:
            opts.debuglevel = 0
            opts.errorlevel = 2
        if opts.verbose:
            opts.debuglevel = opts.errorlevel = dnf.const.VERBOSE_LEVEL

        # Read up configuration options and initialize plugins
        try:
            if opts.cacheonly:
                self.base.conf._set_value("cachedir", self.base.conf.system_cachedir,
                                          dnf.conf.PRIO_DEFAULT)
                self.demands.cacheonly = True
            self.base.conf._configure_from_options(opts)
            self._read_conf_file(opts.releasever)
            if 'arch' in opts:
                self.base.conf.arch = opts.arch
            self.base.conf._adjust_conf_options()
        except (dnf.exceptions.ConfigError, ValueError) as e:
            logger.critical(_('Config error: %s'), e)
            sys.exit(1)
        except IOError as e:
            e = '%s: %s' % (ucd(str(e)), repr(e.filename))
            logger.critical(_('Config error: %s'), e)
            sys.exit(1)
        if opts.destdir is not None:
            self.base.conf.destdir = opts.destdir
            if not self.base.conf.downloadonly and opts.command not in (
                    'download', 'system-upgrade', 'reposync', 'modulesync'):
                logger.critical(_('--destdir or --downloaddir must be used with --downloadonly '
                                  'or download or system-upgrade command.')
                )
                sys.exit(1)
        if (opts.set_enabled or opts.set_disabled) and opts.command != 'config-manager':
            logger.critical(
                _('--enable, --set-enabled and --disable, --set-disabled '
                  'must be used with config-manager command.'))
            sys.exit(1)

        if opts.sleeptime is not None:
            time.sleep(random.randrange(opts.sleeptime * 60))

        # store the main commands & summaries, before plugins are loaded
        self.optparser.add_commands(self.cli_commands, 'main')
        # store the plugin commands & summaries
        self.base.init_plugins(opts.disableplugin, opts.enableplugin, self)
        self.optparser.add_commands(self.cli_commands,'plugin')

        # show help if no command specified
        # this is done here, because we first have the full
        # usage info after the plugins are loaded.
        if not opts.command:
            self.optparser.print_help()
            sys.exit(0)

        # save our original args out
        self.base.args = args
        # save out as a nice command string
        self.cmdstring = self.optparser.prog + ' '
        for arg in self.base.args:
            self.cmdstring += '%s ' % arg

        self._log_essentials()
        try:
            self._parse_commands(opts, args)
        except CliError:
            sys.exit(1)

        # show help for dnf <command> --help / --help-cmd
        if opts.help:
            self.optparser.print_help(self.command)
            sys.exit(0)

        opts = self.optparser.parse_command_args(self.command, args)

        if opts.allowerasing:
            self.demands.allow_erasing = opts.allowerasing
            self.base._allow_erasing = True
        if opts.freshest_metadata:
            self.demands.freshest_metadata = opts.freshest_metadata
        if opts.debugsolver:
            self.base.conf.debug_solver = True
        if opts.obsoletes:
            self.base.conf.obsoletes = True
        self.command.pre_configure()
        self.base.pre_configure_plugins()

        # with cachedir in place we can configure stuff depending on it:
        self.base._activate_persistor()

        self._configure_repos(opts)

        self.base.configure_plugins()

        self.base.conf._configure_from_options(opts)

        self.command.configure()

        if self.base.conf.destdir:
            dnf.util.ensure_dir(self.base.conf.destdir)
            self.base.repos.all().pkgdir = self.base.conf.destdir

        if self.base.conf.color != 'auto':
            self.base.output.term.reinit(color=self.base.conf.color)

        if rpm.expandMacro('%_pkgverify_level') in ('signature', 'all'):
            forcing = False
            for repo in self.base.repos.iter_enabled():
                if repo.gpgcheck:
                    continue
                repo.gpgcheck = True
                forcing = True
            if not self.base.conf.localpkg_gpgcheck:
                self.base.conf.localpkg_gpgcheck = True
                forcing = True
            if forcing:
                logger.warning(
                    _("Warning: Enforcing GPG signature check globally "
                      "as per active RPM security policy (see 'gpgcheck' in "
                      "dnf.conf(5) for how to squelch this message)"
                      )
                )

    def _read_conf_file(self, releasever=None):
        timer = dnf.logging.Timer('config')
        conf = self.base.conf

        # replace remote config path with downloaded file
        conf._check_remote_file('config_file_path')

        # search config file inside the installroot first
        conf._search_inside_installroot('config_file_path')

        # check whether a config file is requested from command line and the file exists
        filename = conf._get_value('config_file_path')
        if (conf._get_priority('config_file_path') == dnf.conf.PRIO_COMMANDLINE) and \
                not os.path.isfile(filename):
            raise dnf.exceptions.ConfigError(_('Config file "{}" does not exist').format(filename))

        # read config
        conf.read(priority=dnf.conf.PRIO_MAINCONFIG)

        # search reposdir file inside the installroot first
        from_root = conf._search_inside_installroot('reposdir')
        # Update vars from same root like repos were taken
        if conf._get_priority('varsdir') == dnf.conf.PRIO_COMMANDLINE:
            from_root = "/"
        subst = conf.substitutions
        subst.update_from_etc(from_root, varsdir=conf._get_value('varsdir'))
        # cachedir, logs, releasever, and gpgkey are taken from or stored in installroot
        if releasever is None and conf.releasever is None:
            releasever = dnf.rpm.detect_releasever(conf.installroot)
        elif releasever == '/':
            releasever = dnf.rpm.detect_releasever(releasever)
        if releasever is not None:
            conf.releasever = releasever
        if conf.releasever is None:
            logger.warning(_("Unable to detect release version (use '--releasever' to specify "
                             "release version)"))

        for opt in ('cachedir', 'logdir', 'persistdir'):
            conf.prepend_installroot(opt)

        self.base._logging._setup_from_dnf_conf(conf)

        timer()
        return conf

    def _populate_update_security_filter(self, opts, cmp_type='eq', all=None):
        """

        :param opts:
        :param cmp_type: string supported "eq", "gte"
        :param all:
        :return:
        """
        if (opts is None) and (all is None):
            return
        types = []

        if opts.bugfix or all:
            types.append('bugfix')
        if opts.enhancement or all:
            types.append('enhancement')
        if opts.newpackage or all:
            types.append('newpackage')
        if opts.security or all:
            types.append('security')

        self.base.add_security_filters(cmp_type, types=types, advisory=opts.advisory,
                                       bugzilla=opts.bugzilla, cves=opts.cves,
                                       severity=opts.severity)

    def redirect_logger(self, stdout=None, stderr=None):
        # :api
        """
        Change the minimum logger level for terminal output to stdout and stderr
        according to specific command requirements.
        @param stdout: logging.INFO, logging.WARNING, ...
        @param stderr: logging.INFO, logging.WARNING, ...
        """
        if stdout is not None:
            self.base._logging.stdout_handler.setLevel(stdout)
        if stderr is not None:
            self.base._logging.stderr_handler.setLevel(stderr)

    def redirect_repo_progress(self, fo=sys.stderr):
        progress = dnf.cli.progress.MultiFileProgressMeter(fo)
        self.base.output.progress = progress
        self.base.repos.all().set_progress_bar(progress)

    def _check_running_kernel(self):
        kernel = self.base.sack.get_running_kernel()
        if kernel is None:
            return

        q = self.base.sack.query().filterm(provides=kernel.name)
        q = q.installed()
        q.filterm(advisory_type='security')

        ikpkg = kernel
        for pkg in q:
            if pkg > ikpkg:
                ikpkg = pkg

        if ikpkg > kernel:
            print('Security: %s is an installed security update' % ikpkg)
            print('Security: %s is the currently running version' % kernel)

    def _option_conflict(self, option_string_1, option_string_2):
        self.optparser.print_usage()
        raise dnf.exceptions.Error(_("argument {}: not allowed with argument {}".format(
            option_string_1, option_string_2)))

    def register_command(self, command_cls):
        """Register a Command. :api"""
        for name in command_cls.aliases:
            if name in self.cli_commands:
                raise dnf.exceptions.ConfigError(_('Command "%s" already defined') % name)
            self.cli_commands[name] = command_cls

    def run(self):
        """Call the base command, and pass it the extended commands or
           arguments.

        :return: (exit_code, [ errors ])

        exit_code is::

            0 = we're done, exit
            1 = we've errored, exit with error string
            2 = we've got work yet to do, onto the next stage
        """
        self._process_demands()

        # Reports about excludes and includes (but not from plugins)
        if self.base.conf.excludepkgs:
            logger.debug(
                _('Excludes in dnf.conf: ') + ", ".join(sorted(set(self.base.conf.excludepkgs))))
        if self.base.conf.includepkgs:
            logger.debug(
                _('Includes in dnf.conf: ') + ", ".join(sorted(set(self.base.conf.includepkgs))))
        for repo in self.base.repos.iter_enabled():
            if repo.excludepkgs:
                logger.debug(_('Excludes in repo ') + repo.id + ": "
                             + ", ".join(sorted(set(repo.excludepkgs))))
            if repo.includepkgs:
                logger.debug(_('Includes in repo ') + repo.id + ": "
                             + ", ".join(sorted(set(repo.includepkgs))))

        return self.command.run()
site-packages/dnf/cli/format.py
# Copyright (C) 2013-2016 Red Hat, Inc.
#
# This copyrighted material is made available to anyone wishing to use,
# modify, copy, or redistribute it subject to the terms and conditions of
# the GNU General Public License v.2, or (at your option) any later version.
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY expressed or implied, including the implied warranties of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General
# Public License for more details.  You should have received a copy of the
# GNU General Public License along with this program; if not, write to the
# Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
# 02110-1301, USA.  Any Red Hat trademarks that are incorporated in the
# source code or documentation are not subject to the GNU General Public
# License and may only be used or replicated with the express permission of
# Red Hat, Inc.

from __future__ import unicode_literals
from dnf.pycomp import long

def format_number(number, SI=0, space=' '):
    """Return a human-readable metric-like string representation
    of a number.

    :param number: the number to be converted to a human-readable form
    :param SI: If is 0, this function will use the convention
       that 1 kilobyte = 1024 bytes, otherwise, the convention
       that 1 kilobyte = 1000 bytes will be used
    :param space: string that will be placed between the number
       and the SI prefix
    :return: a human-readable metric-like string representation of
       *number*
    """

    # copied from urlgrabber.progress
    symbols = [ ' ', # (none)
                'k', # kilo
                'M', # mega
                'G', # giga
                'T', # tera
                'P', # peta
                'E', # exa
                'Z', # zetta
                'Y'] # yotta

    if SI: step = 1000.0
    else: step = 1024.0

    thresh = 999
    depth = 0
    max_depth = len(symbols) - 1

    if number is None:
        number = 0.0

    # we want numbers between 0 and thresh, but don't exceed the length
    # of our list.  In that event, the formatting will be screwed up,
    # but it'll still show the right number.
    while number > thresh and depth < max_depth:
        depth  = depth + 1
        number = number / step

    if isinstance(number, int) or isinstance(number, long):
        format = '%i%s%s'
    elif number < 9.95:
        # must use 9.95 for proper sizing.  For example, 9.99 will be
        # rounded to 10.0 with the .1f format string (which is too long)
        format = '%.1f%s%s'
    else:
        format = '%.0f%s%s'

    return(format % (float(number or 0), space, symbols[depth]))
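# A few illustrative values (default space=' '):
#   format_number(512)               -> '512  '   (symbols[0] is a space)
#   format_number(5 * 1024)          -> '5.0 k'
#   format_number(3 * 1000**3, SI=1) -> '3.0 G'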

def format_time(seconds, use_hours=0):
    """Return a human-readable string representation of a number
    of seconds.  The string will show seconds, minutes, and
    optionally hours.

    :param seconds: the number of seconds to convert to a
       human-readable form
    :param use_hours: If use_hours is 0, the representation will
       be in minutes and seconds. Otherwise, it will be in hours,
       minutes, and seconds
    :return: a human-readable string representation of *seconds*
    """

    # copied from urlgrabber.progress
    if seconds is None or seconds < 0:
        if use_hours: return '--:--:--'
        else:         return '--:--'
    elif seconds == float('inf'):
        return 'Infinite'
    else:
        seconds = int(seconds)
        minutes = seconds // 60
        seconds = seconds % 60
        if use_hours:
            hours = minutes // 60
            minutes = minutes % 60
            return '%02i:%02i:%02i' % (hours, minutes, seconds)
        else:
            return '%02i:%02i' % (minutes, seconds)
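# Illustrative values:
#   format_time(75)                -> '01:15'
#   format_time(3675, use_hours=1) -> '01:01:15'
#   format_time(None)              -> '--:--'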

def indent_block(s):
    return '\n'.join('  ' + line for line in s.splitlines())
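# e.g. indent_block('a\nb') -> '  a\n  b'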
site-packages/dnf/cli/progress.py
# Copyright (C) 2013-2016 Red Hat, Inc.
#
# This copyrighted material is made available to anyone wishing to use,
# modify, copy, or redistribute it subject to the terms and conditions of
# the GNU General Public License v.2, or (at your option) any later version.
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY expressed or implied, including the implied warranties of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General
# Public License for more details.  You should have received a copy of the
# GNU General Public License along with this program; if not, write to the
# Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
# 02110-1301, USA.  Any Red Hat trademarks that are incorporated in the
# source code or documentation are not subject to the GNU General Public
# License and may only be used or replicated with the express permission of
# Red Hat, Inc.

from __future__ import unicode_literals
from dnf.cli.format import format_number, format_time
from dnf.cli.term import _term_width
from dnf.pycomp import unicode
from time import time

import sys
import dnf.callback
import dnf.util


class MultiFileProgressMeter(dnf.callback.DownloadProgress):
    """Multi-file download progress meter"""

    STATUS_2_STR = {
        dnf.callback.STATUS_FAILED: 'FAILED',
        dnf.callback.STATUS_ALREADY_EXISTS: 'SKIPPED',
        dnf.callback.STATUS_MIRROR: 'MIRROR',
        dnf.callback.STATUS_DRPM: 'DRPM',
    }

    def __init__(self, fo=sys.stderr, update_period=0.3, tick_period=1.0, rate_average=5.0):
        """Creates a new progress meter instance

        update_period -- how often to update the progress bar
        tick_period -- how fast to cycle through concurrent downloads
        rate_average -- time constant for average speed calculation
        """
        self.fo = fo
        self.update_period = update_period
        self.tick_period = tick_period
        self.rate_average = rate_average
        self.unknown_progres = 0
        self.total_drpm = 0
        self.isatty = sys.stdout.isatty()
        self.done_drpm = 0
        self.done_files = 0
        self.done_size = 0
        self.active = []
        self.state = {}
        self.last_time = 0
        self.last_size = 0
        self.rate = None
        self.total_files = 0
        self.total_size = 0

    def message(self, msg):
        dnf.util._terminal_messenger('write_flush', msg, self.fo)

    def start(self, total_files, total_size, total_drpms=0):
        self.total_files = total_files
        self.total_size = total_size
        self.total_drpm = total_drpms

        # download state
        self.done_drpm = 0
        self.done_files = 0
        self.done_size = 0
        self.active = []
        self.state = {}

        # rate averaging
        self.last_time = 0
        self.last_size = 0
        self.rate = None

    def progress(self, payload, done):
        now = time()
        text = unicode(payload)
        total = int(payload.download_size)
        done = int(done)

        # update done_size
        if text not in self.state:
            self.state[text] = now, 0
            self.active.append(text)
        start, old = self.state[text]
        self.state[text] = start, done
        self.done_size += done - old
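        # (self.state maps payload name -> (first-seen time, bytes reported so
        # far); adding only the delta keeps done_size correct across repeated
        # progress() calls that report cumulative totals)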

        # update screen if enough time has elapsed
        if now - self.last_time > self.update_period:
            if total > self.total_size:
                self.total_size = total
            self._update(now)

    def _update(self, now):
        if self.last_time:
            delta_time = now - self.last_time
            delta_size = self.done_size - self.last_size
            if delta_time > 0 and delta_size > 0:
                # update the average rate
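                # (exponentially weighted moving average: the new sample's
                # weight grows with the elapsed interval, capped at 1, so the
                # displayed rate smooths short bursts but tracks sustained
                # changes)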
                rate = delta_size / delta_time
                if self.rate is not None:
                    weight = min(delta_time/self.rate_average, 1)
                    rate = rate*weight + self.rate*(1 - weight)
                self.rate = rate
        self.last_time = now
        self.last_size = self.done_size
        if not self.isatty:
            return
        # pick one of the active downloads
        text = self.active[int(now/self.tick_period) % len(self.active)]
        if self.total_files > 1:
            n = '%d' % (self.done_files + 1)
            if len(self.active) > 1:
                n += '-%d' % (self.done_files + len(self.active))
            text = '(%s/%d): %s' % (n, self.total_files, text)

        # average rate, total done size, estimated remaining time
        if self.rate and self.total_size:
            time_eta = format_time((self.total_size - self.done_size) / self.rate)
        else:
            time_eta = '--:--'
        msg = ' %5sB/s | %5sB %9s ETA\r' % (
            format_number(self.rate) if self.rate else '---  ',
            format_number(self.done_size),
            time_eta)
        left = _term_width() - len(msg)
        bl = (left - 7)//2
        if bl > 8:
            # use part of the remaining space for progress bar
            if self.total_size:
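                # the bar is laid out in half-character steps: n full cells
                # drawn as '=' plus an optional trailing half cell drawn as '-'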
                pct = self.done_size * 100 // self.total_size
                n, p = divmod(self.done_size * bl * 2 // self.total_size, 2)
                bar = '=' * n + '-' * p
                msg = '%3d%% [%-*s]%s' % (pct, bl, bar, msg)
                left -= bl + 7
            else:
                n = self.unknown_progres - 3
                p = 3
                n = 0 if n < 0 else n
                bar = ' ' * n + '=' * p
                msg = '     [%-*s]%s' % (bl, bar, msg)
                left -= bl + 7
                self.unknown_progres = self.unknown_progres + 3 if self.unknown_progres + 3 < bl \
                    else 0
        self.message('%-*.*s%s' % (left, left, text, msg))

    def end(self, payload, status, err_msg):
        start = now = time()
        text = unicode(payload)
        size = int(payload.download_size)
        done = 0

        # update state
        if status == dnf.callback.STATUS_MIRROR:
            pass
        elif status == dnf.callback.STATUS_DRPM:
            self.done_drpm += 1
        elif text in self.state:
            start, done = self.state.pop(text)
            self.active.remove(text)
            size -= done
            self.done_files += 1
            self.done_size += size
        elif status == dnf.callback.STATUS_ALREADY_EXISTS:
            self.done_files += 1
            self.done_size += size

        if status:
            # the error message, no trimming
            if status is dnf.callback.STATUS_DRPM and self.total_drpm > 1:
                msg = '[%s %d/%d] %s: ' % (self.STATUS_2_STR[status], self.done_drpm,
                                           self.total_drpm, text)
            else:
                msg = '[%s] %s: ' % (self.STATUS_2_STR[status], text)
            left = _term_width() - len(msg) - 1
            msg = '%s%-*s\n' % (msg, left, err_msg)
        else:
            if self.total_files > 1:
                text = '(%d/%d): %s' % (self.done_files, self.total_files, text)

            # average rate, file size, download time
            tm = max(now - start, 0.001)
            msg = ' %5sB/s | %5sB %9s    \n' % (
                format_number(float(done) / tm),
                format_number(done),
                format_time(tm))
            left = _term_width() - len(msg)
            msg = '%-*.*s%s' % (left, left, text, msg)
        self.message(msg)

        # now there's a blank line. fill it if possible.
        if self.active:
            self._update(now)
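

# --- Illustrative usage sketch (added; not part of the upstream module) -----
# dnf's CLI normally wires this meter into its package-download machinery, but
# the callback contract is simple enough to drive by hand: start() once with
# the overall totals, progress() as bytes arrive for each payload, and end()
# when a payload finishes (a falsy status is reported as a successful
# download). `_FakePayload` below is a hypothetical stand-in for dnf's real
# payload objects; it only provides the two things the meter reads:
# str(payload) and payload.download_size.
if __name__ == '__main__':
    class _FakePayload(object):
        def __init__(self, name, size):
            self._name = name
            self.download_size = size

        def __str__(self):
            return self._name

    meter = MultiFileProgressMeter(fo=sys.stderr, update_period=0.0)
    payload = _FakePayload('example-1.0-1.noarch.rpm', 1000)
    meter.start(total_files=1, total_size=payload.download_size)
    for done_bytes in (250, 500, 750, 1000):
        meter.progress(payload, done_bytes)
    meter.end(payload, None, None)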
site-packages/dnf/cli/__pycache__/term.cpython-36.pyc000064400000027036147511334650016430 0ustar00 [compiled CPython 3.6 bytecode - binary content omitted]
site-packages/dnf/cli/__pycache__/aliases.cpython-36.opt-1.pyc000064400000012474147511334650020041 0ustar00 [compiled CPython 3.6 bytecode - binary content omitted]
site-packages/dnf/cli/__pycache__/__init__.cpython-36.pyc000064400000000755147511334650017217 0ustar00 [compiled CPython 3.6 bytecode - binary content omitted]
site-packages/dnf/cli/__pycache__/completion_helper.cpython-36.opt-1.pyc000064400000020733147511334650022125 0ustar00 [compiled CPython 3.6 bytecode - binary content omitted]
site-packages/dnf/cli/__pycache__/output.cpython-36.pyc000064400000157050147511334650017021 0ustar00 [compiled CPython 3.6 bytecode - binary content omitted]
rTZtotalrxr�r�rirqZhashbarrgZnwidr#r#r$r"sL



r")N)<rrZ
__future__rrrr�r�rZlibdnf.transactionrZloggingr?r1rsrr%Zdnf.cli.formatrrZdnf.i18nrrr	r
rrr
rZ
dnf.pycomprrrrrZdnf.yum.rpmtransrZdnf.db.historyrZdnf.baserZdnf.callbackZdnf.cli.progressZdnf.cli.termZdnf.confZ
dnf.cryptoZdnf.transactionZdnf.utilZdnf.yum.miscZ	getLoggerr�r%�objectr&�callbackZDepsolver!Z	KeyImportr{r�r"r#r#r#r$�<module>sb(
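The progressbar() helper itself is only present as bytecode. As a rough stand-alone sketch of what its recovered docstring describes (a 50-mark text bar), with the layout and width handling being assumptions rather than the compiled function's actual code:

import sys

def progressbar(current, total, name=None, width=50):
    # Render a one-line text progress bar of '#' marks, in the spirit of
    # the recovered docstring; writes to stdout and ends with a newline
    # once the work is complete.
    if total <= 0:
        return
    done = int(width * current / total)
    bar = '#' * done + ' ' * (width - done)
    label = '%s: ' % name if name else ''
    sys.stdout.write('\r%s[%s] %3d%%' % (label, bar, 100 * current // total))
    if current >= total:
        sys.stdout.write('\n')
    sys.stdout.flush()

# usage sketch:
#   for i in range(101):
#       progressbar(i, 100, name='download')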

site-packages/dnf/cli/__pycache__/utils.cpython-36.pyc
[compiled bytecode omitted; recovered strings:]
* module docstring: "Various utility functions, and a utility class."
* _USER_HZ is taken from os.sysconf('SC_CLK_TCK'); the module logger is 'dnf'.
* jiffies_to_seconds(jiffies) -- "Convert a number of jiffies to seconds. How many jiffies are in a second is system-dependent, e.g. 100 jiffies = 1 second is common. :param jiffies: a number of jiffies. :return: the equivalent number of seconds."
* seconds_to_ui_time(seconds) -- "Return a human-readable string representation of the length of a time interval given in seconds."; format strings "%d day(s) %d:%02d:%02d", "%d:%02d:%02d", "%02d:%02d".
* get_process_info(pid) -- "Return info dict about a process."; reads /proc/%d/status, /proc/stat and /proc/%d/stat and collects vmrss, vmsize, start_time and a state of Running / Sleeping / Uninterruptible / Zombie / Traced/Stopped / Unknown.
* show_lock_owner(pid) -- "Output information about process holding a lock."; messages "Unable to find information about the locking process (PID %d)", "The application with PID %d is: %s", "Memory : %5s RSS (%5sB VSZ)", "Started: %s - %s ago", "State  : %s".
site-packages/dnf/cli/__pycache__/option_parser.cpython-36.pyc
[compiled bytecode omitted; recovered strings:]
* class MultilineHelpFormatter and class OptionParser -- "ArgumentParser like class to do things the 'yum way'."
* error(msg) -- "Output an error message, and exit the program. This method overrides standard argparser's error so that error output goes to the logger." ("Command line error: %s")
* helper argparse actions: _RepoCallback / _RepoCallbackEnable (record --enablerepo / --disablerepo operations), _SplitCallback -- 'Split all strings in seq, at "," and whitespace. Returns a new list.', _SplitExtendDictCallback -- 'Split string at "," or whitespace to (key, value). Extends dict with {key: value}.' ("bad format: %s"), _SetoptsCallback -- "Parse setopts arguments and put them into main_<setopts> and repo_<setopts>." ("Setopt argument has multiple values: %s", "Setopt argument has no value: %s"), ParseSpecGroupFileCallback, PkgNarrowCallback, ForceArchAction.
* _add_general_options() -- "Standard options known to all dnf subcommands" under the "General {prog} options" group: -c/--config, -q/--quiet, -v/--verbose, --version, --installroot, --nodocs, --noplugins, --enableplugin, --disableplugin, --releasever, --setopt, --skip-broken, -h/--help, --allowerasing, -b/--best, --nobest, -C/--cacheonly, -R/--randomwait, -d/--debuglevel, --debugsolver, --showduplicates, -e/--errorlevel, --obsoletes, --rpmverbosity, -y/--assumeyes, --assumeno, --enablerepo, --disablerepo, --repo/--repoid, --enable, --disable, -x/--exclude/--excludepkgs, --disableexcludes, --repofrompath, --noautoremove, --nogpgcheck, --color, --refresh, -4, -6, --destdir/--downloaddir, --downloadonly, --comment, --bugfix, --enhancement, --newpackage, --security, --advisory, --bz, --cve, --sec-severity, --forcearch, plus the positional "command".
* get_usage() -- "get the usage information to show the user", grouped under "List of Main Commands:" and "List of Plugin Commands:".
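To illustrate the 'split at "," and whitespace' behaviour that the _SplitCallback docstring describes, here is a minimal argparse action with the same effect. SplitAction is a made-up name for this sketch; the compiled class actually derives from argparse's private _AppendAction, which the sketch deliberately avoids:

import argparse
import re

class SplitAction(argparse.Action):
    # Append option values after splitting them at ',' and whitespace,
    # mirroring the behaviour quoted above.
    SPLITTER = r'\s*[,\s]\s*'

    def __call__(self, parser, namespace, values, option_string=None):
        current = list(getattr(namespace, self.dest) or [])
        current.extend(v for v in re.split(self.SPLITTER, values) if v)
        setattr(namespace, self.dest, current)

parser = argparse.ArgumentParser()
parser.add_argument('--enablerepo', dest='repos_ed', action=SplitAction, default=[])
print(parser.parse_args(['--enablerepo', 'updates, extras base']).repos_ed)
# -> ['updates', 'extras', 'base']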
site-packages/dnf/cli/__pycache__/demand.cpython-36.opt-1.pyc
[compiled bytecode omitted; recovered strings:]
* class _BoolDefault -- a write-once boolean attribute descriptor; re-assigning a different value raises "Demand already set."
* class DemandSheet -- "Collection of demands that different CLI parts have on other parts. :api"; flags include allow_erasing, available_repos, resolving, root_user, sack_activation, load_system_repo, success_exit_status, cacheonly, fresh_metadata, freshest_metadata, changelogs, transaction_display, plugin_filtering_enabled.

site-packages/dnf/cli/__pycache__/aliases.cpython-36.pyc
[compiled bytecode omitted; recovered strings:]
* ALIASES_DROPIN_DIR is /etc/dnf/aliases.d/ with ALIASES.conf and USER.conf inside it.
* class AliasesConfig -- wraps libdnf.conf.ConfigParser, exposing 'enabled' and the [aliases] section.
* class Aliases -- honours the DNF_DISABLE_ALIASES environment variable ("Unexpected value of environment variable: DNF_DISABLE_ALIASES=%s"), loads ALIASES.conf plus drop-in files ("Parsing file \"%s\" failed: %s", "Cannot read file \"%s\": %s", "Config error: %s"), resolves aliases recursively ("Aliases contain infinite recursion") and falls back with "%s, using original arguments." on errors.

site-packages/dnf/cli/__pycache__/main.cpython-36.opt-1.pyc
[compiled bytecode omitted; this entry's header and import table sit here -- its recovered strings continue below]
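The write-once flag behaviour implied by the '_BoolDefault' / "Demand already set." strings above can be sketched as a small descriptor. This is an approximation of the idea, not the compiled class; attribute names beyond those listed are assumptions:

class _BoolDefault(object):
    # Boolean attribute with a default that may be set once; changing an
    # already-set demand raises, matching "Demand already set." above.
    def __init__(self, default):
        self.default = default
        self._name = '__%s%x' % (self.__class__.__name__, id(self))

    def __get__(self, obj, objtype=None):
        if obj is None:
            return self
        return obj.__dict__.get(self._name, self.default)

    def __set__(self, obj, val):
        if self._name in obj.__dict__ and obj.__dict__[self._name] != val:
            raise AttributeError('Demand already set.')
        obj.__dict__[self._name] = val

class DemandSheet(object):
    # A handful of the flags listed above, purely as an illustration.
    allow_erasing = _BoolDefault(False)
    root_user = _BoolDefault(False)
    sack_activation = _BoolDefault(False)

# usage sketch:
#   demands = DemandSheet(); demands.root_user = True   # ok
#   demands.root_user = False                           # AttributeError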
Recovered strings from the compiled main module:
* module docstring: "Entrance point for the yum command line interface."
* ex_IOError / ex_Error -- log the exception ("Error: %s") and return exit code 1.
* main(args) / _main(base, args, ...) -- "Run the dnf program from a command line interface."; maps ProcessLockError, LockError, DepsolveError, dnf and libdnf errors, IOError and KeyboardInterrupt ("Terminated.") to non-zero exit codes.
* cli_run(cli, base) -- runs the configured command; on a depsolve error it suggests "try to add '--allowerasing' to command line to replace conflicting packages", "'--skip-broken' to skip uninstallable packages" or "'--nobest' to use not only best candidate packages", and it changes to "/" when there is "No read/execute access in current directory, moving to /".
* resolving(cli, base) -- "Perform the depsolve, download and RPM transaction stage."; "Dependencies resolved.", "Complete!".
* user_main(args, exit_code=False) -- "Call one of the multiple main() functions based on environment variables. :param args: command line arguments passed into yum. :param exit_code: if *exit_code* is True, this function will exit python with its exit code when it has finished executing. Otherwise, it will return its exit code. :return: the exit code from dnf.yum execution."

site-packages/dnf/cli/__pycache__/utils.cpython-36.opt-1.pyc
[compiled bytecode omitted; optimized duplicate of utils.cpython-36.pyc above, with the same docstrings and messages]

site-packages/dnf/cli/__pycache__/__init__.cpython-36.opt-1.pyc
[compiled bytecode omitted; recovered strings:]
* class CliError(dnf.exceptions.Error) -- "CLI Exception. :api"; the package re-exports Cli (dnf.cli.cli) and Command (dnf.cli.commands).

site-packages/dnf/cli/__pycache__/progress.cpython-36.opt-1.pyc
[compiled bytecode omitted; recovered strings:]
* class MultiFileProgressMeter -- "Multi-file download progress meter"; status labels FAILED, SKIPPED, MIRROR, DRPM.
* __init__(fo, update_period, tick_period, rate_average) -- "Creates a new progress meter instance. update_period -- how often to update the progress bar. tick_period -- how fast to cycle through concurrent downloads. rate_average -- time constant for average speed calculation."; progress and end render lines such as "%5sB/s | %5sB %9s ETA" and "(%d/%d): %s".

site-packages/dnf/cli/__pycache__/term.cpython-36.opt-1.pyc
[compiled bytecode omitted; recovered strings:]
* _real_term_width(fd=1) -- "Get the real terminal width" via fcntl.ioctl(fd, termios.TIOCGWINSZ, ...); _term_width(fd=1) -- "Compute terminal width falling to default 80 in case of trouble".
* class Term -- "A class to provide some terminal 'UI' helpers based on curses."; MODE (bold, blink, dim, reverse, underline, normal) plus FG_COLOR and BG_COLOR tables (black, blue, green, cyan, red, magenta, yellow, white).
* reinit(term_stream=None, color='auto') -- "Reinitializes the Term. :param term_stream: the terminal stream that the Term should be initialized to use. If *term_stream* is not given, sys.stdout is used. :param color: when to colorize output. Valid values are 'always', 'auto', and 'never'."
* sub(haystack, beg, end, needles, escape=None, ignore_case=False) -- prefix and postfix every occurrence of the needles, e.g. ">>> yt.sub('spam and eggs', 'x', 'z', ['and'])" gives "'spam xandz eggs'"; convenience wrappers sub_norm, sub_mode, sub_bold, sub_fg and sub_bg apply MODE / FG_COLOR / BG_COLOR codes around the matches.
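The terminal-width probing described for the term helpers can be illustrated with a short, self-contained sketch: a TIOCGWINSZ ioctl with an 80-column fallback. The helper names mirror the recovered ones, but the code is a reconstruction, not the original:

import fcntl
import struct
import termios

def _real_term_width(fd=1):
    # Ask the kernel for the window size of file descriptor 'fd';
    # returns None when the ioctl fails (e.g. output is not a tty).
    try:
        winsize = struct.pack(b'hhhh', 0, 0, 0, 0)
        winsize = fcntl.ioctl(fd, termios.TIOCGWINSZ, winsize)
        return struct.unpack(b'hhhh', winsize)[1]
    except OSError:
        return None

def _term_width(fd=1):
    # Fall back to the traditional 80 columns when the width is
    # unknown or reported as 0.
    width = _real_term_width(fd)
    return width if width else 80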
site-packages/dnf/cli/__pycache__/cli.cpython-36.opt-1.pyc
[compiled bytecode omitted; recovered strings:]
* module docstring: "Command line interface yum class and related."
* helpers: _add_pkg_simple_list_lens ("Get the length of each pkg's column. Add that to data."), _list_cmd_calc_columns ("Work out the dynamic size of the columns to pass to fmtColumns."), print_versions ("Installed: %s-%s at %s", "Built    : %s at %s"), report_module_switch ("The operation would result in switching of module '{0}' stream '{1}' to stream '{2}'").
* class BaseCli -- "This is the base class for yum cli."
  - do_transaction(display=...) -- "Take care of package downloading, checking, user confirmation and actually running the transaction. :param display: rpm.callback.TransactionProgress object(s). :return: history database transaction ID or None"; messages include "It is not possible to switch enabled streams of a module unless explicitly enabled via configuration option module_stream_switch...", "{prog} will only download packages for the transaction.", "Operation aborted.", "Nothing to do.", "Downloading Packages:", "Error downloading packages:", "Transaction failed".
  - gpgsigcheck(pkgs) -- "Perform GPG signature verification on the given packages, installing keys if possible."; "Refusing to automatically import keys when running unattended. Use \"-y\" to override.", "GPG check FAILED".
  - latest_changelogs / format_changelog / print_changelogs ("Changelogs for {}"), check_updates, distro_sync_userlist ("No packages marked for distribution synchronization."), downgradePkgs ("No match for argument: %s", "No package %s available.", "No packages marked for downgrade."), output_packages / returnPkgLists (Installed / Available / Autoremove / Extra / Obsoleting / "Recently Added Packages", "No matching Packages to list"), provides ("No Matches found").
* class Cli -- registers the built-in commands (alias, autoremove, check, clean, distro-sync, deplist, downgrade, group, history, install, makecache, mark, module, reinstall, remove, repolist, repoquery, search, shell, swap, updateinfo, upgrade, upgrade-minimal, info, list, provides, check-update, repo-pkgs, help).
  - configure(args) -- parses the command line and sets up conf, repos and loggers ("No such command: %s. Please use %s --help", "It could be a {PROG} plugin command, try: \"{prog} install 'dnf-command(%s)'\"", "Config error: %s", "Unable to detect release version (use '--releasever' to specify release version)", "Unknown repo: '%s'", "No repository match: %s").
  - _check_running_kernel() -- "Security: %s is an installed security update" / "Security: %s is the currently running version".
  - run() -- "Call the base command, and pass it the extended commands or arguments. :return: (exit_code, [ errors ]). exit_code is:: 0 = we're done, exit; 1 = we've errored, exit with error string; 2 = we've got work yet to do, onto the next stage."
* module footer imports: dnf.cli.aliases, dnf.cli.commands and its submodules, dnf.cli.demand, dnf.cli.format, dnf.cli.option_parser, dnf.conf, dnf.conf.substitutions, dnf.const, dnf.db.history, dnf.exceptions, dnf.logging.
dnf.persistorZ
dnf.pluginZdnf.rpmZdnf.sackZdnf.transactionZdnf.utilZdnf.yum.miscZ	getLoggerr=rr)r;rAZBaserB�objectr�rrrr�<module>s�

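The recoverable strings above show that Cli.configure reads the main configuration file, sets up
repositories and fills the package sack before dispatching a command. For orientation, the same
steps can be driven through dnf's documented Python API. A minimal sketch using only public calls
(names such as read_all_repos and fill_sack come from the public dnf API, not from this bytecode):

# Illustrative configure/query flow using dnf's public Python API; not the code in the .pyc above.
import dnf

with dnf.Base() as base:
    base.conf.read()          # read /etc/dnf/dnf.conf (the step Cli._read_conf_file wraps)
    base.read_all_repos()     # load .repo files from base.conf.reposdir
    base.fill_sack()          # load installed-package and repository metadata
    for pkg in base.sack.query().installed().filter(name="bash"):
        print(pkg)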
site-packages/dnf/cli/__pycache__/option_parser.cpython-36.opt-1.pyc   (tar member header; mode/size/mtime fields omitted)
[binary data omitted: compiled Python 3.6 bytecode of dnf.cli.option_parser. Recoverable strings
name MultilineHelpFormatter and OptionParser with dnf's general options (-c/--config, -q/--quiet,
-v/--verbose, --version, --installroot, --releasever, --setopt, --enablerepo/--disablerepo, --repo,
--exclude, --advisory/--bz/--cve/--sec-severity, --forcearch and others) and the argument callbacks
(_RepoCallback, _SplitCallback, _SplitExtendDictCallback, _SetoptsCallback, PkgNarrowCallback).]
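One detail still legible in this member is the --setopt handling: each argument is split at "=",
and a key containing "." is treated as a repo-level option (repo_id.option) rather than a
main-configuration option. A standalone sketch of that split, written here only to illustrate the
logic suggested by the recoverable strings; it is not dnf's actual implementation:

# Hypothetical re-implementation of the key=value / repo.option split hinted at by _SetoptsCallback.
def split_setopt(arg):
    key, sep, value = arg.partition('=')
    if not sep:
        raise ValueError('Setopt argument has no value: %s' % arg)
    repo, dot, option = key.rpartition('.')
    if dot:                      # e.g. "updates.gpgcheck=0" targets the "updates" repo
        return ('repo', repo, option, value)
    return ('main', None, key, value)

print(split_setopt('install_weak_deps=False'))   # ('main', None, 'install_weak_deps', 'False')
print(split_setopt('updates.gpgcheck=0'))        # ('repo', 'updates', 'gpgcheck', '0')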
site-packages/dnf/cli/__pycache__/format.cpython-36.pyc   (tar member header; mode/size/mtime fields omitted)
[binary data omitted: compiled Python 3.6 bytecode of dnf.cli.format. The docstrings of its three
helpers survive: format_number (human-readable metric-like size strings, with SI=0 meaning
1 kilobyte = 1024 bytes and SI=1 meaning 1 kilobyte = 1000 bytes), format_time (seconds rendered as
MM:SS, or HH:MM:SS when use_hours is set) and indent_block.]
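The surviving docstrings describe the two formatting helpers fully enough to show how they are
meant to be called. A small usage sketch, assuming the usual dnf.cli.format signatures
format_number(number, SI=0, space=' ') and format_time(seconds, use_hours=0); the printed values
are approximate and only illustrative:

# Illustrative use of the helpers whose docstrings are quoted above.
from dnf.cli.format import format_number, format_time

print(format_number(123456789))          # binary prefixes (1 k = 1024 bytes), roughly "118 M"
print(format_number(123456789, SI=1))    # decimal prefixes (1 k = 1000 bytes), roughly "123 M"

print(format_time(75))                   # minutes:seconds          -> "01:15"
print(format_time(3675, use_hours=1))    # hours:minutes:seconds    -> "01:01:15"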
site-packages/dnf/cli/__pycache__/output.cpython-36.opt-1.pyc   (tar member header; mode/size/mtime fields omitted)
[binary data omitted: compiled Python 3.6 bytecode of dnf.cli.output. Recoverable strings identify
the Output class (package and group listings, the "Is this ok [y/N]" confirmation, transaction and
history summaries, download-size reporting), DepSolveProgressCallBack, CliKeyImport,
CliTransactionDisplay and the plain-text progressbar helper.]
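The strings in this member also mention dnf.cli.progress.MultiFileProgressMeter, the text progress
meter the CLI attaches to downloads. A hedged sketch of wiring it up through the public API (the
class name and the fo=sys.stdout pattern appear in the dump; everything else is the documented dnf
API and may differ between releases):

# Illustrative only: attach the CLI's text progress meter to a package download.
import sys
import dnf
import dnf.cli.progress

with dnf.Base() as base:
    base.read_all_repos()
    base.fill_sack()
    pkgs = list(base.sack.query().available().filter(name="nano").latest())
    meter = dnf.cli.progress.MultiFileProgressMeter(fo=sys.stdout)
    base.download_packages(pkgs, progress=meter)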
site-packages/dnf/cli/__pycache__/cli.cpython-36.pyc   (tar member header; mode/size/mtime fields omitted)
[binary data omitted: compiled Python 3.6 bytecode of dnf.cli.cli. Recoverable strings include the
module docstring ("Command line interface yum class and related."), the helpers
_add_pkg_simple_list_lens, _list_cmd_calc_columns, print_versions and report_module_switch, and the
BaseCli class, whose do_transaction path prints the transaction summary, asks for confirmation,
downloads packages, verifies GPG signatures and runs the RPM transaction. The archive dump is cut
off inside this member.]







zBaseCli.do_transactionc
sg}x�|D]�}�j|�\}}|dkr(q
q
|dkrĈjjo@�jj}tjsVtjj�rl|rltjj	t
d����fdd�}y�j||�Wq�tjj	tfk
r�}z|j
t|��WYdd}~Xq�Xq
|j
|�q
W|r�x|D]}	tj|	�q�Wtjj	t
d���dS)aPerform GPG signature verification on the given packages,
        installing keys if possible.

        :param pkgs: a list of package objects to verify the GPG
           signatures of
        :raises: Will raise :class:`Error` if there's a problem
        rrzTRefusing to automatically import keys when running unattended.
Use "-y" to override.cs
�jj�S)N)rrX)r,�y�z)rFrr�<lambda>$sz%BaseCli.gpgsigcheck.<locals>.<lambda>NzGPG check FAILED)Z_sig_check_pkgrC�	assumeyesrW�sys�stdin�isattyrrMrNr
Z_get_key_for_package�
ValueErrorrQ�strr=�critical)
rFr8Zerror_messages�po�result�errmsgZay�fnrerdr)rFrr[s&
"
zBaseCli.gpgsigcheckcsXd�x:|jjjd|j�D]$}|tj}|rtjj|d��PqW�fdd�|j	D�}|S)zBReturn list of changelogs for package newer then installed versionNrrcs$g|]}�dks|d�kr|�qS)N�	timestampr)�.0Zchlog)�newestrr�
<listcomp>=sz-BaseCli.latest_changelogs.<locals>.<listcomp>)
Z_rpmconnZreadonly_tsZdbMatchr�rpmZRPMTAG_CHANGELOGTIME�datetimeZdateZ
fromtimestamp�
changelogs)rF�packageZmiZchangelogtimesZchlogsr)rwr�latest_changelogs3s
zBaseCli.latest_changelogscCs4d|djd�tjj|d�tjj|d�f}|S)z*Return changelog formatted as in spec filez* %s %s
%s
ruz%a %b %d %X %YZauthor�text)r+rZi18nr	)rFZ	changelogZ	chlog_strrrr�format_changelogAs
zBaseCli.format_changelogcCs�t�}x&|D]}|j|jp|jg�j|�qWxdt|j��D]T}||}ttd�j	dj
dd�|D����x$|j|d�D]}t|j|��qzWq<WdS)NzChangelogs for {}z, cSsg|]}t|��qSr)ro)rvrrrrrxQsz,BaseCli.print_changelogs.<locals>.<listcomp>r)
rJr�source_namerrQ�sorted�keysr3r
r?rar}r)rFZpackagesZbysrpm�pr�Zbin_packagesZchlrrr�print_changelogsIs
"zBaseCli.print_changelogsTFc	CsR|jd||d�}|jjs |jjr@|jd||d�}|j|_|j|_|�rDt|j|�}t|j�dkr�i}|jj	j
d}	|	r�x>t|j�D]0}
|
j�}t
jj|�r�|
j�r�|
||
j|
jf<q�W|jj}|jj}
|jj|jdd||||
d�d	�|r�|j|j�t|j�dk�rDttd
��x0t|jtjd�d�D]}|jj|d|d��q(W|j�pP|jS)
z?Check updates matching given *patterns* in selected repository.Zupgrades)�reponamer%rr7rr])�=znot in)Z
outputType�highlight_nar(�highlight_modeszObsoleting Packages)�key)r()�returnPkgListsrCr%�verboser&r)rrr#r6�MODEr�ZlocalPkg�os�path�existsZverifyLocalPkgrr�color_update_local�color_update_remote�listPkgsr�r3r
�operator�
itemgetter�updatesObsoletesList)rF�patternsr�Zprint_r{r'Ztyplr(�
local_pkgs�	highlightrqZlocal�cul�cur�obtuprrr�
check_updatesUs:
zBaseCli.check_updatescCsr|jj�}t|�dkr |j�nx|D]}|j|�q&W|jj�|}|dkrn|jj�rntd�}tjj|��dS)ab Upgrade or downgrade packages to match the latest versions available
            in the enabled repositories.

            :return: (exit_code, [ errors ])

            exit_code is::
                0 = we're done, exit
                1 = we've errored, exit with error string
                2 = we've got work yet to do, onto the next stage
        rz4No packages marked for distribution synchronization.N)	Z_goalZ
req_lengthrZdistro_syncZreq_has_distupgrade_allr
rrMrN)rFZuserlistZoldcount�pkg_specZcntrdrrr�distro_sync_userlist{s


zBaseCli.distro_sync_userlistc
CsTd}xf|D]^}y|j||d�d}Wq
tjjk
rf}z"tjtd�|jjj	|j
��WYdd}~Xq
Xq
Wx�|D]�}y|j||d�d}Wqrtjjk
r�}z$td�}	tj|	|jjj	|��WYdd}~Xqrtjj
k
�r}z"tjtd�|jjj	|j��WYdd}~Xqrtjjk
�r4d�s0t�YqrXqrW|�sPtjjtd���dS)	aaAttempt to take the user specified list of packages or
        wildcards and downgrade them. If a complete version number is
        specified, attempt to downgrade them to the specified version

        :param specs: a list of names or wildcards specifying packages to downgrade
        :param file_pkgs: a list of pkg objects from local files
        F)�strictTzNo match for argument: %sNzNo package %s available.z6Packages for argument %s available, but not installed.z!No packages marked for downgrade.)Zpackage_downgraderrMZMarkingErrorr=rOr
rr6r7�locationZdowngrade_toZPackageNotFoundErrorZPackagesNotInstalledErrorr��AssertionErrorrN)
rFZspecsZ	file_pkgsr�rrrre�arg�errrdrrr�
downgradePkgs�s,	

(
&
"zBaseCli.downgradePkgs�allc!CsDy$|jjjd}|j||||d�}Wn0tjjk
rT}zdt|�gfSd}~X�n�Xi}i}	i}
d}|dkrzt|j|�}|r�|j	r�xB|j
|j|jD],}|j
|jf}
|
|ks�|||
kr�|||
<q�W|o�|j�rx8|jD].}|j
|jf}
|
|	k�s||	|
kr�||	|
<q�W|�rP|j�rPx2t|j�D]$}|jtjk�r(||
|j
|jf<�q(W|jj}|jj}|jj}|jj}|jj|j	td�|||||||d�d�}|jj}|jj}|jj}|jj }|jj|jtd	�||	|||||d
�d�}|jj|j!td�||d�}|jj|j"td
�||d�}|jj#}|jj$}|jj|jtd�||
|||d�d�}t%|j&�dk�r�|dk�r�t%|j&�}t'td��xLt|j(t)j*d�d�D]}|jj+|d|d��q�Wn|jj|j&td�||d�}|jj|j,td�||d�} t%|��r@| dk�r@|dk�r@|dk�r@|dk�r@|dk�r@|dk�r@|dk�r@tjjtd���dS)zJOutput selection *pkgnarrow* of packages matching *patterns* and *repoid*.r7)�installed_availabler�rNr]zInstalled Packages)�>�<r�znot in)r�r(r�zAvailable Packages)r�r�r�znot inzAutoremove Packages)r(zExtra PackageszAvailable Upgrades)r�znot inrzObsoleting Packages)r�r%zRecently Added PackageszNo matching Packages to list)-rr6r�r�rrMrNror)r�hidden_available�reinstall_availableZ
old_availablerrr �hidden_installedr#r�r��hawkeyZSYSTEM_REPO_NAMErCZcolor_list_installed_olderZcolor_list_installed_newerZcolor_list_installed_reinstallZcolor_list_installed_extrar�r
Zcolor_list_available_upgradeZcolor_list_available_downgradeZcolor_list_available_reinstallZcolor_list_available_installr"r!r�r�rr%r3r&r�r�r�r$)!rF�basecmd�	pkgnarrowr�r�r�r'reZupdate_pkgsZ	inst_pkgsr�r(rr�rqZclioZclinZclirZclieZripZclauZcladZclarZclaiZrapZraepZrepr�r�ZrupZropr�Zrraprrr�output_packages�s�







FzBaseCli.output_packagesc	Cs�d}d}|r|dkrd}d}n|r2|dkr2d}d}|j||d|d�}|jjrvx(|jD]}|jrT|rT|jj|�qTW|r�|j|_|j|_|r�g|_|r�g|_|S)a#Return a :class:`dnf.yum.misc.GenericHolder` object containing
        lists of package objects that match the given names or wildcards.

        :param pkgnarrow: a string specifying which types of packages
           lists to produce, such as updates, installed, available, etc.
        :param patterns: a list of names or wildcards specifying
           packages to list
        :param installed_available: whether the available package list
           is present as .hidden_available when doing all, available,
           or installed
        :param reponame: limit packages list to the given repository

        :return: a :class:`dnf.yum.misc.GenericHolder` instance with the
           following lists defined::

             available = list of packageObjects
             installed = list of packageObjects
             upgrades = tuples of packageObjects (updating, installed)
             extras = list of packageObjects
             obsoletes = tuples of packageObjects (obsoleting, installed)
             recent = list of packageObjects
        FrTr�r )Zignore_caser�)	Z_do_package_listsrC�showdupesfromreposr�rr rQr�r�)	rFr�r�r�r�Zdone_hidden_availableZdone_hidden_installedr'rrrrr�
s,zBaseCli.returnPkgListsc	s�|jj}d|j_g}g}x4|D],}tt|�j|�\}}|j|�|j|�qWx t|�D]}|jj|||�qXW||j_|s�t	j
jtd���dS)a�Print out a list of packages that provide the given file or
        feature.  This a cli wrapper to the provides methods in the
        rpmdb and pkgsack.

        :param args: the name of a file or feature to search for
        :return: (exit_code, [ errors ])

        exit_code is::

            0 = we're done, exit
            1 = we've errored, exit with error string
            2 = we've got work yet to do, onto the next stage
        TzNo Matches foundN)
rCr�rDrB�provides�extendr�rZmatchcallback_verboserrMrNr
)	rF�argsZold_sdupZmatchesZused_search_strings�specr1Zused_search_stringr)rGrrr�?s

zBaseCli.providescCs|jjr|jjrdSdS)NFT)rCrjrW)rFrrrrV^szBaseCli._promptWanted)N)r�NFN)�__name__�
__module__�__qualname__�__doc__rEr^r[r}rr�r�r�r�r�r�r�rV�
__classcell__rr)rGrrB�sk'&"Y
1rBc@s�eZdZdd�Zdd�Zdd�Zdd�Zd	d
�Zddd
�Zd dd�Z	d!dd�Z
d"dd�Zej
fdd�Zdd�Zdd�Zdd�Zdd�ZdS)#�ClicCs"||_i|_d|_tjjj�|_|jtjj	j
j�|jtjj	jj
�|jtjj	jj�|jtjj	jj�|jtjj	jj�|jtjj	jj�|jtjj	jj�|jtjj	jj�|jtjj	jj�|jtjj	jj�|jtjj	jj�|jtjj	j j!�|jtjj	j"j#�|jtjj	j$j%�|jtjj	j&j'�|jtjj	j(j)�|jtjj	j*j+�|jtjj	j,j-�|jtjj	j.j/�|jtjj	j0j1�|jtjj	j2j3�|jtjj	j4j5�|jtjj	j6j7�|jtjj	j8�|jtjj	j9�|jtjj	j:�|jtjj	j;�|jtjj	j<�|jtjj	j=�dS)N)>r9�cli_commands�commandrrZZdemandZDemandSheet�demands�register_commandZcommands�aliasZAliasCommandr"ZAutoremoveCommandZcheckZCheckCommandZcleanZCleanCommandZ
distrosyncZDistroSyncCommandZdeplistZDeplistCommandZ	downgradeZDowngradeCommandrRZGroupCommandr_ZHistoryCommandZinstallZInstallCommandZ	makecacheZMakeCacheCommandZmarkZMarkCommand�moduleZ
ModuleCommandZ	reinstallZReinstallCommand�removeZ
RemoveCommand�repolistZRepoListCommandZ	repoqueryZRepoQueryCommand�searchZ
SearchCommand�shellZShellCommandZswapZSwapCommandZ
updateinfoZUpdateInfoCommandZupgradeZUpgradeCommandZupgrademinimalZUpgradeMinimalCommandZInfoCommandZListCommandZProvidesCommandZCheckUpdateCommandZRepoPkgsCommandZHelpCommand)rFr9rrrrEfsBzCli.__init__cCs|jj|�|jr^xJ|jj�D]<\}}|jjj||jj|gd�}|j|�|jj	|df�qW|j
r�|jjdd�|jjdd�|j
D��t
�}yzxt|jD]j\}}|jjj|�}|s�|jjjr�|dkr�td�}	tjj|	|��|j|�|dk�r�|j�q�|j�q�WWnFtjjk
�rP}
z$tj|
�|jj�tjd	�WYdd}
~
XnXx|D]}tjtd
�|��qXW|jjj �}|dk�r�|jjj!�}x,|D]$}|jjj"|�}|�r�|j#j$��q�W|jj%j&�\}
|j_'|jjj(�j)|
�t%j*|j|jj%�}|jjj(�j+|�dS)N)Zbaseurl�enabler�*�disablecSsg|]}|df�qS)r�r)rv�rrrrrx�sz(Cli._configure_repos.<locals>.<listcomp>zUnknown repo: '%s'rzNo repository match: %s)r�r�),r9Zread_all_reposZrepofrompathr<�reposZadd_new_reporC�_configure_from_optionsZrepos_edrQ�repo�insertr��setZget_matchingr�r
rrMZ	RepoError�addr�r��ConfigErrorr=rp�	optparser�
print_helprk�exitr>Z_repo_persistorZget_expired_reposr��get�_repo�expirerZsetup_progress_callbacksZ_ds_callbackr��set_progress_barZCliKeyImportZ_set_key_import)rF�optsZlabelr�Z	this_repoZnotmatchr�Z	operationr�rdreZ
expired_reposrZbarZ
key_importrrr�_configure_repos�sL








zCli._configure_reposcCsvtjdjtjjd�tjj�tjtj	j
d|j�tjtj	j
d|jj
j�tjtj	j
d|jj
j�tjd|jj
j�dS)Nz{prog} version: %s)rHzCommand: %szInstallroot: %szReleasever: %szcachedir: %s)r=�debugr?rrKrU�const�VERSION�log�logging�DDEBUG�	cmdstringr9rC�installroot�
releasever�cachedir)rFrrr�_log_essentials�s



zCli._log_essentialscCs|j}|jj}|jr.tjj�s.tjjt	d���|j
rLx|j�D]
}d|_q>W|j
s\|jjj
r�d|jj_
xn|j�D]}|jjtjj�qpWnL|jr�xD|j�D]}|jj�q�Wn(|js�x |j�D]}|jjtjj�q�W|j�r�|jj|jjr�dnd|jjd�dS)Nz[This command has to be run with superuser privileges (under the root user on most systems).T�autoF)�load_system_repoZload_available_repos)r�r9r�Z	root_userrrKZ	am_i_rootrMrNr
r{�iter_enabledZload_metadata_other�	cacheonlyrC�valuesr�ZsetSyncStrategyr�ZSYNC_ONLY_CACHE�freshest_metadatar�Zfresh_metadataZ	SYNC_LAZYZsack_activationZ	fill_sackr�Zavailable_repos)rFr�r�r�rrr�_process_demands�s.



zCli._process_demandscCs�|j}|jj|�}|dkr~tjtd�|tjd�|jj	j
r`tjtd�jtj
jtj
jd�|�ntjtd�jtj
jd��t�||�|_tjtjjd|�tjtjjd	|�dS)
z,Check that the requested CLI command exists.Nz)No such command: %s. Please use %s --helprzLIt could be a {PROG} plugin command, try: "{prog} install 'dnf-command(%s)'")rHZPROGzRIt could be a {prog} plugin command, but loading of plugins is currently disabled.)rHzBase command: %szExtra commands: %s)r�r�r�r=rpr
rk�argvr9rCZpluginsr?rrKrLrUrr�r�r�)rFr�r�r��command_clsrrr�_parse_commands�s


zCli._parse_commandsNc	Cs�tjjj�}|j|�}|dkr*tjjj�n||_|jj|�}|j	rpt
tjj�t
|jjj|j|jj�tjd�|jr�d|_d|_|jr�tjj|_|_yh|jr�|jjjd|jjjtjj�d|j_|jjj|�|j|j �d|kr�|j!|jj_!|jjj"�Wn�tj#j$t%fk
�rF}z t&j't(d�|�tjd�WYdd}~XnXt)k
�r�}z:d	t*t+|��t,|j-�f}t&j't(d�|�tjd�WYdd}~XnX|j.dk	�r�|j.|jj_.|jjj/�r�|j0dk�r�t&j't(d��tjd�|j1�s�|j2�r|j0dk�rt&j't(d��tjd�|j3dk	�r>t4j5t6j7|j3d��|jj8|j9d�|jj:|j;|j<|�|jj8|j9d�|j0�s�|jj=�tjd�||j_>|jj?d|_@x$|jj>D]}|j@d|7_@�q�W|jA�y|jB||�Wn tCk
�rtjd�YnX|jD�r$|jj=|j0�tjd�|jjE|j0|�}|jF�rN|jF|j_Gd|j_H|jI�r`|jI|j_I|jJ�rrd|jj_K|jL�r�d|jj_L|j0jM�|jjN�|jjO�|jP|�|jjQ�|jjj|�|j0jR�|jjj.�rtjSjT|jjj.�|jjj.|jjUjV�_W|jjjXdk�r(|jjjYjZ|jjjXd�t[j\d�dk�r�d}x,|jjUj]�D]}|j^�rZ�qJd|_^d}�qJW|jjj_�s�d|jj__d}|�r�t&j`t(d��dS)aParse command line arguments, and set up :attr:`self.base.conf` and
        :attr:`self.cmds`, as well as logger objects in base instance.

        :param args: a list of command line arguments
        :param option_parser: a class for parsing cli options
        Nrrr�TrzConfig error: %srz%s: %s�download�system-upgrade�reposync�
modulesynczb--destdir or --downloaddir must be used with --downloadonly or download or system-upgrade command.zconfig-managerz_--enable, --set-enabled and --disable, --set-disabled must be used with config-manager command.�<�mainZpluginrz%s r�)�colorz%_pkgverify_level�	signaturer�Fz�Warning: Enforcing GPG signature check globally as per active RPM security policy (see 'gpgcheck' in dnf.conf(5) for how to squelch this message))r�r�r�r�)r�r�)arrZ�aliasesZAliasesZresolve�
option_parserZOptionParserr�Zparse_main_argsr4r3r�r�r;r9rCZhistory_record_packagesrrkr��quietZ
debuglevelZ
errorlevelr�Z
VERBOSE_LEVELr�Z
_set_valueZsystem_cachedirZPRIO_DEFAULTr�r��_read_conf_filer�rZ_adjust_conf_optionsrMr�rnr=rpr
�IOErrorr	ro�repr�filenameZdestdirrTr�Zset_enabledZset_disabledZ	sleeptimer*Zsleep�randomZ	randrangeZadd_commandsr�Zinit_pluginsZ
disablepluginZenablepluginr�r�rHr�r�r�r�helpZparse_command_argsZallowerasingZ
allow_erasingZ_allow_erasingr�ZdebugsolverZdebug_solverr%Z
pre_configureZpre_configure_pluginsZ_activate_persistorr�Zconfigure_plugins�	configurerKZ
ensure_dirr�r�Zpkgdirr�r6ZreinitryZexpandMacror�ZgpgcheckZlocalpkg_gpgcheckr>)	rFr�r�r�r�rer�Zforcingr�rrrr��s�





















z
Cli.configurecCsBtjjd�}|jj}|jd�|jd�|jd�}|jd�tjj	krht
jj|�rhtj
jtd�j|���|jtjjd�|jd�}|jd�tjj	kr�d}|j}|j||jd�d�|dkr�|jdkr�tjj|j�}n|dkr�tjj|�}|dk	r�||_|jdk�rtjtd	��xd
D]}|j|��qW|jjj|�|�|S)N�configZconfig_file_pathzConfig file "{}" does not exist)ZpriorityZreposdir�varsdir�/)rzPUnable to detect release version (use '--releasever' to specify release version)r��logdir�
persistdir)r�rr)rr�ZTimerr9rCZ_check_remote_fileZ_search_inside_installrootZ
_get_valueZ
_get_priorityZPRIO_COMMANDLINEr�r��isfilerMr�r
r?�readZPRIO_MAINCONFIGZ
substitutionsZupdate_from_etcr�ryZdetect_releaseverr�r=r>Zprepend_installroot�_loggingZ_setup_from_dnf_conf)rFr�ZtimerrCr�Z	from_rootZsubst�optrrrr��s6




zCli._read_conf_file�eqcCs�|dkr|dkrdSg}|js"|r,|jd�|js6|r@|jd�|jsJ|rT|jd�|js^|rh|jd�|jj|||j|j|j	|j
d�dS)zz

        :param opts:
        :param cmp_type: string supported "eq", "gte"
        :param all:
        :return:
        N�bugfix�enhancement�
newpackage�security)�types�advisory�bugzilla�cves�severity)r
rQrrr
r9Zadd_security_filtersrrrr)rFr�Zcmp_typer�rrrr� _populate_update_security_filter�s







z$Cli._populate_update_security_filtercCs4|dk	r|jjjj|�|dk	r0|jjjj|�dS)z�
        Change minimal logger level for terminal output to stdout and stderr according to specific
        command requirements
        @param stdout: logging.INFO, logging.WARNING, ...
        @param stderr:logging.INFO, logging.WARNING, ...
        N)r9rZstdout_handlerZsetLevelZstderr_handler)rF�stdout�stderrrrr�redirect_logger�szCli.redirect_loggercCs.tjjj|�}||jj_|jjj�j|�dS)N)	rrZrYZMultiFileProgressMeterr9rr�r�r�)rFZforYrrr�redirect_repo_progress�s
zCli.redirect_repo_progresscCs�|jjj�}|dkrdS|jjj�j|jd�}|j�}|jdd�|}x|D]}||krL|}qLW||kr�td|�td|�dS)N)r�r
)Z
advisory_typez,Security: %s is an installed security updatez-Security: %s is the currently running version)r9r/Zget_running_kernelr1r2rrr3)rFZkernel�qZikpkgrrrr�_check_running_kernel�s
zCli._check_running_kernelcCs*t|jj��tjjtdj||����dS)Nz)argument {}: not allowed with argument {})r3r�Zprint_usagerrMrNr
r?)rFZoption_string_1Zoption_string_2rrr�_option_conflict�szCli._option_conflictcCs<x6|jD],}||jkr*tjjtd�|��||j|<qWdS)zRegister a Command. :apizCommand "%s" already definedN)r�r�rrMr�r
)rFr�rrrrr��s
zCli.register_commandcCs�|j�|jjjr8tjtd�djtt	|jjj����|jjj
rhtjtd�djtt	|jjj
����xx|jjj�D]h}|jr�tjtd�|j
ddjtt	|j����|j
rvtjtd�|j
ddjtt	|j
����qvW|jj�S)a2Call the base command, and pass it the extended commands or
           arguments.

        :return: (exit_code, [ errors ])

        exit_code is::

            0 = we're done, exit
            1 = we've errored, exit with error string
            2 = we've got work yet to do, onto the next stage
        zExcludes in dnf.conf: z, zIncludes in dnf.conf: zExcludes in repo z: zIncludes in repo )r�r9rCZexcludepkgsr=r�r
rar�r�Zincludepkgsr�r��idr��run)rFr�rrrrs
"
"(,zCli.run)N)N)r	N)NN)r�r�r�rEr�r�r�r�r�r�rrrkrrrrr�rrrrrr�es$3

-


r�)r)Or�Z
__future__rrr�collections.abcr�ImportError�collectionsrzr�r�r�r�ryrkr*r�Zlibdnf.transactionrcrrZdnf.clirZdnf.i18nr	r
rZdnf.cli.aliasesZdnf.cli.commandsZdnf.cli.commands.aliasZdnf.cli.commands.autoremoveZdnf.cli.commands.checkZdnf.cli.commands.cleanZdnf.cli.commands.deplistZdnf.cli.commands.distrosyncZdnf.cli.commands.downgradeZdnf.cli.commands.groupZdnf.cli.commands.historyZdnf.cli.commands.installZdnf.cli.commands.makecacheZdnf.cli.commands.markZdnf.cli.commands.moduleZdnf.cli.commands.reinstallZdnf.cli.commands.removeZdnf.cli.commands.repolistZdnf.cli.commands.repoqueryZdnf.cli.commands.searchZdnf.cli.commands.shellZdnf.cli.commands.swapZdnf.cli.commands.updateinfoZdnf.cli.commands.upgradeZdnf.cli.commands.upgrademinimalZdnf.cli.demandZdnf.cli.formatZdnf.cli.option_parserZdnf.confZdnf.conf.substitutionsZ	dnf.constZdnf.db.historyZdnf.exceptionsZdnf.loggingZ
dnf.persistorZ
dnf.pluginZdnf.rpmZdnf.sackZdnf.transactionZdnf.utilZdnf.yum.miscZ	getLoggerr=rr)r;rAZBaserB�objectr�rrrr�<module>s�

Osite-packages/dnf/cli/__pycache__/format.cpython-36.opt-1.pyc000064400000004461147511334650017705 0ustar003

�ft`�@s8ddlmZddlmZddd�Zddd�Zdd	�Zd
S)
�)�unicode_literals)�long� c		Cs�ddddddddd	g	}|r d
}nd}d}d
}t|�d}|dkrDd}x$||krh||krh|d}||}qFWt|t�s~t|t�r�d}n|dkr�d}nd}|t|p�d
�|||fS)a�Return a human-readable metric-like string representation
    of a number.

    :param number: the number to be converted to a human-readable form
    :param SI: If is 0, this function will use the convention
       that 1 kilobyte = 1024 bytes, otherwise, the convention
       that 1 kilobyte = 1000 bytes will be used
    :param space: string that will be placed between the number
       and the SI prefix
    :return: a human-readable metric-like string representation of
       *number*
    r�k�M�G�T�P�E�Z�Yg@�@g�@i�r�Ngz%i%s%sgfffff�#@z%.1f%s%sz%.0f%s%s)�len�
isinstance�intr�float)	ZnumberZSIZspaceZsymbols�stepZthresh�depthZ	max_depth�format�r�/usr/lib/python3.6/format.py�
format_numbers4rcCsx|dks|dkr|rdSdSnV|td�kr.dSt|�}|d}|d}|rh|d}|d}d|||fSd	||fSdS)
a�Return a human-readable string representation of a number
    of seconds.  The string will show seconds, minutes, and
    optionally hours.

    :param seconds: the number of seconds to convert to a
       human-readable form
    :param use_hours: If use_hours is 0, the representation will
       be in minutes and seconds. Otherwise, it will be in hours,
       minutes, and seconds
    :return: a human-readable string representation of *seconds*
    Nrz--:--:--z--:--�infZInfinite�<z%02i:%02i:%02iz	%02i:%02i)rr)ZsecondsZ	use_hoursZminutesZhoursrrr�format_timeIsrcCsdjdd�|j�D��S)N�
css|]}d|VqdS)z  Nr)�.0�srrr�	<genexpr>hszindent_block.<locals>.<genexpr>)�join�
splitlines)rrrr�indent_blockgsr!N)rr)r)Z
__future__rZ
dnf.pycomprrrr!rrrr�<module>s
5
site-packages/dnf/cli/__pycache__/main.cpython-36.pyc000064400000012241147511334650016375 0ustar003

�ft`f�@sPdZddlmZddlmZddlmZddlmZddlmZddl	m
Z
ddlmZdd	l
mZdd
lmZddlZddlZddl	ZddlZddlZddlZddlZddlZddlZddlZddlZddlZddlZddlZejd�Zd
d�Zdd�Z eee
fdd�Z!dd�Z"dd�Z#dd�Z$d dd�Z%e&dk�rLe%ej'dd�dd�dS)!z4
Entrance point for the yum command line interface.
�)�print_function)�absolute_import)�unicode_literals)�Conf)�Cli)�OptionParser)�ucd)�show_lock_owner)�_N�dnfcCs&tjtjjddd�tjt|��dS)N�T)�exc_info�)�logger�logr�logging�SUBDEBUG�criticalr)�e�r�/usr/lib/python3.6/main.py�
ex_IOError2srcCs6tjtjjddd�|jdk	r2tjtd�t|��dS)NrT)r
z	Error: %sr)	rrrrr�valuerr
r)rrrr�ex_Error8s
rcCs�y6tjj�tjjj|���}t||||�SQRXW�n�tjjk
rr}ztj	|j
�t|j�dSd}~X�nLtjj
k
r�}ztj	|j
�dSd}~X�ntjjk
r�}zdSd}~X�n�tjjk
�r�}zt|�Sd}~Xn�tjk
�r$}ztj	td�t|��dSd}~Xn�tjjk
�r\}ztj	td�t|��dSd}~Xnbtk
�r�}zt|�Sd}~Xn>tk
�r�}z tj	djt|�jtd���dSd}~XnXdS)N��rz	Error: %sz{}: {}zTerminated.)rZi18nZsetup_stdout�cliZBaseCli�_main�
exceptionsZProcessLockErrorrrrr	�pid�	LockError�
DepsolveError�Errorr�hawkey�	Exceptionr
r�libdnf�error�IOErrorr�KeyboardInterrupt�format�type�__name__)�argsZ
conf_class�	cli_classZoption_parser_class�baserrrr�main?s4

r.cCsb|jj�||�}y|jttt|��|��Wn(ttfk
rV}zt|�Sd}~XnXt	||�S)z2Run the dnf program from a command line interface.N)
Z_loggingZ	_presetupZ	configure�list�maprr&�OSErrorr�cli_run)r-r+r,Z
option_parserrrrrrr\s
rc,Cs�ytd�}WnFtk
rR}z*|jtjkrBtjtd��tjd�WYdd}~Xn
X|j	�y|j
�Wn@tjj
k
r��Yn(ttfk
r�}zt|�Sd}~XnX|jj�r�yt||�}W�ntjjk
�r�}z�t|�d}|jj�r|jjdd��r|td�jd�7}|jjj�rN|�s<|td	�jd
�7}n|td�jd
�7}|jjj�r�|jjjd�}|tjjk�r�|�s�|td
�jd�7}n|td�jd�7}|�r�tjdj|���WYdd}~XnX|�r�|S|jj �|jj!S)N�.z8No read/execute access in current directory, moving to /�/rT)Z	availablez?try to add '{}' to command line to replace conflicting packagesz--allowerasingz.try to add '{}' to skip uninstallable packagesz
--skip-brokenz' or '{}' to skip uninstallable packages�bestz7try to add '{}' to use not only best candidate packagesz--nobestz0 or '{}' to use not only best candidate packagesz({}))"�openr&�errnoZEACCESrrr
�os�chdir�closeZrunrrrr1r�demands�	resolvingr r�
allow_erasingZ_goalZproblem_conflictsr(r-Zconf�strictr5Z
_get_priorityZPRIO_MAINCONFIG�info�commandZrun_transactionZsuccess_exit_status)rr-�fr�ret�msgZpriorrrr2msT







r2cCs
|jdkr&|j|jj�tjtd��|jj�g}|jj	dk	rN|j
|jj	�y|j|d�Wn�tj
jk
r�}ztjt|��dSd}~Xnvtjjk
r�}z$x|jj|�D]}tj|�q�WdSd}~Xn4tk
�r�}zt|�Sd}~XnXtjtd��dS)z9Perform the depsolve, download and RPM transaction stage.NzDependencies resolved.)Zdisplayrz	Complete!r)ZtransactionZresolver;r=rr?r
r@Zrun_resolvedZtransaction_display�appendZdo_transactionrrZCliErrorr%rrZTransactionCheckErrorZget_error_outputrr&r)rr-Zdisplays�exc�errrCrrrrr<�s(

r<FcCst|�}|rtj|�|S)apCall one of the multiple main() functions based on environment variables.

    :param args: command line arguments passed into yum
    :param exit_code: if *exit_code* is True, this function will exit
       python with its exit code when it has finished executing.
       Otherwise, it will return its exit code.
    :return: the exit code from dnf.yum execution
    )r.�sys�exit)r+�	exit_codeZerrcoderrr�	user_main�s

rJ�__main__rT)rI)F)(�__doc__Z
__future__rrrZdnf.confrZdnf.cli.clirZdnf.cli.option_parserrZdnf.i18nrZ
dnf.cli.utilsr	r
Zdnf.clirZdnf.exceptionsZdnf.loggingZdnf.utilr7r"Zlibdnf.errorr$rr8Zos.pathrGZ	getLoggerrrrr.rr2r<rJr*�argvrrrr�<module>sB
5

site-packages/dnf/cli/__pycache__/demand.cpython-36.pyc000064400000003013147511334650016676 0ustar003

�ft`�	�@s0ddlmZGdd�de�ZGdd�de�ZdS)�)�unicode_literalsc@s&eZdZdd�Zddd�Zdd�ZdS)	�_BoolDefaultcCs ||_d|jjt|�f|_dS)Nz__%s%x)�default�	__class__�__name__�id�
_storing_name)�selfr�r
�/usr/lib/python3.6/demand.py�__init__sz_BoolDefault.__init__NcCs |j}|j|kr||jS|jS)N)�__dict__rr)r	�objZobjtype�objdictr
r
r�__get__s

z_BoolDefault.__get__cCs8|j}|j|kr*||j}||kr*td��|||j<dS)NzDemand already set.)r
r�AttributeError)r	r�valrZcurrent_valr
r
r�__set__#s

z_BoolDefault.__set__)N)r�
__module__�__qualname__rrrr
r
r
rrs
rc@speZdZdZed�Zed�Zed�Zed�Zed�Z	ed�Z
dZed�Zed�Z
ed�Zed�ZdZed�ZdS)�DemandSheetzHCollection of demands that different CLI parts have on other parts. :apiFTrN)rrr�__doc__rZ
allow_erasingZavailable_reposZ	resolvingZ	root_userZsack_activationZload_system_repoZsuccess_exit_statusZ	cacheonlyZfresh_metadataZfreshest_metadataZ
changelogsZtransaction_displayZplugin_filtering_enabledr
r
r
rr+srN)Z
__future__r�objectrrr
r
r
r�<module>ssite-packages/dnf/cli/__pycache__/completion_helper.cpython-36.pyc000064400000020733147511334650021166 0ustar003

i�-e/�@s<ddlZddlZddlZddlZdd�Zdd�ZGdd�dejjj	j
�ZGdd	�d	ejjjj
�ZGd
d�dejjjj�ZGdd
�d
ejjj�ZGdd�dejjjj�ZGdd�dejjjj�ZGdd�dejjjj�ZGdd�dejjjj�Zdd�Z e!dk�r8ye ej"dd��Wn e#k
�r6ej$d�YnXdS)�Ncst�fdd�|�S)Ncst|�j��S)N)�str�
startswith)�k)�kw��'/usr/lib/python3.6/completion_helper.py�<lambda>sz#filter_list_by_kw.<locals>.<lambda>)�filter)rZlstr)rr�filter_list_by_kwsr
cCstdd�|D��S)NcSsg|]}t|��qSr)r)�.0�xrrr�
<listcomp>!sz%listpkg_to_setstr.<locals>.<listcomp>)�set)�pkgsrrr�listpkg_to_setstr srcs,eZdZ�fdd�Zdd�Zdd�Z�ZS)�RemoveCompletionCommandcstt|�j|�dS)N)�superr�__init__)�self�args)�	__class__rrr$sz RemoveCompletionCommand.__init__cCsd|jj_d|jj_dS)NFT)�cli�demands�	root_user�sack_activation)rrrr�	configure's
z!RemoveCompletionCommand.configurecCs,x&tj|j|jj�D]}tt|��qWdS)N)�ListCompletionCommand�	installed�base�opts�	pkg_specs�printr)r�pkgrrr�run+szRemoveCompletionCommand.run)�__name__�
__module__�__qualname__rrr#�
__classcell__rr)rrr#srcs,eZdZ�fdd�Zdd�Zdd�Z�ZS)�InstallCompletionCommandcstt|�j|�dS)N)rr(r)rr)rrrr1sz!InstallCompletionCommand.__init__cCs"d|jj_d|jj_d|jj_dS)NFT)rrr�available_reposr)rrrrr4s

z"InstallCompletionCommand.configurecCsNttj|j|jj��}ttj|j|jj��}x||D]}tt|��q6WdS)N)	rrrrrr �	availabler!r)rrr*r"rrrr#9s

zInstallCompletionCommand.run)r$r%r&rrr#r'rr)rrr(0sr(cs,eZdZ�fdd�Zdd�Zdd�Z�ZS)�ReinstallCompletionCommandcstt|�j|�dS)N)rr+r)rr)rrrrCsz#ReinstallCompletionCommand.__init__cCs"d|jj_d|jj_d|jj_dS)NFT)rrrr)r)rrrrrFs

z$ReinstallCompletionCommand.configurecCsNttj|j|jj��}ttj|j|jj��}x||@D]}tt|��q6WdS)N)	rrrrrr r*r!r)rrr*r"rrrr#Ks

zReinstallCompletionCommand.run)r$r%r&rrr#r'rr)rrr+Bsr+csHeZdZ�fdd�Zdd�Zedd��Zedd��Zed	d
��Z�Z	S)rcstt|�j|�dS)N)rrr)rr)rrrrTszListCompletionCommand.__init__cCs�|j}|jj}|jj}t|�dkrH|d|krHtdjt|d|���n�|dkr`|j|j	|�}n||dkrx|j
|j	|�}nd|dkr�|j|j	|�}nLt|j
|j	|��}t|j|j	|��}||B}|s�tdjt|d|���dSx|D]}tt
|��q�WdS)N��
rr*�updatesr)Z
pkgnarrowsrZpackagesZpackages_action�lenr!�joinr
rrr*r.rr)r�subcmdsr�actionrr*rr"rrrr#Ws&
zListCompletionCommand.runcCs |jj�j�jdj|d�d�S)Nz{}*r)�
name__glob)�sack�queryr�filterm�format)r�argrrrrnszListCompletionCommand.installedcCs |jj�j�jdj|d�d�S)Nz{}*r)r3)r4r5r*r6r7)rr8rrrr*rszListCompletionCommand.availablecCs|jdj|d�gdd�S)Nz{}*rF)Zprint_)Z
check_updatesr7)rr8rrrr.vszListCompletionCommand.updates)
r$r%r&rr#�staticmethodrr*r.r'rr)rrrSs
rcs$eZdZ�fdd�Zdd�Z�ZS)�RepoListCompletionCommandcstt|�j|�dS)N)rr:r)rr)rrrr|sz"RepoListCompletionCommand.__init__cCs�|j}|jdkr>tdjt|jddd�|jjj�D����nn|jdkrvtdjt|jddd�|jjj�D����n6|jdkr�tdjt|jdd	d�|jjj�D����dS)
N�enabledr-rcSsg|]
}|j�qSr)�id)r�rrrrr
�sz1RepoListCompletionCommand.run.<locals>.<listcomp>ZdisabledcSsg|]}|js|j�qSr)r;r<)rr=rrrr
�s�allcSsg|]
}|j�qSr)r<)rr=rrrr
�s)	rZrepos_actionr!r0r
ZreposrZiter_enabledr>)rrrrrr#s


zRepoListCompletionCommand.run)r$r%r&rr#r'rr)rrr:{sr:cs,eZdZ�fdd�Zdd�Zdd�Z�ZS)�UpgradeCompletionCommandcstt|�j|�dS)N)rr?r)rr)rrrr�sz!UpgradeCompletionCommand.__init__cCs"d|jj_d|jj_d|jj_dS)NFT)rrrr)r)rrrrr�s

z"UpgradeCompletionCommand.configurecCs,x&tj|j|jj�D]}tt|��qWdS)N)rr.rrr r!r)rr"rrrr#�szUpgradeCompletionCommand.run)r$r%r&rrr#r'rr)rrr?�sr?cs,eZdZ�fdd�Zdd�Zdd�Z�ZS)�DowngradeCompletionCommandcstt|�j|�dS)N)rr@r)rr)rrrr�sz#DowngradeCompletionCommand.__init__cCs"d|jj_d|jj_d|jj_dS)NFT)rrrr)r)rrrrr�s

z$DowngradeCompletionCommand.configurecCs0x*tj|j|jj�j�D]}tt|��qWdS)N)rr*rrr Z
downgradesr!r)rr"rrrr#�szDowngradeCompletionCommand.run)r$r%r&rrr#r'rr)rrr@�sr@cs$eZdZ�fdd�Zdd�Z�ZS)�CleanCompletionCommandcstt|�j|�dS)N)rrAr)rr)rrrr�szCleanCompletionCommand.__init__cCs0tjjjjj�}tdjt|j	j
d|���dS)Nr-r,)�dnfr�commands�cleanZ_CACHE_TYPES�keysr!r0r
r�type)rr1rrrr#�szCleanCompletionCommand.run)r$r%r&rr#r'rr)rrrA�srAcCs�tjjj�}tjj|�}|ddkrP|jgg|�tdjt|d|j���dS|jj	�|j
t�|j
t�|j
t
�|j
t�|j
t�|j
t�|j
t�|j
t�|j|�y|j�Wn&ttjjfk
r�tjd�YnXdS)NrZ_cmdsr-r,)rBrZBaseCliZCliZinit_pluginsr!r0r
Zcli_commands�clearZregister_commandrr(r+rr:r?r@rArr#�OSError�
exceptions�Error�sys�exit)rrrrrr�main�s(









rM�__main__r,)%Zdnf.exceptionsrBZdnf.cliZdnf.cli.commands.cleanrKr
rrrC�removeZ
RemoveCommandrZinstallZInstallCommandr(Z	reinstallZReinstallCommandr+ZListCommandrZrepolistZRepoListCommandr:ZupgradeZUpgradeCommandr?Z	downgradeZDowngradeCommandr@rDZCleanCommandrArMr$�argv�KeyboardInterruptrLrrrr�<module>s&
(	
site-packages/dnf/cli/__pycache__/progress.cpython-36.pyc000064400000010777147511334650017331 0ustar003

�ft`��@spddlmZddlmZmZddlmZddlmZddl	m	Z	ddl
Z
ddlZddl
ZGdd�dejj�ZdS)	�)�unicode_literals)�
format_number�format_time)�_term_width)�unicode)�timeNc@sreZdZdZejjdejjdejjdejj	diZ
ejdddfd	d
�Z
dd�Zddd�Zdd�Zdd�Zdd�ZdS)�MultiFileProgressMeterz"Multi-file download progress meterZFAILEDZSKIPPEDZMIRRORZDRPMg333333�?g�?g@cCsp||_||_||_||_d|_d|_tjj�|_d|_	d|_
d|_g|_i|_
d|_d|_d|_d|_d|_dS)z�Creates a new progress meter instance

        update_period -- how often to update the progress bar
        tick_period -- how fast to cycle through concurrent downloads
        rate_average -- time constant for average speed calculation
        rN)�fo�
update_period�tick_period�rate_average�unknown_progres�
total_drpm�sys�stdout�isatty�	done_drpm�
done_files�	done_size�active�state�	last_time�	last_size�rate�total_files�
total_size)�selfr	r
rr�r�/usr/lib/python3.6/progress.py�__init__&s"zMultiFileProgressMeter.__init__cCstjjd||j�dS)NZwrite_flush)�dnf�utilZ_terminal_messengerr	)r�msgrrr�message?szMultiFileProgressMeter.messagercCsF||_||_||_d|_d|_d|_g|_i|_d|_d|_	d|_
dS)Nr)rrrrrrrrrrr)rrrZtotal_drpmsrrr�startBszMultiFileProgressMeter.startcCs�t�}t|�}t|j�}t|�}||jkrD|df|j|<|jj|�|j|\}}||f|j|<|j||7_||j|j	kr�||j
kr�||_
|j|�dS)Nr)rr�int�
download_sizerr�appendrrr
r�_update)r�payload�done�now�textZtotalr$�oldrrr�progressSs


zMultiFileProgressMeter.progresscCsJ|jrj||j}|j|j}|dkrj|dkrj||}|jdk	rdt||jd�}|||jd|}||_||_|j|_|js�dS|jt||j	�t
|j�}|jdkr�d|jd}t
|j�dkr�|d|jt
|j�7}d||j|f}|jo�|j
�rt|j
|j|j�}nd}d|j�r,t|j�ndt|j�|f}	t�t
|	�}
|
d	d
}|dk�r0|j
�r�|jd|j
}t|j|d
|j
d
�\}}
d
|d|
}d||||	f}	|
|d	8}
nj|jd}d}
|dk�r�dn|}d|d
|
}d|||	f}	|
|d	8}
|jd|k�r*|jdnd|_|jd|
|
||	f�dS)Nr�z%dz-%dz(%s/%d): %sz--:--z %5sB/s | %5sB %9s ETA
z---  ����d�=�-z%3d%% [%-*s]%s�� z
     [%-*s]%sz%-*.*s%s)rrrr�minrrrr%r�lenrrrrrr�divmodr
r#)rr+Z
delta_timeZ
delta_sizerZweightr,�nZtime_etar"�leftZblZpct�pZbarrrrr(gsX




zMultiFileProgressMeter._updatecCs�t�}}t|�}t|j�}d}|tjjkr.n�|tjjkrJ|jd7_nt||j	kr�|j	j
|�\}}|jj|�||8}|j
d7_
|j|7_n(|tjjkr�|j
d7_
|j|7_|�r*|tjjkr�|jdkr�d|j||j|j|f}	nd|j||f}	t�t|	�d}
d|	|
|f}	nl|jdk�rHd|j
|j|f}t||d�}dtt|�|�t|�t|�f}	t�t|	�}
d	|
|
||	f}	|j|	�|j�r�|j|�dS)
Nrr/z[%s %d/%d] %s: z	[%s] %s: z%s%-*s
z(%d/%d): %sg����MbP?z %5sB/s | %5sB %9s    
z%-*.*s%s)rrr%r&r �callback�
STATUS_MIRROR�STATUS_DRPMrr�popr�removerr�STATUS_ALREADY_EXISTSr�STATUS_2_STRrr9r�maxr�floatrr#r()rr)ZstatusZerr_msgr$r+r,�sizer*r"r<Ztmrrr�end�sH



zMultiFileProgressMeter.endN)r)�__name__�
__module__�__qualname__�__doc__r r>Z
STATUS_FAILEDrCr?r@rDr�stderrrr#r$r.r(rHrrrrrs
5r)Z
__future__rZdnf.cli.formatrrZdnf.cli.termrZ
dnf.pycomprrrZdnf.callbackr Zdnf.utilr>ZDownloadProgressrrrrr�<module>ssite-packages/dnf/cli/output.py000064400000255005147511334650012534 0ustar00# Copyright 2005 Duke University
# Copyright (C) 2012-2016 Red Hat, Inc.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU Library General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.

"""Handle actual output from the cli."""

from __future__ import absolute_import
from __future__ import print_function
from __future__ import unicode_literals

import fnmatch
import hawkey
import itertools
import libdnf.transaction
import logging
import operator
import pwd
import re
import sys
import time

from dnf.cli.format import format_number, format_time
from dnf.i18n import _, C_, P_, ucd, fill_exact_width, textwrap_fill, exact_width, select_short_long
from dnf.pycomp import xrange, basestring, long, unicode, sys_maxsize
from dnf.yum.rpmtrans import TransactionDisplay
from dnf.db.history import MergedTransactionWrapper
import dnf.base
import dnf.callback
import dnf.cli.progress
import dnf.cli.term
import dnf.conf
import dnf.crypto
import dnf.i18n
import dnf.transaction
import dnf.util
import dnf.yum.misc

logger = logging.getLogger('dnf')


def _spread_in_columns(cols_count, label, lst):
    left = itertools.chain((label,), itertools.repeat(''))
    lst_length = len(lst)
    right_count = cols_count - 1
    missing_items = -lst_length % right_count
    if not lst_length:
        lst = itertools.repeat('', right_count)
    elif missing_items:
        lst.extend(('',) * missing_items)
    lst_iter = iter(lst)
    return list(zip(left, *[lst_iter] * right_count))
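# Editor's illustration (not part of the original module): how the helper above lays
# items out. The label occupies the first column of the first row only; the remaining
# columns are filled left to right and padded with '' as needed. A hypothetical call:
#   _spread_in_columns(3, 'Group:', ['a', 'b', 'c'])
#   -> [('Group:', 'a', 'b'), ('', 'c', '')]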


class Output(object):
    """Main output class for the yum command line."""

    GRP_PACKAGE_INDENT = ' ' * 3
    FILE_PROVIDE_RE = re.compile(r'^\*{0,2}/')

    def __init__(self, base, conf):
        self.conf = conf
        self.base = base
        self.term = dnf.cli.term.Term()
        self.progress = None

    def _banner(self, col_data, row):
        term_width = self.term.columns
        rule = '%s' % '=' * term_width
        header = self.fmtColumns(zip(row, col_data), ' ')
        return rule, header, rule

    def _col_widths(self, rows):
        col_data = [dict() for _ in rows[0]]
        for row in rows:
            for (i, val) in enumerate(row):
                col_dct = col_data[i]
                length = len(val)
                col_dct[length] = col_dct.get(length, 0) + 1
        cols = self.calcColumns(col_data, None, indent='  ')
        # align to the left
        return list(map(operator.neg, cols))

    def _highlight(self, highlight):
        hibeg = ''
        hiend = ''
        if not highlight:
            pass
        elif not isinstance(highlight, basestring) or highlight == 'bold':
            hibeg = self.term.MODE['bold']
        elif highlight == 'normal':
            pass # Minor opt.
        else:
            # Turn a string into a specific output: colour, bold, etc.
            for high in highlight.replace(',', ' ').split():
                if high == 'normal':
                    hibeg = ''
                elif high in self.term.MODE:
                    hibeg += self.term.MODE[high]
                elif high in self.term.FG_COLOR:
                    hibeg += self.term.FG_COLOR[high]
                elif (high.startswith('fg:') and
                      high[3:] in self.term.FG_COLOR):
                    hibeg += self.term.FG_COLOR[high[3:]]
                elif (high.startswith('bg:') and
                      high[3:] in self.term.BG_COLOR):
                    hibeg += self.term.BG_COLOR[high[3:]]

        if hibeg:
            hiend = self.term.MODE['normal']
        return (hibeg, hiend)
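    # Editor's note (not part of the original module): _highlight() turns a highlight
    # spec into an (escape-on, escape-off) pair. For example 'bold' or 'fg:red,bold'
    # maps to the matching self.term.MODE / self.term.FG_COLOR codes with
    # self.term.MODE['normal'] as the reset; an empty or false spec yields ('', '').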

    def _sub_highlight(self, haystack, highlight, needles, **kwds):
        hibeg, hiend = self._highlight(highlight)
        return self.term.sub(haystack, hibeg, hiend, needles, **kwds)

    @staticmethod
    def _calc_columns_spaces_helps(current, data_tups, left):
        """ Spaces left on the current field will help how many pkgs? """
        ret = 0
        for tup in data_tups:
            if left < (tup[0] - current):
                break
            ret += tup[1]
        return ret

    @property
    def history(self):
        return self.base.history

    @property
    def sack(self):
        return self.base.sack

    def calcColumns(self, data, columns=None, remainder_column=0,
                    total_width=None, indent=''):
        """Dynamically calculate the widths of the columns that the
        fields in data should be placed into for output.

        :param data: a list of dictionaries that represent the data to
           be output.  Each dictionary in the list corresponds to a
           column of output. The keys of the dictionary are the
           lengths of the items to be output, and the value associated
           with a key is the number of items of that length.
        :param columns: a list containing the minimum amount of space
           that must be allocated for each row. This can be used to
           ensure that there is space available in a column if, for
           example, the actual lengths of the items being output
           cannot be given in *data*
        :param remainder_column: number of the column to receive a few
           extra spaces that may remain after other allocation has
           taken place
        :param total_width: the total width of the output.
           self.term.real_columns is used by default
        :param indent: string that will be prefixed to a line of
           output to create e.g. an indent
        :return: a list of the widths of the columns that the fields
           in data should be placed into for output
        """
        cols = len(data)
        # Convert the data to ascending list of tuples, (field_length, pkgs)
        pdata = data
        data = [None] * cols # Don't modify the passed in data
        for d in range(0, cols):
            data[d] = sorted(pdata[d].items())

        if total_width is None:
            total_width = self.term.real_columns

        #  We start allocating 1 char to everything but the last column, and a
        # space between each (again, except for the last column). Because
        # at worst we are better with:
        # |one two three|
        # | four        |
        # ...than:
        # |one two three|
        # |            f|
        # |our          |
        # ...the latter being what we get if we pre-allocate the last column, and
        # thus the space, due to "three" overflowing its column by 2 chars.
        if columns is None:
            columns = [1] * (cols - 1)
            columns.append(0)

        # If we can't get the real terminal width we are probably running in a
        # non-interactive terminal (pipe to grep, redirect to file...), so avoid
        # splitting lines to keep the output filterable.
        if not total_width:
            full_columns = []
            for d in xrange(0, cols):
                col = data[d]
                if col:
                    full_columns.append(col[-1][0])
                else:
                    full_columns.append(columns[d] + 1)
            full_columns[0] += len(indent)
            # if possible, try to keep default width (usually 80 columns)
            default_width = self.term.columns
            if sum(full_columns) > default_width:
                return full_columns
            total_width = default_width


        total_width -= (sum(columns) + (cols - 1) + exact_width(indent))
        if not columns[-1]:
            total_width += 1
        while total_width > 0:
            # Find which field all the spaces left will help best
            helps = 0
            val = 0
            for d in xrange(0, cols):
                thelps = self._calc_columns_spaces_helps(columns[d], data[d],
                                                         total_width)
                if not thelps:
                    continue
                #  We prefer to overflow: the last column, and then earlier
                # columns. This is so that in the best case (just overflow the
                # last) ... grep still "works", and then we make it prettier.
                if helps and (d == (cols - 1)) and (thelps / 2) < helps:
                    continue
                if thelps < helps:
                    continue
                helps = thelps
                val = d

            #  If we found a column to expand, move up to the next level with
            # that column and start again with any remaining space.
            if helps:
                diff = data[val].pop(0)[0] - columns[val]
                if not columns[val] and (val == (cols - 1)):
                    #  If we are going from 0 => N on the last column, take 1
                    # for the space before the column.
                    total_width -= 1
                columns[val] += diff
                total_width -= diff
                continue

            overflowed_columns = 0
            for d in xrange(0, cols):
                if not data[d]:
                    continue
                overflowed_columns += 1
            if overflowed_columns:
                #  Split the remaining spaces among each overflowed column
                # equally
                norm = total_width // overflowed_columns
                for d in xrange(0, cols):
                    if not data[d]:
                        continue
                    columns[d] += norm
                    total_width -= norm

            #  Split the remaining spaces among each column equally, except the
            # last one. And put the rest into the remainder column
            cols -= 1
            norm = total_width // cols
            for d in xrange(0, cols):
                columns[d] += norm
            columns[remainder_column] += total_width - (cols * norm)
            total_width = 0

        return columns
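    # Editor's sketch (illustrative values, not from the original source): given an
    # Output instance 'out', calcColumns() takes one dict per output column mapping
    # item length -> number of items of that length, e.g. for a name/version/repo list:
    #   widths = out.calcColumns([{30: 25, 45: 2}, {12: 27}, {9: 27}])
    # It returns one width per column, chosen so that as many rows as possible fit the
    # terminal; items longer than their column simply overflow (see fmtColumns below).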

    @staticmethod
    def _fmt_column_align_width(width):
        """Returns tuple of (align_left, width)"""
        if width < 0:
            return (True, -width)
        return (False, width)

    def _col_data(self, col_data):
        assert len(col_data) == 2 or len(col_data) == 3
        if len(col_data) == 2:
            (val, width) = col_data
            hibeg = hiend = ''
        if len(col_data) == 3:
            (val, width, highlight) = col_data
            (hibeg, hiend) = self._highlight(highlight)
        return (ucd(val), width, hibeg, hiend)

    def fmtColumns(self, columns, msg=u'', end=u''):
        """Return a row of data formatted into a string for output.
        Items can overflow their columns.

        :param columns: a list of tuples containing the data to
           output.  Each tuple contains first the item to be output,
           then the amount of space allocated for the column, and then
           optionally a type of highlighting for the item
        :param msg: a string to begin the line of output with
        :param end: a string to end the line of output with
        :return: a row of data formatted into a string for output
        """
        columns = list(columns)
        total_width = len(msg)
        data = []
        for col_data in columns[:-1]:
            (val, width, hibeg, hiend) = self._col_data(col_data)

            if not width: # Don't count this column, invisible text
                msg += u"%s"
                data.append(val)
                continue

            (align_left, width) = self._fmt_column_align_width(width)
            val_width = exact_width(val)
            if val_width <= width:
                #  Don't use fill_exact_width() because it sucks performance
                # wise for 1,000s of rows. Also allows us to use len(), when
                # we can.
                msg += u"%s%s%s%s "
                if align_left:
                    data.extend([hibeg, val, " " * (width - val_width), hiend])
                else:
                    data.extend([hibeg, " " * (width - val_width), val, hiend])
            else:
                msg += u"%s%s%s\n" + " " * (total_width + width + 1)
                data.extend([hibeg, val, hiend])
            total_width += width
            total_width += 1
        (val, width, hibeg, hiend) = self._col_data(columns[-1])
        (align_left, width) = self._fmt_column_align_width(width)
        val = fill_exact_width(val, width, left=align_left,
                              prefix=hibeg, suffix=hiend)
        msg += u"%%s%s" % end
        data.append(val)
        return msg % tuple(data)
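    # Editor's sketch (illustrative values, not from the original source): fmtColumns()
    # renders a single row. Negative widths mean left-aligned columns (see
    # _fmt_column_align_width above), positive widths right-aligned, and an optional
    # third tuple element selects highlighting, e.g.:
    #   out.fmtColumns([('bash.x86_64', -40), ('5.0-1', -22), ('@updates', -16)])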

    def simpleList(self, pkg, ui_overflow=False, indent='', highlight=False,
                   columns=None):
        """Print a package as a line.

        :param pkg: the package to be printed
        :param ui_overflow: unused
        :param indent: string to be prefixed onto the line to provide
           e.g. an indent
        :param highlight: highlighting options for the name of the
           package
        :param columns: tuple containing the space allocated for each
           column of output.  The columns are the package name, version,
           and repository
        """
        if columns is None:
            columns = (-40, -22, -16) # Old default
        na = '%s%s.%s' % (indent, pkg.name, pkg.arch)
        hi_cols = [highlight, 'normal', 'normal']

        columns = zip((na, pkg.evr, pkg._from_repo), columns, hi_cols)
        print(self.fmtColumns(columns))

    def simpleEnvraList(self, pkg, ui_overflow=False,
                        indent='', highlight=False, columns=None):
        """Print a package as a line, with the package itself in envra
        format so it can be passed to list/install/etc.

        :param pkg: the package to be printed
        :param ui_overflow: unused
        :param indent: string to be prefixed onto the line to provide
           e.g. an indent
        :param highlight: highlighting options for the name of the
           package
        :param columns: tuple containing the space allocated for each
           column of output.  The columns are the package envra and
           repository
        """
        if columns is None:
            columns = (-63, -16) # Old default
        envra = '%s%s' % (indent, ucd(pkg))
        hi_cols = [highlight, 'normal', 'normal']
        rid = pkg.ui_from_repo
        columns = zip((envra, rid), columns, hi_cols)
        print(self.fmtColumns(columns))

    def simple_name_list(self, pkg):
        """Print a package as a line containing its name."""
        print(ucd(pkg.name))

    def simple_nevra_list(self, pkg):
        """Print a package as a line containing its NEVRA."""
        print(ucd(pkg))

    def fmtKeyValFill(self, key, val):
        """Return a key value pair in the common two column output
        format.

        :param key: the key to be formatted
        :param val: the value associated with *key*
        :return: the key value pair formatted in two columns for output
        """
        keylen = exact_width(key)
        cols = self.term.real_columns
        if not cols:
            cols = sys_maxsize
        elif cols < 20:
            cols = 20
        nxt = ' ' * (keylen - 2) + ': '
        if not val:
            # textwrap.fill in case of empty val returns empty string
            return key
        val = ucd(val)
        ret = textwrap_fill(val, width=cols, initial_indent=key,
                            subsequent_indent=nxt)
        if ret.count("\n") > 1 and keylen > (cols // 3):
            # If it's big, redo it again with a smaller subsequent off
            ret = textwrap_fill(val, width=cols, initial_indent=key,
                                subsequent_indent='     ...: ')
        return ret
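    # Editor's illustration (not part of the original module): fmtKeyValFill() wraps a
    # long value to the terminal width so continuation lines stay aligned under the
    # value column, e.g.
    #   out.fmtKeyValFill('Description : ', some_long_text)
    # indents wrapped lines with spaces plus ': ' so they line up after the key.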

    def fmtSection(self, name, fill='='):
        """Format and return a section header.  The format of the
        header is a line with *name* centered, and *fill* repeated on
        either side to fill an entire line on the terminal.

        :param name: the name of the section
        :param fill: the character to repeat on either side of *name*
          to fill an entire line.  *fill* must be a single character.
        :return: a string formatted to be a section header
        """
        name = ucd(name)
        cols = self.term.columns - 2
        name_len = exact_width(name)
        if name_len >= (cols - 4):
            beg = end = fill * 2
        else:
            beg = fill * ((cols - name_len) // 2)
            end = fill * (cols - name_len - len(beg))

        return "%s %s %s" % (beg, name, end)

    def infoOutput(self, pkg, highlight=False):
        """Print information about the given package.

        :param pkg: the package to print information about
        :param highlight: highlighting options for the name of the
           package
        """
        def format_key_val(key, val):
            return " ".join([fill_exact_width(key, 12, 12), ":", str(val)])

        def format_key_val_fill(key, val):
            return self.fmtKeyValFill(fill_exact_width(key, 12, 12) + " : ", val or "")

        output_list = []
        (hibeg, hiend) = self._highlight(highlight)
        # Translators: This is abbreviated 'Name'. Should be no longer
        # than 12 characters. You can use the full version if it is short
        # enough in your language.
        key = select_short_long(12, C_("short", "Name"),
                                    C_("long", "Name"))
        output_list.append(format_key_val(key,
                                          "%s%s%s" % (hibeg, pkg.name, hiend)))
        if pkg.epoch:
            # Translators: This message should be no longer than 12 characters.
            output_list.append(format_key_val(_("Epoch"), pkg.epoch))
        key = select_short_long(12, C_("short", "Version"),
                                    C_("long", "Version"))
        output_list.append(format_key_val(key, pkg.version))
        # Translators: This message should be no longer than 12 characters.
        output_list.append(format_key_val(_("Release"), pkg.release))
        key = select_short_long(12, C_("short", "Arch"),
                                    C_("long", "Architecture"))
        output_list.append(format_key_val(key, pkg.arch))
        key = select_short_long(12, C_("short", "Size"), C_("long", "Size"))
        output_list.append(format_key_val(key,
                                          format_number(float(pkg._size))))
        # Translators: This message should be no longer than 12 characters.
        output_list.append(format_key_val(_("Source"), pkg.sourcerpm))
        key = select_short_long(12, C_("short", "Repo"),
                                    C_("long", "Repository"))
        output_list.append(format_key_val(key, pkg.repoid))

        if pkg._from_system:
            history_repo = self.history.repo(pkg)
            if history_repo:
                # Translators: This message should be no longer than 12 chars.
                output_list.append(format_key_val(_("From repo"), history_repo))
        if self.conf.verbose:
            # :hawkey does not support changelog information
            # print(_("Committer   : %s") % ucd(pkg.committer))
            # print(_("Committime  : %s") % time.ctime(pkg.committime))
            # Translators: This message should be no longer than 12 characters.
            output_list.append(format_key_val(_("Packager"), pkg.packager))
            # Translators: This message should be no longer than 12 characters.
            output_list.append(format_key_val(_("Buildtime"),
                                              dnf.util.normalize_time(pkg.buildtime)))
            if pkg.installtime:
                # Translators: This message should be no longer than 12 characters.
                output_list.append(format_key_val(_("Install time"),
                                                  dnf.util.normalize_time(pkg.installtime)))
            history_pkg = self.history.package_data(pkg)
            if history_pkg:
                try:
                    uid = int(history_pkg._item.getInstalledBy())
                except ValueError: # In case int() fails
                    uid = None
                # Translators: This message should be no longer than 12 chars.
                output_list.append(format_key_val(_("Installed by"), self._pwd_ui_username(uid)))
        # Translators: This is abbreviated 'Summary'. Should be no longer
        # than 12 characters. You can use the full version if it is short
        # enough in your language.
        key = select_short_long(12, C_("short", "Summary"),
                                    C_("long", "Summary"))
        output_list.append(format_key_val_fill(key, pkg.summary))
        if pkg.url:
            output_list.append(format_key_val(_("URL"), ucd(pkg.url)))
        # Translators: This message should be no longer than 12 characters.
        output_list.append(format_key_val_fill(_("License"), pkg.license))
        # Translators: This is abbreviated 'Description'. Should be no longer
        # than 12 characters. You can use the full version if it is short
        # enough in your language.
        key = select_short_long(12, C_("short", "Description"),
                                    C_("long", "Description"))
        output_list.append(format_key_val_fill(key, pkg.description))
        return "\n".join(output_list)

    def updatesObsoletesList(self, uotup, changetype, columns=None):
        """Print a simple string that explains the relationship
        between the members of an update or obsoletes tuple.

        :param uotup: an update or obsoletes tuple.  The first member
           is the new package, and the second member is the old
           package
        :param changetype: a string indicating what the change between
           the packages is, e.g. 'updates' or 'obsoletes'
        :param columns: a tuple containing information about how to
           format the columns of output.  The absolute value of each
           number in the tuple indicates how much space has been
           allocated for the corresponding column.  If the number is
           negative, the text in the column will be left justified,
           and if it is positive, the text will be right justified.
           The columns of output are the package name, version, and repository
        """
        (changePkg, instPkg) = uotup

        if columns is not None:
            # New style, output all info. for both old/new with old indented
            chi = self.conf.color_update_remote
            if changePkg.reponame != hawkey.SYSTEM_REPO_NAME:
                chi = self.conf.color_update_local
            self.simpleList(changePkg, columns=columns, highlight=chi)
            self.simpleList(instPkg, columns=columns, indent=' ' * 4,
                            highlight=self.conf.color_update_installed)
            return

        # Old style
        c_compact = changePkg.compactPrint()
        i_compact = '%s.%s' % (instPkg.name, instPkg.arch)
        c_repo = changePkg.repoid
        print('%-35.35s [%.12s] %.10s %-20.20s' %
              (c_compact, c_repo, changetype, i_compact))

    def listPkgs(self, lst, description, outputType, highlight_na={},
                 columns=None, highlight_modes={}):
        """Prints information about the given list of packages.

        :param lst: a list of packages to print information about
        :param description: string describing what the list of
           packages contains, e.g. 'Available Packages'
        :param outputType: The type of information to be printed.
           Current options::

              'list' - simple pkg list
              'info' - similar to rpm -qi output
              'name' - simple name list
              'nevra' - simple nevra list
        :param highlight_na: a dictionary containing information about
              packages that should be highlighted in the output.  The
              dictionary keys are (name, arch) tuples for the package,
              and the associated values are the package objects
              themselves.
        :param columns: a tuple containing information about how to
           format the columns of output.  The absolute value of each
           number in the tuple indicates how much space has been
           allocated for the corresponding column.  If the number is
           negative, the text in the column will be left justified,
           and if it is positive, the text will be right justified.
           The columns of output are the package name, version, and
           repository
        :param highlight_modes: dictionary containing information
              about how to highlight the packages in *highlight_na*.
              *highlight_modes* should contain the following keys::

                 'not in' - highlighting used for packages not in *highlight_na*
                 '=' - highlighting used when the package versions are equal
                 '<' - highlighting used when the package has a lower version
                       number
                 '>' - highlighting used when the package has a higher version
                       number
        :return: number of packages listed
        """
        if outputType in ['list', 'info', 'name', 'nevra']:
            if len(lst) > 0:
                print('%s' % description)
                info_set = set()
                if outputType == 'list':
                    unique_item_dict = {}
                    for pkg in lst:
                        unique_item_dict[str(pkg) + str(pkg._from_repo)] = pkg

                    lst = unique_item_dict.values()

                for pkg in sorted(lst):
                    key = (pkg.name, pkg.arch)
                    highlight = False
                    if key not in highlight_na:
                        highlight = highlight_modes.get('not in', 'normal')
                    elif pkg.evr_eq(highlight_na[key]):
                        highlight = highlight_modes.get('=', 'normal')
                    elif pkg.evr_lt(highlight_na[key]):
                        highlight = highlight_modes.get('>', 'bold')
                    else:
                        highlight = highlight_modes.get('<', 'normal')

                    if outputType == 'list':
                        self.simpleList(pkg, ui_overflow=True,
                                        highlight=highlight, columns=columns)
                    elif outputType == 'info':
                        info_set.add(self.infoOutput(pkg, highlight=highlight) + "\n")
                    elif outputType == 'name':
                        self.simple_name_list(pkg)
                    elif outputType == 'nevra':
                        self.simple_nevra_list(pkg)
                    else:
                        pass

                if info_set:
                    print("\n".join(sorted(info_set)))

            return len(lst)

    def userconfirm(self, msg=None, defaultyes_msg=None):
        """Get a yes or no from the user, and default to No

        :param msg: prompt string for the [y/N] case
        :param defaultyes_msg: prompt string for the [Y/n] case
        :return: True if the user selects yes, and False if the user
           selects no
        """
        yui = (ucd(_('y')), ucd(_('yes')))
        nui = (ucd(_('n')), ucd(_('no')))
        aui = yui + nui
        while True:
            if msg is None:
                msg = _('Is this ok [y/N]: ')
            choice = ''
            if self.conf.defaultyes:
                if defaultyes_msg is None:
                    msg = _('Is this ok [Y/n]: ')
                else:
                    msg = defaultyes_msg
            try:
                choice = dnf.i18n.ucd_input(msg)
            except EOFError:
                pass
            except KeyboardInterrupt:
                choice = nui[0]
            choice = ucd(choice).lower()
            if len(choice) == 0:
                choice = yui[0] if self.conf.defaultyes else nui[0]
            if choice in aui:
                break

            # If the English one letter names don't mix with the translated
            # letters, allow them too:
            if u'y' == choice and u'y' not in aui:
                choice = yui[0]
                break
            if u'n' == choice and u'n' not in aui:
                choice = nui[0]
                break

        if choice in yui:
            return True
        return False

    def _pkgs2name_dict(self, sections):
        installed = self.sack.query().installed()._name_dict()
        available = self.sack.query().available()._name_dict()

        d = {}
        for pkg_name in itertools.chain(*list(zip(*sections))[1]):
            if pkg_name in installed:
                d[pkg_name] = installed[pkg_name][0]
            elif pkg_name in available:
                d[pkg_name] = available[pkg_name][0]
        return d

    def _pkgs2col_lengths(self, sections, name_dict):
        nevra_lengths = {}
        repo_lengths = {}
        for pkg_name in itertools.chain(*list(zip(*sections))[1]):
            pkg = name_dict.get(pkg_name)
            if pkg is None:
                continue
            nevra_l = exact_width(ucd(pkg)) + exact_width(self.GRP_PACKAGE_INDENT)
            repo_l = exact_width(ucd(pkg.reponame))
            nevra_lengths[nevra_l] = nevra_lengths.get(nevra_l, 0) + 1
            repo_lengths[repo_l] = repo_lengths.get(repo_l, 0) + 1
        return (nevra_lengths, repo_lengths)
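    # _pkgs2col_lengths() returns two histograms mapping a needed column width
    # to how many packages need it, e.g. ({42: 3, 57: 1}, {6: 4}) for the
    # nevra and repo columns respectively (values are illustrative); these
    # counts are then fed to calcColumns().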

    def _display_packages(self, pkg_names):
        for name in pkg_names:
            print('%s%s' % (self.GRP_PACKAGE_INDENT, name))

    def _display_packages_verbose(self, pkg_names, name_dict, columns):
        for name in pkg_names:
            try:
                pkg = name_dict[name]
            except KeyError:
                # package not in any repo -> print only package name
                print('%s%s' % (self.GRP_PACKAGE_INDENT, name))
                continue
            highlight = False
            if not pkg._from_system:
                highlight = self.conf.color_list_available_install
            self.simpleEnvraList(pkg, ui_overflow=True,
                                 indent=self.GRP_PACKAGE_INDENT,
                                 highlight=highlight,
                                 columns=columns)

    def display_pkgs_in_groups(self, group):
        """Output information about the packages in a given group

        :param group: a Group object to output information about
        """
        def names(packages):
            return sorted(pkg.name for pkg in packages)
        print('\n' + _('Group: %s') % group.ui_name)

        verbose = self.conf.verbose
        if verbose:
            print(_(' Group-Id: %s') % ucd(group.id))
        if group.ui_description:
            print(_(' Description: %s') % (ucd(group.ui_description) or ""))
        if group.lang_only:
            print(_(' Language: %s') % group.lang_only)

        sections = (
            (_(' Mandatory Packages:'), names(group.mandatory_packages)),
            (_(' Default Packages:'), names(group.default_packages)),
            (_(' Optional Packages:'), names(group.optional_packages)),
            (_(' Conditional Packages:'), names(group.conditional_packages)))
        if verbose:
            name_dict = self._pkgs2name_dict(sections)
            col_lengths = self._pkgs2col_lengths(sections, name_dict)
            columns = self.calcColumns(col_lengths)
            columns = (-columns[0], -columns[1])
            for (section_name, packages) in sections:
                if len(packages) < 1:
                    continue
                print(section_name)
                self._display_packages_verbose(packages, name_dict, columns)
        else:
            for (section_name, packages) in sections:
                if len(packages) < 1:
                    continue
                print(section_name)
                self._display_packages(packages)

    def display_groups_in_environment(self, environment):
        """Output information about the packages in a given environment

        :param environment: an Environment object to output information about
        """
        def names(groups):
            return sorted(group.name for group in groups)
        print(_('Environment Group: %s') % environment.ui_name)

        if self.conf.verbose:
            print(_(' Environment-Id: %s') % ucd(environment.id))
        if environment.ui_description:
            description = ucd(environment.ui_description) or ""
            print(_(' Description: %s') % description)

        sections = (
            (_(' Mandatory Groups:'), names(environment.mandatory_groups)),
            (_(' Optional Groups:'), names(environment.optional_groups)))
        for (section_name, packages) in sections:
            if len(packages) < 1:
                continue
            print(section_name)
            self._display_packages(packages)

    def matchcallback(self, po, values, matchfor=None, verbose=None,
                      highlight=None):
        """Output search/provides type callback matches.

        :param po: the package object that matched the search
        :param values: the information associated with *po* that
           matched the search
        :param matchfor: a list of strings to be highlighted in the
           output
        :param verbose: whether to output extra verbose information
        :param highlight: highlighting options for the highlighted matches
        """
        def print_highlighted_key_item(key, item, printed_headline, can_overflow=False):
            if not printed_headline:
                print(_('Matched from:'))
            item = ucd(item) or ""
            if item == "":
                return
            if matchfor:
                item = self._sub_highlight(item, highlight, matchfor, ignore_case=True)
            if can_overflow:
                print(self.fmtKeyValFill(key, item))
            else:
                print(key % item)

        def print_file_provides(item, printed_match):
            if not self.FILE_PROVIDE_RE.match(item):
                return False
            key = _("Filename    : %s")
            file_match = False
            for filename in po.files:
                if fnmatch.fnmatch(filename, item):
                    print_highlighted_key_item(
                        key, filename, file_match or printed_match, can_overflow=False)
                    file_match = True
            return file_match

        if self.conf.showdupesfromrepos:
            msg = '%s : ' % po
        else:
            msg = '%s.%s : ' % (po.name, po.arch)
        msg = self.fmtKeyValFill(msg, po.summary or "")
        if matchfor:
            if highlight is None:
                highlight = self.conf.color_search_match
            msg = self._sub_highlight(msg, highlight, matchfor, ignore_case=True)
        print(msg)

        if verbose is None:
            verbose = self.conf.verbose
        if not verbose:
            return

        print(_("Repo        : %s") % po.ui_from_repo)
        printed_match = False
        name_match = False
        for item in set(values):
            if po.summary == item:
                name_match = True
                continue # Skip double name/summary printing

            if po.description == item:
                key = _("Description : ")
                print_highlighted_key_item(key, item, printed_match, can_overflow=True)
                printed_match = True
            elif po.url == item:
                key = _("URL         : %s")
                print_highlighted_key_item(key, item, printed_match, can_overflow=False)
                printed_match = True
            elif po.license == item:
                key = _("License     : %s")
                print_highlighted_key_item(key, item, printed_match, can_overflow=False)
                printed_match = True
            elif print_file_provides(item, printed_match):
                printed_match = True
            else:
                key = _("Provide    : %s")
                for provide in po.provides:
                    provide = str(provide)
                    if fnmatch.fnmatch(provide, item):
                        print_highlighted_key_item(key, provide, printed_match, can_overflow=False)
                        printed_match = True
                    else:
                        first_provide = provide.split()[0]
                        possible = set('=<>')
                        if any((char in possible) for char in item):
                            item_new = item.split()[0]
                        else:
                            item_new = item
                        if fnmatch.fnmatch(first_provide, item_new):
                            print_highlighted_key_item(
                                key, provide, printed_match, can_overflow=False)
                            printed_match = True

        if not any([printed_match, name_match]):
            for item in set(values):
                key = _("Other       : %s")
                print_highlighted_key_item(key, item, printed_match, can_overflow=False)
        print()

    def matchcallback_verbose(self, po, values, matchfor=None):
        """Output search/provides type callback matches.  This will
        output more information than :func:`matchcallback`.

        :param po: the package object that matched the search
        :param values: the information associated with *po* that
           matched the search
        :param matchfor: a list of strings to be highlighted in the
           output
        """
        return self.matchcallback(po, values, matchfor, verbose=True)

    def reportDownloadSize(self, packages, installonly=False):
        """Report the total download size for a set of packages

        :param packages: a list of package objects
        :param installonly: whether the transaction consists only of installations
        """
        totsize = 0
        locsize = 0
        insize = 0
        error = False
        for pkg in packages:
            # Just to be on the safe side, if for some reason getting
            # the package size fails, log the error and don't report download
            # size
            try:
                size = int(pkg._size)
                totsize += size
                try:
                    if pkg.verifyLocalPkg():
                        locsize += size
                except Exception:
                    pass

                if not installonly:
                    continue

                try:
                    size = int(pkg.installsize)
                except Exception:
                    pass
                insize += size
            except Exception:
                error = True
                msg = _('There was an error calculating total download size')
                logger.error(msg)
                break

        if not error:
            if locsize:
                logger.info(_("Total size: %s"),
                                        format_number(totsize))
            if locsize != totsize:
                logger.info(_("Total download size: %s"),
                                        format_number(totsize - locsize))
            if installonly:
                logger.info(_("Installed size: %s"), format_number(insize))

    def reportRemoveSize(self, packages):
        """Report the total size of packages being removed.

        :param packages: a list of package objects
        """
        totsize = 0
        error = False
        for pkg in packages:
            # Just to be on the safe side, if for some reason getting
            # the package size fails, log the error and don't report the
            # freed size
            try:
                size = pkg._size
                totsize += size
            except Exception:
                error = True
                msg = _('There was an error calculating installed size')
                logger.error(msg)
                break
        if not error:
            logger.info(_("Freed space: %s"), format_number(totsize))

    def list_group_transaction(self, comps, history, diff):
        if not diff:
            return None

        out = []
        rows = []
        if diff.new_groups:
            out.append(_('Marking packages as installed by the group:'))
        for grp_id in diff.new_groups:
            pkgs = list(diff.added_packages(grp_id))
            group_object = comps._group_by_id(grp_id)
            grp_name = group_object.ui_name if group_object else grp_id
            rows.extend(_spread_in_columns(4, "@" + grp_name, pkgs))
        if diff.removed_groups:
            out.append(_('Marking packages as removed by the group:'))
        for grp_id in diff.removed_groups:
            pkgs = list(diff.removed_packages(grp_id))
            grp_name = history.group.get(grp_id).ui_name
            rows.extend(_spread_in_columns(4, "@" + grp_name, pkgs))

        if rows:
            col_data = self._col_widths(rows)
            for row in rows:
                out.append(self.fmtColumns(zip(row, col_data), ' '))
            out[0:0] = self._banner(col_data, (_('Group'), _('Packages'), '', ''))
        return '\n'.join(out)

    def list_transaction(self, transaction, total_width=None):
        """Return a string representation of the transaction in an
        easy-to-read format.
        """
        forward_actions = hawkey.UPGRADE | hawkey.UPGRADE_ALL | hawkey.DISTUPGRADE | \
            hawkey.DISTUPGRADE_ALL | hawkey.DOWNGRADE | hawkey.INSTALL
        skipped_conflicts = set()
        skipped_broken = set()

        if transaction is None:
            # set empty transaction list instead of returning None
            # in order to display module changes when RPM transaction is empty
            transaction = []

        list_bunch = dnf.util._make_lists(transaction)
        pkglist_lines = []
        data = {'n' : {}, 'v' : {}, 'r' : {}}
        a_wid = 0 # Arch can't get "that big" ... so always use the max.

        def _add_line(lines, data, a_wid, po, obsoletes=[]):
            (n, a, e, v, r) = po.pkgtup
            evr = po.evr
            repoid = po._from_repo
            size = format_number(po._size)

            if a is None: # gpgkeys are weird
                a = 'noarch'

            # none, partial, full?
            if po._from_system:
                hi = self.conf.color_update_installed
            elif po._from_cmdline:
                hi = self.conf.color_update_local
            else:
                hi = self.conf.color_update_remote
            lines.append((n, a, evr, repoid, size, obsoletes, hi))
            #  Create a dict of field_length => number of packages, for
            # each field.
            for (d, v) in (("n", len(n)), ("v", len(evr)), ("r", len(repoid))):
                data[d].setdefault(v, 0)
                data[d][v] += 1
            a_wid = max(a_wid, len(a))
            return a_wid
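        # 'data' accumulates, per field ('n'ame, 'v'ersion and 'r'epo), a
        # histogram of string length -> occurrence count, e.g.
        # data['n'] == {17: 4, 23: 1} (illustrative values); calcColumns()
        # later turns these histograms into column widths, while the arch
        # column simply uses the running maximum a_wid.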
        ins_group_msg = _('Installing group/module packages') if dnf.base.WITH_MODULES \
            else _('Installing group packages')

        for (action, pkglist) in [
                # TRANSLATORS: This is for a list of packages to be installed.
                (C_('summary', 'Installing'), list_bunch.installed),
                # TRANSLATORS: This is for a list of packages to be upgraded.
                (C_('summary', 'Upgrading'), list_bunch.upgraded),
                # TRANSLATORS: This is for a list of packages to be reinstalled.
                (C_('summary', 'Reinstalling'), list_bunch.reinstalled),
                (ins_group_msg, list_bunch.installed_group),
                (_('Installing dependencies'), list_bunch.installed_dep),
                (_('Installing weak dependencies'), list_bunch.installed_weak),
                # TRANSLATORS: This is for a list of packages to be removed.
                (_('Removing'), list_bunch.erased),
                (_('Removing dependent packages'), list_bunch.erased_dep),
                (_('Removing unused dependencies'), list_bunch.erased_clean),
                # TRANSLATORS: This is for a list of packages to be downgraded.
                (C_('summary', 'Downgrading'), list_bunch.downgraded)]:
            lines = []

            # build a reverse mapping to 'replaced_by'
            # this is required to achieve reasonable speed
            replaces = {}
            for tsi in transaction:
                if tsi.action != libdnf.transaction.TransactionItemAction_OBSOLETED:
                    continue
                for i in tsi._item.getReplacedBy():
                    replaces.setdefault(i, set()).add(tsi)

            for tsi in sorted(pkglist, key=lambda x: x.pkg):
                if tsi.action not in dnf.transaction.FORWARD_ACTIONS + [libdnf.transaction.TransactionItemAction_REMOVE]:
                    continue

                # get TransactionItems obsoleted by tsi
                obsoleted = sorted(replaces.get(tsi._item, []))

                a_wid = _add_line(lines, data, a_wid, tsi.pkg, obsoleted)

            pkglist_lines.append((action, lines))

        installedProfiles = sorted(dict(self.base._moduleContainer.getInstalledProfiles()).items())
        if installedProfiles:
            action = _("Installing module profiles")
            lines = []
            for name, profiles in installedProfiles:
                for profile in list(profiles):
                    lines.append(("%s/%s" % (name, profile), "", "", "", "", "", ""))
            pkglist_lines.append((action, lines))

        removedProfiles = sorted(dict(self.base._moduleContainer.getRemovedProfiles()).items())
        if removedProfiles:
            action = _("Disabling module profiles")
            lines = []
            for name, profiles in removedProfiles:
                for profile in list(profiles):
                    lines.append(("%s/%s" % (name, profile), "", "", "", "", "", ""))
            pkglist_lines.append((action, lines))

        enabledStreams = sorted(dict(self.base._moduleContainer.getEnabledStreams()).items())
        if enabledStreams:
            action = _("Enabling module streams")
            lines = []
            for name, stream in enabledStreams:
                lines.append((name, "", stream, "", "", "", ""))
            pkglist_lines.append((action, lines))

        switchedStreams = sorted(dict(self.base._moduleContainer.getSwitchedStreams()).items())
        if switchedStreams:
            action = _("Switching module streams")
            lines = []
            for name, stream in switchedStreams:
                lines.append((name, "", "%s -> %s" % (stream[0], stream[1]), "", "", "", ""))
            pkglist_lines.append((action, lines))

        disabledModules = sorted(list(self.base._moduleContainer.getDisabledModules()))
        if disabledModules:
            action = _("Disabling modules")
            lines = []
            for name in disabledModules:
                lines.append((name, "", "", "", "", "", ""))
            pkglist_lines.append((action, lines))

        resetModules = sorted(list(self.base._moduleContainer.getResetModules()))
        if resetModules:
            action = _("Resetting modules")
            lines = []
            for name in resetModules:
                lines.append((name, "", "", "", "", "", ""))
            pkglist_lines.append((action, lines))
        if self.base._history:
            def format_line(group):
                name = group.getName()
                return (name if name else _("<name-unset>"), "", "", "", "", "", "")

            install_env_group = self.base._history.env._installed
            if install_env_group:
                action = _("Installing Environment Groups")
                lines = []
                for group in install_env_group.values():
                    lines.append(format_line(group))
                pkglist_lines.append((action, lines))
            upgrade_env_group = self.base._history.env._upgraded
            if upgrade_env_group:
                action = _("Upgrading Environment Groups")
                lines = []
                for group in upgrade_env_group.values():
                    lines.append(format_line(group))
                pkglist_lines.append((action, lines))
            remove_env_group = self.base._history.env._removed
            if remove_env_group:
                action = _("Removing Environment Groups")
                lines = []
                for group in remove_env_group.values():
                    lines.append(format_line(group))
                pkglist_lines.append((action, lines))
            install_group = self.base._history.group._installed
            if install_group:
                action = _("Installing Groups")
                lines = []
                for group in install_group.values():
                    lines.append(format_line(group))
                pkglist_lines.append((action, lines))
            upgrade_group = self.base._history.group._upgraded
            if upgrade_group:
                action = _("Upgrading Groups")
                lines = []
                for group in upgrade_group.values():
                    lines.append(format_line(group))
                pkglist_lines.append((action, lines))
            remove_group = self.base._history.group._removed
            if remove_group:
                action = _("Removing Groups")
                lines = []
                for group in remove_group.values():
                    lines.append(format_line(group))
                pkglist_lines.append((action, lines))
        # show skipped conflicting packages
        if not self.conf.best and self.base._goal.actions & forward_actions:
            lines = []
            skipped_conflicts, skipped_broken = self.base._skipped_packages(
                report_problems=True, transaction=transaction)
            skipped_broken = dict((str(pkg), pkg) for pkg in skipped_broken)
            for pkg in sorted(skipped_conflicts):
                a_wid = _add_line(lines, data, a_wid, pkg, [])
            recommendations = ["--best"]
            if not self.base._allow_erasing:
                recommendations.append("--allowerasing")
            skip_str = _("Skipping packages with conflicts:\n"
                         "(add '%s' to command line "
                         "to force their upgrade)") % " ".join(recommendations)
            # remove misleading green color from the "packages with conflicts" lines
            lines = [i[:-1] + ("", ) for i in lines]
            pkglist_lines.append((skip_str, lines))

            lines = []
            for nevra, pkg in sorted(skipped_broken.items()):
                a_wid = _add_line(lines, data, a_wid, pkg, [])
            skip_str = _("Skipping packages with broken dependencies%s")
            if self.base.conf.upgrade_group_objects_upgrade:
                skip_str = skip_str % ""
            else:
                skip_str = skip_str % _(" or part of a group")

            # remove misleading green color from the "broken dependencies" lines
            lines = [i[:-1] + ("", ) for i in lines]
            pkglist_lines.append((skip_str, lines))
        output_width = self.term.columns
        if not data['n'] and not self.base._moduleContainer.isChanged() and not \
                (self.base._history and (self.base._history.group or self.base._history.env)):
            return u''
        else:
            data = [data['n'], {}, data['v'], data['r'], {}]
            columns = [1, a_wid, 1, 1, 5]
            columns = self.calcColumns(data, indent="  ", columns=columns,
                                       remainder_column=2, total_width=total_width)
            (n_wid, a_wid, v_wid, r_wid, s_wid) = columns
            real_width = sum(columns) + 5
            output_width = output_width if output_width >= real_width else real_width

            # Do not use 'Package' without context. Using context resolves
            # RhBug 1302935 as a side effect.
            msg_package = select_short_long(n_wid,
            # Translators: This is the short version of 'Package'. You can
            # use the full (unabbreviated) term 'Package' if you think that
            # the translation to your language is not too long and will
            # always fit to limited space.
                                            C_('short', 'Package'),
            # Translators: This is the full (unabbreviated) term 'Package'.
                                            C_('long', 'Package'))
            msg_arch = select_short_long(a_wid,
            # Translators: This is abbreviated 'Architecture', used when
            # there is not enough space to display the full word.
                                         C_('short', 'Arch'),
            # Translators: This is the full word 'Architecture', used when
            # we have enough space.
                                         C_('long', 'Architecture'))
            msg_version = select_short_long(v_wid,
            # Translators: This is the short version of 'Version'. You can
            # use the full (unabbreviated) term 'Version' if you think that
            # the translation to your language is not too long and will
            # always fit to limited space.
                                            C_('short', 'Version'),
            # Translators: This is the full (unabbreviated) term 'Version'.
                                            C_('long', 'Version'))
            msg_repository = select_short_long(r_wid,
            # Translators: This is abbreviated 'Repository', used when
            # there is not enough space to display the full word.
                                               C_('short', 'Repo'),
            # Translators: This is the full word 'Repository', used when
            # we have enough space.
                                               C_('long', 'Repository'))
            msg_size = select_short_long(s_wid,
            # Translators: This is the short version of 'Size'. It should
            # not be longer than 5 characters. If the term 'Size' in your
            # language is not longer than 5 characters then you can use it
            # unabbreviated.
                                         C_('short', 'Size'),
            # Translators: This is the full (unabbreviated) term 'Size'.
                                         C_('long', 'Size'))

            out = [u"%s\n%s\n%s\n" % ('=' * output_width,
                                      self.fmtColumns(((msg_package, -n_wid),
                                                       (msg_arch, -a_wid),
                                                       (msg_version, -v_wid),
                                                       (msg_repository, -r_wid),
                                                       (msg_size, s_wid)), u" "),
                                      '=' * output_width)]

        for (action, lines) in pkglist_lines:
            if lines:
                totalmsg = u"%s:\n" % action
            for (n, a, evr, repoid, size, obsoletes, hi) in lines:
                columns = ((n, -n_wid, hi), (a, -a_wid),
                           (evr, -v_wid), (repoid, -r_wid), (size, s_wid))
                msg = self.fmtColumns(columns, u" ", u"\n")
                hibeg, hiend = self._highlight(self.conf.color_update_installed)
                for obspo in sorted(obsoletes):
                    appended = '     ' + _('replacing') + '  %s%s%s.%s %s\n'
                    appended %= (hibeg, obspo.name, hiend, obspo.arch, obspo.evr)
                    msg += appended
                totalmsg = totalmsg + msg

            if lines:
                out.append(totalmsg)
        out.append(_("""
Transaction Summary
%s
""") % ('=' * output_width))
        summary_data = (
            (_('Install'), len(list_bunch.installed) +
             len(list_bunch.installed_group) +
             len(list_bunch.installed_weak) +
             len(list_bunch.installed_dep), 0),
            (_('Upgrade'), len(list_bunch.upgraded), 0),
            (_('Remove'), len(list_bunch.erased) + len(list_bunch.erased_dep) +
             len(list_bunch.erased_clean), 0),
            (_('Downgrade'), len(list_bunch.downgraded), 0),
            (_('Skip'), len(skipped_conflicts) + len(skipped_broken), 0))
        max_msg_action = 0
        max_msg_count = 0
        max_msg_pkgs = 0
        max_msg_depcount = 0
        for action, count, depcount in summary_data:
            if not count and not depcount:
                continue

            msg_pkgs = P_('Package', 'Packages', count)
            len_msg_action = exact_width(action)
            len_msg_count = exact_width(unicode(count))
            len_msg_pkgs = exact_width(msg_pkgs)

            if depcount:
                len_msg_depcount = exact_width(unicode(depcount))
            else:
                len_msg_depcount = 0

            max_msg_action = max(len_msg_action, max_msg_action)
            max_msg_count = max(len_msg_count, max_msg_count)
            max_msg_pkgs = max(len_msg_pkgs, max_msg_pkgs)
            max_msg_depcount = max(len_msg_depcount, max_msg_depcount)

        for action, count, depcount in summary_data:
            msg_pkgs = P_('Package', 'Packages', count)
            if depcount:
                msg_deppkgs = P_('Dependent package', 'Dependent packages',
                                 depcount)
                action_msg = fill_exact_width(action, max_msg_action)
                if count:
                    msg = '%s  %*d %s (+%*d %s)\n'
                    out.append(msg % (action_msg,
                                      max_msg_count, count,
                                      "%-*s" % (max_msg_pkgs, msg_pkgs),
                                      max_msg_depcount, depcount, msg_deppkgs))
                else:
                    msg = '%s  %s  ( %*d %s)\n'
                    out.append(msg % (action_msg,
                                      (max_msg_count + max_msg_pkgs) * ' ',
                                      max_msg_depcount, depcount, msg_deppkgs))
            elif count:
                msg = '%s  %*d %s\n'
                out.append(msg % (fill_exact_width(action, max_msg_action),
                                  max_msg_count, count, msg_pkgs))
        return ''.join(out)


    def _pto_callback(self, action, tsis):
        #  Works a bit like calcColumns, but we never overflow a column we just
        # have a dynamic number of columns.
        def _fits_in_cols(msgs, num):
            """ Work out how many columns we can use to display stuff, in
                the post trans output. """
            if len(msgs) < num:
                return []

            left = self.term.columns - ((num - 1) + 2)
            if left <= 0:
                return []

            col_lens = [0] * num
            col = 0
            for msg in msgs:
                if len(msg) > col_lens[col]:
                    diff = (len(msg) - col_lens[col])
                    if left <= diff:
                        return []
                    left -= diff
                    col_lens[col] = len(msg)
                col += 1
                col %= len(col_lens)

            for col in range(len(col_lens)):
                col_lens[col] += left // num
                col_lens[col] *= -1
            return col_lens
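        # Illustrative sketch (comment only): assuming an 80 column terminal
        # and three messages of 20 characters each, _fits_in_cols(msgs, 3)
        # spreads the remaining space evenly and returns [-25, -25, -25]
        # (negative meaning left justified); an empty list means the messages
        # do not fit in that many columns.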

        if not tsis:
            return ''
        out = []
        msgs = []
        out.append('{}:'.format(action))
        for tsi in tsis:
            msgs.append(str(tsi))
        for num in (8, 7, 6, 5, 4, 3, 2):
            cols = _fits_in_cols(msgs, num)
            if cols:
                break
        if not cols:
            cols = [-(self.term.columns - 2)]
        while msgs:
            current_msgs = msgs[:len(cols)]
            out.append('  {}'.format(self.fmtColumns(zip(current_msgs, cols))))
            msgs = msgs[len(cols):]
        return out


    def post_transaction_output(self, transaction):
        """
        Return a human-readable summary of the transaction. Packages in sections
        are arranged to columns.
        """
        return dnf.util._post_transaction_output(self.base, transaction, self._pto_callback)

    def setup_progress_callbacks(self):
        """Set up the progress callbacks and various
           output bars based on debug level.
        """
        progressbar = None
        if self.conf.debuglevel >= 2:
            progressbar = dnf.cli.progress.MultiFileProgressMeter(fo=sys.stdout)
            self.progress = dnf.cli.progress.MultiFileProgressMeter(fo=sys.stdout)

        # setup our depsolve progress callback
        return (progressbar, DepSolveProgressCallBack())

    def download_callback_total_cb(self, remote_size, download_start_timestamp):
        """Outputs summary information about the download process.

        :param remote_size: the total amount of information that was
           downloaded, in bytes
        :param download_start_timestamp: the time when the download
           process started, in seconds since the epoch
        """
        if remote_size <= 0:
            return

        width = self.term.columns
        logger.info("-" * width)
        dl_time = max(0.01, time.time() - download_start_timestamp)
        msg = ' %5sB/s | %5sB %9s     ' % (
            format_number(remote_size // dl_time),
            format_number(remote_size),
            format_time(dl_time))
        msg = fill_exact_width(_("Total"), width - len(msg)) + msg
        logger.info(msg)

    def _history_uiactions(self, hpkgs):
        actions = set()
        actions_short = set()
        count = 0
        for pkg in hpkgs:
            if pkg.action in (libdnf.transaction.TransactionItemAction_UPGRADED, libdnf.transaction.TransactionItemAction_DOWNGRADED):
                # skip states we don't want to display to the user
                continue
            actions.add(pkg.action_name)
            actions_short.add(pkg.action_short)
            count += 1

        if len(actions) > 1:
            return count, ", ".join(sorted(actions_short))

        # So empty transactions work, although that "shouldn't" really happen
        return count, "".join(list(actions))

    def _pwd_ui_username(self, uid, limit=None):
        if isinstance(uid, list):
            return [self._pwd_ui_username(u, limit) for u in uid]

        # loginuid is set to      -1 (0xFFFF_FFFF) on init, in newer kernels.
        # loginuid is set to INT_MAX (0x7FFF_FFFF) on init, in older kernels.
        if uid is None or uid in (0xFFFFFFFF, 0x7FFFFFFF):
            loginid = _("<unset>")
            name = _("System") + " " + loginid
            if limit is not None and len(name) > limit:
                name = loginid
            return ucd(name)

        def _safe_split_0(text, *args):
            """ Split gives us a [0] for everything _but_ '', this function
                returns '' in that case. """
            ret = text.split(*args)
            if not ret:
                return ''
            return ret[0]
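        # Illustrative sketch (comment only): for a hypothetical passwd entry
        # with pw_gecos == 'Jane Doe;Room 1;555-0100' and pw_name == 'jdoe'
        # this resolves to 'Jane Doe <jdoe>', shortened to 'Jane ... <jdoe>'
        # or '<jdoe>' when a length limit is given.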

        try:
            user = pwd.getpwuid(int(uid))
            fullname = _safe_split_0(ucd(user.pw_gecos), ';', 2)
            user_name = ucd(user.pw_name)
            name = "%s <%s>" % (fullname, user_name)
            if limit is not None and len(name) > limit:
                name = "%s ... <%s>" % (_safe_split_0(fullname), user_name)
                if len(name) > limit:
                    name = "<%s>" % user_name
            return name
        except KeyError:
            return ucd(uid)

    def historyListCmd(self, tids, reverse=False):
        """Output a list of information about the history of yum
        transactions.

        :param tids: transaction Ids; lists all transactions if empty
        """
        transactions = self.history.old(tids)
        if self.conf.history_list_view == 'users':
            uids = [1, 2]
        elif self.conf.history_list_view == 'commands':
            uids = [1]
        else:
            assert self.conf.history_list_view == 'single-user-commands'
            uids = set()
            done = 0
            blanks = 0
            for transaction in transactions:
                done += 1
                if transaction.cmdline is None:
                    blanks += 1
                uids.add(transaction.loginuid)

        fmt = "%s | %s | %s | %s | %s"
        if len(uids) == 1:
            name = _("Command line")
            real_cols = self.term.real_columns
            if real_cols is None:
                # if output is redirected (e.g. piped to `less`), the detected
                # column count is None; fall back to the stdin file descriptor
                # to determine the terminal size
                real_cols = dnf.cli.term._real_term_width(0)
            if real_cols is None:
                # if even the stdin fd fails, use 24 so the table fits in 80 columns
                real_cols = 24
            name_width = real_cols - 55 if real_cols > 79 else 24
        else:
            # TRANSLATORS: user names who executed transaction in history command output
            name = _("User name")
            name_width = 24
        print(fmt % (fill_exact_width(_("ID"), 6, 6),
                     fill_exact_width(name, name_width, name_width),
                     fill_exact_width(_("Date and time"), 16, 16),
                     fill_exact_width(_("Action(s)"), 14, 14),
                     fill_exact_width(_("Altered"), 7, 7)))

        # total table width: each column length +3 (padding and separator between columns)
        table_width = 6 + 3 + name_width + 3 + 16 + 3 + 14 + 3 + 7
        print("-" * table_width)
        fmt = "%6u | %s | %-16.16s | %s | %4u"

        if reverse is True:
            transactions = reversed(transactions)
        for transaction in transactions:
            if len(uids) == 1:
                name = transaction.cmdline or ''
            else:
                name = self._pwd_ui_username(transaction.loginuid, 24)
            name = ucd(name)
            tm = time.strftime("%Y-%m-%d %H:%M",
                               time.localtime(transaction.beg_timestamp))
            num, uiacts = self._history_uiactions(transaction.data())
            name = fill_exact_width(name, name_width, name_width)
            uiacts = fill_exact_width(uiacts, 14, 14)
            rmark = lmark = ' '
            if transaction.return_code is None:
                rmark = lmark = '*'
            elif transaction.return_code:
                rmark = lmark = '#'
                # We don't check .errors, because return_code will be non-0
            elif transaction.is_output:
                rmark = lmark = 'E'
            if transaction.altered_lt_rpmdb:
                rmark = '<'
            if transaction.altered_gt_rpmdb:
                lmark = '>'
            print(fmt % (transaction.tid, name, tm, uiacts, num), "%s%s" % (lmark, rmark))

    def historyInfoCmd(self, tids, pats=[], mtids=set()):
        """Output information about a transaction in history

        :param tids: transaction Ids; prints info for the last transaction if empty
        :raises dnf.exceptions.Error in case no transactions were found
        """
        tids = set(tids)
        last = self.history.last()
        if last is None:
            logger.critical(_('No transactions'))
            raise dnf.exceptions.Error(_('Failed history info'))

        lasttid = last.tid
        lastdbv = last.end_rpmdb_version

        transactions = []
        if not tids:
            last = self.history.last(complete_transactions_only=False)
            if last is not None:
                tids.add(last.tid)
                transactions.append(last)
        else:
            transactions = self.history.old(tids)

        if not tids:
            logger.critical(_('No transaction ID, or package, given'))
            raise dnf.exceptions.Error(_('Failed history info'))

        bmtid, emtid = -1, -1
        mobj = None
        done = False

        if mtids:
            mtids = sorted(mtids)
            bmtid, emtid = mtids.pop()

        for trans in transactions:
            if lastdbv is not None and trans.tid == lasttid:
                #  If this is the last transaction, is good and it doesn't
                # match the current rpmdb ... then mark it as bad.
                rpmdbv = self.sack._rpmdb_version()
                trans.compare_rpmdbv(str(rpmdbv))
            lastdbv = None

            merged = False

            if trans.tid >= bmtid and trans.tid <= emtid:
                if mobj is None:
                    mobj = MergedTransactionWrapper(trans)
                else:
                    mobj.merge(trans)
                merged = True
            elif mobj is not None:
                if done:
                    print("-" * 79)
                done = True

                self._historyInfoCmd(mobj)
                mobj = None

                if mtids:
                    bmtid, emtid = mtids.pop()
                    if trans.tid >= bmtid and trans.tid <= emtid:
                        mobj = trans
                        merged = True

            if not merged:
                if done:
                    print("-" * 79)
                done = True
                self._historyInfoCmd(trans, pats)

        if mobj is not None:
            if done:
                print("-" * 79)
            self._historyInfoCmd(mobj)

    def _historyInfoCmd(self, old, pats=[]):
        loginuid = old.loginuid
        if isinstance(loginuid, int):
            loginuid = [loginuid]
        name = [self._pwd_ui_username(uid) for uid in loginuid]

        _pkg_states_installed = {'i' : _('Installed'), 'e' : _('Erased'),
                                 'o' : _('Upgraded'), 'n' : _('Downgraded')}
        _pkg_states_available = {'i' : _('Installed'), 'e' : _('Not installed'),
                                 'o' : _('Older'), 'n' : _('Newer')}
        maxlen = max([len(x) for x in (list(_pkg_states_installed.values()) +
                                       list(_pkg_states_available.values()))])
        _pkg_states_installed['maxlen'] = maxlen
        _pkg_states_available['maxlen'] = maxlen
        def _simple_pkg(pkg, prefix_len, was_installed=False, highlight=False,
                        pkg_max_len=0, show_repo=True):
            prefix = " " * prefix_len
            if was_installed:
                _pkg_states = _pkg_states_installed
            else:
                _pkg_states = _pkg_states_available
            state = _pkg_states['i']

            # get installed packages with name = pkg.name
            ipkgs = self.sack.query().installed().filterm(name=pkg.name).run()

            if not ipkgs:
                state = _pkg_states['e']
            else:
                # get latest installed package from software database
                inst_pkg = self.history.package(ipkgs[0])
                if inst_pkg:
                    res = pkg.compare(inst_pkg)
                    # res is:
                    # 0 if inst_pkg == pkg
                    # > 0 when inst_pkg > pkg
                    # < 0 when inst_pkg < pkg
                    if res == 0:
                        pass  # installed
                    elif res > 0:
                        state = _pkg_states['o']  # updated
                    else:
                        state = _pkg_states['n']  # downgraded

            if highlight:
                (hibeg, hiend) = self._highlight('bold')
            else:
                (hibeg, hiend) = self._highlight('normal')
            state = fill_exact_width(state, _pkg_states['maxlen'])
            ui_repo = ''
            if show_repo:
                ui_repo = pkg.ui_from_repo()
            print("%s%s%s%s %-*s %s" % (prefix, hibeg, state, hiend,
                                        pkg_max_len, str(pkg), ui_repo))

        tids = old.tids()
        if len(tids) > 1:
            print(_("Transaction ID :"), "%u..%u" % (tids[0], tids[-1]))
        else:
            print(_("Transaction ID :"), tids[0])
        begt = float(old.beg_timestamp)
        begtm = time.strftime("%c", time.localtime(begt))
        print(_("Begin time     :"), begtm)
        if old.beg_rpmdb_version is not None:
            if old.altered_lt_rpmdb:
                print(_("Begin rpmdb    :"), old.beg_rpmdb_version, "**")
            else:
                print(_("Begin rpmdb    :"), old.beg_rpmdb_version)
        if old.end_timestamp is not None:
            endt = old.end_timestamp
            endtm = time.strftime("%c", time.localtime(endt))
            diff = endt - begt
            if diff < 5 * 60:
                diff = _("(%u seconds)") % diff
            elif diff < 5 * 60 * 60:
                diff = _("(%u minutes)") % (diff // 60)
            elif diff < 5 * 60 * 60 * 24:
                diff = _("(%u hours)") % (diff // (60 * 60))
            else:
                diff = _("(%u days)") % (diff // (60 * 60 * 24))
            print(_("End time       :"), endtm, diff)
        if old.end_rpmdb_version is not None:
            if old.altered_gt_rpmdb:
                print(_("End rpmdb      :"), old.end_rpmdb_version, "**")
            else:
                print(_("End rpmdb      :"), old.end_rpmdb_version)
        if isinstance(name, (list, tuple)):
            seen = set()
            for i in name:
                if i in seen:
                    continue
                seen.add(i)
                print(_("User           :"), i)
        else:
            print(_("User           :"), name)
        if isinstance(old.return_code, (list, tuple)):
            codes = old.return_code
            if codes[0] is None:
                print(_("Return-Code    :"), "**", _("Aborted"), "**")
                codes = codes[1:]
            elif not all(codes):
                print(_("Return-Code    :"), _("Success"))
            elif codes:
                print(_("Return-Code    :"), _("Failures:"), ", ".join([str(i) for i in codes]))
        elif old.return_code is None:
            print(_("Return-Code    :"), "**", _("Aborted"), "**")
        elif old.return_code:
            print(_("Return-Code    :"), _("Failure:"), old.return_code)
        else:
            print(_("Return-Code    :"), _("Success"))

        if isinstance(old.releasever, (list, tuple)):
            seen = set()
            for i in old.releasever:
                if i in seen:
                    continue
                seen.add(i)
            print(_("Releasever     :"), i)
        else:
            print(_("Releasever     :"), old.releasever)

        if old.cmdline is not None:
            if isinstance(old.cmdline, (list, tuple)):
                for cmdline in old.cmdline:
                    print(_("Command Line   :"), cmdline)
            else:
                print(_("Command Line   :"), old.cmdline)

        if old.comment is not None:
            if isinstance(old.comment, (list, tuple)):
                for comment in old.comment:
                    print(_("Comment        :"), comment)
            else:
                print(_("Comment        :"), old.comment)

        perf_with = old.performed_with()
        if perf_with:
            print(_("Transaction performed with:"))
        max_len = 0
        for with_pkg in perf_with:
            str_len = len(str(with_pkg))
            if str_len > max_len:
                max_len = str_len
        for with_pkg in perf_with:
            _simple_pkg(with_pkg, 4, was_installed=True, pkg_max_len=max_len)

        print(_("Packages Altered:"))

        self.historyInfoCmdPkgsAltered(old, pats)

        t_out = old.output()
        if t_out:
            print(_("Scriptlet output:"))
            num = 0
            for line in t_out:
                num += 1
                print("%4d" % num, line)
        t_err = old.error()
        if t_err:
            print(_("Errors:"))
            num = 0
            for line in t_err:
                num += 1
                print("%4d" % num, line)

    # TODO: remove
    _history_state2uistate = {'True-Install' : _('Install'),
                              'Install'      : _('Install'),
                              'Dep-Install'  : _('Dep-Install'),
                              'Obsoleted'    : _('Obsoleted'),
                              'Obsoleting'   : _('Obsoleting'),
                              'Erase'        : _('Erase'),
                              'Reinstall'    : _('Reinstall'),
                              'Downgrade'    : _('Downgrade'),
                              'Downgraded'   : _('Downgraded'),
                              'Update'       : _('Upgrade'),
                              'Updated'      : _('Upgraded'),
                              }
    def historyInfoCmdPkgsAltered(self, old, pats=[]):
        """Print information about how packages are altered in a transaction.

        :param old: the :class:`DnfSwdbTrans` to
           print information about
        :param pats: a list of patterns.  Packages that match a pattern
           in *pats* will be highlighted in the output
        """
        #  Note that these don't use _simple_pkg() because we are showing what
        # happened to them in the transaction ... not the difference between the
        # version in the transaction and now.
        all_uistates = self._history_state2uistate
        maxlen = 0
        pkg_max_len = 0

        packages = old.packages()

        for pkg in packages:
            uistate = all_uistates.get(pkg.action_name, pkg.action_name)
            if maxlen < len(uistate):
                maxlen = len(uistate)
            pkg_len = len(str(pkg))
            if pkg_max_len < pkg_len:
                pkg_max_len = pkg_len

        for pkg in packages:
            prefix = " " * 4
            if pkg.state != libdnf.transaction.TransactionItemState_DONE:
                prefix = " ** "

            highlight = 'normal'
            if pats:
                if any([pkg.match(pat) for pat in pats]):
                    highlight = 'bold'
            (hibeg, hiend) = self._highlight(highlight)

            uistate = all_uistates.get(pkg.action_name, pkg.action_name)
            uistate = fill_exact_width(ucd(uistate), maxlen)

            print("%s%s%s%s %-*s %s" % (prefix, hibeg, uistate, hiend,
                                        pkg_max_len, str(pkg),
                                        pkg.ui_from_repo()))

class DepSolveProgressCallBack(dnf.callback.Depsolve):
    """Provides text output callback functions for Dependency Solver callback."""

    def pkg_added(self, pkg, mode):
        """Print information about a package being added to the
        transaction set.

        :param pkg: the package being added; an object providing
           ``name``, ``arch``, and ``evr`` attributes
        :param mode: a short string indicating why the package is
           being added to the transaction set.

        Valid current values for *mode* are::

           i = the package will be installed
           u = the package will be an update
           e = the package will be erased
           r = the package will be reinstalled
           d = the package will be a downgrade
           o = the package will be obsoleting another package
           ud = the package will be updated
           od = the package will be obsoleted
        """
        output = None
        if mode == 'i':
            output = _('---> Package %s.%s %s will be installed')
        elif mode == 'u':
            output = _('---> Package %s.%s %s will be an upgrade')
        elif mode == 'e':
            output = _('---> Package %s.%s %s will be erased')
        elif mode == 'r':
            output = _('---> Package %s.%s %s will be reinstalled')
        elif mode == 'd':
            output = _('---> Package %s.%s %s will be a downgrade')
        elif mode == 'o':
            output = _('---> Package %s.%s %s will be obsoleting')
        elif mode == 'ud':
            output = _('---> Package %s.%s %s will be upgraded')
        elif mode == 'od':
            output = _('---> Package %s.%s %s will be obsoleted')

        if output:
            logger.debug(output, pkg.name, pkg.arch, pkg.evr)

    def start(self):
        """Perform setup at the beginning of the dependency solving
        process.
        """
        logger.debug(_('--> Starting dependency resolution'))

    def end(self):
        """Output a message stating that dependency resolution has finished."""
        logger.debug(_('--> Finished dependency resolution'))
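
# Usage sketch (illustrative, not part of the original module): the callback
# only needs objects exposing ``name``, ``arch`` and ``evr``, so any
# package-like object can be passed while testing.
#
#     cb = DepSolveProgressCallBack()
#     cb.start()
#     cb.pkg_added(pkg, 'i')   # logs "---> Package <name>.<arch> <evr> will be installed"
#     cb.end()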


class CliKeyImport(dnf.callback.KeyImport):
    def __init__(self, base, output):
        self.base = base
        self.output = output

    def _confirm(self, id, userid, fingerprint, url, timestamp):

        def short_id(id):
            rj = '0' if dnf.pycomp.PY3 else b'0'
            return id[-8:].rjust(8, rj)

        msg = (_('Importing GPG key 0x%s:\n'
                 ' Userid     : "%s"\n'
                 ' Fingerprint: %s\n'
                 ' From       : %s') %
               (short_id(id), userid,
                dnf.crypto._printable_fingerprint(fingerprint),
                url.replace("file://", "")))
        logger.critical("%s", msg)

        if self.base.conf.assumeyes:
            return True
        if self.base.conf.assumeno:
            return False
        return self.output.userconfirm()


class CliTransactionDisplay(TransactionDisplay):
    """A YUM specific callback class for RPM operations."""

    width = property(lambda self: dnf.cli.term._term_width())

    def __init__(self):
        super(CliTransactionDisplay, self).__init__()
        self.lastmsg = ""
        self.lastpackage = None # name of last package we looked at
        self.output = True

        # for a progress bar
        self.mark = "="
        self.marks = 22

    def progress(self, package, action, ti_done, ti_total, ts_done, ts_total):
        """Output information about an rpm operation.  This may
        include a text progress bar.

        :param package: the package involved in the event
        :param action: the type of action that is taking place.  Valid
           values are the keys of :data:`dnf.transaction.ACTIONS`
        :param ti_done: a number representing the amount of work
           already done in the current transaction
        :param ti_total: a number representing the total amount of work
           to be done in the current transaction
        :param ts_done: the number of the current transaction in
           transaction set
        :param ts_total: the total number of transactions in the
           transaction set
        """
        action_str = dnf.transaction.ACTIONS.get(action)
        if action_str is None:
            return

        wid1 = self._max_action_width()

        pkgname = ucd(package)
        self.lastpackage = package
        if ti_total == 0:
            percent = 0
        else:
            percent = (ti_done * 100) // ti_total
        self._out_progress(ti_done, ti_total, ts_done, ts_total,
                           percent, action_str, pkgname, wid1)

    def _max_action_width(self):
        if not hasattr(self, '_max_action_wid_cache'):
            wid1 = 0
            for val in dnf.transaction.ACTIONS.values():
                wid_val = exact_width(val)
                if wid1 < wid_val:
                    wid1 = wid_val
            self._max_action_wid_cache = wid1
        wid1 = self._max_action_wid_cache
        return wid1

    def _out_progress(self, ti_done, ti_total, ts_done, ts_total,
                      percent, process, pkgname, wid1):
        if self.output and (sys.stdout.isatty() or ti_done == ti_total):
            (fmt, wid1, wid2) = self._makefmt(percent, ts_done, ts_total,
                                              progress=sys.stdout.isatty(),
                                              pkgname=pkgname, wid1=wid1)
            pkgname = ucd(pkgname)
            msg = fmt % (fill_exact_width(process, wid1, wid1),
                         fill_exact_width(pkgname, wid2, wid2))
            if msg != self.lastmsg:
                dnf.util._terminal_messenger('write_flush', msg, sys.stdout)
                self.lastmsg = msg
                if ti_done == ti_total:
                    print(" ")

    def _makefmt(self, percent, ts_done, ts_total, progress=True,
                 pkgname=None, wid1=15):
        l = len(str(ts_total))
        size = "%s.%s" % (l, l)
        fmt_done = "%" + size + "s/%" + size + "s"
        done = fmt_done % (ts_done, ts_total)

        #  This should probably use TerminLine, but we don't want to depend on
        # that. So we kind of do an ok job by hand ... at least it's dynamic now.
        if pkgname is None:
            pnl = 22
        else:
            pnl = exact_width(pkgname)

        overhead = (2 * l) + 2 # Length of done, above
        overhead += 2 + wid1 + 2 # Length of beginning ("  " action " :")
        overhead += 1          # Space between pn and done
        overhead += 2          # Ends for progress
        overhead += 1          # Space for end
        width = self.width
        if width < overhead:
            width = overhead    # Give up
        width -= overhead
        if pnl > width // 2:
            pnl = width // 2

        marks = self.width - (overhead + pnl)
        width = "%s.%s" % (marks, marks)
        fmt_bar = "[%-" + width + "s]"
        # pnl = str(28 + marks + 1)
        full_pnl = pnl + marks + 1

        if progress and percent == 100: # Don't chop pkg name on 100%
            fmt = "\r  %s: %s   " + done
            wid2 = full_pnl
        elif progress:
            if marks > 5:
                bar = fmt_bar % (self.mark * int(marks * (percent / 100.0)), )
            else:
                bar = ""
            fmt = "\r  %s: %s " + bar + " " + done
            wid2 = pnl
        elif percent == 100:
            fmt = "  %s: %s   " + done
            wid2 = full_pnl
        else:
            if marks > 5:
                bar = fmt_bar % (self.mark * marks, )
            else:
                bar = ""
            fmt = "  %s: %s " + bar + " " + done
            wid2 = pnl
        return fmt, wid1, wid2

def progressbar(current, total, name=None):
    """Output the current status to the terminal using a simple
    text progress bar consisting of 50 # marks.

    :param current: a number representing the amount of work
       already done
    :param total: a number representing the total amount of work
       to be done
    :param name: a name to label the progress bar with
    """

    mark = '#'
    if not sys.stdout.isatty():
        return

    if current == 0:
        percent = 0
    else:
        if total != 0:
            percent = float(current) / total
        else:
            percent = 0

    width = dnf.cli.term._term_width()

    if name is None and current == total:
        name = '-'

    end = ' %d/%d' % (current, total)
    width -= len(end) + 1
    if width < 0:
        width = 0
    if name is None:
        width -= 2
        if width < 0:
            width = 0
        hashbar = mark * int(width * percent)
        output = '\r[%-*s]%s' % (width, hashbar, end)
    elif current == total: # Don't chop name on 100%
        output = '\r%s%s' % (fill_exact_width(name, width, width), end)
    else:
        width -= 4
        if width < 0:
            width = 0
        nwid = width // 2
        if nwid > exact_width(name):
            nwid = exact_width(name)
        width -= nwid
        hashbar = mark * int(width * percent)
        output = '\r%s: [%-*s]%s' % (fill_exact_width(name, nwid, nwid), width,
                                     hashbar, end)

    if current <= total:
        dnf.util._terminal_messenger('write', output, sys.stdout)

    if current == total:
        dnf.util._terminal_messenger('write', '\n', sys.stdout)

    dnf.util._terminal_messenger('flush', out=sys.stdout)
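
# Usage sketch (illustrative only): drives the bar from 0 to 100 % on a tty;
# on the final call the label is not chopped and a newline is emitted.
#
#     total = 5
#     for done in range(total + 1):
#         progressbar(done, total, name='download')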
site-packages/dnf/cli/completion_helper.py
#!/usr/libexec/platform-python
#
# This file is part of dnf.
#
# Copyright 2015 (C) Igor Gnatenko <i.gnatenko.brain@gmail.com>
# Copyright 2016 (C) Red Hat, Inc.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
# 02110-1301  USA

import dnf.exceptions
import dnf.cli
import dnf.cli.commands.clean
import sys


def filter_list_by_kw(kw, lst):
    return filter(lambda k: str(k).startswith(kw), lst)

def listpkg_to_setstr(pkgs):
    return set([str(x) for x in pkgs])
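
# Example (illustrative only): filter_list_by_kw() keeps items whose string
# form starts with the typed prefix, e.g.
#
#     list(filter_list_by_kw("re", ["remove", "reinstall", "repolist", "install"]))
#     # -> ['remove', 'reinstall', 'repolist']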

class RemoveCompletionCommand(dnf.cli.commands.remove.RemoveCommand):
    def __init__(self, args):
        super(RemoveCompletionCommand, self).__init__(args)

    def configure(self):
        self.cli.demands.root_user = False
        self.cli.demands.sack_activation = True

    def run(self):
        for pkg in ListCompletionCommand.installed(self.base, self.opts.pkg_specs):
            print(str(pkg))


class InstallCompletionCommand(dnf.cli.commands.install.InstallCommand):
    def __init__(self, args):
        super(InstallCompletionCommand, self).__init__(args)

    def configure(self):
        self.cli.demands.root_user = False
        self.cli.demands.available_repos = True
        self.cli.demands.sack_activation = True

    def run(self):
        installed = listpkg_to_setstr(ListCompletionCommand.installed(self.base,
                                                                      self.opts.pkg_specs))
        available = listpkg_to_setstr(ListCompletionCommand.available(self.base,
                                                                      self.opts.pkg_specs))
        for pkg in (available - installed):
            print(str(pkg))


class ReinstallCompletionCommand(dnf.cli.commands.reinstall.ReinstallCommand):
    def __init__(self, args):
        super(ReinstallCompletionCommand, self).__init__(args)

    def configure(self):
        self.cli.demands.root_user = False
        self.cli.demands.available_repos = True
        self.cli.demands.sack_activation = True

    def run(self):
        installed = listpkg_to_setstr(ListCompletionCommand.installed(self.base,
                                                                      self.opts.pkg_specs))
        available = listpkg_to_setstr(ListCompletionCommand.available(self.base,
                                                                      self.opts.pkg_specs))
        for pkg in (installed & available):
            print(str(pkg))

class ListCompletionCommand(dnf.cli.commands.ListCommand):
    def __init__(self, args):
        super(ListCompletionCommand, self).__init__(args)

    def run(self):
        subcmds = self.pkgnarrows
        args = self.opts.packages
        action = self.opts.packages_action
        if len(args) > 1 and args[1] not in subcmds:
            print("\n".join(filter_list_by_kw(args[1], subcmds)))
        else:
            if action == "installed":
                pkgs = self.installed(self.base, args)
            elif action == "available":
                pkgs = self.available(self.base, args)
            elif action == "updates":
                pkgs = self.updates(self.base, args)
            else:
                available = listpkg_to_setstr(self.available(self.base, args))
                installed = listpkg_to_setstr(self.installed(self.base, args))
                pkgs = (available | installed)
                if not pkgs:
                    print("\n".join(filter_list_by_kw(args[0], subcmds)))
                    return
            for pkg in pkgs:
                print(str(pkg))

    @staticmethod
    def installed(base, arg):
        return base.sack.query().installed().filterm(name__glob="{}*".format(arg[0]))

    @staticmethod
    def available(base, arg):
        return base.sack.query().available().filterm(name__glob="{}*".format(arg[0]))

    @staticmethod
    def updates(base, arg):
        return base.check_updates(["{}*".format(arg[0])], print_=False)
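
# Completion query sketch (illustrative; assumes a configured dnf.Base whose
# sack has already been filled):
#
#     ListCompletionCommand.installed(base, ["vim"])   # installed packages matching vim*
#     ListCompletionCommand.available(base, ["vim"])   # available packages matching vim*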


class RepoListCompletionCommand(dnf.cli.commands.repolist.RepoListCommand):
    def __init__(self, args):
        super(RepoListCompletionCommand, self).__init__(args)

    def run(self):
        args = self.opts
        if args.repos_action == "enabled":
            print("\n".join(filter_list_by_kw(args.repos[0],
                            [r.id for r in self.base.repos.iter_enabled()])))
        elif args.repos_action == "disabled":
            print("\n".join(filter_list_by_kw(args.repos[0],
                            [r.id for r in self.base.repos.all() if not r.enabled])))
        elif args.repos_action == "all":
            print("\n".join(filter_list_by_kw(args.repos[0],
                            [r.id for r in self.base.repos.all()])))


class UpgradeCompletionCommand(dnf.cli.commands.upgrade.UpgradeCommand):
    def __init__(self, args):
        super(UpgradeCompletionCommand, self).__init__(args)

    def configure(self):
        self.cli.demands.root_user = False
        self.cli.demands.available_repos = True
        self.cli.demands.sack_activation = True

    def run(self):
        for pkg in ListCompletionCommand.updates(self.base, self.opts.pkg_specs):
            print(str(pkg))


class DowngradeCompletionCommand(dnf.cli.commands.downgrade.DowngradeCommand):
    def __init__(self, args):
        super(DowngradeCompletionCommand, self).__init__(args)

    def configure(self):
        self.cli.demands.root_user = False
        self.cli.demands.available_repos = True
        self.cli.demands.sack_activation = True

    def run(self):
        for pkg in ListCompletionCommand.available(self.base, self.opts.pkg_specs).downgrades():
            print(str(pkg))


class CleanCompletionCommand(dnf.cli.commands.clean.CleanCommand):
    def __init__(self, args):
        super(CleanCompletionCommand, self).__init__(args)

    def run(self):
        subcmds = dnf.cli.commands.clean._CACHE_TYPES.keys()
        print("\n".join(filter_list_by_kw(self.opts.type[1], subcmds)))


def main(args):
    base = dnf.cli.cli.BaseCli()
    cli = dnf.cli.Cli(base)
    if args[0] == "_cmds":
        base.init_plugins([], [], cli)
        print("\n".join(filter_list_by_kw(args[1], cli.cli_commands)))
        return
    cli.cli_commands.clear()
    cli.register_command(RemoveCompletionCommand)
    cli.register_command(InstallCompletionCommand)
    cli.register_command(ReinstallCompletionCommand)
    cli.register_command(ListCompletionCommand)
    cli.register_command(RepoListCompletionCommand)
    cli.register_command(UpgradeCompletionCommand)
    cli.register_command(DowngradeCompletionCommand)
    cli.register_command(CleanCompletionCommand)
    cli.configure(args)
    try:
        cli.run()
    except (OSError, dnf.exceptions.Error):
        sys.exit(0)
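
# Invocation sketch (illustrative only): the helper is normally driven by the
# bash completion script with the partial command line, e.g.
#
#     main(["install", "vi"])   # available-but-not-installed packages matching vi*
#     main(["_cmds", "up"])     # registered dnf commands starting with "up"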

if __name__ == "__main__":
    try:
        main(sys.argv[1:])
    except KeyboardInterrupt:
        sys.exit(1)
site-packages/dnf/cli/term.py
# Copyright (C) 2013-2014  Red Hat, Inc.
# Terminal routines.
#
# This copyrighted material is made available to anyone wishing to use,
# modify, copy, or redistribute it subject to the terms and conditions of
# the GNU General Public License v.2, or (at your option) any later version.
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY expressed or implied, including the implied warranties of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General
# Public License for more details.  You should have received a copy of the
# GNU General Public License along with this program; if not, write to the
# Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
# 02110-1301, USA.  Any Red Hat trademarks that are incorporated in the
# source code or documentation are not subject to the GNU General Public
# License and may only be used or replicated with the express permission of
# Red Hat, Inc.
#

from __future__ import absolute_import
from __future__ import unicode_literals
import curses
import dnf.pycomp
import fcntl
import re
import struct
import sys
import termios


def _real_term_width(fd=1):
    """ Get the real terminal width """
    try:
        buf = 'abcdefgh'
        buf = fcntl.ioctl(fd, termios.TIOCGWINSZ, buf)
        ret = struct.unpack(b'hhhh', buf)[1]
        return ret
    except IOError:
        return None


def _term_width(fd=1):
    """ Compute terminal width falling to default 80 in case of trouble"""
    tw = _real_term_width(fd=1)
    if not tw:
        return 80
    elif tw < 20:
        return 20
    else:
        return tw
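
# Example (illustrative only): the fallback keeps column arithmetic sane even
# when the ioctl fails (e.g. output redirected to a file).
#
#     width = _term_width()   # 80 when stdout is not a terminal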


class Term(object):
    """A class to provide some terminal "UI" helpers based on curses."""

    # From initial search for "terminfo and python" got:
    # http://aspn.activestate.com/ASPN/Cookbook/Python/Recipe/475116
    # ...it's probably not copyrightable, but if so ASPN says:
    #
    #  Except where otherwise noted, recipes in the Python Cookbook are
    # published under the Python license.

    __enabled = True

    real_columns = property(lambda self: _real_term_width())
    columns = property(lambda self: _term_width())

    __cap_names = {
        'underline' : 'smul',
        'reverse' : 'rev',
        'normal' : 'sgr0',
        }

    __colors = {
        'black' : 0,
        'blue' : 1,
        'green' : 2,
        'cyan' : 3,
        'red' : 4,
        'magenta' : 5,
        'yellow' : 6,
        'white' : 7
        }
    __ansi_colors = {
        'black' : 0,
        'red' : 1,
        'green' : 2,
        'yellow' : 3,
        'blue' : 4,
        'magenta' : 5,
        'cyan' : 6,
        'white' : 7
        }
    __ansi_forced_MODE = {
        'bold' : '\x1b[1m',
        'blink' : '\x1b[5m',
        'dim' : '',
        'reverse' : '\x1b[7m',
        'underline' : '\x1b[4m',
        'normal' : '\x1b(B\x1b[m'
        }
    __ansi_forced_FG_COLOR = {
        'black' : '\x1b[30m',
        'red' : '\x1b[31m',
        'green' : '\x1b[32m',
        'yellow' : '\x1b[33m',
        'blue' : '\x1b[34m',
        'magenta' : '\x1b[35m',
        'cyan' : '\x1b[36m',
        'white' : '\x1b[37m'
        }
    __ansi_forced_BG_COLOR = {
        'black' : '\x1b[40m',
        'red' : '\x1b[41m',
        'green' : '\x1b[42m',
        'yellow' : '\x1b[43m',
        'blue' : '\x1b[44m',
        'magenta' : '\x1b[45m',
        'cyan' : '\x1b[46m',
        'white' : '\x1b[47m'
        }

    def __forced_init(self):
        self.MODE = self.__ansi_forced_MODE
        self.FG_COLOR = self.__ansi_forced_FG_COLOR
        self.BG_COLOR = self.__ansi_forced_BG_COLOR

    def reinit(self, term_stream=None, color='auto'):
        """Reinitializes the :class:`Term`.

        :param term_stream:  the terminal stream that the
           :class:`Term` should be initialized to use.  If
           *term_stream* is not given, :attr:`sys.stdout` is used.
        :param color: when to colorize output.  Valid values are
           'always', 'auto', and 'never'.  'always' will use ANSI codes
           to always colorize output, 'auto' will decide whether to
           colorize depending on the terminal, and 'never' will never
           colorize.
        """
        self.__enabled = True
        self.lines = 24

        if color == 'always':
            self.__forced_init()
            return

        # Output modes:
        self.MODE = {
            'bold' : '',
            'blink' : '',
            'dim' : '',
            'reverse' : '',
            'underline' : '',
            'normal' : ''
            }

        # Colours
        self.FG_COLOR = {
            'black' : '',
            'blue' : '',
            'green' : '',
            'cyan' : '',
            'red' : '',
            'magenta' : '',
            'yellow' : '',
            'white' : ''
            }

        self.BG_COLOR = {
            'black' : '',
            'blue' : '',
            'green' : '',
            'cyan' : '',
            'red' : '',
            'magenta' : '',
            'yellow' : '',
            'white' : ''
            }

        if color == 'never':
            self.__enabled = False
            return
        assert color == 'auto'

        # If the stream isn't a tty, then assume it has no capabilities.
        if not term_stream:
            term_stream = sys.stdout
        if not term_stream.isatty():
            self.__enabled = False
            return

        # Check the terminal type.  If we fail, then assume that the
        # terminal has no capabilities.
        try:
            curses.setupterm(fd=term_stream.fileno())
        except Exception:
            self.__enabled = False
            return
        self._ctigetstr = curses.tigetstr

        self.lines = curses.tigetnum('lines')

        # Look up string capabilities.
        for cap_name in self.MODE:
            mode = cap_name
            if cap_name in self.__cap_names:
                cap_name = self.__cap_names[cap_name]
            self.MODE[mode] = self._tigetstr(cap_name)

        # Colors
        set_fg = self._tigetstr('setf').encode('utf-8')
        if set_fg:
            for (color, val) in self.__colors.items():
                self.FG_COLOR[color] = curses.tparm(set_fg, val).decode() or ''
        set_fg_ansi = self._tigetstr('setaf').encode('utf-8')
        if set_fg_ansi:
            for (color, val) in self.__ansi_colors.items():
                fg_color = curses.tparm(set_fg_ansi, val).decode() or ''
                self.FG_COLOR[color] = fg_color
        set_bg = self._tigetstr('setb').encode('utf-8')
        if set_bg:
            for (color, val) in self.__colors.items():
                self.BG_COLOR[color] = curses.tparm(set_bg, val).decode() or ''
        set_bg_ansi = self._tigetstr('setab').encode('utf-8')
        if set_bg_ansi:
            for (color, val) in self.__ansi_colors.items():
                bg_color = curses.tparm(set_bg_ansi, val).decode() or ''
                self.BG_COLOR[color] = bg_color

    def __init__(self, term_stream=None, color='auto'):
        self.reinit(term_stream, color)

    def _tigetstr(self, cap_name):
        # String capabilities can include "delays" of the form "$<2>".
        # For any modern terminal, we should be able to just ignore
        # these, so strip them out.
        cap = self._ctigetstr(cap_name) or ''
        if dnf.pycomp.is_py3bytes(cap):
            cap = cap.decode()
        return re.sub(r'\$<\d+>[/*]?', '', cap)

    def color(self, color, s):
        """Colorize string with color"""
        return (self.MODE[color] + str(s) + self.MODE['normal'])

    def bold(self, s):
        """Make string bold."""
        return self.color('bold', s)

    def sub(self, haystack, beg, end, needles, escape=None, ignore_case=False):
        """Search the string *haystack* for all occurrences of any
        string in the list *needles*.  Prefix each occurrence with
        *beg*, and postfix each occurrence with *end*, then return the
        modified string.  For example::

           >>> yt = Term()
           >>> yt.sub('spam and eggs', 'x', 'z', ['and'])
           'spam xandz eggs'

        This is particularly useful for emphasizing certain words
        in output: for example, calling :func:`sub` with *beg* =
        MODE['bold'] and *end* = MODE['normal'] will return a string
        that when printed to the terminal will appear to be *haystack*
        with each occurrence of the strings in *needles* in bold
        face.  Note, however, that the :func:`sub_mode`,
        :func:`sub_bold`, :func:`sub_fg`, and :func:`sub_bg` methods
        provide convenient ways to access this same emphasizing functionality.

        :param haystack: the string to be modified
        :param beg: the string to be prefixed onto matches
        :param end: the string to be postfixed onto matches
        :param needles: a list of strings to add the prefixes and
           postfixes to
        :param escape: a function that accepts a string and returns
           the same string with problematic characters escaped.  By
           default, :func:`re.escape` is used.
        :param ignore_case: whether case should be ignored when
           searching for matches
        :return: *haystack* with *beg* prefixing, and *end*
          postfixing, occurrences of the strings in *needles*
        """
        if not self.__enabled:
            return haystack

        if not escape:
            escape = re.escape

        render = lambda match: beg + match.group() + end
        for needle in needles:
            pat = escape(needle)
            if ignore_case:
                pat = re.template(pat, re.I)
            haystack = re.sub(pat, render, haystack)
        return haystack

    def sub_norm(self, haystack, beg, needles, **kwds):
        """Search the string *haystack* for all occurrences of any
        string in the list *needles*.  Prefix each occurrence with
        *beg*, and postfix each occurrence with self.MODE['normal'],
        then return the modified string.  If *beg* is an ANSI escape
        code, such as given by self.MODE['bold'], this method will
        return *haystack* with the formatting given by the code only
        applied to the strings in *needles*.

        :param haystack: the string to be modified
        :param beg: the string to be prefixed onto matches
        :param needles: a list of strings to add the prefixes and
           postfixes to
        :return: *haystack* with *beg* prefixing, and self.MODE['normal']
          postfixing, occurrences of the strings in *needles*
        """
        return self.sub(haystack, beg, self.MODE['normal'], needles, **kwds)

    def sub_mode(self, haystack, mode, needles, **kwds):
        """Search the string *haystack* for all occurrences of any
        string in the list *needles*.  Prefix each occurrence with
        self.MODE[*mode*], and postfix each occurrence with
        self.MODE['normal'], then return the modified string.  This
        will return a string that when printed to the terminal will
        appear to be *haystack* with each occurrence of the strings in
        *needles* in the given *mode*.

        :param haystack: the string to be modified
        :param mode: the mode to set the matches to be in.  Valid
           values are given by self.MODE.keys().
        :param needles: a list of strings to add the prefixes and
           postfixes to
        :return: *haystack* with self.MODE[*mode*] prefixing, and
          self.MODE['normal'] postfixing, occurrences of the strings
          in *needles*
        """
        return self.sub_norm(haystack, self.MODE[mode], needles, **kwds)

    def sub_bold(self, haystack, needles, **kwds):
        """Search the string *haystack* for all occurrences of any
        string in the list *needles*.  Prefix each occurrence with
        self.MODE['bold'], and postfix each occurrence with
        self.MODE['normal'], then return the modified string.  This
        will return a string that when printed to the terminal will
        appear to be *haystack* with each occurrence of the strings in
        *needles* in bold face.

        :param haystack: the string to be modified
        :param needles: a list of strings to add the prefixes and
           postfixes to
        :return: *haystack* with self.MODE['bold'] prefixing, and
          self.MODE['normal'] postfixing, occurrences of the strings
          in *needles*
        """
        return self.sub_mode(haystack, 'bold', needles, **kwds)

    def sub_fg(self, haystack, color, needles, **kwds):
        """Search the string *haystack* for all occurrences of any
        string in the list *needles*.  Prefix each occurrence with
        self.FG_COLOR[*color*], and postfix each occurrence with
        self.MODE['normal'], then return the modified string.  This
        will return a string that when printed to the terminal will
        appear to be *haystack* with each occurrence of the strings in
        *needles* in the given color.

        :param haystack: the string to be modified
        :param color: the color to set the matches to be in.  Valid
           values are given by self.FG_COLOR.keys().
        :param needles: a list of strings to add the prefixes and
           postfixes to
        :return: *haystack* with self.FG_COLOR[*color*] prefixing, and
          self.MODE['normal'] postfixing, occurrences of the strings
          in *needles*
        """
        return self.sub_norm(haystack, self.FG_COLOR[color], needles, **kwds)

    def sub_bg(self, haystack, color, needles, **kwds):
        """Search the string *haystack* for all occurrences of any
        string in the list *needles*.  Prefix each occurrence with
        self.BG_COLOR[*color*], and postfix each occurrence with
        self.MODE['normal'], then return the modified string.  This
        will return a string that when printed to the terminal will
        appear to be *haystack* with each occurrence of the strings in
        *needles* highlighted in the given background color.

        :param haystack: the string to be modified
        :param color: the background color to set the matches to be in.  Valid
           values are given by self.BG_COLOR.keys().
        :param needles: a list of strings to add the prefixes and
           postfixes to
        :return: *haystack* with self.BG_COLOR[*color*] prefixing, and
          self.MODE['normal'] postfixing, occurrences of the strings
          in *needles*
        """
        return self.sub_norm(haystack, self.BG_COLOR[color], needles, **kwds)
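
# Usage sketch (illustrative only): with color='always' the forced ANSI tables
# are used, so emphasis works even without a tty.
#
#     term = Term(color='always')
#     print(term.bold('important'))
#     print(term.sub_fg('error: disk full', 'red', ['error']))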
site-packages/dnf/cli/main.py
# Copyright 2005 Duke University
# Copyright (C) 2012-2016 Red Hat, Inc.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU Library General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.

"""
Entry point for the yum command line interface.
"""

from __future__ import print_function
from __future__ import absolute_import
from __future__ import unicode_literals
from dnf.conf import Conf
from dnf.cli.cli import Cli
from dnf.cli.option_parser import OptionParser
from dnf.i18n import ucd
from dnf.cli.utils import show_lock_owner
from dnf.i18n import _

import dnf.cli
import dnf.cli.cli
import dnf.cli.option_parser
import dnf.exceptions
import dnf.i18n
import dnf.logging
import dnf.util
import errno
import hawkey
import libdnf.error
import logging
import os
import os.path
import sys

logger = logging.getLogger("dnf")


def ex_IOError(e):
    logger.log(dnf.logging.SUBDEBUG, '', exc_info=True)
    logger.critical(ucd(e))
    return 1


def ex_Error(e):
    logger.log(dnf.logging.SUBDEBUG, '', exc_info=True)
    if e.value is not None:
        logger.critical(_('Error: %s'), ucd(e))
    return 1


def main(args, conf_class=Conf, cli_class=Cli, option_parser_class=OptionParser):
    try:
        dnf.i18n.setup_stdout()
        with dnf.cli.cli.BaseCli(conf_class()) as base:
            return _main(base, args, cli_class, option_parser_class)
    except dnf.exceptions.ProcessLockError as e:
        logger.critical(e.value)
        show_lock_owner(e.pid)
        return 200
    except dnf.exceptions.LockError as e:
        logger.critical(e.value)
        return 200
    except dnf.exceptions.DepsolveError as e:
        return 1
    except dnf.exceptions.Error as e:
        return ex_Error(e)
    except hawkey.Exception as e:
        logger.critical(_('Error: %s'), ucd(e))
        return 1
    except libdnf.error.Error as e:
        logger.critical(_('Error: %s'), ucd(e))
        return 1
    except IOError as e:
        return ex_IOError(e)
    except KeyboardInterrupt as e:
        logger.critical('{}: {}'.format(type(e).__name__, _("Terminated.")))
        return 1


def _main(base, args, cli_class, option_parser):
    """Run the dnf program from a command line interface."""

    # our core object for the cli
    base._logging._presetup()
    cli = cli_class(base)

    # do our cli parsing and config file setup
    # also sanity check the things being passed on the cli
    try:
        cli.configure(list(map(ucd, args)), option_parser())
    except (IOError, OSError) as e:
        return ex_IOError(e)

    return cli_run(cli, base)


def cli_run(cli, base):
    # Try to open the current directory to see if we have
    # read and execute access. If not, chdir to /
    try:
        f = open(".")
    except IOError as e:
        if e.errno == errno.EACCES:
            logger.critical(_('No read/execute access in current directory, moving to /'))
            os.chdir("/")
    else:
        f.close()

    try:
        cli.run()
    except dnf.exceptions.LockError:
        raise
    except (IOError, OSError) as e:
        return ex_IOError(e)

    if cli.demands.resolving:
        try:
            ret = resolving(cli, base)
        except dnf.exceptions.DepsolveError as e:
            ex_Error(e)
            msg = ""
            if not cli.demands.allow_erasing and base._goal.problem_conflicts(available=True):
                msg += _("try to add '{}' to command line to replace conflicting "
                         "packages").format("--allowerasing")
            if cli.base.conf.strict:
                if not msg:
                    msg += _("try to add '{}' to skip uninstallable packages").format(
                        "--skip-broken")
                else:
                    msg += _(" or '{}' to skip uninstallable packages").format("--skip-broken")
            if cli.base.conf.best:
                prio = cli.base.conf._get_priority("best")
                if prio <= dnf.conf.PRIO_MAINCONFIG:
                    if not msg:
                        msg += _("try to add '{}' to use not only best candidate packages").format(
                            "--nobest")
                    else:
                        msg += _(" or '{}' to use not only best candidate packages").format(
                            "--nobest")
            if msg:
                logger.info("({})".format(msg))
            raise
        if ret:
            return ret

    cli.command.run_transaction()
    return cli.demands.success_exit_status


def resolving(cli, base):
    """Perform the depsolve, download and RPM transaction stage."""

    if base.transaction is None:
        base.resolve(cli.demands.allow_erasing)
        logger.info(_('Dependencies resolved.'))

    cli.command.run_resolved()

    # Run the transaction
    displays = []
    if cli.demands.transaction_display is not None:
        displays.append(cli.demands.transaction_display)
    try:
        base.do_transaction(display=displays)
    except dnf.cli.CliError as exc:
        logger.error(ucd(exc))
        return 1
    except dnf.exceptions.TransactionCheckError as err:
        for msg in cli.command.get_error_output(err):
            logger.critical(msg)
        return 1
    except IOError as e:
        return ex_IOError(e)
    else:
        logger.info(_('Complete!'))
    return 0


def user_main(args, exit_code=False):
    """Call one of the multiple main() functions based on environment variables.

    :param args: command line arguments passed into yum
    :param exit_code: if *exit_code* is True, this function will exit
       python with its exit code when it has finished executing.
       Otherwise, it will return its exit code.
    :return: the exit code from dnf.yum execution
    """

    errcode = main(args)
    if exit_code:
        sys.exit(errcode)
    return errcode
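
# Example (illustrative only): mirrors the __main__ block below, but lets the
# caller inspect the exit code instead of exiting the interpreter.
#
#     rc = user_main(['repolist'], exit_code=False)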


if __name__ == "__main__":
    user_main(sys.argv[1:], exit_code=True)
site-packages/dnf/cli/demand.py
# demand.py
# Demand sheet and related classes.
#
# Copyright (C) 2014-2015  Red Hat, Inc.
#
# This copyrighted material is made available to anyone wishing to use,
# modify, copy, or redistribute it subject to the terms and conditions of
# the GNU General Public License v.2, or (at your option) any later version.
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY expressed or implied, including the implied warranties of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General
# Public License for more details.  You should have received a copy of the
# GNU General Public License along with this program; if not, write to the
# Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
# 02110-1301, USA.  Any Red Hat trademarks that are incorporated in the
# source code or documentation are not subject to the GNU General Public
# License and may only be used or replicated with the express permission of
# Red Hat, Inc.
#

from __future__ import unicode_literals


class _BoolDefault(object):
    def __init__(self, default):
        self.default = default
        self._storing_name = '__%s%x' % (self.__class__.__name__, id(self))

    def __get__(self, obj, objtype=None):
        objdict = obj.__dict__
        if self._storing_name in objdict:
            return objdict[self._storing_name]
        return self.default

    def __set__(self, obj, val):
        objdict = obj.__dict__
        if self._storing_name in objdict:
            current_val = objdict[self._storing_name]
            if current_val != val:
                raise AttributeError('Demand already set.')
        objdict[self._storing_name] = val

class DemandSheet(object):
    """Collection of demands that different CLI parts have on other parts. :api"""

    # :api...
    allow_erasing = _BoolDefault(False)
    available_repos = _BoolDefault(False)
    resolving = _BoolDefault(False)
    root_user = _BoolDefault(False)
    sack_activation = _BoolDefault(False)
    load_system_repo = _BoolDefault(True)
    success_exit_status = 0

    cacheonly = _BoolDefault(False)
    fresh_metadata = _BoolDefault(True)
    freshest_metadata = _BoolDefault(False)
    changelogs = _BoolDefault(False)

    transaction_display = None

    # This demand controls applicability of the plugins that could filter
    # repository packages (e.g. versionlock).
    # If it stays None, demands.resolving is used as a fallback.
    plugin_filtering_enabled = _BoolDefault(None)
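

# Behaviour sketch (illustrative only): each flag is a _BoolDefault descriptor,
# so different CLI parts may demand it as long as they agree; a conflicting
# assignment is rejected.
#
#     demands = DemandSheet()
#     demands.root_user = True
#     demands.root_user = True    # same value again is fine
#     demands.root_user = False   # raises AttributeError('Demand already set.')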
site-packages/dnf/subject.py
# subject.py
# Implements Subject.
#
# Copyright (C) 2012-2016 Red Hat, Inc.
#
# This copyrighted material is made available to anyone wishing to use,
# modify, copy, or redistribute it subject to the terms and conditions of
# the GNU General Public License v.2, or (at your option) any later version.
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY expressed or implied, including the implied warranties of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General
# Public License for more details.  You should have received a copy of the
# GNU General Public License along with this program; if not, write to the
# Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
# 02110-1301, USA.  Any Red Hat trademarks that are incorporated in the
# source code or documentation are not subject to the GNU General Public
# License and may only be used or replicated with the express permission of
# Red Hat, Inc.
#

from __future__ import absolute_import
from __future__ import print_function
from __future__ import unicode_literals
from hawkey import Subject  # :api

site-packages/dnf/db/group.py
# -*- coding: utf-8 -*-

# Copyright (C) 2017-2018 Red Hat, Inc.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU Library General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
#


import libdnf.transaction

import dnf.db.history
import dnf.transaction
import dnf.exceptions
from dnf.i18n import _
from dnf.util import logger

import rpm

class PersistorBase(object):
    def __init__(self, history):
        assert isinstance(history, dnf.db.history.SwdbInterface), str(type(history))
        self.history = history
        self._installed = {}
        self._removed = {}
        self._upgraded = {}
        self._downgraded = {}

    def __len__(self):
        return len(self._installed) + len(self._removed) + len(self._upgraded) + len(self._downgraded)

    def clean(self):
        self._installed = {}
        self._removed = {}
        self._upgraded = {}
        self._downgraded = {}

    def _get_obj_id(self, obj):
        raise NotImplementedError

    def _add_to_history(self, item, action):
        ti = self.history.swdb.addItem(item, "", action, libdnf.transaction.TransactionItemReason_USER)
        ti.setState(libdnf.transaction.TransactionItemState_DONE)

    def install(self, obj):
        self._installed[self._get_obj_id(obj)] = obj
        self._add_to_history(obj, libdnf.transaction.TransactionItemAction_INSTALL)

    def remove(self, obj):
        self._removed[self._get_obj_id(obj)] = obj
        self._add_to_history(obj, libdnf.transaction.TransactionItemAction_REMOVE)

    def upgrade(self, obj):
        self._upgraded[self._get_obj_id(obj)] = obj
        self._add_to_history(obj, libdnf.transaction.TransactionItemAction_UPGRADE)

    def downgrade(self, obj):
        self._downgraded[self._get_obj_id(obj)] = obj
        self._add_to_history(obj, libdnf.transaction.TransactionItemAction_DOWNGRADE)

    def new(self, obj_id, name, translated_name, pkg_types):
        raise NotImplementedError

    def get(self, obj_id):
        raise NotImplementedError

    def search_by_pattern(self, pattern):
        raise NotImplementedError


class GroupPersistor(PersistorBase):

    def __iter__(self):
        items = self.history.swdb.getItems()
        items = [i for i in items if i.getCompsGroupItem()]
        return iter(items)

    def _get_obj_id(self, obj):
        return obj.getGroupId()

    def new(self, obj_id, name, translated_name, pkg_types):
        swdb_group = self.history.swdb.createCompsGroupItem()
        swdb_group.setGroupId(obj_id)
        if name is not None:
            swdb_group.setName(name)
        if translated_name is not None:
            swdb_group.setTranslatedName(translated_name)
        swdb_group.setPackageTypes(pkg_types)
        return swdb_group

    def get(self, obj_id):
        swdb_group = self.history.swdb.getCompsGroupItem(obj_id)
        if not swdb_group:
            return None
        swdb_group = swdb_group.getCompsGroupItem()
        return swdb_group

    def search_by_pattern(self, pattern):
        return self.history.swdb.getCompsGroupItemsByPattern(pattern)

    def get_package_groups(self, pkg_name):
        return self.history.swdb.getPackageCompsGroups(pkg_name)

    def is_removable_pkg(self, pkg_name):
        # for group removal and autoremove
        reason = self.history.swdb.resolveRPMTransactionItemReason(pkg_name, "", -2)
        if reason != libdnf.transaction.TransactionItemReason_GROUP:
            return False

        # TODO: implement lastTransId == -2 in libdnf
        package_groups = set(self.get_package_groups(pkg_name))
        for group_id, group in self._removed.items():
            for pkg in group.getPackages():
                if pkg.getName() != pkg_name:
                    continue
                if not pkg.getInstalled():
                    continue
                package_groups.remove(group_id)
        for group_id, group in self._installed.items():
            for pkg in group.getPackages():
                if pkg.getName() != pkg_name:
                    continue
                if not pkg.getInstalled():
                    continue
                package_groups.add(group_id)
        if package_groups:
            return False
        return True


class EnvironmentPersistor(PersistorBase):

    def __iter__(self):
        items = self.history.swdb.getItems()
        items = [i for i in items if i.getCompsEnvironmentItem()]
        return iter(items)

    def _get_obj_id(self, obj):
        return obj.getEnvironmentId()

    def new(self, obj_id, name, translated_name, pkg_types):
        swdb_env = self.history.swdb.createCompsEnvironmentItem()
        swdb_env.setEnvironmentId(obj_id)
        if name is not None:
            swdb_env.setName(name)
        if translated_name is not None:
            swdb_env.setTranslatedName(translated_name)
        swdb_env.setPackageTypes(pkg_types)
        return swdb_env

    def get(self, obj_id):
        swdb_env = self.history.swdb.getCompsEnvironmentItem(obj_id)
        if not swdb_env:
            return None
        swdb_env = swdb_env.getCompsEnvironmentItem()
        return swdb_env

    def search_by_pattern(self, pattern):
        return self.history.swdb.getCompsEnvironmentItemsByPattern(pattern)

    def get_group_environments(self, group_id):
        return self.history.swdb.getCompsGroupEnvironments(group_id)

    def is_removable_group(self, group_id):
        # for environment removal
        swdb_group = self.history.group.get(group_id)
        if not swdb_group:
            return False

        # TODO: implement lastTransId == -2 in libdnf
        group_environments = set(self.get_group_environments(group_id))
        for env_id, env in self._removed.items():
            for group in env.getGroups():
                if group.getGroupId() != group_id:
                    continue
                if not group.getInstalled():
                    continue
                group_environments.remove(env_id)
        for env_id, env in self._installed.items():
            for group in env.getGroups():
                if group.getGroupId() != group_id:
                    continue
                if not group.getInstalled():
                    continue
                group_environments.add(env_id)
        if group_environments:
            return False
        return True


class RPMTransaction(object):
    def __init__(self, history, transaction=None):
        self.history = history
        self.transaction = transaction
        if not self.transaction:
            try:
                self.history.swdb.initTransaction()
            except:
                pass
        self._swdb_ti_pkg = {}

    # TODO: close trans if needed

    def __iter__(self):
        # :api
        if self.transaction:
            items = self.transaction.getItems()
        else:
            items = self.history.swdb.getItems()
        items = [dnf.db.history.RPMTransactionItemWrapper(self.history, i) for i in items if i.getRPMItem()]
        return iter(items)

    def __len__(self):
        if self.transaction:
            items = self.transaction.getItems()
        else:
            items = self.history.swdb.getItems()
        items = [dnf.db.history.RPMTransactionItemWrapper(self.history, i) for i in items if i.getRPMItem()]
        return len(items)

    def _pkg_to_swdb_rpm_item(self, pkg):
        rpm_item = self.history.swdb.createRPMItem()
        rpm_item.setName(pkg.name)
        rpm_item.setEpoch(pkg.epoch or 0)
        rpm_item.setVersion(pkg.version)
        rpm_item.setRelease(pkg.release)
        rpm_item.setArch(pkg.arch)
        return rpm_item

    def new(self, pkg, action, reason=None, replaced_by=None):
        rpm_item = self._pkg_to_swdb_rpm_item(pkg)
        repoid = self.get_repoid(pkg)
        if reason is None:
            reason = self.get_reason(pkg)
        result = self.history.swdb.addItem(rpm_item, repoid, action, reason)
        if replaced_by:
            result.addReplacedBy(replaced_by)
        self._swdb_ti_pkg[result] = pkg
        return result

    def get_repoid(self, pkg):
        result = getattr(pkg, "_force_swdb_repoid", None)
        if result:
            return result
        return pkg.reponame

    def get_reason(self, pkg):
        """Get reason for package"""
        return self.history.swdb.resolveRPMTransactionItemReason(pkg.name, pkg.arch, -1)

    def get_reason_name(self, pkg):
        """Get reason for package"""
        return libdnf.transaction.TransactionItemReasonToString(self.get_reason(pkg))

    def _add_obsoleted(self, obsoleted, replaced_by=None):
        obsoleted = obsoleted or []
        for obs in obsoleted:
            ti = self.new(obs, libdnf.transaction.TransactionItemAction_OBSOLETED)
            if replaced_by:
                ti.addReplacedBy(replaced_by)

    def add_downgrade(self, new, old, obsoleted=None):
        ti_new = self.new(new, libdnf.transaction.TransactionItemAction_DOWNGRADE)
        ti_old = self.new(old, libdnf.transaction.TransactionItemAction_DOWNGRADED, replaced_by=ti_new)
        self._add_obsoleted(obsoleted, replaced_by=ti_new)

    def add_erase(self, old, reason=None):
        self.add_remove(old, reason)

    def add_install(self, new, obsoleted=None, reason=None):
        if reason is None:
            reason = libdnf.transaction.TransactionItemReason_USER
        ti_new = self.new(new, libdnf.transaction.TransactionItemAction_INSTALL, reason)
        self._add_obsoleted(obsoleted, replaced_by=ti_new)

    def add_reinstall(self, new, old, obsoleted=None):
        ti_new = self.new(new, libdnf.transaction.TransactionItemAction_REINSTALL)
        ti_old = self.new(old, libdnf.transaction.TransactionItemAction_REINSTALLED, replaced_by=ti_new)
        self._add_obsoleted(obsoleted, replaced_by=ti_new)

    def add_remove(self, old, reason=None):
        reason = reason or libdnf.transaction.TransactionItemReason_USER
        ti_old = self.new(old, libdnf.transaction.TransactionItemAction_REMOVE, reason)

    def add_upgrade(self, new, old, obsoleted=None):
        ti_new = self.new(new, libdnf.transaction.TransactionItemAction_UPGRADE)
        ti_old = self.new(old, libdnf.transaction.TransactionItemAction_UPGRADED, replaced_by=ti_new)
        self._add_obsoleted(obsoleted, replaced_by=ti_new)

    def _test_fail_safe(self, hdr, pkg):
        if pkg._from_cmdline:
            return 0
        if pkg.repo.module_hotfixes:
            return 0
        try:
            if hdr['modularitylabel'] and not pkg._is_in_active_module():
                logger.critical(_("No available modular metadata for modular package '{}', "
                                  "it cannot be installed on the system").format(pkg))
                return 1
        except ValueError:
            return 0
        return 0

    def _populate_rpm_ts(self, ts):
        """Populate the RPM transaction set."""
        modular_problems = 0

        for tsi in self:
            try:
                if tsi.action == libdnf.transaction.TransactionItemAction_DOWNGRADE:
                    hdr = tsi.pkg._header
                    modular_problems += self._test_fail_safe(hdr, tsi.pkg)
                    ts.addInstall(hdr, tsi, 'u')
                elif tsi.action == libdnf.transaction.TransactionItemAction_DOWNGRADED:
                    ts.addErase(tsi.pkg.idx)
                elif tsi.action == libdnf.transaction.TransactionItemAction_INSTALL:
                    hdr = tsi.pkg._header
                    modular_problems += self._test_fail_safe(hdr, tsi.pkg)
                    ts.addInstall(hdr, tsi, 'i')
                elif tsi.action == libdnf.transaction.TransactionItemAction_OBSOLETE:
                    hdr = tsi.pkg._header
                    modular_problems += self._test_fail_safe(hdr, tsi.pkg)
                    ts.addInstall(hdr, tsi, 'u')
                elif tsi.action == libdnf.transaction.TransactionItemAction_OBSOLETED:
                    ts.addErase(tsi.pkg.idx)
                elif tsi.action == libdnf.transaction.TransactionItemAction_REINSTALL:
                    # note: in rpm 4.12 there should not be set
                    # rpm.RPMPROB_FILTER_REPLACEPKG to work
                    hdr = tsi.pkg._header
                    modular_problems += self._test_fail_safe(hdr, tsi.pkg)
                    ts.addReinstall(hdr, tsi)
                elif tsi.action == libdnf.transaction.TransactionItemAction_REINSTALLED:
                    # Required when multiple packages with the same NEVRA marked as installed
                    ts.addErase(tsi.pkg.idx)
                elif tsi.action == libdnf.transaction.TransactionItemAction_REMOVE:
                    ts.addErase(tsi.pkg.idx)
                elif tsi.action == libdnf.transaction.TransactionItemAction_UPGRADE:
                    hdr = tsi.pkg._header
                    modular_problems += self._test_fail_safe(hdr, tsi.pkg)
                    ts.addInstall(hdr, tsi, 'u')
                elif tsi.action == libdnf.transaction.TransactionItemAction_UPGRADED:
                    ts.addErase(tsi.pkg.idx)
                elif tsi.action == libdnf.transaction.TransactionItemAction_REASON_CHANGE:
                    pass
                else:
                    raise RuntimeError("TransactionItemAction not handled: %s" % tsi.action)
            except rpm.error as e:
                raise dnf.exceptions.Error(_("An rpm exception occurred: %s" % e))
        if modular_problems:
            raise dnf.exceptions.Error(_("No available modular metadata for modular package"))

        return ts

    @property
    def install_set(self):
        # :api
        result = set()
        for tsi in self:
            if tsi.action in dnf.transaction.FORWARD_ACTIONS:
                try:
                    result.add(tsi.pkg)
                except KeyError:
                    raise RuntimeError("TransactionItem is has no RPM attached: %s" % tsi)
        return result

    @property
    def remove_set(self):
        # :api
        result = set()
        for tsi in self:
            if tsi.action in dnf.transaction.BACKWARD_ACTIONS + [libdnf.transaction.TransactionItemAction_REINSTALLED]:
                try:
                    result.add(tsi.pkg)
                except KeyError:
                    raise RuntimeError("TransactionItem is has no RPM attached: %s" % tsi)
        return result

    def _rpm_limitations(self):
        """ Ensures all the members can be passed to rpm as they are to perform
            the transaction.
        """
        src_installs = [pkg for pkg in self.install_set if pkg.arch == 'src']
        if len(src_installs):
            return _("Will not install a source rpm package (%s).") % \
                src_installs[0]
        return None

    def _get_items(self, action):
        return [tsi for tsi in self if tsi.action == action]
site-packages/dnf/db/history.py000064400000035045147511334650012513 0ustar00# -*- coding: utf-8 -*-

# Copyright (C) 2009, 2012-2018  Red Hat, Inc.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU Library General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
#

import calendar
import os
import time

import libdnf.transaction
import libdnf.utils

from dnf.i18n import ucd
from dnf.yum import misc
from dnf.exceptions import DatabaseError

from .group import GroupPersistor, EnvironmentPersistor, RPMTransaction


class RPMTransactionItemWrapper(object):
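    """Thin wrapper around a libdnf transaction item that exposes its RPM
    attributes (name, epoch, version, release, arch, ...) and transaction
    metadata (action, reason, state) as Python properties."""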
    def __init__(self, swdb, item):
        assert item is not None
        self._swdb = swdb
        self._item = item

    def __str__(self):
        return self._item.getItem().toStr()

    def __lt__(self, other):
        return self._item < other._item

    def __eq__(self, other):
        return self._item == other._item

    def __hash__(self):
        return self._item.__hash__()

    def match(self, pattern):
        return True

    def is_package(self):
        return self._item.getRPMItem() is not None

    def is_group(self):
        return self._item.getCompsGroupItem() is not None

    def is_environment(self):
        return self._item.getCompsEnvironmentItem() is not None

    def get_group(self):
        return self._item.getCompsGroupItem()

    def get_environment(self):
        return self._item.getCompsEnvironmentItem()

    @property
    def name(self):
        return self._item.getRPMItem().getName()

    @property
    def epoch(self):
        return self._item.getRPMItem().getEpoch()

    @property
    def version(self):
        return self._item.getRPMItem().getVersion()

    @property
    def release(self):
        return self._item.getRPMItem().getRelease()

    @property
    def arch(self):
        return self._item.getRPMItem().getArch()

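    # For example (hypothetical values): epoch 1, version 2.3, release 4.fc30
    # formats as "1:2.3-4.fc30"; without an epoch it is just "2.3-4.fc30".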
    @property
    def evr(self):
        if self.epoch:
            return "{}:{}-{}".format(self.epoch, self.version, self.release)
        return "{}-{}".format(self.version, self.release)

    @property
    def nevra(self):
        return self._item.getRPMItem().getNEVRA()

    @property
    def action(self):
        return self._item.getAction()

    @action.setter
    def action(self, value):
        self._item.setAction(value)

    @property
    def reason(self):
        return self._item.getReason()

    @reason.setter
    def reason(self, value):
        return self._item.setReason(value)

    @property
    def action_name(self):
        try:
            return self._item.getActionName()
        except AttributeError:
            return ""

    @property
    def action_short(self):
        try:
            return self._item.getActionShort()
        except AttributeError:
            return ""

    @property
    def state(self):
        return self._item.getState()

    @state.setter
    def state(self, value):
        self._item.setState(value)

    @property
    def from_repo(self):
        return self._item.getRepoid()

    def ui_from_repo(self):
        if not self._item.getRepoid():
            return ""
        return "@" + self._item.getRepoid()

    @property
    def obsoleting(self):
        return None

    def get_reason(self):
        # TODO: get_history_reason
        return self._swdb.rpm.get_reason(self)

    @property
    def pkg(self):
        return self._swdb.rpm._swdb_ti_pkg[self._item]

    @property
    def files(self):
        return self.pkg.files

    @property
    def _active(self):
        return self.pkg


class TransactionWrapper(object):
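    """Read-only wrapper around a single stored history transaction,
    exposing its metadata (id, timestamps, rpmdb versions, ...) and items."""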

    altered_lt_rpmdb = False
    altered_gt_rpmdb = False

    def __init__(self, trans):
        self._trans = trans

    @property
    def tid(self):
        return self._trans.getId()

    @property
    def cmdline(self):
        return self._trans.getCmdline()

    @property
    def releasever(self):
        return self._trans.getReleasever()

    @property
    def beg_timestamp(self):
        return self._trans.getDtBegin()

    @property
    def end_timestamp(self):
        return self._trans.getDtEnd()

    @property
    def beg_rpmdb_version(self):
        return self._trans.getRpmdbVersionBegin()

    @property
    def end_rpmdb_version(self):
        return self._trans.getRpmdbVersionEnd()

    @property
    def return_code(self):
        return int(self._trans.getState() != libdnf.transaction.TransactionItemState_DONE)

    @property
    def loginuid(self):
        return self._trans.getUserId()

    @property
    def data(self):
        return self.packages

    @property
    def is_output(self):
        output = self._trans.getConsoleOutput()
        return bool(output)

    @property
    def comment(self):
        return self._trans.getComment()

    def tids(self):
        return [self._trans.getId()]

    def performed_with(self):
        return []

    def packages(self):
        result = self._trans.getItems()
        return [RPMTransactionItemWrapper(self, i) for i in result]

    def output(self):
        return [i[1] for i in self._trans.getConsoleOutput()]

    def error(self):
        return []

    def compare_rpmdbv(self, rpmdbv):
        self.altered_gt_rpmdb = self._trans.getRpmdbVersionEnd() != rpmdbv


class MergedTransactionWrapper(TransactionWrapper):
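    """View over several transactions merged into one; scalar properties of
    TransactionWrapper (loginuid, cmdline, releasever, ...) become lists here."""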

    def __init__(self, trans):
        self._trans = libdnf.transaction.MergedTransaction(trans._trans)

    def merge(self, trans):
        self._trans.merge(trans._trans)

    @property
    def loginuid(self):
        return self._trans.listUserIds()

    def tids(self):
        return self._trans.listIds()

    @property
    def return_code(self):
        return [int(i != libdnf.transaction.TransactionItemState_DONE) for i in self._trans.listStates()]

    @property
    def cmdline(self):
        return self._trans.listCmdlines()

    @property
    def releasever(self):
        return self._trans.listReleasevers()

    @property
    def comment(self):
        return self._trans.listComments()

    def output(self):
        return [i[1] for i in self._trans.getConsoleOutput()]

class SwdbInterface(object):
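    """High-level interface to the SWDB (software database) that stores the
    DNF transaction history: packages, groups, environments and console output."""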

    def __init__(self, db_dir, releasever=""):
        # TODO: record all vars
        # TODO: remove releasever from options
        self.releasever = str(releasever)
        self._rpm = None
        self._group = None
        self._env = None
        self._addon_data = None
        self._swdb = None
        self._db_dir = db_dir
        self._output = []

    def __del__(self):
        self.close()

    @property
    def rpm(self):
        if self._rpm is None:
            self._rpm = RPMTransaction(self)
        return self._rpm

    @property
    def group(self):
        if self._group is None:
            self._group = GroupPersistor(self)
        return self._group

    @property
    def env(self):
        if self._env is None:
            self._env = EnvironmentPersistor(self)
        return self._env

    @property
    def dbpath(self):
        return os.path.join(self._db_dir, libdnf.transaction.Swdb.defaultDatabaseName)

    @property
    def swdb(self):
        """ Lazy initialize Swdb object """
        if not self._swdb:
            # _db_dir == persistdir which is prepended with installroot already
            try:
                self._swdb = libdnf.transaction.Swdb(self.dbpath)
            except RuntimeError as ex:
                raise DatabaseError(str(ex))
            self._swdb.initTransaction()
            # TODO: vars -> libdnf
        return self._swdb

    def transform(self, input_dir):
        transformer = libdnf.transaction.Transformer(input_dir, self.dbpath)
        transformer.transform()

    def close(self):
        try:
            del self._tid
        except AttributeError:
            pass
        self._rpm = None
        self._group = None
        self._env = None
        if self._swdb:
            self._swdb.closeTransaction()
            self._swdb.closeDatabase()
        self._swdb = None
        self._output = []

    @property
    def path(self):
        return self.swdb.getPath()

    def reset_db(self):
        return self.swdb.resetDatabase()

    # TODO: rename to get_last_transaction?
    def last(self, complete_transactions_only=True):
        # TODO: complete_transactions_only
        t = self.swdb.getLastTransaction()
        if not t:
            return None
        return TransactionWrapper(t)

    # TODO: rename to: list_transactions?
    def old(self, tids=None, limit=0, complete_transactions_only=False):
        tids = tids or []
        tids = [int(i) for i in tids]
        result = self.swdb.listTransactions()
        result = [TransactionWrapper(i) for i in result]
        # TODO: move to libdnf
        if tids:
            result = [i for i in result if i.tid in tids]

        # populate altered_lt_rpmdb and altered_gt_rpmdb
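        # If a transaction's starting rpmdb version does not match the previous
        # transaction's ending version, the rpmdb was modified outside of these
        # transactions (e.g. by direct rpm calls), so flag both neighbours.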
        for i, trans in enumerate(result):
            if i == 0:
                continue
            prev_trans = result[i-1]
            if trans._trans.getRpmdbVersionBegin() != prev_trans._trans.getRpmdbVersionEnd():
                trans.altered_lt_rpmdb = True
                prev_trans.altered_gt_rpmdb = True
        return result[::-1]

    def get_current(self):
        return TransactionWrapper(self.swdb.getCurrent())

    def set_reason(self, pkg, reason):
        """Set reason for package"""
        rpm_item = self.rpm._pkg_to_swdb_rpm_item(pkg)
        repoid = self.repo(pkg)
        action = libdnf.transaction.TransactionItemAction_REASON_CHANGE
        ti = self.swdb.addItem(rpm_item, repoid, action, reason)
        ti.setState(libdnf.transaction.TransactionItemState_DONE)
        return ti

    '''
    def package(self, pkg):
        """Get SwdbPackage from package"""
        return self.swdb.package(str(pkg))
    '''

    def repo(self, pkg):
        """Get repository of package"""
        return self.swdb.getRPMRepo(str(pkg))

    def package_data(self, pkg):
        """Get package data for package"""
        # trans item is returned
        result = self.swdb.getRPMTransactionItem(str(pkg))
        if result is None:
            return result
        result = RPMTransactionItemWrapper(self, result)
        return result

#    def reason(self, pkg):
#        """Get reason for package"""
#        result = self.swdb.resolveRPMTransactionItemReason(pkg.name, pkg.arch, -1)
#        return result

    # TODO: rename to begin_transaction?
    def beg(self, rpmdb_version, using_pkgs, tsis, cmdline=None, comment=""):
        try:
            self.swdb.initTransaction()
        except Exception:
            # the transaction may already be initialized
            pass

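        # Record the transaction start: UTC timestamp, rpmdb version before the
        # transaction, the command line, the calling user's login UID and an
        # optional comment.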
        tid = self.swdb.beginTransaction(
            int(calendar.timegm(time.gmtime())),
            str(rpmdb_version),
            cmdline or "",
            int(misc.getloginuid()),
            comment)
        self.swdb.setReleasever(self.releasever)
        self._tid = tid

        return tid

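    # Build a libdnf SWDB RPM item carrying the NEVRA fields of a DNF package
    # object so it can be stored in the history database.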
    def pkg_to_swdb_rpm_item(self, po):
        rpm_item = self.swdb.createRPMItem()
        rpm_item.setName(po.name)
        rpm_item.setEpoch(po.epoch or 0)
        rpm_item.setVersion(po.version)
        rpm_item.setRelease(po.release)
        rpm_item.setArch(po.arch)
        return rpm_item

    def log_scriptlet_output(self, msg):
        if not hasattr(self, '_tid'):
            return
        if not msg:
            return
        for line in msg.splitlines():
            line = ucd(line)
            # logging directly to database fails if transaction runs in a background process
            self._output.append((1, line))

    '''
    def _log_errors(self, errors):
        for error in errors:
            error = ucd(error)
            self.swdb.log_error(self._tid, error)
    '''

    def end(self, end_rpmdb_version="", return_code=None, errors=None):
        if not hasattr(self, '_tid'):
            return  # Failed at beg() time

        if return_code is None:
            # return_code/state auto-detection
            return_code = libdnf.transaction.TransactionState_DONE
            for tsi in self.rpm:
                if tsi.state == libdnf.transaction.TransactionItemState_ERROR:
                    return_code = libdnf.transaction.TransactionState_ERROR
                    break

        for file_descriptor, line in self._output:
            self.swdb.addConsoleOutputLine(file_descriptor, line)
        self._output = []

        self.swdb.endTransaction(
            int(time.time()),
            str(end_rpmdb_version),
            return_code,
        )

        # Closing and cleanup is done in the close() method.
        # It is important to keep data around after the transaction ends
        # because it's needed by plugins to report installed packages etc.

    # TODO: ignore_case, more patterns
    def search(self, patterns, ignore_case=True):
        """ Search for history transactions which contain specified
            packages al. la. "yum list". Returns transaction ids. """
        return self.swdb.searchTransactionsByRPM(patterns)

    def user_installed(self, pkg):
        """Returns True if package is user installed"""
        reason = self.swdb.resolveRPMTransactionItemReason(pkg.name, pkg.arch, -1)
        if reason == libdnf.transaction.TransactionItemReason_USER:
            return True
        # if reason is not known, consider a package user-installed
        # because it was most likely installed via rpm
        if reason == libdnf.transaction.TransactionItemReason_UNKNOWN:
            return True
        return False

    def get_erased_reason(self, pkg, first_trans, rollback):
        """Get reason of package before transaction being undone. If package
        is already installed in the system, keep his reason.

        :param pkg: package being installed
        :param first_trans: id of first transaction being undone
        :param rollback: True if transaction is performing a rollback"""
        if rollback:
            # return the reason at the point of rollback; we're setting that reason
            result = self.swdb.resolveRPMTransactionItemReason(pkg.name, pkg.arch, first_trans)
        else:
            result = self.swdb.resolveRPMTransactionItemReason(pkg.name, pkg.arch, -1)

        # consider unknown reason as user-installed
        if result == libdnf.transaction.TransactionItemReason_UNKNOWN:
            result = libdnf.transaction.TransactionItemReason_USER
        return result
site-packages/dnf/db/__init__.py000064400000001412147511334650012540 0ustar00# Copyright (C) 2017  Red Hat, Inc.
#
# DNF database subpackage
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU Library General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
site-packages/dnf/db/__pycache__/group.cpython-36.opt-1.pyc000064400000033766147511334650017401 0ustar003

i�-e�<�@s�ddlZddlZddlZddlZddlmZddlm	Z	ddl
Z
Gdd�de�ZGdd�de�Z
Gdd	�d	e�ZGd
d�de�ZdS)�N)�_)�loggerc@sleZdZdd�Zdd�Zdd�Zdd�Zd	d
�Zdd�Zd
d�Z	dd�Z
dd�Zdd�Zdd�Z
dd�ZdS)�
PersistorBasecCs"||_i|_i|_i|_i|_dS)N)�history�
_installed�_removed�	_upgraded�_downgraded)�selfr�r�/usr/lib/python3.6/group.py�__init__ s
zPersistorBase.__init__cCs(t|j�t|j�t|j�t|j�S)N)�lenrrrr	)r
rrr�__len__(szPersistorBase.__len__cCsi|_i|_i|_i|_dS)N)rrrr	)r
rrr�clean+szPersistorBase.cleancCst�dS)N)�NotImplementedError)r
�objrrr�_get_obj_id1szPersistorBase._get_obj_idcCs*|jjj|d|tjj�}|jtjj�dS)N�)r�swdb�addItem�libdnf�transaction�TransactionItemReason_USERZsetStateZTransactionItemState_DONE)r
�item�action�tirrr�_add_to_history4szPersistorBase._add_to_historycCs$||j|j|�<|j|tjj�dS)N)rrrrr�TransactionItemAction_INSTALL)r
rrrr�install8szPersistorBase.installcCs$||j|j|�<|j|tjj�dS)N)rrrrr�TransactionItemAction_REMOVE)r
rrrr�remove<szPersistorBase.removecCs$||j|j|�<|j|tjj�dS)N)rrrrr�TransactionItemAction_UPGRADE)r
rrrr�upgrade@szPersistorBase.upgradecCs$||j|j|�<|j|tjj�dS)N)r	rrrr�TransactionItemAction_DOWNGRADE)r
rrrr�	downgradeDszPersistorBase.downgradecCst�dS)N)r)r
�obj_id�name�translated_name�	pkg_typesrrr�newHszPersistorBase.newcCst�dS)N)r)r
r&rrr�getKszPersistorBase.getcCst�dS)N)r)r
�patternrrr�search_by_patternNszPersistorBase.search_by_patternN)�__name__�
__module__�__qualname__r
rrrrrr!r#r%r*r+r-rrrrrsrc@sDeZdZdd�Zdd�Zdd�Zdd�Zd	d
�Zdd�Zd
d�Z	dS)�GroupPersistorcCs"|jjj�}dd�|D�}t|�S)NcSsg|]}|j�r|�qSr)�getCompsGroupItem)�.0�irrr�
<listcomp>Vsz+GroupPersistor.__iter__.<locals>.<listcomp>)rr�getItems�iter)r
�itemsrrr�__iter__TszGroupPersistor.__iter__cCs|j�S)N)�
getGroupId)r
rrrrrYszGroupPersistor._get_obj_idcCsH|jjj�}|j|�|dk	r(|j|�|dk	r:|j|�|j|�|S)N)rrZcreateCompsGroupItemZ
setGroupId�setName�setTranslatedName�setPackageTypes)r
r&r'r(r)�
swdb_grouprrrr*\s



zGroupPersistor.newcCs"|jjj|�}|sdS|j�}|S)N)rrr2)r
r&r>rrrr+fs
zGroupPersistor.getcCs|jjj|�S)N)rrZgetCompsGroupItemsByPattern)r
r,rrrr-msz GroupPersistor.search_by_patterncCs|jjj|�S)N)rrZgetPackageCompsGroups)r
�pkg_namerrr�get_package_groupspsz!GroupPersistor.get_package_groupscCs�|jjj|dd�}|tjjkr"dSt|j|��}xJ|jj	�D]<\}}x2|j
�D]&}|j�|kr`qN|j�sjqN|j
|�qNWq<WxJ|jj	�D]<\}}x2|j
�D]&}|j�|kr�q�|j�s�q�|j|�q�Wq�W|r�dSdS)Nr�FT���)rr�resolveRPMTransactionItemReasonrrZTransactionItemReason_GROUP�setr@rr8ZgetPackagesZgetName�getInstalledr!r�add)r
r?�reasonZpackage_groups�group_id�group�pkgrrr�is_removable_pkgss*zGroupPersistor.is_removable_pkgN)
r.r/r0r9rr*r+r-r@rKrrrrr1Rs
r1c@sDeZdZdd�Zdd�Zdd�Zdd�Zd	d
�Zdd�Zd
d�Z	dS)�EnvironmentPersistorcCs"|jjj�}dd�|D�}t|�S)NcSsg|]}|j�r|�qSr)�getCompsEnvironmentItem)r3r4rrrr5�sz1EnvironmentPersistor.__iter__.<locals>.<listcomp>)rrr6r7)r
r8rrrr9�szEnvironmentPersistor.__iter__cCs|j�S)N)ZgetEnvironmentId)r
rrrrr�sz EnvironmentPersistor._get_obj_idcCsH|jjj�}|j|�|dk	r(|j|�|dk	r:|j|�|j|�|S)N)rrZcreateCompsEnvironmentItemZsetEnvironmentIdr;r<r=)r
r&r'r(r)�swdb_envrrrr*�s



zEnvironmentPersistor.newcCs"|jjj|�}|sdS|j�}|S)N)rrrM)r
r&rNrrrr+�s
zEnvironmentPersistor.getcCs|jjj|�S)N)rrZ!getCompsEnvironmentItemsByPattern)r
r,rrrr-�sz&EnvironmentPersistor.search_by_patterncCs|jjj|�S)N)rrZgetCompsGroupEnvironments)r
rHrrr�get_group_environments�sz+EnvironmentPersistor.get_group_environmentscCs�|jjj|�}|sdSt|j|��}xJ|jj�D]<\}}x2|j�D]&}|j�|krTqB|j	�s^qB|j
|�qBWq0WxJ|jj�D]<\}}x2|j�D]&}|j�|kr�q�|j	�s�q�|j|�q�Wq|W|r�dSdS)NFT)
rrIr+rDrOrr8Z	getGroupsr:rEr!rrF)r
rHr>Zgroup_environmentsZenv_id�envrIrrr�is_removable_group�s*z'EnvironmentPersistor.is_removable_groupN)
r.r/r0r9rr*r+r-rOrQrrrrrL�s
rLc@s�eZdZd,dd�Zdd�Zdd�Zdd	�Zd-d
d�Zdd
�Zdd�Z	dd�Z
d.dd�Zd/dd�Zd0dd�Z
d1dd�Zd2dd�Zd3dd�Zd4dd�Zd d!�Zd"d#�Zed$d%��Zed&d'��Zd(d)�Zd*d+�ZdS)5�RPMTransactionNc	Cs:||_||_|js0y|jjj�WnYnXi|_dS)N)rrrZinitTransaction�_swdb_ti_pkg)r
rrrrrr
�szRPMTransaction.__init__cs8�jr�jj�}n�jjj�}�fdd�|D�}t|�S)Ncs&g|]}|j�rtjjj�j|��qSr)�
getRPMItem�dnf�dbr�RPMTransactionItemWrapper)r3r4)r
rrr5�sz+RPMTransaction.__iter__.<locals>.<listcomp>)rr6rrr7)r
r8r)r
rr9�s
zRPMTransaction.__iter__cs8�jr�jj�}n�jjj�}�fdd�|D�}t|�S)Ncs&g|]}|j�rtjjj�j|��qSr)rTrUrVrrW)r3r4)r
rrr5�sz*RPMTransaction.__len__.<locals>.<listcomp>)rr6rrr)r
r8r)r
rr�s
zRPMTransaction.__len__cCsP|jjj�}|j|j�|j|jp$d�|j|j�|j	|j
�|j|j�|S)Nr)
rrZ
createRPMItemr;r'ZsetEpochZepochZ
setVersion�versionZ
setRelease�releaseZsetArch�arch)r
rJ�rpm_itemrrr�_pkg_to_swdb_rpm_item�sz$RPMTransaction._pkg_to_swdb_rpm_itemcCsV|j|�}|j|�}|dkr&|j|�}|jjj||||�}|rH|j|�||j|<|S)N)r\�
get_repoid�
get_reasonrrr�
addReplacedByrS)r
rJrrG�replaced_byr[Zrepoid�resultrrrr*�s




zRPMTransaction.newcCst|dd�}|r|S|jS)NZ_force_swdb_repoid)�getattrZreponame)r
rJrarrrr]�szRPMTransaction.get_repoidcCs|jjj|j|jd�S)zGet reason for package����)rrrCr'rZ)r
rJrrrr^szRPMTransaction.get_reasoncCstjj|j|��S)zGet reason for package)rrZTransactionItemReasonToStringr^)r
rJrrr�get_reason_nameszRPMTransaction.get_reason_namecCs8|pg}x*|D]"}|j|tjj�}|r|j|�qWdS)N)r*rr�TransactionItemAction_OBSOLETEDr_)r
�	obsoletedr`Zobsrrrr�_add_obsoleted
s

zRPMTransaction._add_obsoletedcCs6|j|tjj�}|j|tjj|d�}|j||d�dS)N)r`)r*rrr$� TransactionItemAction_DOWNGRADEDrh)r
r*�oldrg�ti_new�ti_oldrrr�
add_downgradeszRPMTransaction.add_downgradecCs|j||�dS)N)�
add_remove)r
rjrGrrr�	add_eraseszRPMTransaction.add_erasecCs4|dkrtjj}|j|tjj|�}|j||d�dS)N)r`)rrrr*rrh)r
r*rgrGrkrrr�add_installszRPMTransaction.add_installcCs6|j|tjj�}|j|tjj|d�}|j||d�dS)N)r`)r*rr�TransactionItemAction_REINSTALL�!TransactionItemAction_REINSTALLEDrh)r
r*rjrgrkrlrrr�
add_reinstallszRPMTransaction.add_reinstallcCs"|p
tjj}|j|tjj|�}dS)N)rrrr*r )r
rjrGrlrrrrn$szRPMTransaction.add_removecCs6|j|tjj�}|j|tjj|d�}|j||d�dS)N)r`)r*rrr"�TransactionItemAction_UPGRADEDrh)r
r*rjrgrkrlrrr�add_upgrade(szRPMTransaction.add_upgradecCs^|jr
dS|jjrdSy.|drB|j�rBtjtd�j|��dSWntk
rXdSXdS)NrZmodularitylabelz\No available modular metadata for modular package '{}', it cannot be installed on the systemrc)	Z
_from_cmdlineZrepoZmodule_hotfixesZ_is_in_active_modulerZcriticalr�format�
ValueError)r
�hdrrJrrr�_test_fail_safe-szRPMTransaction._test_fail_safecCsRd}�x0|D�]&}�y�|jtjjkrP|jj}||j||j�7}|j||d��n�|jtjjkrp|j	|jj
��n�|jtjjkr�|jj}||j||j�7}|j||d��nL|jtjjkr�|jj}||j||j�7}|j||d��n|jtjj
k�r|j	|jj
�n�|jtjjk�r<|jj}||j||j�7}|j||�n�|jtjjk�r\|j	|jj
�n�|jtjjk�r||j	|jj
�nz|jtjjk�r�|jj}||j||j�7}|j||d�n@|jtjjk�r�|j	|jj
�n |jtjjk�r�ntd|j��Wqtjk
�r2}ztjjtd|���WYdd}~XqXqW|�rNtjjtd���|S)z!Populate the RPM transaction set.r�ur4z%TransactionItemAction not handled: %szAn rpm exception occurred: %sNz1No available modular metadata for modular package)rrrr$rJZ_headerryZ
addInstallriZaddErase�idxrZTransactionItemAction_OBSOLETErfrqZaddReinstallrrr r"rtZ#TransactionItemAction_REASON_CHANGE�RuntimeError�rpm�errorrU�
exceptions�Errorr)r
ZtsZmodular_problems�tsirx�errr�_populate_rpm_ts;sR*zRPMTransaction._populate_rpm_tscCsXt�}xL|D]D}|jtjjkry|j|j�Wqtk
rNtd|��YqXqW|S)Nz*TransactionItem is has no RPM attached: %s)	rDrrUrZFORWARD_ACTIONSrFrJ�KeyErrorr|)r
rar�rrr�install_setms
zRPMTransaction.install_setcCsbt�}xV|D]N}|jtjjtjjgkry|j|j�Wqt	k
rXt
d|��YqXqW|S)Nz*TransactionItem is has no RPM attached: %s)rDrrUrZBACKWARD_ACTIONSrrrrFrJr�r|)r
rar�rrr�
remove_setys
zRPMTransaction.remove_setcCs,dd�|jD�}t|�r(td�|dSdS)zj Ensures all the members can be passed to rpm as they are to perform
            the transaction.
        cSsg|]}|jdkr|�qS)�src)rZ)r3rJrrrr5�sz3RPMTransaction._rpm_limitations.<locals>.<listcomp>z+Will not install a source rpm package (%s).rN)r�rr)r
Zsrc_installsrrr�_rpm_limitations�s

zRPMTransaction._rpm_limitationscs�fdd�|D�S)Ncsg|]}|j�kr|�qSr)r)r3r�)rrrr5�sz-RPMTransaction._get_items.<locals>.<listcomp>r)r
rr)rr�
_get_items�szRPMTransaction._get_items)N)NN)N)N)N)NN)N)N)N)r.r/r0r
r9rr\r*r]r^rerhrmrorprsrnruryr��propertyr�r�r�r�rrrrrR�s*
		







2
rR)Zlibdnf.transactionrZdnf.db.historyrUZdnf.transactionZdnf.exceptionsZdnf.i18nrZdnf.utilrr}�objectrr1rLrRrrrr�<module>s3<<site-packages/dnf/db/__pycache__/history.cpython-36.pyc000064400000043306147511334650016776 0ustar003

�ft`%:�@s�ddlZddlZddlZddlZddlZddlmZddlm	Z	ddl
mZddlm
Z
mZmZGdd�de�ZGd	d
�d
e�ZGdd�de�ZGd
d�de�ZdS)�N)�ucd)�misc)�
DatabaseError�)�GroupPersistor�EnvironmentPersistor�RPMTransactionc@sjeZdZdd�Zdd�Zdd�Zdd�Zd	d
�Zdd�Zd
d�Z	dd�Z
dd�Zdd�Zdd�Z
edd��Zedd��Zedd��Zedd��Zedd ��Zed!d"��Zed#d$��Zed%d&��Zejd'd&��Zed(d)��Zejd*d)��Zed+d,��Zed-d.��Zed/d0��Zejd1d0��Zed2d3��Zd4d5�Zed6d7��Zd8d9�Zed:d;��Z ed<d=��Z!ed>d?��Z"d@S)A�RPMTransactionItemWrappercCs|dk	st�||_||_dS)N)�AssertionError�_swdb�_item)�self�swdb�item�r�/usr/lib/python3.6/history.py�__init__#sz"RPMTransactionItemWrapper.__init__cCs|jj�j�S)N)rZgetItemZtoStr)r
rrr�__str__(sz!RPMTransactionItemWrapper.__str__cCs|j|jkS)N)r)r
�otherrrr�__lt__+sz RPMTransactionItemWrapper.__lt__cCs|j|jkS)N)r)r
rrrr�__eq__.sz RPMTransactionItemWrapper.__eq__cCs
|jj�S)N)r�__hash__)r
rrrr1sz"RPMTransactionItemWrapper.__hash__cCsdS)NTr)r
�patternrrr�match4szRPMTransactionItemWrapper.matchcCs|jj�dk	S)N)r�
getRPMItem)r
rrr�
is_package7sz$RPMTransactionItemWrapper.is_packagecCs|jj�dk	S)N)r�getCompsGroupItem)r
rrr�is_group:sz"RPMTransactionItemWrapper.is_groupcCs|jj�dk	S)N)r�getCompsEnvironmentItem)r
rrr�is_environment=sz(RPMTransactionItemWrapper.is_environmentcCs
|jj�S)N)rr)r
rrr�	get_group@sz#RPMTransactionItemWrapper.get_groupcCs
|jj�S)N)rr)r
rrr�get_environmentCsz)RPMTransactionItemWrapper.get_environmentcCs|jj�j�S)N)rrZgetName)r
rrr�nameFszRPMTransactionItemWrapper.namecCs|jj�j�S)N)rrZgetEpoch)r
rrr�epochJszRPMTransactionItemWrapper.epochcCs|jj�j�S)N)rrZ
getVersion)r
rrr�versionNsz!RPMTransactionItemWrapper.versioncCs|jj�j�S)N)rrZ
getRelease)r
rrr�releaseRsz!RPMTransactionItemWrapper.releasecCs|jj�j�S)N)rrZgetArch)r
rrr�archVszRPMTransactionItemWrapper.archcCs*|jrdj|j|j|j�Sdj|j|j�S)Nz{}:{}-{}z{}-{})r#�formatr$r%)r
rrr�evrZszRPMTransactionItemWrapper.evrcCs|jj�j�S)N)rrZgetNEVRA)r
rrr�nevra`szRPMTransactionItemWrapper.nevracCs
|jj�S)N)rZ	getAction)r
rrr�actiondsz RPMTransactionItemWrapper.actioncCs|jj|�dS)N)rZ	setAction)r
�valuerrrr*hscCs
|jj�S)N)rZ	getReason)r
rrr�reasonlsz RPMTransactionItemWrapper.reasoncCs|jj|�S)N)rZ	setReason)r
r+rrrr,pscCs$y
|jj�Stk
rdSXdS)N�)rZ
getActionName�AttributeError)r
rrr�action_namets
z%RPMTransactionItemWrapper.action_namecCs$y
|jj�Stk
rdSXdS)Nr-)rZgetActionShortr.)r
rrr�action_short{s
z&RPMTransactionItemWrapper.action_shortcCs
|jj�S)N)r�getState)r
rrr�state�szRPMTransactionItemWrapper.statecCs|jj|�dS)N)r�setState)r
r+rrrr2�scCs
|jj�S)N)r�	getRepoid)r
rrr�	from_repo�sz#RPMTransactionItemWrapper.from_repocCs|jj�sdSd|jj�S)Nr-�@)rr4)r
rrr�ui_from_repo�s
z&RPMTransactionItemWrapper.ui_from_repocCsdS)Nr)r
rrr�
obsoleting�sz$RPMTransactionItemWrapper.obsoletingcCs|jjj|�S)N)r�rpm�
get_reason)r
rrrr:�sz$RPMTransactionItemWrapper.get_reasoncCs|jjj|jS)N)rr9Z_swdb_ti_pkgr)r
rrr�pkg�szRPMTransactionItemWrapper.pkgcCs|jjS)N)r;�files)r
rrrr<�szRPMTransactionItemWrapper.filescCs|jS)N)r;)r
rrr�_active�sz!RPMTransactionItemWrapper._activeN)#�__name__�
__module__�__qualname__rrrrrrrrrr r!�propertyr"r#r$r%r&r(r)r*�setterr,r/r0r2r5r7r8r:r;r<r=rrrrr	"sBr	c@s�eZdZdZdZdd�Zedd��Zedd��Zedd	��Z	ed
d��Z
edd
��Zedd��Zedd��Z
edd��Zedd��Zedd��Zedd��Zedd��Zdd�Zdd�Zd d!�Zd"d#�Zd$d%�Zd&d'�Zd(S))�TransactionWrapperFcCs
||_dS)N)�_trans)r
�transrrrr�szTransactionWrapper.__init__cCs
|jj�S)N)rD�getId)r
rrr�tid�szTransactionWrapper.tidcCs
|jj�S)N)rDZ
getCmdline)r
rrr�cmdline�szTransactionWrapper.cmdlinecCs
|jj�S)N)rDZ
getReleasever)r
rrr�
releasever�szTransactionWrapper.releasevercCs
|jj�S)N)rDZ
getDtBegin)r
rrr�
beg_timestamp�sz TransactionWrapper.beg_timestampcCs
|jj�S)N)rDZgetDtEnd)r
rrr�
end_timestamp�sz TransactionWrapper.end_timestampcCs
|jj�S)N)rD�getRpmdbVersionBegin)r
rrr�beg_rpmdb_version�sz$TransactionWrapper.beg_rpmdb_versioncCs
|jj�S)N)rD�getRpmdbVersionEnd)r
rrr�end_rpmdb_version�sz$TransactionWrapper.end_rpmdb_versioncCst|jj�tjjk�S)N)�intrDr1�libdnf�transaction�TransactionItemState_DONE)r
rrr�return_code�szTransactionWrapper.return_codecCs
|jj�S)N)rDZ	getUserId)r
rrr�loginuid�szTransactionWrapper.loginuidcCs|jS)N)�packages)r
rrr�data�szTransactionWrapper.datacCs|jj�}t|�S)N)rD�getConsoleOutput�bool)r
�outputrrr�	is_output�s
zTransactionWrapper.is_outputcCs
|jj�S)N)rDZ
getComment)r
rrr�comment�szTransactionWrapper.commentcCs|jj�gS)N)rDrF)r
rrr�tids�szTransactionWrapper.tidscCsgS)Nr)r
rrr�performed_with�sz!TransactionWrapper.performed_withcs�jj�}�fdd�|D�S)Ncsg|]}t�|��qSr)r	)�.0�i)r
rr�
<listcomp>�sz/TransactionWrapper.packages.<locals>.<listcomp>)rDZgetItems)r
�resultr)r
rrV�s
zTransactionWrapper.packagescCsdd�|jj�D�S)NcSsg|]}|d�qS)rr)r_r`rrrra�sz-TransactionWrapper.output.<locals>.<listcomp>)rDrX)r
rrrrZ�szTransactionWrapper.outputcCsgS)Nr)r
rrr�error�szTransactionWrapper.errorcCs|jj�|k|_dS)N)rDrN�altered_gt_rpmdb)r
Zrpmdbvrrr�compare_rpmdbv�sz!TransactionWrapper.compare_rpmdbvN)r>r?r@�altered_lt_rpmdbrdrrArGrHrIrJrKrMrOrTrUrWr[r\r]r^rVrZrcrerrrrrC�s*rCc@sheZdZdd�Zdd�Zedd��Zdd�Zed	d
��Zedd��Z	ed
d��Z
edd��Zdd�ZdS)�MergedTransactionWrappercCstjj|j�|_dS)N)rQrRZMergedTransactionrD)r
rErrrr�sz!MergedTransactionWrapper.__init__cCs|jj|j�dS)N)rD�merge)r
rErrrrh�szMergedTransactionWrapper.mergecCs
|jj�S)N)rDZlistUserIds)r
rrrrU�sz!MergedTransactionWrapper.loginuidcCs
|jj�S)N)rDZlistIds)r
rrrr]szMergedTransactionWrapper.tidscCsdd�|jj�D�S)NcSsg|]}t|tjjk��qSr)rPrQrRrS)r_r`rrrrasz8MergedTransactionWrapper.return_code.<locals>.<listcomp>)rDZ
listStates)r
rrrrTsz$MergedTransactionWrapper.return_codecCs
|jj�S)N)rDZlistCmdlines)r
rrrrHsz MergedTransactionWrapper.cmdlinecCs
|jj�S)N)rDZlistReleasevers)r
rrrrIsz#MergedTransactionWrapper.releasevercCs
|jj�S)N)rDZlistComments)r
rrrr\sz MergedTransactionWrapper.commentcCsdd�|jj�D�S)NcSsg|]}|d�qS)rr)r_r`rrrrasz3MergedTransactionWrapper.output.<locals>.<listcomp>)rDrX)r
rrrrZszMergedTransactionWrapper.outputN)
r>r?r@rrhrArUr]rTrHrIr\rZrrrrrg�srgc@s�eZdZd6dd�Zdd�Zedd��Zedd	��Zed
d��Zedd
��Z	edd��Z
dd�Zdd�Zedd��Z
dd�Zd7dd�Zd8dd�Zd d!�Zd"d#�Zd$d%�Zd&d'�Zd9d(d)�Zd*d+�Zd,d-�Zd:d.d/�Zd;d0d1�Zd2d3�Zd4d5�ZdS)<�
SwdbInterfacer-cCs8t|�|_d|_d|_d|_d|_d|_||_g|_dS)N)	�strrI�_rpm�_group�_envZ_addon_datar�_db_dir�_output)r
Zdb_dirrIrrrrs
zSwdbInterface.__init__cCs|j�dS)N)�close)r
rrr�__del__%szSwdbInterface.__del__cCs|jdkrt|�|_|jS)N)rkr)r
rrrr9(s

zSwdbInterface.rpmcCs|jdkrt|�|_|jS)N)rlr)r
rrr�group.s

zSwdbInterface.groupcCs|jdkrt|�|_|jS)N)rmr)r
rrr�env4s

zSwdbInterface.envcCstjj|jtjjj�S)N)�os�path�joinrnrQrR�SwdbZdefaultDatabaseName)r
rrr�dbpath:szSwdbInterface.dbpathcCsZ|jsTytjj|j�|_Wn.tk
rH}ztt|���WYdd}~XnX|jj�|jS)z Lazy initialize Swdb object N)	rrQrRrwrx�RuntimeErrorrrj�initTransaction)r
Zexrrrr>s
zSwdbInterface.swdbcCstjj||j�}|j�dS)N)rQrRZTransformerrx�	transform)r
Z	input_dirZtransformerrrrr{KszSwdbInterface.transformcCsZy|`Wntk
rYnXd|_d|_d|_|jrJ|jj�|jj�d|_g|_dS)N)	�_tidr.rkrlrmrZcloseTransactionZ
closeDatabasero)r
rrrrpOs

zSwdbInterface.closecCs
|jj�S)N)rZgetPath)r
rrrru]szSwdbInterface.pathcCs
|jj�S)N)rZ
resetDatabase)r
rrr�reset_dbaszSwdbInterface.reset_dbTcCs|jj�}|sdSt|�S)N)rZgetLastTransactionrC)r
�complete_transactions_only�trrr�lastes
zSwdbInterface.lastNrFcs��pg�dd��D��|jj�}dd�|D�}�rD�fdd�|D�}xJt|�D]>\}}|dkr`qN||d}|jj�|jj�krNd|_d|_qNW|ddd�S)	NcSsg|]}t|��qSr)rP)r_r`rrrraosz%SwdbInterface.old.<locals>.<listcomp>cSsg|]}t|��qSr)rC)r_r`rrrraqscsg|]}|j�kr|�qSr)rG)r_r`)r]rrratsrrT���)rZlistTransactions�	enumeraterDrLrNrfrd)r
r]�limitr~rbr`rEZ
prev_transr)r]r�oldms

zSwdbInterface.oldcCst|jj��S)N)rCrZ
getCurrent)r
rrr�get_current�szSwdbInterface.get_currentcCsB|jj|�}|j|�}tjj}|jj||||�}|jtjj	�|S)zSet reason for package)
r9Z_pkg_to_swdb_rpm_item�reporQrRZ#TransactionItemAction_REASON_CHANGErZaddItemr3rS)r
r;r,�rpm_itemZrepoidr*Ztirrr�
set_reason�s
zSwdbInterface.set_reasoncCs|jjt|��S)zGet repository of package)rZ
getRPMReporj)r
r;rrrr��szSwdbInterface.repocCs*|jjt|��}|dkr|St||�}|S)zGet package data for packageN)rZgetRPMTransactionItemrjr	)r
r;rbrrr�package_data�s

zSwdbInterface.package_datacCsfy|jj�WnYnX|jjttjtj���t|�|p>dtt	j
��|�}|jj|j�||_
|S)Nr-)rrzZbeginTransactionrP�calendarZtimegm�timeZgmtimerjrZgetloginuidZ
setReleaseverrIr|)r
Z
rpmdb_versionZ
using_pkgsZtsisrHr\rGrrr�beg�s
zSwdbInterface.begcCsN|jj�}|j|j�|j|jp"d�|j|j�|j|j	�|j
|j�|S)Nr)rZ
createRPMItemZsetNamer"ZsetEpochr#Z
setVersionr$Z
setReleaser%ZsetArchr&)r
Zpor�rrr�pkg_to_swdb_rpm_item�s
z"SwdbInterface.pkg_to_swdb_rpm_itemcCsDt|d�sdS|sdSx(|j�D]}t|�}|jjd|f�q WdS)Nr|r)�hasattr�
splitlinesrro�append)r
�msg�linerrr�log_scriptlet_output�s
z"SwdbInterface.log_scriptlet_outputcCs�t|d�sdS|dkrFtjj}x&|jD]}|jtjjkr&tjj}Pq&Wx |jD]\}}|j	j
||�qNWg|_|j	jtt
j
��t|�|�dS)Nr|)r�rQrRZTransactionState_DONEr9r2ZTransactionItemState_ERRORZTransactionState_ERRORrorZaddConsoleOutputLineZendTransactionrPr�rj)r
rOrT�errorsZtsiZfile_descriptorr�rrr�end�s

zSwdbInterface.endcCs|jj|�S)z{ Search for history transactions which contain specified
            packages al. la. "yum list". Returns transaction ids. )rZsearchTransactionsByRPM)r
ZpatternsZignore_caserrr�search�szSwdbInterface.searchcCs8|jj|j|jd�}|tjjkr$dS|tjjkr4dSdS)z)Returns True if package is user installedrTFr�)r�resolveRPMTransactionItemReasonr"r&rQrR�TransactionItemReason_USER�TransactionItemReason_UNKNOWN)r
r;r,rrr�user_installed�szSwdbInterface.user_installedcCsF|r|jj|j|j|�}n|jj|j|jd�}|tjjkrBtjj}|S)a2Get reason of package before transaction being undone. If package
        is already installed in the system, keep his reason.

        :param pkg: package being installed
        :param first_trans: id of first transaction being undone
        :param rollback: True if transaction is performing a rollbackrr�)rr�r"r&rQrRr�r�)r
r;Zfirst_transZrollbackrbrrr�get_erased_reason�szSwdbInterface.get_erased_reason)r-)T)NrF)Nr-)r-NN)T)r>r?r@rrqrAr9rrrsrxrr{rprur}r�r�r�r�r�r�r�r�r�r�r�r�r�rrrrris0




	

ri)r�rtr�Zlibdnf.transactionrQZlibdnf.utilsZdnf.i18nrZdnf.yumrZdnf.exceptionsrrrrrr�objectr	rCrgrirrrr�<module>sM"site-packages/dnf/db/__pycache__/group.cpython-36.pyc000064400000034130147511334650016424 0ustar003

i�-e�<�@s�ddlZddlZddlZddlZddlmZddlm	Z	ddl
Z
Gdd�de�ZGdd�de�Z
Gdd	�d	e�ZGd
d�de�ZdS)�N)�_)�loggerc@sleZdZdd�Zdd�Zdd�Zdd�Zd	d
�Zdd�Zd
d�Z	dd�Z
dd�Zdd�Zdd�Z
dd�ZdS)�
PersistorBasecCsBt|tjjj�s ttt|����||_i|_i|_	i|_
i|_dS)N)�
isinstance�dnf�db�historyZ
SwdbInterface�AssertionError�str�type�
_installed�_removed�	_upgraded�_downgraded)�selfr�r�/usr/lib/python3.6/group.py�__init__ s zPersistorBase.__init__cCs(t|j�t|j�t|j�t|j�S)N)�lenrr
rr)rrrr�__len__(szPersistorBase.__len__cCsi|_i|_i|_i|_dS)N)rr
rr)rrrr�clean+szPersistorBase.cleancCst�dS)N)�NotImplementedError)r�objrrr�_get_obj_id1szPersistorBase._get_obj_idcCs*|jjj|d|tjj�}|jtjj�dS)N�)r�swdb�addItem�libdnf�transaction�TransactionItemReason_USERZsetStateZTransactionItemState_DONE)r�item�action�tirrr�_add_to_history4szPersistorBase._add_to_historycCs$||j|j|�<|j|tjj�dS)N)rrr#rr�TransactionItemAction_INSTALL)rrrrr�install8szPersistorBase.installcCs$||j|j|�<|j|tjj�dS)N)r
rr#rr�TransactionItemAction_REMOVE)rrrrr�remove<szPersistorBase.removecCs$||j|j|�<|j|tjj�dS)N)rrr#rr�TransactionItemAction_UPGRADE)rrrrr�upgrade@szPersistorBase.upgradecCs$||j|j|�<|j|tjj�dS)N)rrr#rr�TransactionItemAction_DOWNGRADE)rrrrr�	downgradeDszPersistorBase.downgradecCst�dS)N)r)r�obj_id�name�translated_name�	pkg_typesrrr�newHszPersistorBase.newcCst�dS)N)r)rr,rrr�getKszPersistorBase.getcCst�dS)N)r)r�patternrrr�search_by_patternNszPersistorBase.search_by_patternN)�__name__�
__module__�__qualname__rrrrr#r%r'r)r+r0r1r3rrrrrsrc@sDeZdZdd�Zdd�Zdd�Zdd�Zd	d
�Zdd�Zd
d�Z	dS)�GroupPersistorcCs"|jjj�}dd�|D�}t|�S)NcSsg|]}|j�r|�qSr)�getCompsGroupItem)�.0�irrr�
<listcomp>Vsz+GroupPersistor.__iter__.<locals>.<listcomp>)rr�getItems�iter)r�itemsrrr�__iter__TszGroupPersistor.__iter__cCs|j�S)N)�
getGroupId)rrrrrrYszGroupPersistor._get_obj_idcCsH|jjj�}|j|�|dk	r(|j|�|dk	r:|j|�|j|�|S)N)rrZcreateCompsGroupItemZ
setGroupId�setName�setTranslatedName�setPackageTypes)rr,r-r.r/�
swdb_grouprrrr0\s



zGroupPersistor.newcCs"|jjj|�}|sdS|j�}|S)N)rrr8)rr,rDrrrr1fs
zGroupPersistor.getcCs|jjj|�S)N)rrZgetCompsGroupItemsByPattern)rr2rrrr3msz GroupPersistor.search_by_patterncCs|jjj|�S)N)rrZgetPackageCompsGroups)r�pkg_namerrr�get_package_groupspsz!GroupPersistor.get_package_groupscCs�|jjj|dd�}|tjjkr"dSt|j|��}xJ|jj	�D]<\}}x2|j
�D]&}|j�|kr`qN|j�sjqN|j
|�qNWq<WxJ|jj	�D]<\}}x2|j
�D]&}|j�|kr�q�|j�s�q�|j|�q�Wq�W|r�dSdS)Nr�FT���)rr�resolveRPMTransactionItemReasonrrZTransactionItemReason_GROUP�setrFr
r>ZgetPackagesZgetName�getInstalledr'r�add)rrE�reasonZpackage_groups�group_id�group�pkgrrr�is_removable_pkgss*zGroupPersistor.is_removable_pkgN)
r4r5r6r?rr0r1r3rFrQrrrrr7Rs
r7c@sDeZdZdd�Zdd�Zdd�Zdd�Zd	d
�Zdd�Zd
d�Z	dS)�EnvironmentPersistorcCs"|jjj�}dd�|D�}t|�S)NcSsg|]}|j�r|�qSr)�getCompsEnvironmentItem)r9r:rrrr;�sz1EnvironmentPersistor.__iter__.<locals>.<listcomp>)rrr<r=)rr>rrrr?�szEnvironmentPersistor.__iter__cCs|j�S)N)ZgetEnvironmentId)rrrrrr�sz EnvironmentPersistor._get_obj_idcCsH|jjj�}|j|�|dk	r(|j|�|dk	r:|j|�|j|�|S)N)rrZcreateCompsEnvironmentItemZsetEnvironmentIdrArBrC)rr,r-r.r/�swdb_envrrrr0�s



zEnvironmentPersistor.newcCs"|jjj|�}|sdS|j�}|S)N)rrrS)rr,rTrrrr1�s
zEnvironmentPersistor.getcCs|jjj|�S)N)rrZ!getCompsEnvironmentItemsByPattern)rr2rrrr3�sz&EnvironmentPersistor.search_by_patterncCs|jjj|�S)N)rrZgetCompsGroupEnvironments)rrNrrr�get_group_environments�sz+EnvironmentPersistor.get_group_environmentscCs�|jjj|�}|sdSt|j|��}xJ|jj�D]<\}}x2|j�D]&}|j�|krTqB|j	�s^qB|j
|�qBWq0WxJ|jj�D]<\}}x2|j�D]&}|j�|kr�q�|j	�s�q�|j|�q�Wq|W|r�dSdS)NFT)
rrOr1rJrUr
r>Z	getGroupsr@rKr'rrL)rrNrDZgroup_environmentsZenv_id�envrOrrr�is_removable_group�s*z'EnvironmentPersistor.is_removable_groupN)
r4r5r6r?rr0r1r3rUrWrrrrrR�s
rRc@s�eZdZd,dd�Zdd�Zdd�Zdd	�Zd-d
d�Zdd
�Zdd�Z	dd�Z
d.dd�Zd/dd�Zd0dd�Z
d1dd�Zd2dd�Zd3dd�Zd4dd�Zd d!�Zd"d#�Zed$d%��Zed&d'��Zd(d)�Zd*d+�ZdS)5�RPMTransactionNc	Cs:||_||_|js0y|jjj�WnYnXi|_dS)N)rrrZinitTransaction�_swdb_ti_pkg)rrrrrrr�szRPMTransaction.__init__cs8�jr�jj�}n�jjj�}�fdd�|D�}t|�S)Ncs&g|]}|j�rtjjj�j|��qSr)�
getRPMItemrrr�RPMTransactionItemWrapper)r9r:)rrrr;�sz+RPMTransaction.__iter__.<locals>.<listcomp>)rr<rrr=)rr>r)rrr?�s
zRPMTransaction.__iter__cs8�jr�jj�}n�jjj�}�fdd�|D�}t|�S)Ncs&g|]}|j�rtjjj�j|��qSr)rZrrrr[)r9r:)rrrr;�sz*RPMTransaction.__len__.<locals>.<listcomp>)rr<rrr)rr>r)rrr�s
zRPMTransaction.__len__cCsP|jjj�}|j|j�|j|jp$d�|j|j�|j	|j
�|j|j�|S)Nr)
rrZ
createRPMItemrAr-ZsetEpochZepochZ
setVersion�versionZ
setRelease�releaseZsetArch�arch)rrP�rpm_itemrrr�_pkg_to_swdb_rpm_item�sz$RPMTransaction._pkg_to_swdb_rpm_itemcCsV|j|�}|j|�}|dkr&|j|�}|jjj||||�}|rH|j|�||j|<|S)N)r`�
get_repoid�
get_reasonrrr�
addReplacedByrY)rrPr!rM�replaced_byr_Zrepoid�resultrrrr0�s




zRPMTransaction.newcCst|dd�}|r|S|jS)NZ_force_swdb_repoid)�getattrZreponame)rrPrerrrra�szRPMTransaction.get_repoidcCs|jjj|j|jd�S)zGet reason for package����)rrrIr-r^)rrPrrrrbszRPMTransaction.get_reasoncCstjj|j|��S)zGet reason for package)rrZTransactionItemReasonToStringrb)rrPrrr�get_reason_nameszRPMTransaction.get_reason_namecCs8|pg}x*|D]"}|j|tjj�}|r|j|�qWdS)N)r0rr�TransactionItemAction_OBSOLETEDrc)r�	obsoletedrdZobsr"rrr�_add_obsoleted
s

zRPMTransaction._add_obsoletedcCs6|j|tjj�}|j|tjj|d�}|j||d�dS)N)rd)r0rrr*� TransactionItemAction_DOWNGRADEDrl)rr0�oldrk�ti_new�ti_oldrrr�
add_downgradeszRPMTransaction.add_downgradecCs|j||�dS)N)�
add_remove)rrnrMrrr�	add_eraseszRPMTransaction.add_erasecCs4|dkrtjj}|j|tjj|�}|j||d�dS)N)rd)rrrr0r$rl)rr0rkrMrorrr�add_installszRPMTransaction.add_installcCs6|j|tjj�}|j|tjj|d�}|j||d�dS)N)rd)r0rr�TransactionItemAction_REINSTALL�!TransactionItemAction_REINSTALLEDrl)rr0rnrkrorprrr�
add_reinstallszRPMTransaction.add_reinstallcCs"|p
tjj}|j|tjj|�}dS)N)rrrr0r&)rrnrMrprrrrr$szRPMTransaction.add_removecCs6|j|tjj�}|j|tjj|d�}|j||d�dS)N)rd)r0rrr(�TransactionItemAction_UPGRADEDrl)rr0rnrkrorprrr�add_upgrade(szRPMTransaction.add_upgradecCs^|jr
dS|jjrdSy.|drB|j�rBtjtd�j|��dSWntk
rXdSXdS)NrZmodularitylabelz\No available modular metadata for modular package '{}', it cannot be installed on the systemrg)	Z
_from_cmdlineZrepoZmodule_hotfixesZ_is_in_active_modulerZcriticalr�format�
ValueError)r�hdrrPrrr�_test_fail_safe-szRPMTransaction._test_fail_safecCsRd}�x0|D�]&}�y�|jtjjkrP|jj}||j||j�7}|j||d��n�|jtjjkrp|j	|jj
��n�|jtjjkr�|jj}||j||j�7}|j||d��nL|jtjjkr�|jj}||j||j�7}|j||d��n|jtjj
k�r|j	|jj
�n�|jtjjk�r<|jj}||j||j�7}|j||�n�|jtjjk�r\|j	|jj
�n�|jtjjk�r||j	|jj
�nz|jtjjk�r�|jj}||j||j�7}|j||d�n@|jtjjk�r�|j	|jj
�n |jtjjk�r�ntd|j��Wqtjk
�r2}ztjjtd|���WYdd}~XqXqW|�rNtjjtd���|S)z!Populate the RPM transaction set.r�ur:z%TransactionItemAction not handled: %szAn rpm exception occurred: %sNz1No available modular metadata for modular package)r!rrr*rPZ_headerr}Z
addInstallrmZaddErase�idxr$ZTransactionItemAction_OBSOLETErjruZaddReinstallrvr&r(rxZ#TransactionItemAction_REASON_CHANGE�RuntimeError�rpm�errorr�
exceptions�Errorr)rZtsZmodular_problems�tsir|�errr�_populate_rpm_ts;sR*zRPMTransaction._populate_rpm_tscCsXt�}xL|D]D}|jtjjkry|j|j�Wqtk
rNtd|��YqXqW|S)Nz*TransactionItem is has no RPM attached: %s)	rJr!rrZFORWARD_ACTIONSrLrP�KeyErrorr�)rrer�rrr�install_setms
zRPMTransaction.install_setcCsbt�}xV|D]N}|jtjjtjjgkry|j|j�Wqt	k
rXt
d|��YqXqW|S)Nz*TransactionItem is has no RPM attached: %s)rJr!rrZBACKWARD_ACTIONSrrvrLrPr�r�)rrer�rrr�
remove_setys
zRPMTransaction.remove_setcCs,dd�|jD�}t|�r(td�|dSdS)zj Ensures all the members can be passed to rpm as they are to perform
            the transaction.
        cSsg|]}|jdkr|�qS)�src)r^)r9rPrrrr;�sz3RPMTransaction._rpm_limitations.<locals>.<listcomp>z+Will not install a source rpm package (%s).rN)r�rr)rZsrc_installsrrr�_rpm_limitations�s

zRPMTransaction._rpm_limitationscs�fdd�|D�S)Ncsg|]}|j�kr|�qSr)r!)r9r�)r!rrr;�sz-RPMTransaction._get_items.<locals>.<listcomp>r)rr!r)r!r�
_get_items�szRPMTransaction._get_items)N)NN)N)N)N)NN)N)N)N)r4r5r6rr?rr`r0rarbrirlrqrsrtrwrrryr}r��propertyr�r�r�r�rrrrrX�s*
		







2
rX)Zlibdnf.transactionrZdnf.db.historyrZdnf.transactionZdnf.exceptionsZdnf.i18nrZdnf.utilrr��objectrr7rRrXrrrr�<module>s3<<site-packages/dnf/db/__pycache__/__init__.cpython-36.opt-1.pyc000064400000000161147511334650017763 0ustar003

�ft`
�@sdS)N�rrr�/usr/lib/python3.6/__init__.py�<module>ssite-packages/dnf/db/__pycache__/__init__.cpython-36.pyc000064400000000161147511334650017024 0ustar003

�ft`
�@sdS)N�rrr�/usr/lib/python3.6/__init__.py�<module>ssite-packages/dnf/db/__pycache__/history.cpython-36.opt-1.pyc000064400000043250147511334650017733 0ustar003

�ft`%:�@s�ddlZddlZddlZddlZddlZddlmZddlm	Z	ddl
mZddlm
Z
mZmZGdd�de�ZGd	d
�d
e�ZGdd�de�ZGd
d�de�ZdS)�N)�ucd)�misc)�
DatabaseError�)�GroupPersistor�EnvironmentPersistor�RPMTransactionc@sjeZdZdd�Zdd�Zdd�Zdd�Zd	d
�Zdd�Zd
d�Z	dd�Z
dd�Zdd�Zdd�Z
edd��Zedd��Zedd��Zedd��Zedd ��Zed!d"��Zed#d$��Zed%d&��Zejd'd&��Zed(d)��Zejd*d)��Zed+d,��Zed-d.��Zed/d0��Zejd1d0��Zed2d3��Zd4d5�Zed6d7��Zd8d9�Zed:d;��Z ed<d=��Z!ed>d?��Z"d@S)A�RPMTransactionItemWrappercCs||_||_dS)N)�_swdb�_item)�self�swdb�item�r�/usr/lib/python3.6/history.py�__init__#sz"RPMTransactionItemWrapper.__init__cCs|jj�j�S)N)rZgetItemZtoStr)rrrr�__str__(sz!RPMTransactionItemWrapper.__str__cCs|j|jkS)N)r)r�otherrrr�__lt__+sz RPMTransactionItemWrapper.__lt__cCs|j|jkS)N)r)rrrrr�__eq__.sz RPMTransactionItemWrapper.__eq__cCs
|jj�S)N)r�__hash__)rrrrr1sz"RPMTransactionItemWrapper.__hash__cCsdS)NTr)r�patternrrr�match4szRPMTransactionItemWrapper.matchcCs|jj�dk	S)N)r�
getRPMItem)rrrr�
is_package7sz$RPMTransactionItemWrapper.is_packagecCs|jj�dk	S)N)r�getCompsGroupItem)rrrr�is_group:sz"RPMTransactionItemWrapper.is_groupcCs|jj�dk	S)N)r�getCompsEnvironmentItem)rrrr�is_environment=sz(RPMTransactionItemWrapper.is_environmentcCs
|jj�S)N)rr)rrrr�	get_group@sz#RPMTransactionItemWrapper.get_groupcCs
|jj�S)N)rr)rrrr�get_environmentCsz)RPMTransactionItemWrapper.get_environmentcCs|jj�j�S)N)rrZgetName)rrrr�nameFszRPMTransactionItemWrapper.namecCs|jj�j�S)N)rrZgetEpoch)rrrr�epochJszRPMTransactionItemWrapper.epochcCs|jj�j�S)N)rrZ
getVersion)rrrr�versionNsz!RPMTransactionItemWrapper.versioncCs|jj�j�S)N)rrZ
getRelease)rrrr�releaseRsz!RPMTransactionItemWrapper.releasecCs|jj�j�S)N)rrZgetArch)rrrr�archVszRPMTransactionItemWrapper.archcCs*|jrdj|j|j|j�Sdj|j|j�S)Nz{}:{}-{}z{}-{})r"�formatr#r$)rrrr�evrZszRPMTransactionItemWrapper.evrcCs|jj�j�S)N)rrZgetNEVRA)rrrr�nevra`szRPMTransactionItemWrapper.nevracCs
|jj�S)N)rZ	getAction)rrrr�actiondsz RPMTransactionItemWrapper.actioncCs|jj|�dS)N)rZ	setAction)r�valuerrrr)hscCs
|jj�S)N)rZ	getReason)rrrr�reasonlsz RPMTransactionItemWrapper.reasoncCs|jj|�S)N)rZ	setReason)rr*rrrr+pscCs$y
|jj�Stk
rdSXdS)N�)rZ
getActionName�AttributeError)rrrr�action_namets
z%RPMTransactionItemWrapper.action_namecCs$y
|jj�Stk
rdSXdS)Nr,)rZgetActionShortr-)rrrr�action_short{s
z&RPMTransactionItemWrapper.action_shortcCs
|jj�S)N)r�getState)rrrr�state�szRPMTransactionItemWrapper.statecCs|jj|�dS)N)r�setState)rr*rrrr1�scCs
|jj�S)N)r�	getRepoid)rrrr�	from_repo�sz#RPMTransactionItemWrapper.from_repocCs|jj�sdSd|jj�S)Nr,�@)rr3)rrrr�ui_from_repo�s
z&RPMTransactionItemWrapper.ui_from_repocCsdS)Nr)rrrr�
obsoleting�sz$RPMTransactionItemWrapper.obsoletingcCs|jjj|�S)N)r
�rpm�
get_reason)rrrrr9�sz$RPMTransactionItemWrapper.get_reasoncCs|jjj|jS)N)r
r8Z_swdb_ti_pkgr)rrrr�pkg�szRPMTransactionItemWrapper.pkgcCs|jjS)N)r:�files)rrrrr;�szRPMTransactionItemWrapper.filescCs|jS)N)r:)rrrr�_active�sz!RPMTransactionItemWrapper._activeN)#�__name__�
__module__�__qualname__rrrrrrrrrrr �propertyr!r"r#r$r%r'r(r)�setterr+r.r/r1r4r6r7r9r:r;r<rrrrr	"sBr	c@s�eZdZdZdZdd�Zedd��Zedd��Zedd	��Z	ed
d��Z
edd
��Zedd��Zedd��Z
edd��Zedd��Zedd��Zedd��Zedd��Zdd�Zdd�Zd d!�Zd"d#�Zd$d%�Zd&d'�Zd(S))�TransactionWrapperFcCs
||_dS)N)�_trans)r�transrrrr�szTransactionWrapper.__init__cCs
|jj�S)N)rC�getId)rrrr�tid�szTransactionWrapper.tidcCs
|jj�S)N)rCZ
getCmdline)rrrr�cmdline�szTransactionWrapper.cmdlinecCs
|jj�S)N)rCZ
getReleasever)rrrr�
releasever�szTransactionWrapper.releasevercCs
|jj�S)N)rCZ
getDtBegin)rrrr�
beg_timestamp�sz TransactionWrapper.beg_timestampcCs
|jj�S)N)rCZgetDtEnd)rrrr�
end_timestamp�sz TransactionWrapper.end_timestampcCs
|jj�S)N)rC�getRpmdbVersionBegin)rrrr�beg_rpmdb_version�sz$TransactionWrapper.beg_rpmdb_versioncCs
|jj�S)N)rC�getRpmdbVersionEnd)rrrr�end_rpmdb_version�sz$TransactionWrapper.end_rpmdb_versioncCst|jj�tjjk�S)N)�intrCr0�libdnf�transaction�TransactionItemState_DONE)rrrr�return_code�szTransactionWrapper.return_codecCs
|jj�S)N)rCZ	getUserId)rrrr�loginuid�szTransactionWrapper.loginuidcCs|jS)N)�packages)rrrr�data�szTransactionWrapper.datacCs|jj�}t|�S)N)rC�getConsoleOutput�bool)r�outputrrr�	is_output�s
zTransactionWrapper.is_outputcCs
|jj�S)N)rCZ
getComment)rrrr�comment�szTransactionWrapper.commentcCs|jj�gS)N)rCrE)rrrr�tids�szTransactionWrapper.tidscCsgS)Nr)rrrr�performed_with�sz!TransactionWrapper.performed_withcs�jj�}�fdd�|D�S)Ncsg|]}t�|��qSr)r	)�.0�i)rrr�
<listcomp>�sz/TransactionWrapper.packages.<locals>.<listcomp>)rCZgetItems)r�resultr)rrrU�s
zTransactionWrapper.packagescCsdd�|jj�D�S)NcSsg|]}|d�qS)rr)r^r_rrrr`�sz-TransactionWrapper.output.<locals>.<listcomp>)rCrW)rrrrrY�szTransactionWrapper.outputcCsgS)Nr)rrrr�error�szTransactionWrapper.errorcCs|jj�|k|_dS)N)rCrM�altered_gt_rpmdb)rZrpmdbvrrr�compare_rpmdbv�sz!TransactionWrapper.compare_rpmdbvN)r=r>r?�altered_lt_rpmdbrcrr@rFrGrHrIrJrLrNrSrTrVrZr[r\r]rUrYrbrdrrrrrB�s*rBc@sheZdZdd�Zdd�Zedd��Zdd�Zed	d
��Zedd��Z	ed
d��Z
edd��Zdd�ZdS)�MergedTransactionWrappercCstjj|j�|_dS)N)rPrQZMergedTransactionrC)rrDrrrr�sz!MergedTransactionWrapper.__init__cCs|jj|j�dS)N)rC�merge)rrDrrrrg�szMergedTransactionWrapper.mergecCs
|jj�S)N)rCZlistUserIds)rrrrrT�sz!MergedTransactionWrapper.loginuidcCs
|jj�S)N)rCZlistIds)rrrrr\szMergedTransactionWrapper.tidscCsdd�|jj�D�S)NcSsg|]}t|tjjk��qSr)rOrPrQrR)r^r_rrrr`sz8MergedTransactionWrapper.return_code.<locals>.<listcomp>)rCZ
listStates)rrrrrSsz$MergedTransactionWrapper.return_codecCs
|jj�S)N)rCZlistCmdlines)rrrrrGsz MergedTransactionWrapper.cmdlinecCs
|jj�S)N)rCZlistReleasevers)rrrrrHsz#MergedTransactionWrapper.releasevercCs
|jj�S)N)rCZlistComments)rrrrr[sz MergedTransactionWrapper.commentcCsdd�|jj�D�S)NcSsg|]}|d�qS)rr)r^r_rrrr`sz3MergedTransactionWrapper.output.<locals>.<listcomp>)rCrW)rrrrrYszMergedTransactionWrapper.outputN)
r=r>r?rrgr@rTr\rSrGrHr[rYrrrrrf�srfc@s�eZdZd6dd�Zdd�Zedd��Zedd	��Zed
d��Zedd
��Z	edd��Z
dd�Zdd�Zedd��Z
dd�Zd7dd�Zd8dd�Zd d!�Zd"d#�Zd$d%�Zd&d'�Zd9d(d)�Zd*d+�Zd,d-�Zd:d.d/�Zd;d0d1�Zd2d3�Zd4d5�ZdS)<�
SwdbInterfacer,cCs8t|�|_d|_d|_d|_d|_d|_||_g|_dS)N)	�strrH�_rpm�_group�_envZ_addon_datar
�_db_dir�_output)rZdb_dirrHrrrrs
zSwdbInterface.__init__cCs|j�dS)N)�close)rrrr�__del__%szSwdbInterface.__del__cCs|jdkrt|�|_|jS)N)rjr)rrrrr8(s

zSwdbInterface.rpmcCs|jdkrt|�|_|jS)N)rkr)rrrr�group.s

zSwdbInterface.groupcCs|jdkrt|�|_|jS)N)rlr)rrrr�env4s

zSwdbInterface.envcCstjj|jtjjj�S)N)�os�path�joinrmrPrQ�SwdbZdefaultDatabaseName)rrrr�dbpath:szSwdbInterface.dbpathcCsZ|jsTytjj|j�|_Wn.tk
rH}ztt|���WYdd}~XnX|jj�|jS)z Lazy initialize Swdb object N)	r
rPrQrvrw�RuntimeErrorrri�initTransaction)rZexrrrr
>s
zSwdbInterface.swdbcCstjj||j�}|j�dS)N)rPrQZTransformerrw�	transform)rZ	input_dirZtransformerrrrrzKszSwdbInterface.transformcCsZy|`Wntk
rYnXd|_d|_d|_|jrJ|jj�|jj�d|_g|_dS)N)	�_tidr-rjrkrlr
ZcloseTransactionZ
closeDatabasern)rrrrroOs

zSwdbInterface.closecCs
|jj�S)N)r
ZgetPath)rrrrrt]szSwdbInterface.pathcCs
|jj�S)N)r
Z
resetDatabase)rrrr�reset_dbaszSwdbInterface.reset_dbTcCs|jj�}|sdSt|�S)N)r
ZgetLastTransactionrB)r�complete_transactions_only�trrr�lastes
zSwdbInterface.lastNrFcs��pg�dd��D��|jj�}dd�|D�}�rD�fdd�|D�}xJt|�D]>\}}|dkr`qN||d}|jj�|jj�krNd|_d|_qNW|ddd�S)	NcSsg|]}t|��qSr)rO)r^r_rrrr`osz%SwdbInterface.old.<locals>.<listcomp>cSsg|]}t|��qSr)rB)r^r_rrrr`qscsg|]}|j�kr|�qSr)rF)r^r_)r\rrr`tsrrT���)r
ZlistTransactions�	enumeraterCrKrMrerc)rr\�limitr}rar_rDZ
prev_transr)r\r�oldms

zSwdbInterface.oldcCst|jj��S)N)rBr
Z
getCurrent)rrrr�get_current�szSwdbInterface.get_currentcCsB|jj|�}|j|�}tjj}|jj||||�}|jtjj	�|S)zSet reason for package)
r8Z_pkg_to_swdb_rpm_item�reporPrQZ#TransactionItemAction_REASON_CHANGEr
ZaddItemr2rR)rr:r+�rpm_itemZrepoidr)Ztirrr�
set_reason�s
zSwdbInterface.set_reasoncCs|jjt|��S)zGet repository of package)r
Z
getRPMRepori)rr:rrrr��szSwdbInterface.repocCs*|jjt|��}|dkr|St||�}|S)zGet package data for packageN)r
ZgetRPMTransactionItemrir	)rr:rarrr�package_data�s

zSwdbInterface.package_datacCsfy|jj�WnYnX|jjttjtj���t|�|p>dtt	j
��|�}|jj|j�||_
|S)Nr,)r
ryZbeginTransactionrO�calendarZtimegm�timeZgmtimerirZgetloginuidZ
setReleaseverrHr{)rZ
rpmdb_versionZ
using_pkgsZtsisrGr[rFrrr�beg�s
zSwdbInterface.begcCsN|jj�}|j|j�|j|jp"d�|j|j�|j|j	�|j
|j�|S)Nr)r
Z
createRPMItemZsetNamer!ZsetEpochr"Z
setVersionr#Z
setReleaser$ZsetArchr%)rZpor�rrr�pkg_to_swdb_rpm_item�s
z"SwdbInterface.pkg_to_swdb_rpm_itemcCsDt|d�sdS|sdSx(|j�D]}t|�}|jjd|f�q WdS)Nr{r)�hasattr�
splitlinesrrn�append)r�msg�linerrr�log_scriptlet_output�s
z"SwdbInterface.log_scriptlet_outputcCs�t|d�sdS|dkrFtjj}x&|jD]}|jtjjkr&tjj}Pq&Wx |jD]\}}|j	j
||�qNWg|_|j	jtt
j
��t|�|�dS)Nr{)r�rPrQZTransactionState_DONEr8r1ZTransactionItemState_ERRORZTransactionState_ERRORrnr
ZaddConsoleOutputLineZendTransactionrOr�ri)rrNrS�errorsZtsiZfile_descriptorr�rrr�end�s

zSwdbInterface.endcCs|jj|�S)z{ Search for history transactions which contain specified
            packages al. la. "yum list". Returns transaction ids. )r
ZsearchTransactionsByRPM)rZpatternsZignore_caserrr�search�szSwdbInterface.searchcCs8|jj|j|jd�}|tjjkr$dS|tjjkr4dSdS)z)Returns True if package is user installedrTFr�)r
�resolveRPMTransactionItemReasonr!r%rPrQ�TransactionItemReason_USER�TransactionItemReason_UNKNOWN)rr:r+rrr�user_installed�szSwdbInterface.user_installedcCsF|r|jj|j|j|�}n|jj|j|jd�}|tjjkrBtjj}|S)a2Get reason of package before transaction being undone. If package
        is already installed in the system, keep his reason.

        :param pkg: package being installed
        :param first_trans: id of first transaction being undone
        :param rollback: True if transaction is performing a rollbackrr�)r
r�r!r%rPrQr�r�)rr:Zfirst_transZrollbackrarrr�get_erased_reason�szSwdbInterface.get_erased_reason)r,)T)NrF)Nr,)r,NN)T)r=r>r?rrpr@r8rqrrrwr
rzrortr|rr�r�r�r�r�r�r�r�r�r�r�r�rrrrrhs0




	

rh)r�rsr�Zlibdnf.transactionrPZlibdnf.utilsZdnf.i18nrZdnf.yumrZdnf.exceptionsrrqrrr�objectr	rBrfrhrrrr�<module>sM"site-packages/dnf/yum/__init__.py000064400000001750147511334650012772 0ustar00# __init__.py
# The legacy YUM subpackage.
#
# Copyright (C) 2013  Red Hat, Inc.
#
# This copyrighted material is made available to anyone wishing to use,
# modify, copy, or redistribute it subject to the terms and conditions of
# the GNU General Public License v.2, or (at your option) any later version.
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY expressed or implied, including the implied warranties of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General
# Public License for more details.  You should have received a copy of the
# GNU General Public License along with this program; if not, write to the
# Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
# 02110-1301, USA.  Any Red Hat trademarks that are incorporated in the
# source code or documentation are not subject to the GNU General Public
# License and may only be used or replicated with the express permission of
# Red Hat, Inc.
#
site-packages/dnf/yum/rpmtrans.py000064400000037212147511334650013103 0ustar00# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU Library General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
# Copyright 2005 Duke University
# Parts Copyright 2007 Red Hat, Inc

from __future__ import print_function, absolute_import
from __future__ import unicode_literals

import libdnf.transaction

from dnf.i18n import _, ucd
import dnf.callback
import dnf.transaction
import dnf.util
import rpm
import os
import logging
import sys
import tempfile
import traceback
import warnings


# TODO: merge w/ libdnf
# transaction set states
TS_UPDATE = 10
TS_INSTALL = 20
TS_ERASE = 40
TS_OBSOLETED = 50
TS_OBSOLETING = 60
TS_AVAILABLE = 70
TS_UPDATED = 90
TS_FAILED = 100

TS_INSTALL_STATES = [TS_INSTALL, TS_UPDATE, TS_OBSOLETING]
TS_REMOVE_STATES = [TS_ERASE, TS_OBSOLETED, TS_UPDATED]

RPM_ACTIONS_SET = {libdnf.transaction.TransactionItemAction_INSTALL,
                   libdnf.transaction.TransactionItemAction_DOWNGRADE,
                   libdnf.transaction.TransactionItemAction_DOWNGRADED,
                   libdnf.transaction.TransactionItemAction_OBSOLETE,
                   libdnf.transaction.TransactionItemAction_OBSOLETED,
                   libdnf.transaction.TransactionItemAction_UPGRADE,
                   libdnf.transaction.TransactionItemAction_UPGRADED,
                   libdnf.transaction.TransactionItemAction_REMOVE,
                   libdnf.transaction.TransactionItemAction_REINSTALLED}

logger = logging.getLogger('dnf')


def _add_deprecated_action(name):
    """
    Wrapper to return a deprecated action constant
    while printing a deprecation warning.
    """
    @property
    def _func(self):
        msg = "%s.%s is deprecated. Use dnf.callback.%s instead." \
            % (self.__class__.__name__, name, name)
        warnings.warn(msg, DeprecationWarning, stacklevel=2)
        value = getattr(dnf.callback, name)
        return value
    return _func


class TransactionDisplay(object):
    # :api
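    # Base class for transaction progress callbacks: subclasses override
    # progress(), scriptout(), error() and filelog() to report on an ongoing
    # RPM transaction.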

    def __init__(self):
        # :api
        pass

    # use constants from dnf.callback which are the official API
    PKG_CLEANUP = _add_deprecated_action("PKG_CLEANUP")
    PKG_DOWNGRADE = _add_deprecated_action("PKG_DOWNGRADE")
    PKG_REMOVE = _add_deprecated_action("PKG_REMOVE")
    PKG_ERASE = PKG_REMOVE
    PKG_INSTALL = _add_deprecated_action("PKG_INSTALL")
    PKG_OBSOLETE = _add_deprecated_action("PKG_OBSOLETE")
    PKG_REINSTALL = _add_deprecated_action("PKG_REINSTALL")
    PKG_UPGRADE = _add_deprecated_action("PKG_UPGRADE")
    PKG_VERIFY = _add_deprecated_action("PKG_VERIFY")
    TRANS_PREPARATION = _add_deprecated_action("TRANS_PREPARATION")
    PKG_SCRIPTLET = _add_deprecated_action("PKG_SCRIPTLET")
    TRANS_POST = _add_deprecated_action("TRANS_POST")

    def progress(self, package, action, ti_done, ti_total, ts_done, ts_total):
        """Report ongoing progress on a transaction item. :api

        :param package: a package being processed
        :param action: the action being performed
        :param ti_done: number of processed bytes of the transaction
           item being processed
        :param ti_total: total number of bytes of the transaction item
           being processed
        :param ts_done: number of actions processed in the whole
           transaction
        :param ts_total: total number of actions in the whole
           transaction

        """
        pass

    def scriptout(self, msgs):
        """Hook for reporting an rpm scriptlet output.

        :param msgs: the scriptlet output
        """
        pass

    def error(self, message):
        """Report an error that occurred during the transaction. :api"""
        pass

    def filelog(self, package, action):
        # check package object type - if it is a string - just output it
        """package is the same as in progress() - a package object or simple
           string action is also the same as in progress()"""
        pass

    def verify_tsi_package(self, pkg, count, total):
        # TODO: replace with verify_tsi?
        self.progress(pkg, dnf.transaction.PKG_VERIFY, 100, 100, count, total)
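
# A minimal sketch (not upstream code) of a display consuming the progress()
# API documented above.  Instances of such a subclass are normally handed to
# the transaction machinery (e.g. via dnf.Base().do_transaction(display=...));
# that wiring is assumed rather than shown here.
class _PercentTransactionDisplay(TransactionDisplay):

    def progress(self, package, action, ti_done, ti_total, ts_done, ts_total):
        if package is None or not ti_total:
            return
        percent = 100 * ti_done // ti_total
        print("[%s/%s] %s: %s%%" % (ts_done, ts_total, package, percent))

    def error(self, message):
        print("transaction error: %s" % message)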


class ErrorTransactionDisplay(TransactionDisplay):

    """An RPMTransaction display that prints errors to standard output."""

    def error(self, message):
        super(ErrorTransactionDisplay, self).error(message)
        dnf.util._terminal_messenger('print', message, sys.stderr)


class LoggingTransactionDisplay(TransactionDisplay):
    '''
    Base class for a RPMTransaction display callback class
    '''
    def __init__(self):
        super(LoggingTransactionDisplay, self).__init__()
        self.rpm_logger = logging.getLogger('dnf.rpm')

    def error(self, message):
        self.rpm_logger.error(message)

    def filelog(self, package, action):
        action_str = dnf.transaction.FILE_ACTIONS[action]
        msg = '%s: %s' % (action_str, package)
        self.rpm_logger.log(dnf.logging.SUBDEBUG, msg)

    def scriptout(self, msgs):
        if msgs:
            self.rpm_logger.info(ucd(msgs))


class RPMTransaction(object):
    def __init__(self, base, test=False, displays=()):
        if not displays:
            displays = [ErrorTransactionDisplay()]
        self.displays = displays
        self.base = base
        self.test = test  # are we a test?
        self.trans_running = False
        self.fd = None
        self.total_actions = 0
        self.total_installed = 0
        self.complete_actions = 0
        self.installed_pkg_names = set()
        self.total_removed = 0

        self._setupOutputLogging(base.conf.rpmverbosity)
        self._te_list = []
        # Index in _te_list of the transaction element being processed (for use
        # in callbacks)
        self._te_index = 0
        self._tsi_cache = None

    def _setupOutputLogging(self, rpmverbosity="info"):
        # UGLY... set up the transaction to record output from scriptlets
        io_r = tempfile.NamedTemporaryFile()
        self._readpipe = io_r
        self._writepipe = open(io_r.name, 'w+b')
        self.base._ts.setScriptFd(self._writepipe)
        rpmverbosity = {'critical' : 'crit',
                        'emergency' : 'emerg',
                        'error' : 'err',
                        'information' : 'info',
                        'warn' : 'warning'}.get(rpmverbosity, rpmverbosity)
        rpmverbosity = 'RPMLOG_' + rpmverbosity.upper()
        if not hasattr(rpm, rpmverbosity):
            rpmverbosity = 'RPMLOG_INFO'
        rpm.setVerbosity(getattr(rpm, rpmverbosity))
        rpm.setLogFile(self._writepipe)

    def _shutdownOutputLogging(self):
        # reset rpm bits from recording output
        rpm.setVerbosity(rpm.RPMLOG_NOTICE)
        rpm.setLogFile(sys.stderr)
        try:
            self._writepipe.close()
        except:
            pass

    def _scriptOutput(self):
        try:
            # XXX ugly workaround of problem which started after upgrading glibc
            # from glibc-2.27-32.fc28.x86_64 to glibc-2.28-9.fc29.x86_64
            # After this upgrade nothing is read from _readpipe, so every
            # posttrans and postun scriptlet output is lost. The problem
            # only occurs when using dnf-2, dnf-3 is OK.
            # I did not find the root cause of this error yet.
            self._readpipe.seek(self._readpipe.tell())
            out = self._readpipe.read()
            if not out:
                return None
            return out
        except IOError:
            pass

    def messages(self):
        messages = self._scriptOutput()
        if messages:
            for line in messages.splitlines():
                yield ucd(line)

    def _scriptout(self):
        msgs = self._scriptOutput()
        for display in self.displays:
            display.scriptout(msgs)
        self.base.history.log_scriptlet_output(msgs)

    def __del__(self):
        self._shutdownOutputLogging()

    def _extract_cbkey(self, cbkey):
        """Obtain the package related to the calling callback."""

        if hasattr(cbkey, "pkg"):
            tsi = cbkey
            return [tsi]

        te = self._te_list[self._te_index]
        te_nevra = dnf.util._te_nevra(te)
        if self._tsi_cache:
            if str(self._tsi_cache[0]) == te_nevra:
                return self._tsi_cache
        items = []
        for tsi in self.base.transaction:
            if tsi.action not in RPM_ACTIONS_SET:
                # skip REINSTALL (so the matching REINSTALLED item is returned instead) and REASON_CHANGE (to avoid a crash)
                continue
            if str(tsi) == te_nevra:
                items.append(tsi)
        if items:
            self._tsi_cache = items
            return items
        raise RuntimeError("TransactionItem not found for key: %s" % cbkey)

    def callback(self, what, amount, total, key, client_data):
        try:
            if isinstance(key, str):
                key = ucd(key)
            if what == rpm.RPMCALLBACK_TRANS_START:
                self._transStart(total)
            elif what == rpm.RPMCALLBACK_TRANS_STOP:
                pass
            elif what == rpm.RPMCALLBACK_TRANS_PROGRESS:
                self._trans_progress(amount, total)
            elif what == rpm.RPMCALLBACK_ELEM_PROGRESS:
                # This callback type is issued every time the next transaction
                # element is about to be processed by RPM, before any other
                # callbacks are issued.  "amount" carries the index of the element.
                self._elemProgress(key, amount)
            elif what == rpm.RPMCALLBACK_INST_OPEN_FILE:
                return self._instOpenFile(key)
            elif what == rpm.RPMCALLBACK_INST_CLOSE_FILE:
                self._instCloseFile(key)
            elif what == rpm.RPMCALLBACK_INST_START:
                self._inst_start(key)
            elif what == rpm.RPMCALLBACK_INST_STOP:
                self._inst_stop(key)
            elif what == rpm.RPMCALLBACK_INST_PROGRESS:
                self._instProgress(amount, total, key)
            elif what == rpm.RPMCALLBACK_UNINST_START:
                self._uninst_start(key)
            elif what == rpm.RPMCALLBACK_UNINST_STOP:
                self._unInstStop(key)
            elif what == rpm.RPMCALLBACK_UNINST_PROGRESS:
                self._uninst_progress(amount, total, key)
            elif what == rpm.RPMCALLBACK_CPIO_ERROR:
                self._cpioError(key)
            elif what == rpm.RPMCALLBACK_UNPACK_ERROR:
                self._unpackError(key)
            elif what == rpm.RPMCALLBACK_SCRIPT_ERROR:
                self._scriptError(amount, total, key)
            elif what == rpm.RPMCALLBACK_SCRIPT_START:
                self._script_start(key)
            elif what == rpm.RPMCALLBACK_SCRIPT_STOP:
                self._scriptStop()
        except Exception:
            exc_type, exc_value, exc_traceback = sys.exc_info()
            except_list = traceback.format_exception(exc_type, exc_value, exc_traceback)
            logger.critical(''.join(except_list))

    def _transStart(self, total):
        self.total_actions = total
        if self.test: return
        self.trans_running = True
        self._te_list = list(self.base._ts)

    def _trans_progress(self, amount, total):
        action = dnf.transaction.TRANS_PREPARATION
        for display in self.displays:
            display.progress('', action, amount + 1, total, 1, 1)

    def _elemProgress(self, key, index):
        self._te_index = index
        self.complete_actions += 1
        if not self.test:
            transaction_list = self._extract_cbkey(key)
            for display in self.displays:
                display.filelog(transaction_list[0].pkg, transaction_list[0].action)

    def _instOpenFile(self, key):
        self.lastmsg = None
        transaction_list = self._extract_cbkey(key)
        pkg = transaction_list[0].pkg
        rpmloc = pkg.localPkg()
        try:
            self.fd = open(rpmloc)
        except IOError as e:
            for display in self.displays:
                display.error("Error: Cannot open file %s: %s" % (rpmloc, e))
        else:
            if self.trans_running:
                self.total_installed += 1
                self.installed_pkg_names.add(pkg.name)
            return self.fd.fileno()

    def _instCloseFile(self, key):
        self.fd.close()
        self.fd = None

    def _inst_start(self, key):
        pass

    def _inst_stop(self, key):
        if self.test or not self.trans_running:
            return

        self._scriptout()

        if self.complete_actions == self.total_actions:
            # RPM doesn't explicitly report when post-trans phase starts
            action = dnf.transaction.TRANS_POST
            for display in self.displays:
                display.progress(None, action, None, None, None, None)

    def _instProgress(self, amount, total, key):
        transaction_list = self._extract_cbkey(key)
        pkg = transaction_list[0].pkg
        action = transaction_list[0].action
        for display in self.displays:
            display.progress(pkg, action, amount, total, self.complete_actions, self.total_actions)

    def _uninst_start(self, key):
        self.total_removed += 1

    def _uninst_progress(self, amount, total, key):
        transaction_list = self._extract_cbkey(key)
        pkg = transaction_list[0].pkg
        action = transaction_list[0].action
        for display in self.displays:
            display.progress(pkg, action, amount, total, self.complete_actions, self.total_actions)

    def _unInstStop(self, key):
        if self.test:
            return

        self._scriptout()

    def _cpioError(self, key):
        transaction_list = self._extract_cbkey(key)
        msg = "Error in cpio payload of rpm package %s" % transaction_list[0].pkg
        for display in self.displays:
            display.error(msg)

    def _unpackError(self, key):
        transaction_list = self._extract_cbkey(key)
        msg = "Error unpacking rpm package %s" % transaction_list[0].pkg
        for display in self.displays:
            display.error(msg)

    def _scriptError(self, amount, total, key):
        # "amount" carries the failed scriptlet tag,
        # "total" carries fatal/non-fatal status
        scriptlet_name = rpm.tagnames.get(amount, "<unknown>")

        transaction_list = self._extract_cbkey(key)
        name = transaction_list[0].pkg.name

        msg = ("Error in %s scriptlet in rpm package %s" % (scriptlet_name, name))

        for display in self.displays:
            display.error(msg)

    def _script_start(self, key):
        # TODO: this doesn't fit into libdnf TransactionItem use cases
        action = dnf.transaction.PKG_SCRIPTLET
        if key is None and self._te_list == []:
            pkg = 'None'
        else:
            transaction_list = self._extract_cbkey(key)
            pkg = transaction_list[0].pkg
        complete = self.complete_actions if self.total_actions != 0 and self.complete_actions != 0 \
            else 1
        total = self.total_actions if self.total_actions != 0 and self.complete_actions != 0 else 1
        for display in self.displays:
            display.progress(pkg, action, 100, 100, complete, total)

    def _scriptStop(self):
        self._scriptout()

    def verify_tsi_package(self, pkg, count, total):
        for display in self.displays:
            display.verify_tsi_package(pkg, count, total)
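
# A hedged, self-contained sketch (not upstream code) of the minimal contract
# rpm expects from a callback such as RPMTransaction.callback() above when it
# is passed to rpm.TransactionSet().run(): RPMCALLBACK_INST_OPEN_FILE must
# return an open file descriptor for the package, and the matching
# RPMCALLBACK_INST_CLOSE_FILE should close it again.  All other reasons can be
# ignored by a bare-bones callback.
def _minimal_rpm_callback():
    import os

    import rpm

    state = {"fd": None}

    def callback(reason, amount, total, key, client_data):
        if reason == rpm.RPMCALLBACK_INST_OPEN_FILE:
            # "key" is whatever was passed as the key argument when the
            # element was added to the transaction; a package path is assumed
            # here purely for illustration.
            state["fd"] = os.open(key, os.O_RDONLY)
            return state["fd"]
        elif reason == rpm.RPMCALLBACK_INST_CLOSE_FILE:
            os.close(state["fd"])
            state["fd"] = None

    return callback  # e.g. rpm.TransactionSet().run(_minimal_rpm_callback(), '')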
site-packages/dnf/yum/misc.py000064400000027127147511334650012174 0ustar00# misc.py
# Copyright (C) 2012-2016 Red Hat, Inc.
#
# This copyrighted material is made available to anyone wishing to use,
# modify, copy, or redistribute it subject to the terms and conditions of
# the GNU General Public License v.2, or (at your option) any later version.
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY expressed or implied, including the implied warranties of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General
# Public License for more details.  You should have received a copy of the
# GNU General Public License along with this program; if not, write to the
# Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
# 02110-1301, USA.  Any Red Hat trademarks that are incorporated in the
# source code or documentation are not subject to the GNU General Public
# License and may only be used or replicated with the express permission of
# Red Hat, Inc.
#

"""
Assorted utility functions for yum.
"""

from __future__ import print_function, absolute_import
from __future__ import unicode_literals
from dnf.pycomp import base64_decodebytes, basestring, unicode
from stat import *
import libdnf.utils
import dnf.const
import dnf.crypto
import dnf.exceptions
import dnf.i18n
import errno
import glob
import io
import os
import os.path
import pwd
import re
import shutil
import tempfile

_default_checksums = ['sha256']


_re_compiled_glob_match = None
def re_glob(s):
    """ Tests if a string is a shell wildcard. """
    global _re_compiled_glob_match
    if _re_compiled_glob_match is None:
        _re_compiled_glob_match = re.compile(r'[*?]|\[.+\]').search
    return _re_compiled_glob_match(s)

_re_compiled_full_match = None
def re_full_search_needed(s):
    """ Tests if a string needs a full nevra match, instead of just name. """
    global _re_compiled_full_match
    if _re_compiled_full_match is None:
        # A glob, or a "." or "-" separator, followed by something (the ".")
        one = re.compile(r'.*([-.*?]|\[.+\]).').match
        # Any epoch, for envra
        two = re.compile('[0-9]+:').match
        _re_compiled_full_match = (one, two)
    for rec in _re_compiled_full_match:
        if rec(s):
            return True
    return False
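
# Illustrative, verifiable values (not part of the upstream module) for the two
# pattern helpers above.
def _pattern_helpers_demo():
    assert re_glob("bash") is None                       # no wildcard
    assert re_glob("bash-[45].*") is not None            # wildcard present
    assert re_full_search_needed("bash") is False        # plain name is enough
    assert re_full_search_needed("bash-5.1-2") is True   # looks like an (N)EVRA
    assert re_full_search_needed("3:bash") is True       # explicit epoch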

def get_default_chksum_type():
    return _default_checksums[0]

class GenericHolder(object):
    """Generic Holder class used to hold other objects of known types
       It exists purely to be able to do object.somestuff, object.someotherstuff
       or object[key] and pass object to another function that will
       understand it"""

    def __init__(self, iter=None):
        self.__iter = iter

    def __iter__(self):
        if self.__iter is not None:
            return iter(self[self.__iter])

    def __getitem__(self, item):
        if hasattr(self, item):
            return getattr(self, item)
        else:
            raise KeyError(item)

    def all_lists(self):
        """Return a dictionary of all lists."""
        return {key: list_ for key, list_ in vars(self).items()
                if type(list_) is list}

    def merge_lists(self, other):
        """ Concatenate the list attributes from 'other' to ours. """
        for (key, val) in other.all_lists().items():
            vars(self).setdefault(key, []).extend(val)
        return self
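
# A hedged usage sketch (not upstream code) for GenericHolder; the attribute
# names "installed"/"available" are arbitrary examples.
def _generic_holder_demo():
    ours = GenericHolder(iter='installed')
    ours.installed = ['kernel']

    theirs = GenericHolder()
    theirs.installed = ['bash']
    theirs.available = ['vim-enhanced']

    ours.merge_lists(theirs)
    assert ours['installed'] == ['kernel', 'bash']
    assert ours.available == ['vim-enhanced']
    assert list(ours) == ['kernel', 'bash']  # __iter__ follows iter='installed'
    return ours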

def procgpgkey(rawkey):
    '''Convert ASCII-armored GPG key to binary
    '''

    # Normalise newlines
    rawkey = re.sub(b'\r\n?', b'\n', rawkey)

    # Extract block
    block = io.BytesIO()
    inblock = 0
    pastheaders = 0
    for line in rawkey.split(b'\n'):
        if line.startswith(b'-----BEGIN PGP PUBLIC KEY BLOCK-----'):
            inblock = 1
        elif inblock and line.strip() == b'':
            pastheaders = 1
        elif inblock and line.startswith(b'-----END PGP PUBLIC KEY BLOCK-----'):
            # Hit the end of the block, get out
            break
        elif pastheaders and line.startswith(b'='):
            # Hit the CRC line, don't include this and stop
            break
        elif pastheaders:
            block.write(line + b'\n')

    # Decode and return
    return base64_decodebytes(block.getvalue())
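
# A hedged illustration (not upstream code): a tiny, syntactically valid armored
# block whose payload is just base64 for b'hello'; real keys are of course much
# larger.
def _procgpgkey_demo():
    armored = (b'-----BEGIN PGP PUBLIC KEY BLOCK-----\n'
               b'Version: example\n'
               b'\n'
               b'aGVsbG8=\n'
               b'=abcd\n'
               b'-----END PGP PUBLIC KEY BLOCK-----\n')
    assert procgpgkey(armored) == b'hello'
    return procgpgkey(armored)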


def keyInstalled(ts, keyid, timestamp):
    '''
    Return whether the GPG key described by the given keyid and timestamp is
    installed in the rpmdb.

    The keyid and timestamp should both be passed as integers.
    The ts is an rpm transaction set object

    Return values:
        - -1      key is not installed
        - 0       key with matching ID and timestamp is installed
        - 1       key with matching ID is installed but has an older timestamp
        - 2       key with matching ID is installed but has a newer timestamp

    No effort is made to handle duplicates. The first matching keyid is used to
    calculate the return result.
    '''
    # Search
    for hdr in ts.dbMatch('name', 'gpg-pubkey'):
        if hdr['version'] == keyid:
            installedts = int(hdr['release'], 16)
            if installedts == timestamp:
                return 0
            elif installedts < timestamp:
                return 1
            else:
                return 2

    return -1


def import_key_to_pubring(rawkey, keyid, gpgdir=None, make_ro_copy=True):
    if not os.path.exists(gpgdir):
        os.makedirs(gpgdir)

    with dnf.crypto.pubring_dir(gpgdir), dnf.crypto.Context() as ctx:
        # import the key
        with open(os.path.join(gpgdir, 'gpg.conf'), 'wb') as fp:
            fp.write(b'')
        ctx.op_import(rawkey)

        if make_ro_copy:

            rodir = gpgdir + '-ro'
            if not os.path.exists(rodir):
                os.makedirs(rodir, mode=0o755)
                for f in glob.glob(gpgdir + '/*'):
                    basename = os.path.basename(f)
                    ro_f = rodir + '/' + basename
                    shutil.copy(f, ro_f)
                    os.chmod(ro_f, 0o755)
                # yes it is this stupid, why do you ask?
                opts = """lock-never
    no-auto-check-trustdb
    trust-model direct
    no-expensive-trust-checks
    no-permission-warning
    preserve-permissions
    """
                with open(os.path.join(rodir, 'gpg.conf'), 'w', 0o755) as fp:
                    fp.write(opts)


        return True


def getCacheDir():
    """return a path to a valid and safe cachedir - only used when not running
       as root or when --tempcache is set"""

    uid = os.geteuid()
    try:
        usertup = pwd.getpwuid(uid)
        username = dnf.i18n.ucd(usertup[0])
        prefix = '%s-%s-' % (dnf.const.PREFIX, username)
    except KeyError:
        prefix = '%s-%s-' % (dnf.const.PREFIX, uid)

    # check for /var/tmp/prefix-* -
    dirpath = '%s/%s*' % (dnf.const.TMPDIR, prefix)
    cachedirs = sorted(glob.glob(dirpath))
    for thisdir in cachedirs:
        stats = os.lstat(thisdir)
        if S_ISDIR(stats[0]) and S_IMODE(stats[0]) == 448 and stats[4] == uid:
            return thisdir

    # make the dir (tempfile.mkdtemp())
    cachedir = tempfile.mkdtemp(prefix=prefix, dir=dnf.const.TMPDIR)
    return cachedir

def seq_max_split(seq, max_entries):
    """ Given a seq, split into a list of lists of length max_entries each. """
    ret = []
    num = len(seq)
    seq = list(seq) # Trying to use a set/etc. here is bad
    beg = 0
    while num > max_entries:
        end = beg + max_entries
        ret.append(seq[beg:end])
        beg += max_entries
        num -= max_entries
    ret.append(seq[beg:])
    return ret
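
# Illustrative checks (not upstream code): seq_max_split keeps the shorter
# remainder as the final chunk.
def _seq_max_split_demo():
    assert seq_max_split(list(range(7)), 3) == [[0, 1, 2], [3, 4, 5], [6]]
    assert seq_max_split([], 3) == [[]]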

def unlink_f(filename):
    """ Call os.unlink, but don't die if the file isn't there. This is the main
        difference between "rm -f" and plain "rm". """
    try:
        os.unlink(filename)
    except OSError as e:
        if e.errno != errno.ENOENT:
            raise

def stat_f(filename, ignore_EACCES=False):
    """ Call os.stat(), but don't die if the file isn't there. Returns None. """
    try:
        return os.stat(filename)
    except OSError as e:
        if e.errno in (errno.ENOENT, errno.ENOTDIR):
            return None
        if ignore_EACCES and e.errno == errno.EACCES:
            return None
        raise

def _getloginuid():
    """ Get the audit-uid/login-uid, if available. os.getuid() is returned
        instead if there was a problem. Note that no caching is done here. """
    #  We might normally call audit.audit_getloginuid(), except that requires
    # importing all of the audit module. And it doesn't work anyway: BZ 518721
    try:
        with open("/proc/self/loginuid") as fo:
            data = fo.read()
            return int(data)
    except (IOError, ValueError):
        return os.getuid()

_cached_getloginuid = None
def getloginuid():
    """ Get the audit-uid/login-uid, if available. os.getuid() is returned
        instead if there was a problem. The value is cached, so you don't
        have to save it. """
    global _cached_getloginuid
    if _cached_getloginuid is None:
        _cached_getloginuid = _getloginuid()
    return _cached_getloginuid


def decompress(filename, dest=None, check_timestamps=False):
    """take a filename and decompress it into the same relative location.
       When the compression type is not recognized (or file is not compressed),
       the content of the file is copied to the destination"""

    if dest:
        out = dest
    else:
        out = None
        dot_pos = filename.rfind('.')
        if dot_pos > 0:
            ext = filename[dot_pos:]
            if ext in ('.zck', '.xz', '.bz2', '.gz', '.lzma', '.zst'):
                out = filename[:dot_pos]
        if out is None:
            raise dnf.exceptions.MiscError("Could not determine destination filename")

    if check_timestamps:
        fi = stat_f(filename)
        fo = stat_f(out)
        if fi and fo and fo.st_mtime == fi.st_mtime:
            return out

    try:
        # libdnf.utils.decompress either decompress file to the destination or
        # copy the content if the compression type is not recognized
        libdnf.utils.decompress(filename, out, 0o644)
    except RuntimeError as e:
        raise dnf.exceptions.MiscError(str(e))

    if check_timestamps and fi:
        os.utime(out, (fi.st_mtime, fi.st_mtime))

    return out
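
# A hedged usage sketch (not upstream code): a throw-away gzip file is created
# only so the example is runnable; when dest is omitted, decompress() derives
# the output name by stripping the recognized '.gz' suffix.
def _decompress_demo():
    import gzip
    import tempfile

    src = tempfile.NamedTemporaryFile(suffix='.gz', delete=False)
    src.close()
    with gzip.open(src.name, 'wb') as gz:
        gz.write(b'hello metadata\n')
    out = decompress(src.name)  # e.g. '/tmp/tmpXXXX.gz' -> '/tmp/tmpXXXX'
    with open(out, 'rb') as fo:
        assert fo.read() == b'hello metadata\n'
    return out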

def calculate_repo_gen_dest(filename, generated_name):
    dest = os.path.dirname(filename)
    dest += '/gen'
    if not os.path.exists(dest):
        os.makedirs(dest, mode=0o755)
    return dest + '/' + generated_name


def repo_gen_decompress(filename, generated_name):
    """ This is a wrapper around decompress, where we work out a cached
        generated name, and use check_timestamps. filename _must_ be from
        a repo. and generated_name is the type of the file. """

    dest = calculate_repo_gen_dest(filename, generated_name)
    return decompress(filename, dest=dest, check_timestamps=True)

def read_in_items_from_dot_dir(thisglob, line_as_list=True):
    """ Takes a glob of a dir (like /etc/foo.d/\\*.foo) returns a list of all
       the lines in all the files matching that glob, ignores comments and blank
       lines, optional paramater 'line_as_list tells whether to treat each line
       as a space or comma-separated list, defaults to True.
    """
    results = []
    for fname in glob.glob(thisglob):
        with open(fname) as f:
            for line in f:
                if re.match(r'\s*(#|$)', line):
                    continue
                line = line.rstrip() # no more trailing \n's
                line = line.lstrip() # be nice
                if not line:
                    continue
                if line_as_list:
                    line = line.replace('\n', ' ')
                    line = line.replace(',', ' ')
                    results.extend(line.split())
                    continue
                results.append(line)
    return results
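
# A hedged usage sketch (not upstream code); the directory and file are created
# only for the demonstration.
def _dot_dir_demo():
    import os
    import tempfile

    confdir = tempfile.mkdtemp()
    with open(os.path.join(confdir, 'example.conf'), 'w') as fo:
        fo.write('# a comment\n')
        fo.write('foo bar,baz\n')
    items = read_in_items_from_dot_dir(confdir + '/*.conf')
    assert items == ['foo', 'bar', 'baz']
    return items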
site-packages/dnf/const.py000064400000004501147511334650011544 0ustar00# const.py
# dnf constants.
#
# Copyright (C) 2012-2015  Red Hat, Inc.
#
# This copyrighted material is made available to anyone wishing to use,
# modify, copy, or redistribute it subject to the terms and conditions of
# the GNU General Public License v.2, or (at your option) any later version.
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY expressed or implied, including the implied warranties of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General
# Public License for more details.  You should have received a copy of the
# GNU General Public License along with this program; if not, write to the
# Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
# 02110-1301, USA.  Any Red Hat trademarks that are incorporated in the
# source code or documentation are not subject to the GNU General Public
# License and may only be used or replicated with the express permission of
# Red Hat, Inc.
#

from __future__ import unicode_literals
import distutils.sysconfig

CONF_FILENAME='/etc/dnf/dnf.conf' # :api
CONF_AUTOMATIC_FILENAME='/etc/dnf/automatic.conf'
DISTROVERPKG=('system-release(releasever)', 'system-release',
              'distribution-release(releasever)', 'distribution-release',
              'redhat-release', 'suse-release')
GROUP_PACKAGE_TYPES = ('mandatory', 'default', 'conditional') # :api
INSTALLONLYPKGS=['kernel', 'kernel-PAE',
                 'installonlypkg(kernel)',
                 'installonlypkg(kernel-module)',
                 'installonlypkg(vm)',
                 'multiversion(kernel)']
LOG='dnf.log'
LOG_HAWKEY='hawkey.log'
LOG_LIBREPO='dnf.librepo.log'
LOG_MARKER='--- logging initialized ---'
LOG_RPM='dnf.rpm.log'
NAME='DNF'
PERSISTDIR='/var/lib/dnf' # :api
PID_FILENAME = '/var/run/dnf.pid'
RUNDIR='/run'
USER_RUNDIR='/run/user'
SYSTEM_CACHEDIR='/var/cache/dnf'
TMPDIR='/var/tmp/'
# CLI verbose values greater or equal to this are considered "verbose":
VERBOSE_LEVEL=6

PREFIX=NAME.lower()
PROGRAM_NAME=NAME.lower()  # Deprecated - no longer used, Argparser prints program name based on sys.argv
PLUGINCONFPATH = '/etc/dnf/plugins'  # :api
PLUGINPATH = '%s/dnf-plugins' % distutils.sysconfig.get_python_lib()
VERSION='4.7.0'
USER_AGENT = "dnf/%s" % VERSION

BUGTRACKER_COMPONENT=NAME.lower()
BUGTRACKER='https://bugs.rockylinux.org/'
site-packages/dnf/dnssec.py
# dnssec.py
# DNS extension for automatic GPG key verification
#
# Copyright (C) 2012-2018 Red Hat, Inc.
#
# This copyrighted material is made available to anyone wishing to use,
# modify, copy, or redistribute it subject to the terms and conditions of
# the GNU General Public License v.2, or (at your option) any later version.
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY expressed or implied, including the implied warranties of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General
# Public License for more details.  You should have received a copy of the
# GNU General Public License along with this program; if not, write to the
# Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
# 02110-1301, USA.  Any Red Hat trademarks that are incorporated in the
# source code or documentation are not subject to the GNU General Public
# License and may only be used or replicated with the express permission of
# Red Hat, Inc.
#

from __future__ import print_function
from __future__ import absolute_import
from __future__ import unicode_literals

from enum import Enum
import base64
import hashlib
import logging
import re

from dnf.i18n import _
import dnf.rpm
import dnf.exceptions

logger = logging.getLogger("dnf")


RR_TYPE_OPENPGPKEY = 61


class DnssecError(dnf.exceptions.Error):
    """
    Exception used in the dnssec module
    """
    def __repr__(self):
        return "<DnssecError, value='{}'>"\
            .format(self.value if self.value is not None else "Not specified")


def email2location(email_address, tag="_openpgpkey"):
    # type: (str, str) -> str
    """
    Implements RFC 7929, section 3
    https://tools.ietf.org/html/rfc7929#section-3
    :param email_address:
    :param tag:
    :return:
    """
    split = email_address.split("@")
    if len(split) != 2:
        msg = "Email address must contain exactly one '@' sign."
        raise DnssecError(msg)

    local = split[0]
    domain = split[1]
    hash = hashlib.sha256()
    hash.update(local.encode('utf-8'))
    digest = base64.b16encode(hash.digest()[0:28])\
        .decode("utf-8")\
        .lower()
    return digest + "." + tag + "." + domain


class Validity(Enum):
    """
    Output of the verification algorithm.
    TODO: this type might be simplified so that it reflects the underlying DNS layer less directly.
    TODO: more specifically, variants 3 to 5 should have more understandable names
    """
    VALID = 1
    REVOKED = 2
    PROVEN_NONEXISTENCE = 3
    RESULT_NOT_SECURE = 4
    BOGUS_RESULT = 5
    ERROR = 9


class NoKey:
    """
    This class represents the absence of a key in the cache. It expresses non-existence
    using Python's type system.
    """
    pass


class KeyInfo:
    """
    Wrapper class for email and associated verification key, where both are represented in
    form of a string.
    """
    def __init__(self, email=None, key=None):
        self.email = email
        self.key = key

    @staticmethod
    def from_rpm_key_object(userid, raw_key):
        # type: (str, bytes) -> KeyInfo
        """
        Since dnf uses a different key format than the one used in DNS RRs, the former
        needs to be converted into the latter.
        """
        input_email = re.search('<(.*@.*)>', userid)
        if input_email is None:
            raise DnssecError

        email = input_email.group(1)
        key = raw_key.decode('ascii').split('\n')

        start = 0
        stop = 0
        for i in range(0, len(key)):
            if key[i] == '-----BEGIN PGP PUBLIC KEY BLOCK-----':
                start = i
            if key[i] == '-----END PGP PUBLIC KEY BLOCK-----':
                stop = i

        cat_key = ''.join(key[start + 2:stop - 1]).encode('ascii')
        return KeyInfo(email, cat_key)


class DNSSECKeyVerification:
    """
    The main class when it comes to verification itself. It wraps an Unbound context and a cache
    of already obtained results.
    """

    # Mapping from email address to b64 encoded public key or NoKey in case of proven nonexistence
    _cache = {}
    # type: Dict[str, Union[str, NoKey]]

    @staticmethod
    def _cache_hit(key_union, input_key_string):
        # type: (Union[str, NoKey], str) -> Validity
        """
        Compare the key in case it was found in the cache.
        """
        if key_union == input_key_string:
            logger.debug("Cache hit, valid key")
            return Validity.VALID
        elif key_union is NoKey:
            logger.debug("Cache hit, proven non-existence")
            return Validity.PROVEN_NONEXISTENCE
        else:
            logger.debug("Key in cache: {}".format(key_union))
            logger.debug("Input key   : {}".format(input_key_string))
            return Validity.REVOKED

    @staticmethod
    def _cache_miss(input_key):
        # type: (KeyInfo) -> Validity
        """
        In case the key was not found in the cache, create an Unbound context and contact the DNS
        system.
        """
        try:
            import unbound
        except ImportError as e:
            msg = _("Configuration option 'gpgkey_dns_verification' requires "
                    "python3-unbound ({})".format(e))
            raise dnf.exceptions.Error(msg)

        ctx = unbound.ub_ctx()
        if ctx.set_option("verbosity:", "0") != 0:
            logger.debug("Unbound context: Failed to set verbosity")

        if ctx.set_option("qname-minimisation:", "yes") != 0:
            logger.debug("Unbound context: Failed to set qname minimisation")

        if ctx.resolvconf() != 0:
            logger.debug("Unbound context: Failed to read resolv.conf")

        if ctx.add_ta_file("/var/lib/unbound/root.key") != 0:
            logger.debug("Unbound context: Failed to add trust anchor file")

        status, result = ctx.resolve(email2location(input_key.email),
                                     RR_TYPE_OPENPGPKEY, unbound.RR_CLASS_IN)
        if status != 0:
            logger.debug("Communication with DNS servers failed")
            return Validity.ERROR
        if result.bogus:
            logger.debug("DNSSEC signatures are wrong")
            return Validity.BOGUS_RESULT
        if not result.secure:
            logger.debug("Result is not secured with DNSSEC")
            return Validity.RESULT_NOT_SECURE
        if result.nxdomain:
            logger.debug("Non-existence of this record was proven by DNSSEC")
            return Validity.PROVEN_NONEXISTENCE
        if not result.havedata:
            # TODO: This is a weird result, but there is no way to perform validation, so just return
            # an error
            logger.debug("Unknown error in DNS communication")
            return Validity.ERROR
        else:
            data = result.data.as_raw_data()[0]
            dns_data_b64 = base64.b64encode(data)
            if dns_data_b64 == input_key.key:
                return Validity.VALID
            else:
                # In case it is different, print the keys for further examination in debug mode
                logger.debug("Key from DNS: {}".format(dns_data_b64))
                logger.debug("Input key   : {}".format(input_key.key))
                return Validity.REVOKED

    @staticmethod
    def verify(input_key):
        # type: (KeyInfo) -> Validity
        """
        Public API. Use this method to verify a KeyInfo object.
        """
        logger.debug("Running verification for key with id: {}".format(input_key.email))
        key_union = DNSSECKeyVerification._cache.get(input_key.email)
        if key_union is not None:
            return DNSSECKeyVerification._cache_hit(key_union, input_key.key)
        else:
            result = DNSSECKeyVerification._cache_miss(input_key)
            if result == Validity.VALID:
                DNSSECKeyVerification._cache[input_key.email] = input_key.key
            elif result == Validity.PROVEN_NONEXISTENCE:
                DNSSECKeyVerification._cache[input_key.email] = NoKey()
            return result
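
def _example_verify_key():  # hedged illustrative sketch, not part of dnf
    """A minimal sketch of the public verification API: wrap an email address
    and a base64-encoded public key in a KeyInfo object and pass it to
    verify().  The values below are hypothetical placeholders."""
    key = KeyInfo(email="user@example.com", key=b"bWFkZS11cCBrZXkgbWF0ZXJpYWw=")
    validity = DNSSECKeyVerification.verify(key)
    return validity == Validity.VALID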


def nice_user_msg(ki, v):
    # type: (KeyInfo, Validity) -> str
    """
    Inform the user about key validity in a human readable way.
    """
    prefix = _("DNSSEC extension: Key for user ") + ki.email + " "
    if v == Validity.VALID:
        return prefix + _("is valid.")
    else:
        return prefix + _("has unknown status.")


def any_msg(m):
    # type: (str) -> str
    """
    Label any given message with DNSSEC extension tag
    """
    return _("DNSSEC extension: ") + m


class RpmImportedKeys:
    """
    Wrapper around keys that are imported into the RPM database.

    The keys are stored in packages named gpg-pubkey, where the version and
    release are different for each of them. The key content itself is stored as
    an ASCII armored string in the package description, so it needs to be parsed
    before it can be used.
    """
    @staticmethod
    def _query_db_for_gpg_keys():
        # type: () -> List[KeyInfo]
        # TODO: base.conf.installroot ?? -----------------------\
        transaction_set = dnf.rpm.transaction.TransactionWrapper()
        packages = transaction_set.dbMatch("name", "gpg-pubkey")
        return_list = []
        for pkg in packages:
            packager = dnf.rpm.getheader(pkg, 'packager')
            email = re.search('<(.*@.*)>', packager).group(1)
            description = dnf.rpm.getheader(pkg, 'description')
            key_lines = description.split('\n')[3:-3]
            key_str = ''.join(key_lines)
            return_list += [KeyInfo(email, key_str.encode('ascii'))]

        return return_list

    @staticmethod
    def check_imported_keys_validity():
        keys = RpmImportedKeys._query_db_for_gpg_keys()
        logger.info(any_msg(_("Testing already imported keys for their validity.")))
        for key in keys:
            try:
                result = DNSSECKeyVerification.verify(key)
            except DnssecError as e:
                # Errors in this exception should not be fatal, print it and just continue
                logger.warning("DNSSEC extension error (email={}): {}"
                             .format(key.email, e.value))
                continue
            # TODO: remove revoked keys automatically and possibly ask user to confirm
            if result == Validity.VALID:
                logger.debug(any_msg("GPG Key {} is valid".format(key.email)))
                pass
            elif result == Validity.PROVEN_NONEXISTENCE:
                logger.debug(any_msg("GPG Key {} does not support DNS"
                                    " verification".format(key.email)))
            elif result == Validity.BOGUS_RESULT:
                logger.info(any_msg("GPG Key {} could not be verified, because DNSSEC signatures"
                                    " are bogus. Possible causes: wrong configuration of the DNS"
                                    " server, MITM attack".format(key.email)))
            elif result == Validity.REVOKED:
                logger.info(any_msg("GPG Key {} has been revoked and should"
                                    " be removed immediately".format(key.email)))
            else:
                logger.debug(any_msg("GPG Key {} could not be tested".format(key.email)))
site-packages/dnf/lock.py
# lock.py
# DNF Locking Subsystem.
#
# Copyright (C) 2013-2016 Red Hat, Inc.
#
# This copyrighted material is made available to anyone wishing to use,
# modify, copy, or redistribute it subject to the terms and conditions of
# the GNU General Public License v.2, or (at your option) any later version.
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY expressed or implied, including the implied warranties of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General
# Public License for more details.  You should have received a copy of the
# GNU General Public License along with this program; if not, write to the
# Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
# 02110-1301, USA.  Any Red Hat trademarks that are incorporated in the
# source code or documentation are not subject to the GNU General Public
# License and may only be used or replicated with the express permission of
# Red Hat, Inc.
#

from __future__ import absolute_import
from __future__ import unicode_literals
from dnf.exceptions import ProcessLockError, ThreadLockError, LockError
from dnf.i18n import _
from dnf.yum import misc
import dnf.logging
import dnf.util
import errno
import fcntl
import hashlib
import logging
import os
import threading
import time

logger = logging.getLogger("dnf")

def _fit_lock_dir(dir_):
    if not dnf.util.am_i_root():
        # for regular users the best we currently do is not to clash with
        # another DNF process of the same user. Since dir_ is quite definitely
        # not writable for us, yet significant, use its hash:
        hexdir = hashlib.sha1(dir_.encode('utf-8')).hexdigest()
        dir_ = os.path.join(misc.getCacheDir(), 'locks', hexdir)
    return dir_

def build_download_lock(cachedir, exit_on_lock):
    return ProcessLock(os.path.join(_fit_lock_dir(cachedir), 'download_lock.pid'),
                       'cachedir', not exit_on_lock)

def build_metadata_lock(cachedir, exit_on_lock):
    return ProcessLock(os.path.join(_fit_lock_dir(cachedir), 'metadata_lock.pid'),
                       'metadata', not exit_on_lock)


def build_rpmdb_lock(persistdir, exit_on_lock):
    return ProcessLock(os.path.join(_fit_lock_dir(persistdir), 'rpmdb_lock.pid'),
                       'RPMDB', not exit_on_lock)


def build_log_lock(logdir, exit_on_lock):
    return ProcessLock(os.path.join(_fit_lock_dir(logdir), 'log_lock.pid'),
                       'log', not exit_on_lock)


class ProcessLock(object):
    def __init__(self, target, description, blocking=False):
        self.blocking = blocking
        self.count = 0
        self.description = description
        self.target = target
        self.thread_lock = threading.RLock()

    def _lock_thread(self):
        if not self.thread_lock.acquire(blocking=False):
            msg = '%s already locked by a different thread' % self.description
            raise ThreadLockError(msg)
        self.count += 1

    def _try_lock(self, pid):
        fd = os.open(self.target, os.O_CREAT | os.O_RDWR, 0o644)

        try:
            try:
                fcntl.flock(fd, fcntl.LOCK_EX | fcntl.LOCK_NB)
            except OSError as e:
                if e.errno == errno.EWOULDBLOCK:
                    return -1
                raise

            old_pid = os.read(fd, 20)
            if len(old_pid) == 0:
                # empty file, write our pid
                os.write(fd, str(pid).encode('utf-8'))
                return pid

            try:
                old_pid = int(old_pid)
            except ValueError:
                msg = _('Malformed lock file found: %s.\n'
                        'Ensure no other dnf/yum process is running and '
                        'remove the lock file manually or run '
                        'systemd-tmpfiles --remove dnf.conf.') % (self.target)
                raise LockError(msg)

            if old_pid == pid:
                # already locked by this process
                return pid

            if not os.access('/proc/%d/stat' % old_pid, os.F_OK):
                # locked by a dead process, write our pid
                os.lseek(fd, 0, os.SEEK_SET)
                os.ftruncate(fd, 0)
                os.write(fd, str(pid).encode('utf-8'))
                return pid

            return old_pid

        finally:
            os.close(fd)

    def _unlock_thread(self):
        self.count -= 1
        self.thread_lock.release()

    def __enter__(self):
        dnf.util.ensure_dir(os.path.dirname(self.target))
        self._lock_thread()
        prev_pid = -1
        my_pid = os.getpid()
        pid = self._try_lock(my_pid)
        while pid != my_pid:
            if pid != -1:
                if not self.blocking:
                    self._unlock_thread()
                    msg = '%s already locked by %d' % (self.description, pid)
                    raise ProcessLockError(msg, pid)
                if prev_pid != pid:
                    msg = _('Waiting for process with pid %d to finish.') % (pid)
                    logger.info(msg)
                    prev_pid = pid
            time.sleep(1)
            pid = self._try_lock(my_pid)

    def __exit__(self, *exc_args):
        if self.count == 1:
            os.unlink(self.target)
        self._unlock_thread()
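
def _example_process_lock():  # hedged illustrative sketch, not part of dnf
    """A minimal sketch of ProcessLock as a context manager; the lock file
    path is hypothetical.  With blocking=False a ProcessLockError is raised
    when another process already holds the lock, otherwise we wait for it."""
    lock = ProcessLock('/tmp/example_lock.pid', 'example', blocking=False)
    with lock:
        pass  # critical section: only one process at a time gets here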
site-packages/dnf/comps.py
# comps.py
# Interface to libcomps.
#
# Copyright (C) 2013-2018 Red Hat, Inc.
#
# This copyrighted material is made available to anyone wishing to use,
# modify, copy, or redistribute it subject to the terms and conditions of
# the GNU General Public License v.2, or (at your option) any later version.
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY expressed or implied, including the implied warranties of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General
# Public License for more details.  You should have received a copy of the
# GNU General Public License along with this program; if not, write to the
# Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
# 02110-1301, USA.  Any Red Hat trademarks that are incorporated in the
# source code or documentation are not subject to the GNU General Public
# License and may only be used or replicated with the express permission of
# Red Hat, Inc.
#

from __future__ import absolute_import
from __future__ import print_function
from __future__ import unicode_literals

import libdnf.transaction

from dnf.exceptions import CompsError
from dnf.i18n import _, ucd
from functools import reduce

import dnf.i18n
import dnf.util
import fnmatch
import gettext
import itertools
import libcomps
import locale
import logging
import operator
import re
import sys

logger = logging.getLogger("dnf")

# :api :binformat
CONDITIONAL = libdnf.transaction.CompsPackageType_CONDITIONAL
DEFAULT     = libdnf.transaction.CompsPackageType_DEFAULT
MANDATORY   = libdnf.transaction.CompsPackageType_MANDATORY
OPTIONAL    = libdnf.transaction.CompsPackageType_OPTIONAL

ALL_TYPES = CONDITIONAL | DEFAULT | MANDATORY | OPTIONAL


def _internal_comps_length(comps):
    collections = (comps.categories, comps.groups, comps.environments)
    return reduce(operator.__add__, map(len, collections))


def _first_if_iterable(seq):
    if seq is None:
        return None
    return dnf.util.first(seq)


def _by_pattern(pattern, case_sensitive, sqn):
    """Return items from sqn matching either exactly or glob-wise."""

    pattern = dnf.i18n.ucd(pattern)
    exact = {g for g in sqn if g.name == pattern or g.id == pattern}
    if exact:
        return exact

    if case_sensitive:
        match = re.compile(fnmatch.translate(pattern)).match
    else:
        match = re.compile(fnmatch.translate(pattern), flags=re.I).match

    ret = set()
    for g in sqn:
        if match(g.id):
            ret.add(g)
        elif g.name is not None and match(g.name):
            ret.add(g)
        elif g.ui_name is not None and match(g.ui_name):
            ret.add(g)

    return ret
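
def _example_by_pattern():  # hedged illustrative sketch, not part of dnf
    """A minimal sketch of the matching helper: exact name/id matches win,
    otherwise the pattern is treated as a glob (case-insensitive by default).
    The stand-in group objects below are hypothetical."""
    class _G(object):
        def __init__(self, name, id_):
            self.name = name
            self.id = id_
            self.ui_name = name
    groups = [_G('Core', 'core'), _G('Base', 'base')]
    return _by_pattern('co*', False, groups)  # -> {the 'core' group}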


def _fn_display_order(group):
    return sys.maxsize if group.display_order is None else group.display_order


def install_or_skip(install_fnc, grp_or_env_id, types, exclude=None,
                    strict=True, exclude_groups=None):
    """
    Installs a group or an environment identified by grp_or_env_id.
    This method is preserved for API compatibility. It used to catch an
    exception thrown when a group or env was already installed; that exception
    is no longer thrown.
    `install_fnc` has to be Solver._group_install or
    Solver._environment_install.
    """
    return install_fnc(grp_or_env_id, types, exclude, strict, exclude_groups)


class _Langs(object):

    """Get all usable abbreviations for the current language."""

    def __init__(self):
        self.last_locale = None
        self.cache = None

    @staticmethod
    def _dotted_locale_str():
        lcl = locale.getlocale(locale.LC_MESSAGES)
        if lcl == (None, None):
            return 'C'
        return '.'.join(lcl)

    def get(self):
        current_locale = self._dotted_locale_str()
        if self.last_locale == current_locale:
            return self.cache

        self.cache = []
        locales = [current_locale]
        if current_locale != 'C':
            locales.append('C')
        for l in locales:
            for nlang in gettext._expand_lang(l):
                if nlang not in self.cache:
                    self.cache.append(nlang)

        self.last_locale = current_locale
        return self.cache


class CompsQuery(object):

    AVAILABLE = 1
    INSTALLED = 2

    ENVIRONMENTS = 1
    GROUPS = 2

    def __init__(self, comps, history, kinds, status):
        self.comps = comps
        self.history = history
        self.kinds = kinds
        self.status = status

    def _get_groups(self, available, installed):
        result = set()
        if self.status & self.AVAILABLE:
            result.update({i.id for i in available})
        if self.status & self.INSTALLED:
            for i in installed:
                group = i.getCompsGroupItem()
                if not group:
                    continue
                result.add(group.getGroupId())
        return result

    def _get_envs(self, available, installed):
        result = set()
        if self.status & self.AVAILABLE:
            result.update({i.id for i in available})
        if self.status & self.INSTALLED:
            for i in installed:
                env = i.getCompsEnvironmentItem()
                if not env:
                    continue
                result.add(env.getEnvironmentId())
        return result

    def get(self, *patterns):
        res = dnf.util.Bunch()
        res.environments = []
        res.groups = []
        for pat in patterns:
            envs = grps = []
            if self.kinds & self.ENVIRONMENTS:
                available = self.comps.environments_by_pattern(pat)
                installed = self.history.env.search_by_pattern(pat)
                envs = self._get_envs(available, installed)
                res.environments.extend(envs)
            if self.kinds & self.GROUPS:
                available = self.comps.groups_by_pattern(pat)
                installed = self.history.group.search_by_pattern(pat)
                grps = self._get_groups(available, installed)
                res.groups.extend(grps)
            if not envs and not grps:
                if self.status == self.INSTALLED:
                    msg = _("Module or Group '%s' is not installed.") % ucd(pat)
                elif self.status == self.AVAILABLE:
                    msg = _("Module or Group '%s' is not available.") % ucd(pat)
                else:
                    msg = _("Module or Group '%s' does not exist.") % ucd(pat)
                raise CompsError(msg)
        return res
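
def _example_comps_query(base):  # hedged illustrative sketch, not part of dnf
    """A minimal sketch, assuming an initialized dnf Base with comps and
    history loaded, of querying installed group ids by pattern."""
    query = CompsQuery(base.comps, base.history,
                       CompsQuery.GROUPS, CompsQuery.INSTALLED)
    res = query.get('core')  # raises CompsError if nothing matches
    return res.groups  # ids of matching installed groups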


class Forwarder(object):
    def __init__(self, iobj, langs):
        self._i = iobj
        self._langs = langs

    def __getattr__(self, name):
        return getattr(self._i, name)

    def _ui_text(self, default, dct):
        for l in self._langs.get():
            t = dct.get(l)
            if t is not None:
                return t
        return default

    @property
    def ui_description(self):
        return self._ui_text(self.desc, self.desc_by_lang)

    @property
    def ui_name(self):
        return self._ui_text(self.name, self.name_by_lang)

class Category(Forwarder):
    # :api
    def __init__(self, iobj, langs, group_factory):
        super(Category, self).__init__(iobj, langs)
        self._group_factory = group_factory

    def _build_group(self, grp_id):
        grp = self._group_factory(grp_id.name)
        if grp is None:
            msg = "no group '%s' from category '%s'"
            raise ValueError(msg % (grp_id.name, self.id))
        return grp

    def groups_iter(self):
        for grp_id in self.group_ids:
            yield self._build_group(grp_id)

    @property
    def groups(self):
        return list(self.groups_iter())

class Environment(Forwarder):
    # :api

    def __init__(self, iobj, langs, group_factory):
        super(Environment, self).__init__(iobj, langs)
        self._group_factory = group_factory

    def _build_group(self, grp_id):
        grp = self._group_factory(grp_id.name)
        if grp is None:
            msg = "no group '%s' from environment '%s'"
            raise ValueError(msg % (grp_id.name, self.id))
        return grp

    def _build_groups(self, ids):
        groups = []
        for gi in ids:
            try:
                groups.append(self._build_group(gi))
            except ValueError as e:
                logger.error(e)

        return groups

    def groups_iter(self):
        for grp_id in itertools.chain(self.group_ids, self.option_ids):
            try:
                yield self._build_group(grp_id)
            except ValueError as e:
                logger.error(e)

    @property
    def mandatory_groups(self):
        return self._build_groups(self.group_ids)

    @property
    def optional_groups(self):
        return self._build_groups(self.option_ids)

class Group(Forwarder):
    # :api
    def __init__(self, iobj, langs, pkg_factory):
        super(Group, self).__init__(iobj, langs)
        self._pkg_factory = pkg_factory
        self.selected = iobj.default

    def _packages_of_type(self, type_):
        return [pkg for pkg in self.packages if pkg.type == type_]

    @property
    def conditional_packages(self):
        return self._packages_of_type(libcomps.PACKAGE_TYPE_CONDITIONAL)

    @property
    def default_packages(self):
        return self._packages_of_type(libcomps.PACKAGE_TYPE_DEFAULT)

    def packages_iter(self):
        # :api
        return map(self._pkg_factory, self.packages)

    @property
    def mandatory_packages(self):
        return self._packages_of_type(libcomps.PACKAGE_TYPE_MANDATORY)

    @property
    def optional_packages(self):
        return self._packages_of_type(libcomps.PACKAGE_TYPE_OPTIONAL)

    @property
    def visible(self):
        return self._i.uservisible

class Package(Forwarder):
    """Represents comps package data. :api"""

    _OPT_MAP = {
        libcomps.PACKAGE_TYPE_CONDITIONAL : CONDITIONAL,
        libcomps.PACKAGE_TYPE_DEFAULT     : DEFAULT,
        libcomps.PACKAGE_TYPE_MANDATORY   : MANDATORY,
        libcomps.PACKAGE_TYPE_OPTIONAL    : OPTIONAL,
    }

    def __init__(self, ipkg):
        self._i = ipkg

    @property
    def name(self):
        # :api
        return self._i.name

    @property
    def option_type(self):
        # :api
        return self._OPT_MAP[self.type]

class Comps(object):
    # :api

    def __init__(self):
        self._i = libcomps.Comps()
        self._langs = _Langs()

    def __len__(self):
        return _internal_comps_length(self._i)

    def _build_category(self, icategory):
        return Category(icategory, self._langs, self._group_by_id)

    def _build_environment(self, ienvironment):
        return Environment(ienvironment, self._langs, self._group_by_id)

    def _build_group(self, igroup):
        return Group(igroup, self._langs, self._build_package)

    def _build_package(self, ipkg):
        return Package(ipkg)

    def _add_from_xml_filename(self, fn):
        comps = libcomps.Comps()
        try:
            comps.fromxml_f(fn)
        except libcomps.ParserError:
            errors = comps.get_last_errors()
            raise CompsError(' '.join(errors))
        self._i += comps

    @property
    def categories(self):
        # :api
        return list(self.categories_iter())

    def category_by_pattern(self, pattern, case_sensitive=False):
        # :api
        assert dnf.util.is_string_type(pattern)
        cats = self.categories_by_pattern(pattern, case_sensitive)
        return _first_if_iterable(cats)

    def categories_by_pattern(self, pattern, case_sensitive=False):
        # :api
        assert dnf.util.is_string_type(pattern)
        return _by_pattern(pattern, case_sensitive, self.categories)

    def categories_iter(self):
        # :api
        return (self._build_category(c) for c in self._i.categories)

    @property
    def environments(self):
        # :api
        return sorted(self.environments_iter(), key=_fn_display_order)

    def _environment_by_id(self, id):
        assert dnf.util.is_string_type(id)
        return dnf.util.first(g for g in self.environments_iter() if g.id == id)

    def environment_by_pattern(self, pattern, case_sensitive=False):
        # :api
        assert dnf.util.is_string_type(pattern)
        envs = self.environments_by_pattern(pattern, case_sensitive)
        return _first_if_iterable(envs)

    def environments_by_pattern(self, pattern, case_sensitive=False):
        # :api
        assert dnf.util.is_string_type(pattern)
        envs = list(self.environments_iter())
        found_envs = _by_pattern(pattern, case_sensitive, envs)
        return sorted(found_envs, key=_fn_display_order)

    def environments_iter(self):
        # :api
        return (self._build_environment(e) for e in self._i.environments)

    @property
    def groups(self):
        # :api
        return sorted(self.groups_iter(), key=_fn_display_order)

    def _group_by_id(self, id_):
        assert dnf.util.is_string_type(id_)
        return dnf.util.first(g for g in self.groups_iter() if g.id == id_)

    def group_by_pattern(self, pattern, case_sensitive=False):
        # :api
        assert dnf.util.is_string_type(pattern)
        grps = self.groups_by_pattern(pattern, case_sensitive)
        return _first_if_iterable(grps)

    def groups_by_pattern(self, pattern, case_sensitive=False):
        # :api
        assert dnf.util.is_string_type(pattern)
        grps = _by_pattern(pattern, case_sensitive, list(self.groups_iter()))
        return sorted(grps, key=_fn_display_order)

    def groups_iter(self):
        # :api
        return (self._build_group(g) for g in self._i.groups)

class CompsTransPkg(object):
    def __init__(self, pkg_or_name):
        if dnf.util.is_string_type(pkg_or_name):
            # from package name
            self.basearchonly = False
            self.name = pkg_or_name
            self.optional = True
            self.requires = None
        elif isinstance(pkg_or_name, libdnf.transaction.CompsGroupPackage):
            # from swdb package
            # TODO:
            self.basearchonly = False
            # self.basearchonly = pkg_or_name.basearchonly
            self.name = pkg_or_name.getName()
            self.optional = pkg_or_name.getPackageType() & libcomps.PACKAGE_TYPE_OPTIONAL
            # TODO:
            self.requires = None
            # self.requires = pkg_or_name.requires
        else:
            # from comps package
            self.basearchonly = pkg_or_name.basearchonly
            self.name = pkg_or_name.name
            self.optional = pkg_or_name.type & libcomps.PACKAGE_TYPE_OPTIONAL
            self.requires = pkg_or_name.requires

    def __eq__(self, other):
        return (self.name == other.name and
                self.basearchonly == other.basearchonly and
                self.optional == other.optional and
                self.requires == other.requires)

    def __str__(self):
        return self.name

    def __hash__(self):
        return hash((self.name,
                    self.basearchonly,
                    self.optional,
                    self.requires))

class TransactionBunch(object):
    def __init__(self):
        self._install = set()
        self._install_opt = set()
        self._remove = set()
        self._upgrade = set()

    def __iadd__(self, other):
        self._install.update(other._install)
        self._install_opt.update(other._install_opt)
        self._upgrade.update(other._upgrade)
        self._remove = (self._remove | other._remove) - \
            self._install - self._install_opt - self._upgrade
        return self

    def __len__(self):
        return len(self.install) + len(self.install_opt) + len(self.upgrade) + len(self.remove)

    @staticmethod
    def _set_value(param, val):
        for item in val:
            if isinstance(item, CompsTransPkg):
                param.add(item)
            else:
                param.add(CompsTransPkg(item))

    @property
    def install(self):
        """
        Packages to be installed with strict=True - transaction will
        fail if they cannot be installed due to dependency errors etc.
        """
        return self._install

    @install.setter
    def install(self, value):
        self._set_value(self._install, value)

    @property
    def install_opt(self):
        """
        Packages to be installed with strict=False - they will be
        skipped if they cannot be installed
        """
        return self._install_opt

    @install_opt.setter
    def install_opt(self, value):
        self._set_value(self._install_opt, value)

    @property
    def remove(self):
        return self._remove

    @remove.setter
    def remove(self, value):
        self._set_value(self._remove, value)

    @property
    def upgrade(self):
        return self._upgrade

    @upgrade.setter
    def upgrade(self, value):
        self._set_value(self._upgrade, value)


class Solver(object):
    def __init__(self, history, comps, reason_fn):
        self.history = history
        self.comps = comps
        self._reason_fn = reason_fn

    @staticmethod
    def _mandatory_group_set(env):
        return {grp.id for grp in env.mandatory_groups}

    @staticmethod
    def _full_package_set(grp):
        return {pkg.getName() for pkg in grp.mandatory_packages +
                grp.default_packages + grp.optional_packages +
                grp.conditional_packages}

    @staticmethod
    def _pkgs_of_type(group, pkg_types, exclude=[]):
        def filter(pkgs):
            return [pkg for pkg in pkgs
                    if pkg.name not in exclude]

        pkgs = set()
        if pkg_types & MANDATORY:
            pkgs.update(filter(group.mandatory_packages))
        if pkg_types & DEFAULT:
            pkgs.update(filter(group.default_packages))
        if pkg_types & OPTIONAL:
            pkgs.update(filter(group.optional_packages))
        if pkg_types & CONDITIONAL:
            pkgs.update(filter(group.conditional_packages))
        return pkgs

    def _removable_pkg(self, pkg_name):
        assert dnf.util.is_string_type(pkg_name)
        return self.history.group.is_removable_pkg(pkg_name)

    def _removable_grp(self, group_id):
        assert dnf.util.is_string_type(group_id)
        return self.history.env.is_removable_group(group_id)

    def _environment_install(self, env_id, pkg_types, exclude=None, strict=True, exclude_groups=None):
        assert dnf.util.is_string_type(env_id)
        comps_env = self.comps._environment_by_id(env_id)
        if not comps_env:
            raise CompsError(_("Environment id '%s' does not exist.") % ucd(env_id))

        swdb_env = self.history.env.new(env_id, comps_env.name, comps_env.ui_name, pkg_types)
        self.history.env.install(swdb_env)

        trans = TransactionBunch()
        for comps_group in comps_env.mandatory_groups:
            if exclude_groups and comps_group.id in exclude_groups:
                continue
            trans += self._group_install(comps_group.id, pkg_types, exclude, strict)
            swdb_env.addGroup(comps_group.id, True, MANDATORY)

        for comps_group in comps_env.optional_groups:
            if exclude_groups and comps_group.id in exclude_groups:
                continue
            swdb_env.addGroup(comps_group.id, False, OPTIONAL)
            # TODO: if a group is already installed, mark it as installed?
        return trans

    def _environment_remove(self, env_id):
        assert dnf.util.is_string_type(env_id) is True
        swdb_env = self.history.env.get(env_id)
        if not swdb_env:
            raise CompsError(_("Environment id '%s' is not installed.") % env_id)

        self.history.env.remove(swdb_env)

        trans = TransactionBunch()
        group_ids = set([i.getGroupId() for i in swdb_env.getGroups()])
        for group_id in group_ids:
            if not self._removable_grp(group_id):
                continue
            trans += self._group_remove(group_id)
        return trans

    def _environment_upgrade(self, env_id):
        assert dnf.util.is_string_type(env_id)
        comps_env = self.comps._environment_by_id(env_id)
        swdb_env = self.history.env.get(env_id)
        if not swdb_env:
            raise CompsError(_("Environment '%s' is not installed.") % env_id)
        if not comps_env:
            raise CompsError(_("Environment '%s' is not available.") % env_id)

        old_set = set([i.getGroupId() for i in swdb_env.getGroups()])
        pkg_types = swdb_env.getPackageTypes()

        # create a new record for current transaction
        swdb_env = self.history.env.new(comps_env.id, comps_env.name, comps_env.ui_name, pkg_types)

        trans = TransactionBunch()
        for comps_group in comps_env.mandatory_groups:
            if comps_group.id in old_set:
                if self.history.group.get(comps_group.id):
                    # upgrade installed group
                    trans += self._group_upgrade(comps_group.id)
            else:
                # install new group
                trans += self._group_install(comps_group.id, pkg_types)
            swdb_env.addGroup(comps_group.id, True, MANDATORY)

        for comps_group in comps_env.optional_groups:
            if comps_group.id in old_set and self.history.group.get(comps_group.id):
                # upgrade installed group
                trans += self._group_upgrade(comps_group.id)
            swdb_env.addGroup(comps_group.id, False, OPTIONAL)
            # TODO: if a group is already installed, mark it as installed?
        self.history.env.upgrade(swdb_env)
        return trans

    def _group_install(self, group_id, pkg_types, exclude=None, strict=True, exclude_groups=None):
        assert dnf.util.is_string_type(group_id)
        comps_group = self.comps._group_by_id(group_id)
        if not comps_group:
            raise CompsError(_("Group id '%s' does not exist.") % ucd(group_id))

        swdb_group = self.history.group.new(group_id, comps_group.name, comps_group.ui_name, pkg_types)
        for i in comps_group.packages_iter():
            swdb_group.addPackage(i.name, False, Package._OPT_MAP[i.type])
        self.history.group.install(swdb_group)

        trans = TransactionBunch()
        # TODO: remove exclude
        if strict:
            trans.install.update(self._pkgs_of_type(comps_group, pkg_types, exclude=[]))
        else:
            trans.install_opt.update(self._pkgs_of_type(comps_group, pkg_types, exclude=[]))
        return trans

    def _group_remove(self, group_id):
        assert dnf.util.is_string_type(group_id)
        swdb_group = self.history.group.get(group_id)
        if not swdb_group:
            raise CompsError(_("Module or Group '%s' is not installed.") % group_id)
        self.history.group.remove(swdb_group)
        trans = TransactionBunch()
        trans.remove = {pkg for pkg in swdb_group.getPackages() if self._removable_pkg(pkg.getName())}
        return trans

    def _group_upgrade(self, group_id):
        assert dnf.util.is_string_type(group_id)
        comps_group = self.comps._group_by_id(group_id)
        swdb_group = self.history.group.get(group_id)
        exclude = []

        if not swdb_group:
            argument = comps_group.ui_name if comps_group else group_id
            raise CompsError(_("Module or Group '%s' is not installed.") % argument)
        if not comps_group:
            raise CompsError(_("Module or Group '%s' is not available.") % group_id)
        pkg_types = swdb_group.getPackageTypes()
        old_set = set([i.getName() for i in swdb_group.getPackages()])
        new_set = self._pkgs_of_type(comps_group, pkg_types, exclude)

        # create a new record for current transaction
        swdb_group = self.history.group.new(group_id, comps_group.name, comps_group.ui_name, pkg_types)
        for i in comps_group.packages_iter():
            swdb_group.addPackage(i.name, False, Package._OPT_MAP[i.type])
        self.history.group.upgrade(swdb_group)

        trans = TransactionBunch()
        trans.install = {pkg for pkg in new_set if pkg.name not in old_set}
        trans.remove = {name for name in old_set
                        if name not in [pkg.name for pkg in new_set]}
        trans.upgrade = {pkg for pkg in new_set if pkg.name in old_set}
        return trans

    def _exclude_packages_from_installed_groups(self, base):
        for group in self.persistor.groups:
            p_grp = self.persistor.group(group)
            if p_grp.installed:
                installed_pkg_names = \
                    set(p_grp.full_list) - set(p_grp.pkg_exclude)
                installed_pkgs = base.sack.query().installed().filterm(name=installed_pkg_names)
                for pkg in installed_pkgs:
                    base._goal.install(pkg)
site-packages/dnf/match_counter.py
# match_counter.py
# Implements class MatchCounter.
#
# Copyright (C) 2012-2016 Red Hat, Inc.
#
# This copyrighted material is made available to anyone wishing to use,
# modify, copy, or redistribute it subject to the terms and conditions of
# the GNU General Public License v.2, or (at your option) any later version.
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY expressed or implied, including the implied warranties of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General
# Public License for more details.  You should have received a copy of the
# GNU General Public License along with this program; if not, write to the
# Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
# 02110-1301, USA.  Any Red Hat trademarks that are incorporated in the
# source code or documentation are not subject to the GNU General Public
# License and may only be used or replicated with the express permission of
# Red Hat, Inc.
#

from __future__ import absolute_import
from __future__ import print_function
from __future__ import unicode_literals
from functools import reduce

WEIGHTS = {
    'name'		: 7,
    'summary'		: 4,
    'description'	: 2,
    'url'		: 1,
    }


def _canonize_string_set(sset, length):
    """ Ordered sset with empty strings prepended. """
    current = len(sset)
    l = [''] * (length - current) + sorted(sset)
    return l


class MatchCounter(dict):
    """Map packages to which of their attributes matched in a search against
    what values.

    The mapping is: ``package -> [(key, needle), ... ]``.

    """

    @staticmethod
    def _eval_weights(pkg, matches):
        # compute how much each match is worth and return the sum:
        def weight(match):
            key = match[0]
            needle = match[1]
            haystack = getattr(pkg, key)
            if key == "name" and haystack == needle:
                # if package matches exactly by name, increase weight
                return 2 * WEIGHTS[key]
            return WEIGHTS[key]

        return sum(map(weight, matches))

    def _key_func(self):
        """Get the key function used for sorting matches.

        Packages are ordered primarily by the sum of their weighted hits
        (highest first); when this sum is the same, they are ordered
        alphabetically by name so that the result is stable.

        Returned function is:
        pkg -> (-weights_sum, pkg.name)

        """
        def get_key(pkg):
            return (
                # use negative value to make sure packages with the highest weight come first
                - self._eval_weights(pkg, self[pkg]),
                # then order packages alphabetically
                pkg.name,
            )
        return get_key

    def _max_needles(self):
        """Return the max count of needles of all packages."""
        if self:
            return max(len(self.matched_needles(pkg)) for pkg in self)
        return 0

    def add(self, pkg, key, needle):
        self.setdefault(pkg, []).append((key, needle))

    def dump(self):
        for pkg in self:
            print('%s\t%s' % (pkg, self[pkg]))

    def matched_haystacks(self, pkg):
        return set(getattr(pkg, m[0]) for m in self[pkg])

    def matched_keys(self, pkg):
        # return keys in the same order they appear in the list
        result = []
        for i in self[pkg]:
            if i[0] in result:
                continue
            result.append(i[0])
        return result

    def matched_needles(self, pkg):
        return set(m[1] for m in self[pkg])

    def sorted(self, reverse=False, limit_to=None):
        keys = limit_to if limit_to else self.keys()
        return sorted(keys, key=self._key_func())

    def total(self):
        return reduce(lambda total, pkg: total + len(self[pkg]), self, 0)
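
def _example_match_counter():  # hedged illustrative sketch, not part of dnf
    """A minimal sketch of recording search hits and ranking them: an exact
    name hit weighs twice as much as a partial name hit, so 'vim' sorts before
    'vim-enhanced' here.  The stand-in package objects are hypothetical."""
    class _Pkg(object):
        def __init__(self, name, summary):
            self.name = name
            self.summary = summary
    vim = _Pkg('vim', 'editor')
    enhanced = _Pkg('vim-enhanced', 'vim editor')
    counter = MatchCounter()
    counter.add(vim, 'name', 'vim')        # exact name match
    counter.add(enhanced, 'name', 'vim')   # partial name match
    return counter.sorted()  # -> [vim, enhanced]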
site-packages/dnf/repo.py
# repo.py
# DNF Repository objects.
#
# Copyright (C) 2013-2016 Red Hat, Inc.
#
# This copyrighted material is made available to anyone wishing to use,
# modify, copy, or redistribute it subject to the terms and conditions of
# the GNU General Public License v.2, or (at your option) any later version.
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY expressed or implied, including the implied warranties of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General
# Public License for more details.  You should have received a copy of the
# GNU General Public License along with this program; if not, write to the
# Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
# 02110-1301, USA.  Any Red Hat trademarks that are incorporated in the
# source code or documentation are not subject to the GNU General Public
# License and may only be used or replicated with the express permission of
# Red Hat, Inc.
#

from __future__ import absolute_import
from __future__ import unicode_literals

from dnf.i18n import ucd, _

import dnf.callback
import dnf.conf
import dnf.conf.substitutions
import dnf.const
import dnf.crypto
import dnf.exceptions
import dnf.logging
import dnf.pycomp
import dnf.util
import dnf.yum.misc
import libdnf.error
import libdnf.repo
import functools
import hashlib
import hawkey
import logging
import operator
import os
import re
import shutil
import string
import sys
import time
import traceback

_PACKAGES_RELATIVE_DIR = "packages"
_MIRRORLIST_FILENAME = "mirrorlist"
# Chars allowed in a repo ID
_REPOID_CHARS = string.ascii_letters + string.digits + '-_.:'
# Regex pattern that matches a repo cachedir and captures the repo ID
_CACHEDIR_RE = r'(?P<repoid>[%s]+)\-[%s]{16}' % (re.escape(_REPOID_CHARS),
                                                 string.hexdigits)

# Regex patterns matching any filename that is repo-specific cache data of a
# particular type.  The filename is expected to not contain the base cachedir
# path components.
CACHE_FILES = {
    'metadata': r'^%s\/.*((xml|yaml)(\.gz|\.xz|\.bz2|\.zck)?|asc|cachecookie|%s)$' %
                (_CACHEDIR_RE, _MIRRORLIST_FILENAME),
    'packages': r'^%s\/%s\/.+rpm$' % (_CACHEDIR_RE, _PACKAGES_RELATIVE_DIR),
    'dbcache': r'^.+(solv|solvx)$',
}
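
def _example_cache_files_match():  # hedged illustrative sketch, not part of dnf
    """A minimal sketch of matching a hypothetical metadata cache path against
    the CACHE_FILES patterns above; 'fedora' stands for the repo id and the 16
    hex digits for the cachedir checksum suffix."""
    path = 'fedora-0123456789abcdef/repodata/primary.xml.gz'
    return re.match(CACHE_FILES['metadata'], path) is not None  # -> True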

logger = logging.getLogger("dnf")


def repo_id_invalid(repo_id):
    # :api
    """Return index of an invalid character in the repo ID (if present)."""
    first_invalid = libdnf.repo.Repo.verifyId(repo_id)
    return None if first_invalid < 0 else first_invalid


def _pkg2payload(pkg, progress, *factories):
    for fn in factories:
        pload = fn(pkg, progress)
        if pload is not None:
            return pload
    raise ValueError(_('no matching payload factory for %s') % pkg)


def _download_payloads(payloads, drpm, fail_fast=True):
    # download packages
    def _download_sort_key(payload):
        return not hasattr(payload, 'delta')

    drpm.err.clear()
    targets = [pload._librepo_target()
               for pload in sorted(payloads, key=_download_sort_key)]
    errs = _DownloadErrors()
    try:
        libdnf.repo.PackageTarget.downloadPackages(libdnf.repo.VectorPPackageTarget(targets), fail_fast)
    except RuntimeError as e:
        errs._fatal = str(e)
    drpm.wait()

    # process downloading errors
    errs._recoverable = drpm.err.copy()
    for tgt in targets:
        err = tgt.getErr()
        if err is None or err.startswith('Not finished'):
            continue
        callbacks = tgt.getCallbacks()
        payload = callbacks.package_pload
        pkg = payload.pkg
        if err == 'Already downloaded':
            errs._skipped.add(pkg)
            continue
        pkg.repo._repo.expire()
        errs._pkg_irrecoverable[pkg] = [err]

    return errs


def _update_saving(saving, payloads, errs):
    real, full = saving
    for pload in payloads:
        pkg = pload.pkg
        if pkg in errs:
            real += pload.download_size
            continue
        real += pload.download_size
        full += pload._full_size
    return real, full


class _DownloadErrors(object):
    def __init__(self):
        self._pkg_irrecoverable = {}
        self._val_recoverable = {}
        self._fatal = None
        self._skipped = set()

    def _irrecoverable(self):
        if self._pkg_irrecoverable:
            return self._pkg_irrecoverable
        if self._fatal:
            return {'': [self._fatal]}
        return {}

    @property
    def _recoverable(self):
        return self._val_recoverable

    @_recoverable.setter
    def _recoverable(self, new_dct):
        self._val_recoverable = new_dct

    def _bandwidth_used(self, pload):
        if pload.pkg in self._skipped:
            return 0
        return pload.download_size


class _DetailedLibrepoError(Exception):
    def __init__(self, librepo_err, source_url):
        Exception.__init__(self)
        self.librepo_code = librepo_err.args[0]
        self.librepo_msg = librepo_err.args[1]
        self.source_url = source_url


class _NullKeyImport(dnf.callback.KeyImport):
    def _confirm(self, id, userid, fingerprint, url, timestamp):
        return True


class Metadata(object):
    def __init__(self, repo):
        self._repo = repo

    @property
    def fresh(self):
        # :api
        return self._repo.fresh()


class PackageTargetCallbacks(libdnf.repo.PackageTargetCB):
    def __init__(self, package_pload):
        super(PackageTargetCallbacks, self).__init__()
        self.package_pload = package_pload

    def end(self, status, msg):
        self.package_pload._end_cb(None, status, msg)
        return 0

    def progress(self, totalToDownload, downloaded):
        self.package_pload._progress_cb(None, totalToDownload, downloaded)
        return 0

    def mirrorFailure(self, msg, url):
        self.package_pload._mirrorfail_cb(None, msg, url)
        return 0


class PackagePayload(dnf.callback.Payload):
    def __init__(self, pkg, progress):
        super(PackagePayload, self).__init__(progress)
        self.callbacks = PackageTargetCallbacks(self)
        self.pkg = pkg

    def _end_cb(self, cbdata, lr_status, msg):
        """End callback to librepo operation."""
        status = dnf.callback.STATUS_FAILED
        if msg is None:
            status = dnf.callback.STATUS_OK
        elif msg.startswith('Not finished'):
            return
        elif lr_status == libdnf.repo.PackageTargetCB.TransferStatus_ALREADYEXISTS:
            status = dnf.callback.STATUS_ALREADY_EXISTS

        self.progress.end(self, status, msg)

    def _mirrorfail_cb(self, cbdata, err, url):
        self.progress.end(self, dnf.callback.STATUS_MIRROR, err)

    def _progress_cb(self, cbdata, total, done):
        try:
            self.progress.progress(self, done)
        except Exception:
            exc_type, exc_value, exc_traceback = sys.exc_info()
            except_list = traceback.format_exception(exc_type, exc_value, exc_traceback)
            logger.critical(''.join(except_list))

    @property
    def _full_size(self):
        return self.download_size

    def _librepo_target(self):
        pkg = self.pkg
        pkgdir = pkg.pkgdir
        dnf.util.ensure_dir(pkgdir)

        target_dct = {
            'dest': pkgdir,
            'resume': True,
            'cbdata': self,
            'progresscb': self._progress_cb,
            'endcb': self._end_cb,
            'mirrorfailurecb': self._mirrorfail_cb,
        }
        target_dct.update(self._target_params())

        return libdnf.repo.PackageTarget(
            pkg.repo._repo,
            target_dct['relative_url'],
            target_dct['dest'], target_dct['checksum_type'], target_dct['checksum'],
            target_dct['expectedsize'], target_dct['base_url'], target_dct['resume'],
            0, 0, self.callbacks)


class RPMPayload(PackagePayload):

    def __str__(self):
        return os.path.basename(self.pkg.location)

    def _target_params(self):
        pkg = self.pkg
        ctype, csum = pkg.returnIdSum()
        ctype_code = libdnf.repo.PackageTarget.checksumType(ctype)
        if ctype_code == libdnf.repo.PackageTarget.ChecksumType_UNKNOWN:
            logger.warning(_("unsupported checksum type: %s"), ctype)

        return {
            'relative_url': pkg.location,
            'checksum_type': ctype_code,
            'checksum': csum,
            'expectedsize': pkg.downloadsize,
            'base_url': pkg.baseurl,
        }

    @property
    def download_size(self):
        """Total size of the download."""
        return self.pkg.downloadsize


class RemoteRPMPayload(PackagePayload):

    def __init__(self, remote_location, conf, progress):
        super(RemoteRPMPayload, self).__init__("unused_object", progress)
        self.remote_location = remote_location
        self.remote_size = 0
        self.conf = conf
        s = (self.conf.releasever or "") + self.conf.substitutions.get('basearch')
        digest = hashlib.sha256(s.encode('utf8')).hexdigest()[:16]
        repodir = "commandline-" + digest
        self.pkgdir = os.path.join(self.conf.cachedir, repodir, "packages")
        dnf.util.ensure_dir(self.pkgdir)
        self.local_path = os.path.join(self.pkgdir, self.__str__().lstrip("/"))

    def __str__(self):
        return os.path.basename(self.remote_location)

    def _progress_cb(self, cbdata, total, done):
        self.remote_size = total
        try:
            self.progress.progress(self, done)
        except Exception:
            exc_type, exc_value, exc_traceback = sys.exc_info()
            except_list = traceback.format_exception(exc_type, exc_value, exc_traceback)
            logger.critical(''.join(except_list))

    def _librepo_target(self):
        return libdnf.repo.PackageTarget(
            self.conf._config, os.path.basename(self.remote_location),
            self.pkgdir, 0, None, 0, os.path.dirname(self.remote_location),
            True, 0, 0, self.callbacks)

    @property
    def download_size(self):
        """Total size of the download."""
        return self.remote_size


class MDPayload(dnf.callback.Payload):

    def __init__(self, progress):
        super(MDPayload, self).__init__(progress)
        self._text = ""
        self._download_size = 0
        self.fastest_mirror_running = False
        self.mirror_failures = set()

    def __str__(self):
        if dnf.pycomp.PY3:
            return self._text
        else:
            return self._text.encode('utf-8')

    def __unicode__(self):
        return self._text

    def _progress_cb(self, cbdata, total, done):
        self._download_size = total
        self.progress.progress(self, done)

    def _fastestmirror_cb(self, cbdata, stage, data):
        if stage == libdnf.repo.RepoCB.FastestMirrorStage_DETECTION:
            # pinging mirrors, this might take a while
            msg = _('determining the fastest mirror (%s hosts).. ') % data
            self.fastest_mirror_running = True
        elif stage == libdnf.repo.RepoCB.FastestMirrorStage_STATUS and self.fastest_mirror_running:
            # done.. report but ignore any errors
            msg = 'error: %s\n' % data if data else 'done.\n'
        else:
            return
        self.progress.message(msg)

    def _mirror_failure_cb(self, cbdata, msg, url, metadata):
        self.mirror_failures.add(msg)
        msg = 'error: %s (%s).' % (msg, url)
        logger.debug(msg)

    @property
    def download_size(self):
        return self._download_size

    @property
    def progress(self):
        return self._progress

    @progress.setter
    def progress(self, progress):
        if progress is None:
            progress = dnf.callback.NullDownloadProgress()
        self._progress = progress

    def start(self, text):
        self._text = text
        self.progress.start(1, 0)

    def end(self):
        self._download_size = 0
        self.progress.end(self, None, None)


# use the local cache even if it's expired. download if there's no cache.
SYNC_LAZY = libdnf.repo.Repo.SyncStrategy_LAZY
# use the local cache, even if it's expired, never download.
SYNC_ONLY_CACHE = libdnf.repo.Repo.SyncStrategy_ONLY_CACHE
# try the cache, if it is expired download new md.
SYNC_TRY_CACHE = libdnf.repo.Repo.SyncStrategy_TRY_CACHE


class RepoCallbacks(libdnf.repo.RepoCB):
    def __init__(self, repo):
        super(RepoCallbacks, self).__init__()
        self._repo = repo
        self._md_pload = repo._md_pload

    def start(self, what):
        self._md_pload.start(what)

    def end(self):
        self._md_pload.end()

    def progress(self, totalToDownload, downloaded):
        self._md_pload._progress_cb(None, totalToDownload, downloaded)
        return 0

    def fastestMirror(self, stage, ptr):
        self._md_pload._fastestmirror_cb(None, stage, ptr)

    def handleMirrorFailure(self, msg, url, metadata):
        self._md_pload._mirror_failure_cb(None, msg, url, metadata)
        return 0

    def repokeyImport(self, id, userid, fingerprint, url, timestamp):
        return self._repo._key_import._confirm(id, userid, fingerprint, url, timestamp)


class Repo(dnf.conf.RepoConf):
    # :api
    DEFAULT_SYNC = SYNC_TRY_CACHE

    def __init__(self, name=None, parent_conf=None):
        # :api
        super(Repo, self).__init__(section=name, parent=parent_conf)

        self._config.this.disown()  # _repo will be the owner of _config
        self._repo = libdnf.repo.Repo(name if name else "", self._config)

        self._md_pload = MDPayload(dnf.callback.NullDownloadProgress())
        self._callbacks = RepoCallbacks(self)
        self._callbacks.this.disown()  # _repo will be the owner of callbacks
        self._repo.setCallbacks(self._callbacks)

        self._pkgdir = None
        self._key_import = _NullKeyImport()
        self.metadata = None  # :api
        self._repo.setSyncStrategy(SYNC_ONLY_CACHE if parent_conf and parent_conf.cacheonly else self.DEFAULT_SYNC)
        if parent_conf:
            self._repo.setSubstitutions(parent_conf.substitutions)
        self._substitutions = dnf.conf.substitutions.Substitutions()
        self._check_config_file_age = parent_conf.check_config_file_age \
            if parent_conf is not None else True

    @property
    def id(self):
        # :api
        return self._repo.getId()

    @property
    def repofile(self):
        # :api
        return self._repo.getRepoFilePath()

    @repofile.setter
    def repofile(self, value):
        self._repo.setRepoFilePath(value)

    @property
    def pkgdir(self):
        # :api
        if self._repo.isLocal():
            return self._repo.getLocalBaseurl()
        return self.cache_pkgdir()

    def cache_pkgdir(self):
        if self._pkgdir is not None:
            return self._pkgdir
        return os.path.join(self._repo.getCachedir(), _PACKAGES_RELATIVE_DIR)

    @pkgdir.setter
    def pkgdir(self, val):
        # :api
        self._pkgdir = val

    @property
    def _pubring_dir(self):
        return os.path.join(self._repo.getCachedir(), 'pubring')

    @property
    def load_metadata_other(self):
        return self._repo.getLoadMetadataOther()

    @load_metadata_other.setter
    def load_metadata_other(self, val):
        self._repo.setLoadMetadataOther(val)

    def __lt__(self, other):
        return self.id < other.id

    def __repr__(self):
        return "<%s %s>" % (self.__class__.__name__, self.id)

    def __setattr__(self, name, value):
        super(Repo, self).__setattr__(name, value)

    def disable(self):
        # :api
        self._repo.disable()

    def enable(self):
        # :api
        self._repo.enable()

    def add_metadata_type_to_download(self, metadata_type):
        # :api
        """Ask for additional repository metadata type to download.

        The given metadata_type is appended to the default metadata set
        downloaded with the repository.

        Parameters
        ----------
        metadata_type: string

        Example: add_metadata_type_to_download("productid")
        """
        self._repo.addMetadataTypeToDownload(metadata_type)

    def remove_metadata_type_from_download(self, metadata_type):
        # :api
        """Stop asking for this additional repository metadata type
        in download.

        Given metadata_type is no longer downloaded by default
        when this repository is downloaded.

        Parameters
        ----------
        metadata_type: string

        Example: remove_metadata_type_from_download("productid")
        """
        self._repo.removeMetadataTypeFromDownload(metadata_type)

    def get_metadata_path(self, metadata_type):
        # :api
        """Return path to the file with downloaded repository metadata of given type.

        Parameters
        ----------
        metadata_type: string
        """
        return self._repo.getMetadataPath(metadata_type)

    def get_metadata_content(self, metadata_type):
        # :api
        """Return content of the file with downloaded repository metadata of given type.

        Content of compressed metadata file is returned uncompressed.

        Parameters
        ----------
        metadata_type: string
        """
        return self._repo.getMetadataContent(metadata_type)

    def load(self):
        # :api
        """Load the metadata for this repo.

        Depending on the configuration and on the age and consistency of the data
        available in the disk cache, this either loads the metadata from the cache
        or downloads it from the mirror, baseurl or metalink.

        This method will by default not try to refresh already loaded data if
        called repeatedly.

        Returns True if this call to load() caused a fresh metadata download.

        """
        ret = False
        try:
            ret = self._repo.load()
        except (libdnf.error.Error, RuntimeError) as e:
            if self._md_pload.mirror_failures:
                msg = "Errors during downloading metadata for repository '%s':" % self.id
                for failure in self._md_pload.mirror_failures:
                    msg += "\n  - %s" % failure
                logger.warning(msg)
            raise dnf.exceptions.RepoError(str(e))
        finally:
            self._md_pload.mirror_failures = set()
        self.metadata = Metadata(self._repo)
        return ret
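
    # Illustrative sketch (not part of the original module): typical client-side use of
    # load(). The name `repo` below is an assumption standing for a configured Repo
    # instance; `logger` is the module-level logger defined above.
    #
    #     try:
    #         fresh = repo.load()   # True if this call downloaded fresh metadata
    #     except dnf.exceptions.RepoError as e:
    #         logger.warning("repository '%s' is unusable: %s", repo.id, e)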

    def _metadata_expire_in(self):
        """Get the number of seconds after which the cached metadata will expire.

        Returns a tuple: a boolean saying whether there is any cached metadata at
        all, and the number of seconds in which it expires. A negative number means
        the metadata has already expired; None means it never expires.

        """
        if not self.metadata:
            self._repo.loadCache(False)
        if self.metadata:
            if self.metadata_expire == -1:
                return True, None
            expiration = self._repo.getExpiresIn()
            if self._repo.isExpired():
                expiration = min(0, expiration)
            return True, expiration
        return False, 0

    def _set_key_import(self, key_import):
        self._key_import = key_import

    def set_progress_bar(self, progress):
        # :api
        self._md_pload.progress = progress

    def get_http_headers(self):
        # :api
        """Returns user defined http headers.

        Returns
        -------
        headers : tuple of strings
        """
        return self._repo.getHttpHeaders()

    def set_http_headers(self, headers):
        # :api
        """Sets http headers.

        Sets new http headers and rewrites existing ones.

        Parameters
        ----------
        headers : tuple or list of strings
            Example: set_http_headers(["User-Agent: Agent007", "MyFieldName: MyFieldValue"])
        """
        self._repo.setHttpHeaders(headers)

    def remote_location(self, location, schemes=('http', 'ftp', 'file', 'https')):
        """
        :param location: relative location inside the repo
        :param schemes: list of allowed protocols. Default is ('http', 'ftp', 'file', 'https')
        :return: absolute url (string) or None
        """
        def schemes_filter(url_list):
            for url in url_list:
                if schemes:
                    s = dnf.pycomp.urlparse.urlparse(url)[0]
                    if s in schemes:
                        return os.path.join(url, location.lstrip('/'))
                else:
                    return os.path.join(url, location.lstrip('/'))
            return None

        if not location:
            return None

        mirrors = self._repo.getMirrors()
        if mirrors:
            return schemes_filter(mirrors)
        elif self.baseurl:
            return schemes_filter(self.baseurl)
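
# Illustrative sketch (not part of the original module): resolving a path relative to a
# repository with remote_location(). The `repo` name is an assumption for the example;
# None is returned when no mirror or baseurl matches one of the allowed schemes.
#
#     url = repo.remote_location('repodata/repomd.xml')
#     if url is None:
#         logger.debug("no usable URL for repomd.xml in repo '%s'", repo.id)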

site-packages/dnf/goal.py
# goal.py
# Customized hawkey.Goal
#
# Copyright (C) 2014-2016 Red Hat, Inc.
#
# This copyrighted material is made available to anyone wishing to use,
# modify, copy, or redistribute it subject to the terms and conditions of
# the GNU General Public License v.2, or (at your option) any later version.
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY expressed or implied, including the implied warranties of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General
# Public License for more details.  You should have received a copy of the
# GNU General Public License along with this program; if not, write to the
# Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
# 02110-1301, USA.  Any Red Hat trademarks that are incorporated in the
# source code or documentation are not subject to the GNU General Public
# License and may only be used or replicated with the express permission of
# Red Hat, Inc.
#

from __future__ import absolute_import
from __future__ import unicode_literals

from hawkey import Goal

site-packages/dnf/sack.py
# sack.py
# The dnf.Sack class, derived from hawkey.Sack
#
# Copyright (C) 2012-2016 Red Hat, Inc.
#
# This copyrighted material is made available to anyone wishing to use,
# modify, copy, or redistribute it subject to the terms and conditions of
# the GNU General Public License v.2, or (at your option) any later version.
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY expressed or implied, including the implied warranties of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General
# Public License for more details.  You should have received a copy of the
# GNU General Public License along with this program; if not, write to the
# Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
# 02110-1301, USA.  Any Red Hat trademarks that are incorporated in the
# source code or documentation are not subject to the GNU General Public
# License and may only be used or replicated with the express permission of
# Red Hat, Inc.
#

from __future__ import absolute_import
from __future__ import unicode_literals
import dnf.const
import dnf.package
import dnf.query
import dnf.util
import logging
import hawkey
import os
from dnf.pycomp import basestring
from dnf.i18n import _

logger = logging.getLogger("dnf")

class Sack(hawkey.Sack):
    # :api

    def __init__(self, *args, **kwargs):
        super(Sack, self).__init__(*args, **kwargs)

    def _configure(self, installonly=None, installonly_limit=0, allow_vendor_change=None):
        if installonly:
            self.installonly = installonly
        self.installonly_limit = installonly_limit
        if allow_vendor_change is not None:
            self.allow_vendor_change = allow_vendor_change
            if allow_vendor_change is False:
                logger.warning(_("allow_vendor_change is disabled. This option is currently not supported for downgrade and distro-sync commands"))

    def query(self, flags=0):
        # :api
        """Factory function returning a DNF Query."""
        return dnf.query.Query(self, flags)


def _build_sack(base):
    cachedir = base.conf.cachedir
    # create the dir ourselves so we have the permissions under control:
    dnf.util.ensure_dir(cachedir)
    return Sack(pkgcls=dnf.package.Package, pkginitval=base,
                arch=base.conf.substitutions["arch"],
                cachedir=cachedir, rootdir=base.conf.installroot,
                logfile=os.path.join(base.conf.logdir, dnf.const.LOG_HAWKEY),
                logdebug=base.conf.logfilelevel > 9)


def _rpmdb_sack(base):
    # used by subscription-manager (src/dnf-plugins/product-id.py)
    sack = _build_sack(base)
    try:
        # It can fail if rpmDB is not present
        sack.load_system_repo(build_cache=False)
    except IOError:
        pass
    return sack


def rpmdb_sack(base):
    # :api
    """
    Returns a new instance of sack containing only installed packages (the @System repo).
    Useful for getting the list of installed RPMs after a transaction.
    """
    return _rpmdb_sack(base)
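
# Illustrative sketch (not part of the original module): listing the installed packages
# with rpmdb_sack(). The `base` name is an assumption standing for an initialized
# dnf.Base instance.
#
#     sack = rpmdb_sack(base)
#     for pkg in sack.query().installed():
#         print(pkg.name, pkg.evr)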

site-packages/dnf/exceptions.py
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU Library General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
# Copyright 2004 Duke University

"""
Core DNF Errors.
"""

from __future__ import unicode_literals
from dnf.i18n import ucd, _, P_
import dnf.util
import libdnf
import warnings

class DeprecationWarning(DeprecationWarning):
    # :api
    pass


class Error(Exception):
    # :api
    """Base Error. All other Errors thrown by DNF should inherit from this.

    :api

    """
    def __init__(self, value=None):
        super(Error, self).__init__()
        self.value = None if value is None else ucd(value)

    def __str__(self):
        return "{}".format(self.value)

    def __unicode__(self):
        return ucd(self.__str__())



class CompsError(Error):
    # :api
    pass


class ConfigError(Error):
    def __init__(self, value=None, raw_error=None):
        super(ConfigError, self).__init__(value)
        self.raw_error = ucd(raw_error) if raw_error is not None else None


class DatabaseError(Error):
    pass


class DepsolveError(Error):
    # :api
    pass


class DownloadError(Error):
    # :api
    def __init__(self, errmap):
        super(DownloadError, self).__init__()
        self.errmap = errmap

    @staticmethod
    def errmap2str(errmap):
        errstrings = []
        for key in errmap:
            for error in errmap[key]:
                msg = '%s: %s' % (key, error) if key else '%s' % error
                errstrings.append(msg)
        return '\n'.join(errstrings)

    def __str__(self):
        return self.errmap2str(self.errmap)
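
    # Illustrative sketch (not part of the original module): errmap is expected to map a
    # key (e.g. a package file name or URL, or None) to a list of error strings:
    #
    #     err = DownloadError({'foo-1.0-1.noarch.rpm': ['curl error 28: timeout']})
    #     str(err)   # -> 'foo-1.0-1.noarch.rpm: curl error 28: timeout'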


class LockError(Error):
    pass


class MarkingError(Error):
    # :api

    def __init__(self, value=None, pkg_spec=None):
        """Initialize the marking error instance."""
        super(MarkingError, self).__init__(value)
        self.pkg_spec = None if pkg_spec is None else ucd(pkg_spec)

    def __str__(self):
        string = super(MarkingError, self).__str__()
        if self.pkg_spec:
            string += ': ' + self.pkg_spec
        return string


class MarkingErrors(Error):
    # :api
    def __init__(self, no_match_group_specs=(), error_group_specs=(), no_match_pkg_specs=(),
                 error_pkg_specs=(), module_depsolv_errors=()):
        """Initialize the marking error instance."""
        msg = _("Problems in request:")
        if (no_match_pkg_specs):
            msg += "\n" + _("missing packages: ") + ", ".join(no_match_pkg_specs)
        if (error_pkg_specs):
            msg += "\n" + _("broken packages: ") + ", ".join(error_pkg_specs)
        if (no_match_group_specs):
            msg += "\n" + _("missing groups or modules: ") + ", ".join(no_match_group_specs)
        if (error_group_specs):
            msg += "\n" + _("broken groups or modules: ") + ", ".join(error_group_specs)
        if (module_depsolv_errors):
            msg_mod = dnf.util._format_resolve_problems(module_depsolv_errors[0])
            if module_depsolv_errors[1] == \
                    libdnf.module.ModulePackageContainer.ModuleErrorType_ERROR_IN_DEFAULTS:
                msg += "\n" + "\n".join([P_('Modular dependency problem with Defaults:',
                                            'Modular dependency problems with Defaults:',
                                            len(module_depsolv_errors)),
                                        msg_mod])
            else:
                msg += "\n" + "\n".join([P_('Modular dependency problem:',
                                            'Modular dependency problems:',
                                            len(module_depsolv_errors)),
                                        msg_mod])
        super(MarkingErrors, self).__init__(msg)
        self.no_match_group_specs = no_match_group_specs
        self.error_group_specs = error_group_specs
        self.no_match_pkg_specs = no_match_pkg_specs
        self.error_pkg_specs = error_pkg_specs
        self.module_depsolv_errors = module_depsolv_errors

    @property
    def module_debsolv_errors(self):
        msg = "Attribute module_debsolv_errors is deprecated. Use module_depsolv_errors " \
              "attribute instead."
        warnings.warn(msg, DeprecationWarning, stacklevel=2)
        return self.module_depsolv_errors

class MetadataError(Error):
    pass


class MiscError(Error):
    pass


class PackagesNotAvailableError(MarkingError):
    def __init__(self, value=None, pkg_spec=None, packages=None):
        super(PackagesNotAvailableError, self).__init__(value, pkg_spec)
        self.packages = packages or []


class PackageNotFoundError(MarkingError):
    pass


class PackagesNotInstalledError(MarkingError):
    def __init__(self, value=None, pkg_spec=None, packages=None):
        super(PackagesNotInstalledError, self).__init__(value, pkg_spec)
        self.packages = packages or []


class ProcessLockError(LockError):
    def __init__(self, value, pid):
        super(ProcessLockError, self).__init__(value)
        self.pid = pid

    def __reduce__(self):
        """Pickling support."""
        return (ProcessLockError, (self.value, self.pid))


class RepoError(Error):
    # :api
    pass


class ThreadLockError(LockError):
    pass


class TransactionCheckError(Error):
    pass

site-packages/dnf/conf/read.py
# read.py
# Reading configuration from files.
#
# Copyright (C) 2014-2017 Red Hat, Inc.
#
# This copyrighted material is made available to anyone wishing to use,
# modify, copy, or redistribute it subject to the terms and conditions of
# the GNU General Public License v.2, or (at your option) any later version.
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY expressed or implied, including the implied warranties of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General
# Public License for more details.  You should have received a copy of the
# GNU General Public License along with this program; if not, write to the
# Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
# 02110-1301, USA.  Any Red Hat trademarks that are incorporated in the
# source code or documentation are not subject to the GNU General Public
# License and may only be used or replicated with the express permission of
# Red Hat, Inc.
#

from __future__ import absolute_import
from __future__ import unicode_literals
from dnf.i18n import _, ucd
import dnf.conf
import libdnf.conf
import dnf.exceptions
import dnf.repo
import dnf.util
import glob
import logging
import os

logger = logging.getLogger('dnf')


class RepoReader(object):
    def __init__(self, conf, opts):
        self.conf = conf
        self.opts = opts

    def __iter__(self):
        # get the repos from the main yum.conf file
        for r in self._get_repos(self.conf.config_file_path):
            yield r

        # read .repo files from directories specified by conf.reposdir
        repo_configs = []
        for reposdir in self.conf.reposdir:
            for path in glob.glob(os.path.join(reposdir, "*.repo")):
                repo_configs.append(path)

        # remove the .repo suffix before calling the sort function
        # also split the path so the separators are not treated as ordinary characters
        repo_configs.sort(key=lambda x: dnf.util.split_path(x[:-5]))

        for repofn in repo_configs:
            try:
                for r in self._get_repos(repofn):
                    yield r
            except dnf.exceptions.ConfigError:
                logger.warning(_("Warning: failed loading '%s', skipping."),
                               repofn)

    def _build_repo(self, parser, id_, repofn):
        """Build a repository using the parsed data."""

        substituted_id = libdnf.conf.ConfigParser.substitute(id_, self.conf.substitutions)

        # Check the repo.id against the valid chars
        invalid = dnf.repo.repo_id_invalid(substituted_id)
        if invalid is not None:
            if substituted_id != id_:
                msg = _("Bad id for repo: {} ({}), byte = {} {}").format(substituted_id, id_,
                                                                         substituted_id[invalid],
                                                                         invalid)
            else:
                msg = _("Bad id for repo: {}, byte = {} {}").format(id_, id_[invalid], invalid)
            raise dnf.exceptions.ConfigError(msg)

        repo = dnf.repo.Repo(substituted_id, self.conf)
        try:
            repo._populate(parser, id_, repofn, dnf.conf.PRIO_REPOCONFIG)
        except ValueError as e:
            if substituted_id != id_:
                msg = _("Repository '{}' ({}): Error parsing config: {}").format(substituted_id,
                                                                                 id_, e)
            else:
                msg = _("Repository '{}': Error parsing config: {}").format(id_, e)
            raise dnf.exceptions.ConfigError(msg)

        # Ensure that the repo name is set
        if repo._get_priority('name') == dnf.conf.PRIO_DEFAULT:
            if substituted_id != id_:
                msg = _("Repository '{}' ({}) is missing name in configuration, using id.").format(
                    substituted_id, id_)
            else:
                msg = _("Repository '{}' is missing name in configuration, using id.").format(id_)
            logger.warning(msg)
        repo.name = ucd(repo.name)
        repo._substitutions.update(self.conf.substitutions)
        repo.cfg = parser

        return repo

    def _get_repos(self, repofn):
        """Parse and yield all repositories from a config file."""

        substs = self.conf.substitutions
        parser = libdnf.conf.ConfigParser()
        parser.setSubstitutions(substs)
        try:
            parser.read(repofn)
        except RuntimeError as e:
            raise dnf.exceptions.ConfigError(_('Parsing file "{}" failed: {}').format(repofn, e))
        except IOError as e:
            logger.warning(e)

        # Check sections in the .repo file that was just slurped up
        for section in parser.getData():

            if section == 'main':
                continue

            try:
                thisrepo = self._build_repo(parser, ucd(section), repofn)
            except (dnf.exceptions.RepoError, dnf.exceptions.ConfigError) as e:
                logger.warning(e)
                continue
            else:
                thisrepo.repofile = repofn

            thisrepo._configure_from_options(self.opts)

            yield thisrepo
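
# Illustrative sketch (not part of the original module): RepoReader is iterable, so a
# caller holding a parsed main configuration (`conf`) and command-line options (`opts`)
# can collect every configured repository; both names are assumptions for the example.
#
#     repos = list(RepoReader(conf, opts))
#     enabled_ids = [r.id for r in repos if r.enabled]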

site-packages/dnf/conf/config.py
# dnf configuration classes.
#
# Copyright (C) 2016-2017 Red Hat, Inc.
#
# This copyrighted material is made available to anyone wishing to use,
# modify, copy, or redistribute it subject to the terms and conditions of
# the GNU General Public License v.2, or (at your option) any later version.
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY expressed or implied, including the implied warranties of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General
# Public License for more details.  You should have received a copy of the
# GNU General Public License along with this program; if not, write to the
# Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
# 02110-1301, USA.  Any Red Hat trademarks that are incorporated in the
# source code or documentation are not subject to the GNU General Public
# License and may only be used or replicated with the express permission of
# Red Hat, Inc.
#

from __future__ import absolute_import
from __future__ import unicode_literals

from dnf.yum import misc
from dnf.i18n import ucd, _
from dnf.pycomp import basestring, urlparse

import fnmatch
import dnf.conf.substitutions
import dnf.const
import dnf.exceptions
import dnf.pycomp
import dnf.rpm
import dnf.util
import hawkey
import logging
import os
import libdnf.conf
import libdnf.repo
import tempfile

PRIO_EMPTY = libdnf.conf.Option.Priority_EMPTY
PRIO_DEFAULT = libdnf.conf.Option.Priority_DEFAULT
PRIO_MAINCONFIG = libdnf.conf.Option.Priority_MAINCONFIG
PRIO_AUTOMATICCONFIG = libdnf.conf.Option.Priority_AUTOMATICCONFIG
PRIO_REPOCONFIG = libdnf.conf.Option.Priority_REPOCONFIG
PRIO_PLUGINDEFAULT = libdnf.conf.Option.Priority_PLUGINDEFAULT
PRIO_PLUGINCONFIG = libdnf.conf.Option.Priority_PLUGINCONFIG
PRIO_COMMANDLINE = libdnf.conf.Option.Priority_COMMANDLINE
PRIO_RUNTIME = libdnf.conf.Option.Priority_RUNTIME

logger = logging.getLogger('dnf')


class BaseConfig(object):
    """Base class for storing configuration definitions.

       Subclass when creating your own definitions.

    """

    def __init__(self, config=None, section=None, parser=None):
        self.__dict__["_config"] = config
        self._section = section

    def __getattr__(self, name):
        if "_config" not in self.__dict__:
            raise AttributeError("'{}' object has no attribute '{}'".format(self.__class__, name))
        option = getattr(self._config, name)
        if option is None:
            return None
        try:
            value = option().getValue()
        except Exception as ex:
            return None
        if isinstance(value, str):
            return ucd(value)
        return value

    def __setattr__(self, name, value):
        option = getattr(self._config, name, None)
        if option is None:
            # unknown config option, store to BaseConfig only
            return super(BaseConfig, self).__setattr__(name, value)
        self._set_value(name, value, PRIO_RUNTIME)

    def __str__(self):
        out = []
        out.append('[%s]' % self._section)
        if self._config:
            for optBind in self._config.optBinds():
                try:
                    value = optBind.second.getValueString()
                except RuntimeError:
                    value = ""
                out.append('%s: %s' % (optBind.first, value))
        return '\n'.join(out)

    def _has_option(self, name):
        method = getattr(self._config, name, None)
        return method is not None

    def _get_value(self, name):
        method = getattr(self._config, name, None)
        if method is None:
            return None
        return method().getValue()

    def _get_priority(self, name):
        method = getattr(self._config, name, None)
        if method is None:
            return None
        return method().getPriority()

    def _set_value(self, name, value, priority=PRIO_RUNTIME):
        """Set option's value if priority is equal or higher
           than current priority."""
        method = getattr(self._config, name, None)
        if method is None:
            raise Exception("Option \"" + name + "\" does not exists")
        option = method()
        if value is None:
            try:
                option.set(priority, value)
            except Exception:
                pass
        else:
            try:
                if isinstance(value, list) or isinstance(value, tuple):
                    option.set(priority, libdnf.conf.VectorString(value))
                elif (isinstance(option, libdnf.conf.OptionBool)
                      or isinstance(option, libdnf.conf.OptionChildBool)
                      ) and isinstance(value, int):
                    option.set(priority, bool(value))
                else:
                    option.set(priority, value)
            except RuntimeError as e:
                raise dnf.exceptions.ConfigError(_("Error parsing '%s': %s")
                                                 % (value, str(e)),
                                                 raw_error=str(e))

    def _populate(self, parser, section, filename, priority=PRIO_DEFAULT):
        """Set option values from an INI file section."""
        if parser.hasSection(section):
            for name in parser.options(section):
                value = parser.getSubstitutedValue(section, name)
                if not value or value == 'None':
                    value = ''
                if hasattr(self._config, name):
                    try:
                        self._config.optBinds().at(name).newString(priority, value)
                    except RuntimeError as e:
                        logger.error(_('Invalid configuration value: %s=%s in %s; %s'),
                                     ucd(name), ucd(value), ucd(filename), str(e))
                else:
                    if name == 'arch' and hasattr(self, name):
                        setattr(self, name, value)
                    else:
                        logger.debug(
                            _('Unknown configuration option: %s = %s in %s'),
                            ucd(name), ucd(value), ucd(filename))

    def dump(self):
        # :api
        """Return a string representing the values of all the
           configuration options.
        """
        output = ['[%s]' % self._section]

        if self._config:
            for optBind in self._config.optBinds():
                # if not opt._is_runtimeonly():
                try:
                    output.append('%s = %s' % (optBind.first, optBind.second.getValueString()))
                except RuntimeError:
                    pass

        return '\n'.join(output) + '\n'

    @staticmethod
    def write_raw_configfile(filename, section_id, substitutions, modify):
        # :api
        """
        filename   - name of config file (.conf or .repo)
        section_id - id of modified section (e.g. main, fedora, updates)
        substitutions - instance of base.conf.substitutions
        modify     - dict of modified options
        """
        parser = libdnf.conf.ConfigParser()
        parser.read(filename)

        # b/c repoids can have $values in them we need to map both ways to figure
        # out which one is which
        if not parser.hasSection(section_id):
            for sect in parser.getData():
                if libdnf.conf.ConfigParser.substitute(sect, substitutions) == section_id:
                    section_id = sect

        for name, value in modify.items():
            if isinstance(value, list):
                value = ' '.join(value)
            parser.setValue(section_id, name, value)

        parser.write(filename, False)
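
    # Illustrative sketch (not part of the original module): persisting an option change
    # back to a repo file. The file path, section id and `base` object below are
    # assumptions for the example only.
    #
    #     BaseConfig.write_raw_configfile('/etc/yum.repos.d/fedora.repo', 'fedora',
    #                                     base.conf.substitutions, {'enabled': '1'})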


class MainConf(BaseConfig):
    # :api
    """Configuration option definitions for dnf.conf's [main] section."""
    def __init__(self, section='main', parser=None):
        # pylint: disable=R0915
        config = libdnf.conf.ConfigMain()
        super(MainConf, self).__init__(config, section, parser)
        self._set_value('pluginpath', [dnf.const.PLUGINPATH], PRIO_DEFAULT)
        self._set_value('pluginconfpath', [dnf.const.PLUGINCONFPATH], PRIO_DEFAULT)
        self.substitutions = dnf.conf.substitutions.Substitutions()
        self.arch = hawkey.detect_arch()
        self._config.system_cachedir().set(PRIO_DEFAULT, dnf.const.SYSTEM_CACHEDIR)

        # setup different cache and log for non-privileged users
        if dnf.util.am_i_root():
            cachedir = dnf.const.SYSTEM_CACHEDIR
            logdir = '/var/log'
        else:
            try:
                cachedir = logdir = misc.getCacheDir()
            except (IOError, OSError) as e:
                msg = _('Could not set cachedir: {}').format(ucd(e))
                raise dnf.exceptions.Error(msg)

        self._config.cachedir().set(PRIO_DEFAULT, cachedir)
        self._config.logdir().set(PRIO_DEFAULT, logdir)

        # track list of temporary files created
        self.tempfiles = []

    def __del__(self):
        for file_name in self.tempfiles:
            os.unlink(file_name)

    @property
    def get_reposdir(self):
        # :api
        """Returns the value of reposdir"""
        myrepodir = None
        # put repo file into first reposdir which exists or create it
        for rdir in self._get_value('reposdir'):
            if os.path.exists(rdir):
                myrepodir = rdir
                break

        if not myrepodir:
            myrepodir = self._get_value('reposdir')[0]
            dnf.util.ensure_dir(myrepodir)
        return myrepodir

    def _check_remote_file(self, optname):
        """
        If the option value is a remote URL, download it to a temporary location
        and use that temporary file instead.
        """
        prio = self._get_priority(optname)
        val = self._get_value(optname)
        if isinstance(val, basestring):
            location = urlparse.urlparse(val)
            if location[0] in ('file', ''):
                # just strip the file:// prefix
                self._set_value(optname, location.path, prio)
            else:
                downloader = libdnf.repo.Downloader()
                temp_fd, temp_path = tempfile.mkstemp(prefix='dnf-downloaded-config-')
                self.tempfiles.append(temp_path)
                try:
                    downloader.downloadURL(None, val, temp_fd)
                except RuntimeError as e:
                    raise dnf.exceptions.ConfigError(
                        _('Configuration file URL "{}" could not be downloaded:\n'
                          '  {}').format(val, str(e)))
                else:
                    self._set_value(optname, temp_path, prio)
                finally:
                    os.close(temp_fd)

    def _search_inside_installroot(self, optname):
        """
        Return the root used as a prefix for the option (installroot or "/"). When the option
        was specified on the command line, the value from conf.installroot is returned.
        """
        installroot = self._get_value('installroot')
        if installroot == "/":
            return installroot
        prio = self._get_priority(optname)
        # don't modify paths specified on commandline
        if prio >= PRIO_COMMANDLINE:
            return installroot
        val = self._get_value(optname)
        # if it exists inside installroot use it (i.e. adjust configuration)
        # for lists any component counts
        if not isinstance(val, str):
            if any(os.path.exists(os.path.join(installroot, p.lstrip('/'))) for p in val):
                self._set_value(
                    optname,
                    libdnf.conf.VectorString([self._prepend_installroot_path(p) for p in val]),
                    prio
                )
                return installroot
        elif os.path.exists(os.path.join(installroot, val.lstrip('/'))):
            self._set_value(optname, self._prepend_installroot_path(val), prio)
            return installroot
        return "/"

    def prepend_installroot(self, optname):
        # :api
        prio = self._get_priority(optname)
        new_path = self._prepend_installroot_path(self._get_value(optname))
        self._set_value(optname, new_path, prio)

    def _prepend_installroot_path(self, path):
        root_path = os.path.join(self._get_value('installroot'), path.lstrip('/'))
        return libdnf.conf.ConfigParser.substitute(root_path, self.substitutions)

    def _configure_from_options(self, opts):
        """Configure parts of CLI from the opts """
        config_args = ['plugins', 'version', 'config_file_path',
                       'debuglevel', 'errorlevel', 'installroot',
                       'best', 'assumeyes', 'assumeno', 'clean_requirements_on_remove', 'gpgcheck',
                       'showdupesfromrepos', 'plugins', 'ip_resolve',
                       'rpmverbosity', 'disable_excludes', 'color',
                       'downloadonly', 'exclude', 'excludepkgs', 'skip_broken',
                       'tsflags', 'arch', 'basearch', 'ignorearch', 'cacheonly', 'comment']

        for name in config_args:
            value = getattr(opts, name, None)
            if value is not None and value != []:
                if self._has_option(name):
                    appendValue = False
                    if self._config:
                        try:
                            appendValue = self._config.optBinds().at(name).getAddValue()
                        except RuntimeError:
                            # fails if option with "name" does not exist in _config (libdnf)
                            pass
                    if appendValue:
                        add_priority = dnf.conf.PRIO_COMMANDLINE
                        if add_priority < self._get_priority(name):
                            add_priority = self._get_priority(name)
                        for item in value:
                            if item:
                                self._set_value(name, self._get_value(name) + [item], add_priority)
                            else:
                                self._set_value(name, [], dnf.conf.PRIO_COMMANDLINE)
                    else:
                        self._set_value(name, value, dnf.conf.PRIO_COMMANDLINE)
                elif hasattr(self, name):
                    setattr(self, name, value)
                else:
                    logger.warning(_('Unknown configuration option: %s = %s'),
                                   ucd(name), ucd(value))

        if getattr(opts, 'gpgcheck', None) is False:
            self._set_value("localpkg_gpgcheck", False, dnf.conf.PRIO_COMMANDLINE)

        if hasattr(opts, 'main_setopts'):
            # now set all the non-first-start opts from main from our setopts
            # pylint: disable=W0212
            for name, values in opts.main_setopts.items():
                for val in values:
                    if hasattr(self._config, name):
                        try:
                            # values in main_setopts are strings, try to parse it using newString()
                            self._config.optBinds().at(name).newString(PRIO_COMMANDLINE, val)
                        except RuntimeError as e:
                            raise dnf.exceptions.ConfigError(
                                _("Error parsing --setopt with key '%s', value '%s': %s")
                                % (name, val, str(e)), raw_error=str(e))
                    else:
                        # if config option with "name" doesn't exist in _config, it could be defined
                        # only in Python layer
                        if hasattr(self, name):
                            setattr(self, name, val)
                        else:
                            msg = _("Main config did not have a %s attr. before setopt")
                            logger.warning(msg, name)

    def exclude_pkgs(self, pkgs):
        # :api
        name = "excludepkgs"

        if pkgs is not None and pkgs != []:
            if self._has_option(name):
                self._set_value(name, pkgs, dnf.conf.PRIO_COMMANDLINE)
            else:
                logger.warning(_('Unknown configuration option: %s = %s'),
                               ucd(name), ucd(pkgs))

    def _adjust_conf_options(self):
        """Adjust conf options interactions"""

        skip_broken_val = self._get_value('skip_broken')
        if skip_broken_val:
            self._set_value('strict', not skip_broken_val, self._get_priority('skip_broken'))

    @property
    def releasever(self):
        # :api
        return self.substitutions.get('releasever')

    @releasever.setter
    def releasever(self, val):
        # :api
        if val is None:
            self.substitutions.pop('releasever', None)
            return
        self.substitutions['releasever'] = str(val)

    @property
    def arch(self):
        # :api
        return self.substitutions.get('arch')

    @arch.setter
    def arch(self, val):
        # :api

        if val is None:
            self.substitutions.pop('arch', None)
            return
        if val not in dnf.rpm._BASEARCH_MAP.keys():
            msg = _('Incorrect or unknown "{}": {}')
            raise dnf.exceptions.Error(msg.format("arch", val))
        self.substitutions['arch'] = val
        self.basearch = dnf.rpm.basearch(val)

    @property
    def basearch(self):
        # :api
        return self.substitutions.get('basearch')

    @basearch.setter
    def basearch(self, val):
        # :api

        if val is None:
            self.substitutions.pop('basearch', None)
            return
        if val not in dnf.rpm._BASEARCH_MAP.values():
            msg = _('Incorrect or unknown "{}": {}')
            raise dnf.exceptions.Error(msg.format("basearch", val))
        self.substitutions['basearch'] = val

    def read(self, filename=None, priority=PRIO_DEFAULT):
        # :api
        if filename is None:
            filename = self._get_value('config_file_path')
        parser = libdnf.conf.ConfigParser()
        try:
            parser.read(filename)
        except RuntimeError as e:
            raise dnf.exceptions.ConfigError(_('Parsing file "%s" failed: %s') % (filename, e))
        except IOError as e:
            logger.warning(e)
        self._populate(parser, self._section, filename, priority)

        # update to where we read the file from
        self._set_value('config_file_path', filename, priority)

    @property
    def verbose(self):
        return self._get_value('debuglevel') >= dnf.const.VERBOSE_LEVEL


class RepoConf(BaseConfig):
    """Option definitions for repository INI file sections."""

    def __init__(self, parent, section=None, parser=None):
        mainConfig = parent._config if parent else libdnf.conf.ConfigMain()
        super(RepoConf, self).__init__(libdnf.conf.ConfigRepo(mainConfig), section, parser)
        # Do not remove! Attribute is a reference holder.
        # Prevents premature removal of the mainConfig. The libdnf ConfigRepo points to it.
        self._mainConfigRefHolder = mainConfig
        if section:
            self._config.name().set(PRIO_DEFAULT, section)

    def _configure_from_options(self, opts):
        """Configure repos from the opts. """

        if getattr(opts, 'gpgcheck', None) is False:
            for optname in ['gpgcheck', 'repo_gpgcheck']:
                self._set_value(optname, False, dnf.conf.PRIO_COMMANDLINE)

        repo_setopts = getattr(opts, 'repo_setopts', {})
        for repoid, setopts in repo_setopts.items():
            if not fnmatch.fnmatch(self._section, repoid):
                continue
            for name, values in setopts.items():
                for val in values:
                    if hasattr(self._config, name):
                        try:
                            # values in repo_setopts are strings, try to parse it using newString()
                            self._config.optBinds().at(name).newString(PRIO_COMMANDLINE, val)
                        except RuntimeError as e:
                            raise dnf.exceptions.ConfigError(
                                _("Error parsing --setopt with key '%s.%s', value '%s': %s")
                                % (self._section, name, val, str(e)), raw_error=str(e))
                    else:
                        msg = _("Repo %s did not have a %s attr. before setopt")
                        logger.warning(msg, self._section, name)

site-packages/dnf/conf/__init__.py
# conf.py
# dnf configuration classes.
#
# Copyright (C) 2012-2016 Red Hat, Inc.
#
# This copyrighted material is made available to anyone wishing to use,
# modify, copy, or redistribute it subject to the terms and conditions of
# the GNU General Public License v.2, or (at your option) any later version.
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY expressed or implied, including the implied warranties of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General
# Public License for more details.  You should have received a copy of the
# GNU General Public License along with this program; if not, write to the
# Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
# 02110-1301, USA.  Any Red Hat trademarks that are incorporated in the
# source code or documentation are not subject to the GNU General Public
# License and may only be used or replicated with the express permission of
# Red Hat, Inc.
#


"""
The configuration classes and routines in yum are splattered over too many
places, hard to change and debug. The new structure here will replace that. Its
goal is to:

* accept configuration options from all three sources (the main config file,
  repo config files, command line switches)
* handle all the logic of storing those and producing related values
* return configuration values
* optionally: assert that no value is overridden once it has been applied
  somewhere (e.g. do not let a new repo be initialized with a different global
  cache path than an already existing one).

"""

from __future__ import absolute_import
from __future__ import unicode_literals

from dnf.conf.config import PRIO_DEFAULT, PRIO_MAINCONFIG, PRIO_AUTOMATICCONFIG
from dnf.conf.config import PRIO_REPOCONFIG, PRIO_PLUGINDEFAULT, PRIO_PLUGINCONFIG
from dnf.conf.config import PRIO_COMMANDLINE, PRIO_RUNTIME

from dnf.conf.config import BaseConfig, MainConf, RepoConf

Conf = MainConf
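
# Illustrative sketch (not part of the original package): building a configuration object
# and reading the main config file; the path below is the conventional default and is an
# assumption for the example only.
#
#     conf = MainConf()
#     conf.read('/etc/dnf/dnf.conf')
#     print(conf.cachedir, conf.substitutions.get('basearch'))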

site-packages/dnf/conf/substitutions.py
# substitutions.py
# Config file substitutions.
#
# Copyright (C) 2012-2016 Red Hat, Inc.
#
# This copyrighted material is made available to anyone wishing to use,
# modify, copy, or redistribute it subject to the terms and conditions of
# the GNU General Public License v.2, or (at your option) any later version.
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY expressed or implied, including the implied warranties of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General
# Public License for more details.  You should have received a copy of the
# GNU General Public License along with this program; if not, write to the
# Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
# 02110-1301, USA.  Any Red Hat trademarks that are incorporated in the
# source code or documentation are not subject to the GNU General Public
# License and may only be used or replicated with the express permission of
# Red Hat, Inc.
#

import logging
import os
import re

from dnf.i18n import _

ENVIRONMENT_VARS_RE = re.compile(r'^DNF_VAR_[A-Za-z0-9_]+$')
logger = logging.getLogger('dnf')


class Substitutions(dict):
    # :api

    def __init__(self):
        super(Substitutions, self).__init__()
        self._update_from_env()

    def _update_from_env(self):
        numericvars = ['DNF%d' % num for num in range(0, 10)]
        for key, val in os.environ.items():
            if ENVIRONMENT_VARS_RE.match(key):
                self[key[8:]] = val  # remove "DNF_VAR_" prefix
            elif key in numericvars:
                self[key] = val

    def update_from_etc(self, installroot, varsdir=("/etc/yum/vars/", "/etc/dnf/vars/")):
        # :api

        for vars_path in varsdir:
            fsvars = []
            try:
                dir_fsvars = os.path.join(installroot, vars_path.lstrip('/'))
                fsvars = os.listdir(dir_fsvars)
            except OSError:
                continue
            for fsvar in fsvars:
                filepath = os.path.join(dir_fsvars, fsvar)
                val = None
                if os.path.isfile(filepath):
                    try:
                        with open(filepath) as fp:
                            val = fp.readline()
                        if val and val[-1] == '\n':
                            val = val[:-1]
                    except (OSError, IOError, UnicodeDecodeError) as e:
                        logger.warning(_("Error when parsing a variable from file '{0}': {1}").format(filepath, e))
                        continue
                if val is not None:
                    self[fsvar] = val
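
# Illustrative sketch (not part of the original module): environment variables named
# DNF_VAR_<name> (plus DNF0..DNF9) become substitution keys, so "$contentdir" in a repo
# URL can be filled from DNF_VAR_contentdir; the variable name is an example assumption.
#
#     os.environ['DNF_VAR_contentdir'] = 'pub'
#     subst = Substitutions()
#     assert subst['contentdir'] == 'pub'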
d�de/�Z0Gdd
�d
e/�Z1dS)�)�absolute_import)�unicode_literals)�misc)�ucd�_)�
basestring�urlparseN�dnfcs~eZdZdZddd�Zdd�Z�fdd�Zd	d
�Zdd�Zd
d�Z	dd�Z
efdd�Ze
fdd�Zdd�Zedd��Z�ZS)�
BaseConfigzlBase class for storing configuration definitions.

       Subclass when creating your own definitions.

    NcCs||jd<||_dS)N�_config)�__dict__�_section)�self�config�section�parser�r�/usr/lib/python3.6/config.py�__init__<s
zBaseConfig.__init__cCszd|jkrtdj|j|���t|j|�}|dkr4dSy|�j�}Wn tk
rb}zdSd}~XnXt|t	�rvt
|�S|S)Nrz!'{}' object has no attribute '{}')r�AttributeError�format�	__class__�getattrr�getValue�	Exception�
isinstance�strr)r�name�option�valueZexrrr�__getattr__@s

zBaseConfig.__getattr__cs:t|j|d�}|dkr(tt|�j||�S|j||t�dS)N)rr�superr
�__setattr__�
_set_value�PRIO_RUNTIME)rrrr)rrrr"NszBaseConfig.__setattr__c
Cstg}|jd|j�|jrjxN|jj�D]@}y|jj�}Wntk
rPd}YnX|jd|j|f�q&Wdj|�S)Nz[%s]�z%s: %s�
)	�appendr
r�optBinds�second�getValueString�RuntimeError�first�join)r�out�optBindrrrr�__str__Us
zBaseConfig.__str__cCst|j|d�}|dk	S)N)rr)rr�methodrrr�_has_optionaszBaseConfig._has_optioncCs$t|j|d�}|dkrdS|�j�S)N)rrr)rrr1rrr�
_get_valueeszBaseConfig._get_valuecCs$t|j|d�}|dkrdS|�j�S)N)rrZgetPriority)rrr1rrr�
_get_prioritykszBaseConfig._get_prioritycCst|j|d�}|dkr&td|d��|�}|dkr\y|j||�Wntk
rXYnXn�yrt|t�srt|t�r�|j|tjj	|��nDt|tjj
�s�t|tjj�r�t|t�r�|j|t
|��n|j||�WnHtk
�r}z*tjjtd�|t|�ft|�d��WYdd}~XnXdS)zSSet option's value if priority is equal or higher
           than current priority.NzOption "z" does not existszError parsing '%s': %s)�	raw_error)rrr�setr�list�tuple�libdnf�conf�VectorStringZ
OptionBoolZOptionChildBool�int�boolr+r	�
exceptions�ConfigErrorrr)rrr�priorityr1r�errrr#qs*
zBaseConfig._set_valuecCs�|j|�r�x�|j|�D]�}|j||�}|s4|dkr8d}t|j|�r�y|jj�j|�j||�Wq�tk
r�}z,t	j
td�t|�t|�t|�t
|��WYdd}~Xq�Xq|dkr�t||�r�t|||�qt	jtd�t|�t|�t|��qWdS)z+Set option values from an INI file section.�Noner%z,Invalid configuration value: %s=%s in %s; %sN�archz+Unknown configuration option: %s = %s in %s)�
hasSectionZoptionsZgetSubstitutedValue�hasattrrr(�at�	newStringr+�logger�errorrrr�setattr�debug)rrr�filenamer@rrrArrr�	_populate�s 

0zBaseConfig._populatecCshd|jg}|jrZxF|jj�D]8}y|jd|j|jj�f�Wqtk
rTYqXqWdj|�dS)z]Return a string representing the values of all the
           configuration options.
        z[%s]z%s = %sr&)	r
rr(r'r,r)r*r+r-)r�outputr/rrr�dump�s
zBaseConfig.dumpcCs�tjj�}|j|�|j|�sHx(|j�D]}tjjj||�|kr(|}q(Wx6|j�D]*\}}t|t	�rndj
|�}|j|||�qRW|j|d�dS)z�
        filename   - name of config file (.conf or .repo)
        section_id - id of modified section (e.g. main, fedora, updates)
        substitutions - instance of base.conf.substitutions
        modify     - dict of modified options
        � FN)
r9r:�ConfigParser�readrDZgetData�
substitute�itemsrr7r-ZsetValue�write)rLZ
section_id�
substitutionsZmodifyrZsectrrrrr�write_raw_configfile�s	




zBaseConfig.write_raw_configfile)NNN)�__name__�
__module__�__qualname__�__doc__rr r"r0r2r3r4r$r#�PRIO_DEFAULTrMrO�staticmethodrW�
__classcell__rr)rrr
5s
r
cs�eZdZdZd%�fdd�	Zdd�Zedd	��Zd
d�Zdd
�Z	dd�Z
dd�Zdd�Zdd�Z
dd�Zedd��Zejdd��Zedd��Zejdd��Zedd��Zejd d��Zdefd!d"�Zed#d$��Z�ZS)&�MainConfz?Configuration option definitions for dnf.conf's [main] section.�mainNcstjj�}tt|�j|||�|jdtjj	gt
�|jdtjjgt
�tjjj
�|_tj�|_|jj�jt
tjj�tjj�r�tjj}d}nVytj�}}WnDttfk
r�}z$td�jt|��}tjj|��WYdd}~XnX|jj �jt
|�|jj!�jt
|�g|_"dS)NZ
pluginpathZpluginconfpathz/var/logzCould not set cachedir: {})#r9r:�
ConfigMainr!r_rr#r	�constZ
PLUGINPATHr\ZPLUGINCONFPATHrVZ
Substitutions�hawkeyZdetect_archrCrZsystem_cachedirr6ZSYSTEM_CACHEDIR�utilZ	am_i_rootrZgetCacheDir�IOError�OSErrorrrrr>�Error�cachedir�logdir�	tempfiles)rrrrrhrirA�msg)rrrr�s$


zMainConf.__init__cCsx|jD]}tj|�qWdS)N)rj�os�unlink)r�	file_namerrr�__del__�szMainConf.__del__cCsLd}x$|jd�D]}tjj|�r|}PqW|sH|jd�d}tjj|�|S)zReturns the value of reposdirNZreposdirr)r3rl�path�existsr	rdZ
ensure_dir)rZ	myrepodirZrdirrrr�get_reposdir�szMainConf.get_reposdirc	Cs�|j|�}|j|�}t|t�r�tj|�}|ddkrF|j||j|�n�tjj	�}t
jdd�\}}|jj
|�zdy|jd||�Wn>tk
r�}z"tjjtd�j|t|����WYdd}~XnX|j|||�Wdtj|�XdS)	z�
        In case the option value is a remote URL, download it to the temporary location
        and use this temporary file instead.
        r�filer%zdnf-downloaded-config-)�prefixNz9Configuration file URL "{}" could not be downloaded:
  {})rsr%)r4r3rrrr#rpr9ZrepoZ
Downloader�tempfileZmkstemprjr'ZdownloadURLr+r	r>r?rrrrl�close)	r�optname�prio�val�locationZ
downloaderZtemp_fdZ	temp_pathrArrr�_check_remote_file�s"




 zMainConf._check_remote_filecs��jd���dkr�S�j|�}|tkr,�S�j|�}t|t�s�t�fdd�|D��r��j|tjj	�fdd�|D��|��Sn4t
jjt
jj
�|jd���r��j|�j|�|��SdS)z�
        Return root used as prefix for option (installroot or "/"). When specified from commandline
        it returns value from conf.installroot
        �installroot�/c3s*|]"}tjjtjj�|jd���VqdS)r}N)rlrprqr-�lstrip)�.0�p)r|rr�	<genexpr>*sz6MainConf._search_inside_installroot.<locals>.<genexpr>csg|]}�j|��qSr)�_prepend_installroot_path)rr�)rrr�
<listcomp>-sz7MainConf._search_inside_installroot.<locals>.<listcomp>)r3r4�PRIO_COMMANDLINErr�anyr#r9r:r;rlrprqr-r~r�)rrwrxryr)r|rr�_search_inside_installroots$



z#MainConf._search_inside_installrootcCs,|j|�}|j|j|��}|j|||�dS)N)r4r�r3r#)rrwrx�new_pathrrr�prepend_installroot6s
zMainConf.prepend_installrootcCs,tjj|jd�|jd��}tjjj||j	�S)Nr|r})
rlrpr-r3r~r9r:rQrSrV)rrpZ	root_pathrrrr�<sz"MainConf._prepend_installroot_pathcCs`ddddddddd	d
dddd
dddddddddddddg}�x|D�]}t||d�}|dk	ob|gkrB|j|��r$d}|jr�y|jj�j|�j�}Wntk
r�YnX|�rtjj	}||j
|�kr�|j
|�}xR|D]6}|r�|j||j|�|g|�q�|j|gtjj	�q�Wn|j||tjj	�qBt
||��r>t|||�qBtjtd�t|�t|��qBWt|dd�dk�r�|jddtjj	�t
|d��r\x�|jj�D]�\}}x�|D]�}	t
|j|��r"y|jj�j|�jt	|	�WnJtk
�r}
z,tjjtd �||	t|
�ft|
�d!��WYdd}
~
XnXn.t
||��r<t|||	�ntd"�}tj||��q�W�q�WdS)#z%Configure parts of CLI from the opts Zplugins�version�config_file_path�
debuglevelZ
errorlevelr|ZbestZ	assumeyesZassumenoZclean_requirements_on_remove�gpgcheckZshowdupesfromreposZ
ip_resolveZrpmverbosityZdisable_excludesZcolorZdownloadonly�exclude�excludepkgs�skip_brokenZtsflagsrC�basearchZ
ignorearchZ	cacheonlyZcommentNFz%Unknown configuration option: %s = %sZlocalpkg_gpgcheck�main_setoptsz4Error parsing --setopt with key '%s', value '%s': %s)r5z1Main config did not have a %s attr. before setopt)rr2rr(rFZgetAddValuer+r	r:r�r4r#r3rErJrH�warningrrr�rTrGr>r?r)r�optsZconfig_argsrrZappendValueZadd_priority�item�valuesryrArkrrr�_configure_from_options@s\




.z MainConf._configure_from_optionscCsPd}|dk	rL|gkrL|j|�r2|j||tjj�ntjtd�t|�t|��dS)Nr�z%Unknown configuration option: %s = %s)	r2r#r	r:r�rHr�rr)rZpkgsrrrr�exclude_pkgss

zMainConf.exclude_pkgscCs(|jd�}|r$|jd||jd��dS)z Adjust conf options interactionsr��strictN)r3r#r4)rZskip_broken_valrrr�_adjust_conf_options�s
zMainConf._adjust_conf_optionscCs|jjd�S)N�
releasever)rV�get)rrrrr��szMainConf.releasevercCs,|dkr|jjdd�dSt|�|jd<dS)Nr�)rV�popr)rryrrrr��scCs|jjd�S)NrC)rVr�)rrrrrC�sz
MainConf.archcCsb|dkr|jjdd�dS|tjjj�krFtd�}tjj|j	d|���||jd<tjj
|�|_
dS)NrCzIncorrect or unknown "{}": {})rVr�r	�rpm�
_BASEARCH_MAP�keysrr>rgrr�)rryrkrrrrC�s
cCs|jjd�S)Nr�)rVr�)rrrrr��szMainConf.basearchcCsT|dkr|jjdd�dS|tjjj�krFtd�}tjj|j	d|���||jd<dS)Nr�zIncorrect or unknown "{}": {})
rVr�r	r�r�r�rr>rgr)rryrkrrrr��scCs�|dkr|jd�}tjj�}y|j|�Wndtk
rd}ztjjt	d�||f��WYdd}~Xn,t
k
r�}ztj|�WYdd}~XnX|j
||j||�|jd||�dS)Nr�zParsing file "%s" failed: %s)r3r9r:rQrRr+r	r>r?rrerHr�rMr
r#)rrLr@rrArrrrR�s

(z
MainConf.readcCs|jd�tjjkS)Nr�)r3r	rbZ
VERBOSE_LEVEL)rrrr�verbose�szMainConf.verbose)r`N)rXrYrZr[rro�propertyrrr{r�r�r�r�r�r�r��setterrCr�r\rRr�r^rr)rrr_�s&?
r_cs*eZdZdZd�fdd�	Zdd�Z�ZS)�RepoConfz4Option definitions for repository INI file sections.NcsP|r
|jntjj�}tt|�jtjj|�||�||_|rL|jj	�j
t|�dS)N)rr9r:rar!r�rZ
ConfigRepoZ_mainConfigRefHolderrr6r\)r�parentrrZ
mainConfig)rrrr�s
zRepoConf.__init__cCst|dd�dkr0xd	D]}|j|dtjj�qWt|di�}x�|j�D]�\}}tj|j|�s^qFx�|j�D]�\}}x�|D]�}t|j	|�r�y|j	j
�j|�jt|�WnLt
k
r�}	z0tjjtd�|j||t|	�ft|	�d��WYdd}	~	XnXqvtd�}
tj|
|j|�qvWqhWqFWdS)
zConfigure repos from the opts. r�NF�
repo_gpgcheck�repo_setoptsz7Error parsing --setopt with key '%s.%s', value '%s': %s)r5z-Repo %s did not have a %s attr. before setopt)r�r�)rr#r	r:r�rT�fnmatchr
rErr(rFrGr+r>r?rrrHr�)rr�rwr�ZrepoidZsetoptsrr�ryrArkrrrr��s$

2z RepoConf._configure_from_options)NN)rXrYrZr[rr�r^rr)rrr��s	r�)2Z
__future__rrZdnf.yumrZdnf.i18nrrZ
dnf.pycomprrr�Zdnf.conf.substitutionsr	Z	dnf.constZdnf.exceptionsZdnf.utilrcZloggingrlZlibdnf.confr9Zlibdnf.reporur:ZOptionZPriority_EMPTYZ
PRIO_EMPTYZPriority_DEFAULTr\ZPriority_MAINCONFIGZPRIO_MAINCONFIGZPriority_AUTOMATICCONFIGZPRIO_AUTOMATICCONFIGZPriority_REPOCONFIGZPRIO_REPOCONFIGZPriority_PLUGINDEFAULTZPRIO_PLUGINDEFAULTZPriority_PLUGINCONFIGZPRIO_PLUGINCONFIGZPriority_COMMANDLINEr�ZPriority_RUNTIMEr$Z	getLoggerrH�objectr
r_r�rrrr�<module>s@









site-packages/dnf/conf/__pycache__/read.cpython-36.opt-1.pyc000064400000006411147511334650017503 0ustar003

�ft`��@s~ddlmZddlmZddlmZmZddlZddlZ	ddl
ZddlZddlZddl
Z
ddlZe
jd�ZGdd�de�ZdS)�)�absolute_import)�unicode_literals)�_�ucdN�dnfc@s,eZdZdd�Zdd�Zdd�Zdd�Zd	S)
�
RepoReadercCs||_||_dS)N)�conf�opts)�selfrr	�r�/usr/lib/python3.6/read.py�__init__$szRepoReader.__init__ccs�x|j|jj�D]
}|VqWg}x8|jjD],}x&tjtjj|d��D]}|j|�qFWq,W|j	dd�d�xT|D]L}yx|j|�D]
}|Vq�WWqrt
jjk
r�t
jtd�|�YqrXqrWdS)Nz*.repocSstjj|dd��S)N����)r�utilZ
split_path)�xrrr�<lambda>5sz%RepoReader.__iter__.<locals>.<lambda>)�keyz'Warning: failed loading '%s', skipping.)�
_get_reposrZconfig_file_path�reposdir�glob�os�path�join�append�sortr�
exceptions�ConfigError�logger�warningr)r
�rZrepo_configsrr�repofnrrr�__iter__(s


zRepoReader.__iter__c	Cs^tjjj||jj�}tjj|�}|dk	rl||krJtd�j	|||||�}ntd�j	||||�}tj
j|��tjj||j�}y|j
|||tjj�WnZtk
r�}z>||kr�td�j	|||�}ntd�j	||�}tj
j|��WYdd}~XnX|jd�tjjk�r8||k�r td�j	||�}ntd�j	|�}tj|�t|j�|_|jj|jj�||_|S)	z)Build a repository using the parsed data.Nz&Bad id for repo: {} ({}), byte = {} {}z!Bad id for repo: {}, byte = {} {}z.Repository '{}' ({}): Error parsing config: {}z)Repository '{}': Error parsing config: {}�namez@Repository '{}' ({}) is missing name in configuration, using id.z;Repository '{}' is missing name in configuration, using id.)�libdnfr�ConfigParserZ
substitute�
substitutionsr�repoZrepo_id_invalidr�formatrrZRepoZ	_populateZPRIO_REPOCONFIG�
ValueErrorZ
_get_priorityZPRIO_DEFAULTrrrr#Z_substitutions�updateZcfg)	r
�parserZid_r!Zsubstituted_idZinvalid�msgr'�errr�_build_repo?s8




zRepoReader._build_repoccs|jj}tjj�}|j|�y|j|�Wndtk
rd}ztjj	t
d�j||���WYdd}~Xn,tk
r�}zt
j|�WYdd}~XnXx�|j�D]x}|dkr�q�y|j|t|�|�}Wn:tjjtjj	fk
r�}zt
j|�w�WYdd}~XnX||_|j|j�|Vq�WdS)z4Parse and yield all repositories from a config file.zParsing file "{}" failed: {}N�main)rr&r$r%ZsetSubstitutions�read�RuntimeErrorrrrrr(�IOErrorrrZgetDatar.rZ	RepoErrorZrepofileZ_configure_from_optionsr	)r
r!Zsubstsr+r-ZsectionZthisreporrrrhs(

(
zRepoReader._get_reposN)�__name__�
__module__�__qualname__r
r"r.rrrrrr#s)r)Z
__future__rrZdnf.i18nrrZdnf.confrZlibdnf.confr$Zdnf.exceptionsZdnf.reporZloggingrZ	getLoggerr�objectrrrrr�<module>s
site-packages/dnf/conf/__pycache__/substitutions.cpython-36.opt-1.pyc000064400000003552147511334650021532 0ustar003

i�-ej
�@sLddlZddlZddlZddlmZejd�Zejd�ZGdd�de	�Z
dS)�N)�_z^DNF_VAR_[A-Za-z0-9_]+$Zdnfcs.eZdZ�fdd�Zdd�Zd
dd�Z�ZS)�
Substitutionscstt|�j�|j�dS)N)�superr�__init__�_update_from_env)�self)�	__class__��#/usr/lib/python3.6/substitutions.pyr"szSubstitutions.__init__cCs\dd�tdd�D�}xBtjj�D]4\}}tj|�rD|||dd�<q ||kr |||<q WdS)NcSsg|]}d|�qS)zDNF%dr	)�.0Znumr	r	r
�
<listcomp>'sz2Substitutions._update_from_env.<locals>.<listcomp>r�
�)�range�os�environ�items�ENVIRONMENT_VARS_RE�match)rZnumericvars�key�valr	r	r
r&s
zSubstitutions._update_from_env�/etc/yum/vars/�/etc/dnf/vars/cCs�x|D�]�}g}y"tjj||jd��}tj|�}Wntk
rJwYnXx�|D]�}tjj||�}d}tjj|�r�y<t|��}	|	j�}WdQRX|r�|ddkr�|dd�}Wn@tt	t
fk
r�}
ztjt
d�j||
��wRWYdd}
~
XnX|dk	rR|||<qRWqWdS)N�/��
z2Error when parsing a variable from file '{0}': {1}���r)r�path�join�lstrip�listdir�OSError�isfile�open�readline�IOError�UnicodeDecodeError�loggerZwarningr�format)rZinstallrootZvarsdirZ	vars_pathZfsvarsZ
dir_fsvarsZfsvar�filepathr�fp�er	r	r
�update_from_etc.s*

zSubstitutions.update_from_etc�rr)r-)�__name__�
__module__�__qualname__rrr,�
__classcell__r	r	)rr
rsr)Zloggingr�reZdnf.i18nr�compilerZ	getLoggerr'�dictrr	r	r	r
�<module>s

site-packages/dnf/conf/__pycache__/__init__.cpython-36.opt-1.pyc000064400000002207147511334650020326 0ustar003

�ft`��@spdZddlmZddlmZddlmZmZmZddlmZm	Z	m
Z
ddlmZmZddlm
Z
mZmZeZdS)	aL
The configuration classes and routines in yum are splattered over too many
places, hard to change and debug. The new structure here will replace that. Its
goal is to:

* accept configuration options from all three sources (the main config file,
  repo config files, command line switches)
* handle all the logic of storing those and producing related values.
* returning configuration values.
* optionally: asserting no value is overridden once it has been applied
  somewhere (e.g. do not let a new repo be initialized with different global
  cache path than an already existing one).

�)�absolute_import)�unicode_literals)�PRIO_DEFAULT�PRIO_MAINCONFIG�PRIO_AUTOMATICCONFIG)�PRIO_REPOCONFIG�PRIO_PLUGINDEFAULT�PRIO_PLUGINCONFIG)�PRIO_COMMANDLINE�PRIO_RUNTIME)�
BaseConfig�MainConf�RepoConfN)�__doc__Z
__future__rrZdnf.conf.configrrrrrr	r
rrr
rZConf�rr�/usr/lib/python3.6/__init__.py�<module>#ssite-packages/dnf/conf/__pycache__/substitutions.cpython-36.pyc000064400000003552147511334650020573 0ustar003

i�-ej
�@sLddlZddlZddlZddlmZejd�Zejd�ZGdd�de	�Z
dS)�N)�_z^DNF_VAR_[A-Za-z0-9_]+$Zdnfcs.eZdZ�fdd�Zdd�Zd
dd�Z�ZS)�
Substitutionscstt|�j�|j�dS)N)�superr�__init__�_update_from_env)�self)�	__class__��#/usr/lib/python3.6/substitutions.pyr"szSubstitutions.__init__cCs\dd�tdd�D�}xBtjj�D]4\}}tj|�rD|||dd�<q ||kr |||<q WdS)NcSsg|]}d|�qS)zDNF%dr	)�.0Znumr	r	r
�
<listcomp>'sz2Substitutions._update_from_env.<locals>.<listcomp>r�
�)�range�os�environ�items�ENVIRONMENT_VARS_RE�match)rZnumericvars�key�valr	r	r
r&s
zSubstitutions._update_from_env�/etc/yum/vars/�/etc/dnf/vars/cCs�x|D�]�}g}y"tjj||jd��}tj|�}Wntk
rJwYnXx�|D]�}tjj||�}d}tjj|�r�y<t|��}	|	j�}WdQRX|r�|ddkr�|dd�}Wn@tt	t
fk
r�}
ztjt
d�j||
��wRWYdd}
~
XnX|dk	rR|||<qRWqWdS)N�/��
z2Error when parsing a variable from file '{0}': {1}���r)r�path�join�lstrip�listdir�OSError�isfile�open�readline�IOError�UnicodeDecodeError�loggerZwarningr�format)rZinstallrootZvarsdirZ	vars_pathZfsvarsZ
dir_fsvarsZfsvar�filepathr�fp�er	r	r
�update_from_etc.s*

zSubstitutions.update_from_etc�rr)r-)�__name__�
__module__�__qualname__rrr,�
__classcell__r	r	)rr
rsr)Zloggingr�reZdnf.i18nr�compilerZ	getLoggerr'�dictrr	r	r	r
�<module>s

site-packages/dnf/conf/__pycache__/config.cpython-36.pyc000064400000035047147511334650017105 0ustar003

�ft`�O�@s<ddlmZddlmZddlmZddlmZmZddlm	Z	m
Z
ddlZddlZ
ddlZ
ddlZ
ddlZ
ddlZ
ddlZddlZddlZddlZddlZddlZejjjZejjjZejjjZejjj Z!ejjj"Z#ejjj$Z%ejjj&Z'ejjj(Z)ejjj*Z+ej,d�Z-Gdd	�d	e.�Z/Gd
d�de/�Z0Gdd
�d
e/�Z1dS)�)�absolute_import)�unicode_literals)�misc)�ucd�_)�
basestring�urlparseN�dnfcs~eZdZdZddd�Zdd�Z�fdd�Zd	d
�Zdd�Zd
d�Z	dd�Z
efdd�Ze
fdd�Zdd�Zedd��Z�ZS)�
BaseConfigzlBase class for storing configuration definitions.

       Subclass when creating your own definitions.

    NcCs||jd<||_dS)N�_config)�__dict__�_section)�self�config�section�parser�r�/usr/lib/python3.6/config.py�__init__<s
zBaseConfig.__init__cCszd|jkrtdj|j|���t|j|�}|dkr4dSy|�j�}Wn tk
rb}zdSd}~XnXt|t	�rvt
|�S|S)Nrz!'{}' object has no attribute '{}')r�AttributeError�format�	__class__�getattrr�getValue�	Exception�
isinstance�strr)r�name�option�valueZexrrr�__getattr__@s

zBaseConfig.__getattr__cs:t|j|d�}|dkr(tt|�j||�S|j||t�dS)N)rr�superr
�__setattr__�
_set_value�PRIO_RUNTIME)rrrr)rrrr"NszBaseConfig.__setattr__c
Cstg}|jd|j�|jrjxN|jj�D]@}y|jj�}Wntk
rPd}YnX|jd|j|f�q&Wdj|�S)Nz[%s]�z%s: %s�
)	�appendr
r�optBinds�second�getValueString�RuntimeError�first�join)r�out�optBindrrrr�__str__Us
zBaseConfig.__str__cCst|j|d�}|dk	S)N)rr)rr�methodrrr�_has_optionaszBaseConfig._has_optioncCs$t|j|d�}|dkrdS|�j�S)N)rrr)rrr1rrr�
_get_valueeszBaseConfig._get_valuecCs$t|j|d�}|dkrdS|�j�S)N)rrZgetPriority)rrr1rrr�
_get_prioritykszBaseConfig._get_prioritycCst|j|d�}|dkr&td|d��|�}|dkr\y|j||�Wntk
rXYnXn�yrt|t�srt|t�r�|j|tjj	|��nDt|tjj
�s�t|tjj�r�t|t�r�|j|t
|��n|j||�WnHtk
�r}z*tjjtd�|t|�ft|�d��WYdd}~XnXdS)zSSet option's value if priority is equal or higher
           than current priority.NzOption "z" does not existszError parsing '%s': %s)�	raw_error)rrr�setr�list�tuple�libdnf�conf�VectorStringZ
OptionBoolZOptionChildBool�int�boolr+r	�
exceptions�ConfigErrorrr)rrr�priorityr1r�errrr#qs*
zBaseConfig._set_valuecCs�|j|�r�x�|j|�D]�}|j||�}|s4|dkr8d}t|j|�r�y|jj�j|�j||�Wq�tk
r�}z,t	j
td�t|�t|�t|�t
|��WYdd}~Xq�Xq|dkr�t||�r�t|||�qt	jtd�t|�t|�t|��qWdS)z+Set option values from an INI file section.�Noner%z,Invalid configuration value: %s=%s in %s; %sN�archz+Unknown configuration option: %s = %s in %s)�
hasSectionZoptionsZgetSubstitutedValue�hasattrrr(�at�	newStringr+�logger�errorrrr�setattr�debug)rrr�filenamer@rrrArrr�	_populate�s 

0zBaseConfig._populatecCshd|jg}|jrZxF|jj�D]8}y|jd|j|jj�f�Wqtk
rTYqXqWdj|�dS)z]Return a string representing the values of all the
           configuration options.
        z[%s]z%s = %sr&)	r
rr(r'r,r)r*r+r-)r�outputr/rrr�dump�s
zBaseConfig.dumpcCs�tjj�}|j|�|j|�sHx(|j�D]}tjjj||�|kr(|}q(Wx6|j�D]*\}}t|t	�rndj
|�}|j|||�qRW|j|d�dS)z�
        filename   - name of config file (.conf or .repo)
        section_id - id of modified section (e.g. main, fedora, updates)
        substitutions - instance of base.conf.substitutions
        modify     - dict of modified options
        � FN)
r9r:�ConfigParser�readrDZgetData�
substitute�itemsrr7r-ZsetValue�write)rLZ
section_id�
substitutionsZmodifyrZsectrrrrr�write_raw_configfile�s	




zBaseConfig.write_raw_configfile)NNN)�__name__�
__module__�__qualname__�__doc__rr r"r0r2r3r4r$r#�PRIO_DEFAULTrMrO�staticmethodrW�
__classcell__rr)rrr
5s
r
cs�eZdZdZd%�fdd�	Zdd�Zedd	��Zd
d�Zdd
�Z	dd�Z
dd�Zdd�Zdd�Z
dd�Zedd��Zejdd��Zedd��Zejdd��Zedd��Zejd d��Zdefd!d"�Zed#d$��Z�ZS)&�MainConfz?Configuration option definitions for dnf.conf's [main] section.�mainNcstjj�}tt|�j|||�|jdtjj	gt
�|jdtjjgt
�tjjj
�|_tj�|_|jj�jt
tjj�tjj�r�tjj}d}nVytj�}}WnDttfk
r�}z$td�jt|��}tjj|��WYdd}~XnX|jj �jt
|�|jj!�jt
|�g|_"dS)NZ
pluginpathZpluginconfpathz/var/logzCould not set cachedir: {})#r9r:�
ConfigMainr!r_rr#r	�constZ
PLUGINPATHr\ZPLUGINCONFPATHrVZ
Substitutions�hawkeyZdetect_archrCrZsystem_cachedirr6ZSYSTEM_CACHEDIR�utilZ	am_i_rootrZgetCacheDir�IOError�OSErrorrrrr>�Error�cachedir�logdir�	tempfiles)rrrrrhrirA�msg)rrrr�s$


zMainConf.__init__cCsx|jD]}tj|�qWdS)N)rj�os�unlink)r�	file_namerrr�__del__�szMainConf.__del__cCsLd}x$|jd�D]}tjj|�r|}PqW|sH|jd�d}tjj|�|S)zReturns the value of reposdirNZreposdirr)r3rl�path�existsr	rdZ
ensure_dir)rZ	myrepodirZrdirrrr�get_reposdir�szMainConf.get_reposdirc	Cs�|j|�}|j|�}t|t�r�tj|�}|ddkrF|j||j|�n�tjj	�}t
jdd�\}}|jj
|�zdy|jd||�Wn>tk
r�}z"tjjtd�j|t|����WYdd}~XnX|j|||�Wdtj|�XdS)	z�
        In case the option value is a remote URL, download it to the temporary location
        and use this temporary file instead.
        r�filer%zdnf-downloaded-config-)�prefixNz9Configuration file URL "{}" could not be downloaded:
  {})rsr%)r4r3rrrr#rpr9ZrepoZ
Downloader�tempfileZmkstemprjr'ZdownloadURLr+r	r>r?rrrrl�close)	r�optname�prio�val�locationZ
downloaderZtemp_fdZ	temp_pathrArrr�_check_remote_file�s"




 zMainConf._check_remote_filecs��jd���dkr�S�j|�}|tkr,�S�j|�}t|t�s�t�fdd�|D��r��j|tjj	�fdd�|D��|��Sn4t
jjt
jj
�|jd���r��j|�j|�|��SdS)z�
        Return root used as prefix for option (installroot or "/"). When specified from commandline
        it returns value from conf.installroot
        �installroot�/c3s*|]"}tjjtjj�|jd���VqdS)r}N)rlrprqr-�lstrip)�.0�p)r|rr�	<genexpr>*sz6MainConf._search_inside_installroot.<locals>.<genexpr>csg|]}�j|��qSr)�_prepend_installroot_path)rr�)rrr�
<listcomp>-sz7MainConf._search_inside_installroot.<locals>.<listcomp>)r3r4�PRIO_COMMANDLINErr�anyr#r9r:r;rlrprqr-r~r�)rrwrxryr)r|rr�_search_inside_installroots$



z#MainConf._search_inside_installrootcCs,|j|�}|j|j|��}|j|||�dS)N)r4r�r3r#)rrwrx�new_pathrrr�prepend_installroot6s
zMainConf.prepend_installrootcCs,tjj|jd�|jd��}tjjj||j	�S)Nr|r})
rlrpr-r3r~r9r:rQrSrV)rrpZ	root_pathrrrr�<sz"MainConf._prepend_installroot_pathcCs`ddddddddd	d
dddd
dddddddddddddg}�x|D�]}t||d�}|dk	ob|gkrB|j|��r$d}|jr�y|jj�j|�j�}Wntk
r�YnX|�rtjj	}||j
|�kr�|j
|�}xR|D]6}|r�|j||j|�|g|�q�|j|gtjj	�q�Wn|j||tjj	�qBt
||��r>t|||�qBtjtd�t|�t|��qBWt|dd�dk�r�|jddtjj	�t
|d��r\x�|jj�D]�\}}x�|D]�}	t
|j|��r"y|jj�j|�jt	|	�WnJtk
�r}
z,tjjtd �||	t|
�ft|
�d!��WYdd}
~
XnXn.t
||��r<t|||	�ntd"�}tj||��q�W�q�WdS)#z%Configure parts of CLI from the opts Zplugins�version�config_file_path�
debuglevelZ
errorlevelr|ZbestZ	assumeyesZassumenoZclean_requirements_on_remove�gpgcheckZshowdupesfromreposZ
ip_resolveZrpmverbosityZdisable_excludesZcolorZdownloadonly�exclude�excludepkgs�skip_brokenZtsflagsrC�basearchZ
ignorearchZ	cacheonlyZcommentNFz%Unknown configuration option: %s = %sZlocalpkg_gpgcheck�main_setoptsz4Error parsing --setopt with key '%s', value '%s': %s)r5z1Main config did not have a %s attr. before setopt)rr2rr(rFZgetAddValuer+r	r:r�r4r#r3rErJrH�warningrrr�rTrGr>r?r)r�optsZconfig_argsrrZappendValueZadd_priority�item�valuesryrArkrrr�_configure_from_options@s\




.z MainConf._configure_from_optionscCsPd}|dk	rL|gkrL|j|�r2|j||tjj�ntjtd�t|�t|��dS)Nr�z%Unknown configuration option: %s = %s)	r2r#r	r:r�rHr�rr)rZpkgsrrrr�exclude_pkgss

zMainConf.exclude_pkgscCs(|jd�}|r$|jd||jd��dS)z Adjust conf options interactionsr��strictN)r3r#r4)rZskip_broken_valrrr�_adjust_conf_options�s
zMainConf._adjust_conf_optionscCs|jjd�S)N�
releasever)rV�get)rrrrr��szMainConf.releasevercCs,|dkr|jjdd�dSt|�|jd<dS)Nr�)rV�popr)rryrrrr��scCs|jjd�S)NrC)rVr�)rrrrrC�sz
MainConf.archcCsb|dkr|jjdd�dS|tjjj�krFtd�}tjj|j	d|���||jd<tjj
|�|_
dS)NrCzIncorrect or unknown "{}": {})rVr�r	�rpm�
_BASEARCH_MAP�keysrr>rgrr�)rryrkrrrrC�s
cCs|jjd�S)Nr�)rVr�)rrrrr��szMainConf.basearchcCsT|dkr|jjdd�dS|tjjj�krFtd�}tjj|j	d|���||jd<dS)Nr�zIncorrect or unknown "{}": {})
rVr�r	r�r�r�rr>rgr)rryrkrrrr��scCs�|dkr|jd�}tjj�}y|j|�Wndtk
rd}ztjjt	d�||f��WYdd}~Xn,t
k
r�}ztj|�WYdd}~XnX|j
||j||�|jd||�dS)Nr�zParsing file "%s" failed: %s)r3r9r:rQrRr+r	r>r?rrerHr�rMr
r#)rrLr@rrArrrrR�s

(z
MainConf.readcCs|jd�tjjkS)Nr�)r3r	rbZ
VERBOSE_LEVEL)rrrr�verbose�szMainConf.verbose)r`N)rXrYrZr[rro�propertyrrr{r�r�r�r�r�r�r��setterrCr�r\rRr�r^rr)rrr_�s&?
r_cs*eZdZdZd�fdd�	Zdd�Z�ZS)�RepoConfz4Option definitions for repository INI file sections.NcsP|r
|jntjj�}tt|�jtjj|�||�||_|rL|jj	�j
t|�dS)N)rr9r:rar!r�rZ
ConfigRepoZ_mainConfigRefHolderrr6r\)r�parentrrZ
mainConfig)rrrr�s
zRepoConf.__init__cCst|dd�dkr0xd	D]}|j|dtjj�qWt|di�}x�|j�D]�\}}tj|j|�s^qFx�|j�D]�\}}x�|D]�}t|j	|�r�y|j	j
�j|�jt|�WnLt
k
r�}	z0tjjtd�|j||t|	�ft|	�d��WYdd}	~	XnXqvtd�}
tj|
|j|�qvWqhWqFWdS)
zConfigure repos from the opts. r�NF�
repo_gpgcheck�repo_setoptsz7Error parsing --setopt with key '%s.%s', value '%s': %s)r5z-Repo %s did not have a %s attr. before setopt)r�r�)rr#r	r:r�rT�fnmatchr
rErr(rFrGr+r>r?rrrHr�)rr�rwr�ZrepoidZsetoptsrr�ryrArkrrrr��s$

2z RepoConf._configure_from_options)NN)rXrYrZr[rr�r^rr)rrr��s	r�)2Z
__future__rrZdnf.yumrZdnf.i18nrrZ
dnf.pycomprrr�Zdnf.conf.substitutionsr	Z	dnf.constZdnf.exceptionsZdnf.utilrcZloggingrlZlibdnf.confr9Zlibdnf.reporur:ZOptionZPriority_EMPTYZ
PRIO_EMPTYZPriority_DEFAULTr\ZPriority_MAINCONFIGZPRIO_MAINCONFIGZPriority_AUTOMATICCONFIGZPRIO_AUTOMATICCONFIGZPriority_REPOCONFIGZPRIO_REPOCONFIGZPriority_PLUGINDEFAULTZPRIO_PLUGINDEFAULTZPriority_PLUGINCONFIGZPRIO_PLUGINCONFIGZPriority_COMMANDLINEr�ZPriority_RUNTIMEr$Z	getLoggerrH�objectr
r_r�rrrr�<module>s@









site-packages/dnf/conf/__pycache__/read.cpython-36.pyc000064400000006411147511334650016544 0ustar003

�ft`��@s~ddlmZddlmZddlmZmZddlZddlZ	ddl
ZddlZddlZddl
Z
ddlZe
jd�ZGdd�de�ZdS)�)�absolute_import)�unicode_literals)�_�ucdN�dnfc@s,eZdZdd�Zdd�Zdd�Zdd�Zd	S)
�
RepoReadercCs||_||_dS)N)�conf�opts)�selfrr	�r�/usr/lib/python3.6/read.py�__init__$szRepoReader.__init__ccs�x|j|jj�D]
}|VqWg}x8|jjD],}x&tjtjj|d��D]}|j|�qFWq,W|j	dd�d�xT|D]L}yx|j|�D]
}|Vq�WWqrt
jjk
r�t
jtd�|�YqrXqrWdS)Nz*.repocSstjj|dd��S)N����)r�utilZ
split_path)�xrrr�<lambda>5sz%RepoReader.__iter__.<locals>.<lambda>)�keyz'Warning: failed loading '%s', skipping.)�
_get_reposrZconfig_file_path�reposdir�glob�os�path�join�append�sortr�
exceptions�ConfigError�logger�warningr)r
�rZrepo_configsrr�repofnrrr�__iter__(s


zRepoReader.__iter__c	Cs^tjjj||jj�}tjj|�}|dk	rl||krJtd�j	|||||�}ntd�j	||||�}tj
j|��tjj||j�}y|j
|||tjj�WnZtk
r�}z>||kr�td�j	|||�}ntd�j	||�}tj
j|��WYdd}~XnX|jd�tjjk�r8||k�r td�j	||�}ntd�j	|�}tj|�t|j�|_|jj|jj�||_|S)	z)Build a repository using the parsed data.Nz&Bad id for repo: {} ({}), byte = {} {}z!Bad id for repo: {}, byte = {} {}z.Repository '{}' ({}): Error parsing config: {}z)Repository '{}': Error parsing config: {}�namez@Repository '{}' ({}) is missing name in configuration, using id.z;Repository '{}' is missing name in configuration, using id.)�libdnfr�ConfigParserZ
substitute�
substitutionsr�repoZrepo_id_invalidr�formatrrZRepoZ	_populateZPRIO_REPOCONFIG�
ValueErrorZ
_get_priorityZPRIO_DEFAULTrrrr#Z_substitutions�updateZcfg)	r
�parserZid_r!Zsubstituted_idZinvalid�msgr'�errr�_build_repo?s8




zRepoReader._build_repoccs|jj}tjj�}|j|�y|j|�Wndtk
rd}ztjj	t
d�j||���WYdd}~Xn,tk
r�}zt
j|�WYdd}~XnXx�|j�D]x}|dkr�q�y|j|t|�|�}Wn:tjjtjj	fk
r�}zt
j|�w�WYdd}~XnX||_|j|j�|Vq�WdS)z4Parse and yield all repositories from a config file.zParsing file "{}" failed: {}N�main)rr&r$r%ZsetSubstitutions�read�RuntimeErrorrrrrr(�IOErrorrrZgetDatar.rZ	RepoErrorZrepofileZ_configure_from_optionsr	)r
r!Zsubstsr+r-ZsectionZthisreporrrrhs(

(
zRepoReader._get_reposN)�__name__�
__module__�__qualname__r
r"r.rrrrrr#s)r)Z
__future__rrZdnf.i18nrrZdnf.confrZlibdnf.confr$Zdnf.exceptionsZdnf.reporZloggingrZ	getLoggerr�objectrrrrr�<module>s
site-packages/dnf/util.py000064400000047627147511334650011413 0ustar00# util.py
# Basic dnf utils.
#
# Copyright (C) 2012-2016 Red Hat, Inc.
#
# This copyrighted material is made available to anyone wishing to use,
# modify, copy, or redistribute it subject to the terms and conditions of
# the GNU General Public License v.2, or (at your option) any later version.
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY expressed or implied, including the implied warranties of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General
# Public License for more details.  You should have received a copy of the
# GNU General Public License along with this program; if not, write to the
# Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
# 02110-1301, USA.  Any Red Hat trademarks that are incorporated in the
# source code or documentation are not subject to the GNU General Public
# License and may only be used or replicated with the express permission of
# Red Hat, Inc.
#

from __future__ import print_function
from __future__ import absolute_import
from __future__ import unicode_literals

from .pycomp import PY3, basestring
from dnf.i18n import _, ucd
import argparse
import dnf
import dnf.callback
import dnf.const
import dnf.pycomp
import errno
import functools
import hawkey
import itertools
import locale
import logging
import os
import pwd
import shutil
import sys
import tempfile
import time
import libdnf.repo
import libdnf.transaction

logger = logging.getLogger('dnf')

MAIN_PROG = argparse.ArgumentParser().prog if argparse.ArgumentParser().prog == "yum" else "dnf"
MAIN_PROG_UPPER = MAIN_PROG.upper()

"""DNF Utilities."""


def _parse_specs(namespace, values):
    """
    Categorize :param values list into packages, groups and filenames

    :param namespace: argparse.Namespace, where specs will be stored
    :param values: list of specs, whether packages ('foo') or groups/modules ('@bar')
                   or filenames ('*.rpm', 'http://*', ...)

    To access packages use: specs.pkg_specs,
    to access groups use: specs.grp_specs,
    to access filenames use: specs.filenames
    """

    setattr(namespace, "filenames", [])
    setattr(namespace, "grp_specs", [])
    setattr(namespace, "pkg_specs", [])
    tmp_set = set()
    for value in values:
        if value in tmp_set:
            continue
        tmp_set.add(value)
        schemes = dnf.pycomp.urlparse.urlparse(value)[0]
        if value.endswith('.rpm'):
            namespace.filenames.append(value)
        elif schemes and schemes in ('http', 'ftp', 'file', 'https'):
            namespace.filenames.append(value)
        elif value.startswith('@'):
            namespace.grp_specs.append(value[1:])
        else:
            namespace.pkg_specs.append(value)


def _urlopen_progress(url, conf, progress=None):
    if progress is None:
        progress = dnf.callback.NullDownloadProgress()
    pload = dnf.repo.RemoteRPMPayload(url, conf, progress)
    if os.path.exists(pload.local_path):
        return pload.local_path
    est_remote_size = sum([pload.download_size])
    progress.start(1, est_remote_size)
    targets = [pload._librepo_target()]
    try:
        libdnf.repo.PackageTarget.downloadPackages(libdnf.repo.VectorPPackageTarget(targets), True)
    except RuntimeError as e:
        if conf.strict:
            raise IOError(str(e))
        logger.error(str(e))
    return pload.local_path

def _urlopen(url, conf=None, repo=None, mode='w+b', **kwargs):
    """
    Open the specified absolute url, return a file object
    which respects proxy setting even for non-repo downloads
    """
    if PY3 and 'b' not in mode:
        kwargs.setdefault('encoding', 'utf-8')
    fo = tempfile.NamedTemporaryFile(mode, **kwargs)

    try:
        if repo:
            repo._repo.downloadUrl(url, fo.fileno())
        else:
            libdnf.repo.Downloader.downloadURL(conf._config if conf else None, url, fo.fileno())
    except RuntimeError as e:
        raise IOError(str(e))

    fo.seek(0)
    return fo

def rtrim(s, r):
    if s.endswith(r):
        s = s[:-len(r)]
    return s


def am_i_root():
    # used by ansible (lib/ansible/modules/packaging/os/dnf.py)
    return os.geteuid() == 0

def clear_dir(path):
    """Remove all files and dirs under `path`

    Also see rm_rf()

    """
    for entry in os.listdir(path):
        contained_path = os.path.join(path, entry)
        rm_rf(contained_path)

def ensure_dir(dname):
    # used by ansible (lib/ansible/modules/packaging/os/dnf.py)
    try:
        os.makedirs(dname, mode=0o755)
    except OSError as e:
        if e.errno != errno.EEXIST or not os.path.isdir(dname):
            raise e


def split_path(path):
    """
    Split path by path separators.
    Use os.path.join() to join the path back to string.
    """
    result = []

    head = path
    while True:
        head, tail = os.path.split(head)
        if not tail:
            if head or not result:
                # if not result: make sure result is [""] so os.path.join(*result) can be called
                result.insert(0, head)
            break
        result.insert(0, tail)

    return result
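
# A minimal sketch of how split_path() round-trips with os.path.join()
# (hypothetical paths):
#   split_path("/etc/dnf/dnf.conf")   -> ['/', 'etc', 'dnf', 'dnf.conf']
#   split_path("repos.d/fedora.repo") -> ['repos.d', 'fedora.repo']
#   os.path.join(*split_path("/etc/dnf/dnf.conf")) == "/etc/dnf/dnf.conf"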


def empty(iterable):
    try:
        l = len(iterable)
    except TypeError:
        l = len(list(iterable))
    return l == 0

def first(iterable):
    """Returns the first item from an iterable or None if it has no elements."""
    it = iter(iterable)
    try:
        return next(it)
    except StopIteration:
        return None


def first_not_none(iterable):
    it = iter(iterable)
    try:
        return next(item for item in it if item is not None)
    except StopIteration:
        return None


def file_age(fn):
    return time.time() - file_timestamp(fn)

def file_timestamp(fn):
    return os.stat(fn).st_mtime

def get_effective_login():
    try:
        return pwd.getpwuid(os.geteuid())[0]
    except KeyError:
        return "UID: %s" % os.geteuid()

def get_in(dct, keys, not_found):
    """Like dict.get() for nested dicts."""
    for k in keys:
        dct = dct.get(k)
        if dct is None:
            return not_found
    return dct

def group_by_filter(fn, iterable):
    def splitter(acc, item):
        acc[not bool(fn(item))].append(item)
        return acc
    return functools.reduce(splitter, iterable, ([], []))
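
# Illustrative call (hypothetical data): items for which the predicate is truthy
# end up in the first list, the rest in the second:
#   group_by_filter(lambda n: n % 2, [1, 2, 3, 4])  ->  ([1, 3], [2, 4])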

def insert_if(item, iterable, condition):
    """Insert an item into an iterable by a condition."""
    for original_item in iterable:
        if condition(original_item):
            yield item
        yield original_item

def is_exhausted(iterator):
    """Test whether an iterator is exhausted."""
    try:
        next(iterator)
    except StopIteration:
        return True
    else:
        return False

def is_glob_pattern(pattern):
    if is_string_type(pattern):
        pattern = [pattern]
    return (isinstance(pattern, list) and any(set(p) & set("*[?") for p in pattern))

def is_string_type(obj):
    if PY3:
        return isinstance(obj, str)
    else:
        return isinstance(obj, basestring)

def lazyattr(attrname):
    """Decorator to get lazy attribute initialization.

    Composes with @property. Force reinitialization by deleting the <attrname>.
    """
    def get_decorated(fn):
        def cached_getter(obj):
            try:
                return getattr(obj, attrname)
            except AttributeError:
                val = fn(obj)
                setattr(obj, attrname, val)
                return val
        return cached_getter
    return get_decorated
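
# Minimal usage sketch (hypothetical class; the attribute name "_md" and
# expensive_load() are illustrative):
#
#   class Metadata(object):
#       @property
#       @lazyattr("_md")
#       def md(self):
#           return expensive_load()   # evaluated only on first access
#
# Deleting the cached attribute (del obj._md) forces re-initialization on the
# next access.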


def mapall(fn, *seq):
    """Like functools.map(), but return a list instead of an iterator.

    This means all side effects of fn take place even without iterating the
    result.

    """
    return list(map(fn, *seq))

def normalize_time(timestamp):
    """Convert time into locale aware datetime string object."""
    t = time.strftime("%c", time.localtime(timestamp))
    if not dnf.pycomp.PY3:
        current_locale_setting = locale.getlocale()[1]
        if current_locale_setting:
            t = t.decode(current_locale_setting)
    return t

def on_ac_power():
    """Decide whether we are on line power.

    Returns True if we are on line power, False if not, None if it can not be
    decided.

    """
    try:
        ps_folder = "/sys/class/power_supply"
        ac_nodes = [node for node in os.listdir(ps_folder) if node.startswith("AC")]
        if len(ac_nodes) > 0:
            ac_node = ac_nodes[0]
            with open("{}/{}/online".format(ps_folder, ac_node)) as ac_status:
                data = ac_status.read()
                return int(data) == 1
        return None
    except (IOError, ValueError):
        return None


def on_metered_connection():
    """Decide whether we are on metered connection.

    Returns:
      True: if on metered connection
      False: if not
      None: if it can not be decided
    """
    try:
        import dbus
    except ImportError:
        return None
    try:
        bus = dbus.SystemBus()
        proxy = bus.get_object("org.freedesktop.NetworkManager",
                               "/org/freedesktop/NetworkManager")
        iface = dbus.Interface(proxy, "org.freedesktop.DBus.Properties")
        metered = iface.Get("org.freedesktop.NetworkManager", "Metered")
    except dbus.DBusException:
        return None
    if metered == 0: # NM_METERED_UNKNOWN
        return None
    elif metered in (1, 3): # NM_METERED_YES, NM_METERED_GUESS_YES
        return True
    elif metered in (2, 4): # NM_METERED_NO, NM_METERED_GUESS_NO
        return False
    else: # Something undocumented (at least at this moment)
        raise ValueError("Unknown value for metered property: %r", metered)

def partition(pred, iterable):
    """Use a predicate to partition entries into false entries and true entries.

    Credit: Python library itertools' documentation.

    """
    t1, t2 = itertools.tee(iterable)
    return dnf.pycomp.filterfalse(pred, t1), filter(pred, t2)
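
# Illustrative call (hypothetical data): returns two iterators, false entries
# first, true entries second:
#   falses, trues = partition(lambda n: n > 2, [1, 2, 3, 4])
#   list(falses) == [1, 2]; list(trues) == [3, 4]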

def rm_rf(path):
    try:
        shutil.rmtree(path)
    except OSError:
        pass

def split_by(iterable, condition):
    """Split an iterable into tuples by a condition.

    Inserts a separator before each item which meets the condition and then
    cuts the iterable by these separators.

    """
    separator = object()  # A unique object.
    # Create a function returning tuple of objects before the separator.
    def next_subsequence(it):
        return tuple(itertools.takewhile(lambda e: e != separator, it))

    # Mark each place where the condition is met by the separator.
    marked = insert_if(separator, iterable, condition)

    # The 1st subsequence may be empty if the 1st item meets the condition.
    yield next_subsequence(marked)

    while True:
        subsequence = next_subsequence(marked)
        if not subsequence:
            break
        yield subsequence
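
# Illustrative call (hypothetical data): the iterable is cut before every element
# matching the condition; the first tuple may be empty (see docstring above):
#   list(split_by([1, 2, 0, 3, 0, 4], lambda x: x == 0))
#   ->  [(1, 2), (0, 3), (0, 4)]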

def strip_prefix(s, prefix):
    if s.startswith(prefix):
        return s[len(prefix):]
    return None


def touch(path, no_create=False):
    """Create an empty file if it doesn't exist or bump it's timestamps.

    If no_create is True only bumps the timestamps.
    """
    if no_create or os.access(path, os.F_OK):
        return os.utime(path, None)
    with open(path, 'a'):
        pass


def _terminal_messenger(tp='write', msg="", out=sys.stdout):
    try:
        if tp == 'write':
            out.write(msg)
        elif tp == 'flush':
            out.flush()
        elif tp == 'write_flush':
            out.write(msg)
            out.flush()
        elif tp == 'print':
            print(msg, file=out)
        else:
            raise ValueError('Unsupported type: ' + tp)
    except IOError as e:
        logger.critical('{}: {}'.format(type(e).__name__, ucd(e)))
        pass


def _format_resolve_problems(resolve_problems):
    """
    Format string about problems in resolve

    :param resolve_problems: list with list of strings (output of goal.problem_rules())
    :return: string
    """
    msg = ""
    count_problems = (len(resolve_problems) > 1)
    for i, rs in enumerate(resolve_problems, start=1):
        if count_problems:
            msg += "\n " + _("Problem") + " %d: " % i
        else:
            msg += "\n " + _("Problem") + ": "
        msg += "\n  - ".join(rs)
    return msg


def _te_nevra(te):
    nevra = te.N() + '-'
    if te.E() is not None and te.E() != '0':
        nevra += te.E() + ':'
    return nevra + te.V() + '-' + te.R() + '.' + te.A()
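
# Illustrative result (hypothetical transaction element): for N()='bash',
# E()='1', V()='5.0.17', R()='1.fc32', A()='x86_64' this yields
# 'bash-1:5.0.17-1.fc32.x86_64'; a missing or '0' epoch is omitted, giving
# 'bash-5.0.17-1.fc32.x86_64'.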


def _log_rpm_trans_with_swdb(rpm_transaction, swdb_transaction):
    logger.debug("Logging transaction elements")
    for rpm_el in rpm_transaction:
        tsi = rpm_el.Key()
        tsi_state = None
        if tsi is not None:
            tsi_state = tsi.state
        msg = "RPM element: '{}', Key(): '{}', Key state: '{}', Failed() '{}': ".format(
            _te_nevra(rpm_el), tsi, tsi_state, rpm_el.Failed())
        logger.debug(msg)
    for tsi in swdb_transaction:
        msg = "SWDB element: '{}', State: '{}', Action: '{}', From repo: '{}', Reason: '{}', " \
              "Get reason: '{}'".format(str(tsi), tsi.state, tsi.action, tsi.from_repo, tsi.reason,
                                        tsi.get_reason())
        logger.debug(msg)


def _sync_rpm_trans_with_swdb(rpm_transaction, swdb_transaction):
    revert_actions = {libdnf.transaction.TransactionItemAction_DOWNGRADED,
                      libdnf.transaction.TransactionItemAction_OBSOLETED,
                      libdnf.transaction.TransactionItemAction_REMOVE,
                      libdnf.transaction.TransactionItemAction_UPGRADED,
                      libdnf.transaction.TransactionItemAction_REINSTALLED}
    cached_tsi = [tsi for tsi in swdb_transaction]
    el_not_found = False
    error = False
    for rpm_el in rpm_transaction:
        te_nevra = _te_nevra(rpm_el)
        tsi = rpm_el.Key()
        if tsi is None or not hasattr(tsi, "pkg"):
            for tsi_candidate in cached_tsi:
                if tsi_candidate.state != libdnf.transaction.TransactionItemState_UNKNOWN:
                    continue
                if tsi_candidate.action not in revert_actions:
                    continue
                if str(tsi_candidate) == te_nevra:
                    tsi = tsi_candidate
                    break
        if tsi is None or not hasattr(tsi, "pkg"):
            logger.critical(_("TransactionItem not found for key: {}").format(te_nevra))
            el_not_found = True
            continue
        if rpm_el.Failed():
            tsi.state = libdnf.transaction.TransactionItemState_ERROR
            error = True
        else:
            tsi.state = libdnf.transaction.TransactionItemState_DONE
    for tsi in cached_tsi:
        if tsi.state == libdnf.transaction.TransactionItemState_UNKNOWN:
            logger.critical(_("TransactionSWDBItem not found for key: {}").format(str(tsi)))
            el_not_found = True
    if error:
        logger.debug(_('Errors occurred during transaction.'))
    if el_not_found:
        _log_rpm_trans_with_swdb(rpm_transaction, cached_tsi)


class tmpdir(object):
    # used by subscription-manager (src/dnf-plugins/product-id.py)
    def __init__(self):
        prefix = '%s-' % dnf.const.PREFIX
        self.path = tempfile.mkdtemp(prefix=prefix)

    def __enter__(self):
        return self.path

    def __exit__(self, exc_type, exc_value, traceback):
        rm_rf(self.path)

class Bunch(dict):
    """Dictionary with attribute accessing syntax.

    In DNF, prefer using this over dnf.yum.misc.GenericHolder.

    Credit: Alex Martelli, Doug Hudgeon

    """
    def __init__(self, *args, **kwds):
        super(Bunch, self).__init__(*args, **kwds)
        self.__dict__ = self

    def __hash__(self):
        return id(self)
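
# Illustrative usage (hypothetical keys): items are reachable both as keys and
# as attributes, and both views stay in sync:
#   b = Bunch(installed=[], upgraded=[])
#   b.installed.append("pkg")     # same list object as b["installed"]
#   b["upgraded"] is b.upgraded   # True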


class MultiCallList(list):
    def __init__(self, iterable):
        super(MultiCallList, self).__init__()
        self.extend(iterable)

    def __getattr__(self, what):
        def fn(*args, **kwargs):
            def call_what(v):
                method = getattr(v, what)
                return method(*args, **kwargs)
            return list(map(call_what, self))
        return fn

    def __setattr__(self, what, val):
        def setter(item):
            setattr(item, what, val)
        return list(map(setter, self))
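
# Illustrative usage (hypothetical repo objects): attribute access fans out to
# every element of the list:
#   repos = MultiCallList([repo_a, repo_b])
#   repos.disable()                     # calls repo_a.disable() and repo_b.disable()
#   repos.skip_if_unavailable = True    # sets the attribute on both items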


def _make_lists(transaction):
    b = Bunch({
        'downgraded': [],
        'erased': [],
        'erased_clean': [],
        'erased_dep': [],
        'installed': [],
        'installed_group': [],
        'installed_dep': [],
        'installed_weak': [],
        'reinstalled': [],
        'upgraded': [],
        'failed': [],
    })

    for tsi in transaction:
        if tsi.state == libdnf.transaction.TransactionItemState_ERROR:
            b.failed.append(tsi)
        elif tsi.action == libdnf.transaction.TransactionItemAction_DOWNGRADE:
            b.downgraded.append(tsi)
        elif tsi.action == libdnf.transaction.TransactionItemAction_INSTALL:
            if tsi.reason == libdnf.transaction.TransactionItemReason_GROUP:
                b.installed_group.append(tsi)
            elif tsi.reason == libdnf.transaction.TransactionItemReason_DEPENDENCY:
                b.installed_dep.append(tsi)
            elif tsi.reason == libdnf.transaction.TransactionItemReason_WEAK_DEPENDENCY:
                b.installed_weak.append(tsi)
            else:
                # TransactionItemReason_USER
                b.installed.append(tsi)
        elif tsi.action == libdnf.transaction.TransactionItemAction_REINSTALL:
            b.reinstalled.append(tsi)
        elif tsi.action == libdnf.transaction.TransactionItemAction_REMOVE:
            if tsi.reason == libdnf.transaction.TransactionItemReason_CLEAN:
                b.erased_clean.append(tsi)
            elif tsi.reason == libdnf.transaction.TransactionItemReason_DEPENDENCY:
                b.erased_dep.append(tsi)
            else:
                b.erased.append(tsi)
        elif tsi.action == libdnf.transaction.TransactionItemAction_UPGRADE:
            b.upgraded.append(tsi)

    return b


def _post_transaction_output(base, transaction, action_callback):
    """Returns a human-readable summary of the results of the
    transaction.

    :param action_callback: function generating output for a specific action. It
       takes two parameters: the action as a string and the list of packages
       affected by this action
    :return: a list of lines containing a human-readable summary of the
       results of the transaction
    """
    def _tsi_or_pkg_nevra_cmp(item1, item2):
        """Compares two transaction items or packages by nevra.
           Used as a fallback when the tsi does not contain a package object.
        """
        ret = (item1.name > item2.name) - (item1.name < item2.name)
        if ret != 0:
            return ret
        nevra1 = hawkey.NEVRA(name=item1.name, epoch=item1.epoch, version=item1.version,
                              release=item1.release, arch=item1.arch)
        nevra2 = hawkey.NEVRA(name=item2.name, epoch=item2.epoch, version=item2.version,
                              release=item2.release, arch=item2.arch)
        ret = nevra1.evr_cmp(nevra2, base.sack)
        if ret != 0:
            return ret
        return (item1.arch > item2.arch) - (item1.arch < item2.arch)

    list_bunch = dnf.util._make_lists(transaction)

    skipped_conflicts, skipped_broken = base._skipped_packages(
        report_problems=False, transaction=transaction)
    skipped = skipped_conflicts.union(skipped_broken)

    out = []
    for (action, tsis) in [(_('Upgraded'), list_bunch.upgraded),
                           (_('Downgraded'), list_bunch.downgraded),
                           (_('Installed'), list_bunch.installed +
                            list_bunch.installed_group +
                            list_bunch.installed_weak +
                            list_bunch.installed_dep),
                           (_('Reinstalled'), list_bunch.reinstalled),
                           (_('Skipped'), skipped),
                           (_('Removed'), list_bunch.erased +
                               list_bunch.erased_dep +
                               list_bunch.erased_clean),
                           (_('Failed'), list_bunch.failed)]:
        out.extend(action_callback(
            action, sorted(tsis, key=functools.cmp_to_key(_tsi_or_pkg_nevra_cmp))))

    return out
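
# Illustrative action_callback for _post_transaction_output() (hypothetical
# formatting): it receives the action label and the sorted items for that action
# and returns the lines to print:
#
#   def action_callback(action, tsis):
#       return ["%s: %s" % (action, tsi) for tsi in tsis]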
site-packages/dnf/__init__.py000064400000002555147511334650012164 0ustar00# __init__.py
# The toplevel DNF package.
#
# Copyright (C) 2012-2016 Red Hat, Inc.
#
# This copyrighted material is made available to anyone wishing to use,
# modify, copy, or redistribute it subject to the terms and conditions of
# the GNU General Public License v.2, or (at your option) any later version.
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY expressed or implied, including the implied warranties of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General
# Public License for more details.  You should have received a copy of the
# GNU General Public License along with this program; if not, write to the
# Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
# 02110-1301, USA.  Any Red Hat trademarks that are incorporated in the
# source code or documentation are not subject to the GNU General Public
# License and may only be used or replicated with the express permission of
# Red Hat, Inc.
#

from __future__ import unicode_literals
import warnings
import dnf.pycomp

warnings.filterwarnings('once', category=DeprecationWarning, module=r'^dnf\..*$')

from dnf.const import VERSION
__version__ = VERSION  # :api

import dnf.base
Base = dnf.base.Base # :api

import dnf.plugin
Plugin = dnf.plugin.Plugin # :api

# setup libraries
dnf.pycomp.urlparse.uses_fragment.append("media")
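
# Minimal sketch of the public entry point re-exported above (illustrative only;
# repository configuration is taken from the host system):
#
#   import dnf
#   with dnf.Base() as base:
#       base.read_all_repos()
#       base.fill_sack()
#       print(len(base.sack.query().installed()))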
site-packages/dnf/callback.py000064400000007214147511334650012156 0ustar00# callbacks.py
# Abstract interfaces to communicate progress on tasks.
#
# Copyright (C) 2014-2015  Red Hat, Inc.
#
# This copyrighted material is made available to anyone wishing to use,
# modify, copy, or redistribute it subject to the terms and conditions of
# the GNU General Public License v.2, or (at your option) any later version.
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY expressed or implied, including the implied warranties of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General
# Public License for more details.  You should have received a copy of the
# GNU General Public License along with this program; if not, write to the
# Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
# 02110-1301, USA.  Any Red Hat trademarks that are incorporated in the
# source code or documentation are not subject to the GNU General Public
# License and may only be used or replicated with the express permission of
# Red Hat, Inc.
#

from __future__ import unicode_literals
import dnf.yum.rpmtrans

import dnf.transaction

PKG_DOWNGRADE = dnf.transaction.PKG_DOWNGRADE  # :api
PKG_DOWNGRADED = dnf.transaction.PKG_DOWNGRADED  # :api
PKG_INSTALL = dnf.transaction.PKG_INSTALL  # :api
PKG_OBSOLETE = dnf.transaction.PKG_OBSOLETE  # :api
PKG_OBSOLETED = dnf.transaction.PKG_OBSOLETED  # :api
PKG_REINSTALL = dnf.transaction.PKG_REINSTALL  # :api
PKG_REINSTALLED = dnf.transaction.PKG_REINSTALLED  # :api
PKG_REMOVE = dnf.transaction.PKG_ERASE  # :api
PKG_ERASE = PKG_REMOVE  # deprecated, use PKG_REMOVE instead
PKG_UPGRADE = dnf.transaction.PKG_UPGRADE  # :api
PKG_UPGRADED = dnf.transaction.PKG_UPGRADED  # :api

PKG_CLEANUP = dnf.transaction.PKG_CLEANUP  # :api
PKG_VERIFY = dnf.transaction.PKG_VERIFY  # :api
PKG_SCRIPTLET = dnf.transaction.PKG_SCRIPTLET  # :api

TRANS_PREPARATION = dnf.transaction.TRANS_PREPARATION  # :api
TRANS_POST = dnf.transaction.TRANS_POST  # :api

STATUS_OK = None # :api
STATUS_FAILED = 1 # :api
STATUS_ALREADY_EXISTS = 2 # :api
STATUS_MIRROR = 3  # :api
STATUS_DRPM = 4    # :api


class KeyImport(object):
    def _confirm(self, id, userid, fingerprint, url, timestamp):
        """Ask the user if the key should be imported."""
        return False


class Payload(object):
    # :api

    def __init__(self, progress):
        self.progress = progress

    def __str__(self):
        """Nice, human-readable representation. :api"""
        pass

    @property
    def download_size(self):
        """Total size of the download. :api"""
        pass


class DownloadProgress(object):
    # :api

    def end(self, payload, status, msg):
        """Communicate the information that `payload` has finished downloading.

        :api, `status` is a constant denoting the type of outcome, `msg` is an
        error message in case the outcome was an error.

        """
        pass

    def message(self, msg):
        pass

    def progress(self, payload, done):
        """Update the progress display. :api

        `payload` is the payload this call reports progress for, `done` is how
        many bytes of this payload are already downloaded.

        """

        pass

    def start(self, total_files, total_size, total_drpms=0):
        """Start new progress metering. :api

        `total_files` the number of files that will be downloaded,
        `total_size` total size of all files.

        """

        pass


class NullDownloadProgress(DownloadProgress):
    pass


class Depsolve(object):
    def start(self):
        pass

    def pkg_added(self, pkg, mode):
        pass

    def end(self):
        pass


TransactionProgress = dnf.yum.rpmtrans.TransactionDisplay  # :api
site-packages/dnf/transaction_sr.py000064400000063141147511334650013454 0ustar00# Copyright (C) 2020 Red Hat, Inc.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU Library General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.

from __future__ import absolute_import
from __future__ import print_function
from __future__ import unicode_literals

import libdnf
import hawkey

from dnf.i18n import _
import dnf.exceptions

import json


VERSION_MAJOR = 0
VERSION_MINOR = 0
VERSION = "%s.%s" % (VERSION_MAJOR, VERSION_MINOR)
"""
The version of the stored transaction.

MAJOR version denotes backwards incompatible changes (old dnf won't work with
new transaction JSON).

MINOR version denotes extending the format without breaking backwards
compatibility (old dnf can work with new transaction JSON). Forwards
compatibility needs to be handled by being able to process the old format as
well as the new one.
"""


class TransactionError(dnf.exceptions.Error):
    def __init__(self, msg):
        super(TransactionError, self).__init__(msg)


class TransactionReplayError(dnf.exceptions.Error):
    def __init__(self, filename, errors):
        """
        :param filename: The name of the transaction file being replayed
        :param errors: a list of error classes or a string with an error description
        """

        # store args in case someone wants to read them from a caught exception
        self.filename = filename
        if isinstance(errors, (list, tuple)):
            self.errors = errors
        else:
            self.errors = [errors]

        if filename:
            msg = _('The following problems occurred while replaying the transaction from file "{filename}":').format(filename=filename)
        else:
            msg = _('The following problems occurred while running a transaction:')

        for error in self.errors:
            msg += "\n  " + str(error)

        super(TransactionReplayError, self).__init__(msg)


class IncompatibleTransactionVersionError(TransactionReplayError):
    def __init__(self, filename, msg):
        super(IncompatibleTransactionVersionError, self).__init__(filename, msg)


def _check_version(version, filename):
    major, minor = version.split('.')

    try:
        major = int(major)
    except ValueError as e:
        raise TransactionReplayError(
            filename,
            _('Invalid major version "{major}", number expected.').format(major=major)
        )

    try:
        int(minor)  # minor is unused, just check it's a number
    except ValueError as e:
        raise TransactionReplayError(
            filename,
            _('Invalid minor version "{minor}", number expected.').format(minor=minor)
        )

    if major != VERSION_MAJOR:
        raise IncompatibleTransactionVersionError(
            filename,
            _('Incompatible major version "{major}", supported major version is "{major_supp}".')
                .format(major=major, major_supp=VERSION_MAJOR)
        )


def serialize_transaction(transaction):
    """
    Serializes a transaction to a data structure that is equivalent to the stored JSON format.
    :param transaction: the transaction to serialize (an instance of dnf.db.history.TransactionWrapper)
    """

    data = {
        "version": VERSION,
    }
    rpms = []
    groups = []
    environments = []

    if transaction is None:
        return data

    for tsi in transaction.packages():
        if tsi.is_package():
            rpms.append({
                "action": tsi.action_name,
                "nevra": tsi.nevra,
                "reason": libdnf.transaction.TransactionItemReasonToString(tsi.reason),
                "repo_id": tsi.from_repo
            })

        elif tsi.is_group():
            group = tsi.get_group()

            group_data = {
                "action": tsi.action_name,
                "id": group.getGroupId(),
                "packages": [],
                "package_types": libdnf.transaction.compsPackageTypeToString(group.getPackageTypes())
            }

            for pkg in group.getPackages():
                group_data["packages"].append({
                    "name": pkg.getName(),
                    "installed": pkg.getInstalled(),
                    "package_type": libdnf.transaction.compsPackageTypeToString(pkg.getPackageType())
                })

            groups.append(group_data)

        elif tsi.is_environment():
            env = tsi.get_environment()

            env_data = {
                "action": tsi.action_name,
                "id": env.getEnvironmentId(),
                "groups": [],
                "package_types": libdnf.transaction.compsPackageTypeToString(env.getPackageTypes())
            }

            for grp in env.getGroups():
                env_data["groups"].append({
                    "id": grp.getGroupId(),
                    "installed": grp.getInstalled(),
                    "group_type": libdnf.transaction.compsPackageTypeToString(grp.getGroupType())
                })

            environments.append(env_data)

    if rpms:
        data["rpms"] = rpms

    if groups:
        data["groups"] = groups

    if environments:
        data["environments"] = environments

    return data
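
# Illustrative shape of the returned data (all values hypothetical; the "rpms",
# "groups" and "environments" keys are present only when non-empty):
#
#   {
#       "version": "0.0",
#       "rpms": [
#           {"action": "Install",
#            "nevra": "bash-0:5.0.17-1.fc32.x86_64",
#            "reason": "User",
#            "repo_id": "fedora"}
#       ]
#   }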


class TransactionReplay(object):
    """
    A class that encapsulates replaying a transaction. The transaction data are
    loaded and stored when the class is initialized. The transaction is run by
    calling the `run()` method. After the transaction is created (but before it is
    performed), the `post_transaction()` method needs to be called to verify that
    no extra packages were pulled in and also to fix the reasons.
    """

    def __init__(
        self,
        base,
        filename="",
        data=None,
        ignore_extras=False,
        ignore_installed=False,
        skip_unavailable=False
    ):
        """
        :param base: the dnf base
        :param filename: the filename to load the transaction from (conflicts with the 'data' argument)
        :param data: the dictionary to load the transaction from (conflicts with the 'filename' argument)
        :param ignore_extras: whether to ignore extra package pulled into the transaction
        :param ignore_installed: whether to ignore installed versions of packages
        :param skip_unavailable: whether to skip transaction packages that aren't available
        """

        self._base = base
        self._filename = filename
        self._ignore_installed = ignore_installed
        self._ignore_extras = ignore_extras
        self._skip_unavailable = skip_unavailable

        if not self._base.conf.strict:
            self._skip_unavailable = True

        self._nevra_cache = set()
        self._nevra_reason_cache = {}
        self._warnings = []

        if filename and data:
            raise ValueError(_("Conflicting TransactionReplay arguments have been specified: filename, data"))
        elif filename:
            self._load_from_file(filename)
        else:
            self._load_from_data(data)


    def _load_from_file(self, fn):
        self._filename = fn
        with open(fn, "r") as f:
            try:
                replay_data = json.load(f)
            except json.decoder.JSONDecodeError as e:
                raise TransactionReplayError(fn, str(e) + ".")

        try:
            self._load_from_data(replay_data)
        except TransactionError as e:
            raise TransactionReplayError(fn, e)

    def _load_from_data(self, data):
        self._replay_data = data
        self._verify_toplevel_json(self._replay_data)

        self._rpms = self._replay_data.get("rpms", [])
        self._assert_type(self._rpms, list, "rpms", "array")

        self._groups = self._replay_data.get("groups", [])
        self._assert_type(self._groups, list, "groups", "array")

        self._environments = self._replay_data.get("environments", [])
        self._assert_type(self._environments, list, "environments", "array")

    def _raise_or_warn(self, warn_only, msg):
        if warn_only:
            self._warnings.append(msg)
        else:
            raise TransactionError(msg)

    def _assert_type(self, value, t, id, expected):
        if not isinstance(value, t):
            raise TransactionError(_('Unexpected type of "{id}", {exp} expected.').format(id=id, exp=expected))

    def _verify_toplevel_json(self, replay_data):
        fn = self._filename

        if "version" not in replay_data:
            raise TransactionReplayError(fn, _('Missing key "{key}".').format(key="version"))

        self._assert_type(replay_data["version"], str, "version", "string")

        _check_version(replay_data["version"], fn)

    def _replay_pkg_action(self, pkg_data):
        try:
            action = pkg_data["action"]
            nevra = pkg_data["nevra"]
            repo_id = pkg_data["repo_id"]
            reason = libdnf.transaction.StringToTransactionItemReason(pkg_data["reason"])
        except KeyError as e:
            raise TransactionError(
                _('Missing object key "{key}" in an rpm.').format(key=e.args[0])
            )
        except IndexError as e:
            raise TransactionError(
                _('Unexpected value of package reason "{reason}" for rpm nevra "{nevra}".')
                    .format(reason=pkg_data["reason"], nevra=nevra)
            )

        subj = hawkey.Subject(nevra)
        parsed_nevras = subj.get_nevra_possibilities(forms=[hawkey.FORM_NEVRA])

        if len(parsed_nevras) != 1:
            raise TransactionError(_('Cannot parse NEVRA for package "{nevra}".').format(nevra=nevra))

        parsed_nevra = parsed_nevras[0]
        na = "%s.%s" % (parsed_nevra.name, parsed_nevra.arch)

        query_na = self._base.sack.query().filter(name=parsed_nevra.name, arch=parsed_nevra.arch)

        epoch = parsed_nevra.epoch if parsed_nevra.epoch is not None else 0
        query = query_na.filter(epoch=epoch, version=parsed_nevra.version, release=parsed_nevra.release)

        # In case the package is found in the same repo as in the original
        # transaction, limit the query to that plus installed packages. IOW
        # remove packages with the same NEVRA in case they are found in
        # multiple repos and the repo the package came from originally is one
        # of them.
        # This can e.g. make a difference in the system-upgrade plugin, in case
        # the same NEVRA is in two repos, this makes sure the same repo is used
        # for both download and upgrade steps of the plugin.
        if repo_id:
            query_repo = query.filter(reponame=repo_id)
            if query_repo:
                query = query_repo.union(query.installed())

        if not query:
            self._raise_or_warn(self._skip_unavailable, _('Cannot find rpm nevra "{nevra}".').format(nevra=nevra))
            return

        # a cache to check no extra packages were pulled into the transaction
        if action != "Reason Change":
            self._nevra_cache.add(nevra)

        # store reasons for forward actions and "Removed"; reasons for the rest
        # of the actions should stay as they were determined by the transaction
        if action in ("Install", "Upgrade", "Downgrade", "Reinstall", "Removed"):
            self._nevra_reason_cache[nevra] = reason

        if action in ("Install", "Upgrade", "Downgrade"):
            if action == "Install" and query_na.installed() and not self._base._get_installonly_query(query_na):
                self._raise_or_warn(self._ignore_installed,
                    _('Package "{na}" is already installed for action "{action}".').format(na=na, action=action))

            sltr = dnf.selector.Selector(self._base.sack).set(pkg=query)
            self._base.goal.install(select=sltr, optional=not self._base.conf.strict)
        elif action == "Reinstall":
            query = query.available()

            if not query:
                self._raise_or_warn(self._skip_unavailable,
                    _('Package nevra "{nevra}" not available in repositories for action "{action}".')
                    .format(nevra=nevra, action=action))
                return

            sltr = dnf.selector.Selector(self._base.sack).set(pkg=query)
            self._base.goal.install(select=sltr, optional=not self._base.conf.strict)
        elif action in ("Upgraded", "Downgraded", "Reinstalled", "Removed", "Obsoleted"):
            query = query.installed()

            if not query:
                self._raise_or_warn(self._ignore_installed,
                    _('Package nevra "{nevra}" not installed for action "{action}".').format(nevra=nevra, action=action))
                return

            # erasing the original version (the reverse part of an action like
            # e.g. upgrade) is more robust, but we can't do it if
            # skip_unavailable is True, because if the forward part of the
            # action is skipped, we would simply remove the package here
            if not self._skip_unavailable or action == "Removed":
                for pkg in query:
                    self._base.goal.erase(pkg, clean_deps=False)
        elif action == "Reason Change":
            self._base.history.set_reason(query[0], reason)
        else:
            raise TransactionError(
                _('Unexpected value of package action "{action}" for rpm nevra "{nevra}".')
                    .format(action=action, nevra=nevra)
            )

    def _create_swdb_group(self, group_id, pkg_types, pkgs):
        comps_group = self._base.comps._group_by_id(group_id)
        if not comps_group:
            self._raise_or_warn(self._skip_unavailable, _("Group id '%s' is not available.") % group_id)
            return None

        swdb_group = self._base.history.group.new(group_id, comps_group.name, comps_group.ui_name, pkg_types)
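
        # each entry in "pkgs" mirrors the serialized group packages above, e.g.
        # {"name": "bash", "installed": True, "package_type": "mandatory"}
        # (values are illustrative)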

        try:
            for pkg in pkgs:
                name = pkg["name"]
                self._assert_type(name, str, "groups.packages.name", "string")
                installed = pkg["installed"]
                self._assert_type(installed, bool, "groups.packages.installed", "boolean")
                package_type = pkg["package_type"]
                self._assert_type(package_type, str, "groups.packages.package_type", "string")

                try:
                    swdb_group.addPackage(name, installed, libdnf.transaction.stringToCompsPackageType(package_type))
                except libdnf.error.Error as e:
                    raise TransactionError(str(e))

        except KeyError as e:
            raise TransactionError(
                _('Missing object key "{key}" in groups.packages.').format(key=e.args[0])
            )

        return swdb_group

    def _swdb_group_install(self, group_id, pkg_types, pkgs):
        swdb_group = self._create_swdb_group(group_id, pkg_types, pkgs)

        if swdb_group is not None:
            self._base.history.group.install(swdb_group)

    def _swdb_group_upgrade(self, group_id, pkg_types, pkgs):
        if not self._base.history.group.get(group_id):
            self._raise_or_warn(self._ignore_installed, _("Group id '%s' is not installed.") % group_id)
            return

        swdb_group = self._create_swdb_group(group_id, pkg_types, pkgs)

        if swdb_group is not None:
            self._base.history.group.upgrade(swdb_group)

    def _swdb_group_downgrade(self, group_id, pkg_types, pkgs):
        if not self._base.history.group.get(group_id):
            self._raise_or_warn(self._ignore_installed, _("Group id '%s' is not installed.") % group_id)
            return

        swdb_group = self._create_swdb_group(group_id, pkg_types, pkgs)

        if swdb_group is not None:
            self._base.history.group.downgrade(swdb_group)

    def _swdb_group_remove(self, group_id, pkg_types, pkgs):
        if not self._base.history.group.get(group_id):
            self._raise_or_warn(self._ignore_installed, _("Group id '%s' is not installed.") % group_id)
            return

        swdb_group = self._create_swdb_group(group_id, pkg_types, pkgs)

        if swdb_group is not None:
            self._base.history.group.remove(swdb_group)

    def _create_swdb_environment(self, env_id, pkg_types, groups):
        comps_env = self._base.comps._environment_by_id(env_id)
        if not comps_env:
            self._raise_or_warn(self._skip_unavailable, _("Environment id '%s' is not available.") % env_id)
            return None

        swdb_env = self._base.history.env.new(env_id, comps_env.name, comps_env.ui_name, pkg_types)
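
        # each entry in "groups" mirrors the serialized environment groups above, e.g.
        # {"id": "core", "installed": True, "group_type": "mandatory"}
        # (values are illustrative)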

        try:
            for grp in groups:
                id = grp["id"]
                self._assert_type(id, str, "environments.groups.id", "string")
                installed = grp["installed"]
                self._assert_type(installed, bool, "environments.groups.installed", "boolean")
                group_type = grp["group_type"]
                self._assert_type(group_type, str, "environments.groups.group_type", "string")

                try:
                    group_type = libdnf.transaction.stringToCompsPackageType(group_type)
                except libdnf.error.Error as e:
                    raise TransactionError(str(e))

                if group_type not in (
                    libdnf.transaction.CompsPackageType_MANDATORY,
                    libdnf.transaction.CompsPackageType_OPTIONAL
                ):
                    raise TransactionError(
                        _('Invalid value "{group_type}" of environments.groups.group_type, '
                            'only "mandatory" or "optional" is supported.'
                        ).format(group_type=grp["group_type"])
                    )

                swdb_env.addGroup(id, installed, group_type)
        except KeyError as e:
            raise TransactionError(
                _('Missing object key "{key}" in environments.groups.').format(key=e.args[0])
            )

        return swdb_env

    def _swdb_environment_install(self, env_id, pkg_types, groups):
        swdb_env = self._create_swdb_environment(env_id, pkg_types, groups)

        if swdb_env is not None:
            self._base.history.env.install(swdb_env)

    def _swdb_environment_upgrade(self, env_id, pkg_types, groups):
        if not self._base.history.env.get(env_id):
            self._raise_or_warn(self._ignore_installed, _("Environment id '%s' is not installed.") % env_id)
            return

        swdb_env = self._create_swdb_environment(env_id, pkg_types, groups)

        if swdb_env is not None:
            self._base.history.env.upgrade(swdb_env)

    def _swdb_environment_downgrade(self, env_id, pkg_types, groups):
        if not self._base.history.env.get(env_id):
            self._raise_or_warn(self._ignore_installed, _("Environment id '%s' is not installed.") % env_id)
            return

        swdb_env = self._create_swdb_environment(env_id, pkg_types, groups)

        if swdb_env is not None:
            self._base.history.env.downgrade(swdb_env)

    def _swdb_environment_remove(self, env_id, pkg_types, groups):
        if not self._base.history.env.get(env_id):
            self._raise_or_warn(self._ignore_installed, _("Environment id '%s' is not installed.") % env_id)
            return

        swdb_env = self._create_swdb_environment(env_id, pkg_types, groups)

        if swdb_env is not None:
            self._base.history.env.remove(swdb_env)

    def get_data(self):
        """
        :returns: the loaded data of the transaction
        """

        return self._replay_data

    def get_warnings(self):
        """
        :returns: an array of warnings gathered during the transaction replay
        """

        return self._warnings

    def run(self):
        """
        Replays the transaction.
        """

        fn = self._filename
        errors = []

        for pkg_data in self._rpms:
            try:
                self._replay_pkg_action(pkg_data)
            except TransactionError as e:
                errors.append(e)

        for group_data in self._groups:
            try:
                action = group_data["action"]
                group_id = group_data["id"]

                try:
                    pkg_types = libdnf.transaction.stringToCompsPackageType(group_data["package_types"])
                except libdnf.error.Error as e:
                    errors.append(TransactionError(str(e)))
                    continue

                if action == "Install":
                    self._swdb_group_install(group_id, pkg_types, group_data["packages"])
                elif action == "Removed":
                    self._swdb_group_remove(group_id, pkg_types, group_data["packages"])
                # Groups are not versioned, but a reverse transaction could be applied,
                # therefore we treat both actions the same way
                elif action == "Upgrade" or action == "Upgraded":
                    self._swdb_group_upgrade(group_id, pkg_types, group_data["packages"])
                elif action == "Downgrade" or action == "Downgraded":
                    self._swdb_group_downgrade(group_id, pkg_types, group_data["packages"])
                else:
                    errors.append(TransactionError(
                        _('Unexpected value of group action "{action}" for group "{group}".')
                            .format(action=action, group=group_id)
                    ))
            except KeyError as e:
                errors.append(TransactionError(
                    _('Missing object key "{key}" in a group.').format(key=e.args[0])
                ))
            except TransactionError as e:
                errors.append(e)

        for env_data in self._environments:
            try:
                action = env_data["action"]
                env_id = env_data["id"]

                try:
                    pkg_types = libdnf.transaction.stringToCompsPackageType(env_data["package_types"])
                except libdnf.error.Error as e:
                    errors.append(TransactionError(str(e)))
                    continue

                if action == "Install":
                    self._swdb_environment_install(env_id, pkg_types, env_data["groups"])
                elif action == "Removed":
                    self._swdb_environment_remove(env_id, pkg_types, env_data["groups"])
                # Environments are not versioned, but a reverse transaction could be applied,
                # therefore we treat both actions the same way
                elif action == "Upgrade" or action == "Upgraded":
                    self._swdb_environment_upgrade(env_id, pkg_types, env_data["groups"])
                elif action == "Downgrade" or action == "Downgraded":
                    self._swdb_environment_downgrade(env_id, pkg_types, env_data["groups"])
                else:
                    errors.append(TransactionError(
                        _('Unexpected value of environment action "{action}" for environment "{env}".')
                            .format(action=action, env=env_id)
                    ))
            except KeyError as e:
                errors.append(TransactionError(
                    _('Missing object key "{key}" in an environment.').format(key=e.args[0])
                ))
            except TransactionError as e:
                errors.append(e)

        if errors:
            raise TransactionReplayError(fn, errors)

    def post_transaction(self):
        """
        Sets reasons in the transaction history to values from the stored transaction.

        Also serves to check whether additional packages were pulled in by the
        transaction, which results in an error (unless ignore_extras is True).
        """

        if not self._base.transaction:
            return

        errors = []

        for tsi in self._base.transaction:
            try:
                pkg = tsi.pkg
            except KeyError as e:
                # the transaction item has no package, happens for action == "Reason Change"
                continue

            nevra = str(pkg)

            if nevra not in self._nevra_cache:
                # if ignore_installed is True, we don't want to check for
                # Upgraded/Downgraded/Reinstalled extras in the transaction,
                # because those packages may already be installed and we are
                # ignoring them
                if not self._ignore_installed or tsi.action not in (
                    libdnf.transaction.TransactionItemAction_UPGRADED,
                    libdnf.transaction.TransactionItemAction_DOWNGRADED,
                    libdnf.transaction.TransactionItemAction_REINSTALLED
                ):
                    msg = _('Package nevra "{nevra}", which is not present in the transaction file, was pulled '
                        'into the transaction.'
                    ).format(nevra=nevra)

                    if not self._ignore_extras:
                        errors.append(TransactionError(msg))
                    else:
                        self._warnings.append(msg)

            try:
                replay_reason = self._nevra_reason_cache[nevra]

                if tsi.action in (
                    libdnf.transaction.TransactionItemAction_INSTALL,
                    libdnf.transaction.TransactionItemAction_REMOVE
                ) or libdnf.transaction.TransactionItemReasonCompare(replay_reason, tsi.reason) > 0:
                    tsi.reason = replay_reason
            except KeyError as e:
                # if the pkg nevra wasn't found, we don't want to change the reason
                pass

        if errors:
            raise TransactionReplayError(self._filename, errors)
site-packages/dnf/selector.py
# selector.py
# DNF specific hawkey.Selector handling.
#
# Copyright (C) 2012-2016 Red Hat, Inc.
#
# This copyrighted material is made available to anyone wishing to use,
# modify, copy, or redistribute it subject to the terms and conditions of
# the GNU General Public License v.2, or (at your option) any later version.
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY expressed or implied, including the implied warranties of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General
# Public License for more details.  You should have received a copy of the
# GNU General Public License along with this program; if not, write to the
# Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
# 02110-1301, USA.  Any Red Hat trademarks that are incorporated in the
# source code or documentation are not subject to the GNU General Public
# License and may only be used or replicated with the express permission of
# Red Hat, Inc.
#

from __future__ import absolute_import
from __future__ import unicode_literals

from hawkey import Selector
site-packages/pyudev-0.21.0-py3.6.egg-info/PKG-INFO
Metadata-Version: 1.1
Name: pyudev
Version: 0.21.0
Summary: A libudev binding
Home-page: http://pyudev.readthedocs.org/
Author: Sebastian Wiesner
Author-email: lunaryorn@gmail.com
License: LGPL 2.1+
Description: ######
        pyudev
        ######
        
        .. image:: https://secure.travis-ci.org/pyudev/pyudev.png?branch=develop
           :target: http://travis-ci.org/pyudev/pyudev
        
        http://pyudev.readthedocs.org
        
        pyudev is a LGPL_ licensed, pure Python_ binding for libudev_, the device and
        hardware management and information library for Linux.  It supports almost all
        libudev_ functionality. You can enumerate devices, query device properties and
        attributes, or monitor devices, including asynchronous monitoring with threads
        or within the event loops of Qt, Glib or wxPython.
        
        The binding supports CPython_ 2 (2.6 or newer) and 3 (3.1 or newer), and PyPy_
        1.5 or newer.  It is tested against udev 151 or newer; earlier versions of udev
        as found on dated Linux systems may work, but are not officially supported.
        
        
        Usage
        -----
        
        Usage of pyudev is quite simple thanks to the power of the underlying udev
        library. Getting the labels of all partitions just takes a few lines:
        
        >>> import pyudev
        >>> context = pyudev.Context()
        >>> for device in context.list_devices(subsystem='block', DEVTYPE='partition'):
        ...     print(device.get('ID_FS_LABEL', 'unlabeled partition'))
        ...
        boot
        swap
        system
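        
        Monitoring device events follows the same pattern. A minimal sketch using
        the netlink monitor (not part of the original example; ``poll()`` blocks
        until a matching event arrives):
        
        >>> monitor = pyudev.Monitor.from_netlink(context)
        >>> monitor.filter_by(subsystem='block')
        >>> for device in iter(monitor.poll, None):
        ...     print('{0.action}: {0.device_node}'.format(device))
        ...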
        
        The website_ provides a detailed `user guide`_ and a complete `API reference`_.
        
        
        Support
        -------
        
        Please report issues and questions to the issue tracker, but respect the
        following guidelines:
        
        - Check that the issue has not already been reported.
        - Check that the issue is not already fixed in the ``master`` branch.
        - Open issues with clear title and a detailed description in grammatically
          correct, complete sentences.
        - Include the Python version and the udev version (see ``udevadm --version``) in
          the description of your issue.
        
        
        Development
        -----------
        
        The source code is hosted on GitHub_::
        
           git clone git://github.com/pyudev/pyudev.git
        
        Please fork the repository and send pull requests with your fixes or new
        features, but respect the following guidelines:
        
        - Read `how to properly contribute to open source projects on GitHub
          <http://gun.io/blog/how-to-github-fork-branch-and-pull-request/>`_.
        - Understand the `branching model
          <http://nvie.com/posts/a-successful-git-branching-model/>`_.
        - Use a topic branch based on the ``develop`` branch to easily amend a pull
          request later, if necessary.
        - Write `good commit messages
          <http://tbaggery.com/2008/04/19/a-note-about-git-commit-messages.html>`_.
        - Squash commits on the topic branch before opening a pull request.
        - Respect :pep:`8` (use pep8_ to check your coding style compliance).
        - Add unit tests if possible (refer to the `testsuite documentation
          <http://pyudev.readthedocs.org/en/latest/tests/index.html>`_).
        - Add API documentation in docstrings.
        - Open a `pull request <https://help.github.com/articles/using-pull-requests>`_
          that relates to but one subject with a clear title and description in
          grammatically correct, complete sentences.
        
        
        .. _LGPL: http://www.gnu.org/licenses/old-licenses/lgpl-2.1.html
        .. _Python: http://www.python.org/
        .. _CPython: http://www.python.org/
        .. _PyPy: http://www.pypy.org/
        .. _libudev: http://www.kernel.org/pub/linux/utils/kernel/hotplug/libudev/
        .. _website: http://pyudev.readthedocs.org
        .. _user guide: http://pyudev.readthedocs.org/en/latest/guide.html
        .. _api reference: http://pyudev.readthedocs.org/en/latest/api/index.html
        .. _issue tracker: http://github.com/lunaryorn/pyudev/issues
        .. _GitHub: http://github.com/lunaryorn/pyudev
        .. _git: http://www.git-scm.com/
        .. _pep8: http://pypi.python.org/pypi/pep8/
        
Platform: Linux
Classifier: Development Status :: 5 - Production/Stable
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: GNU Library or Lesser General Public License (LGPL)
Classifier: Operating System :: POSIX :: Linux
Classifier: Programming Language :: Python
Classifier: Programming Language :: Python :: 2
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: Implementation :: CPython
Classifier: Programming Language :: Python :: Implementation :: PyPy
Classifier: Topic :: Software Development :: Libraries
Classifier: Topic :: System :: Hardware
Classifier: Topic :: System :: Operating System Kernels :: Linux
site-packages/pyudev-0.21.0-py3.6.egg-info/dependency_links.txt
site-packages/pyudev-0.21.0-py3.6.egg-info/requires.txt
six
site-packages/pyudev-0.21.0-py3.6.egg-info/SOURCES.txt
CHANGES.rst
COPYING
MANIFEST.in
README.rst
requirements.txt
setup.cfg
setup.py
tox.ini
.tox/py27/lib/python2.7/site-packages/pbr/tests/testpackage/test-requirements.txt
.tox/py27/lib64/python2.7/site-packages/pbr/tests/testpackage/test-requirements.txt
doc/changes.rst
doc/conf.py
doc/contribute.rst
doc/endorsements.rst
doc/guide.rst
doc/index.rst
doc/install.rst
doc/licencing.rst
doc/_templates/info.html
doc/api/index.rst
doc/api/pyudev.glib.rst
doc/api/pyudev.pyqt4.rst
doc/api/pyudev.pyqt5.rst
doc/api/pyudev.pyside.rst
doc/api/pyudev.rst
doc/api/pyudev.wx.rst
doc/tests/index.rst
doc/tests/plugins.rst
doc/tests/running.rst
src/pyudev/__init__.py
src/pyudev/_compat.py
src/pyudev/_qt_base.py
src/pyudev/_util.py
src/pyudev/core.py
src/pyudev/discover.py
src/pyudev/glib.py
src/pyudev/monitor.py
src/pyudev/pyqt4.py
src/pyudev/pyqt5.py
src/pyudev/pyside.py
src/pyudev/version.py
src/pyudev/wx.py
src/pyudev.egg-info/PKG-INFO
src/pyudev.egg-info/SOURCES.txt
src/pyudev.egg-info/dependency_links.txt
src/pyudev.egg-info/requires.txt
src/pyudev.egg-info/top_level.txt
src/pyudev/_ctypeslib/__init__.py
src/pyudev/_ctypeslib/_errorcheckers.py
src/pyudev/_ctypeslib/libc.py
src/pyudev/_ctypeslib/libudev.py
src/pyudev/_ctypeslib/utils.py
src/pyudev/_os/__init__.py
src/pyudev/_os/pipe.py
src/pyudev/_os/poll.py
src/pyudev/device/__init__.py
src/pyudev/device/_device.py
src/pyudev/device/_errors.py
tests/__init__.py
tests/_constants.py
tests/conftest.py
tests/test_core.py
tests/test_device.py
tests/test_discover.py
tests/test_enumerate.py
tests/test_monitor.py
tests/test_observer.py
tests/test_observer_deprecated.py
tests/test_pypi.py
tests/test_util.py
tests/_device_tests/__init__.py
tests/_device_tests/_attributes_tests.py
tests/_device_tests/_device_tests.py
tests/_device_tests/_devices_tests.py
tests/_device_tests/_tags_tests.py
tests/plugins/__init__.py
tests/plugins/fake_monitor.py
tests/plugins/mock_libudev.py
tests/plugins/privileged.py
tests/plugins/travis.py
tests/utils/__init__.py
tests/utils/misc.py
tests/utils/udev.py
site-packages/pyudev-0.21.0-py3.6.egg-info/top_level.txt
pyudev
site-packages/seobject.py
# Copyright (C) 2005-2013 Red Hat
# see file 'COPYING' for use and warranty information
#
# semanage is a tool for managing SELinux configuration files
#
#    This program is free software; you can redistribute it and/or
#    modify it under the terms of the GNU General Public License as
#    published by the Free Software Foundation; either version 2 of
#    the License, or (at your option) any later version.
#
#    This program is distributed in the hope that it will be useful,
#    but WITHOUT ANY WARRANTY; without even the implied warranty of
#    MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
#    GNU General Public License for more details.
#
#    You should have received a copy of the GNU General Public License
#    along with this program; if not, write to the Free Software
#    Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA
#                                        02111-1307  USA
#
#

import pwd
import grp
import selinux
import os
import re
import sys
import stat
import socket
from semanage import *
PROGNAME = "selinux-python"
import sepolicy
import setools
import ipaddress

try:
    import gettext
    kwargs = {}
    if sys.version_info < (3,):
        kwargs['unicode'] = True
    gettext.install(PROGNAME,
                    localedir="/usr/share/locale",
                    codeset='utf-8',
                    **kwargs)
except:
    try:
        import builtins
        builtins.__dict__['_'] = str
    except ImportError:
        import __builtin__
        __builtin__.__dict__['_'] = unicode

import syslog

file_types = {}
file_types[""] = SEMANAGE_FCONTEXT_ALL
file_types["all files"] = SEMANAGE_FCONTEXT_ALL
file_types["a"] = SEMANAGE_FCONTEXT_ALL
file_types["regular file"] = SEMANAGE_FCONTEXT_REG
file_types["--"] = SEMANAGE_FCONTEXT_REG
file_types["f"] = SEMANAGE_FCONTEXT_REG
file_types["-d"] = SEMANAGE_FCONTEXT_DIR
file_types["directory"] = SEMANAGE_FCONTEXT_DIR
file_types["d"] = SEMANAGE_FCONTEXT_DIR
file_types["-c"] = SEMANAGE_FCONTEXT_CHAR
file_types["character device"] = SEMANAGE_FCONTEXT_CHAR
file_types["c"] = SEMANAGE_FCONTEXT_CHAR
file_types["-b"] = SEMANAGE_FCONTEXT_BLOCK
file_types["block device"] = SEMANAGE_FCONTEXT_BLOCK
file_types["b"] = SEMANAGE_FCONTEXT_BLOCK
file_types["-s"] = SEMANAGE_FCONTEXT_SOCK
file_types["socket"] = SEMANAGE_FCONTEXT_SOCK
file_types["s"] = SEMANAGE_FCONTEXT_SOCK
file_types["-l"] = SEMANAGE_FCONTEXT_LINK
file_types["l"] = SEMANAGE_FCONTEXT_LINK
file_types["symbolic link"] = SEMANAGE_FCONTEXT_LINK
file_types["p"] = SEMANAGE_FCONTEXT_PIPE
file_types["-p"] = SEMANAGE_FCONTEXT_PIPE
file_types["named pipe"] = SEMANAGE_FCONTEXT_PIPE

file_type_str_to_option = {"all files": "a",
                           "regular file": "f",
                           "directory": "d",
                           "character device": "c",
                           "block device": "b",
                           "socket": "s",
                           "symbolic link": "l",
                           "named pipe": "p"}

ftype_to_audit = {"": "any",
                  "a" : "any",
                  "b": "block",
                  "c": "char",
                  "d": "dir",
                  "f": "file",
                  "l": "symlink",
                  "p": "pipe",
                  "s": "socket"}

try:
    import audit
    #test if audit module is enabled
    audit.audit_close(audit.audit_open())

    class logger:

        def __init__(self):
            self.audit_fd = audit.audit_open()
            self.log_list = []
            self.log_change_list = []

        def log(self, msg, name="", sename="", serole="", serange="", oldsename="", oldserole="", oldserange=""):

            sep = "-"
            if sename != oldsename:
                msg += sep + "sename"
                sep = ","
            if serole != oldserole:
                msg += sep + "role"
                sep = ","
            if serange != oldserange:
                msg += sep + "range"
                sep = ","

            self.log_list.append([self.audit_fd, audit.AUDIT_ROLE_ASSIGN, sys.argv[0], str(msg), name, 0, sename, serole, serange, oldsename, oldserole, oldserange, "", "", ""])

        def log_remove(self, msg, name="", sename="", serole="", serange="", oldsename="", oldserole="", oldserange=""):
            self.log_list.append([self.audit_fd, audit.AUDIT_ROLE_REMOVE, sys.argv[0], str(msg), name, 0, sename, serole, serange, oldsename, oldserole, oldserange, "", "", ""])

        def log_change(self, msg):
            self.log_change_list.append([self.audit_fd, audit.AUDIT_USER_MAC_CONFIG_CHANGE, str(msg), "semanage", "", "", ""])

        def commit(self, success):
            for l in self.log_list:
                audit.audit_log_semanage_message(*(l + [success]))
            for l in self.log_change_list:
                audit.audit_log_user_comm_message(*(l + [success]))

            self.log_list = []
            self.log_change_list = []
except (OSError, ImportError):
    class logger:

        def __init__(self):
            self.log_list = []

        def log(self, msg, name="", sename="", serole="", serange="", oldsename="", oldserole="", oldserange=""):
            message = " %s name=%s" % (msg, name)
            if sename != "":
                message += " sename=" + sename
            if oldsename != "":
                message += " oldsename=" + oldsename
            if serole != "":
                message += " role=" + serole
            if oldserole != "":
                message += " old_role=" + oldserole
            if serange != "" and serange is not None:
                message += " MLSRange=" + serange
            if oldserange != "" and oldserange is not None:
                message += " old_MLSRange=" + oldserange
            self.log_list.append(message)

        def log_remove(self, msg, name="", sename="", serole="", serange="", oldsename="", oldserole="", oldserange=""):
            self.log(msg, name, sename, serole, serange, oldsename, oldserole, oldserange)

        def log_change(self, msg):
            self.log_list.append(" %s" % msg)

        def commit(self, success):
            if success == 1:
                message = "Successful: "
            else:
                message = "Failed: "
            for l in self.log_list:
                syslog.syslog(syslog.LOG_INFO, message + l)


class nulllogger:

    def log(self, msg, name="", sename="", serole="", serange="", oldsename="", oldserole="", oldserange=""):
        pass

    def log_remove(self, msg, name="", sename="", serole="", serange="", oldsename="", oldserole="", oldserange=""):
        pass

    def log_change(self, msg):
        pass

    def commit(self, success):
        pass


def validate_level(raw):
    sensitivity = "s[0-9]*"
    category = "c[0-9]*"
    cat_range = category + r"(\." + category + ")?"
    categories = cat_range + r"(\," + cat_range + ")*"
    reg = sensitivity + "(-" + sensitivity + ")?" + "(:" + categories + ")?"
    return re.search("^" + reg + "$", raw)


def translate(raw, prepend=1):
    filler = "a:b:c:"
    if prepend == 1:
        context = "%s%s" % (filler, raw)
    else:
        context = raw
    (rc, trans) = selinux.selinux_raw_to_trans_context(context)
    if rc != 0:
        return raw
    if prepend:
        trans = trans[len(filler):]
    if trans == "":
        return raw
    else:
        return trans


def untranslate(trans, prepend=1):
    filler = "a:b:c:"
    if prepend == 1:
        context = "%s%s" % (filler, trans)
    else:
        context = trans

    (rc, raw) = selinux.selinux_trans_to_raw_context(context)
    if rc != 0:
        return trans
    if prepend:
        raw = raw[len(filler):]
    if raw == "":
        return trans
    else:
        return raw
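
# On an MLS system with mcstrans configured, translate() maps a raw range such
# as "s0-s0:c0.c1023" to its human-readable form (e.g. "SystemLow-SystemHigh")
# and untranslate() reverses the mapping; without a translation, both return
# their input unchanged (example values are illustrative)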


class semanageRecords:
    transaction = False
    handle = None
    store = None
    args = None

    def __init__(self, args = None):
        global handle
        if args:
            # legacy code - args was store originally
            if type(args) == str:
                self.store = args
            else:
                self.args = args
        self.noreload = getattr(args, "noreload", False)
        if not self.store:
            self.store = getattr(args, "store", "")

        self.sh = self.get_handle(self.store)

        rc, localstore = selinux.selinux_getpolicytype()
        if self.store == "" or self.store == localstore:
            self.mylog = logger()
        else:
            sepolicy.load_store_policy(self.store)
            selinux.selinux_set_policy_root("%s%s" % (selinux.selinux_path(), self.store))
            self.mylog = nulllogger()

    def set_reload(self, load):
        self.noreload = not load

    def get_handle(self, store):
        global is_mls_enabled

        if semanageRecords.handle:
            return semanageRecords.handle

        handle = semanage_handle_create()
        if not handle:
            raise ValueError(_("Could not create semanage handle"))

        if not semanageRecords.transaction and store != "":
            semanage_select_store(handle, store, SEMANAGE_CON_DIRECT)
            semanageRecords.store = store

        if not semanage_is_managed(handle):
            semanage_handle_destroy(handle)
            raise ValueError(_("SELinux policy is not managed or store cannot be accessed."))

        rc = semanage_access_check(handle)
        if rc < SEMANAGE_CAN_READ:
            semanage_handle_destroy(handle)
            raise ValueError(_("Cannot read policy store."))

        rc = semanage_connect(handle)
        if rc < 0:
            semanage_handle_destroy(handle)
            raise ValueError(_("Could not establish semanage connection"))

        is_mls_enabled = semanage_mls_enabled(handle)
        if is_mls_enabled < 0:
            semanage_handle_destroy(handle)
            raise ValueError(_("Could not test MLS enabled status"))

        semanageRecords.handle = handle
        return semanageRecords.handle

    def deleteall(self):
        raise ValueError(_("Not yet implemented"))

    def start(self):
        if semanageRecords.transaction:
            raise ValueError(_("Semanage transaction already in progress"))
        self.begin()
        semanageRecords.transaction = True

    def begin(self):
        if semanageRecords.transaction:
            return
        rc = semanage_begin_transaction(self.sh)
        if rc < 0:
            raise ValueError(_("Could not start semanage transaction"))

    def customized(self):
        raise ValueError(_("Not yet implemented"))

    def commit(self):
        if semanageRecords.transaction:
            return

        if self.noreload:
            semanage_set_reload(self.sh, 0)
        rc = semanage_commit(self.sh)
        if rc < 0:
            self.mylog.commit(0)
            raise ValueError(_("Could not commit semanage transaction"))
        self.mylog.commit(1)

    def finish(self):
        if not semanageRecords.transaction:
            raise ValueError(_("Semanage transaction not in progress"))
        semanageRecords.transaction = False
        self.commit()


class moduleRecords(semanageRecords):

    def __init__(self, args = None):
        semanageRecords.__init__(self, args)

    def get_all(self):
        l = []
        (rc, mlist, number) = semanage_module_list_all(self.sh)
        if rc < 0:
            raise ValueError(_("Could not list SELinux modules"))

        for i in range(number):
            mod = semanage_module_list_nth(mlist, i)

            rc, name = semanage_module_info_get_name(self.sh, mod)
            if rc < 0:
                raise ValueError(_("Could not get module name"))

            rc, enabled = semanage_module_info_get_enabled(self.sh, mod)
            if rc < 0:
                raise ValueError(_("Could not get module enabled"))

            rc, priority = semanage_module_info_get_priority(self.sh, mod)
            if rc < 0:
                raise ValueError(_("Could not get module priority"))

            rc, lang_ext = semanage_module_info_get_lang_ext(self.sh, mod)
            if rc < 0:
                raise ValueError(_("Could not get module lang_ext"))

            l.append((name, enabled, priority, lang_ext))

        # sort the list so they are in name order, but with higher priorities coming first
        l.sort(key=lambda t: t[2], reverse=True)
        l.sort(key=lambda t: t[0])
        return l

    def customized(self):
        all = self.get_all()
        if len(all) == 0:
            return []
        return ["-d %s" % x[0] for x in [t for t in all if t[1] == 0]]

    def list(self, heading=1, locallist=0):
        all = self.get_all()
        if len(all) == 0:
            return

        if heading:
            print("\n%-25s %-9s %s\n" % (_("Module Name"), _("Priority"), _("Language")))
        for t in all:
            if t[1] == 0:
                disabled = _("Disabled")
            else:
                if locallist:
                    continue
                disabled = ""
            print("%-25s %-9s %-5s %s" % (t[0], t[2], t[3], disabled))

    def add(self, file, priority):
        if not os.path.exists(file):
            raise ValueError(_("Module does not exist: %s ") % file)

        rc = semanage_set_default_priority(self.sh, priority)
        if rc < 0:
            raise ValueError(_("Invalid priority %d (needs to be between 1 and 999)") % priority)

        rc = semanage_module_install_file(self.sh, file)
        if rc >= 0:
            self.commit()

    def set_enabled(self, module, enable):
        for m in module.split():
            rc, key = semanage_module_key_create(self.sh)
            if rc < 0:
                raise ValueError(_("Could not create module key"))

            rc = semanage_module_key_set_name(self.sh, key, m)
            if rc < 0:
                raise ValueError(_("Could not set module key name"))

            rc = semanage_module_set_enabled(self.sh, key, enable)
            if rc < 0:
                if enable:
                    raise ValueError(_("Could not enable module %s") % m)
                else:
                    raise ValueError(_("Could not disable module %s") % m)
        self.commit()

    def delete(self, module, priority):
        rc = semanage_set_default_priority(self.sh, priority)
        if rc < 0:
            raise ValueError(_("Invalid priority %d (needs to be between 1 and 999)") % priority)

        for m in module.split():
            rc = semanage_module_remove(self.sh, m)
            if rc < 0 and rc != -2:
                raise ValueError(_("Could not remove module %s (remove failed)") % m)

        self.commit()

    def deleteall(self):
        l = [x[0] for x in [t for t in self.get_all() if t[1] == 0]]
        for m in l:
            self.set_enabled(m, True)


class dontauditClass(semanageRecords):

    def __init__(self, args = None):
        semanageRecords.__init__(self, args)

    def toggle(self, dontaudit):
        if dontaudit not in ["on", "off"]:
            raise ValueError(_("dontaudit requires either 'on' or 'off'"))
        self.begin()
        semanage_set_disable_dontaudit(self.sh, dontaudit == "off")
        self.commit()


class permissiveRecords(semanageRecords):

    def __init__(self, args = None):
        semanageRecords.__init__(self, args)

    def get_all(self):
        l = []
        (rc, mlist, number) = semanage_module_list(self.sh)
        if rc < 0:
            raise ValueError(_("Could not list SELinux modules"))

        for i in range(number):
            mod = semanage_module_list_nth(mlist, i)
            name = semanage_module_get_name(mod)
            if name and name.startswith("permissive_"):
                l.append(name.split("permissive_")[1])
        return l

    def customized(self):
        return ["-a %s" % x for x in sorted(self.get_all())]

    def list(self, heading=1, locallist=0):
        all = [y["name"] for y in [x for x in sepolicy.info(sepolicy.TYPE) if x["permissive"]]]
        if len(all) == 0:
            return

        if heading:
            print("\n%-25s\n" % (_("Builtin Permissive Types")))
        customized = self.get_all()
        for t in all:
            if t not in customized:
                print(t)

        if len(customized) == 0:
            return

        if heading:
            print("\n%-25s\n" % (_("Customized Permissive Types")))
        for t in customized:
            print(t)

    def add(self, type):
        try:
            import sepolgen.module as module
        except ImportError:
            raise ValueError(_("The sepolgen python module is required to setup permissive domains.\nIn some distributions it is included in the policycoreutils-devel package.\n# yum install policycoreutils-devel\nOr similar for your distro."))

        name = "permissive_%s" % type
        modtxt = "(typepermissive %s)" % type

        rc = semanage_module_install(self.sh, modtxt, len(modtxt), name, "cil")
        if rc >= 0:
            self.commit()

        if rc < 0:
            raise ValueError(_("Could not set permissive domain %s (module installation failed)") % name)

    def delete(self, name):
        for n in name.split():
            rc = semanage_module_remove(self.sh, "permissive_%s" % n)
            if rc < 0:
                raise ValueError(_("Could not remove permissive domain %s (remove failed)") % name)

        self.commit()

    def deleteall(self):
        l = self.get_all()
        if len(l) > 0:
            all = " ".join(l)
            self.delete(all)


class loginRecords(semanageRecords):

    def __init__(self, args = None):
        semanageRecords.__init__(self, args)
        self.oldsename = None
        self.oldserange = None
        self.sename = None
        self.serange = None

    def __add(self, name, sename, serange):
        rec, self.oldsename, self.oldserange = selinux.getseuserbyname(name)
        if sename == "":
            sename = "user_u"

        userrec = seluserRecords(self.args)
        range, (rc, oldserole) = userrec.get(self.oldsename)
        range, (rc, serole) = userrec.get(sename)

        if is_mls_enabled == 1:
            if serange != "":
                serange = untranslate(serange)
            else:
                serange = range

        (rc, k) = semanage_seuser_key_create(self.sh, name)
        if rc < 0:
            raise ValueError(_("Could not create a key for %s") % name)

        if name[0] == '%':
            try:
                grp.getgrnam(name[1:])
            except:
                raise ValueError(_("Linux Group %s does not exist") % name[1:])
        else:
            try:
                pwd.getpwnam(name)
            except:
                raise ValueError(_("Linux User %s does not exist") % name)

        (rc, u) = semanage_seuser_create(self.sh)
        if rc < 0:
            raise ValueError(_("Could not create login mapping for %s") % name)

        rc = semanage_seuser_set_name(self.sh, u, name)
        if rc < 0:
            raise ValueError(_("Could not set name for %s") % name)

        if (is_mls_enabled == 1) and (serange != ""):
            rc = semanage_seuser_set_mlsrange(self.sh, u, serange)
            if rc < 0:
                raise ValueError(_("Could not set MLS range for %s") % name)

        rc = semanage_seuser_set_sename(self.sh, u, sename)
        if rc < 0:
            raise ValueError(_("Could not set SELinux user for %s") % name)

        rc = semanage_seuser_modify_local(self.sh, k, u)
        if rc < 0:
            raise ValueError(_("Could not add login mapping for %s") % name)

        semanage_seuser_key_free(k)
        semanage_seuser_free(u)

    def add(self, name, sename, serange):
        try:
            self.begin()
            # Add a new mapping, or modify an existing one
            if self.__exists(name):
                print(_("Login mapping for %s is already defined, modifying instead") % name)
                self.__modify(name, sename, serange)
            else:
                self.__add(name, sename, serange)
            self.commit()
        except ValueError as error:
            raise error
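
    # A minimal usage sketch (illustrative values; it assumes a writable policy
    # store and that the Linux user "bob" exists):
    #
    #     loginRecords(args).add("bob", "staff_u", "s0-s0:c0.c1023")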

    # check if login mapping for given user exists
    def __exists(self, name):
        (rc, k) = semanage_seuser_key_create(self.sh, name)
        if rc < 0:
            raise ValueError(_("Could not create a key for %s") % name)

        (rc, exists) = semanage_seuser_exists(self.sh, k)
        if rc < 0:
            raise ValueError(_("Could not check if login mapping for %s is defined") % name)
        semanage_seuser_key_free(k)

        return exists

    def __modify(self, name, sename="", serange=""):
        rec, self.oldsename, self.oldserange = selinux.getseuserbyname(name)
        if sename == "" and serange == "":
            raise ValueError(_("Requires seuser or serange"))

        userrec = seluserRecords(self.args)
        range, (rc, oldserole) = userrec.get(self.oldsename)

        if sename != "":
            range, (rc, serole) = userrec.get(sename)
        else:
            serole = oldserole

        if serange != "":
            self.serange = serange
        else:
            self.serange = range

        (rc, k) = semanage_seuser_key_create(self.sh, name)
        if rc < 0:
            raise ValueError(_("Could not create a key for %s") % name)

        (rc, exists) = semanage_seuser_exists(self.sh, k)
        if rc < 0:
            raise ValueError(_("Could not check if login mapping for %s is defined") % name)
        if not exists:
            raise ValueError(_("Login mapping for %s is not defined") % name)

        (rc, u) = semanage_seuser_query(self.sh, k)
        if rc < 0:
            raise ValueError(_("Could not query seuser for %s") % name)

        self.oldserange = semanage_seuser_get_mlsrange(u)
        self.oldsename = semanage_seuser_get_sename(u)
        if (is_mls_enabled == 1) and (serange != ""):
            semanage_seuser_set_mlsrange(self.sh, u, untranslate(serange))

        if sename != "":
            semanage_seuser_set_sename(self.sh, u, sename)
            self.sename = sename
        else:
            self.sename = self.oldsename

        rc = semanage_seuser_modify_local(self.sh, k, u)
        if rc < 0:
            raise ValueError(_("Could not modify login mapping for %s") % name)

        semanage_seuser_key_free(k)
        semanage_seuser_free(u)

    def modify(self, name, sename="", serange=""):
        try:
            self.begin()
            self.__modify(name, sename, serange)
            self.commit()
        except ValueError as error:
            raise error

    def __delete(self, name):
        rec, self.oldsename, self.oldserange = selinux.getseuserbyname(name)
        userrec = seluserRecords(self.args)
        range, (rc, oldserole) = userrec.get(self.oldsename)

        (rc, k) = semanage_seuser_key_create(self.sh, name)
        if rc < 0:
            raise ValueError(_("Could not create a key for %s") % name)

        (rc, exists) = semanage_seuser_exists(self.sh, k)
        if rc < 0:
            raise ValueError(_("Could not check if login mapping for %s is defined") % name)
        if not exists:
            raise ValueError(_("Login mapping for %s is not defined") % name)

        (rc, exists) = semanage_seuser_exists_local(self.sh, k)
        if rc < 0:
            raise ValueError(_("Could not check if login mapping for %s is defined") % name)
        if not exists:
            raise ValueError(_("Login mapping for %s is defined in policy, cannot be deleted") % name)

        rc = semanage_seuser_del_local(self.sh, k)
        if rc < 0:
            raise ValueError(_("Could not delete login mapping for %s") % name)

        semanage_seuser_key_free(k)

        rec, self.sename, self.serange = selinux.getseuserbyname("__default__")
        range, (rc, serole) = userrec.get(self.sename)

    def delete(self, name):
        try:
            self.begin()
            self.__delete(name)
            self.commit()

        except ValueError as error:
            raise error

    def deleteall(self):
        (rc, ulist) = semanage_seuser_list_local(self.sh)
        if rc < 0:
            raise ValueError(_("Could not list login mappings"))

        try:
            self.begin()
            for u in ulist:
                self.__delete(semanage_seuser_get_name(u))
            self.commit()
        except ValueError as error:
            raise error

    def get_all_logins(self):
        ddict = {}
        self.logins_path = selinux.selinux_policy_root() + "/logins"
        for path, dirs, files in os.walk(self.logins_path):
            if path == self.logins_path:
                for name in files:
                    try:
                        fd = open(path + "/" + name)
                        rec = fd.read().rstrip().split(":")
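                        # each per-user file is expected to hold three colon-separated
                        # fields: service, SELinux user and MLS range, in that order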
                        fd.close()
                        ddict[name] = (rec[1], rec[2], rec[0])
                    except IndexError:
                        pass
        return ddict

    def get_all(self, locallist=0):
        ddict = {}
        if locallist:
            (rc, self.ulist) = semanage_seuser_list_local(self.sh)
        else:
            (rc, self.ulist) = semanage_seuser_list(self.sh)
        if rc < 0:
            raise ValueError(_("Could not list login mappings"))

        for u in self.ulist:
            name = semanage_seuser_get_name(u)
            ddict[name] = (semanage_seuser_get_sename(u), semanage_seuser_get_mlsrange(u), "*")
        return ddict

    def customized(self):
        l = []
        ddict = self.get_all(True)
        for k in sorted(ddict.keys()):
            if ddict[k][1]:
                l.append("-a -s %s -r '%s' %s" % (ddict[k][0], ddict[k][1], k))
            else:
                l.append("-a -s %s %s" % (ddict[k][0], k))
        return l

    def list(self, heading=1, locallist=0):
        ddict = self.get_all(locallist)
        ldict = self.get_all_logins()
        lkeys = sorted(ldict.keys())
        keys = sorted(ddict.keys())
        if len(keys) == 0 and len(lkeys) == 0:
            return

        if is_mls_enabled == 1:
            if heading:
                print("\n%-20s %-20s %-20s %s\n" % (_("Login Name"), _("SELinux User"), _("MLS/MCS Range"), _("Service")))
            for k in keys:
                u = ddict[k]
                print("%-20s %-20s %-20s %s" % (k, u[0], translate(u[1]), u[2]))
            if len(lkeys):
                print("\nLocal customization in %s" % self.logins_path)

            for k in lkeys:
                u = ldict[k]
                print("%-20s %-20s %-20s %s" % (k, u[0], translate(u[1]), u[2]))
        else:
            if heading:
                print("\n%-25s %-25s\n" % (_("Login Name"), _("SELinux User")))
            for k in keys:
                print("%-25s %-25s" % (k, ddict[k][0]))


class seluserRecords(semanageRecords):

    def __init__(self, args = None):
        semanageRecords.__init__(self, args)

    def get(self, name):
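        # returns (mlsrange, (rc, [roles])); callers typically unpack it as
        # "range, (rc, roles) = userrec.get(name)"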
        (rc, k) = semanage_user_key_create(self.sh, name)
        if rc < 0:
            raise ValueError(_("Could not create a key for %s") % name)
        (rc, exists) = semanage_user_exists(self.sh, k)
        if rc < 0:
            raise ValueError(_("Could not check if SELinux user %s is defined") % name)
        (rc, u) = semanage_user_query(self.sh, k)
        if rc < 0:
            raise ValueError(_("Could not query user for %s") % name)
        serange = semanage_user_get_mlsrange(u)
        serole = semanage_user_get_roles(self.sh, u)
        semanage_user_key_free(k)
        semanage_user_free(u)
        return serange, serole

    def __add(self, name, roles, selevel, serange, prefix):
        if is_mls_enabled == 1:
            if serange == "":
                serange = "s0"
            else:
                serange = untranslate(serange)

            if selevel == "":
                selevel = "s0"
            else:
                selevel = untranslate(selevel)

        if len(roles) < 1:
            raise ValueError(_("You must add at least one role for %s") % name)

        (rc, k) = semanage_user_key_create(self.sh, name)
        if rc < 0:
            raise ValueError(_("Could not create a key for %s") % name)

        (rc, u) = semanage_user_create(self.sh)
        if rc < 0:
            raise ValueError(_("Could not create SELinux user for %s") % name)

        rc = semanage_user_set_name(self.sh, u, name)
        if rc < 0:
            raise ValueError(_("Could not set name for %s") % name)

        for r in roles:
            rc = semanage_user_add_role(self.sh, u, r)
            if rc < 0:
                raise ValueError(_("Could not add role %s for %s") % (r, name))

        if is_mls_enabled == 1:
            rc = semanage_user_set_mlsrange(self.sh, u, serange)
            if rc < 0:
                raise ValueError(_("Could not set MLS range for %s") % name)

            rc = semanage_user_set_mlslevel(self.sh, u, selevel)
            if rc < 0:
                raise ValueError(_("Could not set MLS level for %s") % name)
        rc = semanage_user_set_prefix(self.sh, u, prefix)
        if rc < 0:
            raise ValueError(_("Could not add prefix %s for %s") % (r, prefix))
        (rc, key) = semanage_user_key_extract(self.sh, u)
        if rc < 0:
            raise ValueError(_("Could not extract key for %s") % name)

        rc = semanage_user_modify_local(self.sh, k, u)
        if rc < 0:
            raise ValueError(_("Could not add SELinux user %s") % name)

        semanage_user_key_free(k)
        semanage_user_free(u)
        self.mylog.log("seuser", sename=name, serole=",".join(roles), serange=serange)

    def add(self, name, roles, selevel, serange, prefix):
        try:
            self.begin()
            if self.__exists(name):
                print(_("SELinux user %s is already defined, modifying instead") % name)
                self.__modify(name, roles, selevel, serange, prefix)
            else:
                self.__add(name, roles, selevel, serange, prefix)
            self.commit()
        except ValueError as error:
            self.mylog.commit(0)
            raise error

    def __exists(self, name):
        (rc, k) = semanage_user_key_create(self.sh, name)
        if rc < 0:
            raise ValueError(_("Could not create a key for %s") % name)

        (rc, exists) = semanage_user_exists(self.sh, k)
        if rc < 0:
            raise ValueError(_("Could not check if SELinux user %s is defined") % name)
        semanage_user_key_free(k)

        return exists

    def __modify(self, name, roles=[], selevel="", serange="", prefix=""):
        oldserole = ""
        oldserange = ""
        newroles = " ".join(roles)
        if prefix == "" and len(roles) == 0 and serange == "" and selevel == "":
            if is_mls_enabled == 1:
                raise ValueError(_("Requires prefix, roles, level or range"))
            else:
                raise ValueError(_("Requires prefix or roles"))

        (rc, k) = semanage_user_key_create(self.sh, name)
        if rc < 0:
            raise ValueError(_("Could not create a key for %s") % name)

        (rc, exists) = semanage_user_exists(self.sh, k)
        if rc < 0:
            raise ValueError(_("Could not check if SELinux user %s is defined") % name)
        if not exists:
            raise ValueError(_("SELinux user %s is not defined") % name)

        (rc, u) = semanage_user_query(self.sh, k)
        if rc < 0:
            raise ValueError(_("Could not query user for %s") % name)

        oldserange = semanage_user_get_mlsrange(u)
        (rc, rlist) = semanage_user_get_roles(self.sh, u)
        if rc >= 0:
            oldserole = " ".join(rlist)

        if (is_mls_enabled == 1) and (serange != ""):
            semanage_user_set_mlsrange(self.sh, u, untranslate(serange))
        if (is_mls_enabled == 1) and (selevel != ""):
            semanage_user_set_mlslevel(self.sh, u, untranslate(selevel))

        if prefix != "":
            semanage_user_set_prefix(self.sh, u, prefix)

        if len(roles) != 0:
            for r in rlist:
                if r not in roles:
                    semanage_user_del_role(u, r)
            for r in roles:
                if r not in rlist:
                    semanage_user_add_role(self.sh, u, r)

        rc = semanage_user_modify_local(self.sh, k, u)
        if rc < 0:
            raise ValueError(_("Could not modify SELinux user %s") % name)

        semanage_user_key_free(k)
        semanage_user_free(u)

        role = ",".join(newroles.split())
        oldserole = ",".join(oldserole.split())
        self.mylog.log("seuser", sename=name, oldsename=name, serole=role, serange=serange, oldserole=oldserole, oldserange=oldserange)

    def modify(self, name, roles=[], selevel="", serange="", prefix=""):
        try:
            self.begin()
            self.__modify(name, roles, selevel, serange, prefix)
            self.commit()
        except ValueError as error:
            self.mylog.commit(0)
            raise error

    def __delete(self, name):
        (rc, k) = semanage_user_key_create(self.sh, name)
        if rc < 0:
            raise ValueError(_("Could not create a key for %s") % name)

        (rc, exists) = semanage_user_exists(self.sh, k)
        if rc < 0:
            raise ValueError(_("Could not check if SELinux user %s is defined") % name)
        if not exists:
            raise ValueError(_("SELinux user %s is not defined") % name)

        (rc, exists) = semanage_user_exists_local(self.sh, k)
        if rc < 0:
            raise ValueError(_("Could not check if SELinux user %s is defined") % name)
        if not exists:
            raise ValueError(_("SELinux user %s is defined in policy, cannot be deleted") % name)

        (rc, u) = semanage_user_query(self.sh, k)
        if rc < 0:
            raise ValueError(_("Could not query user for %s") % name)
        oldserange = semanage_user_get_mlsrange(u)
        (rc, rlist) = semanage_user_get_roles(self.sh, u)
        oldserole = ",".join(rlist)

        rc = semanage_user_del_local(self.sh, k)
        if rc < 0:
            raise ValueError(_("Could not delete SELinux user %s") % name)

        semanage_user_key_free(k)
        semanage_user_free(u)

        self.mylog.log_remove("seuser", oldsename=name, oldserange=oldserange, oldserole=oldserole)

    def delete(self, name):
        try:
            self.begin()
            self.__delete(name)
            self.commit()

        except ValueError as error:
            self.mylog.commit(0)
            raise error

    def deleteall(self):
        (rc, ulist) = semanage_user_list_local(self.sh)
        if rc < 0:
            raise ValueError(_("Could not list SELinux users"))

        try:
            self.begin()
            for u in ulist:
                self.__delete(semanage_user_get_name(u))
            self.commit()
        except ValueError as error:
            self.mylog.commit(0)
            raise error

    def get_all(self, locallist=0):
        ddict = {}
        if locallist:
            (rc, self.ulist) = semanage_user_list_local(self.sh)
        else:
            (rc, self.ulist) = semanage_user_list(self.sh)
        if rc < 0:
            raise ValueError(_("Could not list SELinux users"))

        for u in self.ulist:
            name = semanage_user_get_name(u)
            (rc, rlist) = semanage_user_get_roles(self.sh, u)
            if rc < 0:
                raise ValueError(_("Could not list roles for user %s") % name)

            roles = " ".join(rlist)
            ddict[semanage_user_get_name(u)] = (semanage_user_get_prefix(u), semanage_user_get_mlslevel(u), semanage_user_get_mlsrange(u), roles)

        return ddict

    def customized(self):
        l = []
        ddict = self.get_all(True)
        for k in sorted(ddict.keys()):
            if ddict[k][1] or ddict[k][2]:
                l.append("-a -L %s -r %s -R '%s' %s" % (ddict[k][1], ddict[k][2], ddict[k][3], k))
            else:
                l.append("-a -R '%s' %s" % (ddict[k][3], k))
        return l

    def list(self, heading=1, locallist=0):
        ddict = self.get_all(locallist)
        if len(ddict) == 0:
            return
        keys = sorted(ddict.keys())

        if is_mls_enabled == 1:
            if heading:
                print("\n%-15s %-10s %-10s %-30s" % ("", _("Labeling"), _("MLS/"), _("MLS/")))
                print("%-15s %-10s %-10s %-30s %s\n" % (_("SELinux User"), _("Prefix"), _("MCS Level"), _("MCS Range"), _("SELinux Roles")))
            for k in keys:
                print("%-15s %-10s %-10s %-30s %s" % (k, ddict[k][0], translate(ddict[k][1]), translate(ddict[k][2]), ddict[k][3]))
        else:
            if heading:
                print("%-15s %s\n" % (_("SELinux User"), _("SELinux Roles")))
            for k in keys:
                print("%-15s %s" % (k, ddict[k][3]))


class portRecords(semanageRecords):
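    # Backend for "semanage port ...": manages network port type labeling
    # through libsemanage. Minimal usage sketch with illustrative values:
    #   ports = portRecords()
    #   ports.add("8080", "tcp", "s0", "http_port_t")
    #   ports.list()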

    valid_types = []

    def __init__(self, args = None):
        semanageRecords.__init__(self, args)
        try:
            self.valid_types = list(list(sepolicy.info(sepolicy.ATTRIBUTE, "port_type"))[0]["types"])
        except RuntimeError:
            pass

    def __genkey(self, port, proto):
        protocols = {"tcp": SEMANAGE_PROTO_TCP,
                     "udp": SEMANAGE_PROTO_UDP,
                     "sctp": SEMANAGE_PROTO_SCTP,
                     "dccp": SEMANAGE_PROTO_DCCP}

        if proto in protocols.keys():
            proto_d = protocols[proto]
        else:
            raise ValueError(_("Protocol has to be one of udp, tcp, dccp or sctp"))
        if port == "":
            raise ValueError(_("Port is required"))

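        # A port spec is either a single port ("8080") or an inclusive
        # "low-high" range ("8000-8100"); both map to one (low, high) key.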
        ports = port.split("-")
        if len(ports) == 1:
            high = low = int(ports[0])
        else:
            low = int(ports[0])
            high = int(ports[1])

        if high > 65535:
            raise ValueError(_("Invalid Port"))

        (rc, k) = semanage_port_key_create(self.sh, low, high, proto_d)
        if rc < 0:
            raise ValueError(_("Could not create a key for %s/%s") % (proto, port))
        return (k, proto_d, low, high)

    def __add(self, port, proto, serange, type):
        if is_mls_enabled == 1:
            if serange == "":
                serange = "s0"
            else:
                serange = untranslate(serange)

        if type == "":
            raise ValueError(_("Type is required"))

        type = sepolicy.get_real_type_name(type)

        if type not in self.valid_types:
            raise ValueError(_("Type %s is invalid, must be a port type") % type)

        (k, proto_d, low, high) = self.__genkey(port, proto)

        (rc, p) = semanage_port_create(self.sh)
        if rc < 0:
            raise ValueError(_("Could not create port for %s/%s") % (proto, port))

        semanage_port_set_proto(p, proto_d)
        semanage_port_set_range(p, low, high)
        (rc, con) = semanage_context_create(self.sh)
        if rc < 0:
            raise ValueError(_("Could not create context for %s/%s") % (proto, port))

        rc = semanage_context_set_user(self.sh, con, "system_u")
        if rc < 0:
            raise ValueError(_("Could not set user in port context for %s/%s") % (proto, port))

        rc = semanage_context_set_role(self.sh, con, "object_r")
        if rc < 0:
            raise ValueError(_("Could not set role in port context for %s/%s") % (proto, port))

        rc = semanage_context_set_type(self.sh, con, type)
        if rc < 0:
            raise ValueError(_("Could not set type in port context for %s/%s") % (proto, port))

        if (is_mls_enabled == 1) and (serange != ""):
            rc = semanage_context_set_mls(self.sh, con, serange)
            if rc < 0:
                raise ValueError(_("Could not set mls fields in port context for %s/%s") % (proto, port))

        rc = semanage_port_set_con(self.sh, p, con)
        if rc < 0:
            raise ValueError(_("Could not set port context for %s/%s") % (proto, port))

        rc = semanage_port_modify_local(self.sh, k, p)
        if rc < 0:
            raise ValueError(_("Could not add port %s/%s") % (proto, port))

        semanage_context_free(con)
        semanage_port_key_free(k)
        semanage_port_free(p)

        self.mylog.log_change("resrc=port op=add lport=%s proto=%s tcontext=%s:%s:%s:%s" % (port, socket.getprotobyname(proto), "system_u", "object_r", type, serange))

    def add(self, port, proto, serange, type):
        self.begin()
        if self.__exists(port, proto):
            print(_("Port {proto}/{port} already defined, modifying instead").format(proto=proto, port=port))
            self.__modify(port, proto, serange, type)
        else:
            self.__add(port, proto, serange, type)
        self.commit()

    def __exists(self, port, proto):
        (k, proto_d, low, high) = self.__genkey(port, proto)

        (rc, exists) = semanage_port_exists(self.sh, k)
        if rc < 0:
            raise ValueError(_("Could not check if port {proto}/{port} is defined").format(proto=proto, port=port))
        semanage_port_key_free(k)

        return exists

    def __modify(self, port, proto, serange, setype):
        if serange == "" and setype == "":
            if is_mls_enabled == 1:
                raise ValueError(_("Requires setype or serange"))
            else:
                raise ValueError(_("Requires setype"))

        setype = sepolicy.get_real_type_name(setype)
        if setype and setype not in self.valid_types:
            raise ValueError(_("Type %s is invalid, must be a port type") % setype)

        (k, proto_d, low, high) = self.__genkey(port, proto)

        (rc, exists) = semanage_port_exists(self.sh, k)
        if rc < 0:
            raise ValueError(_("Could not check if port %s/%s is defined") % (proto, port))
        if not exists:
            raise ValueError(_("Port %s/%s is not defined") % (proto, port))

        (rc, p) = semanage_port_query(self.sh, k)
        if rc < 0:
            raise ValueError(_("Could not query port %s/%s") % (proto, port))

        con = semanage_port_get_con(p)

        if is_mls_enabled == 1:
            if serange == "":
                serange = "s0"
            else:
                semanage_context_set_mls(self.sh, con, untranslate(serange))
        if setype != "":
            semanage_context_set_type(self.sh, con, setype)

        rc = semanage_port_modify_local(self.sh, k, p)
        if rc < 0:
            raise ValueError(_("Could not modify port %s/%s") % (proto, port))

        semanage_port_key_free(k)
        semanage_port_free(p)

        self.mylog.log_change("resrc=port op=modify lport=%s proto=%s tcontext=%s:%s:%s:%s" % (port, socket.getprotobyname(proto), "system_u", "object_r", setype, serange))

    def modify(self, port, proto, serange, setype):
        self.begin()
        self.__modify(port, proto, serange, setype)
        self.commit()

    def deleteall(self):
        (rc, plist) = semanage_port_list_local(self.sh)
        if rc < 0:
            raise ValueError(_("Could not list the ports"))

        self.begin()

        for port in plist:
            proto = semanage_port_get_proto(port)
            proto_str = semanage_port_get_proto_str(proto)
            low = semanage_port_get_low(port)
            high = semanage_port_get_high(port)
            port_str = "%s-%s" % (low, high)

            (k, proto_d, low, high) = self.__genkey(port_str, proto_str)
            if rc < 0:
                raise ValueError(_("Could not create a key for %s") % port_str)

            rc = semanage_port_del_local(self.sh, k)
            if rc < 0:
                raise ValueError(_("Could not delete the port %s") % port_str)
            semanage_port_key_free(k)

            if low == high:
                port_str = low

            self.mylog.log_change("resrc=port op=delete lport=%s proto=%s" % (port_str, socket.getprotobyname(proto_str)))

        self.commit()

    def __delete(self, port, proto):
        (k, proto_d, low, high) = self.__genkey(port, proto)
        (rc, exists) = semanage_port_exists(self.sh, k)
        if rc < 0:
            raise ValueError(_("Could not check if port %s/%s is defined") % (proto, port))
        if not exists:
            raise ValueError(_("Port %s/%s is not defined") % (proto, port))

        (rc, exists) = semanage_port_exists_local(self.sh, k)
        if rc < 0:
            raise ValueError(_("Could not check if port %s/%s is defined") % (proto, port))
        if not exists:
            raise ValueError(_("Port %s/%s is defined in policy, cannot be deleted") % (proto, port))

        rc = semanage_port_del_local(self.sh, k)
        if rc < 0:
            raise ValueError(_("Could not delete port %s/%s") % (proto, port))

        semanage_port_key_free(k)

        self.mylog.log_change("resrc=port op=delete lport=%s proto=%s" % (port, socket.getprotobyname(proto)))

    def delete(self, port, proto):
        self.begin()
        self.__delete(port, proto)
        self.commit()

    def get_all(self, locallist=0):
        ddict = {}
        if locallist:
            (rc, self.plist) = semanage_port_list_local(self.sh)
        else:
            (rc, self.plist) = semanage_port_list(self.sh)
        if rc < 0:
            raise ValueError(_("Could not list ports"))

        for port in self.plist:
            con = semanage_port_get_con(port)
            ctype = semanage_context_get_type(con)
            level = semanage_context_get_mls(con)
            proto = semanage_port_get_proto(port)
            proto_str = semanage_port_get_proto_str(proto)
            low = semanage_port_get_low(port)
            high = semanage_port_get_high(port)
            ddict[(low, high, proto_str)] = (ctype, level)
        return ddict

    def get_all_by_type(self, locallist=0):
        ddict = {}
        if locallist:
            (rc, self.plist) = semanage_port_list_local(self.sh)
        else:
            (rc, self.plist) = semanage_port_list(self.sh)
        if rc < 0:
            raise ValueError(_("Could not list ports"))

        for port in self.plist:
            con = semanage_port_get_con(port)
            ctype = semanage_context_get_type(con)
            proto = semanage_port_get_proto(port)
            proto_str = semanage_port_get_proto_str(proto)
            low = semanage_port_get_low(port)
            high = semanage_port_get_high(port)
            if (ctype, proto_str) not in ddict.keys():
                ddict[(ctype, proto_str)] = []
            if low == high:
                ddict[(ctype, proto_str)].append("%d" % low)
            else:
                ddict[(ctype, proto_str)].append("%d-%d" % (low, high))
        return ddict

    def customized(self):
        l = []
        ddict = self.get_all(True)
        for k in sorted(ddict.keys()):
            port = k[0] if k[0] == k[1] else "%s-%s" % (k[0], k[1])
            if ddict[k][1]:
                l.append("-a -t %s -r '%s' -p %s %s" % (ddict[k][0], ddict[k][1], k[2], port))
            else:
                l.append("-a -t %s -p %s %s" % (ddict[k][0], k[2], port))
        return l

    def list(self, heading=1, locallist=0):
        ddict = self.get_all_by_type(locallist)
        if len(ddict) == 0:
            return
        keys = sorted(ddict.keys())

        if heading:
            print("%-30s %-8s %s\n" % (_("SELinux Port Type"), _("Proto"), _("Port Number")))
        for i in keys:
            rec = "%-30s %-8s " % i
            rec += "%s" % ddict[i][0]
            for p in ddict[i][1:]:
                rec += ", %s" % p
            print(rec)

class ibpkeyRecords(semanageRecords):
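    # Backend for "semanage ibpkey ...": manages InfiniBand partition key
    # (pkey) labeling through libsemanage. Minimal usage sketch with
    # illustrative values; the type name is only an example and must carry
    # the ibpkey_type attribute:
    #   pkeys = ibpkeyRecords()
    #   pkeys.add("0x8001", "fe80::", "s0", "my_ibpkey_t")
    #   pkeys.list()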

    valid_types = []

    def __init__(self, args = None):
        semanageRecords.__init__(self, args)
        try:
            q = setools.TypeQuery(setools.SELinuxPolicy(sepolicy.get_store_policy(self.store)), attrs=["ibpkey_type"])
            self.valid_types = sorted(str(t) for t in q.results())
        except:
            pass

    def __genkey(self, pkey, subnet_prefix):
        if subnet_prefix == "":
            raise ValueError(_("Subnet Prefix is required"))

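        # A pkey spec is a single value or a "low-high" range; int(x, 0)
        # accepts both decimal and 0x-prefixed hexadecimal values.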
        pkeys = pkey.split("-")
        if len(pkeys) == 1:
            high = low = int(pkeys[0], 0)
        else:
            low = int(pkeys[0], 0)
            high = int(pkeys[1], 0)

        if high > 65535:
            raise ValueError(_("Invalid Pkey"))

        (rc, k) = semanage_ibpkey_key_create(self.sh, subnet_prefix, low, high)
        if rc < 0:
            raise ValueError(_("Could not create a key for %s/%s") % (subnet_prefix, pkey))
        return (k, subnet_prefix, low, high)

    def __add(self, pkey, subnet_prefix, serange, type):
        if is_mls_enabled == 1:
            if serange == "":
                serange = "s0"
            else:
                serange = untranslate(serange)

        if type == "":
            raise ValueError(_("Type is required"))

        type = sepolicy.get_real_type_name(type)

        if type not in self.valid_types:
            raise ValueError(_("Type %s is invalid, must be an ibpkey type") % type)

        (k, subnet_prefix, low, high) = self.__genkey(pkey, subnet_prefix)

        (rc, p) = semanage_ibpkey_create(self.sh)
        if rc < 0:
            raise ValueError(_("Could not create ibpkey for %s/%s") % (subnet_prefix, pkey))

        semanage_ibpkey_set_subnet_prefix(self.sh, p, subnet_prefix)
        semanage_ibpkey_set_range(p, low, high)
        (rc, con) = semanage_context_create(self.sh)
        if rc < 0:
            raise ValueError(_("Could not create context for %s/%s") % (subnet_prefix, pkey))

        rc = semanage_context_set_user(self.sh, con, "system_u")
        if rc < 0:
            raise ValueError(_("Could not set user in ibpkey context for %s/%s") % (subnet_prefix, pkey))

        rc = semanage_context_set_role(self.sh, con, "object_r")
        if rc < 0:
            raise ValueError(_("Could not set role in ibpkey context for %s/%s") % (subnet_prefix, pkey))

        rc = semanage_context_set_type(self.sh, con, type)
        if rc < 0:
            raise ValueError(_("Could not set type in ibpkey context for %s/%s") % (subnet_prefix, pkey))

        if (is_mls_enabled == 1) and (serange != ""):
            rc = semanage_context_set_mls(self.sh, con, serange)
            if rc < 0:
                raise ValueError(_("Could not set mls fields in ibpkey context for %s/%s") % (subnet_prefix, pkey))

        rc = semanage_ibpkey_set_con(self.sh, p, con)
        if rc < 0:
            raise ValueError(_("Could not set ibpkey context for %s/%s") % (subnet_prefix, pkey))

        rc = semanage_ibpkey_modify_local(self.sh, k, p)
        if rc < 0:
            raise ValueError(_("Could not add ibpkey %s/%s") % (subnet_prefix, pkey))

        semanage_context_free(con)
        semanage_ibpkey_key_free(k)
        semanage_ibpkey_free(p)

    def add(self, pkey, subnet_prefix, serange, type):
        self.begin()
        if self.__exists(pkey, subnet_prefix):
            print(_("ibpkey {subnet_prefix}/{pkey} already defined, modifying instead").format(subnet_prefix=subnet_prefix, pkey=pkey))
            self.__modify(pkey, subnet_prefix, serange, type)
        else:
            self.__add(pkey, subnet_prefix, serange, type)
        self.commit()

    def __exists(self, pkey, subnet_prefix):
        (k, subnet_prefix, low, high) = self.__genkey(pkey, subnet_prefix)

        (rc, exists) = semanage_ibpkey_exists(self.sh, k)
        if rc < 0:
            raise ValueError(_("Could not check if ibpkey {subnet_prefix}/{pkey} is defined").format(subnet_prefix=subnet_prefix, pkey=pkey))
        semanage_ibpkey_key_free(k)

        return exists

    def __modify(self, pkey, subnet_prefix, serange, setype):
        if serange == "" and setype == "":
            if is_mls_enabled == 1:
                raise ValueError(_("Requires setype or serange"))
            else:
                raise ValueError(_("Requires setype"))

        setype = sepolicy.get_real_type_name(setype)

        if setype and setype not in self.valid_types:
            raise ValueError(_("Type %s is invalid, must be an ibpkey type") % setype)

        (k, subnet_prefix, low, high) = self.__genkey(pkey, subnet_prefix)

        (rc, exists) = semanage_ibpkey_exists(self.sh, k)
        if rc < 0:
            raise ValueError(_("Could not check if ibpkey %s/%s is defined") % (subnet_prefix, pkey))
        if not exists:
            raise ValueError(_("ibpkey %s/%s is not defined") % (subnet_prefix, pkey))

        (rc, p) = semanage_ibpkey_query(self.sh, k)
        if rc < 0:
            raise ValueError(_("Could not query ibpkey %s/%s") % (subnet_prefix, pkey))

        con = semanage_ibpkey_get_con(p)

        if (is_mls_enabled == 1) and (serange != ""):
            semanage_context_set_mls(self.sh, con, untranslate(serange))
        if setype != "":
            semanage_context_set_type(self.sh, con, setype)

        rc = semanage_ibpkey_modify_local(self.sh, k, p)
        if rc < 0:
            raise ValueError(_("Could not modify ibpkey %s/%s") % (subnet_prefix, pkey))

        semanage_ibpkey_key_free(k)
        semanage_ibpkey_free(p)

    def modify(self, pkey, subnet_prefix, serange, setype):
        self.begin()
        self.__modify(pkey, subnet_prefix, serange, setype)
        self.commit()

    def deleteall(self):
        (rc, plist) = semanage_ibpkey_list_local(self.sh)
        if rc < 0:
            raise ValueError(_("Could not list the ibpkeys"))

        self.begin()

        for ibpkey in plist:
            (rc, subnet_prefix) = semanage_ibpkey_get_subnet_prefix(self.sh, ibpkey)
            low = semanage_ibpkey_get_low(ibpkey)
            high = semanage_ibpkey_get_high(ibpkey)
            pkey_str = "%s-%s" % (low, high)
            (k, subnet_prefix, low, high) = self.__genkey(pkey_str, subnet_prefix)
            if rc < 0:
                raise ValueError(_("Could not create a key for %s") % pkey_str)

            rc = semanage_ibpkey_del_local(self.sh, k)
            if rc < 0:
                raise ValueError(_("Could not delete the ibpkey %s") % pkey_str)
            semanage_ibpkey_key_free(k)

        self.commit()

    def __delete(self, pkey, subnet_prefix):
        (k, subnet_prefix, low, high) = self.__genkey(pkey, subnet_prefix)
        (rc, exists) = semanage_ibpkey_exists(self.sh, k)
        if rc < 0:
            raise ValueError(_("Could not check if ibpkey %s/%s is defined") % (subnet_prefix, pkey))
        if not exists:
            raise ValueError(_("ibpkey %s/%s is not defined") % (subnet_prefix, pkey))

        (rc, exists) = semanage_ibpkey_exists_local(self.sh, k)
        if rc < 0:
            raise ValueError(_("Could not check if ibpkey %s/%s is defined") % (subnet_prefix, pkey))
        if not exists:
            raise ValueError(_("ibpkey %s/%s is defined in policy, cannot be deleted") % (subnet_prefix, pkey))

        rc = semanage_ibpkey_del_local(self.sh, k)
        if rc < 0:
            raise ValueError(_("Could not delete ibpkey %s/%s") % (subnet_prefix, pkey))

        semanage_ibpkey_key_free(k)

    def delete(self, pkey, subnet_prefix):
        self.begin()
        self.__delete(pkey, subnet_prefix)
        self.commit()

    def get_all(self, locallist=0):
        ddict = {}
        if locallist:
            (rc, self.plist) = semanage_ibpkey_list_local(self.sh)
        else:
            (rc, self.plist) = semanage_ibpkey_list(self.sh)
        if rc < 0:
            raise ValueError(_("Could not list ibpkeys"))

        for ibpkey in self.plist:
            con = semanage_ibpkey_get_con(ibpkey)
            ctype = semanage_context_get_type(con)
            if ctype == "reserved_ibpkey_t":
                continue
            level = semanage_context_get_mls(con)
            (rc, subnet_prefix) = semanage_ibpkey_get_subnet_prefix(self.sh, ibpkey)
            low = semanage_ibpkey_get_low(ibpkey)
            high = semanage_ibpkey_get_high(ibpkey)
            ddict[(low, high, subnet_prefix)] = (ctype, level)
        return ddict

    def get_all_by_type(self, locallist=0):
        ddict = {}
        if locallist:
            (rc, self.plist) = semanage_ibpkey_list_local(self.sh)
        else:
            (rc, self.plist) = semanage_ibpkey_list(self.sh)
        if rc < 0:
            raise ValueError(_("Could not list ibpkeys"))

        for ibpkey in self.plist:
            con = semanage_ibpkey_get_con(ibpkey)
            ctype = semanage_context_get_type(con)
            (rc, subnet_prefix) = semanage_ibpkey_get_subnet_prefix(self.sh, ibpkey)
            low = semanage_ibpkey_get_low(ibpkey)
            high = semanage_ibpkey_get_high(ibpkey)
            if (ctype, subnet_prefix) not in ddict.keys():
                ddict[(ctype, subnet_prefix)] = []
            if low == high:
                ddict[(ctype, subnet_prefix)].append("0x%x" % low)
            else:
                ddict[(ctype, subnet_prefix)].append("0x%x-0x%x" % (low, high))
        return ddict

    def customized(self):
        l = []
        ddict = self.get_all(True)

        for k in sorted(ddict.keys()):
            port = k[0] if k[0] == k[1] else "%s-%s" % (k[0], k[1])
            if ddict[k][1]:
                l.append("-a -t %s -r '%s' -x %s %s" % (ddict[k][0], ddict[k][1], k[2], port))
            else:
                l.append("-a -t %s -x %s %s" % (ddict[k][0], k[2], port))
        return l

    def list(self, heading=1, locallist=0):
        ddict = self.get_all_by_type(locallist)
        keys = ddict.keys()
        if len(keys) == 0:
            return

        if heading:
            print("%-30s %-18s %s\n" % (_("SELinux IB Pkey Type"), _("Subnet_Prefix"), _("Pkey Number")))
        for i in sorted(keys):
            rec = "%-30s %-18s " % i
            rec += "%s" % ddict[i][0]
            for p in ddict[i][1:]:
                rec += ", %s" % p
            print(rec)

class ibendportRecords(semanageRecords):
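    # Backend for "semanage ibendport ...": manages InfiniBand end port
    # labeling through libsemanage. Minimal usage sketch with illustrative
    # values; the device and type names are only examples, and the type must
    # carry the ibendport_type attribute:
    #   endports = ibendportRecords()
    #   endports.add("1", "mlx5_0", "s0", "my_ibendport_t")
    #   endports.list()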

    valid_types = []

    def __init__(self, args = None):
        semanageRecords.__init__(self, args)
        try:
            q = setools.TypeQuery(setools.SELinuxPolicy(sepolicy.get_store_policy(self.store)), attrs=["ibendport_type"])
            self.valid_types = set(str(t) for t in q.results())
        except:
            pass

    def __genkey(self, ibendport, ibdev_name):
        if ibdev_name == "":
            raise ValueError(_("IB device name is required"))

        port = int(ibendport)

        if port > 255 or port < 1:
            raise ValueError(_("Invalid Port Number"))

        (rc, k) = semanage_ibendport_key_create(self.sh, ibdev_name, port)
        if rc < 0:
            raise ValueError(_("Could not create a key for ibendport %s/%s") % (ibdev_name, ibendport))
        return (k, ibdev_name, port)

    def __add(self, ibendport, ibdev_name, serange, type):
        if is_mls_enabled == 1:
            if serange == "":
                serange = "s0"
            else:
                serange = untranslate(serange)

        if type == "":
            raise ValueError(_("Type is required"))

        type = sepolicy.get_real_type_name(type)

        if type not in self.valid_types:
            raise ValueError(_("Type %s is invalid, must be an ibendport type") % type)
        (k, ibdev_name, port) = self.__genkey(ibendport, ibdev_name)

        (rc, p) = semanage_ibendport_create(self.sh)
        if rc < 0:
            raise ValueError(_("Could not create ibendport for %s/%s") % (ibdev_name, port))

        semanage_ibendport_set_ibdev_name(self.sh, p, ibdev_name)
        semanage_ibendport_set_port(p, port)
        (rc, con) = semanage_context_create(self.sh)
        if rc < 0:
            raise ValueError(_("Could not create context for %s/%s") % (ibdev_name, port))

        rc = semanage_context_set_user(self.sh, con, "system_u")
        if rc < 0:
            raise ValueError(_("Could not set user in ibendport context for %s/%s") % (ibdev_name, port))

        rc = semanage_context_set_role(self.sh, con, "object_r")
        if rc < 0:
            raise ValueError(_("Could not set role in ibendport context for %s/%s") % (ibdev_name, port))

        rc = semanage_context_set_type(self.sh, con, type)
        if rc < 0:
            raise ValueError(_("Could not set type in ibendport context for %s/%s") % (ibdev_name, port))

        if (is_mls_enabled == 1) and (serange != ""):
            rc = semanage_context_set_mls(self.sh, con, serange)
            if rc < 0:
                raise ValueError(_("Could not set mls fields in ibendport context for %s/%s") % (ibdev_name, port))

        rc = semanage_ibendport_set_con(self.sh, p, con)
        if rc < 0:
            raise ValueError(_("Could not set ibendport context for %s/%s") % (ibdev_name, port))

        rc = semanage_ibendport_modify_local(self.sh, k, p)
        if rc < 0:
            raise ValueError(_("Could not add ibendport %s/%s") % (ibdev_name, port))

        semanage_context_free(con)
        semanage_ibendport_key_free(k)
        semanage_ibendport_free(p)

    def add(self, ibendport, ibdev_name, serange, type):
        self.begin()
        if self.__exists(ibendport, ibdev_name):
            print(_("ibendport {ibdev_name}/{port} already defined, modifying instead").format(ibdev_name=ibdev_name, port=ibendport))
            self.__modify(ibendport, ibdev_name, serange, type)
        else:
            self.__add(ibendport, ibdev_name, serange, type)
        self.commit()

    def __exists(self, ibendport, ibdev_name):
        (k, ibendport, port) = self.__genkey(ibendport, ibdev_name)

        (rc, exists) = semanage_ibendport_exists(self.sh, k)
        if rc < 0:
            raise ValueError(_("Could not check if ibendport {ibdev_name}/{port} is defined").format(ibdev_name=ibdev_name, port=port))
        semanage_ibendport_key_free(k)

        return exists

    def __modify(self, ibendport, ibdev_name, serange, setype):
        if serange == "" and setype == "":
            if is_mls_enabled == 1:
                raise ValueError(_("Requires setype or serange"))
            else:
                raise ValueError(_("Requires setype"))

        setype = sepolicy.get_real_type_name(setype)

        if setype and setype not in self.valid_types:
            raise ValueError(_("Type %s is invalid, must be an ibendport type") % setype)

        (k, ibdev_name, port) = self.__genkey(ibendport, ibdev_name)

        (rc, exists) = semanage_ibendport_exists(self.sh, k)
        if rc < 0:
            raise ValueError(_("Could not check if ibendport %s/%s is defined") % (ibdev_name, ibendport))
        if not exists:
            raise ValueError(_("ibendport %s/%s is not defined") % (ibdev_name, ibendport))

        (rc, p) = semanage_ibendport_query(self.sh, k)
        if rc < 0:
            raise ValueError(_("Could not query ibendport %s/%s") % (ibdev_name, ibendport))

        con = semanage_ibendport_get_con(p)

        if (is_mls_enabled == 1) and (serange != ""):
            semanage_context_set_mls(self.sh, con, untranslate(serange))
        if setype != "":
            semanage_context_set_type(self.sh, con, setype)

        rc = semanage_ibendport_modify_local(self.sh, k, p)
        if rc < 0:
            raise ValueError(_("Could not modify ibendport %s/%s") % (ibdev_name, ibendport))

        semanage_ibendport_key_free(k)
        semanage_ibendport_free(p)

    def modify(self, ibendport, ibdev_name, serange, setype):
        self.begin()
        self.__modify(ibendport, ibdev_name, serange, setype)
        self.commit()

    def deleteall(self):
        (rc, plist) = semanage_ibendport_list_local(self.sh)
        if rc < 0:
            raise ValueError(_("Could not list the ibendports"))

        self.begin()

        for ibendport in plist:
            (rc, ibdev_name) = semanage_ibendport_get_ibdev_name(self.sh, ibendport)
            port = semanage_ibendport_get_port(ibendport)
            (k, ibdev_name, port) = self.__genkey(str(port), ibdev_name)
            if rc < 0:
                raise ValueError(_("Could not create a key for %s/%d") % (ibdev_name, port))

            rc = semanage_ibendport_del_local(self.sh, k)
            if rc < 0:
                raise ValueError(_("Could not delete the ibendport %s/%d") % (ibdev_name, port))
            semanage_ibendport_key_free(k)

        self.commit()

    def __delete(self, ibendport, ibdev_name):
        (k, ibdev_name, port) = self.__genkey(ibendport, ibdev_name)
        (rc, exists) = semanage_ibendport_exists(self.sh, k)
        if rc < 0:
            raise ValueError(_("Could not check if ibendport %s/%s is defined") % (ibdev_name, ibendport))
        if not exists:
            raise ValueError(_("ibendport %s/%s is not defined") % (ibdev_name, ibendport))

        (rc, exists) = semanage_ibendport_exists_local(self.sh, k)
        if rc < 0:
            raise ValueError(_("Could not check if ibendport %s/%s is defined") % (ibdev_name, ibendport))
        if not exists:
            raise ValueError(_("ibendport %s/%s is defined in policy, cannot be deleted") % (ibdev_name, ibendport))

        rc = semanage_ibendport_del_local(self.sh, k)
        if rc < 0:
            raise ValueError(_("Could not delete ibendport %s/%s") % (ibdev_name, ibendport))

        semanage_ibendport_key_free(k)

    def delete(self, ibendport, ibdev_name):
        self.begin()
        self.__delete(ibendport, ibdev_name)
        self.commit()

    def get_all(self, locallist=0):
        ddict = {}
        if locallist:
            (rc, self.plist) = semanage_ibendport_list_local(self.sh)
        else:
            (rc, self.plist) = semanage_ibendport_list(self.sh)
        if rc < 0:
            raise ValueError(_("Could not list ibendports"))

        for ibendport in self.plist:
            con = semanage_ibendport_get_con(ibendport)
            ctype = semanage_context_get_type(con)
            if ctype == "reserved_ibendport_t":
                continue
            level = semanage_context_get_mls(con)
            (rc, ibdev_name) = semanage_ibendport_get_ibdev_name(self.sh, ibendport)
            port = semanage_ibendport_get_port(ibendport)
            ddict[(port, ibdev_name)] = (ctype, level)
        return ddict

    def get_all_by_type(self, locallist=0):
        ddict = {}
        if locallist:
            (rc, self.plist) = semanage_ibendport_list_local(self.sh)
        else:
            (rc, self.plist) = semanage_ibendport_list(self.sh)
        if rc < 0:
            raise ValueError(_("Could not list ibendports"))

        for ibendport in self.plist:
            con = semanage_ibendport_get_con(ibendport)
            ctype = semanage_context_get_type(con)
            (rc, ibdev_name) = semanage_ibendport_get_ibdev_name(self.sh, ibendport)
            port = semanage_ibendport_get_port(ibendport)
            if (ctype, ibdev_name) not in ddict.keys():
                ddict[(ctype, ibdev_name)] = []
            ddict[(ctype, ibdev_name)].append("0x%x" % port)
        return ddict

    def customized(self):
        l = []
        ddict = self.get_all(True)

        for k in sorted(ddict.keys()):
            if ddict[k][1]:
                l.append("-a -t %s -r '%s' -z %s %s" % (ddict[k][0], ddict[k][1], k[1], k[0]))
            else:
                l.append("-a -t %s -z %s %s" % (ddict[k][0], k[1], k[0]))
        return l

    def list(self, heading=1, locallist=0):
        ddict = self.get_all_by_type(locallist)
        keys = ddict.keys()
        if len(keys) == 0:
            return

        if heading:
            print("%-30s %-18s %s\n" % (_("SELinux IB End Port Type"), _("IB Device Name"), _("Port Number")))
        for i in sorted(keys):
            rec = "%-30s %-18s " % i
            rec += "%s" % ddict[i][0]
            for p in ddict[i][1:]:
                rec += ", %s" % p
            print(rec)

class nodeRecords(semanageRecords):
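    # Backend for "semanage node ...": manages labeling of network nodes
    # (hosts/subnets) through libsemanage. Minimal usage sketch with
    # illustrative values; the type must carry the node_type attribute:
    #   nodes = nodeRecords()
    #   nodes.add("10.0.0.0", "255.0.0.0", "ipv4", "s0", "node_t")
    #   nodes.list()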

    valid_types = []

    def __init__(self, args = None):
        semanageRecords.__init__(self, args)
        self.protocol = ["ipv4", "ipv6"]
        try:
            self.valid_types = list(list(sepolicy.info(sepolicy.ATTRIBUTE, "node_type"))[0]["types"])
        except RuntimeError:
            pass

    def validate(self, addr, mask, protocol):
        newaddr = addr
        newmask = mask
        newprotocol = ""

        if addr == "":
            raise ValueError(_("Node Address is required"))

        # verify valid combination
        if len(mask) == 0 or mask[0] == "/":
            i = ipaddress.ip_network(addr + mask)
            newaddr = str(i.network_address)
            newmask = str(i.netmask)
            if newmask == "0.0.0.0" and i.version == 6:
                newmask = "::"

            protocol = "ipv%d" % i.version

        try:
            newprotocol = self.protocol.index(protocol)
        except:
            raise ValueError(_("Unknown or missing protocol"))

        return newaddr, newmask, newprotocol

    def __add(self, addr, mask, proto, serange, ctype):
        addr, mask, proto = self.validate(addr, mask, proto)

        if is_mls_enabled == 1:
            if serange == "":
                serange = "s0"
            else:
                serange = untranslate(serange)

        if ctype == "":
            raise ValueError(_("SELinux node type is required"))

        ctype = sepolicy.get_real_type_name(ctype)

        if ctype not in self.valid_types:
            raise ValueError(_("Type %s is invalid, must be a node type") % ctype)

        (rc, k) = semanage_node_key_create(self.sh, addr, mask, proto)
        if rc < 0:
            raise ValueError(_("Could not create key for %s") % addr)

        (rc, node) = semanage_node_create(self.sh)
        if rc < 0:
            raise ValueError(_("Could not create addr for %s") % addr)
        semanage_node_set_proto(node, proto)

        rc = semanage_node_set_addr(self.sh, node, proto, addr)
        (rc, con) = semanage_context_create(self.sh)
        if rc < 0:
            raise ValueError(_("Could not create context for %s") % addr)

        rc = semanage_node_set_mask(self.sh, node, proto, mask)
        if rc < 0:
            raise ValueError(_("Could not set mask for %s") % addr)

        rc = semanage_context_set_user(self.sh, con, "system_u")
        if rc < 0:
            raise ValueError(_("Could not set user in addr context for %s") % addr)

        rc = semanage_context_set_role(self.sh, con, "object_r")
        if rc < 0:
            raise ValueError(_("Could not set role in addr context for %s") % addr)

        rc = semanage_context_set_type(self.sh, con, ctype)
        if rc < 0:
            raise ValueError(_("Could not set type in addr context for %s") % addr)

        if (is_mls_enabled == 1) and (serange != ""):
            rc = semanage_context_set_mls(self.sh, con, serange)
            if rc < 0:
                raise ValueError(_("Could not set mls fields in addr context for %s") % addr)

        rc = semanage_node_set_con(self.sh, node, con)
        if rc < 0:
            raise ValueError(_("Could not set addr context for %s") % addr)

        rc = semanage_node_modify_local(self.sh, k, node)
        if rc < 0:
            raise ValueError(_("Could not add addr %s") % addr)

        semanage_context_free(con)
        semanage_node_key_free(k)
        semanage_node_free(node)

        self.mylog.log_change("resrc=node op=add laddr=%s netmask=%s proto=%s tcontext=%s:%s:%s:%s" % (addr, mask, socket.getprotobyname(self.protocol[proto]), "system_u", "object_r", ctype, serange))

    def add(self, addr, mask, proto, serange, ctype):
        self.begin()
        if self.__exists(addr, mask, proto):
            print(_("Addr %s already defined, modifying instead") % addr)
            self.__modify(addr, mask, proto, serange, ctype)
        else:
            self.__add(addr, mask, proto, serange, ctype)
        self.commit()

    def __exists(self, addr, mask, proto):
        addr, mask, proto = self.validate(addr, mask, proto)

        (rc, k) = semanage_node_key_create(self.sh, addr, mask, proto)
        if rc < 0:
            raise ValueError(_("Could not create key for %s") % addr)

        (rc, exists) = semanage_node_exists(self.sh, k)
        if rc < 0:
            raise ValueError(_("Could not check if addr %s is defined") % addr)
        semanage_node_key_free(k)

        return exists

    def __modify(self, addr, mask, proto, serange, setype):
        addr, mask, proto = self.validate(addr, mask, proto)

        if serange == "" and setype == "":
            raise ValueError(_("Requires setype or serange"))

        setype = sepolicy.get_real_type_name(setype)

        if setype and setype not in self.valid_types:
            raise ValueError(_("Type %s is invalid, must be a node type") % setype)

        (rc, k) = semanage_node_key_create(self.sh, addr, mask, proto)
        if rc < 0:
            raise ValueError(_("Could not create key for %s") % addr)

        (rc, exists) = semanage_node_exists(self.sh, k)
        if rc < 0:
            raise ValueError(_("Could not check if addr %s is defined") % addr)
        if not exists:
            raise ValueError(_("Addr %s is not defined") % addr)

        (rc, node) = semanage_node_query(self.sh, k)
        if rc < 0:
            raise ValueError(_("Could not query addr %s") % addr)

        con = semanage_node_get_con(node)
        if (is_mls_enabled == 1) and (serange != ""):
            semanage_context_set_mls(self.sh, con, untranslate(serange))
        if setype != "":
            semanage_context_set_type(self.sh, con, setype)

        rc = semanage_node_modify_local(self.sh, k, node)
        if rc < 0:
            raise ValueError(_("Could not modify addr %s") % addr)

        semanage_node_key_free(k)
        semanage_node_free(node)

        self.mylog.log_change("resrc=node op=modify laddr=%s netmask=%s proto=%s tcontext=%s:%s:%s:%s" % (addr, mask, socket.getprotobyname(self.protocol[proto]), "system_u", "object_r", setype, serange))

    def modify(self, addr, mask, proto, serange, setype):
        self.begin()
        self.__modify(addr, mask, proto, serange, setype)
        self.commit()

    def __delete(self, addr, mask, proto):

        addr, mask, proto = self.validate(addr, mask, proto)

        (rc, k) = semanage_node_key_create(self.sh, addr, mask, proto)
        if rc < 0:
            raise ValueError(_("Could not create key for %s") % addr)

        (rc, exists) = semanage_node_exists(self.sh, k)
        if rc < 0:
            raise ValueError(_("Could not check if addr %s is defined") % addr)
        if not exists:
            raise ValueError(_("Addr %s is not defined") % addr)

        (rc, exists) = semanage_node_exists_local(self.sh, k)
        if rc < 0:
            raise ValueError(_("Could not check if addr %s is defined") % addr)
        if not exists:
            raise ValueError(_("Addr %s is defined in policy, cannot be deleted") % addr)

        rc = semanage_node_del_local(self.sh, k)
        if rc < 0:
            raise ValueError(_("Could not delete addr %s") % addr)

        semanage_node_key_free(k)

        self.mylog.log_change("resrc=node op=delete laddr=%s netmask=%s proto=%s" % (addr, mask, socket.getprotobyname(self.protocol[proto])))

    def delete(self, addr, mask, proto):
        self.begin()
        self.__delete(addr, mask, proto)
        self.commit()

    def deleteall(self):
        (rc, nlist) = semanage_node_list_local(self.sh)
        if rc < 0:
            raise ValueError(_("Could not delete all node mappings"))

        self.begin()
        for node in nlist:
            self.__delete(semanage_node_get_addr(self.sh, node)[1], semanage_node_get_mask(self.sh, node)[1], self.protocol[semanage_node_get_proto(node)])
        self.commit()

    def get_all(self, locallist=0):
        ddict = {}
        if locallist:
            (rc, self.ilist) = semanage_node_list_local(self.sh)
        else:
            (rc, self.ilist) = semanage_node_list(self.sh)
        if rc < 0:
            raise ValueError(_("Could not list addrs"))

        for node in self.ilist:
            con = semanage_node_get_con(node)
            addr = semanage_node_get_addr(self.sh, node)
            mask = semanage_node_get_mask(self.sh, node)
            proto = self.protocol[semanage_node_get_proto(node)]
            ddict[(addr[1], mask[1], proto)] = (semanage_context_get_user(con), semanage_context_get_role(con), semanage_context_get_type(con), semanage_context_get_mls(con))

        return ddict

    def customized(self):
        l = []
        ddict = self.get_all(True)
        for k in sorted(ddict.keys()):
            if ddict[k][3]:
                l.append("-a -M %s -p %s -t %s -r '%s' %s" % (k[1], k[2], ddict[k][2], ddict[k][3], k[0]))
            else:
                l.append("-a -M %s -p %s -t %s %s" % (k[1], k[2], ddict[k][2], k[0]))
        return l

    def list(self, heading=1, locallist=0):
        ddict = self.get_all(locallist)
        if len(ddict) == 0:
            return
        keys = sorted(ddict.keys())

        if heading:
            print("%-18s %-18s %-5s %-5s\n" % ("IP Address", "Netmask", "Protocol", "Context"))
        if is_mls_enabled:
            for k in keys:
                print("%-18s %-18s %-5s %s:%s:%s:%s " % (k[0], k[1], k[2], ddict[k][0], ddict[k][1], ddict[k][2], translate(ddict[k][3], False)))
        else:
            for k in keys:
                print("%-18s %-18s %-5s %s:%s:%s " % (k[0], k[1], k[2], ddict[k][0], ddict[k][1], ddict[k][2]))


class interfaceRecords(semanageRecords):
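    # Backend for "semanage interface ...": manages labeling of network
    # interfaces through libsemanage. Minimal usage sketch with illustrative
    # values:
    #   ifaces = interfaceRecords()
    #   ifaces.add("eth0", "s0", "netif_t")
    #   ifaces.list()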

    def __init__(self, args = None):
        semanageRecords.__init__(self, args)

    def __add(self, interface, serange, ctype):
        if is_mls_enabled == 1:
            if serange == "":
                serange = "s0"
            else:
                serange = untranslate(serange)

        if ctype == "":
            raise ValueError(_("SELinux Type is required"))

        (rc, k) = semanage_iface_key_create(self.sh, interface)
        if rc < 0:
            raise ValueError(_("Could not create key for %s") % interface)

        (rc, iface) = semanage_iface_create(self.sh)
        if rc < 0:
            raise ValueError(_("Could not create interface for %s") % interface)

        rc = semanage_iface_set_name(self.sh, iface, interface)
        (rc, con) = semanage_context_create(self.sh)
        if rc < 0:
            raise ValueError(_("Could not create context for %s") % interface)

        rc = semanage_context_set_user(self.sh, con, "system_u")
        if rc < 0:
            raise ValueError(_("Could not set user in interface context for %s") % interface)

        rc = semanage_context_set_role(self.sh, con, "object_r")
        if rc < 0:
            raise ValueError(_("Could not set role in interface context for %s") % interface)

        rc = semanage_context_set_type(self.sh, con, ctype)
        if rc < 0:
            raise ValueError(_("Could not set type in interface context for %s") % interface)

        if (is_mls_enabled == 1) and (serange != ""):
            rc = semanage_context_set_mls(self.sh, con, serange)
            if rc < 0:
                raise ValueError(_("Could not set mls fields in interface context for %s") % interface)

        rc = semanage_iface_set_ifcon(self.sh, iface, con)
        if rc < 0:
            raise ValueError(_("Could not set interface context for %s") % interface)

        rc = semanage_iface_set_msgcon(self.sh, iface, con)
        if rc < 0:
            raise ValueError(_("Could not set message context for %s") % interface)

        rc = semanage_iface_modify_local(self.sh, k, iface)
        if rc < 0:
            raise ValueError(_("Could not add interface %s") % interface)

        semanage_context_free(con)
        semanage_iface_key_free(k)
        semanage_iface_free(iface)

        self.mylog.log_change("resrc=interface op=add netif=%s tcontext=%s:%s:%s:%s" % (interface, "system_u", "object_r", ctype, serange))

    def add(self, interface, serange, ctype):
        self.begin()
        if self.__exists(interface):
            print(_("Interface %s already defined, modifying instead") % interface)
            self.__modify(interface, serange, ctype)
        else:
            self.__add(interface, serange, ctype)
        self.commit()

    def __exists(self, interface):
        (rc, k) = semanage_iface_key_create(self.sh, interface)
        if rc < 0:
            raise ValueError(_("Could not create key for %s") % interface)

        (rc, exists) = semanage_iface_exists(self.sh, k)
        if rc < 0:
            raise ValueError(_("Could not check if interface %s is defined") % interface)
        semanage_iface_key_free(k)

        return exists

    def __modify(self, interface, serange, setype):
        if serange == "" and setype == "":
            raise ValueError(_("Requires setype or serange"))

        (rc, k) = semanage_iface_key_create(self.sh, interface)
        if rc < 0:
            raise ValueError(_("Could not create key for %s") % interface)

        (rc, exists) = semanage_iface_exists(self.sh, k)
        if rc < 0:
            raise ValueError(_("Could not check if interface %s is defined") % interface)
        if not exists:
            raise ValueError(_("Interface %s is not defined") % interface)

        (rc, iface) = semanage_iface_query(self.sh, k)
        if rc < 0:
            raise ValueError(_("Could not query interface %s") % interface)

        con = semanage_iface_get_ifcon(iface)

        if (is_mls_enabled == 1) and (serange != ""):
            semanage_context_set_mls(self.sh, con, untranslate(serange))
        if setype != "":
            semanage_context_set_type(self.sh, con, setype)

        rc = semanage_iface_modify_local(self.sh, k, iface)
        if rc < 0:
            raise ValueError(_("Could not modify interface %s") % interface)

        semanage_iface_key_free(k)
        semanage_iface_free(iface)

        self.mylog.log_change("resrc=interface op=modify netif=%s tcontext=%s:%s:%s:%s" % (interface, "system_u", "object_r", setype, serange))

    def modify(self, interface, serange, setype):
        self.begin()
        self.__modify(interface, serange, setype)
        self.commit()

    def __delete(self, interface):
        (rc, k) = semanage_iface_key_create(self.sh, interface)
        if rc < 0:
            raise ValueError(_("Could not create key for %s") % interface)

        (rc, exists) = semanage_iface_exists(self.sh, k)
        if rc < 0:
            raise ValueError(_("Could not check if interface %s is defined") % interface)
        if not exists:
            raise ValueError(_("Interface %s is not defined") % interface)

        (rc, exists) = semanage_iface_exists_local(self.sh, k)
        if rc < 0:
            raise ValueError(_("Could not check if interface %s is defined") % interface)
        if not exists:
            raise ValueError(_("Interface %s is defined in policy, cannot be deleted") % interface)

        rc = semanage_iface_del_local(self.sh, k)
        if rc < 0:
            raise ValueError(_("Could not delete interface %s") % interface)

        semanage_iface_key_free(k)

        self.mylog.log_change("resrc=interface op=delete netif=%s" % interface)

    def delete(self, interface):
        self.begin()
        self.__delete(interface)
        self.commit()

    def deleteall(self):
        (rc, ulist) = semanage_iface_list_local(self.sh)
        if rc < 0:
            raise ValueError(_("Could not delete all interface mappings"))

        self.begin()
        for i in ulist:
            self.__delete(semanage_iface_get_name(i))
        self.commit()

    def get_all(self, locallist=0):
        ddict = {}
        if locallist:
            (rc, self.ilist) = semanage_iface_list_local(self.sh)
        else:
            (rc, self.ilist) = semanage_iface_list(self.sh)
        if rc < 0:
            raise ValueError(_("Could not list interfaces"))

        for interface in self.ilist:
            con = semanage_iface_get_ifcon(interface)
            ddict[semanage_iface_get_name(interface)] = (semanage_context_get_user(con), semanage_context_get_role(con), semanage_context_get_type(con), semanage_context_get_mls(con))

        return ddict

    def customized(self):
        l = []
        ddict = self.get_all(True)
        for k in sorted(ddict.keys()):
            if ddict[k][3]:
                l.append("-a -t %s -r '%s' %s" % (ddict[k][2], ddict[k][3], k))
            else:
                l.append("-a -t %s %s" % (ddict[k][2], k))
        return l

    def list(self, heading=1, locallist=0):
        ddict = self.get_all(locallist)
        if len(ddict) == 0:
            return
        keys = sorted(ddict.keys())

        if heading:
            print("%-30s %s\n" % (_("SELinux Interface"), _("Context")))
        if is_mls_enabled:
            for k in keys:
                print("%-30s %s:%s:%s:%s " % (k, ddict[k][0], ddict[k][1], ddict[k][2], translate(ddict[k][3], False)))
        else:
            for k in keys:
                print("%-30s %s:%s:%s " % (k, ddict[k][0], ddict[k][1], ddict[k][2]))


class fcontextRecords(semanageRecords):
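    # Backend for "semanage fcontext ...": manages file context records and
    # path equivalence (substitution) rules through libsemanage.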

    valid_types = []

    def __init__(self, args = None):
        semanageRecords.__init__(self, args)
        try:
            self.valid_types = list(list(sepolicy.info(sepolicy.ATTRIBUTE, "file_type"))[0]["types"])
            self.valid_types += list(list(sepolicy.info(sepolicy.ATTRIBUTE, "device_node"))[0]["types"])
        except RuntimeError:
            pass

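        # Path equivalence rules: local rules come from file_contexts.subs,
        # distribution-shipped rules from file_contexts.subs_dist; each
        # non-comment line is "<target> <substitute>".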
        self.equiv = {}
        self.equiv_dist = {}
        self.equal_ind = False
        try:
            fd = open(selinux.selinux_file_context_subs_path(), "r")
            for i in fd.readlines():
                i = i.strip()
                if len(i) == 0:
                    continue
                if i.startswith("#"):
                    continue
                target, substitute = i.split()
                self.equiv[target] = substitute
            fd.close()
        except IOError:
            pass
        try:
            fd = open(selinux.selinux_file_context_subs_dist_path(), "r")
            for i in fd.readlines():
                i = i.strip()
                if len(i) == 0:
                    continue
                if i.startswith("#"):
                    continue
                target, substitute = i.split()
                self.equiv_dist[target] = substitute
            fd.close()
        except IOError:
            pass

    def commit(self):
        if self.equal_ind:
            subs_file = selinux.selinux_file_context_subs_path()
            tmpfile = "%s.tmp" % subs_file
            fd = open(tmpfile, "w")
            for target in self.equiv.keys():
                fd.write("%s %s\n" % (target, self.equiv[target]))
            fd.close()
            try:
                os.chmod(tmpfile, os.stat(subs_file)[stat.ST_MODE])
            except:
                pass
            os.rename(tmpfile, subs_file)
            self.equal_ind = False
        semanageRecords.commit(self)

    def add_equal(self, target, substitute):
        self.begin()
        if target != "/" and target[-1] == "/":
            raise ValueError(_("Target %s is not valid. Target is not allowed to end with '/'") % target)

        if substitute != "/" and substitute[-1] == "/":
            raise ValueError(_("Substitute %s is not valid. Substitute is not allowed to end with '/'") % substitute)

        if target in self.equiv.keys():
            print(_("Equivalence class for %s already exists, modifying instead") % target)
            self.equiv[target] = substitute
            self.equal_ind = True
            self.mylog.log_change("resrc=fcontext op=modify-equal %s %s" % (audit.audit_encode_nv_string("sglob", target, 0), audit.audit_encode_nv_string("tglob", substitute, 0)))
            self.commit()
            return

        self.validate(target)

        for fdict in (self.equiv, self.equiv_dist):
            for i in fdict:
                if i.startswith(target + "/"):
                    raise ValueError(_("File spec %s conflicts with equivalency rule '%s %s'") % (target, i, fdict[i]))

        self.mylog.log_change("resrc=fcontext op=add-equal %s %s" % (audit.audit_encode_nv_string("sglob", target, 0), audit.audit_encode_nv_string("tglob", substitute, 0)))

        self.equiv[target] = substitute
        self.equal_ind = True
        self.commit()

    def modify_equal(self, target, substitute):
        self.begin()
        if target not in self.equiv.keys():
            raise ValueError(_("Equivalence class for %s does not exist") % target)
        self.equiv[target] = substitute
        self.equal_ind = True

        self.mylog.log_change("resrc=fcontext op=modify-equal %s %s" % (audit.audit_encode_nv_string("sglob", target, 0), audit.audit_encode_nv_string("tglob", substitute, 0)))

        self.commit()

    def createcon(self, target, seuser="system_u"):
        (rc, con) = semanage_context_create(self.sh)
        if rc < 0:
            raise ValueError(_("Could not create context for %s") % target)
        if seuser == "":
            seuser = "system_u"

        rc = semanage_context_set_user(self.sh, con, seuser)
        if rc < 0:
            raise ValueError(_("Could not set user in file context for %s") % target)

        rc = semanage_context_set_role(self.sh, con, "object_r")
        if rc < 0:
            raise ValueError(_("Could not set role in file context for %s") % target)

        if is_mls_enabled == 1:
            rc = semanage_context_set_mls(self.sh, con, "s0")
            if rc < 0:
                raise ValueError(_("Could not set mls fields in file context for %s") % target)

        return con

    def validate(self, target):
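        # A file specification must be non-empty, single-line and contain no
        # spaces, and it must not fall under an existing local or distribution
        # equivalence rule (the substituted path is suggested instead).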
        if target == "" or target.find("\n") >= 0:
            raise ValueError(_("Invalid file specification"))
        if target.find(" ") != -1:
            raise ValueError(_("File specification can not include spaces"))
        for fdict in (self.equiv, self.equiv_dist):
            for i in fdict:
                if target.startswith(i + "/"):
                    t = re.sub(i, fdict[i], target)
                    raise ValueError(_("File spec %s conflicts with equivalency rule '%s %s'; Try adding '%s' instead") % (target, i, fdict[i], t))

    def __add(self, target, type, ftype="", serange="", seuser="system_u"):
        self.validate(target)

        if is_mls_enabled == 1:
            serange = untranslate(serange)

        if type == "":
            raise ValueError(_("SELinux Type is required"))

        if type != "<<none>>":
            type = sepolicy.get_real_type_name(type)
            if type not in self.valid_types:
                raise ValueError(_("Type %s is invalid, must be a file or device type") % type)

        (rc, k) = semanage_fcontext_key_create(self.sh, target, file_types[ftype])
        if rc < 0:
            raise ValueError(_("Could not create key for %s") % target)

        (rc, fcontext) = semanage_fcontext_create(self.sh)
        if rc < 0:
            raise ValueError(_("Could not create file context for %s") % target)

        rc = semanage_fcontext_set_expr(self.sh, fcontext, target)
        if type != "<<none>>":
            con = self.createcon(target, seuser)

            rc = semanage_context_set_type(self.sh, con, type)
            if rc < 0:
                raise ValueError(_("Could not set type in file context for %s") % target)

            if (is_mls_enabled == 1) and (serange != ""):
                rc = semanage_context_set_mls(self.sh, con, serange)
                if rc < 0:
                    raise ValueError(_("Could not set mls fields in file context for %s") % target)
            rc = semanage_fcontext_set_con(self.sh, fcontext, con)
            if rc < 0:
                raise ValueError(_("Could not set file context for %s") % target)

        semanage_fcontext_set_type(fcontext, file_types[ftype])

        rc = semanage_fcontext_modify_local(self.sh, k, fcontext)
        if rc < 0:
            raise ValueError(_("Could not add file context for %s") % target)

        if type != "<<none>>":
            semanage_context_free(con)
        semanage_fcontext_key_free(k)
        semanage_fcontext_free(fcontext)

        if not seuser:
            seuser = "system_u"

        self.mylog.log_change("resrc=fcontext op=add %s ftype=%s tcontext=%s:%s:%s:%s" % (audit.audit_encode_nv_string("tglob", target, 0), ftype_to_audit[ftype], seuser, "object_r", type, serange))

    def add(self, target, type, ftype="", serange="", seuser="system_u"):
        self.begin()
        if self.__exists(target, ftype):
            print(_("File context for %s already defined, modifying instead") % target)
            self.__modify(target, type, ftype, serange, seuser)
        else:
            self.__add(target, type, ftype, serange, seuser)
        self.commit()

    def __exists(self, target, ftype):
        (rc, k) = semanage_fcontext_key_create(self.sh, target, file_types[ftype])
        if rc < 0:
            raise ValueError(_("Could not create key for %s") % target)

        (rc, exists) = semanage_fcontext_exists(self.sh, k)
        if rc < 0:
            raise ValueError(_("Could not check if file context for %s is defined") % target)

        if not exists:
            (rc, exists) = semanage_fcontext_exists_local(self.sh, k)
            if rc < 0:
                raise ValueError(_("Could not check if file context for %s is defined") % target)
        semanage_fcontext_key_free(k)

        return exists

    def __modify(self, target, setype, ftype, serange, seuser):
        if serange == "" and setype == "" and seuser == "":
            raise ValueError(_("Requires setype, serange or seuser"))
        if setype not in ["",  "<<none>>"]:
            setype = sepolicy.get_real_type_name(setype)
            if setype not in self.valid_types:
                raise ValueError(_("Type %s is invalid, must be a file or device type") % setype)

        self.validate(target)

        (rc, k) = semanage_fcontext_key_create(self.sh, target, file_types[ftype])
        if rc < 0:
            raise ValueError(_("Could not create a key for %s") % target)

        (rc, exists) = semanage_fcontext_exists(self.sh, k)
        if rc < 0:
            raise ValueError(_("Could not check if file context for %s is defined") % target)
        if exists:
            try:
                (rc, fcontext) = semanage_fcontext_query(self.sh, k)
            except OSError:
                raise ValueError(_("Could not query file context for %s") % target)
        else:
            (rc, exists) = semanage_fcontext_exists_local(self.sh, k)
            if rc < 0:
                raise ValueError(_("Could not check if file context for %s is defined") % target)
            if not exists:
                raise ValueError(_("File context for %s is not defined") % target)
            try:
                (rc, fcontext) = semanage_fcontext_query_local(self.sh, k)
            except OSError:
                raise ValueError(_("Could not query file context for %s") % target)

        if setype != "<<none>>":
            con = semanage_fcontext_get_con(fcontext)

            if con is None:
                con = self.createcon(target)

            if (is_mls_enabled == 1) and (serange != ""):
                semanage_context_set_mls(self.sh, con, untranslate(serange))
            if seuser != "":
                semanage_context_set_user(self.sh, con, seuser)

            if setype != "":
                semanage_context_set_type(self.sh, con, setype)

            rc = semanage_fcontext_set_con(self.sh, fcontext, con)
            if rc < 0:
                raise ValueError(_("Could not set file context for %s") % target)
        else:
            rc = semanage_fcontext_set_con(self.sh, fcontext, None)
            if rc < 0:
                raise ValueError(_("Could not set file context for %s") % target)

        rc = semanage_fcontext_modify_local(self.sh, k, fcontext)
        if rc < 0:
            raise ValueError(_("Could not modify file context for %s") % target)

        semanage_fcontext_key_free(k)
        semanage_fcontext_free(fcontext)

        if not seuser:
            seuser = "system_u"

        self.mylog.log_change("resrc=fcontext op=modify %s ftype=%s tcontext=%s:%s:%s:%s" % (audit.audit_encode_nv_string("tglob", target, 0), ftype_to_audit[ftype], seuser, "object_r", setype, serange))

    def modify(self, target, setype, ftype, serange, seuser):
        self.begin()
        self.__modify(target, setype, ftype, serange, seuser)
        self.commit()

    def deleteall(self):
        (rc, flist) = semanage_fcontext_list_local(self.sh)
        if rc < 0:
            raise ValueError(_("Could not list the file contexts"))

        self.begin()

        for fcontext in flist:
            target = semanage_fcontext_get_expr(fcontext)
            ftype = semanage_fcontext_get_type(fcontext)
            ftype_str = semanage_fcontext_get_type_str(ftype)
            (rc, k) = semanage_fcontext_key_create(self.sh, target, file_types[ftype_str])
            if rc < 0:
                raise ValueError(_("Could not create a key for %s") % target)

            rc = semanage_fcontext_del_local(self.sh, k)
            if rc < 0:
                raise ValueError(_("Could not delete the file context %s") % target)
            semanage_fcontext_key_free(k)

            self.mylog.log_change("resrc=fcontext op=delete %s ftype=%s" % (audit.audit_encode_nv_string("tglob", target, 0), ftype_to_audit[file_type_str_to_option[ftype_str]]))

        self.equiv = {}
        self.equal_ind = True
        self.commit()

    def __delete(self, target, ftype):
        if target in self.equiv.keys():
            self.equiv.pop(target)
            self.equal_ind = True

            self.mylog.log_change("resrc=fcontext op=delete-equal %s" % (audit.audit_encode_nv_string("tglob", target, 0)))

            return

        (rc, k) = semanage_fcontext_key_create(self.sh, target, file_types[ftype])
        if rc < 0:
            raise ValueError(_("Could not create a key for %s") % target)

        (rc, exists) = semanage_fcontext_exists_local(self.sh, k)
        if rc < 0:
            raise ValueError(_("Could not check if file context for %s is defined") % target)
        if not exists:
            (rc, exists) = semanage_fcontext_exists(self.sh, k)
            if rc < 0:
                raise ValueError(_("Could not check if file context for %s is defined") % target)
            if exists:
                raise ValueError(_("File context for %s is defined in policy, cannot be deleted") % target)
            else:
                raise ValueError(_("File context for %s is not defined") % target)

        rc = semanage_fcontext_del_local(self.sh, k)
        if rc < 0:
            raise ValueError(_("Could not delete file context for %s") % target)

        semanage_fcontext_key_free(k)

        self.mylog.log_change("resrc=fcontext op=delete %s ftype=%s" % (audit.audit_encode_nv_string("tglob", target, 0), ftype_to_audit[ftype]))

    def delete(self, target, ftype):
        self.begin()
        self.__delete(target, ftype)
        self.commit()

    def get_all(self, locallist=0):
        if locallist:
            (rc, self.flist) = semanage_fcontext_list_local(self.sh)
        else:
            (rc, self.flist) = semanage_fcontext_list(self.sh)
            if rc < 0:
                raise ValueError(_("Could not list file contexts"))

            (rc, fchomedirs) = semanage_fcontext_list_homedirs(self.sh)
            if rc < 0:
                raise ValueError(_("Could not list file contexts for home directories"))

            (rc, fclocal) = semanage_fcontext_list_local(self.sh)
            if rc < 0:
                raise ValueError(_("Could not list local file contexts"))

            self.flist += fchomedirs
            self.flist += fclocal

        ddict = {}
        for fcontext in self.flist:
            expr = semanage_fcontext_get_expr(fcontext)
            ftype = semanage_fcontext_get_type(fcontext)
            ftype_str = semanage_fcontext_get_type_str(ftype)
            con = semanage_fcontext_get_con(fcontext)
            if con:
                ddict[(expr, ftype_str)] = (semanage_context_get_user(con), semanage_context_get_role(con), semanage_context_get_type(con), semanage_context_get_mls(con))
            else:
                ddict[(expr, ftype_str)] = con

        return ddict

    def customized(self):
        l = []
        fcon_dict = self.get_all(True)
        for k in fcon_dict.keys():
            if fcon_dict[k]:
                if fcon_dict[k][3]:
                    l.append("-a -f %s -t %s -r '%s' '%s'" % (file_type_str_to_option[k[1]], fcon_dict[k][2], fcon_dict[k][3], k[0]))
                else:
                    l.append("-a -f %s -t %s '%s'" % (file_type_str_to_option[k[1]], fcon_dict[k][2], k[0]))

        if len(self.equiv):
            for target in self.equiv.keys():
                l.append("-a -e %s %s" % (self.equiv[target], target))
        return l

    def list(self, heading=1, locallist=0):
        fcon_dict = self.get_all(locallist)
        if len(fcon_dict) != 0:
            if heading:
                print("%-50s %-18s %s\n" % (_("SELinux fcontext"), _("type"), _("Context")))
            # do not sort local customizations since they are evaluated in the order they were added
            if locallist:
                fkeys = fcon_dict.keys()
            else:
                fkeys = sorted(fcon_dict.keys())
            for k in fkeys:
                if fcon_dict[k]:
                    if is_mls_enabled:
                        print("%-50s %-18s %s:%s:%s:%s " % (k[0], k[1], fcon_dict[k][0], fcon_dict[k][1], fcon_dict[k][2], translate(fcon_dict[k][3], False)))
                    else:
                        print("%-50s %-18s %s:%s:%s " % (k[0], k[1], fcon_dict[k][0], fcon_dict[k][1], fcon_dict[k][2]))
                else:
                    print("%-50s %-18s <<None>>" % (k[0], k[1]))

        if len(self.equiv_dist):
            if not locallist:
                if heading:
                    print(_("\nSELinux Distribution fcontext Equivalence \n"))
                for target in self.equiv_dist.keys():
                    print("%s = %s" % (target, self.equiv_dist[target]))
        if len(self.equiv):
            if heading:
                print(_("\nSELinux Local fcontext Equivalence \n"))

            for target in self.equiv.keys():
                print("%s = %s" % (target, self.equiv[target]))


class booleanRecords(semanageRecords):
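    """Manage SELinux booleans through the semanage_bool_* bindings.  Values
    may be given as true/false, on/off or 1/0; when this record store matches
    the running policy type, the active value of a boolean is updated along
    with the pending/local one."""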

    def __init__(self, args = None):
        semanageRecords.__init__(self, args)
        self.dict = {}
        self.dict["TRUE"] = 1
        self.dict["FALSE"] = 0
        self.dict["ON"] = 1
        self.dict["OFF"] = 0
        self.dict["1"] = 1
        self.dict["0"] = 0

        try:
            rc, self.current_booleans = selinux.security_get_boolean_names()
            rc, ptype = selinux.selinux_getpolicytype()
        except:
            self.current_booleans = []
            ptype = None

        if self.store == "" or self.store == ptype:
            self.modify_local = True
        else:
            self.modify_local = False

    def __mod(self, name, value):
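        # selinux_boolean_sub() maps old or aliased boolean names to their
        # current name before the key is built; the active value is only
        # touched when this store manages the running policy.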
        name = selinux.selinux_boolean_sub(name)

        (rc, k) = semanage_bool_key_create(self.sh, name)
        if rc < 0:
            raise ValueError(_("Could not create a key for %s") % name)
        (rc, exists) = semanage_bool_exists(self.sh, k)
        if rc < 0:
            raise ValueError(_("Could not check if boolean %s is defined") % name)
        if not exists:
            raise ValueError(_("Boolean %s is not defined") % name)

        (rc, b) = semanage_bool_query(self.sh, k)
        if rc < 0:
            raise ValueError(_("Could not query file context %s") % name)

        if value.upper() in self.dict:
            semanage_bool_set_value(b, self.dict[value.upper()])
        else:
            raise ValueError(_("You must specify one of the following values: %s") % ", ".join(self.dict.keys()))

        if self.modify_local and name in self.current_booleans:
            rc = semanage_bool_set_active(self.sh, k, b)
            if rc < 0:
                raise ValueError(_("Could not set active value of boolean %s") % name)
        rc = semanage_bool_modify_local(self.sh, k, b)
        if rc < 0:
            raise ValueError(_("Could not modify boolean %s") % name)
        semanage_bool_key_free(k)
        semanage_bool_free(b)

    def modify(self, name, value=None, use_file=False):
        self.begin()
        if use_file:
            fd = open(name)
            for b in fd.read().split("\n"):
                b = b.strip()
                if len(b) == 0:
                    continue

                try:
                    boolname, val = b.split("=")
                except ValueError:
                    raise ValueError(_("Bad format %s: Record %s" % (name, b)))
                self.__mod(boolname.strip(), val.strip())
            fd.close()
        else:
            self.__mod(name, value)

        self.commit()

    def __delete(self, name):
        name = selinux.selinux_boolean_sub(name)

        (rc, k) = semanage_bool_key_create(self.sh, name)
        if rc < 0:
            raise ValueError(_("Could not create a key for %s") % name)
        (rc, exists) = semanage_bool_exists(self.sh, k)
        if rc < 0:
            raise ValueError(_("Could not check if boolean %s is defined") % name)
        if not exists:
            raise ValueError(_("Boolean %s is not defined") % name)

        (rc, exists) = semanage_bool_exists_local(self.sh, k)
        if rc < 0:
            raise ValueError(_("Could not check if boolean %s is defined") % name)
        if not exists:
            raise ValueError(_("Boolean %s is defined in policy, cannot be deleted") % name)

        rc = semanage_bool_del_local(self.sh, k)
        if rc < 0:
            raise ValueError(_("Could not delete boolean %s") % name)

        semanage_bool_key_free(k)

    def delete(self, name):
        self.begin()
        self.__delete(name)
        self.commit()

    def deleteall(self):
        (rc, self.blist) = semanage_bool_list_local(self.sh)
        if rc < 0:
            raise ValueError(_("Could not list booleans"))

        self.begin()

        for boolean in self.blist:
            name = semanage_bool_get_name(boolean)
            self.__delete(name)

        self.commit()

    def get_all(self, locallist=0):
        ddict = {}
        if locallist:
            (rc, self.blist) = semanage_bool_list_local(self.sh)
        else:
            (rc, self.blist) = semanage_bool_list(self.sh)
        if rc < 0:
            raise ValueError(_("Could not list booleans"))

        for boolean in self.blist:
            value = []
            name = semanage_bool_get_name(boolean)
            value.append(semanage_bool_get_value(boolean))
            if self.modify_local and name in self.current_booleans:
                value.append(selinux.security_get_boolean_pending(name))
                value.append(selinux.security_get_boolean_active(name))
            else:
                value.append(value[0])
                value.append(value[0])
            ddict[name] = value

        return ddict

    def get_desc(self, name):
        name = selinux.selinux_boolean_sub(name)
        return sepolicy.boolean_desc(name)

    def get_category(self, name):
        name = selinux.selinux_boolean_sub(name)
        return sepolicy.boolean_category(name)

    def customized(self):
        l = []
        ddict = self.get_all(True)
        for k in sorted(ddict.keys()):
            if ddict[k]:
                l.append("-m -%s %s" % (ddict[k][2], k))
        return l

    def list(self, heading=True, locallist=False, use_file=False):
        on_off = (_("off"), _("on"))
        if use_file:
            ddict = self.get_all(locallist)
            for k in sorted(ddict.keys()):
                if ddict[k]:
                    print("%s=%s" % (k, ddict[k][2]))
            return
        ddict = self.get_all(locallist)
        if len(ddict) == 0:
            return

        if heading:
            print("%-30s %s  %s %s\n" % (_("SELinux boolean"), _("State"), _("Default"), _("Description")))
        for k in sorted(ddict.keys()):
            if ddict[k]:
                print("%-30s (%-5s,%5s)  %s" % (k, on_off[ddict[k][2]], on_off[ddict[k][0]], self.get_desc(k)))
site-packages/sepolicy-1.1-py3.6.egg-info
Metadata-Version: 1.0
Name: sepolicy
Version: 1.1
Summary: Python SELinux Policy Analyses bindings
Home-page: UNKNOWN
Author: Daniel Walsh
Author-email: dwalsh@redhat.com
License: UNKNOWN
Description: UNKNOWN
Platform: UNKNOWN
site-packages/python_linux_procfs-0.7.0-py3.6.egg-info/PKG-INFO
Metadata-Version: 1.0
Name: python-linux-procfs
Version: 0.7.0
Summary: Linux /proc abstraction classes
Home-page: http://userweb.kernel.org/python-linux-procfs
Author: Arnaldo Carvalho de Melo
Author-email: acme@redhat.com
License: GPLv2
Description: Abstractions to extract information from the Linux kernel /proc files.
        
Platform: UNKNOWN
site-packages/python_linux_procfs-0.7.0-py3.6.egg-info/requires.txt
six
site-packages/python_linux_procfs-0.7.0-py3.6.egg-info/SOURCES.txt
pflags
setup.py
procfs/__init__.py
procfs/procfs.py
procfs/utilist.py
python_linux_procfs.egg-info/PKG-INFO
python_linux_procfs.egg-info/SOURCES.txt
python_linux_procfs.egg-info/dependency_links.txt
python_linux_procfs.egg-info/requires.txt
python_linux_procfs.egg-info/top_level.txt
site-packages/python_linux_procfs-0.7.0-py3.6.egg-info/dependency_links.txt
site-packages/python_linux_procfs-0.7.0-py3.6.egg-info/top_level.txt
procfs
site-packages/sepolicy/booleans.py
# Copyright (C) 2012 Red Hat
# see file 'COPYING' for use and warranty information
#
# setrans is a tool for analyzing process transitions in SELinux policy
#
#    This program is free software; you can redistribute it and/or
#    modify it under the terms of the GNU General Public License as
#    published by the Free Software Foundation; either version 2 of
#    the License, or (at your option) any later version.
#
#    This program is distributed in the hope that it will be useful,
#    but WITHOUT ANY WARRANTY; without even the implied warranty of
#    MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
#    GNU General Public License for more details.
#
#    You should have received a copy of the GNU General Public License
#    along with this program; if not, write to the Free Software
#    Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA
#                                        02111-1307  USA
#
#
import sepolicy


def expand_attribute(attribute):
    try:
        return sepolicy.info(sepolicy.ATTRIBUTE, attribute)[0]["types"]
    except RuntimeError:
        return [attribute]


def get_types(src, tclass, perm):
    allows = sepolicy.search([sepolicy.ALLOW], {sepolicy.SOURCE: src, sepolicy.CLASS: tclass, sepolicy.PERMS: perm})
    if not allows:
        raise TypeError("The %s type is not allowed to %s any types" % (src, ",".join(perm)))

    tlist = []
    for l in map(lambda y: y[sepolicy.TARGET], filter(lambda x: set(perm).issubset(x[sepolicy.PERMS]), allows)):
        tlist = tlist + expand_attribute(l)
    return tlist
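
# Illustrative example (assumes a loaded policy that defines an httpd_t
# domain; not something exercised by this module itself):
#     get_types("httpd_t", "file", ["read"])
# returns every type that httpd_t is allowed to read as a plain file, with
# attributes expanded to their member types.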
site-packages/sepolicy/templates/tmp.py
# Copyright (C) 2007-2012 Red Hat
# see file 'COPYING' for use and warranty information
#
# policygentool is a tool for the initial generation of SELinux policy
#
#    This program is free software; you can redistribute it and/or
#    modify it under the terms of the GNU General Public License as
#    published by the Free Software Foundation; either version 2 of
#    the License, or (at your option) any later version.
#
#    This program is distributed in the hope that it will be useful,
#    but WITHOUT ANY WARRANTY; without even the implied warranty of
#    MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
#    GNU General Public License for more details.
#
#    You should have received a copy of the GNU General Public License
#    along with this program; if not, write to the Free Software
#    Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA
#                                        02111-1307  USA
#
#
########################### tmp Template File #############################
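# The te_*/if_*/fc_* strings below are SELinux policy source fragments; the
# policy generator that consumes this template is expected to substitute
# TEMPLATETYPE with the name of the module being generated before writing the
# resulting .te/.if/.fc files (an assumption based on the placeholders, not
# something this file does by itself).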

te_types="""
type TEMPLATETYPE_tmp_t;
files_tmp_file(TEMPLATETYPE_tmp_t)
"""

te_rules="""
manage_dirs_pattern(TEMPLATETYPE_t, TEMPLATETYPE_tmp_t, TEMPLATETYPE_tmp_t)
manage_files_pattern(TEMPLATETYPE_t, TEMPLATETYPE_tmp_t, TEMPLATETYPE_tmp_t)
manage_lnk_files_pattern(TEMPLATETYPE_t, TEMPLATETYPE_tmp_t, TEMPLATETYPE_tmp_t)
files_tmp_filetrans(TEMPLATETYPE_t, TEMPLATETYPE_tmp_t, { dir file lnk_file })
"""

te_stream_rules="""
manage_sock_files_pattern(TEMPLATETYPE_t, TEMPLATETYPE_tmp_t, TEMPLATETYPE_tmp_t)
files_tmp_filetrans(TEMPLATETYPE_t, TEMPLATETYPE_tmp_t, sock_file)
"""

if_rules="""
########################################
## <summary>
##	Do not audit attempts to read
##	TEMPLATETYPE tmp files
## </summary>
## <param name="domain">
##	<summary>
##	Domain to not audit.
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_dontaudit_read_tmp_files',`
	gen_require(`
		type TEMPLATETYPE_tmp_t;
	')

	dontaudit $1 TEMPLATETYPE_tmp_t:file read_file_perms;
')

########################################
## <summary>
##	Read TEMPLATETYPE tmp files
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_read_tmp_files',`
	gen_require(`
		type TEMPLATETYPE_tmp_t;
	')

	files_search_tmp($1)
	read_files_pattern($1, TEMPLATETYPE_tmp_t, TEMPLATETYPE_tmp_t)
')

########################################
## <summary>
##	Manage TEMPLATETYPE tmp files
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_manage_tmp',`
	gen_require(`
		type TEMPLATETYPE_tmp_t;
	')

	files_search_tmp($1)
	manage_dirs_pattern($1, TEMPLATETYPE_tmp_t, TEMPLATETYPE_tmp_t)
	manage_files_pattern($1, TEMPLATETYPE_tmp_t, TEMPLATETYPE_tmp_t)
	manage_lnk_files_pattern($1, TEMPLATETYPE_tmp_t, TEMPLATETYPE_tmp_t)
')
"""

if_stream_rules="""\
########################################
## <summary>
##	Connect to TEMPLATETYPE over a unix stream socket.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_stream_connect',`
	gen_require(`
		type TEMPLATETYPE_t, TEMPLATETYPE_tmp_t;
	')

	files_search_pids($1)
	stream_connect_pattern($1, TEMPLATETYPE_tmp_t, TEMPLATETYPE_tmp_t, TEMPLATETYPE_t)
')
"""

if_admin_types="""
		type TEMPLATETYPE_tmp_t;"""

if_admin_rules="""
	files_search_tmp($1)
	admin_pattern($1, TEMPLATETYPE_tmp_t)
"""
site-packages/sepolicy/templates/unit_file.py
# Copyright (C) 2012 Red Hat
# see file 'COPYING' for use and warranty information
#
# policygentool is a tool for the initial generation of SELinux policy
#
#    This program is free software; you can redistribute it and/or
#    modify it under the terms of the GNU General Public License as
#    published by the Free Software Foundation; either version 2 of
#    the License, or (at your option) any later version.
#
#    This program is distributed in the hope that it will be useful,
#    but WITHOUT ANY WARRANTY; without even the implied warranty of
#    MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
#    GNU General Public License for more details.
#
#    You should have received a copy of the GNU General Public License
#    along with this program; if not, write to the Free Software
#    Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA
#                                        02111-1307  USA
#
#
########################### unit Template File #############################

########################### Type Enforcement File #############################
te_types="""
type TEMPLATETYPE_unit_file_t;
systemd_unit_file(TEMPLATETYPE_unit_file_t)
"""

te_rules=""

########################### Interface File #############################
if_rules="""\
########################################
## <summary>
##	Execute TEMPLATETYPE server in the TEMPLATETYPE domain.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed to transition.
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_systemctl',`
	gen_require(`
		type TEMPLATETYPE_t;
		type TEMPLATETYPE_unit_file_t;
	')

	systemd_exec_systemctl($1)
	systemd_read_fifo_file_passwd_run($1)
	allow $1 TEMPLATETYPE_unit_file_t:file read_file_perms;
	allow $1 TEMPLATETYPE_unit_file_t:service manage_service_perms;

	ps_process_pattern($1, TEMPLATETYPE_t)
')

"""

if_admin_types="""
	type TEMPLATETYPE_unit_file_t;"""

if_admin_rules="""
	TEMPLATETYPE_systemctl($1)
	admin_pattern($1, TEMPLATETYPE_unit_file_t)
	allow $1 TEMPLATETYPE_unit_file_t:service all_service_perms;
"""

########################### File Context ##################################
fc_file="""\
FILENAME		--	gen_context(system_u:object_r:TEMPLATETYPE_unit_file_t,s0)
"""

fc_dir=""
site-packages/sepolicy/templates/__init__.py
#
# Copyright (C) 2007-2012 Red Hat
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software
# Foundation, Inc., 675 Mass Ave, Cambridge, MA 02139, USA.
#
site-packages/sepolicy/templates/user.py
# Copyright (C) 2007-2012 Red Hat
# see file 'COPYING' for use and warranty information
#
# policygentool is a tool for the initial generation of SELinux policy
#
#    This program is free software; you can redistribute it and/or
#    modify it under the terms of the GNU General Public License as
#    published by the Free Software Foundation; either version 2 of
#    the License, or (at your option) any later version.
#
#    This program is distributed in the hope that it will be useful,
#    but WITHOUT ANY WARRANTY; without even the implied warranty of
#    MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
#    GNU General Public License for more details.
#
#    You should have received a copy of the GNU General Public License
#    along with this program; if not, write to the Free Software
#    Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA
#                                        02111-1307  USA
#
#
########################### Type Enforcement File #############################

te_login_user_types="""\
policy_module(TEMPLATETYPE, 1.0.0)

########################################
#
# Declarations
#
userdom_unpriv_user_template(TEMPLATETYPE)
"""

te_admin_user_types="""\
policy_module(TEMPLATETYPE, 1.0.0)

########################################
#
# Declarations
#
userdom_admin_user_template(TEMPLATETYPE)
"""

te_min_login_user_types="""\
policy_module(TEMPLATETYPE, 1.0.0)

########################################
#
# Declarations
#

userdom_restricted_user_template(TEMPLATETYPE)
"""

te_x_login_user_types="""\
policy_module(TEMPLATETYPE, 1.0.0)

########################################
#
# Declarations
#

userdom_restricted_xwindows_user_template(TEMPLATETYPE)
"""

te_existing_user_types="""\
policy_module(TEMPLATETYPE, 1.0.0)

"""

te_root_user_types="""\
policy_module(TEMPLATETYPE, 1.0.0)

## <desc>
## <p>
## Allow TEMPLATETYPE to read files in the user home directory
## </p>
## </desc>
gen_tunable(TEMPLATETYPE_read_user_files, false)

## <desc>
## <p>
## Allow TEMPLATETYPE to manage files in the user home directory
## </p>
## </desc>
gen_tunable(TEMPLATETYPE_manage_user_files, false)

########################################
#
# Declarations
#

userdom_base_user_template(TEMPLATETYPE)
"""

te_login_user_rules="""\
"""

te_existing_user_rules="""\

########################################
#
# TEMPLATETYPE customized policy
#
"""

te_x_login_user_rules="""\
"""

te_root_user_rules="""\

"""

te_transition_rules="""
optional_policy(`
        APPLICATION_role(TEMPLATETYPE_r, TEMPLATETYPE_t)
')
"""

te_user_trans_rules="""
optional_policy(`
        gen_require(`
                role USER_r;
        ')

        TEMPLATETYPE_role_change(USER_r)
')
"""

te_admin_rules="""
allow TEMPLATETYPE_t self:capability { dac_override dac_read_search kill sys_ptrace sys_nice };
files_dontaudit_search_all_dirs(TEMPLATETYPE_t)

selinux_get_enforce_mode(TEMPLATETYPE_t)
seutil_domtrans_setfiles(TEMPLATETYPE_t)
seutil_search_default_contexts(TEMPLATETYPE_t)

logging_send_syslog_msg(TEMPLATETYPE_t)

kernel_read_system_state(TEMPLATETYPE_t)

domain_dontaudit_search_all_domains_state(TEMPLATETYPE_t)
domain_dontaudit_ptrace_all_domains(TEMPLATETYPE_t)

userdom_dontaudit_search_admin_dir(TEMPLATETYPE_t)
userdom_dontaudit_search_user_home_dirs(TEMPLATETYPE_t)

tunable_policy(`TEMPLATETYPE_read_user_files',`
        userdom_read_user_home_content_files(TEMPLATETYPE_t)
        userdom_read_user_tmp_files(TEMPLATETYPE_t)
')

tunable_policy(`TEMPLATETYPE_manage_user_files',`
	userdom_manage_user_home_content_dirs(TEMPLATETYPE_t)
	userdom_manage_user_home_content_files(TEMPLATETYPE_t)
	userdom_manage_user_home_content_symlinks(TEMPLATETYPE_t)
	userdom_manage_user_tmp_files(TEMPLATETYPE_t)
')
"""

te_admin_trans_rules="""
gen_require(`
        role USER_r;
')

allow USER_r TEMPLATETYPE_r;
"""

te_admin_domain_rules="""
optional_policy(`
        APPLICATION_admin(TEMPLATETYPE_t, TEMPLATETYPE_r)
')
"""

te_roles_rules="""
optional_policy(`
        gen_require(`
                role ROLE_r;
        ')

        allow TEMPLATETYPE_r ROLE_r;
')
"""

te_sudo_rules="""
optional_policy(`
        sudo_role_template(TEMPLATETYPE, TEMPLATETYPE_r, TEMPLATETYPE_t)
')
"""

te_newrole_rules="""
seutil_run_newrole(TEMPLATETYPE_t, TEMPLATETYPE_r)
"""
site-packages/sepolicy/templates/spec.py
header_comment_section="""\
# vim: sw=4:ts=4:et
"""

base_section="""\

%define selinux_policyver VERSION

Name:   MODULENAME_selinux
Version:	1.0
Release:	1%{?dist}
Summary:	SELinux policy module for MODULENAME

Group:	System Environment/Base		
License:	GPLv2+	
# This is an example. You will need to change it.
URL:		http://HOSTNAME
Source0:	MODULENAME.pp
Source1:	MODULENAME.if
Source2:	DOMAINNAME_selinux.8
Source3:	DOMAINNAME_u

Requires: policycoreutils, libselinux-utils
Requires(post): selinux-policy-base >= %{selinux_policyver}, policycoreutils
Requires(postun): policycoreutils
"""

mid_section="""\
BuildArch: noarch

%description
This package installs and sets up the SELinux policy security module for MODULENAME.

%install
install -d %{buildroot}%{_datadir}/selinux/packages
install -m 644 %{SOURCE0} %{buildroot}%{_datadir}/selinux/packages
install -d %{buildroot}%{_datadir}/selinux/devel/include/contrib
install -m 644 %{SOURCE1} %{buildroot}%{_datadir}/selinux/devel/include/contrib/
install -d %{buildroot}%{_mandir}/man8/
install -m 644 %{SOURCE2} %{buildroot}%{_mandir}/man8/DOMAINNAME_selinux.8
install -d %{buildroot}/etc/selinux/targeted/contexts/users/
install -m 644 %{SOURCE3} %{buildroot}/etc/selinux/targeted/contexts/users/DOMAINNAME_u

%post
semodule -n -i %{_datadir}/selinux/packages/MODULENAME.pp
if /usr/sbin/selinuxenabled ; then
    /usr/sbin/load_policy
    %relabel_files
    /usr/sbin/semanage user -a -R DOMAINNAME_r DOMAINNAME_u
fi;
exit 0

%postun
if [ $1 -eq 0 ]; then
    semodule -n -r MODULENAME
    if /usr/sbin/selinuxenabled ; then
       /usr/sbin/load_policy
       %relabel_files
       /usr/sbin/semanage user -d DOMAINNAME_u
    fi;
fi;
exit 0

%files
%attr(0600,root,root) %{_datadir}/selinux/packages/MODULENAME.pp
%{_datadir}/selinux/devel/include/contrib/MODULENAME.if
%{_mandir}/man8/DOMAINNAME_selinux.8.*
/etc/selinux/targeted/contexts/users/DOMAINNAME_u

%changelog
* TODAYSDATE YOUR NAME <YOUR@EMAILADDRESS> 1.0-1
- Initial version

"""

define_relabel_files_begin ="""\
\n
%define relabel_files() \\
"""

define_relabel_files_end ="""\
restorecon -R FILENAME; \\
"""
site-packages/sepolicy/templates/rw.py
# Copyright (C) 2007-2012 Red Hat
# see file 'COPYING' for use and warranty information
#
# policygentool is a tool for the initial generation of SELinux policy
#
#    This program is free software; you can redistribute it and/or
#    modify it under the terms of the GNU General Public License as
#    published by the Free Software Foundation; either version 2 of
#    the License, or (at your option) any later version.
#
#    This program is distributed in the hope that it will be useful,
#    but WITHOUT ANY WARRANTY; without even the implied warranty of
#    MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
#    GNU General Public License for more details.
#
#    You should have received a copy of the GNU General Public License
#    along with this program; if not, write to the Free Software
#    Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA
#                                        02111-1307  USA
#
#

########################### tmp Template File #############################
te_types="""
type TEMPLATETYPE_rw_t;
files_type(TEMPLATETYPE_rw_t)
"""

te_rules="""
manage_dirs_pattern(TEMPLATETYPE_t, TEMPLATETYPE_rw_t, TEMPLATETYPE_rw_t)
manage_files_pattern(TEMPLATETYPE_t, TEMPLATETYPE_rw_t, TEMPLATETYPE_rw_t)
manage_lnk_files_pattern(TEMPLATETYPE_t, TEMPLATETYPE_rw_t, TEMPLATETYPE_rw_t)
"""

########################### Interface File #############################
if_rules="""
########################################
## <summary>
##	Search TEMPLATETYPE rw directories.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_search_rw_dir',`
	gen_require(`
		type TEMPLATETYPE_rw_t;
	')

	allow $1 TEMPLATETYPE_rw_t:dir search_dir_perms;
	files_search_rw($1)
')

########################################
## <summary>
##	Read TEMPLATETYPE rw files.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_read_rw_files',`
	gen_require(`
		type TEMPLATETYPE_rw_t;
	')

	read_files_pattern($1, TEMPLATETYPE_rw_t, TEMPLATETYPE_rw_t)
	allow $1 TEMPLATETYPE_rw_t:dir list_dir_perms;
	files_search_rw($1)
')

########################################
## <summary>
##	Manage TEMPLATETYPE rw files.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_manage_rw_files',`
	gen_require(`
		type TEMPLATETYPE_rw_t;
	')

	manage_files_pattern($1, TEMPLATETYPE_rw_t, TEMPLATETYPE_rw_t)
')

########################################
## <summary>
##	Create, read, write, and delete
##	TEMPLATETYPE rw dirs.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_manage_rw_dirs',`
	gen_require(`
		type TEMPLATETYPE_rw_t;
	')

	manage_dirs_pattern($1, TEMPLATETYPE_rw_t, TEMPLATETYPE_rw_t)
')

"""

te_stream_rules="""
manage_sock_files_pattern(TEMPLATETYPE_t, TEMPLATETYPE_rw_t, TEMPLATETYPE_rw_t)
"""

if_stream_rules="""\
########################################
## <summary>
##	Connect to TEMPLATETYPE over a unix stream socket.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_stream_connect',`
	gen_require(`
		type TEMPLATETYPE_t, TEMPLATETYPE_rw_t;
	')

	stream_connect_pattern($1, TEMPLATETYPE_rw_t, TEMPLATETYPE_rw_t, TEMPLATETYPE_t)
')
"""

if_admin_types="""
		type TEMPLATETYPE_rw_t;"""

if_admin_rules="""
	files_search_etc($1)
	admin_pattern($1, TEMPLATETYPE_rw_t)
"""

########################### File Context ##################################
fc_file="""
FILENAME		--	gen_context(system_u:object_r:TEMPLATETYPE_rw_t,s0)
"""

fc_sock_file="""\
FILENAME        -s  gen_context(system_u:object_r:TEMPLATETYPE_rw_t,s0)
"""

fc_dir="""
FILENAME(/.*)?		gen_context(system_u:object_r:TEMPLATETYPE_rw_t,s0)
"""
site-packages/sepolicy/templates/etc_rw.py
# Copyright (C) 2007-2012 Red Hat
# see file 'COPYING' for use and warranty information
#
# policygentool is a tool for the initial generation of SELinux policy
#
#    This program is free software; you can redistribute it and/or
#    modify it under the terms of the GNU General Public License as
#    published by the Free Software Foundation; either version 2 of
#    the License, or (at your option) any later version.
#
#    This program is distributed in the hope that it will be useful,
#    but WITHOUT ANY WARRANTY; without even the implied warranty of
#    MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
#    GNU General Public License for more details.
#
#    You should have received a copy of the GNU General Public License
#    along with this program; if not, write to the Free Software
#    Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA
#                                        02111-1307  USA
#
#
########################### etc_rw Template File #############################

########################### Type Enforcement File #############################
te_types="""
type TEMPLATETYPE_etc_rw_t;
files_type(TEMPLATETYPE_etc_rw_t)
"""
te_rules="""
manage_dirs_pattern(TEMPLATETYPE_t, TEMPLATETYPE_etc_rw_t, TEMPLATETYPE_etc_rw_t)
manage_files_pattern(TEMPLATETYPE_t, TEMPLATETYPE_etc_rw_t, TEMPLATETYPE_etc_rw_t)
manage_lnk_files_pattern(TEMPLATETYPE_t, TEMPLATETYPE_etc_rw_t, TEMPLATETYPE_etc_rw_t)
files_etc_filetrans(TEMPLATETYPE_t, TEMPLATETYPE_etc_rw_t, { dir file lnk_file })
"""

te_stream_rules="""
manage_sock_files_pattern(TEMPLATETYPE_t, TEMPLATETYPE_etc_rw_t, TEMPLATETYPE_etc_rw_t)
files_etc_filetrans(TEMPLATETYPE_t, TEMPLATETYPE_etc_rw_t, sock_file)
"""

########################### Interface File #############################
if_rules="""
########################################
## <summary>
##	Search TEMPLATETYPE conf directories.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_search_conf',`
	gen_require(`
		type TEMPLATETYPE_etc_rw_t;
	')

	allow $1 TEMPLATETYPE_etc_rw_t:dir search_dir_perms;
	files_search_etc($1)
')

########################################
## <summary>
##	Read TEMPLATETYPE conf files.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_read_conf_files',`
	gen_require(`
		type TEMPLATETYPE_etc_rw_t;
	')

	allow $1 TEMPLATETYPE_etc_rw_t:dir list_dir_perms;
	read_files_pattern($1, TEMPLATETYPE_etc_rw_t, TEMPLATETYPE_etc_rw_t)
	files_search_etc($1)
')

########################################
## <summary>
##	Manage TEMPLATETYPE conf files.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_manage_conf_files',`
	gen_require(`
		type TEMPLATETYPE_etc_rw_t;
	')

	manage_files_pattern($1, TEMPLATETYPE_etc_rw_t, TEMPLATETYPE_etc_rw_t)
	files_search_etc($1)
')

"""

if_stream_rules="""\
########################################
## <summary>
##	Connect to TEMPLATETYPE over a unix stream socket.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_stream_connect',`
	gen_require(`
		type TEMPLATETYPE_t, TEMPLATETYPE_etc_rw_t;
	')

	files_search_etc($1)
	stream_connect_pattern($1, TEMPLATETYPE_etc_rw_t, TEMPLATETYPE_etc_rw_t, TEMPLATETYPE_t)
')
"""

if_admin_types="""
		type TEMPLATETYPE_etc_rw_t;"""

if_admin_rules="""
	files_search_etc($1)
	admin_pattern($1, TEMPLATETYPE_etc_rw_t)
"""

########################### File Context ##################################
fc_file="""\
FILENAME		--	gen_context(system_u:object_r:TEMPLATETYPE_etc_rw_t,s0)
"""

fc_dir="""\
FILENAME(/.*)?		gen_context(system_u:object_r:TEMPLATETYPE_etc_rw_t,s0)
"""
site-packages/sepolicy/templates/executable.py
# Copyright (C) 2007-2012 Red Hat
# see file 'COPYING' for use and warranty information
#
# policygentool is a tool for the initial generation of SELinux policy
#
#    This program is free software; you can redistribute it and/or
#    modify it under the terms of the GNU General Public License as
#    published by the Free Software Foundation; either version 2 of
#    the License, or (at your option) any later version.
#
#    This program is distributed in the hope that it will be useful,
#    but WITHOUT ANY WARRANTY; without even the implied warranty of
#    MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
#    GNU General Public License for more details.
#
#    You should have received a copy of the GNU General Public License
#    along with this program; if not, write to the Free Software
#    Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA
#                                        02111-1307  USA
#
#
########################### Type Enforcement File #############################
te_daemon_types="""\
policy_module(TEMPLATETYPE, 1.0.0)

########################################
#
# Declarations
#

type TEMPLATETYPE_t;
type TEMPLATETYPE_exec_t;
init_daemon_domain(TEMPLATETYPE_t, TEMPLATETYPE_exec_t)

permissive TEMPLATETYPE_t;
"""

te_initscript_types="""
type TEMPLATETYPE_initrc_exec_t;
init_script_file(TEMPLATETYPE_initrc_exec_t)
"""

te_dbusd_types="""\
policy_module(TEMPLATETYPE, 1.0.0)

########################################
#
# Declarations
#

type TEMPLATETYPE_t;
type TEMPLATETYPE_exec_t;
domain_type(TEMPLATETYPE_t)
domain_entry_file(TEMPLATETYPE_t, TEMPLATETYPE_exec_t)
role system_r types TEMPLATETYPE_t;

permissive TEMPLATETYPE_t;
"""

te_inetd_types="""\
policy_module(TEMPLATETYPE, 1.0.0)

########################################
#
# Declarations
#

type TEMPLATETYPE_t;
type TEMPLATETYPE_exec_t;
inetd_service_domain(TEMPLATETYPE_t, TEMPLATETYPE_exec_t)

permissive TEMPLATETYPE_t;
"""

te_userapp_types="""\
policy_module(TEMPLATETYPE, 1.0.0)

########################################
#
# Declarations
#

attribute_role TEMPLATETYPE_roles;
roleattribute system_r TEMPLATETYPE_roles;

type TEMPLATETYPE_t;
type TEMPLATETYPE_exec_t;
application_domain(TEMPLATETYPE_t, TEMPLATETYPE_exec_t)
role TEMPLATETYPE_roles types TEMPLATETYPE_t;

permissive TEMPLATETYPE_t;
"""

te_sandbox_types="""\
policy_module(TEMPLATETYPE, 1.0.0)

########################################
#
# Declarations
#

sandbox_x_domain_template(TEMPLATETYPE)

permissive TEMPLATETYPE_t;
permissive TEMPLATETYPE_client_t;

"""

te_cgi_types="""\
policy_module(TEMPLATETYPE, 1.0.0)

########################################
#
# Declarations
#

apache_content_template(TEMPLATETYPE)

permissive httpd_TEMPLATETYPE_script_t;
"""

te_daemon_rules="""\
allow TEMPLATETYPE_t self:fifo_file rw_fifo_file_perms;
allow TEMPLATETYPE_t self:unix_stream_socket create_stream_socket_perms;
"""

te_inetd_rules="""
"""

te_dbusd_rules="""
optional_policy(`
	dbus_system_domain(TEMPLATETYPE_t, TEMPLATETYPE_exec_t)
')
"""

te_userapp_rules="""
allow TEMPLATETYPE_t self:fifo_file manage_fifo_file_perms;
allow TEMPLATETYPE_t self:unix_stream_socket create_stream_socket_perms;
"""

te_cgi_rules="""
"""

te_sandbox_rules="""
"""

te_uid_rules="""
auth_use_nsswitch(TEMPLATETYPE_t)
"""

te_syslog_rules="""
logging_send_syslog_msg(TEMPLATETYPE_t)
"""

te_resolve_rules="""
sysnet_dns_name_resolve(TEMPLATETYPE_t)
"""

te_pam_rules="""
auth_domtrans_chk_passwd(TEMPLATETYPE_t)
"""

te_mail_rules="""
mta_send_mail(TEMPLATETYPE_t)
"""

te_dbus_rules="""
optional_policy(`
	dbus_system_bus_client(TEMPLATETYPE_t)
	dbus_connect_system_bus(TEMPLATETYPE_t)
')
"""

te_kerberos_rules="""
optional_policy(`
	kerberos_use(TEMPLATETYPE_t)
')
"""

te_manage_krb5_rcache_rules="""
optional_policy(`
	kerberos_keytab_template(TEMPLATETYPE, TEMPLATETYPE_t)
	kerberos_manage_host_rcache(TEMPLATETYPE_t)
')
"""

te_audit_rules="""
logging_send_audit_msgs(TEMPLATETYPE_t)
"""

te_run_rules="""
optional_policy(`
	gen_require(`
		type USER_t;
		role USER_r;
	')

	TEMPLATETYPE_run(USER_t, USER_r)
')
"""

te_fd_rules="""
domain_use_interactive_fds(TEMPLATETYPE_t)
"""

te_etc_rules="""
files_read_etc_files(TEMPLATETYPE_t)
"""

te_localization_rules="""
miscfiles_read_localization(TEMPLATETYPE_t)
"""

########################### Interface File #############################

if_heading_rules="""
## <summary>policy for TEMPLATETYPE</summary>"""

if_program_rules="""

########################################
## <summary>
##	Execute TEMPLATETYPE_exec_t in the TEMPLATETYPE domain.
## </summary>
## <param name=\"domain\">
## <summary>
##	Domain allowed to transition.
## </summary>
## </param>
#
interface(`TEMPLATETYPE_domtrans',`
	gen_require(`
		type TEMPLATETYPE_t, TEMPLATETYPE_exec_t;
	')

	corecmd_search_bin($1)
	domtrans_pattern($1, TEMPLATETYPE_exec_t, TEMPLATETYPE_t)
')

######################################
## <summary>
##	Execute TEMPLATETYPE in the caller domain.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_exec',`
	gen_require(`
		type TEMPLATETYPE_exec_t;
	')

	corecmd_search_bin($1)
	can_exec($1, TEMPLATETYPE_exec_t)
')
"""

if_user_program_rules="""
########################################
## <summary>
##	Execute TEMPLATETYPE in the TEMPLATETYPE domain, and
##	allow the specified role the TEMPLATETYPE domain.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed to transition
##	</summary>
## </param>
## <param name="role">
##	<summary>
##	The role to be allowed the TEMPLATETYPE domain.
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_run',`
	gen_require(`
		type TEMPLATETYPE_t;
		attribute_role TEMPLATETYPE_roles;
	')

	TEMPLATETYPE_domtrans($1)
	roleattribute $2 TEMPLATETYPE_roles;
')

########################################
## <summary>
##	Role access for TEMPLATETYPE
## </summary>
## <param name="role">
##	<summary>
##	Role allowed access
##	</summary>
## </param>
## <param name="domain">
##	<summary>
##	User domain for the role
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_role',`
	gen_require(`
		type TEMPLATETYPE_t;
		attribute_role TEMPLATETYPE_roles;
	')

	roleattribute $1 TEMPLATETYPE_roles;

	TEMPLATETYPE_domtrans($2)

	ps_process_pattern($2, TEMPLATETYPE_t)
	allow $2 TEMPLATETYPE_t:process { signull signal sigkill };
')
"""

if_sandbox_rules="""
########################################
## <summary>
##	Execute sandbox in the TEMPLATETYPE_t domain, and
##	allow the specified role the TEMPLATETYPE_t domain.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed to transition.
##	</summary>
## </param>
## <param name="role">
##	<summary>
##	The role to be allowed the TEMPLATETYPE_t domain.
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_transition',`
	gen_require(`
		type TEMPLATETYPE_t;
		type TEMPLATETYPE_client_t;
	')

	allow $1 TEMPLATETYPE_t:process { signal_perms transition };
	dontaudit $1 TEMPLATETYPE_t:process { noatsecure siginh rlimitinh };
	role $2 types TEMPLATETYPE_t;
	role $2 types TEMPLATETYPE_client_t;

	allow TEMPLATETYPE_t $1:process { sigchld signull };
	allow TEMPLATETYPE_t $1:fifo_file rw_inherited_fifo_file_perms;
	allow TEMPLATETYPE_client_t $1:process { sigchld signull };
	allow TEMPLATETYPE_client_t $1:fifo_file rw_inherited_fifo_file_perms;
')
"""

if_role_change_rules="""
########################################
## <summary>
##	Change to the TEMPLATETYPE role.
## </summary>
## <param name="role">
##	<summary>
##	Role allowed access.
##	</summary>
## </param>
## <rolecap/>
#
interface(`TEMPLATETYPE_role_change',`
	gen_require(`
		role TEMPLATETYPE_r;
	')

	allow $1 TEMPLATETYPE_r;
')
"""

if_initscript_rules="""
########################################
## <summary>
##	Execute TEMPLATETYPE server in the TEMPLATETYPE domain.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_initrc_domtrans',`
	gen_require(`
		type TEMPLATETYPE_initrc_exec_t;
	')

	init_labeled_script_domtrans($1, TEMPLATETYPE_initrc_exec_t)
')
"""

if_dbus_rules="""
########################################
## <summary>
##	Send and receive messages from
##	TEMPLATETYPE over dbus.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_dbus_chat',`
	gen_require(`
		type TEMPLATETYPE_t;
		class dbus send_msg;
	')

	allow $1 TEMPLATETYPE_t:dbus send_msg;
	allow TEMPLATETYPE_t $1:dbus send_msg;
')
"""

if_begin_admin="""
########################################
## <summary>
##	All of the rules required to administrate
##	a TEMPLATETYPE environment
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
## <param name="role">
##	<summary>
##	Role allowed access.
##	</summary>
## </param>
## <rolecap/>
#
interface(`TEMPLATETYPE_admin',`
	gen_require(`
		type TEMPLATETYPE_t;"""

if_middle_admin="""
	')

	allow $1 TEMPLATETYPE_t:process { signal_perms };
	ps_process_pattern($1, TEMPLATETYPE_t)

	tunable_policy(`deny_ptrace',`',`
		allow $1 TEMPLATETYPE_t:process ptrace;
	')
"""

if_initscript_admin_types="""
		type TEMPLATETYPE_initrc_exec_t;"""

if_initscript_admin="""
	TEMPLATETYPE_initrc_domtrans($1)
	domain_system_change_exemption($1)
	role_transition $2 TEMPLATETYPE_initrc_exec_t system_r;
	allow $2 system_r;
"""

if_end_admin="""\
	optional_policy(`
		systemd_passwd_agent_exec($1)
		systemd_read_fifo_file_passwd_run($1)
	')
')
"""

########################### File Context ##################################
fc_program="""\
EXECUTABLE		--	gen_context(system_u:object_r:TEMPLATETYPE_exec_t,s0)
"""

fc_user="""\
#  No file context, leave blank
"""

fc_initscript="""\
EXECUTABLE	--	gen_context(system_u:object_r:TEMPLATETYPE_initrc_exec_t,s0)
"""
site-packages/sepolicy/templates/var_spool.py
# Copyright (C) 2007-2012 Red Hat
# see file 'COPYING' for use and warranty information
#
# policygentool is a tool for the initial generation of SELinux policy
#
#    This program is free software; you can redistribute it and/or
#    modify it under the terms of the GNU General Public License as
#    published by the Free Software Foundation; either version 2 of
#    the License, or (at your option) any later version.
#
#    This program is distributed in the hope that it will be useful,
#    but WITHOUT ANY WARRANTY; without even the implied warranty of
#    MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
#    GNU General Public License for more details.
#
#    You should have received a copy of the GNU General Public License
#    along with this program; if not, write to the Free Software
#    Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA
#                                        02111-1307  USA
#
#
########################### var_spool Template File #############################

########################### Type Enforcement File #############################
te_types="""
type TEMPLATETYPE_spool_t;
files_type(TEMPLATETYPE_spool_t)
"""
te_rules="""
manage_dirs_pattern(TEMPLATETYPE_t, TEMPLATETYPE_spool_t, TEMPLATETYPE_spool_t)
manage_files_pattern(TEMPLATETYPE_t, TEMPLATETYPE_spool_t, TEMPLATETYPE_spool_t)
manage_lnk_files_pattern(TEMPLATETYPE_t, TEMPLATETYPE_spool_t, TEMPLATETYPE_spool_t)
files_spool_filetrans(TEMPLATETYPE_t, TEMPLATETYPE_spool_t, { dir file lnk_file })
"""

te_stream_rules="""\
manage_sock_files_pattern(TEMPLATETYPE_t, TEMPLATETYPE_spool_t, TEMPLATETYPE_spool_t)
files_spool_filetrans(TEMPLATETYPE_t, TEMPLATETYPE_spool_t, sock_file)
"""

########################### Interface File #############################
if_rules="""
########################################
## <summary>
##	Search TEMPLATETYPE spool directories.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_search_spool',`
	gen_require(`
		type TEMPLATETYPE_spool_t;
	')

	allow $1 TEMPLATETYPE_spool_t:dir search_dir_perms;
	files_search_spool($1)
')

########################################
## <summary>
##	Read TEMPLATETYPE spool files.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_read_spool_files',`
	gen_require(`
		type TEMPLATETYPE_spool_t;
	')

	files_search_spool($1)
	read_files_pattern($1, TEMPLATETYPE_spool_t, TEMPLATETYPE_spool_t)
')

########################################
## <summary>
##	Manage TEMPLATETYPE spool files.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_manage_spool_files',`
	gen_require(`
		type TEMPLATETYPE_spool_t;
	')

	files_search_spool($1)
	manage_files_pattern($1, TEMPLATETYPE_spool_t, TEMPLATETYPE_spool_t)
')

########################################
## <summary>
##	Manage TEMPLATETYPE spool dirs.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_manage_spool_dirs',`
	gen_require(`
		type TEMPLATETYPE_spool_t;
	')

	files_search_spool($1)
	manage_dirs_pattern($1, TEMPLATETYPE_spool_t, TEMPLATETYPE_spool_t)
')

"""

if_stream_rules="""
########################################
## <summary>
##	Connect to TEMPLATETYPE over a unix stream socket.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_stream_connect',`
	gen_require(`
		type TEMPLATETYPE_t, TEMPLATETYPE_spool_t;
	')

	stream_connect_pattern($1, TEMPLATETYPE_spool_t, TEMPLATETYPE_spool_t, TEMPLATETYPE_t)
')
"""

if_admin_types="""
		type TEMPLATETYPE_spool_t;"""

if_admin_rules="""
	files_search_spool($1)
	admin_pattern($1, TEMPLATETYPE_spool_t)
"""

########################### File Context ##################################
fc_file="""\
FILENAME		--	gen_context(system_u:object_r:TEMPLATETYPE_spool_t,s0)
"""

fc_dir="""\
FILENAME(/.*)?		gen_context(system_u:object_r:TEMPLATETYPE_spool_t,s0)
"""
site-packages/sepolicy/templates/var_run.py000064400000005563147511334650015161 0ustar00# Copyright (C) 2007-2012 Red Hat
# see file 'COPYING' for use and warranty information
#
# policygentool is a tool for the initial generation of SELinux policy
#
#    This program is free software; you can redistribute it and/or
#    modify it under the terms of the GNU General Public License as
#    published by the Free Software Foundation; either version 2 of
#    the License, or (at your option) any later version.
#
#    This program is distributed in the hope that it will be useful,
#    but WITHOUT ANY WARRANTY; without even the implied warranty of
#    MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
#    GNU General Public License for more details.
#
#    You should have received a copy of the GNU General Public License
#    along with this program; if not, write to the Free Software
#    Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA
#                                        02111-1307  USA
#
#
########################### var_run Template File #############################

te_types="""
type TEMPLATETYPE_var_run_t;
files_pid_file(TEMPLATETYPE_var_run_t)
"""

te_rules="""
manage_dirs_pattern(TEMPLATETYPE_t, TEMPLATETYPE_var_run_t, TEMPLATETYPE_var_run_t)
manage_files_pattern(TEMPLATETYPE_t, TEMPLATETYPE_var_run_t, TEMPLATETYPE_var_run_t)
manage_lnk_files_pattern(TEMPLATETYPE_t, TEMPLATETYPE_var_run_t, TEMPLATETYPE_var_run_t)
files_pid_filetrans(TEMPLATETYPE_t, TEMPLATETYPE_var_run_t, { dir file lnk_file })
"""

te_stream_rules="""
manage_sock_files_pattern(TEMPLATETYPE_t, TEMPLATETYPE_var_run_t, TEMPLATETYPE_var_run_t)
files_pid_filetrans(TEMPLATETYPE_t, TEMPLATETYPE_var_run_t, sock_file)
"""

if_rules="""\
########################################
## <summary>
##	Read TEMPLATETYPE PID files.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_read_pid_files',`
	gen_require(`
		type TEMPLATETYPE_var_run_t;
	')

	files_search_pids($1)
	read_files_pattern($1, TEMPLATETYPE_var_run_t, TEMPLATETYPE_var_run_t)
')

"""

if_stream_rules="""\
########################################
## <summary>
##	Connect to TEMPLATETYPE over a unix stream socket.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_stream_connect',`
	gen_require(`
		type TEMPLATETYPE_t, TEMPLATETYPE_var_run_t;
	')

	files_search_pids($1)
	stream_connect_pattern($1, TEMPLATETYPE_var_run_t, TEMPLATETYPE_var_run_t, TEMPLATETYPE_t)
')
"""

if_admin_types="""
		type TEMPLATETYPE_var_run_t;"""

if_admin_rules="""
	files_search_pids($1)
	admin_pattern($1, TEMPLATETYPE_var_run_t)
"""

fc_file="""\
FILENAME		--	gen_context(system_u:object_r:TEMPLATETYPE_var_run_t,s0)
"""

fc_sock_file="""\
FILENAME		-s	gen_context(system_u:object_r:TEMPLATETYPE_var_run_t,s0)
"""

fc_dir="""\
FILENAME(/.*)?		gen_context(system_u:object_r:TEMPLATETYPE_var_run_t,s0)
"""
site-packages/sepolicy/templates/script.py000064400000010245147511334650015002 0ustar00# Copyright (C) 2007-2012 Red Hat
# see file 'COPYING' for use and warranty information
#
# policygentool is a tool for the initial generation of SELinux policy
#
#    This program is free software; you can redistribute it and/or
#    modify it under the terms of the GNU General Public License as
#    published by the Free Software Foundation; either version 2 of
#    the License, or (at your option) any later version.
#
#    This program is distributed in the hope that it will be useful,
#    but WITHOUT ANY WARRANTY; without even the implied warranty of
#    MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
#    GNU General Public License for more details.
#
#    You should have received a copy of the GNU General Public License
#    along with this program; if not, write to the Free Software
#    Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA
#                                        02111-1307  USA
#
#

########################### script Template File #############################
compile="""\
#!/bin/sh -e

DIRNAME=`dirname $0`
cd $DIRNAME
USAGE="$0 [ --update ]"
if [ `id -u` != 0 ]; then
echo 'You must be root to run this script'
exit 1
fi

if [ $# -eq 1 ]; then
	if [ "$1" = "--update" ] ; then
		time=`ls -l --time-style="+%x %X" TEMPLATEFILE.te | awk '{ printf "%s %s", $6, $7 }'`
		rules=`ausearch --start $time -m avc --raw -se TEMPLATETYPE`
		if [ x"$rules" != "x" ] ; then
			echo "Found AVCs to update policy with"
			echo -e "$rules" | audit2allow -R
			echo "Do you want these changes added to policy [y/n]?"
			read ANS
			if [ "$ANS" = "y" -o "$ANS" = "Y" ] ; then
				echo "Updating policy"
				echo -e "$rules" | audit2allow -R >> TEMPLATEFILE.te
				# Fall through and rebuild policy
			else
				exit 0
			fi
		else
			echo "No new AVCs found"
			exit 0
		fi
	else
		echo -e $USAGE
		exit 1
	fi
elif [ $# -ge 2 ] ; then
	echo -e $USAGE
	exit 1
fi

echo "Building and Loading Policy"
set -x
make -f /usr/share/selinux/devel/Makefile TEMPLATEFILE.pp || exit
/usr/sbin/semodule -i TEMPLATEFILE.pp

"""
rpm="""\
# Generate an RPM package for the newly generated policy

pwd=$(pwd)
rpmbuild --define "_sourcedir ${pwd}" --define "_specdir ${pwd}" --define "_builddir ${pwd}" --define "_srcrpmdir ${pwd}" --define "_rpmdir ${pwd}" --define "_buildrootdir ${pwd}/.build"  -ba TEMPLATEFILE_selinux.spec
"""

manpage="""\
# Generate a man page from the installed module
sepolicy manpage -p . -d DOMAINTYPE_t
"""

restorecon="""\
# Fixing the file context on FILENAME
/sbin/restorecon -F -R -v FILENAME
"""

tcp_ports="""\
# Adding SELinux tcp port entry for port PORTNUM
/usr/sbin/semanage port -a -t TEMPLATETYPE_port_t -p tcp PORTNUM
"""

udp_ports="""\
# Adding SELinux udp port entry for port PORTNUM
/usr/sbin/semanage port -a -t TEMPLATETYPE_port_t -p udp PORTNUM
"""

users="""\
# Adding SELinux user TEMPLATETYPE_u
/usr/sbin/semanage user -a -R "TEMPLATETYPE_rROLES" TEMPLATETYPE_u
"""

eusers="""\
# Adding roles to SELinux user TEMPLATETYPE_u
/usr/sbin/semanage user -m -R "TEMPLATETYPE_rROLES" TEMPLATETYPE_u
"""

admin_trans="""\
# Adding roles to SELinux user USER
/usr/sbin/semanage user -m -R +TEMPLATETYPE_r USER
"""

min_login_user_default_context="""\
cat > TEMPLATETYPE_u << _EOF
TEMPLATETYPE_r:TEMPLATETYPE_t:s0	TEMPLATETYPE_r:TEMPLATETYPE_t
system_r:crond_t		TEMPLATETYPE_r:TEMPLATETYPE_t
system_r:initrc_su_t		TEMPLATETYPE_r:TEMPLATETYPE_t
system_r:local_login_t		TEMPLATETYPE_r:TEMPLATETYPE_t
system_r:remote_login_t		TEMPLATETYPE_r:TEMPLATETYPE_t
system_r:sshd_t			TEMPLATETYPE_r:TEMPLATETYPE_t
_EOF
if [ ! -f /etc/selinux/targeted/contexts/users/TEMPLATETYPE_u ]; then
   cp TEMPLATETYPE_u /etc/selinux/targeted/contexts/users/
fi
"""

x_login_user_default_context="""\
cat > TEMPLATETYPE_u << _EOF
TEMPLATETYPE_r:TEMPLATETYPE_t	TEMPLATETYPE_r:TEMPLATETYPE_t
system_r:crond_t		TEMPLATETYPE_r:TEMPLATETYPE_t
system_r:initrc_su_t		TEMPLATETYPE_r:TEMPLATETYPE_t
system_r:local_login_t		TEMPLATETYPE_r:TEMPLATETYPE_t
system_r:remote_login_t		TEMPLATETYPE_r:TEMPLATETYPE_t
system_r:sshd_t				TEMPLATETYPE_r:TEMPLATETYPE_t
system_r:xdm_t				TEMPLATETYPE_r:TEMPLATETYPE_t
_EOF
if [ ! -f /etc/selinux/targeted/contexts/users/TEMPLATETYPE_u ]; then
   cp TEMPLATETYPE_u /etc/selinux/targeted/contexts/users/
fi
"""
site-packages/sepolicy/templates/boolean.py000064400000002236147511334650015116 0ustar00# Copyright (C) 2007-2012 Red Hat
# see file 'COPYING' for use and warranty information
#
# policygentool is a tool for the initial generation of SELinux policy
#
#    This program is free software; you can redistribute it and/or
#    modify it under the terms of the GNU General Public License as
#    published by the Free Software Foundation; either version 2 of
#    the License, or (at your option) any later version.
#
#    This program is distributed in the hope that it will be useful,
#    but WITHOUT ANY WARRANTY; without even the implied warranty of
#    MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
#    GNU General Public License for more details.
#
#    You should have received a copy of the GNU General Public License
#    along with this program; if not, write to the Free Software
#    Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA
#                                        02111-1307  USA
#
#
########################### boolean Template File ###########################

te_boolean="""
## <desc>
##	<p>
##	DESCRIPTION
##	</p>
## </desc>
gen_tunable(BOOLEAN, false)
"""

te_rules="""
tunable_policy(`BOOLEAN',`
#TRUE
',`
#FALSE
')
"""
site-packages/sepolicy/templates/__pycache__/__init__.cpython-36.opt-1.pyc000064400000000161147511334650022454 0ustar003

>�\��@sdS)N�rrr�/usr/lib/python3.6/__init__.py�<module>ssite-packages/sepolicy/templates/__pycache__/var_log.cpython-36.pyc000064400000004112147511334650021407 0ustar003

>�\��@s dZdZdZdZdZdZdZdS)z?
type TEMPLATETYPE_log_t;
logging_log_file(TEMPLATETYPE_log_t)
a<
manage_dirs_pattern(TEMPLATETYPE_t, TEMPLATETYPE_log_t, TEMPLATETYPE_log_t)
manage_files_pattern(TEMPLATETYPE_t, TEMPLATETYPE_log_t, TEMPLATETYPE_log_t)
manage_lnk_files_pattern(TEMPLATETYPE_t, TEMPLATETYPE_log_t, TEMPLATETYPE_log_t)
logging_log_filetrans(TEMPLATETYPE_t, TEMPLATETYPE_log_t, { dir file lnk_file })
a�########################################
## <summary>
##	Read TEMPLATETYPE's log files.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
## <rolecap/>
#
interface(`TEMPLATETYPE_read_log',`
	gen_require(`
		type TEMPLATETYPE_log_t;
	')

	logging_search_logs($1)
	read_files_pattern($1, TEMPLATETYPE_log_t, TEMPLATETYPE_log_t)
')

########################################
## <summary>
##	Append to TEMPLATETYPE log files.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_append_log',`
	gen_require(`
		type TEMPLATETYPE_log_t;
	')

	logging_search_logs($1)
	append_files_pattern($1, TEMPLATETYPE_log_t, TEMPLATETYPE_log_t)
')

########################################
## <summary>
##	Manage TEMPLATETYPE log files
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_manage_log',`
	gen_require(`
		type TEMPLATETYPE_log_t;
	')

	logging_search_logs($1)
	manage_dirs_pattern($1, TEMPLATETYPE_log_t, TEMPLATETYPE_log_t)
	manage_files_pattern($1, TEMPLATETYPE_log_t, TEMPLATETYPE_log_t)
	manage_lnk_files_pattern($1, TEMPLATETYPE_log_t, TEMPLATETYPE_log_t)
')
z
		type TEMPLATETYPE_log_t;zA
	logging_search_logs($1)
	admin_pattern($1, TEMPLATETYPE_log_t)
zBFILENAME		--	gen_context(system_u:object_r:TEMPLATETYPE_log_t,s0)
zEFILENAME(/.*)?		gen_context(system_u:object_r:TEMPLATETYPE_log_t,s0)
N)Zte_typesZte_rulesZif_rulesZif_admin_typesZif_admin_rulesZfc_fileZfc_dir�rr�/usr/lib/python3.6/var_log.py�<module>s?site-packages/sepolicy/templates/__pycache__/var_run.cpython-36.opt-1.pyc000064400000003753147511334650022403 0ustar003

>�\s�@s,dZdZdZdZdZdZdZdZdZd	Z	d
S)zE
type TEMPLATETYPE_var_run_t;
files_pid_file(TEMPLATETYPE_var_run_t)
aV
manage_dirs_pattern(TEMPLATETYPE_t, TEMPLATETYPE_var_run_t, TEMPLATETYPE_var_run_t)
manage_files_pattern(TEMPLATETYPE_t, TEMPLATETYPE_var_run_t, TEMPLATETYPE_var_run_t)
manage_lnk_files_pattern(TEMPLATETYPE_t, TEMPLATETYPE_var_run_t, TEMPLATETYPE_var_run_t)
files_pid_filetrans(TEMPLATETYPE_t, TEMPLATETYPE_var_run_t, { dir file lnk_file })
z�
manage_files_pattern(TEMPLATETYPE_t, TEMPLATETYPE_var_run_t, TEMPLATETYPE_var_run_t)
files_pid_filetrans(TEMPLATETYPE_t, TEMPLATETYPE_var_run_t, sock_file)
a�########################################
## <summary>
##	Read TEMPLATETYPE PID files.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_read_pid_files',`
	gen_require(`
		type TEMPLATETYPE_var_run_t;
	')

	files_search_pids($1)
	read_files_pattern($1, TEMPLATETYPE_var_run_t, TEMPLATETYPE_var_run_t)
')

a�########################################
## <summary>
##	Connect to TEMPLATETYPE over a unix stream socket.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_stream_connect',`
	gen_require(`
		type TEMPLATETYPE_t, TEMPLATETYPE_var_run_t;
	')

	files_search_pids($1)
	stream_connect_pattern($1, TEMPLATETYPE_var_run_t, TEMPLATETYPE_var_run_t, TEMPLATETYPE_t)
')
z
		type TEMPLATETYPE_var_run_t;zC
	files_search_pids($1)
	admin_pattern($1, TEMPLATETYPE_var_run_t)
zFFILENAME		--	gen_context(system_u:object_r:TEMPLATETYPE_var_run_t,s0)
zFFILENAME		-s	gen_context(system_u:object_r:TEMPLATETYPE_var_run_t,s0)
zIFILENAME(/.*)?		gen_context(system_u:object_r:TEMPLATETYPE_var_run_t,s0)
N)
Zte_typesZte_rulesZte_stream_rulesZif_rulesZif_stream_rulesZif_admin_typesZif_admin_rulesZfc_fileZfc_sock_fileZfc_dir�rr�/usr/lib/python3.6/var_run.py�<module>ssite-packages/sepolicy/templates/__pycache__/tmp.cpython-36.opt-1.pyc000064400000004776147511334650021535 0ustar003

>�\�
�@s dZdZdZdZdZdZdZdS)z=
type TEMPLATETYPE_tmp_t;
files_tmp_file(TEMPLATETYPE_tmp_t)
a:
manage_dirs_pattern(TEMPLATETYPE_t, TEMPLATETYPE_tmp_t, TEMPLATETYPE_tmp_t)
manage_files_pattern(TEMPLATETYPE_t, TEMPLATETYPE_tmp_t, TEMPLATETYPE_tmp_t)
manage_lnk_files_pattern(TEMPLATETYPE_t, TEMPLATETYPE_tmp_t, TEMPLATETYPE_tmp_t)
files_tmp_filetrans(TEMPLATETYPE_t, TEMPLATETYPE_tmp_t, { dir file lnk_file })
z�
manage_sock_files_pattern(TEMPLATETYPE_t, TEMPLATETYPE_tmp_t, TEMPLATETYPE_tmp_t)
files_tmp_filetrans(TEMPLATETYPE_t, TEMPLATETYPE_tmp_t, sock_file)
a�
########################################
## <summary>
##	Do not audit attempts to read,
##	TEMPLATETYPE tmp files
## </summary>
## <param name="domain">
##	<summary>
##	Domain to not audit.
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_dontaudit_read_tmp_files',`
	gen_require(`
		type TEMPLATETYPE_tmp_t;
	')

	dontaudit $1 TEMPLATETYPE_tmp_t:file read_file_perms;
')

########################################
## <summary>
##	Read TEMPLATETYPE tmp files
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_read_tmp_files',`
	gen_require(`
		type TEMPLATETYPE_tmp_t;
	')

	files_search_tmp($1)
	read_files_pattern($1, TEMPLATETYPE_tmp_t, TEMPLATETYPE_tmp_t)
')

########################################
## <summary>
##	Manage TEMPLATETYPE tmp files
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_manage_tmp',`
	gen_require(`
		type TEMPLATETYPE_tmp_t;
	')

	files_search_tmp($1)
	manage_dirs_pattern($1, TEMPLATETYPE_tmp_t, TEMPLATETYPE_tmp_t)
	manage_files_pattern($1, TEMPLATETYPE_tmp_t, TEMPLATETYPE_tmp_t)
	manage_lnk_files_pattern($1, TEMPLATETYPE_tmp_t, TEMPLATETYPE_tmp_t)
')
a�########################################
## <summary>
##	Connect to TEMPLATETYPE over a unix stream socket.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_stream_connect',`
	gen_require(`
		type TEMPLATETYPE_t, TEMPLATETYPE_tmp_t;
	')

	files_search_pids($1)
	stream_connect_pattern($1, TEMPLATETYPE_tmp_t, TEMPLATETYPE_tmp_t, TEMPLATETYPE_t)
')
z
		type TEMPLATETYPE_tmp_t;z>
	files_search_tmp($1)
	admin_pattern($1, TEMPLATETYPE_tmp_t)
N)Zte_typesZte_rulesZte_stream_rulesZif_rulesZif_stream_rulesZif_admin_typesZif_admin_rules�rr�/usr/lib/python3.6/tmp.py�<module>s=site-packages/sepolicy/templates/__pycache__/test_module.cpython-36.opt-1.pyc000064400000004670147511334650023252 0ustar003

>�\a�@s\iZded<ded<ded<ded<ded<d	ed
<ded<ded
<ded<ded<ded<ded<ded<ded<ded<ded<ded<ded<ded <d!ed"<d!ed#<ded$<d!ed%<d!ed&<d!ed'<d(ed)<d*ed+<d(ed,<d-ed.<d/ed0<d/ed1<d2ed3<d2ed4<d$ed5<d$ed6<d$ed7<d$ed8<d9ed:<d;ed<<d=ed><d?ed@<dAedB<dCZdDS)EZsepolicy_domain_tZdomainZdomainsZsepolicy_target_tZ
target_domainZsepolicy_source_tZ
source_domainZsepolicy_peer_tZpeer_domainZsepolicy_exception_types_tZexception_typesZsepolicy_userdomain_tZuser_domainZ
userdomainZsepolicy_bool_domain_tZbool_domainZsepolicy_file_t�typeZ	file_typeZsepolicy_private_file_tzprivate typeZprivate_typeZsepolicy_devpts_tZpty_typeZsepolicy_tmpfs_tZ
tmpfs_typeZsepolicy_home_file_tZ	home_typeZ
sepolicy_tZtty_typeZdirectory_typeZsepolicy_object_tZobject_typeZsepolicy_exec_tZscript_fileZentry_point�fileZ
entry_fileZinit_script_fileZ
entrypointZ
sepolicy_rZroleZsepolicyZrole_prefixZ	user_roleZsepolicy_source_rZsource_roleZsepolicy_domain�prefixZ
domain_prefixZsepolicy_userdomainZuserdomain_prefixZuser_prefixZobject_class�object�classzobjectclass(es)Zsepolicy_objectZobject_namez"sepolicy_name"�nameZsepolicy_tty_tZterminalZsepolicy_bool_tZbooleanzs0 - mcs_systemhigh�rangea�policy_module(TEMPLATETYPE, 1.0.0)

type sepolicy_t;
domain_type(sepolicy_t)
type sepolicy_domain_t;
domain_type(sepolicy_domain_t)
type sepolicy_target_t;
domain_type(sepolicy_target_t)
type sepolicy_source_t;
domain_type(sepolicy_source_t)
type sepolicy_peer_t;
domain_type(sepolicy_peer_t)
type sepolicy_exception_types_t;
domain_type(sepolicy_exception_types_t)
type sepolicy_userdomain_t;
domain_type(sepolicy_userdomain_t)

type sepolicy_file_t;
files_type(sepolicy_file_t)
type sepolicy_private_file_t;
files_type(sepolicy_private_file_t)
type sepolicy_home_file_t;
files_type(sepolicy_home_file_t)
type sepolicy_tty_t;
term_tty(sepolicy_tty_t)
type sepolicy_object_t;
type sepolicy_devpts_t;
term_pty(sepolicy_devpts_t)
type sepolicy_tmpfs_t;
files_type(sepolicy_tmpfs_t)
type sepolicy_exec_t;
files_type(sepolicy_exec_t)

role sepolicy_r;
role sepolicy_source_r;
role sepolicy_target_r;

#################################
#
# Local policy
#

N)�dict_valuesZte_test_module�r	r	�!/usr/lib/python3.6/test_module.py�<module>sV-site-packages/sepolicy/templates/__pycache__/spec.cpython-36.pyc000064400000004273147511334650020720 0ustar003

>�\V�@sdZdZdZdZdZdS)z# vim: sw=4:ts=4:et
a
%define selinux_policyver VERSION

Name:   MODULENAME_selinux
Version:	1.0
Release:	1%{?dist}
Summary:	SELinux policy module for MODULENAME

Group:	System Environment/Base		
License:	GPLv2+	
# This is an example. You will need to change it.
URL:		http://HOSTNAME
Source0:	MODULENAME.pp
Source1:	MODULENAME.if
Source2:	DOMAINNAME_selinux.8
Source3:	DOMAINNAME_u

Requires: policycoreutils, libselinux-utils
Requires(post): selinux-policy-base >= %{selinux_policyver}, policycoreutils
Requires(postun): policycoreutils
alBuildArch: noarch

%description
This package installs and sets up the  SELinux policy security module for MODULENAME.

%install
install -d %{buildroot}%{_datadir}/selinux/packages
install -m 644 %{SOURCE0} %{buildroot}%{_datadir}/selinux/packages
install -d %{buildroot}%{_datadir}/selinux/devel/include/contrib
install -m 644 %{SOURCE1} %{buildroot}%{_datadir}/selinux/devel/include/contrib/
install -d %{buildroot}%{_mandir}/man8/
install -m 644 %{SOURCE2} %{buildroot}%{_mandir}/man8/DOMAINNAME_selinux.8
install -d %{buildroot}/etc/selinux/targeted/contexts/users/
install -m 644 %{SOURCE3} %{buildroot}/etc/selinux/targeted/contexts/users/DOMAINNAME_u

%post
semodule -n -i %{_datadir}/selinux/packages/MODULENAME.pp
if /usr/sbin/selinuxenabled ; then
    /usr/sbin/load_policy
    %relabel_files
    /usr/sbin/semanage user -a -R DOMAINNAME_r DOMAINNAME_u
fi;
exit 0

%postun
if [ $1 -eq 0 ]; then
    semodule -n -r MODULENAME
    if /usr/sbin/selinuxenabled ; then
       /usr/sbin/load_policy
       %relabel_files
       /usr/sbin/semanage user -d DOMAINNAME_u
    fi;
fi;
exit 0

%files
%attr(0600,root,root) %{_datadir}/selinux/packages/MODULENAME.pp
%{_datadir}/selinux/devel/include/contrib/MODULENAME.if
%{_mandir}/man8/DOMAINNAME_selinux.8.*
/etc/selinux/targeted/contexts/users/DOMAINNAME_u

%changelog
* TODAYSDATE YOUR NAME <YOUR@EMAILADDRESS> 1.0-1
- Initial version

z

%define relabel_files() \
zrestorecon -R FILENAME; \
N)Zheader_comment_sectionZbase_sectionZmid_sectionZdefine_relabel_files_beginZdefine_relabel_files_end�rr�/usr/lib/python3.6/spec.py�<module>s0site-packages/sepolicy/templates/__pycache__/__init__.cpython-36.pyc000064400000000161147511334650021515 0ustar003

>�\��@sdS)N�rrr�/usr/lib/python3.6/__init__.py�<module>ssite-packages/sepolicy/templates/__pycache__/etc_rw.cpython-36.pyc000064400000005243147511334650021247 0ustar003

>�\�@s(dZdZdZdZdZdZdZdZdZd	S)
z?
type TEMPLATETYPE_etc_rw_t;
files_type(TEMPLATETYPE_etc_rw_t)
aO
manage_dirs_pattern(TEMPLATETYPE_t, TEMPLATETYPE_etc_rw_t, TEMPLATETYPE_etc_rw_t)
manage_files_pattern(TEMPLATETYPE_t, TEMPLATETYPE_etc_rw_t, TEMPLATETYPE_etc_rw_t)
manage_lnk_files_pattern(TEMPLATETYPE_t, TEMPLATETYPE_etc_rw_t, TEMPLATETYPE_etc_rw_t)
files_etc_filetrans(TEMPLATETYPE_t, TEMPLATETYPE_etc_rw_t, { dir file lnk_file })
z�
manage_sock_files_pattern(TEMPLATETYPE_t, TEMPLATETYPE_etc_rw_t, TEMPLATETYPE_etc_rw_t)
files_etc_filetrans(TEMPLATETYPE_t, TEMPLATETYPE_etc_rw_t, sock_file)
a�
########################################
## <summary>
##	Search TEMPLATETYPE conf directories.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_search_conf',`
	gen_require(`
		type TEMPLATETYPE_etc_rw_t;
	')

	allow $1 TEMPLATETYPE_etc_rw_t:dir search_dir_perms;
	files_search_etc($1)
')

########################################
## <summary>
##	Read TEMPLATETYPE conf files.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_read_conf_files',`
	gen_require(`
		type TEMPLATETYPE_etc_rw_t;
	')

	allow $1 TEMPLATETYPE_etc_rw_t:dir list_dir_perms;
	read_files_pattern($1, TEMPLATETYPE_etc_rw_t, TEMPLATETYPE_etc_rw_t)
	files_search_etc($1)
')

########################################
## <summary>
##	Manage TEMPLATETYPE conf files.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_manage_conf_files',`
	gen_require(`
		type TEMPLATETYPE_etc_rw_t;
	')

	manage_files_pattern($1, TEMPLATETYPE_etc_rw_t, TEMPLATETYPE_etc_rw_t)
	files_search_etc($1)
')

a�########################################
## <summary>
##	Connect to TEMPLATETYPE over a unix stream socket.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_stream_connect',`
	gen_require(`
		type TEMPLATETYPE_t, TEMPLATETYPE_etc_rw_t;
	')

	files_search_etc($1)
	stream_connect_pattern($1, TEMPLATETYPE_etc_rw_t, TEMPLATETYPE_etc_rw_t, TEMPLATETYPE_t)
')
z
		type TEMPLATETYPE_etc_rw_t;zA
	files_search_etc($1)
	admin_pattern($1, TEMPLATETYPE_etc_rw_t)
zEFILENAME		--	gen_context(system_u:object_r:TEMPLATETYPE_etc_rw_t,s0)
zHFILENAME(/.*)?		gen_context(system_u:object_r:TEMPLATETYPE_etc_rw_t,s0)
N)	Zte_typesZte_rulesZte_stream_rulesZif_rulesZif_stream_rulesZif_admin_typesZif_admin_rulesZfc_fileZfc_dir�rr�/usr/lib/python3.6/etc_rw.py�<module>s>site-packages/sepolicy/templates/__pycache__/tmp.cpython-36.pyc000064400000004776147511334650020576 0ustar003

>�\�
�@s dZdZdZdZdZdZdZdS)z=
type TEMPLATETYPE_tmp_t;
files_tmp_file(TEMPLATETYPE_tmp_t)
a:
manage_dirs_pattern(TEMPLATETYPE_t, TEMPLATETYPE_tmp_t, TEMPLATETYPE_tmp_t)
manage_files_pattern(TEMPLATETYPE_t, TEMPLATETYPE_tmp_t, TEMPLATETYPE_tmp_t)
manage_lnk_files_pattern(TEMPLATETYPE_t, TEMPLATETYPE_tmp_t, TEMPLATETYPE_tmp_t)
files_tmp_filetrans(TEMPLATETYPE_t, TEMPLATETYPE_tmp_t, { dir file lnk_file })
z�
manage_sock_files_pattern(TEMPLATETYPE_t, TEMPLATETYPE_tmp_t, TEMPLATETYPE_tmp_t)
files_tmp_filetrans(TEMPLATETYPE_t, TEMPLATETYPE_tmp_t, sock_file)
a�
########################################
## <summary>
##	Do not audit attempts to read,
##	TEMPLATETYPE tmp files
## </summary>
## <param name="domain">
##	<summary>
##	Domain to not audit.
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_dontaudit_read_tmp_files',`
	gen_require(`
		type TEMPLATETYPE_tmp_t;
	')

	dontaudit $1 TEMPLATETYPE_tmp_t:file read_file_perms;
')

########################################
## <summary>
##	Read TEMPLATETYPE tmp files
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_read_tmp_files',`
	gen_require(`
		type TEMPLATETYPE_tmp_t;
	')

	files_search_tmp($1)
	read_files_pattern($1, TEMPLATETYPE_tmp_t, TEMPLATETYPE_tmp_t)
')

########################################
## <summary>
##	Manage TEMPLATETYPE tmp files
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_manage_tmp',`
	gen_require(`
		type TEMPLATETYPE_tmp_t;
	')

	files_search_tmp($1)
	manage_dirs_pattern($1, TEMPLATETYPE_tmp_t, TEMPLATETYPE_tmp_t)
	manage_files_pattern($1, TEMPLATETYPE_tmp_t, TEMPLATETYPE_tmp_t)
	manage_lnk_files_pattern($1, TEMPLATETYPE_tmp_t, TEMPLATETYPE_tmp_t)
')
a�########################################
## <summary>
##	Connect to TEMPLATETYPE over a unix stream socket.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_stream_connect',`
	gen_require(`
		type TEMPLATETYPE_t, TEMPLATETYPE_tmp_t;
	')

	files_search_pids($1)
	stream_connect_pattern($1, TEMPLATETYPE_tmp_t, TEMPLATETYPE_tmp_t, TEMPLATETYPE_t)
')
z
		type TEMPLATETYPE_tmp_t;z>
	files_search_tmp($1)
	admin_pattern($1, TEMPLATETYPE_tmp_t)
N)Zte_typesZte_rulesZte_stream_rulesZif_rulesZif_stream_rulesZif_admin_typesZif_admin_rules�rr�/usr/lib/python3.6/tmp.py�<module>s=site-packages/sepolicy/templates/__pycache__/executable.cpython-36.pyc000064400000021246147511334650022106 0ustar003

>�\�&�@s�dZdZdZdZdZdZdZdZdZd	Z	d
Z
dZdZdZ
dZd
ZdZdZdZdZdZdZdZdZdZdZdZdZdZdZdZdZdZ dZ!d Z"d!Z#d"Z$d#Z%d$Z&d%Z'd&Z(d'S)(z�policy_module(TEMPLATETYPE, 1.0.0)

########################################
#
# Declarations
#

type TEMPLATETYPE_t;
type TEMPLATETYPE_exec_t;
init_daemon_domain(TEMPLATETYPE_t, TEMPLATETYPE_exec_t)

permissive TEMPLATETYPE_t;
zO
type TEMPLATETYPE_initrc_exec_t;
init_script_file(TEMPLATETYPE_initrc_exec_t)
a#policy_module(TEMPLATETYPE, 1.0.0)

########################################
#
# Declarations
#

type TEMPLATETYPE_t;
type TEMPLATETYPE_exec_t;
domain_type(TEMPLATETYPE_t)
domain_entry_file(TEMPLATETYPE_t, TEMPLATETYPE_exec_t)
role system_r types TEMPLATETYPE_t;

permissive TEMPLATETYPE_t;
z�policy_module(TEMPLATETYPE, 1.0.0)

########################################
#
# Declarations
#

type TEMPLATETYPE_t;
type TEMPLATETYPE_exec_t;
inetd_service_domain(TEMPLATETYPE_t, TEMPLATETYPE_exec_t)

permissive TEMPLATETYPE_t;
aapolicy_module(TEMPLATETYPE, 1.0.0)

########################################
#
# Declarations
#

attribute_role TEMPLATETYPE_roles;
roleattribute system_r TEMPLATETYPE_roles;

type TEMPLATETYPE_t;
type TEMPLATETYPE_exec_t;
application_domain(TEMPLATETYPE_t, TEMPLATETYPE_exec_t)
role TEMPLATETYPE_roles types TEMPLATETYPE_t;

permissive TEMPLATETYPE_t;
z�policy_module(TEMPLATETYPE, 1.0.0)

########################################
#
# Declarations
#

sandbox_x_domain_template(TEMPLATETYPE)

permissive TEMPLATETYPE_t;
permissive TEMPLATETYPE_client_t;

z�policy_module(TEMPLATETYPE, 1.0.0)

########################################
#
# Declarations
#

apache_content_template(TEMPLATETYPE)

permissive httpd_TEMPLATETYPE_script_t;
z�allow TEMPLATETYPE_t self:fifo_file rw_fifo_file_perms;
allow TEMPLATETYPE_t self:unix_stream_socket create_stream_socket_perms;
�
zO
optional_policy(`
	dbus_system_domain(TEMPLATETYPE_t, TEMPLATETYPE_exec_t)
')
z�
allow TEMPLATETYPE_t self:fifo_file manage_fifo_file_perms;
allow TEMPLATETYPE_t self:unix_stream_socket create_stream_socket_perms;
z#
auth_use_nsswitch(TEMPLATETYPE_t)
z)
logging_send_syslog_msg(TEMPLATETYPE_t)
z)
sysnet_dns_name_resolve(TEMPLATETYPE_t)
z*
auth_domtrans_chk_passwd(TEMPLATETYPE_t)
z
mta_send_mail(TEMPLATETYPE_t)
zg
optional_policy(`
	dbus_system_bus_client(TEMPLATETYPE_t)
	dbus_connect_system_bus(TEMPLATETYPE_t)
')
z4
optional_policy(`
	kerberos_use(TEMPLATETYPE_t)
')
z{
optional_policy(`
	kerberos_keytab_template(TEMPLATETYPE, TEMPLATETYPE_t)
	kerberos_manage_host_rcache(TEMPLATETYPE_t)
')
z)
logging_send_audit_msgs(TEMPLATETYPE_t)
zj
optional_policy(`
	gen_require(`
		type USER_t;
		role USER_r;
	')

	TEMPLATETYPE_run(USER_t, USER_r)
')
z,
domain_use_interactive_fds(TEMPLATETYPE_t)
z&
files_read_etc_files(TEMPLATETYPE_t)
z-
miscfiles_read_localization(TEMPLATETYPE_t)
z.
## <summary>policy for TEMPLATETYPE</summary>a�

########################################
## <summary>
##	Execute TEMPLATETYPE_exec_t in the TEMPLATETYPE domain.
## </summary>
## <param name="domain">
## <summary>
##	Domain allowed to transition.
## </summary>
## </param>
#
interface(`TEMPLATETYPE_domtrans',`
	gen_require(`
		type TEMPLATETYPE_t, TEMPLATETYPE_exec_t;
	')

	corecmd_search_bin($1)
	domtrans_pattern($1, TEMPLATETYPE_exec_t, TEMPLATETYPE_t)
')

######################################
## <summary>
##	Execute TEMPLATETYPE in the caller domain.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_exec',`
	gen_require(`
		type TEMPLATETYPE_exec_t;
	')

	corecmd_search_bin($1)
	can_exec($1, TEMPLATETYPE_exec_t)
')
ak
########################################
## <summary>
##	Execute TEMPLATETYPE in the TEMPLATETYPE domain, and
##	allow the specified role the TEMPLATETYPE domain.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed to transition
##	</summary>
## </param>
## <param name="role">
##	<summary>
##	The role to be allowed the TEMPLATETYPE domain.
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_run',`
	gen_require(`
		type TEMPLATETYPE_t;
		attribute_role TEMPLATETYPE_roles;
	')

	TEMPLATETYPE_domtrans($1)
	roleattribute $2 TEMPLATETYPE_roles;
')

########################################
## <summary>
##	Role access for TEMPLATETYPE
## </summary>
## <param name="role">
##	<summary>
##	Role allowed access
##	</summary>
## </param>
## <param name="domain">
##	<summary>
##	User domain for the role
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_role',`
	gen_require(`
		type TEMPLATETYPE_t;
		attribute_role TEMPLATETYPE_roles;
	')

	roleattribute $1 TEMPLATETYPE_roles;

	TEMPLATETYPE_domtrans($2)

	ps_process_pattern($2, TEMPLATETYPE_t)
	allow $2 TEMPLATETYPE_t:process { signull signal sigkill };
')
a�
########################################
## <summary>
##	Execute sandbox in the TEMPLATETYPE_t domain, and
##	allow the specified role the TEMPLATETYPE_t domain.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed to transition.
##	</summary>
## </param>
## <param name="role">
##	<summary>
##	The role to be allowed the TEMPLATETYPE_t domain.
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_transition',`
	gen_require(`
		type TEMPLATETYPE_t;
		type TEMPLATETYPE_client_t;
	')

	allow $1 TEMPLATETYPE_t:process { signal_perms transition };
	dontaudit $1 TEMPLATETYPE_t:process { noatsecure siginh rlimitinh };
	role $2 types TEMPLATETYPE_t;
	role $2 types TEMPLATETYPE_client_t;

	allow TEMPLATETYPE_t $1:process { sigchld signull };
	allow TEMPLATETYPE_t $1:fifo_file rw_inherited_fifo_file_perms;
	allow TEMPLATETYPE_client_t $1:process { sigchld signull };
	allow TEMPLATETYPE_client_t $1:fifo_file rw_inherited_fifo_file_perms;
')
a>
########################################
## <summary>
##	Change to the TEMPLATETYPE role.
## </summary>
## <param name="role">
##	<summary>
##	Role allowed access.
##	</summary>
## </param>
## <rolecap/>
#
interface(`TEMPLATETYPE_role_change',`
	gen_require(`
		role TEMPLATETYPE_r;
	')

	allow $1 TEMPLATETYPE_r;
')
a
########################################
## <summary>
##	Execute TEMPLATETYPE server in the TEMPLATETYPE domain.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_initrc_domtrans',`
	gen_require(`
		type TEMPLATETYPE_initrc_exec_t;
	')

	init_labeled_script_domtrans($1, TEMPLATETYPE_initrc_exec_t)
')
a�
########################################
## <summary>
##	Send and receive messages from
##	TEMPLATETYPE over dbus.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_dbus_chat',`
	gen_require(`
		type TEMPLATETYPE_t;
		class dbus send_msg;
	')

	allow $1 TEMPLATETYPE_t:dbus send_msg;
	allow TEMPLATETYPE_t $1:dbus send_msg;
')
a�
########################################
## <summary>
##	All of the rules required to administrate
##	an TEMPLATETYPE environment
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
## <param name="role">
##	<summary>
##	Role allowed access.
##	</summary>
## </param>
## <rolecap/>
#
interface(`TEMPLATETYPE_admin',`
	gen_require(`
		type TEMPLATETYPE_t;z�
	')

	allow $1 TEMPLATETYPE_t:process { signal_perms };
	ps_process_pattern($1, TEMPLATETYPE_t)

    tunable_policy(`deny_ptrace',`',`
        allow $1 TEMPLATETYPE_t:process ptrace;
    ')
z#
		type TEMPLATETYPE_initrc_exec_t;z�
	TEMPLATETYPE_initrc_domtrans($1)
	domain_system_change_exemption($1)
	role_transition $2 TEMPLATETYPE_initrc_exec_t system_r;
	allow $2 system_r;
zb	optional_policy(`
		systemd_passwd_agent_exec($1)
		systemd_read_fifo_file_passwd_run($1)
	')
')
zEEXECUTABLE		--	gen_context(system_u:object_r:TEMPLATETYPE_exec_t,s0)
z #  No file context, leave blank
zKEXECUTABLE	--	gen_context(system_u:object_r:TEMPLATETYPE_initrc_exec_t,s0)
N))Zte_daemon_typesZte_initscript_typesZte_dbusd_typesZte_inetd_typesZte_userapp_typesZte_sandbox_typesZte_cgi_typesZte_daemon_rulesZte_inetd_rulesZte_dbusd_rulesZte_userapp_rulesZte_cgi_rulesZte_sandbox_rulesZte_uid_rulesZte_syslog_rulesZte_resolve_rulesZte_pam_rulesZ
te_mail_rulesZ
te_dbus_rulesZte_kerberos_rulesZte_manage_krb5_rcache_rulesZte_audit_rulesZte_run_rulesZte_fd_rulesZte_etc_rulesZte_localization_rulesZif_heading_rulesZif_program_rulesZif_user_program_rulesZif_sandbox_rulesZif_role_change_rulesZif_initscript_rulesZ
if_dbus_rulesZif_begin_adminZif_middle_adminZif_initscript_admin_typesZif_initscript_adminZif_end_adminZ
fc_programZfc_userZ
fc_initscript�rr� /usr/lib/python3.6/executable.py�<module>$sP
)9#site-packages/sepolicy/templates/__pycache__/var_cache.cpython-36.pyc000064400000005723147511334650021702 0ustar003

>�\8�@s(dZdZdZdZdZdZdZdZdZd	S)
z=
type TEMPLATETYPE_cache_t;
files_type(TEMPLATETYPE_cache_t)
aH
manage_dirs_pattern(TEMPLATETYPE_t, TEMPLATETYPE_cache_t, TEMPLATETYPE_cache_t)
manage_files_pattern(TEMPLATETYPE_t, TEMPLATETYPE_cache_t, TEMPLATETYPE_cache_t)
manage_lnk_files_pattern(TEMPLATETYPE_t, TEMPLATETYPE_cache_t, TEMPLATETYPE_cache_t)
files_var_filetrans(TEMPLATETYPE_t, TEMPLATETYPE_cache_t, { dir file lnk_file })
z�manage_sock_files_pattern(TEMPLATETYPE_t, TEMPLATETYPE_cache_t, TEMPLATETYPE_cache_t)
files_var_filetrans(TEMPLATETYPE_t, TEMPLATETYPE_cache_t, sock_file)
a
########################################
## <summary>
##	Search TEMPLATETYPE cache directories.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_search_cache',`
	gen_require(`
		type TEMPLATETYPE_cache_t;
	')

	allow $1 TEMPLATETYPE_cache_t:dir search_dir_perms;
	files_search_var($1)
')

########################################
## <summary>
##	Read TEMPLATETYPE cache files.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_read_cache_files',`
	gen_require(`
		type TEMPLATETYPE_cache_t;
	')

	files_search_var($1)
	read_files_pattern($1, TEMPLATETYPE_cache_t, TEMPLATETYPE_cache_t)
')

########################################
## <summary>
##	Create, read, write, and delete
##	TEMPLATETYPE cache files.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_manage_cache_files',`
	gen_require(`
		type TEMPLATETYPE_cache_t;
	')

	files_search_var($1)
	manage_files_pattern($1, TEMPLATETYPE_cache_t, TEMPLATETYPE_cache_t)
')

########################################
## <summary>
##	Manage TEMPLATETYPE cache dirs.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_manage_cache_dirs',`
	gen_require(`
		type TEMPLATETYPE_cache_t;
	')

	files_search_var($1)
	manage_dirs_pattern($1, TEMPLATETYPE_cache_t, TEMPLATETYPE_cache_t)
')

a�
########################################
## <summary>
##	Connect to TEMPLATETYPE over a unix stream socket.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_stream_connect',`
	gen_require(`
		type TEMPLATETYPE_t, TEMPLATETYPE_cache_t;
	')

	stream_connect_pattern($1, TEMPLATETYPE_cache_t, TEMPLATETYPE_cache_t)
')
z
		type TEMPLATETYPE_cache_t;z@
	files_search_var($1)
	admin_pattern($1, TEMPLATETYPE_cache_t)
zDFILENAME		--	gen_context(system_u:object_r:TEMPLATETYPE_cache_t,s0)
zGFILENAME(/.*)?		gen_context(system_u:object_r:TEMPLATETYPE_cache_t,s0)
N)	Zte_typesZte_rulesZte_stream_rulesZif_rulesZif_stream_rulesZif_admin_typesZif_admin_rulesZfc_fileZfc_dir�rr�/usr/lib/python3.6/var_cache.py�<module>sQsite-packages/sepolicy/templates/__pycache__/executable.cpython-36.opt-1.pyc000064400000021246147511334650023045 0ustar003

>�\�&�@s�dZdZdZdZdZdZdZdZdZd	Z	d
Z
dZdZdZ
dZd
ZdZdZdZdZdZdZdZdZdZdZdZdZdZdZdZdZdZ dZ!d Z"d!Z#d"Z$d#Z%d$Z&d%Z'd&Z(d'S)(z�policy_module(TEMPLATETYPE, 1.0.0)

########################################
#
# Declarations
#

type TEMPLATETYPE_t;
type TEMPLATETYPE_exec_t;
init_daemon_domain(TEMPLATETYPE_t, TEMPLATETYPE_exec_t)

permissive TEMPLATETYPE_t;
zO
type TEMPLATETYPE_initrc_exec_t;
init_script_file(TEMPLATETYPE_initrc_exec_t)
a#policy_module(TEMPLATETYPE, 1.0.0)

########################################
#
# Declarations
#

type TEMPLATETYPE_t;
type TEMPLATETYPE_exec_t;
domain_type(TEMPLATETYPE_t)
domain_entry_file(TEMPLATETYPE_t, TEMPLATETYPE_exec_t)
role system_r types TEMPLATETYPE_t;

permissive TEMPLATETYPE_t;
z�policy_module(TEMPLATETYPE, 1.0.0)

########################################
#
# Declarations
#

type TEMPLATETYPE_t;
type TEMPLATETYPE_exec_t;
inetd_service_domain(TEMPLATETYPE_t, TEMPLATETYPE_exec_t)

permissive TEMPLATETYPE_t;
aapolicy_module(TEMPLATETYPE, 1.0.0)

########################################
#
# Declarations
#

attribute_role TEMPLATETYPE_roles;
roleattribute system_r TEMPLATETYPE_roles;

type TEMPLATETYPE_t;
type TEMPLATETYPE_exec_t;
application_domain(TEMPLATETYPE_t, TEMPLATETYPE_exec_t)
role TEMPLATETYPE_roles types TEMPLATETYPE_t;

permissive TEMPLATETYPE_t;
z�policy_module(TEMPLATETYPE, 1.0.0)

########################################
#
# Declarations
#

sandbox_x_domain_template(TEMPLATETYPE)

permissive TEMPLATETYPE_t;
permissive TEMPLATETYPE_client_t;

z�policy_module(TEMPLATETYPE, 1.0.0)

########################################
#
# Declarations
#

apache_content_template(TEMPLATETYPE)

permissive httpd_TEMPLATETYPE_script_t;
z�allow TEMPLATETYPE_t self:fifo_file rw_fifo_file_perms;
allow TEMPLATETYPE_t self:unix_stream_socket create_stream_socket_perms;
�
zO
optional_policy(`
	dbus_system_domain(TEMPLATETYPE_t, TEMPLATETYPE_exec_t)
')
z�
allow TEMPLATETYPE_t self:fifo_file manage_fifo_file_perms;
allow TEMPLATETYPE_t self:unix_stream_socket create_stream_socket_perms;
z#
auth_use_nsswitch(TEMPLATETYPE_t)
z)
logging_send_syslog_msg(TEMPLATETYPE_t)
z)
sysnet_dns_name_resolve(TEMPLATETYPE_t)
z*
auth_domtrans_chk_passwd(TEMPLATETYPE_t)
z
mta_send_mail(TEMPLATETYPE_t)
zg
optional_policy(`
	dbus_system_bus_client(TEMPLATETYPE_t)
	dbus_connect_system_bus(TEMPLATETYPE_t)
')
z4
optional_policy(`
	kerberos_use(TEMPLATETYPE_t)
')
z{
optional_policy(`
	kerberos_keytab_template(TEMPLATETYPE, TEMPLATETYPE_t)
	kerberos_manage_host_rcache(TEMPLATETYPE_t)
')
z)
logging_send_audit_msgs(TEMPLATETYPE_t)
zj
optional_policy(`
	gen_require(`
		type USER_t;
		role USER_r;
	')

	TEMPLATETYPE_run(USER_t, USER_r)
')
z,
domain_use_interactive_fds(TEMPLATETYPE_t)
z&
files_read_etc_files(TEMPLATETYPE_t)
z-
miscfiles_read_localization(TEMPLATETYPE_t)
z.
## <summary>policy for TEMPLATETYPE</summary>a�

########################################
## <summary>
##	Execute TEMPLATETYPE_exec_t in the TEMPLATETYPE domain.
## </summary>
## <param name="domain">
## <summary>
##	Domain allowed to transition.
## </summary>
## </param>
#
interface(`TEMPLATETYPE_domtrans',`
	gen_require(`
		type TEMPLATETYPE_t, TEMPLATETYPE_exec_t;
	')

	corecmd_search_bin($1)
	domtrans_pattern($1, TEMPLATETYPE_exec_t, TEMPLATETYPE_t)
')

######################################
## <summary>
##	Execute TEMPLATETYPE in the caller domain.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_exec',`
	gen_require(`
		type TEMPLATETYPE_exec_t;
	')

	corecmd_search_bin($1)
	can_exec($1, TEMPLATETYPE_exec_t)
')
ak
########################################
## <summary>
##	Execute TEMPLATETYPE in the TEMPLATETYPE domain, and
##	allow the specified role the TEMPLATETYPE domain.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed to transition
##	</summary>
## </param>
## <param name="role">
##	<summary>
##	The role to be allowed the TEMPLATETYPE domain.
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_run',`
	gen_require(`
		type TEMPLATETYPE_t;
		attribute_role TEMPLATETYPE_roles;
	')

	TEMPLATETYPE_domtrans($1)
	roleattribute $2 TEMPLATETYPE_roles;
')

########################################
## <summary>
##	Role access for TEMPLATETYPE
## </summary>
## <param name="role">
##	<summary>
##	Role allowed access
##	</summary>
## </param>
## <param name="domain">
##	<summary>
##	User domain for the role
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_role',`
	gen_require(`
		type TEMPLATETYPE_t;
		attribute_role TEMPLATETYPE_roles;
	')

	roleattribute $1 TEMPLATETYPE_roles;

	TEMPLATETYPE_domtrans($2)

	ps_process_pattern($2, TEMPLATETYPE_t)
	allow $2 TEMPLATETYPE_t:process { signull signal sigkill };
')
a�
########################################
## <summary>
##	Execute sandbox in the TEMPLATETYPE_t domain, and
##	allow the specified role the TEMPLATETYPE_t domain.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed to transition.
##	</summary>
## </param>
## <param name="role">
##	<summary>
##	The role to be allowed the TEMPLATETYPE_t domain.
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_transition',`
	gen_require(`
		type TEMPLATETYPE_t;
		type TEMPLATETYPE_client_t;
	')

	allow $1 TEMPLATETYPE_t:process { signal_perms transition };
	dontaudit $1 TEMPLATETYPE_t:process { noatsecure siginh rlimitinh };
	role $2 types TEMPLATETYPE_t;
	role $2 types TEMPLATETYPE_client_t;

	allow TEMPLATETYPE_t $1:process { sigchld signull };
	allow TEMPLATETYPE_t $1:fifo_file rw_inherited_fifo_file_perms;
	allow TEMPLATETYPE_client_t $1:process { sigchld signull };
	allow TEMPLATETYPE_client_t $1:fifo_file rw_inherited_fifo_file_perms;
')
a>
########################################
## <summary>
##	Change to the TEMPLATETYPE role.
## </summary>
## <param name="role">
##	<summary>
##	Role allowed access.
##	</summary>
## </param>
## <rolecap/>
#
interface(`TEMPLATETYPE_role_change',`
	gen_require(`
		role TEMPLATETYPE_r;
	')

	allow $1 TEMPLATETYPE_r;
')
a
########################################
## <summary>
##	Execute TEMPLATETYPE server in the TEMPLATETYPE domain.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_initrc_domtrans',`
	gen_require(`
		type TEMPLATETYPE_initrc_exec_t;
	')

	init_labeled_script_domtrans($1, TEMPLATETYPE_initrc_exec_t)
')
a�
########################################
## <summary>
##	Send and receive messages from
##	TEMPLATETYPE over dbus.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_dbus_chat',`
	gen_require(`
		type TEMPLATETYPE_t;
		class dbus send_msg;
	')

	allow $1 TEMPLATETYPE_t:dbus send_msg;
	allow TEMPLATETYPE_t $1:dbus send_msg;
')
a�
########################################
## <summary>
##	All of the rules required to administrate
##	an TEMPLATETYPE environment
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
## <param name="role">
##	<summary>
##	Role allowed access.
##	</summary>
## </param>
## <rolecap/>
#
interface(`TEMPLATETYPE_admin',`
	gen_require(`
		type TEMPLATETYPE_t;z�
	')

	allow $1 TEMPLATETYPE_t:process { signal_perms };
	ps_process_pattern($1, TEMPLATETYPE_t)

    tunable_policy(`deny_ptrace',`',`
        allow $1 TEMPLATETYPE_t:process ptrace;
    ')
z#
		type TEMPLATETYPE_initrc_exec_t;z�
	TEMPLATETYPE_initrc_domtrans($1)
	domain_system_change_exemption($1)
	role_transition $2 TEMPLATETYPE_initrc_exec_t system_r;
	allow $2 system_r;
zb	optional_policy(`
		systemd_passwd_agent_exec($1)
		systemd_read_fifo_file_passwd_run($1)
	')
')
zEEXECUTABLE		--	gen_context(system_u:object_r:TEMPLATETYPE_exec_t,s0)
z #  No file context, leave blank
zKEXECUTABLE	--	gen_context(system_u:object_r:TEMPLATETYPE_initrc_exec_t,s0)
N))Zte_daemon_typesZte_initscript_typesZte_dbusd_typesZte_inetd_typesZte_userapp_typesZte_sandbox_typesZte_cgi_typesZte_daemon_rulesZte_inetd_rulesZte_dbusd_rulesZte_userapp_rulesZte_cgi_rulesZte_sandbox_rulesZte_uid_rulesZte_syslog_rulesZte_resolve_rulesZte_pam_rulesZ
te_mail_rulesZ
te_dbus_rulesZte_kerberos_rulesZte_manage_krb5_rcache_rulesZte_audit_rulesZte_run_rulesZte_fd_rulesZte_etc_rulesZte_localization_rulesZif_heading_rulesZif_program_rulesZif_user_program_rulesZif_sandbox_rulesZif_role_change_rulesZif_initscript_rulesZ
if_dbus_rulesZif_begin_adminZif_middle_adminZif_initscript_admin_typesZif_initscript_adminZif_end_adminZ
fc_programZfc_userZ
fc_initscript�rr� /usr/lib/python3.6/executable.py�<module>$sP
)9#site-packages/sepolicy/templates/__pycache__/spec.cpython-36.opt-1.pyc000064400000004273147511334650021657 0ustar003

>�\V�@sdZdZdZdZdZdS)z# vim: sw=4:ts=4:et
a
%define selinux_policyver VERSION

Name:   MODULENAME_selinux
Version:	1.0
Release:	1%{?dist}
Summary:	SELinux policy module for MODULENAME

Group:	System Environment/Base		
License:	GPLv2+	
# This is an example. You will need to change it.
URL:		http://HOSTNAME
Source0:	MODULENAME.pp
Source1:	MODULENAME.if
Source2:	DOMAINNAME_selinux.8
Source3:	DOMAINNAME_u

Requires: policycoreutils, libselinux-utils
Requires(post): selinux-policy-base >= %{selinux_policyver}, policycoreutils
Requires(postun): policycoreutils
alBuildArch: noarch

%description
This package installs and sets up the  SELinux policy security module for MODULENAME.

%install
install -d %{buildroot}%{_datadir}/selinux/packages
install -m 644 %{SOURCE0} %{buildroot}%{_datadir}/selinux/packages
install -d %{buildroot}%{_datadir}/selinux/devel/include/contrib
install -m 644 %{SOURCE1} %{buildroot}%{_datadir}/selinux/devel/include/contrib/
install -d %{buildroot}%{_mandir}/man8/
install -m 644 %{SOURCE2} %{buildroot}%{_mandir}/man8/DOMAINNAME_selinux.8
install -d %{buildroot}/etc/selinux/targeted/contexts/users/
install -m 644 %{SOURCE3} %{buildroot}/etc/selinux/targeted/contexts/users/DOMAINNAME_u

%post
semodule -n -i %{_datadir}/selinux/packages/MODULENAME.pp
if /usr/sbin/selinuxenabled ; then
    /usr/sbin/load_policy
    %relabel_files
    /usr/sbin/semanage user -a -R DOMAINNAME_r DOMAINNAME_u
fi;
exit 0

%postun
if [ $1 -eq 0 ]; then
    semodule -n -r MODULENAME
    if /usr/sbin/selinuxenabled ; then
       /usr/sbin/load_policy
       %relabel_files
       /usr/sbin/semanage user -d DOMAINNAME_u
    fi;
fi;
exit 0

%files
%attr(0600,root,root) %{_datadir}/selinux/packages/MODULENAME.pp
%{_datadir}/selinux/devel/include/contrib/MODULENAME.if
%{_mandir}/man8/DOMAINNAME_selinux.8.*
/etc/selinux/targeted/contexts/users/DOMAINNAME_u

%changelog
* TODAYSDATE YOUR NAME <YOUR@EMAILADDRESS> 1.0-1
- Initial version

z

%define relabel_files() \
zrestorecon -R FILENAME; \
N)Zheader_comment_sectionZbase_sectionZmid_sectionZdefine_relabel_files_beginZdefine_relabel_files_end�rr�/usr/lib/python3.6/spec.py�<module>s0site-packages/sepolicy/templates/__pycache__/semodule.cpython-36.pyc000064400000000625147511334650021600 0ustar003

>�\�@sdZdZdZdZdS)zQ
#!/bin/sh
make -f /usr/share/selinux/devel/Makefile
semodule -i TEMPLATETYPE.pp
z
restorecon -R -v FILENAME
z9
semanage ports -a -t TEMPLATETYPE_port_t -p tcp PORTNUM
z9
semanage ports -a -t TEMPLATETYPE_port_t -p udp PORTNUM
N)�compileZ
restoreconZ	tcp_portsZ	udp_ports�rr�/usr/lib/python3.6/semodule.py�<module>ssite-packages/sepolicy/templates/__pycache__/boolean.cpython-36.opt-1.pyc000064400000000421147511334650022333 0ustar003

>�\��@sdZdZdS)zP
## <desc>
##	<p>
##	DESCRIPTION
##	</p>
## </desc>
gen_tunable(BOOLEAN, false)
z0
tunable_policy(`BOOLEAN',`
#TRUE
',`
#FALSE
')
N)Z
te_booleanZte_rules�rr�/usr/lib/python3.6/boolean.py�<module>ssite-packages/sepolicy/templates/__pycache__/var_log.cpython-36.opt-1.pyc000064400000004112147511334650022346 0ustar003

>�\��@s dZdZdZdZdZdZdZdS)z?
type TEMPLATETYPE_log_t;
logging_log_file(TEMPLATETYPE_log_t)
a<
manage_dirs_pattern(TEMPLATETYPE_t, TEMPLATETYPE_log_t, TEMPLATETYPE_log_t)
manage_files_pattern(TEMPLATETYPE_t, TEMPLATETYPE_log_t, TEMPLATETYPE_log_t)
manage_lnk_files_pattern(TEMPLATETYPE_t, TEMPLATETYPE_log_t, TEMPLATETYPE_log_t)
logging_log_filetrans(TEMPLATETYPE_t, TEMPLATETYPE_log_t, { dir file lnk_file })
a�########################################
## <summary>
##	Read TEMPLATETYPE's log files.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
## <rolecap/>
#
interface(`TEMPLATETYPE_read_log',`
	gen_require(`
		type TEMPLATETYPE_log_t;
	')

	logging_search_logs($1)
	read_files_pattern($1, TEMPLATETYPE_log_t, TEMPLATETYPE_log_t)
')

########################################
## <summary>
##	Append to TEMPLATETYPE log files.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_append_log',`
	gen_require(`
		type TEMPLATETYPE_log_t;
	')

	logging_search_logs($1)
	append_files_pattern($1, TEMPLATETYPE_log_t, TEMPLATETYPE_log_t)
')

########################################
## <summary>
##	Manage TEMPLATETYPE log files
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_manage_log',`
	gen_require(`
		type TEMPLATETYPE_log_t;
	')

	logging_search_logs($1)
	manage_dirs_pattern($1, TEMPLATETYPE_log_t, TEMPLATETYPE_log_t)
	manage_files_pattern($1, TEMPLATETYPE_log_t, TEMPLATETYPE_log_t)
	manage_lnk_files_pattern($1, TEMPLATETYPE_log_t, TEMPLATETYPE_log_t)
')
z
		type TEMPLATETYPE_log_t;zA
	logging_search_logs($1)
	admin_pattern($1, TEMPLATETYPE_log_t)
zBFILENAME		--	gen_context(system_u:object_r:TEMPLATETYPE_log_t,s0)
zEFILENAME(/.*)?		gen_context(system_u:object_r:TEMPLATETYPE_log_t,s0)
N)Zte_typesZte_rulesZif_rulesZif_admin_typesZif_admin_rulesZfc_fileZfc_dir�rr�/usr/lib/python3.6/var_log.py�<module>s?site-packages/sepolicy/templates/__pycache__/unit_file.cpython-36.pyc000064400000002165147511334650021742 0ustar003

>�\��@s dZdZdZdZdZdZdZdS)zL
type TEMPLATETYPE_unit_file_t;
systemd_unit_file(TEMPLATETYPE_unit_file_t)
�aD########################################
## <summary>
##	Execute TEMPLATETYPE server in the TEMPLATETYPE domain.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed to transition.
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_systemctl',`
	gen_require(`
		type TEMPLATETYPE_t;
		type TEMPLATETYPE_unit_file_t;
	')

	systemd_exec_systemctl($1)
        systemd_read_fifo_file_passwd_run($1)
	allow $1 TEMPLATETYPE_unit_file_t:file read_file_perms;
	allow $1 TEMPLATETYPE_unit_file_t:service manage_service_perms;

	ps_process_pattern($1, TEMPLATETYPE_t)
')

z 
	type TEMPLATETYPE_unit_file_t;z�
	TEMPLATETYPE_systemctl($1)
	admin_pattern($1, TEMPLATETYPE_unit_file_t)
	allow $1 TEMPLATETYPE_unit_file_t:service all_service_perms;
zHFILENAME		--	gen_context(system_u:object_r:TEMPLATETYPE_unit_file_t,s0)
N)Zte_typesZte_rulesZif_rulesZif_admin_typesZif_admin_rulesZfc_fileZfc_dir�rr�/usr/lib/python3.6/unit_file.py�<module>ssite-packages/sepolicy/templates/__pycache__/etc_rw.cpython-36.opt-1.pyc000064400000005243147511334650022206 0ustar003

>�\�@s(dZdZdZdZdZdZdZdZdZd	S)
z?
type TEMPLATETYPE_etc_rw_t;
files_type(TEMPLATETYPE_etc_rw_t)
aO
manage_dirs_pattern(TEMPLATETYPE_t, TEMPLATETYPE_etc_rw_t, TEMPLATETYPE_etc_rw_t)
manage_files_pattern(TEMPLATETYPE_t, TEMPLATETYPE_etc_rw_t, TEMPLATETYPE_etc_rw_t)
manage_lnk_files_pattern(TEMPLATETYPE_t, TEMPLATETYPE_etc_rw_t, TEMPLATETYPE_etc_rw_t)
files_etc_filetrans(TEMPLATETYPE_t, TEMPLATETYPE_etc_rw_t, { dir file lnk_file })
manage_sock_files_pattern(TEMPLATETYPE_t, TEMPLATETYPE_etc_rw_t, TEMPLATETYPE_etc_rw_t)
files_etc_filetrans(TEMPLATETYPE_t, TEMPLATETYPE_etc_rw_t, sock_file)
########################################
## <summary>
##	Search TEMPLATETYPE conf directories.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_search_conf',`
	gen_require(`
		type TEMPLATETYPE_etc_rw_t;
	')

	allow $1 TEMPLATETYPE_etc_rw_t:dir search_dir_perms;
	files_search_etc($1)
')

########################################
## <summary>
##	Read TEMPLATETYPE conf files.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_read_conf_files',`
	gen_require(`
		type TEMPLATETYPE_etc_rw_t;
	')

	allow $1 TEMPLATETYPE_etc_rw_t:dir list_dir_perms;
	read_files_pattern($1, TEMPLATETYPE_etc_rw_t, TEMPLATETYPE_etc_rw_t)
	files_search_etc($1)
')

########################################
## <summary>
##	Manage TEMPLATETYPE conf files.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_manage_conf_files',`
	gen_require(`
		type TEMPLATETYPE_etc_rw_t;
	')

	manage_files_pattern($1, TEMPLATETYPE_etc_rw_t, TEMPLATETYPE_etc_rw_t)
	files_search_etc($1)
')

########################################
## <summary>
##	Connect to TEMPLATETYPE over a unix stream socket.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_stream_connect',`
	gen_require(`
		type TEMPLATETYPE_t, TEMPLATETYPE_etc_rw_t;
	')

	files_search_etc($1)
	stream_connect_pattern($1, TEMPLATETYPE_etc_rw_t, TEMPLATETYPE_etc_rw_t, TEMPLATETYPE_t)
')
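
The fc_file and fc_dir strings at the end of each template keep a literal FILENAME placeholder as well; one file-contexts line is produced per path handed to the generator. A small illustrative sketch of that idea; the fc_entries() helper and the /etc/myapp path are assumptions for the example, not sepolicy's real code.

FC_DIR_TEMPLATE = "FILENAME(/.*)?\t\tgen_context(system_u:object_r:TEMPLATETYPE_etc_rw_t,s0)"

def fc_entries(template, module, paths):
    # Substitute the module name once, then emit one line per labeled path.
    rendered = template.replace("TEMPLATETYPE", module)
    return [rendered.replace("FILENAME", path) for path in paths]

if __name__ == "__main__":
    for line in fc_entries(FC_DIR_TEMPLATE, "myapp", ["/etc/myapp"]):
        print(line)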
		type TEMPLATETYPE_etc_rw_t;

	files_search_etc($1)
	admin_pattern($1, TEMPLATETYPE_etc_rw_t)

FILENAME		--	gen_context(system_u:object_r:TEMPLATETYPE_etc_rw_t,s0)
FILENAME(/.*)?		gen_context(system_u:object_r:TEMPLATETYPE_etc_rw_t,s0)

[end of etc_rw.py template strings: te_types, te_rules, te_stream_rules, if_rules, if_stream_rules, if_admin_types, if_admin_rules, fc_file, fc_dir]
site-packages/sepolicy/templates/__pycache__/unit_file.cpython-36.opt-1.pyc	[contents identical to unit_file.cpython-36.pyc above]

site-packages/sepolicy/templates/__pycache__/rw.cpython-36.opt-1.pyc
type TEMPLATETYPE_rw_t;
files_type(TEMPLATETYPE_rw_t)
manage_dirs_pattern(TEMPLATETYPE_t, TEMPLATETYPE_rw_t, TEMPLATETYPE_rw_t)
manage_files_pattern(TEMPLATETYPE_t, TEMPLATETYPE_rw_t, TEMPLATETYPE_rw_t)
manage_lnk_files_pattern(TEMPLATETYPE_t, TEMPLATETYPE_rw_t, TEMPLATETYPE_rw_t)
########################################
## <summary>
##	Search TEMPLATETYPE rw directories.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_search_rw_dir',`
	gen_require(`
		type TEMPLATETYPE_rw_t;
	')

	allow $1 TEMPLATETYPE_rw_t:dir search_dir_perms;
	files_search_rw($1)
')

########################################
## <summary>
##	Read TEMPLATETYPE rw files.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_read_rw_files',`
	gen_require(`
		type TEMPLATETYPE_rw_t;
	')

	read_files_pattern($1, TEMPLATETYPE_rw_t, TEMPLATETYPE_rw_t)
	allow $1 TEMPLATETYPE_rw_t:dir list_dir_perms;
	files_search_rw($1)
')

########################################
## <summary>
##	Manage TEMPLATETYPE rw files.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_manage_rw_files',`
	gen_require(`
		type TEMPLATETYPE_rw_t;
	')

	manage_files_pattern($1, TEMPLATETYPE_rw_t, TEMPLATETYPE_rw_t)
')

########################################
## <summary>
##	Create, read, write, and delete
##	TEMPLATETYPE rw dirs.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_manage_rw_dirs',`
	gen_require(`
		type TEMPLATETYPE_rw_t;
	')

	manage_dirs_pattern($1, TEMPLATETYPE_rw_t, TEMPLATETYPE_rw_t)
')

manage_sock_files_pattern(TEMPLATETYPE_t, TEMPLATETYPE_rw_t, TEMPLATETYPE_rw_t)

########################################
## <summary>
##	Connect to TEMPLATETYPE over a unix stream socket.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_stream_connect',`
	gen_require(`
		type TEMPLATETYPE_t, TEMPLATETYPE_rw_t;
	')

	stream_connect_pattern($1, TEMPLATETYPE_rw_t, TEMPLATETYPE_rw_t, TEMPLATETYPE_t)
')
		type TEMPLATETYPE_rw_t;

	files_search_etc($1)
	admin_pattern($1, TEMPLATETYPE_rw_t)

FILENAME		--	gen_context(system_u:object_r:TEMPLATETYPE_rw_t,s0)
FILENAME		-s	gen_context(system_u:object_r:TEMPLATETYPE_rw_t,s0)
FILENAME(/.*)?		gen_context(system_u:object_r:TEMPLATETYPE_rw_t,s0)

[end of rw.py template strings: te_types, te_rules, if_rules, te_stream_rules, if_stream_rules, if_admin_types, if_admin_rules, fc_file, fc_sock_file, fc_dir]
site-packages/sepolicy/templates/__pycache__/var_lib.cpython-36.opt-1.pyc
type TEMPLATETYPE_var_lib_t;
files_type(TEMPLATETYPE_var_lib_t)
manage_dirs_pattern(TEMPLATETYPE_t, TEMPLATETYPE_var_lib_t, TEMPLATETYPE_var_lib_t)
manage_files_pattern(TEMPLATETYPE_t, TEMPLATETYPE_var_lib_t, TEMPLATETYPE_var_lib_t)
manage_lnk_files_pattern(TEMPLATETYPE_t, TEMPLATETYPE_var_lib_t, TEMPLATETYPE_var_lib_t)
files_var_lib_filetrans(TEMPLATETYPE_t, TEMPLATETYPE_var_lib_t, { dir file lnk_file })

manage_sock_files_pattern(TEMPLATETYPE_t, TEMPLATETYPE_var_lib_t, TEMPLATETYPE_var_lib_t)
files_var_lib_filetrans(TEMPLATETYPE_t, TEMPLATETYPE_var_lib_t, sock_file)
########################################
## <summary>
##	Search TEMPLATETYPE lib directories.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_search_lib',`
	gen_require(`
		type TEMPLATETYPE_var_lib_t;
	')

	allow $1 TEMPLATETYPE_var_lib_t:dir search_dir_perms;
	files_search_var_lib($1)
')

########################################
## <summary>
##	Read TEMPLATETYPE lib files.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_read_lib_files',`
	gen_require(`
		type TEMPLATETYPE_var_lib_t;
	')

	files_search_var_lib($1)
	read_files_pattern($1, TEMPLATETYPE_var_lib_t, TEMPLATETYPE_var_lib_t)
')

########################################
## <summary>
##	Manage TEMPLATETYPE lib files.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_manage_lib_files',`
	gen_require(`
		type TEMPLATETYPE_var_lib_t;
	')

	files_search_var_lib($1)
	manage_files_pattern($1, TEMPLATETYPE_var_lib_t, TEMPLATETYPE_var_lib_t)
')

########################################
## <summary>
##	Manage TEMPLATETYPE lib directories.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_manage_lib_dirs',`
	gen_require(`
		type TEMPLATETYPE_var_lib_t;
	')

	files_search_var_lib($1)
	manage_dirs_pattern($1, TEMPLATETYPE_var_lib_t, TEMPLATETYPE_var_lib_t)
')

########################################
## <summary>
##	Connect to TEMPLATETYPE over a unix stream socket.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_stream_connect',`
	gen_require(`
		type TEMPLATETYPE_t, TEMPLATETYPE_var_lib_t;
	')

	stream_connect_pattern($1, TEMPLATETYPE_var_lib_t, TEMPLATETYPE_var_lib_t, TEMPLATETYPE_t)
')
		type TEMPLATETYPE_var_lib_t;

	files_search_var_lib($1)
	admin_pattern($1, TEMPLATETYPE_var_lib_t)

FILENAME		--	gen_context(system_u:object_r:TEMPLATETYPE_var_lib_t,s0)
FILENAME		-s	gen_context(system_u:object_r:TEMPLATETYPE_var_lib_t,s0)
FILENAME(/.*)?		gen_context(system_u:object_r:TEMPLATETYPE_var_lib_t,s0)

[end of var_lib.py template strings: te_types, te_rules, te_stream_rules, if_rules, if_stream_rules, if_admin_types, if_admin_rules, fc_file, fc_sock_file, fc_dir]
site-packages/sepolicy/templates/__pycache__/test_module.cpython-36.pyc

[decompiled dict_values mapping: placeholder kinds such as domain, target_domain, source_domain, peer_domain, exception_types, user_domain, bool_domain, file_type, private_type, pty_type, tmpfs_type, home_type, tty_type, object_type, script_file, entry_point, role, user_role, prefix, object_class, object_name, name, terminal, boolean and range, each mapped to a sample sepolicy_* identifier]

policy_module(TEMPLATETYPE, 1.0.0)

type sepolicy_t;
domain_type(sepolicy_t)
type sepolicy_domain_t;
domain_type(sepolicy_domain_t)
type sepolicy_target_t;
domain_type(sepolicy_target_t)
type sepolicy_source_t;
domain_type(sepolicy_source_t)
type sepolicy_peer_t;
domain_type(sepolicy_peer_t)
type sepolicy_exception_types_t;
domain_type(sepolicy_exception_types_t)
type sepolicy_userdomain_t;
domain_type(sepolicy_userdomain_t)

type sepolicy_file_t;
files_type(sepolicy_file_t)
type sepolicy_private_file_t;
files_type(sepolicy_private_file_t)
type sepolicy_home_file_t;
files_type(sepolicy_home_file_t)
type sepolicy_tty_t;
term_tty(sepolicy_tty_t)
type sepolicy_object_t;
type sepolicy_devpts_t;
term_pty(sepolicy_devpts_t)
type sepolicy_tmpfs_t;
files_type(sepolicy_tmpfs_t)
type sepolicy_exec_t;
files_type(sepolicy_exec_t)

role sepolicy_r;
role sepolicy_source_r;
role sepolicy_target_r;

#################################
#
# Local policy
#
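
The test_module constants pair the dict_values mapping of placeholder kinds with the sample sepolicy_* identifiers declared in the skeleton above. The sketch below only illustrates reading such a mapping; the short dictionary is a stand-in for the full decompiled one, not a copy of it.

dict_values = {
    "domain": "sepolicy_domain_t",
    "file_type": "sepolicy_file_t",
    "role": "sepolicy_r",
    "name": '"sepolicy_name"',
}

def describe(values):
    # Show which sample identifier stands in for each placeholder kind.
    for kind, sample in sorted(values.items()):
        print("%-12s -> %s" % (kind, sample))

if __name__ == "__main__":
    describe(dict_values)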

[end of test_module.py: dict_values, te_test_module]
site-packages/sepolicy/templates/__pycache__/var_cache.cpython-36.opt-1.pyc
type TEMPLATETYPE_cache_t;
files_type(TEMPLATETYPE_cache_t)
manage_dirs_pattern(TEMPLATETYPE_t, TEMPLATETYPE_cache_t, TEMPLATETYPE_cache_t)
manage_files_pattern(TEMPLATETYPE_t, TEMPLATETYPE_cache_t, TEMPLATETYPE_cache_t)
manage_lnk_files_pattern(TEMPLATETYPE_t, TEMPLATETYPE_cache_t, TEMPLATETYPE_cache_t)
files_var_filetrans(TEMPLATETYPE_t, TEMPLATETYPE_cache_t, { dir file lnk_file })

manage_sock_files_pattern(TEMPLATETYPE_t, TEMPLATETYPE_cache_t, TEMPLATETYPE_cache_t)
files_var_filetrans(TEMPLATETYPE_t, TEMPLATETYPE_cache_t, sock_file)
########################################
## <summary>
##	Search TEMPLATETYPE cache directories.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_search_cache',`
	gen_require(`
		type TEMPLATETYPE_cache_t;
	')

	allow $1 TEMPLATETYPE_cache_t:dir search_dir_perms;
	files_search_var($1)
')

########################################
## <summary>
##	Read TEMPLATETYPE cache files.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_read_cache_files',`
	gen_require(`
		type TEMPLATETYPE_cache_t;
	')

	files_search_var($1)
	read_files_pattern($1, TEMPLATETYPE_cache_t, TEMPLATETYPE_cache_t)
')

########################################
## <summary>
##	Create, read, write, and delete
##	TEMPLATETYPE cache files.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_manage_cache_files',`
	gen_require(`
		type TEMPLATETYPE_cache_t;
	')

	files_search_var($1)
	manage_files_pattern($1, TEMPLATETYPE_cache_t, TEMPLATETYPE_cache_t)
')

########################################
## <summary>
##	Manage TEMPLATETYPE cache dirs.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_manage_cache_dirs',`
	gen_require(`
		type TEMPLATETYPE_cache_t;
	')

	files_search_var($1)
	manage_dirs_pattern($1, TEMPLATETYPE_cache_t, TEMPLATETYPE_cache_t)
')

########################################
## <summary>
##	Connect to TEMPLATETYPE over a unix stream socket.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_stream_connect',`
	gen_require(`
		type TEMPLATETYPE_t, TEMPLATETYPE_cache_t;
	')

	stream_connect_pattern($1, TEMPLATETYPE_cache_t, TEMPLATETYPE_cache_t, TEMPLATETYPE_t)
')
		type TEMPLATETYPE_cache_t;

	files_search_var($1)
	admin_pattern($1, TEMPLATETYPE_cache_t)

FILENAME		--	gen_context(system_u:object_r:TEMPLATETYPE_cache_t,s0)
FILENAME(/.*)?		gen_context(system_u:object_r:TEMPLATETYPE_cache_t,s0)

[end of var_cache.py template strings: te_types, te_rules, te_stream_rules, if_rules, if_stream_rules, if_admin_types, if_admin_rules, fc_file, fc_dir]
site-packages/sepolicy/templates/__pycache__/var_lib.cpython-36.pyc	[contents identical to var_lib.cpython-36.opt-1.pyc above]

site-packages/sepolicy/templates/__pycache__/network.cpython-36.pyc
type TEMPLATETYPE_port_t;
corenet_port(TEMPLATETYPE_port_t)
sysnet_dns_name_resolve(TEMPLATETYPE_t)
corenet_all_recvfrom_unlabeled(TEMPLATETYPE_t)

allow TEMPLATETYPE_t self:tcp_socket create_stream_socket_perms;
corenet_tcp_sendrecv_generic_if(TEMPLATETYPE_t)
corenet_tcp_sendrecv_generic_node(TEMPLATETYPE_t)
corenet_tcp_sendrecv_all_ports(TEMPLATETYPE_t)

corenet_tcp_bind_generic_node(TEMPLATETYPE_t)

allow TEMPLATETYPE_t TEMPLATETYPE_port_t:tcp_socket name_bind;

allow TEMPLATETYPE_t TEMPLATETYPE_port_t:tcp_socket name_connect;

allow TEMPLATETYPE_t self:udp_socket { create_socket_perms listen };
corenet_udp_sendrecv_generic_if(TEMPLATETYPE_t)
corenet_udp_sendrecv_generic_node(TEMPLATETYPE_t)
corenet_udp_sendrecv_all_ports(TEMPLATETYPE_t)

corenet_udp_bind_generic_node(TEMPLATETYPE_t)

allow TEMPLATETYPE_t TEMPLATETYPE_port_t:udp_socket name_bind;

corenet_tcp_connect_all_ports(TEMPLATETYPE_t)
corenet_tcp_connect_all_rpc_ports(TEMPLATETYPE_t)
corenet_tcp_connect_all_unreserved_ports(TEMPLATETYPE_t)
corenet_tcp_bind_all_ports(TEMPLATETYPE_t)
corenet_tcp_bind_all_rpc_ports(TEMPLATETYPE_t)
corenet_tcp_bind_all_unreserved_ports(TEMPLATETYPE_t)
corenet_udp_bind_all_ports(TEMPLATETYPE_t)
corenet_udp_bind_all_rpc_ports(TEMPLATETYPE_t)
corenet_udp_bind_all_unreserved_ports(TEMPLATETYPE_t)

########################################
## <summary>
##	Send and receive TCP traffic on the TEMPLATETYPE port.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
## <infoflow type="both" weight="10"/>
#
interface(`corenet_tcp_sendrecv_TEMPLATETYPE_port',`
	gen_require(`
		type TEMPLATETYPE_port_t;
	')

	allow $1 TEMPLATETYPE_port_t:tcp_socket { send_msg recv_msg };
')

########################################
## <summary>
##	Send UDP traffic on the TEMPLATETYPE port.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
## <infoflow type="write" weight="10"/>
#
interface(`corenet_udp_send_TEMPLATETYPE_port',`
	gen_require(`
		type TEMPLATETYPE_port_t;
	')

	allow $1 TEMPLATETYPE_port_t:udp_socket send_msg;
')

########################################
## <summary>
##	Do not audit attempts to send UDP traffic on the TEMPLATETYPE port.
## </summary>
## <param name="domain">
##	<summary>
##	Domain to not audit.
##	</summary>
## </param>
## <infoflow type="none"/>
#
interface(`corenet_dontaudit_udp_send_TEMPLATETYPE_port',`
	gen_require(`
		type TEMPLATETYPE_port_t;
	')

	dontaudit $1 TEMPLATETYPE_port_t:udp_socket send_msg;
')

########################################
## <summary>
##	Receive UDP traffic on the TEMPLATETYPE port.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
## <infoflow type="read" weight="10"/>
#
interface(`corenet_udp_receive_TEMPLATETYPE_port',`
	gen_require(`
		type TEMPLATETYPE_port_t;
	')

	allow $1 TEMPLATETYPE_port_t:udp_socket recv_msg;
')

########################################
## <summary>
##	Do not audit attempts to receive UDP traffic on the TEMPLATETYPE port.
## </summary>
## <param name="domain">
##	<summary>
##	Domain to not audit.
##	</summary>
## </param>
## <infoflow type="none"/>
#
interface(`corenet_dontaudit_udp_receive_TEMPLATETYPE_port',`
	gen_require(`
		type TEMPLATETYPE_port_t;
	')

	dontaudit $1 TEMPLATETYPE_port_t:udp_socket recv_msg;
')

########################################
## <summary>
##	Send and receive UDP traffic on the TEMPLATETYPE port.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
## <infoflow type="both" weight="10"/>
#
interface(`corenet_udp_sendrecv_TEMPLATETYPE_port',`
	corenet_udp_send_TEMPLATETYPE_port($1)
	corenet_udp_receive_TEMPLATETYPE_port($1)
')

########################################
## <summary>
##	Do not audit attempts to send and receive
##	UDP traffic on the TEMPLATETYPE port.
## </summary>
## <param name="domain">
##	<summary>
##	Domain to not audit.
##	</summary>
## </param>
## <infoflow type="none"/>
#
interface(`corenet_dontaudit_udp_sendrecv_TEMPLATETYPE_port',`
	corenet_dontaudit_udp_send_TEMPLATETYPE_port($1)
	corenet_dontaudit_udp_receive_TEMPLATETYPE_port($1)
')

########################################
## <summary>
##	Bind TCP sockets to the TEMPLATETYPE port.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
## <infoflow type="none"/>
#
interface(`corenet_tcp_bind_TEMPLATETYPE_port',`
	gen_require(`
		type TEMPLATETYPE_port_t;
	')

	allow $1 TEMPLATETYPE_port_t:tcp_socket name_bind;
	
')

########################################
## <summary>
##	Bind UDP sockets to the TEMPLATETYPE port.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
## <infoflow type="none"/>
#
interface(`corenet_udp_bind_TEMPLATETYPE_port',`
	gen_require(`
		type TEMPLATETYPE_port_t;
	')

	allow $1 TEMPLATETYPE_port_t:udp_socket name_bind;
	
')

########################################
## <summary>
##	Do not audit attempts to bind to the TEMPLATETYPE port.
## </summary>
## <param name="domain">
##	<summary>
##	Domain to not audit.
##	</summary>
## </param>
## <infoflow type="none"/>
#
interface(`corenet_dontaudit_udp_bind_TEMPLATETYPE_port',`
	gen_require(`
		type TEMPLATETYPE_port_t;
	')

	dontaudit $1 TEMPLATETYPE_port_t:udp_socket name_bind;
	
')

########################################
## <summary>
##	Make a TCP connection to the TEMPLATETYPE port.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
#
interface(`corenet_tcp_connect_TEMPLATETYPE_port',`
	gen_require(`
		type TEMPLATETYPE_port_t;
	')

	allow $1 TEMPLATETYPE_port_t:tcp_socket name_connect;
')
########################################
## <summary>
##	Do not audit attempts to make a TCP connection to TEMPLATETYPE port.
## </summary>
## <param name="domain">
##	<summary>
##	Domain to not audit.
##	</summary>
## </param>
#
interface(`corenet_dontaudit_tcp_connect_TEMPLATETYPE_port',`
	gen_require(`
		type TEMPLATETYPE_port_t;
	')

	dontaudit $1 TEMPLATETYPE_port_t:tcp_socket name_connect;
')


########################################
## <summary>
##	Send TEMPLATETYPE_client packets.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
## <infoflow type="write" weight="10"/>
#
interface(`corenet_send_TEMPLATETYPE_client_packets',`
	gen_require(`
		type TEMPLATETYPE_client_packet_t;
	')

	allow $1 TEMPLATETYPE_client_packet_t:packet send;
')

########################################
## <summary>
##	Do not audit attempts to send TEMPLATETYPE_client packets.
## </summary>
## <param name="domain">
##	<summary>
##	Domain to not audit.
##	</summary>
## </param>
## <infoflow type="none"/>
#
interface(`corenet_dontaudit_send_TEMPLATETYPE_client_packets',`
	gen_require(`
		type TEMPLATETYPE_client_packet_t;
	')

	dontaudit $1 TEMPLATETYPE_client_packet_t:packet send;
')

########################################
## <summary>
##	Receive TEMPLATETYPE_client packets.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
## <infoflow type="read" weight="10"/>
#
interface(`corenet_receive_TEMPLATETYPE_client_packets',`
	gen_require(`
		type TEMPLATETYPE_client_packet_t;
	')

	allow $1 TEMPLATETYPE_client_packet_t:packet recv;
')

########################################
## <summary>
##	Do not audit attempts to receive TEMPLATETYPE_client packets.
## </summary>
## <param name="domain">
##	<summary>
##	Domain to not audit.
##	</summary>
## </param>
## <infoflow type="none"/>
#
interface(`corenet_dontaudit_receive_TEMPLATETYPE_client_packets',`
	gen_require(`
		type TEMPLATETYPE_client_packet_t;
	')

	dontaudit $1 TEMPLATETYPE_client_packet_t:packet recv;
')

########################################
## <summary>
##	Send and receive TEMPLATETYPE_client packets.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
## <infoflow type="both" weight="10"/>
#
interface(`corenet_sendrecv_TEMPLATETYPE_client_packets',`
	corenet_send_TEMPLATETYPE_client_packets($1)
	corenet_receive_TEMPLATETYPE_client_packets($1)
')

########################################
## <summary>
##	Do not audit attempts to send and receive TEMPLATETYPE_client packets.
## </summary>
## <param name="domain">
##	<summary>
##	Domain to not audit.
##	</summary>
## </param>
## <infoflow type="none"/>
#
interface(`corenet_dontaudit_sendrecv_TEMPLATETYPE_client_packets',`
	corenet_dontaudit_send_TEMPLATETYPE_client_packets($1)
	corenet_dontaudit_receive_TEMPLATETYPE_client_packets($1)
')

########################################
## <summary>
##	Relabel packets to the TEMPLATETYPE_client packet type.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
#
interface(`corenet_relabelto_TEMPLATETYPE_client_packets',`
	gen_require(`
		type TEMPLATETYPE_client_packet_t;
	')

	allow $1 TEMPLATETYPE_client_packet_t:packet relabelto;
')


########################################
## <summary>
##	Send TEMPLATETYPE_server packets.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
## <infoflow type="write" weight="10"/>
#
interface(`corenet_send_TEMPLATETYPE_server_packets',`
	gen_require(`
		type TEMPLATETYPE_server_packet_t;
	')

	allow $1 TEMPLATETYPE_server_packet_t:packet send;
')

########################################
## <summary>
##	Do not audit attempts to send TEMPLATETYPE_server packets.
## </summary>
## <param name="domain">
##	<summary>
##	Domain to not audit.
##	</summary>
## </param>
## <infoflow type="none"/>
#
interface(`corenet_dontaudit_send_TEMPLATETYPE_server_packets',`
	gen_require(`
		type TEMPLATETYPE_server_packet_t;
	')

	dontaudit $1 TEMPLATETYPE_server_packet_t:packet send;
')

########################################
## <summary>
##	Receive TEMPLATETYPE_server packets.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
## <infoflow type="read" weight="10"/>
#
interface(`corenet_receive_TEMPLATETYPE_server_packets',`
	gen_require(`
		type TEMPLATETYPE_server_packet_t;
	')

	allow $1 TEMPLATETYPE_server_packet_t:packet recv;
')

########################################
## <summary>
##	Do not audit attempts to receive TEMPLATETYPE_server packets.
## </summary>
## <param name="domain">
##	<summary>
##	Domain to not audit.
##	</summary>
## </param>
## <infoflow type="none"/>
#
interface(`corenet_dontaudit_receive_TEMPLATETYPE_server_packets',`
	gen_require(`
		type TEMPLATETYPE_server_packet_t;
	')

	dontaudit $1 TEMPLATETYPE_server_packet_t:packet recv;
')

########################################
## <summary>
##	Send and receive TEMPLATETYPE_server packets.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
## <infoflow type="both" weight="10"/>
#
interface(`corenet_sendrecv_TEMPLATETYPE_server_packets',`
	corenet_send_TEMPLATETYPE_server_packets($1)
	corenet_receive_TEMPLATETYPE_server_packets($1)
')

########################################
## <summary>
##	Do not audit attempts to send and receive TEMPLATETYPE_server packets.
## </summary>
## <param name="domain">
##	<summary>
##	Domain to not audit.
##	</summary>
## </param>
## <infoflow type="none"/>
#
interface(`corenet_dontaudit_sendrecv_TEMPLATETYPE_server_packets',`
	corenet_dontaudit_send_TEMPLATETYPE_server_packets($1)
	corenet_dontaudit_receive_TEMPLATETYPE_server_packets($1)
')

########################################
## <summary>
##	Relabel packets to the TEMPLATETYPE_server packet type.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
#
interface(`corenet_relabelto_TEMPLATETYPE_server_packets',`
	gen_require(`
		type TEMPLATETYPE_server_packet_t;
	')

	allow $1 TEMPLATETYPE_server_packet_t:packet relabelto;
')
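
The networking template is split into small TE fragments (te_tcp, te_in_need_port_tcp, te_out_need_port_tcp, and so on) so that only the pieces an application actually needs end up in the generated policy. The following is a hedged sketch of stitching a few of those fragments together; the boolean flag names are invented for the example and are not the generator's real options.

TE_TCP = "allow TEMPLATETYPE_t self:tcp_socket create_stream_socket_perms;\n"
TE_IN_NEED_PORT_TCP = "allow TEMPLATETYPE_t TEMPLATETYPE_port_t:tcp_socket name_bind;\n"
TE_OUT_NEED_PORT_TCP = "allow TEMPLATETYPE_t TEMPLATETYPE_port_t:tcp_socket name_connect;\n"

def network_te(module, uses_tcp, binds_port, connects_port):
    # Collect only the fragments that apply, then specialize the placeholder.
    parts = []
    if uses_tcp:
        parts.append(TE_TCP)
    if binds_port:
        parts.append(TE_IN_NEED_PORT_TCP)
    if connects_port:
        parts.append(TE_OUT_NEED_PORT_TCP)
    return "".join(parts).replace("TEMPLATETYPE", module)

if __name__ == "__main__":
    print(network_te("myapp", uses_tcp=True, binds_port=True, connects_port=False))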
[end of network.py template strings: te_types, te_network, te_tcp, te_in_tcp, te_in_need_port_tcp, te_out_need_port_tcp, te_udp, te_in_udp, te_in_need_port_udp, te_out_all_ports_tcp, te_out_reserved_ports_tcp, te_out_unreserved_ports_tcp, te_in_all_ports_tcp, te_in_reserved_ports_tcp, te_in_unreserved_ports_tcp, te_in_all_ports_udp, te_in_reserved_ports_udp, te_in_unreserved_ports_udp, if_rules, te_rules]
site-packages/sepolicy/templates/__pycache__/user.cpython-36.opt-1.pyc

policy_module(TEMPLATETYPE, 1.0.0)

########################################
#
# Declarations
#
userdom_unpriv_user_template(TEMPLATETYPE)

policy_module(TEMPLATETYPE, 1.0.0)

########################################
#
# Declarations
#
userdom_admin_user_template(TEMPLATETYPE)

policy_module(TEMPLATETYPE, 1.0.0)

########################################
#
# Declarations
#

userdom_restricted_user_template(TEMPLATETYPE)

policy_module(TEMPLATETYPE, 1.0.0)

########################################
#
# Declarations
#

userdom_restricted_xwindows_user_template(TEMPLATETYPE)

policy_module(TEMPLATETYPE, 1.0.0)

policy_module(TEMPLATETYPE, 1.0.0)

## <desc>
## <p>
## Allow TEMPLATETYPE to read files in the user home directory
## </p>
## </desc>
gen_tunable(TEMPLATETYPE_read_user_files, false)

## <desc>
## <p>
## Allow TEMPLATETYPE to manage files in the user home directory
## </p>
## </desc>
gen_tunable(TEMPLATETYPE_manage_user_files, false)

########################################
#
# Declarations
#

userdom_base_user_template(TEMPLATETYPE)
########################################
#
# TEMPLATETYPE customized policy
#
optional_policy(`
        APPLICATION_role(TEMPLATETYPE_r, TEMPLATETYPE_t)
')
optional_policy(`
        gen_require(`
                role USER_r;
        ')

        TEMPLATETYPE_role_change(USER_r)
')
allow TEMPLATETYPE_t self:capability { dac_override dac_read_search kill sys_ptrace sys_nice };
files_dontaudit_search_all_dirs(TEMPLATETYPE_t)

selinux_get_enforce_mode(TEMPLATETYPE_t)
seutil_domtrans_setfiles(TEMPLATETYPE_t)
seutil_search_default_contexts(TEMPLATETYPE_t)

logging_send_syslog_msg(TEMPLATETYPE_t)

kernel_read_system_state(TEMPLATETYPE_t)

domain_dontaudit_search_all_domains_state(TEMPLATETYPE_t)
domain_dontaudit_ptrace_all_domains(TEMPLATETYPE_t)

userdom_dontaudit_search_admin_dir(TEMPLATETYPE_t)
userdom_dontaudit_search_user_home_dirs(TEMPLATETYPE_t)

tunable_policy(`TEMPLATETYPE_read_user_files',`
        userdom_read_user_home_content_files(TEMPLATETYPE_t)
        userdom_read_user_tmp_files(TEMPLATETYPE_t)
')

tunable_policy(`TEMPLATETYPE_manage_user_files',`
	userdom_manage_user_home_content_dirs(TEMPLATETYPE_t)
	userdom_manage_user_home_content_files(TEMPLATETYPE_t)
	userdom_manage_user_home_content_symlinks(TEMPLATETYPE_t)
        userdom_manage_user_tmp_files(TEMPLATETYPE_t)
')
gen_require(`
        role USER_r;
')

allow USER_r TEMPLATETYPE_r;
optional_policy(`
        APPLICATION_admin(TEMPLATETYPE_t, TEMPLATETYPE_r)
')
optional_policy(`
        gen_require(`
                role ROLE_r;
        ')

        allow TEMPLATETYPE_r ROLE_r;
')
optional_policy(`
        sudo_role_template(TEMPLATETYPE, TEMPLATETYPE_r, TEMPLATETYPE_t)
')
seutil_run_newrole(TEMPLATETYPE_t, TEMPLATETYPE_r)
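
The user templates above provide alternative headers (unprivileged login user, admin user, restricted user, restricted X Windows user, or a plain module); which one lands in the generated .te depends on the kind of user policy requested. A small illustrative selector follows; the label strings ("login", "admin", "restricted", "xwindows") are made up for this sketch, not the generator's real option names.

USER_HEADERS = {
    "login": "userdom_unpriv_user_template(TEMPLATETYPE)\n",
    "admin": "userdom_admin_user_template(TEMPLATETYPE)\n",
    "restricted": "userdom_restricted_user_template(TEMPLATETYPE)\n",
    "xwindows": "userdom_restricted_xwindows_user_template(TEMPLATETYPE)\n",
}

def user_te_header(module, kind):
    # Prepend the common policy_module() line, then pick one user template.
    header = "policy_module(TEMPLATETYPE, 1.0.0)\n\n" + USER_HEADERS[kind]
    return header.replace("TEMPLATETYPE", module)

if __name__ == "__main__":
    print(user_te_header("staffuser", "restricted"))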
[end of user.py template strings: te_login_user_types, te_admin_user_types, te_min_login_user_types, te_x_login_user_types, te_existing_user_types, te_root_user_types, te_login_user_rules, te_existing_user_rules, te_x_login_user_rules, te_root_user_rules, te_transition_rules, te_user_trans_rules, te_admin_rules, te_admin_trans_rules, te_admin_domain_rules, te_roles_rules, te_sudo_rules, te_newrole_rules]
site-packages/sepolicy/templates/__pycache__/script.cpython-36.opt-1.pyc

#!/bin/sh -e

DIRNAME=`dirname $0`
cd $DIRNAME
USAGE="$0 [ --update ]"
if [ `id -u` != 0 ]; then
echo 'You must be root to run this script'
exit 1
fi

if [ $# -eq 1 ]; then
	if [ "$1" = "--update" ] ; then
		time=`ls -l --time-style="+%x %X" TEMPLATEFILE.te | awk '{ printf "%s %s", $6, $7 }'`
		rules=`ausearch --start $time -m avc --raw -se TEMPLATETYPE`
		if [ x"$rules" != "x" ] ; then
			echo "Found avc's to update policy with"
			echo -e "$rules" | audit2allow -R
			echo "Do you want these changes added to policy [y/n]?"
			read ANS
			if [ "$ANS" = "y" -o "$ANS" = "Y" ] ; then
				echo "Updating policy"
				echo -e "$rules" | audit2allow -R >> TEMPLATEFILE.te
				# Fall though and rebuild policy
			else
				exit 0
			fi
		else
			echo "No new avcs found"
			exit 0
		fi
	else
		echo -e $USAGE
		exit 1
	fi
elif [ $# -ge 2 ] ; then
	echo -e $USAGE
	exit 1
fi

echo "Building and Loading Policy"
set -x
make -f /usr/share/selinux/devel/Makefile TEMPLATEFILE.pp || exit
/usr/sbin/semodule -i TEMPLATEFILE.pp

# Generate a rpm package for the newly generated policy

pwd=$(pwd)
rpmbuild --define "_sourcedir ${pwd}" --define "_specdir ${pwd}" --define "_builddir ${pwd}" --define "_srcrpmdir ${pwd}" --define "_rpmdir ${pwd}" --define "_buildrootdir ${pwd}/.build"  -ba TEMPLATEFILE_selinux.spec
# Generate a man page off the installed module
sepolicy manpage -p . -d DOMAINTYPE_t

# Fixing the file context on FILENAME
/sbin/restorecon -F -R -v FILENAME

# Adding SELinux tcp port to port PORTNUM
/usr/sbin/semanage port -a -t TEMPLATETYPE_port_t -p tcp PORTNUM

# Adding SELinux udp port to port PORTNUM
/usr/sbin/semanage port -a -t TEMPLATETYPE_port_t -p udp PORTNUM

# Adding SELinux user TEMPLATETYPE_u
/usr/sbin/semanage user -a -R "TEMPLATETYPE_rROLES" TEMPLATETYPE_u

# Adding roles to SELinux user TEMPLATETYPE_u
/usr/sbin/semanage user -m -R "TEMPLATETYPE_rROLES" TEMPLATETYPE_u

# Adding roles to SELinux user USER
/usr/sbin/semanage user -m -R +TEMPLATETYPE_r USER

cat > TEMPLATETYPE_u << _EOF
TEMPLATETYPE_r:TEMPLATETYPE_t:s0	TEMPLATETYPE_r:TEMPLATETYPE_t
system_r:crond_t		TEMPLATETYPE_r:TEMPLATETYPE_t
system_r:initrc_su_t		TEMPLATETYPE_r:TEMPLATETYPE_t
system_r:local_login_t		TEMPLATETYPE_r:TEMPLATETYPE_t
system_r:remote_login_t		TEMPLATETYPE_r:TEMPLATETYPE_t
system_r:sshd_t			TEMPLATETYPE_r:TEMPLATETYPE_t
_EOF
if [ ! -f /etc/selinux/targeted/contexts/users/TEMPLATETYPE_u ]; then
   cp TEMPLATETYPE_u /etc/selinux/targeted/contexts/users/
fi

cat > TEMPLATETYPE_u << _EOF
TEMPLATETYPE_r:TEMPLATETYPE_t	TEMPLATETYPE_r:TEMPLATETYPE_t
system_r:crond_t		TEMPLATETYPE_r:TEMPLATETYPE_t
system_r:initrc_su_t		TEMPLATETYPE_r:TEMPLATETYPE_t
system_r:local_login_t		TEMPLATETYPE_r:TEMPLATETYPE_t
system_r:remote_login_t		TEMPLATETYPE_r:TEMPLATETYPE_t
system_r:sshd_t				TEMPLATETYPE_r:TEMPLATETYPE_t
system_r:xdm_t				TEMPLATETYPE_r:TEMPLATETYPE_t
_EOF
if [ ! -f /etc/selinux/targeted/contexts/users/TEMPLATETYPE_u ]; then
   cp TEMPLATETYPE_u /etc/selinux/targeted/contexts/users/
fi
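
The shell snippets in this script template carry PORTNUM, FILENAME and ROLES placeholders in addition to TEMPLATETYPE; they are filled in the same textual way before the setup script is written out. A sketch of the port snippet only; the port number 8123 and the "myapp" name are examples, not values taken from the templates.

TCP_PORT_SNIPPET = (
    "# Adding SELinux tcp port to port PORTNUM\n"
    "/usr/sbin/semanage port -a -t TEMPLATETYPE_port_t -p tcp PORTNUM\n"
)

def tcp_port_commands(module, port):
    # Fill both placeholders in the snippet text.
    return TCP_PORT_SNIPPET.replace("TEMPLATETYPE", module).replace("PORTNUM", str(port))

if __name__ == "__main__":
    print(tcp_port_commands("myapp", 8123))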
N)�compileZrpmZmanpageZ
restoreconZ	tcp_portsZ	udp_portsZusersZeusersZadmin_transZmin_login_user_default_contextZx_login_user_default_context�rr�/usr/lib/python3.6/script.py�<module>Essite-packages/sepolicy/templates/__pycache__/rw.cpython-36.pyc000064400000005501147511334650020411 0ustar003

>�\\�@s,dZdZdZdZdZdZdZdZdZd	Z	d
S)z7
type TEMPLATETYPE_rw_t;
files_type(TEMPLATETYPE_rw_t)
z�
manage_dirs_pattern(TEMPLATETYPE_t, TEMPLATETYPE_rw_t, TEMPLATETYPE_rw_t)
manage_files_pattern(TEMPLATETYPE_t, TEMPLATETYPE_rw_t, TEMPLATETYPE_rw_t)
manage_lnk_files_pattern(TEMPLATETYPE_t, TEMPLATETYPE_rw_t, TEMPLATETYPE_rw_t)
a�
########################################
## <summary>
##	Search TEMPLATETYPE rw directories.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_search_rw_dir',`
	gen_require(`
		type TEMPLATETYPE_rw_t;
	')

	allow $1 TEMPLATETYPE_rw_t:dir search_dir_perms;
	files_search_rw($1)
')

########################################
## <summary>
##	Read TEMPLATETYPE rw files.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_read_rw_files',`
	gen_require(`
		type TEMPLATETYPE_rw_t;
	')

	read_files_pattern($1, TEMPLATETYPE_rw_t, TEMPLATETYPE_rw_t)
	allow $1 TEMPLATETYPE_rw_t:dir list_dir_perms;
	files_search_rw($1)
')

########################################
## <summary>
##	Manage TEMPLATETYPE rw files.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_manage_rw_files',`
	gen_require(`
		type TEMPLATETYPE_rw_t;
	')

	manage_files_pattern($1, TEMPLATETYPE_rw_t, TEMPLATETYPE_rw_t)
')

########################################
## <summary>
##	Create, read, write, and delete
##	TEMPLATETYPE rw dirs.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_manage_rw_dirs',`
	gen_require(`
		type TEMPLATETYPE_rw_t;
	')

	manage_dirs_pattern($1, TEMPLATETYPE_rw_t, TEMPLATETYPE_rw_t)
')

zQ
manage_sock_files_pattern(TEMPLATETYPE_t, TEMPLATETYPE_rw_t, TEMPLATETYPE_rw_t)
a�########################################
## <summary>
##	Connect to TEMPLATETYPE over a unix stream socket.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_stream_connect',`
	gen_require(`
		type TEMPLATETYPE_t, TEMPLATETYPE_rw_t;
	')

	stream_connect_pattern($1, TEMPLATETYPE_rw_t, TEMPLATETYPE_rw_t, TEMPLATETYPE_t)
')
z
		type TEMPLATETYPE_rw_t;z=
	files_search_etc($1)
	admin_pattern($1, TEMPLATETYPE_rw_t)
zB
FILENAME		--	gen_context(system_u:object_r:TEMPLATETYPE_rw_t,s0)
zLFILENAME        -s  gen_context(system_u:object_r:TEMPLATETYPE_etc_rw_t,s0)
zE
FILENAME(/.*)?		gen_context(system_u:object_r:TEMPLATETYPE_rw_t,s0)
N)
Zte_typesZte_rulesZif_rulesZte_stream_rulesZif_stream_rulesZif_admin_typesZif_admin_rulesZfc_fileZfc_sock_fileZfc_dir�rr�/usr/lib/python3.6/rw.py�<module>sPsite-packages/sepolicy/templates/__pycache__/script.cpython-36.pyc000064400000006431147511334650021270 0ustar003

>�\��@s0dZdZdZdZdZdZdZdZdZd	Z	d
Z
dS)a�#!/bin/sh -e

DIRNAME=`dirname $0`
cd $DIRNAME
USAGE="$0 [ --update ]"
if [ `id -u` != 0 ]; then
echo 'You must be root to run this script'
exit 1
fi

if [ $# -eq 1 ]; then
	if [ "$1" = "--update" ] ; then
		time=`ls -l --time-style="+%x %X" TEMPLATEFILE.te | awk '{ printf "%s %s", $6, $7 }'`
		rules=`ausearch --start $time -m avc --raw -se TEMPLATETYPE`
		if [ x"$rules" != "x" ] ; then
			echo "Found avc's to update policy with"
			echo -e "$rules" | audit2allow -R
			echo "Do you want these changes added to policy [y/n]?"
			read ANS
			if [ "$ANS" = "y" -o "$ANS" = "Y" ] ; then
				echo "Updating policy"
				echo -e "$rules" | audit2allow -R >> TEMPLATEFILE.te
				# Fall though and rebuild policy
			else
				exit 0
			fi
		else
			echo "No new avcs found"
			exit 0
		fi
	else
		echo -e $USAGE
		exit 1
	fi
elif [ $# -ge 2 ] ; then
	echo -e $USAGE
	exit 1
fi

echo "Building and Loading Policy"
set -x
make -f /usr/share/selinux/devel/Makefile TEMPLATEFILE.pp || exit
/usr/sbin/semodule -i TEMPLATEFILE.pp

a# Generate a rpm package for the newly generated policy

pwd=$(pwd)
rpmbuild --define "_sourcedir ${pwd}" --define "_specdir ${pwd}" --define "_builddir ${pwd}" --define "_srcrpmdir ${pwd}" --define "_rpmdir ${pwd}" --define "_buildrootdir ${pwd}/.build"  -ba TEMPLATEFILE_selinux.spec
zU# Generate a man page off the installed module
sepolicy manpage -p . -d DOMAINTYPE_t
zI# Fixing the file context on FILENAME
/sbin/restorecon -F -R -v FILENAME
zk# Adding SELinux tcp port to port PORTNUM
/usr/sbin/semanage port -a -t TEMPLATETYPE_port_t -p tcp PORTNUM
zk# Adding SELinux udp port to port PORTNUM
/usr/sbin/semanage port -a -t TEMPLATETYPE_port_t -p udp PORTNUM
zh# Adding SELinux user TEMPLATETYPE_u
/usr/sbin/semanage user -a -R "TEMPLATETYPE_rROLES" TEMPLATETYPE_u
zq# Adding roles to SELinux user TEMPLATETYPE_u
/usr/sbin/semanage user -m -R "TEMPLATETYPE_rROLES" TEMPLATETYPE_u
zW# Adding roles to SELinux user USER
/usr/sbin/semanage user -m -R +TEMPLATETYPE_r USER
a�cat > TEMPLATETYPE_u << _EOF
TEMPLATETYPE_r:TEMPLATETYPE_t:s0	TEMPLATETYPE_r:TEMPLATETYPE_t
system_r:crond_t		TEMPLATETYPE_r:TEMPLATETYPE_t
system_r:initrc_su_t		TEMPLATETYPE_r:TEMPLATETYPE_t
system_r:local_login_t		TEMPLATETYPE_r:TEMPLATETYPE_t
system_r:remote_login_t		TEMPLATETYPE_r:TEMPLATETYPE_t
system_r:sshd_t			TEMPLATETYPE_r:TEMPLATETYPE_t
_EOF
if [ ! -f /etc/selinux/targeted/contexts/users/TEMPLATETYPE_u ]; then
   cp TEMPLATETYPE_u /etc/selinux/targeted/contexts/users/
fi
acat > TEMPLATETYPE_u << _EOF
TEMPLATETYPE_r:TEMPLATETYPE_t	TEMPLATETYPE_r:TEMPLATETYPE_t
system_r:crond_t		TEMPLATETYPE_r:TEMPLATETYPE_t
system_r:initrc_su_t		TEMPLATETYPE_r:TEMPLATETYPE_t
system_r:local_login_t		TEMPLATETYPE_r:TEMPLATETYPE_t
system_r:remote_login_t		TEMPLATETYPE_r:TEMPLATETYPE_t
system_r:sshd_t				TEMPLATETYPE_r:TEMPLATETYPE_t
system_r:xdm_t				TEMPLATETYPE_r:TEMPLATETYPE_t
_EOF
if [ ! -f /etc/selinux/targeted/contexts/users/TEMPLATETYPE_u ]; then
   cp TEMPLATETYPE_u /etc/selinux/targeted/contexts/users/
fi
N)�compileZrpmZmanpageZ
restoreconZ	tcp_portsZ	udp_portsZusersZeusersZadmin_transZmin_login_user_default_contextZx_login_user_default_context�rr�/usr/lib/python3.6/script.py�<module>Essite-packages/sepolicy/templates/__pycache__/user.cpython-36.pyc000064400000006507147511334650020746 0ustar003

>�\��@sLdZdZdZdZdZdZdZdZdZdZ	d	Z
d
ZdZdZ
d
ZdZdZdZdS)z�policy_module(TEMPLATETYPE, 1.0.0)

########################################
#
# Declarations
#
userdom_unpriv_user_template(TEMPLATETYPE)
z�policy_module(TEMPLATETYPE, 1.0.0)

########################################
#
# Declarations
#
userdom_admin_user_template(TEMPLATETYPE)
z�policy_module(TEMPLATETYPE, 1.0.0)

########################################
#
# Declarations
#

userdom_restricted_user_template(TEMPLATETYPE)
z�policy_module(TEMPLATETYPE, 1.0.0)

########################################
#
# Declarations
#

userdom_restricted_xwindows_user_template(TEMPLATETYPE)
z$policy_module(TEMPLATETYPE, 1.0.0)

a�policy_module(TEMPLATETYPE, 1.0.0)

## <desc>
## <p>
## Allow TEMPLATETYPE to read files in the user home directory
## </p>
## </desc>
gen_tunable(TEMPLATETYPE_read_user_files, false)

## <desc>
## <p>
## Allow TEMPLATETYPE to manage files in the user home directory
## </p>
## </desc>
gen_tunable(TEMPLATETYPE_manage_user_files, false)

########################################
#
# Declarations
#

userdom_base_user_template(TEMPLATETYPE)
�zO
########################################
#
# TEMPLATETYPE customized policy
#
�
zO
optional_policy(`
        APPLICATION_role(TEMPLATETYPE_r, TEMPLATETYPE_t)
')
z~
optional_policy(`
        gen_require(`
                role USER_r;
        ')

        TEMPLATETYPE_role_change(USER_r)
')
a�
allow TEMPLATETYPE_t self:capability { dac_override dac_read_search kill sys_ptrace sys_nice };
files_dontaudit_search_all_dirs(TEMPLATETYPE_t)

selinux_get_enforce_mode(TEMPLATETYPE_t)
seutil_domtrans_setfiles(TEMPLATETYPE_t)
seutil_search_default_contexts(TEMPLATETYPE_t)

logging_send_syslog_msg(TEMPLATETYPE_t)

kernel_read_system_state(TEMPLATETYPE_t)

domain_dontaudit_search_all_domains_state(TEMPLATETYPE_t)
domain_dontaudit_ptrace_all_domains(TEMPLATETYPE_t)

userdom_dontaudit_search_admin_dir(TEMPLATETYPE_t)
userdom_dontaudit_search_user_home_dirs(TEMPLATETYPE_t)

tunable_policy(`TEMPLATETYPE_read_user_files',`
        userdom_read_user_home_content_files(TEMPLATETYPE_t)
        userdom_read_user_tmp_files(TEMPLATETYPE_t)
')

tunable_policy(`TEMPLATETYPE_manage_user_files',`
	userdom_manage_user_home_content_dirs(TEMPLATETYPE_t)
	userdom_manage_user_home_content_files(TEMPLATETYPE_t)
	userdom_manage_user_home_content_symlinks(TEMPLATETYPE_t)
        userdom_manage_user_tmp_files(TEMPLATETYPE_t)
')
zE
gen_require(`
        role USER_r;
')

allow USER_r TEMPLATETYPE_r;
zP
optional_policy(`
        APPLICATION_admin(TEMPLATETYPE_t, TEMPLATETYPE_r)
')
zz
optional_policy(`
        gen_require(`
                role ROLE_r;
        ')

        allow TEMPLATETYPE_r ROLE_r;
')
z_
optional_policy(`
        sudo_role_template(TEMPLATETYPE, TEMPLATETYPE_r, TEMPLATETYPE_t)
')
z4
seutil_run_newrole(TEMPLATETYPE_t, TEMPLATETYPE_r)
N)Zte_login_user_typesZte_admin_user_typesZte_min_login_user_typesZte_x_login_user_typesZte_existing_user_typesZte_root_user_typesZte_login_user_rulesZte_existing_user_rulesZte_x_login_user_rulesZte_root_user_rulesZte_transition_rulesZte_user_trans_rulesZte_admin_rulesZte_admin_trans_rulesZte_admin_domain_rulesZte_roles_rulesZ
te_sudo_rulesZte_newrole_rules�rr�/usr/lib/python3.6/user.py�<module> s"


site-packages/sepolicy/templates/__pycache__/network.cpython-36.opt-1.pyc000064400000030614147511334650022414 0ustar003

>�\#5�@sTdZdZdZdZdZdZdZdZdZd	Z	d
Z
dZdZd
Z
dZdZdZdZdZdZdS)z=
type TEMPLATETYPE_port_t;
corenet_port(TEMPLATETYPE_port_t)
zWsysnet_dns_name_resolve(TEMPLATETYPE_t)
corenet_all_recvfrom_unlabeled(TEMPLATETYPE_t)
z�allow TEMPLATETYPE_t self:tcp_socket create_stream_socket_perms;
corenet_tcp_sendrecv_generic_if(TEMPLATETYPE_t)
corenet_tcp_sendrecv_generic_node(TEMPLATETYPE_t)
corenet_tcp_sendrecv_all_ports(TEMPLATETYPE_t)
z.corenet_tcp_bind_generic_node(TEMPLATETYPE_t)
z?allow TEMPLATETYPE_t TEMPLATETYPE_port_t:tcp_socket name_bind;
zBallow TEMPLATETYPE_t TEMPLATETYPE_port_t:tcp_socket name_connect;
z�allow TEMPLATETYPE_t self:udp_socket { create_socket_perms listen };
corenet_udp_sendrecv_generic_if(TEMPLATETYPE_t)
corenet_udp_sendrecv_generic_node(TEMPLATETYPE_t)
corenet_udp_sendrecv_all_ports(TEMPLATETYPE_t)
z.corenet_udp_bind_generic_node(TEMPLATETYPE_t)
z?allow TEMPLATETYPE_t TEMPLATETYPE_port_t:udp_socket name_bind;
z.corenet_tcp_connect_all_ports(TEMPLATETYPE_t)
z2corenet_tcp_connect_all_rpc_ports(TEMPLATETYPE_t)
z9corenet_tcp_connect_all_unreserved_ports(TEMPLATETYPE_t)
z+corenet_tcp_bind_all_ports(TEMPLATETYPE_t)
z/corenet_tcp_bind_all_rpc_ports(TEMPLATETYPE_t)
z6corenet_tcp_bind_all_unreserved_ports(TEMPLATETYPE_t)
z+corenet_udp_bind_all_ports(TEMPLATETYPE_t)
z/corenet_udp_bind_all_rpc_ports(TEMPLATETYPE_t)
z6corenet_udp_bind_all_unreserved_ports(TEMPLATETYPE_t)
a�)########################################
## <summary>
##	Send and receive TCP traffic on the TEMPLATETYPE port.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
## <infoflow type="both" weight="10"/>
#
interface(`corenet_tcp_sendrecv_TEMPLATETYPE_port',`
	gen_require(`
		type TEMPLATETYPE_port_t;
	')

	allow $1 TEMPLATETYPE_port_t:tcp_socket { send_msg recv_msg };
')

########################################
## <summary>
##	Send UDP traffic on the TEMPLATETYPE port.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
## <infoflow type="write" weight="10"/>
#
interface(`corenet_udp_send_TEMPLATETYPE_port',`
	gen_require(`
		type TEMPLATETYPE_port_t;
	')

	allow $1 TEMPLATETYPE_port_t:udp_socket send_msg;
')

########################################
## <summary>
##	Do not audit attempts to send UDP traffic on the TEMPLATETYPE port.
## </summary>
## <param name="domain">
##	<summary>
##	Domain to not audit.
##	</summary>
## </param>
## <infoflow type="none"/>
#
interface(`corenet_dontaudit_udp_send_TEMPLATETYPE_port',`
	gen_require(`
		type TEMPLATETYPE_port_t;
	')

	dontaudit $1 TEMPLATETYPE_port_t:udp_socket send_msg;
')

########################################
## <summary>
##	Receive UDP traffic on the TEMPLATETYPE port.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
## <infoflow type="read" weight="10"/>
#
interface(`corenet_udp_receive_TEMPLATETYPE_port',`
	gen_require(`
		type TEMPLATETYPE_port_t;
	')

	allow $1 TEMPLATETYPE_port_t:udp_socket recv_msg;
')

########################################
## <summary>
##	Do not audit attempts to receive UDP traffic on the TEMPLATETYPE port.
## </summary>
## <param name="domain">
##	<summary>
##	Domain to not audit.
##	</summary>
## </param>
## <infoflow type="none"/>
#
interface(`corenet_dontaudit_udp_receive_TEMPLATETYPE_port',`
	gen_require(`
		type TEMPLATETYPE_port_t;
	')

	dontaudit $1 TEMPLATETYPE_port_t:udp_socket recv_msg;
')

########################################
## <summary>
##	Send and receive UDP traffic on the TEMPLATETYPE port.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
## <infoflow type="both" weight="10"/>
#
interface(`corenet_udp_sendrecv_TEMPLATETYPE_port',`
	corenet_udp_send_TEMPLATETYPE_port($1)
	corenet_udp_receive_TEMPLATETYPE_port($1)
')

########################################
## <summary>
##	Do not audit attempts to send and receive
##	UDP traffic on the TEMPLATETYPE port.
## </summary>
## <param name="domain">
##	<summary>
##	Domain to not audit.
##	</summary>
## </param>
## <infoflow type="none"/>
#
interface(`corenet_dontaudit_udp_sendrecv_TEMPLATETYPE_port',`
	corenet_dontaudit_udp_send_TEMPLATETYPE_port($1)
	corenet_dontaudit_udp_receive_TEMPLATETYPE_port($1)
')

########################################
## <summary>
##	Bind TCP sockets to the TEMPLATETYPE port.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
## <infoflow type="none"/>
#
interface(`corenet_tcp_bind_TEMPLATETYPE_port',`
	gen_require(`
		type TEMPLATETYPE_port_t;
	')

	allow $1 TEMPLATETYPE_port_t:tcp_socket name_bind;
	
')

########################################
## <summary>
##	Bind UDP sockets to the TEMPLATETYPE port.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
## <infoflow type="none"/>
#
interface(`corenet_udp_bind_TEMPLATETYPE_port',`
	gen_require(`
		type TEMPLATETYPE_port_t;
	')

	allow $1 TEMPLATETYPE_port_t:udp_socket name_bind;
	
')

########################################
## <summary>
##	Do not audit attempts to sbind to TEMPLATETYPE port.
## </summary>
## <param name="domain">
##	<summary>
##	Domain to not audit.
##	</summary>
## </param>
## <infoflow type="none"/>
#
interface(`corenet_dontaudit_udp_bind_TEMPLATETYPE_port',`
	gen_require(`
		type TEMPLATETYPE_port_t;
	')

	dontaudit $1 TEMPLATETYPE_port_t:udp_socket name_bind;
	
')

########################################
## <summary>
##	Make a TCP connection to the TEMPLATETYPE port.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
#
interface(`corenet_tcp_connect_TEMPLATETYPE_port',`
	gen_require(`
		type TEMPLATETYPE_port_t;
	')

	allow $1 TEMPLATETYPE_port_t:tcp_socket name_connect;
')
########################################
## <summary>
##	Do not audit attempts to make a TCP connection to TEMPLATETYPE port.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
#
interface(`corenet_dontaudit_tcp_connect_TEMPLATETYPE_port',`
	gen_require(`
		type TEMPLATETYPE_port_t;
	')

site-packages/sepolicy/templates/var_log.py000064400000006271147511334650015133 0ustar00# Copyright (C) 2007-2012 Red Hat
# see file 'COPYING' for use and warranty information
#
# policygentool is a tool for the initial generation of SELinux policy
#
#    This program is free software; you can redistribute it and/or
#    modify it under the terms of the GNU General Public License as
#    published by the Free Software Foundation; either version 2 of
#    the License, or (at your option) any later version.
#
#    This program is distributed in the hope that it will be useful,
#    but WITHOUT ANY WARRANTY; without even the implied warranty of
#    MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
#    GNU General Public License for more details.
#
#    You should have received a copy of the GNU General Public License
#    along with this program; if not, write to the Free Software
#    Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA
#                                        02111-1307  USA
#
#
########################### var_log Template File #############################

########################### Type Enforcement File #############################
te_types="""
type TEMPLATETYPE_log_t;
logging_log_file(TEMPLATETYPE_log_t)
"""

te_rules="""
manage_dirs_pattern(TEMPLATETYPE_t, TEMPLATETYPE_log_t, TEMPLATETYPE_log_t)
manage_files_pattern(TEMPLATETYPE_t, TEMPLATETYPE_log_t, TEMPLATETYPE_log_t)
manage_lnk_files_pattern(TEMPLATETYPE_t, TEMPLATETYPE_log_t, TEMPLATETYPE_log_t)
logging_log_filetrans(TEMPLATETYPE_t, TEMPLATETYPE_log_t, { dir file lnk_file })
"""

########################### Interface File #############################
if_rules="""\
########################################
## <summary>
##	Read TEMPLATETYPE's log files.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
## <rolecap/>
#
interface(`TEMPLATETYPE_read_log',`
	gen_require(`
		type TEMPLATETYPE_log_t;
	')

	logging_search_logs($1)
	read_files_pattern($1, TEMPLATETYPE_log_t, TEMPLATETYPE_log_t)
')

########################################
## <summary>
##	Append to TEMPLATETYPE log files.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_append_log',`
	gen_require(`
		type TEMPLATETYPE_log_t;
	')

	logging_search_logs($1)
	append_files_pattern($1, TEMPLATETYPE_log_t, TEMPLATETYPE_log_t)
')

########################################
## <summary>
##	Manage TEMPLATETYPE log files.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_manage_log',`
	gen_require(`
		type TEMPLATETYPE_log_t;
	')

	logging_search_logs($1)
	manage_dirs_pattern($1, TEMPLATETYPE_log_t, TEMPLATETYPE_log_t)
	manage_files_pattern($1, TEMPLATETYPE_log_t, TEMPLATETYPE_log_t)
	manage_lnk_files_pattern($1, TEMPLATETYPE_log_t, TEMPLATETYPE_log_t)
')
"""

if_admin_types="""
		type TEMPLATETYPE_log_t;"""

if_admin_rules="""
	logging_search_logs($1)
	admin_pattern($1, TEMPLATETYPE_log_t)
"""

########################### File Context ##################################
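# Illustrative note (an assumption, not part of the original template): after a
# policy generator substitutes FILENAME and TEMPLATETYPE, the fc_dir entry below
# would expand to a file-context line such as the following, for a hypothetical
# "myapp" domain logging under /var/log/myapp:
#
#   /var/log/myapp(/.*)?		gen_context(system_u:object_r:myapp_log_t,s0)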
fc_file="""\
FILENAME		--	gen_context(system_u:object_r:TEMPLATETYPE_log_t,s0)
"""

fc_dir="""\
FILENAME(/.*)?		gen_context(system_u:object_r:TEMPLATETYPE_log_t,s0)
"""
site-packages/sepolicy/templates/var_cache.py000064400000010070147511334650015405 0ustar00# Copyright (C) 2007-2012 Red Hat
# see file 'COPYING' for use and warranty information
#
# policygentool is a tool for the initial generation of SELinux policy
#
#    This program is free software; you can redistribute it and/or
#    modify it under the terms of the GNU General Public License as
#    published by the Free Software Foundation; either version 2 of
#    the License, or (at your option) any later version.
#
#    This program is distributed in the hope that it will be useful,
#    but WITHOUT ANY WARRANTY; without even the implied warranty of
#    MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
#    GNU General Public License for more details.
#
#    You should have received a copy of the GNU General Public License
#    along with this program; if not, write to the Free Software
#    Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA
#                                        02111-1307  USA
#
#
########################### cache Template File #############################

########################### Type Enforcement File #############################
te_types="""
type TEMPLATETYPE_cache_t;
files_type(TEMPLATETYPE_cache_t)
"""
te_rules="""
manage_dirs_pattern(TEMPLATETYPE_t, TEMPLATETYPE_cache_t, TEMPLATETYPE_cache_t)
manage_files_pattern(TEMPLATETYPE_t, TEMPLATETYPE_cache_t, TEMPLATETYPE_cache_t)
manage_lnk_files_pattern(TEMPLATETYPE_t, TEMPLATETYPE_cache_t, TEMPLATETYPE_cache_t)
files_var_filetrans(TEMPLATETYPE_t, TEMPLATETYPE_cache_t, { dir file lnk_file })
"""

te_stream_rules="""\
manage_sock_files_pattern(TEMPLATETYPE_t, TEMPLATETYPE_cache_t, TEMPLATETYPE_cache_t)
files_var_filetrans(TEMPLATETYPE_t, TEMPLATETYPE_cache_t, sock_file)
"""

########################### Interface File #############################
if_rules="""
########################################
## <summary>
##	Search TEMPLATETYPE cache directories.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_search_cache',`
	gen_require(`
		type TEMPLATETYPE_cache_t;
	')

	allow $1 TEMPLATETYPE_cache_t:dir search_dir_perms;
	files_search_var($1)
')

########################################
## <summary>
##	Read TEMPLATETYPE cache files.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_read_cache_files',`
	gen_require(`
		type TEMPLATETYPE_cache_t;
	')

	files_search_var($1)
	read_files_pattern($1, TEMPLATETYPE_cache_t, TEMPLATETYPE_cache_t)
')

########################################
## <summary>
##	Create, read, write, and delete
##	TEMPLATETYPE cache files.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_manage_cache_files',`
	gen_require(`
		type TEMPLATETYPE_cache_t;
	')

	files_search_var($1)
	manage_files_pattern($1, TEMPLATETYPE_cache_t, TEMPLATETYPE_cache_t)
')

########################################
## <summary>
##	Manage TEMPLATETYPE cache dirs.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_manage_cache_dirs',`
	gen_require(`
		type TEMPLATETYPE_cache_t;
	')

	files_search_var($1)
	manage_dirs_pattern($1, TEMPLATETYPE_cache_t, TEMPLATETYPE_cache_t)
')

"""

if_stream_rules="""
########################################
## <summary>
##	Connect to TEMPLATETYPE over a unix stream socket.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_stream_connect',`
	gen_require(`
		type TEMPLATETYPE_t, TEMPLATETYPE_cache_t;
	')

	stream_connect_pattern($1, TEMPLATETYPE_cache_t, TEMPLATETYPE_cache_t)
')
"""

if_admin_types="""
		type TEMPLATETYPE_cache_t;"""

if_admin_rules="""
	files_search_var($1)
	admin_pattern($1, TEMPLATETYPE_cache_t)
"""

########################### File Context ##################################
fc_file="""\
FILENAME		--	gen_context(system_u:object_r:TEMPLATETYPE_cache_t,s0)
"""

fc_dir="""\
FILENAME(/.*)?		gen_context(system_u:object_r:TEMPLATETYPE_cache_t,s0)
"""
site-packages/sepolicy/templates/var_lib.py000064400000010313147511334650015110 0ustar00# Copyright (C) 2007-2012 Red Hat
# see file 'COPYING' for use and warranty information
#
# policygentool is a tool for the initial generation of SELinux policy
#
#    This program is free software; you can redistribute it and/or
#    modify it under the terms of the GNU General Public License as
#    published by the Free Software Foundation; either version 2 of
#    the License, or (at your option) any later version.
#
#    This program is distributed in the hope that it will be useful,
#    but WITHOUT ANY WARRANTY; without even the implied warranty of
#    MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
#    GNU General Public License for more details.
#
#    You should have received a copy of the GNU General Public License
#    along with this program; if not, write to the Free Software
#    Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA
#                                        02111-1307  USA
#
#
########################### var_lib Template File #############################

########################### Type Enforcement File #############################
te_types="""
type TEMPLATETYPE_var_lib_t;
files_type(TEMPLATETYPE_var_lib_t)
"""
te_rules="""
manage_dirs_pattern(TEMPLATETYPE_t, TEMPLATETYPE_var_lib_t, TEMPLATETYPE_var_lib_t)
manage_files_pattern(TEMPLATETYPE_t, TEMPLATETYPE_var_lib_t, TEMPLATETYPE_var_lib_t)
manage_lnk_files_pattern(TEMPLATETYPE_t, TEMPLATETYPE_var_lib_t, TEMPLATETYPE_var_lib_t)
files_var_lib_filetrans(TEMPLATETYPE_t, TEMPLATETYPE_var_lib_t, { dir file lnk_file })
"""

te_stream_rules="""\
manage_sock_files_pattern(TEMPLATETYPE_t, TEMPLATETYPE_var_lib_t, TEMPLATETYPE_var_lib_t)
files_var_lib_filetrans(TEMPLATETYPE_t, TEMPLATETYPE_var_lib_t, sock_file)
"""


########################### Interface File #############################
if_rules="""
########################################
## <summary>
##	Search TEMPLATETYPE lib directories.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_search_lib',`
	gen_require(`
		type TEMPLATETYPE_var_lib_t;
	')

	allow $1 TEMPLATETYPE_var_lib_t:dir search_dir_perms;
	files_search_var_lib($1)
')

########################################
## <summary>
##	Read TEMPLATETYPE lib files.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_read_lib_files',`
	gen_require(`
		type TEMPLATETYPE_var_lib_t;
	')

	files_search_var_lib($1)
	read_files_pattern($1, TEMPLATETYPE_var_lib_t, TEMPLATETYPE_var_lib_t)
')

########################################
## <summary>
##	Manage TEMPLATETYPE lib files.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_manage_lib_files',`
	gen_require(`
		type TEMPLATETYPE_var_lib_t;
	')

	files_search_var_lib($1)
	manage_files_pattern($1, TEMPLATETYPE_var_lib_t, TEMPLATETYPE_var_lib_t)
')

########################################
## <summary>
##	Manage TEMPLATETYPE lib directories.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_manage_lib_dirs',`
	gen_require(`
		type TEMPLATETYPE_var_lib_t;
	')

	files_search_var_lib($1)
	manage_dirs_pattern($1, TEMPLATETYPE_var_lib_t, TEMPLATETYPE_var_lib_t)
')

"""

if_stream_rules="""
########################################
## <summary>
##	Connect to TEMPLATETYPE over a unix stream socket.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
#
interface(`TEMPLATETYPE_stream_connect',`
	gen_require(`
		type TEMPLATETYPE_t, TEMPLATETYPE_var_lib_t;
	')

	stream_connect_pattern($1, TEMPLATETYPE_var_lib_t, TEMPLATETYPE_var_lib_t)
')
"""

if_admin_types="""
		type TEMPLATETYPE_var_lib_t;"""

if_admin_rules="""
	files_search_var_lib($1)
	admin_pattern($1, TEMPLATETYPE_var_lib_t)
"""

########################### File Context ##################################
fc_file="""\
FILENAME		--	gen_context(system_u:object_r:TEMPLATETYPE_var_lib_t,s0)
"""

fc_sock_file="""\
FILENAME		-s	gen_context(system_u:object_r:TEMPLATETYPE_var_lib_t,s0)
"""

fc_dir="""\
FILENAME(/.*)?		gen_context(system_u:object_r:TEMPLATETYPE_var_lib_t,s0)
"""
site-packages/sepolicy/templates/test_module.py000064400000010541147511334650016021 0ustar00# Copyright (C) 2007-2012 Red Hat
# see file 'COPYING' for use and warranty information
#
# policygentool is a tool for the initial generation of SELinux policy
#
#    This program is free software; you can redistribute it and/or
#    modify it under the terms of the GNU General Public License as
#    published by the Free Software Foundation; either version 2 of
#    the License, or (at your option) any later version.
#
#    This program is distributed in the hope that it will be useful,
#    but WITHOUT ANY WARRANTY; without even the implied warranty of
#    MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
#    GNU General Public License for more details.
#
#    You should have received a copy of the GNU General Public License
#    along with this program; if not, write to the Free Software
#    Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA
#                                        02111-1307  USA
#
#


#['domain', 'role', 'role_prefix', 'object_class', 'name', 'private_type',
# 'prefix', 'entrypoint', 'target_domain', 'terminal', 'range', 'domains',
# 'entry_point', 'entry_file', 'domain_prefix', 'private type', 'user_prefix',
# 'user_role', 'user_domain', 'object', 'type', 'source_domain', 'file_type',
# 'file', 'class', 'peer_domain', 'objectclass(es)', 'exception_types',
# 'home_type', 'object_type', 'directory_type', 'boolean', 'pty_type',
# 'userdomain', 'tty_type', 'tmpfs_type', 'script_file', 'filetype', 'filename',
# 'init_script_file', 'source_role', 'userdomain_prefix']

dict_values={}
dict_values['domain'] = 'sepolicy_domain_t'
dict_values['domains'] = 'sepolicy_domain_t'
dict_values['target_domain'] = 'sepolicy_target_t'
dict_values['source_domain'] = 'sepolicy_source_t'
dict_values['peer_domain'] = 'sepolicy_peer_t'
dict_values['exception_types'] = 'sepolicy_exception_types_t'
dict_values['user_domain'] = 'sepolicy_userdomain_t'
dict_values['userdomain'] = 'sepolicy_userdomain_t'
dict_values['bool_domain'] = 'sepolicy_bool_domain_t'

dict_values['type'] = 'sepolicy_file_t'
dict_values['file_type'] = 'sepolicy_file_t'
dict_values['private type'] = 'sepolicy_private_file_t'
dict_values['private_type'] = 'sepolicy_private_file_t'
dict_values['pty_type'] = 'sepolicy_devpts_t'
dict_values['tmpfs_type'] = 'sepolicy_tmpfs_t'
dict_values['home_type'] = 'sepolicy_home_file_t'
dict_values['tty_type'] = 'sepolicy_t'
dict_values['directory_type'] = 'sepolicy_file_t'
dict_values['object_type'] = 'sepolicy_object_t'

dict_values['script_file'] = 'sepolicy_exec_t'
dict_values['entry_point'] = 'sepolicy_exec_t'
dict_values['file'] = 'sepolicy_file_t'
dict_values['entry_file'] = 'sepolicy_exec_t'
dict_values['init_script_file'] = 'sepolicy_exec_t'
dict_values['entrypoint'] = 'sepolicy_exec_t'

dict_values['role'] = 'sepolicy_r'
dict_values['role_prefix'] = 'sepolicy'
dict_values['user_role'] = 'sepolicy_r'
dict_values['source_role'] = 'sepolicy_source_r'

dict_values['prefix'] = 'sepolicy_domain'
dict_values['domain_prefix'] = 'sepolicy_domain'
dict_values['userdomain_prefix'] = 'sepolicy_userdomain'
dict_values['user_prefix'] = 'sepolicy_userdomain'

dict_values['object_class'] = 'file'
dict_values['object'] = 'file'
dict_values['class'] = 'file'
dict_values['objectclass(es)'] = 'file'
dict_values['object_name'] = 'sepolicy_object'
dict_values['name'] = '"sepolicy_name"'

dict_values['terminal'] = 'sepolicy_tty_t'
dict_values['boolean'] = 'sepolicy_bool_t'
dict_values['range'] = 's0 - mcs_systemhigh'
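
# Minimal usage sketch (an assumption for illustration, not part of the module):
# a test harness could look up an interface's declared parameter names in
# dict_values to build dummy arguments, e.g.
#
#   def example_fill(param_names):
#       # hypothetical helper; falls back to a generic file type
#       return [dict_values.get(name, 'sepolicy_file_t') for name in param_names]
#
#   # example_fill(['domain', 'file_type']) == ['sepolicy_domain_t', 'sepolicy_file_t']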

te_test_module="""\
policy_module(TEMPLATETYPE, 1.0.0)

type sepolicy_t;
domain_type(sepolicy_t)
type sepolicy_domain_t;
domain_type(sepolicy_domain_t)
type sepolicy_target_t;
domain_type(sepolicy_target_t)
type sepolicy_source_t;
domain_type(sepolicy_source_t)
type sepolicy_peer_t;
domain_type(sepolicy_peer_t)
type sepolicy_exception_types_t;
domain_type(sepolicy_exception_types_t)
type sepolicy_userdomain_t;
domain_type(sepolicy_userdomain_t)

type sepolicy_file_t;
files_type(sepolicy_file_t)
type sepolicy_private_file_t;
files_type(sepolicy_private_file_t)
type sepolicy_home_file_t;
files_type(sepolicy_home_file_t)
type sepolicy_tty_t;
term_tty(sepolicy_tty_t)
type sepolicy_object_t;
type sepolicy_devpts_t;
term_pty(sepolicy_devpts_t)
type sepolicy_tmpfs_t;
files_type(sepolicy_tmpfs_t)
type sepolicy_exec_t;
files_type(sepolicy_exec_t)

role sepolicy_r;
role sepolicy_source_r;
role sepolicy_target_r;

#################################
#
# Local policy
#

"""
site-packages/sepolicy/templates/semodule.py000064400000002435147511334650015315 0ustar00# Copyright (C) 2007-2012 Red Hat
# see file 'COPYING' for use and warranty information
#
# policygentool is a tool for the initial generation of SELinux policy
#
#    This program is free software; you can redistribute it and/or
#    modify it under the terms of the GNU General Public License as
#    published by the Free Software Foundation; either version 2 of
#    the License, or (at your option) any later version.
#
#    This program is distributed in the hope that it will be useful,
#    but WITHOUT ANY WARRANTY; without even the implied warranty of
#    MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
#    GNU General Public License for more details.
#
#    You should have received a copy of the GNU General Public License
#    along with this program; if not, write to the Free Software
#    Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA
#                                        02111-1307  USA
#
#

########################### semodule Template File #############################
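# Illustrative usage sketch (an assumption, not part of the original module):
# a setup script can be produced from the snippets below by plain string
# substitution, e.g. for a hypothetical "myapp" module listening on TCP 8800:
#
#   script = compile.replace("TEMPLATETYPE", "myapp")
#   script += tcp_ports.replace("TEMPLATETYPE", "myapp").replace("PORTNUM", "8800")
#
# which yields the shell commands that build/install myapp.pp and register the port.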
compile="""
#!/bin/sh
make -f /usr/share/selinux/devel/Makefile
semodule -i TEMPLATETYPE.pp
"""

restorecon="""
restorecon -R -v FILENAME
"""

tcp_ports="""
semanage ports -a -t TEMPLATETYPE_port_t -p tcp PORTNUM
"""

udp_ports="""
semanage ports -a -t TEMPLATETYPE_port_t -p udp PORTNUM
"""
site-packages/sepolicy/templates/network.py000064400000032443147511334650015173 0ustar00# Copyright (C) 2007-2012 Red Hat
# see file 'COPYING' for use and warranty information
#
# policygentool is a tool for the initial generation of SELinux policy
#
#    This program is free software; you can redistribute it and/or
#    modify it under the terms of the GNU General Public License as
#    published by the Free Software Foundation; either version 2 of
#    the License, or (at your option) any later version.
#
#    This program is distributed in the hope that it will be useful,
#    but WITHOUT ANY WARRANTY; without even the implied warranty of
#    MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
#    GNU General Public License for more details.
#
#    You should have received a copy of the GNU General Public License
#    along with this program; if not, write to the Free Software
#    Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA
#                                        02111-1307  USA
#
#
########################### Type Enforcement File #############################
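# Illustrative sketch (an assumption, not the actual generator code): the te_*
# fragments below can be assembled into a .te body by concatenating the pieces
# that match the requested network behaviour and substituting the domain
# prefix, e.g. for a hypothetical "myapp" daemon that binds its own TCP port:
#
#   te_body = (te_types + te_network + te_tcp + te_in_tcp +
#              te_in_need_port_tcp).replace("TEMPLATETYPE", "myapp")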
te_types="""
type TEMPLATETYPE_port_t;
corenet_port(TEMPLATETYPE_port_t)
"""

te_network="""\
sysnet_dns_name_resolve(TEMPLATETYPE_t)
corenet_all_recvfrom_unlabeled(TEMPLATETYPE_t)
"""

te_tcp="""\
allow TEMPLATETYPE_t self:tcp_socket create_stream_socket_perms;
corenet_tcp_sendrecv_generic_if(TEMPLATETYPE_t)
corenet_tcp_sendrecv_generic_node(TEMPLATETYPE_t)
corenet_tcp_sendrecv_all_ports(TEMPLATETYPE_t)
"""

te_in_tcp="""\
corenet_tcp_bind_generic_node(TEMPLATETYPE_t)
"""

te_in_need_port_tcp="""\
allow TEMPLATETYPE_t TEMPLATETYPE_port_t:tcp_socket name_bind;
"""

te_out_need_port_tcp="""\
allow TEMPLATETYPE_t TEMPLATETYPE_port_t:tcp_socket name_connect;
"""

te_udp="""\
allow TEMPLATETYPE_t self:udp_socket { create_socket_perms listen };
corenet_udp_sendrecv_generic_if(TEMPLATETYPE_t)
corenet_udp_sendrecv_generic_node(TEMPLATETYPE_t)
corenet_udp_sendrecv_all_ports(TEMPLATETYPE_t)
"""

te_in_udp="""\
corenet_udp_bind_generic_node(TEMPLATETYPE_t)
"""

te_in_need_port_udp="""\
allow TEMPLATETYPE_t TEMPLATETYPE_port_t:udp_socket name_bind;
"""

te_out_all_ports_tcp="""\
corenet_tcp_connect_all_ports(TEMPLATETYPE_t)
"""

te_out_reserved_ports_tcp="""\
corenet_tcp_connect_all_rpc_ports(TEMPLATETYPE_t)
"""

te_out_unreserved_ports_tcp="""\
corenet_tcp_connect_all_unreserved_ports(TEMPLATETYPE_t)
"""

te_in_all_ports_tcp="""\
corenet_tcp_bind_all_ports(TEMPLATETYPE_t)
"""

te_in_reserved_ports_tcp="""\
corenet_tcp_bind_all_rpc_ports(TEMPLATETYPE_t)
"""

te_in_unreserved_ports_tcp="""\
corenet_tcp_bind_all_unreserved_ports(TEMPLATETYPE_t)
"""

te_in_all_ports_udp="""\
corenet_udp_bind_all_ports(TEMPLATETYPE_t)
"""

te_in_reserved_ports_udp="""\
corenet_udp_bind_all_rpc_ports(TEMPLATETYPE_t)
"""

te_in_unreserved_ports_udp="""\
corenet_udp_bind_all_unreserved_ports(TEMPLATETYPE_t)
"""

if_rules="""\
########################################
## <summary>
##	Send and receive TCP traffic on the TEMPLATETYPE port.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
## <infoflow type="both" weight="10"/>
#
interface(`corenet_tcp_sendrecv_TEMPLATETYPE_port',`
	gen_require(`
		type TEMPLATETYPE_port_t;
	')

	allow $1 TEMPLATETYPE_port_t:tcp_socket { send_msg recv_msg };
')

########################################
## <summary>
##	Send UDP traffic on the TEMPLATETYPE port.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
## <infoflow type="write" weight="10"/>
#
interface(`corenet_udp_send_TEMPLATETYPE_port',`
	gen_require(`
		type TEMPLATETYPE_port_t;
	')

	allow $1 TEMPLATETYPE_port_t:udp_socket send_msg;
')

########################################
## <summary>
##	Do not audit attempts to send UDP traffic on the TEMPLATETYPE port.
## </summary>
## <param name="domain">
##	<summary>
##	Domain to not audit.
##	</summary>
## </param>
## <infoflow type="none"/>
#
interface(`corenet_dontaudit_udp_send_TEMPLATETYPE_port',`
	gen_require(`
		type TEMPLATETYPE_port_t;
	')

	dontaudit $1 TEMPLATETYPE_port_t:udp_socket send_msg;
')

########################################
## <summary>
##	Receive UDP traffic on the TEMPLATETYPE port.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
## <infoflow type="read" weight="10"/>
#
interface(`corenet_udp_receive_TEMPLATETYPE_port',`
	gen_require(`
		type TEMPLATETYPE_port_t;
	')

	allow $1 TEMPLATETYPE_port_t:udp_socket recv_msg;
')

########################################
## <summary>
##	Do not audit attempts to receive UDP traffic on the TEMPLATETYPE port.
## </summary>
## <param name="domain">
##	<summary>
##	Domain to not audit.
##	</summary>
## </param>
## <infoflow type="none"/>
#
interface(`corenet_dontaudit_udp_receive_TEMPLATETYPE_port',`
	gen_require(`
		type TEMPLATETYPE_port_t;
	')

	dontaudit $1 TEMPLATETYPE_port_t:udp_socket recv_msg;
')

########################################
## <summary>
##	Send and receive UDP traffic on the TEMPLATETYPE port.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
## <infoflow type="both" weight="10"/>
#
interface(`corenet_udp_sendrecv_TEMPLATETYPE_port',`
	corenet_udp_send_TEMPLATETYPE_port($1)
	corenet_udp_receive_TEMPLATETYPE_port($1)
')

########################################
## <summary>
##	Do not audit attempts to send and receive
##	UDP traffic on the TEMPLATETYPE port.
## </summary>
## <param name="domain">
##	<summary>
##	Domain to not audit.
##	</summary>
## </param>
## <infoflow type="none"/>
#
interface(`corenet_dontaudit_udp_sendrecv_TEMPLATETYPE_port',`
	corenet_dontaudit_udp_send_TEMPLATETYPE_port($1)
	corenet_dontaudit_udp_receive_TEMPLATETYPE_port($1)
')

########################################
## <summary>
##	Bind TCP sockets to the TEMPLATETYPE port.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
## <infoflow type="none"/>
#
interface(`corenet_tcp_bind_TEMPLATETYPE_port',`
	gen_require(`
		type TEMPLATETYPE_port_t;
	')

	allow $1 TEMPLATETYPE_port_t:tcp_socket name_bind;
	
')

########################################
## <summary>
##	Bind UDP sockets to the TEMPLATETYPE port.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
## <infoflow type="none"/>
#
interface(`corenet_udp_bind_TEMPLATETYPE_port',`
	gen_require(`
		type TEMPLATETYPE_port_t;
	')

	allow $1 TEMPLATETYPE_port_t:udp_socket name_bind;
	
')

########################################
## <summary>
##	Do not audit attempts to bind to the TEMPLATETYPE port.
## </summary>
## <param name="domain">
##	<summary>
##	Domain to not audit.
##	</summary>
## </param>
## <infoflow type="none"/>
#
interface(`corenet_dontaudit_udp_bind_TEMPLATETYPE_port',`
	gen_require(`
		type TEMPLATETYPE_port_t;
	')

	dontaudit $1 TEMPLATETYPE_port_t:udp_socket name_bind;
	
')

########################################
## <summary>
##	Make a TCP connection to the TEMPLATETYPE port.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
#
interface(`corenet_tcp_connect_TEMPLATETYPE_port',`
	gen_require(`
		type TEMPLATETYPE_port_t;
	')

	allow $1 TEMPLATETYPE_port_t:tcp_socket name_connect;
')
########################################
## <summary>
##	Do not audit attempts to make a TCP connection to the TEMPLATETYPE port.
## </summary>
## <param name="domain">
##	<summary>
##	Domain to not audit.
##	</summary>
## </param>
#
interface(`corenet_dontaudit_tcp_connect_TEMPLATETYPE_port',`
	gen_require(`
		type TEMPLATETYPE_port_t;
	')

	dontaudit $1 TEMPLATETYPE_port_t:tcp_socket name_connect;
')


########################################
## <summary>
##	Send TEMPLATETYPE_client packets.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
## <infoflow type="write" weight="10"/>
#
interface(`corenet_send_TEMPLATETYPE_client_packets',`
	gen_require(`
		type TEMPLATETYPE_client_packet_t;
	')

	allow $1 TEMPLATETYPE_client_packet_t:packet send;
')

########################################
## <summary>
##	Do not audit attempts to send TEMPLATETYPE_client packets.
## </summary>
## <param name="domain">
##	<summary>
##	Domain to not audit.
##	</summary>
## </param>
## <infoflow type="none"/>
#
interface(`corenet_dontaudit_send_TEMPLATETYPE_client_packets',`
	gen_require(`
		type TEMPLATETYPE_client_packet_t;
	')

	dontaudit $1 TEMPLATETYPE_client_packet_t:packet send;
')

########################################
## <summary>
##	Receive TEMPLATETYPE_client packets.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
## <infoflow type="read" weight="10"/>
#
interface(`corenet_receive_TEMPLATETYPE_client_packets',`
	gen_require(`
		type TEMPLATETYPE_client_packet_t;
	')

	allow $1 TEMPLATETYPE_client_packet_t:packet recv;
')

########################################
## <summary>
##	Do not audit attempts to receive TEMPLATETYPE_client packets.
## </summary>
## <param name="domain">
##	<summary>
##	Domain to not audit.
##	</summary>
## </param>
## <infoflow type="none"/>
#
interface(`corenet_dontaudit_receive_TEMPLATETYPE_client_packets',`
	gen_require(`
		type TEMPLATETYPE_client_packet_t;
	')

	dontaudit $1 TEMPLATETYPE_client_packet_t:packet recv;
')

########################################
## <summary>
##	Send and receive TEMPLATETYPE_client packets.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
## <infoflow type="both" weight="10"/>
#
interface(`corenet_sendrecv_TEMPLATETYPE_client_packets',`
	corenet_send_TEMPLATETYPE_client_packets($1)
	corenet_receive_TEMPLATETYPE_client_packets($1)
')

########################################
## <summary>
##	Do not audit attempts to send and receive TEMPLATETYPE_client packets.
## </summary>
## <param name="domain">
##	<summary>
##	Domain to not audit.
##	</summary>
## </param>
## <infoflow type="none"/>
#
interface(`corenet_dontaudit_sendrecv_TEMPLATETYPE_client_packets',`
	corenet_dontaudit_send_TEMPLATETYPE_client_packets($1)
	corenet_dontaudit_receive_TEMPLATETYPE_client_packets($1)
')

########################################
## <summary>
##	Relabel packets to the TEMPLATETYPE_client packet type.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
#
interface(`corenet_relabelto_TEMPLATETYPE_client_packets',`
	gen_require(`
		type TEMPLATETYPE_client_packet_t;
	')

	allow $1 TEMPLATETYPE_client_packet_t:packet relabelto;
')


########################################
## <summary>
##	Send TEMPLATETYPE_server packets.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
## <infoflow type="write" weight="10"/>
#
interface(`corenet_send_TEMPLATETYPE_server_packets',`
	gen_require(`
		type TEMPLATETYPE_server_packet_t;
	')

	allow $1 TEMPLATETYPE_server_packet_t:packet send;
')

########################################
## <summary>
##	Do not audit attempts to send TEMPLATETYPE_server packets.
## </summary>
## <param name="domain">
##	<summary>
##	Domain to not audit.
##	</summary>
## </param>
## <infoflow type="none"/>
#
interface(`corenet_dontaudit_send_TEMPLATETYPE_server_packets',`
	gen_require(`
		type TEMPLATETYPE_server_packet_t;
	')

	dontaudit $1 TEMPLATETYPE_server_packet_t:packet send;
')

########################################
## <summary>
##	Receive TEMPLATETYPE_server packets.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
## <infoflow type="read" weight="10"/>
#
interface(`corenet_receive_TEMPLATETYPE_server_packets',`
	gen_require(`
		type TEMPLATETYPE_server_packet_t;
	')

	allow $1 TEMPLATETYPE_server_packet_t:packet recv;
')

########################################
## <summary>
##	Do not audit attempts to receive TEMPLATETYPE_server packets.
## </summary>
## <param name="domain">
##	<summary>
##	Domain to not audit.
##	</summary>
## </param>
## <infoflow type="none"/>
#
interface(`corenet_dontaudit_receive_TEMPLATETYPE_server_packets',`
	gen_require(`
		type TEMPLATETYPE_server_packet_t;
	')

	dontaudit $1 TEMPLATETYPE_server_packet_t:packet recv;
')

########################################
## <summary>
##	Send and receive TEMPLATETYPE_server packets.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
## <infoflow type="both" weight="10"/>
#
interface(`corenet_sendrecv_TEMPLATETYPE_server_packets',`
	corenet_send_TEMPLATETYPE_server_packets($1)
	corenet_receive_TEMPLATETYPE_server_packets($1)
')

########################################
## <summary>
##	Do not audit attempts to send and receive TEMPLATETYPE_server packets.
## </summary>
## <param name="domain">
##	<summary>
##	Domain to not audit.
##	</summary>
## </param>
## <infoflow type="none"/>
#
interface(`corenet_dontaudit_sendrecv_TEMPLATETYPE_server_packets',`
	corenet_dontaudit_send_TEMPLATETYPE_server_packets($1)
	corenet_dontaudit_receive_TEMPLATETYPE_server_packets($1)
')

########################################
## <summary>
##	Relabel packets to the TEMPLATETYPE_server packet type.
## </summary>
## <param name="domain">
##	<summary>
##	Domain allowed access.
##	</summary>
## </param>
#
interface(`corenet_relabelto_TEMPLATETYPE_server_packets',`
	gen_require(`
		type TEMPLATETYPE_server_packet_t;
	')

	allow $1 TEMPLATETYPE_server_packet_t:packet relabelto;
')
"""

te_rules="""
"""
site-packages/sepolicy/manpage.py000064400000120436147511334650013114 0ustar00# Copyright (C) 2012-2013 Red Hat
# AUTHOR: Dan Walsh <dwalsh@redhat.com>
# AUTHOR: Miroslav Grepl <mgrepl@redhat.com>
# see file 'COPYING' for use and warranty information
#
# semanage is a tool for managing SELinux configuration files
#
#    This program is free software; you can redistribute it and/or
#    modify it under the terms of the GNU General Public License as
#    published by the Free Software Foundation; either version 2 of
#    the License, or (at your option) any later version.
#
#    This program is distributed in the hope that it will be useful,
#    but WITHOUT ANY WARRANTY; without even the implied warranty of
#    MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
#    GNU General Public License for more details.
#
#    You should have received a copy of the GNU General Public License
#    along with this program; if not, write to the Free Software
#    Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA
#                                        02111-1307  USA
#
#
__all__ = ['ManPage', 'HTMLManPages', 'manpage_domains', 'manpage_roles', 'gen_domains']

import string
import selinux
import sepolicy
import os
import time

typealias_types = {
"antivirus_t":("amavis_t", "clamd_t", "clamscan_t", "freshclam_t"),
"cluster_t":("rgmanager_t", "corosync_t", "aisexec_t", "pacemaker_t"),
"svirt_t":("qemu_t",),  # trailing comma keeps this a one-element tuple, not a bare string
"httpd_t":("phpfpm_t",),  # trailing comma keeps this a one-element tuple, not a bare string
}

equiv_dict = {"smbd": ["samba"], "httpd": ["apache"], "virtd": ["virt", "libvirt"], "named": ["bind"], "fsdaemon": ["smartmon"], "mdadm": ["raid"]}

equiv_dirs = ["/var"]
modules_dict = None


def gen_modules_dict(path="/usr/share/selinux/devel/policy.xml"):
    global modules_dict
    if modules_dict:
        return modules_dict

    import xml.etree.ElementTree
    modules_dict = {}
    try:
        tree = xml.etree.ElementTree.fromstring(sepolicy.policy_xml(path))
        for l in tree.findall("layer"):
            for m in l.findall("module"):
                name = m.get("name")
                if name == "user" or name == "unconfined":
                    continue
                if name == "unprivuser":
                    name = "user"
                if name == "unconfineduser":
                    name = "unconfined"
                for b in m.findall("summary"):
                    modules_dict[name] = b.text
    except IOError:
        pass
    return modules_dict
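
# Illustrative example: after parsing policy.xml, modules_dict maps a module
# name such as "apache" to the one-line <summary> text recorded for it.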

users = None
users_range = None


def get_all_users_info():
    global users
    global users_range
    if users and users_range:
        return users, users_range

    users = []
    users_range = {}
    allusers = []
    allusers_info = sepolicy.info(sepolicy.USER)

    for d in allusers_info:
        allusers.append(d['name'])
        if 'range' in d:
            users_range[d['name'].split("_")[0]] = d['range']

    for u in allusers:
        if u not in ["system_u", "root", "unconfined_u"]:
            users.append(u.replace("_u", ""))
    users.sort()
    return users, users_range

all_entrypoints = None

def get_entrypoints():
    global all_entrypoints
    if not all_entrypoints:
        all_entrypoints = next(sepolicy.info(sepolicy.ATTRIBUTE, "entry_type"))["types"]
    return all_entrypoints

domains = None


def gen_domains():
    global domains
    if domains:
        return domains
    domains = []
    for d in sepolicy.get_all_domains():
        found = False
        domain = d[:-2]
#		if domain + "_exec_t" not in get_entrypoints():
#			continue
        if domain in domains:
            continue
        domains.append(domain)

    for role in sepolicy.get_all_roles():
        if role[:-2] in domains or role == "system_r":
            continue
        domains.append(role[:-2])

    domains.sort()
    return domains
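
# Illustrative example: a policy containing the type "httpd_t" and the role
# "staff_r" would contribute "httpd" and "staff" to the returned list.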


exec_types = None

def _gen_exec_types():
    global exec_types
    if exec_types is None:
        exec_types = next(sepolicy.info(sepolicy.ATTRIBUTE, "exec_type"))["types"]
    return exec_types

entry_types = None

def _gen_entry_types():
    global entry_types
    if entry_types is None:
        entry_types = next(sepolicy.info(sepolicy.ATTRIBUTE, "entry_type"))["types"]
    return entry_types

mcs_constrained_types = None

def _gen_mcs_constrained_types():
    global mcs_constrained_types
    if mcs_constrained_types is None:
        mcs_constrained_types = next(sepolicy.info(sepolicy.ATTRIBUTE, "mcs_constrained_type"))
    return mcs_constrained_types


types = None

def _gen_types():
    global types
    if types:
        return types
    all_types = sepolicy.info(sepolicy.TYPE)
    types = {}
    for rec in all_types:
        try:
            types[rec["name"]] = rec["attributes"]
        except:
            types[rec["name"]] = []
    return types


def prettyprint(f, trim):
    return " ".join(f[:-len(trim)].split("_"))
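
# Illustrative example: prettyprint("httpd_log_t", "_t") returns "httpd log".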

# for HTML man pages
manpage_domains = []
manpage_roles = []

def get_alphabet_manpages(manpage_list):
    alphabet_manpages = dict.fromkeys(string.ascii_letters, [])
    for i in string.ascii_letters:
        temp = []
        for j in manpage_list:
            if j.split("/")[-1][0] == i:
                temp.append(j.split("/")[-1])

        alphabet_manpages[i] = temp

    return alphabet_manpages
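
# Illustrative example: get_alphabet_manpages(["/tmp/httpd_selinux.8", "/tmp/sshd_selinux.8"])
# maps 'h' to ['httpd_selinux.8'], 's' to ['sshd_selinux.8'], and every other
# letter to an empty list.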


def convert_manpage_to_html(html_manpage, manpage):
    try:
        from commands import getstatusoutput
    except ImportError:
        from subprocess import getstatusoutput
    rc, output = getstatusoutput("/usr/bin/groff -man -Thtml %s 2>/dev/null" % manpage)
    if rc == 0:
        print(html_manpage, "has been created")
        fd = open(html_manpage, 'w')
        fd.write(output)
        fd.close()


class HTMLManPages:

    """
            Generate HTML man pages for the given SELinux domains
    """

    def __init__(self, manpage_roles, manpage_domains, path, os_version):
        self.manpage_roles = get_alphabet_manpages(manpage_roles)
        self.manpage_domains = get_alphabet_manpages(manpage_domains)
        self.os_version = os_version
        self.old_path = path + "/"
        self.new_path = self.old_path

        if self.os_version:
            self.__gen_html_manpages()
        else:
            print("SELinux HTML man pages can not be generated for this OS version: %s" % os_version)
            exit(1)

    def __gen_html_manpages(self):
        self._write_html_manpage()
        self._gen_index()
        self._gen_css()

    def _write_html_manpage(self):
        if not os.path.isdir(self.new_path):
            os.mkdir(self.new_path)

        for domain in self.manpage_domains.values():
            if len(domain):
                for d in domain:
                    convert_manpage_to_html((self.new_path + d.rsplit("_selinux", 1)[0] + ".html"), self.old_path + d)

        for role in self.manpage_roles.values():
            if len(role):
                for r in role:
                    convert_manpage_to_html((self.new_path + r.rsplit("_selinux", 1)[0] + ".html"), self.old_path + r)

    def _gen_index(self):
        html = self.new_path + self.os_version + ".html"
        fd = open(html, 'w')
        fd.write("""
<html>
<head>
	<link rel=stylesheet type="text/css" href="style.css" title="style">
	<title>SELinux man pages</title>
</head>
<body>
<h1>SELinux man pages for %s</h1>
<hr>
<table><tr>
<td valign="middle">
<h3>SELinux roles</h3>
""" % self.os_version)
        for letter in self.manpage_roles:
            if len(self.manpage_roles[letter]):
                fd.write("""
<a href=#%s_role>%s</a>"""
                         % (letter, letter))

        fd.write("""
</td>
</tr></table>
<pre>
""")
        rolename_body = ""
        for letter in self.manpage_roles:
            if len(self.manpage_roles[letter]):
                rolename_body += "<p>"
                for r in self.manpage_roles[letter]:
                    rolename = r.rsplit("_selinux", 1)[0]
                    rolename_body += "<a name=%s_role></a><a href=%s.html>%s_selinux(8)</a> - Security Enhanced Linux Policy for the %s SELinux user\n" % (letter, rolename, rolename, rolename)

        fd.write("""%s
</pre>
<hr>
<table><tr>
<td valign="middle">
<h3>SELinux domains</h3>"""
                 % rolename_body)

        for letter in self.manpage_domains:
            if len(self.manpage_domains[letter]):
                fd.write("""
<a href=#%s_domain>%s</a>
			""" % (letter, letter))

        fd.write("""
</td>
</tr></table>
<pre>
""")
        domainname_body = ""
        for letter in self.manpage_domains:
            if len(self.manpage_domains[letter]):
                domainname_body += "<p>"
                for r in self.manpage_domains[letter]:
                    domainname = r.rsplit("_selinux", 1)[0]
                    domainname_body += "<a name=%s_domain></a><a href=%s.html>%s_selinux(8)</a> - Security Enhanced Linux Policy for the %s SELinux processes\n" % (letter, domainname, domainname, domainname)

        fd.write("""%s
</pre>
</body>
</html>
""" % domainname_body)

        fd.close()
        print("%s has been created" % html)

    def _gen_css(self):
        style_css = self.old_path + "style.css"
        fd = open(style_css, 'w')
        fd.write("""
html, body {
    background-color: #fcfcfc;
    font-family: arial, sans-serif;
    font-size: 110%;
    color: #333;
}

h1, h2, h3, h4, h5, h6 {
	color: #2d7c0b;
	font-family: arial, sans-serif;
	margin-top: 25px;
}

a {
    color: #336699;
    text-decoration: none;
}

a:visited {
    color: #4488bb;
}

a:hover, a:focus, a:active {
    color: #07488A;
    text-decoration: none;
}

a.func {
    color: red;
    text-decoration: none;
}
a.file {
    color: red;
    text-decoration: none;
}

pre.code {
    background-color: #f4f0f4;
//    font-family: monospace, courier;
    font-size: 110%;
    margin-left: 0px;
    margin-right: 60px;
    padding-top: 5px;
    padding-bottom: 5px;
    padding-left: 8px;
    padding-right: 8px;
    border: 1px solid #AADDAA;
}

.url {
    font-family: serif;
    font-style: italic;
    color: #440064;
}
""")

        fd.close()
        print("%s has been created" % style_css)


class ManPage:

    """
        Generate a Manpage on an SELinux domain in the specified path
    """
    modules_dict = None
    enabled_str = ["Disabled", "Enabled"]

    def __init__(self, domainname, path="/tmp", root="/", source_files=False, html=False):
        self.html = html
        self.source_files = source_files
        self.root = root
        self.portrecs = sepolicy.gen_port_dict()[0]
        self.domains = gen_domains()
        self.all_domains = sepolicy.get_all_domains()
        self.all_attributes = sepolicy.get_all_attributes()
        self.all_bools = sepolicy.get_all_bools()
        self.all_port_types = sepolicy.get_all_port_types()
        self.all_roles = sepolicy.get_all_roles()
        self.all_users = get_all_users_info()[0]
        self.all_users_range = get_all_users_info()[1]
        self.all_file_types = sepolicy.get_all_file_types()
        self.role_allows = sepolicy.get_all_role_allows()
        self.types = _gen_types()
        self.exec_types = _gen_exec_types()
        self.entry_types = _gen_entry_types()
        self.mcs_constrained_types = _gen_mcs_constrained_types()

        if self.source_files:
            self.fcpath = self.root + "file_contexts"
        else:
            self.fcpath = self.root + selinux.selinux_file_context_path()

        self.fcdict = sepolicy.get_fcdict(self.fcpath)

        if not os.path.exists(path):
            os.makedirs(path)

        self.path = path

        if self.source_files:
            self.xmlpath = self.root + "policy.xml"
        else:
            self.xmlpath = self.root + "/usr/share/selinux/devel/policy.xml"
        self.booleans_dict = sepolicy.gen_bool_dict(self.xmlpath)

        self.domainname, self.short_name = sepolicy.gen_short_name(domainname)

        self.type = self.domainname + "_t"
        self._gen_bools()
        self.man_page_path = "%s/%s_selinux.8" % (path, self.domainname)
        self.fd = open(self.man_page_path, 'w')
        if self.domainname + "_r" in self.all_roles:
            self.__gen_user_man_page()
            if self.html:
                manpage_roles.append(self.man_page_path)
        else:
            if self.html:
                manpage_domains.append(self.man_page_path)
            self.__gen_man_page()
        self.fd.close()

        for k in equiv_dict.keys():
            if k == self.domainname:
                for alias in equiv_dict[k]:
                    self.__gen_man_page_link(alias)

    def _gen_bools(self):
        self.bools = []
        self.domainbools = []
        types = [self.type]
        if self.domainname in equiv_dict:
            for t in equiv_dict[self.domainname]:
                if t + "_t" in self.all_domains:
                    types.append(t + "_t")

        for t in types:
            domainbools, bools = sepolicy.get_bools(t)
            self.bools += bools
            self.domainbools += domainbools

        self.bools.sort()
        self.domainbools.sort()

    def get_man_page_path(self):
        return self.man_page_path

    def __gen_user_man_page(self):
        self.role = self.domainname + "_r"
        if not self.modules_dict:
            self.modules_dict = gen_modules_dict(self.xmlpath)

        try:
            self.desc = self.modules_dict[self.domainname]
        except:
            self.desc = "%s user role" % self.domainname

        if self.domainname in self.all_users:
            self.attributes = next(sepolicy.info(sepolicy.TYPE, (self.type)))["attributes"]
            self._user_header()
            self._user_attribute()
            self._can_sudo()
            self._xwindows_login()
            # until a new policy build with the login_userdomain attribute:
            # self._terminal_login()
            self._network()
            self._booleans()
            self._home_exec()
            self._transitions()
        else:
            self._role_header()
            self._booleans()

        self._port_types()
        self._mcs_types()
        self._writes()
        self._footer()

    def __gen_man_page_link(self, alias):
        path = "%s/%s_selinux.8" % (self.path, alias)
        self.fd = open("%s/%s_selinux.8" % (self.path, alias), 'w')
        self.fd.write(".so man8/%s_selinux.8" % self.domainname)
        self.fd.close()
        print(path)

    def __gen_man_page(self):
        self.anon_list = []

        self.attributes = {}
        self.ptypes = []
        self._get_ptypes()

        for domain_type in self.ptypes:
            try:
                if typealias_types[domain_type]:
                    fd = self.fd
                    man_page_path = self.man_page_path
                    for t in typealias_types[domain_type]:
                        self._typealias_gen_man(t)
                    self.fd = fd
                    self.man_page_path = man_page_path
            except KeyError:
                continue
            self.attributes[domain_type] = next(sepolicy.info(sepolicy.TYPE, domain_type))["attributes"]

        self._header()
        self._entrypoints()
        self._process_types()
        self._mcs_types()
        self._booleans()
        self._nsswitch_domain()
        self._port_types()
        self._writes()
        self._file_context()
        self._public_content()
        self._footer()

    def _get_ptypes(self):
        for f in self.all_domains:
            if f.startswith(self.short_name) or f.startswith(self.domainname):
                self.ptypes.append(f)

    def _typealias_gen_man(self, t):
        self.man_page_path = "%s/%s_selinux.8" % (self.path, t[:-2])
        self.ports = []
        self.booltext = ""
        self.fd = open(self.man_page_path, 'w')
        self._typealias(t[:-2])
        self._footer()
        self.fd.close()

    def _typealias(self,typealias):
        self.fd.write('.TH  "%(typealias)s_selinux"  "8"  "%(date)s" "%(typealias)s" "SELinux Policy %(typealias)s"'
                 % {'typealias':typealias, 'date': time.strftime("%y-%m-%d")})
        self.fd.write(r"""
.SH "NAME"
%(typealias)s_selinux \- Security Enhanced Linux Policy for the %(typealias)s processes
.SH "DESCRIPTION"

%(typealias)s_t SELinux domain type is now associated with %(domainname)s domain type (%(domainname)s_t).
""" % {'typealias':typealias, 'domainname':self.domainname})

        self.fd.write(r"""
Please see

.B %(domainname)s_selinux

man page for more details.
"""  % {'domainname':self.domainname})

    def _header(self):
        self.fd.write('.TH  "%(domainname)s_selinux"  "8"  "%(date)s" "%(domainname)s" "SELinux Policy %(domainname)s"'
                      % {'domainname': self.domainname, 'date': time.strftime("%y-%m-%d")})
        self.fd.write(r"""
.SH "NAME"
%(domainname)s_selinux \- Security Enhanced Linux Policy for the %(domainname)s processes
.SH "DESCRIPTION"

Security-Enhanced Linux secures the %(domainname)s processes via flexible mandatory access control.

The %(domainname)s processes execute with the %(domainname)s_t SELinux type. You can check if you have these processes running by executing the \fBps\fP command with the \fB\-Z\fP qualifier.

For example:

.B ps -eZ | grep %(domainname)s_t

""" % {'domainname': self.domainname})

    def _format_boolean_desc(self, b):
        desc = self.booleans_dict[b][2][0].lower() + self.booleans_dict[b][2][1:]
        if desc[-1] == ".":
            desc = desc[:-1]
        return desc
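
    # _format_boolean_desc() lowercases the first letter and strips a trailing
    # period so the description reads mid-sentence; e.g. the hypothetical
    # "Allow httpd to use mod_auth_pam." becomes "allow httpd to use mod_auth_pam".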

    def _gen_bool_text(self):
        booltext = ""
        for b, enabled in self.domainbools + self.bools:
            if b.endswith("anon_write") and b not in self.anon_list:
                self.anon_list.append(b)
            else:
                if b not in self.booleans_dict:
                    continue
                booltext += """
.PP
If you want to %s, you must turn on the %s boolean. %s by default.

.EX
.B setsebool -P %s 1

.EE
""" % (self._format_boolean_desc(b), b, self.enabled_str[enabled], b)
        return booltext

    def _booleans(self):
        self.booltext = self._gen_bool_text()

        if self.booltext != "":
            self.fd.write("""
.SH BOOLEANS
SELinux policy is customizable based on least access required.  %s policy is extremely flexible and has several booleans that allow you to manipulate the policy and run %s with the tightest access possible.

""" % (self.domainname, self.domainname))

            self.fd.write(self.booltext)

    def _nsswitch_domain(self):
        nsswitch_types = []
        nsswitch_booleans = ['authlogin_nsswitch_use_ldap', 'kerberos_enabled']
        nsswitchbooltext = ""
        for k in self.attributes.keys():
            if "nsswitch_domain" in self.attributes[k]:
                nsswitch_types.append(k)

        if len(nsswitch_types):
            self.fd.write("""
.SH NSSWITCH DOMAIN
""")
            for b in nsswitch_booleans:
                nsswitchbooltext += """
.PP
If you want to %s for the %s, you must turn on the %s boolean.

.EX
.B setsebool -P %s 1
.EE
""" % (self._format_boolean_desc(b), (", ".join(nsswitch_types)), b, b)

        self.fd.write(nsswitchbooltext)

    def _process_types(self):
        if len(self.ptypes) == 0:
            return
        self.fd.write(r"""
.SH PROCESS TYPES
SELinux defines process types (domains) for each process running on the system.
.PP
You can see the context of a process using the \fB\-Z\fP option to \fBps\fP.
.PP
Policy governs the access confined processes have to files.
SELinux %(domainname)s policy is very flexible, allowing users to set up their %(domainname)s processes in as secure a manner as possible.
.PP
The following process types are defined for %(domainname)s:
""" % {'domainname': self.domainname})
        self.fd.write("""
.EX
.B %s
.EE""" % ", ".join(self.ptypes))
        self.fd.write("""
.PP
Note:
.B semanage permissive -a %(domainname)s_t
can be used to make the process type %(domainname)s_t permissive. SELinux does not deny access to permissive process types, but the AVC (SELinux denials) messages are still generated.
""" % {'domainname': self.domainname})

    def _port_types(self):
        self.ports = []
        for f in self.all_port_types:
            if f.startswith(self.short_name) or f.startswith(self.domainname):
                self.ports.append(f)

        if len(self.ports) == 0:
            return
        self.fd.write("""
.SH PORT TYPES
SELinux defines port types to represent TCP and UDP ports.
.PP
You can see the types associated with a port by using the following command:

.B semanage port -l

.PP
Policy governs the access confined processes have to these ports.
SELinux %(domainname)s policy is very flexible, allowing users to set up their %(domainname)s processes in as secure a manner as possible.
.PP
The following port types are defined for %(domainname)s:""" % {'domainname': self.domainname})

        for p in self.ports:
            self.fd.write("""

.EX
.TP 5
.B %s
.TP 10
.EE
""" % p)
            once = True
            for prot in ("tcp", "udp"):
                if (p, prot) in self.portrecs:
                    if once:
                        self.fd.write("""

Default Defined Ports:""")
                    once = False
                    self.fd.write(r"""
%s %s
.EE""" % (prot, ",".join(self.portrecs[(p, prot)])))

    def _file_context(self):
        flist = []
        flist_non_exec = []
        mpaths = []
        for f in self.all_file_types:
            if f.startswith(self.domainname):
                flist.append(f)
                if not f in self.exec_types or not f in self.entry_types:
                    flist_non_exec.append(f)
                if f in self.fcdict:
                    mpaths = mpaths + self.fcdict[f]["regex"]
        if len(mpaths) == 0:
            return
        mpaths.sort()
        mdirs = {}
        for mp in mpaths:
            found = False
            for md in mdirs:
                if mp.startswith(md):
                    mdirs[md].append(mp)
                    found = True
                    break
            if not found:
                for e in equiv_dirs:
                    if mp.startswith(e) and mp.endswith('(/.*)?'):
                        mdirs[mp[:-6]] = []
                        break

        equiv = []
        for m in mdirs:
            if len(mdirs[m]) > 0:
                equiv.append(m)

        self.fd.write(r"""
.SH FILE CONTEXTS
SELinux requires files to have an extended attribute to define the file type.
.PP
You can see the context of a file using the \fB\-Z\fP option to \fBls\fP.
.PP
Policy governs the access confined processes have to these files.
SELinux %(domainname)s policy is very flexible, allowing users to set up their %(domainname)s processes in as secure a manner as possible.
.PP
""" % {'domainname': self.domainname})

        if len(equiv) > 0:
            self.fd.write(r"""
.PP
.B EQUIVALENCE DIRECTORIES
""")
            for e in equiv:
                self.fd.write(r"""
.PP
%(domainname)s policy stores data with multiple different file context types under the %(equiv)s directory.  If you would like to store the data in a different directory you can use the semanage command to create an equivalence mapping.  If you wanted to store this data under the /srv directory you would execute the following command:
.PP
.B semanage fcontext -a -e %(equiv)s /srv/%(alt)s
.br
.B restorecon -R -v /srv/%(alt)s
.PP
""" % {'domainname': self.domainname, 'equiv': e, 'alt': e.split('/')[-1]})

        if flist_non_exec:
                self.fd.write(r"""
.PP
.B STANDARD FILE CONTEXT

SELinux defines the file context types for %(domainname)s. If you want to
store files with these types in different paths, you need to execute the semanage command to specify alternate labeling and then use restorecon to put the labels on disk.

.B semanage fcontext -a -t %(type)s '/srv/my%(domainname)s_content(/.*)?'
.br
.B restorecon -R -v /srv/my%(domainname)s_content

Note: SELinux often uses regular expressions to specify labels that match multiple files.
""" % {'domainname': self.domainname, "type": flist_non_exec[-1]})

        self.fd.write(r"""
.I The following file types are defined for %(domainname)s:
""" % {'domainname': self.domainname})
        flist.sort()
        for f in flist:
            self.fd.write("""

.EX
.PP
.B %s
.EE

- %s
""" % (f, sepolicy.get_description(f)))

            if f in self.fcdict:
                plural = ""
                if len(self.fcdict[f]["regex"]) > 1:
                    plural = "s"
                    self.fd.write("""
.br
.TP 5
Path%s:
%s""" % (plural, self.fcdict[f]["regex"][0]))
                    for x in self.fcdict[f]["regex"][1:]:
                        self.fd.write(", %s" % x)

        self.fd.write("""

.PP
Note: File context can be temporarily modified with the chcon command.  If you want to permanently change the file context you need to use the
.B semanage fcontext
command.  This will modify the SELinux labeling database.  You will need to use
.B restorecon
to apply the labels.
""")

    def _see_also(self):
        ret = ""
        for d in self.domains:
            if d == self.domainname:
                continue
            if d.startswith(self.short_name):
                ret += ", %s_selinux(8)" % d
            if d.startswith(self.domainname + "_"):
                ret += ", %s_selinux(8)" % d
        self.fd.write(ret)

    def _public_content(self):
        if len(self.anon_list) > 0:
            self.fd.write("""
.SH SHARING FILES
If you want to share files with multiple domains (Apache, FTP, rsync, Samba), you can set a file context of public_content_t and public_content_rw_t.  These contexts allow any of the above domains to read the content.  If you want a particular domain to write to content labeled public_content_rw_t, you must set the appropriate boolean.
.TP
Allow %(domainname)s servers to read the /var/%(domainname)s directory by adding the public_content_t file type to the directory and by restoring the file type.
.PP
.B
semanage fcontext -a -t public_content_t "/var/%(domainname)s(/.*)?"
.br
.B restorecon -F -R -v /var/%(domainname)s
.PP
.TP
Allow %(domainname)s servers to read and write /var/%(domainname)s/incoming by adding the public_content_rw_t type to the directory and by restoring the file type.  You also need to turn on the %(domainname)s_anon_write boolean.
.PP
.B
semanage fcontext -a -t public_content_rw_t "/var/%(domainname)s/incoming(/.*)?"
.br
.B restorecon -F -R -v /var/%(domainname)s/incoming
.br
.B setsebool -P %(domainname)s_anon_write 1
""" % {'domainname': self.domainname})
            for b in self.anon_list:
                desc = self.booleans_dict[b][2][0].lower() + self.booleans_dict[b][2][1:]
                self.fd.write("""
.PP
If you want to %s, you must turn on the %s boolean.

.EX
.B setsebool -P %s 1
.EE
""" % (desc, b, b))

    def _footer(self):
        self.fd.write("""
.SH "COMMANDS"
.B semanage fcontext
can also be used to manipulate default file context mappings.
.PP
.B semanage permissive
can also be used to manipulate whether or not a process type is permissive.
.PP
.B semanage module
can also be used to enable/disable/install/remove policy modules.
""")

        if len(self.ports) > 0:
            self.fd.write("""
.B semanage port
can also be used to manipulate the port definitions
""")

        if self.booltext != "":
            self.fd.write("""
.B semanage boolean
can also be used to manipulate the booleans
""")

        self.fd.write("""
.PP
.B system-config-selinux
is a GUI tool available to customize SELinux policy settings.

.SH AUTHOR
This manual page was auto-generated using
.B "sepolicy manpage".

.SH "SEE ALSO"
selinux(8), %s(8), semanage(8), restorecon(8), chcon(1), sepolicy(8)""" % (self.domainname))

        if self.booltext != "":
            self.fd.write(", setsebool(8)")

        self._see_also()

    def _valid_write(self, check, attributes):
        if check in [self.type, "domain"]:
            return False
        if check.endswith("_t"):
            for a in attributes:
                if a in self.types[check]:
                    return False
        return True

    def _entrypoints(self):
        entrypoints = [x['target'] for x in filter(lambda y:
            y['source'] == self.type and y['class'] == 'file' and 'entrypoint' in y['permlist'],
            sepolicy.get_all_allow_rules()
        )]

        if len(entrypoints) == 0:
            return

        self.fd.write("""
.SH "ENTRYPOINTS"
""")
        if len(entrypoints) > 1:
            entrypoints_str = "\\fB%s\\fP file types" % ", ".join(entrypoints)
        else:
            entrypoints_str = "\\fB%s\\fP file type" % entrypoints[0]

        self.fd.write("""
The %s_t SELinux type can be entered via the %s.

The default entrypoint paths for the %s_t domain are the following:
""" % (self.domainname, entrypoints_str, self.domainname))
        if "bin_t" in entrypoints:
            entrypoints.remove("bin_t")
            self.fd.write("""
All executables with the default executable label, usually stored in /usr/bin and /usr/sbin.""")

        paths = []
        for entrypoint in entrypoints:
            if entrypoint in self.fcdict:
                paths += self.fcdict[entrypoint]["regex"]

        self.fd.write("""
%s""" % ", ".join(paths))

    def _mcs_types(self):
        if self.type not in self.mcs_constrained_types['types']:
            return
        self.fd.write ("""
.SH "MCS Constrained"
The SELinux process type %(type)s_t is an MCS (Multi Category Security) constrained type.  Sometimes this separation is referred to as sVirt.  These types are usually used for securing multi-tenant environments, such as virtualization, containers or separation of users.  The tools used to launch MCS types pick out a different MCS label for each process group.

For example one process might be launched with %(type)s_t:s0:c1,c2, and another process launched with %(type)s_t:s0:c3,c4.  The SELinux kernel only allows these processes to write to content with a matching MCS label, or an MCS label of s0.  A process running with the MCS level of s0:c1,c2 is not allowed to write to content with the MCS label of s0:c3,c4.
""" % {'type': self.domainname})

    def _writes(self):
        # add assigned attributes
        src_list = [self.type]
        try:
            src_list += list(filter(lambda x: x['name'] == self.type, sepolicy.get_all_types_info()))[0]['attributes']
        except:
            pass

        permlist = list(filter(lambda x:
            x['source'] in src_list and
            set(['open', 'write']).issubset(x['permlist']) and
            x['class'] == 'file',
            sepolicy.get_all_allow_rules()))
        if permlist is None or len(permlist) == 0:
            return

        all_writes = []
        attributes = ["proc_type", "sysctl_type"]

        for i in permlist:
            if self._valid_write(i['target'], attributes):
                if i['target'] not in all_writes:
                    all_writes.append(i['target'])

        if len(all_writes) == 0:
            return
        self.fd.write("""
.SH "MANAGED FILES"
""")
        self.fd.write("""
The SELinux process type %s_t can manage files labeled with the following file types.  The paths listed are the default paths for these file types.  Note that the process UID still needs to have DAC permissions.
""" % self.domainname)

        all_writes.sort()
        if "file_type" in all_writes:
            all_writes = ["file_type"]
        for f in all_writes:
            self.fd.write("""
.br
.B %s

""" % f)
            if f in self.fcdict:
                for path in self.fcdict[f]["regex"]:
                    self.fd.write("""\t%s
.br
""" % path)

    def _get_users_range(self):
        if self.domainname in self.all_users_range:
            return self.all_users_range[self.domainname]
        return "s0"

    def _user_header(self):
        self.fd.write('.TH  "%(type)s_selinux"  "8"  "%(type)s" "mgrepl@redhat.com" "%(type)s SELinux Policy documentation"'
                      % {'type': self.domainname})

        self.fd.write(r"""
.SH "NAME"
%(user)s_u \- \fB%(desc)s\fP - Security Enhanced Linux Policy

.SH DESCRIPTION

\fB%(user)s_u\fP is an SELinux User defined in the SELinux
policy. SELinux users have default roles, \fB%(user)s_r\fP.  The
default role has a default type, \fB%(user)s_t\fP, associated with it.

The SELinux user will usually log in to a system with a context that looks like:

.B %(user)s_u:%(user)s_r:%(user)s_t:%(range)s

Linux users are automatically assigned an SELinux user at login.
Login programs use the SELinux User to assign initial context to the user's shell.

SELinux policy uses the context to control the user's access.

By default all users are assigned to the SELinux user via the \fB__default__\fP flag.

On Targeted policy systems the \fB__default__\fP user is assigned to the \fBunconfined_u\fP SELinux user.

You can list all Linux User to SELinux user mappings using:

.B semanage login -l

If you wanted to change the default user mapping to use the %(user)s_u user, you would execute:

.B semanage login -m -s %(user)s_u __default__

""" % {'desc': self.desc, 'type': self.type, 'user': self.domainname, 'range': self._get_users_range()})

        if "login_userdomain" in self.attributes and "login_userdomain" in self.all_attributes:
            self.fd.write("""
If you want to map one Linux user (joe) to the SELinux user %(user)s_u, you would execute:

.B $ semanage login -a -s %(user)s_u joe

""" % {'user': self.domainname})

    def _can_sudo(self):
        sudotype = "%s_sudo_t" % self.domainname
        self.fd.write("""
.SH SUDO
""")
        if sudotype in self.types:
            role = self.domainname + "_r"
            self.fd.write("""
The SELinux user %(user)s can execute sudo.

You can set up sudo to allow %(user)s to transition to an administrative domain:

Add one or more of the following records to sudoers using visudo.

""" % {'user': self.domainname})
            for adminrole in self.role_allows[role]:
                self.fd.write("""
USERNAME ALL=(ALL) ROLE=%(admin)s_r TYPE=%(admin)s_t COMMAND
.br
sudo will run COMMAND as %(user)s_u:%(admin)s_r:%(admin)s_t:LEVEL
""" % {'admin': adminrole[:-2], 'user': self.domainname})

                self.fd.write("""
You might also need to add one or more of these new roles to your SELinux user record.

List the SELinux roles your SELinux user can reach by executing:

.B $ semanage user -l |grep selinux_name

Modify the roles list and add %(user)s_r to this list.

.B $ semanage user -m -R '%(roles)s' %(user)s_u

For more details see the semanage man page.

""" % {'user': self.domainname, "roles": " ".join([role] + self.role_allows[role])})
        else:
            self.fd.write("""
The SELinux type %s_t is not allowed to execute sudo.
""" % self.domainname)

    def _user_attribute(self):
        self.fd.write("""
.SH USER DESCRIPTION
""")
        if "unconfined_usertype" in self.attributes:
            self.fd.write("""
The SELinux user %s_u is an unconfined user. This means that a Linux user mapped to this SELinux user is allowed to perform all actions.
""" % self.domainname)

        if "unpriv_userdomain" in self.attributes:
            self.fd.write("""
The SELinux user %s_u is defined in policy as an unprivileged user. SELinux prevents unprivileged users from doing administration tasks without transitioning to a different role.
""" % self.domainname)

        if "admindomain" in self.attributes:
            self.fd.write("""
The SELinux user %s_u is an admin user. This means that a Linux user mapped to this SELinux user is intended for administrative actions. Usually this is assigned to the root Linux user.
""" % self.domainname)

    def _xwindows_login(self):
        if "x_domain" in self.all_attributes:
            self.fd.write("""
.SH X WINDOWS LOGIN
""")
            if "x_domain" in self.attributes:
                self.fd.write("""
The SELinux user %s_u is able to log in via X Windows.
""" % self.domainname)
            else:
                self.fd.write("""
The SELinux user %s_u is not able to log in via X Windows.
""" % self.domainname)

    def _terminal_login(self):
        if "login_userdomain" in self.all_attributes:
            self.fd.write("""
.SH TERMINAL LOGIN
""")
            if "login_userdomain" in self.attributes:
                self.fd.write("""
The SELinux user %s_u is able to log in via a terminal.
""" % self.domainname)
            else:
                self.fd.write("""
The SELinux user %s_u is not able to log in via a terminal.
""" % self.domainname)

    def _network(self):
        from sepolicy import network
        self.fd.write("""
.SH NETWORK
""")
        for net in ("tcp", "udp"):
            portdict = network.get_network_connect(self.type, net, "name_bind")
            if len(portdict) > 0:
                self.fd.write("""
.TP
The SELinux user %s_u is able to listen on the following %s ports.
""" % (self.domainname, net))
                for p in portdict:
                    for t, ports in portdict[p]:
                        self.fd.write("""
.B %s
""" % ",".join(ports))
            portdict = network.get_network_connect(self.type, "tcp", "name_connect")
            if len(portdict) > 0:
                self.fd.write("""
.TP
The SELinux user %s_u is able to connect to the following tcp ports.
""" % (self.domainname))
                for p in portdict:
                    for t, ports in portdict[p]:
                        self.fd.write("""
.B %s
""" % ",".join(ports))

    def _home_exec(self):
        permlist = list(filter(lambda x:
            x['source'] == self.type and
            x['target'] == 'user_home_type' and
            x['class'] == 'file' and
            set(['ioctl', 'read', 'getattr', 'execute', 'execute_no_trans', 'open']).issubset(set(x['permlist'])),
            sepolicy.get_all_allow_rules()))
        self.fd.write("""
.SH HOME_EXEC
""")
        if permlist:
            self.fd.write("""
The SELinux user %s_u is able to execute home content files.
""" % self.domainname)

        else:
            self.fd.write("""
The SELinux user %s_u is not able to execute home content files.
""" % self.domainname)

    def _transitions(self):
        self.fd.write(r"""
.SH TRANSITIONS

Three things can happen when %(type)s attempts to execute a program.

\fB1.\fP SELinux Policy can deny %(type)s from executing the program.

.TP

\fB2.\fP SELinux Policy can allow %(type)s to execute the program in the current user type.

Execute the following to see the types that the SELinux user %(type)s can execute without transitioning:

.B sesearch -A -s %(type)s -c file -p execute_no_trans

.TP

\fB3.\fP SELinux can allow %(type)s to execute the program and transition to a new type.

Execute the following to see the types that the SELinux user %(type)s can execute and transition:

.B $ sesearch -A -s %(type)s -c process -p transition

""" % {'user': self.domainname, 'type': self.type})

    def _role_header(self):
        self.fd.write('.TH  "%(user)s_selinux"  "8"  "%(user)s" "mgrepl@redhat.com" "%(user)s SELinux Policy documentation"'
                      % {'user': self.domainname})

        self.fd.write(r"""
.SH "NAME"
%(user)s_r \- \fB%(desc)s\fP - Security Enhanced Linux Policy

.SH DESCRIPTION

SELinux supports Role Based Access Control (RBAC). Some SELinux roles are login roles, while other roles need to be transitioned into.

.I Note:
Examples in this man page will use the
.B staff_u
SELinux user.

Non-login roles are usually used for administrative tasks, for example, tasks that require root privileges.  Roles control which types a user can run processes with. Roles often have default types assigned to them.

The default type for the %(user)s_r role is %(user)s_t.

The
.B newrole
program can be used to transition directly to this role.

.B newrole -r %(user)s_r -t %(user)s_t

.B sudo
is the preferred method to transition from one role to another.  You set up sudo to transition to %(user)s_r by adding a similar line to the /etc/sudoers file.

USERNAME ALL=(ALL) ROLE=%(user)s_r TYPE=%(user)s_t COMMAND

.br
sudo will run COMMAND as staff_u:%(user)s_r:%(user)s_t:LEVEL

When using a non-login role, you need to set up SELinux so that your SELinux user can reach the %(user)s_r role.

Execute the following to see all of the assigned SELinux roles:

.B semanage user -l

You need to add %(user)s_r to the staff_u user.  You could set up the staff_u user to be able to use the %(user)s_r role with a command like:

.B $ semanage user -m -R 'staff_r system_r %(user)s_r' staff_u

""" % {'desc': self.desc, 'user': self.domainname})
        troles = []
        for i in self.role_allows:
            if self.domainname + "_r" in self.role_allows[i]:
                troles.append(i)
        if len(troles) > 0:
            plural = ""
            if len(troles) > 1:
                plural = "s"

                self.fd.write("""

SELinux policy also controls which roles can transition to a different role.
You can list these rules using the following command.

.B search --role_allow

SELinux policy allows the %s role%s to transition to the %s_r role.

""" % (", ".join(troles), plural, self.domainname))
site-packages/sepolicy/__init__.py
# Author: Dan Walsh <dwalsh@redhat.com>
# Author: Ryan Hallisey <rhallise@redhat.com>
# Author: Jason Zaman <perfinion@gentoo.org>

import errno
import selinux
import setools
import glob
import sepolgen.defaults as defaults
import sepolgen.interfaces as interfaces
import sys
import os
import re
import gzip

PROGNAME = "selinux-python"
try:
    import gettext
    kwargs = {}
    if sys.version_info < (3,):
        kwargs['unicode'] = True
    gettext.install(PROGNAME,
                    localedir="/usr/share/locale",
                    codeset='utf-8',
                    **kwargs)
except:
    try:
        import builtins
        builtins.__dict__['_'] = str
    except ImportError:
        import __builtin__
        __builtin__.__dict__['_'] = unicode

TYPE = 1
ROLE = 2
ATTRIBUTE = 3
PORT = 4
USER = 5
BOOLEAN = 6
TCLASS = 7

ALLOW = 'allow'
AUDITALLOW = 'auditallow'
NEVERALLOW = 'neverallow'
DONTAUDIT = 'dontaudit'
SOURCE = 'source'
TARGET = 'target'
PERMS = 'permlist'
CLASS = 'class'
TRANSITION = 'transition'
ROLE_ALLOW = 'role_allow'


# Autofill for adding files *************************
DEFAULT_DIRS = {}
DEFAULT_DIRS["/etc"] = "etc_t"
DEFAULT_DIRS["/tmp"] = "tmp_t"
DEFAULT_DIRS["/usr/lib/systemd/system"] = "unit_file_t"
DEFAULT_DIRS["/lib/systemd/system"] = "unit_file_t"
DEFAULT_DIRS["/etc/systemd/system"] = "unit_file_t"
DEFAULT_DIRS["/var/cache"] = "var_cache_t"
DEFAULT_DIRS["/var/lib"] = "var_lib_t"
DEFAULT_DIRS["/var/log"] = "log_t"
DEFAULT_DIRS["/var/run"] = "var_run_t"
DEFAULT_DIRS["/run"] = "var_run_t"
DEFAULT_DIRS["/run/lock"] = "var_lock_t"
DEFAULT_DIRS["/var/run/lock"] = "var_lock_t"
DEFAULT_DIRS["/var/spool"] = "var_spool_t"
DEFAULT_DIRS["/var/www"] = "content_t"

file_type_str = {}
file_type_str["a"] = _("all files")
file_type_str["f"] = _("regular file")
file_type_str["d"] = _("directory")
file_type_str["c"] = _("character device")
file_type_str["b"] = _("block device")
file_type_str["s"] = _("socket file")
file_type_str["l"] = _("symbolic link")
file_type_str["p"] = _("named pipe")

trans_file_type_str = {}
trans_file_type_str[""] = "a"
trans_file_type_str["--"] = "f"
trans_file_type_str["-d"] = "d"
trans_file_type_str["-c"] = "c"
trans_file_type_str["-b"] = "b"
trans_file_type_str["-s"] = "s"
trans_file_type_str["-l"] = "l"
trans_file_type_str["-p"] = "p"

# the setools policy handle
_pol = None

# cache the lookup results
file_equiv_modified = None
file_equiv = None
local_files = None
fcdict = None
methods = []
all_types = None
all_types_info = None
user_types = None
role_allows = None
portrecs = None
portrecsbynum = None
all_domains = None
roles = None
selinux_user_list = None
login_mappings = None
file_types = None
port_types = None
bools = None
all_attributes = None
booleans = None
booleans_dict = None
all_allow_rules = None
all_bool_rules = None
all_transitions = None


def policy_sortkey(policy_path):
    # Parse the extension of a policy path which looks like .../policy/policy.31
    extension = policy_path.rsplit('/policy.', 1)[1]
    try:
        return int(extension), policy_path
    except ValueError:
        # Fallback with sorting on the full path
        return 0, policy_path
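
# Illustrative sketch (not called anywhere): numeric ordering of binary policy
# paths; the file names below are made up for the example.
def _example_policy_sortkey():
    paths = ["/etc/selinux/targeted/policy/policy.31",
             "/etc/selinux/targeted/policy/policy.9"]
    # policy.9 sorts before policy.31 because the extension compares as an int
    return sorted(paths, key=policy_sortkey)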

def get_installed_policy(root="/"):
    try:
        path = root + selinux.selinux_binary_policy_path()
        policies = glob.glob("%s.*" % path)
        policies.sort(key=policy_sortkey)
        return policies[-1]
    except:
        pass
    raise ValueError(_("No SELinux Policy installed"))

def get_store_policy(store):
    """Get the path to the policy file located in the given store name"""
    policies = glob.glob("%s%s/policy/policy.*" %
                         (selinux.selinux_path(), store))
    if not policies:
        return None
    # Return the policy with the higher version number
    policies.sort(key=policy_sortkey)
    return policies[-1]
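
# Illustrative sketch (requires an SELinux host; "targeted" is only an example
# store name and may not exist on every system).
def _example_policy_paths():
    return get_installed_policy(), get_store_policy("targeted")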

def policy(policy_file):
    global all_domains
    global all_attributes
    global bools
    global all_types
    global role_allows
    global users
    global roles
    global file_types
    global port_types
    all_domains = None
    all_attributes = None
    bools = None
    all_types = None
    role_allows = None
    users = None
    roles = None
    file_types = None
    port_types = None
    global _pol

    try:
        _pol = setools.SELinuxPolicy(policy_file)
    except:
        raise ValueError(_("Failed to read %s policy file") % policy_file)

def load_store_policy(store):
    policy_file = get_store_policy(store)
    if not policy_file:
        return None
    policy(policy_file)

try:
    policy_file = get_installed_policy()
    policy(policy_file)
except ValueError as e:
    if selinux.is_selinux_enabled() == 1:
        raise e


def info(setype, name=None):
    if setype == TYPE:
        q = setools.TypeQuery(_pol)
        q.name = name
        results = list(q.results())

        if name and len(results) < 1:
            # type not found, try alias
            q.name = None
            q.alias = name
            results = list(q.results())

        return ({
            'aliases': list(map(str, x.aliases())),
            'name': str(x),
            'permissive': bool(x.ispermissive),
            'attributes': list(map(str, x.attributes()))
        } for x in results)

    elif setype == ROLE:
        q = setools.RoleQuery(_pol)
        if name:
            q.name = name

        return ({
            'name': str(x),
            'roles': list(map(str, x.expand())),
            'types': list(map(str, x.types())),
        } for x in q.results())

    elif setype == ATTRIBUTE:
        q = setools.TypeAttributeQuery(_pol)
        if name:
            q.name = name

        return ({
            'name': str(x),
            'types': list(map(str, x.expand())),
        } for x in q.results())

    elif setype == PORT:
        q = setools.PortconQuery(_pol)
        if name:
            ports = [int(i) for i in name.split("-")]
            if len(ports) == 2:
                q.ports = ports
            elif len(ports) == 1:
                q.ports = (ports[0], ports[0])

        if _pol.mls:
            return ({
                'high': x.ports.high,
                'protocol': str(x.protocol),
                'range': str(x.context.range_),
                'type': str(x.context.type_),
                'low': x.ports.low,
            } for x in q.results())
        return ({
            'high': x.ports.high,
            'protocol': str(x.protocol),
            'type': str(x.context.type_),
            'low': x.ports.low,
        } for x in q.results())

    elif setype == USER:
        q = setools.UserQuery(_pol)
        if name:
            q.name = name

        if _pol.mls:
            return ({
                'range': str(x.mls_range),
                'name': str(x),
                'roles': list(map(str, x.roles)),
                'level': str(x.mls_level),
            } for x in q.results())
        return ({
            'name': str(x),
            'roles': list(map(str, x.roles)),
        } for x in q.results())

    elif setype == BOOLEAN:
        q = setools.BoolQuery(_pol)
        if name:
            q.name = name

        return ({
            'name': str(x),
            'state': x.state,
        } for x in q.results())

    elif setype == TCLASS:
        q = setools.ObjClassQuery(_pol)
        if name:
            q.name = name

        return ({
            'name': str(x),
            'permlist': list(x.perms),
        } for x in q.results())

    else:
        raise ValueError("Invalid type")


def _setools_rule_to_dict(rule):
    d = {
        'type': str(rule.ruletype),
        'source': str(rule.source),
        'target': str(rule.target),
        'class': str(rule.tclass),
    }

    # Evaluate boolean expression associated with given rule (if there is any)
    try:
        # Get state of all booleans in the conditional expression
        boolstate = {}
        for boolean in rule.conditional.booleans:
            boolstate[str(boolean)] = boolean.state
        # evaluate if the rule is enabled
        enabled = rule.conditional.evaluate(**boolstate) == rule.conditional_block
    except AttributeError:
        # non-conditional rules are always enabled
        enabled = True

    d['enabled'] = enabled

    try:
        d['permlist'] = list(map(str, rule.perms))
    except AttributeError:
        pass

    try:
        d['transtype'] = str(rule.default)
    except AttributeError:
        pass

    try:
        d['booleans'] = [(str(b), b.state) for b in rule.conditional.booleans]
    except AttributeError:
        pass

    try:
        d['conditional'] = str(rule.conditional)
    except AttributeError:
        pass

    try:
        d['filename'] = rule.filename
    except AttributeError:
        pass

    return d


def search(types, seinfo=None):
    if not seinfo:
        seinfo = {}
    valid_types = set([ALLOW, AUDITALLOW, NEVERALLOW, DONTAUDIT, TRANSITION, ROLE_ALLOW])
    for setype in types:
        if setype not in valid_types:
            raise ValueError("Type has to be in %s" % " ".join(valid_types))

    source = None
    if SOURCE in seinfo:
        source = str(seinfo[SOURCE])

    target = None
    if TARGET in seinfo:
        target = str(seinfo[TARGET])

    tclass = None
    if CLASS in seinfo:
        tclass = str(seinfo[CLASS]).split(',')

    toret = []

    tertypes = []
    if ALLOW in types:
        tertypes.append(ALLOW)
    if NEVERALLOW in types:
        tertypes.append(NEVERALLOW)
    if AUDITALLOW in types:
        tertypes.append(AUDITALLOW)
    if DONTAUDIT in types:
        tertypes.append(DONTAUDIT)

    if len(tertypes) > 0:
        q = setools.TERuleQuery(_pol,
                                ruletype=tertypes,
                                source=source,
                                target=target,
                                tclass=tclass)

        if PERMS in seinfo:
            q.perms = seinfo[PERMS]

        toret += [_setools_rule_to_dict(x) for x in q.results()]

    if TRANSITION in types:
        rtypes = ['type_transition', 'type_change', 'type_member']
        q = setools.TERuleQuery(_pol,
                                ruletype=rtypes,
                                source=source,
                                target=target,
                                tclass=tclass)

        if PERMS in seinfo:
            q.perms = seinfo[PERMS]

        toret += [_setools_rule_to_dict(x) for x in q.results()]

    if ROLE_ALLOW in types:
        ratypes = ['allow']
        q = setools.RBACRuleQuery(_pol,
                                  ruletype=ratypes,
                                  source=source,
                                  target=target,
                                  tclass=tclass)

        for r in q.results():
            toret.append({'source': str(r.source),
                          'target': str(r.target)})

    return toret
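
# Illustrative sketch: collect the file types a hypothetical httpd_t domain may
# open and write, assuming such a domain exists in the loaded policy.
def _example_search_writable():
    rules = search([ALLOW], {SOURCE: "httpd_t", CLASS: "file", PERMS: ["open", "write"]})
    return sorted(set(r["target"] for r in rules))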


def get_conditionals(src, dest, tclass, perm):
    tdict = {}
    tlist = []
    src_list = [src]
    dest_list = [dest]
    # add assigned attributes
    try:
        src_list += list(filter(lambda x: x['name'] == src, get_all_types_info()))[0]['attributes']
    except:
        pass
    try:
        dest_list += list(filter(lambda x: x['name'] == dest, get_all_types_info()))[0]['attributes']
    except:
        pass
    allows = map(lambda y: y, filter(lambda x:
                x['source'] in src_list and
                x['target'] in dest_list and
                set(perm).issubset(x[PERMS]) and
                'conditional' in x,
                get_all_allow_rules()))

    try:
        for i in allows:
            tdict.update({'source': i['source'], 'conditional': (i['conditional'], i['enabled'])})
            if tdict not in tlist:
                tlist.append(tdict)
                tdict = {}
    except KeyError:
        return(tlist)

    return (tlist)


def get_conditionals_format_text(cond):

    enabled = False
    for x in cond:
        if x['conditional'][1]:
            enabled = True
            break
    return _("-- Allowed %s [ %s ]") % (enabled, " || ".join(set(map(lambda x: "%s=%d" % (x['conditional'][0], x['conditional'][1]), cond))))


def get_types_from_attribute(attribute):
    return list(info(ATTRIBUTE, attribute))[0]["types"]


def get_file_types(setype):
    flist = []
    mpaths = {}
    for f in get_all_file_types():
        if f.startswith(gen_short_name(setype)):
            flist.append(f)
    fcdict = get_fcdict()
    for f in flist:
        try:
            mpaths[f] = (fcdict[f]["regex"], file_type_str[fcdict[f]["ftype"]])
        except KeyError:
            mpaths[f] = []
    return mpaths


def get_real_type_name(name):
    """Return the real name of a type

    * If 'name' refers to a type alias, return the corresponding type name.
    * Otherwise return the original name (even if the type does not exist).
    """
    if not name:
        return name

    try:
        return next(info(TYPE, name))["name"]
    except (RuntimeError, StopIteration):
        return name

def get_writable_files(setype):
    file_types = get_all_file_types()
    all_writes = []
    mpaths = {}
    permlist = search([ALLOW], {'source': setype, 'permlist': ['open', 'write'], 'class': 'file'})
    if permlist is None or len(permlist) == 0:
        return mpaths

    fcdict = get_fcdict()

    attributes = ["proc_type", "sysctl_type"]
    for i in permlist:
        if i['target'] in attributes:
            continue
        if "enabled" in i:
            if not i["enabled"]:
                continue
        if i['target'].endswith("_t"):
            if i['target'] not in file_types:
                continue
            if i['target'] not in all_writes:
                if i['target'] != setype:
                    all_writes.append(i['target'])
        else:
            for t in get_types_from_attribute(i['target']):
                if t not in all_writes:
                    all_writes.append(t)

    for f in all_writes:
        try:
            mpaths[f] = (fcdict[f]["regex"], file_type_str[fcdict[f]["ftype"]])
        except KeyError:
            mpaths[f] = []  # {"regex":[],"paths":[]}
    return mpaths
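
# Illustrative sketch: get_writable_files() maps each writable file type to its
# default path regexes; "httpd_t" is only an example domain name.
def _example_writable_files():
    return get_writable_files("httpd_t")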


def find_file(reg):
    if os.path.exists(reg):
        return [reg]
    try:
        pat = re.compile(r"%s$" % reg)
    except:
        print("bad reg:", reg)
        return []
    p = reg
    if p.endswith("(/.*)?"):
        p = p[:-6] + "/"

    path = os.path.dirname(p)

    try:                       # Bug fix: when "all files on system"
        if path[-1] != "/":    # is pass in it breaks without try block
            path += "/"
    except IndexError:
        print("try failed got an IndexError")
        pass

    try:
        pat = re.compile(r"%s$" % reg)
        return [x for x in map(lambda x: path + x, os.listdir(path)) if pat.match(x)]
    except:
        return []


def find_all_files(domain, exclude_list=[]):
    executable_files = get_entrypoints(domain)
    for exe in executable_files.keys():
        if exe.endswith("_exec_t") and exe not in exclude_list:
            for path in executable_files[exe]:
                for f in find_file(path):
                    return f
    return None


def find_entrypoint_path(exe, exclude_list=[]):
    fcdict = get_fcdict()
    try:
        if exe.endswith("_exec_t") and exe not in exclude_list:
            for path in fcdict[exe]["regex"]:
                for f in find_file(path):
                    return f
    except KeyError:
        pass
    return None


def read_file_equiv(edict, fc_path, modify):
    try:
        with open(fc_path, "r") as fd:
            for e in fd:
                f = e.split()
                if f and not f[0].startswith('#'):
                    edict[f[0]] = {"equiv": f[1], "modify": modify}
    except OSError as e:
        if e.errno != errno.ENOENT:
            raise
    return edict
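
# Illustrative sketch using a temporary file: each non-comment line of a
# .subs/.subs_dist file is "<path> <equivalent-path>"; the paths are made up.
def _example_read_file_equiv():
    import tempfile
    with tempfile.NamedTemporaryFile("w", suffix=".subs", delete=False) as tmp:
        tmp.write("/opt/myapp /usr\n")
    try:
        # -> {"/opt/myapp": {"equiv": "/usr", "modify": True}}
        return read_file_equiv({}, tmp.name, modify=True)
    finally:
        os.remove(tmp.name)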


def get_file_equiv_modified(fc_path=selinux.selinux_file_context_path()):
    global file_equiv_modified
    if file_equiv_modified:
        return file_equiv_modified
    file_equiv_modified = {}
    file_equiv_modified = read_file_equiv(file_equiv_modified, fc_path + ".subs", modify=True)
    return file_equiv_modified


def get_file_equiv(fc_path=selinux.selinux_file_context_path()):
    global file_equiv
    if file_equiv:
        return file_equiv
    file_equiv = get_file_equiv_modified(fc_path)
    file_equiv = read_file_equiv(file_equiv, fc_path + ".subs_dist", modify=False)
    return file_equiv


def get_local_file_paths(fc_path=selinux.selinux_file_context_path()):
    global local_files
    if local_files:
        return local_files
    local_files = []
    try:
        with open(fc_path + ".local", "r") as fd:
            fc = fd.readlines()
    except OSError as e:
        if e.errno != errno.ENOENT:
            raise
        return []
    for i in fc:
        rec = i.split()
        if len(rec) == 0:
            continue
        try:
            if len(rec) > 2:
                ftype = trans_file_type_str[rec[1]]
            else:
                ftype = "a"

            local_files.append((rec[0], ftype))
        except KeyError:
            pass
    return local_files


def get_fcdict(fc_path=selinux.selinux_file_context_path()):
    global fcdict
    if fcdict:
        return fcdict
    fd = open(fc_path, "r")
    fc = fd.readlines()
    fd.close()
    fd = open(fc_path + ".homedirs", "r")
    fc += fd.readlines()
    fd.close()
    fcdict = {}
    try:
        with open(fc_path + ".local", "r") as fd:
            fc += fd.readlines()
    except OSError as e:
        if e.errno != errno.ENOENT:
            raise

    for i in fc:
        rec = i.split()
        try:
            if len(rec) > 2:
                ftype = trans_file_type_str[rec[1]]
            else:
                ftype = "a"

            t = rec[-1].split(":")[2]
            if t in fcdict:
                fcdict[t]["regex"].append(rec[0])
            else:
                fcdict[t] = {"regex": [rec[0]], "ftype": ftype}
        except:
            pass

    fcdict["logfile"] = {"regex": ["all log files"]}
    fcdict["user_tmp_type"] = {"regex": ["all user tmp files"]}
    fcdict["user_home_type"] = {"regex": ["all user home files"]}
    fcdict["virt_image_type"] = {"regex": ["all virtual image files"]}
    fcdict["noxattrfs"] = {"regex": ["all files on file systems which do not support extended attributes"]}
    fcdict["sandbox_tmpfs_type"] = {"regex": ["all sandbox content in tmpfs file systems"]}
    fcdict["user_tmpfs_type"] = {"regex": ["all user content in tmpfs file systems"]}
    fcdict["file_type"] = {"regex": ["all files on the system"]}
    fcdict["samba_share_t"] = {"regex": ["use this label for random content that will be shared using samba"]}
    return fcdict
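
# Illustrative sketch: look up the default path regexes recorded for a file
# type; "httpd_log_t" is only an example name and may not exist in every policy.
def _example_fcdict_lookup():
    return get_fcdict().get("httpd_log_t", {}).get("regex", [])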


def get_transitions_into(setype):
    try:
        return [x for x in search([TRANSITION], {'class': 'process'}) if x["transtype"] == setype]
    except (TypeError, AttributeError):
        pass
    return None


def get_transitions(setype):
    try:
        return search([TRANSITION], {'source': setype, 'class': 'process'})
    except (TypeError, AttributeError):
        pass
    return None


def get_file_transitions(setype):
    try:
        return [x for x in search([TRANSITION], {'source': setype}) if x['class'] != "process"]
    except (TypeError, AttributeError):
        pass
    return None


def get_boolean_rules(setype, boolean):
    boollist = []
    permlist = search([ALLOW], {'source': setype})
    for p in permlist:
        if "booleans" in p:
            try:
                for b in p["booleans"]:
                    if boolean in b:
                        boollist.append(p)
            except:
                pass
    return boollist


def get_all_entrypoints():
    return get_types_from_attribute("entry_type")


def get_entrypoint_types(setype):
    q = setools.TERuleQuery(_pol,
                            ruletype=[ALLOW],
                            source=setype,
                            tclass=["file"],
                            perms=["entrypoint"])
    return [str(x.target) for x in q.results() if x.source == setype]


def get_init_transtype(path):
    entrypoint = selinux.getfilecon(path)[1].split(":")[2]
    try:
        entrypoints = list(filter(lambda x: x['target'] == entrypoint, search([TRANSITION], {'source': "init_t", 'class': 'process'})))
        return entrypoints[0]["transtype"]
    except (TypeError, AttributeError, IndexError):
        pass
    return None


def get_init_entrypoint(transtype):
    q = setools.TERuleQuery(_pol,
                            ruletype=["type_transition"],
                            source="init_t",
                            tclass=["process"])
    entrypoints = []
    for i in q.results():
        try:
            if i.default == transtype:
                entrypoints.append(i.target)
        except AttributeError:
            continue

    return entrypoints

def get_init_entrypoints_str():
    q = setools.TERuleQuery(_pol,
                            ruletype=["type_transition"],
                            source="init_t",
                            tclass=["process"])
    entrypoints = {}
    for i in q.results():
        try:
            transtype = str(i.default)
            if transtype in entrypoints:
                entrypoints[transtype].append(str(i.target))
            else:
                entrypoints[transtype] = [str(i.target)]
        except AttributeError:
            continue

    return entrypoints

def get_init_entrypoint_target(entrypoint):
    try:
        entrypoints = map(lambda x: x['transtype'], search([TRANSITION], {'source': "init_t", 'target': entrypoint, 'class': 'process'}))
        return list(entrypoints)[0]
    except (TypeError, IndexError):
        pass
    return None


def get_entrypoints(setype):
    fcdict = get_fcdict()
    mpaths = {}
    for f in get_entrypoint_types(setype):
        try:
            mpaths[f] = (fcdict[f]["regex"], file_type_str[fcdict[f]["ftype"]])
        except KeyError:
            mpaths[f] = []
    return mpaths


def get_methods():
    global methods
    if len(methods) > 0:
        return methods
    gen_interfaces()
    fn = defaults.interface_info()
    try:
        fd = open(fn)
    # List of per_role_template interfaces
        ifs = interfaces.InterfaceSet()
        ifs.from_file(fd)
        methods = list(ifs.interfaces.keys())
        fd.close()
    except:
        sys.stderr.write("could not open interface info [%s]\n" % fn)
        sys.exit(1)

    methods.sort()
    return methods


def get_all_types():
    global all_types
    if all_types is None:
        all_types = [x['name'] for x in info(TYPE)]
    return all_types

def get_all_types_info():
    global all_types_info
    if all_types_info is None:
        all_types_info = list(info(TYPE))
    return all_types_info

def get_user_types():
    global user_types
    if user_types is None:
        user_types = list(list(info(ATTRIBUTE, "userdomain"))[0]["types"])
    return user_types


def get_all_role_allows():
    global role_allows
    if role_allows:
        return role_allows
    role_allows = {}

    q = setools.RBACRuleQuery(_pol, ruletype=[ALLOW])
    for r in q.results():
        src = str(r.source)
        tgt = str(r.target)
        if src == "system_r" or tgt == "system_r":
            continue
        if src in role_allows:
            role_allows[src].append(tgt)
        else:
            role_allows[src] = [tgt]

    return role_allows


def get_all_entrypoint_domains():
    import re
    all_domains = []
    types = sorted(get_all_types())
    for i in types:
        m = re.findall("(.*)%s" % "_exec_t$", i)
        if len(m) > 0:
            if len(re.findall("(.*)%s" % "_initrc$", m[0])) == 0 and m[0] not in all_domains:
                all_domains.append(m[0])
    return all_domains


def gen_interfaces():
    try:
        from commands import getstatusoutput
    except ImportError:
        from subprocess import getstatusoutput
    ifile = defaults.interface_info()
    headers = defaults.headers()
    try:
        if os.stat(headers).st_mtime <= os.stat(ifile).st_mtime:
            return
    except OSError:
        pass

    if os.getuid() != 0:
        raise ValueError(_("You must regenerate interface info by running /usr/bin/sepolgen-ifgen"))
    print(getstatusoutput("/usr/bin/sepolgen-ifgen")[1])


def gen_port_dict():
    global portrecs
    global portrecsbynum
    if portrecs:
        return (portrecs, portrecsbynum)
    portrecsbynum = {}
    portrecs = {}
    for i in info(PORT):
        if i['low'] == i['high']:
            port = str(i['low'])
        else:
            port = "%s-%s" % (str(i['low']), str(i['high']))

        if (i['type'], i['protocol']) in portrecs:
            portrecs[(i['type'], i['protocol'])].append(port)
        else:
            portrecs[(i['type'], i['protocol'])] = [port]

        if 'range' in i:
            portrecsbynum[(i['low'], i['high'], i['protocol'])] = (i['type'], i['range'])
        else:
            portrecsbynum[(i['low'], i['high'], i['protocol'])] = (i['type'])

    return (portrecs, portrecsbynum)
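
# Illustrative sketch: the first mapping is keyed by (port type, protocol);
# e.g. ("http_port_t", "tcp") typically maps to ["80", "81", "443", ...],
# though the exact ports depend on the installed policy.
def _example_port_lookup():
    portrecs, portrecsbynum = gen_port_dict()
    return portrecs.get(("http_port_t", "tcp"), [])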


def get_all_domains():
    global all_domains
    if not all_domains:
        all_domains = list(list(info(ATTRIBUTE, "domain"))[0]["types"])
    return all_domains


def get_all_roles():
    global roles
    if roles:
        return roles

    q = setools.RoleQuery(_pol)
    roles = [str(x) for x in q.results() if str(x) != "object_r"]
    return roles


def get_selinux_users():
    global selinux_user_list
    if not selinux_user_list:
        selinux_user_list = list(info(USER))
        if _pol.mls:
            for x in selinux_user_list:
                x['range'] = "".join(x['range'].split(" "))
    return selinux_user_list


def get_login_mappings():
    global login_mappings
    if login_mappings:
        return login_mappings

    fd = open(selinux.selinux_usersconf_path(), "r")
    buf = fd.read()
    fd.close()
    login_mappings = []
    for b in buf.split("\n"):
        b = b.strip()
        if len(b) == 0 or b.startswith("#"):
            continue
        x = b.split(":")
        login_mappings.append({"name": x[0], "seuser": x[1], "mls": ":".join(x[2:])})
    return login_mappings


def get_all_users():
    return sorted(map(lambda x: x['name'], get_selinux_users()))


def get_all_file_types():
    global file_types
    if file_types:
        return file_types
    file_types = list(sorted(info(ATTRIBUTE, "file_type"))[0]["types"])
    return file_types


def get_all_port_types():
    global port_types
    if port_types:
        return port_types
    port_types = list(sorted(info(ATTRIBUTE, "port_type"))[0]["types"])
    return port_types


def get_all_bools():
    global bools
    if not bools:
        bools = list(info(BOOLEAN))
    return bools


def prettyprint(f, trim):
    return " ".join(f[:-len(trim)].split("_"))


def markup(f):
    return f


def get_description(f, markup=markup):

    txt = "Set files with the %s type, if you want to " % markup(f)

    if f.endswith("_var_run_t"):
        return txt + "store the %s files under the /run or /var/run directory." % prettyprint(f, "_var_run_t")
    if f.endswith("_pid_t"):
        return txt + "store the %s files under the /run directory." % prettyprint(f, "_pid_t")
    if f.endswith("_var_lib_t"):
        return txt + "store the %s files under the /var/lib directory." % prettyprint(f, "_var_lib_t")
    if f.endswith("_var_t"):
        return txt + "store the %s files under the /var directory." % prettyprint(f, "_var_lib_t")
    if f.endswith("_var_spool_t"):
        return txt + "store the %s files under the /var/spool directory." % prettyprint(f, "_spool_t")
    if f.endswith("_spool_t"):
        return txt + "store the %s files under the /var/spool directory." % prettyprint(f, "_spool_t")
    if f.endswith("_cache_t") or f.endswith("_var_cache_t"):
        return txt + "store the files under the /var/cache directory."
    if f.endswith("_keytab_t"):
        return txt + "treat the files as kerberos keytab files."
    if f.endswith("_lock_t"):
        return txt + "treat the files as %s lock data, stored under the /var/lock directory" % prettyprint(f, "_lock_t")
    if f.endswith("_log_t"):
        return txt + "treat the data as %s log data, usually stored under the /var/log directory." % prettyprint(f, "_log_t")
    if f.endswith("_config_t"):
        return txt + "treat the files as %s configuration data, usually stored under the /etc directory." % prettyprint(f, "_config_t")
    if f.endswith("_conf_t"):
        return txt + "treat the files as %s configuration data, usually stored under the /etc directory." % prettyprint(f, "_conf_t")
    if f.endswith("_exec_t"):
        return txt + "transition an executable to the %s_t domain." % f[:-len("_exec_t")]
    if f.endswith("_cgi_content_t"):
        return txt + "treat the files as %s cgi content." % prettyprint(f, "_cgi_content_t")
    if f.endswith("_rw_content_t"):
        return txt + "treat the files as %s read/write content." % prettyprint(f, "_rw_content_t")
    if f.endswith("_rw_t"):
        return txt + "treat the files as %s read/write content." % prettyprint(f, "_rw_t")
    if f.endswith("_write_t"):
        return txt + "treat the files as %s read/write content." % prettyprint(f, "_write_t")
    if f.endswith("_db_t"):
        return txt + "treat the files as %s database content." % prettyprint(f, "_db_t")
    if f.endswith("_ra_content_t"):
        return txt + "treat the files as %s read/append content." % prettyprint(f, "_ra_content_t")
    if f.endswith("_cert_t"):
        return txt + "treat the files as %s certificate data." % prettyprint(f, "_cert_t")
    if f.endswith("_key_t"):
        return txt + "treat the files as %s key data." % prettyprint(f, "_key_t")

    if f.endswith("_secret_t"):
        return txt + "treat the files as %s secret data." % prettyprint(f, "_key_t")

    if f.endswith("_ra_t"):
        return txt + "treat the files as %s read/append content." % prettyprint(f, "_ra_t")

    if f.endswith("_ro_t"):
        return txt + "treat the files as %s read/only content." % prettyprint(f, "_ro_t")

    if f.endswith("_modules_t"):
        return txt + "treat the files as %s modules." % prettyprint(f, "_modules_t")

    if f.endswith("_content_t"):
        return txt + "treat the files as %s content." % prettyprint(f, "_content_t")

    if f.endswith("_state_t"):
        return txt + "treat the files as %s state data." % prettyprint(f, "_state_t")

    if f.endswith("_files_t"):
        return txt + "treat the files as %s content." % prettyprint(f, "_files_t")

    if f.endswith("_file_t"):
        return txt + "treat the files as %s content." % prettyprint(f, "_file_t")

    if f.endswith("_data_t"):
        return txt + "treat the files as %s content." % prettyprint(f, "_data_t")

    if f.endswith("_file_t"):
        return txt + "treat the data as %s content." % prettyprint(f, "_file_t")

    if f.endswith("_tmp_t"):
        return txt + "store %s temporary files in the /tmp directories." % prettyprint(f, "_tmp_t")
    if f.endswith("_etc_t"):
        return txt + "store %s files in the /etc directories." % prettyprint(f, "_tmp_t")
    if f.endswith("_home_t"):
        return txt + "store %s files in the users home directory." % prettyprint(f, "_home_t")
    if f.endswith("_tmpfs_t"):
        return txt + "store %s files on a tmpfs file system." % prettyprint(f, "_tmpfs_t")
    if f.endswith("_unit_file_t"):
        return txt + "treat files as a systemd unit file."
    if f.endswith("_htaccess_t"):
        return txt + "treat the file as a %s access file." % prettyprint(f, "_htaccess_t")

    return txt + "treat the files as %s data." % prettyprint(f, "_t")


def get_all_attributes():
    global all_attributes
    if not all_attributes:
        all_attributes = list(sorted(map(lambda x: x['name'], info(ATTRIBUTE))))
    return all_attributes


def _dict_has_perms(dict, perms):
    for perm in perms:
        if perm not in dict[PERMS]:
            return False
    return True


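# Derive (domainname, short_name) from a type, e.g. gen_short_name("httpd_t")
# returns ("httpd", "http_"); a trailing "d" is dropped so boolean name prefixes
# such as "http_" still match.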
def gen_short_name(setype):
    all_domains = get_all_domains()
    if setype.endswith("_t"):
        # replace aliases with corresponding types
        setype = get_real_type_name(setype)
        domainname = setype[:-2]
    else:
        domainname = setype
    if domainname + "_t" not in all_domains:
        raise ValueError("domain %s_t does not exist" % domainname)
    if domainname[-1] == 'd':
        short_name = domainname[:-1] + "_"
    else:
        short_name = domainname + "_"
    return (domainname, short_name)

def get_all_allow_rules():
    global all_allow_rules
    if not all_allow_rules:
        all_allow_rules = search([ALLOW])
    return all_allow_rules

def get_all_bool_rules():
    global all_bool_rules
    if not all_bool_rules:
        q = setools.TERuleQuery(_pol, boolean=".*", boolean_regex=True,
                                ruletype=[ALLOW, DONTAUDIT])
        all_bool_rules = [_setools_rule_to_dict(x) for x in q.results()]
    return all_bool_rules

def get_all_transitions():
    global all_transitions
    if not all_transitions:
        all_transitions = list(search([TRANSITION]))
    return all_transitions

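# Return (domainbools, bools): (name, enabled) pairs for booleans referenced by
# allow/dontaudit rules whose source is setype, split into booleans sharing the
# domain's name prefix and all others.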
def get_bools(setype):
    bools = []
    domainbools = []
    domainname, short_name = gen_short_name(setype)
    for i in map(lambda x: x['booleans'], filter(lambda x: 'booleans' in x and x['source'] == setype, get_all_bool_rules())):
        for b in i:
            if not isinstance(b, tuple):
                continue
            try:
                enabled = selinux.security_get_boolean_active(b[0])
            except OSError:
                enabled = b[1]
            if b[0].startswith(short_name) or b[0].startswith(domainname):
                if (b[0], enabled) not in domainbools and (b[0], not enabled) not in domainbools:
                    domainbools.append((b[0], enabled))
            else:
                if (b[0], enabled) not in bools and (b[0], not enabled) not in bools:
                    bools.append((b[0], enabled))
    return (domainbools, bools)


def get_all_booleans():
    global booleans
    if not booleans:
        booleans = selinux.security_get_boolean_names()[1]
    return booleans


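# Read policy.xml, trying a gzip-compressed copy first and falling back to
# plain text.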
def policy_xml(path="/usr/share/selinux/devel/policy.xml"):
    try:
        fd = gzip.open(path)
        buf = fd.read()
        fd.close()
    except IOError:
        fd = open(path)
        buf = fd.read()
        fd.close()
    return buf


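# Parse policy.xml into a cached dict mapping each boolean/tunable name to
# (module name, default value, description).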
def gen_bool_dict(path="/usr/share/selinux/devel/policy.xml"):
    global booleans_dict
    if booleans_dict:
        return booleans_dict
    import xml.etree.ElementTree
    booleans_dict = {}
    try:
        tree = xml.etree.ElementTree.fromstring(policy_xml(path))
        for l in tree.findall("layer"):
            for m in l.findall("module"):
                for b in m.findall("tunable"):
                    desc = b.find("desc").find("p").text.strip("\n")
                    desc = re.sub("\n", " ", desc)
                    booleans_dict[b.get('name')] = (m.get("name"), b.get('dftval'), desc)
                for b in m.findall("bool"):
                    desc = b.find("desc").find("p").text.strip("\n")
                    desc = re.sub("\n", " ", desc)
                    booleans_dict[b.get('name')] = (m.get("name"), b.get('dftval'), desc)
            for i in tree.findall("bool"):
                desc = i.find("desc").find("p").text.strip("\n")
                desc = re.sub("\n", " ", desc)
                booleans_dict[i.get('name')] = ("global", i.get('dftval'), desc)
        for i in tree.findall("tunable"):
            desc = i.find("desc").find("p").text.strip("\n")
            desc = re.sub("\n", " ", desc)
            booleans_dict[i.get('name')] = ("global", i.get('dftval'), desc)
    except IOError:
        pass
    return booleans_dict


def boolean_category(boolean):
    booleans_dict = gen_bool_dict()
    if boolean in booleans_dict:
        return _(booleans_dict[boolean][0])
    else:
        return _("unknown")


def boolean_desc(boolean):
    booleans_dict = gen_bool_dict()
    if boolean in booleans_dict:
        return _(booleans_dict[boolean][2])
    else:
        desc = boolean.split("_")
        return "Allow %s to %s" % (desc[0], " ".join(desc[1:]))


def get_os_version():
    system_release = ""
    try:
        with open('/etc/system-release') as f:
            system_release = f.readline().rstrip()
    except IOError:
        system_release = "Misc"

    return system_release


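# Drop every cached query result so subsequent calls re-read the loaded policy.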
def reinit():
    global all_attributes
    global all_domains
    global all_types
    global booleans
    global booleans_dict
    global bools
    global fcdict
    global file_types
    global local_files
    global methods
    global portrecs
    global portrecsbynum
    global port_types
    global role_allows
    global roles
    global login_mappings
    global selinux_user_list
    global user_types
    all_attributes = None
    all_domains = None
    all_types = None
    booleans = None
    booleans_dict = None
    bools = None
    fcdict = None
    file_types = None
    local_files = None
    methods = None
    portrecs = None
    portrecsbynum = None
    port_types = None
    role_allows = None
    roles = None
    user_types = None
    login_mappings = None
    selinux_user_list = None
site-packages/sepolicy/sedbus.py
import sys
import dbus
import dbus.service
import dbus.mainloop.glib
from slip.dbus import polkit


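# Thin client-side proxy for the org.selinux system D-Bus service; each method
# forwards its arguments to the matching call on the org.selinux interface and
# relies on polkit for authorization.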
class SELinuxDBus (object):

    def __init__(self):
        self.bus = dbus.SystemBus()
        self.dbus_object = self.bus.get_object("org.selinux", "/org/selinux/object")

    @polkit.enable_proxy
    def semanage(self, buf):
        ret = self.dbus_object.semanage(buf, dbus_interface="org.selinux")
        return ret

    @polkit.enable_proxy
    def restorecon(self, path):
        ret = self.dbus_object.restorecon(path, dbus_interface="org.selinux")
        return ret

    @polkit.enable_proxy
    def setenforce(self, value):
        ret = self.dbus_object.setenforce(value, dbus_interface="org.selinux")
        return ret

    @polkit.enable_proxy
    def customized(self):
        ret = self.dbus_object.customized(dbus_interface="org.selinux")
        return ret

    @polkit.enable_proxy
    def semodule_list(self):
        ret = self.dbus_object.semodule_list(dbus_interface="org.selinux")
        return ret

    @polkit.enable_proxy
    def relabel_on_boot(self, value):
        ret = self.dbus_object.relabel_on_boot(value, dbus_interface="org.selinux")
        return ret

    @polkit.enable_proxy
    def change_default_mode(self, value):
        ret = self.dbus_object.change_default_mode(value, dbus_interface="org.selinux")
        return ret

    @polkit.enable_proxy
    def change_default_policy(self, value):
        ret = self.dbus_object.change_default_policy(value, dbus_interface="org.selinux")
        return ret

if __name__ == "__main__":
    try:
        dbus_proxy = SELinuxDBus()
        resp = dbus_proxy.setenforce(int(sys.argv[1]))
        print(resp)
    except dbus.DBusException as e:
        print(e)
site-packages/sepolicy/network.py
# Copyright (C) 2012 Red Hat
# see file 'COPYING' for use and warranty information
#
# setrans is a tool for analyzing process transitions in SELinux policy
#
#    This program is free software; you can redistribute it and/or
#    modify it under the terms of the GNU General Public License as
#    published by the Free Software Foundation; either version 2 of
#    the License, or (at your option) any later version.
#
#    This program is distributed in the hope that it will be useful,
#    but WITHOUT ANY WARRANTY; without even the implied warranty of
#    MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
#    GNU General Public License for more details.
#
#    You should have received a copy of the GNU General Public License
#    along with this program; if not, write to the Free Software
#    Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA
#                                        02111-1307  USA
#
#
import sepolicy


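# Return the target types that 'src' is allowed 'perm' on class 'tclass',
# optionally skipping rules whose gating boolean is currently disabled.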
def get_types(src, tclass, perm, check_bools=False):
    allows = sepolicy.search([sepolicy.ALLOW], {sepolicy.SOURCE: src, sepolicy.CLASS: tclass, sepolicy.PERMS: perm})
    nlist = []
    if allows:
        for i in map(lambda y: y[sepolicy.TARGET], filter(lambda x: set(perm).issubset(x[sepolicy.PERMS]) and (not check_bools or x["enabled"]), allows)):
            if i not in nlist:
                nlist.append(i)
    return nlist


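# Build a dict keyed by (src, protocol, perm) whose value lists the port types
# the source domain may use, each paired with a description or the matching
# port numbers from gen_port_dict().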
def get_network_connect(src, protocol, perm, check_bools=False):
    portrecs, portrecsbynum = sepolicy.gen_port_dict()
    d = {}
    tlist = get_types(src, "%s_socket" % protocol, [perm], check_bools)
    if len(tlist) > 0:
        d[(src, protocol, perm)] = []
        for i in tlist:
            if i == "ephemeral_port_type":
                if "unreserved_port_type" in tlist:
                    continue
                i = "ephemeral_port_t"
            if i == "unreserved_port_t":
                if "unreserved_port_type" in tlist:
                    continue
                if "port_t" in tlist:
                    continue
            if i == "port_t":
                d[(src, protocol, perm)].append((i, ["all ports with out defined types"]))
            if i == "port_type":
                d[(src, protocol, perm)].append((i, ["all ports"]))
            elif i == "unreserved_port_type":
                d[(src, protocol, perm)].append((i, ["all ports > 1024"]))
            elif i == "reserved_port_type":
                d[(src, protocol, perm)].append((i, ["all ports < 1024"]))
            elif i == "rpc_port_type":
                d[(src, protocol, perm)].append((i, ["all ports > 500 and  < 1024"]))
            else:
                try:
                    d[(src, protocol, perm)].append((i, portrecs[(i, protocol)]))
                except KeyError:
                    pass
    return d
site-packages/sepolicy/generate.py
# Copyright (C) 2007-2012 Red Hat
# see file 'COPYING' for use and warranty information
#
# policygentool is a tool for the initial generation of SELinux policy
#
#    This program is free software; you can redistribute it and/or
#    modify it under the terms of the GNU General Public License as
#    published by the Free Software Foundation; either version 2 of
#    the License, or (at your option) any later version.
#
#    This program is distributed in the hope that it will be useful,
#    but WITHOUT ANY WARRANTY; without even the implied warranty of
#    MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
#    GNU General Public License for more details.
#
#    You should have received a copy of the GNU General Public License
#    along with this program; if not, write to the Free Software
#    Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA
#                                        02111-1307  USA
#
#
import os
import sys
import stat
import re
import sepolicy
from sepolicy import get_all_types, get_all_attributes, get_all_roles
import time
import platform

from .templates import executable
from .templates import boolean
from .templates import etc_rw
from .templates import unit_file
from .templates import var_cache
from .templates import var_spool
from .templates import var_lib
from .templates import var_log
from .templates import var_run
from .templates import tmp
from .templates import rw
from .templates import network
from .templates import script
from .templates import spec
from .templates import user
import sepolgen.interfaces as interfaces
import sepolgen.defaults as defaults

##
## I18N
##
PROGNAME = "selinux-python"
try:
    import gettext
    kwargs = {}
    if sys.version_info < (3,):
        kwargs['unicode'] = True
    gettext.install(PROGNAME,
                    localedir="/usr/share/locale",
                    codeset='utf-8',
                    **kwargs)
except:
    try:
        import builtins
        builtins.__dict__['_'] = str
    except ImportError:
        import __builtin__
        __builtin__.__dict__['_'] = unicode


def get_rpm_nvr_from_header(hdr):
    'Given an RPM header return the package NVR as a string'
    name = hdr['name']
    version = hdr['version']
    release = hdr['release']
    release_version = version + "-" + release.split(".")[0]
    os_version = release.split(".")[1]

    return [name, release_version, os_version]


def get_rpm_nvr_list(package):
    try:
        import rpm
        nvr = None
        ts = rpm.ts()
        mi = ts.dbMatch(rpm.RPMTAG_NAME, package)
        for h in mi:
            nvr = get_rpm_nvr_from_header(h)
            break
    except:
        print(("Failed to retrieve rpm info for %s") % package)
        nvr = None

    return nvr


def get_all_ports():
    dict = {}
    for p in sepolicy.info(sepolicy.PORT):
        if p['type'] == "reserved_port_t" or \
                p['type'] == "port_t" or \
                p['type'] == "hi_reserved_port_t" or \
                p['type'] == "ephemeral_port_t" or \
                p['type'] == "unreserved_port_t":
            continue
        dict[(p['low'], p['high'], p['protocol'])] = (p['type'], p.get('range'))
    return dict


def get_all_users():
    users = [x['name'] for x in sepolicy.info(sepolicy.USER)]
    users.remove("system_u")
    users.remove("root")
    users.sort()
    return users

ALL = 0
RESERVED = 1
UNRESERVED = 2
PORTS = 3
ADMIN_TRANSITION_INTERFACE = "_admin$"
USER_TRANSITION_INTERFACE = "_role$"

DAEMON = 0
DBUS = 1
INETD = 2
CGI = 3
SANDBOX = 4
USER = 5
EUSER = 6
TUSER = 7
XUSER = 8
LUSER = 9
AUSER = 10
RUSER = 11
NEWTYPE = 12

poltype = {}
poltype[DAEMON] = _("Standard Init Daemon")
poltype[DBUS] = _("DBUS System Daemon")
poltype[INETD] = _("Internet Services Daemon")
poltype[CGI] = _("Web Application/Script (CGI)")
poltype[SANDBOX] = _("Sandbox")
poltype[USER] = _("User Application")
poltype[EUSER] = _("Existing Domain Type")
poltype[TUSER] = _("Minimal Terminal Login User Role")
poltype[XUSER] = _("Minimal X Windows Login User Role")
poltype[LUSER] = _("Desktop Login User Role")
poltype[AUSER] = _("Administrator Login User Role")
poltype[RUSER] = _("Confined Root Administrator Role")
poltype[NEWTYPE] = _("Module information for a new type")


def get_poltype_desc():
    keys = sorted(poltype.keys())
    msg = _("Valid Types:\n")
    for k in keys:
        msg += "%2s: %s\n" % (k, poltype[k])
    return msg

APPLICATIONS = [DAEMON, DBUS, INETD, USER, CGI]
USERS = [XUSER, TUSER, LUSER, AUSER, RUSER]


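# Expand a comma-separated list of ports and ranges into individual port numbers,
# e.g. verify_ports("80,8000-8002") -> [80, 8000, 8001, 8002].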
def verify_ports(ports):
    if ports == "":
        return []
    max_port = 2 ** 16
    try:
        temp = []
        for a in ports.split(","):
            r = a.split("-")
            if len(r) > 2:
                raise ValueError
            if len(r) == 1:
                begin = int(r[0])
                end = int(r[0])
            else:
                begin = int(r[0])
                end = int(r[1])

                if begin > end:
                    raise ValueError

            for p in range(begin, end + 1):
                if p < 1 or p > max_port:
                    raise ValueError
                temp.append(p)
        return temp
    except ValueError:
        raise ValueError(_("Ports must be numbers or ranges of numbers from 1 to %d ") % max_port)


class policy:

    def __init__(self, name, type):
        self.rpms = []
        self.ports = {}
        self.all_roles = get_all_roles()
        self.types = []

        if type not in poltype:
            raise ValueError(_("You must enter a valid policy type"))

        if not name:
            raise ValueError(_("You must enter a name for your policy module for your '%s'.") % poltype[type])
        try:
            self.ports = get_all_ports()
        except ValueError as e:
            print("Can not get port types, must be root for this information")
        except RuntimeError as e:
            print("Can not get port types", e)

        self.symbols = {}
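        # Map of symbol names (e.g. found when scanning the target executable) to
        # policy-object method calls.  Duplicate keys such as "openlog" and "pam_"
        # overwrite earlier entries, so only the last assignment takes effect.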
        self.symbols["openlog"] = "set_use_kerberos(True)"
        self.symbols["openlog"] = "set_use_kerb_rcache(True)"
        self.symbols["openlog"] = "set_use_syslog(True)"
        self.symbols["gethostby"] = "set_use_resolve(True)"
        self.symbols["getaddrinfo"] = "set_use_resolve(True)"
        self.symbols["getnameinfo"] = "set_use_resolve(True)"
        self.symbols["krb"] = "set_use_kerberos(True)"
        self.symbols["gss_accept_sec_context"] = "set_manage_krb5_rcache(True)"
        self.symbols["krb5_verify_init_creds"] = "set_manage_krb5_rcache(True)"
        self.symbols["krb5_rd_req"] = "set_manage_krb5_rcache(True)"
        self.symbols["__syslog_chk"] = "set_use_syslog(True)"
        self.symbols["getpwnam"] = "set_use_uid(True)"
        self.symbols["getpwuid"] = "set_use_uid(True)"
        self.symbols["dbus_"] = "set_use_dbus(True)"
        self.symbols["pam_"] = "set_use_pam(True)"
        self.symbols["pam_"] = "set_use_audit(True)"
        self.symbols["fork"] = "add_process('fork')"
        self.symbols["transition"] = "add_process('transition')"
        self.symbols["sigchld"] = "add_process('sigchld')"
        self.symbols["sigkill"] = "add_process('sigkill')"
        self.symbols["sigstop"] = "add_process('sigstop')"
        self.symbols["signull"] = "add_process('signull')"
        self.symbols["ptrace"] = "add_process('ptrace')"
        self.symbols["getsched"] = "add_process('getsched')"
        self.symbols["setsched"] = "add_process('setsched')"
        self.symbols["getsession"] = "add_process('getsession')"
        self.symbols["getpgid"] = "add_process('getpgid')"
        self.symbols["setpgid"] = "add_process('setpgid')"
        self.symbols["getcap"] = "add_process('getcap')"
        self.symbols["setcap"] = "add_process('setcap')"
        self.symbols["share"] = "add_process('share')"
        self.symbols["getattr"] = "add_process('getattr')"
        self.symbols["setexec"] = "add_process('setexec')"
        self.symbols["setfscreate"] = "add_process('setfscreate')"
        self.symbols["noatsecure"] = "add_process('noatsecure')"
        self.symbols["siginh"] = "add_process('siginh')"
        self.symbols["kill"] = "add_process('signal_perms')"
        self.symbols["setrlimit"] = "add_process('setrlimit')"
        self.symbols["rlimitinh"] = "add_process('rlimitinh')"
        self.symbols["dyntransition"] = "add_process('dyntransition')"
        self.symbols["setcurrent"] = "add_process('setcurrent')"
        self.symbols["execmem"] = "add_process('execmem')"
        self.symbols["execstack"] = "add_process('execstack')"
        self.symbols["execheap"] = "add_process('execheap')"
        self.symbols["setkeycreate"] = "add_process('setkeycreate')"
        self.symbols["setsockcreate"] = "add_process('setsockcreate')"

        self.symbols["chown"] = "add_capability('chown')"
        self.symbols["dac_override"] = "add_capability('dac_override')"
        self.symbols["dac_read_search"] = "add_capability('dac_read_search')"
        self.symbols["fowner"] = "add_capability('fowner')"
        self.symbols["fsetid"] = "add_capability('fsetid')"
        self.symbols["setgid"] = "add_capability('setgid')"
        self.symbols["setegid"] = "add_capability('setgid')"
        self.symbols["setresgid"] = "add_capability('setgid')"
        self.symbols["setregid"] = "add_capability('setgid')"
        self.symbols["setresuid"] = "add_capability('setuid')"
        self.symbols["setuid"] = "add_capability('setuid')"
        self.symbols["seteuid"] = "add_capability('setuid')"
        self.symbols["setreuid"] = "add_capability('setuid')"
        self.symbols["setresuid"] = "add_capability('setuid')"
        self.symbols["setpcap"] = "add_capability('setpcap')"
        self.symbols["linux_immutable"] = "add_capability('linux_immutable')"
        self.symbols["net_bind_service"] = "add_capability('net_bind_service')"
        self.symbols["net_broadcast"] = "add_capability('net_broadcast')"
        self.symbols["net_admin"] = "add_capability('net_admin')"
        self.symbols["net_raw"] = "add_capability('net_raw')"
        self.symbols["ipc_lock"] = "add_capability('ipc_lock')"
        self.symbols["ipc_owner"] = "add_capability('ipc_owner')"
        self.symbols["sys_module"] = "add_capability('sys_module')"
        self.symbols["sys_rawio"] = "add_capability('sys_rawio')"
        self.symbols["chroot"] = "add_capability('sys_chroot')"
        self.symbols["sys_chroot"] = "add_capability('sys_chroot')"
        self.symbols["sys_ptrace"] = "add_capability('sys_ptrace')"
        self.symbols["sys_pacct"] = "add_capability('sys_pacct')"
        self.symbols["mount"] = "add_capability('sys_admin')"
        self.symbols["unshare"] = "add_capability('sys_admin')"
        self.symbols["sys_admin"] = "add_capability('sys_admin')"
        self.symbols["sys_boot"] = "add_capability('sys_boot')"
        self.symbols["sys_nice"] = "add_capability('sys_nice')"
        self.symbols["sys_resource"] = "add_capability('sys_resource')"
        self.symbols["sys_time"] = "add_capability('sys_time')"
        self.symbols["sys_tty_config"] = "add_capability('sys_tty_config')"
        self.symbols["mknod"] = "add_capability('mknod')"
        self.symbols["lease"] = "add_capability('lease')"
        self.symbols["audit_write"] = "add_capability('audit_write')"
        self.symbols["audit_control"] = "add_capability('audit_control')"
        self.symbols["setfcap"] = "add_capability('setfcap')"

        self.DEFAULT_DIRS = {}
        self.DEFAULT_DIRS["/etc"] = ["etc_rw", [], etc_rw]
        self.DEFAULT_DIRS["/tmp"] = ["tmp", [], tmp]
        self.DEFAULT_DIRS["rw"] = ["rw", [], rw]
        self.DEFAULT_DIRS["/usr/lib/systemd/system"] = ["unit_file", [], unit_file]
        self.DEFAULT_DIRS["/lib/systemd/system"] = ["unit_file", [], unit_file]
        self.DEFAULT_DIRS["/etc/systemd/system"] = ["unit_file", [], unit_file]
        self.DEFAULT_DIRS["/var/cache"] = ["var_cache", [], var_cache]
        self.DEFAULT_DIRS["/var/lib"] = ["var_lib", [], var_lib]
        self.DEFAULT_DIRS["/var/log"] = ["var_log", [], var_log]
        self.DEFAULT_DIRS["/var/run"] = ["var_run", [], var_run]
        self.DEFAULT_DIRS["/var/spool"] = ["var_spool", [], var_spool]

        self.DEFAULT_EXT = {}
        self.DEFAULT_EXT["_tmp_t"] = tmp
        self.DEFAULT_EXT["_unit_file_t"] = unit_file
        self.DEFAULT_EXT["_var_cache_t"] = var_cache
        self.DEFAULT_EXT["_var_lib_t"] = var_lib
        self.DEFAULT_EXT["_var_log_t"] = var_log
        self.DEFAULT_EXT["_var_run_t"] = var_run
        self.DEFAULT_EXT["_var_spool_t"] = var_spool
        self.DEFAULT_EXT["_port_t"] = network

        self.DEFAULT_KEYS = ["/etc", "/var/cache", "/var/log", "/tmp", "rw", "/var/lib", "/var/run", "/var/spool", "/etc/systemd/system", "/usr/lib/systemd/system", "/lib/systemd/system"]

        self.DEFAULT_TYPES = (
            (self.generate_daemon_types, self.generate_daemon_rules),
            (self.generate_dbusd_types, self.generate_dbusd_rules),
            (self.generate_inetd_types, self.generate_inetd_rules),
            (self.generate_cgi_types, self.generate_cgi_rules),
            (self.generate_sandbox_types, self.generate_sandbox_rules),
            (self.generate_userapp_types, self.generate_userapp_rules),
            (self.generate_existing_user_types, self.generate_existing_user_rules),
            (self.generate_min_login_user_types, self.generate_login_user_rules),
            (self.generate_x_login_user_types, self.generate_x_login_user_rules),
            (self.generate_login_user_types, self.generate_login_user_rules),
            (self.generate_admin_user_types, self.generate_login_user_rules),
            (self.generate_root_user_types, self.generate_root_user_rules),
            (self.generate_new_types, self.generate_new_rules))
        if not re.match(r"^[a-zA-Z0-9-_]+$", name):
            raise ValueError(_("Name must be alpha numberic with no spaces. Consider using option \"-n MODULENAME\""))

        if type == CGI:
            self.name = "httpd_%s_script" % name
        else:
            self.name = name

        self.file_name = name

        self.capabilities = []
        self.processes = []
        self.type = type
        self.initscript = ""
        self.program = None
        self.in_tcp = [False, False, False, []]
        self.in_udp = [False, False, False, []]
        self.out_tcp = [False, False, False, []]
        self.out_udp = [False, False, False, []]
        self.use_resolve = False
        self.use_tmp = False
        self.use_uid = False
        self.use_syslog = False
        self.use_kerberos = False
        self.manage_krb5_rcache = False
        self.use_pam = False
        self.use_dbus = False
        self.use_audit = False
        self.use_etc = self.type not in [EUSER, NEWTYPE]
        self.use_localization = self.type not in [EUSER, NEWTYPE]
        self.use_fd = self.type not in [EUSER, NEWTYPE]
        self.use_terminal = False
        self.use_mail = False
        self.booleans = {}
        self.files = {}
        self.dirs = {}
        self.found_tcp_ports = []
        self.found_udp_ports = []
        self.need_tcp_type = False
        self.need_udp_type = False
        self.admin_domains = []
        self.existing_domains = []
        self.transition_domains = []
        self.transition_users = []
        self.roles = []

    def __isnetset(self, l):
        return l[ALL] or l[RESERVED] or l[UNRESERVED] or len(l[PORTS]) > 0

    def set_admin_domains(self, admin_domains):
        self.admin_domains = admin_domains

    def set_existing_domains(self, existing_domains):
        self.existing_domains = existing_domains

    def set_admin_roles(self, roles):
        self.roles = roles

    def set_transition_domains(self, transition_domains):
        self.transition_domains = transition_domains

    def set_transition_users(self, transition_users):
        self.transition_users = transition_users

    def use_in_udp(self):
        return self.__isnetset(self.in_udp)

    def use_out_udp(self):
        return self.__isnetset(self.out_udp)

    def use_udp(self):
        return self.use_in_udp() or self.use_out_udp()

    def use_in_tcp(self):
        return self.__isnetset(self.in_tcp)

    def use_out_tcp(self):
        return self.__isnetset(self.out_tcp)

    def use_tcp(self):
        return self.use_in_tcp() or self.use_out_tcp()

    def use_network(self):
        return self.use_tcp() or self.use_udp()

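    # Look up the (type, mls range) record for a port number and protocol in the
    # ranges collected by get_all_ports(); returns None for unlabeled ports.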
    def find_port(self, port, protocol="tcp"):
        for begin, end, p in self.ports.keys():
            if port >= begin and port <= end and protocol == p:
                return self.ports[begin, end, protocol]
        return None

    def set_program(self, program):
        if self.type not in APPLICATIONS:
            raise ValueError(_("User Role types can not be assigned executables."))

        self.program = program

    def set_init_script(self, initscript):
        if self.type != DAEMON:
            raise ValueError(_("Only Daemon apps can use an init script.."))

        self.initscript = initscript

    def set_in_tcp(self, all, reserved, unreserved, ports):
        self.in_tcp = [all, reserved, unreserved, verify_ports(ports)]

    def set_in_udp(self, all, reserved, unreserved, ports):
        self.in_udp = [all, reserved, unreserved, verify_ports(ports)]

    def set_out_tcp(self, all, ports):
        self.out_tcp = [all, False, False, verify_ports(ports)]

    def set_out_udp(self, all, ports):
        self.out_udp = [all, False, False, verify_ports(ports)]

    def set_use_resolve(self, val):
        if type(val) is not bool:
            raise ValueError(_("use_resolve must be a boolean value "))

        self.use_resolve = val

    def set_use_syslog(self, val):
        if type(val) is not bool:
            raise ValueError(_("use_syslog must be a boolean value "))

        self.use_syslog = val

    def set_use_kerberos(self, val):
        if type(val) is not bool:
            raise ValueError(_("use_kerberos must be a boolean value "))

        self.use_kerberos = val

    def set_manage_krb5_rcache(self, val):
        if type(val) is not bool:
            raise ValueError(_("manage_krb5_rcache must be a boolean value "))

        self.manage_krb5_rcache = val

    def set_use_pam(self, val):
        self.use_pam = (val is True)

    def set_use_dbus(self, val):
        self.use_dbus = (val is True)

    def set_use_audit(self, val):
        self.use_audit = (val is True)

    def set_use_etc(self, val):
        self.use_etc = (val is True)

    def set_use_localization(self, val):
        self.use_localization = (val is True)

    def set_use_fd(self, val):
        self.use_fd = (val is True)

    def set_use_terminal(self, val):
        self.use_terminal = (val is True)

    def set_use_mail(self, val):
        self.use_mail = (val is True)

    def set_use_tmp(self, val):
        if self.type in USERS:
            raise ValueError(_("USER Types automatically get a tmp type"))

        if val:
            self.DEFAULT_DIRS["/tmp"][1].append("/tmp")
        else:
            self.DEFAULT_DIRS["/tmp"][1] = []

    def set_use_uid(self, val):
        self.use_uid = (val is True)

    def generate_uid_rules(self):
        if self.use_uid:
            return re.sub("TEMPLATETYPE", self.name, executable.te_uid_rules)
        else:
            return ""

    def generate_syslog_rules(self):
        if self.use_syslog:
            return re.sub("TEMPLATETYPE", self.name, executable.te_syslog_rules)
        else:
            return ""

    def generate_resolve_rules(self):
        if self.use_resolve:
            return re.sub("TEMPLATETYPE", self.name, executable.te_resolve_rules)
        else:
            return ""

    def generate_kerberos_rules(self):
        if self.use_kerberos:
            return re.sub("TEMPLATETYPE", self.name, executable.te_kerberos_rules)
        else:
            return ""

    def generate_manage_krb5_rcache_rules(self):
        if self.manage_krb5_rcache:
            return re.sub("TEMPLATETYPE", self.name, executable.te_manage_krb5_rcache_rules)
        else:
            return ""

    def generate_pam_rules(self):
        newte = ""
        if self.use_pam:
            newte = re.sub("TEMPLATETYPE", self.name, executable.te_pam_rules)
        return newte

    def generate_audit_rules(self):
        newte = ""
        if self.use_audit:
            newte = re.sub("TEMPLATETYPE", self.name, executable.te_audit_rules)
        return newte

    def generate_etc_rules(self):
        newte = ""
        if self.use_etc:
            newte = re.sub("TEMPLATETYPE", self.name, executable.te_etc_rules)
        return newte

    def generate_fd_rules(self):
        newte = ""
        if self.use_fd:
            newte = re.sub("TEMPLATETYPE", self.name, executable.te_fd_rules)
        return newte

    def generate_localization_rules(self):
        newte = ""
        if self.use_localization:
            newte = re.sub("TEMPLATETYPE", self.name, executable.te_localization_rules)
        return newte

    def generate_dbus_rules(self):
        newte = ""
        if self.type != DBUS and self.use_dbus:
            newte = re.sub("TEMPLATETYPE", self.name, executable.te_dbus_rules)
        return newte

    def generate_mail_rules(self):
        newte = ""
        if self.use_mail:
            newte = re.sub("TEMPLATETYPE", self.name, executable.te_mail_rules)
        return newte

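    # Emit a corenet_<protocol>_<action>_<port> interface call when the policy
    # provides one, otherwise fall back to a raw allow rule against the port type.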
    def generate_network_action(self, protocol, action, port_name):
        line = ""
        method = "corenet_%s_%s_%s" % (protocol, action, port_name)
        if method in sepolicy.get_methods():
            line = "%s(%s_t)\n" % (method, self.name)
        else:
            line = """
gen_require(`
    type %s_t;
')
allow %s_t %s_t:%s_socket name_%s;
""" % (port_name, self.name, port_name, protocol, action)
        return line

    def generate_network_types(self):
        for i in self.in_tcp[PORTS]:
            rec = self.find_port(int(i), "tcp")
            if rec is None:
                self.need_tcp_type = True
            else:
                port_name = rec[0][:-2]
                line = self.generate_network_action("tcp", "bind", port_name)
#                   line = "corenet_tcp_bind_%s(%s_t)\n" % (port_name, self.name)
                if line not in self.found_tcp_ports:
                    self.found_tcp_ports.append(line)

        for i in self.out_tcp[PORTS]:
            rec = self.find_port(int(i), "tcp")
            if rec is None:
                self.need_tcp_type = True
            else:
                port_name = rec[0][:-2]
                line = self.generate_network_action("tcp", "connect", port_name)
#                   line = "corenet_tcp_connect_%s(%s_t)\n" % (port_name, self.name)
                if line not in self.found_tcp_ports:
                    self.found_tcp_ports.append(line)

        for i in self.in_udp[PORTS]:
            rec = self.find_port(int(i), "udp")
            if rec is None:
                self.need_udp_type = True
            else:
                port_name = rec[0][:-2]
                line = self.generate_network_action("udp", "bind", port_name)
#                   line = "corenet_udp_bind_%s(%s_t)\n" % (port_name, self.name)
                if line not in self.found_udp_ports:
                    self.found_udp_ports.append(line)

        if self.need_udp_type is True or self.need_tcp_type is True:
            return re.sub("TEMPLATETYPE", self.name, network.te_types)
        return ""

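    # Bucket a path into the first DEFAULT_DIRS entry it is prefixed by,
    # falling back to the generic "rw" bucket.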
    def __find_path(self, file):
        for d in self.DEFAULT_DIRS:
            if file.find(d) == 0:
                self.DEFAULT_DIRS[d][1].append(file)
                return self.DEFAULT_DIRS[d]
        self.DEFAULT_DIRS["rw"][1].append(file)
        return self.DEFAULT_DIRS["rw"]

    def add_capability(self, capability):
        if capability not in self.capabilities:
            self.capabilities.append(capability)

    def set_types(self, types):
        self.types = types

    def add_process(self, process):
        if process not in self.processes:
            self.processes.append(process)

    def add_boolean(self, name, description):
        self.booleans[name] = description

    def add_file(self, file):
        self.files[file] = self.__find_path(file)

    def add_dir(self, file):
        self.dirs[file] = self.__find_path(file)

    def generate_capabilities(self):
        newte = ""
        self.capabilities.sort()
        if len(self.capabilities) > 0:
            newte = "allow %s_t self:capability { %s };\n" % (self.name, " ".join(self.capabilities))
        return newte

    def generate_process(self):
        newte = ""
        self.processes.sort()
        if len(self.processes) > 0:
            newte = "allow %s_t self:process { %s };\n" % (self.name, " ".join(self.processes))
        return newte

    def generate_network_rules(self):
        newte = ""
        if self.use_network():
            newte = "\n"

            newte += re.sub("TEMPLATETYPE", self.name, network.te_network)

            if self.use_tcp():
                newte += "\n"
                newte += re.sub("TEMPLATETYPE", self.name, network.te_tcp)

                if self.use_in_tcp():
                    newte += re.sub("TEMPLATETYPE", self.name, network.te_in_tcp)

                    if self.need_tcp_type and len(self.in_tcp[PORTS]) > 0:
                        newte += re.sub("TEMPLATETYPE", self.name, network.te_in_need_port_tcp)

                if self.need_tcp_type and len(self.out_tcp[PORTS]) > 0:
                    newte += re.sub("TEMPLATETYPE", self.name, network.te_out_need_port_tcp)

                if self.in_tcp[ALL]:
                    newte += re.sub("TEMPLATETYPE", self.name, network.te_in_all_ports_tcp)
                if self.in_tcp[RESERVED]:
                    newte += re.sub("TEMPLATETYPE", self.name, network.te_in_reserved_ports_tcp)
                if self.in_tcp[UNRESERVED]:
                    newte += re.sub("TEMPLATETYPE", self.name, network.te_in_unreserved_ports_tcp)

                if self.out_tcp[ALL]:
                    newte += re.sub("TEMPLATETYPE", self.name, network.te_out_all_ports_tcp)
                if self.out_tcp[RESERVED]:
                    newte += re.sub("TEMPLATETYPE", self.name, network.te_out_reserved_ports_tcp)
                if self.out_tcp[UNRESERVED]:
                    newte += re.sub("TEMPLATETYPE", self.name, network.te_out_unreserved_ports_tcp)

                for i in self.found_tcp_ports:
                    newte += i

            if self.use_udp():
                newte += "\n"
                newte += re.sub("TEMPLATETYPE", self.name, network.te_udp)

                if self.need_udp_type:
                    newte += re.sub("TEMPLATETYPE", self.name, network.te_in_need_port_udp)
                if self.use_in_udp():
                    newte += re.sub("TEMPLATETYPE", self.name, network.te_in_udp)
                if self.in_udp[ALL]:
                    newte += re.sub("TEMPLATETYPE", self.name, network.te_in_all_ports_udp)
                if self.in_udp[RESERVED]:
                    newte += re.sub("TEMPLATETYPE", self.name, network.te_in_reserved_ports_udp)
                if self.in_udp[UNRESERVED]:
                    newte += re.sub("TEMPLATETYPE", self.name, network.te_in_unreserved_ports_udp)

                for i in self.found_udp_ports:
                    newte += i
        return newte

    def generate_transition_rules(self):
        newte = ""
        for app in self.transition_domains:
            tmp = re.sub("TEMPLATETYPE", self.name, user.te_transition_rules)
            newte += re.sub("APPLICATION", app, tmp)

        if self.type == USER:
            for u in self.transition_users:
                temp = re.sub("TEMPLATETYPE", self.name, executable.te_run_rules)
                newte += re.sub("USER", u.split("_u")[0], temp)

        return newte

    def generate_admin_rules(self):
        newte = ""
        if self.type == EUSER:
            for d in self.existing_domains:
                name = d.split("_t")[0]
                role = name + "_r"
                for app in self.admin_domains:
                    tmp = re.sub("TEMPLATETYPE", name, user.te_admin_domain_rules)
                    if role not in self.all_roles:
                        tmp = re.sub(role, "system_r", tmp)

                    newte += re.sub("APPLICATION", app, tmp)

            return newte

        if self.type == RUSER:
            newte += re.sub("TEMPLATETYPE", self.name, user.te_admin_rules)

            for app in self.admin_domains:
                tmp = re.sub("TEMPLATETYPE", self.name, user.te_admin_domain_rules)
                newte += re.sub("APPLICATION", app, tmp)

            for u in self.transition_users:
                role = u.split("_u")[0]

                if (role + "_r") in self.all_roles:
                    tmp = re.sub("TEMPLATETYPE", self.name, user.te_admin_trans_rules)
                    newte += re.sub("USER", role, tmp)

        return newte

    def generate_dbus_if(self):
        newif = ""
        if self.use_dbus:
            newif = re.sub("TEMPLATETYPE", self.name, executable.if_dbus_rules)
        return newif

    def generate_sandbox_if(self):
        newif = ""
        if self.type != SANDBOX:
            return newif
        newif = re.sub("TEMPLATETYPE", self.name, executable.if_sandbox_rules)
        return newif

    def generate_admin_if(self):
        newif = ""
        newtypes = ""
        if self.initscript != "":
            newtypes += re.sub("TEMPLATETYPE", self.name, executable.if_initscript_admin_types)
            newif += re.sub("TEMPLATETYPE", self.name, executable.if_initscript_admin)
        for d in self.DEFAULT_KEYS:
            if len(self.DEFAULT_DIRS[d][1]) > 0:
                newtypes += re.sub("TEMPLATETYPE", self.name, self.DEFAULT_DIRS[d][2].if_admin_types)
                newif += re.sub("TEMPLATETYPE", self.name, self.DEFAULT_DIRS[d][2].if_admin_rules)

        if newif != "":
            ret = re.sub("TEMPLATETYPE", self.name, executable.if_begin_admin)
            ret += newtypes

            ret += re.sub("TEMPLATETYPE", self.name, executable.if_middle_admin)
            ret += newif
            ret += re.sub("TEMPLATETYPE", self.name, executable.if_end_admin)
            return ret

        return ""

    def generate_cgi_types(self):
        return re.sub("TEMPLATETYPE", self.file_name, executable.te_cgi_types)

    def generate_sandbox_types(self):
        return re.sub("TEMPLATETYPE", self.file_name, executable.te_sandbox_types)

    def generate_userapp_types(self):
        return re.sub("TEMPLATETYPE", self.name, executable.te_userapp_types)

    def generate_inetd_types(self):
        return re.sub("TEMPLATETYPE", self.name, executable.te_inetd_types)

    def generate_dbusd_types(self):
        return re.sub("TEMPLATETYPE", self.name, executable.te_dbusd_types)

    def generate_min_login_user_types(self):
        return re.sub("TEMPLATETYPE", self.name, user.te_min_login_user_types)

    def generate_login_user_types(self):
        return re.sub("TEMPLATETYPE", self.name, user.te_login_user_types)

    def generate_admin_user_types(self):
        return re.sub("TEMPLATETYPE", self.name, user.te_admin_user_types)

    def generate_existing_user_types(self):
        if len(self.existing_domains) == 0:
            raise ValueError(_("'%s' policy modules require existing domains") % poltype[self.type])
        newte = re.sub("TEMPLATETYPE", self.name, user.te_existing_user_types)
        newte += """gen_require(`"""

        for d in self.existing_domains:
            newte += """
        type %s;""" % d
            role = d.split("_t")[0] + "_r"
            if role in self.all_roles:
                newte += """
	role %s;""" % role
        newte += """
')
"""
        return newte

    def generate_x_login_user_types(self):
        return re.sub("TEMPLATETYPE", self.name, user.te_x_login_user_types)

    def generate_root_user_types(self):
        return re.sub("TEMPLATETYPE", self.name, user.te_root_user_types)

    def generate_new_types(self):
        newte = ""
        if len(self.types) == 0:
            raise ValueError(_("Type field required"))

        for t in self.types:
            for i in self.DEFAULT_EXT:
                if t.endswith(i):
                    print(t, t[:-len(i)])
                    newte += re.sub("TEMPLATETYPE", t[:-len(i)], self.DEFAULT_EXT[i].te_types)
                    break

        if NEWTYPE and newte == "":
            default_ext = []
            for i in self.DEFAULT_EXT:
                default_ext.append(i)
            raise ValueError(_("You need to define a new type which ends with: \n %s") % "\n ".join(default_ext))

        return newte

    def generate_new_rules(self):
        return ""

    def generate_daemon_types(self):
        newte = re.sub("TEMPLATETYPE", self.name, executable.te_daemon_types)
        if self.initscript != "":
            newte += re.sub("TEMPLATETYPE", self.name, executable.te_initscript_types)
        return newte

    def generate_tmp_types(self):
        if self.use_tmp:
            return re.sub("TEMPLATETYPE", self.name, tmp.te_types)
        else:
            return ""

    def generate_booleans(self):
        newte = ""
        for b in self.booleans:
            tmp = re.sub("BOOLEAN", b, boolean.te_boolean)
            newte += re.sub("DESCRIPTION", self.booleans[b], tmp)
        return newte

    def generate_boolean_rules(self):
        newte = ""
        for b in self.booleans:
            newte += re.sub("BOOLEAN", b, boolean.te_rules)
        return newte

    def generate_sandbox_te(self):
        return re.sub("TEMPLATETYPE", self.name, executable.te_sandbox_types)

    def generate_cgi_te(self):
        return re.sub("TEMPLATETYPE", self.name, executable.te_cgi_types)

    def generate_daemon_rules(self):
        newif = re.sub("TEMPLATETYPE", self.name, executable.te_daemon_rules)

        return newif

    def generate_new_type_if(self):
        newif = ""
        for t in self.types:
            for i in self.DEFAULT_EXT:
                if t.endswith(i):
                    reqtype = t[:-len(i)] + "_t"
                    newif += re.sub("TEMPLATETYPE", t[:-len(i)], self.DEFAULT_EXT[i].if_rules)
                    break
        return newif

    def generate_login_user_rules(self):
        return re.sub("TEMPLATETYPE", self.name, user.te_login_user_rules)

    def generate_existing_user_rules(self):
        nerules = re.sub("TEMPLATETYPE", self.name, user.te_existing_user_rules)
        return nerules

    def generate_x_login_user_rules(self):
        return re.sub("TEMPLATETYPE", self.name, user.te_x_login_user_rules)

    def generate_root_user_rules(self):
        newte = re.sub("TEMPLATETYPE", self.name, user.te_root_user_rules)
        return newte

    def generate_userapp_rules(self):
        return re.sub("TEMPLATETYPE", self.name, executable.te_userapp_rules)

    def generate_inetd_rules(self):
        return re.sub("TEMPLATETYPE", self.name, executable.te_inetd_rules)

    def generate_dbusd_rules(self):
        return re.sub("TEMPLATETYPE", self.name, executable.te_dbusd_rules)

    def generate_tmp_rules(self):
        if self.use_tmp:
            return re.sub("TEMPLATETYPE", self.name, tmp.te_rules)
        else:
            return ""

    def generate_cgi_rules(self):
        newte = ""
        newte += re.sub("TEMPLATETYPE", self.name, executable.te_cgi_rules)
        return newte

    def generate_sandbox_rules(self):
        newte = ""
        newte += re.sub("TEMPLATETYPE", self.name, executable.te_sandbox_rules)
        return newte

    def generate_user_if(self):
        newif = ""
        if self.use_terminal or self.type == USER:
            newif = re.sub("TEMPLATETYPE", self.name, executable.if_user_program_rules)

        if self.type in (TUSER, XUSER, AUSER, LUSER):
            newif += re.sub("TEMPLATETYPE", self.name, executable.if_role_change_rules)
        return newif

    def generate_if(self):
        newif = ""
        newif += re.sub("TEMPLATETYPE", self.name, executable.if_heading_rules)
        if self.program:
            newif += re.sub("TEMPLATETYPE", self.name, executable.if_program_rules)
        if self.initscript != "":
            newif += re.sub("TEMPLATETYPE", self.name, executable.if_initscript_rules)

        for d in self.DEFAULT_KEYS:
            if len(self.DEFAULT_DIRS[d][1]) > 0:
                newif += re.sub("TEMPLATETYPE", self.name, self.DEFAULT_DIRS[d][2].if_rules)
                for i in self.DEFAULT_DIRS[d][1]:
                    if os.path.exists(i) and stat.S_ISSOCK(os.stat(i)[stat.ST_MODE]):
                        newif += re.sub("TEMPLATETYPE", self.name, self.DEFAULT_DIRS[d][2].if_stream_rules)
                        break
        newif += self.generate_user_if()
        newif += self.generate_dbus_if()
        newif += self.generate_admin_if()
        newif += self.generate_sandbox_if()
        newif += self.generate_new_type_if()
        newif += self.generate_new_rules()

        return newif

    def generate_default_types(self):
        return self.DEFAULT_TYPES[self.type][0]()

    def generate_default_rules(self):
        if self.DEFAULT_TYPES[self.type][1]:
            return self.DEFAULT_TYPES[self.type][1]()
        return ""

    def generate_roles_rules(self):
        newte = ""
        if self.type in (TUSER, XUSER, AUSER, LUSER):
            roles = ""
            if len(self.roles) > 0:
                newte += re.sub("TEMPLATETYPE", self.name, user.te_sudo_rules)
                newte += re.sub("TEMPLATETYPE", self.name, user.te_newrole_rules)
                for role in self.roles:
                    tmp = re.sub("TEMPLATETYPE", self.name, user.te_roles_rules)
                    newte += re.sub("ROLE", role, tmp)
        return newte

    def generate_te(self):
        newte = self.generate_default_types()
        for d in self.DEFAULT_KEYS:
            if len(self.DEFAULT_DIRS[d][1]) > 0:
                # CGI scripts already have a rw_t
                if self.type != CGI or d != "rw":
                    newte += re.sub("TEMPLATETYPE", self.name, self.DEFAULT_DIRS[d][2].te_types)

        if self.type != EUSER:
            newte += """
########################################
#
# %s local policy
#
""" % self.name
        newte += self.generate_capabilities()
        newte += self.generate_process()
        newte += self.generate_network_types()
        newte += self.generate_tmp_types()
        newte += self.generate_booleans()
        newte += self.generate_default_rules()
        newte += self.generate_boolean_rules()

        for d in self.DEFAULT_KEYS:
            if len(self.DEFAULT_DIRS[d][1]) > 0:
                if self.type == EUSER:
                    newte_tmp = ""
                    for domain in self.existing_domains:
                        newte_tmp += re.sub("TEMPLATETYPE_t", domain[:-2] + "_t", self.DEFAULT_DIRS[d][2].te_rules)
                        newte += re.sub("TEMPLATETYPE_rw_t", self.name + "_rw_t", newte_tmp)
                else:
                    newte += re.sub("TEMPLATETYPE", self.name, self.DEFAULT_DIRS[d][2].te_rules)
                for i in self.DEFAULT_DIRS[d][1]:
                    if os.path.exists(i) and stat.S_ISSOCK(os.stat(i)[stat.ST_MODE]):
                        if self.type == EUSER:
                            for domain in self.existing_domains:
                                newte += re.sub("TEMPLATETYPE", domain[:-2], self.DEFAULT_DIRS[d][2].te_stream_rules)

                        else:
                            newte += re.sub("TEMPLATETYPE", self.name, self.DEFAULT_DIRS[d][2].te_stream_rules)
                        break

        newte += self.generate_tmp_rules()
        newte += self.generate_network_rules()
        newte += self.generate_fd_rules()
        newte += self.generate_etc_rules()
        newte += self.generate_pam_rules()
        newte += self.generate_uid_rules()
        newte += self.generate_audit_rules()
        newte += self.generate_syslog_rules()
        newte += self.generate_localization_rules()
        newte += self.generate_resolve_rules()
        newte += self.generate_roles_rules()
        newte += self.generate_mail_rules()
        newte += self.generate_transition_rules()
        newte += self.generate_admin_rules()
        newte += self.generate_dbus_rules()
        newte += self.generate_kerberos_rules()
        newte += self.generate_manage_krb5_rcache_rules()

        return newte

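    # Build the .fc file: one file-context line per registered file, directory,
    # executable and init script, sorted for stable output.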
    def generate_fc(self):
        newfc = ""
        fclist = []
        for i in self.files.keys():
            if os.path.exists(i) and stat.S_ISSOCK(os.stat(i)[stat.ST_MODE]):
                t1 = re.sub("TEMPLATETYPE", self.name, self.files[i][2].fc_sock_file)
            else:
                t1 = re.sub("TEMPLATETYPE", self.name, self.files[i][2].fc_file)
            t2 = re.sub("FILENAME", i, t1)
            fclist.append(re.sub("FILETYPE", self.files[i][0], t2))

        for i in self.dirs.keys():
            t1 = re.sub("TEMPLATETYPE", self.name, self.dirs[i][2].fc_dir)
            t2 = re.sub("FILENAME", i, t1)
            fclist.append(re.sub("FILETYPE", self.dirs[i][0], t2))

        if self.type in USERS + [SANDBOX]:
            if len(fclist) == 0:
                return executable.fc_user

        if self.type not in USERS + [SANDBOX, EUSER, NEWTYPE] and not self.program:
            raise ValueError(_("You must enter the executable path for your confined process"))

        if self.program:
            t1 = re.sub("EXECUTABLE", self.program, executable.fc_program)
            fclist.append(re.sub("TEMPLATETYPE", self.name, t1))

        if self.initscript != "":
            t1 = re.sub("EXECUTABLE", self.initscript, executable.fc_initscript)
            fclist.append(re.sub("TEMPLATETYPE", self.name, t1))

        fclist.sort()
        newfc = "\n".join(fclist)
        return newfc

    def generate_user_sh(self):
        newsh = ""
        if self.type not in (TUSER, XUSER, AUSER, LUSER, RUSER):
            return newsh

        roles = ""
        for role in self.roles:
            roles += " %s_r" % role
        if roles != "":
            roles += " system_r"
        tmp = re.sub("TEMPLATETYPE", self.name, script.users)
        newsh += re.sub("ROLES", roles, tmp)

        if self.type == RUSER or self.type == AUSER:
            for u in self.transition_users:
                tmp = re.sub("TEMPLATETYPE", self.name, script.admin_trans)
                newsh += re.sub("USER", u, tmp)

        if self.type == LUSER:
            newsh += re.sub("TEMPLATETYPE", self.name, script.min_login_user_default_context)
        else:
            newsh += re.sub("TEMPLATETYPE", self.name, script.x_login_user_default_context)

        return newsh

    def generate_sh(self):
        temp = re.sub("TEMPLATETYPE", self.file_name, script.compile)
        temp = re.sub("DOMAINTYPE", self.name, temp)
        if self.type == EUSER:
            newsh = re.sub("TEMPLATEFILE", "%s" % self.file_name, temp)
        else:
            newsh = re.sub("TEMPLATEFILE", self.file_name, temp)
            newsh += re.sub("DOMAINTYPE", self.name, script.manpage)

        if self.program:
            newsh += re.sub("FILENAME", self.program, script.restorecon)
        if self.initscript != "":
            newsh += re.sub("FILENAME", self.initscript, script.restorecon)

        for i in self.files.keys():
            newsh += re.sub("FILENAME", i, script.restorecon)

        for i in self.dirs.keys():
            newsh += re.sub("FILENAME", i, script.restorecon)

        for i in self.in_tcp[PORTS] + self.out_tcp[PORTS]:
            if self.find_port(i, "tcp") is None:
                t1 = re.sub("PORTNUM", "%d" % i, script.tcp_ports)
                newsh += re.sub("TEMPLATETYPE", self.name, t1)

        for i in self.in_udp[PORTS]:
            if self.find_port(i, "udp") is None:
                t1 = re.sub("PORTNUM", "%d" % i, script.udp_ports)
                newsh += re.sub("TEMPLATETYPE", self.name, t1)

        newsh += self.generate_user_sh()
        if (platform.linux_distribution(full_distribution_name=0)[0] in ("redhat", "centos", "SuSE", "fedora", "mandrake", "mandriva")):
            newsh += re.sub("TEMPLATEFILE", self.file_name, script.rpm)

        return newsh

    def generate_spec(self):
        newspec = ""

        selinux_policynvr = get_rpm_nvr_list("selinux-policy")

        if selinux_policynvr is None:
            selinux_policyver = "0.0.0"
        else:
            selinux_policyver = selinux_policynvr[1]

        newspec += spec.header_comment_section
        if self.type in APPLICATIONS:
            newspec += spec.define_relabel_files_begin
            if self.program:
                newspec += re.sub("FILENAME", self.program, spec.define_relabel_files_end)
            if self.initscript != "":
                newspec += re.sub("FILENAME", self.initscript, spec.define_relabel_files_end)
            for i in self.files.keys():
                newspec += re.sub("FILENAME", i, spec.define_relabel_files_end)
            for i in self.dirs.keys():
                newspec += re.sub("FILENAME", i, spec.define_relabel_files_end)

        newspec += re.sub("VERSION", selinux_policyver, spec.base_section)
        newspec = re.sub("MODULENAME", self.file_name, newspec)
        newspec = re.sub("DOMAINNAME", self.name, newspec)
        if len(self.rpms) > 0:
            newspec += "Requires(post): %s\n" % ", ".join(self.rpms)
        newspec += re.sub("MODULENAME", self.file_name, spec.mid_section)
        newspec = re.sub("DOMAINNAME", self.name, newspec)
        newspec = re.sub("TODAYSDATE", time.strftime("%a %b %e %Y"), newspec)

        if self.type not in APPLICATIONS:
            newspec = re.sub("%relabel_files", "", newspec)

        # Remove man pages from EUSER spec file
        if self.type == EUSER:
            newspec = re.sub(".*%s_selinux.8.*" % self.name, "", newspec)
        # Remove the user context file from spec files of non-user types
        if self.type not in (TUSER, XUSER, AUSER, LUSER, RUSER):
            newspec = re.sub(".*%s_u.*" % self.name, "", newspec)
        return newspec

    def write_spec(self, out_dir):
        specfile = "%s/%s_selinux.spec" % (out_dir, self.file_name)
        fd = open(specfile, "w")
        fd.write(self.generate_spec())
        fd.close()

        return specfile

    def write_te(self, out_dir):
        tefile = "%s/%s.te" % (out_dir, self.file_name)
        fd = open(tefile, "w")
        fd.write(self.generate_te())
        fd.close()
        return tefile

    def write_sh(self, out_dir):
        shfile = "%s/%s.sh" % (out_dir, self.file_name)
        fd = open(shfile, "w")
        fd.write(self.generate_sh())
        fd.close()
        os.chmod(shfile, 0o750)
        return shfile

    def write_if(self, out_dir):
        iffile = "%s/%s.if" % (out_dir, self.file_name)
        fd = open(iffile, "w")
        fd.write(self.generate_if())
        fd.close()
        return iffile

    def write_fc(self, out_dir):
        fcfile = "%s/%s.fc" % (out_dir, self.file_name)
        fd = open(fcfile, "w")
        fd.write(self.generate_fc())
        fd.close()
        return fcfile

    def __extract_rpms(self):
        import dnf

        with dnf.Base() as base:
            base.read_all_repos()
            base.fill_sack(load_system_repo=True)

            query = base.sack.query()
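            # Find the package that owns the confined executable, then record
            # every file it ships under the default writable directories
            # (DEFAULT_DIRS, skipping /etc) as a writable file or directory.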

            pq = query.available()
            pq = pq.filter(file=self.program)

            for pkg in pq:
                self.rpms.append(pkg.name)
                for fname in pkg.files:
                    for b in self.DEFAULT_DIRS:
                        if b == "/etc":
                            continue
                        if fname.startswith(b):
                            if os.path.isfile(fname):
                                self.add_file(fname)
                            else:
                                self.add_dir(fname)
                sq = query.available()
                sq = sq.filter(provides=pkg.source_name)
                for bpkg in sq:
                    for fname in bpkg.files:
                        for b in self.DEFAULT_DIRS:
                            if b == "/etc":
                                continue
                            if fname.startswith(b):
                                if os.path.isfile(fname):
                                    self.add_file(fname)
                                else:
                                    self.add_dir(fname)

    def gen_writeable(self):
        try:
            self.__extract_rpms()
        except ImportError:
            pass

        if os.path.isfile("/var/run/%s.pid" % self.name):
            self.add_file("/var/run/%s.pid" % self.name)

        if os.path.isdir("/var/run/%s" % self.name):
            self.add_dir("/var/run/%s" % self.name)

        if os.path.isdir("/var/log/%s" % self.name):
            self.add_dir("/var/log/%s" % self.name)

        if os.path.isfile("/var/log/%s.log" % self.name):
            self.add_file("/var/log/%s.log" % self.name)

        if os.path.isdir("/var/lib/%s" % self.name):
            self.add_dir("/var/lib/%s" % self.name)

        if os.path.isfile("/etc/rc.d/init.d/%s" % self.name):
            self.set_init_script(r"/etc/rc\.d/init\.d/%s" % self.name)

        # we don't want to list a subdirectory in the .fc policy file
        # if we already specify labeling for its parent directory
        temp_basepath = []
        for p in self.DEFAULT_DIRS.keys():
            temp_dirs = []
            try:
                temp_basepath = self.DEFAULT_DIRS[p][1][0] + "/"
            except IndexError:
                continue

            for i in self.DEFAULT_DIRS[p][1]:
                if i.startswith(temp_basepath):
                    temp_dirs.append(i)
                else:
                    continue

            if len(temp_dirs) != 0:
                for i in temp_dirs:
                    if i in self.dirs.keys():
                        del(self.dirs[i])
                    elif i in self.files.keys():
                        del(self.files[i])
                    else:
                        continue

                self.DEFAULT_DIRS[p][1] = list(set(self.DEFAULT_DIRS[p][1]) - set(temp_dirs))

    def gen_symbols(self):
        if self.type not in APPLICATIONS:
            return
        if not os.path.exists(self.program):
            sys.stderr.write("""
***************************************
Warning %s does not exist
***************************************

""" % self.program)
            return
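        # List the executable's undefined dynamic symbols ("nm -D ... | grep U");
        # each symbol prefix found in self.symbols triggers the self.<call>
        # recorded there, enabling the matching policy rules for this domain.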
        fd = os.popen("nm -D %s | grep U" % self.program)
        for s in fd.read().split():
            for b in self.symbols:
                if s.startswith(b):
                    exec("self.%s" % self.symbols[b])
        fd.close()

    def generate(self, out_dir=os.getcwd()):
        out = "Created the following files:\n"
        out += "%s # %s\n" % (self.write_te(out_dir), _("Type Enforcement file"))
        out += "%s # %s\n" % (self.write_if(out_dir), _("Interface file"))
        out += "%s # %s\n" % (self.write_fc(out_dir), _("File Contexts file"))
        if self.type != NEWTYPE:
            if (platform.linux_distribution(full_distribution_name=0)[0] in ("redhat", "centos", "SuSE", "fedora", "mandrake", "mandriva")):
                out += "%s # %s\n" % (self.write_spec(out_dir), _("Spec file"))
            out += "%s # %s\n" % (self.write_sh(out_dir), _("Setup Script"))
        return out
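
# Illustrative usage sketch, not part of the upstream module: the class name
# "policy", the DAEMON application-type constant and set_program() are
# assumptions about this module's public names; add_file(), add_dir(),
# set_init_script() and generate() are the methods defined above.
if __name__ == "__main__":
    conf = policy("myapp", DAEMON)              # confine a hypothetical daemon "myapp"
    conf.set_program("/usr/sbin/myapp")         # executable that enters the new domain
    conf.set_init_script("/etc/rc.d/init.d/myapp")
    conf.add_file("/var/log/myapp.log")         # a file the daemon writes
    conf.add_dir("/var/lib/myapp")              # a directory the daemon manages
    print(conf.generate("/tmp"))                # writes the .te/.if/.fc/.sh (and .spec) files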
site-packages/sepolicy/transition.py000064400000006226147511334650013676 0ustar00# Copyright (C) 2011 Red Hat
# see file 'COPYING' for use and warranty information
#
# setrans is a tool for analyzing process transitions in SELinux policy
#
#    This program is free software; you can redistribute it and/or
#    modify it under the terms of the GNU General Public License as
#    published by the Free Software Foundation; either version 2 of
#    the License, or (at your option) any later version.
#
#    This program is distributed in the hope that it will be useful,
#    but WITHOUT ANY WARRANTY; without even the implied warranty of
#    MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
#    GNU General Public License for more details.
#
#    You should have received a copy of the GNU General Public License
#    along with this program; if not, write to the Free Software
#    Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA
#                                        02111-1307  USA
#
#
import sepolicy
__all__ = ['setrans']


def _entrypoint(src):
    trans = sepolicy.search([sepolicy.ALLOW], {sepolicy.SOURCE: src})
    return map(lambda y: y[sepolicy.TARGET], filter(lambda x: "entrypoint" in x[sepolicy.PERMS], trans))


def _get_trans(src):
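    # Gather the source type together with all of its attributes, then return
    # every process-class transition whose source is any of those names.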
    src_list = [src] + list(filter(lambda x: x['name'] == src, sepolicy.get_all_types_info()))[0]['attributes']
    trans_list = list(filter(lambda x: x['source'] in src_list and x['class'] == 'process', sepolicy.get_all_transitions()))
    return trans_list


class setrans:

    def __init__(self, source, dest=None):
        self.seen = []
        self.sdict = {}
        self.source = source
        self.dest = dest
        self._process(self.source)

    def _process(self, source):
        if source in self.sdict:
            return self.sdict[source]
        self.sdict[source] = {}
        trans = _get_trans(source)
        if not trans:
            return
        self.sdict[source]["name"] = source
        if not self.dest:
            self.sdict[source]["map"] = trans
        else:
            self.sdict[source]["map"] = list(map(lambda y: y, filter(lambda x: x["transtype"] == self.dest, trans)))
            self.sdict[source]["child"] = list(map(lambda y: y["transtype"], filter(lambda x: x["transtype"] not in [self.dest, source], trans)))
            for s in self.sdict[source]["child"]:
                self._process(s)

    def out(self, name, header=""):
        buf = ""
        if name in self.seen:
            return buf
        self.seen.append(name)

        if "map" in self.sdict[name]:
            for t in self.sdict[name]["map"]:
                cond = sepolicy.get_conditionals(t["source"], t["transtype"], "process", ["transition"])
                if cond:
                    buf += "%s%s @ %s --> %s %s\n" % (header, t["source"], t["target"], t["transtype"], sepolicy.get_conditionals_format_text(cond))
                else:
                    buf += "%s%s @ %s --> %s\n" % (header, t["source"], t["target"], t["transtype"])

        if "child" in self.sdict[name]:
            for x in self.sdict[name]["child"]:
                buf += self.out(x, "%s%s ... " % (header, name))
        return buf

    def output(self):
        self.seen = []
        print(self.out(self.source))
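
# Minimal usage sketch: walk the process transitions reachable from a source
# domain.  "httpd_t" and "httpd_sys_script_t" are only example domain names.
if __name__ == "__main__":
    setrans("httpd_t").output()                         # every transition path out of httpd_t
    setrans("httpd_t", "httpd_sys_script_t").output()   # only paths that reach the given destination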
site-packages/sepolicy/__pycache__/gui.cpython-36.pyc000064400000244126147511334650016557 0ustar003

��f�@s(ddlZejdd�ddlmZddlmZddlmZddlmZddlZddl	Z	ddl
Z
ddl
mZmZm
Z
ddlZ	ddlZ	ddlZddlZddlZddlZd	Zy:ddlZiZejd0kr�ded<ejefd
dd�e��WnLyddlZeejd<Wn(ek
�r$ddlZeejd<YnXYnXiZxe	j D]Z!e!ee	j e!<�q8We"d�e"d�gZ#e"d�e"d�gZ$dd�Z%ddl&Z'e"d�e"d�fZ(e"d�e"d�fZ)dZ*dZ+dZ,dZ-dZ.dZ/dZ0dZ1dZ2dZ3dZ4d
Z5dZ6dZ7dZ8d Z9d!Z:d"Z;d#d$d%d&d'd(d)d*d+g	Z<e"d,�Z=Gd-d.�d.�Z>e?d/k�r$e>�Z@dS)1�N�Gtkz3.0)r)�Gdk)�GLib)�SELinuxDBus)�DISABLED�
PERMISSIVE�	ENFORCINGzselinux-python�T�unicodez/usr/share/localezutf-8)Z	localedirZcodeset�_ZNoZYesZDisableZEnablecCs<|dkr|dkrdS|dkr dS|dkr,dS||k||kS)Nr�����)�a�brr�/usr/lib/python3.6/gui.py�cmpFsrzAdvanced >>zAdvanced <<zAdvanced Search >>zAdvanced Search <<r�������	�boolean�fcontextzfcontext-equiv�port�login�user�module�node�	interfacez�<small>
To change from Disabled to Enforcing mode
- Change the system mode from Disabled to Permissive
- Reboot, so that the system can relabel
- Once the system is working as planned
  * Change the system mode to Enforcing</small>
c@s�eZdZ�d%dd�Zdd�Zdd�Zd	d
�Zdd�Zd
d�Zdd�Z	dd�Z
dd�Zdd�Zdd�Z
dd�Zdd�Zdd�Zdd �Zd!d"�Zd#d$�Zd%d&�Zd'd(�Zd)d*�Zd+d,�Zd-d.�Zd/d0�Zd1d2�Zd3d4�Zd5d6�Zd7d8�Zd9d:�Zd;d<�Zd=d>�Z d?d@�Z!dAdB�Z"dCdD�Z#dEdF�Z$dGdH�Z%dIdJ�Z&dKdL�Z'dMdN�Z(dOdP�Z)dQdR�Z*dSdT�Z+dUdV�Z,dWdX�Z-dYdZ�Z.d[d\�Z/d]d^�Z0d_d`�Z1dadb�Z2dcdd�Z3dedf�Z4dgdh�Z5didj�Z6dkdl�Z7dmdn�Z8dodp�Z9dqdr�Z:dsdt�Z;dudv�Z<dwdx�Z=dydz�Z>d{d|�Z?d}d~�Z@dd��ZAd�d��ZBd�d��ZCd�d��ZDd�d��ZEd�d��ZFd�d��ZGd�d��ZHd�d��ZId�d��ZJd�d��ZKd�d��ZLd�d��ZMd�d��ZNd�d��ZOd�d��ZPd�d��ZQd�d��ZRd�d��ZSd�d��ZTd�d��ZUd�d��ZVd�d��ZWd�d��ZXd�d��ZYd�d��ZZd�d��Z[d�d��Z\d�d��Z]d�d��Z^d�d��Z_d�d��Z`d�d��Za�d&d�dÄZbd�dńZcd�dDŽZdd�dɄZed�d˄Zfd�d̈́Zgd�dτZhd�dфZid�dӄZjd�dՄZkd�dׄZld�dلZmd�dۄZnd�d݄Zod�d߄Zpd�d�Zqd�d�Zrd�d�Zsd�d�Ztd�d�Zud�d�Zvd�d�Zwd�d�Zxd�d�Zyd�d�Zzd�d��Z{d�d��Z|d�d��Z}d�d��Z~d�d��Zd�d��Z��d�d�Z��d�d�Z��d�d�Z��d�d�Z��d�d	�Z��d
�d�Z��d�d
�Z��d�d�Z��d�d�Z��d�d�Z��d'�d�d�Z��d�d�Z��d�d�Z��d�d�Z��d�d�Z��d�d �Z��d!�d"�Z��d#�d$�Z�dS((�
SELinuxGuiNFcQCsd|_d|_t|_t�|_y|jj�}Wn6tjjk
r^}zt	|�|j
�WYdd}~XnX|j�||_d|_
tj�}tjjdd�d|_|jd}|j|�|jd�|_|jd�|_|jd	�|_|jd
�|_d|_|jd�|_|jd
�|_tjtjj �|_!tjtjj"�|_#t$j%�d|_&d|_'d|_(d|_)d|_*d|_+d|_,g|_-g|_.i|_/|jd�|_0|jd�|_1|jd�|_2d|_3|jd�|_4|jd�|_5|j5j6|j7�|jd�|_8|jd�|_9|jd�|_:d|_;|jd�|_<|jd�|_=|jd�|_>|jd�|_?|jd�|_@|jd�|_A|jd�|_B|jd�|_C|jd�|_D|jDjEdtjFjG�|jd �|_H|jHj6|j7�|jd!�|_I|jd"�|_J|jd#�|_K|jd$�|_L|jd%�|_M|jd&�|_N|jNjEdtjFjG�|jd'�|_O|jOj6|j7�|jd(�|_P|jd)�|_Q|jd*�|_R|jd+�|_S|jd,�|_T|jd-�|_U|jd.�|_V|jd/�|_W|jd0�|_X|jd1�|_Y|jd2�|_Z|jd3�|_[|jd4�|_\|jd5�|_]|jd6�|_^|jd7�|__|jd8�|_`|j`jEdtjFjG�|jd9�|_a|jd:�|_b|jbj6|j7�|jd;�|_c|jd<�|_d|jd=�|_e|jd>�|_f|jd?�|_g|jd@�|_h|jdA�|_i|jdB�|_j|jdC�|_k|jdD�|_l|jdE�|_m|jdF�|_n|jdG�|_o|jdH�|_p|jdI�|_q|jdJ�|_r|jdK�|_s|jdL�|_tg|_u|jv�dMk�r�|jkjwd�|jmjwd�|jtjwd�|jdN�|_x|jdO�|_y|jdP�|_z|j{�|jdQ�|_||jdR�|_}|jdS�|_~|jdT�|_|jdU�|_�|jdV�|_�|jdW�|_�|jdX�|_�|jdY�|_�|jdZ�|_�|jd[�|_�|jd\�|_�|jd]�|_�|jd^�|_�|jd_�|_�|jd`�|_�|jda�|_�|jdb�|_�|jdc�|_�|jdd�|_�|jde�|_�|jdf�|_�|jdg�|_�|jdh�|_�|jdi�|_�|jdj�|_�|jdk�|_�|jdl�|_�|jdm�|_�|jdn�|_�|jdo�|_�|jdp�|_�|jdq�|_�|jdr�|_�|jds�|_�|jdt�|_�|jdu�|_�|jdv�|_�|jdw�|_�|jdx�|_�|jdy�|_�|jdz�|_�|jd{�|_�|jd|�|_�|jd}�|_�|jd~�|_�|j�j�d�|jd�|_�|jd��|_�|jd��|_�|jd��|_�|jd��|_�|jd��|_�|jd��|_�|jd��|_�|jd��|_�|jd��|_�|jd��|_�|jd��|_�|j�j6|j7�|jd��|_�|j�j��|_�|jd��|_�|j�jEdtjFjG�|jd��|_�|j�j��|_�|jd��|_�|jd��|_�|j�jEdtjFjG�|jd��|_�|j�j6|j7�|jd��|_�|j�j��|_�|jd��|_�|jd��|_�|j�j6|j7�|jd��|_�|j�j��|_�|jd��|_�|j�jEdtjFjG�|jd��|_�|j�j��|_�|jd��|_�|j�j��|_�|jd��|_�|jd��|_�|j�jEdtjFjG�|jd��|_�|j�j6|j7�|jd��|_�|j�j��|_�|jd��|_�|jd��|_�|j�jEdtjFjG�|jd��|_�|j�j6|j7�|jd��|_�|j�j��|_�|jd��|_�|jd��|_�|j�jEdMtjFjG�|jd��|_�|j�j6|j7�|jd��|_�|jd��|_�|jd��|_�|jd��|_�|j�|_�|j�j��|_�|jd��|_�|jd��|_�|j�jEdtjFjG�|jd��|_�|j�j6|j7�|jd��|_�|j�j��|_�|jd��|_�|j�j��|_�|jd��|_�|jd��|_�|j�jEdMtjFjG�|jd��|_�|j�j6|j7�|jd��|_�|j�j��|_�|jd��|_�|jd��|_�|j�jEdtjFjG�|jd��|_�|j�j6|j7�|jd��|_�|j�j��|_�|jd��|_�|jd��|_�|jd��|_�|jd��|_�|j�j�d�|j�j�d�|j�j�|j�d�|j�j�|j�|j�j�dtj��|jd��|_�|jd��|_�|jd��|_�|jd��|_�|jd��|_�|j�j�d��|_�|jd��|_�|jd��|�_|jd��|�_|jd��|�_|jd��|�_|�j�jd�|�j|j��|jdÃ|�_|�j�jd�|�j|jc�|jdă|�_|�j�jd�|�j|jY�|jdŃ|�_|�j�jd�|�j|jJ�|jdƃ|�_	|�j	�jd�|�j|j��|jdǃ|�_
|jd�|_|jdȃ|�_|jdɃ|�_|jdʃ|�_
|jd˃}|�j
�jd�|�j|j��g|�_|�j�jd�|�j|jԃd|�_d}|�rd�d�g|_-|�r$||j-k�r$|j-�j|�n�t�j�|_-|j-�j�t�jd΍|�rj||j-k�rj|�j�tdσ|�|j
�|�j��t|j-�}	�t�j�}
x�|j-D]�}|�j||��t�t|�j��t|	��|�_|j��j|�j�|j��j |�j�|�j!�xH|
�j"|g�D]6}�t�j#|�}|�r|�j||�|j.�j|��qW|�jd7�_�q�W|�j$�|j�j%|j�|j:�j%|j8�|�j&|�j'|�j(|�j)|�j*|�j+|�j,|�j-|�j.|�j/|�j0|�j1|�j2|�j3|�j4|�j5|�j6|�j7|�j8|�j9|�j:|�j;|�j<|�j=|�j>|�j?|�j@|�jA|�jB|�jC|�jD|�jE|�j1|�jF|�jG|�jH|�jI|�jJ|�jJ|�jK|�jL|�jM|�jN|�jO|�jP|�jQ|�jR|�jS|�jT|�jU|�jV|�jH|�jW|�jX|�jY|�jZ|�j[|�j\|�j\|�j\|�j]|�j^|�j_|j7|�j`|�jadМB}
|�jb|�|�jc|
�|j�j��td�jed�|�jf�|�jf�d|�_g|�j�j$�|�j�j$�|�jh�tik�r�|�jj�nV|j�r�|j�jk|j�|j�jl|j�|�jm�|�j`�n|�jn�|j�jot�|�jp�d|_t�jq�dS)�NFT�)Z
plat_specificz
/sepolicy/zsepolicy.glade�outer_notebookZSELinux_windowZMain_selection_menu�main_advanced_labelr�applications_selection_buttonZ
Revert_buttonrZadd_path_dialog�error_check_window�error_check_label�advanced_search_windowZadvanced_filterZ
advanced_sort�advanced_filter_entry�advanced_search_treeviewZLogin_label�login_seuser_comboboxZlogin_seuser_liststore�login_name_entry�login_mls_label�login_mls_entryZLogin_button�login_treeview�login_liststore�login_filter�login_popup_window�login_delete_liststore�login_delete_window�user_popup_windowZUser_button�user_liststore�user_filter�
user_treeview�user_roles_comboboxZuser_roles_liststoreZ
User_label�user_name_entry�user_mls_label�user_mls_level_entry�user_mls_entryZselinux_user_combobox�user_delete_liststore�user_delete_window�file_equiv_label�file_equiv_source_entry�file_equiv_dest_entryZfile_equiv_button�file_equiv_treeview�file_equiv_liststore�file_equiv_popup_windowZfile_equiv_filter�file_equiv_delete_liststore�file_equiv_delete_window�app_system_buttonZ
System_buttonZLockdown_buttonZSystems_boxZRelabel_buttonZRelabel_button_no�advanced_system�outer_notebook_frameZsystem_policy_type_label�select_button_browse�cancel_button_browse�moreTypes_window_filesZmore_types_file_liststoreZmoreTypes_treeview_files�system_policy_type_liststore�system_policy_type_comboboxrZEnforcing_button_defaultZPermissive_button_defaultZDisabled_button_defaultZenable_unconfinedZdisable_unconfinedZenable_permissiveZdisable_permissiveZ
enable_ptraceZdisable_ptrace�help_windowZ
help_textv�	info_text�
help_image�forward_button�back_button�
update_window�update_treeviewZUpdate_treestore�apply_buttonZ
Update_buttonZ
Add_buttonZ
Delete_button�files_path_entry�network_ports_entry�files_popup_window�network_popup_windowZ
Network_labelZfiles_labelZmake_path_recursiveZfiles_type_combo_storeZfiles_class_combo_store�files_type_combobox�files_class_combobox�files_mls_label�files_mls_entryZAdvanced_text_filesZcancel_delete_filesZ
tcp_buttonZ
udp_buttonZnetwork_type_combo_storeZnetwork_type_combobox�network_mls_label�network_mls_entryZAdvanced_text_networkZcancel_network_deleteZShow_mislabeled_files�mislabeled_files_label�
warning_filesZ
Modify_button�fix_label_window�fixlabel_label�fix_label_cancel�files_delete_window�files_delete_treeview�files_delete_liststore�network_delete_window�network_delete_treeview�network_delete_liststore�progress_barZExecutable_files_treeview�executable_files_filterZExecutable_files_tabZexecutable_files_treestoreZfiles_buttonZWritable_files_treeviewZwritable_files_treestore�writable_files_filterZWritable_files_tabZApplication_files_treeview�application_files_filterZApplication_files_tabZapplication_files_treestoreZnetwork_buttonZoutbound_treeview�network_out_liststore�network_out_filter�network_out_tabZinbound_treeview�network_in_liststore�network_in_filter�network_in_tabZBoolean_treeview�boolean_liststore�boolean_filterZbooleans_more_detail_windowZbooleans_more_detail_treeviewZbooleans_more_detail_liststoreZBooleans_button�transitions_into_treeview�transitions_into_liststore�transitions_into_filterZTransitions_into_tabZTransitions_button�transitions_from_treeview�transitions_from_treestore�transitions_from_filterZTransitions_from_tabZfile_transitions_treeviewZfile_transitions_liststoreZfile_transitions_filterZfile_transitionsZcombobox_org�application_liststore�completion_entry�entrycompletion_objZShow_modified_only_toggleZEnforcing_labelZEnforcing_buttonZPermissive_button�
status_barzSELinux status�filter_entry�
filter_box�add_modify_delete_boxZactivateZfiles_toggle_deleteZtoggledZfile_equiv_toggle_delete1Zuser_toggle_deleteZlogin_toggle_deleteZnetwork_toggle_deleteZ
toggle_updateZfiles_inner_notebookZnetwork_inner_notebookZtransitions_inner_notebook�loading_guiZhttpd_tZabrt_t)�keyz%s is not a valid domain)BZon_combo_button_clickedZon_disable_ptrace_toggledZ!on_SELinux_window_configure_eventZ%on_entrycompletion_obj_match_selectedZon_filter_changedZ"on_save_changes_file_equiv_clickedZon_save_changes_login_clickedZon_save_changes_user_clickedZon_save_changes_files_clickedZon_save_changes_network_clickedZ)on_Advanced_text_files_button_press_eventZitem_in_tree_selectedZ2on_Application_file_types_treeview_configure_event�on_save_delete_clickedZ)on_moreTypes_treeview_files_row_activatedZon_retry_button_files_clickedZon_make_path_recursive_toggledZ&on_files_path_entry_button_press_eventZon_files_path_entry_changedZon_select_type_files_clickedZon_choose_fileZon_Enforcing_button_toggledZon_confirmation_closeZon_column_clickedZ
on_tab_switchZon_file_equiv_button_clickedzon_app/system_button_clickedzon_app/users_button_clicked�on_show_advanced_search_windowZ on_Show_mislabeled_files_toggledZon_Browse_button_files_clickedZon_cancel_popup_clickedZon_treeview_cursor_changedZ on_login_seuser_combobox_changedZon_user_roles_combobox_changedZon_cancel_button_browse_clickedZon_apply_button_clickedZon_Revert_button_clickedZon_Update_button_clickedZ on_advanced_filter_entry_changedZ)on_advanced_search_treeview_row_activatedZ!on_Select_advanced_search_clickedZ!on_info_button_button_press_eventZon_back_button_clickedZon_forward_button_clickedZ#on_Boolean_treeview_columns_changedZon_completion_entry_changedZon_Add_button_clickedZon_Delete_button_clickedZon_Modify_button_clickedZon_Show_modified_only_toggledZon_cancel_button_config_clickedZon_Import_button_clickedZon_Export_button_clickedZon_enable_unconfined_toggledZon_enable_permissive_toggledZ&on_system_policy_type_combobox_changedZ#on_Enforcing_button_default_toggledZ$on_Permissive_button_default_toggledZ"on_Disabled_button_default_toggledZon_Relabel_button_toggled_cbZ%on_advanced_system_button_press_eventZon_files_type_combobox_changedZon_filter_row_changedZon_button_toggledZ
gtk_main_quitr(r�finish_initZ
advanced_init�
START_PAGE�opager�dbus�
customized�
exceptions�
DBusException�print�quit�init_cur�application�
filter_txtrZBuilder�	distutils�	sysconfigZget_python_lib�	code_pathZ
add_from_fileZ
get_objectr$�window�main_selection_windowr%�popupr&�
revert_buttonrZCursorZ
CursorTypeZWATCH�busy_cursorZLEFT_PTR�ready_cursor�selinux�selinux_getpolicytype�initialtype�
current_popup�
import_export�clear_entry�	files_add�network_add�mislabeled_filesZall_domainsZinstalled_listZpreviously_modified�file_dialogr'r(Z
invalid_entryr)�advanced_search_filterZset_visible_func�filter_the_dataZadvanced_search_sortr*r+Zadvanced_search�login_labelr,�login_seuser_combolistr-r.r/�login_radio_buttonr0r1Zset_sort_column_idZSortTypeZ	ASCENDINGr2r3r4r5r6�user_radio_buttonr7r8r9r:�user_roles_combolist�
user_labelr;r<r=r>Z
user_comboboxr?r@rArBrC�file_equiv_radio_buttonrDrErFZfile_equiv_treefilterrGrHrI�system_radio_button�lockdown_radio_buttonZsystems_box�relabel_button�relabel_button_norJrK�system_policy_labelrLrMrN�more_types_files_liststore�moreTypes_treeviewrOrPZpolicy_list�populate_system_policy�set_visible�enforcing_button_default�permissive_button_default�disabled_button_default�initialize_system_default_mode�enable_unconfined_button�disable_unconfined_button�enable_permissive_button�disable_permissive_buttonZenable_ptrace_button�disable_ptrace_buttonrQ�	help_textrRrSrTrUrVrW�update_treestorerX�
update_button�
add_button�
delete_buttonrYrZr[r\�popup_network_label�popup_files_label�recursive_path_toggle�files_type_combolist�files_class_combolistr]r^r_r`�advanced_text_filesZfiles_cancel_button�network_tcp_button�network_udp_button�network_port_type_combolist�network_port_type_comboboxrarb�advanced_text_networkZnetwork_cancel_button�show_mislabeled_files_onlyrcrd�
modify_button�
set_sensitivererfrgrhrirjrkrlrmrn�executable_files_treeviewro�executable_files_tabZget_tooltip_textZ executable_files_tab_tooltip_txt�executable_files_liststore�files_radio_buttonZfiles_button_tooltip_txt�writable_files_treeview�writable_files_liststorerp�writable_files_tabZwritable_files_tab_tooltip_txt�application_files_treeviewrq�application_files_tabZ!application_files_tab_tooltip_txt�application_files_liststore�network_radio_buttonZnetwork_button_tooltip_txt�network_out_treeviewrrrsrtZnetwork_out_tab_tooltip_txt�network_in_treeviewrurvrwZnetwork_in_tab_tooltip_txt�boolean_treeviewrxry�boolean_more_detail_windowZboolean_more_detail_treeview�!boolean_more_detail_tree_data_set�boolean_radio_button�
active_buttonZboolean_button_tooltip_txtrzr{r|�transitions_into_tabZ transitions_into_tab_tooltip_txt�transitions_radio_buttonZtransitions_button_tooltip_txtr}r~r�transitions_from_tabZ transitions_from_tab_tooltip_txt�transitions_file_treeview�transitions_file_liststoreZtransitions_file_filter�transitions_file_tabZ transitions_file_tab_tooltip_txtZ
combobox_menur�r�r�Zset_minimum_key_lengthZset_text_columnZset_match_func�
match_funcZset_completionZset_icon_from_stockZ
STOCK_FIND�show_modified_only�current_status_label�current_status_enforcing�current_status_permissiver�Zget_context_id�
context_idr�r�r��cellZdel_cell_files�connect�on_toggle_updateZdel_cell_files_equivZ
del_cell_userZdel_cell_loginZdel_cell_networkZupdate_cell�inner_notebook_files�inner_notebook_network�inner_notebook_transitionsZall_entries�	on_toggleZloading�append�sepolicyZget_all_domains�sort�str�lower�errorr�show�lenZget_init_entrypoints_str�
combo_box_add�floatZ
percentageZset_fractionZset_pulse_step�	idle_func�getZfind_entrypoint_path�hideZ	set_model�open_combo_menu�on_disable_ptrace�hide_combo_menu�set_application_label�get_filter_data�update_to_file_equiv�update_to_login�update_to_user�update_to_files�update_to_network�reveal_advanced�cursor_changed�resize_wrapr��populate_type_combo�invalid_entry_retry�recursive_path�highlight_entry_text�autofill_add_files_entry�select_type_more�on_browse_select�set_enforce�confirmation_close�column_clicked�
clear_filters�show_file_equiv_page�system_interface�users_interfacer��show_mislabeled_files�browse_for_files�close_popup�login_seuser_combobox_change�user_roles_combobox_change�close_config_window�apply_changes_button_press�update_or_revert_changes�get_advanced_filter_data�advanced_item_selected�advanced_item_button_push�on_help_button�on_help_back_clicked�on_help_forward_clicked�resize_columns�application_selected�add_button_clicked�delete_button_clicked�modify_button_clicked�on_show_modified_only�import_config_show�export_config_show�unconfined_toggle�permissive_toggle�change_default_policy�change_default_mode�relabel_on_reboot�reveal_advanced_system�show_more_types�
tab_change�closewindow�previously_modified_initializeZconnect_signalsrZtimeout_add_seconds�selinux_status�lockdown_inited�statusr�show_system_page�	set_label�set_text�show_applications_page�clearbuttons�set_current_page�reinit�main)�self�appZtestr��eZbuilderZ
glade_filer��path�lengthZentrypoint_dict�domainZ
entrypoint�dicrrr�__init__us 




"





zSELinuxGui.__init__cCs"i|_xtD]}i|j|<qWdS)N)�cur_dict�keys)rP�krrrr�ss
zSELinuxGui.init_curcCsLd}xB|jD]8}x2|j|D]$}||kr8|j||=dS|d7}qWqWdS)Nrr)rX)rP�ctr�irZ�jrrr�
remove_curxszSELinuxGui.remove_curcCsytj�|_Wntk
r(t|_YnX|jtkr�|jjd�|jjd�|jjd�|j	jd�|j
j|jt
d��|jjt�n|j|j�tjjd�r�|jjd�n|jjd�tj�d}tj�d}|tkr�|j	jd�|tkr�|jjd�|tk�r|jjd�dS)NFzSystem Status: Disabledz
/.autorelabelTr)r�Zsecurity_getenforcerG�OSErrorrr�r�r�r�r�r��pushr�rrRrI�
DISABLED_TEXT�set_enforce_text�osrS�existsr��
set_activer�r��selinux_getenforcemoderrr�r�)rPZ
policytype�moderrrrE�s0

zSELinuxGui.selinux_statuscCs�|jr
dS|j�d|_|jjtjd��i|_xN|jj�j	d�D]:}|j	�}t
|�dkr\qB|dt
|�dkd�|j|d<qBW|jj|jdd	�|jj|jd
d	�|j
�dS)NTZdeny_ptrace�
r	r)Zpriority�Disabledr�
unconfinedri�permissivedomains)rF�
wait_mouser�rer�Zsecurity_get_boolean_activeZmodule_dictr�Z
semodule_list�splitrr�r��ready_mouse)rP�m�modrrr�
lockdown_init�s$zSELinuxGui.lockdown_initcGs�|j�}|sdS|jtkr4|j�dkr4|j|j|�|jtkrp|jj|d�}|j�dkrp|rp|j	|jj|d��|jt
kr�|jj|d�}|r�|jj�|j
j|�dS)NZmore_detail_colr	Zrestorecon_colrr)�get_selected_iterr��
BOOLEANS_PAGEZget_name�display_more_detailr��
FILES_PAGE�	liststore�	get_value�fix_mislabeled�TRANSITIONS_PAGEr�Zclickedr�rJ)rP�treeviewZtreepathZtreecol�args�iterZvisibleZ	bool_namerrrr �s



zSELinuxGui.column_clickedcCsxtj�rtj�qWdS)N)rZevents_pendingZmain_iteration)rPrrrr�s
zSELinuxGui.idle_funccCs:y |jj|d�j|�dkrdSdStk
r4YnXdS)NrrTFr
)r�rw�find�AttributeError)rPZ
completionZ
key_stringr|Z	func_datarrrr��szSELinuxGui.match_funcc
Cs�|jj|jdk�|jj|jt|j�dk�y0td|j|j|jfd�}|j�}|j	�Wnt
k
rvd}YnX|jj�}|j
|d|ji�|jj|�|jjd|j|j|jf�|j|j�dS)Nrrz
%shelp/%s.txt�rr#�APPz
%shelp/%s.png)rUr��	help_pagerTr�	help_list�openr��read�close�IOErrorr�Z
get_bufferrJr�Z
set_bufferrSZ
set_from_file�
show_popuprQ)rP�fd�bufr�rrr�help_show_page�s

zSELinuxGui.help_show_pagecGs|jd8_|j�dS)Nr)r�r�)rPr{rrrr1�szSELinuxGui.on_help_back_clickedcGs|jd7_|j�dS)Nr)r�r�)rPr{rrrr2�sz"SELinuxGui.on_help_forward_clickedcGstd|_g|_|jtkr.|jjtd��dg|_|jtkrV|jjtd��ddddg|_|jtkr�|j	j
�}|tkr�|jjtd	��d
g|_|tkr�|jjtd��dg|_|t
kr�|jjtd
��dg|_|jtk�r$|jj
�}|tk�r|jjtd��dg|_|tk�r$|jjtd��dg|_|jtk�r�|jj
�}|tk�rb|jjtd��ddddg|_|tk�r�|jjtd��dg|_|tk�r�|jjtd��dg|_|jtk�r�|jjtd��dddd d!d"g|_|jtk�r�|jjtd#��d$d%d&d'g|_|jtk�r$|jjtd(��d)d*g|_|jtk�rH|jjtd+��d,g|_|jtk�rl|jjtd-��d.g|_|j�S)/NrzHelp: Start Page�startzHelp: Booleans PageZbooleansZbooleans_toggledZ
booleans_moreZbooleans_more_showzHelp: Executable Files PageZ
files_execzHelp: Writable Files PageZfiles_writezHelp: Application Types PageZ	files_appz'Help: Outbound Network Connections PageZports_outboundz&Help: Inbound Network Connections PageZ
ports_inboundz&Help: Transition from application PageZtransition_fromZtransition_from_booleanZtransition_from_boolean_1Ztransition_from_boolean_2z&Help: Transition into application PageZ
transition_toz&Help: Transition application file PageZtransition_filezHelp: Systems Page�systemZsystem_boot_modeZsystem_current_modeZ
system_exportZsystem_policy_typeZsystem_relabelzHelp: Lockdown PageZlockdownZlockdown_unconfinedZlockdown_permissiveZlockdown_ptracezHelp: Login PagerZ
login_defaultzHelp: SELinux User Page�userszHelp: File Equivalence PageZ
file_equiv)r�r�r�r�rQ�	set_titlerrsrur��get_current_page�EXE_PAGE�
WRITABLE_PAGE�APP_PAGE�NETWORK_PAGEr��
OUTBOUND_PAGE�INBOUND_PAGEryr��TRANSITIONS_FROM_PAGE�TRANSITIONS_TO_PAGE�TRANSITIONS_FILE_PAGE�SYSTEM_PAGE�
LOCKDOWN_PAGE�
LOGIN_PAGE�	USER_PAGE�FILE_EQUIV_PAGEr�)rPr{�ipagerrrr0�sl











zSELinuxGui.on_help_buttoncGsX|jdkrDd|_|jj�}|jj|dd|dd�|jj�n|jj�d|_dS)Nrrr�A)r�r�Zget_positionr�Zmoverr	)rPr{�locationrrrr
)s


zSELinuxGui.open_combo_menucGs|jj�d|_dS)Nr)r�r	r�)rPr{rrrr3s
zSELinuxGui.hide_combo_menucGs
d|_dS)NT)r
)rPr{rrrr
7sz SELinuxGui.set_application_labelcGst|�dS)N)r�)rPr{rrrr:szSELinuxGui.resize_wrapcCsHtj�d|_|jtkr |j|_|jtkr2|j|_|jtkrD|j	|_dS)Nr)
r�rf�enforce_moderr��enforce_buttonrr�rr�)rPrrrr�=s


z)SELinuxGui.initialize_system_default_modecCsvttjtj�dd��d}|j�d}xJ|D]B}|jj�}|jj|d|�||j	krf|j
j|�||_|d7}q,W|S)NT)�topdownrr)
�nextrc�walkr�Zselinux_pathr�rOr��	set_valuer�rPre�typeHistory)rP�typesr[�itemr|rrrr�Fs


z!SELinuxGui.populate_system_policycGs�|jdkrdSy�x�td|j��D]p}yR|j||�}|dksJ|dksJ|dkrLw |j|j�dksp|j�j|j�dkrtdSWq ttfk
r�Yq Xq WWnYnXdS)Nr#TrFrr
r
)r��range�
get_n_columnsrwr}rr~�	TypeError)rP�listr|r{�x�valrrrr�Ss
$zSELinuxGui.filter_the_datac
Cs�x�|j�D]|}xv||D]j\}}dj|�|f}	|	|jdkrl|jd|	ddkrTq||jd|	dkrlq|j|dj|�||�qWq
WdS)N�,r�actionz-d�typez, )rY�joinrX�network_initial_data_insert)
rPrQ�netd�protocol�	direction�modelrZ�t�portsZpkeyrrr�
net_updatefszSELinuxGui.net_updatecCs�|j�tj�}|jj�x�|D]�}|jj�}||drX|j|�}|j||d�}n|}||d}|jj|d|�|jj|d|�|jj|d||d�q W|j�dS)N�modify�equivrrr)	rlr�Zget_file_equivrE�clearr��markupr�rn)rPZedict�fr|�namer�rrr�file_equiv_initializeqs



z SELinuxGui.file_equiv_initializecCs�|j�|jj�x�tj�D]�}|jj�}|jj|dt|d��|d}d|kr\|jd�|jj|ddj	|��|jj|d|j
dd	��|jj|d
|j
dd	��|jj|dd
�qW|j�dS)Nrr��rolesZobject_rrz, r�levelr#r	r�rT)rlr7r�r��get_selinux_usersr�r�r�remover�rrn)rP�ur|r�rrr�user_initialize�s


zSELinuxGui.user_initializecCs�|j�|jj�xftj�D]Z}|jj�}|jj|d|d�|jj|d|d�|jj|d|d�|jj|dd�qW|j�dS)	Nrr�r�seuserr�mlsr	T)rlr1r�r�Zget_login_mappingsr�r�rn)rPr�r|rrr�login_initialize�s

zSELinuxGui.login_initializecCs|tjj|dddd�}|j||dt|j�tjj|dddd�}|j||dt|j�tjj|dddd�}|j||dt|j�dS)N�tcp�name_connectT)�check_bools�	name_bind�udp)r��network�get_network_connectr�r�rrr�ru)rPrQr�rrr�network_initialize�szSELinuxGui.network_initializecCsD|j�}|j|d|�|j|d|�|j|d|�|j|dd�dS)NrrrrT)r�r�)rPr�r�ZportTyper�r|rrrr��s
z&SELinuxGui.network_initial_data_insertcCs�d}|j�}x.|D]&}|d|kr0|j|�dS|d7}qW|j|d�}|j|d�td�krr|j|�}|d}n|j�}|j|d|�|j|�dS)NrrzMore...)�	get_modelre�get_iterrwrZ
insert_beforer�r�)rP�comboboxr�r[rvr\�niterr|rrr�combo_set_active_text�s



z SELinuxGui.combo_set_active_textcCs2|j�}|j�}|dkrdS|j|�}|j|d�S)Nr)r��
get_activer�rw)rPr�rv�indexr|rrr�combo_get_active_text�s
z SELinuxGui.combo_get_active_textcCs:|dkrdS|jj�}|jj|d|�|jj|d|�dS)Nrr)r�r�r�)rPr��val1r|rrrr�s

zSELinuxGui.combo_box_addcGsN|jj�}|j�d}|dkr"dS|jj|d�}|j|j|�|j|j�dS)Nrr)	r��
get_selection�get_selectedr�rwr�r]rCrN)rPr{rQr|rrrr�s
zSELinuxGui.select_type_morecGsx|jj�}|j�\}}|j|�}|jj|�}|jj|d�}|dkrFdS|jjd�|j	j
�|j|j�|j
j|�dS)Nrr#)r+r�r��convert_iter_to_child_iterr�r�rwr*rJr)r	rr%r�)rPr{�rowr�r|rQrrrr/�s


z$SELinuxGui.advanced_item_button_pushcGs`|jj|�}|jj|�}|jj|d�}|jjd�|jj�|j	|j
�|jj|�|j�dS)Nrr#)
r�r�r�r�rwr*rJr)r	rr%r�r4)rPrzrSr{r|rQrrrr.�s
z!SELinuxGui.advanced_item_selectedcCs4|r0t|�dkr0x|jD]}||dkrdSqWdS)NrTF)rr�)rPrQ�itemsrrr�find_application�s
zSELinuxGui.find_applicationcGs�|jjd�|jjd�|jjd�|jjd�|jj�}|j|�sHdS|j	�|j
jd�|jjd�|j
j�|jj�|jj�|jj�|jj�|jj�|jj�|jj�|jj�y(|ddkr�tj|�}|s�dS||_Wntk
�rYnX|j�|j|jj��|j�|j |�d|_!|j"|�|j#|�|j$|�|j%|�|j&|�|j'|�|j(|�|j)j*t+d�|�|j,j*t+d�|�|j-j*t+d�|�|j.j*t+d	�|�|j/j*t+d
�|�|j0j*t+d�|�|j1j*t+d�|�|j2j*t+d
�|�|j3j4t+d�|�|j5j4t+d�|�|j6j4t+d�|�|j3j*t+d�|�|j5j*t+d�|�|j6j*t+d�|�|j7j*t+d�|�||_|j8j4|j�|j9�dS)NFr#Tr�/z(File path used to enter the '%s' domain.z)Files to which the '%s' domain can write.z6Network Ports to which the '%s' is allowed to connect.z5Network Ports to which the '%s' is allowed to listen.z File Types defined for the '%s'.zODisplay boolean information that can be used to modify the policy for the '%s'.z;Display file type information that can be used by the '%s'.zADisplay network ports to which the '%s' can connect or listen to.z!Application Transitions Into '%s'z!Application Transitions From '%s'zFile Transitions From '%s'zVExecutables which will transition to '%s', when executing selected domains entrypoint.zQExecutables which will transition to a different domain, when '%s' executes them.z4Files by '%s' with transitions to a different label.zADisplay applications that can transition into or out of the '%s'.):r�r�rcrdr�rJr��get_textr�rKr�r�r�r�r�rurrrxr{r~r�r�r�r�Zget_init_transtyper��
IndexErrorrlrDr�r�rN�boolean_initializer��executable_files_initializer��writable_files_initialize�transitions_into_initialize�transitions_from_initialize�application_files_initialize�transitions_files_initializer��set_tooltip_textrr�rtrwr�r�r�r�r�rIr�r�r�r&rn)rPr{rQrrrr4�sr




















zSELinuxGui.application_selectedcCs tj�tj�|_tj�|_dS)N)r�rNZ
get_fcdict�fcdictZget_local_file_paths�local_file_paths)rPrrrrN6s
zSELinuxGui.reinitcCs�i|_�x�|jd�D�]�}|j�}t|�dkr0q|ddkr>q|d|jkrZi|j|d<|ddkr�d|ddki|jd|d<|dd	kr�|d
|dd�|jd	|d <|dd
kr�d|d
i|jd
|d!<|ddk�rd|d
|dd�|jd|d"<|ddk�r6d|d
i|jd|d#|d$f<|ddk�rj|d
|d|dd�|jd|d%<|ddk�r�|ddk�r�d|jk�r�i|jd<d|d
i|jd|d&<n"d|di|jd|d'|d
f<|ddkrd|ddki|jd|d(<qWd|jk�rdSxJd|jfd|jfgD]2\}}||jdk�r.|j|jd|d��q.Wx*tD]"}||jk�rj|jj|ii��qjWdS))Nrhrrz-Dr�activerz-1rr	r)r�r�r!r�r�s0)r�r��rolerr r)�maskr�r�rz-ezfcontext-equivr�r�enabledz-drjrkr
r
r
r
r
���r
r
r
r
)�	cust_dictrmrr�r�rerY�update)rPr�r\ZrecZsemodule�buttonrrrrD;sJ ""&
""
z)SELinuxGui.previously_modified_initializecCs�tj|�|_x�|jj�D]�}t|j|�dkr0q|j|d}xr|j|dD]`}||f|jdkr�|jd||fddkr�qN||jd||fdkr�qN|j|j|||�qNWqWdS)Nrrrr�z-dr�)r�Zget_entrypoints�entrypointsrYrrX�files_initial_data_insertr�)rPr��exe�
file_classrSrrrr�esz&SELinuxGui.executable_files_initializecCs@y&tj|d�d}tj|�d}||kStk
r:dSXdS)NrrF)r��matchpathcon�
getfileconr_)rPrS�con�currrr�
mislabeledsszSELinuxGui.mislabeledcCs�|j|�sdStj|d�d}tj|�d}d|_|j|dd�|j|dd�|j|dd�|j|d|jd�d�|j|d	|jd�d�dS)
NrrTr	rr�:rr)rr�r�r�r�r�rm)rP�treerSr|r�r�rrrr�set_mislabeled{s
zSELinuxGui.set_mislabeledcCs�tj|�|_x�|jj�D]�}t|j|�dkrF|j|jd|td��q|j|d}xr|j|dD]`}||f|jdkr�|jd||fddkr�qd||jd||fdkr�qd|j|j|||�qdWqWdS)	Nrz	all filesrrrr�z-dr�)	r�Zget_writable_files�writable_filesrYrr�r�rrX)rPr��writer�rSrrrr��sz$SELinuxGui.writable_files_initializec	Cs�|jd�}|dkr td�}d}nl||f|jk}x:tj|�D],}|j|�}|j|d|�|j||||�q:W|r�|j|�}|j|�}|j|�}|j|d|�|j|d|�|j|d|�|j|d|�dS)NzMISSING FILE PATHFrrrr)r�rr�r�Z	find_filer�rr�)	rPrvrSZ
selinux_labelr�r|r��pr�rrrr��s"




z$SELinuxGui.files_initial_data_insertcCsd|S)Nz	<b>%s</b>r)rPr�rrrr��szSELinuxGui.markupcCs |rtjddtjdd|��SdS)Nz</b>$r#z^<b>)�re�sub)rPr�rrr�unmarkup�szSELinuxGui.unmarkupcCs�tj|�|_x�|jj�D]�}t|j|�dkr0q|j|d}x�|j|dD]p}tj||jd�}||f|jdkr�|jd||fddkr�qN||jd||fdkr�qN|j|j	|||�qNWqWdS)Nrr)r�rr�z-dr�)
r�Zget_file_types�
file_typesrYrZget_descriptionr�rXr�r�)rPr�rQr�rS�descrrrr��sz'SELinuxGui.application_files_initializecCs.d}x$|jD]}t|j|�dkrdSqWdS)NrTF)rXr)rPr\rZrrr�modified�s
zSELinuxGui.modifiedcCsbx\tj|�D]N}xH|D]@\}}||jdkr>|jd|d}tj|�}|j|||�qWqWdS)Nrr�)r�Z	get_boolsrX�boolean_desc�boolean_initial_data_insert)rPr��blistrr�rrrrr��s
zSELinuxGui.boolean_initializecCsR|jj�}|jj|d|�|jj|d|�|jj|d|�|jj|dtd��dS)Nrrrr	zMore...)rxr�r�r)rPr�rr�r|rrrr�s

z&SELinuxGui.boolean_initial_data_insertcCsbx\tj|�D]N}d}d}d}d|kr,|d}d|kr<|d}d|krL|d}|j|||�qWdS)Nr�target�source)r�Zget_transitions_into�$transitions_into_initial_data_insert)rPr�r�r��
executablerrrrr��sz&SELinuxGui.transitions_into_initializecCsd|jj�}|dkr0|jj|dt|dd�n|jj|dd�|jj|d|�|jj|d|�dS)Nrr�Defaultr)r{r�r�r�)rPr�rrr|rrrr�s
z/SELinuxGui.transitions_into_initial_data_insertcCs�x�tj|�D]�}d}d}d}d|kr,|d}d|kr<|d}d|krL|d}|j|||�y*x$|j|dD]}|j|||�qlWWqtk
r�YqXqWdS)Nrr�	transtypeZregex)r�Zget_transitions�$transitions_from_initial_data_insertr��KeyError)rPr�r�r�rrZexecutable_typerrrr��s z&SELinuxGui.transitions_from_initializecCs�|jjd�}|dkr6|jj|dd�|jj|dd�n�|jj|�}|jj|dt|dd�d
}|ddr�|jj|dtd	�|�n|jj|dtd
�|�|jj|d|dd�|jj|dd�|jj|d|�|jj|d|�dS)NrrrFr�<span foreground="blue"><u>�</u></span>rz:To disable this transition, go to the %sBoolean section%s.z9To enable this transition, go to the %sBoolean section%s.Tr	)rr)r~r�r�r�r)rPr�rrr|r�r�rrrrsz/SELinuxGui.transitions_from_initial_data_insertcCsJxDtj|�D]6}d|kr"|d}nd}|j|d|d|d|�qWdS)N�filenamer�classr)r�Zget_file_transitions�$transitions_files_inital_data_insert)rPr�r\rrrrr�s

z'SELinuxGui.transitions_files_initializecCsZ|jj�}|jj|d|�|jj|d|�|jj|d|�|dkrFd}|jj|d|�dS)Nrrr�*r	)r�r�r�)rPrS�tclass�destr�r|rrrr"s
z/SELinuxGui.transitions_files_inital_data_insertcGs8|j�d|_d|_d|_d|_|jjd�|jj�|j	j
d�|jj
d�|jj
d�|j
j
d�|jj�r�|jjt�|j|_|j	j
d�|jj��r�|j|j�|j	j
d�|jj
|j�|jj
|j�|j
j
|j�|jjt�|d|jk�r|d}n
|jj�}|tk�r*|j|_td�}n6|tk�rF|j|_td�}n|tk�r`|j |_td�}|j!j"td�||j#d	��|j$j"td
�||j#d	��|jj"td�||j#d	��|j%j��r�|jj&�|j	j
d�|jjt'�|d|j(k�r�|d}n
|j(j�}|t)k�r |j*|_td�}|t+k�r:|j,|_td
�}|j!j"td�|j#|d��|j$j"td�|j#|d��|jj"td�|j#|d��|j-j��r|jjt.�|d|j/k�r�|d}n
|j/j�}|t0k�r�|j1|_|t2k�r�|j3|_|t4k�r|j5|_|j6j��r"|jjt7�|j8j�|j9j��rL|j:�|jjt;�|j8j�|j<j��r�|jjt=�|jj&�|j	j
d�|j>|_|j!j"td��|j$j"td��|jj"td��|j?j��r|jjt@�|jj&�|j	j
d�|jA|_|j!j"td��|j$j"td��|jj"td��|jBj��r~|jjtC�|jj&�|j	j
d�|jD|_|j!j"td��|j$j"td��|jj"td��|jj�|_E|j�r(|j8j&�|jjF�|_|jjF�|_|jjF�|_xXtGd|jjH��D]D}|jjI|�}|�r�|jJ�d}tK|tLjM��r�|jjN||jOd��q�W|jjP�jQ�|jjd�dS)NFTrrr�writabler�z4Add new %(TYPE)s file path for '%(DOMAIN)s' domains.)ZTYPEZDOMAINz3Delete %(TYPE)s file paths for '%(DOMAIN)s' domain.z�Modify %(TYPE)s file path for '%(DOMAIN)s' domain. Only bolded items in the list can be selected, this indicates they were modified previously.r�zlisten for inbound connectionszMAdd new port definition to which the '%(APP)s' domain is allowed to %(PERM)s.)r�ZPERMzVDelete modified port definitions to which the '%(APP)s' domain is allowed to %(PERM)s.zMModify port definitions to which the '%(APP)s' domain is allowed to %(PERM)s.z%Add new SELinux User/Role definition.z.Delete modified SELinux User/Role definitions.z7Modify selected modified SELinux User/Role definitions.z!Add new Login Mapping definition.z*Delete modified Login Mapping definitions.z3Modify selected modified Login Mapping definitions.z$Add new File Equivalence definition.z-Delete modified File Equivalence definitions.z�Modify selected modified File Equivalence definitions. Only bolded items in the list can be selected, this indicates they were modified previously.)Rr!rz�treesort�
treefilterrvr�r�r�r	r�r�r�rcrdr�r�r$rMrsr�r�r�r�rur�r�r�r�rr�r�r�r�r�r�r�r�r�rr�r�r�r�r�r�r�ryr�r�r}r�rzr�r�r�r�r�r�rqr�r�r�r9r�r�r0r�r�rDr�r�r�r�Z
get_columnZ	get_cells�
isinstancerZCellRendererTextZ
set_sort_func�	stripsortr�Zunselect_all)rPr{r��categoryr��colr�rrrrB+s�
























zSELinuxGui.tab_changec	Cs:|j�\}}|j|j||��}|j|j||��}t||�S)N)Zget_sort_column_idr
rwr)	rPr�Zrow1Zrow2Z	user_dataZsort_columnrr�Zval2rrrr%�szSELinuxGui.stripsortcCs�|jj|�}|jj|�}|jj�|jjtd�|jj	|d��t
j|j|jj	|d��}x,|D]$}|j
|d|d|d|d�q^W|j|j�dS)NzBoolean %s Allow RulesrrrrZpermlist)ryr�r�r�r�r�r�rrxrwr�Zget_boolean_rulesr��display_more_detail_initr�)rP�windowsrS�itrrrrrrt�s

$zSELinuxGui.display_more_detailc	Cs0|jj�}|jj|dd|||dj|�f�dS)Nrzallow %s %s:%s { %s };� )r�r�r�r�)rPrrZ
class_typeZ
permissionr|rrrr(�s
z#SELinuxGui.display_more_detail_initcGs�d|_|jtkrJ|jjtd�|j�|jjtd�|j�|j	|�dS|jt
kr�|jjtd�|j�|jjtd�|j�|j
|�|jj�}|tkr�|jjd�n|jjd�d|_|jtkr�|jjtd	��|jjtd
��|j|�d|_|jtk�r2|jjtd��|jjtd��|j|�d|_|jtk�r�|jjd
�|jjd
�|jjtd��|jjtd��d|_|j |j�|j!�dS)NFzGAdd Network Port for %s.  Ports will be created when update is applied.zAdd Network Port for %szMAdd File Labeling for %s. File labels will be created when update is applied.zAdd File Labeling for %szex: /usr/sbin/Foobarzex: /var/lib/FoobarTzGAdd Login Mapping. User Mapping will be created when Update is applied.zAdd Login MappingzQAdd SELinux User Role. SELinux user roles will be created when update is applied.zAdd SELinux Usersr#zMAdd File Equivalency Mapping. Mapping will be created when update is applied.zAdd SELinux File Equivalency)"r�r�r�r�rJrr�r\r��init_network_dialogrur�r[�init_files_dialogr�r�r�rYr�r�r�r3�login_init_dialogr�r�r6�user_init_dialogr�rBrCrArFr��new_updates)rPr{r�rrrr5�sB







zSELinuxGui.add_button_clickedcCs||_|j�dS)N)r�r)rPr�rrrr��szSELinuxGui.show_popupcGs|jj�|jjd�dS)NT)r�r	r�r�)rPr{rrrr'�s
zSELinuxGui.close_popupc
Gs�d}|jr&|j�}|s&|jjd�dSd|_|jtkr@|j|�|jtk�r\|j	j
td�|j�|j
jtd�|j�d|_|j|�d|_d}d}|jj�}|tk�r |jj|�}||_|jj|d�}|jj
|�|jj|d�}|dkr�|j|j|�|jj|d�}|dk�r |j|j|�|tk�r�|jj|�}||_|jj|d�}|jj
|�|jj|d�}	|	dk�r||j|j|	�|jj|d�}|dk�r�|j|j|�|tk�r\|j j|�}||_|j!j|d�}|jj
|�y&|j!j|d�}
|
j"d	�dj"d
�}
Wnt#k
�rYnX|j!j|d�}	|	dk�r<|j|j|	�|
d}|dk�r\|j|j|�|jt$k�r�|j%|�|j&j
|j'j|d��|j(j
|j'j|d��|j)j
|j'j|d��|j|j*|j'j|d��|j+j
td��|j,jtd
��|j-|j,�|jt.k�r~|j/|�|j0j
|j1j|d��|j2j
|j1j|d��|j|j3|j1j|d��|j4j
td��|j5jtd��|j-|j5�|jt6k�r�|j7j
|j8|j9j|d���|j:j
|j8|j9j|d���|j;j
td��|j<jtd��d|_=|j-|j<�dS)NFTzPModify File Labeling for %s. File labels will be created when update is applied.zAdd File Labeling for %s�Modifyrrrz<b>z</b>r	zUModify SELinux User Role. SELinux user roles will be modified when update is applied.zModify SELinux UserszLModify Login Mapping. Login Mapping will be modified when Update is applied.zModify Login MappingzPModify File Equivalency Mapping. Mapping will be created when update is applied.zModify SELinux File Equivalency)>rzrrr�r�r�r�r��modify_button_network_clickedrur�rJrr�r[r��delete_old_itemr-r�r�r�ror�r�rwrYr�r]r^r�rpr�r�rqr�rmr~r�r/r;r7r=r>r:r�r6r�r�r.r-r1r/r,r�r3r�rBr
rErCrArFr�)rPr{r|�	operationr�r�rS�ftyperr�Zget_typerrrr7�s�













z SELinuxGui.modify_button_clickedcGsB|jj|�}|jj|d�}|j|j|�|j|j�|jj�dS)Nr)	r�r�rwr�r]r�r[rNr	)rPr�locr{r|r5rrrrHs
zSELinuxGui.populate_type_combocCs.|dkrdS|jd�rd}nd}|j|�dS)NZ	_script_tZ_tr)�endswithrm)rPrUZ
split_charrrr�strip_domainOs
zSELinuxGui.strip_domaincCs x|D]}|j|�rdSqWdS)NTF)�
startswith)rPr��exclude_listrRrrr�exclude_typeXs

zSELinuxGui.exclude_typec
Gs�g}|jjd�|j|j�|jj�}|jj�|jj�|j	|j
�}xN|jD]D}|dj|�rN|d|j
krN|djd�rN|j
|j	|d��qNW|jj��y�x.tjD]$}|jj
�}|jj|dtj|�q�W|tko�|jdk�rXxR|jj�D]D}|j|��r|jj
�}|jj|d|�|jj
�}|jj|d|�q�W|jjd�|jjd��n(|tk�r�|jdk�r�xp|jj�D]b}	|	j|��r�|j|	|��r�|	|jk�r�|jj
�}|jj|d|	�|jj
�}|jj|d|	��qzW|jjd�n�|tk�r�|jdk�r�xntj�D]b}
|
j|��r|
j|��rT|j|
|��rT|jj
�}|jj|d|
�|jj
�}|jj|d|
��qW|jjd�Wntk
�r�td�YnX|jjd�|jj d�|jj
�}|jj|dt!d��dS)	NTrZ	httpd_sysrFrr�zMore...)"r^r�r�r[r�r�r�r�r�r8r�r�r9r�r�r��
file_type_strr�r�r�rYrer�rr;rr�Zget_all_file_typesr~r�r]r`rJr)rPr{r:r�Zcompare�d�filesr|r�rrQrrrr-^s`


,



(




zSELinuxGui.init_files_dialogcGs�|j�}|s|jjd�dS|jjtd�|j�|jjtd�|j�d|_	|j
|�d}d}d|_|j�}|jj
|d�}|jj|�|jj
|d�}|dkr�|jjd�n|d	kr�|jjd�|jj
|d
�}|dkr�|j|j|�||_	dS)NFzJModify Network Port for %s.  Ports will be created when update is applied.zModify Network Port for %sr1rTrr�r�r)rrr�r�r�rJrr�r\r�r3r,r�rvrwrZr�rer�r�r�)rPr{r|r4r�rr�r�rrrr2�s.
z(SELinuxGui.modify_button_network_clickedc
Gs�|j|j�|jj�}|jj�|jjd��y:|tkrPt	j
j|jdddd�}n8|t
kr�t	j
j|jdddd�}|t	j
j|jdddd�7}g}xL|j�D]@}x:||D].\}}||dd	gkr�|jd
�r�q�|j|�q�Wq�W|j�|j|j�}|ddk�r|dd�}|d
}d}	d}
x@|D]8}|j|��r2|	}
|jj�}|jj|d|�|	d7}	�qW|jj|
�Wntk
�r~YnX|jjd�|jjd�dS)Nr#r�r�T)r�r�r�Zport_tZunreserved_port_tZ_typerr=rrr�r
r
)r�r\r�r�r�r�rZrJr�r�r�r�r�r�rYr7r�r�r8r9r�r�rer~r�rb)rPr{r�r�Z
port_typesrZr�r�Zshort_domainr[�foundr|rrrr,�sF




zSELinuxGui.init_network_dialogcGsN|j|�}|jj�dkrJx0tj�D]$}||dkr"|jj|jdd��q"WdS)Nr#r�r�)r�r/r�r�r�rJr)rP�combor{r�r�rrrr(�s

z'SELinuxGui.login_seuser_combobox_changecGsN|j|�}|jj�dkrJx0tj�D]$}||dkr"|jj|jdd��q"WdS)Nr#r�r�)r�r>r�r��
get_all_rolesrJr)rPr@r{Zseroler�rrrr)�s

z%SELinuxGui.user_roles_combobox_changecCsNd}|jsdS|jj�}|s dS|j�\}}|rJ|j|�}|rJ|jj|�}|S)N)rzr�r�r�r#)rPr|r�r"rrrrr�s

zSELinuxGui.get_selected_itercGsf|jjd�|j�}|dkr,|jjd�dS|j|sH|j|drLdS|jj|j|d�dS)NFrr
r
)r�r�rrrv)rPr{r|rrrr�szSELinuxGui.cursor_changedcGsn|j|j�|jj�tj�}|j�x*|D]"}|jj�}|jj|dt	|��q,W|j
jd�|jjd�dS)Nrr#)
r�r3r�r�r�Z
get_all_usersr�r�r�rr-rJr/)rPr{r�r�r|rrrr.�s


zSELinuxGui.login_init_dialogcGsn|j|j�|jj�tj�}|j�x*|D]"}|jj�}|jj|dt	|��q,W|j
jd�|jjd�dS)Nrr#)
r�r6r�r�r�rAr�r�r�rr;rJr>)rPr{r�rr|rrrr/	s


zSELinuxGui.user_init_dialogcCsh|jrdd|j�}|j�y|jj|�Wn0tjjk
rZ}z|j|�WYdd}~XnX|j�dS)Nzboolean -m -%d deny_ptrace)	r�r�rlr��semanager�r�rrn)rP�checkbutton�
update_bufferrRrrrrszSELinuxGui.on_disable_ptracecs�|jj���fdd�}g}|jtkrh|j�s8|j|j�Sx.|jD]$}|d|jdkr@|j||��q@W|jt	kr�|j
j�}|j�s�|tkr�|j
|j�S|tkr�|j|j�S|tkr�|j|j�Sx2|jD](}|d|df|jdkr�|j|�q�W|jtk�rR|j��s|j|j�Sx:|jD]0}|d|df|jdk�r|j||���qW|jtk�r�|j�d	k�st|j�Sx2|jD](}|d|jd
k�r||j||���q|W|jtk�r�|j��s�|j�Sx2|jD](}|d|jdk�r�|j||���q�W|jtk�rP|j�d	k�s|j�Sx2|jD](}|d|jdk�r$|j||���q$W|jj�xB|D]:}|jj�}x(td��D]}|jj||||��qzW�q`WdS)
Ncs*g}x td��D]}|j||�qW|S)Nr)r�r�)r��lr\)rTrr�dup_row!sz1SELinuxGui.on_show_modified_only.<locals>.dup_rowrrrrrrTzfcontext-equivrr)rvr�r�rsr�r�r�r�r�rur�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�)rPrCrFZappend_listr�r�r|r\r)rTrr8sd








z SELinuxGui.on_show_modified_onlyc	Cs8|jd�}|j|d|�|j|d|�|j|d|�dS)Nrrr)r�r�)	rPrrQr�r4rSZfclassr5r|rrr�init_modified_files_liststorebs
z(SELinuxGui.init_modified_files_liststorecGstd�dS)Nzrestore to defualt clicked...)r�)rPr{rrr�restore_to_defaulthszSELinuxGui.restore_to_defaultcGs(|j|j�|jjd�|jjd�dS)NT)rCr'r[r�r\)rPr{rrrrkszSELinuxGui.invalid_entry_retrycCsVt|�dks|ddkrR|jj�|jjd�|jjd�|jjtd�|�dSdS)Nrr�FzAThe entry '%s' is not a valid path.  Paths must begin with a '/'.T)	rr'rr[r�r\r(rJr)rPZ
insert_txtrrr�error_check_filesps
zSELinuxGui.error_check_filescCsly t|�}|dks|dkrt�WnFtk
rf|jj�|jjd�|jjd�|jjt	d��dSXdS)NriFz'Port number must be between 1 and 65536T)
�int�
ValueErrorr'rr[r�r\r(rJr)rPrZpnumrrr�error_check_networkys
zSELinuxGui.error_check_networkcGs2|jr.|j|j�td�kr.|jj�|jj�dS)NzMore...)r�r�r]rr[r	rNr)rPr{rrrrA�s
zSELinuxGui.show_more_typesc	Gs|j�|j|j�}|jj�}|jj�}|jr�|j�}|jj	|d�}|jj	|d�}|jj	|d�}|j
j|d|�|j
j|d|�|j
j|d|�d|||||d�|jd|<n"|j
j
d�}d||d�|jd|<|j
j|d|�|j
j|d|�|j
j|d|�|j�dS)	Nrrrz-m)r�r�r��oldrange�	oldseuser�oldnamerz-a)r�r�r�)r'r�r,r/r�r-r�rrr1rwrvr�rXr�r0)	rPr{r��	mls_ranger�r|rOrNrMrrrr�s&

zSELinuxGui.update_to_loginc	Gsj|j�|j|j�}|jj�}|jj�}|jj�}|jr�|j�}|j	j
|d�}|j	j
|d�}|j	j
|d�}	|j	j
|d�}
|jj|d|�|jj|d|�|jj|d|	�|jj|d|
�d||||
|	||d�|j
d|<nD|jjd�}|s�|�r
d|||d	�|j
d|<nd|d
�|j
d|<|jj|d|�|jj|d|�|jj|d|�|jj|d|�|j�dS)Nrrr	rz-m)r�r�r�r�rM�oldlevel�oldrolesrOrz-a)r�r�r�r�)r�r�)r'r�r:r=r�r>r;r�rrr7rwrvr�rXr�r0)rPr{r�r�rPr�r|rOrRrQrMrrrr�s2


"
zSELinuxGui.update_to_usercGs�|j�|jj�}|jj�}|jrl|j�}|j|jj|d��}|j|jj|d��}d|||d�|j	d|<n |jj
d�}d|d�|j	d|<|jj|d|j|��|jj|d|j|��dS)Nrrz-m)r��src�oldsrc�olddestzfcontext-equivz-a)r�rS)r'rCr�rBr�rrr
rvr�rXr�r�)rPr{r rSr|rUrTrrrr�s

zSELinuxGui.update_to_file_equivc
Gs0|j�d|_|jj�}|j|�r&dS|j|j�}|jj�}|j|j�}|j	r�|j
�}|j|jj
|d��}|j|jj|d��}|jj
|d�}	d||||	d�|jd||f<n$|jjd�}d|d	�|jd||f<|jj|d|j|��|jj|d|j|��|jj|d|j|��d
|_|jjd
�|j�dS)NTrrrz-m)r�r��oldtype�oldpathZoldclassrz-a)r�r�F)r'r�rYr�rIr�r]r`r^r�rr�unmarkrvrwr�rXr�r�r�rer0)
rPr{rS�setyper�rr|rW�	oldsetypeZ	oldtclassrrrr�s,


 zSELinuxGui.update_to_filesc
Gs2d|_|jj�}|j|�rdS|jj�r.d}nd}|j|j�}|jj�}|j	r�|j
�}|j|jj
|d��}|j|jj
|d��}|j|jj|d��}	d|||	||d�|jd	||f<n&|jjd�}d
||d�|jd	||f<|jj|d|�|jj|d|�|jj|d|�d|_|jj�|jjd�|j�dS)
NTr�r�rrrz-m)r�r�r�rV�oldprotocol�oldportsrz-a)r�r�r�F)r�rZr�rLr�r�r�r�rbr�rrrXrvrwr�rXr�r\r	r�r�r0)
rPr{r�r�rYr�r|r\r[rZrrrr�s0



"
zSELinuxGui.update_to_networkcGs�d}|jjd�|jtkr�|jj�|jd}xZ|D]R\}}|||fd}|jj�}|jj|d|�|jj|d|�|jj|d|�q4W|j	|j
�dS|jtk�r,|jj�|jd}x`|D]X\}	}
||	|
fd}|jj�}|jj|d|	�|jj|d|�|jj|dt
j|
�q�W|j	|j�dS|jtk�r�|jj�|jd	}x�|D]|}||d
}
||jdd�}||jd
d�}|jj�}|jj|d|�|jj|d|
�|jj|d|�|jj|d|��qRW|j	|j�dS|jtk�rx|jj�|jd}xd|D]\}||d}||jdd�}|jj�}|jj|d|�|jj|d|�|jj|d|��qW|j	|j�dS|jtk�r�|jj�xX|jD]N}|d�r�|jj�}|jj|d|j|d��|jj|d|j|d���q�W|j	|j�dSdS)NZAddFrr�rrr	rrr�r�r#r�rrr�r)r�r�r�r�rmr�r�r�r�r�rkrurjr�r<rhr�r?rr@r�r4r5r�rGrEr
rH)rPr{r4Z	port_dictr�r�rYr|Z
fcontext_dictrSrZ	user_dictrr�r�r�Z
login_dictrr�r�rrrr6sv

















 z SELinuxGui.delete_button_clickedcGsx|j�|jtkrNx:|jD]0}|drd|dd�|jd|d|df<qW|jtkr�x>|jD]4}|dr`d|dd�|jd|dt|df<q`W|jtkr�x8|j	D].}|dr�d|d|d	d
�|jd|d<q�W|jt
k�r(x>|jD]4}|dr�d|d|d|dd�|jd
|d<q�W|jtk�rlx6|j
D],}|d�r<d|dd�|jd|d<�q<W|j�dS)Nrz-dr	)r�r�rrrrr)r�r�r�r)r�rr�r�r)r�rSzfcontext-equiv)r'r�r�rmrXrurj�reverse_file_type_strr�r?r�r4r�rGr0)rPr{�deleterrrr�Os,
(
,
&,
"z!SELinuxGui.on_save_delete_clickedcGs,x&|jD]}t|d|d|d�qWdS)Nrrr)rjr�)rPr{r^rrr�!on_save_delete_file_equiv_clickedgsz,SELinuxGui.on_save_delete_file_equiv_clickedcCs||d||d<dS)Nrr)rPr�rSr�rrrr�kszSELinuxGui.on_toggle_updatecCsVd}xL|D]D}|d|dkrF|d|dkrF|j|�}|j|�dS|d7}q
WdS)Nrrr)r�r�)rPrvr�r[r�r|rrr�ipage_deletens
 

zSELinuxGui.ipage_deletecCs�|sdS|jj|�}|jj|�}|j|d�}|j|d|j|d��|j|d�}||jdkrl|jd|=nd|i|jd|<|j�dS)Nrrrr�)ryr�r�rwr�rXr0)rPr�rSr�r|r�r�rrrr�wszSELinuxGui.on_togglecGs|j�|_|jj�dS)N)r�r�r��refilter)rP�entryr{rrrr-�s
z#SELinuxGui.get_advanced_filter_datacGs|j�|_|jj�dS)N)r�r�r#ra)rPr)r{rrrr�s
zSELinuxGui.get_filter_datacGsFd|_|jj�x�|jdD]�}|jd|d}|jjd�}|jj|dd�|jj|dtj|��|jj|dt|jd|d�|jj|dd�|jj|�}|jj|dt	d	�|�|jj|dd
�qW�x�|jdD�]~\}}|jd||fd}|jd||fd}|jjd�}|jj|dd�|jj|d|�|jj|dd�|d
k�rr|jj|dt	d�|j
�|dk�r�|jj|dt	d�|j
�|dk�r�|jj|dt	d�|j
�|jj|�}|jj|dd
�|jj|dt	d�|�|jj|�}|jj|dd
�|jj|dt	d�tj|�|jj|�}|jj|dd
�|jj|dt	d�|�q�W�x�|jdD�]z\}	}
|jd|	|
fd}|jjd�}|jj|dd�|jj|d|�|jj|dd�|d
k�r�|jj|dt	d�|j
�|dk�r|jj|dt	d�|j
�|dk�r6|jj|dt	d�|j
�|jj|�}|jj|dt	d�|	�|jj|dd
�|jj|�}|jj|dt	d�|
�|jj|dd
�|jd|	|
fd}|jj|�}|jj|dd
�|jj|dt	d�|��qnW�x�|jdD�]r}|jd|d}|jjd�}|jj|dd�|jj|d|�|jj|dd�|d
k�rn|jj|dt	d��|dk�r�|jj|dt	d��|dk�r�|jj|dt	d��|jj|�}|jj|dt	d �|�|jj|dd
�|jj|�}|jj|dd
�|jd|d!}|jj|dt	d"�|�|jd|jd#d$�}
|jj|�}|jj|dd
�|jj|dt	d%�|
��q�W�x�|jd&D�]r}|jd&|d}|jjd�}|jj|dd�|jj|d|�|jj|dd�|d
k�r�|jj|dt	d'��|dk�r|jj|dt	d(��|dk�r0|jj|dt	d)��|jj|�}|jj|dd
�|jj|dt	d*�|�|jj|�}|jj|dd
�|jd&|d+}|jj|dt	d,�|�|jd&|jd#d$�}
|jj|�}|jj|dd
�|jj|dt	d%�|
��q�W�x8|jd-D�](}|jd-|d}|jjd�}|jj|dd�|jj|d|�|jj|dd�|d
k�rz|jj|dt	d.��|dk�r�|jj|dt	d/��|dk�r�|jj|dt	d0��|jj|�}|jj|dd
�|jj|dt	d1�|�|jj|�}|jj|dd
�|jd-|d2}|jj|dt	d3�|��qW|j
|j�dS)4NTrr�rrrr�r	zSELinux name: %sFrr�z-azAdd file labeling for %sz-dzDelete file labeling for %sz-mzModify file labeling for %sz
File path: %szFile class: %szSELinux file type: %srzAdd ports for %szDelete ports for %szModify ports for %szNetwork ports: %szNetwork protocol: %srzAdd userzDelete userzModify userzSELinux User : %sr�z	Roles: %sr�r#zMLS/MCS Range: %srzAdd login mappingzDelete login mappingzModify login mappingzLogin Name : %sr�zSELinux User: %szfcontext-equivzAdd file equiv labeling.zDelete file equiv labeling.zModify file equiv labeling.zFile path : %srSzEquivalence: %s)r�r�r�rXr�r�r�rr�rr�r<rr�rV)rPr{Zboolsr4r|r�rSrrYrr�rr�r�rr�rSrrr�
update_gui�s�
"














[Remainder of a compiled CPython 3.6 module (the sepolicy GUI). The marshalled bytecode is binary and cannot be reproduced as text. The recoverable symbols are methods of the SELinuxGui class, including update_gui, show_system_page, show_file_equiv_page, show_users_page, show_applications_page, show_mislabeled_files, fix_mislabeled, apply_changes_button_press, update_the_system, format_update, revert_data, import_config, export_config, relabel_on_reboot, set_enforce, enabled_changed, unconfined_toggle, permissive_toggle, confirmation_close and quit, together with the confirmation-dialog strings they display and the semanage command templates ("boolean -m -%d %s", "fcontext %s -t %s -f %s %s", "port %s -t %s -p %s %s", "login %s -s %s -r %s %s", "user %s -L %s -r %s -R %s %s") used to build policy updates.]
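The format strings recoverable above, combined with the SELinuxDBus calls the class makes, suggest that the GUI batches its pending changes into a single semanage command buffer and hands it to the org.selinux D-Bus service. A minimal sketch of that pattern, assuming the sepolicy.sedbus module from this package is importable; the boolean name and the /srv/web path are example values only, not anything the GUI itself hard-codes:

import sepolicy.sedbus as sedbus

# Each line is one semanage subcommand; the whole buffer is applied as a
# single transaction, mirroring what the GUI's format_update() appears to build.
update_buffer = (
    "boolean -m -1 httpd_can_sendmail\n"
    "fcontext -a -t httpd_sys_content_t '/srv/web(/.*)?'\n"
)

proxy = sedbus.SELinuxDBus()
proxy.semanage(update_buffer)   # privileged call, mediated by polkit
proxy.restorecon("/srv/web")    # relabel the path once the new rule exists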
site-packages/sepolicy/__pycache__/__init__.cpython-36.opt-1.pyc
[Compiled CPython 3.6 bytecode for the sepolicy package __init__ module; binary payload omitted, it is not human-readable. Recoverable identifiers include the rule-type constants (allow, auditallow, neverallow, dontaudit, transition, role_allow), the seinfo keys (source, target, class, permlist), the query helpers search() and info(), and accessors such as get_all_domains(), get_all_types(), get_all_booleans(), get_all_file_types() and get_all_port_types().]
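The identifiers embedded in this member (search, info, ALLOW, DONTAUDIT, TYPE, BOOLEAN, get_all_domains and friends) correspond to the query layer the rest of the package is built on. A small usage sketch, assuming a system with an installed SELinux policy; httpd_t is only an example domain:

import sepolicy

# Every process domain the loaded policy defines.
domains = sepolicy.get_all_domains()

# Allow rules whose source is httpd_t, returned as dictionaries with
# 'source', 'target', 'class' and 'permlist' keys.
rules = sepolicy.search([sepolicy.ALLOW], {sepolicy.SOURCE: "httpd_t"})
for rule in rules[:5]:
    print(rule["source"], rule["target"], rule["class"], rule["permlist"])

# Metadata for a single type, including the attributes it carries.
for entry in sepolicy.info(sepolicy.TYPE, "httpd_t"):
    print(entry["name"], entry["attributes"])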
site-packages/sepolicy/__pycache__/interface.cpython-36.pyc
[Compiled bytecode for sepolicy.interface; binary payload omitted. Its __all__ list names get_all_interfaces, get_interfaces_from_xml, get_admin, get_user, get_interface_dict, get_interface_format_text, get_interface_compile_format_text, get_xml_file and interface_compile_test, along with the _admin/_role interface suffixes and the segenxml/Makefile invocations used for compile testing.]
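The __all__ list visible in this member names the interface helpers. A short sketch of how they are typically called, assuming the selinux-python development files (policy.xml under /usr/share/selinux/devel) are installed; 'apache_admin' is just an example interface name:

import sepolicy.interface as iface

names = iface.get_all_interfaces()   # every interface parsed from policy.xml
admins = iface.get_admin()           # domains that ship a *_admin interface
roles = iface.get_user()             # domains that ship a *_role interface

print(len(names), "interfaces found")
print("apache_admin available:", "apache_admin" in names)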
site-packages/sepolicy/__pycache__/manpage.cpython-36.pyc
[Compiled bytecode for sepolicy.manpage; binary payload omitted. Recoverable pieces are the exported names (ManPage, HTMLManPages, manpage_domains, manpage_roles, gen_domains) and the embedded groff templates the generator writes into each *_selinux(8) page: the NAME/DESCRIPTION preamble, the ENTRYPOINTS, PROCESS TYPES, BOOLEANS, NSSWITCH DOMAIN, PORT TYPES, MANAGED FILES, FILE CONTEXTS, SHARING FILES, MCS Constrained and COMMANDS sections, the SELinux-user and role page text, and the semanage/restorecon/setsebool example commands quoted in the generated pages. The HTMLManPages class additionally embeds the index and style.css boilerplate for the HTML rendering.]
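The readable strings in this member are the groff templates that sepolicy's man-page generator emits. A hedged sketch of driving it from Python, assuming the policy development environment is present; the output directory and the 'httpd' domain are example values:

from sepolicy.manpage import ManPage, gen_domains

# Domains for which a *_selinux(8) page can be generated.
domains = gen_domains()

if "httpd" in domains:
    # Writes httpd_selinux.8 under the given path and reports its location.
    page = ManPage("httpd", path="/tmp/selinux-man")
    print(page.get_man_page_path())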
site-packages/sepolicy/__pycache__/transition.cpython-36.pyc
[Compiled bytecode for sepolicy.transition; binary payload omitted. It exports a single class, setrans, which walks process-domain transitions (optionally toward a given destination domain) and prints the discovered paths via its output() method.]
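This member exports a single helper, setrans, which walks the process-domain transition graph. A minimal sketch, assuming the loaded policy defines the example domains used here:

from sepolicy.transition import setrans

# All transition paths out of init_t ...
setrans("init_t").output()

# ... or only the paths that eventually reach httpd_t.
setrans("init_t", "httpd_t").output()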
site-packages/sepolicy/__pycache__/sedbus.cpython-36.opt-1.pyc
[Compiled bytecode for sepolicy.sedbus; binary payload omitted. It defines the SELinuxDBus proxy class whose polkit-enabled methods (semanage, restorecon, setenforce, customized, semodule_list, relabel_on_boot, change_default_mode, change_default_policy) call the org.selinux service at /org/selinux/object on the system bus, plus a small __main__ block that passes sys.argv[1] to setenforce and prints the response.]

site-packages/sepolicy/__pycache__/manpage.cpython-36.opt-1.pyc
[Compiled bytecode for the optimized build of sepolicy.manpage, duplicating the member above; binary payload omitted. The dump breaks off partway through this member.]
�|_tj�|_
tj�|_tj�|_t�d|_t�d|_tj�|_tj�|_t�|_t�|_t�|_t�|_ |jr�|jd|_!n|jt"j#�|_!tj$|j!�|_%t&j'j(|�s�t&j)|�||_'|j�r
|jd|_*n|jd|_*tj+|j*�|_,tj-|�\|_.|_/|j.d|_0|j1�d||j.f|_2t3|j2d�|_4|j.d	|jk�r�|j5�|j�r�t6j7|j2�n|j�r�t8j7|j2�|j9�|j4j:�x<t;j<�D]0}||j.k�r�xt;|D]}|j=|��q�W�q�WdS)
NrrMZ
file_contextsz
policy.xmlz#/usr/share/selinux/devel/policy.xml�_tz%s/%s_selinux.8rV�_r)>rq�source_filesr#rZ
gen_port_dict�portrecsrr8r9�all_domainsZget_all_attributes�all_attributesZ
get_all_boolsZ	all_boolsZget_all_port_types�all_port_typesr:Z	all_rolesr/�	all_users�all_users_rangeZget_all_file_types�all_file_typesZget_all_role_allows�role_allowsrFr1r?r>rAr@rCrBZfcpath�selinuxZselinux_file_context_pathZ
get_fcdict�fcdictrkr�exists�makedirs�xmlpathZ
gen_bool_dict�
booleans_dictZgen_short_namerr�
short_name�type�
_gen_bools�
man_page_pathrZr^�_ManPage__gen_user_man_pagerr)r�_ManPage__gen_man_pager\�
equiv_dict�keys�_ManPage__gen_man_page_link)rerrrr#rzrq�k�aliasrrrrf�s\








zManPage.__init__cCs�g|_g|_|jg}|jtkrNx.t|jD] }|d|jkr*|j|d�q*Wx6|D].}tj|�\}}|j|7_|j|7_qTW|jj	�|jj	�dS)Nrx)
�bools�domainboolsr�rrr�r|r)rZ	get_boolsr,)rer1�tr�r�rrrr��s


zManPage._gen_boolscCs|jS)N)r�)rerrr�get_man_page_path�szManPage.get_man_page_pathc
Cs�|jd|_|jst|j�|_y|j|j|_Wnd|j|_YnX|j|jkr�ttj	tj
|j��d|_|j
�|j�|j�|j�|j�|j�|j�|j�n|j�|j�|j�|j�|j�|j�dS)Nryz%s user rolerD)rrr=rrr��descrr3rr(rEr�rD�_user_header�_user_attribute�	_can_sudo�_xwindows_login�_network�	_booleans�
_home_exec�_transitions�_role_header�_port_types�
_mcs_types�_writes�_footer)rerrrZ__gen_user_man_page�s.
zManPage.__gen_user_man_pagecCsLd|j|f}td|j|fd�|_|jjd|j�|jj�t|�dS)Nz%s/%s_selinux.8rVz.so man8/%s_selinux.8)rrZr^r[rrr\rY)rer�rrrrZ__gen_man_page_link�s

zManPage.__gen_man_page_linkcCs�g|_i|_g|_|j�x�|jD]|}y@t|rd|j}|j}xt|D]}|j|�qFW||_||_Wntk
r|w"YnXt	t
jt
jd|��d|j|<q"W|j
�|j�|j�|j�|j�|j�|j�|j�|j�|j�|j�dS)Nz%srD)�	anon_listrD�ptypes�_get_ptypes�typealias_typesr^r��_typealias_gen_man�KeyErrorr3rr(rE�_header�_entrypoints�_process_typesr�r��_nsswitch_domainr�r��
_file_context�_public_contentr�)reZdomain_typer^r�r�rrrZ__gen_man_page�s6
$zManPage.__gen_man_pagecCs8x2|jD](}|j|j�s$|j|j�r|jj|�qWdS)N)r|�
startswithr�rrr�r))rerJrrrr�szManPage._get_ptypescCsZd|j|dd�f|_g|_d|_t|jd�|_|j|dd��|j�|jj�dS)Nz%s/%s_selinux.8r6r%rVr7r7)	rr��ports�booltextrZr^�
_typealiasr�r\)rer�rrrr�szManPage._typealias_gen_mancCsN|jjd|tjd�d��|jjd||jd��|jjdd|ji�dS)Nz\.TH  "%(typealias)s_selinux"  "8"  "%(date)s" "%(typealias)s" "SELinux Policy %(typealias)s"z%y-%m-%d)�	typealias�datez�
.SH "NAME"
%(typealias)s_selinux \- Security Enhanced Linux Policy for the %(typealias)s processes
.SH "DESCRIPTION"

%(typealias)s_t SELinux domain type is now associated with %(domainname)s domain type (%(domainname)s_t).
)r�rrzC
Please see

.B %(domainname)s_selinux

man page for more details.
rr)r^r[�time�strftimerr)rer�rrrr�#szManPage._typealiascCs8|jjd|jtjd�d��|jjdd|ji�dS)Nz_.TH  "%(domainname)s_selinux"  "8"  "%(date)s" "%(domainname)s" "SELinux Policy %(domainname)s"z%y-%m-%d)rrr�a�
.SH "NAME"
%(domainname)s_selinux \- Security Enhanced Linux Policy for the %(domainname)s processes
.SH "DESCRIPTION"

Security-Enhanced Linux secures the %(domainname)s processes via flexible mandatory access control.

The %(domainname)s processes execute with the %(domainname)s_t SELinux type. You can check if you have these processes running by executing the \fBps\fP command with the \fB\-Z\fP qualifier.

For example:

.B ps -eZ | grep %(domainname)s_t

rr)r^r[rrr�r�)rerrrr�6s
zManPage._headercCsH|j|ddj�|j|ddd�}|ddkrD|dd�}|S)Nr6rrM�.rNrN)r��lower)rerr�rrr�_format_boolean_descHs,zManPage._format_boolean_desccCspd}xf|j|jD]V\}}|jd�r<||jkr<|jj|�q||jkrHq|d|j|�||j||f7}qW|S)Nr%Z
anon_writezg
.PP
If you want to %s, you must turn on the %s boolean. %s by default.

.EX
.B setsebool -P %s 1

.EE
)r�r��endswithr�r)r�r��enabled_str)rer�rZenabledrrr�_gen_bool_textNs
"zManPage._gen_bool_textcCs>|j�|_|jdkr:|jjd|j|jf�|jj|j�dS)Nr%z�
.SH BOOLEANS
SELinux policy is customizable based on least access required.  %s policy is extremely flexible and has several booleans that allow you to manipulate the policy and run %s with the tightest access possible.

)r�r�r^r[rr)rerrrr�as


zManPage._booleanscCs�g}ddg}d}x*|jj�D]}d|j|kr|j|�qWt|�r~|jjd�x,|D]$}|d|j|�dj|�||f7}qVW|jj|�dS)NZauthlogin_nsswitch_use_ldapZkerberos_enabledr%Znsswitch_domainz
.SH NSSWITCH DOMAIN
zb
.PP
If you want to %s for the %s, you must turn on the %s boolean.

.EX
.B setsebool -P %s 1
.EE
z, )rDr�r)rIr^r[r�rH)reZnsswitch_typesZnsswitch_booleansZnsswitchbooltextr�rrrrr�ms
"zManPage._nsswitch_domaincCsZt|j�dkrdS|jjdd|ji�|jjddj|j��|jjdd|ji�dS)Nra�
.SH PROCESS TYPES
SELinux defines process types (domains) for each process running on the system
.PP
You can see the context of a process using the \fB\-Z\fP option to \fBps\bP
.PP
Policy governs the access confined processes have to files.
SELinux %(domainname)s policy is very flexible allowing users to setup their %(domainname)s processes in as secure a method as possible.
.PP
The following process types are defined for %(domainname)s:
rrz
.EX
.B %s
.EEz, z�
.PP
Note:
.B semanage permissive -a %(domainname)s_t
can be used to make the process type %(domainname)s_t permissive. SELinux does not deny access to permissive process types, but the AVC (SELinux denials) messages are still generated.
)rIr�r^r[rrrH)rerrrr��s
zManPage._process_typesc	Cs�g|_x2|jD](}|j|j�s*|j|j�r|jj|�qWt|j�dkrLdS|jjdd|ji�xv|jD]l}|jjd|�d}xRdD]J}||f|j	kr�|r�|jjd�d	}|jjd
|dj
|j	||f�f�q�WqjWdS)
Nra�
.SH PORT TYPES
SELinux defines port types to represent TCP and UDP ports.
.PP
You can see the types associated with a port by using the following command:

.B semanage port -l

.PP
Policy governs the access confined processes have to these ports.
SELinux %(domainname)s policy is very flexible allowing users to setup their %(domainname)s processes in as secure a method as possible.
.PP
The following port types are defined for %(domainname)s:rrz

.EX
.TP 5
.B %s
.TP 10
.EE
T�tcp�udpz

Default Defined Ports:Fz

%s %s
.EE�,)r�r�)r�r~r�r�rrr)rIr^r[r{rH)rerJ�p�onceZprotrrrr��s(

zManPage._port_typescCs�g}g}g}x^|jD]T}|j|j�r|j|�||jksB||jkrL|j|�||jkr||j|d}qWt|�dkr|dS|j�i}xt|D]l}d}x*|D]"}|j|�r�||j|�d}Pq�W|s�x2t	D]*}	|j|	�o�|j
d�r�g||dd�<Pq�Wq�Wg}
x*|D]"}t||�dk�r|
j|��qW|jjdd|ji�t|
�dk�r�|jjd	�x2|
D]*}	|jjd
|j|	|	j
d�dd
���qdW|�r�|jjd|j|dd��|jjdd|ji�|j�x�|D]�}|jjd|tj|�f�||jk�r�d}t|j|d�dk�r�d}|jjd||j|ddf�x0|j|ddd�D]}
|jjd|
��q\W�q�W|jjd�dS)N�regexrFTz(/.*)?�a�
.SH FILE CONTEXTS
SELinux requires files to have an extended attribute to define the file type.
.PP
You can see the context of a file using the \fB\-Z\fP option to \fBls\bP
.PP
Policy governs the access confined processes have to these files.
SELinux %(domainname)s policy is very flexible allowing users to setup their %(domainname)s processes in as secure a method as possible.
.PP
rrz 
.PP
.B EQUIVALENCE DIRECTORIES
a�
.PP
%(domainname)s policy stores data with multiple different file context types under the %(equiv)s directory.  If you would like to store the data in a different directory you can use the semanage command to create an equivalence mapping.  If you wanted to store this data under the /srv dirctory you would execute the following command:
.PP
.B semanage fcontext -a -e %(equiv)s /srv/%(alt)s
.br
.B restorecon -R -v /srv/%(alt)s
.PP
rLrM)rr�equivZalta�
.PP
.B STANDARD FILE CONTEXT

SELinux defines the file context types for the %(domainname)s, if you wanted to
store files with these types in a diffent paths, you need to execute the semanage command to sepecify alternate labeling and then use restorecon to put the labels on disk.

.B semanage fcontext -a -t %(type)s '/srv/my%(domainname)s_content(/.*)?'
.br
.B restorecon -R -v /srv/my%(domainname)s_content

Note: SELinux often uses regular expressions to specify labels that match multiple files.
)rrr�z=
.I The following file types are defined for %(domainname)s:
z

.EX
.PP
.B %s
.EE

- %s
r%�sz
.br
.TP 5
Path%s:
%sz, %sa

.PP
Note: File context can be temporarily modified with the chcon command.  If you want to permanently change the file context you need to use the
.B semanage fcontext
command.  This will modify the SELinux labeling database.  You will need to use
.B restorecon
to apply the labels.
i����rNrN)r�r�rrr)r>r@r�rIr,�
equiv_dirsr�r^r[r*rZget_description)re�flistZflist_non_execZmpathsrJZmdirsZmpr;Zmd�er�r�plural�xrrrr��sr








	
$
zManPage._file_contextcCsdd}xN|jD]D}||jkrq|j|j�r4|d|7}|j|jd�r|d|7}qW|jj|�dS)Nr%z, %s_selinux(8)r!)r8rrr�r�r^r[)re�retr-rrr�	_see_also9s
zManPage._see_alsocCszt|j�dkrv|jjdd|ji�xP|jD]F}|j|ddj�|j|ddd�}|jjd|||f�q,WdS)Nra,
.SH SHARING FILES
If you want to share files with multiple domains (Apache, FTP, rsync, Samba), you can set a file context of public_content_t and public_content_rw_t.  These context allow any of the above domains to read the content.  If you want a particular domain to write to the public_content_rw_t domain, you must set the appropriate boolean.
.TP
Allow %(domainname)s servers to read the /var/%(domainname)s directory by adding the public_content_t file type to the directory and by restoring the file type.
.PP
.B
semanage fcontext -a -t public_content_t "/var/%(domainname)s(/.*)?"
.br
.B restorecon -F -R -v /var/%(domainname)s
.pp
.TP
Allow %(domainname)s servers to read and write /var/%(domainname)s/incoming by adding the public_content_rw_t type to the directory and by restoring the file type.  You also need to turn on the %(domainname)s_anon_write boolean.
.PP
.B
semanage fcontext -a -t public_content_rw_t "/var/%(domainname)s/incoming(/.*)?"
.br
.B restorecon -F -R -v /var/%(domainname)s/incoming
.br
.B setsebool -P %(domainname)s_anon_write 1
rrr6rMzW
.PP
If you want to %s, you must turn on the %s boolean.

.EX
.B setsebool -P %s 1
.EE
)rIr�r^r[rrr�r�)rerr�rrrr�Ds,zManPage._public_contentcCsp|jjd�t|j�dkr&|jjd�|jdkr<|jjd�|jjd|j�|jdkrd|jjd�|j�dS)Na#
.SH "COMMANDS"
.B semanage fcontext
can also be used to manipulate default file context mappings.
.PP
.B semanage permissive
can also be used to manipulate whether or not a process type is permissive.
.PP
.B semanage module
can also be used to enable/disable/install/remove policy modules.
rzF
.B semanage port
can also be used to manipulate the port definitions
r%zA
.B semanage boolean
can also be used to manipulate the booleans
z�
.PP
.B system-config-selinux
is a GUI tool available to customize SELinux policy settings.

.SH AUTHOR
This manual page was auto-generated using
.B "sepolicy manpage".

.SH "SEE ALSO"
selinux(8), %s(8), semanage(8), restorecon(8), chcon(1), sepolicy(8)z, setsebool(8))r^r[rIr�r�rrr�)rerrrr�fs



zManPage._footercCs@||jdgkrdS|jd�r<x|D]}||j|kr"dSq"WdS)Nr<FrxT)r�r�r1)reZcheckrD�arrr�_valid_write�s

zManPage._valid_writecs�dd�t�fdd�tj��D�}t|�dkr0dS�jjd�t|�dkrXdd	j|�}nd
|d}�jjd�j|�jf�d|kr�|jd��jjd
�g}x(|D] }|�j	kr�|�j	|d7}q�W�jjdd	j|��dS)NcSsg|]}|d�qS)�targetr)�.0r�rrr�
<listcomp>�sz(ManPage._entrypoints.<locals>.<listcomp>cs&|d�jko$|ddko$d|dkS)N�source�class�file�
entrypoint�permlist)r�)�y)rerr�<lambda>�sz&ManPage._entrypoints.<locals>.<lambda>rz
.SH "ENTRYPOINTS"
rMz\fB%s\fP file typesz, z\fB%s\fP file typezw
The %s_t SELinux type can be entered via the %s.

The default entrypoint paths for the %s_t domain are the following:
Zbin_tz^
All executeables with the default executable label, usually stored in /usr/bin and /usr/sbin.r�z
%s)
�filterr�get_all_allow_rulesrIr^r[rHrr�remover�)reZentrypointsZentrypoints_str�pathsr�r)rerr��s*


zManPage._entrypointscCs.|j|jdkrdS|jjdd|ji�dS)Nr1a�
.SH "MCS Constrained"
The SELinux process type %(type)s_t is an MCS (Multi Category Security) constrained type.  Sometimes this separation is referred to as sVirt. These types are usually used for securing multi-tenant environments, such as virtualization, containers or separation of users.  The tools used to launch MCS types, pick out a different MCS label for each process group.

For example one process might be launched with %(type)s_t:s0:c1,c2, and another process launched with %(type)s_t:s0:c3,c4. The SELinux kernel only allows these processes can only write to content with a matching MCS label, or a MCS Label of s0. A process running with the MCS level of s0:c1,c2 is not allowed to write to content with the MCS label of s0:c3,c4
r�)r�rBr^r[rr)rerrrr��szManPage._mcs_typescsT�jg�y*�tt�fdd�tj���dd7�WnYnXtt�fdd�tj���}|dksnt|�dkrrdSg}ddg}x6|D].}�j|d|�r�|d|kr�|j|d�q�Wt|�dkr�dS�j	j
d	��j	j
d
�j�|j�d|kr�dg}xT|D]L}�j	j
d|�|�j
k�rx(�j
|d
D]}�j	j
d|��q0W�qWdS)Ncs|d�jkS)Nr)r�)r�)rerrr��sz!ManPage._writes.<locals>.<lambda>rrDcs.|d�ko,tddg�j|d�o,|ddkS)Nr�rZr[r�r�r�)�set�issubset)r�)�src_listrrr��sZ	proc_typeZsysctl_typer�z
.SH "MANAGED FILES"
z�
The SELinux process type %s_t can manage files labeled with the following file types.  The paths listed are the default paths for these file types.  Note the processes UID still need to have DAC permissions.
Z	file_typez
.br
.B %s

r�z	%s
.br
)r��listr�rZget_all_types_infor�rIr�r)r^r[rrr,r�)rer�Z
all_writesrDrRrJrr)rer�rr��s>*


zManPage._writescCs|j|jkr|j|jSdS)NZs0)rrr�)rerrr�_get_users_range�szManPage._get_users_rangecCsh|jjdd|ji�|jjd|j|j|j|j�d��d|jkrdd|jkrd|jjdd|ji�dS)Nzd.TH  "%(type)s_selinux"  "8"  "%(type)s" "mgrepl@redhat.com" "%(type)s SELinux Policy documentation"r�a
.SH "NAME"
%(user)s_u \- \fB%(desc)s\fP - Security Enhanced Linux Policy

.SH DESCRIPTION

\fB%(user)s_u\fP is an SELinux User defined in the SELinux
policy. SELinux users have default roles, \fB%(user)s_r\fP.  The
default role has a default type, \fB%(user)s_t\fP, associated with it.

The SELinux user will usually login to a system with a context that looks like:

.B %(user)s_u:%(user)s_r:%(user)s_t:%(range)s

Linux users are automatically assigned an SELinux users at login.
Login programs use the SELinux User to assign initial context to the user's shell.

SELinux policy uses the context to control the user's access.

By default all users are assigned to the SELinux user via the \fB__default__\fP flag

On Targeted policy systems the \fB__default__\fP user is assigned to the \fBunconfined_u\fP SELinux user.

You can list all Linux User to SELinux user mapping using:

.B semanage login -l

If you wanted to change the default user mapping to use the %(user)s_u user, you would execute:

.B semanage login -m -s %(user)s_u __default__

)r�r�rr �login_userdomainz�
If you want to map the one Linux user (joe) to the SELinux user %(user)s, you would execute:

.B $ semanage login -a -s %(user)s_u joe

r)r^r[rrr�r�r�rDr})rerrrr��szManPage._user_headercCs�d|j}|jjd�||jkr�|jd}|jjdd|ji�xn|j|D]N}|jjd|dd
�|jd��|jjd	|jd
j|g|j|�d��qLW|jjd|j�dS)Nz	%s_sudo_tz

.SH SUDO
ryz�
The SELinux user %(user)s can execute sudo.

You can set up sudo to allow %(user)s to transition to an administrative domain:

Add one or more of the following record to sudoers using visudo.

rz�
USERNAME ALL=(ALL) ROLE=%(admin)s_r TYPE=%(admin)s_t COMMAND
.br
sudo will run COMMAND as %(user)s_u:%(admin)s_r:%(admin)s_t:LEVEL
r6)Zadminra_
You might also need to add one or more of these new roles to your SELinux user record.

List the SELinux roles your SELinux user can reach by executing:

.B $ semanage user -l |grep selinux_name

Modify the roles list and add %(user)s_r to this list.

.B $ semanage user -m -R '%(roles)s' %(user)s_u

For more details you can see semanage man page.

rG)rZrolesz7
The SELinux type %s_t is not allowed to execute sudo.
r7)rrr^r[r1r�rH)reZsudotyper=Z	adminrolerrrr�(s



(zManPage._can_sudocCsd|jjd�d|jkr(|jjd|j�d|jkrD|jjd|j�d|jkr`|jjd|j�dS)Nz
.SH USER DESCRIPTION
Zunconfined_usertypez�
The SELinux user %s_u is an unconfined user. It means that a mapped Linux user to this SELinux user is supposed to be allow all actions.
Zunpriv_userdomainz�
The SELinux user %s_u is defined in policy as a unprivileged user. SELinux prevents unprivileged users from doing administration tasks without transitioning to a different role.
Zadmindomainz�
The SELinux user %s_u is an admin user. It means that a mapped Linux user to this SELinux user is intended for administrative actions. Usually this is assigned to a root Linux user.
)r^r[rDrr)rerrrr�Qs


zManPage._user_attributecCsJd|jkrF|jjd�d|jkr4|jjd|j�n|jjd|j�dS)NZx_domainz
.SH X WINDOWS LOGIN
z3
The SELinux user %s_u is able to X Windows login.
z7
The SELinux user %s_u is not able to X Windows login.
)r}r^r[rDrr)rerrrr�ds

zManPage._xwindows_logincCsJd|jkrF|jjd�d|jkr4|jjd|j�n|jjd|j�dS)Nr�z
.SH TERMINAL LOGIN
z2
The SELinux user %s_u is able to terminal login.
z6
The SELinux user %s_u is not able to terminal login.
)r}r^r[rDrr)rerrr�_terminal_loginrs

zManPage._terminal_logincCs�ddlm}|jjd�x�dD]�}|j|j|d�}t|�dkr�|jjd|j|f�x8|D]0}x*||D]\}}|jjdd	j|��qhWqZW|j|jdd
�}t|�dkr|jjd|j�x8|D]0}x*||D]\}}|jjdd	j|��q�Wq�WqWdS)
Nr)�networkz
.SH NETWORK
r�r�Z	name_bindzH
.TP
The SELinux user %s_u is able to listen on the following %s ports.
z
.B %s
r�Zname_connectzJ
.TP
The SELinux user %s_u is able to connect to the following tcp ports.
)r�r�)	rr�r^r[Zget_network_connectr�rIrrrH)rer�ZnetZportdictr�r�r�rrrr��s(


zManPage._networkcsXtt�fdd�tj���}�jjd�|dk	rB�jjd�j�n�jjd�j�dS)NcsH|d�jkoF|ddkoF|ddkoFtdddd	d
dg�jt|d��S)
Nr�r�Zuser_home_typer�r�Zioctl�read�getattrZexecuteZexecute_no_transrZr�)r�r�r�)r�)rerrr��sz$ManPage._home_exec.<locals>.<lambda>z
.SH HOME_EXEC
z;
The SELinux user %s_u is able execute home content files.
z?
The SELinux user %s_u is not able execute home content files.
)r�r�rr�r^r[rr)rer�r)rerr��szManPage._home_execcCs|jjd|j|jd��dS)Na�
.SH TRANSITIONS

Three things can happen when %(type)s attempts to execute a program.

\fB1.\fP SELinux Policy can deny %(type)s from executing the program.

.TP

\fB2.\fP SELinux Policy can allow %(type)s to execute the program in the current user type.

Execute the following to see the types that the SELinux user %(type)s can execute without transitioning:

.B sesearch -A -s %(type)s -c file -p execute_no_trans

.TP

\fB3.\fP SELinux can allow %(type)s to execute the program and transition to a new type.

Execute the following to see the types that the SELinux user %(type)s can execute and transition:

.B $ sesearch -A -s %(type)s -c process -p transition

)rr�)r^r[rrr�)rerrrr��szManPage._transitionscCs�|jjdd|ji�|jjd|j|jd��g}x,|jD]"}|jd|j|kr<|j|�q<Wt|�dkr�d}t|�dkr�d	}|jjd
dj|�||jf�dS)Nzd.TH  "%(user)s_selinux"  "8"  "%(user)s" "mgrepl@redhat.com" "%(user)s SELinux Policy documentation"ra[
.SH "NAME"
%(user)s_r \- \fB%(desc)s\fP - Security Enhanced Linux Policy

.SH DESCRIPTION

SELinux supports Roles Based Access Control (RBAC), some Linux roles are login roles, while other roles need to be transition into.

.I Note:
Examples in this man page will use the
.B staff_u
SELinux user.

Non login roles are usually used for administrative tasks. For example, tasks that require root privileges.  Roles control which types a user can run processes with. Roles often have default types assigned to them.

The default type for the %(user)s_r role is %(user)s_t.

The
.B newrole
program to transition directly to this role.

.B newrole -r %(user)s_r -t %(user)s_t

.B sudo
is the preferred method to do transition from one role to another.  You setup sudo to transition to %(user)s_r by adding a similar line to the /etc/sudoers file.

USERNAME ALL=(ALL) ROLE=%(user)s_r TYPE=%(user)s_t COMMAND

.br
sudo will run COMMAND as staff_u:%(user)s_r:%(user)s_t:LEVEL

When using a non login role, you need to setup SELinux so that your SELinux user can reach %(user)s_r role.

Execute the following to see all of the assigned SELinux roles:

.B semanage user -l

You need to add %(user)s_r to the staff_u user.  You could setup the staff_u user to be able to use the %(user)s_r role with a command like:

.B $ semanage user -m -R 'staff_r system_r %(user)s_r' staff_u

)r�rryrr%rMr�z�

SELinux policy also controls which roles can transition to a different role.
You can list these rules using the following command.

.B search --role_allow

SELinux policy allows the %s role%s can transition to the %s_r role.

z, )r^r[rrr�r�r)rIrH)reZtrolesrRr�rrrr��s)	zManPage._role_header)rwrLFF)(rsrtrurvrr�rfr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrrrrxsJ
; 	+p"*	#
0-))rrr	r
)rrr
r)r)!�__all__rQr�rrkr�r�r�r�rrr&r'r/r2r5r8rr>r?r@rArBrCr1rFrKrrrTr_rrrrrr�<module>sH 


.site-packages/sepolicy/__pycache__/generate.cpython-36.opt-1.pyc000064400000125166147511334650020526 0ustar003

��f���@s�ddlZddlZddlZddlZddlZddlmZmZmZddlZddl	Z	ddl
mZddl
mZddl
m
Z
ddl
mZddl
mZdd	l
mZdd
l
mZddl
mZddl
mZdd
l
mZddl
mZddl
mZddl
mZddl
mZddl
mZddljZddljZdZy<ddlZiZ ej!dBk�r:de d<ej"efddd�e ��WnLyddl#Z#e$e#j%d<Wn(e&k
�r�ddl'Z'e(e'j%d<YnXYnXdd�Z)dd�Z*dd �Z+d!d"�Z,dZ-dZ.d#Z/dZ0d$Z1d%Z2dZ3dZ4d#Z5dZ6d&Z7d'Z8d(Z9d)Z:d*Z;d+Z<d,Z=d-Z>d.Z?iZ@eAd/�e@e3<eAd0�e@e4<eAd1�e@e5<eAd2�e@e6<eAd3�e@e7<eAd4�e@e8<eAd5�e@e9<eAd6�e@e:<eAd7�e@e;<eAd8�e@e<<eAd9�e@e=<eAd:�e@e><eAd;�e@e?<d<d=�ZBe3e4e5e8e6gZCe;e:e<e=e>gZDd>d?�ZEGd@dA�dA�ZFdS)C�N)�
get_all_types�get_all_attributes�
get_all_roles�)�
executable)�boolean)�etc_rw)�	unit_file)�	var_cache)�	var_spool)�var_lib)�var_log)�var_run)�tmp)�rw)�network)�script)�spec)�userzselinux-python�T�unicodez/usr/share/localezutf-8)Z	localedirZcodeset�_cCsF|d}|d}|d}|d|jd�d}|jd�d}|||gS)z6Given an RPM header return the package NVR as a string�name�version�release�-�.rr)�split)ZhdrrrrZrelease_versionZ
os_version�r�/usr/lib/python3.6/generate.py�get_rpm_nvr_from_headerGsr c	Cs`y>ddl}d}|j�}|j|j|�}x|D]}t|�}Pq*WWntd|�d}YnX|S)Nrz"Failed to retrieve rpm info for %s)�rpm�tsZdbMatchZRPMTAG_NAMEr �print)�packager!Znvrr"Zmi�hrrr�get_rpm_nvr_listRs


r&cCs�i}xztjtj�D]j}|ddks|ddks|ddks|ddks|ddkrTq|d|jd�f||d|d	|d
f<qW|S)N�typeZreserved_port_tZport_tZhi_reserved_port_tZephemeral_port_tZunreserved_port_t�rangeZlowZhigh�protocol)�sepolicy�infoZPORT�get)�dict�prrr�
get_all_portsbs,r/cCs6dd�tjtj�D�}|jd�|jd�|j�|S)NcSsg|]}|d�qS)rr)�.0�xrrr�
<listcomp>psz!get_all_users.<locals>.<listcomp>Zsystem_u�root)r*r+�USER�remove�sort)�usersrrr�
get_all_usersos


r8�z_admin$z_role$������	�
��zStandard Init DaemonzDBUS System DaemonzInternet Services DaemonzWeb Application/Script (CGI)ZSandboxzUser ApplicationzExisting Domain Typez Minimal Terminal Login User Rolez!Minimal X Windows Login User RolezDesktop Login User RolezAdministrator Login User Rolez Confined Root Administrator Rolez!Module information for a new typecCs>tj�}|j�td�}x |D]}|d|t|f7}qW|S)Nz
Valid Types:
z%2s: %s
)�poltype�keysr6r)rD�msg�krrr�get_poltype_desc�s
rGcCs�|dkrgSd	}y�g}x�|jd�D]�}|jd�}t|�dkr@t�t|�dkrft|d�}t|d�}n$t|d�}t|d�}||kr�t�x4t||d�D]"}|dks�||kr�t�|j|�q�Wq"W|Stk
r�ttd�|��YnXdS)
N�r9��,rrrz8Ports must be numbers or ranges of numbers from 1 to %d i)r�len�
ValueError�intr(�appendr)�portsZmax_port�temp�a�r�begin�endr.rrr�verify_ports�s.
rUc@s�eZdZdd�Zdd�Zdd�Zdd�Zd	d
�Zdd�Zd
d�Z	dd�Z
dd�Zdd�Zdd�Z
dd�Zdd�Zdd�Zd�dd�Zd d!�Zd"d#�Zd$d%�Zd&d'�Zd(d)�Zd*d+�Zd,d-�Zd.d/�Zd0d1�Zd2d3�Zd4d5�Zd6d7�Zd8d9�Zd:d;�Zd<d=�Z d>d?�Z!d@dA�Z"dBdC�Z#dDdE�Z$dFdG�Z%dHdI�Z&dJdK�Z'dLdM�Z(dNdO�Z)dPdQ�Z*dRdS�Z+dTdU�Z,dVdW�Z-dXdY�Z.dZd[�Z/d\d]�Z0d^d_�Z1d`da�Z2dbdc�Z3ddde�Z4dfdg�Z5dhdi�Z6djdk�Z7dldm�Z8dndo�Z9dpdq�Z:drds�Z;dtdu�Z<dvdw�Z=dxdy�Z>dzd{�Z?d|d}�Z@d~d�ZAd�d��ZBd�d��ZCd�d��ZDd�d��ZEd�d��ZFd�d��ZGd�d��ZHd�d��ZId�d��ZJd�d��ZKd�d��ZLd�d��ZMd�d��ZNd�d��ZOd�d��ZPd�d��ZQd�d��ZRd�d��ZSd�d��ZTd�d��ZUd�d��ZVd�d��ZWd�d��ZXd�d��ZYd�d��ZZd�d��Z[d�d��Z\d�d��Z]d�d��Z^d�d��Z_d�d��Z`d�d��Zad�d��Zbd�dÄZcd�dńZdd�dDŽZed�dɄZfd�d˄Zgd�d̈́Zhd�dτZid�dфZjd�dӄZkd�dՄZld�dׄZmd�dلZnd�dۄZod�d݄Zpd�d߄Zqd�d�Zrd�d�Zsetju�fd�d�Zvd�S)��policycCsg|_i|_t�|_g|_|tkr.ttd���|sFttd�t|��yt�|_WnTtk
r|}zt	d�WYdd}~Xn,t
k
r�}zt	d|�WYdd}~XnXi|_d|jd<d|jd<d|jd<d	|jd
<d	|jd<d	|jd<d|jd
<d|jd<d|jd<d|jd<d|jd<d|jd<d|jd<d|jd<d|jd<d|jd<d|jd<d|jd<d|jd <d!|jd"<d#|jd$<d%|jd&<d'|jd(<d)|jd*<d+|jd,<d-|jd.<d/|jd0<d1|jd2<d3|jd4<d5|jd6<d7|jd8<d9|jd:<d;|jd<<d=|jd><d?|jd@<dA|jdB<dC|jdD<dE|jdF<dG|jdH<dI|jdJ<dK|jdL<dM|jdN<dO|jdP<dQ|jdR<dS|jdT<dU|jdV<dW|jdX<dY|jdZ<d[|jd\<d]|jd^<d_|jd`<da|jdb<da|jdc<da|jdd<da|jde<df|jdg<df|jdh<df|jdi<df|jdj<df|jdg<dk|jdl<dm|jdn<do|jdp<dq|jdr<ds|jdt<du|jdv<dw|jdx<dy|jdz<d{|jd|<d}|jd~<d|jd�<d|jd�<d�|jd�<d�|jd�<d�|jd�<d�|jd�<d�|jd�<d�|jd�<d�|jd�<d�|jd�<d�|jd�<d�|jd�<d�|jd�<d�|jd�<d�|jd�<d�|jd�<d�|jd�<i|_d�gt
g|jd�<d�gtg|jd�<d�gtg|jd�<d�gtg|jd�<d�gtg|jd�<d�gtg|jd�<d�gtg|jd�<d�gtg|jd�<d�gtg|jd�<d�gtg|jd�<d�gtg|jd�<i|_t|jd�<t|jd�<t|jd�<t|jd�<t|jd�<t|jd�<t|jd�<t|jd�<d�d�d�d�d�d�d�d�d�d�d�g|_|j|jf|j|jf|j|jf|j|j f|j!|j"f|j#|j$f|j%|j&f|j'|j(f|j)|j*f|j+|j(f|j,|j(f|j-|j.f|j/|j0ff
|_1t2j3d�|��s�ttd����|t4k�r�d�||_5n||_5||_6g|_7g|_8||_9d�|_:d|_;d�d�d�gg|_<d�d�d�gg|_=d�d�d�gg|_>d�d�d�gg|_?d�|_@d�|_Ad�|_Bd�|_Cd�|_Dd�|_Ed�|_Fd�|_Gd�|_H|j9tItJgk|_K|j9tItJgk|_L|j9tItJgk|_Md�|_Nd�|_Oi|_Pi|_Qi|_Rg|_Sg|_Td�|_Ud�|_Vg|_Wg|_Xg|_Yg|_Zg|_[dS)�Nz"You must enter a valid policy typez;You must enter a name for your policy module for your '%s'.z9Can not get port types, must be root for this informationzCan not get port typeszset_use_kerberos(True)Zopenlogzset_use_kerb_rcache(True)zset_use_syslog(True)zset_use_resolve(True)Z	gethostbyZgetaddrinfoZgetnameinfoZkrbzset_manage_krb5_rcache(True)Zgss_accept_sec_contextZkrb5_verify_init_credsZkrb5_rd_reqZ__syslog_chkzset_use_uid(True)�getpwnam�getpwuidzset_use_dbus(True)Zdbus_zset_use_pam(True)Zpam_zset_use_audit(True)zadd_process('fork')�forkzadd_process('transition')Z
transitionzadd_process('sigchld')Zsigchldzadd_process('sigkill')Zsigkillzadd_process('sigstop')Zsigstopzadd_process('signull')Zsignullzadd_process('ptrace')Zptracezadd_process('getsched')Zgetschedzadd_process('setsched')Zsetschedzadd_process('getsession')Z
getsessionzadd_process('getpgid')�getpgidzadd_process('setpgid')�setpgidzadd_process('getcap')Zgetcapzadd_process('setcap')Zsetcapzadd_process('share')Zsharezadd_process('getattr')�getattrzadd_process('setexec')Zsetexeczadd_process('setfscreate')Zsetfscreatezadd_process('noatsecure')Z
noatsecurezadd_process('siginh')Zsiginhzadd_process('signal_perms')�killzadd_process('setrlimit')Z	setrlimitzadd_process('rlimitinh')Z	rlimitinhzadd_process('dyntransition')Z
dyntransitionzadd_process('setcurrent')Z
setcurrentzadd_process('execmem')Zexecmemzadd_process('execstack')Z	execstackzadd_process('execheap')Zexecheapzadd_process('setkeycreate')Zsetkeycreatezadd_process('setsockcreate')Z
setsockcreatezadd_capability('chown')�chownzadd_capability('dac_override')Zdac_overridez!add_capability('dac_read_search')Zdac_read_searchzadd_capability('fowner')Zfownerzadd_capability('fsetid')Zfsetidzadd_capability('setgid')�setgid�setegid�	setresgid�setregidzadd_capability('setuid')�	setresuid�setuid�seteuid�setreuidzadd_capability('setpcap')Zsetpcapz!add_capability('linux_immutable')Zlinux_immutablez"add_capability('net_bind_service')Znet_bind_servicezadd_capability('net_broadcast')Z
net_broadcastzadd_capability('net_admin')Z	net_adminzadd_capability('net_raw')Znet_rawzadd_capability('ipc_lock')Zipc_lockzadd_capability('ipc_owner')Z	ipc_ownerzadd_capability('sys_module')�
sys_modulezadd_capability('sys_rawio')Z	sys_rawiozadd_capability('sys_chroot')�chrootZ
sys_chrootzadd_capability('sys_ptrace')Z
sys_ptracezadd_capability('sys_pacct')Z	sys_pacctzadd_capability('sys_admin')ZmountZunshareZ	sys_adminzadd_capability('sys_boot')Zsys_bootzadd_capability('sys_nice')Zsys_nicezadd_capability('sys_resource')Zsys_resourcezadd_capability('sys_time')Zsys_timez add_capability('sys_tty_config')Zsys_tty_configzadd_capability('mknod')�mknodzadd_capability('lease')Zleasezadd_capability('audit_write')Zaudit_writezadd_capability('audit_control')Z
audit_controlzadd_capability('setfcap')Zsetfcaprz/etcrz/tmprr	z/usr/lib/systemd/systemz/lib/systemd/systemz/etc/systemd/systemr
z
/var/cacherz/var/libr
z/var/logrz/var/runrz
/var/spoolZ_tmp_tZ_unit_file_tZ_var_cache_tZ
_var_lib_tZ
_var_log_tZ
_var_run_tZ_var_spool_tZ_port_tz^[a-zA-Z0-9-_]+$zQName must be alpha numberic with no spaces. Consider using option "-n MODULENAME"zhttpd_%s_scriptrHF)\�rpmsrOr�	all_roles�typesrCrLrr/r#�RuntimeError�symbols�DEFAULT_DIRSrrrr	r
rr
rr�DEFAULT_EXTr�DEFAULT_KEYS�generate_daemon_types�generate_daemon_rules�generate_dbusd_types�generate_dbusd_rules�generate_inetd_types�generate_inetd_rules�generate_cgi_types�generate_cgi_rules�generate_sandbox_types�generate_sandbox_rules�generate_userapp_types�generate_userapp_rules�generate_existing_user_types�generate_existing_user_rules�generate_min_login_user_types�generate_login_user_rules�generate_x_login_user_types�generate_x_login_user_rules�generate_login_user_types�generate_admin_user_types�generate_root_user_types�generate_root_user_rules�generate_new_types�generate_new_rules�
DEFAULT_TYPES�re�match�CGIr�	file_name�capabilities�	processesr'�
initscript�program�in_tcp�in_udp�out_tcp�out_udp�use_resolve�use_tmp�use_uid�
use_syslog�use_kerberos�manage_krb5_rcache�use_pam�use_dbus�	use_audit�EUSER�NEWTYPE�use_etc�use_localization�use_fd�use_terminal�use_mail�booleans�files�dirs�found_tcp_ports�found_udp_ports�
need_tcp_type�
need_udp_type�
admin_domains�existing_domains�transition_domains�transition_users�roles)�selfrr'�errr�__init__�sd











































































































zpolicy.__init__cCs(|tp&|tp&|tp&t|t�dkS)Nr)�ALL�RESERVED�
UNRESERVEDrK�PORTS)r��lrrrZ
__isnetset�szpolicy.__isnetsetcCs
||_dS)N)r�)r�r�rrr�set_admin_domains�szpolicy.set_admin_domainscCs
||_dS)N)r�)r�r�rrr�set_existing_domains�szpolicy.set_existing_domainscCs
||_dS)N)r�)r�r�rrr�set_admin_roles�szpolicy.set_admin_rolescCs
||_dS)N)r�)r�r�rrr�set_transition_domains�szpolicy.set_transition_domainscCs
||_dS)N)r�)r�r�rrr�set_transition_users�szpolicy.set_transition_userscCs|j|j�S)N)�_policy__isnetsetr�)r�rrr�
use_in_udp�szpolicy.use_in_udpcCs|j|j�S)N)r�r�)r�rrr�use_out_udp�szpolicy.use_out_udpcCs|j�p|j�S)N)r�r�)r�rrr�use_udp�szpolicy.use_udpcCs|j|j�S)N)r�r�)r�rrr�
use_in_tcp�szpolicy.use_in_tcpcCs|j|j�S)N)r�r�)r�rrr�use_out_tcp�szpolicy.use_out_tcpcCs|j�p|j�S)N)r�r�)r�rrr�use_tcp�szpolicy.use_tcpcCs|j�p|j�S)N)r�r�)r�rrr�use_network�szpolicy.use_network�tcpcCsFx@|jj�D]2\}}}||kr||kr||kr|j|||fSqWdS)N)rOrD)r�Zportr)rSrTr.rrr�	find_port�szpolicy.find_portcCs |jtkrttd���||_dS)Nz0User Role types can not be assigned executables.)r'�APPLICATIONSrLrr�)r�r�rrr�set_program�s
zpolicy.set_programcCs |jtkrttd���||_dS)Nz)Only Daemon apps can use an init script..)r'�DAEMONrLrr�)r�r�rrr�set_init_script�s
zpolicy.set_init_scriptcCs|||t|�g|_dS)N)rUr�)r��all�reserved�
unreservedrOrrr�
set_in_tcp�szpolicy.set_in_tcpcCs|||t|�g|_dS)N)rUr�)r�r�r�r�rOrrr�
set_in_udp�szpolicy.set_in_udpcCs|ddt|�g|_dS)NF)rUr�)r�r�rOrrr�set_out_tcp�szpolicy.set_out_tcpcCs|ddt|�g|_dS)NF)rUr�)r�r�rOrrr�set_out_udp�szpolicy.set_out_udpcCs"t|�tk	rttd���||_dS)Nz$use_resolve must be a boolean value )r'�boolrLrr�)r��valrrr�set_use_resolve�szpolicy.set_use_resolvecCs"t|�tk	rttd���||_dS)Nz#use_syslog must be a boolean value )r'r�rLrr�)r�r�rrr�set_use_syslog�szpolicy.set_use_syslogcCs"t|�tk	rttd���||_dS)Nz%use_kerberos must be a boolean value )r'r�rLrr�)r�r�rrr�set_use_kerberos�szpolicy.set_use_kerberoscCs"t|�tk	rttd���||_dS)Nz+manage_krb5_rcache must be a boolean value )r'r�rLrr�)r�r�rrr�set_manage_krb5_rcache�szpolicy.set_manage_krb5_rcachecCs|dk|_dS)NT)r�)r�r�rrr�set_use_pam�szpolicy.set_use_pamcCs|dk|_dS)NT)r�)r�r�rrr�set_use_dbus�szpolicy.set_use_dbuscCs|dk|_dS)NT)r�)r�r�rrr�
set_use_audit�szpolicy.set_use_auditcCs|dk|_dS)NT)r�)r�r�rrr�set_use_etc�szpolicy.set_use_etccCs|dk|_dS)NT)r�)r�r�rrr�set_use_localization�szpolicy.set_use_localizationcCs|dk|_dS)NT)r�)r�r�rrr�
set_use_fd�szpolicy.set_use_fdcCs|dk|_dS)NT)r�)r�r�rrr�set_use_terminal�szpolicy.set_use_terminalcCs|dk|_dS)NT)r�)r�r�rrr�set_use_mail�szpolicy.set_use_mailcCsB|jtkrttd���|r0|jddjd�ng|jdd<dS)Nz'USER Types automatically get a tmp typez/tmpr)r'�USERSrLrrorN)r�r�rrr�set_use_tmp�s

zpolicy.set_use_tmpcCs|dk|_dS)NT)r�)r�r�rrr�set_use_uidszpolicy.set_use_uidcCs |jrtjd|jtj�SdSdS)N�TEMPLATETYPErH)r�r��subrrZte_uid_rules)r�rrr�generate_uid_rulesszpolicy.generate_uid_rulescCs |jrtjd|jtj�SdSdS)Nr�rH)r�r�r�rrZte_syslog_rules)r�rrr�generate_syslog_rulesszpolicy.generate_syslog_rulescCs |jrtjd|jtj�SdSdS)Nr�rH)r�r�r�rrZte_resolve_rules)r�rrr�generate_resolve_rulesszpolicy.generate_resolve_rulescCs |jrtjd|jtj�SdSdS)Nr�rH)r�r�r�rrZte_kerberos_rules)r�rrr�generate_kerberos_rulesszpolicy.generate_kerberos_rulescCs |jrtjd|jtj�SdSdS)Nr�rH)r�r�r�rrZte_manage_krb5_rcache_rules)r�rrr�!generate_manage_krb5_rcache_rules sz(policy.generate_manage_krb5_rcache_rulescCs d}|jrtjd|jtj�}|S)NrHr�)r�r�r�rrZte_pam_rules)r��newterrr�generate_pam_rules&szpolicy.generate_pam_rulescCs d}|jrtjd|jtj�}|S)NrHr�)r�r�r�rrZte_audit_rules)r�r�rrr�generate_audit_rules,szpolicy.generate_audit_rulescCs d}|jrtjd|jtj�}|S)NrHr�)r�r�r�rrZte_etc_rules)r�r�rrr�generate_etc_rules2szpolicy.generate_etc_rulescCs d}|jrtjd|jtj�}|S)NrHr�)r�r�r�rrZte_fd_rules)r�r�rrr�generate_fd_rules8szpolicy.generate_fd_rulescCs d}|jrtjd|jtj�}|S)NrHr�)r�r�r�rrZte_localization_rules)r�r�rrr�generate_localization_rules>sz"policy.generate_localization_rulescCs*d}|jtkr&|jr&tjd|jtj�}|S)NrHr�)r'�DBUSr�r�r�rrZ
te_dbus_rules)r�r�rrr�generate_dbus_rulesDszpolicy.generate_dbus_rulescCs d}|jrtjd|jtj�}|S)NrHr�)r�r�r�rrZ
te_mail_rules)r�r�rrr�generate_mail_rulesJszpolicy.generate_mail_rulescCsFd}d|||f}|tj�kr.d||jf}nd||j|||f}|S)NrHzcorenet_%s_%s_%sz	%s(%s_t)
zD
gen_require(`
    type %s_t;
')
allow %s_t %s_t:%s_socket name_%s;
)r*Zget_methodsr)r�r)�action�	port_name�line�methodrrr�generate_network_actionPszpolicy.generate_network_actioncCshxf|jtD]X}|jt|�d�}|dkr0d|_q|ddd
�}|jdd|�}||jkr|jj|�qWxf|jtD]X}|jt|�d�}|dkr�d|_qt|ddd�}|jdd|�}||jkrt|jj|�qtWxh|j	tD]Z}|jt|�d�}|dk�rd|_
q�|ddd�}|jdd|�}||jkr�|jj|�q�W|j
dk�sR|jdk�rdtj
d|jtj�Sd	S)
Nr�Trr9ZbindZconnect�udpr�rH���r�r�)r�r�r�rMr�r�r�rNr�r�r�r�r�r�rr�te_types)r��iZrecr�r�rrr�generate_network_types^s6



zpolicy.generate_network_typescCsZx:|jD]0}|j|�dkr|j|dj|�|j|SqW|jddj|�|jdS)Nrrr)ro�findrN)r��file�drrrZ__find_path�szpolicy.__find_pathcCs||jkr|jj|�dS)N)r�rN)r�Z
capabilityrrr�add_capability�s
zpolicy.add_capabilitycCs
||_dS)N)rl)r�rlrrr�	set_types�szpolicy.set_typescCs||jkr|jj|�dS)N)r�rN)r�Zprocessrrr�add_process�s
zpolicy.add_processcCs||j|<dS)N)r�)r�r�descriptionrrr�add_boolean�szpolicy.add_booleancCs|j|�|j|<dS)N)�_policy__find_pathr�)r�rrrr�add_file�szpolicy.add_filecCs|j|�|j|<dS)N)rr�)r�rrrr�add_dir�szpolicy.add_dircCs6d}|jj�t|j�dkr2d|jdj|j�f}|S)NrHrz#allow %s_t self:capability { %s };
� )r�r6rKr�join)r�r�rrr�generate_capabilities�s

zpolicy.generate_capabilitiescCs6d}|jj�t|j�dkr2d|jdj|j�f}|S)NrHrz allow %s_t self:process { %s };
r)r�r6rKrr)r�r�rrr�generate_process�s

zpolicy.generate_processcCs�d}|j��r�d}|tjd|jtj�7}|j��r�|d7}|tjd|jtj�7}|j�r�|tjd|jtj	�7}|j
r�t|jt
�dkr�|tjd|jtj�7}|j
r�t|jt
�dkr�|tjd|jtj�7}|jtr�|tjd|jtj�7}|jt�r|tjd|jtj�7}|jt�r.|tjd|jtj�7}|jt�rP|tjd|jtj�7}|jt�rr|tjd|jtj�7}|jt�r�|tjd|jtj�7}x|jD]}||7}�q�W|j��r�|d7}|tjd|jtj�7}|j�r�|tjd|jtj�7}|j��r|tjd|jtj �7}|j!t�r6|tjd|jtj"�7}|j!t�rX|tjd|jtj#�7}|j!t�rz|tjd|jtj$�7}x|j%D]}||7}�q�W|S)NrH�
r�r)&r�r�r�rrZ
te_networkr�Zte_tcpr�Z	te_in_tcpr�rKr�r�Zte_in_need_port_tcpr�Zte_out_need_port_tcpr�Zte_in_all_ports_tcpr�Zte_in_reserved_ports_tcpr�Zte_in_unreserved_ports_tcpZte_out_all_ports_tcpZte_out_reserved_ports_tcpZte_out_unreserved_ports_tcpr�r�Zte_udpr�Zte_in_need_port_udpr�Z	te_in_udpr�Zte_in_all_ports_udpZte_in_reserved_ports_udpZte_in_unreserved_ports_udpr�)r�r�r�rrr�generate_network_rules�sV




zpolicy.generate_network_rulescCs�d}x2|jD](}tjd|jtj�}|tjd||�7}qW|jtkr�x<|jD]2}tjd|jt	j
�}|tjd|jd�d|�7}qJW|S)NrHr��APPLICATIONr4�_ur)r�r�r�rrZte_transition_rulesr'r4r�rZte_run_rulesr)r�r��appr�urPrrr�generate_transition_rules�s
 z policy.generate_transition_rulescCs,d}|jtkr�xn|jD]d}|jd�d}|d}xH|jD]>}tjd|tj�}||j	krdtj|d|�}|tjd||�7}q8WqW|S|jt
k�r(|tjd|jtj�7}x2|jD](}tjd|jtj�}|tjd||�7}q�WxN|j
D]D}|jd�d}|d|j	kr�tjd|jtj�}|tjd	||�7}q�W|S)
NrH�_tr�_rr�Zsystem_rrrr4)r'r�r�rr�r�r�rZte_admin_domain_rulesrk�RUSERrZte_admin_rulesr�Zte_admin_trans_rules)r�r�rr�rolerrrrrr�generate_admin_rules�s,

zpolicy.generate_admin_rulescCs d}|jrtjd|jtj�}|S)NrHr�)r�r�r�rrZ
if_dbus_rules)r��newifrrr�generate_dbus_ifszpolicy.generate_dbus_ifcCs(d}|jtkr|Stjd|jtj�}|S)NrHr�)r'�SANDBOXr�r�rrZif_sandbox_rules)r�rrrr�generate_sandbox_ifs

zpolicy.generate_sandbox_ifcCsd}d}|jdkr>|tjd|jtj�7}|tjd|jtj�7}xd|jD]Z}t|j	|d�dkrF|tjd|j|j	|dj
�7}|tjd|j|j	|dj�7}qFW|dkr�tjd|jtj�}||7}|tjd|jtj
�7}||7}|tjd|jtj�7}|SdS)NrHr�rrr9)r�r�r�rrZif_initscript_admin_typesZif_initscript_adminrqrKroZif_admin_typesZif_admin_rulesZif_begin_adminZif_middle_adminZif_end_admin)r�rZnewtypesr�retrrr�generate_admin_ifs"
 $zpolicy.generate_admin_ifcCstjd|jtj�S)Nr�)r�r�r�r�te_cgi_types)r�rrrrx5szpolicy.generate_cgi_typescCstjd|jtj�S)Nr�)r�r�r�r�te_sandbox_types)r�rrrrz8szpolicy.generate_sandbox_typescCstjd|jtj�S)Nr�)r�r�rrZte_userapp_types)r�rrrr|;szpolicy.generate_userapp_typescCstjd|jtj�S)Nr�)r�r�rrZte_inetd_types)r�rrrrv>szpolicy.generate_inetd_typescCstjd|jtj�S)Nr�)r�r�rrZte_dbusd_types)r�rrrrtAszpolicy.generate_dbusd_typescCstjd|jtj�S)Nr�)r�r�rrZte_min_login_user_types)r�rrrr�Dsz$policy.generate_min_login_user_typescCstjd|jtj�S)Nr�)r�r�rrZte_login_user_types)r�rrrr�Gsz policy.generate_login_user_typescCstjd|jtj�S)Nr�)r�r�rrZte_admin_user_types)r�rrrr�Jsz policy.generate_admin_user_typescCs�t|j�dkr$ttd�t|j��tjd|jt	j
�}|d7}xB|jD]8}|d|7}|jd�dd}||jkrF|d|7}qFW|d	7}|S)
Nrz,'%s' policy modules require existing domainsr�z
gen_require(`z
        type %s;rrz

	role %s;z
')
)
rKr�rLrrCr'r�r�rrZte_existing_user_typesrrk)r�r�rrrrrr~Ms

z#policy.generate_existing_user_typescCstjd|jtj�S)Nr�)r�r�rrZte_x_login_user_types)r�rrrr�_sz"policy.generate_x_login_user_typescCstjd|jtj�S)Nr�)r�r�rrZte_root_user_types)r�rrrr�bszpolicy.generate_root_user_typesc	Cs�d}t|j�dkrttd���xj|jD]`}xZ|jD]P}|j|�r2t||dt|���|tjd|dt|��|j|j	�7}Pq2Wq&Wt
r�|dkr�g}x|jD]}|j|�q�Wttd�dj|���|S)NrHrzType field requiredr�z3You need to define a new type which ends with: 
 %sz
 )
rKrlrLrrp�endswithr#r�r�r�r�rNr)r�r��tr�Zdefault_extrrrr�es
(
zpolicy.generate_new_typescCsdS)NrHr)r�rrrr�yszpolicy.generate_new_rulescCs6tjd|jtj�}|jdkr2|tjd|jtj�7}|S)Nr�rH)r�r�rrZte_daemon_typesr�Zte_initscript_types)r�r�rrrrr|s
zpolicy.generate_daemon_typescCs |jrtjd|jtj�SdSdS)Nr�rH)r�r�r�rrr�)r�rrr�generate_tmp_types�szpolicy.generate_tmp_typescCs@d}x6|jD],}tjd|tj�}|tjd|j||�7}qW|S)NrH�BOOLEANZDESCRIPTION)r�r�r�rZ
te_boolean)r�r��brrrr�generate_booleans�s
zpolicy.generate_booleanscCs,d}x"|jD]}|tjd|tj�7}qW|S)NrHr&)r�r�r�r�te_rules)r�r�r'rrr�generate_boolean_rules�szpolicy.generate_boolean_rulescCstjd|jtj�S)Nr�)r�r�rrr")r�rrr�generate_sandbox_te�szpolicy.generate_sandbox_tecCstjd|jtj�S)Nr�)r�r�rrr!)r�rrr�generate_cgi_te�szpolicy.generate_cgi_tecCstjd|jtj�}|S)Nr�)r�r�rrZte_daemon_rules)r�rrrrrs�szpolicy.generate_daemon_rulesc	Csrd}xh|jD]^}xX|jD]N}|j|�r|dt|��d}|tjd|dt|��|j|j�7}PqWqW|S)NrHrr�)rlrpr#rKr�r��if_rules)r�rr$r�Zreqtyperrr�generate_new_type_if�s
(
zpolicy.generate_new_type_ifcCstjd|jtj�S)Nr�)r�r�rrZte_login_user_rules)r�rrrr��sz policy.generate_login_user_rulescCstjd|jtj�}|S)Nr�)r�r�rrZte_existing_user_rules)r�Znerulesrrrr�sz#policy.generate_existing_user_rulescCstjd|jtj�S)Nr�)r�r�rrZte_x_login_user_rules)r�rrrr��sz"policy.generate_x_login_user_rulescCstjd|jtj�}|S)Nr�)r�r�rrZte_root_user_rules)r�r�rrrr��szpolicy.generate_root_user_rulescCstjd|jtj�S)Nr�)r�r�rrZte_userapp_rules)r�rrrr}�szpolicy.generate_userapp_rulescCstjd|jtj�S)Nr�)r�r�rrZte_inetd_rules)r�rrrrw�szpolicy.generate_inetd_rulescCstjd|jtj�S)Nr�)r�r�rrZte_dbusd_rules)r�rrrru�szpolicy.generate_dbusd_rulescCs |jrtjd|jtj�SdSdS)Nr�rH)r�r�r�rrr))r�rrr�generate_tmp_rules�szpolicy.generate_tmp_rulescCsd}|tjd|jtj�7}|S)NrHr�)r�r�rrZte_cgi_rules)r�r�rrrry�szpolicy.generate_cgi_rulescCsd}|tjd|jtj�7}|S)NrHr�)r�r�rrZte_sandbox_rules)r�r�rrrr{�szpolicy.generate_sandbox_rulescCsRd}|js|jtkr&tjd|jtj�}|jtt	t
tfkrN|tjd|jtj�7}|S)NrHr�)
r�r'r4r�r�rrZif_user_program_rules�TUSER�XUSER�AUSER�LUSERZif_role_change_rules)r�rrrr�generate_user_if�szpolicy.generate_user_ifcCsDd}|tjd|jtj�7}|jr6|tjd|jtj�7}|jdkrV|tjd|jtj�7}x�|j	D]�}t
|j|d�dkr^|tjd|j|j|dj�7}xZ|j|dD]H}t
jj|�r�tjt
j|�tj�r�|tjd|j|j|dj�7}Pq�Wq^W||j�7}||j�7}||j�7}||j�7}||j�7}||j�7}|S)NrHr�rrr9)r�r�rrZif_heading_rulesr�Zif_program_rulesr�Zif_initscript_rulesrqrKror-�os�path�exists�stat�S_ISSOCK�ST_MODEZif_stream_rulesr4rr rr.r�)r�rrr�rrr�generate_if�s(
 " 
zpolicy.generate_ifcCs|j|jd�S)Nr)r�r')r�rrr�generate_default_types�szpolicy.generate_default_typescCs&|j|jdr"|j|jd�SdS)NrrH)r�r')r�rrr�generate_default_rules�szpolicy.generate_default_rulescCs�d}|jttttfkr�d}t|j�dkr�|tjd|j	t
j�7}|tjd|j	t
j�7}x2|jD](}tjd|j	t
j
�}|tjd||�7}q\W|S)NrHrr�ZROLE)r'r0r1r2r3rKr�r�r�rrZ
te_sudo_rulesZte_newrole_rulesZte_roles_rules)r�r�r�rrrrr�generate_roles_rules�szpolicy.generate_roles_rulesc	Cs�|j�}xV|jD]L}t|j|d�dkr|jtks<|dkr|tjd|j|j|dj	�7}qW|jt
krx|d|j7}||j�7}||j�7}||j
�7}||j�7}||j�7}||j�7}||j�7}�xT|jD�]H}t|j|d�dkr�|jt
k�rXd}xt|jD]H}|tjd|dd�d	|j|dj�7}|tjd
|jd|�7}�q
Wn |tjd|j|j|dj�7}x�|j|dD]�}tjj|��r�tjtj|�tj��r�|jt
k�r�xX|jD],}|tjd|dd
�|j|dj�7}�q�Wn |tjd|j|j|dj�7}P�q�Wq�W||j�7}||j�7}||j�7}||j�7}||j�7}||j �7}||j!�7}||j"�7}||j#�7}||j$�7}||j%�7}||j&�7}||j'�7}||j(�7}||j)�7}||j*�7}||j+�7}|S)Nrrrr�r9z@
########################################
#
# %s local policy
#
rHZTEMPLATETYPE_trZTEMPLATETYPE_rw_tZ_rw_tr�r�),r<rqrKror'r�r�r�rr�r�r
rr�r%r(r=r*r�r)r5r6r7r8r9r:Zte_stream_rulesr/rr�r�r�r�r�r�r�r�r>r�rrr�r�r�)r�r�rZ	newte_tmpZdomainr�rrr�generate_tes`$
*  &. zpolicy.generate_tecCs�d}g}x�|jj�D]�}tjj|�rXtjtj|�tj�rXtj	d|j
|j|dj�}ntj	d|j
|j|dj�}tj	d||�}|j
tj	d|j|d|��qWxZ|jj�D]L}tj	d|j
|j|dj�}tj	d||�}|j
tj	d|j|d|��q�W|jttgk�r&t|�dk�r&tjS|jttttgk�rR|j�rRttd���|j�r�tj	d|jtj�}|j
tj	d|j
|��|jdk�r�tj	d|jtj�}|j
tj	d|j
|��|j�d	j|�}|S)
NrHr�r9�FILENAMEZFILETYPErz<You must enter the executable path for your confined processZ
EXECUTABLEr) r�rDr5r6r7r8r9r:r�r�rZfc_sock_fileZfc_filerNr�Zfc_dirr'r�rrKrZfc_userr�r�r�rLrZ
fc_programr�Z
fc_initscriptr6r)r�ZnewfcZfclistr��t1Zt2rrr�generate_fcDs4""" 
zpolicy.generate_fccCs�d}|jtttttfkr|Sd}x|jD]}|d|7}q(W|dkrL|d7}tjd|j	t
j�}|tjd||�7}|jtks�|jtkr�x2|jD](}tjd|j	t
j
�}|tjd||�7}q�W|jtkr�|tjd|j	t
j�7}n|tjd|j	t
j�7}|S)NrHz %s_rz	 system_rr�ZROLESr4)r'r0r1r2r3rr�r�r�rrr7r�Zadmin_transZmin_login_user_default_contextZx_login_user_default_context)r��newshr�rrrrrr�generate_user_shgs$
zpolicy.generate_user_shcCs�tjd|jtj�}tjd|j|�}|jtkrBtjdd|j|�}n&tjd|j|�}|tjd|jtj�7}|j	r�|tjd|j	tj
�7}|jdkr�|tjd|jtj
�7}x&|jj
�D]}|tjd|tj
�7}q�Wx&|jj
�D]}|tjd|tj
�7}q�WxX|jt|jtD]@}|j|d�dk�r
tjdd	|tj�}|tjd|j|�7}�q
WxN|jtD]@}|j|d
�dk�rZtjdd	|tj�}|tjd|j|�7}�qZW||j�7}tjdd�ddk�r�|tjd|jtj�7}|S)Nr�Z
DOMAINTYPEZTEMPLATEFILEz%sr@rHr�ZPORTNUMz%dr�r)�full_distribution_name�redhat�centos�SuSE�fedora�mandrake�mandriva)rFrGrHrIrJrK)r�r�r�r�compilerr'r�Zmanpager�Z
restoreconr�r�rDr�r�r�r�r�Z	tcp_portsr�Z	udp_portsrD�platform�linux_distributionr!)r�rPrCr�rArrr�generate_sh�s4

zpolicy.generate_shcCs�d}td�}|dkrd}n|d}|tj7}|jtkr�|tj7}|jr\|tjd|jtj	�7}|j
dkr||tjd|j
tj	�7}x&|jj�D]}|tjd|tj	�7}q�Wx&|j
j�D]}|tjd|tj	�7}q�W|tjd|tj�7}tjd|j|�}tjd|j|�}t|j�d	k�r$|d
dj|j�7}|tjd|jtj�7}tjd|j|�}tjdtjd
�|�}|jtk�rxtjdd|�}|jtk�r�tjd|jd|�}|jtttttfk�r�tjd|jd|�}|S)NrHzselinux-policyz0.0.0rr@�VERSIONZ
MODULENAMEZ
DOMAINNAMErzRequires(post): %s
z, Z
TODAYSDATEz%a %b %e %Yz%relabel_filesz.*%s_selinux.8.*z.*%s_u.*)r&rZheader_comment_sectionr'r�Zdefine_relabel_files_beginr�r�r�Zdefine_relabel_files_endr�r�rDr�Zbase_sectionr�rrKrjrZmid_section�timeZstrftimer�r0r1r2r3r)r�ZnewspecZselinux_policynvrZselinux_policyverr�rrr�
generate_spec�s>



zpolicy.generate_speccCs2d||jf}t|d�}|j|j��|j�|S)Nz%s/%s_selinux.spec�w)r��open�writerR�close)r��out_dirZspecfile�fdrrr�
write_spec�s

zpolicy.write_speccCs2d||jf}t|d�}|j|j��|j�|S)Nz%s/%s.terS)r�rTrUr?rV)r�rWZtefilerXrrr�write_te�s

zpolicy.write_tecCs>d||jf}t|d�}|j|j��|j�tj|d�|S)Nz%s/%s.shrSi�)r�rTrUrOrVr5�chmod)r�rWZshfilerXrrr�write_sh�s
zpolicy.write_shcCs2d||jf}t|d�}|j|j��|j�|S)Nz%s/%s.ifrS)r�rTrUr;rV)r�rWZiffilerXrrr�write_if�s

zpolicy.write_ifcCs2d||jf}t|d�}|j|j��|j�|S)Nz%s/%s.fcrS)r�rTrUrBrV)r�rWZfcfilerXrrr�write_fc�s

zpolicy.write_fcc

CsDddl}|j���(}|j�|jdd�|jj�}|j�}|j|jd�}x�|D]�}|j	j
|j�xT|jD]J}xD|j
D]:}|dkr�qt|j|�rttjj|�r�|j|�qt|j|�qtWqhW|j�}|j|jd�}xd|D]\}	xV|	jD]L}xF|j
D]<}|dkr�q�|j|�r�tjj|��r|j|�q�|j|�q�Wq�Wq�WqNWWdQRXdS)NrT)Zload_system_repo)rz/etc)Zprovides)�dnfZBaseZread_all_reposZ	fill_sackZsack�queryZ	available�filterr�rjrNrr�ro�
startswithr5r6�isfiler	r
Zsource_name)
r�r_�baser`ZpqZpkgZfnamer'ZsqZbpkgrrrZ__extract_rpms�s8




zpolicy.__extract_rpmscCs�y|j�Wntk
r YnXtjjd|j�rD|jd|j�tjjd|j�rf|jd|j�tjjd|j�r�|jd|j�tjjd|j�r�|jd|j�tjjd|j�r�|jd|j�tjjd|j�r�|j	d|j�g}x�|j
j�D]�}g}y|j
|dd	d
}Wntk
�r8w�YnXx4|j
|dD]"}|j
|��rJ|j|�n�qJ�qJWt|�d	kr�xF|D]>}||jj�k�r�|j|=n||jj�k�r�|j|=n�q��q�Wtt|j
|d�t|��|j
|d<q�WdS)Nz/var/run/%s.pidz/var/run/%sz/var/log/%sz/var/log/%s.logz/var/lib/%sz/etc/rc.d/init.d/%sz/etc/rc\.d/init\.d/%srr�/)�_policy__extract_rpms�ImportErrorr5r6rcrr	�isdirr
r�rorD�
IndexErrorrbrNrKr�r��list�set)r�Z
temp_basepathr.Z	temp_dirsr�rrr�
gen_writeablesF




zpolicy.gen_writeablecCs�|jtkrdStjj|j�s2tjjd|j�dStj	d|j�}x@|j
�j�D]0}x*|jD] }|j
|�r\td|j|�q\WqPW|j�dS)Nzl
***************************************
Warning %s does not exist
***************************************

znm -D %s | grep Uzself.%s)r'r�r5r6r7r��sys�stderrrU�popen�readrrnrb�execrV)r�rX�sr'rrr�gen_symbolsJs

zpolicy.gen_symbolscCs�d}|d|j|�td�f7}|d|j|�td�f7}|d|j|�td�f7}|jtkr�tjdd�ddkr�|d|j|�td�f7}|d|j	|�td�f7}|S)NzCreated the following files:
z%s # %s
zType Enforcement filezInterface filezFile Contexts filer)rErFrGrHrIrJrKz	Spec filezSetup Script)rFrGrHrIrJrK)
rZrr]r^r'r�rMrNrYr\)r�rW�outrrr�generate\s
zpolicy.generateN)r�)w�__name__�
__module__�__qualname__r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrrrrr	r
r
rrrrrrr rxrzr|rvrtr�r�r�r~r�r�r�r�rrr%r(r*r+r,rsr.r�rr�r�r}rwrur/ryr{r4r;r<r=r>r?rBrDrOrRrYrZr\r]r^rfrlrsr5�getcwdrurrrrrV�s�B
	&8

	>#$*$3rV)r)Gr5rmr8r�r*rrrrQrMZ	templatesrrrr	r
rrr
rrrrrrrZsepolgen.interfacesZ
interfacesZsepolgen.defaultsZdefaultsZPROGNAME�gettext�kwargs�version_infoZinstall�builtins�str�__dict__rgZ__builtin__rr r&r/r8r�r�r�r�ZADMIN_TRANSITION_INTERFACEZUSER_TRANSITION_INTERFACEr�r�ZINETDr�rr4r�r0r1r3r2rr�rCrrGr�r�rUrVrrrr�<module>s�


site-packages/sepolicy/__pycache__/network.cpython-36.opt-1.pyc000064400000003145147511334650020415 0ustar003

>�\�
�@s ddlZddd�Zddd�ZdS)	�NFcshtjtjgtj|tj|tj�i�}g}|rdx8tdd�t��fdd�|��D]}||krJ|j|�qJW|S)NcSs
|tjS)N)�sepolicyZTARGET)�y�r�/usr/lib/python3.6/network.py�<lambda>szget_types.<locals>.<lambda>cs"t��j|tj�o �p |dS)NZenabled)�set�issubsetr�PERMS)�x)�check_bools�permrrrs)	r�searchZALLOWZSOURCEZCLASSr	�map�filter�append)�srcZtclassrrZallowsZnlist�ir)rrr�	get_typess"$rc	Csztj�\}}i}t|d||g|�}t|�dk�rvg||||f<�x2|D�](}|dkrdd|kr`qHd}|dkr�d|krvqHd|kr�qH|dkr�||||fj|dgf�|d	kr�||||fj|d
gf�qH|dkr�||||fj|dgf�qH|dk�r||||fj|d
gf�qH|dk�r6||||fj|dgf�qHy$||||fj||||ff�WqHtk
�rpYqHXqHW|S)Nz	%s_socketrZephemeral_port_typeZunreserved_port_typeZephemeral_port_tZunreserved_port_tZport_tz all ports with out defined typesZ	port_typez	all portszall ports > 1024Zreserved_port_typezall ports < 1024Z
rpc_port_typezall ports > 500 and  < 1024)rZ
gen_port_dictr�lenr�KeyError)	rZprotocolrrZportrecsZ
portrecsbynum�dZtlistrrrr�get_network_connect#s<

$
r)F)F)rrrrrrr�<module>s

site-packages/sepolicy/__pycache__/interface.cpython-36.opt-1.pyc000064400000013422147511334650020663 0ustar003

��f��@s&ddlZddlZddlZdZdZdddddd	d
ddg	Zd
Zy:ddlZiZej	d#krZded<ej
efddd�e��WnJyddlZeej
d<Wn&ek
r�ddlZeej
d<YnXYnXdd�Zd$dd�Zd%dd�Zd&dd�Zdad'dd�Zd(dd	�Zdd
�Zd)dd �Zd!d�Zd*d"d�ZdS)+�Nz_admin$z_role$�get_all_interfaces�get_interfaces_from_xml�	get_admin�get_user�get_interface_dict�get_interface_format_text�!get_interface_compile_format_text�get_xml_file�interface_compile_testzselinux-python�T�unicodez/usr/share/localezutf-8)Z	localedirZcodeset�_cCs,g}t|�}x|j�D]}|j|�qW|S)z' Get all interfaces from given xml file)r�keys�append)�pathZinterfaces_list�idict�k�r�/usr/lib/python3.6/interface.pyr4s
�cCs0ddlm}g}|s|�}nt|�}t|�}|S)Nr)�get_methods)�sepolicyrr	r)rrZall_interfaces�xml_pathrrrr=scCs�g}|r�y:t|�}t|�}x$|j�D]}|jd�r$|j|�q$WWq�tk
r�}z,tjjd|j	j
t|�f�tjd�WYdd}~Xq�Xn0x.t
j�D]"}|jd�r�|j|jd�d�q�W|S)z? Get all domains with an admin interface from installed policy.Z_adminz%s: %s
�Nr)r	rr�endswithr�IOError�sys�stderr�write�	__class__�__name__�str�exitrr�split)rZ
admin_listrrr�e�irrrrIs

cCs�g}|r�yRt|�}t|�}x<|j�D]0}|jd�r$d|dd	�tj�kr$|j|�q$WWq�tk
r�}z,tj	j
d|jjt
|�f�tjd�WYdd}~Xq�XnPxNtj�D]B}tjdt|�}t|�dkr�d|dtj�kr�|j|d�q�W|S)
z1 Get all domains with SELinux user role interfaceZ_rolez	%s_exec_tN�z%s: %s
rz(.*)%sr���)r	rrrrZ
get_all_typesrrrrrrr r!r"r�re�findall�USER_TRANSITION_INTERFACE�len)rZ
trans_listrrrr$r%�mrrrr_s$
�#/usr/share/selinux/devel/policy.xmlc

CsXddl}ddl}trtSiag}d}||7}|d7}�y|jj|�rT|jjj|�}n|jjj|�}x�|j	d�D]�}x�|j	d�D]�}xV|j
d�D]H}x"|j	d�D]}	|j|	jd��q�W||j
d	�jdgt|jd�<g}q�WxV|j
d
�D]H}x"|j	d�D]}	|j|	jd��q�W||j
d	�jd
gt|jd�<g}q�Wq~WqnWWntk
�rRYnXtS)NrzZ<?xml version="1.0" encoding="ISO-8859-1" standalone="no"?>
<policy>
<layer name="admin">
z
</layer>
</policy>
Zlayer�module�	interfaceZparam�nameZsummary�template)�osZxml.etree.ElementTree�interface_dictr�isfileZetreeZElementTree�parseZ
fromstringr)Zgetiteratorr�get�find�textr)
rr2ZxmlZ
param_listrZtree�lr,r%r$rrrrzs:cCs<t|�}d|dj||d�dj||djd��f}|S)Nz	%s(%s) %sz, r� r�
)r�joinr#)r/rr�interface_textrrrr�s0cCsLddlm}g}x6||dD]&}|j|j|�d|dj|�f}qW|S)Nr)�test_modulerz%s(%s)
z, )�	templatesr>r�dict_valuesr<)Zinterfaces_dictr/r>Z	param_tmpr%r=rrrr�s�compiletestcCs6ddlm}d}|tjd||j�7}|t||�7}|S)Nr)r>rZTEMPLATETYPE)r?r>r(�subZte_test_moduler)r/rr0r>�terrr�generate_compile_te�s
rDcCs�ddl}yddlm}Wn tk
r8ddlm}YnX|jj|�d}|jj|�jd�d}|d||�\}}|dkr�t	j
jd�t	j
jd|�t	jd	�n|SdS)
z; Returns xml format of interfaces for given .if policy filerN)�getstatusoutput�/�.zDpython /usr/share/selinux/devel/include/support/segenxml.py -w -m %sz-
 Could not proceed selected interface file.
z
%sr)
r2�commandsrE�ImportError�
subprocessr�dirname�basenamer#rrrr")Zif_filer2rEZbasedir�filename�rc�outputrrrr	�sc
Cs�dddddg}dg}yddlm}Wn tk
rDddlm}YnXddl}d	d
ddd
�}t|�}|jd�d|kp�||d|k�sjttd�|�yft	|dd�}|j
t||��|j�|d|d�\}	}
|	dkr�t
jj
|
�t
jj
td�|�Wn<tk
�r:}zt
jj
td�||f�WYdd}~XnXx@|j�D]}|jj|��rF|j|��qFWnt
jj
td�|�dS)NZuserdomZkernelZcorenet�filesZdevr1r)rEzcompiletest.ppzcompiletest.tezcompiletest.fczcompiletest.if)�pprCZfc�ifr
�zCompiling %s interfacerC�wz,make -f /usr/share/selinux/devel/Makefile %srQz
Compile test for %s failed.
z%
Compile test for %s has not run. %s
z,
Compiling of %s interface is not supported.)rHrErIrJr2rr#�printr
�openrrD�closerr�EnvironmentError�valuesr�exists�remove)
r/rZexclude_interfacesZexclude_interface_typerEr2Zpolicy_filesr�fdrNrOr$�vrrrr
�s2$*)r)r)r)r)r-)r-)rA)r-)r(rrZADMIN_TRANSITION_INTERFACEr*�__all__ZPROGNAME�gettext�kwargs�version_infoZinstall�builtinsr!�__dict__rIZ__builtin__rrrrrr3rrrrDr	r
rrrr�<module>sB
[module trailer and padding omitted]

site-packages/sepolicy/__pycache__/booleans.cpython-36.pyc  (tar member, about 1 kB per its header)
    [Compiled bytecode of /usr/lib/python3.6/booleans.py. Visible symbols: expand_attributes(), which expands an attribute into its types via sepolicy.info(sepolicy.ATTRIBUTE, ...), and get_types(src, tclass, perm), which queries sepolicy.search([sepolicy.ALLOW], ...) and raises TypeError("The %s type is not allowed to %s any types") when nothing matches.]

site-packages/sepolicy/__pycache__/generate.cpython-36.pyc  (tar member, about 44 kB)
    [Compiled bytecode of /usr/lib/python3.6/generate.py, the policy-generation back end used by "sepolicy generate". Visible symbols include get_rpm_nvr_from_header(), get_rpm_nvr_list(), get_all_ports(), get_all_users(), verify_ports() ("Ports must be numbers or ranges of numbers from 1 to %d"), the policy-type descriptions ("Standard Init Daemon", "DBUS System Daemon", "Internet Services Daemon", "Web Application/Script (CGI)", "Sandbox", "User Application", "Existing Domain Type", the login-user roles and "Confined Root Administrator Role"), and a large policy class whose generate_*/write_te/write_if/write_fc/write_spec/write_sh methods fill the te/if/fc/spec/sh templates with capabilities, processes, network ports, booleans and file contexts.]

site-packages/sepolicy/__pycache__/network.cpython-36.pyc  (tar member, about 1.6 kB)
    [Compiled bytecode of /usr/lib/python3.6/network.py. Visible symbols: get_types() and get_network_connect(), built on sepolicy.search() and sepolicy.gen_port_dict(), labelling port groups such as "all ports", "all ports > 1024", "all ports < 1024" and "all ports > 500 and < 1024".]

site-packages/sepolicy/__pycache__/__init__.cpython-36.pyc  (tar member, about 32 kB)
    [Compiled bytecode of the sepolicy package __init__. Visible symbols include the constants TYPE/ROLE/ATTRIBUTE/PORT/USER/BOOLEAN/TCLASS and ALLOW/AUDITALLOW/NEVERALLOW/DONTAUDIT/TRANSITION/ROLE_ALLOW, the info() and search() wrappers over setools.SELinuxPolicy queries, file-context helpers (get_fcdict, get_file_types, find_file, find_entrypoint_path and friends), default directory mappings (/etc, /tmp, /var/lib, /var/log, /var/run, ...), port, user and role enumeration helpers, boolean description helpers driven by /usr/share/selinux/devel/policy.xml, and gettext setup for the selinux-python domain.]

site-packages/sepolicy/__pycache__/booleans.cpython-36.opt-1.pyc  (tar member, about 1 kB)
    [Optimized (-O) duplicate of the booleans bytecode above.]

site-packages/sepolicy/__pycache__/communicate.cpython-36.pyc  (tar member, about 1.2 kB)
    [Compiled bytecode of /usr/lib/python3.6/communicate.py. Visible symbols: usage(parser, msg), expand_attribute() and get_types(), thin wrappers over sepolicy.info() and sepolicy.search().]

site-packages/sepolicy/__pycache__/sedbus.cpython-36.pyc  (tar member, about 2.3 kB)
    [Compiled bytecode of /usr/lib/python3.6/sedbus.py. Visible symbols: class SELinuxDBus, which holds a dbus.SystemBus() proxy for the "org.selinux" service at "/org/selinux/object" and exposes polkit-proxied methods semanage, restorecon, setenforce, customized, semodule_list, relabel_on_boot, change_default_mode and change_default_policy; a __main__ block calls setenforce(int(sys.argv[1])) and prints the response or the DBusException.]

site-packages/sepolicy/__pycache__/gui.cpython-36.opt-1.pyc  (tar member, about 84 kB)
    [Compiled bytecode of /usr/lib/python3.6/gui.py, the GTK3 sepolicy GUI. Visible strings reference Gtk/Gdk/GLib from gi.repository, the SELinuxDBus proxy, the sepolicy.glade widget ids (login, user, file equivalence, file context, network port, boolean and transition views, the update/revert dialogs, the system policy type and default enforcing-mode controls) and the per-domain tabs (executable files, writable files, application file types, inbound/outbound ports, booleans, transitions). This member is truncated at the end of the extract.]
clear_filters�show_file_equiv_page�system_interface�users_interfacer��show_mislabeled_files�browse_for_files�close_popup�login_seuser_combobox_change�user_roles_combobox_change�close_config_window�apply_changes_button_press�update_or_revert_changes�get_advanced_filter_data�advanced_item_selected�advanced_item_button_push�on_help_button�on_help_back_clicked�on_help_forward_clicked�resize_columns�application_selected�add_button_clicked�delete_button_clicked�modify_button_clicked�on_show_modified_only�import_config_show�export_config_show�unconfined_toggle�permissive_toggle�change_default_policy�change_default_mode�relabel_on_reboot�reveal_advanced_system�show_more_types�
tab_change�closewindow�previously_modified_initializeZconnect_signalsrZtimeout_add_seconds�selinux_status�lockdown_inited�statusr�show_system_page�	set_label�set_text�show_applications_page�clearbuttons�set_current_page�reinit�main)�self�appZtestr��eZbuilderZ
glade_filer��path�lengthZentrypoint_dict�domainZ
entrypoint�dicrrr�__init__us 




"





zSELinuxGui.__init__cCs"i|_xtD]}i|j|<qWdS)N)�cur_dict�keys)rP�krrrr�ss
zSELinuxGui.init_curcCsLd}xB|jD]8}x2|j|D]$}||kr8|j||=dS|d7}qWqWdS)Nrr)rX)rP�ctr�irZ�jrrr�
remove_curxszSELinuxGui.remove_curcCsytj�|_Wntk
r(t|_YnX|jtkr�|jjd�|jjd�|jjd�|j	jd�|j
j|jt
d��|jjt�n|j|j�tjjd�r�|jjd�n|jjd�tj�d}tj�d}|tkr�|j	jd�|tkr�|jjd�|tk�r|jjd�dS)NFzSystem Status: Disabledz
/.autorelabelTr)r�Zsecurity_getenforcerG�OSErrorrr�r�r�r�r�r��pushr�rrRrI�
DISABLED_TEXT�set_enforce_text�osrS�existsr��
set_activer�r��selinux_getenforcemoderrr�r�)rPZ
policytype�moderrrrE�s0

zSELinuxGui.selinux_statuscCs�|jr
dS|j�d|_|jjtjd��i|_xN|jj�j	d�D]:}|j	�}t
|�dkr\qB|dt
|�dkd�|j|d<qBW|jj|jdd	�|jj|jd
d	�|j
�dS)NTZdeny_ptrace�
r	r)Zpriority�Disabledr�
unconfinedri�permissivedomains)rF�
wait_mouser�rer�Zsecurity_get_boolean_activeZmodule_dictr�Z
semodule_list�splitrr�r��ready_mouse)rP�m�modrrr�
lockdown_init�s$zSELinuxGui.lockdown_initcGs�|j�}|sdS|jtkr4|j�dkr4|j|j|�|jtkrp|jj|d�}|j�dkrp|rp|j	|jj|d��|jt
kr�|jj|d�}|r�|jj�|j
j|�dS)NZmore_detail_colr	Zrestorecon_colrr)�get_selected_iterr��
BOOLEANS_PAGEZget_name�display_more_detailr��
FILES_PAGE�	liststore�	get_value�fix_mislabeled�TRANSITIONS_PAGEr�Zclickedr�rJ)rP�treeviewZtreepathZtreecol�args�iterZvisibleZ	bool_namerrrr �s



zSELinuxGui.column_clickedcCsxtj�rtj�qWdS)N)rZevents_pendingZmain_iteration)rPrrrr�s
zSELinuxGui.idle_funccCs:y |jj|d�j|�dkrdSdStk
r4YnXdS)NrrTFr
)r�rw�find�AttributeError)rPZ
completionZ
key_stringr|Z	func_datarrrr��szSELinuxGui.match_funcc
Cs�|jj|jdk�|jj|jt|j�dk�y0td|j|j|jfd�}|j�}|j	�Wnt
k
rvd}YnX|jj�}|j
|d|ji�|jj|�|jjd|j|j|jf�|j|j�dS)Nrrz
%shelp/%s.txt�rr#�APPz
%shelp/%s.png)rUr��	help_pagerTr�	help_list�openr��read�close�IOErrorr�Z
get_bufferrJr�Z
set_bufferrSZ
set_from_file�
show_popuprQ)rP�fd�bufr�rrr�help_show_page�s

zSELinuxGui.help_show_pagecGs|jd8_|j�dS)Nr)r�r�)rPr{rrrr1�szSELinuxGui.on_help_back_clickedcGs|jd7_|j�dS)Nr)r�r�)rPr{rrrr2�sz"SELinuxGui.on_help_forward_clickedcGstd|_g|_|jtkr.|jjtd��dg|_|jtkrV|jjtd��ddddg|_|jtkr�|j	j
�}|tkr�|jjtd	��d
g|_|tkr�|jjtd��dg|_|t
kr�|jjtd
��dg|_|jtk�r$|jj
�}|tk�r|jjtd��dg|_|tk�r$|jjtd��dg|_|jtk�r�|jj
�}|tk�rb|jjtd��ddddg|_|tk�r�|jjtd��dg|_|tk�r�|jjtd��dg|_|jtk�r�|jjtd��dddd d!d"g|_|jtk�r�|jjtd#��d$d%d&d'g|_|jtk�r$|jjtd(��d)d*g|_|jtk�rH|jjtd+��d,g|_|jtk�rl|jjtd-��d.g|_|j�S)/NrzHelp: Start Page�startzHelp: Booleans PageZbooleansZbooleans_toggledZ
booleans_moreZbooleans_more_showzHelp: Executable Files PageZ
files_execzHelp: Writable Files PageZfiles_writezHelp: Application Types PageZ	files_appz'Help: Outbound Network Connections PageZports_outboundz&Help: Inbound Network Connections PageZ
ports_inboundz&Help: Transition from application PageZtransition_fromZtransition_from_booleanZtransition_from_boolean_1Ztransition_from_boolean_2z&Help: Transition into application PageZ
transition_toz&Help: Transition application file PageZtransition_filezHelp: Systems Page�systemZsystem_boot_modeZsystem_current_modeZ
system_exportZsystem_policy_typeZsystem_relabelzHelp: Lockdown PageZlockdownZlockdown_unconfinedZlockdown_permissiveZlockdown_ptracezHelp: Login PagerZ
login_defaultzHelp: SELinux User Page�userszHelp: File Equivalence PageZ
file_equiv)r�r�r�r�rQ�	set_titlerrsrur��get_current_page�EXE_PAGE�
WRITABLE_PAGE�APP_PAGE�NETWORK_PAGEr��
OUTBOUND_PAGE�INBOUND_PAGEryr��TRANSITIONS_FROM_PAGE�TRANSITIONS_TO_PAGE�TRANSITIONS_FILE_PAGE�SYSTEM_PAGE�
LOCKDOWN_PAGE�
LOGIN_PAGE�	USER_PAGE�FILE_EQUIV_PAGEr�)rPr{�ipagerrrr0�sl











zSELinuxGui.on_help_buttoncGsX|jdkrDd|_|jj�}|jj|dd|dd�|jj�n|jj�d|_dS)Nrrr�A)r�r�Zget_positionr�Zmoverr	)rPr{�locationrrrr
)s


zSELinuxGui.open_combo_menucGs|jj�d|_dS)Nr)r�r	r�)rPr{rrrr3s
zSELinuxGui.hide_combo_menucGs
d|_dS)NT)r
)rPr{rrrr
7sz SELinuxGui.set_application_labelcGst|�dS)N)r�)rPr{rrrr:szSELinuxGui.resize_wrapcCsHtj�d|_|jtkr |j|_|jtkr2|j|_|jtkrD|j	|_dS)Nr)
r�rf�enforce_moderr��enforce_buttonrr�rr�)rPrrrr�=s


z)SELinuxGui.initialize_system_default_modecCsvttjtj�dd��d}|j�d}xJ|D]B}|jj�}|jj|d|�||j	krf|j
j|�||_|d7}q,W|S)NT)�topdownrr)
�nextrc�walkr�Zselinux_pathr�rOr��	set_valuer�rPre�typeHistory)rP�typesr[�itemr|rrrr�Fs


z!SELinuxGui.populate_system_policycGs�|jdkrdSy�x�td|j��D]p}yR|j||�}|dksJ|dksJ|dkrLw |j|j�dksp|j�j|j�dkrtdSWq ttfk
r�Yq Xq WWnYnXdS)Nr#TrFrr
r
)r��range�
get_n_columnsrwr}rr~�	TypeError)rP�listr|r{�x�valrrrr�Ss
$zSELinuxGui.filter_the_datac
Cs�x�|j�D]|}xv||D]j\}}dj|�|f}	|	|jdkrl|jd|	ddkrTq||jd|	dkrlq|j|dj|�||�qWq
WdS)N�,r�actionz-d�typez, )rY�joinrX�network_initial_data_insert)
rPrQ�netd�protocol�	direction�modelrZ�t�portsZpkeyrrr�
net_updatefszSELinuxGui.net_updatecCs�|j�tj�}|jj�x�|D]�}|jj�}||drX|j|�}|j||d�}n|}||d}|jj|d|�|jj|d|�|jj|d||d�q W|j�dS)N�modify�equivrrr)	rlr�Zget_file_equivrE�clearr��markupr�rn)rPZedict�fr|�namer�rrr�file_equiv_initializeqs



z SELinuxGui.file_equiv_initializecCs�|j�|jj�x�tj�D]�}|jj�}|jj|dt|d��|d}d|kr\|jd�|jj|ddj	|��|jj|d|j
dd	��|jj|d
|j
dd	��|jj|dd
�qW|j�dS)Nrr��rolesZobject_rrz, r�levelr#r	r�rT)rlr7r�r��get_selinux_usersr�r�r�remover�rrn)rP�ur|r�rrr�user_initialize�s


zSELinuxGui.user_initializecCs�|j�|jj�xftj�D]Z}|jj�}|jj|d|d�|jj|d|d�|jj|d|d�|jj|dd�qW|j�dS)	Nrr�r�seuserr�mlsr	T)rlr1r�r�Zget_login_mappingsr�r�rn)rPr�r|rrr�login_initialize�s

zSELinuxGui.login_initializecCs|tjj|dddd�}|j||dt|j�tjj|dddd�}|j||dt|j�tjj|dddd�}|j||dt|j�dS)N�tcp�name_connectT)�check_bools�	name_bind�udp)r��network�get_network_connectr�r�rrr�ru)rPrQr�rrr�network_initialize�szSELinuxGui.network_initializecCsD|j�}|j|d|�|j|d|�|j|d|�|j|dd�dS)NrrrrT)r�r�)rPr�r�ZportTyper�r|rrrr��s
z&SELinuxGui.network_initial_data_insertcCs�d}|j�}x.|D]&}|d|kr0|j|�dS|d7}qW|j|d�}|j|d�td�krr|j|�}|d}n|j�}|j|d|�|j|�dS)NrrzMore...)�	get_modelre�get_iterrwrZ
insert_beforer�r�)rP�comboboxr�r[rvr\�niterr|rrr�combo_set_active_text�s



z SELinuxGui.combo_set_active_textcCs2|j�}|j�}|dkrdS|j|�}|j|d�S)Nr)r��
get_activer�rw)rPr�rv�indexr|rrr�combo_get_active_text�s
z SELinuxGui.combo_get_active_textcCs:|dkrdS|jj�}|jj|d|�|jj|d|�dS)Nrr)r�r�r�)rPr��val1r|rrrr�s

zSELinuxGui.combo_box_addcGsN|jj�}|j�d}|dkr"dS|jj|d�}|j|j|�|j|j�dS)Nrr)	r��
get_selection�get_selectedr�rwr�r]rCrN)rPr{rQr|rrrr�s
zSELinuxGui.select_type_morecGsx|jj�}|j�\}}|j|�}|jj|�}|jj|d�}|dkrFdS|jjd�|j	j
�|j|j�|j
j|�dS)Nrr#)r+r�r��convert_iter_to_child_iterr�r�rwr*rJr)r	rr%r�)rPr{�rowr�r|rQrrrr/�s


z$SELinuxGui.advanced_item_button_pushcGs`|jj|�}|jj|�}|jj|d�}|jjd�|jj�|j	|j
�|jj|�|j�dS)Nrr#)
r�r�r�r�rwr*rJr)r	rr%r�r4)rPrzrSr{r|rQrrrr.�s
z!SELinuxGui.advanced_item_selectedcCs4|r0t|�dkr0x|jD]}||dkrdSqWdS)NrTF)rr�)rPrQ�itemsrrr�find_application�s
zSELinuxGui.find_applicationcGs�|jjd�|jjd�|jjd�|jjd�|jj�}|j|�sHdS|j	�|j
jd�|jjd�|j
j�|jj�|jj�|jj�|jj�|jj�|jj�|jj�|jj�y(|ddkr�tj|�}|s�dS||_Wntk
�rYnX|j�|j|jj��|j�|j |�d|_!|j"|�|j#|�|j$|�|j%|�|j&|�|j'|�|j(|�|j)j*t+d�|�|j,j*t+d�|�|j-j*t+d�|�|j.j*t+d	�|�|j/j*t+d
�|�|j0j*t+d�|�|j1j*t+d�|�|j2j*t+d
�|�|j3j4t+d�|�|j5j4t+d�|�|j6j4t+d�|�|j3j*t+d�|�|j5j*t+d�|�|j6j*t+d�|�|j7j*t+d�|�||_|j8j4|j�|j9�dS)NFr#Tr�/z(File path used to enter the '%s' domain.z)Files to which the '%s' domain can write.z6Network Ports to which the '%s' is allowed to connect.z5Network Ports to which the '%s' is allowed to listen.z File Types defined for the '%s'.zODisplay boolean information that can be used to modify the policy for the '%s'.z;Display file type information that can be used by the '%s'.zADisplay network ports to which the '%s' can connect or listen to.z!Application Transitions Into '%s'z!Application Transitions From '%s'zFile Transitions From '%s'zVExecutables which will transition to '%s', when executing selected domains entrypoint.zQExecutables which will transition to a different domain, when '%s' executes them.z4Files by '%s' with transitions to a different label.zADisplay applications that can transition into or out of the '%s'.):r�r�rcrdr�rJr��get_textr�rKr�r�r�r�r�rurrrxr{r~r�r�r�r�Zget_init_transtyper��
IndexErrorrlrDr�r�rN�boolean_initializer��executable_files_initializer��writable_files_initialize�transitions_into_initialize�transitions_from_initialize�application_files_initialize�transitions_files_initializer��set_tooltip_textrr�rtrwr�r�r�r�r�rIr�r�r�r&rn)rPr{rQrrrr4�sr




















zSELinuxGui.application_selectedcCs tj�tj�|_tj�|_dS)N)r�rNZ
get_fcdict�fcdictZget_local_file_paths�local_file_paths)rPrrrrN6s
zSELinuxGui.reinitcCs�i|_�x�|jd�D�]�}|j�}t|�dkr0q|ddkr>q|d|jkrZi|j|d<|ddkr�d|ddki|jd|d<|dd	kr�|d
|dd�|jd	|d <|dd
kr�d|d
i|jd
|d!<|ddk�rd|d
|dd�|jd|d"<|ddk�r6d|d
i|jd|d#|d$f<|ddk�rj|d
|d|dd�|jd|d%<|ddk�r�|ddk�r�d|jk�r�i|jd<d|d
i|jd|d&<n"d|di|jd|d'|d
f<|ddkrd|ddki|jd|d(<qWd|jk�rdSxJd|jfd|jfgD]2\}}||jdk�r.|j|jd|d��q.Wx*tD]"}||jk�rj|jj|ii��qjWdS))Nrhrrz-Dr�activerz-1rr	r)r�r�r!r�r�s0)r�r��rolerr r)�maskr�r�rz-ezfcontext-equivr�r�enabledz-drjrkr
r
r
r
r
���r
r
r
r
)�	cust_dictrmrr�r�rerY�update)rPr�r\ZrecZsemodule�buttonrrrrD;sJ ""&
""
z)SELinuxGui.previously_modified_initializecCs�tj|�|_x�|jj�D]�}t|j|�dkr0q|j|d}xr|j|dD]`}||f|jdkr�|jd||fddkr�qN||jd||fdkr�qN|j|j|||�qNWqWdS)Nrrrr�z-dr�)r�Zget_entrypoints�entrypointsrYrrX�files_initial_data_insertr�)rPr��exe�
file_classrSrrrr�esz&SELinuxGui.executable_files_initializecCs@y&tj|d�d}tj|�d}||kStk
r:dSXdS)NrrF)r��matchpathcon�
getfileconr_)rPrS�con�currrr�
mislabeledsszSELinuxGui.mislabeledcCs�|j|�sdStj|d�d}tj|�d}d|_|j|dd�|j|dd�|j|dd�|j|d|jd�d�|j|d	|jd�d�dS)
NrrTr	rr�:rr)rr�r�r�r�r�rm)rP�treerSr|r�r�rrrr�set_mislabeled{s
zSELinuxGui.set_mislabeledcCs�tj|�|_x�|jj�D]�}t|j|�dkrF|j|jd|td��q|j|d}xr|j|dD]`}||f|jdkr�|jd||fddkr�qd||jd||fdkr�qd|j|j|||�qdWqWdS)	Nrz	all filesrrrr�z-dr�)	r�Zget_writable_files�writable_filesrYrr�r�rrX)rPr��writer�rSrrrr��sz$SELinuxGui.writable_files_initializec	Cs�|jd�}|dkr td�}d}nl||f|jk}x:tj|�D],}|j|�}|j|d|�|j||||�q:W|r�|j|�}|j|�}|j|�}|j|d|�|j|d|�|j|d|�|j|d|�dS)NzMISSING FILE PATHFrrrr)r�rr�r�Z	find_filer�rr�)	rPrvrSZ
selinux_labelr�r|r��pr�rrrr��s"




z$SELinuxGui.files_initial_data_insertcCsd|S)Nz	<b>%s</b>r)rPr�rrrr��szSELinuxGui.markupcCs |rtjddtjdd|��SdS)Nz</b>$r#z^<b>)�re�sub)rPr�rrr�unmarkup�szSELinuxGui.unmarkupcCs�tj|�|_x�|jj�D]�}t|j|�dkr0q|j|d}x�|j|dD]p}tj||jd�}||f|jdkr�|jd||fddkr�qN||jd||fdkr�qN|j|j	|||�qNWqWdS)Nrr)r�rr�z-dr�)
r�Zget_file_types�
file_typesrYrZget_descriptionr�rXr�r�)rPr�rQr�rS�descrrrr��sz'SELinuxGui.application_files_initializecCs.d}x$|jD]}t|j|�dkrdSqWdS)NrTF)rXr)rPr\rZrrr�modified�s
zSELinuxGui.modifiedcCsbx\tj|�D]N}xH|D]@\}}||jdkr>|jd|d}tj|�}|j|||�qWqWdS)Nrr�)r�Z	get_boolsrX�boolean_desc�boolean_initial_data_insert)rPr��blistrr�rrrrr��s
zSELinuxGui.boolean_initializecCsR|jj�}|jj|d|�|jj|d|�|jj|d|�|jj|dtd��dS)Nrrrr	zMore...)rxr�r�r)rPr�rr�r|rrrr�s

z&SELinuxGui.boolean_initial_data_insertcCsbx\tj|�D]N}d}d}d}d|kr,|d}d|kr<|d}d|krL|d}|j|||�qWdS)Nr�target�source)r�Zget_transitions_into�$transitions_into_initial_data_insert)rPr�r�r��
executablerrrrr��sz&SELinuxGui.transitions_into_initializecCsd|jj�}|dkr0|jj|dt|dd�n|jj|dd�|jj|d|�|jj|d|�dS)Nrr�Defaultr)r{r�r�r�)rPr�rrr|rrrr�s
z/SELinuxGui.transitions_into_initial_data_insertcCs�x�tj|�D]�}d}d}d}d|kr,|d}d|kr<|d}d|krL|d}|j|||�y*x$|j|dD]}|j|||�qlWWqtk
r�YqXqWdS)Nrr�	transtypeZregex)r�Zget_transitions�$transitions_from_initial_data_insertr��KeyError)rPr�r�r�rrZexecutable_typerrrr��s z&SELinuxGui.transitions_from_initializecCs�|jjd�}|dkr6|jj|dd�|jj|dd�n�|jj|�}|jj|dt|dd�d
}|ddr�|jj|dtd	�|�n|jj|dtd
�|�|jj|d|dd�|jj|dd�|jj|d|�|jj|d|�dS)NrrrFr�<span foreground="blue"><u>�</u></span>rz:To disable this transition, go to the %sBoolean section%s.z9To enable this transition, go to the %sBoolean section%s.Tr	)rr)r~r�r�r�r)rPr�rrr|r�r�rrrrsz/SELinuxGui.transitions_from_initial_data_insertcCsJxDtj|�D]6}d|kr"|d}nd}|j|d|d|d|�qWdS)N�filenamer�classr)r�Zget_file_transitions�$transitions_files_inital_data_insert)rPr�r\rrrrr�s

z'SELinuxGui.transitions_files_initializecCsZ|jj�}|jj|d|�|jj|d|�|jj|d|�|dkrFd}|jj|d|�dS)Nrrr�*r	)r�r�r�)rPrS�tclass�destr�r|rrrr"s
z/SELinuxGui.transitions_files_inital_data_insertcGs8|j�d|_d|_d|_d|_|jjd�|jj�|j	j
d�|jj
d�|jj
d�|j
j
d�|jj�r�|jjt�|j|_|j	j
d�|jj��r�|j|j�|j	j
d�|jj
|j�|jj
|j�|j
j
|j�|jjt�|d|jk�r|d}n
|jj�}|tk�r*|j|_td�}n6|tk�rF|j|_td�}n|tk�r`|j |_td�}|j!j"td�||j#d	��|j$j"td
�||j#d	��|jj"td�||j#d	��|j%j��r�|jj&�|j	j
d�|jjt'�|d|j(k�r�|d}n
|j(j�}|t)k�r |j*|_td�}|t+k�r:|j,|_td
�}|j!j"td�|j#|d��|j$j"td�|j#|d��|jj"td�|j#|d��|j-j��r|jjt.�|d|j/k�r�|d}n
|j/j�}|t0k�r�|j1|_|t2k�r�|j3|_|t4k�r|j5|_|j6j��r"|jjt7�|j8j�|j9j��rL|j:�|jjt;�|j8j�|j<j��r�|jjt=�|jj&�|j	j
d�|j>|_|j!j"td��|j$j"td��|jj"td��|j?j��r|jjt@�|jj&�|j	j
d�|jA|_|j!j"td��|j$j"td��|jj"td��|jBj��r~|jjtC�|jj&�|j	j
d�|jD|_|j!j"td��|j$j"td��|jj"td��|jj�|_E|j�r(|j8j&�|jjF�|_|jjF�|_|jjF�|_xXtGd|jjH��D]D}|jjI|�}|�r�|jJ�d}tK|tLjM��r�|jjN||jOd��q�W|jjP�jQ�|jjd�dS)NFTrrr�writabler�z4Add new %(TYPE)s file path for '%(DOMAIN)s' domains.)ZTYPEZDOMAINz3Delete %(TYPE)s file paths for '%(DOMAIN)s' domain.z�Modify %(TYPE)s file path for '%(DOMAIN)s' domain. Only bolded items in the list can be selected, this indicates they were modified previously.r�zlisten for inbound connectionszMAdd new port definition to which the '%(APP)s' domain is allowed to %(PERM)s.)r�ZPERMzVDelete modified port definitions to which the '%(APP)s' domain is allowed to %(PERM)s.zMModify port definitions to which the '%(APP)s' domain is allowed to %(PERM)s.z%Add new SELinux User/Role definition.z.Delete modified SELinux User/Role definitions.z7Modify selected modified SELinux User/Role definitions.z!Add new Login Mapping definition.z*Delete modified Login Mapping definitions.z3Modify selected modified Login Mapping definitions.z$Add new File Equivalence definition.z-Delete modified File Equivalence definitions.z�Modify selected modified File Equivalence definitions. Only bolded items in the list can be selected, this indicates they were modified previously.)Rr!rz�treesort�
treefilterrvr�r�r�r	r�r�r�rcrdr�r�r$rMrsr�r�r�r�rur�r�r�r�rr�r�r�r�r�r�r�r�r�rr�r�r�r�r�r�r�ryr�r�r}r�rzr�r�r�r�r�r�rqr�r�r�r9r�r�r0r�r�rDr�r�r�r�Z
get_columnZ	get_cells�
isinstancerZCellRendererTextZ
set_sort_func�	stripsortr�Zunselect_all)rPr{r��categoryr��colr�rrrrB+s�
























zSELinuxGui.tab_changec	Cs:|j�\}}|j|j||��}|j|j||��}t||�S)N)Zget_sort_column_idr
rwr)	rPr�Zrow1Zrow2Z	user_dataZsort_columnrr�Zval2rrrr%�szSELinuxGui.stripsortcCs�|jj|�}|jj|�}|jj�|jjtd�|jj	|d��t
j|j|jj	|d��}x,|D]$}|j
|d|d|d|d�q^W|j|j�dS)NzBoolean %s Allow RulesrrrrZpermlist)ryr�r�r�r�r�r�rrxrwr�Zget_boolean_rulesr��display_more_detail_initr�)rP�windowsrS�itrrrrrrt�s

$zSELinuxGui.display_more_detailc	Cs0|jj�}|jj|dd|||dj|�f�dS)Nrzallow %s %s:%s { %s };� )r�r�r�r�)rPrrZ
class_typeZ
permissionr|rrrr(�s
z#SELinuxGui.display_more_detail_initcGs�d|_|jtkrJ|jjtd�|j�|jjtd�|j�|j	|�dS|jt
kr�|jjtd�|j�|jjtd�|j�|j
|�|jj�}|tkr�|jjd�n|jjd�d|_|jtkr�|jjtd	��|jjtd
��|j|�d|_|jtk�r2|jjtd��|jjtd��|j|�d|_|jtk�r�|jjd
�|jjd
�|jjtd��|jjtd��d|_|j |j�|j!�dS)NFzGAdd Network Port for %s.  Ports will be created when update is applied.zAdd Network Port for %szMAdd File Labeling for %s. File labels will be created when update is applied.zAdd File Labeling for %szex: /usr/sbin/Foobarzex: /var/lib/FoobarTzGAdd Login Mapping. User Mapping will be created when Update is applied.zAdd Login MappingzQAdd SELinux User Role. SELinux user roles will be created when update is applied.zAdd SELinux Usersr#zMAdd File Equivalency Mapping. Mapping will be created when update is applied.zAdd SELinux File Equivalency)"r�r�r�r�rJrr�r\r��init_network_dialogrur�r[�init_files_dialogr�r�r�rYr�r�r�r3�login_init_dialogr�r�r6�user_init_dialogr�rBrCrArFr��new_updates)rPr{r�rrrr5�sB







zSELinuxGui.add_button_clickedcCs||_|j�dS)N)r�r)rPr�rrrr��szSELinuxGui.show_popupcGs|jj�|jjd�dS)NT)r�r	r�r�)rPr{rrrr'�s
zSELinuxGui.close_popupc
Gs�d}|jr&|j�}|s&|jjd�dSd|_|jtkr@|j|�|jtk�r\|j	j
td�|j�|j
jtd�|j�d|_|j|�d|_d}d}|jj�}|tk�r |jj|�}||_|jj|d�}|jj
|�|jj|d�}|dkr�|j|j|�|jj|d�}|dk�r |j|j|�|tk�r�|jj|�}||_|jj|d�}|jj
|�|jj|d�}	|	dk�r||j|j|	�|jj|d�}|dk�r�|j|j|�|tk�r\|j j|�}||_|j!j|d�}|jj
|�y&|j!j|d�}
|
j"d	�dj"d
�}
Wnt#k
�rYnX|j!j|d�}	|	dk�r<|j|j|	�|
d}|dk�r\|j|j|�|jt$k�r�|j%|�|j&j
|j'j|d��|j(j
|j'j|d��|j)j
|j'j|d��|j|j*|j'j|d��|j+j
td��|j,jtd
��|j-|j,�|jt.k�r~|j/|�|j0j
|j1j|d��|j2j
|j1j|d��|j|j3|j1j|d��|j4j
td��|j5jtd��|j-|j5�|jt6k�r�|j7j
|j8|j9j|d���|j:j
|j8|j9j|d���|j;j
td��|j<jtd��d|_=|j-|j<�dS)NFTzPModify File Labeling for %s. File labels will be created when update is applied.zAdd File Labeling for %s�Modifyrrrz<b>z</b>r	zUModify SELinux User Role. SELinux user roles will be modified when update is applied.zModify SELinux UserszLModify Login Mapping. Login Mapping will be modified when Update is applied.zModify Login MappingzPModify File Equivalency Mapping. Mapping will be created when update is applied.zModify SELinux File Equivalency)>rzrrr�r�r�r�r��modify_button_network_clickedrur�rJrr�r[r��delete_old_itemr-r�r�r�ror�r�rwrYr�r]r^r�rpr�r�rqr�rmr~r�r/r;r7r=r>r:r�r6r�r�r.r-r1r/r,r�r3r�rBr
rErCrArFr�)rPr{r|�	operationr�r�rS�ftyperr�Zget_typerrrr7�s�













z SELinuxGui.modify_button_clickedcGsB|jj|�}|jj|d�}|j|j|�|j|j�|jj�dS)Nr)	r�r�rwr�r]r�r[rNr	)rPr�locr{r|r5rrrrHs
zSELinuxGui.populate_type_combocCs.|dkrdS|jd�rd}nd}|j|�dS)NZ	_script_tZ_tr)�endswithrm)rPrUZ
split_charrrr�strip_domainOs
zSELinuxGui.strip_domaincCs x|D]}|j|�rdSqWdS)NTF)�
startswith)rPr��exclude_listrRrrr�exclude_typeXs

zSELinuxGui.exclude_typec
Gs�g}|jjd�|j|j�|jj�}|jj�|jj�|j	|j
�}xN|jD]D}|dj|�rN|d|j
krN|djd�rN|j
|j	|d��qNW|jj��y�x.tjD]$}|jj
�}|jj|dtj|�q�W|tko�|jdk�rXxR|jj�D]D}|j|��r|jj
�}|jj|d|�|jj
�}|jj|d|�q�W|jjd�|jjd��n(|tk�r�|jdk�r�xp|jj�D]b}	|	j|��r�|j|	|��r�|	|jk�r�|jj
�}|jj|d|	�|jj
�}|jj|d|	��qzW|jjd�n�|tk�r�|jdk�r�xntj�D]b}
|
j|��r|
j|��rT|j|
|��rT|jj
�}|jj|d|
�|jj
�}|jj|d|
��qW|jjd�Wntk
�r�td�YnX|jjd�|jj d�|jj
�}|jj|dt!d��dS)	NTrZ	httpd_sysrFrr�zMore...)"r^r�r�r[r�r�r�r�r�r8r�r�r9r�r�r��
file_type_strr�r�r�rYrer�rr;rr�Zget_all_file_typesr~r�r]r`rJr)rPr{r:r�Zcompare�d�filesr|r�rrQrrrr-^s`


,



(




zSELinuxGui.init_files_dialogcGs�|j�}|s|jjd�dS|jjtd�|j�|jjtd�|j�d|_	|j
|�d}d}d|_|j�}|jj
|d�}|jj|�|jj
|d�}|dkr�|jjd�n|d	kr�|jjd�|jj
|d
�}|dkr�|j|j|�||_	dS)NFzJModify Network Port for %s.  Ports will be created when update is applied.zModify Network Port for %sr1rTrr�r�r)rrr�r�r�rJrr�r\r�r3r,r�rvrwrZr�rer�r�r�)rPr{r|r4r�rr�r�rrrr2�s.
z(SELinuxGui.modify_button_network_clickedc
Gs�|j|j�|jj�}|jj�|jjd��y:|tkrPt	j
j|jdddd�}n8|t
kr�t	j
j|jdddd�}|t	j
j|jdddd�7}g}xL|j�D]@}x:||D].\}}||dd	gkr�|jd
�r�q�|j|�q�Wq�W|j�|j|j�}|ddk�r|dd�}|d
}d}	d}
x@|D]8}|j|��r2|	}
|jj�}|jj|d|�|	d7}	�qW|jj|
�Wntk
�r~YnX|jjd�|jjd�dS)Nr#r�r�T)r�r�r�Zport_tZunreserved_port_tZ_typerr=rrr�r
r
)r�r\r�r�r�r�rZrJr�r�r�r�r�r�rYr7r�r�r8r9r�r�rer~r�rb)rPr{r�r�Z
port_typesrZr�r�Zshort_domainr[�foundr|rrrr,�sF




zSELinuxGui.init_network_dialogcGsN|j|�}|jj�dkrJx0tj�D]$}||dkr"|jj|jdd��q"WdS)Nr#r�r�)r�r/r�r�r�rJr)rP�combor{r�r�rrrr(�s

z'SELinuxGui.login_seuser_combobox_changecGsN|j|�}|jj�dkrJx0tj�D]$}||dkr"|jj|jdd��q"WdS)Nr#r�r�)r�r>r�r��
get_all_rolesrJr)rPr@r{Zseroler�rrrr)�s

z%SELinuxGui.user_roles_combobox_changecCsNd}|jsdS|jj�}|s dS|j�\}}|rJ|j|�}|rJ|jj|�}|S)N)rzr�r�r�r#)rPr|r�r"rrrrr�s

zSELinuxGui.get_selected_itercGsf|jjd�|j�}|dkr,|jjd�dS|j|sH|j|drLdS|jj|j|d�dS)NFrr
r
)r�r�rrrv)rPr{r|rrrr�szSELinuxGui.cursor_changedcGsn|j|j�|jj�tj�}|j�x*|D]"}|jj�}|jj|dt	|��q,W|j
jd�|jjd�dS)Nrr#)
r�r3r�r�r�Z
get_all_usersr�r�r�rr-rJr/)rPr{r�r�r|rrrr.�s


zSELinuxGui.login_init_dialogcGsn|j|j�|jj�tj�}|j�x*|D]"}|jj�}|jj|dt	|��q,W|j
jd�|jjd�dS)Nrr#)
r�r6r�r�r�rAr�r�r�rr;rJr>)rPr{r�rr|rrrr/	s


zSELinuxGui.user_init_dialogcCsh|jrdd|j�}|j�y|jj|�Wn0tjjk
rZ}z|j|�WYdd}~XnX|j�dS)Nzboolean -m -%d deny_ptrace)	r�r�rlr��semanager�r�rrn)rP�checkbutton�
update_bufferrRrrrrszSELinuxGui.on_disable_ptracecs�|jj���fdd�}g}|jtkrh|j�s8|j|j�Sx.|jD]$}|d|jdkr@|j||��q@W|jt	kr�|j
j�}|j�s�|tkr�|j
|j�S|tkr�|j|j�S|tkr�|j|j�Sx2|jD](}|d|df|jdkr�|j|�q�W|jtk�rR|j��s|j|j�Sx:|jD]0}|d|df|jdk�r|j||���qW|jtk�r�|j�d	k�st|j�Sx2|jD](}|d|jd
k�r||j||���q|W|jtk�r�|j��s�|j�Sx2|jD](}|d|jdk�r�|j||���q�W|jtk�rP|j�d	k�s|j�Sx2|jD](}|d|jdk�r$|j||���q$W|jj�xB|D]:}|jj�}x(td��D]}|jj||||��qzW�q`WdS)
Ncs*g}x td��D]}|j||�qW|S)Nr)r�r�)r��lr\)rTrr�dup_row!sz1SELinuxGui.on_show_modified_only.<locals>.dup_rowrrrrrrTzfcontext-equivrr)rvr�r�rsr�r�r�r�r�rur�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�)rPrCrFZappend_listr�r�r|r\r)rTrr8sd








z SELinuxGui.on_show_modified_onlyc	Cs8|jd�}|j|d|�|j|d|�|j|d|�dS)Nrrr)r�r�)	rPrrQr�r4rSZfclassr5r|rrr�init_modified_files_liststorebs
z(SELinuxGui.init_modified_files_liststorecGstd�dS)Nzrestore to defualt clicked...)r�)rPr{rrr�restore_to_defaulthszSELinuxGui.restore_to_defaultcGs(|j|j�|jjd�|jjd�dS)NT)rCr'r[r�r\)rPr{rrrrkszSELinuxGui.invalid_entry_retrycCsVt|�dks|ddkrR|jj�|jjd�|jjd�|jjtd�|�dSdS)Nrr�FzAThe entry '%s' is not a valid path.  Paths must begin with a '/'.T)	rr'rr[r�r\r(rJr)rPZ
insert_txtrrr�error_check_filesps
zSELinuxGui.error_check_filescCsly t|�}|dks|dkrt�WnFtk
rf|jj�|jjd�|jjd�|jjt	d��dSXdS)NriFz'Port number must be between 1 and 65536T)
�int�
ValueErrorr'rr[r�r\r(rJr)rPrZpnumrrr�error_check_networkys
zSELinuxGui.error_check_networkcGs2|jr.|j|j�td�kr.|jj�|jj�dS)NzMore...)r�r�r]rr[r	rNr)rPr{rrrrA�s
zSELinuxGui.show_more_typesc	Gs|j�|j|j�}|jj�}|jj�}|jr�|j�}|jj	|d�}|jj	|d�}|jj	|d�}|j
j|d|�|j
j|d|�|j
j|d|�d|||||d�|jd|<n"|j
j
d�}d||d�|jd|<|j
j|d|�|j
j|d|�|j
j|d|�|j�dS)	Nrrrz-m)r�r�r��oldrange�	oldseuser�oldnamerz-a)r�r�r�)r'r�r,r/r�r-r�rrr1rwrvr�rXr�r0)	rPr{r��	mls_ranger�r|rOrNrMrrrr�s&

zSELinuxGui.update_to_loginc	Gsj|j�|j|j�}|jj�}|jj�}|jj�}|jr�|j�}|j	j
|d�}|j	j
|d�}|j	j
|d�}	|j	j
|d�}
|jj|d|�|jj|d|�|jj|d|	�|jj|d|
�d||||
|	||d�|j
d|<nD|jjd�}|s�|�r
d|||d	�|j
d|<nd|d
�|j
d|<|jj|d|�|jj|d|�|jj|d|�|jj|d|�|j�dS)Nrrr	rz-m)r�r�r�r�rM�oldlevel�oldrolesrOrz-a)r�r�r�r�)r�r�)r'r�r:r=r�r>r;r�rrr7rwrvr�rXr�r0)rPr{r�r�rPr�r|rOrRrQrMrrrr�s2


"
zSELinuxGui.update_to_usercGs�|j�|jj�}|jj�}|jrl|j�}|j|jj|d��}|j|jj|d��}d|||d�|j	d|<n |jj
d�}d|d�|j	d|<|jj|d|j|��|jj|d|j|��dS)Nrrz-m)r��src�oldsrc�olddestzfcontext-equivz-a)r�rS)r'rCr�rBr�rrr
rvr�rXr�r�)rPr{r rSr|rUrTrrrr�s

zSELinuxGui.update_to_file_equivc
Gs0|j�d|_|jj�}|j|�r&dS|j|j�}|jj�}|j|j�}|j	r�|j
�}|j|jj
|d��}|j|jj|d��}|jj
|d�}	d||||	d�|jd||f<n$|jjd�}d|d	�|jd||f<|jj|d|j|��|jj|d|j|��|jj|d|j|��d
|_|jjd
�|j�dS)NTrrrz-m)r�r��oldtype�oldpathZoldclassrz-a)r�r�F)r'r�rYr�rIr�r]r`r^r�rr�unmarkrvrwr�rXr�r�r�rer0)
rPr{rS�setyper�rr|rW�	oldsetypeZ	oldtclassrrrr�s,


 zSELinuxGui.update_to_filesc
Gs2d|_|jj�}|j|�rdS|jj�r.d}nd}|j|j�}|jj�}|j	r�|j
�}|j|jj
|d��}|j|jj
|d��}|j|jj|d��}	d|||	||d�|jd	||f<n&|jjd�}d
||d�|jd	||f<|jj|d|�|jj|d|�|jj|d|�d|_|jj�|jjd�|j�dS)
NTr�r�rrrz-m)r�r�r�rV�oldprotocol�oldportsrz-a)r�r�r�F)r�rZr�rLr�r�r�r�rbr�rrrXrvrwr�rXr�r\r	r�r�r0)
rPr{r�r�rYr�r|r\r[rZrrrr�s0



"
zSELinuxGui.update_to_networkcGs�d}|jjd�|jtkr�|jj�|jd}xZ|D]R\}}|||fd}|jj�}|jj|d|�|jj|d|�|jj|d|�q4W|j	|j
�dS|jtk�r,|jj�|jd}x`|D]X\}	}
||	|
fd}|jj�}|jj|d|	�|jj|d|�|jj|dt
j|
�q�W|j	|j�dS|jtk�r�|jj�|jd	}x�|D]|}||d
}
||jdd�}||jd
d�}|jj�}|jj|d|�|jj|d|
�|jj|d|�|jj|d|��qRW|j	|j�dS|jtk�rx|jj�|jd}xd|D]\}||d}||jdd�}|jj�}|jj|d|�|jj|d|�|jj|d|��qW|j	|j�dS|jtk�r�|jj�xX|jD]N}|d�r�|jj�}|jj|d|j|d��|jj|d|j|d���q�W|j	|j�dSdS)NZAddFrr�rrr	rrr�r�r#r�rrr�r)r�r�r�r�rmr�r�r�r�r�rkrurjr�r<rhr�r?rr@r�r4r5r�rGrEr
rH)rPr{r4Z	port_dictr�r�rYr|Z
fcontext_dictrSrZ	user_dictrr�r�r�Z
login_dictrr�r�rrrr6sv

















 z SELinuxGui.delete_button_clickedcGsx|j�|jtkrNx:|jD]0}|drd|dd�|jd|d|df<qW|jtkr�x>|jD]4}|dr`d|dd�|jd|dt|df<q`W|jtkr�x8|j	D].}|dr�d|d|d	d
�|jd|d<q�W|jt
k�r(x>|jD]4}|dr�d|d|d|dd�|jd
|d<q�W|jtk�rlx6|j
D],}|d�r<d|dd�|jd|d<�q<W|j�dS)Nrz-dr	)r�r�rrrrr)r�r�r�r)r�rr�r�r)r�rSzfcontext-equiv)r'r�r�rmrXrurj�reverse_file_type_strr�r?r�r4r�rGr0)rPr{�deleterrrr�Os,
(
,
&,
"z!SELinuxGui.on_save_delete_clickedcGs,x&|jD]}t|d|d|d�qWdS)Nrrr)rjr�)rPr{r^rrr�!on_save_delete_file_equiv_clickedgsz,SELinuxGui.on_save_delete_file_equiv_clickedcCs||d||d<dS)Nrr)rPr�rSr�rrrr�kszSELinuxGui.on_toggle_updatecCsVd}xL|D]D}|d|dkrF|d|dkrF|j|�}|j|�dS|d7}q
WdS)Nrrr)r�r�)rPrvr�r[r�r|rrr�ipage_deletens
 

zSELinuxGui.ipage_deletecCs�|sdS|jj|�}|jj|�}|j|d�}|j|d|j|d��|j|d�}||jdkrl|jd|=nd|i|jd|<|j�dS)Nrrrr�)ryr�r�rwr�rXr0)rPr�rSr�r|r�r�rrrr�wszSELinuxGui.on_togglecGs|j�|_|jj�dS)N)r�r�r��refilter)rP�entryr{rrrr-�s
z#SELinuxGui.get_advanced_filter_datacGs|j�|_|jj�dS)N)r�r�r#ra)rPr)r{rrrr�s
zSELinuxGui.get_filter_datacGsFd|_|jj�x�|jdD]�}|jd|d}|jjd�}|jj|dd�|jj|dtj|��|jj|dt|jd|d�|jj|dd�|jj|�}|jj|dt	d	�|�|jj|dd
�qW�x�|jdD�]~\}}|jd||fd}|jd||fd}|jjd�}|jj|dd�|jj|d|�|jj|dd�|d
k�rr|jj|dt	d�|j
�|dk�r�|jj|dt	d�|j
�|dk�r�|jj|dt	d�|j
�|jj|�}|jj|dd
�|jj|dt	d�|�|jj|�}|jj|dd
�|jj|dt	d�tj|�|jj|�}|jj|dd
�|jj|dt	d�|�q�W�x�|jdD�]z\}	}
|jd|	|
fd}|jjd�}|jj|dd�|jj|d|�|jj|dd�|d
k�r�|jj|dt	d�|j
�|dk�r|jj|dt	d�|j
�|dk�r6|jj|dt	d�|j
�|jj|�}|jj|dt	d�|	�|jj|dd
�|jj|�}|jj|dt	d�|
�|jj|dd
�|jd|	|
fd}|jj|�}|jj|dd
�|jj|dt	d�|��qnW�x�|jdD�]r}|jd|d}|jjd�}|jj|dd�|jj|d|�|jj|dd�|d
k�rn|jj|dt	d��|dk�r�|jj|dt	d��|dk�r�|jj|dt	d��|jj|�}|jj|dt	d �|�|jj|dd
�|jj|�}|jj|dd
�|jd|d!}|jj|dt	d"�|�|jd|jd#d$�}
|jj|�}|jj|dd
�|jj|dt	d%�|
��q�W�x�|jd&D�]r}|jd&|d}|jjd�}|jj|dd�|jj|d|�|jj|dd�|d
k�r�|jj|dt	d'��|dk�r|jj|dt	d(��|dk�r0|jj|dt	d)��|jj|�}|jj|dd
�|jj|dt	d*�|�|jj|�}|jj|dd
�|jd&|d+}|jj|dt	d,�|�|jd&|jd#d$�}
|jj|�}|jj|dd
�|jj|dt	d%�|
��q�W�x8|jd-D�](}|jd-|d}|jjd�}|jj|dd�|jj|d|�|jj|dd�|d
k�rz|jj|dt	d.��|dk�r�|jj|dt	d/��|dk�r�|jj|dt	d0��|jj|�}|jj|dd
�|jj|dt	d1�|�|jj|�}|jj|dd
�|jd-|d2}|jj|dt	d3�|��qW|j
|j�dS)4NTrr�rrrr�r	zSELinux name: %sFrr�z-azAdd file labeling for %sz-dzDelete file labeling for %sz-mzModify file labeling for %sz
File path: %szFile class: %szSELinux file type: %srzAdd ports for %szDelete ports for %szModify ports for %szNetwork ports: %szNetwork protocol: %srzAdd userzDelete userzModify userzSELinux User : %sr�z	Roles: %sr�r#zMLS/MCS Range: %srzAdd login mappingzDelete login mappingzModify login mappingzLogin Name : %sr�zSELinux User: %szfcontext-equivzAdd file equiv labeling.zDelete file equiv labeling.zModify file equiv labeling.zFile path : %srSzEquivalence: %s)r�r�r�rXr�r�r�rr�rr�r<rr�rV)rPr{Zboolsr4r|r�rSrrYrr�rr�r�rr�rSrrr�
update_gui�s�
"














zSELinuxGui.update_guicCsL|jj�r|j|_|jj�r$|j|_|jj�r6|j|_|jj�rH|j|_dS)N)r�r�r�r�r�r�)rPrrr�set_active_application_button	s



z(SELinuxGui.set_active_application_buttonTcCs~|jj�|jjd�|jjd�|jjd�|jjd�|jjd�|jjd�|j	jd�|j
jd�|rz|jjd�dS)NFr#)
r�r	r�r�r�r�r�r�r�r�r�r�rJ)rPr�rrrrL 	s
zSELinuxGui.clearbuttonscCsP|j�|jjd�|jjd�|jjtd��|jjd�|j�|j	�dS)NTZSystem)
rLr�r�r�r&rIrrerBr)rPrrrrH-	szSELinuxGui.show_system_pagecGsX|j�|j�|jjd�|jjtd��|j�|j�|j	j
d�|jj
d�dS)NTzFile Equivalence)rLr�r�rer&rIrrBrr�r�r�)rPr{rrrr"6	szSELinuxGui.show_file_equiv_pagecCsx|j�|jjd�|jjd�|jjtd��|jjd�|j�|j	�|j
�|j�|jj
d�|jj
d�dS)NTZUsers)rLr�r�r�r&rIrrerBr�r�rr�r�r�)rPrrr�show_users_page@	szSELinuxGui.show_users_pagecCsZ|jd�|jjd�|jjd�|jjd�|jjd�|jjd�|j�|j�dS)NFT)	rLr�r�r�r�r�rerBr)rPrrrrKM	s
z!SELinuxGui.show_applications_pagecGs|j�dS)N)rH)rPr{rrrr#W	szSELinuxGui.system_interfacecGs|j�dS)N)re)rPr{rrrr$Z	szSELinuxGui.users_interfacec	Gs�g}d}|jj�}|j�dkr�x^|jD]T}|jj|�}|jj|�}|jj|�}|dkr&|jj|d�dkrr|j	|�|d7}q&Wxn|D]}|jj
|�q�WnT|jdkr�|jj�|t
kr�|j|j�n*|tkr�|j|j�n|tkr�|j|j�dS)NrTrFr)r�r�r�rvr"r�r�r#rwr�r�r�r�r�r�r�r�r�r�)	rPrCr{Ziterlistr[r�r�r|Zitersrrrr%]	s,




z SELinuxGui.show_mislabeled_filescCsptj|�djd�d}tj|d�djd�d}|jtd�|||d�dd�tjjkrl|j	j
|�|j�dS)	NrrrrzbRun restorecon on %(PATH)s to change its type from %(CUR_CONTEXT)s to the default %(DEF_CONTEXT)s?)�PATHZCUR_CONTEXTZDEF_CONTEXTzrestorecon dialog)�title)r�r�rmr��verifyrr�ResponseTypeZYESr�Z
restoreconr4)rPrSrr�rrrrxv	s
&zSELinuxGui.fix_mislabeledcGs$|jj|j��|jj|j��dS)N)r�r�r
r�)rPr{rrrr0}	szSELinuxGui.new_updatescGsF|j�|j�td�k|_|jr2|jjtd��n|jjtd��dS)NZUpdatezUpdate ChangeszRevert Changes)rc�	get_labelrr�rVr�)rPr�r{rrrr,�	s
z#SELinuxGui.update_or_revert_changescGsb|j�|jr|j�n|j�d|_|j|jj��d|_|j�|j	�|j
�|jj�dS)NFT)
r'r��update_the_system�revert_datar�rDr�r�r!r4r0r�r�)rPr{rrrr+�	s
z%SELinuxGui.apply_changes_button_presscGsl|j�|j�}|j�y|jj|�Wn.tjjk
rV}zt|�WYdd}~XnX|j�|j	�dS)N)
r'�
format_updaterlr�rBr�r�r�rnr�)rPr{rDrRrrrrk�	szSELinuxGui.update_the_systemcCs2dddddd�}x|D]}||kr||SqWdS)Nrrr)zExecutable FileszWritable FileszApplication File TypeZInboundZOutboundZBooleansr)rP�lookupZipage_values�valuerrr�ipage_value_lookup�	s

zSELinuxGui.ipage_value_lookupcCs4|jd�d}|jd�d}|dkr,||_n|SdS)Nz: rrzSELinux name)rmZbool_revert)rPZ	attributeZbool_idrrr�get_attributes_update�	s
z SELinuxGui.get_attributes_updatec		Cs�|j�d}�xp|jD�]d}|dkrVx0|j|D]"}|d|j||d|f7}q0W|dk�rx�|j|D]�}|j||ddkr�|d|7}qld	|j||kr�|d
|j||d|j||d|j||d	|f7}ql|d|j||d|j||d|f7}qlW|d
k�rx�|j|D]�}|j||ddk�rX|d|7}n�d|j||k�r�d	|j||k�r�|d|j||d|j||d|j||d	|j||d|f7}n.|d|j||d|j||d|f7}�q.W|dk�rxxh|j|D]Z}|j||ddk�rD|d|7}n.|d|j||d|j||d|f7}�qW|dk�r�xx|j|D]j}|j||ddk�r�|d|7}n>|d|j||d|j||d|j||d|f7}�q�W|dkrxv|j|D]h\}}|j|||fddk�rF|d||f7}n0|d|j||d|j||d||f7}�qWqW|S)Nr#rzboolean -m -%d %s
r�rr�z-dzlogin -d %s
r�zlogin %s -s %s -r %s %s
r�zlogin %s -s %s %s
rzuser -d %s
r�zuser %s -L %s -r %s -R %s %s
r�zuser %s -R %s %s
zfcontext-equivzfcontext -d %s
zfcontext %s -e %s %s
rSrzfcontext %s -t %s -f %s %s
r�rrzport -d -p %s %s
zport %s -t %s -p %s %s
)rlrX)	rPrDrZrrEr�r�rr�rrrrm�	sH"
@2
(P4
4
D:zSELinuxGui.format_updatecCs`d}g}d}x.|jD]$}|j|ds0|j|�|d7}qW|j�x|D]}|j|�qJWdS)Nrr#r)r�r��reverser^)rPr[Zremove_listrDr�rrrrl�	s

zSELinuxGui.revert_datacGsN|j�tdk}|r$|jtd�n|jtd�|jj|�|jj|�dS)Nrr)r��ADVANCED_LABELrJr�r�rP)rP�labelr{�advancedrrrr@�	sz!SELinuxGui.reveal_advanced_systemcGsf|j�tdk}|r$|jtd�n|jtd�|jj|�|jj|�|jj|�|jj|�dS)Nrr)r�rsrJr_r�r`rarb)rPrtr{rurrrr�	szSELinuxGui.reveal_advancedcGsF|j�tdkr(|jtd�|j�n|jtd�|j|j�dS)Nrr)r��ADVANCED_SEARCH_LABELrJr'r�r))rPrtr{rrrr�
s

z)SELinuxGui.on_show_advanced_search_windowcCsJ|r&|jj|jtd��|jjd�n |jj|jtd��|jjd�dS)NzSystem Status: EnforcingTzSystem Status: Permissive)r�r`r�rr�rer�)rProrrrrb
s
zSELinuxGui.set_enforce_textcCs,|js
dS|jj|j��|j|j��dS)N)r�r�Z
setenforcer�rb)rPr�rrrr
szSELinuxGui.set_enforcecGs`|jj�}|dkrdSd|_|jj�|jj|�|jdkrH|j|�n|jdkr\|j|�dS)NF�Import�Export)	r��get_filenamer�r	rYrJr��
import_config�
export_config)rPr{rrrrr
s



zSELinuxGui.on_browse_selectcGsX|jj�}|jj�r0|jd�sT|jj|d�n$|jd�rT|jd�d}|jj|�dS)Nz(/.*)?r)rYr�r�r�r7rJrm)rPr{rSrrrr#
s



zSELinuxGui.recursive_pathcGs"|j�}|jr|jd�d|_dS)Nr#F)r�r�rJ)rPZ	entry_objr{Ztxtrrrr,
s
zSELinuxGui.highlight_entry_textcCs~|j�}|dkrdS|jd�r*|jjd�xNtjD]D}|j|�r2x4|jD]*}|djtj|�rH|j|j	|d�qHWq2WdS)Nr#z(/.*)?Tr)
r�r7r�rer�ZDEFAULT_DIRSr9r�r�r])rPrb�textr=r�rrrr2
s

z#SELinuxGui.autofill_add_files_entrycGs&|jjd�|_|jj�}|jj�}dS)Nr)r�Zget_colZboolean_column_1Z	get_widthZget_cell_renderers)rPr{�widthZrendererrrrr3>
s
zSELinuxGui.resize_columnscGs|jj�dS)N)r�r)rPr{rrrr&C
szSELinuxGui.browse_for_filescGs|jj�dS)N)r�r	)rPr{rrrr*F
szSELinuxGui.close_config_windowcGsl|j|jj�krdS|jtd��tjjkr<|jj|j�dS|j	j
|j|j��|j	jd�|jj�|_dS)Nz�Changing the policy type will cause a relabel of the entire file system on the next boot. Relabeling takes a long time depending on the size of the file system.  Do you wish to continue?T)
r�rPr�rhrrri�NOrer�r=r��relabel_on_boot)rPr{rrrr=I
sz SELinuxGui.change_default_policycCs4|js
dS|j|�|j�r0|jj|j�j��dS)N)r��enabled_changedr�r�r>rjr)rPr�rrrr>U
s

zSELinuxGui.change_default_modecGs0|jjtjj�|jjd�|jj�d|_dS)NzImport Configurationrw)r��
set_actionr�FileChooserActionZOPENr�rr�)rPr{rrrr9\
s
zSELinuxGui.import_config_showcGs0|jjtjj�|jjd�|jj�d|_dS)NzExport Configurationrx)r�r�rr�ZSAVEr�rr�)rPr{rrrr:c
s
zSELinuxGui.export_config_showcCs:|j�|jj�}t|d�}|j|�|j�|j�dS)N�w)rlr�r�r�rr�rn)rPrr�r�rrrr{i
s


zSELinuxGui.export_configcCsTt|d�}|j�}|j�|j�y|jj|�Wntk
rFYnX|j�dS)Nr)r�r�r�rlr�rBr_rn)rPrr�r�rrrrzq
s
zSELinuxGui.import_configcCsV|||f|kri||||f<||f||||fkrR|||	|
d�||||f||f<dS)N)r�r��changed�oldr)rPrVrQr�r4r�qr5r�r�r�rrr�init_dictionary|
szSELinuxGui.init_dictionarycCs*|jd�d}|dkrdS|dkr&dSdS)N�-r�0F�1T)rm)rPrrrr�translate_bool�
s
zSELinuxGui.translate_boolcGsx|jj�}tjjd�}|r"|r"dS|r2|r2dSy|jj|�Wn0tjjk
rr}z|j	|�WYdd}~XnXdS)Nz
/.autorelabel)
r�r�rcrSrdr�rr�r�r)rPr{r�rdrRrrrr?�
s
zSELinuxGui.relabel_on_rebootcGs
|j�|jjd�|jjd�|j|krV|j|j�|j|j	�t
d�krV|j	jd�|j|kr�|jrt|j|j�n|j
r�|j|j�|jj�s�|jj�r�|jjd�|jjd�|jjd�|jjd�|jjd�|jjd�|jj�tdk�r|jjtd�dS)NFTzMore...rr)r	r�rer�r�rNr�r[r�r]rr'r�r�r\r_Zget_visiblerar�r�r`r�rbr%r�rvrJ)rPr�r{rrrrC�
s,

zSELinuxGui.closewindowcCs|jj�j|j�|j�dS)N)r��
get_window�
set_cursorr�r)rPrrrrl�
szSELinuxGui.wait_mousecCs|jj�j|j�|j�dS)N)r�r�r�r�r)rPrrrrn�
szSELinuxGui.ready_mouser#cCsNtjddtjjtjj|�}|j|�|jtjj	�|j
�|j�}|j�|S)Nr)
r�
MessageDialog�MessageType�INFO�ButtonsTypeZYES_NOr��set_position�WindowPosition�MOUSE�show_all�run�destroy)rP�messagerg�dlgZrcrrrrh�
s
zSELinuxGui.verifycCsDtjddtjjtjj|�}|jtjj�|j	�|j
�|j�dS)Nr)rr�r�ZERRORr�ZCLOSEr�r�r�r�r�r�)rPr�r�rrrr�
szSELinuxGui.errorcCs�|j�sdS|j�}|dkrH|jtkrH|jtd��tjjkrH|j	j
d�|dkr||jtkr||jtd��tjjkr||j	j
d�||_	dS)Nria�Changing to SELinux disabled requires a reboot.  It is not recommended.  If you later decide to turn SELinux back on, the system will be required to relabel.  If you just want to see if SELinux is causing a problem on your system, you can go to permissive mode which will only log errors and not enforce SELinux policy.  Permissive mode does not require a reboot.  Do you wish to continue?Tz�Changing to SELinux enabled will cause a relabel of the entire file system on the next boot. Relabeling takes a long time depending on the size of the file system.  Do you wish to continue?)r�rjr�rrhrrrir~r�re)rPZradiortrrrr��
szSELinuxGui.enabled_changedcGs|jjd�|jjd�dS)Nr#F)r�rJr�re)rPr{rrrr!�
szSELinuxGui.clear_filterscGsB|js
dS|j�|jj�r*|jjd�n|jjd�|j�dS)Nzmodule -e unconfinedzmodule -d unconfined)r�rlr�r�r�rBrn)rPr{rrrr;�
s
zSELinuxGui.unconfined_togglecGsB|js
dS|j�|jj�r*|jjd�n|jjd�|j�dS)Nzmodule -e permissivedomainszmodule -d permissivedomains)r�rlr�r�r�rBrn)rPr{rrrr<�
s
zSELinuxGui.permissive_togglecGs:t|j�dkr.|jtd�td��tjjkr.dS|j�dS)Nra0You are attempting to close the application without applying your changes.
    *    To apply changes you have made during this session, click No and click Update.
    *    To leave the application without applying your changes, click Yes.  All changes that you have made during this session will be lost.zLoss of data DialogT)rr�rhrrrir~r�)rPr�r{rrrr�
szSELinuxGui.confirmation_closecGstjd�dS)Nr)�sys�exit)rPr{rrrr��
szSELinuxGui.quit)NF)T)r#)��__name__�
__module__�__qualname__rWr�r^rErqr rr�r�r1r2r0r
rr
rr�r�r�r�r�r�r�r�r�r�r�rrr/r.r�r4rNrDr�rrr�r�r�r
r�r
r�rr�rr�rr�rrBr%rtr(r5r�r'r7rr8r;r-r2r,r(r)rrrr.r/rr8rGrHrrIrLrArrrrrr6r�r_r�r`r�r-rrcrdrLrHr"rerKr#r$r%rxr0r,r+rkrprqrmrlr@rr�rbrrrrrr3r&r*r=r>r9r:r{rzr�r�r?rCrlrnrhrr�r!r;r<rr�rrrrr"ss*	@
	
	

B*

	})Y	5+

D	
A	

	


/		
	


r"�__main__)r	)AZgiZrequire_versionZ
gi.repositoryrrrZsepolicy.sedbusrr�r�r�rrrZsepolicy.networkZsepolicy.manpager�rcrZunicodedataZPROGNAME�gettext�kwargs�version_infoZinstall�builtinsr�__dict__�ImportErrorZ__builtin__r
r]r<r�rr�r�rZdistutils.sysconfigr�rsrvr�r�r�r�r�r�r�r�rsrur�ryr�r�r�r�r�r�rYrar"r�r�rrrr�<module>s�
	
site-packages/sepolicy/__pycache__/communicate.cpython-36.opt-1.pyc000064400000002262147511334650021227 0ustar003

>�\��@s,ddlZddlZdd�Zdd�Zdd�ZdS)�NcCs0|j�tjjd|�tjj�tjd�dS)Nz
%s
�)Z
print_help�sys�stderr�write�flush�exit)�parser�msg�r
�!/usr/lib/python3.6/communicate.py�usages
rcCs6ytttjtj|��d�Stk
r0|gSXdS)N�types)�list�next�sepolicy�infoZ	ATTRIBUTE�
StopIteration)Z	attributer
r
r�expand_attribute"srcsvtjtjgtj|tj|tj�i�}|s<td|dj��f��g}x0tdd�t	�fdd�|��D]}|t
|�}q^W|S)Nz*The %s type is not allowed to %s any types�,cSs
|tjS)N)rZTARGET)�yr
r
r�<lambda>/szget_types.<locals>.<lambda>cst��j|tj�S)N)�set�issubsetr�PERMS)�x)�permr
rr/s)r�searchZALLOWZSOURCEZCLASSr�
ValueError�join�map�filterr)�srcZtclassrZallowsZtlist�lr
)rr�	get_types)s""r#)rrrrr#r
r
r
r�<module>ssite-packages/sepolicy/__pycache__/transition.cpython-36.opt-1.pyc000064400000005626147511334650021124 0ustar003

>�\��@s0ddlZdgZdd�Zdd�ZGdd�d�ZdS)�N�setranscCs.tjtjgtj|i�}tdd�tdd�|��S)NcSs
|tjS)N)�sepolicyZTARGET)�y�r� /usr/lib/python3.6/transition.py�<lambda>sz_entrypoint.<locals>.<lambda>cSsd|tjkS)NZ
entrypoint)rZPERMS)�xrrrrs)r�searchZALLOWZSOURCE�map�filter)�src�transrrr�_entrypointsrcsF�gtt�fdd�tj���dd�tt�fdd�tj���}|S)Ncs|d�kS)N�namer)r)rrrr sz_get_trans.<locals>.<lambda>rZ
attributescs|d�ko|ddkS)N�source�class�processr)r)�src_listrrr!s)�listrrZget_all_types_infoZget_all_transitions)rZ
trans_listr)rrr�
_get_transs(rc@s0eZdZddd�Zdd�Zddd�Zd	d
�ZdS)
rNcCs(g|_i|_||_||_|j|j�dS)N)�seen�sdictr�dest�_process)�selfrrrrr�__init__'s
zsetrans.__init__cs���jkr�j�Si�j�<t��}|s.dS��j�d<�jsR|�j�d<nxttdd�t�fdd�|����j�d<ttdd�t��fdd�|����j�d<x �j�dD]}�j|�q�WdS)	Nrr
cSs|S)Nr)rrrrr9sz"setrans._process.<locals>.<lambda>cs|d�jkS)N�	transtype)r)r)rrrr9scSs|dS)Nrr)rrrrr:scs|d�j�gkS)Nr)r)r)rrrrr:s�child)rrrrr
rr)rrr
�sr)rrrr.s


*,zsetrans._process�c	Cs�d}||jkr|S|jj|�d|j|kr�x~|j|dD]l}tj|d|dddg�}|r�|d||d|d|dtj|�f7}q<|d	||d|d|df7}q<Wd
|j|kr�x.|j|d
D]}||j|d||f�7}q�W|S)Nrr
rrrZ
transitionz%s%s @ %s --> %s %s
�targetz%s%s @ %s --> %s
rz	%s%s ... )r�appendrrZget_conditionalsZget_conditionals_format_text�out)rr�headerZbuf�tZcondrrrrr">s
*$zsetrans.outcCsg|_t|j|j��dS)N)r�printr"r)rrrr�outputQszsetrans.output)N)r)�__name__�
__module__�__qualname__rrr"r&rrrrr%s

)r�__all__rrrrrrr�<module>ssite-packages/sepolicy/interface.py000064400000017757147511334650013457 0ustar00# Copyright (C) 2012 Red Hat
# see file 'COPYING' for use and warranty information
#
# policygentool is a tool for the initial generation of SELinux policy
#
#    This program is free software; you can redistribute it and/or
#    modify it under the terms of the GNU General Public License as
#    published by the Free Software Foundation; either version 2 of
#    the License, or (at your option) any later version.
#
#    This program is distributed in the hope that it will be useful,
#    but WITHOUT ANY WARRANTY; without even the implied warranty of
#    MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
#    GNU General Public License for more details.
#
#    You should have received a copy of the GNU General Public License
#    along with this program; if not, write to the Free Software
#    Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA
#                                        02111-1307  USA
#
#
import re
import sys
import sepolicy
ADMIN_TRANSITION_INTERFACE = "_admin$"
USER_TRANSITION_INTERFACE = "_role$"

__all__ = ['get_all_interfaces', 'get_interfaces_from_xml', 'get_admin', 'get_user', 'get_interface_dict', 'get_interface_format_text', 'get_interface_compile_format_text', 'get_xml_file', 'interface_compile_test']

##
## I18N
##
PROGNAME = "selinux-python"
try:
    import gettext
    kwargs = {}
    if sys.version_info < (3,):
        kwargs['unicode'] = True
    gettext.install(PROGNAME,
                    localedir="/usr/share/locale",
                    codeset='utf-8',
                    **kwargs)
except:
    try:
        import builtins
        builtins.__dict__['_'] = str
    except ImportError:
        import __builtin__
        __builtin__.__dict__['_'] = unicode


def get_interfaces_from_xml(path):
    """ Get all interfaces from given xml file"""
    interfaces_list = []
    idict = get_interface_dict(path)
    for k in idict.keys():
        interfaces_list.append(k)
    return interfaces_list


def get_all_interfaces(path=""):
    from sepolicy import get_methods
    all_interfaces = []
    if not path:
        all_interfaces = get_methods()
    else:
        xml_path = get_xml_file(path)
        all_interfaces = get_interfaces_from_xml(xml_path)

    return all_interfaces


def get_admin(path=""):
    """ Get all domains with an admin interface from installed policy."""
    """ If xml_path is specified, func returns an admin interface from specified xml file"""
    admin_list = []
    if path:
        try:
            xml_path = get_xml_file(path)
            idict = get_interface_dict(xml_path)
            for k in idict.keys():
                if k.endswith("_admin"):
                    admin_list.append(k)
        except IOError as e:
            sys.stderr.write("%s: %s\n" % (e.__class__.__name__, str(e)))
            sys.exit(1)
    else:
        for i in sepolicy.get_methods():
            if i.endswith("_admin"):
                admin_list.append(i.split("_admin")[0])

    return admin_list


def get_user(path=""):
    """ Get all domains with SELinux user role interface"""
    """ If xml_path is specified, func returns an user role interface from specified xml file"""
    trans_list = []
    if path:
        try:
            xml_path = get_xml_file(path)
            idict = get_interface_dict(xml_path)
            for k in idict.keys():
                if k.endswith("_role"):
                    if (("%s_exec_t" % k[:-5]) in sepolicy.get_all_types()):
                        trans_list.append(k)
        except IOError as e:
            sys.stderr.write("%s: %s\n" % (e.__class__.__name__, str(e)))
            sys.exit(1)
    else:
        for i in sepolicy.get_methods():
            m = re.findall("(.*)%s" % USER_TRANSITION_INTERFACE, i)
            if len(m) > 0:
                if "%s_exec_t" % m[0] in sepolicy.get_all_types():
                    trans_list.append(m[0])

    return trans_list
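
# Note (editor's addition): both helpers above work from interface *names*
# rather than from policy rules -- get_admin() keys off the "_admin" suffix,
# while get_user() keys off the "_role" suffix and additionally requires that
# a matching <domain>_exec_t type exists in the loaded policy, so role
# interfaces without a real entrypoint executable are filtered out.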

interface_dict = None


def get_interface_dict(path="/usr/share/selinux/devel/policy.xml"):
    global interface_dict
    import os
    import xml.etree.ElementTree
    if interface_dict:
        return interface_dict

    interface_dict = {}
    param_list = []

    xml_path = """<?xml version="1.0" encoding="ISO-8859-1" standalone="no"?>
<policy>
<layer name="admin">
"""
    xml_path += path
    xml_path += """
</layer>
</policy>
"""

    try:
        if os.path.isfile(path):
            tree = xml.etree.ElementTree.parse(path)
        else:
            tree = xml.etree.ElementTree.fromstring(xml_path)
        for l in tree.findall("layer"):
            for m in l.findall("module"):
                for i in m.getiterator('interface'):
                    for e in i.findall("param"):
                        param_list.append(e.get('name'))
                    interface_dict[(i.get("name"))] = [param_list, (i.find('summary').text), "interface"]
                    param_list = []
                for i in m.getiterator('template'):
                    for e in i.findall("param"):
                        param_list.append(e.get('name'))
                    interface_dict[(i.get("name"))] = [param_list, (i.find('summary').text), "template"]
                    param_list = []
    except IOError:
        pass
    return interface_dict
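
# Shape of the cached mapping built above (editor's note): each key is an
# interface or template name, each value a three-element list --
#
#     interface_dict[name] == [param_names, summary_text, kind]
#
# where param_names is the list of <param name="..."> entries, summary_text is
# the <summary> element's text, and kind is either "interface" or "template".
# Because the result is stored in the module-level `interface_dict`, a second
# call with a different path still returns the first, cached dictionary.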


def get_interface_format_text(interface, path="/usr/share/selinux/devel/policy.xml"):
    idict = get_interface_dict(path)
    interface_text = "%s(%s) %s" % (interface, ", ".join(idict[interface][0]), " ".join(idict[interface][1].split("\n")))

    return interface_text


def get_interface_compile_format_text(interfaces_dict, interface):
    from .templates import test_module
    param_tmp = []
    for i in interfaces_dict[interface][0]:
        param_tmp.append(test_module.dict_values[i])
    # Build the call once all parameters are collected; this also keeps
    # interface_text defined for interfaces that take no parameters.
    interface_text = "%s(%s)\n" % (interface, ", ".join(param_tmp))

    return interface_text


def generate_compile_te(interface, idict, name="compiletest"):
    from .templates import test_module
    te = ""
    te += re.sub("TEMPLATETYPE", name, test_module.te_test_module)
    te += get_interface_compile_format_text(idict, interface)

    return te
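
# generate_compile_te() above stitches together a throw-away .te module: the
# TEMPLATETYPE placeholder in templates.test_module.te_test_module is replaced
# with the test module name, and a single call to the interface under test is
# appended.  interface_compile_test() below writes that text to compiletest.te
# and builds it with the devel Makefile to verify the interface still compiles.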


def get_xml_file(if_file):
    """ Returns xml format of interfaces for given .if policy file"""
    import os
    try:
        from commands import getstatusoutput
    except ImportError:
        from subprocess import getstatusoutput
    basedir = os.path.dirname(if_file) + "/"
    filename = os.path.basename(if_file).split(".")[0]
    rc, output = getstatusoutput("python /usr/share/selinux/devel/include/support/segenxml.py -w -m %s" % basedir + filename)
    if rc != 0:
        sys.stderr.write("\n Could not proceed selected interface file.\n")
        sys.stderr.write("\n%s" % output)
        sys.exit(1)
    else:
        return output


def interface_compile_test(interface, path="/usr/share/selinux/devel/policy.xml"):
    exclude_interfaces = ["userdom", "kernel", "corenet", "files", "dev"]
    exclude_interface_type = ["template"]

    try:
        from commands import getstatusoutput
    except ImportError:
        from subprocess import getstatusoutput
    import os
    policy_files = {'pp': "compiletest.pp", 'te': "compiletest.te", 'fc': "compiletest.fc", 'if': "compiletest.if"}
    idict = get_interface_dict(path)

    if not (interface.split("_")[0] in exclude_interfaces or idict[interface][2] in exclude_interface_type):
        print(_("Compiling %s interface") % interface)
        try:
            fd = open(policy_files['te'], "w")
            fd.write(generate_compile_te(interface, idict))
            fd.close()
            rc, output = getstatusoutput("make -f /usr/share/selinux/devel/Makefile %s" % policy_files['pp'])
            if rc != 0:
                sys.stderr.write(output)
                sys.stderr.write(_("\nCompile test for %s failed.\n") % interface)

        except EnvironmentError as e:
            sys.stderr.write(_("\nCompile test for %s has not run. %s\n") % (interface, e))
        for v in policy_files.values():
            if os.path.exists(v):
                os.remove(v)

    else:
        sys.stderr.write(_("\nCompiling of %s interface is not supported.") % interface)
site-packages/sepolicy/help/files_write.png
[... binary image data omitted: PNG screenshot used by the sepolicy GUI help for the "files write" page ...]
���CU�q�MӼ�g�tM7�9�Y!�
� �͢(
�(��z>��X�{Ƃ�u칼Rx�"+ X��lI����6w�T:�>�ܲ$�p��zY��[},��߭J$3iLS�,D���(ʍ
�Y
I��\Ϲ��ɧ��i�l6Hʡ:E��t]���\��I*��9W�4
Q��p|�eQ����P������3qEi�ղȒ�a^F�$q��!^{�-�B!A`�-�X�4��8�L'Zj
:�´� ��:C�Ô�t:�B1�{Ǚ�>z�o<�,�b	�m;>B��ܽy�/H?��/��rSZR�=�7��ۃ���� O=��LG��<�D3+�fV7�Cˉ� �����g�~�MM�ٷ���y���_�KB� �>�(��c�066��ظ~=�H�ښ���P��$I�}�i��,Ͽ�+�/g��ʲ0��p����APd��[Y�t	���L�#��~�Y�r�(ZuYY����!�Jj��a掋>z���|�?��h�Wo
aV�>�i�h��o��U�����x�p��֮�|pT�����rq�	�,^|�]|7E^{�M�=N ���t�����OII1���٧�����/|ǣ�/��
��SLNMRRTLEE9�<�~�V���^�u,VY����|V�������躞?.�H��2����,ɡ24<L��GE����B�����$Q^V�,ˤR)FFGE���R�>���b$S)��%6m��%��$���1LӤ���D"����9u�4e�����1=5��*�A<gdl��MII�%,�fJ�H��7��m� `�&����G7����ó��D������PTT�(�d�:�`�vHym�i���nY��̛7��ǎs����g;Y�r�N����bxd�t*M&���v�~��߿���7hin&��0<2������2=��y�������` ����x���IF��)).���(__٬F:����=j��QT���)�N'S=��\.K�E�
�z�(/`xl�p8Lqq1E�P��&t�Fu4M�L��4*G�g|b��NQ_WK*�"���dp�܄BA���D�l6K����r16>N,�����rl|���)�� >��ѱ1��4���$�	~�‹��D�e��vrj�T*E(��p�|O�N������v�ez:̢�V�ۺL��`xd�l&���YM#��2<2��i����p8+��3O>��� ���Wy������q�.0�v�I&��3iLӤ��EQH$�w����4���"�	�����l�V����|v����VC@�%�s��'��(2�W����
q�ݜ����~��w�M׭���e���o019IqQׯ�7����G�5�|�]Z��9�uM��ݨ�Ov�&�QUYE�zFFFٻ?�h��/?���b����I$�3��z/��͘�9�<Y��8���8�UUTT�s��Q�,uz6�۶��j�r46��o~�CU)//���r:�h:�(�Re,=����PU�/[��شa=��a�ܽ�S���i>�����������j�.\��r�l'o��6~����>9���$wm��IW�y>ض�h4�w�󭼚xx��?�+$I"�i|�;ߡ��
A��|�s'E�˖.%��s��Q\N'�(���/��H��<�ԓH���/�DIq1���CΚ��
=�AV�gf���LMM�#q��i�ܽ�������۔��L&y��G9�������x<����O<��
�p�T;���>������LSc��DY������|7���v�Bb�8���E�e\N'�o��_��2^��D2IMu5�x�k�Y�a��:�$�~�-Y��#��C�0ػ���AE���y��Z��"R�.��Ғb~��x����x<�������o��
�d���-l޸���z�x�zW�ᄐ"0�<��|�,m�$Kd���深����iܗ�3��K�TE����xq9�l��n9ʇ�?�{���ٽo��0�� g�v22:�w��u�y�	|>�T
]��4�d2�–���G�m�B*��X�d1�ee�w�3=��a>��7X�b;��!5�\�����'O
������=��u9]x�^٬F<��lF��سo>�{� �L�F)
�X�Ҍ4�n��dQ$�Z���&�.$�L�㓝��>�o+}���?p���J|^����;�]��d2��u�hmi�O���������^��@ ��ÇIg2�_���/����*>ٵL���|7e�e����t{{~y@�%�EEE��.��,�Dª|Q�l�9<DII	�S��ݿ�T*M"�������8k9��R�*w�00
���O35=M2�fpp���>R���o}�9Z[Z�d�n��a�jk��w������y�v:�!��{�^46��~�ukVX�t)�����R
�8>O<�K��J����d�]�Y���}�$�I9�_�z�����'�L��	p:x�^\.�aO�-���x�Ov�B�q���"�h�|M��tͲ��4|^/e��;~��s]8�N�>�t�-�7��3Os�T;�����S�գGI&�$I�*+il�?��麎$Hd�[����|f����.��u]ǘ��+++X�v
kW���r�̺��k3�g
Q��,i[�ݛ�B�4t]���K�t���t������%�i�ٷ�#ǎ�q�A�|�F^�蚕�L:C*�B�e<ׯgŲe�{꺎"+�X���[�a��VL�L�$�͒L��=V���DqQ�}�4M�����IL� ���9�L5�i����RVV�?��:�MM464 ��}�!K[�U���PUղW�
�T:�?��:��y�?3�0tӴ�f2�n7%%�<��3,lm�������[������܌Y�Z8�ɐN�QT�P0�C<��
�Y�����{t����_�@*�.xF�8L�R��A"�����P0H4��p�	t�@V\N'.�-��0
TE��t�PU�٬�ܩ4��a�e<��B*p��cFä�Ȳ��"Xv(9�Gk�g" ��0DQDUTUE���YZ,`�=w�z�ʼ�d� �Lan����:y�TE����Չɒ�6�tv�w�~��X�
��$f9��*�wu��%̫��'���^}�MR�Ԭ�'�"�t�f�666_.S�dҙ9��낀(��9{����W���U+9�u��mm���<��S,Y���	�N�M�<��^�����6l������~UU�6�UU����|��Ɛ%	��@%F��x�׿a|r���l!��}�'1o�<zz{���慦ay���e���Ϯ����bq[.O�ir�f���$�)�'� ���F�o	�K�%Ib�e�޳�E[q�\,Z�J��

�I&�����5�V��"�������g��4M&&&0M�ۃ�(=v��������O~���;��c�RR\LǙ��BA�,n���EQ�ٽPPᲄ�ᠢ����}�ʊ
��r����P�ښNwt�s�n�$^�w��=�,��鬆,�����H&��7�AiI1�N����O��tu��_��ɩ)��nΝ����cDc1��a�|�1�����;w�������%��p�G
��Ԅ:�&E�d�G�=������0�����ѣ�����m��Z�=���,�I�d
T4�,��I$�ۿ�eK�p������4���_&(��ih�Nuu�a04<Lcc���Fy������:֭Y�����w���s]�9r��d��m
�
�_�U7td�qmcc� �޽�lmm�eY<42D4��]vB<�`brô�q�����C���)).F�u�GF�f��������2���i�p���\.���I%SLNMP��H�h�c�N‘�d��CYi)�L���a���Pd���Q@���������4n���r3��f�D�x=�GFq�\��~FFsy.��t04<L:�����` � L&h�_���9U��t���	**ʑ$�x<N4���M�f�Q�s$az:LEe�L���><7Y����&�H�hJ��384H}]�t���!\N'���8g�t]gl|�⢢��x"1#\\�Q^^��(\j��0M�޾>L��j���4M��8��%���i��E��(�i�p�s�ϳw���n�g�1�x���'X�v5�ee�BD�Q���	������g|b���4š�@���!�����!��:��S�|x<��1_Q�e�75=����4M���`0���$)o�oS��x=&&�0
���R��'�'ℂ��3��f5‘0�`0�����^G�5���3��q��'?eŲe���P]U��� �H0Z�b�
c�x=�Yen��h���zK�fcccs��@GG�܂����l&�����K�'Y�m�y˥+ܾw񘁵��
ϛko��[�����-Q��/��Ο�zpa2�ꥋ(�������_��˕>��}�_���n�/�Թ>&'�(��f҃a\L'�2��ĩS��;��Ix�ͷ�-��>���\�,��8�T���\����M���;���v��E�Zfb�8�z�e��s7-��D�������S_Ww�3�H�G�Q*+*	��Z����
����M�Zc�v�;�01p:K�/W�W�\�����V�+	�K���
�{��F�9���Q�(/�lrƢM�
׵5�Ӿ{��w�W�4w��R��4-PYY���H�������Zf�4��iGʴ������s�wN§�|��h��K��E���?.U�Գ�~�r�(/���b�Ί���W���y�l\�.�&/KkW��~V�٬�F����3!g�Y��h~&��dH$(��v���n�"@V��ȲtGw�.��,	�q;ds!����ۍ`��X<F8�.r���j,X�yxt��3ˣYyy�ݙ_����5>���磩����|I�X�B�T�fg�����0�����Q�d3L�M ��I}]=%�%7;�6�� P]UMuU��Ί����
�N����9�zD�Q�FǬ5~�B����A�kF���]�i�p�n0�'��0��cLNN!��%%�8#p��|666_Mn����O���0��� ��)��w����i��˧?�mllnS��Qq������}~d2c�����]EW��O_�~$8Nt]����q�\��z�kZ�p8�w��v�q:��qL�4b1+<�d�%���B0�ۣ��R>3���,��t��x�d�D�1�	��(
>��H$����U�t�T*����*�ϣ���d�rQ>M��t��k�W�BGo�b�J&n���I���@�u
��)]��$���$�I$IF�$��[	G�ٸ~۶���khhh��������(�H�DYY9�7�MII�t�C�LOO��u�|��}y�1�
�����|>�̬b�һ1���x<��]��ۃaTU���C�r��|���4�����{����~ǖ-[�W?���N�������i����a|��#���Ѵ,�`��m�s��[nv|�\I�s��<�`jz
��Y�x�`����<�8Λ��ێ+L��`6wm��ښ:R��S���D6��{l!�SZZ��wѧ�aZ�5�<��i�m�t�t��p��M �ĉc

�i�yM�R���)//��r_����\s��5��}\!�8���w�CyE�lY��4
��ϣ�>�"+�3~H2�̬�M�P��M����+�g6CIq)��06>FeE%�X�T*e��6��+��rL^>[?)�"��}/�d�Ov}���a͚u�R)z{{����b��t8H��VTIA��������$S��C!���ov�}�*�C�311�';wP_[�=woA�4Eatl����4MFF��D"��n*ʭ0��#��IyE��0<<D&�����P0t���sE}�i�?q���NB�"|^�:)+-������ٳo7�(�p:���_ ��r��~@@u8(+-���#�#D�Q���$��s�����*ҙ�N�$��0o�|6�u�m�ac�9r;�����)-+���i��(��-�e��F�d�Lj�HCC#M��tt36:‰��)/�`�G"+
Z6��秵u�����Ǟ���CCC,hj�ȱ�464�u��6n������������t�*+*��Bn���EV���vZ[r���� �T������{�N�	׽��{�������1����&k֮���jtlt&���sg!��x�R&''�p��?������������~Z�[m��U��B�����E��K�Fgw�T���*���3>1>k�/��`�`0HmM��%�bQ��S��D"�P���{���I�N'�m}���!Μ� �J�$R���ͧ�v1��+�%�%<���\���l��z��G~AP�ʊJ$I"�e~a4ŋ׵֒�/[�����;�D���(
�uNS�F�Fص�6n�����_����O�`͜WYQEqQ1�ϟ��p�,�a���wo!�J���Ř����~��=\�p[���]J���hs��*�(TD"�`jj���~��K��W��'�~JKJ)**�����U����̛���S'�P�����O�[�{����aɒ�����MYa��A��F9p�U�U��@Q�8��}3�-��_UULØ��gcc�yq�X�ϕϡ�!���S
$�Nc�&�,QRR�,�(��i���A�lI�H��y#'�$	e�NQ+$v_/���I*�"���x���"�J^��3�����F��-j㓝#�2kV�EQ4]gll�D"��p���PU��Q^^q������*��p:�l�h��1M�˅$�	:ϝ��CTW���4���A���4M����y�����"�P^V���;nw���(R\T�<���tq�����ctt��օ����0n������(�Yy�^*++	Gˆ�H �����};~�y��[��$'�`Q��66�'���?�N���ͩ�'q:��^����
��8�,�{�.L�r����{�y��s]�p84�o$*B�eB��e�syY9��-tt�FU�����Qy�'8r�0==ݴ�,��|�~2����6�����E2�����X,��e9y����(/�@�\��JSӂ�]e���g�9	�l��~�Ϟ���֖�4�o��3>>FYY�HI���008@QQ

�LNNp��<����ҊRP_w*�K��dn�w+���[i��$Iy5Y.š,�Hc�,��:�TEVp8�B�j��,˳���id2�A�$�H���k����0
dI�4ͼC�;]-cc�ya�&��i���-ݮ���\���f�A �9�(
H�<+� �(J�v@��uQ�%�-g@WH΀0��0/���̑[��g�d2���k��{��������,�]�I������lY�vb:<���3���d2A@Q,�͋�.M2�}�էa��8�@���B_�4zzz�������w_UU�/]B&��\׹�g���-D�KU�Y�e|s�9����V�$���$�����Klcs�r;���������_mvW��>_�߹��<��+]{xd���)֯�XK^���1+�$Ix<�/��,��^J���5���S.K���l,�Ұ��ø�U�w
n���7�CyY9EE�l��~�s�|U�K�o�� ��������C�|x�[��%�_%�%�"�*�B�W<��i��N�n�����XEQ�*����a�����o��i�H&�O_�����4�7;?_v��V��I�L&Y��A������^��U���<�0L~�{M��R�i�S"�1�EI�d�됈�u\��n�f��U�h�No����~铷X~��>u���N�ŦO�b��}�'�n�n�Y����w'K�ֽ�B����8�@B�I
�AV�L����dJd<!���c�������vz;������ �$5�ť:��ftl좠�$*���
�ɩI�_���y4��(�&����"7`0�Ⱬ�f�������vz;������)R�;L�kI#f"�8u��|��M����u�B5A�c���~���7�L�=���No�����o��#�,����#E��IߕэO���0���4)'|ʹ�{�+]SN)_�s=o᳈]��7L�\�ȷ��͕�<ے�զ�;�ݎo:���1Gl�V������2�v�0�¯�T�S�]j�+��[	���.$Uh�v�Qo�
@]�Jm�zCb�>�����.��.(w\s�#
P�QeaV~�:xfe�,�\�y�T*
�	Y���!�TVԹyfE�{[|�o�lmll�&��|��d.EdQ�U����A��>�fa�P�i,s�H�U���%^�
�>V�{�%Z+?����d�M!
֚�,�Pn�k�S#�Nh��״K(V�ZB���#0��,���#0�Q�G�h(q䏙�̌s��
���<lj�Z[�f�K3k&`�]|�����Vxc暪,���K�W�~��2�a"�����XT#�֩�y^�,b��H������b�����xIi#�,�<H��$
DR:+��lm�۞�ll>#�XT�ٖh��襱�AV7	�$l�Y�<7��}A�G��!����}�n��T8i*w�V�b�<����޷���q\徆9�z��k�Y��&���ŗ��r�����0����"W�ڧQ�~��4i.wr�B?�ȓ˂{�|����3+C,�u���ͼO,����rº��a~5��	En�ꐊa��\N�jj�|��Aj��g	~��I�D����u	��r'cQ�e�n���il��a2��Z����J�5�f]�U�K3��dΎ�h�p���M2c�Ig���.K��:;�F���M^B��qTI`y��Sb4�e{G�"pW��"p�'N�x�Cd^���O��(lh�H;;���w-��sJ��z������<�<Hk���ÌF5���������Nj0cLQ�Y5σi�t��hJ�%\�c��@�%�s�������0nU�>�+��N1�����q��Q�|2�.	RP��=����I:GR��lh�R���9��(��z7��.�����4��I�-��"?>�D�[b��(��Lj�T�ָPe�0iL�5�澅~�&��L����c(�hD���J���7+�G���R����Mt�q�"+�ܸU�T�dwW��,�uS��=��)���CF3�u.�n�lj��R-���g�,�u�IJ��NG�0����0a~�:�lt�>���0-��&/�h��&�K�.�q���͉�$Ͱ���iw65�(�ɜLrz Im�ʪz&��+��uN|oU ��-Q�(�TT�SVV��^�(S	����{pUXP���`���Z擹k��z"�1�ytI���N����
�$u�9��k�<�,�DL�o2CV7Ie
z'3,�r�y���(�)�NK�@"cpz0��y�˝<�8H�_F�Y©TUdƢO�a��5�&��<�8@�-�?�a�|/~���<�RC�,�	�R�Xh"��E�;93�$���$�n@�h�3#)�D"IE���Bw���_�`��(��	���f2��5��E~��H$�3��f�J��3�2����|TҚ�3+B4�9yry�d֠}(�oυ&�V�������~��j���L't�I��p�2Ah�&5!�M|L'5ܪ��fUA��W���e^���
4��<{���N�m��3�f8��ī��P^��������o2Cs��e�n&�є΅��d��� P�W�}���~&�֞�{[���*�5f�E��k�)�lǣٛ2g1Mf�����A|ι�3N�DS&3,(wP�W��&lZ�cI��hJ��eAj�T[�ܯX�&<���n��ZF����R���Ik�D$�]U��&T<�D�P����cK�Ȓ@�d����jE��K���i|�%��y�-@�P�ʠJF7��!��(��[i,uX3�bn�ȯNrn$��y����M���CE@���AF7�:k-^YX�w"�S�f�/�`:����/u�����lh��<��T��X���q�&3,��1k0�{���4�`��2���'��H��t���tK��Ѭ�ĩ��;y�d�މ4��<�����
83��H_�
M^�k�~f8E�`��f2Ͳ�;N,�c�&���=��He�|^�Ď3Qܪ��yV�K���9�ΰ�\�b�l	�I�鄎'!q�/�"	��5���;�D)�+|w}1}t��	yd�5z��0���3�}��Mf�xFӸ���n��:��%�Ϸ�1׸0�aw��"�"�;��6��o�9畨�y�A�ɸ��Y͠��E�T&_V�~�Br,�14��c8E�p
E�r�/A�&X5!�"�L�[ƥXK���M,�?�2�8=����b�&YXP��+�����$莳��KK�����l_�	�LE@�03�L/_�5/^kT:�I�$2�iro�����m�.K�l�PW�@��n(��{e���8d��KbY���H���g��(Y�0L4�dU��0�N�k�����hÑ,�����f��5�:��#�T�����s���:������Q��b�*]L%t&��	~�Ă�睌k��TU�u�Hk&G�`Z
,a��(���&�1������ھP�?��4�987꘱�X\�}fٲ�+�\���9��:����C2��RETY��҉"Y�4���j���gq��r�B"c�gS�DJ���Is���p��"y����O,��h$K4�����w������F/�u��(�X�ȟIL�466�XT�"���j8k�tɄ�:єNe@A��DҺ�����xLc$�qv$E�d�x*�w�(X�4�*�%�.N$�;���a����*��c*C�엺�A,[-�*r�B�2�݀�Ly@�!��/u0ɒ��lɢ��&&�$PW�2��,wZ��ɸ5��q&��+B����?5}[�Ң`m��f�k���}�w>U�tF%3���J8����O�?�Řq�̚�銑�T�T*�
#�,]�i2��ّ4Kj\��;���4t�de��R���4&�T�$�Yj�C�qZ*��T8�N�$���΅�,~�D�T�Ӄ)˜y�Yi�p�a{�H�R�-�uS_�*K�p��r'ʜTU����Α4�q��r'Sq����*�Ȣ@K����Md�+v0����Ԭ�qn��vfF��\Y�:�����q[����
���,q�K��X��H��R'�n�������YUZ�2�̡�{L',����pX�o2��~K��9p�"�z�Ȣ��Z��F��N��$uڇRTU\��xTcK��s���H�D��4tR��ޮi� �5�>�C�oa�:d���ܪ�Ɂ$Sq��*�$�ީ�aB�x�Sb:�sv$E�-QP�N1ɲ��E]�J8�3�p("��iI@7LڇR�3e50�����:uE*�
SY�3���0u�Y�牉�_��ԉ�%�7���`�ʠ�f�dL��H��h�t�D|N��1��&��-R�y.J�%�Za���ʲ����Z71���bDS�m�G&TU旨�epͱN2��E�}��6�B����,4��̬�N������ �Veɂ�_o�KD�2��މL�<Y�m��HY�̯��F�Z�~̂K�I3#�R�L�#sn$���K����a-)�T�T�����x��p1�V�1LX=�͚y�u�-�A�y��	8e�?���#�q���[ל#���ͧ��=z2˵��qk�����r�?���qk=_��'��t0Wgo��E$����S�<>��с�՗D�Y�޼�sMK=m�-�/=7�͜S��!��s����*̟yy),�/PD�,t�ʋTP�(���0����Ԉ3��w"�;B�*gmf����O綨�x�pi���r[rA��c=�M�4Ʋ�k�x����K�v%OB���K�����̬�B�KYx��)3���¬]�H7�F=��j��ǎL˜�o�pI�O0���Y�!@�P�HҠ{<�6s�K�Q�ôlz'2��eպ����>`�v{�t�$��dtsV;7̹�Kxx��/�SMG���|��\+���9W��XH�p�+�G�
��q�Jy���;r\���>:��Q�#}N�pi9޶��TD��1ek-[��t�&g�.��&�ə߯�ޘ����I��L@�;=@�D�h�2���jy�~izHf΍Z���u�'���N}�������_z�4���)RY#?ÿ������ة��:���(���k���+��S�>�1ҳ�=�ﺒ�LeTܪHK��pJgQ�ݴ�TURY���nbi�y%<	Ih��ҷT���P�Up��5�/�����h�[�|��v��J�dƤ���TBg�
^��ز
���{En������۪,�αH��u&�9�g���K�X��EȢ������>��3���%�7I����No�������vz;�ML/�f��N�kY��q��5~�_䧨(t]*�[����"�]�c�����������^����saǎ����������W�0�d2�7�b+�N�Pz�ۛO%�u]g`p�t�ƽ9MNN299u��Ec1‘�u��3߱X�3���Q"��g�ƥLMM�H$��M�$���u_����_���T�u419I8 �H��+�f`p�0��:χ�?�؉�ӷ��j������B&�ephMӮ+}2�"�J��"����b�����:CC�d.����8������G2��3��ŧ������=�h��3M�^~�}���=?��1��u�}�y�7/;>>1��o����(����F)�z�e�8��ν����:r4�=�J�ڛo16>~]盦ɶ�v����!�����l�}��_��"��t"K;>ىat�=���N~��K����&����6p��1�/\ ��s�����1��~����u�{���<|�s]]�:�~��b##��?xc�Q,v�*,�[��G���˯�D��~���P�7M�x��Wx�ŗy��9~�$���Jcs�
ғ���Y�>/�==��H���t{>�����Fc��4S]Uũ��$S)j�����#344̱�QU�5�Vq��������X�j%� r��Q&&&X�d1�X�cǏ#
U���]��l6��GH���X��ҒR�4��{�݄�:��i�BA���(--a���_ehx��˗�i[��d*EG�V�XN4�����9u��#��-Zȼ�zR����!��t�b�ϟU>�	]]]��9Cii	K��8v���c��԰|�R����!�N�j�J&'�صg��ô�4�����2��{��#�f��֮�(���9Μ���t2^=�EŜ�����(�q�E��`0��S�N�λ�!�"�E!V._��U�����#=�#=�(����+���0$I��嗀
`i�rmdak�`����T�W��C0<<�/����0�7m$�26>N*�bak+��LLNr��!��������qv��G($���+�a:<�3O=��իQ��N�G���:�d2�"�466PUYɑ��(//�����'NPW[K(4{'�i�tu�����V�\A8�T{���l��gΜ%���xX�l)o��{��cz:��
�ɤ3:|EUY�j%p��I�3ĵkVs��a��%������M��������?��/�,��d�%]I���r��L�`��e����i��IV�Κ��:}��?���
�)).����Zi4M�������?q�4Y�|9E�;kW�
~]�9v�K�,a߁�sצ�����<��㌌Xj�w�����U+W����v���OFYY)��d5���!��8���E,%�Hr��946�L��h�Rӿ��_�9y��޾><n7'O�������h�7�^~�0p8����|���������H*�fz:L"��w���,��������?p��I������㝻���e��������w?�x<�O~�~���066N8����C�_�eA.��L6����D�m����s��2��55�u�x,N2��?����m}D6��׿����D�󌎍q��
<��K,lme�޽4Ο���4���Jt�����,_�4�1醁	����s�0(--!�311AQ(�C�9~�$��s�з��0
�X"F2��'?���7H&SDg��D"A*�bzz�t:�o~�{t� ��c�.���߿��$�v�ѯି�ܹ|�ڽw/�����p05m-5��q"�(�q�F����_<�wm����$�g���O�W����޻����_�����e�����z��p����O��_�����x�ӧ۩������[���R)R�4�d��p�x<�K��I��Ţ

�v�j��'?�͛�:���B��T���i���eyIg2$ə�Mp��/���i&'&���,_��
~���,�a��>ځ��"
"���e�G;�����?���o~�{"�([[�ö��##LLL�b�2J��y��w(/+������1�'&etl���2����
�o?�{����ۃa����,˘�IyiCCäRi�?A<��6�--9z��˖��3O���͹�������J�%�Lr�t;u��$�Izz{����H4�n��"˗-���N�{�i�� G��ٳo?�t�MM�ݷ���~�޻�]{�2::��墷����)TEe�ʕ<x�}����8z�8n��ɩi�~}������{�6`��%%�LvN	G�w� �CC�O��_�G��$�y�&6�[��ӧ���T�&&��%<���;p�#G����!j����ןCE��(�t
Q��z�319Eۢ��zS���|�ʊr<nK�,F�edY��t�i�t��>��eK���w��f�,�j#�����f�,ncӆ�t��w������}��ɦ
ؼi#���+>ض���a���'�L�y�ܜ�Y��ݾ"Ӥ�����R��(�<��y4Mhl����z{{���ɮ�{���d`p���vS]UEEy�e�>z����|��{�^R��?`jj������ᠩ��o<�5b���0+W,c|b�o<�5z��8|�(	G���iV�\AYii�9z�8�ܽ�y���=C�@+�c^}-��,]���
�oJ}�ATU%�����Gc�|A���A@Sc#��5ܿ�^��(�;vM˲��AF��p��><�0o^=F�j��ޛ�|7���O�S\������ko���_{����ơÇٳw?��;:��i"�"��0x��H&���zz�(-)&�J��+�f�=wS_W�	��J�.���q9�������TUU�r�
EAUUN�:��M	�d��gp�ݖj�0�D�L&Co_��5�V���_QYQ����u������
�.�����p߽[xྭ��T��f����������ݼ�
������w��\�y��Gp�]�_������SO<���s�����tv������}��	���a��`�d�~N�>������P0�($�qN�wP\����ѱ1��'X��BqQ�o݊ �(�U����ۍ�Iۢ�,]��f��66�a��%m$Idph�ɩ)DQ �I��k		H�SLLN�J���(�����Σ�*�tM���hMp���|>Nδ�h$2��0D�<3��f�����su]��nPS]�����������ܵ�w��?��?�0FG�f��q{8s�3�o�R�]]癘���r�v�f��_5
Y��F��#��`���E���YBM����Շ�X��>0�J�'�P(�(�
319IQ(t��S�&o��6SS�,]����@8fz:L&�adt������0��u�=m&���;߇?x�V����TWU��_��$I|�ٯ���KEz��������q{��8����믳t�45�*
�d�X<�3O=��n�`dd���–��4+W,���S
�ȃR\\���4l�FOoM

�OLp��<�l���Y�v5�?��#G��1MUUhii�|w
��Ӷh�����٫���x���())A�e�p'N�"��a�:���Y�v
�H�CG��0�55��:�–�ڴ�3g;Y�d	+�/������3t�9C"���y.�y���eAS�o�w�Z_iI1�γ�O���������^��/ �~�Z���x�8��AUE%��a�*+��(������f&''�W_O"�`���D�����������ݻioo�����6n���#8�N�=ƒŋY��ĉS�ع{7�W�B$IB�.�t:|���!��>��,n�N�����n��̢�yk�\Q��n�,n#
�{�^$Qb��V���9|��uu;q��G�r���֖y�AdY�?�����MkK3/����ٻ���399��M��00�'�vs�����j�J���,Y���G�������_��R�ܩ�)����4�eK���h���x]7�'�<��#��~��_κ轢���c�'�019��5��;w28<̓�=���'O�l��z�,]������=�ɪ�	8x�޾~����%
�r�
�I%Y�z5�cc:r�y���?x���|��G�r��i��j)-)a�޽��HSc×*��8��ۙ��&
�r�r�?�(
(�©�v�.^���"�i���m�B�'&i�?�x"���۲���>::ΐL&���㣏?a߁�\N�ke��8ܮ����4�1��޼���^y��z�VJKJ�y�t:��;w�b�rJJ��72Y�f��X�0p�\ֈ�4�d��Ǐ��:�V��]4CD<7����p�\(��$I$S)2�4�'��I���i���(���iҩ4.����!z������k
;�Lr�q^{�M�������p��2����fQUQ�4Qf>_�>X��D"����p��:�lM��z<�t��������o��Ȳ�(���qt��f��{��oF��$I�Q�,�d2���S^Vʷ��
��C9���m�]����bt]G7�K��r����Ҷh�?��e4����gcs+Q�.5MC�$A �N#�"���۵(I����,ik㮍�w:�c�&�I�H�|_�hr���o��%I�o�e��ss��\X�?~6M�T*���ss}Ka��J�UUg]#��7�'j��CU�D"��i��n�w]G��)�5MC��|tRUU��)(�/�4�D�V���@�'Ȓ�(����^O^C�{F]���H.��i���L&C,��r�)������=�n�Y��o����oֱB	��zɥkƂ ���E�����t�r:/�V�Q8����W_��m���9C�V�X���5�(4^�ei����f�~i�K�$i�9s=��3��%w�\���50	G"|��6��'ϯ7�^�
��ɺ5kPUu��,�4M֮YÊ����ϝgcs�Q��
?;f�?\|�u]G|>�e�ʥ���v9W�*lC�md������� �\��{i��K�1����'���Y�Pf~�!�e�{���~a�~��@ 0[��-���z!���r���Jsi�ܩ�Ќ�vc`p���~�^/���2��d8����t���jkjnv�lln)LӤ���`0p��v���]�Bf���UUTWU��l|ʢ�?��ll���@�y��\�}�|ձrmlllll� l�occcccsa~�;[��������A؂�������↭�3��t�כ��s}�p8I��y���$SIB��'<�X���)\n�`I�H$�8g�*���߃��f�5-��"�N�D�׋�`m2�$�Lb�&� ��y�4-�,C��ut]'���:&�^���^�4�F�d��$�x��m5M#�J�v����l�Q� ��q2�,�
Vľ��iFFGh^��v�cc�Hg2H�xM׃a$S)\N�U�e:��4�Y}���!;�4��t�L&��r�\�t:}Y�r��}����𗴶�PZZB&��o���|�cU�����Y��Ƈ�?����'N���Ò�6�����~���8p�U����n��o���j�x�-^x�׬^���ͮ={�p�vV�\A_??������N�<�����-�3���^����'Or�q�����^'������'�v�|�2A��\��p��I��ߏn464 
��0����{�^��;@o_-�ͨ��'�v���_���BqqϿ����6���7:|���N����:��+W"��O��[o�n�ϥò��Sy�7�%���ڢ��ۛ��J��:ފ[q�A���(?���X�ښ���n�ȱc,ik�ُ����!_�S΅��yq��!>ض�Ɔ���G?���6�Z\�u^}�
~���>z��˯�;�
��	\����D�T������455r��Iv��C}]-�L���I��w|�C�����b8<���$�I|>��319I6�!�Jq��9����� j2���_������O~�KJ�@���q6�_��O<����bjj�d2I:�a:ɧM�ӄ�A��)n���"ͼ����?������8�,O�"��	�%@��E#�/�]�}{��Ν��]���ܞ����.ߥ��$JE�D�{�� �K�3#�	��2�T�J�o--��9qL��;���3����@ssS&Wq�y�� �/RZR��=��n��~�[��^*'Up��a:��c�!�М~n &##sttt��ƅ���A����o���Ū�����c��x�ч��4�` ���&��,���1U��������K�```���.��g6���������F�aph��G�9���t�.m��H�D~~J����~�N		񈢈�妤��N]n7����J%)))�����Ӄ�l��l�X0~*2���]�]��f���t��R���Ί��Fe�z=�998|�'O��SOR\T�(�c�'j

�$���^<^�(R\X��:�z�]�a�&M"�S��f���o���ͪ����Jq�?ݟ~��AD��
}�����=�c�$Z��t�1�F�$:�������dc�X�z�ttv"Iy���E:��sӎ_`�ܹ؝N=��Y�`>v�cL�I@�Վ��?o.� PVVʩ3gyo�,^���$<^/�qatW�ZE[[gϟ������btt����Ѩ�tuw�R��x����$�"gϟ�����]k� ( Q]�qZ҂ �����M��h�,[���Ғ����tvw�T�F#�����x�Q�;���&>.�J�^��bI�2&#y�b5���@���A����dd�.1B}CN���vsϺu���2<<LFz:�CCttvr��1���ؾk7v��J��h乧�f��]tuw�V��|7tH]����IJL����ɕ�8x�CG��������%���~���\FFGy��GP*�8t��NRb"e����iee�54PZ\��n����g�z򺲫k.������y)����uk����Yz�b�/]���~����9���>^}�M�z9��$������/>�U��X,ֿ�.U�&������KsKW[�8r�8�I��ٷ���~� �fL�������SQVFOo/���!�����HKM���p�����23����D�GGDŽ�,..��t�6|�%{zz��#*^���Gͥ�lݾs�Qy��ؾsm��dfd�R�)).�Տ���Mo:��DRR"kV��7�B�հ`�����5��'�G|������?���mmL�����o�3���+��m��	ז$���t�^���;w��ۇ ��ހ�j5�}�<|�_��\�������TJKJ��͉�댛f�D�f�1Ʒ��tb<h<^/gΞ���
�/E��=w�����L�����W�u�X��ZA�%��BJ����ʀoY�JF�Z�bՊ���3�A,VS&O段kX�j%�ϧ����^x�‚�~?���?��Oq��|�?�MM��ܳ<��#X-�q�Μ;ORb"?��X�xј�WT8+3����������?b����ۿ���&O�DRb"u

�ڢ�/��<)��,\���֬�jk�
�b#�YYY�Տ~�}��MSKgΝG�Vs���SgN�h��Ϟ�j��?�	�w/m������SO<ΉS����#/7����%4�<��IDAT%�p�|&WU��'
q�ر�����B$����O�l���ۙ3{6%�%���(�Ͽ�.�EEL��⮵k�s���j�͒��Eո�t8(&L�>�����(.*��#/'�P(� �#a=��aG��R]s����hnA�������2���6�EQdά� I2b���J�x`p�ܜRSS�ٿ�+]]]XX�j%���uu,]rDŽepQ�9c:���lߵ��ӧ����Z��}�l��x�Q~��� ��)
��JY�tI�:���Hc�.�}��9c:F����n�$���D^z��X�n��O��K��R*�HO���3L�<yL{;������ܹr%	�	�ع�����v^F�T�z	��TJD1���!�lFTj5*�P*D""^���)Qq{<�x�J�@t��~�� �(2�RR�ٺ}�}}���&#
A��1�}}�����<���D�R��jDŽ�����H��?�\I�h4h5�

P*�lپ���Wq�b5��~�#=�B���Β$�����걾�x}>�� �Z�B�V��j�$	�JE(��"I"Z����4�N��ׂ�R�f,�J�g��KBB8���Y�V�E�T�z	�{�Ⳑ$���>������|���D	 �? ���~P(�I�":�"�^O�^[�V�KK�T^NVf�))�76�ᖭx}>�^��V?��*7�Z
z��F��E�;��蒚ل��c�����G��QVZ”�*N�9ö;�3��h�<x����*L&#J��N�V�A�Rq��wQs�2*�
��ʋ�=��M���|�V��^��NYΨףQk���F~�o��Z�������!~��ߡP(�7gw�\[2�L���D��ϓ�����U|�i3��V��d���i�tz]씿R���շZF�/�A�F�A�d4�Q����dמ=�̛GFz:o��'}�P(Ķ�;9s�I��ܹj%�(���o���	�\.�z�]\.�$Q\TĂys��o�_�k�v;�**8y�����¤���T*�\[ǯ~�;\n7�������*I��š��0�
�ㆨ�����o�v��$���b�RS0�/�&UT���Af͜�����I�����뱼�EE̟;���z~�o�����;W�����?���$q���h4Z���]�N�F�!7'x�ͷx�GY�x����0}�T���0�� ��j�z=����A��������s.�P(�iu<��L��dמ�zV�Z���h2��vZ[ۈ��#--�#ǎq��	�f��l�?N�j�ľ�R*��z��q���l۱�;/f����;���TJ��9x����h4����ֹiu>��G8&.�I�`(����`���].6��ZMrr2:���;v�"��b!!!Qq8���F�~?
�2��p:�$��19E�׋��,	t�O�\.ׄ���k5$Ib�fCEA�l6���&����7qQ��p`6�c5��xHHH��������P*���ؾ�	�|��edd��36�t8��M&$`dd�NGB|<n����_��7���0g�L2320�B!��%	�р�lfxd$�m/�N�%11�����0:��)�7<08�F�%#=���O�磏y쑇HOK#%9�`0HwOOt2c6��hG"ę�8�������a4""����a4�ñ�
���		�V �D"�ϫӑ�h���088��d$%9���i�R�511�)�?@\��ǃ �Flv;�@���ıvIJJ ��3�������b���v{P
��T*��ݗ�§��4v�J�ӣP(p�]c{�	~�����x��\�4N��Q�
�ل�j�aw0<2B��Ljj�擿/����啑��=�D"���Οǒŋ���<|���O�_��?�Rʟ��|7�eyeddn[��;W� =-��r��BL&�����_�㗑��ޡf͘�g�~Vf&Y�����22_	�uUFFFFF�6Bv�222222��㗑��������������m��edddddn#���$�q��,�g�b}�k^#r��I������	��
���t}������&��_��g�ka�?/��Wt������W��oN�����O�>ͦ-[����?^~�X�@0�����565��}��H0�o������f�Б����&��%���>�Q9~������/~��|�K���̭#��^�����c'N��M
�������5�/���q�\��ƛ��D'o���g��W~���S��koNL#�	_��_im�������餯��@ @8fǮݼ�q�6�$�dpp�p8�(��>s�R���f�c��E�Ocs3�P�׋��'��	~�����
��K��}���A$I���T��a�X����8���O���Ά�X�λ��AfϚECSCCCL�>����_�d�b�<�����u�|>���1[��҂��!�p8�D���"�NW4l����f���Ӄ�٫z�`���~c+�W�������mmA �S��ۋ���5</����DI��D"1m�p8��?�O�RQ��f��`��
a��i��$
�r�*�q}>44��m���ɡ����(NH3�OA�yahx��
��}߹�>�$Q]SCդI��MD"2�ӹc�B�;Ψ͆J���gǮݴ\�BrR�=�Q�`�U+V���(�xY�`>��T*�zz8x�0S&O���Zrsr�="b6�ٳo�Ϝ%=-���R.���v��T^��U����f���\�\KzZڄ^j�:�% ˜p�p��mm̟;��K˕+�����n�}�����D"���m���r�]�P)U8N^�-RR���@�Ւ��7!Oo/����bI�g���tvuEW,%����_'u�ٰ�zz{�ﻏ��Nq�y�$�y�I�n7�w�B�V�t�b�͙s]����k
c4�����?��
�����.����sϢV�cy��G8|�6��R�S�?��T�ihjb��]x<*+*�1�ј�J�b����:�����tx<��~�J%
��ֶ6���!�H���Ty�A:ą��$�dzn�jJ��ous�Unz��r���줢���>���N^n��!1�\Ο��>BGg'�MM��/��3Oc�Zini!..��$FGG�s�
�x�Q9Jo_#�#���k��3c�4�FFh��`�;���Q]S���";;��y��3g�|����8q�4ݽ=��R�SG"���SU9)&� I�YY��=ܳn-s��A�T�R*1��8�Nv�ك�f��������:���6e
?|������o����[o�P(X�|96�����P(D8�чbْ;X�8�{���\Ww]����;N~^.��#=~���SVZ�^|���4v��Òŋ�۟�5ӧN�a��.ӦN�G/���b�̹s�\���f�bM5g/�'/7g��HIIf��,�?�'}䶜|�fg�b�Ҙm��7:�\UEIq�>� ���x�ޘ��G��#ǎ���I[[���摖�z����s�����ZCnn.��~�R�D��pך5$''��jP*U�Z<J�����cR��k��T�ޠG�R��h%9)A�ϏrL7:���rsr�c�"$�Z�W�ڰ$X�X,�޻��5�HINF�$
����������*+�D"�z=��*�������nz$"�DLsڠ�s��wc��~��22�5$@�Tb�X����h0�v�j�:j/�
� D�|�/�K��A�'5%��ɇ[�����Z�"^���"IbDd������+0�����%!!aL�7Z��QQ*�$%&b�Z�=s&��l�d4����(�J{����F%n�x�>t:�m��)I'N����'�=� FD$QD����Z�F�R���Q�M�DQ$
�v{&�����aN�<Ż�o�ukY��[��o��r��(Rs�2%�E��F̝��={p��XШ��fgs��I6o�Ƃ��P*�� //�ɕU�s��58�.��؉$A^n.��Y$&Zy�غ};*�
� �P*c>�Q�];>��`0�J�Ē�@JJ2W[[	�#$%%���HWw7��y����">>���8���b�t8�x<����$������MPF滈@t�w���466�P(��4�c'N���q��	�n߁$��Ca6~����T���P(�$	�χ�n#3���'��Z-,�?���BFFGILLD���A�����`���lظ����̜9�`0ȡ�G��2���ʌ������IZZK/f���TUV���������.����N�:��۸{��	6����v�ZMB|<�ĩ�lٶ��^x>&�+
���������{�(/�ĩS1?�D8u��6 �nus�unJ�����ڛo�b�2J���D"������Ob����BA���UTJE������r%��q�b5�?�,>������ӦQXP@�
�Z-]�����088;����6a�`HMM��rq���J�%!����?��s�HMN����E��x��줸��e5I�ؽ�#.VW���Z����V�y�/#� ��~�����1u�d��JIJL������4�~?��#��*�����r�������fw����B�������Q�+�)�
����:J���x<�\���(.,�j��������h����p:]!��� @NN6�`�8s		�HGg'���$��;0�M�$'�?0@G�'ysss�Z,1[ZX���ь�����VK)))4����B�`xx��UU�tZ��0�]]���2<<B|B<���
SYQAo_}}Q?���GwO]�=X��JJ�7[)_V����$I���蒗��?�Ä#tZ-������×^����m����M�b�2��2�p�D"6m�Bnnsf�F�"##�	G"��?��gɢE��:22�I��㿩�~A���R�J;(�Q��3{ք��)��f�|�14͗zAQ*��s׺�Vddd�(��S��.o���|mn�qu�^σ��g���=����##�}D�P�v����22��X�222222��㗑��������������m��edddddn#d�/#####s!;�o���zzz�t���Q��������22222�|%��r��p�ֱ��	�����ط��
�3�\tvu���o��^�����f��p��Xt�P(ą�j�c���N����[�@0��>����e��a���ٲm;�CC�Ba���y�M8�N�����K6~��������w�P(Ėm������.��100�g)O�X�o�/cKo'��\���x<l۱3��8�$I\���?��
o��.m�ץ������::8{�<*�**~�#I�$���!v��5A�8bi�=���0� �B7��6����F8����D$I�]�Z�p8���EE"�nي��b��v{����D"�O�m(��+��rs��q�~�$���!�
��� >�?Vv$!
�zc�lkk�����S�L(�GWw7��˼�a����hni�� �J�ŚK��űr�2֭Y��S�hlj��Ϗ�̷�5��kQ����N�dt4:	�B��^�?�����g�c����W^cdt4f�>m�>�5�
��|ʾ]c����F��v")6�������r�� ��S�bJ���nx��
:�����ӧci��kC!���:������\������w��n�S9iS�Lf��l6N'�>���L�<			<���(�
����c�B������:���ܱh���G������s�
�/]�rmV��G|���m����X����B�����iiiܵv
;v����GQa!��*9y��))
�h������m�	�^|�YLF#]��̟� /��n���ϊ���HOC�Ra��ٶsy�9�Yy8L{G'�`�%����/P*UL�(������~�V�z�q�IKM�b�`�����f�ҧb���h������Z�F�V����Y�p�%%465�T*1�����22�EDQd�����3}�T.��J�"��-�T*<n7))ׇ���獷��xl>��u��p�=}ĥ�:�&�w/��`Ͼ}�����O�T(c�h��%̝=��:��λ�A@�R�̓O0<2ʆ��?~�:::�t�2�=��ȡ�1[jw8x�G�V�ԿD�غ}�@��s��hQ�EbU*�Ϝ=ǥ�Z"b�RծP(d��-��v*��Ys�*Ξ����'��ܽn-��p��:7=�|�\�J�
���G��eelߵ����iS�>}w�YCwOG�����#!!����h5���ii�̛;���T�^/��_�jk+���
E,͂�Q���n�� ��y���v�ً����cxx��{����I^N�wgΞ�d4RR\ĊeK�HO�����l�����x�=��w��
�FCjj*�MML�\�ysزuCC�8]N^}�-4j5�**hn���磶����L�Z]]ܱx!3gLg��9,����X]M_?��)(
�����h��ʊ�mD�9}:���a��c��Vlv;�$������fŲe�����GF�[A�$�ZZHKKe�wr��a��[�B���&����?w흝7܂�|�=6kj�K����{�R^VJOo�b��i���ȃ���‡[>�E�e7�s{G))�<x�}��~�_�ȅ�jΝ�@mm'N��`4\.<1�ʔɓ��t��J��i���Z���"v��Ǩ�U��ee��n�A��E@%�|�Q�O�ܳ��mm=vAX�h!��V7�[��ի�H�D~^�}���.z���T^N||qqq$�Ǔ�����z��Ғ�Ӣ3����JKIKK�٧���p�q�f�.sf��Bu5G��`rU�i�4�>�BGW6����22��1���SR\DqQ!��R*���$;;�^�R��`0`�$��`xd�‚2��0M��H��J��hD��Q��ϼ9s%�Q�
�?@�@?���
F�J>�����N��R� !>���T*����P(���C�V��R��/Q^Z�A��m3dge��C�������STX�I
M4��Ƥ�
�/[�R)�͔�=��j}�dg3}�T,���a�{zILLĒ���v��;6?���n����R����� !>�AOvV6�#�#1[w�z��j���(,�g�̙?y�����s�Z9BwO3�M�a����-MKM�J�h鴴\a��CQ��`�P(B4${tu���l0��$'%b4��n

N�Sqq<���X,	lܴ����[��o��Z�$�K�/���GB|<S��8������'�l&)1��gΰ��A�

�x����=rsr(*(��������� 99944�chx�)UUlڲ�+W��}��r�bi>�1�@���B>ؼ��ӧQR\̬�38x�N����FV,_���'bg�_���l������>������A�.�<��#c*FN�>���h����ArR"�W�⣏?�i{����z)*,��tEDIbpp�+���X����
�y�\�edt���N�|�Qֿ����f͘�V�e���������4A���ؒ����@t��bu
6[T������'N�̓�s��)�~�1Q$
��
GzjZ�|jl6��
*}\����h����HA^��M[�p��M�E�`���A�RU9��Ҕ�*�|�m���X�z5��?L*� ;+����O�3gL'%9������,^��Fs�o÷�(��76#�g��ގAo���Vjj.
9p��>� &�	��xQQ�U�Ca�?N~~>�1?e4�j4�\�Bzz�
�tuw3c���_� �=������h0~ab�����X�`ii���d����l6���EvV&��NOyY)��

�T*�|�f�*A��������:e
�����Z�֬F�X�E0c�4JJJ���8N��ӘRU��j��3�\�M�B�9���l���Z���撘�HnN�)���d3w�,$�f����CJR2^��իV	��� j��ĤD�Z�����̙5����
%��%b4Y0o*����d����&&b2q8L�2��I�HIN����H����a��(�J�7��)X,	����`Ӗ��\����4B����̌	{\22�g$I����x�^���Xs�*��򈏋gʔɔ��
�)-)&/'��A �G�T���)��#++�cs
���q:]����T^NiI1CCCDD���L���HOO'�������q�(���aTJ%J����4rss���l6��i���d���dgg�����Y3HJJ�f�}�W�$=-��2=������j�/��x=^t:sf�frU%:���¤�r"�1V��B^^.�I����A$f��E���18�SY����l�1���ŋ���l�����n�pxt|�CAFm��� -[����/�h$adtKB�W���z���~?u��~����oY0.�,��v���������[��_��Ɍ�j��t�?�9?|��r�g�p8@�����2%I����d4�p�|�_bo�rm-8�_z�،CF�v#��O��/,�?�%������|'	�ô��žDOt%�͕�Wnn�_�T����/���`�-���É������&������a����n���bo�*�����/%�����{֭C�$���/�������,�����@nN�g����|ynj��&����EI"
!(�U׿���۫������cRF����ܨT*T�o�J
A���2��~���_򘔑�f�G�������m��edddddn#d�/#####s!;~�ۈ�H�����g�f�)+]����`���~l��������܎|�#�~���5TM���%�G�FCcJ�����Tn��˴wt�{�^V-_��>�����oc��y����ʌ�������\��E�T�o�uL����NU���A7ӮH$�Ś��6u*�i��͑�����ϙ���B̞9��/4����vΜ=��dd���_���DA����!I�k눏�C�Tb6�n*~ʵk�|��f���lݾ�35�?kV�ټm]�=_��O_�����;w�r��1u��{����*��Y�h!�}}�D��y�G���>|�(|���W��F|>߄v]���V��y���cz�"222^�?��={o��˲u�Μ;�$I_*V���{�T[{��~[���.��Q}����7}
�>N�+�/�֑���#G�:���,v��KVF&�/^�nw0���)�'�磏�BTUU������&55���|�p8Ljj*�P�5w�������n�rs8x��@�F���KJr2�j�(��'!>�I�E�++-����謝N�mm<����:
%
A �p��yl6;E�t�/��QX��Z�f��-t��R)�6u
��}�X����v���BwO7#����v��?���fтL����UF�{���;v���va��9t�(e��df����},Y��+W[9w�		�ܽv-^���{?���l���j>>p���D|>_T"�b5		�ܵf
			�ljnf��_��U+Vȱ�E�N';�쥫���u����"I�ٰ�<n7iii�dgs�q45�W�"33��.r��)���(.*b��H��㮵kY��N�=G}CV���˖���L}C#�`�F��K���b�����~���27��A�1mN����������S�1
���QR\̶���T[��HOO'3#��zR���j4=q���t�F#'N�&����Ś�{z�\W��ٳ������3�#�\Y�J�����/ߌ����Q���E�%�c*NS'Oa��)X�V"���Wcc#U��8t��ΨvwGg'CCô\�ʹ�oh����������;g6�N������UU�L_,v$##ss�M&���(*,d��)T����׋?��St��a�J����"��	�ۍ�d�w�E�Ւ�����J232&�W(�ו���Hzz:e���������"�$���tww�v�jr�����.���ɉ�'��t$''���c6�q��lڲ���.6o�FU�$fΘNvvii���RVZBu�%v��KEY|��f.�������J\.G�c�f���c(�ߩ8w�7ݢ��N|~?�*ʱZ���]� %EE��f:����v�r�INNb���F���>sf�D�R������(J�AP*����23�1}����x<t��[VZBwϗ�*��t������=c{<��%wP���訍iS��V�in��������ǎ3}�T�N��C�1̝=���ΠR��?o.j��޾>&������|<|�U˗���'	�o�VKrrH�EE�T*A�B�@�Pgfݚ՜:s�׋Œ�ն6�:N��V��d"�j!?/���R�Z���g���x��˽N?#�j%5%���b���nu�VD"�^mcƴ�TN� %%I������db��9�T*�{z�j�D"�RSi��@��sǢE��������ˉS����e���������ʢ����s�g6�r�d")1�ܜ�[��87=㯭�.����QXX�N�c���?o�.������1�b�H$�$I
1j���I�G"���^����p��E���%"cE	I�b���/u2���r�
UU��r�*����BfϚS���#33���D���8r�8gΝ�����6u*=��47�0c�T"�0�H�P(D(b��(�������3�Db�tz=��u\�X��f���c�X�?w.kj������lV�X��ލ^�'�������5�hkk���9���8~�6���H��K45���[D�T���˩3g8{�<�=��µg!!>�ܜlJ���{�ZV._FvV6���R�؈���ǖ+W(*,�������p��1JK��j4�Ǟ��"�5����ӧka��}��{~~>F�/Q��~���ż�s���B�P`6�IINf���$$$084�� �E�T��Giq1�Ν�k[�$J�@Iqqqq����ZLF�*�����`0PX�O("�l��ŋ̝3��,��|̘>�3����B}C#��[�F��6j���%̛�N���v�dg�����夡����v�TUQ^V�F������U��Ba�F�\�r����`2Ys�*�������̍	�$'%�����b�����AQa!%�ET�\����M�¢(.*�����W[������
����d2r��)Z�^e��)̟;���F.]�epp���Qrs�)*,����AOvV����-!�YY��l45����Ĕ)��3����".�LQa!V����4�����&1�JEy9��qT�\�����ʢ���^Ϭ�1
45������իQ*�$'%����J���t����3O=�^�����E��~��� ��ޜ:_(����쬬ϕו$�����@fF�gV���D"�\�+w��-����y�c��fdt��AA~~l��j[����3�O�+��q���o�c?Ĵ�S����?22�7�37;�q+s#���!N���}�/����U*JKJ�0ݍފ333?7��u�R��P����N��o�V+�Vk��Jy��L�6�
��>ݮ��������%%Ŕ�|�ο��CF���͎Cy��|��Ogg+�/��>75�]��8�N"�HB|�
Wddddd���aFFF�Z-q���2�]P�TX�VdddddnT*��)��V���22222227Dv�222222��㗑��������������m��edddddn#n�T�����G����zIOOg�����~?u

�\nr��(,(A���V:�:��������@{G�ii���F������(.��ܜ�X9��(J��2IOK�}g���MO_/H�S�%%���v�&�$�z���������L(B�$������AB�
�Mx}^4j
9��$%%�p:��� ?�NG]}N��B�$I�UjDI$
3o�������f�X��{�QF��b``�^Gܧp}��==��dg��}�u�b��q�\������Mzz:z�;�����j���fhx�42Qn*d/��C������RTP@^n.�����9}�y����z�M��{?����@�޾>*+*���ěo����V2����u��7��� ��;w��^g���$Z�=~�1o�N�>��o�g�f������>*��c���ٷ�>����Arss��&������O0g�,A�������Az{���1�o������Jk+gΞ�������{�ٿ����22338t��55��������c/'sgG�jhjb��X0��ed�o�_O$UۼƉS�	CX-�N'��������x��x��W�:eJL��������n2��o���u�����Å�	}�E|���:m?}�,�������eJ����I)�B!^{�Mj��ikoǠ��M���|ِ�7�A >�̞�>b����8u���!r����h�P0D˕��X���˗Eg�j5m��
�|�Q�z=
��ǃ$�Hh4=�>����O��FGml��C�^��ED��~s_�P�p�|��n
�Q�������(Idge��3Oc4�h4��Q��Z-<���X����ttvbIH�����2.TW3c�4z�~\.7v���CRUY��ݻijn�  ����222_��ɡ#G���UlKJ��IIJf��U�a�FzzzX�|96�(}��|>�L�"?/���~�?AwOv�����pp��\.���0cט�ŚA`���\�XMBB<�(r���qf�����Fl6;#��̛;���N��?@~^.iii�:�[���U���x���>�KK�2y�u������ϏZ�������\�;Nvv9�ٜ<}���b���&�$�˵u��Ց��ʢ��bM
Z��ٳf��x�������h42o����ũӧeŲex}^��<�V�a�('O���v���c�BΜ;�w7���Ý+V��91<{Gg'o�����d֮��9�f}nd����8ID�3kV�=���K;q�9�g�d�b4�l~0����3u�dJKJ������HH,�;���W��=~I��=s&	��lx#'N�bݚ��Ǥo%$t:kW�����귿����p8��s�h4��/��]{��t��
�h��˗�P(����HR�E���I�)S&�����m�ػ�c���c�#�{���g��W^k=>��3�
���֫�ۯ���Kj��'��nw�m�N���j��wtx�r�*6��B�R�D!(�J
���ԙ���?�����X�λ7|�����9$IB�R������Ϩ͆J�B�Q���P��(Jt:
�������t���Q����\���w���v���� (���B������;��ճe�vy��xg�{���֭���1<<@$A�R�R��鴈�H(B�ױq�f���Q�Ԩ5j�:-�.\d��Ѩ�|7Tl�r��?��@ @k[;����߿�2{�}L[{;;v�0��FCc#�oڄ�`@��284ğ^}��Ѩ��w����Z�lߎ�`���c�>{�V�R�K��Ǜ��fdd���&6o����6n$"���\a��ݨ�j��h���
%J�2�v��/"�֨�w��t}�M�h4h5Z���8}�]�=�� :r��G�#���;tuw�Ά��kh@����o�������`�޻����R9�b�s`�ҥ�׿�ϔ����W^�Rm-YY��׿�O��=�=�w6��8�A�$	�������3g���C4j
�p�@ �J�B��s�����?i�B��y�x�g�����t� ���-ANv6�<�$/>�,%�EڨT*1M����^"3#���O������h���r]]�Z���I<��Ӽ��3��s
Y�KF�k�R��=s+�/C��b6�(,�g���L�m���d��;�HOGP(�={�?�F����32:������0�M�HhW�R�ahh���d9��h@�Tq��Q�:��L��b`p��ARR"e%�TVT�x�BtZ-
���� v��RAA~��ӦL!';���L�\i��Ϡ�0�����֮�����G�R�_@����#/7������\�� /������щ$��>£=Hk[;CCC�n�jJK�q�]L��� /�������S[_O_?����!�R���s�
�̞��f�����h����^W��̌XۧO���q�Z-.���Ao_�AP����ܜl���X0o.��cNQ��K���������M �����<�q�K��$�D(.*����D��+��Q�/I��H �ȱ��u:R��Q�U�A�[Z���%99�D����DdlI^E"�0��y,^��_��w,�7���\��2y��wX�|YLVs�R�$IX���A@Qǖ�%$Ibxd����
%.��p8L0D�R�0��$IMF�\�������ᠾ�����\LF#�Μa�y DW�/Ař��Ì���0����-#�u#b�V\׀R�������
�@���&��ʐ$	��IWwn��9�fq��Uj._B����������6����5����ĺ5�inn�d4QR\��7��c��v�ٴyO?�8z��ٳf�����o�EA^Z���~:����l��cz�~�;:E�JI[{;}��
z�n��8Ɇ�?����Q�(�p�0M��aD1�E�����db��]����G��~���;"��]]��F�	KB�

���c2�HLJ��R]]	1�L��&"=Q�{IE	�Z��a�����DJr2S�*����d2"�Ҙ}�N�$ID�T�hll�rR���(
��RQ�U�����~�^�$v�ً��fƌ��!(F�^���]�dfd�V���h�D""�HA��኏�#7'�;W�@�T����V�%�j�oc2y��n��rӇ�Fmv�
Ř�qT�����v�).*��rQR\��'9z�m�ܱh�ͥ���={�QW߀Œ�C܏�h�����R|>?�����撕����%%%��3f0�����!Ξ;Ok[;ӧMc��h���HJL$/7'V�޾>�23��t�\�̕+Wihj�b����O��+\��C�TR��� �C!����4��1�'CC�
������KFF:�������j����KBB���J�ۅ��crU�8����lzz{���$5%���~*+*HIN�Bu5j����r��/ג��CͥK��\�����S��b�RL&����D����b�l�΅��T[��墴����Nzz{�X,̘>��T��GY�jI�I����f�*J%;v����\-\@jJ

��8N&UT���Okkz����'STX��ZE(f����̟KfF�m�`B}RRRP(&�����kV��ٔU˗x�����N�����12:‘c�q8̞9����ӧ�9�w�]��f����&144LB|UUUt�t���δ)SHNN�̹󴶵��@JJ
�6S&O��!�"3�O��p�����\.���Ѿ���7�|�
�99dd�s���J����(��762:j#==�Ӧ��ҌR��l6���Le�$�z=�H���~��Jq�=��ijnf����q����G^N.'N�����X�Y�d	���m�/{�����l?}îic�� D�PĖR$I"��h���5���?���D�s���w|�\�w� �X�(�	�N�.�F�%	a�w���6����<���15~�'+p?��_2}�-\8��~0�٥�A�J%�@Q�i��٩O�
�����m��Stu1�N����]��c���WA��,�2�~ި-�$�G�I�jU���׾�
�v�km
�BD""�:�	��v$���$�TF��?}�>��?vX[;�=K%��hn�GA ���j�g-�+�*T*�H�ߏV��^-����>�ܧ���j�K�V��t?�h����Y��Q��z�>}��~����?Q����|y>k�}z+�J"�*���p��z���#���Vs��>���M�h���隳���_s��W�/S�k��Fi���߲:ݍl�i�Yv�Z��/��/��ᄄ
J�2�F�P��=�ח[�>"X.##�C�P��O`2ݾ�]F泐����AHOO��Ր��N"�ꗑ��������������m��edddddn#d�/#####s!;��I$��X��b�������
���E��Q���;0�ۍ�f�6](
��٢Jv7H�r�o��P(����7"��t�pN��������~�v���_S�䢹���v�B�76r����D>dddn���!|#�
�Ì�l��`_�χ�nG��Q�%I�f����L���p8E��v������`�id�|%����ϯ���8�_I�ذ��ݸ����+8�N�}��ںz~�������>�O���r�GFط��o�HD�_���o#���?x������-1'?::����͡�G�b�p��&��n�С#��c��7�����k�WF���>�Bu���565300D�Ś�/�̇�����x݋�W�j[�N��ŭ����ȡ�G������edtt���{���8�"oo؀���w|���O�e$�7�?��˯�Fͥ���?���{���;�W������Z��Z���H[{;z�������$'%���N� 99)�n��y�<}I�Ղ^�gdd��Ǐ308@�IX���wt0g�L.��bw8��t
z)-)!#=��ϕ+W	�Cb2)-)F�R108HgW	����388Do_N����T\.�(2erz�����YY�H`W[[q:]��䐜�DCC#j����b�����;{V,r�����-[8�"]�����)/+%#=���O����×^����?�9==�d����gBF�;I�N���EQaf��W�x����\����~ֿ�?p?3�O���b�����h��t������������������j� �"���U�Xp8�����V�).*")11ɳ�����A�R���̠����Q|>�i�g61���JIGgׄ�E��X-�~?-c6����8���������T)���")1��+X�����ի���w}��^���b�$PQV����.��GGgN���@iI	}��'O��c�2ur�p����j5�*ʑ�+W��v�E��I�\���w����c֌X��⏌���j1�L~?j�Q�Ј$������|�����E�"��r��d�$%&b4�x<1�����G�).,"--��ISs�$QVVz��>sӎ_Ej.]��r&/7������o�z�*���q�\�ۿ�{֭��W_����E
�0����r��)DI�؉�L��"�02:ʹ�8w�����Ԅ�`@�����%�f��r]�))�M&�;���Oؼe+��5�;��ٺc*���۶�h�244���������y��~=������VFm6�*'��o���M?=}}�z�9{�<V���{���s��r�*���t:�;:���TVT�*����1M�L�X|��t.��Ėw���x��
o�� #�%"F���A�V��<t�}


����e``���a)*,��[�y��z>�x?/>�,[�o��r����7�½\[˛��a���>{���I8N���y�x�7P(��|^rssIIN�����ͮ={IMI�����]{��D�h��I�̙\����g�%"Fزu;�����}7�dޜ9���E%c5�=Ƌ�?w�c�\[�o���Sq�����q��w�~��,�c1+�-�7����{G'�����)1��;�]i�����{̝=�ںz�^����ilj� /�-۷��x�tvu1er���_3s�t::;i��D�R1<<LCc�%%�9~�Ӊ�d��}}}����322��g��(�w�lv����QΜ=���;&������))�=v����[��?0HA~V��s�7��?4<���ӧNe��i�\���3g�$$P5iy�9�{z���'))���|�Ӧq������XA#"b���]k���D_?W��R[WOyY)����D{�a*��(/+����pp�j+���<������� ����>{���_����s����2m�d�x��X�z53�O����3��]����t:9~�$w�]�_��G�zN�=���!>>���F��ҰX,��JMI�g��{���Crs��԰"445Qs�2�>����[�<��|gQ��,[��'{$	�%�I�*�s�
�/]��9s(,,䙧�$?/�p8̢��?~�"���O{G�=�4��{		�7��D~~O?�8��,�;�իV���Km}v�����<�$��� s��1�O�ʏ���&q��Q���;qϺ�$����C���I[G;G�gڴ��#�HOo/u
�ش����[>�������y��{hmk���3$�����ʼnS�HIN&�ϝ?Ozz?��Kܽn-m��7hW?E<�ԓT��222��9��TQ��O=I ���3�BA\.7��	��$&Em��;������).*�'� /7����瓒|�H\ZZ*�h�ZZ�\��t�7�(/+� ?��֮%/7gl�=�q��IFFG	������������$$���x��o������7OzZqqq�޳�M[��3OS}����t�\����4��j�n7��,_��B����tE��7��r3000��䣣���w,��`A@T*ULMOPh�
���!"��0�L
1<<���&��"�((�j�1!�ф�劥7��(�JT*:����!����x�����q�݄#.��2k�T*����f�e�%dgeb4c����<C\�Ѱ|�R�23n�� #�F"��&J��hL�Q�
�ۍR� 144�^�C �����ę�D""�(��1���6[� �F�A�X9*�*Z����`(���#�#�՚�
F###����2j�g6������&ԣR*�����Ȅ�#��h��	6ME�:�
�P(Q(���Qkزm;?x?�Μe�ޏx�٧�˫V���l�tZ-����h4
A�@5&�R����  `2�(*,$3#�����YvT�G�R
��P(���0F��P(;��f�u"G�$�k�)).�����x<x<B�0v��HV$*�#I����_��8�N�j5�i�̝5��s撕�IRb�y����F��0��{ϭ~��Un��B!.��PU9	�RI���ԩS��0�����DQ���j���1��$Z�(�J���j�dee�Q�U�ص�ш�h�d2�k�^�� sf����Ѩ���bph�$��BA|\&�	�RI��Jzz:w�\���{A����QTT����?w.��?��*�������HLLD�Rb4��7w�����w/'O�A��s��w�s��_�HnN��̦����;w��Ԅ�ngRy9�$�u��L��E��|�V8����ҒbA`hh�ֶ6D�R����ݰ$$`0P��]�Vˬ��ٶs'	��,^����B�۸��y�P8��m;8}�,�99�\�A!�0��uz<non}�Ӆ$I�������jA�+Z]t��j�P^Z����y��

�{�Z��秧�����d��m��W�����ukپsz��FC�ՊB� >>�8��Y3f��c��Z����n 5%e�M[�`)I����p����S�:S�N��p0}�4�v����2焼��E,]���+W��_����b�Z���3Ʒk
A`dd�3���t��a4y{�{<��c�Z����������3���%i���������(,,d��x��Ǹ\[���磊yZ�<� �7��LMM������RR������3�:}���$���INv6V��RI�5�Z��b�j������={�s�
�8���G��H'';����ގ�h$';�V?��:†�7H˖.#9)���~�|��9�g��]>
�Bѽ{�I�%c2�ШՄ�a�z=gϟ'�p�|$I��t"FDtzj��`0���G�$�.^$0�>����Aco�~�?&i�v�%	��DsKo�����_��N���A3�V{�s �F���C���E�NG(���O��N\\�<���P(��hD�R�x��������Z��B���A�V_'�	����o�������,�� �P���/#s;��Q��+p>��ج��v�V�0|~?:����_QR\̼�sHINF�VG?E�E?����Z\c�4j5Z��HDD�ӎMR�����%Qq��(�
L&���rm-�7?%
��g��l�J�F!A�z�@�R�F����+���_�r�2�̞5��At/�F��&]
�	��z��0�P�^�(I8��U�՘M&�6��NO||�$MhW(&<v��݌��z�>B�P�^v��P(L\|�:f���0�p$�f�ߏ�h$L��L��f�=�?4<�R��h0��j���h5��ׇ�jA�T"IR��j	�����HL���p�\�L&#^��Á�h$!!�{#��ikk��g�� ��r����u:��}�gjQ�@ҧ�K�9�Y3fL؛I��8}���+�t��� �;��Ct�����I��t:�j	�ε7�?�A���Xz��N[[;wߵ�:]o����+Q(L�2%�{��<�]��̌�X�7zA�����xm��c?>..��V�s�$ ))16� �5�i�ygk�M,��R}t�-3
q�����lA@��N�Q��џ��O����IZZ�����'�Z�B=V��vY)7̫��b8���k��&�M=�I�>}X�ڽ_�k��Z?~�����OKM����4�� �ʽ����1��,_�l6_׿�ʇ~����c4|9��ft���j9��^ϜY�&<�_�`(DJr2S&W����HO�0x���BA~��
SF滈 $Z��ə� �i"�:��9�fNp�_�@ @qa�EE�}��J���},8ާ�`(Ȩm�}�]F�P0����V�����ou�dddnA�TQ�g/G��|c6"=-�&22_�4�������m��edddddn#d�/#####s!;~�����������F��H$B$���y$AŘ��J����M�h�GA �tFU��E�
�B�T��l�O�"�H4L�ؿEQ��� (�t�X�] �Bר�j�D�B�$I�]'"�"��U��G�$��`,ܰV��}Z#�b�}ca<�~4��B�$� x�n�P,���bdt���l92�������71�$I"���7�����
�P*�������T*ox�$I" (��q��>q���=~���_��,Y�ü�:r��������e�8y�����gi��dRE9����k=~���Α���Ao��BzZ���O�6�N��3g9x�0S&Of`p�7׿Þ}�8y�4CCC�G�4�>����.\���fg�c�n����];q���I����V���'Μ;DZ'���N'������ĩS�RTX�Z����W����KBo��-۷��͜<}����;:h�r��Ӧ"u

l߹+�_FF櫱e�v��O����@uшm==����/4��CC���mJ��&�:��p������������ 1ъu�x׭�f���(��݃V��L�52:���RAy��/^������1g�T�DQdマy�&�_���lb��mdff�V�{��.���ؾs'��ECS{?����,|>�=���nv��K�0{�L'���N��۟�5.��D��H$LOO/~���Ιs�ؾsO>�(�����>���&			���z�1�чoj�d�^��h4�� � 288K�P�U��ܳz}LO;
�zx�'1��ש(/cre%�Μ������)���޻�b�f�7��W��rRǎ����w�*����"`2227ASK.��REFz:����\O^n˗-������n��'����p8���#�4�3���Ck[;�m��i�_C�$�GFp:�X�V,		���Q)U����A�q����F�"�H__=��4_����+#
a��c��f����8�qq��vT*���k��~z��Q(232P(
�DHNN�������DvV&
����λ<��������#
E�S�axd�Ϗ��#++���o��ܹr9ӧM�.�j d˶�8�.}�A��R���
��G?�(������f``�p$BOo/�@0������v1�<,		���	i��p:����b2��� Ii��"�ܴ�W
�ϝC_?gΝ���̞9����B@��P����'��̚1�BA~n.ǎ�`��,�?���8�.
E�I��*�\����z.�֢T*Q*�tuu����s�<��lfhh��Hp,.�5����4X-V�-�#��QFFG9x�Z��ys�7�S@@����mD�b��0<2��>ʅ�Y��N���0
�M&��RIKME��QWW���F�;;e�/#�
�D8}�,�]]��Ȫ+��|�ֶ6CC��74��֭<���||��CC AzzO=��?�G}}=����q�p������6`IH����eK�`�����B�V���J.�����m;vb��c[��68��hdpp膎������_����B������/�	�R�O~�C����,Z��9�gM���x�w�� %%��g3g�q��A���<f͚ɿ��7�dg308IJ%w02:J}C#n��S�?No_G�E�PRXX��+W��_��ل�㡸���,���	���uz�͝3�.6��s�/`��I���G�Z���r�
��|���"??���X_*
�ߺ�+W��饾����~��kCc�6oF5�Q��c��g�G��7���ĝ+WPR\|��o��~5E���4V._��'‘K/@�O�T���O�|�RN�:�?��/���f��i���?��r�o��
{>��a'I��9�^���[�144� 1��^O[[?����\�\;�n:��Ɍ^���<�0q�M�Ta21�ͨU�D8�N�����'Ys�*���9w��6�P����1�Ȩ&������Z����h2~�0�222QT*w�Zɏ^z�ÉՒ��)S�����]��ŋRVZʏ^z���N'����o�W��A.VW����3Oa�Xƍ�Op{<�\n


���D˕+������%��t����> :�&B34<�Ç�k���? /7'f#�='���'���'��`���5
������;�@������~��?�g�&
q�qy�A~�ҋ\�������ȳO?Œŋhljb�ܹ������DZZ*���A�Ta2�8p��A��﹇G~���&M������|�Y�f^W�Ԕ�͙â�y����龹�snN��#''�����(�8���O���())�ч$+33��/D"a�<���$%9�Sg���Ӎ�]M�6u�uz�_m�Y�X8>�@�I�=�	("���~f͘NyY)��O?���U,		�$'��c����8u�s��D����R��Ȏ]��2erL6���,Z����<����c�o�ӦM�{��n�#��H���dV�\��h��F.J��I����ɉH�P]��baxd�8��Sg�P9����)�5���p��9v��}�ﳌ�_<�Ep�Z
*�*:�%il�]��E�B�qL}��Q*�#Q�2I��\[�S(TV��O���446�x�4j
			���B� F�I�A�����	��(�c6eL#>
E��C%�Z
�qq��ő�����PYQ���e��-L�<KB�`p��@ �(��T���-F�Q�VkP(DQD�VG' c{�j�*�f1=(g2),(����AD3�±W�R�$F�D"��T*���{�_I�8x�0M�-deeF�(
�	�È�`0�$IQ����j��B��@ @0�h4�����<LFz:�?����x<n�\��V?��*7�ʨ,�N�㮵k���G�V�P(�h4x<^�~w�#�h�:r����T��c�ٹ{7			D"�k�1yJ��@�RE�*5��^.\�&������O>�n���s��$s�
?yV�T(�	uU��(�
�J%5�.��j������{��o�J�d���,_�tl�H@�ѠT~r��������IMM�܅l޲m��:Z�0��=�� �V�����|=�*u̡��jTJ%%�E�?p�^Ϭ3IHH�7��ч"
�k�^.Tנ��X�j%�`��x��f3�@��Ɇ�7�rG������̤��%|~F����2Ξ?��2v�	���,v��?�L[[w,ZD$�b����˦�[HKMextQ���tvu��IIL�:�F���B�`��=~�g�z��ɑc'x�駑$�w�{�����t�V,'�l���[rsrXs�*f͜��w6�R))).�����K�@�2j[-		�ƛ<��,[r'N�F�R�;�Ցz심kvK�ב����?!���G�G!D��+WPTXȉS�����<x�}��~y8��r24�A�Ӓ��ȡ�G8~�$j��M���W?�!K��3!�I�
4
[��d��>r���v�	�=v���A�
�m��*lx��l�2����T��I $9))�;�ߏ�� >>�͎5����CC�h4��21�L8z��	���$����$Ib�Xp��(U�����Aı�v����!AAjjJL`xdո�=�qmﭷ�/��ݘj��� !I�h����� ��a��HNJ�IJ��~l6i����IIMA�T208�%!�Nw]�x�^�.�cח���j
��xph�D�I����l2������(6��Ԕ~��ߑ��Œ��(��'>>��G{GG�@]Rb"�==��a��d2b2����F�$��rI����tr�j+��u\����_����!FFF�X�Z�c���d��������� ^ot�R�Ր�����!%9�R�?`xx����H���R��Kgg^�'y����ttv�Vkbj��}}D"222���FGIIN�����HIN��OFz��ήn�N'))ɤ$'34<��;�������`���$"�����D&-5���8�{zP(�df�c��|>-W��RF�zBB�]��L&
v����B4
�H���!,��ф���f�QXP��?JH�'==���!��I�O /7'f��	�ô��MX��� �=n�\�r�_FFF�N$���p�<�,^��]���ܳ����|~G�;Ǘu�rT	��� �p����o�̌t�/]*���F~zedd�w(�ot����t������"�edddddn#d�/#####s!;~�����������FȎ_FFFFF�6Bv��.���B�ЗJ-�����_:�.�_2�������WA�������|��/��lw8��f�2207U��R}�2�jk����rm-~��}DBB<�c*y�ط�dgg��\��8��'�={n���n&s�DI��rq��Q����ZP�T|��f 3#�m;v�y�6Ν;OrrI����I222�������M~^����#�N��{-�����gΒ��·[�����!�$�����ަ�t��8s���'e\��Q���7�N�`(Ȩm�����ihl>Q��B��a�n76��p8��|�A�~?�N����p�r-�`�˅�����j�;:���k���?���
�x�>�@���˂k�fgL"XE\c׿vM�'B�ڿ� n���FdL�[EN'n��`0k�������0�@��;֮�̚9�FC[[;�MM��ա`�6o፷��򫯲m�N�j53�O�؉���a�ϝ�_|��D+{�}����L�������l��3I��x<�����|8]�����X�����l1��p8��~D1*��r��x<���۷Oc�8
��|��l1;�ʽ�������_h'?Mgg'������a��"�N��Á(����D0���e�w,��fc㇛II�����}�\�$�����p��Ǎ��ۉ�����%��
ٴe+3�M����W_���I�>{��AvVK�X�[o��N�c��y�$'�D�T^NSs3H��щZ�����m|�� 3gNgݚ5���!���~��E�R
����{I�Zy�wp�\dgeq���lݾ���Q�rr��u�ڳ��+WIL���C���Ǯ={G"̙5���g��$	����3�{�Zv����:��))I<��Cl޺-��G�ȱ�9w����͙��墨���55T��a{Ӽ���70�BP��jVE������������pb��)(���]��|E���n����@0HeEw�[��m[iljF�$�azz{9y�F��{�Z���eǮݨ�J�,^LyY)������|���j5j�����e�vDQ��t�^������N!(��s)I�����t̛;��w,� Ƶ}�n4j5�֬���'%%���vz��0<�����&"�0EEE<x߽b�(�ϵ�kW���э!I���cx<�y�	JKJn��J�54�u�|>3gL'>.>�D��Ưf��XMcS3
��p8L��S����a��x���s��ws��IN��>'w�[G�
䏿�ܴwq�ݴ��2k�LRSR8v���M
QR\̊�K�3k�Ο���������1c�t.�֑��M\|t�JB����Ӊ�d��G��'����c���P[WOnn6j����!��.�M�ʦ-[P�U�Z��9�fq��y.\����Z�KK�=k&�P���/�����Q(�lڲ�FKZj*[��f�322��K�k�Μ=��:�]����<͂ys���\�\K�
�/]�Bu
v����x�x��*%�V+n��ֶ6&WV�LI�(��c���ܽn-�Y�m�^�J�����Dx{�{�zV�\��Z�����H�DWW��E<��#�;�ƦFfϜɪ��	�<r�s.�R��c�"��������p�h�B��r9u�57�)�����;��׳m�NfN��3O>A8�����۷����R�d���x|^lv;w�[��3�S������ٳ\��Jcs3E�,Z��%������j���Y0��/�q`���dww����ٳ(.*�٧�$//�Vߺ�LVf&�W�������3j�!(nl�&M������Ʉ���?���>�GF(-.f߁���S]S{N���n�f�ܴ�o�rA������3�308��;�RUŨ�Ǝݻ�@�$�� f��ܜ��nhjbڔ)(��A! D�#M&Y��@__?�c��j�dgQ9���C}C#��쉖IKM�g�����
����s��a�z�]Z��p�\�|>
��E�VG��R�IINA�R22<�R�$5%����J%6��P(�m�ƌi��HORRRHOK��J�\A�PPXX���h4,^����|�'}��s�T*�����R�"g6s���`�:(##�T�HKM������8:�����T_�DD� FD�|�Q���x����74���U,�7���wp�	FFl��׏��D+�������v{p8���`�X1�x���6��E����i��*���dg6_W粲R4
�o�DzZ::���vq�j+ ���tdggc�Zn�/}����u]�f�	�FCbb"ڿ`�ٚ�K�;p`l�&2���ސ$��;Z��FC\����$I��Fm6��n���摘h���I}ín��M-���H������HOK#5%���O�����%&��R��o��}[��"���O `�f��	�x�>>>pQ��Z-Q�J����|���ؽ����TU��)((@�V���G  >.���nijn�l2�t:Q
f͘Ak[�qq$%&�V�� 			Q)--��ѣ���k�\nB� S&W��҂zl0%$$ I������R�����rmE���zΞ;����/�P(��̜П�#�(�J�C!�G�o�� #�D �7��)���	����s��I����h4�A.�֡RE_����A���j4j���Y�p�j/_7�%I@��6 .����J6���%����FU�$DQ�j����� D�Hbt�뎝��>$$�23Y�l��*��^�����Co_999�TQ�xQ��D�H$Ž]��D������
l�϶�������n�;VnvV��*Q�U������dee~��E���.B���J��d���gϝ'��k��y�I��� E�8fϣR�"��林��˅�` �lF!(�X[{NFm�[��o��:��9~�$��'99A���a��������dB�V�p�|*��(.*"=-��55�z�̚�^�'5%�8�9�7?���BL&I���]}'m���:fϚŨ�ΉS���ʢ����K������h��U�_06�5��s�9ٸ=nB�0K�,��r����zB�0餥����IvVfte";�‚|*'U FDlv;����+:�+M�?"�C"�����ދ�DR�lu9UW�Դ۞�>=��{v��݇>��;�;;g��L�N�tה���T�$%J��Do���	�>3"�!�@�ɒ���s$�ț7.2㋸q��(?��)*,$
�LI!7'���
����f3��2??�lY�Iu:��̼�j^UU������{�%	��HnvvrP� 	��q��et:y�����aJKJ���&	����m[�`2���l߶���w���s���p������˿�\X��.g�gPTP���ZRRR�D��p��ٳk'�,���CFFyyd/%��8vR��x�233���acm
5UUx=���8�];w�����R�rr0,.��X̤���v�-_�=ϓ�N'�����ҒR�V+�99X,߿�$����,ˤ�\�ۻ���*<�lڴ��BAA>��^GNv6yyy���R\\DIqQb|ٞ=dgeG�z<dgg�œ��=�va����t����w�{G;x� ^���6�����nG��=P���,c�X�|wO������'R����Eahx�x\����III�/��O��}]�=|y�4?��?��|c�~��ox��!6��}��$�wqE�_���}������MOO37����'�<ſ���\��nA��x<N__ߚ�9$I�����`]��,�|���s)W���l�TU�������*�m����xx핗��}�E���?�v����Y,���x�/�H�$*��q��=��MMOs��Y��8?������w·����Ŕ:Axz���$Ib������?-ė]q�/<��'[A�""�� �SD~AAx���/� O�1EQZ3�Z4Mcxd�E��6KA�� ��S���?�{�oޤ���*�����x����.?22�W�/�(
#����7�an)E���:�F<����+W���QU5�m+�s����#�������umAxZ��q�]�����}�����������=�b��MZ��.]�J(�ԙ���ͭz������y���edt��g��QFHx��?00ȗg��w�ˉi������f�k?ˁ��&�D>mEQ�Ǖ;޻��_ޗ�iĖru�u\'O�����p8���[Q�;�D�X�޾>N�;��(|u����{����s�q‘��~���s�������o����blݼ���,.��z=<�?v��w� ��\���&����4MK�[����?[��u�REQ��b||�C��w�֭��Dh�u��7���E!~[���g�^��u[)�~����cۃ��M���p��P���N3�b��@ �o��~?�����Op�ԩd��>˹�F5������|��8��͛(*,�Ï?����k7n07?Oue%;�o��#���۶n%+3�p8������B�4������G?������~JK�9~�$��"�2��㟑��McS��hhl����h4Fj�UU���OBvV�����^

���q��5���[�|nNǿ8Ak[;��~�>9v���Q�23��7~���'�<��׈F�|���\�QGII1?��Ohim���s�
^{�%

8��W\�Q��8v���O{g��tY�iko'5�I~^��XC����o���*4
J���t��m�f
���h0 ?��a�����?������^��~�F���_���cLMO�y�F�v?9��o+��_������n�BMu��1TUT�p��
�\�JJJ
���
���f����p���54�r���~��n�觟���COo/;�o��Xf����Q^�U$$��8AiI1�/\$����gp������H�ċ�QUY�jg�}ŭ�fz.W*YYY<�g79Bue%�m��G���~��`X��KW��ё#����Ə��\�������h,�&���t:y�Z+W�]���M�55��f*6l@��!�2�����Z<�M�͜�p�^�믾BaA�z�#��w��H���N�o݊�����<��jn��k��n���S�hnm�����55������BVf&&����f6�֐��IcS�X���	�;;���bxd�W_~�@ ���׉�bttv������!U���v�[�l�l2q���p�5��\�t���7B�K����446�G����=��ڬl�X��{i�y�3g���S[]�_���D�6n���ʕk������߿Ozz:�X�?����>�8y�W�Ё�FY\\$;;M�hlj�bCy2)��i�u:v����ݻ���BUL&V����D��s��s��U~��'SP
“L�4������`��Zn��s�������p���_��`��R>>r���!:���.��ȗ�O3;7�O�cjk�q��ޱt�:}��_�狓'9��s~���Aϳ���o�^N�=KK[=JYY}���<}����ZZ����۶�y,6���I�]�Ask+�##dge��3�����cttu�����ݻ��ʼc}���ͱ�3���@�4ZZۙ�������ɋr��yf|�;�QZRL^n.�?�������!����m2;;�$�����������LNO!�Ȟ�ʼn�+>�����'�Ec�޵�.'�$x�;���>b�p�\���7���/���@��q��@#
q��زe36����6��:����4�lތ��C^Z)O'��ɉ+�̌*6����C8����X,Ά�rZ[���Φ����KA~>�c�LNM24<�������ܘL&d�|�����Q]U	$�����	�B������ܹgJ
�ȉ���GIQ����1&&&q8H�DVf&�cc�lV6���Ǹ�܄$IȲ��`���1<<�ˇ'�V�$�{v?%�����Q]U�N�CUU�Fcr%�ѱq^�U��/� |4�h4RX��;--���������0>����>�����t����d�ƍ���� ��\4��0]���l6�m݂,˄Ba�FE����&�gu
3����ݍ�d�a�384D^n.�%%46513}�cM��Ȟ]�8��yt:{��bnn�/��	Ģ1rr�ٲe�9JdYG�R&Ѻ�z$)��`�N�`4RR\Lyy��h$z�6�n7)))��?�btt���n�H4��V��;�N�v;y�98S�H$^�Ǖ՟��L,?�������8o���x<��>�G���i�jn� ?�ݎ�b�a�s��Ϳ�?h��`l|�C��ى�j(�����	(/-��'���l6B�_��@GW�h4�<FUե|�*���)���ᰣ�*����f��Rٲ�n�S]Y��f�0����$YZ�|yY)o��'O�"##EQhji�W_�`�����"~�	�(+)ES55q\J<�������l����x<�lV�}v�O�}���4�pgJ
.W*#c�����j!7'���q�8q�7~��v["u(y�v��2���4
I�p�R��#����(���j���|F���ܰ�P0��͛1�M��i(��z��8{�<z��̌Lzz{V�����x�2�=F^^.�����<')�B~n.y�����)�/`rj�߽�{N�:M{{n��޾>Z�ړ������ky��	G"��_�s>��#EaCy�#É|$E.�IDATcU�4����~�.�C����x<����+5�+׮s�y���Ac��J�c���\��K<����TUb�Z8}�&S��q��*���V���d�`���"���\��b�ZaE�PT�NG$�F}=y��D"�nތ�l�n�166���j�����j����)�?{�o����Y���#s���ۻ���,dY��qSXP��-��x<�����X[CUE%�^��DU���$	��FQQ)v���^22��X[KA~i�4
��u2�T'�
7yf�������$3=��DIQ1��to:[6m�j����K<�����~��7m�f���ՍN�Q[S�A�gtl���2vl�ʖ͛bf�G^^����呖���h�����
�355Ez���2�23����l6�����2f|>��������(q����������p��
tvw�G����``l|�t�W�	�*:���|RSww�E�z�s�زi#�X���~�ĝ��j�`��y�壉��)))�TW�箚v��IIq1]==�B!�
q8�a�ٰZ�l޴�ܜzz�X\�CņrlV+=��x�^�lڄ��ttv	df������墦���
�=LLN��*[�l����̌L
�󘛟��� �?��L!/'��l22��z<��/05=CeE5�U�R]���tb2�(,,������I���RVZBaA>��Y232p���+N�$	���6ٲy3��~������(/+�d2!I&���^�Ғb<�ss�p�:������`h���N6��q��Z��}��
UU���[sТ$IDcQ|������#==���-OMM�$�}�.UU!=����׿}��/������*�ʛo����������AS�Iv>�^�����އO�+�2�����MFz:������h4�DZK���8�)�]uA�	 ��>b�,���u��%IZsT� � <�d� � <ED�A����� ���_A�""�� �SD��H$����}����"5� ���P�?�r���@�S�008�P��F��jnfxd���i0�D"t���3]�J�������.]ⓣ��4���E��[V-֠�*m����033Û�}��G$����ۜ�x�p8���!������G ������׿�UbLMOӹ��{sK�����i��p�k�o0;;Gsk����&���x�w�F��>EQx����z��
�;�r�
�ZZ}k}�gf��o~���}�W�O����[�A�~u�Ғ$I��f#G>�4�K{~~��?�����ϯ~�n����x0\�z-�=U�n����c\�vm��C���(�y�w
��w�+W�q�ԩ���=��ȱO���Z�ܗ��V���E����}����D"���[\�~��ׯ������}oU�k��OKk�\�q���*2328w��^�MM��/�����
��‘��dd�37;GvV&~�	�H"O���<�o#����������k��X,^}�%\�T�[[���%;;I��|�*�cc�A�m�BUegΝ���w���^|�m��G"�8�%��Y�x���L���)**����1??ϗg�
�xv�3����ʼn�w-�9u�,333<�w/V��Sg΢(
{�즪��k7���DnN���ͷ��b��ҋ/P��ϥ�Wh�蠴�����%�q��z�����LUe%����6��%�	57SSU�\�_�utuws��5z��رm:����*N�9��⡃D�Q�;;��������###C!rsr�����p��+/#K�Ϟcl|��[6�����.\`xx��/��^���W���罹���2N�����E6�԰w�n�;:�|�*6��g���������yf��n�LSs�o��r��L&.]�Bsk+m���TW�q,\�t����
._�JVf�n���ٷw6���u�D#v��EIqѪm446��Ӌ�(��ix=^j��8�"��266�on���*�ٳ�mY��ߨ�ͷ��VS���*�7n\�_�*}��:sEQطw ���O���;���z._���[1:��&$Y_C�Ъ������+H����I{�;�X,FKk+[�lfhx��g�1:6Ɖ/O�i�x�xܼ���4�j���?G'�8�)����v�a49��qtz=���S�	G"�q���8w�<�E�457s��TU����M��tuu���ʩ3g�d4��[o388��l�������|y����D"���K�3�����+���9��g��G3:6��*�����8q�Ԛ����Q��'�����p �2�99Ȓ���}��.s��/(-.&{)]�,�Tl�@<O��ߺ��ώ��z���457s��9��ٱ}nw"e��))s������*�FIII���������ǃov�P8L4�w�I�سk�z��h��W.�����l��O>������op������q��gJ
o��.�##��xi��9w�<u

TUT�v��e�w���\�v�w�>�K�|�K�yy�����ǟ����[##%��uu�9��--|~�$kk���]�X�z=���o�I{GW�]C��'�o�����8u�4)))X,�;�q���n������Fc�-U��r��E&&'����|�ov��m8�Nv;��9�R��_�,�j����U�xWr��p:�������h���ԙ�+>�108Ȼ�������P5�k����w�C��,�����NK��?$�S\THFF:u

LOO377��<��t�Ev;9���Z�t//��>��3��-�ס�e@�� ��立�������	�TUT��׏,K��z�l��s�����G[gF����.��X���n��s�v���ٽ��g�1>1AkGF����y"��X�]ʏ�O���˟��_RV������G��S��LMMSW_Oqq�Xj�\.�w��s��EZZ�����H4���<m��ܾ�=�vq�
@#�c2���w`�����I�˕�_�ٟ�����R���6�_���������L^P��N#��~��Z�iittv�{������������I�F#�h�p8��.姦����g��<��x~<6>�8w��S�&f���^yY�	��X,�oۊ�d���-:::��tXXX��笠��}{�299�����b�Zٹ};W�]�h4�m��}������t"#`Q��t���Ӵ,���Ts����$Ο��N�!IF���[6S[]�W.
���FaA>���ٵ���0��(_�K��	CH���+/��2�z<lٴ�����%��ⴴ���lD���?���g����JqQ�z�#���������\���m6�z�����_����믾BOo���(����33LNMQY��h4�̝l2���wt���G4[�w�u�k
��V233p��-�7�gzjr��v�~�&G�}��_ 
��
�����$	�N�˕�*����h����X�fr�sHq$��h�f�_��Z�S��\�t�N�3%���Nٱm+cccdef���JsK+��NEI{�� ;3��L�߿��D~^>ccc4�l$77���I6�������:������ܰ��DCc#���m��v;/:��K�����j�d4�)/+�������*��yAK~�5ME�����!��ҋ/������"��Y^�%�zߨo /7����U�^<}�\�~��,�)NTUAU�>x=�23ٵ};nw���tu�p��4��24<��hdrr���$�����o���O?#����c�~}�GQ�Ĺ���Ezz{A
�t�w鼚谵X�tvu������D�~K�U��4�����;�xO��LfF�,s��-RSS�HO_�_�׿kM�ė�V��N�c|b���Q���]]defb2��bF�ܪ�$Y"�ԋ�q�Q��K��t�USy���w�p������绤{�go�mQQ6��G"�;��;(,�G��a4�Z�:�<v����AFG���ɦ����HUe%u�
�/,�⡃��P(Due%)�P�k7�$��R�����tT����_�f���x�۷S�����"��N��hmkg|b�g��fϮ݄�aZ�۱��TWU166��˗���chx�ܜld9��`2�())fnn���
�*+hk��!��{��k�ʊ
�jn���w����|:��������� �s��U�� ���D"aj��PUI�yv�>��0�
7���RR\LE���u�)��������/c�Xhhl䕗��r�x����DnjX}%�i���t��m[�07?G[G;�+���on���|RRD�Q��*Y\�Îm[�Vs�H���bB��k�߾m+����ף���ތ0::Jvv6��ߏ�����+�OLPTTH8�_c1�	�B�ݳ��F]}#�cdggS[]���"׮_����P(��S���룧��;��"��8���ڹ�ɘ<oQR\��n��|c����8===���b4q�\��iKǗBGg]�ݸ�n6oڈ�������r1�0OYi).^��f#���LMOSY����4Z���z�dfd���8I��;�dۖ�'z��z������b6��*�������(-)F�������^`vv�[��D#

�o��kdd����:�����277����B�$��(�Y�;」<p�����b1�[[)-.��p�Yf����޾>����|_4�k�ڵ��(
���oٱm�=�?�.��F����]lkY��\.�Ǔ=+�����j��$IF��U?[ގ�i������L�k��U��fjz������𤺟���ʯu�J|'%d��á�O�+ˬ��>H�o?oܯ�^�'�b��6Q%�Ϊ�޳��{f�ˬ�]k�F,�`0�g� ���׷j��2I���t�t?XW��``˦M�,s�_����{妿�~�&�yU�Ol����oЇD��,��{�g{k��ʟI����7;{�A�l6󋟽AI��Kx�<�������.
&x?h�o�����qw{��<�o:��/�Vnky����?i�L��?�S,f�����ۍ���;�5�	� O��m�4��7EAgb�~AAx���/� O�A�)����� ��4�eyi��;c�ʟ��`|b�`�S�
� ƒ��������r}�IJ��9EQ��͠(��I,���z�
FB���BAƢ�@ ���rI��X,�,6f�g�F#�|�V�����H4�H�����j��I�Z��{���#A��$I�P��@ue5:��]p(�f���1r�sY��1��(LLM�(+)Ù���/�O�tzm)���� �w)�����������dbqa���"z���5�N��fgtt�ź�1v�}‘�wA�gy
��"�/g<e���M��7��y}��u���/� <K�P]J;�m�eM����8W����u}/F��$� ��i��z���ihwK)�G��H��y���V��`,!��z߷��n�Db�;��ɝ����uo+�\~A�Y��_%I�7;K0$##�m�gf0�L8�v Ħ��q:��M��64MK��FQ�� V�5��/��F�Z��L��FW�k�e�@0���X-��ʊ�2�K�D(fddd)���Ǎ��\3����!I��;��bi����.b�>A���V?�_��{������k&&&Ŗ��UU哣ǸQW�$IȲL(�7o�M?���rW�J�$��s��u�F���B�@?��7	G"��rI��%���I�F����f��cvn���&�'�+��{��$I���*����������~Ƈ�|��/���v �c������p�����|mqq�K����V����r'�YSAw�� �#�W��NMO32:�3%���v�23�F�4��288DWw7��n226���8����x�{w��N�c|b�o:�x�陙�)��p���9dYƕ�B��377��3�3ح6�F#�x���TU�l4����^o@�$��f�����^�/4UC����+�89��w��M&N�:��G�������q�����eCy9�xY�QU��Q�;�Is�QS]����y��O��`��]�l6n6�"�P[]�۝����w�� �#��n9uvv�9�ܼܳ�D4����F�ei0��/Np���Ʉ�*wnK��Op:Rp�8bxd�ǃ�b!����������
[binary PNG image data omitted]
site-packages/sepolicy/help/lockdown.txt
The Lockdown Screen allows you to tighten the SELinux Security on your machine.


These lockdown measures are recommended, but can cause SELinux issues.  If you have a machine you truly want to secure, and are confident in your understanding of SELinux, you should try some of these options.
site-packages/sepolicy/help/start.txt
You must 'Select' the initial screen to view for SELinux Configuration.


This application allows you to browse SELinux confinement per application.  You can enter the name of the application to see how SELinux confines it, or you could enter the SELinux name for the running process.

Alternatively, you can select to manage SELinux on the system or lock down the system via SELinux.  You can also manage confined users and confined user mappings.  Finally, you can set up File System Labeling equivalence.
site-packages/sepolicy/help/system.png
[binary PNG screenshot data omitted]
site-packages/sepolicy/help/file_equiv.txt
SELinux labeling for a directory can either be set up using the Application/files screen, or you can set up file equivalence.


File Equivalence allows an administrator to label entire directory trees the same way as the Equivalence directory tree.

Use Case 1:
An administrator wants to store his Apache root content in a location other than /var/www, such as /srv/www. He could define an equivalence between /srv/www and /var/www.

libselinux reads the equivalence rules and does the substitution whenever the matchpathcon function is called.  Tools like restorecon/rpm/udev and others will all follow the substitution.  Using the example above, when matchpathcon is handed /srv/www/cgi-bin/myscript.cgi, it substitutes /var/www for /srv/www and looks up the context of /var/www/cgi-bin/myscript.cgi.

On the command line you could execute:

# semanage fcontext -a -e /var/www /srv/www
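
To check the result, you could then run matchpathcon against the new location (an illustrative example; the script path is the one mentioned above). It should report the context that policy defines for the corresponding path under /var/www:

# matchpathcon /srv/www/cgi-bin/myscript.cgi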

Another common case where you might want to use file equivalence is if you put your users' home directories in a location other than /home.

If you set up an equivalence between /home and /export/home:
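
The equivalence itself could be created with the same semanage syntax shown above (an illustrative example, assuming /export/home is where the home directories actually live):

# semanage fcontext -a -e /home /export/home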

# matchpathcon /export/home/dwalsh/.ssh
/export/home/dwalsh/.ssh    unconfined_u:object_r:home_ssh_t:s0
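
Since tools such as restorecon follow the same substitution, you could relabel the relocated home directories with something like the following (illustrative; -R recurses and -v prints each change):

# restorecon -R -v /export/home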
site-packages/sepolicy/help/booleans_more.png
[binary PNG screenshot data omitted]
fff8992�?ڵmK��n^@��{Ϟ�sr���K�Z��V��߱���D�9�\N�����,/�&vk�8��k�1}:b�ۺ����o�Ͳ�O�4����̘��g:�1�F=h���FC��/���/&ϛI$hw7
&23�g�pڼ�&7״�U�r>��?p���\A��9w��ܜ?��s��}�-�
>�i�o��[��x��ߨQX߅\����@��#�^|���w�ц��@�"r��{���S��Ǜc��E�ŋQ���H���""n������@��3B�/ �cT�O� ��9?g����@a$�/�hVD"B�L��4:|�)�z봃&O~��	��_@@@�/`acCԿ�M��锝=�V����G�Y�"t��[[�{�~�f4I�u��)Ri�Iu���@��~���f�/^���ؒV�AF���Ο?G�.͑��������mh��}gOpSARgpH$�M�`u/Y�aWSS�Z��%�WPVVΉ�x�32�iZuq*U*�8�h���#�����kii$�
�6��o�q�f�_S�p������\z	G/_@�ձlORW�9Z�9p�0rZ'���/������W�T�tZ���Ms$k�F�a�ؾsW�]J%��-�����	���뎿6N�R��
��������i��c���y�T*v���,qߎ�v��GzF&���������K�z����G�\gs|&ɹ�t �Y�iI>���Z.�f<�H$����n�:i��~�B�Z�nt�zS�8������ק+W�f¸�8;���gdp">K���Ç�$.�.�o���hݖ6����r%C��o�N[)WSٻ?b��aC���>�����R��r��a�j5�F� �ߟs.p<�mZ��o�G�?y���JR�^%�m�Y(�J�L~kkk�328{7WW��}����#��I�z��֑��ۗ��R6o��NǐA��S����+|��W\�Dۨ(zt�Jl\<��tl�N�Xp�ȑFˤ߾�=w�L���a���Y����VsyD����Dšu���~l !����U*��ZZݺv�k��&��h�����Q�s��1���:x!��,��G��"8(��.��?w�ǎ����ѣ133��ѣ$%��>:��=���.^����r�F�a��g���7h����<�K/�p�Q�a %���*�2%��L�`7Tj-�K��Ҝ	���p"�r��I}B���U�<��G�4R�B��k�Q$�r--
�J��?0n�h�Z6o���OOb˶�H�Rlmmqws#9�*���IL���ߴ[
#�
mT��JR2��Օ'����T�ղg�~���y|�r������ᅭ���
KKKnܸ���5��6�j����ԠV�=�Wǿs�^��<�F�`��=&�T޸��_}C�.]���>'4$���V�r�*?����׬#��9~_�W�[��`o��#����|{[[rrsY�Ï���d�t��_?�����ѝ/�����\/]΀~}�~�R�8x��5�'Ǐ��9|NNN�ڳ���|>�����B�G�}Ή��$'1~�h~�y%*����W�������ߚT�*0W�BT��,]�'��x�W/~�mч�]���۳��mt���U�V�ڻ�G�<����Ɔ%�W���J�޽��/
�f��?���Ʉ�c	n4_�'OҫgO~�e%�YY��WW�֑t�ҙ.�c�w��hKee%��.g�c)**f߁�\KKc��4[[[[���hՆ�C�P]]c�6�S�PPXX����M�Z}�z3���֊0G:��H�g_̧���݉���L���8�.�}�wwK�!�Y���������R��ː��PTT�K�Q]S��xj�x��˩Q�0���Y�d)J��Ͽ���+++��KiY�Q:ׯ�x�r&�C�.]n^gx�V�dl�c�z����ԨL��5��:~�Z�N�k�_�S�]c��-�z��0R�??_�BC��9���"�|}�QUE�4z��I���kii����z����=��s��E�Mk�\M���ggg}%H$b��9|�(r��䔫�~�fjjj�������N�xzx���Auu5I))���ԊÆRXX���*0//O���Q(�o׎��b��
췳�Cbe����NNdfe���#�
�q�	���+k++\]]h���;�^^��-,,����D|<��n���P�8+;��Æ��鉫�K��
%"<���B�srY�i�>.[[[�qqv6�����¢"֬��K������₷�7������ ���3�����QZVj�6ꓜ�b0|&�p�D@�v>�6�-"��[��[���
��V����N��r<�l�я�Y�T+5dUPTa�}�%������Ѿ];��찱�6���w77<��qws���	K�[A�Zѿ��RXTdt}RJ
~�>xzzDI����Յ��0��sL���L�t����(�JD"3���	M�{��}�	b��!�zc&��E��&9�y�ӑ�����9�yy\KK��D����D��nǪ��0�_3	
b˶����`�":�tb��a 333���)���ƍd���r��Z���MQq1�ׯ믷�H�0n,� �U+�|}xy�T/���Æ�gf^[]ff��Ȉpr�����d��=x{{���t���b��W9y�4?��� �N�%�j*E���u�223IJN�w��pwZ&
㾚��R���Ó����K}g��ݝMlC*-�����|yzxҦu$/O���K���+G���?_�L~����2r�P�j53_����3c��Hu�IHV	�yy�^�����QۨC��0�Ïؿ��x�(�Qø^��y;2�K���t������5f����Z�S���l'a�{C��0#�ωMqTȕ�͐R!Wr%��(���CJJKI���UUUxyzp�L"e��ܸqK�K��ܼ|r���cfn~��zC�2y�g{;;\�]�),,"0 ��L��r��puum�>����a�|�ر�G��{/'�d<W�/�z�*�i����,�[���}O�lH©S<>b!�A8;;��텵Ċ�N�|�
Æ�܅���0���㩒˙6�y,,,����ˋ֑������CdD�A�ٵg/j����ӫGwΜ=���v��$��С};���pvv��ׇ������J��<қ�S���ϧ{׮t��ov��M^~>����������@#��prr",4������с�~}x"��B�t�2{���ϗ	c�|7
�D|<�v�2�V˅2h]�t6�o�e"anfNhH���AA�IL$2<�n]��p�ظx�j
�a��x;v�@Qq�bc����ud�m�թc


���C�RZ�v�ꆫ���V��ލ3g����F�Ν�� �))�V�ٶc'2��Y3g��u�Q�x{���CQq1�o�|�
�S�ϰg�~���0��^l$y��u����?7[zFz��`���~�����|y��7���m냝��H?'����7��%��k�҈��'-=������ؿ?'�8u�aaat�ЁAp,6���m��ܩbq��<=�qrr�?����/�t:����?o.NN��ߴY��B��K�|}}ضc'ҒR������>[X`)�$4$geb\�'�Ri�窪��^�Z7q��{�p��d����~����sL�o���B�D"A&�acc�s�r�AGZ�J�����h�R�������Ո-,�n�0̽�^���jQ(�X[�*���;��r��g�F�V�C,��U*Z�ɟ,��dΞ;Ϥ���S��YZ��h4h4C��r�����=2���}�������h�{�/�N$%5�������v��$"<����K�5�{���p�i]��5����Xk�� ��033k����'M�T9�ϳ��9�_�ōLP�SZGF�:�e�o��7-ه��g�#`���o+�����aBث_@@@@�ckk�S	M���3
�:~����x�:~���f��'-�"�����
�3+[��Y�wO�t�/]�RlI�� ���
E
�ϟ�sL��HZ@@@@@@�64�P��k�xy{cfV�Eo���ʚ���f�%R��h4�FUs$k�F�����|��v?w�LF�.�H���o�s���8e2��~��pǎ��q��ű�ؿeZ��_b�q\�.���-͒NK�!́�+��f�W(���F�h�_��	'OѯOV�Z̈́qcqvr2.=#�@�9���F��<6_�Z֔���ݿ�X̰!�����ii_�|�O����#�hE�n]���'#3����Ѯ���L&�8?6�KW� ��
	a��}TVV2b�P�����۲�aZݺtf��C\KK�[�.�mӆ�#-)��ӓ����w���̢�����x�2"3m۴�tb"�..�ٷ_)�('++__ƌE��s���"-)�G�n�����>w�ǎ����ѣ���6��t�2I�)����g���]
�}Iɫ��JA�L�;/�1�
�Z���Ұ�4gB�`6�H�\�dR�P~>x� {���Q�-͇�'.>��P\]]ٹg�$=#�#ǎ������D�P�f�zdr9�HHp�Q<�W�g��9��׭g���jv�ً��ZM��X��?yR_�n��F�_�t���ddfѭKڴ�4ʋ��??��������.[���dy�Zm��F�i�d��z��z���ݝk��
�n�FR�]���3$�:En^��E&��d�r�Ǟ������VP]S���z�u
��8s�,��=�j��ߴ�RIii�����,2���� :��M[�RQQ�Ͽ�zS�6�}RV^���VSTT����IMKC�ӱn�F222ؾs	�Nq">�=��QVZʺ
�ڷ���t���)(�%�{������sr(+/#)9�U��!##�5�7 ���a�d2v���ޘ	'Oq�x,'�㑖�PPXh��Һ���M�)-+����Lƚu�IMMe՚�\�xIm�J=
��c�8{�}�PYy��˖W[�g),*b��ddfr">�Sg�p%)�5��SSSÒe�9u�f톍h46m�J�LƉ�r���X�f
)���]�g��?���p�…�7:���V����m�D��.���+� IDATŦs%��?2�)�l�ϤZ�aͱ4��n�󌱦��[��O�/عk7nT�x�r���IM���/]��$�%%l߹�d<�W�����TUUQT\̒e˩�����ˉ����IJj*�99eh*����޻���T֬[�J�6ʋJ�4�E-�f�kjj�b
�����ڽ��W�
b���&ägd�R���w����z
�L���$>€~�HJNf߁:�А[O���F��OHP0�q����#�X[3g�,&=5A}t�(|�����g��C\�x	�VKii)�nvp����kX,f�[��8a</^�è�j��]|�%��^�#�{1��9�DfTUU!�Q�մ��$<�*^�n]	�?`�a�\�������KhH�.��A�f�t:t:�>^W�CC��TZ�׮�w�0�9<�N���;���/?����V�m��$�Z-Y�9$%'�R������ǔ��2~�h��4�T*���F���I���ϰ��'%5���J��4��>��o.b��}\�x	�JEuu��n>��#��037��ŅaC�ܶ�^t��>"�N�n�k�B����3F���JL�L���5��
e�#XZ���ŬR�����(��6��%���������v�~�ZM���X�a7����?`2���+�3p�>�}L��;�;s�g����ٞ=����H�^eh*���ݻ�����g�Ν����d��㷴�D�T"��8Dh4��Hր=����C���L
����]�������|��͍���T{~��Et������@tK����7n����V�Z��v}t:-)WS)*.&0 ���w���IRr2�{�Ї�������b����o�m�����pqv槕�6i��<�����6��**+��ODr��CG����cpm��v���iՆ� ^��ݻu3s����OM�sL'"#"�e/�˹q����Ҥ�5��BB��a��g�?o���j����M��ٙ1ON��O��
&�ۑ]p��0��?'>�oG�1sT4U�J�z$g;	������~Nl�ˠB��l��
��+9�F�$RRZJzF������J"!7/�ܼ<*+o��jyf�DfL�o,���OOП��,Y�:�� ���J�]�
S��(?�������l�_��RI�^=
�ذM�Py�r����DF����f��ߦur����1:����އi�;w�{��ЪU �

	aæMt�ؑ}����I���ZMuM5#n�(�a�$�<���5!!��=w�*X~~~lݶsss,--���`��ݸ��QV^�?�œٳ�H,	��ё�˖#��q����Ϗ�������� [[[�,_�H$b��1�ͣ�������oHKJxr�XN�>c��c���RRZJ�G��
b�O?�؀�<һ��_�Ͽ��DbE��1�p�66��,]�R���V�8z�Y99���ӷO��;88�L�g��|��TVVң[7��F������+�x�/M}����P���S�{�Ҳ2~�e%�^�b��ښ��0�n�NVv�;u`æ�899a%����Y��J�b¸���i�
lll��(�o����c�ې������X�o�g�co[?��V�ow��u
s��ъ�F���X?��f���n�����(|K�!�	'9|��	�g��?puu��������˯8:����
�z��տ�p�4ΎN��d/[a���O�k_aʞN; ��}����F�a`���-���e�">��G�QTT��ĊА��<3�ۖuKB�z�j�ĉ�y�)W���6�veiiI�.T&�N�^h���r�ɺ8U*��mu�d-��j�(
� yu5�����wc]]��_2�\����`X'_}����'$8;[��j~k4�Zb���J�B���S@�AВ}�\.G"��yu:r�����m%�M��Bow�24�W�. ,$��}5H��楹IJJj�\AAA���IDx��+��q�
����l�k�Z�Mafff4��W�7����ɍܰN�$��ߔ�w����8M�x�d����DF�����(��W�aT���a�۟�˃�|�ر�G�[�% � �bKBCB�y�~~��H���9�c:9)��Os�����租#�ґJ������j��~��OSC��:�������C���<D�������CD��ꗖH�J��5X�_��4��0KZ@@@@@�~�,o�r����Z!�JepTUU�����
4A����ٻ���bme����6G���Y��%R��h4�F��#ң�h�?d(�,��GY&�ѾK7�32����kBQ�n��S&�1��w
ג����^���W6l���ry5�L�෵6�W�. =#���r-^B~��;.���뤥g�i��l:t�N�Mu�O���_~3-�ѐ��^|�rrs�JG�^i��ݰ����&��o���w�/1�8.g��c����nGK�!�A�{�n|����L��������啖� ��p���fST�Tz1�;A&��C�W��j�F����"F57�u:
���tj�.nͯ�j�F�-�ؾS�wUU2��LfPGu���hh�N�3��o�酥XLhHi�]���|�edp�ʕ;
��Fc�o�NGuu��oj��B��M�Vs��	��CS����%&kkk���)��]9ם�+�m����`GE�B���T~�s7mW�i���),��u&/��o���[咖�C��ak����q7�B��h��~���Z���JeӍ�^�k�^f�1�O����q�3���˯���A.�3|�rrs)--c�S:r��z��c�0x��u�^
��ѣ�?x�¢"zt�z[[�Z-s?����
��9���r��Qv�ދF�!�u$S��̼�!��$%5��ޟǺ���Ͽ�����YD�������{��(>��kjjj(,*���c9�x�����ɿ~�%���m$�:�D"a��!t��-�����ILd��y�)���A���IKOg���9�x�u6bgo������O}��Ԅ	������IJJ��)�s�����xV�d�ٷ�JErJ
s������y���-����{s�c�z��kׯ������'y��o���7���}j����:x�:v������t�*�nj�W����H,-�����/>c��-\�|�[ۊ&�:�N���ڊ�����E��`aa��Ӧ��_2l�&�8����:2�%�W`eeeP�ޞ^,Y���V�������[����_'#3Ӡ��Ny^��֑�|��;FJ~�u���prr�����>����$	�c��WT����{o�� ��ߟ�O���w� ^���M�5�eJ���pm"���Yu��JɕV��}x�G��)y��j�ԫɸ[�����.`İ�D����7��y|��'H$\\\�5s��$6�O�h���wǾB��h�7����S����޾�k�._I"48��w�����J���峏����5<3q"���8t���#)9�}0|�}��Z����[[Z��vC�k���g��Y���QPXx[�����7Y�p��[�h�N�F�����|�ݻu�{�.�޻��v��'�ȣF���#���((,4���L��,1�:��eX�-����/�``��899�R�ط��/a޻������;p�y�̡s�N\NJB�Ւ��CRr2*����\};�2�Yƍ~�NgT�k7l����VkTv3_�����r[m�q��`�ӓЯ/���̙=�ҲR}y���gx���lش�Ri��kii�pum��ťQ}u��CG�v��GD�)؍v�\��q��b�ȶ�Z�)�)�p�晾�L]tKs�]��UJ�RCVQE�F�D�0����!��ݯV��y���v�������
�;�Y:~�VS��u�F�榬��JIvN.vv�����f��E��H$���s��3t� ������İ!�
�;z�8E�Ÿ�����a"j��(ݣǏs�����@T;fgk���;#�
�m���ji�No�H$2��NK��^�
	a�ۈK8�L�uafV{�V����wwF�z����G�ӡ�6��$U��1����K���4w��a}�Bwsx��ݝ��3a�X�r9;v����;;[�pc�XTTLt�(9s�,��XYYq�L"nnn�m�I�V����ɲ3��r������ކʏu���*
5�o�u:<ҫ�$�j�q��S�T�`-� �ۑ]�/��a��~,�q�V���ps�B"6�K�;"�z;�3��ْ�I�L��׍�o�>��z���ּ<m*�!!��}-r����yÌލ�h�f���1��k�����ju�5��y�Kޟ�1O���	����|�32
	�ۅ�Ա#?���g������>���j&?=	�A��ռ���)��a�f�u�b�F�T�J�"/?KKK|���y�(�J�sr�7�#�̞�Db�c�s��a>����v�T�V����KKK��ǡ#G(**B$5�$;�_?N�:ü�!-)a�ӓLjioݶ���$�ѷ���V�z{�
�o2�qcF���������(������g�'�q%9��S�p��y|���v�BfL�@Xh
���>��wޚͣ}a՚5D6�ʭ��5�����k�����1�)$%����M��HT*����P�P\��?����HDaQ��NzE�A0��c���������X�߅��I��7f4�/&",OO�a��C�x�ۖ�E� ���:��.\H���^P�Xl��������_�������6�%���/G��h��ѿ�/��l���4�%���vs��1�����>:��,��Յ��<�_/�e�qt�G��bfff��U*s�o���O7�������W��&ғr5���l��---�����Y�t��F}}�w��Y�R����0z+k�Zڍq;�黱�����/�b���TuK�բ�h�^��O�~�ս��������Z�ѿ���>�C��T~��A^]���9�y�Oђ}�\.G"�`~S�Z��!��
���h���+�N4%��l;�E�G�2�a�:�{���Pڵ.��n������Ӝ���z�((*����
�
fff_b��Az'���!�?˝�gsh{��o]=��;��OK�!
��H$2��^��������
�����+�{���-
;I��EK���"=-G�>�ă6C@@��#������<D�������C���<D�������C���<D�������C���<D�������C���<D�������CD���� ������b1���l��4K���I��]��4�p�u��o:F�7^ݡ9�h�f�O�H����N�V��IY�\Ȭ��˅SI͑�?����+~`�o�8�x���_B�R�q�:��M1INn.���ցC�M���jٱk�=IC��p�L�}���p%)�kii��� ;;�c�c���4K�_ZVJ�JúCWI�-E�Q��H*?�M��͙��i|�b��Hڀ������>thߎ�v�q��9�Zm^�T����(n�V�J�2�^��5�o�N�����ෆ��j�Fy0eӼ����OT��w��J��緪�ꎮ9{�?��0yN��Q]]c`�N���''7�\�?�P(��e�V��q����fZ�Fo�F�eێ���r�Je��5l��J���tb�>O:��d�La����L�Yo������C���#���h4���h���ԯ���jX��������w�/1�8.g��c��&��{d2��Y9~\��)V%���8{��/�_��U*��=ܔ�x�4�P�B�`�L.j9����H���s�
ɩ9<��/��H�++	�aaD��3��2�Q̛�KKRRS��y�۸�����ɿ~�%���o2���8:8�Ƭ�����=���{�h4D��d������ktlߎ���0���lؼ�Yo��Ϗ7f�ż������bf�9���s�9�U=JqQ1����Y��j��<Q�T���l�����/+��p�c�����d�
�Zbgg���-l#��)$	Ç!//�}���ɐA��%&����s��i�	`��:x1;��j9y�4f"3<<ܙ��^x�eb:v$-=?���()-��>ї���K����L$��B�{s�f�����$3_��˯N'",��;p�y$��ddf����}�nN'&����K/�H\|��C�1l�`�+*���潷��G�}�Z�F��bm%a�[��v�]����/��Ow����NL��U��������t:������t�*�nj�k�.����h�


y��DF��ٗ_�/�`�
ѷO��{�H<w�/>������wwwN�>Â��d��t�ôW�ItT�R)'��޽���/@����g'�<��������\�?1ƠL1����k�I<{��s�:�Fˤ[�.�ILd��y���
�}���OZz:+ϙij�۰;{;�:�yz��_�N:uҗ��z_��"�JQ�5�5�M[�boo�R��w�a�u���m��K/��;��G"����¬�3�ڶQ����OR�*(�RP.S���tvC���4�,͙�+�
'�)�)��'��^%�Þ�1�Mƭ�j���|lll����ϗ�g���oP"-��ގ�^��K�2�uv�����t�~o.l\o4�ʔ_Z�~=�))̙;���Mg���6p��e4
yy��}�m*+o4�Z��ƯV�q���_Np�r�9�b.�^�w�#�v�b�D"A�V���Gaac��\.')9?^z�E��d�T*B�����D�T�֨
n���p^��"������ptt�_�S�$_�JHpA�Ѡ֨��>�V�Cl)�ɓ��؁�7�����x��8}&��V�L~�
	�N����x��it�ҙظ8�ڴf���p�W�����)���Ǖ�$���xf�D���鷎���ח~�>J�n]�ѽڷ��_VҧwoƎ~�M[����������"�m��n��/M���Р|k
���x~�d�����ΦO��<ҫ��4
�D���-�>=�>�{1�'y��pqq�j�5V�Z���Ø2y2..��O�hFƮ={)//������Me��6T޸a����q".�'�+/M%48��7X�f-}�MhH0��n��חW�M�駞�Xl��j�m:��NDZ�X��t�Ӧ���	���X1����������ѱ�#9|�W���:E��c�T:u4��R���_7*�@���HIM%"<�e��섯���4��={զ5���"��UU�����ތu�}յ���h���½<�EN�I�O�޼0y2
���W��aZ��5<|��11<1r�ɶ}'�A���pu����r��^º�t���GB&2����T+4�9�FV�
v�ɾ�����)++�iS<h ��r����KӨ����������h�GLN�6������[m��Ù��\\�9��xG>J�e�,�B��S�C�X��p2ZK.&g�h?��	�;�ɵ�4||��|%	�LƑc�ؿ?^��
���G'Gjjj5r�v��|K��
�J8u��.��Ͷv����	//OZ������Q#G��߳��mF�ׅ�g�%-#���P"������Q�Ԋ�m��p��:�t���W�srQ(�9v��j��^}��R)��~���/��+�ݶ-�.\�@����R,��;[["��177����^={Хs666�����耻�^^^�����䄕���h�*��yH<{ggg��������WWZGF`og�O?�d˶��d2�22(*.�sL'||�1777�70�d2y��km������ �T*5i]����Y�9������k���)���ZZ����YVhݖ��\D"Z��tzF&}�<����vv��ڶ����Y�[���'�ߟ��j.^�����>_����e%�Ɍ����[[[.^�L\|<=��jo�ʤ���D����N�kS� IDATN�x{��₋�3UUUd��r--��'N��W];�_���.\hH�y���ك�ڶiÙijFu�0-g�oވT*�?�-B.���;i�� ����S��Z���ij�9Č�m��S.S��h�3}C�����D�p1��j������hE�sA._I����>0YP�>\��qtr��Ɔ�ޟǶ;������7���~����=�ukog��5�u���d2
���G	��e�_,�֨�+�����R̴q�z::�JͶ�POnn�ޞ:Pk������q������D�.+lHNN��$�k��w�b��VKiY����^?��I|��;��R�^���q��%^��'���}z"�|���^h4Z��~��/�s			�C�v�3�E���Gmg�u�6JJJ�[�m#+'��⎾��֛��Ͽ��œ�C�УkW�����l�ɵ�t��R^}�e���m�����ԩ�p*���
jjj�Q(	nՊ�S���w�ċ�?�h!�A���y�5u�� V�F6*��UWݣκ�lk��vh��j�m�k[g���m�{+��
��)K$!��JJH@�b�����'M���s�s�{ν܇�7qz�;;�߲�_�^��4_oo�n����C����xzzP�V-RRR�1k6��i���W��J�͜EVV�~~�ח1&Y�W�^`@����r"����*�Ë��{� S�Ž�{xyzV�V�ѳ[7�`}{�dt�d�M�4j���5f,/�kkV^�޽X��7||��;:Ui�~��
�^<��f��`OI��	c>������+�JKOgޏpt�G��aaa��m��|�Q���X�kW����EN~1}���H��#q	��w���8t��{FPP�f@�@�v6l�Љ�7�������x�S�����Κ+I��p�5��b���;p���gPXX�����v0��Z�r�jj��bmm�F�a엓�#��бË&󒧇*��	��m�k4��������O��:G	�;��ӧϤ�?W7O���l���� ��WC��̚��ZEZj�B�'A�V��與�;m[��Vh���Ņ{���V�6��v�z{�P BDHp���\�x	�LF��z�u��IMMC��L��h��[�n5j����ɉ�ǎ��k���T*ٹk7�^^4jЀZ�@,�2ؠpv�����۵���r�R�	
�^�$6T*~5KW\�r���IHp�۶E�Q����B��S�H���U���pq��Ғ�QH�R���(і�VÍ��^B*�aiaIP` n5\qrrQ�se��E�R!���ک>���d2Tj�..ԩUk����@$	�

�4��$<<���H
���Ąש�D"���{{{�mv2
D���NVv���E�f;f.
����뉊�O��u�S�6���XZZR�V-�<=�S^��t��"666�Xې��G�ƍ�~��Hmm�h4��ߣS�	
$������ѵ2�������ٕ�#;�oo�j� #��bT,�T�����A.w2j����ӯK�߷!�AFu��&2��܉�b�AA�������n����@�<=���C�V���Gpp���T��>��H�R�Ψ5jl���Ю������������@223Pk4�nՊА߮���P����dR�׫����) ��������K!C&���EF�Zn�8Hpq��(�X
�t�Ym7�hDL-o'$bK�j؛ȷ����EAff&�u�P;4��H�v���_��� ,,,�p��aa����}��r9�r9��޸����KN��88�cogG������Z�+��ͥAd$aau�W��C�(�gCVVV��


�Z�J?p������ܹs�dK��ƆF
coo��6>�l�Ə3l�>.�~��3�͡F������s�;��`心9s�#8(��ݺU��[�ocgg���;�˝�ݳ��O�F��򸗑��X̤i�y����y��G~~�����-0y�Eynݾ��Ȃ�_ͤK玏]��g��Ÿ	l۸�D�Uzd�e�<}6�YӦV���畫W�Vz,==����W�vuj�U��jG�RѪE��U*�����������Oj⿊fM�;V77o&r=!?��t���謊��BN�=K~~>����H$�������ciee��*n�L�֝;����i���̙��3
��l�]��� /�V-�W���%�Tۊ��NG�J����+9?//�?���.:]��z��(�W����?��B$66&F
h4%hJ4{>���[�caa�D���¢�^�@��o��I�~� �w�>�LZ�����E,�B,�{�R�}��������@�!~��!�<G�_@@@@@�9B��B�x��������s���#��/     �Qm�rJKO%;;��w�X�_M�J��
T�����HIK%�~�ɫU�Ƨ��M��5
k�W�z�J�����'�V;�^�V�5|���f�[4���C�������@TK�Ϲ�C�F���$$�Ֆ��P?�N@�"�f�M��ٴ:T��ji۩3�|�_�J%�6&��-���=��?U}e2�J%�ƌ��ܑ�ǘ>s�뻟���<�yG�{b�OBy��,[Ε�מX�	_����Y������:�����4�1_Nd�﹗�Qe���Y�O����_��Np��}���\-z��9��ѣ{8+W�1�v��]�7j�T�W��P-�_�Ѱ��]��a�'�s������[����г�w���&�p�&��>,_��EEբC�,�����gdf����WQF�ѐ��@~~>:��p���������{�p,����l��e,]������?��PXTH���j#�Z�������IIME���EK/�
�����W*
��r߾�x{�ir���M�U�u��E<�=�j�dF�n�������\��zu:��6/oE��+���IH�#%[	�^��:WY�D��!�k³�C*��5t�ҙ�^J�2/?���2��8s���Q-���j5�2K6��G�#8p)79;�&�R�Eau�5!&6�W���!.�$
D�-7s����)TҵK'����ɹ�[Cߠs���2�z��c��^����>�����A�F���E���唩H�R��M��ۋK��0��x����h��h�/^�������������������j��ծE�.��9{^^^����`oOjZ����ۈ=y�v�D��Һ���#11��_N�i����3�A&�p/#�������Y�n=����g�~������EDE1q�4l�������I��mϞ;o��G�l�*|}}�z�K�Ȣ�~���D�
�z��~�����[��Y�~�{p����\���7/_F�Y���S��YP��+o
}����%:2������xc+���s�93g0q�4�Z-))�|9n7o���:�]�t�H�xzz��ȏ�b���ܱ[1�㑆:���7o2~�dT*5�}\Z/�^oR�����2�_�gᒟ��GE�7ѻ|�o�:s��.{�M��\IVV%%Zd2)=�wc䧣hղYYY4nԈ��������Krꆇq��u����w��)k����}����np�N�Y,���~8L�7d+���R�*aް����琊�=w��[�ز};��n߹˥˗�Je��4���O���U*Q�.t�c˖���YYs�����w��z�ߺB|\v����W��ώ?�0[&��-4%%L�0��ӧ�b�j���W9p��ڴ��k�ٷ���;�>@��|}|��؉�*m��R[[ƎŠ�X�r%8�I���y?���A̘2�X��ѣhܰ!/^�nx�axz��b��r�XD"


�Ŕ��P�V-B��v4m܈��fL�L�.]*���2�e}V&W��LHP{����Khu:rrr8r�x�m��#����Л���g#?��Ύ/���EX��(**b�׳���a��Ɇ��f�&�ǨOFR7<�[�n1��ٌ5��ӧ���=���Q��/��|N�6�iۺ5cF�"�~��������v��^u�{��7��h�������fM�Q�֭	ⳑ����f�R��fˋDFzoݾ͜�s�2q�WW�x��̜���=z��yy�2h sf~��y��p~�~/��P(��@��0�[Q.�s&�ӉW�=�����$br�jj8�2�uo}k+K"j:s�NEj-w2
�x`�C��琊D֯���=�u#0 ���/`�q����m�{s�@�P-�_��"���c#_��r���D��� ,�KKK�z���O��[����P+4��/�D����_=666���+jKD">��lݾ��{��{���7���…4���Q�����񅍍��A�_��p?�P�����ht�²T�Xl��O}z���#�W�~��z�v���>�?v����Fe��A$Q�1��ѵs'~?~^���
M7�˱cؾi�a0۶��na���%��	�ǎ�͖���4�,��bU1:����DR�m,�teeg�g� DE�B��f�8--Jϱ�H���Y�ܯ/�f~Ma�_�V"Df�k��-���KE��̔����=��֕��:��ď��@��#`���\�=��З��D��M���.�>�J�����
�u�ü��xPL�������Nq7���;����sHU���ö�t��e��~��ܺ����
��T*�X]�N����+�j{��}��;v��vm���o��xQ��,]��WB&�2q�4���2pM5"!�.

Ds{�]������ز};�w��Jl���;K��@��p7)��S�1v�hll�iִ	[w����3(,,�ޞ~}{��c�n�ZcGƏ�WZ'777�]�@JZV��o��U$%'#�
4:��Ϩ1cy�][�6����BQq�f�"++??��\ŝ�$rssiݪM7f�ΝL�>�HD�>��
����ٶ
�g��)|��&v{{y��7��$<==�S�ڷ3[Ǟݻ1����z{�pOOO>��S��nVV��ܷ���e�v�狭�
	f��ܹ�D���J�3--���.��֖��>X[[ח��srt�nR��\1�K��=*���۴�M7b����������L�6���|��Jm^�b%��7_
����W���'�ˉ����%�0_9��0)��;-��V3Ė��_�{�~���Lo�<�9d��?8x���'�0���W�~5�.��D"���C�auxe�'�C�ѪU��|ꂯ\�̝;wL�mllh԰1���O]磢R����A�T"�J�D[Vaa!R�_*���h4XYY�\U��j���X���[Y!�Z�i�O�ӡR����5�����r�x�z�ӫ��?dr*�S�'j�
}X��J����C˪�j�b1"��ʺ���bysz�Z-z����R��Y�,�y	�CC�v�<
�x�<�9�a�=�jYn7�"���<s���J����Wߛ����N���D�m	�=�$Te2+s�'uli���0�F�,,,�le6<�kaa�H��,--��t�G	�P���G�,˷GUu1g����>J�%66O�/O��>�ry���"P9»���!���ao�P(�iS��T�v�g��X̻o��O�!  �/EX�<G�_@@@@@�9B��B�x�����
�KKS�z�''95}kV�j�*����k�Nڶm���ߋ��p^��ղ՟����X�N������@��(}Y���I�u�9s�cϾ�OUW�L�RYe>�#G�1}��O�O�T���=O,�2�����������"��|L����[�����2�ˉ\��*vU�'m�w�˾�L[{u����n�=���єٸ)�/\|b�幗�Q�|����*�~9����'1̈́�w���O;�T��5����?%%��y
��Ƞ���E?�l���i��h+g�R�DO��	�Z�!	�J�6��^Y]�jӲ��<�@����LLD�T�V�Ɇ24�Y�3�����;���9������+�U�Z�6���?*�]�TV�哙�Tj�e*�z����\}�/��)�r�>��v	oE�o?��<n��g9���[Χ�ʕ�v�*T*��B��CEY%%%���?�5���e#�Hqq1Ū�G��\�-c�Z��U�bJJJ*}�w�I��عk7�>�W��ñ�'h�|N�9㓒��ɹ�[Cߠs���2�z��c��^�̐��������{4mԨJ[�+���={�t�I.��[�{�I>��q�>s�ʼ��_՚�x��~���S�x�0���F6�;�s��mF�	~�)�_hŹ�����޽>�h�!����8�=��+WY�h׺��|g������W;;;>��I[:r��K~b��1��f��.��#�ЯSf|Ed��ĝ:��O>F*�>T~y���ӧ���n����jת��˗�ο�����?�nD81q'iܰi��в%C0�V-Z����~GpP=�ue��O�j�d�|�^o����sF���^���U�qq�0e�d6V�Ք���
.��!9����b��i�Lb��\
U%������RE�E�;��͚������������x�f��ҹcjת�7��.;��������s&N�f2f�K���˲e��ۣV��4a<�g�B�Ւ��ʗ�Ɛ������6�e(�J���&ё��LL���[���̙9�5��w�"j�p���g�W3��)*V!�I������s�v�׬^�������1Y��2��)�g��₋�3�3��\]Tˊ_�V��}�t���ĝ<E��Mxe�b��̖IMM#?����3}�$V�^���|�*.^��ۛ�W�q����
A�u�V|9v����*m9}�,���F��׬_�kC3i�x����y�2�\�e�����˴nՊ�gL㓏��W1��^�G�Tr��5��˜8n���ߴq#����PF@�?cG�b����x�Ć��zq��u�SR8t�(�~~��У[W��
'&6�HwDx8=�u�{�.f�o޶��ݻ�R��>s֤�����h	K�,��ެ.fM����5k��帱��كs.<T~E̵O�~߲m;�|�!cG��nx����5}�<a<��vmZӿOV�^c�������U\Ɲ:E~~/���֖��8�^���D»o�%���Q>�d+i^�OgC���}���h��G�5����y�X��*�m;�Y�!�����nՊ�ݺ�e�vzt�F�^/��7^}���^�W�0�͡�߸��w�T:?<�5��3r�p&�����Νg��|9v���ڳ�-۶3f�(F}2��X,�fM_Ə���/��]�֌=���lRSӘ�lF���)�X�i3���KKK�7�w�~� �4�ƻ/�ii&u���%)9�w�z��PM+��Mi�T�%�,�;��撚�ƗS����R���*W�*6$����D��caa���#��`��ذy3G�g��F:6o�ƥ+W�pw7����=;'k�q��bΊ�T��	i����LY�L���^���8z�8�AAf�#F������ڵz�
^2�N^,g�_皳��0c�H$���k�o�ݱ��#33�#G�U�����[ee򋊊�|�*)���j�¤�V����̬,�����e$_*%���R��\U�7�>����;��'��J��Iwtz�Y�3�'`~+_����s
mP�^=BC��ԡ{��g��Yt��������b�6Vt���s�K���+�S�s6����-)�J�2�,D4
q�c���=�;k�_o�ָ;��6�]g���"�H���C*C&�QXTz����S�O#�H̎�2ʒLV6f��>��Z6^l%������c�˝HIMA.w������g�ņ�oeeI����-�����d��>0iw�H�Nk����K�����͜����I��=���j	�Q��$�*͏^�NO���W���N��}ؿ_N�J!&�� IDAT�[2w�<�"#�u�r^<C�W�C�v,_���c� #+��6ѸaC#�YYh4RRS�����Ý_�-G�Vs7)��S�1v�hll�y�][�8h�K����f���p�\ڙYY�t:r��h&�-۶�������I��3�e7gCU����K=�t���eVVVxyy����T*5���ܱ�,[��܉�L�7�������[
7��5j����x�����*u�Ü|Ow�Λ��ɓqp��F"1�ɫC������k���_�^��4�:�k�p�6���o�{͛65�z|?W�]'99����&p��!222�D��ذe�6�$%![=��5pq��M�_k@�0�r
�]����V��҂�uK�ʏ���R+}��!;v����G�G��.�6o&##��ztc����x#wt�R@���ɘurt�nR�w���ѣF�?��=�\ڿof̚���=%%Z�w��O�������?��)���&>�j��䙦w�Z�}�|�٧̘5++K�v�Ġ�?i
�'M���kk1�..(�L�9���,�����L�x��9~���b+
��TעU�V���oݶ��ǎ������yg�{f'�g�^�'��{{�ed����H�_+#++���Бe2�����6��+((x�|�����Vb�LVu*Y�^OfV.
����ڠR���t&�8��i��������|��
���9*������QTT����Y�

��d�D"�΁��5���/�˳�C�V˃y8;����c�@�Djkkv�*P*�Υedee���W&J�Z�Z�~jc�`GA��ֆ�������d��R����BtZ�3�W�^��Xzzz�~�g�����^��!�<G�_@@@@@�9B��B�x��������sD���'//��	ױ1��<�3-��"O���w�US-�����l�Q��2�XtG>_x���:W�z�J������$�Qb��\%����ّ;i�4��^�MX�~=�		�5,^�����y���d�5���Wi��w��q~4
7n�|b9�v�Z�#�3>!���.c�
���ϒ<�~���\b6�/����Y�������e�r����5w	��TK�/..��4>[Ǵ�����7.���A|��l��v��3'=-�ed��|������j5�-�~�}���r3�����e25j5���7[^�Tr���wR��Qsi��jn�L4��K[�әM�S����Q�n.'�^����_�f�ե��L�i.�Ȭ9�Фq#���(*.�T�����꾗�aԯ�{�?��ɂ�����-�y:���:}e�r�3k�74o֔���*}�Q��;_�zx�tss���T[Z��iЉ%��a����&g�bZ�9?\�S ��)^<��II\�~��r{��S���8�����0y
�V��[�>�Y������Кؓ'
�mݱ���[ҰyK�?�P{��XI��m�x��@i;M�9�v���|�o̙�}��m3a"-ڴ�C�����g�D7mN��8v�J%���e���=v<�Z�e��i$ܸ��;ФUk���1ذz�:v��K����5��6ټ|
��d��ɔ���ȏ�� :����_A�T���G���hؼ%��� 33��/�m��$ܸ���/��Kw^F|�)KW�dͺ�L�<�Ys�1��������ȠπA4kݖm�s��]C�u6r����#>����4�����ޡ�dff0i�4��6�a��#�RIϾ�iԢ�z���ӼM;�����|�}0��v!��XI�/��W+m7��\�k��3�U@�v(�Kx���U:6��G��[��+&����\r��d�4�Sg��o�`f�������p�u�B߁�
��IS�ѰyKf�Y9�22�ٷ?MZ�海����iӱ	7J���|�sW^^�}�h���Pm��i-ʜlb�eЫ]]b���!Q�v� D"�JSB>Mb�NҺU+^�޽Ҽ�y���۰�y�~�g#G��1�͡��p����T*.\�Ĺ���dԋ�0��0*�-��2m�DV�[_�-I��ĝ<��0~L��Y�˯XYZ�x���;�����ؾi�	��ܶwww>Lèh6�^Œ?��\�{yy�}�o��hX�a##?��ի�7�ѭKgꆇ�}���5�R�l>˪��p�u�8�G���Y��r�T���L�/˖3�X��֬�@~A�^{��}��˖���΃�O�ut�ܙ�͛�ް���P���SVO{{;�o�@�6mص�	�[�.D֫�ޝ;�ٽ:����d^�ۇW���W�q�&���Y�j%cG�������ٽc۷�}��ܳ�Sgΐ��O߁��ک���+o���;y�=�o��%�*m7�燦�����5�f
{}��Y�5��QBznv�bvN�L�/��0ȕ�,�#�]��<"��c���E�N�G�Q�d���:x��d���C����rvvf������Ӿ][���.��O7;���~Y����X��֬���&�L�����hpr�el�Pl���y�:
<à�x�8��iQ�i	���r�����q��aåF��שּׁ,|��	
�a�h�ed��퍅�%�6ld��I�hKز};�'OB"�΍=y��~#5-͐��/�/l._����'r�şٜnݾ�K={��Gx�:��Ow|��Q(�(�SXTD�ɓ��p�N�1��W�&��n8:9R\\���#++�Qc�q��C9;;;lll��F��T�
e6+�J9J��mqw�AP` �ڴ���c�jQ�V�nR27n�����4mܘ{�2��̤At��XZZ�[�e���!�IqrrB��dV�9���pww�v�Z~��H�������(K���/�oGNN*���#F�n�F������8::��₻�;�.�ȝ��Hl���"��=�7kJ��H�RΜ=���'5\]qrt�������W��Ɗw:�f��.��/$:�_W;6�����I����cme��L��\�]��;k����ݶK<P�Y{4�D�\.����_�蜜�>~�ϴ�#G�`��Y�aC�>���B����O����·�殻Iɨ�j9J�ƍ��A��j	����*�`�����`���Lf����E���X|Z��m5�۴a�Wә��t���d��z='O�F��s��y�ؽ�E?�L��u���OL\�6 �NΝ�@�
�tĝ:E�N����C��cee��S��s7���4�?�FS����VĞ<ɥ˗9u�M5���_���!��ao�Qe��N��u�VD֫g��}y�_�@��m��z|����wp�ʕ*mhբ��h֔���(�9v���v1r���taa�t::w�@X������Ã%?���x�Z-N���;���8RRJs�{{yq��i.]�bV�� 3+��[����#Z��A�����Z�fϾ�F��*���Ə�觟Q�����'O{�$��\������� �?C����6�Va�ؙiC2�o}�2�_$R+bg�bL�H��d��/��f�G��q�YӿE�#����'//��[�]�>��\A��Ҿmv��m���/\�R�����|hn���3�h֔�ڵ�~�	�#X���gRD��링_MN���ҕ�$޾Irj7o� >�:���������H�͛�y���9���8��IRr2]:u��D\]\�ѭ���wp``�~XYY�P(���KP` �����h<���}�n��r"�գY���߸�����r��҉�_���

�Б��լI��Pڷk���7w��e���������4WvhH0 �ϯ&M7���<�ˣi��b-�&(0_oo\
D�	")9���P8ӿO�<?�pws�U��fm���
���#�HZZo}�F
���B��P�Ԯ��;w8}�,R��Z�!t�ԑ��$o�&80������!;;�QQ���Q�nyyP8+���6�e�[-aiaY�����V��^^F}�pwwg՚5XYYbkk˄1_`+�eϾ}��x����Ⱦ����S+$;;;�p�ና���VH];w�@Y���
�|���#��!���E��Y�A���E@�b��_{��#����͛Iϸ���%���N������2ꓑ�pu5�q�XLP`�a!���#NNNԋ� :2�h>�H$&sW��#��8w�����UC��]���*=VPP�hժU��V�rs5YXX`a!�0P�zQ*�4kݖ�+��O�# ��Q*���4������M��q�j�F���^=/�)�ʪZ�<&Bd��L&�����jC&�	�}�����.     �!~��!�<G�_@@@@@�9B��B�x��������s���#��/     �!~��!�<GT˻��KKS�z�''95}kV�j�*����k�Nڶm��������?�����eh4l%���!:*�Z�i�ZΞ;O�:��ښ�����غ}�Z���Y���]�z
w7��:�<w��8p�0yyy��Vdff�P(�Zݪ�}ҴI����6���Q(���غ}NNN�kӺ���		;��BA��9y�/�k�� )9��-
��6���ܛ��ʑcǨ��[��>�T�VvN6�b1:���ϳ`�]xyz"�7i.^2[���ӌ3�����IJ��juL�:���䇖�1k��x�s̡�hx��>������#h�j֬�@RR?.\Ğ}���슜<}���*c�>U��e����s�D
g�*�O��55\]Y�x	�o�a�ާۏ�)��f~�w�ik��.����Mժo�
=q�Z���{��w��_�,�?��g,���3)""�
���%"<�^�^�3������S�+��{����#�W7����,]����q���o�D�RIM__.����8���������k�#�АJ�Z�l�Ω�g���&��I:��+WINIa�XZX����ʊ����.N��r<&�А`$��^�H���w�{7	w77.^�LfV5\]9u�ڒV�[���5��N�P�P(8w��6l��ݻԮU�5�ֳ���/( 88��.r��;�RY@pPyyy���A^5}}ظyuj�❷ߢ���۶���"r�z�zRRSqpp������	7n�i�V�&%Q+4��[�Q+4�}����ɭ۷��,X��s�/p7)���0.]�bT���B6l�¾�H�w��7�w��ؖ[ѫT*V�����s��e�u���ؽw)��ԮU�J���Vs<&gg9vvv,���'$p��4
��lE�����V�e��?8sK���,��nݹ���pw�������F��
ٻo?���(Y�a#1�qx��ceeŦ��8p�0.^$0 �Db8�o��w�i�醺�ߴ���X�k�aeee$�E�`ێ��w�
�3�r9�99�Z���g�R�Օ��6m�ʱ�'pws������|��-J��\����Z!!�W�6l�xL,��n��S�n�'Oѡ};�vv���딝�cdϩ�g�����͍�w��x�
2�̨��?v���+�I�6xڬ=�ȍ�<�4���-y�T3�e`��uz=�D�Q��k�X�v=�))dgg��닏��6n4GVV���f��h�J�%�9��:s_�2���ץ�P�R�V���������c�ٻ���B~߹�kׯ�X,6��DL,R[[�R)���E��?7�|�*�1j���ذy3gϟ�v�����"��c���<qQ(��x���$5-�������U鱂���Y�TŔ��T�Q��ա�J�ԮErJ*I���_��vmZ��…�ٷ���0����ڹ�EEt���BAFF&���F�f��m4����Ϗ�{�1�?rss)V����~�B4j5?.\DjZ;w�c�����v��z�P8�Y�r%�׭��kX�b�a/�~=DG1�7���d���t|�=��ٷ�Z�D��L�Ν�I��sF��5�7���Ye�BC�
���x��M0�b�O��/K�s,&;;��!�Nm�999��7hr���w9w�j��-۷�T���#<����+֫@�d��}�ܯ/��bU1NNN|��F��p�OԬ�������/OOzt�ʱ�'X�q#/A�PЪEf|=�Z�E���� 77��ƀ����:��v�۰t�J��㙿h1����3����j�ڽZ��/�����+C��ޞ�/��Y�&�f.J%;w�_�^$%%�v��:D֯G�>�9v��q'�ٷ?�..t��GG�/�;�kӚ_���.^�\.�s�X��,\���7�ԡ�{�&���`go���+
�����0�2ڷm����۶a������5���F�T��Z!!��t)Z��O��^�3)Sѷ�+�Iu��T��(aʚ3�DVS��C7�bYC�ُRU��o�3o�e~�{�1��pѱG����ʂ�Kп/*��#G�q,&�k�׍Ƒ��{��M,^B���d(
~^���w�F�י���kh��:����X<=<�߷7?.\Ā���g�v��m��bb���`��?���4������/ZL�Ν�ک3��pQ����Ч�Kxzz�`�:�oo��;���j5Z���OIIIu�����B$bO�"&.���F��߻�\.G,���MZz:�0n�$��gϝ@�p�v�P�j���耫����"wrB"�1���䄯��>>��[���J��¨@Y����Ê���/4
���G�ACX��/�߻���;6	>�ކ�}Od2�r9���<�^5��)��
{�͚q"6{;;�R)r�����gg9����y�6Z4k��1lݾ�А�e�r9r������耇�;
�WW<=<��q�f"�4�����t' ##�A����#G�w/����7q�_�+W���p�����3.
��/���Ӿm|}}h͍���������Ri����ӢYS�6��Pɻ�?��F�T*
~���F�ڵ(**2���u�Ro9ʵ��ddfҢYS<==���4�3��._��W^3���!�Y�h1}:�b��W�&>�C�y�b�ʠ+99�D>��D��X���=�>>&�U�\.���3��+k�2������J����+ܹs��
����ۏۏۤ���CJ�����t�1�|ѧ>���<P�q�Z�Z�ޜw�;�r.1��B)]-�����=nnF�ʏ#�i=(0����0�wy��{{T*5��B�N��qÆ&e����.wr2�oP��>��(
\
�)��ە�pv6�����d�������?wOm%���quq��LJ����o�`�}ީ���JJ4��z��u:]u�5A�Rq3�*U1[��s�D֭K���r�XTj���p���\�pU�`oo�ԉ�]��N��[����7�#Z4oFnn.~�)/�liV���-�7���>����0��:��iz�����գk��f.���xyzq��U��HNI!#3���(�˝�u�">��3rrr���E�INNf��f�HllHNI!��
\.&�**.~�����ñ14lM~~>i��|1~�,^��;w�����9_U�>y�������W7�3g��+C���LJ�G�ѼiJ�Z��\��CߠI�FL�6��s���ɹ���0���:|�Q��Q8;����a�{�ҳ[7T�Ҁ���u�"��n��l�WϞ��?�ٻ�M[�2r�p��+b���˗+7�-,,����nR�^^���V�3��I�F̝�=�
{kkk�e���c�6�ѭ+׮�ӥSLJ��8d�#���s���}X�?�p_9��|���<5mI�V⥐ae!�Q�+#}h�I�Y{��z����!�:�L�F�"��N222�Wn��#��5���IeXX��R�j����9�/M7bƬ������ښ��L�2z����	��Nf��*�srx���{
W�?�LE&�����R�4����u���l�\��k�IJN�d�!� IDAT����
Q��t�҉�c�STTdt{�y�Z��+�J�UE�J%�h5F�FMzz:!����\n� �^�6�Ë�qv����Ɏ?� %5_BC�qv��x�
���ۻ�N��ŋ8:8��鉥�%A�� �����+NNN �А�,-i���ݟ�N&#""�D�����Vo/���y�Gw���P*������DG����ҪEs|�}ػ?׮_'��{{{��9}��BB��������Vu�����B��(�'$p�f"���c�k��i�7�Z� %%��w�ҕGqQ1�������t�ԉ�0,--�t�Vb1o����2��MJ"#3��z���6����=�bkC�G����䈓�u#�
6Gկ��XN�:Mpp0͚4!<,��q��V��]�DZ1��h	��Ғ�QHm��ըQz��g_FկO���\�J�	��DG!�J���heXaE֯OLl,ǎ�@&�ѬibO�$%5�&�h��n5j������>���c��� :���ԙ�\�|����6ib$�{�Τ���؉h4B��8������ۋ�}��p#���v�R����H��
dkkK�ݍ�K�\�I�RBC��Ie&�U�А�{�M7F�*�����ĝ<eh�� C��.�����Ū5k�5c"��l��k�㣪����'�6�$!��BG�&bADE}۪񵀬�tTT־�뻮
�HiJ�$�$�B2�~�?B�L2	�D�<��'�����{Ϲ��2s��/��c�uRQY�_�ܩ����x2�����P�#��*:��of$�C���N�f�U�IJT ��I&��@��|���^�6�HLH`�W_�_P�N��O�^��Đ���G���,��sr�:�¢/�Pc63��qDEFӞ)3�`��w���BEee�2I��|�a��z`����L}		��n�w��a\ԣ;˖/'��!:fg����7߮��c]�t�ؾI��,X����܃y�dg���̲o���@.YY�����/�iiXj-|��S���ɵ�\MFz�Y����]�W͟?_>|�Yo��E��v�w(����w�}��h�̜�B2q�tF�sI���:q�:��ϼ�?`��u(/��cF���u��gȍ7�ꛯ=.ۉ?�={�_"(**j�S��
�uC��V��O�n�SUU%I_�h��u���O�_o�G

8VTĔ��]u�Y��a7\/I�
j�#~!\.�U~HG���l����}��h��(��5f3>���Nq�8gG�B��jI���~*���H� �-���!�m�$~!��
��/�B�!���B�6D�BцH�B!�I�B!D"�_!�hC$�!�m�$~!��
��/�B�!���B�6D�BцH�B!�I�B!D"�_!�hC��UqiY)���Q��}�^O\\<:����B!D3Z%��ع�NOB|"*��c��ja۶��ޭGk4-�2�ݎN'҄�B�6l،����Z�߯%�c'�:=*���O��QT\L��g�Mo���q�fv�ރ��$:*�U�s:�l��'L����,].�/ZLDx���g<��=?��h~�<
���E�Y�~=����p���Nw��v�*"#۝�q������E�)�� )1�k����+X�v*   ���}GrR���
p����s�ya��\/Y�^�'<,�\�sΔ��6;����u�񗖕��jp:��[��&V�Z�e���s��,^��O>��=��p�_��{F��_�y�Ng��
��v�娮�����b�֭��fE�j�z�װ�3�t��>k6��v��0��1N�8��ە+9t�0��8��3^.��yy<��Ӥ��2t�`�Z-j�����f3O=�_,���.���԰�w�oRW���X,��V��k����'O�ͷ�R{j4���tz�����esu7�x��S�#V��宻��f�E������(}�q�дO�\�&믥��pY�̝K��=����n��[o��-z���ʪT�԰l�
����Z*s�c��xlX����;y����:R�%�~v�9~���)���Ҳ2��ț_;>�S�5�N�ӽϳZm�ݾg��q�M�}윹s����ڵ�Z���*���V+���i�z6ۙ�\/i�����1��L�5�^����=c:.X��{�2i�t�<�����܃��vn��:w�ȝ#F���Fzz�}��n]��{� ��1���PV^�̩�y�_��G�n���~r��8^Z��[n�o��Mڍ��`ێ�Z����ϫ���V�ę�<��s\�u���{���g����Y������c�O?c�֭L�6����'..�=?�̿_�7��f��K���|�
,X�)��A�n�(9~���4N�8�
?�o��[��]�w�R�2h���L�>��@HH�C�a֌i2~�cL�<�YO=�Q�����d�ƍLy�1����r�<�s�ٰq��α�".�ۗۇ�Ž�7o�ʻ���>��v0}�l�N'��G�6y"ee�|��'���Ȼ�W�t
G�#<<��Dyys�<��]���;���fc����@
HeU��QL���;�m�w��}�v�t��__�|���IV��t
cx�.�:�q��0���q�?e�Lz=y��sX��+6o�JDx�G���c�����ƿ�MbB<����}�~>�b?lڄ�`��k����(߬XAd�v\}Օ���
���z��,�����G���'�ХS'��XAd�H�:-�sїK�����a��萑����޽.q��8�w�τ�㈍�a��ǘ>u2+W��(s&c�Gx���Om�'[���**��T��xm�.�$�aw�x��\|�n�������ƭ�Rx{�>#��[�i�V�'�y�ʪ��0����l��_����16:�WV^�O>EXXa!!��u	߮\��d�?�����a�ue:w��ާ\��b��E6�os_|���
E���(��i�X��矝C���&��_b�kHOK�q�=c:O<���6ڼe��8���Mꩩ��Gx�c���r���}�!��y3f?�^�����7��U��.��ٿ3�$w����p�C1���y/���!88��F��j�1���׷)�I,��s�F?��V����d��Q#���"';�1�GQT\�T�X�\;������n��x���p�������u�D�x7^?��VSZZJqq	îJJr{~��_n������7zAA��y�5��Iiii��կOow<'O���'�~���7��\�dge���D�޽�ӫ�{��Z+V��{�n\?d0�I��:��f��t���&e���C�޽���r�L&ƌɉ�*r�����K�j�2���
\��=���d�|-���`6o�ʺ���׷7��;��^1`9�Y�Z,�q�{[�Y��~}�pϝwb�Z),,����r� �~�̣/�[��!�b��Q\ԣ;�i�5�[�r?n����G��k���o��2���ុ�JHH0���y�r� ��NBB��Կn�z�2;p�,��={�ޭ#ヒ��v��CxX�N\LL�q�pY�RShUׇ:u�HqI	�n�޽��]ԣ	�q�u���<y�k��8��DV�\�����tؤ̙���ƣ�>��\.��@^^����@B|�� ��+��C�X|���V'|��ᒓ,�r��>z�%Ǐs﨑t��PP<Ƒ���dފ�
�KJ�}�pz]r	�		>|�������ڤLNV�{�ҧW/��
���}�GQ]S���Ɂ�^�A�x�Pk�m�����5���x�Pw�5��z,���*��b��O�x�k|��p 7���(v��CMM
��[��.���`�Z-�11���QYUŁ�\�Z-O�B��t�E�HxX������l2���9�L&qq����R[[��z�����6mق�����~�d2���@tT���������V��j�lATd$��ؘ�Z�{???B��)(<z��%�Ԕd6n����������L&�M&�<!!�,��JKKy��W1��2�E_.a�/�뒞^�c2����rA����$44��p���(-+�X߇���1'�F_�f3�?��w�OMM
��%�/�@n.k���K.��=otT$�������ޖy�ӻ�%��Œ��I��ğG\l,555��`^ݻu%$8�А~ش���&��l�\kv����vt�Amm�{^����~��-����ܼ<J��{��DGG��h��$������ֺ����	'���<p�}�dg��+��?|4�pYt:���h\l�|��,䅗_���~��z�BCԤPr���>V�qlC����^��/�뒞^˜��ln<z�o�IT�����FDŽңM���J�
���GGe��� _n���WW��jȉa��rjmN��]�(��mR��;	�[�S���5Gju��zJr2�|�_y��>�_���رk�7l�WϞM��>%�d��ߠn����JXh(�I��k��_����{hHH�m��8m��>6�dr�ߴ�T�e99�|�*���z=6�
o7�)8���:�oUPPȄ��@���OXh+W�����Ju�TY2V���3���Ga�=w���DGG���qVb���˚���UW\��1��민DyE�=����+^�JJL�}�h�
0���:�	�?��y�t�n��_x�����D�Vs��a�F��L��y��`W]qPwv��� (0��B�V�O~~>;w�a��y-����{�> �`.�x��r������3���a���v����r�M7�{��'66�� �ij�;n�S�<G``���^�b�a7\��)�HNN�sǎ/-��rQ^Q�^�oq^��NeU���FRB#ヒ�S����Ȉ��ڤ��n��W_���T�E�%��-����}���E�8���N��U��+*b�W_���Kq5�l�jp�đ�|��?�bk[B|<w�~+Ͽ�2K?��k����<6���/��j��a7��8S�'-��;�Ԩ R��ظ�8��������͡���_�&�o`�ԫ�r�����q�ՙ��W��_��J"�<��W]y�W��'���������i
�у��Ǭ��qO���$;;�%_}�N��l6�R�:x0�� 7�p=���?�ԤLbb&N���x�o-�r�W�Z��o��N99��ƛ���PPPȱcE���<�Q�qj�ۙ4m�G����Ŷ�-��E���_UU��a��׀�a�ڬ���~V���j�DTT$��'#=
�FMxX�%%�����ف�v� �ߟNs�HO�l6��h萑Att4����dP�~y
���d2���>�FC�n��?U������m�n@@�;V�ї�����MhH(AA�������ZM�n]O���{w"۵�����H��9\7Hr�����o���?�"5%������ZMRb"Z����@z��E�S�jccbHHH�ᰓ�@dd�@q)�/�f�ӿ_?���ؾc'~~~�~��2~~~XmV�����̤]��r�����>';�)��Dǜ_�!���RQQ��(t�ҙ��w#22���J�w�BVV&�;�p��	l6��	�����h�ޭ+F_#�""H��wo�Аlv���.�����_�����ѽ[W�VXX�U�X�V�RR������c���У[7R���ˢBE�����k�����{�.dggѭK�k��iudgf�Q��9�|�Z�$�Ǔ��BeU%�%�IKM���m�����W]����*��eQ�T��������t:��_?RSR��Z����PRSR�鴤��b�X�iߞ�_ڨ���M�Ʊ�F�M&֮����z��2��ag>6�Ǵ��&��nw|���f4hIl�˘nꇟ���0?ze�#,Ї�@bB�H��W�v$��v4�>:2bL��4$D4�_���3%��t:}z�".&���8�8ꐑ���|B��		&&&�Ąx
�%((�[��HDD�>>�����a�����ӨLp�	��JjJ
9Y�M����UzZ*�"!!�V��fsǑ��FRR"���ӣ[W�2;p���c���z�Ӕ�d>�'$8����7��֏��4��Җ��~����Ç�����#G�^���t��??��ަ8��}d3�L��, D�7��0zD�Gp�NޡC�Ujf==�A��ơCs][~���'Oe�ݗŅaϞ=�N+**j�_�KO� =��&NV��~}zK�^Y�V**ʹ��;�u(���;|�Q#�vC㷪��b�3OK�o�Z�_�˅�jm��"!���8q��� �r
��)�rV�Yט�h4���8���#~!�j�$}�,�Z]w=^�*g�F5?���6K��'�B�!���B�6D�BцH�B!�I�B!D"�_!�hC$�!�m�$~!��
��/�B�!���}�e���Gi�N�{HO\\<:����B!D3Z��l6���͆�n�������ѣ�ѬB!N�U���^�������o۾�����hZ!�-h�#�ҲRtZ
N���?{k4���C����y��;o����ko��{�<ޛ��K��k�8|�O�y���J^}���V��}���y�#�g���&N���}�~WϽ�"Ӟ�ź���555t�q1���K|��f�N�:������ۇL~A�S����<�V]_�u��K��U�k�ϛ͵����/.)a��ɭ�8��������r�R)^�;���h���0ƌ�����}�͆N�s?��n���hP�Հ�B]B����R+8��ӯ:�ݎV�u��(
���,V���t�$'7�����t�ӝ�������!C;m�z.������fC�՞Z?ujjj�ܿј�Y�t:V�
�Aߤ��{�2��	��q���`�X��1�Qy��Amm�i���>��t:�����V�-ccb�m��A�Պ��K,g�?ą�X���Z�n�'����\���k��g��=�gKEi�/�6�[Ҹ?�\.�}���i����j��p4��H����6q��I����4|q������c�x��WX�f
�W�����K.����Y��c�(++�ə3�)���Y�1��ݿ��3��%�;G�$=5�Ν;�s�.|}})..a�����eʌ�X,VJJJx�<����u:j-V���T���7߰s��&qn߱���}���p6m�¿�x͝,w���+��NTd;lv;�{��K�r��	*���8���������s��)))�f�s��8��WϞ�[���IV��t
cx�.�:�q��0F?#/��:dd�c�.w��y���lL�0����A�p��1���0�L��W0w�S�����#��C�6�!�<7��Bq�q���al��;�I�=JP` Sf�Ġדw�0�fLk�l޺�w��'�};v�B>\�17oF�R��{���c�֥����e��>k6N���£L�<��~�`0p�Ճ4p �UUDGG�ȸ�<>e*��"��Lxx�����O���FX�O|�?����`�>�����V���1�k��?-{+1[�2��i�V�'�y�ʪ���7��9繹�Z,�k�\;�j.�ރ܃y���ٳ�gF�}��b�W�<�xsjkk=��c�e��BBB���8^}�
���������nj���C��@�l!�ł��l���Ȫ�pŀ\1`		u���ӛi�'�R�����cvq��$'&�n�^y'<B|\�?��]��+پc'N����r�[��G{���<5k&ݺva�����vjkk��0��H����7˿���˴ɓ3j����c��q�7�}�M���	�������;����C0c�R��y�0���<�i�&z|���_Q�p�^=y�����ݛ��,��"��">>�i�'�?7
���p�'L�<�I�N�cv����&0c�dR�����F��V.��?~~~Lzt��q���$�'0m�D������Sx�����<b�O����d2a��9���b�p�'LaMW��IDAT�2�	���1;���<�z�9&N���'g�ղo���F���2�1.��?��gb�x�u8q����v�~����
��vƌ��?�f���hp8��yq�q����g�5)��	!�G��ㅕ�������A���?����F�ՐŽ���ڜ.�����I�߮\�V�a���t�@�i��;̜:�9O�r����$&Nx��_y��?����9M�x�T*��Ra4=���j�[��g�����z��U���D�����`�1���DJr�&��4�F��(�Y������N�7:A���e	E�{���GDx8�
";���v��]�K�Ndgv����/�,����F����LVC}�N����=8uvE͞iQ�T���f�Z�v-۶o�;�ݨ
�N��(�\.ų�~�W�O�y��{{�|$�P�S�R5i���l��%KQk4�����nn]���D����7��Z��l6���q���p���?��n_�禛P���{��?��8?����kВ��q-7��.1���."8ZVCX�����T����PTa�QUccՎcM�w����������fƸ��Ӹ?k5ƌIJr2�����Q^�մ��V9�ߵK7� ::��4�K��l��M��W:�d{-s����N�ѣ�uנ7n�̑#�/+�ٳ��/x�W�9u*+W�����J�q]����Dڷ�&���ш�`����ʪ*�+*�'))��mgʌ�h�����曘4m:�I����Ǵ;n��/�LTT$N��I�>·+W5�#<,�s
��<Cii)�		Mօ�墼�½.�y�
L���2ط������y���'7���e����.c�-L�5�Ҳ2��V6n�Ҥ��n���
�ł�j�`�9u���v���+��d�UW6�w&q11�����D{�o<��sh����j.��"��f�׸�v;�����a��|��ر"�|�����*���c���W^���n���D�����{z����i�ˊ�(�#5����ӅN�f@Ǻ}��!u��[�$5���+�`��U<�����>:��H�~�]��V�j5ӟ�M���;o���£�zj���Ȼ��`^��7q$?�E_.a��u�
��f��\T\̫�����\�/����<L&�d����U)Z�j��������z�{��̑#G������ޭG�7Νf����&�����xZs�nnS�7�5�y�a;�M�#���7�	<��f?x���ra�Z=nNk�\[�N�=�?g�ZC�oB4�f�ۧ�����q֣����[i.6���qc�g��fg��O(--c��{���=�j�1�&��gEQ0����_��������y�Oo����촢�������2Z������q�h�d�4��%�_��-��i]'N����	N���l6��$�V��M���5t��>ԭ���~�x�@y��Zr&I��؄8[�B�R�r�]Sw��tQ��j�1��XnܟU*U��|I�2�?	����oW��j�9mj�����܃�>̨wӣ[�s�z^Lhh�C�J���$"##���w��0����8�!x�ճ����|�R!�hC$�!�m�$~!��
��/�B�!���B�6D�BцH�B!�I�B!D"�_!�hC$�!�m�$~!��
���֞�8�B�߭���s͹�{jxh8z��������~F?�z�$~!���FqI�9}lxC��Px�?�e�X\AA�^߷��((�t>!��ҲR���jϟԖw�{�����=�y\��U���5~!���uΓkcj���~^ĥ�jq�\���B�֤R��u$�!��0�_����J�B���˲u1�OG�*T���B\��ܯ�6+%��O;ϱ�"�V�Yi�[⯨�`߁��;����BE9+m�	I�B!.�X4L�VǏ���NJ�����>-�/a��
�[L���TQY�ɓ'�l���̚u߹�q�\-�w�2[~��b<�6C!��f�Zٴe3~~~��f:dd���g�^���������cEE�Zj	�����'�q�O�dgs0/���'q����Uq��u�����vlG��`�X���@iY)���du�d�MDFDR^Y�QF�V�a�F���t����㶟�����д�߿��B�?EA�V���LHp�%Ŕ��JzJ����B\L��DDD���IKM��t�w��"�����_BՉ*���OYY��)8,�Z�c���D��#����n�SV^���o�2Z����d�1�IKMC��A���B\�����t�_��v4
z����R�������������z4��w����Cy�Zjq8���*~F#V����ɓ'Q���L^cj�澰�0�33AU��b��p8(*.B�V��ハ�>�s����I��@`@ ~~~h�j��F����}B!.x�1�lܲ��?n����p��	v���M[���@|l?n�ƶ�;8y�V����l6�V�	��r��m������‰�'�Z��t:��tW�Ao !.���C��`�5Rc��j��O����Rx����l�����i��ߟ��o�f�ymS��!��04�_�#�P��N�F��łZ�FQb��@B\<5�fbcb1D���HA>�~����b�%-9�ÎZ�&)!���'~F��l2�g� 4$��1�Uj�(9^�J�Fw�T���?i)��$߸�V��}T����1�WV��x9�����B\(�;կ��H��o򾿟?Y2=ދ������` 3���{�11��z���Ĥ���r�l
��/-%B��=��:��ܤLtT����d���C=9�/���i|��Z�|�B���+�V��8JnK$�!�� h��;�������ϋ��N'�F��!��0���RPXx^<�~]O�7���v^��p8
	��/�� �j	�b���P�h��7q�QSm��/�� (
F����՝C.�뼊K��!�m�$~!��
��/�B�!���B�6D�Bц�6�;�NN�8���#���l6�p8�Y�B!ą�ů�m۾��>���QєU�3�G0}Ϩ�
7���;��s�A��Sk�еs'�\;���m�d2��jm/X��5k�R�Ħ���KxXX�r��盔����فa�_�[�V�!#��w�)�B�-&��vqi�>��u	���c0�[���r���7�x$�3
��r�p:��t:l6z�g����;VĴ�����r�����##=ݝ�E��p���ܸ���ڜ�/�O��������7����\�3�����g����th�j~yڒ�fC��z�dw8Н'�@%�� 6l،���k-�~��^GNf&F�����ut�֕y|�;)<z����_v�څ��/�-^�~ ��A�33ٻ?�z�?l��'v��M��t�|EEE|��lڲ�;w�ղ�����ݾN�c�ڵ���&!>���x���#}�)��F�ӓ������Ukְ`�|X�d)�R[ka��o)/� %9��S���v�*,\��6�g4���<��\��n{��=�y/��������U�������_Î]����%K�1����|��KV�]��oW�����dz}�NR����f��Mlظ���4>X�19�Y�?�a��7�MTT$'OV�Գϱw�>�|�?t��&��G,
!�g�����k���Τ�&�R��>�Iv����7�Ⱦ8VT�;��s����iq:�L$%&0f���������|�b4��\W���]��Æѷwo�F_�>p?���Mb�3{2�Y��S�ΜE��]��H��+��+.�Ҿ�i��=?�n��+.G���r�N�N��� -%������J��כ����I�>}�ۻ�����|3��6}����4�bc��؇H����t���2�1>��S�GG3u���>�N��qc�Ѓ}}��ƴ�qc:�Zv��͂��2~�C<�Ѓ$%&�~K
!�g�7��L\}�t�ڕ#����y�V�bc	
	��.��~�<��?���l]�
%,4�AW�_�V���S�*���]�r��s��1�{��W�ͷ+�3�h4(���6����/��Ą�.[�כv�HL�'!>��}ͩ8��\x}���nw�ܜ�8�\.,��㷦�B\XZ��<��a�&���QPxd�X^u%Ӟ�͛���F�a�eE�պ=˜��Q#<�>^��
j�I���|�i�BC�X-��j\֟>Z��d"%9��'���������'��Ҿ}��Ё��/�ppX�l��u�J����,X�)�~���G,[�-w�q[����šV�y����Z�e؍���Ś�ky��W�����f��!<9�Yr��ػo��/!��tT���W�l�ݎZ�F�Ѹ߫���W^eƔ��j-4ju���\
6��_���r8(�⾹��t���Ѷ�l>'L�Z��3*k�ۙ:s��@`@@+G&��B���{���yK���䓜����i�Z��MIh1��}ƱV�mR�\=%�L��ɓ')-+C��R[k&���L!D[�+Vu������l�"N����m��kjx�o��3 �B��)���l�!���놜�0�B\���B�6D�Bцh��.?!�B\����zc���oIEND�B`�site-packages/sepolicy/help/transition_to.png000064400000163035147511334650015466 0ustar00�PNG


IHDR�j��8pgAMA���a cHRMz&�����u0�`:�p��Q<bKGD��������IDATx��WtI�`�ns��Z�	*Pk�d2�ZWeVWUW�鞮��wf�u?�W����}w�����%3�R��Tɤ֚ A�E ��p�>BA��*i��������1;v�L���p���+//��v�}�eYv$�D"�|'Bd2��鑑��b��Zyy���͛�����t8wZ>�D"�H$�.�T*������}��U;v�Dkk�2MӰ,�N'�H$��[FA*���ɓ'4`��i��y��H$�Dr�,�Á�n_���i3MS��%�D"��c��M�,�ŸD"�H$��6
 ��D"�H$�q
�^+|��_"�z!���h��q!�U�l�c߆Se�-]N��~��u��7�1�DrwQ�� ]���@A6�%O`�&N���E&�!�H�6����&��v���l��R�4>�E�n
B`�&��i ���p�r:�5,`X�x�Én蘦���J׊��3Y�>/��bZ��(�"�z<w��H$��0�~t���k��K$_��X�>��ᓧ�u�����g?|�çN���n�0M�ys��������_|�e�`Y'ε���Q������!WmE�!�(
�i"�@@�W0�M�ϙ����?����i�VU���O��yNQA`YLLL���
O<� }�C��"��/���(���TP S�%|���w�?��k�[���~���Ac]�{���r�$�,̼�c�w�*��&w�/!X����6X���һ ��?4Ldb��s�8{�"�� 5��<g��_V �䫡(
��;�s�(�{�IkkH��86�������8l6B&&cd�ٜ�3M�*�Y�b)���{1-�L6Kei)6���h���J��8�T�x��
zzq9T���(
�a2���S[h���W�Į�hj�c`h�D2IUy�i���f���i�����q��F��4Ӥ��!�����~,K�H&��m�>��i`Ǟ�\�����4�"21�����ѱ٬N:�!�SVbpd��|���
�?8D:�AQ�KKp�������L4��6ä�$����T�N�
�mG��h�����5+���x�Ǩ����M�,K~���`�&�%%T��r��9,�d����LFclݵEUX����-�
�b}���c��UW�?_UUH���VU�~�R�|;��?��S\��a��|�� W�d�᣼���T������y��Y.]�!�NS_Sͮ����!4U��t��c#�e��sm��#4����?fh$���$�Bg����f��x���<�@˲0M���%���'t���t��`�;�����h<��n��v�G��4MU���������d,�3�=L:�a�ރ��F#����Üj��#��0����?�-�4��0�����C$ɽ�eY,n����(�~���lbɂ��u�bq�c����1,�ƺ����'Ν��3�=r�?��krnk�ۅ��8�v
�Y�]���n�Ϗ?� >���|�	��1��TTӲX�|)����Ż���YѺ�+��y������{/<KmU%��o�q�
6�Z�_�˯8t�T�n�&�ir��=���������2��,���9q�n����GSC=�a�,�ʲ2�m�W�}�ӛFQ=��`Y474��
N�=ljs�,���q�J���s������du�ѳm�>x�0xx�jV-m���&��=}���j�${��������<��5��,���b�
���/�|
!����¥Nj*+�7��m��1�aY��5˖`��p:�fn�����ݎ���n��v�PM�H��t\�at,�e��3��A���d�d*������Y�d1�i0��p�	G"�
��d�z�����h11'���0�i�xh�j��s;�؀�n�$
q���U+�,/�����_�r�|����B!t]GQl6
�Ӊ�n�r�'���T_OYI�0��lD&&
��f�4
����XOI0P�A�Ԝ���H$���j���Gx~�&v<B��c��E�mO��H$7�% 21��}���q9�ly���	G��d����oa�&s��K�
�ٶg?;���r��u1�`MU	|�m6TU!��PS�҅���8*JK8y�<�T����|��^��>��Uˋ�r�� �d��\���ų�=�'_|I��>g��&6?��x"��m|^�^/��q�
>�%v��-odQK3��� O�p^s�qaZ>����t�Y�`>�i�r9�{=��^�+*��,JKB��(+	]I��%XѺ�Ͽ�KG�eF�"�T��o��$S)�.'.��P�_�:H�%�_1���R^zj�}e�e��M�
�w�^kɒ%W��H$�Ǵ,2�,�T�ݎۙ���:���O���w�l�3�|K<ծ�*�ia�id�:��"ȝo�i`Y��YTE�/8�vR�p�Z�E:��f���s�|C@7�v;����ut�(6�\�:�N�<u��o�kJKB����;W6�8- �X���f#�͒L����hjNE(��B&�c�Tt� QL��N���:=�������Ȣ��L��3�|c�FF�b��r3$��!��̾��9gΜ��~���"�ݎ�n����)
�s�זe�t�g�Dz,v[�>E�e�l�4qد�N����oY��ۼ�Wp�OM�%�������a�y��:

��ױi*���b�$�*��f+6,�b��;s�4Y���ƺZ��w��qc��~C�&��X���0>�0n������\AL"�_���X�n*�l�Mسn��F��<~�tB�N/=��ƺfn�u�k�L9g�Y�r9M
u�mvj*����בH$_�I�D"��eY�WW�X{�ѿU���>oq�1��$�o���M"�L�f]��3ݖ��� ��H$��>������r|_"�H$���b�_"������ߥ���ݔ|��ca.\�0eg.���`�%�TU%�L�iq����"<n����$�T�^S�4��	t]��9�Z�v;�t:Mx,L&���"}mE�$XB(�N5d$�'Z,c�o.�`P�f%w!�����3�e>�������_x,L�@��TUV�:�d�Y������n���'�L208@d<B$!�Ivݽ��r�?���e@�J�=�&��v\�%�j%wE($�	��4~�M��G��x�:�:i��r�e����M}��m���:�}~.v\$2�0�{�^�D"A*�"��iQ$�oD�&Zwm�Ar?Px��N/��{US�ȼ2�p�S�Q P��gx��AВ��4�
�̭��W�������z�=��r�����krK蚦y׎7[��s��S�Cw��?��1�hnwJ3�����x�~��M�!
���^1�S{��C�⵸7z�9��3��`J$ߜhl�d*�f���44͎e��i&&s�w����s]�/��m����,��=����
J�nVj�[�L�&�@���La��fP�;���(+B)��M�DX�2O�C_�N��Ԇ��( Dq�?gl�0nmcDU��2�W��N(����X��eY��kz�o�gq˂�X"�S�N"C�4,3M(X���0�D�a�b2:��噦{f㚆_�d4ʁ��h]�����RTB�.\@UTjkk���憁=�F��(�OL�ƨ���=�{�¶�S���l9��,�`4f���膁��a�唗���{��eik��r61sL�.�{�b�:�`~KQ恁A��*�����kYVn{߂�,\�p��;N,gú�h�FWw7�1�/]zU�,Ӳ���o‘c�))	Q_Ww�5
y+���ߞ<�@0���yb���ɯk���M*�*6~,,\N�e�J������E�;�a��	V���4.�������+������qC�MM(���s��?�Ov��}儼ۮ�4
�z�G�������qN�:M&�-*��L��|�vΜ=�������v�|Qى)
j�1�zl�����Q.��N;���K8��y��l�����}�x�nz��x�wI&�W�z�W�ʠ�R'��܍��4�t�Ϻ�_i���MM;����@�xM�y`�Ϸn�a��**�]]t\�dǮ]tvu�γ��c��3�W<�7�S�aY�޻�����O�DQ.\h�]_��%�7o�C,��zS��^I���c�.�.\(�ɩ�N�p�����T�8�����M����Md<�Pĕt���iW�+�SU45���UPU���Z<O���((��oS�Q�3�UYH��"�,���fc�!�8�c�f�8���f7\�uN�>�#?ĥ�.&&'1
�϶m'�J�j[6?F2�dϾ�d�iB�[ߌ:cL/�ɰ��!ε������|��;v208H]M
�MM���G�}>F�a����'�����~�jky`��ܳ���!&�Q�����G8~�'O�&�����wB�8�ǟ|�_���f����8y�������^��'������|44��FJ�B���;��f��^����s�\�N��f����dQ��\�y��ۇ�fc�RZZ�G����CTWW�����$	&�Q6�[��%��޳�ή.�jkټi����B��8�i�((Ӷ���,[���M0�䩉\�! �J�sח\�ꢱ���}�px�/vI"�b�ڵ�.^4m��p_f�����ih@(�!��Б#�8y*W?�Ď/v�G384��>@w�e��(������{�e����p�����s��!�����
�ٽo�#�8].R�ܳ8q�$���x���q8Ӟ��(h�6힙�I2�DQL���tS_W�a$	��a�&�0�e�X����DcQ��(�P�f#��0�0�U�Ux\FFGH���]nB����J���������yf2D�#�F~���>)�n�i6'�l�H����JKR$�YJʗ�Θ8NT��3h�W_{��jjjp�\ŃB���ٹ�K�y�IN�>KuU�����q�z��"�9{UU9x��<� gΞ%<6�n�h��P(�g[��iG�g�c����#`Y&�e���%�%%�	���x`�:zz�8q�$c�q:��ذ~&�s��9�~?KZ[���47�፷ߡ���ERZZ�[�DrGBPRR¹�6�n�A:��Wo�IMu5/��<n�����T���UUWz�B022���;������!������{�T�PYQ�g[�aw8��g��l�RL�d��}��>��Ǚ���/~�:�55LNNr���/[�a������/q88t�%�ػ?U�����۬Z����:B%�iq+Y=K$����*�?�|+��;���2��_ #�LRWW����v|����ؽw/�`�S�O���C7
��|�򝘜 �]�B�?���s�;�_��t1�P^QΛo�Ò�Vε�O$h������'�<Nc}=_�ރ�n����w��Ep��	Lä`���.�~�>L,���s�l6V�\IGg'��\�t��kVSQQS�
�Ø���f+��5M��t��xp:�����$T���bD"Fã�T� � �p��"��Փ�f)+-������q\NMs����awp���h���z\.�@���Qzzz���AQ���u�t8EoB�c�x�x<w�zI�Sl�F4EU5�i'q���O$��!4����BFGGgw�!8s����>~�ɧ
r�q� �b�2l�CCDcq��*Y�f
֯�Rg�L�8�(
]����ձb�2�67��:.�s��9��8��PZRBc}


���iYtuu�t�6�[ǢE��Յ��̛�̢�q8�mv�{�i��w� ��]6UrsX�EEy9�o~JcC=���_��P�_���R^^~]��Lw�a�4���c����`Y�ܵ�D"��ukٰn��p��1\N'�׮aú��O�3>>��H$�� �/_&��t�H$\ho��s�	&''��̛;�����Iii	�?��Sg�p��)2��U�Ʃg�<���?��?����h�3kJ��iZ��1�B{;N�lX����jv���ØՅ>�P��*<���s��Q�����FEY������Q^^��E9t�(���x���� �{{Y�h!]�ݴ.Z��իY�`>���PU�E�p�|Ep��i<󚛱fq�Os�O������fq:�����.�U������v��x"΅����1�y.6�
�2q����I �����M�a�&��0�i��i�ۯ�Y"����|^?�X�ұ�5��8#�a�Mwz�2�B@2���|���?�C����K���ƈ�b�oogϾ��TW��y��ٳ�:<n7�/��ߏe�464p!΅�v��/s��)Z/��-<�N��\[�a�(
�p�q�n���3gY�p!� Ӳ�L�D2���b����?Ooo���+0M3g��S�/���_��T���pڛi�W�ѱ0���{H��,im���u�>۶
E6�_G*�f�lݱ���
***���#����Ɔ6=�0m�@_�'O�f���~�@7C�\�h4JYY)���Б#D�ǯ2���N3�f�7�t8q:�y�{n<r�3^����
�-]�SOl����h,ƒŋQU�C��\e�����

]�)--a���8y��d�ʊJ��*Y0��'6of�VTM%��ٽo?###,Y���.�H$xj��?@Ӝ9̝���'ٺ}gΜc�`Y躎n�
x�'I&�|�m�l�h��\�S����k����h6�h�}>���躎"�%�+6EU�y}x�^����y}�m�8
UQ��ʊJ|^��.��G��������I$��<�c�_u���V �@ �d2TVTa�.�t���r���4�J�a�Z���)���l��#G��L���<��SLLL��A��4u55l~l�@���AJJJ���fݚ�8�.ε��4M/\�eY��P_O��E̟��hx��Eee��nܰ!}}},]���k�`YPW[K��GAmm
m�/���͊e�X�j%6��N?I˲p�]�il��t^��?��_QV1�|�2��D�1L���`�4�708<L&�a�G�7w.�U���r:ٲ�1*�ˉ����zX�z5���h�ӈ��S_W��-��A��q�8JKJ8q�4c��_ϼ���*�n�LNN
���l�8�N���P�,�S^^�s��b�2JB!η_$�LRS]�Xx��N��x��	�<���1,���Mk�AeE%%%������ŋ(
�h������TUU�v�������N��%�1������K�������+��.f�� �����/�Xj�7����
QSU��f+�;6>�"vG.� ��**�h�X,��n'���Q5�t*M8��t2>>N2�d|b�0��E�
�0�``` w���i2��0-��8�T��ɉ\^��6�������?����4B�y.-���r�qKW���P�m7��Os�\C'�
SS]����zx\��7^�֬YCI�d�RVUuZ��f�q���������SJKKB�k�n�=ƿ�����6���i�Ӣ��@M�N����
i
Ø5}�;?Zr�)(.�x��~+L��۞:W�p�0L��W.Я�O!m�ݙ:�PU�)�n!Xn�\�L�����4_%��(�S���1���9Se�@��20<@ue���TuZ��.���\�hMS1������y)��4�������N�=�(
��;Q��3�^�M���vc��d�8~�F�d2�l6NzV/vN��$��h4�a�]n\n�x�˅a��q�nw���X,���x<l6�x��yN�B�Τ��@C��v#����;.�~�F��a:�:b�)/�ĺ�zB��ڮ�?s]�QU�P08��:�N����sfV���~����lی�<o��|�|^w��%��ô̫��z�3���󮗾����-D~Q1{>7#�Wyg��ֽ��j��z��v,�1�|�\7��,֕�|3uBayQ˲�����KJJ�y���d,���ozܞ�kX
�r�
�7̳�G�!'�%w�\g!�H��g�0>1�a��|'���7�d�eY�TW���8�b+�ҥ�.^|��%�{(L��ۙj|o���S���wׯ՟�0��^G��K�4�ib�lTTTq��q��G�����qZ>��ft�W�֊��̷��hWɷͽ�N?\�V��v�^k������_UP5����FDn�����%�OB�Y�h1,�6�)w�ܳh��[B��_�B��aw��-�z�Ա(F���躞����Ąrw�{#�x/%�}
���n�Y�vÐ�PH��P0�Ϗ��$��5�LNN�L�Z����!�r�o_w]�ۅ�(T�W�Τ���|o�f��z�iυ���}5���hx��D�f�$�ozV�����ҲY=�F
+��]�[\:uvý`�
A}^˛��w�{C
3!$�{�������߲J)+��f��������to�̅�t���k��󼙲H$�:���?�J2��Ѝ{��'��8N�@1P�^T��S�{��k�J$�enY�!��0<<,{���b<5N6������H$�W�~!�tMӦm�!�\�t:�����$�D��s�z��e�p8��>�����{d���U�J$���q�"�������hR�g$E"mpkԕ�p��P�BI$�m�Ϧ��L��!"�I�S^楤��#KK�*q}��H�u
�MJ$���R4��e��f1�ܞ᪪�'MߥK��B��+�u	������v�ѓ��\i/���h��%x��9<|�!��3?$��6P4��h���7�ߢSc���,ne4<��d���G�����Ғ��T�>��"LaNs�Z|q���~��A^��u,]��Sg/��[�(O,�X���k�}�eY��/�H$�
r�����H$���^��������6���H�������/?cӣ��b�2����Dc1����݌����f�����4��f�����Cdc1�m�c��%�|V.S�w���"\���F�p���K$����B�L�8|�(�`��E���:}�Á�(��_�܅d�\.'>����	���z��+/���� �PYQ�k�{EQ�C�"0�4�ia�;q:���b�\$:���-($d��~�^�E"�H���	!p8�\.4�F&�!��p��X�l����{/����������$�[Zسw7�G�u֯Y��ˊk[z����:��S�9v*����jK8s���N�����‰P’��~BX2�O"�Hnӂ��>�7=JC]0<2B`Y���z�"�ʥ�u��EyY/���ee�����v�1M�ï�S�,��^Μ	���1�K�����p9��pl����A�w!]��Dr{(��"�@���f>POQl����R9��464��s��<7��0.����P�������GbJt?PS�?���lz���5ə�n�,�eC��v��N��뿚((��?i�%��#���k-\����3::JiIIq_<'�L�r��|����
�v;��Ô���b4&�S]U����4BP[S{U�]A2�apd�T:���������ع�����~v��@&�昦��K�(��+�I$�-@A[[��_��S��M]|g�|����*i�4Q�!���@Q�Y
�’�X������>�~?��o�+��G���Ŏ����K�/�H$����֥6gL��j���y�9�e�7Z�uj{�2M��<��J���1~�D"�=�R_�WY��"?ִ��V�D"��n��WU]׋Az��d2�f�H$���pK�eY8N��(ݗ�QU�N�Sr���d�x<��K$�-��E!`���/�!.�UU�"�H$��[>_NNɓ���K$�m�[e��%�D"�{��q�]��#��[Ŕ}�dV����
�p�$���7H�9.�B�O���
�H$W���Q֒[���Z�s"��A�XV��'�fX��f�����Ф�o�C�7�u
�a�q���S�E����a�ͭ�8m�TҒoL2mp�3̒�5�)���yv^�W.�}\��
��fq��r�4�-!k�\�dum�_�-vɷ�����an�!�1�.cZ&�0�T�ʊ�;-�]�5
���$	j�k���wZN�w��QJ}��|�T��e��<ҋt�r��}��~!�L�͆��r��-C�
e^�gI� �2��T*�i�HG�wMӰ,�0��׼;�LM�dp�wEQ�h�P�w*��	y��{���R��WP���,�[�lu�N^_�M���fd�\S�f2���X�8��4Ml6�M_̲,t]����ͼ8���_g�>]�Ig2���ml�l��7/���#�@Q,�*�}U�f�02�Lq�c�����x��<�t�ŋe/0s�ĩ��}�lUUgU��|��ȱc��!���`Y�B<�\?͚&�a���,v�!�
e)��N��f6�*��4��sB`�d�;
�����8�Λ����w�{L���=+Ș�u�6�U2μ'����3$SiV,[z�{5Ug�9j�V|���f������y�r�A���鴞�e�d2��6�
[�^醁z�L&�!��("?S�����9�]7����f��p:߸�f�f�.76�O{΅�S���r��>l�l�.�J�9r�-��RQ^>�:�L< �w������UVV6���DUUq8ӌZ��O��߲x��N�sg!�� ��oޢ���K�]
QYQ~�yS
P:�����ǃeY�ٷ�4	��+��1Ӽ���S�y㭷ؾc';����g�`Z&
��Ӕ�첋�A��K�ŗ��z��E"�4
��Ɖ�����|Z�o&�Ÿ�(8x������ٳ>z���OeEE�坚��X���k�n���I��tvu3����[8<F4����:�}}�l��jS�B�L$x㭷	�i��Q�GQ��C,�bns3�iK�&j�84�h4���¾�/���FK�<�JK��qjk��x���5���L$9~��

ER8��0(���=8�N'��eph��ښ�5g�7S���^���O�n像>�����,\� W�|���:U���c�&;�؅eA*��0<;:8q�$�

��q3y���(��/?����r���7��8�i��-<�϶m�ҥK��)�;:�?o���mtt��h��ߏa���y枽J"��~�sJJJ(-)�v��lN'����7~�����B7,.�S���tvu����p��)9
D"�ٽg/����:�m�|��o����?x�ӧ����M�\I���^*��p���P
1<2���Z��Z��?�7o�˩3g8u�5�5���t���k��:��1�eq�R�T��������膁��$�Nu��:X�tYGFGy�wijl��l���!�H��[oQWSKyyY1M*�^��+��B}�����8v�Mզ�b"��O>a`p���:j���F��E"�E"X��fc|b��8����ۦ��/w�p��@(
�`���aQ�0���#<6��n���o��6�%%��n"�|>.����&&'��l����(##���"�\΢��,�?��h���_}���
�����d2"���bv���p�px���b�~|b���~��,6���h�0x��wPU�`�O,GQJB�bJ^1_/�L&�3�\��=�F*���?�˖.������$�LF�8�"���a,���>L�,�k�SEy9=}}\����r��d��l|��'�<}���*�'&�ů_GUU��L����:�tvus����PZR2�}��u>?����*�X��,	]a~MU����f����G\�x�` ��ŋI&����
��܌�X,F8<�h8�P�6�o�NMu5�P�D"A]m-�1�/_�2-4M������A4M#<�o����l������4��2FFF�f�PU���aFFG��tb�۱,��E���8v����?�s.X@"�`|b�D"I,���R����H���0}}}�����"�s���x,���bdtUQ��;�0<2Bue���L����ebb���!,��t�f��X��]4�##���֯'���v�)ޓҒN��i�*��5��tu_�B{;��l6��n�ͷ����c44�������f�+�N�N'�ap�����|�w/˖,���b�����\.

�������iqyx���M���?0�����?`ͪU�����@���S^VFMu5�H�޾��N�9dZ���,�?��˴̛���<���er2Jd|]�ahx�o���K�;84T,��8����_K����a�;������W^z�ŋ�D��� �L��l����7M3W��!��8�d���>4M�n�M1�PS]őclj'E��:����p��2:��y��)-	�q����S�B�N��i.���q�'O�aՊ�8��i
����y㭷)	��q{p:�����v���2��a��\M�=��?e�&��#ȹ/��(
]��x�ny�N�<��Kص{'N�����W^|�϶m#���<��c����Ŏ�LNN�g�~�^/�V��f��垽��1*+*QAǥN�n��3O=��3g1����h;!�����K��/��u�5��ì_����T4|n��χ��TV��}���������E008��d�X<���<��kY&�����ئGy��wp��TUV�q�z��8A0��R'�n�v��@�X,����4
��ɖ͏���߸f����<�f\y#
�����/���t�~�Z��C"�qꩮ���Ө�Z��~�����$v�
��G<�~�BX�E�ܹ(�¥�N�/^D��1�~rg��� ���q�X������B��+/�����x܄��k�n���p��R1�����ˁ\�\����Nl�2��QP��1~��k>z��Mp��~�ƛTUU������8p� >�E��\���p��I~?�N���KZY�h!�}�!�x��1o�\�_h�4M�>?]��Q�X,ζ�;��l�\.^~�y�����~�D�9

���'!p8���p�r^����Z������I��E��啗^`�D��q�ݯ?��ů�|�X,Jyy�?�4�O��awp��9��/
�|�<u�P0��~�����k�{��?��:y����SYQ��=���z��w��/W��9s���@ۅ�{
��;*�n������fL�d�VTU���"���P���a^}�N�:Mx,LYY�@`�}����C�|>���?�C�Οg���|���^�Bp�B{Q6��Ã7Sz�B`��v�d:�+?�ga1��{�^\.'kV��Rg���ЃCC����(�������Eo����v�q�s=��
�8�}Lmm
�,�bG�DM�x�y�����
��Q__�}L0�����MM>z��ͥί��f��4g�>�P�>��Ʌ����B �g��e��?�3���::;L���������AJ�y��Ï��Y^���g��<u���!tC�6r���޾>\ncc���3O=Ş}��؅�^z�y��r7�:;Q�B"����=�q��9:.u��<��3<�a}ѫXx/E!����$�3�@O���]jS�t]����R)R�4�����'4���?�.���G�O$X�l)?���122Bۅ@���f,a߁�<�Ѓ�䇿ǂ�-TUV�l�*+*8�s��hnj���B]m-�D����?��On�����h�؁�����c��մwt������f..A72�q��{���ֲr�2�67�%�7m��'������h�h4FI(���Fa�475�³ϰtI+�T������a��U�䇿G"��ȱ���sт�9�ǔ{m���y�x� �^�r�o�đ��X�z�E���ͬZ����_~�t:���HQ�,���������I��,_��U+���3O�b�2�����Nj���v���=<t�G~�|�{�TW��ƴk�����Ӻx��I:��v�h�t������H�R\��CWW�L������+���m�����2����%>��={��^n!��:��I��%K��ÿ��͏QQ^�72�<�/vPUY��E�ɏ~ĺ�kH�3����|�66=�0��/�-�T�}�������}��\��"�NOS2V.0�x������/�q�z�-Y�eY?q�l6�#?ğ��366�hx���	�^/�W�����Ƙ��Ȋe�x��gx�'1M���I>۶���&��_�u55|�m;���j�
~Q֩�jB�b�:.]�uq+�,�'?��W��m�i�3����Amm
[��(�W���5k���S�>?�L�������?�=z`#�/��X,Ƽy�g����ڽ��}���鿢��a�w�S�W�����a��i2�d���tV�4�b\��jx�^�/���L�H�S�^�S��|���P���ݜ���giF>�0���JJB���ˬX��˖�X_O�����r]�K/]"��H&���fnss�{�J�|�R��4~e]wu�M��8��m\.��k�?�iK/�ĩS<���=n7/��<�uu����3��?0H,�aU�x�{/���y�,\0�M�<�c�<ª�+����X<^���0����y^y�^|�Y�~_�Y�V._FӜF^��+���^��0=�_�a�LgVïz�f^��M�D��|9���`�&�Οr�2N���N6�rAi.�MU��YL�$�Ja�T6��2�b#c߁�;q�ۍ�O�,=���{�WZ��1A,�DU5l6�86d�4��I��6��ǝs������Y\N'�lEQ��m��2�������k?V~�z]׋��4���9)MӼq�cc��8/o]]-�y�7n(���\N�v��/l���n�n���
�����v/,3�f˻�s�OA�u��F�i�*<_��N:�F�uL��?�\�K�R$I�d.�-�����w�0� k��ڴ\!�t��'NP^V���V>�*��=�
�t:�|8�NtC/6.M�2�,X����*�O�⷟|���(*���?�@.�B�}��
�s�ۅ�N:�
MU���|�m�"~_�q�v�p����'�����O?��/�(JNn-��έB�7ئue�^��
\7ϟs���Y{C�ʥ���R�Z����z%V�Z�v�����-�7�a`�&�fCS�8W�]&��V����I�ߩT*���(Bɿ�׺�
.�B&��b ���~�y�A�JK�^���&���p8�<�y3M��d��c�^�s�܅�����\l�M��^�K�a0gN#/=��{{y���H�RXV�]wU�~�ӱyY����͏>šU���0�ە�߱�*EQ��\�6�V��1f��0���.{MS�qJݡ�ߕ��`
n�����U�M$�$IR�T��f��3uI��"��S��q��_ Ȥ3���W�K]�~�ګ8v��ʍ�ݜik#�111�s�<�ǿ��]�ws��9�N�q��9>߶��C�罹��Ϸo����,_��4�Ec��*v��χ���?��-��i�e��6>��'��Bcc#M�
��p!gpEnN�LwOA�ʥiZ��m�Ng��Tд�P�k``�#ǎ�L�p:s�Z��������`���zin���8r�8v��eKZ���v�'J��S�UQ��045WI�֬�f���(�4��y���{�W�7MU��j�8v��~�9�>�0u�W�U��b1�'&x����ʹ�JJJ���0>��S�@@0`�Gs�lێiY,^���<,+�`��7O\yN�$�I��_�UU�>z��?����Rε������_���3�޻���I�"��~�g[���O?e�V�7��ާ�r˗-��p�H$����f��zinj��w�GQTl�F(bI�b����}�p:�<�a=�ۋ��aw ��iE��bw�"�F
�\�����D�����bT��ۓ��=~����\#��n�c��X0>�oێ����q�y��M����׿�t]���G��ݵ�t8q��
��nw�R(��a��r����[ヌ�(<�f�z���_�
Y]燯�zU�f�$!6�
USi�7�=������������|�>{��gr�ʳ;�v��>����r"�q,���o���==!(+-��_d��f�~�}�իV^�܄x�6�'sE�˩�Z�P���P4~?�v�摇"���Յ��+���éSa���t�a�$�I���TUͽKS�R.0��d2?4*�4�ϋ��Z�nj�n��+�?T5�.��PN/��III��ŋf��2��$,��f�f4F(�Q!�ib�r�C��9p���%�M+W6��bN6�_O&��w���v��f��F��'������ղn�j�~?���<�$u���6>
ɵ{��.\8-pf`h��d��w�	�x�L6K( �JO�ٳo?cc֯[CUE~����/)+)aѢ��TU�z
���b��~l6[~�x�D2AeE��޾~TU���H$LFc����L�p:�l6F�aL�,F{ONF	��ҙ4�@`�B��b�I �Ͻ\��084D,����xHg2�}>,�"��v�"��PQQ���c|b��GV��\��-n�R9״�� 2>~�<���'��Y~q�n||�Ӂ���4�k��f�d��}>"��x�^3V_��bX��/_�#��
#AYi)�t����`���dt4�uk����$B7���.��P(H$2^��fӨ(/�v��+Ѷ�i�7�bUkeAߴJ�J�����<�
��댆Ü>s�Ϸm畗^������Z~��_108��o������2��'����E�[
1:��_����L6K0���4g��G(+-).�&�LRYQ^��X䂲��ʦ�C'�I���e�h�Bz�����C~~?�a�v�q:����}
1>>NEyUU�D"\n7��100����r�L&)--e,a||��P�P(����u󬮮*ʛ�d�D�	���D"��UQ^N<�`xx���~?ᱱiך���D.=$��r�t:DQ���H��,/+��>����NOo/�T
��CyY9#�#$�) �[���!��04E��(B�90���nj����d��x�P0X��h�8=fdd���*2�,#��8v*�z��eYLLNb����9/D:�oà�]��x���o�d������p��E]�ǿ�������LTU-�=�4�����A���$��**N���	B�`n����Adb�ݎ��d|��
�r9	�LF��b1�� ��歫�B:�.&����#�E���夬�t��:�l�;89Aee%�@h֩�3B���f7�]���f����YO��sLE�>FQ^x���;F��/��KX�z5F��X�+*y�B�+�l�V����A_%�9�3�ϖ~�~�>���2=��eYW���U��]�L�<�U�َ͞�7�k��^̔y�}�����o��/���f�u��7UӴ��yh�2�3�T�:mk��;w��A������kH��UQ��?�ɬ��)בu�)L�|�Sz�Se���S�4�&��|g�{Z�(L�_CQr�23�^�3e�*��r�|v3��Lf�K3ϛ�ln���^]o
�TY��Q��=O��1-��������y��Sx�3�Y��鮯��'�le��f�/��RE�nv4-�,:q�5t]�_�c��̯�K�K��TUV�p�~�`��r�[�E&�A7���ȼ�͈�Xn>#���~�p,	�u�%��3�Ky3��k�v�W?�+'|m����M8l�/�1[�o�3M�����ˋ� y�O-��wp�wr6�{�|f�q��k5*��ۍ�7��Cמ{=[���p�r]K��EM_K��y�̳�ֳ�)�l��aa�ҭ�^X��Hx3źQ�؅���B*����.Q���u����|p�Mg4[��&W~��\K!ҙ�����Koz���xB$���ٙ������.j��VۅL6�ϛ��zW��ֆ��f�ƢŇ��dH$ش�Jkr�DrK9�_
]�滩�gI/��޻ܪg�(
����Ȕv�w�d��/�t����>"7-\����v���UVV�M$�����`Kn�a��=��ѿO(�2����4M��ǯ9ܔ�f���,ˢ����Ҳ�x	�D"���z�,^��N�!��rà��3���h<���H.��n�V�D"�7�Q�D"���2-��cc4M������\g[޻	�4$�{�x�Hn��OSt\�$��J�i�3��枈|��=����z,��2�,�D��}bb��p��&&�s�\*
.���u��I$�]�|7}��
C"���=~���aL]fy�D�d�w��t:��j������͔�d��Lj�c<��o���e%�{�¦R���s�1~]�Y�f=^��m۷�?�Oii�x���!4U������x���(n������6�Bb���C��n*+��f3�:%%��].FFF0�]UVTM�Z�G"���X�/��-\3�ϲ��9TU%�����[I�Sd�Y�j�;w{��FUUR���[��ܚ��h��;��L$0��k�SZZ���'I��(�`�V��G0"���8�G�4m�-��H�m��_"�{�����4QZZ��m���`rr��{���q�����86���y�3gOs��*�+B������^�R�%*�+(-)��L��z�f��Ѓ�088��m�R)<Sv���уD"�H$���5��-3gl=oͦ�q{0
���>p:�KH&��02:���E�4&&�I$��=���������p�<�`�͆a�U���2M,����B"��<r����a���(�JJ��~���N**+Y�r
�`�e�V�v�,���aÃTWU#��cGp��l��D��aLˢ�u)}����q*+������������G(X���8N���2}3��)�d2A"�����q�t��yt=K,��L&�4
EQI������E��� �� ���^�m8o��e|b��ݩ����r�x<7#��vfdH$0���i���gAn+�@ ��f�-2��)R��)z �������(�+:NC��L&Q5
��T'%�o�Y
���╗�_t���Z4n+W�bIkn75M˝����:����*�eQ[S��((�B2�ĴL\N��RQQ�eZy#��b�JTU�����x�.Z�n��Eg�%9�_��`I�R|�!�=<2:ʎ�����=LEE%�@�3gO��S�N[��F����m���ߋ�f��r�j�/nEUn��_###���oPUS	B�]�����o�������b<�ؖ�V�}�,�}�d���{�d*��jE�ܳ/R[S��\�4MΞ;CYiUU�W�~���N�����ݞkl?~�x<~�IS����,.XD$၍r��a��+X�fݝ��ɬ�j���l�k�4��g~���t8g�ϛ��l�Me,<Ɨ�w���ȦG7c�:����d��C�uJKʰL=�����N0d��yX�Eoo�i��f)++#�FB
��FuUu����fX�t9˖.g߁��9s
�����8B���&�� �Aee�'7bh���
l6;��h�JU>���&&�q9]TW�0�$<:��i�����i����3O?����c�ٷ���R</��$SI�J��zs�M�$��|>�8��R]]C4:���(6����*��j��$�IFF��,+7ۢ��ۃD�u�r��������PW߀�]]�����i�����!��V_l6۴:��㔖�bYccc<t��j/j%
1:2***�L�X,�Ŏv���U�9eFR���!<n��tSQ,[���}8�N��lы166�X$L(����ɭ��Y�w��`��x����(�O�"�#A�V��Ra���N�:IeE;�؎�fC�u�/<�{�~I*��4
@��/]�B:.u0>>�����s��\�Yq9]�>{���A!�x��Y�����J�p:�̟������A�L�,����O>���QZZ���b����q�>��Ғ�b9E!�X�p�&�p�b;�Ο�aw`Z&kV�e��h��t&��7���$=��T�K�:���`Μ&L�$�L�v����QRRB4��a�>�I�i.�z\c�β,N�<N{G;>��5���~��t
��N*������G188X�/�.f�����)�N'5յ\h?σ�0��Ǽ�-�RI�G���)����F	�TUU�H�9�v�x<N��%�g$��J2��0t֬^ǂ�r!�x��p8������'�QVV^�fzdd�m;�"X�ţ�>F�,���vsO,�;��b�Lv�q�f#*��r��*���F�Qp;Z(B�b�Jy�Qb�mmg���<��Z��jCg��㡲��6>ĺ��QT���z�y�9�����̃��'�al,̡��F�<��)^x�%���8].l�ƥK1-��?@ ���M���~�������4���Tҙ4�/к���{��bG;���r�*֬Z�M�x��'�������ׇ��GQUzz.��f�:Z��x�GX�b##äR��6��o�ټv�k����QW[Geen���k�3:2¥�Kl\��T����S�K�����[���^bn�<�,�� ���������
�-Y�C>L $�p���#�QZRʳO?��%�hk;K:��2#����G6������N-�u����
I�!7��DLL����{��D�#=��z5�58�Nv���…����������;��ᑜ0�M322���26Fp%n�"׈��
���0
�����cΘa`Z&�[X�v��,��]E���Aiic�1��z�E�����@Ueյ�|>l�FCC##�C<t��EMM
��(�O��tɲ��f�t\���i�;w���Z***q8�����q_&�N���122\l�EA�l�l6��(���tRRRBxt�4���S5
�ݎe���o��8��e�(+-�5��˅C�^���
+��ߏ���0�b��4
��
�a�*
��2<2Dww�Ϝbrr���J�'ru;�N�?����n�'7�6:����g$UTTM�7-��9�X�v����Y\�0<���*v��E�SUYu���D�#���>�?��m��C�?�,X��ʊJ<�n"�1� ��$	�����������AI�M�C8��� ���	��A�6b�X.h/_߅���i�X�����)-�"�n�zΜ9�a�\����r�r")*˖-g��5�k;��� .�]���E:�f~�� ǎ!��P�@mM]�6�����S'p8��ֱ|�
����n����?y�y�ZX0!c�1�\�&*AQ�^n�!=�=�-;%���}~�f�Q*ESU�טm!��,���E�������E��#=ʥ�Z[��v��x����ZV�^;��4�7�|��)ujE��~�N��ҒR'�.���㴝?GEe%�X�h4JiI)>������p�l��T���,,Z�,��/7#����h���+�K6M�$T���dΜ&���R_�@s�\��Qlv;����� � ^�uk�c�)/+/�d2�8y������K�
����a\�p��KC}�L���s���O��I&���TVTRQQ9�h4���u|�4�ܴ�n�#� �IcY�Ca�Aa�a�
\.P!D1 g|bE�(�b1�}�|�.��b�,�7U>]ױ��iW���ܰ��i��S)t=�3?�0���	YM�,Ζ(`Y�%��FU�i���y��f\�E���t:��j��_����"J��C"�@�T�+�h,,��W�(�USQ�B6�)��S�K!ضP�TME ��y��g�� U!�N�UUB�{��6#�0t����d*�e��:Z`�ȝg�躞�l�L����p�����]����"��ٳ�N��ngŲ�d�.v\��ǟ�f8|�P1�L 0��z���kYWT
&��ԩ�����t&��3�B\e��5>�
�tN�%�j���S)(���x<��{��r���+�����fk�̖n�l����k�Sh���F�.f�75�B���~�5
�]k��Dr���+��=���M��?���~�]��bZ��圹˨zUzmJz�
�����*��UH%���٧�3��RW[���V�;x��X�T*EEyK�-���S�xX��5������8uu�����{����������4����LL�s��	&&ƙ��D�L糬۶Q�����).�mm-,����
z$����l�c�{ϗ�l�@��7?A/�iQ]U͉S'�����t�q{��H6������Q^^���G��)X��ɓ'�p�<�����n��3gO�J����%�U=�Υ>E�G�މ�!�|w��_"�;��&=�G�D}]��`�;BPS[���et_�btt���0�xUӈ�bEPS]˪���������:FF���Yt]����E��GX�ds��q��R>�\"�|���^;�Dr}f5�`&'O�RgB֮^�e�����Ţ�RZZ��`�2��+��D2��%K9zt������l6�v;n��ʊJj�k8}�CC�D��Y���{H$�8�
K$w
�~��ΆuH$�@���t�h]��Ǎ��`�����SUU�����5ō-��(�e��u)��58��"c444�{<^���m�ҥR�$��˨���Ji�%�{�;�Dr�0����\����%%%�ϋ�PQ^���׽Hc����_��z��A��FV_��ម�%�%�{��MɕH$��{��K$�{�x�H���[s7#�P�r%ɵ��%-�Dr+�Y[�Y���1��"WA�3���v�ҧ3il�8�P�
ydz�^��j�l�_�n�G��鿋�a�S�+��fZ�8��t�T�4�"J�Ofq��6���,
�.��T�m��&��I������H�̧��ydz�^��j�N8m�]"�L/����L�ܚɿ]k���#~��_[���� �h�mWH��U��&�B<m�)��a�S$2�L/���2�L/���w(}Z�h*w�a�rK
%3ɩ3�ޤG��׿�N���pq(��,+7VP�p�c2�L/���2�L/�ߑ�n�B��BG��k��d�(��B��Tw�u���2�L/���2�LG�[84��S`���կX@УQ�]ժ�f\�^��J��v�;-�o!���̺v/����D�M��k77C_@"m2�4P<���>Lj6~����G��Z�يi��;�*�S�<��d9�\��;����h�tb^�F%�*�zE&��jL�K�<��_Z����*n�2�10#/�\�zb�����k��U�M+�Uu�Dr+&���ٛ����0-��i�����Y����ǡP�ъ�Kp�\*�m�doN�B�W�mϵp��ԣ���U�ۮ�s*,�w�ښ�n�&p�r�hʔϪ���ʼA��"`iݕs��W�Z��5.Z*�m��ʬ�]?�PƂ*'v���C�H�>�eQ��X��&h�ī!�eS�˦���zq��"p��N����ǮP����I�W��H+Bn��37�Ц��5�3\�Cf�3��1C�ۭ%���~s����,�eS0�*�Q����{���\�����A;�-
`��Y~wz���^�չ��L��Mri$}[ٱ,�	�x~y�Mp�?ɉ�	�]��T��M�;a]����R�J]Ȏe������r��yY^�毷�y���a�ʚ���	w%XV碵ƅ�>aA���M^�Y���L�(�<�\����&�1�̵�`i��U
nJ<*�=5����|E$�;�iAS���W���>93����b��l��Mո���y��O*kbZ�B^ZD��J'{�W]G���-�sca������K��T�'{��"�ߩu�ˮ�B^Ϝ�O��|#�Uz�y�i�a4v��U���F08�%�Ͳ����3����"D�B�ô,�wĈ�����R�ΰa�������t��PW�pL����qxd���h��=�ދ1�ֹh�t�1�fA���5.<v��X�_��Q��x���R��J��^�6dca����.z#����e<a���ͩ�$�S�yhU���w���0��L�~Ey�`����R�6�7(��}I:F�|tb��=�;�^H$wB���ؗ�GsJ�s~(�[G�8֓�m0��'�q>�'��~�p���,[ش��D��3���U:Ȳ�ԫ�y����,�	�����3���n� �R9�gE���JGQ�,�u�м+zf_G�h�!W�����	��y�p��!���`ZP�ը�wpo�b�����	2Y����d*7��u*�|���#��x���/.���y��!�T��I�D���PB��u��� �H�`$�M��E<mM��yl�,8���\o�<�C->VԻQ�@��Mt�"�6qhM��t&S:Y�² �2��LRY�""ע*������.G�%�d�$�6�L����n��ӹJ2z��$3&K�Nd�d�Ì�	�x:W� g��N���ޡ	욠<ˑ�8Y=W�K�	�x��;C�Nf	���x���kHe
2���D��?єyS�5Da={��<e�@ִ8v9�#->TEM��8��3r.�r�f�ʼn���)�t��iL��ډ]���|%T�1�K奕!>9=��$�-�щ	��<����"$�&�~���Ḭ`4��s���*�ζI'��p])��4G��y����
M<�$@���d�`w{�ј��K�L݆;%���LU7�\]�;s��3�g���)Ꜧ2/,2�����[�)�j�)s�>��y^Z*�dt��b��O�+L&
&U�xy�P��V6�9۟d�2�48ѓ`��+zf$��i�?�W�<���~�@�lc�0o�S_y�տ
�
�U��7-�d�̢04�34�#�,#QӲP���'{����+Dn�op"��Nw8��p!`,n08�e8��3�!���{�2�'F�Y&�m�)�
$��Hqv E"m�3���,�F3�sy,C�X�����D�CiR�I2��}\��7�%�6�d��b!�L�FuFbY't.���$�&�ņ�Dr?"� �6�gO�%r�sp"KO$WO��Yz�2���'s
��Ҝ+�����SNds:��uڇ�d
�D�d4�3��7��\��=�ϐ̚�D2�N��鈞��Ǯ萡��'u ��)3���E��Fr:��[Y�%w˂r����Fs �b�)���*U����Xd,�r߱�b&	�;v%@ϴ�D��VR
Sp�|�ݭ܂<B�bce�|���X��
�f0("籘�ٲ�̟<3��ԣ��EOG!>B0Mk�
KJ�/hZW�#������S>���d�C->V6�����d�+�xZ����b����uZp�k��6U���2��̢$�;��BN��u"}+���&Qݘ��Ԩt(�i]1�y�V8�X������%��}��7��p�L�eR�,��%Υ����+s��0u�т�R��̒uE�Bથo����e��=}�s>y!�2�~)B`b1����NJ�oFza�:���/��<XW/P6-/�Y�W�g\S?�C�_���7L����Ɛ�iƹ���+���W��'7�-�V�%
�]��T1L�Ron�;�ҰkE�^���e�[���T��L$�Ɲ�G�����^�t�$��YUm"���W_{���3e��m�xTJdt�9e�b�������Fn�Ǒ[HC���ez����w�e���2�ݑ���AZ�0
�����7�����|�Ppe<˲,W��9C�َ��2�L/���2�LG��e���}D�^�j@�O�,`C"�H$ɽ�ac7N���`�D"�H$�$7k�on?�D"�H$�	��H$��>b��,�Ϸm���c������3��i9�KǥK��%�ys�fe2t]����f�d2�;]l��e``�}bF�a�������e�k���_���)�Dr/0��!p8����?0�ǟ|Boon��4�2v�n|#�M�,�t:�i�\����C���5e�c��l6�i����[���G�
˲�d2��eaY�aLˣ�ێ/v��'�NSZ����|��>����d�p�>{�_���L�0f��3�i�D�Q�8����k^/��uU�~Oͣp|�(K��׬�Se�d���A�y`��=���_�O<n7��?'���{������'�<����r��v;/<�,�MM�M�����CC�Y�
�MCUU�o��O?��a�m��s|�u�/_������ͼ��G�|^FFG��|������M}}�{�%��?���DZ����W���.^{hx�w���H�H$~�\.�~�ɽ��mk��GQ^~�>��<|��h�K��G�z
�2���϶n�0�~?���\#��w
s����˖]u8t�0;w}���<��3�`�Ν��ҋ|�g/�@�
����@]����T�w����!V._�SO>��^Y-�H�q��Q��������;}g%�r���f���chx�G~���*~�ɧ�h�>��o���;1-��|�@ p��,�O?��t:���G�^�5_u]��y��x��.�����;Y�hK�,�i�淴����
>z���l�R�8��c�� ����O
�]�$bI�bV,[�SOl�n�� %���eQ]U�O~�#�.''O�b���̛Nj�=��%��̛�C<���3<2Jo?O=��D�>���}�)�i��G,Z�˺z�/���[oSS]����{���b�&����ӧO�dI+�����ѣ|�m[QϬY�
U��.�N+�/�e�<^z������M1kp��磮���j,ˢ`���I.��0�����k�~���;�OOO�m�0z��X�`>��
���_)�����(!hl�燯�ƩS���󭨊B�狀����2��F�t\꤮��ϋ"�

,�߂��v}��N(��$DmMM��!�HnC���R]U������}4͙CII	�)�TAEy9K���v�j:����C˼y����f����$�p��~,Ӥ����ź5k8s�,s���````�p�\�L�3�!��(
��%��>���`����m�Hn
m����abZ&B����J�Y�`>.���IgW7s��ho�ȥ�NV,_�U���PU�ŋ�}��I}]-N��K���m;�M���]�x����a��4�]_�&�SZb�TT�3�e'O��y�<�5�;q�Bo����'s!X�|e��>r���v���/�|E,�����bo�������lz�JKJػ?��X�Eoo�v�a��]�EQ��nLӠ����[8z�8��(B`��h]���[(+-�i��*+
��֭����9w��gβp�N�����F��G�z�����P��bY&�-s���2?��#����H�&�W_{������=ŃB�^�s��v�i��GUU��p�\���14<LWw7����Gp9o�8����z4M����P0HK�<L�$�v�j���	�Z����j�;.�L�x�-,��B]M
��X�t)�P����|O��`0HmM
�ee�����ׇa���AUe�-D�4�v[���Y��WA�4���s
w����jZZZ��AӜF���A���������pкx!�=�(s��PU����ϟϲ%K���dxd$ר�57�|�R�GF������
�݆���g����!K��
�z@�������^?�`��Cggg1�9
�x�B?�n0��A��%w�4'�L^w�~�[��LyY�7��D"��:v|�����MӾy��w]����"��ٳř&v��˖��f��qQ.�#�H�~�~��Ոo��Dr�#����g���X�L�J$���K$��E��I���B�&�D"�H�#��H$��>B~�D"�H�#��H$��>B~�D"�H�#�~˲8z�8�]�$	�8@4��r~e:�����>���3g���/~�d�=v�h,v�f�Y�?������˭9%���ŋ�-q|��cYgϝ�V�o'��K�5�����/LLL�uǎ����޲��d���)��~�	'N�*V�d2�o�y���ћ�7�H�ֻ����w�o�t���}����;]|�䮦��"o����}�;.]⃏>.����9}��
��u�O>���S�n{9f�,��L��/����6q��9���������!B�y�}�X�jk׬����/��EQ�x|3u���M�D"�������f�%<�qG����s�����P��I$�̛ǖ͛1��}��/�f�*/ZT��/����[�e\�j�v�dxd���U��r����u^񁃇���op��i^z�yV�\y���DrWb��^�����zy�-���8x����.�Wo�I(�瞣�����H$(/+c��͘��'�}Nww7'O��u��Y��~��_|��n�'��L�8~�On��ѣ��4��kV�ŗ�i;��ysټiY]g붂�i呇���P2�����СÌ����K/RZRr�o�DrC��^.��W^~��G��f�j�67���m�<}�ʪJ���tuu�ֻ�1�2w�m]M˲,������_����ZN�:�~̜�F��/��{���

<��C:|�}�**���f�j>��oi�x1��/��S�=�}8t���a���x��d2lz�a����rJKJX�`>U��w�J$w7n��x���!>B]m-��e�.^DUU%�ee4͙CӜF�/_f�}4��s��I>��3>ߺ��l��(�UU��q��b��׿��p066�[�K(�v�����m�FEE9�O����?����O?�Ɖ��ر󋢞������o�4j�k(//c��ŷm�2��2kp_ue%�Ms������B{;��s�o��n���!���m�0ε��v�j֯]˼�s�p��9sy��M<��#\h�H2�����իV�h�Bε��(��&ܸ����;:B��uη�O)�J6��a��v�jZ[34<£?���/�U�ת�+�����OS"�G���*V�Z���fR��

TUU�~�Z�46R]U��E���܌���Z�<��͛6q��IN�9˚U�X�b�UU�\�3:������.��"�I(���g������c�ܹ�_�`hh�sm���㴝?_�3-��]5�is�4RUUņ��p��w��J$7ŬK��ȶ0-!s0M�-�=�ͦQZRB*���͛������#475��UU�����RQ^F(bnS3�������S̝ی��ttvq��	N�9�C<��cǹ��ɾ������p���r]g�c���4��*��:�a�t8(//c�/Y�|ccc���g���"�yJ����4�C���������W"�+�L�0�L�21M����X���l�\.'�Μa��f,�"��~�"�O�`~K��p��a��J��e�<.uv284����ee��ձh�B�67��b�޻�
��q��9�/��P_Guum܈�天q��E=�x<��p!��,���p:s�����K��W_{������=Ń�e�H$��2��G}]-#���>s]�ihh���>z����l~��[BP_W���0'N��n��n�jgΞ�����{!���tt\bAK[�L*�"�����<��
�I�R���lj�b�2f��ܜ/k=����q��������T�tI+G��̹s�219��󩨨��*�˩�����W"�+�d�h�Ƃ�-�SiB� K[[�'

1~���tuu�r��_|�X$ByY/<�,�[�184ę���B�X����=>r���~���Y��‚��9u��ݗ)-)AQ&�Q~��"� �I�v�j�Y��N199IsS�.`ph�'O�p8q8�l۱���zz{q:�,[��T2���--�pJw��b�&���$�I�GF��UU����4��!^�uk�c�)/+��L��s麎v�ƞ)Ϭi,e��β�in���o���s%ɭg�]�������U�ꛭ�7�n�47�g$��]����"��ٳ�X4��ΊeK�d3\�����I�?�<�����Ϭ�S��(Oi�%�ۏ�n��b�ۯ��f��ͤ�-�4���r[^�Dr׳z�J�����D�wi�%�]����p�i1$��҇%�H$�}�4��D"��GH�/�H$�}�4��D"��GH�/�H$�}�4�oY�ݗr{�w\�$�J�i9�U��rO�D�N�rW�#��j���b��^��e100����.�DrOpU�����?�3�T�}��_���w��R)��_~FgW�7�K�u��%��P:�X��;� ��޴<_%��^����70�x�`�={��<|������w��÷�3e�H����B�Ԗ-������[o����/<���a��l��͝���0G�Dz,֭]CYi�mZ�u�?AWW-��l�R������NII	+�/������n"�55լ_��t*������!&&'9v�8�L�˖�r�8v�8�T���z-\0���t�c'N��?���I�R��?�3֭�'��x�� �i�f�*mm�I��

����c��|�ݷ�(��v���b�O<�8�ܙ7E"��f2�;�Ư�x�P0���跿c߁�OL0�����/��(繧���>�WF4�����+�cY�Rg'��ژ�2o�k
��p��!�v;7�'�q�rkV����E�d񢅜>s���innb�ҥ��Y�3���eŲe�V�K�3|��o�� c���������|�oDz�Df�{o	��I�II���TF嫻�̼7��Ξ��G�ٳ3�u��iS�S��(�P"e(Q�� 	�{o3�қ��#�A@U��R�C	����{7n�w�u6[ҍ>���-��������O��'}�5�Ws��1N�:Mqq�y�e��G?⍷�"��f^�wY���g8v���5(�B{G���:�U��7\bxx�)���6m�k������ҥD/$�`(�K�������F.��s������ؼi����o���_g�]w1==�N�CQ�ĕ���#�02:JSKkW����g�6����e����w�1::�(�RE�f�_�[Tn
�D(–���3g�)���ꇢSb(�H�����7�d�;8q�4333 I?y����#k4K�����W�Z,L��06>�};w��wt���œ�?Fk[���A͊j�����I��#G�Q[[��g�]���X�������Ē���JJ(.*f��H�D]}c�
32:��祪��P(���˞�<��(��غe3O>��֮�ҕFrr���7���B]�`�U��<��c������Gsk�>����L������?���|>�]dff��SO�j��Eswgff������ ����r
���u�}���b���D"�|>���������������=�͖DSsK�<����#��<v�w�����������CRS]���,+W�����G~��Ք��u�֯]�
�����o<Ϯ�����Ӝ=w���7��PVR�R'����\i�����.z{����䑇�=oPTT��5��r����A������d`p����D;�aݺEs��zjW������y��e�P*_ԒS����m$I��t��
��v�OvV����t~��+�F�{��e)�,�ؓ�ikog��LF2��hmk������6�R��j��<߉�D:,f�-��t:f�^l�$t:-�M�8�v��)Ng�*��I���H<o�
f��$k�>��O��ݻ��������CFF:|x���Y֮Y���0���0�*��P(�N,,�b6'��zQ�����`J�^��b�{z�����|��3�����f�_+huZ�g����&/7�VC������J0�3�a``���t4����Nj����xPT�I����Y$���:�))Ԯ������B(���������q��8232�vםܻcyy�&��рN�gllPQU�'��j5�2���-�DHn6�g�}�o�����-�7�G�WVV�p���Ƞ�����f���8y���ϓ�de���HMM]�K�DFF��-eIDAT�4r��9E�M������ٳ<���B!,V%EE�P^VFUe%�O��[��{��EQA!gΞ������怜���	֬^E0��W_�ŋ\�|��C,��x<6lX?w/0~�1-5�Ԕ��[pM�IOK����VC͊�L&<�.]�ByY)�%%��/�Nǚի8t�(�N���F�����h4r��ERSR��ʺ��A�¦gf�|���ebr�����Zz������d���/��i�j5|����Sb<��T���p�
��D�ctZ[��G\i�Jk[�����p��Ezzz'έmm�����vٴq���4\���墬���R�\�̹s��F�y��׹t�
�._��f�tvwSY^�����p=EQ�x<�f�i4232�)1�Sn��{v�;�� -5m�
$IJtu�b1�� :��VK4%`01��,7�1
���G"4
�,'�$��ߗp8L4à�#�2�$�F�O����VU�H4��CY��e�@ ����bN�k���ey.�����UU�F�h��%��F�8I��X���ם�����d]��b��}4E�$N�:͉S�����ݎN��r�p�^�F�I�����%�P(D,�`0$��|W�F���4
z�>���v�h4$��<Y�A��,*� �(�h���^\.7W���D�׳�v%�H�ή����w}��h4�n�9:�.Q�n�F��jY�޵并�����4e>�_��$I���k��R�^�����$I���<�uLo��/۵���k_��MI����vtZ݂ϯ�����y���O��D�_I�0��>���Y�+��2��N��ᦷ�v%�E�h5���/J�"Anz���$������S� �p�_An#"�� �mD~AA����/� ��E��3=�H�;?е�V�jTUeff�H���GA��D�Q��@b��/���n�n	��‚���*�}�!/���X�Ʀ&��g��gz�F��+��_����E�����|>ߍ.� �RZZ�x���̲75塵�=񺳫����ϵ��o����{9]_fA�:X0W�$I��8��G�ʑcǸc�F�*+��顣��ހ�b��������(IV�&�PU���~�[Z��aB�C���z�~?�]�m6&&&��泥��P(��n'�������F.�ձn���N[����?�˿�Fq8$Y�7��%���^9�^o�=5��f�ミ�w�>��҈�b��W�fpx���FK����C�!��m��JgW7�/_������QZR�h[^��Ʀ&&']����z����`�����f�184DsK+����������b$�l��F�|p�{��#-5�5S	�y��_4�ONv6?�������w�E}�%�?p�������w_��184DA~>V������
�/�λ���� ��k������*�_���j�=5����?0�źz��Ji����뿦���uudfd03;����##��p�����B�23o��Rn9�$16>AgWm����>����%�+��띥�����a^{}�ee�����SO�w�>

p��K��H$��=o��`�U+WR�������l��n���w�I4�/�Ljj
>>�^�����y�����y���F���ehp��K
�$=��’3��Y��ܜ\�ܼ�V��S�pOM�t:hljfl|�ш��dee���<�� �,���Slڸ�����')(��'?��}�!֤$���~�������b��U��ܳ����{�;;9�"O?���%s�s�k\�jee|��gXQ]}��D�p�QU���R���wx��=LLLp�ƍx}>���o��>��+ع};G�� ;+�?�ɏ9|�(:�V�e�M<�Ѓx��%����s��Y��*��p�G~��w����?��u�f�ٷ�޾>���������D;��i��yF���w�A �/���y�f��7U�eLF#ڹn+�NGFZkW�f�����搞�FqQ{�|�p(��>�l��h4Ly<���*Z��^��b��3=3��3��hB�e�Z
��N����r��e�QU�Y�71�Q#kP�x��F�{	­AU�e)�+�ij��!<�v;Yfff��� IHį�
s��<�^/�PU @(B"ޫ�(1�f���l޴�d�
EQ����b13;;K(F���p8�����f�EEt��&�Y�A���>**`4�j4�A<�#��Gnf��ϋF"\��HMu���ؓ�iljfhx�p$BA^���Jc#1%������^�˲Lr��#ǎq���H$��`���~�Z���i���ĩS�OL��c����uT��s��*������G������4�׭�Ï>�Ï>����wtPY^N �S�8S��^�}�[��$���lܸ���>4-�jWR�흝TUT �G��;i����`hd����2�;F}C��cԮ�����7�����h�t��2���\nldbr�ݎ�*=~�|��tvw�FX�v-]���2;를����\��3��_��7�>{��g��F��(�ҕ+�wvRQV��,�������v�٭�ؾ���O�������d2&��fff����%ے������d����\������LO�`�'c0�F"��f���۳�^�FV��@ Ļ�|>?��V���&
a6��Z,�z�D‘���jp���b1ffg��-�L�_��H�P(��b�?����L��:N$��LOc��X�'����AFz:����^/&�	��J(�������d4
.��p$���@���IJJ��(
V��ߏ{j
�N���@��.lg�O�pR�l6a�Z�|�2�+~�F�F���r���ܜ�����Y]��p$LgW�gt�KII��l6l6[��n�n�߰�>[�ᚧ
Â�����6�vjJʂu:��ki4�E�	����t��x�5�Юr&���#f��Ғ������vj^zz������lN�g6��^j�׭g�2��N�F�WYQ���X��EA���/�M/+3S<N+_q�,� ��A�6"� � �FD�A�ۈ�� �pY����9��}��Ŕ�=�
���oCn'��~�uj9�A�U,������wxg�~TU�����ٿ1�r��r.IUU��6gϝ�ɓt��~%�9v�$����.���˱'>w�3>>����r2A�)ttv�;����~FFG�?oQU�7�~��..�~\_fA�:X��,K�YU˿�����q�	rssp:��nf�^���0�LD"&']��Jjj
�kf�[.����q�[Z�Z�s�k�Ly<D#4Z-&��ۍN�#%%��G8I$���ypfv�V��=���d��a4���etl���4��χ����a��	����$��h�|sSϒl��t:�������),,d�=��pؗL)*�*����uTVV��d#33��g�r��Y~���b�Z��˯PXX���`1[�}�CaLf�s��%ځ�Vl��%��FC�ѐ��N("
a���|Dc1�m6fggq��p8�{�����"99�}a=UeA�3���
��l�>�eel�v'���'�55<��tvu��޽H������?�LJq��*)N'�e+t[{;��y��@Gg�6����U?G��@��PR\��g;--��}��{�`bb�d{2�������h5��{���ș�瘝�RVZ����˯���(t��q�z��CΞ�@r����)rsr㹧�B�%�6���O]}N���ǿ�˿ ;+kA�{z{�XWOk{;:��G~H��-�V$Y����}��g�5��=ƹ���o������Z���imoàד���o���� ���o��������s�@'[�شh;������innEQbl��n2�3xe�n�z�q�;FuU�W���+��D�h�|�;������=H�Ī�5<�kׂz�8}�,u�
X,^��7IOK�c� ����I�Dzz:'O���������{�`l|���b��<I͊j�65�D��έ.��������^����a``����ۓ�����c��5>z���T��q�Y�?u�5kV��SO�?0����m��܍�b!�r����O,�⧄�a‘����t���Oq��I~��	�#
�p8���B���t:��w_�ԙ�def����`;�=�ܹe=�K\%����!&']�Ϳ�w����YY� �_��O),(����;�l�]�������0��?gfv����{z(,(�kIOK���d�vF���o~Gqq>�����z�qfff�_��%��<���8u�3gϳ���uIq:�t�
9�Y|�;ߦ���u�T�ӡ*
�`����=9Y��	7���η�~�N�#َ�bEUUB����dO?�Y�Y<��Ӭ]����x�s�xoMUU�� }<Þ$�+�|���8v��������䗿�-]��h�Zv;6[wn�̥+W8u��6����s;y�>^�}>?Zm<�ȧۈ'��X,X��F#V�UU�$�IX-,f��X,F,#2w�!� �2:�I�����m�`�c4�Z̨����G��\m���^��IjJ
~��@ ��`��s����
�Q�^OmM
��o.�w2>��рN�#���0�<����0�Fz=��.Xw,�`0 !-(� ��>s����&Iwm�ʻ����)N'�H��g�211�4��r�
��r��-���^�����袺�
�,#�2�I���x8z�^�Y��$�F��ZR\�A�'RQ^Nk[;^�����Mֳ�����CCTWU"�dY���#Ksیo[�edINCI����a�/]F�e�?��%%>v�������	�p[����"�2���������7/��7�}���:b�X�e���V~��_308�#>�$K�y�m�����beM
��r��
��~6nX��������F��G����9z��Ϟeú�4^����v���ĝ�7�筷�fE5�99�}w_�)Ue�ڵ&��3��t:�������{�nu���~zo*�1<2BjJ
&�	EQdl|��A^^.���
��p�)-.^�}����ߏ���j����I��h2�HKKE#�tuw�rO���A~^�X�VlI�A@c��(1���L|>�m�h�Z�d���p�\

c6�HKMC����##�dd���F����mk�$Y&��dxd�͆���r�]����j����fK"+3S\-�������YY��nTU%%%���!����bQ��HN��������ч"?/���B��@Vf&3����F����M("?7��ȴg�������t33cpp��DQQ!F����������Ɩ�Do_�
**i)�dee24<<W�e�,׊F���r���ܜx_�׳�v%�H�ήΥ� ���ȱc�9w���7�A����y�H�+�M//7w�3�� ��D��WZR�hľ 1�LAn#"�� �mD~AA����/� ��A�6"� � �F~UU���8y
���q^z�5\.��.���wt.z?s��<�^/�h�OZ8������[V�� ��]�l����7��QU��~•�W�}?D[!|-��$�����}����c���IJ��}�}(��X>���gf���!�
����!��(
��.&�n~�_���A4M$��B��J4evv�p8�hݪ�r��~��+�|>E�AA�u

s��	&&'�z���;?�p��o�ɩ3g	��v��(�Ph��B8̤�E��K}�~?�`�OI>Q�(
�H$��V��E��]������w2��˟�9�.o�����Y���x��8{�<�Ϝ�d6���-�������o~�#���B!
��y���bdt�CG�������>Ô��źz��Iܳ�.N�=���f���<�kZ�S��?p���?�ηE�mA��I�Dk[���_�Fy���㏹p��P(Țի���'X,ffg(�/�?D#˘L&��ܳ(���=o0������իmGUU��<ɩ�g�e
?����>�'����dddpǦ�������������<C `ϛo155%�
ᖲ��V��]���s�6RSR��G�OLPYQ�G�����h5�s�]���,k���'cdt���2���s\����=}}���RVZ�SO<���Z��'0�L<��\���h4�g?�wnݲ�"C<��
����[/|�Cd��/������O~��RSiimMԻ��uk�R]]�};w���x����g��h4����G�l6����ښTuqϣ����w�������ٻo?��ؓ��/�~�Z�_�@���Ԯ������9x�0&�I��-g�}����LRSSQU�	��@ �{�ͺ�kq8���d�����7ik�X�BK��,�s`gfd�����d"� KIIV�F#i��X�fdY&;+���l<����呑���n_r�6[&����t�Z1�� |�T ))���TRSS�����0�Ld��cKJ�l2�p�I�ِ$	��IIq1+��att�ܜRR����%��23;���&�j��������*z�������p01�"16:F͊jR�NF��D[!ܒ����j�ޗ�(H�ĺ5k8~�$�)�X�V�
W����l��a��Ǘ�Ъ�&�)�2����L���p�����\ߪJ,� �2kV�����z������c�P�rl����aw�⍷������|��N�pkQ�x3_���'�)�\�������8���@UUzzzy���_���͛�e�Of�;KgW���>{���$YB�׳q�:j����t��䐗������o}�9�46Q�p�ښ\�҈-9�$����LV��r��a�V��3�>�EEEX̖ě�F&=-���|�F#�y�$''333��a'''�ߏ��b���lݲ�2桶ْ(�ϧ�����|l6��瓛�Mqa!�e�u:��2��ͥ ?��N~^v{2�������dKbzz�ل�l&-=��R���0������n��In)&����\233��l���Q\XHVF&����\���0�M�����ilj"##�ukְu��.{{2�@�իjY���h4J$�b6c�Z).,d��U�Ba��(99��F232�w�v�����TV����=����˥��T��׊�(x<����5
�Ĕ�)7��=���w��*��ps:r�8gΝ�?��@��|�
�-(���ۋ���js3��D�׳�v%�H�ήN1�� 7������DC&ŸN�F�W����5+�$�FE���� _"�—C~AA����/� ��A�6"� � �FD�A��Ȃ���*'O�������Y:�gz�F�s���FFG��F�������~�3���,{:aA�&&'��ظ�Ƨ�4��|��?DUU.����ۻ�q}���`� ��w/�ʤ��Gr���s�P.U	oĄݽ�*�|��H$ʥ�W�����{�=��O�>�ǟZ�����=���I:@k[;o��vb�}����W�ຂ� G������3�����Rm���ퟖY�.L�#I��V��?�3�����p$�[����8kV�f۝[ini�؉�h�Zz`���V����w�14<����U��b��Μ�ͽ{����SO<Φ
���x<϶g���I�}�I�F�[�mC�[[����3��<������qOM����7�z�-��~���b��<I0$��䑇BQ��C��45��rE���j�z��G����x���D"�9w�Gz����c6�شq����.��
y�]�#���������Lx��w�XWϤ��7�}����MQ.�EW���'{���N6o����<|�1]]ݔ�;������?$�n�L&Ӳ������n�\���h�-[���aF�Fi�|��ANv6�7m���h��'�l�S^Z��MEb
AXn����ే&
Q����r��r�v�VJ�����cum-�+k��������?8�{~ȁ��r�x��').*d�����i^}})N'�P�7��KFz:����GN�9KQQ�
�8v�$UU;q��K�9t��c�<��TUV ��K�^OEY�y��}םX��Ϲ�pc-9�/��0����x:̾>&]�ttv�t8��<���h42<���^�B�2�\���2V���d2
��e���t�d*+*HKK]���\�deeRZR"�~�2S���4*��(��#����AJJ
+�����$%�IqQ!�99H�DVf&�7m⮭[hmk����ڕ5������[�B�)��#���zI�ZIJJ��mwq��JKK��˥����EKkIIVTU����U�+YQ]E~^ޢ�5
YYY�8S�Y�B�
_K�կ�*1EAQ$I����h$��M����l6&]n֯[��w�q����ʊe+�<%CQb��*�b
�����'1�M��J[{�z+QUQAr���˗9�"kV�W�����⢨�����s�Պ�����clڰ[R����EUU��'h�t�s�/����,s��YLF��������� �$��j),(������J��q:x<9ʽ;�s�����NJ�������U�0�M���1:>��0�̤���lK�ҕ+����BA~��{��7`���~��y��g�����y�VJ�K�Z-����i��D�e
������js3�%���,��W���t232(��ǞlG�eJKK0��T���������~._i�3=͔�CFz+���(�BqQ����e��*IV+�EE�@zZUU�H����4�%%����r��Z��b1N�>M(����Ғb�~?�==�fg��v%cc�wv��x�z����R�����nFFGIKMC�բ�>���X4ƚի����
�(,,�����Ok{;IV+&���u�LMyp���Z-�Ԭ@#k��������p��pS��C `|b"�F�!3#���=�Fڽg��c��R�荨�*g��;z�8gΞ�?��'�� |�h4Joo/.����͉�Y�z=�kW�����b���/�r0��8�N�3'_�_� �WkݚլZY�V+�,A��D-ᦧ�jE��/���_An#"�� �mD~AA����/� ��A�6� �Jk[["�}0��j>��F�s������ldt�3?�2E�QZ���7�p���쬗���/%������2q�Le� |�EW��-���׿���q��I�ٷ����G}#;q�\��g�~�g_T�����W^el||�e/_i��7�P	�M�����{�$�$���~A���wt��u)�¾�ާ��a��2���c%I⾝;hjn�7/����/<����?9���(kV�b�r��y��u'��ZpI��t�8�"7������ƫ4��P]Uŕ�W�%���JΜ;OA^������att�����`���.f�^�JJX�f5^��c'N2�r��v%'N��܅�L�\l�c�5q������_��frr��v��`�ߘ�� ܄b�(ͭ-��7�%�f㞻����8{�<n����R^z�URR�<��Cp��ﬗ�4�nٌ�(;q���6QYQ��9q�4���wo�������;�r���իj�p��ֶ6JKKټi#�X�S�����KUe�6lX0s`0d���p�I�$�?��FZA��]�'%%�ԓOp��1V֬`Eu����ٳ^ym7}�����]==8���Z�`dd�߾�2�����n���>%%%���#IPW�@]�%b�¡#G������c�����*��x���4��<ɥ+W�PW��g��t�y�m�.�w�~�{z(,��b������l"-5��N&���Z�X�VRR�Ȳ��X�'K2e��4��r��9�m6�f���X-�f3v���NgW|x�A��c�8x�0���ŋ����f���]���K��J ������y����g��o~�����h4r���������Ļ��զf��<ũ�g)��'�j]2-oR����	�p3Y�Z��GIQ5+V�x����1�z=��^��07l���tvu-kJ^$�s.����?�����㞻�x<��n$IB�edIB#k�$	�^Ϻ5�Iq:�����-[hjnajj
I���
�~�	�~?'N�bbr���JK�t���ۙ���+�⠞��Dii	^��͛6-�������mw1:6��秼����vn����<jW�PS]���'������#ՙ�'G���jټi#��˗�ֺ�����#��X���T~�������=ˊ�*^�����tv��|LLNr���-wlb�=�,�:�����6aǍ>���-�UUE���{Y��®��C�ђ���F�ᑇ�w�������>�l�VbJ��]����8�6�����z���HOo/]��LLN��~+
1EAQT���*�Q�3=����C��T�9w�"����d=n��W���� 
��ӋAo��v'ʗ�������m�ήn�
DrA�����b�x���z��)��W�RZZ�^������U��ʬ����0����塑e.7^������I�JKI��k42f��̌jW�PRT��j�p����մwt��w������=۶a0���exx���W(*,�j�`4���������^�K������x�� �A�g�}�o�����-�7Eabr���r����dg�70���W	��s�Ξ?OZJ
;�߃��X�BOMM�����;�s��֬^��q��9"�(�U��^Y���f���IJJb��U��J~^6[~����Ly<dge119���g�� ���E^n.u�
\�z�=���2�''��`EU>����.TT�=J{Gm�8�TUV������,�e�h�"��<��O,�fE5��3$%%�j�J�&������T�65���e��?9���&��'{���r::;i�|�^��U�ttvq�q�;:������".�7��Ӄ��@�`ph����`�M6033K}C.����b*+*����ŋh4�Z
���Ok[-m��4ZjW�d�5I_�e�L"�7��(x<���<��h��� ��pO��v�٭�ؾ��Դ?��p8�V�A�eE!���n���H$�F/���)
�X�ۗ_!���s�b2��h4�N�Kt���(�B,C��.�d�`Ѳ� |>��UU9v�$�Μ���HJJJ�qUU�F�h�Z$IJ�V��z8߻0���Ͽ�D"Ȳ�h�TU%���j�e�3�/�p3�F���r���ܜx\V�׳�v%�H�ή�?.;�^�K�,��M׭�����j��{�Z-z�nAП�O�u'�\��D�YU^4��ǹ��Ļ�5��K���vm��$iɺ��h>����]߆̏
�C��\�:�P�<���]A>��ի��(G{��.
�י�s��[A�iY,,�_� b�~AA����/� ��A�6"� � �FD�A��Ȃ���*c���z�Dc1FFG	��7��LOO'����'
1==��`�r���|7z��ka��'�����H$B}C���L��,�?��9�{z{iko������R�)����6^~�uEI�719Iåˉ�L�{��79{�ܲ��e���s�K�DfF��r���ݱ���b�65��Ԅ^��f�15塮����~����u��K��$��+V���F�&�q���t:zz{�p���x�1;;Kgg�}��l6���hjn!�!kd~���8{�<ɶd��\�Jc#�P�H4ʿ������'9�FjJ
]����7��Hq:��5;;˯�YgzZ�M3�� �z�����a��cc���v>�����-�H$�/~��z��V�������n�>)N'���x�����SW_OqQ1�%%����x8w��##�����L���Cjj*���L��8�E�ZUUZZ[�k�D$]T�Ñ���ϻ���j�����h�OAXN�w��E������w�����]l޴��g�q�����8}�?���yk�;���(+-%'+{r��$Ix<�����b���il���6���7().�jSe�%�tZ��ذn.��������p�C�u�V<�i��܌��QRRLgW7>���?9����bff�V�{�C{G'���MaA�N���o}��e�D"LMM��xU# ILO��]`���♞�3=���0f��g�FÔg
���k{��z�*�{zy��8����TWYj��P8�˯�FQ�>��Cl�c�������r�J#�?���W^I���~�۸��x��)-)�b�P��z�*
n�;^��!V�\	b�!�k`ə�VTUQXP��
�e����1��j���Յ{j���L�����u����!>O~[{;>���W�{j����.��ܜ���o��ͷ���$֮Y��?���f�ڵ<��C��^Z�ڨ]Y����ч"����C[{n�YCuU%+���k����}��8�

��׿(�;�NV�Z���M4P� |JUUJK��Ƴ�����֮e���g�A�$*++�����-[8z������{�����9z�N��-�y�������)/+�3=McS�=�4?����'{��7���oZ�����ꢣ����7��{������u똙���[�C@nf�1e��N���"KX�Vr���v��s�6rsrp:�����k�A���|b�
-O�[TX@0�p�˅�h$-5���^�'&�|��Y�'��''3<2���ccc�8�t:����t�c�x��iniAQdYf|b���i�f3�iil\����7STX���gfv6Q�d�
�F_��$N�S4
�p-U�g�SUdY&�(h5|>##�������ށJ,#�vOa���h�

�v��|�����~$�%U��d�����dM"�084Dzzc��x�^,f�i��z]��G_?�CCLOO'��Ny<�]���h4|�x����%���т{��(m��TWU���$5%���V:�����s��1.�ա��s�f232���>��c'NP^ZJZZG��`���Ŗ;6�����S����BnN)N'&����r�23il�ʹ�X�<��#8�v�_�������ttv2<2B,�v�J��r9q��̦
���%�|I	�]���n.��s��{r2EEE��x�{����D:PA���n7�׮ehx�^��U�4�����AUU%F����Ndzlj���#z�z����'������g�r�b�33�ZYKSs{���b]=W�^������T.��388��nGQU.�������aB�7��p��V��na�s��YΞ?O4C��7�{��.r��EaEu���wtPYQ���잂��>�=~i�����;HKMK,��*�p�N��R
�B��>�F&��`0Ĭw�Ʉ�j]ִ��X����z� �,˘���a|^�{����x����OSnF"B�F�1�� �*&�	��G �j����e�@ �F��`0�D���E��a�X�F�C�D�N���^�K��h4����p�X,F,C���FAU��t��a��(&�		��̹�=~�|�ERSR0��@���~F#��h$B8��EI�l2!�2�^/�H�5��7�%겢(���|0��a�X��T@��a4?-��(z��*���ۋ���jss�^�gu�J‘0�]�Kw�K���`X��`�x�{f�	��Ɯ�j4G�e�����ȱ��n&&'���{u��t�Ey*�j�`�.���m�t:�Ng�^�_r4�i�t���F�x�EwMo�A��pM]������,o2�3�F��F�����O��m���F�q����[�Re���-�l0(,(�f����m��"	��լ�&;;����/���jW��b��%JMI!5%�FCn	��YAn#"�� �mD~AA����/� ��A�6�(���(��'��nt9�����#��~���H����z��1��w��(�H�#��E!�|i��M�#�� �+������7�DQ������3���nt9��䤋��%f���{�]�>z���I��_�?�կi�|�����\���b�$f��օ�:����ddd�O*��(t��.�~A�ٵ���o-8�������:508�Ԕ�s�o�or�ܹeߏ��,_��$�$��w�{���C�����ve
#����0
�Az��q�ݘM�eON�r��w�����t1�ra���u�!��{���!))�h,���(���LLL����F�atl���A$I"���wp:���b�%
��|���q���]��Ko_?��h.�u�\��P���������0��^l6��X��G�q�����=��\.�������g
�Ųh������~�F���p�9…�����v^�����c�yg�>��r�)
���o##-�&��gbb�h$��bAUU�GFhkk����defRVZ�h[׷>��ۍ�O��z�X�L�\��`���sll���^E�j�,�z;��ɡü�o�g����>�\����8�v@�_�
��Ƴ�����k���+��������8r���㤦�����k����ۗ^����3�Α���[{�����\il������r���>gϝ���eb�(z�����399ɱ'�F��b
���_26>N("#=�}�O8�܅�L_��ӧ����AOGg'm���b1���L0:6� ��_��)��	��tvu���JKk=���:Lf3�_�}���ʑc�i�t��IEE������?����222��ɹ1�4A�
���(���G0�L4\����+���F�p��A�'�INNf�3Ϳ���q��I222��7/���墵��U�+)-Y�c�o��gϟ���e��(z�����_b2���h�Z����o������084̯_z���E���h�\��P�����x��g����>o�_rpߦ
IOK����`�s��1<��$�l��70<<L0D�e����53_,�������S


���Cx}>�n��������Ç9v���X,Ʊ'	��$Y����玲��`0H(&'+���J2����/��{���� �2�׮eeM
?��y�]Y�O~��%g�)
�@�p$Bnn6��S^VFgW�P���6r�-���s�uz���;�8��OL���S���w�\���k������|����-���/�������ӟ���55��r��-Ԯ��G����Ī�+y���u߽D"aRSS�˟�7�烏>��ì[�������fժ�r捎�q��	��
�s���?�#&����n�ŋt���t:����rc#�N�`ͪZ����+�۹#�[`^�w�R�r�g��p3Z�_J��`�Z>MN��$'�(*,������l�x�Q�6���}��>z�e+�$Is�TTEA�� K��ٌV�%���jIII����Á$�huZ�x�AUQ��(��.r���6�����h4�t�$PU	�e	Y��j��zb��p�N��� �2jb�^O�a2��˨*�$��j�D"D�QԹ�dV�Y#�{#%ѵ�dY��$�p(L,Y����N�V��h4�)�Zb�(�p�łF�!
����0�L�x��bD"�����z� ?I"�.l쎹��j�N)���j!+3��{�ʊr::;�D#��c��P�����s�@�^/�p
_�yC�h4"��$����y��\mj&=-�`0ı'A�ՒdM�c���h4�w'{�|���A�}�)^ym7�!/�c�ٸ��{�)1:::p�ݬ����p�3yI�\�c������N33;���D��a�K���h1�h�:FC�������K4�7/�ijO?Ň}L��ii�lX���؇�L�:��^�h�*�J9s�o�}��k�`2Q�ʊ
�N'��L~~���SSSx��'�(/�ͽ� I�׭��_$A�ch5�D�O�N�,�u��/��w^x��U|t�,s鱯65�O?�W��)�~�q$I�ݻ������kV��O�X߀,K�F�x�Q�ܺeA;���Ϲ�����q�Ü<}�M����L�ի8�VTU���m���nz��Y�����|������**�6n���"Q�o��
R���'7#i�����;HKMK��(
.��͆A�GUU&&&q�]�l6���p�݌OL�l������Z�<fff�'۱Z-�_�_���aúu��a�Z	����HO�b��&��$���t
��*����F���p:���a�`|0c(��` �t:QU�I��P(DzZ���A ����t0��;3��ȒD�GR�b�0>>�$K8��ϝ������0:����,&]��u:=Y�D"&].v���B�f�|8�Nf�^TU%�fcbr�P(DfF��06>��l���&>��#�}�)������`l|�g�Ō�����13;HȲLfF:���k��^���'-5�)�UQp:�x<F��1
def�����v�����lftl,Q���Iq:�@fF��'�P�h���^\.7W�����z=�kW�����\:��Dc1����/ܹe3�l�v��#—�ȱ�9w���7��^� q�7�����$�㞻��ξ�E�+PR\�e&��vk~Yf�w��b�����#?/�FCn	b(� � �FD�A�[�$}��j�Qq�GME�A�[��h��p �=������[�� � $&�+**$#=Y��Z���E�A�[�^���,��zQW��,5�� � 77UU���������|t� ��288��寘t�nt�/EQ8s����#�������WR�CG�p���?����(�.Wb>qA�Ut�����M_
�|�k�ǃ���RU�w��G]}ò��e����_�$JKJ8��AZ��y��F���|>&']D"����f��YPy�C$���3==C$A�$B�0�H�@ ��׋�����1>>���,�����ell�P(D,��tvu'��TU5���g6:������7�L4ML�
�	��244�x��r���pOM1==���*ݽ���������q�[���8�Μ�`�I��X,�źz~����v����n�;���#�����I�3���� �W�261�䶮�k�H�@ �n;>-w��P(�`�m ��t|�Ϸdo腹2���eo�O����+شa��;�EE|�⍷�
������'��؉�\�XG�-�Gz����e+�LJq�4��ә��=KgW����믣���J�D"���Dx���9q�4cc�Ĕ�ii���o��xw�~�Z-Օ�<����ʫ��O��i����|���(���_b4	C<��c�#a.]��.�����V�E�h��x��+�0;� /�g�z�c'Nr��$��N6nX�`��r��IΜ;G8��o}�����5�/�$K�wt��+��yy��9|�.�!�2�W����S�7\"������w�a2���2�z�TU�ݯ��hjna������*r(Q�v�w/�$o���<�$G�����������{�x<�l6���7���� �W�����f��@ ȡ�G�kh@���o������\�},��/�2��܁,�l�{����� �)J��9r�8=}}������ضu+�i��Z艉	�RS������ɓLLN233�gz�����v�u���"�(�Μejj��kW�_����ˍ�b晧��g����&b��l����na`p�+�Wm?♞�'�d��;��{��O�B����B$I�㙦����26m�Ȕ��> ++���?\ԝ/I[��DEy?��w).*Z�c,_%UU).*�/��'�����͛YYS�^�wl��ʕ5<��.�߹��Y/?x�;dee����G����_�fU-��������߿����!�E�����M��p���*++h����:Lff&�����w߳ ��LF�n�����}����O�zX�9~��@ZJ*v�=�=3��*Ĕ;wl'=-�o>���%���.7^Y�BK���� ;+����<$	z=�YY�t:���--ȲD8F�hp:�8��Fc"Ww�-{r2:����a��C�Ȳ�p8����ߊ27X�&YY�����G"񮾹e%	PUb���9����ghh��_{���QB��P�ܜl�n���,܎�l��=NJY��[��l�f�aO��(*f�	�N�-)	�ш�`�d4a2��$���$���),(��v355E�Ӊ�jE��,����������e��t:������t`�X���AU� wn�BnN6�)RR�$''c�Z�Y�2��x�m6�<H��X2���TI��c�F>>�	�OE��*��	�(���o�\TU����,������>�~���P8���(�%�H����������p}����ebb���\d�L4c�I��pPY^���8{�����kW������G��W^�Jc#wnق�n��'��٬x�g0���̒�����Z�~?N�#��w�;���QU���|֯]��`��W_��'�s����}sU%==���i^~m7O>�(���9v�-	UU���d�o���Ǝ{�F�e���>�tttR]Y��c�ilnF�$��ukV�J�5����0r�}�{��p�s.�f�j�46�I��p�l\��}����e�dff��N.TV׮$=��2?�أ8�}H�Z2-o4������,�V+�X����GFHKM����޾~��Iq�P���ɴl���˯�����-�)+)����޾>���irrr���tvu��ۇ�n'##�p8��a'�j����ܜ������F�����G[{��I�'�YY���2=3H��&���c�{(�ˣ�v%:����&���Iq:���h2r�J#�i�����|���PTX���,M�-�B!JK�IMM�ʕFB�0��;��(gdt���1*��HNN���A�Rx<�L�&).*��q����IGg'>��UU�#Z��HMI��������{wR�_@uu������$+E�������k�TWV �2W��k���	�KK��	
����炙���BuUf��ֶ6�GF)*,��p��ښxD*';���|����TWVb4o�nc�h��������/l;�:��7�_��%4
/~�7d��]��/��?�?��`,_�#ǎq��y����ȢK]��y����qIq�
M˝�de���⾻ ,�̌j��� A�B���k۝[o�3��y�go�a��FUe%U��7��pK�A�6"� � �FD�A�ۈ�� �p�_An#"�� �mdA�WU���'�Z��S�}w��7�{D"�a�2W�#���=w~A*�X,F(J���KNDr=UU9q�m��˾חY�~I�0��~�
FFFy��1�L(�B8^�˱X�sU̯��(D"���ך�D�,��;�����B���GUU9z�8 rq������~���L"p657����	���.�%⨪J�,���0;;˙�������E"�su�6�����EmA(�����/s���A��,��g�M4���_��0�L���~¬���w���Y��;w�p�2�O�B�����.k���N>��#‘۷���$&��{���;�����A0�̹sؓ�y���HKK]����Q^ym7�,��������>C{G']�ݔ���އ"�2���o<�(I��8o��.SSS���x��0��\݂�I4���?�WdY�Ge�{p��EffgY�r%�>V��	����B>���X����O>��(��^&&'iimc���%�t��E�?�V����ѣ<��c�<u�=�;6mb�{����O^^.O>��P�������(kV�b��-H��xg�~.����x��7������Ѣ{�:��G~���1��N�23��������@WO��AQ�ںeٓǜ9{���a�m�JVV��6��8���u�
Ly<|r�0?��G~��dۢu��~:���c�F
�imk#�2ᚤ���I�����O<N,��źE�p:Ԭ�fU�Jvݻ�^���A��T������70\����U�������QS�����ܲ�͛62>1�������Ͼ����(
?��w���@U�E�qOM��ͷ���DUU�ڻ���E�����ի�T����:Ξ�@mm
gϝ������
���`���h�ͥ�h`um�̏?�Ȓ� ܌d�w�_˞l#7'���lFFF�x<�Q\XH���3O?��d�w�apphY}���RUQ�;�����J͊����b6�1����&/7�\��kI��n��f��OC��,�ȒDZZ*%��def\�>�^���t�p8�����������#++�h4FJJ
��$
�HMI!ٖLFz:i��ȲDjj*+kjX�n-�}}�RVZBNN6II�%�133ä��� �h���\�&֯����‚22�efv���.���0�L
QQ^_&=}Q;)�2���$%%QXX�d;#�I���t��Ŀkr�kUU�3�Y���LF��Y\n99�h��wJL!���y�~�]�ݷ�ˍW���^�S�	�6�%	Q�PϚյ���|���&5�����Μ!++�$����	� ��h,��d��v
�QQhmk��Y#����tLLN �������D"APzZ:����I����_>�`(���g�59wo\EQ��AΞ?GaA>&���n�Z�	��(J���n�46RZR�,K4\�Lff>��H4BK[+��Ӏ�F#c�X)��cŊ*�JK���N�=��wm������v��R�����;���i��ldf�s�����,z���Q@EU�'.�X�P8��s�(*,�`0��C{ۻ6��Թ�R�Q�h,���]rL��D0�P3�2� C���$	UQ�D�h��6��$I(��$I���=�\3�GUU�
1E!����A��eY��di�>		I��Ŕ�>E"Q4����
�%����k�=�?�!�4�t� |�X����/@�%���,���uZQ��z%�(14
�h4�3�&�!�-wm��F����ηQ�,�D�iuD��-�$�D��%�$T�h4����Q��dI�e��b�mEB"�0�"	���Z-�d;�iϲ�_z���$�O��>?�d[2y�ĈT	)qv%� �^�&�pؓ�ْo�s0�ϔǃ�Bey����%��r3�w�/�H�ב�� �'EQ�Bdgf&z�n4�����$�
)N�q"�-���9�� � |	$I�i.d?hw��%I�,�*��An7��z�"�7H~AA��H���4DW� �pKX*��?�5�����n���h4�F�Yt_~���m�TGĵS�k4�/u\�ﻭ !��/� �"�����z����042����,X6�$�Fi�� 7;g�̴~�EU�Z,~��u,(Ңa�$������$F�Y�),( ٖ��DCU�A|��E� %�zB�0�`�Ͷ�<�%� �pˊFcD"�k^GQ�P8D8&�a41(�������yQT�χ?@��c6���'�PXP��b%����tX,��#���-Q�(�i����?8���0ɶd����Ʉ�h��c4�h4��>�:=�h��ߏ�d�l2	G"(J�)���EqQ1����=��� ­l�nvI�a�5��`$�RVR��3���F�!��V�	�$~���Gzj�����!&].tZݽ��H���RSR?�m�`(Ȭ�K(�d2�����B���D(*,dph�d[2�))�wv������$Z��H$BIq	���L�&�Z��f}^�'�1�L���/� �Ү���^�+��d;ՕU��&������ ?/���R,�x��݁�bAUT��)))���S\X����=�F��
�p��Ka����23;Kgw'�p���\\n^��Fˬ׋��#����v�r��iu��eff���AUE%��98�픕�.=���/� �*�
�:��@0�?�'��0�pM`.'D�Qb�������AUTF��"K2�X�h4�J|Z^��LA~>��K\�K��됝�EUE%��$	�NOR����b���8���}��8S�j��u�O��;���h�h4Ȳen
�Ee��cD�An
�XUUq:�$Y�h�h����FCZj��26>Fs[�H�ܬ233���3��A��0�%���jIN������#�b�aw�r��x<�b���>�q>�V�,�$ےIOKchx{���Ȥ�����������8RSR0\���(h4Z4s�c�H]=���K ����jee�M3ӑ � ��Eat|����EI㔹�wUU��h�Z:���B�d�`2���t��J0L$p���D�Q��x�V�V�F�IT��@,��Z����$IttuPVR� �F"$IB�բ(
�H�\��`0�F��h0 �2�X�X,�H��D�>]f>y�#��pEQ0��N��p��1�OA�5|�49�,c2����U���=9y��r&�i�#tz�>~�\�4̵��5
IV��*?��|:�nA��ש���'�lO3׍���.s���	�g$�A�[�W�̪����J<���|�{�G��~.��Gv���?��>�\ZL�#� �J>��d;�Я�L�U�A��=I���B��?"�W�*7#Kr�����SU�P(�N��_A���$	�͆�=Ŕ�s������l�H���|�.��
�$Y�D�An
&�	��x�<�v�<����K�e���E�An
�'͹\�o�r�	|A�6"� � �FD�A�ۈ�� �p����$IB�♅T��|��I��dL*(��l�"��t F,��c$�(��G���i!Aa9���������󑕙EnN����?K���3�_��(�0<<�5�JQAV�I�QU�ϊ��O��*=�}��G�



���r��'--���so#��X��$=A��g~EQx��w&�頳��g�|�^��������$��#�
p��
�>�ȉ�U5>��\��k?��x�������M �p�}�q��9VTU�t:Qe������t��M7$�:�S�����d�į�/��(��嗉F����a�����b6�g������W0�lݲ�����1�_�$IH@ln���::�����C��%�8_�kO����AAX�g�p$B{G'���ƶ;�$�"I��(*,�a���҂�nG��r��U��+����!.�����;�B��7���Y�v
H�._aff�р�betl���jJ���x�AOo/f���.�����W^���5kع�nL&u

(�ª�+ik��ͽ{c�͸�nRSRI�'���ByY.����V,3�֬����XL�fEu"�z�۹���W�p�c'N���bܱa
�/��������HK����ήn;+W�`ph��Q"��5+��`���"�N����n�GF��x���`ݚ�D�Q.�7033��baͪZ��� %� *��ÿ�g)ZH�բ�
'N�����$k��>�������x������棃�084��� )�F_?�CÔ�����o�%104���{2���o����ԙ3��S�zN�>��5�1����b�����u�D�2��q�ܜ��#3#���""�##����SW߀�l����‚�rs����LF�����W����7�DUU�f���w0��P^V���������W��� 9�Ư~�[��2h��`|b�$k�m��"�����X�46^ezz���Μ;Gzz:������###�$k����_b�墾����L:Laag�_���[R>>HeE9�))7��"� |�I��������Ν�ş����L~������Æukiim��jF����b��������b�'S]UIQa��E(����I��{���Ab1�Ԕ�ݱ�ҒRJJ�پm�P�P(Ļ���w�Sv�s7�Ϟ�ͽ�PR\D^N;��2,f3�h�������y�����������Ax��Ғb�SSD"Q�Z-���΃��Ot����h�mw�O��έHH���p�ΝlX����)

���䞻�12:JfF�?�;�o���e�� +kjxh��
�q�z�֮Y�� �˶�����Ejj
�4��p߽;ٹ���R�� —�3��#��0�����h�|�����O�}ܿs'���J2�����[�M&rss�������^��ᠺ����\�V�ܽlUU�F�����L�������$��)d��1�$����?@zz:f��eeM
�##ȒL8������dYfll�Á��CUa�w084��o�EVf&))N"�(�%ʼn�tUQ�%�F���D�%�/���*�b
�����B{G�]]ttu����V����6[]��TWUb2���{淫��0�\mjB����?A�K�dW�$I9��g���aÆuܱq#f���q._i�[�x�^ϑcǩ���b��u�����?0@o_+kjp8�4\�D�6[i��L�\ԬX����KOgtl��+V`4hnnᣃijj�b������t��h�t�Ӂ�ᠭ���C��Ɇ����465QX��������b6�(/+�S��������;6q�J#�ccTUT$p[G;�m���244Lvv~����j|~?�h�‚|\�~�dg㙞���:P�� 
c6�)).&55��M�\��GQT�m�D�QN��$
������ʊrΝ�HSK3CC�l�|��Ā � \o��_:}��ZYY��1EQ�B�.�.�?����~���h�D��C"߰�((��F�A�d���X�^��>���z����*�H�h,��`@��$��F�eY��Ȳ���uG�1�Z
�$��%)�y8!
a2��j���4 >���\���1�>�0���{�?G"4
�&����h�`0��}	Y��b~=C���w/������l1�OA��I�Dkk���_�eL&S⵪���aZZ�شaz������i����| UUu�3���\l��y~�^������j?-��l^�{:ݧ�[��ܠ�/(���{�2_�jq���v�Ͽ��$F��`�X���rH�DKku
�LL�(+����/� |�������A~^�0�߭NUU��2Y[�,��'�rA�/��:����J��"��d� &�A�\��D_����A�˖��xfLAnyZ��Mkk�.� � _1������.S��o%tEXtdate:create2013-08-30T12:28:19-04:00��(�%tEXtdate:modify2013-08-30T12:05:05-04:00xȦtEXtSoftwaregnome-screenshot��>IEND�B`�site-packages/sepolicy/help/files_apps.txt000064400000001063147511334650014742 0ustar00This screen shows application types that are defined for process running with the '%(APP)s' type.


The description should give you a good idea of what the application is allowed to do with the type. If your application type is being denied access to a particular file, you might want to change the label of that file.

It is recommended that you use one of the types defined on this page.

Note that if the label of the content being denied is owned by another domain, you might have to write policy or use 'audit2allow -M mypol' to allow access.
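
For example, both remedies can be applied from the command line. The path and file type below are only illustrations; substitute the ones reported in your AVC denial:

    # semanage fcontext -a -t httpd_sys_content_t '/srv/app/data(/.*)?'   # record the new label
    # restorecon -Rv /srv/app/data                                        # apply it to the files
    # ausearch -m AVC -ts recent | audit2allow -M mypol                   # build a local policy module from the denials
    # semodule -i mypol.pp                                                # install the module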
site-packages/sepolicy/help/transition_from_boolean_2.txt
This screen shows you the boolean page with the boolean selected.


Enable or disable the boolean to turn on or off the transition.
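
For example, the same change can be made from the command line with setsebool; the boolean name below is only an illustration:

    # getsebool httpd_can_sendmail          # show the current value
    # setsebool -P httpd_can_sendmail on    # enable it; -P makes the setting persist across reboots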
site-packages/sepolicy/help/lockdown_unconfined.png
[binary PNG image data]

site-packages/sepolicy/help/system_current_mode.txt
You can switch SELinux between Enforcing mode and Permissive mode.


When a machine is in permissive mode, SELinux will continue to log SELinux AVC messages that would have been denied if the machine was in enforcing mode.

Changing the current mode of the system will not survive a reboot.  To make the change permanent, you would need to change the system default mode instead.
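
For example, the current mode can be inspected and switched from the command line (root is required, and setenforce affects only the running system):

    # getenforce       # show the current mode
    # setenforce 0     # switch the running system to permissive mode
    # setenforce 1     # switch back to enforcing mode

The system default mode is the SELINUX= line in /etc/selinux/config.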
site-packages/sepolicy/help/ports_outbound.txt
This screen shows the network ports to which processes running with the '%(APP)s' type are allowed to connect.


SELinux controls the network ports to which applications are allowed to connect, based on SELinux port types.

This screen allows you to modify the port number and port type definitions that '%(APP)s' is currently allowed to connect to.
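
For example, the same definitions can be inspected and extended with semanage; the port type and number below are only illustrations:

    # semanage port -l | grep -w http_port_t         # list ports currently mapped to this type
    # semanage port -a -t http_port_t -p tcp 8080    # add tcp/8080 to the type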
site-packages/sepolicy/help/login_default.png
[binary PNG image data]
�~~��f�y�IW^1��bݽw�Ԕ�����h4j��`��MFcZjr��Tooo���8�cT����WM�ZQU5h����@IY�-(**-+�8���a����k+*+-Vk\\�ȑî�:9'7W����KNKM	�+)-s8'�ML��#,�r�2J��r֯���h6��y�ߟ~���j��O9����������f\5�j�j4Z�^�������2lZJ��{����i4j���q�~~� �p�um˩���{��λZ�A4�J���j�v����5��;�Γ��x�}���,��V�2h��A�n�����ĵ�6��|�UJGvt�F BD����჏O7m��1�?p$!��)
mm�^�l�9w!��Ԕ�‚�\�|ž{�����_9u2�;���Cz�4�A?d�{f�LOKkhhhiiq:��MMWL�b�y��t:�p�/C�y�^�}�>x���^zA���hp�X�eY�aY�c)��8r�]{LF��bY�j��|}ƍ�{�m�w����,g����j�Y,V�)������XVũDQ�Z�����>t���
@�=��}rؐ!�����z�L��C�C�|}��krrNn۾3-%E��P�,
bCcSxX(X��Sy�mV��S���v��R�~=�Q�w���ڛo�8!�����,�5uu�ضclj�,Q��KK��عi��3�EV����F������t:�:���#G���;�R���X�e2���}��\�J%Sj�ۓ���-X�x�����$���M�8��࿟�0-5E������&'�74677����76N�8���?,�Q��(�x5���e��s��jm+;u�6/���+75q{�(�^s�5��z�^�� �bppp\ḻ��ʪ�iWL=b���GV�]���{�5פ���$i˶5���		,�������8�r�?_�[n���h�)f���(��k4�>�{SJ��p~��n������Q��eC���������2::r��!��:�v��Scc�u:]VvNUuMP`@rr��l���

��$1..V��0�D���;,,T��;s7x�斖�3g�R���B>��ԬR�n����}������^1e��n��

��!%)�W�����Ꚗֶ�Ær*N�$����	��luMmmVv��3	�q��x�(�M�͕UU�%��(&���74VVW�x��Q����a&�1>.�f�����?�L��S���RS�O矙>��aCgde%'&r�t:��cccN�Ȭ���b�a�>vl݆�~���L����"�򶝻jjk�bcX�������Z�~����
�^k0�]C�-��`T�^tw�A�6J)ǩJK�����YY���%���N/�>���̼y�wͪ��&�������cY�������aC����	��ʇ��j�:��y��f0�P�t:y���dj���bQ�Մ��j4]e�o�pux���+�?��r�b�c]��r�m6+C�,Y��F��6K����j������vpP����ܤR�Ap:�^^^�E����8��X,�)�J��jmv�^��V�s2����y�I�F-�t��99O=���hmk��J�4ג�,m� ����?xp��˞�ϓJ�g���*�Rb|b{�NJ�z�N��$�R���
>��J�R��i���f����p�(
*�����h4ʲ�t8�F�L�J�Y�,:��f�SJM&Sss����j�z���<o�ٌ�0@����y��ˋeY���bѨ�б;�ʸ�V�V���i5Zً~'Y�u:��=����KTl�˯j}}
�fҺ����
���)S>�Rk�\c��zx�ĉ.F�B�<o��(�!�X,�<���ڪ��e��<��`���y.V��IA�V,��Ʀ&�T�V�2]�pQEU���<~��Y�AT>������%pJ�Vq*�IQ��6��eٶ�6��q��fs8F�A�� ��n�9��;)Knii��t��˯�<?��T*�2�˰,�J��r��w����ڦ��e���S�l�0S]]MQj2�OZmS6A������ޱ{��չ�6�M�ޮ�w��a���f���t:�}��9e�;w��E��0N�`�YEQ4h�i��Rq�L]Q��ab�{�'8X�k�t
]����~t�q��g��NJH��a�������[��.F���+pv�1�RBY�m6�^���<�~�̲lsk����E�9��K��*�B]C]Hp�F�c_��R��{����!��J�a�:�NQèT������`W�������˲�0݋,R^�tFbDAI����")�Rb7���'J.�-�U�*__�$K�S9��@�o�R�2��u�;�����������I)U2���t����_8�W��;
������q\���>u?l(���2�2è9��%��eYQ�]��J�3������q��]_r�,+ձ#���H����+��j��~he��^�������k_�K�t�Ȳ���$���B���V�8�N_V^VQU�;w$��g��y{DXDHpȥ��!�F�!�iu�(\r�p�Νj��}������;�c*�8�{[��z�H�d��{u~#I��H9�����6���ΕZ�7�~B� ���8����T��y���+�Kw����/9�w����{�NE���D��t�o�����p8B�C.ž1]nQw�B����)��5[��I�O�@��U�������88(���!���	Ȳl�4���pu\�)%:
�2D��]���uJ�^�w�鲵�<�po������((�X��*��!��lL��R$���k�(!�q�'��$�m@G�gC��Z�����pVH��!D��NÕ�[o�n�$I�7{�4_9 �K��ˡ����aDQ,++�4�F�

	�N�}�Y��2�dY�/�R%�ͅV�X���W)�>������k�]�<���o0��>ʄ��J�N�S�R���s$�g�B(�Z�ַ���ϭɫ�O�խ'��;��z��l�\��/C:PJ�
|}}�����^�,����$$$�ޢ$�P���F���2�R)��׃��h�DQ�B��8�Vk�����H)�X��8Y�9�s8�^�)C���S���j4Y�Ap�
�h4� p�W���z���<�u#�{p�d������D�߾����m�Z�µWO/+��6s'�2��2�!�a��dY��5�fwQAf�͟=k��aCw�:�`����#�e&
���6���X���l�(&�'*�Ry�R���w���j2�$Y���U�⒒��}��f�~�G�kj=z�m�:�{�Ǖ��(]{]�z\��(�ȣS'M�6u� ���eo����=#I�2EBe���u,#c��C�F�8��?i�O>�S\R���p�M7B�j�(����f��˹�DGE�5�b�h��ʪ��?�����e2�|�q���A�aYQAx�97]w]hh�2��u`����Jr󪚚�w���\}�4 �`4TTT�����ko�}"+KE�Z��|Yy9�0,ˊ���|Kk+a�31�H2�<Y�ln76~��a�aʄ�	����+�
�e��=E%�ʒ$J��t�N���/I�$���eY��_�}Xh��5y�x�^_[W�������>SnCC#˲���Y����yJ)C�(I���|�ͷ���Z��:d�Y�z��nomk�j5!j����2�t~Aa!˲����;w9v����BC!PZZ&I˲*�J���“������p����/���>8�������buM
�0mVkiY���ӧ��<�\
B$Ir?
"�9�|�as�x�^��ӻWkkk��\�F���w�ݛ���e��3���\$$$��{�y�Ï�+*��|}��]�qڭ2e�F�N��;d�!&��j���u@AQ%I"���E��+�+�!D���Ľ��8xh��AF�^�R���%�O�㱰��?�����&�[n���/�<��UU�<�⋋���]{�<�ԿEQt:�z����7�HMinn�4a<lݶ}��U0�_�[o���O>-.)mii	kll<z�x]}C[�eيUz�n��s���j��u������އ��Ly��Ǎ#I�N�ݲm�N�}����GZ����]������B������:�N�V���;�ǎ�8n��{~̨Q�F�X,xy�R �BuMmyEE��b�ڪ����Xƌ�����CN?~����[�������y�=b�#<�p8YV��CQ�%z�9�����'�*;u���|Z��r�����[�+� ��81c�Zo����fɊ���� ����v�ر%˗���}��;Z�f��]�a�{���;���Ԝ��}���������{g�ܻo�ko�s�̙F����j�ҥ���~r��}�������^��[������YƎ٫g���f���mݶ�Df�!C��ܵ�ࡵ�7$'%����}���y�!@HKKk���$�mmm�(����������=���^<�w������j{��kϞ��ʪ���T�Fy�I�}�ͳ��?���!�����$IT��Z��)8MF#a�ֶ6�N˲�C���%�juZ����0D�����y��/|��}/�q��^���I��4����(�J���@�N�kN���|�^�[�z��`�h4@��ˋ0LsssXh�r!!����cG�ڼm[���{����<�j�*S���x��7͸�޽z:���
��a��:����R�כL&�JEaYV�Qe�+�F��ڪ�����oMMN���		��@�(�L�ST\XUU�}�}��,�,���,�"!@F�����={��6�:z�P�f�����t:�N'���fMIN�3wn�����7�y��G�ꊩ�\��Ԥt���l��N���|I
��~�k�jӉ���
�b|�pJm���1|3��{g��p8�/F�Ap��j�a����aYA<��j�ǟ���f�ۧN�TYUe�ہ��.Iҵ3�~�KJ�JJK���䤤�Ҳ���k����sF
��ҢѨ)�����ƌ)����lim��Я��o�i���o��/���ϷY�8�E���(��<����e��a9'skjj�j�F��:y�'�Q[W�q"3=-�b���RS��fƫo�]RR:jĈ�C/\�Hp:)����W_x.22b��K���n�*"Dv��ݻwo��VV^�V�!����500`��CI	�>��'NDGFEF��<u*/??$(�gz���JYeJc�c\gO2�*�s
bayu����c`��q׎-AA��v�q:�����N�:~��a`d�N'K��}��N%2p� �������"I���R�BBjjk���ccb���eY���*)-+,.J��

���B����F__�>�TZZ��cY�������l6gfe9�B���������cz�.<,L��[-V����f�𯨨LL����9~��d��#���'7/���40 0"<�l6K�Lh�ڂ������Q�Kʼ�^&����8&*������p�ر���]�Ӣ�?�P�M!ᚠ��޽�W�^N�S��:�NeP����`P:�)�N��hԢ(�l6�ZM���aX&:2ZW@�e���X��Um�Z�qaHH��+�T���/�S(�NnNDX�$I'ssZ[[�Q��e�d�JMIUq����*�Z�&@�.1*�J�E]�V����h4Z�����!_�VK�,�V�u�9���T-e��Fie2�a�V�2��ҋ_y�eY忂 h�j;ϫT*�N'ɲ�nEQ�ө�jI�(I�%�,�Z�V�V;A���jDQ�$I��:%�3�0/��jB\������|����o��ܼ6�]	�J�2wkk��+����v���y^�R ʼ�
�ʢQ�޴�����^�?	�����_9���o���E�d���vow8��á�]�}�ڕ�d�{�x;�+��v��I��������J�ժ,�"IRKK+t�"e��R	�����������q8J�W>���7�f�1�p:�N�,�f�אA�<�� u6���҅�eVG
�t���t��=w���$X'u�K)���3��f��_��R�t=٩�k	�/��3L�u�ܥ���}�/��{y��yp�lI�\�}�4�+
 �R�a%Q���E��uw:� ��B�k���c��pI�L&S}C}qI�Z��1�C������$W���[���<��"!�~�v�]�Km�l�@��eV��Jb>���rE{�����=�A�^�Ǹ��Ee*���tJNP�e���nljdY~Gk
�=oF>�Gf�<I��|��������
�V`���)���8;o7�~�~�<B�K�����8��7ݽb(�x�{-jw�TH��Z	pvZ�<��+�R*����}9�~����Б��>i��\�.2���u��8AW3��O��o�f����m�'�#�<�'vѯ��WwB.VRYN�<���_M��o;]hh�˯��ܗ���C�G�.۽;UH����p�`���o��>/�SJO��FFDF�3��V�u7�a�����14$T���e9�fP(**4��A�	����I�;���籨ܹ��b�
�������\\R�$
��Ĥ�ܜ��(��666TUW���p�UW
#��L�lʙZ��W�w:�%��az��5��~���U{��kC������誐J����ęL&(//�X-�I)��(FFF�jHeeEuM�F���v����D�e��"�L&/��p��B���q��^%IZ�tqMm�k
�sv����Ϝ����!�']�Y�b鏋����歛2�N�����kQ�'�?����6��,�ʵ��,Y�x��{��ھs.����G�۷'+;�R�dY{��Ʀ��ܓ�VW��K\�M�ҮW���W�T@k[ۚ����������Kwg�7�y^��M9	�y���;��T!KJK����]�w��jY��k׭�C�(O*�933�祋�=�~㺂�3���,**\�x��#7oٸt���ꊇ����Wc�J��u>�'�F�rOC9F���:��лW��Ndf���j5���Ȩ�h//sCC���y�(�ۯo�N�����G���;�N�6C����֖}�����޽�H��_��t8���`*-+��K��������8ܳg���{�Ns�XQYQ\R4��CC�N��Rdd�=��S
H�h4�\7�L^111� ���E���%�G/�ј�qlذ%�ŭ��*Ne��
0x���I��pn�g���9�J�C���4'%&����>��������������?11	�/-��u���CG�~�Lo�wsK��:dx�����N��		)(,�����΢�*=���������4k�=�ʲ|NKFDD�w�_^Q���Z[[����x� _�a�����\!7<<"::��v�}���&��T��'�@�eeS�)))^�z�J�:z���}�32��ع�a��G���dd+*.lii޶c+��ȱ�P�C���q����ϜVkԔ�J����5kW���޳S��ݻw:|�j�nڲ���b�弃ߵg�{�ґ'p]_+�J\l��ᣖ�X�Â�+*��c����w����_ڸy�r�$��JeU�{eY޷OEEyKK�
kjkk9$�bIiɉ�??��G�Y�*+;S�o�,�f/�e�e���w�ڵC�$��UUUk֭���Z����������*&&�Sa2�KU�>}M&�(++]����+�G�Ne:�@:������?.ZX\R4d�PQ�R*K2tD�	�&�ڏ?�`Ӗ�J�qe�+�R���~��{�/�ճwHp��feY�=0��ڽS��#G�8����}�V�e=TV^�}Ƕ��9

u���w�j����a�F���w�TP`и�F�������ջW�qc���'�F�FC&00h��1�{�mh�w-H�e�����c�������0������l�ٜNgpPȸ��CC’�S��v�N���̰��:���enii���H�����c�?�#�I)?-Z���Ȱ�����{��Gƌ��sUZ�Jm4�!z���'L���V���0z�N�R
�?p��m#���P�U
�
��$I�D)�ճw�>}Y�U�T�N��2��
�����a��ȨA�D���K��{�՚[n������9�����r�m�dD�:�FF���G�����L/j�:R����|�m���fɲ��1�^�D??�I�8����A���p��5�O�~��5��r{��3n���d��Z\R���L&/I�����wt���6���ص{���ðA�A������s*/7))�h49zX�7TTT���lV��)���b����n�Y�I��yy��n�ҫg����֤���9��j���X,� �Y��{�,)-���Ȱ����{�5�Ȅ0g���ع-::����h4�t:Q�2Nw8f�w�>�x�TZ���j5��R��"8I�V����f��ܵ����a��������=�<t "<�l���r,�n�ڔ䔲�R�����Ų�3��M�����<.Ib||bxX�њ�V�.�-]�Y�fL��Gj���D%�p�_�Sۃ$����T��}�Z����Vf�@��Oo߱�b������T��G����8�w:�ә��2d�������:|���19����L�}��a��XZV�3�g�XSS�#-]����8�`g\3#1!Q��A��n�y� <������DFD
:<*2Z��44�746�������@lL���0���m�Zcbb
Cpp���%%$646�𰈐�P�Vc�ۃ��-K��^a��U6�5$4����l�D��6�BB��qRKkK}}�J��0~����S�N��bikmeY6..�f�	�`��S7�L�<o�����
<ϧ$�UVVFFF�lV��_5C���|VtT�$Iu�uA�A��N���MMe�e!�a#������h4�5U���eYV�7��G��K�ұ��T��!J^���N�)*�*66N��*��0�1q�(:�N�ڬ>>�����u�MM�qq	�G��X(BA�X6.6>$4�j�r,��ڣ��(u�N�S��
8�-䆌>������A��EG��ZG��|2o��+�^���{�3�պ�Z�ͦ�h/�i��+����ʈ"�,�^�p�x������ݽȳ\dߟT!���{�e665�]����S�W7�~����m˶�[B�(SY���}0�,SI:;3�$���,��^�*���0�L9~�̙#O��\=�*�3)O�鮚{O����u�y~�}����{�S�~E�nԝd*���
;"_.�;vY!��0�F�v��۩�~���թ� 'ei��N!��.ŜkY].���ؘؘ�{]�G(*��R�$²D�:�U�"8���l%�ِ�I�x{���M��o����ɸqTp���q]�dY>����ђ]����q6:��S�N��Z��3����П��
٩Μ_��ܚ���]����\J/��_�}_�"���ߪmW����x�(-�����BNilB�K���t³�BϞ$ [~<�5E+�[!�<<��r����I�:��G�9�[yl6�V�6�v��A�wZ{:����N���+�F�2PJ���{�qFDQ�A�ý֯7���F�~�u�H��`���1��XmVeH-B�S�dY�����QJM&���� 
D�A�j
p��	ض
�ʀRX�֮%6;��D��غBB@.�	!^^&YIG)�͞
O��{O?�*�}��G !$	�N��TO���C����eS!)P�R�pD��˲l�N'����AY��#�Ȅ`��,��@�H����X ��!>�W��J��}�X��}��/��O�yp�}��c���!�~��J�R�{{�ʽYB`�*زt:�X`�~��8�t��ԇ��sρ������]\�����)�S
��]��üy����0>|�L����?!�;)#�Ȯ�ꓚf,*�{��ÇAۣ��@X�"��0}:j5����Q.�::I�İ,�u�τR*��/Ov�
�|���"_}�ߎ��fJf*Bȯ�nI�e ��ʆ.�����4��w�b���$I�y=�{��9Q\�l�z�Z=W�ZW�~r�]���Dx�iHL��� ,��fI�m?���vJ�[x���ᮻ���x���ಘ/
!t�=~���?�-}�<���<���?��!!f3�(�,+���F�,[�����h4�'���RX\���o�v{\l�kh��H���Ƶ�7$%&1L�}`�"���#I��c�֭�d	���ĉ�R�_�`P�B9g���}�uC�:rp*�:<o��Ҳ��x��B\uU)�q��tJ�K�&���uYno�V^e��|���9'O���(����)VVU�j��{bc��L&׌U�u�SW�t_NAa�w�~��l�1�av�����Rc�Z^�lɲ�EE�)����8�ݼ��w��wUf%����0fDFHR�?e�R�@��Sfc!#���kk�c(�r����x{���RJ�Ά������ZͲ̚�����"I�iNp�"6@Ϟ��0d|�1��0\���Et�hE��5��5��M�d;f8bY������>O�@mv����B\uUYڎ]�rsO��g�g���/u�!LG��;����o��ۧ�����^n/��J��g_|9 ������k��:&D���騥J,m_����c���Q>b��sw��7v���|�u�v�F��#���͝3�h����޻w��_{���Ž�rnM��tِ�+���?���鬮����>yr�O��j��>���s,#�Sq�{�\�juAaQlL����/��vA/YzՕW��7|�Â�����������i�I��DذyV��G����w߅�����eG9��⫯�߷w�7�~w֝��[���b�������&Lشe���|���Ǫkj~X���w\{�Չ��o��Z�������ߺm{IY��cǵZmcc�}B���=t��=�R��� ;'�GZꝷ�f�Z?����Ʀ�'Ǎ�>�$����wϼC�y>��?��B)%9vl������c�n߹s��WN�2aܸ���d�Ԕd%�xqI�O?/������[���g�~;��;�������^=�Ǐ���>���i�(eN�ټm��Ӧ���|���ӣG����k'M/���'�f�9z��<%�t銕�ƌ��[9�kjj>z��6�b��Z�.*2�h4�o�9'/�yZ���ů,˒,����鑚��φ746���ǽ{��8�_6dHLt�
�^ӻW�I&̼�����ص�LA��j�Zm'����y���4h�����n�y�4˂,CP,Y'O�_�ȑ�o\}5���*�L������S���L�珟81qܸ���n��K+��.Z����u���w�2h�M7\���/�O��?={�]�=��/�i����Q�7^w���SU��'ss�{VZj�ի�ed�Y���3~�yɁ��V�^��;�����l����J�,���,��n��Ndfu���jJH�r�7NA�}�]��溺���0�h�~�>�32^{��ǡ�G֮�p"3�Yw%�ǗW���+	
���9bxZj��6���?7b���� �L`�cCB�������
:T��JrhllR�8_	v�I�8�+(,Z�d�r!�l߹k��?�T*�eϹ���
ػ8�8���@����X��p��Ս5�d2TUU�Y,UUZ�f`�~*g4�B�X�ٰi����
z��n��j�Z�Vy�͔R�e������c6��m�I��z�֕�Y��8��^ؼ>�jj`�t8p[~.c�Z��oo $   >.6..���73+;4$89)),4���A��c���S����Dž��%&$�,k��MFSDxx`�?!$2"\yI�q���BrO�0���'7/o���@�����s��:�N��f�Ww1详���^|1"<샏?���

=q"�������{�YO�뉔���yՕW|���Yٱ1�'sO)���Я_Hp��̱�G����SyyJ���ʈ������F������P��Z���mv{qq1��^,�644��λ�Jr�R��+�X�rEEU�={��*g���ѕ+ɰa5v��m������_\��hܹ{�A���	����\1erƉ�;n�%1!^�ӥ�����J�����0���h˶�/=��;9�#K��Pq\Yy���WQQ��;B��7n޼}gρ���Z�V�$A�"#SS�7lڬV�GZJʠ�ι�J��Q���'���{��V�?�y���֮_��ښ��O���N� �U���O�|?/**r��!�a�~��^��ۧ��u��b����ʲ���$BCc��鬫�W^jlj���6n�=x����W�Շ��٣GfV�CGtz�,�W]1�e�%˖sWYY��_��BDQ<��ѻW�m;vn߱�뮝���s��>�dnJJrrb�,���w�Ȉp�Z�l�7_}y���~�I�$I�0n�[�?~��)�&�zn[�%88h��K��p/����^�b����gמ�6m*��X�rՌ�WM�4��Ͼ7f��~�_6�m��7�755mڲu�q?/]�e��D�f�GF�3�;��ٿ��!�7�<���+�p!��:i��Ҽ3�8�А�+�Lnll���9bX�޽x2�TqqIhXHXh���w\LLXh�����Æ4p����Ndf����9�GZ�N�kmm�1�*�aH\lLF�	e���)(  �d.����Ȉ�f�pu����a�&�̄k����e��s9Q�;�RSK�ʜN�qcS�����:���o���޽zfe�N!-5eԈ�u

,K�u������\l�V��������MJL������A��z����8t��tFGG���S�՞��Я߰!�����+*BAHJL4``Ff����ѣccb��B5�a��;pp�W_5͠���;'�Tjr��;n�X�v����۹kwlL�;���uC*)-5
Ξ���jk낃�
���src���^���6((P)���q"3S�7�%';g���^��z����趛n�ۧOuM��˫GZj}}��˔_PXx��Ჲ�{gݥ\����N

���
��'�t����
`Р�o�����+��^|�ދ�|�s����G�h�r�f��د)��.�Q�B�~��DHO�ᅦ���Ν�}��w����G��$��(��ҙU���$I�$�EIy��2����?;(�]�r�$e#g̀�T������ˏ{���'@�e�-鬫�B\%]�:j�مȲ�B�+�(I�{]u�<�$�����J�?���@:�����BM���@�"��Kp��
��;�iC�:w&$��T�7`�\P�`�"HIa�x�Z�$eU�A����Oe�s�>pH��Q��}���W/��oD��N��Ǯ'�s;�wYR٩�?�Ka;F\h!�Ʀ�y�y�=�>\(5H�j�e5��*�/�9��X�q��$�t
������i�
��`x�Y8t�)Y��#���|�Rp:��`�n�0@�{�B��vhR�@�`�X�۟����jx�uظ*+�a�����q3���A���}jk!4���-�B3經(ã����lؽ6m�%K`�JX�
(�����
;��EI��>_�A���UU��[�|9����d	��@)��E�_�̛?W�����A잕�鄂8v���kAt0|8�sxy�����AZZ����	��i�H�m?�x��3���������� ��Ź����ON��RR 9n��6m��>��{a��J#�>�f��:��2˅t�5��'8p-�
����`�v	i?H`�lJ�adY ���!�'qu!����?0_�ҙ�uYI{�z�#N�.p�ev�1�Qn�BG3���d�뮃k��={��1 
�Z��Wᮻ`�V>||h` ���F����7�
��>@SS��e����{��{!$DXGry�{�Y3�[�3쁃�2Jo�4��Я�U�7���z�vW�r�ޡ��*�2;e%�\QQ� 
�		2�È�IJ,�T���#����<US�w�8�5K�?�,X�t䘖X��Ռ$��a���]n+h4��o�
���ܡEp8:m1P
Ç�3��a`4�m�G�5���L�q\�ّ���_[W7x��ǎ
Ʊ�G��J�����!ta�
��ن��M[����5�SPV������M[�ʲ<a�X�� ���Z�ʸ��eٖ��CG������$˔2�?q"3+{��A	��N������訡�+�>"��Q�ϙ|�s�T^ؽo_YY����r0�8�R
�*�B�
�>|��o���۹��χ�RRrj�JR_�D�ˡ��o���/�]j4~6
���A߾0y2�ƞ�ܟ�����<�C9m阧��ȱc7^w��dG��ݶc���޻?��s�F3��+;-��h�~3�
Y[W��s�'��oٺ�DV��=��t��RL��_~E�R�<�c׮��~���"e\1�5�"l޶����4��g�aY�f튕�z��?��3���\Lt��[�f��;**�n����Y�%%���/���[o�t:׆�
��l۹��ù����v���}���ͷ�^}��d\�r��aC/Y���4uʤ��p��7Gi5�X�w��by�{��,[��_�����wL��,^

�q]L�NH�L,C���)��Q��(tYsM]�b�j��~��7-�駭�w����3s����<;c�U�, �9�{���ÆQ���ˑ$iي��&L�$i˶m�_{Mwo�$�WH������۰1>6��'�������DGE)�#m��U��JJn�����:������=�w������訁��/_���ѣS'O�����=�?�uǎ�c�(SV�\���ѣÇ�4a�u3��i�J�~I�V�^��G0��y��S��/�J�d���{��xݵS&M�rꔣǏ�>��i@����0̱����=n���j���S�M��ǎy�W��ٽw�+������?����gX��_�ꊩ�����>,{��	ϼ�b���wܡl�ſԳ�=�O�UĠ�9��ܗ^{���o��+����

β����S���{�J�����ښ����흔���m��,/Y�b��s����o�^!���
��YQY٧W/�

<t�Ȗm���+�����v8t���h4��I#�bccJKK	!:�n�֭۶��kCCB��۟�t�LII)0�b���;w�t�uz�^yU�ڳ8TUW�ժ��H��c��,�T*�պz��X�]�l���p��+
�{(>'��ٷo��!i��i))�,O�<9�d��M���֌��]{�N�0A�ӕ�W���&�V��{���5:�(��=���SRR��b�$��'���Y��?���I������]���������Б#�k4Q�?.K������ؿ`���!�CCB\m�@)����h����ϯ�7]��VH
�����F����$I�Zm��=��
�o6{͘~URb↍�!���%���BZZZ�z��7ސ���aӖ[n�a�С1��>�ޮ2��-~J�6��n���!����)78�S�k���V����uZ-�0�$���G�=r�r�팫��[�����1�S�sB����{��WT����7ԧ�����/^��>}����{�^9u��K�m߹�ʩSN'�0�5Շ���uw�5s����V��T\cS������P&���?l~E]Q@u:��3g�ٷ������3NdVWW������
��lV��b�@cc��f���½�:|��â(Y�֣�3�������{�Х�l��k�}�Z��5WO߱k�C�?��s��8d�A�O�4)<,�n��q뭭m�o����q�**�V�Y�l���)�&.[�����O����.-*.���1lXUu����IW�^SRZڻW��K����WT0�)-+���9SP��s�h4�JO��o�sN���'���[V^֯O�Syy�<t�������n�k4�4��3��������(�1�Q�s՚5Z�v`��q��������}�J����ܫgzP``IYYfVVbB��1������K[Z[��Z�#5���o�Zm>>>ɉ�MM��qqAA�p~�}�~�a֬_�j��ii�θ:"<���53+k̨�F�aێ�aؐ!�f���_\lLCccz�&�q���l-*)�����#!>n���6�-�GZ��}�{�Х�l��v��顡!�AAKW�P�T�|��^�d�Tb~��֛�RS�m�PYU��C�]��l6_9u��
�6k4�^=��RR֮�PY�^fͺuf��Rf��M�!)1aђ��e�&����h���/,*ڸy��qc��jZ~AAvvvtt�G�<ߧw��k�-[�r��]ӯ�r�ȑ���w�{����B�R��P�]p4/!�]�\����,˺(wϠc4�k�!D9ܹz[+G<f��C�E�L�.sJ�R!�ڑ��^�J�:W��y�cz^�q��'�*�+`v.{�;
!!�6;��*�+ѿRK�_*#˲2u
���)ɲ��\�]\	���ܯ���gow<����zOw�(�~���9}�]��&�8?[zwo�Lt���r��9�.�����0�}�������;�!��E��օ�{��!�<�~��8�B����~.PW ����2tv����~�m���!`o
��'p�<��P��5��s�O)�dY�R��LI�zF�$I�(�� ($\O��n�X,�,��t��r�:��::P�j���R�Uu](����~�P�t"3����!��:��4��l�RJ)�0J�M�H.B�>��9�����x���>��㸎��`����e�ݳpOC���?2�1U�{7g����aMH��z�~�y���7��[��q���\��Y��6����|��ѵ�7dfeB:l�Z	!����T�9��Լj����_^Q�R�X�%�q8�7nZ�|H�t�������]”����S[W��u:����K�-߰i� �q󖟗-Sr��8�m����/�j�=v�kj�sN�Eg�C��<�~�Ɵ�-�5��������s?-^RW_�>t�o�TƉ̗^{�������>|���u������z��w��=�}����~'�B]}� /��Z]]�0,˼��k���uu�����A�վ�ᇧ��}L3B����z��x��+*+����K��[Z[%I�a�����ֽ���onܲ���	\�555o��><t��o����R2(���k;w�%�UU���F�Ą���3g��9�8'�K����z�̙�b��I&=��m��ѣF���ܹk��a�"��Ahnn����i�зoo�ͮQ��mV۰!CfL�*91ѕ�����ˤ�����[���ӧ��:<,��M[�񼣱�)3+���-$$���\9e�{B�a|}}@��{y��{k�%Li.�ק��nS�ա!!�B}C��(���%��N=|����¢"�V7b�Я��67�Իo���˯646���k֭[�|���{�QJ[[۪��'�?�����,S�^WV^� \�?�=�y�����b��op
���o?�����{���^

�2i��7ܠ�h,�$����$I���8�����y���VI��o��ԝ8a��A���<�O�4�wޝ>���{�I�ԳG�tr���5MƉ̧�yN���{���SW_���h�ڴZmBB�����%˂��L��s��x:??>.vЀ��2N(9����Lcwo �T)��WO���o�?��,�һ~���<b�Д�Y3���9_�M���h��.-+cY����O<>x����&>6VYnwo��(�3B��_PT\���#---5e��ד:��_�|��K(���|��m�΍�7?�ߧ�j��ftz`��eY6 �"P�R����
�J���-B���:�DQl�X�^^ʹ�$Iv�7����B�Xk[��`�TN��������=�9�N�JuID���w:�����Jgj��S���V�!�*�����r�K��v�ƢK�T	�Ju�8Χ���0F��Uҕ��ӑ��d�gBB�{��eY�V������
��f�]��w�ݽu�w~ŃT�շ��7���:�Jr)ƺ���R�������ݫ�B���9��zF��^�D��B��	�y@!���!�<�o	�8��`�EYn�W�E+���En���2���+w������?s��]˒���>�kxWw|�ȳ��@訟��"�?���XW��ϵ�fs��ޣ���?��:��L2���1��b�8����+�a�v��fc:2�
�(��r!�aYX���
na���U%�][�����v�M�γt��uec�F*����N!,4�ŀ.9]!n�J�Ơ��IR�e��~�E�����GV��(ot�u�{#�0}�YRb¤	�3�w9�y�ٸ��;w��f�_�$�m~�`�5��/]�B�k��~����f]q��edY&�lش�9_M?�������w?����>2<�٧��p8�x�ںz������W�]�z�ZI�F�6����
�#%CCm]��3�U��ӧ��
hkk�=����_��G�~V[[�=Aџ���QI�P[['�bCC㙂��ǎ+��Q�����9)�"CHYyyAa!� �bmm]Nnnfv6<th���WO�6r�pJ����dn���_�����MM�JU���=z���N��}M(��uu�(���L!�<S���'2`Ӗ�����g��|ӻ~h����*-+;r�$IE������Ͽ��3�~XPPX������Bill<p�PSssaQ��2��)P��	!�MM���������c�465)������o��Ǘs�1��3o�}��<��@�>�?���{fŪ5J0w�;��	��X�}<��gkjjL&ӷ�p�ĉ5k�G��˲��K��)(ظyKxX��늊�DI�߷/�EA��?��KZ�&6&�G4��S�>{�L�}{sO�
6��ٻoUMu��}��X��⟏;^T\1�y��y�2�FSSWw��!Y�;w�JJLx��OY�֟�-c&.6����hnn޸y��J�ʞ��珟81l萸�Xe5!�(>�Ŀ�����/^=mZwC�/�������v쬭��۱k���RS�bc�l����M),,�2{mں���浶��t�3�z�ĉ�>>�N�J��\��l�z���K��4MqI�^��<q�����3��W�tBH��3���fiY�(��))f/����ʪ��#G���mܼ���j�A���֮�~啾>>���[�����d�w��K��>tР���*%vޞ��N��(Iqq������?�g���/�ڳ��x<�[n���-۶O�bjII��vM��ПD��T*�T��>x�l����>R[W�ѻ�@uM��u���-��z$8((&:�?����V>r䑇ܹk׿����Ͻ`�ك���~�V�Zu"+[����2h��U�:���=i���o��j���<�;�E�R!�!��mB�$J����NA?vlDD��f%QyU�q���G�-������dIn�X���ׯo߼�3�P�2��j}���W^�:y���`�O�D�}Q���~Z4i�ӯR��fY�n�(I��"@�}��EK�f�<9�ӏ�b>�ޣF��ɩ��	
r-��Le�V��aYA���T*���kAP�UN��f���B&�n9N�w��)!!�e�V+�����&�0P�CJ)˱��Z�RK�$I�S�݆m?���*� f/shH���?׿_�}�����p8:��3,k��DQ�]��$C(�O>�����^~���4��eˮ��윓��M�'N0�L����{�|卷�g�����O�\e��dY>y�Ե3�;�N�Z-�2�p(�ݫ��5k��\��V��e�w���:0�U���]=m��M[��k+׬Y�fͻo���h��
=�R{��μg�������WM{�w�|��V����~�7���gDQ|��OWVW/]��Ͽx����@�/J`��,<|���Ʊ�rD`�f�
2x�֭�?�h\l������_~�
��:�_�Uk�BG�6�(�#�w)�~b�+�L޹gϖm�C����o�幗^*)-�>�;q��+W�ݰ���%Iҿ�x�o��/������I��q��˯�,?__�Z5z��=����C<��yۭ&�I%���%K���5j���1q�����L��#RSRz�ӧ]��}����2Nd8���Ǐ�V&����+;�dd<�ꫦ}6��-۶��g������z�
73��k�?��+�))o��ALt����ﵶ�����<h@�-:'_���t8�&�Q���6{yyyy��n���PfdlhlE1(0��t��;�ϲL`@����Z��(�X�f��$I��,�FGE�,[W_/�bHp�$I�����p8A��um����f�h5Z��{yy9��i2�l6[aQ�N���W�TM�����~�>>�--�-��,���� �J�}������h�i�<Q]}}qIIXHhhhȲ\XT��m���s�����VPPh2���b�gJ���	�Ʀ&�F���[ZJK�BCC���~M���jI�8�s
��0�aZ[[�kj����j��n��lN���ۍFcP``Mm���S��_�պ��?��׻S�~���ع
�t���f��K�s{�Vo��t�t�j�e�<��b��,.Zo�CwS���}�;m��pA�Z�i�p�D�1��#u9����N�ЩO����/u�
p�`ة�tY9��{/өB�:~����Wt=����=�uYq
������B�k����U���.46��⧟��j,��G!���!�<�~��8�B����蘳��`�D=WZ���n���By�~M%�3!ĕw��KMyFy���A�/�7����cJ���J�&�����~e��_S̽���	!�=9�{rr�-��ѝ
���>�ک�)�Y,:���X��n����/�.S]֟N��@�á����خN�B�թvm޺m��Uf/���/<,̕�R��Ke�����o� 
�]w�ѧw�NA�הQ�<|���
r�����򫓹�C
�5����֯�����((0�����ٱs׊�kQ���ǎ�^��9B$IjnnV���RSs3���t>��K��N)�������Z"��J:�Ʀ&�Á��=!V��b��{�X�NZ����V8p�Ыo��d7$����Y�Vמ�����$I

�X!��Ժ��VB��G����#��K�Ӟ%
!�onnf���޹��+�L���k���չ�*�T���e^�(��MM�(P�Fs*ﴲ2�~��j��S�޾sע��xyy���>�أj���>_�f�s�̙}�YCc�����|������>�hǮ�<ϗ����lوaC��عr��Ą���z{���v������sr����LƷ�{מ�e��}z�z��w�,[�mǎ]{��?p`����7��G��W�x�e�Ȉ��߿���e�x��S�Ԋ�k��[�p��Ca��J�ׅ����ƛ&�)7��W�|{�����K��ڳ'/�̰!Cv������
	�X��G�8��l�ʬ��%˖�UT����������^�8~��������aC%Yf�B�s�W�}��$'m޺�g��qc����o��5aa�A��---:���c�~1gϾ}>���E�c��%�ǝ��dv��?�<��s/�ݰ�!LyE�(�w�vkB|܉�l���#-�R_���o�[��f���H
���vԈ��--��.}��������,[�r�Sc�������KZ�[F���Ҳh�Ұ��)'*����?�t��#G�͘>mܘ�������y��}��{�9zl���cF�=rD�����#�
������F�>��U��-��Ӯ�������嗴Z��}�@E�|�M��ʼKu��&�E�ܜں}�p��w&$����lڼ�f���H[�a#�{�̔���'�����^%��β�L)��lo��J``@`@�;o��m���N���
)�b}CXm����01':�{���BC�����LʫF�1�t�˯��Ŀ����̬�Ҳ;n����u�u��dF������3j�gs�������O��:����h0��1�	��_��M��
��
��\�Z�V�0�z�$K�,�,{�ࡕ��̾g����Ҳ�����f���~�}6�˲<n��w�zcيU�}�_zz�������e������X��s�j�,I�5��w<t�l_�y{��z������O�0���
*w*�Z
!ػuM9�~�WF��+��;pp��)�uus���O�^*����9|س/�t,#�h0H�m�I�v[jrҭ7�d�y_����tf��$���L�o��0ǩ����n�7�^!�y�YY�II���ꚺ����F������A��=��G|�>Y�be����,e!��E�q�A��:����2.�=�gB||fv�򸠰()1Ay\]S:�Ni�T�T��BC��p��)8v<#(0�a����_|����=��3j䗟}���|��a��9{��a����<����s�L\1e���箙^&ӾN�>���SXX�o�������Ӯ�a�BQ�F�F�iiiq8�$655@sKaș��o�����m�X��ĸ�?-�?SPTTL��F$���K���jI��ʆ���/�	�,_��h$I*++1|Xuu�{}��4���.!>���-˴���,��&KRsK��ee��x����v� ~�Y���Z���7�}�*�,I��]3c��Ͻ�Jyy��ѣCBBdYV�T����7�����Sy�\==2"��'��d)$$x��O��WN�r��7���[g

�Ǎ�n�F�2���3WNq�)LNJ;z���V�Y���`4y��o���9_m߹+;'�W^������;�6o�v�D�7��ٜ/9�s8�$�HKuߖs��U��5�����{�V�

ڹ{���U�T�uu���[�������/88(5%%2"���6488"<<1!>6:���;�G���ؘ蔤$e�Ȉ𐐐�АQ#F����{��3"��Y�ņ~�%�ad*WVUM�<q�qJ=)-+�5��2����:yҸ1��:]�^=�^^��Q�p:���#--0  :229)�l6�HM

	q8~���a����I�	�ojl9b��1��V�N��*��ɓƏ�0̨�EI=j�S��w��8����w����t#˲#�
�j5)II��t#�q~~~��		�)��5.6&  `��s���-c������[�֔���#G���G����Ŧ��0�y�m�v�<p��CttThHRUU�q�����mU���ׯt6:v<��o���.�]�Z��P��ko�y�M7*�vY����F��EtY�J�ߩ�~��/�����E����k�"_�2T�gEe帱�@��t6Sz/)%��J7Pe���[���'�
Tsr�z@�X�N�ʄ�*�jĈ᠜�U��������uս*����,���$��Z犄�J�⊥�b�$�3�S��_,ө޺��(S��.Vʃ��s7��ܼ�gRWN���	��{��t�f�c���N'h:_C���E��?s]Sw�̮���q������?���TU:U�=OaYr�B:U�_,s~E=o	�^�Ƌld�Y]�^��C��m~!��㣿���!�:!�<�~��8�1i3(��)eϽ��+��ۅ<�$I������_CVz��%n�R�?���$I����|J]�H��tnVv%
'0�A��ž�~�Su�.��!�2�0��<�ל��bex��
�ޢ�8��υ��sF�BG�fp��v��z������ؘ%�����#GA�VM��s��
���ꪥ ��%K��?��j/YZZV����i��2��P��oC�0�w��W�|{��锔dǹ��rQr2������1�$���CG��TƵ(�2%��?.Zd�٣�"�ʿfݺ���0Ltd��f[�z����ʓ�Y���Ι��O���	����:s'!�����w�s:�׵�k��b$�t��
]/)�y��̬l����^�O����/����֬ݾs�����|��$K�贐�%�5��\R9�}��������_|�Up����j�>��Kj���O=m�y�܋�.˸�ֻ���<�cמ�۷+k2���7n�ҷO��?�x��=j�����g��CG�|��wP�P�⫯���u����7o�O�������tY��}�kj^{��cǞ��]�n�V�y��w6l��ϲLiY��ѣ�n����><bذ��ڽgמ=�U}����	]”�'3+�˹�nݾ=&*��Y����l�?����o�;�����_����x��!�N
��$}��g�	�mm_���A�{��%�l����M[����s�
a��3o�m�С��)-%9  @i�(����o��Z\PXXZV��7x�m;v�����l����?���9NUYYUQY��'�o�;v�@�����;�����<����������Gn�ۿ��͓O<>x�@��k���Ӯ�ڻW���$�Œs2w�q�QQ11����-�'OTn�v�����qq��t�~}
��:����xɲ�⒍����w�-7����_���(�|�aLt�
�^C8ΡC?���k֯�����~+��|����[������|��k5���ϾgVQq�;��#-��[o	
	�ꊩ=�R�>�wY�=j��,ˡ#G�{�Х�l��Yw�a4kj뢣"@����=���;�>��G�[�j�����3::���<<,LYHHppUuu�>��~\������'�4p@EU��LpPP}]=B.ZTS[��O�?@���SE�T*.�����y;/I�q�ee��.���[�%lܴ��/���J{�ks�	��!%)�f��1��n��;�ȃ�/Y����rʤ�E��6m�kh�X�f�U*���땕U?,�IE�Z�d�	�Zm�c�K�r���s�����M��AAA&/SLt4˲Ǐgl߱�eYY�f�W��_pP�����$�2LǕ25�
j�Z�V�x��{�ХʭB|����--�AAY�9`�ښ��{��y��7?��	�q��s�q��U~�����S�ۧT����>thxx��c�&M�m6����u����Ug��L?.44$,,Ti�W���mv{Ii���L&�e[Z[_{��{f�L����i[֭)+/߳o_�S���h��m?,����i���[��(���磏���A������b������d��3Nd�ٻ���~�葻v��?s��p��Y���A�2 �[)�>�>=|�В���;v����^�������?��H3��V��j��Z,���h��p���?���勗,������6o�VQY�ݛ�.U�*��[n���'2����^���MKM			Qr�g�d����8n�E#��O?���y�1i„?��Df��qc����;�srF�.��{��?�<3;{�ر_}�펝��sr�9p�Ж�ۏ;�n�F�Z=~��O>�b���?,\xˍ78�������Q�;w1�b�Ꝼwo۱�f����o�ٶ~Y�u:]TTdYyyrR���)4$d����AA!��cF�


LMM)..nnn7f�l6˲�������k��0lbb����^�KNL�j�x��f�d�ܹ{��Cg\5������\S[3lȐؘ���Ǝ1<$8(2""4$D��%%&�t���S�EEN���bINNЯ�C�M&Ө���{�Х�U!Gz��L&S��G�����w�]ǹ��Wlڲ%2"b��;�à��N��8���7�%%�Æ!�9v���/.6vЀ�����W�=u��!�a���,��=����0LNn���_?x���*��R��+**�z}rRb����{��)(�u��u�/��G!tq�3_�9e��|��
t*�E�~PFɲ�	Z�$�s��{>�dЄ�[�hI�B��^|xB��2j:��)�v9W�}�n�JEm/&˴����5����oөB��'�^��[�~	:zs��V�1Ԗv,�"e����Gp׾GP`����Z=�_���
��a���>Dص1���ȟ��Ɖ1!�IDAT�HЏ�os������2�[+$�C�ʄ�O�~h��e���.�}�/\�����8�B��`�G!���!�<�~��8�B��`�G!���!�<�~��8�B��`�G!���!�<�~��8�B��`�G!���!�<�~��8�B��`�G!���!�<�~��8�B��`�G!���!�<�~��8�B��`�G!���!�<�~��8�B��`�G!���!�<�~��8�B��`�G!���!�<�~��8�B��`�G!���!�<�~��8�B��`�G!���!�<�~��8�B��`�G!���!�<�~��8�B��`�G!���!�<�~��8�B��`�G!���!�<�~��8�B��`�G!���!�<�~��8�B��`�G!���!�<�~��8�B��`�G!���!�<�~��8�B��`�G!���!�<�~��8�B��`�G!���!�<�~��8�B��`�G!���!�<�~��8�B��`�G!���!�<�~��8�B��`�G!���!�<�~��8�B��`�G!���!�<�~��8�B��`�G!���!�<�~��8�B��`�G!���!�<�~��8�B��`�G!���!�<�~��8�B��`�G!���!�<�~��8�B��`�G!���!�<�~��8�B��`�G!���!�<�~��8�B��`�G!���!�<�~��8�B��`�G!���!�<�~��8�B��`�G!���!�<�~��8�B��`�G!���!�<�~��8�B��`�G!���!�<�~��8�B��`�G!���!�<�~��8�B��`�G!���!�<�~��8�B��`�G!���!�<�~��8�B��`�G!���!�<�~��8�B��`�G!���!�<�~��8�B��`�G!���!�<�~��8�B��`�G!��u�
 �П�R���L����w�����(���!��_�RJ忿Q�9��RB��)\�@qIq��<��CB�G%���E�e���qMy!�d4�:��j�Y��6MZ��b�XmV�
�]�_�] S9(0($(�3{����_sdC�?!D���ڰ�������(��Y��864$�O
��$�y��h����iu��߲οn�(PI��mVN�el�A](��557i4���Y�[Ȧ@�ihh8t����^^�,�i�X������;5j�(�F+S��9�_ٮƦ��ǎ�m^��eE�e�a$Q��v��-��C�eY�V;�?�X�,�Z-I��?�i�.��1!��#�P�2�sځ"˲.��V>���m0�R�[��;B�J�,˲,CG�Y�eJY�qo�$��2�$�a%˒L�.�ܸ��[������/�<Ke���+� I���}���tY�$�RJ�`�G]����GY��Z�Z�j�X\�ՔR�^/I�����h�y^E���>6��Z�e!,�r�t:�t��D�%�Z
�(^輾��ӱ��yyyQJ	� �V��=�˲���(�<�w
�L�Q�˵b�ޯ�� ��#B��gY���c�Ѹu����{g��0�rT0K�-�׷���`Yv�wߏ64""��t�ξ�eRJY���l�2�'&$
�>pYEyjr
˲���T�RUVUɲ���$I�6�d�̑e96&V��t��/Md�R�V�������;�a��ק��c���0�(�ƍ��p,;f�(��Ʊ�k�AP�T�Ҕk��ҦD�D�ˍҨ튤�(�^�~�ťee�BTg�Z=ZX\������e��}
�M��e�~BMD!�� /?�eYY�sN�,)+�$IY&�0,�Dz�,757555J�D��K���CCBT�
:�&�M�,�,����ee�9b�K����`�^��8��y��4u�ɓ�yy�iT*��Ah4�ں�7�}��yJ���(�� ��j�5��K'��#�.7�RI��`���N����n�mæ��gjuM͜��q
Μ�'Ǐ��{��?Wq���z�N'���ݗI�H�n��lv;C���@(�(��� ��������:�c��Z�Z�"�Q�X�U�Ղ(TVU9�%�I�j�:#�DBl��`P:�(�_vC)���1l�^�{��צN��f����ˆ���W�ժT�0�V����4{y�;�Uk��ڽ������8v�O�DQ?f�yޡ\(��B�
 SY�dI�$IbYvϾ�F�!))�ȱ�<�X�t��dx��z��Zm����:hгO?��v��q�s��*�*)!)#+���$9)�a��������А��3�E�%�u`��!�hKk�$IA��~�~�i=DA���o���mv�^��#dY�$I�eY�dYv8�--���~>>F�i��5��2q����إ��H�S��,7f�yN�bJlL������_�L����?���͢vS�x֏��B��Le��@����7��644dfe755
0@�7����ksss�^=}�f�N/�Խa��TB�(�!��u�uF���eJm6�F�	
4ť%
�
>f�Ng�h4ʁGY��Si4��Ꚛֶ���`�r8�?�,CeJ�F��2u:�M�9��%�R���|����ay�O���655����=z��N�EGE	�������/
�UW\��"�s'!��!t�Q�$J&�q�V������2�>��k֌1�y?��#ǎ�HK�ݫ�}<r��3g(��,��;-V�� �(��� V������
G�����Oxh����Ԛ���j??Y��|�(��XX\���8r�pA\�/�wo�Y99����)ddfκ��̜l�Ao4F������<8���(��j)�o����;n2h`EeeHH�Z�fYV�$�^��Y��������{!����0�� �:Q�Az���p8bcb�à����j�����8�_?���v~�ȑ11�,�ʔQ���”[��R�aLF�A�Wb3C�A��6{8�N��9&2R��{y�-����r)`2��F��ɤR�4ju��B+��u�ۻ��8�3��>�f���Öu|���ec��
Y���,7[��Vjwa�`k�ڣ�B�!�$[9HvIa0e!�؁`clY�-Y��ؒ��9�������g�G�@�d���k�3�[�����{�G,�-����FU���rY����m�~i�‹#�HMu��c݂ ������0�jkj�ZwEw��M˪(//-)Y��1VQ^�f�*۶O��Nj�����1���`�ڵ���Ѷm���_�G��X��8�1UUBt���P�IM�B�,S�3�Y�q


B<��
�?w���=��^#L�Nb�A��(�<�K���i�1��r=c�x���u]A�4W͝WQ^�8��EQ�d2m�K.�{繤�H����aYB(J餦!�
"�0<J#���X@U�����Q-�6�Kq=�@���\(�n_�;)ƘG=Bc̶m�4AHg2��|�n*������_ϝ��=�.��<4��us/>�2@�⢢�-�T�Q����8��c���?]�'5
O冎��B0B#��x��uØ�4��!�g�����e
�pA�3�x6�^�r(g�>3�f�k<�n�o�� �#ޚ�R*�rc�2<}�-�Ҡ���'����ߎg�l�F4���|���^>���`�(����CB(�|��!�o?|nǣ�`B)��7i��+��\L�p�</MLL�%�>u�SFA���NF"�O���c�y��u���&�N�*���<ꩪ-�B�\P!���1���]AEQ�Ț���1�q��DQ�` �5��ݞ7����!�.(��l�0�c�뺎㨪
�f����E��f���R*��}z0�x(˛c�I�!�0�꜅c>;�b�)��1��tmF��c�q�uDQj@ �~��Ь��L�#�6h��ˊ,
��1�@!P�B1�>cz�'�"���bL�$����׵�wu�?n|Qn�߲-<��=��P6�!44<����s�--�C��	c�J�y�������D�M����9�%�pd����<oxtd<
ɝpJ��8s���¾c��� B���œj?;E�R���	��^ig�0�<J1Fao���,�R�C4@>g=��l��z �3B�3��k`�|�}��M~�?cBȩ:ҟ6
'�~1�ʝ�!�bޙ�'_�$�M�������r��ϸB���<��_dYE��>����� ���2g^����%cL��DGW�(��(��8��EVA�%ُ�$ɲ|��ubbB�eY�EQQ7�ImRU���|�;}	�e�ȱ����i�8�o��t&�/'�,�Ꞣ(��_�Y᭸DQT�q��xJQ�3�
B?�7��o�ŷQ�1�,�J�ӃÃ�m�X�cc�O7�YLH:�>�{"�J2Ď;���5�Lb���=�7�����]ϭ�_�����x�XO�,ˆa���
�2�N�#2��fM�xo���(Yc<=>2:r�Dok�a�V�_���1� ���#�!I�zO����%��h�m���d��tðm[ dxtDӴ�#�Z�������y�;�'&'�vw����]�˜�3�����i�����wu���ٙ5����'0ƭ�m�T��.��ۻ:,����?p�`bp0��m{ph���2�b� �i��1���`BAQ������^�Ŏ�t
7,^�ذ,{�WY^
�(e�e��UT�)olhL�����,I��N/�0sB#dơ�m��,]���d���$��S�㩚���cc��C�iY�LZ����X*)�bY�5��VWUǢ��+Vʲ�{��p�y��aƘ"+���n�e&SIUUF��纒$a��-�G=�q:(B8��R�5-ӱM��$��P�a���ܒ�g&y�ί���+l�JOd$Y�$9ZX�z��X�(�QJ;��*+*	!�,GW-_1���v�1��G=۶�lv�!x��Mc���8sJ��K���;�?̭��<o 1���,ۮ���������r�뺲$���1�2A(--��&[�Nj�%s��O&L�$���8��2B!Q�	�B��wtu��)�B'z{���]�Qd�(c����	��r]QA�<�(#D8�v�t�C��1޵kWcc�m�P�|}�}��y��eY<�F����N˱�kUU��7�m;���2����Z@
� �qJ�,�c]�B���'��1EQ���b˖6XS���^��c;�(�a��APd%7����.���0��cJ�㺲$���8�M��k�@3n�c���ǔR���:� �b|F�1&K�"�5�8Z��2�(�<����;>Ψ�1bl�.���1I�x��H8���܅�o��cUV(�u�B?�5s"��*-�]׭��DY����+wVݣ��/�n�S��"�S/Nm�I1�e�>쭜�zD�0���+�y��F����ϴ�ӓ6(w
�,�!A��3�	!�l6�PJ�>�T?Sea�M��c�L�≘����S3�#+�2�~ȣ��dhxxdtt�w�;���KF'z{EQ���=�
��E�"�0cLU�ʊ�g�d�_�!��c�@ 1���y�J�9+Y��~V���<�1�>��=t=�{�ռ�'B?���	�k��}d>ߑ)c���3��}��������CC8bێeY�}I?i��5+E���5M:Մ��8K)���TUE�1�tF�:Ƙ(�U�l۶����d��y�gS���[�~oOj||A}]ռy�1��_W�REY�5M㹨|{n�J���_�r�,?q�����$���E	��i��	!�PH�4��_<��
�}��Ɛ �ѱ1J�"+/��o���631��iuUUii��8��1��1F�m�q�`0�{<(g�Fp~�޲���~���<��y_�a�|pɚ5���y�
�x$����(�C��M�V�Xn�&!$���b�6!� 	��$ɲ��y�#�X0lniy��ω��o��ys���վ��U��RJ��h@U1ƶm����0F��`B(��H�u]Q�����Z�
E�,c{�jM�R�����e%񸪪U��1���O����ċ�ϟ_���dYV ����P(�8N (�C��C�ᄋ�K/	�j0E��7-,(�B����9�A�A>�50�<�}M?p�M7\�i�eY�پ}YCCiIɾ����E%񒭯��4͛n�a��[�~wW2����
��_n~)�|Ӎ�a����


�b�X���]__[�5MUU��E��|��,�r]w��=�<����m���`0�ӟ�!�[n޷�����d�-_�i �(��T����c�ի2;�~'���_|o�^�q�Z�N�uFi$��Ư]~饖e�������Ըi�w�v�;���+����?qb����UV^s���]'O��LL\�aCˡC�Z֮^]T������<��3�H\T_���)�[�mK����ŗ�]SPP3B����itJ� �ܽ�'?{�o���ήh�p��U�(��'?�,��M���ۋ��3��,ˊ,�����;�s�ر���Y#���g�l}���o����M�؎+��69y�M7"���^y!�V���i����a|��g�
���ɲ���-]]]�?�����?�ۿOj�,K��?~��nY�c�`۶��Ԧ{zxӳ��t��'�bk[�/^x��(&I���G���'�b���m�k�������EVv���/_�\\TD9���λ�v���W^)/+{��?H$�l���}E�؏~����a^I�\���	��i�ŭ_z�M�W�����ֶ�w�~{[{�ަ����W�\iYv:��_U�j��{�y�x:}��#39�L�z�NRFkkj6~���V�Xְ䞻7��k�(���̭�|���C����'����/�;��7.ol,�!�L�K.]�h~U�#>P5o����pX��X,���ʁ�I�c�|���?���i}Q����}���oswA}��7~�/�42:ڰtiEE�7�qw{g�‹/z�����o���6\����<p���_��}�ͫ�\G���$]�n�]w|������c���#�u�W.Z��>�<�?��m�J�%��m޲epphõ�_ؼ����W�ò��e
K�,~��'C
K��tu}�*j��pA]��7���٬���Y�0��� �(�����H$֮Y][=xd�ܬ�mڿ�lN�����^su{g��8�tzᆭK׮u]���o��vddԴ��7lX�h�c�?+�e2+W,Ϛ&B�4M�����Z�����Ą��G���䤖N��WUm{����O�n��y�0ƙ�	�PI<�{Ϟ%�)��y^6kZ��bL7�@0���vEQ2��瞖
g��Ǽ��+[_�֎�MM7�t��%IO�|�w���������w�(/��������ijn����+�˷�������++*z{��\wʼn�ޒx�z~����K/Y+�� -����ww��Gz0
����_����������eph�4^rݵ�-�~���̛;o�k�p�h4Z����77�?�x��[���7����׷�˳�!IR{ggss�MMG��xA}2����K�ɤ븋-���]�jՂ��L&����1Ə<����p0\�xquUվ����7%Q�Nj-ˊ�����;:/\��˷n{���%18�a�5�EE�����Ǽ�{��3kA�$I����m������?<�i�$��a�AB���qEQ<׳[�$ޅҲ,>�?f�6_�(
�Ȳ�`0ȟ�*��Sz$I�f�!�@�/]m�6
!Ŀ�2�قH�u]A0�~CKEQ��OJ�$I�e	� �q\�@ �:�(��ecB��EQ�,�"��I��m���8!BAA�����}�_�x�������pkkk�	M3����fY��;o��V�0c�q]UU]�Ea�
����l�'����٬I�m�,���> �i�����i��n��l���Fv]�8U�y��M��Oc�5M��뺮;�0t]�[��BS#���3G��*w����-���=<2���PQ^qp>��y/�0&�b�7��xQ}�a~y?�S��֩A�Xލ��o�A�~�?��wƏ�?�Sס�'���x�8� ��[SS��Y�x1L��S�h3/~�r�r>]s��\��F���2Ƙa�p�S�]���gu��c��SX���_�h/BdYFg�
�s�?�%���d������BH2��9~�0[t�B%tEXtdate:create2013-09-27T11:38:38-04:00�p$(%tEXtdate:modify2013-09-27T11:38:38-04:00�-��tEXtSoftwareShutterc��	IEND�B`�site-packages/sepolicy/help/login.txt000064400000001422147511334650013724 0ustar00By Default on a SELinux Targeted Policy system, all users login using the unconfined_t user.


But SELinux has a very powerful concept called confined users.  You can set up individual users on your system to log in with different SELinux user types.  This Login Mapping Screen allows you to map a Linux login user to an SELinux User (see the example after the list below).

Default SELinux Users:

* Terminal user/ssh - guest_u
  - No Network, No setuid, no exec in homedir

* Browser user/kiosk - xguest_u
  - Web access ports only.  No setuid, no exec in homedir

* Full Desktop user - user_u
  - Full Network, No setuid.

* Confined Admin/Desktop User - staff_u
  - Full Network, sudo to admin only, no root password.  Usually a confined admin

* Unconfined user - unconfined_u (Default)
  - SELinux does not block access.
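
As an illustration, a Python sketch that maps a Linux login to one of the SELinux users above by invoking the 'semanage' tool; the Linux user name 'alice' is an example only, and the command requires root:

    import subprocess

    # Make the Linux user 'alice' log in as the confined staff_u SELinux user
    subprocess.run(
        ["semanage", "login", "-a", "-s", "staff_u", "alice"],
        check=True,
    )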
site-packages/sepolicy/help/transition_from_boolean_1.png

[binary PNG image data omitted]
��#GZ;;���� ���V>� �MMp�=p� ����<�>
�m�Ag'�����[�z5|�;��`� ��c�d	TW��~PZ
v;����������K��
��J��l�;<�( ��P[?��"��PS6���44@WP
`��[p� <�0��&�Z��uup�PWo�	�>
�i���]	!�׿��7a�hX�n��Aַ�t�����|�D?��teæM�a��?��_�r�M����-۶i���ѱ}�NJHII�]w�~ݬ����l%���$]~mmmaA�M�_���j�歍MM����߲u��i��w��0d�}��qƍ���+t���ȸ���o�5���	x��[Rl�l����!y�9C�����nj��ѣ|�{�?/?/����oh4p��w�qݬk�������l��<xBH��'�),���''%����������~��_ttvm޺m��aw�q��3��=zH|���/��k��5�������z��VV��j�<ib�~���=�--{���o~��_����s� �2{䫮��E���DJ�xc�?�f�ʦ�p4r����PP��~�X�����a�*�ӟ�/���?��j������fX��� �E�@U��֭M�iӺm�{��$��w�_�v;��t[_�
�Ə�y�	�A8xv�߇�����r��<��gHOQ��a�^x�9����>�F��_�
���������[o��^��x�}��H�j��������a0�4##�������`�!���x��p8�ξ��?x�����XLQ�4]vȫ֬yc��)S&	O(���D��K�DQt8-�m�P��շ�����f#�i465�D��L��
�RJ��#�
��~�u�?��o`������i��h�-X��[s�L�$���XL��������`��ψIU�]~}C�b1�����565)����Ymt/��d���۶};t��d��q�������Ԝ��˪�R�����n7*+/ojnVT%#=�]`�FSS�A)S�50_mm��;��$��������'�>kE�������$��[������0a�B�k��@$�����(8PT=~��|3���(`�i@i��,���
+W��Cn.�\p�%��{p啀l��i�1��‘#�
II���c��@4
99
��@�
�y��cP8f��O�YX��|3l�qBյ50z���4���9s�U��{rsrLӔe��v}�ly�n�;;+k��ҋ/����'^p��m;���q�3������;6n�|���<h��o��p8�M��`�Gr8�EE�i��/[VWW?|��}�ϸ�ⶶV�?Ч��������orw���J1��{�{�ɲ������2��B{��=z���C����O�<���IUI���p84e��}K����UW����?o���c��k�<v��7g�ްi�u��6d��={/�pZCc������O?�,����w��KW�Y{��|�/�dƇ�,^�t̨Q�^s5������޳g��L�87'{��w6l�<��Ǝm�JKM]�lyC]]ߡ���׏�(m��’����hY��͛k�0����fo�`�
��C�χ��w�qVTmf�܎�w��Y���F		��;AEEBAA�������{�vL����� ���|?~��̜�3�<�̙����o��[a�Hx�������g`�lز��᭷`�n���ƌ��<HH���
�� (Μ�)S@.�?��իA �9s`�,�Ȁ�G�����1c��?a�&�Za�����~�	 ;�������	x�E2�\�Y���L&���_��>���
�ćCB����v=�������c���?6O�:�Q�x�M#���0��iom���j��d���`4�%��џF�� �@f����|>�b�X��6�Q$~SWk4B�@"i%f�F��w����t:�a�=�=���Y���߈D��^~��i����^z���s�1���6gt㞯o��l����n���VK�"���2���܏��dj��&(`0�X�<�/���&89��

��hO�����<Ă�AٸLt��@,gg�X@(��j@,>0��A���|0��C�(
D��G7� �$ǛG��fR�_=p6�����ϐ��zq�w34M���Q�&��v���ZP 
ǡ��W�a�B�P(���0��8�푳F
�h�
�����<�=*��M5nl'n<�=�=��h���f���=�T"���{����h���߷��$�i��5oF��|}�� ,f�f���hػW6v,%A�o�^8~T*���J�q����t����f���`�5�Y��:|>�d;_O��4�d���>(�9��m?���Z�8y�<�=�=�����;_ܪY��q���;��9����>����6�~�{`���-�J�l��8:���"M&��=+���&���y�����L�vak��+����•+����a��#��
Z8�����������@��
���C����??P��2����e�h|�
۝��'9���[����`�.<�����w�\���	9�+JG�1�=��Ñ#�jL��&�Q����z�>}@(l���7�&&��|�w
r9|�
���ee͂;�y�9n�
	�
�s��p���N@�~�x��(*��k����(Ҩ��m�h����p�˴
A��p� ��O?�y��qx��f%���G��ÛoBYx{�F(��N+׏��ݎU|�rڭD����9�4����W/��[x�0����3׊a�$���q��ؼ��:/�LWc�v�CH�)�Z��j�O�g�Wi�Z�ͼ���]�����ܐK�i�M
і���m���o�J�ףiY�|�w;(P��v�`�1��[���Ӱ2�h���a������=oLS����%ޗ���QC�o�
���n�v���R�m}���װͮس��K{�*����#EE����=y���fT��@AA�|������vA��?r���rɲ��tG����mI���]��j��_-�q��l�q�nybP���a -=c��?���Z������x��W�x�����
�������⋐�#G¯�A4O��GA�}����ga���+�ix�9�2���� �Z�a㵘Xd@Ee�F��0�F�EE��Dq{ۧiGB�>����X�뵙�r�t{��쳱����ۦ��:���]
�쪢�R�i�Lgu���!��u{�ﮬ�hs�H�,���~�Z�����:��������Ӯ&:Х+W�kjk�sr�@ ��x[,����Դ��R�@���S�P(�RS�32��ijJHL�����+*�SS@��'&&%������X,~��G~^���׮[����b��Gش	�}�}t��Y��\�*�<�ڵ0u*L���
AAp�*��3 ��h�=��I))�)�:��,_�u�N�Œ���)�if��h4VVU#!E��ϿSYU�ܰÆ��Ĥd��I��555��F��74�JNnll�vK%'746I��Y�Y�96�
iVTT����x<�NW\R�z�i�54����(j��+��ܥ���E��������>��T*e���dz�MM�=����`m߹�ҕ�����:Ze�������7f���cbc�BCg͸���a��v��!�	������O�=;b�0�՚��s�豼��oW|}������ʩ�'ˤ�{����l6��Caq��ӧc��|핡��AA�<O�֠<0�>��$<�8��O?
��
��y3��^����d6u$��?��3p�
���oM�ȑ��ל��_��r�ZEe�ޠ�
���}}|��k�+*6n���o�����
�\�]>]�цM��RR�&��h�DH$5=��

���'N�^�l�Ç���͙�x���P_�s���K��U+V}WU]�����}���@Яo�¢"�X��/��������r�����7���K/�L�$C��/��˯��uvr����{N���he�$E	E��G!�o'&C���;o�ٷ?-=C�,��8|��>X�����d6m�ڡ��$ww�T��_|��V�HJNIIM�i�j����|�����III�⒒�4MM�ƌ������MS'O3j����K��MM�a��ljj���x��7���|�صw_JZ�s�<
���i׀r�RDE��S��cc��Ν���<��#�f6acBL�_~	��з/,[}}#F4�wv�g���ysX����z7W��Ç?��#ӧN�4q|��ȯ����˫��qڔ)?|����a��]�I�K>_��믩�J�=a1[�jMDx��#F�Y[[[WW���2v�(�ɤi҄��N�4Z�*�������UU��⽷�R�T0o���sO?��h,V�H(�t������;zTϨ��>_��Q[=h��>:���m�̺3�\?�Ǜ>��Ȉmu>3�̎*Ճ��.]��B"_m�u�l΋/����$I
H�d@C�|��P(����3�2�B.7��6��͆���OKO
4
̜�ŧ�����5?��H$�lDB!��p���}�,[��X$�z
��ͤ.���Ͱd	TU��y�'����?!��Y E
V�
���`�d�8ƍ����v��Dx�	 �V�޻����l�Z�@Ӵ�`�)��D 
l6I�4MS$I��bd��q�wɩ�o���P �<���Z�Q=z���I))�}�X-�RI)�����hik�u�#>�8�a@,�8.��$)��H�t��O�d6[�����w�r�N�4��ŵ]7��p�>u���
�
==<�O��WK�|bh��޽z�B���v^_�PRV�ƫ/Cӿ����I3�_��Q=N�:����Μ;7�__��	II������d�%�7�VYY&��w�ݗ��+��NNl�EQ"�����Å�D����w��}}_��4���h��`�ؾ.^�'��X�^�E�Iwv&H��>};�l_
{�6��xy�ڵ���׫�S�\�)z��a?���ϟq�t//Ϸ����xĖm��_�(�{�!���zzxX-V�����f��b�ӻWhH�,f���+L�4qּ_{�%�Ht�…s.Y,O�����I�f6ە�X�2y��A��}��CCB~p>�7�.�-
��K�,_QT\<{�w���{���>=���3g�rIR���׮[�c����q��z��^��d#I�\6�M��99:@Mm��f�����K�FSRS�\����gF����7�4LӴ^�
���A�04M���9�g�x{�&M�������W���l����R���j�Z���
EXH��j5MJ���`@��MMMV�e��pw��;u���+j�����`�����n��,�Nw���9ۏ�����֭P\0l|�-���@��H������j��@$�5�\���**+EB�����d���	�_��so/�Q#G��	�B�a
��(�ruu�I�e��(4S*�*r���G�H���_x��7##Fc^~D���0�����R�A�H[^^A��z�N&��x<��T[[�Œ�O?�������/�L���Zo//�F��hP�
O�HTQY)����&�Z����;Ѷ���V�~�'��A��迼<x�E8sp&M�'���c���eQ�pp���_�V����W��9�p���r�6
|ƍ����s�WZ�&�z�~�ͷ�6u��E�x�۱A����8�z��;r7n��~�ׯ��wx��vϞ��j��ձw��:��3�i�j��d.����n��ܦ��o]��vA�u�CB��	8v�.��G��QP(`�T��>�8���Ό���]�F�Ȁ�LHL�s� -��JJ%<��w�*\O�����7m�t�����	_��r���'�ys��Ye�Rm����wrrzh�<�e¾}��]��r����l|��ii�s':��	H��n�00>̘�N��#i�p�<�>
��7m!P[b�4�_����a�X8���C�}{���x�`��s���L&HO��a�NhllN���[O�i���P�P(��}�>^^�{'|i�OCQ��K�,�/]v�m���_�n�67/ǽ����D0p �~}�U7`�n(-m��
�����zg��Vy��Q�����ؿڻDsK��bܫp����@_}�P�;0��Rx���pl4'��98��]?7��Ŵ����
��AN�/�`w����ں�v�"Sc/i���d���+{!@T�7�h��*+��aM3H��fG��9:떳��"���	���w�؛['��F�;o�V��f���kj?��a�'��.H͵�W���i��l�r��f�_<
��7G����?[	�����L�Ю�߬��߿�Vz��Μhh����h��ط��h4@yE�w?�A�[Ժ;K��a &.~�?�[Ii)Z��h>L��0Ls��f��?�74��ۺ}G\|���^y��w>�����=�6y$�:�[�Y�֎֤��{@����G��3�'�d2�>z��Mg����F�z�&����S']/u�DO7&\���Vk4`g�����K�����l���z��z�f�솭;v��45ic���ͦ(�$I���i8~�$EQI�)�%%�v2C��p�Vkm�i���MIMuvr"I�`0$$%�WTL�:e��߄����a#�z�N��h��j�`4=VWW��j�a�n���.gde�&������:
e�����l69
�-KyEEL\��f��3.i���t�9���h��i�NG�7̀V�C�F��`0��F��&o��h����ʾ��'��N�>wY�ɮ�ո�H$���۽w׼��Eb��H�z��Dz�s���z��Eb� �\��k�^��Ə}����n��zCII�K/<����[�ܺ}��ɓ�
RTR���M5u���^pP�3gǍc�Z��c�@���1i�x�T��LQQq��A�99k~����ѣF�9r���f����L��V�W}�Cm]��<2a�Xn蟣� ������%_��S��8}�ʵ�V������ի�|�т��?u�LVN�^o2x�?z���p�f#�|>�r���7l4M#�
E�)�����SS�U߯�Zm�{E=<��T|T�l������sۅK�<<�_z�oW������aV��߷d��	�hhh����8q��>^��k�F��!/��xzf�ӧ��ƪ5�Gz��G�rg4��<��_�P �2�ͦP(^{��u�������j�Ͽlx���~���������N��OMK��e�NN�ۯ����/�8���YZV�Vk�|bɗ�{�{��p223G
������x`�~F�	>�l�X"�5�~��^x��>�z-&!1i֌��L�d2��}:g���G�7���M�٬�f�X�jU\B�����}�f͜8~|���/>Y���5�_���,>���8��[o~��GR��6�4�`�src���5s̨Q}��NJN�Z��\�����/����=�)'BK,�}�Y��7k�Y���.^X�d�{o��tP���?��'{4=#C��T�Լ�ګ�M���k�L��#2��O���ڧ����2hУ?��Cr��YZ��m6ۈ�#G��f#�g4##‡�U��~�0
	�H$b����!$8��{��������8O�T|B�V;(��������T*-,*�|__�⒒��G�X,��p
�>e
\�t�ʵk���$I�5*$8����^�����H%R�H���(��?�����-�P��8�9;;1�����|�F�P�X$���P(�>>�ii|>���G��$
Q/��V<~�77�M�o9s��G��#򗍛����
T(+�-ݴe����-]�*�����{DF:(J�aH�T(>��R�T&��������D���!�Պ�B.g���@,�aG+�?z䘀�@`0h��6[,��ͳO?u��������^QQ�%��=�ݢ����sP2�ْ_P������a߁�����%*�DZ�'Q�Nh�jU��^�ӦL�i���&Ml\|�^� 7/�:~�gf�
F�n�ʵk
���N�y�4M��4Ø�f�����э��&�	�j5Mӡ!�n߮R��D�/�.��1��<UPXXS[���Z��Å��x{UWטL�
�6�-�����W���씓�ӦLy��~�q�Bq5&&/���ɩ���b6�Y��$I��ʖrqvIIK{����n��Œ�����ǎ�j���
�KJJ���?lHthH|�d��o����t-6V*�����OEF���u�L&3j7��͚=+4$T,3������|iП.��Q="�q9b���==<�PȽ<=yp>���Cc���ZUU5{�L�a�z�J�nn�O=���B�3*������'������g�tvv
�
��t�&N�׷�X,
������o�>$I�E⹳g{yyVVV9:9����
EdD�����!*2�����0��:�����BB�Rixh���gϨ(WWWWww7d>�aa/_�q,(0�G�+�уb8�g�w�����druqquq����trt���ω�"ww���/̝=��ˋ ���J'G��{V(�
������i�L6l����P(���zDF����Ȉp�\~��U?_�ͼ�>���+���)�
�n���>���svv�pws�N��0Lccc۔�f�Y���L����֊���;ҭ�i�XЋ*ǿ�wx�w�6uܘ1m��e��ԟݰ��o�a5X��/�,��ǵAp}����
��y1�h:2/c�>Mӌݨ"zQ%�o{��fcVw�9ଥ�Æ��hرCOrp�m�ޜ�jC�Y��ݧW/�R-3%���qv���Y!m쒲�ѣF�0X�@<k�7�jm�������d�~}���N8E����X���?�^W_�N�������i�Ӧqܺ�ϵ�w�}����q�_��a���)ZN*�����'�&����a<��7�z���spppt;n��4����y|���3C�4EQ���7Sm�ٍ���88���Й���������n��qǛ#g�<&�a��o6p��fH2G���ؙ��]���o�Vz��b��L����sr6o�-lhh<x�0�MQE�tg�b���>{���S���h��hLLJn�M����:�D\�rh���������[��;AV����2aP-v����o� �F-^B��<r��,>�g_��-�J�X�U��]��R���l0�T���}k�?����w�Z�Դ�#��ǁ���ܼ�7�yw��}PU]���mM����x�j5ۃ���R��<XS[������tM4�q4�����ܾ](lV!Eݙf�(8{���'��pp� �ڱswJZ����՘X���^���$��pY?��6
�v�4���Z��ͩ��o�MM�Rm6CC�GZF��d���Z���om�;�Ϟŵ�84+��/��B�~qjZ
�#�M�M�E���s�>rl`��r�\!�<�����m��f����ۧwG�j�Iz$I����޻��-�n��R�мy��|>������!�����K����.\1l�H$Z�Ӻ���	�Ǎ3:-=����F���7^wT�9ZW_�R*�|�5�@��0rtsD"Ѯ�{O�9=x��7��R��h����ھsWYy�ۯ�������٨n�2i��#R��ܶ]��3#h�ׯ���ڕ��3q�8�H\V^��yi��I)�͛��?�

�<��Q�ܼ��A�f͸���{tvr|��gv��S[W���%�w�ڭij"I�7^ߵg�+**^~��`v*~Nn�-[U��ys�̞��8��q���L�ࠠ�~hŪ��f3��f�y��+�BC'O����kj�M�/�v�������Z-��~x��9?��>���ҕ��eeYYo����Q}��	��O3j��;�sr&N�|�7u���~�`0�a����l�_�ᇩ��		�i�rt{�6��A�~����D��}��=j���	
|x�<??��OK$�9�f��uCNn�O?�2a���{� ��^��K�����ܹQ=z����?x��<%��c����9Μ޽zB�XTj�8���jæ͏<4ޜ9b����ᡡcF�<~�TyE����N��P|��ۡ���?8���Z��0�Q��cG�;f4�YZ�~��ԳG�Y3g��A�t�qc�899���gg���R��2��C��E�ܼ|��y���}��P$��pЯ�a%%��nn<�����cxm]I��)��$I����ۇ��O?����񗍛�j
��~�~2�48(�`0v�����z��������'�J���'�H���erY�=@*���5p@�������qc�xzz*�r�E,��N?���۲m����B���|}:\SS3n��aC�����غ-19Zy�K��ukFfVfv���{��}����밡CTJ%��~~�}��y��KJ�|�J�����H$���y<�a�|�������Q��|XZ��>��M�v��G�7��z��x����Դ��Pg'�~�i��m��߷t�*����<hTUU_��ݼ�O�RU]m�X�����Q���{���Cz`�L�T���7n�����/@���,(G7�I�5�
P�PO3�J�<v�xNn�J�`0�l�n6�i�NKϸr�ZQq��gz�/^�\^Qa�Z9�g���v_�z-7/���#(0p�[�iS�|�������%3+���=(0��m�m6ہC����o_*$(�
���]T\�}������Јr��74@uuuBb҆͛�BB������lۮ�j��S��عw��=���Lf''�#Ǐgde�Ơ���]A+�~o/o� 0��G�V<==���bqϨ(/O��=z4����l6?��3�*U��I!5�Ĥ����g�|$ICC�Z�y�����$bqXX���[P`�P(�������ӽ�<##��f���Wh���#2#3���sC�ϛ��0�>޾>>>?$8�Q��n����0��!���NN� ,4�wϞ����8,4��۫���o���b���&ӣ?ԻWϔ�t��8r��KW�h4��F�H$tvv>z�8EQ���ggg/O�Ξ930 ������4E���J����k��ڔz��g|}}"�B������
ExX�X$�D'O��j�����~�	�D���]XTԻW�¢���l��ܤՆEXQY)������������Lӿ��b�ij2�L���=M���垞���7ǿ�L~�Ï�M�<z�����GfXRZ�l��U_/Cqw��¢��_�\������˿��O�4�k��i�7�/��s@�W:8(�6�^�U
���eٍy<�1�~��ଂ'���
�(��q�� �"IR �T*��f�N�Z�	�F���g>A\��	
��ͰW�g��K�y�|�z�|>_�t����^��wB�Z�,���-�Kz��v%�����n�
�@p;��8�9w����e���l�}���T��U�h�{������vp��������~�nG+�����]�qo���5g��X���rpp� �W�3
����i��n`��F�kܷ[��U	)�,������hCS��ܷ�l��4�vM�f5���݇Vz�9��m��Q�ZSS��������LV���e���Q�������i�jQ�f���Ԣ�=+ލ*���a�X
��=�_P�\g�d�$ɢ���O�qp�g@VPQYըVC�����f}�ꬡ���"n4�223�R_�-��QvՍ��_�0I6�5k�5���xI�d5�����EW_㻈V-�ŋ�/^:o���i�;u��+׮@QQ��e��r�G�x��y�2;v�޵wߚ�֕�W��\�V�ڻ��)����t����������W}[RZz�	�����a�

K�M�$����u��-3cN�>������t:���{P�YA���8n0��\e0�h��^�E`W�X�U�D�ٲ��{�F�oI���!��ap;y�LVv;���5| -#-%5���}}���C�%E��
��陙Ǐ7�ͱq	3�n4�v�����)��:a
r����yq	�陙ٹ9�>>R�D��	��N�=>+;'"<� ��'O��‚¢��>�{y����=aמ}��ps�s�����k1����s
w-��Ϝ�������J��O|����Lww�����~X��4�\�t���N�N�*U]}��?��>{V�i�1}zK��fr�ܹ]{�
E†��ܼ<?_��ظ����Sg��޻�a�����P��{��4���[T\��?�33�.\�t��Ҳ2ggg'G�k1���\�����c�����ؤ��x;(�^Yy���>���R(�����o��Ӫ��1�����|��vM�^�>l�ѣ�������4_.YZUU���>������}Ԙ����Q:��o��N*�e+V�UT8x8#3k�-���=u�̑���ҕ+C�;;;�8;�$y����N��~u�~}�Aӌ�j�ӻ��`�i�/�}����(�A�п��ի�F��������J�R*U*OOw��[���̔H$�}�X��|�ͷ<o�$E2͓�$�p���o�[ݿ_?�@`6�Z�+�?x(�� &6nŷ�
�ߏO��y�u��b�X�7,Y����K����
U�5J�r��ݹ��?��^!���ů]���������C$��@"�8;9���:9:v�ս�h��o1{{��}`�X,�o+x<^SSӣ?T�P�c�WWל���ښ�_y���%):!):���WzDFb���A<��|�ɔ������~��FG8@�ՙL&��iؐ!���ӧN�ɤ���1|؅K�D"EQnnnƍ�8v��hn0��=~��J%�J����DG�G��{zzκ�~�D"�F��ģ��o�mI�^x���C�U*���nH��~#��x�2E��wP(b�JJK�M�ܣG�#�_�$5�PmS�܅�J�2&.������5J*�x��=4���Ô
��gϜ����=<<|}}g�w����8���4M;99��
���գ��Aw����{`�\���˭6��`�7_{uמ��E�~~�?|�X\|��b	
	�v�h���)у@]]}Iii\|��b���oh�ڬj��:_��KH����ޠ��M&S�ZM���>�����)��j�(�6��(�(��`D=!�N0�))Z�N(�Ս1q�@QTCccCccqI�Q��z���g���kjki����M��LMO/.)��l��>\�	L�4qї������Aox��G���~�9MљY�i��<���hA��aQQJZZuM��j�45Y,�451��jϜ?������T__o6�+*+�SR�32�32�񤔔��F��K+����@�Tz�ؐ�I+z���R9�T0�ƌ�?u�Lfv��O<�2av�^?��YY�Μ}��q/����͍����'C�����BCB0�קO}}������'Ǝ���x<^DX�N��J$?���V�{�������#�
5$I��l3w9�UVUz{y������',4���s�0��ch��ٳ�KH�t�jZz��	F����kϾ����^=�~�uCbrR\|��h
��ݠ���?/8(���cæ�̚�����u:�#=���ڵI�)��t:ݜٳ��'����U\RһWO�ZݷOo�LVQQt�ȱ��̆�����ݝa�Sg�8��ի������INI��ݻש�gx<^xX7�ߎ^�����%W
��������~Y
��������Ɩ����o�:a,�d�qcG��S���&�.��SG�$����,]��+�g�7rgʝ�^>�[9��fP6�c�)z��^^�^^�"��V�x�v��=|�P��z^�ЖE�'½	rp���'��y-Z���#�{�"I�D��a�]���'$I�c'O���8::�-��n){�T��V�
Zv�����0h;��Yt��[�����G�4��MGw�Q��D("�6a_��b�iC��������.w\]�H��N�bs�x�e��#���A8;9u�ŸW�#ǝ�ø��6ܿ[�{����q7�X���n��o��9888�����fU��d�k#��_������O�|�j�!Z��[��د���ދ�p��7.d�j9I�nȍz��A�wT��Y
9a�i�kj���f�Y�V�����r��)�����fkj�6i���Q���ʈ��~�N��j
F#P��]���hB�۟���E��q1B���b�Z��b��٩0�vX;��~�͆������X�}��y�:�f[n�?k׬�u��K���~W�g��W�.�U���'���&�{$��칑�'���@~A��|��#��N�+�s���[�n��ˋ��������֯')��G����Y���6��a#�	�ð���Ͼ\l��X�J���o�c���OMK�Z�]}�88:	d�?����Wq��0,-=���
�q���^�Y�~h�Ǜ�ڏ~�ӷ��G[ڿ��n�L����$[��i�/�u��ƌa`�X���aس��b=���OLJHJNtvvvqqE�Q�YXT��PRR2u��F����O��0L|bbaQ���;ђ��CAwq��[Ə�����Р��|��1�Z�Je^~ABRR}C����a�/]6�>��j�fؐ�+W��������=w^*�
��?��I�¢"?__� �c��֬[������Gw�
���$��:z������A*����>ZPTR�������
����Z�������stt�DV���ŋW�]+(,�o��@v�^����_�(�HC}C�J���t���.��y���/^�����뒑l����|���Q�ӝ9w�����ݽ��$�����N�P���VUWgfe��7xzx��p�e�\]]�(�F�V���OKJJ�������Ľ��a2��_8�&Վ^��y�t�u�l���M�8���u;w�9s��_-鄶=���
�C�k��Lf󵘸e+V644�;p(.>�����ׯ��������s����8�oش��q�	��`�����Z���Q}���6���trtR�T"���st70+/�������/KKK�fsyyEaQQmm��fMIK�����y˖3��-�t�Z����L&��lf�7�Dz���}��銊�ss�~���?p��Ѵ���>�T�7d�dCKD�})�^���_,Y�P\Zj0?����v���ڳ7+;����'$|���}55i-KZz�V��a%u��`0�WVv�ս�h�o�Z
���ϐJ[���0�h4-��{�A!wqr.,*JHN�a�*�L��/]��=�/�6�P}b���|�X$�3kVhH�����ࠠ��>O>�hCCCQq1T���?�ȱ�[�n=xͷ�r��
��2h�������s�\�xi��C�RF�5R*�VVU�?m�q�ݎ��O�6y�SϽ�r8�Td��������]����K�R�"�M�2q��,\�˯���+�-��uY99(7/;����R*m6[dd�����-���<���r�L�RY,����E�A�L&��/^��0�'��(����??�KW��4���ƫ���W,Z��G����I$h��OӴ��_�޽
0i�x$�������kS��̞������\�0�/<����M�R�0LQIIc��l�8;w�t�k�b�:��h2�5�ɨ���f3EQ��NR�T*��Oprt
�j��#�"��gCCB�-�R�7�����XmV���G��ɡiZ!��9����|n�G��$m6�
�K?f������4ij�ꚷ��Z�����h,-+3z��h4�tz�No4�|}�/Y�������=j���BC�\]W,]���_}�m��
Z�ξ�^����rE]}�N�G_zQpX-V�ah��0���V(
B�F][WGӴ�ln�ju:�:`1[�kj�.�A�{�����2yz�)j%���#8�322�U���x��
���
	�;R������UVU�T-�..��آ֨gL��#"�܅�R�T�T����H(:v���!� 
���5z䈥+V6�5q�	6�6v�h�@������DB���KMM�K�m�y�H(D���Ϙ �R��J������M[��7J�9�g����.[��/ظ�c�OF��M�2�F�_.]���R�Lf�ߘ-f��үO�����cL7&���k�~�5�w���6q�8�ټp�"��4[���ܻWρ�����;����=����A�z����콏�tڧ�x��˳_߾��d��N�;��)�\NQԇ���_�J!�ϙ=��k����G?l6�0�N�����sr2��gϜ������Lv�Emݱs�	��N��'J�٤��4���7��	��g6�
0�OØ�f�X����5e�D6��`�hZ!��mPq�����a�ő$���o|��{~��4M7��ø8;ߺh��0sP(P��AKJK�Z���w�qwsceD-�@ ��$i#�5�B�ٳ����y܁r�_�>�m�0�f>�4��x�zL�>-om2��fu��x�>� ��Uu�`,{mήcoǽ2���B��Z�L*�J�l��T�\
�ڈ9;99I�A9v\&�!���8��D��n)�?�tp����(�����!��A5��j�9h�v|�)�3�t��lW��fu�O��f3�w�q��kD/>�����o�IY�㪯����>�A��Oe���
�7�w�z��Y��o�^z����瀃��l�Gd���QZ�S���˾���[����o��]}I�U��E�;�oK��z��6���^�spp�
��X������p������q�z�݊_K��/������]]�k�7���z��>�e-Ѕ��G��0ݑ��3,q���i����� �����t��`���N4kS�6��������Ql\��k1`�9+7/�ĩ�hICccEE%�E
q�Ga���Xڬ�0���Z�_��[��-"5o�I�MZ�;~�����e�0�Ϟ?�a���W.����f���]�m�_>q��k�.]�������a��$i_1��,X�بf[����e�uu���=�ⷫ@_�6n�l�Z�H$�w���

�V����ş}�e�C�	�a?۱���Ze4+���,���m̭
�nc�Fv�ݗ���x�ʪ*h�^��h���9�i���h4{����y�ӧ�߰������~�>u
�����a4M����Wtt]Q�.\����U^Q���ZRZ
|>����f�i�ڜ�<��LK���J�Cm]]fV��b).)).)A��KJ2�2I�������b��i�6dHCC#�ǣh������+80�Б#�..�-�x��kJ����a��2�������HR�� `4��cIi)2��g�-�򫚚�͆aX~AAYy9�aE5i�9�y55�Z�.=#���s�mS��kk����Y,�UT�G�����O:����'�JՒ��VQQ^UU)��Y���|}�kj|���fN�2���s;w�I�̬������e	IIy�!�R����ʢ�446�޳^�w�@MM��+W+���w����.�D�Qo�cKpP������m3rdFf��oV�WTl�cK]]��n��I������q�����xz�WUW�=j��J�r�СϿ�������.\�<j�����ظ�I���g�efe=��h�87���Y�/;t�XaQ����?���x��,����������/)-urT=~2!1���f�!{��;}���8�H$�I�|���?�o��̺|�jRrʨ�#��f+�����zv�����#Gl�{|b��B��.\�,
����tp`g�5�u�)(*���;o��j�(J D*�H�G���Ӝ���̚n�.�H\���m��`�����dΝ�9:��G'1)���E.��	��/��÷���Ħ��Y�V�����z�����EEу�5�Ç-�b��dz���'M�p��Y�J�мy���IK����3��TK������>z���.���z����������^���ᅬĪ8����9��
��fL�=h�����uz]�>}r��i��1}zdD��I���x�g&����;���H�ܰi3 ��>[�qDXXHp��_�ed�Fvh(�R(��ddf���s�3�]�h4P��6�$)�R9i��&����]}��m�F�L�<�W�ޭ�����E�r��WϨE?��?|��GDīo����<����KC����(���fhI"A>�H�D!�gdfI%RG��b���J�m�Pddf��i�J���d4M�M�������b���f�CҤV�����3���q~������7��/>zlڔ��&�?�Կ?|��{ϛ���/Μ;/��F#X�V��I$��'��ګ&�	�
K�R�H��T&�1�T�6&���W˖W�Ը��1���0#I��a?�l6��
wG���u��0̤�S��خGc�b����sf�,,*JK�P�5MMM�V�.,*�K��Q�x�]�Q�..)<h X���[�|��{FE�����t���j�f��܅���I�$�(�$)��)�^_RZJ�6��d2���\��=v���!C�ss��l���|���l�7���e�e�	��88����u��z-6V&��8FDP`���6
�o�p-6V.�2�7�?��'G�)�&&��fe���:�Q�$��$I�Z�8�}�f���._��Fӄ<	�0�j�Ø���ꆆF� ��D,���׬=y��ko����O�x罤��aC�\���i�/<�˷���]�~��\��Q#\XT\U]�����噗_PQY����Aa�Ț��r���_y�a������:���zD�4��h���drrrj�j����
������h48�+���ZW��@�l*�*;'�I�
�J%����9;'G*�zyy���k�>�dP`�����`��t���W^[�l����Z�1�^^]}�88�}��+�

�|}���+n�������"�nU}C�Z�	
����N�T*��\]5
� dRim]���KQq1�P,�󽽼tz����MMm������)-=�������l2�dR��PYU���V]SC���<==U��
���|}��j���Z��������:�^enV
�1��&L�(
�������g�z�C�sp�=�G�Å�0`֌��d�e��š���*�s7s�Z�Bw�N>}p�����~�4ah�	r���7N�b�I�@�����SЖ(
�0}z��ӻ�p��Frp�s�sYp')*4$d��a��䳖�a`�O3�9�v��~�[lco�-ӆ���
Ss8�&w���&B���A�T��AẢ�Ͳ��)b�'�c�s�@�I`�홛'����`o����㏱��憆c����'T�){�mZ���={�{pc����=mݐ�-��q���spp�����rpppt;8������h���c͊B��t���HS�� o6��fU��Yp��ݙ6Ͽ�%��]t�gh��o��iպwh5֏Bwo��0� xl\�}xo��n@��F$oV�[�-�}��膴y��m�/-�3=C�&S��N�ֽC+��3gO�5j��xԴ��',��s���UU?��u
셝;V�m뎝���RV^��75i��8��8؊Y,����766�/D����۸�wh/���j��P���N=��O����f���J�$0c6��.�ք��h�~��W��d��s:�{�D.vk�����8�ij��Y_���=�V-gQqѾ�{�F�u�~�����6l��;����.��N���9~\��_��++/GK(���
��&��mr���ƨT*@ځ��l�@uM��+W�.��bA{��PyE��b�z��
dA�Μ��ʢ(
��k֞>{*��.�I��4G�2��q��jjk.\�����Y,�?�X,7�j�R�cV�I�nG�7�v�K�U�԰5lsR߯Y{��y���\�|�"Q^~ޞ���<0_$1��ESO<�XRJJ|b����T"!xD|b��l6r��	�'N��4��h4�<����bb���[�θo�������~���R�Vk>|�?���OO�<�̹��v�H%�X,�|��'���ٹ���%˞y��1��y��>^���2�c��BRԨ#�aN���[!Kv��{����I&:r�܅z������M�G�w���̬l�N;r����;}���{H���5�C������M�j��#�"QZf曯�r�򕤔��}�5k�Z]��}�7�^C��h�����y;v�9}欻���/��fݺ&MSHHSV^�����h�>~�����=A�վ�ګ�aah�1�������p�rm]ݣ=ȍ�"Z95Ԁ{xx���^ �����ʋ/����z�@�ijZ�Ӻ9�f���k��l),*j���a��;5=]"�����ƫ�|����wgfe�444�$���/U�3g�@ZFƐ�E����?�4!1�_�ۧ���TJeNN��E_̟������;ƌ9�\���������;󝆃�n�j��?}�w+��t媓��a��7���3ƍ������RR��"-\x���W��ص���_Z��b���(�i�4���_�x�9sF�գGLL,IR�/^

���>��9L7��
�R��}����yy�O���Ӆo����A��_0s��<4?!)��������_3j�7߯�<q|dDĂ?	f��(��9x���}�Y��V�~��:dȰqc'�݈�3�{��ק����򪯯z��)�$iUuM`@@+I�����>�
nn�A���;���yzz����X".)-�1��LJ�(w�L���+����\�p7�LZ���������5k���q�ʕ�,�H$
�B.�)�V+���&�R��|���3�B.��H$�q�c�H�>>>��.~~q��� 4$�Q���Т��<�����n�����k�FF�����j�zht�����/>ߺc��������7PO�ڲu[�Ȉ��GG���+jTJ������\.�0������s���N���H>A89:���<��Qb�Ǔ�d�I�����>28(�U��l6���{�̹��)��AAaa�_.Y�P(���
������f��/�3{���:rl瞽*�2�G䱓'�V��Q
M��Ry�jLdD8�t:�E�
�4M��ɬ��BCC�>Z���WW����ؽ�h4j�Z��j��՚�n�88�6�Z��h���F����������ܿo_``�7��~�5�����WTTVVU���[��-_��so���*�ټe�v��b��B��=��KJK=����`��)Ͼ���>���_\Z���f�X~�I�V�R���\}��غ��oV�>\�Ԅ2)56�q/-+�{���k1�
!)r�7�^}��)�q		R�����{4$8h뎝J�Èaø�ܼø8����fS����..<o��!}z����<p � ����|�Q�ȷ�.(zMKJN)))�7�` $$��J�óO=�T*���CC#���\]==<��|�;>k�W��P�J������������=�??_�X�7�:�K/<������(�HB����<����������+,$D.�����x���S&�99:yyy��������|���j�pw��T*U�AF�Q&��7u���G}}�\.W���n�>��y��r�����J���Ϟ�3{���7�0y��
������55�
�BnWJ�P��ܳC���׉�������??�@�tp�p񒓳c�޽�>0��>�z`a!!F���('GG���$R������+t=�[��3�~��t~�i�Xh��D�s��A*���ppp�%(���w߻o�Ա�G�>��*����v������,����˥Kׯ]Cg�7�/�o&h�E��x<��7v\���v�h�~T=���htO&���6c'�����l���8E�8�aF�4������`�9��#߁���(�#���r9���/���!�� ;bM��aC��Gam�u&mJ�U�q���N+�HzDD�$�N%c�M�4͠
���8���]]�����pp��B�;�[���������ֹb���5��q���spp�����ɸw��
w�8888��������v�q�7�#��L���nT��6��N�����!t�g��f�k70�0�wW��l��
rp�
�`?��/��:�3�[����v&-u7Z��'�"Q~�EE?��v�ً.�Z�>q�+�t���:PIi��K��]�PW_���L��T�^E?y��Iv��
�3�����$���ptO����g�WT�#6��Fv����^�?y�4�V��ݒ
�g���f���NѐEQ$E�gd�Fv	;_�F�d]}��"Z��I��O�o9)������_~e��CPQY���MM��Cs�:�����޷?&.~Ǯ=�55h��`���g'��ZȊKJ6l�$
�n����qumpG��?��i�!G7�ז�ےRRð-۶��'�8������$I��Q�Q�&�,�����̭����%�"[c��n���*�Nѐ���1,6>�`0�]�Uj���!���,��)�`���陙�!00��ӆ�x~A!��陙� "6.~�}�UVU���3��;9:���t�^?�s�����SO]�r9&.�jL������ciY9�����3g/]�ҫg�X,�wࠧ��������N��2�l�减3��a������ǟ%��}���{�`ZzƉS�D"Q��7#����L���+11�/_6MB�pї��R�
E\|†�6�������8t���cb����;'7��5kO�>m2�f�?+ُa��dھsן�w�8^U]���}�̙Ȉ�?�n۶s��l
	���G����e2�BCc��W�]����m��c'OˤR�R�o��Sg�^�ru`���v�ܸ����"$��j����diJj
�'"�û�i�ZçU{�0c4[&U77�F�1"<���s�^�
e�Z�_-[���1,:zي�u���i;�4 3;�����2��N���
ϝ?_TR����O�WT9~b��F��a��k�q�<����3�GL�4i��1����̹s��vİ�Cb�(j҄�ӦL�e�F�9�+6��w��/>��[��QQC���
�����{��i�>އ�c��0~�?�TXT�Ï�
z��G�
�ts0���6o��Y3C��UJ��{��M]����ǖf�mVf��Q��3g􌊪�����O�vߴiB�����^���
<|�XiYّ��G��|��ہ�����Θ>
i��-���>�ᡡC����i����̦���f�Aj9XK6c� tZ��I�R�
]\����M&�sƌ%
����%�c�r��~}��H$��>t(0PTT�����y="#�<={�dh����j�����?wΰ�C6��Gm}=I�bb�r��i�vws[�ч�Μ��H�҈��>�{��rsK>9��A�aa!��n��|>���9$8X�Tzx�;(����0L*����w���!!����#�Ĵ)�������`��c�<��c�w�JMK���:z�dUUՄq��̓Ol۱+7?ZF��Kmݾ���,-#���}ؐ!�a�B����m��QH��q��A��{ꩂ�B�X���Կo_$���q$�#���C����ܺ6:񮢕�����3x<�g^0��:��x하�����P�u�lؽw`ЧOo�;�9EP��(7@uuMBb���{�����j��R��j�0,6.>( ð����#G�$�g�KW���Ŏ3�f�����T��#Gl�{}C�N��X,V�U�Vs��n�V�������0�\&;u�LQq�B.�鴻��Z�4M���&&'ϟ3������)6.����f��<}�ȱ���O(.)��
���0mڔ�K�99:������E����u��f;q�t�R�A�߯��WT�¢�����M���F�����jjk�33�l����V�w�ۧ���32�<t�#ǎ[,V���3y��H�����]A+�~_�6��o�a\]]���IDX���S�޽蟓��ijz��g:z���돍�����ꪪ�=����'� �CC���BCB0O��M�2���;;'g��C�nO>�X�������̬aC��:s������A�l={DJ�b>��#2����� c	rqq�:"<�WTTAa�P(���puq����קϵظ�����Ƈ�͋����4
<(4$�ĩ��5ՕUUA8���?`�Z_z�9777oO��{�=0kfP``eU���V��/<���x��њ��RV�����

8w��Z���C$EF���bü��N�<][[+�ɞ{�I�\��料�ݯw��܄�DM����.48h����80 �͘ϝ����,<���b��ب7|}|�r{����|}|~G^[����|����M�<z����G&\QY�x��˖����v�laq�W��n���.��^?M��E��i��d��C��������-a��p�)
���=
�^y;e������q$D�6��9k]EQ���,-A�4M�$)�2�����e�&Qb�%�eZb�|���W���b1j3Жh��$�K�yX��}&x<�Dl#I����Nh3d��Yt��[����hJ��:�(���oC�V+
�s.���:j7�z�w9w�q~������}g��_�F�����p�?�]s���~�N�{�{�\?G����i#z׵H�2mT�oV�۬*�2盃���Ol�3=g����>7��a����ti���V���O�oG���+pp��hc�}���������j�%b��Vz��m��^�

YYh��d�����~D�V[XT\XX���P�$���FVC�����o2�بa��M��{���k�����<G�̸������^��
�'I��YC���E�d2����hAhK��=բ׏V�X�U�T�]�6��؈6@��e�[+��,���E�j�ϝ?s��%�{�Bw���k�N�KH�¢�ϾX�n	���9�ݹ�v��}��?�)+/G=����?��@S���MEE��e�J��O�>����x��
-��<�c�n���ѝ�)
֮��Zl��|��Ŝ�<��7�<��(�2� X�G�⋗-�p�^?Ax��2[�Ѫ+c�
�js6��[w�oh�q�ײ[���֐��08���p17/����m���R�S����@sJ6���+��33��L�d4c���>�j�<|$19���S"�t�*�7��eޜ��rrs���|}�b��h�D�/^:�b~AaXH��;y�,�c��n�
���� �Fnݱ��ի��>�0;w�ۧw����.UTT��ֆ��v�M������>s.;7'3+K |��dee{zz�'&}��F�)�ϯ���굘k11���
�Z�ٺc�s�gL��F���+{�Dj������˫��:!)������0��j�@.^����Ao/����-۶ee���_����O(+/wqqqvtJJI��_P���oW�`6����r9�d�cXeU�ǟ,���Q���>>�g����8�$�̬��vM�^�3r䰡C�^�J(R�a�W˖����O}a2�;��&�t��={鴺��`������{HMO�i�/!�A:z�\�|)z� �T*�ɤR)Ǘ�XYSS���E�O�X�������{b�޽6n�=5-
��S6GWA�4�/Z�j�V�uR�d2�P(�D
�\*�H��
�~����X�?�\���r�^����@�<�e�N�>�|�7ᡡM75i�i�=p0-==6>aɊ�a��;˖

eh�d2}�d�P pvr�hj��s��|���;s�rW}��ԙ��7ltR�dR�T*E��ah�P��rٿ5���A+�o���=<�<0O"�ڷ<�I�}����+*�����ꚛ�WV^�o���3�5>>:��+W##�q��Ç
{��'�z}fV�������;z��A�ԍj���ب9rxXXXP`��WTVUWW��֛ӧNqrr�:y��b-��T*��e@��Ç������4rt_x8>j��NuP(d2YxX��#zDD�����м�2�L �;�g����]�a��l~��Ǝ���>��C:={F
��/!1I$E$�H�RR��K�M�<�����D$̅���K���{{y]��U��O?���dR�����>2~��\�ax������~���k1�>�>��Ν����iq��]]]���F�ѿo�6�ݙV����k����G��f��Fx�W�ܾ��������gΞKMO��́�.��v���=h446VVU��g��ww�Zm�ٚ��@o���ظxw�����*.)Q��f���MMMZ��GD����������BT�`0p�}���dD��F�h����4��|�ߨnLiy'�iu����lXt�^��x�rEEE]]=M�E��y��y���<��wtT����0i�E_~%���I�|���U�>^H�tAaQ^~�})'G�7�{��ۻ��$'7�Q���lZ��j�2Z��a�ޠ�._��R)U������4��Z[[�����_��_�0��Z���oci���(���h�����������$�H�6��B��;�����#�tt�K��>;'��S/�����/,��ʺx�ʓ�=V^YLR��������O�:=z�� W�����11Ǐ��ٺ}G\BBXh��3gv������y�F$���*.)	����>�,������x�����?$8�ȱ�4C��l�=qb��	��/^JHJ3z����\\���ؑ�_���ݷO��k�{��5�V�#��~�E��>0kfXh�����6̞9#"<,95���4i�sg���Y��ۘ���W����t�ӧ��c�����{��Y]S��p���:z$55��������2��ǎ��4���[�mOKψ���գǡ�Gq��r���~ֿž�䪡㖖��WV����jPu��Ł��i�ڜIQ��8��u�>\���ѣ'�{;b�(H���w?���O�8;3-�+�Ɩb�s��]��a0+.)Y�b�ʥK�Ri�q�ϝ)w�{���H�Z�J�����j������D_}�%,���ap;f4�\�4
���b�Hv�^���!�nk��#3g�����f��	�hk�vMډ�f��G��prrtqvFE��8��4���K���Y����g�l6-��ͼ��Yt��[���ѳu;sN��������J�R���t�X���x��V_����֡Y��]�Sz����p�s88:WW�;-�,��������p��q��G�;�oK���:��2���^�spp�
��>��>ˍ�m�^?G���۬���m��rpt8s���-��6���%��vyU98�i�H���͍��}Z�����v���j��Re0���euE�F��h����fﲿ����")�6f��م�ط�jmS����a��H���fZl�q�e���?I�47zdbͥ춷_uc5�W����co��i����,����E�J�r�����t{���3gG��XXT�o��!Z�
pw���ٿ���_,YVXT�f���խ]�3I��O4��|�}��-��Y"��cgy�پ�SRW~�=�M��讠�k~���ի�@0��ɭ���q�b�$%���X�g�H���O�z=��A�ݫ:���]uceڬb����5���Q��jek���Y�԰����m6�C��s1c�������0�Պ���􌸄���i��'0@�ן:s�¥�$I�E���P�ΐ��{����dtS
�Z�1���WTh4����CG�^�z�$I��32��<e�Xl62#3��SW��t�M���
jk���=~����l�,Y����6������/�lѩ�g�:]}}}zF��XdY$E]�z���cZ��������#ǎWTVV��TTV��h,(,���ң�O����]k��:z�DII)��s�/\�|�l6��&��fdf����8^VV������bc,���H����k0�|�b��K��:-���O�T-�i55������mϓSR=<<�BaAaQTd���gf͸��_~=u�\aa��k1�F��芢�TVU8|��^ع{�P(��ˏKL	����H�����\������
Ǎ}��a������/-+���=w�V�+,,ڴe�B� �D"ٱk���ߖm�UJe``EQܼ0��2��.���;w�=|�^����(�������w��/_�RXXt���I��Ú��D���l��|>�iQ\����j�׎*MQuuu�~�u��);v펍�W9�.����Y�Ն��!�ܪ���N������%&�98(��?�������\]��\��j��ڳW,7�՗�\�q,,4L�R��b�&����94$�o�vCo!��*��f��d򩓧I�2������y�����˝���JJ���~��
���_�=h���FG��a��	���B�@$=<^hH������&�J)��ۧ��>������G�^^Q1|�P�Px�ĉ޽z�E�����&��PV^���<e�Dpuq�׷��k1Q="�y�	�՚��=~�خ�q]�0��ڔ�O<����S��AQ���ƌ.+/�t��G�+�J-V��S'M����ۦ���K���¢�f�~�0@	R�Ba�>��J��n---���zp�<�H$�|��aC�%v�+��Ǐ�|��b�f�R��[&����+(]EQ..��fqI��e�?~��s���D"����4M���w��&OFi\�������t��پ�~7���Lf>����O�Z�Z���e2�ajjj
��bQ:8tNu�\�:$:�[��h4�L&�Dl�Zi�fs��e���$gG'�D2p@�պb�*T�E�W������ƍ��Ϗ�8�����b����1����MA�С�u�L���F`���I�i��`2��M&s]}��b1���f3����b����|���ں������F_�b�0�aa�>>߭�������bjS����7���j��b����ٜZ�$I�in1�������&���	�řL&�ٌ>�����FuW_ڻ��~�a�N������"
r>t��I�3�����{���?��v�}������V���򊊊����A�ذi�Z��<qb����gωDB�'%�H�|��S����D"�գ�k�r�g������|�}v���������\.�H���qptC�R�P$�LS'OZ��[�T:g���Q=.Z�|�|>��Ϳ?q*��o�Ym��-ruq�J$F��/����үo�Ȉ��G�134z0L?��5?���k|�8{�����aC��&�?%I��ܪT��Q�������ko=��J��32�T,���~������'{��ǻGd�ǟ.Z�Ţ3��?uJ!��$��{�̸o��o��DsfϺ�u*;����W�F�����l�iZ*�tt-�PRfVVjZ����`�&��a'GGT%$�$dI���ϭӧLquu���/=~|�W��xbYy��B�࠰Z�U�������������B��	I�(��$�f37�0�D4��t:�b�g�zDFN?N�T�Q�FCR�J��0L���0�H,���a����*��ʫ��N`@I���u0(3�V��Y)���svrbGl0+.)�j��^���C�P�0z�A*�X,�ła8#�J	�`Ϣ�/p�qʝ7k�t�Y�l�H��;��Ȉ�Ȉ@�?y<G���3��%��x��Ŀw��iS&�U���F�~����V��sN��;�L	��\,FK0�L��&����u��Q*��
�zzxI�A=~B&��n�����p��ظ~�6����}@@�P `���X,f���,8�"|n�=��o�
��[���=m`��Q�e����E���_���趴	A�z���e2���-��B���i����
v�(��i�oU�@������Nǹ�!wO�����6�
�^�lc�E� ��'��O��988n
���}��	��#�U���_x��n��L*�4��/ɽJ�e�<88�E��A�Z�
�#_W_�{�{����ffqp���m��w3��o~]}1�U8������h�Q��>m\��F(����RW��`��f�����N��Cppt�<����Et���71��9)h�&c��v��*�'��%�>��=:�a�^��n�.�ߠ�N������k���~y�-���4��G�Y�h��o&���8��^�����7��p�ǟhT�����/�D�����o
:�s�׮�y���G���q˶mH���
�O>�B���G�>]�Ѭ^�+@�E�ן8u�����t:[��Qspt!Ⱦ�l����f�@Cc�^�G�cjjj�m��Z��WKQ'��f���_e�g�ϡ�-۶�74؛6EQ�5����Q�7p�9{Z�������P�������5?���TPTD34h�چ�F��t���睝����323kjj���U*��L&SqI	�@ddd���	��Ҳ���F�^�Ļ��l$i�Z/_����_\R��G��s�]���n]���N;5�.�WJjZjzzIi��d�e+�ٻ�������$I��ju


� �㵵�陙y�m2aa�7�������b��
C��b��ئ&7��������(�l�455��G�#��$I544�t:�����_�հ����f��$EuZ?���h
�%%Ż��|�ǤRM�4C�g�X�x�Ѻ����O�����=~��c�����x�юV�����5�򊊷_�艓�N�8r���!C&N��$�֮[�WP�������ٳcF��OH<}�lbRr��Q�?���W;::FEFN�0A�iڲm{Yy��fΞ1��(��x<�Ѩ�h��:�8��0����D'N�NMO�J$Ν������+�Jkkks��X��{��;p0���j�D����ګ�w�:s�D"�Z���M�</'7w��J��q�w���/?����陙�<8ي����a!!Λ��p��Ѫ5?�������3O�t��j��3�Q�vtTm��J�,++���������|�ͪ�^~9"<��(����32�r��1�Gzi��-�����e�j�A��V�D":d8��g�Y,��|�;~��8A��u7��Ͳ���^sz�o�>���:T�c���svtR��.IDAT*H���o���<�����

������O?�d�W�����#+;�ys��U~��k����W��κF�^Q�U&�-���#ǎ>vl��<�����~���vʤI�x'��/�Fc4M���~`�Y����<�O�^̜QRZv��Օ˖�e����	��Θ>���^8p��'�/Y�c�[���ݪ5M�5�3�?�G�B.�ǟuu�W�]�>u�F�����=sfDx�hg���Ԍ��ЯoRrJbR҆���?��È�'�{�h�7֮�...���?,�d����W,[*�H�y�4MG���߷��3￯��c�065&�0lzZB���刄"ä2y��
Lbr"�QE(
E|//�FMÈ��ťE�MMO=jX]C�F��8aLiY	���;�d���5�I�L4�oYEI�F��45��IҶ�zgd���	��KW.
�O�iLM�&��
2�����);d4�G�ݧwTrj2I���T*��vqu�yLjzjW�D�N�XXhA`I)��Ǎ���
��������0xP�̬�Xܫg$�����3���fȐ��e:�n�������pʤ����G�z%抏�'�)=�7h}��ѷ�֥[m&��><:-#
I�6�A]��l4��Ѥ>lpRr"������͎������?6��@g�2R-��Qw�huZ�a0h%]c#mA`/^3[�͊���f;�$`hF(O���er��b��E"�^��X�6�Պ�X�N`F"�4c2�r��d��lr��$m<I��D�r5H$b�ހD�03�
�I��X���I��J%:��O���h��+'��6����э����Tb��V�U&���F�PDR��l!B"�
�f�R)��"�IQ�Ro0�8.�H�c��`�-N��X,r��f��|��DS�D�l�V���=���Z�a|>�7p�R��h�iZ&�Y�V���70�d2��d�D����cO�$-���j�ҍ�8�����L�ҥK}��mt��nN�fh�ax8���3p��q0��0�1���As���M��a�Gg�
�x<��1P�![�'( �?��q�F��ZԴL'��]��9St��;��F��L�Y	6⍽%���>���۳����
�͓=����X�t���v}��w���$�Fڴ:��f�>�A��a�V������<�ǹ~{Z��t(X�َ>"�!x�a籹8888�9`0���9�a�wo@���n������� �O�dye��j������G���@UM5M��aGd���
��n�����qG`VY])�}��h��@��4�J/�,�\]�2y�����Z�F͹~���4M�l���p,�f<�=**�/O�ق���0CM�陜�����h�
���a���E�4CctG�~��`�tqppp�G�ǽt�1�sppp��/6���^7�bn#�
����(��o�w\�&i�9888�V���i�-�����}j���_��]��S���r�E�l64���CK����`wڜ�Ǒ�͍�������h�[2#�*�*3�2	ADAqa~a�P ��x����|>_ $��h�Z�@�\6A�A�׉�"���w�	9y���^I�H����4iPs"�XAȡ�8�	�#q� �B���5��B�
�ƹ~����24��\"�!�բ�h�j�,V+�G���/*.2�L8�k4������arrs3�2��1��dIi�Z��KuH�"��nj���W�/�FcQI��`�q\�Ѡ׈&��$I�ٌVa4�um]mQqqJzZuM5EQ����p������]Z��c�c8��Ҕ���ꪸ�x��ZZ^���m4�V+��5u�z�>+'����b��
���:���2�uz]n~~m]�ǻ�7�5����J�^�T8����ѐ��i2���++0KIOmhl���2
i�����<!9���JӤ�Z�U��E]o�`8�����.�����[do��3M����C
U�Ty��55=#{���K.�Q���)�Ji�1[,��z�nT7646�|��PUS�Z���<`4��R�6[���e�����7��ju��_]}}UM�B&7[,MM>��֨��pwu�����J�з�D�e�\������=�0�0B��I�d0�fsCc�H$p`(�����j�R4e����x<�L*�i��H��l��1��\*�٣G���d�CM����2�j�h��������S�U*MәYY^�^8���C�>}�\\�6�a4MR4e�Z�&S�w�������>l���l�n�.��I�1	qb���ˋ���ʊ��.[�֠�@_������$�$|�D,���7[�<���U�ץ��9;9������UTV�-fï;e���2�f�I�R?_���L7�TZ\RR]SC�lB��Q�b�qrttwu�JZV�$���x<���*�%�&_�AII0�.^�ػw�ͭ���q�QZ^����Q�H�@ ���33-6KP@�H(B�7V��j�	[b1��X$��x<�g#m4M� 
�\4B�05=����b� o��Ѡ��͆"y�=��
����(��`4�Ul�Q�*I
�|�1�#�z݅���~�vh�f�$I�xL�4`#m<�R�L&J���;�oÔ
%JH34��D��R)�Q��0`�V���8z#�%fF.�ۗ��(�f�4"���>�����\?G��a�����X�R�#��n�f��ֳjۑj�m��e��37-{�ݢ���s�mA.���EQ*�Zf���c]��9�tR8��9888n�0��L&�\�����œ�	���k�af��l��!���+۞�&�@��~�vpvr����������#gW�����G�A�HR*�9�9������aD"���w�H���R��C��0�X,����\?G;4��tJԻ����#2CQ�0������)�ѵk���������vp�������q�v����ع)Z}���2?�+�޴nQ�M����q�rS�/��x<��j�X,�(�<���o�Y!
i�2��4M�jm��i�Nw.��|0[,��x�����'�!�������q��Q
�K��4��!�A�>>4��-J@��ڒ �@���	�`-�O_�>W�恕�
E���Y..�aaB�P"���f�v�8�K�R�^�r`����(�߀=���8�������c8�������
��:�zN6�A��`��b�{� x<��j�Z%b1fw5X�����������=��������B���_��5���كtP((��(J*�"O�tp��AT����'����l6�8�T*EB��j�q\!�K$�/dr?bF"��%&�^�#A�c��|����9������@ӴR��DH���3�{��0
���4-��I�$B�� �,�T"
���@,��Ĥ����ǎE��9;;�D"�\��R�T"�D|��J�b�����dR���Ϟ--+�%��R)��B��J�B�09%�ҕ�#G�Z�E�e2�a�B����u�Ƃ888�rx<^MMM[׏��߯��|
�OQ����J�26.�j�
�=�����{xxl۱s��]4E���X��?�����
	6��gΝ?�bQiIMm��.��6�M�P�?xH��-[��c\]]���.�fUUu�����j�c��xo/�K�/�^��^�������!IR!��=��ͭ��q��=ٹ�ᡡ���EE�AAV�� ��/��ܳ�<�D\|BMm��#�=v��I�����ZL\BRbvNnCc�+W�\�JQ���***C��՚&�B!�J��?���!���<y���5E��2��Aq��%�TZTR�m�.�|����bspppܕ ��v�����ģ�l�m�;|����T:�>{��3A��a��bY��w���NNNMMM�@(�
߭Yk��r��6o��d4}�f���Þ}N�>+�V~��j#	���tӦL�ǟyv��=�&���^���`4W}��FR���(�@�c�l�
�6WVU��b�� �I�,**����C�f�Z������KJF�~���c'O99:~��:�~��}�N�
���3;'�h2}�l����۶��g����%$|�Ú��t�JEQ䖭��?���a2����_�t�ʵc'NX,��[�LIM���f/O�_۔��.��/w?m]?�cf�y��?��n@�~��$%5m��9���1�q�N�}��п��b�h4~�����	
z��'�MZzF�V[��XRZF3t`@�#=ؿo�^={<��#E�� �6�����U߼���.^Z��Ç��z��Gz���r�1���
Q�~���{�9_�H$��|�J���ţG�������<�أ/>�������1<.�
���ܻW�3�ΙL����ں��F�J���3�O�TN�<y��^�s�QYU%��|����,�N�S*��{�R)Ue���I���M���;z�D�^=u:]EeeMMmCccEe�OW�S�����=���:����k���UU��Ǝ�sǎo���y�,k�=�"#�_�zFE�WTdee��"��CHPдI�d2��d2�F��h2�&�Vk����i�P���VQY9h�@���Z�"MFSl\���믿m")r��Q陙6�M��\���4�$���Ҫ����:��2aܸȈ�,T)�NN�MMZ�l6r��)C����4�����6y�Q#Fx{yj4Mz��&�V���d��l2MF��lꦦ>�{{{z-X�H�7<��W_z���|�Ú�c���������׮[���U%e�J��C�����f0���w?m]?�\�XXT$	gϸ�hد�ڟ���X,��-,*�п���oxX�/�����o=����;v�Ȉ��@�����ɉ ��� ��C�0��w��𐏷�민L���3�8���3a�ح;w�9w>",���k��Qn�������l�-�� 8(6n����q��Q�{�ڶs��d:$�h4�4��OQ�N��X��Cs��ۦ_6������R)i���sP(�"QP` ���P�����RS�C�Cƍ}�ܹ���E��S��8����'��ÇQ����_P���U*�C��T�.�����Ѵ��W"�Mf�����A�P�?p0%=}���>A�F�D��8I�B���(��&
)��ڬ|>e��X,(���j��PH�P�a��b�H$��E!=|>�d2�p\,���V��
P0�L
��$I����NA(�H��iTJ����>��i�@@Q[mTO�ٌ�d�6�-'HӴ�lFy�Pd��b���8.��7���q7�����?��p�Hj,ˉӧ�9c�F�"��$I�0�h4"�i2�Px;
t1��8�Y�V���dB�!h��^�Ge�%�M0�8��}�{vvPE,�����6]�]A���f#��i�?Ps��f��O�a�'����FY,���X,h��O��lw3�f
�7���}$48�h4�)�YWˎn�?�*�����]�.i���
�%l��S��}γv��N�O�����Ak�np������3�ק�vu�98888�)ܐE�]�Q���0�߄�����:+
GW�>��8N444$&&r�)�������G7"�%tEXtdate:create2013-09-04T17:04:16-04:00V�%o%tEXtdate:modify2013-09-04T17:02:16-04:00*��tEXtSoftwareShutterc��	IEND�B`�site-packages/sepolicy/help/system_boot_mode.png000064400000146702147511334650016147 0ustar00�PNG


IHDR��ah�>gAMA���a cHRMz&�����u0�`:�p��Q<bKGD��������IDATx��uxUG��g��!��C�RhK��{���n�����ݨPw��Fqw�$!.7���3�?��¶�6�y����9���g�}߁MMMI���
�B��!�~�?66�mii�����˃��>�B�������V���|��i�ǺV
�B�a�gϞk׮e1��4BǺV
�B�1��8AXB��(��B�P(a0�{���>�B���J=�N�y)]��~6Gt�u7v��?��*����h�{�(�c�/�O;|(��2 �X�%��!d�Xk��#%:v�1��'�!��%�W�њXucG[1
�wº9�;ߺ�e�B��Ǻڔ�2�9�bX6�1!�L��t�=k9���2�qB�0��9�@���B@B��N��
A(<˲��[]����p�1ƄX�?�-!Z
�P�t��(Z����`��1!`o+eDۉ.��a��[E�)��Ų,BHUU� c�0�{����,���(��.���׳f/^��j����cъ�Ͼ��$I�p�WϜ'�����u���B��8k�g� ��0���W5�a� ��j�(�]��M���O���9��_rAF�!˲���"�{�~�o�^�����z�}��]�eI	�#���A0MSUU�����i������[�|�����t-ǰ�`��t��8B���ir]�yAD1�4MA!��2#<�j�ՖQ(GƘe��
�`p@�B����1wFj�n]�}��Ç�{bbl�ٖ�^�����I��v���p���ڣ����5]V�4!���eg��s��0[Z!EIKI&�T�֥�$wt�UM3�������5���L���y�
W���'�����5u��@ '#�eYQ"�:l`�̴���
!�e1�i)���j���q9��w�_Ϟ;�o���.�������76����y��I��M��PDN����]��M��q^OJR"����1�(�z�n�˹��B���6_��]7�b��{�}�!(�_Ƙe���7>��K/�����7^vQvF��i]����/V���<�/�i)�	�i=~��p̰!��,��rf$"��g��!BB�aÔVV}7oa����^yC���oAޔqc�|��7�xx���e��'��ɷ��p酯���\����j&D�E+�l�Y���4a��u��
�s1n�MW\̲,Dz+�m���=rP���~��z���NILذu�c���&��=�ҿo�&.���ʨ!#���G�WT�8f�i�B��.�mlj��⼳��^W]W�q,���}�����7�::���\EQf|�u|lLmc�?�����e�u�3&������;f��葔X�3�k���+�Fdy���Hd�G��:���CE"������GPT53���߻x��ū�Κ���c�qo�,c݈<DZ,���/>��կ0��v�<g��L���[���ǟ���+&��8���dE�
�+.�W���?wꔩ�&�z��?.X��$AX�ykIe����q;!��?�����N�=91~p�>�`���t]O��=t��>|�M�C7m#��2hGYŢUkVo�!;l�Eg�v���Y^9a��u[�-_�aΒ�`蒳O?u�	w<�dum������������11w���Ąp$B��)G�!d�D�4�( �0&������Xל�W�"|yU����Q�����m����v��SUMCj���o��l���@0&�!ϱ�r��WT�nnm#l�!�Mjmo�T��Ȋr�yg���iFDQ$Illimhj�Ȳ���JSKk$"��$�0���:���u��>y�3g9l���Fw}4��Ȩ�_~��!�z2������{�݉cF�%'i�nb� D�����+W�4�Gr�X�����&U���8.�{��1D�4�45M�u�>���bŕ6N��ӫ�?0��� #
�(
:������v�+O�+C��::�����V�+�;+/'����^��H�07����v�>�z��&��F�t9��`I!��PfZ�!�^���n�JK��dyyU�K��̛��HJ<y�XM�!�-��Q��v���7�7m�߫�S��56?qB\���}�0���?fY���N�0j��?����6����V��7L@0��cY��8��t�d�WT�Ȫ��MST�alb�abb<9��5���f�O9a̫�������##5�)�~^����9�o��,Q ���ǐr��{����s\��c��~���xN/_�|Ȑ!��S��
!�ea0�AU�0&���by��s+���ܳ
&!�s���,�`�5]x��8U�5M�9���zQ�m����n����[�FB@�@ň,6ۃ/��r:�y���@�r3����YBx�ń�����^m������8l6�at]�٤������ܯoK[[8"{�.�a0�>|��
B���,ˆ#��&��i���<�i�&��C��0LB�$�qD�-��.o�c� �/�
�'B1Ɗ�0cu�<O�(
�����޾r�Kh@Q!ˠHD���
��H$�0!X��;�7l���w��
p��B�jk�� ���.��V%
À�z܆a�?s��5��@����[��6���0L�:e���G�_QT�(���
G��I�@��)���H�Ao<�<ȭh��,�v��s����cB���r��iÄ]{�y��ɪ�����h�Z�$`�dE�2~La^�M{f���h�Þ2�r��
p0�� Ν�Ç�́h�����7?/�VO��a�
�]`�XQT@�v)ǎ}<|�0/�E�4YQ���aA]{3�Ǎr��Çv�P(]��	~�c�B9&�b�3c�$�/�y�����GouJ�e��y�������IaY��+��4,�'7���ҝ�e���׶e�>��B��zjJ*���~��jAHNL���*�!$����'.&��ցB9��9�91��4� 	�?�r�m���bZ�Z �oMa&	#�dY>I�z�N�O4�MfK�t+X����p8]��)5X�UT%TB��<�q��
�1�$�X_`���ֻ������@�� ���a�^����@���v�A"���Bh�5�P(�
B�ߏ1>�cM� ��(T��<���篍��Z���s@������{엙~�#�#
�h����ر���g��:��
�����x��<�-
�BQ��~6�~��%Z)ҺZ�]����n�+�=Le0��C#D���XSY��}Ç���a�5�h�fB|†
�;������gzD��q,  �>��5K���ݝ��P�����x��<��@U5]�#�V��#�L��Bh�@L�(�]�!<��M�sBMպ�1�A5M;:��c�S�%��b�4��׬�*bD0��0B<�K.{�8�[�|y��mqqq'����0�e!"�L���BQUU��[�
®���7�B�NX?@�4��[��`�{�~!<����޶�����{%&&���#��&w�x�4�i��A�Y�>��� �j+����D��*Az��X�H���U4!�����023�u]Ga�Y��oh4#33]�t��e%j`�cU�I�	����ԏ��B�8�D�aLUC�I6%�9ڼ�G�1�N���sg���N�YZ����HO�5=���s\8q9��n��G�|v��S�;�~��� �iZD��b��rD"��*��/�0��va\ll$��lm>��v�3�C�a����kkg}?��	�Z��:����LB�,�!QeYx�aY�4-Y�"55�wa�����32M�(� �Ǚ&�RBUU��Dg a�b���n_�ac||\Ϟ9�P�*�e�0��%I2M�n��<w~8�ӧ�i��H��pؗ�\���ګW���<���j�$��Ȟ}C�˲Ǚ���<6M��(�5���y<�P(�	l���ݻkv��q@��d%{�x�Æ�y(��}��²LsK+�rSO�r��g��`��P8|��Ӷlݺ���ԓ�|���;Ǐ� ���O�l-��s���ǟTU�L�8a������:��.,,-+�x<�^t!kM�J����O>��ٗ_��ګ�
v�������.�0��0��2�UES%??�'j��c���ʓN���ٹl��q�Gm)ڶmGqFZZvv朹\.g��1�G�X�f]IiY~^�ѣ6o)*-+��rZj��ˑSO��r:�VBhE�ZV?���.�l����eX�aW�^[]]���9j��P8<kޏ��m�'Mt��<LJ��g�9e�I�n*)+��o�ӻ�p˖������3�Wa��Es'M��~�=��i�546VVU��Ĕ�W����	'��@ �,:.|�(���	+���ʲ7��=~�����_�ܸi�	���!=���P <}����˯���'�j��p�]_Μi��E���?���ɓNL��OMIIIM8`@A^�W3�)ڶm��a/��jKk��xA�UP��G�~��7�\��n��4l�7B(�ʙӦ�HI����_�����#%��)�eY�]���0˲,��ޫ�`�^��g^x���&9)i�M�@pWuMSSs(���׳g|\|jjJ~^��w�\�z͸�c.YV�kW]}CG��fϝ7h�@���+�N'��*����X�z����f[�j��;'�0n��7-\��p�ɓ�ijn�j欼�=��[V�^{���rrrX�]�z��׆��KJJk���kjk�g~����߸y��鍍M
�M�O=u��ͻk��v��eeQ(R �5e��G�_|�e�ҷgJ6�t6���p(k�w���������+�H��'�5bD||���&�?aЀgL;m��s��[��2b�1ǟ���R�m��(;KK���TU�z=�6dHvf��A��rs�� ��V�o
��4M�����e��_z����/?�����/Pòy���a�e.8�'}�0���wދ���ۧצ-E��UC���̸�s7l�T��2>>6�����٫WAUu�����!F�����y=��ҳ�2z����yK�!��E� �aX� F�$��iw�K�+ƎyҤ�cF�X�acCc�'O�קwJJ2/�[��Fd��S��,�ݫp�A=R�!�%eecnj�|��ÆVTV�>t�֭
���O;u���m>��� B�y�{��$ˊ�rb�u��i���BCQ��Ds̈�SN�v�5Y�qCϬ�����A�aշ���BDQ\�|œ�>��h�i�^�p��Ͽ����HO�oh5bxFz��o�e����w?444�;��N�x�W&%&����`(���TU���:���]�*�j���~��g�z��m����A�6˰� �$Q��kx��V�^��ٙO05b��%�Z[�����Ԝ��4|�o����3`���k�n߶#7'�n�
����N/���PX�t�߯kF8Q5�c91����fa�"�c9UUw�,Y�v���Ey=s�._���s��\5~옄��of}_�u[k�/�8a|Jr�K�N�ݫpӖ���V��W��ѿo�%K������k�*�2p��u/�����T�gN��B�n�A@��i���O��"
�\ X[R�RZꈉ�,bch3g�svzZ�M�Y���Ʀ�-E[KK��z�^u�e���$'����zr��}��+���;EQ.��̌����m[��'M�X[W�lŊp8ҫ _���L��
�ӻw(�HKKJJ��X���0p�B���v�G�Hsk��뵼bX�������:��yх�!����mؘ֣�	cǴ��/[����q��Srs�SR����=r�p��u���i����&%%���3'B�z{��X^7˨�
!��l@�B�i�����ֶ�'�$����Æ5bxVf��ں�򊔤D���xN�2���b`�~	�qk�o�x<yy�G� ��,6t��í��}{ged03h@��DEQR��TU�HOs�ݘ��a���D�?5���v�T��ɕ�mcf�_Ы�NCsI���~�몫��� ���q]���inn�}���Q�c����[N6I%�JlX�mnny����v;�q� �M���$Iv�=
i��r�,��P(d��TM��l�@ `�R�>l�քkF�Yޟm��m;�egfG'��y��tbl�?�A�⫯�s��Y��c��a�СP��8�˥(J(r9]�bB��� �@Y��.W �aP0��Ĵw�b�1&6�E��a�$L0 ���$IUU�� ˲N��e��0,Gb��@ (��r�Q8v9�<ϫ�
�n�K׍p8�z4M�CN����n�����Ym��i�dgfS+���ʋ�|ŲA�A�)�11&�~���������.�<��Sׯ_��B�ܞQ������� ��H$G�8���+W��葘��0Ʋ,[["�dY��V�iGG���0�@�2<;;;�	���������p�-�x�X�L�loo���;6�����!��[�8���|>����p�:�0�d��/�0��j�� �$,�BX����\�e#����ɱ�*!�wB	���!�ȄD���,G,�%k��@ !�y��@^�e���rVV	�1i4/�/BH��H$l�]{?�h}EEkU5�P���{��L��
C�����]��C���Ɗ�
��U]�!ݾ���g�CE�8���?��0̜��n�V��1K��~��~ܓ���B�X!�E, {��o����4��~�\t��{��-%��m1�nf���	�˖-E�9A`�]Wg�:D��޽��A���8cBbcc�΄c}����a&��Oq:���a G!�{z��9��,��al�;��WBXX�+//�J�J����=����X���V�=��C�,eAؓ�� [�ߢ@bEQY]LbX�=T��5 !]���`^�_����Pʯ��,�C�;�?;A�4cbbt]�C�
!4#!.��k�?�܉	�֘���pt�k�l�u��������ȑ�Դ��?L���<p��ië<�[��Ǻ"ʱ�MIN�۷�z� �_��4M���"�y�8��0�T�)6�f�8y2)�֋�9	s�N�s̡�9�~���/Ʊ�>����u��P(�n�~
�B�vP�P(�n�~
�B�vP�P(�n�~
�B�vP�P(�n�~
�B�vP�P(�n�~
�B�v��D��1��7�f�"���i���&>�<!��{�B�P(G�᤟"	��n������r"BAp8�`P�4���bL�EaY�n���᮳�S(
�H9��[�\�m��?��(J�>}N;�k�%�Pt�/K�B�a��q�5U���P�u��V�u���~w����c�?��˖���/�X���[�,�����쟧�rrRb���p/�R(ʟ��H�կ"I�������l.�zƴ��6�n��I�dmɱ,���k־�ʫ����m7C��(c��!$BB&Ʋ,cL����~e��7_{u�m?�dJr�y��t:m�!�����].W}C��og�6�gn��jBEQ�>"
�8��(�㊃H?�0,��m����Ʀ��o����'�,�������N�:���|cF���/���+.�����o���+/�4;+�ק�\Λo����o�xc�za~����rssn��:��B0���ݻ.Z�xٲ�����d�嚛�_{��I'N���6l��G��A^�N��>���������t'��)�à�����H�՟�k��7��0jĈ�z��?|��7Ϝv�5W^�~��'O����o���?������i���$0`s����9;9)����Ϊoh����;�,[��[n��h�O?Ϲ쒋�x�����t�EQ���N?����w�>�/��g��	~�0��}�+���8��ٳ��
�ͬ�v�|��N7w�BEU�����OLL�C �ܬ\���XW�B�@��e؎��ݙ�����1��;��~��;�ݿ_�뮾��1��<Ϥ	
���\�e�֞99={�J�x�YgV��4������HOx!'�ǟ{�Y�P�����K.�7a���A��2���g�~ƚu�O�<YE@(�����k�<���s�͙7_Q��/��f�-\�����0�4����q��vgFz��K�r<B�8���D�T��N��r����@B6�-!>a��>�����,4()113#cЀϿ���o���r���k���E[���/�6�TEU/��s�<cȠA����YY�ٙ<�557wtt���[ZZ�@�ϧi*�B��J���q�3�?�t8?���6_L�w�_{��s����?�����wߓ��V[[{�%o.*jnnq��Y��.��eY*���L�����r��\}��III�iZ�P,˶w�s,'��(�=RS�.������Ą�$I?ϛ���n���eY!4m��N�����9��}��r�I���YX�������T���v���zf��{=�^�^��4^z���dg�a����Y����YÆIH���m�\yEaA��AEAHNN=j��C�����y��>q��V/*�P�KDͭ�n���xB�~ʱ�a���f�|��k�!$��P^Yn�ٓ�!�@�B�����G�x�����W^VU�n�	���z8��$I��p$b��� �������� 0
��N��i��A!!<ϋ�
�,� ���ᰢ�,�:&$�#I@UUEQ���ź��oOOK���G�B���X7dQQ�!��!����wtD�r��޾�������9NQ�P8�V8n8	��VЯ�#�!���0L$!0�!!@UUY���]�DEQ����A�!d�f�&�WYQ��?��B�P�d:���/����+�?+
B(>PsQ�E]�1X��+����k׼�m��B�P(���oJ�8���K+�i�P(�?#G�[B�R8��q�A���������B���&�܋1��G����w:���t&�0s�N��#J�A��)�?�#���ޘ/@uM� 1^��Tc��BB�a�,À�*�	k@��BUU��~�V�ZՎ�ZB�"D$	��,��`
^����(��F��]��]�Kt=G�7Qt��
{E
c̲�~��DZ�=T��ǁ�R�BDQ4MS��]�}��sM�Sb�d���$I�4�!�����-x���9��D�0M�������*�0���+	�{XL�ۈ�$):�c]�~��\I
���9�߲�����<�ҫ�?�ēK�-c�p8l��q\}CãO<e�ՑD�r˱�U�����v[sKK]}���]AZZZj����b��;�{�e�e].��Ҹ\.!����Gނo����q�{�������MK�IK�9�$��ڵ����x�۫kZ��N4MD�m��Ct�c�$9�N���6��n�z��;v���q����8nO=y��pD����(���O�B!��N� {��0� ��e��rM�x��~��-Eqqq�N��a�$K�Ap�m����tWUUggg\\ܶ;�|��ؘ��J� ��ή1������Ba�������9��%%V�&��]����.9�Dq�Ia��v��Y�=�̳O>�lyE���EQE����DQ��?�����x���kׯ������$�v���fEѪ��❒$Yն����VY����e=��p�y���c�B9B���s���4���9��W�]7w��g�5n�X��#�"�p�ƍmmmi���mm�W��4q�(��v쨬�JJL���m�QQQ�v��y��+Weef�l6+���o��b����l]׿��[����ؔ�/
��׾~�Z�ٜӦA���0���j
t:}��J�a8JE�}�|�ϸ~�۶���ǟ���Rz�䒒j�M��M�IJ�ﮪڹ}�M�J����������\V^���IJ��DQ�������]^����EE�6oٶc�ɓO"��\��4��8�0�	�m>_��}�^����C�����p$����"��߸���KINVT�����rW]}CRb����D{���ܜ캺��;�N��=1!��Ꚛ-[����J�h5��746nڼE��,G"�������ؘ�=�D���������z=�]��E۶����l�������]6��q�EQ���
�~WU�K��6��{�*LJHG"uuu�@��p�۰���=&&捷gD\�$Is��okk'��].I�X�-��lhl����^�v���{���Y���_L�O�ݒ$mݶ{w���fY��jjnv:��`���gY֪*��v�y��iim�~���˯��`�b�֯_�zMllL~�\��;�PZZ�u�vEQ���W�^s�iSgϙ;Ѣ�Ʀ���cG�~��̛l��t��z@H�^�x�koo�j�7MMM¼ܜY�ϲ̢�K�H$+3��oՍЇ��;�~=<�d�{��}_�x���xD�N����'}��=�<c����ڵ����m�d��s�/2*ʫ^~��g]s�D"��_���%!�f��!���A�����b�����|mm+V�\�rU �5�&NX�z�;�}P[[�d��1#G|��'K�.�B�-��O<��nkk[�b���JKIAVU�z����9��?��_�_z�_[����w�C���)S&�ɧ;KJ�nۮi����w@~�y���C"�쪪��Ï��^������G�����o��zD"�����+W�<p�0�(nܴ��W_3M��zKk�G�~v�Yg�v֪�kbcb�|�9���*�Pxނ���?;3����?
���W�;z��>�~��]��3g}�⫙�yyII�1��lY�l��`A~~�ko�����0�Y?���������r͙7���BЯO�$�/�����蜻`����?���s��ܹ��x��!Ͽ�ʦ�E+W��D"M�ͫ׮�ý{~�̕�ׄB!�ە����][��3ϝq���ϙ7?66��G�B_�mVfFjj���ǧ�S�WX�>B]����{�Ɂ����{/�6���e��E�'���i6��eP$�a�l�e9���`��7nz������'¨�&�~ۭ��T8.=+k�I�FN:��ߤ�=)�?��w����/X�nsc�ظ�K,`>���+7ߌ�~�p0�_ߡ^�ټr�ի�F�����������ܼ�檜����h��Q�V���i��k�}��������v���N<�%%�ݎl�C����q��g�~���J�^���+.{��lj��꧟m�Uuߣ���[��)�˗l���U�]���v�x����̙�V4h�--퇕�*��	g���/|��	b6�[�tѢX�KG~�q6!$1!���q��go۾�eYY���7d�/8��3�yy���cG��Q�sނ�`��,Z��f�n�㈮#�4%=)q���
��+KKwn�6a�(��XQӓ�Fxˆ�rr��'�;��` ��Y�pqM�.�_p�7]}���܌��;l��#
��̸�+
����$�7]ݕ�]��*�`Ԉ��3r���'�~�-�ۆr����<x�(�����M��s�?�z��ݵ�o�=�����G���;~�3g`�~}z�z���z��j��!Crsr�xC�(��e^P5�G�#>p�i�}��׈��S(G‘
�2���/�袤���_~�i���ۯO����(�n����<x�q�(���� TTpN����
��l �N����	AY�XY��+k�9wUf����՞�q}��oMS������'6~�'~�yο���ߜ2e���b��k������]�얟~~�w_����iS��������.$Gآ"��m���6��0���L�O v;)�C��o�⩪��w*EE�����g0W[��
"
�� �Z��Ù>���t� d���������+c*+9]����|���A�y�u#}ͪ����k����u3�] ��:n萴�EU�ɚ�1�C4
����r(���i]�5��"L��t���{�RSSu�5��q����G��K_����Ӧ��{.��`�X�Н�=�އ}���ӟ��&�E�,`j*������'��o�5+��\n5.N�0���DIb��i���i�i!�
E���mظ�{�}��Ǡiv�����
0
�ir(�DdI��P���t:5]�B�@�i��~�BxõהWT<����6��/�!Y��5�4
���eA��ʑpҏ1�����_���æy��W�dgO9i���bc
�?r��@0����9��C8LKI��+'L��;�̾��1ƗX�{�٘�G�.��i�7_O`@�e�*(,�������|�L�e���!�F�x�ͷ�z���{݆
_��cٜ���s적�|򩐖6�[x�̛o��n��y�	¦MWz��/:��� �駼�|�E:S����Oޙ�1z9�{p?��&��b�˕
�����{������'��_[]�\�+=!��_���ɬiB�@<�bc��F8���g��+��8���!a�B�+�|���l���


�m��9sZ�mK��0޸i3��ZZ<����'��W[_�����[�z��5�F�^W77�����v�۷�����~��u�+��E�&���V�̆@
��ᬦ1,�)��!�P�BZ����|�ů����1!&�ii֬]�v�M�.��|�����7X�4���/{��7�{�a�4�9�}����>��S���+V���y�ޤ�Dc�0R������@0����s����>�hMm�M7\���կ�B��C��INL�t�g	c�::;���bcbTU�8���.	'%&z�6YQ2�ӛ��[‘�Ą�p����O�C�8�H0M����	���H�n���a�YY��+!�q}Cc8ꙛ�0����F�vִi���M��*��!mmmBq����UUWK������i�Iv���h߸���@����K�8eZ�ý����*k��1��`FIO7SS� ��c�ZG`&�_����1>[V���W� �9VQ5��.��'�'k�z�o<�IO����iv�]QUXWSV(+k^�"��6�����T?��q��3#G��]������jeQ%tvv��ZM��9��ܬ�h�mMM�i&����lcS����1������o8���OJLLJL�nN��������bc��X���s����Ą�ؘk�c��Çr\��s��o�T`�0]7�ֱ�t�0��8��0�m���#���F

��}T��ߏ���2k�i���fV���g�񹧞�$	c����<�q��)^�yL��i!��i�P0�A����10`��sF"6@���ii���$��Rz�B�'�jl��Ͼ��86�h����,jh55{��<q���B���B�1�0�۽`�⢭�n��&UU�\Q�0��1��U�|O�^o�h��0]��&ϋ3g��g���w�/��jD].|�u�/4����z���N �E����8�Qa�eu��:xk9tj��t8f~���%˦��b��o�&:.uP�g���m��S�u+O2B�M�����
&&��k���a��&�~�z�5fϞ��e�E��
�Ѻ��B,!�Z�0�=�L��a�^֪h�Ot���_�`��P�@ �?^ͨ2�賃� �,HM5���� s�h��K�}a$�3۶!+�� ���-Z�����Bp\�v�f�^fb"�� ��86t��&��<�!�3q�1z���Q��%]9p3@d����O�����װ�� t`g��� ;;�>`�{���81���3������)��� ˚���F{��
!�D�/M�e��D
��g��oǺO�����0�� �����[�;�� �~ӴkV�DZ���J�l
t퟉V!"�@�V��|>���<�cc�(��5
8 =� ���D�0"��À1��>��[��ލ������l��k�3�0��a8|D/^��b��>�wj�S�+����W�1q8�U�\����^�ʕ8#��0����̠V���+k�t����͢n�q�1�
!���d]Vn	�/ft������q8�a��V��TU	��͙U��\�]wɷ�C�#2~��7�J?�"*��ϓ��N0��A���	�7݄���E-S+X_�eY�!�,�]����8�0dY�4!�N��`�=0����a�ٌaB�[Ӛ�Ɗ�G�:�h�j�����@�8�83S�4)���e����B�믣�6��G4LM�j�P��,�����ڸE���Bc�X�a������������ASG��i�C��Y���_#����r��V_��f�󵉢$�btt��]1|�7�_�Z,�EB�illhlj�������!r:��ߴKm1�79�ᯉi�]��Շ�	9d�;�{���&�0��'����i���[�''���H�՜F�R�+�h�#�@TU������^|��U%@�X�**+L����/I6���cێ�''�D=:�~|�GA�(J8�*=B��!�&t<L]!�$�����%;=O���A0M�󵖖�ffdy<�h�B� �v��1�B�I��AI� ��6��l{��{^�UT��{��m���@�_��X�Ԕ��.���vTU5�<ϲ��(������Ö��oݪ� �p�p�1q8���b��B3f�!*�������&���y�@�
�P(�w�~�0@��t�����ԩ(%���
�ƍ]��5{�S���?x��kW��+Ǥ�&-��/����3lȐ'��4��8D���˲r���<n���!t�+˲k׭OLLLIN2���UF�͑��]<�*쭪��*���$�`qqq�~����$m߱��?s,��x�:}����kW{��G�C�����8�kfʮ�Y�|EaA��� h*Z1,˶���B�Q#�`�UM8����]�����b���t�1ðC<a��]UU��-�g�W�P ���Y��z
Dz<ϗ��U�Ԝ<�$�'Z
�4��;�wN=�d󠎭Q��ݺ�:���I�v������K/�S�P�~�k����v�)ʭ�2�vy��u��/��SXЪ�>X�S�ƚ�ӟ�.&&���1�$�����k�2xԈᩩ),�:�N˹��vϙ�`���C
�y�n�I�d�_��C[v��R�縺��`0�r9A@I��	��l�fV�I!6�v_fF�,�u]������1� 
�v�5&ae..)�Q\<|�Ц��W�x�eIY��$�ak*yQAp8��&<�0�$��Z�{7B�f�X�xBh��X��2W�ލ1���i
� ���	<���m� ���1--
cC��M�ME��<��br\sK������33�f���7���ﮫAE+óum��Ng8�=g�"+V�wUW-[��ek��h5l6��u��/��������D���⫯�ӧK�>ˮ[ qq��{�R(�����귀����7ú:������-�O�TF�1^��.��Z���IRk�/*V�������г���̛�b�*M�.��g�VT����Gj��>��6�ԓ�<�‹����y������]}�a�"?���~Kk�ԓO>��i��ӂ�K8�MOK���+u]7	VU��y˩܂BaR%��HaY�o�>'O�,���fX���sV�Z���W�UT|3�;��23��+UMu8N������K/��̈�r{G�{~���vȠ���6��O>�YZ������,\�Xӵ��~淳��`�޽?��I�.:�܌�tES��HD�]ף]C���`{W-����1m� ��99/��Z\l, �rW�W�|���9m�=sr_x�UUt��	�P���G�rӍ.�KEA֮X���A���s���݊U���;���~�l+���͟LS;�t�_����#d@�� �L
�w�U��*NH�ϙ�YZ����xޢ��^=q��F$U�lrN��m�����`fwmݻ~�ڛo�jhl4M���k�֢���Ç����ӧ����.��o�]���y+V�.+/<p��#�Ϙ1eҤ�]wmjJʆ������w��;��iӾ�񧒲���}w�70�x�N�A���f��a�i����i���������-Yr����O�����>�䤉w��~�LM�vUU]q�%�(545��۷n�^X�?�I_~=S��-[��[�a�g�>mԈ�U�����ѣ&O:�b����+W555�\�����z�o����^��؛B�4��UIk>/����Â}�hM��`[[[Fz� ۋ��+*�7����ҋ.��Ȱ٤s�>k�I��SYyE\\�m�įQo�xW�4���x�ͷ���
!|s�;5�w�������<�0�2����!C�o�@

�ر��	��I��'����B��v�!F��4�@V<n\ç�t�5}��.��ɟv��f�:���9kZZ�v�#u]�ӫ�瞵ڃ�V��奦����B\.'B��r565]w�U�
�z��,)MJL>l�i��nL9i˲&6��������MӼ^�ΒR��SXP���۴y�եβ����U1d��@�ou���N�?(��z�[�0�!��r����=z̞;c\YU������mFnNNA~~yEEjJjZZZVFFJr2!`��u!�eO�0�n�}��}z]v�E�w��疔�^y٥�(��*(x��B�

>����Q
`�a`���**�SSR��AMӬ����g��k"B���v�y�_�|9 $�Gjk[��W\�xf����SO��;wނ��$�9�r���z���͝����s�\�P(555<?|ذ�ҲĄ����6_;��^2�=�
6��ѦL�U�2�x������Kg��1��ٙ�	�}�Q8k�|�D��QO9-��q��bM�u��^\�֌w_z��
7���������u#�ttvr7t���o�x�Ï*v�8�����������������"YQ���9"+����<t���Ʀ�o��l�
k�X�a=s󚛚�o\���tJ6[CS��U+�3rA���@�䈌 ���t9��`� ??>>.6&fİ���y�PتmD��;�16}��Y�ȑ@ h�f(*.)a&=�Ǣ%˪kv�����h�2�?uݘ<�ķ�y77'�O�^�ii�$
<x`�~ր�a�=z���Xb���i�9��҅q�x�7f�A��%���o����o�dk�����dz��Y�r{��ҝ�}�~?V���]���'�����l�ʊ]�
�EQ6dHa~ހ~}[ZZg����K;;�qp�B�^���ٕ+�'�G�X����O����n~��!@HolÐ�e����M�����;�8C�χC�Nhc��DA�u]�Ը������qq6������H�׷Oo����������E)7'�n���۷�������MJH���H��#3#C��bc�}��8�u��ֶ6�4O�8Q�
�%%9�����t疭�+wUv�w�NKK����R{r���gy�{�9���vtt���'��%%%���"�RSRSSR�6{NV���JLH�HO������Y�n=��5W^�t8�n����v�%��d;������̌�Q#G8�>�{�WT���&%&�$'[�1��־���h떲�Җ�������&�$�p,+	��(�g�>m�ȑ���gg3�1+�o�(,<���{�鱗\��w��'�����%$\|��Iq��q��gM��sgIm]mB\\A^�����;��ؑ#S���4�֟iI"'N�.~�!?w���S�mP���ώ�{/��;��NZ�_?����}9�
��kZH�͖��t�-�A0�
3��3
�'M�.��崒�rc!�Dd�ӡ(�5e�,�B���l(��:ʊb˲�
�9�3
��8+���f~�?�G�W��
���?�V�.�0,�vtvh�� t{<��غV,�D�
Ƃ �$	`�!���Ba�3';+�� �,K�d��(IYQL�D�tY�GD�16]NW$�t��8��n����2�f��P�y  G"B��i���Ӓ`�� 8�n@6͈�
6���
7m��a��`Ӵ;�"@��(�(B��i�=�̘�覩����*�aEy�Ʋ㰦��%U5ˢ�*��M�y�Q��+��A�Byݧ�(��s�_W�a��&,IZF���[��f**�����z�հ��؛%?�21F]<�&�`o�+�:�IC{SKF��1!B_|=s�-111�]ue\\\ש��~�a��Z�,+2��~j1Ƙa�#A��Y2b��5�&$��:0MD�؛�`�Y`b���Ѿ9!�0,��R�&6Mh��C�XY�~?noG~?�4�qlC7s&;>��p�_ʄ���8��1b�q���	L!�„`�!!�eY�^yUW���Auuh�n��ȃ��H�1p �*�.�T�)�G����*2Vi�d"I��'I2F�`**̞=��n��>��M�F�D���E�����
�5�@�s�!��=�3ϐD�4MCӺ�r3��K�D��Ā�)�!!DӠ�]� �ݻ��ۙ�;���<�>B��!��86����8��=ۛ&�k}�h}�`?g6M��+0M���d����b�����`JKQcc�!$6V��3?T���f
�jSQ ֭/���~b���8���?��0�	��_���S(�-�΍ZӀͦ�q���LY����f���g�99Ɛ!�w�>;B0��
4�(�fb�MB�2ƀe�$��YqP���R����P��m��
��;�����lY,K8�B]��7��ɓ��
L�x<������8.���AO�1V�A����-Y�-Yu�?`�߅�0��C�E��qB4���I=�W��C���8Fҏq���h'���-����d���͟�-X!����^u����˼W��pNKÉ�G\.!B���{�!�!��,�ŗ^�/��~��c����v݆�"��,K �))�AƐ!81��m�QSjn��ͨ�55!�oO�}0К���4��1�5{�Ɖ�$6��A��&��AU������	�eY��X­^u�z�
��hE��\�=�D����gD�P~;��$]�?4�<�����G�;ﰫV�^��+�T�;UU�+VQ��0?s&?o?�At!}�HsР#��件���ۣ BH��"c�0�(�q$-����YY�J;�R�k҅F'l��
���V�چuZ&��s�Z�ñ����?ǁ?�~��̧P�?�f�F+ό5��~��p�ܪ�� Dt�y��
Bc�(��k�SN���L$b&$�	L�i*7�̭\���l�Q�d��­Zŭ\y4g! ���2�
��qa-�u���+���MIH�N'���e��j���K�P�̋{`e�Տ��52aXΑ���z�2�K3��Y"�I�|�E�8��>�/�B���82�r@���VK�����p�1���ϲ����--�q��h*ͮ�6��ٌ9&F��vc�(S�DM�ᰬiPU����"A�2E�6
��3�[��L{;Dh��/��sTI�h膦�,Ø{�#�!�fdg��z��^���4a$bF���L%F�B BH�`S@7�����x�0����L����Ί
@fFFtJ)��:,���� �BYQ �v��r��eW��d‡>wB�[ܞS�2����<
��.]1]��+��U�1E�쀃DgS�P�O�@����~og����N��������T�Լ���{��8���0���!�v��������?�( r2���
EUG�^ZZ��k�?��#�=��M�m�$ˊ,˖�E�]1B%D#D�0��!�e#��3�B�᪍�R�b�22Z4�β6�dB"�.���`�Ț�s,K�B��"v�l,�B�AQYVN���P8����뭫�{�ŗ����XQ�t]��ٛ��z�����0vLJrr||< $
�ò����a�ӹ��DQ8�`�0I�l��;����p8ơp�&�+���f��c����SQU�0�N���P��yQ�t�0
������b�I�:::�qF.���H$�(�U�,˲� ��U�pX�4��.�i��������.��a�I6I�P(,�"���h��#��=�)�?#G&�� �WT4������,�|>�0TU����l6�)Sb,wx�?���N��u�Ñ����
G"�G�x���}>�%
0���,���{�~$IIInjj������<�ĸ�����6l2h`�^��x(N+vU�Y�&';+�G���ơ������;�_U_�z�ڜ����W"==휳�<p�͛�m�0x����l�^U�����ӻ���:�4��4��L{BV�]W��!���޴ys�~}���K0^�v]}c�!�{��vtt�4gNm]���$655�[�(;+s��ᚦ�<�{��O?�"77���@���s�*�2n�h�M�;;;SS�����~�HΝ�|��3g}'
¸1�7��1�R�577gæM~����2��ܜ����_fef�3�����rWSs���<��/��zي��~����v���-�kh1lh^Ϟ۶o_�a�����(��j��a�YY�7n�Q�sЀ����Y��$�76�߰!55eİa�e�u�uu�
=ss��5Z�B�'��8�0"�����ͽ���#�[��ק�wtlٺ��O?�D�2W?����[��;:_y}:ò���֦-[0�@cl`���iu麮�a�m�>�H$²���[>������z��p��λ;KJ�L�5�w���Ž@0�(���vD�v��}���>�=?���r�lXQT�(��z�n�﾿��~ɪ�_̜i���<
��.X0��Ƙ`��K/765�\����9�{�������ٗ_���>���MMM�ߚQRZ��
��Ͽ�
Dz��ӂE��v!�0L�0��˲���7��\�����_jhl|���wC!���;�0�<������(���[3��;�[Z�z����'�y���|wm��>WRVV����+��,;�W�\]U���^dY���>_�d	�q�6_����n����}��$�~��r׮7ޞ�t8ޘ1c��گ����/��D1�Z���^�IR������O?WT�z�g�|��>�tނ+V�z��y��WwUUY[�B��ÑI�i�v�������o�Q��3����m)�*˲�׾�x'&��H$�r�EQ���wUUo۾cw]݃��s�e�:����F����!���''%
<h��a�{�a����W]~�Yg���ޱb媶6_��_W߰��Fx]�=O�^���
site-packages/sepolicy/help/system_relabel.txt000064400000000617147511334650015633 0ustar00SELinux is a labeling system.  Sometimes the labels on disk can get messed up.  One way to fix this is to trigger a full relabel on the next boot.


You can toggle this behavior using this screen.
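
From a shell, the same full relabel can be scheduled for the next boot by creating the /.autorelabel flag file (a minimal sketch; the toggle on this screen is the supported way to do this):

# touch /.autorelabel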

Note:  Sometimes a simple restorecon is all you need to fix the labels on a file or directory.

If you add a new disk which does not have labels, you can simply execute

# restorecon -R -v PATHTODISK
site-packages/sepolicy/help/lockdown_permissive.png000064400000072511147511334650016656 0ustar00 [binary PNG image data omitted]
site-packages/sepolicy/help/files_exec.txt000064400000000616147511334650014726 0ustar00This screen shows application types that can transition to a process running with the '%(APP)s' type.


In SELinux these are called entrypoints.  SELinux controls the executable files that can be used as an entrypoint to a confined domain.  If you have an alternate executable that you would like to run in the '%(APP)s' domain, you need to change the executable file type to the entrypoint type.
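
For example, an alternate executable can be given the entrypoint type from a shell (a sketch; ENTRYPOINT_TYPE and the path below are placeholders for the values shown on this screen):

# semanage fcontext -a -t ENTRYPOINT_TYPE '/path/to/alternate/executable'
# restorecon -v /path/to/alternate/executable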
site-packages/sepolicy/help/system_policy_type.txt000064400000000632147511334650016562 0ustar00If you have more than one policy type installed, the advanced screen will become visible.  You can select the advanced tab and modify the policy type that SELinux is running with.

Policy types are installed as sub-directories of /etc/selinux.

Changing the policy type of the machine requires a full system relabel in permissive mode. The GUI will ensure that the proper labels are assigned on the next reboot.
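
The installed policy types and the active type can be inspected from a shell (a sketch; 'targeted' is just an example value):

# ls /etc/selinux/
# grep ^SELINUXTYPE /etc/selinux/config
SELINUXTYPE=targeted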
site-packages/sepolicy/help/transition_from_boolean.png000064400000203401147511334650017476 0ustar00 [binary PNG image data omitted]
TT�Y�b9�(^dȗ��K�xUU��c�]���-sJ��f��L&B�u��}}ȒDE8l��:��r�y�I�=Ʈ={��O�p8��i�l�G4��r����3���`��O*�bph�b#,�f��J�x�������5k� HuFFb�	�^/^��h4:i��b#
�ڜL�P�'{�Ӂ�� ��I�ɫ*�`�L&3��H4J2�"�����b8*+��^��*Y�b�d��G���Ѐ�㡶���{����A�x�</��55�ܶf5N��D"I.���r�ͱlh�������y/JڕH&����Q[(�y�4������z7tCG�5˥o.�ΗW�9��o��)��p��)���3lݾ��>3������"&�q�}���y񥗨��$�J��3�D�DDI"��s���jy㭷QU���F�s=o���##$�IV�X����9w�<�@���C�)��[o�/������[o122��fG�D��_�9�A���"�3' ��5���o��14<̌i���s��i���Y�j�d��{�QUY����'Ž]�8~�$�p��ӧ������ɓ��
Y�d	;�졾����}�l6C6��
hmkc��=��➻�b��fs�"$S)�l�FYY��'�L�¯_�����˖r��a�����޳�`0HOO/ӦN����G|�;�ݎ�딗�q��w�������n�B4eŲe$SIN�:������q��!���	��\����O��{�y�t~��_`��ik;â��΢(2����!F"#������3����E*�o�%�6�LNŭ����GY��6δ���t�3��Ͽ�%6��t:��Y��7g/����e�	n_��Tx�l��R^V��"X^�]��`מ�tuw�DY�x�N'���7w.�����O7o%�ˑ��Xw�m�;��C��D�Q��種��0?6���餼����0���������FN�>M"������z���������o<D"�dێ����p�<���O:��Tk+7}w�n����Ç��?0���̛;����C�$eΟ���K������|v���9�|�{9~��h���u����M�i*-s�p��5�����aͪ��1a��=��q̛G*��L{;7}Š��8v�B?/�?����ҳ�m��46��^æ�8�q��ٳ�h����Ǐ���y��߽�꒱[S�,��eǨ�UU�Б#
Q[SCue%�8��)S@P�y>��	�;:�4���jŊRy�0e�.����0�L����b�ۙ>m�\����G"�{ם|v� �X�fCE��ԓ���{����������'P�S��tvv�q�=�8u
����M��I��s�=��ѦO&-�����ׯ'�}a�Hd�m;v�t8X�x�����2{f3�B��{�ᣏ7��H��TWU�?0��O<��)M�AUe%�5��?p���n�����wޱ�cǏs����?����38��JgW���ϻ�O0$�N��0�~�}�^]],^�`�����nN�:EWW.��uk�^��`�(���ɶRL�KN�r���1�~mk[�gϒ�f���c�}��{�G~���6���Oyy9���cт|��F�ys�_�t��$;v���t��/x�G)++c�ʕ�̞M�����p�z��p����`�F�45��g�����ƍ�r9n_����?dhh��HdT��/�QP���y֮^ŷ�{��K��t�b~?{��#O��2�?������ttv�Őd���RUiNf�~+�/g���|��gP�d2ɶ�;0������իy��F��(sUU�^`L����,��p��穨�/��3<A״�ʟ7���~�yDQ�ч���9v�ĘgX\�TTT���<Ͻw���6�/]Ƭ�3���!kV�d��e�]��o?��(!�{�}�	��˿��?c������{TUV�r��rDŽ��eI�i����?08ȉ����U��4��#�͒��㶵<�أ9z�󝝈��7|�;�X��;���h�F*����i���y��x��i�=�Y���F_?���e�…<��H2�b�h�Ʒ�y�E���-�b1�O�ƳO?��(%����h�Vڗ�t
MU��Lij���en��5ߟc'N04<L8��'�����3gϒH$PU��MM��֚�
/\@�f�z�Q�^/�T�S�Z9����=�}�m�޾>*.S��i%;����rL�:�g�~
Y����,�?����s̽v������=isrY�c��+Wr��y浴0���'}�����碊y�ۺc.���{�˖�)����ee,�;�TNQ�<�<]בD�\.7��������ɫ*�dʜ\���ܵ�Ǎ���ܽ�|^sM��uM�J�-w�[�]w�c��y�l6'�G�'�ϱd�b�x�F"�����b��.�̚���$�0'������۸}�Z4M%��u�vjkjx��g��غc�eʜ����
I���x<n$I"�5���y�'OQSS}���;���,_����\�(
<��;��B[/O664�̓O0m�Tz{�X�p!3�g��`�<f6���oc�ڵd�Y�^/�<�$�f���[غ}3gL���|��3��4��g�P__�ukז����|����J2s1�\��U�6�bc*]��� ������޻���FY�����zK/��f�����KH.�C�T�^�I�e�����W^aт�].dYBS���4
]7
JDS�澓����t`�)�y�9�(���^dI*i�~?���"w:�8NvO>�(�8�+��N*�.�Cʒ\�Ȍ����c%�T��~�2��$��w�)�
�y��}�456��Ӎ��,�Y��|�ݎ���rb��L�^,6�K I�mv�N'�"�>$AQl�����饈h�gܪ���S�v;�M)
B���dQĩ�dSyt]A�t[�J��>���H�d��r�
��#I"���U��|�T�\>o
d̉I,�?����$����a�&4&�6��,���y~ap2
��A���������m�9w�U�U��f[�2v�MUY�x>���۶���K]m
` K�^�_qrQX�
��%������G�l�i��J���k�>�Tm̽(ܿ4�
��#I���MQ
t���/���~�
���r��a
�b;�1C+��(��$](G*�ɏF��>3�LIPi��"+,Z�����zA��\>O:�A��tQS]����0t4�ªR׍By�3E��E:����{�v9K߂�fæ(H��"�<���8t�(/���?�0�%�"yUEմQu����de����<����l�a>M#,綵kp9��q��p�(hB�Q}V2�eE�f�#�"�–K�g�L�
��f+<S]3�+dI�.�0�O���v��s�t��:�a���W4�B9��6.B�\>wy)x2��ȩEB�EF��3�������9��������~�D�Qn[����c���B?�=�0�����[��vv+�-����ۿc���̚9�l6[��5p:�ln�Í�v���l�].�Z����������c��ֻ�!�����V��.����fG(XG�l
�x�"dd���U8lv�|�M�n���I�q:�(�������ٱk7��UΟ��fg����λ����\��7����m���eJ8���.����+�m@�dl���\������/��fC3����1+���n�4%	EQ��Q^^��_��z��g�`�=�vͣ�����]���9s��O~���������--|v�n��dx�=w�<}�Y�+���c�Nh-�n��`��j��V֭]��kגH$�ů_�L{;�D��>�A���HUe%�}}��ƛ$�I͟OEE��mc��=H�Ĕ�F�Ջ��������U��Ig2����Z>�~Ds�dYf�ܹl޺�_��[r�kV��|gg�I�/ă/ƍ�/
����'��A�v[�Z5):x�0m��h�nz"�(��es���[ܶf5�,3���D"�K/���i,Y���I_��%�=a���(�&Ž��Z�8�{~��+���s�@ Pnʼn�,�Y%�0Y��o�
��׍����n��M��˯���~���a�O���c�ٱ{w�(L`����](�w\0�(������e���/NXeI��������6�̡���d*����"]
qt��
���|��^�A��Š�(�L,�ɓ$�IEA���pl��gמ=8�vŜ��X��M����_��*w߹��۶_�L[a����,�h��]|'�/]‡o���3��,K%�.���Iq'���D�0�Bds9^{�M��^�]h�,I�nk㷿���a��nj��Czzz���dƌ�lپ��'N�mwޱ���v�كMQX�|�E¿�~XA|&�49~����x���;3AA��荤X:{
�8�
UU(�l檠����~�ӟ���ߠ����S��u�6�~�={�ajkk���%��p��9�����'�D�����Q[SCwO���^yY��N���q�r:I�ӄ�A����D����QVV�� .����� ���1Fd�|�T��BH�T8߉��!�p��y�.7~���}>�p:��bQz��	�4�דH$A��c
oQ����h4F(2��q{�ۇ( �OP��O}}=�R0B�4ǵ!�L��.�"�X�����Sj:�FUU�GY~��bh�p8D&�%�J
�D�D"TWUa�����A�d�$���c����؀,�
���g�i���a�?0`�bT��x�^�{zE�ꪪ	g� ��5��sԖ91���.W��q$!����K�gт�4��QW[K�� /��w�~;U��TWc]�=�@���^r�<��}(�LuU�x���A<n^�������(���D��@�t:�H$RڷO$����f$��t���Id$���n�+l;x�^Ӡ��I�R5�~��$�����FMu�i����D�es�����v�hpWp3��%�=��D"��vS(�i��fc``A�B�r�ҽB��E.����F)���d�?�x�X<A(Ħ("J"�t�����gW]]�,ID"Q��p��|>0`xd��ZQ^V�k��*~�QU�����`i?xd$��e��o!���x�����r�����#��
��}c��d2��i���H$���EQ���2�+�M��H$���fhx�X<N���d�[t���r���d����ap����a$�a����r¡����J�r�X|�;	�F/.��@M#�@�X<��(�����"�Lb^���r��p�0Jc��3lݾ���n�"��"�[,�x��IY�oj�5Q	�C���$L�m�Τ%���˲���8�L���S&����$q��������%����6�c�n�ÿ�7(�r����{¦K��X�c\�����Wtߙ�R�/�[:�^��E��M�2�����{��A0�
j�2G�;�
����n���o�
�N���T� �<�ʏ~z������i�󽒘� �;�ȉ�T����Q�W"�৿x�G��
��u�����[�������]�vv�{�X��K�}����=�{6ޝk��l|lsu���DeMV�˕9���D횬���L���=��s��_�3�!sL�ºh��Hl��_�@_��q��c�Xp�v��{ѭv���ܳ��|���{�2/��9�;y��䙏>>����Ȯ={9r�(�?�l��h�g~�^�W��=�\.���Li�be���~@SU�TR�g�P^V�=w݉c�/딦�1��
��Q��DA�ڋ�+�a�Ah�=&*�0&�_�K��re^�G�rkL��Mԧ��F�3X~��-
����\�y���;����y�'x��$�l�b<nwI���xX�xqa?Z�z3�=�n�H\�|��>ϳ����w�(�R��
�/�7�s��u�����^�u�*+�@j�=���VԂ1�%��2������v3���d�u�ǭF';fqs�U<KU�ز{~�2a�"�>�1�h�g7)��Ebӈ���Y|i�ڧk5y2�X"F���/Q��Ƙ�i<4*t���+|��2��e�c7'_ų�$�M4Cv^2l�D�,��r���(�%��[USq�NK��4��r9�w����B�4|�F���g��!�� �gۑ+��U@��2|#lY|5��]�]c�no����DqJ�?�I�W��j�eF92@V䒏���WISC#�`К�[\5�b�+��-��p�>b��!���8U��n'���fx�$
7'�$��w5,,�4�dq=Ȥ3�yS�U��*fѝp�f���b}t7'_�����e�;��ŵf�gV<�����\.O(XNmM�����X�����,�'��=��DӖ��v_^�O�~�\�/�����V�r�+,,��$t`�����d/-�A���i��̘�>��iS��v{�覱X�3gژ1���}]&֤������bTt�˝x����G14<���;پc��_=.�����c�?��T*9�y�_��/�P��E1r����(W/��G�����s�'��D2���0��"�H�P�̙����񣨪���Y(6��}$2�����r̞�BMu
���C˜Ξm'����d���d��c��|�����c
C7W��?84Ȟ}���"x
�h��{w��f���a�g{�f�����ٲ�n���9s�\.����2�}��F}�R�ӧO��e���TUVq��~*+*��o�r�v�2�����~X�ō�eU��aPWW��=†{ ����M4a��i,^���˙�3d2Zf��p��RJRA�d�
�W�����$����C,C�%��8�(P[[���q�\%�—�R�[XX\o,ͣō���uuu�G��f	�����222��cG���$�I�b�JN�<ή�;�e��EC}mgN��v�?@,��r
��t������mvC](�o�1
i\�,}p�e����,;��c���dY�N��l���ea�y1Ì�6��:YY㯽���)S��w�c�պ�D�7Q?Y\�%�a̘6�Ӊ x�>���,_���3�I&�̟����Z**+9s�
MUi�2��?���ZTM�������O��z}TVT�j������b`����̟��p�⪧��X,��#�PU�������Z<g�0��s(�2&'}��g��d��l�	��i������\rk#��q��R�`f3���"�I3m�t�Μ����`04ij��m�e���Z��.J[Y��d�و�c�nk��y&���,,>�����$�O�"��r:�1}&n�������qA�?oCCC��<3�g��f9~�uu��C��M��y�9L0���ኯ��c��9ʹ������v���t�:��E������n��E��̮Kzz�9�z�������x�u�+��S�1u�1���p0�eޘc~��E�96gNK�ߡ�Z��>k�lf1��{�����K�)&�<��k�N������vt]�4���:�>6� �F�d�&�.^N]]�E�=�N4�yFs)��Ǩ�����a�{��9��ݍ����"�"��x<N&��С�8�N��Єe�n{]]=p��qn�m�3f��׉������uw�3>�����s�[XX�øx���lݶ���nee����c�)c�q��t[+�l��3KcF$�"\1���}����A�����[?������*l6�e�.zo8����j��(\�.���D%V�XE6�%���kG��DFF�ƢS[S���C�ˎӟg\��qY����(��\_<�y��L�J#���`��~l�B:�f���̞=�H$��� ˖������m���~��[��n���tvv�t:��������K�������0�|��g�I�R�j�����̙6l6�g� 
"��=�����!f͚�(���G�������ZZ�<��-sZ��k@�eF"#��1(xP���i����0S�Nc�����Bۗ-Y�n�l��cb��>�KOO7^����&�9����\���TUeێ�8N��i����W�2	.MS �f�m8�N�T����5 �"~q\��bҡ�Ǐ���I>���0o�<N�<A8������vSWWOG�yf͚]ߖ/_��ffc�4���3������裦������E��r���1�i*~F.�#���<���i�=�:	�@>�/����ٳf�v��h,J8"q��qt]'��SS]�(�[���]=v����9xh)͜�-444b�
M45N���c|�/��{?��-[7���Ĵ���gq$Q"��m��eΜ��s��dEf����B�[B��C�2
�F@ ����� 
"uu�ܹ�ndY��M�5�e�V���u�***8y���
�ٿs���aw��f8�,�/`xx���n�Oo.M
*+�Hg2x<^Ο?K,����t:����ĉc�r�B�̀I{����=�����9�clٺEVWTP[]���̚9��g�u:�ɓ'���`�O�TWU�i��ᅪ�f������f��Ձ�墵��t�@��L6Kcc�9�7�eD"���KeEv����H,,���a`�;��u>|�w�{�ʊJ/Z����+
~��袽lUU9�,UUU������Y-9r��AWW'�,8{���S�!IƷ�KW��� $�	�϶3o�|Μi�����tvv�:�u
�?��l����TUV�q�پs-s�"��}VWwyUE@(��v�Y�q9��S)����-������c�ql˶͔���xض}+--s9�v�9�[H$�ۿ�ʪ*vǨ��(��,��Ρ����������ف�kf[�'����~��F&�a�-<���n�~����T��V����/��E�<t�`0Ȕ����=C2�$���r��u���^|>�.FQΟ?W�*��(JTTT�t���4���jv���� �,�L&p�ݔ�Y�h	�X���>DQ�0�e��r9���c4#�n����dY�칳TMe`����G�4<n���bT�7��eA\nj>Oww�T
Q�PU���*��$3�g1<<��ng���r��Ar�v��z?]������C� �︋Hd��>�AY�|Յo��Q�x���������C,Z���������  �"6����39x�@a|�2F�}��9��ٻo�t�D"�y�$���-]FWw'�C�(�̴iө���ܹ�Ģ�R�L�#
�5N

���EQ�*I"e�2A`��Y=zQI$d2��n����~�h$���`������q�u4�$1{�6}��CG" ���DwwUU�,[����.Ξm'���)6dY&������^�K~2�{���f��p8X�|%������t&á�ill��#�e�2*+*q8�=w��6~��餶�Aٺu3�d�ÉMQ����ܽ���stv��o��I,�à����HwO7��Χ�����V��t:�e�P(LYy�¾��^���� �`δeEAN�:���CQl6��Ux��1m(��R0GF"tvuW`���u��C$2�Ǜ>"T0������daqY�S���@*�⃏> �N#+2j>Oe���
�s��!�ϵ# �f�ZDA��v^}�e�*�
Ś!XuMG�utC/��O�8�@?�l�L6Ck�1�۲��E�l6C��S�l����sH��l߾���i;s�x<nn]�:t��'O�p8��o�L{[I����k�t���8i����륡���"7�|9z����0�Q^V��];P��EEe%���\r�)�O��UU��!����	��^jK2�d�e?q5����/0�f�d���=
����Ԏ9AUU�ys��i��Bt]CU�1��EK��]�i��!���nsY��l)qA����n\.'^� ��i,\����Zr�u��TVV���d���jjV��v{������A���Zjjj������I���W��@9��
TUW���l̘����Y��~�”��TUՐ�eq8TW��t:�w�����d�*"^�����>?6��Y3�}�\.��륺�Y�Kmw:�c�9�N3����A��hll��vc���+(/RYe�x=��ʋ<,,,�"�\Q/�^$IB�D<n���/�)�����t�r������r�r�q�ܸ\.���
*+�p������$�p��)MSQ��GCC#~�XP��|�|>_0��GMM-�P�χM�108�(�TW�0o�|¡
N�:��離���Q]]�����tPYYEUUuɻ�8NN�2�ׇ ���R_WOC}#v�
��Ie��`y���Zjkj����K(//��pRu��GD�/��U�~��4���Ӭ^���Kk�R[��cʔ�T�+P5
�W��+s�޻D"A:�f``�l6[Z@JG�x�?�9k�/]>F��}�6�=�$J��̙����ucn2Q����S�8y�{��P�޻o7�d��n[7&>���������]WN_�x<�����\Y��~��-A���{���=����Re\���?]�[��,���'�|�E��}�qg|]F�!����ۯSYY��Uk�+�7^eŊU4�h��_t���u�&:�e����A�~�1m�l|�:`�:�044��clj�b��656^^����=�\��z�d2i���a��}^����@�y}dsY�����v{H%S���,8��c���x���i��$����R���g���S��s����c����D�����-,,.�$�����V_�ډ�	A��k���)�b��=�1�}ѱ��^w�}�N�f��Y�l64M��-㯻ƶb/�_I�Sttv �"ӦN��l;�wl��r��dX�z-�$�w���,�u�����~�1�t�2(�d2���JOO6��U�V��t�&�8�d�����U�
j`mG[XX\/���M�W�$-Z2f%�p8X�r�
�:����}�=����V��K��f��1.!���fMc2Ib�������L�<n
u
��.N�n%�Q^^�}��o�~�������ɩ�S�B!�G��ԩ�Si6lx���jp+?d���q�M����]�r�n4$I���*��}M�dA�$�ʒ�T�+X�j-+W�*�ѕe�χ��%��p��Ξ;��gZ��Z�s���d2$SI�@�4����+Xj�2�4�ܷ�����L�|��� �K��n���wyU��f����߿����2g.�J�
��3i��p�\�t�ǰ�lȊB.���?�0`�����9v��Fee�
MD�Y����Y������G?@�t��˯Jí����uŰ������
��N.����r9/��g���/��v�"SS[��ng�E躎�(%wYVPl
���C�M��ۍ�i�T�`��E�h4�����޳�T:�(H���-��:[�����0,;#��,�TVs:����n�n��cŐ��a�r�J�***�^�r���,QV:�pT������Q�ZY�[XXX|UX��f�,��Iz��;�W���2������za��-n0&�_;�G�k�&����B��,n�~UUI��W8;@͟ˡ�kץ��\�L:������.f�2]��Z�[|%��y�S��D�+
��>ؽ>���K�@��k�@�ASG��]{l6�|�\>�������������7f��UU'�0�1]k��I
Ο��?����Q)'�Ï�f͂�ˠ"|���������kK>�'�JM(�Ǩ���j����^8}��!��N8r�nx�Q������!���9Q��w��`�"��6�'���F���J5_g.gKR��o~���?����^�����k
���n��ذ��;��	��H���aؽa�Nx�=�<E�u�0����K�-k`aaaaa�(f����=<B�~b��TV�+��r(+�x�)�>�\��l�Y�D)bK8�|9,_���������<[� <���1�}�y/I�J&�;c|z�K�8Lvݕ�ӊ$haqks�S[�h��o��X\LI���œ9�_�+C�RU��+0�ދ.TU�ΎNB� ��0��������(�n7,]��t�����"���!��_�?�Ɵ�)̚�����ԩ�W��\����i� J"�,8�L�(
6E��ڡ�a�<��Q_qtA�0��`�ٮ�ô����I�Ӥ�i����/,�Tx���� ����j0&[�˵˚4|5�|񌢪��z���I&I�\�F�Gb����?���9����:?��/8x�� ��)1�����;�^���K�e���+����v�|��7X��{�����_y���~�TO]�𣍜>}z�6���/��o��?�ӏ��b�7����fs���?�����~����%��'N���i�������{������A�wtN:���߽�2;w�pL���'�/QΥ�߽w����o��5�=��ŵ���a��A.lj�G
��o���?�4���x=8x�m;vrۚ�����0N�<EwO7u��L�6
q��������q����Fص���m��~��k��?q�$e�^�����1c�4fϚE(D����~���D��R_W�����=�w���ڽ�7�~���:�/]
�޳�D2���ffL���6�δSUYɌ�����a�ǟp��!�,ZĔ�&kfkaq���f9v����e�K�,��g��]��1�j��/P]Y��o������gI$�AZ���0:D{�Y��<��)S'�WGG'�N���<c:�;�h�3I9~�����b1�?���f�…8��I�)�#�"�8����ZZ00���#��!">��S��
�Ig�={�n�_�@uU%�o��y-s��=���λ�c�n"�(+W,���\���dr=� ��*���L�6�"�o�e�\��w�{���DQ@v�������)S���Oy��Y�l�X!'�H ��|�	�_��s�1�&W����r��a�,\������:����x��Gضc'�g��Í�<}:�l�#G��W���s�q<n7
����y$I�8{�<��s7ݽ�$�I�l���<�o�{�͝�a455��*�T�lw�#G�����2ɖ�����A �H��:|��L�t*M*�bxd��rR��X�d2�����K/�t���#��>�Ï?f��y��cA�Fc��~DEE���TUV���#
��e��W����?���p���:R�.�k̖��r��H�R|�y'O�2}�4R����e���\����������A�4|>��mcý��J��J$��ccm��d�����8��LvO(%�Y�dӧ͘�,��Y�d1s���w/�B:�!�Wپs'˗-��y�E�m�Nt]�����a�fX�����^����@__?CCô̙��̛���O<μ�9�޻�����ܿ�>��~"�(�d�CG��2g�M!��x��8z��L��@�|.Oo_/�t�������H�����{��7�,��:�����0�N���>¼�s�D�,\����f�|�1.�����_w;�W��0�����ͽw�ŧ�7�u�֮^œ�?�̙3���b��h��'��(
��U�̙��={9p�

uTWWQ_WG4�f��p8/Y��a'���ف<���|��'�3{6� �h�B�����A��׬^ş~�_"
"�D��3g�~��]�Q�l��v/ZHs��~�q��%��~�����+Y�lŤ�RM�E��z��H��!����tttr�����n*B����#���0<l����J���?N�����
4]#���?@gw7�����W��������̝3�t:�����ڰ��?x��;w2o�\�.7��'{��S���/����>̔���}%�����Fb�#V��,�H�9��I>�G�$�{z�<�d���������=w���~b�8�a0����Igg��=�Y6�{��Y�+����c�Y�b'N���O>a����e�9Lj\��W_�ԩV����D�0�1�J7��ęl���2�~�Q�v
��d��ꢳ����.��(�$�'J�`qm�P�o�:S�L�yF3��L8�e���:E�,��'� �L�xy`�}���Wy�׿&
���{Ǫl�qغՌ
�j�W�ǟ��8p� ���C���>r����z��y������jk���ȒD}]'N���vS[WK4%����/���x��}:|���vS]UE"���7� ��h�B��˨����p`�۩��E�$kFkaq��q����F����4�ӦQ��o�o=ǚU+y㭷ٽgv���ǎ�p��|�'�_�����?!�J��t���݇(�8��\��?H�@?ӧO�����p��2�zz�3k�x���}���A�O�Fmm
�>���}�MXδiSi�1���ko���G�����K�?������;��s��a�ܹh��o�ͮ={�:��_��%S˪�\��eK�PYa����=[�Xc�զ�����_���--�6!L"�����Ȳ\z��}oI�PU�\.��fC��B?�C�o�
����1^{
B�k.���,�n���	��f�yr�,.���F>�/Y�]�4M���ch���%��u�|>O6�E�$\.�D]�p�\��H:�"����e�|>_*s|�YXX�躎��(�����$I"��h���nG2�,�$�u�>ݲ���w)/++�e�Y��v��MAU��Y�����H��x=���ß���dz�|]�K�^���N.��d9��R)��n��0�I���i���_���an�n[��ߏ$I�N6�CEDQ$�͖�E�es���+�3��ͲA�h��o�0�$��)+�2���]��L¯`Vg�۹����Q�p�MM���%�$]�$Yf�%�� �l�����[�z�^.��G��e�gaqk"��gYF�1J���6��0�Q��c�Dc�,˸���5C�Ü����y0�q��5�+)�0�N��6	��룼����! ��K�M��W�Q�Fq�fY�g�o�
�r���>z��w��"*'KIeaa�E1�����P_���Gk����c��P_?���
�<�E�o>�$�e�w#r!;� `H�[?1U�puV�E������r�|�{�Z1�-,,,&!X^N��j���|>�,Ztͅ� L�j����GI�g�~��g�l?�_�%���}��!s�|�(��gp�B[|�)L���Gd	}��p��W%�-��R�i������Q᭷^{
�~�'Lw��JSP���z2u� ��m��46b������[XXXXXX\�_S�׮�翃��'���ߘ�����`L����f��V�t��$d2�6�ٳ����
/��g���df�[����׷������:1�tR̽�'��X�
�mC��C8|>�^y�7�1�-Y]���ݻ��1s�N���,��#�_�%-E���������uab�	À�����^S����M�^D 6jk�=��f�6
���I��vC	��nv��1eY�(-,,,,�"&�� �Ǒd	�ӅQ[��+V�w��va�;G"���p ��g�nlD.+Cq8��{�S�� �A*�6�����b�̋�	�fsh�������	�YEsM6�%�Hb`P�.;׻�,,,��7�Q����Yd2$I�ab�h��,�c���e-��J�#{���B���ښZN�:�ɓ'���5�)D ���Bd���;�X����6z}=���_!����יB�B|��!���g?��3�=w��--�h}n܈a�
���~�8z�(�)�Z��t:/������,?���Ô���i������Wزm###4Ϙ1&������A8v�|��ys[�$	A�D�tvu,7C߶�=��i�]��rڏ.�/�H"�dJS������R�\�zAصg������]������!2@���L*����ؾc+��QYY�"+c��x덷hkmeڔ&*�a��#��jijl��������v]AU���l��ut�u{h=}�$�=��|~��=��?߁����p��x㭷i�3���w�q{��s�|~TU�Tk+�;:K���hk;C6��aw�_A�eb�N��˅��c�ҥ�l�������ys)�w7YXX\A��3l��	^��X,���壍���oPS]��i��g?����p8�(�������G.������wp��q���I]m
3�O��~�h��ӧ��b����~<n7==�(�L2����i��>�w…G���9�g�q��YE!�N���O���l�ˉ ��~�v�T��я�v�Bfd�QBI�4��#^}�
����y}�H����	~�R�r9:HEE%sf����aTVV2{�L^~�ujkjK�(��8�o�2��D"�3O?Ŭ�3o�����?x�y--$I:::ٽg/o���~�Yv�����f>��cB� �T�ʊ
��?�SN�:�MQ�>m*�H�#ǎ�c�..�ϲ%K�p��$�I��O<�(����)++�����L6���8t�([�o�_�şSS]
@Ww7n���QZXX|=���n�}����gy�
�<�Jۙ3l޺��--���a``����n/��M


����Oa`�_�������	�d2�?��'��FuU�V,�׿}�o?�~����g|��o���o�WU�*+�?ࣲ�b�X=����J�9t��n�B8b�*��'N�k�f�l&��QYQ�?&� K>��իVr�Ю�*�k��FiGs�<�N����>ݼ���J�N�
!7�nL�A]�[?s�,';�u��%�ڛo�if����>aάY�ş�+�M��GoB/�\OA`hx���^���`��e�����M��֭��d���<6���y�o>�$��]��i9Js��vUUU����ŷ�{��;v�t�bjkj8{�]]�h����a��(�
���SO<Ƴ�|�޾>��$�Z[��+���Y٨,,���A���?�c�ϛGo_kV�d�������V,g޼�<��{�ݨ�JeE�'�g�����|��&�/]ʟ~�_�`���҈�Uհ)6fϚISSM���ڳ�#ǎSV �WUl��9�g��.YN������[y��o�g�}/\��i̟�RH$ �˒�f�m�j����DF�d2,\0��x���I�l��N֮^�ys��]������b&��,3o�|��~GA�t1F������9vQ�$�l.G6�#�3�荲_s��	<�uuc�쥳��1�aƚ��l��6DIb$�̙v̛�X؟��r�R)�}���>��`0��n�������p8���B�@?�(��������V��?b��y,�?��� YXX\CY��$�͆a����;ө`&�I�R�r9�L(f��c�X��t�l.K>�b��f�����q��,�cf�~��_r��)V�\ɑ�GٸiK/����s�9q�$�K����8q�` �2S�``�;
�DL�.YQPA�̭�hf�+d�����f�fӜ��Xx�P�oM�ML�6ô�`�%�"n�A������?����_z����D�$�y��Rf�뉪��?x��ٳ�$	���S�����[��4������6�.�$!��ӧېd�Ɔz��8�|����WD�1�v¡[3�9�ڊ�餷���w�ǩ�q8��n$QBE<n7�l��~�2����v{8�<��9�jk�{_YXX\E)d����FcC���/�η���sy����x=�mv�=���?��y���~�"�ϟ�|G'˖,�>dϾ}�������o���D"~n����:dY�����s�1����F	�}x<n޻D9~����*֯��7�z�]��rǺ�p����s8�h�N6�㍷�f����C!.�O*���?��q3u�T~�ߡ�:�n�j�2涴�`����z����&���x�?�9k�/]^J={��4�h4���E�eTMcdd�ǃ��$�O$�|���.
-�~��7�m��I$Dc1R�4��r~?тL6��S�%�H&9r�(�t�{�1UoC��$�IE���
A���"�W�8���Ig2TWV���D�z<%wI���H$B6�E�uDQ�"�(%����ׇl6K:���L��0�x<�������B����.������SOR]UMEE���!��(.����O:�&�H�`.���X�h4J("
?�����}M����'�p����e�1�����
Te�s���H&�T����?�P_ϊeK�����v�����tb��������qy�/�ɲ�B�G��)�"e(��ڻ陞ݙ��{��}�ݻ�v{v�G=�V��ny��$��{Kx�]�*���G%�0�H��9<�BF�2"2�������0�������p{<�DG��}�I�����b��9�>o�]���J�J�6N��Ri�)��,˟�E�T!S
���=����zs�[����k��\y�ln���y*
��'9�"��?�-Z���|������W|�g(Y�lN�w+�]�|�����`ӆ���s��(��n����[�/Ԝ���|V}�Ũ��KU���|�y���=�V���I �Y���Oss�1�������Y}�Tf��
def��n��|�EAR�x��IJH\���/�G�%LjJ
�))_8 *�BTd$�ly��Y%I�]�:x]��B~�@ X�|���\]%���6��@�@~�@ �"D���.B�/�[)\	�@���;���v5�\��׳/f�?�Ͻ�ܲ,c���0��u ��῝	r_'�{�r�5�_
s��J�DSS#��$�'�Z�.IV�����>����,dY��o���R���i)>��$a����;����JgWy��H��,˜;EQ������ѾO� 3#I��oh���#��2o��'O�axd���\4ͼ��<���&>>nI֕@ �r�$���VN�9KnNvp_�������V�`�X�e�^<o._�(
��~���)���^s�����K�}�s��WUW3044f���\7ss=>���w���rr��Y**���f�x<�76��ަ��
���VFGGp8]���r��\Pkim�����q+57n��с�n�8��������3)���KW�`
5�z��饱����f23������{�/��?����==X������`�ᯪ����
��ۃ�������/^�������~?/_�W����v����G���j��z�Zm���011��V+m���ܸ����ך�s<^/V�
EQP�ݎ���{�_����4ccc���t:q�\��[�X,�=d�?����X�|�\�O�e���E����}����;�/I�	�_8��d"?�`fE!%9��������O'՗�����{��t:1|��gINNZ�7���z�
�?u
�ǃF���o�@Cc#>����3j���6�IDAT��NIq1�ηiim ==-(�12:�)44 ��Vs��bbb��魷F�$^���8u���bb�y����e�'Oq�����/��L�%QO��� �T4���{�3>n�'��3\�r�N��2�_�@Uu5(
I���ڽ�)x���?��6*I���>�Y�t�n7z�mzzz����M���Ǽ�·

��?��SO>��S����������A����D���O���͇�C��������˗����`0��ϥ+W�����󑗗GA~>�.�%��<�c����N''N����h4j��ҋ�ߔ������JMBB11��pؾu+^�����(�|r� ���?�h4��С%s�V+���,+F6�[���_���X���8�vl���O>IUu5.����Z��2��������Oyf�����˗9�������r�=�ٰn-�}�\�z��bccشaaa&Tj[z����G�284���#�0������_���dge���Ʀ
(+)�G��׭�����nc��[�pL��j��@Fz:��������o���,/+CQf��%I��v���Lll֭'#=�pS�._���E�1�Bilj�����y��j5���{6���/YQ^��n'%9�~��:����������ܨ���񰼬��nc�#��>�h0�y�~���d��7���$����C>LLt̜IE!<<��~�3g����;�"#��� ""�ñ$n�$I446��j���D�4Z
F����P�v�d€tf�d�5
�V+
��,/+��t�֨	C����EaǶ�de���p�F�^��'?���W_{����j5a&S@�0$d�W*��GHH&S(��aȲB�!�Z��hD�ף��@�d2���@Jr2c��X�V���1A;Y��O���|�L&~��}NwO�ׯ�����'N����ؘ~���~ia;o��܄��$22�Ʉ^�G�$"##���B��"I�V����"5%�ш��F��I�ף(J�:~�E���Z��h�5�L��1�PJr
�i�DFD��%@�
L.�����{6���V���͛ٵg�==����S;Q�T���~?UU����$QS[�˯����V��·8t�HpB�T~U*��|>���@Q8p�0����t:�M*����%u�N��QR���$V���.��x�|�|>T�
��.�����Gb�:��o>!>�����3���IFz:�N���khh�?�Issoy��⃽�?�Ocs3ˊ
9z�8U�5�T*�֮^E}cc��x�^
��p�\tvu����ѱ1�_�4-����OPU]=��׋�`d��r�~�=***Y�r�M�$$I�������74��T��3b�L�+��4>�	���(2���INJ���|�(��#�+�+!�4��
��KGg')))��\.Z��HNJ"�l��������䤤�.g���^{��|���<����>a'9)���t���	

�j������(������8�=�4C��455��鈊�"3#p�J�"%9��O}C##Q��DGG184����I��dg��ׇ)�Dxx�]$%&`0����j�2b%=-���adE&>>���6'y���|^�ZZ��������{������HK�Bjjnfhx��DZJ*c�c!I�QkHKK������A�Q�ʫ�n��7�K<u�
�|�:��������磾���Fnn�d�%Ѐ�LX��k�Nyi	�����a��iji!*2��0�}�BBB<�))�wtL�Cμ��s+Y�y����׵ϵ��Q��IHH*�jV�,�E=�:a�G�й����VZ����|�����'8����[4j�Z�o����+�o���+o��b�4S�~?���M7p�=�|>ۊ��[�/��w;Z�K	I�f��+������{�:�"������|���>K�DRb"�%�|1�15ꙕ�InNμv>˦A[^VFB|���}Qۂ/!�+KEQ(�ϧ ?�DEQ�����C�V���R�����uK��`	�eN�J��	����t�%�@ ���_ ����@ ���_ ���y���W�i�[�4.e撹��2|�2~]�B |u��C>��Xj~���g)��ʜ�_�$�;�ih�g��~���s��"�e��457/�'I6��=~ġ#G9{��3�kj���gΝ��ի�lXm6�?���X�!�:$I�����g��������p8��\.^����E�؉�ި[~F�$ZZ[iim���us=��V+'O���a�eN�����������܅˗�����续ݚ�j��;�|�*��-\���j�G7z�z�Z�Ȳ�����t��x<9v����T���v#�2��V�8���0v��bY ,���9|�8�cc8�N$I����7�|�ۍ���vs��e����(Ȳ<ßN�7>>Υ+Wh��z����Gq�I�4�hhLO3u}���#�2^�/�5��HP�������{4��6P.�[,��̻�_�$,�N�<>�f�Ō��(���f2�k���ݗШըTj������G����;��$n��(TTUQ���Z��}�%!>��w>Ɉe���'�v��Zٸ~=۷m�����C~n.����j���xf� I�ڽ��Ɗ�2������GdYa��Gظ~����@��Hu�����߂;�x��?���W���S^Z�ǟ��d
�2:JfF:FQL&�>��,��������ҲY��|�?x����DGs���;q��w>�!$�w����[������Z�"#y��������S��DEF���O16>Ξ?��r��0:6��
$���ޮ����t�j�
���x�'��L����G�Ҽ�$1�p����z�:V���~�����	�yXpr�������@���ԓO���2W�}26>�SO<���P�]
8�N���)-^�J�b����?dll��N���������Y���﹇�N��x���%%9��(��易����^;NSS3U55�[����\����<MYI1'O���/v��b�($'%�^",�DMm-+W,'?/���|���R
����l޸���CC<��Q�~?~��}��h��DQQ!�2�=�J�ȱ$�dz~�Z���'�z�:-�m�#IG�'!>�����	����
�Q�T|�g/�|��/Q��Oo_�����8�N�{{hhj">.���<̑�'p�]ʵi��]EQ0���ry��}j'�K&n|Ә7�+�BBB"[z����9�ȲB\\,O>��8@wO�,���O^n.y�9dgg��߿$n�$I�����)aY�	'11���z{���hCi������R���l��ձ���ѱ1�6;]]]��Đ������<�u+�N��ԙ���^��FIJLD��[N��|�l6���@|\>����L&�))DN~���%**
I�����������tvu���CVV�		��LA��'�E��|��ݻ���[���ׯs��Y�J�IJL�/N�����Ȃvv��CgW��Q�
INJ�lG�R���BvV&F��$��hHOKeՊDFF`��3�KLt�	֒$�R�������V�ط�˼C�q�q<��#$'��������˩���ͷ�A�Ѳj�J�;��f���
�>��J��_Q��k��H�h4"�2�
���#�WT�c�9s�l�}�Ի+EQ���cb�ANV&^���Tbb����!+3���jq�1�74�~�Zd��"+Ȋ�"��@ �I���9�k"##��磏y�����������;;�8r�8/_f�U�T*N�:�������\.]�Bss�J�N��� ���!22�8p�0CCC����k�^�[Zx�'���

�:���0�C���9t��MNv6�v�f�ʕ���L���8YV�'�x]��bll��IaA=�}�r�D���șs�'}lA^�p3#�O�l6/z�&�����;bbbV%'%�(
���V�Z�"**���tBBB��� #=���<�

��0�t:ٰ~+��}��$I8�N�;��MHHH�`�ʄ���X�r%��0232HMM!=-
S����8zz{��lڰ���0��������F����
���m����HLt�ii�FHLLX�z��N�#!!���dB�FR�����&22I���t233�������rq����p%��ܻy3Y��!�N��K����p0>>�J�B�ѐ�������͛6������P���			a���ȲLWw7��d�2�������+V������ell�����IMI!:*
��@rR7�꘰Onb�#���IrR�$�)4���@o_������ ��($	���D��s"Iv�����Yǧ>����/��lͪ5��šO?z��K����������Y��
���Q$I��>���f�m�@pw3��\�;q�/���o�j�3${?+N�����†��w�Y�og�Bi���e�����M�s���~n��g~>$I�����2g�!u������
S��$��/�
=_�###g�,�[�u!����\~B�$""̤���V���$���qFE��ڸ��	�IKM�>#��-�r��dy?�C%F�@�U�(
奥���̘�ym�DG�|��xG}�Z����5��K�Υ�]����ė�%�F�A��j�m�����@p!�@ w"��@p!�@ w���5�7�O�:�W�o��ʰ�Յ�F ��
K��,��f3g�$���.�R�缁.����NQXP@zZ�,s��5���HLHX�r-��墲��NGHHE�(����‚�Oo����Z��h�ze%�+V,'�l�����	�9|ƾ�@02ba`p�������:O���MUM
��RSS�h;g~*�ٜHp��;�?4<��cG���uL�$&�v���^�_��e�6���Ѿ�455�_XJH�DGg{?�����s��|~p�鵊�b=�G>|>?�?����&���۴����������~�? ������`� I�--��!.�+�_��6�>9p���Y���^��#G�Qs�v�Y�8�v�-dg����O����DcSӜ���k����;˼C��-�z9z�=�e�X��@rr2N��}��癧v�R��T.���'N���EFz:�޳CH�b�5HuM
Y��
F9��f#';�->�Z���v��;�b���Z�g�F֭YCOo/v����K�)+-!<,IRq��%�\�F���G��r���׮WP��ϣ۶�����
����(ܨ��_����F���~���W�XXVX��ヌ�l�������܅8�Nb����dYf߁tvvQS{�eE���#�\�|��ׯ��uk�r�:?� :��CG��n�j�;:�r�z�wm�Fd�L9��v"�fv<�(���2:6ƽ�7�t��oh 2"����;NwO����Wc4x��6�o���xH����dמ�\�r���a�ٹ���h���rr����t�ޒ��ȳO=ŵ�
*�k�7��ѣTTTR\Tĥ˗9v��b�3�������e�5j�3�x�������̹�TVU�������&-%��~��磶��11�����2zz{hhl����`0���h!�����
�9|�XpT@ ��(
a�0��>F�F�|�*�$&ij���Ԕ��H�wvws��U��r�kh�9t�(�]�l{�RR��96J�ĸ��[��K���F���������ׯ�t�x��i�K�Z�NN6���ݻ�x�ܻy3	���74���BQA]���76r��&&&HMI����j����(WN�yV:����4�Y�|9�A�;ļ�_Q"""x���̚3�,�df����`��GF�446�r�
6o�HYY)���K���;q�\���2��	���P��?cx�d
�����2dE�f�SSSKYi)�	�DFD���KtTaaa<�u+==�\�r�˅F�&?7�ҒL��8��.�@ X(@||�%�ded�v�IMI&..��+����B|\,�yydfd IIII����=T��r����唖;iu�.�aal߶���^.^��F�f�ʕ\�z��.RTX@Jr���v��s��5�GF����{6������T*���[��9�$��h(^V�c�%"���8	�q��呝�� M{��hHMM%>.�U+Wj4.����2o�7�#x���+����4��g3��QT�Ԡ�j����ʵk�<}���j�.��nuM-)�ɘ�f�~?�]�\�r���r��QY��Ȳ<�]aph��(�y����~dُ_�q8DGG�a�:***ikkG�L����o���G����'���hdd�™s瘘� 44�ʪ*���Q���uu\�~��|�rs�p���^���YVhlj�ĩӜ<}�s�/����K�+*iimc��ttuq��yV�\���&:�S����NS�v�]�```���8=FUu
��A�	�|�t�j���HO�h4RYUEcS�##�>s�S��p��iZZ[1B��`��X�[��E=�a�S;w��ĬJNJFQ�f31�1s�$I�,������d2����^������2�.7�mm��s���_���!IN����}��kIMM���b�YikogYa!��w/~����������!$$�Jňł����C�/�99��j���oh�����eelX�Y�IMI!&&&����u I���zrsr�y�DGGQ\T���axx���������%�hDVdN�>��j#1!��۶������

��׏e�tw�PY]��#��1ܨo�����26�_G��LSS3~���w>����܅�457�������r��nA;�l�HNN6�mm457C��Lll)�Ɂ�Dt�-����a��x��{)+-!&:��^�F#!z='O������~%��|^������`X:sþNH���n��t�=�r���o��r~A��֬Zs۽���K7��Y���^����V�RS0����ڻ���	i�}�\�|~?���͚�����f���g��'Or��E��������o���������2��l{�����܎�)dY�!�3e�����?�n��^T*�m�U��/���>,˜u����iC/ś��j(,��o��ϗ�ԛ��J�V�?���@ �;��o�Y�$�z=��p�Z�ts}��)�$I���	JK�g�?����t�ݝ�����`п���n��[_�y*�0
��EQX�|9��Ũ5�/��p��@lL��ij����%tZ��K��.���	�N�^��RhHH�W�L.�dA�"Dz�`��eЯ*���t�_ ����@ ���_ ������\Χ��u`>E�ϪRu�u3_�@pw�e�ɥ�w�Z~��3�K���� ��C��@��KuMmp�EQhkogtll��tK�6�i�蠷�oƱ��1�N������AsK+^�w�t��[�s�����ďD ����'���1���v�E��������.V�͆�f����v�w�y{�==�;~�ed�^��f��x�w���v#�2o���55���H�DwO���:~�	��S�e��8IK[ۜe�e���}�CG��ʫ�2666��������9�}}�|�w/���_��ݴwt,�z_>�$Q���#o��/I��=�8u:��������_�l��{�r���y5�:���Y�k�+�VQ1�s>-�\׸�w����K-�-��G���������TtL4�-��8y�->�"+��J:��HOKl>�V�N~���뉏�#�l�ŋ�5j�s�X�|9QQ�h�jN�:��8V��ukV������0�CCl_��Ʀ&��:�&S(Z����*�L&6mX��j��Ӵ��QXXȚU+�[[�e?�Ϝ��K������l��D |Ex�^���y�ͷ�0�ٴq�����def�ڟ�Lll;�FZj*u�
�l6�bcY�r%�"s��EZZۨ��#?/w�k557SQUExX8%��hnie��h�j�^�N~^###��+���ϩ)2e'�Ʀ
8w���ì\��匿��ϋ!$���*�����tR^V
H��F�\۷meͪU3��u���c.\���b���l_���d�w��"��������2����|�q�?NKk*U��v��Y�~�1>��v��܅�K����������N��`$>.���}•kW9w�"������n����o��,��ި�l6���������#=~���+���������xPPP�U�����;�04<���ETTi�)$'%�x�Z�W�$�����ի�=w�V�Z�B�ӡV�Q�Uh5Z�:
�M���8]N>�d?�O����Ӝ8y���$B�zn^2/I��V��244���x<=FMm-�=�����W^�Ì47�
̴��r����GTVU�JRq��%�:��秲��ʪj>�	�++�W���j��F�~x:*IB��Lփv�Čo"�~EQ0BY�n#�9y�)-)f�ʕ��kv�^��+׮�n��}�)V�X��+W�ey�ˊ$I�2>>β�BY!//�m�<Lq�2�]��
3�}����p:���T��PZ��^������cYa!%˖108��fC�֠QkX�z?� �����������{����/����]w	�������-[(*,�f�������L�ˊ
���d��5,//>�g���-q�.^�†��yd��23�O���
�C���2n�Eyi	�._��kdee���D�d��TQ�ώ�n��pp�F��x�G�>BnN6*���R�xlёQ0��Y�f5?���	�c�Y���b��5�Z��J5�5�N�cٲ"�23پm����A��7lܰ���+f�L��#I�n}������J�?��Ą���ʤ������%�z��QGlL1���~FF,ttv���J^n.V�-��-�3+#�}�<��cȊLll,��9v�Ą֯[KZZ|�
�SS�#�(��ݖ�]� &&���Tx����jŮ��݄�(���2� �2Z����qIOKC�Q����@`���t2<<B[{;��	�U*������e�bA�G�$�jf�y�_�HKg���?�On����I�X�v
i����7HKM%;3���K���'�bb0�B����n&��"ˠ�����r��٬6:����� 6&�d��0�����?9Z��V�al|���F2����}���_��|j玘��U�Iɓ�Jf����|7/I^����J��IB|�Q�V�`�tttp�zIII<��!���,n��=}�������dhx���V�jj���aǶ�XF-$%&!+2E��h4���p�=������[�|���ԓO02b����u���� ==��72:>FrR�qq�QX�ϙs���A�?�Zl� �MH���f��rQZR�Ȉ���P�KK��页��‚�"#�v�z��Z������F��x�'��ˣ����׮�(P^VJCC#���
��Z��0s��Ejjk~i�F�bc����>a���?�����O>�v*�ƍ\�|�����m'-��4�]���J"��h4j������bpp�)�ή.*��ioog���lܰ��(�F�Z��vQ]SC�����bz��hmk#??o���$	�ݎ��{"�ԇ��x�����Y�涇��k'O͊P�TȲ���G���)ݸ�=._�ʲ�B�""��@k�����=���Y�Ѐ�(e'���e����vc0�h4A;Se��1[U ���.�;�y�Ȳ�l�e�$q��Μ;�_��'��f����~��ǃV�
�?�Ϫ�j|>���y��ߓ��;�DQ�^�4>��v 0I����g�E�����`yy�n�4#��ei2����]�R%|�׃�X=�\n3_h�y���p�����>�^�c��
(((
�CYj���2�\΢��e��N�R�V�f�e�^vü�L���\�a�gLv��w�*U�wxx8�iҼ*���i#�sJ�v�D��e/�yx�4������5M���,ߧ��h4:�!1��}�$���{��y!K��"�E������l4j��C�	�!�����$�Q��R�x��/`ҼK��.���	sx8���/%��L&
��x0�$��$@t��""�����_U �"^:�@p!�@ w"��@pq����_����Qݪ_��7/���u:���.U�C�@p�!|�7�9@�a��6�Mg||<�	��b��p,vynY��XF�z�n���7 �l�y�������z����8{z{yo�3vV�$	����{l����˥�Whnm���-v�	�Eb1���fA6�
�ݾ��'�l�����l��ٓL8&f��%�+׮�_���Fo_��_~MWw�-5�I���׿�-o��~�/�7Y�9x�M��s�[Q�z�]Μ;������S6�WV��BOo'N�f�j��4&Μ=������˜8u���+?yr��O |	��wܜ��SYU��f��|6���$I�����G����8t��1��pp��2��"X�̹W?@oo.���v���N��$Iq�1\�r���~�WT����=�6���MUM5~��Ȉ\n77��himC��:���W�$I\�r���!��fj��`r���e��p3�CC�u���Ԍ�h�d
ett�}�v�j��Ghljfph���qjjo���������j��lj��`�j�h4r�����eh4�߽�Gz�ˉ��!$$���FB!�L�<|�5�WQ�l�6l`tt�ֶ6֬^�J����->�����PPp8���a63>>No_���`$����_�x<DFF`

������t:&��ֶv�[Zhni��qSW���(1�ѳv
Ue��11���q�z���v}>�H�DI�Y�:�"���1B����`0,vUߵ�j��y�@?
����[���!66nƉz����^���h�Z~�������^����gIKM���K|��~$IB�Ӓ����"�2{?���e�����f
�������q��)�y�=FGGiim����
��q����^J������+�j5��������.�r�
.^�DGg'�N�@��r��545����Z��ѱ1.]��m[�^Q�C��r��qV���C�)+)!66�uu:z�->@ڤ�@ ��!I�**x���A�Pd���q�~�1kV�������@B
���2N���'O�V�IOK����=~��n�¥K�����DCc#�==�?x�^�CGHNN"1!aF>dY��/��Ʀ�Y�SSRp8�\�r���jn�@RI\�v�^�^��;�_�ȥ+W0�dff���}�r��?�P��(h�:JK���̞�xfF��y�\�sx8W�]�����GgW-�����PI�����E��A�$���
��\��}�{��s��>~ُ�h��;x��'��pP]SKa~>:����h^x�9�
(���٧v2::���"//����K}C#>�EV�*	�ӉZ���#[B�R��_g���F�!$D����v��Ïظa=kV��b_sBBB��t�$��)��p8����F]+��c4g�Y^VJnN�=�4e%%�9w����K]]=�Ch��߱����y���$zzz�ٯ4j���;��HLH %9�!���f���g��lO�$֬^MNv6�}�;dfd_���7�k�ZV._��
����$	�VT���tDGGQVZ��~�#�JJ(--�瞣����v}0C�n1�QW�9<���Dd����d�jett4P��a��5jT*��c�wtPZRx�%	դ�JP�R�Ռ�����&����edE	�>ccㄆQ�T8&&�\.����>�ׇ��!Ā�(,+Zƺ5��ȏ@�5GQ�S��w�����|����"�y����l�RS�i^}�u�n7�022���$DBbB<kW����$'T豈`�F�A�
�.�����G��.
�J��h��2����aR# �6:ƭ6,�р���6*����exxd2���ʜ[�*�Bffy�y�[n��;�������‰S����"11���*�\���x�3xy�^�WTQR��JEX���׮�?�׿�V�y��'ط����DEE%����ihj$D�'5%���!����T*���1���V�����h4MM�
'))��t:������eχ�|�P����f������폲o�J����Յ)4���X�n7�
�,//]�z_�

>v��I~nz��իV���y��g��4y���������O�`0���x���9~�I���dg�!$�VGTT*����)��ʪ�Df�
I�G�<xSz��L�)��HuM-7�ꉋ��d2AfF{>����S���j���
�9�l߶���l�~�=^|�r��D������_�/�ٚUk���sӜN'j��N�$���bll�N�9<�׋�bA�
�,�ܢ8].�<ĺ�kHJL��v�t8p�\���1���|>�z=X�W]]����M�|��n�n��d���ta4p8����b2�		A�e4-n���ALt4�������Z��ݎJ��7/�����ٹ�����F�QHU
���

1���z����o�����?$/7��=+���e�n�`0��뙘��j�a0��vF U*�n7��ۍZ�F��#S.?4Ԉ�(�W�T
�Qk0�Ѩ�Ȳ�F��b�j4����g�VKXX���I��	|�H�D__�en���7�/zQ��]�ZS����

ӳ'I�O-˻����3���7�}-��
�gs�q0y����؉��]��ؘ���_��DI�8v�$�._�o��_��jf��~��~��4p��<�����s�����*�ߑ&��7})=�=���03��7���m~��ؘ��Ь�۷m
�oaO |=���V���X߱�N�������k~����	�&b,f�"~:��ò�B�S|e�i��@�A_�U!�@ w"��@p!�@ w"��@p1o�w:�x<�y�E���̘���}K�������ʵ~��Ϸ�m�oA;�,m�I |�(�2���,��Yd̗��U�m{~3�+,����%I����K�/��%I�j���'~��,��^������)���_y���y���}̧,�?y����y˰k�^N�>3������_�
�mm�9{�<G��X��#��$�������K���n���'��������9��t[�����p���;����,��'7��v��ks^�oZ��bz=���x<\�|�Z�����t3�{}>���ٵg�y�������G����8V�s�����.���[�Z1���Ց��CB|�ɭ�U
�t:q�\�$%a4���TUW���100��nC�ӡ��#1)�ZMWw7.����	N'�aaA����zz���&1!����Ń.�%X,�\�F~^fs8)�ɜ9w�s�/���}���0^{�Ϥ����0��b��q�݄�A���A����'2"b�u$`bb���!t:���LLL�$IX,�����z}
��ꈋ�Es�V�3�hu��ǡV���f��$I8��>�"�����t�������7�DFz:[x���~O�eΜ=ǹ��KRR:�v�o�7�y�$I�\.._�LDD$�E3(
�IId�������/��J�J�������}7�����-�FEQ�^YIɲe�\.�O7��'�.�η���b����񓄆��� '+����457�Q�1������?RSR���%..��OdT$��E�j5��#9~���t�7�뉉�a��GP�T��֢(
CCC��_�����%Q?���"�T�wtp��ax�Ǹz�:�U���C�JJ�������)���h�y�11��.^��(�̛�K��Dk[�6��u����c����űy��۵����&���Wy��g���cc����ԓO7�M��S;�����f��͛imo���ˤ���R�I�����3
�������2*��iim�h0���mA�����KW�PYU�{��ҷ_ ��ܧ(��j2�3HJL�}�->HHH�Y��<|���T~��� !>����,��7:6FwO��
�
���7�K8q�46��˅�����g������EUM-9���z�~���� ;+������LLL�t:��k�c��`xd�F�������!+
kV�◿����h��\�j_������_��g���ͦ�(+-�'?�ׯ����Ƿog���v�1���?�)�%�|��'<|��e����+�����'���chxsx8�V� 3#���8.]��B"�f��i֬Z�9<|A;kV�F�Rs��Q�m}��WɊ��q���������0�	�c��ƺ�k��_����n����x|�v�oۊZ����h4r�ƍY������%7�i��%����B��!"�F��j乧�����76 �$�~?Z��V�N�]2O$I���CH��A�i�J�J����j�j�������Ѩ�����2T��V��`0JHH!!!h4jdEF�V��8>���y�5�VM����J�		A��a0�̚�#���t:tz=F�EQ��t�ANB�V�Ѩ�'�bHHf��ٌ��=��'MS�|�\.\.WPm��?�>	������Ɔ�먪�����,/'**���{�4��FSs��v^{���|^Z��N�F�I"�d�d2�E���BC���D��!�~�M0�˲���������f�[pg�w�?-5���l����luI��2��yd�C���+�:��~�y�}::;q�\|��g��>�~����*
���h4�5jo��W��k���ދ/�Ɂ��j�Z-�d9uZ�mm�@zz���h&���`}HH�L&۾���/�{�c2�2<<�Z�B�բVk�C\��Z����d��FUQ %9	����x�o?�<y��;xݨVk��Q�˯����~�رI%�����Esk+e�%<|�k�$!�6oXOu�
�6Z��ZMzf&~����N�JK���r��1lv;ZM�7-dG���ukyw�\�r���סѨ�=w�F�_��x<|r��+*1���2>n�؉��232x��@LQX�r%99فzx�u���s����W��VR�>����Abcc�i�x<�������d
ehp�ѱ1������^�r%o�|���^���ehx��	��q$��384DHH�	Q�Q�(XFG����>1��;��x!>.�ѱ14
�F#�CC���2<<LtT��eP$""����vd����(��51�L�]E��+`��f������I_?.��Ԕ��Lww��aܨ���O>���#59�ԴT$�����Bhh(�qq��vFGG�$	�ZM|\###XFGI��'%%�J�+�����_��gx�^�;:�INNb``�ѱ��ȲL{G6���`$����2:���緿�=q���Z����L"#"p:�t��f�h4���
J@=**���8������>���7�߮�O�-}jn!M��DQ�^o�g>���=���"IҌ�(77��ө�����,���؉��x�����iC�g%и��?��_�ԓ�S^V��ϛ���?u�����߳i�z���d[��/έ��坩

��d��M�$iƲ�[�]O�~���4�o�S=w�-�zw��~�s}�$��9��g�S3��JK��Ξ���ڝ/���7�_OZjꌿݎm��<_(���Ρ(
�YYdge}ဨ(
1��<��c�T�;`U*�iO_���_ �0_f�*w�"���@p!�@ w"��@p!�@ w�~�׋_^x+�d��'Ey;�_�z�@0����Ϸ�/$Ib)y���~;�~��3g�$���Vjj���g_�$�v;�O��3����������$���q���;��c�;>c���W��_��ѣ��p�3�Q�$�UTp��%]7���"I��<|��������`:�Ն�鼥=EQط�׫���o�$����y��us=��'&&8}�$7�j��.��/��k����@uM-�C�@`K_�Áo�ýThkk����޾~*kj���`�Zq�]x<&&&���rs��9T��b��f��r�v�>Ԋ�011�}bY���o��ի211��U �޾~N�9C��c��(���J���l6�n7o��.�Ϟ�� �2����Atj3���A**+����Z�,c��q8����������,���8�]�ws�)���1�^/>��ۍ��j�1���r�q8��]S?-�{��LՃ�f[2�n�D�]�/IV���gNBNv���BZj
QQQ���n~�����v3::JTT$O��I\l�Xש(
UU�'O��f�n�瞥���F�{�>@��XmV֬Z�3O��
��')!�W^��##d���ԓO���}�74��z�Yz�z9w�"�F#O>�8*�ąK�A�$~�����$�D |uH�DCc�{���=n�z�	>��W�|���q��qBCC�pL���ʇ�C�բ����C�e�z�]�N'7��Y�bŬ�x<^>ػ���bc��w�=�?t��{���o�ŎG���K465ųO?5�WO����<���0{>��,��044��W�0�	#1!���t���q�)-)!#==X.���;v��}b�����z�:n����"�qq�G��ܧ�T������:�����v0::��'�@�p߁�|���x��?xh��<������#I*֭Y�/��/Pd��'O��ׇet���a��^vl�ƅK�p��T�Ԓ������ze%��Y�
�N��+y��q�ݜ8u��W���h�����D#+
�%��g?��v��Ҽ�U!EQHOO��?�1�		�74�~�Z���x��/�f�J���u�C<t����[q��|�[�c6�������A�����e��(��ޱ$I8�._�JB|<�6l 5%�$q��U�����l�:.]�ټi#����ܳij���}DyY���O)-^ƈe���p^x�9�����"7;�o=�,/_��vSZ\��-��P�T3��h4�~�Z����|����%7�i�����T�<�	�	s&�e���H��$G����EQ�X,������LrR�蒸��$��Ԍ�R����,��"#�z#��0�^�'>.���84j
V����z��J�HO�G�>��m��.�8|�(UU��~�~?���$'��;��w��f��"1��x<�Ů�@�H�BMDEE�,˄������d�b <<���0$I"2�LfF�yy284DbB������+��9<���x}>�x�-���Y�v
�**9y�奥$����|��ǟ�|�޾��魷hik�j����FtT���H����8�	��$Pk4$$�SXP@xx8C�\���S�nA$) i�'6&f�>���cޚMHL"))��ظ�dY�xY�W�⏯��F�a�u|����h���;��з�,�TVW���EH��ں:���[TTV��3Oq�	P&˦(�<K��݃��&;3�޾>�;:��0S]����8m����q{<���@����X,��+Jж@ �KQ%�A?�/6&���Q�~�=v<����DN�:E�ٌ�(�����Ï���f�ƍ�Tfll���
�9u�,u���$z���R�[��j�a��XVT�;���_�c���74��l�팏[QIR��{�SZZLɲb@	��W���s.��݃��xY���8u
�9���d�=�<Y�ˊ�����G�!��@=�a�S;w��ĬJNJFQL&&Sؼ��Q�III&59�^GfFˊ
���%7'��D�Z
�7ndYQ�b�I�p:��9{��7K��LB|<*�����DIq1qqqdef���MJJ
a&i�i��ʺ�����1tZ-��BA~>)��8].J���r�rtz6��U+��~�Zbb��HO'**������5�Z�W��h$#=���8"##IOK%##���dL��$�$�����l&..��΍�z�RSX�j%�֬!=-���h|>?+W,���(0�N%a&""���d�>���w�eE�;:	3����[��͊��
�q:�)).&/7�р��'9)��42��1��DFD���Jum-*I"%9�m�<LJr2��)D����������9<���xrr�IMM�JJr��N�V�9�ι��N}x���_�/�ٚUkn��5�~�\ߗ�,c��3�P�ճ*d!�h�ݎ$I���t���i���f.���9v��/^�?���E��̐�8��W�̃��φukg�_�P��c~Y���c��
��=�۶�I�����b��Y�_�9��6�R�a*�j���\y�/ߡ�&�|���b)֍@ �j��o��$I"6&�����H���Z6�bc)�˝����](��(�(���5#��-�r���Q�C%�.�}/�xٲ/�VLt4�{�E�:���*��g�z*x]�����@p��V��k�w/B�G ����@ ���_ ����@ ���7�˲|�٘7k+�4��*ͱm�Be��e��ԋ@ ��|�~r��ݥ��l���$���I[[�7��t��{�S�Ј$IȲ��S����\�2����c�9s�׮_���FSs�##�{�zU�5�|��^�211����p�\�!��r��udY�Y��|�4>��diE���K45�,v��tuw���3�[���z����Z8v�(m��I������w��7��-��Q ��T[G���Rk�I�D{G'�N���������MM#IRPCz*�^��}���͟�~��ή�M ���c�$�ѱ1v��ˈe�۳��C |uH�D[{;���n��K]}o��>~�YQ����\�~}�yS�O�����g����4�֬����6�U���L�y7���ꢳ�k�h�2G�n�F]ç� �s̻�_�$��9v�([|����Y�F���������|���P�ըT*�v;�<DOO�))<��!�L&��6���������w�ccc����c����n�6���&&&�Z�g�F�^���p:]Xm6v��CMm-�<����|>ܷ��V��Q^Z�C�fEy9�����W_E�R��ѭ�\�\ll!܅(���:��7��N���;��G\�r���1J��ٵ{�aa���ɱ'�x<DFF��c;Pd��~H?�u�������s��i�WV��9{���x�NLJ���͛�ol�ZE��|�q���f���v"#"x��'pL8�p�>l6�v���Z4
������|bw7��5�Waf��	���G�m
J�Nu&w�+׮a�Xx�爋�~�p��}�,Փ��!��y�I���r�*I:L{{;��s
��:zt�}��ECc#%��P���������g�kh������Z�rsY^V����z���� !>�5�V���̽�7����Ç�X,<���S[W�Y�bi�)((CCٶ����8r��m
�	�o �BLt��|YQ�^Y���"2��y����#3#�U+W�ry9}�457�i�z��x�'�?x����?OnN�,�<�=�ޏ>":*��e�DFF���ǵ��wt��Ԅ��c�d�����t�Y��	xw �?��dfd��ށ�ba�ƍ�X,���QQQ�N�g��|��>&����Z�</�U����)*, #=�->@xX��w�y�2�p>����2�L#�2�I�lt+{>����A�~?�������eEE��w,�(Imm�x�^���e���hr����΢��#8�j4���K^n.�$a�٨�q��R��'';���H�::()^F~^�II,//��͛8v�$�U�}~��Ldge������CY�]�DGG���Ejr2^����X�"#)��#..����RSILLD�$�Y�j֯������6�
���$2""h{�ksx8�=��<|��˚ի�z�gϝ�d�2���x������X���:z��~Y�j%�99�DG�R���ʢ�xa�PT��F�%/7�����ǭDEE���JrR*�jƫ�ZM||\�����}����;��C=LfFּA[��(�̆u먮�e��=h�JKJ8w�~���J�ݼ	�J��_Q�kjHOM#�dB�˴��q�	�WV��r��edY�/˓����[����V�1B8~�$C����8u�VG|\,*����$ƭ6.^�LIq1~�e����w/�����J�τ��1l�Б�lX�����W������(�p���_�Hyi)*��3gϡ�j���"?/���:::�T:����@����]�����Ê��|��>������Á�L�ٳ��,���ٳ��qRSR�h�'�_����Ԁ����edY�����؄���f� 7'�ή.._�BrR"q��\�^��((�BfF�PS�6�_��dZ���$�i�Χv�Y�����(�������󤩖�^�#;;{R_:���(�QZ\�V�a`p��+W�n�ZT���2@�$�N'�dӆ
$'%&�(2���Z��M֣�jIOK#.&���L�P�(��}�lF��Lj�BB|����x=���NH����*$	�))�BB���B��AzZ���'�eH��DD��HOG5��-��C��2� '+��4ƭ6����z}��pY������$';��CgW7�i���04<B{G\.	��ttu���ź5kX�za&-���T*�xl>����Z��ҬZE[{�vV�~�Z�ss������8bccILH !>I����������~@���� ���8�6�a��jj��l��v���XVTH�^τc���L��H���n��t�=�s���o��r~A��֬Zs�=������磧����8BBBfU�Bz���A���e��y��_*u"�Z���}8v�$.\�?���^��5��VL��v���_�+���x�g�Yȗ}�4~��������u�x�{�,׭�

�M~>$I�����2g�!u�����b`�h4d��:�hQ����8�y˻�E |u��7�,I��Pbbcf�.�,�d*�ZFG���eE�^s!���FQbcc
5�y��\���]'�y*�0
��EQXQ^NYI	�s׊��O�C���wԧ��j��⋨�j�;� w]���j��Z��
:�n��;��KK!�#w�.�_ ����@ ���_ ���/4���5�_�5���3p�2|�2~��G ��l?���Rˏ`6s��%I�������y7\�x<��p�����7��^�2���CuM-�

�����(
��C��y�mk��3_S�eZZ[�OL�������q���gm�y���(457��߿��
betR��� ��gޡ���>�;����/I6�����į_�-�Y�yw�ܨ��!��Ԃ�$Itvu���r��Q;���(�=����9��yw�.�i_ߜnz���=o��-���o{�?�@��B�$�x�]x�ޠh���C��ߏ�8I}C㼾eʖ,�|��~�UTν5�-Ο�o�|���n�W]SKUM�ם^���qs=�s���?��ի�%..�),x�$ILL8�|�*�	;*����\N�9GffI���9w��g�`�YILH��֡~Q$I��$�sgΟ����q�8�II�b2�r��U._�ʙs�Ї�I�������gΐ�����0�)�tvuQU]��``��9s�<n����8.\�ġ�G&1!�}����MEU���ĄŮ�@��xtq��!����kh &:���e��].\n���5����H���'�z�}��$'�z9x�0�������eE�dgϸ��(TUװ��A�:�	

��$%&�(
'N�",<���>9p��6���			��Nk[)�)8�N��?���g����q��y�'~xx]]�\�|�+׮����_^~���t:-i��3�����?�ɾ�q:���b0��V�
fr���o9�ot�����-jQ0�L<�s'gΝ����J鉓�����𑣜<}fɴ�<^/�7nPZR�F�!>.��RN�>�K��t�
M--8|EQ		��w��Qw��(����h�>�ǭ�:}���N�j5���DGE�ή]�9w�7�z���0BCCQ���^JJ2111��{7.�{��C ,����bZZ[9�"��ф�����Bd����0��∏������G�a6�s��y9���'���fՊDEF���^�$�ǭ��7�|DGF�R��t�
U�5tu�p��i�V|�O�4QQ�_?�l':
��ˮ={��룰����0�WVr��e���ilj���G�����F�k����p3qqq����M1A�VIxX�))hu�žM�X�ܧ(
&S�l��쬜9�ȲLA~.���v}�}Ž�磪���kV����x<TVW��-z�$���nl�	�

���&#=�M6���Cʹ�0������z�oh�f�SU]KIq1�EE����.���Ʒ����000���0��Qt:=kW�����FCa~F��իV��SQY����"&�w
�����
������p���M{{�n�@Jr
�%��?0HjJ2;}�9�#G���jX�v-֯�ҕ+�VgNZ6�Bٴq#U��h4�JKX���KW�C~n.)�Ilڰ���4�����j=��~�ݗ���@�RQR��������~?j��˗s�ʹ��1n����J�"�ͪ�VKnN]]��w�=�ׯ�/�y{�&��{﹏�e%�J�~�<� :����j��B8��5\�^���:r��=�OQS[KbB<QQ���~���Q_O]}=���(r@��4���B����C���c0X�����}/8du���X�v
���x�^֬^EyY	���!dYF�lʲ����>E��_dYF��2j�zE%N��^G}c#�]](���j����uudd������
��D�:���v��kTU�0<2BA~��{/�Μ����5�W������gX�r~����\�^N�9KCc]]��9CWW7����=�ֶ6FGG'}f`���@��������j�����N�������N��ǹ^Y��J�]�����N��ӹط�˜�E�n&>>a�I>��q�8E�����$'�q�Y^^Ɗ��������l�<�����,I.��?��5�W�����f�g�HOg떇��p����V�#?/�V����f�a�p��P�Մ�����!�<�E��z::����#99���D.\�Dww7�֬����q����\!!8�N��Պ�,��nA�$�N'��PTX��n�Niq1��Q��{(��#..�����;��G��?0@XXO>��y9��wPY]��`������VN�=Gkk+]��DEEq��uj��(��c��
DGEq�������>������cܨ��0?���s�zg泓hdefPU]Mu�
�"#1DFF���θu��L{{7�����g�u�^�
�9���r�Z>�d?M--47���j(-)al|����rs1�}˾����/�����?[�j�m�L�������R;�	�������u�����ڊ�(�	4]����/�$%&��P�Vl����(��6�B�����u���S?y�s����_�&�)8;�R�e�χ^�t�|>~��W�HO��ۃ#����nN3�7�ߏF��U�����b�ʕ�w�ft:�m���8�$��ׇ�b�3��
|>���R@�ղb��`����hfO^���YY�ϒ$QS{�Ԕ��i�V.L�>]�r�֋@ �j��O���h�*�jF�[I��T*��F[�GF���w̙f>�4W�@�4��=D��`	����U7�;�]'�y����RV�\�d�.�o���b�r�

�����(��L��{�%)1�X�J�_z��A�%�]��,k��C-�$�F#�F��k�F#��w�oI�Dtt4 |�RD�ωx��W���k�*�%|��E��	�@p!�@ w"��@p�`�Ռ��Ԟ�̧�Y�E�}]�J |y|�~r��ݥ��l���$a�X����~��۱�l@`"G���Z�K��Oo_�����8611���������ebbb�cSr���}��~�#�"�N'#ˌ�o���(084��j]�b�(�s��voU.�h�j��������DZZ�����[����|
�χ,˼��kTTVͫ���$���>^~�w���n>�d0��,s��Q����,��(���;\�|eA=j�ww}����9ЃCC8t��_�����ӳ��I |�H�D]}~�m�~9���t�S�r������y�˔-I��v}��K���^}�����s���KW�r���9�M/�\׸�w����)
�
���j��|/�p�ò"c2��^Yɹ�شq^�/(<���HOo)�I�dg�+��54AXX���ڳ���l
򉉉F��q��u��Vlv;���HMIaxd���>֮Z͑c�������r:���'44��e�X,�=���A�ss)^V|��.7������E�bW�@ �Jp��ܨ�g�޽DFF�r�
�8ȹ�Zm��$���DBB<۷>Bbb"m����DGG���E�����������23�VWw77��1�B��ɡ�����B�*u�
d��1n�r����P#+��1̱?���@��T�+*��xY`�w`�Nj$I���������.����dY���Abb�>����PM���O������v��L�ž]�H����,�hkk%';gV�W���(֭Y�'����J%!�$.^�̾�����؉<�ckV�Z�:}>��լ(/gxd�χ����w���Op��
�9|�y99��n�kj�?�����HJJd�r���+hko'%9�?��111��&�p8Qd�݆�n���^�����X�aaa�������_$	�ݎ�����#8�N&N&,Q��LLL0>>�}b��u��Z������#���������w��}pT�_���b�HKK!>.���E�$���x��]|���o;��mt����d&�<u���&r��p8\�|���[����~��������ԙ�l}x����qlv[�=Ŵ�+���fg��`�b���}���v�EA�a����d�̓Ff��ޮ�8�.�^��_`ͪU��{�eyYgΟ��.2�$100�Ȉ�eE�(�BIq1�=�4�ˊ�t�Jp��ʶ���c�6��Ǚ��������R��Y�v-?������ %%�q��}!!!h4�ٴ���x�V��x0�I���?䙧v�˿�99�Y�����<��IJ��gyy�yy<��N��J'�r�aúu(�Bjj*��%�<� 'N��̹�lڰ�g��I~~�@=5����HKMe�jE�ӓ�����B.]�BEe%ii)$&&��������g�b��r�����|��}���B$Ibyy9/<�<���(J��oܰ�����PM���s����i�T*Ռ�z���+����幧�&""B��;ļ=~�^Ϛ��X�z�B~��Z����v�w��OTVU���-DEE���MWw7=�����,��5�uuDGE��_�3n�284Doo/�II�����z(GGR�[�l��v���!�l������և����Ο�|���h�>�$	E�ihl�l6�OFz		�2)�)�����(�$	�,�Ѩ��mtuw��J�������Q@��21�`ph����j5]
a��P�ѱ1�	$)�W~�1t�_���e�����_~Í�:���w�i��|W~^.�y���1������JbB��Qi��Ɋ���,����.���T*������b	a�(
���P����v���ILH��w�9#��(def������u�FCjJ2Z��Ȉ��&&���xt�#���n^�ӛ��ư��-Kb����PQYEiII`/�(�kj��o��0���a6����^�C�V���B]C��FRRS�����9����V���F=�C���d���L|\����j�����t�?x��e�H��G�$�B�B ��0�����r����dg�����|��ֱ��}\�|�^O�:��׿F���� Iz�m~���O80�͜<u��W��R�0Bx��v����!r��IJJ$.6���z��(*(�f��o�������&99��'Oq���v������l6��������$:*
I�DGG!�2]]�|��>._�ʲ�"JK���x/]&+3�7�y70�*+�[��իV��o�09z |�M�������_���5���҆���Z�S=V�׋fR�����x��ts��.n����N�|y9q���|></n��P��V�w�,����OMm->���+W���z����ܯ����tf2����M�Q�T�z�z�A�^�@pw2�c��j�(�F������ I�]n45gΝ������':**8���v�r:��hu:�>^�		�N���	����dY��~�*��p^x�9dY�1��n�w�=|��O�R:�n7���H*)r�ߏ�����/+bӆ
D�ͨ�jdY��V�P�T��n@�Ѡ��fԃ
�|H�D__�e�:�w��V�Z��Y�0���Q��L>�K!�C���[
�I�Vc0�1B�S}��Y��P^V��+xk�ZL����N=2UO�i�����D�RW9M�#Z�v�O
	�4��:����M���t��N7��h4�~�b���������F�47۝ˎ�(�9WL�I��‰�����3$DLk4g�s=�|�:�g�헥��E�������j���1Ӄ�V���O���:����,�V�T<��3DF��_\���/K���(����������x0�$���,@t��""�����_U �t���@p��_ ����@ ���B�_�����Ŝ�_�$�v\.�]Q�6[p��0{�}_�(������Y�[5h|>��iY�q8_������r������] |��=��O2�vp�\x�KG�K��y}��%�x���񷵵r�Y\.׬Pl�����������HEE�m�;/�$188�?��_���k����'�?y�w�����!~��cl|�3�V�u���I��O%��V�\�Ư~�.]���bY�j�I��ol��w����c��4����wt0<22�ߘ��{�s���y5�:�v}��/I��\��+s�^���1��n}s�w9�������T*֭]OH���|~?�	bYa!%�˰X,8]���� �(��Q���,v9g��Ԅ��#,<���.,�����b0x�^�V+�=�4��8��;:P��def�V��^QEyY).���Z�������p`��q�݄���R�H����p�p:1�L���t9	#)1��J{G��������bttl�H���(o��UU4���7�K����u����j�QQYEvVf3��Y?y�s���^$**�W_{���ٲ��3#K`�ܰ0�QP��ꢻ����F�����X@\,$$���8F��IJLtr���������MH���9w��N �J�������q�RSA�͆��E%���n���xINN�����k�����#["9)iF���q��I�]���^|���,B�z_>�~I��x<TVUOQ��E!>>���<v��CrR*��JE]}�>��P&&�|�g)��_�J�e�WVRVZBSS��~DX�	I����6��Z�p)sx8o��.v���q����p8��PTX�;���V��餯��W^����$%%���������TUW�f�*���;d��1b���?�J%q��q��ƈ��%''�jfSWQ4�@�j4�[� (�.�$�����k�hkog��m444������g(^����INJ"�d��w�
���{Pxw�$&$���7gzb�o�{|~?�		�_��7�y�_��p3���|��ٽ�C�>	qq��(�qq3|�t;	�q<�};��5�8u���6�[G]C/]&?/��K\\��cll�Z��lf��u4��284DbB������|z�^n�����ʉS������$��7��? �x73%R���GjJ����TV�͛�� {>��ϋ���ȱc���c;x�]9v�ܜ�%1�?<2B_?�l��C���Z������l}x�֬al|��y������1
h4�\�Fll,���7��/����˯���K���w_�v��ں�IA������".6����'=v�CG���CYq1�uu���
HY��f3�z�Y�g�F�㿔���EAA�ersr���[�G_֭��t�^D��p����8u������!4
+W,����q�݁��<�S ˁ�!z=����������K�ILL$"�����z�
	3�f���v�!'O������I���������"o��n������Y�f5����rRVZBQA�ly���	��lܰ�����]�z��s_��L�H��[o�����P�6�����0���x�^���IOKc��`hh�JEtt4���DFD���V�����Ů&z{{�HO�������Iq5ˊ�����������V�����Z!
���ksx8=��def�y���!.&���a23��z�t����Պ�f#<<�q����t�6��C$&&��Ҋ}�NxX111��n�RSg��e�ѱ��ʕh_O������������`4�X��������P11�X,�&���f���J"&:���::�
5=�:�$1b����JDDY����a���j�h��g��Ș%@6�NdD���tww�������E�$����EQ��qATd$m���8�L����$�2��^��۽$|�$IE��Ǎ��E��C�n�{������w�ޗ���29�25c��9WAT*U�E&�J
�����d%�B%��_-��V�$$���S�m 0�!�u0=�S�Z�(�_�#1��̿�e�����73�Yu=��+�T��?���s2����7�	���Q�U�(Ȳ2�M�_��κ�JB�
t��?�%�>d��MO3���f�gɲ�W�<M�ħq@��fw�~Z7�����@��"]lY��b��p��i��B�g��T�Ϭsz�4�]bƚ�f�+�|����Byר5�Ο��韧�Y��M���Y�K�aw���ݟ�m��^�>T-�g��\�$Is��i'�J3��)�5��左�iz�[���S��"�����n;_6�C�tvu����9<|Q�"Dz�@���c4^C,��~�N���#D�_�<�n�@ _��xS+�@ �Y,��935��b#�@ �QH�`�څ�����h��U�t+�������o����8�z=.����a���n���g�@;���3M�����p8���BQ�-��1��}9���V��ܠ=~�@ |�����
�=��(sM����1::
�+��J��ISK�����mf	��ňł�dBA�����ܿ�V�J�(
^����17#z��@ ��3%��.�$�p�]���vjJ*��:���Oر�����26>NWO7�?��p8����v����$AWO��'>6~��H�zb�cp��XFG�
�?8�J�HIN���b������� ^���Ì4Z��ֶ6$I
��j����FlLl��/����Ǹu|r�Z�b�����LBBBH�O ,,���{w��Fy&p�ۭV�ԭ[���cll���x���%`��R�lR��lUj�R�[��֒Ť��@Cq�c.;᲍
�h.3͡ct��A3B3��<�*�G�o��J��G����8�C ����eѓL�~���ޞ�*Y�`"3��cp�c4&՝{{��2#&���,�T*�D&�����"ٛ�Q�T*3�� ѐ���������B$�qO	�B!�(Վ��}lU7T_.������Ca	
��ql�M0�4M@��d8}�4�|���5� p���')
�E������8�l���6t�C�	�.�B�\.�c�8��m����.���d�P+pL�$1<�cO��OH�Bq�97�z�^B�=�>>y�d_�p(��5(


�?0@("O06>N���ѱQ*J�M�8�hx<(E_?�r�h$��k	Gh����b8�p5�r,c~{;}�}TT�p(�eYD#Qb�(���x�S�O	C�B�Z�h$���GS)�^/��N1>>>㎵�;w�nss�bB!�
�gʛ����q<�M�3
Ghnn�-�9}���c;����L�|>�c;�cq��(�R]�i�ljF"8�=��Dq�2�BӴpl{�Fk�B�T:MSccm%�6����U�۶�ax�,�h����R	�߇��a�~�&o���)���e���`�6nŭ�&���B\q��]�AsӼ�}��+���cт�Xu�t

3&�)�X�`�ױX�X,V����m��-�=��PX�E���V~z��v�UJa��-�
�l���2M���
	��G�B�+�g�Qj憸��q>����\�^���i���m���H�BqE�>���9l�f��%��yӮ/�~!�W�j�U��B�٦��U�
�yy��~!�W����N��Ť��T\�l6Gc"Azt�b��7��?	�B!�(>�]�f�S���n���1M�ɩ����~!�W��ibY�e�Y׷g������K�_!��"Mڿ7��}B!�׈~!��kD�B�5"�_!�����_M�P!��P,/jb���_X��|~�\.G�R���fF��PJ�.�r�r�\���}F��~A�V!���wV��i�={������H�2���[Yܹh����<���u]���k�֟��n.�?.��2{����P�^�����H6�ŭT�F"��[)Ejd˲8N��-s�g�o���[���g	h��M7n�{��s�{�؇��g��mlݲ�@ p��3�.
�y� +W\C<e�y��}�Bq���u]�̓I����ell�X4B.�GU*ضM�T�P(b�~2��\�P0H6��W���n`�5��Yc���sy"��e��f���)K�BA&2����#@5�
�R�yy/���Nc��R��R���!��r��p&&&���A��wO����v���7��r۶���D�4L�dtl�b�H8��1����H���J�p��.�4��0�l�B�H.�#��i:�=�<�m��x�&���T*��h���(�e,��4����h��c�(U�-���O���QJ��c�|�ѱ1L�۶gl3)�B\����6M����?t��k������?Ώ�y��d�
�������5�,_�]����hh����g�����]~��GI�RT�B)��������eAG��&
�p�w�}��M�tv244�o`ll�0��֛yꏻI���5��.���;=�!���F9�Q�w'Ͽ��e�ؘॗ��c�\�q�����c���?!��#�4G��0�vw���=�p*@4��6����Caz�I���;b�%��f9=��p��U��'�K<���7��c���sϿ@,#�N�e�f�V���O&�att�U�Vp�m��;#��+��+�
k��ع};=�$�������Yܹ���~��>>LG{;�cc��Y�lW/[Fת�,Y���w�W��O��c|l���V�<p���A�lX��[�n%?���ql���M�QJ
�� �{���IDAT����O�ЮG��6�֬a���u�6¡��i#��t3x�4�R��+�a�M[غe�R��	��,�l��S���'Y������{���G�J��l�L6K�-��fY����~��K���o�|˯^�+��ʷo��{����}�cG)ŝ�n'����عc;�p��H�\>��;��uk9t���r��?�ӏ���ed���}B�wA����2�;;Yz���Ó����_����6��E�'&X�l)�m����qv�؎�1�L��C.�G��,��[�L
�k�@ @�q��~lۦR�Ԟm����8����b�r������}�ry|>��<��s���k�^��00MM�)��S�*����#�g�^���m$��u��,����m��&�i�N($
b�f��4�ZzH��h�H8�ǣS,�}4�㌤G��~��r����˲��B!�Rd2<~�˲(����"��+����_g�W�E�Ld2l��:,ˢ{u���/ٺe�h�CG�T��5�b��e�̛��ӻw�q�:^��2��C!�1cB��i(��뚦�Ԅ�3g���û0M�uY�|9���--�u� �<�����X,�a������ �846&N
�G�Ï���������]�)�.��<��'����m�5���~�Wh�Φk7���v*��u�j^S˖.�ǟ��5imia��N��J���E���>���]���ҽ��6�@Ӫ#�k:�]���]�桇wq����W]u��/B!��jQ������5k���Shdd�޾~��"��5��ڊ��!=:ʿ����y�v�W�fhx��~�1(�\����8��(��4ϛG4!���١�D#Q��`hx�`0��0�����CC�|>b�(J)&z{{I�G	,\0�q(��$�x<�m��L&I��D����L��=I"�0�M��8q�T*E$!�S*��I&�Y>�,Y�G���`�x�~�����QQ�e|m���E�� ��g`p�yMM�R"�0���$��p]����x<N�9KkK3^��Tj�'ObY&�.$52B(d|b������l�֖fN�9�� ���F�y�ܱ]f�!��d�5M���V{��J]�y��x�������7�\�R�-��u��z�w[_O���c��u]�5w�.�.���kzaz��t]s��_�__�~-~�{��n�}����9Կ��u��ɣ
]���r�8�C�X�^`���M�J�!��삆�����>6Y(�q�z�SK�>�\���	p�_O��e�u������w~����?���:돝�����_ʮKO2ɩӧٴq#k�VK�B�\�J)֭��_����[��1B!.�%~���/���!������=K�B!�L��ﺮ>::J~*��B!�<��?99��Ç�}�$�B�/O-�+������r7H!�B!�B!���BO	
�{%tEXtdate:create2013-09-04T16:55:40-04:00>��%tEXtdate:modify2013-09-04T16:55:40-04:00O˸ttEXtSoftwareShutterc��	IEND�B`�site-packages/sepolicy/help/system_export.txt000064400000000640147511334650015542 0ustar00SELinux allows you to export/import the current configuration of the machine.


If you have several machines configured the same way, you may want to modify the SELinux configuration on one machine and then export that configuration to a file.  You can then copy the file to another machine and import it there.

Note: if you import a configuration to a machine, the existing local configuration will be removed.
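The same export/import can be scripted.  Below is a minimal Python sketch, assuming the semanage command-line tool is installed and that its 'export'/'import' subcommands take a file with -f (verify against man semanage on your system):

# export_import.py - minimal sketch; assumes the semanage CLI with
# 'export'/'import' subcommands and the -f file option is available.
import subprocess

def export_config(path):
    # Write this machine's local SELinux customizations to 'path'.
    subprocess.run(["semanage", "export", "-f", path], check=True)

def import_config(path):
    # Replay the customizations stored in 'path' on this machine
    # (the existing local configuration is removed, as noted above).
    subprocess.run(["semanage", "import", "-f", path], check=True)

if __name__ == "__main__":
    export_config("/tmp/selinux-customizations.txt")
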
site-packages/sepolicy/help/ports_inbound.txt

This screen shows the network ports that processes running with the '%(APP)s' type are allowed to bind to.


SELinux controls the network ports that an application is allowed to bind to, based on SELinux port types.

This screen allows you to modify the port number/port type definitions to which the '%(APP)s' domain is currently allowed to bind.
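As an illustration, the sketch below adds one such port definition from Python by calling the standard semanage port command; the port number 8081 and the http_port_t type are hypothetical example values:

# add_port.py - minimal sketch; 8081 and http_port_t are example values.
import subprocess

def allow_port(port_type, proto, port):
    # semanage port -a -t TYPE -p PROTO PORT adds a local port definition,
    # so domains permitted to bind TYPE may also bind this port.
    subprocess.run(
        ["semanage", "port", "-a", "-t", port_type, "-p", proto, str(port)],
        check=True)

allow_port("http_port_t", "tcp", 8081)
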
site-packages/sepolicy/help/__init__.py
(empty file)

site-packages/sepolicy/help/booleans_toggled.txt

You are viewing the booleans page for the application domain.


Toggle the button to turn the boolean on or off.  The change does not take effect immediately: all changes on the application screen are bundled into a single transaction.  Select the update button to apply all of your changes to the system.
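Outside of this screen the same change can be made with setsebool; a minimal Python sketch follows (the boolean name is an example):

# toggle_boolean.py - minimal sketch; the boolean name is an example value.
import subprocess

def set_boolean(name, value, permanent=True):
    # 'setsebool -P' writes the change to the policy store so it survives
    # a reboot, which mirrors the update/apply step described above.
    args = ["setsebool"]
    if permanent:
        args.append("-P")
    args += [name, "on" if value else "off"]
    subprocess.run(args, check=True)

set_boolean("httpd_can_network_connect", True)
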
site-packages/sepolicy/help/transition_from.txt

This screen shows that when a process running with the '%(APP)s' type executes one of the listed 'Commands File Paths', it will transition to the specified type.


Under SELinux, when a process running with a 'type' attempts to execute an executable, one of three things can happen.

1.  The process can be prevented from running the executable.
2.  The executable executes with the same label as its parent.
3.  The executable 'transitions' to a new 'type' based on policy.

This screen shows the executables that transition to another domain when '%(APP)s' executes them, and the 'SELinux Application Type' of the newly created process.
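The same transition rules can be listed outside the GUI with the setools sesearch utility; a minimal Python sketch, assuming sesearch accepts -T (type transitions) and -s (source domain), and using httpd_t as an example domain:

# list_transitions.py - minimal sketch; assumes setools' sesearch with the
# -T (type_transition) and -s (source) options; httpd_t is an example.
import subprocess

def type_transitions(source_domain):
    # Return the type_transition rules whose source is 'source_domain'.
    result = subprocess.run(["sesearch", "-T", "-s", source_domain],
                            stdout=subprocess.PIPE, universal_newlines=True,
                            check=True)
    return result.stdout.splitlines()

for rule in type_transitions("httpd_t"):
    print(rule)
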
site-packages/sepolicy/help/booleans_more_show.txt

You are viewing the booleans page for the application domain.
site-packages/sepolicy/help/transition_file.png

[binary PNG image data omitted]
��ù�n���XmV����(
v��Їu�]]|���`hx�����LAP,J��P�?)�3"�X�b7�v���J&�-�RE֭Y�s�>���j� In���W�/�Q�ݎ�f�f�"�WWz�Z�X-V�p�"K�Z�E��k�Q8ڰ�x�nE!���1�zY�&�͒.�w6�EQ�b<�+	�M��U3m4_�+��r�ܢ��EQdpx�
�����"V��=�{�����pp���D"A4�駞䍿�����>���ʮ��>�ۻ��r9�z<�����r��Y֮Y���$�	f���n������g�dY�:P�Ţ��_�D�DZZZhki�Booa�(��λ�Q��+�K���৑���:�,_Vf*L�;q�t:S�ꒋB������y�ШE��E{[+��ɓXkV�d��ū�i-�I��&�"�(�w^4MC�d0�j��e�ƫ�_�v�u��@U�u|9~]בdA�ƆFN�<���z���Ac��|���S�I$D�Q>���
�λ{�D�Q4M�7����P���·
��~�0Xѵl^�ʫ�rZ�n�NMM�J��ڗ�D���s��ٳw^����A^�կ	���z�A��8~�TAg���C���w�~�vt`��X�z�������K�`�(��i&���$��zS�(�2FQ����P8u�(ցa�F^�3��d���,�xP��Q��✿p�D2Y���%	Q�hnn��#ج6���-�6�g�^~�ʫ��ʣ�b����⼜^��F��m���:�L�l������^y�����c��W�ڕ @6���p'��GI�Z,�5EA��*��㙧?GUe%=z�*X�j��̽e#�rQ�H�R����(��D
I��X�t��p��aR�4�E)�u��O���O�
_z�X,�q:��_"�2]�:Y��ũ3g����*�h�l��7�D�Ŋ��ְu���/>+�RY�/T�^F!X�E�kY'o�ރ�ag˦M�|}��/�WU��q��I�N'O>��|���Ǹ]n�<�x�_���������;��_�O_~��}�˴��]1�M&�W+&�#8p���ꚧ�;15A<��r_�@2�$��㫨 �ɒL%���!B�0[�l������O^~�*���˻�����r2�H$�x<(�R<�&�NQS]��batlIq9]TTxI�R��	�*��3l��3�� ��SUYY��SQ�%�͑�e��z�5�D"���x��|9�(�LNM�H$p�ݸ�N����0�'8����fsT�T"�(����
���z���t:	��d2Y�~6��p$r�8�7�L�W�x=��9]I��n���U������y<n7�H�˅u���x"��븋��Y���� �(PUYI&�%
��g�>fg�lݲ���j�>����o���*�#e��"S�+��"�p$��f/+#��f���c�8�Fc�r��Qv=� 5�55���o^��X<���,�����a6"�P����z��@ͫ�\N|��vS�諨�w&YZ��|�<Fp��X�`(�l0�����r�iZy��Dp:�C!b�8~���J?�x�Պ$I���;�X�V��B��b��	�w�>���jq�*��-���X<~E�2�‘\��"�h�]s���v%ӱ4�*V/i�"}�L�h,J���\�H���`|b���f����m6�
�%�� ���E2�"�JQYY���
����zq�ݤ�i�*+Ig2d2,�.�`-���l6‘0U��Ċ
�5�դRi;����It]���Q��=�E�l�W7�PY�˟�EEh_E����2y�����$a��	�B��ʓ����¸�p0[��&&&p:�����D�Q����/��jkjشa�-�x�^���a�]n|~��h��a��頮��P>�(��5W�%�uf`4��j�i�o� ��ӳ���4L>��n[�`a�}zQ�ß�@E���SmmU�g����իزq#ZqEwŽ��ta�%J����^�(�U���b�����
�W<���q��KgPƜ��>W��w��W��jy[���|2���+��i�[�|�/�-��(��¸�V�(	�q{Pd��	�m��,s�臜9w�o}��עi[�f���v���G�G�:�rn�-�+�S�X��������J���yN<�X܋��ҮDA`"��������E�B�W+߫�ͫ��¼�M���.6�.f��\+�b�]��͵��X����9�T�Ŝ�J��k��Q>�����y���z�E��Q�|a�^��a6��!`
���[��kT��x%/���,�h/[a+F�ڢ�^�x�w�nwyܔ���=�0�O:�7�G}�b� •�{����~UUo����yjk����f�n o7S&�L?�p��ݷ�O�]��kح�U�-&T�~����>_L�/���v8n4�G
�X��6ѽ^�ŀW-�kM��CEz���ҰX����I&.��)���y-ʂ�T�%���6P.-jY�}fݚ5�4�M�^�ߏK^͗m-ܨ�ښ�jk�ܷ*m&��t���"	X��o���GE������E�pD`��]9��O�˕���H�R(r�"���� �Z>�'�N�n�M>>h�^T|,贘��˭�E��*"�HY����_s,
;����$N���Q������Mn9%f[3��0�
_E�ڞɽ��j������)r�؎���D"�����9�3A�hU�U�&&&&�]�u�lrg��;�:��144T�2�x2���L�V���|��լ5�5,��m���ZJ�&&w"e�A<� 
#����y~L
��W��b��a��̺6�)���008Tv���fiom-�1�gS��z��w0���n%�+x=-��P�ܮ����u�D"N6�C�e�N�$�F�\���5��E�K�g��v���q7C2�(��.8�p9]�J"��v���&2��D�\&%��^o�V�n5�
V�X�|.���b��t��q�a$q$I��z��-�N�����(xs��xܞOmUv��߻����TM+X��M�?����x?��Ռ�̫SA�f��iN��d"��n�f=�j�h,����f���:�X�h��43k��S�0��b�t]���,��L&��b�G�������7��i$IF�y�9.\�!�����O!�t�=��ڵ�hkm�؉>��N�>��Zp����m��J%ٻ�}�>��#
���}����*jkj
Sr�s=�/��Z.�����<�ҷ��V�����;o16>��b�f���s7l�!!��d�p���\.����
�7��g284ȱcG	G����Uk����n�}��Eg#�����m�}���^��뜟�/Ξ;C<��]����|�#��S�Np�̩��CG�������a]���yu*K��]Dc��v?��y��6���q�xgff��oɲ�.v�|UU��_���-�X���S˟ɽˢ�?��r��1�j�n�J&��_Y������?HSc3�v�9{��<-BEQX�d)^��`(H*�$��c�۩�
033����p295���$��S[SG&�!����X^���y���.B������[����e(�����cȒ��%�H��������DQ$	3�EQ,�Α�����c��444
��E�x���}}��1I����t8���#��311Nmm6����v����}��;��p299A P�[I�SȲD.�G���|~��,�H�LmM-���X�-�`�\>��U�Y�z���8��<n/+W�"����W��f���$��Q�dfv�C���hkm�����E<'	�k:�$Q[[G.�crj�@�����+�u:6�Ku�;���%��J]n'��5�ݒ`0Ⱦ������C�DUUEazf�|�����$�X��AmM-�l���)à��EV���(䫲
_�����0����Fh�N9y�8;x�D"����~a:SSS$SI��
4]��/U�}���ifggЋ��j�k�e���1TM��_����Y���G#�L05=�,���֑�f絟����Dt��
tu��ĉc�BAz�a0���h���j�199&%/���ո\�+�x�ʮ��5̯SA(LpC��(���˦����H&��nϼxUUeph��ZZZ���Å~aQjjjI��$�I�|[�b&�&PU��������
~EQhkk�ҥK�8y�%KQ�D�08}���\.��+�Ĺ\�S�Ob�X������<Dc1��������Lkk;{��f��8�}�&���GCC��
��Jee��U�BA��imi����TU�p��`(H}]=.��޾$�I4Me��Tx}���;Ȓ��n��p�ɤ�����r�L&9y�V��L&æM���=O(������IB� ��y�'�8q��<�\��Պ�jezfYQ���v��{���
v<����p�ljF#؊�v��A��{���@7t:�v�y�֛[�W?�t:]TVV��4r���K��yo7������('N���񰬳�t&M6�atl�babr���:��{��G���p$��M[q�ݜ9s�D"��[��� ��.x�0��ٳI�Ȥ3�\��x"N_/�Bx���hmi`rj0ؼe+N�����:��^�����b�Z��P(������69q�x9_G�/�wBi�^��#���=�R�r����߃�天�	��O:�bzz��z;�'��bw8x�o����immg���9{�d2�(���ח�T��L.KmM-ᄋ�l6C�8)��z9t����H��6�vͺ��x����H&���"�H��d�ut�/ώ{hmic`�Q��0p:]lټ��)�qӦ-,�����y�N��]n���TS�,044���D	��ʣ?V�����Wp��	�nw�7ť���Eim-x������z�d�jk�X�b�F���|p��u7�W�{�E�²,s�ص�'{����O@q��_I}}}aUV\e̽�Tr�PD��j�|�i***���(���E���͛���}�d2��M��8��8���r���|�]˖���Ob#�#�����뿀,�<���y�ɧٶ�>�*�Y�����gpx������������]˖�cO��" 0<<�ŋ�444]qvY��n��
]]��gd��H�%KD����<��P��G308��nG�e��ks�j\K��4��4�pa�084@(�� �0:6��f����
PYY�ҎN|>?۷��ƍ���
u�t�x�'hkmg�(�+*|X�6&&'hhh�﫜��a�#�2�>�+V���sd2�5�v�+��$QB�t�E˥�1�Ϗ��F�T��Fp8�=jjjq���l�r���ʛV�Z���b�1>>����~144�����}~�5���T�f�Zv<�P�G<��(<p���[����ITM����3��뭠:P3�Y0!��أOp���'��v]n?�h�vEYc��kx<�Z����E]/L�
�7򹧞%��ᱣ�O����͚�z�U������_�(I��O�T��=d3�^/���g�h�]˖�x9v�h�T����� I##���rTjxⱧP�5kֲm�}���p���d`N�51����e9u���z���@*�,{��z��E�i�h���^
aAE{�"����dYfrj�b)o��#᲋�p8�k�yd$�Bo�P�޾^�mݎ$J�08�(J�l6dYF�$��;���%�N�N��j��v{�e��i&&ƱYmDcQFF/�E�z+����8].7���8~�8��,~nY^qʤ���s���}TVV
�+�1����f��74�H�E���z�n��oM^e��`jz��=����Fؼy333H���ba���Tנ-ͭ��ٶu{�U�X��Q�Q/j���(��'N'�SS]C$.�C`�����$�N3::���LY�P�B;��y����Ym�y�]���cv�H�S�r��1��x���Fee� r��UUiii������b�5�5���p/�����Î,+�~Q]]��b������=ē��
�bv���@*4l�(���X�
���n.]�H{{�SSH�4�Y����5��1b�V��8)��~R�����d
���e��op��Q::��r�0t�p8T�ފ
��&�����/�U+W�Z��8w�,����i��ق=
I��p�z������u��¦���ӟ�@8"�Jq��),>����l�n�(�(�z�$	�b�0�j� ��W�U21Y���K/�Ƕ��y[��\��=�������
l޴EV�4z���	F.]���%�f2���126¥�K���,���աi� ������>����&.]�H4��p��d��u��.&&��Y���01>���8�\��k׳���\.���,�-�DcQ</��u8�.TUefv�t:M PÒ��$SIz�.�����r1<<D2�`��5D#Q��lX��h4R�S�vgΞ���n݆+4���D���v,+����d�>�M��PSSÅ�����D
z[6m�j�1U<t{<464��`����_��76>������q��m456�v{H$��Ar�\A�!����RL����s
��ydY���gq��}	�hQ��+��ʲeˑ$�hI��e˺H���ҳe�|A��N<Oy��f�T395Q����b��h�o`j��G�j��L$E��K&�adt��Ê�+�D���u�!�������_�nϼ~Q]]����>��j,]�Ikk+�(2|q�xӠ�i^S]�lp���vr�<�T���&‘0�|Y�io���1<<4�Y+W�B��/�J%ٴq��D�r�����y��`(X�/�ҊT�����0>1�}��f�s��,�p�h4���Y�M���1P���q���4����[Z��y<%�Lйt�d����.�$�����
���6lE�L&���K�,��#J�x��K:�e�`(�\.��
,�Bcc#��c444`Q,D�V�^K>���r455��%0��(��M��L�̔?�$�ښ4]#!���׍�w=|E��4�|>��b-�YV��U]���J�w����;A�$������Z��7?Y��VSӴ�(�bY�����Ȓ\�~�w�t�б��HRaw ��#B�3��$�r��k1>Q� ��?��
6�j��E]��J���9{��Ǐ��^�f���7�DeU۶lG��+�l6K^�c�ِ�S��d2��������e�ss˯T��LA��l��
��R�J��}��]���H&�ADQ�S��-Յ����h��h;Y�&s�\ag@)8����r��䢴{��� ��va�n�D"�,�岺���b���΁y��nA(,�%Q*�
�0��<��b�Fy�M������ܾW�+��c`\q%wn�(�ý��#�ܳ_ �N�?��
6�tIgY)v���I�u�l$I*��ܼ�궔�t&�"+X��yneUU���K��d2Ȳ�(ʳT6��G��;K��M���6��2<<L0�lwwy�X,�[��\>G�@�������;�(
����~�;s^x&Yz�0XYn(����b���W|�p5<7?�(b���q�����(��ܺh�DQ���r�.���ގ��DӴ�b��1b `�ٰ��-���;ܒ$�5����Z�Znn���W��Ş]ؖ��q��J�tE��۔rE����(�һ��R
��_,L���]�ks�Զ�5ʫ�JY��k?��D,c��0�7m)O2�Nv����t��"�X]/V6s�2�oY��B�0���_-�s�ܲ[�ML��i��#�
�ぇn� �a�Y����\��ގr�|����n=f�>$Qb���剷�n�w�E�2̺6��"�i��Fa���j�ҹtټ8***ni:WF2��0���v��Z����(J�o��u��M�6nT��P8�R\��𑼦��K,V��"=�lU����'�)�r�\�<�v����'�ɫ�ݐ~3��^@(��\Y�
~r`U�cW�Y
�M"�ӑEI�t^�e�Hd5슈���8,"������c�7Û���f�[>��q(�|�e�-WCz���㇡*"�AR
H����4���0�be8d`HB�ٔ��a3��o�7Û���-�xL�BZY��쵯�׌����r)�-_�)M
�3��]�;3��o�7Û���%|�߂�j��,b.ƙs׸�'y� ��"��MP���ٵ�3Û���fx3���釟��Q7�O���w5U��uQ���䎦�oD��ԩn��|�wj��(�XC4�1t����•�ܼ���q�u`b�i�P�Qh�I������PD�vi���+Z*������|N��5V�\2�jl7,�eQ��֎�Ѭ�I���z;n�tՉ��GC`i��K�f�
@{�J�G���۬�OQ��[a�4��A`2��o:Í�) �R���3X�X���c5�n�E-Fa5��FyW�0
�s~��S��]1�B�v��Wz��B9N��O�.Z��<����O��HB�N}�B���g�VPW��Ӹ\���llu�+�T���$�z�^����aJ�Ӎ9�9��.i��<�)����ܲ(��岸|�\��<��KG�zղ�
E�]]nV��yt��u�ye�Ext��J��fv�E1p�$�Z]�7�U��&k����@7��i�Ynhܔ
��%c-�Oa�E6�8謵�$`��d�Sfk��M��$��ΚF�H(�rh0A�[�%.$Q���B��-NV�XZC�
�N��f�m�
<���3k*��*��c4�3��>&���κf

jQ��UY���J&�y��ʼng�y�a~��K\T{�*0�n���Sg�R<���M��84�d{��j��X8ϱ�$v��S��lhqpx0�X$��e�=2��<�VYY�`'����G�X�ƪNr)�����z�i���ƣpa*é�]u6�6:&U>�O�K������E}����s�b
M7�����'�T��;�v)$r:����v����%.�|�j�Lg�E���i�[8;���Xj^�k�|a��e5vv�D���#�
��Jk�4�-�I��X&w�Qu��/
@8�1)�:荒�`�������27k��Grh:4�-ܿ�E4������n�`"�G7�]W��:�[\
�x`��ED7�^����)�dNg4�#��1{���0�ƫ��Z/c�C3�ŠO�xZ��h�&��]]�ɢ�S��x�gFS������u.�r�l����I4�Mk�z�g4��yZVXm��A�T�`B�>*r9̃�n��9ya�Q�PRE�>���j��(|a��E�cWUFq����byr�A�G�u>���
��q��4;��������OgX^ogI��g�VP�Q������]D�[Z�tT[�^ܷ�EW��ӣ)")��ƑP���,O���R9�@�i�DVg"�'��0Y�$���f�� K� er� ��d�
�����p�����P�C	.s��ٱ4����;6���5ͳ�'�}	��ٰYD}�\
~��M��8:�`2��*ù�4K����#]7��1�sd(I0��r�cbM>�����8}�T��]<�P9q1ő�$Kkl(��Ҷ��X��'8=�&�֐�"���J���p��`��R�zḦ�ʂ�*R���u.Lf���*4T(�m�06�"rx0A�W��҂�&���J��P�U��"��Y*�9f��jp�ʭ�B��ʊ�"Ⰺ0����O���Ϊg�
u0ɓ��%Q$�e�6l�@g���CIv��1Ε�2
�{��Ȓ��j�GR��b�xt�C�Y��Ɖ�Uڪ���`<�c6�rr$����r;��������/
r�b�v'��(�;dj��
�_�-f*��N+���z纙����8/w�`��3�i\V����56&c*��]��3�)�=D���%
A/G-A��K���RA�;����,����4��f
��AI�{���y�0i�;e���X��`"��k�XZS�e@0�I�lhqr~"��&�sJ���7�M��j��Fai�
�Edk�����P� `SD�*��e�����g��y-Xe�����7���1͓��I�j�܍�`oK�wh ��&������a��*�FRDR3�<mUVܶB鬱1ʡuh�c*>��,
*�_��zk�LF���Ogp�%�p�-�F��\r�8,�T���Y�`g2���%34[�꯫PX���%3ݟ��چ�Z�I��
fyt�`E��hJ#�Tow�ޑ��r:�'���{Q(��|�h�~x�/��3�Drr�A0�ᴊ�m*x���Ⲋl�p�j�z�(����e�dV���(��F{���.q)��R(O*�34��[�,������������/��.Lf�
hX
牧5�c ���5�
�N�K�#��.��*+ټ�[��U(ܿ�M��Js��ٸ�ٱ4]u6Z*�DR*'/�Pu�%�6��N�T��������%���DRZ�<��*�WX���{��/�\U����YccI����,����rtltT[�4��ݱʞ�c��n�V��	Ό�ɩ��x&cy��s��;�1���y-������7�%�Y��$�P��?N�t�&���-N�T��O3α��FV5��⤵�J��J^�c�I�+,,����tO��t�j�B[�ʉ�g��<����:;�~+�.���,��ƒ��ƌ��fbrW�P_a���B�'��߀�nc%i����4�P�dQ(k�45� �ܷ���?=0K2���Q@74���ܻ�p��b\�;���0@�JG*�U*W��bω�"�Z!P��9~ˌ�q�vy��e}���]nÈB!-�8��I���
��#�Q����ym��RܚW$�U��x�l�S#)���:G�_�p��-��f�}>V苋�M�.���z
G=��{�:611��0�"R����<X�%Tf���Q�(ԂPT2y�l~~�ҵ����B
��a
�OA(��߰x�.��AA`ϭ�r;(���0(��>,�s>=7�a,�����"���v/���-�_�=2wy�9d�by�f2��/��XF#�ׯY6�g�~�
�|V1
G�.�pC��d��%,.��yv���PZ9�O�?�A3�����ߛ���J��&ۻ��
g�I�tNG����F�kF�F�p���o�7�ߞ�$`Sn�z�3��t%��EdY��hFcE��(��(h��jp���4TX�Ă�u�
�o���JHfx3�=~�����N$�!�5��w
���;?�fx3���
����lBe&�����_�n�����Gg���T��0����g�VS��5Û���fx3��Ë�� ���"�p��g�<�ƒ�ﻡ-�;CW��4�N��Ƶ�������]��sC�NoL��������3�)�MLLLLL�!nZ�'�Ir9Ӯ��B*�bjj�D"���TY��z����$��G�ئ�:��9��cS��$��놛��!�3]�m��195e�k������a��w���Ϟ8u����۝�{��=��+�p��)~����i7�-���������+����\��E��fs����K�nw�?3���_y��g�^3� ��7����#���/9r��k�WU����nw�h�{����/������p�v'��䦸�I�a��200H[k˖-��^u���<u���6UU(���'��Ձ�cc�47�z�*Dq��b||�����������f����[���('N�����t*���
�����f6lX��b��d*Ňǎ12:����ap��Y�����dɒ~�_�F��sϱn�N�>���MM�,[�I,㭷�a``�M7P�s��L������ڵk������y�N'ׯ�><v�L&�u�TU��"�&�\����LNN���β�NFFG9s�U���_��x<N�� �D���JV�\� �:s���q+�w!+
�N��06�[GEEg�������v��؈�ih��ڸ��r��)����˒�v2�<H"�`ͪU���]�ܹ�����m��#�v���v�lߓ����=w�����-���21��\!�/���ѹt)�_�*�SS��w��s��/^����8���]���]Vtu1>1A2�d����g?��?��456΋;�ˑN��'�R)�&2�ę����~�S�ZZ��M�P�����H���xa��V]�ß���KiokC�4�(��X�n�t�d2I<'��16>A*��ݗ_�k_�R�O����g��������D!̞W^��T����Ձ�M�$�)�~�].���t88�}����n��*�O������[��������Ɔ9B:�A�%~��klٴ��v��'G�ev�y��]]��Ϳ�~�I��b,�ZF,������O,[���S�x�����{�֯]����^���@2��G?��������0�L:C&�&��ɚ[�W#��X-��T�j�_\�NLL�D���?u�##���LNN26>�#�v�r:y�W���?OUU%��c.��'��[������>��j%�^�֖f�uv���o��+vL>Y.�#��Wyp�X-à���/>�O<�(���BW#�Hp��Y��Y>������a5��ds9R��X��+W�~�:�|�1\N'��J��,�H���Q\.'��ɿ��7	C���Q��f�D�LMO���B$�b����8|��p�`(D���-�*+��}>���
���K(bv6���������}��<�裼�w�����[x�/���Qinn&�ˡ�6��C��������Y����y�"�F�4�;�۷���ϱ��]7P�m[��|��ə��xvŊ崵����b��巻H�X<n7���ʋ/�?��ټq�)�M�J�X�{�jj���6v��Akk�33Dc1UU���ކn�Q��� �
"��DAQ�*%��	��~���8���E,�`��E��yU��8SS�
��xH����̖7_�+6����A�T��`�7�z��v젥�]בd��`���ifff�ß��g����M���r�!t]G�4&'�ؽg�>�4�}}��c�<���0/��:N���O}}-֭GQd\.��.�kRUU�׾�%�ۻ����*/��*+�lܰ��w�����!r�<�D���)*�~$I��BkV�&�Jb�7n�������� �z���%�H���D]��1%�HR��p8��_�X�`�j�p���Q‘(������ �26�
�0�42J}}=��v���(�����>�,�H�t��dbrSH/���lkk��(���J�'&8}�,�p�֖VN�<I �g�a������2R[[K&�a�ʕd2TMcyW�CCtu.�������ɓ�=�MKs��*:�(�,�h7�-�WQA4�={��������6�u��\�y"�(/���.���/��'9~�$��*V�XΞ���\�y�`ۖ��	�����,]���k8q��p���N���EV�XA4�o��S�O��ظa=}����"
"MM��8y����x��455r��I�$��K�����n���_G"�[��; ��p��	�'&hlh ����_���ŋ��i����������9r�Cz��X�����9��nc��z�
�|x����s�s_��AQ���_8z�?������9~�G?��d*��kIg2�s��I�.������{�p��	��/���I�8p�U��4����b���e�ܭ4�#�u�H$B:����������	Tʁ4M#�LV6��!I�(���e]��+|I�0]בeUU%	5�GU5J~�-�$��fEEQnw9}�1�D"�$IX,DQ$�ϓ�f�Z�X�VTU%�˗W�EA�eҙj>��jEQTU%�H`��QI�����t&C&���p �2�����0�Ӊ ���B:���r��3�,�L�݆e��;	M�H$��EQ�0��$��������w��m�~?6��0
G�(�����g>�96mXO�X���l6K8槯���+y�GI�RجV4M+[�e������d2�:�u�\>�����NDQ$�͢
�����j-�K�d�Νh���\UU&q����{X�$<O������@�p�+B�3Y��a�K�ɭG��nK��,��:���f�9�u������0W0�m�y�x�"���a�c^�Պ�.i#�$�z�}&.����j!��y|b����#b��Y����B��?/.��Juu5n�I�
}�[�>w�NDQ��n�V��;X�����C��Q���\�իW��܄<�Ȣ��e��.2�,��׼�(�=���.�������&&��χ�N���d�ڵ7� 湻���'���bbbbbbra
~�{S𛘘����C�����������&&&&&&�W~M��f�����L�8�
��:�L�}ś�8���J����R�.c�0H�R7��w�sɛx��ڔ��z��]:���grc�Ri�]���\!�{.���/5�aG�Q��l�����W<|������?#��LNM1::����B��~�s��	��'?��+���������k�y6���04|�vg�3�����/s�̙�E�߽���>z���J�ow�h2�g�u����{�cl|�v'��䦸�V�����ك��"��v�yg�~��?R__��i��~����*$Ibbb���I�|�b٥�Ν��C46Գ��cދu]gbb���i��.��x<�o~��n7�p�׃����Ő�}�t:�+���#~Hmm-^�iK�H4���<u���I���� n�MS�z<��U>��`�s��s��֯]K.�����d*��(���_q������r�������b��ß�@TU��v#�2���
�j*��h,F_?�X�ۍa�0>9����b���:����C�p�\D�Qzz{I��x=R�4��0����(�C�b1,���?0���N�EQ���dtl�T*� ��
�u475-����nx��PQ��leS�s�����`��p:�l�H��D"�O_~����MO��ZZhkk���21)s����0�#c��8y���!�~�Iz��b��j�J��������k���-�C!���_������119��#�T*������!�L����w<@$�ĩS�A֯[GkK3?�UZ[Z�B|��Y�n��x"���^‘G><JSc�����"�����P8�,K����,gΞ孷��sO=�۷�{�0��.����H8<|���QF�Ƹo�6z����9�!��w/�h���I����������chx�C����_d�{��bLLN�~��w���j��TSQQ���'8z�E�:��_��m���ٳ����475������o��?��a��x��g�����W���$������JΜ=��?���$�CC4��c�X���W�}������B�D�Np��i~����w"��184L^Uy���{�Y���x�S[SKu p9��(�%�J&��x#51���b�g�K��kV�frj���oc��U|��m[��f�*���S<��#��FMM5������y�y��=lٴ���c�Ջ�'�Ew��=��������t������7
~ˇ��I&�����O��}�۾�w�}UU��U�u�f�߾��~�K���"=.���>�=�4V�C������.�<��=�_v�S"�Lr��!������hnn֮^M{[+��133��M���v�򥗨��`�u455
������q����?��fbr�M7���c���;8�+O�}��-�{�A�)�r��iizf�w���ݝ}w��x�1�����}�;��^j9�F�H��{�{�}U(o2�B��i�b~"�̬<'�'�ɬ�P�� C���WBr����Ÿ\����/%�"��@dd$y�9���RQU�^�g||���r�qqq�b�܅��\���������"
�$����$**��W��l����-)��x=^d0oC
��~��}q�AP�T<��	���Q����������������+W,t�>��:�?���y�������J���4w��{w�r9���G�RI���va4"W(p���/��W�T�D=29r�l��� �θ����o�L�J��'�ɧ�Ȧ3ɩ��wgf��m��x<����IG�J%^��ǃ^�����~<7>�Q�@0����v#����7������r��}�B!�;��\�� ����L���:�208��O��i��&�`A��nA������o�a�:�&*��7^�1����x��?F�VMA~>7lx臝s�����x��.��>��'���ד��D^n)I��Ή�v>�!B.���SO�����c�q:]����]=�	�a�eK�χ�8(rd�p��]��>Q�����\>��4�\>+.� 0�%Q2�N��� ���z��]T��?��M���T*= C��
�HKM�������,Y��3gυS�j4����~�oL9������2>�1ݽ��z�J�_�@iy�̳�=�v299ɇ�R�dŲ����a���d�T*�z=r�l�kkk9u�r�d��=�SRZʑc�x����j��nߧ��<�_���(1}����ů����G/�Lm}=��ϧv���8y����
���jQkԴwt2�t���JA~>%�e=~���W#��iniE�`��=q��@�EEE���p��-�[Z>���g�a��1��$$$�{�N.^����%3#c���+546q�u�N'q��,**d��"**��X��D��{+�
�}�������&���x�
�����j�bc�����C��������<��9{��(R\Z��֯]��[8p�0=�q�f�*�Z
��ܥ��������AzzzA1�Ѽ�ҋ����(�
V�Z���P�<��Q"yبTJtZ-.��ɄR���|dwl�ALt��޽��f�i�k_F��Á��'::
A��D�Z*�k8w�"?~�e�㉉�F&�16>���$z��Ɍ���t!���>TJ%��?��vrsrHJJB!�319��l�?�3���z�ccb�r:�ϝ�o`�Ka�J�IDATxh�fC&�)���y<��GP��
�:.���HLt4N�k���a0�Z,8].FG�0ED`4f�y�����S�S��&�#^��˅(���ţѨ�YF��195���$QQQDGE!�"��c8]N�V���>�n����!DQ$1!�^���exx�,��������ƈ�ngdt�w����w���p�~��bQ(���184��c�Y�z[7ofddd�x�{�f4����l6&�ŝٌ����t���E��12:���Y8%sLt4�`�Á)"�e�H$��tvv���d|||��>�N��夭�mn���\�_����
 >.�a�������EO_���鴳2�MNN"�����$#=}�����}��G�Dܗ�/Km����t�����v����E��?�w��2�<��uK\l���z���^${p��HbB���2����X��Q�����̜�7�VKzZ���0�����ݎ\.'+3����9/��DG����0�3�by�XDEEu�i��M�9��AJ�r��H$�&�L��l��^?��ݷN˛����h�	�Z-��}f��w�}��a1�9�T*�/�)����o�����xb�6L4��D�e�h��9C��D"�H=������G�
�K�h�D"�H#R�/�H$�cDj�%�D"y�H
�D"�H$���H$��12o�us��(~�ۓ|3���79��I���u�߷9ޒ/'է�Q7��ok���S��{<F��f~��p~��)�(r��qJ�����z�\?d�--�?x��Z>��_�������@gW�������p̻������.�� ||�0�

_��qw��Q��+�r�##�s�5If��Q^Q����B�D��i�GGG�[\L{G'CC��A�KK��;�222”������+LNN���b|�6+����$���740>>>��~�GF���B!\n7�@Ap:��@(bdd���qDQ����t:�z�37 n�����Y��t��`rrj�]� 3���
������OsK+��j��h���F��8���54�r9��r��?��n�p��>��>~v����~�����Mʹ��208H0D�Y˄���s�^��q��:&SSS
Ϝ{~�������!�#SN'���lv�~?�P���qF��L���v3n�195��麟����㮳����fz����j�56>ο��TTUឞ�[2ה��{|�ᣟ�_����H$_e��}2����~�	��o7nަ���F��K�S\LM]		���F#�o���(�с��er��Y�f͜�z}|�������؉�X�ly����o~�[o�NuM-�UU����'� 2��ß��/p��5r��X�b9~|�˅N��7^��r���A�� ˖,�=������oP^Q�J��퟼���C�z�� �����6L&r����.9J �՗^���`κ�%��>w�B�� �d�<s���6|^[�l�ƭی�l(�
�=��>9�����'������܅�ܸu�EE��޵��ݷ��\�~������x������K �,^�����	�>H������!T*��
yb�v~r���d2����S�<�ɬQ(��?xQ���grr���
,f3�<�:���>�Qp{<�yrJ��s�wt���Yq
	ܺ}���n���x��70>0eU5����vB�7�_�j}()�
DD&&&P�ը�jDQ|�sIH$�Q��������A��3::���d`p��Ţ�"�7��o���4:��ٸ~={��EgW�==��96����j::;IHH�o�Eo_q�1dge�`����HLL`�
$��q��
≍����ˤ&����O�����������=���8��=�V+/��<7���֝��\���R�V+U��DGG�o�AVf&��T�*��PH�����7X�V�?��oh��������1����̔s�������nq1kV��uS�v�y����ض�g���%������l����\Njj
�q�3���h�l6SRZ��dbll����l���c����%59��DIY:����f�ٴqf��w�����b����l޴	�V��U���ZZ�S\��Y�h���(+�/�����M�1����b������k���؄�b���!b��-���p��I�o��cj��1�L������BjJʬ��C�%>>nV܍���~�Z~���\�t	�^Oj��u-3M�-���WY�|��sA�@0�Z�B�T�m�f�2�������H�#� �p8aο�d2�?6�m����Z-F�S�A�j5(�
tZ-��Z�Z�B�V2�F#q��$%&099���Ft:-��Ƚa�{�
~�W?E����{���ݍL&#""��L^qQ���d��-(�J���f2�iu:\ӹ�EQ��IOKebrr:���^�(��>_tZ-r��N��k�\n
�c�1�~��
��V�����U/�P 0��ьF�A��3��؉��ut�R���j4(J�M8�ܧ�A�R���e�Uj�����|z|z�
�F���/���������f2,>�gڇ��X�t)o��UU>r�݁��X�j%˗.��p�y�L211�V�E�ףQ�Q�����W),(�デ))+��t���@LLL8�����x
�B�\."���]��V8ŵ�h@�R�t�欫Q�Q*�h5ڇzte�E�lٴ��|�-�y�)�rsz�$�od�P?�;Q�M7�qqLNM�G�y���HK���k�D��[x�����w�{�NdrG��`hh���6�
�t�*5�u��2�Z-kW������K0D&������K�ilj�f�������˰��h�Zz�}}\�r����_s�m��-f������pL`6����`��5=~���!
��IJH�ĩS�7"+�-%))����p�^x�}�
�r�x�������z	�B��rzz�x�����ɞ];ilj�ʵ��drd2غe3�W��Б#���022J(b�6�N�%P(���r�n1�N|Fnnv���@b8ݫB������sCAH��g�f#!!�L���$w�K�z��A�:=ׯ����ǃ�lF!��q���B}C
���GnNٙ���lDEEa0�q�\457ϊ�B��O>axd���&����[\��nD@d��8u
��Fm}=���DDDp��d29��ՄB��|)k׬���S
��nQQ!�(p��[[�p����dDFZ��.�8|�(�(�lɒ��ևZRR�B�D��I�399��n'55���QDQ$66�ή.<n7999�A��ۉ�Zi�����'ػ�I�RR���F&�������F����d�	FFG��@�P������0c�c$%&��������B��d"33���OsK^����p�q�i"�"�II���݃A�'7'�NG[{;CCä��`��iik��D���8R������v���=����dll�ֶvTj�QQX-�{z�rNa�X���fl|<���,|������l�����MDD��錌���Ӌ�b&6&�ȨH���Q*�d�����θm�H�5<*c��pLB���a������q��Hk8����s 
��ڊ�1ARb�ii���l|���fD�rs��	����#����Φ����ǎϊ�\N{G'��;�O��CZj*=��DFZ���C�R���Fww7/_aǶm�\�����AEE���HJJ���s&�RS��r:����IlL���8��/�juZ�rrp{<t��DL���Z"y}�$=�:;ߕk׸S\��O��\z6(�<h�q8t��211�g�O�7?�k�23�p���$;+����Y��H$����m����ĄD�-]��H$s�#>��ں:&&'y��IOK��Qd��eDGI=q�D��}��D"�H$���u{���S�D"�H$�9��H$��1"5��D"�<F��_"�H$�Lj��K$�D��~�D"�H#s���nܺE(��[0�����z	_�qQ�|�
��]�y����*���Ue�����uu������͉����4ɒ?��/��%�A`jj��g�����i�%��ݜ��������`�;f���������#r��Q��X����C!�������;p"��3
���~�2_���xf�!¬�2
�����7�t���{����1sA
�Bx�ޙϽ�
��=���?H�UW�Y�����8p�4j5Z�DŽ�;��LNN�[^ɟ�Q��{�r �ن(����<Ov�{�XsLL��{����׿�ݻ�gJ$߇�3��d465��_���=���g(-/��r�l�N�9��`�f����Ιs�g2����K���O�`��iljf�ҥs>&r��9�����d���\�r��_|�V�O��{�N�jj���'2��+/�@T��4�%e�\�|�B����C�Tq��Y��N�ssؼi�}�!�@���I6m؀\.��󴷷���/�����c�p8���]����x<�deQTXȩ�g0�������O�Ja�q�9q��LNM����p#���QSWGH�P(HNNB�P ���ȧ����={P�U����G,^�n7���;A��9"����7oߡ����ۿ�ҕ+�BO��=g?�XS)���n:;�p�ܨ��p2�`fO���g��HRb"?}�'�L&��X�ry�y����,[����\�m����1:6ƾg�AE��<���P�T��_�5�
Ź���ɥ�W���c��u����v�)+�������!dr9����ٸ~:�n�vlv;��<��͛y�׈����'�i��3�PRVΝ����n�j�o���k����"+3�g�~�ŋ�8}�CC���s��Y�{z���'=5������2""��m��l�4'y�V�e������ʋ/`�X���瓙���{�"�ɰ��Ȑ!"gϟ���g������?�q��K0����E�
��3t��P[WOiy9��e467s������?�
����a�<��	���;e�D�E�4�"`6�I��'.6�P(��l�h4�����b�h0M�ՊL&#::���<�-]Joo}��df���h����&��LLLp�ȧLNM�v�j*���q�6K/&1!���|#���c����َ�1N7Z�ObB
���΢EE������0z����dR���+�z�z�		
��G�rN104DQA:��RIRR))��t����GEFιӗ��X��)�ɨTۃ���t:b��0�Ȑ!��>����6������^�'';�ܜl�rs9{��P�uk�p��Er��5��ǚ�b�}��o��?�1+W,�z��GҜ����p�oQ?담Z�q��g�ܱ���hn޾MLt4�(������W()-e����r�^���룽����\J�ʦS��Q����3<:Bjj*��c���9v���v^|~SSS��[�"cc���|���eK�`�Z�����岨����s��ijjj��llٴ���6A�y~�R�0�L�>{�R��%��[�&=5�^��d"
�<;�LFLt���QQ$&�s��m1���\�f3����d2-�ѝ�oa�8*UJ�� 7n�"##��^yu�r?a�Z�"
���˗s��q^y��,^�������2f���%�tttH�v�\NQa�B�vɷ�x����%##���Z�&!>���Dz=�IIdge9}ם��JFF:r��H���Keu5f��%��e�&23��ju8�
X�������$
��JEbB�v��-�6�l�R�z=���h4Z�޻A���Yf�ҥ���199�B�@�R������K�����󑖚²�KP��<��	�33�Z-���b0���!#=��4B!���x�.^�V�ctt��#)��DFF�����l&�Izz�����`d`p�\�B.'**���<�����d�����ޏ�p����CVf&�ɉ��A�mٌ�lbdd������Gѣ/j����R��IKMA��a$55�իV���@jJ*kW��h4244��kɣBǼ/��d2�?6���g�r�:w�����@��vY~�^/��_��
�ؼq�BסD��E"�|_�l�,3�)���NOLL���(���טD�g ŋD"Yhߺ�/N?s{��o"���H��H~��x�H$ߗ���vc�����lU*��z�S"y�I�"�H�4W�D"�H$���H$��1"5��D"�<F��_"�H$�Lj��K$�D�����l6��f�s�+H_�E���:zz{���D�?5��W����͟������l6����}|=N��m�Pb-RVQ���C�G֜�����O�����\�Z��9z�̉~��Yj��r�~���/S���g)��	�{9ǽ^���W�s�E���`]=A����Sg����KUu�C���G9^���1��}|������w�[\�g�\��6o����&��W�Ơ�����|z�%����lr��'X�&�}�i�23�}�.n���h��ޅ �:s���^j��YTX0�A������J�V+֮����ݻv�V�8w�"�֮������
,3��݋e��mm�\�r�P(��{�`���╫��v/
'9��I��N&&'پe3.����SWW��?GnNgΝchx��˗�n�Ξ���� ,^����d>��l晧���Ν�&b�x<=v���2F��x�����\��+ť��-.����^�L&G.�s��]ZZ�E�g��K\l����Q���˧'>������f߳�P]SKKk+���%ee�|>�l�4g?�XS*��PY]M��۶l^��N"���>�E"""صcv��Ҋr2��I�O`Պ夦$�O~^y�9�SVQIA~�--�>{��.����S{�����|7���8|x:Ehz������k��饢�
�������f� W(�lg����O���c��
����q��r���)J�+�[RBlL驩;q���(bcbY�t	)�ɜ�p�ƦfR��9|�(=��}�˗q��d2�	��eϝ3�Z�&=-���V�\�^���n{z{9v�3֬ZŚիQ*�
R][K}C#�����t?i�X���~nݾ�F��� ��%%���s��
>=q�ήn.]��(�݉�=�~?���,Y��U+V �i�L�H�����bY���Ʀ&|>?9YY��ưb�rbcc���!=-���غy3J��s/�R�X�n-��
������l���޻���R�.��^����r�"#)*( )1������Ezz���Y�c��dǶ��L&&&'����/~���T����ֆ1"��K���":*���H�,^D\l,��파���ڊN�#$h�j

X\Tșs�1�u��ƒ��CFz��2)
RR����a��}l����`0�~�Z�++ikoG&��3�)���n횅��Gڣ/ �h4�v��33��o����x�nq��E�^Ig�X��Zy��)��'3#�����lF���iޡ�P(�_\�y��=��6nܺŊe�0T�Ԑ���(����S��HYe%��(�r��`6��� 7;���6��d�TJ�������a�:�8Ⱥ5�Y�r�Ν������'|~?��ϗY�j����T*ILH@�Rq��
�IHH =-��W�184���{w�}:�w�L
��J����F�22�Blۼ�JE\L�`A!L��AOUM
�i�DEE���~�HNJD��~^W˗c�/\o:%9�	���^eph�χ ��A
���ax��E�T���� ��~Պ������S{Y�h������OIsK�C�R�M��dh�Z6oܰЧ�D�(^~������7��j�dgg�������@ ���89�9$'%�?0�A�G�ݸ����		<�g9�Y��l4��ò%��触�����m6bcbhjj�����˖�~�Z�&�mm�B/��<~��;�Ŵ���|�R6�[KCc��vl6�9���S��@_?�i��X��ѱ1��ضu3K/&����F�E.�������(�{z���e��8��E�̌�RS�Z��A���HIN�`�ހV��ƭ[344�^�gɢECA������B��,�5�L��f�kj�‚2��1FINJ".6���ą>i�R���l�$'a0����`0`4P*lX����4
[6m$**�ʪjj��X�H� �p8f�${�L&��c�۾}v��ׯs�n1����V��v��K;�@�_��7��{׮y��HeR�H$����������Ʉ�f	��e��q�\,Y����HeR�H$����{�~��@ ��`��;���� ::Z�xI~��x�H$ߗ����ɼ�j5j��;�i�V�V+}�\��%ŋD"YhR7A"�H$�Lj��K$�D��~�D"�H#R�/�H$�cDj�%�D"y��i�].�CC�I�	Q���c�f[�r>�<_�Վ{�z].�Wn�����Zt��011��E{hH�������t}�
I$dN�������f5}��\�~c��w�i��ʍ��ȧ�OPZV���#p��?�"�Ue���� �.�{ο�|~~������m�:s�C��¥KTVUs��'�=��\�x�6~(�695���g�ů~�������}�D�]��=�` @cc����̦�8q�4w���;�ded���O�!-%���&������a�������wi�褾���ܜ9,����FUM-���,ZDk{;�W�@�TRVQA~^�c�T��a4�i���L|2::ƭ�w�l޸��BiY96��‚|�RS�}�.�LNM�a�Zlv;��=V�Z��'�$+3�;���l�rsr(+�`dt����",3�}�!��1<�w7kW��5q������S�-.�f����ͦ;��H8�(PRV��H�,�
����Re����Ԕ�իV��6�����HNL"::��^gz��UU����������Q̓���Q���ϝ��n7~����h6�[Ggg�lڸ�ֶ6<K/���s�ɀ��nܺMSK�ys�I$���g�2�P���8�*+�}�.��R�V�E�T�P(P�Uh5Z�[[9��|>N�9��׸z��n�$5%�V;o~���	~�λ���!�?�._��>���s�l��e |�z����Ï?fdd�9�09v����x�^��������ɧ�cjj���
Z��Z͕k׹r�:!!��g`p��.P�؈R��>brr
�B�J�B���D.���G�@�� �/|�N�,<����ʲ�Î(BIi�=�J����J�D�Rc�Zhni����t9��ȧ�r�|z���ƅ.��x������'����A �؉�������귿cph��N3088w��<��z����T2��	���#���m޴�ii��޹��Δ�Ia~>���$��,Y��eK�p��
RRRx�嗈�����k(UJ6�[Ǯ'v����|��u:K/������H"##Y�d1%ee��Ɛ��Abb�/�YF9O�rph�������lfrr�ں:~�k24<LUu
��vlۊ?���'55���d�ؾ���=���j����q&��Шլ_��EEE�WVb1����`qQ+�/��/*����B��GxzϞ�>�3�.��P?P�ˍ���؉�x��Y�l霵���HNNb��s�\����b�`w8_��.�G-^E"""���Nr����������RR�������c�l1�ӷ~BVf�t���Y#dɣbކ?���sc��*��[HOKE�T�������(�x������x

MM���0n�!"�##�/��P(�X�֭YMjJ
|�2��X�f5����?hhl�o~�3d���IKM!3=�}�Q*��@Uu
yy��Z�"#�������1rsshmk�I��V����ؘXb��E�=�v�P(HNL"N�	�p�3e�h42888�C�Z���ʙ��HOC�Y�T�
��)������eQ@�ӱw��TTU�xQf����p���rbccB!B!�P(�\.GE����j�$$ij}�fv��IjJ�s!=j���g���˗�5[7ob�����o~ÖM����chx���I�XS*X�V�޻�B!5��G���W^������)'^��%�c��0,]����:�����#2�JEU*�
�Bιs���C.������ͥ�����
D`��%4��r��%�imo�b1s�n1u�
����y�bbb�����r�^��p����[���������KrR2W�_������4�,^Dm}=����\���k�2<2BaAJ����+P*U�WV͊����褦���CNv6�������b��EDEEQQU�Z�B�Pr�#���S[W�\!c��E���E^n.:�n��R�������Q�"#��� **���a
����d�O`4ho�``p�KW�R��HK[���x}>RSS��dDFZ�������
��P^Y��8���L�>�By��%1!��OaA>��zC�lټ�ܜ��رm+I��\�r�/K���\.5���� 8���l2����n�?;�(��d��^���A�y��ކ�߼��۷������<3�(~��Z�����w�BN0���jQ(�B!~�?Nj���o߿� s�#�����F��w���de2��>+�J�J���o2d�s��e��w������9�����sH�J5�,�)�ۺ7����r��g�χF�A&��fz�����Q��{���?��kɣ�[e�t�_d2٬���\.C�Va2�z�(���j���0u�J���R��fg`p�����e����x��>X�{_���|��+�ɫ��g/��^ڛ�~eZ������څ{��x�e���g)�$��o��w�ҥ�dg�"џH����_#9)i��E"�^H�"�Hڷn�#""����Nv�h4�����u"�|o�x�H$M��_"�H$�Lj��K$�D��~�D"�H#R�/�H$�cDj�%�D"y��i��N��;K>119���Y�r�*_ X����|8&&�z�8��}|A�nw̛��ˈ���n����>���\�^�ML<\�C�p8�k��cN�����GΚ�o�f�����M�-�|�8x�n�-^�r�(.-���l���r=X����&����WT�wߛ5ڗ�|����sn^��^�;:�p�_��7tuw/t�0A�>����+��w~|���/]6
Q]S���\�">��n7%�e��׿�ԙ�tvu-�.I$�Ȝ��;���t�*r����q,f3g�_�؉ψ��$�wޡ���ՊB�������n</V�Ahnn������2�����ʚ��#��T��b���j4t��`��」��VÄ�Aum-��6�V˼�w��νeDQ������f�j5SN'�==� �F�~?===����p�\��׿�����BTd�����:ͱ�a�Z����;+��ftt���
*������p��-
^�����ygg�|���QYUMEU+W�`bb��Z&&&Q�U|x� �%�DEFb1�ihl���	��ODD'Ϝ����r�0�L�U*j��i�^&2����جcjkk����1kv��Q0������fDD,f3#��TVU3�t��餽����</��ֶv�C��������S����N�::���D��S\ZJrR2�))s�%
��2;�z{{����h�C����_���|>�V�c�s��p���?p�/���F^n���[Ɍ�;W�r�g`p��.c�;fdt���f����(~���������!�
�����}���N�"7;���I��}���W������Dt:=~|���2V���?����x��B�ӓ�ORR"1d�;��	$'%r�����JR���Z�T�TSZVβ�K��� ::���!<n7:������c�VFFG	���5��	���������L^Nz�~���;���w�����v�P�LƔ����(w��پm+;�n�3w��3g���'91��
�7SC�ô���f�*������wv���Dum-SSN>;u�7_�`0ȸ�FYy�����k���2�/��W^��K�t:HJJ���;446��j�����~{A3~���><p�E��x�^"�F��{�L&���yj�nA䃏>���p&F�J%��>MZj*wKJx��g���cph�̌t���ho��ä�$����S{�|��%e�<}����!"�vZ�ڸ~�&��;k�ѱ1F��hik� ?O�	��BJ����L&A��W@"yT̹���HNv?y�5�,^��8kV�����7_{��+WRTP��];ٶe� ������/ٱm._���lX����&��
�}��P(0E�����ő��LnN%�e��֑Gll,ƈ�?	�	
�/�NbB� R\Z�K���o�ŒŋEX�l�7���$

�ؼi#����ߏ��aqQ�w�d�ms.�Z��ukVSXX�[o�Nd�u���Wjjn���)�w�D�� 
9�Y����޵���n�y^�t:)����}�x�G���nr��0�L�|~�v+�-c��ռ�o&�����v����."""x橽����5�����;�LOo/&���N�߼�����PU]���^�j�R:��:|c���Bk[;�����ƨ��'
���4+F.^�º5���[?!7;QSD�P�Ղ�d�͛�[�����_������G�64�P�;w�a��q'J��m[���K�*5U�s,]����l^y�E�-]��U��2���+��>�����%��F_�H�;n~��R�@	(�
�7cc�DFZ������g��tN�.��^�B�`tt���I�7�N����\&G�����ϼS`6�ذn-����ihl�՗^D������������0R����K���~�R�dhx�̌���_J�"��C�<ќ �x}>�� :��\��ngjj
�F�Ԕ�p�z�iu(
<�9�c���V����~w:��"��x���`xd�V���crj
��t3(��:�����3n�q��%V�ZERbB8#�R�������!���'w�$*2!"�r2<<L($���ǵ�7��3�^|~�MM3��h0���ĒE�P�U�ƅ��/����O�	�\��}��O?��d� ?�5�W��@wOϼ1������>��LƎ����H�デ��tZ��Ph�g)"��n<���Q��h4��]��>!$�t���t��j�LLL�o�
�L6���1��L%�0�hԤ���W^F��<��/�|�9����eͪU���R�X�t	5�u4��R���J��ڍӍ���g�������(/>��ܜ\�߼Iyycc�,Y����F�;FIi�5��
z�߸Ik[F��M֓��DqI	c����8].>=~��v��ܺs��c0�ٱm+qq��=���Z^��VCA^]���j�����������dv�؁�ȵ7P(���2~��{�-.�nq1�P���|��hnm%?/���@Z-��t�
��=X��22(��������^z�yz������/>�ENV�.\�������5������MzzK/�Nq16��̌t�����{����ҲrV�X���˩kh`ll��Obb"�55��w�⩨��`�BAnN�C�}����Sg��p8���b�֭LLNPW_���)�I�|>>;u��b$?/��7oRZ^ASS3�y���RQY	27�cQQ7n��nI)U5��X��N��;wQ*�=~�;w��(
�Z����oO��K�,frj�Ʀ&�**H�����8|�(g�_������v���ܸu���(��Zjj����YJ$��>�8t@ܱ}1�1@�e�@ �N����"���C  b�6�t�R�()-��ի����FGϼ�t:q�\hu:�:~��/�5$�L�^����t9��-DDE�߽�.Z����|A�����r�,�r���v�}��j�̌`h4|>�P��߿����֯[KlL*�
AfʥR*���� ��֠��x��( ��p�T(��t�T*<n7��^�)"�׋���ޠ�^o@�Q311���C�ӡ7�y���Z��Z����h0099���&�hD���x���H�ՊB�`bbbf�Z=��!299���aĠ�?�é�FC���P�T�A�vf��;��\�27F\.�6����yj�n�-Y��a�h0`�>o'����C,_���;����ш��q���`��!<��9k�XP��LNN��׭P(�.�8].tZ�C�N�D"�b�`���N���}?�L��夭�m�P�R����h>ϻ����{��7E�Z-驩����F�aZ�J��g����2��ᠻ�������m�Z�Z?�n0�r;2��ٌ�l�Sx�F3�U���ٽ�\~_���9�^�����?�JElL�쿙L����j��=o�̮?�^?�ƈ����l�|_����9��e�;d�M�Y��a�V�I����7�RILL���87F����[Rʸ͆R�$+3�р�8��L
�&32��Z=�r�a��K�;�n�����}9�ƋD"����
�󉉉�Nz�r����<IFz�Zh�\���>Ktt��9	�#:���(�=�w�s�p?�L���Ob4<��;H$�GÜ�~�D"�H$���;��?��H$�D�~�D"�H#R�/�H$�cDj�%�D"y�H
�D"�H$��y��)[��`0��nO���@0D��n �w&��k=��w��r��i����n��y�H$��9
Kk+��~:�b�t�����}ph���ɯܸ(�������ۑ|{�MM���GTTV����u��x<~�λ�����w=��ǯ~�;��{��?� �G�������DQ�Б#�-)e��C���}�v{{�friH���i���w�˭;w[�]�H��9s����q��Ubcc�x<�ݸɧ'>#3#A�û�1:6J|\
E8a��n'���E���1�;;�~�&�11�dg��)����F�F���E��O��r��tvuQS[GgW���ܺ}���8 <��e{����k7n�dQ���2~��R�ᣟRV^AZJ
Z����a�I�T*�;�~:!��`@.�384^F1��n���q��3������t}"�pr�q�����d�Z�n7=���~~�N�����?0���A�R!�"�8&&�i�(�J�G8���4՗�^%!>��Ԕy�e&�n�㮽��)��^�F�E�V�Yo``��կ�h�X��ygq������/^������232z�$�_w��9WU�\NWw7�/\dpx��}��Jjjk9��I�.^LMm-��DDD�ß������^CE>>t�����N6mX?g'zzz�������A�3{�>��XEn��w��c�݁^�C.������KW������b��9�^�y�KW�`4�{�2.^�LGgv��-�6QQYŸ͆�b晧�r����l�,po��^��;��TTU���ƾg���g'�O/���s��U��11���>��~�r����^z�yT*�BW�
g�<��b!?/�m[��G�����������r�����bD�Tp��atZ-�����󌍏S_�HLL4{�܅\��>D��`��yz�p�|�
rcS3�/�6
����ftl�Z�_��O�L����H}C���V�uk�ZJ2��L�����T*f�&J$��9=�޾~��m��?�#�##���,Z�������HOK����M׳w��tvu14<�?����v�)-/������,��'o���KlL4�Y�{��V+�]lڸ��v�~�{v����:��*���DGG���GJr2N�����	A�y�k׬�u��r����<�g{w?I}C�-"-5�FK[{;*����t������~:)�F�������ظ���*;�o������Mrr��ʪ�F#=�}�-f6�[��b�?D��a�0RRZ��͛��5�0�����˗�l����9z�8E��tvu��X�
>#�dgf������Bdd$�CCx�>6�_GFz:�O�$>.���x�Ʀ&�&��㤦���2��/��!3#��x�'���ظ�
���^����Բi��Ď�Zijn���̊����S!$�M&Tj;wl'7'�Ųл%���=~�Z�Z�A��
	��jDQQD�P�T*a��F�y:���G���d�����<��Q8�Bz��{�_B��e ½בD�ɐ+����J��@ @0D�������ӧINJ�h4"�LFC�޾>}r��K��駳��?Q!�#"������b/Z�N�C�V�ӷ��nI	��{���WE��@Vf&�
��C=��ӷ…��iljb�
��*L�zb�錍�ω�{/���r�r9
��W^|���*�����pHS&�)<�.A����,<�?s�	�䅏m 0s�e2
��\�(��Kk_ ׳j�
�RS1�"�,��G֜�_.��R�fy��D�^/����G���r��zJ����~���34<ľg�A.���C�����%K8w�"UU��r�
/��7��Vˮ'��z߱��<�^����P�T(�J��o��� �?�,��
��x�,�h<��l\��#�#>.��qr��������FCvV&w�s��c,]�������(�Q�	C�8y�@0@fz:�i�\�r���D1��ҕ+LLL�R)��F�}�n�
�OMIy�Q�}�.���@AA>�y����i�$3#�\F}��Q*�����
P�POnN6Wo\gph�\�A�g��M>�)��CTVא��CTd$�ϞCEJ�*�B���U�ؼa��~:w˖.E&�q��-j����,*,�デ���DGE�sϒ����GE��+�/t�>��ϼ(�<��$�q��LNM�cbQ�Z,
�zINJB��������������¾gINJ"%9�LF_�v�
��@lLN���2
���$�J%}���F�bb���?�&''���E�Ra1��0��ta6�HII��`hxx��q|\t���x�0IH���p000��l�j�b���`�\A||}����V�甓ɩIA$=-�^OwO��2&�	��Koq��3�VOo�������P�NLN��Յ(Bzz��ɩ)zz{�!#5%���j>;uzV���r�デyj�nrs��b���N����G__?'Ϝa�ƍ�_���>bcb���ួN!�� �{q���>������l������So�$'��&&*zN:f�D�h��Iz�uv�+׮q������,��'��c����҂���͛���NzZ�n���������={�8��!�u�o�F]ZZ���C�D2�|1"��CCL9���ڏI���y����uk��Z�(���[��%�D"�,���㗺��D"�<F��_"�H$�Lj��K$�D��~�D"�H#R�/�H$�cDj�%�D"y�|�
�����t}g��B8���@[[;G������G?�����z�<�	�}}s��ˎ�����]O�����'N����ˉ�ȩ3g���XU]��;&&]ćZ(b||��G�R�Ј��^�]�H��y�P(�������$��P�A �p8�r:E�`0����f��z�s.r�q�n7� �E�P(D  
�^����v���_ذ����o��-�CC_��z�	���n������FFF(��`ph����\.)���11����n��v{�=�n���>}�TTV��ݍ��I�r�2��ojj
��=����6�QI�����ϔibb��;�����v�j\}>�n��P(�(��\.�N�L\n7>_81OCc�cc_�/^����a�kj�������m�9I}�qLL���Ʀ�2H暘������w�/�.I$�Ȝ����N�����
K�.!P���~r�>�׮al���7	�ݽ��FqI)*�
�Z�3O=Ź�����lٴ�3���o�M}c#���r���F#^����HF��ؾu+�w휵_�(r��5�߸�k:�{^n�B��C��s�q��[�j� W(hmk�|�Jū/�4'e2@UM
��<E ����LƧ'>���	�L��͛9w�"6�
����=ǡ#GƠ���W_��vs��	.^���U+ٺy3������2o���ܼs���v��"���/�?0��g	�B�Y��'�o{��������'G,]���;�s��������jy��q8&8z�*��Z�k?�
���㚾�}q�s�=��-�`���3�`���Ï��x	!v=��R����z�����!�/
Mͬ\��P(��KWhlj&=-���2:�쌇%%�\�q���^}�%6mX���PR*�A����j�Q�DRɃ��KJK����7^'&&���~l6�� C��8�.��r���."�V.^������1�����ݎ��b�լ[���_x��?0@H155���(S�)�*��[���.Y̶-[�}��މL&c���dg�o�����u��kim��������yq�>4j5�(����?��ߒ���g���=�x��<u�5�W�?���) ��+x�'�x�����j���}��f[6mbۖ�tvuQU]���k���O�x�k7n22:¶����Nq1�U��DZe�&�
G�C��DZ�>c��\�j�R===ttv�z�J������W���edd��W��GK�~�|��X�N�<��S��Z�����)�q����*�*%۶l!::�3�ΡV����!�"#�cȘ?�� �;��h����K#�J�������~�sZ�ک����K����՗_d��e]�1��˖�}�V�=�,�)���Q4��?66NBB<�����02:�L&C��EG&#
r�zz��|3C�Q�Q���`2�0�x}>,�6�}C�2y�f�� :*�H���2�7~���F#j����(4j�B��C�11�f�
�/���j�I	���7�P*??�~SN'II��M&T*.���n!�B"""�X-��8u�V���0�\��h4����L��TU�`6�Qk��O����k|��c^}�E���P��X,f6�_�Z�Z�j�R+W,�p��-Z��(,(��?���"7'�`(��b&-5���._��R�d���DEFb4P�Լ��q��u�<�s�<͸�FaA�))DZ��d�/���q�΢�B,3Z�Qǫ^�'66����f���ޠ�>'��
�|�d2���]��dCDD�B�D��i��-]���ȧǨ��&5%�ĄxΞ��J�����Pp]]�Zd2-�w:=�0����)!!��w�p��V,[F(����jf�#�/h2dLNNr��y<n"")��,Y��J���x��gINNZ�{���ds��y~��pNM�A&���������M6���Ɲ���d2d�X�v5�q��Q��2�����MA~>j��LFbB<wKJ8u�,�II����J�B&��v��x�2�(b����� 99�J���$�M�D�LNN"��Y�r%���M&���P=�
{G'�c�D�
��޽d��#�Ɉ��"66���A��;���g��ֲe�FJ�?;��ᠥ����L����� SSS�^����.�yiji!++�Nǥ�WP�T�74μ����˗q��%Z[�)��'R\RJ_?�����q��)��i�-�6o"&:�OO�D�
�ZZ2����̅�
��[Q��������Ao *2��$�m6���0�<��^""�(�J6nXOQQ!�y�~���Y�v-y�9dfd`1����#55���4�bc�Z���������루��U+V���JzZ111��Dž{6�V��҈����ra�b�MFz:�Y��z�����]5���q����v�*2��IMIA.��b�26�[O0 a�X�X�$'%�r�2z=j��ukײdQY��x}^�,ZĊ�����#..��B~n��q�B!֮	�8�fgc4����=���� &:zz�5�F	l߶�ŋ����A��IHL 6&�C��Q	����xj�nRSR������jIIIf||���&�RSX�jk֬&=-���x���==�ϔs�U+W�~�Z2��0[̌�����#';���7��hI��G&�a6������űl�bcc�V�XNQQ!�iDEE���ػ�IRSR�rN��j�X,DEE���J^^.F����LROV"y$	���1��2����n���|G�ghd������<�#�ʵk�).�?���yQ1�����S�?x����O��e���_����3{�,tq$�C��f�S~�FRSR�X,��
�Hɣ)&:�ܜ�YO槜N�]��}b���w�� ����E�H$?_��H$ߟ{�<̏2$ɣ�;��K$��L&��.�H�줮�D"�H$���H$��1"5��D"�<F��_"�H$�Lj��K$�D�YІ_n޾MOo�.'�"wKJhko��b��ۿrہ����nܺEWw7�n����ܺsgV=�|>.]����Z�K6���W��N�|�@ ���W����G����QI�|o��^�L&ï�!��a��
��'��9��7_�>��AQ�4���E�k7n����
�(���nn޺MKk7oݢ���K��n>��czz{��%������7������+8��/���7g׳���¥K��'y�����w����/;nCů�؄�k��=6���.�z�[0�܅��������`08g������>drj�;�1��?��?��>����r�Ν��%���';���>���8˗-a�֭\�~���J�V+/<�,6���/�Z,�{�Ydr�?;���0�P����DD�޸�J��٧�"2*�c'>cl|�FÆu�P*_�]�q��c�O08<LCc�-"
q��Y�X�x1O�|bN"��w����	��x���Y�v�B���B\�~���r���0��(
����_��%��Ѽ��9��.�]�k׬F�P �ər:9{�<}�$%���]!�+��Q]C--��{��^��Z�"%9��g��zپu��:s������x���).)����B����#!>~f��(RQU���±��S�L&����ѱ1�-Y¶-�gX��ê�+ؼq��͜�pA��u����ܸ}���N�����3��wp��E4j5,.*�y�'��p:]|t� 1/�m�f������e2��8w���	�
Y�f5N���oݼ�uk�̊qQ9{��?��� ���Ur���P~%�J��n����q�8O푦O�<���k��[\ª�+HKM����O?����:���p�
�Cô���y�F����iΜ=���$/�{���)z{{9}��uk�`48w����lٴ���^�{{�3��%�"�.\��r�ƏDNv�(�P(شa=�?�l����4g��LR��ٶe399�����Ź���<�};*�QIMI���B������E�_�7�GY�o��������-[hmk��K_������܌ ttu���Í[����fúu�DGSZ^��۷YTT��۷���������M�0O���'p��ٙ�1�9w���nrs�9v��=�T�֡�jX�j%�Μ����#�c����]����V�;:9s�<����355�LJ���Lvve�=�/,�(2��gۖ�?����e��~�Q��7r��U�+�0�
3�;y����Y۔�d�瑜�̓�v�M��,�����+ؼa#r�LQ�<��4�˖�{��^������l6mmmh4"�F@$>>��+V�~�ZZ��hmk%/7���,b��Q�ռ��9�j5�N�������nV�Z����ILH@�y{�� ���KaA��X��p�/�Ka~>��Vz{����d";+���ȅ��388���g��E$$�#W(@1�"���`����;-W�W�V+�}}�d2B��==,^\IJ�K(,(���d|�q�!�7��\.G!�#W�yb�6r��9y�,���aw8hni!*2
�F�\.'3#�ŋ�����V�y�}3����Ooo/6����NP(��Jr��ٲyQQQTUW�v�Y�li��t:RR�yq�>��k�y�6�����~�ر��EE�z�]_4�R�HJLd������(r������k׬f�������ӃZ��� 02:6g�q���""��͛��<�����
?}�-~��OX�f�4��4g��f��cb���r��/�&%)���L���HOK������!�K˸}���K��P(�|�*
����>��2�r:Y���'O���AaA>��X�������@�PPZ^��j���AE���trsr�q�J�����rr���7�٧���ئ�=�F���|������������I'�O�bll�χ�3��-�� �����z�PTX@�z~r�N�kj(+� 33��2B����lۼ�[w�~���(/<��B���b�088����ini����q���,�����fӆ����QTP�ޠ'=5�ں��mtvvQ��8s���1;v::(�����b�24Z
QQ����[�LMM��޽
�p��Q"�V\.�z��r�t�
۷n�b6s��EL&\n7Z���NIY9V����!d2*����<�7啕T�ԠR�����ҕ+457������hkogdd�-7R^Q9��Z��j�p���>/�CB|<�����ؾ�ؘ�?W�L&C�R�j劅���[Q��������Ao`|�Feu5r����<IaA�qq�wv25�$9)���)n߽�(��dg���dg#����MWw7E����HSs3y�9�ض�����u�ttv�����K�2:6��b����)��˅�db��5�!z{�HOOc��DEFp8��ڱ���<����r!
9�Y���b�ۉ��#**j��yA�""H������V��%��7[J��c����w����l4���r�q�]���K �ᰳs��.Y�Z�fbb��;�addt�իV�n��J���v젨���
��&D�{z��H��RS[Ktt4{w�"//�^O[G�@��4�F�㈋�c`p��V\n7.����h&&&ini�����\� �]]��*�RS�[R23��{�N�,^DVf��b��Y�f-ͭ-LLL���,]����4Z[[imk�f��w�ntZ��Q���>�n7>����p�N�˅���g�"'+�P(� �ٽ�@����;�ǣ\.��vO/�4�QQT��01��r�R�((�'&&��AZj�#��Hf� �p8�F�L&��c�۾Yv�+׮s�����	�Z
�����r�t:����7_{�%�ͻ~0���59��<-��H�d�P������ذn۷n�������@IYw������_����嗿�
����؅.�D"�����|F�����Y�DG��8q��6+�/#�K^����$&&i]�z�HY�11��Y����Nq	� ���g��Z����tԏ��/$��Q�?

�fz�b�x����^����`�L�B�X�:�HI~��B1+��^��J�V���EQ$�T*��2�����b�^�ၷ���R��>Z"�L����J���O����D"y�Hs�K$�D��~�D"�H#R�/�H$�cDj�%�D"y�H
�D"�H$��m��3��06>��˵��34<LKk+C$��|5��NSs3��c446�;�ӟbbr�ή�o���s�\��7���##.t1�,A����cb�wE"�����q��[�t�P(ĩ3g�����STUW��~��'O122���{����r��q�8|�(~�7������ʡ#G�qc!yx

���!�n�7Z���\�t�{�TWQYŭ;w���𬈟����oy��(+�X�}�H��9_�w�\ܸy���A�f�*j��)-+#&&��;�c��)-/��t��͛������Q�ٸa=
����� vl�FtT�n�dph���EEEȾd�����W��좦����|�� �.\����U+V�r��9�����G����噧�bْ%]�BEj��[\Bww*�
�\NK[+����HMIa�;�����ngW��ܝ9v���\�v�ή.�,Z�(Bcc������๧�Bo0p��U��X�|K/�͛��xHNJbxx����~j/###\�|�P(Ću�XTT��M$���Eo_�6l����RERbW��`rj��kב����;wijn&;;�͛6PWWO[{�`��;����o�X��@���188���f����������X-vl�Fm]M�-����c�VFFG�v�&
���J|\r���~�����d2��;	�._a�6��
���299IB|<O�݃�����+x<�l�HRR��_�f�
�����������>��Q\Z�F�a��$�ڏ��q�<���`tt����D��Y��\������::;Y�r��%��ʜV����'O����Z������ �d�����W�������W�Z-ܾ{��/r��5�kj)�ϣ�����&����~@�|�W�]'�j�͛467!�ɘ�#�"��\����իV�(��d223�����ȱc�wt�Y�l6a4������N�?0��C�IKM%77�\�(��Mf֬ZEGg''Ϝ�3\�����Y����k�WVN�ݍD&O��t�:���q�6��\����������ƭ�LNM��j9z�8iii��RYU�B� 9)	�L�G099��U�'����ƭۄB!*��������K������\.�����N�&:&�Sg�R[WOum%ee�DGϙD�E����̩�+W�����`�~�����!4j5�V+��^�������d"�h��t��p8�N�>�����MLL��>$E ������SD2�`0�l�nܾM]CG>=�� �⃏>�f�q��%�:�v;W�_'33��N�����JIFz:��#|��sF��
V������9#��V��~L&���3,��Q��0'zr����˥�����I��{��饭���ŔӉ(��$'��޽�ڱ��J�**X�b9�6l 59�JŦ
�	���7��z���g�<�w�ii�B8h���	�@CSkV�b�ڵ$'%!�"J����\vl�J��JK����i���DZ~�Z�RS��LgW:��'w>AQa*�D���8֯[˖M�hhl�s�U��lڰa�ع=�c�n��l�D~^.2����D6�_G^N��c�540<<Bk{;~��׋F�f�ŬY����X֭YMbbb�����all�dz�U�'��³V��r���X���H+啕x<^�[Z����χ��B�P�xQ;wl�b6i�{<^��[�R�����+/c6��~�&�6{v?��+0
�u����&����f���W^f��MAD�l�84lٴ���A�;;�������ݻ�`ӆ
�$'2r��ٰn�		������� =�}x�^�F#kV�bqQɉ��_�����s�@ Hk[v���B�Ь�0�dfd���ͪ�+d�A���O�	o��:��~����*����dlߺ���2�v���|���x֮ZETT$q�q���119EwOu

dd���+�[\BbBc���Abcbٻ�I>��%e�$'%���BA~����hk{;�ii���"�"""1��$��SY]Mrr�##��d������Q�ر���6�N'�@��N]ʩ���b�<����c�p������ۇ��G$<l���DqYIII����!������+-+#1!���
R��0�L�0��U�r9I���O�ډR�$!!�@0H($ ���(�x�2.���+W���H���:=###457���GzZ:��=O����3��p�U+W�����͛�j5������1����Q���Í��rr2!A`�}u������I�l�HggW�_�b����Z��H��㟝$++�VKum-F��ߏR��n�;�X,8쎙<�11���i��?$+#�H��[��P(��Z�� �"�B�Ր���)"�5�W�V�1�ϗ�r�  "��x8w�Y���L4461<2¸�ƽ��dtZ-
��45�����g��[.�c4�����<�/���ddd`Ї�|uvvq���.۷ne��Uh��+*�$9)������ab߳ϐ��G_?��մwtPTX���wKJ���e׎���ɭ;wijna���d��Q�؈N��֝;�����ށF~O������Z�z=˗.A�T284Dkk�֬a��\�|�ʪj:::� ?���H���!>.n��yAX,fT*���r����$+#���Z[��j����9���9���ttva2E���Eqii��=�EtuwS^Q�R�"66�LNa~>.���¦��������GVV&����TL��~?��L9�$'&���JSKd���d�L,�1

S]S��`�������^C�V��;X�t	.����*�v;������j!-5���&.\���_X@�@��u������=���ֱy�F�bc)-+������	�.^µ7�`�5l\����hn޺Meu5v���O��11��fcbb��%%�wv���OT����*�)��c��-dfdP[WG]}�Q�tzL&����232X�t)�M�45��R�HOKe��$?/�FCVV&����|�f3�����yy�������줽����(
�������v����`	��F_���1﷭d2����6v�`0���6�ޫ�op��������L&#
�������/x��Y�b~��J5,�(�v����@^n.O��C0B�%A
�f�������	�B���.X��DQ$�:v>7
ŗ^���2�/��gi�WW�'�	���Y�
�B�d�/|�n�m~Q������^���[6Ϥ��|�B��9���_�T*՜l��`���n~�λ��ODGG�(��\���EQ$
}e��^���#H�%�o�[e�/P� (�Jtz:�v&H9u�,��c$&&����� &����tX�ftZ]x{_q��Qx��?�����O&�͛���\��!�B���j�g��9�u�_�Ѡ�ig}�;�ŔWT☘dǶ�3#(~�R��`0`�Z��&�d��!�����\�2J$�����b�Xf~�����(�`����:ݗncrr�J��+��H$s���nG��Ί���Il6:����_�{0djj
��$�HK$?ߪ�Ut:ݜ[�V����a2���$�G�������d�5cK�Tb�>�_w�HW�\��D"�<F��_"�H$�Lj��K$�D��~�D"�H#R�/�H$�cd������r��r���1&��erjj!w���v���t24<����z��l�Gr��G�׍�?7����m�~

�|�S"��-x��G�������d2���nq	�����K��Bܹ[��ģ�������~DEU5���$(�d{8�#���*^�GF()+���ׅK���ԩ��A��,�2�먬������}��]���2J$Ns������QYU� DGE���ϭ;w�;���011Am}=�-�LMM����kj)���`�N����nq	�]X-f�Z-�
��54066�B�������dRS�f��TTUq��+���ʢ�����n��j�V˜�j��ů����e��;?.z���~�&e��]n�SS�z�^��)�����ygN�nI�̱�h4���QRZF(b����x�^�g#UVUSU]�Z��l6��w}=����\n���%!>���1�߼EKk+z���a���ϑO�q��)tZ
qqq8].n޾M_?1���**���������q��+P��DD�oh������n�f��M��饼����I����kh�����M0$&:��Ktt4~�f���P��
q������w���$"�HlL�){g�KKll�yf4�s�٨�����0������5]]�""���������R����CuM-5�uLM9iln������8�N��g!�K~_w��9
iY9���OtTr��^�o�y�LNiY9*�
���{~�A���h49{�<��Sg�a4�v����
��"inn���OP*��x�����(�))�6��o��̹�$$����A~^.��C��F���.�����1k���n޺E��JFz:�1_V����1~�?��h�x<x}>2�ө��&?7�;��}~���f]l�����ut`4����������`1�����w������ۨU*���9{�
���7n����G004���$�Ϝ%11��/���Hk[�==����r��y��}�=,���[\LWW7ii�$&$���!lv;���Zf3��_IlL�\ahxA�~�&E���׿������I]C#K�,�3�n[{;�>r����r�n^��޾>�RR�p�2�����Qk�x�
6�mf��;$
��h�i��.��B�$#=���ٹ	�/Wj
���ȩ����?����䙳DZ-def�Zol����P(�df��t�����t:�^�Nrr2����ƭ��z9BtT����A�n�}����U*��R���9�n�?�12*�Ŋ�f�j����ASs6����q���	�RSRx�'o���'�v�:�n�`��5��ҋdge"��IMI��
���t�).fӦ
������'h��<��B����q�z^z�y�ss�J��ky�c4���;�YXX@FF:�>�E�},Lsk?y�5�lބZ�FE���y��ٽs'ť����Y�)�Y�N��p���M���W_a�����r232x�},]���^226��8�==�l64j5�7n`�ΝDGG��SO���������@ HwO�#�L����}>�q�	�v��I���3)��,Q"E��l�rػ��:�w}�}qý���Puu{�ڻ-{-��^�-'ٴ�mɢ(�"���g0&w���(J��S	���~��i���~��Y�����l߾�o���T����D�D"���a�&�e�|�+ϱ����;���g���$�I4�Ƴ�<���=�'�_�4}�����J�����/~�?��D2��=���ןg����=�}75=M���e��룫�˶��_�_>�,���eǶ��(�<|d͝l�~}�6��oV�ۚ�Y��KcC=;�g����r�FFG�����bl|Y�ٷw7��9JKJx�q������i�ԉ|p��:S�
­Z3eoyi)�����K/�����<q����:�x�����Q���>���K<���m?� ��5��+?A�$?ssab�8]G�%,�bfv���Eb���c�x�|�����4���36>�|t�����9ҙ4Ձ�iض���5��}��r���XZ�qux�H$���u1�1=3��~��d2��� !I��z9|���u��+*ʦ�
G"���o�G�%dY&�SW[�ǎ��'����X�ӎ�$c��$aZ&�^���9������al���(�YΗ"�EQXZ�12:�"+��!v�l�����I���Y>[&��<6X�E2�bzfY�і��s�8�.�`���^�GF�8BQ‘33��MLPS]M,ǥi���eH��,,,rux����^/�"319Ed~��M$��~UWUmh2��c<d;;�:`ʲL(���<E�"**�y���=Jccc㧲���g,��֑����vgff7� 7����Jg'�����(���屣��J�hkogbr�ښ���=�{��H$y��ڲ����9z��hmm�`�3g��v�8��Qv����s���i.]���CAΜ=�Cu���Nq��η�G:��vΝ?O<�`����R)z��ikog��f?�(���׼��\hog��Uv��@QTN�9CII	5��}�7D($��������$��4o�BGg'�]D��ןgdd����g�]l���c\.�+|t�|��v����s�9��Y,�"�����������q��#_�������$۶nebr���:|>3338����q�**(.�>��D	z�<�F�Od��/�e�g�N������	��=��Q��ee�LLLr����	��TUU22:ʞݻx��w�����Ǐ������
���sڈD"}���Q�?}�L&�׿�U2��]\�t	���s_:��K����JӖ-�=��Q?��#���[�����65Q��q��TEE�e^|�η��ٯ��~Zvl߰�^#����ׯǞ�>��K�� �y�w�]�ȕ�N��(+-��3Ȳ�C�195ť�W��Ghnjbi)��㦱����Wٿw/�KK(����㷧N��W�"r,�զ�5�r��m�~?�,c�v����͟O�������P
�r��mH�SD�����/�<Ƀ���v���S�4�����+�ٵ�g�z�x<����0Mrme���t:�d2$S)4M��p`Y���a�|>dY&�J�wT�e4M���UU�'R��.˶��bȊ��t"�2���J�qi��aFAs��Ù����W�]&�!�H�v{p8T,�BU���(�N"���i�4
�4�eI�0L��bF����EdE��iwe.��v���a�h�I�H$����AU��֙ee�2�a�H$�������_c���~l�&�J�l*���iX�M:�Ʊ|n�����3��e�_�����'����C+��PUl�fii	Y��x<ȲL:�Φ�eҙ�M�k#���7���N�䁃���r�E���q�RT�˕?ƹ�P��e����
�zn;;��(�5(*�j۠iN���p,���	���̅ø�n�mm��tRR\�b].M����"_���(J�7�zݺ�N�_�,/�kW��;�	�}��Q���sֻ8r�*��e`���������O�zsߕ����{f{'l��eu]�nA�5����L���kB�P>��$I��]˒����ʕ+w�����ũ��^��$��Q�}\/��f���Y=n]��:��ca=������j�v��X��+�~?�M͔��W����zeEO?y���.�SS]MC}��5�	���n8_E�������O�ݶagk+��X��;Q�wj�����n�A��s�M��k� � �GD�A����� �p�_A�#"�� �}d�2���y�-�"�N���0���u��:f�m�H$V��5M���Գ����ԝeY��Çt]'�Ln�n}f�$�i}�	���
jY?�痹t��m}>������<���T*EWwϺC$�W::�ы/��~���nz�599�����e2:��`ph(���?�?����\�{ܬ���.�rXkff������h4����E^�ɿ�;�x=w��N���������;�019��E�$I�6������yնm���߶m,�Z1sTn��偶l�0QU%��©tW3M�x"Nt!J8^w���"���uEQ���cbj���1��q��D$�i�(��a[6�LmyY˲�kY��a��V����ׯ����oݕ9���43�3�c�����hEz��$	����|=躎���Nݛ�39=���Es:��GH$�#aҙ4�iac_w��������7��ܱ,+?M����ʙ�
�N^muݭޏ?��[8|�|9����D���q2��CU�.D��c̅�P�4�3��n9��Q~������M<��G���SgS�6Y��؟[�� ��h�-<.�T*���������ܰ�b�&�$#ɷqP��f��e3��l��9��?ڪ�ޝ'�r�5۶�%˲���(۔y��m��`.˛�iZ����f7��l]_'p���ٖm��E��t~麞� s87�>_��3�'�g`<uTU�` @t!���j����F��	��j����A<�A��;7{3Da���iB�~`S\C&�	�QlZ����GM�$<a49�m��� 7&��dY�t����M��	���f��m().��\��O��GGA�\?���ZG��/�$IH��&�'� w�&�^��:"��/� �)��MC4�� ���«eeG4�F�䆺�lԉa��͏^�)��mIDAT���5D�����(��p��
��� �=b� =3;�R,ƶ��ON`�6��
+��JI�0��>j�k+�K$X����]��u�(Қ$14����Ґ%�Ɔ���9u
�x�5� �zҙ�T
�߿�<�O��/� ܳ�D��X�E:�&��`�&.���aY�X�x"A,ò-��8�T�É��fxt]�ilh���:�x�Á��]�s_b�9à�����ZF�F�� ��H&I&��n\.�D���(
�D�Én$	\n7��T:EFױ,��h�p8LӖ&��M�&��/� ���if�$���I��s�4�i��y+���s��\�Lv����D�x"NyiKKK�3i��a����٩�3�N]M
�%��� �N���N�q���	p8�Ζ�F���	�������Gey�ss�����475333�lx��G2�d)cfv�۽6�^�� �=� ލ[�I0dgK+n���	f�f���ck�V�^�`�׋m٤�iJJJ(/-��q�D��|EQI�ӄ#�u��^�۶Y\Z���t&M]M-�H�X<���,�b�bqJ��	G"�#��d*{��[fqq�	C��h����P ȶ�h��naD�A�	�X�C%�J�H&H&���q4�
��i�u�4Ml�"��0tu۲�\Z~�s�40�촼>����z*+*׹ۗ���P]UE��!�J!IN���"�MMB���3<:BIq	���t8�-e˥((��,+�|k�"e���/� �VX۶)S�+������nE���۶���������S[UCee%����
�g{���`�[B�
�J �'�L0<2L��G("��F1͵s�K�Χ�*�,�(/+c|b�` ���b.fqq��&��v0�)�(-)A+Xƶ,EE�e��N�HC���� ����vKK˦��HA>)˲������nM�$k��ݶm�N'���?8@:�����ۍ��mR�T>i����0R�4����(J>�沽f����V<S�$���~�5o]gu]G�$TUŲ,t]��tb��TEQpi�,c�&�i�3��N*}m�\�ܐ�L&�eYh��"M���"gϞ��A�{��ɑe�����mM�+�˹]�C�Ng�Y�}-KC�sEQ(�Wyk� �ݱ�|�u:T�"��)����u:�-��B'w�p��$� �po����6�r`�m���R`��nV�v���/��	�o��vZ�%�>� ½�g�
����ϪL�W9D�A�z�$��
�T�'�{�Y���d2Ȓ����M�1۶H�38�A���$I��~‘y�э.�e����Ydy�n���
T�|E"�� ��ˍKsm�Qj��x��
/�,��Lψ�/� �V'��
��f)���GA�#"�� �}D~AA����/� ��v�$	Y�f�m���dY�N�d�e۟[���v�u�0M��H��,���pZHAA�<�0��	���TUVQ[S�b����aȎa�4,�bdt���I|E>�44��z�d۶�^����e�6CW�����qi


�T����ffg(++��������d�d2x�>6IGOA�>p�oY��կ���8D�� �x�k8��ݭ��V&	���dxt�#��y?����vv~��d��_�|�_��7�TW�L�Оz��-�9��Z[(..Ʋ��^XX���<t�P>P��-���S�kɰ��ɯ.�eY�����AyY�`���
�Ov�˺���K������/<LYii����+I`.o7�������_|�)TՑ/c��X�Ӛ � �n���:�}��1{�QL�@�$�]����FB� �]]�ATU�����6�ٵ�7�~��mmD�|����4��q:�<x�H?����".����cjz��;w���d[
��^����o���|J�������8�8n����X�ž={�����%���<򅇉D"������b��m�#����z=<p� �#Ø���];���t�ԉܿ��d2��?��1L�#���E:��9q�8e�]�J�� �P�=�v16>������w�.FF���i>���P���A&&���F�����1��m�,..�z9�o/��R� �p���~���)��h%UU�m��?����!�|E�BAN�����TWU��O~JMM5���c��(*�3<2��[��(/+�g��%�T���1���	�������?$���t��9x`?��a�6^����.�_h�0t*��	�#��p�ʊr�lق�љ��bjz�m�x<�ilh�����N���E0�ŗ_��������ض��㦲����>�Q�oۚ�o��.�;:�r����Y??��TUU������,E�"�{z���JF�y��)�^�/_aaa���!>��#�����TT�S�+bvn�m��iko����7�~���Μ=����"��ƛ���NiI�FWA���$I���ݸW�S'N�w��{TWV�^b``�C�����W:piN��4Ѵe�L���%��;[[����g�!�J����\8L$2����iQZR“O<���477q���H�S��i �<�P_�����ı�9}�,?��hn�B]M
O{�۶�x0�d"���4
�u������OS]U���\'<M����Dd~]7PU�'�8γ�<�of�ͥq챣|��/�أ� !QZZ�S'Np����������J�=��SSTVT��o|�Ǐ��ǤR)�����g�fxt��>�W��e؟}� e˶����_|���FG�����'Op�1��J7�� rݦ~]�9�֎�墾�UQ�������o��/~��>qMs������r�߫��q����aii����hN'�P���-�����y���e۶��c�&��|9g�Afg�(--�����X$	��Q���y�wX��س{7��ȒL&�a��U�mmF�e��g(�����6|�������^������bt�`ksS�9ݶldIFQTl��0t,+��-��e[���mY������G��}������*,���dgkn�+�<?�ۮ���rq��UQI�R�� �pǬ��/I�d���}��g�044ġCp��a<3�3||�2����t����B[^��G>Bmm-#��\f��݄BA�/^dxt�����R��av��E8�v�+/gjz�=�v�r�����7ߤ�����s_:IIq1�i�~�c��C�B���F)).�Ѓ��d���AcC=�`�3g�226���f��m���t��RUY��G��K�����uǎ|��륻����n��'���"�H�{�N��a��PO8�ݏ��j��;l�ҳ_$����xhnj����+��okòl��a*�/*������q���hٱ��Ξ������	~��'1 � ���ӧO�---�v�,�t:�o"�5����[�7�w(����RI4M���,˲PTY�I�Ә���t�{����ֻ���m�躎a��4
EQ��4Y��e�D"�,�8�m&�� I�DY���gt�L:���FUյ#
���ݑK������H����r�뺎�((����{�0R�ڪ}	Y���"���	�饗���
55բs� �p�$I������eY��v�m�6�L���:t�ÁiY8*Gњ���m�k��&�)����q:�8����׊��x�|���w��ӹ�<���ze.L���aS}�߫/bE��Y_a9$I�����m�΅�֜m-A_A�n+-�S'����sf��ٶMUU%��}ȲLCC}~�� � |Z�8�;Z[ZD0���C!@L�#� �Y�u�/�g+7� � �i��/�1c� �p�S#����]AA>c�H��7�j��%tEXtdate:create2013-08-30T12:28:19-04:00��(�%tEXtdate:modify2013-08-30T12:05:05-04:00xȦtEXtSoftwaregnome-screenshot��>IEND�B`�site-packages/sepolicy/help/lockdown_ptrace.txt000064400000002341147511334650015773 0ustar00Disable ptrace capability on your system.


The deny_ptrace feature allows an administrator to toggle whether processes on the system may examine other processes, including user processes. It can even block processes running as root.

Most people do not realize that any program they run can examine the memory of any other process they run. This means the computer game running on your desktop can watch everything going on in Firefox, or in programs like pwsafe or kinit that attempt to hide passwords.

SELinux defines this access as ptrace and sys_ptrace. These accesses allow one process to read the memory of another process. ptrace is what lets developers and administrators debug how a process is running, using tools like strace, ptrace and gdb. You can even use gdb (the GNU Debugger) to manipulate another running process's memory and environment.
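
For example (PID 1234 is just a placeholder for a process you own), the same ptrace access used by debuggers can be exercised directly from the command line:

    # Trace the system calls of an already-running process
    strace -p 1234

    # Attach the GNU Debugger to it and inspect or modify its memory
    gdb -p 1234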

The problem is this is allowed by default.

My wife does not debug programs, so why is she allowed to debug them? As a matter of fact, most of the time I am not debugging applications either, so it would be more secure if we could disable it by default.
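
As a sketch of how to lock this down (assuming your distribution's policy ships the deny_ptrace boolean, as Fedora and RHEL do), the standard SELinux tools can check and flip it:

    # Show the current setting
    getsebool deny_ptrace

    # Block ptrace system-wide; -P makes the change persist across reboots
    setsebool -P deny_ptrace 1

    # Re-enable ptrace when you actually need to debug something
    setsebool -P deny_ptrace 0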

Note: Disabling ptrace can break some bug-trapping tools that attempt to collect crash data.
site-packages/sepolicy/help/users.png  (binary PNG screenshot data not reproduced)
擐��iqH{B���\�8�T(%��"
"/Hr_B�f)p䖽��Q׮�q\�j�x%K��@��,!q�������1848�?@Z������A��-m-����Ujw��w�%!!!!�kAB�M�f2��7=�0�k
r�"}�%�Qс��Nȿ:X�ө�
8�j����:D�c��;�a���]b�x:�)qm�և�k�Su�co����6�t���}�c4y1��s��E���W����͚#Sw�]��#��η�m���7v����kv�?;����nO]J\K���]��/����Jr��ypsh
t�8ܩ-uv]��!v��^���s����A;L�wߝ�v\u����~��}]������]�S*U��!
������iD���t�6;k	��_��պ�����]8���a]]-��!�����v�����=%�I\EQ�T�v�����b�j��А0��JJ�y���>�e..Z/O/��f+)-����Q2��\T�E�|�9N�9z]ee��`DA�Ն��8�<�i���*,,�$I�/g�(q��h!"�br�!Q��Ɔ��& �J�*8(X&��.�|������ID')���UU�J�����Q����e�Q1HU -g���/�$�Mˑ�K�Q��f0�+�*��"4
����l1�����������u�M�2�
ruuu<^U]%A!�!����8}D;M�9��~\���\�\ȑ���l�t:���]��Z[!�7�?t�@����GY�=u:��#�÷_~~��``�\k`��&���!�B��~�h����)�N��ca�a����(U���q�[���}2���=;Y�3
k֮J=�r2���S' �������EO�l�ܼ;s�(�s��s�{��~���w`����O�N߰q]��#��}�-��S̝��ĵ�b��,�9�j�Z8���Y>Ng�ڸi]��;wm_�b�ޠGw��o��]'�R�jk��vR����|�����u�1D{�n�5C;ވ䏳���rjZ�Q��4�ܴ���4��t������c�oG�X,敫��ع
�~*
��R��}v݆��3N�ڳ�����TVT��Y������ٰi��jAy�:�b�U:���sf�%�\��ܭ�f�Q�(��g2���M���8� ��ۻ�700mw�8�T(�


�j��U�

V���խ_��g2��}���O�����2L�k+)-3z,�8�er�!z�2�L&VUWj4.Ç��X,��d2��!�L&��X����0~���*�o"�Ɯ�왷�ѯo;k�H� �nnn��}\&���r���d�!�LAQT^^��l2���!a��a'��$��x�����˧��j��q9�ٴ͖�<�cǏ�0d��;��9�j�fd�b���$�T*qE1��	�͖<(Y�q���.*.�0l��J�R��۠�p������9$IVVUlܴ����
v.�N�"�~��1�.���Z���w�qwsssV���}��LO+*.���mim9r���eЀ��*�ݞ��'�DJdD�Z�����9Y�y������ۧ�q�Z�X��d�����u�O=֯��Z}<5��E�����aCG�R����YYg�O�|�ZW�A�|Œ��F0肬](:1��++/kmm���Ei��er��g��j4�G�>.�� \0U

~Ⱨj�j��Xb4X�=��	�������|iY	˲�8H.Wde�mnn��d��A��a�C�`�Zӵ���f�j?_���f@O�N?����#��T�N+rݬy͸嶆Ɔ��!�T�qȖ��U�>����
�x�C�9�M9~�]����NM;�a����K����jkkX�MI=FD{�n���{��B���Ap�@oimk]�q�^�g9�e��[7677��ص
P\R������!A�Z��3M7w���f��<���
��4���w�����9]�s��h49v�a��ܬ����w��VW_�y�`���O���i]KKK�n�|,娋V�x�\&?xh���UQQ>z�X��������Bl���y��-�7�5[�6�����yYu��6�1���u�u�e�׬���
NSpQ�z3���@�\��ا���a�����}4�H��Z��	���8ng�A��f�?�������p�tzss#�a<�S%��rER� ��VF��cǏ�,���4C�W�egg����:�NQԦ-�[��
���F�$��BGj���Ѹ�\����z���KRb�N��5�����JJLZ�fUeUŰ�ùy(��TL�0Y!���ӷ{��A��ܚ �����~�z��U}����X-8����l߱�����#� ��H�<���}������H���9t�`~A^[[��u���}$�Ǎ�y&��Y�gw��>l�0$�;�8����D�>:��{�u:��{	��8��3���܃sF���������$p�		;f|hhxKK�R���d8�k�j��n�~kaQIQ#G�:�:�eYA<=<ǎ��h�
e]}]ss�R�Һ�ڬV���\]�
��4߿Ztޚ0y����7nŽ��r�rd��$�9�|����y'
�j@����1�����N��պ���j4.AP�L�P���O�<5����XG���Ƀ���%�����ޮ�0~����`Y���\�RY����*�B���²�����M���!R��_�
����܇mӵ������2�֮!���:f���5��~�5U&�Q�Ri]�������AC���v��yW�+Z�Q*U�ƅ HDâ"�U*��a#���U��L&Gͩ���$I�Z���j2xXmm͉�GA3t}}��C��l6� k�%�=�77���~rss�=�䌨ӊ��e��?j�h��bもBr��Q��D������x�'����o\:N��<���1e�4��`Y��8��j�3v__��c��7�������o���M����Z-�U
�B�q�y�d2uJ���9�<}:}�
3�ߚ<hH����>,ˮ߸��(WW7Q]�����
��uf�y�I��:|�j���v;�rv;����ҕR�:s6�j�AR�
�Z-�g2����m���[6�8VQY~�=�t�6��j�3���5k�����b��sw��3�L�4ֻ�tښ���6o�����L&Ө5<ϙ���l��DQ7v�(
����F����z�2���j�i�l6��r�Rӎ�er��Y,����JJK�"�CBBѢ��bV�TÆ�(,,X�a�]w��qII9�T*1��<s6�������d2��&��A�k�0,�QWWm�~{��n
�@qss����

�q���NcQKJ���������q��d4m

		�8�d6��]Ń�����5�ƆW�kzF�GzZss�D�r�r��C08�?1!)1!	�o4��ۅ��E���ۚ׊�xB|�J����>|��3z��E��j-V�������B������:�V�1���Ĩ�hd~��Y�9�l6��cO��N�HbII���fs|\ByEySSC``��f����~�����1qÆ߼u��@�������ʪ
;k7[��;$$4#�Z������Nj�njj
NLHR*����'�P������f�FG�\lQ���;���
���-�N�f248y��C�l6��d2�"""I��i�fh����������V��l6���WbB��
�qihl�4q���uuUxX�*zyz��y��4?jԘȈ(�n����aĈQ�Q1>>���-������חa���H�//��^l6EQ_(</�Lƶ�6;k=j\DD$˲�X-V��dg���X��X�N�6�����Ӹ���p,������i��CC�"""�U*UXh�aF�a�mw����6=������������q�A`Yv���V����E�T��������A���_:�v�@�A�Tz{���������X�eY�����1m�w�B������`�Z��=�=�VKHH�R�rպ�����r���g6������B��t��A�!!�M�M*�*�? "�Ќ�N�l� �5V���F����p��H���O�����6�7W74��y.$$ԑ5� }}���P�Fe�Y��Dw�,K�dxxj��b�q��ngh�b�������7�4���""�ƎO�R@bY����H?��B�xB|�N����﯐+CC��v�B�2x�L&kkkmӵ����9��]צkS*��aݶ�#�V����U�Cm����.[��鷸��K;f7��:��C=NB�8�Vv��8��ث��%�y��Z=�=��/]V�nKK��j��dk�5m�-Q�їY��V����ع��>��e�ӼΪ���8�q��O�
�)}�IM���C�K:�y��5���~4��4��Og�}g}jp�2�y�~���#Χ=���%tj���t��t\�,R�����N��m�q׹�9�r��V�����D�����'v�5G��
\DC�S�K��`��|��~¡C�EFD��Vs�ۃc�#�nc�hr�_םD�%�tJb�1w���W:���>�"��u��m�w�}�0��趺;E���eO��9P�uR~��)
t��ݶ�N���Kۀv��{�A�J��������n��5|��E����'?�"��.�,E���#qÃ���]����Bn�Գ�X�~�<BBئk�y^��w��a��l�9Q���H�V��7M�͠��T��X-�M�=�����z�BH����h���Y��9�D@D����@)!��8�=�����F9F*�$H^�EQ$I�8.��IHHH���+t"�[]			���s�$@w��U~��� 
��1ō�H�_š!s�G�zѪ	�ݟ呄���1H�Cⲹ�)��^��΢�e>��a�Znj��t�[b��8��Q(���p~EW/
���^�8F�H�ð�v���O�0���,m{z��C���#!q3�,
 �����`RB¬{���]y֐òs�#'/o�-��s|@�R�����;*���I9�z>P��
B �I3�� ^�HYc�u�� q=;\d@���|���~�:�aؙ���W�~ۭ���+���e��Ζ`�Ώ���jH�ٷɲ�ƍ�w��O�,]���.0bߢ�O>�b�
���|�aO'���%K��JJ����ƻn���O?O�<�TFfVv���F���74@�}�{���w� �~~~v������[��{�>o��'lݾC��o�
���,�mp���}�ŗ�}��ã}*oHP�WVU��n}kk����<��y���7_{���x�mr�����߸��]�m���'|}}���BCQ$�r�X�����E�y�g)�E U�MjK���=�r\�T���:��sƏ��{�=���Z��<J�����;v�����:rd�3�O�4a�cO=�_P�[3�'Bx���ә��.�pܘ�<����TVUrs�Fcn^^��}��]{���ګM�M�O�546�d�W^xa��=s�/  `��M�1�6m�����������fe���% qn�aӖ�ܼ'}4"<|��!�QQ����;��'\4���1lXtT�'�EGEM?~҄	+������z��:&=�7�k����/a�a͙�q��f���xz�cn2�{:��4w~��;��{�WWmKKK``@&�\4���N�_���¾II$IzyyB�f����	�&"<,(00<<L�V���v�B������n�Z��q|˶�J�b��(���6IQ�\����b�|oxP���[�O���ٹ���^r�,"<� �����vs!D^S�
��F���-�Q�c��0�F��߯���O��H\]�"�L&���Z[[��BL�R���jj�0z���t2�i�|���A���SsK���VV�d2�&?�{�X\�?Ǎ}���}OM��l>�>GS�;~<��Y��z��b�8������ng���]����<o�X����Ƀ�[�Ѱ�C�����d�_k�X�������}��]�w�*.)�%�t��k晳AA�E�ܵ����FӋ�\��޾y�V777__�������&�PP���u��f��`4�t�$z4�7����ÆI?�QRZ:��{7l޲z�/��f��)EuR	�-@9��<{�_�>
������r��_�����Dž������w�"���OXHș��ؘ�!���$'[��ں��c��M�������.\\4�>>>!���>>���Q��j�*44$�����}��y�)McSSiY�a4����3����e��O���!I��֍
���-G�}��JeTddsKKBB����g��]]]'O��j��յ���\��tpPPDxxdx��Vz��G�!������7��{�������?i„�f�zs�0���<5���Q�n��V�R9�����ؘ�=H����U��ػ��є���[6=$$$�
:�S��%�5���Y���t8M��0d�SwR��ePH�#��ٹ��F�p���I8iw�x��z/��%�Zs��s�1D��7�smvr�!qs�0��F"����</�_�+��f����4BwΏ�?�:�~DZ��y�I�L�v�h%��:-r��\_��g2����~�+qm���h��dB�K�ٿR#B�\��Y�������c,
%$$$�On�y�������"�~			���ʣ�����Hy�a�FBE����6{���GJ����I���%$�F�{��5�!�{]�.���T[Hg)�"u]g�o�/9�qqA�蓈�d����,��謡�Nl8F7ε�a�wT�sl�3"�d�&�Y8@+*+�����4p@O'�Jqt���6���==ǎݻ��
9��3;�뎡���}ǵC�c�v2���(*2��	g�R����ktTe7��;�����m�K�oB���٬�?������H�����E�n.�5�sK�o�C��G�**�r�g�a���G��c�SW�^�y�\!������
ڲm�B!W�T�����AA�Z�����+V|��W��$��9�l))-;�v2:*�h��Ʀ�����hnf6nٲy����~�>�O����׺u�aa���E���ɍ���(*%5u��Uz�^.�����\�J���򩌌��O=�T*{݌X��j���p��ٹ�����qcF?��C��!.]6c�t� zcc@i�ohX�j��#G���e�v�$_{���Æ.����A��ܮ�9D��ò�s�,_1c����ʓ��fgO;ؼm{cc�/�koo�����Oh�y��w�.ھ}�H�
	���"�4TZ^�s�n����O�����y����nڲ�h���8z,������#%�e�u6�����W�>>m:�?/lljz�����Ѩ��"#�CC���"�䑣���I�@2�����oi��A������Ąx@LT��M�������ǟ�74�3�n����IJxyz�{���Ճ�
���=6$yPߤ�A��e2�\�u���Xm���v�����$Iǟ{f~YYYvN��Vٯo�4�����psuU�T����u�;~�Ι�����!�h4*�
�_[W_YU]SW���9���yl��cǏ���zxx�����?*2�	9j_.��5rܘ��HU����O�����湺�B��kF�	5��j�O>�����J�R�R�WTA��l�n�=���ve�����O�L�XVQ*��R�Қ������mFHP�\.�y�A�	�3{�g_4`���۱����
_��|A8�
T�T�%%�--9�y}�AǗS�
����V��6:2���z��1� d�=���YRR������q�̙J��]�������&��T�O��8��B.���'�����D��v�_/I\9B��4�Y�}a@�~3�O[�ju����˖
��_�V�Rk��^�B�x쑹/�����睷�|�&x"=�U����sV��ι`�?4$�k������;=����̐��aC��9"5--%��aQQQ�׬M9�:i��I����tm�qqK��((,�+(4
S&Mliiݽw������A�AA@���@5���PS[{$%���r�A�ǎٵ{OUMMTD��c��b��۟y欗������?�,-+������I
:y�J��j]�\����:+'G�T���zxxD��K��7	h���׬M9~b҄��ǎ����9n��-1��O>�����)��`�}׮��w$&&�9s������φ͛I�|�g�J%���[���A��}�М�H���S+q-plO�{�?�ϝ;8yP/�IHHtK���!�f�y�o��֒$y��i�w���D�C?������y:�����`7�Y�� jjk�\����%<,l�H%վC�ya����Ñ���~�c�#U�MH'�ZjH=����5�@;��딘ϋ~QU*���1�́�pV��r�ui�/]/�
&q��l������y>  5a�B%�n�7t�%U����o��G�z���m^������:�ۻ>bW��~����,%$$��Ά$k<73R�KH�$t�R����j_B�&��.TIHHHH�c${���@"�4Q��;m�ƛV^~�.�z�e�E�4An��ДD�D�t��ߩ��h�X�_��$���?����E�� �~�%J�r€[��u���E�g�x:���y�瑹m��Q��U���bC����S�η�H|���u���P+t\;~����g�=k2�kj�sr�V.
��Y��I:���r:3���	��N�����Y��U��X��9K?aaQqaQ��VW��)��0�(��ֶ�t�C������̴X,�AccSF��$�-kn^~VN˲����y	�l��ٌ;�N�\蚧��*�7*-V�-���Ç!�~���--�-��߾g��323��쟧��9K\E�$9g帾a�������s��S��Y�i��O>��
m�����;��8����Y0����;c��ϣ0��d�0���i6o�i�I'���!��g~���Je2�>�h�R����x�onny��F��g߾#�RDQ���x�ל_wކ�����tF��4�a***�?]U]������ko/+��p�����ǻ���h4���;~�x�	WWW��KqI���(��򪬪�/,������d��_Nn�P�R���tww���K��E�ǎ������L���/(�nhlܼu[uMMTdğ˖����~�6o���y���‚¢�¢��Ґ���۽%�1�L�t;v����������BC�~�u�B�l��o��$Z/]��iz߁g����6o�VS[�헟�IJ��ǟƏs�H-��7MI	,-+_�i��?�2m������o��hww���]{�(�������,���[�M��E!!!�������{�^Q�|}�"#�Mkk��Q#�/�x����y���ʔԴ��'2�����s���S�����z���ؿ֭�8nr���^?�lV�ǟ�a�[ラ�u6+'�Dž�ZZ[>��k���[:T\R�ӢE��֭?z����[~�c	%���������J���lni�w���|WW_O3prl���Ϝ=��˯��ڀdٱ�A��,]&�����l6aCc#�r�uuEMIY��bB��k������tl%��e���V7��rҚ���9�m,��_�����l����mm�ƌ�9�n�765�t2�!h}��9vP��PV^>f�H@tT���O�����Z����~銕�/Y*�ɛ�
�����c�L&KLL(*.��P]}]n^�w?��q��`�/(�߯/��2�,)117/�aXj��~Z�q�N� IR!�s<@�:C�&N_S[�h4�'�GP([��=��A���9��^ �18�߬{7z̃��/-+���aXQqIIi���*������j)����Q(f�91!��M���������D��L�0��GA�6.\\4
����&
����
��b���:dpDx�3O>���M���>1!!6&�޻��>mʈa��{0�j��{�}߽�4���@�$Ɂ465�x{{zz�,����Ad�����0�?��JQ����(��6��<N�MMMMMz�>1>^�P�-���(Ǒ�F� �k/�4z���W?8�'}dp� �$u�����v�\���n�>{ֽ��}x��1Hp;���2���`g��/��N&��w��hH=lȐ/>�(/�������y�qg�X��<����Ə��O1	�8������d4�bc?x���3~�i�����f���_}����u:�[�Z�M��-��MM��a��fsKK���z�0Q����әg>��S�Z���VUSx��,;�?��β��bX,����l�*��\��Y{OgH�'aYv��߽o��^~A��aC�n����������z��V��;y�����K��Y�~ڔɻ����v�bhf�	sf��������DG}��;gμ��ƍ���/,*����/,7zt~A��d�����<#�"˲Æ��/pSPXh6����22��6����t��b�������lܼ���������/2�d>��3f�P�����:|�P;kw|Z�eojni��G�y�l���H����7f����?�|��3��]�.'O�޳�^�:dpYy��?�$Ir�ĉ�Ǎ---;|4����_��MMMI�MJ:v<���;u:c�޽���1�W����%p|��	���=]�7/h��u���W8>eҤؘ���-[�
8'��S���t��a�AAz�~�������!�!���+w��?���{s%�1���f�7?�p:#3*2b�qC���rr~_�Tė��~�umC����ǟ�����<qbRb��l^���ʪ��}��LJeYd���7�&����sww_��'N�|h��}�|���r���;�شeۖm�i�6t����"�0�}�#��m�AH����ǟ��7��g�f�;v�;w�Y�fmHH�s��/.)Y�x���ߑ�Gkj���/�V�Y�a��>����}/b��b����Ͽ���o���-V��F���F��r�9��� H�"��c5-�l6��H\?8�����Q�Tv�!ߵr{:=�q&�Y���X�eF&��t���(�Ju��4v��$ɮ-��`dY�0�v�#6�ŪT*PA�m���CK
M(�r<��0"-�B]]� 0 \�^��5���8�c�n���8#��.�#����تE��q��DE	}i`x��J�6 ����]+W�S	� �\]���:~=��+B�ڑ5$������{� �S�؊
D�V�?/'L'7�.EQ�/��Yi�XP�y�Au͘�P,���;\��?�yaD�_��A�:�S�t��n��JH�+�?�D�%�����bK��.'�E�qq�z�/�ߨa���b/'�ڗ�[�<q�������\��4ؗ������NTIHHH�tH�_BBB�㟈~i�f@�Τ���o�p=7��6a��rG�齹��p.G�%���M��-�_N�Nx���Z�c��S\���c��c�ƫE	G-#S��~'�e����<�qF��۶صm8��F�� �������1���w%ow�N蒪N�.�3�}{���s.�c>}agv��D�Q:E�m�������kH�C#�h4�<c�
����f3�0]�;�a6��j��cY�ً��Xa:�etb��q�e���q��b��L��W�t�X��L�KPKk�����I�wn0�6X����	
u�(v��싯���;�����O�^B��|��乩�-���@��sLtԔI�G]��S�v�KKK��=װ�өgoG�b"�⣬۷_�T��Ń���#m6o>z�8E��{4*2R��ԻsR�!��[�557?��c�烮You�*�f�}�ͷ%e�j���|�p�����?q0��y���<�h�΅�}�
�7�,{��3�9�q.8��K�A�iw���œ'N���O�����������|�a�O��������W^|>*2r뎝۶��y~�s|�\8ז�fkni).)5[,,������L���"�����%��sss� �ٺQ@-����tF&˲9yy���v��3��Ɠ�

[�o�/,�s��#�R�]]�3��u�v����������㸬��Ҳ2�aϙ�mmme�v�稸����O�:�A���32��,�q\Ii陳Y�%�̙���e���~6;����ҥ��3�C�0̩����:������O��hni����8sF:�Q�AN,�7�hP����0��`���9y��lF�/)-�<s�j�"O8E��v�B���ZZVv:#B��ظt�ʑ#�O�:E�T�Ҳ��3h@��ں:$OMfsvNNVN��b��Q>W�㲲s
���-Y�q���'&���w��{�
E��g��{�H?��/?Ϲ�W�}g�ٜMcV�Ԝ���y���rŪ��{����Z�|eYy�N��ȱAZzz�^_^Qq>̊�%�e�+!Ե���:��'��O?�a��o��ۧ��?~�����j�s|�_:��ſ���w߼��[��nG�ܑ�̷�����o755i4�?�.?���}Ǯ��@A�`AiYٞ}����YQQ��$��7h`�u��?�/7M$��>z�M��߯���̔�6[,eee啕c���E���c���Ws����-[��
����������%�

��~~�.��_߾2�z�7���w����"K�'N�=p�eY�B��{��l���3S&MZ�|��
��֬[o�ږ�\շOw�7����¦�fw7����TTV�߿o߯��������C���导�� �{[[�����
�Px�R���x�ں���_AQ�~�sIVv���A�����~+V�پsWeU�]�n�6m����6l������)((������K2x�^|����hJJeU�J�ںc�J�ܱkORb�ǟQRZr,5���x���}�u]]݁C���*իo���Қ��:8y�(�/ݖϿќP����GEDn۾c���|������m�8z�H�J��(�?�����#�--A����<(!>."<l��Cn�����
����X��h4*��2���:y���[aq1ō[�`�j]�|����LVYU�T*�N����V\Rj����I����.��������..-���

cG�jiiٳo߳���>��[w�9c����p߁C^^����h4�i'?����C�L�0MJ����l�GD�/x�ݥ�W`������,Y����u��!ξ�o��?x��[�WUU�r��F�f�Ś<h���䐐��'>���R�QkT��wg�93}��������cnj=��"�(���gڔ��9[�2��>��S&M
	I��=z�XTT������ok[�����[g(��Q#Fh4�]{���8��Wp�ԩ��}��l��6�fϺ/(0��>��K/����Y99Q�b���+y��##"Ǝ�ӷϟ7��SG��<6�ᴓ�[��3zdtt��}�'��>m�(�?����1B���UTdħ>D�TqG}�q4�&IB@pם�O�0��?���z2�j�ZQ?�����;$9����ݸy����SO<���?��'���~n�����h4�U�OΣi�o��<{6�x�s>q2}��a�~�~o����W77�����q8��7*po/�[o����N34�s�!�[�� V��βǏ

�X�\G�H�hmm=��Q[W7g�,�Lf�PTR��8�[,����ނ��O�|�̙����sQ9��ʿ�L�4񎙷9ڕ�n0��瞂�@b�f����E?�������52'/��������n�Y��p��x��Β$A�Z�,KQ��a�6+�vzo�j�wݹ�wV�����P(&���X��_}�WP�x�u�e�{��E�8��7��gC������E�w�~{کS�7n?f��>28�?/��_P��…�yy�aaA����,�B0�!��r��[����Ͼ�reUN`���J����x�s��h4�$P*��p��oDD��G�Y?�q�Á����tW�P�<o���gA1��,�y�$	d'�1�\�|�Y��8�(�qL!�s,�R)�<�'��0�vO����
|���ϿXYY�{�~��Ç�Z���go�0�����/>6��?�x�Æ�� 7/_�k�:y�_z��/�H4��|��{�֬��G�q��j5�������A�C@��}�fe;�0���0�s2E���C�Ա�T�����"#[�oߺ}�g}(��A�8.1!�?O?��LaQ�s<�ݲ�x���O����7��b�>7�iWW��~����qܻ��f}c��[~\��3O=���!q@]q�
E%%$Ez���j�!!�����Euu���@T*���Q�1I�#ǎ��k�V�5�MO=��\.G�����
�b��a��v�Ͽ��㸟�/I!!�����/?������E�q����h4r����(�ds��GWVW�2m�cGԱ��T*��s���'ww�+|���/I�C޼u�O�,z��'�<p��E��x��v
[1�X�����x���C�Q4��8x��E5dp��_}���WUWS$��Ӡ&<,l�葯��fXh��s�������d�mn����z�#G�9�"��a��`�s
!Ƀ&x��aW�����]w��+��߻�UVUO76#��ϋ~usss�/��3uTG����|���x��}��m�}�n�W�5A���d6{��S9v����'�?C�����h4,�B	�@�IFQZ�Kpp����(�=*>.��_�y�O�{�7�fe�|�̙'�w
��^�mFG���Æ���[^���l6�[﾿�w���3����=�0������~� >.�

���O}���F���m��Z���y{����0v�F
`9���Q��L��l���V///d̽M��8��ۻ�kD�a4����5MdD8�����6o�s���\]��� �v�ը�&��"I�Lf4�Z[��y��� ��������_y%y�@A ����665E�����B��h|��y��oh@����8��/(puu��pW*��z}cc�����V�0E�p3[,Z��,������B�Uy;pj�Ǖ�W���8M���h4)�
��fS�T�uuH�N�T�j�4Mk�گ��^�޾�w�+�kjL&s���Z��i��(���F��0����j�����d2��`4��|}	�0��ZE����M&�Zm�Z)��F�RQ4�-Z�J�(
!��
��j�UUWi�ZO�noii!p�x/OO$��m��"���
`���: �_�;W��iim���
���P^Q����p�Z������0��׵��e2�R�7��k���<=<.'LCc#��A�Y{`@�aF����)4$��H��f�Z��F��j���wSss~A!�aC��;< {�ݻjq�M�����ʓ����Z��򊊄��N1�>+�x��7^}%&:��i�N���?ji����_XT������~91t�FQ��|w�s�A�%:���vͲ�\����	s�h�%��Yt_�UK��
���&�^G��z����̻�N�J�21!�ӱǵ�h�6uJTd$pRj]�
B�c%�B�1���Zown:]w��nF�ؘ�]�S���.��N��6]#��U;��Z���I��t���0�Ԫ���N%s9a:5����m�k���t]��U(щKTq��m'���a<=<���ޮ	��o�X���s���_�{��_fW�f�]��m�.�ۊ��0���^��%�|��K���������3���z�$�tQBB�7�藐���鸔�Ƌ�8N�;8����t����}���5�2�^�$��YH��:雨��o.�^���.V)��#���F^uݩ�;�^v��R���[����ĵ�ok\+5�U���^�=;^}�i�|%KGȮ���\oB�Ҧ�{5��q��`�s�-,���G-C�#�����˝����d6�� �Iٺc��?�E�{��M[��퓤R*o��7!�5(��l�(��r��?�l޺
�r�����!��/�Y����#�+W�e�X�Y���?{;M�v;K�d'5��u���h�%�G�{iǯ��roZz��9~"-"<L��ҭ����E�:|����خʦ׀N�b��.�/,�߷EQ�ZVt:T�����.�xjlL����+"������ٗ_�ع������׹�U|�0��S�3>��K�ɘ�����?/�c�����������b�_�33�&%*��G�~��O�v�T�a��(d��N!�y�]�E�n�����(�F��]��0���fɲ��?���!;7��y��\]�
�I����4����uuo��N�E����l6C�K9p��?v��a^��������̮u��h4�:,R���s�l6� 
�����Vx�7�L�YY
�
����r]{;�0�6��vd�~�
���'���z=2��
��M��<M�z��b��a��5g�q\�N�H����z��aZ��N�t{{;�a4M��p���N�y��-��!�'B��f���/)+]l,v[>�F[2�f��zͺ�iio��N�����&�w!�X,�Bx*#��e˟��THpл,N��B���z=�a�--��m3fL�:像?iniq�{Ԯ ��a>����vd-Q�L&sb��������ڡ#G׬[�������Q���n�nO�{��s����6��y�v~�!�������=j����뗮X���d0����2�_��D���SSm6ZŤ�d���Y�beVvN�>}�p�R��q���*�tF�O�����ժ�i[�m����߷��i�-Y�w��F�o���������X��Q���8�H>���������2x0A�!�o��i�����>�b�2���w�MLHظyˆM[Ned��g�?�.۵g������˛o�s"�䙬�Ӧm޶}��5��9�r�DZڦ-�'rX~����߸9'/������<{���eİa�V�ڲm{j��������e��lں5��`���$I�DE9�5���S�~��єT?_�"ѹ;���^�k7l\�j�����rrF�y�࡯��!5--:*�`4~��쭬�ؿ���z�x��
E ��t�6l$2���}�~��w��ݰi��l�קϟ˖oݾ��ɓq��J���O>ݽw���3	qqɃ�`W>W���s�홙g��z�9��K��뛔��5�ǍU*�q�Ҽm��e+W��H����w�`�Ą	��
8p˶��>���A�Pddf��ˢ��T7Wײ�
�a�>�@TdDVv6��+V�E�thHțWc�.b�uu�=����Ȉ��\�͖�!lmm[���;w��lI�	��,�557�5Ro0�ݰ�՗^���۸y˭�L
��𨨬2�
cF���1k�o�6y�EG��%��32�9sܘ�cG�ꓔ��+/������a��A&�ž���'N=j��%��1`6��&#��M����B��A���ӧkj�~�AL�4)1!��

6tȓ�=���s��	��MJ|����s��w�u��<���F���k�"��{����>4gvxX�J�<�r����ٶcGpP���(j����{���{�K��������0���2���͟_\R�ɇ����'�4���=$8��GEF�7n��}�����7�;���tي�@�B����I��rޖm�����j�ٌ�Gᅴ��|�ع;��lKK˸����������u7��s�}� 4$��>t�_����}�.��y��^~i�mÇ
���y���ZmCS�^oxt�C����#G�KK�o�4~�⒒��7M9�����f���\>�F����:Σ|�Ge2����U*�L.�0��9�%�T�â�|�����v
��V���K>��ӗ_{#5���윪ꚇ�3h���Z��ko�'Ǎ��_�BC~���ӧ�zTJ�J�&���+V�-��3nA�Ȳ,�YZ,b�~�J����L;��e�y�?�b���ihh��V�͹�/Xǟ0n�W��q�YDQ� �r�p����k�A��� �$	 �"I���wέ���L��ׅ��ĥ�q0��y�<��7?��q��Z�s�R��߸iϾ�*��	 I���I
�>t�}�\�a�q�R	EQ�/Z�P(�����o��ִ��-۶�3B���?)�7�~���a���U�T���&�X!��@T��Z��R�D===��W�v�o��ko��(�v��MG�d�^?y��I��h������O�T�h4J��;�@���R�H��d�|w��\.�����>�>M�͢BC�I��d2EGE|�}�����\�Z�V�����H�$U*%����U��h]\�\��H�l׷O�8q�ĉ������H�\��7F����J�ko�e4����N�:
��˧(����Em���1���>��ɉ��9q2������:~����_~�~��A��}��O�7o铔t6;ER^^���P(j��#���������C�e�1�Q躱�)<,�P(З�$I%���Y6���y欏�7�ay��,��w���A�Ǎ���?��'�O9g缆�a'�O������W������^����d2a�Zy�7�L���
iuM���
��d�:yR\l��_k]4g���y[Oח�߃�;v�6
J�����nw�"?�����{�����Ϗ����9�5[�6�
p�tƎݻ5j��b�}�}4Ø�f@Qq	�06&�D�ߴI�R��M�����_�rr_y����a�f�Z���n0�===9������ǟ�tm�3F�n4�h��y���`0 K��X���U{%oר�z����5>.væ�k֯�e���P�N������ZW�`4z��ngE Z,A���;�t޿__� ��l����
<�ר����u���c2�Ѐ���)*""4$��7�JJL�y�;f����"#ZumO=��K����<�rȵ��h��l6��v��Ȳ,�q�M������\]�2ٴ)�_|��_���(**^�j����Q>���
ɹ�[ZZ}��7�z���)'/���o�=��	ǭ۰��(������������	c�������7��ʂ¢;o��ҫ�����;|��W��ӧ=p���>����,&:z¸�;��q���5c�#LylL��c�nظu����6�Z���}��E�9������[Z����}feeϹ�ϋ~%mA%&�;��
�������<�=��\.п���{����Q#y�W��w�qGHHpߤ>^��~�qq�ȷ�(���~��~����}|�EA��;cb��2�6��{ ���������%p<y�@�F6jĈV�.:*r��)!��}=��##£"#�J%��I���
��722B�P<��K�a��(�E�l������F����\���`IDAT�
���
�����ٳ���o�GGFN�<))11.&�j�M�:�ORR\LLxX���kRb���gxXh\L�� ��+~��̎��	��twu

<h�N�N�dhhH`@@bB���GlttXh���{��D7WW����������������b��:|ˠ!�(���}gTTdlttdx�B����>|E�E������Qk�u���GLtTP`�Z�NJL��N��������׷O`@�Z�I���GI%I24$���w�AF�i���cF�"p�$���@??T>�"�ÔJ��!Q�$I����q��Z��C���cp� Л��a�(�74L�:e����5�㹱cFϘ6�Y��ћ�̺��Q#��岸��9��#���#(00**2>.�l�D��yyy�~aw��a�戈p/OO��;v�(OϠ��Ȉ�8�s| 8(�F�C���V�BCC����


A<���~��h���wo�_B⟁Z��`<|��-ӦD�G���K?8'y�K����U|�?���C���"#��u���{W���ef�bf�;)�wo��"
�]�l3�z����wl՞�ס��J��c9:G���@�ʯ ��t�D��t�}�q���/�}�	[��w���j]f�z��R�8����'1t(5��w�u4$�3z#j��`W���)gd�s�ӡ��(
����;�C���)��\�/��]�ڑ磛�đ�N�sm�z9����8
�Ws�Y�*Ky^�3�]��߆qT.mhu�ŐjCWQ_�i�/q�AM��ʿ����z��<��2N���y�/5P��O���]��zz��8�Kr_�Gj�=@��/�٭��$�%$$$n:$�/!!!q�A\yx�����
̥wV%nr�%�K��R�1�w>�;)�\I��ϟ!��y��ա#�#�hT��9�2�uQ=����ƫW�����Wg�t�!�"!vy-�r:�߆Ag��x���%ʽ��~g�.�S�����d2�ڻ7,4��OW]:�C=Y�(�~;p\;*8Uh��Gc�ꚚԴ�Q�H��R���qOgZ�_�
G�IMK�c�v]\llO'�J��0���x���Œ�|����ǒ��ii�aaj���y��>��&��8�rSU]�z�����w�Ν�]�aXhp��jݺm��-[kjjcc�q���[����))^�^���׏��(Y�/��qtLI�=�X,�u6�V���~u�]�ܐc�^�s�;�i΍��_S[�y�6�[�}���$�o`�Bx��ᅿ�6h��-�v�����BO��gyXV^��ǟDFD�����#�$�g�0�X,o��r|���ߴ�4�16wD�5t���,y�����8t��?��ٷ@���|�Ñc)I����铔~���?�Z�Z�BB"����#����	�,�L��k���N�8^.WTUW?8����3YY��ϥ�J�+Fz�m�~��W

��a�=3?'/o��U��q��ݏ?�4Md`��٬�iS��:}��1����k־��Ngf���=�Ѓ�לӍ��ں�5��757Ͼ���R�	c��ܽB8t��������޻����/l6z����g��llj
	.)-�c�rA����TJeAa��<2�?�ðܼ��[���=w�ٷO�4�@z��Ⴧ���ʗ_x~�;��	�QQ�~��Y�TNC�^JsvNΖ�;l6���?w���C��s�3�M}��g
��bcbЖ'�M�MM?0�M��j]�z�q��/�|<�D^AARB���c����‚���Jห��y��W�8p�-����%�cnj���;�O�*p�l�f��8y���^�#p|�
cF�|��G��mt����ѣi�ٵ{����B�p������u7��=}��������ػ�R�8���v���g@�~�(N�<���MIٹ{�o,�>u�'���dF��t�g�zJ��M�����t�
�?vX,��ӧu�
�9���{�@��׺�g��y�Ȉp����M[�w�
�_�n}�٬��>(��{�떩S�o�x��ѵ�7<�ԓw�y;�������՗_:�z"*2b�	�����hlj��ϟu�=!�A@Z�@‘f�/��.4$dֽ�0��j�|��0Gv{#HH��pMӏ>��Z�njn	
	(
O��3g?���^z�tF���2Μ}��CCC�kjP$~��
�������Z�fmKK˛��:dpr]C�#���OkK+B�j͚���7_}uȠd$'Y;�q����$	/O@`` m�y�'���f톍͙�bسw�Ͽ.V��h=ʑ�D��I������>v,2"<,$d��mM���Ǝ1l��yf����Ns�r<5��i���b��1�}}}4� �!A��~��!��ё�Z���n��F� ��=]��f�O=���	��{Vv����w�ޣT*�$%>p����&-�����
�źw�������3A�a��~��Ea����l�����-����j{:�W4���d_}�iCC��+I�T������FBw7��N�?����y���k᯿�
_���<��bmo�����v���ȈG~x��	/.))

Fr������9r��S��S&Op�j����:�446��뜼��'������EQ2����m�٪�k�UU��q���g_<��ܤ����nݿs{MmmJjj��ZE�%��n��Z�q˴Ͼ�������37?���=<,l�֭���;��>l(m���U*�w?�t�豚�ZA�[Z��d2�<ߦӡ���֮[�~���{ryKK��=ڮ�t�J�z������n�F�
��:���!ܳo���Y��#�
kmm�׷Om]݆M�S��"��-��9hc�f���
�6�ݰq՚��33m6
#M�n<�����Ǐs4%���h�=w�ݰq��m_|�ͤ�d2Y/��
!�8���x���U��{���;�de��k�������js�r��I�X�fͨ�#8������|�M3S&M��ǟ��s&O���?9���7z�H���������I��/����G����a�����С��3;w�Q����\�˦�ۖ�Z5��{����{��u�>rð�۶9v���#V�����9;h��
�=�����������ߺ}ǝ�ߎ|A��q�c���ƪ��S�3�>l�1crrsi�=rd\l���GtT�\&��V�Tq�����MM6��n�'�z�Dxx���Ã����5A���zlMm�������x��J���s��g�?���RSS��ࠠ�>k2�KJKG�9jĈ����Lwqq=jTxh���OxX�B����v�h��<�����8a2��&��������!�[R��x@E�N�<��u�]w�6���/0 �Tf��ѣ��Л�ðܼ�#�����o�H>���-���bL�M�%���.�ԫ��].���K!�%�@�z�i�p��q�M�շ�alE�!$l=����j�f��6�L����o�{�N�KNv�ԩ������������i5��Ԕ��������2(0p�q��S���qqi)c�<�)���H�9�},3�e�		���I�F$3�_Xx�w�MK����Xnpbbmm�Z�N40��0=�piY�#=D�~&_?}]QY9��>��|�\~�s��c�8�w�7_{ƒ_�T���A����7Y�n�_(_�D��㎟�2d�\.��
�m���T�مL�p׿�W��[}�M��!���zv��r��L����?~l��h;3��'4�xW�zZ���?�"˲���!���7w
q甗���קg����mӺ�2�.r̮�ޢ(�����k����@��qtg$1L�p�cH�=�}<�Po^w�L�7^w-=ߧ��ܶn���Ƚ����cv�Ktw�`��OO<No�,,���1=J�=}�m��~�����>�}�E��n�w���m��iz�;��p�څ9�����W��F���k�\���Ơ��~л~�4��~�:�e�f�8ڲ����A��߁DM��~��V��ܴ�0_�t�/I+���n�^`�����S]p��pI��/��չ!�	Mj��r�����٬�g��'��iP�V@�
�.�
�?5��z�z&Ysϋ7h�R�G�uv��t52=O�]K��U�n�
�����Rp�D�R�1#����J֭�Qo�6�Ӝߖ�����(v%|w�qv�1I��#I��9eW����Ne�:��h|t:A�ϤkU�v����(�߿���ͯ�.#�����ص[����+�j�!D�C_a�`l6��������Y[ܵ!�0/�{��/--���d��K�n�YE�M�>���$I��b�?q��r���Ia\PX���!�Vc��]<&77��Q���1�oh8��^XT�V�<�w�����{����Y�>( P�Ӻ���L���o�9�.G�n�O㶻���X�o�<x� �m���k��bxf���=��ݟ���0"�a��~�z�R�lhlJ6�s�n3��M�W�xs��iJ�ҽ�u�\w�z�q^A�k�ޖ�e߭^��hH�9|�m�qkk��/��;���W[[���"=���/I�+o�YVv����P���3g����4�ǽc���[��d2O�<c�}�/�Z±���W�>\��.Y��f�mݶ� 481q����y�M��[�m�>m�L&Cu��|j�޽k֮W(�-��E�%1����5�23��v�޽f�z�BYWW���kj��CCB6n޲i�V�^�w��=��8t��tnܼ�ܹr��m�{��{�1�7o	�Y�D��f�mش���;���s��,VKpP��3g*���BCW�]�m���аC�}�����j����PFFLt�V�ݸy��Nd��X�+W�q:�qq�¢(n۱���cU�5����F���{
,9[�_X�P(6l޲���Q�۱k��5߻\������5k�566%x�L��{����n4�DG�>�m�ab��Hho7�ܳgǮ]--�����oW�:����i���^�b��&�i�7�]��LeU��]�3���;/�	�Y��*-+[�rU^A���#~�����㣑�K�����Y�틿�{,�Uk�?��1"9y�-���ϸ��Ͽ�*-5�`0г���i{TDdIi��#����sf�\�:(((��i���m��Uk�(�����c�3?�߻�Ι�j�_�ؘhz���_���ء������'O�(I�;�ק���[�]�o������M�8A�ă��Ι�?e�D�^�~��3h�^F�ڽw߇?���Q�y��r���ƭ[�+*�ef����qii
����70  *2"*"bӖ��J6썷޶;��lmo7���{�Ǎݺ}ǡ��!A�5 Bh㖭�%gB+V�4���8^i���K~رc¸q
����q�7�"��^����X�⛭�vL�0A"R```��lt���߂E�465�Wߜ'���˃�������X1e��O>�����}��BY�lYkk[hH0�����w�By��;w﩮��fժ	c���0=##��M[���F۾���:�z��'���?e���_/;W^��ظg�c��{��2i���*��/]V�Ш��_�7!����Ƌ��K��ϳ��˿]�T*�t��148:��>��R���������1bxBh�����/�A�7ެ��LMc��jjjF��
	�<����o��{E�%KW|�e����c���2�sd��	���F#��mݶ}¸q�~~e��$$��W ��ٺm��m�'��7�駒$�Eu��r�l@B<BhԈmmm��L�p8���EGzaYv�
�>�Ԁ��V�N�5�LM3}��:��1�F�c�'�����f��a�fN����AAH
	�=s���ߑc�l6[�����d�6u��ɓ��b�RR�NlimE�!���
d����r���c�c>��v�,��t:'�iʤI�@ed���!�C��o�{�\.�>u�h2UVUI�h�ۣ""�̜165u@B���ッ��F�%	��[Snj�<�c����BZ�V��c�G��2ftZjJSS���{�sƴiC�޵gojj�5�g�w����%I�2i⸴�������c
�1}�Z�R)�Ǐ��:xР��e������J���
�6�?����g�����&;f�[o�K���s'�W}��P(}��Ҳs��~~~����Z���jEi�l/qZ��^AN�>���U�T�UU!��(888::z�A*��k����9���R��55�t&��B����HN޺}����=s��		>>>�]Ә�f��s4���{��ӦN��L+*˲�"����f�L�n4�
��_yc^��3�O����?lX_T\�}���#��B������B�^x�U��~��75ZMdd��l��;�BC_~�
���G�E1.&�`��|�/>�|ppPss��nE���wX,�A�Nf���=�}��yDPn^^ٹs��Uv���n�>��GZZZ�����w�С����#F �0FO�����}�@����֖~��$I��a��{���\����r�\.��bAutt`�����>u:'�Ln��b5�L�!$�"}V��j4uu����f�����6�PG�E�����M[����M��#G�<�]v�����jk7"���D�V��f���X�6��V�v�Q�V������=x�}��_��b�N׎]����546���WPp:''�����&��I�����K�1���C�y�{?���399�̙��w��Ε/����$�N�G��|����O?�}��]��-soڻ�����Ͼ�J�TN�0~ʤ��̽):*
!��c�����{g��3*��v���Î��U�̞�mǎ��ڡC�lش������y҄	�5��i�̞�c��ں::M}CCCc#Ƹ���������li�N�28qɲ�e�/[>g�,��cO����b��Ie����>}���� bw8|^�?�1oq��//�Z�����𰰥+��햛
PPX�Ւ�˽w����g����뮙SPT�g߾v�1e���ցz��dJKIiljL4���iɲ���˳�O����MKٺm[ٹ��	�����X�
z��q���+}t��'88X�����U�|�o~C���+V�������0�e:4"y����e>|�(!d��
��cF�<�q\Ұa5�ui))��ݰy����ܼ��AEQJ���j����"�Ã��%I�j��˵n�&��#��"EA�װ��~7�p��ܼ�ogf֌�m����ӏ?������7p@B}Cc�С!�� 3a�!z���$&N�<�`z��GQ5bD\l�M�**��4l蒥����:}�b���Ų,;b�p�����Z�bEƑ#S'O�1m��D�Ųz�訧�x�曼�kz�0�6nڴekb⠛o�1*2��ϰj�����j�Zzg�a�7��|��ΰ���A�o�\QQ������~��n�7���a��cG�H4`��4k֭��vN�u�v�R9xР+W�-+S*%gK'N����_X�u�)�&�y�mgKK�?y�x���c�k�o�~��]{�\7gά�/6_��h|�������e���e}&7wيoߟ�6!�eR�c���X��ݷ�]ں�|
������4��O_� ��v��l0��
��G�~w�I�^�u�O�J�H��ӝ��uB]��i�'��~Uq��$I�m�a���!�H���
����?m�A[��ҧc3����~� B~,}ZÒ�
�?�͞Kw����;�0�(ILW�ˮF��s�*JW����c%�XUw��"J�jn�^��.�>̳��i��~�vo�{�
�~�^ �1ӳ�!��L��gž�i<G:q���_q��Ecc(�
�[7��;ߕ�B�&L�=s:�~�&��=���׬�?Τ�7���n;L_�w��AA��]�Ճ�x��ob��
Խt����@!��w���mL�n�̳�<��k��w[��fⱪ�v�G?�+]}[���O�ыܴO�\ž�i�o���E]ǃ^׳�Dtfaa�x����:��~jG���C��kA�Џ](}��y3(}���s�E�\��*q����L��uަ����w�N��/ꋌu3�%L��uߦ����y^��	�S����՜�*�զ��IW`�O?�gT��?�pZ���U��д��;�z�^/?5O�۲i�P�+[�(��T��#
���{zQ�h>t�iMmmEe%��s&�=�h�;�Fc�-���O��u�椫Į��m��N�ڳ��\DϏ�Ճ.Z�Q�o�O�U'����ͥeeN�!ԭ�z�*]��4��Џ�S�K�.#Z(�3���h4�l��縫�{@��������Y�0��**�+*��zFQ�iH����qcS��ԙ��v��+(p8�teZ�����[Z��������mT�v�����m>��<ߗ$	3��m�w�ݫV�gL�6s�4|��l������\���w�<41��Q�Uγ����vv�`z]iw8N�>�:f���$]\I�l;h�>T���ίB�5����Œ����x�{=s�w[U�Ķ��^~u��7|||<ϒ�ә�A˅���MMo��� &��_�}�W�U�/�4��5o����
&��;�{�
�(�G;a<���_-�<��r�RS��駺��u���΍�m��Ϟ5�?�c������e�L���x].��5�]��el7���'Ʀ�n۱s����Ȉ��g�>/_KkkUuu@@���
��a�����?.�bpP�83+��Z�V���L�
�\��[r���l��G���1�/��fb⠱���11E�G����|}}�N�]�jj�"#辷w��ǎ��9�f����1���6��z_߂�"�eU]C�\���2��������^����7457��O�8z<3,44�D�o���j����j����r��"���ryn~���g�Ε3޹g�\.�,5I�
��JΖ657����k�n�557�74r2.'/?+�dssstTƸ��x��B���[Z2�u:AAA5��E�%G�#����|��9�fFGE!�
��<k�C�r������r���ڻ���#G+*+o��:��!�1nkk/.)�/(T��:�j�/B.�ټk���ڸ�؅��K/�,�ͪ�so�޽�_镽.�?p(=7??q�M[��--[��,�x��i4/)ݴ̬�#ǎGD����[�z��,��9}<81�����p�¢�={����ں���|O����O����BC�1�V~�HpPPlt�Z�nmm�<i"B�7޼햛���?�̚=sF��ѷ�<W"��mۯ�3'2"▹7%
��ە3�M���z�ןu2{᧟!��]�ݾ���?0�����/���Һy�K�����a���������_z�s��_.�z������_x���Iy���o���|�x�wk���7<��K�}|���`z���V#�^|����J��)"�cE�M�oڴn���-<��� ��eCO(�._�W�NgcSS���_-Y�Z��g�E6m��/y�����h4��PT\�������p���_y-��p��^����by���[[�<{�K����6o�9l�G��|�]���G��XQr��s�}��ay��c�3���y��N/��jqIɇ�>�k���`�X^y퍒�g%� ����m��w�|��;kok����?p����"�����G�H��p8�7C��p�̙���ҩ3g�&�.�>�t�����457���457͚1!4}�T�����t�W��;���{kמ��ں�Ҳ��S� ��t<33�ȑ�7544~�r�g_~�r������F�!��5MRRRaq��?�+�(.)��������MM���G����t&'!�0̉����`Ak[[m]=BH�R�ZAB�

�HƍE]{͜��
���`@e��NKIAi�ڭ۷��GSǤ���y�9L&��"�|t:�J����~�Sgrr�r��?����bcB����7�0���]��������RS�O��5�C���kcb�\<���fN�>}�G�|*+;;�D��	�[Z[�f��2�_�~*+;��5k��|!�RI��z� J�򉍍9v��0�
�K��^W_o4�Ϟ5�D�I�B�f����;w�y��S*�?l�a���<q���#4$��?<t�ȑ�'������;^�PO��k�����|"6&Ɲ�!�0���G�Tj�NG3z�}�����Xv�\m}��w�Fs�~��������Nf�X�j��fN��Cfe�r8)�G)�����C�2N6c�TZ{ss�J��љ��ӧ��������EF�)����vx&�5rij����R��d����M�8!  ���&�	!$
"FH&�\�
�<~ℿ�_xX�(J�I"gK�H���K?|��x�СE�%F��N&�� �}����_|龻�9}B����n�x�=v�M����|��2�[Ir���tY,�V��<�v��}"��}��M�=b�K����իϖ�
��wo�Y#X���!���"���NW��2�\α\Aa�28(���Y�PYyyxx����y!$u�v�p8�Q�N'/��\^�Ti4���~��<��Z�����y��Hs���F����Hʆ�F�PsK3��|�2��>Rƌy�W�;�/���˯8P!W�HN~��W9�����M[{[m}=!D��F���}����\��ڬ� �y!�μi�\�f��N�;�{d2YSs� �m��v�C�$�ÉDc���wU�А����PqII`@B�N)	!�p:���bu��]NBHE�\��h"����o��o��Ε�[,��!�a�M&��_�˅�*�R	����qۭ�z���E�i))k7lB�Z�&<<���O����cI�n�{ӳ�ۧ_~�v���3glߵ�n��ڳ�f�͙5�>��4thr��7_{���c/����iSsr���+Ζ��N�8���9��n0�
B�$�MM���/��q�s��v�i�N'F!d6��V�������7m�!�ݚ�G���ࣅ�2����:BHsKK``����x�ojj�ܜ���C��ݿ���1-5�N129�x扑��SSS��z��z�믽�PzƆM�}|ty���Y	��Q��G�L�:���h��DK�E�V�\���訨��7�p]``���;�KJ�r��q㚚��^����\ttTxX�/�z��ѭ۶B&��n��̬,��<u�䐐�{�*G�
�6�\�#<s���
���>��3��߸i��3��6u��!C�kk�oܔ:f��a;w��+(p�\Ç
;��9u��ֶ6��4nl��#G'O��s��E�}~��}HH���=j�^�_��c�V$I���OqIɚ��j��㣣�jjk�L�TPT��jn��5��oٶ��p�2��Ҳ��߭nko���W�T�8n����3�F&'�L&�Z���g4��2#���7-5u�Ν��6�}\Z�B��b񒼂�R96-����ٽg���C��[��&��������އN��DFD̜>uljjfV��˖����:������/�ڽo���gƴi#G$���~��W�g������x��	����`fʤ��IIZ���/�J?|��;�3z����R���r���n�n��Ə��j?���i^����k�-۶	�0t�.ڳ�@[{۩ӧ�
lڲe�������������А����c�R,�x����m�8h��{�����y�l6����Z6.��Z�>>:z����ų;N���LJ��uX,r�L�PH���͸˲,�q6���t���!���M��ӝ�����w�[���I�����!�Dz��tX,>:�~!/�(�]�q!�a�
����O>Q�U�Frm��z_�^�#I��d�j��֟��e�|ďn�j��%�h4jZ�B� ��pw��s:�F�����]�V�U�R�݆=n� ������(�JZumv�\&�8��`��=d1BH�Ptk��Y�v���
Ah�$��ph�j����yzkH&�	�PYU�1����^�K��ȣ/ï�{�}��?����Q�����я��>���6���2��w~�&��Y�V<�[�6�rzv���悛�>�q����R�#�����V�g[����P-ޟ�%�����6�EvZ�B�j)?i�x�H������K�M�V���4���+�M�,�~��~t��߭�TϱO�hܿ�4�\����̉�.0;�m��`&���<��kE?
��Ҽ�M�J~�K�����Οq��x����\>����
^B?x���ȝ��_���l�J��/�=�w+$��"]���u�� ������7��@8��~ӗ³�3�p:EQ��P�����.�}.P����7�R��!���G?]�P��]�.ׇ-�/,�����?�BC��1��_ЭX��q��'�9"�m�y}�N'ͱg�X�V+Ƹ��>���f�a�;::���;:0�V����������w3�̴/r{{{�qa���b�Z�V���dr�\�s��%g�Ҿ�M��9yy4�YGGGssKuu��i!�d6�74�74\�MW�(�E��e��BK�.3�����^�U��0F��+
��0�{�8����G�2�����p:��0d�o;���d���3g,���ں:�ն��O���X�

~�wkkk���?eҤ������U5�C��i��Dh��^{]�P���<�׿�MMq篸қ	.�^�.^��LNNʘ����A��oܲ���:"<�l�X��㚚��'NL?����^�~Ccss��Q��%�$��o�d>b��F���$Iz������VTV��L�������V�]7e�D�F��]�ϿZ�s�����}���<<i�;����7
c��P-�ݚc�aڍ�&�y�ĉ�'N��֢���������)c^���� B!��M�+����݇0F�������.>.v�ʕ�s\\̺
kjkY���֛���g8���sB0F��Ϗ>��$IWs��!���A�R�q�~~~4{0Ba�`,����޶!�:��믋���X,�O��[��
ˎl6B�H��b�vΜW_|a���~񥌓i5Z?���{~\Z��j�j�!|��*.���r'k}�(�v�!$J���_�|�������v�Q*>:]Td��o�&�F�A��"��8�1E��X�]��Ó���߫������ۏJ�8�����4���y�{�y��8�����A�Pq�Y��#q�@�w*�y�y�N����3?+���ܼ{��̬�
�B�P\3{�����ǎ��ʉ��I�Do�����>�O=���cӖ�Xq2;������}��cF��߇
�p�\9f��o�mlj��*���[oy���g��i�ӧNE��]��߭>r�F�A�<����
�Nj�{���/^"��Q��C���??�����������輫)�����p���J�4g��C�O��o˽w�I��+��c�f��
��UUՑ�z__��UPX�������(9[�V�$$8NI�||t�FcCC#DZ�(���}}s��X�
	���d��*Gkvٹs&�9"<\�����~��,*2":*
!���R^Q�c2�
�J��oh0�L,�"�"��N��/A)/����4h I�n��--�����W���U___V^ѹ_��XT\�g0���\������/2�'�6W�W�nm�SE����4�����gQQ�ǚ�5r7�GȽ�tP�˔�\n�����.q���C���{�����\��\�U��7
!$��`��KW�#ct���E�-�+��@O��ٳ&�Z�ݾ~��\.Py�������M�0
^B?x��u ��׹,M����7��ݸs��罞�Fb�ڦ]��߷~�ۂ����V}w�zi�O_�lrvzrGG��֦��p�,+�ɮ�ց_��������.�ˡ#�7�v�fݺ�
0����T*���6�F_LO_�a�A���ǂ����%�y�,w�-ԕ�b�ZQ��tt�v���pw��l�  �>\���Dž���<���ׂ �t~=&&c��*��+V�!$�K��q���6R!Fq�488�����ٷ����������>�&t��ӧ?�r�|�����k��>u�c�~��;���)S�������~/���C�V�\���
� DEF>����MI9�u�˯�N?|$,,��`�p��۶UWW�d�E�~v&7W����I�G�|�껜���I�^�m�Z�}���>���39�	������Ʀ5�����#ǎ�Z�6i�.�8�h2͛���W�^�f��YÓ�u�;�[����{guM�e�r��~q�1�|��?Z���N��IKM�8���_j��+׬]"���Q#W�^}�w�H��4|ŪU3�OS�T}���:�ܽgɲ�22��k�ޡC_3{����u7��Ą_�1��|����f�V)���s_lL̘ѣ>���Аw�|��׷�����f�X!M�M�MM�~�U���,�~�Ѣ����۶O�4i�7O���r��w��ݢ(�SHz�o7���B

���N�+�޻��/(x쩧����?Zx��׼��˼�e�Z�������p������Z�f���%%߮����&���;o�m��	�&N>l�3���Sh��K���f�����3Ǣ>u~.��s�\6m���z��Gd2��n���A��*�a�@_D�a�n�Ny����BC���
=�T�Tڬ�+������_�$�0III'O�JKMA������ ���c�9�c���&��f�9.6���X���99��B��X�Jչ�.!����!�T*��z�s��RX\2i�x"������8��A�a��8��i��M��465M�<i���G��?o���/[�R*B>::g��W�T�1�r��~Eo����>��a�^}�f����geg#���Kd2�3_D6��͔ѣ_z퍂¢	��3�BM��&�)&:��s��r,{:'g��͟,����E��v��>0�ͧsr�̞��u���E�q11�F$���AFEFVU�LO;���t:$$|�������+�Q�T�e����zi@���ӧo��+���׽~��������?W,Y<u��W^s�������J����*����z�ȑё��=0  "<��¢���ܼ|�PHH���g�v��-[t:��j��[�G�ǨF�����o4nݶ=(0����������˯����<���}�1Ӈ��w#��?(�
�ә_Pp�m����7ߙ_QQ9i„���>�i�����1aܸqi���1r�<i�СC���Xv�lРAw�z��a��t�Ν=6-�h6�;:bc����kjj�N�S�?�?8q���_Hp��A�a���B��Ə���q8�Ν;xp"=C�+�~����3sָ���_�qc��BCEQ,.)�9}��Y3�����fNddD��15u��D��}||����'��Z�9"����d2���g0DEF&'%%��i4��O�b�q\[[{}CÃ�ݛ4l�A�3z��d�e��	�ǡ�\���m�s��̞u��Y,�N�0�b�L?7��^�{����w��0�Lfs�0�y�ŗ��	��}�.���Y����\��g�/Z����+�����ڞ>�i?��_"�0� ���oV�jjiy�i�w7�cY�>�E1C�� �0��xf�����0��m�֕�)��h��K����5kyQx��S*���,b�եų&Џ<ߧ�����g��D���*AC��J�گE�!IJ,��$!���x�;J�>�B�tu�y�v�W���hD�?�]�L臈� �ЏAd����~�:���@�����~�:���@�����~�:���@�����~�:���@�����~�:���@�����~�:���@�����~�:���@�����~�:���@�����~�:���@�����~�:���@�����~�:���@�����~�:���@�����~�:���@�����~�:���@�����~�:���@�����~�:���@�����~�:���@�����~�:���@�����~�:���@�����~�:���@�����~�:���@�����~�:���@�����~�:���@�����~�:���pWz� �\��1���ږյ��;,~��1B��������J�EC=]
XQYQ\R�����0Ƃ(��&I�0���5���N�S)UV��j�^��iuJ��b�XmVz�KW�b�!$)8(848���4`���}�lp�`�EQlji

�<����F���p��qlXh�e
��(�N�V���R�!����]Q�VN�$���!ǵ�
ElL�$J��	"ô��f��L����#I�e�ƈe�����)�(J�H���~�]m�m��O�c^@�"I�0� �v���F���0�$I!�;���w~L�%��EQD����tn}q��?�1AD"�%�a!]0ƒ$�pyyW�.�\�{0��� �;�!D�$I�PW�I�$BX��$�"��}e �"�04K�D�^�ܸ�!�3?���^xb�.��O�AE���s��u����U�H�C��C4��(I�R���d�����V�EQt7r�j��C���H=��`�r����LX��8��r�\����(�r�!$�O��wr�V��8BF�����%IR�T����-�3]G�^׊�Lg��+]@�[���gI��ٱV�ݻn^�c���azT�h4k�o�=j���dYv��e�'����t�\�o:OB˲6�-��j5����fH�`�e;'C#D��du���$��*Q�����|I��b�
E�ѿ��D�!r����᫥KY�aXv�ȑ3�O��
�aA�j�;w��Xv��)V��cY�<�</�����}A?���0Ɛ���Л��H*–��\������8����V뉓'�U���j���d�8|����a�B�D<O�1Ƽ��=WV|��eYI��
*��DQ��d�e�Dz�$�����DA$U�T��a�����ݟ?��&D�$�e���s��L�465u݆�˿]�V�9�s8�)���b��2����)����w���pBdG�y^.�����M:���BDI��N�R�ժ?<p��]��}fHCc��_-q���ӧ}��2N��ҢR�x^��b��<1¢(E�G�l�ֶV���`&(0!T][S[W�1��g0���l6s�9:2�x��Z��|]}��U9dP�\.?�{f@\�F��
{h�<B���&M��V�#"�_{s�usfoݾ��ܹȈ���xL�T�d�0�7m>�������#ؼuۡ����OO>�q�߭q洩Əu8��"��~8��#!�$"I�$��(�,�f9��h
��}��p�Y�^�Ӽ��+Æ�ڬK��f|Zڋ����� ���>;�~�#B$�L6h��ӹ9����1�n4�W�'L
-.=[^Ui�ZSǤ���DLf�(��AA�~�IC�	<����a���j���"$IEQ�$I%Ir��F������`�ju�l=z<��ٳ��ܵ��!�J�.,*Z�v݌iSK�ʖ���믍����SOFEF|�ŗ:�n��-����nt����$I""��p���;�����ښ����ޞ���Vk|}|���h�<���V�%�x��c,BhHHsK�V���E��l6�\��h*�*[�Z
��*�F�P�����)�����Fs�9$$D�ɜ.����-C$B�B���jU*�]�9��D����f�����%%~�~��F�HMm]ҰaÓ�0B������y��� ��M�_��"ڸch��whQ��>z�j�~��|�?ٴu�I��._a�۳���
:"9���N�4�li)!D�D���n��$�� IC�a��v���hik�:u��t􆈰��������I�h>	A�
:WQ���6y�D^�ݷ_���y�������W.:'瑇
����j�Zm�����h6n����DGc�bc���J%!d��<����Rk��BCC�29˲�(z�
ey䑐�ھ�J��*�8��UJ� c^�G�t:�bc�MZʘ��������80e�h���vǴɓcccX���	����N�BX��iu���fc�Z���:]._�ب(�J���k�X���襀N��j�>:�L&S���L|l� 4�2�t:���l��P����er�^��s�$�iu�11�gKY�


���%���N�8�����

0 �2z��媮�	�	vc$I����������.�B?�Os��Z�~?��	!J��e�͆1�8N�TvX,Z�F&�K�h��y����aFE��F���,�r��*��Ƥ�]M'1�,�҆��q4��d2ڜ��˲� A0ƈ�\�A`Y6�dVTDdXh(�w��sg2�r��&�K�mIu:-m�o�ٜN'BH�R�U*I�:,���Ng��DI��t�k2�R�R�;::A���q,�t�l6�-�0�(��>	7|�E��kW�N	cL��0����p8X�5�LtJ�}�����_��~��
�g���MC3mB#�����X��ϯ��ܙs������<镇�j�XpW�Ц��a0BM�͸k�6[��B�1���������e
�Я`��Dhkwx�jC�u�66�^���uȺ�G�H�$�˓����;��#
�j���xt(s�����$���D]O���{~�>�E���o��,�ޜ��Y1#�v�U�D�����#J�I�X���i9�'�X�Y?�EQ�֘��֖K�a�H,��D�����t�K��"��DI�Zm�1�F��ҲL_�v���T*��z��~�a���0��B�_5ܕZ�V(6���_�Џ��*
�Jmw�/������B?�_��g���_CA�y^�Tj4�˽�tY
��&�|�D��$��.����FCY�ml~�|�!�%�wX��gY���s��[��n-��?�^�y��8�S)U,�B�O]��6����H�/W�9��0�,�2�@�~@!D���C)w/�˻]��d2�G�ǵ���W7�)�����r��ؼ�˲!�!����^]�9B���`�a��t�1�k�S�T���$]ζ��a��RSW�

�iu�i�aQ��ڍиz�$I����aY��0]hm]=B$<4��5L��X�q�$��@��ޜ�i��!����1B�]Y�.ӂ$I"� 4@o~�&����U�G(v��_�lϏ�E�ڤg����0Lg�Km���r1�̝���td.��6¡c��+�O�y���oy6��q��6:E���"�+8��w��#���=S��a��2=�Fp��pGKB�\&����/,�X�㸲���se
��eY�L��2�L.���9c6��r�\.�8�c9���a�P*4L�{[܏��\^|�$�ؑ����'�MFz8Q��4�'�q4���
��e:�q
�����6�B�c� �@��E��Eߓ$��t9�Fc}c��墣b5���W���v�0F�������� R|�8�����c�BeuU���g�3�3m򴀀��e�r��f��WUZmV�aڍFza2�A�;啕V���8���nlojn*��<��K�
�{'�c�~��w�1F�{8�d���3y9u
�'Nf�\����’"���r�X�iln�X,��E�U�N��b�457�����g��()-mjnv���+���T�Pg�X|}|-K^a��j�+(�;���j�j1�g�rZ�Z�
-6k^a��鬩�9y�T]}����r��DQ��@FBB?��������0� I�ƥ�7g�J��MN��jEQ
�h4�DNghHXXphrRr[{[k[�\&�Z�����m�~C#d��N��x~��U�UV�U&��������FE7���76�hu��h2�e������V��B��SF��D��1#G��r�c��~�g&�(�
��d�YNGk[�R�D1�A��� ��0F�_�Dϟ<}�eY�F#I�$
��w90�2��i�IC����y�L��`H��訨I�&�\N��$��d2���w��Q���`�$���0<,�a�\���=bdp`���1ƒ$���r9mv�y�!x���
�c�烃����Ne?yB�RE����X[W{��a��]ZV�}�� r�L�R���8��e���̖�3y��A��U5յ�u���̏A{<YF!�q,��5MtTt~a~hp�F����lhlx^!��?���`���#A�X�eYQ���aO���Atc�������r� i3�U�T
��w�K[�;�N�F.��8yg|L�R���o�.��
���K�X-*��eY�ay��$I.�c��V+AH�P�'���B����>,�ٕȁv륭w\<�qF�b��,��+<[v�&FV��eXڞ� �`�1�$��LF'�X�l�8�����N�	!� Ќǒ$�x�gY��`�wT!r�L!���c��^"!D"m�I?�#>v��1"�E������d4˲N��gGϏܭ�1�J�B"�eu��C��^u��c/-�AG9�.�.]�w�E���˞����W��<���~��B?��G�q���;"�OgpEѠף����jY?޴A�}�/�F1F��'c�a�v�N��$���4�ʸ_E	Bc�Ӊ�r8��!�o���`���f��������
��M�͗{AtD��� �QEe%�q���cQ4���B?tGQ*��a�O�~wR�ZsY�HQ�Tu�u���Me~�V��.�"�"!B?�����.z��ׁ�^�'o���s]������s�qw��9swG���.��7П�d��j�,˸\����i$����P(�(��k��qV��_:s�R)�8�1��n)�!ǩ�J���r���4e���|~O��~:p��#m���"#%Bܙ��y�����r��Bۢ�=�������nB!��WT�4H�P��j��A��9�0��b��10�AE:��{Y��0� �eJ�ʚ[Z$IR����C�kk2��e����


�y��z,I��L�D�0.���y�Z�>ȣ�{��oay䑐���
ŧ�y�x��f+(*JKI��EQE�FC#���W�Rq��ؘ�ur����a�^�T(�.�0>:�Z���dr�\����G��Z�����'�r���	�q[�m����$I�׫�J����ڽw_��$��B��#I�N���8���R�t:��Z�P8�N��J��ϋ/�:}������nD���R�R*%B!o��������h���ɜN�J�Ԩ5���y�J��j4�S��:|d��4�R��h8���M}}|��B���s�K~�e{
�EQ<z<���x�����t�صkxRRP`�����~��7�r8�^3g��u�����Ξ9��j����k���l�lhh��3��ܼ�ӧ%����Ry�L�N�����t:A�8rd᧟�:�s�-7��ꯗ�@ͽ��'�>���Ꚛ�����.0 0,4d���)cF���h������c<�O�8�j�I�i��w�c�Ngss˖m�Z��ǝ�ݶkϞ�gr����򊊌#G#�çN��WPX]]m2�gϘ�}�t����1c���gK�j��&$L�0^��u�6������MM���;B�>��mtI",�>t�}K�.{�?���{�ؽw���X���|o����<��$��r9
>��w��g�.�v��f_��Ǿ�>�6n޽w�R�|�B/pg���kB=���ׯG!��
%]�����k���h!/j�ZE�\�f���B���e���^y���E.�	�P^^Qr�T.�B0�].�>X��O.�����G�dw&'g�ʕ~~��S*��%%����~Ӧ-?l?���웕
�b��V���c�liفC�量|�~}hH�ǟQ[W�n㦣ǎ�_,^���H3�^�B�_�{�g�p8n�y��5�yo��ɽ��s��e��9"y̨QN��h2FGE�9"!>���j7s��M��ֶʪj�Hq����s��#�'
}�Q�٬ǹx>"<|��?���e���GǏ���F$'��!����uؐ��QQO=�XTd�R��h�r��`ЫU��S&�����rB�C��c4�L�>'��}��/<��]����€�����mjnN6,,,����+(H4�?>��ݻ� ��fL���N�:u�-�<���'O��0X�Q�e�)'�}�	�q%%g�N���SO�}����zo�~Ӧ�����׬[W_�0c���k�|�h�=w��t���'
:�^D%
VS[[XX�T(����㯟3G����v��f�Zmv��f7�ͼ�GI��P(�����ե�����nljD�n�g�8�x�2A�O��WP���h<v<slj� UUU��qMM��s��C��/|���~&�y��v�!�p8DQd,
���Xmf��jE�n�wtX�FctTԦ�[��+�gDEFb�Mf3B(0  �ȑ�C+
Q�v����l��ͦR��Z��
��d�͛���c^:
��w��s,3�k��1m�L&koo�{���w������ڳ',4tά��q�g��2���O��kϞ���NVYY5iℊ�����訒�ұi�2�cY6����k�J�P��O=�V��8��;�����b�����gN��t�v��3<ihdD��-[�JJ�zߑ��+W��<qrHb��7ݰm�ʪ����m6�L�WP���}43����AZ��ƍMkmmxa���ʪʔѣ$$�L����b��z����F�Z=lȐ����'N��c����;���J��4,7?Hb��	�7nڒ��]W_?c�T??�OW?�����y�j���`XV&�Y�V�M�����{��,���6�Z�fFz���B��Żd2���tһ�t2��E�(
����R���I�B��Mzd2��ngF�R�SW�\.:7��
B�f���t� �,�1vh�P(�M?%I��dN��eY�ex^p�G�R	<��8�Ӆ�.��8����v��Q���R*<�3���S[[�����/���������,|�̙�o�X,6����Ӹ�t:w��{��7#�	!� (�JABc��F[���v����n�;�\.:��n�wH����B��~��i���fc�nw`�<#��j��Q)��M����{C0�v���XA8o5�V+����!�5Ah�Qz���fW�TY'�ed465''%���B��E��~:�F3�1������`���CȻC�������s&=&�M�;�>�3�+�u�|�����W��8t��{�q�m>,�
�C/h�
�z}��'m��F�Ao�\���$I��N?�b�ٮ���%�P�Q�{徫s�W�
�y�}~
@���0r��^���R�c^�a������l�Fހa����Q��Z	���%tEXtdate:create2013-09-27T11:09:27-04:00ֲY`%tEXtdate:modify2013-09-27T11:09:27-04:00����tEXtSoftwareShutterc��	IEND�B`�site-packages/sepolicy/help/booleans.png000064400000216055147511334650014375 0ustar00�PNG


IHDR�j��8pbKGD�������	pHYs��tIME�
U�D IDATx��wpG���j�
����ދNidg�f�i��fg�nM�Ż��⽈�݋ۍw;ow���̎FfFcdHy��I� a���Uu4���pD�?E��������/��>���z����+��Ԡ��@ 7�$Ec�X������伮�B�׬Y󋼼<��(���@ ��p&��CCC�ZZZ~��e~~�K�et]�#�
��`����5>>�W���
�$	�/�
.�M&�rE�4���@ ��nT愾��@ ��B��@p��.�@ 7���/|4MCUUt]G6P4]GM$Ri$IBQ����$I���h��絍6�H�l_�d�`X2]"��0[N]�/��b�����(�gI$���D��u�n0ׁ���Dx��}|x���R_U�w��u����'�� @���~��o�w?�W������<t]���Y�|�C������?��_��g�AdYfYY	_��=��-��@��|�޻fxt�ǿ��'��|�1o�����QYV���4��)).�>�'�T[��N!�,�ͬ�{�~�>?�n���HO'=�y�,4~��S2tn���gO<JA^.��3�M&4U%?;����0$IB�e��0����[qA�=x:0��,V��b ���$��jԀ��?�ɨ`��RA0�o~�=�\.�~a5�|�@0��&�Z�h�F8EGgUC=��UD�QB�:`�
h�F�Ӊ$����j�b6�f'	��~�x�~���4;��yy�蚎/�'��(v;�P]�PU
�Ɉ�f#��8t]'�iV��`� I2N��p8LGW7�,���Iq�[(,��V�ON���oq�����G|�w��t\�Z!��O��� Ǖ���GX^[Ê�Z��	Ʀ�y�Ï0���n�ӝ�Z�ٽ���{O�?�|YQ�D�T�e�:��*���LSK+g{�Y�z/��.��y���>�?t?�e�����u��#ǰY�LLN�y�*�ZZx�ݏ0��8�6��vt@�4��<��{�~��
���a6�(�˥�������(�¿?�;~��#�j��qe3^��o�Hs�֬B�44]c���97>����;o����#cd�����~���~/����E"�w>�j�069�wy��	6��ޝ�)).�W����2�\<���҂থ����[6��WБ��W廉����E4�+I$�������~�����w� ?y�Qt���v��`�l6���μ���$@�A�x�{1���w�z�29IH�ש��`uC-?�w<v�W�,;���4��d�2��7�N^n6�����K�Y�PǓ�{�c�Z�7Aס�����!��?���D"�����g{pح�d����9�:�ggq����?�Ƚw���78��il]����>55s�T+V����7�~�J~�˧:7� ���q�������?�[GFi<z��+�Dc|�x�G��4��<�0�δdm��Kp���
����}%}Bh��g`bj���J�
��kh{��`0�N��ɚ
��p�::3o�d.�:��N�:�$K I��F��3�Y��&��8lV&�����g}t�~:�6m�Н���#l3^�3B�0�]��Dcq@�b6�Ǚ���'0�M|y�V�z~iiܽs;Cz'���������,>>xt����vq��T3���4;��b�o�G��SiF&'��M�f�,����hD�e
�TU�#IrjҢ�h�����~�|��G��=��a�����׊>��������۬<p�m��r���g���~�K*����c���p��^@�$L&_�}'�
�pڬȲ�8�6ܹ9lZ�������I^����NƧ���}�]o�E��V7�'׹u��ٵ=UU0�L����m��<���
lݰ���7?��¼<,f3�E�ܵcϼ�EQ��TWT����PV\�`|0��U�`�������
V���Fa~�-m��FnV�yv��c�Z���X������6��ͷ9t�$��'��[���;yw_#�F���
J�
Is��x%9�Y|�R�����N��uE�����[�|�S8��IB�$��f��|��u�;��@�Tdٰ�]�4$IBӴ�V]G���o��\���C�
h���җ*��b.}2_9ek�/������q���U��2��K�.(�\~s�xɪ�ReC�H$��FtXpoY��f�7?M D1(x�~~��o��SRT��i�b1,fp�ns�+��dB�����~��Ӣ'�Q1,F��/)�T5qѬ��T����r���M�-��yf<ں����[��/q���:Y�H�7�F�u��h�V4]gM}-y��	ɬ�bδ�T��M<]�_
��'����?H~N�U$`����UWa0�HOC'�
W�o�s��~�@0GNV&9Y���Y�f,f��x��O@J�ٵ@ 7��S������f�|�{
�@ �A4~�@ 7��?95�tV8�	�
r�sH$���k~B*�2�i�4<^�P��h4���`jf�x<~M׏�b!˕E4e�3s]oœ%���4Ҝi�Zp�k��P�U�Uج6Q�/�P8DOo�X��j�%���)r�s(((����
�X,F�`?�%��M׮��N(b`p��FUer����k���q���
�o�?��̏4&|Q���EUU$I�.E(bjz�����$k4S�:LF�5?1�������G������`||좯
n�/\C�"�p�����EG��/���h�v]���@�{_"�����/�_-�-}"k���kL�&M��Mq%			]�
6����`:7D|}]דa�/�P8��2�'&��@�3����QB��1�/]�k���yO2�4��A�+��~�~��$�
���蚆��᪪2�F�5v�����N�K�/�2IK��'���&�o��B������X,�ca��0�-�j�o��x����5�_R��b1�tuQXX�+#�8>1���dee299Enn�e���D�dee��37�ƿ�S�Q1��w#I�x���)����g1[�Ym�xff���{#���h���#r���q�9�u��KFF���V�������h��2����i�������>k��u���!�6YYY�NRfff���O�
	��ʢ���S�I�4������L:A���A�I��u-��Z�j"A84CQn�,��-�0��X:jB���w.)�ύ���?q��+n�s�l7�]]�#�7�H�7�4m��;0x���������%���#
����G8���?�c�X��-����x)-)Ũ�[�c�Z���@6�M�(�}��G6� %��O ���~�?$���*CCC�=�́�FFFFSi�Ģ�UU]�]���w�9~��������!4M����}�����b�%�q�UU9|�0���-�|��z^U%�p��1>���d�,��R��Ǘ��X,Ʊ���x<��k7>�M������q�<սc_0���(&�it��Qdو���
f�+��*���͇����m݊�i|�o�xEQضu�X��ǎ��H�Hg�ƍ %g�$�t8�4�MM����r�ؼy3�����񒝝MuUo�}���3lܰ���bl6+�x��MMLLL���g�ʕ�<y���I|^��*Y�f
�g�<Ӊ�j��[w`�XD�8|�(o�}�?���PZRB����/���ǖ͛/��'���NeM�����f���%��FnN.�,399I ��p������ǐI�����f'??V0�#�Q\T��l!	sn��%�h���db|b��KnN.N��P(���HJ�IHh�bA7�A�2]�^�25����K:��Z��8'��$??��6�8t�0�X������XP���jђ�55�~�����vh�FW�YZ��p:�l߶�ƃٳ�m��ڵk���b筷�D�x�>����mmlX����LNL���f��jia��]��K�[�������HYl.f��t�شq�x��Ϧ�����-TU������A�~YY�ܲy��D	�]��ʊ
��8���92�2ٴq#��S���ٻ�p(Ȇ
P
� p���� UU�(/+c����s�֭,�B��HdY�f��P�L+�ꢤ(��)yk�'b�lvd��V��j�>����Q6�_G8aff�h,F��TVVb0ȼ�g333���S__���(G�C��b�8ǎ�����c�O�e���뱘�TTT�a��O�ddt���"�74PQQA8��tǛ����e�����I��Ϣ(Fn�e3���&��ݯ�B^^55�D<�k�M6P^Zʿ��/8t���¢����+W]V�W%�1����������aw`�Y)+)������6���IOKǝ例����>�J�P�
�$a2�().���Ӝ;w.�t���J�F�&3N���NGG-�Z(�/�j����ɉ�t��D����R)!���x�����y���}���!FF����100�ƃ��׳�� }}�47�d`x�U�Vb�Z��:/Ԡ�߰n-���?�H"G�t���yq�n���1<r��ǎQ^^Nvvk֬%33���	�Ξ���I^ؽ���BQ�4�<IOO

��:�ES�I:Ϝ!�QUU��h����]��BC}]j���ʫ���ZUU"���QRRL^n.���]����'imo����󩩩�ͷ�bll��_{���*�
I$=v���LrsrY�j999�������[{�&�QWW�;�GwO�e�\�NuMtj�JzZ�hU�A)`xҊ�������\�֓�Jɡ�az{���/066ơ�G@��HO����-[��?0H8!?/���
vl�����fx8s�,
�u���q��H�tt���Fffff(���\�2�����u�
��S^Vƪ+ik?�$I�����n�d2�}����Ɓƃ�|ӛ	���w���ee������η��f�^�9��ـ��o����v�vgJ�:�b�F�]�q�9QL
jBe�*����j������"Ԅ��d����
�7`����)��t�Lsb0��ttt�f��0�L��w1�_YQ���x�o?�8�|t}Nj��Fo_3339r�ݎ$K,_�@��ɫ�����Y$ �������-[8����S-����(SSS?~�h$�,�d��c��p��a�Z���������6��N47�u�,����a�5��������HH���RTX�A1p���

p�\�M�蚾��|n4��+#��2�v�g:9�x���i�>?&�����,4U#-��=w�GL���ɉ�������f%/7�͆��Ba�X�j%e��������w�<�K,5Hb?���h4���$�Wc�Lf���S���-�X�ʱ�'��O~ķ�x���臜�:C $
1��p�����2����i:���}�LMOPUY������ghx���QNwt�u�6

���Ig���4
EQh��g߁����s��֯[��3=/�F8�hT����u����hט������f�Lhh)i��P�Z�V��ʩ���f�12:B ����ۍA60::���;�MQa�h�x,��jE1(x�^FFG���fxd�Ɉ�j����p$�4���hLY
�N'v�����v��Ɇ�6�_ �4$y֯`�D��#!�zzz���"//�-[�r�w�r��z�lܰw~>--��~���B��j��{��?B�Trssq��~�z��*+*AJ��S-���~���8z���q���x�F����ljk���1��c���86�[�$��~�&�9�ҲH��'��^�J���?o����[n���{��9?Y@��󑕝Ͷ�[h<t�`(�2�;�vZ����@�t�V+��o<ȩ�V��{XVYy�<C�Т�����/\�+&4�K	��x}3
���]	��y�oKJJ��g
���ƨ����p�r��Z��L&4D�T�;�o'��o�~�D����k��p8���&���OMu5��.zz{I$T��c4�������jj�X�&233q:444`6�����������䀔���n�f����������Y���e��uL�v��r]v6O�#---FZ�e4U#�i�@���gIKOcph]ӉEct�t��ʠ���b��sn�F��P0�$I���t&�
��<�LMM��	�B$����?�h�ٌ�f���Ř���\GcQr�r��뺎�a'?//�O� #=�����4UU���ʢ���p(Dff&~����.lV�֮�n_�#��c���9˒���"-͉�ᠼ��w>�EE20��̌���d���x<dega�Y��ɦ�����lr��Y��@zZ�YY��&fff���fYe%&��̬LlV+����<֮^���(�YY��=��mve�A)##WFFJ`��%�)���s�QTTD^N��9�
iN'���tww395Ŗ͛q��c����ϣ���������� #=��+��F�lڴ������ry�����X-Vѩ_�X���*�+�����L31>AyY&���OMMŤ��m[�9/<�gn�������~����?J
H}�����Ǿ���{~(��W�����wXb�3���w�kA޺�$˩�S��P8��֓�n�JND��w>���:�X,5��e9�
�:A8N��G�Q4M�l6�(
�H�D"��hĠ�T-5A���4��b�XM^TUe�3M]uݢ�&K���my~����1D����@G'7;��y/�K�}o�ߗJ��o��K�GH�j�������T߾0���!��%�n���ʞca���xW�KtV�F0�x�1�o����}݌������Z�(,Egg���^���4}:x�ʲ��j]�op�An��?�����T9��w24���;sk����i�~o0	g�ӹ��0�Ku���/e���8�]MdY^d꿦����]�>�u��@���Ձ���zgG�ɉ�'X��D�������໒�bJ����|n��`0\7��ٟb���{\���&&s[�w�v>������Nwt�zIOOgYE
�x⊂�|b�/�)XJ�f˪�d�܁7s��At=X'��9��|�k@1JOO��tRYY���}�>w�_ �ژ�&t]O^�H�	,f�D�OVR>1�ɡ7��b2�n��w��b2�D'|�|RA���ׅKpM�Tdgg��RN{׃u�������?�0ve��I�>��w��H8B ��™�&�Wx�@p-��Mfz�z>��A �<�4�Bw!�����LsNkE�E�2�MЯ��9GD��qC�u#�0((�yɐ��`��2_��#��53�F���E���H`6�Is�	˓@ \��?�Hpn�&E8�.O("����/*C �K��&���r�D]�M��DE��*��j�^Q�`��zڻ/B�_Y��^�HL�����b �iB1��ݛq�@ \��EZ��뜛���C���2��F�XUv�s��4��r���<��c�������X����b2
K�@ |f����XK�v�r���@�߼�x�4��v�5"q�o��[�|�e���'��%	�d�ۡ�!
���g,:Q��04<Dyi��UU%�b1�S�%W���'(,}bݕ\{�S�.w}rb��&r�y|�u n��H$���g���A����uk�244�,K�:�_���K��yy��Ym���ýtwN�m��/~x+�����'�!�b�Ό�ay�x37��O
(5���Z:��˕�� ��sab�+����}��@U:r����g�*�H$��?�[�laۖ-H����(O��)���'.yXU��'N4�y�&��������W^��P\T�=w�uQM�x���)++��jYJ�����g��x}^l6;w��e����[4�_��{�&33S�3�@�����'?���b	��r�h#�����������` ��DZX,(�B<G�t���|�f]]�D}^6l+���
��F�;�ʊL��QF�M ���
 IDAT��ݜr?y���hZ���xM��X8>r���Z6o�H0��t011AOo/SSS�L&ҜNdY�m��,�(
�X�9yN@,�l6�i�D"%�u]g`p���^������t^{�
:�Φ���Ɏ�(��@���if<�3yy��366�7���X,U�H4��d�dJ�%�������� ���@�'�~�{��"&&&y��g��_q:��F8�n���:S�$T�X,��hD�$��8�,�]7��*����}��mv���PU��)).$�Ow��~�w��K;w�r�x�ŗf'2��w����#�de�x��_[ �%	�c$T0�,��f$IF1*(��7��]�$ob���%�>%�d�yȲLqq�� ;+��e�R����������o=��n�wa2QU�{﹛�v��{�6>���������ƹ�s����Ѓ������TUq��1����D"lܰ]�9v���E�$���ؾm+����P�@���hbbb����F�`�[{	�B<�أ�~�Uv;��o=�M��0����p��)V�\�z��A'+W,GQ2].V�X���f&&�Ǖ�N,�[�=
$�A~���پm+�%%����o[0�7��G�H���AW�u��옼���e�<��ø22���^�ii��k����nG�4��c˖U,i�_QW��cm���Y�|������h�3�&;�D���D"~����8KE�]�v-n��}����wy�G0�M�h��k>�~��� �3֬^��5khjn���,������ѱ1$Y���4�Owp˦M�dIb��<��?p���o�[���{LNM�o�~����8�����$�LO�����bhh��Sk�lތ$K�{�ݼ��K��������L���q����[��j�E�7�H�͚�#I��;��a4Mg�����?�oD# �er����?�HzZ�@��%�@p3~��ƆukIKK���iV�'TI������Dc1ܹ���������i����{�N�
p쨟��#������8l�t�'X�rV�U���N�R��N�Sե�5���\��Ѓ����t�9���1(
�b�b6��j"��hD6P]ש����^�Q>��ct]��p�6�t:��7!�����Ι�%I"� I��…2��\f��K�����d�c�$77��ӧ�-�-̣���w�bbb���,����O<A㡃�F����dDӓ�娪JqQ{�~�7����"�@p3~��0k�_�]ZZ����n�kkxᥗ���/�a�:��‹Dc1$Y&��F����Kw��w��͂������g���1S�gb��Fx'+�xP���O��j��ٳ��=k���&��$x��9��IzZ�:۶nM��Y�]��N�7o�_�tG'~��o<�0�ii�jj�Y���ݯ���U+���,�GJKJ���u�ܜ6�_�^x��Huu;w���/��K�v�Fg-`���$�H�y�Z��?��--�2��ٸ~��������T92].~�!�}�w������m_���?�%	�l�d4b4ٳw/�|�aJKKx�ͷ�����uM �����F_MM�S�u��0�%t'�P�'�(����4���ȲL4�d6cTB��p�Ŋ�f%�a0(��X�9�K�8�P�Fc�V�x����_y����w��bP��hko������� k�F��'�tU���aі�h4J($l6+�u֡�j�&'���h$���*����H��adY�l6
�Pe���R}DדN�ss�H��ͳN�1��@2�YO$P�D<��l&:��0�1���vdY&
�D��͘��FeQy����D0�V�p8��j%
�Fqed�����w��'?��p�n2:;;�)��yg�FA���_�p8�ȹsM&���g�̈́�봝n����D"���t�C~>F�Hfff�D.�T���K����f��B��P�_���j�V����J�fnB��@r���LWN"� /7�ҒRQ�M�U��|u��7QW���m�D���:B?��:i�AFpi��!b��O����WM��B
��X�e	�ø�Şr�@ ��5~�˵d��B�dhY}��<�@ ��? r�@ �5�o4E�<�@ n��(�@ࢿ��qEQn�/�7Ҙ#�4��!��/�{,�q�Tlw�@ ���1�	>��qjz���A<^�e���^�5��`���sĘ&�,LMM��ݍ�i����$�1�G��5�H(n�1G�i�ς?�OY�TU���^Z�O$����[���S.�������`����.w����h'�=zt~�7]׉�b��
�ǰ~.�r�d6EQ.E�4�8F�I��	7��_b̹�ύi�DzX,��E`&��g�|���?r�0G�J5�U+װm��
���n�?��`@�uN�lfzf�;�|�%�=u���q��%�c���x=����3O"���;���M��v|�J�D¼���l�~+���~������RVV����7_��=���-R �Xj̹���~�~�IV�\Ͷ�;�u���9�N
��K\��4�ښzvl߁$��ԑ�s��iiih������(����l6�p8�t
MSgg�Q��F����4$I"
�����
[���A�[�>��`���c|��<���0��x�^4M#-͉�h"��e���$I���O$0M��>�����/`���\.���Xl��a�f��u��X�6�N����2[g^�6��9YIUհ�l�l6�2�k]���/$�D"AGg�e��*:	���W�3�X,J,O?=w��a�b���;�ł��/�f���|��ȲD]m=�����=deg㙙a��Xm6�=L($++���
|>/��D �g�M������^~?	5������䭽o~��3i��}KHX�V�V�7��S��p(L�ٳ�h:�,(**b����}{�����W��>����p8q���;hj>Aii�P����y���dii�C!6m�L�@�SS;q�ʊJj��<F_��'�X,��m;x�����&�8ii��s�W,����a4�ظa�瞻�M.�:g:;�����P]]�b0��ނ����p8��b��f�Z�ywo���;�!33S�ԛ��.
9�v
������w'����
�;(..!--���~��(Y��<p�Cl޴����j���!�p�0����r�Lf���=�0�V��,�]I��D"�A6O�ii;��� ?�M[{�h��Fff6�V�i��_�������PZZ��w�K]]̳O���CS]]Moo
��)((�;�&/7�F#�<�m��d2��uI�ضe;w�q7��`0 Z�@p�k�W9*���%�ʊJ\.���H�	t ?�M~���Fww����/�k=�QQ�u�Nv��s������RXX���etlDDY?8�i"I2��` HR����r��a"�0%�������D�׃�3���@6�CIG8���Lmm��LB��p��C$�l���y�t���I��8Jiiv��Ɍ�d&��b��MF���<܈�P�X,����z=��!��"�h4�������t�a0�D��xfRN�l�j����0�!Q\T�D�l�`	4��k_����Fnټ�]�_����Dhn>Aff&��%�9�	�?�`P�u0*
��a�Z0�LX�V֯�@vV�p<����dbd����2
��),(�n�#I&�	��N^^>ǎ����ӉQ1��}��+۷� ��u��p��W��ۍ,ˬ^���eU�����{�`2����R�_j�*IV���%�l�����+Va�;شq3���L.W&���j=���4�|7�������K~~��X,�T�ى
���ᄋ���l֬Y��i�9�ɇ�O]]=���̖[��x�}�},��������~�,'�P �E�4gZ��ڸa3��`�X(**�lwV��É�dJ�W�=9�Hv����Z��g�:Q���b�Bo"���&��ի�6���nn&��:�,/�-�#dd)��m.�Г�x]�0�̩�s�|����g�qe�]?�|a�����0�L�T���wa}�v��1&&ƹ��{/z�\�]��$Ih����b[�@p�s�1�j�?�Xj|��s��c���X,e������cxx���>FFGS߻��(/cxxؿ��1<�!ͷ,��by-��sip:��,���a�9K���s��`6[�����g�T:!���E�_z̹�_8�\j|��X$I2f�E�Ǜ�*d���qQ��$QSS� 8�@ ���c�5�	��~:�x���*��7�����W���	�B���b1B��$L���sĘ&�4D��+�o�� \.7ʳ1�1M�i�����F�6��-�@ �s���̈́
I ��!��@ �_ ����D���!�� ��j566���LNN��>4Mctl�X,v�<���c�����u��C�S�s�4���1"��g�g��%Ϙ�4�X���!FFF?�XK�7311q�Π�4���	KC �FGG��+JJ~?O=������&�}�W��ؾ\L�~�?���p)��(�~�iff<�K~��>�g�RA���Lc�E{������{/�X���x��d�b�\���0��9����s%e�d��={��u���f�?qٺx���yq�n�Ξ�"�9�.�����:����.�gz��k?}��擳����� 
]�nv���t]�}�y�immc��ms�,J������0v����ZN���������#GX�z5����N�r�J�f�D�S--TWU��:�g�Y�����g!=#�5�W333éS-(FeA>�D���|��(cc���,�j���烏>�d2�v�Z4U���MU���"'''5�vtv���YV��ӧOO�ihh ';���~z{��X,�]����6��%%%�dg'���r�D5�U�l6N�j������a������cyCC*�g8�tG+�/gjz��CYY�mmx=^�njkj�̙38�N֬^M0�LW�h���rss��y��H4���,oh��tr��Q�/��٬^�������z���	����[��@o_=�=�9�EMu��v�T9Owt1�ͬ^�z�{Y�b]]g�'�X,���a2�\�O��[SuRW[��o �ĉ���3<<�l0P[[���:|�D"A���b���+���(�=��@?ii�X�V��$�qV�X�x��b1��-cjj���1jkj�e�ɩ)�y�=V�\AFF:�'O����q�E���L�7��ASS3�PE1�pؙ������Ea��곭����s��J$��C��J��(//�TKK��NEa�ʕ�ډD����qv;�x�ɉI�;
����λ�B!��U˪��GQ6l�@vv6g:;S���s##�����X�j%N�SH��o$� ���|�����{�������s(����Z�裏�DS}�����'�O��tG'�ZZSB/���=��~|^o�����Q~�䯱���F4���X<������:�P�g��p���i~�䓩�ÊH�l�XHKKCSU���o'��_?���K
~]��C���!��|k����3���˯�~�X"�b2O$x��iim�`Px�7�a�ܹ�J�����@#�@��_}���A��‹d�\���.�Ru
�ycϞYs�0G��LW/�ڍ+3�H$����?��56���G�r��1���y�ŗ���<�0�H��˻���f||�_?��HU����f���r�����Fy�����t��:7|�w72:�K�_��������,Y�W^{���\:�:�r���K/����:�枷^�����u2��j�}�m���Y�b��x<�[{�22:�j'�P�_���y�����]����p:Hs:�X��I���\�W	��/�B(��c
�ڰ�l��t�����ng���h��S�<��������'33���&Z���w��կ�"��!��ϯ�z��MyYY�iMCQ;O��t���������~����x3eu��㩺U�&��9��9v�a�۱�m8��ԙEEE��q>�x熇�۞���CFE��!��jh��x�X,Jt�'�Ξ�w��o�C�t���7�`|b�������괵�f�ƍlڸ���{��S[[á�G8y�w|�v�h�����
��ʼ�����M�8p��P���i::;9��BW�Yf�gp���r�fV�X��[nI�O�Y������\V�XA0bjz��n�_�y+c����d֭[KC}]]g�����ǃ�b��;hjnfxh�h$B[[;w�y'�l�D]mm��.�������r�r��������A"��P�T�����Ғ��ۇ$I�344đ#G��� �P]]��-�������b��v��r�=�BA�'&�D"�<y
���ht���
�308HkkF�HC}��ﮡ���g�ٴa˖URTX��9���,gEE9
��TUU�D(/-K��ٌ�fe���Ԡ�tu-�'��^P's�?���y��۷QR\LV�kA��k' q�ܹT�~�����ܲy+�7P_WGaa��LV,o 77����v�#I�CC����n͚�3��62].�JK(p��;�l޴���<|~߂�vnd���jk�)/+����������_�P�v�P_OqqQJ8��Op�t�Arssذn_�AZ��	ϛ��mMuF�Hss3�SSx�J��)/-���E1���IMM
��|>�O�,l��d�]�r�lڄ�.b����s��㌏�a4Yx�DrB���GnN�U;RUU�?����*��Y�jo��7���066��i��Y.o��.UU����fEC=%%%��~�h�o�������`����S�u&&&��b��q��7md���_��W�����)).�˷߆;ߝ��4����kV�&##=���磷����LLFm���j�sQ��c4)))������z����SYYAii	���cyC=EE�4�l�&TM_?w�q�N�pn�u�I3���v�_��������� sNPf�	�����#��h�N"��+��Cg�^ڵ�~��T���{��n�a�X���Y��%TB��P$���G������P(D,���:�,+�7��p�c�������N֬Y}�w700��o�����I�s��9��/��	hz�B1�^���=CA�{Q>	U]P'sul4��ͥ���ښ|>����T�F����;��Ԣ2�ӱX��!I����a��0;o���}��������4�sQ{���Z0���L]؇��[\T�����LLL�i�m�
site-packages/sepolicy/help/file_equiv.png000064400000140105147511334650014713 0ustar00
[binary PNG image data omitted]
����

� hCԷ��}ǎ;n�%#=
�޻o�G�UVW
��	0����{s^^P`�D�#�x���{�5���y\�
�@�>�s��UhzSޖ���̌��J�_x 6&��p�����;�HIJZ��/��y� z�AX�w�v�v�y�м46o6dHKk낅��/�`apPPvV�>�i�ҋ/2������b}��{**+��ժ����ƎMҷ�3F�Q?�(�9bx.���v8����/<�{���OWVW{<���}�|�O�>]�$1!~ƴ�����ͫ���`��n�������Aعkwm]�M7\?fԨ��v�a�y�)�I���F
޿V��>R���'g������������ǻ\.�c�h4RҪ�$������;5%���óO=)�⚟�q��C�ů��ơ��7�~�_c�GQ�5�z�������x������ �����VWW?q�q���x����pf��_�3$8�a�Ͷe�V�$����U��-�tϾ}99�&M��|�/����j�iz��ÆEq�}M��W]~���Ѽ=^EEEF>|��şmش��y��m���6�u�!!�������Zݻo��q�V��!g�^��_���Y��gΔeyӖ<��j�V����כLƉ��S���êU� ]|��@��ݞں�Ԕ����Ӌf�_���Y���g��}�K���bc�bc=�gs��ࠠӦvww�����������2O��I&���K�p�
��EI�8��̐o
b�=0��Q�UE�o��pJ�
��u��4o�a�1��(J�SQUԳ�כ�~�,|���� =A��������A��@ ,����d86%=}���S�g�y۝@o�z�@ �d�I �b�	a�AL?�@ :��'�AG?�_Q��Y����t�E=�@w�@ ~�D��d��A��e�s��W�5kWT@ϠV�=h#\��於m���#���k�!TU�po�s�~Y�~����vN%�s�^�!$�G�~z�0Ei˸4|c�K�ʖ~�Dm���w!D�4:��Gf�o�w�r{<˾����t�W_
N%�s��m������E�������o2*��C���VU��xݵ�
KJ�2��K����?$I�˟��ѢO�KJ2��n��E�-))->%9���F���?����v���3�M=G�>h�����;z�/������pN㓯c�۶mxnNKkkeU�#~8o��ę3����		+�~�eB�a����{!�n��߭,<p�ܿ{��+�?x�Pjr���~�⻇��Peuu�Ν+�����xάY/��ZMm-B�\tzh�ں}��!���o{��N	��'_?ø����S&N�$)9)1   9)1:*:-5EQ�����o�[���t�yN�3*2�d4���
���p֌���cF�ڳo��͖���������t�*++%I.-+=r$�0p&3�����S'Mr�\��N	�\���ya��h���;m�����!� Z����˯��������]]?k��/��f�ڟ^�~��9�m�.�K����6���Pe���ǎ�;��^Um��!���B������t�:��¹}�e��$�X�xɐ��L��Ȉ���8�����b�Z�����"#�����/�$!>��PXhXRb�^�=j���Qq���9���Ԕ���o��$'����Q#F���m{>B(--�a�s�ӭuث(6�@ �~0�]]]<�765{O�##eEnkk�U�����f�
�~[AUUW���k�y�E��N|��ۖ�@Q�G	�l�d�c�#�ýۚ�V.$Y�X6�`gFF:�q��j�4M{3�k��Zآ�ģ2Q)���I/;;9VQ�P	�ܥ�j^�@UE�ı,\8o�6��,�v�����=���a�U� T�@8w�e��Aa4��gēC �iȂR�@t�O �b�	a�я�&�����9�O^Q��9G?��OaF��;���+���@8�蝯�JJ�[Z@�e�mܪ
]��ڟ���COy/��,˲�@ggW�Z����S�WQMM��M��h*Ӵ��l�wt8x�+���6�m��ܳ�U8�=�	��;_?���Ͼ�o-�EQtZ�ym�����4M!����S۳����i�ʉt6�UTgW'�q���дwI�7~��/�Z�� ˲&�^M�Mk��
�3�W�)�h���ٻ���j��)�,_�����Z���̌�Ҳ��Ҹ�ؚ���{�VU׸ܮА�o��@}�⻠�����~��Қ����ޱ�����u�uu���Gq�����###gXh�漭��[�~��7N�39)�U?mٺ���>-5eٗ_��[[[���x��x�֭�i�ھcGIiYrRRgg���W��nnn�eÆ⒒��@�ɴx�Ҙ�h�e�Y�">.�eY��@ �'��s�kޛ��6�VW�����L�0��>��hx���Y�]�BX���Ç����}�%<���?�Y���շߺ\�g_x1;;k݆
���h0�8n���;w-^��{�sE
u�=|�|�h���

�0n�'���m۾�/[[[�����].׈�܅�v��_���
�X�嵿�+<pp��G{���c���kjjKJJ��KsK�#�=q�ࡂ�����zh�s�^��E-.)�2i�,K�9�ƌ�����3l��������;
jjk�͝3q�x�����̝=���_}���;:;�***=���TU5..vHV�u�\�%o+��9�Asp)�ZRZ6jĈ)�'Ɋ���'��m����Q���u��egenڒ��������W_#@ӧNY�嗒$_r�cF��yݺ�Ҳ����U?VTV͛3{���Z��/�]}վ���[�̙=zRF�珞!..)�Ȉ������CEAN'̙5�g���͑A���/�����/���ٳf����l~#���O���GUuMGg��k�����kƷ��!g��
�׿?v�ܹe����sf�4
</tuw�{��3j̟3�7���~��O=�\B|<EQ3�M���U�e�O���~�v�r�
��o���/�|��m�����͚>H	�p�����?:��������#��`�0m��_�OnN�&?s�4D�Ȉ�	��a��""h���替��[�N�
�EEF�������F��;���x}��C�dkS�3�O;\Q1t���4k挏-NJHHMNv�\�aa%�e��vM{�]r�1�.����Ӧ@hH�%]���d����ִtἹ��&�i�̙� �LF��!��^��w��r���tuu���|��m^�Ae���=VQ��X}CCKkk�С��j��w¯������EEU�0MY�Y�]���АP�v�7[����f����Dz&����x��*A�Y���7�52""2"�{���!���vj��
8�Th�RU��	�@8�����,ב��Z�~���
�O?~֌��3[{l�z8f"��>�wz*��'e?�.c�5��T�jz��ӷ�@�.�@8��e�O`e4C>�>+ )�9�oQjc؁�3�@ ~�o�&X�@8 ~d�@t�c��K�����O��Hk�O�؇>{NGNk��p����^c�Z�#���M�
���W<�>�F�Lgᜣ��m��,k4�a��A�N@HQ-��vK��g�j9�=<o��!�ǀv���
�#��ZSZ��OQ�Ɏ(��e0�ҶEq8�V���n��Z�B����9��������I�$�qDx��lme,Dz�z������_
���Y�5
�	�яu{��'����'/��D�=	�+*��4�����O��Mӝ]]O>�(I�K�}����2�L{��k�mۛP��)x---�%4ɮ��'��LEe����(�^���O���[��?��k��}���,����()+�斖�������b�i�/o}�о����
'S`��ǟв�뉉�`��j^�P��ò,��kkkߵgwTddfFFsKK}CCkk[VfFdD�������;s�
�y����+�(.)ikk����oW|g��gϜa4��� **r�Ν55�)���SWW?o��A�9/���n��̌�EE]]],˥���t��۷WTV
�������M�WV�=�㺺��}�U\l̔I�5��}w�5z�H�U�t:'M�أ�h�Z`���1�Q�`׮�n��qc�[Zgf��DG@���
�ƌ�(�K/''&�pݵ�)�'H�P~���(�ٙ@<?�\�h�-��W�~44;��>�0�w�o��v�ٳh�g�.Y�EZj���!��C���l�ZW�0}��;���l�/(���ݖ��pŸ1c>��˨�H���'������o�8�`Qq�ƍ�i�Z��>Y�8%)酗^����Z��W����8pࠢ�/�4g��E�?+-+7�UU�3gL�v�w��i���UUW�>|���?��n՚5v�c���o�[�9//o붡C��L��.�?wN��]�,x?"<�����?bYvɲϳ�2�m�ٲ�UU����"#7��Ɋ���u-O���WX��	c�jz��H �8�R-���_x`֌�_|�MJr�C��w�w|��r��=aܸ?=�`nN���ۧM�:n̘k��
���=rC��qq��_Əp��]}��
�p�ࡉ�lj�4bx��7\7}�ԛo����
F�A!$BIYYxx���u�e�"��&���`�a��ں'캫�
�x�訨���'ۜ��n����s�>\��s��n]���sg�����zDD��~$IRsK��#A�3{�7ߴ~�ƹsf=p�=�g�|���k~Yw�=w?���l6[yE��c.�?ƴi�
}�Ux����$Z � �L��J�ۓ����o������Ҡ�@�ceY���@I�����v����;��AT�*��󼢨�(�Ʀ&�ۓ���r�xAXUTE��z}eUտ�!7''�?@�$A����
����ssr�l~�^�����n������q{�JJX��8�����������(�^���][W�p8�OP``~�Π��e].�$�TZV�I	�����c����t����8nm�ZS԰!C�x{�9H���֬IKM��3fTW�����=��߭\�zMCc�^��?wNcSӪ�k>��㸘�^VS[B!�a���P��FCxXت�֤�$˲�A����0��/$$8>.��+�
`9!��w�aged$&�?��K!���6��`�X�Ͻ�bSs�}wݩ��}�����a��.�$+3c�?~��
���o���=��������U3ӷ�|�����%,4�[o���]�[���?v,C3������W\v��r&��fMJr˲�0'�@8�蕯���HINNKM��٢$Q1���]n�ݷߦ��4K'�2�0[�n�O?^��mW3�}<�� 8����c-l�C�GU��dԎ:�N�eu:���,�ZԐv]EQ0��eE���gM��'c�$I�i���(
˲ڟ�tq�6�$�@ �ߐ����k��
��#~�ۅ2���h�Y�0n�v��O3�����c�����`�>o1������S�k��d�ǘ��k���{S����8o�&B�eY�5o�~9��"�����XI�^v��ڟ��"}�����~9�CǓ쳿OO��Z1�k��'�?�}�Yί���\lL�GOlφA�����p_��?sR&LU���Te�p�pR��)�87�8.a�AL?�@ :NA��A��+���@ �U�_����:�!�c�Ht�WQ��X϶�J���K *%g�~F�O>���}��+c���YB=t��'�y��A���Wu�Y�i����x��J�g%Q)�@8�������TU]�p8::�,f˲���t�4M!�p:###����GQ�U��V�E�d�e\n7VU����Ԝg�ٺ�����������r�
�765Qe4�f3p:f���p�����$%
���A��|�Z���j���������nݴ)S��;�-���/��J{G��e_L�0�d2
B��M�c��kim۰i��ѣ���xA�Q�sϾ}˿[9$;����ްq��*11�_}�흷���˖��v���%K���tvv���s�
��;;;?��k�����U��J	�)�d�k�޳w����������)J �DQ7f�?f0�KJaO`�ٻwDn��i�<TU]���1g����3jTkk��]�
z�C�w����(

�$�O=x��ɒ�5��(J����{�+�ؖ�0XUJ �GM?M�

����F�p�=&����YV41N���l�J{s644�wt��
������ό5��r=�̳�ʊ̰�^��M[�ƍ�iXK5��y�2��U��e���L��pZ�5�߰ystT$���%.�HOkkm{�ŗ���,���"+ ��G~�lؼ9&*�l6��qcw��;g�LEQDI:\Q�r���$��i��Y]]=v�(��
��I���,�D	h�Q���}aP�+_�{?���;f���E��f���暚���(��MUEU�KGg��h���<h�~����{��n��(������6..��ju:]AA���nڲ��UU�����geF��}���l�A�f����;(0�����@ ~''�����	�SQ �@8{8���[Z�;m뻡�m}��W'��]إ�G���I�zD�+@ ��^�y}��ˎ��m2ȇ�'��r����g�*�@8Ð�m�0� ��@ �� ��~�@t�Y�!s�S��0����γj�'���Ԓ��@j���O?���-��$�
��I�O^Q'y����3C߸~X��'�z�0s-2j��>���E�-�N�R��@qI�}�K�WQ������{�O����{>�t�ǟ.nim��ѽ���=��[�|�J��@ �&z�~����y�:-���(�Zj���ny[�546���546j-h��;�u8��Ҷ}�PUUKYs��e�Z��EQ�wY���ZҞ�kֺܮ]{�8�.M^Vo�UU���>p�ȲL�?�@8}u�h�|7nޜ��ew8�����а�0Q������qdx�0_~���l�i��'7�L�w�|�u�֬�8|�����`���&�_~�������C�����Z[۞z�?<prb�V`�o�A���l���9/c���^{㭤Ą�Ç++�l~~��_�Ow�}�웮�������)-+��t��7�jim	
	������y��wy�s�]wZ,D�CE�.|��Z�f��@ ��_m�_P0j�p��.IғO<][[mmm����=<<쵗^ljj..)q���YY.��h0�۰������w���%K����}�ysf�4]RZ����u͕W��v���v���:�u��G��9î�����Pqі������.����nݹ{��d����x<z���O�^~I�?Y��gK�O��+/2������{o��?�����q�2���YWW?q�������q:����׳,�0LDx8���r,�q�游X�ax�7
�����q4Mgf����VTUeef�4M�4BHE�˩�~�Z4�1M�����
�sr�RSCCC^y�̌��w�{��p��jD�f󋋍�
7�5
RU�0Lcccff������uvuegejy�)�r�\��������^C�t�t:;�� 6&���lٗ_mشIQ���m�@gW���)عk̨�`2��zg�����ʷ�y���]q�%�_r�{|��G�޻��v''%>������U���z��Z�v�


��#G~�h�E\���\x�`HH���D���K��(;\>z�~��̚9s�?������޻�9<�_Ͽ�ѢO��뻻��Ι=s��{����9w�D �r����o��=rdJr��b���
LJHhmm�;gNnΰ��ش�����N�-?�����t�))B#G��4a���9c��Q��"#�[[�bc����c�
�������mpn��5'�7�W�9"9)	c=e�������Ą��ysf��MOK�HO۳w_m]��]
i����Ξ1#+3C����<>..;+�f�s8��	��IIq11�ƌ	
���sTKa�9qm�^��=�^�?I/�,˒,�>�LU�~���y�v�ORQ�(b�O�E_%��Z"ȉ�Zҥ
3���[oк���_k�a��w�VQ���Z����`y�WQ�{�fD|u�ݸo�]�(M�^�x5�G���Pu�k�@ ��0���}ƶ�cwj��}�=�<��ޏ�__�G�(�Wc�Y�{�s�b�@ �*�� ��~�@t�O ��~L�r��m��1�TN+�K��z��5�M�R{���c�����PUUK�zf�@8;�/_�o	/�o9����~ChN�"*�s�N�c��=�h�ۍ3����I?���m�XYU��x��hQ*j;5ᮮ�~�I۩(���9��)ڶ7��&��6�;:�w8��V�v�޳{�^�eY;z�UU�6��U�\�;��z�b��x{��H���|��]��t�'[�nݰi������ֶZ-�7Y��`Uu5�=�PTիJ-۳Wս4��}���yy�e����u��@8���/��{j�g(
Q��)m�B4Mk;�/����~X�MӀMS�2%o?�9Ek�����?�iڻ��+�m���n����|��'�

�3h��N��)M��U|/�	��n�7�W�4�(
��7���>����)>7�]���}�d�}+���/lhl\�d�$��Ƶ���TO*߻��h���7��N}}�������5kw��
[�m=}�@ ����׿}ǎ��D�e���KM�FGSu��W������@��׬���rFGF�s�4Moݖ�������bY�o�񆵿��6uJb|��e��m6�>[��$I��x�⥟oܴ�����lljZ�t��`���h���x��Ȉ��{�̟;wά����>[����PUl1��A�?wv~A���W�u��������o���9<'�+.�e���;�{Td������'��\�k��*=5��ﵷ�'%%a��w숎�����>Z�������徻�ڳw�w?���������o���=�e�^<$+˷EQ�w�HNJB}���[n��~�Z,�ss~�qSs�W]e�����k��s�
7���F���9b���5m�z��ēO����o�9.6v[��ؘ����?YT\R2k��^_W_��W8xh��\yŧ�-9T\2{��a/Y�w��;n�u��Q����f��S&O�l�����OFz��Ç�F��x�}������'O�۶��w��y�g� �!�,�o��-y[��䴴��\~�� o�v�_XX]]]VV����Cnش���XQ��ښ�#�?���n��+��w:�?��
~\�:,$���^w���^ue``����k��!�/gfdȊ��+�J��i˖�n���������oX��'n�����3eҤ;n��B�a��;v���9����M7%&��ݷ�`箯�]~�E}�t��}��kj������7l,*.yᇀ 55��O?#��M����&�7$+�[n.-+_�˺�c��κ�Kkjk_}㍙ӧGEF������w��}�����*�**7gXk[[~A������˯��/(����bc�kjv�`�ؘܸ
�6��bVVdX������{��HO�?o���[G��u��ϖ-�����R����]���v-E�����˖]}��)IIi�)������=j�ǟ.>X\<s��^~���k���QQ�ӧNY����}WQIq��]+������G���9kj��N�������#I���'�rz��x<�+�L�(���䤤� ��6??�A��tC���BC���Eݞ�#<,��j�����}�������WTU���ǔ�䠠�?���6�ǟ.�x<��6?[Jrrk[[k[[Mm�,ˉ		�$�����Ʀ�$���ged���c�g͘e�����m{��3

LKMIIN���?T\�(��¤�����qC����BSR�j�껺��33DI
�4qBTddIi�M��PP`PXhXHp�BCBǍ=�ֶ���֜a�F����C��IK�ӣ�I�(i�,��m*Ʒ�|Ӥ	~�I]}˲��F�����l���n�[��@��?E!�-::z���m��

���u#��;��;�X���v��$GGG}�ꧦ��S�z�-�j�X�23h�.*.�D���8+3cl�����LKMIJLKNLt8���̚i�X���8N�m�����23�V�6+C�Mg3�<���������E�E�
��(��rutv@[{����#G�ں�ٳf��^�Ï�		F���W_�?w�7�Ϟ5���|�ڟ##"���V�^�|���̜)bkk+���ӡ�jSssbB<DZ~�u��斆��Ύ��Ԭ�̊ʪ+�߽g���6$�d2�z�W��$���wuw@sK��d��H��ͳg̸��8�kni�d��q/��㜡C������v�޽k��Q#�wtv|�ͷ}�iJrRHpp}}=�b�E�lV��Pq������]YU5z�H��W_#M�;���ٷo_a���e�pee��mo/X�������0tLttIiY|\\bB�>���{��/��������%&$|��"�X�o�����;\��O?��+.�oimu�ݲ����@GW��(cF��tɒ�����������?8E���$&�@8�镯��Ogf���VU544$)11,40l�ߑ��1f�Ƞ������(�^o6�~Z����o2�L�v�y�1�F]u�e:�NU�]���ߟ��ڹk���֏=��K.����#G����^PPTT���3��)����chz��!�99;w����=sFqI����fL��C�d��UW� I�M7\o1��m���Օ��j2�"##���h�JNJ�2iRqii��]F�15%�a���?�5""|K���許�G�;�nw8��Q#G:d��}��~w�v�2��ɉ����y�?�tqVfƐ��� �ж�Y��ǎ��/�nedDĝ�ߪ����S��4�����m�4E������g=\Q���z��ϖ���
���ܼ��Y���C���"#�Y���K.IL�oni��s�!!�I������M�<��oڜ�(JvV˰�I��6��dLOK�iF�M��ܵk�СQQ�k�|�-7�䃻��<8�a33��'g����55�f��d�x<�--Z
B/�臟,�y����R�'�ԬmSSòA�����h4>�dQ}C�{�\�ߋ�WQ'3�y2��55����ׄk����W_|A߻.���=8�0��l�~�qLL�wۛ��ͭ���P^��{͇w]��w�С�n�����2�(
`h���j�F���7�=,,L۠i��/�ܹ{��h���v�O�+�^E��
K�ˊ��+j��W��z���
���A;���o��fbcb|5��Y��۶=%%Y��ku|5�b�*�6��*�H����UE=
��?n�?��W�U��Ȯ�����0VQ��e�ZOmG}��ʲ��`�遉"�M�OF�x2�S��|뽜����G ��P��g���i�t��k1c6���Lϐ"�����d,��izp�,䴌��L`�yP��l�Dd���b��E>G!�@$��+�0�����UpO~�ӊ�����Wz�{��������s� ��~�����f�䎾Y�O�,- ��91~Ӎ���؎��������o
���@8;���|��B�� �$쇞A�7�7�[�C�Ŋ�*�"˲&�'��̀��&:�}�Qz㛧����hg�74����$�X��?4IDATI��=	�UU�e��Z}��Г�ߗ�m��U����A����U��e凫kj�����v���;�?\Q����v>��h��~me��E����Ѣ�[�PQQ�������YN_ӯ���>���>�ߵ��iZ[:�w�x퍷�����~�}gh-�<����a�O�hA�M{���x�E9]�^~��t�ko��mG~��`�����v�B��KK�G�~ߜ����|�͘�݈��o�EeY~�|�|���|_/_^RZ��K�H��-��6X���)��Hq���k������C;ݻ_;׫E�G��(�EK���al��?��@ ���Z͋�_x 80080��߹\���H��Ə�c�.��L���={�RS�̚	۶�5}�,˟.Y*
�%_�g�ܜa!��y[�ED���ͷ��+._�kY���v��K����/�f��W\!���k��������v��}��y�����,_^�Ш���`�������O?p�К��def���,.�3s���޴eˬ3
�e�����̌����VdD��^=i„����[�Ǝ3~������
:��������pҙ��
	

�߰qӔɓ
v�����Y�߬XQ]S{��Lƭ۶wtvΝ5����b6�5r��Me�X�.��|+"<��K/�X,��Y�C���oܴk��iS�p:���>~옆�����iS&�߸q��=ӧN����v]sՕٙ�?�����dޜ9i�)��VTT��ܜ��{����^q٥�W�.--;TT4z�H�~z܅��g���|��Y��1��S}æM9Æ�74���=�Ҳ�e_~	�V�޹kw~���!g��w�{���M�0�oO����<%%����;~\��y���g�+.-KKK�y>�����H!��/��ڷ���x�|��䤤�W�ڸisLL��/�/��,+9C�������燇�	�����kll��d?���v�ܽ{�Ͽ455�ߣ���1M�����m�!�����pHv��ޫ��ޖ��n��䤤>�Lppp၃�-]>Y
N�������-��X�z��{?[���+�
����.X�P�әL�
�6e���L&Y�`�_���Q5jĈƦ�W^�o�8r��;w��ʫC��i��r�������E�%�;_|嵡C�`�z�-��?48�ǟ~Z���䤤g^xA�/��������?��s��Y����}��o����Ǐ��
�����?�ؘ��^|I���6�@8��c���K�L�$JҨ�#�ϝ����g�@����d�8ݤ��L�����p8
���Ο;W����~~~Ο{��;wFF�'��͝=[����������̸����kim+*.nniimk���P%5%y��i9C�eff^rхF�a�B��y��wM�8.0 @Q���F�
	��v��-btTTVFƺ�v��;g�̈����ߵ{�(���a��qƍKMI�ۺ��n/�8��xZZ[-�qc�ϝ��?u���ǵ���o.����'(���L�652"b��=���0r�����b1�?p`��I</p:��mCXh�I���Ҏ�����G��HK?v�|��F�a1�w��SUS}��9i��Ǎݖ���t�ii���̛�q���U^Q�v{:;����gN�:s�􄸸I&def465�m�v�UW�����B��izp�^|MSn���0�즗gVK��p:��v�8��Jϗ��������v��a��۷�
q�1����k>���ṹ������ >>��a�OK�}PS[[~���b	
LOMy��{��^�������y�W��nDa��M�5���m��5���F��x<�^|�I���v��+.����%�e�ƌq:]ڡ���I���2̾�BI�b������S�>��'F�����r���t*���yI�dl�fѼ��(�������;?�������X�m��������(Y�kjj[Z[�N����

�u���9<,���E��Ԕ���햐���q�3{��<k4ccb<<ǭ����� �;w����蠀�K/��'�����󼪨����B�q�k~����WT��'+�VB�@ ���
�\��!�Y��?$+F��LO{�O�����m�X`Dn���)ع��G���\T\�3l��9�`�ر�wL�8V�Y{��hxN��ɓ��"nj��%�?�����oٺm���C�d��I�AAA,���
|�>�t��l?v܆M����a���UU﾿PV���b1����8.<<�l6WTV���CI���v��tՕW��߷�i��^P�������GGE%%&j����,�q,���ѕ?����w|��g��~��ޑP		It�66q��nc���I�8qy�8�;6.�7 nTS����TiBH�w�֭�1�rU 4IH��|v�������{��9�3B��3��
�/��`҆ݾs׷+W�m�{�CUՌa�Bkֶ�j��4� ����qc�9�3�?�7��r?���.Y2,-!t�����KAx�GBӧLy�7��2B�'N,\�T�������.^24e��Y��|睷�}wP\��!)����9;+!���p���%%y�a��3���}��.��q#��Y�;��������=qq�Q����t]}�駭��ss'��@m�F!�aV�Z]t��x����W�ˮ���
X�Mڶ����=~����������x�&�
�ElFcpv-�u��쎋���Bh�N�3��}!88X��"G�׾8����?n��D���]�(�z�y��6y�D]8ڢ�c����w�y�^��y=��~E�N/R�B��QU5}�0�Ge0�(
My���3\�#]���Po�:�3��~%Z�>P�������7,�FU�oy7v�^�(����օ���7�_�������)�X=��u�v.�����:|����}����M���@t�s���(�}�		�o��Y�O;}��ҕUB�8�/"�b���������~��l���*�b�;��wbѽ��?�fT1֟�k��/��BW�3}���f`���2�m���\�~q\�t����my��R��/��v�gz��;\p�s/C��k��C_�!�8^����#���
�����JoN��"d/�n`&:`����F���W?i��w�80�4=1P�@7���9��J���ðpi
���f��x�>=�'�4ƽ�^Jm�2��z[�N=l��i����_�v*puuuN���	|>_sKK_��b���kt#���?>�:恢1��)��O����+|���?D}��gy���/�(�{��k��}�����G��?��'z�q�+�{������z��B�'JdY���*9y�a�S�UU�����nmCc���*(*ں}{KKK��R��;,-�FCKIN�X,�

߮\�}�N�<��odgg���i�'�-���,mll���A���qi��6P;v�ꏃ@��s���6m��u����7� �|��6�Y���s߫��?u�</��xf�.�{˶�CRR���h4��-njn��|��ί.��W_���UTT��Ws`Y���ræM���s��yYri��E�?��s��п8-�t����?a�U�� ��+/��aac�͖�9\����?=�ltTԾ�9�'N�$�c9���"�DG?�/��¶��90jD��#�y�ں����INLdX���������ACvi�//lܼ��t���^���<�r����J�D�������:G�����a���5���E�P�.�;-5���.M�dYF���J�Tr�$!(4$88(h����]{��l��/0��U��_P|��%����Q��%�'K�5�szH����s�?�OINB����,��瞯�����X�úu6�,;v�(��hޮu�7$'%!��.>Q2��/�"#
���_~���z�}���9|��O�V0C�S��6uڔ��=v����'���_򁪪�~�y������/�v��!Ch������Ʋ���55�&�1�᠎�,�v:��Z�f*�MMM���b�KN��������-�z��~��I��T_w�������N�@ΒW�L��>{���K>P}�!g��'=1�nX�W(����mp��%I�zb�z�������t�T!]x��C�;��"�H?�H?�H?��%H�ra�LG���>U��_� D]R�3������$f9]O�J��m\g�
��Y�K�z�Pn�X�4]�%���w����9BHUU��m��K��z�����;���r^p��9^Mmݑ�G��"EQd��|>�q]]���bI�Eq�\UUՊ����=~��p���!��ilj���Ea�!�ee'KK	!�--�E�

���J�ۍ����@UW{�޾~d��6�PE۸iӈ��S��-Z�����O#�’��>�� <<��k��x��W^������''&&
|(7!�|媟6m�e��Ϟ��[s�;v�xdd�o�|rђ���U�����v{��ө�9TB����#;+S#��Ͻ` ��ֿ�����+Y�雀�p*������1�F��isttԜ�����On��W�"��v/���W_z)*2�ѧ��eيʊ��~SU5�en��/�/�t��6n�y��$�2P1�Q���Zt�ˇ��e���kk�;��I������_�pß���O?���f��l2!�*��jk�ƍ�v�1�UUCMF�0�����0�bY����ڿ�++���!�����Ą�a�u�,��ˑ�u���h�P����漹m�������k��� <�iD�|m���B�����lݾs�/n��_�����}>)1�w��!�������f��[WW�����|xX���WQY��R�>ǁ}b_?/@7t0�444̜1!;�WO��������Ȳ�5ӧge�x<��#���nB(uȐw�a1����=JV�Ą�Ћy!/�����_�ۡ��믽6<"c<";��5�wF���8Pɉ����x:�뿘��Cط����=^�e^�&���ݶV�0��,%4cm{h~�0m��P�԰ө��k����,�gڎ��~��Mg��4�
ಢO����������ͪ�2s�K�Z{H�s��Q5�i_����	Q$��u��ADo�����T�=�5����ң�_����q��;Տ1�|���۞�������D�1���k��CCB�wfa�:'`�p��e�>q�dHJ��_��v��+�,Z̲\hHB���?m�8n��ei��t�x՚�>���۷��s8�rs���Uk�#�$%&���<m���۞�����������k��*�ab��U����<�v�\k�~o��B��O��,�z1*g-���6lhjjڹ{��pכQE�����c��������fz��={BN����ܰiSSs3�� ��V˃�ߗ���ҿ^E�,;"+����8��/���L��T7=մ>�\p�q��CM�[�m�>�������~?�?_9�33?��S��w�M7Y���>����y�Tyy��AYÇ���B�?p`ɲo%I�9cƌk�Y���7������E���
x�{��?:j�ؾc�������Ҳ���d��򫯩�:{֬���S'O>QRr����#G"��RS���a��-����ɐQ5���:{O5x�[t�ҵkϞQ#G8]��
�PUu���i�cNjg�z`���Z����]JrҖm�FfgD0ƪ��}�_������,^�Ï�7l�����'�f��3yh���v�
AE�mege!�UMC��mim			��_�r������tQw��y�t*Mm#���fP�cܘ1}��L=}��iG�]S�)�u���斲�S�'N�|F�!d�XEI���;���7�z�M��FclLLTddSSSiiٔI�\n��bnjn&���Ί��c׮�������@�з�Wn۱�l6+�����f�$��ԩ)�&��n�A@�m6�a�f��h���p:�-��,����_|���͛+*+9��zᢊʪ���B��!�e��3����lim��� :L����9�FcxX��G��\�y�UQ��P�h4<t�hZ]]Bh_���a4�6��M���O��
{���{ΛF��wO=y���7�Z�U��[wl��� �H�d�EK��\�:<,�h4�l�;wmظ���Z|�d��e/���1�G�M�w�{՚�>�|^hhhtT�W=����a�ّ�G{��O'�6P��������^z�>��.=r䐔����ؘS���L�>n��ar��RS��{�]�%%)922b�ҥcF���b���6�m�ܤ�jP��޻��lcnj����8ndvV���¢ãF��2i��t��=��M�299)��`�<qBBB|Zj���NNJ���;!uu��E�����X�>�7u����`�e��vԻQҨ��b�SOS�����eA�������t�~����4��ܹ��w-��j�_�.�.��d:Ƕ.7.r�z�s��O��l���6ZB!m��Q{�}��ԣ�aUUQ���,��F+�Z,���ގ�7�0,��?"1�;4�`�fs龎�ET瞞�`�KB�ݼ�Su��B��)�TXߘ�C��m'?�e��Y���ǵ������
�n^z�ng��-q^���}Ko�Š�6K�e0�.5Ԫl2��QE���`�6�w��#���R�ijn�X600P����X4M�y>2"���٤����g�ŞBoћ�N5}�Lz�u��\r�cJ�\F��N�1�{��򹈁궧=D�j7���I
 � �5U՝uOv��J��j��"#5�uO>�ߖQKk�a��;w�c��B{��;YVVW_y���0;v�j��c\ZV�j����BC�D{Ͼ}�sII��x��;Pg�)xp�!�466vI��$K�
]�˲<��h<d:�f#IJ,u��,�2��̛�k��e+V��b�^Z!��͢EE�#�<^�'���޳g����{��*/�l�YQ�"�#��������}wXCc���O>���#��>6Pg�)�>��_�~BX���ݗ88�㸯�Yx���a��j9"��*��n����튕.���;n��o�7z��u�BU�Ջ�.�z�7���i�����,]�"0���㏯�n��=��"#~�?O9��B���DED�3zݏB�`0����//�#�fd��v�g�<��cW�C�3��&M����2}��n{��}�EEE��/�@��kBh��#��j�j�lߎڗ�s���7�o۱�o���<YZ�e۶��&�DGo޲exz:��4~�K��2z���z;���O?�9�ڙ�_�q쐔�{����,Zl0$IBm۱#=}���� �֬�����b�<t����1s�������z՚5�{b�2�q�EU�5���(���T�=���y���o�+��}�x�x��ɓeY	r8B���k���_|�Mcc#�q��&NP��ӦNq{�v{Mm����[���j����'�O�41m�PA��+�l��0L�Ӊ�Y��cǏO�2��
F�Pp�c��8�׿�EeeՉ��!))����>zo��{��5w���]����%+W��}��,���꾧��eyyESS�q���Wl���!8(�b6>r���c�rr�nOUu�5ӧ)���0fjjjB���IJ��9N�+((����-[�R}��}W^A����>r������?5";88����s��;wYL�� �as��nu:U-)9�w���,7v�,�[�m/(,D�m�'{t��x}>����E���۞~:~ll�����F���'m萬��!!�$oݾ=)1q��.Y���f��2o�R����,-+��=x�PSsӃ�ߟ�8xP\�O7��Ս7.:*r��-��&N�2�����v��j\Fzz�#���!������d49Byd�CAG\l�o�=p��=wޑ���0hа�Ԕ�����f^�ZI�����j��G��1�~.g�������	�N�\f�g/�_k8QR|�m�uRwI�N��j_��_�	4M;v�(��0���:|�p�x����c���j#M#�y�E�gYV/�0���1��Wݫ�?^�^,!>���o���)���3@��Ӈ���ȁ궧}��@:��G���R�n����g��Ϝa�.j��߶��Q�;�f�ۭ�?Pv��wt�~���߯{
��"w00�(oi�0—��>j����\[����Oq�?���d��ؗ�-j0?�^A�e������~&�"f0�W����*-�-��a��y��cl��LF���r{�T=Ϸ�>l��,��=a�h��W��C����&i�;cUUk�k�#�ۦ� l!�|>˲v��d4�@�=B�Eˆ���O!����,�t�K�~
�G���d0�4U��-A�a����}�M�4�(5���m��O��ihl�iȁe^4����^����:�W�XS5D���鹧퓶! �(0��h��ҖzI#cZO�=mo��ѤR��	���}���� �v�9Ow���lۄM�h�(�=��G@O�Gw�wkV:�!�~�u��ۢ^J����	�jCgvm"��X�m��0�J�Q�4��(���ҧՄ�٬����b�Z}>��(!��"#�J8����n�KV��A�&���ԃ��B�F#B�
#,ɲ�nG��=�N�	!gi�����v���i&����:	=�V�
!���~�M�5M��c�պa�Ƽ��ٳ��!�eɲoc��G��E�e�O�͟<a|ll�$I��[O9n�����?���{�eY�@n�?��M&mK��j�nڼE��ȈpQS�}����r�o�����E��m	�PUU�ɼy,�0,;2;�F���E�Z�?���c�iS��=�&3G!$�2��6=f��H[[�p9C�ں�*��j�گ.*;u��s��v�����d��lF���l۾����a�%Xk3���8f���7?�7��XY�����k֊�h4��8��y���|�EEy>Q$��n��G�O�b6�h>DZR7Luj�e�ں����)�'�3f��|���l�8���qg0��EG�pg�yQ1F�����_���|���!�,����]:a��@��j*;��TXTd6���������Ҫkj>��3I�

��>�������s|]}��d�eE��].5M#�&����fTW�<�W]S��˜ѣTM[�ú�6mfY�֛~�>,�w߯��9QRr���[[[�+*�l��v{6l�������6�o�sǭ���DI�LxQ��.Ѓ��&M�`6�cb����?v݌�k�/>q"6&��Gf�F���Y�b����}��V��n��m��MO<�(DZ_~�PQ�k�M�0~��'�mх��;�t�҈�����tUvێ�V�eȐ�}9|>q��e6���5=-��q���c���g���|>����
b���SI�,f����ڜ9+V��u�=<�-[���>u�/��v�ʊʊ����iS&��r��1��s՘1��}�	�۳w�'O��ֆ���|�^9BHӴ�SU�4M��斖���`��j�-_�z��=3g�X�ú����L&�p����,�z��������7����c���Ƽ���6�mHJҜ�s�N�_�``�A�4�!�|>_^~�O�5��644���755�=�l���n����9+s�# �l2k�d�7�0�xƎ�/''::jpB��jU�Վ������BC��\��_���h��m6�(�F��eY���l2��L��}���'&\5�l2557�k�oi���0��j2�֮��cYM����˧O�}��Ѡࠦ�f�HyEeFz����s�ɲ������/o�A�u��<|pPU�f�n߹��q�y�U�����X�zʤI�������I6,+3s���L�4����MS���-0I��r9].�c�BWUU���ege����566�:}����f۾c���$q�q��n�{DV���+�

��y�����S[T��

��I���κ�ްА܂|��l�
r8,��xv��A���Aa�!F����sf�{�Uc�TTVFFF�������^��Y�""":%��0��dY6M�� �eE�><,4T�		V�e��Q��uu�7$%e�ȑ,�y��i�''$ij��RT���14$�j���� NH
�M�,�?v�h�ӕ��4sƵ�AAC��**��6rDVLtL\lLlLLB� ��d����+X����o���t�ߩ-MӌCTT�����7����ClV[B|��c�Y����H�@��0y���'|��������=r�$I���C��#��u M�*�*�֭[333%I��_���q��A� Y��=�0n�c�q��ht�\V��MU=^�,�v��eUU����!�e%I*>Y�:$US5�r����x�)�0f����0c��TUu�\ta�l2����p���{�^A^|���g�?n���f��m[,��lV��xDQDњ5Ms�\!����xTM��l����h4��N�SQ��Ʊ�(I�Ϭ����`�`��>}mw��0�T���J�Dcd6��Вt{mcc��)��ڶ-@��R���TQ���aUU����ow��N�w�UK1U��V�5uH��۶�|��dY���G�[y�]�fzZ[_�0F����{<N���1nll"1��+
������@�Yt�s��ǝ6����{���t�i��F[��G�����QUU���vj�?r�~���M#/����v�ҋ����u��
c�O�/�a4�Q��a/U��M�=�֙@Q���������������e�f�bY6!.�����"?_�m��j4A�P0�r��vQ��&��`0�L^����`z�-B�#��������ZT/RCE�e�`0� ?=J��E4°��i������P)��1/��a�FdM�Ȫ.��!ր1�dEv�]�,��l!�"ˊ�q,g2�X���u��U�M�H��rƘeX�a���B� r��狾��g�E��!�B�k��v���+�R^Y.J"�?�s�eو��PuMu�-c�B���p�a���3���J���iZ�?Aøܮ��rDPxX��j�1ª����65�s'@4MSd9&)��"�D�"+*�"ёQ��4�=��q�����t�_$���j�!��G�롆4M#� 	���]{�A���.R�O�Ͼ��G��6鬿S=ôő�P7����"w��BLc�S�K�Cs��ѕ�z�z]���y��c�A�ilgB� 8���'�d�Lz��	�aX�U���f�ݠ�%!D��ʂ�B��8�+>Yr�D�A0�,+��<��p0�Pkk� � pDZ���q��F���4��]s�_!� 9vtۮ�y���nni��� ���QA��A`���	�dEnlj4]���=��&���4M%������\US%J͊UW__r���b�inn.)=���@9r�HAQQ]C�XV��Se��YQ��A��&O		9V|\��SRV��ijn�?#ZZ[E�|%��n���8�����T[W[r��P~�+��N���~�n�h%��6�x�/=Uv(?���j�}�$���*:z���H��2LM]���*:r���TE��U[W��j^A~��y��ں:�eϲ�1���RU]�r��.�+����v�z}��'��++0Ƈ�s
�<�Q���<PYU���$IRUu����_d!D@����}��а7�a��ZHH�Uc�;�c�ǫkj2҆ef�Y���FGFY,M#>Q����
����lljlhlx��vW�Tu��� ��x<�r%YNOVv���v�<�����Ԙ7�������n��D���Y�Ʀ�����"��G��t��!��;ì�[�u�b-�-n��'��F#ˆ�BDU��1Ftʯj�$�`Y�j�h����O�ɒc�s��b�H�?�?drW�!U���M�j�$�ͭ-���0*{���p84M+,*���fF�����Y��a�,c�5MQ5U�D����F�y��]�1Ʋ,�������;��{�^����jEe�]�EIJL7�xqq���<o6����}��eٰ��V��P~^HpHXhxY����J��c0sZ����2B!�ceI�X,��D�GZ,�����55�,!�� �E��[��)
Dz,˪��`�@��N>��x�֭����$A�f�����D��zً�H�o��/,e11~��h��7�$ɲl/��v��&�eY��Y�4A0�n�� d4�J/!�`0���#D��g����ƒ,s�vy\,���g'u1r{�,�R��1�/+���0�r�.疭[���:M�	!��Јǚ��dEfY.��B���A���@{�F4B�F4��I?�X,��N�ҡI=��F��4ʲ�j�����G�w?��(4���6���Ά�ӻ�0V%:*!$����.���������@m���ڮ��鷟%�Й>���5��t�J$>s7UU���}g�E�u�h����P�#�~��03�zm6��i�.�R%幋RQ���'�a� �(RG�K���2C���
���
����55�uu=���0:YZ�q\OM�1V�b�9�@�:C1��Q1��_�`1[z�EB��d�����6W�^�z���m���*!���ܢk߶�y�8@��8�h���\=� X�#sik�ѷEt�\�hw.����$�(�V��eI�EQ<�TI�W�i�
��������ړ�w�YM�ηr���s˜hZ��u���LF�(I�$�]�iȎ�Y��M��~�x��MM�I�q��!z$ =�-�q� .�������+�r5}=��&�Pr�dAaQhhH�!��l6�|>Z�^9�0���r���ݠ�*�o��E�!�,s�����^�4�`:$EO�@����uמ��qqaa��,ӬǚF�+!!,�H�$˲�l��yO�@���5kVDD��`����ܽ���><v���]UUUU-U����q\uM͞}�Gdg�|>�a��(I��m6��,� V��?"����}99o��>���IJ�����bc4M4�cI�~��S����A0`��4�f�)��q\`@��hE�b6Q1F&���?�p�ࡆ����ʬ��!!F��d4j�B^���uuu�!��
��lϋ�h2-f��b�e�d2٭V��|��-�w\5n��h4[,���v��b1�;}�
�,����tM
�UUݹ{ϣ��y���%����
��
ݽgoppPhH��_�|���_�h�ҍ[�644̸�jB�׋[��f^�x~ڴ���:(��p8r��>-i�`��g4ʵY��?�QEٶc�;�p�P�Ϳ��l6��?�_����{��������������!�Q�?��0z�Ȗ��
7Y��_���vɲ<y�D��M4�f��sם�ǍE���~�w�546�|�[o�i����v�M��KN�ܶcglt��)���N�:���:��sܷ?g̨QAA�Ą����cNj+*+S��&N�i��+�B��Ǎm��"@���]�˲��}�g��?��s�E��6n�q�O<�}���(���[������---� ���[�'K�c�|����}��w�K���q�F���[�H��q���a���|h�e!���`��r�<�o�#+��lVUU�EK�D>�������/��t��W�����c��'�`K���9o�z�ѷ�}��v�y{.�q�rs���WAA��㑣G_}}!dي�֬ݽg�/�2�
7}�pQpP�0ǎoڲu���-���x�Ï**+�._�s�� ��O?�����T��K8?:K?�`�����;o�1��|�Pn�m7ߜ���k�ސ���Q#F�����<(.ndvVRb��kjn��/hq�644���҈68!�;���=<c�}wߥ�����8N����w��#oڲ�7ߞ8�����{�++33���|��
�i����{xv\l��h�X��;�f�i�I� B���Gg?���B���~��z��n�I����{���?�Y[W���y��w����C>�Ѓ6nT�ګ��y�m���׿~�'O��0�b1<?e���o�%)q�ѣ����c��~�-)�� ��S���Y�bEXHhV��EK�VUU_=}�W��9w���*�R���ai��ψ����򊊢��F�10  91�뮳Z�^���x�n����x[[[eIFi�f0��UTV�=jp����ZEU�{#��>�7_Q��S��ʲ��ܼk��qc�(�RVVV�8����'�3��:-5��?�����:";���!�}>����UE%��8[[]n��j^���t577��[�zuI���[����b�[Z[B�!!�v���j0TU�z}�$�=D���1��V�u��CKK�%wK�:/��,�˖�\��]{��0�M�y���qæ���o1�/Y�~}Td�u�^�88�Xq�}��O��n����B��UZZ6iℓ���!!��?>n���X��9xpᒥ[�n3���=��b�8n���'��ɩ��	�f�4Q�~�q��a�1��W�>|�h``@vf�W�ٻ?m�����~XWZV6q�x����|~a�}9;��9r�ؐ䤆�ƫƍmhhPd%-uhiY��#���ZZZ��\�1~����55f�9=-->.n�޽k�����`QC���&SFFz^AA�С'�_�bվ��ʪ���M

҇��.�v���l6��>�ey�w��v�}��U����3.���8��c6��Q�N�eY6��J���<�B)�"���b�$�}�c$���l�+������<�zY�1�Lz�jI�hm!zB���m6EQX���	-
��i��(�,˲�,+z=&�I�e��$Q�C[�8�v��8�I�J%I2
�,3k��+**~�̳�󟢣� �1��X�СC�|\.�`U������nÆ�n�!L��h4*���{<�q�z��;��K�D�y�^j!����E�կP�F�=�a�^��_��n7��d4ҋ��^�_�ϧwc��EQ���v�1Ƣ,c�P{
��P�Q������L�}�s�l�VS[���	�@�{��0˜i��{�+%)����)�u��M�����.ź��_�����[�g�$��N�c���:v�?�q�zX�U%!!���IOKS�����?��E�5}��}��i��_E�x<}�8��d��ݪ����Ȳ���`4�_�j�1#�d4@_A�y��rrr ���0


site-packages/sepolicy/help/system.txt000064400000000121147511334650014133 0ustar00This screen allows you to view and modify the way SELinux is running on your system.
site-packages/sepolicy/help/start.png000064400000033460147511334650013725 0ustar00[binary PNG image data omitted]
site-packages/sepolicy/help/lockdown.png [binary PNG image data omitted]
�M���ĺ.R� �m���w�=.��WS(JwY��H������`�~���ā"0�W��O9��3NGey/����;P��D�{q{U�/��~�b��C�P(�=z�l��ľ��B�P(=Z���S(���o�>��>K�,�w^d�gH�P(�ۤ�s����� �:����o����x��e�P(���o�~�YJ��1e�C�1wK���D�m�-[66-l��L���¦�K�M��ϱ�ln�MBp	,��1�A�)�6ƴ�P(���I?ƀE�#����'fnw�e�<��aُ~��w	�e;���.�'ƻ��%x���qp6���b�W�Y!Hp�^A�m�X�O|�%���d�ZQ� �;y�֐����%rn���O�P(�M���$~Ƃʣ��Ed��6���)��ps}�EvIQ�i��'e�����5Ӛ���O�����dyMyc��5z̐��F��h
�;�2����'5 ���l\65�0�[\����NE��c�ec�C߮�{���M�>uTo�A/�.{nV�?��+H�j�egG
�B�ܨA��悭Mw��?)�1k�����ݐ���fc�!��wŊn��h�ζhZ@:ed��m��/������-:��}�mm�R�x��1S�'�]�}�mkk)>�Ʒ�GU�6&z����6�`���7��s}��f�}��y纱�N͹働����g�S(ʯ� ���X������ǿ.ZRԼ��mSu���w�9`���E�&�'�_�]R��&f�UsmE��K\���{]01�w�ceY�"��3�I�.��P����C.��Xթ��]g8rP���[�Mfz[���.ô����m�m�A�eR�}
�B9(���1�/�:z@�O+�i��Άΰ�8��e�>��M�Yt���g=���3�,s�[��txk�ULC6�"ZP1�!53�)��JG�ŧ�92S�)~�Y	N��/�P��;:d ��0�{?ݢ��N�X���ݍى�W�Y��	�]�J�P(���K/MNN�m�ʲlGg��n�۲��V���9.cTn\n��S6.����,�Q�������h��g���Mv����(��I���z����Q�R�d�?Y�smE��L�уS�mm���ٶ��¤�T�G˫�[��}��%r2m<"7�)9i)(�����3>�)�a�D�
?��a������s<�`�r�a�����"wb�N�-հ �@���,BAY���|D1u�v	�a���L!h��n�V�ex�6�
@�8ݲ!Րx�g��1�`H10!ʹB!B@3lD��Poh�N�a�/G����}!�jBA@0�+t���@�1� Ơ5�"�!ŀȚ�;�@t�5AY3��!{��+�q�C$C�mݶ $��ec��$P���`��H�r��'���Y�����{~���<��St��WC�0�sE�r��~b��Y���j[6PS(�@U�8��)�{H?�~$595)!	cLu�B����C��r������P׍B��@E�r�q�i^z�R(��:�J�P(=*�
����O�P(=*�
����O�P(=*�
����O�P(=*�
����O�P(=*�
����O�P(=*�
����O�P(=*�
����O�P(=*�
����O�P(=*�
����O�P(=*�
����O�P(=*�
����O�P(=*�
����O�P(=*�
����O�P(=*�
����O�P(=*�
����O�P(=*�
����O�P(=*�
����O�P(=*�
����O�P(=*�
����O�P(=*�
����O�P(=*�
����O�P(=*�
����O�P(=*�
����O�P(=*�
����O�P(=*�
����O�P(=*�
����O�P(=*�
����O�P(=*�
����O�P(=*�
����O�P(=*�
����O�P(=*�
����O�P(=*�
����O�P(=*�
����O�P(=*�
����O�P(=��۶�1@��~c!<�`�m����/���A�J�^U˲~M�
��w����p��0M3��LJ!�q�aT�c�e�bP�eU�~^�%I�$	lY�,˦i(����Ī
!�dY�t��?�B�9P�1�<�/\��ϿPe�С�\x>�q�mC1�B{��B�jZWKK|\�0d(���$�'9��X���_yu��/�w�٣F�t�e۶I��g�<�|��Ͽ������y�I'�$'[���pc��aZ�E�D*�jZWssbb��i��3N;eԈ�h�e�}�]�B����eYIZ�a�>���{ؐ�Q9�0�X����@�%.�k���wuB�("�8��yc�q˲,�J�IJ� $+�e��ۯ��M�6l6d���TYEI�$�!�y�a��EQ������������+*���]N'˲,�0�JA9�����YWW��VQ���|ޭ۷_�͝�]���u�\!I��-TE�P(�#��I�P�`?�~����8M�EIKK;��c�����;����N<��;��ǟ��#�����t]���˾�5�������K/INNz��W].�
�]���O2��j~�>��n�����7��(I�W�kƇ���x5M3M�����_*-+/���?o|���w�����_���������Q	�vCa��9o��^y��p8$��G��YO>��~����8nܽ=�r:�9-99��o�s�\g�q��o�kii���O;����hT�B=�DyEE�B;�:�����7pG]���4�C]
�'��~�1�qM-Mu�u�(�9�g�y�w>����c��r�?��̙'�p����kg�8;
�q�)�}��hQI��W\ ���1l������{�..)y�޻����
�6���x���+V�p�q�i67���d{<���F��\.׫o�����>���3�|�����q_p�9���c�O?�S7o�z��E"�g_x��r�[�w������_��p$��҂1��mm횮;���z����G�L�q�?�����QUu��_Zj��|<u�o���Ըw��s��:�wz����C��X��7�/˲tA9L�{�� &'%$��JokoO��g��s�λ����:y�瞻t��g^x1%%y�Q��{�^���9s�G��t���C#��a��99n�;����qcƔW�`Xf�ԩo��~(!J:d�G�xr���#����D"MM�}�4(�O���Z�C��˛2qbjJJ$-���8n��ѡp���>����Tm�ԩN��O?�Xֲl�ey��x���m��7�TT5+3���51>�a����<����������41))	`Я�0������/c��m�
��>R��~0B�eY��5g޼;�{ .�,k�ԩ�YY��IcG����۷��/����ﶴ���7t��F�`Y��3κ��N�TRZ�+---5UŶ��P8���áPGG�a��tuuM?r��mۯ��M�		m���]s����C�u�m���}�������D:::Lӌ�rGgg8�D�
������ƌ���������,��{�5�Fe%'+S�Ԏ�EU,�ڱ���8����ޑ���0�Y\xʉ'ꆮ�g�u�=��P۶y��8�J?�7@ޤ9�;��P�.]�tРA����LA�+˝gjJj{{{YyESs���>l�$�<ϯ]�����w�^\ZV]S���Կ�˲;��*wT�铐�x���Adf�޼u�!C�"�h��~��lI�OH�f�0�lٺuGU���웟��;��rGyEE~^~vV��^O���
7eef���q��ѣF���e���@�_QY�n���?����%q�~���px=�����<��иz�Z���xX]�sGUUfFFW��WjjzzziYYEe�ޅn�ԻWzbb�ϸ�R(?Cl��;����$.dʡ�ܐ�7o>��''%۶-IBc�FM��x<���vEU�{�IEQDQ�0LEQȹ�((�j����ꆡ�C����sbEQ�a���eY���i����(���(����NӲt]w:���;�p$!t8$]ӟ{�e��K��؃%I�,˶lӲ���2�ZQ1ƶmG�QADA�t�aX]�u]�����d����(�4;
��@��rX����C-�
��c�n���vY�O?�d�(�"�2q�����(
1�t�`��`(�0��ic�P$��W���(޽�!D�!�"�(I�:�+$���P�eق���k׍5������^!�u��M�uUU��"B����/�[U)�'B�l[Up�_������M��O=��0*�ھ��0�^��`��̉{���˙t�=����LKV�4��*����U�%�%��B)�'B��[Z[v��R����t–�=��dA,>�ָF�Q:<������]�?m$^US�1��I�7��0�eT{X),B�A��`�ضm���,�O(c,
�(��J��t�O��rp�~
�0�4M��.clZ&���^
��[���2�V�ݑ����*�2ж���=C��WA
�݉(�?���~!˲�c@�g�7��k2�m;��7�˓�;��ǙH,0RO2�G�E���۲!8|�����
۶Y�����}���^|p���a�I?�B����x�?QdlE��D�P7=��`cR�t8U�I-����حŻ!0�$�4M]׉�Sb�v�j���{J�{��a�…X�N�#��W+��h���Mƒݦ�csD�[����pJr2�ֲ;@h����b�o�gHRƚcl��]~J<�E�2�0�h�,3�?%ۻA�?ľ�W��I���e�.�}�`w��W��C�A�Q?�p0��\ν�(��������Ƙ�8A��hwǰ]�On
��A<��yhni�灇4Ms�����vK��%I"���.���%�p�]^�������x<^�w��Uc�۵K�0�y>8�N���&��	 ���r:7o����gY� q��$��m<�v�}^�$I����t�6˲~�����r��~��t:7o���֖�o��+]N����˯�KJ�����},��3�|^��!a�G �-s:�?���E�5M�z�n��$#9`��N�w�c�B���~��!.����<"�,��>I�8��Bk�mH��g执���茏�'�˲��x�p�t:�^/��
p:{�.ٶ-
B ��.�!I~�/..n��/�>#>>��p8����,\����\΀�/�<�|^���>�0��4M����̙�|���E���H��t8�]�1�$��v;۶-�b���H�dZ�q�$���C��k���Łݽ2���$�B�+�\z���v;aY�����x��m�{o�²lW0�b��#�L.��\�~}E���s��C%'%UUW���o+*����N��kו��������m{Qye��W^��~���,K������*���Q]����i���Ą������UT�e�ytGU���MMM�QU�Ü9ǥ��T��lݶ}�ڵ�a�o�XV^�����P(4w���ں��$A/Y�q�̌�P(t�ÏT�JIJ޴e��Ͽ�q\|\���@���ظe۶u�7$&�{<�P(�㜹�5;5MKNJ**.^�x1�0>>`Z��Ͽ��Ɖ��FiyyEeef��ۊ��o���N��m��o�TQYi�x�yb��(�2����{<nI�6oٲi����Ҵ�T����ޱd�RYV����~>���v'��}����,��W��@ ����?D#�̌�斖��m۷s�x,�*.)�^:�0--���r��y;k�RSR\.WQIɜ��b���t]�Y[�����=o8�VT�b�*���iڏs�흞~gB�P�Ç��(k�o�m� ?o�ڵ�V�QUmGU���k�{��m���|��M--�))Ƕ��-^�,*G��Ҫ���������-X��_����]#0��D#!���+��iim�z�<��n�r�a����7H��U��N�t�]��,�~��˖��?�x��)/����[[[�fϛ?e����|{�ƍMMM�֬���}�=�@:��jjvB32z��� �j�����0�O?{�QGn/*��O�/��{�pH��������|����g~�ݐ��~�3w��}��_}=5%e��y�e
2X�u�0n���4���~���
�6)��nÆ����̵,+=�WKk��[i@��O>�R�7�yWQ��;W�]7e��y�4�mEE6m�z�O=�|F�޲����4M�3o~T�s��?���Uk�$�'l/.�n����A��#,Ð��?���1�G�\.�P[[���dE���{�J�����,^��23���yl�nln�ر����;{�<^���ۖ,[6q��G��v/\���󵵵?��/���$'%�~��w�<�n�FR�}2eҤ��Җ��Ukֶ��!�^|�U��)��$I�f�.��⫙I��ť������-۸y3���:f�Q/��z(.���kh>t�~(6��v��oL�ɫ���9��6~�ɧ����������UT����rǝn�k��9��'&$>����s�9�
�7��'�����������!6�'�� �!Q�%!�[��Blہ���K/=����Ro��:�����"
��i�}�����VTV�~��?pK[ۺ
�������ק��\w�ՓƏW5
�t:�RSB�`�ܜaC���J[�vݦ-[Ə������
����:��~��r����`���f˲+w�(.)u9��&L������3��i���3HfFFWgW~~�ss�/�LKQ��˖ggf4��SN9��Ǐ�����ߔ��
f\|��W_~Y4]�d���n���㏗$)..�yUE4p�0����A��=���cY��3ϼ��,Zt�e��qۭII��f�I�v������GSs3��'%%�-`Y�������ON��\}��'765�0gnVf�n����/��`��ѣ���?ot�\�]p�9W]~��i+V�*+/�766�߰Q�����}�m�������^+��t5b����y����o�9�ԓo��M�=]V䂼�+.�d��UU��#��$�N��_7��i˖�6b�;��V�Yshoe˲�ݐ�81y�������⊌��GL�t�����֩����t���q���W����Os��o���.��of�2tcʤ�7\��c�:*??�7\�t:
�)s۲�Mm���:-��01�nD"aUQm�&�� �ƭkZ0J����viyyEE��i~�_QEU��,�JŎ�ή.2�(��-��gnN�ͷ����t����yqi��#0����ܜ�[n�����4ͪ�jC�]N���;��D�IDAT�2x�}w�9f��`(�iZ(���aU���GӲ.���=���޲u[||\RR҉�{�]wB��Q���&����WTʲb��iZ�aDe9*��>�������1�Ȳ���z����%��eɲ�F�i�*�a�*)+���limMJLP5���b۶��q�܏>�l��<ϛ��z8Q5������*--�y���--���u�i���D"��@�$�˙���o8��ӂ��e�$���z8ܭt�D�<�d8)��ϲ,��UZV
���0�@�4l�d,o�eZ�a*�B�:	��pH��/������i�p�7f����-�4M����D"�m�#�P8�66-KU��5<ϧ��U[W��Դ��8>�(�b躬(--��_+�3��P����Ƙe����00,�pH^�B�'&Q>������pܱ�(~��gضO:���-Y!E��#���Ǎ=ڶ���w>��Ah¸��
6��_J���ӧϚu���c�	c�:�#�8��3~?�i�X�32z{<۲B���x�n˶�	�B��xC��dge�}��}�QSsSnv����Gu��o���xN?��!�>��w�zKBB�(�~��eY�(JC;z��w�k�fRR��M����
àI�3�1��m��v[���� 0s�%�����W��߯ߑS��X����-�~?DZ�4332A�q�܀��r9>?���pp,wԴ#����{DQ��������瞻�K���9��9h��)'��;��y�i�8�N��ba��X���t����~�6o���|bb�Y����s/��[�>rZrR�����].��0M��H���v��RS�<���s�oܴyԈ��ٚe*��eY1/2��r;!D��~l����8����e���У�G��s�:s�����=�(�����q'+Ja��/~�W���J��'�]�L����4ڜ���a��m��t�}��:y���1CEQ�!M�b�9��U�`@�ᰢ()�ɦe9���{���G15�F�QEQRRR�����ضA@���W���BE	Cd��3��n��8�Uu�6��AYQ9���U�\N�G�~�~æ'y��f'D���[.�X�yӲU���?��O���+�ij\\�^�aBdžBa���TU��.���v�m��k~8!�2�M�n�KE�&����r��_�Z 
B0$W���O���&91���
dn?�j�&�q�{pHRye�+3޸�����
C��h4*�"1�S�aY�D���#Nn�����]h�f�a�/m�
 C����LMIno�`FQU�ł9t� ������|$���-[�[ZG�!+
��"A�-�b0�׎���6�~�q"�n��0r˲���Dt� ڶmYX�}N"��|�E0Z�a�I'o�vBB<�0 z_ ���C%Q�ŋ>`JMs���{ߵr�x-I�0ɇ��^������%��ʛ@׍�$b� -���h[��K��EQ���"�o��e/�~@,�'i����n�f$%o��� ���0q��c��È�i�ݨ{8wR�?�/��&�BU�.��@ ޵�-�h��{)WL�,�E�/!&�1��F`��J��h.����b5�=�֊�1F�dgW��q�e�
(��F!��/�l����ű!)z���2.vE�My�Q� ���hv7�]����'���=����՗�{�'Uu/��ۧ��,�Җ�tPT��5��X�yj�3y�$/�kl���h,�X�Л4)[h[��e{��:��8�0�.PZ���c?�Sn�{�ι�~�Gۈ�;�Z�\;zK��Hq���:C���Y�F�~~e��GTU�����{��Տ��8B�ج�S&M�%��'�W���p��X�uI�2����1/BA$y����1�E��J4�ס��֥�8�	�ôz��`}Ƙ�p8,��GU�X�9��UUMäc�!��r}�I����xy]��yJ�Oh4�#���\��_.::yB�^�E�\-]�7���`�3�(�9���p�>9;A���
�W�\�0���v���^��d�І���'��I��\� �+�i���<�h�3�
��V��]�Ql|E���� �隮���U_üa��н\� �+Ƙ����'��\\��3���?����0�u�~��oc'h��apד�<��[6:�
���?��h^D��!D�wE�U'���s�W���|j�Ү��ǧ��F�cb� �D�&��y
�2��MAhb�n�pQ�#��A��t
� ����:˲�$(���DQ7����.d���\o�	=��>���D"�h-�شЎ|V�
�!�e�� ��<�P��2���eY��i��A�'[{�W�s�&!I*)-�������3g,���p8l��(��#�a����V&�Q�`����$Q��UX�A�u�4k��1�y^�u�eDQ������mv��
�i�>EQh�+�e9�UT��yD�ܕ]� ���i5���ܺ�:����zC�pyyŀ�ϯ��Æ���i˲�"�&��"�����qM�OS�<���j�FJJ2M�,�"!$��	��t��O�<���o=����Y��Y��c��)�,#���;}Y�U�an��W��y��ĄD�0����
��,��<�Ni����(�t:=�������:0-M��h�SW�pVW�s��'�e��`��r>��?Btè��r��6�����b���E��/��q\]}��i���Ɋ��ک(J$"?�����<e�D����yY�����l6���.��4�@ ��xTU-:z���|�%˖:�or��P(��0m@���j����@ --���c�>0M�4�1�0����"I���EQ,���D"C�4m��C��֝�,:vl���!�3h�����@�%

�H�W��E�(�].�����RS[����t:�k�{+�;:���l6Q+*+EQLJHPT!�`���\[W����(��b���۸y��>�0����
��TVV�#C22L9��h	-�3Lcc�
�_��������D��k�UU����z<�������Gs��"�>����MVBȾ��[�o���?�z�ϟJ�FLH���C�FB�q�@`݆��pdvN���M[�m޺�����~��W^{}æ�{���/,�q�u�������ɩ��OJJ��?-=Q~���e�qq�A"B�[Z�z���ʪƦFQ�]�t�
s��.��1�G=�����M�M,�~�zuuu�E���$/^����f��/&O���ǟ��ƛuuu+����������C2TU#�<�xɑ��'N�92;7V;^hia�{��kk-Ki��{����e�u�vI�^��k���;v�.���~ݵ�/yq�}���{�ȑ䤤g^X���hm9b�0́�[ﬨ����ƍ{�lٶm��]�����TE�+W�/X�aC��? �?�ӵk�
UU۟�3rD��c���0w�Ν}��񒒚��1�G�.��c����r�h�\�6�������;��6���C�߿a���G�32ҟ��/��k>X����ILL��3�<|hæ�iii;v�z�տW��)((/�hFd�Z,��P�\�Ϋ@#Ƅ �alV��nWU��}WQ�'��٣?�/%�9�9}�����oX�ᇟ���#���������=p ��⇏��~����?����3"��q������2g�������ȑ���>\����~�fϜ9e���c�}��o~�E�6n:v�x�С{��߻?��ɓ&��W����?4g挽�����1M����j��r�M6���U�hU���xkpFƔɓ�u׷�s��Æ��~9|��@ ��*�0��'~�Ò���_쩨�\��_o]xB��3���4rD朙3i;Rj�~�fL�1mړ?�qN^^AQ����{��746�3ߘ?����B�޻��yB����22����$I�e�c9Y��p��b�z<+V��3�����V�+JE��E�=��Ǟz��������,��c�>��Y��?��_�1��+�
����f��w��m�‘#2�s�횦Eu�A��~p�;�Џ�u�_J���#?48#���$J�,[��˲�[�$�"E�cA�y�gX�0Q�h�Y�u�z_4))�g����O��`08g��?Z}����)�}>��g�����~��ֶ6C�iiEQv���x��GƏ�Dx�WU��t)��q���!�����
2�~����"QlV��̟��	���6��!Gd�0LBX�Eq�"ȴZ�����BxA��I�?��?�8�J�AHQTZ���
�ayAxcd�&�0�`�%K����q�4i]׉iʊ�`�d�Mo��Fdf���OВ#�z'��u����9uʔ{�u����`�q�i&11�A-���t����0�$���!H%�0E�D"�,����i��k�Wu�y�/((�я�'
͛{}��� ����y��eK9�[��;w��Xv��w1,���/��5j�ج�����a��Gff>����	���������u�L}�ׇddd�����ɚ��;�v�����y�=I���0���(�AI��<9z�OQU��a�a D8�mmk{{Ż� X,V��u�y�6l�y>���O?n�9��;g�?����g���eM��
�v����2i҆M���k��p�w�ݻ}�.�0l6;fz{올�/}~���޳(;/�����tイ�z�YU������6���5,]l%��klj��[o�t�-۶q,GˊA���b�1�4���re�����u��o���?|l��K�-�m��ʪ�g^X\V^1w��뮙��3�=����;�~�/���-�d�
�#|�����v*��Z����
4�n�u��������z	A.���t��8�����3����'M2x�$��TTVj�F�ﷵ�y�^�e5M����n��v�(.-
vG�a�Y�6����Ǐ��;���J���KN��0���JI�������=v�j�f��e��p؛�>�7��q��ZVQQ_�0t��x�Љ��~����I���J�$%''wtt����׏v��Ȳ��$�������qWS[����G�����<7��2�G��8���8���V����N��I��ڿ?�V��|k[[Q���$��c�Z��l���ݭ��>����.���,)-�oh엒���x����]���P8��֦�0.�S7���xY�;:;���/.���[S��S���
���>���񴴴`��H�q���	��Ѥ����(��P���\��v}cc�i:B�0]�A�X�G~�č���t�@ @�/�"fp$"3�q\�;�$���ѻg�V;�1-�XQY������~�t8LB�K���#ME�4E�8V���
&C�$�{c��:�Z��TU��E�(�<�i��j�U�4UU�XVYVh�?�at�@q��jvN�(���d��|�^Y���D��&�zcZd�v^�Vp��%Q�u]��r�1�ۄ.6��
�����Uh�IEUE�?�[�+cD��
��v{AQђe/-y��eB�0-� ���몪�v�h�]�]8�+x����f�y�����.��r��yi
�nO2C��g�JNJ���@]#�dEA]�e���	!�H$:f�����|gg�9sh�^Z�����o��2B�a�a�Pw�s���K/"˱SPE��Q�1ƺa��H�fa�����}��WM�|�s���MӤ����T�V!��}U��ӫ!F�I�ot:�K�m��ݘ(f��i��������`(H�h��`(���)��oGQ�Ǡ<�4LӄѼ�Jw>�z�G4�c�<U�b�1ž�L�QLyY��:z��h��n�[b��{{�K&�/Bg�lω�1o��q��i4L�Z��KV��b��i����B�IN���NN��k��p8�0�3<Y�7�#�ǃ��4_�iB�~��k.��~���iF"�˭*V0� �����\����g�	!����e�b&�����u9h����e�L�8\CC�E�!��t�l�j�l��\�!6�---�Ѝ�s	�1�X�>�ҝ1�C�&�"��0��gv���?s�
�F6�B������"����Mp��=�*+���p~h�O�0 ����|H������<��A�H���R���$�(�,���3pY�=�GS�B�_������e�7r�z��I������$��k�IzA'�eB?���<O����Jg�6�l������O��A�'�v>��0M�>C>2�4�
�N�N������tLM��΄�g�]�,'x�!&!̙W��0��C4����X_]Su������z�ݺ Ǯ�i���~J�i����YQ[-�Ą$A
���%ve/Ίp!�s�g�n�363w�2b����>C�C�DĽ�V�UU�$���(
����f�r���(�W���bƗ�T8Ӣ��i��*�ӭ�(As�[-Y���6}˲��|]kѓ�jUU�0�;��>є�Nj��ݞ�Cx�����e�R�4]��� p�B�t�|—,��
�
uu��JN�#��UV=Z���a���T�|�al6k(&1۬V��.�9����KLL���8���E���a�F�c�a�p8�/;���h}}}\\����1������!DK�
�PS[����r��{�U
�N����V�5:��D=v��r���wtt�].8X�=|$���C���������ru-b�2M��e@�9���TU�u�&��]�4y����kii�xN`��HAH�t�e���Ì>�N�ڥ/B��9�|�tb+���1����(�%�.�cl�&A#�q\s �s�n�e�G�JAt�,˶����ֹ].QB��}�k==�o�ےD��.3}�0L��c��"'�U�v��|���$���:g�^�-���>l8�r,������%a��IK��o���")!�a��ʦ�f��K�w�8m�Y�
�B'N�N�0��f5��̌�(/>��C��CGh�.�0����l<�G��^�1,�665�\.���?��X�mhh8�#�4MQ+��~��?�8��-yq9-��x�n�(����K�/�u�j���^���r�֭��ٺ�^��f��#��v{�A4M{���M��<�����y�N�SӴ�,
�#�6l\�~��j�4��td��㭷���=��f��u6Z,��}>��頋���>:;�4m6��o�u8����h�����|n�����v{<n�۽n�ƕ�`��\.˲�a��έ۷WUW���i�.����s:�G��j��� ��#���劾d��c����j����b�x=����=n��펦��:��o��e˃���b�{�N���{~�Ң�ǂ�����V�w͚�y�Ǔ�w�7��z�[�}^UUg�f������|�(B\.����z��p2ZB�6����x<,ø�n���sB腥/����|>�����z<z��[�n�K���ظf�������A�~kk�aR�BAMSt]�D¦i4���&z�`�i����e�����u�u6����Ξ�ݼUՕ))ɂ <�b�;��^���͵�-k}C��t/�����?�?r9��H�Ͽ���,��qj�]����rn
>!�4nj��_�*�������(��ƛ6��[o�8�"Y$I���|���q�=������z�����gN��|����7߸��v[��aAc�4�Z�nee嘬��\}��f�����1%hy�+DA�{})�)�
��`�����8p�٦inڲ�pA���c�L�$˲I��f[�~É���W]5a��O׮;�����zӂ���ZUV^q�-mV�a�H�ݕ��̞���x�+�&'&޺p���ӎ��ٷ��~3�O���ܼ���6xpFFzqI��9�Z��7o�|ƴ��ܹ{��!C����W�o[�p���سoϾ}'��~�u9�y�G����#2�kjE���[�V=�677���w��LӴ�l�%%k�o�6�nZ��������%QZ������i���m޺�Y3KO���V���:q��W^MNJ���[��j���[�N7��ɓTM��ß�YW]Ss�ԫg���gm�m�MM��40MUUA���UW_��~��V���~ӂ)II,˾�?����p����+*�n�n�X��=��v��b�Ϛ��w�۵goss�溜γ̃Ʋl0�)��(��]Qd�ݮ�r��eD�3�_tם{�Ą����ο�ܼ�^���&O��d�C3��啇�N����-�*��vmi�h@7c��I��l?fLFF��ba9a���u5���N�z���6���[�kW�ɉ�/~v��#Gy�ާ�{A��ꚚW_��$���B4]9"�����o��1�p�Ǐ��{+:���UVU�����%�2m-��i��Q#G��⽜�<Q� ��&ڒ�r:��啕�7n�8c����‹���m���m�r��r�a������4-9)1#=}�K/WTV	�0&+k���7n���V<txؐ!4c]k[��K_l��t�\�����r��.���+(*z��e~�w���yo�(�<ϗ����ONIN�X-�W����~$?ێ
�����SRR,�a���r:�������w2���[o=v,'/o��/��=���>������W�m6�4	1�_�L�#�c����|�&%%�j�z�n�0|^o�~����j~~��}�����5�DQ�4�x���r9���>߾c�+�f�1	1Mb��7��]�=�s��+.Y�ު�,s�t�Kǚ&E��՟�ݟ��������F%�
�1d�`�Ųx��ں���{Rz�����ر^\��y�l���U�		��y4��A���9�~RLU^�e[[[o�e�i���y��/^�R���y2�c���,6�n
��1
�8-4-��/9��{-}�ezCK�7��]��ϳ/,9^Z������@�!�]�/��}�ﶴ�+>��-���a�ϱ��竮��4-(�<23s������'�utv���vt#��s�(
��XxI��+*"�

</���1E��8M�<Oz���#F̝=[�DB���8^\R_�P]S
��kj�F�?vlNn��f�KMbYv�q�.�9m@�{�&'%��8���c�:����y��Ç
�~�
�uu���?��{
8(m����'O��i�c��}���o��ݑH�4��Ą�sf>rDU��a�F�c��̚>}@j�g>rDӴ�C�
H�9cƤ��egwtv�WTD"rs `�ٮ�<���7�_;����f�k�DYy[[��=:s��8�?���������3ljn�9<!!���&+rCc���-m�q�8�����
���70m��3Ǐ7d��	�ǝ�>�p��ё�����}gѢ�C�l�|���iߘ?�օ7�.�
�`�$����o�;{vAaQ{GGmmmgg0;�k�N���d0�֊w�^�����QeU�?�}o��	��}���nvn�!�3���;�g]���u�=���r�i�����[,��z���q\l�t��Ǝ��5��p��oh��}��^����rdz��L��c�����]��qZkj��|q��GY����7wnj��/��5����ޞ�����{ѷ���n���߹�~�EU�V�#?t����2����z3�=���w�wL�l4���=�b�͚5jT8�
������n�_�=���%'%�y��O��Q�����q��DUU�$�����3�]g�����NjK4M+=q�9`Y�4�`g������-�1�ǎgY��v�=���Yp�
�P�;�������41B��p����xqqEE����SO�9"99	3��v�fMqqqKKKGG缹�{���˖s��5�0��܆�����G��
�oܔ>h�E�!��̿a��S�����T���8��?(-=1i№���n�/��ֱ,�������/%��_;��'�Ąq�ZZZ:;;;��P(D�p��z|\�����o(.)illJLHp8c���o̟������
��5k7m�:g�,�0^hhl,-++.)
��UUW766!��sr+**X�]tםt����v����u6��/.)�9<h�;;;Y��q(nmk3��w}��I��h)=qb�Ν����]3���n��ݟ�Yg�ۯ�:5�ࡂ��
7
0@Ŋʪ�\�\ZE�p�\N�k�=���ry�G}c]AAA���Ӯ�1F���p8�J��w�-�˾A��4m`ڠ���c%��3�f��{�;h_N6"(11�T�0�
ӈ�r8~��>����x���nؘ�_P\R2bx�
�n����z����m�#�5rD^�䤤��ǭ߰�p~��j�5�p~~uu��S�����������X^Q�s���������ݗ����qc�����դI� ج֦�@��v�}��W�,�{Ϟ��^�c��8�?}� BH]]���|��M�8ᳵ�F�9|��O׮;^\<0m@[{��_#
�t�!���r\]]]Yy��]_\w��o�8���cN�xܘ1sg�ڴu��ݻ���y��x�w�5��8������c�n�0n�?/"�k֭�Ym��m��X��ol|��7���gϜ96++)1q��u���o���'�x����Y3fL�0!��۱k�a7Λ��Լy��U�ՙÇ����18#���|��	���� �1"�8��Ą�՟|Z^Y�Ӎ,����������444N?��v���=v<>.n�i�|��'��<x����w��u}�ٛ�n�$	a�7�Om�a2�
�;xp���sf͜vݵ�%��ǎaY�9�2q�x��WVU
JK�߿����=�����1cD��ʪ���Źyy��z��ӭV˺
[���gѤ	"�Ț���6�w}+99��������ǎ����&Jҳ`RbRsssaQaCS�����U��'�F����E;w"�!�544ج��a�<n�(
6m>v�羑Ao�,������������9�\t�a��&E�t�C���sL�h�ә���L�v�1csrs�ػo�̙3�_'�2�f{�@p��=|�Ν;���h�xB�(�ť�6�-)!Iպw���A�D"!������i�0���`DZ����H{
�<�(�$��p�NA�e���C:,��2eYVUU�ͦ*
f����ƀ$�fW�ڋQ�uQ�4M�0&��D�6��~�!C�9��eYY�mV��խSU�1�0c]�-K8��@�DQUUM��t�#�L�����b�Ȳ��?��on]x�	TUeYVSU���D8���c�ZUe���y����b�D"�3C{�B$IB��)�|(�L�D��[#�H��8��i�U�Dx���b����i��3ӵ@��=��qt�"B"�l�XEF��H�k�
u]WUU��E�ߠ�j�u
#�&[�V��Eהtm1�4�V+1M��
g� "
RssSUu��j�sP� ^�{�ՠ�q�G��-�۾�b��VUU5��MJLJLLTU�L;?!DQ;�A�c5��/6!� ��O�Y��/���y�Сs���hɓ�m	"��3�}�0�x�Oҿ'���/���ZK�S`�Y�gLBpג�v~�uQc^FgG;�3]�^?�����C7ډ;�̺�;���7�~�٧�;�S*f:=�@��N-y��.s���n��;B�W{�f�u������J��7���n�^�=�A���16SQ�^���{������M�z�;Yڿ�e�n���#
B?�LDC���6�L�e����t*G�s��
�G��}�L�X����y�Z��K=�;;tzև蠤Ӟ�:{u[����#t�Zf�dFQ�����}!m+���^��-.w۰�ַ��:�"D��k8�=1�i�t�Rz_��Ou�o��%��y�{h�����_��~���(
�)=�}�dQ�i�����$��\n�X�E�$�a�չ!(��a4U�h���BK�@���^B?Ƹ�����͈�1
�ޙZ!@��0L[[ۀ~.��p�i�c�iZrb���DJL�5 ���~A��\>�_��D�~��R/�9!�\Vzi�!�h�v��� *���K�y����A�!�s �@������@��>B?�9��ρ�}�~�s �@������@��>B?�9��ρ�}�~�s �@������@��>B?�9��ρ�}�~�s �@������@��>B?�9��ρ�}�~�s �@������@��>B?�9��ρ�}�~�s �@������@��>B?�9��ρ�}�~�s �@������@��>B?�9��ρ�}�~�s �@������@��>B?�9��ρ�}�~�s �@������p���AM�0Ɨza\@�UWu]�8��[�*�B���l�A8���s �@������@��>B?�9��ρ�}�~�s�K�p�"�\�y�|
z�������s,�.V��0Bwac2!�c8�z��^�X��
���e�ĄD�P}C==\ ����x�a �@w��Z���?%�4��	b&
V�V#���v��#F�0����ֶV�Нi�����ȸ�"�$;�jj�")Iɲ�\�<ʄ��L�,�/�����4͋0B�a�#��a��i^��o�&Aa��	���L.�1�~�M�P��Ʌyϗ�Y�ڤW�ݦ�0�0c�#�<�6_�����i��4MB˲��#&�04�Fc=��0���+�F?�y��c|Z�6C�4B� ����'BHt"=D'Jg�0˲���<�U?�"-	!/���r,�q\iyYɉRQY�x!�y����C� �R�˅���`�$�4L��{�ݩS� NJ�����#�������Ӊ(� �8���>`0�eB�0lj���ZKk�(�=�
B?��>$��si�!DQ������:EUY��757����#�0mmme偖A�X񱂢��@c��zEUek[+��_B7��f\7�������*+B��0�mm�gD{G���Y.���BDž#�ֶ�ƦƲ�C�G�X���U��LNo%�E�p�y����P������UU+����
����������P%
665�q� ����xIIcS˲_rc������6���`0�_T
��#�\r����c|(�p�%Px�(�(�R]]�{�@m]][{���u���a�:�����:�}��`�2,��aA�a����&]��x�KK�Fe��5�a�����l��L�Ȋ�������5*���%�x>
�5��{��uk�����G��6r��ʪ�P(��|KkKkk��M��u
�N�CV���6��[Z[�[�%�'L?)-5��L;N��s��~�Ul&������
�dE�$IB1�A����<ƈ^���jZ�,��m6�4MC�YSe�1�q�}T��A����5�c���z�USUUi�h�����5a�8����xL�,,*JINaF��5~�؄�xU�0Ʀ����J89�Fn�@ﺂ1�XӴ���x|΁�}����_J�a5�5���VT5}࠴�%��y�.��binn��e����`�#~�?>.������VVd3��2����B��XM�l6ۀ�EI	I6�������A�4Q�A��&�'ج]/�:Dz,���u{�=p��N>!�Fx�ΝYYY��^��cp%���������W�v�x>��Pє�A�$��7��j�&
͗-��eY�a5]3MS�q("I�H��BDQ<���=r�ҕȁ륽wTM�8#Y�1�g'�b
�X���yB�c�45]x���c��`玝;�_?���0!D�u���㏐�k,�y=ڢBx^�c�v�MbBLb�ޜ�%��F'�mv��6K�a�H.���i�}����؁�/E{�c�%A4It^'/�!�@��7�������)�)!EQc�tŶ���˱���� ��N>��(��/I+t��b�)B���h��g��f���F]#{�ҼN5ڠ�1�h�#���1�0�H��p����PI��VW� �1�#�$+
���z=y2C(���z狫ohhlj��3������
��.ԅ?Ɔ��lv����#�H�������&u�Ymt����R[S�zq���E��/�	!�aB ��]����v�0����眱�'��u�
G��|�S6	���9��@��Y�h��Or��o��Y�QUMQ�/�<����i�
QMÈȲ�U��g�5M�\'n�$����4���#�pg�$EUUU��NSv\���p1��i	�]��him���ڿ�IH4P4�}'�q� �A��>[��ޮ���h�	Q���
���ÇE�j�ʲL��8�06�-����`�o�}�0!�e���Ҧ�f�4EA6tH�]�������RS���4M�U�M��1	aFUUMӬVk��{�5�+{�}�%&&��E񥿽�g�p8\x���]N�a�a�l6��.�d�pW�а?'w��1�,3�v�%QTT�a��a�Z�����G��՚���d�K�����߯_F���֮M����r���v�-��1VUuӖ�Y�Ga�DA�c�����u��8��%I��(6�UEEQ0F�姿�Ձ��--յ�c�F���$Y$�$��g�mjj���t88�W�b�lV��f�4�b�8�v��z�����j�d�$Ym6��h~S��i��DQ��s��A8,�644�,
�
�سo��>0o�`PQ��7�5*>.n�l���[�ɻ�,ϻa�?ضcg �~�,BȻ��e�Z̻!o�|{}}����x<���Ϟ9#cР�,K�t��a���'�GQ]�w}��җ^>p���,�Z��x�m���7.ؗ����UUW�8^Mmm�?.9)q��-'�o��ز�s����w�ݫi�u�\
��i:�o/���S�(�����ڵ��VY��y�m7o>x�ȝwܖ>(���|�{���L�vm~aQUUU{G��f�<���7i��ד>p`AaaqIiMm퐌�k�^m�������L��t:�Ep��ތn��e�{�Y�o��?+(,r�][�}�i�V��^y���<�x��|�����.�(4�-^�\S�c��o��"�,^���r~���M[�I���⥪�s��\0��н�?�?D!�%Q�].W8~n�RM׭V�a� �z����"��ko�U[W����3^������%�B0�UU}��=���e˃���K^�9����o�X�zx��D����O?�<!�Ï?�tͺ}������ n���+W��^�a�KJ?߱s��/���I����JMm��?޳w���y���h&�K�%����˲|�›_Z�x¸q��ӟ>r����߻?����5a�8EQ������;&#=�������v$����#h���2�9h�Ew�=~��ѣF�{�"�����q���KIY��s���?߱���\s�U���|gѢ1YY^�"�rss`d����?������K�d����x�V�e��kkjjxA ���{~�vڬ��]߼�O>y��i�>8#�ۋ�u�y�MM�F�LNN��w��/,6t�C���oٶM׍9�f~�;s���[��w�����6�U�i�\s�7��Ht�xqNn�?x��o�1d�`��+T�=|>���xܘ�ѫ>����~��+V�z����7E�=j��'�KDШ�#�kj���J��v���/�;�n�G"�P(�‘p8����T
!d��(������4q �
����G‘��ى	��n�3�O�/,�4���m�S&M�u����.}Pcc��(�Ϛ�9|�O�K����y��;ƍ�e��,ˆa06t]Q�`(����ÌD"���������YYY����R����wt ��]_|1"s�(��aD"����p
�-V��n_�a�(����_{�T�8���U(?\���M���߿`�
�f��y���e������1�+����͛����Ι�>h`qi���ӧ%'%mܼ9���ᰧ$'WTT^{���8�?m@��)�'�Dzl��+�`��]�􃇾o��9�[�n}Jrr���9yyu����3g(��a��ѣF������=~��v���Z�r����a��t��
+*+��zu$�y>��0''o��NJ�����\5er �5=s���ʊ�����hoo_��g�|������:233-5u_v���x���}���}>��2j��#�Æ]3������W[W7k�t���tp���y{��k�Z�̰,��P��t~�ɧ�����'���q�p�j�2��:�@�4MEC7TM�y�V�T���ӷ��J����1R�j��;��(�.=<�G"�a,K4u���tj!��P8q:���,�1��E1��4M��EaY�eMӣӱX,��q<�**f:w���*sG�}�N���J��iðN�����'��_�"%9�����C�zo�	��
]�qMQ��[�ܶp!B���$I��#�0��p����D"��;�����UU�o�D"���N�>���gh�F��a�a"c�C���E�蓴�}4�ʲ]�qD��u]��#
a�M���)�N{���U�t���b���۱kWCcS֨Q�II�W��C?����ɘ��{
site-packages/sepolicy/help/files_write.txt
This screen shows file types to which a process running with the '%(APP)s' type is allowed to write.


The description should give you a decent idea of what the application is allowed to do with the type.  If your application type is being denied access to a particular file, you might want to change the label of that file.

It is recommended that you use one of the types defined on this page.

Note that if the label of the content being denied is owned by another domain, you might have to write policy or use 'audit2allow -M mypol' to allow access.
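
For example, a minimal command-line sketch of the two options described above; the 'httpd_sys_rw_content_t' type, the '/srv/example/upload' path and the 'mypol' module name are only placeholders, so substitute the types and paths reported for your application:

    # Relabel the content so the application's type may write it
    semanage fcontext -a -t httpd_sys_rw_content_t '/srv/example/upload(/.*)?'
    restorecon -Rv /srv/example/upload

    # Or turn the recorded AVC denials into a local policy module
    ausearch -m avc -ts recent | audit2allow -M mypol
    semodule -i mypol.pp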
site-packages/sepolicy/help/transition_to.txt
This screen shows the SELinux process 'types' which will transition to the '%(APP)s' type when executing the 'Commands File Paths'.


Under SELinux, when a process running with a 'type' attempts to execute an executable, one of three things can happen.

1.  The process can be prevented from running the executable.
2.  The executable executes with the same label as the parent.
3.  The executable 'transitions' to a new 'type' based on policy.

This screen shows the executables that transition to another domain when '%(APP)s' executes them, and the 'SELinux Application Type' of the newly created process.
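
For example, assuming the application domain is 'httpd_t' (a placeholder; use your own application type), the same transition information can be listed from the command line with the sepolicy tool:

    sepolicy transition -s httpd_t

This prints the target types that 'httpd_t' may transition to, together with the entrypoint executables involved.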
site-packages/sepolicy/help/booleans_more.txt
You are viewing the booleans page for the application domain.


Selecting the 'More...' button will open a dialog containing the SELinux allow rules that are turned on by the selected boolean.
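
For example, the same information is available from the command line; 'httpd_can_sendmail' below is only an illustrative boolean name, and the sesearch call assumes the setools package is installed:

    semanage boolean -l | grep httpd_can_sendmail   # current state, default and description
    sesearch --allow -b httpd_can_sendmail          # allow rules the boolean turns on
    setsebool -P httpd_can_sendmail on              # enable the boolean persistently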
site-packages/sepolicy/help/transition_from_boolean_2.png
[binary PNG image data omitted]
site-packages/sepolicy/help/files_apps.png
[binary PNG image data omitted]
�'O��ž}�,_�|��(X�u���������#��X,r��
�TQLӼ��rpppppp��,�ۍ��Z��������䘦�*�eQ�qpppppp�r#Π�������%�:�+�������B�>W�Q���u!�%�l�k�E:��2_8;=���s�����E��+�#�wp�!�4�L.�i�x<n|/e�L._,,TU!�����T�R�L�\"�#	�'m�BL�$��b�B<n7^�g��B�e����=膎i��}��zV6��T�(��iYd�$I�o�kqp�-�����]�C�EQ_q}�OF:��߾͞���u�%}����_dϡ��O_ƥ*��E_w'��³����G��̓�Z�˲8p�8�v��O��;�~�f+jqI�0M!*߂Y��^?'�����0�J�*
m�M<���,^�]� �,R�4�ß��w14:����~�Y$!�$��3�I���YB�+o��{;v���*�֮d�0�ߧ���?��x�n;���Y�t�K�ڵ�a��J�2M�ʳ��4/N^p���3�J�����'O�Fhmj��=�D�Nrp�>$I�ة3l۳��<�]m�K%�n�B�����O��uܪ���FA*�E�4{�3M�l\�˂�s0-�������
c�S�55���(��g�46r~�^���xI�0�t6�s�>DW[+?x����s7=�파M�/hN4`�&�lM�X��EKcds9��&p�
�i��ڂ�s�#D�!���%�
LL��ֶ�uw�ζ�874BcC�4c&�����������S*���C4Ģ�NL���imjãc��e$I"��v��02�"��L.�c�1��X�e��Pd�FW�/!��S���w�k�Z@���<���65^qL�,��>	�i���hj�s��1,�d�e!��3Y�|�,�b�B�-�.Tk�m����I{K3��O�e�b�D[s3�׬��7���|��:p����Y�l	�}��;֯e۞�<�ȃ4���8�>��s�)�Jt���G�yo�nY������ $�e��cǙ��������+�&��}�"aN��_{���w?��|�)��0�ea�&K�z���W�a��+�.F�i�Ύ��"����]wp��)�'�QE������{�f����?p�r�w����q393�W{��)?	���6����K��FKc#]��vu�x��x���VŲ,�-�elr�_��M�|�~V,^tEQU������p��EW{+��~��N������?��W�bk�ϋ,˸].������	��n���}A�����7�OM�u�������+97<��
�<��5˗rQ9��0!�|��'hkn��Gܱn
[֭������&z7M�49v�,�G��o���4M�/��������0{���hjh`qo?��_��� I��ã`Y,��D�
=Ɓc'�,��Ksdž���_���0�]h�����|��ct��M�Y�r9��sad�������B>_`۞�46�	|��jkET���p;"��cC����sg����	��`��'����D_w'o}��t&�eY�}>6�Z�KU�xܘ��O=pa�˅$	�"�ڟ�����^$IBQ
�gΝgrz�2)�m�A��M*��P,�xf)�>v�=�_��4He���.�ff�\��}����dM�N���8�i	�{�z~�7p�]l�sn�j�$�z����Д����ʛl�}�W�x���4D�躎$I������rqnh���O���AC,�a���L*���eMCQTE`AW�H����ȶ��88�nHBp�I>�1O=t?�H(��eK�8!������p�XfRi�ݱ�t6����{�dAgS3I�e��y駘�Iog�x�I��om��ww���v�2��0�,	q�*�,�lma�E���q��4�c<v�B�ğ~���v�>��V�����5���zy���{p��o�����wn!�/���q��@����c�~���T�sK.������,�[P�\��E��CU<n+/�2M�^���` @Kc#c�E<��0�p0HC,z1LE�`���v��crz�֦��v�?@�X�����C5���o9ܮ�@cC�g}�%徆h��h�چؾ}��bŊK�;88\Ӳ(k�b	�˅�c+�i�NY�+�l
|��R*k����ծ�2�i��
���(2�~UU��(i�$QՈ�.�R�+G-ˢT�PU��FT&�a�q��$	M��u�6ap��ʺP*v��a���C�(�~��Y���fՎ%h��e��T���Q(q�\(��IHȲDY�Q�0���ȔJ%4]��(?z��|�+ϰta/�l�R�\�l��u
���'nc�HB`�s��J�9r��;8|R�q��\�E{�$�xf���,��5o<�e�v��vX�,$�wW٦i�v]�xܮK�eY��yU1�**�Ԟ%��T� �c�<���5;U���稊�v^T�6!�\>]�Z�8X�Ż;����1L�d��t��aU�!��Z�.�U;7��˲�W�3�韛�?Kb�������l�=�����W'���q������\'^W{���sn��kW��َKu�ڔ�HC�+>����8�qpp��eYt�4��v�y
��5�cN����s���988�s���ς�bK���V�������mB̀�,���������m@m���p3q+NB��㰷b�888�<(S�S�<}��3��Í�0
b��,�/�otr����$$�>?~����4�t��v�=n��A*�B�O����"
S.��d3�qK��ס`�̃�C=J6��7�K$qV7!�d���a-\DSc5_�7#�^yOMO1xa��������1.X��*�l�x,NKs�����S(����֎,�7oz��˲���"��
�����F\.^��s]�88\	IH�]yJ��@E��w���}~�M�X,��>�t[�����p��E/\7�`��gN
��z�7}z�D�]O�L
�ntr>�^ʺ%��_����x�;��$	,;͒$!�/F\�����&D-��O�U���7	�i�:�i[�Td庶�o�e��mE��ɭ0H���U��/��"Ʈ��5MLӼi����@kee��Fy�B7�\�\>c{�4m��$����3�;�t�*u�"�@��O�5Po�f��T�a�}�
��DqQ��
���1��;w�Ζ-ܻf
F��ӧ�����M����0<<LSsF���͌��(*,�f���>��4u�ǾF���h��Q����z��1<2���FKk����ؽ��f͜IgW:IG<����ɓ*��t����v3
�i�l�R�;:x�M�����d�H$�TQ�iN�T
��C�4M#�L���րt�5TM%�R�T��C]}=�o����AQ��p�ihld��i�M�������z�&�/\@��Ѩ�'� |e���/I�H�]{�p��a���X�hK/�b����˙��x��'���SN�<��je�K),,�����f͜������Ӵ���p��y�����YYI��\v6�==̪��d2PUUͶ;�p����C܏'+�T*ũ�g��r��vS[W����,-\@<�܅���v�9|�(���X�f�L�̧PS[G^n.w,]™�����a�ԩ�\[_GMm=յ5ttt���G8f���a���%����Q���� ǎ�
X�X��BEy9^����8~�$���t8I&����8�vJ��i�蠢��iS�p��q.TW����w,�B#A�RcM����iӦ]�㘪��b�L�X����;ihl���<�,�H$�F#��L�aUUQUY��I:���T
����>6�nl�����4�D"A2��d4"�r���d�N�N�#���0�[v2�BQd$I"�����D�x,��lFQ��G���?vG.I�h���F2��>������,#�rf���d�h4��m	�N����r::;y��W�����s� �p�$I��������t:�fs�ߚ��ǩ��eɢE�zR��^����/��X �4�1�c����>����a��(�gE�X,�}O���}�%�
�	�t�@&x_���S-��9��~�ߗ^�Ȳ��j�ly��!I�5��8u�>?�+ҭ"�� _��J˻v����3���4M#//���9�t:JJ�3�A��_��3}�4�n��I��
��{A���u��@tc��'� _�L�Ę1AA��)�@����[]AAn�@ ��RF�a{T%tEXtdate:create2013-09-03T13:19:37-04:00���5%tEXtdate:modify2013-09-03T13:19:37-04:00ʚW�tEXtSoftwaregnome-screenshot��>IEND�B`�site-packages/sepolicy/help/login_default.txt000064400000000773147511334660015441 0ustar00The Login Mapping Screen has a special Login user called __default__.  This record is used to setup the default login user for any login account that is not specified separately.


If this is a desktop system, you might want to specify the user_u or xguest_u user.  If this is a terminal server, the guest_u user might be a good match.

Then you would need to add the admin users, or a Linux group, with a different label such as unconfined_u or staff_u.

You could use %%wheel to indicate the wheel group.
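
The commands below are not part of this help file; they are a minimal Python sketch of how the mappings described above are typically applied with the semanage login tool. The SELinux users and MLS range chosen here (user_u, staff_u, s0) are illustrative assumptions and should be adapted to the local policy.

# Illustrative sketch only (not from the original help text): apply the
# login mappings described above by shelling out to "semanage login".
import subprocess

def set_default_login(seuser="user_u", serange="s0"):
    # Modify the __default__ mapping (e.g. user_u on a desktop,
    # guest_u on a terminal server).
    subprocess.run(["semanage", "login", "-m",
                    "-s", seuser, "-r", serange, "__default__"],
                   check=True)

def map_wheel_group(seuser="staff_u", serange="s0"):
    # Map the wheel group (written as %wheel) to a more privileged
    # SELinux user such as staff_u or unconfined_u.
    subprocess.run(["semanage", "login", "-a",
                    "-s", seuser, "-r", serange, "%wheel"],
                   check=True)

if __name__ == "__main__":
    set_default_login()
    map_wheel_group()
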
site-packages/sepolicy/help/login.png  [binary PNG screenshot data omitted]
site-packages/sepolicy/help/system_current_mode.png  [binary PNG screenshot data omitted]
�^�u��;_��b�?&����Kv��AQa�k�գT݄�
���7r�e�	�O�P~CY,�m<[�X_�{����nYѷ�tk�q��}�+��M�h]�++��I�*}+;-����8$�ԑ}殪YW����/�rLGWdKyGz�C�L��n��R���5�eAm~���[7��8���#���/�Lr��t�4���B��Π�c�[^}���տ�-|��[��*��Kߖ'yĎ�g��c��{|ŝ��Y��77?t��H8b��R�=P2�첅_��lL~\Z�-3ٙ���ab�o�M�����S�Buü���ͺ����%�5n}gkv�c�5�
S\~Y�K�P(���+�LJJ"��X�����9��p���o� X[�q�ѹ��I�.Y�q��J{{��p��؂��X[{H;o|fn�C�`�fId�H�$� ŝ+��3'��~��nc�op���!ɋw�,��jYxJ�Ă��555m��t��~	��̍�bjNj��
���A�LȴlP1�18ާP��0L[{����9ӕ2)G�aZ[[/s'�m�"���&��e�5݄<cZ�E�/k��Eʹ����gU�Dꆥ���#��2<�k�	 p�fZ���K<˳�ĘE0щw��j�!�!��:̬6����I9���̝���M®�
!$Q�����f�RA�A�_A" ���DfA�Te��H��ڞ��=w�B,h�	�ϱia��C��O9V�ӑ>�� �O2�1����%�#��x�<������݆B���3��DQ(G�����|j�j�}�i�5��?��0�D8���ʱ�>�O�IIJI�O�SݧP~0 k�P�;��CV�:�u�P�$Pѧkz��ެ
��'�ΨR(J��J?�B��:��S(J��J?�B��:��S(J��J?�B��:��S(J��J?�B��:��S(J��J?�B��:��S(J��J?�B��:��S(J��J?�B��:��S(J��J?�B��:��S(J��J?�B��:��S(J��J?�B��:��S(J��J?�B��:��S(J��J?�B��:��S(J��J?�B��:��S(J��J?�B��:��S(J��J?�B��:��S(J��J?�B��:��S(J��J?�B��:��S(J��J?�B��:��S(J��J?�B��:��S(J��J?�B��:��S(J��J?�B��:��S(J��J?�B��:��S(J��J?�B��:��S(J��J?�B��:��S(J��J?�B��:��S(J��J?�B��:��S(J��J?�B��:��S(J��J?�B��:��S(J��J?�B��:��S(J��J?�B��:��S(J��J?�B��:��S(J��J?�B��:��S(J��J?�B��:��S(J��J?�B��:��S(J����ݖea�B�~���C�cc�1!�g�V	@� <�{V�4�_R(�B������l�(B��B?q$���8]��x�1˲v�
1(˲��?-Ē$I�6MK�e�0u�����VB#˲�iT�)J��ҏ1�y~ي�}:/��6�K/�8β,!�Ah�;CU�nk���e��1cr09�|��X����ٯ�صK�.8�����cY9��x�_�vݧ�������<�S���L�<P�1��0M�$U"�WT���5!!AU�o�����=rd8fY��Bi�@�P�|܆c��M�6o�z߃ee�:$,��!�t˲Q����p�ع�o����n?BHE�Dz<�c�9�cY�eYI�X����g����[nݴe��C{$"��(�@�$�a �<�3��(�
���>�P\llA~AEee����Y�e�a��U
B�q\www]CCm]}$�y����ޱk�M������q\~^���@I�t`��(��B�9�O:���#�Dǩ��DRSSO9�Ĥ�D���ڛ�v�Y�\t�#�?G���G�h�v�_��z���ֶko��_��"))�ŗ_q8��z��G�x�aEQ
���؞����nQ����~G��~��r��jzSSӳ/�TV^QX���[n~������;����w����c�wvvUջ����n_�l��_��l6�vŒ�Z���G�j���UU�Ə�v���'%%}���N��sΚ���mmm��|�Y�����ˁ@�Ǟ����YhWwW�~��q��9���.�Bh�Ѯ��#�H?Ƙ㸖����I�ƌ�ҳϼ���s?�x�ر��zːA�>�?��SN^����x�����s�:���p8\\Z���_lL�?�u��C��ァ�����o�l��遇]�v�)�f���ږ���r����9�s8���FKk���࣏�y㭶�v��y�E���{O�9�ܳ�,ھ��/�BϾ�RU��7�}w�	3�b�^{�`(��ֆ1�>U��v�c=�z}>_x��_���O�u����[���Ԕ��?�pڔ�s�|���y�B/��>lhzZ�����C9bL��ߗeY:��#�?�g���u�|�qqo�ym�����oӦL���W�Y��/&''�=:3=}�ƍ�-Zd�����B#�
kjn�B�n���8��P8�zǏ[Q��a��Ӧ���{�`�(��!�y��Uk֌1���;
�����
<� /����f�
��N����
���9�?fL |o�]�ݪ�N�6�n���c�eM�bY��8A�!�n���~�e���E���l�hO��gf��A<��$B����0?�B'OJLL�+�
�~��B9�1��;�wj�F_)��K?!IJ���X�x��@lL�i�ӧM���JJJ7f̜7ߺ��s8���N[{[CcS��?z�H�eO?�K/�hڔɥe�}RSSSRDQ����~�������N]�����y��;w�t˭	��>ߍ�]s�%����]|I���=��0{vBww0���4#,˝]]�P(757�fg�;�ߜ���dgy=�\w}X��de*����Q"�a��]�r��n��u&&&0s�%��q꩚�i�v����P˲x��8�J?� o���P(�W�Z5x�`M��|� Uv�=%9���WT���z<�ÇK����M�n��|��[�ii%e�5��I����cYvwuu��ꂼ�����V�:;���^�c��C��B���m���'U�u�a�پc�����޷� #=��jwEeeA~AvV��[�nWzZږ�۲23DQܺ�h��Q�al߱� ??&�[YU�i��~��sϊ�X�}����ln����qȠA<�755�߸���]�!���������w�IIIKK++/���ڿ�m���%$��)��DG��i�v����Q(GrCR���,˒$	!�1�Æa�\��qgD�<�ԓ�HDEA�u݈D"仢(D"�a�B�麦i6I
�2�q�a"���f�u=Z"�f��,0VTUUUQEQT"EUv�a�����6UQ-��6[0��l��jϽ�2��K$I|���$��i��a�χe�\�(�c˲�� � ���0��i���P�M�4p�2�!���͎�q3P�SD���~�B�4�� Ƙ��r���|���� �ȲL\*!���D"b���` �0LDU��P(�F���0�͋"�!�B�098��n����A�e
�o�4v�g���5M�!�4��M�4EQ�h!�RɃڣ��c!���-�*8�Ρ�y1�4p��S�>!���[���3��Uޟ��~F?G?D�L:�Y'�<�ԓMÔ#�0�;UT�!�=�=,z�OJ96�j����v�a�K�^���t�	[����&��X�q
��8 tx�;i�hOINq:���H����c8��Zo(`�dT{L),B�A��`�X�����1M�w(c,
�(��X��G��F��1�a$K�o]��0
lCo�ʑq$�L|�)���t�[q�]Y�_hYVT��)ݫ ���D�o�aK?��e�1!	yP��<�/��Z��v��$x�aM�~;S�F�I&�����SO[6Ǯ����]aY�0B�qd�ӳ��+����P�AO�I���n����"c+b�$R�z�)�;���Q��ԂO���Z�gBcI���4���J�ܞU��<�]�a�=��n�e��
!"ZG*Cje��d��������`0���D���= ���l�薞'$GF�clY�=~J<υ�2�0�p���ۿA�?ā�W��I���ij�&�{�`o��_����A�Q?��?�;���(���Ѷ��Ƙ�8A��pOǰ=�On
��a<��yhmk����TUu:�.���tJ��J�D������&�p:n�����nw�\n�{�?`��N��Ø�����q��f#�d��d�M�v{����X�4A����DѲ,��N���$��r�\���fY��\."XN����v{ю���q�e�^��a�;��O>����4&&��px��e-���z<��f�0�6�-&&&&f��**+~���W���v��N'9��cl������Bn�+���y�p8�.���"�˲���$��@ �qӖ��8�a�x���ή��8ҍ�,�����v���&y�v�m��%˲DA���d�$���c算_�g��l6۲+��X�x{��+�<�<n���<�0��0��6o������v���r�$I��l�K��t:�6�eY�i2�v�$I2L��8I�bcc�����ccc��^���?.G���a���ʤ���,����s���4���@aY���_�n�qS�TTUmڼ��j�e�E�#���mL�IDAT���kj��+v���8�v�7�UT�$'��sWqEUծ��ٯ��1^���1MS�Ʀ態��D���$UUw��$�Ƿ��|��a��G�]]��SRRvWW�h�q)��5��;v�Z�q#b��[��WTdf�����_����!))Q�+Wm-ڞ��������ɉI۶o��9����1-3&���ܼ}��M��$�ǹ\�@ �ݢ�kj�TUMJL,.)Y�b�0..`���Ͽ�i��'���zYEEeUUfz����[�z=�ݾc���[�UVUY^�d1����"=�Hdɲ7mv���$m߾m������������\�z�,G��R�z����t:��b��fAX��+*c��������o�
�™�mm��;w��8��r��YRZڳtajjJyeբ���7�$';����EK� Ą��EK�j�VW_���
].w0�Y\�v�6�-.6VU��}_�P���vg�@�Ç��(7o�,�� �ƍ?�ߠ(����oH��Dz�Ҳ���v����$'s��ѱb��NKM���)��(��X�t�'�>']I�F`
�B6���-3����v�y�����0Lkk�H���O�2�λ�eYv��m+V����7���Ӧ���kE;v��w,\�d��I�����[[ZZ~ذ!##��{�tuw���A32�S��B���w���x]�m6�O?{��w��?�����u�}����$ے��u����@>��ˉ		�z��-�~�W_y=��_y-%9y��M�6t��i���yϽ�a���~�͂-۶E"�M[�dff~��{�4�����o+�n�I����$Q|��w"�HM]����N��#�����x˶mn���g��HO��HnN@U�E���e9';���>�aÆ���]%�_/�V׵/�Y0x��<��0��bc�|���1�B��÷~�&9"��7�}��{��Ą��+V�����{�@�x�R˲�[[+w�]���]��b^�;|+W��4a�#�?�r:��X��x::|=����)�I�����v�]<�oڲ����܏�N�\RV���Æ���g�b��%Q�$i�…��>����PRV��ܼb��EE�O?��ę3^z��@ XQY���4b�0�h,r	!�w�]��Wύ�7s�-�ȓO����~�u^�K���+��ssn���N�c��E��%�'<�����/��l[��ޟ����lhllok��=y�9� D�D���$*��?$D���J� Ė��W�t��>�)7�t��fkima���>��f{�*���������cӖ-}
�~�
�������v�	���v{jJr ����>thZ��7m۾}��	���II�@ /7wİ���;���O>�ĥ�W�����U�{wIi��n�<q��o�1&&�/�y��ʪ*�e�$3#����� ���,5
3�(+V����2x��g�q�	'L;6'+�_��57'�0(6&��K.��/W����+WY���N;�dI�bcc���4��zbb��ǎu�w��Q����w���!$�B� ��n�^г+�SO�+�y�U�N����Ί�
����R�$��lʶ��xaY���J���?��l��3�̾��s�l�an����v�����?��ф���+V��r���-����X��q���		��g��o���퓑����]3gN]}ò��z����o~s��?[�v��1��f����m6�eo�����KQ��۷..F�������ynPV֓����~��<n�=w����j���=r����8���䋯����9���W3gL�H��}��s��^rqiY�sM[,��.��7�<̱���s� �Z�ڶ��ynwe�0�cp1N� 3gРy���+���	�o�����jYQ��s�~��-۶�xqfz���~���n�j�RM�&����9mZ�~}�ЃV�UӴ�7
�����Нv�Gh���T-
ʒl&N���_ܪ��={�0�y�����DQ��-I�$K�H$�J������;�A��#���H��c����Ϛ>��O>-:th�ȑ���̌��>�xm]���e�嚪ڬV��1t��??�E�G���A0������w�q٬~���|ބ��+f���C�
�#�PX������HD�u]�
M�‘H8A�t9����uuu�P(�RR��x�Iz��?���aD"R8h�.�8�����í���MM	�q�$�¡�D�i�Z��\��'��9��8]�UM
�B�"�|�ں����C���Nd��Ʀ��jM�uC�B�P!`�Xl6k��}~��C7\{M{{�a�8���jj03w�(H��_�C��YY�aج�C���@0(��L_�k�n膮�$�N<Nj�%�W�?����W��썶�G��
M�u���B�iC�@0)�L��,����ѣ�����>�����@$I�T5"I��M%%%��Vtʸ�A\�N��B�ah��CA��xh�E����q�(�/^�~C���.�=K��L���.�4p�ƍBA�N����,;v��4����`MQ��˲����/|%���ӧ��ݻ��#��.;Vũ�'��7<n7���[�{�Ju8�aP�q{p_��n�&MӁ@��7�Tupvvz��7^w�{�>�o��LO�4pƴ�������꫆�~ᥗ���o���Ap���@�2tH��1c{�O��'$��ٻ��%_�45���i�ր�L�a������4}�����G[�n��5uҤ�۶s,{l�A��Ͳ���^�x�_�r���٬�BhE�a�M�\PT��'�(��w��x�n����_�|��i�dgO���%_}c�Z���*���r:A4M^��a��sw�9����o�~�����o�������>:c�Ą�ۅ��lVQ��t:,ESv�-%9�k�^�j͞��F�����ƹj�7#�K��٭R,�z�nd���cY����i����>
�o��!��k����A��}k׭gY6"I��۰��W����k��p4�	��w?qA;i�椄$U;��[�4�V��kv��I�;M�x��$	��(J��,�Dd'`@���$II���a����ON�0a��I�p8K������z*��L��y�����pW��ZC%Ijo�
-8Ig{{��ngY�eYU�irGQ0"�,�t��6�B��ŋs�>����+*!q3z��E���$���;:�����֮(������(�e��(��~��,s�a59�kn�#d��S�� �C��7�m6Ap7���f���}_i������x�b{@�eU���h?T]�Y�ŧ�b).-}��7�7/.��s��i�%���t�Gh�fg���Ý�ZZ[DQt�ܧRŁ$m&�+ߟ��PBA�՚����4-�2>��ɜ1U�p�蕚��t�\8������ƌ�$!~	'�7��i@Ӹ����E&2��p�=,�v��i�S�����X�j��i�����|p��'���yyW^~�a�qq>
�h��� ���B-��}�w*��fu8�����k�łOǩ�)˲���9ŝ@U�Ąܚ��p||�i�ݍ�H�,���f��#�H��C�~@4�'���j���P�r�HEQ�`w��r��)h���QO��IZ��֏�!����~�-^�'<Zɖ����"W4��)�]����|4�GC^�}���R��Ws�M�M �$�#��Ra!����G��{�]����0����
ѱT�{m�g���;����Q��<�,f�h!�c��T�iS�I��,����3���>�g��wEۈ�;�Zw^;|K��HqA�������dg�~fe��GTU흖���{��Ï�8��М�cF��E����#�B_�O1�.Bր����B�'I�Agӏ��!�E�~H4�ס��ֹ�8'G"�z��`���e�H$�qܩGU�h�:��UUM��c�I&��#'m�Q��Yt�D��SB|B��9"E��3�r9~f�����x<�r>�t�#�������$K��.!�Gg���z�	�"����A��mV����ᄁ�YB�n��&"�ą��~�f��"+���;k9����HXEC;g���"����!�4]ӵ�]�#�!���I��<�������G�\q�;��;h�>n"D�с������i"��Ǟ<활ٲ��n����C]�����LwQ�Uu��>�z�	�9�ڼ��}��	�'F�؀=�ljp;�B:�L�0D��8�ؠ�(\���|�vx'��q�ah�NӴ p�����
#v�o��a�8�[tDB�3A���c�w$I��ҊM@ڑO	<^aC�4���w���UF6�4Ͳ,�;��hk�f��i�~!� ���>����|�Hdʤ�so�!�����|�H�(�(G��q�s�P2,CQ4����<9)��hr��y��t�4M����CY��u��)���3�uu6���q��ܫOQ\芦i��UeY $ˮ���z�8�N��n]]��j�z<�H����WjO�e?��������4��iE�MAqJ]�u]g���)X�mim5t#%%�n�y![��eY�F�'Y�y�w�JH��6e�7�.{����i��e�Կ��,���0�����X�vmbB�a�߼�}�Ca��X���)�U���y�B��o�����[Z[{��ɲ����������\�#D�t0r8��o!�0���\.�U��,���`(�(J�)�a���5MK��CV��֠�(�$?��k�\9f�H���eYY����y�V���N��4
���v�UU-:x��x��_�߯�UW^a����p~AaZ�T����B!�ߟ���P^!�蝦j���(�PAa� �I�<ϗWTH�ԷOM��/|�_߾7�p}ѡC/�����߷O&N��������>��ʱ����9����xZZZjjk���>u-���@���j��<_QY��|RB���
¦��ں������$EQ,��ܼ��Wϻ�n��ݨol�g���rI��ff����~�Z�1�����Ӧ�{�mm�$UU�$$ī�ZS[�q���P -���8����z<�mm�� �v�ܵvÆ_?�����X�x�6d�t����(Јb��/[�2�v���#%e��u�׮���?�M��ڛo�X�z������/���/����Ow��][W�����GWz�|���%G��+*��|�{��[Z�~���ʪƦF��[�Ҭ������Cg?����M�M4M�dIuu�Ez�$�_�JUMͦ-[G����_���;uuu�YKk�gK�t�\��f���zn��G��4hWn�g_,9t�����sܢ�W��Z,���#[�m����0`�
� ��ƛ����l)���p�%/,xy��[�n�w�@rRҳ/.hll�8���f���UTVD"�Æ���׬[�a�fM�fe)��qܢ��/X�bE���+�'���K�
UU۹{���Y�5c��M������%%5��Cg��.��B����t8q�\�6�;���:�����[�Ϸb���333}���ϖ|�v�����{��[�juZZ��͛����55

��+����YD�%Z�h$.tgT�B�EQVQ��l���}�튪<����7�GJJ��?i��^|���a�矯ߴ����_��;7/wϞ8��_<��KKM��/�2q�$I4è�R[W׳G��I���2����?p���k�S{�2iҘ�#�v�
�얹�V�:t�x@�~۶�ܾs˲�G���OH������N��}�N��@�i��։�xՕWXE��ŋqU�7�~�Of�ѣ��x��7�0e�������'���UU�)�;�x��())ݲu[Ee�g�u��+�`���i����&�v��=&O?q��G���yyE���˟~�ϖ|��؈3�\>k��c/lپ�aY�Ф	�333~�Ѓ� Ȳ�Ќ,˟~��b�x��E�?	�B�0��JtE�<�t:�G~���ϽPRz��h����w�]_���O?�(��3��|֬�}�D�
��_�|͜9�f���?�m6MӢ���@��~�Bwz���z������y����@�)�,ˢh�4�v�A,��(9�eY��)�0x^�APVd]�M��}Ѥ��������{⏡Ph��I�~��@a��1��^�g�����ck[���4��(v��n�?��Ç�$�eYUU]����p,�0b��K��}���?�gŁYY���#8K;n��uC�d�0L�h�0@�EQVUU@�ɲ<j��g����={��̿p@(�����AS4�q�BLӤ)*
?�`A��9>>�&N"��:2MYQ(H!L�	�m6����\r�\�$绎Q!�4Ǎs��7]w���Ø�a"B\�b�ٜ�a���Q�Lp�^0
CQI�dY�-�-�x�ט ~��.��rlAA�C��M8�9}����ԟ���?�ym�K�|��g��lahz�M7R4��/����=4'g�5�a�l�AYYϾ0�=w�1!t���˯�aYW����q���V��̜���yy_}��cY��1|�}$¬�������ѣ�7�U������ֶ��}�s��"�\��f�\�b%˲q>�p��a/�����S���|O������i�6
S7t܍/�5jŪ�?�KM��vۖ��7l�l��j���c��^X����ݕ�����(���NJH���UU��ohmms9]džA������MM�}��+.��f�:�fpY1h�Wl8Ɨ��>�8o�^]�o��ڷ��_<�m��_�fΜʪ�g_�_V^1}�K/���?�������k���I�����HO=��=w��ۣ�ʢ�"-����
4�n�u�������xN���p�9�������y}�G��ۧ� �EUTVj���ﷵ�y<��5M����|.�w�(.-
C��ZE����>|��]���TZV�_r8�aTTV
��#%�����â(ff�˲���m��~���Dt�p�\�@-����o�׷OB|<�HyE}CC�����$@Ee�E����@m]]�=p�I��B���=>.�a���ڵ�7<t�x,7oMS�����=@�����u�WϞ�#ee� ��쉻�,���VTt09)��q[Eo�@��r���z��ں�H$20+�������GJrRb��C�w�4BÑH[[�n4E9�0��eY��H�ŗ^w��)�ɩ={���j���q�[ZZ EY�a�@ �hO���&mEq,�o����*Z��k��oEA4M���:�q�p�C�\6k��������C
J�LQ�0��tϫ����Z����cEe�ߟ������n7�/Ɋ��4��
�T�ahA�6�P^�h�=�����2���*n,�y�eM�TUï���*C��ɲ�{����aUUw�����%_
4���o�e��)I��כx�Aq��y)Z�w�x^�u�X�s!�&x�Y��mPG�P��c���������w��{�Cx��{��f+(*Z����{���p$�0<�麪���0�aw���9�kik�Z�n�[7���EB?q^9�ڼ�l�')�‡ִ)����86"IVpl�L�?;B("I�1K�H$�0�e��Ќ�Sq�^\����o�d@Q�0�P8��s����O���)(�"�
� v�B�0��Y�GiI��c׮�ƌ�1mj[{�i����ʱ�Dk�F_�u=xb5��6���N'v	��;<�7&�$h�f0r:�fL�C���h���X9ep��QE�4(�0
�4�h^�Bw&���G8�_>k��i��b�|�� �<�,����=(Z��C��)î������I>�y"8@̜>m��i���D��[�c;/F�y��j$���2�Ѷ�.���ѿ@M��v��Ӄ���r��A��<��M���'������/�h��$I�[U�P(@�"�'��F�5�aO1"#���w�E���o��k��u>h�����y�Lq������6;����ZI�=qa#I��B�*Z����8;��B�fH�'.t'
��A�� �AQ˲ggv���4?s��
A|��C?nB%-��˔�ga���~�� �;]�~USeE֎�D$���d��a��O�WN��:�� ���8?u�9���fȥ
A�&2y��i��<#�+]��h�Z��C?����$�癓��%��� Gq�!���Il������'�8A�WH�'.$,��l�8�ҩ�M>��2EQ8���}����v&��0M<I>2�4;�N]N��;���xLM��NƧ�]�'x�u��B&B��W
�0�I�h|�7�#±���:��ɲ����=� Ǯ�i�S�~��i�B���YQ(
�KbB�������gg�	�pڡ��(��FQ�0�H$��ʈ�(\f6�N�wy�����i�._RԌM�l���F"|��&
�B�b����#N���x�^?d:B�5b%pNy�b�eY�j#�7�4�(ʏ�������a��;�3>�ׇ��\�>�}Y�
�C��e�@�Wj��k�}��8�a"�@��O��EbY������6�wzrbB�������L��j�ǻ�Se���p@AhE�$�.A���;�3111���a������6�a���"�Ȏ]��������q,��̒,��q�]\Ҕ㸚����f�Ӊ�܃c�0��>,�bt >~	?>x��騫�.�����߳o��fomkk\N�e�i�~ �X�))-Ug���A�2G��4M�ek��ZZZ����;-���4����8���0����ąv�K��t�>l:�_����ZQ��R��!4L`���ߴeM��=Z� :k��[Z[kk�\N'��M�8�ص����aI�ˉ�?�(���!�eq���*�x;�f>�iZ�hi�S��ʋ���
�?����m6[rRr����Z�7XQQ���H�TyEeSs���uՏ�8a�i���G���1����*jl
iY��x�?\|0>>!6����]�a<d�ZY����]�ch�nljt:�ˑ>��9G�tCC�i��i�<_YU�秞�ݳ��ş,x�\��v�\N��������(�<���t:�-_�Ͳ���j��#��ry<n��4M{��W���Y�h�q���x��i�/x)���X�t�
�(j��p�w�����ݓ�����V��X,,�z�^�Î�n���xv�iZ�ַ�yw~����y�4M���z].~���r�].�kي����ju:�4M��t8�n�PU]�i���>��a����(��v���8\ӑ���tF_��l���u�*�q^/>Y,���v��6���r�\є�T��Ø��P8l�X|��nå�^X�R�C�P���x�Q��c�fY��v���y��w<Ϛu뫪����Lӌ�>ѵp8�>���y����t�����F+���ju��n���(����xX�����m��^��e|�z<n��!���l� 466~�be||�����q����0�^���ᐦ)��JR�4�������e��i��?���!������Y�V���t8:�x�VUW��$s�̢�l({��w��ML��X��:�e�^�s\}Cíw���_8I���‹MM��:��7ݹ>�	├^��0�!����O��������7�~�j�^w�U�Z� �v�߸�n��v�\�۵y��}.k��IƇ¡w�{?^y�l��%���B�4�]�aee吜��k��B��8���-_��8�Ǜ��R�ЈF��_�zMz��ӧN1Ms՚��
F:f�(Y�M��V���+������Ç~�tف�����+fϢ �p���k��cE�0"���Nj�N�������^ONL�zΜI��/ضcG�=&M�#�0�y{6o�ֿO��̌���S'����^�~�K��ݴeK��}�����k�'%&^3gΤ	�nݶcێ#G�p饻s�
��-�fU��*�r��W���V����y�����4mVkqIɷ�W�N�u�W,���¬^X���~����32�W�]7m��#G���(Z\.�#e�~�䤤뮾j�EcV�^s��pİ���j����e�55���'3��o�ikmklj�=sfz�4UU9�۰ys]}=@`��)�-��-�~���)II4M���=RRfL��ѻwyE��
D�eڔ�n�k��-6m�?�h����ͳfLw:�����P(��� ��]Qd�ͦ�r�ڥ$�Y̽��VbB��aCg͘�����q��Q���=藙���.�<<}\�@X[�W6��4��������0�1�G�Z�f��!����fh��K����^2nl�>}�4l�y��Z]�5�y~����/<(���<� T�Լ����
!��ڠ�Y�UUo�����n>t�}�w�?��������/���qk)�0L� d�ޢ�v���g�B7q�\���+��\�r�0¦��_^�;-mͺ�k֭۝��ҫ�f��K�{MӒ�332^����U�
��Y�~��5k?�b��}����3ֵ������큠���u��y���p:}^oAQы//�y=k׭�࣏y�gY����/�GJr�E�@�[�a[[����u7646��oO���X,EQ.���p�|{�x�����<thw^ަ-[{��=��^�����gK�جV�42_xi����4�5��k���_}�mJJ�(�n�K7��ӳG�W^{=?���/��UPT�ͷ�x^�4��X���t:����o��k�g
�22Md���{��-n��/*.Y��bI��y�3m���ϗ|�}��Ą7������7hF�>},������=����#G^}�͢C�^|y���Y�n�/�OH��lΠ���
|��Q7�b���4���z�UsL�����[^^>Ώ�7̣4ݠ�!r�5��$�.d�qB"h\��Gr�m��}��W�
-���~��7{���^\p��4�;� �g�yB���m</lݾ��;ooii9T|����[Z[
�`����[]]�i���WdyPV֔ɓfN���W_�����@0$�2�0<��6b�cA(���74�,�s���9�a4Ms���
8}�A�Bv��pqI}}CuMM8������>t���\�Ն�@���î�seZ�ԭ۶''%�9B��C������y7\w��Y�]�|Em]�o~��ߛ޻wzZ��&�5r��5C��̽�{�cӖ-�$�����0}����տ���7nٲw���&�JM�6e�4M�׷OZ��I'�9bǮ]�`���B��f��j�^4z�5s���d����f�k�HYy[[��>0i„8�/��������&
557����e��{�Ɋ��ب�jK[k�/�e���p8𽂞=z�N�5eҤ�Æ���g��aG�BC��s��Ο͝ۯ_�u�7��N�|�̫�\��,+M��k��r��)�E�@mmm0
G"v��q�~~�=��]���}����*�����#G�r�Mw�q����}�df��x�u�S.u�o�\�@0�j*ð��i��k��"�7֋��0Ll�t�a���3�n����oh��}�����rPFҤ,��C�����\��v�5�uq޸��4E���ӧ����o:����ݹy��}׭so�ׯ�-��K��N/�CH)�*������k_|y�h�z=�̌����~f�f��������V1';;��Q^QY[W�y��!�{��䤤����_< Z,���UU-��_P��[oM�(Z
��C��K4M+=r���i�4�P0���$I�:|��i��9d������p�gso�z�O=�2Ma$)-+?\\\QQ�q��z���A��� =.���~[\\���gN��v��/|�a��i�i�ڝ���0(+�@~~Aa��2��-���e֌�cnj�ǿ�����l�̏?����Ȩ�#ZZ[gϜ�_����iz׮ݍ��=R��v�%��>��GF�����P8�p��u=>.^�e�W��466%&$��99���f67���n_�t՚�S�L6�c����Ҳ���P(�0LUuucc@`��܊�
����x�>&B�)))qي���KJf
�[��P0Td��WÑHk[���8���7�� ���#G6l����~���j��6m��ͷ�l6ۥ�����WPP�b�޽z	<_QY�kw�r:�"�a8�N�ù}�6���p:�6������� �w�	W�"�H$$��;��b�@Ei��;-����P�ᛦf������1�ҡ�;v�$&&��a�,G"��ݷ��]���'))q銕���%%d�$�-q!8�>�;Dk[��
��ۛ��4r���+V��/��␜���󫫫/7n��]�`h������M[����y�M7���m��]��5]>lhcSӶ�;/5��8�(65���Ym��^D��m�&\r��-[�|���t�P]]����z����9⛥�4����.;\\�;�W[{�o�
���lv�>}t]�������|�歗^<���f��v��
!6d��)�W�]�i˖�d�e�n�M�_����{{=�0�7n�b�e�fJ����V�z�m�2C�t}c�[o�SV^>eҤ�99I���.]����+�(=r��w�++/�<q�#�|ލ�7�q�̙MMͫ׮����0 �v{�̌���Q#G�A���$��'&$,�����ʄ��+.�m�X��\u���g������q��an��������qq�Ə�(�/��
�_�>cF�ްi���ӧNY�v� �7�~':�}(���? o��-۶O�<i�����:���f����W_YU���ֳg�^���v�ع{�a�TVU..��˻�ꫦL� ��e+V����v��Q#FH�����6��gsoNNN���ٷ����S���,������\XT��T��HuuUi鑜�q��h�N@�a����h�����r�<�b��C�;���&MӉ	I��j�j�-�梃��1*z\���B�@pH�`�Ñ���L��!Cw��nݾcʤI�&\*�2�f{Ԑ>���6m���]�B<��[EkRB��u�*�Ms'I�j�F"�e!Ei�FQ������0�,+<�㞂,�*�"�|8�SPd�f|��ķLi�VU�j���)*��!�1 �y�o�Ũ�:�s�a��	"��ir<�dq�~����ʲ�q�,�VQԏu�T@HS�P�u���Dp'�e9�WUU�4<
BI�q\��@5U�X,�,s,���t��+G���*MӚ�ZDQ�$�a�cӱ����u�?/TU-I���(
�FG	�0M�c�p$�(��	·F"��2�4�(ʒIJ��(��4MM�X��u���@��}�� $ɲ�bQ��<���6�u]UU��X�U��(���DDQ�_YtMѱ-f��(��4�(�
�	��MUՕ�����t�c;���q�G��-�վb��VUU5��(�NJLJLLTU�d;?B��8�烡��t!����S{�ZERї8���o߾����hɣ�m@��3�~EQ�x�O�G��7^��p��γ�Ϙ�cK���E�x���N;�t�����p���N��e�u�n��ڝ��{�=��?cgz|C�L���}��%�Y��e����}' �j��켎�qt��N�W��w�c7ct��M��g66|!��a*����y/��}#��`�iZ���b'��w�4�a�uy���O�'����I��y���hu�8���9z��S��}�L��Xŧ�c�:�ї:�?vv�Ĭ�AI'<y���aO�W��y|�M��(EQ�~�@�� ���N���a�vX��p�\Bf��8��|����K�zIN<�ž�����g��iZ�bR0�W|�^��F��(��@���1����f�*#�d��椥ZA�(�ܰ:=���IQ��jg�P8q>C�R-$��.B?�������͈�1M����B�EQmmm�z�:�Bǝ�!���%'&;��H�	�� �|�q���8t��lj}^߹^0��߁"����J
>!M����H*�痮o�j� ��a��AD�CB?AD�CB?AD�CB?AD�CB?AD�CB?AD�CB?AD�CB?AD�CB?AD�CB?AD�CB?AD�CB?AD�CB?AD�CB?AD�CB?AD�CB?AD�CB?AD�CB?AD�CB?AD�CB?AD�CB?AD�CB?AD�CB?AD�CB?AD�CB?AD�CB?AD�CB?AD�CB?AD�CB?AD�CB?AD�CB?AD�CB?AD�CB?AD�CB?AD�CB?AD�CB?AD�CB?AD�CB?AD�CB?AD�CB?AD�CB?AD�CB?AD�CB?AD�CB?AD�CB?AD�CB?AD�CB?AD�CB?AD�CB?AD�CB?AD�CB?AD�CB?AD�CB?AD�CB?AD�CB?AD�CB?AD�CB?AD�CB?AD�CB?AD�CB?AD�CB?AD�CB?AD�CB?AD�CB?AD�CB?AD�CB?AD�CB?AD�CB?AD�����4U���!� ~B!UWu]g���6�QI�'��߆�8�a��C���O���O���O���O���O���O���O���O��0�z� �S��6/�O᧞#BH�'�����p���`��6&#��$�At!�u���ZU�Ӈ���
���A���SEB?ADG��Z�b陒b��O��(*U�V���6�O4G�a
M��m�$�Atd���i=23�¼B����:PJR��(?Qe��0�i���O�M�<�A�	!�i��O�M�DH:wAt�09�3�$�At�S(�^�C��¼�K�zm�ӡ(��(!�y��/���A�I��M�D�4
�uĤ(
�h���q4^��O�v��t����
��4
!�q<�и�	!�H�щ�YPEӴ���F䪟 ��h��X���������aJ��J���O�4�r�(ϲ,�q{����R�̈́#�`((�<���fw��qܡ�Û�o�_p`Ǯm�m�t�sEQ�ap@�(y��)
!DQ�0<�k������|�u#�� �k0�!2�h�K�4B������5�)�J�4�������,"I������*�-~С�CEEM~?�P����ֶV���n�{�M�t���+.-�8.��UV�#a��Z���ψ�@@�uI��**��0�0)������XV^�/�+A��
��~� ��9��Bm�A�eي��}��k��v��VU������p$��*MQ
M��P�����
EQB�PcS�a
���ᒒƦ&�����T[{{]}m(r:��P(�� �J�\r����B�/���_x�(	�(�R]]��wOm]][{���u���a?�!"�� �K'��#�(i�� EQ@�4L��wѨ�n�������!;k`N�`��fFJR��j5M$+JRbrrBRNvNKk���ϱl8�k������ �D"{�W5mЀ��U��p�eٖ֖�֖ީ������6��(m�m˶��6���I�O9|TZj���1t�q��H��	� ��B<Ƿ�Ñ���� 
R C�Y���K~�4TM�ݻ��i��j��i�"k�!d�n�eg
�HK�4�x��N`�f���K.��J[���X��\N爡��n���6M���(%9��(��\N��!C��UM���n���*I:����%��ڱ`!�4-!>!��{Oގ�]��GJ�a5�5[�oQT5�wzZj���Ҽ�{t]�XV�X���eE�i:>>>
��?����*��j�jeE� u<(Ø;���T�j��J�UPT���d�Z�+*�tM�9��v#��Ob|�U<���34MӴa����=����@7mڔ�����O4~� �BTY]�'##�/�e�(
�~ñl~a��)i� ��7��j��sΗ
�,���i��5]3M��8a8F<���"�x�ߟ�<([9���ŽwTMc��h��9>�g'�b��i��y�B�45]�X����@(�q�&ү� �.�B���,p4]�i��v��Dz<���B��e"!d"���/Y�V<��C�Y��Gr��,�S��m�����})ڻB(p����:z�OB?AD�:6�����)�)EQc�tŶ����˱�w��� ���>��(���H+t��b��)��O������p�\����4��6 v�O�R	�A�QR%I��n7M���8��?��B(+
R@V��G��z�d@$"qGB?AD�q�

�MM?��pE�8_<�����a����BC׭V���!�� �#�� )�=�N�hR�h�I��X,�5�$�At�hW����=6��sD��"�� �~����v�d4/AD�CB?AD�s��h;�OW 8ZG�ǝ��PtXD�Gڝ�zE��A�/9i��l4M���(�w|G��
�8g��aH�l+B�9Κ�y���0Bd�R�!��������}wT�);��}� ����ЏKo޲����OfFjϞ&B�L@ѼB���p
�p_T�|l�J|����&x�++//(,���
�׏�yQeY�ӏN��(��
�p
L|n0�7��?�(
!@�TIiiSs�i�<���7Z/m{ �}箴���8M�p�c�DE'�L�h�RUU�4Qc��ދ�Aą����;c��<���m��H$Rx���#��a�aX�V�]N�`�0S�аswCdY�(��r	<��*EQ�]E�e9���m8�BH��yy^�7ð;v��٣GfF�7K�����t:M�t�\A����j�ڜ����!E��i��u]g��t
��(�Uy�WB`�X~�ē{���T�����	�`!��?�}���)���ի��ngXVQ�E��V�ժi��bq�lV��gヘ[�^4f�ED��a����p�V+��xu�7Hqh�nhh�\��m��y��=s��P(�(��+gg����ع����|��,�3gL_��g�6n�ӦLF}���(Ξ9#��]������q����L����.ɲ {���l�=�EQt]߼u�K�~uϾ��^5G������f�ص{�����l�̚��8_\rR��kF���Y��*��Ϟ�e�vM�.���p8�L�n��:��c�(�����ҥ��VY���暕�W��w���H�(+/߼u[ϔ�	�/�/,���j�M���w��ܼQ#Fx<�޽
�KJkjk�ff^<n�i��}�eKK���3j��� -BA\p:6��&�i��[��;����
�\.�u�W�Y�2�ko�WQ����ߟ��z���9��9���_�T�Pq�,�"���N�gK�Z�f� ��I�t�aB���3��u�'�
�� 
9��H$�4]E�0���YQQ��wޭ����_�
�B��^VV~���cY�����Ͼ0����_�J(za��,��ۿ��E�<7�2/:|���^@}��_�l�Ν_��5����b��CQTqI��6m���'%&.��k5���-�r����7ߪoh��T��Hqz:�~���,_=���4İa��?��?põ����߾s���:$gİa������JM>tHfFƝ�����v ��=�[**�Ld���=����:8{�m���
=	3�jZ����^x���w���Ͻ������s���x���,���AYz�����{R{��j�q,�v�D�e�KjjjX�C�v�-���7�3,{��>��7\{���}23n�{�e�f665e���t��n�/,�߯�}w�u��w�Y�N׍��'�|�
�{�\{�Uw�q���\LQ�j9���7^]fF���Żss������o�>$�q�꺇��_~��3x�g����O�4q���/���M�_�(j���A�}�	�@��A�55EE^p9�}22fO�n��$I
�#�p8"E")h�0M��}���֎9"=�WCc�n�RDڹkWbB��o����	��5Mkkk۾c�Q�t]�����Holl�e���Y���'�N���io:D�e�,ˆaP4t]Q�P8B�a��$�����^��_~�MYY��M�S{����8�o�֭��<o�$Ɋ��#�P8����f[�b%�����?z�T� ����m^\���%_�\�z�Ν�gΘ<q"˲��-k�o��#C?��ӕ�W''%M�:%#�wqi��ݻ'M����r���B�ݖ��\QQy����+*�|��^��KJƌ�2M�y{�~��g7m�X���w��jc��e�S��3�{��˫�����M�4QQ��V�سG�%_s��a��94'g�Njw�������-]������qc�H�e���ݻ���y���_�LK�EcF��~]ӳ�9|x�����%_}!�����74��8(++-5uǮ]�._�2���U��Kv��Y��_<n�/�ޝ�W[W7y����tA�?|���ڼ�h�%��i�e�������?�GC�0�0�HDE��t]�Ț��<o膪�,��*�����p�6UU�8��!��������<���,+IMQ�%��ZUU<5� "I�]�u��!�т�<�G�~��ɲ��(4M�4�izt:�E�4�eTE����0^e�ap�OܩTUUA�5M�(��p���<�����)�ɤ�1A
\Xx߾}]7��B����Ḧ(��5k��3��t]]��H$�{�K��;����$STU�M�$�B�'��B!���38L‘EQ�$Cb#{8�ӱ~w���_Y��+!�d��P�u]?a1��0�P�486]�q�Q|���N$"Y,�ݹy7onhl���NNJ"q� �QסW�B�hL��-s�ffF"�h	�hȋ6yG�N��ۺ|2�L���D��Ht:��R���8z:q�ckw�MӺ�����"M=A\��?i3N~0l��\s��\2M��t�{B(����!�8C�TE�K�V�s��瘦i�&,�5���D�=CQ�q�l�!� �|���(�����lA�EQ~������8�%tEXtdate:create2013-09-27T12:32:17-04:00����%tEXtdate:modify2013-09-27T12:32:17-04:00��tEXtSoftwareShutterc��	IEND�B`�site-packages/sepolicy/help/ports_outbound.png000064400000147363147511334660015667 0ustar00�PNG


IHDR�j��8pgAMA���a cHRMz&�����u0�`:�p��Q<bKGD��������IDATx��wtW��	�"2"��� z�<IQދ�*UUWuw�T���3��}g���fg�̴-�e�%S��H�{z� A$��D���@����~�� 3�ō��}�Jo�����?����WV�u�i�V�@ |'�$)�L&�uuu��a�*���/�;������l��?�@ �.�x<>��������-Rmm��)S�LW�4�w��@ |�H�D<�ȑ#�`��(�q��%��*`�&6�
��:Y1C5C���@ ��c����I�O �wJ_ ��8]�d>�/|3$I�~�ԣ��2�%IQ�F;�m�ch^FK���׿~�:C��@ ����}S��7@�$R��(�a`��p�$SI"�`b���
n���p�ӁUUH$�ē	<.�$�Mk�$I�Ap �a�H���f�a��XH�n��#6;��a.��k]+��H��z�(�i2@�%�.�~-�-�,I0�~gڐ+1�W����p�O�md��ch��ı5�����?z��?X�UU0��c*��ӏ�}��{��&`�&�Oֱc�A��gq�\�����˲�aH�����1��P��P8���7����BiQ!O>��	�c�
�I0�7o�ϲ�n������>^~�qdIB��;K_)Ӏ��$��_lbˮ����g�?kMZ���ޤ���=�4�-�S����:�`f�}�&}i��4�k��a|�yA�.�vt�1��
��g�
�)),��9æ�E��,s�t;����x"�ͦ��q9����cSUl6�$�I�Ri�g�3o�4LΝo�0
�������BGw���D"���쵋
��֌�n� 7Y��u�P8���Kei	o�X��=���(����h,FQ~�a0��J�7���|$ �����UU�
���b$I�|k���׋iJDc1�z�ذcc�T�i�nη�Q���n��u��r:)*ȣ���TJ#�L�y��	���M�`��������D2�,����`�ZinkG��;2�GN֡�y9&��b�\� \s$I������7q��Y���=�xt�bJ
.��M��_ �&��A~N�y�>q�0�5u2�$�$�~�d���	�<nlf���o�Zؼke�E����Xd��EE,�9�_l䧯����G9�t��'�e�^͙Ŏ�y⾥���Kˬ=v���/O$(/)f��}lٳ�b�a���;�d	ӄ#'���飪���W������I��G��F�蹧��*���!/>���41��c�G����s��[�6i�$�:�v��B�Ko_ı��tv��(��=��-���x��P8ƒ��$�L�i�^v�}}<��}t��p������eLy)�|���*�J��,Gq8���@p�b�&�������ǟ���{�a������d�����&�e%����ᓧ8x�8;����A"=m�t:�X,جV2�Y�ROw�
4]�~^z��x\.�_�9�=�`~YQ
�dތi�om�͏?�eK�9e_�fL��z<<��C��߿ˢ�3Y8{?{�-�:��z7�08y�,Z����ST�G*��o����'q:�<�*��u=��`���1����>Y��܉,K\hmӤ�����s��I�<�i̘4�Esg�oo���Vj*�Ii�'�ؾ���s�9̞6�_���m�<v��b��;�� /���KO>Bei	�`^�[	�66~��2}.F��7@�$�[�?{���BƎ�`�]�˜����d��XU�݆a�ש�[�Y�ȲD4�l����*N�Y�Q�X"A�t��a��d�h�a�
��q��Ìs��N�L��n���lVz��hi� �L�v9�5����b:m��膎���ys���/�٬,�m!6��Id�
�E�gR��KEI	���뎟~���
�4MC�eTU�i�c�Z9��Ƒ�ST����@�uTU�/����d*��(��@ue99~_�A��g�A;��VC�$��ճ}�����=���a��I����W�)A_0��]{	�#8�v��6�+����'�L�w>�0j*�y��	��lر�ͻ��tؙ5e2��b���`UU,��CEI1�&�gݖ��l6
rs8r��x�?}�y�ض���N�h����\�ߏ,I$S�����C����-�hX�j�Xr�B"��N�����L�â93�|�6���{�\Ĥqլ�l#�2qlu�sa�&�UQ�۬L�0�0p8�x�.<n7�ttu��&�9����x��	|�fЖ`�I|�m'
M����0��w!{&��t�q�m|�쬃h��*P�������A㾼@��@�ꆴs�Ns�ԩ#���c�&�T�x<��j�iO�4�dJL�����*�dj�'��jW,�DUR)
E� �>_U0M�Y&c/I`�Z�'� �mpk�i�$�)TUL�F�h���jE�eR������UM��c�����:�x�mrs�����;}ofv["@J�1M���H����X�VK:/�$c��$S�bA�u��4�)M�B[;�~��W�z�I�j�#$��ΆJRKaU��N��F�$dIBe����9~�����)`�Z�[����0YF�qm�&v�uT9�ib���z��,��QBe������`�YGT|�4Q��3�y���i��$I��&Z;:yp�]Y?2���Q`Iߋ�f;$��O��f;�i�y�!j���0�NGeY)�������<C��+7��V�4M����~��e���Mb���hu�L�pEiG�m������J��ǿ*�$I��6��^*�J�8��W]�J���4f͠���j��0p6ĸ�u�7C��0M���"*KG*�������qg=���H ���GT4�@0�+�:�6�x�R \��~�@ n�|,�X��� ;�n$D'���L[oŻn<��������%\tC'ǟ��b!�^��\Y�q9]��.B!b��U���(�<>LL����%��$$�N'���$SI��~SuZ2}�^�����`(J8��S����s�uC�$���imoe�����E{#"�G�=�=���PXPHQa��˳�T���6������%�bb"e������X�;ߺ{5�K���j#	���GYi���}�_��MӤ���p$���m��F���Z�8lS��
��$�FI$x�^��_��$	��Ź�s��w��lU��NOO)-���$�IN�h�݆�躎�n#��t:��X,�t�x��yUӻ�Nt]G�5zzz(.,�aw\6j؍N�]����x�wv�?�lKeޔ]q�w�L����FU
u�y-�,IRv�l�`�(����ÝQ<��m�a���౫��0B���瘘��yS+~Y�E)�a0M�0���OŢ|��X%���L+�J"����y�T�,�dV�U���-��I���ü����B��H$L$:��Ni�cȲ�ۋ��"B�n8n�?t�ȳi��tI�f��&���
�q�]Y�0n�wz)Ėg���@8D,EQ�8E�b�:�D�`(���ꥨ�*~I��!0�fo73�`�7�0Y#"Q�nI.�6�$)m6���W\6�AC3�*(YJO�KH��dJ����:�M�TFf�d2E*��n��9��Hd
�2߯*��*��%%�nV՚>f2l�_��Ӓ�F�`��d�V[�y�W�y
�@�5��0E�4���l.t=��@`�t���=�qI�/I�����ǔɓ().���#Iu��Xd��%tttPZR�O�N�,����)-)[gnb2��C�+	9t�,I�==l߱M�q�\̚9����+��������6eʷ�e�H+6i�<o۱]�0�	��e����NQQ!E��W|-�4���$	�Î�f�"�#��E����0q:�ج6���#֦�t��[�X|��ht�R���r�L�g�K�ę�:ė[ $Ə��ɓ�q}�4��V
��lٲ���	���:��E{$�>躁�`�Q����*�U��9��R^��пR�_�%�$�'O�o��-[�o��Y���T���kq��1"�G�#�Je��d�=�6r��I�BWW7u�Ne��u2��(��dž��tuus��yر��t��\����/�H[[�ظy.���>��cb�����(:�'
gϱu�v4MϦɤ�|�4������̱�i��3T|9��湭�/�o�f�b�-�kl���96m�ʹƦ�y����������X�����a�X�-2�����KN I�p���>d���bAQ�tZY�n�S^Z��f���'[F�DY�����awd����4-��?t�?���V+�O�`Ͼ�X�*��N���:�%�Lr��Q����hŧ���!Ig9��8jY���Z�z\OdY�4
$Y%�
�%;�)�`�bI$ٖ�M��z�s�a��i9v����	�B�κ
I��X�{�,&��c�n���@�{�.�2��3��I6���ɺS���r��~�f���)+)����+W�x��顸��H4ʮ�{him�����-dێ�ttt`¸q�s�]:t�#ǎ��z��ޥ�'�
�$I�ڻ�Uk���䏙1}�����5O<��w�W���O���PQQ���8�$q���]{v�J��3k6'����:jFUU��t^d09Yw��v��*K��V�YC{{����s�lܲ�h4Jh`����3m����ɹ�F�JKYr�=���K�4h\�6^����{�L�6�Ӊ�����`��J��x��[�q�����r�}==�lپ�h,��y�2yҰ����(�RZ�ʦ��67���K"��������Z����),(DQ���q�����
����`��p:��P�T*E__���(�B2��"�gr9ج6zz{0��ߏ,����1*/+�j�ݠ����y����x�mt]�j��;:��褸���KY)�-\��X��JN������72]��lݾ�`0��	�:e2k֮��w}76�M�-�(���f'�J�7PNk[�9qb�9��H$
�6;��~F�H�DWWmm��sם$I���w���U�Ί�V����ɺSL�:��.�~��l�4s��Q��Z��v�M���nglM5sf�d߁��@i)�&Md��	tuus��	6l�L���̞5��'N�u�v�?��(̛3��[�q��y>]�����	n��߷
�,������/y�����_����B�Ξs��#�W<;�6M���|�:��8A4����>���CeE�|�{�`�ʕTVV�r:y�Ï@��A>Z���G,�ի1M��*fΘ���9P[ˡ#G(.*����/6l�ܹF֬]GEy9�嘘��f��Q�hy����;��[�ǹ�&���T ��aǮ݌��f���9v��v�p�3�N��v�:� K2���Z��[�e�� ]]]aQ�18�n79����0
���\:::���T=�=�Xd^���KNNn���O*�"7'-�q����n����~Ξ=�n�����r�8�|���

��Ɨ�F�uR�ԗ3.��`������QQVFMU3gL���RY^N2���Cx<
�9}:e��4�=ǩ�zV|��p8̔ɓٰi��s��W�ks���t�����K�b!e����!aT�LYH$bxܾ+Z����?~���-��|-��<t]��|̜>������p���B�ϝ���9{��d2�]�e�Ʀ&��ʘ9}:5�Uh�F}�i��8I$A�ersr�,�����E�0M��6u*��gҤ��mlDQ,���f҄��l6����x���v��K4v�1��0M���|~��?������x�ʊ
~��?%??ﲣ�����T����K/���/�4ټu+�h�E��p�|d���Zv;��e�y�����G�$���hkk��������(��Os��I"�(�P���ؚ��T����9,]|G���ѣ$��C>՟���
~�����O��1�C�]
#�1����i�v; �p�|J��ټum�m�;�?X?4MCQEA�$�.'��~��F�?Y�QUQ���!�H�q����r��n���x�^�����z�~?��������'�vh�i��/�h��t�L&q��X�V,��9ٵ�Q��;M�i�~�	�9}~���3_�9J2�D�drss�y��TW����$I��qZ�ژ7g6��Ρ ?�Ʀ&\ίx�ɤ���PX�6<n/�0��a�*���z:���X��+�3B�K��q>�S�?Ə��
��%Ξ����p8̩ӧٱk7%��x<n���9v�flM
.���g���ڊiTVTP?xN���45���ѣL�<	�߇i��\.�;�ɺ:t]G�e&�G�C�߸�c�O0y�DL�F�ib�X����3�;u���d��C`Ơ������ӟ�	��_i�m�#�Dwo{��g��	�N����a��M�۰Y�X�`>�D��7�~�f


(((������^JKK����;��Eiim���cL�8�כ�6���6�pu���e���;p����k�{���t,{�͎�nO�,Ѝ��~�~��rQ�_��iS�ٽ���23u�d,����a��É��I����+��Mk[+���F4�V��p���n��pv����(���'�,�D�x<�Fz-������*�+��ɡ��7k'�;�e���dY���UQQ,
�x����*~=�g�_�Zh�΅��^��׏n�HH����}�i:~�ˠ���2���������flM
��}�w-\/$IBB"�LRXP�����8�)�S�J�,f�-))��pd'�Ib���#??���|L�DUT�%��v:y�	C�ڳ�D"AYI	K߃�秽����J���?wv���u�0��'`�&�-�T��3e�$ƏGwO/����B<n�.@�$ZZZ�6u*���4����7,�����S�4651s�t�̞�����1M����J�v�%���,Fo_Z�=�0
"�(a����oc��T�W���I2�d�=w3����B��[p��ܻd1����r��?g皚hko���Q5�M�ini��������(+�f�a��������c�qۂ���V�4]#
���0�Ft�e�CF�������������	8u��X���"z{z9|�(�����ϗ�)I�p�������p��o__�X,���[��D"X,���p8X�VTU�4M:::�F��m�����VB�N��֖V���p8LWg�D���v��0�dUU���Eӵ�UU��٬V��$n��m�Oe
PXX�aXU+�ex}�������E��E�d���())����ܜ>����|�Ι��@�=.Z���5T��w�i������Jp���m���JuU
�`��������<zI�Dww7�;�cΝ;��@ΰ�i�X��rUU�LCo��6��?%77=��u�v��'�c�v��-M���a��j,3�:����I���3��т�$I�r��&O�<ⷌ����н�cC��뺞=�KYiC���L�L����"��C�nzԮ�W2��B���T�ȳE�ц��x;��-z�{Zo.v
d�-�w��Ep9���Ֆ�ڋF�X,lv��D01q���)�̚��j%���'�x<(�B8FQlv�0��.?@4���`��q8D"T���j%���*�E&��v��-2�D2m<� ���7b�'S�3Q��>���^��ϔ���*�^-+�Œ��p%�:�L��:�y���\$I����3
�,�����N�5�����En'?��2KR�$QWWwi����ҽ]��B���p�v;����r�2�k�6�_����3�3�.N+�9��
�R���_�	���gd_�ڒ�v܃4��+���)��~Y��NzŢ`�&^o�[Wơ�op���u�q��x�<R����Ϧ��^ddy������Bd��p����>�>�<|��r�c�&�e�+iK�C�q%��4�+�J��B,c������~�o��|e�+6�7M���b~��`�Z��agL�Ɣɓ�5��[�f��:":Z�Z�:�%��ev�ِ��_���n$�~����#�2��а��7+�%'����0P���"��������*Ǣ(��C_K[˲<"JWf;�@�mq��t��W��VlC��\.�x�X<��f%��۳����D2��n'�c��Fz��j�7t�b! 'Ȏ�o�����N�P���$�g�&O���	�-o��|��EQ�t
�r�͌2�Ym٭uW��:��!��t���wI����1$d���2�fz�j% �26��W�0��uCn2���:�R2It]�T�
Y��u�Nj�i�n6��ҒRB����ͳ$I�x‘���$K�q�B<lR)�"�ǰ(2Z*��s��K�&I�H&\�W�c��p��7�����7-���n�����`�
�[p!������'/7oT����<��yՍK��np���5�8h�,��ܦ;mx��/�EL��(����?��UJQQW�P�M3��
$I�$ϙ�t��f�+7���{nv�5}4,#4B�t�ZpY�6;>�/k(z36��2�7����+|��jÆd2Igg�����x?�T�‚+�Q/���UQ��$�H$P��b��O�$	�:چy����U񛦉�f��v	���l���M��G��u��
��q�,��/}q��I���ƅ�8ф�ϩP��a�K��&�$�e!�@ �\u����SB���A_wY���s����i��8��3\2�&��8�}=$�I��M����JN 7@�R��1�@ �NV�I*�JG��d�ٞ���5�g�JݓfFq���,��1�����t+Z�G��z5�f�:e��1�\6�p�bH�R:HJ{G��y�~�o1��吝������#��X�h4���6�^�0M,��2�i��`UC�%R�Ԉc�	R�ɳ,�h�F*�ʆ���vC�!I)MCK�P%rw42��*� �����4Q�	��1u��{�	���l�sם47�����]��܇~�%C2�M�&l�w���V�j;ϼ4�iS'r���`��iԖ8Y2�B���bd�N2�D�5

�ei�ҏ$AAA!��$�	��$��8�����t����˗�������y�QUu������P_�����G�͞�c�<�,�=~�7�~���=&�?�w�|Y�9U_���̟;�ήn&M���+>�?����Gz�ә=w�a�b�J�K�X0^��N�b��-��a�w�q;ӧN�v2yG"���<t������-:����_���(�=����b���`(��Z�����p���ū���=w����q�\tww3�����餿��d*E("77��|M78u��T8�{˘;g*��Y�ev��|Kg��,�W���Zd��
�@�4LSg4�}���a� 9��l����G~���~�^g�q����T�u{(--AQ���p�ǃ�n'
��u=`&'@*�"���$	M�hmkcͺ/�=k�y��|-

g�Ec�Nww�������{����AݩzZ[ېdUUٳw
g��}˲���`���^?��p�]��tvuq�t>o��,˴�����ⶅ�6u
g��ֻ��q����#��D(��	Z�ډ�����v�PU�`(��(8byM ��Ȏ�%I"����A�~?S&M$�r��1l6�,Sw�����$SI;��U�?GQ,��n�~�	~�λ��QX��3O=�,���,Kz�0Q�v�v;��bAUD���J;�Đ�VB���R~*�]�e�$I�R���$��1cؽw�`��ك�K�f6m�BggO?�99>^�Yڷ<w�y;�7m��O���ͫ����o�>s�S�O��K/f��JKJ��lڲ��2]g������8r�HPY^β�K�pŧ���q8tttr��Q‘0=�}�صY�(-)eͺu���k����A>Z���E�&�ϛ��ߺS�8�]���IYi)u���=DgW}����������3X,2����2k�tfϜ�[���2c���>��@ �r�-JJ���f��p�(*�T�H$�`�̜>��c����O1s�t�n�Nh D^^���ֆ�i,�;�'{4����SS��7R��q�T=�!j�4s��AL�R^`G��a牿[�/��5-�
u�ihZj���4Y�t	O?�8
g��O?�9G�CB����?��+L�0��S�ظye�%��D~~.��%'�`�!N֝B7t=ʑ��[S��j�^�"�ܻd	
gϲ��ܻt	����Nv���c�<�+/���S�X�������~�#>��j#�L`�3�Ogڔ�<�����9��HQa!
��q�qv��ͤ	���ӦL6%�L&�rs���� �H�L�X8>/��<===�A@�a�3~�X:L��B!����T�@p�1̸���䞻�(+�:�:AJ��&S鑸,K�������� ?/�'{���<@��q�t:1�t��T�,I��dm�������in�/�ao�����95��~� �Bd��I�\�#���i�>�o�X���*�������#G�1~�XT���j��p���4=}LUQ�d2�����x<n�|�1�n߁�feʤIîe�&��<���!&���۲#f�ՊU�"�2�d�4�-��uzgB&��<h@+IiA�Ն����glM5g�-�o�����9U�qckhni�����O<���=8�N��j�0�aq�$ӦLf�޽�Z���*��	�æ�5���Z�˨�J^n.�l�jL%�|�j�.`��k9�9��$�($)=-�Dv:��^IA����Kxk�>j�q����|'��b]�9]g�w���τ�-Dƪ_������G
�J�0=�40M�X�r'�N��e����f�Xd��Ƽ9��t�j~��O>�(��hZ��@�����/SQNnn�0�Uu��̜>�4���X,�����Oハ?F�X�3�eK���{��W��Ԡ�l��(
�epFLUTf͘A[[;MM��X�V-X�ǟ~Jkk���2y2��!PSUŲ�KY�rN��X,Ƣ��2e2{��G�;R���X�*��Z���2��Elظ�Gz�z�v�@p�v��iN�8������MnNNv_$!��p:8���X�V:;;���������AwO>���"��A\N'���~TZR:bD!I�D��� �D
�ˎ���WÖ��)+��7���M�L�c
g�	�P��d��2,x�D�S`U�L�8y�8�4�������Brrr�D���a����B��I��{p
<�`�&]]]�l6�^/��8|^o�����#�i��ݍ��FQU����u��B�v;�P��TE���t:����6����`���\��A���q�픖�b�Zimk�����I~^n�{�=wvu100��� ?��BWw7N���F�`~{z{	����aǮ]l߹���WyIϚ�໇$I���}��M�f��I4���[�2�d0#=Ґ$�� �,���3d\��g@%���ٽ{^����-�a�i8C~~>�����B$���tQR\2l�=��÷��1췋���|T\�_��ud��F�~�K-_d�wq��֯K�g�3��9�>�ߌlM�x��ש���e�
�>��"���
�/n�62��|�9�if]�|�֡m�i�|>|�Q�
��d��1AUU���G-;�i^�l�����_�v��K]c��;�J�]
����t��~fd'S)JJ��1}���(Wu.��_�ؙ���Z\\VD���`�Zy`ٲ�,�@ ���j��b�:b1�|%�dr���ACWE񛦉�fg``���M_�\ H&��\.����*s�F��,���u]L)
��ݑ��eE �*W}��ؒ'�"L,F �W]+��\ ���T��]V7�yq�@ �U��=Q�p��f�[-�nU�_ �!\V�o<c�9�PR�0o���	�,&3
ᙙN\6��초%�@ \.��c)���)����w�\z�nr�l�ZC��j�����j��vs�*��`T�.I�5��Uxv�
E��_�Z$�BS[+%H7��4���!�QXPx��#�[��?���,Xn2���\�)�C ��F[��;�ttu���/F��@ ��\b�/N��l\�	Ӕ�q�t��躉$�X�銢`�&���/�k�%�N4��sX��{abpYH�ưHe7
�$�*j�?�@ ׊QG��a���e�Z�RO$h���fê����$iX��L�UU�5K�L�q�4�x	��;UR��n�d�.�i�i�eԑv*���o��F�+�2�d\��@ �f���R�����f���i�?�0غ};��������7N�9�$IXd9��b�`�����(��M�X�z
��V����,gc���2�2C�Y,�l��V�<fW-��I<�a�a�Dx��oq�B3�$
����y�����e�1#۩�}��c�g�ĢQ�z�=�Ο��,�$S���@p�1b�/�n"Ki�o�ʲ́�Z>���{f9eelߵ�_��U���� I��^L�d``�H4�/~�+�-]�y������9�9JqQyy���IWW7�X���\�v;=���}>L�$
��x���#�a����جVz{{imk��ɓ�l��!�ϩ��p��΀�ittv�����!��D����f�����ɺ:R�$�E���6��$=��X,rsrH�R$�q|>�p8��G"��|^/yy��AGg'��klbޜ9��B�Y�R���@p�1�T��H�Ú�8x���s��!K99�;��9P{���M�x���>u*���1�χ$�44�e�ƍ���+/����
�PU���>�o���/<��i��<����/ɘ�J:::Y|�]̛3���5�6g�2k���H�"K�*�x�\�	�0�$	]�x�����VY�y��g�'>z��u�(..⾥KY��t��kӦN��tr��	^z�y�oڌ�fC�uj&/7��`?��%��γk�^\.'=�����T���H%S�y��Z0r�_JO����ݚ�UL���n��,��*VU%�H�dz�ޱh�9�f1q�^~�y�͙����s��JY�v��=w��_��ψ�c�ڳ�D"���'�t
���+/��=w�Iݩzv�݇�����,���a�i�*�u�ē�a��:�nZ�hZ�9�g������Oo?.���[��.��slٶ���-,�	�_��=�����A2��0t��d������]��Z9�p��p��w���OQR\����i~]ב��R��_�2 �[�Q�I
�C1�X,L�8��'Np��9"�{�����&�i�x�D"�a�jz4�
�t�/��
�L�D�-XdY��%	]ׇ�);#�r:HiFf}�0F�I��ۮ�H�0��z��]g�Z�XUEQ�y�u}��`��42�% c�O�I$�
?��Պ�(ج*)-��i�1x����|]G�$RZ
��_ ׎Q��)
��:B�������o�6��d"�3O?�qc)-)���f�����*>�x�E�����;�?�������G?�m�mlش��va��Y�d1�]�Y9�a�X,X�V$$,J���y�c�ӿ����N�L�<,�N����6�h
�DUT,��D�#�~�u�M[��i�%�ܾh!�H��>��Й=k&c*+9|��m�m̙501ʹ,�ł��b���߸�����A$9��0t�Cf�/��Bڹs�9q�ĬB�$���M~n����Q��R�A#�8~����\$I�?�\c#.���EQa!�h���.�rs0���^��(���c��]Dc1
�q��#��� y����q�8yy�t����с�� ����'�z�}6Kz[a_?n��h4��f�j���ߏ��!
�F1���Bv;�D"kܗ�ώ�NB�<7^�'���n7}��8TU����D"��a'7'��:�YCA
	�Ì(�@ ���$�������L�.'��5���t ;u>��̱/�</��=����̱���[�$2�x/�# I�x���O�I�J�_�,
�k�f���׼8/�7���|f�^���g����h}�@ ��d���~�0�0�YմJ����J�r�/V���}�c��CKw&`���K6UŪ�h���90Z����/��C������yu�@ �9YşQ�)MîH(���|VV���0,7�O|����O �%�J1Ȏ��6Ւ6x�(#��Is�C׮wV�#��׏I?��1�w������َ���9�M�4MJ��������,��0t4M���d��I1�/��K4����4M*+*��ͻ�y�*X�V&��p��!�u����3gΤ��r���P�K���(����l�����m+�@p�!��fA(~�
�P���QF7�Z��A0؟�)o�;p:��	+���h�j0$�L�$a�Zq��Ü&]ͼ��(����ޏFpp��i,#��#�:�8�N"��X4}.�;��m��X�Ǣ��n��x��<jZ�`0�u.�t:���H�a��1t�LW�ˍ���d
��*�X,��+>$�aQ,x�>n[tc*�\VX<���$�U�x���}o���5��a�~�:ZZ�QU+N��9��2y��*��9y�$yyy}��h���m[(((d����ns�Tww[�o����4q2w�y7�����C�lv�`ꔩ�;�5kW������ɱ�G�Y���?8]]]|��Ȳ��b����;o�����˞�짩���&a�����|�P�d2��(Ȳ�]w�Ô�S���|�س���w�wQPXĦM�9u�$e�ettv������nwp���D"��۳w!jj�QR\r��O��d��T2�d��L�:��{vr�TU%�$�TV�hkoC�u�
�	
ٻo7�E�L�4���2z�z	��|>����#::ډF#x�>�r����L����i"��`8�1���ן������=��5�e$I"�JQVZ�=�,Al6;��}���a�͋
��	$��btwwaF�,,("�JPTTLWw�$�H&q��|~��/���������ƒ<�i�lܸ�ƦF�^�:R������a�^����>�=��i��Cnnv���|�����ld��qL?��p���J��I&S����t���bh���j�����Bgg�H���|<�~}��ܬo�p�QgH$��p�p-u�uX�VL�dޜ�:\��������D���f�N�E�by@�crɘg�6	��������u'����n�s��Q�;ڑ$	��Caa!�X���N�n7�H�C�k��l��1.���`?'�N�r:��˧'7�C�2i�̝3��%��:c��#���7r��!*+�0�f�����Ɩ-��e���g`�ه�+�4ijj$��t���rr0fL����3<��8���2B�F���innfٽ�
ɍD2�b������f#'�Î�ۇՑ	�'�o�^


���	��q{���C�e�~?����V�n99�\h���õ<��S���p�P-c�Tq��!���	G�L�<��˾�{��lv�.Y&��5�^z</�̙=�����W2e�&N��g+?��z���ٳ�2~�8z{{���f���3��zߛ�;�%�OMp:�0y�*�+ٳw7�e�<p�CtvvP{� �ܽ��˧+WPQ^AaA!55c�6u:��[Maa!wߵ��[6�o�^�`�)̜1MK��4�6o����#I	��8SYEnN��7r��
g�����[RR�$ɸ\.t-=��Y��2�˾���q�]���Q_
UM�N�eI�,3s�LZZ.���>|����,���Wb�/�VGܭ͸\.��n���hl<G$a�=K���&�0M2#WS=��G��x���6JK��z��\.|�a�O�q�P-n��D"AqQ1�Ν���[(�k�e�ɴ��3�H����v���p��$	<n]]�Ȳ�,[Pcݷ���p8(((����RS��i0�f��-�2�i���Á�(���
�p���iƮN��Z���CA.4�'���C$������f�A9��`�P?�����=��a�n��~D��Ѷ�5�=Cww7�@�`{���ib����/dIF35R�gϝ��v��*��J{�T����g\1�Z���46�c 4���&��^�����j�/���&��2o�||>?}�}��$I8���2ق���R)���&99�_ޠ���p�\�?�����*Kg̢���d2I{{��ݸ=\N�H��Ŵi��	��kʨVP�,�������cv����H$8t�����L�8	�?�͚6�p�݌7���s�i8#���F��$	�ߏ�a�2�i�v��z��}>?��.����S��5s6�L�0���^N֝`�ix�^��ߋ��c���X�`�D�]�w��x���g��xؿ/~�ysP5��`(Hww�~:���:��i�F9U_��m[�m���</�X�m۶�e�&N�9�j��x9t���;�q���V+^�UU	rP,l6;~����J�KJ9r��U��tp��4��E,[���tt�g�*9���c�Q�He0�y~^���8y���a�h������I�[3�d2���&p�!�r�v?�X�;n���s瓛�Ggg;�������1�,^�}A�.���a�XFXH뺎a(�A|4MC��l:�0�uY��X������q4M1��4
I��[�`0�i�5�i��:͔ؑoI�P�0�t
Ţd˱�k�z��b��Hc����ZG��������3���ih��jUQgEu]fj��Ȳ%=��?�����ْ-/���d���ғaY��T*��_�{e�lC�i��ke�醎���ed��/u@(�2{�\�N����Ǩ=|��}�ӕ=_�4b���b�����5������6�>�/IR����LC���`>C�@�m1���h�����K���|˲�U�^$Ga����H����r.���ʨ,�X�V�֋˗eԲ���K��t2�����6�r��:"IR�ZY���<2�����gp�\�,SY1&��ˍ�j�;<�?�?���#���Op�s);�[���|���ӡ��
���/��\{@L�nxLD�*�q����'�KJKˆ����C���%���p8��2SL��$�L�����y�
��k�b�����F$CF��=���*���l�mvQ>7,����DܰHH�G�.��[	!���$�L����ڔ@ �m2�^�G��B(~�@ n!,˟Y�_���p9]ك=���:D,#/7�Y�4�;΁��`��7�s�M?y�Ӆ�9����#����k���%'�Q˥d&�IN�����s���f������s�:::��˻���df�;u�s�M��J�8v��`>{�r��h�Ʃ�z�8Hk[99l6���?p��u�P��k�?�@�!ZZZ�z��m64M�D]�O�L~?�d��ud��q���ω�'9�p��E���e��H _�h4ʹ�s#�i�lش�w?��޾~-����X��Z��$�7n$??�‚V�^êի��bT���72�ҕ�,,�G�u~���C!9BGG'S�L�TF�)IMMM���{l޺������8��_���P{�0�--̘6m���K�him���>r��|�h4��m��ٵg/�X�I'���X<Φ-[hko�؉4661}�4�N�b����y�VƏ�@8̯~��W�3��k����¹�F�;δ�Sؼu��o ��m�
p�\lݺ�+�9T���F�]RfZ���?�+�$��QU����o��;�ֶ�l�(C2�d����q�\��4����\�����ܹF*�˯wv׈K*~I��S����A̟G8�w�噧��G&�p�����ǟ~��=�=w�E~^ި��+�y��a��P�����ؚV�Y˴iSq_TxG�)I>����s��	�͝���c�ڵ���O�7�OW�b������Hf*���w�'�J�z��;q8L�8���2ΞCQfL�6�UEa�I,�?����۸�ۨ��`��y̙5��[�Q\T̉�'�X,_���

�ܵ����ɂ��X�q#�y�߸���]��'��?��T=��g����<U_OMu�����;�`�9̜6mT���;�l�jrss9q�$S�L&wמ���7�����χ��gæ�|�a�D�0ٴe��������K�	W�d2��͛9p��CG���ih`�
�>}���<���ذi�e˪i��ڽ�=�����AE� ?����~���G���&�׮��#�޳�O>���|>/E���dv����_��]{�`�cƈ:r�Q����[��l!�$���~ݠ��Y�7v,�`Z	����z�Z��_���[����Je���R��OOo/�)_l�HWw7�x�+ef�X,��b3M���N*+ʑe�Ғ�N'�=�W,�@m-�<��C(�%� �����b��T��_�ӣ(
�h��[�1s�t�Ng����{�X,TW������򙛓��ba�g+Y��Z�[Z�'���dێ����s�b``����Je���l����c�n~��_���J���w�p�F;v�����MyY996m�́�Z�L�L^n�M�lش��@��{�����b��O$X�z
�D�X,�k��=u���obUU���y��oq��q6l�|ٲj6m������o��ɺS����	
��*�{�
N�:��5kQ,
�%%_֑@`D��69�rsr)/+�Un1.��3�
��p��:�`H�;�NR�S&O懯|���-c�/����ʴ�l̚9��}��@���>�NLj�h2�*3Y��/���.:��
4��J�.C}�́��~�}dY���Ӝ?��{��i����
�=����7�H��A��b����4�'}�,c�&��`՚�Y���t8�(�%�����G�$zzz�$��"�?�3�M��;'���Y��_A>�Xfa�yy��+����Ɵ��G������K�M�Q2�����9�f��WQQAQQ!�,������O�p�|��v&NH�����p�%9v���j`�&���-��#�}�N�V+O<�(�?���mtuwSQ^>��j�6B�EQ�5k&�-_���cێ���#������Z��ޥ,��n
X�`��S�n����1�[��ٳ�=�-�ůi

g9��@Kk+G���q3����V�fǮ�|�asf�b�رh�F<φ{��2���{�2�͙���`lu
�f�$�H�`�|\N��L�Rtuws��Q���8z�8���̙=�cǏ�e�v��c�^/U��_)�����	n[����r���	G"������ѣ$	t]�*��e�b1^}�u6l�̌i�hniE�4������?QYY��I__���������0e�d�
q�\̙=�Ғ�v;��̥�����ud����y�V�n�ζ;�z<x��o�UWU�:;a�y��a:��()*b���ԟ>CGg'���M�9~��eebSpM��d*E[{���(
ӧN%��m�N�n�AN @�刺?�-�ee�q����P���Agg'N�
��>u*���;v�u�và��MӲaz�u����tvvee<t���v;�/\��ԩQ;��.#����l�|s3�,���Ǵ�S�0nM.p��q�O��}�.����`0���inn��D�����gҤ�Y��+�y/�b��ukٺ}'L�����dN�2����lٶ��FwO�EEL�:��ʮ�{�d�g�z�D"����ee���1w�lf͜ɬ�3���������������kؾs'�`���x]�G��B�ۿ��Eo_���L�8�#ǎ��e���1e�+�'������;9�?�$>����n�~�=Z[[y�ᇘ:y2��W�ϯ#s``��{�QW_��n��''��ͫ�~��(??�B�(
v��uu��3g�,�_�����X������������##@�5%�L��󵜿p���>z�~fϚ������#�R)�|�Q���oȖ��}���b��66r��9rr�ן�D]ܾh�߶����?AgW�?c�T�ǘ<iN�#[G

�����<w��y�L�8���F��7nT;�w�����{�K/!?/��N�u}D�Q��Φ-[hj:��/�pţ��eF�Q,��m0��7�9�adײ�
��d�H$����f�]�|����t�����q:���4�_G��\�{��df�6o���]������.��>)��C!����{�E&�w�:q%eU�u���O�͝�=w�)��ߘ��.6n���n�G�z��C�y���y��_��X���������
�ېi��X��oU���n���ݣ�l�4�_G��\�{��d��(
��B��Z"I���~�:q�e�n��ۅ�|+|��@p��ň���~����0�����^VY_IY5���>v���8����G���Nz�%H��C���?G�Jʪ�n��)b�@ �B��@p!�@ �B��@p!�@ ��*�X,ƹ�Ƭ�SH;�hko�\c#�!~�C�9{�����}/�@ ����4M�o�����w�y�l���������7KgG�C�L&y�Ï�?�����[�^�{�"h����{C�;+��F$e��]7��|�0����>��'�wV_���%I����hk�2�ZqQ�{�E~��f:�(<��C��at]��"�E8~����ܶpK/fLeǎ���|�U�p���p8Lnn.S�L~��MӲutCgƴ�>���9q��UeƴiD�Q�44��i����u_���Ν��FOO��D�QJKKho��`�<z�z9P[�i̜>�����˖y�49}��==�cq&M�Hqq�0ǎ'�J1u�d�6ǎ� �H����X�`>K/���j��`(�;�O0���c�ܹ»�M��|r|>�0��p8(.,�b��	-�2��#\�
W�x,N<#
�L%ٵg/_l����D#��;�;�ϼ�sY�a�/[�=w�)��	��X��u�ƌ!�q���~�	^�-rrD"N�8IEE9��/ZD$!����O"���OV�*
v����d�Ŭ[��Á�㦷����N9ʜY3Y�r��̹d�7��?��4�y�lپ���Z��h,��nc�޽,]��_��7̟;��Jl�����i�:�H�H$Bh`�0���\�5�8�z�$Y%�����I���ã=Ĥ	سw/wܶ��Y΂y�0M(//��{���.e��ݤn��S�w�4�Z�<x�}�ٟ�1�mm�߸�d*���
/��<'�N���ɸ�^y�%y�!*+*x��(/O����۸oٽ������QVVJ[{;���J*���y��A*+���і$�E���?�cTU�
9����/<Ϗ���!�O������?�s˗3���Gz�)�&���>m*�f����f�^�G/�F��3>�����imk����X<NKk+�p�ֶ6rsr�Z������׏atww���w��I�G�%L��B�JK�q�ݜ=�HOo/6�
�T*E8����χ��#|]$�0���$ɔ��p����������p�q:���-�d�����TWa�f��J�dL�@�$tCg�ڵtuu3}�4.4�`�&���2���L'L��8=���cqJKJ8���3g�h�����0M$IB�%LL�_h���I���fe�~E������
��@[�k�e�3�kUU.g:[,��Vr�`-�A�?c�4N��ハW�h:���B�o��g�����@8̌i�Ĕ��bQ�;w�!?/�ӧ��v�ދaX,2�?_GcS�X�'���u.��$�I>[���ӧ9}�w�qw�y���u�vZ��y���z�D"Q�O��������TVT���OEyn����Nf͚IsK�E����r��	���	���i�V�56��x��'8{�o��>��Cx=^N�9Cm�a�O�f�	<x�}�|^�m�I]}=w�qƏ�����3g�t:���޳���<���yg�L���ؚj:D__�&N��#t
��h�s��FF�K����$I���fC�uR�T��i��*��D"�>�%);��&�i�H$PE��L&�Fc8�v����m�������L
�1�P���7��O>�����y�����$a�Z1�0��2��f~�eI��uEQ�4-{,�b�����ع��۶+󚦑L���sY������L�2�;n[���ˎΓ�$�ib��0MM�Q,�l{�H$PU5;���iUUE!�L��[�T���%��I�Pފ��ړ-��@p��$iXٳZ��FG�6+����b�!�NȲL ��aQY��M�[,��5��㙶wh��})W��e��6[�u$�����%IVo.�k����o.D�(��1m�T**�Q�>�u��t���Biq�5�޴�S�(�|��e��y��w���:#��;GN  ��+��0~��kv�+)�$Q]5�:?���0���B(~�@ n!����B(~�@ n!����b�U�i���q��0����$�d2I8I;�A��v]�YO*�"��v���Qu]'�`�Z���W,SӴl@p:8�Nt]���H[�Z,B��T
	�d��&�	�@0����x��8}��ap߽����زm�}�EE�Ȳ�S�?Ό��Fj���{�m�?��ߒ�����OW�����e�.�E�Xf]}=���_����$I�u��,��>[���Ç1��3g�����R_Y�9��IJ�K�ދ/���!��L�Y�JKq:�;;�L0�������z�����FYY騿�	���[S#BY���mV+˖.�����:֬]���	�#����g���r�x.)��ѣ�޻$	M׈�b���;����G?�>���|�W$3����/���D"Qv����>���lܼ����˓�=F2�������e���B��hl:��~�o,]r��'?/���6�;:(,,������K4���R\\$�G�9�� .\�n�SYQAOO/��~b�8��t���v�(�(g㖭�ܵ���UU��D�QLӤ��|T�ӑH���nb���Nz{{���%?/���b��X���730晧���j�0�n�&�፷���aڔ)x/�:(�n0B�+�ByY�`�\����f����߽�&�ee<������800��U�Yz�=>z�,���α�Y8>[�mc��YL�0��~E2�V+]]ݼ��;�����PYQ���cy����̟����Faۻo?�̞9�z?c��LcS�MM�ܵ���b:::y�����	0e�$�6;o��>c*+�����˙5s�ζ�"��o��
�aRTT����wo�A$�b����ͤ�8{�/�����i8{���w�v������!�2n�X^z�9TUv�#G����o3�����^���V��?���E8f�SO��U�����s��i"�({��tĨ���i��زm;E��B�G�/��]�ٰi/=����2�\&O�đ�Gi<��ӧI���'�}��ܶh!d����!�8��ys��֬��	��1a�8�}�L��˼�s�1m:]���ܽ�������}K�R]]��#G�2q��P(ěo����KuU�~Ƃo��@����y���1m�}��'M�������Ʀ&zz{��?�))Mc��̟7O��\3�����Ū�̙5���2v��ǝ��ΜY�8YW�_��?�T�i<�Ǎ#	�W�SEe��<��r��v>[��	���]���9zz{��?�)���k�Ξ;Ǹ�cy�io���ɓ$�q-Z���,'4����|��	IJ�Y�kl�~�g���������|#ZB�0ؾso���/��Ғ�t\hL�n9�������x���/�F���nh`��U��1*�ʸ�۹��{�e���^�e������Q��O*�����P(Ĕɓ�1m�T��`?����j�2k��}ɌR"�()-�I4E�uL�ȦQUY��R��c��ދ/0a�x~�ƛ��;�,˨��:8�j�X�F�S�d2E4�0�t�����y�H���0ab�8�if˹$IXdUU�'��	�$6�I�q��X�����8�X,kl������	�'�w�S�===���w���c˶�8Y�^�{��cæ͘�I ��g��ttv�t�b$I�a���St]�������x�g�����S���~���2����{���+d�������L&���<��#L�0�S�O���^�4a��TU�F�=|�%�܍���Dp�c�ۙ<i�X��XX�x1�����3{6�b�����~�@8��'�Q�ה�`�Uk>�?����zp:Ye�r���Ӂ�*TVT I�7��>B0�ͷ�AQfϚ���旿�5�0�i2~�8�rs9v�D��?��S8�}����@UU�?���Z��O�8�����ъ,]����y7+s\M
�߷���^����/P]-fL��H���d����H�4z{{�4
�Llj.(( ���	~�UU����/L�<��|`�`]�	��z=X,b�g�u�qckp�\D�Q�W 3�H��$�ix=��S*��n����AQt]'��xQU��W�����xp�\������O^^.��a�x������'�,�]����$
���KnNalvY&���z�ۦU��A_?�Dժ��߱l�bƏG��{gg���	v���Ǐ�~�F�{fx9
��z<��n�� 6�
�ݎa��"I>����nt}P��Fnn.�h��p��@�[�7]�]lܴq�_U
F��t:���4�'{�Ғ��NZ,����`�ԩ�Ҹ�N\W �f������A������\��+����v�����sss��M��&x<n�L��b�?���b�PR\Lɐ��>ߗ�r�����s�c�PY�����j̘��%��i�'#˹���C��2�,��]U���ep�ݸ���W�o�**��q�n�ɫ!Sp�1nl
n�Yk�������>B^n�eӉr.�����;GYi)e��� ��:`�Z�}Ѣ�L'ʹ�"���@ �B�/�-�P��@ �B�/�-�P��@ �B���M�$�H�Ҵa�4M#�Hd����i�?�00Ms�����i�LJ�����5�{�X"�q<�L�
�{F^"����[ �qq�u-�w��)����/�L�f�:|Y����{�7w���f�:��8S'O�����)>�l%��k�����g��nv��E�C�7��~��~�	�/\�����(+-��3

lڼ�x"�K�?ONN�=���j��L�<�g�~I�hko�OV������j�?�$v��u�׳g�~l6O>�(�'M�T}=��\E0b���<�X������0�����(��9�p�c'N��C���w�X�a���䞻�y�P�0Fʐ񰚟�'�f� ���g�&0cz:�Ͷ;X�p���4m�V��������0�rӧN����H4½K����ڳ���͔I����?x�m;v��K/q��1b��'M�����ͫ�clM
�.����E�����V"��,@�u~���l6X����v�a�f�~�	�`݆
�[�ko����'3o�>[�
��OEy��~��oHWw7?��_�r�XU�p8BWw7��N�F	�#��!:�낮�tww��q8�@;�p�T*��f����lظ�1���R)�Ng��n(Do_VUEQ"��P���~dI����TJ�nw I��>J,%68����C<�n��c�nb��O�)M#���;�X�P(�$˨C��aV�]���TWWa��G�Ǯ?}���kTT��v�PD܌�F&:߈7�(
ƏgǮ]�?x�)�&a�Z�(/����޾>TE��Z�x<x<���mm<����Ʀ-[7n,�p�X,�iBEy�x�>����y�A���PH�Zݵ{�D��I(�0t@e�̙tu�p��I���룽��?��O(--!��r�jZ[[�7g�f�`�q�>̖mۑe�e�.��tr�L��s�m_� CpcR{�0�����0(-.��C�4fϜI~^~�	�ӄ��N\S�`�ڵ>|�4�w�*���/�
��E4��C�d�4���>��?�+/������>D�u�
yv�Ӽ������l�����]�y�Ï�����^������r���8q�$�H�eK� I�%G�G��>�֝�_|UQx����c8�y�)>��S:;���l�=�H8�b�y���q8Yy�i�uɁ`�|��0~��~=�<��}�0�"T����l��p$�����j&N��M�}�N\.'ӧM#��{�������_��t=����|^&�G ��Nԝ��߽ƫ��NcSm���FWwk�X�G+>��j�F�z�dC7�-��;vR��XEIO���=߁p	�L�ᰓJ���M̜Y��8a/��sg�f�TVT����Yr��DcQTU��GT��d�矋��kJgW�?_Kaa!����k��D�F�<��3,Y|7�V�!4b�����~¸�5�>��/����Ϙ�
�n���D��q�B�x�Q��(?x�e\.�N�`��M#҇#fϚ���<ÂysYr�����u�����%�V
�;c�t�J>]����?��?��r�f�:B��f��?��O���E�q�"^|�Y�v�0y�$q��EL�0�?������ޯF�(k���t�b&L��������4y��o���ދ�gÜ�A�����ޏ�n'�P\T�SO<NeE���?�����WS��>JN �-[��}˘6e
�`����~�"�}�i�8Ȋ�+���8�v�40�	PT��gΐ���ɺST��SXXȩ���wo��.�L&�3s&��-�����r�����j��&�f��(
v�
�����xH`�t�‚JJJ8�܌ip��P�D"��J0�VWS���$I��v���0�xb#�d
�݁�������G�T��#>HnN�$��x�z=x��.n��T*E$�"�L�������O>!40��"�kz��b��L���;�%Ŝkj�j�RR\��������ގl����KN ���$Od��P�mI�p8�X,2���1.������G+V`�Zi�蠸����ϊ�V��s��E���;,[��I'p���Ue�̙��^&M��[�KIq1���<���ⳕ���~N0���\�ʰ`�<��#�8盛�>m�$��g+ٴe�}���7y�g��RV�^����B��W��	��_�������@x�ys�0{�,�_���������01��c�Z)).�>����Oikog򤉘�Iݩz^��[44�eٽK����kAQa�g͢���ݎ��D�eZ��x�����d�ܹ�����A?��r���0��jE�$LL�~F��Y�ܾh!+V����~o/�&��c�Z	l۾�ҒN�>C8�4M���p8�Rw�_v/���W|ʅ�fzz{y���|�d�����ܵ��?��;o��+W�D0M�1��,\0���ko�ɳ˟����z��[��1�,^B~^>�D�cǏ��t����9s����\c#��=���vuUy�������;�#�������)S���$I�����ى��gLeňu&�08U�S��2s�tE�LC��@�b�V�L���b��stvvQYQA�����^N�9���f���X�V��$��O�DWS�
�*�y�����y�JK��ɡ�����*+*8����+>�����Ɗ���3S__O<��jL%�X�����e�RQQΔI�I$�RQQAWW�iR��O��s������CMu5����^U������r���q8��Q�{=^�~��]����v�������p�T=����L%���GsK��Z���;�������".47g�'S)�44�*
�%%��?����y[SCgW��]�TW�2Xpm���b㦍#�@p��i�v������',�P���3

��o_����>��Ψlڲ�]{����%��ŦJ�w���fΘ�~Sp�zY8o�
��TR\,��-��P)��1a�8&�w��!� ?���|�zg㲈�sk F��@ �B�/�-�P��@ �B�/�-�P��@ �B�/�-Ĉ�|�d�O?[��Xrs,�I�;:X�qL�2��ܝ��/\KL�$�c�Y�G>�
O__[��`��{�ē��$�I��n����a�p�"nɭLj*�d������1w�,�L���(��IeE3�Og՚�9U_��.�E	�#����h8{6�0����a�&�a��,\/��8m��l߹�h,�a$�l�ST‰İ�6C�Ty6M�T��H���;�T�
�6�r�jR�6�̖�V~�����7Up�0r�.I��Iwo����T��*�UUx�^�m߁������w�-ʮݻ�|�:xv�rƏ�LWWsfϦ��d0�����晧���@��\;L�d���lڼ��p�`0D{G���(�J8�������?��Ph��3��}�F��O��c�50X��}�i�Š�V���IQQ�<� ��|N{G��q��,�H��`�����1,o��o��/������_xA8��1�Y�ܿ�^�
�8s����/��xZZ[�� �d�=I��j3e�dj��y�ᇙ6u
k֭��}̟GOo]��<��#��g�V�Q��������kX�t)/>�>��h$Jk[?x?c�ְu�vN��s��1.�wI����E�`y6M�+W��Մx�'���f�ڵ465Q���s�,g���̞5�Gz�E��$1{�L�njᙧ�����z?.�5fĈ_UU�.^@4��ù�&J���1m'L�_~����O՘1�;��[�ϋ�頤���E[[�g��Fh�c�O�0��6mي�iX���u�-B0D�4ƏG4AUULL�^/���wt����3	�ظy+�Uc(/+ŗ�����6`UUfϚŸ�5TWWs���b��j�UUE���p��~�N�ee�f�#(b��	c��WS��;����f�������w��(���zY���(��iS��a��$�I��߸����1{��:0��GQa!�@���}�כ+膁i��u���$I"?/��ӧ���gǮ���qLӤ���d2����y��y8�N6m�B(���:�]��m;vfg�
��ط?6mb�l۾#+������5H��G+V���Q<$�����o�٨3���6���0k�L��riim���S�q}�a��_p}����s;~|p���.�[�u�L[5_�p��-bт��rYpMq:���+��k��y�)j��y���q:L�8���<\N'�h���<z�>N'���*
&`��H�4�<+���祥���O?ɤ	p���h�B$I��p`�-(���edZ����/�L��(�S|�!���;��K�ϻ2�'�0�e��_p㒉)�_��D[N79�<�-���ظi��w�#���F��	�%�wQ��6��(��1s�tfL�&:����<�m��|�$I���3��,��]H�@ n!����B(~�@ n!����B(~�@ n!,˟Y�_���p9]�4�]�v�k�^N�������2E��?Ⱦh:�Ғ�Q���N��l޺��n���P�ή.�o��ɺ:r�Kƥ���q��֢(
yyyh�Ʊ�'�w�~���uٛ���d�ƍ����t:��J�u�vzz{)).&
�c�Nv��Gh DQQ�ec�_,s�|^��G�T�i�o�DcS���8�v��0��nc����m6rrr0��Ǐ�q����)�gKk+�7l�D]>���s���R2G{G�0�֢�:�yy�Z_����0_l؈�"��akQw�c*+I$�����;w���FAA���7L$	a�,��u�'���Â�-��p�W�b�J�c��9p�F9�xn�?�H�b�*���1
}�sKk+���_�w�~"�(�
s~��I���v_l�Ķ�;����׿�w�Ο���Y~�o����oĹ����'�p��I��wo�IGG'�0��m�>�\S�eo,�J�����?s��9{�@m-�����o�ʨ��Y���O�=�Z�+e�4c���|vvr��nؽoݶmTym���78z�8�|�)�x�7�z�������gϝc����{D�Q>]��OW�"�ڛ�������n^��b�+z����#d~�j5==�#�Qo�׮e�
��ֻ�R{����̽���_��o��P{�H6}c�y������}�0H&\hiA�-l۾�U��\�}��q�}�Z[[�7/U(�kL2���>�Bss6�i<�VnO�i�OV\�r�k�^6n�|�߻��y���D���	��GMdI�� ��&PSS��fc��uجV�}�irr��]�R)���`��8~�$&&���:��[jf��{F���:�$�xQ�ł���O��G���ud��}���3k���޾>�|����g��(-)ᙧ����Y>��e^�ϴ+Miؽ[��Q�IR:����n�a��hmk�������I^n.?���w��q����ݓf�A΁D`�I�(��(��+�J��7x���{/���޳}^�λ�&�V9PԊ�3)�L�ș�9M���T&!	�yJ�LOMMO�]�U���8p�0��lܰ���o�¥K<���\��h4�^�G�$<�u������r���Ie>���Ib�wt�9.\���ͷ�f�*^}m7'O�f��Ӿ�t����)�Xd�r�c=*�`�W_{���<T$E!!!���v���b-��8y�4/��W�Tp���PR<�����j�. +3�'O��Ir%qǖۯ��$7�$���/������k	G±㶢���7s��IΜ=Go_�ﺓ�g�248D(f�
�)*�RnSS3G��?�7���<|���m�dg���u:|���^|>UU��z����{�h4��SU����ㅗv����C<@A~�L�>��r��iu,�?���n^ٵ�������룶����V�x�i��?~NCc�&$ĺ��<DWw7&�����I.�zf��d�����)��$���d9t�dY�'��괼_�n?{������)v`���RR�Y�r��;:�x����&љ��ʼ^=e��:鳛�ӧ�4��l6�=Jum-V����~,f	6�,���Ek[;#�#���2gefd�F	�$�ݜ�p�����og���h4��aFG=�ʌD�45�n9L����(�;����/q��1|~��-�k�w��L `��ݬZ������N�>���۶nE#��.Ƒ�Qv����q'%]�{NOO'ٝ�Y������&6nXO~^m��=��������[o���R�X2���3�����x��;n, ?/����RSY�l)Z��C��`�YI��x����RnKk+�Ο;�ky��y{�>.^*g^YgΝc߁�=����,ZDvf�9�,\�`�Z��$����۝IJeKq%:gz�	7���o0x����?�SFFG���F��q�=�����T�88��tϾ��������q�]w�g�~�F#���D""�##�$Ns�

�αc|�����C��r��XE%			I�>����[{�R��HSs3u���s�'N�b�o�8{�<UU�9w���,����[˾>P�G������ǏO����ػ������{��4
?����}��;q�p$̨ǃ������Ԕd�&#]]ݱ�38�^�������F��_�%��?��M�446���h4\.�2�L�VKzZ===�|G�ܷ};�n��^�d��M;���}�=~��'O�?0@��
Ξ;��gy�坱{�������Ç	��$:��ѷ��׾��v�������Ȃ��HIN�������x�"rsr�$���46�r֯����p8|3��a���h����ye�$:q�\̟7�̌��2INv�t�bL&&����s��������t6�r����+�WT�l�b֭Y͢���C�ղh�V�ZINvY����O��'+3Wb"K/��p��nH9�M�IDAT�)G��������aQU�ܜl
��hhl";3���a�SS	����-��D'�U;q���FR��,\0��Ǐ��+ ���Y8����FL�X�X��a�X8��]��I.���(�RAk[�/^ĕ�HNv6C��$:�Ȳ�����-F�7�DP�� ���~�H$ʥ˗A��������>�;�����v��0�##�Y�$I�t�)�Lr���JŻ�=9T����x�v{����8NgW7��9EE$%%���;��H����o��wt���!��(gϟg~YYY��j��[UUz=���0
�ߟ�����e�+cúu\�\~�w4�NGVV&��zV._�����ݝ�DfF:_z�`�Z���wމ�륵�
EQ�h�tuws���$W=����H��;e0�F�A��q��)�V+�99?q��N����SQUŹ��)*(Ÿ́�.��DTUEQ4�NljS��٬X�f��8z�8��A��ol�d����<u�`(��]D"���}��J��1�L�9{������,Y����/���gY�V.�_�¥K��SQY/3љ�=!�p8‘�GY�z5��ď���O�iG������܂N��������%3#��ZN�>MzZwݱ���|���j̟����EU������4��v�n7�TU�����{������4��X���`�� ���z���p�:/ZD__?'N��b��D�٬ttv����Y�d	Z�I�HKK����y����&6��@Qa!�s�PZ\LZj*�yy�^����V>��q'����{�\~��^�27l��Z�k>�=w���K�'�	���JB��ںz�n��Ԕ
��hi�JWw[7�΂�����D�$*������;����LfF:
MM�ڶ�D�����؟��5�̻��F��EV���(';���~^���ݳ���|v���~�w߹�̌JK�)-)�ᰳt�"-X@aA�%���撔�b������#�>s�h4�C<���{�`��Z�V�����UR��Y�d1�\���l6�(
��9���(�iiܹ�L�k
��
�(*,���`^i)I�q;���p8L�@?ٙY��P��w݅����#�����Ս��h4�?��}�V��)"
���Ă��X�n�H���,\��$�"������'�e*�BYY)V��ֶvrsrHHH��]'� ���_|^�m�m����"UU	�FB�O=�,kV�����CW����#G��G�4i����+ 11�����D��L&dY�Dʼ�� >�EQ���)I�@���{�#��^~����V���ѣ��~�׫j�G�Б#?y�?����/|&�����|�_anQ�u�։c[�8z�z9p���'IR��`0�ͯ}�#���STĜ�8�������T�;��+�F����zdY�4�h4~���kb���5k>����`0bOH��
��i#�2N����Ǽ8��O‡j��gA(�M�|�����BUU�^/&��=Ƕ�q|�� |�z��� |�H��֚Ƕ�Ik�� �,"� � �""�� �,"� � �""�� �,2���kj@��s���D"�74���K�X҇�曶\�JGg��s����ZZ[��d2QZZ�F�ilj������l�23߳��}}tttPZR�N�b�[�kkIMI!��ī��x�^��΍�s�㡾����l"[� 7��(457�����}������<�̳:����g�%
q��~��3�_������[Z8q�'O���5Ͼ�^����	v��:'O�����D�a�=��{��G����TTV^���@���������
��r��1��?�%G��o�����ۿ�_��g(Jl�jEQx���_��u�.|6��a�8H���LWE�y�wO��r�m�D������u��>��'OM�l\[{;���3��>�)-�p8BGg'kV�.\���(t��b6��p�:�}�E�>ߴ޲v-�֬at�Cw�?�q�z��,�z�J.X@jj
�p��r��,Y��_��o�a�9�&T9��Q��Ί�n��?@�Jr�h��o���d�j��%��p���K�/3wN�H�w��e~���Y�l)۶l�����g����D�ܹ���QU]���(�n7˗-��	��KU�Ռz��-*�؉���~V�\A�����z����ٵ�u._.g��UdgeQW��p�ݬ]����**+�,Z��@ HuM
[binary PNG image data omitted - gnome-screenshot capture, 2013-08-30]

site-packages/sepolicy/help/lockdown_unconfined.txt

Disable Unconfined System Processes


By default, any system process started at boot that does not have SELinux policy defined for it runs as initrc_t or init_t.  These domains are unconfined by SELinux.  Other similar processes which do not have SELinux policy written for them also run unconfined.  Disabling the unconfined module moves you closer to what used to be called strict policy and locks your machine down more tightly.

Disabling the unconfined module will still leave certain unconfined domains running on your system, specifically the unconfined_t user.  If you do not want unconfined_t users on your system, you will need to remove them from the 'Login Mapping' and 'Users' screens.

Note that if you disable the unconfined module, you may see an increase in denials, and if you have processes running as initrc_t, you may need to write policy for them.
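The same lockdown can also be performed from a shell with semodule; this is only a sketch and assumes the module is shipped under the name 'unconfined', as in the reference policy:

    semodule -d unconfined    # disable the unconfined module
    semodule -e unconfined    # re-enable it later if needed
    semanage login -l         # review which logins still map to unconfined users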

site-packages/sepolicy/help/system_boot_mode.txt

SELinux systems can boot in three different modes.


* Enforcing mode (Default)
  - SELinux security policy is enforced.
* Permissive
  - SELinux prints warnings instead of enforcing.
* Disabled
  - No SELinux policy is loaded, SELinux does not run.

You can use this screen to change the enforcing mode.

Note that if you disable SELinux, you will need to reboot to turn it off.  Also, the next time you turn SELinux back on, a full system relabel will be performed.
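For reference, the equivalent command-line steps are sketched below; they assume the standard /etc/selinux/config location and the getenforce/setenforce utilities:

    getenforce       # show the current mode
    setenforce 0     # switch to Permissive until the next reboot
    setenforce 1     # switch back to Enforcing

To change the mode persistently, set SELINUX=enforcing, permissive or disabled in /etc/selinux/config and reboot; touching /.autorelabel requests the full system relabel on the next boot.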

site-packages/sepolicy/help/transition_from_boolean_1.txt

After selecting the arrow under the 'Boolean Enabled' column, the line will expand to show a link which you can click.  This will take you to the Booleans page and allow you to enable the boolean which enables or disables the transition.
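The same change can also be made from a shell with setsebool; the boolean name below is only a placeholder for whichever boolean the expanded line points to:

    getsebool my_boolean          # check the current value
    setsebool -P my_boolean on    # enable it persistently (-P survives reboots)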

site-packages/sepolicy/help/lockdown_permissive.txt

Disable Permissive Processes


Disabling the 'permissivedomains' module allows you to remove all permissive domains shipped with the distribution.

When the distribution policy writers write a new confined domain, they initially ship the policy for that domain in permissive mode.  Permissive mode means that a process running in the domain will not be confined by SELinux.  The kernel will log the AVC messages (access denials) that would have happened had the process been run in enforcing mode.

Permissive domain policies are experimental and will be switched to enforcing in future operating system releases.

Note that if you disable the permissive domains module, you may see an increase in denials in your log files.
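A rough command-line equivalent, assuming the module is shipped under the name 'permissivedomains', is:

    semanage permissive -l           # list domains currently marked permissive
    semodule -d permissivedomains    # disable the permissive domains module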

site-packages/sepolicy/help/system_relabel.png

[binary PNG image data omitted - screenshot captured 2013-09-27 with Shutter]

site-packages/sepolicy/help/ports_inbound.png

[binary PNG image data omitted]


IHDR�j��8pgAMA���a cHRMz&�����u0�`:�p��Q<bKGD��������IDATx��wt\G��	��K��!Ћ�)���M��vUWuu���7;{v��wv������Nϴ����*�J��E#z�-��0���|$2	 E�$��wH���x�ދx7�ƍ{��^z��YTT�76�m�eY6�@ |-�$)�J�N�����i���EEE߮����%v��F�O ��-�H,lkk�ۖ�E:z���ٳg�SU˲nt��@ |�H�D"��ĉ�U�AUULӼ��@�%`Yv���6K5MS3MS��@ ��c���Z�E�G ��J_ ��9Y]�f�_ �|H���=ۏ���$iB?���Q��u�L.S��~��y�~o�c���"�U@���ρ$I��i���i�p�q9���)��8`aY�i*��p$���Ħi$�)�$^�Y���=P�$L�$�`��$��q:�`X�h�݁n蘦���L�D�$Si|^��`Z�0�,�q�o�cndI�1�;��G}U(}���1����-:q
]יY_��?ˡ��x�w�i*�iQ_[��y�����x��u�k��eYojf��#��w����&�m��@\�eL�D�$$������D"�߿�C���RQZ�S�ό����E(4¯^�#��u'�=���g�@�$dY��̙�/���$xo�V��=��z��w̥�R��ۗ������|�Ӟ��ea�Z:�Z�ܱ�2��H�W�ib��,L���a]���c(4��)4��� /@yI�U�3��/:�@�ِe�����}�O?�5�$�I�v�x<�����}��a�ۑ$��H�t:�Q~�IiqK�Ų�b{�e�J�))(@�Tz�T���F�'�s�s����NqA�,c&#�O>x5�����)Օt����)-*�4M‘(�t�i�5�!�h���~l��a�T��!I�]��}�|X�D,�p�Ov量��������Mqa�i����Ph��Eiq!��!�i�d*E��Ga~=�A�Ge�K�A���^���,ST���f���U�d��('��1��<fM�GU����I��	��ܹ�@b��<v��K����-��_ �<X�IQ~>%�ol�2M�3I�A�	Gؼc7�"3g�4fM��NTs���R'���_���"�H&�(-eق���i��os�I.�]b���l�w���`��#<������)��F.�_"�LRU^Ǝ}پ� ���t8xh�j$Y²�DS3�CL���w?����E~�O��V~��g�k�y�M��ģ�X��i�̬�B$�?�����
3�%�4غw?��:	G��{�
N���/8��*������z���H��׬&�J�u��;��!�~��8���]KS[U���}DYq15����T�:��{�D �U�,�Y���
y�������̘~MS�'��/�ò��,��=�7������9|�������DQ�6Y�,���3n�0r�߻�N�n7��c����QM�b���wu�����kY0���ι��$|^/�|�*JK��?�Ɗ�X�p>����9x�d��n�&�i�t���z��w�����t:�/~�*�O7�r:�{�L���0���eQRXȌ�)���<t�jdY�RWXS��I���8�t�2��0����/�J{gu5U�u�`c3�F7V/]�¹��ח_����#��*/#������y<|�G��(G��@p;"A�dž��%�O���@�9�%�֎NZ.\������j>ٵ��p˲p�\,�7���p�1��:ukG'v�
Y�����6M��t"�2��O&9�~����e�Le��v;��0�D��1�9��{V�h�,� �`������T*�����
z��9�L�L���0L���˪%�x��M��6֮\�ݦefY��d���P]^��{�̺�{�>aju5�yy躎,�h����n����͉�3L���0?�0�4�����R�4����*Sk���s>����~��,I�lnaׁ�<z�=�:p������9 ��/�K����`$��pp��L��b`h�T*ͯ^}�4����[O�#����ض�.��;f�"/�GU~/6MCQd�>/��e̝9���wa��).��D��$?���lڹ�#��X�p~�Q� @�$R�t�[���5w��_��uSX�r9�X�c���|3�ֲb�>޶�f��+h�6����D�1f�O�
.L���v��*���3�c�&N��Ǎ�㡬���~ˢ ?`t���^
��.ˌ�,����{8��Npp��"�|���8�D�Ӂ�a'���Y�{Kp�bŅ<��}�u�+�ˣ0/���g�kΜ9��kcZ�t�D"��f���8�u�TZ��x��5�d*=:�x����iZh�J:���
��k�
�E2�F�e��v��D2�G�Z�E2�F��Q���@@76�,��ut��
lZffO&�L��a�_� ?��>�ظ��\��ۖ��
,�Ħi$�i��6�
U��E�dE&���T�0��d��$i]�Rw�����?�8
���DI�R��
���Ʀi��
�m�$IȒ�1��k}������/|^$�a��ـ��0TYFu�OqmY�m�r,��n�r�0��$K8'I�m�&v�e���a�M�e���ͳ�yYS�X w.I�8{����>^sW.N@��@a�y4U�̵hZn@r��iZn�`Y����&L�dΌi�TV`�Z|nV��l��rn,�b��a������� &�.L�g���%;�g���O����<�ir�$�p�y���,��$^�v���+e��1�)Օ�4�%E�������G �ò,��J�����,|7�'yL���/�	|DGc�^����fK�@�� b��@p���(�X���6 7�n&� ���J[oųn>ԁ�Zε���%�� ?���(��]�kba!K2n����Hx�x"���SUU�^?�`�x��=ߟ����墨��T:E8�0�[jВe�y}��5cQ#�u�:���n�$1<<LWOӧM����\.ڛ)3�������JKJ��:K�N����fhx���A�8�Mz�$$�6;�h��B*+*Q��}�׾�̳ ���ĻRpK�JH�l6�v'�%�j7Y���b$�I|�z�BI�����b�E��M���l�l����i=�Q�71��300@YIN��Y�nv��zph��w��#�I��T7��Ap{�mc���쳪�!2��:K���1+�ͭ�3}�XX�uK+~Y��;Rp�`�&��c�����~��Z%����0��D&[�WS�ltMI�P�[@�Q�X`Z�vPTHp��F��™�f&��,+x<><n�u�!���VQ�c��~u�ΞeIF�o3�̽����[�7���DU��TUTՆe$�IB#���^ϧ/E]S�K��I��G�Y��i����dg6��ݞ\i6�$i�z�a��6��tƗ�td)cꗐ0-ɒ�Z�}���l���*��3=K��c�Ŀ�T����IM�Y��XS�,�̒_�`��d_Ի$����)\�Љ�"(��e&�a��1�$��`1�r�?�窊_�$F�a�8��Y
���]W�$��Y������^*��?��);H�e��P�p8BEy��:s�M�:��~=�#�Ξ%I"80���{�
����STXx]���I0��ٳ��-�&fF�I��y��=��̘>-W���JKK(-)��sY����$K�ʘk� /�����X�����n�3�������X��H���b�I�b�YgM��$q�y�;��-ӧ�3g֬���u]������"�v{�����EE�u��.��/�A��n�`
QR�ĩ���&��C�i���a�����&�$�Ʀ&~��_�c׮�_��\�l������c�8u�h4ʉ��H�ӹ�X��`�'[8�؄�*�i>s&��˞'[��2{l����� �::����Ap`����������-۶�v����䍷�&�Op��΢�?p��Ev�܅�9��\�w]�/FF�g��������m����l�	v�
EV����ٺc[�2߳��cl��<_n`ae���(�Y�)((������|$Y���������YY��pPUQ��nW��((�2��UddY����Ù;ϕ?YK��됤�N�Ӎ��?x�MCԌ����\R�'N���?�[�GgW7�$q�����������Ů'��D�%,�D�5��z��:� ��B��k�����N�u]�ĩSܵz.��46~��d"���ܷv
�x��{��J&����{ע�*�,�T�-��|�ʊ
V߹��[����Cey9S�L��>������h,��}���ꢲ���+��s�z{{	��1m��}ǎ�ĩS�}>��^��&@�$��χ�7��?�	�������_��'{��x�S����Ec�z���"ӛH I[�ػ�t�Ew,d��465s�q4Mc��wf�"ˀES�v�݋�i���n

�`�zzzz)++�իٲ};�X��p��K�2w�lv�����V*+*X{�=�G��K�4�\�q^��ǥ���0o�\.��'C��+IH$	���Ʌ�Vj��Xs�]�}�Nb�˗,a���q���UEw�t]��������$�	��**~��Ӆi�().AUU����x<�j*�E�tvub��q9]
�����j��i���(((@UUR���`Y�y��mv2�r.�ղ�q���J���b&��<�����_�0������Kooee���\*+X�|9��?����u=�������ص�P(��3�3{�7l��g}7v�]�-������ �N1����<�	��4�EsI�Lv��鎿�
$I�������k5�d���6‘f��)����@k[M�g�3g6�.�y˖\�G";q��G�q��QV����s�`w8���ʢ;p���������̚9��� ��d�6ZΞe�8��Ȏ��8}�UUY�h�v줽���>�����3n��߷+�-�������?�Ƿ����ϔ���x�k~�4Mø<��,����x�FN56�����捷����RS]�;�ρC�y�������r��7�"#áo��.~��x<�}�eYL�2���s��=ʱ'(+-e�)l��.^le���TWUQ]U��5�n����OV�K����7x��׹��ƅ�?�
�؉�޻���Sٳo'N�b�޽��p��s���x&�0d�TU7�����林�E�̶=�y����a��2<<L˙�AY�����������H�����u���r��=
s��Ӡ����M���LpUEQ3�����1�t:}��N
�8|�(U��TWVR7e
�ϻ��RSUE*��ȱcx�^JJ�Y0o���p�3--���D"fϚ�'[�r��q���gm���A�@p��{�!
i���~/I��TZ!����k9ꪊ�tS3�::���
��p��1� �`�<�]s=���#QJKKX�x1˗-���VR�tn9@�eZ�ڨ��d��y�M���봴��tc�hY�)�ϧ�����jEƴ,Z[ۘ;g˗.��a&Z[QU����4̘��nǦ�X��Ctw��w�b�7���eY�?�)5���/QS]�_��^s�t��3��*���o���_�e�m�b�+�.a�ҥȊ̑�Gq:,[���K�0fxxI�������v‘��X,F�ٳ465����tQ_WǴ���IAA>������Os��IR��!�7�g�\[S���=^���1��k���if�C���=���$�/[JyY�v줻�{� cT�_i��*X���ph=���G�eTUES5���I&�D"\ny�yx<�C�|>l��������Fp8��4^�7�vhZ��?h��r�J��x<�l������棃&�21F����c���~Z�}��r�$�T
Y�)(����Q7u
��H�D<�����%��t�"���hmk��g�J	S���f��G"���ͦOD���nw`�ٮ��	�_� �Hp�	�~�q~�g��Ͼ��_��� �H�3gϲ{�>����z=�AN56r��q���p�\�=w�ή.,ˤ�����ﴜ=K[[;�N�d���?�e�v�9y�M����,�̘>��ǎ�y�N�nd�̙Xd�L��2Mb�8.��E�|���b����4�Q��/���_���).*�T�mӚ�$��8t���v�H&�3{6>���[���O�%�˖�H&ټe+��n������b::;������j�Y��U+����͉��h�9�χa膁a�
� SXX����9x�0C���޳�:N�\���Ñ�Y`�����������ys���QUYA8aάY(��C�'(~�x�����ɮ��������Ӊ�f'�(�H$B0$��v���收��B�e��(^��̬�R\XLuU5�����'�,s��ydY���M�ru�T�od~,벯�n\����CC������i8x���v��pP]UɎ�{ؼe+��A����u�s=k��F!I�T���R�`84D���\L��Ay��g�Syy9N�3Wp*�"���|����(*.²,4U���	<.?� ��{� �LRY^��5�����!??��2�.^��ᤩ��i2k�,ˢ����*f�j`z�4��8�NJJJ�z��X�I����d�9,[�˂ʊ
��L����r�ϴ���Ƃy�X��4M���G0�eY��Njkjp8W}yfffq�2�f�M�$�G0-��w�d����TU���G*�b�=wS_WGYi	�8�[���"���q{�,]���mmt���R[��ttvQUY���������n�c��)����S
�r�2�릎�\��322B^^�:�fƉ��y��+**������^̟G~^gΞ#�S^V�� �O���r�����\��$�DH��8N$Y��������{���A4EQ���q:��l64Mò,z{{�Ec�eU������\.]�]�2k��H���~�===D"R���188�n��k���d�ȒL2����O|���Ly���`�&6�FuU%>�?�=�e���s�%��S�P^^F_���<�~�EE,^��p8�y�+�1������u�4q��7�[	nS�����:�c��
(/���p���$�^}�Uk������똊���j�ƹ��˯�W?�)3ڎ]�8|�(?��8�q[�����9Ϋ�����d��{YY�0&��~�y�Gn<�$184ȹ�5cքϲ�cg`c��g���#nF���8�e���f���퇊,c�i��Y�1�^�T�K���Z;uB�YF#�v±[��0��\H�z�z�Ƣ����[�X4��(�v� �ba�q{r&�욿�f#��L$�z���J$AUU�;�pM�F� ��Pp88�N��(�M�f��Ds�>���x�]��� �E�}�	��l��f�{�>�s1M3w<�.�>��bl[Q%��z�u2��0�
��;FpC�$���^Νoa���q��"����Xv'EE%X�X��$����{��m��*�B^ 0N�:|���+;ڵ��~��s|Z=���=[֕��[����X��l�<�Y��\K>[���-I��=H��s=u�,m6��/���~�/�+P�?j)�խ�ByZX`�I/�r��|�z�-�����ȕmY���Y��Wgk��}W�=�^|��2���y��Ɯ�z��eZ�mJ�2�?��a����1���?:X���u��[�EyY?����l������e��Y›^�q�xO������Y��*�����?�η�%.��$��Fb�&��Q\\ʑ����맴��)5���v��Ϥ�eY�a��;����(n���W���ؐ�cC���H�4>h�-��A�@p����nV�,fΘ9nySd�ܲ���n����4���n���n�ʬc+�rK�u]�n���JF��a
�a�MAV��Rj6 �ab���!K2�a���u��-��n�SQ^����ė[gI�2k�D��}��U���R��$�H�FW�O�n���xn�%�׋��U��Ap �Ph�z��/	�NQa���:g݌d�湜�/ݹt���ǓYs����d�$<�g�.�[�l\��VG-)*a���_Z�Ep=��G�2&�\B I�JꜽO��3mn;�zk<����VG��\�_��'⌄G0tC�\�݁���9�ފ/ٯ�η����+|��Ҧ
�T���>1�\Éa��4%�ן�^ ��/E�K�D2�DUU�J��v?���L&���MO _<_ڌ߲,�v�U�K�,g���n���g݇*��Ǘ�!$]��=�#	�q�K�	bI�K��ȉ�&#�oO$I�B�@��O�2?si�
�z
Ƒ%��B�E.�[@i��F��
 �n �H084@*�"��D�o����+�%��ZY™L &'��-�"�Ng�U�f��nO�e�ݳ|��I�����,K������g�9ۅ��י���I]�>��xvM-~�]��o3$I)�$������B��o��������	�#�$�lj��\N��M�B��,�MV�斩��t�o\�l��l�eYF�u��4��}�L6���sH�DZ���iTUͥܝ�l�-aU9�����,�BUUfL�ƜY�		c�۹��tttRP��[����-K2�d�3�Zl?x���.Z�}g)s���d�9^{� E��-w�vq�X��ȶ�T*�a�� �Ҹ�I���zz{H��8��1�I�nlb��
膁����ehh���O��#�iڸ��W�۱���=��o�߿cɢ�<��:dY���Ӽ�ʫ���c����b�g�/�2gδp��Q�.^D_��3x��������u��r�r��}�4y�����d��%�kh:s��۶�Dp9��^u'����
�u�D����7x�)//��:[���.�b��|�Y��ʰi6B#!4����N����~~����X0��`0H8��������0�t���

�ɖ�&g���DXz_%���s�<���[h��\��{�� &&�YS�i�躎eL�O��0tLӚ�$g���غG��r�b+�O���n*>����rTUedd�p$�����p022B �0FF����N��F�$IB�u���Y�q�X@qQ}����/��1���aP����{����CsK]]]H�4Mc�����p���I���B�� 0��?��������R�ʲLgW���ˬ\����gs�^~�u�/x(,,$��F).,	��{�'�q��4�PhMSq:��@p;���K�D<����#f7�$�q��)�v;�,�|����R�N���ˇ}��*x<�y�I���k
QR\�sO?�,�c��I�FӴ�l�����Nb1�Hg
Ib�;!�Rn)?�֯�W�$�i�+CMJ����Z�8ȡÇY�p�h������m�����'�$??���}FOy��Ul޺�o>��� ����������q��Y���o疸*���z=lݾ���J�`��Y���-[9q�HPSU����w�cxx���N����>��<I$a`p��{�#��e�߰	�ʤ�}��g��ֻ��q�imog�Ź�m>ӂ���5kp�\TVT���‘�����ghxM��l��s(��a�ƛ,�?����k�s��̟7��O�#��q���$a��q:���F*�"�X��cz}=�=�4��c��]��G(,,��tuw��:�/��˭)fTUav}	3�ѓq��t308����h�cT;�di���m��e����s�����u]OO�aY�߻�g�|��Z�o��N�:�L�2����1}:Mgΰe�v*+��?����c'N���Ǒ��hj>�a�8u��OS_W��f˝GQd�[���.�ц
��Z�^/�}}�;p��]���m�Μa��M�����>�v��T*�e�,�7���g�غ�1�CG�r������_���Ӎ�ٷ��3������g�3ɧR�܀��}�r:I$���i�/[�w��<�B!@��p0���#ǎ����Hx���Ja�n3�9���^��s7Օ�X@_H�u�T:3�eit�qtr:����QTXHx�\.��I"�5�˒ăw����N���(E�:z��Dp:�h9dhQ�@@��n#���"ӮLӜ<��$��Ƥ���q�N��ԩS��~͉���>m��a��p:$It]G�ٰi���J�Y0>o��>�����;wc�ۘ=�aܹLӢ���g�z���3�Ogێ����fæِe�T*�e�<���(ȣ���qt��x�^y�A�N����Wqț2��M[�p��,�����줹��g�|�����r�p9]h�-�X�J��3{6����:��<���ێq�~U�i��e4U����-۷3�����j��CV,_�Go���%���%UU���Y�Qsj6P����Z^~� G�ws�u��"E�^�g9�l�W_�'?��|���$)3�7�I���iL��,
�A�u��C����z�e�eK3�%�Qd��ΒE�x�����$O=�8�E���y�y,]���7QS]MA~�e�U$��>2�<,�$O�(
ee��7���[o��
Sjky�޵����o���罬((���(��J]S5�?������|��6V,[���GWW�ژ=k��!P7e
�{/�}�!N��x,Ɗ�K�={���:�4�Áͦ��g�.ee�|�e�>����.nҞ={��3g��u�� ���m|�h�x<��夽�%%��l6���(((�����Kp`��GYi)án���l?�(��0��$�x2EO�D2���LE��o~�m�����W���-��L�c�&�/�'?/���gH��(�2.y�DfP`�4̜5n�eY���H������G4#�PTXHhd˲����������ò,�����6|>�}}8N�~_��a�RPp��X�E0���A�4zzz0�����::;�T
�χ˕1���v��$^���P�‚�C!zzzq8�TTT`�����fxx��MQa�g�5����q���(2�� .���N�h�$
Q_W��{ٵg/��o��U#k
���$���|Y�[�5�/+4��+�8e���4MdII"4B��I��^h�$���ٷo/>��e˖��m�i��;���"ݠ���h,:A��rS^V>n�=���㷗��9�+���bT\-^ŕ}d��Mv�,W[�������_W���{�%5kT>[�lٺ���߾HmM��p�n#��ܔ�ʗ�ؗ��~��;�e�B�|Zֱ�0�4�<��#��+�z�5�c��iL�2uҶcY�U��d�q�+��c˽�9&;~��]O�����V���>���l٩t����ѥ
1�nG�T[�g��~e01ۿ������ߗ��f���Y)��Ǘ��E
�b	%.�TR�Ը �/�I |)�߲,v�p����O�C.�R)�n�PJ�@�%��eY��c�0)
>�Ù��mE �T��rbK�ະD��@ �*�ҵ�x��@p�𵞎gw��ld�X��Z��v�k��Cq����n�<ZV<�B��|3�J�@��S���e-Kpq �k���D���5��
o������3$��5�aDcQ��䍮�g¦*l�p��<1�ssi}#)\
��ʆ*���@ �
������&�N�r�r�nv,,�#I�BO--ff��q��R��/���-Tۭq_�@�ઊ_�ub�e�&D�Y��H���u�7�)]�%�\
#	�]�~�@ �ʘT�K�D2�D�4E���K@�0��*�����	E��S&�4�Po��@ ���W� �J�*�M�4?	Һ�$Y�o�ٴ,�ߥM���-�UM��T*c�ʬ��E�$��$�a��lh�z�@=cs�g�~���,7�2�:dTE�4/g��,�t:����f'��g����S�%�.����z�@ �4�:�O�idYƴLL����ݷ�_����_~�o_�=�/\�4�\���ﺮ���N,C�ev����O>� V�ؕ�����I�Ğ}�ٰys�x4�&ϭ�h���O,㗿�m����((��,ˤR)��W����=��%IB��\�ﱟe�2��g�$�(
�D�_��w\hmW�]A7�q���@ ���t�m�a`�l��;.I�N7������)/+�Б�������$������D��^y�;W�����R��)**"/`ph�D"A^^����^o��H$���f8"O`Z�e��l6B������=�͖��,Iē:�>
��T�IwO�D�`p�p$B$�����N{�%N�:E�����t�H$����
�EE�R)��yyB##Ȓ���aFF����c�&�:������ܵjU�`Y�+ȒEڰĖ>�@ |eL���3{	i��8q����,\0Y�Ys���3g8~�$�y|�:C�Ï70s�tΝ�����.,�������Ey��'���g��}�������5����f�Y��ﻗ߿�*��X�t)sg���Ʀ�h�t�Y
33V	�i�KCB�L!�$c�&�y�E���Ȓ����������TVV��O������ۋ��,Y������˟�9~��Ӊad� �%�����9�Ο�m��y��Ady�q�mW���u�&���aRSN�K�3*�¦i�dTE%�L�H$0]7��4̘��)�<���̝=�4�?o.�η().fێ��ص�%����I,�ȱc$��\9�d�����t�u,Y��s��s��q�N'���,]�8�C`�&i]�벁�8����s��U��084���0>��o>�,?�ٟ�r�,o����V~���g�a�Νtuu�N��$�T:M:�&�J3{�,~��?��s��,;w��G�?�5�չe��3�Ӯ"K���-!A _&U�����e����@�RS��s�i�t�h4��	Q[]����	��$�i �i=��g�̖eZ&zZ�2������V��r�4MTEA�T�v{�a.[�9��g�&�t�*aW'w_�.�˒�����t�i��πeZ`e?��F]LJ̺~"�$�N�J�2�IN��Á��D�utC�95^�(hMA�,tü�"
��k�S��D*��t��i�̟;���>�x��6;�t��׮������B>Z��M�T���i�$2�n�/\�ŗ_&���/d��>v���d�"������iZH(���Y
4���_a`p���S1M�dJ�iS�T%s5�JU�$dIʘ��͆"g��v[�z��#��4S�N呇"��/��
�ap��U̜>�����W��۹k�jdY�у�i��Z��w�{��"��r;�3~I��8TRi���v ��iϞ=�̙3s�RI����&<���N�%�0$�J�x�H�D$����Ӂ�餠��D<�������:����ò�Q�$��y���q帜.�>/�p�@�O2�"�J�	�Bq��^|>#���`ٜze���4M��A�~?�p�ˉ�n�?$//���A��0�iR]U���&���ۋ�j��� I��
�r�������2��D��MIq1v�e/I'�w��SU�)�
���$I477O��[�[I��8Ϋ~9;sk�{2�6{l�̕��:�e�]Y�eYH�9�Y�������Y0s�{�{�/{�[㎍=��s�X��ƞclYc�����~���Opp���"���@�U�L��e�J��%�+�E�G��0t���G�*߱��cco�㙥���[ֵ�����ѫ���[,0f�?��t=㔆��)�ɲp�o��B���ql�{�@����i‘pnv�J���bh��1i�B&h�2��ͯL5UŮ�
��b/�@ �<$Iʄ�����u�r�r��� ݂{�,,\N~��FW庰��TWT���KJX��@�%�����A�,���

ot�nK�����Ͽ���ל�@�s��e�|���/�lp&��fE�Q��@n'ݍ��@�iL�P ���_p� ��G(~���h��[��f���s�'.���|}��	�N�@���l�ݞ	Y���D"�LFI��O/Pp�s��i<'�"N���E4%�e�8N;�p�Ϗ�dv��bQ���^gM����	�B�@c.��U��}�i2�42�F2��v{p:?_����I<��w�$���
>���+VQ[S{�����ML�2�����L�R5M�͛7��Ձ��p�\,�c1�f;��7L���&
)-)�\��u�;�S\\’�Ko���$��DS0�ώ]�����f���ws��~��8���4M�̞ô����!O?�,�@&����'	G"ܻ�?�~�������eEQ(..a��wQXXt��
��ikke挆\��D"���?dd$D*�BUUdY��0{֜�(��:�a]�Y��.�KJٺu3g�4Q9��,�SXX��ᤩ��d2���?��px���i������|
��g�8�J�S̛;��s�o�N�>��i�F��ښ)~�{�1�Ғ2F�!�GYi�fSQQ��� #�~����"LӤ���X,��称�����Q�~�]�1�F��M�eNl�--g��XMe9�һ���{�Y������H��㢊������"��cZ��%ť��i"�0��e���%�d*������q���B|>��(����ò,�l�Lk[+>�oL)�0�[�1
������N�@�ujkj)((��p�����Gض}��Ә1}&�H�ήN���H��tww�r�I��膎�f���EQ���%�RTT���я�B����N��y�<�T���r�?JsK36�
˲X�h)ǎES5����
�d2EGG.����2�< �ӱ&7�K�ą�	�B�B�L�ZGww�͍�T��p88�x���$I���RRRB<�����C4�����v�8˗�d84LSs#n����"

9v,+�`�%"��`�eM:�/+/�����'�QSSK}�4 ������۷"�2����aw�kW�e����H$�����
���O$��v
^������cOr��!\��m��������=0�6�T�C��L��y��޳k\�1}&����?@$&�L��z��KAA!�,�4M���x��/�R�%�?��?MWw'ǎ��v
'N����H4��Ys�|<tEV�;�ܷ�~���B�k}�(..a����>��fϚ��
���{�=ׂM��p�b�O���� �`��WR[;�F_��k�U�O-p�\3k�l��j�`U�U<��#�r���{
>���>x��jJ�K���g�y|��#JJJ���5l۾���`Z&�f�`��z���6�d�n�Bc��	���ɢ���L� ����V�9��<�(�i����H������3��R�$I�"G۾�i���.zz�ii9���2��e9��\�Y0���8t� ?�n�_�%�<�UU5��I���q}��Ձ��f���),,���"�h�5��;!�iY0�"W7���'���z��n***�|��n~h-g�9v�(��d2IYi/^ 8��+���b��N2k:����~�n�d��K�,!�
����݅�餸�D�2W3���I}�4�.YvYv4τ�(8�NTU�������O����Gww���H�K�B!���F������@-'�&/�G8fpp�H$���ѷHp0�v���M��}_F&��f���,�%��I��\�x�ǃ�i�z+c�,TU�X\���lD"aZ�.	�vy�D"��W��`0�^�����y�BS5�,^��`hxh\�$	�ӕq�dYA�4��4�E~~���.+�����39r���q�;$�J���M0���v��D#�\n�Ν�_��}�L�%�2����;�p8X�d9�d�cǎRWWό�
y�m�����i3hk�ȹ�����5y��@ ���,fYx�n|^�e�X�x]]��9��R]UÌ3����9����8t�~���W�l�
��${�����-��r���<�,^Ɣک�FB��7��n&�Z�E,�LK3;vn'<b�Ux<^�^�x��;��}�VΞ;�f���8v�(�v��ȑ��l6|>?�������(�����5��Wp��14����������r�����CooO�>���_P0n2���T����)*,����Ʀ�ttv��FE����vH�D}�4R����Q�C4���C��	V�\���K)((�������zM��Fz��W��k��d���(�2�C�0L�DU3�ou]G�圜i���,K(�x��?�D"���fֺ�#IRn�dڧeY�g9\�8��d��[�$TU�4MtCGU�\;6��8C)�2AƲLtݘ��n?t]'��Ig1M]��uͦ��ZE
���(2��df��㇌?c�Wd%��QUUͽ��KO�i���t�w�����+��#�e�Ε�3L�0Ǖ�-3󾿬FFFx��7Y�p1sf����S=~��{�˝�����q4U�n���W@��-[�Ln�$)�����0�z�Hml�(&����ƶͫ�]پeY�&ۮ(GelQW�H����v.��dmT�el66ە�K���j��S�g
�u��������k���I�r�ʕ++(��f�c���e����\=<nv�}���z���ώ��nzD�>����PnG�
�X���\��j
��:�|��p��,�KUp�"!	�OF�TTT�;�p8?w�@�KV׫�Hd��t�s
�m����_�|2�D���>B^�_����qԑ�MQ!/�'"�DP
��w{���4�ibWe�i�]!�4pj2�	�i�Ʉ^�B,e�ʒ��_�|�&����W��o��y!o��DS6c���������E�6}Q����05:�A��i���	�`L�F렅������
y!/䅼��B�&����X���p�*[�W^�vGg2b���1�Q�������9�ZDŽ���B^�y!���@m��>�B[#��l��@$�,]�R��o��ߕ�MvL�y!/䅼��7��	t
�)�$�0��O�r_0-r��$M8.I��1�ʆl�ā�Ȍ2Ls��2�{��#s�ڲ�u���@ �c���v��e���\*��q��@����NM���n��'3P��;�~�I.�'�,(�k�,uM���O�8l��\��
gzt�����4��ȒDKo���420��ƔB��4�=	t���R�U�D��tg�ᘁ]��R�m�i�NO�B��@p�,q�7A��f��g�}�?��
Nو�Ll�Ģ7E^�i%j�4v�Y5�Ú>4UbU������H|kY>n���r'e~��I�`neF��p1�$7Ǚ"f�;�R�ħ���McW��bO-�CS$�N� K05x~i>��y6�T8i�S��xbA��<��12�|��M�X��#7�$dI��T�4��h��iYXd�N-l���6c�̴��ߙuګ�]s�}g�u3ϫ�e�P]`ga�����Y��2Y�|}0-�)����K���N���e'�21-hH�6�"ϭP�Q9ۛ�y֟
������J'�.��n�}AZ�)��#��,�r�}�>az��|�Jswb\#�	��O��u�Ntĸo����o"�0�{�(��]A�_�sg��Xʤ�;��`��%��$��0-XT��[K(�و$
FwN�z��&#�+�̯rQ���	�W��/�*�f���UN�Lq�L[T������67XR�ay��C!�YP�fa���U.���WX�ߡ�z���U.��:)�bN��UӼ��5��:�K,��f~���S=�����t�I����+�T� �?�5nnl��j%�P����d�N%�G��Y~�m��=��i�Qd�ޑ4�eq1� ߭Q��Na��;�&�2�V� �2��$u���$���gxynq>�����d�b�O/�#30-�9:�RX�E�`
M��Rh�?�&�6����y5t�"���9ȁ��F�̬$i�J37XQ�aN����Q��<w�{�uָ�k�H�+�j�)wM�bYOY�� �6s�67XY�a�T7��ΝӼ,��aE��R�FR�xja�-��.��
>�i��<���S���9~�b:ӊ<8�Ϝ
'�*]�u�:�A�P��aMXzۧBq�3^��V ���F�W��u��֐�->>�Ž̭t���xJNA�R�I�;�-ʡIh�DX�mSp�&0�9F�#i���Rf�L��B�_n��R�o#���TL<���`0�p�(���H�l2�d�af�@�V���	��S#:�.E���Q��X{�=�#��I`ݡ4[���>aF�M�C�W�$ID�&G�cm��)}#i�Ùv�J3����K165�8�cf�˲8����DS
�9���Y� 3�s>�S!�l,�v�1�bs��υ�/q`S%��$��<Bcw��P�#DSf�<��X��K�}�_0P���;U,F��
<*N�FSW�C����pLG�2���1����6��s)����p[��������P]`'�4�!B]��{|t
��Y�䝣C���֖���2�T�!O�<��gE���n���ͅ`�#|E!wO����s*4u'(�h�(u��Q�Q� �6����]�l(s��u�V��Nz2y6��8�N�S��+F��m����;�ʜ�:�4�9Ft��8�П���N�Gez��]�?�S_�m���d:���D����";�Q���+�<,�q1��E�Pf���CJ7	�fW8�؟�ԯ�].3Ic��ʜ�bQa�:!Kp�?I�-��s_K����FJ��:d��e�T�y6��	s�7ɥ�y6J��M��l���4�	��`�ǧBt��t��^ꤦ�ƾNt�s��dIBU$�O�pg����_�q�-F�p�]f^����ΦƑ��*f0��E����q���$��v�N��6,n�`D�?��5G�"�[ԗ8���S�gcN��̘1�R�Q�&M>iK	�W�,I��
���&�5���8єŴ�Q�q�N��Z���=aQ���"�>��mQ�{��楡�Iu��|��p�`E��"���Hlj�LoY��U�$�0��4�"K��
���D�&>�B�_��P��S<�2<*���KF
�#�XB�aZ0��I�����C��+�X�ҳ��\L"K��n�Lڰ�%3/��Z�,�a^`Y�Ȍ�0�����c�$p�2�������.Ed�X��3�yEy�s	n/F��sq#$@�%R�Ŋ:�j\�nO�hR��nc�Mcߍc�-c��
���b�_���B4�/M,�����K�ͮ~�=�N��Y�I�s��N��/����C�@�Nws&;ߕ�E(fdAv��g��'�m$����k44�"g��h2ZޕǬ+�1�3���dѩ��|ôH�M�ik���Wɕ��	�r�d�K$�kVʰrAҮĴ2K�I�7��W���\pc��!�2�;��#��"q�!���%�EZ��5�\c�>���%�/�o?yE�S�q�%0,��G��땷,�7���3J_�D^��'�楽9k��R!��KRƷN=�<���5YH~�K8a�:�((�������r'��.��D�bN��HҤ��.�M%�i�Λ�>B^�_�|�Hf��y
y�M!�[7e����)?��ɹ�$�T�i�A�W^{�R�#��,U�r[<t#��d���$e~W��g����B^�y!/�ob��fb���$���|���!�����Φ�L�^�FWG ��IV��7�"�@ �:����6b�>~Ӳ����fӨ����p�D���&�LRYQA~^��!:��r[��vU��l6��}}�4ee�����M^^U�����q�eY��F��Ȕ�������rQR\�����&80@yYE����a��/�t:���FUUà��%"�(5UU�|�}���D"A,#//�+�H?88�$K�7��7�	�?�J�y�V.\l%3w�l��ܳ�8q�M�|�i�����~�������G Itvv�L&����Ajh�W_�#���,^����=Bk[;�����z=��<��,]�x���i�o��C�)(���yy��l�w�e��y����9�$q��!�~�=TUE�u~�����o~�;�C!��(w�Zţ�<��7�{�^$I��v�^����F�{��D�
:���~1�ܺ45�a��=���?AU�?���`��
�l6�{��/��s�ϓL��=�a��#�(ǎ�`��%_�5
�̈́�`��x�G�L�-۶s��	t]g��7w �?�/Ξ=NJ�˘>m�a��7^����op8��_���@OO�X��<�<�==�*s�qv�����=���p�\ܹr�x����8�7n��gɒż���|��*�ˑ$������p���%���صg?��(++���رkO?�č������F��׿a��eܻf
�5՜:�H[{S�L�������D"


�3{����jn39r�(����E�e���f"�v��;��'�J���ě�K^�ϝ+V`��9v���}̞�@Mu5�����#�QQQNOO/>���K#K��Ʀ&B�$I��s��;�-�0Mtøj�ϟ�0��H�ę�g��ꢲ����:�ϴ	�Dٻ?�p���x����iZ�<˲ؽg//��*�==ܵ�Μ�Vp㘠�%I�n��o���<��l6[�,��)����
Y�����q�b+�W?'�s��	���x�?0�����R�$��_�H$xl�#tttr��	�;����

�~�F�.=�5�U�������A�$�H&�̚Հ��b�…���W�D�,�7��ìY
��N>���aʔZv;sf��Bk+�e}e�5�K"� �!�N�w�6}�	
3g�Eiln���Ȓŋ��'<x���s�j�_�d���TU%	���Ͳ�K��͒E���I[�%��*�F�l�dN��O�n�����񳟼�[ヒ��8�}��]����?��t���r����K:�f��EDc1F�a9B[�%dY�6F9�����?��KG���<�f�x�&�La�<��Clڲ]ש��#<ΝÜ$f,#�3<<L:��яF����lܵj.�����s߽k),(����k<x�}TUV�ݦ-[�>����j�� Hw�XNqQo��.��y���2�����|��۷�q�'<p߽��t��p���3��Z�?��|���?�f��^ܪ��:�X�D2�M�a��	��:�i����"�J��I&��m6�naf540eJ-�=�3gL��c����l۱���*��{�a����ٷ�;W����N ��������cL��#4&�L�h��}�i��~����񸩟Z��O<N*����á}�~�.'m��b��X�b9�����غG����RG.\����v��U+W���{8w�<===�Si.\�HMu5Ҩ5�Jä:�wv�b�Ν��Ʋ%KX��C��λ�޻EQr忬�N$���P�$I,�?�=�����O~72��?$?/��ӧ�u���4gϝ��?0�����u�b1\.m��:��_���s���b��󩯫���efg�$aY&N���d�i���j��.TVZB$a�Z�����_B�uF�a��A�C!���)((�����;v��Εl߱�y��RVZ��7�`�|Ν?��(ܵj�gZصg/��5;q�u?|���O@�%,��R�%*���x<\���� v���D"Qz����(��"��p8��={5U�x=$Y��ub�8�=ݸ]nl6�ef,���,�}>�*+�w�=<p�Z*++�
k4S�$�X����d�>?UUc�+�
I�I�����7���b��4����p(D<�0
���t`�f����������%�/]������"::;s�'MU	���K^^��ʔ$H�����c��q:�7���(�>���2e
n����a^~�5>ٺ�cǏ�r�2�q7���&E�ĩSSZR��Gq:��s��	�f�f���
�ݿ��C>@�ԩ\�x�=���J�y��'8�q��!�n�Υ�N}�a��
y�9|�(�Q�ǂ�L�2���g�>���x�Ǩ��axh�
�6����3O>A���u�v�=��s�w��dr��*�D�=��STX�y�8x�0���4ME棏7���F<�''/OD�|u��N4M�Сô�_"��y�hkk�����֭��2
�p�zzzp�\��s%Mg�in>C<gZ}=]���TU��x����;���IYi)K/�����3��++)/+�?�O���;�fL�ƾ��T�-۶�g�~9�����M�?�b[�x�g�|�ٳh>s��x쑇��饪���2G�chx�Ғ��ʫ�ݗ)s`p�ys�0<��ѣL��}��Ÿ�z���_�֮YKQa�1�G"Q�A�%%%H�D:��4ML3���f��*
��#I�穁�A���%�9S�i�$S)4U���'����X�E*��4͌�p��:%	\.W�e�F�4-7��̶]7p�]��5��,��$����
�T�X,���dϾ��ع�~��̌eL;�Jb���笣���_�����R�nw�v�(
�t�X,��n�n�c�,#I�a�v3I��"�W��e�0	���l9�h4��f�Y�@Qd���ώ��&��0H���l6YF�\��T
UUI�R����'=�}YX�n��~�l�2�s����;�]e}�j
� ?����q�dY��p\�bN��ꪪq��_�ŭiڄzI���� ��
n-$I�Lm�/5�$���m�	ˎ��b-]U
�q�3ֱ�s��IӴ܎(���>�ص�E��g��NZ_I�&�;���S�+�پL��WE쬹�oE�׎�s�P]]�*^4�����2~�g6���fa�9TW���uG(~�׎��<��:��&��t2}Z����5}�@,��@p!�@ �B��@p!�@ �B��@p1��߲,��HUQ	�ȣ"‘zZ'�c��H&�D",2�?UU��v�(�h ��,�ݳ���p�����2�v��$�EQ0�h,�����=F�Q"�(~�/��[�u����l6c���FFH&���剽�_s���i��
���0�u���J���$�WzN���
�J�x��9}����s��U�߽:|�?��t:MUe%���p���z�] �W�$�����f���iia޼y<������^&���?�3gLwn�49p�0۶oGUU~�gߧ ?��۷��`�|^��ri"_����������ko�Is�,,[�w�X����p��$	�����yS��ilݶ���!���o��פ�L�{��W��c�4[��R^Oo/zZ���b���$mmm��Չ >7	b�˲Laa+�-`߁ܹb9~��ys�|�R>޴�‚�͙��l�zzz��ܽzo��]==<��s̛3��Ʊ�'8|�?��i9{Cי�0s\e���y��y�Y����Ld+I&�H�FY�t)�t���w4̜�7�y��MM\��dhx���f~������TVU��7�b��q�ڵl޲EQ���z���s�L&�����v�F��������F>@�ԩ����3���u��d��A<�LK�C��>Ӥ�����^��>����lٺ�����H��~���m��6�N']��ttvr�R�ap�b+�H��?�$1A������~�2)�Ϟ;��� ^���ǎcs�̞P�h4FWw7ݽ�l~�I�����ҥKX���墧����~��{�p�z:Lii)�\H^˲8r�(���ߒ� �K�
$�Œ_�e�
��|�~=k�'�Q)??��K���(�/�Ņ����B�E^��H&��;p�ښjv������Y�h!uuSI$��?�3�h��׭cpp���Njkj8p�0�X������z՝���Q[[CuUM��H��� #�0w�XAII1��u���&��.X@eE�EElڲ��{��r:Yt�8/\Hcs3�w��������o~�"���>z�c'N��SO����‚Z�ژ?w.7m�C��y�?����FW_p��:���&=��$S)�,ZȬ���������ܻf
>����n�;�����O��xx������=~�����b(�B���38�"?}��$SI�}���/�����f���JwO�.�d�ʻ��ɓ���+���008���=�ͦ�Ƿ���v�Dx��y��	
�(**�칳D�1:HUeńY}�ٳ����}�.JKJ�|7���L��'�2�55̟7���3�Ù��� ���K,Y�h܌y����2}Z=�x�X<FYi)�E���?q�'OQZR�7�}��3f�g�>���d�6�l�F_�����Q^^Fk[���&����#��ղ,��}��,�"�N]>FF>��Y��oeL��={�2�u��QXP�_���s�j$I"�H0�����뿢����{���nzz{ٱ{w�j��}$�I��wx���رsáafL��O~�#�ϝKsK�v�`84��a���tw� �2k﹇��x�����m��J9{�;w� /�2ko?�M-���Ӧ�v�9~�$�d��d*EII	���r�|�i�7ldvC�󿠾���[�bw߽����ϸs�JV�X����	~5�$�r���5��L�2�F?����N,g��9~���7"�(�d�_��_���nL�D�ez��8x��ַ�4
UU)+-e������t����Z��)**�a�9t����ܱ`~��y�%�Eܷf
^���7�L�H�ub�XΙ0?/�דI	�rr��afL�NiI	��a����p�"�i�r�r~��K465SSS���&�-]"��-���$��� 3�2MdI&�H�H$2��Fs������HR&w�@�U !a��)++e�����-eM�2����;H�Y�n��'�$���|f�`vC�e�X���������`�ٰ,Y�\���8B���۸�nJKKhkoDz,b�8�d���c�پ�bY���iZ&��h}�$�D�d*��f#:�p�f�e>O$���8�Ę23�yL� ��3�s�M��5������ٵg{��cZ}=w�X�;�φ͛q�=?y���|JKJسw/�x��y8�c��7l��$�Iy�!���8|�({���u=4���yٲm;���Tc#w�ZEuUo��6[�����.�Oii	6nbמ��z��y����9�r��7q���￟���i*��{������]'֙na4U�؉<t�ֶ6���~�:Ⱦ�x����JL�bǮݜ�x�h4��=zլd����"��������yy��~�~�}:;�8y�k﹇��OGW'˗.�h�Ɗe�8y����a0����3-TWU�r�8w�˗,��L3��%,�?�'ON��������S����&��z��2�;w�i6>ް��;v�{�^�����ll�d�::hmk���c��Y�ٻ��L&y��Gimm�����*L�`Ǯ�������݋l-������3�p��CG�P[]M��p�Ȯ�K�����v�Z�
��̌����ή.�.7ӧ��p8&�a�&����].���T��t�����FF��2����H��p��I�4�Y���imo�������̚���d
��2DIq1��084D(�������}}8�
�e�4�I$��m^_���������r9�����p��zy�w�����t�N���&�N���G*�������~��?��o=�������N��'�}>b������A���z=�F33}Y�ma�D"h����\�f��p�:�u�r��x<�m6zz{�y}$���2�����O���-|�g)+-���I�	��D2���P(��n��p`�&��H����?�\��NAA�X�p$B~^��t�@���lٺe���.�ۋ�GQd�������;���o�_���H�U&>7���w�w�~�����Qa����U���
��,_�������B��qA�n6�����?����_[f��� ~?�V���ո&�TVT��	njİN �����@ ���_ �����@ ���_ �ۈk*~˲21�7Z�{�
��t�Y����4�t��+2��q�^�N�i^�NW����u}B}M��2�,/�N���k����ݧd*5.�ee��gsf���+3����{�܉u��W㳔9����ɕX�u��(�Jd����S ��\u;_k[��_�غG�{�*^y�uΞ?�,�rםw��c��y�Vv���C��ݫWc[�mgϾ}��ɪ�+Ys�]�W�ϜJ����p(����C.��C�x�W�w��!������Skk�ŦO��g�>dYfdd�ښ~�N��w��)�}��\<��_��xؽg/G�cڴi<����0�~��gZ(,(��<Cii	�d�֭��}k�r����OBFi��=�Ǜ6�F�c�|�]��T:�;�OcS����9c�x��>��Ʀ&4Mc���`~&)�}D$ez}=O?�$.��}�###|�a�/\�����{���B<��ݻ�{�j/^Doo/|����~�L�u�`��ظy3��n����1��a�y�����U�f��ɖ�;~��ź��������d�5�4m¹�����[o384�O_�1�E��2M�`p�@���f��F �&�X��>���v�����=�_D���ljD�~ee���e&��H$�����w�
3g�0s&�̜1��)Sٻo?w,X��)��s��9�^/�g��0t���(/���g�ܹ����q������Ƀ�߇,���_������Z����R,��?d������Ε+�|L��cvC'N�����%KOH��r:�R[˼9����%�r߽kٸi3{��g�=w3����û�@k[;O=�8�mm�<}���|^~�u��>*��y�w�9c�WĜ�D"��׿avC���c������Gp Hcc��멬����Ι�,�l޲�˗�(2�e��j�������Ғ�	��>����u=DGg'-g�R���^}�{׬�����6l`������v�k5�!�H0<<��-[y��'1M�
�7�xѢ	��K���+�Zf2�ddd�O�n��G�0
v�������_O

)+-�pM�H���AN76�tɒ������_�n����<l�F$�?2a����H�2o�N.�1H���(|��G�TW��*N�I�H�R�B���K�3
122�����J8fhxI�&mét��p����$�384�k���ùP�6��H$B8ahx�ݎ,��d�v�hx�"��i`��H���c���5�+�2�Tj�|$aph(���ngϾ}���L�6-W���I4%��N�sa~��o�D"���#���W-s�{#?/m���םl��I��`dd�{�+w�**�ع{7���X�UUY}�Jδ� �Fz�e�ӧp��Mծ��)80��M�y����df�o�DqQ!v�\൳��q��Iy�i������B�_�H2��&ɺ���|�B#tuw��SO`�:[��`ڴz"�(�x�x<NsK?�@.;���?�k�^��*Y}�J=ʩӧ��:>�d*�&3uJ-uuu�޻��sgY0�w��%{A6�������ۇ�fC�dJKJH&������m6���&�w�
��9c:�D��?����>�`z}6���[�������3�Oc�̙�?���f�N'�-��1}������q���(��_��s���r�,u�T3�O#//��}���"���'gϟ�p�Ғ�_��#G�^��=v���34<�i�T����믣�:,����7�y�ǃe�w��&S��t����x"��M�X�r%��A��,���5kX0���G�e��}�p$�
�ݬ��N����ë��isf���G�0�9z�x�5��
�����w�\�}k���k���	����x�Ly�H��s���㏍�y�?����=���(
�X���˟�a�'ttv"I�|�y��W'�w�
��>^��tvuM�/))fxx�o<�;w���h�ʟ}��{����/���M,��e��z^sߞx�Q�δ���|^��~.��]�̱�
C7��&��e2a�����ׯ��V����u�X,�'[�rת;�e93͉kX==�����ܽz��M��a����@*�"�Hp���CG�p��5H��3߼���̝3��R�Dn
ܲ,6}�	�fRUYɵصgn��ys���	�������7�}�S��躞�Q�۱,�D�(�dF�v�}R�����=w��˯�������{���sf���45�a��ͬ{�!���l;�J�T�޾�I���5k����?��ˋ/��T*ʹi�4̜���/���q��y|^>p�w��?���{~����R��uf�)+����T[[ˬ�O)�PT����t:�?����~��RRR�=w���L�a�����q�w0s���o�x�B6l�LMu5��������M,C�4~��1���?��:��/
�0"�NSVZ�O�c�z�q6n��G�?fxx���������wߣ��9��8K�,a��)������s�7wΤi���8������cEa��%<�>rUӸ��լX����n��<I"���G�٧���ɓ I�3*����'[��}�.��7y��0���fv��͔������I"���G���*�>�����T��s���իV1�\~�RXP0�ڒ�$�X���k�����������{��짪�����Ox䡇�7w�U�\�0��x�%���&rC�0?y��mmlٶ�����r9���eŲ�;q�x"���K�L��B31�̏,����/��
����I�� B�4�;:8w�<���G��$=�����?q�d2I0HcS3��f���<�nm��i9�_��O'�v�
�ع{�~��A4������|���j��?�#�]�����Y
4����0o�l6l����aLˤ��w�Л���u,[���mm���RS�q�3,��03�o�$q��Y~��q���,\��8���M��3���'O�f����5�����7��ή.>\�1.����|~��������&��ee�[�������7�d�̌����gI&��_�D*���t�u��t��}�Sˬ�:���"��g?�Rg'{�����#��tѢq�d2,�u��]����vTU�a���*#�0�eec2Df�E�S^^N{G�i���M �ґ.�[Y���m~��$á�������$�	�w-�ee�߰��K�x��'��ؼu+�����v�u�-��r��z�y}��ZY����⭷�eJm
���J��l6~?鴎��tvv��;�2��M�	�I����n��4����3����ŋ�>m�Hd��"C,�T��q��g,��e�p8Pd%��a�fF��b��b��D�P(�;w�
H&�X�E^^���|�^�5ʴ�l��
�U}�n&\��e�h�9�4y�o�����ٳH$l۱��˗�2K

���w�~�v;������o/���{�r�]w�o/��g�z����2c�4�����%��9�r��>��>�����#O�H$�w��-Y’E�0�m;vr���U+Wb�&�v줾n*Sjk��LK-�.v��Ql6-�`�>
3g��Q^VF0Ⱥ�f�̙����������!�/\��&����2�(/gVCʥ�����ö;ijnf�L�VO{�%��#N56!�N3w���?�==����000ȳO?Ɏ]�hmkǦi\��`�=ws�R�� w,����T�}r��	��(�<�$�Yٽw�Ο�����0;v��Tc#���X��D2��S�����G�,[����|v��ü9s��ɸ�2W.[�eY?q������s�<�M�&�'�N7�iZ�=?q��7r��9^|�e�y�I���%�R��z��˖�{����ì��X�E�^��˜?���[{[��	n<ٵ��{�Ǖ�޿���F�9$��)JE�%�T���x�xmߵ�^o���w�׻5�(K�H1H��s� rα@����?h�LB��DݧN����ΩS�y{ۻ��p�V�Z�+������r�}�RU]CKk+��!�S][ˍ�J4�L �g�5Į={q�ݨ�J~^F�I�##��.�eY���108@F0I�FF���Ŵ,�x�>�HOGE%;+�V�/�"��X��U\DAA>����,3#n7Ҥ�JH}���^�����k��+����͏p�����$�)�3���n��.=��,[��%���^��ۺ�����-75%��۔Ǧ
��HOcێ(�‚y��6������NGbB�`��V��Ӱ���|�h�bbc(*(������TUE�ѐ��ʿ��<�i+W,����Co_�99~���JbB�͉�x<^22�Q���6l���D�m;v���>z�|�d���0!H���\�r���A�JK���E�$::;iii%99���\dYfdd���:�`Vq1N����/��������E��
��ӘUT�^�ghh����Ȩ�����BJr25uu�U�d41�dV�g]U���t�l
������t�y���v"��<��av�,��ȼ���6��r���C�e�\.�_���ncvI	��t8����c�Z).*���k�ڻ����?�0�Ӗ��*��\��fVq��S�����o�����㡧����V$IBQU�ssH���~�s���V232���������r��ijif�=lzh9Y�Fo���088���?��o�@8��Fo�3{�,L&�}}ttvb�����EWw7�
��LFf�����FU������*��"�U��m�<������9:�N232�����tORR"n��Ԕ�D���23��o��tGzZ:���Z��8u�����_���7 �E��hF��Sm7';��9��23���A�e���hln��񐟟Oss3��#�L&4���/��dggQ:{6����>�G�-7'�޾>$ %%EQhji��L��Ȭ�"<^/-��d�g����=Q�u1���c���5<̵��Y:Ŭ�/���\�z����I_�ή.Z��X�d���b��� .\�d\p�Sjjk	�-+���q�\\+/�c�Ñc�8s�<���,�p��߸�+����wF��������q
s������r�\���/_|������O���X�+7:bcb���{�v$Ib�…wc�HOK��/�^��k�OǞƸ�bcc��񐞖΂��@��"L��æ�����_�~��/a�Xx�m�,66�K��[tw�]��� �0��z�b�~AA�AD�A�D~AA�AD�A�D~AA�AD�A�d�s��@��{���ֆF�%!!�'��D8�}108ȷ�.�pM]}={��g�|���=Y"W���zߺ����I�V^΁����tȲ��Q2�,�'���c[	�l��:�ŋ�\C_�fQU�ׇ��+�	_I^od���"��_gúu�rI|}�A�+*(,( v\��;��|��@$�h�^��M��.^�LRb"K-�t��hZ�ʪj^~�5N�9�2l=���6{?��>G$���ב%�ŋ�`�<̣K�ު����ɢX�pAt��O[fWO7uuu�-+c��E�%b�o��7/��w446ͺ�(
��P(I�(G�.w�F#�����������E3��	�B��(Dt�����E��G��[�%�r{<�B!�� �w�k|>_4��Te����&��xo�'��ÄB!��;�|(�������M��XF��}���=�^c۾���P���oo���Zޡ#G��o)S���F9�;$%'QP��N�cxd���ϊeKq
G?�#ǎ�*
K/W�Đk�s@�eXIDAT�9�gG��bS�=nz��(*($����K]���;娂��s��Y�}�����y�)�پs'}}},^��̌t��?���X�V�~�)SC�z��������ˋ/����#�61����?�GCc#V��-�add��?�G8f�k����ݝ��r
�`�<ڰ~R����7رkWdu;5��]o_|��+����\�QA���SO<����.\�a.�Ǫ�+ٹk7����d�-��*+���u<��3�����{��~�y�I���ini�׿}	�F���[�l�l�v��MSs3�@�%���#O���(l{w�x}>Οφ�QQYɑc�	�B�X����<^}�
�Z>���/Q���O>�=+WL(����]{�200���/<��Ǧ�@���O�e^^^4��$I�tZ$I����CG�2�d6'O���t���{���cͽ�i��`�Ν<�u-������N��d��Ï>"--�Ԕ�I�j�Ȓ������Ȳ�N�c��OSf��hdEQ8}��]]�--��k���h4RY]͖͛Y�p{?��`0�sO?Mnn�-�\�~��}���&���Y0ޔ��N�x<��Cr�shim�G�}bbb�� ^��+׮�̓O��pp�z��hmkg��
defR[W�#Gxh�z
���'�U�5TU���o�CG��|�R�RS9s�<))ɼ��۬X����ZF��Q���І�0<<̛oo�U+�[ZFbB"����4�+*�w�s�Z9���Xs�?y��|Μ;GRR"W�\�[�>�?����p8�7��_�����K-�PgEQع{�Y��Z���?���Ğ�?d��e̙3����j�r��%۲��s����Ė͛Y�`>�[��
�]��'��̓O`�����x<457M���t:֭]y���_���r��%�߷����?�������90@Ss3G��
ŠBbBK/b��E���?��/���f�ܹ�*QXP@Qa!����ʵk4�4�2+��WV�Xw��U�޶�ǃm\JY�'���l"=-
��BWW�. ##r�G�$)�5�~��#ǎ
�&�i�;J�"	�=I��X�>���g��S\T��㡢��'ۊ�l�={���U�{�=��9|�8�y�defNZ�^�$���df����HrR�9���32<B[{�I�����h����e>�ԓ��s�>s�������8}�,C.>��~��a�0�"=-��R$��$I��ƒ��EwO/===�wv���@fF�	�
M�4Z���,-\�GRSWG0d޼��m6v�܅��$�ngVQZ�n�:��N���,KSv0��&~�����12<�kx����ys���������Hww%%��Z�,Y���Occ�����Z�ᇴ�w���fÃ����B��%�9u�,��o �m�l\����\Ο��en\����j��8�,˴���p��h�Y�K�������j�[Vơ#��dgg*-��<|��.�h���C�6N������NA~>��soo�Λ�l����E���Oll����rss�%���Dj���ԙ�x}>TU%#=�߇���{�2an@^n.��Y$�Ǔ��@^^.m�H������3338{�<==�TTV�d�b���ƒ��¶wwPV6����H�lE�ާWUu��++�ŗ_a���~��YY���r��U$IB�$�ϝK(���k��xX��x�^�zgf���BQAW�_#�(5���Gh�ZE�Y&�d�"����p�"�����O0)��
�rs���"//�����x3�dqQE��,�7o�'�X�rss���#;+�P0�N�c���$%&�����UZJ
N���p�}�1��Y�����2s�����!7'��ˢ��WV&fy�F��o=�,7**FG���j����1z2��cmkkc�ʕ�\�L�	w�������g�ehh���bV�XA||F���:X��������@}��ys�r�2�� n����D~h&��V�N�E%2ypVq1)�ɑs�#����	��G!3#���;\�z
�׋^�G���Wf��N8F#��ߣt�l��jj�X�|yy�d��s��9�{z��Zy|��RSI���l21�����D�JK����CM]=�`�V�}+�~��n�:�N����ߢ� ���-._�J(d�Û0�<��1,f3�a�]O(�P�$I�\�I��$n�~
w%;����Б�l\����#ǎs��9����	Z�]�<-�74�_��?��?!%��;�xpp��Csk+;�����?#1!a�����Q��^y�eK��І�^�(
����U����{�{�f���|w嬘�t!w��n��~a:����lɒ	��_G�8t���[�>r=����#Giko'/7�U�̾��(HNv66��N������i쾣��K(B��1��*�`I���|�~+��N����5EQ��������n'a�}Sn1ɒ�&�}����_}�O@Af�Aa�_Af�Aa�_Af�IIz��gΜ���TVU���NVf&Z����!.\�DKk+��S����\�z���O���'55�VKo_�����8�msSq��1.]��V�%11�P(č�J.\���f��	���r��a�)��� �O���t���Ɛ�ũӧ9{��a�����߭eNU���Tjj�8x��--$%%a2���\�|��@||<��P^Q���$m����<t���jl���͍=U=oW�T�Q`4M��W��$%&N9c��}dd�������UU��T�Ԑ���?���K�<}���.���1|�lbEUat�NA��	��TVUc4'$�܎e8��O�|��=b'|��%���~?�����vTE!<��sGg'���_p��E�n�Ks^QY��۷c48p�'N�b`p�_��-��446�/��΁�I�
�ü��{TTU�(
/��==���p��q��.M-��c�`���|���C[{{�痮\���b��C@$uuu#�2;����s�>]��:����TUWO��'NLY^Ww7/��*���{7>��W�x�KW��t��MM\�x�7�z�����g���w����餯��W^{��;i;S�goo�2����s�g��>b��C8x��r�ڧ���]Q�8�����ʵ���7����?����P���O[G�����Ӽ�����<|>oo�Ngg焟�.�� �m�@���o���=�c��O8n����ޮ;~ܞ9w��G���}�����;�ݞ�n6�.��;�"K��I�̚EAA>��?ڏA��'�$>>���k�Abcb���**�PQ��HX	�W��o�r����?��pY����E�բ�h��l���?�54��9w�]]�,\� �3���ϒE���d����O�����k�>s���3�4a�zÔ�IR�u����Ao��������1�		��/���p�v�}l\� W�]�7�b�����a�z=�$122r�gco����?yrB�o��6�FF���]�|�+W��q���Z��]{�r��y̟7����&�_���eKю����~v��CA~*�����<�u�455EzF�q��y�޶�7*x|�f��b������`�<�2�8s�,��ĄD���mG��N�$��m�1�M�w�=C��q[Q���9}�,.^����Gy��/280H d�5N*�����N����hd�;Fk[;9���w�j�;No_����*��G�����[0���TU�y{�v�;:x�'(�ϛ���Igo�V��s���a������~j��hik���^���ihl�����Ȑ�#G���d2���KRbz���DrR�����+I)��446r�	dYF������|Ҹ�����l~dSd�L)r`��~RR�Y�|���wvu�[o���C|\��+�v��ebb���l6MY��h$&&��'OR][��j��߁�l"6&Y���΢��א���Hb�̌��0>���$._��K���3>�=5
�`���	e�Ba��#��F����*�ü��6��:��띲�r��n���|�޻�˖QTX}����q�\lڸ�h�O��0�w�ǵ�r�o�9������	�Y�8t���&֮����<�;:�x�
�KfQS[��>��]����h�?o.es��������摟�Knvi��,]��F��cDZ�X����7�dpppR�-mm\�|y�خ像>��z����R.\��Ç�x�2.��E���EnN6��͛���$I�dg���Ȓ%�I����{(|�M
���o=�,��Ÿ�_���a*����t<��Q��g?%=-����Ly2���s����b�#����!�F#}}B��P�k��)���AN�:��~�#���Oh5Z.\���$!!!�SGUU�w��
�457SW_ω��8s�{?�����˗����¥K�dg����r��{8x��*��STVUq���	��x�
���>�R���_@��������Sg���v��� 5%��Hww$3��z����.������?�S��ihl���h4\/�1�L�VKzZ����|F�<�裬��~tz���6�\�[?��s��N�=���F/]�ŋ����=�K�hhj�ȱc�A������w��G������R���g��2R�����c�…,\��ܜ$I"==�5��˚���`0x7�O�0J�h42g�l�J�OBBs����� ++���$/\��d�d6�x�Bߺ��͈w���t��{/��w/7**(��`�Ⅼ^����QSS�V�e��y�\����,�23�3�dʕ��23I��g�…���n8�.�t��nN�>C0�90�����dS��GCcٙY���������MH�GEe�Μ������d�ϛ��ӧyw�{��a<^/���c``�ш���t:,��^�����GRb����ߨ�����W��ONv6�CC���!�2��2{V	z��P(��D&ј�f�{�)B�0׮_4����Fz������f�r�>�LI����Ꙙ��k�E��������J��-�f�exx�SgN��݃�f������D�yw�TTV����]:r�p8��˗�[ZJVV&��VU�^�-6���A�F�����##=��0�̲R֬^͵��|Fe�t:��2q8��z�/]�����=)1�̌t��Y`�Y��ч��v��ގ��h4Z�{zhmk#1!�޾��p���4�S�Ѡ��8}�V��ܜN�9��n���p8�������|c�P�>��0J8��^Q4�NǙs爉�b1�q8��<u���|�~?
�M�76b2EF�Ξ;�?T��B�c���+�))�d2q��e׮��h��^�N8��e�Z�V~�+׮Q��OEeU���xl���!��<�=+W����[��rVm}��-�t:�>�(y��dfdP]S˹��IOK㑇6����f�ܹ��ڼ�\T�IOO�M���DfFU�5�ߺ�-�������FRRdX�`0PX�OGG'}����w5,�����g�X,�B!bb�tvu�g�,Y��V�$I���R:{6es�`6�X{��
)).fNI	i�����r�r���8r�8��^%)1�Ƿn�z�
���	e�Y3z!4�����G����HKM%66�ںz[�LjJ
�y��������2o�\�23�$�ʪ*��ry��M�$'���NCS>��G6m">��?��h��ΚX��7���@V���(';����{�-�R��ώ]�>�s��&232�3��9�K��m,^����QXP���%��撘���244��c�9�"�p���x��ŁC��?wV��g����JJr2�-���FEf�EQ8v���ä����0���)wC ���0r�O��9sHLJ���E��A���3������>�			;~���v��{PU�Fé�gn�mdVq�@���&��-�ի	���dg�ObB<���ARSR8y�L�LEQ(-���j����ܜbE��o��Y��);����|>�F#�@�W�x�U+V2gv�g�@Mm-�O������&�ֻ;�#>>��k�܍
��L&dY�Rʼ��4�h{ގ�(���#I>����|�s-�׋�`@��r��I��غ�S�_U##2G����s������ׂ���������n�h�8��/�se�$):�l0��w���3-Q�1�'y|�/����l�/��;Q�O㋶��Ȳ<�9d���>�O��a�իV}��{��`���҈
�W�$���l�=�ű-|>S�_��@�q�OX�I�*TU��vc2�>v!1ql_�����ׁ^��L9�a�I��֚Ƕ�ek�� �"� � � "�� �"� � � "�� �2i�>�!��kׯ���I�5�р�(446RY]�$I�FWw����ڵ�8N�v;:��@0HUuud�I��IdB��uuTVW�(*v�
I��w8�����\��磾��v�=��nk[��o��x���T���Te��njj먫�'
c��PU���**+���#66��3k�*"i�������$I��n�߸AGG'��2o��S�ɭ�z��>��������Ϗ�f�m}A�=EQ���B��M�r�==�(Jx�z��4�rߤ���x��W��o������Z̛˩3gy��w������ddf��������5Lqa!;w��Б#���r��Yf��3ţ*gΞcێ�|>=F^n111���]^�-�fstu�KW���K���rV߳
�^O}C#�y�e�^/�N��b1���5i;��_��7���t��)�ljnf�������'HMM!!>����PU]���h��`nY���NUfEU���+tuus��	z=III���\�v���J��k�7��3��Ol�t���l���[c�kB�;n�,�̞M[{/���h}�zy���?p��_y
��Ɋeː$�c'N�λ;&�'9iz�|�9v�͆�d��
—����_���̌�����\�z�ܜ�;Z�߽��rǷPQYI[[;�iiS���rq��I�23?v��θm�z}|��G,^����dZZ[�7����m�ч�'��pp������?�C������ޮ=��?`�h�wא+��j���.384�C�s�����"�RS),�g``��B���$$$���BSs�W�D�����ĵ��l��mm���(*�����pp��Qڰ�@����cтS�i��X�x1�W������͂��(��g�ܹ��n��[X�b��W��\�
�V˛oocVQ����`1[��0:��������K���A�z=�O����W���b�R�����&�ɲ<��::;'|W�]c`p��F��с,[��‚|�:�.˗-�����o��G���:˖.�+����+W��/E_?v�
������9~��@$8����q:HMM'�
C�ڻ�޾>��k0�L�����_����~�F#�a���*I�I\/�Ay�
�kj���O9���*�7*8t�}�����O����
��������C\/�A\\�/\������&��������466���4�2�===��p���7��#�+W�q��|�Hr��W�r���kׯ���;!���ப*8�+�����%9))��[�;������d$/7��v�aמ�����v�	������(++�����B||/���Μ%;+���^�Vi�i�M&J��i��2�oA~>�--��ŗp{<�$%#IҔC�&�qt���r��BA~��K465���1�����w��9YY�凧(S��b2imk������R$IB�ev��ͻ;ߣ� �N7�Ɨ)^�����b�:sf����"3#=�Q.1����o� 
Ml�Ѥ�11ڤ���]{��λ;�����ݺ{z&~�fQ[[O__�-��3�ra�X�=hI� ��>��o�.�n�j��
F�����'9t�qqqhuZj��yw�N�~?���CG�Ny�	"IV,����x�m¡0�F��`�����h4�F�~?om�F[{;�MM��W�)�ǫ���w��l6c0P�` ��dd���8{�<���6��}��6$)rKT��b��y�whim�7���pRY]Î]���~��ձ}�{�����8~�${��p(̎]�8u�|��/^D�$dYB��b��m��w�h��hR�����������	�`��_}��Y�(�����II�GܘMf�\�������?RS[�C�ٸa=��ߏ�h��vc6�'�?�9p��6n�իy��w8x�/<�l�.Y-I�F{�������g�z��G��o�J��']4\�|yB=�<Ȳ�K���'�	����+��ΊeK�[Z
�A�g�ƍ����o�A|`�m���eJ�Y�aph�\Ȳ��n����p8�,�x�^R�������F�a�d�M$Y��F���}�ф��~#n7��1�\Ó�3��n�yo }l}�CaaY���[{?�y�ص�{V�`�ڵ=~���,�y�IRSS8q�4ܿF�x&�5��b1[����
~����� ;3���OIq1�������Cq
c4y�IMM�������Y�EʽV~���\��(�� �=����]\�z�޾^�����<�VÒ�Y�|9.^������*��rq���h���E%;�=J�ȱc�ZV,_�C֣�j9�Z��U+W�������G���Փ�D�$J��&3#����G$�}�XQ�8�������3��A�Y��CG��`�<t:^����~<7�,����N�������ds��u�z��HnjE��S�������'
��h�Bttv��ۋ�磽����t��45�000@}C%�ńB���}����h@�����n�S�e$���B����q8��������2���V����v��∏�'
F��I��WUuB=��UTDٜ9=~[���N�����U+���ÉS�	�����O2222��5
�-��$3#�M7�k��]���ttt����^���Y\��3O>I[{������ttv���A_d%#����	��?޴O�%	P���#99���x**�(�5+:yq�hlj&99i�/V��GQ�(a%қ
�*P[[GJr2z����Ajjk���'384Ĉۍ,k0�L465
���J��NeUM���Xc���b�<��4��b1�P��׮3kV1��XE!�(��� IJJd��2�r󰎦;o��n�j��(a���Q�ii�F�kj��͡��������#z�����h��ή�q��ܺ��TW�PV:�S-S,|�&���lX-VΝ?O[[�X��s��Φ����/Q���#�6���E ����\lټ���|���9{�7*+Y�|+�/�m�r���`"�2��Y���r�z9�l��^����v��Յkx��K��ٜ<}�Sg�
�io���eetw�p��etZ-O>��m��ڳ�yeeр���B ��s룛����ĩSʜU\DSs׮_�h4RSS�^�'11��;��Б�TUW�n��^�n����Df�8u��z1�����nN�:��b��#++�����O����#�bnY)��Y��G���Lj���� ��s���k�/K�,��Y��g99�tw߬��O>��`�}����������˂y�(*,����f}�4�_��"I��\!)1�EPQY����Í8@[{;�,��-"ǸpW)J�I��%�0�8�N/\�^���Ņ��wv���L^n��������v6<����4�ڶ��W�p�F.��իV�p:8~�C.�Kf���Ccc&��˗QTXȑ�ǩ��%#=��GFz�h �f�d��s��%������m�	�F�7p���艭[�3g6�mm\�r���·��$;+�Ԕbb�T���r�HMMa������ys���^~���\���f��{���Χ(
��I\��f�GO�򄟅�a�z=�}}�������'7'{�{UUEQ�h�l���i4�ѫU$	UQ�h�	=zY����7��l6��-S�^_ϩ����Y��e����� �!r�]]��={�����ؘ�T>����D@d�/�4!���|�6���z��b��FF�z��>��6�sWFG��O�������D�͆V�i0~�|����c������ۿ�g����8:�.G{�c�UU����j�鴄B!� F�!zn(���0��z�A�a�^���։S�o�=
����c�c�2ǟ���@�{>1;�T��o��>f�aT�F�yB|<���͔�T�S�{�ȓ�5��[��'��veN�^��@jJJ�߅���ѿ����OSϩ�n�I����������v�����v���>���W�eL�eY �+��0mƟG�}���Z���S}'%I�d���V;�	���g_�N�������G����ɷ-�~.�wee	�Fs���X,�/^ȧ �c~u-�?���B�
���O�CGo~���7��.)%w@LL11�`��C�ѐ��������7�X�_Af�Aa�_Af�Aa�_Af�)g����HMM�0�'��r�H�����h ��t��)��׏�Y�$R�
� �W¤�پc'յ�(�BɬY<��Sttt��^��t�_��_���}�����߾�������#ǎs��!B�0�i�|�{�	�/M8�FE%�yb�oaƪ���n����|��t8�tuuO�f]����V���Ͷww��������ի�Z���DRb"5��,[�$��������mCEe����Z^y�
�}�)ڰ��'O��UT�E�)Q�u�������'.�NLL�
���� #��M�4���Fӓ��A��'���Lk[]]]����hni���� =�}H�DccS�X������OSK�--�Bll~���7��w
�/��%m�m���$��O:�{��&�
��oh����N��h������N���y�m��M]f0d����=��őw����W��Z��z�V����B^{�-4�̪�+1��b���C��1��9����Y�hMM�HH�wvb6��SR��hd���5L��Q>���~������gΜ�tw��{�dge"kd:�:y��m�ΙM[{;�my��˗Ow�����5�)ɸ=�R/q��{x��7���g\�v�k׮�|�R^����֭467q��e�23�����.7**����h4r�z9�޳���^�� �L:�<|�K��D����Km}�/^">.����'����������m�+
]�̩�55�**�f��^&M���줹����wx��g���������A�#é3g����0?װ���N�/�`p4#�Ba�Z��$��͟;���B�~�	敕q��iV�X����ܷz5�����Ώ�}֮Yñ�'"ɐ�.Q�Vã���S�hmm"k�KD�C!23#��������斕�?�)y���?x��'O��z�\�z����JKٸ�AX{���U�� e�7���AN�>��7�?�	&��s�/`��yl�~��PV:��N]��``Ų�̙3���-��㦻���iR�xx��MJr2���(
��χ������t:ILH���d6���a����h4<��c��a._�BNN�o��ka_���r�����C������Ȉ�.rH��J8���`4��h�pWI���:z��%	�ٌ��chh����h6LE��#n&�q4��J0���b�۱��dee2���^GLL�,�`xx�������x��h�l_�YN\\F����>����x<����Ac�H?�L�F��롿�ABB���@��ä{�111�}~�;FyEK/f��%\�v�7��FOw��ͤ������X�d)��lټ�E`�Z���p��|6m� �	_�F��� 'N�")!���9?y�s.
��hd>��--��GI�
%?���{���znTT�l�V�XNMm-�Ϟ������d��y��}��j?�o}�ήN�;A}c�$��㏑����W���D��PR\��(;q�F�,K���q��yΞ;���q
s��h9O=�8�8u���/�����5�QWW�ܲ2�v;�`��)33�������Q2��xb�ke����;o��}��Ĥ�/Eadd���G����x���ը�b������룙��\.�qv��*�t�P���ш�``dčk؅-�ƅK�8|�(��$'&;��f�����ãobVQ1��	h4<΁���s�/�;V�����w�����0[6?Bbb"f�	UUq
�x��X��-�����H$��V�����ύF#�w�`xxdB9n��`0��bA����x����˴��x�^�� 1��A��������S?�/�򤓥�h��'�V�>�o'[��j��Z-X�����Fr���E�0-T@�$���H�h��l�eB��cUUUbcc��ʊ�J�$l��Ϋ�$O������љ�����2!o|}d���"����&Ί�7�����DoD�6f���y����~�c��ի����_V9�7���7NJr2)�_ޢ&��Y�t:̟�����X���R��e�#|�.� � � "�� �"� � � "�� �"� � � ��#K>QU�P(��(�]�h�n�(�]�_0��vTU�������'	�ÄB�;�� ��􉁿���_��[\.���&��o|�
�AZZ۾P����|��G���ގ�����3<|����t���w��{�^~�ۗhjj��;EQط����O�<�k>����ٵw�2��n� �D�f�4�Ʈ�n��\�]U�kn�Z��`���z{{��t�\�?y�˖q��I�RR�Ύ��(
�2<<��dB�e��==x=�f3>������uu���k������ttv284��h��^5<<����h4244�?�h0088�����@QU~��+�z���t�
5uu�bc	CX��	�Y�===��!�f����z=����Av����.�����r��p�ή.�8v�$��a��hkkgxd��BWW7����?o9�٤��q8���`4���tx<Z���m
���̹�,Y����.���`0`��p�=h���z�CC����� ��C�z�Z-�@`B�dYfhh���.B�f��k���f�裉GN'���M4� 
OZUR�/~n����dxd�_���������cl��I�������6�[�,k@�,A9~)(���?��?���q�Y�pm\�ww���E8b��eX,�x�m232�x<T�T�c�.��i�]/������D6���".]�µ�����
����_f���\�|�8��ʪ*?��z��T���>����ܜl������|>�� �~��z$I≭[��+�RTP��W�w8��by�g0�"�?y�CG�`�X"#����SS[G(bͽ�Q���N*��q8�L&��ߏ�b��vS�����>ëo��e���㏑���o^z�VK}C#�-��qeU��CAdY�ҕ�,_��e˖�ۗ_��p��yzz{A���T���s|t� �u���!�[}�����_D�ӑ�����m���h���f���y�m�:�ϛ��f���INJ����s�2��d��-� _���s�Q\T8����6�]�NcSf��͛6�͍"�Ť����x�[��ٳfQWWO[G;�,E�ꎿ�T�a<>�y�}��\�t���S��ȏ���<�ԓ:r���N�����O~ķ���b~�����G_?Z������-�d�,zz{�r�}��tvvr��)��g��b��y�.)�^��-Ue�ҥ����ؘZZ[��>{�P(��짬Z���|��x�X�|9�޳���1��h���ȱcl޴����G��������G���PU����P:g��E<��dfd�zQ�Ƿl�٧�������78w�)��8�NN�9�1�L��?�C֭��%1^�xQd��<�W�b��Y��p���&�\C��d�r�X�|�o~FWW7���ѣѺ>r�Ǎ�9@Zj*�.D��O����E��o�d�">L����>��@ ��n�~��ϛǾ�ļ	�"�Ɵ���O*sVq�������ƍ"��ۤ��z9{?��V�N�#
�	�(�J �ރR�F��j%.Ύ,I�$YB��E��TU�l2a��M����>��,_��m;vp�ܹh=��HOg��(-)� ?�ww�GA~>����t�Hnv��$:���A��d2߼W�ь�N�CEE�$|��@�G-���&�cS�P(����\�PI���XY�x�7mB����iuhF���j4�-f,f� �N�#6&����+H��$I���c"�(G�q��8�Nv�����B��A��X,��v�z=^���u{x�F2���~@ �׿�-��t{�ۭm�#��EO��[l,��1F'�
���E΍�����?�$��j�2����IC�a%̈�MwOO$��Έ{��|��b�����K^n���w���$I�*.b�q8���C(�ի�����ى�(�����v^~�5{�Q*�����A#k&�B5
���ĩS|��/0�rq��i��{��V�A��a6����eێȲ�V����iu��W._NMm��_~���e��
�B!�zg;���h�k���|N�:�;y���0X�V�Y��ww�GjJ
}��dfd�b�R�[Z�x��l6$IB��E���AH��!��VKIq1%���oh >.���\�-]��W���/tuwS:{v��������؁��,Z��̌t>�h?O?�8H�@��}���0��l|p�`p\�촵�s��Q�n7F��k�[�&��ݻ���m�������*����W�����-�<2�~� �I_���+��g������W�"�2F��'�neVq1���d4�~�:��J�\���yK}`�$%&��}
���A�6;�	���(
v��ή.�r�Z-�\.��kV߳���B
�1�Lx�^�����d����p�=$%%"I����RSR�����NnN����刺����d�p��~))�h5zz{I���?:Y&91UU	+
�qq�c6��Z��򆆆����j����B(���	�߇=�Frr����dYCFzZ��hm������TTU����?@zzqv;�}}�����h2�v��#΁R��q�imkCB"++�ؘ�������d2�ɏ������D�v�������i�����{R��Y�h!yy�l6�>��u�HO�l6��܌��#+33�Y|]E�$1[l,����JuM
{?���[%3#���Lq��/znLOKchh���A�$4�LfFZ�����V+)I���

�'�����Ó�g1�r�W��o���OaA�t�ӌ���_����<6o�D(���߳z�J����W‘c�9s�,�g�����ĹQ�*�_(-��`��c�ۧ{f�`0HNVKE��%�իV���5�U�����F�EWH�>��(|�}��p��M�Cׂ �g��ol�� � |��(� 3��� �0���/� 3��� �0���/� 3�'���|g�##��g�5�_h��p����/��t ��044D0���k�=w�����02���}>oo�N[{��ߩ������O��7/������y><��q�(|�[�P�>͹q,]� |���(���000H(bdd�����*������`0�����F��*^���DV��\�MM��~EWw7�PUUq�݌��L8�P4�Z(t��`�@ ���E���/_�-U�5�kj�p�2=����'oEQ�x<ь|����Av�څ�㙔���v�����˗BUU���q�nk``�m;v���]80�v300@8��chhh�8�������������o^����?����+8Z�P(���P�[�6��.��k�P�@ ��p8zQtk�F���=�h�����΍��I�A�x���<����k�s��==��ƛ������#˞J���	ϐ{<~��#����5k�g�
�<D��H�ă<�xoV����]/��v��矧���3g�a��y�G(*�,oy��Y��y�gy�W�h4�����^'?/����;:8~�$��<��� I��p�3�y�?�a4����;w����c[���
I�ظa=/����?tG���Ϸ�.�������>�P���dY�,
{�,�,��ƍ�;T��"�2�P���d�|�E�5<̲%K��&v��Km}=F����x�8����|��~M�ͬX�,����_��tvv�e�fjj�X�b9s�Jy��7X�`�����(�3�G7?©�g8{��,�i����x��q8��fg���c2����\::;yw�{��{ߣ�����j��(o��N�
[�Y�|9�ŋ�BJr2�=�46[�tW_����;/|�Y��ʬ�����#X�FFF��f�$��L����v�?�˗.�¥K4��NX%n��1�P��vV߳�͛6�o���<ʼnS�y|�V�]};v��EQx��xⱭ����6YY�\�|�N˚��%))1ZvNv65��TVUQYU��ܸQAcS��i���Q2���"�y�	̟�ʂy�7?�	�@`B���s4���{/<Ovv�v줳������Ս�ba��%�X����x�����e��l�R~�����MOo���!##�Fî�{�SRBQaO>�V�����֭]��7q��EnTVr��f�����G9w�"�CC���~��'���`�|���x��'X�|II��:s�����HIN������B��.^�‰�'ٽwo�n���޾^�]/g��Y,[����v��麟���fxx�޾>Ξ�p�
׬A�$������廿���~N�:5�Uf�/tn�����N*�0?���Rڰ����w
�ۤ#��������EU�C����Ŏ��Y�:#=���R$	���1����搟�G ���OvV�II�uz�㱘ͼ�ܳdef���ۨ�������FbB"���OaA>���~�}�23#Y��ł�` �7�{���b���c��n�w���@Vf&�99�\Ä�aE��Q��uX,fL&3qv{�!##[l,z��ۍ���&!>��`4�����`�h0$''����F�����300@aAŅ�8��v���0��.��3z��8{�������������MJJ
�,���BIq16[,�m�=7�h�r������Kcs3o�����t{��$I(�n�H���׆�QY�HJL ';����I���N�B�Ƅ�R��1���&bcb���+x
��C��RXP�V��l6�tp��4
&/7���t$Ibxx�����(
)�)�_����&�y�e<^/s��HIN����p8��jE������q���76��j	�B8����t̛[�?����?�w�\.����?��?G�ס�*z��Ąx�۽�F�	)�-Y���_}�߼�2]��ܷ�$Ib���ZZZZ@�\p�<s��|��D�����9l{w��tuw����ܲR��4�iiiH����(�������WQQ�‚|�rs�$�ĄR��IH��7/��+��NUM-�9��!1!���w*%�f�Ǿ������e�� �Μ����` Ⱥ��3<<�[Fz]��457c�������?0����f���Bl{w�
̟�o�
���:�5���株p�Z9&����z�{��鮺0C|�s��G7���LUu
�,���Y��~2��8z�811VV�X!z���y��2//��@jJ2V��N��U�(+�CVV&��X�.Y��d&'+�ل?����$$$���˦�HMIav�,t:-����
			�fg�����l&?/��Lff&�0�a��E�\���+ݤ�D斕R:g�i�̝[���R�i����R2���BZZY�Y���GrR���X,f���(*,@��Y�`>˗.%#=-�e�g�Jf���CJr2q�8RS#�iI�(,(�b6���Y�|9�g�0����,�23IHL$3=���,�23��ͥ�����bbb��ʢ� �Y�E�F#Y��dg���M `vI	�. n4��V�� /��Jzz6�
��Mo?�<�Z����N��h���d��
�dgS\\�[ff&v����!z=?��쬬�m��d2������t�l�.^Ĭ�"rsr��mÖ�6NYY��]���%%_�}�^�й�l&3#EQ�hd��v���dfdP2kv���$��,|&����/��/�s��?�0?��iFP�_�v����|�p8����g�=����{��z_)c�B��·��*�#΍�Wї��O��RVZ��*�y�@�FÒŋ�ȐbIq1	���]�����l�#��87
_e_��/�}����"�	� �ڗ���>Y���
� �g%��
� �"� � � "�� �"� � � "�� �"� � � "�� �"� � � "�� �"� � � "�� �"� � � "�� �"� � � "�� �"� � � "�� �"� � � "�� �"� � � "�� �"� � � "�� �"� � � "�� �"� � � "�� �"� � � "�� �"� � � "�� �"� � � "�� �"� � � "�� �"H�4�A���ZT������Lw�A�@U�;R�N���穏�
�_�G�AmRbz���;�uA��TT�G�q��(��-I&�	��������V�l6c��5���Ok�$	��LB|Z��LrR�W��HA�f�$	�׃N��tv)Y󥖯�*��~�z��L� !!yoc��az�zqSTP�-�6mu�>�r���2�GA?�V�E��ޑ��``�� 7�V;�qM��`�X����l2O{�k�߇vZk"� �(���(�	���FF�G�>�$)r{cl���.7�D~A��
�����ԢeYFUU$I����(�t����� �]���(���$IH�4e�8:;~�{n
�ce�.�Ox?m0��P���[�s�w"1e�_�o.��q��?��>�6�_A�k�z���$I8�x<�����'�-��p`0��Z�H����f�a4�e��_�E8��`6��h"�
�� ����<!�*��*Zm$\ʲ���AQ�&S�����q�_�$�>�B!�ZI�	�l�)���� �$a�M��qKS�v��E��'� �=���cg���?�?��鉼lt�^Qv����K��$	Y��z����4777{�cC��I�����.\����\ 4�4s��U|~4��_@�$dI������v��8�N�������P8}��“$	EU&���f����=���]����Ceu5��e9Z��G8z�D�������	m6v!3���vV�D�_A�����m_?�]�bc���&-5�@ ���JZ[ۨ��'7'���+W����EgW7��N�x��p��F�������d�� ���#�*>����AdY&��V�epp�!�ǀ�ق^�'
��с��1����$����+:�.�ձKUQQ$%���P���t����b4��0���C�O~Lww7M�-dge2���P(�,�(�B{G'�5����SV:�s.�}�.�{zX�b9���׮��[ZJBB|�.��G�A��f|o,��֑����5�q�z9�s�/���!�z=��ɀ~���W�`0+��e)�m�	��Xl�6Z��h�� )1�Ʉ�秶��!���~�Z����	UUF��%@U$)2Q������!�^-��SOXT'��#TdYB��`4X�d1����;��v3�r�c�n��룽���~�z���<|���/������p�G\�z����؉���� z�� �]"�?�?
q��%�{��|�74���̵�r�,Z��5khko��rQUSÓ�m%3#���J��p4�ɲʞr۲DfF��՘�F�2����r��
��ܨ� c�����$�~T@�ף���쬬ё�~LF���8�o�կ(�
UQPT��H������IlL�}}tvu3vc������FL&�'�mm�{�*�23شq�$q��U�f^����~ܞȂI��FG
Ĭ~A�0�_�$����s�s߽����022�
�V+-�mtuw3�r�����`���	�^���AU'�P����WQ��D��9h4�HUT�^O(�5<�?�G��`2���uG������{U�>����Hum
CCC��O����� �c��U�����م,�>z���222p8�,]��U+W���F��n������Ǽ�sY}�*��	�C��ZZ[IOK�f�%#=�ys��h4�M�輂�=~�~�LLL���AA��$	�O0Ġ7D{�
��HHl��9�Y�Mf::;�g�
����V~���ܹ��*.��������%6&&�B�C�.R�����%I"��H�O�d4���+
n����d�-=}�x�^232ILH`xd���^��11h�:�:-��V�-ֆV��10@0 ?7o��@���9@�=~�#{n���ׯ���D}}���lڰ���zz������ ;3�A��K����&!!���o���MRb�iir�����rs�)����������]���A:}��ZRR�X�HA�f�$	װ��K�5涏�������z����`�p8�N��h��m�--C��G����eiⴶ��@䑿�3�UU�N���q��m��ס����͝����Em}�9�G�F)$)r���}>�F�n�B �N�C�ь���T��!G�
���u�	��\(//C�� �ݣ�����p�\U�	n�kǗ=�s���&�||`��y|I�&�П�����q������<6�?�o�����zD.x��ߍ�~�a���p��m���q>A�"��ݡ��#���Y����b�|쨹��u���'�D&T��/� �5���+��~�NGB|����ؓ_�b�~A����{��СP�kn��V{G�q���p8�����G��ƒ�|�P�� � �p穪��`�f�E����`�Ŋ�jexd������e�z��
�y��.�$a�X��ƈ�/� �=1�bcb�Ho<�t���TU�$��EQB�д�G�$dMd2��� �]�q)h�h���>t:�IIL�-u	��COo$�P�݆9.��G���0N���a�A��?UU	�d��GS�N7��D�Á�BbB�]�s�q¡05�:�A�o�)�͝ƺD�?���$	I�A�A�o�龱?U����í�_A�)���1�/� |#L^�[�w,�K��*
E�П�
i,�ާ��j "Gg�k4�/u^���V�� �S�޾^�GF(*(���UU��Ι����އB!j���L�f�Mx���AQU����	i�<~I�hji����р,����`��M��K]|��^�����}>bcc'�g�]"�� �X�P�`08��!E������F��H���#�5�������1�L������j�q���tX,��3�%�|�/
���DVf&��mttub����z�z<�L&�F#n����F���q���	�Bx=�&f�	��G DQ��p8���'6&f�ȆH�#� |�M1�.I�]]�;�1���!�

�����@ ��x<�7ɉI���w8�iu467��� 1!�S��}���1�L�=��t�A�rsi��k#1!��:R�S���G��)�/����>GV���˰{�޾^�&��/f�� �h�b�޸c�ٙS2��DGg'}�}dgeQ�_��b �ɾ����~HNL&?7�Ǎs��F����p:���TsTU�5<L}c=�����LN#�4-�##���I����t�p:�iux}����׸\C�J�=�ٳJ�L� �f�����0eeD�A��
�:��ϋ����v�#��%)�l�س�r�P�p8��(���P�р��ȒL("
�Z���LNv6�)�S�ێ:���1{V	���$	�NOL����|�6q�8B� -m�$�'��j��t7_c���K�A�� �%�<�H�9"�� ��XUU���'�CM]-U��h4��PU���*k��d�e���JGGu
��Y��H0&��j��b�x=���c�g���t288H8<y-~�6��i�ZdY�k#9)����6;F��~�#��H�׏N0T���#1!øר��F�E#˨���$�#�+�h ����jII�Wf�#AA��E������,dY��@ ����z�Z-��
��~2�30�L�u:TU��+h��:=�P�ߏN�C�բ�h��`0D��|H
�	��%I�������	q6�HHB�բ(
�`�^O(����h0ȲL8&����o�FQ��#��@EQ0�V�r�8����'� |3�n�Y�1�0�����yv�m��r&�i�#tz�>r�\U����;�h4�X��E~��t:݄���������'lO3:�?��z����z�3v�p�F�_A�f��%{U���%������g�4"����@���f���K�|A�o�O�:��f>_�Su�[��_A�ړ$	�����c6�?��@ �,�ѿ����
~�N'� ��'I���8�NwuP��X��ۇ,O_�?��@K�5F~A�d4a4�2O��_'�b�N{�dY���W~A��֤9_�W�^bAA�AD�A�D~AA�AD�A��c'�I��,E2�*���&Y�#�1����]���͉�p��m$�(���6~YHAA�>6�=q�IKM#3#}�z��y�!���(
�mmtvva�������bA�eTU�v1s,^������|���` '''�����޾^���H�����,+_���� �p��(
;��E{G'	�q�74��㏡�룽[y4
 LL���EK[+�-��}��F�WM60����o�s�n2���|֯� ?��g�Q:����xE���!�^/g����@���h4�<�c#*����uS�߽��P��$�v;)))X�����uhh�W^���=�V���m��r%IB£��SSWGccmX�V���q��/�>�h� � L嶁?R[W��5�q��Մ�!$I�¥K���g�SYU��nG��r��U���s��/_fph�U+V��t�
z��%��$q��u\.F���JwO7es�P��DF
���1��<��3є�����[,Z��uk�`2��t�
��0�\jj�ؾs'===ܳj%N��ĄDlv��U�p:����b1�x�"ZZ[��J�D�^�g��u,Z0�;q��M(f�ҥ\�v���j֭]KJR2M���74ggni)�tvu�WVJk[;�mm���a�X�������ήnIMIa���B!.^�����b��p�<l�2H	� ���~�,Ei�ZTU�ĩS445c�!.�����r����ko�MFF:8H{G'��q������J{G'��$'%�m�N�>?m�t��g����~GVF�Μ��@��s��-\��`@UU,�U\�t�P(HJr2���.���L^^�@���n�{z�t�
f����Frsr���`�b2��l����������QU��DjJ
5�uR\T
�������7*����f��/����j�����#�CuM-��g��X,f��o04䢡��3�Α��L{{;�/]"%%�k}��\�|�~���W����ʁC������444þ�(�ULbB�t+� �ט$I��������?�1驩���Whhhb��ETU�P~��AOA^>�yy\�a�vsf������7������������pX!1!�x�‚B

�Y{�}��>�~?������g<p�N�=���Q��GVFܿ�YEEX�fB�^����r�����`ӆ
���Ewvl��`�� ��`�V˃�e��
�av0
�߽<��ܷ�$$X�nK�,f``�ܜ��S��}tuw���³O=ź�k�r�>���ee<�q-m�,[����7�h���)R��ٳy���$&&���NeU�\Ǻ��������� _����A.^���h$;+�F���t�l�<Ďݻٰn��9�KHIN�w��l2������0M�-�z���3����L�V��lUUFg�D�LU��H__?��	�$'1<2���JKk���8|��斕��Յ,����)*,@�ezzz���c��FUaՊ�wt�wIKM%!!�`0DaA~t8]UTdIF�Ѣ�*�PE��D����
ᰂ�($%$P[WG}Cu
�$$$��jat.Dll
��̙]��h���aۮN��d4r���F�����A�/͔C��$�z9t��Ϟ�����K�b�2�f3�}�\�^��=�N��ȱ�\�|���=+W���Ik[�--�-+#.�Ε�Wiik#66���D��JKq8��s��t��0����@ee8@EE��Gy���x��0W�^#>>���8jjk$!>��K(�� 7'�8{gϟ�����DqQ�O�������TV�Xε��t��0{֬h���������j:::IOO���P6gn��P(LnN6gd?2�����K��#���`6�)��'11��\�|EQ����B���CNv6���S2��s�/RQUIGG'+W��LO� ­Ɔ��ӧO�%%%SNS��"2�� 
�M��F�!��y1�|��(
�Y���az}t��cwc��:C_UU�� �p���F���3
!�2�,��x�e���C�0Z�I�p{<Ȓ�} $�c2��j���4 2��G.I�蜇�O2��|�gc�h44M���߇B!|>�[�$dY���X9�����k��G?$##]L�A>7I�������eY�d2E���*�@����/]�^�#�(�tZt��I����Nz~lq��v���^���я���f��f��t7o���~B}n�.
�S�y|�ű������1���2���$���.]�L_�����h��� —�s��]�����O�J~�t�����ʂ�|dY&'';��� � |Q�9��t:f���`t��m6����x� ���\=~���A��
��xfLA��N�����鮇 � w����w���$E%tEXtdate:create2013-08-30T12:28:19-04:00��(�%tEXtdate:modify2013-08-30T12:05:05-04:00xȦtEXtSoftwaregnome-screenshot��>IEND�B`�site-packages/sepolicy/help/system_export.png000064400000147732147511334660015526 0ustar00�PNG


IHDR��ah�>gAMA���a cHRMz&�����u0�`:�p��Q<bKGD��������IDATx��ux��gf���KӸ4������/�.�r������(�Rw�T"M�$�{�����i��-d>O�<���������+++����$ٖ
 �P(��a(
���lkkk\\\^^�T�)
��m����`�!??߶m˲u�(
���0Lnn��իY۶�e!�u�(
��b�&�q� �c�)��B�P(cl��%�T�)
�oOT���w:�K���ɀ���n!wc��cF{�x�ϧ���B9��$�t���A4�aƘt8��B0�m#��.�K����ܕm۶m�%�!l�D��_q�GKB�� ~k�(�?rs�;����`���	��ؔ�3�;�bXV�c���e醉Ю��ʊ�q��4MӲx�۷�S�B�1B�罎1�� �eY�0��&9@QT�cm۶1&���B��e[�(��I�
����ݭIDۉ��Ͼ7쵋B9X��Ų,BH�4� �m���׻u�Vg�n�mG�� �l�滟Ϝ�p��i��?x�
��|��7$I
G������{���dH�~�k�Ϝ3��o�1�����5]g!�k�.
���aB����އ[�;�YW�{fz�!˲�a�e��G���3��WT���(�"I"ƻ�^�4� �ei��t:���Ͽ�ᡛ�=d@QSK�?Z��u�5�q�0�mY�ap!@�1
ô,�㢻<�!�0��iY�%�X�4�a��5]'m�rPضͲ�Ά�@(4���`]���7=5�0͞ھ��:�C�#�l��p,^���og�w�}�w��"�����t!h膢��eAu]�JO;���i6���eUMKI���է�$wu4]7M��ؾ}R�jjB��	6�������dHQ�-=��׳���
�`0;�/�0��ʪ6rȠ��>�mJ�W2�,k�vZJ2������r�<n�a����P�y�u�yٙ������8�?5)����vXV�b}^OCSs[gW�ߗ��� �kh�U�cY����K+*������Q\Zff\�Pa�c�B�Uض�2LcK���}t�ygA�����?;+=M����Ѿ�O��C]x���4S2�|���#�O5��l0~��/dY6���Q��c�2�)�������sx�eQ��y�'���W_y��/�^U3i��������z��8����M�m�E�_�|Ֆ��`(|��S�\�_r<�\s�9,�r,�|��ڝc�z�*kv��ޔĄu��>p�I��gn��Ҹ@VԱÇȪ��{�TV1~�eYB3�̫kj������\SS�q,��n����j;��/=�4UU_���ؘ���_~QCs��Uk�'uD~v��o���֧ORb��잣[ʯB(+��!�²��{.>��qÇʲ��1���S� ��e��y����X�p��s�=x�9��g��2�F�9�e9@t�9'���W��{�<�9˶�O�P��|�_�SƎ��˒9EU
Ӻ��s�����6c��iS���g�[�v:$AX�qsYU������!�����n�391~؀��P����H��7bص��{�5W@�o��>t��+V�\�	B0a��O:�(���2vԚM[��Y�â��P��SN8v��}���a޲C��u|������[��*51!"�t��۰m�rH�iYIB���r���X��.9���X���m�+��s�/���ى1v9g�0C�u���چF�� ��ض��1���X�e9��
*kv���clc�
��N���٩k�(�z��'��Z�)��$�M�m��-���<nYQ�[�dY�l�!I�i���{n��s��G���L��q��q=�PX;lȳ��1|`�3��8�����Wޜ:~LZr�n�����]S�0o���C�IN�m[๖����fM���8."+E�9G�tݲ,]�à�!�ac����ֿ���q
�>��:DVU��X�.b�v�O�;�1�vtu:�?�J�O?9/;���!]q˝YuϿ��z<�|>B��i��x\n�mI!�PFZ���C�z�m�ӑ���â�յ��}�c���')��)t���`(�����.���3O{�/Wn�4����c��7�|3waB\�����O�4ṷ�gY��㎞2vԧ��om�9x��i������ 1�?ϱ,�r�q��3�VV�*���OS5�a۲Y����e�����UT-'���I���-e�]�)'O�����]��de���1})��4�cb����QCBy���,��k�'��K�>�0j�C�C�� ɲ$� h�n���lZ�����.��]��#�x�S5�e۶u�x��8�0t]�9��G^�Y[7M7�Q�
bn! Q�c� �r8�~�y���+.
C�̔<=۶�-�gY��@��t0�߫�%�A��@��p�c��!�64��s7\q�Z��#��zX��1@��o`��gY6"��C2MK���1�y~Æ
�‡��aZ��-I�mˊB��{��b����$��qƘAжmUU�!���c�eUey��=V��p
��� d$�
��'l�eYfc[ํ�J�m�ȭ7j��1���I�u�D��V��fRH�4!�>�i���1{Ϊ�� D#
(��
�.���r��I.ٶ��G�_Q5
�*��	Gd�?��=9б~ʟ�8:���ϲ�s+Z����+�8��s�t� �6ƀl��s�eYӴ B��}zFj��뻏z~l��}&fI!1�mU�>y|���$�f�5o���d
�7�S(��:�1�>�^�@7��̌�y�H�z"l�v9C�
,�VU
P�]ʡc:�K�D�t]Q5�A��Ca�f�0���B�q�B~��>JO�?���^�z(�C�O�~�aHHB���琨�a(y�V��Z��~���X���eZ`�e��\10-���ar��e�)����y�;�7m�D���@
�HMI�w��?^��W� ɉ����:US!����닋�;�e�P9lWwWvNv�/ƴL�'H�O̱\[{[EU���V��[c�����PcS��(�C�_���~E���l)�^�r9].����1��˲���+ˆA� �i�g�v{G;жmI�u5:�w
�oY�n_Ht���k7��L�ض�f�g��!!�,�'N0!�k8��P~ ��@���<��(��~��I�z�ސߗ��ύ&KS!��O+�FG�o�D�ٶmkgw����mD�SX@8������wxs�}���B�v��N����k�?"G"�g��gT��a����s�0�m���?B�����5jD�0L"�B�eY	�	�֭��쌏K�w<��o�.���X�AD�>�'Y��PWBog/A<�>Bw
��]8<��<�4�0���Du�`5��6BI�hc,
�mc�0 <p��� ������p!�u�����^?U~�����km-��!A� ӶA� ��}�e�M���K�o�7i���4M�aH!d�6	^ EQ�4���+@{~ƻ�@h�A~2���扣�~,X��^?c��kkwn�R��������4�6�D=�;q<kY�eY�A�޵"H>��� �j+[��X�ʪjA�����H4� �c�|C}�a��}
�@��fY����4͌���n��/+��mۈ!���n�=�!�-j�L��1����
"�0��#��$��ܽ�C�m�v���ϙ��̯O<�����i�nDdY��"��q��5ò��>���$'&vn��AH�uYQ�.b۶w�@���ʥ�o�0��¸�XY�G{G���1���a�v��w���f�S&���w�YcEQB�(*�"�<ò�e1Z�xYjjJQ����N���0ȲlUU��8βl2B�iZ�d �	�0`˲N�u����rs���0ɋeӴH֒$Y��t8��37���/�,SQT��\.���+[��
�M��y�4LM�%I��]��a�e9��,��y۲0���:�Z�u8�P(�۲�R��;w֖��q���5k�d��a�ց���~�1�2-�m��8z�i��lƼ�Ñ��'�i��Ҳ�c�����_l+)�<q��?ش����O1lػ�P]S;m�qc�|9��Ꚛ�,��W�}���;��X�h*�öm����s>���W^x���`[Yٕ������?���B��^/[�@�.��ꚪ���y�T]׷��VTVy�����%KWL7vS�-�J��Ҳ�2~�3��q�N?n�蕫֔�o���?n��M���+dEIKM
�E>��<n7iu �ě��!�DN��D,ò�b�Ꚛ��̌��F�#��?�jkk?j�T���s|8�5��c��n��

���
9�[�tEmm]nnva���L�:%~7{�I����TU]���2
%'%N�4���谰5�P~��YQVUm���ɓ��>l��6N���hѰ�. ����	3fdee^p��?�HgW����/,˚�`a0Z�~���xԴ#��SSRRR��\����_o�2f�ȧ����u��� ���{'7c��W�r:�u�]�*�z���II������`��W���c��(*��aX�eY�0��~S�Lz���{���䤤5k7��5���-�H��O>6tH^nn|\|jjJ~^ސ�����._�j���-�ڱ�����;0u��s~:t0�`���n�BD�"���gYC�9�e+Vn�V:e��
7��X<���1G����0LsK�g_����mni]�r��'���Ͳ���Y��o,++�kh�����k��o�����t�hjjnlj>aƱk�m�YW�t:İ,{�N��BX���h���'�.(�~������ ����J¶m��s�-7���s9r�Ï�=:>>~μy�M�S'O:x��7��9�-�������'���o٢�jiyy\\���~�o�a#����6th^NN(b���!�,��u?���}��w�%�}�>���{/G
��e�(2�0,˜y���o���W�x+..v@��
[binary data omitted: remainder of the preceding PNG image (tEXt metadata: date:create/date:modify 2013-09-27, Software: Shutter), ending at its IEND chunk - raw image bytes cannot be rendered as text]
site-packages/sepolicy/help/system_policy_type.png000064400000152547147511334660016535 0ustar00
[binary data omitted: PNG image data for system_policy_type.png (IHDR/gAMA/cHRM/bKGD/IDAT chunks) - raw image bytes cannot be rendered as text]
���Rӎ��_l��t���ǃZ�4�tE�����ێ�q{����OŐu6‘�(�$F�����H?�в��Ĥ�䔃�M���pls�/��'�H?��a?s���+J��B��	G?��{�+F �ߊC�F�@ \���q�~�@8� �O �D�	ḃH?�@ w�'��"��p�A��@ �;���q�~�@8� �O �D�	ḃH?�@ w�'��"��p�A��@ �;���q�~�@8� �O �D�	ḃH?�@ w�'��"��p�A��@ �;���q�~�@8� �O �D�	ḃH?�@ w�'��"��p�A��@ �;���q�~�@8� �O �D�	ḃH?�@ w�'��"��p�A��@ �;���q�~�@8� �O �D�	ḃH?�@ w�'��"��p�A��@ �;���q�~�@8� �O �̯�@��ߨ[�a�B跻.�@8:�%�Dz�q�v۶9��84Ml
�a�a�˲c���Uض�8�!\s��rw�����~��0���VDZeY+/��A !�)5�0.�;#�(��bJQ &���{\�$I�P�����8�,3� ���(!>@A�O����؎�v�EA@u���j�с�8B!C��;;�����eAl�Ƨ�U(�c��K?AEU���9��UUu@�~�>����U�4MA�QU��r�565-�b���2LTQ�n��8���4-�"P�0�7MK�4��.Q�1s�ko���٥������ϲ,��r�������4�k�e�,�>�̳K�-�8N��[o�s�^����8�eY,�*���<MQ�a�\.��,˲mKDӲ�nnm���/.��|�0o��_7�pݸ�ctݐ�-T�
� �O �1�#�B����W�N�|�䗋
v��8͝7���89)��v�?�/+)�~�bM�����y���MQ����Z��{�%����l���, ������6%9� ?O�uY�W�Z���O�xݵNW[W����0�����;v�dgo�Raۖ?�����͝���>�v���|7of!�d�r���	�-�m�z�����4����V�ѷO�����[*|^�|7��7�dY��w����,;+��T}C�>��rr233�t��1�Z?���C-ۊF�����[��+��>|�9g�D�'�+/���?���ii���|�x鲌�t��w��I��3f���3g-_��v���8c����Ғ�F��KQ0
?��c�yy�s�/�[��O�_�y�_�x��O?�v����7�y�����vG���~�b���t]������z�y�$%&vtv^~��o���G���9[�T�{�YϾ���X���U�`Ѣ����#G���3�7�ؖ��_�r�"��]�ޗ��B=nEU�y�1�[�m�'< �X�E�HW�@���H?B�㸺ں����WRT���.Z��̙�_s��)/�|�Ͽ<y��
�����_.Z\��7|���>}ܲ����7�hiiy�''�?��������߷�[��y��7�~���zʋ�]|����o(+-E�,�(�cۂ �y��?�d��!�����7�yW�SO�x�מ5邝55��[V�Y;�����-�GeUՒ��&��,ɷ�O����|nI�y�n� TVV:��Y�Y�zmGg��(7o��kv��L��2�0�L��0�{}4|��ﻷ{�7\sͤK.[�vmy�^x��H�X�?!��;�:�{��� �'�3��c;�ٹ�K�.[�d��ѣ��d鲆��ƌy�Ï���>��23��%����hmk�0��`0�ݼyq�8�����)OMM1
�0M۶!��(��lBhƊ��
TVIDAT)'L�z���ʄ��J����2��رi��;kz�(kll�-!D3�iZ�i�a��i���Ȓ�0��ի5]�=��V�ܶ}�i���jF$��#�m��D������ض�������b�BY�����)��@� ���o\o�6���=K�0��e^��u=�|;o���!I�m��>�{�}���q�=�^y� �_�i˖�ee'�S\T4b���k���7~��'mm�F�(��;9)B(Kr\�9NB|�� ���,+{��^�<e���ǹ�?��ky��uwiIɟ����'�t���q�g&=5BH3L��W������߫���5=?/����x���}q�EE�%''!�D�8fԨ��֩����PVR2|ذ��r��gegeA�������ȣ�)41��8UUuC�Y%66����*��ŋ��˱B��m;�I.)5%�G�QQ8�UT��v�n�3Ͽ��o��HӴi�4MSe�6MS�e8��‘h$��e�Da��p8,N�0��i� X������.RR�9߾��o�:%%9YUU����n��0����eY!MQ�mc+#��0�J  xus���l�%��^‘ߐ6l�Y�~!�u�qEU!�����ʫcnjNHH�B�mcѷ,��6����� I.l,�����
!E!��[�EQ��i��u)�zi�+[��!��]|Qrbb$af/�F��z���v��BhZ�y���6�7��V��~ޛw�bێ��4�/������뺎}������ǖ�8�m����8�
�����MM͉�	����4�ߗ
��vOvP��*p�0
 ���)�r�.��9җN :��k�n,��s�����w�Q�P|\\jJ�eZ�����
R�p0
c�ߡD�PT�&%$�^�ͯKwB�HM�
,�2MBHt�8�A�?��Qڃ��C��ܾs{(����l�x{����n�w�����w��@����p8�#�"Mѻ'3 �O��r��O U@�[s���Ho��G�P�߶�C����A�m?~���H��>��wk��aY�}~������Oz��?8;�h��e9��/��C�7�W�ў0�غ[�Tn>����p8�o��,I���hIuP�3�=9p��
�� �po�G�`�������H_5��q�"��СP�/�2M?��#���m~�������xo��b��^y�C�C������=�Dzu�X��2�z�47onkk�.݊p~Z��l~,��?����o|���ǧ�r�~���d,%��X&�O����b�ҥ~���{��m���a��wo����0mmmKXAQ�����عs����c��G�
۶9�[�z����8�3���{���1��8���ga,
�q����@��9�1CӁ`�O?3-��\.���v���$	 �\����\.�q8��x<>���y���z<nY����@0�xbNU�$ɒ$�\,˺�2e�,�����?�l��-����mK�$�\�$!�DQ�z�<�{�n�K�$�e��s�DY�dY�n��,�n���g�6mޒ���D^��*��.��ukUGgg\\�,�.��%����lݲ,KEQ�,ɒ�r�۶=n���/��� _t��^�,��x�n��#�\!��+�Z���b`��nYƗCQn@I���$I^�G�eY�w�Լ��IIIE���]~�����J.�,KB�e}^��㉵���٫t<��$I�eܯw�ݸ%�m�1��.Q8A�]u����^���Ƨ`�o��to�#B���m�<��r��E�].�[�9��%I��w_�,��w��qi��j�Ҍp�l[`�1f�8��8q�q�noo1ٽݽ��[z�B39qb�L1�e$�2H��q��P�D��l������PU]]��ꪧ���v���]���r:	�`Y��tDGG���y�ͷ�}��tq��c�~.q�� (���x�m?����NӲ�23OTT�ͽ���W�Z]SW���3s���ӧ�<}���kTE]x�u�~�3�=OB�����O����&��i�pŪU��<(w�С{����[[׬[��[6l�|��Ȑ��Ç]���(����n�…�>Z~��lhA��.�ɖ������zzz}�D�_�u�av�ڳs�Q��^M��~oiX�n_���?���j�hoim[�jUSsӣ=��v=�dIgWw{{�
�];{挭�vl���e�22n�n���PU}v�믝G�dK[�G+>6L�����iSuM���ƅ+׬mii�8a��)��|�ٞ��H�
4���n�,�a����W^C����\3a��<�m�=����}Wa^ޚ�*�'$ܾ��w���C��<�I�/��	��3����M�lٶsgRb��}�ԩ��?Q5m�ܹÇ
��g����䞛�����ߴy˶�(�q��̌�^~�����3g���Tל���ֲ���?i��H�ݥ����=���+=�r�j^n_|KB|<2��,�����2-��}���w޳,���rWYU}ϝ��$'���늢ضuϝw��	k7l8Tr86&�{�޳oߡ�#$I667�9S���q�=wG�|h���k�$A���� Q�gϾ}S&O����w��=�q�'MMMŇS5g��7�y������^IKI�4a�s/�|������#�
�?o�?!a�Y�
�u��ں�^~u�5׌**J���w�`Sssi��ֶ�����^xi�kF����4('g��13�M۹{������3磏W,)ihl���n]�j���EA\�r�Cр�K�$'&͝5���_~�$�������qqq�?n��1c���RSR��mrR��{�+,�2i�.���{����/�%=-��⒒�+W�pݵC
2M�4M��;d��ܜ��ӧ�;p�`���/�����7ߴp�G�:Tr���IQ�=��C�>�RS[�^3� ̝5k��)._Q^Qq��믝�fݺ�ee�W�\��@0��[o�=:#=c�u׉�`朙3��;g�ǫ�TU����&O);^����,>zl�̙����C��N��}󖭕UՃ��z�"���-_���
�M�>�Ђ�4\N׿=����w�<[S[q�Ա�2��y��X�����^��͵-��g���Ĭ�>��>�~˶M�$I���\u����۳orR��1��y�I����5u�d��_���Z����n
K��o�]}������7�Y������y�3�O�����璥9K���-�/�����Z���'+���,>l�̙9YY6}�}ˢ���7Wt��UU�99�&�5b���cnjNOK�,K���̟���۷s�����Gm޶�XY��33��ڳo߮={�}�(�;Ȑ���S��ؖUq�dFz���,���-����H���<8�0
4.��'oljڸ��������@ ��ޑ���v{=����Ԕ���8��5y�D���p88����>t谡���XiYt�o���yyEN�8�;n_�rUũSI"OG�>�?!>!!��tN?>7'����Ǝ3x𠣥e)IIy��������)���J�|�����,�2�$e��
:thaE�G�1|��Q�\���ڨ(O\l̘ѣ��y~Pnnސ!^o��S�uM?SU�F���ض=d𠩓'�|>˲�bc
>�������ܵ[�
Mӎ��Ν=�hĈ��l�0��̌���rU
�+,6���O8[[�455�����f�<%�
�W�mm�I�$Ɍ����̌Ԕ��4E����4~�{ﺳ��m��EÇ�5�k�UVUٶ=|ذ�C��ިQ�F2��8�/��c�~.M�AX�m�{zdE��=�������n���[�m;[[�����X�⋯��a~^^Cc�$I�e1,��{K�OT0CQTsK�2�ƌY�ruKk�sV�^���2fԨ��g9�7f��::��^�u�ˎ>�Pł�������������vuu+���uEQ����Ph��[��himZP ���ɓ��3��(�ӹv��G�8��$��ƛmmm�,+����+I��(m��#���oxw��%%�m��ק$'������#�D0
CA��ή.�a
��_y���<SY9i���fUU�՘����ڶ	�������_s�8EQ5M���V5
�t�L�0���Z��;�**����>[��劢CAI�UU�������|i���͛7��(
uw�H��� Atu����i��Ӳ�{Yt�/--5�1�ݥK7n�|��ڰ��� �@ �i��(�,C�@0��ZGGGB||jJrLt̜�3�N�|y=�}F�m���,K�,���խ�jgww8,���@0HRT{G��^y��������+.)��Ï�x��kƎ�-���SSUQkk��]�>Z,�ٗ�~�U��$$$Di����f��tY�V�@EA����X67'�e����Ta~~^qIIuMM8,�|㍅�yyC���K�|��w��Ʋ���òlZZjMMmRbbB|�mCEQ�l����t�m�fee����ٻ/7'g֌�;v�jlj��ŃrsSSSZ����:e�q����B9Y٢C�'$$��$E
�ͥi:>.>%)ɲ,��8VV6�܉�2dpMm��2Q232�22ZZ[9�VX�����r��RSccb��2y��8n��Q�ss���5���4=��p�ƍ�m��]qqq�m���NNLJMI6M+;+��v�����ŋn��bZ��A�"5FDNN�Cu]OMI	�g*+Ǎ3d������,���8�����cee�yy��ͱlk[[A^���٢(03v���Խ��777���DGG{<�̌�s�A�����J��
<d���G�r,;v̘Y3��~��a�כ����xr�2-�NJLL�����K��SS�^/�Ђ��C=z��*6&&59��e�wF p:�s)!$�~B||�����LSTFz�St�|>����m�,{}�[n�1)1199�����!�_;�0���$���s8����9�y*4!/+2˰hj���2EQ����e.ϰ������>�_�?ޯ���(��$9I���i�IQԏ~���s�Ι�UUe��pB�m��۶9�u:�aIB@i�EڶiY��������ч��ɶ,��p@۶,K�ed3��Z8,9�Nd[�Y�!PU��t�%�eY�$Qǟ$I�ÁÒD����aZ���v�4]�����B0�8�4M�aE!�!��`PӴ���%Ͽ�����F�4�0P��s��!ɲ���u]�u�$Q1Y�UMs�\�>5�r�EA�$I�� �T�eYtM��aQ�7L3C].'Mс`��p(�!t�b0�y��yBYQ �džß�7@�1��NQDz��iɲ�t:)��5�0M�ァ�(
��sI��i��*��(T�������5����]��z'� L��y.y�nEU!�Q��Ҳ�_{�/��s�@в,d�fz t�"q~A���i�7��5��a���n���Fy���sE���_�eہ@��(�i���$]NgT�ǴL4NBQ�i��==�y������rG�&�0���T,�˵��A�Ñ�����B�h�fGW	I�H	��(JV�Ko @Q�r�O� l�F���L۶?/AI�C!��Hd��(*E�B���5kKˎ��C�a����Q��P�<�Ѻ7��~o�"��-�
���$I*��LY��
	��^")��kU��� ��<���r7M���@0H@��K�4��P�}@$;�0���#�^�֌��v�C�0qޟ3EQ�==$IB����i��r���=�p8Ϲ%�QB�{z A�}�fGL�/��b0����.
t�υ��\�˲Uu�\$I^FCFL?���Q^4��e��4M�.��{���'��{�������nJ�V]Q�OM���ź����,$ 	@�s�:�@a��U|A�s؉�\�\q�O��uE�V�W����t������(2j�� ڶ���a�^�D��`�$�m����=�"�ۏ
G@�f�}�4M_��K��s���Ț��"0/ �e[�P���^�[�`�X�1W3�����H��������.d�`�X�1W16�-�x��~��ٶ �42(�?��,���d�d��NJ���J���I�b|��9�
! @�ж����W>(6o?�[�mC���Rd�ݷa��d�G��jUM�U�
iՠ�_��/��}^y/"�e��K^�0�(�a��򼪪�ԭ
��TC+9�C�D��k�O��R^�%`0W6�&�ȩ���C����������Ӳ,������m[�?#�bN�CV�O'�D$I��$���(\b�Aȕ��4M!'�ȽA��Y���$��d���'��4�E䚦���x����ѽ#~��~"����`Yy����=RH���lo�(=~<��OIIn����̈TԿ�G�78�|#�Z��������!�VX�v�ѹ�iE���І6�:ߐP"�I�Dӟ�F �H��s�\�\Bնm���~��?--+9|���!*ʃ\z�^��EA�FE�DG�,k�6E�^�7:�G�4 ���|�>��Lӌ�x"=kQcc�].˲�u�<O�����[w�oh���5M�!�����ömQ|^���u9�ި(oTTDs�#�h����7**��#I��rmݱ���!..N��>^�v��.צ-[Ξ�������|>�㐲G�|>���t�r:cc���c۶�.������C��"9"���\���訨(��8���O�>sf��ݧN�^�⋦i�$�*���+mI�I_�ʦi��������[�Μ>r�TQU��k�6r�!t9�QQQ�
�p��N˲8�s�\111�k�o���!>
h�kp���¶��'~�#�  A�X�*%9y��ao��ޤ	ZZ[kjj{z{�M������ݽr�YQ�͞��M�UM
�ë׮?ZZ�h�‚M��)9rd��rss���TV͝=��7�e۶�S�>zt��}����}����		�n\0}ꔽ��/.3j��ɓK>q�TWWWA~^SS��i��$�"R��5[��p8��O���o�o������^|)!!��;�khx����n���ԔAV�Y������y�9yC���ܸy� ����_�i���'F��f�X˲���_z�`0���~ݼ���m[�ow��Y3A0���u6�?�\��4Ų�K�?y�t~^^T�GEA�[�7�~'
�p���8�/�����ж,�an\p��wݥ�jkk�ko�y��5��'*NN�4q�]���Λ3��t��8�����ysf+-��>����c�e�Ν��x�\�׏����f�l��8y�?��}�<z��a����K�<��KŇ��~�ayEMQ�����۟^�|Cc�eY�Ǔ�

���������rE�|� ؖ�q�ٚ�������xg�����˷��������CJr�(�$Iz�Q�(O\l�Ҳ��}�`Ȑ��|����%G��ٷ/+3�����S�O/�x�{.���	�q���|���Y���)���^���u:.���|^�(n�Ieu�{|�iC�/��r ��3�&%���{6lڴo���^|)o�$AP4��=11��==���5��O>�}j�6�1|�ݫ׮��m���v�EQ\��@ �<+��ƛ��͂ �����/}?Y��'@�m�6���.}���?��s�N-y�ŗ^y�D(z��WEQ<t��;K߯��~jɒ����v-�xeɑ�k7l�����n�����>
c�B@`O>���Kv�lFvV֯~���cل��ce�_~��6���q��)S�Ι�����z��_��	�������Ҳ��w�q;�q���eѢ�쬎��D¼ٳK��4aB��yCv��[[[?c�Դ�Թ�f�C�rs�RS�_s��Q��|��`(t��NQ�ή.��q�ر7-X�z݆�3fh�VYUE��eY� �x�G����Ƙ�y����`fWw�����H7v�ѣ�nw���]w	� <�2	��7̿V���<�g�>��}���ڽg�ڵ��y�G*9<o�l���11yCI�2g֬W��y8
�ȣ�MM)�ɖeEEyJ���hmoG�9d�D���k�+-�m��8UUˎ���Ǜ��UUE/���N�h�Z�ض�����o_�8�,�ߏ�lނ'�w������RSo�q��~��?��#F�~��쬗^}m�AS&M�i�B��IOO�-7/jkk3Met�5m<ˋ�ڹ��ض�����lkk�e�xyy}}������-�Z����Y[[���8a�8k7lؾc�"+�i����PH�eM�w����ЀF�U�;{�q��׿665]w�>����ٱ�Gu�tϛ3{��q��_��[(�:t�P[[{rR���<q��?�Ѩ�����P(
�%IB�i� IR���E#n�e�/�r����Դ���[o^��|�$IӲ�nll���پsgoo/��
���P8�™5���JJ��5M' ��[�c|��_�
��{��N��>{6+#���s�޽k֭��xbbb �==��������I�,��(J0�B�@�$�ֶ�����ؘD��֛��?���L.w#�҉�͞��K����������Ԥ��G+>.���ѱ���B�P]CÖmۢ��Ǝu����#�6l��ʂvww�fZfCC㞽{��չ���fy/�
c0_�K�H����e�+N�:P|Hų��99�?p��u

�=z��PyEŔɓ�͙��Oرk��3g��������QE#DQt:[����RR�hoo�kjjfM�1|��Ą��6���,���5o��nMM����nj�k�>۶�ϛ��ٵe���Ƽ!CdEq�\9�Y�uucF�
����噦�j�;K�/.)Y4b�̙�r���?X|�Ĵ��������{��^3v���;w�Z�����d۶���nmo�~��<ǯ\������v��	����\]NN6������Ξ��z�o���x�{�bb4]w���k���6�0+3B��ZO��h��斖	���74����}�{�/.)�-+oȐ��B�tP�Fd��ԟ�����'O�>S����O|�{�����'N���|�z�ٚ�;o�mXA�i��>��v�����.�Ǔ��ꍊ���*��ZX�2L���"�������Q�,`�: I4�!�<�/~���'̛3�2MYQx��� �4
�x�D�5MC��E��'�+��2�/��77-\0f�(]��̧ �n��8M�
�p����Ǎ,�t]EAQT��I�D�I�dhZ�4QTM�(�eYM�h�R5�e��t]�
C�y@����,k�&�К��s�$�<�9η�y���O���$I�,�ħ(��y�4UUx�<_3�a��~��(�mG��q���<��NӔeY�à Y��_�vr��4�u�!E�cY��PQA$IbYV������Ư~�s�˥j�aN�Y��a�@V�~��0,I}�Y����:��K,��Z0W_#@�eA��#���`(�v�MӰm� `X�h��t]�4�|`?�h����T��B��\�C%������BM�"ø�$��zK��#����K��$�ǶmU��,�M�nu���'(��n�„��J$��[�.���m��|��{��0�e9�4�4�P(���� `��Q-)�B�dX��e
�m���$		%�<�|��H4۷���p��	@�B�uBM�M��B������LI�ІeY=}b��d�(��㇃�j��ռ(�i�=A�4�(����	</�
�[yQI���9M��'��?�H��}�����K���"G?�Ѕ	F��{��S����bc5M�{�Œ
���ы�Ċ	@
����Ct�kF\l��~�C�<g��s!��Ew�|��8s%�M�o����";��� �4�bcQ��%��i����7N"1_4�>=��H:��FE}�ؼ(K<Ӌ���=w�_{�Qd��J�W]�5]W^�2$I�<y���R�uz�a���xh�\�|����╩�ߤ�8�7�m������|G�q1q<��	[�U
�׏��!I2)1�|/[�e�8>���s�/\�\@�A�w��%�
ǿ$̕�ť@���|K\��Wt�0��H? ��*a9�}Fb0_�n�i�9a��E��c9�h��ҏ�|lh�K�A�\Q|΀	H��qb1��	I�$I�.?�J�s�$��)�����1f���s��9Y����o�����(���/�����K�0WB��m��C�m�,#�˲�)�fFUUUSA�Ȳ,�p�s�ҏ�֢�l#G#ޒ#��?o�زm�]x�e�#��/3��-��_���rъB>�/|.�D����!M3=�ݭ�-,�"GO$I���S�W�Q�D�e��C'PS{VQ�aI���F�ח����7%�m˲�y��`�.M�Q���p�TU���EQ��K��A�rq/+n�KQ��zDq�\�,�@�mB��A���i�9��v:�i~�߮�+*z�|���WNi7��~����2����5���iI��D~�n�KU�/]�|��K�~!d������� �����=u��'OU�
ɏtЙχ�a۶]N�iY��yPu��)��F�E�A�"+-3L#5%-���劊
�����q�eE�l`�:.�3�t�����O����/���'Np�ipG�j��x��"�-/iێk֭w:��(wA���e���+�]]t��ꢚ���K/+��e�-۶���(��S�.��/~�g����wlٶM˲�U��dw�(�����G�:��+���=N�cێ�׮�O�@TMC�!��;����ͣ_�E���M�y�TП�z��`�6�0�P腗_Q�8�xնm�e7o�į~]|���ȅ�_۶y�/=~��e�<��W�~�)'�=�P�~�	!�m������Q�F��|1>�1�K�è�������&˒i��Y9������h�����O>M����|{�l���|�{�{�B����ZM�>lxuc����ԇ%�j�*����P��\�ok���omm�y^մg������(��C7�UĥI?��]]]�U��ϿvPn�3Ͽ����r:5]�,K�P(|�P	�hEQ�����XY����B5Mx޲�ŇdY&�O#���a�a�4�������a˶��j��ں�Ӧ0}���ʪ����Bk�4��4�0��Fĝe�c��g*�Uxe�c�4�0EQaIbY��������8?��s�+���g�~Q���m����7� �A0�v��HtB��<J}İ,+I�0y�p8��4���}lmk��a�� ����hhl|罥�ǍKMM�4M���H�d�A���֣ǎ9�H�E��⼮�,k�PET<۶eE�8.��aڲ,���D��<��Ԭ� �e�*]+�z{y��c�eO?�C�_��u仺����ݲ"ٶ�mY�,����{{{,�D/*M�s����8�y��ӕ��6l�2��(�wa� ��Ȗ����Ξ�/=B
Ά����%����ֶV������pXZ�~�3�?��x�� ,Iϣ�BU��\\�X?$I&$ďY���_�f
	���{7l����;o�Ux��DQ\�n��ݻ-�~�bbb�l�����ُ~x�����9kƴkƎu8D�$QwR���<���㍊z�<n7�� $l��X���e󖭢(�]�q҄��öm��?\����.���5�w�CC�o��fY�mEQ\�q��c�rsr����K^x���,I���1˶^x�YQ�.��IQ��#G^{��~Bɑ#��n������ŋ_y���gN;��;��7MSQ�w����ؔ��1�hĪ���Q[[��|?����̢�#>ٶ}��]]]7�t㋯���Ԕ��c��X�zǮ]�	��q\cSӿ?�h~�EQ����W�]!4MS�e�W9v��t>��Û>����Z|���r��ʫ�Ph��g�x�����'v��s��vPn���=z���-[�,���2i�o���EU����뽥�phhA��������c֌�,�I�$�(>�gkkS���M��f�:M�gL�:s�t@�w�[��G͜6햛n\�|��H,�yQ��᯼���3gP��^���䰢��أ111_2%
�	`C�FZ�~˲h���Ͳ,�C|�'?��?�����~(7'��7�,����(
AEQ�"�����%�<�x��'�sl{���Ot]�;{���?�햛y�������_�$����[���\�\��
!C3�UU��~���{�]4M?��+C�i�~����m����1���ؘ��+W��z�ر��tM�x���>t8�rs�|v�$��:7����|�Š�'s�L8��I�D��ȢI�O�<���)�4M�:u��O�:���z��+��ۿ�S&MB�(�ׂ������c���q�
w޶X��u7�����cF��9���"��{����[��swt�/
��
����y��
������O���=��+�aZ���o�4q����g
4�������hi�(����Y�~�k�̜5�0İa����;o��ɖ}Λ={��O���/IRbb����Ҳ��swa~��M�A I����e=��}����DyNT��p��ӧL�>[��{K�ϛ�����-,X�h��3���y����.۶��Pk[MS�$�R4b��믻�ێ��(.��	���������'/.9�m������_��}��߸	EΡ(���� /�|㭷N��ܧ�}�2����_?X�qӆ͛�o��G�;k�[ᄋ|��'��㩩)o����3�����t�ݑq�
�n�
ö-�<�a@UM#>b3�
����ި9�f8X�b�/h9�	��X�E�
m�ڦ�tt;IVx��tȎ lB���oܴ���� X��⫯�fg���O*ٶs�C�����Ҥ`�Fnv������'h���jo �'̝5KQU��].׶;�l�F�4
��r��<(�ԙ3�PH�u� -\!$ �r9�IuOO�+VTVU9�N]�	��Nۆ�;�,7x�!��x�4-��	�hmk=r��'��ߏ���U�?Z��1��(��ry<����Ꚛ�7<T�r:C�PgWר�EIII��do h�VzZq~��$A\L,�q��
��			���:::�eef���,=^��˯p7iℵ6���O�8!--
z�W�4�q|sK�0M--sg�JKM������r���bI��Цi���phaa��#
B[{;I���֢#F�	m���h����i��坝��C�u�ah���t8x��,����	���M�<�?�ɹ�
���52111%%���!&::>....��r�	H`C(BNv���
����р���<h����ccb���ƌeƉ's���~���]��$	�ONN��1h�?66.
vt��\n	hC���Nש�'�}�t�$�i�	qq)IIA�wtZ[[/�6�}Rb���x��uٝn��5L���>)1�8?	��HM���Ro��槞YbY����==�F��))ɍ�MM��+�K�~�4��pfFƿ?����M׋FWdş��OHp�\u
��A�C<ϣ!x�4������7m^�覢��UE���INLD�=���V(bY����x��D�"+Ii���o1�n��C%%h�U�dI�UM
�C�>�+>V5+3�2�D�?59����Ɔ�����rv�@��m;v>��斖�EE;v튋�5�e�Q#���׿���k'O���YӧM�0��?{<SeYVjj��enݶ��k�U�:��3Ͼ�������8���Ꚛ�G�����<}���g��[�����UTT~�đc�i))'O��$)==m�˶m�1y�D^�9�MNJ���C�P8lYVWW�i��$˒L���i����_��w�����ahAA�?Ѷ���Ą�xUӂ� E�g������$��^X�qS0��yo߱3';�ݥ�oٺ�������\���,��5e��^|�O=���<g�̦�f��5M�$�P��7���ѣ�LIJbY���c����cX�{�VU������_
������ǟ|��W^s�����u
�O>��$�_��ٕ
�r����4C�<kZƮ�;��PvVN_|!	�P8�:�yI���m_�m���?!��po�ds��}��Q�q�]
mݾ-#=��tZ���K�\7?))�@qq��7uҤ�^|�O?���<g�Y�I�s�C=�			�Z��twO7˰.�˲��C9�����x<�II��ɢ L�:����7HJLLOK����8�7446%%&N�:%+3#9))
-��������斖�����䔔�ܜ� �㒒kk�M�\T4"7'G׵iS���cFt�/3##��OKI6t(DNV��ɓ��Ԛ�����p8�TV�\�����p@۶m;)�e��������<{�f��acG�?n,�q����Դ���iS&�<��]N�aC�:�(�� /�� ?L�0a��I������FYQ���������G

&\3NQ�ƦfYQ��Ą���L�������Z}�,�q��)���̌�����Ԕ���h�/;+�����Q#�$Yv8ӧM��;������0����h�/#-=666;+���g����546�1b����YY.���p̜>���ؘ���3�U.��{�q8hahA����~��~�?!>>5%Ut���	����'��gefr,S��o�vsKK||�?!!#-��)��u��E�|#G�epn�u���y��
z���&L�0��re������~I˲<O��[]]]V^ZVV����rz�F���D6����rs�].�(�C
���"m#b�����k��ӧJ���<S�s�����5�=�33���l�>l�Ђ����Æ������-���_heKQT{G���a/�\v(�jkk�w�6lj��qWVW:D�?ޯ���O�4DZ��@�a�i�\.UUUu8$����x.��ms<O���i����t���$I<�s
�ЯB�a,�RU����%����Ȋ���쮮�.�SQ˲]N'$�$ɀ D�B�(J�A`�$IY�UM��8۶5]�u��t�4-KHEQ<n����
p�\�aH��q�C5]GE� EU-���
���~��O��۶�t:A����*I��tAxކv8,�4
�i�CeEa�$�����E9�N� ,˒$��i��AP�dӲA�e��I��,KQ��yA�m��
U�\.�iZ�4]�L�_��'�p8����a��#R۲,�DQ0�I�	ʲ,�"MS�aI2��i��i�V8,!k�H�!�"�U�TD#KBY�!�M;�aH����q	���A0�j�m��%����d^h��'�?Mӂ H�d����.\�f��+�SSR�O`./�A���^���]��E�F���n��FZ�M��G6��> ���
�YB�4b�)q�Z�oy.L]�OT��1(2݋\��4;��v�B"�g��y�С����߾��z�$^)���V��{���H��[�d�e��Iو<�~k��.�[����]{��$د�����q_j��0��;�y�_�6"��mc}�j|A��eß��K?�""����@�]��"���L έ�E۟�I����+i���9�'0�T�H�hG�P��n��*�)էAP}�� h�Ӛ�e��#{PvA��Z��7b�P�4Q�}�T}�E8ww�o�_�Gҏ$��Hm���E�C����~>S��I���	}�`�{������7�K�O_�m���x�{?Wi}Vꑀ���}��W�s5�G�1W����m�2m���w��0q�a��4M�sE�_�-�r:��͍===���Ĺ_xd��i}�B�����#����!*��|@���/��|}.2��n��74!;�\Q\|Y
n��7�Aa�4�Ę$��V�ҏ�`0,�3�ҏ�`0,�3�ҏ�`0,�3�ҏ�`0,�3�ҏ�`0,�3�ҏ�`0,�3�ҏ�`0,�3�ҏ�`0,�3�ҏ�`0,�3�ҏ�`0,�3�ҏ�`0,�3�ҏ�`0,�3�ҏ�`0,�3�ҏ�`0,�3�ҏ�`0,�3�ҏ�`0,�3�ҏ�`0,�3�ҏ�`0,�3�ҏ�`0,�3�ҏ�`0,�3�ҏ�`0,�3�ҏ�`0,�3�ҏ�`0,�3�ҏ�`0,�3�ҏ�`0,�3�ҏ�`0,�3�M��aC7��0����n�&MӴSt괎���`����,K�4�`0��~�p`��`0��~�p`��`0��~�p`��`0��~�p`��`0�}���`0W(��,/�O���B	H`��`0���)����X�EM��!�I��ҏ�`00M���Q�5�ۗ����hmkE�o	a\lI�X�1�?���fQR��l��$I��pcs#���X��-�`YV[G{Oo�~��mۦa$ggyA]�����I�DUӾ%?�B��m�>Q^����`.�m�@-�� `ٶm�ߞ�۶
	Hl܉�`0�;`�]f��c0�Ÿ@�#�q�t�/<���&�K�$I�$�"�+p�|�>�9|F�mۆRE�7�$I	kD�џ�e��������o�gdڲ,�0 �,��4�F� ��D.܈$�� I��(�4/|�^?��\��ZBY�mji>q���h���kk��Vs,GQ˰�g�e�ce��`�eY
��hI�C��qH���e��+�e�ӕg��Wv�C{��u±,I�A�4�m�p,K�$��$I��9�3L������
K?��\�w�v��Ҷ-��k���-m-��S��쬩���doooM]mWw$����'N�����Y�P��ۃ�0-3#-}��i111��U,�ʲ\S_'�I�=���3"����jM]�$I4Mˊ������^S[WZ~�Dd���c��y|v�"2�	�a��K�˚[[.�u�����S�,�N�d[G{8>y�T]}��ia)���aY���P�LUU{GEQ_0���6��a����O��$���BQժ�Ս�M���S'òT~�i�����mni�
�����jY֧/2H�ҏ�`0�3��B��H@�$I@¶옘�kƌ�z���U�mm�y��
���N˲����ö��i����x���a�=�]�],�H���ւ�"��!���e�aɯo��$�a���Դ��Ζ�V�ӥjZo��e���.����G������QIJl�w��~��(}uBȱ\ �dI�Ԯ�.��	@��$h�&�0��oٖn���(��pضm[������v9��y�Y陆a|j5tA,�NKM�t�]�z��e���xF�(�z}^�׶튓'��H�dY6��9|D|l�n�6-��uMV��|[��Ӽsq΋1�0���������G>$BrR�eYM�M{��t=+#3=5����ȱ��i�#
Bgg���E�����1�1q��
M-ͪ����T�A��e� ��)C7GZjډ�'��~��Q[W���fDz>�0��K��w���&MQEY���$u��اF>� @�޽{ذa��K��0�j���>'++�'���4
�߰S^Q�ZVz&���F�u�08�E��RX���(�2Löm�e�$A��9��B9�++/'8��P;��-�E�;�a�4
��Eq,�ײ�I�D���A�m�0M�a��4Eá]�wc�~���:�B�4�!��?A�AQ���E#*B�a8�E��(w�
m�
md͉9�H�� �L�I��\6�� W�.���}
�=���,g�H^�X�1���t��i&%&�iz�%]}G�-�S��}�_�5йE^�6���cb���
}ޡ>��i�X�1�?H"��{p�,�E�_���tІ��[�)��~���$Eq�\�m_t"))��jA���HB�4d���O��{��,+,�b��`0���������mg�"���������i���`�����y}X�1�?B����������!���K?��\�s�2߉�{_��Vs�Z�!�ҏ�`0�˷7�zysīy1f�����`�;��G��|�)�F�E\�xd�ݗ��H���W�s���tR�놦i_p=R�K�o䳂�8۲U����Pgm۾���gh��v?�uB����t]׍/Vu�㻙��`0�H?
�g���Ԕˆ'��_!t&M�,ˆ�ad����
_�����!�n�cٚ��'ccc��q�(�����#��$�p8��0����
�e�����I�EVUWwtvڶͱ��A��`���`�@���Ը�X�0P�cۆF�6�I�n�(�}�C�Y��#����x�����9�{�ŗ�,�e��ԩ��G{�n˲,�r8Hɣ<^h�nmk+.9\4b���$IFEE���:I�n�KE�aX�u����PŒ#G�~�y�f:�������v��Ԕ��c�vTT���]�?ٺm��B��Iڶ�r�LӤi:���y^�4�(r�i� O���G��vuw767646&��y��m!����:::bc�����.�0��	���0A�N��!=V�k�kƍx^t8h�F�M=n��pp�n�r?A��(�jkk�040�,k����~hޜ��pXӴ��7-,���=X|(:��r�RUU�͝����w����=s�p��8�\I���������y�^o���ӧegf*�����2�����i�i�{��{�����|�BQ_�m� \7�ࡒ%/����xݵ󚚛ccb�	�l�:z��@0�u��(^?�ڽ�1y�DI��m��λ�c��q��utt�Y����GU�ŋm޲�X��[oY���US[�g�����S&�W�lhh��g�8r�X��#cF��Y'***�����s��'No��U���{b��Ǎ�v������?�nې��{��7��??Qq2*ʳu��O�nch���^�4�oO=UV^X��X��SK�3t�te�[ᄃ��SK�x<��+W�u;����0i��B���%���p�
� ��
�=�,���4EQ�,�e�e-?y�$$�o������>�,c�fMM��*�a ������O=��cO-y.,����M������{>��ah��O�9��U�֬�p����w��Xn�K?X��$YYU�c���{�}�b�?!aɋ/557/_�j���>���W_kmkC�T/�C�`0�K���$PU�����QEE���?������������>lTQ��齁޴�ԑ#�gge=p߽=����OB����ڙw�~��#���{ם�eʲDӴn�II����#;v�ۓOOMrr�=w�9|�0��G@��jggWAސ����=�pjJ
����e�7J��S&5551,!���{���@
��s�����޼�0̜쬻��k�wt$&�����r}�Gzp��i͚1���n=|���7��M�4�$��!�3e���ߒ��y�Le�����n[|KnN}s�rq��V���6t���--�3�O{oٲ'�}��ŋ5M6�� ?��_���DaAAcS�ɓ�x���xr���ϙ�t:E�$Y�$Y�eY	��na�6�q�eǛ��nj������nZ�"+Ň%�ǽ�ƛ�eN�:����0�����Ǎc�f}}}KVf{{��i�g��2�_��뉊�����U%���eY$	,��t-,ɡ`0,I�e+�
�{{{�RSW�][SS�s��Ԕ@ $"6&fϾ}�yC8��,KQTM�%Y& �dYE�ӹa�f����7n���`0�
��yQ�+Wo�dˁ����Θ6�a����;v��G���Ï6oْ�ϙ53+3������d��)�~��-[�+*\.gRbb]]���j��bcb��R�TU�;��i���;��G�w��#��}��p�4�n�Ƥ�Ĭ̌�#GZZ[�bbgN���O�-�OINY�f�3g��<#�
{�e�
����o�\W_?q�xE��)��()9����te堜���kƍ���2
3o����#G�dg�������ímm�(�奧�<th��M���Dk�-
Baa��'��8a��UkJ�ini�1mj���:���AӼ��+����$E1#I���^�zMiy��~�x8,14-ɲ(�$I���:Ȇapg��n�à(�����pt���h�q�����fz9�C&=�(�B�� �պ����@�!+���2M��(@$�%�q�O۶��4��(�"
Ì�#�i4C�H�N�4�e����'2*�u��9�0H�r��MMM?z����I��8�1��Z@��KKK/>��$	,�@��i��[-\HBh�&��i�eY�+����Q�wEQI躎NS����F8F�F� �&B�e�$E���$�t�G;��}DUU��@QU�4M�3Ő$	� �|
�i"�Q�ꛎ,+� �>�kϞ���a���~?�}s5rq�G�yN������lY�#!�#��l�C��ഋ���w%r�K"e�'���$�ŋq�=����_:E������>h
site-packages/sepolicy/help/transition_from_boolean.txt
Transitions can be controlled by SELinux Booleans.


SELinux Booleans are if-then-else rules in policy that allow the administrator to modify the access control on a process type.

Transition rules are either always allowed or can be turned on and off based on the boolean settings. If the 'Boolean Enabled' column has an arrow in it, the transition is controlled by a boolean.
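Outside the GUI, the state of such a boolean can be inspected and changed with the standard getsebool and setsebool utilities. The following is a minimal Python sketch that simply wraps those two commands; the boolean name in the usage note is only an illustration, so substitute any name reported by 'getsebool -a'.

import subprocess

def get_boolean(name):
    # getsebool prints a single line of the form "<name> --> on" or "<name> --> off".
    out = subprocess.run(["getsebool", name], stdout=subprocess.PIPE,
                         universal_newlines=True, check=True)
    return out.stdout.strip().endswith("on")

def set_boolean(name, enabled, permanent=False):
    # setsebool flips the boolean; -P also writes the change to the policy store
    # so it survives a reboot.
    args = ["setsebool"]
    if permanent:
        args.append("-P")
    args.append("{}={}".format(name, "1" if enabled else "0"))
    subprocess.run(args, check=True)

# Usage (illustrative boolean name):
# set_boolean("httpd_can_network_connect", True, permanent=True)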

Go to the next screen to see the effect of clicking on the arrow.
site-packages/sepolicy/help/__pycache__/__init__.cpython-36.pyc [compiled bytecode omitted]
site-packages/sepolicy/help/__pycache__/__init__.cpython-36.opt-1.pyc [compiled bytecode omitted]
site-packages/sepolicy/help/files_exec.png [binary PNG image data omitted]
�\�r����5��?7���;~�K�/Ru��
(�==��^}���[e�zu�
���ۄB���\u�*W��?���f��}���tuwG_�>z�����>��<�����t�7�y�P8̹��x��:s��g�>��
���?�F{G��>99�`0���UȲL����7�~�K���P��M1<<�˯����Ŷ�����#ћ���:��\�B��:s���|����6���G�p���/�}[{;�����ԙ�:r��/���Z�Qu��%~?�/�C�T̚9�V��Bu�
���������0�E�\�r���L$�¥�((,Z����Q�;:	,^����F�RS�T:���d���H*�-�����dg�d�"bcb����/388Ĩ�NEy$K��磩�&K�,�ݻٽo?qV+��g�5�+��ƜٳX�l)��_�ŋ����t�b�[Z���	��̛;��$n67s��eZZ���}>��]g��T�/^�ɕ+��"ǣ����v̟G[[;=��,^�����x}>�

�\u�Q���Rr��9w�"n��NG(	�w�))*�̹�x�^�ϛKrRg�_��vGYi)�/^dْ�lڸ�P(�`hxx��C��PɲLGg�>�����eK�r��7�y���^f͚�{���>���(�f̠�����q�z���d2r��5uuܨ���8�N'�Ϟ��FʙM\��f��Y��������shmk��ի$%%�p�<�z=7jj���%%%�����*�±'y�����a��O����m��ո�8��2�x�2:���G[{/_F�V���NzzCCC̘>���&4ud�>E!p��)�����f��T]�����j�2���0{����Ғ�Q�T(��$I�GR��t�6<j�1�/����l��~�_C�7jk�6u
W���p��p���������Fm}=:���������!�L0�FM
S++9r�G��`4F��(�t:����\U�LJq��
�]����R[_�F��7#C�]==x}^J���Ø'�����¨�j�z=z��F�Z��`0�y�������W_���V^�mz=I��H�Ę���(�YY1::���'A6
�s������ �������g��}tw��c�n�^������7��zM
�����V$IB�e.]��;>B�V�}�.�፷�ftt�m~HsK$WtJr2CC�x�^B�;v���0o��	��%I��!l6}���ܽ'ڇ�:��Z�F��h004<�[�ǘ���K��`��ߨ�;HNJ��<�p�v|�w����p��7jx��7x퍷ghh���x�`(��'9y��MMl��CL&�����h4�&�;b�Я���g���ڊ���7�bxd���Fv��E��o��6*����~�<H}C>B8���\����R��y9`4�`�.\�Ķ��#��}�T*�:��P(���H������fdd���F���H���ᾌ�L�-YNl��S���ۇ�ᠢ��H"I�P��h��,E~����~�Z-�ӧMe��
�|>�*��(*,�l6��ގF�EQ233X��<�rW���$	�NǂysY�zj��q��5��fgc��b6�X�d1+�-�����Avf&+W,������r�s�X�f51�1V._�O�#FFG9~�:��-�61g�̉;��,c2IIN&%9�k7n|r���IMI�ɓ��cdeer��Icc���RS[ˆ�kxṭ$'%Q}�:��1�Y��%���ٱk7eeq�����108���'6&���W1�\L��%����aj���K/���2��!Ax�E!1!�'W,g�ܹ8�N��
���`��U�SR\Ley9K/B����uky�穫��ԙ3��v�**+*�ג�.���kף��`�F�3�7q��Y�� +�/������V�����������ss��F-\	�E�$*�����`ݚդ��~�UP���֮YMQa!
M7���gxx����n"I��M̙5�NH�T*$i�'�嗈u�J�� ������yx���j4
����h�|��H��R104Hm]}�
�}}ȏ`�Ԓ�������v�5uu$%&����V�������.FFG���`���[oC�=G�xMm�lBB�P�g6ob�=�ڳ���16栽��5���Y��EA!������˗��h���3�-K��n����N�e�1FcS�x�p�1��ۍZ�"?/���zj�����'`1���X%=-���rjjj�;g6MM7�=k&�f��_��-Ŷ��o˲%K���&..�k�o�$144Lqq7��QP�����ի�R]Maa��I�&'3g�,�Z-�%��|�K�v{0
�t:��ӧN#=-
A��Q��>���(��j�~?

�L��@�����OgW�p�@ ���NSs3V�52rx�
m��"I`��@� �CrR"�))�~g08u�i�)������F|\�II̟;���x���k��Fm$ի�b!6&���މg��h
W ��ʊr�l�W$,�(���d"91�iS+����b1���	�2������� ';�$�?0���+8x���l������]�[�eB�0aYFG�����q�ILLd��)���b6�Q=����[���yyy�M�/jzZqq����s�f͜N~^f����רkh�`0PYQAMm-gϟ'�je���abbb����]{�5s�y�\�|�#G��+�-#>>������<�e3�PK���@NV			�`4�ohdӆ��t:�{z(+-�`0D�t:UW����Hqa!ݽ=����������ini����'�x����9~��##dfd0o�:��eJe����O�`K�¥�L�6[|�}},]���"zzzY�|9�Yd�gPSWGm}=�f�`��yR^V��l����UO>��l������RSWOCSj�����(/+#�b�C���1w�lFFG�R]��)�����o�@ ���ӦM%��=k&���55��摕�I}C�������ή.���شq#3�O���?:_ZR����]����������$fL�ε�44�D�V��`���U^xn+��q�wv�p�<.]�������t*++������38�N���ࣝ\�z��5�8��L�2���5�����~Ӆ�a���)/+%99����UU���a��2}�TdE���446��hX�jU�|���2����p��))(�BkkF����2T*�e�wL9NY&/7�8ӦN�������3�<ц8��ܜ��>LQ���������TUEEYY� �����e�χ^�C��F�����eeY���E��&��e�@ �V�E�V�y�%E�&*�BOo/}��̞93rw[N����C:�� �2.������w����bCt~��J�F�A�$�n����W�bcc�x����|�Y�>k߷�[����gm{+/t8&���P�U�mFFF���/|�;/R^V�P��L��"}��O[����/��_��Ԕ��Ű�(��~4m�?�sa�Jug���PM�+n���s�t:m����h4h4�;�%I�����,<|w�w�� ᰌN�E�R	��ݷ���n�����p8����h4"I��j�ߏN���CA�O�ݸ����s���@ �w��Q�@�t:�,��5�]w�j����ԇ���eU*���n�@|��"I�dfd�s��~�5�:�h5,f���x* /7�
��NtlM���|V{o�l�[_�Z}��5-�7n���(��8��f_�O��7�JB��s��$I�:���$�Y��}���7
w�s/�����B�V�V�{ ��C�ikogph����#IR�~;���V��=�ݫ
�����6�������A���

����'l��"�2����`��'==���s��<����Z��c�@�o6�^Of���§�T*���HOB��љ|A�+'� � <FD�A�Lj�� ���_A#�eYft�N0x���?�����>6vǢ=�`���|�����B���^Q���Z~o����]m����8���Xl��p�݌�����o-�˅�T�g�ᠧ��s��Y��(��/��5c��.k���Z^�ëw�3�p�.]�̺=�ko��o~�{z���x��=����tuwG_��\�~�����(5�u_�=�54�򫯉<ۂ��?^y��qW�5����kע�ή.ZZ[�P}�N�dǮ]_�����f�ѵw�>�����_~�k^~�5��CN�=�[�����w��M71���B�~RRRhoo'�fC�R��؄�(����r�!�PRRL��HY����5�\.�
	��4�����dgeRVZ����@k[ccc
�����&PS[Kii	�������)2RPW߀s|�����::;8}�7�[�:���o~�{�-]��s8�gϝ�/�@Ey�CC��\(�LqQf����!j������tN�g'�J�`���79v�V��Ӧq��
^��v�%%���v��n�BA~=����n�z=�EEh�Z::;���B��k$%�{�2��O]}=>���24
�����r��������nOII1���^ZZ[�������mK�)����U��;<��f�̚��f��C��
�&R�0��$''��ЈV�����D����F45�	����1�'''�޾�H��YQ�B���32IΖ��K[{;^���@Qa!aY�Б���~�,�`�׋$I���S[WG5��eY���u�M�™�gQ������mc��E�74��?@~^.�		�9���.1m�T��Ө�~�����<u��g�P^R�ޠG%I��]
q��EV�XNOo/�p�ή.�� 6[<<�����2�pPY^ɧ,I\����@Gg'�׮���̹���N ������>�\�ʼ9s��͛��������� �����������d�i)�lzz���V��Q��se'�4�$1������W���с�dbxx���Fl6}�����������e���0w�l*+*x��7���c�>FRR��(
{����	��@���lݲ��v�&??���>222HJL��W_#66���֮^EFz:���:))�d��S\XxG��]���0�
������x���o~KJr2S*+8x��`��CGg'+�/�����hdl�A(b��y\�V���_��Ø�FT*ud��`0:�{�^^xn+W���$f̘FA~>�,344L �����Q;�̏�����p{<x��;;xfӦGn����fhh���H:D&�*V@B"66�p8D|\��,3c�t~���#I����f�	��Z�i��ȲLaa?��wY�ԓ�>s�׃,�h�j�/[�w��"�`��I}C����l�h�EQs�q��E�<���{���dY�b1�i��ټ�ѱ1
(*,d�3�0�\�͝CyY9?��K��šV�ٸa=����imo�Б��!~�����'��|3fs�EQR��y�<�r���̘>���B����0{�LfL�μ9sز�i4Z-�����|�9�"G�#3#����,\0�H��;�\.N�9������tz�{v;w�fhx�
k����ʍ�Z�� ���ܨ���+�����	OoX����$1w�l

��w_z�2�}�(
F��g��̌�Ө�r�@ �����׮Q[_����O�V�X�F��HrF��L��%"�g���������hmk�d2���M�]�
�F�N�c��J*��ظn-:�6�ة`��.N��굇����t�w�uu�Xc��5�U�1N��;2�r�2�rsx�	�a�$4�H����Hٴ4,f3�x�w�bhx��3f��p8b0�QI��Z
�V�P�TѼ�7jj�=kz��`0Ęc�`0�Z�F�R���M��F(���~�$�h���a�^/������Z�Pd�ǃ$I�����z��r�;~A��L\Rߖ�J=�emhx�łF����<d+ˠ(�e�ˍN��b���I��v�Q�׋��T=�&���,�VV��i1��
c2��"l�u:��X�JK�;g6i��ܨ�e�ng��ף��p�;�'y�V�J%
E�l�X>��L���&^���X,

�HO'66�߇��`�Nj�(h4���q:�8�ǣm�.�u�|�}��z��-z��j5w$lR�YRo�x�]n|>?�'m���y���ݢ��ֿ����l2��`��{�P^VFIq q��ij��p:�TV�S}�:W�^Ib��y�|>tzE��I��Ns��i|>3�M�f��¥��54����3�7��z��ȊLvf�I��54�`�Q}�O�_Gll,5uu���a6�IMIa֌||��nܠ������?@eE:����6̛����ԙ�$&$������n�$//�#G��t�&5�u̙=�'W�����#ǎ���I��ʼ9sD�-Ax�<MM�̝=�8�v;��ͣ���KUU��咔������������}��s�kV=ż9��v��Ϟ������\</o��.]�¥K$&$P\\ĕ�jzz{�h4$$$���<�e3��ֶ6/\��c���:�����H������78u�C�#$&$��oq��i�_���� ���q���ܜl��}h�u�� ���L�� =-���j�5���Z�1}:�>z�ں:4j
O�X���kT]�fhx�Ғb�RS9z�8����9

��))*��2��"��```�q����|��{�ZYə��1�Lp�Z�
V������>LQ����������8p����!-5EQ%b1�1�|>�,f3qqq�'�E������n���jI���x<x�^�&�11��~ �\�F�E�V�z��项��5��B�R�r��z��-f�j5:����q��{��˯��$'%��G�+�|>�F#�p�ۍ�`@����x	��Ȳ���Ȇuk))*&11�ZM0�>6�F��`0b4��A���e��>1���~�>s����n$�`�_������###[||��;tz=&�EQp{<�!^�Ɍ����t�x��X0��~�f3�P�ߏe��v�Vk,:��χ}l�ш�lf����k�Zb,�@��"s��wk$G�ףV��e�����o�5��@0��pr�qZ�����ϸ�n\�.bb"��U*�v��g��qc����	�B�o	���atZ-�@�N�}l�N��l�T����wZ�[��!=�@�(
H����h�y��5����?�&����p:���?�'?�1E_��� |u�[Z��+����F�mwe�9s������ձd�"V=��d7k�<pZ�/������K|\�	6�N�r_A�d4��'59���W��NJJ2/=�<���·�J�"))��dz�7SQ^>�Mz$|+ǞRSRHMIy��Z���ӦN�����XD�I�VSQ^.�]��
AA�ʈ�/� ��A�1"� � <FD�A����`0�,?�j��*+�2�`��n-��Y"e�|�W� �EQ��<���	�B_�"�[+�2�P��K(B��o�w����0���edt�����zMm�������U�\��̺N�9ö������q�ĉG2a� ����������ߑ�Q��P}�O�d�=_�����£k��s���/���n�7��=��O����������{۾19\�w�7�[p:�(���8����Dz�H�"EQHMI!��x��$'%���¸Ӊ�h�����GJrn����F��Ԑ��JjJ
Z]$��ݎ��e��"59���8��0���P\T���0^���KFFf�	��Goo$�rL��Z�^��9>����9�$..���Q��8��J���(	6�X^S&������H*�xrsshll⍷���}��,>ڵ�s��>KbbN�8~��VCRb"*��1����>j��ݕ2��p8L_?�`���t$`�� 1!!��-11��K_?�����h���A�&))ɟ��Q���o��_z��‚�^HL�|�,3::J D��C_?j���4�j��b��`��ge��B�$FFF��tȲL������l6l6���5��wK�eN�9K��f�ް>����pEQ��i>iã澣��(T_���)�?u
E�g6=�+��΢����ꦦ����D֬z����8{�<YYY���V��_����+��8x���Ο�Z������G��t:y���4���8x��v;:��F�����>6���0�׮����b1�v�)���/����mcph�׋�ba�ܹ������FՕj����O��'||�0I��X,���ş����0IT����G���t�`�<�].���h�nV.[ƕ�Ռ��oe�����?^�\8��[���B^~�5�
�-�̚9���:u��ΜA�RSX���K��+��x�::�PI*�l�ț+��$���ͤ$'�������͛0��zE���j��ؾs'�}�E��
'��~�=�������2��0��B!�,Z�ysy�w��N5���Ʈ={1�h4��{^���v��>j��v�ҋ/�����騬��͛��\�Q���0'N�B�R3{�LT*�,���=ܼ�L(bɢ��\�|�ӧ�P���AwOS**��D:^��@ ���(���ӧ�����磤��?��O��tuwSY^��>F(������<dY�������_�x�B:����X�d?�ɏ�>6F��fl�xQ�-O?��m����������旿�9O=��P���I�X^V�/�gh�FFG��FN�9Î]�������T(�WCV����ɏ~Ȫ'�����ys�PZR��~�c̟Ǽ�sX�h!/��j�
P��w_b��
<|�C��`6��O�K�X�����v���A�j
�G��`гn�j���74�~�jZZ۸p�)�Ɍ��r��9.\��^��o��/y�g��YV|�…���g�S
�'��~+)��,�l~z#������S��š(
�N�����S� ��y��dpF��%2uJ%��#��գ(�i�F�ް>:r<g�,fL��s�l!,��e�$108ȑ�Lj���(
G�'�N��mm}C#F����Ξ����'Ѩ5l}fW�V�m�v�*I�b1c���^S��`$7'���d�{�c�{z)).���*��`�a<�J��d�d2�����Q��
j���D(B�V����F��cmI���#��h־P(�N���I��h0�u���H�um�T�(
*�
�V{�ܹV�%&&&:
��$	I�P��GB��s����e,3��T��a41��a��5��SO�$/7���&��v����o0���tS��⯎B�3�Zc�i�H*�����,l	�C�H���b�J���D�V F�i��`�ۙ6u
&��`(�F��j��p[�=�Z�Z�B��F�K0��!Ic������fC�fp������v�Ғb�
���������e7��}�A��<�cQ���th4 �#eKK�h�\8y���&d9x5
=�}��w������O?M{G�p��0�N�cpx���A�ټ	EQ"��U�F�������/�/��n��N�Z�F��F;*Lt�P����0w�P�}������b��
^���$Ih�Z@B����x���7��w^������/��G1m�Tzz{y��7s�d�B*�+���/���+�����(��ի<r��	V=�+�-�܅h4r��q8��ٿ��
MM�޷��k�PZRLsK�xy��̝=����O��o0�\�mߎ��BQ
��Y�l)����7��/<Ryٿ-n}W %9��s���щ���ke��)?u*t:-Y����w������K��`��Ѯ=�u:|~���Ȳ�N��|�
�j�
�F�BB�-���g�9�wF�k}$S��WZ^�����mcŲe���iim��g�����c�A�@?q�8rs�q�\�~o+�-#//���!:��1�
�������\n���dee�p8P�P0��l�h0088�� �o����#I�CC�l6PF�vRSRp��tuws��%zz{��/~�?@�V��T��Y�h0�r����G�j����� L�`0���0�II���)��$F�v�v;�ii�z�{z"oy���~�����"/7�V���}���Fl6����Ԕbbb�����t���D||<�#������z'=-�q��ή.$$��2���ahh���^��8RS����!
�����t��h���}_8�������x}>:;;����������Ƃ��^��o����^FFF����f���jimk'%%�ۍ}l���8����z��oq�\���ؘ��8+��X,�l6:���hã�VZ�
���}x�2e�����9
q������Sg())b�-_���k����o_~���qV�d7Gx��Xp�[���nk%Iz�9�/S�n_��.�����̚9�%�=��
��艏����ы�s�{�����_A�o�[w�b�~AAx���/� ��A�1"� � <FD�A�Lj�� ��y��?�p����022���6�l��
����74����u]��f��}�Z{l����p��ն�V��׿�Ï>���|��+�}�W�
�Œ�9���{<\�/ֿ�_���_<��
��f��u��)Μ=���w{<���}���gv���ՉXt{.�G����vQ���@�Hx�(
n��˅,ˑ�x~?�Y��euz=^��8�P��._�J_?o$s������x������J�gΝ���
��z�x'ʍ�\;~���Ȳ��D'�?�g����Av��E0���	�SQ|~?ccc�\.|>?���;.TA����;�:���~G_?�@��f�Ν��J ���ޑi-2<<B}C�--��?�ۃs|EQ&��:dY�w��>�/Z.<��n�=��tv~�fq��j(�B ����Q���qܷ]4ʲ�����r���
��B�����ى�(x<^F���}�}^�����KL��@����?@}cc4!��mx�<Ђ�W�]�����g�"!�z�S��7�;k&�v;g�]�l6�i���p���X����T_�NEY������Z͊��Ш��ln��W_C�������?8@8�r����d���'?�N����^V,]¯����|><^/+�-�� ���A�x�m� �֬"�f�������t���s$$��|H*�==T]�J||�Y�||�0��?R����������̸�ERbc�EE|��ъ��ᡐ$����'������x�^�X���ftd�@ ��eKy�mh4�� �7=MvVo��.��#���3��{��u||�P8̼9s�ZY��˪'�����1���[6�{�>�n6c0x�-$�l���t��I�{�Z���p�	N�:�����}i"��0y<��W�F�),,$�j���U$Ib����`Ͼ�ܨ�%
�����i�hhh��.���Ǡ�G�M�}��o000��d�g��>$
QXX��-�	�B?q���^N�:M|\<��ŨU�e�?���T*֬z��ӧO�a�����w�\���1���}�Q�(��0�?����J�54Z
˖.!11���V^z��F#�mmL������1��%������(���
ٳw}}��084DeE9?��w��o`�n�����DRR�]],^��
kײ�����`����¥K
���w^$���={8~�������^�b6S^ZʺիY�d1]=ݟ��H�'���^|�8k,���<��3T]����d����(�l��{���Y�fRTT�_z���g1{�L�ϛ��-��e���!�ް�ӧ���]9z���/�sV,_z��x}>v�܅N�'5%��va0�>u
���o9y�4K-�����G�QR\��� G����������7l�pW6OI��7wE����{�%Wd�J�Jҳx�BfN�ƞ}��HOG�V�{�>��|�
?���Y�`�##��;�DFFF���#��^ϲ�KX�t	m�\��fhx�E��'P��f͜δ�Sذn��Q¡0�Jbhx���w��iî={ɩ���7�[�T*
�Q��,#ɋ��iy������ާ���JErJ2�iittt"I��~�I/X���{8s�<�Dz��82�Ӱ�ǐe9�j7))���$�F~�����ȁ�dB�ד��NeeN�����O�-E����ő��Mqq!v������'?"#=�N��b�l6��q'f��Ąl6�qq$��j5��G��o*0�
��&�	�NG�͆�d�d�4����C�T��f��2��(g�墷�[|<6[<&��{��8�^/*��E���),(`ll�NGZj
��ߏ�n������B����Ǒ��L|\�=s��X,�t:D���B$�kvVF��׃��$�fc�����R�HIN�j�C�V#IR$�L�!M}���a���OsK+���1deea��G�F���j5
��in��s���Y3fL�!�����e�k7nPXP��` =5�}�M�!��|~jj��h4�B!�v{�@��{�*��멺z�χN�eph�Ԕ:;�xg���ձx�B���	��y9:�f�����f�e I8��9|��,���JvV�'�P@Q�$���Vv���7X�`���;�ۭ�	����3�8>�<���D��ܞ _��
��cA���m<��i2��9�"{� ?/���A��?@wwӦLa�����7y�ͷ�ol"/7�55��pI���ܑϙ5������$&$��رk7kW�����1�\�rs�$�ĄR����l��꫼��[dge1������񢠐����)�h��~o$33c�췒"+��a��ҙRQ�,����#� ;���W���� @rR2������}n�԰`�<bbc8~�:���~����h"�deEA��]{�[������7ini���<چ�TT�G��s��m����<�&��n��8s�K-$))���Tb,��h�J�K@�L̞5���摜�DnNz�.Z699�@ ���AQAO�XARb���U*fΘ���IJL$7'��Br�����!33��ǘ��+W�8q�$			���v�*����$==���Xr�sp{��74�����ٳ�7wΧ��%''����Z����LJ";+���T�����&�f#++���t4_ c� �O�Ւ=ѯ"�7��,���0�Ldfd���CJr2��q�u:�_�DVf�奬\���4rs�	���2k�t�px"Ϻ���fΜ��h$
���N��J\���K�R^^�^�'/'���"�����rr������`�������%&�B�5���D�rs),(���h4L����Q�Td�������d�>G�RIdef���NyYrX�>6F ��
�IOK#2g�,�N������Hyi)E�E��a�͝���J��
��ȸ#���b!7'������(-)!!�FZj*K-I��!#�����>LQ��������'�2���X,�/��a�����'c��p:�?��?���R����e��8�����7���/Z� �����߽�
����w�Y��������&
�w�~bcc���?�͚4�����P�J����˔��^�G����h����t��5wKJL����/20�E�a��f*��Ũ��)�,SW�@]}=����_�f���H��;~AA�&�5�(ε��_A�I$I�>�/D�K AAx���/� ��A�1"� � <FD�A���=��x8{�sfϾ�3>��� �==̘6�K�����p��y�:II����~j�P(�$I�5��P(�F���O~������C0d�ԩ$%%F����Ea~>Z��kkϽ<�c(���vSS[��S1"����/��Ȓܚ/����������e���v�:j��ʊ�/�}0���st�tS�_@bR"CC�̟;�q�}�vttr��q>k5�[y��~��������;ػ�.�+�+"+/���{�|*�u{G'G�C�Q�Vk��Oԡ(
��͕�Ց�����ƛ8?'����'hlj�Б�t�t�د�����m��o�_U{�u�?�
�7ɘ}���bdt�?D.jo�EQ8y���\~��kׯs�r�gns{�cn]l���~���,|�n}6�p�S�ɽ^�ݕ�k\�Q��^���?�5��޻��HoMM79s�lt����@w��nܠ����׮��$-\�����RY�����Sh56�_G{Gg�j�͛�^SCIQ!#�����{�|>V._�$I��7�/��
Z���^|�����{	,Y�����G�"I*/\@Gg'�--���a�ZL&�nܠ���@ ���:������Y�l)��v|Dll}��
�c�.�x�'hhl�����u�ܽ�'W�@�����ɳ[6q�F
o��I�GX�t	99��>jjk9}�r8�J����##=���g�夦����Ɓ��>����s�dgO�wL�$�����_�*���k�PSW�+�.��12:���˗-�C��~T*ׯ'9)��{�q�����V�Ξ}�]�������~�/]BVf&����'�������^�ް��g�Ru�*���l޸���X<H]}99�lyz#��V�T��c�.������STX8�G�[����;������ #=��gΡ�jX�f5y���9w.��	��̤���ֶ66m����th4Tj5^��w��OOo/驩�_��=���z���`�u�B!�8@OO�7o���BZ��*���j�5��8�F�e���<�˾���隣���S*���YQ��o�����>&2�\,f3m�m���X�x1z����F�TTr��9�;:X8>��	(�BbR"[�������l,]����,v��C��&���Y�p=���:s�ʊ
N�=˕�kC�H�*����MsK���p8LbBo��1�
��;{6sfͤ�����LV=����q���������X,��w�VIdgfa0X�z˗.a�=4��RSWGwO�==�[�����K���=f�)�Y�<}��s.L��K`6�Y�v
y�9=q���h�(-)�����R�.Y����ju53�O�d2��8u�,5�u<��3̞5�	l��?nC��a�X��_�;��EQa!]�=�ص���"���9|�W���z�OoXϢ���H��Ec�JK���d�SO���:�G�[)r��
22����e��INN"����������C<��J�rsilj�����FdY�������hj^�Fô�SX�p!U��\�t�55d��3czd
U��SRRLQQ!�gΤ�f3�`I�������L(�=�w���������( I��$Tj:��-��F�ӱg�~Q�����1�����A� %�E�X����B��?@Kk���hK� /����^/==��>{��^<7�@�����x���ilj"�f�h4��:Ѷ�"$	�^�=��008HZj*Vk,�q�def���Fjr2111������ٳ�|�
gΝcJE&���׮QR\��bF�Ր��ά�3�e���!�5��"5%�Ғb222�����j{,K�xr��>�x���~	�}S�d?���#7'�p(L�-��؉~��@bb$=nnN6j�k\ӦN��+������L

HNJ�^3�^�����hLOKE�հx�B::;0����?����(����z,�dgeQQ^F�D�LlL��%���o��G�ӧN��������~���E��1m�����i�ѕ�T*j�
�J"��=/UUW��~��M,7u�rsrP�TH�D|\<qqqdef��iQ�\썍�100mCRB�d�{����(
�kj���&&�B|����&N�:EWw7~��q���S*s���41FQ�zM
9��X,FFG),(@��r��9�ΡE��B\��"SR\�Z����)�BaA)��T������fgs�ƍ��� �
��Ғb����?w+�-�h4b��r��en�Ԣ���0bhh��ӧ�������=k&n����6�L�D�R�v{��z�v|�N�%#=�P(����ye"�5�dg=��̚�s�p�G\�Z�ؘ㳏�"�(�ޕ� |.%�_�V��,c4FR�8x���al��4�l���!��Ƹr���G���Akh�<���Ј�(����g�~��?����������̞5+����9�3���̠ ?�˖2u��

����Б#\�Q������c�ٿ�=��s��e�*�D_>����d�o���y�o� ?���t�,\��s���cl�����v�:�P��8+}��?y�ƦH���
���q��Yj��(+-%66y�(�9���"��z������'?/7چ��=����>������0�̟����c��Y�p��i$%&�p8�����R��ihl��՛M���Y�h��t��p�����D֮^�-�F\��ܜ�j
iiiTV�308�,�̜1���BRSR���$5%��Dsk+�P���$N�:â�HOK���uz=ׯc�>JwO�Dn�,���wc���[|<��1r��ILL���
�J��7����͖6�]�F�E��p{<�^����,�&yy$''����J�">�ʕ���=7 I]�=$�l�~�߸��055%�<� |SH�F
���t���QZR�-����y�_èTX��;q��,�����u�(,��l2���JRR3�MCQd�n�x�ݤ��3o�h�����v{شa�yy�\n�TV���FKk��.232(-)�l6�t��ZMJJ2��u�9�p��h�Z��JIJJ�hs���
H�N�%?/�5r����fhh���$J�"SD���t����2O�_�Z�������\�N�BA~###䓒�Bo_���L��$5%����;��P�T�l6�3����0����G�5��K���3цdRS�s��㡭���B!z��HNN�0�����MJ�`���_D8�����d4j5��o���0�M6�W=�@��կ�9c:+�/��t�v�IOO��zdY���硷G�;5���W^���`��&�9�#D�e��((�ܵ��ϟ�ٟNv�&�e��h4dge=��L�/B�V���`pE��1m�}�3j�#+
���Xcc����w=*��+i� w��t�������/�P��NQ�ЀAo��-�&�I��������;�f��@ �������x��#�&�,
��N<�%��z��~}�y�;�o
�Z�@A"w���G�MT*��#�g����sj����{�PA����� ���_A#"�� �cD~AAx���/� ��A�1"� � <FD�A�Lj�� ���_A#"�� �cD~AAx���/� ��A�1"� � <FD�A�Lj�� ���_A#*I�&�� � |�n�z
����x&�M� �c@Q�����ob��}<H{���0�4I�I�:�^�d�IA�SPw��v��e���-IF��ь�a'�O�0�LĘcp����I
��$a2�H�%�1�L$'%?WG� ·�$Ix��Z-e�U�Z��(��7�Gfz			�&q;304��妨�k�u��r��1r��(�#� _��?�F�A��|%w�z��q����<�ͤ�5�Z��l�����4�1����}h&�%� �cEQdY�J��(���O6I�Lo�zϓۖO���� ��c"ʲL8~�U�T*EA�$�ߩIHȲ|G���$� ��5R@��OAI��$�w�ѧ�o+sw �U�g�{�/�%$�����H��_��d�?6�����~�E�A�6��o��$1j���xHIIA��sm���z=1	bC��X�Vz}�EQ"��w	��x�LFju��`0�?��d4���P�&.U*n�Y�1����{�-�K���磧��P(�F�%)1��z��nC�$⬟~8�],�k�[�.b�>A���9����>�������l61l/�2;w��rU�$�R��z����۴������ʾ�$I��\\�t���n��ޝ=�q܉������t�}����I")j��u�l����n�>�o�g?8���kSAŮ$��0����	���`��{���L?�ts.@$E��`���ʪ��_�~�Yz@����ʵ�M�@w��(
��������<�t�t&��:ã#��Vq\��M��(
�����i���o����o�������a���PU�r��?����w���޶��͹O�c��{V~�)���}(��/I�$�A���w��J���D�����a�,������16>NG{;�\����,,-�����=7O�4]cye���:l�&�N�)�E667PU�xu]����`sk��z�H(�a؎�����0loo��>Easc�H8��竜��( <��x��w��&��_�{~?�~�!�|�W4��_������m�-��}8����x���"�#���k8�{�O/^�7�dye�'N�p8��k�1M����$5���?��/I�$�1�{��`4::Fm2��g���~,���y��0w����˥�>�~?��<�'nrN�E��UŘ��c~a��d�`0H�h2:1�����S3Ӭo�31=��Nz\a}}���u��(�B��y677)����^�(���K����i���{���
>�p���|�ͭ-^�-���+��T*�?��g�Ri��#�_��iZ��I6��4M~�O��w�*�33���7(
�5��$I�t�(��޻����¥�,�,S(��drz�k��<��Üy�������bhd�?��?������A\׭5UU+��Cϭ*�4738<L0����lv���-<��HW7�eqcp˲�F"t�w�(*�e"�0�u hkm������a�����_��y�
�yxBP,�����(�TE�������L����4���A2�f���SO����K/���(\�J(�P(��J�˗&LBT���_�$I�s�T�+����2k�O��D#Q��,7�D"��α�����
����~&&�0�|>�{��w��~���@��c��4��!0�u���ƴLt]#��f�r��/�+�ͯM$ass��G	(�	E���=<���YX\BUU>��o��h����t&é���'N����҇�loghjh ��ɓ|��'I�kp\Ӵ�������X���&8yM������=~��_��d2y��$I��{��(�����
�<19���^��m���!y��
s���B��'{����8��E����8IU4Z	h�밹�E}]]��^Q,ۦhI�$�|>\�#��Q_W��鬬�R(hin!�H��Ͳ���eYTE��ݧ�i������:�ulˢ���@q�i�d2��T������z�:�SS��O�L&y�hkifeu���!2�u�ZZ0��._����D"��7XZ^�6�������
n���NG{��LMM���Z������
ʹs�DOO��b�#I�$�ޤ(
[�[
���M���+�˩i˲��N��m۸����;h-�b~q���'0��M���UeoY[�<P�"^Q)�;l�_�ܣ�c>�Ύ��lnm1:>FW��l@9K�(�w����ETM÷3�д,|>���(e*�mp]��N�q�l�������E�L�K�$Iw�'�[V��y�.���v�؇��/��w�����?�i���9;~�5�S����C��k0v�Qz��+����
����W�'�p>I�$��QPJC�n�\���;GUU�[�	�÷̚{�ݝ��|\�TP)�$I�t��1�رQ�\�|$jjny-�_��\��$I����ʻs�6h�q�F���9t]�-���fA�u�R=C4�i�>����:.~�_~I�$�B���b���	G�D"lg�YZ^�ۗ�a�iijacs�B�pWۢ(
�p�h$*�$I�t�D#Q��U��7nY���m�Ɍ���O���B�*<��8�]k��(�Z��P~I�$鎹���qS�5|>5���W�
�B���ҢC���x�;C�]�%�ɰ�ݖ�_�$I��'���m���*K��m�`�T:��L$�H�����HaL~I�$��q费w�-��w�]����*�:�$I�t���/�k��簿A2�K�$I�m�T��ڐ�~I�$�pXx-OT�����)z�s�2���s(�U�>��D�뺕*MӾҺ�[�V��H�$I��C����*��,G����������=���qF��hij&���/���	Ad�4��zPP��+����4k���UQ�ho'V��Q^�x����A�<����Y,RUUu�=�O�/I�$ݳ�Ŷ�]?;x��i�X�����R��l�\>O6����{���� 237�m�t��	G�m�l.����V�+:��q꒵���0;?���"���B�B>O0$���	��h�F.���؎C!�'
)�E,���\�76H��tuvQ��l�Ez$I��{�!ivEQX\Z"�N�p\���G���d-�F �Dz,��N�����s�%k���ƴLR�4>�����m���L2��ܯ�f��l�4	���y&&'�|ضMgG�Īb$	F��h��g-�B�ulۦ�����U��kD�
�۹,�k������_V�K�$I��]1xwo�s]�c��9F0daq���m���:B8 ^/���	��I"��.YGWG'�|��zM�1M�t&sh�=B�����8�e���B:�&�ˢi:��,�l�DM
�L�t&�O�Q(��lmm�xu�c�����L<V���#���C#�$I�tO8,��|:�b�|!O�P �˕֫W����X{���v\�Ex�e15=����!����� ]׉�C����P�pHo_�i֡���c����B���(>�h4BwWձ��8�c337K�&���>�g�T�K��44MCU5<�4=�(���%I��{þ+��&^C4edl���a4M�6Y����G��l���fXXX`lb�Tկ���vv�v��:�X�B�������8�L���
\��\��M��麎��Īb��ֲ��@u��@ @*��,dd�N���&'�H�ߵ��<4MGSU�N�!����d�u�!7H9w������t$I�$I_��y,�.��ڊ���Y���0�u���	LӤ���`0���!��hq]]�0|��P4M|>>]GӴJ@�����w�E���wꊢ061���#{�li!!]��<۶1�u(M4M#��*���.�aT>[4?���<�ʐ@˲�<�߿k�@�ͭ-.\�(��$I��{�ͦ�QU�` �Ƿ﬜W��.��3��\�J���w皦�Dv�����|�=�+ӧ�0�ƞ�i;i�ݟ5����S~@��M��_�$I�7�b�^��r)�d2	>������-�+�|������,����G�$I��|��q�c�����Mw�2�K�$I�EAS5̢I(�B��],�BU���L��fBx����瓁_�$I��(
UUU�3�ol����y���
�����޽�i��h$*�$I�to�����(����Ñ��.UUY]Y��_�$I�7�_4��`w����KN�#I�$I� 2�K�$I�7���$I��
"�$I�$}�ܲ�OQT����T���TU-M�$��UE��Y!��7�G���y_�m����$I��;ᖁ?��399I6�������=��LiCi���<f��X\\"����N$FQU��Y�,�/K���Li�#������R����V�V���%QS��aY�eG��zJ�$I�7
����/�d~a�DM��I~�/��0*�[ug@ػH���3ss�~�ԁ��\�_���;�
��~��o���MM�E��?OwW'��J�jjj�<oϹ777�z��o=~���N�@�J��3�RO~�<��W_�q�jk�������p(T����nnn���@ �SO>Am2Y�'��*���;�-�crr���<��܆�X_&� I�$I��i�l�ѱq�<�4O�۸���(\�|�Ύ���
Q]]���\��G8�{�w?��KW�������ܵ�IDAT�Oc�&���a�=�0(
W�]gkk�@�O8aye�Ǐ�����S�ӄB!��ueI��~���#?��3���ׇ�y<x�$#�c�������ԓO��dH&�Īcq�ѣ�3���	�C<��#����'z�W�a<�,�<� �`Y����G.��q]N�:Eߵ�s���k똚�f|b�x������/,����m�<p��ٹyf�����$S�319���2�4������8���E8�� �k)I�$I����_~��J���:Bx��㏙��"��W�;ﲵ�ESc#���Oinn��g~a���8�h3���/,r�����Z~���E���y�WV�W�_��Bks3�	��:�a��Ox���!��q��Ǧ���t:Å˗i�����۲YZZfye��W��B�OL���NkK3�z����c1^y�'��4���^CA(������1�76���J�࣏��ƍAVSk�bU���y���zF��X][#�2<2J�cX�ͯ�~�p8D�
67�����O?������y.\�L}}�H��T��W��J����GSC�}�!휿p���	��Q����>����[�$I���)�B*��uU��g���W454�?�1S�z���G�1@�o���EWg'�e���Muu���z��h�/��Y4"�N�ɬ377��z$	���s�>Bwwg�~��Y�4M���ook�?����ܳ�p�^{�twu����s�>�G��p�B>���
�m���4��/���X��r����Hw��ul�A�u���^z�J�]���g��?�W���~
�d2��g�r�GY__����Ʀ�}�i���i���G?�!gϜ���5��"'O��/����<�?�������^/(���;���"�d���y��x��g9{�Yjk�Ȏ�$I��U�i�߶m.]�#��ڊ���r9z���?��ⅳg�
�롾�������AZZ����fjz�a��9~��֖"�0�λl!���}A�Κ�;�''Y[K�L&���e;�EA ��������>���l��'N������X����4G�t��*++����ds9��'O�f~a����464�H�`�G��*�t�	TEE�t�8��畲�&<����M$c|b���q����N-DUU���I��!T��3��|>��7�5�b�(��$I���̡�~EQ(
|��G�;���)N�z�ӏ?N(bum�k��w��G������\�r�p8�SO�����ٹ9�gf8y��x5}W�237GUU��d�T:͉�^��t�x����N����c``�p8̿����u]��^��&NM<���($jj8�أX�E���mī㜿p��yB� �=�o?���Qx��v�����%���2<2���0�455���9q�8�|�q�ho#�)]GsS��\�t��K��4-B��]]$�	nr��<O���8�8U�(�mm�/,���E����K
����O��B#$I�$i�r�_9w����9�p��<LӬ���)�w����)��?���a��b��_Yo��<<�C�5TE�h�����0*���aw����B`�6����4��N�qPUUU��󨪊��؎�������Q��ݲm,�$�������=rEQvj>�P�}�w��m�F�44M���/ow�b��ߵ���*�{Q>���"��ǯ�7��/inn��}�$Iҗ�(
��÷ǯ�*�`��˲�[�Na�|���ϧ��E|�H���'��lw_��0�]�/�Ϛ
�|��l{x�v�a�i����}X�w/�X��;U_�y�C��i������EQ�r��Ri�v��2�K�$I_�/�,��g�����.��w�B���C�J{{[e��$I�$���p�|�����6��Ũ��9y�$I���R=~�n����$I��U�~E��$I��{���d���$I�$�6�d2��kذ�a��%tEXtdate:create2013-08-30T12:28:19-04:00��(�%tEXtdate:modify2013-08-30T12:05:05-04:00xȦtEXtSoftwaregnome-screenshot��>IEND�B`�site-packages/sepolicy/help/booleans.txt000064400000000736147511334660014426 0ustar00You are viewing the booleans page for the application domain.


SELinux Policy writers have written booleans, if-then-else rules, into the policy. This allows the administrator to change the way SELinux enforces policy on an application. The administrator can tighten or loosen the SELinux policy based on their needs.

You can use the 'Filter Text Entry' to search for appropriate booleans.  The 'Show Modified Only' toggle will show the booleans that your system has customized.
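
As a purely illustrative aside (not part of the original help text), the short Python sketch below shows how the same boolean adjustments could be made from a script instead of the GUI. It assumes the standard getsebool/setsebool utilities are installed and that the script runs with sufficient privileges; the boolean name httpd_can_network_connect is only an example placeholder:

    #!/usr/bin/python3
    # Minimal sketch: list SELinux booleans, then persistently enable one.
    import subprocess

    # List every boolean and its current state (what the booleans page displays).
    listing = subprocess.run(["getsebool", "-a"],
                             stdout=subprocess.PIPE, universal_newlines=True)
    print(listing.stdout)

    # Persistently enable a single boolean; -P writes the change to the policy store.
    # "httpd_can_network_connect" is just an example name, not a recommendation.
    subprocess.run(["setsebool", "-P", "httpd_can_network_connect", "on"], check=True)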
site-packages/sepolicy/help/users.txt
By default on a SELinux Targeted Policy system, all users log in using the unconfined_t user.

SELinux has a very powerful concept called confined users.  You can set up individual users on your system to log in with different SELinux user types.  This SELinux User Screen allows you to create/modify SELinux Users and map them to SELinux Roles and MLS/MCS Ranges.

Default SELinux Users:

* Terminal user/ssh - guest_u
  - No network, no setuid, no exec in homedir

* Browser user/kiosk - xguest_u
  - Web access ports only.  No setuid, no exec in homedir

* Full Desktop user - user_u
  - Full network, no setuid.

* Confined Admin/Desktop user - staff_u
  - Full network, sudo to admin only, no root password.  Usually a confined administrator

* Unconfined user - unconfined_u (Default)
  - SELinux does not block access.
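
For illustration only (this is not part of the original help text), the sketch below shows how a login could be mapped to one of the confined users above from Python by shelling out to semanage; it assumes semanage is installed and run with privilege, and the account name "kioskuser" is made up for the example:

    #!/usr/bin/python3
    # Minimal sketch: map a Linux login to a confined SELinux user.
    import subprocess

    login_name = "kioskuser"   # hypothetical account used only for this example
    selinux_user = "xguest_u"  # kiosk-style user: web ports only, no setuid, no exec in homedir

    # Add the mapping (use "semanage login -m" instead to modify an existing one).
    subprocess.run(["semanage", "login", "-a", "-s", selinux_user, login_name], check=True)

    # Show the resulting login-to-SELinux-user mappings.
    subprocess.run(["semanage", "login", "-l"], check=True)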
site-packages/sepolicy/help/lockdown_ptrace.png   [binary PNG image data not shown]


site-packages/sepolicy/help/transition_from.png   [binary PNG image data not shown]


s��+�i�� 9��n�rG�5s��#??_�>���%�q�ӂ"f�����U��B,i�	GR�WGQl6�$���1c�@V��wm�p\'ˮ���bVU�i��%
�uT�`0�j��H&��$��	I�PU�DB<��c��i�$4�MA�HI�ېeI��u��PQ�E#�/�ts�)~��,�w8|��g�$�Q�/S��)�(/�98�j�>��P��~^���$�,/:���Jح2qM���dIB���;�$IȲ�a̟�H��/—y��$�q.�i�_��K!K�|��[yH���t[}��]�thn��
<�c��H�4MC�e�t�g)5L&S΢O�\����3����(w����f�dJ�4��?�/���A�QO��EU�(�Wi��}��dUU�e9�V��ZrI�,�Œ�?��H��l
 q��_�$���ԭ^��b���%5���N*��Ij�қ����-|)���q88����$I����щ<�la&���TW}��>�a095����f�e:���,H�}��^e����0�$X�jE�_�iE�$9J0d���0M�‚��m��C�4::�().����5M��oRSUž�����N�L,�D�-e&I�C����T�ߟ�i���٬Dc1|^��>�
C ��>ׇ%Ibvv��m�
R���d�XQ[�������8��L������׽��D����E]2��I�8q��P��;w`&yy���O?���R
�`�‚\.ג��T�{�)/+c׎�����3JD�$�� ��h��III	>��ߏ�����-Y>I����dl|"#K0�������/�|��$s� �>ߢ�*�������$������(`��*�W�"+��z�6����n��z���x���*�˖�+�g
�x�wo��޽���Y0x�Ō�>X�ԯ�&�)KM�1t���z������02:�Q*���*��e�3%�Y$�7o����$�--45�dF�<���o����Ï?&�H I�Ϟ�o` �}z�89�A$I"�������}�_��:�LNM.*��צ�����x���O0>>��C�255���46/�W:�y.�+���+W��Ck[;m����f�O�y����Ns��#�O$8}�,�9�i#Y�ilj�¥K���ǟJ�kA-|��2Dc1^�-��_&��iTU%�f��B��x�@���4MΜ;�������m@zo�-�P�/Tr���n���t�JJ�/�v!�'8r�S��Ȳ̍�Vn���!���rDz0�.C8�����g����ޠ������5K(祄d�>�k���K���Kt��*
��C�군=�g���g����
�P��d�uh��?�h�x,΅��8y�̢[ˍ�\�v=#��y����<!��>dph8���L٦��$�H$B<�\�FQ��d�O
��k�s�����.����q���7��Чttu��7�w||���N��;|���zz���[T�۟q���%Վ��?����糣Lj�b�1��3m�pt�<o�G���O���CwO/�hd�s\�濰�u�0�
#��>L�͛����ɹ�������GR�J��~���i��{���-��i���1�P8L2�\�F�$��s�1ė�0�%���Y�,���2==MuU%M���TWs�����nB�0+jkشa��\etl�H$��M�.�3�ٹ9N�:���4;w��tp��)t�`�m����[�ehx�����,v;����p�D"���[��x8q�4�H�d2���QVZ�i�4�ױ�����~���)��?�L���G��(+-ejj�ѱq*+*عc;gϟgnv��`�ݻv�j�
.\�LGGeeelݲ�l����f^{�M�����K/�r9�$�P(̹���񳶡�ښj��<�����|b�n�F��u����~����m6N�>������}�V���^����'����:Ņ�
Q]]�{�`�w|yr�>r����
�)))a˦�������8N~��k4���^BQ:��RϷ��-�7�8v���#
#��*�2W�]�W�I��ҋ/�|�ݾX�YN�S:��gR8�����իh�q��k�099�Ͷ6"�999l޸����G"TV��z�*L3%�,���� �sa^��������s.
��[�
M�x�hkog��'��;��A�76��e��T��s��eB��P�M�6����0����g?�1׮7r��~��c�gΝ'�❟e
��zٱ}]��LMM3;7����_��޾>n���v�ٶen��ѱ1���ˁ}O������$�Hp�����QJKKh���������t���ض��:�0Lb�8�N� L�k��M��o��SOR]UIsˍ̽�64d��n��əs�ilj&SXP���z>�� sss�~e�/�SQ�[Ϯ��5
��!�_�����c���I�hji����'x���l6�ۖM�$�%Q�,�u���^|�9�'6����t#e�6
�D"Acs�}}TV��a�z�y�͖͛ؼi#�L&������O�L&)//�`�P(��+������kD�4Mg��x���z�:���\QKiq	N��s.p��	FFGy�=�\.,���\kl"�L�n��n7�._�k�+jkX��0��H�̆��x��~�$Y�؉�D�Q|^/Օ�4߸�(��R}�j��q�g��W��,3�ۼ��
Μ;OWw�V� //�w��l���O=���btl���>������b1�먮�btl����011����iܱ�|��9><�	�c��ݳ���
>o֢t�$��0�K�u�D��nIu���,�s����"�޿�k�M��	.]�…����l���{\����G���0~��o��Ho�l�������TWgL�y�y$�I^{�-L���tRXX��j�¥K�ut��k�319IR��_�J[G<���b||��~�qLL�h��e�?����d2���f�SXP�G?���������wo�CwO�z�7�vl6+�X���ϐL&q:������3�z�hi���n�7ߢ������=O�s�:�$I\�z������������N�������;�Gs��E�r�0�,3:6��G)))�|���pf&���GaA�6��իVr�	z{��fe��񐗗G�ǃ�����ҕ˷���ϕk���0:�v�=3K]8�q��8��Sy������_�D�Â������pj�o�C���8q�4���/�Ms3���*?=Lkk��P���������v��Ғ\N�a���>��c�Yv;>���AGg'���|r�3�::�X,|��Ǵ��q��Ib�8&��OH$�f;f�Nf��q�;���4v������zS3gΞ���������O184�ۿ�H4��a'�Lp��1���q{�x<���r������s��y�N''N���˜�p�<�{���V�!�q��z����f����p�9w��{]�r5����%''�E�����.�?0�����t��zq8�3y:hlZ�����
>=r���IV��`�Y1�ų}�0��m�� ?/�۽(�4o�H�
�A���|�7���~EOo]��ttv�:��|v�.��#ǎ�������$	��8~��� K2�~�����q�	v���;q���7��s{`6���WPd��FNv6��r������}����I$���������<o����k�>{�������{�������LOO��{�35�8�'O��ʢ�����\.'}�	�P(��yyd�����墸��OF(�a��n7N��5U/EQ8u�,�h���=�����`�TUV��wXy�Ng�}�v;Ņ�l���\��t���=nܩ�%H�:V,�3�d2ɹ��ik�����ttt��׋$�ԭ^͏��4���܌,�lٴ������ݴwt q�,
�������}TWU!!1:6����SS���PR\ā}�(*,D�e�'&�����|����h�FOo��{r/�7mdvv��7&������ԭ^E4�`���9&&&�x��ٽ�'v�&������M������J��������=�vS]U�,˄�!��;x��g���^�����7�x<w�sf>υ�[�n�����D^n.���;w�u��x��gn��ͮ�;ذn�%%�ݳ��I(Z�YMb���<��b�����Z���۷�j�
���2�2M2�w�5�9{����{��v�$/7c�Զpɤ�����"�ص���꥝�L�]��@[`�oljfph�Sg�022B[{;�����{r/�v�,B�x��*��V�T��ޞ���hM�h��`����޹����&fg�� ���v���Ǧ
(++�y�qOo/{v��٧���M{g'N��Mֳi����x|��1����0R������[YS_�i
����DQ��ٶeK�Ar�ڵLLN266�Q6yy�R_���U�@��5�ֶ֭m����[�Ͷ6dE�k���y�~�>/`�NG�^�x���<��RmRTXHk[;k���k�jn���iڢ�L#%��l�H^n.��SVVFUeׯ������\6m�@yY)�iP]U���>�ʚ�\�F���vng��u��ތ3�BssIq1����iڢe������
]7(..��{�����`��|7L]������}�D"Q��w<���u�4(--��={(�������,��9rr�ٺy�uu���QXPHyY)�]D�Q�@�e**����c���dgg�$15=����?�sO@�4?7�t[���-�6����g�>�EUq8��ܾ�����[�vv6�7m������R6�_��i���Eyk��"�ԭZɦ��Z��m6�����[M���TUT,��I�h�������tr��e���x�=4���p��
���T����ٺ�l���|�[��%	-��-��Ò3�xB��ܱ�J�e����G����?f��U�#���Y���L��?]]C!JKJ�%����F�Ymdeeq��e��������#G�l۲UUQ�����ffgg1t�l������8�"��STX���@����S������鿻zR#�͛6����a��f���{v�búu�������w�K���͛LNM�f�v;��\�r�+׮119IeEż`\:�>��ɩ����0L\N'��%a�X0t�0��m�/��a��f�'=�K�x}V_ LM�6+�##tvug�x�^@�
�rE-�>/��\����ٹYä�����f���i��$�Ԑ�󙘜\z-T�p�,HfJ񛦁�裏���}����}�V[Z���D�������#//Y����cph�ѱq
��S���y'=���lZ�����grj���f���Y���Î9/�;�����D�5,�>�-7hjnazz��‚���tc���DC��[J�`zz�#�OP^^FIqq*��/R��d�ښ�m����W����0S�EQ���gphM�@�(,,��������QRT�i����2==��>��6O�_X��墳���?��tvu����KIq�{e��B%��XT���q�zz�5
�2�lss��g7FyY)n�����u�����;hk��D<��grr*��@��
B7tTE!7'73��5}~�f24<��f#?/���6��Š���B�����{����ʁ>?��(KE�ٵc�>��%2��eI&����ޞ���:n�����&�75�i99�h�v�<����� /�l�2��=���T���a�F�����/���6���O�|_J����knvC���
����������ᔿH<NWwO��v�%)����E}/]�Ԍ_(�{��������| I0�v����v�|�?8����瞥���l��ɩ)b�����UUU<�g�_��͛���c�6v��Ɍ�Oww%��T���s�6�]�NsK99ٔ����с?0KEy9;�mC�eZ�ک��D�4V�Z��kiljfdd��}���*B�k׬!��#�
�u�)���,.���+jI����=�����ۋ7+�5k�Z�ԭ^��b!�RX�ϧ�?cpx�m[��q�:fggY�f
�994�������f#/7�'�<AgW7��]�ض�m[����[:ϭ[��e���Z|0�j��j�̳�������Ғ���i�X,V��2=3ͪ�+�y��|`6�������i*��Y�r=���A�44�;h�q���<�'&8{������瞣���k������a�:Μ;��+W���b|b��������ɣ���>)ߒ��,��UU��	��<�gE�����295�*+\�����$6��}{�284��&�'&()*b׎��AGW9��x��غe3��#�ܸ��DEy#�c�OL���MC]پl::;�y�����,6m�@�� ��]�@C]sssTVV��Qjk�m)�D"h�FMu5�a�������M8���UU�����tRTXH�����E�\�~���*��ٸn�h���"JKK���A�|^/�$�c�6������c�6B��]�,/g�uX�֌ 
G"TWV�3��D��^���FgW^��5��L�S�*.f��wl嚝���pPZRB �����2���	�C�TUe��y�LOOs���񉉔�i�N
hmk�`�����Z�56�|�}�LMOQUU��墣���KnN�"�@�<�8��Ƣ1b�8��U�g6;7KV��ܜ::;Y�v
��p�����%�E���[y��,yy���������
,�##�OL���Cue%��PS]M"��8ޞ>w�p8̖M).*"��PW@Oo/�E��Ley+jki��`|r��;wPZ\B8��f�<7o�HeEy���@�l���������RYQNvvvʒ�D�55ՄCa��>rr�	�B��Tc�X���OAA>�>��,�U������Ӄ,�TUV�L&SV��B�;;���f|b��
�l޴����&&&���ɡ����g����E__�H$㽟j�"�n���<������b]n���!IR�!��ٳf]]��h��� ������#I��~C7��?�	v�]������͛yj��l6�D7R����K���?�e�x<�,�ǡ�G�™�BԅB��6���$5
���%=�O�e�_�|w��~���T��n%I◯.~�麦G��g�T\����vL.��Œ��lKF�e�KW�����O~�C\N'���Gc��=��������i����j��a����R���.��b�k{�ARӰޖ�^q#��⥞����>�E�u�<��W~|pѳK�O[�Nuw�W���$ICAtC���r?���$I��_�З�N��j)���	�����M�����E��%����:�z�u�V�y��LY%IJ9f�ff�뗕%s�9


������6lkk[z��'qڿ�hI¢Z��2��1��j��g��MYFY @���a����4�?��Q�*�RX,����۷�|���-�)�Ry.��$�|�^���s�4�Y�p٭���n���*I�6���:t]GVdI�j�,2��,$]��ڧ��=(�!�>�|��(�W�h�˶����n�Rl�l�H0'-�e��籰��Zv�bf��W�|�wk�/�>/�G��2��bAU�Ee�R�w_���#��-s��s�%���	��ݪ��	����4M^z��ߐ���~�c�@�Q�R�O����a��Q=;|��{{Y�j�%I��2=����{m+W����|Ѿ�۷#�h<��<�g'�r���Q}�7^|�9lv�C{��D|���H�H�K��Ev����|�Ag��,V�hVbO�7�۟��a���It��3%ۼ��™�k>&�p��񰟝��YN�?x�Y��=h$$���;�@5�d2�(�MMż�"f�҉����u<K�EE�Sf:1;<LR[+c���~��<��a�&�X�+���"�?K�7�-�����U]t�FqQ�"�(��� ++�g�?-��� !���,
�-�v��)�5>9��E��1M�H$���ܾkz�53=��%5�����K����!X.YaՊ����.�@��V���q�:k�F__�drQ�K��P8��8*��Щ��K'�7��X�.�3��
^�U�e��V.z�q��@ |�$XR�����E����o��4�Z7����#-�,�x �ah�����ɗ�񧢞%S�ڑ2������|�I ,'B�����gYx�
�E��bQ>��=B�0��RZZʎ�;q�=�u�kׯ�D�ē�f�/�@ XN��C𨲤�7���hk֬��tq��
�X�f�d�`(�,�dy��e�X,F8�j��v����C�L���3n�]י���0t\.�j!a�)e�q�Q�C�~%�;'�1�<���/ILLN`�X����x���N32:�IC����9s�4�s�Ȳ��-�@"3 8{���#�lVv�J�s��1b�>��U+Ws�lv;�H���5lݲ�+G�/�@ XV�<��U�J��� ��#//���N���`ߓ�ٴqM͍\�x�H4‹ϿDEy%W�^�D^������<��--M�l6*+��|06>�n�ض��������b_�RB���F�!���ݧ����R_�@~^>�H$�=���r�0$/f���:�-O��!uZEy�%etvvp�f�UU�uEQ�de�z1
��7��N ,'b�_�rS��3gNa�Y�z}�ؾ���b��F8y�87l�����gOs�Ӄ(��m;��S&�‚B**���
`�X(..�eua�&������j�XZY��X��!p)ş>�#��(_�΃��,:�|��wC7t�䭭�,R&���6��.����u
�dYwg�DұD�vdw�|�����QU�d2�hW�a����[�,���i��(��b�hZI��pL�d2���BV�K��������V
v����{�
��e�;����#	c���4M����͆,+~�����b%77�0R�[m�[��ۍ������+��h��勴�hB�Ӡnu=O�y�5���'	SZRz��F8z�3֯�@eE����=����';;���vwuq��	�C����q9]���v�ioo���b���G��r:Y�v=�u
�^8�0�����v��p�}O�`m$<.,5�H$�\�v���64]�����������S�w�0M���xj�~���;�PVV@Oo7��m<��s�|߿�]\�t�g�y��¢����kD#V����3<�w��9��&�L0<2BaA!V����N�����
��@�|v�S����Բ��I���س���ɷ�%�,�|K+"U����f[�	UU]���Δ���dފ�`�>�μ�K��v�ٱc��`0��M��0;7����]nB��D�ۍ՚:6
�H$�g�*�t9	�͡(*�-�
���}���O$H&�L�$�355�ǝ���Aӓ�L܃�n�'k��5�`{�ڏ��X-VB�n������L��옐e�dz�A2]���w���ͥ�(*,BQR�
��E<���cT�W�n�:B� 333LOOa��p��"�@p�,%���x�M6�����먪J4]��mv 5�H$��Y^*+��$�ٹY��].7�,��x��'�,RJ�-�oi4M������q:;�),(D�5B�0�i IO����QU��C$&
�p8����f�ߒ��v��j!
e�0>1Α��ٺy+��U����r�R��¡�`���dr�‘�T��Ću����ʵ�8�֭]��j���zS�3oU�tӸUEQ�F�D"�N��h~�V�E�վI��D#!�2��Z���q����7p��[6o��p��i��8����ٽ���~�^�@AA!�a"�2{v?���>������~��Μ;��
�����|6mڌa�46]�f�
��ʩ��B�Rʹ���+�.c��+غe{f6��92:�,+��ӝ��\���^�BGg;�$�a�F��,�{4c||���Y�,B�W�^!����|��3D�Qb�(�,3<2�ǟ|�"�8����5K,�;)E3=3MvvU���~Q,x�eJ�KɚW`���������b�9�i��qJJʨ���¥��tvu0>>FII)�]�_��o�7ma���L~�D�a֯����p�H8�Cc��H$4�5�p:�p�<N�]�زe���M�P�7[(--chx�+W.a�%����H<���]Ә��������)(($+�ˉ�ǘ�!K��a�Stvub�X0�����I�())�0��;q�=�X����^>��=�{�ETU���CTW���ފ�b%�LP_�@m�
Ξ=���,O{�>E�/�ljom�ޥ��D"�?�����X����˧�QTXLUe5��M�BA


h�h������F*+���w��m;���5� �%�x�X�bE��<���WP^VN^^>��S��*[�lc����gdxY��F#\o��,�deyil�N0\TrMט��0�&�H$��Ԉ^�d���[��=(�Lc�5��䢺ǢQ&�&��+�'����GEE��~�n��_��'���b�RZR�Ͻ���,�#�1,��X\\�w^�.��9\o��C1;7�$�h�F`^6E���5����$	L� �%��_�����χe5�Ժ��Uux�����N�4M�޾‘0.��ɩIFF�3��]���}�n��`ff��³�<GM�
�_�F"�ڽe�&�D�p(Dc�u*+���w���-���ͣ����IOw%%����{�lٲ�d2��%��lK�{�e*++�z�2�p�,O�>�<�����b�cƢI^qQ	v�����Q-*��h����͛-\�t�ѱQ
����{����ηr�,9�6L�����z�ݎ4s��wIu$���C}�\N&`f�d��*��Y��dE���l�N2�`rb��׮RZV��nO�I^����R
=
�^XI��tQW׀��z4�a����;/���b!�L��ۓ�R�H$�B!L�@7t����� ��TTT�_�|v��UnܼAyY��r ����DJ��&6�
�/��Ir��Q |K� ��r_MM
V��K�/27�2�{�v�؝rz�X�7���ezz*z5������ZI�`�H$5p�e���&	���
&t�t��
�lܸyQ�o?�U�@�S��b��2E�@HH9�4u���'77��jA�%�8�DrQ]L��0�Cdڹ<%�,M�N6����5\�v��J��zlv�u1Mc~w�������]���U�K���+�#�0�����y��giii������5fLMO�;�z}lܰ�˗/10�GIIeee

p��QCGV��Khm��gGSYQ�jQ�
�$	�ł��4�4!KU�Uԭ�cph�����k���fzz����E^���`�ZY(I,�������i�Q[���!4M��t.�PU�i,�2�;��I`֏���m6*+*iln$��\Z,��ف!f�3�Ӝ<}�D"��Z�������`p�`(����0)..a��X,Ξ;���٩���f�!�2�%��Iɗ��<
�8}��"��˦���@ %ߚ[����d��:$Ibtt���K/~��,/�}=\�|��4�D�3gN!I��m@UU�k|v�0۶n'�Lp�:��3�6l����SXX��j%�H�a���Z\\��+�x�ZU���~zEUٶe;cc�!�{ɟ�\MSUY͵�W	�TVV�ň�c��RV^A4MY1�VT��'פ7�z�<���y��5Mc`��x"�J�DyyE�	�n����(6���y>M�&';��}�k5Mctlo���,�W����D�<��-������`���M�n��`0��(8�N�� �`��F�/;�8�H$��R����·�l�j���4
��I4%�c�Z�X��>�@��v��*�P�02��D"�����
@<'���deLk�pEU�Ym�BA�y�����D����Eg+��aL�E�]��w\�X,*N�Y�S�6EV01q9]��aE���w'���q{͖��$�p�x<��j�8����T�r,������|\Y���lh��y'�;�$�;����#�hj۰����H��r���Դ�Ӯ���!~?G�}�SO�#?/�s���&W�^�g����r�2�Y�0R2t������fg��n�MӈD#XTK�07��A<'�`�Zp:]�bQtø��1M�p$�,�8�)=M��q8<������ɡ��E�S�5;@�����u���A�4������������j��q�:�]�]K����~v(���vdI���}���u���.PXXĮy��F9u�$;��dŊ�w�6��s��i֮YϚ5_M���3~�Ӆ�y�:dee�ۓE�'k��R���yh��4���f�`(ۗ�����&$IZt������,��E�����YN�@�,%�,˒����x��ҦY$��VU�����vS*��7�K4�f���eg��e��NvvN�������Y���d~~���S֋;�z{,�_J�H�tǒ����3�<��$IJ����t:3��Ǒ��8dY��{��Hm�Z�����F��c���S]UCwO���x<*+�@"�VZR���c~�����A ��������(�]��3�����W�������M�A>��={�.R��%��=�fq�4Y�v=%%�w�����4MN�:��jś�c�S�ikk%���hm����N�M�����J ��O�Ԓ���:���[R�.n�����dy�����g^���fj) ']�x �Ŀ��@ ,;v�����E�e=�%د��⒌ҿ[]+���]�$Ilܰ���b�+Ngj�xMu
k׬��cjrM�Hj)o���I�$jkV�m�v�{�]�2���$	TU%�H0>1���0k׮��f���������m�@��@Ƞ�`,�,�Ȳ�	���{k�����t�`jzj��JX�63��R#�i��=ݔ���v���V���Ad9�$VS]���Tj_��bQU�n7>����"����꜏�~`[��N ,'B	��4q8219�����iR��K+~UUY�b%�P���	dI�����
��������*JKKq:�LMM����r�Ra&?[�le��UȲ���$�V���r���r�q:��ڹ���&��u���E�\헻���P���BQ*��(*,��-d��|_�0�>M�Pd塙e�Z�`0H��!�4���!''G��_����ۛ	��K��%I"�};��(}`QT���)F��`yyP˖���nz�[�_(~�@��$xT�v*~aYˌP��G��o�:�����q��Z����4�F�L��#|�����NEJ�/�=��@ �<���6��1���t*�N������@ M�v�};��@ ,�P��@ <Fܡ�5][�2	�@ xH,R��a��x���]/���ƫ��N0`bb����Ǘ�.�KGg'�=�C����|�i�q�D"��u�p���~�񓧸x������8.]B�宾@ XF�\�F������$���
�a���k``v�O>=L8^�j�L&i�q�P(t�w�S��� x��1�o�h��c��-�����_��/�࣏1���i�;Nb^񛦉��_5M��'O���˱'�(�H4JgW7IM�4M�D&�!@_?�Ϝ�fk+�._!�f:q*}rQz�0�����O3�u`!��=~�7[	G"��a��|�
�a06>�/~�+������۳��$���9v�H$��L��Ƞ��¼�i��Ri4M#�L.J�L&���"�)��f�z-�v<|kmm�(�}O�t�}�(
�6l���]��Պ��<	o�l�؉$�Iv��Ύm۾p8߇E 0K��?������ko�š��xj��TW3���̉S�3���g?�����Mͬ^��j�Ï?��SZZ�O~��^����g�$���".���?9D"���O!K��m����_��
��DuU�r7�@ ��I�|t�.^�Lmm
k���q{gjz�>����	~��~B[{;��D�Q�l�̾'�r���ώcvv���,��ubb��>���?�7m��r1:6���
-7n���ͮ�;�����ٴq=�>�����ٴq=O��ωS��p�>���^x���>%����G<�`dd�߽�.�DI���K/q��i��#�����~ƪ�+ݧ���w��,���	���K(ʣ�7��,ݺ��r�x��aPS]�s�<�����idYf�����ښv���;����r��׋�(����;x�;/�����������Q&''��+��L&�x�2�D���N֭]���lذ�?���3:6ƉS�����٧�p8�࣏8z�C�#�ݳ�‚Ӥ���?��O��,��xq��B ,�,�u�&~���s��e�++W�d����ڹ�
��QQ^�w_~���"�ZZ�z�x�'8t�0.]�>dϮ]���? ��Yr}����W��}�Vj������…��OLp��E���sA��%˻0���z��8|�(/<�,/��N�������԰i�F��;���Vv��NnN�}�]V��RUY�w_~�Ғ�;�S^V����޵��;�#�"�����/+
�֬㩽O�1�Oc�&O>�yy����G��? ��}�V�o݊�babrr���zc�W���r��*�lX�ӄ��UA�d


�������d"IOo���j�J� �磢���*z�����Ʃ3g�#�޿���ռ��G�ut`&9��T��SR\L,��B���Ȕ��RY^����b���fQR\LaAyyy��njkjp�Sr����];w���Kˍ$�I֯]CqQV�e�l޸�g�>���ǹp�e�%���q��qFFGٺy3�6l�����%}��s��e��q9lX���l6�.kꩬ��bQA�7�5

<�w/�33��v<7+jjp���)��כi���sǴ~��TW��S�놁�n��?����@�e���(,(�w+���n�S[S�� �����~�$Y"	s��e.]���f����K�/c���cίO�ASKU��Nà���O>=��F�9p��.P\\��墷����q������[7oA7��Z���-<���HHHdggs��

(,,�4
><�1�ڇ���l%�
�x�;/395�o�yoVs��p�+W���0�‚|�^/>����"�76�|��۶�����]���ߟI����s�<��+W����&��mf.�w�KEE9���!bM��f�~.\�DwO�5���:~|�g�>���(}}�H��E��m�r�ۡ ?�
��=2K��V��������Ӆ$Ib�X�z���PR\�����ʊr�멫[M0Ģ����.w�h�蠷���^x�ł�j#�H`�&���2�e�x�����������R�V+9�>�^�Ξݻ).N��U���`Ƕ�)s~a#�cdgg�u�&�V+�����ϑ��Cnn.�eX�6J��)(�_�&_36��ʊ
��}�\Nj�����&���O~~>�P��2�^����V���}�5
�TWU166�����(/����p$B"���p��ʢ�����K/�@qQn��S����s�QS]���,7n�b�Xx����ɦ���H$z�|�g��U����?8H2����������p88�N��>Ν������/RP�O^n�p���r���#��0���rV�Z5������9� ,�$)I�d���o�a����o�2
�D"Q
��u�i22:J^n6��]+_�0����ʞݻ��_9���N~��������~h������/�������x��h�Foo�]fi$I"��݅�%����x�x��!D�$QZR�����Iqq1n���d�?0��5
U�X�6�JKP��h%���P��@�MBQ�����ֺ��ډ�>|q_VZ¿��B(�G����D}��z)o���$IX����!\'�@ x��_ �����@ x��_ �Lj;��,'���E��0::ډ����CC�:s������s\�~}�1��*�cc465s��ufgo;��Aӗ>:�s�z#-7n��������ݳ(�R4������8s��Xl��F <dz�����~ y�#�56�?g�����G�<�{snj�z�5Ν?C,�������q��1�GFx�ͷ�9XxT8|�(�Μ�߽��P���,'O�!q�N;08�o�y�c'Nr�/d����w�Cb>D0@(����ɩ)>��c>>��r7�@ xȼ���\ojZ4	��#ܸ�
����}������S������[�j��q�N������ka;>wl�Դ$W�^A�$v�ڃ�f_�"+��Ts��1���eI��$���ь5`׎�\���	�Btuw�����������Μa��O�����ᠻ��k��D"�v;�}�%rsrh�q3uP��͑�LjD��Z��g짩��s�/`��xq:=~��`��{�`�&7[�/�7�].���w������(
�>]]��)�S�b‘�Fcs3k�(/�7��Avv6/��<�S�����{��#��Dغy3�׭�����G�161A @Z"�O0�؉�����a�:N��s<��	z�����c󦍜<u���Q֯]��];���kٵs�-78w�.����q��eB�0��������_ 
a�Z9�G��7�bpx���
�U������ū��ANv6�y���'�y�,Ѻ���I̛�b�kؾm+o���pY��B���ߐH$��x��_etll�뗡`]ө��FUU6o�Hey���[�{{9�"]]\�t�5��ttvr�t]�fk+�֭EUU��Jٳs'�/^�܅�8�N��	������\kld�X�0M�;;�ogdt��g�"I��0::��Wٹ}�P���	���b6mX�G����ˣ���ʊ
�
���[�����'���$���7ߤ��M�x����c�Vl6,a}�z�::DUUe&p�'�>�p�����ijj^�ƹ�sQ>n��#����TT�S__�$I|v�(�d���\�]�@Oo/�F�����o��mr���뵊��;C�{�����RYQNyy���_w(~I���Z��;�Ǔu��������ϣi�>���I�?/��"�|�eL```p�뗡�������<�l6���x��gH$����
�$QQ^��];���!288D$����0(/+c��-4��s��M�� 7n�dzf���i�l�D~^>�.\`.�0MJ�Kضu+jk��7�MLN�5u�W�{��n�@�QT�5
�lݲ�Ӊ�頤��5

�Z��ښj���ٽs'�yy(�†���G����s�	�C<s`?k׬��t�Ԣ��իX�f
/^f|b����8�NΞ;Oo_;�o[�frrIſ0���ݽ�8�v���lڄ����v��{ظa=V�����'����}���Arr�),,`�Ν����q���|J�S�z�J����P�55+x��g���Y��4�u��,?��hjnfjz���\\.G���ȱ㘦IYi�r��X,FGg�׮EQ"�(���;�Z�/��G��̙҆a�Θ6M�oܠ���כq`iln���V��
9��!rrrh��G�uTU��0�#G���:��a���szz��������!*������)�o9���/fZ���7o��ׇ��$	.\�����a02:FSs�sA6�[��(?u�����0�X�k��\�t�/���A<g��������(�–͛x㷿�n�SZR��i�ޙJ��C�]�W>����2�s��9�{z��b膁nى	�p���~.^�LAA>9����q�_�������/q��e�\��l`�#�=��B~
(?�ɏ�}uu5.�I�(**���P�h��EmM5999��><�vl�������gfx�穭�y$Fo��45���w_�b�233���#cc���TWW�%�TUU�v�X�b�h��͵��پm+��D�Q������fem-/>���ֶv0M�W�&�i�9w�����=K�χ,�ԭ^E4%77��BgW7>����q�d���ՏD[	������ee��Φ�[��ҲR�{z��lԭ^���t��RQQ���F:::�����={صs�\�v���i*++����؉�tuu�?0@2�ĢZ8q��H�g짶���š�>�٧��n�j���1"��8@aa�>LWw�]�ٲiN����.119EUe%�i�z�J�N'�p���|N�>����i�W�KEy9H���KUUJG\�t���AF�ƨ�������l6;�B~I� ,r M#I�d���o�a������.�Ccrr�i��)=��W�4�ֶv��*�i�"�M���i�,�@��u���?��l߶�}{�.�/�#����)y���_�w���/)*,\2��)�'IҒ�����������P��/�׌�i�i��I�D(�����8�/??������鐪��n�;>�]��~�փ<UK <�8�Nv���~d���Fk+U��5���sB#�
n���&��#��P�@�(
��(�غi3;�=��C�e�����_���|��
������>�����2��J����K�����@ <F�/�c�P��@ <F�/�c�P��@ <F,R��i044H2���~�L`�h4���P�G���9���\tD��i���-�(����`�i�4g�泥��O>=̕�׈�b�[�d2���7����czz�P8�@�J$
�\"P�r`�&sssK�<z,R��ir��y�7^��<{�����w\�|�����o��p8��u�\}v���}����_�?0��|jz�>>H�.�{xd���o���g_��w9w�}��
���o�ɧ�~n&����_�=s��r7�@ �y��w�z��._���E�P��'O�s�frr����_��Z$�I><x����;���z-l��g�>~	�X,��sg�$�
6bQ-�]"�����ߧ����uI��� כI$��_����G'p,����];w�G��Oik�`�m�mv

��������Y\.�w���t���N^^n���.���Kk[+W����O#I���CC���p��5tCg��Mr�����1V�\A��C_?��<{w聆��WP <�~N�9���
����d��~���^|�9���y����'�2;;���$�h�5

���295���T8�x<AcsJ�Z���F,������)F�FY�r%�--)�������{�N34<̵�ױYmlٲ���"�0V��ܜ�C�.���l޴�Sg����`dt��O=EqQ�
g���g��
"�Cf�֕�#tuuYbo�&[6o����w~��D"5`���o^��Fzz{��_�=~���aph�P8L��U����dzf�_��U��9v��y��7I&�<t�s�/`�&M��lX����.s�СTT��X�$q��%��[8�o�h�H$B����?y�_��:�X��?&&�xUU�������Q	��a�H$@�_��*�h�F"�@�u��$ZR#��i�gΝ�������_���ϫ�����0n�{^nܩ��\�ʯ~�ZF�OL��oK(�ɓ\�v�k�3i�g�KʠE���LNN�5��3�A$�׿�
���9r����/��G��߿G<��O<�0�\�L�=�H�����%�IqQ1O<�$YYYw~k�X-~���184ęs�e���Q��ß������P(LgW�r�/CˍT����p:<��	������,���P�55��ߣ�����I�����Z��KW��{�.���K�\�ӄX<Α��y��X�P����4M"�.����|FF�p9��f*��K/<��w15=-���UUٽs'�y�%,YYV�\��={ر}�6n����~����ؽs�g��n�ȱ�OL���޿���,L�T؅�d���`MC=�ar�z#���ض����L�ӹdy��r����D������?�7+���w^~��O=�����/;�~�{���,#w��*���^����T���rŭv����m-lRTT�3�<KYiْq�M�D�u
��w�˱'�f�r{P-*7[۸�چ�k���,w���C���V֯]�Ţ�'�����������xR�c��ab��O7n����KQa!^o�==���1dF�
uu�iH��>:�	#cclڸEQ���?��r����D0�0�
���#�3-c$R��"+
15=�$��q�{z	�B�@8e|b�H$BYi)�a�����������|�o��U��@z�^�F���%��鞝�Y�����v��;;3��-/�P��H�ފ�H ���P�f�?
�T�Q���C	�ʌ̈*�/Md���a���ik�����6�?�H��y�
�
�*+����(@nN���e�x��MW�畳�F�sl���FFG	G"7�E�
�V �v�����b�`2��4����������::����*:��&�
���{����Wb4�n�B||�4Ea���ܿ};]]��$��p��1dY枭[��˝���ށ�㥴���HO�����a6�_Ϝ�B��H�O 3Ӈ$�HIIFB����y��h�Z�m�̎���_"��O$����ŋPZR¾��|�*E�t��r��E� +�/�f�����F��n����)�X	� 3#�#�FCNNf��իV�9�w�^͢�x�~�V+�H������K̯�d���tZ>�d?z��Ԕ�� ;�{�ۍ�(�����'	��9z��ŋ���wذn-&����z>޷�P(����	�C���xܞ��Yv��wt�ڎ7IKM垭[����`0�V�����d2����?��
��^���X�`>��?@����5��x�J�Ao��Gf��U��p�-�6�>�6�v��CY�n=I�_l �,�(�z�r�#���Ñ(
Z����m��줳��˖!I����PE��8QI�P�Z�,��AN�9Kee	�W/"��=8�NK�)ID#�j�ė���R�&2k��~P,���j����+6ZB�R�FQ�TH�D8F�$4��#��$���ż�
V�\��j��_�~?
���P�T���+�*�
�J5���������?�9��Ȳ���X�B�(������E��G4������~�󟑖���h���:�x{S�բR�&����D"Z[[�+!I����/����/�$M
\�Y��ege���5�Z��h4޲��i4Y���I�k4,7���/�?��7�~}[�mW����'��7�$�T*t:]�@��X��okuc����l��>���RR���Ș����|�r$I�l6߲~�w��1�o������d�n�EiA�;�J���'��v��WQTX@I��~������~�#�V��L�W$� �,#I��Y_��qy����~�F���m	_�xnBA� "�� �D~AA����/� w�A�2)�+�����-��Cln�����i#��ĬQ�](���v�'Mp�(
�p��u�e%���S?
g��� �2ק{��u�IDAT�&D�Q�n���ߦ?Շ
�ˤ�/�2�N��������'�g��_r���ֶ6��W����t]��CG�������-m���
�����i������W����m=v�O���
�0˼��{\����ZsK+�CC@����%���;F�Ι�����1888�/R��A��n��_�9���CH@QQ���N�o����H$�s�
@(����H$JNv�-gx�	�p���W�[ZBS�1N�=�訓��9�Z�p{<tvu�r�1��)B��������f������rsILH����ڊ�(04<BCS#v���$���A��-n�@����&x�e�23ټq�������|�Q�UVq�\C!�3���mx�^�hmk���o�3�h4JGg'��#�����h�D#�����xp:�HMM������a���HOO��rƗs��ҌZ�� /���!��qqdY����χ^�#''��Uռ���zͫ�a�O�����D;lٴ�ܜ1W�m6e	����_<Ojjv�c�J˗݅Z�f�ޏ�(/�����G{hkk�Mkk���g0k�Oo/#�#T�����w�������Y��n�?���b�G{�WYACc#O>�8+��ť��̯����5>�q���G�?�����L,*�D{{5���S_�?��/(*,����0âr�Ʀ&�� {?��ᅬ��N\�1�r������3�Α������v��j����g�`�G�������W���oP��OaA>:��OO��_��?������h�^۱���|���HKK�R֍���a����K/�DHMI�f��w�����Q��Oݵk(�®�? /7���!V�\��sl�^YY�S���0��CNv���ٔ�}

��Ec�L�vQQ�&�?�0����|�ZMOo/՗/��3O��?�9�]�74�t�&\��'%9���x,=p��~v��Kj����f����Ғ�������9EE=~��Q'*���Wk��@�Ӣ(
���$&&"+
K/�?�!�IIt���t�A��-�6l���H$B|\��ܳu+�7nd����)*��?|���<���W���/�}�;p�Ʀ&~���y�G��sL{KV�V!I*,�yyT���r��V�@Mm��C��N,SX�?m�������54�r���?�3�y�I�����t<�Ѓܻm+�,
�1�L<���<���:}�Ғb������??�~E�T�_o�
S�2߼�Z8�׭�@�[^��e�ܜ6mXχ{���x�h�H�Ę˅��w=;�L�F�T_�JeE9z},F�`ll,�)O��RP�5h�jt:�$Q[[��j%33�JMRR"�*�����n������'?qr"�J�B��"�b�� �_T��j4
�9��n7~��ZE$al��TG,��J,�N4*366���'�"G���n�������Ǐ��N'x��F#sK�yk�N��s����˝�����r���+��!��'6�O��T*z=���\�#�YH�:m�^�1��0~�b;.�k<[�g� x��t�_���q����b00[�H�ĚU���|���>���X��nv���Jb��E�)����{z��t.���=��!�WVRTP@��+
��lT&�	�Få�j���Š׳i�z�zg'�N�!-5���|=J{g'����d������^���[���Ŋ^�G%I�l6�:��U�{��X,V�\AvV&���O=�8�H��?�͙��HKIa˦�(������dB�V�xx��7&��*,, 7'��g���x���D�ӱt�bv�z��y���9t�����,<^//��n�g�r�23��(��ի���~CVV&�6��a�Md��l�Z�����?��p(���)��#+3�W���z�+Wk8�"*�
���>‚��ؽ�c�f3۶lF%.��VҎ�v(�׭')1���D�X�����v�EQp:ǐ�8�c�jkhl������oG�R1�r��P��$T*^��NK4�b���8t��˗������N�f��ݎ��fdd��BRb"�`y�v���A��Κ+� ̜�V����`2�A��^��j����z�X,�������l�R�RS1D"'n���fFGG��2(�����

�(
�))�GF���o�ُLI����h�Z����r�� ��h�����������=�<������z��v�~?>�$PIG����A�j"�����W�N�$<^M�M_>;��`���`�8��$I����tݧ(*,���p�*��n�2��f��e3�L�L&}��I���v;�֍��#>�:ߘ+�rS.kA�\7�7��MLL���z��FQ����I��4
�i�G�'''Oٖ9;{��55�$&&����n4ȹi��/P�^�'+3s�w�V;�N�Ch�Z


�s8��@��M�C�k�#��~��b�� �6�J��[III������۶N
ԷCrR"�<� �N������ �%�$�`��o��ʊ�oe�m6w-]�lK���� � �AD�A�;��� �p�_A� "�� �dR�W�����*�2)#��(��#�E!���_:ot4��,˄��[���A���(�X�ps?,_Ƥ�/+2�NJgg�-W8��)���������~�O�������ӧy���h��###�;p 0�z��#������������?y��wܴ�����x��3�Ua�ٽg/W��Lzmph�'�_��azz{�P@�w/���9�� 
��A�GF���E�uc;�ߤ�/!���������t�����K���;�A�~?-�m�������CC�N8���(gϝG�����J]}=�]��a�$$��`0����]�L|Q�Z����jj�X�t)III(�sl���n�>��������Hk[;�px��-�,q���+5546516�6��W^c����R�����7446�z���008H(�K"�]��\�o����pxj?�(ʤ~�;>].Ă�k|^��e��i�w�eB��==�
�chx���Q$)�ώ��dtԉ�(446M���&���)��~�L�#!���͉�ǹg�vl��S+J�Ċ��p��q�	��2Q�U(����8{�
P\T�C܏N7;���?0��۶q�	>ʩ�g(-)a���TU]&���ޏIKKehh�>�*�˩���7]_O_?��E���p��=
���e^e%��C|��Gx�^�^����o���0D�Q��<IcS:���w�͹��v
�J���U��h4j�z�q�;N_?�h�ܜ�|�1�:ȥKUȊ���ARM�������v���h�����HMm���O9y�4---�X��o���h`ni)�޳m��7�S6��
���������NRR"۶l�_B�V���M�����~��j��r�m���SU]�Z��٧��xNѤ��q���X;�Uܿ}�����}5��>��DA~��y�'�U��y�8x�M�-�Uj���9t�>p??y�y�_�DCc�L�oBCcv����$,�<����ϸTUMSs3�1'>����8~�󟑓�E]�5<��ܵt)K/b��y<�Ѓ�B!��ani)�屙������~��u�r��h4:�UaШ5lٸ�?��O��`��yܷ��ٺ�5�VQR<�������3<2�����Fk[����<����#..nڳ㑑щ~����Ғb:��hmk�Ru5��\.�F�c}WE��Ӛ�XNeE

�4����?�9?z>��@ �=۶�}�V\n��P8��O>��M��x�~*+*(-)��~�

�lgNQ��U��a��S��F+��b�ŷ�dE����ōx���q��D"aE�n�a�YQ�U��賀,�TUWSQ^>1��Z��d2�(2�Ph��/a2��X,X�VE�Z}:����\:::Ѩ5�z�WV��������v�G|�N��h�H�+.[	� �$Lff��^�$Ih�Z�j5��A�$��j��T*T*v��ԔT�F#α1d9��d$#I��%�L��W�$*++x^�f��C����W�K��ry���x����b4�x��gb������h4�[�Sw�K�,A�$,f3&cl�Z-�		X�扼&z����xB�0�p�Z�J5^/I�-ې$�F�FwC;�ߤV�$��e����2�_�B�U*�m����p�9�HO�����_}�NKnNΔK:3ehh���>�w;��O�{�^"�����r�yT*	�JB!vKCQ�._�x�4
*I��r��*����؄,˨$	I�P�T(( ŖAP�TH�άc}����\9��dd��yX,f^|�U{�!��0��jM-����6�����f���r�ع�=<�����KzZWkk'�(i4�.Y��?����߃�ᠪ�2.]B�e$��ؘ��o�;}9QI��x'>�����_���Ͳ�KQkԱ��,6���7�%K����d2��+��ģ�P��H��+�$z�����NA~>���`4�a�Z��6�v��CY�n=I�I_h���Q� ii�

1�r���K8����h4JNv�DƙVS[˩3gy���V�hk�`�5�,������b6�HOKcph�@ ��O��a�Z

����p8�h0��֊�(b�X���###}�2#=�ke����Ɍ�j������$E���
��NzZ�
������7��ű`�<���HH������4qu55%������U%�^�f�����

HHH`tt������ϱ`�<FFGillB!�wY,	G>����Q��[����1<2BrRZ����n7����ضu9YYϙ�N�������ar��epx(v��V����F�����j"�������$	��CSsӗ��E�HY���@�@ �~��� ·!��_��gՊ�]��k�w����ǿ�=�حoJSs�}������8�_��|�G�P�:���L� wI�X�x�i�Hy�p��W���A�n��j�rtZ��L�WtG~A���JŖ����-]��7OJL������A���#r��IAA����/� w�A�"� � �A���2ͮ � ߼I�_�e�_8����-W�T]�?��7��������;���0�]�Z��?��wߥ��w�u��͹��Ӯ�r�yg�.�~��c'Nܲ�1��O����M�����RUul�W��/w������y�k>��P(�&��x<�������v�z�)u�m�p�s�/�r����E�uc;�ߔ3���9����+tvu���w��D�Q��1Μ;G8������C�(
�O�dhx���.��فg<Gu(�o��H$B$��N����F��+���S{�>�N8<� ����Fq��|������\�)��Б�������W^��W^���t���-���)j��uG�D����N>=}�p8LWw���

�edY���7D��Gwo/�Μ�6�=0�/E"��8���(�D��ϖ��r��[���cs�G���a��}��A��(�Phb���~3^��$����7�9~����h�lڸ��2�m�J��ŋ�oh��*�Vk�QI��ŋ>z�H4’E�X�f��y~txd���n�z�q�_��[���X�|%�%�q��%>�^�'��3O���G��˔�R�p��\�����3�Z���ǎ��zY0oee��^!�pϖ-�^�rb�Z-�]]���rג�b`A����a�~���.��������>,3###
���aF�N�y�	�._����@ �y�lݼ��g�r��1�>cccH�L�����;�����p�<L&3��<���\������U˗�����_Yɶ-���G7�3���M���\�TE||۷m���'
������#-%�_y
E���|:s��t:y�駧$o�V��O>�b�0::��n�5q�j��}
j���6m�˲LVf&�޳��>�M__*����av��>K/�-[���}����t�&4��Kd��c6�x�Gx��G8p�0M�M���34<L0�٧��`0p��%�M��TT��V��x��wv�b���8�Nj뮱l�JK�Q��3O>���9��Do8Ҷ٬l\��
��Q<g�L7� �"�Z��+y��PSW��d�|�\6�[��w�t�b
��x����ʢ�����L��>N|z�����}��g�V~��S��io65�Pw��eK�PR\LVf՗����˙s�p�m�u�O,SZR<��X����[Z���i�x�Q�lV+��,Z��5�V�����0]�lٴ�9EE�z�斖PXP�S�?NNv֔����2�����ֱ��5"3߷`��F�a����\��^?�J���l�R��7����t�=n‘sKKp8huZF��3]�������ܒ�f*���NA~>
�t��V��$���x���INJ"����@QA��u�:�X����(�����s��qZZ�X�v-&����$����oh@�F��M��l�R�ssq�=��t���-R�T$%&�����dB��`6������p`w�a4HOM�h4��h��Ƞ�����457�D�������2ɢ��|9v��V~�ԓ���N܆|���Z,�=މe~����4���崶������`� ?�F����h4���GJJJl$�Z,���p88s�Z���HZZ괹OLF#f����x������S�9�/";;�s���Lw?p?.]����FNV��x�ш�fcNa�L��
�k����ի	����c'8z�8V����lΞ??Q�뗩����+W(���`0�eV�ZI�������+	��������N$!*�(����H�DWw7~��>�,6����d���g�IA��)�%��$���̱'��s����$�x��]lۼ�H$¹��a�Z�N'���:v�
�ۃ������B�Bzz	����6�j5w-^��_��kILH����`0@|\<mm�:�;~�崴��~�Z�8�˯�NNv6�s�P��"\����������Ғ232�$x��wپm�445!I:��5�V��k����ŋ�Y�m�~�G�&//�Ɍ$I$$$~n6;��DvV&III�L&J���STDAA>��%H���ag�������t�hhl����-�6��hH��G��b�X���m������BaA99٤$'c��HJL���
�W� %9��L~^.˖.E��b1�ub�Yپm+ii���������b!3#���$$IENV���(�w��a'''��ARb"9�����I�O ';��,T*�\�TE8"1!�m[6S��OQQ!~����]K���������@\�������c��֭$&&b1�9}�۶n&';�@0Hww6{��s8��e9�l�J~^.�y�����h���&3#���,�I��X�Ξ;O||<�l޸���x232Q�Tdef�	
F�&#Y����A�ד��@rR���ɲ���v�$I��!FFG�v��CY�n=I�I3�Ϸ�GE_�2����6��ڂ |KdY�?��cՊ�]��k�WS[˛���_��_a��n�~7���߿���o��W�o��+����:��$��zhjn�3��L&L&�WXS"!!a�w_�;�����#��p����!v����P�ʟ��/��]�V�x��Ǿ��V._�Zs��qzZ?��ť�YN~A�Y�����2�-�b� � �AD�A�;��� �p�_A� S�I	)|u##���N�������?0�5�Y� ߴI�_�ejj���z�]8r��az��������ߙ<���:}��'O2<22��粒���-�S��~��89m��[������ƅ��b)|�Q�ٵ����O�o���&~����^~�u��7��ȱc"U� ܡ�O�;S�n��`���N1v��O��W��r�_Ss�'���U#�ZMGg'o��n���w�K{g��T��e��M>¥˗��^:��&��n����&`�ϧ}]{G��;���|��Q�X楪jjj�8t�(�����AcS��S�-//�@ ���t:����)�wS��8�M'��oh�wwM�g��go쳮�O���q[W�������=3<<<mY��:́C�g���/a�s��h��W��T*֬�����
����=���������^�|�	�@��kV3���Sg�p�R�����>�Y��\.Z[�x�{����or0�0���tP�]���ٳ�Ba�Z
�>��))\�r������RU]�? 1!�P(����ч�`0��wq�\����`�<��(�$M$����#9)$�W^�
�����KWW7�߻���lΜ;��ŋY�|��=�?xI%����2EQ�ZSñ'Ѩլ_���+WY�`>�99�d�yyD��;�F���իy�9s��CC<��c�s���Z-�l݂�h�ww�$�qzz{Y�p!���=%��� ;w�7����^Þ�?�������nGQY��^z����|�FG���CN��O�]��'!�2�@`�#Ǹ�8�m�LU�e��~f��7��I �n��o����;���f�QVZ�fL���ގ�BNv6z��uw�����{45�p�����v�{�*9}�,�H�k��TT���ڊ��f�]K9r�8%��De����`��WQA��R��+Wk�T]M0D�bGѝ]]��$�����?�淼��.2��Q�T$&&288�,����_d�…�&ˡ _M8�>��1�<�.��c�c�O�:�$I�-3���FaA>Y��Z��H4�k;���p��ٹ�=�N'��/�t���q��h�>��c\n���p�ݷ\��v������������\�z��~`����J^n�,]��F�0�M	��J�����w��b�L�RFZz,�Tf&^����vzz{���f�a�Zy�����ɓ�|���'�՗�ST��jA�Ӓ����ˉF���Ѩ�$���̫� ;;��O[{;�`���b$ ';���sIIN����ܜl|>_��~u5
�Mx�^���eKH��h�Z�
�׮���	�N�y�0��Z�F�$��{HJJ�'[���V˽��ARI8t��Ͳ�Khji���$'%����=۶M,��zHOO'11�y�|~���ikk��vc2�P��sP>���,232)).A%I�'j�˫T��~Z��8w�".��ӉJ�PI�L{�o4IKM!#=�9EEh��i���oʥ���R

?7���(D�tX�V�

P�$/\��dE�h4rג��x�m�;:HI���^����~��*^��+W�r���JEZj*�HE��F���*㗸�\�!+3��FT�Q����<�
��:��ؼq#�M��r�h4�N�������F�33imk�����!*+*p:��9w��˗3444��WBVd�RS��t3�n� |3B�0�,�l�Rv�z���Wx����s'��_�� �щe._�ʜ�"���9r�s�����!??���|���$)�g)�x�EQb}S$�ŋ8�cX�f�&��+��訓��O�~����~^�V
��p��`0DgW��A@ɜ9�l6�._�܅̯�'*��G{�o���0��H�DJJ*��sWR�FCQQ��ܜlzz{imm�l6����Ūj�[[Y�p!��Z�~X}C���o�=h�ZB��N'.�����l�5yyy$�Ǒ���,G��l\�t����"3#�h$JJJ2i����jILH���������(-.fni	qqqT��166�^�����c'N��������'';��IRR����+W������^��‚1�� |���(Wkj�r�*���lX��͊�妡���y�V�ի5ˬ_����tdEa�9F�ܹ�����LOo/I�������h)*,$��RINJB�VS��OU�e���&�����j���`��*����rL&M��$%%1���Ԕ

�s8Ġ�Sw��##������D�ܹ�=�h���|��oɲ���)I��C��� �xk��~�z����PeV���Q\n79��_j�h4Jk[��9~U�h�^|���|6�[;齫55���n����n���3A�7��
������z��s�Ně�~s��/|u�H�����o�H����ۛ�o�|y��∋�����j

���P��<�����<������}V}A���O���i�I��g�ߜ����H�-KLL��u�Ác�y�ۣ�jy���g�Qg��#� �Ш�$��D@n'��GA� "�� �D~AA����/� w�A�2)�˲LKk3�P�+P}��D�z��K}Cô�6�##465Q�Ѐ��x=344|�t��p���|�[�p:�hjn�e^jY�������� |�zz{q��}#e'R��4Y�&��_�Mg�
.��ܹ���B�9w�����8�)����ᥗ�|3]�?���>^|�5::;'^��?�����/��A�Z}=����_�P(�K��JgW�L7� 3`��TU=��'O������{>����CC��ʫ��\3]- vr���GF���E�uc;��M��K�BAΞ;K��h�$1*����>ڻ��9EHR��A�$���9}��P�����9����hljb��w�Լ�w���.�z�JlV+	�qtwwSw��ۍ�db�u�l6j뮑��Nww7G���v������&��y�����?�)���s���̌.����������ewQQV6��"·��vs��)/�KJr
/��:��	ܳu������tuw�u�&���������3�<�
������t���t:��e��s��Y�{z(�[��`���x�Bzzz��ꤼ��s�����CYi	,��r�/���™s�0�Z��+55x<L&#6�
���Çq���zV._Ρ#GymǛtuw�m��)q�����^}����o�ʊ��&�d(|�i]�`0HWw���3\EQ�_YI��2�yw�@I����ktuu�x�ŗ��M���"PTKz����N���^������Op������#!>��g����3Ȳ̕�*+*��|����Z�����T*�54p��T*n��7�z���fFGGy�ŗP�ը$)��OQ����`г㭷	�_�b� |'((��&�RSy�wq�\�6�
�шN�#��@�Ѱ�O��V����/�LCc����H8B~^^,�4���x�w�݅�j%��z����n�?NccUUռ�s|�H�O��F�����_G�Ւ��@8��ߧ��$��Ǐ����[;�%
R}�
o��.Z��^O��1m�Q�N�A��j�`�Zg�c�#L
�dee�v�:l6���EA��p�}�ur��Q$I������~}�!�z�1B�0-��3]�	��\%/7�͎�h`�<�Ѓx}>:;�Pk� I���e�F��)btt��\.e������b6�����d�Z����sFR��oh�܅ܳuEE�T����܌���&+�/c�ʕD"���L7� ��Fâ�X{�=����/Zļ�
斖�����IKME�R�d�"�{�i�VG�c�9���˲�K�X,(�l'';����]#
QR<�F��s�inn殥K��l�p$<m࿱�p$Bcs3Z��Gz�5�Va�X�Z,l޸��w݅F�����űm��z�1��HKM%3#��7��<5={zZ�9���8ۿ���pVf6�6l&99e�E!���=��3g?��s��yΞ�@4��C�	�P�k��,/G�Q�����D"Q���F�(�LT�岾�gPS[GJr2		�D�Q�Q��\�2\�9�x��UT�G�r�X{�*+*xg� �2r4v�(
��
��$Ge��X?�(

����ZZ���E�R�RSØ˅"ˌ�:iim��0���/��ى��'
���Dm�5j��h��D�Ѱ}�6y{绨�̟ώ��F�ѐ���N��m['�q�\44~^9�0M���Ru�2��}�B��+��,#G��@ dpp��k�p�X�V�X��n7=�����Q[w-6���E;��==���,���=�7yyy�Mf$I"--
��q��c.�Z-�s搔��V�E�ӱf�j2�Ӹx��]]lٸ�����"�SSS3.]��E�����A{{M�-l޸���B\.799�h�Z斖222��h��+,Z����\�GFp��dgf288DyY�`��@ZZ=p?M�-��n.\�HGg'��g�
SR<������e�^�������R������Ԕ._��V�����@0Hmm-9�YT]�•�W���b��E���n�v�O�������$�

�p��/_���Z�i%Ib����\�^��Ғb�f3���e�u�ϝKcS3{�~̘�ŚU�HOKe�>��ew-EQ��8Aww799�x��X�&�����p��q���q{<<x��������#/'��+W8t���him%';���._��F�!?/wVč�"Y�q:��>]&I�p����o�P֯[ORb�7���Gk��������*'�+��(
Z����B!.UUSR<��E77��H�H4��`��&a���ϔǯ:��Y�`kV��ԇD"@��L,#I�&����@���_����?'#=EQ�,�EʁX�(IҴ}gSK������RSR�h>?.�2*��+�7lG��Y;�k"�����>^/I����o>;�l�����H?#���/��t:�Z��+��V���A�����Q��`4��8��wMwO��@
�1N��餥�Nl��e�H9��^�TIz���|���T;��H�+�0˨T*��Lf�7R^Ey��U��s���~�M�Ο�D�A�e$I"�������^���� �^�	AA����/� w�A�"� � �AD�A�;Ȥ��(
��}��K��x��뛘�!
184t�<���ghx����Ii�eY���r��h4����5�(
ccc"G� ��n�P~�p$�������&EQ�߉8 ��eY�ԙO���r���O�g�/�|�*�m�ï����I�7��W��m���_O���]������կu:����/��2�]�3]uAf�[;��bUդ�jj����b���]�H��A~���}�>�v	�ü�{7���S�"�����o�\�.����&���Ĥ)>446r�����!.������Y���,s�����.�3�Y4Um0�wS>w.5�ux<\.6�
�F���l6���ĵ�z�HLH@��p��FFFX�t	5�u446b4P�T�=�k�
�$����v��/��y�Ym�����{zc9�m��n
Af�'��������p�_����wt`�۸V_�^|�VK\\��������֎�`�l2166ƙs�t�����^�
��4i;�H�k��\����1<2B�Á�餳���B}CWkjQ����)�{c9ח����b�Yini���3��T�HH�����ֶ6�����sp��Y^x�%�Z-���S�����M��v;��	b�����?e		��ŕ�+dg�LIأ(
w-Y�^��?b���s:�y�����8~�$?���f�N����u疲{�^��X҈���lڰ��G��r�ؽw/+�-�����kV����e*��9r�8'>=EEyV������F�^/�����ͷߡ����EzZ:�p���A�GG�T]Ϳ�W�RL�+w���`��̑c�x��Gu��C
����������s'�`��āC��ɏ�g����h��B�&@^���7�d��J ��Б�����>v���*��x��7�_Y�����NM�sc9���p8x�ŗ��ͤ������o����RXX����P�w�����2�r���WEy�		����zE�$))�#�m6ep��Bbb"��Z��:��TQ�=�0��9w�*�����54��3O�g?�)CC�446�t�&��Ց��F�#��̶-��ُD��-mm�Ժ�BA^?x�)斖������0CCÔϝ��aG�V!II�X-2���h4����r������2���b6s�=�x�{q�\�|��]A�{�-��^��>�$I8v�JKٴa=k׬�%�)���'�$7'�h4��+��_��}����s?x���ݎ�n�v\��b�l6�HIM����`0���Z�]���E��m�M�e�RӦ���IKK����P(���c}�!�L&#=p?[6mB�TD�QlvO=��>�.]����‚�y�	r���l'?/���%�p���~���$6��D~^����B4%==��[����.�F#j����n�����Q�V�L��]��RSCey9:��p8�ؘ���n"�X�<Y��F�@�P�5 I����;HMMani)O=�8M-�����>ͱ'()���d�d2�ԓ�����|���~ v�:��
�kZ�DAZ�b�x�Jbph�1��JE(���ob`$��
����#�����t��F"
300@��N����T���3Ȳ�_z�NGEYo���H���|�R�x�2/����ah����+(���㡯���;1�P�V�gדAE�	G"�:��zz=�P�޾~�.�����
�V�T�v�� k�3�R�$�X�|��I�`�XIH�Ib�]K�Z[KWw7i��ܳu+�`�����t������cni)v�����'�b�2�
����f������9��*.UUSYQ�Z����x�p8�ܒR��>w�		񌍍q��Y�n7��%8�v������Z���Q��YA�#%&&`��Q�T�$'c4X�d	��M|\kV���x[���H$���̹s�q�z�j5o���р�b����q��(�BQa����<u
��OEy:��%����wy���0�͜�x�}�,/���˯�v{�-�l�\�U�����^���lݲ���dt:*����d=]���w�G���g�V���)�S�;�v�̓Op�j��b��'{���.���p�m�fq��6�v��CY�n=I�I_h�H$BT�я�f�D"��a�$����4k>���V�ZZظ~*�
 @0@Vv;��V��e�^O(&p�q�,^LJr2�@���1�Z
��JŨ�I4�l6��h�x<�B!��z=�@`"�e(B��Ϛv��Q�ը�j�� :�I��|h�j�p�@0�N���������˗��F��xD��-W�N���C̥�h�j���\(�B|\����a������~BA~>�p�x9�qq�T*<^�,'��t:Q��جV‘:�I�Cttu��z���92��'���2��á�ǩ%I�d2�V�'����D"Z[[�D"Sޓ$	��CSsӗ�Χ�h&���h&�]�y��l���G~~���F�aJ�k�My�uZ-:��{��ez�����I���4:��/��B���7��R�j�Z�Z��e�8)7d鋍
pL*w��I���~�|�
II��صZ�e�H9j�������G%��hԤ��N����1�S�e��R�H�+�0˨T*~�A�㿑����y��I'i�Cjj
�>�������$� �,#I%s�|c�se}��LyYٷ�-�#�A�"� � �AD�A�;��� �p�_A� ���(��'l��H$B ��]�e���wb��h4J0$N�^�(���V�+�1�Κ|ق ̬H$�嫏�I�Y��ʲ<��G��I�_�e>=u��֖[�p����oiim������o�z}3]�?�������귿���}���>�1~��	t��������V::;������t�A�v}�!U՗'�v}�}��omm#����~��3-
��ޏ���?U��A��n��_bhx�ήN$I"''wJ���KKk;�~����F0���Y��D"��FIMI�4S�L�D"\�t���<��7Pu�2�P���l�5f�	�����n��AOVV��Ʀf�^��/s���>�,9�YH�DWW7�P��,�j5�==(
d�����K[G)����Ǒ��)2O	�����ۍ$Idd�c���˯�����������7v��xN�p��K$!99��H0������6�;:���(�2���������Z�F�e���x<����96FbB¤��-g|��GWwj����t\c.B�rT�d2"�2��Z-)��������o��U<g�}.�k�6�_OJJ���6�2��������|J||<v�c�J+�/#�p��!�
Q��q��=\�R�J%����SO<>k����`ph��|��s�bWkj���`Պ�:s��χ���ή.�}�i/\HUu5E�444RS[Ǟ�?�����\��!!>�-�6r��E�IIIf��M�T*�]C������_��/��͝�aD�Q.\�D__�P���7SSWGGW'v����!�h�^�f3������0���<�������446�R��z���ںk������()���`�¥*���c'N�����Ky��7IJL�xN!�l�:���)�S��5�y���&%9�������F����\��;P�ռ��nR��u:ټa#�]]446�{�^l6+E�������CMm-]�8�v�ٺ�Z��i�駂��b�ln9f�Դ���`��x��G8��)�]C�V���˩�gy������g�~�H��'1!��“�=�?��?�������~?�P���t~��QTXHSS3cc.�{z�k�V�\���
��?!�p��%~����?#7'�ۍZ�������ddEa�%�ş�qǗ�M ��F�a�M��'?���ǂy��v�o�ʚU+)�3����'���xX�|����������y��7�=�@0@ �'++���%���eph�Ʀf�\�J�ܹ��a����̝���r��R��H_?�����G�=K��t�=p?�m��`(�?@����3O�ȃr��Q���().��~��i3��)*d޼J�߾�m[6�V����۔�Z�ܽf-啷��Y�e��j�
��C|��K�域
dY���e*+�1ȲBT��Eo�������Ɇ�]�d4���=��*��F�D"(��J�B����C�r�rv�z�S�� I�����h��#�w���N����jA�FM8&���jQ�H86PR�0X,4Z
�p�hT﷢(��"��~?~��`0DiI	?z�y��;x��װX,�)*b�{��v{(-)����=������k�|��-���^����DQ����%�Z,���i4Lf3f�EQ�hԀB8FQB���v��@l��p8��
~>ߔ{�˗�$1!�5�D��M�s�����!2��Y�t	��x�Z������t����g``��y��U�vD8fNQ!��\�t�F�NK���jA��T]MiI1j������0x�%�|�1����/���ɚU��RSCoo/��Ʉ���H����A��:��}�^�G��PZ\̾0�,[�������+<��c��av��/a�Xغy�p�߽�G�V�H���ko����FQ
�������*<^v�
�V��ŋ�������`�Y�p��'OƖ��q�]���[�=�[�c��l.gϟ��������k�ӱ�b��`@�V����/����f�ʕ���ϋ���3O<AM]/U�R���<�Ѓ�--a߁�l�(�B�fҎ�v(�׭')1��p=���<.���GJr2�,��ߏ,ˤ����C?��]�bU5O<�(j�����7�����X5�tb4&�6�r�d�~�^��ܜEaph�@ @zZ�h���N��(������mdef���2����F#�Y��X��oxx�A��hdhx8�^WQ����l6������91����W����d�Brsr�Z��A::�P��B\��޾����&�	��<>�X!;+�����(��?�~���)/+��v��A!'+��@Ww�-�����j���x���F�Ւ������a���hfxx�_��x�����$;+k�#%9��͘˅�Tj�RS�h4��A�j"������$I��������w���7��‘�z�ܶ�/��_��gՊ�]��k�����:r��/���>m����o_x��o�M�F�V}�G��*�:�m�_-�p3I�X�f5��HyZ���6�G����غi#�Y�4�0=�Af�JŪ+���/\��\����cúu�};��#FP� |ω[�D�A�;��� �p�_A� "�� �dR�WP3�O� � �&��,s��i��zo�™s�O��o������^y��7�u���_��[���W^���k�u��ɱ'�?�u:��˯L�k�Oimkc�ǟL�>S�;�����vm�kccc���\'�H�ᑑ/4_���/��:c.�LW�P(ı'u:���E�uc;��M��%:;;9x���Ӯ000���y��]D"�nWjj&f
�|x��{�MdY��3g�B\��������adY&*�|~��0>����a\.�DZZ[ijnFQ��/o("���966����e��$��?��OO������ȬkA�.]��������l�!^��&G�������ί~�;���	�ÄB!�>n�g��Q���Q��;��|�P08��#���AB��P���,s+7/#�2��N�N�xY�^/���K$&�x&y[��z�,xS;�ߴ�tt�s��nކ�j��J%����������3$%%M̌w��?y9*3���m[6�2�߷mhx���^6�_ϧ�O���:r�E0o^%
��h�>޷�ݎ�������9TU_������y�ew-e~e/��*w-]���0'N~J `��MX��|�	:��uk֠V�hlj��W^������OQQV6��!·,�p��!��\!>>�����㜿p�����0'O�"�̓Or��9���	�”�����q�	N�:E(��w?M2�ֶv�|�mdEa~e&����f~�ܳ�=w���&V._��ヒ,�T���}��)��XNey9�7n�wS[[GBB<۷m�7�D��dd��t�a���w� :��`0���˥KU�z���)�3g�v9z��/\���}�~��V�?m��}
:����tt���;ʲBjj*�?{>����N�*��|�w/�֯������457�t�&465c1�HMM�j���O�����455����ؘ�N���#��|���xK�,&#=�OO�������NRSS)-)f�
$��s��._����fՊ���� +
y���ş����,���g�)A�����������vlV�**�g��n���˙STď����\���)--�<M��+=v�#ǎ���%�7���n::;Y�p!es�(�ϣ���ή.�_�HzZ:���ttt�x�B*�˦����r*��ihj���e~���xq8������5l޸�����1FGGy��Xv�>��#*��(�3�=����S�STP@�x;l޴Q�o���q�N��%wq��e)eo&�2��ZC�>���O4*���I�ÎF����t���屪�j����d4!�$�&#))ɀ��닥��$l6�		���!�2
�M��j
��X,�=��{����K|\;�~���A"�(�p�-�6�������d�B2��1����v;�P�k�G��I��a���1���*=f��Ʉ�lF��b���j���j���(,(�f����K4%%9�@ �Z�E��ΟO4*s��1������L>�1CCC̫��b6�DƗ�Ɵ��g�t�[�S��@y�\�Z-�))�t:\.z����4���b[�e��HOG��q��q�5Z��ݎN����z��x;��ߎ)���%d�g�2���F���p��E����HOgNQ/������d���f�~�����������Y����@^n�Ο�[�I!v�S}�
E�_�t�w� ��?�
QY�����0~�*q��U��1��X*ʌ�t�8g
k�EA%������ѣX-JJ��鴼����^"��N�����H4��
p{<�����lx<\n7{>ه��EQ ;+���$:��0�����V�Y�d1����7�n�D|\Wkk���-�t�s��'��|Ӗ322JQa!��㷿���l�WV"�
}�$I��އ����a��J����j5���~�~�[Z���C�T�:6o�@Vf��f3�W����	���������<�&3�$��K-����<��*�˨,/'++���%��62��ٴa=6��K���������(֭E�Ր��E\��ԔT�m�DJJ
��9PXX@b|<I�����PSS˪�+HJLD�R��x�ч�h0�����(�r�2��	��)�Æ�w���H~^���$&&����}��� ߞ��rsr�٬dge���AaA>I�I��$���NaAv���4.\��A�� ?�m�7���Aɜ9��:��Y�b��iD�2��q$%&���2qE %%�m[6�p80�L��p��[���������#5��eB��疓��NIq1v����,
���LO�`0�����d�¥K��QY^��5�q���c���L�@��0$'%���Dfz:%s搔�Djj
�qq"��W$�2N�s�'($I"12:����u�IJL��}�m�� �h�+]Fr�ݘL&�j5�,�O�#			<�Ѓ3]-A�������ƪ�+X�z��.���2�}�!��_�5f����wSK������oq���f�	�����N<ew#I��x=457�iy�z=_5��j��9
a0X�h�LWI��1	(�[JRb�7R�����Zz[�>��lf~e�y�K��q�MQ�p8�V���A���è�j�����8�
$IB����n� |)�;X[���?AA����/� w�A�"� � �A��h4�U�A�;`R�e���K��9o�Bm�5^}cn����A�~��sS;�
�M8t����``�u���՚ډԕ7�z����䚾����ħ��2�t8�Z}='N~J()z�r�R���^��D�Q 6�m0�����ۏw��C	�#\�����Ly����vn�)g��
�9v�kl������/��=Ȋ���0���L�4b6P�#ǎ�����G����>��Ʀ&‘��
�&}	���9v����F����(
���1��#��G��:|������K����<@��Б#�����eY�y���9��,����^z����I���ӯ��a���9x��-����/��O�w��t�D"�Iω��7	��؄o�O�e��LW�����7�9~E��v��Z��0�&��V�Y0o/UQ>��N�Z�F�$j��8t�(�p��w���%�gM��s���{�a�._�o�/�kV�&?/����+W8v�dl	�~�q�23��|�‚.UUs��EFFG),(`˦�|���##,�?���zz{���ܻmK/�^*����FFG�WQ!��;H(⣽s��y
��)�;��>���J__?C��|���x�ǸV_OGg'~�E�v�j.UWs��a�cc8�Ɛ��_y��݌���p�|�f3�}}<���\��������b������`^%�6l���X΂y�ذn-G���̹s8�v�ٺ%v�7 1!�`(DOo/[�P(�$��o�6�?��?�����|�9E���Sw��]���j���{�'�yP�mnѺj��s�ȲL~^�6l`�{�1<2�J�bdt�7�z���|�-]��ޟrYk&����V����h0r�{ؼq#��졩���k���208����K(���B!��WY��Eټi#����l�[Z8w�K/&?7�Z�<p���)*�Б#G�����iY�b9�V,'+3Y�}WFA�=T*�.����h�Z�
�X�|˗�ż�
������m���r��Ul6�W���9s�<���˗��Ca�Z�=�������~)/�ĄΜ9K���N�A�VS��ȹ�e������������غi۷n�d2Q[WG^^.��Z}��}��ֱl����x��]���ý۷���6e;Y��b�2�ݵ�J��nS��J������W�=�l�:EQ�{�*�`�G����I0d��8�vv����y�93]G.U_����قV�!5%�‚�~wC�è5j$IErr��4¡0-�mD#Q���d4r�R�g�ڻ�x�q��斔�쮻0[��dg��hkoG�eT*�$�f�*
�����d�\
�S�Udfd�����dB����HOK#%9�Ϗ�b�0��łF�� ?�˖q�I�����,/����l|��ch(�/�����}������#G���a��-X�Vc����3e~�˙[ZJjJ
f��y��T*\n7&���sIIIA�Հv����2��3��_�����j�0?�\V�u"#aJr�LLw�)��������
�QY�`0��C����J�"-5���d�yw��!�Nr6p�=�����c�"�$�>/g�]��K�:2328w�<�"��×'�y��Urs��Y�:r����ȃR}�2Z�n"�o���,\�9E�ed%v������y�G�X,dfd���1�M"·L��D���}{	���8N�9CJJ2)��(���{Y��n��(5�u��<��{��C���.lvn���˅��(�Brr2�
�x�t���G|��K���-�V,'99���v���M7p��E���TUU������;�v���EvV�R���#���=w��V
��HKM%��{��l\����>Z��QIZ��ŋ?���I̫�'E���������<�&3�$�������k4R��IOO�j����KvVe�����v�Pk��{�6RSRf�~�74����=[���j���CAdY������f�Q��OZj*��ƃ��KU�Z����T����Z�WlV+
�Mh4�o�JfFz�M��0$%&����sl��|1_� �t:-9�����a6����%?/0������x=dedr������Z
[6o�l�\��r���C��ϫ$;+���&�^/�P����f���a�_JKK�b�p��I�n�L~^N�5��h��ݶ���x��߲�{�n�x�23�io� ���C||y���FL&#��S�ϠR�HMIa��m$''������!;+���!z���BD�Q���(�3 ��;�㟾Y�q:����$�P8���ȝ����v��􁈢(����^�U�
� |9QY�����+��v͚�]^}c#���������۶��--���?���E�m܎0=���V����ד$�����}A�4�2q��Mho��}�NGfz:���h%�+�#� �w�Z��g����u�\���w�����?�2PP�]ħ#�0}��s���4>`O����IAA����/� w�A�"� � �A��٘YOA�oƤ�/�2
����.�����'?%8��vl�ť�*���L��O��맦��+�Ȳ̕�'^��줱��k(���2xCٟ�s���g��A��Z���F��|TUWO�>Ӣ�(�==g��o�U�%N�>9%��u�����?r���{zx�ͷ���<���L�x�������.��������gΞ�Б#_{�dY��Q��8�(?q�����?���]�}'�X���n�._�4�����`dd���݅���e

���wp�\3]- 6c��'u:���E�uc;�ߔE#�0.^@R�X�lz�a�j����<�<DiIq,�JB�$zz{9~�$�P�eK�RTX��w�v�x<457����������l޸���&^}c��^�{磻��CG����FyYkV������HzZ*���jM
�

̫��ŋ466!�2		�$'%3����G�R��Owo]]�|>/ZDy�\>Lm]=��ql޸�J|v@"I�`����b��$;;k��P�������\�r��def��;���c�����Νtuu�}����KW�g�FEy-��<|���~��1�i&�q��>z���^�UV`2�s��f�*���imkc�;q��^*��Y�|��SYQ����jM-��>��lb�ڵ�=�Nj�jE���r�8u���N�uk9x�o��]]]<x��S2�665M�ý�lc�ڻE���LL��/�Fp��O��WQ���宥Kx{�.<��zy�W	��ج6^|���=����wt�FINJ���@��044̻�OBB<��‚RSR8x�0W�ְx�B���P�TTUUQR<��:IJ�K���C._�J��+dggQ��ȕ�E�ħ�������
ǫo젽���F��R���r��a��ѨLyY���8�i/��ݑ���y�p�^B� �		dgg���MZj*����b�V+����o����o��]K����.�ͥ�j���GnN&����ct���'���ի�d|�q�}�\����|�r���[Z�J�b��C��a���iZZ�x�~�����;;�����SR\��n����A||<��Ydgg���5H�4)�O�w÷E�(
�1'�$Ų��))��W	�L�g���i5,]��k��\�t��Dk[+�`�U+�a�X�r�*���c9�g��ku��d34<DOo#��~?I���LFrs�)-��V������*)..BQ:�:�������z
�)**����ˍZ��ln)�sKimkE�`hd��@(�h4PR<������Bݵ:l6gΝedt��(Z��@0���g��U�c��a���~�?}P���h40gN!��$��!�29���呐O0��C��HM��-m��<}
Y��x�B|>		q�K�7E���8R��9}�,[7o$3#��ȅ���q�:���lٴqR�3}9��o��h�|4j5n����D.�GB|<6��p$DFz������ΝTV����Ci�E��-I�9�䓘?�|�+�����W$��L�H��]�( }���JjZ�ěrT)6W�t���ר5D���e$�*QYF��%g�F�H�
�jv|�����Ñ��A�֠�j�D���*$I��3J�A5�D9*�j"=��ԪX����FQI*dEF%�P�TD�$IB�B�*I�f��H%�n�L|P�4��� �=7��Y�'^S�h4:1U��z�q��_�Q��I�X2����X!��jQ���@BQd4ʹ����e��A�f"���o�T��LāH$�Z���T�|U�H�Q�(�pfAw��hp�8ǜ�f��M'$I��l������d��QA��#/e��c&�A��'	�O�-��2�E�`?q;6�}V����>F�NJ�c6�fl_��(ã#t��b�����S̖/�l�A��H��M-3ȲL($=5u�d4
�(�?割o[8����|� ��l�-��@���/I�bO�(b�^A�b�Ϩ�ݥYq�a���/� ��4�YC\�A���ןD����Gݦ{��F�H�Z=��߿�e��.DD�щQ�j����y�$$�A��i���n�����L�ݽ=(�Bn��o|t:����Hfzv�}�r>�YQ��ֹ͓��I��4ex�$I���184���G%����n�O:�Pe�Ǻ�,�4q���l�)�s}-�A��H$:)�\$A�e�� �`�H4��`���#�2��χ��AVd�^/���V��h����p8LnN��p8���E��b6���ܗ�vN�H$BrbY��ttu��ۃ�f���0��>��Z���E���D�|�FLF#�`�P8�,Gu:&?/��:�ʆ$� ��}6�e��e���0�
D��

q:��`�
�&����yINL��vF�����
@(&+#�Ą�/| �x��F�>�-�h�Z��0y��tuwc��ILH���������h4��a
�`px�ق?���008��h��Ũ~A�{�|�ٸ��;�[R��`������A���(�/�l���l6��
�`�����������22:�Z�!2<22�.L7�@Q\n7M-MCA�22��Vkp{<x<^��axd�F�?;X����5(�9�(-.!3=�8����B�z��;#� ��0]��j5�~|~~��׋^���ӥ�#�)�Y&
��֊"+�
zEA%ŦP�D"(Ħ嵘M�dg���:�پt˫�ii����@I��iuX�
��q���9�D´wv���F�A��~��#.�_j5j��J�,˱��y_�ؘ�A���(���c�X�ol���j����$E����:B�0�i������McsSlT�&L�7�p�F��n����ю�l!����N��ht�\��-��h4�T*�6;�IIt�t�;0
�r���N7>�P!>.�Ą�7,��2j��x����DskK�v�4
$}��JIIɬ��HA�,Y���#'kjj_y��]Qt:����f�� ��FtZ-���FcI�tZ�H�@0�V�E�ѠV�'�^�'�����Ao�tO]�$��(*(�g��0�$��h�e�p8�N�#�Q���zT�X·h4�N��X7�l��)ۯ?
��e�^?)M���ٳ���>A��V��T*��|�����q���3�����t�{��gYn�w�V��Z,�E~���Z��^�V�Eg�Mڞz�2����t�-s���[5��� ���L٫�s+�Bbb"�T�S��ͯ��+��җ���E�����bA��Ύ�;��@o�>}[�!� ��'Ij��` ��K使��rC�*I5���^���E&��jE�A��$I�f�1<2ʨ�9ӻ�,+�lv$�`�j�N�c�
4X-V�A�����Ao�5O��8O��l��R�T���/� |?ܜ4g6�1�ϖ��� �D~AA����/� w�A��$IB%�2)
��T*Ul2&dE��FEƶ��@�h4z�6R��e�K�ۍ�B
� ·�s��磥���KZj������#�g�Y���줧���B^N�I�BQdn3��JQZ��c��P0�����R9�dxx������H������B�B!�f�d�� �p�e�e�w�{�����hjn�чD��M�ݪ����$==��wv�l�)�_��+Jl~��d7���Uv}����6Q�����g(+-!>>Y�'m{ll���W�k钉@��_)P�cy��_�P���߼o�,�ҫ��DHNJ��p�����d��g|_���x���0�\����ĉ6�^�$IH@t|���76�����͛�h��x}n<��*WAa:���p���&�ݽ�5�V�F�$�s.���K��Am]��F��+WP�(�˾�9�"�1'+�-#r��%t:�.I���2.��A��l�����s)��bW
Z��0�L<���)�����;X�`�ݍ�h�¥KȲ̼�
�yg�.���Y�b9###$&$bwة��cNQ�##�]���lbт��w��ʔ�͝�:��M6�p�<@"
q�	�^/�h�eK�p����k�ذn)Iɴ�����B\����2������#SY^FGg�����a1����������>F�NRSRX�`>�H��/�r�0��,�W���R� ��U��M,K�d�E�9~�$ͭ�X-V���d.���4^}�M22�d���{����j����AWw��$'%��tvw���O���/�DVF'O�bdd�N��OO�p�|�z=��`6������D"aR������$���G8������.\���d����ܜ�23�p�^�F���^}����|�E�d2���B}C#�N's�
'��Ç�RS�ի�
b��Ë/���BCc#��X-V��7PVZJ(��={0�M\�r��1�-��:s���d���8{�))�X-V���p����a.^�Dzj*�"77��g���܌�j��}�))�CbB�LWA��0I���Q��6l��s�SS���/���ʒE��Vϕ�5�:
�����#
�r�q8��--!/7�{�l!R[[���0##�tvv��$&$�q�z

)(�gݚ5��� �<����?��/X��n>=}�wv�GA~Y�_{7�EE�M&"�~����~r�����`��ͤ��MT�� <�^OaA>#����4
ׯcۖ���@oгv�j�x�֬Z��Dbb�6l`��E������CZz*k�^Co_�))<��lX��K��*�˹g�f�;�X�x1�{/̏�^�b�VVZ�=[����@gg�uulڸ�
�֒���8�A�)����9�F���,4j
^������;p�w?���6���[ZBJr2o�܉�h$33��Mk[;z����8斖�����bF���(�}��]����ZZ"11���$�H

��$''���!�����T�B!Z��(*,@�R��?@|\�E�˖���͛o�$-5���x������YA%�P�5(�B$F�cWb�� +2Ѩ�"�$%$���HSs3��M$$$��h`|,��f������%
����ׯ0\߮V��h0p���ZC ��A�o̴��%I��s��a>=}���V�,YIJ�K1�LP}�
�<�Z���G�q��E�f3+�/#33���N��۩(/'.���*�;;�٬$%&24<LyY��ñ�s���SQV��`����O�罹����}��!!>�h4ʥ�j��㈏������IB|<K/"
q�������4�q��̾��@)r�%��(�D/�jL4~	M��21~
���_�[���H��X���8��BB�ߥ�;���^���yf1V)�0P���U|�݅�8������ex�m���0?�w�?���������ن��A���l�T*"ܘ��a@�c�2����)����o��<�a؁��V14t7�����$Q��!�Da����>T�e�6��VQ�����5lַ�l���0��B!����g+++�V���8�$	�0L������%|����B�(�h�����}�I� IC�3�v"�cؖ�f��;�ow��R
QA�1��6��8���s� �9���[��!��a�3��w��0��80���G����1�'�ḒA���vEB@���v)%��6�{8g�o��i�Zx��9�<^��H���!��c����s�q�}�:��^s���Lq��4
site-packages/sepolicy/help/transition_file.txt
This screen shows the 'file types' of the specified 'class' that will be created by processes running with '%(APP)s' type in the '%(APP)s' directory.


SELinux allows policy writers to define file transition rules.  These rules define the label of a newly created file system object.

By default, a newly created file system object gets the label of the directory it is created in.

A file created in a directory with the file type etc_t will get the label etc_t.  In certain situations SELinux-aware applications can override this behavior; for example, the passwd command creates /etc/shadow with a type of shadow_t.

A third option is for policy writers to write a transition rule.  For example, a process labeled NetworkManager creating content in a directory labeled etc_t will create it with the label net_conf_t.

File Transition Rules can be written to apply to all objects of a particular class, or to be specific to a particular file name.
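
As an illustration of the lookup described above, here is a minimal plain-Python sketch of how the resulting label is chosen.  It is not part of the sepolicy module; the rule table and the helper function are hypothetical examples.  A named-file rule is checked before a class-wide rule, and the directory's own type is used when nothing matches.

# Illustrative sketch only -- not the sepolicy API.  The rule table below is
# hypothetical example data, not taken from any real policy.
# Key: (process type, directory type, object class, filename or None) -> new type
TRANSITION_RULES = {
    ("NetworkManager_t", "etc_t", "file", None): "net_conf_t",            # class-wide rule
    ("NetworkManager_t", "etc_t", "file", "resolv.conf"): "net_conf_t",   # named-file rule
}

def label_for_new_object(process_type, dir_type, obj_class, filename):
    # A named-file rule takes precedence over a class-wide rule; with no
    # matching rule the object inherits the type of its parent directory.
    for key in ((process_type, dir_type, obj_class, filename),
                (process_type, dir_type, obj_class, None)):
        if key in TRANSITION_RULES:
            return TRANSITION_RULES[key]
    return dir_type

print(label_for_new_object("NetworkManager_t", "etc_t", "file", "resolv.conf"))  # net_conf_t
print(label_for_new_object("unconfined_t", "etc_t", "file", "motd"))             # etc_t (directory default)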

You need to build a policy module if you want to add additional File Transition Rules.
site-packages/sepolicy/help/booleans_more_show.png
k3#�N&���/c��o$����2�?�0���J���x���Ϧ�M��n��s�+��a�>����Y�u���p !�3���˗_x�9��Z�h"~���x�����Ң���~_����Htrz�2\�1�d�P(,,.��ե�麺�h$�kz����=B��]gY��q������WC��aWv����_�OV�=�)����w�ք,W�ӲF�<j�Ʉ�A��+���,+�p(���G�����P玝������{��� $��mmmN�=�Ln۶�hf���隙���@UWU��ξ��d*�:���������ʊ�\.�v��V1�JWWU�3��Y]]�\M�\����sٜ������X4��<F(�˳,\nW2�r:�;;:***&&��'&���jk�N����6��܅��S�SS���b�B�095���y^EI*TTTd2�������۶߻|b�e�` `43�0��#U�hq�?�J{<������ٹ�]��UUV����p8�z���5U�������$���c��0F�@��t�/,���F*+M&S_��j2���ۗ��{��	��ʈ��2*��y�Phmi��-����43,�L&�b1㇚�j�b��U�������T�*++��;���Ņ��>I�|�$I�h�f�j�v��������r��G����nw]m��f���}U������B���4��8�D"���73;[S]�����9�B����t��6�&&&-KUU�"���3�M�`PQ����ʊp.�C�����v�z:�����F)!w�ݛ���z�n�KU�ѱ�h4ʱl��@>�
�uK��ť�X4�v�z{�L&AEBh(|�e�d*5;;�g��ťѱ�H$����޻766�p80f����{=��pX��2����0�$.--��O����ܹ#\�~maaqiy���0B�ss�^���8]׍�)N����jdtthx�c���j���������0����J�`�"�ݮ�d���3�p0�,g�YE�*��|!����j{����J�eY�1����߯�J,C���H$��d�tue��B���zB�L�b6{�^��36>>08��f++*ҙ���X̩d���NR䡡a]�B�� ��l2e�٪�X2��|�htl|b`p0��F*+���v�]Qվ��e�tu.���ܝ���Bm���@���wdt��.�3���Y�0��,5�ё5�HUU�l�]����揦$��K�{M#����Od�ٿ���(
�eUY^�p���w��h�k5ٶ[\��H�P<�?��|�6 ���j�Vܦ�Y�qj�G�1xX�5kUt�޽d*u`߾��
k����?��|�|��jYSc����Ѽ5;R|�B���t]7zs�q��R�]�O�:�޶-�~ԛ�P�iNF�����ZSo�M�PJ��[�W]�9��t��7�ܽ������8I�~���_��]N�o,�G�;!���KS����
O�|�Q��{�����nc�Kg�wmfv�N����F�Y��������z.�G����q!d�>��k���(�tM#�#N��Y�O�XL�ۮ��׏^�6��x�t&bq��� >P6���R��k�2imi/�u�65�?:���њ���;�1mQ�v�q���y��oܔeYUTc��G��vg�v>j�tӏ��U�3,�~�+��W�ڏ��G�;�0���/UY�i�ߦ�?��#�eJ���t׶688J{�0֯��8��=I�Z��&�5jm��J��?)�7� 475A=�!Z,���!��Y����8�}���!�d2}���l|ra���۟��
7F���^���Et5�R�t��JK:�~��2������&�����a���f�̚k��߽�H���Z!j��
U�{��6�~0�#����Z��/K-,:v/9�R%�{����օ~���J6�[�p?#��gd�ݡ��/��T���k��`�B?BHǼ��b��Ā|��<�`��P�	���Z�<��S������j*1����go2��L~##��2]���7����(��۷k�����y��G}B4M�8n����ߺ	��v9c7?>����,��J{Pc<o�����~�c�cnؘx<>42ҹsg���8����
��Y�e0��{?��<%�#B?���
��?:�{u]WUu\�s�G�~Bșs����f��{�>��s��!�^�v-66�o՞..-���9a> I�r<�D���7]n�,�~���\�hxtTQ��g�:3;��f[[Z>��t�P(���B�����+�l`�������ܑC�~c�~��ն��B!�z7�iL��Fz�5�//�y�s�\�����UVT�?����^�r��_��O!�c�v���F>�����g�~�/��'&'��ǟ}������ԟ�p����c����0���XkKK�:f�����F��n�c�G�����H*���&Ap:���?<I����t�b�ۉ�k�n��r���f���!E)M�R�dru5��|*�f�v�y�O�RIr��(f�YY��d���DZ^�c���dY6	��eUUs����?\,� I�$��p8dY.>و\����qW�]���MΒ��(�S��/_���_�ݹ\^x�ٜJ�V�ɕ�U����L�d2I(e0v>̶633S_WW�kn��f�B2���(j��L&Uu9�}���GL&Mӊ�v5M�d2:!VQ�	ɤ3��t84M[Y]�5��t�����WVV������$IRT�a���yM�v�����j��P]m-���ٳ{��"��V�5�N����V� I>�O�R�[�fQ3���Q��ff�=r��r���\>/Z,6����c��6UU�}��/���ϫi��iD�].˲�R]Ӎ�P:�!D��lf�IQ�d2E(q�\��QJ$I�
n�#D]YY�d�n�9�NE��D<��!���++�KK���/ds9E�s���nw:�l6��p<�q��㪐/��6I�1B>��a�t&���m6k1���i�L�"Z,�t*m��]N��(�\.��[E�"˲��A��\.�0�����rf�Y#��멇/_�r��l�fs:��\.��p<�v�E)U�e����c&���>\TJi&�������t*����ʱ�++�$٬V�ÑJ�(������;��-..����!Ji&�QT�l2[�b&��5M'�n��L�ҷ@�R)UU	�>��n�q\c}����[��u]O��n��wmy8���N]]���OJ��'&���WV,ǭO�^�^!$�L��b�oCݽoee���n�`J�S>��㸅�ź��k�o
-ŗ���M&��k�k��Ϝ;74<\S]}��)��m��<��p��i��619�Tf2٩����d2ɱ��s�0�7owWWU]�relb"�˟>w�f�]�r%�˒t�	I�nu��xܷ���G(!�\���ʲ,Ehph���3��JΞ;o<�����_�r5��uw�q�]�{z$I6��>���{z{���E�%�_�yc�j�'O�l��ɩP842:�iڙs痖�	%�@�a�x"q��Ύ���m;c��SS��\�|9�����T޽wohd�bL����K�f���v
�67�x"񓟽�j*Dzgϟ�_�y3h���ۻ�����W�{���q��k��1�����g����^�|�ν{
uugΝ3�̧Ξ�_X���[f�GG��Z���?u�a��7oVWW7�J��αc�Ё���e��Z|N$���]\\2������;�t���k�yhd4�ϛ��l.���g1����=�w�ʲ
y��
SSӍ
��w�޻߳������F�]���K��v�G���v����&�������T:}���X��KF�`eu������s,�XIY�K�ܹ~�Ɲ{���p���exdtqy��]�&��z��TU}��I�h�q��n�;~�a�k7oT�b��d*���DU5���_�H�y���q��&�;~�,��{�8����T*����~2��%iaiyqq)���;N��������=n��a/��ݼ}�V��a�^����<:6���x��WUՙ��{���~��IQo�"�0�?�0��7b����b*�z㭷Y��'VB��3gV����1MU�^��ҙ�+W��/��t6����3���Ȋ����L��,�\�r���Ũə��Sg�$�Vww �?���������puU����WWV�&&dE�X,?��Ox����#���ܺ��DtB�?q���y3cJ�L&����b���9������+-6�f��.-.�㉪XucC�����SS��t��ٷ��QM�053]ڿ�f�ã��=��f�Y�ar�|<O�өLz9����Z?�ҡ��];;�۫�b�P0�v9�Vqlb���_�I.�����$�����yw�.�h���Q�H������<��s׭�n���];w0��,ol�?z���|�m����y�ڵ{�{<p�С�����R4������cc������+��G����
�ܱcG{{MuՃE���w����X�R��?�0̇>�)�r���ЁG�F*�߸12:���]��B�PC}}uuUW�����G�����L�r�L&���l�Zá�(Z���~箝M�
Ń?�k���/'IZZ^^I&á ��{vw�ڹ��H�2v���F�GF5M���K�܃t�S��n�{wWgSc#�tpxDU�l.����(JCCCU,���S���:�ۑ��b����mm�
�u>�w��=>�cԾ�������|��?0���#�=���ݝL�<����f��~��)����� >Dz5�5f�)��
�R�}{��6��<?=3�v9�:w���/���ܹs��b�$9��m6�b���F554���%�⎎��������Ȩ,+�\vy9��f�	���>�1Z���|.�3��V�b����Ng[kKss������I�>��s�ݽ?�
���1�)���\>����������s�N�h�$)��m6Q�LLMQJ�vwuYE�8WVVGF�$I���D"��<�]���d�e�*I�RC�#�"+�1SW[c6�$IN$��rp߾���r<q�С�]�v��a��mm۷m#�H�d|9���۳{���[���ݿ�(�������:���VTE����a����ݳ{w[kkbe�x��l2Y�bo��j��k��d]�+�e�͆�奪X՚�'f�%�H�S�W�̹�P��פ*
�O����U]]�H$0B�,����N�0�\#�k�(�M�
/_����u���M�
�a�$��iEUUU�4���+>����nqqIL5�UM����W����ƈeY���bɤ3�,g�Y�ͦ(�q�g��>��0�ELg2�,g�Y��Q��t:�r:���I�Ҳ,+���p��eY�0��$�b6��y��AH?(���a#O��\任:���D0,-��lI�3n�;�����ťl.g,��2����$!Ju�]��1�&AC۷m�hoEqhhhvn~GG{o_� ��񜑏���c���拗�TE#&����u55� `�T"Ba�a$I�4M0	U�X[KkG{{�;� �TR�eMU�&�`�c<'�LeV�ĩS�9��@�,#˲�(�17�

L�����n���9^Q�ͦ�J�P0�1�,���:5=��-c�o���t�ڵ�=:2:�r��iIR5�R�|.��e٨BHqQ0#���0�"_�x������fhx�X��"�Y���
&�*m߶��6ЃD�X�@ �m[ێ�v��2����}O��!��T�$�P���f���r���3���ڹ�j������vv�[,���+��eYQ._��P__W[��?P��a�,��X4������^Lf�X^z��S33�~�����%�����"�"Dz�x�¥��=��Ĥ��0�`q����y�e��e`0f.�gd�ȤӲ,�r9�͖��(%��i�f���[XE��b0f�t�|>����ݻ��ݜ�̝�T��f��u�26Lմ���K&��_��"��;�q.��ν{��D"�(^�z���K��C����p��h��m������;�h!t���h$R�hm--�8155��d�b1��=<:�Mp6�m{[��s���D�x��AAX���f����qm--g�_x�����������4�˃Q/�Dz�a�Z[��^0�,9t� I�/^������6ۙ��dYnjl<v�%!T[[��m�r< IDAT��n�;~��矿t�JuUUks�'N���f�ي���˙�f�$g��lv�ܩt��9�Eq����/����];w�]��N;�=]]�p�Vw����9|�d�_���/�<�0��dBG*+GFGϞ;o2�����p8�޿��5��\]U�H$�~�]]׋�a������7b���l�~�fU,�`l6��&3�2m--�/^�ڵ������3gΟ-�C�/����#�o��n6���Ś�N�9s�y�b9r�P(�q�����o�7o�Jg�^oeeʼnS�UEݳ�������'�
���D�%�����ٿo��k�)�u�����tM{��v��Ёf����9�N_�r���=��h�B�\�V��` ���{�����He����[�K	
�K���.\@!��Z[9��8#d,��{��	!.��x�e1B�ł�y�����N�9w�b1>xȨ
��q=<
y���ϛL��]�,��`Gc��
#��ǎ�@9t�l�����<}���V�����?y�"[[�9��8�h ���w�@?��aw�<o��b�q<�`\_W711y��y��|���G��/]�B)��E}^���.Y̖���X,�1�z�z&��8�������;~�n�ܷ�$��E�����8{������ܿo߱ٹ��.[�beEEcC��(.^�X,�uu�X�b1#�X����J��q{�������1�ե�2����w�~�7<����u�5���_(ȵ��Iߘ����;��~�Պpx`p�"R�ׯ�,˂�d�eE1	���˱,S��XLd_����B����\F�# IR�z�1����O��B�F��y�t�������_KY|yI�)��/>aM����j����=|�1L\,ƃ20��1+�b]��,˨�J)5>�kʯj�0�,�F^\�ũ������Kw�(��Q6c#�G&!DQ՞�^EV���[|#�9:!��O��)�b2��No�6)��XQ�U$�2˲,�JQIC�v�������.��_��ӂK�_��b;��F��#����:�����icv�Q��->��eB2:�Ρ���.��){dQE�u�l6%1�4�3a���fq�����R�'�R���ﴵ�D*+׿fJ4zq���\�t9���\�|�n��o߾�p<3;;66�r�[Z����5U��664�<?84d���&�����~}}���b<W����j�~J!~llL^7ŧ��	��|��V�d�&O����GbX���;�n˲ť�B���a��]���{��Z|f�6K�T�Y�5�>�9�m�55��iU��]Sf�nIH�P}]m*�Z]M�B��b�y���wq�k�����T2ʷ&���ܩG����F4M� �9t��S�s�(9�[�~B�a0B�#�����O��Hf�
��'��,�uM�?f�5G�GZkjcM[�����-�AX��O��&�����#Mx��5��Ț������#��\,ۚ��%'��sSf����c�ٹ��Н+k(���_��7��_���\�ڵ�_������24<299����_��/�����?���xtt�ʵ�ؒ�M�~�B�_A&Fv�r�4��߯��B<�8r�����y��a��_(^|�ycM���v�'X����ҞbYz��t:O���w�����Z�/�C�tvv��o�{�q�f*�*
��];�>�D��c+�"kzyT�%�-���SS.����?�1��~����1S���	���J�TP����-d�Z�����K�4������f�Y��ɉp0TS]��f���ˈ"��977���l诩���s��v������$���_�"&≣G���4B���SSS�55y	�k<.r�<�s�#���g��l֯����霜�����l�|�˯x<����_��WV��/��riږű͆�@ �k��t:M?<����ձ.���ݻ=�,�޾���y||5�����ٟ���i'O��η�U�<�������pQ���+���������{���m�]�16��0	ƒ���k6�K/�?&	� � ���l�o��/�35�.����R�����q��l���ʶO2���d�w�~���iW���L&�J�hI6M�e]n�h�~�)��������f���v��n�������P����g�f�����]�E����X��iÌ��!
-[�EfYf}�~���G.�!����T�������z� A����~#���ͺ���"]Z^��0�L+��ӹ&_�Of�c��N�G�4����fPJ����:;���}�v7�lq�'����N�r��u]?~�h�d��;w�y�lY�G��X��C�4U+��KK˙l�j�U�+��WVW1��r�a���ãc#�Pi8��NMN���,K���8���%�B�ݻ���$)��-
�U�*J���l�Y,�65�����������G�Vejj
Z�,�3S[S[[S���Xn�X
�a�G.
��n���B?� ���B?� ���B?�6�4.,.��$Hɚ\��A��˱e>�P
K3�V�����T:US]�q\1:����{+]�JGY�?}�>�/Ǘkk>�����K��/ۼ�хTY���i�7Ca0���ܳ�l���k��+(�/_�?-�J�����yA�My��\>��_���o�B���s��lY���dr.q�7񊽓+�,a,���o���,��YD�縀?�r9]n��K��'&kjc�x�5�~��+o�ri�*����F�澾���$%K6���� ��-[�	!n�����V�������yw6�X�)�UYJ�NtM��2�L(���
���X�h����s5��
�c��Ômfg&�y��d0�����K/�}����E�?�
��=nJiG=�r��B(_ٵ��,EdY�/���B)e06�L����r�e����R�x"At��Yl��w�y<��c�v��y���i��[�1Ɓ@0BU��r��~�@���~�@���~���fs�PJ3�L.�%%�Y��;V�
�Oa�euel|���h}@������k�]�	�*��-����VW�8QE�h1��ޭ%��{��e[�Q�uH��E�WU�hhbij~U�ԙ���ύ�ξх�v�X�<�!����w����^(��`�6;֯}b&��}6����3���ξх�u\mH`\�R������V�I��N�#����N8e�Y*���H���7���3���g��R:==���c��=�owC���~�����#�ɉ���{F�m�銚���/!����>�GUԻ��$	�
�2�#Dƭu�o���g�g��,f���,/�����p���&W�ah9ذ͎��}��P(
Y��
^�U]Je
SS�M
-e)��^��K��Н�w��%�������:M׆�F�N(�a��q�u��P�,E<t� Bc�0��}FQ���ih9ز�ϲlKsksS-��c\����q!c���ز�i�z�*H��@������y5MSUu�AX�����0�����a���0l��@�4������bx�B���T0r�ݥRJ��D8.cYK�Xذ͎��yQ4������V���e�o����
]�	!dp<!$Iz��?_��KL�R����,oU�ʐ�Y�4]_���:%e[�+�J]�x�ŋ�|�RJ(��	�X1�%��u]7�$ieuutlLU]�)��R]�5U���$�PJE�e�x9!D�eUU?�Җa��G.��i)����\uu���{�ޞ�]p�+�lvu5�������N��_����ܯ�x��p
���������o+��oo���^3^884��[o�L��];w���[o���Z,��}��6���
�D'��3��oz�z���_�|�k׮l6��ӳ���3���3��契��֖�{�\�r��w�[ZZz��ád*E)�d��R��l6�(EI���{����l6ۯ�u[[[__gg箝;,˧Z�M�PB�#�r-ԂP&�����}�����������<V����f���":���x<���l6��X��C=�c�YJ����l�b6����~�[����O8����]"��x��~�ù�8
�����\,Z]�"���߷�믽F��m=�=�HD�t8��	��qx��������}�`�'?�YKs����l�x<�/\���,�����7ntuv���w�ogX6�L^�vME����h�w�y��^�p&dEQ��힘'%�;�5�7��m+�]]�Lal�Z)��L&�˯���
َ�0��z3����s�ڶ�����tM3[,�]Ӵx<����bq�\I���w�� ����a<���j>_p:v�=�H�Ri���x��o����yD����A�ݵgwמu�����v����b�y~uu�6���x���0���b��<ϗ�����n/�ZYQa���B�r-r��ʐ�2���ҷ�~�@��d+�e^��5�B��Ԇ~Y�GFGdY�%w�J�Nk]u4�C���-�OMO�\.��_��ˊ�ƹ�������>������
��l&���1f,��{�Gg24��'_�1���ݺNڷo#�(�~FS���t]_\���x�m����ޅ�$YɪZr����[���|��e6�����0��A����t�NB+��U5�����$Id5&���sU[�|�ssv�c�����Ah3��Я�z�#vU�ɕp4���l~�;�U�}�r��E�s��G�664ܿ�X���mv��j1�/�!�.ܝ����?_]p��yT�N�$Igϟ��|�ҹ��_x�
�2�&!�I{ܞW�v6�8րυJ��V�^�"�tl����R���hll��ph��[���߸��w�J(��xc3^�����,E�/����$�����D�cY���(�;�g�|N��ҥZX�5�LeY�!�FB�R����w������r�e�!�q܆��ك�m��~�@���~�@���~eR��;��k�F��1�<�3L��+e\� �oPA*���	�S��x��*��-�OOM�}~�ǃ��qJ�\<�›/"�trrJ�5J�`2�].h6��П����pi�~��f���e)����Wu`�~�a���vwvUWŠ�`�B?щ����=�SB�UJ��ŗ��r:��o���x����+�:�5M_O�4Bi����>�ӱ��-[���5M[{J(k��H�eEV,�l]规RJ�u�	!e��+�215iY2���q:�5��:ՠ�`c6;�c�ۗ�5]Ӊ^�����҂�l.K1�]�v���z���l���}�\�-�7����v����ΖN��8���!T���!�e�{�5�[躮�*���V�~�ɴo�{���Ý��t�~���ā�m��~�@���~�@���~�@���~�B?��eX�K���T��~p>a�*
��Oa��d2���^��c?ؔ�(����-f�����?=3��ǚ�Y�K$���/"!��k��`�GMM54l�f�eY�m�?�IPU�,E$���'?�$Yյ��ξ��[��A�����u�躦ix��%�\�Eq��.�˵�������q�B�4lM��]�4}M�	�e-*F9�ݞJ���`{��Nt]�מ���G!DB�\.�ϯ�����RJ(]��uR�>�P�s���w����T�b�&C��ք~��753U#�K��3S�
-e)"�0���TE�u��=���s���`�B]]]�|��]�dx�㸦��@�<Wb�9|���R��z�&������ok߾����� ���B?� ���B?� ����L��o���:E�4c��,�2�W�i��,��� �pI�fB��u���A�������n������?;7O�C�T1<m�?�ͅ�!J�.�k���|Y���O�3EQ�^���C�l���_'��k��<Nt��im^J����{ߵ��sG˼�;�n��X]'����h�F��16����*������`�z��<���N	:-���B��������a�
h9���O(���v!�鞛�p8��ʗ�N'!dll�
�2�[��J<����~Ji<���U�l&{��i�(:]N�Á��`CSc��[7��zK;�,�U�jB��~�e����t��q��f۶m�lY�7��\?��0L�L�����PJ)������t:s�,4lM�G1�A,�$<1 �@���~�@���~�@���~�!�����yAi)k������cJ���c��1�*c�!k���c�QJS��>�7K�6˒���!I���0��,�bc��B?ET�u�ӝJ'1�.�x�dMYt]7	�'��_����X!��<�q�l$pg�����X���k��G�4mtl�`�Q�
V��S��!�����Zw�����N�g���sP�PF�,��9��IC��gSJ��N�q86�z��8�����Yc.��?!dxtDQ�ҧ=��_�!dl|L�uI*�
���F&�ONO�R��'�@y�~<�X�/+�"�(���y=^����t.��E"v�czz� ����o���鴪��ɩI]�#��e���cU�Baiy9NMO�Y���c#EQ�q�l	�B�����^?l!$��MLMVEc��a�a��`(����y^-��
�ն�0/�R0�WT��7+����HO_��� ����<�r�GFG8�����\.�� �����<�v�GFGt]_��-f�"��l��Y{�yR{��NJ�=�V�4UӴ����d���#M���UUUT��x���6���t:��J��1F�Y�
@�HE���D8�g2�|.o$_�8������Նں���|���(�,˺].QM�IE���7zr/�f�ٙ�98�[n5��q���YM�$�R��\%�VWU���0B,�&V���p����	�y���`�f�5��;���n1�}>!c�o��v��(:�v���!�y x�O��f��f���Z���~��x���&E�V��+���T1�X�ֆ��x"!���z<�|~iy�2\i2��k≸�i�����(�v��cz��p(�0��f��F�TWS;;?7;7�t8v��d��UY�V���jkg���f��C!��B�Htn~N�%[�{=�yAp8��9��R*IR&��X���]n��W��Y_[W�f���tόE���V�>4��a�X4j���j�k���G*#��&�T��X�-fs]M��>ٗyM&�}@Y`�M&�&��`��1���O{��H��1���~�g���C
�)[����z��;=�>�|�aA�l�4��nU��X�~V�Ć~���2���-��1ʿ��
��Db%��&�,�~��؅B�a�'/��g˭&W�h<_(���	��X�e0�@�e���v{
R!����:�,���o�}��^�T�"X����a�eEQ�|<ϳ,�1Ww��R��"Z�O�D�i��].��f�u}k��~�#?��0cL���B��p\��֠垬Z6��aF�B����IDAT�l!#����o	����~��G
�	�	J+�D����; ���?��A�X�&�����~MӌѫO齍���t�W��d�~I��\�:??/Z�����=CPJGFF�ܻKt������a�u�t���e�X\\�t�2ð���ֆ��ҿ�R��W����w�^����g�����XLE8PO�G��ᑑ��Ç���֙�&�>�=_\pR�uc~���~,>��s2��j���zq��,��?mnjڳg��fe�����_���q	��w�)]��Q���O�tztb���������ۑ���S�����p,[z���g]׍���tZQU�ԇ���#N����/B.�c0jlh0��{==�KK�>���������={�y�X>����q:���_Ȥӟ��[3��������٩�)Y�������o~��F�Rc�$����6�5�Ͽ�T*_�����7�H��V�m����O��s����޻��K/]�q�*�~���c�QB;;;����g�'�uD����`���x�ԩ��)E�#�h�Ν�/\N��Rz��MS���[]]�����a���y��q� ��������G?�	�q�B�^���;w���舮�����߿�<��c���TVF��/�����}��޾>��
������o޺�u]?{�—��X4�����Ǣ��n%S��]��|�];w���������<y�̟����y<n��o�����ޞ��NB�ш�i~��/�h�Z�>}]ݝ�w=n�x��}��v���޾���CC�x">9=�ŗ��Vk0p��Ơ�����S�%I����E�}}��X��#o����G�,.-:p����g?���Çڷo���o�7o���/�l��'N���ﵷo�Y��Ο�Fc�����^�YmCC�Q�����WE-�uup��d��>��*�|kk���]sS�;�g�Zkk�������Z[���^~��'O]�~C�y�$��N�IXXXP$inn�྽�c���岊��f������q0�ʗ_���ޯ�z;��۬�U�:�΂$�<s�v�I�(���a��l6���rJ)f�;v�o��_�jhh�e�ꪪҳ׶�֯��^}�n�#��.��n�Y�aA�Vc��J$	�B�%�I�K���4�a��|^��Yܠ�lv�\.�S���|��s8�/�O������7n�#��;;v0�sǎ����{��g�f���Ƹ���`�P�]�عs綶�3g�W�+U��y;�T����s�|Mu��dr:�W�]��}�Phnn����GGB����P8453����?0���kbr��tVVV���e��_���w�������ԅ��0�5���kI���tVŪ����p��K�TV�W�ɋ�.�����K7�����O�gph����x��~kϞ����R�XYY\XTT��vG#�e���]WW���:19I�F"�@`5�\Z\��czz&�L��X,�ǭV�(Zffg�b�����o�Z����饲���Xok��mec�ml/c��m?a�[^!���"$S��z��`���x�
��p�U�B�A`Q�O&R���:R�0�a>�c���})H��R�8�,E��:!��q�R�V�0\P�.
�a�$��j��<}:�QJ�gQUu�\�$I�e˲t]w!�
���v�ZC+��m�cA<ƨEAh6
�u˕ʘ1��|>�Ng��漿;#��?��./�a�pX�(Z���F�w�7�W�(r�?�zKr�$Y�+M6�a˥��-�w'�c�1>�{x4Ϻ�N����B��y�$q=��I����ƞ��+��w�\���0��_<��4��>�58D9�4{�ޗY�b��u����~,��L!��f�W��!��#�<�z���-��P"F��I�IEND�B`�site-packages/sepolicy/help/booleans_toggled.png000064400000171006147511334660016077 0ustar00�PNG


[binary data omitted: PNG image data for booleans_toggled.png (PNG text chunk reports Software: www.inkscape.org); the raw image bytes cannot be rendered as text]
�
fff7_b��Az'���!��n�)��M差L՝@�9����P$��0�yS��lj&��Jzv�������@���f��f�����~��6C@@��#��G�#��_@@@@@�1B�#��_@@@@@�1B�#��_@@@@@�1B�#��_@@@@@�1B�#��_@@@@@�1���^�OII���b�����;
�<|���?u���.DFDI8���
s6�s���h��4B���g����N�C��ꏤ�R.dV����ɤ��Hڀ�~]��e?�|�JN�9���7�R��c�&�8�W�b���\��Տ$�}�LK���玝%
�Fé�gY��+I�\KK������s8r4��6C࿔&�K�J�QiX{�*���h4j�8�ʏ�Squs&�j�L��I���!�|�'m۴fێ��q��?�Zm^�T����*n�V�J�2:_��>�ѷA��QSSc�[�|h�Z�<��i��O���'�e��J[�R��[UUuW��?Ɩm��O��Q]]c`�N���''7�\��O�P��e�V��q�{��fZ�Fo�F�e�ۑ��(��;�װ=h�ZT*���3g�y��t&�d
Se�er/̘�����OHD&���hnٯ�|�ԯ���jX���ݴ�G��.�֏	\�.��;��pƽ#�������ѣ��L��*����H�1����F�W�TF��|��_O��+
v���R���ٙX�THl�1�ؐ��Ä.6x��4E�XYI� 2"�����ȧ�3{�'H,-IIM�g���
�`oO���,�?���s>����o�'��~^�Çپs7���Q4��_{�vmZs��i�y�-�o�Č����Ϗ�g���������ż��w�';7����ƁÇ).*2��+V���R���õ�t~�u9��k�w77-]FpP vvv��ګlڲ��'O"�H<pyy��ٿ/OO�{���XΞ;��S��'0 �Y�a`�~Ķk�V��ĩS�����pg��IL~�eb۵#-=?���()-�/>ӗ��������L$��B���dͺu����x�o�>��_�Ndx�ڵ���H,-���⻯�`���:sw7^�2����H$�Ġ��)����Ǜf��O���Z�F���J»�|Goǚu���+�tnj��3g�m�*�|��x�2:����~$--�JŘ�#�GZz:�.������
%*2�/�����׭����^={2�?�̹�|��'�����o���s��i�~�5��.#�C,�^y���h��RƍC����˯@����g'�<���G�����,?=ҠL�
̾y���9{�3��1uҋ��I��8N�9û�fӥs'���O^~>i��,[���gβv��������I/ҵW��vҾ��M����T�Z�a��lش{{{�J%��k�aP�Y��i�4e2���D���3�zӨm�ݱ�׵�GIJ^�U
�eJl�L�7Tj-�I��Ҝ��BX,�r���=�e�U�=��Ǹ�Z-�>����
��9���r���r���HK���#;'�W���L�[6�����f��uF�L��5�֑��»�f�����;p��
\�|�FC^^>�ޛIe�;�(��A�<��j���d��\m����˩��鈣ݣ��'�HP�����QXX�ȧ�#��IJN��ߏ��L��J�J�"4$���L�J%j���"����iS��XΞ����/M�B��mI�z�А`�8�F�A�Q�;}�N��R̋'ۮ-�o޹���1��8u��A�L|�y
�O�">!�����K��ׁ���[�`��ùp�W�����I/��Ǖ�$���x~�8���鷈���ח'�x�Ν:Хs'ڶi�O�.�g���46m&//'''�6m
1���i��/M���Р|k
���xq�D�����Φg����֍W^��D"��Ɩ	ύ�g�n��&��..�\M���+:x�&N��ř��)������رk7�����<���7L�_?�:�%$����T�BB��q�U��гGw�BCذy3����2m*�=�G�k�V�l��t�t:���۞�M����'G��6d0/O�������ѱ�#:x�W���:E��m�Tڷ3��R���_7*�@���HIM%2"��e��섯���4��]�ݲ�:�FÍ�*��٣;#��j_u�~9����p/O�©�g�ٽ;�'ND�P�|��Q]7L������!6���1ٶ��h�:\��a�%B�p���6>�+9el9��L�fcb&�

����Ut���*�|���xe�T���`���}���4��d��cnfFFf&)��<ѳ���զ�R���qvv�_�p�
2�i�'���̩3g��G	4���W(�wc`Kk6LFka���,f��cd���|�\KK��Ǜ�W���d:r���{���@P` ^^�8:9RSS��C��v����t��� ��'O�`��lk�n������E�(���>t?,X��-[�ίƋ&���@xx��������a�ѪeK�� =#���qqv��Ņ�\J%���K�ڧ��^y�Tʌ��q��^{�bZ��y�7(VVVX��8��cgk@TD��fҭk�:�bccCN^�������养�;�NNXY;���@"n���ٳ8;;��䄿��ff����"*{;;~�e9��nC&����AQq1b���㍹��a����#���˿^k��'�n��R�I�����Ї��ΡuLvvv��XST\LyE��Ұ���uL�Ͳ����@�V���"��j���32�ճ�nn����k�2ggg�o��> ������x��nn�|�G.���L&3*lmm�x�2	��t�|���*��
$	�~~8;9��퍫�.��TUU���˵�4�;�o_u�~9����pa�����ӭk�iղ%�Ϝ5��i��8�m�H�R���|�r�Q۾�����"`��hڇ��:ȅH'&��om����r�Gk�������0'&Ѕ�Y�T+5dվF+�0�r�J>�>8����r����쌣�#666|��l������D����桡_rvv���B
Cm���ٱj�Z~[�;2���¢��Q̓&��Ũ5j�t�E�#Ǯ3mt$��΀�R��E��$ԓ��nj���5��۸��q��!����D�.+
$''�KW�ر��=]�T�V����KK�Fϟ��x�����)W�����x�/M�̱��&<7��oo/4-���_���g�w� IDATB�֭=r�.$2<O���b�֭����G76o�JVNb��]�g	��?>��k,,�<p]:vdמ�w<`˶�\KKGZ,�W^&/?�sV����ߦNՇS�T�WTPSSC�BIHPS'���g̔_h4�А`***yo���:��l��1�F������k�3m�$��S���eTyy�|����\g��Q���!���/�J�����`6�ڴ�JR2ώͻ�f���1�=՗���SXT���O�e�Ç�o�Ng�
n�L�I�9|4�3��>�M�7z�V��?��nk�z�����&<7�Ͼ�{�j
�'UUUu�0���[�G{�:-fffl޺ɠm�M�߲�OV�����V���E�Fw!�ۑpoGN��6؍c_gšk�5<��j%���N¶p���(?'6$d�-9�!��Β+9�x8�?{����>�o�@HX!�Qw���Q먫�Z�lk�m]u[���}m��P�պW�{+�����$!��$%$�V�}��瓏&���y�9�9�=�^�ck$��vmٵg/�N���{{��`�g[=ò+����5���_L0���G�ϙ�K^����*�M�d�U��hHIM��Nf��޽��%��w����nj���l���� ��&�NfM�V�J�"�v:u��A�V��舧�m[��Nx٪��Ņ����Q�.��v����P BDXh����=w�LF��{�u�˹};��3q���c-�&$8"�����F���C�y핗��W*�lݶoo7jijϴD,�2ؠpv������ڵ������R�	�A�($6T*�e+��{y���$,4��mۢ֨���B�PЩC�R��*��BQ��`eiI���R)��!�jKqws�O���dXZX���+NNN *{n����	�U*d2];u����L�J���Ņzu�b-�D"�PYHa��F1�DFFM������z��H$���booo��N&#�Q,�deg.[�hn��A���P����^�'&�!
�ק^ݺaiiI�:u��4�Ny{yѩ�s���`cmC~~>M�4&48��
�#��E�ѐ~�.�:<GHp0y���~�=�vA&����]�="������͍�̲�Q�Xl�S嶊�W��ɨM�oˏ�,�_�ބ��յr��d2�r'JJT�����၏����!��x{y����Z�&00���P#?���|}|�J�(��Qk��X[ӡ];��<		1�k_o#]!��ddf��hhݪ�aa&����@����dR6h��ɢ��X�no�!�X��"�yw\$�8H�Q�]��z9Ҽ�;A�����N"����%n�&�-,,pqQ���Id�z�
'6&ڤ�!"4$Ξ;OdDa�!�znܼ��\��\����&󒓣#����Q?2
kk�V�
�yy4��&"�
���%�d��ʪ�Xaa!��˗�P�/^��͛7M��mllh�{{S���'#�0v�a;�QY��π�a�1��nF_Μ��_Հ��f��А`zv�V뺮߸���7oA.w⅞=|R-����݌L��b&L��O�QPP���|


���7��`��;-*r��
,DL�r:]:w|�:�8u��njc���H$�ǭ�C�$�]���̘2�J���8/=�\�t��cw�ܩ�7�իA���%��Q�T�j��o}�JEnn�����1;��N�=��&��h޴��cms�Z2W���s�ODguq��)


���8D"EEE�ܵK++�o�q�Z2�o���_�[O����c֌/�hЇ'���{���j٢Z���8/	���V����t:JT*���.\����8�߲�Yt��{��&�z�Q����Q��EEHllL��hJєj��|&��?��_�������Eٽp�Z�������A��}���ߋ!�Xl�X,L�e��fO�ZC�OB�x��������S���"��/     �!~��!�<E�_@@@@@�)��^�~�6���&���b��L�+      P{�J�OMM!-�61
cL^�z*1���cֻ����Q���V��9DD��z=Z���x#�e{��86��֞�
��P+�?'7����$����V[ʚ}I��=	���k�ט�z��Pm�V��m��|��P*�4�kB��̚�-;v�Q}�2�J%#G���܁���:}�c����㻹����=��ǡ����,��ˏ-sԸ/��ZzTV�^MJj�#�s��
�Ĩ/�3{�w��Ȩ����$���i࿛���G�p+�f���=�����A��3�e+V~�y�
7��x2�J��h4�<q����v�����\��8x�&=��p��lbI׮������QT\\+:��"JKK�|Ff&:�}5e4
��Y����JKKQ*��;91���

M�V�^��_�?����X�����>�����B���H�z
�Zm$_��>PVII	i�o��Wעe}EEE�z����W*���b�>���eir�
L�U�u��9<=<�r�DF�7�Dwuu1g����t:�m^��<j�W�?U�t;��l%Pv��\m�R����	Or����VҭKg�(�������l�ȣ�!�G���W��8�,Y�./<Ş�x���z0��c�*�jC�	�	Gyy�@����=F\�X��ϜEqI	E�"�v�DJj*99�����d`���t�ރK����ٹ{w32hָq���t:��4�Tʭ[)��xs��E�0Ow�
c>�������^�'-�6��a�޻����NNN���VKD�:t�ҙ�3g���͝;wp���vz:����H8v�v�D\lYݏ$%99��_��Y�&��� �������f�W�X�����|}ٱ{7��t���bb?y
6��\IJbʄ�Vٶ�N�1��X��r��|�t�2����ڵd�
}_�E��y������ݐ�/��ɓ,]�o�]��X�j5G��Bd���+C��Ɛao͵�d|}}��H���e��i��<�VKZ�m�3���7��V�]:v$��=��<�d�|>v�X����Cw����k�;a"*��O>,��^�7�K���t�د�3яF�c���]��7��<����^��˖���Ei��LJ�����HZ=Ӓ��,�4nLXH��}��E9�##�t劉�TO�V��\ʤ�'	�r$W�bپ����CjV!��mŰ��4��Ċ+iy�J�3��e?�9�2�N�!��u6lތ���7oq���RY�6��"P��ʊ_�R�B�z�l�{��5�.�dT/^h�^��[W����?�q��%B����f�$_��������2}�d~]���p��%���G�6m�t�2;v�k�N��P?2
?__��8t$�Z[v�݇�֖џ�d`�~,Y���΄qc		f���xy�@�M��X,f�#iǹs�ETd^�}�L�|1f4"������bJKK�[�a!!;�5iLPP�&M�{�.U�PN���>+��pv&,$����p��y�:9998t�ڶ�������믾�'#>��Ύ�����X��(..f�W3�e�䉆��r�Ə�ȏFP?2��ׯ3��9��S'���'��ߏ��>�]�ִmݚQ��$'7������F��mۍVQ��<���
e��u�5,,-)--�y�&FmԦu+�BC�d�ƾu�ٺTe���"����7n0k�l&�ǘ�?�����^~�Yӧaoo�^��^~>/
���_2{�w8+�����(
��#P5z@��-�� �8���K��aD�Hd1yJ5n��n���ameI��3�n�P��r3���{�;�Oz�Lt�г[7����n�<Ə�[o���v��7�����Z-":6�cH+g.^HfHO"�<���D�י<�_�$_���Ć:����$]�V���cccCI�
(�����@$������[x��7�;9�s���#?̟O��>���>]�_��X[�
�y���m.''G���e2�b+��u:F�7�����@��ӵs'>�	����/5*����W�
"��R��m���;1��9������Hllhڤ1_���uk
[��ٶ����v��,��㏘����_k�,�����`�����J��t*�H$�
mci�++;���(&�g[=cv��Ң�[��o����}�0e�W��k%Bdf�V��ҪںT���L��z�۳�!o]i��Il%~�~���$��p���H�8?o�#&؅�iJ�t���xR����lx�s=�l9Oƽ꿿�Ĵ<�v�[����z�D���C���W�=��s�@�P+[�
4�d<<܉��NT�R��u1:������j�|��t�ؑ�ڵ�lJ�~��U�b��Y��W^4�T�IS(.)f�4mܘ����(4jˍ5�P8߯swwg���l޺+�^�,^�+��[))��<�џ~���5͛5e�?�8uEEE8��ӷ�|��<�ܾ�V��O?������;�Ϟ%-=+q��t�rRRS�D��
d��<׮�Y�quq����)�g���E`@K�-�fJ
yyy�nՊ�M��y�V&M��H$�o������ٶ
d��I|�&v�x{�Ɛ�����<�W�ڷ3[Ǟݻ1������t///>��c��vVV��ا7�
�a�f�狭�e���ܼ�B���*�3=�?.^���-�~�X[[�W��srt�VJ
�/^4�K��=����Ǵ�M�4f��A�}�U&N�FPP ���J�6/�u���_}����{E߫������A����@���{D����͖����c����/�`|�[+Ozٲ�O��?`�����_�����+"��Fc�PkTD=^4����A�|�r��j\��K�y�ɕ���
��`oo_�:�J���
J��T�H$�۲����J�zP�\�F������Z�V��G�W\���
�X���5Q?�N�J������KKKl*��q�eN�V�C��ɩ�OE�M+�aUh4T*vvv,�V��ňD�j�b������j��z��J���۲�E�
��mS�#P�<�9�A�?�jYa7�2���4s�ҥ*�ݹs����W�n��FԖ�Ǣ|K��!�ǡr�(�Y�?�cK+�Q����0�U�P�����nYZZV;�<�쭓;YVl���b�����}�:Kll�_j�j�'9�<����j��]���[[�7P(��)�Bj�F�����Z,�7���f�KV�OB�x��������S���"j�ᾢ�"�$^���T�^���I���m���Z	�۶o�m��J��虳g��/     �P+[��9�X���t�*?O�^��-eo"kׄ��י5�[v��]���e*��j�8x��3�zl}J��?��xl9UQRR��of�k���<��#2k��$_�n�^�O��_��Jbb��U��T�Y��ϻ�0�S�Ku�}��V�s�FSJt�f�9{�V�nFF��������x����4n�J�vL�tL�-j%�k4�h�UJK5R��Ƞ���?�d�����h4���\�T����Z�֐F�R��a��.j�iYszv �tz�%'�T*Q��eC9�Ƭ�i_�d��-�ʇ���fm�/�b�Ǫt��V�M��*���~W*�U�B�d&*��l�ʲ�'1W�G�K��!1�o���M��QT��C�^ϣ�)}�s���|��\�k��C�2N.�1�(T�UZZJAA��_���\6�ʔ��P�*yh�����2&ke�_�*�����w|W��k��۶3�|9�?:|��F�s�W����JNN.C��F��=�E�����`��%�Լ{��g��=��ȠY����R[�������Mri�߸��c�L�I8ʉ�'���^�U������8z�8������{{�_����}L�g[q��Ylmm�{7��>xϐ[�H�Q<ą�Y�h׺��|g��y��פ-�8��E?2n�(~[���.�y�B���߷/��}It��=~�O?��T�@�1�G�N�0�������[��.\0:�nFo
��Q��=F��F�߹ó�<���M|�U˖&~���B�n]>�#��<� _�כ�2?y�Q����}�j\%LZy���r��ʹ�9�f2��V�~?M�ܑI����G���9�Z<P���C�����OG��E�fxyz�j�Z���ptp`�k�Ҽu[:w�@�:u���ҥcG�����˓����SL���p��E~^�{{{�j5ƍe��h�Z��n�ŘQd��t�r|�}L�b9J�����Nlt4ג��V"!;'�Yӧ�r�j�?���77W��&9k���2���L�9�of~ŊU����0�/��(�ɘ4u...�8;���εE����j��*���qW[��c�i֤)/
�O�ѣf�ܾ�NAA!Ǎe�	,[��A��s��%Ξ;����.]�ĩ����>@�V��b�(>|�=�'Tkˉ�����4ʥ�r�j^<�	������O����./Ӱ!��_�u�V|5m
}�ʹ��z=J��K�/��1��&�fM�e�ʆr����H��s�Ll�ӫ/_!5-�}@AA!=�u�A�H��鎊��G�nt��Ŭ���6ѳ{7��ك'O��cvV6s,b�8�ۛ����ˌ�S�����ߏ/ƌ慞=8}���W�\�T�
�6���3�ӑԏ�4������qcѨմkӚ~�{�|�J��g�O��������|�Hmm�O8ʥ˗��Hxk�P!�t��%-[I��x9���n>��WG[�)�8J�y�]���Ǽ��8��p��Oz����Χu�V��֍
�6ӣ{7z�z�h<���K��<��<�a�a��uܸy���A�\���
g��1�x{s��&���Gȶ;ٰi3�F�d�G#̎�r���;�3Z?ۊvmZ3�ӑdegs�v:Ӿ�Ɉ��3u�V�[�n��Ғ/ƌ�7^7�+ͻ�����t�:��呒��[C���Aji�_Z�)k�*�D��vGn^���Ť����U訿ʕ�J	$,--��XXX���Ȯ�{1�]֬_ρC���p#�7n�ŋxzx�L�T�������8	EI�
gEY�_[���;w����Ly�L���^���8x�0�!!f�#F����[o���3��xe� :ux��i�k�s�Jl��
"�?_~_���{t��Ύ��L<@LÆFrD"�n���/..�¥K�ݾM�gZ����t8;ؓ��Eqqq����K�V)ߜ���ۧr�cg��6R���,�N�3�{��o����gh��
J�ر{7S�Ϡs��}/Pkd� ���s�/�c}Y�;�H?9�b|���n���e+�VȰ��8̕�Ѿ���ό�|�)���նal;�J��AF��R2�������%?q�Dbv��S�d����1[���P���b+�����ǟ�B.w"�vr�EEň�>�X,6�}++KJTe��lmm����N&#/�I��D"tZ�mdn^

�ӏ?b����ɘ0vL�dG���J��%�zY~���tzJ����0y*/��À~}�b�d��_'$8��s��/K��Ҡ�XXXrƿ<h ڷc���9Y��^��&qqF:2���h4�ݾ���5^���d)j��[))��<�џ~���5ϵkˮ={�ri�4h_�/���K;3+�NGNn�3Ymش���llllL�F``�!/�9�Ü
/<ߓ�?��?6����
oo/RRS�J�&�x�;v��%��,w�es���>/�⻹s	
����丛�+���z�1�}Z�.s������9s�:q"��H$?yy�`��Y�b�Q���݋�&Q�n��>Іr�M|�E�f&~� *�������+����<q�8���GFF"�6l��͔�b�'��&.�3���+m�h�iT..������hu�--h[�l���{�*�r�P� IDAT����e�����w��7��f�z222y�G7V��__�N��
0�N���JIa��;xШ���m�K���ʹ3qp���TK��]��_���Qr�����8?����hL�izs�j��O>fڌ�XYYҵs'����	�;a"��X[�quqAY�d��dee`��
�oRǓ�O�ǟ"[QTT􏧺-_�\?`���q�:`�������ao��ԟz����B�������C�����l
gCG��,*.����䪮����W�/;[�-2Y��d�z=�YY�(�U��k�J�B�ә��|X��������
p��7|��'���j��W\\Lqq1���fm.,,D&�!�;6���Wa���.OjyX�Z-�����,�s*��B����ٹ�P�4;��������_�(�j5j���Ƣ���B���
L�������Ɋ1�r�EE��'2&/]�T�;w��^�x�<(���x��������S���"��/     �!~��!�<E��|���t3/���Wq%Ӓ-��r�ٿ���Z	����ᙖ�%{)����qp�X:�sm���Z�����@"�%���S2��x�����G�z��ڄ��W����Z�a��E���q8>����Tk4�ص��r�n���8h4�^���r��h�ڇ�gbR?-^���R�>I�ݻW#�*����l�_H@�ӳ`[�/R�;$&%��l���?��55w	���J�/))�t>Yx�)��jZ_�9G@�'�i����ݪΜTS����%���g��r/���]��ܼ\�'p-��ż��25j5�w�6[^�Tr��M�wRW�asi��j�^K6��K[�әM�S����a�n.'�^����i_�d��-ԥ��L�i.�ȌY_ӴIc<<�).)�R������꾛�aԯ�{�>V�Ƽ�e���
��z"���:}U�r�3c�״hތ���j}�a��T8_�zp�tss����ZZ��Љ%�
�e��x�<�}�6Ζ%��p~�����4h�RR�t�J��v��M��=�3` 9����8�%˖ЭWoV�ZMii)͞mM±c��6n�B\�g�k��y�=�]F��m�p�g@Y;M�>�v���t�o̚�-���ٳm5n<-۴�C�����q�b���q�V:|�B���;w2k��|:z,�[�e��)$]�ʳ�;дUk�7ذ�Ul߹�qM�1�k�6ټ�W�Z<��	)--���{�^t:���R�$??�.={�����	@ff/�8���:�t�����\I,�yy�ÏY��2V�Z͸���1�k���)����݌z�H��miٶ=7o�2�Y�f-�O�d�K��އ���N�^�ym؛�0���,&L�Bl��6�M�}�T*�٧�[���}x���hӎ�s�����a��π��]�ſ.��3����l7���<�k�72
��U���ʒR��b}g����3w���
d���ZF,:�P��
��'�;`3��5���{g8�:u�πA��„�S�k�3��Ƭ����ӏ��Z���a���p�t�D��2�<��ܕ����C��y��j-�7��@��M��z��O��;xJ���%�Ȣʔ�5I��c�nՊ�w�2�s~A�֬c�7_�Ɉ�g��{}gΞ�Jb"*�����s��Yd2
�����IJa��L�0��VWkKJj*G�c�?;�l�,���,-Y8�N�9���`��l^��ĤD�nڀ��{��'.&�5+��h�}��+zooO6�]Cr�u4
�֬e��f�r��
vt�ҙ���l^��aC�TiC�͇�X��g._�®={���w�:c&_L���!��y�R�{�m�������PPX�W^�ݷ�dŒ%��y�߷YG�Νy�E��Æ1��"���c��5�oӆm;��p�u�Bt��ܺ��ݻ���IIM��>�yy�@�
~��׮q91�U˗1�ӑU����3۷l�c�v�oז�;�q��I�
�3`];u�#x��9s�,G�c���eт*�M��Yw�}�n���w���'��J\%��+��V���i��F�B\I�R>����!�A֬([���z�5��˃2я_��r:�o_�,ggg�n\Obb"�۵e��o1q�T�󡹹��%K�n؀�f��5��hO�Z	��'[F�Ǧ$������F�����:����07/��.�������=��.5*�~gee���ChH0q�b��������Z���'P�-e���L�8�Db87��1�.������a|as��%�<����(�gs�~�&��AP`���.N�<=��A�P�PHQq1	ǎ��?q�D_��?�8:9RRR���&++����p��YC9;;;lll��A��T�
�6+�J�8H��m�pw#$8�vmZ���!Z�,K�{+%��׮q��a�5i�ݻddf�(6//O,--��-�r���ɤ899!wr2��NNNxx�S�n���
��J$X[[���b��/0 ��ڷ#''�J͈��c՚�|���[��qtt���\]\�;9!�ؐ��ŝ�wiѼq�b�J��<u/O/�\]qrt�������_��Ɗ7;�e��.��삟�k�\��Ŏ��B����Q&�C.�.��w�v�y�)�~0�D�\.����_�蜜�>~�~�����9��5����
�WW�<=���6;���n���R��w� ͚4���5j%�;::RT���M����+�h_��+++����L�XS,�m�۴a�S���T���d���(�;~�Nϩ3g�s���
�� 6�!�G��8���q��Y����8z�8];u����^���ǎ���n�N����h4eۼm�mE±c��p��'Nдq�9���c��?��k������nՊ�
�޻�șsgi׶
Q\IL49�q��_�X�
site-packages/sepolicy/communicate.py000064400000003276147511334660014013 0ustar00# Copyright (C) 2012 Red Hat
# see file 'COPYING' for use and warranty information
#
# setrans is a tool for analyzing process transitions in SELinux policy
#
#    This program is free software; you can redistribute it and/or
#    modify it under the terms of the GNU General Public License as
#    published by the Free Software Foundation; either version 2 of
#    the License, or (at your option) any later version.
#
#    This program is distributed in the hope that it will be useful,
#    but WITHOUT ANY WARRANTY; without even the implied warranty of
#    MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
#    GNU General Public License for more details.
#
#    You should have received a copy of the GNU General Public License
#    along with this program; if not, write to the Free Software
#    Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA
#                                        02111-1307  USA
#
#
import sepolicy
import sys


def usage(parser, msg):
    parser.print_help()

    sys.stderr.write("\n%s\n" % msg)
    sys.stderr.flush()
    sys.exit(1)


def expand_attribute(attribute):
    try:
        return list(next(sepolicy.info(sepolicy.ATTRIBUTE, attribute))["types"])
    except StopIteration:
        return [attribute]


def get_types(src, tclass, perm):
    allows = sepolicy.search([sepolicy.ALLOW], {sepolicy.SOURCE: src, sepolicy.CLASS: tclass, sepolicy.PERMS: perm})
    if not allows:
        raise ValueError("The %s type is not allowed to %s any types" % (src, ",".join(perm)))

    tlist = []
    for l in map(lambda y: y[sepolicy.TARGET], filter(lambda x: set(perm).issubset(x[sepolicy.PERMS]), allows)):
        tlist = tlist + expand_attribute(l)
    return tlist
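
# Illustrative usage only -- the type, class and permission names below are
# hypothetical examples and are not defined by this module:
#
#     expand_attribute("domain")              # -> list of types carrying the attribute,
#                                             #    or ["domain"] if it is a plain type
#     get_types("httpd_t", "file", ["read"])  # -> all target types the source type may
#                                             #    "read" as a file, attributes expanded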
site-packages/syspurpose-1.28.32-py3.6.egg-info/entry_points.txt000064400000000065147511334660020211 0ustar00[console_scripts]
syspurpose = syspurpose.main:main

site-packages/syspurpose-1.28.32-py3.6.egg-info/top_level.txt000064400000000013147511334660017436 0ustar00syspurpose
site-packages/syspurpose-1.28.32-py3.6.egg-info/dependency_links.txt000064400000000001147511334660020760 0ustar00
site-packages/syspurpose-1.28.32-py3.6.egg-info/PKG-INFO000064400000000367147511334660016015 0ustar00Metadata-Version: 1.0
Name: syspurpose
Version: 1.28.32
Summary: Manage Red Hat System Purpose
Home-page: http://www.candlepinproject.org
Author: Chris Snyder
Author-email: chainsaw@redhat.com
License: GPLv2
Description: UNKNOWN
Platform: UNKNOWN
site-packages/syspurpose-1.28.32-py3.6.egg-info/SOURCES.txt000064400000000575147511334660016605 0ustar00README.md
setup.cfg
setup.py
man/syspurpose.8
src/syspurpose/__init__.py
src/syspurpose/cli.py
src/syspurpose/files.py
src/syspurpose/i18n.py
src/syspurpose/main.py
src/syspurpose/utils.py
src/syspurpose.egg-info/PKG-INFO
src/syspurpose.egg-info/SOURCES.txt
src/syspurpose.egg-info/dependency_links.txt
src/syspurpose.egg-info/entry_points.txt
src/syspurpose.egg-info/top_level.txtsite-packages/ply/cpp.py000064400000101365147511334660011244 0ustar00# -----------------------------------------------------------------------------
# cpp.py
#
# Author:  David Beazley (http://www.dabeaz.com)
# Copyright (C) 2007
# All rights reserved
#
# This module implements an ANSI-C style lexical preprocessor for PLY. 
# -----------------------------------------------------------------------------
from __future__ import generators

import sys

# Some Python 3 compatibility shims
if sys.version_info.major < 3:
    STRING_TYPES = (str, unicode)
else:
    STRING_TYPES = str
    xrange = range

# -----------------------------------------------------------------------------
# Default preprocessor lexer definitions.   These tokens are enough to get
# a basic preprocessor working.   Other modules may import these if they want
# -----------------------------------------------------------------------------

tokens = (
   'CPP_ID','CPP_INTEGER', 'CPP_FLOAT', 'CPP_STRING', 'CPP_CHAR', 'CPP_WS', 'CPP_COMMENT1', 'CPP_COMMENT2', 'CPP_POUND','CPP_DPOUND'
)

literals = "+-*/%|&~^<>=!?()[]{}.,;:\\\'\""

# Whitespace
def t_CPP_WS(t):
    r'\s+'
    t.lexer.lineno += t.value.count("\n")
    return t

t_CPP_POUND = r'\#'
t_CPP_DPOUND = r'\#\#'

# Identifier
t_CPP_ID = r'[A-Za-z_][\w_]*'

# Integer literal
def CPP_INTEGER(t):
    r'(((((0x)|(0X))[0-9a-fA-F]+)|(\d+))([uU][lL]|[lL][uU]|[uU]|[lL])?)'
    return t

t_CPP_INTEGER = CPP_INTEGER

# Floating literal
t_CPP_FLOAT = r'((\d+)(\.\d+)(e(\+|-)?(\d+))? | (\d+)e(\+|-)?(\d+))([lL]|[fF])?'

# String literal
def t_CPP_STRING(t):
    r'\"([^\\\n]|(\\(.|\n)))*?\"'
    t.lexer.lineno += t.value.count("\n")
    return t

# Character constant 'c' or L'c'
def t_CPP_CHAR(t):
    r'(L)?\'([^\\\n]|(\\(.|\n)))*?\''
    t.lexer.lineno += t.value.count("\n")
    return t

# Comment
def t_CPP_COMMENT1(t):
    r'(/\*(.|\n)*?\*/)'
    ncr = t.value.count("\n")
    t.lexer.lineno += ncr
    # replace with one space or a number of '\n'
    t.type = 'CPP_WS'; t.value = '\n' * ncr if ncr else ' '
    return t

# Line comment
def t_CPP_COMMENT2(t):
    r'(//.*?(\n|$))'
    # replace the comment with '\n' so later passes still see the end of line
    t.type = 'CPP_WS'; t.value = '\n'
    return t
    
def t_error(t):
    t.type = t.value[0]
    t.value = t.value[0]
    t.lexer.skip(1)
    return t

import re
import copy
import time
import os.path

# -----------------------------------------------------------------------------
# trigraph()
# 
# Given an input string, this function replaces all trigraph sequences. 
# The following mapping is used:
#
#     ??=    #
#     ??/    \
#     ??'    ^
#     ??(    [
#     ??)    ]
#     ??!    |
#     ??<    {
#     ??>    }
#     ??-    ~
# -----------------------------------------------------------------------------

_trigraph_pat = re.compile(r'''\?\?[=/\'\(\)\!<>\-]''')
_trigraph_rep = {
    '=':'#',
    '/':'\\',
    "'":'^',
    '(':'[',
    ')':']',
    '!':'|',
    '<':'{',
    '>':'}',
    '-':'~'
}

def trigraph(input):
    return _trigraph_pat.sub(lambda g: _trigraph_rep[g.group()[-1]],input)
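
# A couple of worked examples of the mapping above (illustration only):
#
#     trigraph("??=define ARR ??(10??)")   # -> "#define ARR [10]"
#     trigraph("a ??! b")                  # -> "a | b"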

# ------------------------------------------------------------------
# Macro object
#
# This object holds information about preprocessor macros
#
#    .name      - Macro name (string)
#    .value     - Macro value (a list of tokens)
#    .arglist   - List of argument names
#    .variadic  - Boolean indicating whether or not variadic macro
#    .vararg    - Name of the variadic parameter
#
# When a macro is created, the macro replacement token sequence is
# pre-scanned and used to create patch lists that are later used
# during macro expansion
# ------------------------------------------------------------------

class Macro(object):
    def __init__(self,name,value,arglist=None,variadic=False):
        self.name = name
        self.value = value
        self.arglist = arglist
        self.variadic = variadic
        if variadic:
            self.vararg = arglist[-1]
        self.source = None
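
# For example (illustration only), processing "#define SQUARE(x) ((x)*(x))" through
# Preprocessor.define() below stores a Macro with name 'SQUARE', arglist ['x'],
# variadic False, and value holding the tokens of ((x)*(x)); macro_prescan() then
# records the positions where the argument 'x' must be patched in during expansion.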

# ------------------------------------------------------------------
# Preprocessor object
#
# Object representing a preprocessor.  Contains macro definitions,
# include directories, and other information
# ------------------------------------------------------------------

class Preprocessor(object):
    def __init__(self,lexer=None):
        if lexer is None:
            lexer = lex.lexer
        self.lexer = lexer
        self.macros = { }
        self.path = []
        self.temp_path = []

        # Probe the lexer for selected tokens
        self.lexprobe()

        tm = time.localtime()
        self.define("__DATE__ \"%s\"" % time.strftime("%b %d %Y",tm))
        self.define("__TIME__ \"%s\"" % time.strftime("%H:%M:%S",tm))
        self.parser = None

    # -----------------------------------------------------------------------------
    # tokenize()
    #
    # Utility function. Given a string of text, tokenize into a list of tokens
    # -----------------------------------------------------------------------------

    def tokenize(self,text):
        tokens = []
        self.lexer.input(text)
        while True:
            tok = self.lexer.token()
            if not tok: break
            tokens.append(tok)
        return tokens
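
    # For instance (illustration), with the default lexer defined at the top of this
    # module, self.tokenize("a + 1") returns LexToken objects for the identifier 'a',
    # the surrounding whitespace, the '+' literal and the integer '1', in that order.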

    # ---------------------------------------------------------------------
    # error()
    #
    # Report a preprocessor error/warning of some kind
    # ----------------------------------------------------------------------

    def error(self,file,line,msg):
        print("%s:%d %s" % (file,line,msg))

    # ----------------------------------------------------------------------
    # lexprobe()
    #
    # This method probes the preprocessor lexer object to discover
    # the token types of symbols that are important to the preprocessor.
    # If this works right, the preprocessor will simply "work"
    # with any suitable lexer regardless of how tokens have been named.
    # ----------------------------------------------------------------------

    def lexprobe(self):

        # Determine the token type for identifiers
        self.lexer.input("identifier")
        tok = self.lexer.token()
        if not tok or tok.value != "identifier":
            print("Couldn't determine identifier type")
        else:
            self.t_ID = tok.type

        # Determine the token type for integers
        self.lexer.input("12345")
        tok = self.lexer.token()
        if not tok or int(tok.value) != 12345:
            print("Couldn't determine integer type")
        else:
            self.t_INTEGER = tok.type
            self.t_INTEGER_TYPE = type(tok.value)

        # Determine the token type for strings enclosed in double quotes
        self.lexer.input("\"filename\"")
        tok = self.lexer.token()
        if not tok or tok.value != "\"filename\"":
            print("Couldn't determine string type")
        else:
            self.t_STRING = tok.type

        # Determine the token type for whitespace--if any
        self.lexer.input("  ")
        tok = self.lexer.token()
        if not tok or tok.value != "  ":
            self.t_SPACE = None
        else:
            self.t_SPACE = tok.type

        # Determine the token type for newlines
        self.lexer.input("\n")
        tok = self.lexer.token()
        if not tok or tok.value != "\n":
            self.t_NEWLINE = None
            print("Couldn't determine token for newlines")
        else:
            self.t_NEWLINE = tok.type

        self.t_WS = (self.t_SPACE, self.t_NEWLINE)

        # Check for other characters used by the preprocessor
        chars = [ '<','>','#','##','\\','(',')',',','.']
        for c in chars:
            self.lexer.input(c)
            tok = self.lexer.token()
            if not tok or tok.value != c:
                print("Unable to lex '%s' required for preprocessor" % c)

    # ----------------------------------------------------------------------
    # add_path()
    #
    # Adds a search path to the preprocessor.  
    # ----------------------------------------------------------------------

    def add_path(self,path):
        self.path.append(path)

    # ----------------------------------------------------------------------
    # group_lines()
    #
    # Given an input string, this function splits it into lines.  Trailing whitespace
    # is removed.   Any line ending with \ is grouped with the next line.  This
    # function forms the lowest level of the preprocessor---grouping text into
    # a line-by-line format.
    # ----------------------------------------------------------------------

    def group_lines(self,input):
        lex = self.lexer.clone()
        lines = [x.rstrip() for x in input.splitlines()]
        for i in xrange(len(lines)):
            j = i+1
            while lines[i].endswith('\\') and (j < len(lines)):
                lines[i] = lines[i][:-1]+lines[j]
                lines[j] = ""
                j += 1

        input = "\n".join(lines)
        lex.input(input)
        lex.lineno = 1

        current_line = []
        while True:
            tok = lex.token()
            if not tok:
                break
            current_line.append(tok)
            if tok.type in self.t_WS and '\n' in tok.value:
                yield current_line
                current_line = []

        if current_line:
            yield current_line
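
    # Example (illustration): the two physical lines
    #     #define MAX 10 \
    #     + 20
    # are joined into the single logical line "#define MAX 10 + 20" before the clone
    # of the lexer retokenizes the text and the tokens are yielded line by line.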

    # ----------------------------------------------------------------------
    # tokenstrip()
    # 
    # Remove leading/trailing whitespace tokens from a token list
    # ----------------------------------------------------------------------

    def tokenstrip(self,tokens):
        i = 0
        while i < len(tokens) and tokens[i].type in self.t_WS:
            i += 1
        del tokens[:i]
        i = len(tokens)-1
        while i >= 0 and tokens[i].type in self.t_WS:
            i -= 1
        del tokens[i+1:]
        return tokens


    # ----------------------------------------------------------------------
    # collect_args()
    #
    # Collects comma separated arguments from a list of tokens.   The arguments
    # must be enclosed in parenthesis.  Returns a tuple (tokencount,args,positions)
    # where tokencount is the number of tokens consumed, args is a list of arguments,
    # and positions is a list of integers containing the starting index of each
    # argument.  Each argument is represented by a list of tokens.
    #
    # When collecting arguments, leading and trailing whitespace is removed
    # from each argument.  
    #
    # This function properly handles nested parenthesis and commas---these do not
    # define new arguments.
    # ----------------------------------------------------------------------

    def collect_args(self,tokenlist):
        args = []
        positions = []
        current_arg = []
        nesting = 1
        tokenlen = len(tokenlist)
    
        # Search for the opening '('.
        i = 0
        while (i < tokenlen) and (tokenlist[i].type in self.t_WS):
            i += 1

        if (i < tokenlen) and (tokenlist[i].value == '('):
            positions.append(i+1)
        else:
            self.error(self.source,tokenlist[0].lineno,"Missing '(' in macro arguments")
            return 0, [], []

        i += 1

        while i < tokenlen:
            t = tokenlist[i]
            if t.value == '(':
                current_arg.append(t)
                nesting += 1
            elif t.value == ')':
                nesting -= 1
                if nesting == 0:
                    if current_arg:
                        args.append(self.tokenstrip(current_arg))
                        positions.append(i)
                    return i+1,args,positions
                current_arg.append(t)
            elif t.value == ',' and nesting == 1:
                args.append(self.tokenstrip(current_arg))
                positions.append(i+1)
                current_arg = []
            else:
                current_arg.append(t)
            i += 1
    
        # Missing end argument
        self.error(self.source,tokenlist[-1].lineno,"Missing ')' in macro arguments")
        return 0, [],[]
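
    # Worked example (illustration): for the token stream of "(a, f(b, c), d) rest",
    # this returns the position just past the closing ')', the three arguments
    # [a], [f ( b , c )] and [d], and their starting indices; the comma inside f(...)
    # is protected by the nesting counter and does not start a new argument.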

    # ----------------------------------------------------------------------
    # macro_prescan()
    #
    # Examine the macro value (token sequence) and identify patch points
    # This is used to speed up macro expansion later on---we'll know
    # right away where to apply patches to the value to form the expansion
    # ----------------------------------------------------------------------
    
    def macro_prescan(self,macro):
        macro.patch     = []             # Standard macro arguments 
        macro.str_patch = []             # String conversion expansion
        macro.var_comma_patch = []       # Variadic macro comma patch
        i = 0
        while i < len(macro.value):
            if macro.value[i].type == self.t_ID and macro.value[i].value in macro.arglist:
                argnum = macro.arglist.index(macro.value[i].value)
                # Conversion of argument to a string
                if i > 0 and macro.value[i-1].value == '#':
                    macro.value[i] = copy.copy(macro.value[i])
                    macro.value[i].type = self.t_STRING
                    del macro.value[i-1]
                    macro.str_patch.append((argnum,i-1))
                    continue
                # Concatenation
                elif (i > 0 and macro.value[i-1].value == '##'):
                    macro.patch.append(('c',argnum,i-1))
                    del macro.value[i-1]
                    continue
                elif ((i+1) < len(macro.value) and macro.value[i+1].value == '##'):
                    macro.patch.append(('c',argnum,i))
                    i += 1
                    continue
                # Standard expansion
                else:
                    macro.patch.append(('e',argnum,i))
            elif macro.value[i].value == '##':
                if macro.variadic and (i > 0) and (macro.value[i-1].value == ',') and \
                        ((i+1) < len(macro.value)) and (macro.value[i+1].type == self.t_ID) and \
                        (macro.value[i+1].value == macro.vararg):
                    macro.var_comma_patch.append(i-1)
            i += 1
        macro.patch.sort(key=lambda x: x[2],reverse=True)

    # ----------------------------------------------------------------------
    # macro_expand_args()
    #
    # Given a Macro and list of arguments (each a token list), this method
    # returns an expanded version of a macro.  The return value is a token sequence
    # representing the replacement macro tokens
    # ----------------------------------------------------------------------

    def macro_expand_args(self,macro,args):
        # Make a copy of the macro token sequence
        rep = [copy.copy(_x) for _x in macro.value]

        # Make string expansion patches.  These do not alter the length of the replacement sequence
        
        str_expansion = {}
        for argnum, i in macro.str_patch:
            if argnum not in str_expansion:
                str_expansion[argnum] = ('"%s"' % "".join([x.value for x in args[argnum]])).replace("\\","\\\\")
            rep[i] = copy.copy(rep[i])
            rep[i].value = str_expansion[argnum]

        # Make the variadic macro comma patch.  If the variadic macro argument is empty, we get rid of the comma before it
        comma_patch = False
        if macro.variadic and not args[-1]:
            for i in macro.var_comma_patch:
                rep[i] = None
                comma_patch = True

        # Make all other patches.   The order of these matters.  It is assumed that the patch list
        # has been sorted in reverse order of patch location since replacements will cause the
        # size of the replacement sequence to expand from the patch point.
        
        expanded = { }
        for ptype, argnum, i in macro.patch:
            # Concatenation.   Argument is left unexpanded
            if ptype == 'c':
                rep[i:i+1] = args[argnum]
            # Normal expansion.  Argument is macro expanded first
            elif ptype == 'e':
                if argnum not in expanded:
                    expanded[argnum] = self.expand_macros(args[argnum])
                rep[i:i+1] = expanded[argnum]

        # Get rid of removed comma if necessary
        if comma_patch:
            rep = [_i for _i in rep if _i]

        return rep


    # ----------------------------------------------------------------------
    # expand_macros()
    #
    # Given a list of tokens, this function performs macro expansion.
    # The expanded argument is a dictionary that contains macros already
    # expanded.  This is used to prevent infinite recursion.
    # ----------------------------------------------------------------------

    def expand_macros(self,tokens,expanded=None):
        if expanded is None:
            expanded = {}
        i = 0
        while i < len(tokens):
            t = tokens[i]
            if t.type == self.t_ID:
                if t.value in self.macros and t.value not in expanded:
                    # Yes, we found a macro match
                    expanded[t.value] = True
                    
                    m = self.macros[t.value]
                    if not m.arglist:
                        # A simple macro
                        ex = self.expand_macros([copy.copy(_x) for _x in m.value],expanded)
                        for e in ex:
                            e.lineno = t.lineno
                        tokens[i:i+1] = ex
                        i += len(ex)
                    else:
                        # A macro with arguments
                        j = i + 1
                        while j < len(tokens) and tokens[j].type in self.t_WS:
                            j += 1
                        if tokens[j].value == '(':
                            tokcount,args,positions = self.collect_args(tokens[j:])
                            if not m.variadic and len(args) !=  len(m.arglist):
                                self.error(self.source,t.lineno,"Macro %s requires %d arguments" % (t.value,len(m.arglist)))
                                i = j + tokcount
                            elif m.variadic and len(args) < len(m.arglist)-1:
                                if len(m.arglist) > 2:
                                    self.error(self.source,t.lineno,"Macro %s must have at least %d arguments" % (t.value, len(m.arglist)-1))
                                else:
                                    self.error(self.source,t.lineno,"Macro %s must have at least %d argument" % (t.value, len(m.arglist)-1))
                                i = j + tokcount
                            else:
                                if m.variadic:
                                    if len(args) == len(m.arglist)-1:
                                        args.append([])
                                    else:
                                        args[len(m.arglist)-1] = tokens[j+positions[len(m.arglist)-1]:j+tokcount-1]
                                        del args[len(m.arglist):]
                                        
                                # Get macro replacement text
                                rep = self.macro_expand_args(m,args)
                                rep = self.expand_macros(rep,expanded)
                                for r in rep:
                                    r.lineno = t.lineno
                                tokens[i:j+tokcount] = rep
                                i += len(rep)
                    del expanded[t.value]
                    continue
                elif t.value == '__LINE__':
                    t.type = self.t_INTEGER
                    t.value = self.t_INTEGER_TYPE(t.lineno)
                
            i += 1
        return tokens

    # ----------------------------------------------------------------------    
    # evalexpr()
    # 
    # Evaluate an expression token sequence for the purposes of evaluating
    # integral expressions.
    # ----------------------------------------------------------------------

    def evalexpr(self,tokens):
        # tokens = tokenize(line)
        # Search for defined macros
        i = 0
        while i < len(tokens):
            if tokens[i].type == self.t_ID and tokens[i].value == 'defined':
                j = i + 1
                needparen = False
                result = "0L"
                while j < len(tokens):
                    if tokens[j].type in self.t_WS:
                        j += 1
                        continue
                    elif tokens[j].type == self.t_ID:
                        if tokens[j].value in self.macros:
                            result = "1L"
                        else:
                            result = "0L"
                        if not needparen: break
                    elif tokens[j].value == '(':
                        needparen = True
                    elif tokens[j].value == ')':
                        break
                    else:
                        self.error(self.source,tokens[i].lineno,"Malformed defined()")
                    j += 1
                tokens[i].type = self.t_INTEGER
                tokens[i].value = self.t_INTEGER_TYPE(result)
                del tokens[i+1:j+1]
            i += 1
        tokens = self.expand_macros(tokens)
        for i,t in enumerate(tokens):
            if t.type == self.t_ID:
                tokens[i] = copy.copy(t)
                tokens[i].type = self.t_INTEGER
                tokens[i].value = self.t_INTEGER_TYPE("0L")
            elif t.type == self.t_INTEGER:
                tokens[i] = copy.copy(t)
                # Strip off any trailing suffixes
                tokens[i].value = str(tokens[i].value)
                while tokens[i].value[-1] not in "0123456789abcdefABCDEF":
                    tokens[i].value = tokens[i].value[:-1]
        
        expr = "".join([str(x.value) for x in tokens])
        expr = expr.replace("&&"," and ")
        expr = expr.replace("||"," or ")
        expr = expr.replace("!"," not ")
        try:
            result = eval(expr)
        except Exception:
            self.error(self.source,tokens[0].lineno,"Couldn't evaluate expression")
            result = 0
        return result
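
    # Example (illustration): with "#define FOO 3" in effect, the condition
    # "FOO > 1 && defined(FOO)" is rewritten token by token into the Python
    # expression "3 > 1 and 1" (up to whitespace), which eval() reports as true,
    # so the enclosing #if branch is enabled.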

    # ----------------------------------------------------------------------
    # parsegen()
    #
    # Parse an input string.
    # ----------------------------------------------------------------------
    def parsegen(self,input,source=None):

        # Replace trigraph sequences
        t = trigraph(input)
        lines = self.group_lines(t)

        if not source:
            source = ""
            
        self.define("__FILE__ \"%s\"" % source)

        self.source = source
        chunk = []
        enable = True
        iftrigger = False
        ifstack = []

        for x in lines:
            for i,tok in enumerate(x):
                if tok.type not in self.t_WS: break
            if tok.value == '#':
                # Preprocessor directive

                # insert necessary whitespace instead of eaten tokens
                for tok in x:
                    if tok.type in self.t_WS and '\n' in tok.value:
                        chunk.append(tok)
                
                dirtokens = self.tokenstrip(x[i+1:])
                if dirtokens:
                    name = dirtokens[0].value
                    args = self.tokenstrip(dirtokens[1:])
                else:
                    name = ""
                    args = []
                
                if name == 'define':
                    if enable:
                        for tok in self.expand_macros(chunk):
                            yield tok
                        chunk = []
                        self.define(args)
                elif name == 'include':
                    if enable:
                        for tok in self.expand_macros(chunk):
                            yield tok
                        chunk = []
                        oldfile = self.macros['__FILE__']
                        for tok in self.include(args):
                            yield tok
                        self.macros['__FILE__'] = oldfile
                        self.source = source
                elif name == 'undef':
                    if enable:
                        for tok in self.expand_macros(chunk):
                            yield tok
                        chunk = []
                        self.undef(args)
                elif name == 'ifdef':
                    ifstack.append((enable,iftrigger))
                    if enable:
                        if not args[0].value in self.macros:
                            enable = False
                            iftrigger = False
                        else:
                            iftrigger = True
                elif name == 'ifndef':
                    ifstack.append((enable,iftrigger))
                    if enable:
                        if args[0].value in self.macros:
                            enable = False
                            iftrigger = False
                        else:
                            iftrigger = True
                elif name == 'if':
                    ifstack.append((enable,iftrigger))
                    if enable:
                        result = self.evalexpr(args)
                        if not result:
                            enable = False
                            iftrigger = False
                        else:
                            iftrigger = True
                elif name == 'elif':
                    if ifstack:
                        if ifstack[-1][0]:     # We only pay attention if outer "if" allows this
                            if enable:         # If already true, we flip enable False
                                enable = False
                            elif not iftrigger:   # If False, but not triggered yet, we'll check expression
                                result = self.evalexpr(args)
                                if result:
                                    enable  = True
                                    iftrigger = True
                    else:
                        self.error(self.source,dirtokens[0].lineno,"Misplaced #elif")
                        
                elif name == 'else':
                    if ifstack:
                        if ifstack[-1][0]:
                            if enable:
                                enable = False
                            elif not iftrigger:
                                enable = True
                                iftrigger = True
                    else:
                        self.error(self.source,dirtokens[0].lineno,"Misplaced #else")

                elif name == 'endif':
                    if ifstack:
                        enable,iftrigger = ifstack.pop()
                    else:
                        self.error(self.source,dirtokens[0].lineno,"Misplaced #endif")
                else:
                    # Unknown preprocessor directive
                    pass

            else:
                # Normal text
                if enable:
                    chunk.extend(x)

        for tok in self.expand_macros(chunk):
            yield tok
        chunk = []

    # ----------------------------------------------------------------------
    # include()
    #
    # Implementation of file-inclusion
    # ----------------------------------------------------------------------

    def include(self,tokens):
        # Try to extract the filename and then process an include file
        if not tokens:
            return
        if tokens:
            if tokens[0].value != '<' and tokens[0].type != self.t_STRING:
                tokens = self.expand_macros(tokens)

            if tokens[0].value == '<':
                # Include <...>
                i = 1
                while i < len(tokens):
                    if tokens[i].value == '>':
                        break
                    i += 1
                else:
                    print("Malformed #include <...>")
                    return
                filename = "".join([x.value for x in tokens[1:i]])
                path = self.path + [""] + self.temp_path
            elif tokens[0].type == self.t_STRING:
                filename = tokens[0].value[1:-1]
                path = self.temp_path + [""] + self.path
            else:
                print("Malformed #include statement")
                return
        for p in path:
            iname = os.path.join(p,filename)
            try:
                data = open(iname,"r").read()
                dname = os.path.dirname(iname)
                if dname:
                    self.temp_path.insert(0,dname)
                for tok in self.parsegen(data,filename):
                    yield tok
                if dname:
                    del self.temp_path[0]
                break
            except IOError:
                pass
        else:
            print("Couldn't find '%s'" % filename)

    # ----------------------------------------------------------------------
    # define()
    #
    # Define a new macro
    # ----------------------------------------------------------------------

    def define(self,tokens):
        if isinstance(tokens,STRING_TYPES):
            tokens = self.tokenize(tokens)

        linetok = tokens
        try:
            name = linetok[0]
            if len(linetok) > 1:
                mtype = linetok[1]
            else:
                mtype = None
            if not mtype:
                m = Macro(name.value,[])
                self.macros[name.value] = m
            elif mtype.type in self.t_WS:
                # A normal macro
                m = Macro(name.value,self.tokenstrip(linetok[2:]))
                self.macros[name.value] = m
            elif mtype.value == '(':
                # A macro with arguments
                tokcount, args, positions = self.collect_args(linetok[1:])
                variadic = False
                for a in args:
                    if variadic:
                        print("No more arguments may follow a variadic argument")
                        break
                    astr = "".join([str(_i.value) for _i in a])
                    if astr == "...":
                        variadic = True
                        a[0].type = self.t_ID
                        a[0].value = '__VA_ARGS__'
                        variadic = True
                        del a[1:]
                        continue
                    elif astr[-3:] == "..." and a[0].type == self.t_ID:
                        variadic = True
                        del a[1:]
                        # If, for some reason, "." is part of the identifier, strip off the name for the purposes
                        # of macro expansion
                        if a[0].value[-3:] == '...':
                            a[0].value = a[0].value[:-3]
                        continue
                    if len(a) > 1 or a[0].type != self.t_ID:
                        print("Invalid macro argument")
                        break
                else:
                    mvalue = self.tokenstrip(linetok[1+tokcount:])
                    i = 0
                    while i < len(mvalue):
                        if i+1 < len(mvalue):
                            if mvalue[i].type in self.t_WS and mvalue[i+1].value == '##':
                                del mvalue[i]
                                continue
                            elif mvalue[i].value == '##' and mvalue[i+1].type in self.t_WS:
                                del mvalue[i+1]
                        i += 1
                    m = Macro(name.value,mvalue,[x[0].value for x in args],variadic)
                    self.macro_prescan(m)
                    self.macros[name.value] = m
            else:
                print("Bad macro definition")
        except LookupError:
            print("Bad macro definition")

    # ----------------------------------------------------------------------
    # undef()
    #
    # Undefine a macro
    # ----------------------------------------------------------------------

    def undef(self,tokens):
        id = tokens[0].value
        try:
            del self.macros[id]
        except LookupError:
            pass

    # ----------------------------------------------------------------------
    # parse()
    #
    # Parse input text.
    # ----------------------------------------------------------------------
    def parse(self,input,source=None,ignore={}):
        self.ignore = ignore
        self.parser = self.parsegen(input,source)
        
    # ----------------------------------------------------------------------
    # token()
    #
    # Method to return individual tokens
    # ----------------------------------------------------------------------
    def token(self):
        try:
            while True:
                tok = next(self.parser)
                if tok.type not in self.ignore: return tok
        except StopIteration:
            self.parser = None
            return None

if __name__ == '__main__':
    import ply.lex as lex
    lexer = lex.lex()

    # Run a preprocessor
    import sys
    f = open(sys.argv[1])
    input = f.read()

    p = Preprocessor(lexer)
    p.parse(input,sys.argv[1])
    while True:
        tok = p.token()
        if not tok: break
        print(p.source, tok)




    






site-packages/ply/yacc.py000064400000414106147511334660011401 0ustar00# -----------------------------------------------------------------------------
# ply: yacc.py
#
# Copyright (C) 2001-2016
# David M. Beazley (Dabeaz LLC)
# All rights reserved.
#
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions are
# met:
#
# * Redistributions of source code must retain the above copyright notice,
#   this list of conditions and the following disclaimer.
# * Redistributions in binary form must reproduce the above copyright notice,
#   this list of conditions and the following disclaimer in the documentation
#   and/or other materials provided with the distribution.
# * Neither the name of the David Beazley or Dabeaz LLC may be used to
#   endorse or promote products derived from this software without
#  specific prior written permission.
#
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
# "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
# LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
# A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
# OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
# SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
# LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
# DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
# THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
# (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
# OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
# -----------------------------------------------------------------------------
#
# This implements an LR parser that is constructed from grammar rules defined
# as Python functions. The grammar is specified by supplying the BNF inside
# Python documentation strings.  The inspiration for this technique was borrowed
# from John Aycock's Spark parsing system.  PLY might be viewed as a cross between
# Spark and the GNU bison utility.
#
# The current implementation is only somewhat object-oriented. The
# LR parser itself is defined in terms of an object (which allows multiple
# parsers to co-exist).  However, most of the variables used during table
# construction are defined in terms of global variables.  Users shouldn't
# notice unless they are trying to define multiple parsers at the same
# time using threads (in which case they should have their head examined).
#
# This implementation supports both SLR and LALR(1) parsing.  LALR(1)
# support was originally implemented by Elias Ioup (ezioup@alumni.uchicago.edu),
# using the algorithm found in Aho, Sethi, and Ullman "Compilers: Principles,
# Techniques, and Tools" (The Dragon Book).  LALR(1) has since been replaced
# by the more efficient DeRemer and Pennello algorithm.
#
# :::::::: WARNING :::::::
#
# Construction of LR parsing tables is fairly complicated and expensive.
# To make this module run fast, a *LOT* of work has been put into
# optimization---often at the expense of readability and what one might
# consider to be good Python "coding style."   Modify the code at your
# own risk!
# ----------------------------------------------------------------------------

import re
import types
import sys
import os.path
import inspect
import base64
import warnings

__version__    = '3.9'
__tabversion__ = '3.8'

#-----------------------------------------------------------------------------
#                     === User configurable parameters ===
#
# Change these to modify the default behavior of yacc (if you wish)
#-----------------------------------------------------------------------------

yaccdebug   = True             # Debugging mode.  If set, yacc generates a
                               # 'parser.out' file in the current directory

debug_file  = 'parser.out'     # Default name of the debugging file
tab_module  = 'parsetab'       # Default name of the table module
default_lr  = 'LALR'           # Default LR table generation method

error_count = 3                # Number of symbols that must be shifted to leave recovery mode

yaccdevel   = False            # Set to True if developing yacc.  This turns off optimized
                               # implementations of certain functions.

resultlimit = 40               # Size limit of results when running in debug mode.

pickle_protocol = 0            # Protocol to use when writing pickle files

# String type-checking compatibility
if sys.version_info[0] < 3:
    string_types = basestring
else:
    string_types = str

MAXINT = sys.maxsize

# This object is a stand-in for a logging object created by the
# logging module.   PLY will use this by default to create things
# such as the parser.out file.  If a user wants more detailed
# information, they can create their own logging object and pass
# it into PLY.

class PlyLogger(object):
    def __init__(self, f):
        self.f = f

    def debug(self, msg, *args, **kwargs):
        self.f.write((msg % args) + '\n')

    info = debug

    def warning(self, msg, *args, **kwargs):
        self.f.write('WARNING: ' + (msg % args) + '\n')

    def error(self, msg, *args, **kwargs):
        self.f.write('ERROR: ' + (msg % args) + '\n')

    critical = debug

# Null logger is used when no output is generated. Does nothing.
class NullLogger(object):
    def __getattribute__(self, name):
        return self

    def __call__(self, *args, **kwargs):
        return self

# Exception raised for yacc-related errors
class YaccError(Exception):
    pass

# Format the result message that the parser produces when running in debug mode.
def format_result(r):
    repr_str = repr(r)
    if '\n' in repr_str:
        repr_str = repr(repr_str)
    if len(repr_str) > resultlimit:
        repr_str = repr_str[:resultlimit] + ' ...'
    result = '<%s @ 0x%x> (%s)' % (type(r).__name__, id(r), repr_str)
    return result

# Format stack entries when the parser is running in debug mode
def format_stack_entry(r):
    repr_str = repr(r)
    if '\n' in repr_str:
        repr_str = repr(repr_str)
    if len(repr_str) < 16:
        return repr_str
    else:
        return '<%s @ 0x%x>' % (type(r).__name__, id(r))

# Panic mode error recovery support.   This feature is being reworked--much of the
# code here is to offer a deprecation/backwards compatible transition

_errok = None
_token = None
_restart = None
_warnmsg = '''PLY: Don't use global functions errok(), token(), and restart() in p_error().
Instead, invoke the methods on the associated parser instance:

    def p_error(p):
        ...
        # Use parser.errok(), parser.token(), parser.restart()
        ...

    parser = yacc.yacc()
'''

def errok():
    warnings.warn(_warnmsg)
    return _errok()

def restart():
    warnings.warn(_warnmsg)
    return _restart()

def token():
    warnings.warn(_warnmsg)
    return _token()

# Utility function to call the p_error() function with some deprecation hacks
def call_errorfunc(errorfunc, token, parser):
    global _errok, _token, _restart
    _errok = parser.errok
    _token = parser.token
    _restart = parser.restart
    r = errorfunc(token)
    try:
        del _errok, _token, _restart
    except NameError:
        pass
    return r

#-----------------------------------------------------------------------------
#                        ===  LR Parsing Engine ===
#
# The following classes are used for the LR parser itself.  These are not
# used during table construction and are independent of the actual LR
# table generation algorithm
#-----------------------------------------------------------------------------

# This class is used to hold non-terminal grammar symbols during parsing.
# It normally has the following attributes set:
#        .type       = Grammar symbol type
#        .value      = Symbol value
#        .lineno     = Starting line number
#        .endlineno  = Ending line number (optional, set automatically)
#        .lexpos     = Starting lex position
#        .endlexpos  = Ending lex position (optional, set automatically)

class YaccSymbol:
    def __str__(self):
        return self.type

    def __repr__(self):
        return str(self)

# This class is a wrapper around the objects actually passed to each
# grammar rule.   Index lookup and assignment actually assign the
# .value attribute of the underlying YaccSymbol object.
# The lineno() method returns the line number of a given
# item (or 0 if not defined).   The linespan() method returns
# a tuple of (startline,endline) representing the range of lines
# for a symbol.  The lexspan() method returns a tuple (lexpos,endlexpos)
# representing the range of positional information for a symbol.

class YaccProduction:
    def __init__(self, s, stack=None):
        self.slice = s
        self.stack = stack
        self.lexer = None
        self.parser = None

    def __getitem__(self, n):
        if isinstance(n, slice):
            return [s.value for s in self.slice[n]]
        elif n >= 0:
            return self.slice[n].value
        else:
            return self.stack[n].value

    def __setitem__(self, n, v):
        self.slice[n].value = v

    def __getslice__(self, i, j):
        return [s.value for s in self.slice[i:j]]

    def __len__(self):
        return len(self.slice)

    def lineno(self, n):
        return getattr(self.slice[n], 'lineno', 0)

    def set_lineno(self, n, lineno):
        self.slice[n].lineno = lineno

    def linespan(self, n):
        startline = getattr(self.slice[n], 'lineno', 0)
        endline = getattr(self.slice[n], 'endlineno', startline)
        return startline, endline

    def lexpos(self, n):
        return getattr(self.slice[n], 'lexpos', 0)

    def lexspan(self, n):
        startpos = getattr(self.slice[n], 'lexpos', 0)
        endpos = getattr(self.slice[n], 'endlexpos', startpos)
        return startpos, endpos

    def error(self):
        raise SyntaxError

# -----------------------------------------------------------------------------
#                               == LRParser ==
#
# The LR Parsing engine.
# -----------------------------------------------------------------------------

class LRParser:
    def __init__(self, lrtab, errorf):
        self.productions = lrtab.lr_productions
        self.action = lrtab.lr_action
        self.goto = lrtab.lr_goto
        self.errorfunc = errorf
        self.set_defaulted_states()
        self.errorok = True

    def errok(self):
        self.errorok = True

    def restart(self):
        del self.statestack[:]
        del self.symstack[:]
        sym = YaccSymbol()
        sym.type = '$end'
        self.symstack.append(sym)
        self.statestack.append(0)

    # Defaulted state support.
    # This method identifies parser states where there is only one possible reduction action.
    # For such states, the parser can choose to make a rule reduction without consuming
    # the next look-ahead token.  This delayed invocation of the tokenizer can be useful in
    # certain kinds of advanced parsing situations where the lexer and parser interact with
    # each other or change states (i.e., manipulation of scope, lexer states, etc.).
    #
    # See:  http://www.gnu.org/software/bison/manual/html_node/Default-Reductions.html#Default-Reductions
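    #
    # A hedged usage note: code that needs the lexer to be consulted for every
    # single token (for example, an interactive input loop that must not read
    # ahead) can turn this optimization off on the constructed parser:
    #
    #     parser = yacc.yacc()
    #     parser.disable_defaulted_states()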
    def set_defaulted_states(self):
        self.defaulted_states = {}
        for state, actions in self.action.items():
            rules = list(actions.values())
            if len(rules) == 1 and rules[0] < 0:
                self.defaulted_states[state] = rules[0]

    def disable_defaulted_states(self):
        self.defaulted_states = {}

    def parse(self, input=None, lexer=None, debug=False, tracking=False, tokenfunc=None):
        if debug or yaccdevel:
            if isinstance(debug, int):
                debug = PlyLogger(sys.stderr)
            return self.parsedebug(input, lexer, debug, tracking, tokenfunc)
        elif tracking:
            return self.parseopt(input, lexer, debug, tracking, tokenfunc)
        else:
            return self.parseopt_notrack(input, lexer, debug, tracking, tokenfunc)


    # !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
    # parsedebug().
    #
    # This is the debugging enabled version of parse().  All changes made to the
    # parsing engine should be made here.   Optimized versions of this function
    # are automatically created by the ply/ygen.py script.  This script cuts out
    # sections enclosed in markers such as this:
    #
    #      #--! DEBUG
    #      statements
    #      #--! DEBUG
    #
    # !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!

    def parsedebug(self, input=None, lexer=None, debug=False, tracking=False, tokenfunc=None):
        #--! parsedebug-start
        lookahead = None                         # Current lookahead symbol
        lookaheadstack = []                      # Stack of lookahead symbols
        actions = self.action                    # Local reference to action table (to avoid lookup on self.)
        goto    = self.goto                      # Local reference to goto table (to avoid lookup on self.)
        prod    = self.productions               # Local reference to production list (to avoid lookup on self.)
        defaulted_states = self.defaulted_states # Local reference to defaulted states
        pslice  = YaccProduction(None)           # Production object passed to grammar rules
        errorcount = 0                           # Used during error recovery

        #--! DEBUG
        debug.info('PLY: PARSE DEBUG START')
        #--! DEBUG

        # If no lexer was given, we will try to use the lex module
        if not lexer:
            from . import lex
            lexer = lex.lexer

        # Set up the lexer and parser objects on pslice
        pslice.lexer = lexer
        pslice.parser = self

        # If input was supplied, pass to lexer
        if input is not None:
            lexer.input(input)

        if tokenfunc is None:
            # Tokenize function
            get_token = lexer.token
        else:
            get_token = tokenfunc

        # Set the parser's token() method (sometimes used in error recovery)
        self.token = get_token

        # Set up the state and symbol stacks

        statestack = []                # Stack of parsing states
        self.statestack = statestack
        symstack   = []                # Stack of grammar symbols
        self.symstack = symstack

        pslice.stack = symstack         # Put in the production
        errtoken   = None               # Err token

        # The start state is assumed to be (0,$end)

        statestack.append(0)
        sym = YaccSymbol()
        sym.type = '$end'
        symstack.append(sym)
        state = 0
        while True:
            # Get the next symbol on the input.  If a lookahead symbol
            # is already set, we just use that. Otherwise, we'll pull
            # the next token off of the lookaheadstack or from the lexer

            #--! DEBUG
            debug.debug('')
            debug.debug('State  : %s', state)
            #--! DEBUG

            if state not in defaulted_states:
                if not lookahead:
                    if not lookaheadstack:
                        lookahead = get_token()     # Get the next token
                    else:
                        lookahead = lookaheadstack.pop()
                    if not lookahead:
                        lookahead = YaccSymbol()
                        lookahead.type = '$end'

                # Check the action table
                ltype = lookahead.type
                t = actions[state].get(ltype)
            else:
                t = defaulted_states[state]
                #--! DEBUG
                debug.debug('Defaulted state %s: Reduce using %d', state, -t)
                #--! DEBUG

            #--! DEBUG
            debug.debug('Stack  : %s',
                        ('%s . %s' % (' '.join([xx.type for xx in symstack][1:]), str(lookahead))).lstrip())
            #--! DEBUG

            if t is not None:
                if t > 0:
                    # shift a symbol on the stack
                    statestack.append(t)
                    state = t

                    #--! DEBUG
                    debug.debug('Action : Shift and goto state %s', t)
                    #--! DEBUG

                    symstack.append(lookahead)
                    lookahead = None

                    # Decrease error count on successful shift
                    if errorcount:
                        errorcount -= 1
                    continue

                if t < 0:
                    # reduce a symbol on the stack, emit a production
                    p = prod[-t]
                    pname = p.name
                    plen  = p.len

                    # Get production function
                    sym = YaccSymbol()
                    sym.type = pname       # Production name
                    sym.value = None

                    #--! DEBUG
                    if plen:
                        debug.info('Action : Reduce rule [%s] with %s and goto state %d', p.str,
                                   '['+','.join([format_stack_entry(_v.value) for _v in symstack[-plen:]])+']',
                                   goto[statestack[-1-plen]][pname])
                    else:
                        debug.info('Action : Reduce rule [%s] with %s and goto state %d', p.str, [],
                                   goto[statestack[-1]][pname])

                    #--! DEBUG

                    if plen:
                        targ = symstack[-plen-1:]
                        targ[0] = sym

                        #--! TRACKING
                        if tracking:
                            t1 = targ[1]
                            sym.lineno = t1.lineno
                            sym.lexpos = t1.lexpos
                            t1 = targ[-1]
                            sym.endlineno = getattr(t1, 'endlineno', t1.lineno)
                            sym.endlexpos = getattr(t1, 'endlexpos', t1.lexpos)
                        #--! TRACKING

                        # !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
                        # The code enclosed in this section is duplicated
                        # below as a performance optimization.  Make sure
                        # changes get made in both locations.

                        pslice.slice = targ

                        try:
                            # Call the grammar rule with our special slice object
                            del symstack[-plen:]
                            self.state = state
                            p.callable(pslice)
                            del statestack[-plen:]
                            #--! DEBUG
                            debug.info('Result : %s', format_result(pslice[0]))
                            #--! DEBUG
                            symstack.append(sym)
                            state = goto[statestack[-1]][pname]
                            statestack.append(state)
                        except SyntaxError:
                            # If an error was set, enter error recovery state
                            lookaheadstack.append(lookahead)    # Save the current lookahead token
                            symstack.extend(targ[1:-1])         # Put the production slice back on the stack
                            statestack.pop()                    # Pop back one state (before the reduce)
                            state = statestack[-1]
                            sym.type = 'error'
                            sym.value = 'error'
                            lookahead = sym
                            errorcount = error_count
                            self.errorok = False

                        continue
                        # !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!

                    else:

                        #--! TRACKING
                        if tracking:
                            sym.lineno = lexer.lineno
                            sym.lexpos = lexer.lexpos
                        #--! TRACKING

                        targ = [sym]

                        # !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
                        # The code enclosed in this section is duplicated
                        # above as a performance optimization.  Make sure
                        # changes get made in both locations.

                        pslice.slice = targ

                        try:
                            # Call the grammar rule with our special slice object
                            self.state = state
                            p.callable(pslice)
                            #--! DEBUG
                            debug.info('Result : %s', format_result(pslice[0]))
                            #--! DEBUG
                            symstack.append(sym)
                            state = goto[statestack[-1]][pname]
                            statestack.append(state)
                        except SyntaxError:
                            # If an error was set, enter error recovery state
                            lookaheadstack.append(lookahead)    # Save the current lookahead token
                            statestack.pop()                    # Pop back one state (before the reduce)
                            state = statestack[-1]
                            sym.type = 'error'
                            sym.value = 'error'
                            lookahead = sym
                            errorcount = error_count
                            self.errorok = False

                        continue
                        # !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!

                if t == 0:
                    n = symstack[-1]
                    result = getattr(n, 'value', None)
                    #--! DEBUG
                    debug.info('Done   : Returning %s', format_result(result))
                    debug.info('PLY: PARSE DEBUG END')
                    #--! DEBUG
                    return result

            if t is None:

                #--! DEBUG
                debug.error('Error  : %s',
                            ('%s . %s' % (' '.join([xx.type for xx in symstack][1:]), str(lookahead))).lstrip())
                #--! DEBUG

                # We have some kind of parsing error here.  To handle
                # this, we are going to push the current token onto
                # the tokenstack and replace it with an 'error' token.
                # If there are any synchronization rules, they may
                # catch it.
                #
                # In addition to pushing the error token, we call
                # the user defined p_error() function if this is the
                # first syntax error.  This function is only called if
                # errorcount == 0.
                if errorcount == 0 or self.errorok:
                    errorcount = error_count
                    self.errorok = False
                    errtoken = lookahead
                    if errtoken.type == '$end':
                        errtoken = None               # End of file!
                    if self.errorfunc:
                        if errtoken and not hasattr(errtoken, 'lexer'):
                            errtoken.lexer = lexer
                        self.state = state
                        tok = call_errorfunc(self.errorfunc, errtoken, self)
                        if self.errorok:
                            # User must have done some kind of panic
                            # mode recovery on their own.  The
                            # returned token is the next lookahead
                            lookahead = tok
                            errtoken = None
                            continue
                    else:
                        if errtoken:
                            if hasattr(errtoken, 'lineno'):
                                lineno = lookahead.lineno
                            else:
                                lineno = 0
                            if lineno:
                                sys.stderr.write('yacc: Syntax error at line %d, token=%s\n' % (lineno, errtoken.type))
                            else:
                                sys.stderr.write('yacc: Syntax error, token=%s\n' % errtoken.type)
                        else:
                            sys.stderr.write('yacc: Parse error in input. EOF\n')
                            return

                else:
                    errorcount = error_count

                # case 1:  the statestack only has 1 entry on it.  If we're in this state, the
                # entire parse has been rolled back and we're completely hosed.   The token is
                # discarded and we just keep going.

                if len(statestack) <= 1 and lookahead.type != '$end':
                    lookahead = None
                    errtoken = None
                    state = 0
                    # Nuke the pushback stack
                    del lookaheadstack[:]
                    continue

                # case 2: the statestack has a couple of entries on it, but we're
                # at the end of the file. nuke the top entry and generate an error token

                # Start nuking entries on the stack
                if lookahead.type == '$end':
                    # Whoa. We're really hosed here. Bail out
                    return

                if lookahead.type != 'error':
                    sym = symstack[-1]
                    if sym.type == 'error':
                        # Hmmm. Error is on top of stack, we'll just nuke input
                        # symbol and continue
                        #--! TRACKING
                        if tracking:
                            sym.endlineno = getattr(lookahead, 'lineno', sym.lineno)
                            sym.endlexpos = getattr(lookahead, 'lexpos', sym.lexpos)
                        #--! TRACKING
                        lookahead = None
                        continue

                    # Create the error symbol for the first time and make it the new lookahead symbol
                    t = YaccSymbol()
                    t.type = 'error'

                    if hasattr(lookahead, 'lineno'):
                        t.lineno = t.endlineno = lookahead.lineno
                    if hasattr(lookahead, 'lexpos'):
                        t.lexpos = t.endlexpos = lookahead.lexpos
                    t.value = lookahead
                    lookaheadstack.append(lookahead)
                    lookahead = t
                else:
                    sym = symstack.pop()
                    #--! TRACKING
                    if tracking:
                        lookahead.lineno = sym.lineno
                        lookahead.lexpos = sym.lexpos
                    #--! TRACKING
                    statestack.pop()
                    state = statestack[-1]

                continue

            # Call an error function here
            raise RuntimeError('yacc: internal parser error!!!\n')

        #--! parsedebug-end

    # !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
    # parseopt().
    #
    # Optimized version of parse() method.  DO NOT EDIT THIS CODE DIRECTLY!
    # This code is automatically generated by the ply/ygen.py script. Make
    # changes to the parsedebug() method instead.
    # !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!

    def parseopt(self, input=None, lexer=None, debug=False, tracking=False, tokenfunc=None):
        #--! parseopt-start
        lookahead = None                         # Current lookahead symbol
        lookaheadstack = []                      # Stack of lookahead symbols
        actions = self.action                    # Local reference to action table (to avoid lookup on self.)
        goto    = self.goto                      # Local reference to goto table (to avoid lookup on self.)
        prod    = self.productions               # Local reference to production list (to avoid lookup on self.)
        defaulted_states = self.defaulted_states # Local reference to defaulted states
        pslice  = YaccProduction(None)           # Production object passed to grammar rules
        errorcount = 0                           # Used during error recovery


        # If no lexer was given, we will try to use the lex module
        if not lexer:
            from . import lex
            lexer = lex.lexer

        # Set up the lexer and parser objects on pslice
        pslice.lexer = lexer
        pslice.parser = self

        # If input was supplied, pass to lexer
        if input is not None:
            lexer.input(input)

        if tokenfunc is None:
            # Tokenize function
            get_token = lexer.token
        else:
            get_token = tokenfunc

        # Set the parser's token() method (sometimes used in error recovery)
        self.token = get_token

        # Set up the state and symbol stacks

        statestack = []                # Stack of parsing states
        self.statestack = statestack
        symstack   = []                # Stack of grammar symbols
        self.symstack = symstack

        pslice.stack = symstack         # Put in the production
        errtoken   = None               # Err token

        # The start state is assumed to be (0,$end)

        statestack.append(0)
        sym = YaccSymbol()
        sym.type = '$end'
        symstack.append(sym)
        state = 0
        while True:
            # Get the next symbol on the input.  If a lookahead symbol
            # is already set, we just use that. Otherwise, we'll pull
            # the next token off of the lookaheadstack or from the lexer


            if state not in defaulted_states:
                if not lookahead:
                    if not lookaheadstack:
                        lookahead = get_token()     # Get the next token
                    else:
                        lookahead = lookaheadstack.pop()
                    if not lookahead:
                        lookahead = YaccSymbol()
                        lookahead.type = '$end'

                # Check the action table
                ltype = lookahead.type
                t = actions[state].get(ltype)
            else:
                t = defaulted_states[state]


            if t is not None:
                if t > 0:
                    # shift a symbol on the stack
                    statestack.append(t)
                    state = t


                    symstack.append(lookahead)
                    lookahead = None

                    # Decrease error count on successful shift
                    if errorcount:
                        errorcount -= 1
                    continue

                if t < 0:
                    # reduce a symbol on the stack, emit a production
                    p = prod[-t]
                    pname = p.name
                    plen  = p.len

                    # Get production function
                    sym = YaccSymbol()
                    sym.type = pname       # Production name
                    sym.value = None


                    if plen:
                        targ = symstack[-plen-1:]
                        targ[0] = sym

                        #--! TRACKING
                        if tracking:
                            t1 = targ[1]
                            sym.lineno = t1.lineno
                            sym.lexpos = t1.lexpos
                            t1 = targ[-1]
                            sym.endlineno = getattr(t1, 'endlineno', t1.lineno)
                            sym.endlexpos = getattr(t1, 'endlexpos', t1.lexpos)
                        #--! TRACKING

                        # !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
                        # The code enclosed in this section is duplicated
                        # below as a performance optimization.  Make sure
                        # changes get made in both locations.

                        pslice.slice = targ

                        try:
                            # Call the grammar rule with our special slice object
                            del symstack[-plen:]
                            self.state = state
                            p.callable(pslice)
                            del statestack[-plen:]
                            symstack.append(sym)
                            state = goto[statestack[-1]][pname]
                            statestack.append(state)
                        except SyntaxError:
                            # If an error was set, enter error recovery state
                            lookaheadstack.append(lookahead)    # Save the current lookahead token
                            symstack.extend(targ[1:-1])         # Put the production slice back on the stack
                            statestack.pop()                    # Pop back one state (before the reduce)
                            state = statestack[-1]
                            sym.type = 'error'
                            sym.value = 'error'
                            lookahead = sym
                            errorcount = error_count
                            self.errorok = False

                        continue
                        # !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!

                    else:

                        #--! TRACKING
                        if tracking:
                            sym.lineno = lexer.lineno
                            sym.lexpos = lexer.lexpos
                        #--! TRACKING

                        targ = [sym]

                        # !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
                        # The code enclosed in this section is duplicated
                        # above as a performance optimization.  Make sure
                        # changes get made in both locations.

                        pslice.slice = targ

                        try:
                            # Call the grammar rule with our special slice object
                            self.state = state
                            p.callable(pslice)
                            symstack.append(sym)
                            state = goto[statestack[-1]][pname]
                            statestack.append(state)
                        except SyntaxError:
                            # If an error was set, enter error recovery state
                            lookaheadstack.append(lookahead)    # Save the current lookahead token
                            statestack.pop()                    # Pop back one state (before the reduce)
                            state = statestack[-1]
                            sym.type = 'error'
                            sym.value = 'error'
                            lookahead = sym
                            errorcount = error_count
                            self.errorok = False

                        continue
                        # !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!

                if t == 0:
                    n = symstack[-1]
                    result = getattr(n, 'value', None)
                    return result

            if t is None:


                # We have some kind of parsing error here.  To handle
                # this, we are going to push the current token onto
                # the tokenstack and replace it with an 'error' token.
                # If there are any synchronization rules, they may
                # catch it.
                #
                # In addition to pushing the error token, we call
                # the user defined p_error() function if this is the
                # first syntax error.  This function is only called if
                # errorcount == 0.
                if errorcount == 0 or self.errorok:
                    errorcount = error_count
                    self.errorok = False
                    errtoken = lookahead
                    if errtoken.type == '$end':
                        errtoken = None               # End of file!
                    if self.errorfunc:
                        if errtoken and not hasattr(errtoken, 'lexer'):
                            errtoken.lexer = lexer
                        self.state = state
                        tok = call_errorfunc(self.errorfunc, errtoken, self)
                        if self.errorok:
                            # User must have done some kind of panic
                            # mode recovery on their own.  The
                            # returned token is the next lookahead
                            lookahead = tok
                            errtoken = None
                            continue
                    else:
                        if errtoken:
                            if hasattr(errtoken, 'lineno'):
                                lineno = lookahead.lineno
                            else:
                                lineno = 0
                            if lineno:
                                sys.stderr.write('yacc: Syntax error at line %d, token=%s\n' % (lineno, errtoken.type))
                            else:
                                sys.stderr.write('yacc: Syntax error, token=%s\n' % errtoken.type)
                        else:
                            sys.stderr.write('yacc: Parse error in input. EOF\n')
                            return

                else:
                    errorcount = error_count

                # case 1:  the statestack only has 1 entry on it.  If we're in this state, the
                # entire parse has been rolled back and we're completely hosed.   The token is
                # discarded and we just keep going.

                if len(statestack) <= 1 and lookahead.type != '$end':
                    lookahead = None
                    errtoken = None
                    state = 0
                    # Nuke the pushback stack
                    del lookaheadstack[:]
                    continue

                # case 2: the statestack has a couple of entries on it, but we're
                # at the end of the file. nuke the top entry and generate an error token

                # Start nuking entries on the stack
                if lookahead.type == '$end':
                    # Whoa. We're really hosed here. Bail out
                    return

                if lookahead.type != 'error':
                    sym = symstack[-1]
                    if sym.type == 'error':
                        # Hmmm. Error is on top of stack, we'll just nuke input
                        # symbol and continue
                        #--! TRACKING
                        if tracking:
                            sym.endlineno = getattr(lookahead, 'lineno', sym.lineno)
                            sym.endlexpos = getattr(lookahead, 'lexpos', sym.lexpos)
                        #--! TRACKING
                        lookahead = None
                        continue

                    # Create the error symbol for the first time and make it the new lookahead symbol
                    t = YaccSymbol()
                    t.type = 'error'

                    if hasattr(lookahead, 'lineno'):
                        t.lineno = t.endlineno = lookahead.lineno
                    if hasattr(lookahead, 'lexpos'):
                        t.lexpos = t.endlexpos = lookahead.lexpos
                    t.value = lookahead
                    lookaheadstack.append(lookahead)
                    lookahead = t
                else:
                    sym = symstack.pop()
                    #--! TRACKING
                    if tracking:
                        lookahead.lineno = sym.lineno
                        lookahead.lexpos = sym.lexpos
                    #--! TRACKING
                    statestack.pop()
                    state = statestack[-1]

                continue

            # Call an error function here
            raise RuntimeError('yacc: internal parser error!!!\n')

        #--! parseopt-end

    # !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
    # parseopt_notrack().
    #
    # Optimized version of parseopt() with line number tracking removed.
    # DO NOT EDIT THIS CODE DIRECTLY. This code is automatically generated
    # by the ply/ygen.py script. Make changes to the parsedebug() method instead.
    # !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!

    def parseopt_notrack(self, input=None, lexer=None, debug=False, tracking=False, tokenfunc=None):
        #--! parseopt-notrack-start
        lookahead = None                         # Current lookahead symbol
        lookaheadstack = []                      # Stack of lookahead symbols
        actions = self.action                    # Local reference to action table (to avoid lookup on self.)
        goto    = self.goto                      # Local reference to goto table (to avoid lookup on self.)
        prod    = self.productions               # Local reference to production list (to avoid lookup on self.)
        defaulted_states = self.defaulted_states # Local reference to defaulted states
        pslice  = YaccProduction(None)           # Production object passed to grammar rules
        errorcount = 0                           # Used during error recovery


        # If no lexer was given, we will try to use the lex module
        if not lexer:
            from . import lex
            lexer = lex.lexer

        # Set up the lexer and parser objects on pslice
        pslice.lexer = lexer
        pslice.parser = self

        # If input was supplied, pass to lexer
        if input is not None:
            lexer.input(input)

        if tokenfunc is None:
            # Tokenize function
            get_token = lexer.token
        else:
            get_token = tokenfunc

        # Set the parser's token() method (sometimes used in error recovery)
        self.token = get_token

        # Set up the state and symbol stacks

        statestack = []                # Stack of parsing states
        self.statestack = statestack
        symstack   = []                # Stack of grammar symbols
        self.symstack = symstack

        pslice.stack = symstack         # Put in the production
        errtoken   = None               # Err token

        # The start state is assumed to be (0,$end)

        statestack.append(0)
        sym = YaccSymbol()
        sym.type = '$end'
        symstack.append(sym)
        state = 0
        while True:
            # Get the next symbol on the input.  If a lookahead symbol
            # is already set, we just use that. Otherwise, we'll pull
            # the next token off of the lookaheadstack or from the lexer


            if state not in defaulted_states:
                if not lookahead:
                    if not lookaheadstack:
                        lookahead = get_token()     # Get the next token
                    else:
                        lookahead = lookaheadstack.pop()
                    if not lookahead:
                        lookahead = YaccSymbol()
                        lookahead.type = '$end'

                # Check the action table
                ltype = lookahead.type
                t = actions[state].get(ltype)
            else:
                t = defaulted_states[state]


            if t is not None:
                if t > 0:
                    # shift a symbol on the stack
                    statestack.append(t)
                    state = t


                    symstack.append(lookahead)
                    lookahead = None

                    # Decrease error count on successful shift
                    if errorcount:
                        errorcount -= 1
                    continue

                if t < 0:
                    # reduce a symbol on the stack, emit a production
                    p = prod[-t]
                    pname = p.name
                    plen  = p.len

                    # Get production function
                    sym = YaccSymbol()
                    sym.type = pname       # Production name
                    sym.value = None


                    if plen:
                        targ = symstack[-plen-1:]
                        targ[0] = sym


                        # !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
                        # The code enclosed in this section is duplicated
                        # below as a performance optimization.  Make sure
                        # changes get made in both locations.

                        pslice.slice = targ

                        try:
                            # Call the grammar rule with our special slice object
                            del symstack[-plen:]
                            self.state = state
                            p.callable(pslice)
                            del statestack[-plen:]
                            symstack.append(sym)
                            state = goto[statestack[-1]][pname]
                            statestack.append(state)
                        except SyntaxError:
                            # If an error was set, enter error recovery state
                            lookaheadstack.append(lookahead)    # Save the current lookahead token
                            symstack.extend(targ[1:-1])         # Put the production slice back on the stack
                            statestack.pop()                    # Pop back one state (before the reduce)
                            state = statestack[-1]
                            sym.type = 'error'
                            sym.value = 'error'
                            lookahead = sym
                            errorcount = error_count
                            self.errorok = False

                        continue
                        # !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!

                    else:


                        targ = [sym]

                        # !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
                        # The code enclosed in this section is duplicated
                        # above as a performance optimization.  Make sure
                        # changes get made in both locations.

                        pslice.slice = targ

                        try:
                            # Call the grammar rule with our special slice object
                            self.state = state
                            p.callable(pslice)
                            symstack.append(sym)
                            state = goto[statestack[-1]][pname]
                            statestack.append(state)
                        except SyntaxError:
                            # If an error was set, enter error recovery state
                            lookaheadstack.append(lookahead)    # Save the current lookahead token
                            statestack.pop()                    # Pop back one state (before the reduce)
                            state = statestack[-1]
                            sym.type = 'error'
                            sym.value = 'error'
                            lookahead = sym
                            errorcount = error_count
                            self.errorok = False

                        continue
                        # !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!

                if t == 0:
                    n = symstack[-1]
                    result = getattr(n, 'value', None)
                    return result

            if t is None:


                # We have some kind of parsing error here.  To handle
                # this, we are going to push the current token onto
                # the tokenstack and replace it with an 'error' token.
                # If there are any synchronization rules, they may
                # catch it.
                #
                # In addition to pushing the error token, we call
                # the user defined p_error() function if this is the
                # first syntax error.  This function is only called if
                # errorcount == 0.
                if errorcount == 0 or self.errorok:
                    errorcount = error_count
                    self.errorok = False
                    errtoken = lookahead
                    if errtoken.type == '$end':
                        errtoken = None               # End of file!
                    if self.errorfunc:
                        if errtoken and not hasattr(errtoken, 'lexer'):
                            errtoken.lexer = lexer
                        self.state = state
                        tok = call_errorfunc(self.errorfunc, errtoken, self)
                        if self.errorok:
                            # User must have done some kind of panic
                            # mode recovery on their own.  The
                            # returned token is the next lookahead
                            lookahead = tok
                            errtoken = None
                            continue
                    else:
                        if errtoken:
                            if hasattr(errtoken, 'lineno'):
                                lineno = lookahead.lineno
                            else:
                                lineno = 0
                            if lineno:
                                sys.stderr.write('yacc: Syntax error at line %d, token=%s\n' % (lineno, errtoken.type))
                            else:
                                sys.stderr.write('yacc: Syntax error, token=%s\n' % errtoken.type)
                        else:
                            sys.stderr.write('yacc: Parse error in input. EOF\n')
                            return

                else:
                    errorcount = error_count

                # case 1:  the statestack only has 1 entry on it.  If we're in this state, the
                # entire parse has been rolled back and we're completely hosed.   The token is
                # discarded and we just keep going.

                if len(statestack) <= 1 and lookahead.type != '$end':
                    lookahead = None
                    errtoken = None
                    state = 0
                    # Nuke the pushback stack
                    del lookaheadstack[:]
                    continue

                # case 2: the statestack has a couple of entries on it, but we're
                # at the end of the file. nuke the top entry and generate an error token

                # Start nuking entries on the stack
                if lookahead.type == '$end':
                    # Whoa. We're really hosed here. Bail out
                    return

                if lookahead.type != 'error':
                    sym = symstack[-1]
                    if sym.type == 'error':
                        # Hmmm. Error is on top of stack, we'll just nuke input
                        # symbol and continue
                        lookahead = None
                        continue

                    # Create the error symbol for the first time and make it the new lookahead symbol
                    t = YaccSymbol()
                    t.type = 'error'

                    if hasattr(lookahead, 'lineno'):
                        t.lineno = t.endlineno = lookahead.lineno
                    if hasattr(lookahead, 'lexpos'):
                        t.lexpos = t.endlexpos = lookahead.lexpos
                    t.value = lookahead
                    lookaheadstack.append(lookahead)
                    lookahead = t
                else:
                    sym = symstack.pop()
                    statestack.pop()
                    state = statestack[-1]

                continue

            # Call an error function here
            raise RuntimeError('yacc: internal parser error!!!\n')

        #--! parseopt-notrack-end

# -----------------------------------------------------------------------------
#                          === Grammar Representation ===
#
# The following functions, classes, and variables are used to represent and
# manipulate the rules that make up a grammar.
# -----------------------------------------------------------------------------

# regex matching identifiers
_is_identifier = re.compile(r'^[a-zA-Z0-9_-]+$')

# -----------------------------------------------------------------------------
# class Production:
#
# This class stores the raw information about a single production or grammar rule.
# A grammar rule refers to a specification such as this:
#
#       expr : expr PLUS term
#
# Here are the basic attributes defined on all productions
#
#       name     - Name of the production.  For example 'expr'
#       prod     - A list of symbols on the right side ['expr','PLUS','term']
#       prec     - Production precedence level
#       number   - Production number.
#       func     - Function that executes on reduce
#       file     - File where production function is defined
#       lineno   - Line number where production function is defined
#
#       The following attributes are also defined:
#
#       len       - Length of the production (number of symbols on right hand side)
#       usyms     - Set of unique symbols found in the production
# -----------------------------------------------------------------------------
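
# A hedged example of what these attributes look like for a Production built
# from the hypothetical rule 'expr : expr PLUS term':
#
#     p = Production(1, 'expr', ['expr', 'PLUS', 'term'])
#     p.name     # 'expr'
#     p.prod     # ('expr', 'PLUS', 'term')
#     p.len      # 3
#     p.usyms    # ['expr', 'PLUS', 'term']
#     str(p)     # 'expr -> expr PLUS term'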

class Production(object):
    reduced = 0
    def __init__(self, number, name, prod, precedence=('right', 0), func=None, file='', line=0):
        self.name     = name
        self.prod     = tuple(prod)
        self.number   = number
        self.func     = func
        self.callable = None
        self.file     = file
        self.line     = line
        self.prec     = precedence

        # Internal settings used during table construction

        self.len  = len(self.prod)   # Length of the production

        # Create a list of unique production symbols used in the production
        self.usyms = []
        for s in self.prod:
            if s not in self.usyms:
                self.usyms.append(s)

        # List of all LR items for the production
        self.lr_items = []
        self.lr_next = None

        # Create a string representation
        if self.prod:
            self.str = '%s -> %s' % (self.name, ' '.join(self.prod))
        else:
            self.str = '%s -> <empty>' % self.name

    def __str__(self):
        return self.str

    def __repr__(self):
        return 'Production(' + str(self) + ')'

    def __len__(self):
        return len(self.prod)

    def __nonzero__(self):
        return 1

    def __getitem__(self, index):
        return self.prod[index]

    # Return the nth lr_item from the production (or None if at the end)
    def lr_item(self, n):
        if n > len(self.prod):
            return None
        p = LRItem(self, n)
        # Precompute the list of productions immediately following.
        try:
            p.lr_after = Prodnames[p.prod[n+1]]
        except (IndexError, KeyError):
            p.lr_after = []
        try:
            p.lr_before = p.prod[n-1]
        except IndexError:
            p.lr_before = None
        return p

    # Bind the production function name to a callable
    def bind(self, pdict):
        if self.func:
            self.callable = pdict[self.func]

# This class serves as a minimal stand-in for Production objects when
# reading table data from files.   It only contains information
# actually used by the LR parsing engine, plus some additional
# debugging information.
class MiniProduction(object):
    def __init__(self, str, name, len, func, file, line):
        self.name     = name
        self.len      = len
        self.func     = func
        self.callable = None
        self.file     = file
        self.line     = line
        self.str      = str

    def __str__(self):
        return self.str

    def __repr__(self):
        return 'MiniProduction(%s)' % self.str

    # Bind the production function name to a callable
    def bind(self, pdict):
        if self.func:
            self.callable = pdict[self.func]


# -----------------------------------------------------------------------------
# class LRItem
#
# This class represents a specific stage of parsing a production rule.  For
# example:
#
#       expr : expr . PLUS term
#
# In the above, the "." represents the current location of the parse.  Here
# are its basic attributes:
#
#       name       - Name of the production.  For example 'expr'
#       prod       - A list of symbols on the right side ['expr','.', 'PLUS','term']
#       number     - Production number.
#
#       lr_next    - Next LR item. For example, if we are at 'expr -> expr . PLUS term'
#                    then lr_next refers to 'expr -> expr PLUS . term'
#       lr_index   - LR item index (location of the ".") in the prod list.
#       lookaheads - LALR lookahead symbols for this item
#       len        - Length of the production (number of symbols on right hand side)
#       lr_after    - List of all productions that immediately follow
#       lr_before   - Grammar symbol immediately before
# -----------------------------------------------------------------------------
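
# A hedged illustration reusing the hypothetical production above: placing the
# dot at index 1 yields the item shown in this comment.
#
#     p    = Production(1, 'expr', ['expr', 'PLUS', 'term'])
#     item = LRItem(p, 1)
#     str(item)       # 'expr -> expr . PLUS term'
#     item.lr_index   # 1
#     item.len        # 4  (the '.' counts as a symbol in item.prod)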

class LRItem(object):
    def __init__(self, p, n):
        self.name       = p.name
        self.prod       = list(p.prod)
        self.number     = p.number
        self.lr_index   = n
        self.lookaheads = {}
        self.prod.insert(n, '.')
        self.prod       = tuple(self.prod)
        self.len        = len(self.prod)
        self.usyms      = p.usyms

    def __str__(self):
        if self.prod:
            s = '%s -> %s' % (self.name, ' '.join(self.prod))
        else:
            s = '%s -> <empty>' % self.name
        return s

    def __repr__(self):
        return 'LRItem(' + str(self) + ')'

# -----------------------------------------------------------------------------
# rightmost_terminal()
#
# Return the rightmost terminal from a list of symbols.  Used in add_production()
# -----------------------------------------------------------------------------
def rightmost_terminal(symbols, terminals):
    i = len(symbols) - 1
    while i >= 0:
        if symbols[i] in terminals:
            return symbols[i]
        i -= 1
    return None
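
# A hedged usage example, assuming 'PLUS' is the only terminal among the
# symbols passed in (terminals is a dict keyed by terminal name, as in the
# Grammar class below):
#
#     rightmost_terminal(['expr', 'PLUS', 'term'], {'PLUS': []})   # -> 'PLUS'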

# -----------------------------------------------------------------------------
#                           === GRAMMAR CLASS ===
#
# The following class represents the contents of the specified grammar along
# with various computed properties such as first sets, follow sets, LR items, etc.
# This data is used for critical parts of the table generation process later.
# -----------------------------------------------------------------------------

class GrammarError(YaccError):
    pass

class Grammar(object):
    def __init__(self, terminals):
        self.Productions  = [None]  # A list of all of the productions.  The first
                                    # entry is always reserved for the purpose of
                                    # building an augmented grammar

        self.Prodnames    = {}      # A dictionary mapping the names of nonterminals to a list of all
                                    # productions of that nonterminal.

        self.Prodmap      = {}      # A dictionary that is only used to detect duplicate
                                    # productions.

        self.Terminals    = {}      # A dictionary mapping the names of terminal symbols to a
                                    # list of the rules where they are used.

        for term in terminals:
            self.Terminals[term] = []

        self.Terminals['error'] = []

        self.Nonterminals = {}      # A dictionary mapping names of nonterminals to a list
                                    # of rule numbers where they are used.

        self.First        = {}      # A dictionary of precomputed FIRST(x) symbols

        self.Follow       = {}      # A dictionary of precomputed FOLLOW(x) symbols

        self.Precedence   = {}      # Precedence rules for each terminal. Contains tuples of the
                                    # form ('right',level) or ('nonassoc', level) or ('left',level)

        self.UsedPrecedence = set() # Precedence rules that were actually used by the grammar.
                                    # This is only used to provide error checking and to generate
                                    # a warning about unused precedence rules.

        self.Start = None           # Starting symbol for the grammar


    def __len__(self):
        return len(self.Productions)

    def __getitem__(self, index):
        return self.Productions[index]

    # -----------------------------------------------------------------------------
    # set_precedence()
    #
    # Sets the precedence for a given terminal. assoc is the associativity such as
    # 'left','right', or 'nonassoc'.  level is a numeric level.
    #
    # -----------------------------------------------------------------------------

    def set_precedence(self, term, assoc, level):
        assert self.Productions == [None], 'Must call set_precedence() before add_production()'
        if term in self.Precedence:
            raise GrammarError('Precedence already specified for terminal %r' % term)
        if assoc not in ['left', 'right', 'nonassoc']:
            raise GrammarError("Associativity must be one of 'left','right', or 'nonassoc'")
        self.Precedence[term] = (assoc, level)
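
    # A hedged usage sketch with hypothetical token names; higher levels bind
    # more tightly, mirroring a typical precedence table:
    #
    #     g = Grammar(['PLUS', 'MINUS', 'TIMES', 'DIVIDE', 'NUMBER'])
    #     g.set_precedence('PLUS',   'left', 1)
    #     g.set_precedence('MINUS',  'left', 1)
    #     g.set_precedence('TIMES',  'left', 2)
    #     g.set_precedence('DIVIDE', 'left', 2)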

    # -----------------------------------------------------------------------------
    # add_production()
    #
    # Given an action function, this function assembles a production rule and
    # computes its precedence level.
    #
    # The production rule is supplied as a list of symbols.   For example,
    # a rule such as 'expr : expr PLUS term' has a production name of 'expr' and
    # symbols ['expr','PLUS','term'].
    #
    # Precedence is determined by the precedence of the rightmost terminal symbol
    # or the precedence of a terminal specified by %prec.
    #
    # A variety of error checks are performed to make sure production symbols
    # are valid and that %prec is used correctly.
    # -----------------------------------------------------------------------------

    def add_production(self, prodname, syms, func=None, file='', line=0):

        if prodname in self.Terminals:
            raise GrammarError('%s:%d: Illegal rule name %r. Already defined as a token' % (file, line, prodname))
        if prodname == 'error':
            raise GrammarError('%s:%d: Illegal rule name %r. error is a reserved word' % (file, line, prodname))
        if not _is_identifier.match(prodname):
            raise GrammarError('%s:%d: Illegal rule name %r' % (file, line, prodname))

        # Look for literal tokens
        for n, s in enumerate(syms):
            if s[0] in "'\"":
                try:
                    c = eval(s)
                    if (len(c) > 1):
                        raise GrammarError('%s:%d: Literal token %s in rule %r may only be a single character' %
                                           (file, line, s, prodname))
                    if c not in self.Terminals:
                        self.Terminals[c] = []
                    syms[n] = c
                    continue
                except SyntaxError:
                    pass
            if not _is_identifier.match(s) and s != '%prec':
                raise GrammarError('%s:%d: Illegal name %r in rule %r' % (file, line, s, prodname))

        # Determine the precedence level
        if '%prec' in syms:
            if syms[-1] == '%prec':
                raise GrammarError('%s:%d: Syntax error. Nothing follows %%prec' % (file, line))
            if syms[-2] != '%prec':
                raise GrammarError('%s:%d: Syntax error. %%prec can only appear at the end of a grammar rule' %
                                   (file, line))
            precname = syms[-1]
            prodprec = self.Precedence.get(precname)
            if not prodprec:
                raise GrammarError('%s:%d: Nothing known about the precedence of %r' % (file, line, precname))
            else:
                self.UsedPrecedence.add(precname)
            del syms[-2:]     # Drop %prec from the rule
        else:
            # If no %prec, precedence is determined by the rightmost terminal symbol
            precname = rightmost_terminal(syms, self.Terminals)
            prodprec = self.Precedence.get(precname, ('right', 0))

        # See if the rule is already in the rulemap
        map = '%s -> %s' % (prodname, syms)
        if map in self.Prodmap:
            m = self.Prodmap[map]
            raise GrammarError('%s:%d: Duplicate rule %s. ' % (file, line, m) +
                               'Previous definition at %s:%d' % (m.file, m.line))

        # From this point on, everything is valid.  Create a new Production instance
        pnumber  = len(self.Productions)
        if prodname not in self.Nonterminals:
            self.Nonterminals[prodname] = []

        # Add the production number to Terminals and Nonterminals
        for t in syms:
            if t in self.Terminals:
                self.Terminals[t].append(pnumber)
            else:
                if t not in self.Nonterminals:
                    self.Nonterminals[t] = []
                self.Nonterminals[t].append(pnumber)

        # Create a production and add it to the list of productions
        p = Production(pnumber, prodname, syms, prodprec, func, file, line)
        self.Productions.append(p)
        self.Prodmap[map] = p

        # Add to the global productions list
        try:
            self.Prodnames[prodname].append(p)
        except KeyError:
            self.Prodnames[prodname] = [p]
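
    # For illustration (hypothetical PLY rule, not part of this module): in a
    # grammar action the %prec marker appears at the end of the right-hand side,
    # e.g.
    #
    #     expression : MINUS expression %prec UMINUS
    #
    # add_production() strips the trailing ['%prec', 'UMINUS'] from syms and looks
    # up 'UMINUS' in self.Precedence to obtain the production's precedence level.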

    # -----------------------------------------------------------------------------
    # set_start()
    #
    # Sets the starting symbol and creates the augmented grammar.  Production
    # rule 0 is S' -> start where start is the start symbol.
    # -----------------------------------------------------------------------------

    def set_start(self, start=None):
        if not start:
            start = self.Productions[1].name
        if start not in self.Nonterminals:
            raise GrammarError('start symbol %s undefined' % start)
        self.Productions[0] = Production(0, "S'", [start])
        self.Nonterminals[start].append(0)
        self.Start = start

    # -----------------------------------------------------------------------------
    # find_unreachable()
    #
    # Find all of the nonterminal symbols that can't be reached from the starting
    # symbol.  Returns a list of nonterminals that can't be reached.
    # -----------------------------------------------------------------------------

    def find_unreachable(self):

        # Mark all symbols that are reachable from a symbol s
        def mark_reachable_from(s):
            if s in reachable:
                return
            reachable.add(s)
            for p in self.Prodnames.get(s, []):
                for r in p.prod:
                    mark_reachable_from(r)

        reachable = set()
        mark_reachable_from(self.Productions[0].prod[0])
        return [s for s in self.Nonterminals if s not in reachable]

    # -----------------------------------------------------------------------------
    # infinite_cycles()
    #
    # This function looks at the various parsing rules and tries to detect
    # infinite recursion cycles (grammar rules where there is no possible way
    # to derive a string of only terminals).
    # -----------------------------------------------------------------------------

    def infinite_cycles(self):
        terminates = {}

        # Terminals:
        for t in self.Terminals:
            terminates[t] = True

        terminates['$end'] = True

        # Nonterminals:

        # Initialize to false:
        for n in self.Nonterminals:
            terminates[n] = False

        # Then propagate termination until no change:
        while True:
            some_change = False
            for (n, pl) in self.Prodnames.items():
                # Nonterminal n terminates iff any of its productions terminates.
                for p in pl:
                    # Production p terminates iff all of its rhs symbols terminate.
                    for s in p.prod:
                        if not terminates[s]:
                            # The symbol s does not terminate,
                            # so production p does not terminate.
                            p_terminates = False
                            break
                    else:
                        # didn't break from the loop,
                        # so every symbol s terminates
                        # so production p terminates.
                        p_terminates = True

                    if p_terminates:
                        # symbol n terminates!
                        if not terminates[n]:
                            terminates[n] = True
                            some_change = True
                        # Don't need to consider any more productions for this n.
                        break

            if not some_change:
                break

        infinite = []
        for (s, term) in terminates.items():
            if not term:
                if s not in self.Prodnames and s not in self.Terminals and s != 'error':
                    # s is used-but-not-defined, and we've already warned of that,
                    # so it would be overkill to say that it's also non-terminating.
                    pass
                else:
                    infinite.append(s)

        return infinite
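
    # A minimal illustration (hypothetical grammar, for explanation only): if the
    # only rule for 'a' is
    #
    #     a : a B
    #
    # then 'a' can never derive a string consisting solely of terminals, so
    # infinite_cycles() reports ['a'].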

    # -----------------------------------------------------------------------------
    # undefined_symbols()
    #
    # Find all symbols that were used in the grammar, but not defined as tokens or
    # grammar rules.  Returns a list of tuples (sym, prod) where sym is the symbol
    # and prod is the production where the symbol was used.
    # -----------------------------------------------------------------------------
    def undefined_symbols(self):
        result = []
        for p in self.Productions:
            if not p:
                continue

            for s in p.prod:
                if s not in self.Prodnames and s not in self.Terminals and s != 'error':
                    result.append((s, p))
        return result

    # -----------------------------------------------------------------------------
    # unused_terminals()
    #
    # Find all terminals that were defined, but not used by the grammar.  Returns
    # a list of the unused terminal names.
    # -----------------------------------------------------------------------------
    def unused_terminals(self):
        unused_tok = []
        for s, v in self.Terminals.items():
            if s != 'error' and not v:
                unused_tok.append(s)

        return unused_tok

    # ------------------------------------------------------------------------------
    # unused_rules()
    #
    # Find all grammar rules that were defined, but not used (possibly because they are unreachable).
    # Returns a list of productions.
    # ------------------------------------------------------------------------------

    def unused_rules(self):
        unused_prod = []
        for s, v in self.Nonterminals.items():
            if not v:
                p = self.Prodnames[s][0]
                unused_prod.append(p)
        return unused_prod

    # -----------------------------------------------------------------------------
    # unused_precedence()
    #
    # Returns a list of tuples (term,precedence) corresponding to precedence
    # rules that were never used by the grammar.  term is the name of the terminal
    # on which precedence was applied and precedence is a string such as 'left' or
    # 'right' corresponding to the type of precedence.
    # -----------------------------------------------------------------------------

    def unused_precedence(self):
        unused = []
        for termname in self.Precedence:
            if not (termname in self.Terminals or termname in self.UsedPrecedence):
                unused.append((termname, self.Precedence[termname][0]))

        return unused

    # -------------------------------------------------------------------------
    # _first()
    #
    # Compute the value of FIRST1(beta) where beta is a tuple of symbols.
    #
    # During execution of compute_first1, the result may be incomplete.
    # Afterward (e.g., when called from compute_follow()), it will be complete.
    # -------------------------------------------------------------------------
    def _first(self, beta):

        # We are computing First(x1,x2,x3,...,xn)
        result = []
        for x in beta:
            x_produces_empty = False

            # Add all the non-<empty> symbols of First[x] to the result.
            for f in self.First[x]:
                if f == '<empty>':
                    x_produces_empty = True
                else:
                    if f not in result:
                        result.append(f)

            if x_produces_empty:
                # We have to consider the next x in beta,
                # i.e. stay in the loop.
                pass
            else:
                # We don't have to consider any further symbols in beta.
                break
        else:
            # There was no 'break' from the loop,
            # so x_produces_empty was true for all x in beta,
            # so beta produces empty as well.
            result.append('<empty>')

        return result
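
    # Worked example (hypothetical FIRST sets, for illustration only): if
    # First['A'] == ['<empty>', 'a'] and First['B'] == ['b'], then
    #
    #     _first(('A', 'B'))  ->  ['a', 'b']        # A can vanish, so B is reached
    #     _first(('A',))      ->  ['a', '<empty>']  # every symbol of beta can vanish
    #
    # '<empty>' is appended only when every symbol of beta can derive empty.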

    # -------------------------------------------------------------------------
    # compute_first()
    #
    # Compute the value of FIRST1(X) for all symbols
    # -------------------------------------------------------------------------
    def compute_first(self):
        if self.First:
            return self.First

        # Terminals:
        for t in self.Terminals:
            self.First[t] = [t]

        self.First['$end'] = ['$end']

        # Nonterminals:

        # Initialize to the empty set:
        for n in self.Nonterminals:
            self.First[n] = []

        # Then propagate symbols until no change:
        while True:
            some_change = False
            for n in self.Nonterminals:
                for p in self.Prodnames[n]:
                    for f in self._first(p.prod):
                        if f not in self.First[n]:
                            self.First[n].append(f)
                            some_change = True
            if not some_change:
                break

        return self.First

    # ---------------------------------------------------------------------
    # compute_follow()
    #
    # Computes all of the follow sets for every non-terminal symbol.  The
    # follow set is the set of all symbols that might follow a given
    # non-terminal.  See the Dragon book, 2nd Ed. p. 189.
    # ---------------------------------------------------------------------
    def compute_follow(self, start=None):
        # If already computed, return the result
        if self.Follow:
            return self.Follow

        # If first sets not computed yet, do that first.
        if not self.First:
            self.compute_first()

        # Add '$end' to the follow list of the start symbol
        for k in self.Nonterminals:
            self.Follow[k] = []

        if not start:
            start = self.Productions[1].name

        self.Follow[start] = ['$end']

        while True:
            didadd = False
            for p in self.Productions[1:]:
                # Here is the production set
                for i, B in enumerate(p.prod):
                    if B in self.Nonterminals:
                        # Okay. We got a non-terminal in a production
                        fst = self._first(p.prod[i+1:])
                        hasempty = False
                        for f in fst:
                            if f != '<empty>' and f not in self.Follow[B]:
                                self.Follow[B].append(f)
                                didadd = True
                            if f == '<empty>':
                                hasempty = True
                        if hasempty or i == (len(p.prod)-1):
                            # Add elements of follow(a) to follow(b)
                            for f in self.Follow[p.name]:
                                if f not in self.Follow[B]:
                                    self.Follow[B].append(f)
                                    didadd = True
            if not didadd:
                break
        return self.Follow
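
    # Rough example (hypothetical grammar, for illustration only): with the rules
    #
    #     s : a X
    #     a : Y
    #
    # and start symbol s, compute_follow() yields Follow['s'] == ['$end'] and
    # Follow['a'] == ['X'], because X can appear immediately after a in "s : a X".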


    # -----------------------------------------------------------------------------
    # build_lritems()
    #
    # This function walks the list of productions and builds a complete set of the
    # LR items.  The LR items are stored in two ways:  First, they are uniquely
    # numbered and placed in the list _lritems.  Second, a linked list of LR items
    # is built for each production.  For example:
    #
    #   E -> E PLUS E
    #
    # Creates the list
    #
    #  [E -> . E PLUS E, E -> E . PLUS E, E -> E PLUS . E, E -> E PLUS E . ]
    # -----------------------------------------------------------------------------

    def build_lritems(self):
        for p in self.Productions:
            lastlri = p
            i = 0
            lr_items = []
            while True:
                if i > len(p):
                    lri = None
                else:
                    lri = LRItem(p, i)
                    # Precompute the list of productions immediately following
                    try:
                        lri.lr_after = self.Prodnames[lri.prod[i+1]]
                    except (IndexError, KeyError):
                        lri.lr_after = []
                    try:
                        lri.lr_before = lri.prod[i-1]
                    except IndexError:
                        lri.lr_before = None

                lastlri.lr_next = lri
                if not lri:
                    break
                lr_items.append(lri)
                lastlri = lri
                i += 1
            p.lr_items = lr_items

# -----------------------------------------------------------------------------
#                            == Class LRTable ==
#
# This class represents a basic table of LR parsing information.
# Methods for generating the tables are not defined here.  They are defined
# in the derived class LRGeneratedTable.
# -----------------------------------------------------------------------------

class VersionError(YaccError):
    pass

class LRTable(object):
    def __init__(self):
        self.lr_action = None
        self.lr_goto = None
        self.lr_productions = None
        self.lr_method = None

    def read_table(self, module):
        if isinstance(module, types.ModuleType):
            parsetab = module
        else:
            exec('import %s' % module)
            parsetab = sys.modules[module]

        if parsetab._tabversion != __tabversion__:
            raise VersionError('yacc table file version is out of date')

        self.lr_action = parsetab._lr_action
        self.lr_goto = parsetab._lr_goto

        self.lr_productions = []
        for p in parsetab._lr_productions:
            self.lr_productions.append(MiniProduction(*p))

        self.lr_method = parsetab._lr_method
        return parsetab._lr_signature

    def read_pickle(self, filename):
        try:
            import cPickle as pickle
        except ImportError:
            import pickle

        if not os.path.exists(filename):
            raise ImportError

        in_f = open(filename, 'rb')

        tabversion = pickle.load(in_f)
        if tabversion != __tabversion__:
            raise VersionError('yacc table file version is out of date')
        self.lr_method = pickle.load(in_f)
        signature      = pickle.load(in_f)
        self.lr_action = pickle.load(in_f)
        self.lr_goto   = pickle.load(in_f)
        productions    = pickle.load(in_f)

        self.lr_productions = []
        for p in productions:
            self.lr_productions.append(MiniProduction(*p))

        in_f.close()
        return signature

    # Bind all production function names to callable objects in pdict
    def bind_callables(self, pdict):
        for p in self.lr_productions:
            p.bind(pdict)


# -----------------------------------------------------------------------------
#                           === LR Generator ===
#
# The following classes and functions are used to generate LR parsing tables on
# a grammar.
# -----------------------------------------------------------------------------

# -----------------------------------------------------------------------------
# digraph()
# traverse()
#
# The following two functions are used to compute set valued functions
# of the form:
#
#     F(x) = F'(x) U U{F(y) | x R y}
#
# This is used to compute the values of Read() sets as well as FOLLOW sets
# in LALR(1) generation.
#
# Inputs:  X    - An input set
#          R    - A relation
#          FP   - Set-valued function
# ------------------------------------------------------------------------------

def digraph(X, R, FP):
    N = {}
    for x in X:
        N[x] = 0
    stack = []
    F = {}
    for x in X:
        if N[x] == 0:
            traverse(x, N, stack, F, X, R, FP)
    return F

def traverse(x, N, stack, F, X, R, FP):
    stack.append(x)
    d = len(stack)
    N[x] = d
    F[x] = FP(x)             # F(X) <- F'(x)

    rel = R(x)               # Get y's related to x
    for y in rel:
        if N[y] == 0:
            traverse(y, N, stack, F, X, R, FP)
        N[x] = min(N[x], N[y])
        for a in F.get(y, []):
            if a not in F[x]:
                F[x].append(a)
    if N[x] == d:
        N[stack[-1]] = MAXINT
        F[stack[-1]] = F[x]
        element = stack.pop()
        while element != x:
            N[stack[-1]] = MAXINT
            F[stack[-1]] = F[x]
            element = stack.pop()
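
# A small illustration of the fixed point that digraph() computes (hypothetical
# inputs, not used anywhere in the generator):
#
#     X  = ['a', 'b']
#     R  = lambda x: ['b'] if x == 'a' else []    # 'a' R 'b'
#     FP = lambda x: [x.upper()]
#     digraph(X, R, FP)   # -> {'a': ['A', 'B'], 'b': ['B']}
#
# i.e. F('a') = FP('a') U F('b'), exactly the equation given above.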

class LALRError(YaccError):
    pass

# -----------------------------------------------------------------------------
#                             == LRGeneratedTable ==
#
# This class implements the LR table generation algorithm.  There are no
# public methods except for write_table() and pickle_table().
# -----------------------------------------------------------------------------

class LRGeneratedTable(LRTable):
    def __init__(self, grammar, method='LALR', log=None):
        if method not in ['SLR', 'LALR']:
            raise LALRError('Unsupported method %s' % method)

        self.grammar = grammar
        self.lr_method = method

        # Set up the logger
        if not log:
            log = NullLogger()
        self.log = log

        # Internal attributes
        self.lr_action     = {}        # Action table
        self.lr_goto       = {}        # Goto table
        self.lr_productions  = grammar.Productions    # Copy of grammar Production array
        self.lr_goto_cache = {}        # Cache of computed gotos
        self.lr0_cidhash   = {}        # Cache of closures

        self._add_count    = 0         # Internal counter used to mark items already added during closure computation

        # Diagnostic information filled in by the table generator
        self.sr_conflict   = 0
        self.rr_conflict   = 0
        self.conflicts     = []        # List of conflicts

        self.sr_conflicts  = []
        self.rr_conflicts  = []

        # Build the tables
        self.grammar.build_lritems()
        self.grammar.compute_first()
        self.grammar.compute_follow()
        self.lr_parse_table()

    # Compute the LR(0) closure operation on I, where I is a set of LR(0) items.

    def lr0_closure(self, I):
        self._add_count += 1

        # Add everything in I to J
        J = I[:]
        didadd = True
        while didadd:
            didadd = False
            for j in J:
                for x in j.lr_after:
                    if getattr(x, 'lr0_added', 0) == self._add_count:
                        continue
                    # Add B --> .G to J
                    J.append(x.lr_next)
                    x.lr0_added = self._add_count
                    didadd = True

        return J

    # Compute the LR(0) goto function goto(I,X) where I is a set
    # of LR(0) items and X is a grammar symbol.   This function is written
    # in a way that guarantees uniqueness of the generated goto sets
    # (i.e. the same goto set will never be returned as two different Python
    # objects).  With uniqueness, we can later do fast set comparisons using
    # id(obj) instead of element-wise comparison.

    def lr0_goto(self, I, x):
        # First we look for a previously cached entry
        g = self.lr_goto_cache.get((id(I), x))
        if g:
            return g

        # Now we generate the goto set in a way that guarantees uniqueness
        # of the result

        s = self.lr_goto_cache.get(x)
        if not s:
            s = {}
            self.lr_goto_cache[x] = s

        gs = []
        for p in I:
            n = p.lr_next
            if n and n.lr_before == x:
                s1 = s.get(id(n))
                if not s1:
                    s1 = {}
                    s[id(n)] = s1
                gs.append(n)
                s = s1
        g = s.get('$end')
        if not g:
            if gs:
                g = self.lr0_closure(gs)
                s['$end'] = g
            else:
                s['$end'] = gs
        self.lr_goto_cache[(id(I), x)] = g
        return g

    # Compute the LR(0) sets of item function
    def lr0_items(self):
        C = [self.lr0_closure([self.grammar.Productions[0].lr_next])]
        i = 0
        for I in C:
            self.lr0_cidhash[id(I)] = i
            i += 1

        # Loop over the items in C and each grammar symbol
        i = 0
        while i < len(C):
            I = C[i]
            i += 1

            # Collect all of the symbols that could possibly be in the goto(I,X) sets
            asyms = {}
            for ii in I:
                for s in ii.usyms:
                    asyms[s] = None

            for x in asyms:
                g = self.lr0_goto(I, x)
                if not g or id(g) in self.lr0_cidhash:
                    continue
                self.lr0_cidhash[id(g)] = len(C)
                C.append(g)

        return C

    # -----------------------------------------------------------------------------
    #                       ==== LALR(1) Parsing ====
    #
    # LALR(1) parsing is almost exactly the same as SLR except that instead of
    # relying upon Follow() sets when performing reductions, a more selective
    # lookahead set that incorporates the state of the LR(0) machine is utilized.
    # Thus, we mainly just have to focus on calculating the lookahead sets.
    #
    # The method used here is due to DeRemer and Pennello (1982).
    #
    # DeRemer, F. L., and T. J. Pennello: "Efficient Computation of LALR(1)
    #     Lookahead Sets", ACM Transactions on Programming Languages and Systems,
    #     Vol. 4, No. 4, Oct. 1982, pp. 615-649
    #
    # Further details can also be found in:
    #
    #  J. Tremblay and P. Sorenson, "The Theory and Practice of Compiler Writing",
    #      McGraw-Hill Book Company, (1985).
    #
    # -----------------------------------------------------------------------------

    # -----------------------------------------------------------------------------
    # compute_nullable_nonterminals()
    #
    # Creates a dictionary containing all of the non-terminals that might produce
    # an empty production.
    # -----------------------------------------------------------------------------

    def compute_nullable_nonterminals(self):
        nullable = set()
        num_nullable = 0
        while True:
            for p in self.grammar.Productions[1:]:
                if p.len == 0:
                    nullable.add(p.name)
                    continue
                for t in p.prod:
                    if t not in nullable:
                        break
                else:
                    nullable.add(p.name)
            if len(nullable) == num_nullable:
                break
            num_nullable = len(nullable)
        return nullable
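
    # Example (hypothetical rules, for illustration): given
    #
    #     opt_sign : PLUS
    #              | MINUS
    #              |
    #
    # the empty alternative has p.len == 0, so 'opt_sign' is added to the nullable
    # set; any nonterminal whose entire right-hand side is nullable is then picked
    # up on a later pass of the while loop.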

    # -----------------------------------------------------------------------------
    # find_nonterminal_transitions(C)
    #
    # Given a set of LR(0) items, this function finds all of the non-terminal
    # transitions.  These are transitions in which a dot appears immediately before
    # a non-terminal.  Returns a list of tuples of the form (state,N) where state
    # is the state number and N is the nonterminal symbol.
    #
    # The input C is the set of LR(0) items.
    # -----------------------------------------------------------------------------

    def find_nonterminal_transitions(self, C):
        trans = []
        for stateno, state in enumerate(C):
            for p in state:
                if p.lr_index < p.len - 1:
                    t = (stateno, p.prod[p.lr_index+1])
                    if t[1] in self.grammar.Nonterminals:
                        if t not in trans:
                            trans.append(t)
        return trans

    # -----------------------------------------------------------------------------
    # dr_relation()
    #
    # Computes the DR(p,A) relationships for non-terminal transitions.  The input
    # is a tuple (state,N) where state is a number and N is a nonterminal symbol.
    #
    # Returns a list of terminals.
    # -----------------------------------------------------------------------------

    def dr_relation(self, C, trans, nullable):
        dr_set = {}
        state, N = trans
        terms = []

        g = self.lr0_goto(C[state], N)
        for p in g:
            if p.lr_index < p.len - 1:
                a = p.prod[p.lr_index+1]
                if a in self.grammar.Terminals:
                    if a not in terms:
                        terms.append(a)

        # This extra bit is to handle the start state
        if state == 0 and N == self.grammar.Productions[0].prod[0]:
            terms.append('$end')

        return terms

    # -----------------------------------------------------------------------------
    # reads_relation()
    #
    # Computes the READS() relation (p,A) READS (t,C).
    # -----------------------------------------------------------------------------

    def reads_relation(self, C, trans, empty):
        # Look for empty transitions
        rel = []
        state, N = trans

        g = self.lr0_goto(C[state], N)
        j = self.lr0_cidhash.get(id(g), -1)
        for p in g:
            if p.lr_index < p.len - 1:
                a = p.prod[p.lr_index + 1]
                if a in empty:
                    rel.append((j, a))

        return rel

    # -----------------------------------------------------------------------------
    # compute_lookback_includes()
    #
    # Determines the lookback and includes relations
    #
    # LOOKBACK:
    #
    # This relation is determined by running the LR(0) state machine forward.
    # For example, starting with a production "N : . A B C", we run it forward
    # to obtain "N : A B C ."   We then build a relationship between this final
    # state and the starting state.   These relationships are stored in a dictionary
    # lookdict.
    #
    # INCLUDES:
    #
    # Computes the INCLUDE() relation (p,A) INCLUDES (p',B).
    #
    # This relation is used to determine non-terminal transitions that occur
    # inside of other non-terminal transition states.   (p,A) INCLUDES (p', B)
    # if the following holds:
    #
    #       B -> LAT, where T -> epsilon and p' -L-> p
    #
    # L is essentially a prefix (which may be empty), T is a suffix that must be
    # able to derive an empty string.  State p' must lead to state p with the string L.
    #
    # -----------------------------------------------------------------------------

    def compute_lookback_includes(self, C, trans, nullable):
        lookdict = {}          # Dictionary of lookback relations
        includedict = {}       # Dictionary of include relations

        # Make a dictionary of non-terminal transitions
        dtrans = {}
        for t in trans:
            dtrans[t] = 1

        # Loop over all transitions and compute lookbacks and includes
        for state, N in trans:
            lookb = []
            includes = []
            for p in C[state]:
                if p.name != N:
                    continue

                # Okay, we have a name match.  We now follow the production all the way
                # through the state machine until we get the . on the right hand side

                lr_index = p.lr_index
                j = state
                while lr_index < p.len - 1:
                    lr_index = lr_index + 1
                    t = p.prod[lr_index]

                    # Check to see if this symbol and state are a non-terminal transition
                    if (j, t) in dtrans:
                        # Yes.  Okay, there is some chance that this is an includes relation
                        # the only way to know for certain is whether the rest of the
                        # production derives empty

                        li = lr_index + 1
                        while li < p.len:
                            if p.prod[li] in self.grammar.Terminals:
                                break      # No, forget it
                            if p.prod[li] not in nullable:
                                break
                            li = li + 1
                        else:
                            # Appears to be a relation between (j,t) and (state,N)
                            includes.append((j, t))

                    g = self.lr0_goto(C[j], t)               # Go to next set
                    j = self.lr0_cidhash.get(id(g), -1)      # Go to next state

                # When we get here, j is the final state, now we have to locate the production
                for r in C[j]:
                    if r.name != p.name:
                        continue
                    if r.len != p.len:
                        continue
                    i = 0
                    # This loop is comparing a production ". A B C" with "A B C ."
                    while i < r.lr_index:
                        if r.prod[i] != p.prod[i+1]:
                            break
                        i = i + 1
                    else:
                        lookb.append((j, r))
            for i in includes:
                if i not in includedict:
                    includedict[i] = []
                includedict[i].append((state, N))
            lookdict[(state, N)] = lookb

        return lookdict, includedict

    # -----------------------------------------------------------------------------
    # compute_read_sets()
    #
    # Given a set of LR(0) items, this function computes the read sets.
    #
    # Inputs:  C        =  Set of LR(0) items
    #          ntrans   = Set of nonterminal transitions
    #          nullable = Set of empty transitions
    #
    # Returns a dictionary containing the read sets
    # -----------------------------------------------------------------------------

    def compute_read_sets(self, C, ntrans, nullable):
        FP = lambda x: self.dr_relation(C, x, nullable)
        R =  lambda x: self.reads_relation(C, x, nullable)
        F = digraph(ntrans, R, FP)
        return F

    # -----------------------------------------------------------------------------
    # compute_follow_sets()
    #
    # Given a set of LR(0) items, a set of non-terminal transitions, a readset,
    # and an include set, this function computes the follow sets
    #
    # Follow(p,A) = Read(p,A) U U {Follow(p',B) | (p,A) INCLUDES (p',B)}
    #
    # Inputs:
    #            ntrans     = Set of nonterminal transitions
    #            readsets   = Readset (previously computed)
    #            inclsets   = Include sets (previously computed)
    #
    # Returns a dictionary containing the follow sets
    # -----------------------------------------------------------------------------

    def compute_follow_sets(self, ntrans, readsets, inclsets):
        FP = lambda x: readsets[x]
        R  = lambda x: inclsets.get(x, [])
        F = digraph(ntrans, R, FP)
        return F

    # -----------------------------------------------------------------------------
    # add_lookaheads()
    #
    # Attaches the lookahead symbols to grammar rules.
    #
    # Inputs:    lookbacks         -  Set of lookback relations
    #            followset         -  Computed follow set
    #
    # This function directly attaches the lookaheads to productions contained
    # in the lookbacks set
    # -----------------------------------------------------------------------------

    def add_lookaheads(self, lookbacks, followset):
        for trans, lb in lookbacks.items():
            # Loop over productions in lookback
            for state, p in lb:
                if state not in p.lookaheads:
                    p.lookaheads[state] = []
                f = followset.get(trans, [])
                for a in f:
                    if a not in p.lookaheads[state]:
                        p.lookaheads[state].append(a)

    # -----------------------------------------------------------------------------
    # add_lalr_lookaheads()
    #
    # This function does all of the work of adding lookahead information for use
    # with LALR parsing
    # -----------------------------------------------------------------------------

    def add_lalr_lookaheads(self, C):
        # Determine all of the nullable nonterminals
        nullable = self.compute_nullable_nonterminals()

        # Find all non-terminal transitions
        trans = self.find_nonterminal_transitions(C)

        # Compute read sets
        readsets = self.compute_read_sets(C, trans, nullable)

        # Compute lookback/includes relations
        lookd, included = self.compute_lookback_includes(C, trans, nullable)

        # Compute LALR FOLLOW sets
        followsets = self.compute_follow_sets(trans, readsets, included)

        # Add all of the lookaheads
        self.add_lookaheads(lookd, followsets)

    # -----------------------------------------------------------------------------
    # lr_parse_table()
    #
    # This function constructs the parse tables for SLR or LALR
    # -----------------------------------------------------------------------------
    def lr_parse_table(self):
        Productions = self.grammar.Productions
        Precedence  = self.grammar.Precedence
        goto   = self.lr_goto         # Goto array
        action = self.lr_action       # Action array
        log    = self.log             # Logger for output

        actionp = {}                  # Action production array (temporary)

        log.info('Parsing method: %s', self.lr_method)

        # Step 1: Construct C = { I0, I1, ... IN}, collection of LR(0) items
        # This determines the number of states

        C = self.lr0_items()

        if self.lr_method == 'LALR':
            self.add_lalr_lookaheads(C)

        # Build the parser table, state by state
        st = 0
        for I in C:
            # Loop over each production in I
            actlist = []              # List of actions
            st_action  = {}
            st_actionp = {}
            st_goto    = {}
            log.info('')
            log.info('state %d', st)
            log.info('')
            for p in I:
                log.info('    (%d) %s', p.number, p)
            log.info('')

            for p in I:
                    if p.len == p.lr_index + 1:
                        if p.name == "S'":
                            # Start symbol. Accept!
                            st_action['$end'] = 0
                            st_actionp['$end'] = p
                        else:
                            # We are at the end of a production.  Reduce!
                            if self.lr_method == 'LALR':
                                laheads = p.lookaheads[st]
                            else:
                                laheads = self.grammar.Follow[p.name]
                            for a in laheads:
                                actlist.append((a, p, 'reduce using rule %d (%s)' % (p.number, p)))
                                r = st_action.get(a)
                                if r is not None:
                                    # Whoa. Have a shift/reduce or reduce/reduce conflict
                                    if r > 0:
                                        # Need to decide on shift or reduce here
                                        # By default we favor shifting. Need to add
                                        # some precedence rules here.
                                        sprec, slevel = Productions[st_actionp[a].number].prec
                                        rprec, rlevel = Precedence.get(a, ('right', 0))
                                        if (slevel < rlevel) or ((slevel == rlevel) and (rprec == 'left')):
                                            # We really need to reduce here.
                                            st_action[a] = -p.number
                                            st_actionp[a] = p
                                            if not slevel and not rlevel:
                                                log.info('  ! shift/reduce conflict for %s resolved as reduce', a)
                                                self.sr_conflicts.append((st, a, 'reduce'))
                                            Productions[p.number].reduced += 1
                                        elif (slevel == rlevel) and (rprec == 'nonassoc'):
                                            st_action[a] = None
                                        else:
                                            # Hmmm. Guess we'll keep the shift
                                            if not rlevel:
                                                log.info('  ! shift/reduce conflict for %s resolved as shift', a)
                                                self.sr_conflicts.append((st, a, 'shift'))
                                    elif r < 0:
                                        # Reduce/reduce conflict.   In this case, we favor the rule
                                        # that was defined first in the grammar file
                                        oldp = Productions[-r]
                                        pp = Productions[p.number]
                                        if oldp.line > pp.line:
                                            st_action[a] = -p.number
                                            st_actionp[a] = p
                                            chosenp, rejectp = pp, oldp
                                            Productions[p.number].reduced += 1
                                            Productions[oldp.number].reduced -= 1
                                        else:
                                            chosenp, rejectp = oldp, pp
                                        self.rr_conflicts.append((st, chosenp, rejectp))
                                        log.info('  ! reduce/reduce conflict for %s resolved using rule %d (%s)',
                                                 a, st_actionp[a].number, st_actionp[a])
                                    else:
                                        raise LALRError('Unknown conflict in state %d' % st)
                                else:
                                    st_action[a] = -p.number
                                    st_actionp[a] = p
                                    Productions[p.number].reduced += 1
                    else:
                        i = p.lr_index
                        a = p.prod[i+1]       # Get symbol right after the "."
                        if a in self.grammar.Terminals:
                            g = self.lr0_goto(I, a)
                            j = self.lr0_cidhash.get(id(g), -1)
                            if j >= 0:
                                # We are in a shift state
                                actlist.append((a, p, 'shift and go to state %d' % j))
                                r = st_action.get(a)
                                if r is not None:
                                    # Whoa. Have a shift/reduce or shift/shift conflict
                                    if r > 0:
                                        if r != j:
                                            raise LALRError('Shift/shift conflict in state %d' % st)
                                    elif r < 0:
                                        # Do a precedence check.
                                        #   -  if precedence of reduce rule is higher, we reduce.
                                        #   -  if precedence of reduce is same and left assoc, we reduce.
                                        #   -  otherwise we shift
                                        rprec, rlevel = Productions[st_actionp[a].number].prec
                                        sprec, slevel = Precedence.get(a, ('right', 0))
                                        if (slevel > rlevel) or ((slevel == rlevel) and (rprec == 'right')):
                                            # We decide to shift here... highest precedence to shift
                                            Productions[st_actionp[a].number].reduced -= 1
                                            st_action[a] = j
                                            st_actionp[a] = p
                                            if not rlevel:
                                                log.info('  ! shift/reduce conflict for %s resolved as shift', a)
                                                self.sr_conflicts.append((st, a, 'shift'))
                                        elif (slevel == rlevel) and (rprec == 'nonassoc'):
                                            st_action[a] = None
                                        else:
                                            # Hmmm. Guess we'll keep the reduce
                                            if not slevel and not rlevel:
                                                log.info('  ! shift/reduce conflict for %s resolved as reduce', a)
                                                self.sr_conflicts.append((st, a, 'reduce'))

                                    else:
                                        raise LALRError('Unknown conflict in state %d' % st)
                                else:
                                    st_action[a] = j
                                    st_actionp[a] = p

            # Print the actions associated with each terminal
            _actprint = {}
            for a, p, m in actlist:
                if a in st_action:
                    if p is st_actionp[a]:
                        log.info('    %-15s %s', a, m)
                        _actprint[(a, m)] = 1
            log.info('')
            # Print the actions that were not used. (debugging)
            not_used = 0
            for a, p, m in actlist:
                if a in st_action:
                    if p is not st_actionp[a]:
                        if (a, m) not in _actprint:
                            log.debug('  ! %-15s [ %s ]', a, m)
                            not_used = 1
                            _actprint[(a, m)] = 1
            if not_used:
                log.debug('')

            # Construct the goto table for this state

            nkeys = {}
            for ii in I:
                for s in ii.usyms:
                    if s in self.grammar.Nonterminals:
                        nkeys[s] = None
            for n in nkeys:
                g = self.lr0_goto(I, n)
                j = self.lr0_cidhash.get(id(g), -1)
                if j >= 0:
                    st_goto[n] = j
                    log.info('    %-30s shift and go to state %d', n, j)

            action[st] = st_action
            actionp[st] = st_actionp
            goto[st] = st_goto
            st += 1
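
    # Worked illustration of the conflict resolution above (hypothetical grammar):
    # with the rule
    #
    #     expr : expr PLUS expr
    #
    # and precedence = (('left', 'PLUS'),), the state containing
    # "expr : expr PLUS expr ." has a shift/reduce conflict on PLUS.  The shift and
    # the reduce both carry level 1 with 'left' associativity, so the table keeps
    # the reduce, making PLUS group left to right.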

    # -----------------------------------------------------------------------------
    # write_table()
    #
    # This function writes the LR parsing tables to a file
    # -----------------------------------------------------------------------------

    def write_table(self, tabmodule, outputdir='', signature=''):
        if isinstance(tabmodule, types.ModuleType):
            raise IOError("Won't overwrite existing tabmodule")

        basemodulename = tabmodule.split('.')[-1]
        filename = os.path.join(outputdir, basemodulename) + '.py'
        try:
            f = open(filename, 'w')

            f.write('''
# %s
# This file is automatically generated. Do not edit.
_tabversion = %r

_lr_method = %r

_lr_signature = %r
    ''' % (os.path.basename(filename), __tabversion__, self.lr_method, signature))

            # Change smaller to 0 to go back to original tables
            smaller = 1

            # Factor out names to try and make smaller
            if smaller:
                items = {}

                for s, nd in self.lr_action.items():
                    for name, v in nd.items():
                        i = items.get(name)
                        if not i:
                            i = ([], [])
                            items[name] = i
                        i[0].append(s)
                        i[1].append(v)

                f.write('\n_lr_action_items = {')
                for k, v in items.items():
                    f.write('%r:([' % k)
                    for i in v[0]:
                        f.write('%r,' % i)
                    f.write('],[')
                    for i in v[1]:
                        f.write('%r,' % i)

                    f.write(']),')
                f.write('}\n')

                f.write('''
_lr_action = {}
for _k, _v in _lr_action_items.items():
   for _x,_y in zip(_v[0],_v[1]):
      if not _x in _lr_action:  _lr_action[_x] = {}
      _lr_action[_x][_k] = _y
del _lr_action_items
''')

            else:
                f.write('\n_lr_action = { ')
                for k, v in self.lr_action.items():
                    f.write('(%r,%r):%r,' % (k[0], k[1], v))
                f.write('}\n')

            if smaller:
                # Factor out names to try and make smaller
                items = {}

                for s, nd in self.lr_goto.items():
                    for name, v in nd.items():
                        i = items.get(name)
                        if not i:
                            i = ([], [])
                            items[name] = i
                        i[0].append(s)
                        i[1].append(v)

                f.write('\n_lr_goto_items = {')
                for k, v in items.items():
                    f.write('%r:([' % k)
                    for i in v[0]:
                        f.write('%r,' % i)
                    f.write('],[')
                    for i in v[1]:
                        f.write('%r,' % i)

                    f.write(']),')
                f.write('}\n')

                f.write('''
_lr_goto = {}
for _k, _v in _lr_goto_items.items():
   for _x, _y in zip(_v[0], _v[1]):
       if not _x in _lr_goto: _lr_goto[_x] = {}
       _lr_goto[_x][_k] = _y
del _lr_goto_items
''')
            else:
                f.write('\n_lr_goto = { ')
                for k, v in self.lr_goto.items():
                    f.write('(%r,%r):%r,' % (k[0], k[1], v))
                f.write('}\n')

            # Write production table
            f.write('_lr_productions = [\n')
            for p in self.lr_productions:
                if p.func:
                    f.write('  (%r,%r,%d,%r,%r,%d),\n' % (p.str, p.name, p.len,
                                                          p.func, os.path.basename(p.file), p.line))
                else:
                    f.write('  (%r,%r,%d,None,None,None),\n' % (str(p), p.name, p.len))
            f.write(']\n')
            f.close()

        except IOError:
            raise


    # -----------------------------------------------------------------------------
    # pickle_table()
    #
    # This function pickles the LR parsing tables to a supplied file object
    # -----------------------------------------------------------------------------

    def pickle_table(self, filename, signature=''):
        try:
            import cPickle as pickle
        except ImportError:
            import pickle
        with open(filename, 'wb') as outf:
            pickle.dump(__tabversion__, outf, pickle_protocol)
            pickle.dump(self.lr_method, outf, pickle_protocol)
            pickle.dump(signature, outf, pickle_protocol)
            pickle.dump(self.lr_action, outf, pickle_protocol)
            pickle.dump(self.lr_goto, outf, pickle_protocol)

            outp = []
            for p in self.lr_productions:
                if p.func:
                    outp.append((p.str, p.name, p.len, p.func, os.path.basename(p.file), p.line))
                else:
                    outp.append((str(p), p.name, p.len, None, None, None))
            pickle.dump(outp, outf, pickle_protocol)

# -----------------------------------------------------------------------------
#                            === INTROSPECTION ===
#
# The following functions and classes are used to implement the PLY
# introspection features followed by the yacc() function itself.
# -----------------------------------------------------------------------------

# -----------------------------------------------------------------------------
# get_caller_module_dict()
#
# This function returns a dictionary containing all of the symbols defined within
# a caller further down the call stack.  This is used to get the environment
# associated with the yacc() call if none was provided.
# -----------------------------------------------------------------------------

def get_caller_module_dict(levels):
    f = sys._getframe(levels)
    ldict = f.f_globals.copy()
    if f.f_globals != f.f_locals:
        ldict.update(f.f_locals)
    return ldict

# -----------------------------------------------------------------------------
# parse_grammar()
#
# This takes a raw grammar rule string and parses it into production data
# -----------------------------------------------------------------------------
def parse_grammar(doc, file, line):
    grammar = []
    # Split the doc string into lines
    pstrings = doc.splitlines()
    lastp = None
    dline = line
    for ps in pstrings:
        dline += 1
        p = ps.split()
        if not p:
            continue
        try:
            if p[0] == '|':
                # This is a continuation of a previous rule
                if not lastp:
                    raise SyntaxError("%s:%d: Misplaced '|'" % (file, dline))
                prodname = lastp
                syms = p[1:]
            else:
                prodname = p[0]
                lastp = prodname
                syms   = p[2:]
                assign = p[1]
                if assign != ':' and assign != '::=':
                    raise SyntaxError("%s:%d: Syntax error. Expected ':'" % (file, dline))

            grammar.append((file, dline, prodname, syms))
        except SyntaxError:
            raise
        except Exception:
            raise SyntaxError('%s:%d: Syntax error in rule %r' % (file, dline, ps.strip()))

    return grammar
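
# Example of the transformation parse_grammar() performs (illustrative values):
# called with file='calc.py' and line=10, the docstring
#
#     expression : expression PLUS term
#                | term
#
# is turned into
#
#     [('calc.py', 11, 'expression', ['expression', 'PLUS', 'term']),
#      ('calc.py', 12, 'expression', ['term'])]
#
# where a line starting with '|' reuses the rule name of the preceding line.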

# -----------------------------------------------------------------------------
# ParserReflect()
#
# This class represents information extracted for building a parser including
# start symbol, error function, tokens, precedence list, action functions,
# etc.
# -----------------------------------------------------------------------------
class ParserReflect(object):
    def __init__(self, pdict, log=None):
        self.pdict      = pdict
        self.start      = None
        self.error_func = None
        self.tokens     = None
        self.modules    = set()
        self.grammar    = []
        self.error      = False

        if log is None:
            self.log = PlyLogger(sys.stderr)
        else:
            self.log = log

    # Get all of the basic information
    def get_all(self):
        self.get_start()
        self.get_error_func()
        self.get_tokens()
        self.get_precedence()
        self.get_pfunctions()

    # Validate all of the information
    def validate_all(self):
        self.validate_start()
        self.validate_error_func()
        self.validate_tokens()
        self.validate_precedence()
        self.validate_pfunctions()
        self.validate_modules()
        return self.error

    # Compute a signature over the grammar
    def signature(self):
        try:
            from hashlib import md5
        except ImportError:
            from md5 import md5
        try:
            sig = md5(usedforsecurity=False)
            if self.start:
                sig.update(self.start.encode('latin-1'))
            if self.prec:
                sig.update(''.join([''.join(p) for p in self.prec]).encode('latin-1'))
            if self.tokens:
                sig.update(' '.join(self.tokens).encode('latin-1'))
            for f in self.pfuncs:
                if f[3]:
                    sig.update(f[3].encode('latin-1'))
        except (TypeError, ValueError):
            pass

        digest = base64.b16encode(sig.digest())
        if sys.version_info[0] >= 3:
            digest = digest.decode('latin-1')
        return digest

    # -----------------------------------------------------------------------------
    # validate_modules()
    #
    # This method checks to see if there are duplicated p_rulename() functions
    # in the parser module file.  Without this function, it is really easy for
    # users to make mistakes by cutting and pasting code fragments (and it's a real
    # bugger to try and figure out why the resulting parser doesn't work).  Therefore,
    # we just do a little regular expression pattern matching of def statements
    # to try and detect duplicates.
    # -----------------------------------------------------------------------------

    def validate_modules(self):
        # Match def p_funcname(
        fre = re.compile(r'\s*def\s+(p_[a-zA-Z_0-9]*)\(')

        for module in self.modules:
            try:
                lines, linen = inspect.getsourcelines(module)
            except IOError:
                continue

            counthash = {}
            for linen, line in enumerate(lines):
                linen += 1
                m = fre.match(line)
                if m:
                    name = m.group(1)
                    prev = counthash.get(name)
                    if not prev:
                        counthash[name] = linen
                    else:
                        filename = inspect.getsourcefile(module)
                        self.log.warning('%s:%d: Function %s redefined. Previously defined on line %d',
                                         filename, linen, name, prev)

    # Get the start symbol
    def get_start(self):
        self.start = self.pdict.get('start')

    # Validate the start symbol
    def validate_start(self):
        if self.start is not None:
            if not isinstance(self.start, string_types):
                self.log.error("'start' must be a string")

    # Look for error handler
    def get_error_func(self):
        self.error_func = self.pdict.get('p_error')

    # Validate the error function
    def validate_error_func(self):
        if self.error_func:
            if isinstance(self.error_func, types.FunctionType):
                ismethod = 0
            elif isinstance(self.error_func, types.MethodType):
                ismethod = 1
            else:
                self.log.error("'p_error' defined, but is not a function or method")
                self.error = True
                return

            eline = self.error_func.__code__.co_firstlineno
            efile = self.error_func.__code__.co_filename
            module = inspect.getmodule(self.error_func)
            self.modules.add(module)

            argcount = self.error_func.__code__.co_argcount - ismethod
            if argcount != 1:
                self.log.error('%s:%d: p_error() requires 1 argument', efile, eline)
                self.error = True

    # Get the tokens map
    def get_tokens(self):
        tokens = self.pdict.get('tokens')
        if not tokens:
            self.log.error('No token list is defined')
            self.error = True
            return

        if not isinstance(tokens, (list, tuple)):
            self.log.error('tokens must be a list or tuple')
            self.error = True
            return

        if not tokens:
            self.log.error('tokens is empty')
            self.error = True
            return

        self.tokens = tokens

    # Validate the tokens
    def validate_tokens(self):
        # Validate the tokens.
        if 'error' in self.tokens:
            self.log.error("Illegal token name 'error'. Is a reserved word")
            self.error = True
            return

        terminals = set()
        for n in self.tokens:
            if n in terminals:
                self.log.warning('Token %r multiply defined', n)
            terminals.add(n)

    # Get the precedence map (if any)
    def get_precedence(self):
        self.prec = self.pdict.get('precedence')

    # Validate and parse the precedence map
    def validate_precedence(self):
        preclist = []
        if self.prec:
            if not isinstance(self.prec, (list, tuple)):
                self.log.error('precedence must be a list or tuple')
                self.error = True
                return
            for level, p in enumerate(self.prec):
                if not isinstance(p, (list, tuple)):
                    self.log.error('Bad precedence table')
                    self.error = True
                    return

                if len(p) < 2:
                    self.log.error('Malformed precedence entry %s. Must be (assoc, term, ..., term)', p)
                    self.error = True
                    return
                assoc = p[0]
                if not isinstance(assoc, string_types):
                    self.log.error('precedence associativity must be a string')
                    self.error = True
                    return
                for term in p[1:]:
                    if not isinstance(term, string_types):
                        self.log.error('precedence items must be strings')
                        self.error = True
                        return
                    preclist.append((term, assoc, level+1))
        self.preclist = preclist

    # Get all p_functions from the grammar
    def get_pfunctions(self):
        p_functions = []
        for name, item in self.pdict.items():
            if not name.startswith('p_') or name == 'p_error':
                continue
            if isinstance(item, (types.FunctionType, types.MethodType)):
                line = getattr(item, 'co_firstlineno', item.__code__.co_firstlineno)
                module = inspect.getmodule(item)
                p_functions.append((line, module, name, item.__doc__))

        # Sort all of the actions by line number; make sure to stringify
        # modules to make them sortable, since `line` may not uniquely sort all
        # p functions
        p_functions.sort(key=lambda p_function: (
            p_function[0],
            str(p_function[1]),
            p_function[2],
            p_function[3]))
        self.pfuncs = p_functions

    # Validate all of the p_functions
    def validate_pfunctions(self):
        grammar = []
        # Check for non-empty symbols
        if len(self.pfuncs) == 0:
            self.log.error('no rules of the form p_rulename are defined')
            self.error = True
            return

        for line, module, name, doc in self.pfuncs:
            file = inspect.getsourcefile(module)
            func = self.pdict[name]
            if isinstance(func, types.MethodType):
                reqargs = 2
            else:
                reqargs = 1
            if func.__code__.co_argcount > reqargs:
                self.log.error('%s:%d: Rule %r has too many arguments', file, line, func.__name__)
                self.error = True
            elif func.__code__.co_argcount < reqargs:
                self.log.error('%s:%d: Rule %r requires an argument', file, line, func.__name__)
                self.error = True
            elif not func.__doc__:
                self.log.warning('%s:%d: No documentation string specified in function %r (ignored)',
                                 file, line, func.__name__)
            else:
                try:
                    parsed_g = parse_grammar(doc, file, line)
                    for g in parsed_g:
                        grammar.append((name, g))
                except SyntaxError as e:
                    self.log.error(str(e))
                    self.error = True

                # Looks like a valid grammar rule
                # Mark the file in which defined.
                self.modules.add(module)

        # Secondary validation step that looks for p_ definitions that are not functions
        # or functions that look like they might be grammar rules.

        for n, v in self.pdict.items():
            if n.startswith('p_') and isinstance(v, (types.FunctionType, types.MethodType)):
                continue
            if n.startswith('t_'):
                continue
            if n.startswith('p_') and n != 'p_error':
                self.log.warning('%r not defined as a function', n)
            if ((isinstance(v, types.FunctionType) and v.__code__.co_argcount == 1) or
                   (isinstance(v, types.MethodType) and v.__func__.__code__.co_argcount == 2)):
                if v.__doc__:
                    try:
                        doc = v.__doc__.split(' ')
                        if doc[1] == ':':
                            self.log.warning('%s:%d: Possible grammar rule %r defined without p_ prefix',
                                             v.__code__.co_filename, v.__code__.co_firstlineno, n)
                    except IndexError:
                        pass

        self.grammar = grammar

# -----------------------------------------------------------------------------
# yacc(module)
#
# Build a parser
# -----------------------------------------------------------------------------
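
# A hedged usage sketch (not part of this module): the calling module defines a
# token list, t_* lexing rules and p_* grammar rules, then calls yacc().  The
# NUMBER/PLUS grammar below is a hypothetical example.
#
#     import ply.lex as lex
#     import ply.yacc as yacc
#
#     tokens = ('NUMBER', 'PLUS')
#     t_PLUS = r'\+'
#     t_ignore = ' '
#
#     def t_NUMBER(t):
#         r'\d+'
#         t.value = int(t.value)
#         return t
#
#     def t_error(t):
#         t.lexer.skip(1)
#
#     def p_expr_plus(p):
#         'expr : expr PLUS NUMBER'
#         p[0] = p[1] + p[3]
#
#     def p_expr_number(p):
#         'expr : NUMBER'
#         p[0] = p[1]
#
#     def p_error(p):
#         print('Syntax error at', p)
#
#     lex.lex()
#     parser = yacc.yacc()           # builds LALR tables, returns an LRParser
#     print(parser.parse('1 + 2'))   # -> 3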

def yacc(method='LALR', debug=yaccdebug, module=None, tabmodule=tab_module, start=None,
         check_recursion=True, optimize=False, write_tables=True, debugfile=debug_file,
         outputdir=None, debuglog=None, errorlog=None, picklefile=None):

    if tabmodule is None:
        tabmodule = tab_module

    # Reference to the parsing method of the last built parser
    global parse

    # If pickling is enabled, table files are not created
    if picklefile:
        write_tables = 0

    if errorlog is None:
        errorlog = PlyLogger(sys.stderr)

    # Get the module dictionary used for the parser
    if module:
        _items = [(k, getattr(module, k)) for k in dir(module)]
        pdict = dict(_items)
        # If no __file__ attribute is available, try to obtain it from the __module__ instead
        if '__file__' not in pdict:
            pdict['__file__'] = sys.modules[pdict['__module__']].__file__
    else:
        pdict = get_caller_module_dict(2)

    if outputdir is None:
        # If no output directory is set, the location of the output files
        # is determined according to the following rules:
        #     - If tabmodule specifies a package, files go into that package directory
        #     - Otherwise, files go in the same directory as the specifying module
        if isinstance(tabmodule, types.ModuleType):
            srcfile = tabmodule.__file__
        else:
            if '.' not in tabmodule:
                srcfile = pdict['__file__']
            else:
                parts = tabmodule.split('.')
                pkgname = '.'.join(parts[:-1])
                exec('import %s' % pkgname)
                srcfile = getattr(sys.modules[pkgname], '__file__', '')
        outputdir = os.path.dirname(srcfile)

    # Determine if the module providing the parser is part of a package.
    # If so, fix the tabmodule setting so that tables load correctly
    pkg = pdict.get('__package__')
    if pkg and isinstance(tabmodule, str):
        if '.' not in tabmodule:
            tabmodule = pkg + '.' + tabmodule
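    # Illustration (hypothetical names): a grammar module mypkg/calc.py has
    # __package__ == 'mypkg', so
    #
    #     tabmodule = 'parsetab'        ->  'mypkg.parsetab'
    #     tabmodule = 'other.parsetab'  ->  unchanged (already dotted)
    #
    # which lets the generated table module be imported from inside the package.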



    # Set start symbol if it's specified directly using an argument
    if start is not None:
        pdict['start'] = start

    # Collect parser information from the dictionary
    pinfo = ParserReflect(pdict, log=errorlog)
    pinfo.get_all()

    if pinfo.error:
        raise YaccError('Unable to build parser')

    # Check signature against table files (if any)
    signature = pinfo.signature()

    # Read the tables
    try:
        lr = LRTable()
        if picklefile:
            read_signature = lr.read_pickle(picklefile)
        else:
            read_signature = lr.read_table(tabmodule)
        if optimize or (read_signature == signature):
            try:
                lr.bind_callables(pinfo.pdict)
                parser = LRParser(lr, pinfo.error_func)
                parse = parser.parse
                return parser
            except Exception as e:
                errorlog.warning('There was a problem loading the table file: %r', e)
    except VersionError as e:
        errorlog.warning(str(e))
    except ImportError:
        pass

    if debuglog is None:
        if debug:
            try:
                debuglog = PlyLogger(open(os.path.join(outputdir, debugfile), 'w'))
            except IOError as e:
                errorlog.warning("Couldn't open %r. %s" % (debugfile, e))
                debuglog = NullLogger()
        else:
            debuglog = NullLogger()

    debuglog.info('Created by PLY version %s (http://www.dabeaz.com/ply)', __version__)

    errors = False

    # Validate the parser information
    if pinfo.validate_all():
        raise YaccError('Unable to build parser')

    if not pinfo.error_func:
        errorlog.warning('no p_error() function is defined')

    # Create a grammar object
    grammar = Grammar(pinfo.tokens)

    # Set precedence level for terminals
    for term, assoc, level in pinfo.preclist:
        try:
            grammar.set_precedence(term, assoc, level)
        except GrammarError as e:
            errorlog.warning('%s', e)

    # Add productions to the grammar
    for funcname, gram in pinfo.grammar:
        file, line, prodname, syms = gram
        try:
            grammar.add_production(prodname, syms, funcname, file, line)
        except GrammarError as e:
            errorlog.error('%s', e)
            errors = True

    # Set the grammar start symbols
    try:
        if start is None:
            grammar.set_start(pinfo.start)
        else:
            grammar.set_start(start)
    except GrammarError as e:
        errorlog.error(str(e))
        errors = True

    if errors:
        raise YaccError('Unable to build parser')

    # Verify the grammar structure
    undefined_symbols = grammar.undefined_symbols()
    for sym, prod in undefined_symbols:
        errorlog.error('%s:%d: Symbol %r used, but not defined as a token or a rule', prod.file, prod.line, sym)
        errors = True

    unused_terminals = grammar.unused_terminals()
    if unused_terminals:
        debuglog.info('')
        debuglog.info('Unused terminals:')
        debuglog.info('')
        for term in unused_terminals:
            errorlog.warning('Token %r defined, but not used', term)
            debuglog.info('    %s', term)

    # Print out all productions to the debug log
    if debug:
        debuglog.info('')
        debuglog.info('Grammar')
        debuglog.info('')
        for n, p in enumerate(grammar.Productions):
            debuglog.info('Rule %-5d %s', n, p)

    # Find unused non-terminals
    unused_rules = grammar.unused_rules()
    for prod in unused_rules:
        errorlog.warning('%s:%d: Rule %r defined, but not used', prod.file, prod.line, prod.name)

    if len(unused_terminals) == 1:
        errorlog.warning('There is 1 unused token')
    if len(unused_terminals) > 1:
        errorlog.warning('There are %d unused tokens', len(unused_terminals))

    if len(unused_rules) == 1:
        errorlog.warning('There is 1 unused rule')
    if len(unused_rules) > 1:
        errorlog.warning('There are %d unused rules', len(unused_rules))

    if debug:
        debuglog.info('')
        debuglog.info('Terminals, with rules where they appear')
        debuglog.info('')
        terms = list(grammar.Terminals)
        terms.sort()
        for term in terms:
            debuglog.info('%-20s : %s', term, ' '.join([str(s) for s in grammar.Terminals[term]]))

        debuglog.info('')
        debuglog.info('Nonterminals, with rules where they appear')
        debuglog.info('')
        nonterms = list(grammar.Nonterminals)
        nonterms.sort()
        for nonterm in nonterms:
            debuglog.info('%-20s : %s', nonterm, ' '.join([str(s) for s in grammar.Nonterminals[nonterm]]))
        debuglog.info('')

    if check_recursion:
        unreachable = grammar.find_unreachable()
        for u in unreachable:
            errorlog.warning('Symbol %r is unreachable', u)

        infinite = grammar.infinite_cycles()
        for inf in infinite:
            errorlog.error('Infinite recursion detected for symbol %r', inf)
            errors = True

    unused_prec = grammar.unused_precedence()
    for term, assoc in unused_prec:
        errorlog.error('Precedence rule %r defined for unknown symbol %r', assoc, term)
        errors = True

    if errors:
        raise YaccError('Unable to build parser')

    # Run the LRGeneratedTable on the grammar
    if debug:
        errorlog.debug('Generating %s tables', method)

    lr = LRGeneratedTable(grammar, method, debuglog)

    if debug:
        num_sr = len(lr.sr_conflicts)

        # Report shift/reduce and reduce/reduce conflicts
        if num_sr == 1:
            errorlog.warning('1 shift/reduce conflict')
        elif num_sr > 1:
            errorlog.warning('%d shift/reduce conflicts', num_sr)

        num_rr = len(lr.rr_conflicts)
        if num_rr == 1:
            errorlog.warning('1 reduce/reduce conflict')
        elif num_rr > 1:
            errorlog.warning('%d reduce/reduce conflicts', num_rr)

    # Write out conflicts to the output file
    if debug and (lr.sr_conflicts or lr.rr_conflicts):
        debuglog.warning('')
        debuglog.warning('Conflicts:')
        debuglog.warning('')

        for state, tok, resolution in lr.sr_conflicts:
            debuglog.warning('shift/reduce conflict for %s in state %d resolved as %s',  tok, state, resolution)

        already_reported = set()
        for state, rule, rejected in lr.rr_conflicts:
            if (state, id(rule), id(rejected)) in already_reported:
                continue
            debuglog.warning('reduce/reduce conflict in state %d resolved using rule (%s)', state, rule)
            debuglog.warning('rejected rule (%s) in state %d', rejected, state)
            errorlog.warning('reduce/reduce conflict in state %d resolved using rule (%s)', state, rule)
            errorlog.warning('rejected rule (%s) in state %d', rejected, state)
            already_reported.add((state, id(rule), id(rejected)))

        warned_never = []
        for state, rule, rejected in lr.rr_conflicts:
            if not rejected.reduced and (rejected not in warned_never):
                debuglog.warning('Rule (%s) is never reduced', rejected)
                errorlog.warning('Rule (%s) is never reduced', rejected)
                warned_never.append(rejected)

    # Write the table file if requested
    if write_tables:
        try:
            lr.write_table(tabmodule, outputdir, signature)
        except IOError as e:
            errorlog.warning("Couldn't create %r. %s" % (tabmodule, e))

    # Write a pickled version of the tables
    if picklefile:
        try:
            lr.pickle_table(picklefile, signature)
        except IOError as e:
            errorlog.warning("Couldn't create %r. %s" % (picklefile, e))

    # Build the parser
    lr.bind_callables(pinfo.pdict)
    parser = LRParser(lr, pinfo.error_func)

    parse = parser.parse
    return parser
site-packages/ply/ygen.py000064400000004313147511334660011417 0ustar00# ply: ygen.py
#
# This is a support program that auto-generates different versions of the YACC parsing
# function with different features removed for the purposes of performance.
#
# Users should edit the method LRParser.parsedebug() in yacc.py.  The source code
# for that method is then used to create the other methods.   See the comments in
# yacc.py for further details.

import os.path
import shutil

def get_source_range(lines, tag):
    srclines = enumerate(lines)
    start_tag = '#--! %s-start' % tag
    end_tag = '#--! %s-end' % tag

    for start_index, line in srclines:
        if line.strip().startswith(start_tag):
            break

    for end_index, line in srclines:
        if line.strip().endswith(end_tag):
            break

    return (start_index + 1, end_index)

def filter_section(lines, tag):
    filtered_lines = []
    include = True
    tag_text = '#--! %s' % tag
    for line in lines:
        if line.strip().startswith(tag_text):
            include = not include
        elif include:
            filtered_lines.append(line)
    return filtered_lines
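
# Illustration (hedged; the method body shown is schematic): the helpers above
# expect yacc.py to bracket each generated parse method with marker comments of
# the form '#--! <tag>-start' / '#--! <tag>-end', and to wrap removable code in
# '#--! DEBUG' / '#--! TRACKING' pairs, e.g.:
#
#     #--! parsedebug-start
#     def parsedebug(self, input=None, lexer=None, ...):
#         ...
#         #--! DEBUG
#         debug.info('State  : %s', state)
#         #--! DEBUG
#         ...
#     #--! parsedebug-end
#
# get_source_range(lines, 'parsedebug') returns the index range between the
# start/end markers; filter_section(body, 'DEBUG') drops every line between each
# pair of '#--! DEBUG' markers, including the marker lines themselves.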

def main():
    dirname = os.path.dirname(__file__)
    shutil.copy2(os.path.join(dirname, 'yacc.py'), os.path.join(dirname, 'yacc.py.bak'))
    with open(os.path.join(dirname, 'yacc.py'), 'r') as f:
        lines = f.readlines()

    parse_start, parse_end = get_source_range(lines, 'parsedebug')
    parseopt_start, parseopt_end = get_source_range(lines, 'parseopt')
    parseopt_notrack_start, parseopt_notrack_end = get_source_range(lines, 'parseopt-notrack')

    # Get the original source
    orig_lines = lines[parse_start:parse_end]

    # Filter the DEBUG sections out
    parseopt_lines = filter_section(orig_lines, 'DEBUG')

    # Filter the TRACKING sections out
    parseopt_notrack_lines = filter_section(parseopt_lines, 'TRACKING')

    # Replace the parser source sections with updated versions
    lines[parseopt_notrack_start:parseopt_notrack_end] = parseopt_notrack_lines
    lines[parseopt_start:parseopt_end] = parseopt_lines

    lines = [line.rstrip()+'\n' for line in lines]
    with open(os.path.join(dirname, 'yacc.py'), 'w') as f:
        f.writelines(lines)

    print('Updated yacc.py')

if __name__ == '__main__':
    main()





site-packages/ply/__pycache__/ctokens.cpython-36.pyc000064400000004127147511334660016412 0ustar003

r��Wi�4@sLdddddddddd	d
ddd
ddddddddddddddddddd d!d"d#d$d%d&d'd(d)d*d+d,d-d.d/d0d1d2d3g4Zd4Zd5Zd6Zd7Zd8Zd9Zd:Zd;Zd<Z	d=Z
d>Zd?Zd@Z
dAZdBZdCZdDZdEZdFZdGZdHZdIZdJZdKZdLZdMZdNZdOZdPZdQZdRZdSZ dTZ!dUZ"dVZ#dWZ$dXZ%dYZ&dZZ'd[Z(d\Z)d]Z*d^Z+d_Z,d`Z-daZ.dbZ/dcZ0ddZ1deZ2dfZ3dgdh�Z4didj�Z5dkS)lZIDZTYPEIDZINTEGERZFLOAT�STRINGZ	CHARACTER�PLUS�MINUSZTIMESZDIVIDEZMODULO�ORZANDZNOTZXORZLSHIFTZRSHIFTZLORZLANDZLNOTZLTZLEZGTZGEZEQZNEZEQUALSZ
TIMESEQUALZDIVEQUALZMODEQUAL�	PLUSEQUALZ
MINUSEQUALZLSHIFTEQUALZRSHIFTEQUALZANDEQUALZXOREQUALZOREQUALZ	INCREMENTZ	DECREMENTZARROWZTERNARYZLPARENZRPARENZLBRACKETZRBRACKET�LBRACE�RBRACE�COMMAZPERIOD�SEMI�COLON�ELLIPSISz\+�-z\*�/�%z\|�&�~z\^z<<z>>z\|\|z&&�!�<�>z<=z>=z==z!=�=z\*=z/=z%=z\+=z-=z<<=z>>=z&=z\|=z\^=z\+\+z--z->z\?z\(z\)z\[z\]z\{z\}�,z\.�;�:z\.\.\.z[A-Za-z_][A-Za-z0-9_]*z!\d+([uU]|[lL]|[uU][lL]|[lL][uU])?z?((\d+)(\.\d+)(e(\+|-)?(\d+))? | (\d+)e(\+|-)?(\d+))([lL]|[fF])?z\"([^\\\n]|(\\.))*?\"z(L)?\'([^\\\n]|(\\.))*?\'cCs|jj|jjd�7_|S)z/\*(.|\n)*?\*/�
)�lexer�lineno�value�count)�t�r�/usr/lib/python3.6/ctokens.py�	t_COMMENTvsr cCs|jjd7_|S)z//.*\n�)rr)rrrr�t_CPPCOMMENT|sr"N)6�tokensZt_PLUSZt_MINUSZt_TIMESZt_DIVIDEZt_MODULOZt_ORZt_ANDZt_NOTZt_XORZt_LSHIFTZt_RSHIFTZt_LORZt_LANDZt_LNOTZt_LTZt_GTZt_LEZt_GEZt_EQZt_NEZt_EQUALSZt_TIMESEQUALZ
t_DIVEQUALZ
t_MODEQUALZt_PLUSEQUALZt_MINUSEQUALZ
t_LSHIFTEQUALZ
t_RSHIFTEQUALZ
t_ANDEQUALZ	t_OREQUALZ
t_XOREQUALZt_INCREMENTZt_DECREMENTZt_ARROWZ	t_TERNARYZt_LPARENZt_RPARENZ
t_LBRACKETZ
t_RBRACKETZt_LBRACEZt_RBRACEZt_COMMAZt_PERIODZt_SEMIZt_COLONZ
t_ELLIPSISZt_IDZ	t_INTEGERZt_FLOATZt_STRINGZt_CHARACTERr r"rrrr�<module>s�

site-packages/ply/__pycache__/lex.cpython-36.pyc000064400000051733147511334660015541 0ustar003

r��W�@s"dZdZddlZddlZddlZddlZddlZddlZyejej	fZ
Wnek
rdee
fZ
YnXejd�ZGdd�de�ZGdd�de�ZGd	d
�d
e�ZGdd�de�ZGd
d�d�Zdd�Zdd�Zdd�Zdd�Zdd�Zdd�ZGdd�de�Zd%dd �Zd&d!d"�Zd#d$�Z e Z!dS)'z3.9z3.8�Nz^[a-zA-Z0-9_]+$c@seZdZdd�ZdS)�LexErrorcCs|f|_||_dS)N)�args�text)�self�message�s�r�/usr/lib/python3.6/lex.py�__init__:szLexError.__init__N)�__name__�
__module__�__qualname__r
rrrr	r9src@seZdZdd�Zdd�ZdS)�LexTokencCsd|j|j|j|jfS)NzLexToken(%s,%r,%d,%d))�type�value�lineno�lexpos)rrrr	�__str__AszLexToken.__str__cCst|�S)N)�str)rrrr	�__repr__DszLexToken.__repr__N)rrr
rrrrrr	r@src@s4eZdZdd�Zdd�Zdd�Zdd�ZeZeZd	S)
�	PlyLoggercCs
||_dS)N)�f)rrrrr	r
LszPlyLogger.__init__cOs|jj||d�dS)N�
)r�write)r�msgr�kwargsrrr	�criticalOszPlyLogger.criticalcOs|jjd||d�dS)Nz	WARNING: r)rr)rrrrrrr	�warningRszPlyLogger.warningcOs|jjd||d�dS)NzERROR: r)rr)rrrrrrr	�errorUszPlyLogger.errorN)	rrr
r
rrr�info�debugrrrr	rKsrc@seZdZdd�Zdd�ZdS)�
NullLoggercCs|S)Nr)r�namerrr	�__getattribute__^szNullLogger.__getattribute__cOs|S)Nr)rrrrrr	�__call__aszNullLogger.__call__N)rrr
r#r$rrrr	r!]sr!c@s|eZdZdd�Zddd�Zddd�Zd	d
�Zdd�Zd
d�Zdd�Z	dd�Z
dd�Zdd�Zdd�Z
dd�Zdd�ZeZdS)�LexercCs�d|_d|_i|_i|_i|_d|_g|_d|_i|_i|_	i|_
d|_d|_d|_
d|_d|_d|_d|_d|_d|_d|_d|_d|_dS)N�INITIALr��F)�lexre�	lexretext�
lexstatere�lexstateretext�lexstaterenames�lexstate�
lexstatestack�lexstateinfo�lexstateignore�lexstateerrorf�lexstateeoff�
lexreflags�lexdatar�lexlen�	lexerrorf�lexeoff�	lextokens�	lexignore�lexliterals�	lexmoduler�lexoptimize)rrrr	r
ts.zLexer.__init__NcCs�tj|�}|r�i}x�|jj�D]�\}}g}x\|D]T\}}g}	xF|D]>}
|
sV|
drb|	j|
�qB|	jt||
dj�|
df�qBWq0W|j||	f�|||<qW||_i|_x(|jj�D]\}}t||j�|j|<q�W||_|S)Nrr()�copyr+�items�append�getattrrr2r<)r�object�cZnewtab�keyZritemZnewreZcreZfindexZ	newfindexr�efrrr	�clone�s(


&zLexer.cloner'cCs�t|tj�rtd��|jd�d}tjj||�d}t|d����}|j	d|t
f�|j	dtt��|j	dtt
|j���|j	d	t|j��|j	d
t|j��|j	dt|j��i}xb|jj�D]T\}}g}	x>t||j||j|�D]"\\}
}}}
|	j|t||
�f�q�W|	||<q�W|j	dt|��|j	d
t|j��i}x,|jj�D]\}}|�rl|jnd||<�qXW|j	dt|��i}x,|jj�D]\}}|�r�|jnd||<�q�W|j	dt|��WdQRXdS)Nz&Won't overwrite existing lextab module�.r(z.py�wzJ# %s.py. This file automatically created by PLY (version %s). Don't edit!
z_tabversion   = %s
z_lextokens    = set(%s)
z_lexreflags   = %s
z_lexliterals  = %s
z_lexstateinfo = %s
z_lexstatere   = %s
z_lexstateignore = %s
z_lexstateerrorf = %s
z_lexstateeoff = %s
���)�
isinstance�types�
ModuleType�IOError�split�os�path�join�openr�__version__�repr�__tabversion__�tupler9r4r;r0r+r?�zipr,r-r@�_funcs_to_namesr1r2rr3)r�lextab�	outputdirZ
basetabmodule�filenameZtfZtabre�	statename�lre�titem�pat�funcZretext�renamesZtaberrrEZtabeofrrr	�writetab�s6(zLexer.writetabcCsZt|tj�r|}ntd|�tj|}t|dd�tkr@td��|j	|_
|j|_|j
|_|j
t|j�B|_|j|_|j|_i|_i|_xh|jj�D]Z\}}g}g}x4|D],\}}	|jtj||jtjB�t|	|�f�q�W||j|<||j|<q�Wi|_x&|jj�D]\}}
||
|j|<�qWi|_ x&|j!j�D]\}}
||
|j |<�q0W|j"d�dS)Nz	import %sZ_tabversionz0.0zInconsistent PLY versionr&)#rJrKrL�exec�sys�modulesrArU�ImportErrorZ
_lextokensr9Z_lexreflagsr4Z_lexliteralsr;�set�
lextokens_allZ
_lexstateinfor0Z_lexstateignorer1r+r,Z_lexstaterer?r@�re�compile�VERBOSE�_names_to_funcsr2Z_lexstateerrorfr3Z
_lexstateeoff�begin)rZtabfile�fdictrYr\r]r^Ztxtitemr_Z	func_namerErrr	�readtab�s8
(
z
Lexer.readtabcCs8|dd�}t|t�std��||_d|_t|�|_dS)Nr(zExpected a stringr)rJ�StringTypes�
ValueErrorr5r�lenr6)rrrCrrr	�input�s
zLexer.inputcCsd||jkrtd��|j||_|j||_|jj|d�|_|jj|d�|_	|j
j|d�|_||_dS)NzUndefined stater')
r+rqr)r,r*r1�getr:r2r7r3r8r.)r�staterrr	rms
zLexer.begincCs|jj|j�|j|�dS)N)r/r@r.rm)rrurrr	�
push_stateszLexer.push_statecCs|j|jj��dS)N)rmr/�pop)rrrr	�	pop_stateszLexer.pop_statecCs|jS)N)r.)rrrr	�
current_state!szLexer.current_statecCs|j|7_dS)N)r)r�nrrr	�skip'sz
Lexer.skipcCs~|j}|j}|j}|j}�x�||k�r|||kr<|d7}q�x�|jD]�\}}|j||�}|s`qFt�}|j�|_|j	|_	||_|j
}	||	\}
|_|
s�|jr�|j�|_|S|j�}P|j�}||_
||_||_|
|�}|s�|j}|j}P|j�s(|j|jk�r(td|
jj|
jj|
j|jf||d���|SW|||jk�rrt�}|||_|j	|_	|j|_||_|d|_|S|j�r�t�}|j|d�|_|j	|_	d|_||_
||_||_|j|�}||jk�r�td||||d���|j}|�s�q|S||_td|||f||d���qW|j�r\t�}d|_d|_|j	|_	||_||_
||_|j|�}|S|d|_|jdk�rztd��dS)	Nr(z4%s:%d: Rule '%s' returned an unknown token type '%s'rz&Scanning error. Illegal character '%s'z"Illegal character '%s' at index %d�eofr'z"No input string given with input())rr6r:r5r)�matchr�grouprr�	lastindexr�end�lexerZlexmatchr=rhr�__code__�co_filename�co_firstlinenorr;r7r8�RuntimeError)rrr6r:r5r)�lexindexfunc�m�tok�ir`Znewtokrrr	�token1s�




"

zLexer.tokencCs|S)Nr)rrrr	�__iter__�szLexer.__iter__cCs|j�}|dkrt�|S)N)r��
StopIteration)r�trrr	�next�sz
Lexer.next)N)r')rrr
r
rFrbrorsrmrvrxryr{r�r�r��__next__rrrr	r%ss

%(

nr%cCst|d|j�S)N�regex)rA�__doc__)r`rrr	�
_get_regex�sr�cCs0tj|�}|jj�}|j|jkr,|j|j�|S)N)rd�	_getframe�	f_globalsr>�f_locals�update)Zlevelsr�ldictrrr	�get_caller_module_dict�s


r�cCsJg}x@t||�D]2\}}|r8|dr8|j||df�q|j|�qW|S)Nrr()rWr@)Zfunclist�namelist�resultrr"rrr	rX�srXcCsHg}x>|D]6}|r6|dr6|j||d|df�q
|j|�q
W|S)Nrr()r@)r�rnr�rzrrr	rl�s
rlcCsj|sgSdj|�}y�tj|tj|B�}dgt|jj��d}|dd�}x�|jj�D]z\}}	|j|d�}
t	|
�t
jt
jfkr�|
||f||	<|||	<qV|
dk	rV|||	<|j
d�dkr�d||	<qVd||f||	<qVW||fg|g|gfStk
�rdtt|�d�}|dk�rd}t|d|�|||�\}}
}t||d�|||�\}}}|||
|||fSXdS)N�|r(�ignore_r�)NN)rQrirjrk�max�
groupindex�valuesr?rtrrK�FunctionType�
MethodType�find�	Exception�intrr�_form_master_re)Zrelist�reflagsr��toknamesr�r)r�Z
lexindexnamesrr�Zhandler�Zllistr]ZlnamesZrlistZrreZrnamesrrr	r��s2



r�cCs�d}|jd�}x0t|dd�d�D]\}}||kr"|dkr"Pq"W|dkrZt|d|��}nd}d|krnt|�}dj||d��}||fS)Nr(�_�ANYr&)r&)rN�	enumeraterVrQ)r�namesZnonstate�partsr��part�statesZ	tokennamerrr	�_statetokens
r�c@sfeZdZddd�Zdd�Zdd�Zd	d
�Zdd�Zd
d�Zdd�Z	dd�Z
dd�Zdd�Zdd�Z
dS)�LexerReflectNrcCsL||_d|_g|_||_ddi|_t�|_d|_|dkrBtt	j
�n||_dS)Nr&�	inclusiveF)r�Z
error_func�tokensr��	stateinforgrerrrd�stderr�log)rr�r�r�rrr	r
/s
zLexerReflect.__init__cCs$|j�|j�|j�|j�dS)N)�
get_tokens�get_literals�
get_states�	get_rules)rrrr	�get_all:szLexerReflect.get_allcCs|j�|j�|j�|jS)N)�validate_tokens�validate_literals�validate_rulesr)rrrr	�validate_allAszLexerReflect.validate_allcCsp|jjdd�}|s(|jjd�d|_dSt|ttf�sL|jjd�d|_dS|sf|jjd�d|_dS||_dS)Nr�zNo token list is definedTztokens must be a list or tupleztokens is empty)r�rtr�rrJ�listrVr�)rr�rrr	r�HszLexerReflect.get_tokenscCsTi}xJ|jD]@}tj|�s.|jjd|�d|_||krD|jjd|�d||<qWdS)NzBad token name '%s'TzToken '%s' multiply definedr()r��_is_identifierr}r�rr)rZ	terminalsrzrrr	r�\s
zLexerReflect.validate_tokenscCs |jjdd�|_|jsd|_dS)N�literalsr')r�rtr�)rrrr	r�gszLexerReflect.get_literalscCspyDx>|jD]4}t|t�s&t|�dkr
|jjdt|��d|_q
WWn&tk
rj|jjd�d|_YnXdS)Nr(z.Invalid literal %s. Must be a single characterTzIInvalid literals specification. literals must be a sequence of characters)r�rJrprrr�rrT�	TypeError)rrCrrr	r�mszLexerReflect.validate_literalscCs�|jjdd�|_|jr�t|jttf�s:|jjd�d|_n�x�|jD]�}t|t�s^t|�dkrx|jjdt	|��d|_qB|\}}t|t
�s�|jjdt	|��d|_qB|dkp�|dks�|jjd	|�d|_qB||jkr�|jjd
|�d|_qB||j|<qBWdS)Nr�z)states must be defined as a tuple or listTr�zMInvalid state specifier %s. Must be a tuple (statename,'exclusive|inclusive')zState name %s must be a stringr��	exclusivez:State type for state %s must be 'inclusive' or 'exclusive'zState '%s' already defined)r�rtr�rJrVr�r�rrrrTrpr�)rrr"Z	statetyperrr	r�xs0

zLexerReflect.get_statesc	CsRdd�|jD�}i|_i|_i|_i|_i|_i|_x"|jD]}g|j|<g|j|<q<Wt|�dkrz|j	j
d�d|_
dS�x�|D�]x}|j|}t||j�\}}||j|<t|d��rX|dkr�x�|D]}||j|<q�Wn||dkr�xr|D]}||j|<q�WnZ|d	k�r2|j
j}|j
j}|j	j
d
|||j�d|_
n$x�|D]}|j|j||f��q8Wq�t|t��r�|d	k�r�x|D]}||j|<�qtWd|k�r�|j	jd|�nD|dk�r�|j	j
d
|�d|_
n$x8|D]}|j|j||f��q�Wq�|j	j
d|�d|_
q�Wx$|jj�D]}|jdd�d��qWx&|jj�D]}|jdd�dd��q2WdS)NcSs g|]}|dd�dkr|�qS)Nr�Zt_r)�.0rrrr	�
<listcomp>�sz*LexerReflect.get_rules.<locals>.<listcomp>rz+No rules of the form t_rulename are definedTr$rr|�ignorez,%s:%d: Rule '%s' must be defined as a string�\z#%s contains a literal backslash '\'z'Rule '%s' must be defined as a functionz&%s not defined as a function or stringcSs|djjS)Nr()r�r�)�xrrr	�<lambda>�sz(LexerReflect.get_rules.<locals>.<lambda>)rDcSst|d�S)Nr()rr)r�rrr	r��s)rD�reverse)r�r��funcsym�strsymr��errorf�eoffr�rrr�rr��hasattrr�r�r�rr@rJrprr��sort)	rZtsymbolsrrr�r��tokname�line�filerrr	r��sb












zLexerReflect.get_rulescCs��x~|jD�]r}�x�|j|D�]r\}}|jj}|jj}tj|�}|jj|�|j	|}t
|tj�rjd}nd}|jj
}	|	|kr�|jjd|||j�d|_q|	|kr�|jjd|||j�d|_qt|�s�|jjd|||j�d|_qyJtjd|t|�ftj|jB�}
|
jd��r*|jjd	|||j�d|_Wqtjk
�r�}zD|jjd
|||j|�dt|�k�rz|jjd|||j�d|_WYdd}~XqXqW�x|j|D�]\}}
|j	|}|d
k�r�|jjd|�d|_�q�||jk�r|jd�dk�r|jjd||�d|_�q�y@tjd||
ftj|jB�}
|
jd��rN|jjd|�d|_WnTtjk
�r�}z4|jjd||�d|
k�r�|jjd|�d|_WYdd}~XnX�q�W|j|�r�|j|�r�|jjd|�d|_|jj|d�}|r
|}|jj}|jj}tj|�}|jj|�t
|tj��r,d}nd}|jj
}	|	|k�r\|jjd|||j�d|_|	|kr
|jjd|||j�d|_q
Wx|jD]}|j|��q�WdS)Nr�r(z'%s:%d: Rule '%s' has too many argumentsTz%%s:%d: Rule '%s' requires an argumentz2%s:%d: No regular expression defined for rule '%s'z
(?P<%s>%s)r'z<%s:%d: Regular expression for rule '%s' matches empty stringz3%s:%d: Invalid regular expression for rule '%s'. %s�#z6%s:%d. Make sure '#' in rule '%s' is escaped with '\#'rz'Rule '%s' must be defined as a functionr�rz-Rule '%s' defined for an unspecified token %sz5Regular expression for rule '%s' matches empty stringz,Invalid regular expression for rule '%s'. %sz/Make sure '#' in rule '%s' is escaped with '\#'zNo rules defined for state '%s')r�r�r�r�r��inspectZ	getmodulere�addr�rJrKr��co_argcountr�rrr�rirjrkr�r}r�r�r�r�rt�validate_module)rru�fnamerr�r��moduler�Zreqargs�nargsrC�er"�rZefuncrrr	r��s�

 







zLexerReflect.validate_rulescCs�ytj|�\}}Wntk
r&dSXtjd�}tjd�}i}|d7}xv|D]n}|j|�}|sj|j|�}|r�|jd�}	|j|	�}
|
s�|||	<n$tj|�}|j	j
d|||	|
�d|_
|d7}qNWdS)Nz\s*def\s+(t_[a-zA-Z_0-9]*)\(z\s*(t_[a-zA-Z_0-9]*)\s*=r(z7%s:%d: Rule %s redefined. Previously defined on line %dT)r�ZgetsourcelinesrMrirjr}r~rtZ
getsourcefiler�r)rr��linesZlinenZfreZsreZ	counthashr�r�r"�prevr[rrr	r�@s*








zLexerReflect.validate_module)Nr)rrr
r
r�r�r�r�r�r�r�r�r�r�rrrr	r�.s
Bgr�FrYc
%s�|dkrd}d}
ddi}t�}||_|	dkr6ttj�}	|rL|dkrLttj�}|rT|��r��fdd�t��D�}
t|
�}
d|
kr�tj|
dj|
d<nt	d�}
|
j
d	�}|r�t|t�r�d
|kr�|d
|}t
|
|	|d�}|j�|s�|j�r�td��|o�|�r4y |j||
�|ja|ja|a|Stk
�r2YnX|�rd|jd
|j�|jd|j�|jd|j�t�|_x|jD]}|jj|��qtWt|jttf��r�t|jd��j |j�|_!n|j|_!|jt|j!�B|_"|j}i}x�|D]�}g}xX|j#|D]J\}}|j$j%}|j$j&}|j'd|t(|�f�|�r�|jd|t(|�|��q�Wx@|j)|D]2\}}|j'd||f�|�rP|jd|||��qPW|||<�q�W|�r�|jd�xt|D]l}t*||||
|j+�\}}}||j,|<||j-|<||j.|<|�r�x&t/|�D]\}}|jd|||��q�W�q�Wxl|j0�D]`\}}|dk�r$|dk�r$|j,|j1|j,d�|j-|j1|j-d�|j.|j1|j.d��q$W||_2|j,d|_3|j-d|_4||_5|j6|_7|j7j
dd�|_8|j9|_:|j9j
dd�|_;|j;�s�|	j<d�|j=|_>|j=j
dd�|_?x�|j0�D]�\} }|dk�r\| |j9k�r:|	j<d| �| |j6k�r�|j8�r�|	j<d| �nJ|dk�r| |j9k�r�|j9j
dd�|j9| <| |j6k�r|j6j
dd�|j6| <�qW|ja|ja|a|�r�|�r�|dk�rBt|t@jA��r�|j}!nNd
|k�r�|
d}!n:|jBd
�}"d
j |"dd��}#tCd|#�tDtj|#dd�}!tEjFjG|!�}y|jH||�Wn6tIk
�r�}$z|	j<d||$f�WYdd}$~$XnX|S)NrYr&r�csg|]}|t�|�f�qSr)rA)r��k)r�rr	r�zszlex.<locals>.<listcomp>�__file__rr��__package__rG)r�r�zCan't build lexerzlex: tokens   = %rzlex: literals = %rzlex: states   = %rrz
(?P<%s>%s)z(lex: Adding rule %s -> '%s' (state '%s')z#lex: ==== MASTER REGEXS FOLLOW ====z"lex: state '%s' : regex[%d] = '%s'r'zNo t_error rule is definedr�z1No error rule is defined for exclusive state '%s'z2No ignore rule is defined for exclusive state '%s'r(z	import %sz#Couldn't write lextab module %r. %srI)Jr%r=rrdr��dir�dictrer�r�rtrJrr�r�r��SyntaxErrorror�rsr�rfrr�r�r�rgr9r�r�rVrrQr;rhr�r�r�r�r@r�r�r�r�r+r,r-r�r?�extendr0r)r*r4r�r1r:r�r2r7rr�r3r8rKrLrNrcrArOrP�dirnamerbrM)%r�rBr �optimizerYr�ZnowarnrZZdebuglogZerrorlogr�r�ZlexobjZ_itemsZpkgZlinforzZregexsruZ
regex_listr�rr�r�r"r�r)Zre_textZre_namesr�r�styperZsrcfiler�Zpkgnamer�r)r�r	�lex_s�
















$r�cCs�|sVy&tjd}t|�}|j�}|j�Wn*tk
rTtjjd�tjj�}YnX|rb|j	}nt	}||�|rz|j
}nt
}x0|�}|s�Ptjjd|j|j|j
|jf�q�WdS)Nr(z/Reading from standard input (type EOF to end):
z(%s,%r,%d,%d)
)rd�argvrR�read�close�
IndexError�stdoutr�stdinrsr�rrrr)r��datar[rZ_inputZ_tokenr�rrr	�runmains*
r�cs�fdd�}|S)Ncs t�d�rt��|_n�|_|S)Nr$)r�r�r�)r)r�rr	�	set_regexBs
zTOKEN.<locals>.set_regexr)r�r�r)r�r	�TOKENAsr�)
NNFFrYrFNNN)NN)"rSrUrirdrKr>rOr�Z
StringTypeZUnicodeTyperp�AttributeErrorr�bytesrjr�r�rrBrrr!r%r�r�rXrlr�r�r�r�r�r��Tokenrrrr	�<module>"sD
F

(3
@
"
site-packages/ply/__pycache__/__init__.cpython-36.opt-1.pyc000064400000000242147511334660017434 0ustar003

r��Wf�@sdZddgZdS)z3.9ZlexZyaccN)�__version__�__all__�rr�/usr/lib/python3.6/__init__.py�<module>ssite-packages/ply/__pycache__/ygen.cpython-36.opt-1.pyc000064400000003215147511334660016642 0ustar003

r��W��@s:ddlZddlZdd�Zdd�Zdd�Zedkr6e�dS)	�NcCsht|�}d|}d|}x |D]\}}|j�j|�rPqWx |D]\}}|j�j|�r@Pq@W|d|fS)Nz
#--! %s-startz#--! %s-end�)�	enumerate�strip�
startswith�endswith)�lines�tagZsrclinesZ	start_tagZend_tagZstart_index�lineZ	end_index�r
�/usr/lib/python3.6/ygen.py�get_source_range
srcCsFg}d}d|}x0|D](}|j�j|�r0|}q|r|j|�qW|S)NTz#--! %s)rr�append)rrZfiltered_lines�includeZtag_textr	r
r
r�filter_sections
rcCs�tjjt�}tjtjj|d�tjj|d��ttjj|d�d��}|j�}WdQRXt	|d�\}}t	|d�\}}t	|d�\}}|||�}	t
|	d�}
t
|
d�}||||�<|
|||�<d	d
�|D�}ttjj|d�d��}|j|�WdQRXtd�dS)
Nzyacc.pyzyacc.py.bak�rZ
parsedebugZparseoptzparseopt-notrack�DEBUGZTRACKINGcSsg|]}|j�d�qS)�
)�rstrip)�.0r	r
r
r�
<listcomp>>szmain.<locals>.<listcomp>�wzUpdated yacc.py)
�os�path�dirname�__file__�shutilZcopy2�join�open�	readlinesrr�
writelines�print)r�frZparse_startZ	parse_endZparseopt_startZparseopt_endZparseopt_notrack_startZparseopt_notrack_endZ
orig_linesZparseopt_linesZparseopt_notrack_linesr
r
r�main's  

r"�__main__)Zos.pathrrrrr"�__name__r
r
r
r�<module>
ssite-packages/ply/__pycache__/ctokens.cpython-36.opt-1.pyc000064400000004127147511334660017351 0ustar003

r��Wi�4@sLdddddddddd	d
ddd
ddddddddddddddddddd d!d"d#d$d%d&d'd(d)d*d+d,d-d.d/d0d1d2d3g4Zd4Zd5Zd6Zd7Zd8Zd9Zd:Zd;Zd<Z	d=Z
d>Zd?Zd@Z
dAZdBZdCZdDZdEZdFZdGZdHZdIZdJZdKZdLZdMZdNZdOZdPZdQZdRZdSZ dTZ!dUZ"dVZ#dWZ$dXZ%dYZ&dZZ'd[Z(d\Z)d]Z*d^Z+d_Z,d`Z-daZ.dbZ/dcZ0ddZ1deZ2dfZ3dgdh�Z4didj�Z5dkS)lZIDZTYPEIDZINTEGERZFLOAT�STRINGZ	CHARACTER�PLUS�MINUSZTIMESZDIVIDEZMODULO�ORZANDZNOTZXORZLSHIFTZRSHIFTZLORZLANDZLNOTZLTZLEZGTZGEZEQZNEZEQUALSZ
TIMESEQUALZDIVEQUALZMODEQUAL�	PLUSEQUALZ
MINUSEQUALZLSHIFTEQUALZRSHIFTEQUALZANDEQUALZXOREQUALZOREQUALZ	INCREMENTZ	DECREMENTZARROWZTERNARYZLPARENZRPARENZLBRACKETZRBRACKET�LBRACE�RBRACE�COMMAZPERIOD�SEMI�COLON�ELLIPSISz\+�-z\*�/�%z\|�&�~z\^z<<z>>z\|\|z&&�!�<�>z<=z>=z==z!=�=z\*=z/=z%=z\+=z-=z<<=z>>=z&=z\|=z\^=z\+\+z--z->z\?z\(z\)z\[z\]z\{z\}�,z\.�;�:z\.\.\.z[A-Za-z_][A-Za-z0-9_]*z!\d+([uU]|[lL]|[uU][lL]|[lL][uU])?z?((\d+)(\.\d+)(e(\+|-)?(\d+))? | (\d+)e(\+|-)?(\d+))([lL]|[fF])?z\"([^\\\n]|(\\.))*?\"z(L)?\'([^\\\n]|(\\.))*?\'cCs|jj|jjd�7_|S)z/\*(.|\n)*?\*/�
)�lexer�lineno�value�count)�t�r�/usr/lib/python3.6/ctokens.py�	t_COMMENTvsr cCs|jjd7_|S)z//.*\n�)rr)rrrr�t_CPPCOMMENT|sr"N)6�tokensZt_PLUSZt_MINUSZt_TIMESZt_DIVIDEZt_MODULOZt_ORZt_ANDZt_NOTZt_XORZt_LSHIFTZt_RSHIFTZt_LORZt_LANDZt_LNOTZt_LTZt_GTZt_LEZt_GEZt_EQZt_NEZt_EQUALSZt_TIMESEQUALZ
t_DIVEQUALZ
t_MODEQUALZt_PLUSEQUALZt_MINUSEQUALZ
t_LSHIFTEQUALZ
t_RSHIFTEQUALZ
t_ANDEQUALZ	t_OREQUALZ
t_XOREQUALZt_INCREMENTZt_DECREMENTZt_ARROWZ	t_TERNARYZt_LPARENZt_RPARENZ
t_LBRACKETZ
t_RBRACKETZt_LBRACEZt_RBRACEZt_COMMAZt_PERIODZt_SEMIZt_COLONZ
t_ELLIPSISZt_IDZ	t_INTEGERZt_FLOATZt_STRINGZt_CHARACTERr r"rrrr�<module>s�

site-packages/ply/__pycache__/yacc.cpython-36.opt-1.pyc000064400000151411147511334660016621 0ustar003

�}�`F�
@sddlZddlZddlZddlZddlZddlZddlZdZdZ	dZ
dZdZdZ
dZd	Zd
ZdZejddkrteZneZejZGdd�de�ZGd
d�de�ZGdd�de�Zdd�Zdd�Zdada da!dZ"dd�Z#dd�Z$dd�Z%dd�Z&Gdd�d�Z'Gd d!�d!�Z(Gd"d#�d#�Z)ej*d$�Z+Gd%d&�d&e�Z,Gd'd(�d(e�Z-Gd)d*�d*e�Z.d+d,�Z/Gd-d.�d.e�Z0Gd/d0�d0e�Z1Gd1d2�d2e�Z2Gd3d4�d4e�Z3d5d6�Z4d7d8�Z5Gd9d:�d:e�Z6Gd;d<�d<e3�Z7d=d>�Z8d?d@�Z9GdAdB�dBe�Z:de
deddd	deddddf
dCdD�Z;dS)E�Nz3.9z3.8Tz
parser.out�parsetab�LALR�F�(c@s4eZdZdd�Zdd�ZeZdd�Zdd�ZeZd	S)
�	PlyLoggercCs
||_dS)N)�f)�selfr�r	�/usr/lib/python3.6/yacc.py�__init__nszPlyLogger.__init__cOs|jj||d�dS)N�
)r�write)r�msg�args�kwargsr	r	r
�debugqszPlyLogger.debugcOs|jjd||d�dS)Nz	WARNING: r)rr
)rrrrr	r	r
�warningvszPlyLogger.warningcOs|jjd||d�dS)NzERROR: r)rr
)rrrrr	r	r
�erroryszPlyLogger.errorN)	�__name__�
__module__�__qualname__rr�inforrZcriticalr	r	r	r
rmsrc@seZdZdd�Zdd�ZdS)�
NullLoggercCs|S)Nr	)r�namer	r	r
�__getattribute__�szNullLogger.__getattribute__cOs|S)Nr	)rrrr	r	r
�__call__�szNullLogger.__call__N)rrrrrr	r	r	r
rsrc@seZdZdS)�	YaccErrorN)rrrr	r	r	r
r�srcCsPt|�}d|krt|�}t|�tkr4|dt�d}dt|�jt|�|f}|S)Nrz ...z<%s @ 0x%x> (%s))�repr�len�resultlimit�typer�id)�r�repr_str�resultr	r	r
�
format_result�sr%cCsBt|�}d|krt|�}t|�dkr(|Sdt|�jt|�fSdS)Nr�z<%s @ 0x%x>)rrr rr!)r"r#r	r	r
�format_stack_entry�sr'aPLY: Don't use global functions errok(), token(), and restart() in p_error().
Instead, invoke the methods on the associated parser instance:

    def p_error(p):
        ...
        # Use parser.errok(), parser.token(), parser.restart()
        ...

    parser = yacc.yacc()
cCstjt�t�S)N)�warnings�warn�_warnmsg�_errokr	r	r	r
�errok�s
r,cCstjt�t�S)N)r(r)r*�_restartr	r	r	r
�restart�s
r.cCstjt�t�S)N)r(r)r*�_tokenr	r	r	r
�token�s
r0cCs>|ja|ja|ja||�}y
bbbWntk
r8YnX|S)N)r,r+r0r/r.r-�	NameError)�	errorfuncr0�parserr"r	r	r
�call_errorfunc�s
r4c@seZdZdd�Zdd�ZdS)�
YaccSymbolcCs|jS)N)r )rr	r	r
�__str__�szYaccSymbol.__str__cCst|�S)N)�str)rr	r	r
�__repr__�szYaccSymbol.__repr__N)rrrr6r8r	r	r	r
r5�sr5c@sfeZdZddd�Zdd�Zdd�Zdd	�Zd
d�Zdd
�Zdd�Z	dd�Z
dd�Zdd�Zdd�Z
dS)�YaccProductionNcCs||_||_d|_d|_dS)N)�slice�stack�lexerr3)r�sr;r	r	r
r�szYaccProduction.__init__cCsBt|t�rdd�|j|D�S|dkr2|j|jS|j|jSdS)NcSsg|]
}|j�qSr	)�value)�.0r=r	r	r
�
<listcomp>�sz.YaccProduction.__getitem__.<locals>.<listcomp>r)�
isinstancer:r>r;)r�nr	r	r
�__getitem__�s

zYaccProduction.__getitem__cCs||j|_dS)N)r:r>)rrB�vr	r	r
�__setitem__�szYaccProduction.__setitem__cCsdd�|j||�D�S)NcSsg|]
}|j�qSr	)r>)r?r=r	r	r
r@�sz/YaccProduction.__getslice__.<locals>.<listcomp>)r:)r�i�jr	r	r
�__getslice__�szYaccProduction.__getslice__cCs
t|j�S)N)rr:)rr	r	r
�__len__�szYaccProduction.__len__cCst|j|dd�S)N�linenor)�getattrr:)rrBr	r	r
rJszYaccProduction.linenocCs||j|_dS)N)r:rJ)rrBrJr	r	r
�
set_linenoszYaccProduction.set_linenocCs,t|j|dd�}t|j|d|�}||fS)NrJr�	endlineno)rKr:)rrB�	startlineZendliner	r	r
�linespanszYaccProduction.linespancCst|j|dd�S)N�lexposr)rKr:)rrBr	r	r
rPszYaccProduction.lexposcCs,t|j|dd�}t|j|d|�}||fS)NrPr�	endlexpos)rKr:)rrB�startpos�endposr	r	r
�lexspanszYaccProduction.lexspancCst�dS)N)�SyntaxError)rr	r	r
rszYaccProduction.error)N)rrrrrCrErHrIrJrLrOrPrTrr	r	r	r
r9�s
r9c@s\eZdZdd�Zdd�Zdd�Zdd�Zd	d
�Zdd
d�Zddd�Z	ddd�Z
ddd�ZdS)�LRParsercCs0|j|_|j|_|j|_||_|j�d|_dS)NT)	�lr_productions�productions�	lr_action�action�lr_goto�gotor2�set_defaulted_states�errorok)rZlrtabZerrorfr	r	r
rszLRParser.__init__cCs
d|_dS)NT)r^)rr	r	r
r,&szLRParser.errokcCs@|jdd�=|jdd�=t�}d|_|jj|�|jjd�dS)Nz$endr)�
statestack�symstackr5r �append)r�symr	r	r
r.)szLRParser.restartcCsTi|_xH|jj�D]:\}}t|j��}t|�dkr|ddkr|d|j|<qWdS)N�r)�defaulted_statesrZ�items�list�valuesr)r�state�actionsZrulesr	r	r
r]9s
zLRParser.set_defaulted_statescCs
i|_dS)N)rd)rr	r	r
�disable_defaulted_states@sz!LRParser.disable_defaulted_statesNFcCsZ|str.t|t�rttj�}|j|||||�S|rD|j|||||�S|j|||||�SdS)N)	�	yaccdevelrA�intr�sys�stderr�
parsedebug�parseopt�parseopt_notrack)r�inputr<r�tracking�	tokenfuncr	r	r
�parseCs

zLRParser.parsec Cs�d}g}|j}|j}	|j}
|j}td�}d}
|jd�|sLddlm}|j}||_||_	|dk	rj|j
|�|dkrz|j}n|}||_g}||_g}||_
||_d}|jd�t�}d|_|j|�d}�x�|jd�|jd|�||k�r.|�s|�s�|�}n|j�}|�st�}d|_|j}||j|�}n||}|jd||�|jd	d
djdd
�|D�dd��t|�fj��|dk	�r�|dk�r�|j|�|}|jd|�|j|�d}|
r�|
d8}
q�|dk�rN|
|}|j}|j}t�}||_d|_|�rB|jd|jddjdd
�||d�D��d|	|d%||�n|jd|jg|	|d&|�|�r�||dd�}||d<|�r�|d}|j|_|j|_|d'}t|d|j�|_t|d|j�|_||_ yd||d�=||_!|j"|�||d�=|jdt#|d��|j|�|	|d(|}|j|�Wq�t$k
�r�|j|�|j%|dd)��|j�|d*}d|_d|_|}t&}
d|_'Yq�Xq�n�|�r�|j|_|j|_|g}||_ yL||_!|j"|�|jdt#|d��|j|�|	|d+|}|j|�Wq�t$k
�rJ|j|�|j�|d,}d|_d|_|}t&}
d|_'Yq�Xq�|dk�r�|d-}t|dd�}|jdt#|��|jd�|S|dk�r�|j(dd
djdd
�|D�dd��t|�fj��|
dk�s�|j'�r�t&}
d|_'|}|jdk�r�d}|j)�rB|�rt*|d��r||_||_!t+|j)||�}|j'�r�|}d}q�n`|�r�t*|d��r\|j}nd}|�r~t,j-j.d ||jf�nt,j-j.d!|j�nt,j-j.d"�dSnt&}
t|�dk�r�|jdk�r�d}d}d}|dd�=q�|jdk�r�dS|jdk�r�|d.}|jdk�r6|�r0t|d|j�|_t|d#|j�|_d}q�t�}d|_t*|d��r\|j|_|_t*|d#��rv|j|_|_||_|j|�|}q�|j�}|�r�|j|_|j|_|j�|d/}q�t/d$��q�WdS)0NrzPLY: PARSE DEBUG STARTrc)�lexz$end�zState  : %sz#Defaulted state %s: Reduce using %dzStack  : %sz%s . %s� cSsg|]
}|j�qSr	)r )r?�xxr	r	r
r@�sz'LRParser.parsedebug.<locals>.<listcomp>z Action : Shift and goto state %sz3Action : Reduce rule [%s] with %s and goto state %d�[�,cSsg|]}t|j��qSr	)r'r>)r?Z_vr	r	r
r@�s�]rMrQzResult : %srFr>zDone   : Returning %szPLY: PARSE DEBUG ENDzError  : %scSsg|]
}|j�qSr	)r )r?ryr	r	r
r@Bsr<rJz(yacc: Syntax error at line %d, token=%s
zyacc: Syntax error, token=%sz yacc: Parse error in input. EOF
rPzyacc: internal parser error!!!
���r}r}r}r}r}r}r}r}r}r})0rZr\rXrdr9rrwrvr<r3rrr0r_r`r;rar5r r�pop�get�joinr7�lstriprrr>rJrPrKrMrQr:rh�callabler%rU�extend�error_countr^rr2�hasattrr4rmrnr
�RuntimeError) rrrr<rrsrt�	lookahead�lookaheadstackrir\�prodrd�pslice�
errorcountrv�	get_tokenr_r`�errtokenrbrh�ltype�t�p�pname�plen�targ�t1rBr$�tokrJr	r	r
ro\s~





.






$








.


zLRParser.parsedebugc Csvd}g}|j}|j}	|j}
|j}td�}d}
|sBddlm}|j}||_||_|dk	r`|j	|�|dkrp|j
}n|}||_
g}||_g}||_||_
d}|jd�t�}d|_|j|�d}�x�||k�r|s�|s�|�}n|j�}|s�t�}d|_|j}||j|�}n||}|dk	�rh|dk�rN|j|�|}|j|�d}|
r�|
d8}
q�|dk�rF|
|}|j}|j}t�}||_d|_|�r�||dd�}||d<|�r�|d}|j|_|j|_|d}t|d|j�|_t|d|j�|_||_yP||d�=||_|j|�||d�=|j|�|	|d|}|j|�Wq�tk
�r�|j|�|j|dd��|j�|d}d|_d|_|}t }
d|_!Yq�Xq�n�|�r�|j|_|j|_|g}||_y8||_|j|�|j|�|	|d|}|j|�Wq�tk
�rB|j|�|j�|d}d|_d|_|}t }
d|_!Yq�Xq�|dk�rh|d}t|d	d�}|S|dk�rf|
dk�s�|j!�rNt }
d|_!|}|jdk�r�d}|j"�r�|�r�t#|d
��r�||_||_t$|j"||�}|j!�rL|}d}q�n`|�r<t#|d��r|j}nd}|�r(t%j&j'd||jf�nt%j&j'd
|j�nt%j&j'd�dSnt }
t|�dk�r�|jdk�r�d}d}d}|dd�=q�|jdk�r�dS|jdk�r6|d}|jdk�r�|�r�t|d|j�|_t|d|j�|_d}q�t�}d|_t#|d��r|j|_|_t#|d��r |j|_|_||_|j|�|}q�|j�}|�rT|j|_|j|_|j�|d}q�t(d��q�WdS)Nrrc)rvz$endrMrQrFr>r<rJz(yacc: Syntax error at line %d, token=%s
zyacc: Syntax error, token=%sz yacc: Parse error in input. EOF
rPzyacc: internal parser error!!!
r}r}r}r}r}r}r}r}r}))rZr\rXrdr9rwrvr<r3rrr0r_r`r;rar5r r~rrrr>rJrPrKrMrQr:rhr�rUr�r�r^r2r�r4rmrnr
r�) rrrr<rrsrtr�r�rir\r�rdr�r�rvr�r_r`r�rbrhr�r�r�r�r�r�r�rBr$r�rJr	r	r
rp�sX




















zLRParser.parseoptcCs�d}g}|j}|j}	|j}
|j}td�}d}
|sBddlm}|j}||_||_|dk	r`|j	|�|dkrp|j
}n|}||_
g}||_g}||_||_
d}|jd�t�}d|_|j|�d}�x||k�r|s�|s�|�}n|j�}|s�t�}d|_|j}||j|�}n||}|dk	�r|dk�rN|j|�|}|j|�d}|
r�|
d8}
q�|dk�r�|
|}|j}|j}t�}||_d|_|�rX||dd�}||d<||_yP||d�=||_|j|�||d�=|j|�|	|d|}|j|�Wq�tk
�rR|j|�|j|dd��|j�|d}d|_d|_|}t}
d|_Yq�Xq�n�|g}||_y8||_|j|�|j|�|	|d|}|j|�Wq�tk
�r�|j|�|j�|d}d|_d|_|}t}
d|_Yq�Xq�|dk�r|d}t|dd�}|S|dk�r�|
dk�s(|j�r�t}
d|_|}|jdk�rFd}|j�r�|�rht|d��rh||_||_t |j||�}|j�r�|}d}q�n`|�r�t|d	��r�|j!}nd}|�r�t"j#j$d
||jf�nt"j#j$d|j�nt"j#j$d�dSnt}
t|�dk�r(|jdk�r(d}d}d}|dd�=q�|jdk�r8dS|jdk�r�|d}|jdk�r^d}q�t�}d|_t|d	��r�|j!|_!|_%t|d
��r�|j&|_&|_'||_|j|�|}q�|j�}|j�|d}q�t(d��q�WdS)Nrrc)rvz$endrFr>r<rJz(yacc: Syntax error at line %d, token=%s
zyacc: Syntax error, token=%sz yacc: Parse error in input. EOF
rPzyacc: internal parser error!!!
r}r}r}r}r}r}r}r}))rZr\rXrdr9rwrvr<r3rrr0r_r`r;rar5r r~rrrr>r:rhr�rUr�r�r^rKr2r�r4rJrmrnr
rMrPrQr�)rrrr<rrsrtr�r�rir\r�rdr�r�rvr�r_r`r�rbrhr�r�r�r�r�r�rBr$r�rJr	r	r
rq�s8




















zLRParser.parseopt_notrack)NNFFN)NNFFN)NNFFN)NNFFN)rrrrr,r.r]rjrurorprqr	r	r	r
rVs

]
4rVz^[a-zA-Z0-9_-]+$c@sReZdZdZddd�Zdd�Zd	d
�Zdd�Zd
d�Zdd�Z	dd�Z
dd�ZdS)�
Productionr�rightNrwc	Cs�||_t|�|_||_||_d|_||_||_||_t	|j�|_	g|_
x$|jD]}||j
krN|j
j|�qNWg|_d|_
|jr�d|jdj|j�f|_nd|j|_dS)Nz%s -> %srxz
%s -> <empty>)r�tupler��number�funcr��file�line�precr�usymsra�lr_items�lr_nextr�r7)	rr�rr��
precedencer�r�r�r=r	r	r
rs$

zProduction.__init__cCs|jS)N)r7)rr	r	r
r6=szProduction.__str__cCsdt|�dS)NzProduction(�))r7)rr	r	r
r8@szProduction.__repr__cCs
t|j�S)N)rr�)rr	r	r
rICszProduction.__len__cCsdS)Nrcr	)rr	r	r
�__nonzero__FszProduction.__nonzero__cCs
|j|S)N)r�)r�indexr	r	r
rCIszProduction.__getitem__cCs�|t|j�krdSt||�}yt|j|d|_Wnttfk
rRg|_YnXy|j|d|_Wntk
r�d|_YnX|S)Nrc)rr��LRItem�	Prodnames�lr_after�
IndexError�KeyError�	lr_before)rrBr�r	r	r
�lr_itemMs
zProduction.lr_itemcCs|jr||j|_dS)N)r�r�)r�pdictr	r	r
�bind]szProduction.bind�r�r)r�Nrwr)rrr�reducedrr6r8rIr�rCr�r�r	r	r	r
r�s
r�c@s,eZdZdd�Zdd�Zdd�Zdd�Zd	S)
�MiniProductioncCs.||_||_||_d|_||_||_||_dS)N)rrr�r�r�r�r7)rr7rrr�r�r�r	r	r
rfszMiniProduction.__init__cCs|jS)N)r7)rr	r	r
r6oszMiniProduction.__str__cCs
d|jS)NzMiniProduction(%s))r7)rr	r	r
r8rszMiniProduction.__repr__cCs|jr||j|_dS)N)r�r�)rr�r	r	r
r�vszMiniProduction.bindN)rrrrr6r8r�r	r	r	r
r�es	r�c@s$eZdZdd�Zdd�Zdd�ZdS)r�cCsZ|j|_t|j�|_|j|_||_i|_|jj|d�t|j�|_t|j�|_|j	|_	dS)N�.)
rrfr�r��lr_index�
lookaheads�insertr�rr�)rr�rBr	r	r
r�szLRItem.__init__cCs,|jrd|jdj|j�f}n
d|j}|S)Nz%s -> %srxz
%s -> <empty>)r�rr�)rr=r	r	r
r6�s
zLRItem.__str__cCsdt|�dS)NzLRItem(r�)r7)rr	r	r
r8�szLRItem.__repr__N)rrrrr6r8r	r	r	r
r��sr�cCs:t|�d}x(|dkr4|||kr*||S|d8}qWdS)Nrcr)r)Zsymbols�	terminalsrFr	r	r
�rightmost_terminal�s
r�c@seZdZdS)�GrammarErrorN)rrrr	r	r	r
r��sr�c@s�eZdZdd�Zdd�Zdd�Zdd�Zd$dd
�Zd%dd�Zdd�Z	dd�Z
dd�Zdd�Zdd�Z
dd�Zdd�Zdd�Zd&d d!�Zd"d#�Zd	S)'�GrammarcCsfdg|_i|_i|_i|_x|D]}g|j|<q Wg|jd<i|_i|_i|_i|_t�|_	d|_
dS)Nr)�Productionsr��Prodmap�	Terminals�Nonterminals�First�Follow�
Precedence�set�UsedPrecedence�Start)rr��termr	r	r
r�s

zGrammar.__init__cCs
t|j�S)N)rr�)rr	r	r
rI�szGrammar.__len__cCs
|j|S)N)r�)rr�r	r	r
rC�szGrammar.__getitem__cCs8||jkrtd|��|dkr&td��||f|j|<dS)Nz,Precedence already specified for terminal %r�leftr��nonassocz:Associativity must be one of 'left','right', or 'nonassoc')r�r�r�)r�r�)rr��assoc�levelr	r	r
�set_precedence�s

zGrammar.set_precedenceNrwrcCs�||jkrtd|||f��|dkr6td|||f��tj|�sRtd|||f��x�t|�D]�\}}|ddkr�yJt|�}t|�dkr�td||||f��||jkr�g|j|<|||<w\Wntk
r�YnXtj|�r\|d	kr\td
||||f��q\Wd	|k�r�|dd	k�r$td||f��|dd	k�rBtd
||f��|d}	|jj	|	�}
|
�sptd|||	f��n|j
j|	�|dd�=nt||j�}	|jj	|	d�}
d||f}||j
k�r�|j
|}td|||fd|j|jf��t|j�}
||jk�rg|j|<xR|D]J}||jk�r.|j|j|
�n&||jk�rDg|j|<|j|j|
��qWt|
|||
|||�}|jj|�||j
|<y|j|j|�Wn"tk
�r�|g|j|<YnXdS)Nz7%s:%d: Illegal rule name %r. Already defined as a tokenrz5%s:%d: Illegal rule name %r. error is a reserved wordz%s:%d: Illegal rule name %rrz'"rczA%s:%d: Literal token %s in rule %r may only be a single characterz%precz!%s:%d: Illegal name %r in rule %rz+%s:%d: Syntax error. Nothing follows %%prec�zH%s:%d: Syntax error. %%prec can only appear at the end of a grammar rulez/%s:%d: Nothing known about the precedence of %rr�z%s -> %sz%s:%d: Duplicate rule %s. zPrevious definition at %s:%dr}���r}r�)r�r)r�r��_is_identifier�match�	enumerate�evalrrUr�rr��addr�r�r�r�r�r�rar�r�r�)r�prodname�symsr�r�r�rBr=�cZprecnameZprodprec�map�mZpnumberr�r�r	r	r
�add_production
sp










zGrammar.add_productioncCsT|s|jdj}||jkr&td|��tdd|g�|jd<|j|jd�||_dS)Nrczstart symbol %s undefinedrzS')r�rr�r�r�rar�)r�startr	r	r
�	set_startas
zGrammar.set_startcs>���fdd��t����jdjd��fdd��jD�S)NcsJ|�krdS�j|�x.�jj|g�D]}x|jD]}�|�q2Wq&WdS)N)r�r�rr�)r=r�r")�mark_reachable_from�	reachablerr	r
r�ts
z5Grammar.find_unreachable.<locals>.mark_reachable_fromrcsg|]}|�kr|�qSr	r	)r?r=)r�r	r
r@~sz,Grammar.find_unreachable.<locals>.<listcomp>)r�r�r�r�)rr	)r�r�rr
�find_unreachableqszGrammar.find_unreachablecCs�i}x|jD]}d||<qWd|d<x|jD]}d||<q,Wxpd}x`|jj�D]R\}}xH|D]@}x |jD]}||shd}PqhWd}|r\||s�d||<d}Pq\WqNW|s>Pq>Wg}	x@|j�D]4\}}
|
s�||jkr�||jkr�|dkr�q�|	j|�q�W|	S)NTz$endFr)r�r�r�rer�ra)rZ
terminatesr�rB�some_changeZplr�r=Zp_terminates�infiniter�r	r	r
�infinite_cycles�s:

zGrammar.infinite_cyclescCsXg}xN|jD]D}|sqx8|jD].}||jkr||jkr|dkr|j||f�qWqW|S)Nr)r�r�r�r�ra)rr$r�r=r	r	r
�undefined_symbols�szGrammar.undefined_symbolscCs8g}x.|jj�D] \}}|dkr|r|j|�qW|S)Nr)r�rera)rZ
unused_tokr=rDr	r	r
�unused_terminals�s
zGrammar.unused_terminalscCs<g}x2|jj�D]$\}}|s|j|d}|j|�qW|S)Nr)r�rer�ra)rZunused_prodr=rDr�r	r	r
�unused_rules�szGrammar.unused_rulescCsDg}x:|jD]0}||jkp"||jks|j||j|df�qW|S)Nr)r�r�r�ra)rZunusedZtermnamer	r	r
�unused_precedence�s
zGrammar.unused_precedencecCs`g}xV|D]D}d}x2|j|D]$}|dkr0d}q||kr|j|�qW|rLq
Pq
W|jd�|S)NFz<empty>T)r�ra)rZbetar$�xZx_produces_emptyrr	r	r
�_first	s

zGrammar._firstcCs�|jr|jSx|jD]}|g|j|<qWdg|jd<x|jD]}g|j|<q<Wxjd}xZ|jD]P}xJ|j|D]<}x6|j|j�D]&}||j|kr~|j|j|�d}q~WqlWq\W|sPPqPW|jS)Nz$endFT)r�r�r�r�r�r�ra)rr�rBr�r�rr	r	r
�
compute_first,s$zGrammar.compute_firstc
CsV|jr|jS|js|j�x|jD]}g|j|<q"W|sD|jdj}dg|j|<�x�d}x�|jdd�D]�}x�t|j�D]�\}}||jkrx|j|j|dd��}d}xB|D]:}	|	dkr�|	|j|kr�|j|j	|	�d}|	dkr�d}q�W|�s|t
|j�dkrxx:|j|jD]*}	|	|j|k�r|j|j	|	�d}�qWqxWqhW|sTPqTW|jS)Nrcz$endFz<empty>T)r�r�r�r�r�rr�r�r�rar)
rr��k�didaddr�rF�BZfstZhasemptyrr	r	r
�compute_followQs<

zGrammar.compute_followcCs�x�|jD]�}|}d}g}x�|t|�kr,d}ntt||�}y|j|j|d|_Wnttfk
rng|_YnXy|j|d|_Wntk
r�d|_YnX||_	|s�P|j
|�|}|d7}qW||_qWdS)Nrrc)r�rr�r�r�r�r�r�r�r�rar�)rr�ZlastlrirFr�Zlrir	r	r
�
build_lritems�s.

zGrammar.build_lritems)Nrwr)N)N)rrrrrIrCr�r�r�r�r�r�r�r�r�r�r�r�r�r	r	r	r
r��s $
T
@#%
;r�c@seZdZdS)�VersionErrorN)rrrr	r	r	r
r��sr�c@s,eZdZdd�Zdd�Zdd�Zdd�Zd	S)
�LRTablecCsd|_d|_d|_d|_dS)N)rYr[rW�	lr_method)rr	r	r
r�szLRTable.__init__cCs~t|tj�r|}ntd|�tj|}|jtkr:td��|j	|_
|j|_g|_
x|jD]}|j
jt|��qXW|j|_|jS)Nz	import %sz&yacc table file version is out of date)rA�types�
ModuleType�execrm�modulesZ_tabversion�__tabversion__r�Z
_lr_actionrYZ_lr_gotor[rWZ_lr_productionsrar�Z
_lr_methodr�Z
_lr_signature)r�modulerr�r	r	r
�
read_table�s

zLRTable.read_tablecCs�yddl}Wntk
r(ddl}YnXtjj|�s:t�t|d�}|j|�}|tkr^t	d��|j|�|_
|j|�}|j|�|_|j|�|_|j|�}g|_
x|D]}|j
jt|��q�W|j�|S)Nr�rbz&yacc table file version is out of date)�cPickle�ImportError�pickle�os�path�exists�open�loadr�r�r�rYr[rWrar��close)r�filenamer�Zin_fZ
tabversion�	signaturerXr�r	r	r
�read_pickle�s(




zLRTable.read_picklecCsx|jD]}|j|�qWdS)N)rWr�)rr�r�r	r	r
�bind_callables�szLRTable.bind_callablesN)rrrrr�rrr	r	r	r
r��sr�c	CsTi}x|D]}d||<q
Wg}i}x,|D]$}||dkr(t|||||||�q(W|S)Nr)�traverse)�X�R�FP�Nr�r;�Fr	r	r
�digraphs

rc	Cs|j|�t|�}|||<||�||<||�}xr|D]j}	||	dkrXt|	||||||�t||||	�||<x.|j|	g�D]}
|
||kr|||j|
�q|Wq4W|||k�rt||d<||||d<|j�}x2||k�rt||d<||||d<|j�}q�WdS)Nrrcr}r}r}r})rarr�minr�MAXINTr~)r�rr;rrrr�d�rel�y�a�elementr	r	r
rs(

rc@seZdZdS)�	LALRErrorN)rrrr	r	r	r
r)src@s�eZdZd$dd�Zdd�Zdd�Zd	d
�Zdd�Zd
d�Zdd�Z	dd�Z
dd�Zdd�Zdd�Z
dd�Zdd�Zdd�Zd%d d!�Zd&d"d#�ZdS)'�LRGeneratedTablerNcCs�|dkrtd|��||_||_|s*t�}||_i|_i|_|j|_i|_	i|_
d|_d|_d|_
g|_g|_g|_|jj�|jj�|jj�|j�dS)N�SLRrzUnsupported method %sr)rr)r�grammarr�r�logrYr[r�rW�
lr_goto_cache�lr0_cidhash�
_add_countZsr_conflictZrr_conflictZ	conflicts�sr_conflicts�rr_conflictsr�r�r��lr_parse_table)rr�methodrr	r	r
r4s,


zLRGeneratedTable.__init__cCsz|jd7_|dd�}d}xV|rtd}xH|D]@}x:|jD]0}t|dd�|jkrRq:|j|j�|j|_d}q:Wq.Wq W|S)NrcTF�	lr0_addedr)rr�rKrar�r)r�I�Jr�rGr�r	r	r
�lr0_closureYs
zLRGeneratedTable.lr0_closurec	Cs�|jjt|�|f�}|r|S|jj|�}|s:i}||j|<g}xP|D]H}|j}|rD|j|krD|jt|��}|s~i}||t|�<|j|�|}qDW|jd�}|s�|r�|j|�}||d<n||d<||jt|�|f<|S)Nz$end)rrr!r�r�rar)	rrr��gr=Zgsr�rB�s1r	r	r
�lr0_gotoss2





zLRGeneratedTable.lr0_gotoc	Cs�|j|jjdjg�g}d}x"|D]}||jt|�<|d7}q"Wd}x�|t|�kr�||}|d7}i}x$|D]}x|jD]}d||<qxWqlWxJ|D]B}|j||�}|s�t|�|jkr�q�t|�|jt|�<|j	|�q�WqFW|S)Nrrc)
rrr�r�rr!rr�r"ra)	r�CrFrZasyms�iir=r�r r	r	r
�	lr0_items�s(


zLRGeneratedTable.lr0_itemscCs�t�}d}xrxV|jjdd�D]B}|jdkr:|j|j�qx$|jD]}||krBPqBW|j|j�qWt|�|krrPt|�}qW|S)Nrrc)r�rr�rr�rr�)r�nullableZnum_nullabler�r�r	r	r
�compute_nullable_nonterminals�s
z.LRGeneratedTable.compute_nullable_nonterminalscCsrg}xht|�D]\\}}xR|D]J}|j|jdkr||j|jdf}|d|jjkr||kr|j|�qWqW|S)Nrc)r�r�rr�rr�ra)rr#�transZstatenorhr�r�r	r	r
�find_nonterminal_transitions�s
z-LRGeneratedTable.find_nonterminal_transitionscCs�i}|\}}g}|j|||�}xJ|D]B}	|	j|	jdkr&|	j|	jd}
|
|jjkr&|
|kr&|j|
�q&W|dkr�||jjdjdkr�|jd�|S)Nrcrz$end)r"r�rr�rr�rar�)rr#r(r&Zdr_setrhr�termsr r�rr	r	r
�dr_relation�s

zLRGeneratedTable.dr_relationcCsvg}|\}}|j|||�}|jjt|�d�}xB|D]:}	|	j|	jdkr4|	j|	jd}
|
|kr4|j||
f�q4W|S)Nrcr})r"rrr!r�rr�ra)rr#r(�emptyrrhrr rGr�rr	r	r
�reads_relation	s
zLRGeneratedTable.reads_relationcCs�i}i}i}x|D]}d||<qW�x�|D�]�\}}	g}
g}�xR||D�]D}|j|	krZqH|j}
|}x�|
|jdk�r
|
d}
|j|
}||f|kr�|
d}xH||jkr�|j||jjkr�P|j||kr�P|d}q�W|j||f�|j|||�}|jj	t
|�d�}qfWx�||D]t}|j|jk�r,�q|j|jk�r>�qd}xD||jk�rx|j||j|dk�rlP|d}�qDW|
j||f��qWqHWx2|D]*}||k�r�g||<||j||	f��q�W|
|||	f<q*W||fS)Nrcrr})rr�rr�rr�rar"rrr!)rr#r(r&ZlookdictZincludedictZdtransr�rhrZlookbZincludesr�r�rGZlir r"rFr	r	r
�compute_lookback_includesC	sX




z*LRGeneratedTable.compute_lookback_includescs0���fdd�}���fdd�}t|||�}|S)Ncs�j�|��S)N)r+)r�)r#r&rr	r
�<lambda>�	sz4LRGeneratedTable.compute_read_sets.<locals>.<lambda>cs�j�|��S)N)r-)r�)r#r&rr	r
r/�	s)r)rr#�ntransr&rrrr	)r#r&rr
�compute_read_sets�	sz"LRGeneratedTable.compute_read_setscs(�fdd�}�fdd�}t|||�}|S)Ncs�|S)Nr	)r�)�readsetsr	r
r/�	sz6LRGeneratedTable.compute_follow_sets.<locals>.<lambda>cs�j|g�S)N)r)r�)�inclsetsr	r
r/�	s)r)rr0r2r3rrrr	)r3r2r
�compute_follow_sets�	sz$LRGeneratedTable.compute_follow_setsc	Csxxr|j�D]f\}}x\|D]T\}}||jkr4g|j|<|j|g�}x*|D]"}||j|krF|j|j|�qFWqWq
WdS)N)rer�rra)	rZ	lookbacksZ	followsetr(Zlbrhr�rrr	r	r
�add_lookaheads�	s


zLRGeneratedTable.add_lookaheadscCsP|j�}|j|�}|j|||�}|j|||�\}}|j|||�}|j||�dS)N)r'r)r1r.r4r5)rr#r&r(r2ZlookdZincludedZ
followsetsr	r	r
�add_lalr_lookaheads�	s
z$LRGeneratedTable.add_lalr_lookaheadsc$	Cs<|jj}|jj}|j}|j}|j}i}|jd|j�|j�}|jdkrP|j	|�d}�x�|D�]�}	g}
i}i}i}
|jd�|jd|�|jd�x|	D]}|jd|j
|�q�W|jd��x|	D�]�}|j|jdk�r,|j
dkr�d|d	<||d	<�q�|jdk�r|j|}n|jj|j
}�x�|D�]�}|
j||d
|j
|ff�|j|�}|dk	�r�|dk�rB|||j
j\}}|j|d�\}}||k�s�||k�r�|dk�r�|j
||<|||<|�r�|�r�|jd
|�|jj||df�||j
jd7_nB||k�r|dk�rd||<n$|�s�|jd|�|jj||df�n�|dk�r�||}||j
}|j|jk�r�|j
||<|||<||}}||j
jd7_||j
jd8_n
||}}|jj|||f�|jd|||j
||�ntd|��n(|j
||<|||<||j
jd7_�q&Wq�|j}|j|d}||jjkr�|j|	|�}|jjt|�d�}|dkr�|
j||d|f�|j|�}|dk	�r�|dk�r�||k�r�td|��n�|dk�r�|||j
j\}}|j|d�\}}||k�s||k�rV|dk�rV|||j
jd8_|||<|||<|�s�|jd|�|jj||df�nL||k�rt|dk�rtd||<n.|�r�|�r�|jd
|�|jj||df�ntd|��q�|||<|||<q�Wi}xF|
D]>\}}}||k�r�|||k�r�|jd||�d|||f<�q�W|jd�d}xX|
D]P\}}}||k�r&|||k	�r&||f|k�r&|jd||�d}d|||f<�q&W|�r�|jd�i} x6|	D].}!x&|!jD]}"|"|jjk�r�d| |"<�q�W�q�WxL| D]D}#|j|	|#�}|jjt|�d�}|dk�r�||
|#<|jd|#|��q�W|||<|||<|
||<|d7}q\WdS)NzParsing method: %srrrwzstate %dz    (%d) %srczS'z$endzreduce using rule %d (%s)r�r�z3  ! shift/reduce conflict for %s resolved as reduce�reducer�z2  ! shift/reduce conflict for %s resolved as shiftZshiftz=  ! reduce/reduce conflict for %s resolved using rule %d (%s)zUnknown conflict in state %dzshift and go to state %dz Shift/shift conflict in state %dz    %-15s %sz  ! %-15s [ %s ]z"    %-30s shift and go to state %d)r�rr})r�rr}) rr�r�r[rYrrr�r%r6r�rr�rr�r�rarr�rr�r�rrr�r�r"rr!rr�r�)$rr�r�r\rZrZactionpr#�strZactlistZ	st_actionZ
st_actionpZst_gotor�Zlaheadsrr"ZsprecZslevelZrprecZrlevelZoldpZppZchosenpZrejectprFr rGZ	_actprintr�Znot_usedZnkeysr$r=rBr	r	r
r�	s




























[binary data omitted: remainder of a compiled CPython 3.6 bytecode copy of ply/yacc.py.
 The bytecode itself is not reproducible as text. The legible fragments are the names of
 LRGeneratedTable.write_table() / pickle_table(), get_caller_module_dict(), parse_grammar(),
 the ParserReflect class and the yacc() entry point, plus the template strings that
 write_table() emits when it caches the parser tables, roughly:

    # <tabmodule>
    # This file is automatically generated. Do not edit.
    _tabversion = %r
    _lr_method = %r
    _lr_signature = %r
    _lr_action_items = { %r: ([%r, ...], [%r, ...]), ... }
    _lr_action = {}
    for _k, _v in _lr_action_items.items():
       for _x,_y in zip(_v[0],_v[1]):
          if not _x in _lr_action:  _lr_action[_x] = {}
          _lr_action[_x][_k] = _y
    del _lr_action_items
    _lr_goto_items = { %r: ([%r, ...], [%r, ...]), ... }
    _lr_goto = {}
    for _k, _v in _lr_goto_items.items():
       for _x, _y in zip(_v[0], _v[1]):
           if not _x in _lr_goto: _lr_goto[_x] = {}
           _lr_goto[_x][_k] = _y
    del _lr_goto_items
    _lr_productions = [
      (%r,%r,%d,%r,%r,%d),
      (%r,%r,%d,None,None,None),
    ] ]
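For orientation, a parsetab.py generated from that template looks roughly like the sketch
below. This is an illustration only: the grammar, signature and table values are invented
and are not taken from this archive.

    # parsetab.py
    # This file is automatically generated. Do not edit.
    _tabversion = '3.8'                      # PLY table-format version

    _lr_method = 'LALR'

    _lr_signature = 'PLUS NUMBER\nexpr : expr PLUS NUMBER | NUMBER'   # hypothetical grammar signature

    # terminal -> ([states], [actions]); positive action = shift to state, negative = reduce by rule
    _lr_action_items = {'NUMBER': ([0, 3], [2, 4]),
                        'PLUS':   ([1, 4], [3, -1]),
                        '$end':   ([1, 2, 4], [0, -2, -1])}

    _lr_action = {}
    for _k, _v in _lr_action_items.items():
       for _x, _y in zip(_v[0], _v[1]):
          if not _x in _lr_action:  _lr_action[_x] = {}
          _lr_action[_x][_k] = _y
    del _lr_action_items

    _lr_goto_items = {'expr': ([0], [1])}

    _lr_goto = {}
    for _k, _v in _lr_goto_items.items():
       for _x, _y in zip(_v[0], _v[1]):
           if not _x in _lr_goto: _lr_goto[_x] = {}
           _lr_goto[_x][_k] = _y
    del _lr_goto_items

    # (rule text, nonterminal, rule length, handler function, defining file, line)
    _lr_productions = [
      ("S' -> expr", "S'", 1, None, None, None),
      ('expr -> expr PLUS NUMBER', 'expr', 3, 'p_expr_plus', 'calc.py', 12),
      ('expr -> NUMBER', 'expr', 1, 'p_expr_number', 'calc.py', 16),
    ]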
site-packages/ply/__pycache__/lex.cpython-36.opt-1.pyc  (tar member)

[binary data omitted: optimized CPython 3.6 bytecode of ply/lex.py, not reproducible as
 text. Legible fragments include the LexError, LexToken, PlyLogger, NullLogger, Lexer and
 LexerReflect classes, the lex(), runmain() and TOKEN helpers, and the field names
 Lexer.writetab() uses when caching a lextab module: _tabversion, _lextokens, _lexreflags,
 _lexliterals, _lexstateinfo, _lexstatere, _lexstateignore, _lexstateerrorf, _lexstateeoff.]
site-packages/ply/__pycache__/yacc.cpython-36.pyc  (tar member)

[binary data omitted: CPython 3.6 bytecode of ply/yacc.py, a second compiled copy of the
 same module as the fragment at the top of this listing. Not reproducible as text. Legible
 fragments include the PlyLogger, NullLogger, YaccError, YaccSymbol, YaccProduction,
 LRParser, Production, Grammar, LRTable, LRGeneratedTable and ParserReflect classes, the
 yacc() entry point, and the same parsetab template strings shown above.]
site-packages/ply/__pycache__/__init__.cpython-36.pyc  (tar member)
site-packages/ply/__pycache__/ygen.cpython-36.pyc      (tar member)
site-packages/ply/__pycache__/cpp.cpython-36.opt-1.pyc (tar member)
site-packages/ply/__pycache__/cpp.cpython-36.pyc       (tar member, cut off at the end of this dump)

[binary data omitted: the remaining compiled-bytecode members of the ply package.
 __init__.cpython-36.pyc is fully recoverable from its string constants: it defines
 __version__ = '3.9' and __all__ = ['lex', 'yacc'].
 ygen.cpython-36.pyc is the compiled ply/ygen.py helper (get_source_range, filter_section,
 main), which regenerates the parsedebug / parseopt / parseopt-notrack sections of yacc.py
 from the '#--! <tag>-start' and '#--! <tag>-end' markers and reports 'Updated yacc.py'.
 The two cpp members are the plain and opt-1 compiles of ply/cpp.py (the Macro and
 Preprocessor classes, the t_CPP_* token rules, and trigraph handling); the dump is
 truncated partway through the second copy.]
�n|j|_|j|jf|_
ddd
ddddddg	}xD|D]<}|jj|�|jj�}|�s�|j|k�rZtd|��qZWdS)NZ
identifierz"Couldn't determine identifier typeZ12345i90zCouldn't determine integer typez
"filename"zCouldn't determine string typez  rz%Couldn't determine token for newlinesr/r0r z##r!r,r-�,�.z,Unable to lex '%s' required for preprocessor)rr9rOrrUr�t_ID�int�	t_INTEGER�t_INTEGER_TYPE�t_STRINGZt_SPACEZ	t_NEWLINE�t_WS)rArS�chars�crrrrK�sD










zPreprocessor.lexprobecCs|jj|�dS)N)rIrP)rArIrrr�add_pathszPreprocessor.add_pathccs�|jj�}dd�|j�D�}xhtt|��D]X}|d}xJ||jd�r�|t|�kr�||dd�||||<d||<|d7}q8Wq*Wdj|�}|j|�d|_g}x<|j	�}|s�P|j
|�|j|jkr�d|j
kr�|Vg}q�W|r�|VdS)NcSsg|]}|j��qSr)�rstrip)�.0�xrrr�
<listcomp>sz,Preprocessor.group_lines.<locals>.<listcomp>rr!�rr2)rZclone�
splitlines�xrange�len�endswith�joinr9rrOrPrrar)rAr9rG�lines�i�jZcurrent_linerSrrr�group_liness,



zPreprocessor.group_linescCs|d}x(|t|�kr,||j|jkr,|d7}qW|d|�=t|�d}x$|dkrh||j|jkrh|d8}qFW||dd�=|S)Nrr)rlrra)rArRrprrr�
tokenstrip9s
zPreprocessor.tokenstripc	Cs�g}g}g}d}t|�}d}x$||kr@||j|jkr@|d7}qW||krh||jdkrh|j|d�n |j|j|djd�dggfS|d7}x�||k�rf||}|jdkr�|j|�|d7}n�|jdk�r|d8}|dk�r|r�|j|j|��|j|�|d||fS|j|�nD|jdk�rR|dk�rR|j|j|��|j|d�g}n
|j|�|d7}q�W|j|j|djd�dggfS)	Nrrr,zMissing '(' in macro argumentsr-rZzMissing ')' in macro argumentsr2)	rlrrarrPrYr@rrs)	rAZ	tokenlist�args�	positionsZcurrent_argZnestingZtokenlenrprrrr�collect_argsUsD






zPreprocessor.collect_argscCsg|_g|_g|_d}�x�|t|j�k�r|j|j|jkoL|j|j|jk�rh|jj|j|j�}|dkr�|j|djdkr�t	j	|j|�|j|<|j
|j|_|j|d=|jj||df�qn�|dko�|j|djdk�r|jjd||df�|j|d=qnZ|dt|j�k�rT|j|djdk�rT|jjd||f�|d7}qn|jjd||f�n�|j|jdk�r�|j�r�|dk�r�|j|djdk�r�|dt|j�k�r�|j|dj|jk�r�|j|dj|j
k�r�|jj|d�|d7}qW|jjdd	�d
d�dS)Nrrr z##rc�erZcSs|dS)N�r)rgrrrr6�sz,Preprocessor.macro_prescan.<locals>.<lambda>T)�key�reverse)�patch�	str_patch�var_comma_patchrlrrr\r=�index�copyr`rPr>r?�sort)rA�macrorp�argnumrrr�
macro_prescan�s:&*(,zPreprocessor.macro_prescanc
Cs0dd�|jD�}i}xb|jD]X\}}||krTddjdd�||D��jdd�||<tj||�||<||||_qWd}|jr�|dr�x|jD]}d||<d
}q�Wi}xj|jD]`\}	}}|	dkr�|||||d	�<q�|	dkr�||k�r|j||�||<|||||d	�<q�W|�r,d
d�|D�}|S)NcSsg|]}tj|��qSr)r)rf�_xrrrrh�sz2Preprocessor.macro_expand_args.<locals>.<listcomp>z"%s"ricSsg|]
}|j�qSr)r)rfrgrrrrh�sr!z\\FrTrcrwcSsg|]}|r|�qSrr)rf�_irrrrh�sr2)	rr|rn�replacerr>r}r{�
expand_macros)
rAr�rt�repZ
str_expansionr�rpZcomma_patch�expandedZptyperrr�macro_expand_args�s.(
zPreprocessor.macro_expand_argscCs�|dkri}d}�x�|t|�k�r�||}|j|jk�r�|j|jkoL|j|k�r�d||j<|j|j}|js�|jdd�|jD�|�}x|D]}|j|_q�W||||d�<|t|�7}�n�|d}x(|t|�kr�||j|jkr�|d7}q�W||jdk�r�|j	||d��\}	}
}|j
�r`t|
�t|j�k�r`|j|j|jd|jt|j�f�||	}�nD|j
�r�t|
�t|j�dk�r�t|j�dk�r�|j|j|jd	|jt|j�df�n&|j|j|jd
|jt|j�df�||	}n�|j
�rXt|
�t|j�dk�r|
j
g�nD|||t|j�d||	d�|
t|j�d<|
t|j�d�=|j||
�}|j||�}x|D]}
|j|
_�qvW|||||	�<|t|�7}||j=qn"|jdk�r�|j|_|j|j�|_|d7}qW|S)NrTcSsg|]}tj|��qSr)r)rfr�rrrrh�sz.Preprocessor.expand_macros.<locals>.<listcomp>rr,zMacro %s requires %d argumentsrxz(Macro %s must have at least %d argumentsz'Macro %s must have at least %d argumentZ__LINE__)rlrr\rrHr=r�rrarvr>rYr@rPr�r^r_)rArRr�rpr�mZexrwrq�tokcountrtrur��rrrrr��s\

" (&
4
zPreprocessor.expand_macroscCs^d}�x|t|�k�r"||j|jko2||jdk�r|d}d}d}x�|t|�kr�||j|jkrp|d7}qHnn||j|jkr�||j|jkr�d}nd}|s�Pn<||jdkr�d}n(||jd	kr�Pn|j|j||jd
�|d7}qHW|j	||_|j
|�||_||d|d�=|d7}qW|j|�}x�t|�D]�\}}|j|jk�rzt
j
|�||<|j	||_|j
d�||_nd|j|j	k�r8t
j
|�||<t||j�||_x2||jddk�r�||jdd�||_�q�W�q8Wdjd
d�|D��}|jdd�}|jdd�}|jdd�}yt|�}Wn0tk
�rX|j|j|djd�d}YnX|S)NrZdefinedrFZ0LZ1Lr,Tr-zMalformed defined()Z0123456789abcdefABCDEFricSsg|]}t|j��qSr)�strr)rfrgrrrrhTsz)Preprocessor.evalexpr.<locals>.<listcomp>z&&z and z||z or r.z not zCouldn't evaluate expressionr2r2)rlrr\rrarHrYr@rr^r_r��	enumeraterr�rnr��eval�	Exception)rArRrprqZ	needparen�resultr�exprrrr�evalexpr)s^ 
$
zPreprocessor.evalexprccs�t|�}|j|�}|sd}|jd|�||_g}d}d}g}�xN|D�]D}	x"t|	�D]\}
}|j|jkrVPqVW|jdk�r~x,|	D]$}|j|jkr�d|jkr�|j|�q�W|j	|	|
dd��}|r�|dj}
|j	|dd��}nd}
g}|
d	k�r(|�r|x|j
|�D]}|V�qWg}|j|��q�|
d
k�r�|�r|x|j
|�D]}|V�qDWg}|jd}x|j|�D]}|V�qnW||jd<||_�q�|
dk�r�|�r|x|j
|�D]}|V�q�Wg}|j
|��q�|
d
k�r|j||f�|�r||dj|jk�r
d}d}nd}�q�|
dk�rT|j||f�|�r||dj|jk�rLd}d}nd}�q�|
dk�r�|j||f�|�r||j|�}|�s�d}d}nd}n�|
dk�r�|�r�|dd�r�|�r�d}n|�s�|j|�}|�r�d}d}n|j|j|djd�n�|
dk�rF|�r.|dd�rD|�rd}n|�sDd}d}n|j|j|djd�n6|
dk�r�|�rd|j�\}}n|j|j|djd�nqF|rF|j|	�qFWx|j
|�D]}|V�q�Wg}dS)Nriz
__FILE__ "%s"TFr rrrrM�includeZ__FILE__�undefZifdefZifndef�if�elifzMisplaced #elif�elsezMisplaced #elseZendifzMisplaced #endifr2r2)r:rrrMr@r�rrarrPrsr�rHr�r�r�rYr�pop�extend)rAr9r@rro�chunk�enableZ	iftriggerZifstackrgrprSZ	dirtokensr<rtZoldfiler�rrr�parsegends�
















zPreprocessor.parsegenc
cs�|sdS|r�|djdkr4|dj|jkr4|j|�}|djdkr�d}x4|t|�krn||jdkrdP|d7}qHWtd�dSdjdd�|d|�D��}|jdg|j}nB|dj|jkr�|djdd�}|jdg|j}ntd	�dSx�|D]�}t	jj||�}y`t
|d
�j�}t	jj|�}|�r6|jj
d|�x|j||�D]}	|	V�qDW|�rb|jd=PWq�tk
�r|Yq�Xq�Wtd|�dS)
Nrr/rr0zMalformed #include <...>ricSsg|]
}|j�qSr)r)rfrgrrrrh�sz(Preprocessor.include.<locals>.<listcomp>zMalformed #include statementr�zCouldn't find '%s'r2)rrr`r�rlrUrnrIrJ�os�open�read�dirname�insertr��IOError)
rArRrp�filenamerI�pZiname�dataZdnamerSrrrr��sF


zPreprocessor.includecCs�t|t�r|j|�}|}�y||d}t|�dkr:|d}nd}|s^t|jg�}||j|j<�n6|j|jkr�t|j|j	|dd���}||j|j<�n|jdk�r�|j
|dd��\}}}d}	�x�|D]�}
|	r�td�Pdjdd	�|
D��}|d
k�r d}	|j
|
d_d|
d_d}	|
dd�=q�nb|dd�d
k�r�|
dj|j
k�r�d}	|
dd�=|
djdd�d
kr�|
djdd�|
d_q�t|
�dk�s�|
dj|j
kr�td�Pq�W|j	|d|d��}d}
x�|
t|�k�rX|
dt|�k�rL||
j|jk�r||
djdk�r||
=�q�n0||
jdk�rL||
dj|jk�rL||
d=|
d7}
�q�Wt|j|dd	�|D�|	�}|j|�||j|j<ntd�Wntk
�r�td�YnXdS)Nrrrxr,Fz0No more arguments may follow a variadic argumentricSsg|]}t|j��qSr)r�r)rfr�rrrrh2sz'Preprocessor.define.<locals>.<listcomp>z...TZ__VA_ARGS__rzInvalid macro argumentz##cSsg|]}|dj�qS)r)r)rfrgrrrrhPszBad macro definition���r�r�)�
isinstance�STRING_TYPESrTrlr;rrHrrarsrvrUrnr\r��LookupError)rArRZlinetokr<Zmtyper�r�rtrur>�aZastrZmvaluerprrrrMsl





$
&&

zPreprocessor.definecCs0|dj}y|j|=Wntk
r*YnXdS)Nr)rrHr�)rArR�idrrrr�^s

zPreprocessor.undefcCs||_|j||�|_dS)N)�ignorer�rN)rAr9r@r�rrr�parsejszPreprocessor.parsecCsDy$xt|j�}|j|jkr|SqWWntk
r>d|_dSXdS)N)�nextrNrr��
StopIteration)rArSrrrrOss
zPreprocessor.token)N)N)N)rCrDrErBrTrYrKrdrrrsrvr�r�r�r�r�r�rMr�r�rOrrrrrF�s&
<!5+2
B;
1F	rF�__main__r)
rrrrrr	r
rrr
)3Z
__future__r�sys�version_info�majorr�Zunicoder��rangerkrR�literalsrZt_CPP_POUNDZt_CPP_DPOUNDZt_CPP_IDrZ
t_CPP_INTEGERZt_CPP_FLOATrrrrr�rerrLZos.pathr��compiler7r3r:�objectr;rFrCZply.lexrGrr��argv�fr�r9r�r�rOrSrUr@rrrr�<module>
sl
	
c

site-packages/ply/lex.py000064400000123720147511334660011251 0ustar00# -----------------------------------------------------------------------------
# ply: lex.py
#
# Copyright (C) 2001-2016
# David M. Beazley (Dabeaz LLC)
# All rights reserved.
#
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions are
# met:
#
# * Redistributions of source code must retain the above copyright notice,
#   this list of conditions and the following disclaimer.
# * Redistributions in binary form must reproduce the above copyright notice,
#   this list of conditions and the following disclaimer in the documentation
#   and/or other materials provided with the distribution.
# * Neither the name of the David Beazley or Dabeaz LLC may be used to
#   endorse or promote products derived from this software without
#  specific prior written permission.
#
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
# "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
# LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
# A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
# OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
# SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
# LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
# DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
# THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
# (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
# OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
# -----------------------------------------------------------------------------

__version__    = '3.9'
__tabversion__ = '3.8'

import re
import sys
import types
import copy
import os
import inspect

# This tuple contains known string types
try:
    # Python 2.6
    StringTypes = (types.StringType, types.UnicodeType)
except AttributeError:
    # Python 3.0
    StringTypes = (str, bytes)

# This regular expression is used to match valid token names
_is_identifier = re.compile(r'^[a-zA-Z0-9_]+$')

# Exception thrown when invalid token encountered and no default error
# handler is defined.
class LexError(Exception):
    def __init__(self, message, s):
        self.args = (message,)
        self.text = s


# Token class.  This class is used to represent the tokens produced.
class LexToken(object):
    def __str__(self):
        return 'LexToken(%s,%r,%d,%d)' % (self.type, self.value, self.lineno, self.lexpos)

    def __repr__(self):
        return str(self)


# This object is a stand-in for a logging object created by the
# logging module.

class PlyLogger(object):
    def __init__(self, f):
        self.f = f

    def critical(self, msg, *args, **kwargs):
        self.f.write((msg % args) + '\n')

    def warning(self, msg, *args, **kwargs):
        self.f.write('WARNING: ' + (msg % args) + '\n')

    def error(self, msg, *args, **kwargs):
        self.f.write('ERROR: ' + (msg % args) + '\n')

    info = critical
    debug = critical


# Null logger is used when no output is generated. Does nothing.
class NullLogger(object):
    def __getattribute__(self, name):
        return self

    def __call__(self, *args, **kwargs):
        return self


# -----------------------------------------------------------------------------
#                        === Lexing Engine ===
#
# The following Lexer class implements the lexer runtime.   There are only
# a few public methods and attributes:
#
#    input()          -  Store a new string in the lexer
#    token()          -  Get the next token
#    clone()          -  Clone the lexer
#
#    lineno           -  Current line number
#    lexpos           -  Current position in the input string
# -----------------------------------------------------------------------------
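# A minimal usage sketch of this runtime interface (illustrative only;
# 'calclex' is a hypothetical module containing a token specification):
#
#     import ply.lex as lex
#     import calclex                      # module defining tokens and t_* rules
#     lexer = lex.lex(module=calclex)
#     lexer.input("3 + 4 * 10")           # store a new string in the lexer
#     while True:
#         tok = lexer.token()             # next LexToken, or None at end of input
#         if not tok:
#             break
#         print(tok.type, tok.value, tok.lineno, tok.lexpos)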

class Lexer:
    def __init__(self):
        self.lexre = None             # Master regular expression. This is a list of
                                      # tuples (re, findex) where re is a compiled
                                      # regular expression and findex is a list
                                      # mapping regex group numbers to rules
        self.lexretext = None         # Current regular expression strings
        self.lexstatere = {}          # Dictionary mapping lexer states to master regexs
        self.lexstateretext = {}      # Dictionary mapping lexer states to regex strings
        self.lexstaterenames = {}     # Dictionary mapping lexer states to symbol names
        self.lexstate = 'INITIAL'     # Current lexer state
        self.lexstatestack = []       # Stack of lexer states
        self.lexstateinfo = None      # State information
        self.lexstateignore = {}      # Dictionary of ignored characters for each state
        self.lexstateerrorf = {}      # Dictionary of error functions for each state
        self.lexstateeoff = {}        # Dictionary of eof functions for each state
        self.lexreflags = 0           # Optional re compile flags
        self.lexdata = None           # Actual input data (as a string)
        self.lexpos = 0               # Current position in input text
        self.lexlen = 0               # Length of the input text
        self.lexerrorf = None         # Error rule (if any)
        self.lexeoff = None           # EOF rule (if any)
        self.lextokens = None         # List of valid tokens
        self.lexignore = ''           # Ignored characters
        self.lexliterals = ''         # Literal characters that can be passed through
        self.lexmodule = None         # Module
        self.lineno = 1               # Current line number
        self.lexoptimize = False      # Optimized mode

    def clone(self, object=None):
        c = copy.copy(self)

        # If the object parameter has been supplied, it means we are attaching the
        # lexer to a new object.  In this case, we have to rebind all methods in
        # the lexstatere and lexstateerrorf tables.

        if object:
            newtab = {}
            for key, ritem in self.lexstatere.items():
                newre = []
                for cre, findex in ritem:
                    newfindex = []
                    for f in findex:
                        if not f or not f[0]:
                            newfindex.append(f)
                            continue
                        newfindex.append((getattr(object, f[0].__name__), f[1]))
                newre.append((cre, newfindex))
                newtab[key] = newre
            c.lexstatere = newtab
            c.lexstateerrorf = {}
            for key, ef in self.lexstateerrorf.items():
                c.lexstateerrorf[key] = getattr(object, ef.__name__)
            c.lexmodule = object
        return c

    # ------------------------------------------------------------
    # writetab() - Write lexer information to a table file
    # ------------------------------------------------------------
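    # Sketch of how this is normally reached (an assumption based on the
    # lex() driver further below, not a required calling convention):
    #
    #     lexer = lex.lex(optimize=1, lextab='mylextab')
    #
    # On success lex() calls writetab('mylextab', outputdir) so that later
    # runs can skip regex construction by loading the table via readtab().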
    def writetab(self, lextab, outputdir=''):
        if isinstance(lextab, types.ModuleType):
            raise IOError("Won't overwrite existing lextab module")
        basetabmodule = lextab.split('.')[-1]
        filename = os.path.join(outputdir, basetabmodule) + '.py'
        with open(filename, 'w') as tf:
            tf.write('# %s.py. This file automatically created by PLY (version %s). Don\'t edit!\n' % (basetabmodule, __version__))
            tf.write('_tabversion   = %s\n' % repr(__tabversion__))
            tf.write('_lextokens    = set(%s)\n' % repr(tuple(self.lextokens)))
            tf.write('_lexreflags   = %s\n' % repr(self.lexreflags))
            tf.write('_lexliterals  = %s\n' % repr(self.lexliterals))
            tf.write('_lexstateinfo = %s\n' % repr(self.lexstateinfo))

            # Rewrite the lexstatere table, replacing function objects with function names 
            tabre = {}
            for statename, lre in self.lexstatere.items():
                titem = []
                for (pat, func), retext, renames in zip(lre, self.lexstateretext[statename], self.lexstaterenames[statename]):
                    titem.append((retext, _funcs_to_names(func, renames)))
                tabre[statename] = titem

            tf.write('_lexstatere   = %s\n' % repr(tabre))
            tf.write('_lexstateignore = %s\n' % repr(self.lexstateignore))

            taberr = {}
            for statename, ef in self.lexstateerrorf.items():
                taberr[statename] = ef.__name__ if ef else None
            tf.write('_lexstateerrorf = %s\n' % repr(taberr))

            tabeof = {}
            for statename, ef in self.lexstateeoff.items():
                tabeof[statename] = ef.__name__ if ef else None
            tf.write('_lexstateeoff = %s\n' % repr(tabeof))

    # ------------------------------------------------------------
    # readtab() - Read lexer information from a tab file
    # ------------------------------------------------------------
    def readtab(self, tabfile, fdict):
        if isinstance(tabfile, types.ModuleType):
            lextab = tabfile
        else:
            exec('import %s' % tabfile)
            lextab = sys.modules[tabfile]

        if getattr(lextab, '_tabversion', '0.0') != __tabversion__:
            raise ImportError('Inconsistent PLY version')

        self.lextokens      = lextab._lextokens
        self.lexreflags     = lextab._lexreflags
        self.lexliterals    = lextab._lexliterals
        self.lextokens_all  = self.lextokens | set(self.lexliterals)
        self.lexstateinfo   = lextab._lexstateinfo
        self.lexstateignore = lextab._lexstateignore
        self.lexstatere     = {}
        self.lexstateretext = {}
        for statename, lre in lextab._lexstatere.items():
            titem = []
            txtitem = []
            for pat, func_name in lre:
                titem.append((re.compile(pat, lextab._lexreflags | re.VERBOSE), _names_to_funcs(func_name, fdict)))

            self.lexstatere[statename] = titem
            self.lexstateretext[statename] = txtitem

        self.lexstateerrorf = {}
        for statename, ef in lextab._lexstateerrorf.items():
            self.lexstateerrorf[statename] = fdict[ef]

        self.lexstateeoff = {}
        for statename, ef in lextab._lexstateeoff.items():
            self.lexstateeoff[statename] = fdict[ef]

        self.begin('INITIAL')

    # ------------------------------------------------------------
    # input() - Push a new string into the lexer
    # ------------------------------------------------------------
    def input(self, s):
        # Pull off the first character to see if s looks like a string
        c = s[:1]
        if not isinstance(c, StringTypes):
            raise ValueError('Expected a string')
        self.lexdata = s
        self.lexpos = 0
        self.lexlen = len(s)

    # ------------------------------------------------------------
    # begin() - Changes the lexing state
    # ------------------------------------------------------------
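    # Sketch of typical use from inside token rules (illustrative only;
    # assumes a state named 'ccode' is declared in the module's 'states'
    # tuple and that LBRACE/RBRACE appear in its 'tokens' list):
    #
    #     def t_LBRACE(t):
    #         r'\{'
    #         t.lexer.begin('ccode')      # switch to the 'ccode' state
    #         return t
    #
    #     def t_ccode_RBRACE(t):
    #         r'\}'
    #         t.lexer.begin('INITIAL')    # switch back to the default state
    #         return t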
    def begin(self, state):
        if state not in self.lexstatere:
            raise ValueError('Undefined state')
        self.lexre = self.lexstatere[state]
        self.lexretext = self.lexstateretext[state]
        self.lexignore = self.lexstateignore.get(state, '')
        self.lexerrorf = self.lexstateerrorf.get(state, None)
        self.lexeoff = self.lexstateeoff.get(state, None)
        self.lexstate = state

    # ------------------------------------------------------------
    # push_state() - Changes the lexing state and saves old on stack
    # ------------------------------------------------------------
    def push_state(self, state):
        self.lexstatestack.append(self.lexstate)
        self.begin(state)

    # ------------------------------------------------------------
    # pop_state() - Restores the previous state
    # ------------------------------------------------------------
    def pop_state(self):
        self.begin(self.lexstatestack.pop())

    # ------------------------------------------------------------
    # current_state() - Returns the current lexing state
    # ------------------------------------------------------------
    def current_state(self):
        return self.lexstate

    # ------------------------------------------------------------
    # skip() - Skip ahead n characters
    # ------------------------------------------------------------
    def skip(self, n):
        self.lexpos += n

    # ------------------------------------------------------------
    # opttoken() - Return the next token from the Lexer
    #
    # Note: This function has been carefully implemented to be as fast
    # as possible.  Don't make changes unless you really know what
    # you are doing
    # ------------------------------------------------------------
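    # Illustrative note: because __iter__/__next__ are defined further below,
    # the same machinery can also be driven with a plain for-loop, e.g.
    #
    #     for tok in lexer:
    #         print(tok)
    #
    # which simply calls this token() method until it returns None.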
    def token(self):
        # Make local copies of frequently referenced attributes
        lexpos    = self.lexpos
        lexlen    = self.lexlen
        lexignore = self.lexignore
        lexdata   = self.lexdata

        while lexpos < lexlen:
            # This code provides some short-circuit code for whitespace, tabs, and other ignored characters
            if lexdata[lexpos] in lexignore:
                lexpos += 1
                continue

            # Look for a regular expression match
            for lexre, lexindexfunc in self.lexre:
                m = lexre.match(lexdata, lexpos)
                if not m:
                    continue

                # Create a token for return
                tok = LexToken()
                tok.value = m.group()
                tok.lineno = self.lineno
                tok.lexpos = lexpos

                i = m.lastindex
                func, tok.type = lexindexfunc[i]

                if not func:
                    # If no token type was set, it's an ignored token
                    if tok.type:
                        self.lexpos = m.end()
                        return tok
                    else:
                        lexpos = m.end()
                        break

                lexpos = m.end()

                # If token is processed by a function, call it

                tok.lexer = self      # Set additional attributes useful in token rules
                self.lexmatch = m
                self.lexpos = lexpos

                newtok = func(tok)

                # Every function must return a token. If nothing is returned, we just move to the next token
                if not newtok:
                    lexpos    = self.lexpos         # This is here in case user has updated lexpos.
                    lexignore = self.lexignore      # This is here in case there was a state change
                    break

                # Verify type of the token.  If not in the token map, raise an error
                if not self.lexoptimize:
                    if newtok.type not in self.lextokens_all:
                        raise LexError("%s:%d: Rule '%s' returned an unknown token type '%s'" % (
                            func.__code__.co_filename, func.__code__.co_firstlineno,
                            func.__name__, newtok.type), lexdata[lexpos:])

                return newtok
            else:
                # No match, see if in literals
                if lexdata[lexpos] in self.lexliterals:
                    tok = LexToken()
                    tok.value = lexdata[lexpos]
                    tok.lineno = self.lineno
                    tok.type = tok.value
                    tok.lexpos = lexpos
                    self.lexpos = lexpos + 1
                    return tok

                # No match. Call t_error() if defined.
                if self.lexerrorf:
                    tok = LexToken()
                    tok.value = self.lexdata[lexpos:]
                    tok.lineno = self.lineno
                    tok.type = 'error'
                    tok.lexer = self
                    tok.lexpos = lexpos
                    self.lexpos = lexpos
                    newtok = self.lexerrorf(tok)
                    if lexpos == self.lexpos:
                        # Error method didn't change text position at all. This is an error.
                        raise LexError("Scanning error. Illegal character '%s'" % (lexdata[lexpos]), lexdata[lexpos:])
                    lexpos = self.lexpos
                    if not newtok:
                        continue
                    return newtok

                self.lexpos = lexpos
                raise LexError("Illegal character '%s' at index %d" % (lexdata[lexpos], lexpos), lexdata[lexpos:])

        if self.lexeoff:
            tok = LexToken()
            tok.type = 'eof'
            tok.value = ''
            tok.lineno = self.lineno
            tok.lexpos = lexpos
            tok.lexer = self
            self.lexpos = lexpos
            newtok = self.lexeoff(tok)
            return newtok

        self.lexpos = lexpos + 1
        if self.lexdata is None:
            raise RuntimeError('No input string given with input()')
        return None

    # Iterator interface
    def __iter__(self):
        return self

    def next(self):
        t = self.token()
        if t is None:
            raise StopIteration
        return t

    __next__ = next

# -----------------------------------------------------------------------------
#                           ==== Lex Builder ===
#
# The functions and classes below are used to collect lexing information
# and build a Lexer object from it.
# -----------------------------------------------------------------------------

# -----------------------------------------------------------------------------
# _get_regex(func)
#
# Returns the regular expression assigned to a function either as a doc string
# or as a .regex attribute attached by the @TOKEN decorator.
# -----------------------------------------------------------------------------
def _get_regex(func):
    return getattr(func, 'regex', func.__doc__)

# -----------------------------------------------------------------------------
# get_caller_module_dict()
#
# This function returns a dictionary containing all of the symbols defined within
# a caller further down the call stack.  This is used to get the environment
# associated with the yacc() call if none was provided.
# -----------------------------------------------------------------------------
def get_caller_module_dict(levels):
    f = sys._getframe(levels)
    ldict = f.f_globals.copy()
    if f.f_globals != f.f_locals:
        ldict.update(f.f_locals)
    return ldict

# -----------------------------------------------------------------------------
# _funcs_to_names()
#
# Given a list of regular expression functions, this converts it to a list
# suitable for output to a table file
# -----------------------------------------------------------------------------
def _funcs_to_names(funclist, namelist):
    result = []
    for f, name in zip(funclist, namelist):
        if f and f[0]:
            result.append((name, f[1]))
        else:
            result.append(f)
    return result

# -----------------------------------------------------------------------------
# _names_to_funcs()
#
# Given a list of regular expression function names, this converts it back to
# functions.
# -----------------------------------------------------------------------------
def _names_to_funcs(namelist, fdict):
    result = []
    for n in namelist:
        if n and n[0]:
            result.append((fdict[n[0]], n[1]))
        else:
            result.append(n)
    return result

# -----------------------------------------------------------------------------
# _form_master_re()
#
# This function takes a list of all of the regex components and attempts to
# form the master regular expression.  Given limitations in the Python re
# module, it may be necessary to break the master regex into separate expressions.
# -----------------------------------------------------------------------------
def _form_master_re(relist, reflags, ldict, toknames):
    if not relist:
        return []
    regex = '|'.join(relist)
    try:
        lexre = re.compile(regex, re.VERBOSE | reflags)

        # Build the index to function map for the matching engine
        lexindexfunc = [None] * (max(lexre.groupindex.values()) + 1)
        lexindexnames = lexindexfunc[:]

        for f, i in lexre.groupindex.items():
            handle = ldict.get(f, None)
            if type(handle) in (types.FunctionType, types.MethodType):
                lexindexfunc[i] = (handle, toknames[f])
                lexindexnames[i] = f
            elif handle is not None:
                lexindexnames[i] = f
                if f.find('ignore_') > 0:
                    lexindexfunc[i] = (None, None)
                else:
                    lexindexfunc[i] = (None, toknames[f])

        return [(lexre, lexindexfunc)], [regex], [lexindexnames]
    except Exception:
        m = int(len(relist)/2)
        if m == 0:
            m = 1
        llist, lre, lnames = _form_master_re(relist[:m], reflags, ldict, toknames)
        rlist, rre, rnames = _form_master_re(relist[m:], reflags, ldict, toknames)
        return (llist+rlist), (lre+rre), (lnames+rnames)

# -----------------------------------------------------------------------------
# def _statetoken(s,names)
#
# Given a declaration name s of the form "t_" and a dictionary whose keys are
# state names, this function returns a tuple (states,tokenname) where states
# is a tuple of state names and tokenname is the name of the token.  For example,
# calling this with s = "t_foo_bar_SPAM" might return (('foo','bar'),'SPAM')
# -----------------------------------------------------------------------------
def _statetoken(s, names):
    nonstate = 1
    parts = s.split('_')
    for i, part in enumerate(parts[1:], 1):
        if part not in names and part != 'ANY':
            break
    
    if i > 1:
        states = tuple(parts[1:i])
    else:
        states = ('INITIAL',)

    if 'ANY' in states:
        states = tuple(names)

    tokenname = '_'.join(parts[i:])
    return (states, tokenname)


# -----------------------------------------------------------------------------
# LexerReflect()
#
# This class represents information needed to build a lexer as extracted from a
# user's input file.
# -----------------------------------------------------------------------------
class LexerReflect(object):
    def __init__(self, ldict, log=None, reflags=0):
        self.ldict      = ldict
        self.error_func = None
        self.tokens     = []
        self.reflags    = reflags
        self.stateinfo  = {'INITIAL': 'inclusive'}
        self.modules    = set()
        self.error      = False
        self.log        = PlyLogger(sys.stderr) if log is None else log

    # Get all of the basic information
    def get_all(self):
        self.get_tokens()
        self.get_literals()
        self.get_states()
        self.get_rules()

    # Validate all of the information
    def validate_all(self):
        self.validate_tokens()
        self.validate_literals()
        self.validate_rules()
        return self.error

    # Get the tokens map
    def get_tokens(self):
        tokens = self.ldict.get('tokens', None)
        if not tokens:
            self.log.error('No token list is defined')
            self.error = True
            return

        if not isinstance(tokens, (list, tuple)):
            self.log.error('tokens must be a list or tuple')
            self.error = True
            return

        if not tokens:
            self.log.error('tokens is empty')
            self.error = True
            return

        self.tokens = tokens

    # Validate the tokens
    def validate_tokens(self):
        terminals = {}
        for n in self.tokens:
            if not _is_identifier.match(n):
                self.log.error("Bad token name '%s'", n)
                self.error = True
            if n in terminals:
                self.log.warning("Token '%s' multiply defined", n)
            terminals[n] = 1

    # Get the literals specifier
    def get_literals(self):
        self.literals = self.ldict.get('literals', '')
        if not self.literals:
            self.literals = ''

    # Validate literals
    def validate_literals(self):
        try:
            for c in self.literals:
                if not isinstance(c, StringTypes) or len(c) > 1:
                    self.log.error('Invalid literal %s. Must be a single character', repr(c))
                    self.error = True

        except TypeError:
            self.log.error('Invalid literals specification. literals must be a sequence of characters')
            self.error = True

    def get_states(self):
        self.states = self.ldict.get('states', None)
        # Build statemap
        if self.states:
            if not isinstance(self.states, (tuple, list)):
                self.log.error('states must be defined as a tuple or list')
                self.error = True
            else:
                for s in self.states:
                    if not isinstance(s, tuple) or len(s) != 2:
                        self.log.error("Invalid state specifier %s. Must be a tuple (statename,'exclusive|inclusive')", repr(s))
                        self.error = True
                        continue
                    name, statetype = s
                    if not isinstance(name, StringTypes):
                        self.log.error('State name %s must be a string', repr(name))
                        self.error = True
                        continue
                    if not (statetype == 'inclusive' or statetype == 'exclusive'):
                        self.log.error("State type for state %s must be 'inclusive' or 'exclusive'", name)
                        self.error = True
                        continue
                    if name in self.stateinfo:
                        self.log.error("State '%s' already defined", name)
                        self.error = True
                        continue
                    self.stateinfo[name] = statetype

    # Get all of the symbols with a t_ prefix and sort them into various
    # categories (functions, strings, error functions, and ignore characters)

    def get_rules(self):
        tsymbols = [f for f in self.ldict if f[:2] == 't_']

        # Now build up a list of functions and a list of strings
        self.toknames = {}        # Mapping of symbols to token names
        self.funcsym  = {}        # Symbols defined as functions
        self.strsym   = {}        # Symbols defined as strings
        self.ignore   = {}        # Ignore strings by state
        self.errorf   = {}        # Error functions by state
        self.eoff     = {}        # EOF functions by state

        for s in self.stateinfo:
            self.funcsym[s] = []
            self.strsym[s] = []

        if len(tsymbols) == 0:
            self.log.error('No rules of the form t_rulename are defined')
            self.error = True
            return

        for f in tsymbols:
            t = self.ldict[f]
            states, tokname = _statetoken(f, self.stateinfo)
            self.toknames[f] = tokname

            if hasattr(t, '__call__'):
                if tokname == 'error':
                    for s in states:
                        self.errorf[s] = t
                elif tokname == 'eof':
                    for s in states:
                        self.eoff[s] = t
                elif tokname == 'ignore':
                    line = t.__code__.co_firstlineno
                    file = t.__code__.co_filename
                    self.log.error("%s:%d: Rule '%s' must be defined as a string", file, line, t.__name__)
                    self.error = True
                else:
                    for s in states:
                        self.funcsym[s].append((f, t))
            elif isinstance(t, StringTypes):
                if tokname == 'ignore':
                    for s in states:
                        self.ignore[s] = t
                    if '\\' in t:
                        self.log.warning("%s contains a literal backslash '\\'", f)

                elif tokname == 'error':
                    self.log.error("Rule '%s' must be defined as a function", f)
                    self.error = True
                else:
                    for s in states:
                        self.strsym[s].append((f, t))
            else:
                self.log.error('%s not defined as a function or string', f)
                self.error = True

        # Sort the functions by line number
        for f in self.funcsym.values():
            f.sort(key=lambda x: x[1].__code__.co_firstlineno)

        # Sort the strings by regular expression length
        for s in self.strsym.values():
            s.sort(key=lambda x: len(x[1]), reverse=True)

    # Validate all of the t_rules collected
    def validate_rules(self):
        for state in self.stateinfo:
            # Validate all rules defined by functions

            for fname, f in self.funcsym[state]:
                line = f.__code__.co_firstlineno
                file = f.__code__.co_filename
                module = inspect.getmodule(f)
                self.modules.add(module)

                tokname = self.toknames[fname]
                if isinstance(f, types.MethodType):
                    reqargs = 2
                else:
                    reqargs = 1
                nargs = f.__code__.co_argcount
                if nargs > reqargs:
                    self.log.error("%s:%d: Rule '%s' has too many arguments", file, line, f.__name__)
                    self.error = True
                    continue

                if nargs < reqargs:
                    self.log.error("%s:%d: Rule '%s' requires an argument", file, line, f.__name__)
                    self.error = True
                    continue

                if not _get_regex(f):
                    self.log.error("%s:%d: No regular expression defined for rule '%s'", file, line, f.__name__)
                    self.error = True
                    continue

                try:
                    c = re.compile('(?P<%s>%s)' % (fname, _get_regex(f)), re.VERBOSE | self.reflags)
                    if c.match(''):
                        self.log.error("%s:%d: Regular expression for rule '%s' matches empty string", file, line, f.__name__)
                        self.error = True
                except re.error as e:
                    self.log.error("%s:%d: Invalid regular expression for rule '%s'. %s", file, line, f.__name__, e)
                    if '#' in _get_regex(f):
                        self.log.error("%s:%d. Make sure '#' in rule '%s' is escaped with '\\#'", file, line, f.__name__)
                    self.error = True

            # Validate all rules defined by strings
            for name, r in self.strsym[state]:
                tokname = self.toknames[name]
                if tokname == 'error':
                    self.log.error("Rule '%s' must be defined as a function", name)
                    self.error = True
                    continue

                if tokname not in self.tokens and tokname.find('ignore_') < 0:
                    self.log.error("Rule '%s' defined for an unspecified token %s", name, tokname)
                    self.error = True
                    continue

                try:
                    c = re.compile('(?P<%s>%s)' % (name, r), re.VERBOSE | self.reflags)
                    if (c.match('')):
                        self.log.error("Regular expression for rule '%s' matches empty string", name)
                        self.error = True
                except re.error as e:
                    self.log.error("Invalid regular expression for rule '%s'. %s", name, e)
                    if '#' in r:
                        self.log.error("Make sure '#' in rule '%s' is escaped with '\\#'", name)
                    self.error = True

            if not self.funcsym[state] and not self.strsym[state]:
                self.log.error("No rules defined for state '%s'", state)
                self.error = True

            # Validate the error function
            efunc = self.errorf.get(state, None)
            if efunc:
                f = efunc
                line = f.__code__.co_firstlineno
                file = f.__code__.co_filename
                module = inspect.getmodule(f)
                self.modules.add(module)

                if isinstance(f, types.MethodType):
                    reqargs = 2
                else:
                    reqargs = 1
                nargs = f.__code__.co_argcount
                if nargs > reqargs:
                    self.log.error("%s:%d: Rule '%s' has too many arguments", file, line, f.__name__)
                    self.error = True

                if nargs < reqargs:
                    self.log.error("%s:%d: Rule '%s' requires an argument", file, line, f.__name__)
                    self.error = True

        for module in self.modules:
            self.validate_module(module)

    # -----------------------------------------------------------------------------
    # validate_module()
    #
    # This checks to see if there are duplicated t_rulename() functions or strings
    # in the parser input file.  This is done using a simple regular expression
    # match on each line in the source code of the given module.
    # -----------------------------------------------------------------------------

    def validate_module(self, module):
        try:
            lines, linen = inspect.getsourcelines(module)
        except IOError:
            return

        fre = re.compile(r'\s*def\s+(t_[a-zA-Z_0-9]*)\(')
        sre = re.compile(r'\s*(t_[a-zA-Z_0-9]*)\s*=')

        counthash = {}
        linen += 1
        for line in lines:
            m = fre.match(line)
            if not m:
                m = sre.match(line)
            if m:
                name = m.group(1)
                prev = counthash.get(name)
                if not prev:
                    counthash[name] = linen
                else:
                    filename = inspect.getsourcefile(module)
                    self.log.error('%s:%d: Rule %s redefined. Previously defined on line %d', filename, linen, name, prev)
                    self.error = True
            linen += 1

# -----------------------------------------------------------------------------
# lex(module)
#
# Build all of the regular expression rules from definitions in the supplied module
# -----------------------------------------------------------------------------
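# A minimal sketch of the kind of module lex() reflects over (all names below
# are illustrative and not defined in this file):
#
#     import ply.lex as lex
#
#     tokens = ('NUMBER', 'PLUS')
#
#     t_PLUS   = r'\+'
#     t_ignore = ' \t'
#
#     def t_NUMBER(t):
#         r'\d+'
#         t.value = int(t.value)
#         return t
#
#     def t_error(t):
#         print("Illegal character %r" % t.value[0])
#         t.lexer.skip(1)
#
#     lexer = lex.lex()    # reflects over the calling module's dictionary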
def lex(module=None, object=None, debug=False, optimize=False, lextab='lextab',
        reflags=0, nowarn=False, outputdir=None, debuglog=None, errorlog=None):

    if lextab is None:
        lextab = 'lextab'

    global lexer

    ldict = None
    stateinfo  = {'INITIAL': 'inclusive'}
    lexobj = Lexer()
    lexobj.lexoptimize = optimize
    global token, input

    if errorlog is None:
        errorlog = PlyLogger(sys.stderr)

    if debug:
        if debuglog is None:
            debuglog = PlyLogger(sys.stderr)

    # Get the module dictionary used for the lexer
    if object:
        module = object

    # Build the dictionary of symbols used for the lexer rules
    if module:
        _items = [(k, getattr(module, k)) for k in dir(module)]
        ldict = dict(_items)
        # If no __file__ attribute is available, try to obtain it from the __module__ instead
        if '__file__' not in ldict:
            ldict['__file__'] = sys.modules[ldict['__module__']].__file__
    else:
        ldict = get_caller_module_dict(2)

    # Determine if the module is part of a package or not.
    # If so, fix the tabmodule setting so that tables load correctly
    pkg = ldict.get('__package__')
    if pkg and isinstance(lextab, str):
        if '.' not in lextab:
            lextab = pkg + '.' + lextab

    # Collect parser information from the dictionary
    linfo = LexerReflect(ldict, log=errorlog, reflags=reflags)
    linfo.get_all()
    if not optimize:
        if linfo.validate_all():
            raise SyntaxError("Can't build lexer")

    if optimize and lextab:
        try:
            lexobj.readtab(lextab, ldict)
            token = lexobj.token
            input = lexobj.input
            lexer = lexobj
            return lexobj

        except ImportError:
            pass

    # Dump some basic debugging information
    if debug:
        debuglog.info('lex: tokens   = %r', linfo.tokens)
        debuglog.info('lex: literals = %r', linfo.literals)
        debuglog.info('lex: states   = %r', linfo.stateinfo)

    # Build a dictionary of valid token names
    lexobj.lextokens = set()
    for n in linfo.tokens:
        lexobj.lextokens.add(n)

    # Get literals specification
    if isinstance(linfo.literals, (list, tuple)):
        lexobj.lexliterals = type(linfo.literals[0])().join(linfo.literals)
    else:
        lexobj.lexliterals = linfo.literals

    lexobj.lextokens_all = lexobj.lextokens | set(lexobj.lexliterals)

    # Get the stateinfo dictionary
    stateinfo = linfo.stateinfo

    regexs = {}
    # Build the master regular expressions
    for state in stateinfo:
        regex_list = []

        # Add rules defined by functions first
        for fname, f in linfo.funcsym[state]:
            line = f.__code__.co_firstlineno
            file = f.__code__.co_filename
            regex_list.append('(?P<%s>%s)' % (fname, _get_regex(f)))
            if debug:
                debuglog.info("lex: Adding rule %s -> '%s' (state '%s')", fname, _get_regex(f), state)

        # Now add all of the simple rules
        for name, r in linfo.strsym[state]:
            regex_list.append('(?P<%s>%s)' % (name, r))
            if debug:
                debuglog.info("lex: Adding rule %s -> '%s' (state '%s')", name, r, state)

        regexs[state] = regex_list

    # Build the master regular expressions

    if debug:
        debuglog.info('lex: ==== MASTER REGEXS FOLLOW ====')

    for state in regexs:
        lexre, re_text, re_names = _form_master_re(regexs[state], reflags, ldict, linfo.toknames)
        lexobj.lexstatere[state] = lexre
        lexobj.lexstateretext[state] = re_text
        lexobj.lexstaterenames[state] = re_names
        if debug:
            for i, text in enumerate(re_text):
                debuglog.info("lex: state '%s' : regex[%d] = '%s'", state, i, text)

    # For inclusive states, we need to add the regular expressions from the INITIAL state
    for state, stype in stateinfo.items():
        if state != 'INITIAL' and stype == 'inclusive':
            lexobj.lexstatere[state].extend(lexobj.lexstatere['INITIAL'])
            lexobj.lexstateretext[state].extend(lexobj.lexstateretext['INITIAL'])
            lexobj.lexstaterenames[state].extend(lexobj.lexstaterenames['INITIAL'])

    lexobj.lexstateinfo = stateinfo
    lexobj.lexre = lexobj.lexstatere['INITIAL']
    lexobj.lexretext = lexobj.lexstateretext['INITIAL']
    lexobj.lexreflags = reflags

    # Set up ignore variables
    lexobj.lexstateignore = linfo.ignore
    lexobj.lexignore = lexobj.lexstateignore.get('INITIAL', '')

    # Set up error functions
    lexobj.lexstateerrorf = linfo.errorf
    lexobj.lexerrorf = linfo.errorf.get('INITIAL', None)
    if not lexobj.lexerrorf:
        errorlog.warning('No t_error rule is defined')

    # Set up eof functions
    lexobj.lexstateeoff = linfo.eoff
    lexobj.lexeoff = linfo.eoff.get('INITIAL', None)

    # Check state information for ignore and error rules
    for s, stype in stateinfo.items():
        if stype == 'exclusive':
            if s not in linfo.errorf:
                errorlog.warning("No error rule is defined for exclusive state '%s'", s)
            if s not in linfo.ignore and lexobj.lexignore:
                errorlog.warning("No ignore rule is defined for exclusive state '%s'", s)
        elif stype == 'inclusive':
            if s not in linfo.errorf:
                linfo.errorf[s] = linfo.errorf.get('INITIAL', None)
            if s not in linfo.ignore:
                linfo.ignore[s] = linfo.ignore.get('INITIAL', '')

    # Create global versions of the token() and input() functions
    token = lexobj.token
    input = lexobj.input
    lexer = lexobj

    # If in optimize mode, we write the lextab
    if lextab and optimize:
        if outputdir is None:
            # If no output directory is set, the location of the output files
            # is determined according to the following rules:
            #     - If lextab specifies a package, files go into that package directory
            #     - Otherwise, files go in the same directory as the specifying module
            if isinstance(lextab, types.ModuleType):
                srcfile = lextab.__file__
            else:
                if '.' not in lextab:
                    srcfile = ldict['__file__']
                else:
                    parts = lextab.split('.')
                    pkgname = '.'.join(parts[:-1])
                    exec('import %s' % pkgname)
                    srcfile = getattr(sys.modules[pkgname], '__file__', '')
            outputdir = os.path.dirname(srcfile)
        try:
            lexobj.writetab(lextab, outputdir)
        except IOError as e:
            errorlog.warning("Couldn't write lextab module %r. %s" % (lextab, e))

    return lexobj

# -----------------------------------------------------------------------------
# runmain()
#
# This runs the lexer as a main program
# -----------------------------------------------------------------------------

def runmain(lexer=None, data=None):
    if not data:
        try:
            filename = sys.argv[1]
            f = open(filename)
            data = f.read()
            f.close()
        except IndexError:
            sys.stdout.write('Reading from standard input (type EOF to end):\n')
            data = sys.stdin.read()

    if lexer:
        _input = lexer.input
    else:
        _input = input
    _input(data)
    if lexer:
        _token = lexer.token
    else:
        _token = token

    while True:
        tok = _token()
        if not tok:
            break
        sys.stdout.write('(%s,%r,%d,%d)\n' % (tok.type, tok.value, tok.lineno, tok.lexpos))

# -----------------------------------------------------------------------------
# @TOKEN(regex)
#
# This decorator function can be used to set the regex expression on a function
# when its docstring might need to be set in an alternative way
# -----------------------------------------------------------------------------
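# Sketch of intended use (the pattern and rule below are illustrative and
# assume 'ID' is listed in the module's tokens):
#
#     digit      = r'([0-9])'
#     nondigit   = r'([_A-Za-z])'
#     identifier = r'(' + nondigit + r'(' + digit + r'|' + nondigit + r')*)'
#
#     @TOKEN(identifier)
#     def t_ID(t):
#         # the regex comes from the decorator rather than the docstring
#         return t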

def TOKEN(r):
    def set_regex(f):
        if hasattr(r, '__call__'):
            f.regex = _get_regex(r)
        else:
            f.regex = r
        return f
    return set_regex

# Alternative spelling of the TOKEN decorator
Token = TOKEN

site-packages/ply/__init__.py000064400000000146147511334660012214 0ustar00# PLY package
# Author: David Beazley (dave@dabeaz.com)

__version__ = '3.9'
__all__ = ['lex','yacc']
site-packages/ply/ctokens.py000064400000006151147511334660012125 0ustar00# ----------------------------------------------------------------------
# ctokens.py
#
# Token specifications for symbols in ANSI C and C++.  This file is
# meant to be used as a library in other tokenizers.
# ----------------------------------------------------------------------
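# One way a consuming tokenizer might use this module (a sketch, not a
# required pattern): star-import the names so the 'tokens' list and t_* rules
# land in that module's namespace, add the pieces that are not provided here,
# and build a lexer:
#
#     from ply.ctokens import *
#     import ply.lex as lex
#
#     # the comment rules in this module return their own token types, so a
#     # consumer that wants to keep them would extend the token list (this is
#     # an assumption about one possible way to handle them)
#     tokens = list(tokens) + ['COMMENT', 'CPPCOMMENT']
#
#     t_ignore = ' \t'
#
#     def t_error(t):
#         t.lexer.skip(1)
#
#     lexer = lex.lex()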

# Reserved words

tokens = [
    # Literals (identifier, integer constant, float constant, string constant, char const)
    'ID', 'TYPEID', 'INTEGER', 'FLOAT', 'STRING', 'CHARACTER',

    # Operators (+,-,*,/,%,|,&,~,^,<<,>>, ||, &&, !, <, <=, >, >=, ==, !=)
    'PLUS', 'MINUS', 'TIMES', 'DIVIDE', 'MODULO',
    'OR', 'AND', 'NOT', 'XOR', 'LSHIFT', 'RSHIFT',
    'LOR', 'LAND', 'LNOT',
    'LT', 'LE', 'GT', 'GE', 'EQ', 'NE',
    
    # Assignment (=, *=, /=, %=, +=, -=, <<=, >>=, &=, ^=, |=)
    'EQUALS', 'TIMESEQUAL', 'DIVEQUAL', 'MODEQUAL', 'PLUSEQUAL', 'MINUSEQUAL',
    'LSHIFTEQUAL','RSHIFTEQUAL', 'ANDEQUAL', 'XOREQUAL', 'OREQUAL',

    # Increment/decrement (++,--)
    'INCREMENT', 'DECREMENT',

    # Structure dereference (->)
    'ARROW',

    # Ternary operator (?)
    'TERNARY',
    
    # Delimiters ( ) [ ] { } , . ; :
    'LPAREN', 'RPAREN',
    'LBRACKET', 'RBRACKET',
    'LBRACE', 'RBRACE',
    'COMMA', 'PERIOD', 'SEMI', 'COLON',

    # Ellipsis (...)
    'ELLIPSIS',
]
    
# Operators
t_PLUS             = r'\+'
t_MINUS            = r'-'
t_TIMES            = r'\*'
t_DIVIDE           = r'/'
t_MODULO           = r'%'
t_OR               = r'\|'
t_AND              = r'&'
t_NOT              = r'~'
t_XOR              = r'\^'
t_LSHIFT           = r'<<'
t_RSHIFT           = r'>>'
t_LOR              = r'\|\|'
t_LAND             = r'&&'
t_LNOT             = r'!'
t_LT               = r'<'
t_GT               = r'>'
t_LE               = r'<='
t_GE               = r'>='
t_EQ               = r'=='
t_NE               = r'!='

# Assignment operators

t_EQUALS           = r'='
t_TIMESEQUAL       = r'\*='
t_DIVEQUAL         = r'/='
t_MODEQUAL         = r'%='
t_PLUSEQUAL        = r'\+='
t_MINUSEQUAL       = r'-='
t_LSHIFTEQUAL      = r'<<='
t_RSHIFTEQUAL      = r'>>='
t_ANDEQUAL         = r'&='
t_OREQUAL          = r'\|='
t_XOREQUAL         = r'\^='

# Increment/decrement
t_INCREMENT        = r'\+\+'
t_DECREMENT        = r'--'

# ->
t_ARROW            = r'->'

# ?
t_TERNARY          = r'\?'

# Delimiters
t_LPAREN           = r'\('
t_RPAREN           = r'\)'
t_LBRACKET         = r'\['
t_RBRACKET         = r'\]'
t_LBRACE           = r'\{'
t_RBRACE           = r'\}'
t_COMMA            = r','
t_PERIOD           = r'\.'
t_SEMI             = r';'
t_COLON            = r':'
t_ELLIPSIS         = r'\.\.\.'

# Identifiers
t_ID = r'[A-Za-z_][A-Za-z0-9_]*'

# Integer literal
t_INTEGER = r'\d+([uU]|[lL]|[uU][lL]|[lL][uU])?'

# Floating literal
t_FLOAT = r'((\d+)(\.\d+)(e(\+|-)?(\d+))? | (\d+)e(\+|-)?(\d+))([lL]|[fF])?'

# String literal
t_STRING = r'\"([^\\\n]|(\\.))*?\"'

# Character constant 'c' or L'c'
t_CHARACTER = r'(L)?\'([^\\\n]|(\\.))*?\''

# Comment (C-Style)
def t_COMMENT(t):
    r'/\*(.|\n)*?\*/'
    t.lexer.lineno += t.value.count('\n')
    return t

# Comment (C++-Style)
def t_CPPCOMMENT(t):
    r'//.*\n'
    t.lexer.lineno += 1
    return t


    



site-packages/_version.py000064400000000025147511334660011471 0ustar00__version__ = '5.0.6'
site-packages/dnf-plugins/debug.py000064400000030425147511334660013170 0ustar00#
# Copyright (C) 2015  Red Hat, Inc.
#
# This copyrighted material is made available to anyone wishing to use,
# modify, copy, or redistribute it subject to the terms and conditions of
# the GNU General Public License v.2, or (at your option) any later version.
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY expressed or implied, including the implied warranties of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General
# Public License for more details.  You should have received a copy of the
# GNU General Public License along with this program; if not, write to the
# Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
# 02110-1301, USA.  Any Red Hat trademarks that are incorporated in the
# source code or documentation are not subject to the GNU General Public
# License and may only be used or replicated with the express permission of
# Red Hat, Inc.
#

from __future__ import absolute_import
from __future__ import unicode_literals

from dnf.i18n import ucd
from dnfpluginscore import _, logger

import dnf
import dnf.cli
import gzip
import hawkey
import os
import rpm
import sys
import time

DEBUG_VERSION = "dnf-debug-dump version 1\n"


class Debug(dnf.Plugin):

    name = 'debug'

    def __init__(self, base, cli):
        super(Debug, self).__init__(base, cli)
        self.base = base
        self.cli = cli
        if self.cli is not None:
            self.cli.register_command(DebugDumpCommand)
            self.cli.register_command(DebugRestoreCommand)


class DebugDumpCommand(dnf.cli.Command):

    aliases = ("debug-dump",)
    summary = _("dump information about installed rpm packages to file")

    def __init__(self, cli):
        super(DebugDumpCommand, self).__init__(cli)
        self.dump_file = None

    def configure(self):
        self.cli.demands.sack_activation = True
        self.cli.demands.available_repos = True

    @staticmethod
    def set_argparser(parser):
        parser.add_argument(
            "--norepos", action="store_true", default=False,
            help=_("do not attempt to dump the repository contents."))
        parser.add_argument(
            "filename", nargs="?",
            help=_("optional name of dump file"))

    def run(self):
        """create debug txt file and compress it, if no filename specified
           use dnf_debug_dump-<timestamp>.txt.gz by default"""

        filename = self.opts.filename
        if not filename:
            now = time.strftime("%Y-%m-%d_%T", time.localtime(time.time()))
            filename = "dnf_debug_dump-%s-%s.txt.gz" % (os.uname()[1], now)

        filename = os.path.abspath(filename)
        if filename.endswith(".gz"):
            self.dump_file = gzip.GzipFile(filename, "w")
        else:
            self.dump_file = open(filename, "w")

        self.write(DEBUG_VERSION)
        self.dump_system_info()
        self.dump_dnf_config_info()
        self.dump_rpm_problems()
        self.dump_packages(not self.opts.norepos)
        self.dump_rpmdb_versions()
        self.dump_file.close()

        print(_("Output written to: %s") % filename)

    def write(self, msg):
        if dnf.pycomp.PY3 and isinstance(self.dump_file, gzip.GzipFile):
            msg = bytes(msg, "utf8")
        dnf.pycomp.write_to_file(self.dump_file, msg)

    def dump_system_info(self):
        self.write("%%%%SYSTEM INFO\n")
        uname = os.uname()
        self.write("  uname: %s, %s\n" % (uname[2], uname[4]))
        self.write("  rpm ver: %s\n" % rpm.__version__)
        self.write("  python ver: %s\n" % sys.version.replace("\n", ""))
        return

    def dump_dnf_config_info(self):
        var = self.base.conf.substitutions
        plugins = ",".join([p.name for p in self.base._plugins.plugins])
        self.write("%%%%DNF INFO\n")
        self.write("  arch: %s\n" % var["arch"])
        self.write("  basearch: %s\n" % var["basearch"])
        self.write("  releasever: %s\n" % var["releasever"])
        self.write("  dnf ver: %s\n" % dnf.const.VERSION)
        self.write("  enabled plugins: %s\n" % plugins)
        self.write("  global excludes: %s\n" % ",".join(self.base.conf.excludepkgs))
        return

    def dump_rpm_problems(self):
        self.write("%%%%RPMDB PROBLEMS\n")
        (missing, conflicts) = rpm_problems(self.base)
        self.write("".join(["Package %s requires %s\n" % (ucd(pkg), ucd(req))
                            for (req, pkg) in missing]))
        self.write("".join(["Package %s conflicts with %s\n" % (ucd(pkg),
                                                                ucd(conf))
                            for (conf, pkg) in conflicts]))

    def dump_packages(self, load_repos):
        q = self.base.sack.query()
        # packages from rpmdb
        self.write("%%%%RPMDB\n")
        for p in sorted(q.installed()):
            self.write("  %s\n" % pkgspec(p))

        if not load_repos:
            return

        self.write("%%%%REPOS\n")
        available = q.available()
        for repo in sorted(self.base.repos.iter_enabled(), key=lambda x: x.id):
            try:
                url = None
                if repo.metalink is not None:
                    url = repo.metalink
                elif repo.mirrorlist is not None:
                    url = repo.mirrorlist
                elif len(repo.baseurl) > 0:
                    url = repo.baseurl[0]
                self.write("%%%s - %s\n" % (repo.id, url))
                self.write("  excludes: %s\n" % ",".join(repo.excludepkgs))
                for po in sorted(available.filter(reponame=repo.id)):
                    self.write("  %s\n" % pkgspec(po))

            except dnf.exceptions.Error as e:
                self.write("Error accessing repo %s: %s\n" % (repo, str(e)))
                continue
        return

    def dump_rpmdb_versions(self):
        self.write("%%%%RPMDB VERSIONS\n")
        version = self.base.sack._rpmdb_version()
        self.write("  all: %s\n" % version)
        return


class DebugRestoreCommand(dnf.cli.Command):

    aliases = ("debug-restore",)
    summary = _("restore packages recorded in debug-dump file")

    def configure(self):
        self.cli.demands.sack_activation = True
        self.cli.demands.available_repos = True
        self.cli.demands.root_user = True
        if not self.opts.output:
            self.cli.demands.resolving = True

    @staticmethod
    def set_argparser(parser):
        parser.add_argument(
            "--output", action="store_true",
            help=_("output commands that would be run to stdout."))
        parser.add_argument(
            "--install-latest", action="store_true",
            help=_("Install the latest version of recorded packages."))
        parser.add_argument(
            "--ignore-arch", action="store_true",
            help=_("Ignore architecture and install missing packages matching "
                   "the name, epoch, version and release."))
        parser.add_argument(
            "--filter-types", metavar="[install, remove, replace]",
            default="install, remove, replace",
            help=_("limit to specified type"))
        parser.add_argument(
            "--remove-installonly", action="store_true",
            help=_('Allow removing of install-only packages. Using this option may '
                   'result in an attempt to remove the running kernel.'))
        parser.add_argument(
            "filename", nargs=1, help=_("name of dump file"))

    def run(self):
        """Execute the command action here."""
        if self.opts.filter_types:
            self.opts.filter_types = set(
                self.opts.filter_types.replace(",", " ").split())

        dump_pkgs = self.read_dump_file(self.opts.filename[0])

        self.process_installed(dump_pkgs, self.opts)

        self.process_dump(dump_pkgs, self.opts)

    def process_installed(self, dump_pkgs, opts):
        installed = self.base.sack.query().installed()
        installonly_pkgs = self.base._get_installonly_query(installed)
        for pkg in installed:
            pkg_remove = False
            spec = pkgspec(pkg)
            dumped_versions = dump_pkgs.get((pkg.name, pkg.arch), None)
            if dumped_versions is not None:
                evr = (pkg.epoch, pkg.version, pkg.release)
                if evr in dumped_versions:
                    # the correct version is already installed
                    dumped_versions[evr] = 'skip'
                else:
                    # other version is currently installed
                    if pkg in installonly_pkgs:
                        # package is install-only, should be removed
                        pkg_remove = True
                    else:
                        # package should be upgraded / downgraded
                        if "replace" in opts.filter_types:
                            action = 'replace'
                        else:
                            action = 'skip'
                        for d_evr in dumped_versions.keys():
                            dumped_versions[d_evr] = action
            else:
                # package should not be installed
                pkg_remove = True
            if pkg_remove and "remove" in opts.filter_types:
                if pkg not in installonly_pkgs or opts.remove_installonly:
                    if opts.output:
                        print("remove    %s" % spec)
                    else:
                        self.base.package_remove(pkg)

    def process_dump(self, dump_pkgs, opts):
        for (n, a) in sorted(dump_pkgs.keys()):
            dumped_versions = dump_pkgs[(n, a)]
            for (e, v, r) in sorted(dumped_versions.keys()):
                action = dumped_versions[(e, v, r)]
                if action == 'skip':
                    continue
                if opts.ignore_arch:
                    arch = ""
                else:
                    arch = "." + a
                if opts.install_latest and action == "install":
                    pkg_spec = "%s%s" % (n, arch)
                else:
                    pkg_spec = pkgtup2spec(n, arch, e, v, r)
                if action in opts.filter_types:
                    if opts.output:
                        print("%s   %s" % (action, pkg_spec))
                    else:
                        try:
                            self.base.install(pkg_spec)
                        except dnf.exceptions.MarkingError:
                            logger.error(_("Package %s is not available"), pkg_spec)

    @staticmethod
    def read_dump_file(filename):
        if filename.endswith(".gz"):
            fobj = gzip.GzipFile(filename)
        else:
            fobj = open(filename)

        if ucd(fobj.readline()) != DEBUG_VERSION:
            logger.error(_("Bad dnf debug file: %s"), filename)
            raise dnf.exceptions.Error

        skip = True
        pkgs = {}
        for line in fobj:
            line = ucd(line)
            if skip:
                if line == "%%%%RPMDB\n":
                    skip = False
                continue

            if not line or line[0] != " ":
                break

            pkg_spec = line.strip()
            nevra = hawkey.split_nevra(pkg_spec)
            # {(name, arch): {(epoch, version, release): action}}
            pkgs.setdefault((nevra.name, nevra.arch), {})[
                (nevra.epoch, nevra.version, nevra.release)] = "install"

        return pkgs


def rpm_problems(base):
    rpmdb = dnf.sack._rpmdb_sack(base)
    allpkgs = rpmdb.query().installed()

    requires = set()
    conflicts = set()
    for pkg in allpkgs:
        requires.update([(req, pkg) for req in pkg.requires
                         if not str(req) == "solvable:prereqmarker"
                         and not str(req).startswith("rpmlib(")])
        conflicts.update([(conf, pkg) for conf in pkg.conflicts])

    missing_requires = [(req, pkg) for (req, pkg) in requires
                        if not allpkgs.filter(provides=req)]
    existing_conflicts = [(conf, pkg) for (conf, pkg) in conflicts
                          if allpkgs.filter(provides=conf)]
    return missing_requires, existing_conflicts


def pkgspec(pkg):
    return pkgtup2spec(pkg.name, pkg.arch, pkg.epoch, pkg.version, pkg.release)


def pkgtup2spec(name, arch, epoch, version, release):
    a = "" if not arch else ".%s" % arch.lstrip('.')
    e = "" if epoch in (None, "") else "%s:" % epoch
    return "%s-%s%s-%s%s" % (name, e, version, release, a)
site-packages/dnf-plugins/debuginfo-install.py

# debuginfo-install.py
# Install the debuginfo of packages and their dependencies to debug this package.
#
# Copyright (C) 2014 Igor Gnatenko
# Copyright (C) 2014-2019 Red Hat
#
# This copyrighted material is made available to anyone wishing to use,
# modify, copy, or redistribute it subject to the terms and conditions of
# the GNU General Public License v.2, or (at your option) any later version.
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY expressed or implied, including the implied warranties of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General
# Public License for more details. You should have received a copy of the
# GNU General Public License along with this program; if not, write to the
# Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
# 02110-1301, USA. Any Red Hat trademarks that are incorporated in the
# source code or documentation are not subject to the GNU General Public
# License and may only be used or replicated with the express permission of
# Red Hat, Inc.
#

from dnfpluginscore import _, logger

import dnf
from dnf.package import Package

class DebuginfoInstall(dnf.Plugin):
    """DNF plugin supplying the 'debuginfo-install' command."""

    name = 'debuginfo-install'

    def __init__(self, base, cli):
        """Initialize the plugin instance."""
        super(DebuginfoInstall, self).__init__(base, cli)
        self.base = base
        self.cli = cli
        if cli is not None:
            cli.register_command(DebuginfoInstallCommand)

    def config(self):
        cp = self.read_config(self.base.conf)
        autoupdate = (cp.has_section('main')
                      and cp.has_option('main', 'autoupdate')
                      and cp.getboolean('main', 'autoupdate'))

        if autoupdate:
            # allow update of already installed debuginfo packages
            dbginfo = dnf.sack._rpmdb_sack(self.base).query().filterm(name__glob="*-debuginfo")
            if len(dbginfo):
                self.base.repos.enable_debug_repos()
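
    # Hedged example of the plugin configuration that config() reads above
    # (the path and values are assumptions, following the usual dnf plugin
    # layout of /etc/dnf/plugins/<name>.conf):
    #
    #     [main]
    #     enabled = 1
    #     autoupdate = 0
    #
    # With autoupdate enabled, the debug repositories are switched on
    # automatically whenever *-debuginfo packages are already installed,
    # so they get updated together with the rest of the system.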

class DebuginfoInstallCommand(dnf.cli.Command):
    """ DebuginfoInstall plugin for DNF """

    aliases = ("debuginfo-install",)
    summary = _('install debuginfo packages')

    def __init__(self, cli):
        super(DebuginfoInstallCommand, self).__init__(cli)

        self.available_debuginfo_missing = set()
        self.available_debugsource_missing = set()
        self.installed_debuginfo_missing = set()
        self.installed_debugsource_missing = set()

    @staticmethod
    def set_argparser(parser):
        parser.add_argument('package', nargs='+')

    def configure(self):
        demands = self.cli.demands
        demands.resolving = True
        demands.root_user = True
        demands.sack_activation = True
        demands.available_repos = True
        self.base.repos.enable_debug_repos()

    def run(self):
        errors_spec = []

        debuginfo_suffix_len = len(Package.DEBUGINFO_SUFFIX)
        debugsource_suffix_len = len(Package.DEBUGSOURCE_SUFFIX)

        for pkgspec in self.opts.package:
            solution = dnf.subject.Subject(pkgspec).get_best_solution(self.base.sack,
                                                                      with_src=False)

            query = solution["query"]
            if not query:
                logger.info(_('No match for argument: %s'), self.base.output.term.bold(pkgspec))
                errors_spec.append(pkgspec)
                continue

            package_dict = query.available()._name_dict()
            # installed versions of packages have priority, replace / add them to the dict
            package_dict.update(query.installed()._name_dict())

            # Remove debuginfo packages if their base packages are in the query.
            # They can get there through globs and they break the installation
            # of debug packages with the same version as the installed base
            # packages. If the base package of a debuginfo package is not in
            # the query, the user specified a debug package on the command
            # line. We don't want to ignore those, so we will install them.
            # But, in this case the version will not be matched to the
            # installed version of the base package, as that would require
            # another query and is further complicated if the user specifies a
            # version themselves etc.
            for name in list(package_dict.keys()):
                if name.endswith(Package.DEBUGINFO_SUFFIX):
                    if name[:-debuginfo_suffix_len] in package_dict:
                        package_dict.pop(name)
                if name.endswith(Package.DEBUGSOURCE_SUFFIX):
                    if name[:-debugsource_suffix_len] in package_dict:
                        package_dict.pop(name)

            # attempt to install debuginfo and debugsource for the highest
            # listed version of the package (in case the package is installed,
            # only the installed version is listed)
            for pkgs in package_dict.values():
                first_pkg = pkgs[0]

                # for packages from system (installed) there can be more
                # packages with different architectures listed and we want to
                # install debuginfo for all of them
                if first_pkg._from_system:
                    # we need to split them by architectures and install the
                    # latest version for each architecture
                    arch_dict = {}

                    for pkg in pkgs:
                        arch_dict.setdefault(pkg.arch, []).append(pkg)

                    for package_arch_list in arch_dict.values():
                        pkg = package_arch_list[0]

                        if not self._install_debug_from_system(pkg.debug_name, pkg):
                            if not self._install_debug_from_system(pkg.source_debug_name, pkg):
                                self.installed_debuginfo_missing.add(str(pkg))

                        if not self._install_debug_from_system(pkg.debugsource_name, pkg):
                            self.installed_debugsource_missing.add(str(pkg))

                    continue

                # if the package in question is -debuginfo or -debugsource, install it directly
                if first_pkg.name.endswith(Package.DEBUGINFO_SUFFIX) \
                        or first_pkg.name.endswith(Package.DEBUGSOURCE_SUFFIX):

                    self._install(pkgs)  # pass all pkgs to the solver, it will pick the best one
                    continue

                # if we have NEVRA parsed from the pkgspec, use it to install the package
                if solution["nevra"] is not None:
                    if not self._install_debug(first_pkg.debug_name, solution["nevra"]):
                        if not self._install_debug(first_pkg.source_debug_name, solution["nevra"]):
                            self.available_debuginfo_missing.add(
                                "{}-{}".format(first_pkg.name, first_pkg.evr))

                    if not self._install_debug(first_pkg.debugsource_name, solution["nevra"]):
                        self.available_debugsource_missing.add(
                            "{}-{}".format(first_pkg.name, first_pkg.evr))

                    continue

                # if we don't have NEVRA from the pkgspec, pass nevras from
                # all packages that were found (while replacing the name with
                # the -debuginfo and -debugsource variant) to the solver, which
                # will pick the correct version and architecture
                if not self._install_debug_no_nevra(first_pkg.debug_name, pkgs):
                    if not self._install_debug_no_nevra(first_pkg.source_debug_name, pkgs):
                        self.available_debuginfo_missing.add(
                            "{}-{}".format(first_pkg.name, first_pkg.evr))

                if not self._install_debug_no_nevra(first_pkg.debugsource_name, pkgs):
                    self.available_debugsource_missing.add(
                        "{}-{}".format(first_pkg.name, first_pkg.evr))

        if self.available_debuginfo_missing:
            logger.info(
                _("Could not find debuginfo package for the following available packages: %s"),
                ", ".join(sorted(self.available_debuginfo_missing)))

        if self.available_debugsource_missing:
            logger.info(
                _("Could not find debugsource package for the following available packages: %s"),
                ", ".join(sorted(self.available_debugsource_missing)))

        if self.installed_debuginfo_missing:
            logger.info(
                _("Could not find debuginfo package for the following installed packages: %s"),
                ", ".join(sorted(self.installed_debuginfo_missing)))

        if self.installed_debugsource_missing:
            logger.info(
                _("Could not find debugsource package for the following installed packages: %s"),
                ", ".join(sorted(self.installed_debugsource_missing)))

        if errors_spec and self.base.conf.strict:
            raise dnf.exceptions.PackagesNotAvailableError(_("Unable to find a match"),
                                                           pkg_spec=' '.join(errors_spec))

    def _install_debug_from_system(self, debug_name, pkg):
        query = self.base.sack.query().filter(name=debug_name,
                                              epoch=pkg.epoch,
                                              version=pkg.version,
                                              release=pkg.release,
                                              arch=pkg.arch)

        if query:
            self._install(query)
            return True

        return False

    def _install_debug(self, debug_name, base_nevra):
        kwargs = {}

        # if some part of EVRA was specified in the argument, add it as a filter
        if base_nevra.epoch is not None:
            kwargs["epoch__glob"] = base_nevra.epoch
        if base_nevra.version is not None:
            kwargs["version__glob"] = base_nevra.version
        if base_nevra.release is not None:
            kwargs["release__glob"] = base_nevra.release
        if base_nevra.arch is not None:
            kwargs["arch__glob"] = base_nevra.arch

        query = self.base.sack.query().filter(name=debug_name, **kwargs)

        if query:
            self._install(query)
            return True

        return False

    def _install_debug_no_nevra(self, debug_name, pkgs):
        query = self.base.sack.query().filterm(
            nevra_strict=["{}-{}.{}".format(debug_name, p.evr, p.arch) for p in pkgs])
        if query:
            self._install(query)
            return True

        return False

    def _install(self, pkgs):
        selector = dnf.selector.Selector(self.base.sack)
        selector.set(pkg=pkgs)
        self.base.goal.install(select=selector, optional=not self.base.conf.strict)
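

# Illustrative invocation (package name is assumed):
#
#   dnf debuginfo-install htop
#
# enables the corresponding *-debug* repositories and installs
# htop-debuginfo (or, as a fallback, the source debuginfo package) plus
# htop-debugsource, matching the installed version of htop when one is
# present and the best available version otherwise.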
site-packages/dnf-plugins/repograph.py

# repograph.py
# DNF plugin adding a command to output a full package dependency graph in dot
# format.
#
# Copyright (C) 2015 Igor Gnatenko
#
# This copyrighted material is made available to anyone wishing to use,
# modify, copy, or redistribute it subject to the terms and conditions of
# the GNU General Public License v.2, or (at your option) any later version.
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY expressed or implied, including the implied warranties of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General
# Public License for more details.  You should have received a copy of the
# GNU General Public License along with this program; if not, write to the
# Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
# 02110-1301, USA.  Any Red Hat trademarks that are incorporated in the
# source code or documentation are not subject to the GNU General Public
# License and may only be used or replicated with the express permission of
# Red Hat, Inc.
#

from __future__ import absolute_import
from __future__ import unicode_literals
from dnfpluginscore import _, logger

import dnf.cli

DOT_HEADER = """
size="20.69,25.52";
ratio="fill";
rankdir="TB";
orientation=port;
node[style="filled"];
"""


class RepoGraph(dnf.Plugin):

    name = "repograph"

    def __init__(self, base, cli):
        super(RepoGraph, self).__init__(base, cli)
        if cli is None:
            return
        cli.register_command(RepoGraphCommand)


class RepoGraphCommand(dnf.cli.Command):
    aliases = ("repograph", "repo-graph",)
    summary = _("Output a full package dependency graph in dot format")

    def configure(self):
        demands = self.cli.demands
        demands.sack_activation = True
        demands.available_repos = True
        if self.opts.repo:
            for repo in self.base.repos.all():
                if repo.id not in self.opts.repo:
                    repo.disable()
                else:
                    repo.enable()

    def run(self):
        self.do_dot(DOT_HEADER)

    def do_dot(self, header):
        maxdeps = 0
        deps = self._get_deps(self.base.sack)

        print("digraph packages {")
        print("{}".format(header))

        for pkg in deps.keys():
            if len(deps[pkg]) > maxdeps:
                maxdeps = len(deps[pkg])

            # color calculations lifted from rpmgraph
            h = 0.5 + (0.6 / 23 * len(deps[pkg]))
            s = h + 0.1
            b = 1.0

            print('"{}" [color="{:.12g} {:.12g} {}"];'.format(pkg, h, s, b))
            print('"{}" -> {{'.format(pkg))
            for req in deps[pkg]:
                print('"{}"'.format(req))
            print('}} [color="{:.12g} {:.12g} {}"];\n'.format(h, s, b))
        print("}")

    @staticmethod
    def _get_deps(sack):
        requires = {}
        prov = {}
        skip = []

        available = sack.query().available()
        for pkg in available:
            xx = {}
            for req in pkg.requires:
                reqname = str(req)
                if reqname in skip:
                    continue
                # XXX: https://bugzilla.redhat.com/show_bug.cgi?id=1186721
                if reqname.startswith("solvable:"):
                    continue
                if reqname in prov:
                    provider = prov[reqname]
                else:
                    provider = available.filter(provides=reqname)
                    if not provider:
                        logger.debug(_("Nothing provides: '%s'"), reqname)
                        skip.append(reqname)
                        continue
                    else:
                        provider = provider[0].name
                    prov[reqname] = provider
                if provider == pkg.name:
                    xx[provider] = None
                if provider in xx or provider in skip:
                    continue
                else:
                    xx[provider] = None
                requires[pkg.name] = xx.keys()
        return requires
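

# Hedged sketch of the output shape (package names and colour values are
# assumed examples):
#
#   digraph packages {
#   size="20.69,25.52";
#   ...
#   "bash" [color="0.552173913043 0.652173913043 1.0"];
#   "bash" -> {
#   "glibc"
#   "ncurses-libs"
#   } [color="0.552173913043 0.652173913043 1.0"];
#   }
#
# which can be piped straight to Graphviz, e.g.:
#
#   dnf repo-graph | dot -Tsvg > deps.svg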
site-packages/dnf-plugins/generate_completion_cache.py

# coding=utf-8
# generate_completion_cache.py - generate cache for dnf bash completion
# Copyright © 2013 Elad Alfassa <elad@fedoraproject.org>
# Copyright (C) 2014-2015 Igor Gnatenko <i.gnatenko.brain@gmail.com>
# Copyright (C) 2015  Red Hat, Inc.

# This copyrighted material is made available to anyone wishing to use,
# modify, copy, or redistribute it subject to the terms and conditions of
# the GNU General Public License v.2, or (at your option) any later version.
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY expressed or implied, including the implied warranties of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General
# Public License for more details.  You should have received a copy of the
# GNU General Public License along with this program; if not, write to the
# Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
# 02110-1301, USA.

from __future__ import absolute_import
from __future__ import unicode_literals
from dnf.i18n import ucd
from dnfpluginscore import logger

import dnf
import os.path
import sqlite3


class BashCompletionCache(dnf.Plugin):
    name = 'generate_completion_cache'

    def __init__(self, base, cli):
        super(BashCompletionCache, self).__init__(base, cli)
        self.base = base
        self.cache_file = "/var/cache/dnf/packages.db"

    @staticmethod
    def _out(msg):
        logger.debug('Completion plugin: %s', msg)

    def sack(self):
        ''' Generate cache of available packages '''
        # Regenerate the cache only if at least one repo has just been
        # refreshed or if the cache file doesn't exist yet.

        fresh = False
        for repo in self.base.repos.iter_enabled():
            if repo.metadata is not None and repo.metadata.fresh:
                # One fresh repo is enough to cause a regen of the cache
                fresh = True
                break

        if not os.path.exists(self.cache_file) or fresh:
            try:
                with sqlite3.connect(self.cache_file) as conn:
                    self._out('Generating completion cache...')
                    cur = conn.cursor()
                    cur.execute(
                        "create table if not exists available (pkg TEXT)")
                    cur.execute(
                        "create unique index if not exists "
                        "pkg_available ON available(pkg)")
                    cur.execute("delete from available")
                    avail_pkgs = self.base.sack.query().available()
                    avail_pkgs_insert = [[str(x)] for x in avail_pkgs if x.arch != "src"]
                    cur.executemany("insert or ignore into available values (?)",
                                    avail_pkgs_insert)
                    conn.commit()
            except sqlite3.OperationalError as e:
                self._out("Can't write completion cache: %s" % ucd(e))

    def transaction(self):
        ''' Generate cache of installed packages '''
        # nothing to do if the transaction is empty
        if not self.base.transaction:
            return

        try:
            with sqlite3.connect(self.cache_file) as conn:
                self._out('Generating completion cache...')
                cur = conn.cursor()
                cur.execute("create table if not exists installed (pkg TEXT)")
                cur.execute(
                    "create unique index if not exists "
                    "pkg_installed ON installed(pkg)")
                cur.execute("delete from installed")
                inst_pkgs = dnf.sack._rpmdb_sack(self.base).query().installed()
                inst_pkgs_insert = [[str(x)] for x in inst_pkgs if x.arch != "src"]
                cur.executemany("insert or ignore into installed values (?)",
                                inst_pkgs_insert)
                conn.commit()
        except sqlite3.OperationalError as e:
            self._out("Can't write completion cache: %s" % ucd(e))
site-packages/dnf-plugins/groups_manager.py

# groups_manager.py
# DNF plugin for managing comps groups metadata files
#
# Copyright (C) 2020 Red Hat, Inc.
#
# This copyrighted material is made available to anyone wishing to use,
# modify, copy, or redistribute it subject to the terms and conditions of
# the GNU General Public License v.2, or (at your option) any later version.
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY expressed or implied, including the implied warranties of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General
# Public License for more details.  You should have received a copy of the
# GNU General Public License along with this program; if not, write to the
# Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
# 02110-1301, USA.  Any Red Hat trademarks that are incorporated in the
# source code or documentation are not subject to the GNU General Public
# License and may only be used or replicated with the express permission of
# Red Hat, Inc.
#

from __future__ import absolute_import
from __future__ import unicode_literals

import argparse
import gzip
import libcomps
import os
import re
import shutil
import tempfile

from dnfpluginscore import _, logger
import dnf
import dnf.cli


RE_GROUP_ID_VALID = '-a-z0-9_.:'
RE_GROUP_ID = re.compile(r'^[{}]+$'.format(RE_GROUP_ID_VALID))
RE_LANG = re.compile(r'^[-a-zA-Z0-9_.@]+$')
COMPS_XML_OPTIONS = {
    'default_explicit': True,
    'uservisible_explicit': True,
    'empty_groups': True}


def group_id_type(value):
    '''group id validator'''
    if not RE_GROUP_ID.match(value):
        raise argparse.ArgumentTypeError(_('Invalid group id'))
    return value


def translation_type(value):
    '''translated texts validator'''
    data = value.split(':', 1)
    if len(data) != 2:
        raise argparse.ArgumentTypeError(
            _("Invalid translated data, should be in form 'lang:text'"))
    lang, text = data
    if not RE_LANG.match(lang):
        raise argparse.ArgumentTypeError(_('Invalid/empty language for translated data'))
    return lang, text


def text_to_id(text):
    '''generate group id based on its name'''
    group_id = text.lower()
    group_id = re.sub('[^{}]'.format(RE_GROUP_ID_VALID), '', group_id)
    if not group_id:
        raise dnf.cli.CliError(
            _("Can't generate group id from '{}'. Please specify group id using --id.").format(
                text))
    return group_id
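

# Illustrative behaviour of text_to_id (the input is an example only):
#   text_to_id("C Development Tools")  ->  "cdevelopmenttools"
# Characters outside [-a-z0-9_.:] (the spaces here) are dropped after
# lower-casing; if nothing valid remains, CliError asks for an explicit --id.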


@dnf.plugin.register_command
class GroupsManagerCommand(dnf.cli.Command):
    aliases = ('groups-manager',)
    summary = _('create and edit groups metadata file')

    def __init__(self, cli):
        super(GroupsManagerCommand, self).__init__(cli)
        self.comps = libcomps.Comps()

    @staticmethod
    def set_argparser(parser):
        # input / output options
        parser.add_argument('--load', action='append', default=[],
                            metavar='COMPS.XML',
                            help=_('load groups metadata from file'))
        parser.add_argument('--save', action='append', default=[],
                            metavar='COMPS.XML',
                            help=_('save groups metadata to file'))
        parser.add_argument('--merge', metavar='COMPS.XML',
                            help=_('load and save groups metadata to file'))
        parser.add_argument('--print', action='store_true', default=False,
                            help=_('print the result metadata to stdout'))
        # group options
        parser.add_argument('--id', type=group_id_type,
                            help=_('group id'))
        parser.add_argument('-n', '--name', help=_('group name'))
        parser.add_argument('--description',
                            help=_('group description'))
        parser.add_argument('--display-order', type=int,
                            help=_('group display order'))
        parser.add_argument('--translated-name', action='append', default=[],
                            metavar='LANG:TEXT', type=translation_type,
                            help=_('translated name for the group'))
        parser.add_argument('--translated-description', action='append', default=[],
                            metavar='LANG:TEXT', type=translation_type,
                            help=_('translated description for the group'))
        visible = parser.add_mutually_exclusive_group()
        visible.add_argument('--user-visible', dest='user_visible', action='store_true',
                             default=None,
                             help=_('make the group user visible (default)'))
        visible.add_argument('--not-user-visible', dest='user_visible', action='store_false',
                             default=None,
                             help=_('make the group user invisible'))

        # package list options
        section = parser.add_mutually_exclusive_group()
        section.add_argument('--mandatory', action='store_true',
                             help=_('add packages to the mandatory section'))
        section.add_argument('--optional', action='store_true',
                             help=_('add packages to the optional section'))
        section.add_argument('--remove', action='store_true', default=False,
                             help=_('remove packages from the group instead of adding them'))
        parser.add_argument('--dependencies', action='store_true',
                            help=_('include also direct dependencies for packages'))

        parser.add_argument("packages", nargs='*', metavar='PACKAGE',
                            help=_('package specification'))

    def configure(self):
        demands = self.cli.demands

        if self.opts.packages:
            demands.sack_activation = True
            demands.available_repos = True
            demands.load_system_repo = False

        # handle --merge option (shortcut to --load and --save the same file)
        if self.opts.merge:
            self.opts.load.insert(0, self.opts.merge)
            self.opts.save.append(self.opts.merge)

        # check that group is specified when editing is attempted
        if (self.opts.description
                or self.opts.display_order
                or self.opts.translated_name
                or self.opts.translated_description
                or self.opts.user_visible is not None
                or self.opts.packages):
            if not self.opts.id and not self.opts.name:
                raise dnf.cli.CliError(
                    _("Can't edit group without specifying it (use --id or --name)"))

    def load_input_files(self):
        """
        Loads all input xml files.
        Returns True if at least one file was successfuly loaded
        """
        for file_name in self.opts.load:
            file_comps = libcomps.Comps()
            try:
                if file_name.endswith('.gz'):
                    # libcomps does not support gzipped files - decompress to temporary
                    # location
                    with gzip.open(file_name) as gz_file:
                        temp_file = tempfile.NamedTemporaryFile(delete=False)
                        try:
                            shutil.copyfileobj(gz_file, temp_file)
                            # close temp_file to ensure the content is flushed to disk
                            temp_file.close()
                            file_comps.fromxml_f(temp_file.name)
                        finally:
                            os.unlink(temp_file.name)
                else:
                    file_comps.fromxml_f(file_name)
            except (IOError, OSError, libcomps.ParserError) as err:
                # gzip module raises OSError on reading from malformed gz file
                # get_last_errors() output often contains duplicate lines, remove them
                seen = set()
                for error in file_comps.get_last_errors():
                    if error in seen:
                        continue
                    logger.error(error.strip())
                    seen.add(error)
                raise dnf.exceptions.Error(
                    _("Can't load file \"{}\": {}").format(file_name, err))
            else:
                self.comps += file_comps

    def save_output_files(self):
        for file_name in self.opts.save:
            try:
                # xml_f returns a list of errors / log entries
                errors = self.comps.xml_f(file_name, xml_options=COMPS_XML_OPTIONS)
            except libcomps.XMLGenError as err:
                errors = [err]
            if errors:
                # xml_f() can report more than one error; log all but the
                # last one and raise the last.
                for err in errors[:-1]:
                    logger.error(err.strip())
                raise dnf.exceptions.Error(_("Can't save file \"{}\": {}").format(
                    file_name, errors[-1].strip()))


    def find_group(self, group_id, name):
        '''
        Try to find group according to command line parameters - first by id
        then by name.
        '''
        group = None
        if group_id:
            for grp in self.comps.groups:
                if grp.id == group_id:
                    group = grp
                    break
        if group is None and name:
            for grp in self.comps.groups:
                if grp.name == name:
                    group = grp
                    break
        return group

    def edit_group(self, group):
        '''
        Set attributes and package lists for selected group
        '''
        def langlist_to_strdict(lst):
            str_dict = libcomps.StrDict()
            for lang, text in lst:
                str_dict[lang] = text
            return str_dict

        # set group attributes
        if self.opts.name:
            group.name = self.opts.name
        if self.opts.description:
            group.desc = self.opts.description
        if self.opts.display_order:
            group.display_order = self.opts.display_order
        if self.opts.user_visible is not None:
            group.uservisible = self.opts.user_visible
        if self.opts.translated_name:
            group.name_by_lang = langlist_to_strdict(self.opts.translated_name)
        if self.opts.translated_description:
            group.desc_by_lang = langlist_to_strdict(self.opts.translated_description)

        # edit packages list
        if self.opts.packages:
            # find packages according to specifications from command line
            packages = set()
            for pkg_spec in self.opts.packages:
                subj = dnf.subject.Subject(pkg_spec)
                q = subj.get_best_query(self.base.sack, with_nevra=True,
                                        with_provides=False, with_filenames=False).latest()
                if not q:
                    logger.warning(_("No match for argument: {}").format(pkg_spec))
                    continue
                packages.update(q)
            if self.opts.dependencies:
                # add packages that provide requirements
                requirements = set()
                for pkg in packages:
                    requirements.update(pkg.requires)
                packages.update(self.base.sack.query().filterm(provides=requirements))

            pkg_names = {pkg.name for pkg in packages}

            if self.opts.remove:
                for pkg_name in pkg_names:
                    for pkg in group.packages_match(name=pkg_name,
                                                    type=libcomps.PACKAGE_TYPE_UNKNOWN):
                        group.packages.remove(pkg)
            else:
                if self.opts.mandatory:
                    pkg_type = libcomps.PACKAGE_TYPE_MANDATORY
                elif self.opts.optional:
                    pkg_type = libcomps.PACKAGE_TYPE_OPTIONAL
                else:
                    pkg_type = libcomps.PACKAGE_TYPE_DEFAULT
                for pkg_name in sorted(pkg_names):
                    if not group.packages_match(name=pkg_name, type=pkg_type):
                        group.packages.append(libcomps.Package(name=pkg_name, type=pkg_type))

    def run(self):
        self.load_input_files()

        if self.opts.id or self.opts.name:
            # we are adding / editing a group
            group = self.find_group(group_id=self.opts.id, name=self.opts.name)
            if group is None:
                # create a new group
                if self.opts.remove:
                    raise dnf.exceptions.Error(_("Can't remove packages from non-existent group"))
                group = libcomps.Group()
                if self.opts.id:
                    group.id = self.opts.id
                    group.name = self.opts.id
                elif self.opts.name:
                    group_id = text_to_id(self.opts.name)
                    if self.find_group(group_id=group_id, name=None):
                        raise dnf.cli.CliError(
                            _("Group id '{}' generated from '{}' is duplicit. "
                              "Please specify group id using --id.").format(
                                  group_id, self.opts.name))
                    group.id = group_id
                self.comps.groups.append(group)
            self.edit_group(group)

        self.save_output_files()
        if self.opts.print or (not self.opts.save):
            print(self.comps.xml_str(xml_options=COMPS_XML_OPTIONS))
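

# Illustrative invocation (file, group and package names are assumed):
#
#   dnf groups-manager --merge comps.xml \
#       --id my-tools -n "My Tools" --mandatory package-a package-b
#
# loads comps.xml, creates or edits the group 'my-tools' with both packages
# in its mandatory section, and writes the result back to the same file.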
site-packages/dnf-plugins/repoclosure.py

# repoclosure.py
# DNF plugin adding a command to display a list of unresolved dependencies
# for repositories.
#
# Copyright (C) 2015 Igor Gnatenko
#
# This copyrighted material is made available to anyone wishing to use,
# modify, copy, or redistribute it subject to the terms and conditions of
# the GNU General Public License v.2, or (at your option) any later version.
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY expressed or implied, including the implied warranties of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General
# Public License for more details.  You should have received a copy of the
# GNU General Public License along with this program; if not, write to the
# Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
# 02110-1301, USA.  Any Red Hat trademarks that are incorporated in the
# source code or documentation are not subject to the GNU General Public
# License and may only be used or replicated with the express permission of
# Red Hat, Inc.
#

from __future__ import absolute_import
from __future__ import unicode_literals
from dnfpluginscore import _

import dnf.cli


class RepoClosure(dnf.Plugin):

    name = "repoclosure"

    def __init__(self, base, cli):
        super(RepoClosure, self).__init__(base, cli)
        if cli is None:
            return
        cli.register_command(RepoClosureCommand)


class RepoClosureCommand(dnf.cli.Command):
    aliases = ("repoclosure",)
    summary = _("Display a list of unresolved dependencies for repositories")

    def configure(self):
        demands = self.cli.demands
        demands.sack_activation = True
        demands.available_repos = True
        if self.opts.repo:
            for repo in self.base.repos.all():
                if repo.id not in self.opts.repo and repo.id not in self.opts.check:
                    repo.disable()
                else:
                    repo.enable()

    def run(self):
        if self.opts.arches:
            unresolved = self._get_unresolved(self.opts.arches)
        else:
            unresolved = self._get_unresolved()
        for pkg in sorted(unresolved.keys()):
            print("package: {} from {}".format(str(pkg), pkg.reponame))
            print("  unresolved deps:")
            for dep in unresolved[pkg]:
                print("    {}".format(dep))
        if len(unresolved) > 0:
            msg = _("Repoclosure ended with unresolved dependencies.")
            raise dnf.exceptions.Error(msg)

    def _get_unresolved(self, arch=None):
        unresolved = {}
        deps = set()

        # We have two sets of packages, available and to_check:
        # * available is the set of packages used to satisfy dependencies
        # * to_check is the set of packages we are checking the dependencies of
        #
        # to_check can be a subset of available if the --arch, --best, --check,
        # --newest, or --pkg options are used
        #
        # --arch:   only packages matching arch are checked
        # --best:   available only contains the latest packages per arch across all repos
        # --check:  only check packages in the specified repo(s)
        # --newest: only consider the latest versions of a package from each repo
        # --pkg:    only check the specified packages
        #
        # Relationship of --best and --newest:
        #
        # Pkg Set   | Neither |  --best             | --newest        | --best and --newest |
        # available | all     | latest in all repos | latest per repo | latest in all repos |
        # to_check  | all     | all                 | latest per repo | latest per repo     |

        if self.opts.newest:
            available = self.base.sack.query().filter(empty=True)
            to_check = self.base.sack.query().filter(empty=True)
            for repo in self.base.repos.iter_enabled():
                available = \
                    available.union(self.base.sack.query().filter(reponame=repo.id).latest())
                to_check = \
                    to_check.union(self.base.sack.query().filter(reponame=repo.id).latest())
        else:
            available = self.base.sack.query().available()
            to_check = self.base.sack.query().available()

        if self.opts.pkglist:
            pkglist_q = self.base.sack.query().filter(empty=True)
            errors = []
            for pkg in self.opts.pkglist:
                subj = dnf.subject.Subject(pkg)
                pkg_q = to_check.intersection(
                    subj.get_best_query(self.base.sack, with_nevra=True,
                                        with_provides=False, with_filenames=False))
                if pkg_q:
                    pkglist_q = pkglist_q.union(pkg_q)
                else:
                    errors.append(pkg)
            if errors:
                raise dnf.exceptions.Error(
                    _('no package matched: %s') % ', '.join(errors))
            to_check = pkglist_q

        if self.opts.check:
            to_check.filterm(reponame=self.opts.check)

        if arch is not None:
            to_check.filterm(arch=arch)

        if self.base.conf.best:
            available.filterm(latest_per_arch=True)

        available.apply()
        to_check.apply()

        for pkg in to_check:
            unresolved[pkg] = set()
            for req in pkg.requires:
                reqname = str(req)
                # XXX: https://bugzilla.redhat.com/show_bug.cgi?id=1186721
                if reqname.startswith("solvable:") or \
                        reqname.startswith("rpmlib("):
                    continue
                deps.add(req)
                unresolved[pkg].add(req)

        unresolved_deps = set(x for x in deps if not available.filter(provides=x))

        unresolved_transition = {k: set(x for x in v if x in unresolved_deps)
                                 for k, v in unresolved.items()}
        return {k: v for k, v in unresolved_transition.items() if v}

    @staticmethod
    def set_argparser(parser):
        parser.add_argument("--arch", default=[], action="append", dest='arches',
                            help=_("check packages of the given archs, can be "
                                   "specified multiple times"))
        parser.add_argument("--check", default=[], action="append",
                            help=_("Specify repositories to check"))
        parser.add_argument("-n", "--newest", action="store_true",
                            help=_("Check only the newest packages in the "
                                   "repos"))
        parser.add_argument("--pkg", default=[], action="append",
                            help=_("Check closure for this package only"),
                            dest="pkglist")
site-packages/dnf-plugins/download.py

# download.py, supplies the 'download' command.
#
# Copyright (C) 2013-2015  Red Hat, Inc.
#
# This copyrighted material is made available to anyone wishing to use,
# modify, copy, or redistribute it subject to the terms and conditions of
# the GNU General Public License v.2, or (at your option) any later version.
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY expressed or implied, including the implied warranties of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General
# Public License for more details.  You should have received a copy of the
# GNU General Public License along with this program; if not, write to the
# Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
# 02110-1301, USA.  Any Red Hat trademarks that are incorporated in the
# source code or documentation are not subject to the GNU General Public
# License and may only be used or replicated with the express permission of
# Red Hat, Inc.
#

from __future__ import absolute_import
from __future__ import unicode_literals
from dnfpluginscore import _, logger
from dnf.cli.option_parser import OptionParser

import dnf
import dnf.cli
import dnf.exceptions
import dnf.i18n
import dnf.subject
import dnf.util
import hawkey
import itertools
import os
import shutil


@dnf.plugin.register_command
class DownloadCommand(dnf.cli.Command):

    aliases = ['download']
    summary = _('Download package to current directory')

    def __init__(self, cli):
        super(DownloadCommand, self).__init__(cli)
        self.opts = None
        self.parser = None

    @staticmethod
    def set_argparser(parser):
        parser.add_argument('packages', nargs='+',
                            help=_('packages to download'))
        parser.add_argument("--source", action='store_true',
                            help=_('download the src.rpm instead'))
        parser.add_argument("--debuginfo", action='store_true',
                            help=_('download the -debuginfo package instead'))
        parser.add_argument("--debugsource", action='store_true',
                            help=_('download the -debugsource package instead'))
        parser.add_argument("--arch", '--archlist', dest='arches', default=[],
                            action=OptionParser._SplitCallback, metavar='[arch]',
                            help=_("limit  the  query to packages of given architectures."))
        parser.add_argument('--resolve', action='store_true',
                            help=_('resolve and download needed dependencies'))
        parser.add_argument('--alldeps', action='store_true',
                            help=_('when running with --resolve, download all dependencies '
                                   '(do not exclude already installed ones)'))
        parser.add_argument('--url', '--urls', action='store_true', dest='url',
                            help=_('print list of urls where the rpms '
                                   'can be downloaded instead of downloading'))
        parser.add_argument('--urlprotocols', action='append',
                            choices=['http', 'https', 'rsync', 'ftp'],
                            default=[],
                            help=_('when running with --url, '
                                   'limit to specific protocols'))

    def configure(self):
        # setup sack and populate it with enabled repos
        demands = self.cli.demands
        demands.sack_activation = True
        demands.available_repos = True
        if self.opts.resolve and self.opts.alldeps:
            demands.load_system_repo = False

        if self.opts.source:
            self.base.repos.enable_source_repos()

        if self.opts.debuginfo or self.opts.debugsource:
            self.base.repos.enable_debug_repos()

        if self.opts.destdir:
            self.base.conf.destdir = self.opts.destdir
        else:
            self.base.conf.destdir = dnf.i18n.ucd(os.getcwd())
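
    # Illustrative invocations (package name is assumed):
    #
    #   dnf download htop                              # download to the current directory
    #   dnf download --resolve --alldeps htop          # plus all of its dependencies
    #   dnf download --source htop                     # the matching src.rpm instead
    #   dnf download --url --urlprotocols=https htop   # only print https URLs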

    def run(self):
        """Execute the util action here."""

        if (not self.opts.source
                and not self.opts.debuginfo
                and not self.opts.debugsource):
            pkgs = self._get_pkg_objs_rpms(self.opts.packages)
        else:
            pkgs = []
            if self.opts.source:
                pkgs.extend(self._get_pkg_objs_source(self.opts.packages))

            if self.opts.debuginfo:
                pkgs.extend(self._get_pkg_objs_debuginfo(self.opts.packages))

            if self.opts.debugsource:
                pkgs.extend(self._get_pkg_objs_debugsource(self.opts.packages))

        # If user asked for just urls then print them and we're done
        if self.opts.url:
            for pkg in pkgs:
                # command line repo packages do not have .remote_location
                if pkg.repoid != hawkey.CMDLINE_REPO_NAME:
                    url = pkg.remote_location(schemes=self.opts.urlprotocols)
                    if url:
                        print(url)
                    else:
                        msg = _("Failed to get mirror for package: %s") % pkg.name
                        if self.base.conf.strict:
                            raise dnf.exceptions.Error(msg)
                        logger.warning(msg)
            return
        else:
            self._do_downloads(pkgs)  # download rpms

    def _do_downloads(self, pkgs):
        """
        Perform the download for a list of packages
        """
        pkg_dict = {}
        for pkg in pkgs:
            pkg_dict.setdefault(str(pkg), []).append(pkg)

        to_download = []
        cmdline = []
        for pkg_list in pkg_dict.values():
            pkgs_cmdline = [pkg for pkg in pkg_list
                            if pkg.repoid == hawkey.CMDLINE_REPO_NAME]
            if pkgs_cmdline:
                cmdline.append(pkgs_cmdline[0])
                continue
            pkg_list.sort(key=lambda x: (x.repo.priority, x.repo.cost))
            to_download.append(pkg_list[0])
        if to_download:
            self.base.download_packages(to_download, self.base.output.progress)
        if cmdline:
            # command line repo packages are either local files or already downloaded urls
            # just copy them to the destination
            for pkg in cmdline:
                # python<3.4 shutil module does not raise SameFileError, check manually
                src = pkg.localPkg()
                dst = os.path.join(self.base.conf.destdir, os.path.basename(src))
                if os.path.exists(dst) and os.path.samefile(src, dst):
                    continue
                shutil.copy(src, self.base.conf.destdir)
        locations = sorted([pkg.localPkg() for pkg in to_download + cmdline])
        return locations
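
    # A minimal sketch of the candidate selection performed above, assuming two
    # hypothetical repos "fast" (priority=90, cost=500) and "slow" (priority=99,
    # cost=1000) both provide foo-1.0-1.x86_64:
    #
    #     pkg_list.sort(key=lambda x: (x.repo.priority, x.repo.cost))
    #     pkg_list[0]   # -> the copy from "fast"
    #
    # Tuples sort ascending, so the repo with the lowest priority value and then
    # the lowest cost wins; command line packages bypass this and are copied as-is.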

    def _get_pkg_objs_rpms(self, pkg_specs):
        """
        Return a list of dnf.Package objects that represent the rpms
        to download.
        """
        if self.opts.resolve:
            pkgs = self._get_packages_with_deps(pkg_specs)
        else:
            pkgs = self._get_packages(pkg_specs)
        return pkgs

    def _get_pkg_objs_source(self, pkg_specs):
        """
        Return a list of dnf.Package objects that represent the source
        rpms to download.
        """
        pkgs = self._get_pkg_objs_rpms(pkg_specs)
        source_pkgs = self._get_source_packages(pkgs)
        pkgs = set(self._get_packages(source_pkgs, source=True))
        return pkgs

    def _get_pkg_objs_debuginfo(self, pkg_specs):
        """
        Return a list of dnf.Package objects that represent the debuginfo
        rpms to download.
        """
        dbg_pkgs = set()
        q = self.base.sack.query().available()

        for pkg in self._get_packages(pkg_specs):
            for dbg_name in [pkg.debug_name, pkg.source_debug_name]:
                dbg_available = q.filter(
                    name=dbg_name,
                    epoch=int(pkg.epoch),
                    version=pkg.version,
                    release=pkg.release,
                    arch=pkg.arch
                )

                if not dbg_available:
                    continue

                for p in dbg_available:
                    dbg_pkgs.add(p)

                break

        return dbg_pkgs

    def _get_pkg_objs_debugsource(self, pkg_specs):
        """
        Return a list of dnf.Package objects that represent the debugsource
        rpms to download.
        """
        dbg_pkgs = set()
        q = self.base.sack.query().available()

        for pkg in self._get_packages(pkg_specs):
            dbg_available = q.filter(
                name=pkg.debugsource_name,
                epoch=int(pkg.epoch),
                version=pkg.version,
                release=pkg.release,
                arch=pkg.arch
            )

            for p in dbg_available:
                dbg_pkgs.add(p)

        return dbg_pkgs

    def _get_packages(self, pkg_specs, source=False):
        """Get packages matching pkg_specs."""
        func = self._get_query_source if source else self._get_query
        queries = []
        for pkg_spec in pkg_specs:
            try:
                queries.append(func(pkg_spec))
            except dnf.exceptions.PackageNotFoundError as e:
                logger.error(dnf.i18n.ucd(e))
                if self.base.conf.strict:
                    logger.error(_("Exiting due to strict setting."))
                    raise dnf.exceptions.Error(e)

        pkgs = list(itertools.chain(*queries))
        return pkgs

    def _get_packages_with_deps(self, pkg_specs, source=False):
        """Get packages matching pkg_specs and the deps."""
        pkgs = self._get_packages(pkg_specs)
        pkg_set = set(pkgs)
        for pkg in pkgs:
            goal = hawkey.Goal(self.base.sack)
            goal.install(pkg)
            rc = goal.run()
            if rc:
                pkg_set.update(goal.list_installs())
                pkg_set.update(goal.list_upgrades())
            else:
                msg = [_('Error in resolve of packages:')]
                logger.error("\n    ".join(msg + [str(pkg) for pkg in pkgs]))
                logger.error(dnf.util._format_resolve_problems(goal.problem_rules()))
                raise dnf.exceptions.Error()
        return pkg_set

    @staticmethod
    def _get_source_packages(pkgs):
        """Get list of source rpm names for a list of packages."""
        source_pkgs = set()
        for pkg in pkgs:
            if pkg.sourcerpm:
                source_pkgs.add(pkg.sourcerpm)
                logger.debug('  --> Package : %s Source : %s',
                             str(pkg), pkg.sourcerpm)
            elif pkg.arch == 'src':
                source_pkgs.add("%s-%s.src.rpm" % (pkg.name, pkg.evr))
            else:
                logger.info(_("No source rpm defined for %s"), str(pkg))
        return list(source_pkgs)

    def _get_query(self, pkg_spec):
        """Return a query to match a pkg_spec."""
        schemes = dnf.pycomp.urlparse.urlparse(pkg_spec)[0]
        is_url = schemes and schemes in ('http', 'ftp', 'file', 'https')
        if is_url or (pkg_spec.endswith('.rpm') and os.path.isfile(pkg_spec)):
            pkgs = self.base.add_remote_rpms([pkg_spec], progress=self.base.output.progress)
            return self.base.sack.query().filterm(pkg=pkgs)
        subj = dnf.subject.Subject(pkg_spec)
        q = subj.get_best_query(self.base.sack, with_src=self.opts.source)
        q = q.available()
        q = q.filterm(latest_per_arch_by_priority=True)
        if self.opts.arches:
            q = q.filter(arch=self.opts.arches)
        if len(q.run()) == 0:
            msg = _("No package %s available.") % (pkg_spec)
            raise dnf.exceptions.PackageNotFoundError(msg)
        return q

    def _get_query_source(self, pkg_spec):
        """Return a query to match a source rpm file name."""
        pkg_spec = pkg_spec[:-4]  # skip the .rpm
        subj = dnf.subject.Subject(pkg_spec)
        for nevra_obj in subj.get_nevra_possibilities():
            tmp_query = nevra_obj.to_query(self.base.sack).available()
            if tmp_query:
                return tmp_query.latest()

        msg = _("No package %s available.") % (pkg_spec)
        raise dnf.exceptions.PackageNotFoundError(msg)
site-packages/dnf-plugins/system_upgrade.py
# -*- coding: utf-8 -*-
#
# Copyright (c) 2015-2020 Red Hat, Inc.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License along
# with this program.  If not, see <http://www.gnu.org/licenses/>.
#
# Author(s): Will Woods <wwoods@redhat.com>

"""system_upgrade.py - DNF plugin to handle major-version system upgrades."""

from subprocess import call, Popen, check_output, CalledProcessError
import json
import os
import os.path
import re
import sys
import uuid

from systemd import journal

from dnfpluginscore import _, logger

import dnf
import dnf.cli
from dnf.cli import CliError
from dnf.i18n import ucd
import dnf.transaction
from dnf.transaction_sr import serialize_transaction, TransactionReplay

import libdnf.conf
import libdnf.transaction


# Translators: This string is only used in unit tests.
_("the color of the sky")

DOWNLOAD_FINISHED_ID = uuid.UUID('9348174c5cc74001a71ef26bd79d302e')
REBOOT_REQUESTED_ID = uuid.UUID('fef1cc509d5047268b83a3a553f54b43')
UPGRADE_STARTED_ID = uuid.UUID('3e0a5636d16b4ca4bbe5321d06c6aa62')
UPGRADE_FINISHED_ID = uuid.UUID('8cec00a1566f4d3594f116450395f06c')

ID_TO_IDENTIFY_BOOTS = UPGRADE_STARTED_ID

PLYMOUTH = '/usr/bin/plymouth'

RELEASEVER_MSG = _(
    "Need a --releasever greater than the current system version.")
DOWNLOAD_FINISHED_MSG = _(  # Translators: do not change "reboot" here
    "Download complete! Use 'dnf {command} reboot' to start the upgrade.\n"
    "To remove cached metadata and transaction use 'dnf {command} clean'")
CANT_RESET_RELEASEVER = _(
    "Sorry, you need to use 'download --releasever' instead of '--network'")

STATE_VERSION = 2

# --- Miscellaneous helper functions ------------------------------------------


def reboot():
    if os.getenv("DNF_SYSTEM_UPGRADE_NO_REBOOT", default=False):
        logger.info(_("Reboot turned off, not rebooting."))
    else:
        Popen(["systemctl", "reboot"])


def get_url_from_os_release():
    key = "UPGRADE_GUIDE_URL="
    for path in ["/etc/os-release", "/usr/lib/os-release"]:
        try:
            with open(path) as release_file:
                for line in release_file:
                    line = line.strip()
                    if line.startswith(key):
                        return line[len(key):].strip('"')
        except IOError:
            continue
    return None
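
# A minimal sketch of the lookup above, assuming a hypothetical os-release entry:
#
#     UPGRADE_GUIDE_URL="https://docs.example.org/upgrade"
#
# line.startswith(key) matches it and line[len(key):].strip('"') yields
# 'https://docs.example.org/upgrade'; if neither file defines the key the
# function returns None.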


# DNF-FIXME: dnf.util.clear_dir() doesn't delete regular files :/
def clear_dir(path, ignore=[]):
    if not os.path.isdir(path):
        return

    for entry in os.listdir(path):
        fullpath = os.path.join(path, entry)
        if fullpath in ignore:
            continue
        try:
            if os.path.isdir(fullpath):
                dnf.util.rm_rf(fullpath)
            else:
                os.unlink(fullpath)
        except OSError:
            pass


def check_release_ver(conf, target=None):
    if dnf.rpm.detect_releasever(conf.installroot) == conf.releasever:
        raise CliError(RELEASEVER_MSG)
    if target and target != conf.releasever:
        # it's too late to set releasever here, so this can't work.
        # (see https://bugzilla.redhat.com/show_bug.cgi?id=1212341)
        raise CliError(CANT_RESET_RELEASEVER)


def disable_blanking():
    try:
        tty = open('/dev/tty0', 'wb')
        tty.write(b'\33[9;0]')
    except Exception as e:
        print(_("Screen blanking can't be disabled: %s") % e)

# --- State object - for tracking upgrade state between runs ------------------


# DNF-INTEGRATION-NOTE: basically the same thing as dnf.persistor.JSONDB
class State(object):
    def __init__(self, statefile):
        self.statefile = statefile
        self._data = {}
        self._read()

    def _read(self):
        try:
            with open(self.statefile) as fp:
                self._data = json.load(fp)
        except IOError:
            self._data = {}
        except ValueError:
            self._data = {}
            logger.warning(_("Failed loading state file: %s, continuing with "
                             "empty state."), self.statefile)

    def write(self):
        dnf.util.ensure_dir(os.path.dirname(self.statefile))
        with open(self.statefile, 'w') as outf:
            json.dump(self._data, outf, indent=4, sort_keys=True)

    def clear(self):
        if os.path.exists(self.statefile):
            os.unlink(self.statefile)
        self._read()

    def __enter__(self):
        return self

    def __exit__(self, exc_type, exc_value, traceback):
        if exc_type is None:
            self.write()

    # helper function for creating properties. pylint: disable=protected-access
    def _prop(option):  # pylint: disable=no-self-argument
        def setprop(self, value):
            self._data[option] = value

        def getprop(self):
            return self._data.get(option)
        return property(getprop, setprop)

    #  !!! Increase STATE_VERSION for any changes in data structure like a new property or a new
    #  data structure !!!
    state_version = _prop("state_version")
    download_status = _prop("download_status")
    destdir = _prop("destdir")
    target_releasever = _prop("target_releasever")
    system_releasever = _prop("system_releasever")
    gpgcheck = _prop("gpgcheck")
    # list of repos with gpgcheck=True
    gpgcheck_repos = _prop("gpgcheck_repos")
    # list of repos with repo_gpgcheck=True
    repo_gpgcheck_repos = _prop("repo_gpgcheck_repos")
    upgrade_status = _prop("upgrade_status")
    upgrade_command = _prop("upgrade_command")
    distro_sync = _prop("distro_sync")
    enable_disable_repos = _prop("enable_disable_repos")
    module_platform_id = _prop("module_platform_id")
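
# A minimal sketch of how the _prop helper above behaves, assuming a writable,
# hypothetical state file path:
#
#     state = State('/tmp/example-upgrade-state.json')
#     with state:
#         state.download_status = 'complete'   # stored as _data['download_status']
#     # __exit__ saw no exception, so write() dumped _data to the JSON file
#     state.download_status                    # -> 'complete'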

# --- Plymouth output helpers -------------------------------------------------


class PlymouthOutput(object):
    """A plymouth output helper class.

    Filters duplicate calls, and stops calling the plymouth binary if we
    fail to contact it.
    """

    def __init__(self):
        self.alive = True
        self._last_args = dict()
        self._last_msg = None

    def _plymouth(self, cmd, *args):
        dupe_cmd = (args == self._last_args.get(cmd))
        if (self.alive and not dupe_cmd) or cmd == '--ping':
            try:
                self.alive = (call((PLYMOUTH, cmd) + args) == 0)
            except OSError:
                self.alive = False
            self._last_args[cmd] = args
        return self.alive

    def ping(self):
        return self._plymouth("--ping")

    def message(self, msg):
        if self._last_msg and self._last_msg != msg:
            self._plymouth("hide-message", "--text", self._last_msg)
        self._last_msg = msg
        return self._plymouth("display-message", "--text", msg)

    def set_mode(self):
        mode = 'updates'
        try:
            s = check_output([PLYMOUTH, '--help'])
            if re.search('--system-upgrade', ucd(s)):
                mode = 'system-upgrade'
        except (CalledProcessError, OSError):
            pass
        return self._plymouth("change-mode", "--" + mode)

    def progress(self, percent):
        return self._plymouth("system-update", "--progress", str(percent))


# A single PlymouthOutput instance for us to use within this module
Plymouth = PlymouthOutput()
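
# A minimal sketch of the de-duplication above (message text hypothetical):
#
#     Plymouth.message("Upgrading...")   # runs: plymouth display-message --text Upgrading...
#     Plymouth.message("Upgrading...")   # dupe_cmd is True, the binary is not called again
#     Plymouth.ping()                    # --ping always goes through and refreshes self.alive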


# A TransactionProgress class that updates plymouth for us.
class PlymouthTransactionProgress(dnf.callback.TransactionProgress):

    # pylint: disable=too-many-arguments
    def progress(self, package, action, ti_done, ti_total, ts_done, ts_total):
        self._update_plymouth(package, action, ts_done, ts_total)

    def _update_plymouth(self, package, action, current, total):
        # Prevents quick jumps of progressbar when pretrans scriptlets
        # and TRANS_PREPARATION are reported as 1/1
        if total == 1:
            return
        # Verification goes through all the packages again,
        # which resets the "current" param value, this prevents
        # resetting of the progress bar as well. (Rhbug:1809096)
        if action != dnf.callback.PKG_VERIFY:
            Plymouth.progress(int(90.0 * current / total))
        else:
            Plymouth.progress(90 + int(10.0 * current / total))

        Plymouth.message(self._fmt_event(package, action, current, total))

    def _fmt_event(self, package, action, current, total):
        action = dnf.transaction.ACTIONS.get(action, action)
        return "[%d/%d] %s %s..." % (current, total, action, package)

# --- journal helpers -------------------------------------------------


def find_boots(message_id):
    """Find all boots with this message id.

    Returns the entries of all found boots.
    """
    j = journal.Reader()
    j.add_match(MESSAGE_ID=message_id.hex,  # identify the message
                _UID=0)                     # prevent spoofing of logs

    oldboot = None
    for entry in j:
        boot = entry['_BOOT_ID']
        if boot == oldboot:
            continue
        oldboot = boot
        yield entry


def list_logs():
    print(_('The following boots appear to contain upgrade logs:'))
    n = -1
    for n, entry in enumerate(find_boots(ID_TO_IDENTIFY_BOOTS)):
        print('{} / {.hex}: {:%Y-%m-%d %H:%M:%S} {}→{}'.format(
            n + 1,
            entry['_BOOT_ID'],
            entry['__REALTIME_TIMESTAMP'],
            entry.get('SYSTEM_RELEASEVER', '??'),
            entry.get('TARGET_RELEASEVER', '??')))
    if n == -1:
        print(_('-- no logs were found --'))


def pick_boot(message_id, n):
    boots = list(find_boots(message_id))
    # Positive indices index all found boots starting with 1 and going forward,
    # zero is the current boot, and -1, -2, -3 are previous going backwards.
    # This is the same as journalctl.
    try:
        if n == 0:
            raise IndexError
        if n > 0:
            n -= 1
        return boots[n]['_BOOT_ID']
    except IndexError:
        raise CliError(_("Cannot find logs with this index."))


def show_log(n):
    boot_id = pick_boot(ID_TO_IDENTIFY_BOOTS, n)
    process = Popen(['journalctl', '--boot', boot_id.hex])
    process.wait()
    rc = process.returncode
    if rc == 1:
        raise dnf.exceptions.Error(_("Unable to match systemd journal entry"))


CMDS = ['download', 'clean', 'reboot', 'upgrade', 'log']

# --- The actual Plugin and Command objects! ----------------------------------


class SystemUpgradePlugin(dnf.Plugin):
    name = 'system-upgrade'

    def __init__(self, base, cli):
        super(SystemUpgradePlugin, self).__init__(base, cli)
        if cli:
            cli.register_command(SystemUpgradeCommand)
            cli.register_command(OfflineUpgradeCommand)
            cli.register_command(OfflineDistrosyncCommand)


class SystemUpgradeCommand(dnf.cli.Command):
    aliases = ('system-upgrade', 'fedup',)
    summary = _("Prepare system for upgrade to a new release")

    DATADIR = 'var/lib/dnf/system-upgrade'

    def __init__(self, cli):
        super(SystemUpgradeCommand, self).__init__(cli)
        self.datadir = os.path.join(cli.base.conf.installroot, self.DATADIR)
        self.transaction_file = os.path.join(self.datadir, 'system-upgrade-transaction.json')
        self.magic_symlink = os.path.join(cli.base.conf.installroot, 'system-update')

        self.state = State(os.path.join(self.datadir, 'system-upgrade-state.json'))

    @staticmethod
    def set_argparser(parser):
        parser.add_argument("--no-downgrade", dest='distro_sync',
                            action='store_false',
                            help=_("keep installed packages if the new "
                                   "release's version is older"))
        parser.add_argument('tid', nargs=1, choices=CMDS,
                            metavar="[%s]" % "|".join(CMDS))
        parser.add_argument('--number', type=int, help=_('which logs to show'))

    def log_status(self, message, message_id):
        """Log directly to the journal."""
        journal.send(message,
                     MESSAGE_ID=message_id,
                     PRIORITY=journal.LOG_NOTICE,
                     SYSTEM_RELEASEVER=self.state.system_releasever,
                     TARGET_RELEASEVER=self.state.target_releasever,
                     DNF_VERSION=dnf.const.VERSION)

    def pre_configure(self):
        self._call_sub("check")
        self._call_sub("pre_configure")

    def configure(self):
        self._call_sub("configure")

    def run(self):
        self._call_sub("run")

    def run_transaction(self):
        self._call_sub("transaction")

    def run_resolved(self):
        self._call_sub("resolved")

    def _call_sub(self, name):
        subfunc = getattr(self, name + '_' + self.opts.tid[0], None)
        if callable(subfunc):
            subfunc()

    def _check_state_version(self, command):
        if self.state.state_version != STATE_VERSION:
            msg = _("Incompatible version of data. Rerun 'dnf {command} download [OPTIONS]'"
                    "").format(command=command)
            raise CliError(msg)

    def _set_cachedir(self):
        # set download directories from json state file
        self.base.conf.cachedir = self.datadir
        self.base.conf.destdir = self.state.destdir if self.state.destdir else None

    def _get_forward_reverse_pkg_reason_pairs(self):
        """
        forward = {repoid:{pkg_nevra: {tsi.action: tsi.reason}}
        reverse = {pkg_nevra: {tsi.action: tsi.reason}}
        :return: forward, reverse
        """
        backward_action = set(dnf.transaction.BACKWARD_ACTIONS + [libdnf.transaction.TransactionItemAction_REINSTALLED])
        forward_actions = set(dnf.transaction.FORWARD_ACTIONS)

        forward = {}
        reverse = {}
        for tsi in self.cli.base.transaction:
            if tsi.action in forward_actions:
                pkg = tsi.pkg
                forward.setdefault(pkg.repo.id, {}).setdefault(
                    str(pkg), {})[tsi.action] = tsi.reason
            elif tsi.action in backward_action:
                reverse.setdefault(str(tsi.pkg), {})[tsi.action] = tsi.reason
        return forward, reverse
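
    # A minimal sketch of the shape returned above, with purely hypothetical
    # nevra, action and reason values:
    #
    #     forward == {'updates': {'foo-2.0-1.x86_64': {ACTION_UPGRADE: REASON_USER}}}
    #     reverse == {'foo-1.0-1.x86_64': {ACTION_UPGRADED: REASON_USER}}
    #
    # i.e. forward entries are keyed by repo id and then nevra, reverse entries
    # by nevra only.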

    # == pre_configure_*: set up action-specific demands ==========================
    def pre_configure_download(self):
        # only download subcommand accepts --destdir command line option
        self.base.conf.cachedir = self.datadir
        self.base.conf.destdir = self.opts.destdir if self.opts.destdir else None
        if 'offline-distrosync' == self.opts.command and not self.opts.distro_sync:
            raise CliError(
                _("Command 'offline-distrosync' cannot be used with --no-downgrade option"))
        elif 'offline-upgrade' == self.opts.command:
            self.opts.distro_sync = False

    def pre_configure_reboot(self):
        self._set_cachedir()

    def pre_configure_upgrade(self):
        self._set_cachedir()
        if self.state.enable_disable_repos:
            self.opts.repos_ed = self.state.enable_disable_repos
        self.base.conf.releasever = self.state.target_releasever

    def pre_configure_clean(self):
        self._set_cachedir()

    # == configure_*: set up action-specific demands ==========================

    def configure_download(self):
        if 'system-upgrade' == self.opts.command or 'fedup' == self.opts.command:
            logger.warning(_('WARNING: this operation is not supported on the RHEL distribution. '
                             'Proceed at your own risk.'))
            help_url = get_url_from_os_release()
            if help_url:
                msg = _('Additional information for System Upgrade: {}')
                logger.info(msg.format(ucd(help_url)))
            if self.base._promptWanted():
                msg = _('Before you continue ensure that your system is fully upgraded by running '
                        '"dnf --refresh upgrade". Do you want to continue')
                if self.base.conf.assumeno or not self.base.output.userconfirm(
                        msg='{} [y/N]: '.format(msg), defaultyes_msg='{} [Y/n]: '.format(msg)):
                    logger.error(_("Operation aborted."))
                    sys.exit(1)
            check_release_ver(self.base.conf, target=self.opts.releasever)
        elif 'offline-upgrade' == self.opts.command:
            self.cli._populate_update_security_filter(self.opts)

        self.cli.demands.root_user = True
        self.cli.demands.resolving = True
        self.cli.demands.available_repos = True
        self.cli.demands.sack_activation = True
        self.cli.demands.freshest_metadata = True
        # We want to do the depsolve / download / transaction-test, but *not*
        # run the actual RPM transaction to install the downloaded packages.
        # Setting the "test" flag makes the RPM transaction a test transaction,
        # so nothing actually gets installed.
        # (It also means that we run two test transactions in a row, which is
        # kind of silly, but that's something for DNF to fix...)
        self.base.conf.tsflags += ["test"]

    def configure_reboot(self):
        # FUTURE: add a --debug-shell option to enable debug shell:
        # systemctl add-wants system-update.target debug-shell.service
        self.cli.demands.root_user = True

    def configure_upgrade(self):
        # same as the download, but offline and non-interactive. so...
        self.cli.demands.root_user = True
        self.cli.demands.resolving = True
        self.cli.demands.available_repos = True
        self.cli.demands.sack_activation = True
        # use the saved value for --allowerasing, etc.
        self.opts.distro_sync = self.state.distro_sync
        if self.state.gpgcheck is not None:
            self.base.conf.gpgcheck = self.state.gpgcheck
        if self.state.gpgcheck_repos is not None:
            for repo in self.base.repos.values():
                repo.gpgcheck = repo.id in self.state.gpgcheck_repos
        if self.state.repo_gpgcheck_repos is not None:
            for repo in self.base.repos.values():
                repo.repo_gpgcheck = repo.id in self.state.repo_gpgcheck_repos
        self.base.conf.module_platform_id = self.state.module_platform_id
        # don't try to get new metadata, 'cuz we're offline
        self.cli.demands.cacheonly = True
        # and don't ask any questions (we confirmed all this beforehand)
        self.base.conf.assumeyes = True
        self.cli.demands.transaction_display = PlymouthTransactionProgress()
        # The upgrade operation already removes all elements that must be removed. Additional
        # removals could trigger unwanted changes in the transaction.
        self.base.conf.clean_requirements_on_remove = False
        self.base.conf.install_weak_deps = False

    def configure_clean(self):
        self.cli.demands.root_user = True

    def configure_log(self):
        pass

    # == check_*: do any action-specific checks ===============================

    def check_reboot(self):
        if not self.state.download_status == 'complete':
            raise CliError(_("system is not ready for upgrade"))
        self._check_state_version(self.opts.command)
        if self.state.upgrade_command != self.opts.command:
            msg = _("the transaction was not prepared for '{command}'. "
                    "Rerun 'dnf {command} download [OPTIONS]'").format(command=self.opts.command)
            raise CliError(msg)
        if os.path.lexists(self.magic_symlink):
            raise CliError(_("upgrade is already scheduled"))
        dnf.util.ensure_dir(self.datadir)
        # FUTURE: checkRPMDBStatus(self.state.download_transaction_id)

    def check_upgrade(self):
        if not os.path.lexists(self.magic_symlink):
            logger.info(_("trigger file does not exist. exiting quietly."))
            raise SystemExit(0)
        if os.readlink(self.magic_symlink) != self.datadir:
            logger.info(_("another upgrade tool is running. exiting quietly."))
            raise SystemExit(0)
        # Delete symlink ASAP to avoid reboot loops
        dnf.yum.misc.unlink_f(self.magic_symlink)
        command = self.state.upgrade_command
        if not command:
            command = self.opts.command
        self._check_state_version(command)
        if not self.state.upgrade_status == 'ready':
            msg = _("use 'dnf {command} reboot' to begin the upgrade").format(command=command)
            raise CliError(msg)

    # == run_*: run the action/prep the transaction ===========================

    def run_prepare(self):
        # make the magic symlink
        os.symlink(self.datadir, self.magic_symlink)
        # set upgrade_status so that the upgrade can run
        with self.state as state:
            state.upgrade_status = 'ready'

    def run_reboot(self):
        self.run_prepare()

        if not self.opts.tid[0] == "reboot":
            return

        self.log_status(_("Rebooting to perform upgrade."),
                        REBOOT_REQUESTED_ID)
        reboot()

    def run_download(self):
        # Mark everything in the world for upgrade/sync
        if self.opts.distro_sync:
            self.base.distro_sync()
        else:
            self.base.upgrade_all()

        if self.opts.command not in ['offline-upgrade', 'offline-distrosync']:
            # Mark all installed groups and environments for upgrade
            self.base.read_comps()
            installed_groups = [g.id for g in self.base.comps.groups if self.base.history.group.get(g.id)]
            if installed_groups:
                self.base.env_group_upgrade(installed_groups)
            installed_environments = [g.id for g in self.base.comps.environments if self.base.history.env.get(g.id)]
            if installed_environments:
                self.base.env_group_upgrade(installed_environments)

        with self.state as state:
            state.download_status = 'downloading'
            state.target_releasever = self.base.conf.releasever
            state.destdir = self.base.conf.destdir

    def run_upgrade(self):
        # change the upgrade status (so we can detect crashed upgrades later)
        command = ''
        with self.state as state:
            state.upgrade_status = 'incomplete'
            command = state.upgrade_command
        if command == 'offline-upgrade':
            msg = _("Starting offline upgrade. This will take a while.")
        elif command == 'offline-distrosync':
            msg = _("Starting offline distrosync. This will take a while.")
        else:
            msg = _("Starting system upgrade. This will take a while.")

        self.log_status(msg, UPGRADE_STARTED_ID)

        # reset the splash mode and let the user know we're running
        Plymouth.set_mode()
        Plymouth.progress(0)
        Plymouth.message(msg)

        # disable screen blanking
        disable_blanking()

        self.replay = TransactionReplay(self.base, self.transaction_file)
        self.replay.run()

    def run_clean(self):
        logger.info(_("Cleaning up downloaded data..."))
        # Don't delete persistor, it contains paths for downloaded packages
        # that are used by dnf during finalizing base to clean them up
        clear_dir(self.base.conf.cachedir,
                  [dnf.persistor.TempfilePersistor(self.base.conf.cachedir).db_path])
        with self.state as state:
            state.download_status = None
            state.state_version = None
            state.upgrade_status = None
            state.upgrade_command = None
            state.destdir = None

    def run_log(self):
        if self.opts.number:
            show_log(self.opts.number)
        else:
            list_logs()

    # == resolved_*: do stuff after a successful resolve ======================

    def resolved_upgrade(self):
        """Adjust transaction reasons according to stored values"""
        self.replay.post_transaction()

    # == transaction_*: do stuff after a successful transaction ===============

    def transaction_download(self):
        transaction = self.base.history.get_current()

        if not transaction.packages():
            logger.info(_("The system-upgrade transaction is empty, your system is already up-to-date."))
            return

        data = serialize_transaction(transaction)
        try:
            with open(self.transaction_file, "w") as f:
                json.dump(data, f, indent=4, sort_keys=True)
                f.write("\n")

            print(_("Transaction saved to {}.").format(self.transaction_file))

        except OSError as e:
            raise dnf.cli.CliError(_('Error storing transaction: {}').format(str(e)))

        # Okay! Write out the state so the upgrade can use it.
        system_ver = dnf.rpm.detect_releasever(self.base.conf.installroot)
        with self.state as state:
            state.download_status = 'complete'
            state.state_version = STATE_VERSION
            state.distro_sync = self.opts.distro_sync
            state.gpgcheck = self.base.conf.gpgcheck
            state.gpgcheck_repos = [
                repo.id for repo in self.base.repos.values() if repo.gpgcheck]
            state.repo_gpgcheck_repos = [
                repo.id for repo in self.base.repos.values() if repo.repo_gpgcheck]
            state.system_releasever = system_ver
            state.target_releasever = self.base.conf.releasever
            state.module_platform_id = self.base.conf.module_platform_id
            state.enable_disable_repos = self.opts.repos_ed
            state.destdir = self.base.conf.destdir
            state.upgrade_command = self.opts.command

        msg = DOWNLOAD_FINISHED_MSG.format(command=self.opts.command)
        logger.info(msg)
        self.log_status(_("Download finished."), DOWNLOAD_FINISHED_ID)

    def transaction_upgrade(self):
        Plymouth.message(_("Upgrade complete! Cleaning up and rebooting..."))
        self.log_status(_("Upgrade complete! Cleaning up and rebooting..."),
                        UPGRADE_FINISHED_ID)
        self.run_clean()
        if self.opts.tid[0] == "upgrade":
            reboot()


class OfflineUpgradeCommand(SystemUpgradeCommand):
    aliases = ('offline-upgrade',)
    summary = _("Prepare offline upgrade of the system")


class OfflineDistrosyncCommand(SystemUpgradeCommand):
    aliases = ('offline-distrosync',)
    summary = _("Prepare offline distrosync of the system")
site-packages/dnf-plugins/changelog.py
# changelog.py
# DNF plugin adding a command changelog.
#
# Copyright (C) 2014 Red Hat, Inc.
#
# This copyrighted material is made available to anyone wishing to use,
# modify, copy, or redistribute it subject to the terms and conditions of
# the GNU General Public License v.2, or (at your option) any later version.
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY expressed or implied, including the implied warranties of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General
# Public License for more details.  You should have received a copy of the
# GNU General Public License along with this program; if not, write to the
# Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
# 02110-1301, USA.  Any Red Hat trademarks that are incorporated in the
# source code or documentation are not subject to the GNU General Public
# License and may only be used or replicated with the express permission of
# Red Hat, Inc.
#

from __future__ import absolute_import
from __future__ import unicode_literals

import argparse
import collections
import dateutil.parser

from dnfpluginscore import _, P_, logger
import dnf
import dnf.cli


def validate_date(val):
    try:
        return dateutil.parser.parse(val, fuzzy=True)
    except (ValueError, TypeError, OverflowError):
        raise argparse.ArgumentTypeError(_('Not a valid date: "{0}".').format(val))
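
# A minimal sketch of the validator above (inputs hypothetical):
#
#     validate_date('2020-01-31')    # -> datetime.datetime(2020, 1, 31, 0, 0)
#     validate_date('no date here')  # -> argparse.ArgumentTypeError
#
# fuzzy=True lets dateutil pick a date out of surrounding text, which is why the
# --since help below recommends the unambiguous YYYY-MM-DD form.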


@dnf.plugin.register_command
class ChangelogCommand(dnf.cli.Command):
    aliases = ('changelog',)
    summary = _('Show changelog data of packages')

    @staticmethod
    def set_argparser(parser):
        filter_group = parser.add_mutually_exclusive_group()
        filter_group.add_argument(
            '--since', metavar="DATE", default=None,
            type=validate_date,
            help=_('show changelog entries since DATE. To avoid ambiguity, '
                   'YYYY-MM-DD format is recommended.'))
        filter_group.add_argument(
            '--count', default=None, type=int,
            help=_('show given number of changelog entries per package'))
        filter_group.add_argument(
            '--upgrades', default=False, action='store_true',
            help=_('show only new changelog entries for packages that provide an '
                   'upgrade for already installed packages.'))
        parser.add_argument("package", nargs='*', metavar=_('PACKAGE'))

    def configure(self):
        demands = self.cli.demands
        demands.available_repos = True
        demands.sack_activation = True
        demands.changelogs = True

    def query(self):
        q = self.base.sack.query()
        if self.opts.package:
            q.filterm(empty=True)
            for pkg in self.opts.package:
                pkg_q = dnf.subject.Subject(pkg, ignore_case=True).get_best_query(
                    self.base.sack, with_nevra=True,
                    with_provides=False, with_filenames=False)
                if self.opts.repo:
                    pkg_q.filterm(reponame=self.opts.repo)
                if pkg_q:
                    q = q.union(pkg_q.latest())
                else:
                    logger.info(_('No match for argument: %s') % pkg)
        elif self.opts.repo:
            q.filterm(reponame=self.opts.repo)
        if self.opts.upgrades:
            q = q.upgrades()
        else:
            q = q.available()
        return q

    def by_srpm(self, packages):
        by_srpm = collections.OrderedDict()
        for pkg in sorted(packages):
            by_srpm.setdefault((pkg.source_name or pkg.name, pkg.evr), []).append(pkg)
        return by_srpm

    def filter_changelogs(self, package):
        if self.opts.upgrades:
            return self.base.latest_changelogs(package)
        elif self.opts.count:
            return package.changelogs[:self.opts.count]
        elif self.opts.since:
            return [chlog for chlog in package.changelogs
                    if chlog['timestamp'] >= self.opts.since.date()]
        else:
            return package.changelogs

    def run(self):
        if self.opts.since:
            logger.info(_('Listing changelogs since {}').format(self.opts.since))
        elif self.opts.count:
            logger.info(P_('Listing only latest changelog',
                           'Listing {} latest changelogs',
                           self.opts.count).format(self.opts.count))
        elif self.opts.upgrades:
            logger.info(
                _('Listing only new changelogs since installed version of the package'))
        else:
            logger.info(_('Listing all changelogs'))

        by_srpm = self.by_srpm(self.query())
        for name in by_srpm:
            print(_('Changelogs for {}').format(
                ', '.join(sorted({str(pkg) for pkg in by_srpm[name]}))))
            for chlog in self.filter_changelogs(by_srpm[name][0]):
                print(self.base.format_changelog(chlog))
site-packages/dnf-plugins/repodiff.py
# repodiff.py
# DNF plugin adding a command to show differences between two sets
# of repositories.
#
# Copyright (C) 2018 Red Hat, Inc.
#
# This copyrighted material is made available to anyone wishing to use,
# modify, copy, or redistribute it subject to the terms and conditions of
# the GNU General Public License v.2, or (at your option) any later version.
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY expressed or implied, including the implied warranties of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General
# Public License for more details.  You should have received a copy of the
# GNU General Public License along with this program; if not, write to the
# Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
# 02110-1301, USA.  Any Red Hat trademarks that are incorporated in the
# source code or documentation are not subject to the GNU General Public
# License and may only be used or replicated with the express permission of
# Red Hat, Inc.
#

from __future__ import absolute_import
from __future__ import unicode_literals

import dnf.cli
from dnf.cli.option_parser import OptionParser
import hawkey

from dnfpluginscore import _


class RepoDiff(dnf.Plugin):

    name = "repodiff"

    def __init__(self, base, cli):
        super(RepoDiff, self).__init__(base, cli)
        if cli is None:
            return
        cli.register_command(RepoDiffCommand)


class RepoDiffCommand(dnf.cli.Command):
    aliases = ("repodiff",)
    summary = _("List differences between two sets of repositories")

    @staticmethod
    def set_argparser(parser):
        # We would like to use --old and --new options as Yum did, but the
        # ability to disable abbreviated long options was only added in
        # Python >= 3.5. Because .parse_args is run twice - first for the main
        # arguments and then for the command arguments - command options must
        # not be prefixes of the main options (e.g. --new would be treated as
        # --newpackage).
        # https://stackoverflow.com/questions/33900846
        parser.add_argument("--repo-old", "-o", default=[], action="append", dest="old",
                            help=_("Specify old repository, can be used multiple times"))
        parser.add_argument("--repo-new", "-n", default=[], action="append", dest="new",
                            help=_("Specify new repository, can be used multiple times"))
        parser.add_argument("--arch", "--archlist", "-a", default=[],
                            action=OptionParser._SplitCallback, dest="arches",
                            help=_("Specify architectures to compare, can be used "
                                   "multiple times. By default, only source rpms are "
                                   "compared."))
        parser.add_argument("--size", "-s", action="store_true",
                            help=_("Output additional data about the size of the changes."))
        parser.add_argument("--compare-arch", action="store_true",
                            help=_("Compare packages also by arch. By default "
                                   "packages are compared just by name."))
        parser.add_argument("--simple", action="store_true",
                            help=_("Output a simple one line message for modified packages."))
        parser.add_argument("--downgrade", action="store_true",
                            help=_("Split the data for modified packages between "
                                   "upgraded and downgraded packages."))

    def configure(self):
        demands = self.cli.demands
        demands.sack_activation = True
        demands.available_repos = True
        demands.changelogs = True
        self.base.conf.disable_excludes = ["all"]
        # TODO yum was able to handle mirrorlist in --new/--old arguments
        # Can be resolved by improving --repofrompath option
        if not self.opts.new or not self.opts.old:
            msg = _("Both old and new repositories must be set.")
            raise dnf.exceptions.Error(msg)
        for repo in self.base.repos.all():
            if repo.id in self.opts.new + self.opts.old:
                repo.enable()
            else:
                repo.disable()
        if not self.opts.arches:
            self.opts.arches = ['src']

    def _pkgkey(self, pkg):
        if self.opts.compare_arch:
            return (pkg.name, pkg.arch)
        return pkg.name

    def _repodiff(self, old, new):
        '''Compare package sets old and new and return a dictionary of packages:
        added: only in the new set
        removed: only in the old set
        upgraded: in both old and new, the new package has a higher evr
        downgraded: in both old and new, the new package has a lower evr
        obsoletes: dictionary mapping each obsoleted old package to the new package that obsoletes it
        '''
        old_d = {self._pkgkey(p): p for p in old}
        old_keys = set(old_d.keys())
        new_d = {self._pkgkey(p): p for p in new}
        new_keys = set(new_d.keys())

        # mapping obsoleted_package_from_old: obsoleted_by_package_from_new
        obsoletes = dict()
        for obsoleter in new.filter(obsoletes=old):
            for obsoleted in old.filter(provides=obsoleter.obsoletes):
                obsoletes[self._pkgkey(obsoleted)] = obsoleter

        evr_cmp = self.base.sack.evr_cmp
        repodiff = dict(
            added=[new_d[k] for k in new_keys - old_keys],
            removed=[old_d[k] for k in old_keys - new_keys],
            obsoletes=obsoletes,
            upgraded=[],
            downgraded=[])
        for k in old_keys.intersection(new_keys):
            pkg_old = old_d[k]
            pkg_new = new_d[k]
            if pkg_old.evr == pkg_new.evr:
                continue
            if evr_cmp(pkg_old.evr, pkg_new.evr) > 0:
                repodiff['downgraded'].append((pkg_old, pkg_new))
            else:
                repodiff['upgraded'].append((pkg_old, pkg_new))

        return repodiff
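
    # A worked example of the diff above with hypothetical package sets
    # (--compare-arch off, so keys are plain package names):
    #
    #     old: foo-1.0-1, bar-1.0-1        new: foo-1.1-1, baz-1.0-1
    #
    #     added      -> [baz-1.0-1]                 (key only in new)
    #     removed    -> [bar-1.0-1]                 (key only in old)
    #     upgraded   -> [(foo-1.0-1, foo-1.1-1)]    (same key, higher evr in new)
    #     downgraded -> []
    #     obsoletes  -> {'bar': baz-1.0-1} if baz declares "Obsoletes: bar"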

    def _report(self, repodiff):
        def pkgstr(pkg):
            if self.opts.compare_arch:
                return str(pkg)
            return "%s-%s" % (pkg.name, pkg.evr)

        def sizestr(num):
            msg = str(num)
            if num > 0:
                msg += " ({})".format(dnf.cli.format.format_number(num).strip())
            elif num < 0:
                msg += " (-{})".format(dnf.cli.format.format_number(-num).strip())
            return msg

        def report_modified(pkg_old, pkg_new):
            msgs = []
            if self.opts.simple:
                msgs.append("%s -> %s" % (pkgstr(pkg_old), pkgstr(pkg_new)))
            else:
                msgs.append('')
                msgs.append("%s -> %s" % (pkgstr(pkg_old), pkgstr(pkg_new)))
                msgs.append('-' * len(msgs[-1]))
                if pkg_old.changelogs:
                    old_chlog = pkg_old.changelogs[0]
                else:
                    old_chlog = None
                for chlog in pkg_new.changelogs:
                    if old_chlog:
                        if chlog['timestamp'] < old_chlog['timestamp']:
                            break
                        elif (chlog['timestamp'] == old_chlog['timestamp'] and
                              chlog['author'] == old_chlog['author'] and
                              chlog['text'] == old_chlog['text']):
                            break
                    msgs.append('* %s %s\n%s' % (
                        chlog['timestamp'].strftime("%a %b %d %Y"),
                        dnf.i18n.ucd(chlog['author']),
                        dnf.i18n.ucd(chlog['text'])))
                if self.opts.size:
                    msgs.append(_("Size change: {} bytes").format(
                        pkg_new.size - pkg_old.size))
            print('\n'.join(msgs))

        sizes = dict(added=0, removed=0, upgraded=0, downgraded=0)
        for pkg in sorted(repodiff['added']):
            print(_("Added package  : {}").format(pkgstr(pkg)))
            sizes['added'] += pkg.size
        for pkg in sorted(repodiff['removed']):
            print(_("Removed package: {}").format(pkgstr(pkg)))
            obsoletedby = repodiff['obsoletes'].get(self._pkgkey(pkg))
            if obsoletedby:
                print(_("Obsoleted by   : {}").format(pkgstr(obsoletedby)))
            sizes['removed'] += pkg.size

        if self.opts.downgrade:
            if repodiff['upgraded']:
                print(_("\nUpgraded packages"))
                for (pkg_old, pkg_new) in sorted(repodiff['upgraded']):
                    sizes['upgraded'] += (pkg_new.size - pkg_old.size)
                    report_modified(pkg_old, pkg_new)
            if repodiff['downgraded']:
                print(_("\nDowngraded packages"))
                for (pkg_old, pkg_new) in sorted(repodiff['downgraded']):
                    sizes['downgraded'] += (pkg_new.size - pkg_old.size)
                    report_modified(pkg_old, pkg_new)
        else:
            modified = repodiff['upgraded'] + repodiff['downgraded']
            if modified:
                print(_("\nModified packages"))
                for (pkg_old, pkg_new) in sorted(modified):
                    sizes['upgraded'] += (pkg_new.size - pkg_old.size)
                    report_modified(pkg_old, pkg_new)

        print(_("\nSummary"))
        print(_("Added packages: {}").format(len(repodiff['added'])))
        print(_("Removed packages: {}").format(len(repodiff['removed'])))
        if self.opts.downgrade:
            print(_("Upgraded packages: {}").format(len(repodiff['upgraded'])))
            print(_("Downgraded packages: {}").format(len(repodiff['downgraded'])))
        else:
            print(_("Modified packages: {}").format(
                len(repodiff['upgraded']) + len(repodiff['downgraded'])))
        if self.opts.size:
            print(_("Size of added packages: {}").format(sizestr(sizes['added'])))
            print(_("Size of removed packages: {}").format(sizestr(sizes['removed'])))
            if not self.opts.downgrade:
                print(_("Size of modified packages: {}").format(
                    sizestr(sizes['upgraded'] + sizes['downgraded'])))
            else:
                print(_("Size of upgraded packages: {}").format(
                    sizestr(sizes['upgraded'])))
                print(_("Size of downgraded packages: {}").format(
                    sizestr(sizes['downgraded'])))
            print(_("Size change: {}").format(
                sizestr(sizes['added'] + sizes['upgraded'] + sizes['downgraded'] -
                        sizes['removed'])))

    def run(self):
        # prepare the old and new package sets based on the given arguments
        q_new = self.base.sack.query(hawkey.IGNORE_EXCLUDES).filter(
            reponame=self.opts.new)
        q_old = self.base.sack.query(hawkey.IGNORE_EXCLUDES).filter(
            reponame=self.opts.old)
        if self.opts.arches and '*' not in self.opts.arches:
            q_new.filterm(arch=self.opts.arches)
            q_old.filterm(arch=self.opts.arches)
        if self.opts.compare_arch:
            q_new.filterm(latest_per_arch=1)
            q_old.filterm(latest_per_arch=1)
        else:
            q_new.filterm(latest=1)
            q_old.filterm(latest=1)
        q_new.apply()
        q_old.apply()

        self._report(self._repodiff(q_old, q_new))
site-packages/dnf-plugins/reposync.py
# reposync.py
# DNF plugin adding a command to download all packages from given remote repo.
#
# Copyright (C) 2014 Red Hat, Inc.
#
# This copyrighted material is made available to anyone wishing to use,
# modify, copy, or redistribute it subject to the terms and conditions of
# the GNU General Public License v.2, or (at your option) any later version.
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY expressed or implied, including the implied warranties of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General
# Public License for more details.  You should have received a copy of the
# GNU General Public License along with this program; if not, write to the
# Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
# 02110-1301, USA.  Any Red Hat trademarks that are incorporated in the
# source code or documentation are not subject to the GNU General Public
# License and may only be used or replicated with the express permission of
# Red Hat, Inc.
#

from __future__ import absolute_import
from __future__ import unicode_literals

import hawkey
import os
import shutil
import types

from dnfpluginscore import _, logger
from dnf.cli.option_parser import OptionParser
import dnf
import dnf.cli


def _pkgdir(intermediate, target):
    cwd = dnf.i18n.ucd(os.getcwd())
    return os.path.realpath(os.path.join(cwd, intermediate, target))
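
# A minimal sketch of the path handling above (cwd /srv/mirror is hypothetical):
#
#     _pkgdir('./downloads', 'fedora')   # -> '/srv/mirror/downloads/fedora'
#     _pkgdir('/var/cache', 'updates')   # -> '/var/cache/updates'
#
# os.path.join drops the cwd component when intermediate is absolute, and
# realpath collapses any '.' or '..' segments and resolves symlinks.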


class RPMPayloadLocation(dnf.repo.RPMPayload):
    def __init__(self, pkg, progress, pkg_location):
        super(RPMPayloadLocation, self).__init__(pkg, progress)
        self.package_dir = os.path.dirname(pkg_location)

    def _target_params(self):
        tp = super(RPMPayloadLocation, self)._target_params()
        dnf.util.ensure_dir(self.package_dir)
        tp['dest'] = self.package_dir
        return tp


@dnf.plugin.register_command
class RepoSyncCommand(dnf.cli.Command):
    aliases = ('reposync',)
    summary = _('download all packages from remote repo')

    def __init__(self, cli):
        super(RepoSyncCommand, self).__init__(cli)

    @staticmethod
    def set_argparser(parser):
        parser.add_argument('-a', '--arch', dest='arches', default=[],
                            action=OptionParser._SplitCallback, metavar='[arch]',
                            help=_('download only packages for this ARCH'))
        parser.add_argument('--delete', default=False, action='store_true',
                            help=_('delete local packages no longer present in repository'))
        parser.add_argument('--download-metadata', default=False, action='store_true',
                            help=_('download all the metadata.'))
        parser.add_argument('-g', '--gpgcheck', default=False, action='store_true',
                            help=_('Remove packages that fail GPG signature checking '
                                   'after downloading'))
        parser.add_argument('-m', '--downloadcomps', default=False, action='store_true',
                            help=_('also download and uncompress comps.xml'))
        parser.add_argument('--metadata-path',
                            help=_('where to store downloaded repository metadata. '
                                   'Defaults to the value of --download-path.'))
        parser.add_argument('-n', '--newest-only', default=False, action='store_true',
                            help=_('download only newest packages per-repo'))
        parser.add_argument('--norepopath', default=False, action='store_true',
                            help=_("Don't add the reponame to the download path."))
        parser.add_argument('-p', '--download-path', default='./',
                            help=_('where to store downloaded repositories'))
        parser.add_argument('--remote-time', default=False, action='store_true',
                            help=_('try to set local timestamps of local files by '
                                   'the one on the server'))
        parser.add_argument('--source', default=False, action='store_true',
                            help=_('download only source packages'))
        parser.add_argument('-u', '--urls', default=False, action='store_true',
                            help=_("Just list urls of what would be downloaded, "
                                   "don't download"))

    def configure(self):
        demands = self.cli.demands
        demands.available_repos = True
        demands.sack_activation = True

        repos = self.base.repos

        if self.opts.repo:
            repos.all().disable()
            for repoid in self.opts.repo:
                try:
                    repo = repos[repoid]
                except KeyError:
                    raise dnf.cli.CliError("Unknown repo: '%s'." % repoid)
                repo.enable()

        if self.opts.source:
            repos.enable_source_repos()

        if len(list(repos.iter_enabled())) > 1 and self.opts.norepopath:
            raise dnf.cli.CliError(
                _("Can't use --norepopath with multiple repositories"))

        for repo in repos.iter_enabled():
            repo._repo.expire()
            repo.deltarpm = False

    def run(self):
        self.base.conf.keepcache = True
        gpgcheck_ok = True
        for repo in self.base.repos.iter_enabled():
            if self.opts.remote_time:
                repo._repo.setPreserveRemoteTime(True)
            if self.opts.download_metadata:
                if self.opts.urls:
                    for md_type, md_location in repo._repo.getMetadataLocations():
                        url = repo.remote_location(md_location)
                        if url:
                            print(url)
                        else:
                            msg = _("Failed to get mirror for metadata: %s") % md_type
                            logger.warning(msg)
                else:
                    self.download_metadata(repo)
            if self.opts.downloadcomps:
                if self.opts.urls:
                    mdl = dict(repo._repo.getMetadataLocations())
                    group_locations = [mdl[md_type]
                                       for md_type in ('group', 'group_gz', 'group_gz_zck')
                                       if md_type in mdl]
                    if group_locations:
                        for group_location in group_locations:
                            url = repo.remote_location(group_location)
                            if url:
                                print(url)
                                break
                        else:
                            msg = _("Failed to get mirror for the group file.")
                            logger.warning(msg)
                else:
                    self.getcomps(repo)
            pkglist = self.get_pkglist(repo)
            if self.opts.urls:
                self.print_urls(pkglist)
            else:
                self.download_packages(pkglist)
                if self.opts.gpgcheck:
                    for pkg in pkglist:
                        local_path = self.pkg_download_path(pkg)
                        # base.package_signature_check uses pkg.localPkg() to determine
                        # the location of the package rpm file on the disk.
                        # Set it to the correct download path.
                        pkg.localPkg = types.MethodType(
                            lambda s, local_path=local_path: local_path, pkg)
                        result, error = self.base.package_signature_check(pkg)
                        if result != 0:
                            logger.warning(_("Removing {}: {}").format(
                                os.path.basename(local_path), error))
                            os.unlink(local_path)
                            gpgcheck_ok = False
            if self.opts.delete:
                self.delete_old_local_packages(repo, pkglist)
        if not gpgcheck_ok:
            raise dnf.exceptions.Error(_("GPG signature check failed."))

    def repo_target(self, repo):
        return _pkgdir(self.opts.destdir or self.opts.download_path,
                       repo.id if not self.opts.norepopath else '')

    def metadata_target(self, repo):
        if self.opts.metadata_path:
            return _pkgdir(self.opts.metadata_path, repo.id)
        else:
            return self.repo_target(repo)

    def pkg_download_path(self, pkg):
        repo_target = self.repo_target(pkg.repo)
        pkg_download_path = os.path.realpath(
            os.path.join(repo_target, pkg.location))
        # join() ensures repo_target ends with a path separator (otherwise the
        # check would pass if pkg_download_path was a "sibling" path component
        # of repo_target that has the same prefix).
        if not pkg_download_path.startswith(os.path.join(repo_target, '')):
            raise dnf.exceptions.Error(
                _("Download target '{}' is outside of download path '{}'.").format(
                    pkg_download_path, repo_target))
        return pkg_download_path
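    # Worked example of the containment check above (hypothetical values):
    #   repo_target  = '/srv/mirror/updates'
    #   pkg.location = 'Packages/f/foo-1.0-1.noarch.rpm'
    #       -> realpath stays under '/srv/mirror/updates/' and is accepted
    #   pkg.location = '../../etc/evil.rpm'
    #       -> realpath escapes the target directory and dnf.exceptions.Error is raised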

    def delete_old_local_packages(self, repo, pkglist):
        # delete any *.rpm file under the target path that was not downloaded from the repository
        downloaded_files = set(self.pkg_download_path(pkg) for pkg in pkglist)
        for dirpath, dirnames, filenames in os.walk(self.repo_target(repo)):
            for filename in filenames:
                path = os.path.join(dirpath, filename)
                if filename.endswith('.rpm') and os.path.isfile(path):
                    if path not in downloaded_files:
                        # Delete disappeared or relocated file
                        try:
                            os.unlink(path)
                            logger.info(_("[DELETED] %s"), path)
                        except OSError:
                            logger.error(_("failed to delete file %s"), path)

    def getcomps(self, repo):
        comps_fn = repo._repo.getCompsFn()
        if comps_fn:
            dest_path = self.metadata_target(repo)
            dnf.util.ensure_dir(dest_path)
            dest = os.path.join(dest_path, 'comps.xml')
            dnf.yum.misc.decompress(comps_fn, dest=dest)
            logger.info(_("comps.xml for repository %s saved"), repo.id)

    def download_metadata(self, repo):
        repo_target = self.metadata_target(repo)
        repo._repo.downloadMetadata(repo_target)
        return True

    def _get_latest(self, query):
        """
        return union of these queries:
        - the latest NEVRAs from non-modular packages
        - all packages from stream version with the latest package NEVRA
          (this should not be needed but the latest package NEVRAs might be
          part of an older module version)
        - all packages from the latest stream version
        """
        if not dnf.base.WITH_MODULES:
            return query.latest()

        query.apply()
        module_packages = self.base._moduleContainer.getModulePackages()
        all_artifacts = set()
        module_dict = {}  # {NameStream: {Version: [modules]}}
        artifact_version = {}  # {artifact: {NameStream: [Version]}}
        for module_package in module_packages:
            artifacts = module_package.getArtifacts()
            all_artifacts.update(artifacts)
            module_dict.setdefault(module_package.getNameStream(), {}).setdefault(
                module_package.getVersionNum(), []).append(module_package)
            for artifact in artifacts:
                artifact_version.setdefault(artifact, {}).setdefault(
                    module_package.getNameStream(), []).append(module_package.getVersionNum())

        # the latest NEVRAs from non-modular packages
        latest_query = query.filter(
            pkg__neq=query.filter(nevra_strict=all_artifacts)).latest()

        # artifacts from the newest version and those versions that contain an artifact
        # with the highest NEVRA
        latest_stream_artifacts = set()
        for namestream, version_dict in module_dict.items():
            # versions that will be synchronized
            versions = set()
            # add the newest stream version
            versions.add(sorted(version_dict.keys(), reverse=True)[0])
            # collect all artifacts in all stream versions
            stream_artifacts = set()
            for modules in version_dict.values():
                for module in modules:
                    stream_artifacts.update(module.getArtifacts())
            # find versions to which the packages with the highest NEVRAs belong
            for latest_pkg in query.filter(nevra_strict=stream_artifacts).latest():
                # here we depend on modules.yaml always containing the full NEVRA (including epoch)
                nevra = "{0.name}-{0.epoch}:{0.version}-{0.release}.{0.arch}".format(latest_pkg)
                # download only highest version containing the latest artifact
                versions.add(max(artifact_version[nevra][namestream]))
            # add all artifacts from selected versions for synchronization
            for version in versions:
                for module in version_dict[version]:
                    latest_stream_artifacts.update(module.getArtifacts())
        latest_query = latest_query.union(query.filter(nevra_strict=latest_stream_artifacts))

        return latest_query
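    # Illustration of the selection above (hypothetical module stream "perl:5.30"):
    #   stream version 20200101 ships perl-5.30.1, version 20200201 ships perl-5.30.3.
    #   The newest version (20200201) is always synced, and if an even newer package
    #   NEVRA happened to live only in an older version, that version would be synced
    #   as well, so the highest NEVRA is never lost.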

    def get_pkglist(self, repo):
        query = self.base.sack.query(flags=hawkey.IGNORE_MODULAR_EXCLUDES).available().filterm(
            reponame=repo.id)
        if self.opts.newest_only:
            query = self._get_latest(query)
        if self.opts.source:
            query.filterm(arch='src')
        elif self.opts.arches:
            query.filterm(arch=self.opts.arches)
        return query

    def download_packages(self, pkglist):
        base = self.base
        progress = base.output.progress
        if progress is None:
            progress = dnf.callback.NullDownloadProgress()
        drpm = dnf.drpm.DeltaInfo(base.sack.query(flags=hawkey.IGNORE_MODULAR_EXCLUDES).installed(),
                                  progress, 0)
        payloads = [RPMPayloadLocation(pkg, progress, self.pkg_download_path(pkg))
                    for pkg in pkglist]
        base._download_remote_payloads(payloads, drpm, progress, None, False)
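        # Note (added comment): passing 0 as the last DeltaInfo argument appears to
        # disable delta-RPM usage here (consistent with repo.deltarpm = False set in
        # configure()), so each package is fetched as a complete .rpm into the
        # per-repo directory computed by pkg_download_path().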

    def print_urls(self, pkglist):
        for pkg in pkglist:
            url = pkg.remote_location()
            if url:
                print(url)
            else:
                msg = _("Failed to get mirror for package: %s") % pkg.name
                logger.warning(msg)
site-packages/dnf-plugins/__pycache__/changelog.cpython-36.opt-1.pyc000064400000010117147511334660021250 0ustar003

�gt`g�@s|ddlmZddlmZddlZddlZddlZddlmZm	Z	m
Z
ddlZddlZdd�Z
ejjGdd�dejj��ZdS)	�)�absolute_import)�unicode_literalsN)�_�P_�loggerc
CsDytjj|dd�Stttfk
r>tjtd�j	|���YnXdS)NT)ZfuzzyzNot a valid date: "{0}".)
�dateutil�parser�parse�
ValueError�	TypeError�
OverflowError�argparseZArgumentTypeErrorr�format)�val�r�/usr/lib/python3.6/changelog.py�
validate_date!src@sLeZdZdZed�Zedd��Zdd�Zdd�Z	d	d
�Z
dd�Zd
d�ZdS)�ChangelogCommand�	changelogzShow changelog data of packagescCsd|j�}|jdddttd�d�|jddttd�d�|jdd	d
td�d�|jd
dtd�d�dS)Nz--sinceZDATEzZshow changelog entries since DATE. To avoid ambiguosity, YYYY-MM-DD format is recommended.)�metavar�default�type�helpz--countz2show given number of changelog entries per package)rrrz
--upgradesF�
store_truezmshow only new changelog entries for packages, that provide an upgrade for some of already installed packages.)r�actionr�package�*ZPACKAGE)�nargsr)Zadd_mutually_exclusive_group�add_argumentrr�int)rZfilter_grouprrr�
set_argparser-szChangelogCommand.set_argparsercCs|jj}d|_d|_d|_dS)NT)�cli�demandsZavailable_reposZsack_activation�
changelogs)�selfr"rrr�	configure>szChangelogCommand.configurecCs�|jjj�}|jjr�|jdd�x�|jjD]d}tjj|dd�j	|jjdddd�}|jj
rh|j|jj
d�|r||j|j��}q*t
jtd�|�q*Wn|jj
r�|j|jj
d�|jjr�|j�}n|j�}|S)NT)�empty)Zignore_caseF)Z
with_nevraZ
with_providesZwith_filenames)ZreponamezNo match for argument: %s)�baseZsack�query�optsrZfilterm�dnfZsubjectZSubjectZget_best_queryZrepo�unionZlatestr�infor�upgradesZ	available)r$�q�pkgZpkg_qrrrr(Ds$

zChangelogCommand.querycCs>tj�}x0t|�D]$}|j|jp$|j|jfg�j|�qW|S)N)�collections�OrderedDict�sorted�
setdefaultZsource_name�nameZevr�append)r$Zpackages�by_srpmr/rrrr6Zs$zChangelogCommand.by_srpmcsT�jjr�jj|�S�jjr.|jd�jj�S�jjrJ�fdd�|jD�S|jSdS)Ncs$g|]}|d�jjj�kr|�qS)Z	timestamp)r)�sinceZdate)�.0�chlog)r$rr�
<listcomp>fsz6ChangelogCommand.filter_changelogs.<locals>.<listcomp>)r)r-r'Zlatest_changelogs�countr#r7)r$rr)r$r�filter_changelogs`sz"ChangelogCommand.filter_changelogscCs�|jjr"tjtd�j|jj��nP|jjrLtjtdd|jj�j|jj��n&|jjrdtjtd��ntjtd��|j	|j
��}xb|D]Z}ttd�jdjt
dd	�||D�����x*|j||d
�D]}t|jj|��q�Wq�WdS)NzListing changelogs since {}zListing only latest changelogzListing {} latest changelogszBListing only new changelogs since installed version of the packagezListing all changelogszChangelogs for {}z, cSsh|]}t|��qSr)�str)r8r/rrr�	<setcomp>{sz'ChangelogCommand.run.<locals>.<setcomp>r)r)r7rr,rrr;rr-r6r(�print�joinr2r<r'Zformat_changelog)r$r6r4r9rrr�runks 

 zChangelogCommand.runN)r)
�__name__�
__module__�__qualname__�aliasesrZsummary�staticmethodr r%r(r6r<rArrrrr(sr)Z
__future__rrr
r0Zdateutil.parserrZdnfpluginscorerrrr*Zdnf.clirZpluginZregister_commandr!ZCommandrrrrr�<module>ssite-packages/dnf-plugins/__pycache__/system_upgrade.cpython-36.pyc000064400000054753147511334660021433 0ustar003

@|�e�h�@s�dZddlmZmZmZmZddlZddlZddlZddl	Z	ddl
Z
ddlZddlm
Z
ddlmZmZddlZddlZddlmZddlmZddlZddlmZmZddlZed	�ejd
�Zejd�Zejd�Zejd
�Z eZ!dZ"ed�Z#ed�Z$ed�Z%dZ&dd�Z'dd�Z(gfdd�Z)d7dd�Z*dd�Z+Gdd�de,�Z-Gdd �d e,�Z.e.�Z/Gd!d"�d"ej0j1�Z2d#d$�Z3d%d&�Z4d'd(�Z5d)d*�Z6d+d,dd-d.gZ7Gd/d0�d0ej8�Z9Gd1d2�d2ej:j;�Z<Gd3d4�d4e<�Z=Gd5d6�d6e<�Z>dS)8zGsystem_upgrade.py - DNF plugin to handle major-version system upgrades.�)�call�Popen�check_output�CalledProcessErrorN)�journal)�_�logger)�CliError)�ucd)�serialize_transaction�TransactionReplayzthe color of the skyZ 9348174c5cc74001a71ef26bd79d302eZ fef1cc509d5047268b83a3a553f54b43Z 3e0a5636d16b4ca4bbe5321d06c6aa62Z 8cec00a1566f4d3594f116450395f06cz/usr/bin/plymouthz<Need a --releasever greater than the current system version.z�Download complete! Use 'dnf {command} reboot' to start the upgrade.
To remove cached metadata and transaction use 'dnf {command} clean'zESorry, you need to use 'download --releasever' instead of '--network'�cCs.tjddd�rtjtd��ntddg�dS)NZDNF_SYSTEM_UPGRADE_NO_REBOOTF)�defaultz!Reboot turned off, not rebooting.Z	systemctl�reboot)�os�getenvr�inforr�rr�$/usr/lib/python3.6/system_upgrade.pyrEsrcCs|d}xrdD]j}yNt|��<}x4|D],}|j�}|j|�r |t|�d�jd�Sq WWdQRXWq
tk
rrw
Yq
Xq
WdS)NzUPGRADE_GUIDE_URL=�/etc/os-release�/usr/lib/os-release�")rr)�open�strip�
startswith�len�IOError)�key�pathZrelease_file�linerrr�get_url_from_os_releaseLs



(r cCs~tjj|�sdSxhtj|�D]Z}tjj||�}||kr8qy(tjj|�rTtjj|�n
tj|�Wqt	k
rtYqXqWdS)N)
rr�isdir�listdir�join�dnf�utilZrm_rf�unlink�OSError)r�ignore�entryZfullpathrrr�	clear_dir[sr*cCs6tjj|j�|jkrtt��|r2||jkr2tt��dS)N)r$�rpm�detect_releasever�installroot�
releaseverr	�RELEASEVER_MSG�CANT_RESET_RELEASEVER)�conf�targetrrr�check_release_verlsr3cCsPytdd�}|jd�Wn2tk
rJ}zttd�|�WYdd}~XnXdS)Nz	/dev/tty0�wbs[9;0]z%Screen blanking can't be disabled: %s)r�write�	Exception�printr)Ztty�errr�disable_blankingus

r9c@s�eZdZdd�Zdd�Zdd�Zdd�Zd	d
�Zdd�Zd
d�Z	e	d�Z
e	d�Ze	d�Ze	d�Z
e	d�Ze	d�Ze	d�Ze	d�Ze	d�Ze	d�Ze	d�Ze	d�Ze	d�ZdS)�StatecCs||_i|_|j�dS)N)�	statefile�_data�_read)�selfr;rrr�__init__�szState.__init__cCspy&t|j��}tj|�|_WdQRXWnDtk
r@i|_Yn,tk
rji|_tjt	d�|j�YnXdS)Nz;Failed loading state file: %s, continuing with empty state.)
rr;�json�loadr<r�
ValueErrorr�warningr)r>�fprrrr=�s

zState._readc
CsFtjjtjj|j��t|jd��}tj	|j
|ddd�WdQRXdS)N�w�T)�indent�	sort_keys)r$r%�
ensure_dirrr�dirnamer;rr@�dumpr<)r>Zoutfrrrr5�szState.writecCs&tjj|j�rtj|j�|j�dS)N)rr�existsr;r&r=)r>rrr�clear�szState.clearcCs|S)Nr)r>rrr�	__enter__�szState.__enter__cCs|dkr|j�dS)N)r5)r>�exc_type�	exc_value�	tracebackrrr�__exit__�szState.__exit__cs"�fdd�}�fdd�}t||�S)Ncs||j�<dS)N)r<)r>�value)�optionrr�setprop�szState._prop.<locals>.setpropcs|jj��S)N)r<�get)r>)rTrr�getprop�szState._prop.<locals>.getprop)�property)rTrUrWr)rTr�_prop�szState._prop�
state_version�download_status�destdir�target_releasever�system_releasever�gpgcheck�gpgcheck_repos�repo_gpgcheck_repos�upgrade_status�upgrade_command�distro_sync�enable_disable_repos�module_platform_idN)�__name__�
__module__�__qualname__r?r=r5rMrNrRrYrZr[r\r]r^r_r`rarbrcrdrerfrrrrr:�s(
r:c@s@eZdZdZdd�Zdd�Zdd�Zdd	�Zd
d�Zdd
�Z	dS)�PlymouthOutputz�A plymouth output helper class.

    Filters duplicate calls, and stops calling the plymouth binary if we
    fail to contact it.
    cCsd|_t�|_d|_dS)NT)�alive�dict�
_last_args�	_last_msg)r>rrrr?�szPlymouthOutput.__init__cGsj||jj|�k}|jr|s$|dkrdytt|f|�dk|_Wntk
rXd|_YnX||j|<|jS)Nz--pingrF)rmrVrkr�PLYMOUTHr')r>�cmd�argsZdupe_cmdrrr�	_plymouth�s
zPlymouthOutput._plymouthcCs
|jd�S)Nz--ping)rr)r>rrr�ping�szPlymouthOutput.pingcCs4|jr |j|kr |jdd|j�||_|jdd|�S)Nzhide-messagez--textzdisplay-message)rnrr)r>�msgrrr�message�szPlymouthOutput.messagecCsRd}y$ttdg�}tjdt|��r&d}Wnttfk
r@YnX|jdd|�S)NZupdatesz--helpz--system-upgradezsystem-upgradezchange-modez--)rro�re�searchr
rr'rr)r>�mode�srrr�set_mode�szPlymouthOutput.set_modecCs|jddt|��S)Nz
system-updatez
--progress)rr�str)r>Zpercentrrr�progress�szPlymouthOutput.progressN)
rgrhri�__doc__r?rrrsrurzr|rrrrrj�s

rjc@s$eZdZdd�Zdd�Zdd�ZdS)�PlymouthTransactionProgresscCs|j||||�dS)N)�_update_plymouth)r>�package�actionZti_doneZti_totalZts_doneZts_totalrrrr|�sz$PlymouthTransactionProgress.progresscCsd|dkrdS|tjjkr0tjtd||��ntjdtd||��tj|j||||��dS)N�g�V@�Zg$@)r$�callbackZ
PKG_VERIFY�Plymouthr|�intru�
_fmt_event)r>r�r��current�totalrrrr�sz,PlymouthTransactionProgress._update_plymouthcCs tjjj||�}d||||fS)Nz[%d/%d] %s %s...)r$�transactionZACTIONSrV)r>r�r�r�r�rrrr�sz&PlymouthTransactionProgress._fmt_eventN)rgrhrir|rr�rrrrr~�sr~ccsJtj�}|j|jdd�d}x(|D] }|d}||kr8q"|}|Vq"WdS)zVFind all boots with this message id.

    Returns the entries of all found boots.
    r)�
MESSAGE_IDZ_UIDN�_BOOT_ID)r�ReaderZ	add_match�hex)�
message_id�jZoldbootr)Zbootrrr�
find_bootss
r�c
Cstttd��d
}xJttt��D]:\}}tdj|d|d|d|jdd�|jdd���qW|dkrpttd	��dS)Nz3The following boots appear to contain upgrade logs:r�u){} / {.hex}: {:%Y-%m-%d %H:%M:%S} {}→{}r�Z__REALTIME_TIMESTAMP�SYSTEM_RELEASEVERz??�TARGET_RELEASEVERz-- no logs were found --���r�)r7r�	enumerater��ID_TO_IDENTIFY_BOOTS�formatrV)�nr)rrr�	list_logs s
r�cCsZtt|��}y(|dkrt�|dkr*|d8}||dStk
rTttd���YnXdS)Nrr�r�z!Cannot find logs with this index.)�listr��
IndexErrorr	r)r�r�Zbootsrrr�	pick_boot.sr�cCsDtt|�}tdd|jg�}|j�|j}|dkr@tjjt	d���dS)NZ
journalctlz--bootr�z%Unable to match systemd journal entry)
r�r�rr��wait�
returncoder$�
exceptions�Errorr)r�Zboot_idZprocessZrcrrr�show_log=s
r�ZdownloadZclean�upgrade�logcs eZdZdZ�fdd�Z�ZS)�SystemUpgradePluginzsystem-upgradecs8tt|�j||�|r4|jt�|jt�|jt�dS)N)�superr�r?Zregister_command�SystemUpgradeCommand�OfflineUpgradeCommand�OfflineDistrosyncCommand)r>�base�cli)�	__class__rrr?Ns


zSystemUpgradePlugin.__init__)rgrhri�namer?�
__classcell__rr)r�rr�Ksr�cs(eZdZdEZed�ZdZ�fdd�Zedd��Z	d	d
�Z
dd�Zd
d�Zdd�Z
dd�Zdd�Zdd�Zdd�Zdd�Zdd�Zdd�Zdd �Zd!d"�Zd#d$�Zd%d&�Zd'd(�Zd)d*�Zd+d,�Zd-d.�Zd/d0�Zd1d2�Zd3d4�Zd5d6�Z d7d8�Z!d9d:�Z"d;d<�Z#d=d>�Z$d?d@�Z%dAdB�Z&dCdD�Z'�Z(S)Fr��system-upgrade�fedupz+Prepare system for upgrade to a new releasezvar/lib/dnf/system-upgradecsjtt|�j|�tjj|jjj|j	�|_
tjj|j
d�|_tjj|jjjd�|_t
tjj|j
d��|_dS)Nzsystem-upgrade-transaction.jsonz
system-updatezsystem-upgrade-state.json)r�r�r?rrr#r�r1r-�DATADIR�datadir�transaction_file�
magic_symlinkr:�state)r>r�)r�rrr?\s
zSystemUpgradeCommand.__init__cCsJ|jdddtd�d�|jddtdd	jt�d
�|jdttd�d
�dS)Nz--no-downgraderdZstore_falsez=keep installed packages if the new release's version is older)�destr��help�tidr�z[%s]�|)�nargs�choices�metavarz--numberzwhich logs to show)�typer�)�add_argumentr�CMDSr#r�)�parserrrr�
set_argparserds
z"SystemUpgradeCommand.set_argparsercCs(tj||tj|jj|jjtjjd�dS)zLog directly to the journal.)r�ZPRIORITYr�r�ZDNF_VERSIONN)	r�sendZ
LOG_NOTICEr�r^r]r$�const�VERSION)r>rur�rrr�
log_statusnszSystemUpgradeCommand.log_statuscCs|jd�|jd�dS)NZcheck�
pre_configure)�	_call_sub)r>rrrr�ws
z"SystemUpgradeCommand.pre_configurecCs|jd�dS)N�	configure)r�)r>rrrr�{szSystemUpgradeCommand.configurecCs|jd�dS)N�run)r�)r>rrrr�~szSystemUpgradeCommand.runcCs|jd�dS)Nr�)r�)r>rrr�run_transaction�sz$SystemUpgradeCommand.run_transactioncCs|jd�dS)NZresolved)r�)r>rrr�run_resolved�sz!SystemUpgradeCommand.run_resolvedcCs.t||d|jjdd�}t|�r*|�dS)Nrr)�getattr�optsr��callable)r>r�Zsubfuncrrrr��szSystemUpgradeCommand._call_subcCs(|jjtkr$td�j|d�}t|��dS)NzFIncompatible version of data. Rerun 'dnf {command} download [OPTIONS]')�command)r�rZ�
STATE_VERSIONrr�r	)r>r�rtrrr�_check_state_version�sz)SystemUpgradeCommand._check_state_versioncCs*|j|jj_|jjr|jjnd|jj_dS)N)r�r�r1�cachedirr�r\)r>rrr�
_set_cachedir�sz"SystemUpgradeCommand._set_cachedircCs�ttjjtjjg�}ttjj�}i}i}xl|jjjD]^}|j	|krp|j
}|j|j|j
ji�jt|�i�|j	<q6|j	|kr6|j|jt|j
�i�|j	<q6W||fS)z�
        forward = {repoid:{pkg_nevra: {tsi.action: tsi.reason}}
        reverse = {pkg_nevra: {tsi.action: tsi.reason}}
        :return: forward, reverse
        )�setr$r�ZBACKWARD_ACTIONS�libdnfZ!TransactionItemAction_REINSTALLEDZFORWARD_ACTIONSr�r�r��pkg�reason�
setdefault�repo�idr{)r>Zbackward_actionZforward_actionsZforward�reverseZtsir�rrr�%_get_forward_reverse_pkg_reason_pairs�s
&
z:SystemUpgradeCommand._get_forward_reverse_pkg_reason_pairscCsb|j|jj_|jjr|jjnd|jj_d|jjkrJ|jjrJtt	d���nd|jjkr^d|j_dS)Nzoffline-distrosynczFCommand 'offline-distrosync' cannot be used with --no-downgrade optionzoffline-upgradeF)
r�r�r1r�r�r\r�rdr	r)r>rrr�pre_configure_download�sz+SystemUpgradeCommand.pre_configure_downloadcCs|j�dS)N)r�)r>rrr�pre_configure_reboot�sz)SystemUpgradeCommand.pre_configure_rebootcCs.|j�|jjr|jj|j_|jj|jj_dS)N)	r�r�rer��repos_edr]r�r1r.)r>rrr�pre_configure_upgrade�sz*SystemUpgradeCommand.pre_configure_upgradecCs|j�dS)N)r�)r>rrr�pre_configure_clean�sz(SystemUpgradeCommand.pre_configure_cleancCsd|jjksd|jjkr�tjtd��t�}|rLtd�}tj|jt|���|j	j
�r�td�}|j	jjs�|j	j
jdj|�dj|�d�r�tjtd	��tjd
�t|j	j|jjd�nd|jjkr�|jj|j�d
|jj_d
|jj_d
|jj_d
|jj_d
|jj_|j	jjdg7_dS)Nzsystem-upgrader�z\WARNING: this operation is not supported on the RHEL distribution. Proceed at your own risk.z-Additional information for System Upgrade: {}zyBefore you continue ensure that your system is fully upgraded by running "dnf --refresh upgrade". Do you want to continuez
{} [y/N]: z
{} [Y/n]: )rtZdefaultyes_msgzOperation aborted.r�)r2zoffline-upgradeTZtest)r�r�rrCrr rr�r
r�Z
_promptWantedr1Zassumeno�outputZuserconfirm�error�sys�exitr3r.r�Z _populate_update_security_filter�demands�	root_user�	resolving�available_repos�sack_activationZfreshest_metadataZtsflags)r>Zhelp_urlrtrrr�configure_download�s*






z'SystemUpgradeCommand.configure_downloadcCsd|jj_dS)NT)r�r�r�)r>rrr�configure_reboot�sz%SystemUpgradeCommand.configure_rebootcCs�d|jj_d|jj_d|jj_d|jj_|jj|j_|jj	dk	rN|jj	|j
j_	|jjdk	r�x$|j
j
j�D]}|j|jjk|_	qhW|jjdk	r�x$|j
j
j�D]}|j|jjk|_q�W|jj|j
j_d|jj_d|j
j_t�|jj_d|j
j_d|j
j_dS)NTF)r�r�r�r�r�r�r�rdr�r_r�r1r`�repos�valuesr�ra�
repo_gpgcheckrfZ	cacheonlyZ	assumeyesr~Ztransaction_displayZclean_requirements_on_removeZinstall_weak_deps)r>r�rrr�configure_upgrade�s&






z&SystemUpgradeCommand.configure_upgradecCsd|jj_dS)NT)r�r�r�)r>rrr�configure_cleansz$SystemUpgradeCommand.configure_cleancCsdS)Nr)r>rrr�
configure_logsz"SystemUpgradeCommand.configure_logcCs~|jjdksttd���|j|jj�|jj|jjkrRtd�j|jjd�}t|��t	j
j|j�rlttd���t
jj|j�dS)N�completezsystem is not ready for upgradezZthe transaction was not prepared for '{command}'. Rerun 'dnf {command} download [OPTIONS]')r�zupgrade is already scheduled)r�r[r	rr�r�r�rcr�rr�lexistsr�r$r%rIr�)r>rtrrr�check_rebootsz!SystemUpgradeCommand.check_rebootcCs�tjj|j�s$tjtd��td��tj|j�|j	krLtjtd��td��t
jjj
|j�|jj}|sp|jj}|j|�|jjdks�td�j|d�}t|��dS)Nz-trigger file does not exist. exiting quietly.rz1another upgrade tool is running. exiting quietly.�readyz/use 'dnf {command} reboot' to begin the upgrade)r�)rrr�r�rrr�
SystemExit�readlinkr�r$ZyumZmiscZunlink_fr�rcr�r�r�rbr�r	)r>r�rtrrr�
check_upgrades
z"SystemUpgradeCommand.check_upgradec	Cs,tj|j|j�|j�}d|_WdQRXdS)Nr�)r�symlinkr�r�r�rb)r>r�rrr�run_prepare,sz SystemUpgradeCommand.run_preparecCs6|j�|jjddksdS|jtd�t�t�dS)NrrzRebooting to perform upgrade.)r�r�r�r�r�REBOOT_REQUESTED_IDr)r>rrr�
run_reboot3s
zSystemUpgradeCommand.run_rebootc	s��jjr�jj�n
�jj��jjdkr��jj��fdd��jjjD�}|r\�jj|��fdd��jjj	D�}|r��jj|��j
�$}d|_�jjj
|_�jjj|_WdQRXdS)N�offline-upgrade�offline-distrosynccs$g|]}�jjjj|j�r|j�qSr)r��history�grouprVr�)�.0�g)r>rr�
<listcomp>Gsz5SystemUpgradeCommand.run_download.<locals>.<listcomp>cs$g|]}�jjjj|j�r|j�qSr)r�r��envrVr�)rr)r>rrrJsZdownloading)r�r�)r�rdr�Zupgrade_allr�Z
read_comps�comps�groupsZenv_group_upgradeZenvironmentsr�r[r1r.r]r\)r>Zinstalled_groupsZinstalled_environmentsr�r)r>r�run_download=s

z!SystemUpgradeCommand.run_downloadc
Cs�d}|j�}d|_|j}WdQRX|dkr4td�}n|dkrFtd�}ntd�}|j|t�tj�tjd�tj	|�t
�t|j|j
�|_|jj�dS)	N�Z
incompletezoffline-upgradez1Starting offline upgrade. This will take a while.zoffline-distrosyncz4Starting offline distrosync. This will take a while.z0Starting system upgrade. This will take a while.r)r�rbrcrr��UPGRADE_STARTED_IDr�rzr|rur9rr�r��replayr�)r>r�r�rtrrr�run_upgradeSs 



z SystemUpgradeCommand.run_upgradec	Csdtjtd��t|jjjtjj	|jjj�j
g�|j�$}d|_d|_
d|_d|_d|_WdQRXdS)NzCleaning up downloaded data...)rrrr*r�r1r�r$Z	persistorZTempfilePersistorZdb_pathr�r[rZrbrcr\)r>r�rrr�	run_cleanms
zSystemUpgradeCommand.run_cleancCs |jjrt|jj�nt�dS)N)r�Znumberr�r�)r>rrr�run_logzszSystemUpgradeCommand.run_logcCs|jj�dS)z5Adjust transaction reasons according to stored valuesN)r
Zpost_transaction)r>rrr�resolved_upgrade�sz%SystemUpgradeCommand.resolved_upgradecCs�|jjj�}|j�s&tjtd��dSt|�}yLt|j	d��"}t
j||ddd�|jd�WdQRXt
td�j|j	��Wn<tk
r�}z tjjtd�jt|����WYdd}~XnXtjj|jjj�}|j��}d	|_t|_|jj|_|jjj|_d
d�|jjj �D�|_!dd�|jjj �D�|_"||_#|jjj$|_%|jjj&|_&|jj'|_(|jjj)|_)|jj*|_+WdQRXt,j|jj*d
�}tj|�|j-td�t.�dS)NzKThe system-upgrade transaction is empty, your system is already up-to-date.rErFT)rGrH�
zTransaction saved to {}.zError storing transaction: {}r�cSsg|]}|jr|j�qSr)r_r�)rr�rrrr�sz=SystemUpgradeCommand.transaction_download.<locals>.<listcomp>cSsg|]}|jr|j�qSr)r�r�)rr�rrrr�s)r�zDownload finished.)/r�r�Zget_currentZpackagesrrrrrr�r@rKr5r7r�r'r$r�r	r{r+r,r1r-r�r[r�rZr�rdr_r�r�r`rar^r.r]rfr�rer\r�rc�DOWNLOAD_FINISHED_MSGr��DOWNLOAD_FINISHED_ID)r>r��data�fr8Z
system_verr�rtrrr�transaction_download�s:,


z)SystemUpgradeCommand.transaction_downloadcCs@tjtd��|jtd�t�|j�|jjddkr<t�dS)Nz.Upgrade complete! Cleaning up and rebooting...rr�)	r�rurr��UPGRADE_FINISHED_IDrr�r�r)r>rrr�transaction_upgrade�s
z(SystemUpgradeCommand.transaction_upgrade)r�r�))rgrhri�aliasesr�summaryr�r?�staticmethodr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrrr
rrrr�rr)r�rr�VsF
	
 


(r�c@seZdZdZed�ZdS)r��offline-upgradez%Prepare offline upgrade of the systemN)r)rgrhrirrrrrrrr��sr�c@seZdZdZed�ZdS)r��offline-distrosyncz(Prepare offline distrosync of the systemN)r)rgrhrirrrrrrrr��sr�)N)?r}�
subprocessrrrrr@rZos.pathrvr�ZuuidZsystemdrZdnfpluginscorerrr$Zdnf.clir	Zdnf.i18nr
Zdnf.transactionZdnf.transaction_srrrZlibdnf.confr�ZUUIDrr�r	rr�ror/rr0r�rr r*r3r9�objectr:rjr�r�ZTransactionProgressr~r�r�r�r�r�ZPluginr�r�ZCommandr�r�r�rrrr�<module>sd




	@.	esite-packages/dnf-plugins/__pycache__/needs_restarting.cpython-36.opt-1.pyc000064400000023602147511334660022664 0ustar003

@|�e`.�	@s$ddlmZddlmZddlmZddlmZddlmZmZddlZddl	Zddl
Z
ddlZddlZddl
Z
ddlZddlZddd	d
ddd
ddg	ZdgZdd�Zdd�Zdd�Zdd�Zdd�Zdd�Zdd�Zdd �Zd!d"�ZGd#d$�d$e�ZGd%d&�d&e�ZejjGd'd(�d(ej j!��Z"dS))�)�absolute_import)�division)�print_function)�unicode_literals)�logger�_NZkernelz	kernel-rtZglibczlinux-firmwareZsystemd�dbuszdbus-brokerzdbus-daemonZ
microcode_ctl�zlibcs�tjj|�st�St�}xjtj|�D]\}tjj|�s$|jd�rBq$ttjj||���&}x|D]}|j	|j
�|f�q\WWdQRXq$Wt��x4|jj�j
�jdd�|D�d�D]}�j	|j�q�Wx6�fdd�|D�D] \}}tjtdj||d���q�W�S)	z�
    Provide filepath as string if single dir or list of strings
    Return set of package names contained in files under filepath
    z.confNcSsh|]}|d�qS)r�)�.0�xr
r
�&/usr/lib/python3.6/needs_restarting.py�	<setcomp>Bsz'get_options_from_dir.<locals>.<setcomp>)�namecsh|]}|d�kr|�qS)rr
)rr)�packagesr
r
rDsz`No installed package found for package name "{pkg}" specified in needs-restarting file "{file}".)�pkg�file)�os�path�exists�set�listdir�isdir�endswith�open�join�add�rstrip�sack�query�	installed�filterrr�warningr�format)�filepath�baseZoptionsr�fp�linerrr
)rr
�get_options_from_dir0s"
$&r(ccs�x�t�D]�\}}y<|dk	r(|t|�kr(wt|ddd��}|j�}WdQRXWn"tk
rntjd|�wYnXx$|D]}t||�}|dk	rv|VqvWqWdS)N�r�replace)�errorszFailed to read PID %d's smaps.)�
list_smaps�	owner_uidr�	readlines�EnvironmentErrorrr"�smap2opened_file)�uid�pid�smapsZ
smaps_file�linesr'�ofiler
r
r
�list_opened_filesKs

r6ccsNxHtjd�D]:}yt|�}Wntk
r2wYnXd|}||fVqWdS)Nz/procz/proc/%d/smaps)rr�int�
ValueError)Zdir_r2r3r
r
r
r,\sr,cst��i����fdd�}|S)Ncs,�j|��}|�k	r|S�|�}|�|<|S)N)�get)Zparam�val)�cache�func�sentinelr
r
�wrapperiszmemoize.<locals>.wrapper)�object)r<r>r
)r;r<r=r
�memoizefsr@cCstj|�tjS)N)r�stat�ST_UID)�fnamer
r
r
r-ssr-cCs$|j�j|d�j�}|r |dSdS)N)rr)rr!�run)rrCZmatchesr
r
r
�owning_packagewsrEcCsPd|}t|��}tjj|j��}WdQRXdj|jd��}td||f�dS)Nz/proc/%d/cmdline� �z%d : %s)r�dnfZi18nZucd�readr�split�print)r2ZcmdlineZcmdline_fileZcommandr
r
r
�	print_cmd~s

rLc	Cs�tj�}|jdd�}tj|d�}d}y|jd|j|��}Wn<tjk
rv}zt|�}tjdj	||��dSd}~XnXtj|dd�}|j
dd�}|jd	�r�|SdS)
Nzorg.freedesktop.systemd1z/org/freedesktop/systemd1z org.freedesktop.systemd1.Managerz)Failed to get systemd unit for PID {}: {}zorg.freedesktop.DBus.Properties)Zdbus_interfacezorg.freedesktop.systemd1.UnitZIdz.service)rZ	SystemBusZ
get_objectZ	InterfaceZGetUnitByPIDZ
DBusException�strrr"r#ZGetr)	r2ZbusZsystemd_manager_objectZsystemd_manager_interfaceZ
service_proxy�e�msgZservice_propertiesrr
r
r
�get_service_dbus�s0

rPcCsn|jd�}|dkrdS|jd�dkr(dS||d�j�}|jd�}|dkrVt||d�St||d|�d�SdS)N�/rz00:z
 (deleted)FT)�find�strip�rfind�
OpenedFile)r2r'Zslash�fnZsuffix_indexr
r
r
r0�s

r0c@s*eZdZejd�Zdd�Zedd��ZdS)rUz^(.+);[0-9A-Fa-f]{8,}$cCs||_||_||_dS)N)�deletedrr2)�selfr2rrWr
r
r
�__init__�szOpenedFile.__init__cCs(|jr"|jj|j�}|r"|jd�S|jS)a;Calculate the name of the file pre-transaction.

        In case of a file that got deleted during the transactionm, possibly
        just because of an upgrade to a newer version of the same file, RPM
        renames the old file to the same name with a hexadecimal suffix just
        before delting it.

        �)rW�RE_TRANSACTION_FILE�matchr�group)rXr\r
r
r
�
presumed_name�s

zOpenedFile.presumed_nameN)	�__name__�
__module__�__qualname__�re�compiler[rY�propertyr^r
r
r
r
rU�s
rUc@s4eZdZdd�Zedd��Zedd��Zdd�Zd	S)
�ProcessStartcCs|j�|_|j�|_dS)N)�
get_boot_time�	boot_time�get_sc_clk_tck�
sc_clk_tck)rXr
r
r
rY�s
zProcessStart.__init__cCshttjd�j�}tjjd�rdtdd��8}|j�j�j	�dj�}tt
j
�t|��}t||�SQRX|S)a	
        We have two sources from which to derive the boot time. These values vary
        depending on containerization, existence of a Real Time Clock, etc.
        For our purposes we want the latest derived value.
        - st_mtime of /proc/1
             Reflects the time the first process was run after booting
             This works for all known cases except machines without
             a RTC - they awake at the start of the epoch.
        - /proc/uptime
             Seconds field of /proc/uptime subtracted from the current time
             Works for machines without RTC iff the current time is reasonably correct.
             Does not work on containers which share their kernel with the
             host - there the host kernel uptime is returned
        z/proc/1z/proc/uptime�rbrN)
r7rrA�st_mtimer�isfiler�readlinerSrJ�time�float�max)Zproc_1_boot_time�fZuptimeZproc_uptime_boot_timer
r
r
rf�szProcessStart.get_boot_timecCstjtjd�S)N�
SC_CLK_TCK)r�sysconf�
sysconf_namesr
r
r
r
rh�szProcessStart.get_sc_clk_tckc
CsLd|}t|��}|j�j�j�}WdQRXt|d�}||j}|j|S)Nz
/proc/%d/stat�)rrIrSrJr7rirg)rXr2Zstat_fnZ	stat_fileZstatsZticks_after_bootZsecs_after_bootr
r
r
�__call__�s

zProcessStart.__call__N)r_r`rarY�staticmethodrfrhrvr
r
r
r
re�srec@s4eZdZd
Zed�Zedd��Zdd�Zdd�Z	d	S)�NeedsRestartingCommand�needs-restartingz/determine updated binaries that need restartingcCsF|jdddtd�d�|jdddtd�d�|jd	d
dtd�d�dS)Nz-uz
--useronly�
store_truez#only consider this user's processes)�action�helpz-rz--reboothintzKonly report whether a reboot is required (exit code 1) or not (exit code 0)z-sz
--servicesz%only report affected systemd services)�add_argumentr)�parserr
r
r
�
set_argparsers


z$NeedsRestartingCommand.set_argparsercCs|jj}d|_dS)NT)�cli�demandsZsack_activation)rXr�r
r
r
�	configuresz NeedsRestartingCommand.configurecCsNt�}tjt|jj�}t|�}ttj	j
|jjjd�|j�}t
j|�|jj�r�t�}t�}|jjj�j�}x,|jt
d�D]}|j|jkrx|j|j�qxW|jdddgd�}t|�dkr�x,|jtd�D]}|j|jkr�|j|j�q�W|s�|�rfttd��xt|�D]}	td|	��qWxt|�D]}	td	|	��q$Wt�ttd
��ttd�d�tjj ��nttd
��ttd��dSt�}
|jj!�r�tj"�nd}xHt#|�D]<}||j$�}|dk�rĐq�|j||j%�k�r�|
j|j%��q�W|jj&�r.tdd�t|
�D��}
x |
D]}	|	dk	�rt|	��qWdSxt|
�D]}t'|��q8WdS)Nz#etc/dnf/plugins/needs-restarting.d/)rrzdbus-daemonzdbus-brokerrz;Core libraries or services have been updated since boot-up:z  * %sz8  * %s (dependency of dbus. Recommending reboot of dbus)z2Reboot is required to fully utilize these updates.zMore information:z)https://access.redhat.com/solutions/27943z>No core libraries or services have been updated since boot-up.zReboot should not be necessary.cSsg|]}t|��qSr
)rP)rr2r
r
r
�
<listcomp>Bsz.NeedsRestartingCommand.run.<locals>.<listcomp>)(re�	functools�partialrEr%rr@r(rrrZconfZinstallroot�NEED_REBOOT�extendZoptsZ
reboothintrrr r!Zinstalltimergrr�len�NEED_REBOOT_DEPENDS_ON_DBUSrKr�sortedrH�
exceptions�ErrorZuseronly�geteuidr6r^r2ZservicesrL)rXZ
process_startZ
owning_pkg_fn�optZneed_rebootZneed_reboot_depends_on_dbusr rZdbus_installedrZ
stale_pidsr1r5�namesr2r
r
r
rDsd







zNeedsRestartingCommand.runN)ry)
r_r`ra�aliasesrZsummaryrwrr�rDr
r
r
r
rx�s

rx)#Z
__future__rrrrZdnfpluginscorerrrHZdnf.clirr�rrbrArnr�r�r(r6r,r@r-rErLrPr0r?rUreZpluginZregister_commandr�ZCommandrxr
r
r
r
�<module>s:

"+site-packages/dnf-plugins/__pycache__/groups_manager.cpython-36.opt-1.pyc000064400000020721147511334660022334 0ustar003

@|�e�4�@s�ddlmZddlmZddlZddlZddlZddlZddlZddlZddl	Z	ddl
mZmZddl
Z
ddlZ
dZejdje��Zejd�Zdddd	�Zd
d�Zdd
�Zdd�Ze
jjGdd�de
jj��ZdS)�)�absolute_import)�unicode_literalsN)�_�loggerz
-a-z0-9_.:z^[{}]+$z^[-a-zA-Z0-9_.@]+$T)Zdefault_explicitZuservisible_explicitZempty_groupscCstj|�stjtd���|S)zgroup id validatorzInvalid group id)�RE_GROUP_ID�match�argparse�ArgumentTypeErrorr)�value�r�$/usr/lib/python3.6/groups_manager.py�
group_id_type.s
r
cCsN|jdd�}t|�dkr&tjtd���|\}}tj|�sFtjtd���||fS)ztranslated texts validator�:�z6Invalid translated data, should be in form 'lang:text'z*Invalid/empty language for translated data)�split�lenrr	r�RE_LANGr)r
�data�lang�textrrr�translation_type5s

rcCs:|j�}tjdjt�d|�}|s6tjjtd�j|���|S)z#generate group id based on its namez[^{}]�zFCan't generate group id from '{}'. Please specify group id using --id.)	�lower�re�sub�format�RE_GROUP_ID_VALID�dnf�cli�CliErrorr)r�group_idrrr�
text_to_idAsr!csdeZdZdZed�Z�fdd�Zedd��Zdd�Z	d	d
�Z
dd�Zd
d�Zdd�Z
dd�Z�ZS)�GroupsManagerCommand�groups-managerz$create and edit groups metadata filecstt|�j|�tj�|_dS)N)�superr"�__init__�libcomps�Comps�comps)�selfr)�	__class__rrr%QszGroupsManagerCommand.__init__cCs�|jddgdtd�d�|jddgdtd�d�|jddtd	�d
�|jddd
td�d�|jdttd�d�|jddtd�d�|jdtd�d�|jdttd�d�|jddgdttd�d�|jddgdttd �d�|j�}|jd!d"ddtd#�d$�|jd%d"d&dtd'�d$�|j�}|jd(dtd)�d*�|jd+dtd,�d*�|jd-dd
td.�d�|jd/dtd0�d*�|jd1d2d3td4�d5�dS)6Nz--load�appendz	COMPS.XMLzload groups metadata from file)�action�default�metavar�helpz--savezsave groups metadata to filez--mergez%load and save groups metadata to file)r.r/z--print�
store_trueFz#print the result metadata to stdout)r,r-r/z--idzgroup id)�typer/z-nz--namez
group name)r/z
--descriptionzgroup descriptionz--display-orderzgroup display orderz--translated-namez	LANG:TEXTztranslated name for the group)r,r-r.r1r/z--translated-descriptionz$translated description for the groupz--user-visible�user_visiblez%make the group user visible (default))�destr,r-r/z--not-user-visibleZstore_falsezmake the group user invisiblez--mandatoryz%add packages to the mandatory section)r,r/z
--optionalz$add packages to the optional sectionz--removez5remove packages from the group instead of adding themz--dependenciesz-include also direct dependencies for packages�packages�*ZPACKAGEzpackage specification)�nargsr.r/)�add_argumentrr
�intrZadd_mutually_exclusive_group)�parserZvisibleZsectionrrr�
set_argparserUsR








z"GroupsManagerCommand.set_argparsercCs�|jj}|jjr"d|_d|_d|_|jjrP|jjj	d|jj�|jj
j|jj�|jjs�|jj
s�|jjs�|jjs�|jjdk	s�|jjr�|jjr�|jjr�tjjtd���dS)NTFrz;Can't edit group without specifying it (use --id or --name))r�demands�optsr4Zsack_activationZavailable_reposZload_system_repo�merge�load�insert�saver+�description�
display_order�translated_name�translated_descriptionr2�id�namerrr)r)r;rrr�	configure�s"zGroupsManagerCommand.configurecCs �x|jjD�]
}tj�}yp|jd�r~tj|��F}tjdd�}z$t	j
||�|j�|j|j
�Wdtj|j
�XWdQRXn
|j|�Wn~tttjfk
�r}zXt�}x2|j�D]&}||kr�q�tj|j��|j|�q�Wtjjtd�j||���WYdd}~XqX|j|7_qWdS)zm
        Loads all input xml files.
        Returns True if at least one file was successfuly loaded
        z.gzF)�deleteNzCan't load file "{}": {})r<r>r&r'�endswith�gzip�open�tempfileZNamedTemporaryFile�shutilZcopyfileobj�closeZ	fromxml_frF�os�unlink�IOError�OSErrorZParserError�setZget_last_errorsr�error�strip�addr�
exceptions�Errorrrr()r)�	file_nameZ
file_compsZgz_fileZ	temp_file�err�seenrTrrr�load_input_files�s,
$z%GroupsManagerCommand.load_input_filescCs�x�|jjD]�}y|jj|td�}Wn*tjk
rL}z|g}WYdd}~XnX|r
x"|dd�D]}tj|j	��q`Wt
jjt
d�j||dj	����q
WdS)N)�xml_options�zCan't save file "{}": {}���r_)r<r@r(Zxml_f�COMPS_XML_OPTIONSr&ZXMLGenErrorrrTrUrrWrXrr)r)rY�errorsrZrrr�save_output_files�sz&GroupsManagerCommand.save_output_filescCs\d}|r*x |jjD]}|j|kr|}PqW|dkrX|rXx |jjD]}|j|kr@|}Pq@W|S)zl
        Try to find group according to command line parameters - first by id
        then by name.
        N)r(�groupsrErF)r)r rF�groupZgrprrr�
find_group�s

zGroupsManagerCommand.find_groupcCs�dd�}|jjr|jj|_|jjr,|jj|_|jjr>|jj|_|jjdk	rT|jj|_|jjrj||jj�|_|jj	r�||jj	�|_
|jj�r�t�}xZ|jjD]N}t
jj|�}|j|jjdddd�j�}|s�tjtd�j|��q�|j|�q�W|jj�r2t�}x|D]}|j|j��qW|j|jjj�j|d��d	d
�|D�}	|jj�r�x�|	D].}
x&|j|
tj d�D]}|jj|��qfW�qPWnd|jj!�r�tj"}n|jj#�r�tj$}ntj%}x8t&|	�D],}
|j|
|d��s�|jj'tj(|
|d���q�WdS)zE
        Set attributes and package lists for selected group
        cSs&tj�}x|D]\}}|||<qW|S)N)r&ZStrDict)ZlstZstr_dictrrrrr�langlist_to_strdict�sz<GroupsManagerCommand.edit_group.<locals>.langlist_to_strdictNTF)Z
with_nevraZ
with_providesZwith_filenameszNo match for argument: {})ZprovidescSsh|]
}|j�qSr)rF)�.0�pkgrrr�	<setcomp>sz2GroupsManagerCommand.edit_group.<locals>.<setcomp>)rFr1))r<rFrAZdescrBr2ZuservisiblerCZname_by_langrDZdesc_by_langr4rSrZsubjectZSubjectZget_best_query�baseZsackZlatestrZwarningrr�updateZdependenciesZrequiresZqueryZfilterm�removeZpackages_matchr&ZPACKAGE_TYPE_UNKNOWNZ	mandatoryZPACKAGE_TYPE_MANDATORYZoptionalZPACKAGE_TYPE_OPTIONALZPACKAGE_TYPE_DEFAULT�sortedr+ZPackage)r)rdrfr4Zpkg_specZsubj�qZrequirementsrhZ	pkg_namesZpkg_nameZpkg_typerrr�
edit_group�sT










zGroupsManagerCommand.edit_groupcCs|j�|jjs|jjr�|j|jj|jjd�}|dkr�|jjrNtjjt	d���t
j�}|jjrt|jj|_|jj|_nD|jjr�t|jj�}|j|dd�r�tj
jt	d�j||jj���||_|jjj|�|j|�|j�|jjs�|jjr�t|jjtd��dS)N)r rFz-Can't remove packages from non-existent groupzRGroup id '{}' generated from '{}' is duplicit. Please specify group id using --id.)r])r\r<rErFrerlrrWrXrr&ZGroupr!rrrr(rcr+rorb�printr@Zxml_strr`)r)rdr rrr�run!s,

zGroupsManagerCommand.run)r#)�__name__�
__module__�__qualname__�aliasesrZsummaryr%�staticmethodr:rGr\rbrerorq�
__classcell__rr)r*rr"Ls1$=r")Z
__future__rrrrJr&rOrrMrLZdnfpluginscorerrrZdnf.clir�compilerrrr`r
rr!ZpluginZregister_commandrZCommandr"rrrr�<module>s,
site-packages/dnf-plugins/__pycache__/copr.cpython-36.opt-1.pyc000064400000050311147511334660020264 0ustar003

@|�eZv�@s�ddlmZddlZddlZddlZddlZddlZddlZddlZddl	Z	ddl
Z
ddlZddlmZm
Z
ddlZddlmZddlmZddlZy$ddlmZmZmZmZdd�ZWnLek
�rd	d
�ZyddlmZWnek
r�dd�ZYnXYnXd
Zeed�ed�g�Zeed�ed�dg�Ze�rdddl m!Z!m"Z"m#Z#ddl$m%Z%m&Z&m'Z'n(ddl!m!Z!m"Z"m#Z#ddl(m%Z%m&Z&m'Z'ej)j*Gdd�dej+j,��Z-ej)j*Gdd�de-��Z.dS)�)�print_functionN)�_�logger)�PY3)�ucd)�name�version�codename�os_release_attrcCst�t�t�fS)N)rrr	�rr�/usr/lib/python3.6/copr.py�linux_distribution.sr
cCsdS)N�r)rrrrr
1sr
)r
cCsrtd��`}i}xF|D]>}y$|j�jd�\}}|jd�||<Wqtk
rPYqXqW|d|ddfSQRXdS)Nz/etc/os-release�=�"�NAMEZ
VERSION_ID)�open�rstrip�split�strip�
ValueError)Zos_release_fileZos_release_data�lineZos_release_keyZos_release_valuerrrr
7s


�copr�yes�y�no�nr)�ConfigParser�
NoOptionError�NoSectionError)�urlopen�	HTTPError�URLErrorc@seZdZdZdZdZdZdZdZedeZ	d8Z
ed	�Zd
Z
ed�Zedd
��Zdd�Zdd�Zdd�Zdd�Zdd�Zdd�Zdd�Zdd�Zdd�Zd d!�Zd"d#�Zed$d%��Zd&d'�Zd(d)�Zd*d+�Z d,d-�Z!d.d/�Z"d0d1�Z#ed2d3��Z$ed4d5��Z%ed6d7��Z&dS)9�CoprCommandz Copr plugin for DNF Nzcopr.fedorainfracloud.orgZfedoraZhttpsi�z://rz Interact with Copr repositories.Ta�
  enable name/project [chroot]
  disable name/project
  remove name/project
  list --installed/enabled/disabled
  list --available-by-user=NAME
  search project

  Examples:
  copr enable rhscl/perl516 epel-6-x86_64
  copr enable ignatenkobrain/ocltoys
  copr disable rhscl/perl516
  copr remove rhscl/perl516
  copr list --enabled
  copr list --available-by-user=ignatenkobrain
  copr search tests
    c	Cs�|jddddddddgd	�|j�}|jd
dtd�d
�|jddtd�d
�|jddtd�d
�|jddtd�d�|jdtd�d�|jddd�dS)N�
subcommand��help�enable�disable�remove�list�search)�nargs�choicesz--installed�
store_truez.List all installed Copr repositories (default))�actionr&z	--enabledzList enabled Copr repositoriesz
--disabledzList disabled Copr repositoriesz--available-by-userrz-List available Copr repositories by user NAME)�metavarr&z--hubz(Specify an instance of Copr to work with)r&�arg�*)r,)�add_argumentZadd_mutually_exclusive_groupr)�parserZlist_optionrrr�
set_argparserpszCoprCommand.set_argparsercCs�|jjjjdkrdSd}t�}g}|jjjd}tjj	|t
d�}tjj|�r�|j|�|j
|�|jdd�r�|jdd�r�|jdd�}|jdd�}||g|_n
ddg|_xHtjtjj	|t
d��D],}|jd�r�tjj	|t
d|�}	|j|	�q�Wg}
t|jj��r|jjdjd	�}
t|
�d
k�rV|jj�rVtjtd�td��tjjtd
���nL|jj�r�t|
�d
k�r�|j|_|j|_n t|
�d
k�r�|
d}n|jj}|�rH|�rHd|_|j
t |dd��|j!||dd�}|�rH|j!||d|j"�}|j!||d|j#�}
||_|d||_t$|
�|j#k�rH|jd|
7_|jd|
7_|j�s�d|k�rr||_|j"d||_n|jdd�d|_||_dS)Nrrz.conf�main�distribution�
releaseverFz.d�/�zError: z^specify Copr hub either with `--hub` or using `copr_hub/copr_username/copr_projectname` formatzmultiple hubs specifiedT)�reverse�hostname�protocol�portz://�:r%)%�cliZcommand�optsr�base�confZpluginconfpath�os�path�join�PLUGIN_CONF�isfile�append�readZ
has_option�get�
chroot_config�listdir�endswith�lenr1r�hubr�criticalr�dnf�CliError�default_hostname�
copr_hostname�default_url�copr_url�sorted�_read_config_item�default_protocol�default_port�int)�selfZcopr_hubZcopr_plugin_configZconfig_filesZconfig_pathZdefault_config_filer7r8�filenameZconfig_file�projectr<r=r>rrr�	configure�sl








zCoprCommand.configurecCs*y|j||�Sttfk
r$|SXdS)N)rKrr)r]�configrPZsection�defaultrrrrY�szCoprCommand._read_config_itemcCstjjdj|j���dS)Nz{0}
)�sys�stderr�write�formatr)r]�textrrr�_user_warning_before_prompt�sz'CoprCommand._user_warning_before_promptc
Cs�|jjd}|dkr&|jjj|�dS|dkrl|jjrH|j|jj�dS|j|jj	j
d|jj|jj�dSy|jj
d}WnLttfk
r�tjtd�td��|jjj|�tjjtd���YnXy\|jj
d}t|jj
�dkr�tjjtd���|jd	�|_t|j�d
k�r$tjjtd���Wn*tk
�rP|j�}|jd	�|_YnX|dk�rj|j|�dS|jd
�}t|�dk�r�tjtd�td��tjjtd���n<t|�dk�r�|d}|d}n|d}|d}|d
|}dj|jj	j|j|j|�|�}|dk�rn|j �td�}d
j!|j||g�}dj|�}	|j"||	�|j#||�tj$td��|j%||�nr|dk�r�|j �|j&||�tj$td��nD|dk�r�|j �|j'||�tj$td��ntjjtd�j|���dS)Nrr&r*zError: z>exactly two additional parameters to copr command are requiredr%�zToo many arguments.�-r:zOBad format of optional chroot. The format is distribution-version-architecture.r+r9zEuse format `copr_username/copr_projectname` to reference copr projectzbad copr project formatz{0}/_copr:{1}:{2}:{3}.repor'a
Enabling a Copr repository. Please note that this repository is not part
of the main distribution, and quality may vary.

The Fedora Project does not exercise any power over the contents of
this repository beyond the rules outlined in the Copr FAQ at
<https://docs.pagure.org/copr.copr/user_documentation.html#what-i-can-build-in-copr>,
and packages are not held to any quality or security level.

Please do not file bug reports about these packages in Fedora
Bugzilla. In case of problems, contact the owner of this repository.
z!Do you really want to enable {0}?z Repository successfully enabled.r(z!Repository successfully disabled.r)z Repository successfully removed.zUnknown subcommand {}.)rir:)(rAr$r@Z	optparserZ
print_helpZavailable_by_user�_list_user_projects�_list_installed_repositoriesrBrCZreposdir�enabledZdisabledr1r�
IndexErrorrrQrrRrSrO�
exceptions�Errorr�chroot_parts�
_guess_chroot�_searchrf�get_reposdirrU�_sanitize_username�
_need_rootrF�	_ask_user�_download_repo�info�_runtime_deps_warning�
_disable_repo�_remove_repo)
r]r$�project_name�chrootr_�
copr_username�copr_projectname�
repo_filenamery�msgrrr�run�s�








zCoprCommand.runcCs�|jjd�d}tjd|j|�}|j|jko8tjd|�}tjd|�}|jjr`|rh|rhdSn|shdStjd|�rxdStjd|�r�dS|j	}	|	r�|s�|	r�|r�dSd}
tjd	|�r�|jd
d�\}}}
}|d|
d|}n�tjd|��r2|jdd�}|j
d
d
�djd
d�d}|d|dd|d}n.|jdd�}|jd|dd|d}d}
|	�sn|d7}|
�r||d7}t|�|
S)Nr9r%z_copr:Z_copr_z_copr:|^_copr_zcopr:.*:.*:.*:mlz
coprdep:.*Fzcopr:r?r:rjrirTz (disabled)z *���)
�repofiler�re�matchrUrWrVrArPrm�rsplitrT�print)r]�repo_id�repo�enabled_only�
disabled_only�	file_nameZ	match_newZ	match_oldZ	match_anyrm�old_reporrUZ
copr_ownerZcopr_dirr�Z	copr_namerrr�_list_repo_file8sBzCoprCommand._list_repo_filecCsFd}x,|jjj�D]\}}|j||||�rd}qW|rBttd��dS)NFTz�* These coprs have repo file with an old format that contains no information about Copr hub - the default one was assumed. Re-enable the project to fix this.)rB�repos�itemsr�r�r)r]Z	directoryr�r�r�r�r�rrrrlisz(CoprCommand._list_installed_repositoriesc

Cs�dj|�}|j|}|jj|dd�}ytj|j��}Wn*tk
r`tj	j
td�j|���YnX|j|�td�j|�}|j
|�xL|dD]@}dj||d�}|d	p�td
�}	|jjjt|�|	�}t|�q�WdS)Nz!/api_3/project/list?ownername={0}zw+)�modez+Can't parse repositories for username '{}'.zList of {} coprsr�z
{0}/{1} : r�descriptionzNo description given)rfrWrBr �json�loadsrJrrRrorpr�_check_json_output�_print_match_section�output�
fmtKeyValFillrr�)
r]Z	user_name�api_path�url�res�
json_parse�section_text�itemr��descrrrrkss"



zCoprCommand._list_user_projectsc
Cs�dj|�}|j|}|jj|dd�}ytj|j��}Wn*tk
r`tj	j
td�j|���YnX|j|�td�j|�}|j
|�xJ|dD]>}dj|d�}|d	p�td
�}	|jjjt|�|	�}t|�q�WdS)Nz/api_3/project/search?query={}zw+)r�zCan't parse search for '{}'.zMatched: {}r�z{0} : Z	full_namer�zNo description given.)rfrWrBr r�r�rJrrRrorprr�r�r�r�rr�)
r]Zqueryr�r�r�r�r�r�r�r�rrrrs�s 



zCoprCommand._searchcCs|jjj|�}t|�dS)N)rBr�Z
fmtSectionr�)r]rgZ	formattedrrrr��sz CoprCommand._print_match_sectioncCsj|jstjjd�d|_tjjdj|j���|jj�rf|jjj	sb|jj
jdj|�dj|�d�rfdSdS)N�
Fz{0}
z
{} [y/N]: z
{} [Y/n]: )r�Zdefaultyes_msgT)�
first_warningrcrdrerfrrBZ
_promptWantedrCZassumenor�Zuserconfirm)r]ryr�rrr�_ask_user_no_raise�s
zCoprCommand._ask_user_no_raisecCs |j||�stjjtd���dS)NzSafe and good answer. Exiting.)r�rRrorpr)r]ryr�rrrrw�szCoprCommand._ask_usercCs tj�dkrtjjtd���dS)Nrz/This command has to be run under the root user.)rD�geteuidrRrorpr)�clsrrrrv�szCoprCommand._need_rootcs|j��dks&�ddks&�ddkr,t��|jjjd}t�fdd�dD��r�d
�krbd|}n&dtd
�krxd|}ndj�d|�}n�d�kr�tj	d�}d�kr�dj|�}ndj�d|�}nPd�kr�tj	d�}d�kr�dj|�}ndj�d|�}nd�dj
dd�d}|S)z2 Guess which chroot is equivalent to this machine NrFr%Zbasearchcsg|]}|�k�qSrr)�.0r)�distrr�
<listcomp>�sz-CoprCommand._guess_chroot.<locals>.<listcomp>�Fedora�Fedora LinuxZRawhidezfedora-rawhide-ZrawhideZredhat_support_product_versionzfedora-{0}-{1}ZMageiaz%{distro_arch}ZCauldronzmageia-cauldron-{}zmageia-{0}-{1}ZopenSUSEz%{_target_cpu}Z
Tumbleweedzopensuse-tumbleweed-{}zopensuse-leap-{0}-{1}zepel-%s-x86_64�.)r�r�)rLr
rBrC�
substitutions�anyr
rf�rpmZexpandMacror)r]Zdistarchr~r)r�rrr�s, 



zCoprCommand._guess_chrootcCs�dj|jdd��}|jd}dj|||�}y*t|j|�}tjj|�rRtj|�W�n^t	k
�rl}z�|j
dkr�td�j|j||j
t|��}t
jj|��td�}|jjd�}	|	�r>tj|	�jd�}
tj|
�}
|td	�jdj|j�|�7}|
jd
��r0|td�djd
d�|
d
D��7}|td�j|�7}t
jj|��n|td�j|�7}t
jj|��WYdd}~XnJtk
�r�}z,td�j|j||jj�}t
jj|��WYdd}~XnX|j�}|jd�}tjd|��r�tjj|jjjd|dd�d�}|j|j k�rR|j!ddd�j!|j"d�j!ddd�j!dd�j!dd�}
tjj|
��rRtj|
�t#|d��.}|j$|�x|j%�D]}|j$|��qrWWdQRXtj&|t'j(t'j)Bt'j*Bt'j+B�dS) Nrjr%z%/coprs/{0}/repo/{1}/dnf.repo?arch={2}i�z Request to {0} failed: {1} - {2}z+It wasn't possible to enable this project.
zCopr-Error-Datazutf-8z1Repository '{0}' does not exist in project '{1}'.zavailable chrootsz
Available repositories: z, css|]}dj|�VqdS)z'{}'N)rf)r��xrrr�	<genexpr>�sz-CoprCommand._download_repo.<locals>.<genexpr>z�

If you want to enable a non-default repository, use the following command:
  'dnf copr enable {0} <repository>'
But note that the installed repo file will likely need a manual modification.zProject {0} does not exist.zFailed to connect to {0}: {1}z\[copr:rriz.repoz_copr:�_coprrr?Zgroup_�@�wbr�r����),rFrqrfr rWrDrE�existsr)r!�coder�strrRrorpZheadersrK�base64Z	b64decode�decoder�r�r"�reason�strerror�readliner�r�rBrCrtrV�replacerUrre�	readlines�chmod�stat�S_IRUSR�S_IWUSR�S_IRGRP�S_IROTH)r]r}r�Zshort_chrootZarchr�Zresponse�eZ	error_msgZ
error_dataZerror_data_decodedZ
first_linerZold_repo_filename�frrrrx�sX





$

zCoprCommand._download_repocs�|jjdd�|jj�|j|j|�|��g}x(�jj�D]}|jd�rJq:|j|�q:W|s`dSt	d�}t
jd��|jdj
��fdd	�|D���}|j|t	d
��s�x,|D]$}|jjj�j||jjjddi�q�WdS)
a,
        In addition to the main copr repo (that has repo ID prefixed with
        `copr:`), the repofile might contain additional repositories that
        serve as runtime dependencies. This method informs the user about
        the additional repos and provides an option to disable them.
        T)r�zcopr:Na�Maintainer of the enabled Copr repository decided to make
it dependent on other repositories. Such repositories are
usually necessary for successful installation of RPMs from
the main Copr repository (they provide runtime dependencies).

Be aware that the note about quality and bug-reporting
above applies here too, Fedora Project doesn't control the
content. Please review the list:

{0}

These repositories have been enabled automatically.r%z

cs*g|]"}djt��|�jj|d�d��qS)z){num:2}. [{repoid}]
    baseurl={baseurl}�baseurl)Znum�repoidr�)rf�next�cfgZgetValue)r�r�)�counterr�rrr�9sz5CoprCommand._runtime_deps_warning.<locals>.<listcomp>z!Do you want to keep them enabled?rm�0)rB�resetZread_all_repos�_get_copr_reporur��sections�
startswithrIr�	itertools�countrfrFr�rC�write_raw_configfiler�r�)r]rr�Zruntime_depsr�ryZdepr)r�r�rrzs*



z!CoprCommand._runtime_deps_warningcCs�dj|jjdd�d|j|�|�}||jjkr�dj|j|�|�}}||jjkr�d|jj|jkr�|jj|jjd�d	}y.|jdd�djdd�d}||jkr�dSWq�tk
r�Yq�XndS|jj|S)
Nzcopr:{0}:{1}:{2}r?r%rz{0}-{1}r�r9rir�)	rfrUr�rurBr�r�rrn)r]rr�r�r�rUrrrr�Fs 

zCoprCommand._get_copr_repocCst|j||�}|s,tjjtdj|j||����ytj|j	�Wn2t
k
rn}ztjjt|���WYdd}~XnXdS)Nz&Failed to remove copr repo {0}/{1}/{2})r�rRrorprrfrUrDr)r��OSErrorr�)r]rr�r�r�rrrr|\szCoprCommand._remove_repocCsd|j||�}|dkr,tjjtdj||����x2|jj�D]$}|jj	j
|j||jj	jddi�q8WdS)Nz!Failed to disable copr repo {}/{}rmr�)
r�rRrorprrfr�r�rBrCr�r�r�)r]rr�r�r�rrrr{hszCoprCommand._disable_repocCs<ytj|j��}Wn$tk
r6tjjtd��dSX|S)z� Wrapper around response from server

        check data and print nice error in case of some error (and return None)
        otherwise return json object.
        zUnknown response from server.N)r�r�rJrrRr@rSr)r�r�r�rrr�	_get_datatszCoprCommand._get_datacCs"d|krtjjdj|d���dS)N�errorz{})rRrorprf)r�Zjson_objrrrr��szCoprCommand._check_json_outputcCs&|ddkrdj|dd��S|SdS)Nrr�zgroup_{}r%)rf)r�rrrrru�szCoprCommand._sanitize_username)r)'�__name__�
__module__�__qualname__�__doc__rLrTZdefault_hubrZr[rV�aliasesr�summaryr��usage�staticmethodr5r`rYrhr�r�rlrkrsr�r�rw�classmethodrvrrrxrzr�r|r{r�r�rurrrrr#PsDL_1
%82r#c@sDeZdZdZdZed�ZdZdd�Zdd�Z	e
d	d
��Zdd�Zd
S)�PlaygroundCommandz Playground plugin for DNF �
playgroundz$Interact with Playground repository.z [enable|disable|upgrade]c	Cs0|j�|jtd�td��dj|j�}|jj|dd�}|j|�}|j�|ddkrft	j
jtd���x�|d	D]�}d
j|d|d�}d
j|jjj
|jdd��}yj||dkr�wpdj|j||�}|jj|dd�}|j|�}|j�|o�d|ko�|ddk�r
|j||�Wqpt	jjk
�r&YqpXqpWdS)Nz!Enabling a Playground repository.zDo you want to continue?z{0}/api/playground/list/zw+)r�r��okzUnknown response from server.r�z{0}/{1}ZusernameZcoprnamez{}/_playground_{}.repor9rjZchrootsz{0}/api/coprs/{1}/detail/{2}/)rvrwrrfrWrBr r��closerRr@rSrCrtr�rxrorp)	r]r~Zapi_urlr�r�r�r}r�Zoutput2rrr�_cmd_enable�s8




zPlaygroundCommand._cmd_enablecCs6|j�x(tjdj|jjj��D]}|j|�q WdS)Nz{}/_playground_*.repo)rv�globrfrBrCrtr|)r]r�rrr�_cmd_disable�szPlaygroundCommand._cmd_disablecCs|jdddddgd�dS)Nr$r%r'r(�upgrade)r,r-)r3)r4rrrr5�szPlaygroundCommand.set_argparsercCs�tjjd��|jjd}|j�}|dkrB|j|�tjt	d��n`|dkrb|j
�tjt	d��n@|dkr�|j
�|j|�tjt	d��ntjjt	d	�j|���dS)
Nz%Playground is temporarily unsupportedrr'z-Playground repositories successfully enabled.r(z.Playground repositories successfully disabled.r�z-Playground repositories successfully updated.zUnknown subcommand {}.)rRrorprAr$rrr�rryrr�rf)r]r$r~rrrr��s

zPlaygroundCommand.runN)r�)
r�r�r�r�r�rr�r�r�r�r�r5r�rrrrr��s r�)/Z
__future__rr�r�r�rDr�Zshutilr�rcr�ZdnfpluginscorerrrRZ
dnf.pycomprZdnf.i18nrr�Zdistrorrr	r
r
�ImportError�platformrG�setZYESZNOZconfigparserrrrZurllib.requestr r!r"Zurllib2ZpluginZregister_commandr@ZCommandr#r�rrrr�<module>sP
Bsite-packages/dnf-plugins/__pycache__/builddep.cpython-36.pyc000064400000016426147511334660020163 0ustar003

@|�e�$�@s�ddlmZddlmZddlmZmZddlZddlZddlZddl	Zddl
ZddlZddlZ
ddlZddlZddlZddlZejjGdd�dejj��ZdS)�)�absolute_import)�unicode_literals)�_�loggerNcs�eZdZdZdZee�Zed�Z�fdd�Zdd�Z	d	d
�Z
edd��Zd
d�Z
dd�Zdd�Zedd��Zdd�Zdd�Zdd�Zdd�Z�ZS)�BuildDepCommand�builddep�	build-depz3Install build dependencies for package or spec filez[PACKAGE|PACKAGE.spec]cs(tt|�j|�tjjj�|_g|_dS)N)	�superr�__init__�dnf�rpmZtransactionZinitReadOnlyTransaction�_rpm_ts�tempdirs)�self�cli)�	__class__��/usr/lib/python3.6/builddep.pyr
/szBuildDepCommand.__init__cCsx|jD]}tj|�qWdS)N)r�shutilZrmtree)r�temp_dirrrr�__del__4szBuildDepCommand.__del__cCs�tjjj|�}|ddkr |jStjj�}tjdd�}t	jj
|t	jj|��}|jj
|�t|d�}zFy|j|jjj||j��Wn$tk
r�}z�WYdd}~XnXWd|j�X|S)	z�
        In case pkgspec is a remote URL, download it to a temporary location
        and use the temporary file instead.
        r�file�Z
dnf_builddep_)�prefixzwb+N)rr)rZpycompZurlparse�path�libdnfZrepoZ
[Binary data omitted: remainder of the compiled bytecode for the dnf builddep plugin (class BuildDepCommand). Identifiable pieces include the helper _download_remote_file, an argument parser offering packages, -D/--define 'MACRO EXPR', --skip-unavailable, --spec and --srpm, and the methods pre_configure, configure, run, _install, _src_deps, _spec_deps and _remote_deps, which resolve build dependencies from .src.rpm, .nosrc.rpm and .spec arguments. The bytecode itself is not reproducible as readable source.]
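The recovered strings above suggest that _src_deps reads the 'requirename' entries from a source RPM header and ignores rpmlib() pseudo-dependencies. A minimal stand-alone sketch of that idea using the Python rpm bindings follows; the file name, the TransactionSet setup and the no-signature flag are assumptions, not recovered from the bytecode.

import os
import rpm  # provided by the python3-rpm bindings

def src_rpm_build_requires(src_fn):
    """Return build requirements recorded in a .src.rpm header (a sketch)."""
    ts = rpm.TransactionSet()
    # Source rpms are often unsigned; don't fail the header read on that (assumption).
    ts.setVSFlags(rpm._RPMVSF_NOSIGNATURES)
    fd = os.open(src_fn, os.O_RDONLY)
    try:
        hdr = ts.hdrFromFdno(fd)
    finally:
        os.close(fd)
    reqs = []
    for name in hdr[rpm.RPMTAG_REQUIRENAME]:
        if isinstance(name, bytes):
            name = name.decode('utf-8')
        # rpmlib() entries describe rpm features, not installable packages.
        if name.startswith('rpmlib('):
            continue
        reqs.append(name)
    return reqs

# Hypothetical usage:
# print(src_rpm_build_requires('example-1.0-1.src.rpm'))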

site-packages/dnf-plugins/__pycache__/reposync.cpython-36.opt-1.pyc

[Binary data omitted: compiled bytecode for the dnf reposync plugin. Identifiable content: classes RPMPayloadLocation and RepoSyncCommand (command "reposync", summary "download all packages from remote repo") with options such as -a/--arch, --delete, --download-metadata, -g/--gpgcheck, -m/--downloadcomps, --metadata-path, -n/--newest-only, --norepopath, -p/--download-path, --remote-time, --source and -u/--urls, plus methods for metadata and comps download, GPG signature checking, deletion of stale local packages and URL printing. A surviving docstring for _get_latest explains that its result is the union of the latest non-modular package NEVRAs, all packages from stream versions carrying those NEVRAs, and all packages from the latest stream version.]

site-packages/dnf-plugins/__pycache__/generate_completion_cache.cpython-36.opt-1.pyc

[Binary data omitted: compiled bytecode for the BashCompletionCache plugin (name "generate_completion_cache"). Recoverable strings show that its sack() hook rewrites an "available" table and its transaction() hook rewrites an "installed" table inside the SQLite database /var/cache/dnf/packages.db, skipping src-arch packages and logging "Can't write completion cache: %s" on sqlite3.OperationalError.]
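The SQL statements recovered above make the caching pattern clear. A minimal sketch of the same idea using only the standard-library sqlite3 module; the package list and the /tmp path in the usage note are placeholders, since the real plugin pulls names from the dnf sack.

import sqlite3

def write_completion_cache(table, package_names, cache_file='/var/cache/dnf/packages.db'):
    """Rewrite one completion table ('available' or 'installed')."""
    conn = sqlite3.connect(cache_file)
    try:
        cur = conn.cursor()
        cur.execute('create table if not exists {0} (pkg TEXT)'.format(table))
        cur.execute('create unique index if not exists pkg_{0} ON {0}(pkg)'.format(table))
        cur.execute('delete from {0}'.format(table))
        rows = [(name,) for name in package_names]  # the plugin also skips arch == 'src'
        cur.executemany('insert or ignore into {0} values (?)'.format(table), rows)
        conn.commit()
    finally:
        conn.close()

# Hypothetical usage:
# write_completion_cache('available', ['bash', 'coreutils'], '/tmp/packages.db')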

site-packages/dnf-plugins/__pycache__/config_manager.cpython-36.opt-1.pyc

[Binary data omitted: compiled bytecode for the dnf config-manager plugin. Identifiable content: ConfigManagerCommand (command "config-manager", summary "manage {prog} configuration options and repositories") with options --save, --add-repo URL, --dump, --dump-variables, --set-enabled and --set-disabled; the modify_repo and add_repo methods; a save_to_file helper that writes repo files with mode 0644; and a sanitize_url_to_fs helper built on the regexes ^\w+:/*(\w+:|www\.)?, [?/:&#|~\*\[\]\(\)\'\\]+, ^[,.]* and [,.]*$ with a sha256-based fallback for over-long names. Added repositories are written from the stub "[%s]\nname=%s\nbaseurl=%s\nenabled=1\n".]
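A rough sketch of the --add-repo flow implied by those strings, using only the standard library; the sanitizing step below approximates the plugin's regexes, omits the hashing of over-long names, and the example reposdir is an assumption.

import os
import re

RE_SCHEME = re.compile(r'^\w+:/*(\w+:|www\.)?')
RE_SLASH = re.compile(r"[?/:&#|~\*\[\]\(\)\'\\]+")

def sanitize_url_to_fs(url):
    """Reduce a repository URL to a string usable as a repo id / file name."""
    name = RE_SCHEME.sub('', url)
    name = RE_SLASH.sub('_', name)
    return name.strip(',.')

def add_repo_from_url(url, reposdir='/etc/yum.repos.d'):
    repoid = sanitize_url_to_fs(url)
    path = os.path.join(reposdir, '%s.repo' % repoid)
    content = '[%s]\nname=created from %s\nbaseurl=%s\nenabled=1\n' % (repoid, url, url)
    with open(path, 'w') as f:
        f.write(content)
    os.chmod(path, 0o644)
    return path

# Hypothetical usage:
# print(add_repo_from_url('https://example.com/repo/', reposdir='/tmp'))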



site-packages/dnf-plugins/__pycache__/repograph.cpython-36.opt-1.pyc

[Binary data omitted: compiled bytecode for the dnf repograph plugin. Identifiable content: a DOT_HEADER block (size="20.69,25.52"; ratio="fill"; rankdir="TB"; orientation=port; node[style="filled"];), the RepoGraph plugin class, and RepoGraphCommand (aliases "repograph"/"repo-graph", summary "Output a full package dependency graph in dot format"), whose do_dot() prints a "digraph packages { ... }" document with colour-coded edges and whose _get_deps() resolves each package requirement to a provider, warning "Nothing provides: '%s'" otherwise.]
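A small sketch producing output of the same shape from an in-memory dependency mapping; the colour handling of the real do_dot() is omitted, and the sample data in the usage note is invented.

DOT_HEADER = '''
size="20.69,25.52";
ratio="fill";
rankdir="TB";
orientation=port;
node[style="filled"];
'''

def print_dot(deps):
    """deps maps a package name to the names of its resolved providers."""
    print('digraph packages {')
    print(DOT_HEADER)
    for pkg, providers in deps.items():
        print('"{}" -> {{'.format(pkg))
        for provider in providers:
            print('"{}"'.format(provider))
        print('};')
    print('}')

# Hypothetical usage:
# print_dot({'bash': ['glibc', 'ncurses-libs'], 'coreutils': ['glibc']})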

site-packages/dnf-plugins/__pycache__/download.cpython-36.opt-1.pyc

[Binary data omitted: compiled bytecode for the dnf download plugin. Identifiable content: DownloadCommand (command "download", summary "Download package to current directory") with options packages, --source, --debuginfo, --debugsource, --arch/--archlist, --resolve, --alldeps, --url/--urls and --urlprotocols (http, https, rsync, ftp), and methods that resolve package specs to dnf.Package objects (_get_packages, _get_packages_with_deps, _get_source_packages, _get_query, _get_query_source), download them or print their remote URLs, and copy already-local command-line packages into the destination directory.]

site-packages/dnf-plugins/__pycache__/copr.cpython-36.pyc

[Binary data omitted: compiled bytecode for the dnf copr plugin. Identifiable content: CoprCommand (command "copr", summary "Interact with Copr repositories") with the usage text "enable name/project [chroot] / disable name/project / remove name/project / list --installed/enabled/disabled / list --available-by-user=NAME / search project" and examples such as "copr enable rhscl/perl516 epel-6-x86_64"; hub selection through copr.conf / copr.d configuration or the --hub option, with copr.fedorainfracloud.org as the default hub; methods _guess_chroot, _download_repo (fetches /coprs/{owner}/repo/{chroot}/dnf.repo and writes it under the repos directory), _runtime_deps_warning, _disable_repo and _remove_repo; a fallback linux_distribution() that parses /etc/os-release when the distro and platform helpers are unavailable; and PlaygroundCommand (command "playground", summary "Interact with Playground repository.").]
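One fragment that survives in readable form is the os-release fallback used when neither distro nor platform.linux_distribution is importable. A sketch of that fallback; the NAME and VERSION_ID keys come from the recovered strings, while the extra robustness checks are assumptions.

def linux_distribution(os_release_path='/etc/os-release'):
    """Return (name, version_id, '') parsed from an os-release style file."""
    data = {}
    with open(os_release_path) as release_file:
        for line in release_file:
            line = line.rstrip()
            if not line or '=' not in line:
                continue
            key, value = line.split('=', 1)
            data[key] = value.strip('"')
    return data.get('NAME', ''), data.get('VERSION_ID', ''), ''

# Hypothetical usage:
# name, version, _ = linux_distribution()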

site-packages/dnf-plugins/__pycache__/system_upgrade.cpython-36.opt-1.pyc

[Binary data omitted: compiled bytecode for the dnf system-upgrade plugin ("system_upgrade.py - DNF plugin to handle major-version system upgrades."). Identifiable content: journal MESSAGE_IDs for download-finished, reboot-requested, upgrade-started and upgrade-finished events; helpers reboot(), get_url_from_os_release(), clear_dir(), check_release_ver() and disable_blanking(); a State class persisting JSON state under the plugin data dir var/lib/dnf/system-upgrade; PlymouthOutput and PlymouthTransactionProgress, which drive /usr/bin/plymouth (--ping, display-message, change-mode, system-update --progress); and SystemUpgradePlugin with SystemUpgradeCommand (aliases "system-upgrade"/"fedup", summary "Prepare system for upgrade to a new release"), OfflineUpgradeCommand and OfflineDistrosyncCommand implementing the download, reboot, upgrade, clean and log subcommands.]
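The PlymouthOutput strings show a thin wrapper around the plymouth binary that stops calling it once it becomes unreachable. A trimmed sketch of that wrapper; the real class also de-duplicates repeated messages, which is omitted here.

from subprocess import call

PLYMOUTH = '/usr/bin/plymouth'

class PlymouthOutput(object):
    """Send boot-splash status to plymouth, giving up once it stops answering."""

    def __init__(self):
        self.alive = True

    def _plymouth(self, cmd, *args):
        if not self.alive and cmd != '--ping':
            return False
        try:
            self.alive = call((PLYMOUTH, cmd) + args) == 0
        except OSError:
            self.alive = False
        return self.alive

    def ping(self):
        return self._plymouth('--ping')

    def message(self, msg):
        return self._plymouth('display-message', '--text', msg)

    def progress(self, percent):
        return self._plymouth('system-update', '--progress', str(percent))

# Hypothetical usage (a harmless no-op on systems without plymouth):
# PlymouthOutput().message('Upgrade in progress...')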

site-packages/dnf-plugins/__pycache__/repoclosure.cpython-36.pyc

[Binary data omitted: compiled bytecode for the dnf repoclosure plugin. Identifiable content: the RepoClosure plugin class and RepoClosureCommand (command "repoclosure", summary "Display a list of unresolved dependencies for repositories") with options --arch, --check, -n/--newest and --pkg; run() prints "package: {} from {}" followed by each unresolved dependency and raises "Repoclosure ended with unresolved dependencies." when any are found; _get_unresolved() skips "solvable:" and "rpmlib(" requirements and collects every remaining requirement that has no available provider.]
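A condensed sketch of that closure check against a loaded dnf sack, assuming the dnf Python API is available; repository selection, arch filtering and the --newest handling of the real command are omitted.

import dnf

def unresolved_deps():
    base = dnf.Base()
    base.read_all_repos()
    base.fill_sack(load_system_repo=False)
    available = base.sack.query().available()
    unresolved = {}
    for pkg in available:
        missing = set()
        for req in pkg.requires:
            name = str(req)
            # Internal rpm/libsolv requirements are never provided by packages.
            if name.startswith('solvable:') or name.startswith('rpmlib('):
                continue
            if len(available.filter(provides=req)) == 0:
                missing.add(name)
        if missing:
            unresolved[pkg] = missing
    return unresolved

# Hypothetical usage:
# for pkg, deps in unresolved_deps().items():
#     print('package: {} from {}'.format(pkg, pkg.reponame))
#     for dep in sorted(deps):
#         print('    {}'.format(dep))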

site-packages/dnf-plugins/__pycache__/generate_completion_cache.cpython-36.pyc

[Binary data omitted: compiled bytecode for the BashCompletionCache plugin, a second (non-optimized) build of the same generate_completion_cache module described above; it maintains the "available" and "installed" tables in /var/cache/dnf/packages.db.]

site-packages/dnf-plugins/__pycache__/groups_manager.cpython-36.pyc

[Binary data omitted: compiled bytecode for the dnf groups-manager plugin. Identifiable content: GroupsManagerCommand (command "groups-manager", summary "create and edit groups metadata file"), which uses libcomps to load, merge, edit and save comps XML group metadata (plain or gzip-compressed); argparse validators group_id_type (ids restricted to [-a-z0-9_.:]) and translation_type ('lang:text' pairs) plus a text_to_id() helper; options --load, --save, --merge, --print, --id, -n/--name, --description, --display-order, --translated-name, --translated-description, --user-visible/--not-user-visible, --mandatory, --optional, --remove, --dependencies and a list of package specs; and methods load_input_files, save_output_files, find_group and edit_group.]
j�}|jjrt|jj|_|jj|_nD|jjr�t|jj�}|j|dd�r�tj
jt	d�j||jj���||_|jjj|�|j|�|j�|jjs�|jjr�t|jjtd��dS)N)r rFz-Can't remove packages from non-existent groupzRGroup id '{}' generated from '{}' is duplicit. Please specify group id using --id.)r])r\r<rErFrerlrrWrXrr&ZGroupr!rrrr(rcr+rorb�printr@Zxml_strr`)r)rdr rrr�run!s,

zGroupsManagerCommand.run)r#)�__name__�
__module__�__qualname__�aliasesrZsummaryr%�staticmethodr:rGr\rbrerorq�
__classcell__rr)r*rr"Ls1$=r")Z
__future__rrrrJr&rOrrMrLZdnfpluginscorerrrZdnf.clir�compilerrrr`r
rr!ZpluginZregister_commandrZCommandr"rrrr�<module>s,
site-packages/dnf-plugins/__pycache__/debug.cpython-36.pyc000064400000025101147511334660017447 0ustar00
[Compiled CPython 3.6 bytecode, binary contents omitted. The embedded strings identify /usr/lib/python3.6/debug.py, which
registers two commands: "debug-dump" ("dump information about installed rpm packages to file") and "debug-restore"
("restore packages recorded in debug-dump file"). The dump defaults to dnf_debug_dump-<host>-<timestamp>.txt.gz, begins
with the line "dnf-debug-dump version 1", and contains %%%%SYSTEM INFO, %%%%DNF INFO, %%%%RPMDB PROBLEMS, %%%%RPMDB,
%%%%REPOS and %%%%RPMDB VERSIONS sections.]
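The version line and gzip handling visible in the strings above describe the dump file format. As a minimal sketch under
those assumptions (the helper name is invented and this is not the plugin's actual code), opening such a dump might look
like:

    import gzip

    def open_debug_dump(filename):
        # Dumps may be gzip-compressed and start with a version line
        # such as "dnf-debug-dump version 1".
        fobj = gzip.open(filename, "rt") if filename.endswith(".gz") else open(filename)
        first_line = fobj.readline()
        if not first_line.startswith("dnf-debug-dump version"):
            fobj.close()
            raise ValueError("Bad dnf debug file: %s" % filename)
        return fobj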
site-packages/dnf-plugins/__pycache__/repograph.cpython-36.pyc000064400000005342147511334660020355 0ustar00
[Compiled CPython 3.6 bytecode, binary contents omitted. The embedded strings identify /usr/lib/python3.6/repograph.py,
the "repograph" / "repo-graph" command ("Output a full package dependency graph in dot format"). It prints a
"digraph packages {" document whose header sets size="20.69,25.52"; ratio="fill"; rankdir="TB"; orientation=port;
node[style="filled"]; and emits one edge per resolved dependency.]
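The dot-format strings recovered above outline the shape of the output. As an illustrative sketch only (the function
below is not the plugin's code; the mapping format and names are assumed for the example), producing a comparable graph
could look like:

    def write_dot(dependencies, out):
        # dependencies: mapping of package name -> iterable of required package names.
        out.write('digraph packages {\n')
        out.write('size="20.69,25.52";\nratio="fill";\nrankdir="TB";\n')
        out.write('orientation=port;\nnode[style="filled"];\n')
        for pkg, requires in sorted(dependencies.items()):
            for req in requires:
                out.write('"{}" -> "{}";\n'.format(pkg, req))
        out.write('}\n')

Called with something like {"bash": ["glibc"]} and sys.stdout, this prints a graph that Graphviz can render.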
site-packages/dnf-plugins/__pycache__/config_manager.cpython-36.pyc000064400000016133147511334660021325 0ustar00
[Compiled CPython 3.6 bytecode, binary contents omitted. The embedded strings identify /usr/lib/python3.6/config_manager.py,
the "config-manager" command ("manage {prog} configuration options and repositories"). It implements --save, --add-repo,
--dump, --dump-variables, --set-enabled and --set-disabled, writes new "[repoid] / name= / baseurl= / enabled=1" .repo
files into the reposdir, and sanitizes repo ids derived from URLs.]
site-packages/dnf-plugins/__pycache__/changelog.cpython-36.pyc000064400000010117147511334660020311 0ustar00
[Compiled CPython 3.6 bytecode, binary contents omitted. Embedded strings identify /usr/lib/python3.6/changelog.py, the
"changelog" command ("Show changelog data of packages") with --since, --count and --upgrades filters.]

site-packages/dnf-plugins/__pycache__/repodiff.cpython-36.pyc
[Compiled CPython 3.6 bytecode, binary contents omitted. Embedded strings identify /usr/lib/python3.6/repodiff.py, the
"repodiff" command ("List differences between two sets of repositories"): it compares --repo-old and --repo-new package
sets and reports added, removed, upgraded, downgraded and obsoleted packages, optionally with size changes.]

site-packages/dnf-plugins/__pycache__/debuginfo-install.cpython-36.pyc
[Compiled CPython 3.6 bytecode, binary contents omitted. Embedded strings identify /usr/lib/python3.6/debuginfo-install.py,
the "debuginfo-install" command ("install debuginfo packages"), which resolves matching -debuginfo and -debugsource
packages for the requested specs.]

site-packages/dnf-plugins/__pycache__/download.cpython-36.pyc
[Compiled CPython 3.6 bytecode, binary contents omitted. Embedded strings identify /usr/lib/python3.6/download.py, the
"download" command ("Download package to current directory") with --source, --debuginfo, --debugsource, --resolve,
--alldeps, --url and --urlprotocols options.]

site-packages/dnf-plugins/__pycache__/reposync.cpython-36.pyc
[Compiled CPython 3.6 bytecode, binary contents omitted. Embedded strings identify /usr/lib/python3.6/reposync.py, the
"reposync" command ("download all packages from remote repo") with options such as --download-metadata, --downloadcomps,
--newest-only, --delete, --gpgcheck, --source and --urls.]

site-packages/dnf-plugins/__pycache__/builddep.cpython-36.opt-1.pyc
[Compiled CPython 3.6 optimized bytecode, binary contents omitted. Embedded strings identify /usr/lib/python3.6/builddep.py,
the "builddep" / "build-dep" command ("Install build dependencies for package or spec file"), which handles .spec files,
source rpms and remote URLs.]

site-packages/dnf-plugins/__pycache__/debuginfo-install.cpython-36.opt-1.pyc
[Compiled CPython 3.6 optimized bytecode for the same "debuginfo-install" command as above, binary contents omitted.]

site-packages/dnf-plugins/__pycache__/debug.cpython-36.opt-1.pyc
[Compiled CPython 3.6 optimized bytecode for the same "debug-dump" / "debug-restore" commands as above, binary contents
omitted.]
wsite-packages/dnf-plugins/__pycache__/repomanage.cpython-36.opt-1.pyc000064400000014573147511334660021451 0ustar003

@|�eJ)�@szddlmZddlmZddlmZmZddlZddlZddlZddl	Z	ddl
Z
Gdd�dej�ZGdd�dej
j�ZdS)	�)�absolute_import)�unicode_literals)�_�loggerNcs eZdZdZ�fdd�Z�ZS)�
RepoManage�
repomanagecs,tt|�j||�|dkrdS|jt�dS)N)�superr�__init__Zregister_command�RepoManageCommand)�self�base�cli)�	__class__�� /usr/lib/python3.6/repomanage.pyr	$szRepoManage.__init__)�__name__�
__module__�__qualname__�namer	�
__classcell__rr)rrr src@s\eZdZdZed�Zdd�Zdd�Zdd�Ze	d	d
��Z
e	dd��Zd
d�Ze	dd��Z
dS)r
rz"Manage a directory of rpm packagescCs,|jjr(|jjr(|jjtjtjd�dS)N)�stdout�stderr)�opts�verbose�quietr
Zredirect_logger�loggingZWARNING�INFO)rrrr�
pre_configure/szRepoManageCommand.pre_configurecCs0|jjr|jjr|jj�|jj}d|_dS)NT)rrrr
Zredirect_repo_progress�demandsZsack_activation)rrrrr�	configure3s
zRepoManageCommand.configurec"s@�jjr �jjr tjjtd����jjr@�jjr@tjjtd����jjr`�jjr`tjjtd����jjr|�jjr|d�j_i}i}i}t�}t	�jj
�}y�d}�jjj
|�jj�jjgd�}|jj��jj|�tjj�r>�jj��jjj�}xH|D]@}	|	j�|kr�|j|	j��|j|	j�i�j|	j�g�j|	�q�WWn�tjjk
�r�g}
�j�jjd�}
t |
�dk�r�tjjtd	����jj!ddd
��jj"ddd�y�jj#|
�jj$j%d
�Wn0t&k
�r�t'j(td�j)dj*|
���YnXYnX�jj+j,t-j.d�j/�}dd�|j0|j0|d�d�j/�D�}|j1�x�|D]~}
|
j2|
j3f}||k�rx|
||k�r�||j|
�n
|
g||<�j4|
�}||k�r�||j�j5|
��n�j5|
�g||<�q@Wg}t�}�jj�r�xh|j6�D]\\}}|||f}||d�}x6|D].}�j4|�}x||D]}|j|��q W�qW�q�Wxb|j7�D]V}t8|j6��}||d�}x4|D],}x$||D]}|jt|j����q|W�qnW�qJW�jj�r|xh|j6�D]\\}}|||f}|d|�}x6|D].}�j4|�}x||D]}|j|��q�W�q�W�q�Wxb|j7�D]V}t8|j6��}|d|�}x4|D],}x$||D]}|jt|j����qTW�qFW�q"W�jj�r�xh|j6�D]\\}}|||f}|d|�}x6|D].}�j4|�}x||D]}|j|��q�W�q�W�q�Wt�}xb|j7�D]V}t8|j6��}||d�}x4|D],}x$||D]}|jt|j����q2W�q$W�qWxx|j7�D]l}t8|j6��}|d|�}xJ|D]B}x:||D].}x&|j�D]} | |k�r�|j9| ��q�W�q�W�q�W�qdW�fdd�|j0|j0|d�d�j/�D�}!||!}|j1��jj:�r$t;dj*|��nx|D]}
t;|
��q*WdS)Nz%Pass either --old or --new, not both!z)Pass either --oldonly or --new, not both!z)Pass either --old or --oldonly, not both!TZrepomanage_repo)Zbaseurlz.rpmrzNo files to process)�sack�reposF)Zload_system_repoZload_available_repos)�progresszCould not open {}z, )�flagscSsg|]}|�qSrr)�.0�xrrr�
<listcomp>osz)RepoManageCommand.run.<locals>.<listcomp>)Znevra_strict)Zpkg__neqcsg|]}�j|��qSr)�_package_to_path)r$r%)rrrr&�s)Zpkg__eq� )<r�new�old�dnf�
exceptions�ErrorrZoldonly�set�intZkeeprr!Zadd_new_repoZconf�pathZ_repoZexpireZ_add_repo_to_sackZWITH_MODULESZ_setup_modular_excludesZ_moduleContainerZgetModulePackagesZ	getRepoID�updateZgetArtifacts�
setdefaultZ
getNameStreamZ
getVersionNum�appendZ	RepoError�_get_file_list�len�resetZ	fill_sackZadd_remote_rpms�outputr"�IOErrorrZwarning�format�joinr �query�hawkeyZIGNORE_MODULAR_EXCLUDESZ	available�filter�sortr�arch�_package_to_nevrar'�keys�values�sorted�addZspace�print)"rZverfileZpkgdictZmodule_dictZall_modular_artifactsZkeepnumZREPOMANAGE_REPOIDZ	repo_confZmodule_packagesZmodule_packageZrpm_listr;Zpackages�pkgZnaZnevraZoutputpackagesZkeepnum_latest_stream_artifacts�n�aZevrlistZnewevrs�packageZfpkgZstreams_by_versionZsorted_stream_versionsZnew_sorted_stream_versions�i�streamZoldevrsZold_sorted_stream_versionsZkeepnum_newer_stream_artifactsZartifactZmodular_packagesr)rr�run9s�



&"








$



$



$

"&

zRepoManageCommand.runc	Cs�|jdddtd�d�|jdddtd�d�|jd	d
dtd�d�|jdd
dtd�d�|jddddtd�dtd�|jddtd�d�dS)Nz-oz--old�
store_truezPrint the older packages)�action�helpz-Oz	--oldonlyz6Print the older packages. Exclude the newest packages.z-nz--newzPrint the newest packagesz-sz--spacez#Space separated output, not newlinez-kz--keepZstoreZKEEPz)Newest N packages to keep - defaults to 1�)rN�metavarrO�default�typer0zPath to directory)�add_argumentrr/)�parserrrr�
set_argparser�s




zRepoManageCommand.set_argparsercCs`g}xVtj|�D]H\}}}x<|D]4}tjj|�dj�t|�kr |jtjj||��q WqW|S)zJReturn all files in path matching ext

        return list object
        rP)�os�walkr0�splitext�lower�strr3r:)r0ZextZfilelist�root�dirs�files�frrrr4�s
z RepoManageCommand._get_file_listcCs*t|jj�r tjj|jj|j�S|jSdS)N)r5rr!rWr0r:r�location)rrFrrrr'�sz"RepoManageCommand._package_to_pathcCs|j|j|j|j|jfS)N)rZepoch�version�releaser?)rFrrrr@sz#RepoManageCommand._package_to_nevraN)r)rrr�aliasesrZsummaryrrrL�staticmethodrVr4r'r@rrrrr
+s$r
)Z
__future__rrZdnfpluginscorerrr+Zdnf.clirrWr<ZPluginrr
ZCommandr
rrrr�<module>ssite-packages/dnf-plugins/__pycache__/repodiff.cpython-36.opt-1.pyc000064400000017063147511334660021126 0ustar003

�gt`�,�@sjddlmZddlmZddlZddlmZddlZddlm	Z	Gdd�dej
�ZGdd	�d	ejj
�ZdS)
�)�absolute_import)�unicode_literalsN)�OptionParser)�_cs eZdZdZ�fdd�Z�ZS)�RepoDiff�repodiffcs,tt|�j||�|dkrdS|jt�dS)N)�superr�__init__Zregister_command�RepoDiffCommand)�self�base�cli)�	__class__��/usr/lib/python3.6/repodiff.pyr	$szRepoDiff.__init__)�__name__�
__module__�__qualname__�namer	�
__classcell__rr)rrr src@sLeZdZdZed�Zedd��Zdd�Zdd�Z	d	d
�Z
dd�Zd
d�ZdS)r
rz1List differences between two sets of repositoriesc	Cs�|jddgddtd�d�|jddgdd	td
�d�|jddd
gtjdtd�d�|jdddtd�d�|jddtd�d�|jddtd�d�|jddtd�d�dS)Nz
--repo-oldz-o�append�oldz2Specify old repository, can be used multiple times)�default�action�dest�helpz
--repo-newz-n�newz2Specify new repository, can be used multiple timesz--archz
--archlistz-a�archeszhSpecify architectures to compare, can be used multiple times. By default, only source rpms are compared.z--sizez-s�
store_truez5Output additional data about the size of the changes.)rrz--compare-archzMCompare packages also by arch. By default packages are compared just by name.z--simplez7Output a simple one line message for modified packages.z--downgradezNSplit the data for modified packages between upgraded and downgraded packages.)�add_argumentrrZ_SplitCallback)�parserrrr�
set_argparser/s

zRepoDiffCommand.set_argparsercCs�|jj}d|_d|_d|_dg|jj_|jj	s:|jj
rNtd�}tj
j|��x<|jjj�D],}|j|jj	|jj
kr�|j�q\|j�q\W|jjs�dg|j_dS)NT�allz*Both old and new repositories must be set.�src)r
�demandsZsack_activationZavailable_repos�
changelogsrZconfZdisable_excludes�optsrrr�dnf�
exceptions�ErrorZreposr"�id�enable�disabler)rr$�msgZreporrr�	configureMs
zRepoDiffCommand.configurecCs|jjr|j|jfS|jS)N)r&�compare_archr�arch)r�pkgrrr�_pkgkey`szRepoDiffCommand._pkgkeyc
s6t�fdd�|D���t�j��}t�fdd�|D���t�j��}t�}x:|j|d�D]*}x$|j|jd�D]}||�j|�<qlWqXW�jjj}t�fdd�||D��fdd�||D�|ggd�}	xj|j	|�D]\}
�|
}�|
}|j
|j
kr�q�||j
|j
�d	k�r|	d
j||f�q�|	dj||f�q�W|	S)aNcompares packagesets old and new, returns dictionary with packages:
        added: only in new set
        removed: only in old set
        upgraded: in both old and new, new has bigger evr
        downgraded: in both old and new, new has lower evr
        obsoletes: dictionary of which old package is obsoleted by which new
        csg|]}�j|�|f�qSr)r2)�.0�p)rrr�
<listcomp>msz-RepoDiffCommand._repodiff.<locals>.<listcomp>csg|]}�j|�|f�qSr)r2)r3r4)rrrr5os)�	obsoletes)Zprovidescsg|]}�|�qSrr)r3�k)�new_drrr5zscsg|]}�|�qSrr)r3r7)�old_drrr5{s)�added�removedr6�upgraded�
downgradedrr=r<)�dict�set�keys�filterr6r2r�sack�evr_cmp�intersection�evrr)
rrrZold_keysZnew_keysr6Z	obsoleterZ	obsoletedrCrr7�pkg_old�pkg_newr)r8r9rr�	_repodiffes0
zRepoDiffCommand._repodiffc
sh�fdd��dd�}��fdd�}tddddd�}x<t|d	�D],}ttd
�j�|���|d	|j7<q@Wxjt|d�D]Z}ttd�j�|���|d
j�j|��}|r�ttd�j�|���|d|j7<q~W�jj	�r�|d�r:ttd��x<t|d�D],\}}|d|j|j7<|||��q
W|d�r�ttd��x�t|d�D],\}}|d|j|j7<|||��q^Wn\|d|d}	|	�r�ttd��x8t|	�D],\}}|d|j|j7<|||��q�Wttd��ttd�jt
|d	���ttd�jt
|d����jj	�rlttd�jt
|d���ttd�jt
|d���n&ttd�jt
|d�t
|d����jj�rdttd�j||d	���ttd�j||d����jj	�s�ttd�j||d|d���n4ttd�j||d���ttd�j||d���ttd�j||d	|d|d|d���dS) Ncs �jjrt|�Sd|j|jfS)Nz%s-%s)r&r/�strrrE)r1)rrr�pkgstr�sz'RepoDiffCommand._report.<locals>.pkgstrcSsXt|�}|dkr.|djtjjj|�j��7}n&|dkrT|djtjjj|�j��7}|S)Nrz ({})z (-{}))rI�formatr'r
Z
format_number�strip)Znumr-rrr�sizestr�sz(RepoDiffCommand._report.<locals>.sizestrcsBg}�jjr*|jd�|��|�f��n|jd�|jd�|��|�f�|jdt|d
��|jrv|jd}nd}x�|jD]�}|r�|d|dkr�Pn2|d|dkr�|d|dkr�|d|dkr�P|jd	|djd
�tjj|d�tjj|d�f�q�W�jj	�r0|jt
d�j|j	|j	��tdj
|��dS)Nz%s -> %s��-�rZ	timestampZauthor�textz
* %s %s
%sz%a %b %d %YzSize change: {} bytes�
���)r&Zsimpler�lenr%Zstrftimer'Zi18nZucd�sizerrK�print�join)rFrGZmsgsZ	old_chlogZchlog)rJrrr�report_modified�s2

z0RepoDiffCommand._report.<locals>.report_modifiedr)r:r;r<r=r:zAdded package  : {}r;zRemoved package: {}r6zObsoleted by   : {}r<z
Upgraded packagesr=z
Downgraded packagesz
Modified packagesz
SummaryzAdded packages: {}zRemoved packages: {}zUpgraded packages: {}zDowngraded packages: {}zModified packages: {}zSize of added packages: {}zSize of removed packages: {}zSize of modified packages: {}zSize of upgraded packages: {}zSize of downgraded packages: {}zSize change: {})r>�sortedrVrrKrU�getr2r&Z	downgraderT)
rrrMrXZsizesr1ZobsoletedbyrFrGZmodifiedr)rJrr�_report�sf










zRepoDiffCommand._reportcCs�|jjjtj�j|jjd�}|jjjtj�j|jjd�}|jj	rld|jj	krl|j
|jj	d�|j
|jj	d�|jjr�|j
dd�|j
dd�n|j
dd�|j
dd�|j�|j�|j
|j||��dS)N)Zreponame�*)r0rP)Zlatest_per_arch)Zlatest)rrBZquery�hawkeyZIGNORE_EXCLUDESrAr&rrrZfiltermr/Zapplyr[rH)rZq_newZq_oldrrr�run�szRepoDiffCommand.runN)r)
rrr�aliasesrZsummary�staticmethodr!r.r2rHr[r^rrrrr
+s&ar
)Z
__future__rrZdnf.clir'Zdnf.cli.option_parserrr]ZdnfpluginscorerZPluginrr
ZCommandr
rrrr�<module>ssite-packages/dnf-plugins/__pycache__/repomanage.cpython-36.pyc000064400000014573147511334660020512 0ustar003

@|�eJ)�@szddlmZddlmZddlmZmZddlZddlZddlZddl	Z	ddl
Z
Gdd�dej�ZGdd�dej
j�ZdS)	�)�absolute_import)�unicode_literals)�_�loggerNcs eZdZdZ�fdd�Z�ZS)�
RepoManage�
repomanagecs,tt|�j||�|dkrdS|jt�dS)N)�superr�__init__Zregister_command�RepoManageCommand)�self�base�cli)�	__class__�� /usr/lib/python3.6/repomanage.pyr	$szRepoManage.__init__)�__name__�
__module__�__qualname__�namer	�
__classcell__rr)rrr src@s\eZdZdZed�Zdd�Zdd�Zdd�Ze	d	d
��Z
e	dd��Zd
d�Ze	dd��Z
dS)r
rz"Manage a directory of rpm packagescCs,|jjr(|jjr(|jjtjtjd�dS)N)�stdout�stderr)�opts�verbose�quietr
Zredirect_logger�loggingZWARNING�INFO)rrrr�
pre_configure/szRepoManageCommand.pre_configurecCs0|jjr|jjr|jj�|jj}d|_dS)NT)rrrr
Zredirect_repo_progress�demandsZsack_activation)rrrrr�	configure3s
zRepoManageCommand.configurec"s@�jjr �jjr tjjtd����jjr@�jjr@tjjtd����jjr`�jjr`tjjtd����jjr|�jjr|d�j_i}i}i}t�}t	�jj
�}y�d}�jjj
|�jj�jjgd�}|jj��jj|�tjj�r>�jj��jjj�}xH|D]@}	|	j�|kr�|j|	j��|j|	j�i�j|	j�g�j|	�q�WWn�tjjk
�r�g}
�j�jjd�}
t |
�dk�r�tjjtd	����jj!ddd
��jj"ddd�y�jj#|
�jj$j%d
�Wn0t&k
�r�t'j(td�j)dj*|
���YnXYnX�jj+j,t-j.d�j/�}dd�|j0|j0|d�d�j/�D�}|j1�x�|D]~}
|
j2|
j3f}||k�rx|
||k�r�||j|
�n
|
g||<�j4|
�}||k�r�||j�j5|
��n�j5|
�g||<�q@Wg}t�}�jj�r�xh|j6�D]\\}}|||f}||d�}x6|D].}�j4|�}x||D]}|j|��q W�qW�q�Wxb|j7�D]V}t8|j6��}||d�}x4|D],}x$||D]}|jt|j����q|W�qnW�qJW�jj�r|xh|j6�D]\\}}|||f}|d|�}x6|D].}�j4|�}x||D]}|j|��q�W�q�W�q�Wxb|j7�D]V}t8|j6��}|d|�}x4|D],}x$||D]}|jt|j����qTW�qFW�q"W�jj�r�xh|j6�D]\\}}|||f}|d|�}x6|D].}�j4|�}x||D]}|j|��q�W�q�W�q�Wt�}xb|j7�D]V}t8|j6��}||d�}x4|D],}x$||D]}|jt|j����q2W�q$W�qWxx|j7�D]l}t8|j6��}|d|�}xJ|D]B}x:||D].}x&|j�D]} | |k�r�|j9| ��q�W�q�W�q�W�qdW�fdd�|j0|j0|d�d�j/�D�}!||!}|j1��jj:�r$t;dj*|��nx|D]}
t;|
��q*WdS)Nz%Pass either --old or --new, not both!z)Pass either --oldonly or --new, not both!z)Pass either --old or --oldonly, not both!TZrepomanage_repo)Zbaseurlz.rpmrzNo files to process)�sack�reposF)Zload_system_repoZload_available_repos)�progresszCould not open {}z, )�flagscSsg|]}|�qSrr)�.0�xrrr�
<listcomp>osz)RepoManageCommand.run.<locals>.<listcomp>)Znevra_strict)Zpkg__neqcsg|]}�j|��qSr)�_package_to_path)r$r%)rrrr&�s)Zpkg__eq� )<r�new�old�dnf�
exceptions�ErrorrZoldonly�set�intZkeeprr!Zadd_new_repoZconf�pathZ_repoZexpireZ_add_repo_to_sackZWITH_MODULESZ_setup_modular_excludesZ_moduleContainerZgetModulePackagesZ	getRepoID�updateZgetArtifacts�
setdefaultZ
getNameStreamZ
getVersionNum�appendZ	RepoError�_get_file_list�len�resetZ	fill_sackZadd_remote_rpms�outputr"�IOErrorrZwarning�format�joinr �query�hawkeyZIGNORE_MODULAR_EXCLUDESZ	available�filter�sortr�arch�_package_to_nevrar'�keys�values�sorted�addZspace�print)"rZverfileZpkgdictZmodule_dictZall_modular_artifactsZkeepnumZREPOMANAGE_REPOIDZ	repo_confZmodule_packagesZmodule_packageZrpm_listr;Zpackages�pkgZnaZnevraZoutputpackagesZkeepnum_latest_stream_artifacts�n�aZevrlistZnewevrs�packageZfpkgZstreams_by_versionZsorted_stream_versionsZnew_sorted_stream_versions�i�streamZoldevrsZold_sorted_stream_versionsZkeepnum_newer_stream_artifactsZartifactZmodular_packagesr)rr�run9s�



&"








$



$



$

"&

zRepoManageCommand.runc	Cs�|jdddtd�d�|jdddtd�d�|jd	d
dtd�d�|jdd
dtd�d�|jddddtd�dtd�|jddtd�d�dS)Nz-oz--old�
store_truezPrint the older packages)�action�helpz-Oz	--oldonlyz6Print the older packages. Exclude the newest packages.z-nz--newzPrint the newest packagesz-sz--spacez#Space separated output, not newlinez-kz--keepZstoreZKEEPz)Newest N packages to keep - defaults to 1�)rN�metavarrO�default�typer0zPath to directory)�add_argumentrr/)�parserrrr�
set_argparser�s




zRepoManageCommand.set_argparsercCs`g}xVtj|�D]H\}}}x<|D]4}tjj|�dj�t|�kr |jtjj||��q WqW|S)zJReturn all files in path matching ext

        return list object
        rP)�os�walkr0�splitext�lower�strr3r:)r0ZextZfilelist�root�dirs�files�frrrr4�s
z RepoManageCommand._get_file_listcCs*t|jj�r tjj|jj|j�S|jSdS)N)r5rr!rWr0r:r�location)rrFrrrr'�sz"RepoManageCommand._package_to_pathcCs|j|j|j|j|jfS)N)rZepoch�version�releaser?)rFrrrr@sz#RepoManageCommand._package_to_nevraN)r)rrr�aliasesrZsummaryrrrL�staticmethodrVr4r'r@rrrrr
+s$r
)Z
__future__rrZdnfpluginscorerrr+Zdnf.clirrWr<ZPluginrr
ZCommandr
rrrr�<module>ssite-packages/dnf-plugins/__pycache__/repoclosure.cpython-36.opt-1.pyc000064400000010500147511334660021657 0ustar003

�gt`��@sVddlmZddlmZddlmZddlZGdd�dej�ZGdd�dej	j
�ZdS)	�)�absolute_import)�unicode_literals)�_Ncs eZdZdZ�fdd�Z�ZS)�RepoClosure�repoclosurecs,tt|�j||�|dkrdS|jt�dS)N)�superr�__init__Zregister_command�RepoClosureCommand)�self�base�cli)�	__class__��!/usr/lib/python3.6/repoclosure.pyr!szRepoClosure.__init__)�__name__�
__module__�__qualname__�namer�
__classcell__rr)r
rrsrc@s>eZdZdZed�Zdd�Zdd�Zd
dd	�Ze	d
d��Z
dS)r	rz:Display a list of unresolved dependencies for repositoriescCsd|jj}d|_d|_|jjr`xB|jjj�D]2}|j	|jjkrT|j	|jj
krT|j�q*|j�q*WdS)NT)
r�demandsZsack_activationZavailable_repos�opts�repor�repos�all�id�check�disable�enable)r
rrrrr�	configure,s
zRepoClosureCommand.configurecCs�|jjr|j|jj�}n|j�}xRt|j��D]B}tdjt|�|j��td�x||D]}tdj|��qZWq.Wt	|�dkr�t
d�}tjj
|��dS)Nzpackage: {} from {}z  unresolved deps:z    {}rz/Repoclosure ended with unresolved dependencies.)r�arches�_get_unresolved�sorted�keys�print�format�str�reponame�lenr�dnf�
exceptions�Error)r
�
unresolved�pkgZdep�msgrrr�run7szRepoClosureCommand.runNcsLi}t�}|jjr�|jjj�jdd��|jjj�jdd�}xv|jjj�D]D}�j	|jjj�j|j
d�j���|j	|jjj�j|j
d�j��}qHWn |jjj�j��|jjj�j�}|jj
�rN|jjj�jdd�}g}xT|jj
D]H}tjj|�}	|j|	j|jjdddd��}
|
�r|j	|
�}q�|j|�q�W|�rJtjjtd�dj|���|}|jj�rh|j|jjd�|dk	�r~|j|d�|jjj�r��jdd	��j�|j�xf|D]^}t�||<xL|jD]B}t|�}|jd
��s�|jd��r�q�|j |�||j |��q�W�q�Wt�fdd
�|D����fdd�|j!�D�}
dd�|
j!�D�S)NT)�empty)r&F)Z
with_nevraZ
with_providesZwith_filenameszno package matched: %sz, )�arch)Zlatest_per_archz	solvable:zrpmlib(c3s|]}�j|d�s|VqdS))ZprovidesN)�filter)�.0�x)�	availablerr�	<genexpr>�sz5RepoClosureCommand._get_unresolved.<locals>.<genexpr>cs(i|] \}}t�fdd�|D��|�qS)c3s|]}|�kr|VqdS)Nr)r2r3)�unresolved_depsrrr5�sz@RepoClosureCommand._get_unresolved.<locals>.<dictcomp>.<genexpr>)�set)r2�k�v)r6rr�
<dictcomp>�sz6RepoClosureCommand._get_unresolved.<locals>.<dictcomp>cSsi|]\}}|r||�qSrr)r2r8r9rrrr:�s)"r7rZnewestrZsackZqueryr1rZiter_enabled�unionrZlatestr4�pkglistr(ZsubjectZSubject�intersectionZget_best_query�appendr)r*r�joinrZfiltermZconfZbestZapplyZrequiresr%�
startswith�add�items)r
r0r+ZdepsZto_checkrZ	pkglist_q�errorsr,ZsubjZpkg_qZreqZreqnameZunresolved_transitionr)r4r6rr Es\ &






z"RepoClosureCommand._get_unresolvedcCs`|jdgddtd�d�|jdgdtd�d�|jd	d
dtd�d
�|jdgdtd�dd�dS)Nz--archr>rzBcheck packages of the given archs, can be specified multiple times)�default�action�dest�helpz--checkzSpecify repositories to check)rDrErGz-nz--newest�
store_truez+Check only the newest packages in the repos)rErGz--pkgz#Check closure for this package onlyr<)rDrErGrF)�add_argumentr)�parserrrr�
set_argparser�s


z RepoClosureCommand.set_argparser)r)N)rrr�aliasesrZsummaryrr.r �staticmethodrKrrrrr	(s
Qr	)Z
__future__rrZdnfpluginscorerZdnf.clir(ZPluginrrZCommandr	rrrr�<module>s
site-packages/dnf-plugins/__pycache__/needs_restarting.cpython-36.pyc000064400000023602147511334660021725 0ustar003

@|�e`.�	@s$ddlmZddlmZddlmZddlmZddlmZmZddlZddl	Zddl
Z
ddlZddlZddl
Z
ddlZddlZddd	d
ddd
ddg	ZdgZdd�Zdd�Zdd�Zdd�Zdd�Zdd�Zdd�Zdd �Zd!d"�ZGd#d$�d$e�ZGd%d&�d&e�ZejjGd'd(�d(ej j!��Z"dS))�)�absolute_import)�division)�print_function)�unicode_literals)�logger�_NZkernelz	kernel-rtZglibczlinux-firmwareZsystemd�dbuszdbus-brokerzdbus-daemonZ
microcode_ctl�zlibcs�tjj|�st�St�}xjtj|�D]\}tjj|�s$|jd�rBq$ttjj||���&}x|D]}|j	|j
�|f�q\WWdQRXq$Wt��x4|jj�j
�jdd�|D�d�D]}�j	|j�q�Wx6�fdd�|D�D] \}}tjtdj||d���q�W�S)	z�
    Provide filepath as string if single dir or list of strings
    Return set of package names contained in files under filepath
    z.confNcSsh|]}|d�qS)r�)�.0�xr
r
�&/usr/lib/python3.6/needs_restarting.py�	<setcomp>Bsz'get_options_from_dir.<locals>.<setcomp>)�namecsh|]}|d�kr|�qS)rr
)rr)�packagesr
r
rDsz`No installed package found for package name "{pkg}" specified in needs-restarting file "{file}".)�pkg�file)�os�path�exists�set�listdir�isdir�endswith�open�join�add�rstrip�sack�query�	installed�filterrr�warningr�format)�filepath�baseZoptionsr�fp�linerrr
)rr
�get_options_from_dir0s"
$&r(ccs�x�t�D]�\}}y<|dk	r(|t|�kr(wt|ddd��}|j�}WdQRXWn"tk
rntjd|�wYnXx$|D]}t||�}|dk	rv|VqvWqWdS)N�r�replace)�errorszFailed to read PID %d's smaps.)�
list_smaps�	owner_uidr�	readlines�EnvironmentErrorrr"�smap2opened_file)�uid�pid�smapsZ
smaps_file�linesr'�ofiler
r
r
�list_opened_filesKs

r6ccsNxHtjd�D]:}yt|�}Wntk
r2wYnXd|}||fVqWdS)Nz/procz/proc/%d/smaps)rr�int�
ValueError)Zdir_r2r3r
r
r
r,\sr,cst��i����fdd�}|S)Ncs,�j|��}|�k	r|S�|�}|�|<|S)N)�get)Zparam�val)�cache�func�sentinelr
r
�wrapperiszmemoize.<locals>.wrapper)�object)r<r>r
)r;r<r=r
�memoizefsr@cCstj|�tjS)N)r�stat�ST_UID)�fnamer
r
r
r-ssr-cCs$|j�j|d�j�}|r |dSdS)N)rr)rr!�run)rrCZmatchesr
r
r
�owning_packagewsrEcCsPd|}t|��}tjj|j��}WdQRXdj|jd��}td||f�dS)Nz/proc/%d/cmdline� �z%d : %s)r�dnfZi18nZucd�readr�split�print)r2ZcmdlineZcmdline_fileZcommandr
r
r
�	print_cmd~s

rLc	Cs�tj�}|jdd�}tj|d�}d}y|jd|j|��}Wn<tjk
rv}zt|�}tjdj	||��dSd}~XnXtj|dd�}|j
dd�}|jd	�r�|SdS)
Nzorg.freedesktop.systemd1z/org/freedesktop/systemd1z org.freedesktop.systemd1.Managerz)Failed to get systemd unit for PID {}: {}zorg.freedesktop.DBus.Properties)Zdbus_interfacezorg.freedesktop.systemd1.UnitZIdz.service)rZ	SystemBusZ
get_objectZ	InterfaceZGetUnitByPIDZ
DBusException�strrr"r#ZGetr)	r2ZbusZsystemd_manager_objectZsystemd_manager_interfaceZ
service_proxy�e�msgZservice_propertiesrr
r
r
�get_service_dbus�s0

rPcCsn|jd�}|dkrdS|jd�dkr(dS||d�j�}|jd�}|dkrVt||d�St||d|�d�SdS)N�/rz00:z
 (deleted)FT)�find�strip�rfind�
OpenedFile)r2r'Zslash�fnZsuffix_indexr
r
r
r0�s

r0c@s*eZdZejd�Zdd�Zedd��ZdS)rUz^(.+);[0-9A-Fa-f]{8,}$cCs||_||_||_dS)N)�deletedrr2)�selfr2rrWr
r
r
�__init__�szOpenedFile.__init__cCs(|jr"|jj|j�}|r"|jd�S|jS)a;Calculate the name of the file pre-transaction.

        In case of a file that got deleted during the transactionm, possibly
        just because of an upgrade to a newer version of the same file, RPM
        renames the old file to the same name with a hexadecimal suffix just
        before delting it.

        �)rW�RE_TRANSACTION_FILE�matchr�group)rXr\r
r
r
�
presumed_name�s

zOpenedFile.presumed_nameN)	�__name__�
__module__�__qualname__�re�compiler[rY�propertyr^r
r
r
r
rU�s
rUc@s4eZdZdd�Zedd��Zedd��Zdd�Zd	S)
�ProcessStartcCs|j�|_|j�|_dS)N)�
get_boot_time�	boot_time�get_sc_clk_tck�
sc_clk_tck)rXr
r
r
rY�s
zProcessStart.__init__cCshttjd�j�}tjjd�rdtdd��8}|j�j�j	�dj�}tt
j
�t|��}t||�SQRX|S)a	
        We have two sources from which to derive the boot time. These values vary
        depending on containerization, existence of a Real Time Clock, etc.
        For our purposes we want the latest derived value.
        - st_mtime of /proc/1
             Reflects the time the first process was run after booting
             This works for all known cases except machines without
             a RTC - they awake at the start of the epoch.
        - /proc/uptime
             Seconds field of /proc/uptime subtracted from the current time
             Works for machines without RTC iff the current time is reasonably correct.
             Does not work on containers which share their kernel with the
             host - there the host kernel uptime is returned
        z/proc/1z/proc/uptime�rbrN)
r7rrA�st_mtimer�isfiler�readlinerSrJ�time�float�max)Zproc_1_boot_time�fZuptimeZproc_uptime_boot_timer
r
r
rf�szProcessStart.get_boot_timecCstjtjd�S)N�
SC_CLK_TCK)r�sysconf�
sysconf_namesr
r
r
r
rh�szProcessStart.get_sc_clk_tckc
CsLd|}t|��}|j�j�j�}WdQRXt|d�}||j}|j|S)Nz
/proc/%d/stat�)rrIrSrJr7rirg)rXr2Zstat_fnZ	stat_fileZstatsZticks_after_bootZsecs_after_bootr
r
r
�__call__�s

zProcessStart.__call__N)r_r`rarY�staticmethodrfrhrvr
r
r
r
re�srec@s4eZdZd
Zed�Zedd��Zdd�Zdd�Z	d	S)�NeedsRestartingCommand�needs-restartingz/determine updated binaries that need restartingcCsF|jdddtd�d�|jdddtd�d�|jd	d
dtd�d�dS)Nz-uz
--useronly�
store_truez#only consider this user's processes)�action�helpz-rz--reboothintzKonly report whether a reboot is required (exit code 1) or not (exit code 0)z-sz
--servicesz%only report affected systemd services)�add_argumentr)�parserr
r
r
�
set_argparsers


z$NeedsRestartingCommand.set_argparsercCs|jj}d|_dS)NT)�cli�demandsZsack_activation)rXr�r
r
r
�	configuresz NeedsRestartingCommand.configurecCsNt�}tjt|jj�}t|�}ttj	j
|jjjd�|j�}t
j|�|jj�r�t�}t�}|jjj�j�}x,|jt
d�D]}|j|jkrx|j|j�qxW|jdddgd�}t|�dkr�x,|jtd�D]}|j|jkr�|j|j�q�W|s�|�rfttd��xt|�D]}	td|	��qWxt|�D]}	td	|	��q$Wt�ttd
��ttd�d�tjj ��nttd
��ttd��dSt�}
|jj!�r�tj"�nd}xHt#|�D]<}||j$�}|dk�rĐq�|j||j%�k�r�|
j|j%��q�W|jj&�r.tdd�t|
�D��}
x |
D]}	|	dk	�rt|	��qWdSxt|
�D]}t'|��q8WdS)Nz#etc/dnf/plugins/needs-restarting.d/)rrzdbus-daemonzdbus-brokerrz;Core libraries or services have been updated since boot-up:z  * %sz8  * %s (dependency of dbus. Recommending reboot of dbus)z2Reboot is required to fully utilize these updates.zMore information:z)https://access.redhat.com/solutions/27943z>No core libraries or services have been updated since boot-up.zReboot should not be necessary.cSsg|]}t|��qSr
)rP)rr2r
r
r
�
<listcomp>Bsz.NeedsRestartingCommand.run.<locals>.<listcomp>)(re�	functools�partialrEr%rr@r(rrrZconfZinstallroot�NEED_REBOOT�extendZoptsZ
reboothintrrr r!Zinstalltimergrr�len�NEED_REBOOT_DEPENDS_ON_DBUSrKr�sortedrH�
exceptions�ErrorZuseronly�geteuidr6r^r2ZservicesrL)rXZ
process_startZ
owning_pkg_fn�optZneed_rebootZneed_reboot_depends_on_dbusr rZdbus_installedrZ
stale_pidsr1r5�namesr2r
r
r
rDsd







zNeedsRestartingCommand.runN)ry)
r_r`ra�aliasesrZsummaryrwrr�rDr
r
r
r
rx�s

rx)#Z
__future__rrrrZdnfpluginscorerrrHZdnf.clirr�rrbrArnr�r�r(r6r,r@r-rErLrPr0r?rUreZpluginZregister_commandr�ZCommandrxr
r
r
r
�<module>s:

"+site-packages/dnf-plugins/builddep.py000064400000022202147511334660013664 0ustar00# builddep.py
# Install all the deps needed to build this package.
#
# Copyright (C) 2013-2015  Red Hat, Inc.
# Copyright (C) 2015 Igor Gnatenko
#
# This copyrighted material is made available to anyone wishing to use,
# modify, copy, or redistribute it subject to the terms and conditions of
# the GNU General Public License v.2, or (at your option) any later version.
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY expressed or implied, including the implied warranties of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General
# Public License for more details.  You should have received a copy of the
# GNU General Public License along with this program; if not, write to the
# Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
# 02110-1301, USA.  Any Red Hat trademarks that are incorporated in the
# source code or documentation are not subject to the GNU General Public
# License and may only be used or replicated with the express permission of
# Red Hat, Inc.
#

from __future__ import absolute_import
from __future__ import unicode_literals
from dnfpluginscore import _, logger

import argparse
import dnf
import dnf.cli
import dnf.exceptions
import dnf.rpm.transaction
import dnf.yum.rpmtrans
import libdnf.repo
import os
import rpm
import shutil
import tempfile


@dnf.plugin.register_command
class BuildDepCommand(dnf.cli.Command):

    aliases = ('builddep', 'build-dep')
    msg = "Install build dependencies for package or spec file"
    summary = _(msg)
    usage = _("[PACKAGE|PACKAGE.spec]")

    def __init__(self, cli):
        super(BuildDepCommand, self).__init__(cli)
        self._rpm_ts = dnf.rpm.transaction.initReadOnlyTransaction()
        self.tempdirs = []

    def __del__(self):
        for temp_dir in self.tempdirs:
            shutil.rmtree(temp_dir)

    def _download_remote_file(self, pkgspec):
        """
        In case pkgspec is a remote URL, download it to a temporary location
        and use the temporary file instead.
        """
        location = dnf.pycomp.urlparse.urlparse(pkgspec)
        if location[0] in ('file', ''):
            # just strip the file:// prefix
            return location.path

        downloader = libdnf.repo.Downloader()
        temp_dir = tempfile.mkdtemp(prefix="dnf_builddep_")
        temp_file = os.path.join(temp_dir, os.path.basename(pkgspec))
        self.tempdirs.append(temp_dir)

        temp_fo = open(temp_file, "wb+")
        try:
            downloader.downloadURL(self.base.conf._config, pkgspec, temp_fo.fileno())
        except RuntimeError:
            # Download errors raised by the libdnf downloader are propagated
            # unchanged; the temporary file is still closed in the finally block.
            raise
        finally:
            temp_fo.close()
        return temp_file
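    # Illustrative note on _download_remote_file() above (not part of the upstream
    # code): a remote argument such as "https://example.com/foo.spec" comes back as
    # a temporary path like "/tmp/dnf_builddep_XXXXXX/foo.spec", while
    # "file:///srv/foo.spec" or a plain local path is returned unchanged, because
    # only the "file" and "" URL schemes are treated as already-local.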

    @staticmethod
    def set_argparser(parser):
        def macro_def(arg):
            arglist = arg.split(None, 1) if arg else []
            if len(arglist) < 2:
                msg = _("'%s' is not of the format 'MACRO EXPR'") % arg
                raise argparse.ArgumentTypeError(msg)
            return arglist
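        # Illustrative example (assumed invocation): `dnf builddep -D 'dist .el8' foo.spec`
        # makes macro_def() return ['dist', '.el8']; a bare value such as -D 'dist'
        # is rejected with the ArgumentTypeError above.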

        parser.add_argument('packages', nargs='+', metavar='package',
                            help=_('packages with builddeps to install'))
        parser.add_argument('-D', '--define', action='append', default=[],
                            metavar="'MACRO EXPR'", type=macro_def,
                            help=_('define a macro for spec file parsing'))
        parser.add_argument('--skip-unavailable', action='store_true', default=False,
                            help=_('skip build dependencies not available in repositories'))
        ptype = parser.add_mutually_exclusive_group()
        ptype.add_argument('--spec', action='store_true',
                            help=_('treat commandline arguments as spec files'))
        ptype.add_argument('--srpm', action='store_true',
                            help=_('treat commandline arguments as source rpm'))

    def pre_configure(self):
        if not self.opts.rpmverbosity:
            self.opts.rpmverbosity = 'error'

    def configure(self):
        demands = self.cli.demands
        demands.available_repos = True
        demands.resolving = True
        demands.root_user = True
        demands.sack_activation = True

        # enable source repos only if needed
        if not (self.opts.spec or self.opts.srpm):
            for pkgspec in self.opts.packages:
                if not (pkgspec.endswith('.src.rpm')
                        or pkgspec.endswith('.nosrc.rpm')
                        or pkgspec.endswith('.spec')):
                    self.base.repos.enable_source_repos()
                    break

    def run(self):
        rpmlog = dnf.yum.rpmtrans.RPMTransaction(self.base)
        # Push user-supplied macro definitions for spec parsing
        for macro in self.opts.define:
            rpm.addMacro(macro[0], macro[1])

        pkg_errors = False
        for pkgspec in self.opts.packages:
            pkgspec = self._download_remote_file(pkgspec)
            try:
                if self.opts.srpm:
                    self._src_deps(pkgspec)
                elif self.opts.spec:
                    self._spec_deps(pkgspec)
                elif pkgspec.endswith('.src.rpm') or pkgspec.endswith('nosrc.rpm'):
                    self._src_deps(pkgspec)
                elif pkgspec.endswith('.spec'):
                    self._spec_deps(pkgspec)
                else:
                    self._remote_deps(pkgspec)
            except dnf.exceptions.Error as e:
                for line in rpmlog.messages():
                    logger.error(_("RPM: {}").format(line))
                logger.error(e)
                pkg_errors = True

        # Pop user macros so they don't affect future rpm calls
        for macro in self.opts.define:
            rpm.delMacro(macro[0])

        if pkg_errors:
            raise dnf.exceptions.Error(_("Some packages could not be found."))

    @staticmethod
    def _rpm_dep2reldep_str(rpm_dep):
        return rpm_dep.DNEVR()[2:]
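    # Illustrative note: rpm renders a requirement as e.g. "R python3-devel >= 3.6",
    # so stripping the first two characters is assumed to leave the plain reldep
    # string ("python3-devel >= 3.6") that _install() below works with.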

    def _install(self, reldep_str):
        # Try to find something by provides
        sltr = dnf.selector.Selector(self.base.sack)
        sltr.set(provides=reldep_str)
        found = sltr.matches()
        if not found and reldep_str.startswith("/"):
            # Nothing matches by provides and since it's file, try by files
            sltr = dnf.selector.Selector(self.base.sack)
            sltr.set(file=reldep_str)
            found = sltr.matches()

        if not found and not reldep_str.startswith("("):
            # No provides, no files
            # Richdeps can have no matches but it could be correct (solver must decide later)
            msg = _("No matching package to install: '%s'")
            logger.warning(msg, reldep_str)
            return self.opts.skip_unavailable is True

        if found:
            already_inst = self.base._sltr_matches_installed(sltr)
            if already_inst:
                for package in already_inst:
                    dnf.base._msg_installed(package)
        self.base._goal.install(select=sltr, optional=False)
        return True
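    # Illustrative note on _install() above: the reldep is matched first by
    # Provides, then (for strings starting with "/") by file path; rich
    # dependencies such as "(foo if bar)" are handed to the solver even when
    # nothing matches yet, and missing plain deps only pass when
    # --skip-unavailable was given.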

    def _src_deps(self, src_fn):
        fd = os.open(src_fn, os.O_RDONLY)
        try:
            h = self._rpm_ts.hdrFromFdno(fd)
        except rpm.error as e:
            if str(e) == 'error reading package header':
                e = _("Failed to open: '%s', not a valid source rpm file.") % src_fn
            os.close(fd)
            raise dnf.exceptions.Error(e)
        os.close(fd)
        ds = h.dsFromHeader('requirename')
        done = True
        for dep in ds:
            reldep_str = self._rpm_dep2reldep_str(dep)
            if reldep_str.startswith('rpmlib('):
                continue
            done &= self._install(reldep_str)

        if not done:
            err = _("Not all dependencies satisfied")
            raise dnf.exceptions.Error(err)

        if self.opts.define:
            logger.warning(_("Warning: -D or --define arguments have no meaning "
                             "for source rpm packages."))

    def _spec_deps(self, spec_fn):
        try:
            spec = rpm.spec(spec_fn)
        except ValueError as ex:
            msg = _("Failed to open: '%s', not a valid spec file: %s") % (
                    spec_fn, ex)
            raise dnf.exceptions.Error(msg)
        done = True
        for dep in rpm.ds(spec.sourceHeader, 'requires'):
            reldep_str = self._rpm_dep2reldep_str(dep)
            done &= self._install(reldep_str)

        if not done:
            err = _("Not all dependencies satisfied")
            raise dnf.exceptions.Error(err)

    def _remote_deps(self, package):
        available = dnf.subject.Subject(package).get_best_query(
                        self.base.sack).filter(arch__neq="src")
        sourcenames = list({pkg.source_name for pkg in available})
        pkgs = self.base.sack.query().available().filter(
                name=(sourcenames + [package]), arch="src").latest().run()
        if not pkgs:
            raise dnf.exceptions.Error(_('no package matched: %s') % package)
        done = True
        for pkg in pkgs:
            for req in pkg.requires:
                done &= self._install(str(req))

        if not done:
            err = _("Not all dependencies satisfied")
            raise dnf.exceptions.Error(err)
site-packages/dnf-plugins/copr.py000064400000073132147511334660013047 0ustar00# supplies the 'copr' command.
#
# Copyright (C) 2014-2015  Red Hat, Inc.
#
# This copyrighted material is made available to anyone wishing to use,
# modify, copy, or redistribute it subject to the terms and conditions of
# the GNU General Public License v.2, or (at your option) any later version.
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY expressed or implied, including the implied warranties of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General
# Public License for more details.  You should have received a copy of the
# GNU General Public License along with this program; if not, write to the
# Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
# 02110-1301, USA.  Any Red Hat trademarks that are incorporated in the
# source code or documentation are not subject to the GNU General Public
# License and may only be used or replicated with the express permission of
# Red Hat, Inc.
#

from __future__ import print_function

import base64
import glob
import itertools
import json
import os
import re
import shutil
import stat
import sys

from dnfpluginscore import _, logger
import dnf
from dnf.pycomp import PY3
from dnf.i18n import ucd
import rpm

# Attempt importing the linux_distribution function from distro
# If that fails, attempt to import the deprecated implementation
# from the platform module.
try:
    from distro import name, version, codename, os_release_attr

    # Re-implement distro.linux_distribution() to avoid a deprecation warning
    def linux_distribution():
        return (name(), version(), codename())
except ImportError:
    def os_release_attr(_):
        return ""
    try:
        from platform import linux_distribution
    except ImportError:
        # Simple fallback for distributions that lack an implementation
        def linux_distribution():
            with open('/etc/os-release') as os_release_file:
                os_release_data = {}
                for line in os_release_file:
                    try:
                        os_release_key, os_release_value = line.rstrip().split('=')
                        os_release_data[os_release_key] = os_release_value.strip('"')
                    except ValueError:
                        # Skip empty lines and everything that is not a simple
                        # variable assignment
                        pass
                return (os_release_data['NAME'], os_release_data['VERSION_ID'], None)

PLUGIN_CONF = 'copr'

YES = set([_('yes'), _('y')])
NO = set([_('no'), _('n'), ''])

if PY3:
    from configparser import ConfigParser, NoOptionError, NoSectionError
    from urllib.request import urlopen, HTTPError, URLError
else:
    from ConfigParser import ConfigParser, NoOptionError, NoSectionError
    from urllib2 import urlopen, HTTPError, URLError

@dnf.plugin.register_command
class CoprCommand(dnf.cli.Command):
    """ Copr plugin for DNF """

    chroot_config = None

    default_hostname = "copr.fedorainfracloud.org"
    default_hub = "fedora"
    default_protocol = "https"
    default_port = 443
    default_url = default_protocol + "://" + default_hostname
    aliases = ("copr",)
    summary = _("Interact with Copr repositories.")
    first_warning = True
    usage = _("""
  enable name/project [chroot]
  disable name/project
  remove name/project
  list --installed/enabled/disabled
  list --available-by-user=NAME
  search project

  Examples:
  copr enable rhscl/perl516 epel-6-x86_64
  copr enable ignatenkobrain/ocltoys
  copr disable rhscl/perl516
  copr remove rhscl/perl516
  copr list --enabled
  copr list --available-by-user=ignatenkobrain
  copr search tests
    """)

    @staticmethod
    def set_argparser(parser):
        parser.add_argument('subcommand', nargs=1,
                            choices=['help', 'enable', 'disable',
                                     'remove', 'list', 'search'])

        list_option = parser.add_mutually_exclusive_group()
        list_option.add_argument('--installed', action='store_true',
                                 help=_('List all installed Copr repositories (default)'))
        list_option.add_argument('--enabled', action='store_true',
                                 help=_('List enabled Copr repositories'))
        list_option.add_argument('--disabled', action='store_true',
                                 help=_('List disabled Copr repositories'))
        list_option.add_argument('--available-by-user', metavar='NAME',
                                 help=_('List available Copr repositories by user NAME'))

        parser.add_argument('--hub', help=_('Specify an instance of Copr to work with'))

        parser.add_argument('arg', nargs='*')

    def configure(self):
        if self.cli.command.opts.command != "copr":
            return
        copr_hub = None
        copr_plugin_config = ConfigParser()
        config_files = []
        config_path = self.base.conf.pluginconfpath[0]

        default_config_file = os.path.join(config_path, PLUGIN_CONF + ".conf")
        if os.path.isfile(default_config_file):
            config_files.append(default_config_file)

            copr_plugin_config.read(default_config_file)
            if copr_plugin_config.has_option('main', 'distribution') and\
                    copr_plugin_config.has_option('main', 'releasever'):
                distribution = copr_plugin_config.get('main', 'distribution')
                releasever = copr_plugin_config.get('main', 'releasever')
                self.chroot_config = [distribution, releasever]
            else:
                self.chroot_config = [False, False]

        for filename in os.listdir(os.path.join(config_path, PLUGIN_CONF + ".d")):
            if filename.endswith('.conf'):
                config_file = os.path.join(config_path, PLUGIN_CONF + ".d", filename)
                config_files.append(config_file)

        project = []
        if len(self.opts.arg):
            project = self.opts.arg[0].split("/")

        if len(project) == 3 and self.opts.hub:
            logger.critical(
                _('Error: ') +
                _('specify Copr hub either with `--hub` or using '
                  '`copr_hub/copr_username/copr_projectname` format')
            )
            raise dnf.cli.CliError(_('multiple hubs specified'))

        # Copr hub was not specified, using default hub `fedora`
        elif not self.opts.hub and len(project) != 3:
            self.copr_hostname = self.default_hostname
            self.copr_url = self.default_url

        # Copr hub specified with hub/user/project format
        elif len(project) == 3:
            copr_hub = project[0]

        else:
            copr_hub = self.opts.hub

        # Try to find hub in a config file
        if config_files and copr_hub:
            self.copr_url = None
            copr_plugin_config.read(sorted(config_files, reverse=True))
            hostname = self._read_config_item(copr_plugin_config, copr_hub, 'hostname', None)

            if hostname:
                protocol = self._read_config_item(copr_plugin_config, copr_hub, 'protocol',
                                                  self.default_protocol)
                port = self._read_config_item(copr_plugin_config, copr_hub, 'port',
                                              self.default_port)

                self.copr_hostname = hostname
                self.copr_url = protocol + "://" + hostname
                if int(port) != self.default_port:
                    self.copr_url += ":" + port
                    self.copr_hostname += ":" + port

        if not self.copr_url:
            if '://' not in copr_hub:
                self.copr_hostname = copr_hub
                self.copr_url = self.default_protocol + "://" + copr_hub
            else:
                self.copr_hostname = copr_hub.split('://', 1)[1]
                self.copr_url = copr_hub

    def _read_config_item(self, config, hub, section, default):
        try:
            return config.get(hub, section)
        except (NoOptionError, NoSectionError):
            return default

    def _user_warning_before_prompt(self, text):
        sys.stderr.write("{0}\n".format(text.strip()))

    def run(self):
        subcommand = self.opts.subcommand[0]

        if subcommand == "help":
            self.cli.optparser.print_help(self)
            return 0
        if subcommand == "list":
            if self.opts.available_by_user:
                self._list_user_projects(self.opts.available_by_user)
                return
            else:
                self._list_installed_repositories(self.base.conf.reposdir[0],
                                                  self.opts.enabled, self.opts.disabled)
                return

        try:
            project_name = self.opts.arg[0]
        except (ValueError, IndexError):
            logger.critical(
                _('Error: ') +
                _('exactly two additional parameters to '
                  'copr command are required'))
            self.cli.optparser.print_help(self)
            raise dnf.cli.CliError(
                _('exactly two additional parameters to '
                  'copr command are required'))
        try:
            chroot = self.opts.arg[1]
            if len(self.opts.arg) > 2:
                raise dnf.exceptions.Error(_('Too many arguments.'))
            self.chroot_parts = chroot.split("-")
            if len(self.chroot_parts) < 3:
                raise dnf.exceptions.Error(_('Bad format of optional chroot. The format is '
                                             'distribution-version-architecture.'))
        except IndexError:
            chroot = self._guess_chroot()
            self.chroot_parts = chroot.split("-")

        # commands without defined copr_username/copr_projectname
        if subcommand == "search":
            self._search(project_name)
            return

        project = project_name.split("/")
        if len(project) not in [2, 3]:
            logger.critical(
                _('Error: ') +
                _('use format `copr_username/copr_projectname` '
                  'to reference copr project'))
            raise dnf.cli.CliError(_('bad copr project format'))
        elif len(project) == 2:
            copr_username = project[0]
            copr_projectname = project[1]
        else:
            copr_username = project[1]
            copr_projectname = project[2]
            project_name = copr_username + "/" + copr_projectname

        repo_filename = "{0}/_copr:{1}:{2}:{3}.repo".format(
            self.base.conf.get_reposdir, self.copr_hostname,
            self._sanitize_username(copr_username), copr_projectname)
        if subcommand == "enable":
            self._need_root()
            info = _("""
Enabling a Copr repository. Please note that this repository is not part
of the main distribution, and quality may vary.

The Fedora Project does not exercise any power over the contents of
this repository beyond the rules outlined in the Copr FAQ at
<https://docs.pagure.org/copr.copr/user_documentation.html#what-i-can-build-in-copr>,
and packages are not held to any quality or security level.

Please do not file bug reports about these packages in Fedora
Bugzilla. In case of problems, contact the owner of this repository.
""")
            project = '/'.join([self.copr_hostname, copr_username,
                                copr_projectname])
            msg = "Do you really want to enable {0}?".format(project)
            self._ask_user(info, msg)
            self._download_repo(project_name, repo_filename)
            logger.info(_("Repository successfully enabled."))
            self._runtime_deps_warning(copr_username, copr_projectname)
        elif subcommand == "disable":
            self._need_root()
            self._disable_repo(copr_username, copr_projectname)
            logger.info(_("Repository successfully disabled."))
        elif subcommand == "remove":
            self._need_root()
            self._remove_repo(copr_username, copr_projectname)
            logger.info(_("Repository successfully removed."))

        else:
            raise dnf.exceptions.Error(
                _('Unknown subcommand {}.').format(subcommand))

    def _list_repo_file(self, repo_id, repo, enabled_only, disabled_only):
        file_name = repo.repofile.split('/')[-1]

        match_new = re.match("_copr:" + self.copr_hostname, file_name)
        match_old = self.copr_url == self.default_url and re.match("_copr_", file_name)
        match_any = re.match("_copr:|^_copr_", file_name)

        if self.opts.hub:
            if not match_new and not match_old:
                return
        elif not match_any:
            return

        if re.match('copr:.*:.*:.*:ml', repo_id):
            # We skip multilib repositories
            return

        if re.match('coprdep:.*', repo_id):
            # Runtime dependencies are not listed.
            return

        enabled = repo.enabled
        if (enabled and disabled_only) or (not enabled and enabled_only):
            return

        old_repo = False
        # repo ID has copr:<hostname>:<user>:<copr_dir> format, while <copr_dir>
        # can contain more colons
        if re.match("copr:", repo_id):
            _, copr_hostname, copr_owner, copr_dir = repo_id.split(':', 3)
            msg = copr_hostname + '/' + copr_owner + "/" + copr_dir
        # repo ID has <user>-<project> format, try to get hub from file name
        elif re.match("_copr:", file_name):
            copr_name = repo_id.split('-', 1)
            copr_hostname = file_name.rsplit(':', 2)[0].split(':', 1)[1]
            msg = copr_hostname + '/' + copr_name[0] + '/' + copr_name[1]
        # no information about hub, assume the default one
        else:
            copr_name = repo_id.split('-', 1)
            msg = self.default_hostname + '/' + copr_name[0] + '/' + copr_name[1]
            old_repo = True
        if not enabled:
            msg += " (disabled)"
        if old_repo:
            msg += " *"

        print(msg)
        return old_repo
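    # Illustrative note on _list_repo_file() above: a repo id such as
    # "copr:copr.fedorainfracloud.org:phracek:PyCharm" is printed as
    # "copr.fedorainfracloud.org/phracek/PyCharm", with " (disabled)" and/or " *"
    # appended for disabled repos and old-format repo files respectively.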

    def _list_installed_repositories(self, directory, enabled_only, disabled_only):
        old_repo = False
        for repo_id, repo in self.base.repos.items():
            if self._list_repo_file(repo_id, repo, enabled_only, disabled_only):
                old_repo = True
        if old_repo:
            print(_("* These coprs have repo file with an old format that contains "
                    "no information about Copr hub - the default one was assumed. "
                    "Re-enable the project to fix this."))

    def _list_user_projects(self, user_name):
        # https://copr.fedorainfracloud.org/api_3/project/list?ownername=ignatenkobrain
        api_path = "/api_3/project/list?ownername={0}".format(user_name)
        url = self.copr_url + api_path
        res = self.base.urlopen(url, mode='w+')
        try:
            json_parse = json.loads(res.read())
        except ValueError:
            raise dnf.exceptions.Error(
                _("Can't parse repositories for username '{}'.")
                .format(user_name))
        self._check_json_output(json_parse)
        section_text = _("List of {} coprs").format(user_name)
        self._print_match_section(section_text)

        for item in json_parse["items"]:
            msg = "{0}/{1} : ".format(user_name, item["name"])
            desc = item["description"] or _("No description given")
            msg = self.base.output.fmtKeyValFill(ucd(msg), desc)
            print(msg)

    def _search(self, query):
        # https://copr.fedorainfracloud.org/api_3/project/search?query=tests
        api_path = "/api_3/project/search?query={}".format(query)
        url = self.copr_url + api_path
        res = self.base.urlopen(url, mode='w+')
        try:
            json_parse = json.loads(res.read())
        except ValueError:
            raise dnf.exceptions.Error(_("Can't parse search for '{}'."
                                         ).format(query))
        self._check_json_output(json_parse)
        section_text = _("Matched: {}").format(query)
        self._print_match_section(section_text)

        for item in json_parse["items"]:
            msg = "{0} : ".format(item["full_name"])
            desc = item["description"] or _("No description given.")
            msg = self.base.output.fmtKeyValFill(ucd(msg), desc)
            print(msg)

    def _print_match_section(self, text):
        formatted = self.base.output.fmtSection(text)
        print(formatted)

    def _ask_user_no_raise(self, info, msg):
        if not self.first_warning:
            sys.stderr.write("\n")
        self.first_warning = False
        sys.stderr.write("{0}\n".format(info.strip()))

        if self.base._promptWanted():
            if self.base.conf.assumeno or not self.base.output.userconfirm(
                    msg='\n{} [y/N]: '.format(msg), defaultyes_msg='\n{} [Y/n]: '.format(msg)):
                return False
        return True

    def _ask_user(self, info, msg):
        if not self._ask_user_no_raise(info, msg):
            raise dnf.exceptions.Error(_('Safe and good answer. Exiting.'))

    @classmethod
    def _need_root(cls):
        # FIXME this should do dnf itself (BZ#1062889)
        if os.geteuid() != 0:
            raise dnf.exceptions.Error(
                _('This command has to be run under the root user.'))

    def _guess_chroot(self):
        """ Guess which chroot is equivalent to this machine """
        # FIXME Copr should generate non-specific arch repo
        dist = self.chroot_config
        if dist is None or (dist[0] is False) or (dist[1] is False):
            dist = linux_distribution()
        # Get distribution architecture
        distarch = self.base.conf.substitutions['basearch']
        if any([name in dist for name in ["Fedora", "Fedora Linux"]]):
            if "Rawhide" in dist:
                chroot = ("fedora-rawhide-" + distarch)
            # workaround for enabling repos in Rawhide when VERSION in os-release
            # contains a name other than Rawhide
            elif "rawhide" in os_release_attr("redhat_support_product_version"):
                chroot = ("fedora-rawhide-" + distarch)
            else:
                chroot = ("fedora-{0}-{1}".format(dist[1], distarch))
        elif "Mageia" in dist:
            # Get distribution architecture (Mageia does not use $basearch)
            distarch = rpm.expandMacro("%{distro_arch}")
            # Set the chroot
            if "Cauldron" in dist:
                chroot = ("mageia-cauldron-{}".format(distarch))
            else:
                chroot = ("mageia-{0}-{1}".format(dist[1], distarch))
        elif "openSUSE" in dist:
            # Get distribution architecture (openSUSE does not use $basearch)
            distarch = rpm.expandMacro("%{_target_cpu}")
            # Set the chroot
            if "Tumbleweed" in dist:
                chroot = ("opensuse-tumbleweed-{}".format(distarch))
            else:
                chroot = ("opensuse-leap-{0}-{1}".format(dist[1], distarch))
        else:
            chroot = ("epel-%s-x86_64" % dist[1].split(".", 1)[0])
        return chroot
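    # Illustrative examples (hypothetical hosts): a Fedora 35 x86_64 machine maps to
    # "fedora-35-x86_64", Rawhide to "fedora-rawhide-x86_64", and an EL-like
    # distribution reporting version "8.5" falls back to "epel-8-x86_64".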

    def _download_repo(self, project_name, repo_filename):
        short_chroot = '-'.join(self.chroot_parts[:-1])
        arch = self.chroot_parts[-1]
        api_path = "/coprs/{0}/repo/{1}/dnf.repo?arch={2}".format(project_name, short_chroot, arch)

        try:
            response = urlopen(self.copr_url + api_path)
            if os.path.exists(repo_filename):
                os.remove(repo_filename)
        except HTTPError as e:
            if e.code != 404:
                error_msg = _("Request to {0} failed: {1} - {2}").format(self.copr_url + api_path, e.code, str(e))
                raise dnf.exceptions.Error(error_msg)
            error_msg = _("It wasn't possible to enable this project.\n")
            error_data = e.headers.get("Copr-Error-Data")
            if error_data:
                error_data_decoded = base64.b64decode(error_data).decode('utf-8')
                error_data_decoded = json.loads(error_data_decoded)
                error_msg += _("Repository '{0}' does not exist in project '{1}'.").format(
                    '-'.join(self.chroot_parts), project_name)
                if error_data_decoded.get("available chroots"):
                    error_msg += _("\nAvailable repositories: ") + ', '.join(
                        "'{}'".format(x) for x in error_data_decoded["available chroots"])
                    error_msg += _("\n\nIf you want to enable a non-default repository, use the following command:\n"
                                   "  'dnf copr enable {0} <repository>'\n"
                                   "But note that the installed repo file will likely need a manual "
                                   "modification.").format(project_name)
                raise dnf.exceptions.Error(error_msg)
            else:
                error_msg += _("Project {0} does not exist.").format(project_name)
                raise dnf.exceptions.Error(error_msg)
        except URLError as e:
            error_msg = _("Failed to connect to {0}: {1}").format(self.copr_url + api_path, e.reason.strerror)
            raise dnf.exceptions.Error(error_msg)

        # Try to read the first line, and detect the repo_filename from that (override the repo_filename value).
        first_line = response.readline()
        line = first_line.decode("utf-8")
        if re.match(r"\[copr:", line):
            repo_filename = os.path.join(self.base.conf.get_reposdir, "_" + line[1:-2] + ".repo")

        # if using default hub, remove possible old repofile
        if self.copr_url == self.default_url:
            # copr:hub:user:project.repo => _copr_user_project.repo
            old_repo_filename = repo_filename.replace("_copr:", "_copr", 1)\
                .replace(self.copr_hostname, "").replace(":", "_", 1).replace(":", "-")\
                .replace("group_", "@")
            if os.path.exists(old_repo_filename):
                os.remove(old_repo_filename)

        with open(repo_filename, 'wb') as f:
            f.write(first_line)
            for line in response.readlines():
                f.write(line)
        os.chmod(repo_filename, stat.S_IRUSR | stat.S_IWUSR | stat.S_IRGRP | stat.S_IROTH)

    def _runtime_deps_warning(self, copr_username, copr_projectname):
        """
        In addition to the main copr repo (that has repo ID prefixed with
        `copr:`), the repofile might contain additional repositories that
        serve as runtime dependencies. This method informs the user about
        the additional repos and provides an option to disable them.
        """

        self.base.reset(repos=True)
        self.base.read_all_repos()

        repo = self._get_copr_repo(self._sanitize_username(copr_username), copr_projectname)

        runtime_deps = []
        for repo_id in repo.cfg.sections():
            if repo_id.startswith("copr:"):
                continue
            runtime_deps.append(repo_id)

        if not runtime_deps:
            return

        info = _(
            "Maintainer of the enabled Copr repository decided to make\n"
            "it dependent on other repositories. Such repositories are\n"
            "usually necessary for successful installation of RPMs from\n"
            "the main Copr repository (they provide runtime dependencies).\n\n"

            "Be aware that the note about quality and bug-reporting\n"
            "above applies here too, Fedora Project doesn't control the\n"
            "content. Please review the list:\n\n"
            "{0}\n\n"
            "These repositories have been enabled automatically."
        )

        counter = itertools.count(1)
        info = info.format("\n\n".join([
            "{num:2}. [{repoid}]\n    baseurl={baseurl}".format(
                num=next(counter),
                repoid=repoid,
                baseurl=repo.cfg.getValue(repoid, "baseurl"))
            for repoid in runtime_deps
        ]))

        if not self._ask_user_no_raise(info, _("Do you want to keep them enabled?")):
            for dep in runtime_deps:
                self.base.conf.write_raw_configfile(repo.repofile, dep,
                                                    self.base.conf.substitutions,
                                                    {"enabled": "0"})

    def _get_copr_repo(self, copr_username, copr_projectname):
        repo_id = "copr:{0}:{1}:{2}".format(self.copr_hostname.rsplit(':', 1)[0],
                                            self._sanitize_username(copr_username),
                                            copr_projectname)
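        # Illustrative example (hypothetical values): for the default hub
        # "copr.fedorainfracloud.org", owner "@copr" (sanitized to "group_copr")
        # and project "tools", the new-style repo ID built above is
        #   "copr:copr.fedorainfracloud.org:group_copr:tools"
        # while the old-style ID checked below is just "group_copr-tools".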
        if repo_id not in self.base.repos:
            # check if there is a repo with old ID format
            repo_id = "{0}-{1}".format(self._sanitize_username(copr_username),
                                       copr_projectname)
            if repo_id in self.base.repos and "_copr" in self.base.repos[repo_id].repofile:
                file_name = self.base.repos[repo_id].repofile.split('/')[-1]
                try:
                    copr_hostname = file_name.rsplit(':', 2)[0].split(':', 1)[1]
                    if copr_hostname != self.copr_hostname:
                        return None
                except IndexError:
                    # old filename format without hostname
                    pass
            else:
                return None

        return self.base.repos[repo_id]

    def _remove_repo(self, copr_username, copr_projectname):
        # FIXME is it Copr repo ?
        repo = self._get_copr_repo(copr_username, copr_projectname)
        if not repo:
            raise dnf.exceptions.Error(
                _("Failed to remove copr repo {0}/{1}/{2}"
                  .format(self.copr_hostname, copr_username, copr_projectname)))
        try:
            os.remove(repo.repofile)
        except OSError as e:
            raise dnf.exceptions.Error(str(e))

    def _disable_repo(self, copr_username, copr_projectname):
        repo = self._get_copr_repo(copr_username, copr_projectname)
        if repo is None:
            raise dnf.exceptions.Error(
                _("Failed to disable copr repo {}/{}"
                  .format(copr_username, copr_projectname)))

        # disable all repos provided by the repo file
        for repo_id in repo.cfg.sections():
            self.base.conf.write_raw_configfile(repo.repofile, repo_id,
                                                self.base.conf.substitutions, {"enabled": "0"})

    @classmethod
    def _get_data(cls, f):
        """ Wrapper around response from server

        Check the data and log a nice error in case of a problem (returning None);
        otherwise return the parsed JSON object.
        """
        try:
            output = json.loads(f.read())
        except ValueError:
            logger.error(_("Unknown response from server."))
            return
        return output

    @classmethod
    def _check_json_output(cls, json_obj):
        if "error" in json_obj:
            raise dnf.exceptions.Error("{}".format(json_obj["error"]))

    @classmethod
    def _sanitize_username(cls, copr_username):
        if copr_username[0] == "@":
            return "group_{}".format(copr_username[1:])
        else:
            return copr_username
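    # A brief illustration: group owners are given as "@name" on the command
    # line but stored as "group_name" in repo IDs and file names, e.g.
    #   _sanitize_username("@copr")  -> "group_copr"
    #   _sanitize_username("alice")  -> "alice"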


@dnf.plugin.register_command
class PlaygroundCommand(CoprCommand):
    """ Playground plugin for DNF """

    aliases = ("playground",)
    summary = _("Interact with Playground repository.")
    usage = " [enable|disable|upgrade]"

    def _cmd_enable(self, chroot):
        self._need_root()
        self._ask_user(
            _("Enabling a Playground repository."),
            _("Do you want to continue?"),
        )
        api_url = "{0}/api/playground/list/".format(
            self.copr_url)
        f = self.base.urlopen(api_url, mode="w+")
        output = self._get_data(f)
        f.close()
        if output["output"] != "ok":
            raise dnf.cli.CliError(_("Unknown response from server."))
        for repo in output["repos"]:
            project_name = "{0}/{1}".format(repo["username"],
                                            repo["coprname"])
            repo_filename = "{}/_playground_{}.repo".format(self.base.conf.get_reposdir, project_name.replace("/", "-"))
            try:
                if chroot not in repo["chroots"]:
                    continue
                api_url = "{0}/api/coprs/{1}/detail/{2}/".format(
                    self.copr_url, project_name, chroot)
                f = self.base.urlopen(api_url, mode='w+')
                output2 = self._get_data(f)
                f.close()
                if (output2 and ("output" in output2)
                        and (output2["output"] == "ok")):
                    self._download_repo(project_name, repo_filename)
            except dnf.exceptions.Error:
                # likely 404 and that repo does not exist
                pass

    def _cmd_disable(self):
        self._need_root()
        for repo_filename in glob.glob("{}/_playground_*.repo".format(self.base.conf.get_reposdir)):
            self._remove_repo(repo_filename)

    @staticmethod
    def set_argparser(parser):
        parser.add_argument('subcommand', nargs=1,
                            choices=['enable', 'disable', 'upgrade'])

    def run(self):
        raise dnf.exceptions.Error("Playground is temporarily unsupported")
        subcommand = self.opts.subcommand[0]
        chroot = self._guess_chroot()
        if subcommand == "enable":
            self._cmd_enable(chroot)
            logger.info(_("Playground repositories successfully enabled."))
        elif subcommand == "disable":
            self._cmd_disable()
            logger.info(_("Playground repositories successfully disabled."))
        elif subcommand == "upgrade":
            self._cmd_disable()
            self._cmd_enable(chroot)
            logger.info(_("Playground repositories successfully updated."))
        else:
            raise dnf.exceptions.Error(
                _('Unknown subcommand {}.').format(subcommand))
site-packages/dnf-plugins/needs_restarting.py
# needs_restarting.py
# DNF plugin to check for running binaries in a need of restarting.
#
# Copyright (C) 2014 Red Hat, Inc.
#
# This copyrighted material is made available to anyone wishing to use,
# modify, copy, or redistribute it subject to the terms and conditions of
# the GNU General Public License v.2, or (at your option) any later version.
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY expressed or implied, including the implied warranties of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General
# Public License for more details.  You should have received a copy of the
# GNU General Public License along with this program; if not, write to the
# Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
# 02110-1301, USA.  Any Red Hat trademarks that are incorporated in the
# source code or documentation are not subject to the GNU General Public
# License and may only be used or replicated with the express permission of
# Red Hat, Inc.
#
# the mechanism of scanning smaps for opened files and matching them back to
# packages is heavily inspired by the original needs-restarting.py:
# http://yum.baseurl.org/gitweb?p=yum-utils.git;a=blob;f=needs-restarting.py

from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
from __future__ import unicode_literals
from dnfpluginscore import logger, _

import dnf
import dnf.cli
import dbus
import functools
import os
import re
import stat
import time


# For which package updates we should recommend a reboot
# Mostly taken from https://access.redhat.com/solutions/27943
NEED_REBOOT = ['kernel', 'kernel-rt', 'glibc', 'linux-firmware',
               'systemd', 'dbus', 'dbus-broker', 'dbus-daemon',
               'microcode_ctl']

NEED_REBOOT_DEPENDS_ON_DBUS = ['zlib']

def get_options_from_dir(filepath, base):
    """
    Read package names from the *.conf files directly under filepath (a single
    directory path) and return the set of names that match installed packages.
    """

    if not os.path.exists(filepath):
        return set()
    options = set()
    for file in os.listdir(filepath):
        if os.path.isdir(os.path.join(filepath, file)) or not file.endswith('.conf'):
            continue

        with open(os.path.join(filepath, file)) as fp:
            for line in fp:
                options.add((line.rstrip(), file))

    packages = set()
    for pkg in base.sack.query().installed().filter(name={x[0] for x in options}):
        packages.add(pkg.name)
    for name, file in {x for x in options if x[0] not in packages}:
        logger.warning(
            _('No installed package found for package name "{pkg}" '
              'specified in needs-restarting file "{file}".').format(pkg=name, file=file))
    return packages
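# Illustrative drop-in file (hypothetical name and content): get_options_from_dir()
# expects plain-text *.conf files with one package name per line, for example
# /etc/dnf/plugins/needs-restarting.d/my-services.conf containing:
#   httpd
#   mariadb-server
# Names that do not match any installed package only produce a warning.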


def list_opened_files(uid):
    for (pid, smaps) in list_smaps():
        try:
            if uid is not None and uid != owner_uid(smaps):
                continue
            with open(smaps, 'r', errors='replace') as smaps_file:
                lines = smaps_file.readlines()
        except EnvironmentError:
            logger.warning("Failed to read PID %d's smaps.", pid)
            continue

        for line in lines:
            ofile = smap2opened_file(pid, line)
            if ofile is not None:
                yield ofile


def list_smaps():
    for dir_ in os.listdir('/proc'):
        try:
            pid = int(dir_)
        except ValueError:
            continue
        smaps = '/proc/%d/smaps' % pid
        yield (pid, smaps)


def memoize(func):
    sentinel = object()
    cache = {}
    def wrapper(param):
        val = cache.get(param, sentinel)
        if val is not sentinel:
            return val
        val = func(param)
        cache[param] = val
        return val
    return wrapper
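# Minimal usage sketch for memoize() (illustrative only): it caches the result of
# a single-argument function per argument, which is how owning_package() lookups
# are de-duplicated in NeedsRestartingCommand.run() below.
#
#   square = memoize(lambda x: x * x)
#   square(4)   # computes 16 and caches it
#   square(4)   # returns the cached 16 without calling the lambda again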


def owner_uid(fname):
    return os.stat(fname)[stat.ST_UID]


def owning_package(sack, fname):
    matches = sack.query().filter(file=fname).run()
    if matches:
        return matches[0]
    return None


def print_cmd(pid):
    cmdline = '/proc/%d/cmdline' % pid
    with open(cmdline) as cmdline_file:
        command = dnf.i18n.ucd(cmdline_file.read())
    command = ' '.join(command.split('\000'))
    print('%d : %s' % (pid, command))


def get_service_dbus(pid):
    bus = dbus.SystemBus()
    systemd_manager_object = bus.get_object(
        'org.freedesktop.systemd1',
        '/org/freedesktop/systemd1'
    )
    systemd_manager_interface = dbus.Interface(
        systemd_manager_object,
        'org.freedesktop.systemd1.Manager'
    )
    service_proxy = None
    try:
        service_proxy = bus.get_object(
            'org.freedesktop.systemd1',
            systemd_manager_interface.GetUnitByPID(pid)
        )
    except dbus.DBusException as e:
        # There is no unit for the pid. Usually error is 'NoUnitForPid'.
        # Considering what we do at the bottom (just return if not service)
        # Then there's really no reason to exit here on that exception.
        # Log what's happened then move on.
        msg = str(e)
        logger.warning("Failed to get systemd unit for PID {}: {}".format(pid, msg))
        return
    service_properties = dbus.Interface(
        service_proxy, dbus_interface="org.freedesktop.DBus.Properties")
    name = service_properties.Get(
        "org.freedesktop.systemd1.Unit",
        'Id'
    )
    if name.endswith(".service"):
        return name
    return

def smap2opened_file(pid, line):
    slash = line.find('/')
    if slash < 0:
        return None
    if line.find('00:') >= 0:
        # not a regular file
        return None
    fn = line[slash:].strip()
    suffix_index = fn.rfind(' (deleted)')
    if suffix_index < 0:
        return OpenedFile(pid, fn, False)
    else:
        return OpenedFile(pid, fn[:suffix_index], True)
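# Illustrative smaps map lines (hypothetical addresses) and how smap2opened_file()
# interprets them:
#   "7f1e8c000000-7f1e8c021000 r-xp 00000000 fd:01 1234  /usr/lib64/libfoo.so"
#       -> OpenedFile(pid, "/usr/lib64/libfoo.so", False)
#   "7f1e8c000000-7f1e8c021000 r-xp 00000000 fd:01 1234  /usr/lib64/libfoo.so (deleted)"
#       -> OpenedFile(pid, "/usr/lib64/libfoo.so", True)
#   "7f1e8c000000-7f1e8c021000 rw-s 00000000 00:05 42  /dev/zero"
#       -> None (device "00:..." marks it as not a regular file)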


class OpenedFile(object):
    RE_TRANSACTION_FILE = re.compile('^(.+);[0-9A-Fa-f]{8,}$')

    def __init__(self, pid, name, deleted):
        self.deleted = deleted
        self.name = name
        self.pid = pid

    @property
    def presumed_name(self):
        """Calculate the name of the file pre-transaction.

        In case of a file that got deleted during the transaction, possibly
        just because of an upgrade to a newer version of the same file, RPM
        renames the old file to the same name with a hexadecimal suffix just
        before deleting it.

        """

        if self.deleted:
            match = self.RE_TRANSACTION_FILE.match(self.name)
            if match:
                return match.group(1)
        return self.name
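    # Illustrative example (hypothetical path): while such a deleted mapping is
    # still in use, RPM's rename leaves a hexadecimal suffix behind, e.g.
    #   OpenedFile(1234, "/usr/lib64/libssl.so.1.1.1;5f3a9c2e", True).presumed_name
    #       -> "/usr/lib64/libssl.so.1.1.1"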


class ProcessStart(object):
    def __init__(self):
        self.boot_time = self.get_boot_time()
        self.sc_clk_tck = self.get_sc_clk_tck()

    @staticmethod
    def get_boot_time():
        """
        We have two sources from which to derive the boot time. These values vary
        depending on containerization, existence of a Real Time Clock, etc.
        For our purposes we want the latest derived value.
        - st_mtime of /proc/1
             Reflects the time the first process was run after booting
             This works for all known cases except machines without
             a RTC - they awake at the start of the epoch.
        - /proc/uptime
             Seconds field of /proc/uptime subtracted from the current time
             Works for machines without RTC iff the current time is reasonably correct.
             Does not work on containers which share their kernel with the
             host - there the host kernel uptime is returned
        """

        proc_1_boot_time = int(os.stat('/proc/1').st_mtime)
        if os.path.isfile('/proc/uptime'):
            with open('/proc/uptime', 'rb') as f:
                uptime = f.readline().strip().split()[0].strip()
                proc_uptime_boot_time = int(time.time() - float(uptime))
                return max(proc_1_boot_time, proc_uptime_boot_time)
        return proc_1_boot_time

    @staticmethod
    def get_sc_clk_tck():
        return os.sysconf(os.sysconf_names['SC_CLK_TCK'])

    def __call__(self, pid):
        stat_fn = '/proc/%d/stat' % pid
        with open(stat_fn) as stat_file:
            stats = stat_file.read().strip().split()
        ticks_after_boot = int(stats[21])
        secs_after_boot = ticks_after_boot // self.sc_clk_tck
        return self.boot_time + secs_after_boot
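    # Worked example (hypothetical numbers): with boot_time = 1700000000,
    # SC_CLK_TCK = 100 and a /proc/<pid>/stat starttime field of 250000 ticks,
    # __call__() yields 1700000000 + 250000 // 100 = 1700002500 seconds since
    # the epoch, which run() below compares against package install times.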


@dnf.plugin.register_command
class NeedsRestartingCommand(dnf.cli.Command):
    aliases = ('needs-restarting',)
    summary = _('determine updated binaries that need restarting')

    @staticmethod
    def set_argparser(parser):
        parser.add_argument('-u', '--useronly', action='store_true',
                            help=_("only consider this user's processes"))
        parser.add_argument('-r', '--reboothint', action='store_true',
                            help=_("only report whether a reboot is required "
                                   "(exit code 1) or not (exit code 0)"))
        parser.add_argument('-s', '--services', action='store_true',
                            help=_("only report affected systemd services"))

    def configure(self):
        demands = self.cli.demands
        demands.sack_activation = True

    def run(self):
        process_start = ProcessStart()
        owning_pkg_fn = functools.partial(owning_package, self.base.sack)
        owning_pkg_fn = memoize(owning_pkg_fn)

        opt = get_options_from_dir(os.path.join(
            self.base.conf.installroot,
            "etc/dnf/plugins/needs-restarting.d/"),
            self.base)
        NEED_REBOOT.extend(opt)
        if self.opts.reboothint:
            need_reboot = set()
            need_reboot_depends_on_dbus = set()
            installed = self.base.sack.query().installed()
            for pkg in installed.filter(name=NEED_REBOOT):
                if pkg.installtime > process_start.boot_time:
                    need_reboot.add(pkg.name)

            dbus_installed = installed.filter(name=['dbus', 'dbus-daemon', 'dbus-broker'])
            if len(dbus_installed) != 0:
                for pkg in installed.filter(name=NEED_REBOOT_DEPENDS_ON_DBUS):
                    if pkg.installtime > process_start.boot_time:
                        need_reboot_depends_on_dbus.add(pkg.name)
            if need_reboot or need_reboot_depends_on_dbus:
                print(_('Core libraries or services have been updated '
                        'since boot-up:'))
                for name in sorted(need_reboot):
                    print('  * %s' % name)
                for name in sorted(need_reboot_depends_on_dbus):
                    print('  * %s (dependency of dbus. Recommending reboot of dbus)' % name)
                print()
                print(_('Reboot is required to fully utilize these updates.'))
                print(_('More information:'),
                      'https://access.redhat.com/solutions/27943')
                raise dnf.exceptions.Error()  # Sets exit code 1
            else:
                print(_('No core libraries or services have been updated '
                        'since boot-up.'))
                print(_('Reboot should not be necessary.'))
                return None

        stale_pids = set()
        uid = os.geteuid() if self.opts.useronly else None
        for ofile in list_opened_files(uid):
            pkg = owning_pkg_fn(ofile.presumed_name)
            if pkg is None:
                continue
            if pkg.installtime > process_start(ofile.pid):
                stale_pids.add(ofile.pid)

        if self.opts.services:
            names = set([get_service_dbus(pid) for pid in sorted(stale_pids)])
            for name in names:
                if name is not None:
                    print(name)
            return 0
        for pid in sorted(stale_pids):
            print_cmd(pid)
site-packages/dnf-plugins/config_manager.py
#
# Copyright (C) 2015  Red Hat, Inc.
#
# This copyrighted material is made available to anyone wishing to use,
# modify, copy, or redistribute it subject to the terms and conditions of
# the GNU General Public License v.2, or (at your option) any later version.
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY expressed or implied, including the implied warranties of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General
# Public License for more details.  You should have received a copy of the
# GNU General Public License along with this program; if not, write to the
# Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
# 02110-1301, USA.  Any Red Hat trademarks that are incorporated in the
# source code or documentation are not subject to the GNU General Public
# License and may only be used or replicated with the express permission of
# Red Hat, Inc.
#

from __future__ import absolute_import
from __future__ import unicode_literals
from dnfpluginscore import _, logger, P_

import dnf
import dnf.cli
import dnf.pycomp
import dnf.util
import fnmatch
import hashlib
import os
import re
import shutil


@dnf.plugin.register_command
class ConfigManagerCommand(dnf.cli.Command):

    aliases = ['config-manager']
    summary = _('manage {prog} configuration options and repositories').format(
        prog=dnf.util.MAIN_PROG)

    @staticmethod
    def set_argparser(parser):
        parser.add_argument(
            'crepo', nargs='*', metavar='repo',
            help=_('repo to modify'))
        parser.add_argument(
            '--save', default=False, action='store_true',
            help=_('save the current options (useful with --setopt)'))
        parser.add_argument(
            '--add-repo', default=[], action='append', metavar='URL',
            help=_('add (and enable) the repo from the specified file or url'))
        parser.add_argument(
            '--dump', default=False, action='store_true',
            help=_('print current configuration values to stdout'))
        parser.add_argument(
            '--dump-variables', default=False, action='store_true',
            help=_('print variable values to stdout'))
        enable_group = parser.add_mutually_exclusive_group()
        enable_group.add_argument("--set-enabled", default=False,
                                  dest="set_enabled", action="store_true",
                                  help=_("enable repos (automatically saves)"))
        enable_group.add_argument("--set-disabled", default=False,
                                  dest="set_disabled", action="store_true",
                                  help=_("disable repos (automatically saves)"))

    def configure(self):
        # setup sack and populate it with enabled repos
        demands = self.cli.demands
        demands.available_repos = True

        # if no argument was passed then error
        if (not (self.opts.add_repo != [] or
                 self.opts.save or
                 self.opts.dump or
                 self.opts.dump_variables or
                 self.opts.set_disabled or
                 self.opts.set_enabled) ):
            self.cli.optparser.error(_("one of the following arguments is required: {}")
                                     .format(' '.join([
                                         "--save", "--add-repo",
                                         "--dump", "--dump-variables",
                                         "--set-enabled", "--enable",
                                         "--set-disabled", "--disable"])))

        # warn with hint if --enablerepo or --disablerepo argument was passed
        if self.opts.repos_ed != []:
            logger.warning(_("Warning: --enablerepo/--disablerepo arguments have no meaning"
                             "with config manager. Use --set-enabled/--set-disabled instead."))

        if (self.opts.save or self.opts.set_enabled or
                self.opts.set_disabled or self.opts.add_repo):
            demands.root_user = True

        # sanitize commas https://bugzilla.redhat.com/show_bug.cgi?id=1830530
        temp_list = [x.split(',') for x in self.opts.crepo if x != ',']
        # flatten sublists
        self.opts.crepo = [item for sublist in temp_list
            for item in sublist if item != '']
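        # Illustrative example (hypothetical input): the two steps above turn
        #   ["repo1,repo2", "repo3", ","]
        # into
        #   ["repo1", "repo2", "repo3"]
        # so repo IDs may be separated by commas as well as by spaces.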

    def run(self):
        """Execute the util action here."""
        if self.opts.add_repo:
            self.add_repo()
        else:
            self.modify_repo()

    def modify_repo(self):
        """ process --set-enabled, --set-disabled and --setopt options """

        matching_repos = []         # list of matched repositories
        not_matching_repos_id = set()  # IDs of not matched repositories

        def match_repos(key, add_matching_repos):
            matching = self.base.repos.get_matching(key)
            if not matching:
                not_matching_repos_id.add(key)
            elif add_matching_repos:
                matching_repos.extend(matching)

        if self.opts.crepo:
            for name in self.opts.crepo:
                match_repos(name, True)
            if hasattr(self.opts, 'repo_setopts'):
                for name in self.opts.repo_setopts.keys():
                    match_repos(name, False)
        else:
            if hasattr(self.opts, 'repo_setopts'):
                for name in self.opts.repo_setopts.keys():
                    match_repos(name, True)

        if not_matching_repos_id:
            raise dnf.exceptions.Error(_("No matching repo to modify: %s.")
                                       % ', '.join(not_matching_repos_id))

        sbc = self.base.conf
        modify = {}
        if hasattr(self.opts, 'main_setopts') and self.opts.main_setopts:
            modify = self.opts.main_setopts
        if self.opts.dump_variables:
            for name, val in self.base.conf.substitutions.items():
                print("%s = %s" % (name, val))
        if not self.opts.crepo or 'main' in self.opts.crepo:
            if self.opts.save and modify:
                # modify [main] in global configuration file
                self.base.conf.write_raw_configfile(self.base.conf.config_file_path, 'main',
                                                    sbc.substitutions, modify)
            if self.opts.dump:
                print(self.base.output.fmtSection('main'))
                print(self.base.conf.dump())

        if not matching_repos:
            return

        if self.opts.set_enabled or self.opts.set_disabled:
            self.opts.save = True

        for repo in sorted(matching_repos):
            repo_modify = {}
            if self.opts.set_enabled:
                repo_modify['enabled'] = "1"
            elif self.opts.set_disabled:
                repo_modify['enabled'] = "0"
            if hasattr(self.opts, 'repo_setopts'):
                for repoid, setopts in self.opts.repo_setopts.items():
                    if fnmatch.fnmatch(repo.id, repoid):
                        repo_modify.update(setopts)
            if self.opts.save and repo_modify:
                self.base.conf.write_raw_configfile(repo.repofile, repo.id, sbc.substitutions, repo_modify)
            if self.opts.dump:
                print(self.base.output.fmtSection('repo: ' + repo.id))
                print(repo.dump())

    def add_repo(self):
        """ process --add-repo option """

        # Get the reposdir location
        myrepodir = self.base.conf.get_reposdir
        errors_count = 0

        for url in self.opts.add_repo:
            if dnf.pycomp.urlparse.urlparse(url).scheme == '':
                url = 'file://' + os.path.abspath(url)
            logger.info(_('Adding repo from: %s'), url)
            if url.endswith('.repo'):
                # .repo file - download, put into reposdir and enable it
                destname = os.path.basename(url)
                destname = os.path.join(myrepodir, destname)
                try:
                    f = self.base.urlopen(url, mode='w+')
                    shutil.copy2(f.name, destname)
                    os.chmod(destname, 0o644)
                    f.close()
                except IOError as e:
                    errors_count += 1
                    logger.error(e)
                    continue
            else:
                # just url to repo, create .repo file on our own
                repoid = sanitize_url_to_fs(url)
                reponame = 'created by {} config-manager from {}'.format(dnf.util.MAIN_PROG, url)
                destname = os.path.join(myrepodir, "%s.repo" % repoid)
                content = "[%s]\nname=%s\nbaseurl=%s\nenabled=1\n" % \
                                                (repoid, reponame, url)
                if not save_to_file(destname, content):
                    continue
        if errors_count:
            raise dnf.exceptions.Error(P_("Configuration of repo failed",
                                          "Configuration of repos failed", errors_count))


def save_to_file(filename, content):
    try:
        with open(filename, 'w+') as fd:
            dnf.pycomp.write_to_file(fd, content)
            os.chmod(filename, 0o644)
    except (IOError, OSError) as e:
        logger.error(_('Could not save repo to repofile %s: %s'),
                     filename, e)
        return False
    return True

# Regular expressions to sanitise cache filenames
RE_SCHEME = re.compile(r'^\w+:/*(\w+:|www\.)?')
RE_SLASH = re.compile(r'[?/:&#|~\*\[\]\(\)\'\\]+')
RE_BEGIN = re.compile(r'^[,.]*')
RE_FINAL = re.compile(r'[,.]*$')

def sanitize_url_to_fs(url):
    """Return a filename suitable for the filesystem and for repo id

    Strips dangerous and common characters to create a filename we
    can use to store the cache in.
    """

    try:
        if RE_SCHEME.match(url):
            if dnf.pycomp.PY3:
                url = url.encode('idna').decode('utf-8')
            else:
                if isinstance(url, str):
                    url = url.decode('utf-8').encode('idna')
                else:
                    url = url.encode('idna')
                if isinstance(url, unicode):
                    url = url.encode('utf-8')
    except (UnicodeDecodeError, UnicodeEncodeError, UnicodeError, TypeError):
        pass
    url = RE_SCHEME.sub("", url)
    url = RE_SLASH.sub("_", url)
    url = RE_BEGIN.sub("", url)
    url = RE_FINAL.sub("", url)

    # limit length of url
    if len(url) > 250:
        parts = url[:185].split('_')
        lastindex = 185-len(parts[-1])
        csum = hashlib.sha256()
        csum.update(url[lastindex:].encode('utf-8'))
        url = url[:lastindex] + '_' + csum.hexdigest()
    # remove all not allowed characters
    allowed_regex = "[^abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789_.:-]"
    return re.sub(allowed_regex, '', url)
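# Worked example for sanitize_url_to_fs() (hypothetical URL): the scheme is
# stripped, separators and other special characters become underscores, and only
# characters safe for a file name / repo ID remain, e.g.
#   sanitize_url_to_fs("https://example.com/repo/fedora-38/x86_64/")
#       -> "example.com_repo_fedora-38_x86_64_"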
site-packages/dnf-plugins/repomanage.py
# repomanage.py
# DNF plugin adding a command to manage rpm packages from given directory.
#
# Copyright (C) 2015 Igor Gnatenko
#
# This copyrighted material is made available to anyone wishing to use,
# modify, copy, or redistribute it subject to the terms and conditions of
# the GNU General Public License v.2, or (at your option) any later version.
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY expressed or implied, including the implied warranties of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General
# Public License for more details.  You should have received a copy of the
# GNU General Public License along with this program; if not, write to the
# Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
# 02110-1301, USA.  Any Red Hat trademarks that are incorporated in the
# source code or documentation are not subject to the GNU General Public
# License and may only be used or replicated with the express permission of
# Red Hat, Inc.
#

from __future__ import absolute_import
from __future__ import unicode_literals
from dnfpluginscore import _, logger

import dnf
import dnf.cli
import logging
import os
import hawkey


class RepoManage(dnf.Plugin):

    name = "repomanage"

    def __init__(self, base, cli):
        super(RepoManage, self).__init__(base, cli)
        if cli is None:
            return
        cli.register_command(RepoManageCommand)


class RepoManageCommand(dnf.cli.Command):
    aliases = ("repomanage",)
    summary = _("Manage a directory of rpm packages")

    def pre_configure(self):
        if not self.opts.verbose and not self.opts.quiet:
            self.cli.redirect_logger(stdout=logging.WARNING, stderr=logging.INFO)

    def configure(self):
        if not self.opts.verbose and not self.opts.quiet:
            self.cli.redirect_repo_progress()
        demands = self.cli.demands
        demands.sack_activation = True

    def run(self):
        if self.opts.new and self.opts.old:
            raise dnf.exceptions.Error(_("Pass either --old or --new, not both!"))
        if self.opts.new and self.opts.oldonly:
            raise dnf.exceptions.Error(_("Pass either --oldonly or --new, not both!"))
        if self.opts.old and self.opts.oldonly:
            raise dnf.exceptions.Error(_("Pass either --old or --oldonly, not both!"))
        if not self.opts.old and not self.opts.oldonly:
            self.opts.new = True

        verfile = {}
        pkgdict = {}
        module_dict = {}  # {NameStream: {Version: [modules]}}
        all_modular_artifacts = set()

        keepnum = int(self.opts.keep) # the number of items to keep

        try:
            REPOMANAGE_REPOID = "repomanage_repo"
            repo_conf = self.base.repos.add_new_repo(REPOMANAGE_REPOID, self.base.conf, baseurl=[self.opts.path])
            # Always expire the repo, otherwise repomanage could use cached metadata and give identical results
            # for multiple runs even if the actual repo changed in the meantime
            repo_conf._repo.expire()
            self.base._add_repo_to_sack(repo_conf)
            if dnf.base.WITH_MODULES:
                self.base._setup_modular_excludes()

                # Prepare modules
                module_packages = self.base._moduleContainer.getModulePackages()

                for module_package in module_packages:
                    # Even though we load only REPOMANAGE_REPOID other modules can be loaded from system
                    # failsafe data automatically, we don't want them affecting repomanage results so ONLY
                    # use modules from REPOMANAGE_REPOID.
                    if module_package.getRepoID() == REPOMANAGE_REPOID:
                        all_modular_artifacts.update(module_package.getArtifacts())
                        module_dict.setdefault(module_package.getNameStream(), {}).setdefault(
                            module_package.getVersionNum(), []).append(module_package)

        except dnf.exceptions.RepoError:
            rpm_list = self._get_file_list(self.opts.path, ".rpm")
            if len(rpm_list) == 0:
                raise dnf.exceptions.Error(_("No files to process"))

            self.base.reset(sack=True, repos=True)
            self.base.fill_sack(load_system_repo=False, load_available_repos=False)
            try:
                self.base.add_remote_rpms(rpm_list, progress=self.base.output.progress)
            except IOError:
                logger.warning(_("Could not open {}").format(', '.join(rpm_list)))

        # Prepare regular packages
        query = self.base.sack.query(flags=hawkey.IGNORE_MODULAR_EXCLUDES).available()
        packages = [x for x in query.filter(pkg__neq=query.filter(nevra_strict=all_modular_artifacts)).available()]
        packages.sort()

        for pkg in packages:
            na = (pkg.name, pkg.arch)
            if na in pkgdict:
                if pkg not in pkgdict[na]:
                    pkgdict[na].append(pkg)
            else:
                pkgdict[na] = [pkg]

            nevra = self._package_to_nevra(pkg)
            if nevra in verfile:
                verfile[nevra].append(self._package_to_path(pkg))
            else:
                verfile[nevra] = [self._package_to_path(pkg)]
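        # Illustrative shape of the structures built above (hypothetical data),
        # with each name/arch group ordered oldest -> newest by packages.sort():
        #   pkgdict[("bash", "x86_64")] = [<bash-5.0-1>, <bash-5.1-2>]
        #   verfile[("bash", 0, "5.1", "2", "x86_64")] = ["<path>/bash-5.1-2.x86_64.rpm"]
        # --new keeps the last `keepnum` entries of each list, while --old and
        # --oldonly work with the entries before them.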

        outputpackages = []
        # modular packages
        keepnum_latest_stream_artifacts = set()

        if self.opts.new:
            # regular packages
            for (n, a) in pkgdict.keys():
                evrlist = pkgdict[(n, a)]

                newevrs = evrlist[-keepnum:]

                for package in newevrs:
                    nevra = self._package_to_nevra(package)
                    for fpkg in verfile[nevra]:
                        outputpackages.append(fpkg)

            # modular packages
            for streams_by_version in module_dict.values():
                sorted_stream_versions = sorted(streams_by_version.keys())

                new_sorted_stream_versions = sorted_stream_versions[-keepnum:]

                for i in new_sorted_stream_versions:
                    for stream in streams_by_version[i]:
                        keepnum_latest_stream_artifacts.update(set(stream.getArtifacts()))

        if self.opts.old:
            # regular packages
            for (n, a) in pkgdict.keys():
                evrlist = pkgdict[(n, a)]

                oldevrs = evrlist[:-keepnum]

                for package in oldevrs:
                    nevra = self._package_to_nevra(package)
                    for fpkg in verfile[nevra]:
                        outputpackages.append(fpkg)

            # modular packages
            for streams_by_version in module_dict.values():
                sorted_stream_versions = sorted(streams_by_version.keys())

                old_sorted_stream_versions = sorted_stream_versions[:-keepnum]

                for i in old_sorted_stream_versions:
                    for stream in streams_by_version[i]:
                        keepnum_latest_stream_artifacts.update(set(stream.getArtifacts()))

        if self.opts.oldonly:
            # regular packages
            for (n, a) in pkgdict.keys():
                evrlist = pkgdict[(n, a)]

                oldevrs = evrlist[:-keepnum]

                for package in oldevrs:
                    nevra = self._package_to_nevra(package)
                    for fpkg in verfile[nevra]:
                        outputpackages.append(fpkg)

            # modular packages
            keepnum_newer_stream_artifacts = set()

            for streams_by_version in module_dict.values():
                sorted_stream_versions = sorted(streams_by_version.keys())

                new_sorted_stream_versions = sorted_stream_versions[-keepnum:]

                for i in new_sorted_stream_versions:
                    for stream in streams_by_version[i]:
                        keepnum_newer_stream_artifacts.update(set(stream.getArtifacts()))

            for streams_by_version in module_dict.values():
                sorted_stream_versions = sorted(streams_by_version.keys())

                old_sorted_stream_versions = sorted_stream_versions[:-keepnum]

                for i in old_sorted_stream_versions:
                    for stream in streams_by_version[i]:
                        for artifact in stream.getArtifacts():
                            if artifact not in keepnum_newer_stream_artifacts:
                                keepnum_latest_stream_artifacts.add(artifact)

        modular_packages = [self._package_to_path(x) for x in query.filter(pkg__eq=query.filter(nevra_strict=keepnum_latest_stream_artifacts)).available()]
        outputpackages = outputpackages + modular_packages
        outputpackages.sort()
        if self.opts.space:
            print(" ".join(outputpackages))
        else:
            for pkg in outputpackages:
                print(pkg)

    @staticmethod
    def set_argparser(parser):
        parser.add_argument("-o", "--old", action="store_true",
                            help=_("Print the older packages"))
        parser.add_argument("-O", "--oldonly", action="store_true",
                            help=_("Print the older packages. Exclude the newest packages."))
        parser.add_argument("-n", "--new", action="store_true",
                            help=_("Print the newest packages"))
        parser.add_argument("-s", "--space", action="store_true",
                            help=_("Space separated output, not newline"))
        parser.add_argument("-k", "--keep", action="store", metavar="KEEP",
                            help=_("Newest N packages to keep - defaults to 1"),
                            default=1, type=int)
        parser.add_argument("path", action="store",
                            help=_("Path to directory"))

    @staticmethod
    def _get_file_list(path, ext):
        """Return all files in path matching ext

        return list object
        """
        filelist = []
        for root, dirs, files in os.walk(path):
            for f in files:
                if os.path.splitext(f)[1].lower() == str(ext):
                    filelist.append(os.path.join(root, f))

        return filelist

    def _package_to_path(self, pkg):
        if len(self.base.repos):
            return os.path.join(self.opts.path, pkg.location)
        else:
            return pkg.location

    @staticmethod
    def _package_to_nevra(pkg):
        return (pkg.name, pkg.epoch, pkg.version, pkg.release, pkg.arch)
site-packages/syspurpose/files.py
# -*- coding: utf-8 -*-

from __future__ import print_function, division, absolute_import
#
# Copyright (c) 2018 Red Hat, Inc.
#
# This software is licensed to you under the GNU General Public License,
# version 2 (GPLv2). There is NO WARRANTY for this software, express or
# implied, including the implied warranties of MERCHANTABILITY or FITNESS
# FOR A PARTICULAR PURPOSE. You should have received a copy of GPLv2
# along with this software; if not, see
# http://www.gnu.org/licenses/old-licenses/gpl-2.0.txt.
#
# Red Hat trademarks are not licensed under GPLv2. No permission is
# granted to use or replicate Red Hat trademarks that are incorporated
# in this software or its documentation.

"""
This module contains utilities for manipulating files pertaining to system syspurpose
"""

import collections
import logging
import json
import os
import errno
import io
from syspurpose.utils import system_exit, create_dir, create_file, make_utf8, write_to_file_utf8
from syspurpose.i18n import ugettext as _

# Constants for locations of the two system syspurpose files
USER_SYSPURPOSE_DIR = "/etc/rhsm/syspurpose"
USER_SYSPURPOSE = os.path.join(USER_SYSPURPOSE_DIR, "syspurpose.json")
VALID_FIELDS = os.path.join(USER_SYSPURPOSE_DIR, "valid_fields.json")  # Will be used for future validation
CACHE_DIR = "/var/lib/rhsm/cache"
CACHED_SYSPURPOSE = os.path.join(CACHE_DIR, "syspurpose.json")  # Stores cached values

# All names that represent syspurpose values locally
ROLE = 'role'
ADDONS = 'addons'
SERVICE_LEVEL = 'service_level_agreement'
USAGE = 'usage'

# Remote values keyed on the local ones
LOCAL_TO_REMOTE = {
    ROLE: 'role',
    ADDONS: 'addOns',
    SERVICE_LEVEL: 'serviceLevel',
    USAGE: 'usage'
}

# All known syspurpose attributes
ATTRIBUTES = [ROLE, ADDONS, SERVICE_LEVEL, USAGE]


# Values used in determining changes between client and server
UNSUPPORTED = "unsupported"


log = logging.getLogger(__name__)


def post_process_received_data(data):
    """
    Try to solve conflicts in keys
     - Server returns key "roles", but it should be "role"
     - Server returns key "support_level", but service_level_agreement is used in syspurpose.json
    :return: modified dictionary
    """
    if 'systemPurposeAttributes' in data:
        # Fix
        if 'roles' in data['systemPurposeAttributes']:
            data['systemPurposeAttributes']['role'] = data['systemPurposeAttributes']['roles']
            del data['systemPurposeAttributes']['roles']
        if 'support_level' in data['systemPurposeAttributes']:
            data['systemPurposeAttributes']['service_level_agreement'] = data['systemPurposeAttributes']['support_level']
            del data['systemPurposeAttributes']['support_level']
    return data
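# Illustrative example (hypothetical payload): a server response such as
#   {"systemPurposeAttributes": {"roles": "Red Hat Enterprise Linux Server",
#                                "support_level": "Premium"}}
# is rewritten by post_process_received_data() to use the local key names:
#   {"systemPurposeAttributes": {"role": "Red Hat Enterprise Linux Server",
#                                "service_level_agreement": "Premium"}}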


class SyspurposeStore(object):
    """
    Represents and maintains a json syspurpose file
    """

    def __init__(self, path, raise_on_error=False):
        self.path = path
        self.contents = {}
        self.raise_on_error = raise_on_error

    def read_file(self):
        """
        Opens & reads the contents of the store's file based on the 'path' provided to the constructor,
        and stores them on this object. If the user doesn't have access rights to the file, the program exits.
        :return: False if the contents of the file were empty, or the file doesn't exist; otherwise, nothing.
        """
        try:
            with io.open(self.path, 'r', encoding='utf-8') as f:
                self.contents = json.load(f)
                return True
        except ValueError:
            # Malformed JSON or empty file. Let's not error out on an empty file
            if os.path.getsize(self.path):
                system_exit(
                    os.EX_CONFIG,
                    _("Error: Malformed data in file {}; please review and correct.").format(self.path)
                )

            return False
        except OSError as e:
            if e.errno == errno.EACCES and not self.raise_on_error:
                system_exit(os.EX_NOPERM, _('Cannot read syspurpose file {}\nAre you root?').format(self.path))
            if e.errno == errno.ENOENT and not self.raise_on_error:
                log.error('Unable to read file {file}: {error}'.format(file=self.path, error=e))
                return False
            if self.raise_on_error:
                raise e

    def create(self):
        """
        Create the files necessary for this store
        :return: True if changes were made, false otherwise
        """
        return create_dir(os.path.dirname(self.path)) or \
            self.read_file() or \
            create_file(self.path, self.contents)

    def add(self, key, value):
        """
        Add a value to a list of values specified by key. If the current value specified by the key is scalar/non-list,
        it is not overridden, but maintained in the list, along with the new value.
        :param key: The name of the list
        :param value: The value to append to the list
        :return: None
        """
        value = make_utf8(value)
        key = make_utf8(key)
        try:
            current_value = self.contents[key]
            if current_value is not None and not isinstance(current_value, list):
                self.contents[key] = [current_value]

            if self.contents[key] is None:
                self.contents[key] = []

            if value not in self.contents[key]:
                self.contents[key].append(value)
            else:
                return False
        except (AttributeError, KeyError):
            self.contents[key] = [value]
        return True
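    # Illustrative example (hypothetical contents): with contents == {"addons": "A"},
    # add("addons", "B") converts the scalar to a list and appends, leaving
    # {"addons": ["A", "B"]} and returning True; adding "B" again returns False
    # and leaves the list unchanged.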

    def remove(self, key, value):
        """
        Remove a value from a list specified by key.
        If the current value specified by the key is not a list, unset the value.
        :param key: The name of the list parameter to manipulate
        :param value: The value to attempt to remove
        :return: True if the value was in the list, False if it was not
        """
        value = make_utf8(value)
        key = make_utf8(key)
        try:
            current_value = self.contents[key]
            if current_value is not None and not isinstance(current_value, list) and current_value == value:
                return self.unset(key)

            if value in current_value:
                self.contents[key].remove(value)
            else:
                return False

            return True
        except (AttributeError, KeyError, ValueError):
            return False

    def unset(self, key):
        """
        Unsets a key
        :param key: The key to unset
        :return: boolean
        """
        key = make_utf8(key)

        # Special handling is required for the SLA, since it deviates from the typical CP
        # empty => null semantics
        if key == 'service_level_agreement':
            value = self.contents.get(key, None)
            self.contents[key] = ''
        else:
            value = self.contents.pop(key, None)

        return value is not None
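    # Illustrative example: unset("role") removes the key entirely, while
    # unset("service_level_agreement") keeps the key and sets it to "" (empty
    # string), matching the empty-vs-null semantics noted above. Both return
    # True only if a non-None value had previously been set.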

    def set(self, key, value):
        """
        Set a key (syspurpose parameter) to value
        :param key: The parameter of the syspurpose file to set
        :type key: str

        :param value: The value to set that parameter to
        :return: Whether any change was made
        """
        value = make_utf8(value)
        key = make_utf8(key)
        org = make_utf8(self.contents.get(key, None))
        self.contents[key] = value
        return org != value or org is None

    def write(self, fp=None):
        """
        Write the current contents to the file at self.path
        """
        if not fp:
            with io.open(self.path, 'w', encoding='utf-8') as f:
                write_to_file_utf8(f, self.contents)
                f.flush()
        else:
            write_to_file_utf8(fp, self.contents)

    @classmethod
    def read(cls, path, raise_on_error=False):
        """
        Read the file represented by path. If the file does not exist it is created.
        :param path: The path on the file system to read, should be a json file
        :param raise_on_error: When it is set to True, then exceptions are raised as expected.
        :return: new SyspurposeStore with the contents read in
        """
        new_store = cls(path, raise_on_error=raise_on_error)

        if not os.access(path, os.W_OK):
            new_store.create()
        else:
            new_store.read_file()

        return new_store


class SyncResult(object):
    """
    A container class for the results of a sync operation performed by a SyncedStore class.
    """

    def __init__(self, result, remote_changed, local_changed, cached_changed):
        self.result = result
        self.remote_changed = remote_changed
        self.local_changed = local_changed
        self.cached_changed = cached_changed


class SyncedStore(object):
    """
    Stores values in a local file backed by a cache which is then synced with another source
    of the same values.
    """
    PATH = USER_SYSPURPOSE
    CACHE_PATH = CACHED_SYSPURPOSE

    def __init__(self, uep, on_changed=None, consumer_uuid=None, use_valid_fields=False):
        """
        Initialization of SyncedStore
        :param uep: object representing connection to candlepin server
        :param on_changed: optional callback method called, during three-way merge
        :param consumer_uuid: UUID of consumer
        :param use_valid_fields: if valid fields are considered
        """
        self.uep = uep
        self.filename = self.PATH.split('/')[-1]
        self.path = self.PATH
        self.cache_path = self.CACHE_PATH
        self.local_file = None
        self.local_contents = self.get_local_contents()
        self.cache_file = None
        self.cache_contents = self.get_cached_contents()
        self.changed = False
        self.on_changed = on_changed
        self.consumer_uuid = consumer_uuid
        if use_valid_fields is True:
            self.valid_fields = self.get_valid_fields()
        else:
            self.valid_fields = None

    def __enter__(self):
        return self

    def __exit__(self, exc_type, exc_val, exc_tb):
        self.finish()

    def finish(self):
        """
        When local content was changed, then try to synchronize local content with remote server
        :return:
        """
        if self.changed:
            self.sync()

    def sync(self):
        """
        Try to synchronize local content with remote server
        :return: instance of SyncResult holding result of synchronization
        """
        log.debug('Attempting to sync syspurpose content...')
        try:
            if self.uep and not self.uep.has_capability('syspurpose'):
                log.debug('Server does not support syspurpose, syncing only locally.')
                return self._sync_local_only()
        except Exception as err:
            log.debug(
                'Failed to detect whether the server has syspurpose capability: {err}'.format(
                    err=err
                )
            )
            return self._sync_local_only()

        remote_contents = self.get_remote_contents()
        local_contents = self.get_local_contents()
        cached_contents = self.get_cached_contents()

        result = self.merge(local=local_contents,
                            remote=remote_contents,
                            base=cached_contents)

        local_result = {key: result[key] for key in result if result[key]}

        sync_result = SyncResult(
            result,
            (remote_contents == result) or self.update_remote(result),
            self.update_local(local_result),
            self.update_cache(result),
        )

        log.debug('Successfully synced system purpose.')

        # Reset the changed attribute as all items should be synced if we've gotten to this point
        self.changed = False

        return sync_result

    def _sync_local_only(self):
        local_updated = self.update_local(self.get_local_contents())
        return SyncResult(self.local_contents, False, local_updated, False)

    def merge(self, local=None, remote=None, base=None):
        """
        Do three-way merge
        :param local: dictionary with local values (syspurpose.json)
        :param remote: dictionary with values from server
        :param base:
        :return:
        """
        result = three_way_merge(
            local=local,
            base=base,
            remote=remote,
            on_change=self.on_changed
        )
        return result
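    # Illustrative sketch of the merge (hypothetical data), assuming the usual
    # "the side that changed relative to the base wins" three-way semantics of
    # three_way_merge(), which is defined elsewhere in this module:
    #   base   = {"role": "server"}
    #   local  = {"role": "server", "usage": "Production"}  # usage added locally
    #   remote = {"role": "Workstation"}                    # role changed remotely
    #   result ~ {"role": "Workstation", "usage": "Production"}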

    def get_local_contents(self):
        """
        Try to load local content from file
        :return: dictionary with system purpose values
        """
        try:
            self.local_contents = json.load(io.open(self.path, 'r', encoding='utf-8'))
        except (os.error, ValueError, IOError):
            log.debug('Unable to read local system purpose at "%s"' % self.path)
            self.update_local({})
            self.local_contents = {}
        return self.local_contents

    def get_remote_contents(self):
        """
        Try to get remote content from server
        :return: dictionary with system purpose values
        """
        if self.uep is None or self.consumer_uuid is None:
            log.debug('Failed to read remote syspurpose from server: no available connection, '
                      'or the consumer is not registered.')
            return {}
        if not self.uep.has_capability('syspurpose'):
            log.debug('Server does not support syspurpose, not syncing.')
            return {}

        consumer = self.uep.getConsumer(self.consumer_uuid)
        result = {}

        # Translate from the remote values to the local, filtering out items not known
        for attr in ATTRIBUTES:
            value = consumer.get(LOCAL_TO_REMOTE[attr])
            result[attr] = value
        log.debug('Successfully read remote syspurpose from server.')

        return result

    def get_cached_contents(self):
        """
        Try to load cached server response from the file
        :return: dictionary with system purpose values
        """
        try:
            self.cache_contents = json.load(io.open(self.cache_path, 'r', encoding='utf-8'))
            log.debug('Successfully read cached syspurpose contents.')
        except (ValueError, os.error, IOError):
            log.debug('Unable to read cached syspurpose contents at \'%s\'.' % self.cache_path)
            self.cache_contents = {}
            self.update_cache({})
        return self.cache_contents

    def update_local(self, data):
        """
        Rewrite local content with new data and write data to file syspurpose.json
        :param data: new dictionary with local data
        :return: None
        """
        self.local_contents = data
        self._write_local()

    def _write_local(self):
        """
        Write local data to the file
        :return: None
        """
        self._update_file(self.path, self.local_contents)

    def update_cache(self, data):
        self.cache_contents = data
        self._write_cache()

    def _write_cache(self):
        """
        Write cache to file
        :return: None
        """
        self._update_file(self.cache_path, self.cache_contents)

    def update_remote(self, data):
        if self.uep is None or self.consumer_uuid is None:
            log.debug('Failed to update remote syspurpose on the server: no available connection, '
                      'or the consumer is not registered.')
            return False

        addons = data.get(ADDONS)
        self.uep.updateConsumer(
                self.consumer_uuid,
                role=data.get(ROLE) or "",
                addons=addons if addons is not None else [],
                service_level=data.get(SERVICE_LEVEL) or "",
                usage=data.get(USAGE) or ""
        )
        log.debug('Successfully updated remote syspurpose on the server.')
        return True

    def _check_key_value_validity(self, key, value):
        """
        Check validity of provided key and value of it is included in valid fields
        :param key: provided key
        :param value: provided value
        :return: None
        """
        if self.valid_fields is not None:
            if key in self.valid_fields:
                if value not in self.valid_fields[key]:
                    print(
                        _('Warning: Provided value "{val}" is not included in the list '
                          'of valid values for attribute {attr}:').format(val=value, attr=key)
                    )
                    for valid_value in self.valid_fields[key]:
                        if len(valid_value) > 0:
                            print(" - %s" % valid_value)
            else:
                print(_('Warning: Provided key "{key}" is not included in the list of valid keys:').format(
                    key=key
                ))
                for valid_key in self.valid_fields.keys():
                    print(" - %s" % valid_key)

    def add(self, key, value):
        """
        Add a value to the list of values stored under the given key. If the current value for
        the key is a scalar (non-list), it is not overridden; it is kept in the list alongside
        the new value.
        :param key: The name of the list
        :param value: The value to append to the list
        :return: True if the value was added, False if it was already present
        """
        value = make_utf8(value)
        key = make_utf8(key)
        try:
            # When the existing value was set using the set() method, it is a plain
            # scalar rather than a list, so convert it to a list first
            current_value = self.local_contents[key]
            if current_value is not None and not isinstance(current_value, list):
                self.local_contents[key] = [current_value]

            # When the existing value is None, convert it to an empty list first so that
            # append() can be called. This is a mostly theoretical case.
            if self.local_contents[key] is None:
                self.local_contents[key] = []

            if value not in self.local_contents[key]:
                self.local_contents[key].append(value)
            else:
                log.debug('Will not add value \'%s\' to key \'%s\'.' % (value, key))
                self.changed = False
                return self.changed
        except (AttributeError, KeyError):
            self.local_contents[key] = [value]

        self._check_key_value_validity(key, value)

        self.changed = True
        log.debug('Adding value \'%s\' to key \'%s\'.' % (value, key))

        # Write changes to the syspurpose.json file
        if self.changed is True:
            self._write_local()

        return self.changed
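
    # Sketch of the scalar-to-list promotion handled above (hypothetical values): if the local
    # contents are {'addons': 'addon1'}, then add('addons', 'addon2') rewrites them to
    # {'addons': ['addon1', 'addon2']} and returns True, while a second add('addons', 'addon2')
    # returns False and leaves the file untouched.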

    def remove(self, key, value):
        """
        Remove a value from a list specified by key.
        If the current value specified by the key is not a list, unset the value.
        :param key: The name of the list parameter to manipulate
        :param value: The value to attempt to remove
        :return: True if the value was in the list, False if it was not
        """
        value = make_utf8(value)
        key = make_utf8(key)
        try:
            current_values = self.local_contents[key]
            if current_values is not None and not isinstance(current_values, list) and current_values == value:
                return self.unset(key)

            if value in current_values:
                self.local_contents[key].remove(value)
                self.changed = True
                log.debug('Removing value \'%s\' from key \'%s\'.' % (value, key))
            else:
                self.changed = False
                log.debug('Will not remove value \'%s\' from key \'%s\'.' % (value, key))
                return self.changed

        except (AttributeError, KeyError, ValueError):
            log.debug('Will not remove value \'%s\' from key \'%s\'.' % (value, key))
            self.changed = False

        # Write changes to the syspurpose.json file
        if self.changed is True:
            self._write_local()

        return self.changed

    def unset(self, key):
        """
        Unsets a key
        :param key: The key to unset
        :return: True if the key previously had a value (i.e. a change was made), False otherwise
        """
        key = make_utf8(key)

        # Special handling is required for the SLA, since it deviates from the typical CP
        # empty => null semantics
        if key == 'service_level_agreement':
            value = self.local_contents.get(key, None)
            self.local_contents[key] = ''
        elif key == 'addons':
            value = self.local_contents.get(key, None)
            self.local_contents[key] = []
        else:
            value = self.local_contents.pop(key, None)
        log.debug('Unsetting value \'%s\' of key \'%s\'.' % (value, key))

        # A change was made only when the key previously held a value
        self.changed = value is not None

        # Write changes to the syspurpose.json file
        if self.changed is True:
            self._write_local()

        return self.changed
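
    # Note on the special cases above (hypothetical values): unset('service_level_agreement')
    # keeps the key in syspurpose.json with the value '', unset('addons') keeps it as [], and
    # any other key (e.g. 'role') is removed from the dictionary entirely.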

    def set(self, key, value):
        """
        Set a key (syspurpose parameter) to value
        :param key: The parameter of the syspurpose file to set
        :type key: str

        :param value: The value to set that parameter to
        :return: Whether any change was made
        """
        value = make_utf8(value)
        key = make_utf8(key)
        current_value = make_utf8(self.local_contents.get(key, None))
        self.local_contents[key] = value

        if current_value != value or current_value is None:
            self._check_key_value_validity(key, value)

            self.changed = True
            log.debug('Setting value \'%s\' to key \'%s\'.' % (value, key))
        else:
            self.changed = False
            log.debug('NOT setting value \'%s\' to key \'%s\'; it is already set.' % (value, key))

        # Write changes to the syspurpose.json file
        if self.changed is True:
            self._write_local()

        return self.changed
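
    # For example (hypothetical values), set('usage', 'Production') returns True the first time
    # and False on an immediate repeat call, because the same value is then already stored
    # locally and no write is needed.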

    @staticmethod
    def _create_missing_dir(dir_path):
        """
        Try to create missing directory
        :param dir_path: path to directory
        :return: None
        """
        # Check if the directory exists
        if not os.path.isdir(dir_path):
            log.debug('Trying to create directory: %s' % dir_path)
            try:
                os.makedirs(dir_path, mode=0o755, exist_ok=True)
            except Exception as err:
                log.warning('Unable to create directory: %s, error: %s' % (dir_path, err))

    @classmethod
    def _update_file(cls, path, data):
        """
        Write the contents of data to the file at the given path, creating the file and any
        missing parent directories as needed
        :param path: The string path to the file location we should update
        :param data: The data to write to the file
        :return: None
        """

        # Check if /etc/rhsm/syspurpose directory exists
        cls._create_missing_dir(USER_SYSPURPOSE_DIR)
        # Check if /var/lib/rhsm/cache/ directory exists
        cls._create_missing_dir(CACHE_DIR)

        # Then we can try to create syspurpose.json file
        try:
            f = io.open(path, 'w+', encoding='utf-8')
        except OSError as e:
            # errno 17 is EEXIST; anything else is unexpected and re-raised
            if e.errno != 17:
                raise
            log.debug('Failed to update syspurpose values at \'%s\'.' % path)
        else:
            write_to_file_utf8(f, data)
            f.flush()
            f.close()
            log.debug('Successfully updated syspurpose values at \'%s\'.' % path)

    def get_valid_fields(self):
        """
        Try to get valid fields from server using current owner (organization)
        :return: Dictionary with valid fields
        """
        valid_fields = None

        if self.uep is not None and self.consumer_uuid is not None:
            current_owner = self.uep.getOwner(self.consumer_uuid)
            if 'key' in current_owner:
                owner_key = current_owner['key']
                try:
                    response = self.uep.getOwnerSyspurposeValidFields(owner_key)
                except Exception as err:
                    log.debug("Unable to get valid fields from server: %s" % err)
                else:
                    if 'systemPurposeAttributes' in response:
                        response = post_process_received_data(response)
                        valid_fields = response['systemPurposeAttributes']
        return valid_fields
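
    # The returned structure (hypothetical example; the shape follows how
    # _check_key_value_validity() consumes it) maps each syspurpose attribute to the values
    # allowed by the organization, e.g.:
    #   {'role': ['RHEL Server', 'RHEL Workstation'], 'usage': ['Production', 'Development']}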


# A simple container class used to hold the values representing a change detected
# during three_way_merge
DiffChange = collections.namedtuple(
    'DiffChange',
    ['key', 'previous_value', 'new_value', 'source', 'in_base', 'in_result']
)
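
# A minimal sketch of an on_change callback that could be passed to three_way_merge() below;
# the logging is illustrative only and not part of the public API:
#
#     def log_change(diff):
#         log.debug('syspurpose key %s changed from %r to %r (source: %s)'
#                   % (diff.key, diff.previous_value, diff.new_value, diff.source))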


def three_way_merge(local, base, remote, on_conflict="remote", on_change=None):
    """
    Performs a three-way merge on the local and remote dictionaries with a given base.
    :param local: The dictionary of the current local values
    :param base: The dictionary with the values we've last seen
    :param remote: The dictionary with "their" values
    :param on_conflict: Either "remote" or "local". If "remote", the remote changes win any
                        conflict; if "local", the local changes win. Any other value raises a
                        ValueError.
    :param on_change: This is an optional function which will be given each change as it is
                      detected.
    :return: The dictionary of values as merged between the three provided dictionaries.
    """
    log.debug('Attempting a three-way merge...')
    result = {}
    local = local or {}
    base = base or {}
    remote = remote or {}

    if on_conflict == "remote":
        winner = remote
    elif on_conflict == "local":
        winner = local
    else:
        raise ValueError('keyword argument "on_conflict" must be either "remote" or "local"')

    if on_change is None:
        on_change = lambda change: change

    all_keys = set(local.keys()) | set(base.keys()) | set(remote.keys())

    for key in all_keys:

        local_changed = detect_changed(base=base, other=local, key=key, source="local")
        remote_changed = detect_changed(base=base, other=remote, key=key, source="server")
        changed = local_changed or (remote_changed and remote_changed != UNSUPPORTED)
        source = 'base'

        if local_changed == remote_changed:
            if local_changed is True:
                log.debug('Three way merge conflict: both local and remote values changed for key \'%s\'.' % key)
            source = on_conflict
            if key in winner:
                result[key] = winner[key]
        elif remote_changed is True:
            log.debug('Three way merge: remote value was changed for key \'%s\'.' % key)
            source = 'remote'
            if key in remote:
                result[key] = remote[key]
        elif local_changed or (remote_changed == UNSUPPORTED):
            if local_changed is True:
                log.debug('Three way merge: local value was changed for key \'%s\'.' % key)
            source = 'local'
            if key in local:
                result[key] = local[key]

        if changed:
            original = base.get(key)
            diff = DiffChange(key=key, source=source, previous_value=original,
                              new_value=result.get(key), in_base=key in base,
                              in_result=key in result)
            on_change(diff)

    return result
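
# Worked example with hypothetical dictionaries: given
#   base   = {'role': 'RHEL Server', 'usage': 'Production'}
#   local  = {'role': 'RHEL Workstation', 'usage': 'Production'}
#   remote = {'role': 'RHEL Server', 'usage': 'Development'}
# only the local side changed 'role' and only the remote side changed 'usage', so the merge
# yields {'role': 'RHEL Workstation', 'usage': 'Development'} regardless of on_conflict;
# on_conflict only decides the winner when both sides changed the same key.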


def detect_changed(base, other, key, source="server"):
    """
    Detect the type of change that has occurred between base and other for a given key.
    :param base: The dictionary of values we are starting with
    :param other: The dictionary of now current values
    :param key: The key that we are interested in knowing how it changed
    :param source: An optional string which indicates where the "other" values came from. Used to
                   make decisions which are one sided. (i.e. only applicable for changes from the
                   server side).
    :return: True if there was a change, False if there was no change; the module-level
             UNSUPPORTED sentinel is returned when the key is missing from a non-local source
    """
    base = base or {}
    other = other or {}
    if key not in other and source != "local":
        return UNSUPPORTED

    base_val = base.get(key)
    other_val = other.get(key)

    if key not in other and source == "local":
        # If the local values no longer contain the key, treat this as a removal.
        # It constitutes a change only if the base had a truthy value; the values tracked
        # from the server all have falsy values.
        return bool(base_val)

    # Handle "addons" (the lists might be out of order from the server)
    if type(base_val) == list and type(other_val) == list:
        return sorted(base_val) != sorted(other_val)

    # When value is removed from server, then it is set to empty string, but
    # it is completely removed from local syspurpose.json.
    # See: https://bugzilla.redhat.com/show_bug.cgi?id=1738764
    if source == "server" and base_val is None and other_val == '':
        return False

    return base_val != other_val
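
# A few illustrative calls (hypothetical values):
#   detect_changed({'addons': ['a', 'b']}, {'addons': ['b', 'a']}, 'addons')  -> False (order is ignored)
#   detect_changed({'role': None}, {'role': ''}, 'role', source='server')     -> False (removed on the server)
#   detect_changed({'role': 'x'}, {}, 'role', source='server')                -> UNSUPPORTED
#   detect_changed({'role': 'x'}, {}, 'role', source='local')                 -> True (local removal)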
        Write cache to file
        :return: None
        N)r�r\r`)rrrrr��szSyncedStore._write_cachecCs||jdks|jdkr"tjd�dS|jt�}|jj|j|jt�pBd|dk	rN|ng|jt�p\d|jt	�phdd�tjd�dS)NzmFailed to update remote syspurpose on the server: no available connection, or the consumer is not registered.Fr=)rrZ
service_levelrz5Successfully updated remote syspurpose on the server.T)
rWrcr,rrr>�ADDONSZupdateConsumer�ROLE�
SERVICE_LEVEL�USAGE)rrrrrrrx�s


zSyncedStore.update_remotecCs�|jdk	r�||jkrf||j|kr�ttd�j||d��x`|j|D]}t|�dkrDtd|�qDWn4ttd�j|d��x|jj�D]}td|�q�WdS)z�
        Check validity of provided key and value of it is included in valid fields
        :param key: provided key
        :param value: provided value
        :return: None
        NzaWarning: Provided value "{val}" is not included in the list of valid values for attribute {attr}:)�valr�rz - %szHWarning: Provided key "{key}" is not included in the list of valid keys:)r7)re�printr&r'�len�keys)rr7r8Zvalid_valueZ	valid_keyrrr�_check_key_value_validity�s



z%SyncedStore._check_key_value_validitycCs�t|�}t|�}y�|j|}|dk	r<t|t�r<|g|j|<|j|dkrTg|j|<||j|krt|j|j|�ntjd||f�d|_|jSWn$tt	fk
r�|g|j|<YnX|j
||�d|_tjd||f�|jdkr�|j�|jS)aJ
        Add a value to a list of values specified by key. If the current value specified by the key is scalar/non-list,
        it is not overridden, but maintained in the list, along with the new value.
        :param key: The name of the list
        :param value: The value to append to the list
        :return: None
        Nz$Will not add value '%s' to key '%s'.FTzAdding value '%s' to key '%s'.)rr^r2r3r4r,rrrar5r6r�r�)rr7r8r9rrrr:�s*



zSyncedStore.addc
Cs�t|�}t|�}y�|j|}|dk	rBt|t�rB||krB|j|�S||krt|j|j|�d|_tjd||f�nd|_tjd||f�|jSWn2t	t
tfk
r�tjd||f�d|_YnX|jdkr�|j�|jS)aN
        Remove a value from a list specified by key.
        If the current value specified by the key is not a list, unset the value.
        :param key: The name of the list parameter to manipulate
        :param value: The value to attempt to remove
        :return: True if the value was in the list, False if it was not
        NTz"Removing value '%s' from key '%s'.Fz)Will not remove value '%s' from key '%s'.)
rr^r2r3r;r<rar,rrr5r6r"r�)rr7r8Zcurrent_valuesrrrr<
s&



zSyncedStore.removecCs�t|�}|dkr*|jj|d�}d|j|<n0|dkrL|jj|d�}g|j|<n|jj|d�}d|_tjd||f�|dk	|_|jdkr�|j�|jS)z\
        Unsets a key
        :param key: The key to unset
        :return: boolean
        r
Nr=rTz!Unsetting value '%s' of key '%s'.)rr^r>r?rar,rrr�)rr7r8rrrr;,s

zSyncedStore.unsetcCs�t|�}t|�}t|jj|d��}||j|<||ks<|dkrb|j||�d|_tjd||f�n
tjd�||kpz|dk|_|jdkr�|j�|jS)z�
        Set a key (syspurpose parameter) to value
        :param key: The parameter of the syspurpose file to set
        :type key: str

        :param value: The value to set that parameter to
        :return: Whether any change was made
        NTzSetting value '%s' to key '%s'.z#NOT Setting value '%s' to key '%s'.)rr^r>r�rar,rrr�)rr7r8r9rrrr@Is	


zSyncedStore.setcCshtjj|�sdtjd|�ytj|ddd�Wn4tk
rb}ztjd||f�WYdd}~XnXdS)zr
        Try to create missing directory
        :param dir_path: path to directory
        :return: None
        zTrying to create directory: %si�T)�mode�exist_okz)Unable to create directory: %s, error: %sN)r#r�isdirr,rr�makedirsruZwarning)Zdir_pathrlrrr�_create_missing_dirgszSyncedStore._create_missing_dircCs�|jt�|jt�ytj|ddd�}Wn.tk
rV}z|jdkrF�WYdd}~Xn*Xt||�|j�|j	�t
jd|�t
jd|�dS)a
        Write the contents of data to file in the first mode we can (effectively to create or update
        the file)
        :param path: The string path to the file location we should update
        :param data: The data to write to the file
        :return: None
        zw+zutf-8)r�Nz/Successfully updated syspurpose values at '%s'.z+Failed to update syspurpose values at '%s'.)r��USER_SYSPURPOSE_DIR�	CACHE_DIRrrr(r)r	rB�closer,rr)rGrrr-r.rrrr�vs



zSyncedStore._update_filecCs�d}|jdk	r�|jdk	r�|jj|j�}d|kr�|d}y|jj|�}Wn0tk
rv}ztjd|�WYdd}~XnXd|kr�t|�}|d}|S)z�
        Try to get valid fields from server using current owner (organization)
        :return: Dictionary with valid fields
        Nr7z*Unable to get valid fields from server: %sr)rWrcZgetOwnerZgetOwnerSyspurposeValidFieldsrur,rrr)rreZ
current_ownerZ	owner_keyZresponserlrrrrd�s zSyncedStore.get_valid_fields)NNF)NNN)!rIrJrKrL�USER_SYSPURPOSErX�CACHED_SYSPURPOSEr[rrfrirgrjrtrwr]rvr_ryr�rzr�rxr�r:r<r;r@�staticmethodr�rMr�rdrrrrrS�s4
*

	+"rS�
DiffChanger7�previous_value�	new_value�source�in_base�	in_resultrnc	Cs�tjd�i}|pi}|pi}|p$i}|dkr4|}n|dkrB|}ntd��|dkrZdd�}t|j��t|j��Bt|j��B}�x(|D�]}t|||dd�}	t|||d	d�}
|	p�|
o�|
tk}d
}|	|
kr�|	dkr�tjd|�|}||kr�||||<nv|
dk�r,tjd
|�d}||k�rn||||<nB|	�s<|
tk�rn|	dk�rTtjd|�d}||k�rn||||<|r�|j|�}
t|||
|j|�||k||kd�}||�q�W|S)a�
    Performs a three-way merge on the local and remote dictionaries with a given base.
    :param local: The dictionary of the current local values
    :param base: The dictionary with the values we've last seen
    :param remote: The dictionary with "their" values
    :param on_conflict: Either "remote" or "local" or None. If "remote", the remote changes
                               will win any conflict. If "local", the local changes will win any
                               conflict. If anything else, an error will be thrown.
    :param on_change: This is an optional function which will be given each change as it is
                      detected.
    :return: The dictionary of values as merged between the three provided dictionaries.
    zAttempting a three-way merge...rnrmzAkeyword argument "on_conflict" must be either "remote" or "local"NcSs|S)Nr)Zchangerrr�<lambda>�sz!three_way_merge.<locals>.<lambda>)ro�otherr7r��serverroTzLThree way merge conflict: both local and remote values changed for key '%s'.z7Three way merge: remote value was changed for key '%s'.z6Three way merge: local value was changed for key '%s'.)r7r�r�r�r�r�)	r,rrr"r@r��detect_changed�UNSUPPORTEDr>r�)rmrornZon_conflictr{rO�winnerZall_keysr7rQrPrar��originalZdiffrrrr|�sT
$




r|r�cCs�|pi}|pi}||kr$|dkr$tS|j|�}|j|�}||krP|dkrPt|�St|�tkrxt|�tkrxt|�t|�kS|dkr�|dkr�|dkr�dS||kS)aX
    Detect the type of change that has occurred between base and other for a given key.
    :param base: The dictionary of values we are starting with
    :param other: The dictionary of now current values
    :param key: The key that we are interested in knowing how it changed
    :param source: An optional string which indicates where the "other" values came from. Used to
                   make decisions which are one sided. (i.e. only applicable for changes from the
                   server side).
    :return: True if there was a change, false if there was no change
    :rtype: bool
    rmr�Nr=F)r�r>�bool�typer3�sorted)ror�r7r�Zbase_valZ	other_valrrrr��s

r�)rnN)r�)-Z
__future__rrr�collectionsZloggingr r#r)rZsyspurpose.utilsrrrrr	Zsyspurpose.i18nr
r&r�r�joinr�ZVALID_FIELDSr�r�r�r�r�r�rr~r�Z	getLoggerrIr,r�objectrrNrS�
namedtupler�r|r�rrrr�<module>sJ
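As an illustration only (an editor's sketch, not part of the archived module), the recovered
docstrings above imply the following behaviour for the merge helper. The key names and values
used here are hypothetical, and the import assumes the module is importable as syspurpose.files
(the name cli.py imports from later in this archive).

    from syspurpose.files import three_way_merge

    base = {"role": "server"}                     # state both sides last agreed on
    local = {"role": "workstation"}               # changed locally
    remote = {"role": "server", "usage": "dev"}   # "usage" added on the server

    merged = three_way_merge(local=local, base=base, remote=remote)
    # Each side's independent change survives:
    #   {"role": "workstation", "usage": "dev"}
    # If both sides had changed "role", the remote value would win, since
    # on_conflict defaults to "remote".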
site-packages/syspurpose/i18n.py
from __future__ import print_function, division, absolute_import

# Copyright (c) 2018 Red Hat, Inc.
#
# This software is licensed to you under the GNU General Public License,
# version 2 (GPLv2). There is NO WARRANTY for this software, express or
# implied, including the implied warranties of MERCHANTABILITY or FITNESS
# FOR A PARTICULAR PURPOSE. You should have received a copy of GPLv2
# along with this software; if not, see
# http://www.gnu.org/licenses/old-licenses/gpl-2.0.txt.
#
# Red Hat trademarks are not licensed under GPLv2. No permission is
# granted to use or replicate Red Hat trademarks that are incorporated
# in this software or its documentation.
import gettext
import locale
from threading import local

import six

# Localization domain:
APP = 'syspurpose'
# Directory where translations are deployed:
DIR = '/usr/share/locale/'

TRANSLATION = gettext.translation(APP, fallback=True)

# Save current language using thread safe approach
LOCALE = local()
LOCALE.language = None
LOCALE.lang = None


def configure_i18n():
    """
    Configure internationalization for the application. Should only be
    called once per invocation. (once for CLI, once for GUI)
    """
    import locale
    try:
        locale.setlocale(locale.LC_ALL, '')
    except locale.Error:
        locale.setlocale(locale.LC_ALL, 'C')
    configure_gettext()


def configure_gettext():
    gettext.bindtextdomain(APP, DIR)
    gettext.textdomain(APP)
    gettext.bind_textdomain_codeset(APP, 'UTF-8')
    locale.bind_textdomain_codeset(APP, 'UTF-8')


def ugettext(*args, **kwargs):
    if six.PY2:
        if hasattr(LOCALE, 'lang') and LOCALE.lang is not None:
            return LOCALE.lang.ugettext(*args, **kwargs)
        else:
            return TRANSLATION.ugettext(*args, **kwargs)
    else:
        if hasattr(LOCALE, 'lang') and LOCALE.lang is not None:
            return LOCALE.lang.gettext(*args, **kwargs)
        else:
            return TRANSLATION.gettext(*args, **kwargs)


def ungettext(*args, **kwargs):
    if six.PY2:
        if hasattr(LOCALE, 'lang') and LOCALE.lang is not None:
            return LOCALE.lang.ungettext(*args, **kwargs)
        else:
            return TRANSLATION.ungettext(*args, **kwargs)
    else:
        if hasattr(LOCALE, 'lang') and LOCALE.lang is not None:
            return LOCALE.lang.ngettext(*args, **kwargs)
        else:
            return TRANSLATION.ngettext(*args, **kwargs)
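
# Illustrative usage (editor's sketch, not part of the original file): the CLI entry point
# calls configure_i18n() once at startup and then marks user-facing strings with ugettext;
# when no translation catalog is installed, the original string is returned unchanged.
#
#     from syspurpose.i18n import configure_i18n, ugettext as _
#     configure_i18n()
#     print(_("The value to set"))   # falls back to the English string if untranslated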
site-packages/syspurpose/utils.py
# -*- coding: utf-8 -*-

from __future__ import print_function, division, absolute_import
#
# Copyright (c) 2018 Red Hat, Inc.
#
# This software is licensed to you under the GNU General Public License,
# version 2 (GPLv2). There is NO WARRANTY for this software, express or
# implied, including the implied warranties of MERCHANTABILITY or FITNESS
# FOR A PARTICULAR PURPOSE. You should have received a copy of GPLv2
# along with this software; if not, see
# http://www.gnu.org/licenses/old-licenses/gpl-2.0.txt.
#
# Red Hat trademarks are not licensed under GPLv2. No permission is
# granted to use or replicate Red Hat trademarks that are incorporated
# in this software or its documentation.

"""
Utility methods for the syspurpose command
"""

import io
import json
import os
import errno
import sys
import six
from syspurpose.i18n import ugettext as _

HOST_CONFIG_DIR = "/etc/rhsm-host/"  # symlink inside docker containers


# Borrowed from the subscription-manager cli script
def system_exit(code, msgs=None):
    """Exit with a code and optional message(s). Saved a few lines of code."""

    if msgs:
        if type(msgs) not in [type([]), type(())]:
            msgs = (msgs, )
        for msg in msgs:
            sys.stderr.write(str(msg) + '\n')
    sys.exit(code)


def create_dir(path):
    """
    Attempts to create the path given (less any file)
    :param path: path
    :return: True if changes were made, false otherwise
    """
    try:
        os.makedirs(path, mode=0o755)
    except OSError as e:
        if e.errno == errno.EEXIST:
            # If the directory exists no changes necessary
            return False
        if e.errno == errno.EACCES:
            system_exit(os.EX_NOPERM,
                        _('Cannot create directory {}\nAre you root?').format(path))
    return True


def create_file(path, contents):
    """
    Attempts to create a file, with the given contents
    :param path: The desired path to the file
    :param contents: The contents to write to the file; should be json-serializable
    :return: True if the file was newly created, false otherwise
    """
    try:
        with io.open(path, 'w', encoding='utf-8') as f:
            write_to_file_utf8(f, contents)
            f.flush()

    except OSError as e:
        if e.errno == errno.EEXIST:
            # If the file exists no changes necessary
            return False
        if e.errno == errno.EACCES:
            system_exit(os.EX_NOPERM, _("Cannot create file {}\nAre you root?").format(path))
        else:
            raise
    return True


# Borrowed from the subscription-manager config module
def in_container():
    """
    Are we running in a docker container or not?

    Assumes that if we see host rhsm configuration shared with us, we must
    be running in a container.
    """
    if os.path.exists(HOST_CONFIG_DIR):
        return True
    return False


def make_utf8(obj):
    """
    Transforms the provided string into unicode if it is not already
    :param obj: the string to decode
    :return: the unicode format of the string
    """
    if six.PY3:
        return obj
    elif obj is not None and isinstance(obj, str) and not isinstance(obj, unicode):
        obj = obj.decode('utf-8')
        return obj
    else:
        return obj


def write_to_file_utf8(file, data):
    """
    Writes out the provided data to the specified file, with user-friendly indentation,
    and in utf-8 encoding.
    :param file: The file to write to
    :param data: The data to be written
    :return:
    """
    file.write(make_utf8(json.dumps(data, indent=2, ensure_ascii=False, sort_keys=True)))
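
# Illustrative usage (editor's sketch, not part of the original file): the helpers above are
# typically combined to lay down a fresh JSON file; the path below is hypothetical.
#
#     from syspurpose.utils import create_dir, create_file
#     create_dir("/tmp/example-syspurpose")                    # returns False if it already exists
#     create_file("/tmp/example-syspurpose/syspurpose.json",
#                 {"role": "workstation"})                     # written via write_to_file_utf8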
site-packages/syspurpose/main.py
# -*- coding: utf-8 -*-

from __future__ import print_function, division, absolute_import
#
# Copyright (c) 2018 Red Hat, Inc.
#
# This software is licensed to you under the GNU General Public License,
# version 2 (GPLv2). There is NO WARRANTY for this software, express or
# implied, including the implied warranties of MERCHANTABILITY or FITNESS
# FOR A PARTICULAR PURPOSE. You should have received a copy of GPLv2
# along with this software; if not, see
# http://www.gnu.org/licenses/old-licenses/gpl-2.0.txt.
#
# Red Hat trademarks are not licensed under GPLv2. No permission is
# granted to use or replicate Red Hat trademarks that are incorporated
# in this software or its documentation.

import sys
from syspurpose import cli
from syspurpose.utils import system_exit

import syspurpose.i18n as i18n
i18n.configure_i18n()

from syspurpose.i18n import ugettext as _


def main():
    try:
        sys.exit(cli.main() or 0)
    except KeyboardInterrupt:
        system_exit(0, _("User interrupted process"))
    except Exception as e:
        system_exit(-1, str(e))


if __name__ == "__main__":
    main()
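
# Illustrative invocation (editor's note, not part of the original file): this entry point
# simply hands control to syspurpose.cli.main(), so it can be run as, e.g.
#
#     python3 -m syspurpose.main show
#
# which syncs and prints the non-empty syspurpose values as JSON.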
site-packages/syspurpose/__init__.py  (empty file)

site-packages/syspurpose/cli.py
# -*- coding: utf-8 -*-

from __future__ import print_function, division, absolute_import
#
# Copyright (c) 2018 Red Hat, Inc.
#
# This software is licensed to you under the GNU General Public License,
# version 2 (GPLv2). There is NO WARRANTY for this software, express or
# implied, including the implied warranties of MERCHANTABILITY or FITNESS
# FOR A PARTICULAR PURPOSE. You should have received a copy of GPLv2
# along with this software; if not, see
# http://www.gnu.org/licenses/old-licenses/gpl-2.0.txt.
#
# Red Hat trademarks are not licensed under GPLv2. No permission is
# granted to use or replicate Red Hat trademarks that are incorporated
# in this software or its documentation.

import argparse
import os
import sys
import six

from syspurpose.files import SyncedStore
from syspurpose.utils import in_container, make_utf8
from syspurpose.i18n import ugettext as _
import json


SP_CONFLICT_MESSAGE = _("Warning: A {attr} of \"{download_value}\" was recently set for this system "
                        "by the entitlement server administrator.\n{advice}")
SP_ADVICE = _("If you'd like to overwrite the server side change please run: {command}")


def add_command(args, syspurposestore):
    """
    Uses the syspurposestore to add one or more values to a particular property.
    :param args: The parsed args from argparse, expected attributes are:
        prop_name: the string name of the property to add to
        values: A list of the values to add to the given property (could be anything json-serializable)
    :param syspurposestore: A SyspurposeStore object to manipulate
    :return: None
    """
    any_prop_added = False
    for value in args.values:
        if syspurposestore.add(args.prop_name, value):
            any_prop_added = True
            print(_("Added \"{value}\" to {prop_name}.").format(
                value=make_utf8(value), prop_name=make_utf8(args.prop_name)))
        else:
            print(_("Not adding value \"{value}\" to {prop_name}; it already exists.").format(
                value=make_utf8(value), prop_name=make_utf8(args.prop_name)))

    if any_prop_added is False:
        return

    success_msg = _("{attr} updated.").format(attr=make_utf8(args.prop_name))
    to_add = " ".join(args.values)
    command = "syspurpose add-{name} ".format(name=args.prop_name) + to_add
    check_result(
        syspurposestore,
        expectation=lambda res: all(x in res.get(args.prop_name, []) for x in args.values),
        success_msg=success_msg,
        command=command,
        attr=args.prop_name
    )


def remove_command(args, syspurposestore):
    """
    Uses the syspurposestore to remove one or more values from a particular property.
    :param args: The parsed args from argparse, expected attributes are:
        prop_name: the string name of the property to remove from
        values: A list of the values to remove from the given property (could be anything json-serializable)
    :param syspurposestore: A SyspurposeStore object to manipulate
    :return: None
    """
    for value in args.values:
        if syspurposestore.remove(args.prop_name, value):
            print(_("Removed \"{val}\" from {name}.").format(val=make_utf8(value), name=make_utf8(args.prop_name)))
        else:
            print(_("Not removing value \"{val}\" from {name}; it was not there.").format(
                val=make_utf8(value), name=make_utf8(args.prop_name)))
            return

    success_msg = _("{attr} updated.").format(attr=make_utf8(args.prop_name))
    to_remove = " ".join(args.values)
    command = "syspurpose remove-{name} ".format(name=args.prop_name) + to_remove
    check_result(
        syspurposestore,
        expectation=lambda res: all(x not in res.get(args.prop_name, []) for x in args.values),
        success_msg=success_msg,
        command=command,
        attr=args.prop_name
    )


def set_command(args, syspurposestore):
    """
    Uses the syspurposestore to set the prop_name to value.
    :param args: The parsed args from argparse, expected attributes are:
        prop_name: the string name of the property to set
        value: An object to set the property to (could be anything json-serializable)
    :param syspurposestore: A SyspurposeStore object to manipulate
    :return: None
    """
    syspurposestore.set(args.prop_name, args.value)

    success_msg = _("{attr} set to \"{val}\".").format(attr=make_utf8(args.prop_name), val=make_utf8(args.value))
    check_result(
        syspurposestore,
        expectation=lambda res: res.get(args.prop_name) == args.value,
        success_msg=success_msg,
        command="syspurpose set {name} \"{val}\"".format(
            name=args.prop_name, val=args.value),
        attr=args.prop_name
    )


def unset_command(args, syspurposestore):
    """
    Uses the syspurposestore to unset (clear entirely) the prop_name.
    :param args: The parsed args from argparse, expected attributes are:
        prop_name: the string name of the property to unset (clear)
    :param syspurposestore: An SyspurposeStore object to manipulate
    :return: None
    """
    syspurposestore.unset(args.prop_name)

    success_msg = _("{attr} unset.").format(attr=make_utf8(args.prop_name))
    check_result(
        syspurposestore,
        expectation=lambda res: res.get(args.prop_name) in ["", None, []],
        success_msg=success_msg,
        command="syspurpose unset {name}".format(name=args.prop_name),
        attr=args.prop_name
    )


def show_contents(args, syspurposestore):
    """
    Sync the syspurpose store and pretty-print its non-empty values as JSON.
    :param args: The parsed args from argparse (not used by this command)
    :param syspurposestore: A SyncedStore object to read from
    :return: The SyncResult of the sync operation
    """
    sync_result = syspurposestore.sync()
    contents = sync_result.result
    contents = {key: contents[key] for key in contents if contents[key]}
    print(json.dumps(contents, indent=2, ensure_ascii=False, sort_keys=True))
    return sync_result


def setup_arg_parser():
    """
    Sets up argument parsing for the syspurpose tool.
    :return: An argparse.ArgumentParser ready to use to parse_args
    """
    parser = argparse.ArgumentParser(prog="syspurpose", description=_("System Syspurpose Management Tool"),
                                     epilog=_("The 'syspurpose' command is deprecated and will be removed in a future major release."
                                                " Please use the 'subscription-manager syspurpose' command going forward."))

    subparsers = parser.add_subparsers(help="sub-command help")

    # Arguments shared by subcommands
    add_options = argparse.ArgumentParser(add_help=False)
    add_options.add_argument("values", help=_("The value(s) to add"), nargs='+')
    add_options.set_defaults(func=add_command, requires_sync=True)

    remove_options = argparse.ArgumentParser(add_help=False)
    remove_options.add_argument("values", help=_("The value(s) to remove"), nargs='+')
    remove_options.set_defaults(func=remove_command, requires_sync=True)

    set_options = argparse.ArgumentParser(add_help=False)
    set_options.add_argument("value", help=_("The value to set"), action="store")
    set_options.set_defaults(func=set_command, requires_sync=True)

    unset_options = argparse.ArgumentParser(add_help=False)
    unset_options.set_defaults(func=unset_command, requires_sync=True)

    # Generic assignments
    # Set ################
    generic_set_parser = subparsers.add_parser(
        "set",
        help=_("Sets the value for the given property")
    )

    generic_set_parser.add_argument(
        "prop_name",
        metavar="property",
        help=_("The name of the property to set/update"),
        action="store"
    )

    generic_set_parser.add_argument(
        "value",
        help=_("The value to set"),
        action="store"
    )

    generic_set_parser.set_defaults(func=set_command, requires_sync=True)

    # Unset ##############
    generic_unset_parser = subparsers.add_parser(
        "unset",
        help=_("Unsets (clears) the value for the given property"),
        parents=[unset_options]
    )

    generic_unset_parser.add_argument(
        "prop_name",
        metavar="property",
        help=_("The name of the property to set/update"),
        action="store"
    )

    # Add ################
    generic_add_parser = subparsers.add_parser(
        "add",
        help=_("Adds the value(s) to the given property")
    )

    generic_add_parser.add_argument(
        "prop_name",
        metavar="property",
        help=_("The name of the property to update"),
        action="store"
    )

    generic_add_parser.add_argument(
        "values",
        help=_("The value(s) to add"),
        action="store",
        nargs="+"
    )

    generic_add_parser.set_defaults(func=add_command, requires_sync=True)

    # Remove #############
    generic_remove_parser = subparsers.add_parser(
        "remove",
        help=_("Removes the value(s) from the given property")
    )

    generic_remove_parser.add_argument(
        "prop_name",
        metavar="property",
        help=_("The name of the property to update"),
        action="store"
    )

    generic_remove_parser.add_argument(
        "values",
        help=_("The value(s) to remove"),
        action="store",
        nargs="+"
    )

    generic_remove_parser.set_defaults(func=remove_command, requires_sync=True)
    # Targeted commands
    # Roles ##########
    set_role_parser = subparsers.add_parser(
        "set-role",
        help=_("Set the system role to the system syspurpose"),
        parents=[set_options]
    )
    # TODO: Set prop_name from schema file
    set_role_parser.set_defaults(prop_name="role")

    unset_role_parser = subparsers.add_parser(
        "unset-role",
        help=_("Clear set role"),
        parents=[unset_options]
    )
    unset_role_parser.set_defaults(prop_name="role")

    # ADDONS #############
    add_addons_parser = subparsers.add_parser(
        "add-addons",
        help=_("Add addons to the system syspurpose"),
        parents=[add_options]
    )
    # TODO: Set prop_name from schema file
    add_addons_parser.set_defaults(prop_name="addons")

    remove_addons_parser = subparsers.add_parser(
        "remove-addons",
        help=_("Remove addons from the system syspurpose"),
        parents=[remove_options]
    )
    remove_addons_parser.set_defaults(prop_name="addons")

    unset_role_parser = subparsers.add_parser(
        "unset-addons",
        help=_("Clear set addons"),
        parents=[unset_options]
    )
    unset_role_parser.set_defaults(prop_name="addons")

    # SLA ################
    set_sla_parser = subparsers.add_parser(
        "set-sla",
        help=_("Set the system sla"),
        parents=[set_options]
    )
    set_sla_parser.set_defaults(prop_name="service_level_agreement")

    unset_sla_parser = subparsers.add_parser(
        "unset-sla",
        help=_("Clear set sla"),
        parents=[unset_options]
    )
    unset_sla_parser.set_defaults(prop_name="service_level_agreement")

    # USAGE ##############
    set_usage_parser = subparsers.add_parser(
        "set-usage",
        help=_("Set the system usage"),
        parents=[set_options]
    )

    set_usage_parser.set_defaults(prop_name="usage")

    unset_usage_parser = subparsers.add_parser(
        "unset-usage",
        help=_("Clear set usage"),
        parents=[unset_options]
    )
    unset_usage_parser.set_defaults(prop_name="usage")

    # Pretty Print Json contents of default syspurpose file
    show_parser = subparsers.add_parser(
        "show",
        help=_("Show the current system syspurpose")
    )
    show_parser.set_defaults(func=show_contents, requires_sync=False)

    return parser
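
# Illustrative invocations (editor's sketch, not part of the original file) for the parser
# built above; the attribute values shown are hypothetical examples.
#
#     syspurpose set-role "Red Hat Enterprise Linux Server"
#     syspurpose add-addons "Addon One" "Addon Two"
#     syspurpose unset-usage
#     syspurpose show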


def main():
    """
    Run the cli (Do the syspurpose tool thing!!)
    :return: 0
    """

    parser = setup_arg_parser()
    args = parser.parse_args()

    # Syspurpose is not intended to be used in containers for the time being (could change later).
    if in_container():
        print(_("WARNING: Setting syspurpose in containers has no effect.\n"
              "Please run syspurpose on the host.\n"))

    print(_("The 'syspurpose' command is deprecated and will be removed in a future major release."
        " Please use the 'subscription-manager syspurpose' command going forward."), file=sys.stderr)
    try:
        from subscription_manager.identity import Identity
        from subscription_manager.cp_provider import CPProvider
        identity = Identity()
        uuid = identity.uuid
        uep = CPProvider().get_consumer_auth_cp()
    except ImportError:
        uuid = None
        uep = None
        print(_("Warning: Unable to sync system purpose with subscription management server:"
                " subscription_manager module is not available."))

    syspurposestore = SyncedStore(uep=uep, consumer_uuid=uuid, use_valid_fields=True)
    if getattr(args, 'func', None) is not None:
        args.func(args, syspurposestore)
    else:
        parser.print_help()
        return 0

    return 0


def system_exit(code, msgs=None):
    """
    Exit with a code and optional message(s). Saved a few lines of code.
    Note: this method was copied from subscription_manager, because syspurpose should be
    as independent of the subscription_manager module as possible.
    """

    if msgs:
        if type(msgs) not in [type([]), type(())]:
            msgs = (msgs, )
        for msg in msgs:
            if isinstance(msg, Exception):
                msg = "%s" % msg

            if isinstance(msg, six.text_type) and six.PY2:
                sys.stderr.write(msg.encode("utf8"))
                sys.stderr.write("\n")
            else:
                sys.stderr.write(msg)
                sys.stderr.write("\n")

    try:
        sys.stdout.flush()
        sys.stderr.flush()
    except IOError:
        pass

    sys.exit(code)


def check_result(syspurposestore, expectation, success_msg, command, attr):
    if syspurposestore:
        syspurposestore.sync()
        result = syspurposestore.get_cached_contents()
    else:
        result = {}
    if result and not expectation(result):
        advice = SP_ADVICE.format(command=command)
        value = result[attr]
        system_exit(
            os.EX_SOFTWARE,
            msgs=_(SP_CONFLICT_MESSAGE.format(
                attr=attr,
                download_value=value,
                advice=advice
            ))
        )
    else:
        print(success_msg)
site-packages/pkg_resources/_vendor/six.py
"""Utilities for writing code that runs on Python 2 and 3"""

# Copyright (c) 2010-2015 Benjamin Peterson
#
# Permission is hereby granted, free of charge, to any person obtaining a copy
# of this software and associated documentation files (the "Software"), to deal
# in the Software without restriction, including without limitation the rights
# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
# copies of the Software, and to permit persons to whom the Software is
# furnished to do so, subject to the following conditions:
#
# The above copyright notice and this permission notice shall be included in all
# copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
# SOFTWARE.

from __future__ import absolute_import

import functools
import itertools
import operator
import sys
import types

__author__ = "Benjamin Peterson <benjamin@python.org>"
__version__ = "1.10.0"


# Useful for very coarse version differentiation.
PY2 = sys.version_info[0] == 2
PY3 = sys.version_info[0] == 3
PY34 = sys.version_info[0:2] >= (3, 4)

if PY3:
    string_types = str,
    integer_types = int,
    class_types = type,
    text_type = str
    binary_type = bytes

    MAXSIZE = sys.maxsize
else:
    string_types = basestring,
    integer_types = (int, long)
    class_types = (type, types.ClassType)
    text_type = unicode
    binary_type = str

    if sys.platform.startswith("java"):
        # Jython always uses 32 bits.
        MAXSIZE = int((1 << 31) - 1)
    else:
        # It's possible to have sizeof(long) != sizeof(Py_ssize_t).
        class X(object):

            def __len__(self):
                return 1 << 31
        try:
            len(X())
        except OverflowError:
            # 32-bit
            MAXSIZE = int((1 << 31) - 1)
        else:
            # 64-bit
            MAXSIZE = int((1 << 63) - 1)
        del X


def _add_doc(func, doc):
    """Add documentation to a function."""
    func.__doc__ = doc


def _import_module(name):
    """Import module, returning the module after the last dot."""
    __import__(name)
    return sys.modules[name]


class _LazyDescr(object):

    def __init__(self, name):
        self.name = name

    def __get__(self, obj, tp):
        result = self._resolve()
        setattr(obj, self.name, result)  # Invokes __set__.
        try:
            # This is a bit ugly, but it avoids running this again by
            # removing this descriptor.
            delattr(obj.__class__, self.name)
        except AttributeError:
            pass
        return result


class MovedModule(_LazyDescr):

    def __init__(self, name, old, new=None):
        super(MovedModule, self).__init__(name)
        if PY3:
            if new is None:
                new = name
            self.mod = new
        else:
            self.mod = old

    def _resolve(self):
        return _import_module(self.mod)

    def __getattr__(self, attr):
        _module = self._resolve()
        value = getattr(_module, attr)
        setattr(self, attr, value)
        return value


class _LazyModule(types.ModuleType):

    def __init__(self, name):
        super(_LazyModule, self).__init__(name)
        self.__doc__ = self.__class__.__doc__

    def __dir__(self):
        attrs = ["__doc__", "__name__"]
        attrs += [attr.name for attr in self._moved_attributes]
        return attrs

    # Subclasses should override this
    _moved_attributes = []


class MovedAttribute(_LazyDescr):

    def __init__(self, name, old_mod, new_mod, old_attr=None, new_attr=None):
        super(MovedAttribute, self).__init__(name)
        if PY3:
            if new_mod is None:
                new_mod = name
            self.mod = new_mod
            if new_attr is None:
                if old_attr is None:
                    new_attr = name
                else:
                    new_attr = old_attr
            self.attr = new_attr
        else:
            self.mod = old_mod
            if old_attr is None:
                old_attr = name
            self.attr = old_attr

    def _resolve(self):
        module = _import_module(self.mod)
        return getattr(module, self.attr)


class _SixMetaPathImporter(object):

    """
    A meta path importer to import six.moves and its submodules.

    This class implements a PEP302 finder and loader. It should be compatible
    with Python 2.5 and all existing versions of Python3
    """

    def __init__(self, six_module_name):
        self.name = six_module_name
        self.known_modules = {}

    def _add_module(self, mod, *fullnames):
        for fullname in fullnames:
            self.known_modules[self.name + "." + fullname] = mod

    def _get_module(self, fullname):
        return self.known_modules[self.name + "." + fullname]

    def find_module(self, fullname, path=None):
        if fullname in self.known_modules:
            return self
        return None

    def __get_module(self, fullname):
        try:
            return self.known_modules[fullname]
        except KeyError:
            raise ImportError("This loader does not know module " + fullname)

    def load_module(self, fullname):
        try:
            # in case of a reload
            return sys.modules[fullname]
        except KeyError:
            pass
        mod = self.__get_module(fullname)
        if isinstance(mod, MovedModule):
            mod = mod._resolve()
        else:
            mod.__loader__ = self
        sys.modules[fullname] = mod
        return mod

    def is_package(self, fullname):
        """
        Return true, if the named module is a package.

        We need this method to get correct spec objects with
        Python 3.4 (see PEP451)
        """
        return hasattr(self.__get_module(fullname), "__path__")

    def get_code(self, fullname):
        """Return None

        Required, if is_package is implemented"""
        self.__get_module(fullname)  # eventually raises ImportError
        return None
    get_source = get_code  # same as get_code

_importer = _SixMetaPathImporter(__name__)


class _MovedItems(_LazyModule):

    """Lazy loading of moved objects"""
    __path__ = []  # mark as package


_moved_attributes = [
    MovedAttribute("cStringIO", "cStringIO", "io", "StringIO"),
    MovedAttribute("filter", "itertools", "builtins", "ifilter", "filter"),
    MovedAttribute("filterfalse", "itertools", "itertools", "ifilterfalse", "filterfalse"),
    MovedAttribute("input", "__builtin__", "builtins", "raw_input", "input"),
    MovedAttribute("intern", "__builtin__", "sys"),
    MovedAttribute("map", "itertools", "builtins", "imap", "map"),
    MovedAttribute("getcwd", "os", "os", "getcwdu", "getcwd"),
    MovedAttribute("getcwdb", "os", "os", "getcwd", "getcwdb"),
    MovedAttribute("range", "__builtin__", "builtins", "xrange", "range"),
    MovedAttribute("reload_module", "__builtin__", "importlib" if PY34 else "imp", "reload"),
    MovedAttribute("reduce", "__builtin__", "functools"),
    MovedAttribute("shlex_quote", "pipes", "shlex", "quote"),
    MovedAttribute("StringIO", "StringIO", "io"),
    MovedAttribute("UserDict", "UserDict", "collections"),
    MovedAttribute("UserList", "UserList", "collections"),
    MovedAttribute("UserString", "UserString", "collections"),
    MovedAttribute("xrange", "__builtin__", "builtins", "xrange", "range"),
    MovedAttribute("zip", "itertools", "builtins", "izip", "zip"),
    MovedAttribute("zip_longest", "itertools", "itertools", "izip_longest", "zip_longest"),
    MovedModule("builtins", "__builtin__"),
    MovedModule("configparser", "ConfigParser"),
    MovedModule("copyreg", "copy_reg"),
    MovedModule("dbm_gnu", "gdbm", "dbm.gnu"),
    MovedModule("_dummy_thread", "dummy_thread", "_dummy_thread"),
    MovedModule("http_cookiejar", "cookielib", "http.cookiejar"),
    MovedModule("http_cookies", "Cookie", "http.cookies"),
    MovedModule("html_entities", "htmlentitydefs", "html.entities"),
    MovedModule("html_parser", "HTMLParser", "html.parser"),
    MovedModule("http_client", "httplib", "http.client"),
    MovedModule("email_mime_multipart", "email.MIMEMultipart", "email.mime.multipart"),
    MovedModule("email_mime_nonmultipart", "email.MIMENonMultipart", "email.mime.nonmultipart"),
    MovedModule("email_mime_text", "email.MIMEText", "email.mime.text"),
    MovedModule("email_mime_base", "email.MIMEBase", "email.mime.base"),
    MovedModule("BaseHTTPServer", "BaseHTTPServer", "http.server"),
    MovedModule("CGIHTTPServer", "CGIHTTPServer", "http.server"),
    MovedModule("SimpleHTTPServer", "SimpleHTTPServer", "http.server"),
    MovedModule("cPickle", "cPickle", "pickle"),
    MovedModule("queue", "Queue"),
    MovedModule("reprlib", "repr"),
    MovedModule("socketserver", "SocketServer"),
    MovedModule("_thread", "thread", "_thread"),
    MovedModule("tkinter", "Tkinter"),
    MovedModule("tkinter_dialog", "Dialog", "tkinter.dialog"),
    MovedModule("tkinter_filedialog", "FileDialog", "tkinter.filedialog"),
    MovedModule("tkinter_scrolledtext", "ScrolledText", "tkinter.scrolledtext"),
    MovedModule("tkinter_simpledialog", "SimpleDialog", "tkinter.simpledialog"),
    MovedModule("tkinter_tix", "Tix", "tkinter.tix"),
    MovedModule("tkinter_ttk", "ttk", "tkinter.ttk"),
    MovedModule("tkinter_constants", "Tkconstants", "tkinter.constants"),
    MovedModule("tkinter_dnd", "Tkdnd", "tkinter.dnd"),
    MovedModule("tkinter_colorchooser", "tkColorChooser",
                "tkinter.colorchooser"),
    MovedModule("tkinter_commondialog", "tkCommonDialog",
                "tkinter.commondialog"),
    MovedModule("tkinter_tkfiledialog", "tkFileDialog", "tkinter.filedialog"),
    MovedModule("tkinter_font", "tkFont", "tkinter.font"),
    MovedModule("tkinter_messagebox", "tkMessageBox", "tkinter.messagebox"),
    MovedModule("tkinter_tksimpledialog", "tkSimpleDialog",
                "tkinter.simpledialog"),
    MovedModule("urllib_parse", __name__ + ".moves.urllib_parse", "urllib.parse"),
    MovedModule("urllib_error", __name__ + ".moves.urllib_error", "urllib.error"),
    MovedModule("urllib", __name__ + ".moves.urllib", __name__ + ".moves.urllib"),
    MovedModule("urllib_robotparser", "robotparser", "urllib.robotparser"),
    MovedModule("xmlrpc_client", "xmlrpclib", "xmlrpc.client"),
    MovedModule("xmlrpc_server", "SimpleXMLRPCServer", "xmlrpc.server"),
]
# Add windows specific modules.
if sys.platform == "win32":
    _moved_attributes += [
        MovedModule("winreg", "_winreg"),
    ]

for attr in _moved_attributes:
    setattr(_MovedItems, attr.name, attr)
    if isinstance(attr, MovedModule):
        _importer._add_module(attr, "moves." + attr.name)
del attr

_MovedItems._moved_attributes = _moved_attributes

moves = _MovedItems(__name__ + ".moves")
_importer._add_module(moves, "moves")


class Module_six_moves_urllib_parse(_LazyModule):

    """Lazy loading of moved objects in six.moves.urllib_parse"""


_urllib_parse_moved_attributes = [
    MovedAttribute("ParseResult", "urlparse", "urllib.parse"),
    MovedAttribute("SplitResult", "urlparse", "urllib.parse"),
    MovedAttribute("parse_qs", "urlparse", "urllib.parse"),
    MovedAttribute("parse_qsl", "urlparse", "urllib.parse"),
    MovedAttribute("urldefrag", "urlparse", "urllib.parse"),
    MovedAttribute("urljoin", "urlparse", "urllib.parse"),
    MovedAttribute("urlparse", "urlparse", "urllib.parse"),
    MovedAttribute("urlsplit", "urlparse", "urllib.parse"),
    MovedAttribute("urlunparse", "urlparse", "urllib.parse"),
    MovedAttribute("urlunsplit", "urlparse", "urllib.parse"),
    MovedAttribute("quote", "urllib", "urllib.parse"),
    MovedAttribute("quote_plus", "urllib", "urllib.parse"),
    MovedAttribute("unquote", "urllib", "urllib.parse"),
    MovedAttribute("unquote_plus", "urllib", "urllib.parse"),
    MovedAttribute("urlencode", "urllib", "urllib.parse"),
    MovedAttribute("splitquery", "urllib", "urllib.parse"),
    MovedAttribute("splittag", "urllib", "urllib.parse"),
    MovedAttribute("splituser", "urllib", "urllib.parse"),
    MovedAttribute("uses_fragment", "urlparse", "urllib.parse"),
    MovedAttribute("uses_netloc", "urlparse", "urllib.parse"),
    MovedAttribute("uses_params", "urlparse", "urllib.parse"),
    MovedAttribute("uses_query", "urlparse", "urllib.parse"),
    MovedAttribute("uses_relative", "urlparse", "urllib.parse"),
]
for attr in _urllib_parse_moved_attributes:
    setattr(Module_six_moves_urllib_parse, attr.name, attr)
del attr

Module_six_moves_urllib_parse._moved_attributes = _urllib_parse_moved_attributes

_importer._add_module(Module_six_moves_urllib_parse(__name__ + ".moves.urllib_parse"),
                      "moves.urllib_parse", "moves.urllib.parse")


class Module_six_moves_urllib_error(_LazyModule):

    """Lazy loading of moved objects in six.moves.urllib_error"""


_urllib_error_moved_attributes = [
    MovedAttribute("URLError", "urllib2", "urllib.error"),
    MovedAttribute("HTTPError", "urllib2", "urllib.error"),
    MovedAttribute("ContentTooShortError", "urllib", "urllib.error"),
]
for attr in _urllib_error_moved_attributes:
    setattr(Module_six_moves_urllib_error, attr.name, attr)
del attr

Module_six_moves_urllib_error._moved_attributes = _urllib_error_moved_attributes

_importer._add_module(Module_six_moves_urllib_error(__name__ + ".moves.urllib.error"),
                      "moves.urllib_error", "moves.urllib.error")


class Module_six_moves_urllib_request(_LazyModule):

    """Lazy loading of moved objects in six.moves.urllib_request"""


_urllib_request_moved_attributes = [
    MovedAttribute("urlopen", "urllib2", "urllib.request"),
    MovedAttribute("install_opener", "urllib2", "urllib.request"),
    MovedAttribute("build_opener", "urllib2", "urllib.request"),
    MovedAttribute("pathname2url", "urllib", "urllib.request"),
    MovedAttribute("url2pathname", "urllib", "urllib.request"),
    MovedAttribute("getproxies", "urllib", "urllib.request"),
    MovedAttribute("Request", "urllib2", "urllib.request"),
    MovedAttribute("OpenerDirector", "urllib2", "urllib.request"),
    MovedAttribute("HTTPDefaultErrorHandler", "urllib2", "urllib.request"),
    MovedAttribute("HTTPRedirectHandler", "urllib2", "urllib.request"),
    MovedAttribute("HTTPCookieProcessor", "urllib2", "urllib.request"),
    MovedAttribute("ProxyHandler", "urllib2", "urllib.request"),
    MovedAttribute("BaseHandler", "urllib2", "urllib.request"),
    MovedAttribute("HTTPPasswordMgr", "urllib2", "urllib.request"),
    MovedAttribute("HTTPPasswordMgrWithDefaultRealm", "urllib2", "urllib.request"),
    MovedAttribute("AbstractBasicAuthHandler", "urllib2", "urllib.request"),
    MovedAttribute("HTTPBasicAuthHandler", "urllib2", "urllib.request"),
    MovedAttribute("ProxyBasicAuthHandler", "urllib2", "urllib.request"),
    MovedAttribute("AbstractDigestAuthHandler", "urllib2", "urllib.request"),
    MovedAttribute("HTTPDigestAuthHandler", "urllib2", "urllib.request"),
    MovedAttribute("ProxyDigestAuthHandler", "urllib2", "urllib.request"),
    MovedAttribute("HTTPHandler", "urllib2", "urllib.request"),
    MovedAttribute("HTTPSHandler", "urllib2", "urllib.request"),
    MovedAttribute("FileHandler", "urllib2", "urllib.request"),
    MovedAttribute("FTPHandler", "urllib2", "urllib.request"),
    MovedAttribute("CacheFTPHandler", "urllib2", "urllib.request"),
    MovedAttribute("UnknownHandler", "urllib2", "urllib.request"),
    MovedAttribute("HTTPErrorProcessor", "urllib2", "urllib.request"),
    MovedAttribute("urlretrieve", "urllib", "urllib.request"),
    MovedAttribute("urlcleanup", "urllib", "urllib.request"),
    MovedAttribute("URLopener", "urllib", "urllib.request"),
    MovedAttribute("FancyURLopener", "urllib", "urllib.request"),
    MovedAttribute("proxy_bypass", "urllib", "urllib.request"),
]
for attr in _urllib_request_moved_attributes:
    setattr(Module_six_moves_urllib_request, attr.name, attr)
del attr

Module_six_moves_urllib_request._moved_attributes = _urllib_request_moved_attributes

_importer._add_module(Module_six_moves_urllib_request(__name__ + ".moves.urllib.request"),
                      "moves.urllib_request", "moves.urllib.request")


class Module_six_moves_urllib_response(_LazyModule):

    """Lazy loading of moved objects in six.moves.urllib_response"""


_urllib_response_moved_attributes = [
    MovedAttribute("addbase", "urllib", "urllib.response"),
    MovedAttribute("addclosehook", "urllib", "urllib.response"),
    MovedAttribute("addinfo", "urllib", "urllib.response"),
    MovedAttribute("addinfourl", "urllib", "urllib.response"),
]
for attr in _urllib_response_moved_attributes:
    setattr(Module_six_moves_urllib_response, attr.name, attr)
del attr

Module_six_moves_urllib_response._moved_attributes = _urllib_response_moved_attributes

_importer._add_module(Module_six_moves_urllib_response(__name__ + ".moves.urllib.response"),
                      "moves.urllib_response", "moves.urllib.response")


class Module_six_moves_urllib_robotparser(_LazyModule):

    """Lazy loading of moved objects in six.moves.urllib_robotparser"""


_urllib_robotparser_moved_attributes = [
    MovedAttribute("RobotFileParser", "robotparser", "urllib.robotparser"),
]
for attr in _urllib_robotparser_moved_attributes:
    setattr(Module_six_moves_urllib_robotparser, attr.name, attr)
del attr

Module_six_moves_urllib_robotparser._moved_attributes = _urllib_robotparser_moved_attributes

_importer._add_module(Module_six_moves_urllib_robotparser(__name__ + ".moves.urllib.robotparser"),
                      "moves.urllib_robotparser", "moves.urllib.robotparser")


class Module_six_moves_urllib(types.ModuleType):

    """Create a six.moves.urllib namespace that resembles the Python 3 namespace"""
    __path__ = []  # mark as package
    parse = _importer._get_module("moves.urllib_parse")
    error = _importer._get_module("moves.urllib_error")
    request = _importer._get_module("moves.urllib_request")
    response = _importer._get_module("moves.urllib_response")
    robotparser = _importer._get_module("moves.urllib_robotparser")

    def __dir__(self):
        return ['parse', 'error', 'request', 'response', 'robotparser']

_importer._add_module(Module_six_moves_urllib(__name__ + ".moves.urllib"),
                      "moves.urllib")


def add_move(move):
    """Add an item to six.moves."""
    setattr(_MovedItems, move.name, move)


def remove_move(name):
    """Remove item from six.moves."""
    try:
        delattr(_MovedItems, name)
    except AttributeError:
        try:
            del moves.__dict__[name]
        except KeyError:
            raise AttributeError("no such move, %r" % (name,))
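
# Illustrative usage (editor's sketch, not part of the original file): registering an extra
# move so it becomes importable from six.moves; the "mock" mapping below is a hypothetical
# example, not something six registers itself.
#
#     import six
#     six.add_move(six.MovedModule("mock", "mock", "unittest.mock"))
#     from six.moves import mock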


if PY3:
    _meth_func = "__func__"
    _meth_self = "__self__"

    _func_closure = "__closure__"
    _func_code = "__code__"
    _func_defaults = "__defaults__"
    _func_globals = "__globals__"
else:
    _meth_func = "im_func"
    _meth_self = "im_self"

    _func_closure = "func_closure"
    _func_code = "func_code"
    _func_defaults = "func_defaults"
    _func_globals = "func_globals"


try:
    advance_iterator = next
except NameError:
    def advance_iterator(it):
        return it.next()
next = advance_iterator


try:
    callable = callable
except NameError:
    def callable(obj):
        return any("__call__" in klass.__dict__ for klass in type(obj).__mro__)


if PY3:
    def get_unbound_function(unbound):
        return unbound

    create_bound_method = types.MethodType

    def create_unbound_method(func, cls):
        return func

    Iterator = object
else:
    def get_unbound_function(unbound):
        return unbound.im_func

    def create_bound_method(func, obj):
        return types.MethodType(func, obj, obj.__class__)

    def create_unbound_method(func, cls):
        return types.MethodType(func, None, cls)

    class Iterator(object):

        def next(self):
            return type(self).__next__(self)

    callable = callable
_add_doc(get_unbound_function,
         """Get the function out of a possibly unbound function""")


get_method_function = operator.attrgetter(_meth_func)
get_method_self = operator.attrgetter(_meth_self)
get_function_closure = operator.attrgetter(_func_closure)
get_function_code = operator.attrgetter(_func_code)
get_function_defaults = operator.attrgetter(_func_defaults)
get_function_globals = operator.attrgetter(_func_globals)


if PY3:
    def iterkeys(d, **kw):
        return iter(d.keys(**kw))

    def itervalues(d, **kw):
        return iter(d.values(**kw))

    def iteritems(d, **kw):
        return iter(d.items(**kw))

    def iterlists(d, **kw):
        return iter(d.lists(**kw))

    viewkeys = operator.methodcaller("keys")

    viewvalues = operator.methodcaller("values")

    viewitems = operator.methodcaller("items")
else:
    def iterkeys(d, **kw):
        return d.iterkeys(**kw)

    def itervalues(d, **kw):
        return d.itervalues(**kw)

    def iteritems(d, **kw):
        return d.iteritems(**kw)

    def iterlists(d, **kw):
        return d.iterlists(**kw)

    viewkeys = operator.methodcaller("viewkeys")

    viewvalues = operator.methodcaller("viewvalues")

    viewitems = operator.methodcaller("viewitems")

_add_doc(iterkeys, "Return an iterator over the keys of a dictionary.")
_add_doc(itervalues, "Return an iterator over the values of a dictionary.")
_add_doc(iteritems,
         "Return an iterator over the (key, value) pairs of a dictionary.")
_add_doc(iterlists,
         "Return an iterator over the (key, [values]) pairs of a dictionary.")


if PY3:
    def b(s):
        return s.encode("latin-1")

    def u(s):
        return s
    unichr = chr
    import struct
    int2byte = struct.Struct(">B").pack
    del struct
    byte2int = operator.itemgetter(0)
    indexbytes = operator.getitem
    iterbytes = iter
    import io
    StringIO = io.StringIO
    BytesIO = io.BytesIO
    _assertCountEqual = "assertCountEqual"
    if sys.version_info[1] <= 1:
        _assertRaisesRegex = "assertRaisesRegexp"
        _assertRegex = "assertRegexpMatches"
    else:
        _assertRaisesRegex = "assertRaisesRegex"
        _assertRegex = "assertRegex"
else:
    def b(s):
        return s
    # Workaround for standalone backslash

    def u(s):
        return unicode(s.replace(r'\\', r'\\\\'), "unicode_escape")
    unichr = unichr
    int2byte = chr

    def byte2int(bs):
        return ord(bs[0])

    def indexbytes(buf, i):
        return ord(buf[i])
    iterbytes = functools.partial(itertools.imap, ord)
    import StringIO
    StringIO = BytesIO = StringIO.StringIO
    _assertCountEqual = "assertItemsEqual"
    _assertRaisesRegex = "assertRaisesRegexp"
    _assertRegex = "assertRegexpMatches"
_add_doc(b, """Byte literal""")
_add_doc(u, """Text literal""")
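
# Illustrative usage (editor's sketch, not part of the original file): portable byte and
# text literals.
#
#     import six
#     data = six.b("\x00\x01payload")   # bytes on both Python 2 and 3
#     text = six.u("text literal")      # unicode/str text on both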


def assertCountEqual(self, *args, **kwargs):
    return getattr(self, _assertCountEqual)(*args, **kwargs)


def assertRaisesRegex(self, *args, **kwargs):
    return getattr(self, _assertRaisesRegex)(*args, **kwargs)


def assertRegex(self, *args, **kwargs):
    return getattr(self, _assertRegex)(*args, **kwargs)
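
# Illustrative usage (editor's sketch, not part of the original file): inside a
# unittest.TestCase these wrappers dispatch to whichever assertion name the running
# Python provides.
#
#     import unittest, six
#
#     class ExampleTest(unittest.TestCase):
#         def test_counts(self):
#             six.assertCountEqual(self, [1, 2, 2], [2, 1, 2])
#             six.assertRegex(self, "abc123", r"\d+")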


if PY3:
    exec_ = getattr(moves.builtins, "exec")

    def reraise(tp, value, tb=None):
        if value is None:
            value = tp()
        if value.__traceback__ is not tb:
            raise value.with_traceback(tb)
        raise value

else:
    def exec_(_code_, _globs_=None, _locs_=None):
        """Execute code in a namespace."""
        if _globs_ is None:
            frame = sys._getframe(1)
            _globs_ = frame.f_globals
            if _locs_ is None:
                _locs_ = frame.f_locals
            del frame
        elif _locs_ is None:
            _locs_ = _globs_
        exec("""exec _code_ in _globs_, _locs_""")

    exec_("""def reraise(tp, value, tb=None):
    raise tp, value, tb
""")


if sys.version_info[:2] == (3, 2):
    exec_("""def raise_from(value, from_value):
    if from_value is None:
        raise value
    raise value from from_value
""")
elif sys.version_info[:2] > (3, 2):
    exec_("""def raise_from(value, from_value):
    raise value from from_value
""")
else:
    def raise_from(value, from_value):
        raise value
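
# Illustrative usage (not part of six itself): raise_from() gives
# "raise X from Y" exception chaining on Python 3 and degrades gracefully
# on older versions:
#
#     try:
#         config["port"]
#     except KeyError as exc:
#         six.raise_from(ValueError("port is required"), exc)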


print_ = getattr(moves.builtins, "print", None)
if print_ is None:
    def print_(*args, **kwargs):
        """The new-style print function for Python 2.4 and 2.5."""
        fp = kwargs.pop("file", sys.stdout)
        if fp is None:
            return

        def write(data):
            if not isinstance(data, basestring):
                data = str(data)
            # If the file has an encoding, encode unicode with it.
            if (isinstance(fp, file) and
                    isinstance(data, unicode) and
                    fp.encoding is not None):
                errors = getattr(fp, "errors", None)
                if errors is None:
                    errors = "strict"
                data = data.encode(fp.encoding, errors)
            fp.write(data)
        want_unicode = False
        sep = kwargs.pop("sep", None)
        if sep is not None:
            if isinstance(sep, unicode):
                want_unicode = True
            elif not isinstance(sep, str):
                raise TypeError("sep must be None or a string")
        end = kwargs.pop("end", None)
        if end is not None:
            if isinstance(end, unicode):
                want_unicode = True
            elif not isinstance(end, str):
                raise TypeError("end must be None or a string")
        if kwargs:
            raise TypeError("invalid keyword arguments to print()")
        if not want_unicode:
            for arg in args:
                if isinstance(arg, unicode):
                    want_unicode = True
                    break
        if want_unicode:
            newline = unicode("\n")
            space = unicode(" ")
        else:
            newline = "\n"
            space = " "
        if sep is None:
            sep = space
        if end is None:
            end = newline
        for i, arg in enumerate(args):
            if i:
                write(sep)
            write(arg)
        write(end)
if sys.version_info[:2] < (3, 3):
    _print = print_

    def print_(*args, **kwargs):
        fp = kwargs.get("file", sys.stdout)
        flush = kwargs.pop("flush", False)
        _print(*args, **kwargs)
        if flush and fp is not None:
            fp.flush()

_add_doc(reraise, """Reraise an exception.""")

if sys.version_info[0:2] < (3, 4):
    def wraps(wrapped, assigned=functools.WRAPPER_ASSIGNMENTS,
              updated=functools.WRAPPER_UPDATES):
        def wrapper(f):
            f = functools.wraps(wrapped, assigned, updated)(f)
            f.__wrapped__ = wrapped
            return f
        return wrapper
else:
    wraps = functools.wraps


def with_metaclass(meta, *bases):
    """Create a base class with a metaclass."""
    # This requires a bit of explanation: the basic idea is to make a dummy
    # metaclass for one level of class instantiation that replaces itself with
    # the actual metaclass.
    class metaclass(meta):

        def __new__(cls, name, this_bases, d):
            return meta(name, bases, d)
    return type.__new__(metaclass, 'temporary_class', (), {})
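
# Illustrative usage (not part of six itself): with_metaclass() lets a single
# class statement work under both the Python 2 and Python 3 metaclass syntaxes:
#
#     class Meta(type):
#         pass
#
#     class MyClass(six.with_metaclass(Meta, object)):
#         pass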


def add_metaclass(metaclass):
    """Class decorator for creating a class with a metaclass."""
    def wrapper(cls):
        orig_vars = cls.__dict__.copy()
        slots = orig_vars.get('__slots__')
        if slots is not None:
            if isinstance(slots, str):
                slots = [slots]
            for slots_var in slots:
                orig_vars.pop(slots_var)
        orig_vars.pop('__dict__', None)
        orig_vars.pop('__weakref__', None)
        return metaclass(cls.__name__, cls.__bases__, orig_vars)
    return wrapper
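
# Illustrative usage (not part of six itself): the decorator form applies the
# metaclass without the temporary class that with_metaclass() creates:
#
#     @six.add_metaclass(Meta)
#     class MyClass(object):
#         pass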


def python_2_unicode_compatible(klass):
    """
    A decorator that defines __unicode__ and __str__ methods under Python 2.
    Under Python 3 it does nothing.

    To support Python 2 and 3 with a single code base, define a __str__ method
    returning text and apply this decorator to the class.
    """
    if PY2:
        if '__str__' not in klass.__dict__:
            raise ValueError("@python_2_unicode_compatible cannot be applied "
                             "to %s because it doesn't define __str__()." %
                             klass.__name__)
        klass.__unicode__ = klass.__str__
        klass.__str__ = lambda self: self.__unicode__().encode('utf-8')
    return klass
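
# Illustrative usage (not part of six itself): define __str__ to return text
# and let the decorator derive the Python 2 __unicode__/__str__ pair:
#
#     @six.python_2_unicode_compatible
#     class Greeting(object):
#         def __str__(self):
#             return u'Hello, World!'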


# Complete the moves implementation.
# This code is at the end of this module to speed up module loading.
# Turn this module into a package.
__path__ = []  # required for PEP 302 and PEP 451
__package__ = __name__  # see PEP 366 @ReservedAssignment
if globals().get("__spec__") is not None:
    __spec__.submodule_search_locations = []  # PEP 451 @UndefinedVariable
# Remove other six meta path importers, since they cause problems. This can
# happen if six is removed from sys.modules and then reloaded. (Setuptools does
# this for some reason.)
if sys.meta_path:
    for i, importer in enumerate(sys.meta_path):
        # Here's some real nastiness: Another "instance" of the six module might
        # be floating around. Therefore, we can't use isinstance() to check for
        # the six meta path importer, since the other six instance will have
        # inserted an importer with a different class.
        if (type(importer).__name__ == "_SixMetaPathImporter" and
                importer.name == __name__):
            del sys.meta_path[i]
            break
    del i, importer
# Finally, add the importer to the meta path import hook.
sys.meta_path.append(_importer)
site-packages/pkg_resources/_vendor/appdirs.py000064400000053546147511334660015636 0ustar00#!/usr/bin/env python
# -*- coding: utf-8 -*-
# Copyright (c) 2005-2010 ActiveState Software Inc.
# Copyright (c) 2013 Eddy Petrișor

"""Utilities for determining application-specific dirs.

See <http://github.com/ActiveState/appdirs> for details and usage.
"""
# Dev Notes:
# - MSDN on where to store app data files:
#   http://support.microsoft.com/default.aspx?scid=kb;en-us;310294#XSLTH3194121123120121120120
# - Mac OS X: http://developer.apple.com/documentation/MacOSX/Conceptual/BPFileSystem/index.html
# - XDG spec for Un*x: http://standards.freedesktop.org/basedir-spec/basedir-spec-latest.html

__version_info__ = (1, 4, 0)
__version__ = '.'.join(map(str, __version_info__))


import sys
import os

PY3 = sys.version_info[0] == 3

if PY3:
    unicode = str

if sys.platform.startswith('java'):
    import platform
    os_name = platform.java_ver()[3][0]
    if os_name.startswith('Windows'): # "Windows XP", "Windows 7", etc.
        system = 'win32'
    elif os_name.startswith('Mac'): # "Mac OS X", etc.
        system = 'darwin'
    else: # "Linux", "SunOS", "FreeBSD", etc.
        # Setting this to "linux2" is not ideal, but only Windows or Mac
        # are actually checked for and the rest of the module expects
        # *sys.platform* style strings.
        system = 'linux2'
else:
    system = sys.platform



def user_data_dir(appname=None, appauthor=None, version=None, roaming=False):
    r"""Return full path to the user-specific data dir for this application.

        "appname" is the name of application.
            If None, just the system directory is returned.
        "appauthor" (only used on Windows) is the name of the
            appauthor or distributing body for this application. Typically
            it is the owning company name. This falls back to appname. You may
            pass False to disable it.
        "version" is an optional version path element to append to the
            path. You might want to use this if you want multiple versions
            of your app to be able to run independently. If used, this
            would typically be "<major>.<minor>".
            Only applied when appname is present.
        "roaming" (boolean, default False) can be set True to use the Windows
            roaming appdata directory. That means that for users on a Windows
            network setup for roaming profiles, this user data will be
            sync'd on login. See
            <http://technet.microsoft.com/en-us/library/cc766489(WS.10).aspx>
            for a discussion of issues.

    Typical user data directories are:
        Mac OS X:               ~/Library/Application Support/<AppName>
        Unix:                   ~/.local/share/<AppName>    # or in $XDG_DATA_HOME, if defined
        Win XP (not roaming):   C:\Documents and Settings\<username>\Local Settings\Application Data\<AppAuthor>\<AppName>
        Win XP (roaming):       C:\Documents and Settings\<username>\Application Data\<AppAuthor>\<AppName>
        Win 7  (not roaming):   C:\Users\<username>\AppData\Local\<AppAuthor>\<AppName>
        Win 7  (roaming):       C:\Users\<username>\AppData\Roaming\<AppAuthor>\<AppName>

    For Unix, we follow the XDG spec and support $XDG_DATA_HOME.
    That means, by default "~/.local/share/<AppName>".
    """
    if system == "win32":
        if appauthor is None:
            appauthor = appname
        const = roaming and "CSIDL_APPDATA" or "CSIDL_LOCAL_APPDATA"
        path = os.path.normpath(_get_win_folder(const))
        if appname:
            if appauthor is not False:
                path = os.path.join(path, appauthor, appname)
            else:
                path = os.path.join(path, appname)
    elif system == 'darwin':
        path = os.path.expanduser('~/Library/Application Support/')
        if appname:
            path = os.path.join(path, appname)
    else:
        path = os.getenv('XDG_DATA_HOME', os.path.expanduser("~/.local/share"))
        if appname:
            path = os.path.join(path, appname)
    if appname and version:
        path = os.path.join(path, version)
    return path
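
# Illustrative calls (not part of appdirs itself); actual results depend on
# the platform and on environment variables such as XDG_DATA_HOME:
#
#     user_data_dir("SuperApp", "Acme")          # Linux: ~/.local/share/SuperApp (expanded)
#     user_data_dir("SuperApp", "Acme", "1.0")   # Linux: ~/.local/share/SuperApp/1.0
#     # Windows additionally inserts the <AppAuthor> segment, e.g.
#     # C:\Users\<username>\AppData\Local\Acme\SuperApp\1.0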


def site_data_dir(appname=None, appauthor=None, version=None, multipath=False):
    """Return full path to the user-shared data dir for this application.

        "appname" is the name of application.
            If None, just the system directory is returned.
        "appauthor" (only used on Windows) is the name of the
            appauthor or distributing body for this application. Typically
            it is the owning company name. This falls back to appname. You may
            pass False to disable it.
        "version" is an optional version path element to append to the
            path. You might want to use this if you want multiple versions
            of your app to be able to run independently. If used, this
            would typically be "<major>.<minor>".
            Only applied when appname is present.
        "multipath" is an optional parameter only applicable to *nix
            which indicates that the entire list of data dirs should be
            returned. By default, the first item from XDG_DATA_DIRS is
            returned, or '/usr/local/share/<AppName>',
            if XDG_DATA_DIRS is not set

    Typical site data directories are:
        Mac OS X:   /Library/Application Support/<AppName>
        Unix:       /usr/local/share/<AppName> or /usr/share/<AppName>
        Win XP:     C:\Documents and Settings\All Users\Application Data\<AppAuthor>\<AppName>
        Vista:      (Fail! "C:\ProgramData" is a hidden *system* directory on Vista.)
        Win 7:      C:\ProgramData\<AppAuthor>\<AppName>   # Hidden, but writeable on Win 7.

    For Unix, this is using the $XDG_DATA_DIRS[0] default.

    WARNING: Do not use this on Windows. See the Vista-Fail note above for why.
    """
    if system == "win32":
        if appauthor is None:
            appauthor = appname
        path = os.path.normpath(_get_win_folder("CSIDL_COMMON_APPDATA"))
        if appname:
            if appauthor is not False:
                path = os.path.join(path, appauthor, appname)
            else:
                path = os.path.join(path, appname)
    elif system == 'darwin':
        path = os.path.expanduser('/Library/Application Support')
        if appname:
            path = os.path.join(path, appname)
    else:
        # XDG default for $XDG_DATA_DIRS
        # only first, if multipath is False
        path = os.getenv('XDG_DATA_DIRS',
                         os.pathsep.join(['/usr/local/share', '/usr/share']))
        pathlist = [os.path.expanduser(x.rstrip(os.sep)) for x in path.split(os.pathsep)]
        if appname:
            if version:
                appname = os.path.join(appname, version)
            pathlist = [os.sep.join([x, appname]) for x in pathlist]

        if multipath:
            path = os.pathsep.join(pathlist)
        else:
            path = pathlist[0]
        return path

    if appname and version:
        path = os.path.join(path, version)
    return path


def user_config_dir(appname=None, appauthor=None, version=None, roaming=False):
    r"""Return full path to the user-specific config dir for this application.

        "appname" is the name of application.
            If None, just the system directory is returned.
        "appauthor" (only used on Windows) is the name of the
            appauthor or distributing body for this application. Typically
            it is the owning company name. This falls back to appname. You may
            pass False to disable it.
        "version" is an optional version path element to append to the
            path. You might want to use this if you want multiple versions
            of your app to be able to run independently. If used, this
            would typically be "<major>.<minor>".
            Only applied when appname is present.
        "roaming" (boolean, default False) can be set True to use the Windows
            roaming appdata directory. That means that for users on a Windows
            network setup for roaming profiles, this user data will be
            sync'd on login. See
            <http://technet.microsoft.com/en-us/library/cc766489(WS.10).aspx>
            for a discussion of issues.

    Typical user config directories are:
        Mac OS X:               same as user_data_dir
        Unix:                   ~/.config/<AppName>     # or in $XDG_CONFIG_HOME, if defined
        Win *:                  same as user_data_dir

    For Unix, we follow the XDG spec and support $XDG_CONFIG_HOME.
    That means, by default "~/.config/<AppName>".
    """
    if system in ["win32", "darwin"]:
        path = user_data_dir(appname, appauthor, None, roaming)
    else:
        path = os.getenv('XDG_CONFIG_HOME', os.path.expanduser("~/.config"))
        if appname:
            path = os.path.join(path, appname)
    if appname and version:
        path = os.path.join(path, version)
    return path


def site_config_dir(appname=None, appauthor=None, version=None, multipath=False):
    """Return full path to the user-shared data dir for this application.

        "appname" is the name of application.
            If None, just the system directory is returned.
        "appauthor" (only used on Windows) is the name of the
            appauthor or distributing body for this application. Typically
            it is the owning company name. This falls back to appname. You may
            pass False to disable it.
        "version" is an optional version path element to append to the
            path. You might want to use this if you want multiple versions
            of your app to be able to run independently. If used, this
            would typically be "<major>.<minor>".
            Only applied when appname is present.
        "multipath" is an optional parameter only applicable to *nix
            which indicates that the entire list of config dirs should be
            returned. By default, the first item from XDG_CONFIG_DIRS is
            returned, or '/etc/xdg/<AppName>', if XDG_CONFIG_DIRS is not set

    Typical site config directories are:
        Mac OS X:   same as site_data_dir
        Unix:       /etc/xdg/<AppName> or $XDG_CONFIG_DIRS[i]/<AppName> for each value in
                    $XDG_CONFIG_DIRS
        Win *:      same as site_data_dir
        Vista:      (Fail! "C:\ProgramData" is a hidden *system* directory on Vista.)

    For Unix, this is using the $XDG_CONFIG_DIRS[0] default, if multipath=False

    WARNING: Do not use this on Windows. See the Vista-Fail note above for why.
    """
    if system in ["win32", "darwin"]:
        path = site_data_dir(appname, appauthor)
        if appname and version:
            path = os.path.join(path, version)
    else:
        # XDG default for $XDG_CONFIG_DIRS
        # only first, if multipath is False
        path = os.getenv('XDG_CONFIG_DIRS', '/etc/xdg')
        pathlist = [os.path.expanduser(x.rstrip(os.sep)) for x in path.split(os.pathsep)]
        if appname:
            if version:
                appname = os.path.join(appname, version)
            pathlist = [os.sep.join([x, appname]) for x in pathlist]

        if multipath:
            path = os.pathsep.join(pathlist)
        else:
            path = pathlist[0]
    return path


def user_cache_dir(appname=None, appauthor=None, version=None, opinion=True):
    r"""Return full path to the user-specific cache dir for this application.

        "appname" is the name of application.
            If None, just the system directory is returned.
        "appauthor" (only used on Windows) is the name of the
            appauthor or distributing body for this application. Typically
            it is the owning company name. This falls back to appname. You may
            pass False to disable it.
        "version" is an optional version path element to append to the
            path. You might want to use this if you want multiple versions
            of your app to be able to run independently. If used, this
            would typically be "<major>.<minor>".
            Only applied when appname is present.
        "opinion" (boolean) can be False to disable the appending of
            "Cache" to the base app data dir for Windows. See
            discussion below.

    Typical user cache directories are:
        Mac OS X:   ~/Library/Caches/<AppName>
        Unix:       ~/.cache/<AppName> (XDG default)
        Win XP:     C:\Documents and Settings\<username>\Local Settings\Application Data\<AppAuthor>\<AppName>\Cache
        Vista:      C:\Users\<username>\AppData\Local\<AppAuthor>\<AppName>\Cache

    On Windows the only suggestion in the MSDN docs is that local settings go in
    the `CSIDL_LOCAL_APPDATA` directory. This is identical to the non-roaming
    app data dir (the default returned by `user_data_dir` above). Apps typically
    put cache data somewhere *under* the given dir here. Some examples:
        ...\Mozilla\Firefox\Profiles\<ProfileName>\Cache
        ...\Acme\SuperApp\Cache\1.0
    OPINION: This function appends "Cache" to the `CSIDL_LOCAL_APPDATA` value.
    This can be disabled with the `opinion=False` option.
    """
    if system == "win32":
        if appauthor is None:
            appauthor = appname
        path = os.path.normpath(_get_win_folder("CSIDL_LOCAL_APPDATA"))
        if appname:
            if appauthor is not False:
                path = os.path.join(path, appauthor, appname)
            else:
                path = os.path.join(path, appname)
            if opinion:
                path = os.path.join(path, "Cache")
    elif system == 'darwin':
        path = os.path.expanduser('~/Library/Caches')
        if appname:
            path = os.path.join(path, appname)
    else:
        path = os.getenv('XDG_CACHE_HOME', os.path.expanduser('~/.cache'))
        if appname:
            path = os.path.join(path, appname)
    if appname and version:
        path = os.path.join(path, version)
    return path


def user_log_dir(appname=None, appauthor=None, version=None, opinion=True):
    r"""Return full path to the user-specific log dir for this application.

        "appname" is the name of application.
            If None, just the system directory is returned.
        "appauthor" (only used on Windows) is the name of the
            appauthor or distributing body for this application. Typically
            it is the owning company name. This falls back to appname. You may
            pass False to disable it.
        "version" is an optional version path element to append to the
            path. You might want to use this if you want multiple versions
            of your app to be able to run independently. If used, this
            would typically be "<major>.<minor>".
            Only applied when appname is present.
        "opinion" (boolean) can be False to disable the appending of
            "Logs" to the base app data dir for Windows, and "log" to the
            base cache dir for Unix. See discussion below.

    Typical user log directories are:
        Mac OS X:   ~/Library/Logs/<AppName>
        Unix:       ~/.cache/<AppName>/log  # or under $XDG_CACHE_HOME if defined
        Win XP:     C:\Documents and Settings\<username>\Local Settings\Application Data\<AppAuthor>\<AppName>\Logs
        Vista:      C:\Users\<username>\AppData\Local\<AppAuthor>\<AppName>\Logs

    On Windows the only suggestion in the MSDN docs is that local settings
    go in the `CSIDL_LOCAL_APPDATA` directory. (Note: I'm interested in
    examples of what some Windows apps use for a logs dir.)

    OPINION: This function appends "Logs" to the `CSIDL_LOCAL_APPDATA`
    value for Windows and appends "log" to the user cache dir for Unix.
    This can be disabled with the `opinion=False` option.
    """
    if system == "darwin":
        path = os.path.join(
            os.path.expanduser('~/Library/Logs'),
            appname)
    elif system == "win32":
        path = user_data_dir(appname, appauthor, version)
        version = False
        if opinion:
            path = os.path.join(path, "Logs")
    else:
        path = user_cache_dir(appname, appauthor, version)
        version = False
        if opinion:
            path = os.path.join(path, "log")
    if appname and version:
        path = os.path.join(path, version)
    return path


class AppDirs(object):
    """Convenience wrapper for getting application dirs."""
    def __init__(self, appname, appauthor=None, version=None, roaming=False,
                 multipath=False):
        self.appname = appname
        self.appauthor = appauthor
        self.version = version
        self.roaming = roaming
        self.multipath = multipath

    @property
    def user_data_dir(self):
        return user_data_dir(self.appname, self.appauthor,
                             version=self.version, roaming=self.roaming)

    @property
    def site_data_dir(self):
        return site_data_dir(self.appname, self.appauthor,
                             version=self.version, multipath=self.multipath)

    @property
    def user_config_dir(self):
        return user_config_dir(self.appname, self.appauthor,
                               version=self.version, roaming=self.roaming)

    @property
    def site_config_dir(self):
        return site_config_dir(self.appname, self.appauthor,
                             version=self.version, multipath=self.multipath)

    @property
    def user_cache_dir(self):
        return user_cache_dir(self.appname, self.appauthor,
                              version=self.version)

    @property
    def user_log_dir(self):
        return user_log_dir(self.appname, self.appauthor,
                            version=self.version)
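
# Illustrative usage of the wrapper (not part of appdirs itself); the
# self-test block at the bottom of this module prints every property:
#
#     dirs = AppDirs("SuperApp", "Acme", version="1.0")
#     dirs.user_data_dir     # e.g. ~/.local/share/SuperApp/1.0 on Linux
#     dirs.user_cache_dir    # e.g. ~/.cache/SuperApp/1.0 on Linux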


#---- internal support stuff

def _get_win_folder_from_registry(csidl_name):
    """This is a fallback technique at best. I'm not sure if using the
    registry for this guarantees us the correct answer for all CSIDL_*
    names.
    """
    import _winreg

    shell_folder_name = {
        "CSIDL_APPDATA": "AppData",
        "CSIDL_COMMON_APPDATA": "Common AppData",
        "CSIDL_LOCAL_APPDATA": "Local AppData",
    }[csidl_name]

    key = _winreg.OpenKey(
        _winreg.HKEY_CURRENT_USER,
        r"Software\Microsoft\Windows\CurrentVersion\Explorer\Shell Folders"
    )
    dir, type = _winreg.QueryValueEx(key, shell_folder_name)
    return dir


def _get_win_folder_with_pywin32(csidl_name):
    from win32com.shell import shellcon, shell
    dir = shell.SHGetFolderPath(0, getattr(shellcon, csidl_name), 0, 0)
    # Try to make this a unicode path because SHGetFolderPath does
    # not return unicode strings when there is unicode data in the
    # path.
    try:
        dir = unicode(dir)

        # Downgrade to short path name if have highbit chars. See
        # <http://bugs.activestate.com/show_bug.cgi?id=85099>.
        has_high_char = False
        for c in dir:
            if ord(c) > 255:
                has_high_char = True
                break
        if has_high_char:
            try:
                import win32api
                dir = win32api.GetShortPathName(dir)
            except ImportError:
                pass
    except UnicodeError:
        pass
    return dir


def _get_win_folder_with_ctypes(csidl_name):
    import ctypes

    csidl_const = {
        "CSIDL_APPDATA": 26,
        "CSIDL_COMMON_APPDATA": 35,
        "CSIDL_LOCAL_APPDATA": 28,
    }[csidl_name]

    buf = ctypes.create_unicode_buffer(1024)
    ctypes.windll.shell32.SHGetFolderPathW(None, csidl_const, None, 0, buf)

    # Downgrade to short path name if have highbit chars. See
    # <http://bugs.activestate.com/show_bug.cgi?id=85099>.
    has_high_char = False
    for c in buf:
        if ord(c) > 255:
            has_high_char = True
            break
    if has_high_char:
        buf2 = ctypes.create_unicode_buffer(1024)
        if ctypes.windll.kernel32.GetShortPathNameW(buf.value, buf2, 1024):
            buf = buf2

    return buf.value

def _get_win_folder_with_jna(csidl_name):
    import array
    from com.sun import jna
    from com.sun.jna.platform import win32

    buf_size = win32.WinDef.MAX_PATH * 2
    buf = array.zeros('c', buf_size)
    shell = win32.Shell32.INSTANCE
    shell.SHGetFolderPath(None, getattr(win32.ShlObj, csidl_name), None, win32.ShlObj.SHGFP_TYPE_CURRENT, buf)
    dir = jna.Native.toString(buf.tostring()).rstrip("\0")

    # Downgrade to short path name if have highbit chars. See
    # <http://bugs.activestate.com/show_bug.cgi?id=85099>.
    has_high_char = False
    for c in dir:
        if ord(c) > 255:
            has_high_char = True
            break
    if has_high_char:
        buf = array.zeros('c', buf_size)
        kernel = win32.Kernel32.INSTANCE
        if kernel.GetShortPathName(dir, buf, buf_size):
            dir = jna.Native.toString(buf.tostring()).rstrip("\0")

    return dir

if system == "win32":
    try:
        import win32com.shell
        _get_win_folder = _get_win_folder_with_pywin32
    except ImportError:
        try:
            from ctypes import windll
            _get_win_folder = _get_win_folder_with_ctypes
        except ImportError:
            try:
                import com.sun.jna
                _get_win_folder = _get_win_folder_with_jna
            except ImportError:
                _get_win_folder = _get_win_folder_from_registry


#---- self test code

if __name__ == "__main__":
    appname = "MyApp"
    appauthor = "MyCompany"

    props = ("user_data_dir", "site_data_dir",
             "user_config_dir", "site_config_dir",
             "user_cache_dir", "user_log_dir")

    print("-- app dirs (with optional 'version')")
    dirs = AppDirs(appname, appauthor, version="1.0")
    for prop in props:
        print("%s: %s" % (prop, getattr(dirs, prop)))

    print("\n-- app dirs (without optional 'version')")
    dirs = AppDirs(appname, appauthor)
    for prop in props:
        print("%s: %s" % (prop, getattr(dirs, prop)))

    print("\n-- app dirs (without optional 'appauthor')")
    dirs = AppDirs(appname)
    for prop in props:
        print("%s: %s" % (prop, getattr(dirs, prop)))

    print("\n-- app dirs (with disabled 'appauthor')")
    dirs = AppDirs(appname, appauthor=False)
    for prop in props:
        print("%s: %s" % (prop, getattr(dirs, prop)))
site-packages/pkg_resources/_vendor/__init__.py000064400000000000147511334660015704 0ustar00site-packages/pkg_resources/_vendor/__pycache__/pyparsing.cpython-36.pyc000064400000610513147511334660022465 0ustar003

��f��@s�dZdZdZdZddlZddlmZddlZddl	Z	ddl
Z
ddlZddlZddl
Z
ddlZddlZddlZddlmZyddlmZWn ek
r�ddlmZYnXydd	l
mZWn>ek
r�ydd	lmZWnek
r�dZYnXYnXd
ddd
ddddddddddddddddddd d!d"d#d$d%d&d'd(d)d*d+d,d-d.d/d0d1d2d3d4d5d6d7d8d9d:d;d<d=d>d?d@dAdBdCdDdEdFdGdHdIdJdKdLdMdNdOdPdQdRdSdTdUdVdWdXdYdZd[d\d]d^d_d`dadbdcdddedfdgdhdidjdkdldmdndodpdqdrgiZee	j�dds�ZeddskZe�r"e	jZe Z!e"Z#e Z$e%e&e'e(e)ee*e+e,e-e.gZ/nbe	j0Ze1Z2dtdu�Z$gZ/ddl3Z3xBdvj4�D]6Z5ye/j6e7e3e5��Wne8k
�r|�wJYnX�qJWe9dwdx�e2dy�D��Z:dzd{�Z;Gd|d}�d}e<�Z=ej>ej?Z@d~ZAeAdZBe@eAZCe"d��ZDd�jEd�dx�ejFD��ZGGd�d!�d!eH�ZIGd�d#�d#eI�ZJGd�d%�d%eI�ZKGd�d'�d'eK�ZLGd�d*�d*eH�ZMGd�d��d�e<�ZNGd�d&�d&e<�ZOe
jPjQeO�d�d=�ZRd�dN�ZSd�dK�ZTd�d��ZUd�d��ZVd�d��ZWd�dU�ZX�d/d�d��ZYGd�d(�d(e<�ZZGd�d0�d0eZ�Z[Gd�d�de[�Z\Gd�d�de[�Z]Gd�d�de[�Z^e^Z_e^eZ_`Gd�d�de[�ZaGd�d�de^�ZbGd�d�dea�ZcGd�dp�dpe[�ZdGd�d3�d3e[�ZeGd�d+�d+e[�ZfGd�d)�d)e[�ZgGd�d
�d
e[�ZhGd�d2�d2e[�ZiGd�d��d�e[�ZjGd�d�dej�ZkGd�d�dej�ZlGd�d�dej�ZmGd�d.�d.ej�ZnGd�d-�d-ej�ZoGd�d5�d5ej�ZpGd�d4�d4ej�ZqGd�d$�d$eZ�ZrGd�d
�d
er�ZsGd�d �d er�ZtGd�d�der�ZuGd�d�der�ZvGd�d"�d"eZ�ZwGd�d�dew�ZxGd�d�dew�ZyGd�d��d�ew�ZzGd�d�dez�Z{Gd�d6�d6ez�Z|Gd�d��d�e<�Z}e}�Z~Gd�d�dew�ZGd�d,�d,ew�Z�Gd�d�dew�Z�Gd�d��d�e��Z�Gd�d1�d1ew�Z�Gd�d�de��Z�Gd�d�de��Z�Gd�d�de��Z�Gd�d/�d/e��Z�Gd�d�de<�Z�d�df�Z��d0d�dD�Z��d1d�d@�Z�d�d΄Z�d�dS�Z�d�dR�Z�d�d҄Z��d2d�dW�Z�d�dE�Z��d3d�dk�Z�d�dl�Z�d�dn�Z�e\�j�dG�Z�el�j�dM�Z�em�j�dL�Z�en�j�de�Z�eo�j�dd�Z�eeeDd�d�dڍj�d�d܄�Z�efd݃j�d�d܄�Z�efd߃j�d�d܄�Z�e�e�Be�BeeeGd�dyd�Befd�ej��BZ�e�e�e�d�e��Z�e^d�ed�j�d�e�e{e�e�B��j�d�d�Z�d�dc�Z�d�dQ�Z�d�d`�Z�d�d^�Z�d�dq�Z�e�d�d܄�Z�e�d�d܄�Z�d�d�Z�d�dO�Z�d�dP�Z�d�di�Z�e<�e�_��d4d�do�Z�e=�Z�e<�e�_�e<�e�_�e�d��e�d��fd�dm�Z�e�Z�e�efd��d��j�d��Z�e�efd��d��j�d��Z�e�efd��d�efd��d�B�j��d�Z�e�e_�d�e�j��j��d�Z�d�d�de�j�f�ddT�Z��d5�ddj�Z�e��d�Z�e��d�Z�e�eee@eC�d�j��d��\Z�Z�e�e��d	j4��d
��Z�ef�d�djEe�j��d
�j��d�ZĐdd_�Z�e�ef�d��d�j��d�Z�ef�d�j��d�Z�ef�d�jȃj��d�Z�ef�d�j��d�Z�e�ef�d��de�B�j��d�Z�e�Z�ef�d�j��d�Z�e�e{eeeGdɐd�eee�d�e^dɃem����j΃j��d�Z�e�ee�j�e�Bd��d��j�d>�Z�G�d dr�dr�Z�eҐd!k�r�eb�d"�Z�eb�d#�Z�eee@eC�d$�Z�e�eՐd%dӐd&�j�e��Z�e�e�eփ�j��d'�Zאd(e�BZ�e�eՐd%dӐd&�j�e��Z�e�e�eك�j��d)�Z�eӐd*�eؐd'�e�eڐd)�Z�e�jܐd+�e�j�jܐd,�e�j�jܐd,�e�j�jܐd-�ddl�Z�e�j�j�e�e�j��e�j�jܐd.�dS(6aS
pyparsing module - Classes and methods to define and execute parsing grammars

The pyparsing module is an alternative approach to creating and executing simple grammars,
vs. the traditional lex/yacc approach, or the use of regular expressions.  With pyparsing, you
don't need to learn a new syntax for defining grammars or matching expressions - the parsing module
provides a library of classes that you use to construct the grammar directly in Python.

Here is a program to parse "Hello, World!" (or any greeting of the form 
C{"<salutation>, <addressee>!"}), built up using L{Word}, L{Literal}, and L{And} elements 
(L{'+'<ParserElement.__add__>} operator gives L{And} expressions, strings are auto-converted to
L{Literal} expressions)::

    from pyparsing import Word, alphas

    # define grammar of a greeting
    greet = Word(alphas) + "," + Word(alphas) + "!"

    hello = "Hello, World!"
    print (hello, "->", greet.parseString(hello))

The program outputs the following::

    Hello, World! -> ['Hello', ',', 'World', '!']

The Python representation of the grammar is quite readable, owing to the self-explanatory
class names, and the use of '+', '|' and '^' operators.

The L{ParseResults} object returned from L{ParserElement.parseString<ParserElement.parseString>} can be accessed as a nested list, a dictionary, or an
object with named attributes.

The pyparsing module handles some of the problems that are typically vexing when writing text parsers:
 - extra or missing whitespace (the above program will also handle "Hello,World!", "Hello  ,  World  !", etc.)
 - quoted strings
 - embedded comments
z2.1.10z07 Oct 2016 01:31 UTCz*Paul McGuire <ptmcg@users.sourceforge.net>�N)�ref)�datetime)�RLock)�OrderedDict�And�CaselessKeyword�CaselessLiteral�
CharsNotIn�Combine�Dict�Each�Empty�
FollowedBy�Forward�
GoToColumn�Group�Keyword�LineEnd�	LineStart�Literal�
MatchFirst�NoMatch�NotAny�	OneOrMore�OnlyOnce�Optional�Or�ParseBaseException�ParseElementEnhance�ParseException�ParseExpression�ParseFatalException�ParseResults�ParseSyntaxException�
ParserElement�QuotedString�RecursiveGrammarException�Regex�SkipTo�	StringEnd�StringStart�Suppress�Token�TokenConverter�White�Word�WordEnd�	WordStart�
ZeroOrMore�	alphanums�alphas�
alphas8bit�anyCloseTag�
anyOpenTag�
cStyleComment�col�commaSeparatedList�commonHTMLEntity�countedArray�cppStyleComment�dblQuotedString�dblSlashComment�
delimitedList�dictOf�downcaseTokens�empty�hexnums�htmlComment�javaStyleComment�line�lineEnd�	lineStart�lineno�makeHTMLTags�makeXMLTags�matchOnlyAtCol�matchPreviousExpr�matchPreviousLiteral�
nestedExpr�nullDebugAction�nums�oneOf�opAssoc�operatorPrecedence�
printables�punc8bit�pythonStyleComment�quotedString�removeQuotes�replaceHTMLEntity�replaceWith�
restOfLine�sglQuotedString�srange�	stringEnd�stringStart�traceParseAction�
unicodeString�upcaseTokens�
withAttribute�
indentedBlock�originalTextFor�ungroup�
infixNotation�locatedExpr�	withClass�
CloseMatch�tokenMap�pyparsing_common�cCs`t|t�r|Syt|�Stk
rZt|�jtj�d�}td�}|jdd��|j	|�SXdS)aDrop-in replacement for str(obj) that tries to be Unicode friendly. It first tries
           str(obj). If that fails with a UnicodeEncodeError, then it tries unicode(obj). It
           then < returns the unicode object | encodes it with the default encoding | ... >.
        �xmlcharrefreplacez&#\d+;cSs$dtt|ddd���dd�S)Nz\ur�����)�hex�int)�t�rw�/usr/lib/python3.6/pyparsing.py�<lambda>�sz_ustr.<locals>.<lambda>N)
�
isinstanceZunicode�str�UnicodeEncodeError�encode�sys�getdefaultencodingr'�setParseAction�transformString)�obj�retZ
xmlcharrefrwrwrx�_ustr�s
r�z6sum len sorted reversed list tuple set any all min maxccs|]
}|VqdS)Nrw)�.0�yrwrwrx�	<genexpr>�sr�rrcCs>d}dd�dj�D�}x"t||�D]\}}|j||�}q"W|S)z/Escape &, <, >, ", ', etc. in a string of data.z&><"'css|]}d|dVqdS)�&�;Nrw)r��srwrwrxr��sz_xml_escape.<locals>.<genexpr>zamp gt lt quot apos)�split�zip�replace)�dataZfrom_symbolsZ
to_symbolsZfrom_Zto_rwrwrx�_xml_escape�s
r�c@seZdZdS)�
_ConstantsN)�__name__�
__module__�__qualname__rwrwrwrxr��sr��
0123456789ZABCDEFabcdef�\�ccs|]}|tjkr|VqdS)N)�stringZ
whitespace)r��crwrwrxr��sc@sPeZdZdZddd�Zedd��Zdd	�Zd
d�Zdd
�Z	ddd�Z
dd�ZdS)rz7base exception class for all parsing runtime exceptionsrNcCs>||_|dkr||_d|_n||_||_||_|||f|_dS)Nr�)�loc�msg�pstr�
parserElement�args)�selfr�r�r��elemrwrwrx�__init__�szParseBaseException.__init__cCs||j|j|j|j�S)z�
        internal factory method to simplify creating one type of ParseException 
        from another - avoids having __init__ signature conflicts among subclasses
        )r�r�r�r�)�cls�perwrwrx�_from_exception�sz"ParseBaseException._from_exceptioncCsN|dkrt|j|j�S|dkr,t|j|j�S|dkrBt|j|j�St|��dS)z�supported attributes by name are:
            - lineno - returns the line number of the exception text
            - col - returns the column number of the exception text
            - line - returns the line containing the exception text
        rJr9�columnrGN)r9r�)rJr�r�r9rG�AttributeError)r�Zanamerwrwrx�__getattr__�szParseBaseException.__getattr__cCsd|j|j|j|jfS)Nz"%s (at char %d), (line:%d, col:%d))r�r�rJr�)r�rwrwrx�__str__�szParseBaseException.__str__cCst|�S)N)r�)r�rwrwrx�__repr__�szParseBaseException.__repr__�>!<cCs<|j}|jd}|r4dj|d|�|||d�f�}|j�S)z�Extracts the exception line from the input string, and marks
           the location of the exception with a special symbol.
        rrr�N)rGr��join�strip)r�ZmarkerStringZline_strZline_columnrwrwrx�
markInputline�s
z ParseBaseException.markInputlinecCsdj�tt|��S)Nzlineno col line)r��dir�type)r�rwrwrx�__dir__�szParseBaseException.__dir__)rNN)r�)r�r�r��__doc__r��classmethodr�r�r�r�r�r�rwrwrwrxr�s


c@seZdZdZdS)raN
    Exception thrown when parse expressions don't match class;
    supported attributes by name are:
     - lineno - returns the line number of the exception text
     - col - returns the column number of the exception text
     - line - returns the line containing the exception text
        
    Example::
        try:
            Word(nums).setName("integer").parseString("ABC")
        except ParseException as pe:
            print(pe)
            print("column: {}".format(pe.col))
            
    prints::
       Expected integer (at char 0), (line:1, col:1)
        column: 1
    N)r�r�r�r�rwrwrwrxr�sc@seZdZdZdS)r!znuser-throwable exception thrown when inconsistent parse content
       is found; stops all parsing immediatelyN)r�r�r�r�rwrwrwrxr!sc@seZdZdZdS)r#z�just like L{ParseFatalException}, but thrown internally when an
       L{ErrorStop<And._ErrorStop>} ('-' operator) indicates that parsing is to stop 
       immediately because an unbacktrackable syntax error has been foundN)r�r�r�r�rwrwrwrxr#sc@s eZdZdZdd�Zdd�ZdS)r&zZexception thrown by L{ParserElement.validate} if the grammar could be improperly recursivecCs
||_dS)N)�parseElementTrace)r��parseElementListrwrwrxr�sz"RecursiveGrammarException.__init__cCs
d|jS)NzRecursiveGrammarException: %s)r�)r�rwrwrxr� sz!RecursiveGrammarException.__str__N)r�r�r�r�r�r�rwrwrwrxr&sc@s,eZdZdd�Zdd�Zdd�Zdd�Zd	S)
�_ParseResultsWithOffsetcCs||f|_dS)N)�tup)r�Zp1Zp2rwrwrxr�$sz _ParseResultsWithOffset.__init__cCs
|j|S)N)r�)r��irwrwrx�__getitem__&sz#_ParseResultsWithOffset.__getitem__cCst|jd�S)Nr)�reprr�)r�rwrwrxr�(sz _ParseResultsWithOffset.__repr__cCs|jd|f|_dS)Nr)r�)r�r�rwrwrx�	setOffset*sz!_ParseResultsWithOffset.setOffsetN)r�r�r�r�r�r�r�rwrwrwrxr�#sr�c@s�eZdZdZd[dd�Zddddefdd�Zdd	�Zefd
d�Zdd
�Z	dd�Z
dd�Zdd�ZeZ
dd�Zdd�Zdd�Zdd�Zdd�Zer�eZeZeZn$eZeZeZdd�Zd d!�Zd"d#�Zd$d%�Zd&d'�Zd\d(d)�Zd*d+�Zd,d-�Zd.d/�Zd0d1�Z d2d3�Z!d4d5�Z"d6d7�Z#d8d9�Z$d:d;�Z%d<d=�Z&d]d?d@�Z'dAdB�Z(dCdD�Z)dEdF�Z*d^dHdI�Z+dJdK�Z,dLdM�Z-d_dOdP�Z.dQdR�Z/dSdT�Z0dUdV�Z1dWdX�Z2dYdZ�Z3dS)`r"aI
    Structured parse results, to provide multiple means of access to the parsed data:
       - as a list (C{len(results)})
       - by list index (C{results[0], results[1]}, etc.)
       - by attribute (C{results.<resultsName>} - see L{ParserElement.setResultsName})

    Example::
        integer = Word(nums)
        date_str = (integer.setResultsName("year") + '/' 
                        + integer.setResultsName("month") + '/' 
                        + integer.setResultsName("day"))
        # equivalent form:
        # date_str = integer("year") + '/' + integer("month") + '/' + integer("day")

        # parseString returns a ParseResults object
        result = date_str.parseString("1999/12/31")

        def test(s, fn=repr):
            print("%s -> %s" % (s, fn(eval(s))))
        test("list(result)")
        test("result[0]")
        test("result['month']")
        test("result.day")
        test("'month' in result")
        test("'minutes' in result")
        test("result.dump()", str)
    prints::
        list(result) -> ['1999', '/', '12', '/', '31']
        result[0] -> '1999'
        result['month'] -> '12'
        result.day -> '31'
        'month' in result -> True
        'minutes' in result -> False
        result.dump() -> ['1999', '/', '12', '/', '31']
        - day: 31
        - month: 12
        - year: 1999
    NTcCs"t||�r|Stj|�}d|_|S)NT)rz�object�__new__�_ParseResults__doinit)r��toklist�name�asList�modalZretobjrwrwrxr�Ts


zParseResults.__new__c
Cs`|jrvd|_d|_d|_i|_||_||_|dkr6g}||t�rP|dd�|_n||t�rft|�|_n|g|_t	�|_
|dk	o�|�r\|s�d|j|<||t�r�t|�}||_||t
d�ttf�o�|ddgfk�s\||t�r�|g}|�r&||t��rt|j�d�||<ntt|d�d�||<|||_n6y|d||<Wn$tttfk
�rZ|||<YnXdS)NFrr�)r��_ParseResults__name�_ParseResults__parent�_ParseResults__accumNames�_ParseResults__asList�_ParseResults__modal�list�_ParseResults__toklist�_generatorType�dict�_ParseResults__tokdictrur�r��
basestringr"r��copy�KeyError�	TypeError�
IndexError)r�r�r�r�r�rzrwrwrxr�]sB



$
zParseResults.__init__cCsPt|ttf�r|j|S||jkr4|j|ddStdd�|j|D��SdS)NrrrcSsg|]}|d�qS)rrw)r��vrwrwrx�
<listcomp>�sz,ParseResults.__getitem__.<locals>.<listcomp>rs)rzru�slicer�r�r�r")r�r�rwrwrxr��s


zParseResults.__getitem__cCs�||t�r0|jj|t��|g|j|<|d}nD||ttf�rN||j|<|}n&|jj|t��t|d�g|j|<|}||t�r�t|�|_	dS)Nr)
r�r��getr�rur�r�r"�wkrefr�)r��kr�rz�subrwrwrx�__setitem__�s


"
zParseResults.__setitem__c
Cs�t|ttf�r�t|j�}|j|=t|t�rH|dkr:||7}t||d�}tt|j|���}|j�x^|j	j
�D]F\}}x<|D]4}x.t|�D]"\}\}}	t||	|	|k�||<q�Wq|WqnWn|j	|=dS)Nrrr)
rzrur��lenr�r��range�indices�reverser��items�	enumerater�)
r�r�ZmylenZremovedr��occurrences�jr��value�positionrwrwrx�__delitem__�s


$zParseResults.__delitem__cCs
||jkS)N)r�)r�r�rwrwrx�__contains__�szParseResults.__contains__cCs
t|j�S)N)r�r�)r�rwrwrx�__len__�szParseResults.__len__cCs
|jS)N)r�)r�rwrwrx�__bool__�szParseResults.__bool__cCs
t|j�S)N)�iterr�)r�rwrwrx�__iter__�szParseResults.__iter__cCst|jddd��S)Nrrrs)r�r�)r�rwrwrx�__reversed__�szParseResults.__reversed__cCs$t|jd�r|jj�St|j�SdS)N�iterkeys)�hasattrr�r�r�)r�rwrwrx�	_iterkeys�s
zParseResults._iterkeyscs�fdd��j�D�S)Nc3s|]}�|VqdS)Nrw)r�r�)r�rwrxr��sz+ParseResults._itervalues.<locals>.<genexpr>)r�)r�rw)r�rx�_itervalues�szParseResults._itervaluescs�fdd��j�D�S)Nc3s|]}|�|fVqdS)Nrw)r�r�)r�rwrxr��sz*ParseResults._iteritems.<locals>.<genexpr>)r�)r�rw)r�rx�
_iteritems�szParseResults._iteritemscCst|j��S)zVReturns all named result keys (as a list in Python 2.x, as an iterator in Python 3.x).)r�r�)r�rwrwrx�keys�szParseResults.keyscCst|j��S)zXReturns all named result values (as a list in Python 2.x, as an iterator in Python 3.x).)r��
itervalues)r�rwrwrx�values�szParseResults.valuescCst|j��S)zfReturns all named result key-values (as a list of tuples in Python 2.x, as an iterator in Python 3.x).)r��	iteritems)r�rwrwrxr��szParseResults.itemscCs
t|j�S)z�Since keys() returns an iterator, this method is helpful in bypassing
           code that looks for the existence of any defined results names.)�boolr�)r�rwrwrx�haskeys�szParseResults.haskeyscOs�|s
dg}x6|j�D]*\}}|dkr2|d|f}qtd|��qWt|dt�sht|�dksh|d|kr�|d}||}||=|S|d}|SdS)a�
        Removes and returns item at specified index (default=C{last}).
        Supports both C{list} and C{dict} semantics for C{pop()}. If passed no
        argument or an integer argument, it will use C{list} semantics
        and pop tokens from the list of parsed tokens. If passed a 
        non-integer argument (most likely a string), it will use C{dict}
        semantics and pop the corresponding value from any defined 
        results names. A second default return value argument is 
        supported, just as in C{dict.pop()}.

        Example::
            def remove_first(tokens):
                tokens.pop(0)
            print(OneOrMore(Word(nums)).parseString("0 123 321")) # -> ['0', '123', '321']
            print(OneOrMore(Word(nums)).addParseAction(remove_first).parseString("0 123 321")) # -> ['123', '321']

            label = Word(alphas)
            patt = label("LABEL") + OneOrMore(Word(nums))
            print(patt.parseString("AAB 123 321").dump())

            # Use pop() in a parse action to remove named result (note that corresponding value is not
            # removed from list form of results)
            def remove_LABEL(tokens):
                tokens.pop("LABEL")
                return tokens
            patt.addParseAction(remove_LABEL)
            print(patt.parseString("AAB 123 321").dump())
        prints::
            ['AAB', '123', '321']
            - LABEL: AAB

            ['AAB', '123', '321']
        rr�defaultrz-pop() got an unexpected keyword argument '%s'Nrs)r�r�rzrur�)r�r��kwargsr�r��indexr�Zdefaultvaluerwrwrx�pop�s"zParseResults.popcCs||kr||S|SdS)ai
        Returns named result matching the given key, or if there is no
        such name, then returns the given C{defaultValue} or C{None} if no
        C{defaultValue} is specified.

        Similar to C{dict.get()}.
        
        Example::
            integer = Word(nums)
            date_str = integer("year") + '/' + integer("month") + '/' + integer("day")           

            result = date_str.parseString("1999/12/31")
            print(result.get("year")) # -> '1999'
            print(result.get("hour", "not specified")) # -> 'not specified'
            print(result.get("hour")) # -> None
        Nrw)r��key�defaultValuerwrwrxr�szParseResults.getcCsZ|jj||�xF|jj�D]8\}}x.t|�D]"\}\}}t||||k�||<q,WqWdS)a
        Inserts new element at location index in the list of parsed tokens.
        
        Similar to C{list.insert()}.

        Example::
            print(OneOrMore(Word(nums)).parseString("0 123 321")) # -> ['0', '123', '321']

            # use a parse action to insert the parse location in the front of the parsed results
            def insert_locn(locn, tokens):
                tokens.insert(0, locn)
            print(OneOrMore(Word(nums)).addParseAction(insert_locn).parseString("0 123 321")) # -> [0, '0', '123', '321']
        N)r��insertr�r�r�r�)r�r�ZinsStrr�r�r�r�r�rwrwrxr�2szParseResults.insertcCs|jj|�dS)a�
        Add single element to end of ParseResults list of elements.

        Example::
            print(OneOrMore(Word(nums)).parseString("0 123 321")) # -> ['0', '123', '321']
            
            # use a parse action to compute the sum of the parsed integers, and add it to the end
            def append_sum(tokens):
                tokens.append(sum(map(int, tokens)))
            print(OneOrMore(Word(nums)).addParseAction(append_sum).parseString("0 123 321")) # -> ['0', '123', '321', 444]
        N)r��append)r��itemrwrwrxr�FszParseResults.appendcCs$t|t�r||7}n|jj|�dS)a
        Add sequence of elements to end of ParseResults list of elements.

        Example::
            patt = OneOrMore(Word(alphas))
            
            # use a parse action to append the reverse of the matched strings, to make a palindrome
            def make_palindrome(tokens):
                tokens.extend(reversed([t[::-1] for t in tokens]))
                return ''.join(tokens)
            print(patt.addParseAction(make_palindrome).parseString("lskdj sdlkjf lksd")) # -> 'lskdjsdlkjflksddsklfjkldsjdksl'
        N)rzr"r��extend)r�Zitemseqrwrwrxr�Ts

zParseResults.extendcCs|jdd�=|jj�dS)z7
        Clear all elements and results names.
        N)r�r��clear)r�rwrwrxr�fszParseResults.clearcCsfy||Stk
rdSX||jkr^||jkrD|j|ddStdd�|j|D��SndSdS)Nr�rrrcSsg|]}|d�qS)rrw)r�r�rwrwrxr�wsz,ParseResults.__getattr__.<locals>.<listcomp>rs)r�r�r�r")r�r�rwrwrxr�ms

zParseResults.__getattr__cCs|j�}||7}|S)N)r�)r��otherr�rwrwrx�__add__{szParseResults.__add__cs�|jrnt|j���fdd��|jj�}�fdd�|D�}x4|D],\}}|||<t|dt�r>t|�|d_q>W|j|j7_|jj	|j�|S)Ncs|dkr�S|�S)Nrrw)�a)�offsetrwrxry�sz'ParseResults.__iadd__.<locals>.<lambda>c	s4g|],\}}|D]}|t|d�|d��f�qqS)rrr)r�)r�r��vlistr�)�	addoffsetrwrxr��sz)ParseResults.__iadd__.<locals>.<listcomp>r)
r�r�r�r�rzr"r�r�r��update)r�r�Z
otheritemsZotherdictitemsr�r�rw)rrrx�__iadd__�s


zParseResults.__iadd__cCs&t|t�r|dkr|j�S||SdS)Nr)rzrur�)r�r�rwrwrx�__radd__�szParseResults.__radd__cCsdt|j�t|j�fS)Nz(%s, %s))r�r�r�)r�rwrwrxr��szParseResults.__repr__cCsddjdd�|jD��dS)N�[z, css(|] }t|t�rt|�nt|�VqdS)N)rzr"r�r�)r�r�rwrwrxr��sz'ParseResults.__str__.<locals>.<genexpr>�])r�r�)r�rwrwrxr��szParseResults.__str__r�cCsPg}xF|jD]<}|r"|r"|j|�t|t�r:||j�7}q|jt|��qW|S)N)r�r�rzr"�
_asStringListr�)r��sep�outr�rwrwrxr
�s

zParseResults._asStringListcCsdd�|jD�S)a�
        Returns the parse results as a nested list of matching tokens, all converted to strings.

        Example::
            patt = OneOrMore(Word(alphas))
            result = patt.parseString("sldkj lsdkj sldkj")
            # even though the result prints in string-like form, it is actually a pyparsing ParseResults
            print(type(result), result) # -> <class 'pyparsing.ParseResults'> ['sldkj', 'lsdkj', 'sldkj']
            
            # Use asList() to create an actual list
            result_list = result.asList()
            print(type(result_list), result_list) # -> <class 'list'> ['sldkj', 'lsdkj', 'sldkj']
        cSs"g|]}t|t�r|j�n|�qSrw)rzr"r�)r��resrwrwrxr��sz'ParseResults.asList.<locals>.<listcomp>)r�)r�rwrwrxr��szParseResults.asListcs6tr|j}n|j}�fdd��t�fdd�|�D��S)a�
        Returns the named parse results as a nested dictionary.

        Example::
            integer = Word(nums)
            date_str = integer("year") + '/' + integer("month") + '/' + integer("day")
            
            result = date_str.parseString('12/31/1999')
            print(type(result), repr(result)) # -> <class 'pyparsing.ParseResults'> (['12', '/', '31', '/', '1999'], {'day': [('1999', 4)], 'year': [('12', 0)], 'month': [('31', 2)]})
            
            result_dict = result.asDict()
            print(type(result_dict), repr(result_dict)) # -> <class 'dict'> {'day': '1999', 'year': '12', 'month': '31'}

            # even though a ParseResults supports dict-like access, sometime you just need to have a dict
            import json
            print(json.dumps(result)) # -> Exception: TypeError: ... is not JSON serializable
            print(json.dumps(result.asDict())) # -> {"month": "31", "day": "1999", "year": "12"}
        cs6t|t�r.|j�r|j�S�fdd�|D�Sn|SdS)Ncsg|]}�|��qSrwrw)r�r�)�toItemrwrxr��sz7ParseResults.asDict.<locals>.toItem.<locals>.<listcomp>)rzr"r��asDict)r�)rrwrxr�s

z#ParseResults.asDict.<locals>.toItemc3s|]\}}|�|�fVqdS)Nrw)r�r�r�)rrwrxr��sz&ParseResults.asDict.<locals>.<genexpr>)�PY_3r�r�r�)r�Zitem_fnrw)rrxr�s
	zParseResults.asDictcCs8t|j�}|jj�|_|j|_|jj|j�|j|_|S)zA
        Returns a new copy of a C{ParseResults} object.
        )r"r�r�r�r�r�rr�)r�r�rwrwrxr��s
zParseResults.copyFcCsPd}g}tdd�|jj�D��}|d}|s8d}d}d}d}	|dk	rJ|}	n|jrV|j}	|	sf|rbdSd}	|||d|	d	g7}x�t|j�D]�\}
}t|t�r�|
|kr�||j||
|o�|dk||�g7}n||jd|o�|dk||�g7}q�d}|
|kr�||
}|�s
|�rq�nd}t	t
|��}
|||d|d	|
d
|d	g	7}q�W|||d
|	d	g7}dj|�S)z�
        (Deprecated) Returns the parse results as XML. Tags are created for tokens and lists that have defined results names.
        �
css(|] \}}|D]}|d|fVqqdS)rrNrw)r�r�rr�rwrwrxr��sz%ParseResults.asXML.<locals>.<genexpr>z  r�NZITEM�<�>z</)r�r�r�r�r�r�rzr"�asXMLr�r�r�)r�ZdoctagZnamedItemsOnly�indentZ	formatted�nlrZ
namedItemsZnextLevelIndentZselfTagr�r
ZresTagZxmlBodyTextrwrwrxr�sT


zParseResults.asXMLcCs:x4|jj�D]&\}}x|D]\}}||kr|SqWqWdS)N)r�r�)r�r�r�rr�r�rwrwrxZ__lookup$s
zParseResults.__lookupcCs�|jr|jS|jr.|j�}|r(|j|�SdSnNt|�dkrxt|j�dkrxtt|jj���dddkrxtt|jj���SdSdS)a(
        Returns the results name for this token expression. Useful when several 
        different expressions might match at a particular location.

        Example::
            integer = Word(nums)
            ssn_expr = Regex(r"\d\d\d-\d\d-\d\d\d\d")
            house_number_expr = Suppress('#') + Word(nums, alphanums)
            user_data = (Group(house_number_expr)("house_number") 
                        | Group(ssn_expr)("ssn")
                        | Group(integer)("age"))
            user_info = OneOrMore(user_data)
            
            result = user_info.parseString("22 111-22-3333 #221B")
            for item in result:
                print(item.getName(), ':', item[0])
        prints::
            age : 22
            ssn : 111-22-3333
            house_number : 221B
        Nrrrrs)rrs)	r�r��_ParseResults__lookupr�r��nextr�r�r�)r��parrwrwrx�getName+s
zParseResults.getNamercCsbg}d}|j|t|j���|�rX|j�r�tdd�|j�D��}xz|D]r\}}|r^|j|�|jd|d||f�t|t�r�|r�|j|j||d��q�|jt|��qH|jt	|��qHWn�t
dd�|D���rX|}x~t|�D]r\}	}
t|
t��r*|jd|d||	|d|d|
j||d�f�q�|jd|d||	|d|dt|
�f�q�Wd	j|�S)
aH
        Diagnostic method for listing out the contents of a C{ParseResults}.
        Accepts an optional C{indent} argument so that this string can be embedded
        in a nested display of other data.

        Example::
            integer = Word(nums)
            date_str = integer("year") + '/' + integer("month") + '/' + integer("day")
            
            result = date_str.parseString('12/31/1999')
            print(result.dump())
        prints::
            ['12', '/', '31', '/', '1999']
            - day: 1999
            - month: 31
            - year: 12
        rcss|]\}}t|�|fVqdS)N)r{)r�r�r�rwrwrxr�gsz$ParseResults.dump.<locals>.<genexpr>z
%s%s- %s: z  rrcss|]}t|t�VqdS)N)rzr")r��vvrwrwrxr�ssz
%s%s[%d]:
%s%s%sr�)
r�r�r�r��sortedr�rzr"�dumpr��anyr�r�)r�r�depth�fullr�NLr�r�r�r�rrwrwrxrPs,

4.zParseResults.dumpcOstj|j�f|�|�dS)a�
        Pretty-printer for parsed results as a list, using the C{pprint} module.
        Accepts additional positional or keyword args as defined for the 
        C{pprint.pprint} method. (U{http://docs.python.org/3/library/pprint.html#pprint.pprint})

        Example::
            ident = Word(alphas, alphanums)
            num = Word(nums)
            func = Forward()
            term = ident | num | Group('(' + func + ')')
            func <<= ident + Group(Optional(delimitedList(term)))
            result = func.parseString("fna a,b,(fnb c,d,200),100")
            result.pprint(width=40)
        prints::
            ['fna',
             ['a',
              'b',
              ['(', 'fnb', ['c', 'd', '200'], ')'],
              '100']]
        N)�pprintr�)r�r�r�rwrwrxr"}szParseResults.pprintcCs.|j|jj�|jdk	r|j�p d|j|jffS)N)r�r�r�r�r�r�)r�rwrwrx�__getstate__�s
zParseResults.__getstate__cCsN|d|_|d\|_}}|_i|_|jj|�|dk	rDt|�|_nd|_dS)Nrrr)r�r�r�r�rr�r�)r��staterZinAccumNamesrwrwrx�__setstate__�s
zParseResults.__setstate__cCs|j|j|j|jfS)N)r�r�r�r�)r�rwrwrx�__getnewargs__�szParseResults.__getnewargs__cCstt|��t|j��S)N)r�r�r�r�)r�rwrwrxr��szParseResults.__dir__)NNTT)N)r�)NFr�T)r�rT)4r�r�r�r�r�rzr�r�r�r�r�r�r��__nonzero__r�r�r�r�r�rr�r�r�r�r�r�r�r�r�r�r�r�r�r�rrrr�r�r
r�rr�rrrrr"r#r%r&r�rwrwrwrxr"-sh&
	'	
4

#
=%
-
cCsF|}d|kot|�knr4||ddkr4dS||jdd|�S)aReturns current column within a string, counting newlines as line separators.
   The first column is number 1.

   Note: the default parsing behavior is to expand tabs in the input string
   before starting the parsing process.  See L{I{ParserElement.parseString}<ParserElement.parseString>} for more information
   on parsing strings containing C{<TAB>}s, and suggested methods to maintain a
   consistent view of the parsed string, the parse location, and line and column
   positions within the parsed string.
   rrrr)r��rfind)r��strgr�rwrwrxr9�s
cCs|jdd|�dS)aReturns current line number within a string, counting newlines as line separators.
   The first line is number 1.

   Note: the default parsing behavior is to expand tabs in the input string
   before starting the parsing process.  See L{I{ParserElement.parseString}<ParserElement.parseString>} for more information
   on parsing strings containing C{<TAB>}s, and suggested methods to maintain a
   consistent view of the parsed string, the parse location, and line and column
   positions within the parsed string.
   rrrr)�count)r�r)rwrwrxrJ�s
cCsF|jdd|�}|jd|�}|dkr2||d|�S||dd�SdS)zfReturns the line of text containing loc within a string, counting newlines as line separators.
       rrrrN)r(�find)r�r)ZlastCRZnextCRrwrwrxrG�s
cCs8tdt|�dt|�dt||�t||�f�dS)NzMatch z at loc z(%d,%d))�printr�rJr9)�instringr��exprrwrwrx�_defaultStartDebugAction�sr/cCs$tdt|�dt|j���dS)NzMatched z -> )r,r�r{r�)r-�startlocZendlocr.�toksrwrwrx�_defaultSuccessDebugAction�sr2cCstdt|��dS)NzException raised:)r,r�)r-r�r.�excrwrwrx�_defaultExceptionDebugAction�sr4cGsdS)zG'Do-nothing' debug action, to suppress debugging output during parsing.Nrw)r�rwrwrxrQ�srqcs��tkr�fdd�Sdg�dg�tdd�dkrFddd	�}dd
d��ntj}tj�d}|dd
�d}|d|d|f�������fdd�}d}yt�dt�d�j�}Wntk
r�t��}YnX||_|S)Ncs�|�S)Nrw)r��lrv)�funcrwrxry�sz_trim_arity.<locals>.<lambda>rFrqro�cSs8tdkrdnd	}tj||dd�|}|j|jfgS)
Nror7rrqrr)�limit)ror7r������)�system_version�	traceback�
extract_stack�filenamerJ)r8r�
frame_summaryrwrwrxr=sz"_trim_arity.<locals>.extract_stackcSs$tj||d�}|d}|j|jfgS)N)r8rrrs)r<�
extract_tbr>rJ)�tbr8Zframesr?rwrwrxr@sz_trim_arity.<locals>.extract_tb�)r8rrcs�x�y �|�dd��}d�d<|Stk
r��dr>�n4z.tj�d}�|dd�ddd��ksj�Wd~X�d�kr��dd7<w�YqXqWdS)NrTrrrq)r8rsrs)r�r~�exc_info)r�r�rA)r@�
foundArityr6r8�maxargs�pa_call_line_synthrwrx�wrappers"z_trim_arity.<locals>.wrapperz<parse action>r��	__class__)ror7)r)rrs)	�singleArgBuiltinsr;r<r=r@�getattrr��	Exceptionr{)r6rEr=Z	LINE_DIFFZ	this_linerG�	func_namerw)r@rDr6r8rErFrx�_trim_arity�s*
rMcs�eZdZdZdZdZedd��Zedd��Zd�dd	�Z	d
d�Z
dd
�Zd�dd�Zd�dd�Z
dd�Zdd�Zdd�Zdd�Zdd�Zdd�Zd�dd �Zd!d"�Zd�d#d$�Zd%d&�Zd'd(�ZGd)d*�d*e�Zed+k	r�Gd,d-�d-e�ZnGd.d-�d-e�ZiZe�Zd/d/gZ d�d0d1�Z!eZ"ed2d3��Z#dZ$ed�d5d6��Z%d�d7d8�Z&e'dfd9d:�Z(d;d<�Z)e'fd=d>�Z*e'dfd?d@�Z+dAdB�Z,dCdD�Z-dEdF�Z.dGdH�Z/dIdJ�Z0dKdL�Z1dMdN�Z2dOdP�Z3dQdR�Z4dSdT�Z5dUdV�Z6dWdX�Z7dYdZ�Z8d�d[d\�Z9d]d^�Z:d_d`�Z;dadb�Z<dcdd�Z=dedf�Z>dgdh�Z?d�didj�Z@dkdl�ZAdmdn�ZBdodp�ZCdqdr�ZDgfdsdt�ZEd�dudv�ZF�fdwdx�ZGdydz�ZHd{d|�ZId}d~�ZJdd��ZKd�d�d��ZLd�d�d��ZM�ZNS)�r$z)Abstract base level parser element class.z 
	
FcCs
|t_dS)a�
        Overrides the default whitespace chars

        Example::
            # default whitespace chars are space, <TAB> and newline
            OneOrMore(Word(alphas)).parseString("abc def\nghi jkl")  # -> ['abc', 'def', 'ghi', 'jkl']
            
            # change to just treat newline as significant
            ParserElement.setDefaultWhitespaceChars(" \t")
            OneOrMore(Word(alphas)).parseString("abc def\nghi jkl")  # -> ['abc', 'def']
        N)r$�DEFAULT_WHITE_CHARS)�charsrwrwrx�setDefaultWhitespaceChars=s
z'ParserElement.setDefaultWhitespaceCharscCs
|t_dS)a�
        Set class to be used for inclusion of string literals into a parser.
        
        Example::
            # default literal class used is Literal
            integer = Word(nums)
            date_str = integer("year") + '/' + integer("month") + '/' + integer("day")           

            date_str.parseString("1999/12/31")  # -> ['1999', '/', '12', '/', '31']


            # change to Suppress
            ParserElement.inlineLiteralsUsing(Suppress)
            date_str = integer("year") + '/' + integer("month") + '/' + integer("day")           

            date_str.parseString("1999/12/31")  # -> ['1999', '12', '31']
        N)r$�_literalStringClass)r�rwrwrx�inlineLiteralsUsingLsz!ParserElement.inlineLiteralsUsingcCs�t�|_d|_d|_d|_||_d|_tj|_	d|_
d|_d|_t�|_
d|_d|_d|_d|_d|_d|_d|_d|_d|_dS)NTFr�)NNN)r��parseAction�
failAction�strRepr�resultsName�
saveAsList�skipWhitespacer$rN�
whiteChars�copyDefaultWhiteChars�mayReturnEmpty�keepTabs�ignoreExprs�debug�streamlined�
mayIndexError�errmsg�modalResults�debugActions�re�callPreparse�
callDuringTry)r��savelistrwrwrxr�as(zParserElement.__init__cCs<tj|�}|jdd�|_|jdd�|_|jr8tj|_|S)a$
        Make a copy of this C{ParserElement}.  Useful for defining different parse actions
        for the same parsing pattern, using copies of the original parse element.
        
        Example::
            integer = Word(nums).setParseAction(lambda toks: int(toks[0]))
            integerK = integer.copy().addParseAction(lambda toks: toks[0]*1024) + Suppress("K")
            integerM = integer.copy().addParseAction(lambda toks: toks[0]*1024*1024) + Suppress("M")
            
            print(OneOrMore(integerK | integerM | integer).parseString("5K 100 640K 256M"))
        prints::
            [5120, 100, 655360, 268435456]
        Equivalent form of C{expr.copy()} is just C{expr()}::
            integerM = integer().addParseAction(lambda toks: toks[0]*1024*1024) + Suppress("M")
        N)r�rSr]rZr$rNrY)r�Zcpyrwrwrxr�xs
zParserElement.copycCs*||_d|j|_t|d�r&|j|j_|S)af
        Define a name for this expression, to make debugging and exception messages clearer.
        
        Example::
            Word(nums).parseString("ABC")  # -> Exception: Expected W:(0123...) (at char 0), (line:1, col:1)
            Word(nums).setName("integer").parseString("ABC")  # -> Exception: Expected integer (at char 0), (line:1, col:1)
        z	Expected �	exception)r�rar�rhr�)r�r�rwrwrx�setName�s


zParserElement.setNamecCs4|j�}|jd�r"|dd�}d}||_||_|S)aP
        Define name for referencing matching tokens as a nested attribute
        of the returned parse results.
        NOTE: this returns a *copy* of the original C{ParserElement} object;
        this is so that the client can define a basic element, such as an
        integer, and reference it in multiple places with different names.

        You can also set results names using the abbreviated syntax,
        C{expr("name")} in place of C{expr.setResultsName("name")} - 
        see L{I{__call__}<__call__>}.

        Example::
            date_str = (integer.setResultsName("year") + '/' 
                        + integer.setResultsName("month") + '/' 
                        + integer.setResultsName("day"))

            # equivalent form:
            date_str = integer("year") + '/' + integer("month") + '/' + integer("day")
        �*NrrTrs)r��endswithrVrb)r�r��listAllMatchesZnewselfrwrwrx�setResultsName�s
zParserElement.setResultsNameTcs@|r&|j�d�fdd�	}�|_||_nt|jd�r<|jj|_|S)z�Method to invoke the Python pdb debugger when this element is
           about to be parsed. Set C{breakFlag} to True to enable, False to
           disable.
        Tcsddl}|j��||||�S)Nr)�pdbZ	set_trace)r-r��	doActions�callPreParsern)�_parseMethodrwrx�breaker�sz'ParserElement.setBreak.<locals>.breaker�_originalParseMethod)TT)�_parsersr�)r�Z	breakFlagrrrw)rqrx�setBreak�s
zParserElement.setBreakcOs&tttt|���|_|jdd�|_|S)a
        Define action to perform when successfully matching parse element definition.
        Parse action fn is a callable method with 0-3 arguments, called as C{fn(s,loc,toks)},
        C{fn(loc,toks)}, C{fn(toks)}, or just C{fn()}, where:
         - s   = the original string being parsed (see note below)
         - loc = the location of the matching substring
         - toks = a list of the matched tokens, packaged as a C{L{ParseResults}} object
        If the functions in fns modify the tokens, they can return them as the return
        value from fn, and the modified list of tokens will replace the original.
        Otherwise, fn does not need to return any value.

        Optional keyword arguments:
         - callDuringTry = (default=C{False}) indicate if parse action should be run during lookaheads and alternate testing

        Note: the default parsing behavior is to expand tabs in the input string
        before starting the parsing process.  See L{I{parseString}<parseString>} for more information
        on parsing strings containing C{<TAB>}s, and suggested methods to maintain a
        consistent view of the parsed string, the parse location, and line and column
        positions within the parsed string.
        
        Example::
            integer = Word(nums)
            date_str = integer + '/' + integer + '/' + integer

            date_str.parseString("1999/12/31")  # -> ['1999', '/', '12', '/', '31']

            # use parse action to convert to ints at parse time
            integer = Word(nums).setParseAction(lambda toks: int(toks[0]))
            date_str = integer + '/' + integer + '/' + integer

            # note that integer fields are now ints, not strings
            date_str.parseString("1999/12/31")  # -> [1999, '/', 12, '/', 31]
        rfF)r��maprMrSr�rf)r��fnsr�rwrwrxr��s"zParserElement.setParseActioncOs4|jtttt|���7_|jp,|jdd�|_|S)z�
        Add parse action to expression's list of parse actions. See L{I{setParseAction}<setParseAction>}.
        
        See examples in L{I{copy}<copy>}.
        rfF)rSr�rvrMrfr�)r�rwr�rwrwrx�addParseAction�szParserElement.addParseActioncsb|jdd��|jdd�rtnt�x(|D] ����fdd�}|jj|�q&W|jpZ|jdd�|_|S)a�Add a boolean predicate function to expression's list of parse actions. See 
        L{I{setParseAction}<setParseAction>} for function call signatures. Unlike C{setParseAction}, 
        functions passed to C{addCondition} need to return boolean success/fail of the condition.

        Optional keyword arguments:
         - message = define a custom message to be used in the raised exception
         - fatal   = if True, will raise ParseFatalException to stop parsing immediately; otherwise will raise ParseException
         
        Example::
            integer = Word(nums).setParseAction(lambda toks: int(toks[0]))
            year_int = integer.copy()
            year_int.addCondition(lambda toks: toks[0] >= 2000, message="Only support years 2000 and later")
            date_str = year_int + '/' + integer + '/' + integer

            result = date_str.parseString("1999/12/31")  # -> Exception: Only support years 2000 and later (at char 0), (line:1, col:1)
        �messagezfailed user-defined condition�fatalFcs$tt��|||��s �||���dS)N)r�rM)r�r5rv)�exc_type�fnr�rwrx�pasz&ParserElement.addCondition.<locals>.parf)r�r!rrSr�rf)r�rwr�r}rw)r{r|r�rx�addCondition�s
zParserElement.addConditioncCs
||_|S)aDefine action to perform if parsing fails at this expression.
           Fail action fn is a callable function that takes the arguments
           C{fn(s,loc,expr,err)} where:
            - s = string being parsed
            - loc = location where expression match was attempted and failed
            - expr = the parse expression that failed
            - err = the exception thrown
           The function returns no value.  It may throw C{L{ParseFatalException}}
           if it is desired to stop parsing immediately.)rT)r�r|rwrwrx�
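# Illustrative sketch: a fail action receiving the (s, loc, expr, err)
# arguments listed above.  It only reports the failure; the original
# exception is still raised afterwards.
from pyparsing import Word, nums, ParseException

def report_failure(s, loc, expr, err):
    print("match of %s failed at loc %d: %s" % (expr, loc, err))

integer = Word(nums).setFailAction(report_failure)
try:
    integer.parseString("abc")
except ParseException:
    pass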
setFailActions
zParserElement.setFailActioncCsZd}xP|rTd}xB|jD]8}yx|j||�\}}d}qWWqtk
rLYqXqWqW|S)NTF)r]rtr)r�r-r�Z
exprsFound�eZdummyrwrwrx�_skipIgnorables#szParserElement._skipIgnorablescCsL|jr|j||�}|jrH|j}t|�}x ||krF|||krF|d7}q(W|S)Nrr)r]r�rXrYr�)r�r-r�Zwt�instrlenrwrwrx�preParse0szParserElement.preParsecCs|gfS)Nrw)r�r-r�rorwrwrx�	parseImpl<szParserElement.parseImplcCs|S)Nrw)r�r-r��	tokenlistrwrwrx�	postParse?szParserElement.postParsec"Cs�|j}|s|jr�|jdr,|jd|||�|rD|jrD|j||�}n|}|}yDy|j|||�\}}Wn(tk
r�t|t|�|j	|��YnXWnXt
k
r�}	z<|jdr�|jd||||	�|jr�|j||||	��WYdd}	~	XnXn�|o�|j�r|j||�}n|}|}|j�s$|t|�k�rhy|j|||�\}}Wn*tk
�rdt|t|�|j	|��YnXn|j|||�\}}|j|||�}t
||j|j|jd�}
|j�r�|�s�|j�r�|�rVyRxL|jD]B}||||
�}|dk	�r�t
||j|j�o�t|t
tf�|jd�}
�q�WWnFt
k
�rR}	z(|jd�r@|jd||||	��WYdd}	~	XnXnNxL|jD]B}||||
�}|dk	�r^t
||j|j�o�t|t
tf�|jd�}
�q^W|�r�|jd�r�|jd|||||
�||
fS)Nrrq)r�r�rr)r^rTrcrer�r�r�rr�rarr`r�r"rVrWrbrSrfrzr�)r�r-r�rorpZ	debugging�prelocZtokensStart�tokens�errZ	retTokensr|rwrwrx�
_parseNoCacheCsp





zParserElement._parseNoCachecCs>y|j||dd�dStk
r8t|||j|��YnXdS)NF)ror)rtr!rra)r�r-r�rwrwrx�tryParse�szParserElement.tryParsecCs2y|j||�Wnttfk
r(dSXdSdS)NFT)r�rr�)r�r-r�rwrwrx�canParseNext�s
zParserElement.canParseNextc@seZdZdd�ZdS)zParserElement._UnboundedCachecsdi�t�|_���fdd�}�fdd�}�fdd�}tj||�|_tj||�|_tj||�|_dS)Ncs�j|��S)N)r�)r�r�)�cache�not_in_cacherwrxr��sz3ParserElement._UnboundedCache.__init__.<locals>.getcs|�|<dS)Nrw)r�r�r�)r�rwrx�set�sz3ParserElement._UnboundedCache.__init__.<locals>.setcs�j�dS)N)r�)r�)r�rwrxr��sz5ParserElement._UnboundedCache.__init__.<locals>.clear)r�r��types�
MethodTyper�r�r�)r�r�r�r�rw)r�r�rxr��sz&ParserElement._UnboundedCache.__init__N)r�r�r�r�rwrwrwrx�_UnboundedCache�sr�Nc@seZdZdd�ZdS)zParserElement._FifoCachecsht�|_�t����fdd�}��fdd�}�fdd�}tj||�|_tj||�|_tj||�|_dS)Ncs�j|��S)N)r�)r�r�)r�r�rwrxr��sz.ParserElement._FifoCache.__init__.<locals>.getcs"|�|<t���kr�jd�dS)NF)r��popitem)r�r�r�)r��sizerwrxr��sz.ParserElement._FifoCache.__init__.<locals>.setcs�j�dS)N)r�)r�)r�rwrxr��sz0ParserElement._FifoCache.__init__.<locals>.clear)r�r��_OrderedDictr�r�r�r�r�)r�r�r�r�r�rw)r�r�r�rxr��sz!ParserElement._FifoCache.__init__N)r�r�r�r�rwrwrwrx�
_FifoCache�sr�c@seZdZdd�ZdS)zParserElement._FifoCachecsvt�|_�i�tjg�����fdd�}���fdd�}��fdd�}tj||�|_tj||�|_tj||�|_dS)Ncs�j|��S)N)r�)r�r�)r�r�rwrxr��sz.ParserElement._FifoCache.__init__.<locals>.getcs2|�|<t���kr$�j�j�d��j|�dS)N)r�r��popleftr�)r�r�r�)r��key_fifor�rwrxr��sz.ParserElement._FifoCache.__init__.<locals>.setcs�j��j�dS)N)r�)r�)r�r�rwrxr��sz0ParserElement._FifoCache.__init__.<locals>.clear)	r�r��collections�dequer�r�r�r�r�)r�r�r�r�r�rw)r�r�r�r�rxr��sz!ParserElement._FifoCache.__init__N)r�r�r�r�rwrwrwrxr��srcCs�d\}}|||||f}tj��tj}|j|�}	|	|jkr�tj|d7<y|j||||�}	Wn8tk
r�}
z|j||
j	|
j
���WYdd}
~
Xq�X|j||	d|	dj�f�|	Sn4tj|d7<t|	t
�r�|	�|	d|	dj�fSWdQRXdS)Nrrr)rrr)r$�packrat_cache_lock�
packrat_cacher�r��packrat_cache_statsr�rr�rHr�r�rzrK)r�r-r�rorpZHITZMISS�lookupr�r�r�rwrwrx�_parseCache�s$


zParserElement._parseCachecCs(tjj�dgttj�tjdd�<dS)Nr)r$r�r�r�r�rwrwrwrx�
resetCache�s
zParserElement.resetCache�cCs8tjs4dt_|dkr tj�t_ntj|�t_tjt_dS)a�Enables "packrat" parsing, which adds memoizing to the parsing logic.
           Repeated parse attempts at the same string location (which happens
           often in many complex grammars) can immediately return a cached value,
           instead of re-executing parsing/validating code.  Memoizing is done for
           both valid results and parsing exceptions.
           
           Parameters:
            - cache_size_limit - (default=C{128}) - if an integer value is provided
              will limit the size of the packrat cache; if None is passed, then
              the cache size will be unbounded; if 0 is passed, the cache will
              be effectively disabled.
            
           This speedup may break existing programs that use parse actions that
           have side-effects.  For this reason, packrat parsing is disabled when
           you first import pyparsing.  To activate the packrat feature, your
           program must call the class method C{ParserElement.enablePackrat()}.  If
           your program uses C{psyco} to "compile as you go", you must call
           C{enablePackrat} before calling C{psyco.full()}.  If you do not do this,
           Python will crash.  For best results, call C{enablePackrat()} immediately
           after importing pyparsing.
           
           Example::
               import pyparsing
               pyparsing.ParserElement.enablePackrat()
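# Illustrative sketch: enabling packrat memoizing before the grammar is built
# and used, as recommended above.  The cache size of 256 is an arbitrary
# illustrative value; calling enablePackrat() with no argument uses the default of 128.
import pyparsing
pyparsing.ParserElement.enablePackrat(256)

from pyparsing import OneOrMore, Word, alphas, nums
term = Word(alphas) | Word(nums)
print(OneOrMore(term).parseString("abc 123 xyz"))   # -> ['abc', '123', 'xyz']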
        TN)r$�_packratEnabledr�r�r�r�rt)Zcache_size_limitrwrwrx�
enablePackratszParserElement.enablePackratcCs�tj�|js|j�x|jD]}|j�qW|js<|j�}y<|j|d�\}}|rv|j||�}t	�t
�}|j||�Wn0tk
r�}ztjr��n|�WYdd}~XnX|SdS)aB
        Execute the parse expression with the given string.
        This is the main interface to the client code, once the complete
        expression has been built.

        If you want the grammar to require that the entire input string be
        successfully parsed, then set C{parseAll} to True (equivalent to ending
        the grammar with C{L{StringEnd()}}).

        Note: C{parseString} implicitly calls C{expandtabs()} on the input string,
        in order to report proper column numbers in parse actions.
        If the input string contains tabs and
        the grammar uses parse actions that use the C{loc} argument to index into the
        string being parsed, you can ensure you have a consistent view of the input
        string by:
         - calling C{parseWithTabs} on your grammar before calling C{parseString}
           (see L{I{parseWithTabs}<parseWithTabs>})
         - define your parse action using the full C{(s,loc,toks)} signature, and
           reference the input string using the parse action's C{s} argument
         - explicitly expand the tabs in your input string before calling
           C{parseString}
        
        Example::
            Word('a').parseString('aaaaabaaa')  # -> ['aaaaa']
            Word('a').parseString('aaaaabaaa', parseAll=True)  # -> Exception: Expected end of text
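# Illustrative sketch: parseAll=True (as in the example above) makes trailing
# unmatched text an error, which can be caught as a ParseException.
from pyparsing import Word, alphas, ParseException

greet = Word(alphas) + ',' + Word(alphas) + '!'
try:
    greet.parseString("Hello, World! plus extra text", parseAll=True)
except ParseException as pe:
    print("did not consume the full input:", pe)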
        rN)
r$r�r_�
streamliner]r\�
expandtabsrtr�r
r)r�verbose_stacktrace)r�r-�parseAllr�r�r�Zser3rwrwrx�parseString#s$zParserElement.parseStringccs@|js|j�x|jD]}|j�qW|js8t|�j�}t|�}d}|j}|j}t	j
�d}	y�x�||kon|	|k�ry |||�}
|||
dd�\}}Wntk
r�|
d}Yq`X||kr�|	d7}	||
|fV|r�|||�}
|
|kr�|}q�|d7}n|}q`|
d}q`WWn4tk
�r:}zt	j
�r&�n|�WYdd}~XnXdS)a�
        Scan the input string for expression matches.  Each match will return the
        matching tokens, start location, and end location.  May be called with optional
        C{maxMatches} argument, to clip scanning after 'n' matches are found.  If
        C{overlap} is specified, then overlapping matches will be reported.

        Note that the start and end locations are reported relative to the string
        being parsed.  See L{I{parseString}<parseString>} for more information on parsing
        strings with embedded tabs.

        Example::
            source = "sldjf123lsdjjkf345sldkjf879lkjsfd987"
            print(source)
            for tokens,start,end in Word(alphas).scanString(source):
                print(' '*start + '^'*(end-start))
                print(' '*start + tokens[0])
        
        prints::
        
            sldjf123lsdjjkf345sldkjf879lkjsfd987
            ^^^^^
            sldjf
                    ^^^^^^^
                    lsdjjkf
                              ^^^^^^
                              sldkjf
                                       ^^^^^^
                                       lkjsfd
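# Illustrative sketch: the (tokens, start, end) triples yielded by scanString
# can be used to slice the matched substrings back out of the source text.
from pyparsing import Word, alphas

source = "sldjf123lsdjjkf345sldkjf879lkjsfd987"
for tokens, start, end in Word(alphas).scanString(source):
    print(tokens[0], source[start:end])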
        rF)rprrN)r_r�r]r\r�r�r�r�rtr$r�rrr�)r�r-�
maxMatchesZoverlapr�r�r�Z
preparseFnZparseFn�matchesr�ZnextLocr�Znextlocr3rwrwrx�
scanStringUsB


zParserElement.scanStringcCs�g}d}d|_y�xh|j|�D]Z\}}}|j|||��|rrt|t�rT||j�7}nt|t�rh||7}n
|j|�|}qW|j||d��dd�|D�}djtt	t
|���Stk
r�}ztj
rȂn|�WYdd}~XnXdS)af
        Extension to C{L{scanString}}, to modify matching text with modified tokens that may
        be returned from a parse action.  To use C{transformString}, define a grammar and
        attach a parse action to it that modifies the returned token list.
        Invoking C{transformString()} on a target string will then scan for matches,
        and replace the matched text patterns according to the logic in the parse
        action.  C{transformString()} returns the resulting transformed string.
        
        Example::
            wd = Word(alphas)
            wd.setParseAction(lambda toks: toks[0].title())
            
            print(wd.transformString("now is the winter of our discontent made glorious summer by this sun of york."))
        Prints::
            Now Is The Winter Of Our Discontent Made Glorious Summer By This Sun Of York.
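# Illustrative sketch in the same spirit as the example above: a parse action
# that returns replacement text, applied over a string with transformString.
from pyparsing import Word, nums

number = Word(nums)
number.setParseAction(lambda toks: "<NUM>")
print(number.transformString("room 101, floor 3"))   # -> room <NUM>, floor <NUM>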
        rTNcSsg|]}|r|�qSrwrw)r��orwrwrxr��sz1ParserElement.transformString.<locals>.<listcomp>r�)r\r�r�rzr"r�r�r�rvr��_flattenrr$r�)r�r-rZlastErvr�r�r3rwrwrxr��s(



zParserElement.transformStringcCsPytdd�|j||�D��Stk
rJ}ztjr6�n|�WYdd}~XnXdS)a~
        Another extension to C{L{scanString}}, simplifying the access to the tokens found
        to match the given parse expression.  May be called with optional
        C{maxMatches} argument, to clip searching after 'n' matches are found.
        
        Example::
            # a capitalized word starts with an uppercase letter, followed by zero or more lowercase letters
            cap_word = Word(alphas.upper(), alphas.lower())
            
            print(cap_word.searchString("More than Iron, more than Lead, more than Gold I need Electricity"))
        prints::
            ['More', 'Iron', 'Lead', 'Gold', 'I']
        cSsg|]\}}}|�qSrwrw)r�rvr�r�rwrwrxr��sz.ParserElement.searchString.<locals>.<listcomp>N)r"r�rr$r�)r�r-r�r3rwrwrx�searchString�szParserElement.searchStringc	csXd}d}x<|j||d�D]*\}}}|||�V|r>|dV|}qW||d�VdS)a[
        Generator method to split a string using the given expression as a separator.
        May be called with optional C{maxsplit} argument, to limit the number of splits;
        and the optional C{includeSeparators} argument (default=C{False}), indicating whether the separating
        matching text should be included in the split results.
        
        Example::        
            punc = oneOf(list(".,;:/-!?"))
            print(list(punc.split("This, this?, this sentence, is badly punctuated!")))
        prints::
            ['This', ' this', '', ' this sentence', ' is badly punctuated', '']
        r)r�N)r�)	r�r-�maxsplitZincludeSeparatorsZsplitsZlastrvr�r�rwrwrxr��s

zParserElement.splitcCsFt|t�rtj|�}t|t�s:tjdt|�tdd�dSt||g�S)a�
        Implementation of + operator - returns C{L{And}}. Adding strings to a ParserElement
        converts them to L{Literal}s by default.
        
        Example::
            greet = Word(alphas) + "," + Word(alphas) + "!"
            hello = "Hello, World!"
            print (hello, "->", greet.parseString(hello))
        Prints::
            Hello, World! -> ['Hello', ',', 'World', '!']
        z4Cannot combine element of type %s with ParserElementrq)�
stacklevelN)	rzr�r$rQ�warnings�warnr��
SyntaxWarningr)r�r�rwrwrxr�s



zParserElement.__add__cCsBt|t�rtj|�}t|t�s:tjdt|�tdd�dS||S)z]
        Implementation of + operator when left operand is not a C{L{ParserElement}}
        z4Cannot combine element of type %s with ParserElementrq)r�N)rzr�r$rQr�r�r�r�)r�r�rwrwrxrs



zParserElement.__radd__cCsLt|t�rtj|�}t|t�s:tjdt|�tdd�dSt|tj	�|g�S)zQ
        Implementation of - operator, returns C{L{And}} with error stop
        z4Cannot combine element of type %s with ParserElementrq)r�N)
rzr�r$rQr�r�r�r�r�
_ErrorStop)r�r�rwrwrx�__sub__s



zParserElement.__sub__cCsBt|t�rtj|�}t|t�s:tjdt|�tdd�dS||S)z]
        Implementation of - operator when left operand is not a C{L{ParserElement}}
        z4Cannot combine element of type %s with ParserElementrq)r�N)rzr�r$rQr�r�r�r�)r�r�rwrwrx�__rsub__ s



zParserElement.__rsub__cs�t|t�r|d}}n�t|t�r�|ddd�}|ddkrHd|df}t|dt�r�|ddkr�|ddkrvt��S|ddkr�t��S�|dt��SnJt|dt�r�t|dt�r�|\}}||8}ntdt|d�t|d���ntdt|���|dk�rtd��|dk�rtd��||k�o2dkn�rBtd	��|�r���fd
d��|�r�|dk�rt��|�}nt�g|��|�}n�|�}n|dk�r��}nt�g|�}|S)
a�
        Implementation of * operator, allows use of C{expr * 3} in place of
        C{expr + expr + expr}.  Expressions may also be multiplied by a 2-integer
        tuple, similar to C{{min,max}} multipliers in regular expressions.  Tuples
        may also include C{None} as in:
         - C{expr*(n,None)} or C{expr*(n,)} is equivalent
              to C{expr*n + L{ZeroOrMore}(expr)}
              (read as "at least n instances of C{expr}")
         - C{expr*(None,n)} is equivalent to C{expr*(0,n)}
              (read as "0 to n instances of C{expr}")
         - C{expr*(None,None)} is equivalent to C{L{ZeroOrMore}(expr)}
         - C{expr*(1,None)} is equivalent to C{L{OneOrMore}(expr)}

        Note that C{expr*(None,n)} does not raise an exception if
        more than n exprs exist in the input stream; that is,
        C{expr*(None,n)} does not enforce a maximum number of expr
        occurrences.  If this behavior is desired, then write
        C{expr*(None,n) + ~expr}
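# Illustrative sketch: the repetition forms described above, using an exact
# count and a (min, max) tuple.  Word(nums, exact=1) matches a single digit.
from pyparsing import Word, nums

two_digits = Word(nums, exact=1) * 2          # exactly two repetitions
two_to_four = Word(nums, exact=1) * (2, 4)    # between two and four repetitions

print(two_digits.parseString("12"))     # -> ['1', '2']
print(two_to_four.parseString("1234"))  # -> ['1', '2', '3', '4']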
        rNrqrrz7cannot multiply 'ParserElement' and ('%s','%s') objectsz0cannot multiply 'ParserElement' and '%s' objectsz/cannot multiply ParserElement by negative valuez@second tuple value must be greater or equal to first tuple valuez+cannot multiply ParserElement by 0 or (0,0)cs(|dkrt��|d��St��SdS)Nrr)r)�n)�makeOptionalListr�rwrxr�]sz/ParserElement.__mul__.<locals>.makeOptionalList)NN)	rzru�tupler2rr�r��
ValueErrorr)r�r�ZminElementsZoptElementsr�rw)r�r�rx�__mul__,sD







zParserElement.__mul__cCs
|j|�S)N)r�)r�r�rwrwrx�__rmul__pszParserElement.__rmul__cCsFt|t�rtj|�}t|t�s:tjdt|�tdd�dSt||g�S)zI
        Implementation of | operator - returns C{L{MatchFirst}}
        z4Cannot combine element of type %s with ParserElementrq)r�N)	rzr�r$rQr�r�r�r�r)r�r�rwrwrx�__or__ss



zParserElement.__or__cCsBt|t�rtj|�}t|t�s:tjdt|�tdd�dS||BS)z]
        Implementation of | operator when left operand is not a C{L{ParserElement}}
        z4Cannot combine element of type %s with ParserElementrq)r�N)rzr�r$rQr�r�r�r�)r�r�rwrwrx�__ror__s



zParserElement.__ror__cCsFt|t�rtj|�}t|t�s:tjdt|�tdd�dSt||g�S)zA
        Implementation of ^ operator - returns C{L{Or}}
        z4Cannot combine element of type %s with ParserElementrq)r�N)	rzr�r$rQr�r�r�r�r)r�r�rwrwrx�__xor__�s



zParserElement.__xor__cCsBt|t�rtj|�}t|t�s:tjdt|�tdd�dS||AS)z]
        Implementation of ^ operator when left operand is not a C{L{ParserElement}}
        z4Cannot combine element of type %s with ParserElementrq)r�N)rzr�r$rQr�r�r�r�)r�r�rwrwrx�__rxor__�s



zParserElement.__rxor__cCsFt|t�rtj|�}t|t�s:tjdt|�tdd�dSt||g�S)zC
        Implementation of & operator - returns C{L{Each}}
        z4Cannot combine element of type %s with ParserElementrq)r�N)	rzr�r$rQr�r�r�r�r)r�r�rwrwrx�__and__�s



zParserElement.__and__cCsBt|t�rtj|�}t|t�s:tjdt|�tdd�dS||@S)z]
        Implementation of & operator when left operand is not a C{L{ParserElement}}
        z4Cannot combine element of type %s with ParserElementrq)r�N)rzr�r$rQr�r�r�r�)r�r�rwrwrx�__rand__�s



zParserElement.__rand__cCst|�S)zE
        Implementation of ~ operator - returns C{L{NotAny}}
        )r)r�rwrwrx�
__invert__�szParserElement.__invert__cCs|dk	r|j|�S|j�SdS)a

        Shortcut for C{L{setResultsName}}, with C{listAllMatches=False}.
        
        If C{name} is given with a trailing C{'*'} character, then C{listAllMatches} will be
        passed as C{True}.
           
        If C{name} is omitted, same as calling C{L{copy}}.

        Example::
            # these are equivalent
            userdata = Word(alphas).setResultsName("name") + Word(nums+"-").setResultsName("socsecno")
            userdata = Word(alphas)("name") + Word(nums+"-")("socsecno")             
        N)rmr�)r�r�rwrwrx�__call__�s
zParserElement.__call__cCst|�S)z�
        Suppresses the output of this C{ParserElement}; useful to keep punctuation from
        cluttering up returned output.
        )r+)r�rwrwrx�suppress�szParserElement.suppresscCs
d|_|S)a
        Disables the skipping of whitespace before matching the characters in the
        C{ParserElement}'s defined pattern.  This is normally only used internally by
        the pyparsing module, but may be needed in some whitespace-sensitive grammars.
        F)rX)r�rwrwrx�leaveWhitespace�szParserElement.leaveWhitespacecCsd|_||_d|_|S)z8
        Overrides the default whitespace chars
        TF)rXrYrZ)r�rOrwrwrx�setWhitespaceChars�sz ParserElement.setWhitespaceCharscCs
d|_|S)z�
        Overrides default behavior to expand C{<TAB>}s to spaces before parsing the input string.
        Must be called before C{parseString} when the input grammar contains elements that
        match C{<TAB>} characters.
        T)r\)r�rwrwrx�
parseWithTabs�szParserElement.parseWithTabscCsLt|t�rt|�}t|t�r4||jkrH|jj|�n|jjt|j���|S)a�
        Define expression to be ignored (e.g., comments) while doing pattern
        matching; may be called repeatedly, to define multiple comment or other
        ignorable patterns.
        
        Example::
            patt = OneOrMore(Word(alphas))
            patt.parseString('ablaj /* comment */ lskjd') # -> ['ablaj']
            
            patt.ignore(cStyleComment)
            patt.parseString('ablaj /* comment */ lskjd') # -> ['ablaj', 'lskjd']
        )rzr�r+r]r�r�)r�r�rwrwrx�ignore�s


zParserElement.ignorecCs"|pt|pt|ptf|_d|_|S)zT
        Enable display of debugging messages while doing pattern matching.
        T)r/r2r4rcr^)r�ZstartActionZ
successActionZexceptionActionrwrwrx�setDebugActions
s
zParserElement.setDebugActionscCs|r|jttt�nd|_|S)a�
        Enable display of debugging messages while doing pattern matching.
        Set C{flag} to True to enable, False to disable.

        Example::
            wd = Word(alphas).setName("alphaword")
            integer = Word(nums).setName("numword")
            term = wd | integer
            
            # turn on debugging for wd
            wd.setDebug()

            OneOrMore(term).parseString("abc 123 xyz 890")
        
        prints::
            Match alphaword at loc 0(1,1)
            Matched alphaword -> ['abc']
            Match alphaword at loc 3(1,4)
            Exception raised:Expected alphaword (at char 4), (line:1, col:5)
            Match alphaword at loc 7(1,8)
            Matched alphaword -> ['xyz']
            Match alphaword at loc 11(1,12)
            Exception raised:Expected alphaword (at char 12), (line:1, col:13)
            Match alphaword at loc 15(1,16)
            Exception raised:Expected alphaword (at char 15), (line:1, col:16)

        The output shown is that produced by the default debug actions - custom debug actions can be
        specified using L{setDebugActions}. Prior to attempting
        to match the C{wd} expression, the debugging message C{"Match <exprname> at loc <n>(<line>,<col>)"}
        is shown. Then if the parse succeeds, a C{"Matched"} message is shown, or an C{"Exception raised"}
        message is shown. Also note the use of L{setName} to assign a human-readable name to the expression,
        which makes debugging and exception messages easier to understand - for instance, the default
        name created for the C{Word} expression without calling C{setName} is C{"W:(ABCD...)"}.
        F)r�r/r2r4r^)r��flagrwrwrx�setDebugs#zParserElement.setDebugcCs|jS)N)r�)r�rwrwrxr�@szParserElement.__str__cCst|�S)N)r�)r�rwrwrxr�CszParserElement.__repr__cCsd|_d|_|S)NT)r_rU)r�rwrwrxr�FszParserElement.streamlinecCsdS)Nrw)r�r�rwrwrx�checkRecursionKszParserElement.checkRecursioncCs|jg�dS)zj
        Check defined expressions for valid structure, check for infinite recursive definitions.
        N)r�)r��
validateTracerwrwrx�validateNszParserElement.validatecCs�y|j�}Wn2tk
r>t|d��}|j�}WdQRXYnXy|j||�Stk
r|}ztjrh�n|�WYdd}~XnXdS)z�
        Execute the parse expression on the given file or filename.
        If a filename is specified (instead of a file object),
        the entire file is opened, read, and closed before parsing.
        �rN)�readr��openr�rr$r�)r�Zfile_or_filenamer�Z
file_contents�fr3rwrwrx�	parseFileTszParserElement.parseFilecsHt|t�r"||kp t|�t|�kSt|t�r6|j|�Stt|�|kSdS)N)rzr$�varsr�r��super)r�r�)rHrwrx�__eq__hs



zParserElement.__eq__cCs
||kS)Nrw)r�r�rwrwrx�__ne__pszParserElement.__ne__cCstt|��S)N)�hash�id)r�rwrwrx�__hash__sszParserElement.__hash__cCs||kS)Nrw)r�r�rwrwrx�__req__vszParserElement.__req__cCs
||kS)Nrw)r�r�rwrwrx�__rne__yszParserElement.__rne__cCs0y|jt|�|d�dStk
r*dSXdS)a�
        Method for quick testing of a parser against a test string. Good for simple 
        inline microtests of sub expressions while building up larger parser.
           
        Parameters:
         - testString - to test against this expression for a match
         - parseAll - (default=C{True}) - flag to pass to C{L{parseString}} when running tests
            
        Example::
            expr = Word(nums)
            assert expr.matches("100")
        )r�TFN)r�r�r)r�Z
testStringr�rwrwrxr�|s

zParserElement.matches�#cCs�t|t�r"tttj|j�j���}t|t�r4t|�}g}g}d}	�x�|D�]�}
|dk	rb|j	|
d�sl|rx|
rx|j
|
�qH|
s~qHdj|�|
g}g}y:|
jdd�}
|j
|
|d�}|j
|j|d��|	o�|}	Wn�tk
�rx}
z�t|
t�r�dnd	}d|
k�r0|j
t|
j|
��|j
d
t|
j|
�dd|�n|j
d
|
jd|�|j
d
t|
��|	�ob|}	|
}WYdd}
~
XnDtk
�r�}z&|j
dt|��|	�o�|}	|}WYdd}~XnX|�r�|�r�|j
d	�tdj|��|j
|
|f�qHW|	|fS)a3
        Execute the parse expression on a series of test strings, showing each
        test, the parsed results or where the parse failed. Quick and easy way to
        run a parse expression against a list of sample strings.
           
        Parameters:
         - tests - a list of separate test strings, or a multiline string of test strings
         - parseAll - (default=C{True}) - flag to pass to C{L{parseString}} when running tests           
         - comment - (default=C{'#'}) - expression for indicating embedded comments in the test 
              string; pass None to disable comment filtering
         - fullDump - (default=C{True}) - dump results as list followed by results names in nested outline;
              if False, only dump nested list
         - printResults - (default=C{True}) prints test output to stdout
         - failureTests - (default=C{False}) indicates if these tests are expected to fail parsing

        Returns: a (success, results) tuple, where success indicates that all tests succeeded
        (or failed if C{failureTests} is True), and the results contain a list of lines of each 
        test's output
        
        Example::
            number_expr = pyparsing_common.number.copy()

            result = number_expr.runTests('''
                # unsigned integer
                100
                # negative integer
                -100
                # float with scientific notation
                6.02e23
                # integer with scientific notation
                1e-12
                ''')
            print("Success" if result[0] else "Failed!")

            result = number_expr.runTests('''
                # stray character
                100Z
                # missing leading digit before '.'
                -.100
                # too many '.'
                3.14.159
                ''', failureTests=True)
            print("Success" if result[0] else "Failed!")
        prints::
            # unsigned integer
            100
            [100]

            # negative integer
            -100
            [-100]

            # float with scientific notation
            6.02e23
            [6.02e+23]

            # integer with scientific notation
            1e-12
            [1e-12]

            Success
            
            # stray character
            100Z
               ^
            FAIL: Expected end of text (at char 3), (line:1, col:4)

            # missing leading digit before '.'
            -.100
            ^
            FAIL: Expected {real number with scientific notation | real number | signed integer} (at char 0), (line:1, col:1)

            # too many '.'
            3.14.159
                ^
            FAIL: Expected end of text (at char 4), (line:1, col:5)

            Success

        Each test string must be on a single line. If you want to test a string that spans multiple
        lines, create a test like this::

            expr.runTests(r"this is a test\n of strings that spans \n 3 lines")
        
        (Note that this is a raw string literal, you must include the leading 'r'.)
        TNFrz\n)r�)r z(FATAL)r�� rr�^zFAIL: zFAIL-EXCEPTION: )rzr�r�rvr{r��rstrip�
splitlinesrr�r�r�r�r�rrr!rGr�r9rKr,)r�Ztestsr�ZcommentZfullDumpZprintResultsZfailureTestsZ
allResultsZcomments�successrvr�resultr�rzr3rwrwrx�runTests�sNW



$


zParserElement.runTests)F)F)T)T)TT)TT)r�)F)N)T)F)T)Tr�TTF)Or�r�r�r�rNr��staticmethodrPrRr�r�rirmrur�rxr~rr�r�r�r�r�r�r�r�r�r�r�r�rr�r�r�rtr�r�r�r��_MAX_INTr�r�r�r�rrr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r��
__classcell__rwrw)rHrxr$8s�


&




H
"
2G+D
			

)

cs eZdZdZ�fdd�Z�ZS)r,zT
    Abstract C{ParserElement} subclass, for defining atomic matching patterns.
    cstt|�jdd�dS)NF)rg)r�r,r�)r�)rHrwrxr�	szToken.__init__)r�r�r�r�r�r�rwrw)rHrxr,	scs eZdZdZ�fdd�Z�ZS)r
z,
    An empty token, will always match.
    cs$tt|�j�d|_d|_d|_dS)Nr
TF)r�r
r�r�r[r`)r�)rHrwrxr�	szEmpty.__init__)r�r�r�r�r�r�rwrw)rHrxr
	scs*eZdZdZ�fdd�Zddd�Z�ZS)rz(
    A token that will never match.
    cs*tt|�j�d|_d|_d|_d|_dS)NrTFzUnmatchable token)r�rr�r�r[r`ra)r�)rHrwrxr�*	s
zNoMatch.__init__TcCst|||j|��dS)N)rra)r�r-r�rorwrwrxr�1	szNoMatch.parseImpl)T)r�r�r�r�r�r�r�rwrw)rHrxr&	scs*eZdZdZ�fdd�Zddd�Z�ZS)ra�
    Token to exactly match a specified string.
    
    Example::
        Literal('blah').parseString('blah')  # -> ['blah']
        Literal('blah').parseString('blahfooblah')  # -> ['blah']
        Literal('blah').parseString('bla')  # -> Exception: Expected "blah"
    
    For case-insensitive matching, use L{CaselessLiteral}.
    
    For keyword matching (force word break before and after the matched string),
    use L{Keyword} or L{CaselessKeyword}.
    cs�tt|�j�||_t|�|_y|d|_Wn*tk
rVtj	dt
dd�t|_YnXdt
|j�|_d|j|_d|_d|_dS)Nrz2null string passed to Literal; use Empty() insteadrq)r�z"%s"z	Expected F)r�rr��matchr��matchLen�firstMatchCharr�r�r�r�r
rHr�r�rar[r`)r��matchString)rHrwrxr�C	s

zLiteral.__init__TcCsJ|||jkr6|jdks&|j|j|�r6||j|jfSt|||j|��dS)Nrr)r�r��
startswithr�rra)r�r-r�rorwrwrxr�V	szLiteral.parseImpl)T)r�r�r�r�r�r�r�rwrw)rHrxr5	s
csLeZdZdZedZd�fdd�	Zddd	�Z�fd
d�Ze	dd
��Z
�ZS)ra\
    Token to exactly match a specified string as a keyword, that is, it must be
    immediately followed by a non-keyword character.  Compare with C{L{Literal}}:
     - C{Literal("if")} will match the leading C{'if'} in C{'ifAndOnlyIf'}.
     - C{Keyword("if")} will not; it will only match the leading C{'if'} in C{'if x=1'}, or C{'if(y==2)'}
    Accepts two optional constructor arguments in addition to the keyword string:
     - C{identChars} is a string of characters that would be valid identifier characters,
          defaulting to all alphanumerics + "_" and "$"
     - C{caseless} allows case-insensitive matching, default is C{False}.
       
    Example::
        Keyword("start").parseString("start")  # -> ['start']
        Keyword("start").parseString("starting")  # -> Exception

    For case-insensitive matching, use L{CaselessKeyword}.
    z_$NFcs�tt|�j�|dkrtj}||_t|�|_y|d|_Wn$tk
r^t	j
dtdd�YnXd|j|_d|j|_
d|_d|_||_|r�|j�|_|j�}t|�|_dS)Nrz2null string passed to Keyword; use Empty() insteadrq)r�z"%s"z	Expected F)r�rr��DEFAULT_KEYWORD_CHARSr�r�r�r�r�r�r�r�r�rar[r`�caseless�upper�
caselessmatchr��
identChars)r�r�r�r�)rHrwrxr�q	s&

zKeyword.__init__TcCs|jr|||||j�j�|jkr�|t|�|jksL|||jj�|jkr�|dksj||dj�|jkr�||j|jfSnv|||jkr�|jdks�|j|j|�r�|t|�|jks�|||j|jkr�|dks�||d|jkr�||j|jfSt	|||j
|��dS)Nrrr)r�r�r�r�r�r�r�r�r�rra)r�r-r�rorwrwrxr��	s*&zKeyword.parseImplcstt|�j�}tj|_|S)N)r�rr�r�r�)r�r�)rHrwrxr��	szKeyword.copycCs
|t_dS)z,Overrides the default Keyword chars
        N)rr�)rOrwrwrx�setDefaultKeywordChars�	szKeyword.setDefaultKeywordChars)NF)T)r�r�r�r�r3r�r�r�r�r�r�r�rwrw)rHrxr^	s
cs*eZdZdZ�fdd�Zddd�Z�ZS)ral
    Token to match a specified string, ignoring case of letters.
    Note: the matched results will always be in the case of the given
    match string, NOT the case of the input text.

    Example::
        OneOrMore(CaselessLiteral("CMD")).parseString("cmd CMD Cmd10") # -> ['CMD', 'CMD', 'CMD']
        
    (Contrast with example for L{CaselessKeyword}.)
    cs6tt|�j|j��||_d|j|_d|j|_dS)Nz'%s'z	Expected )r�rr�r��returnStringr�ra)r�r�)rHrwrxr��	szCaselessLiteral.__init__TcCs@||||j�j�|jkr,||j|jfSt|||j|��dS)N)r�r�r�r�rra)r�r-r�rorwrwrxr��	szCaselessLiteral.parseImpl)T)r�r�r�r�r�r�r�rwrw)rHrxr�	s
cs,eZdZdZd�fdd�	Zd	dd�Z�ZS)
rz�
    Caseless version of L{Keyword}.

    Example::
        OneOrMore(CaselessKeyword("CMD")).parseString("cmd CMD Cmd10") # -> ['CMD', 'CMD']
        
    (Contrast with example for L{CaselessLiteral}.)
    Ncstt|�j||dd�dS)NT)r�)r�rr�)r�r�r�)rHrwrxr��	szCaselessKeyword.__init__TcCsj||||j�j�|jkrV|t|�|jksF|||jj�|jkrV||j|jfSt|||j|��dS)N)r�r�r�r�r�r�rra)r�r-r�rorwrwrxr��	s*zCaselessKeyword.parseImpl)N)T)r�r�r�r�r�r�r�rwrw)rHrxr�	scs,eZdZdZd�fdd�	Zd	dd�Z�ZS)
rlax
    A variation on L{Literal} which matches "close" matches, that is, 
    strings with at most 'n' mismatching characters. C{CloseMatch} takes parameters:
     - C{match_string} - string to be matched
     - C{maxMismatches} - (C{default=1}) maximum number of mismatches allowed to count as a match
    
    The results from a successful parse will contain the matched text from the input string and the following named results:
     - C{mismatches} - a list of the positions within the match_string where mismatches were found
     - C{original} - the original match_string used to compare against the input string
    
    If C{mismatches} is an empty list, then the match was an exact match.
    
    Example::
        patt = CloseMatch("ATCATCGAATGGA")
        patt.parseString("ATCATCGAAXGGA") # -> (['ATCATCGAAXGGA'], {'mismatches': [[9]], 'original': ['ATCATCGAATGGA']})
        patt.parseString("ATCAXCGAAXGGA") # -> Exception: Expected 'ATCATCGAATGGA' (with up to 1 mismatches) (at char 0), (line:1, col:1)

        # exact match
        patt.parseString("ATCATCGAATGGA") # -> (['ATCATCGAATGGA'], {'mismatches': [[]], 'original': ['ATCATCGAATGGA']})

        # close match allowing up to 2 mismatches
        patt = CloseMatch("ATCATCGAATGGA", maxMismatches=2)
        patt.parseString("ATCAXCGAAXGGA") # -> (['ATCAXCGAAXGGA'], {'mismatches': [[4, 9]], 'original': ['ATCATCGAATGGA']})
    rrcsBtt|�j�||_||_||_d|j|jf|_d|_d|_dS)Nz&Expected %r (with up to %d mismatches)F)	r�rlr�r��match_string�
maxMismatchesrar`r[)r�r�r�)rHrwrxr��	szCloseMatch.__init__TcCs�|}t|�}|t|j�}||kr�|j}d}g}	|j}
x�tt|||�|j��D]0\}}|\}}
||
krP|	j|�t|	�|
krPPqPW|d}t|||�g�}|j|d<|	|d<||fSt|||j|��dS)Nrrr�original�
mismatches)	r�r�r�r�r�r�r"rra)r�r-r�ro�startr��maxlocr�Zmatch_stringlocr�r�Zs_m�src�mat�resultsrwrwrxr��	s("

zCloseMatch.parseImpl)rr)T)r�r�r�r�r�r�r�rwrw)rHrxrl�	s	cs8eZdZdZd
�fdd�	Zdd	d
�Z�fdd�Z�ZS)r/a	
    Token for matching words composed of allowed character sets.
    Defined with string containing all allowed initial characters,
    an optional string containing allowed body characters (if omitted,
    defaults to the initial character set), and an optional minimum,
    maximum, and/or exact length.  The default value for C{min} is 1 (a
    minimum value < 1 is not valid); the default values for C{max} and C{exact}
    are 0, meaning no maximum or exact length restriction. An optional
    C{excludeChars} parameter can list characters that might be found in 
    the input C{bodyChars} string; useful to define a word of all printables
    except for one or two characters, for instance.
    
    L{srange} is useful for defining custom character set strings for defining 
    C{Word} expressions, using range notation from regular expression character sets.
    
    A common mistake is to use C{Word} to match a specific literal string, as in 
    C{Word("Address")}. Remember that C{Word} uses the string argument to define
    I{sets} of matchable characters. This expression would match "Add", "AAA",
    "dAred", or any other word made up of the characters 'A', 'd', 'r', 'e', and 's'.
    To match an exact literal string, use L{Literal} or L{Keyword}.

    pyparsing includes helper strings for building Words:
     - L{alphas}
     - L{nums}
     - L{alphanums}
     - L{hexnums}
     - L{alphas8bit} (alphabetic characters in ASCII range 128-255 - accented, tilded, umlauted, etc.)
     - L{punc8bit} (non-alphabetic characters in ASCII range 128-255 - currency, symbols, superscripts, diacriticals, etc.)
     - L{printables} (any non-whitespace character)

    Example::
        # a word composed of digits
        integer = Word(nums) # equivalent to Word("0123456789") or Word(srange("0-9"))
        
        # a word with a leading capital, and zero or more lowercase
        capital_word = Word(alphas.upper(), alphas.lower())

        # hostnames are alphanumeric, with leading alpha, and '-'
        hostname = Word(alphas, alphanums+'-')
        
        # roman numeral (not a strict parser, accepts invalid mix of characters)
        roman = Word("IVXLCDM")
        
        # any string of non-whitespace characters, except for ','
        csv_value = Word(printables, excludeChars=",")
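# Illustrative sketch: the min/max/exact length arguments mentioned above.
from pyparsing import Word, nums, ParseException

area_code = Word(nums, exact=3)
print(area_code.parseString("415"))   # -> ['415']
try:
    area_code.parseString("41")       # too short for exact=3
except ParseException as pe:
    print("no match:", pe)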
    NrrrFcs�tt|�j��rFdj�fdd�|D��}|rFdj�fdd�|D��}||_t|�|_|rl||_t|�|_n||_t|�|_|dk|_	|dkr�t
d��||_|dkr�||_nt
|_|dkr�||_||_t|�|_d|j|_d	|_||_d
|j|jk�r�|dk�r�|dk�r�|dk�r�|j|jk�r8dt|j�|_nHt|j�dk�rfdtj|j�t|j�f|_nd
t|j�t|j�f|_|j�r�d|jd|_ytj|j�|_Wntk
�r�d|_YnXdS)Nr�c3s|]}|�kr|VqdS)Nrw)r�r�)�excludeCharsrwrxr�7
sz Word.__init__.<locals>.<genexpr>c3s|]}|�kr|VqdS)Nrw)r�r�)r�rwrxr�9
srrrzZcannot specify a minimum length < 1; use Optional(Word()) if zero-length word is permittedz	Expected Fr�z[%s]+z%s[%s]*z	[%s][%s]*z\b)r�r/r�r��
initCharsOrigr��	initChars�
bodyCharsOrig�	bodyChars�maxSpecifiedr��minLen�maxLenr�r�r�rar`�	asKeyword�_escapeRegexRangeChars�reStringr�rd�escape�compilerK)r�rr�min�max�exactrr�)rH)r�rxr�4
sT



0
z
Word.__init__Tc
CsD|jr<|jj||�}|s(t|||j|��|j�}||j�fS|||jkrZt|||j|��|}|d7}t|�}|j}||j	}t
||�}x ||kr�|||kr�|d7}q�Wd}	|||jkr�d}	|jr�||kr�|||kr�d}	|j
�r|dk�r||d|k�s||k�r|||k�rd}	|	�r4t|||j|��||||�fS)NrrFTr)rdr�rra�end�grouprr�rrrrrr)
r�r-r�ror�r�r�Z	bodycharsr�ZthrowExceptionrwrwrxr�j
s6

4zWord.parseImplcstytt|�j�Stk
r"YnX|jdkrndd�}|j|jkr^d||j�||j�f|_nd||j�|_|jS)NcSs$t|�dkr|dd�dS|SdS)N�z...)r�)r�rwrwrx�
charsAsStr�
sz Word.__str__.<locals>.charsAsStrz	W:(%s,%s)zW:(%s))r�r/r�rKrUr�r)r�r)rHrwrxr��
s
zWord.__str__)NrrrrFN)T)r�r�r�r�r�r�r�r�rwrw)rHrxr/
s.6
#csFeZdZdZeejd��Zd�fdd�	Zddd�Z	�fd	d
�Z
�ZS)
r'a�
    Token for matching strings that match a given regular expression.
    Defined with string specifying the regular expression in a form recognized by the inbuilt Python re module.
    If the given regex contains named groups (defined using C{(?P<name>...)}), these will be preserved as 
    named parse results.

    Example::
        realnum = Regex(r"[+-]?\d+\.\d*")
        date = Regex(r'(?P<year>\d{4})-(?P<month>\d\d?)-(?P<day>\d\d?)')
        # ref: http://stackoverflow.com/questions/267399/how-do-you-match-only-valid-roman-numerals-with-a-regular-expression
        roman = Regex(r"M{0,4}(CM|CD|D?C{0,3})(XC|XL|L?X{0,3})(IX|IV|V?I{0,3})")
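# Illustrative sketch: named groups in the pattern come back as named parse
# results, so they can be read by attribute, as noted above.
from pyparsing import Regex

date = Regex(r'(?P<year>\d{4})-(?P<month>\d\d?)-(?P<day>\d\d?)')
result = date.parseString("2014-07-04")
print(result.year, result.month, result.day)   # -> 2014 07 04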
    z[A-Z]rcs�tt|�j�t|t�r�|s,tjdtdd�||_||_	yt
j|j|j	�|_
|j|_Wq�t
jk
r�tjd|tdd��Yq�Xn2t|tj�r�||_
t|�|_|_||_	ntd��t|�|_d|j|_d|_d|_d	S)
z�The parameters C{pattern} and C{flags} are passed to the C{re.compile()} function as-is. See the Python C{re} module for an explanation of the acceptable patterns and flags.z0null string passed to Regex; use Empty() insteadrq)r�z$invalid pattern (%s) passed to RegexzCRegex may only be constructed with a string or a compiled RE objectz	Expected FTN)r�r'r�rzr�r�r�r��pattern�flagsrdr
r�
sre_constants�error�compiledREtyper{r�r�r�rar`r[)r�rr)rHrwrxr��
s.





zRegex.__init__TcCsd|jj||�}|s"t|||j|��|j�}|j�}t|j��}|r\x|D]}||||<qHW||fS)N)rdr�rrar�	groupdictr"r)r�r-r�ror��dr�r�rwrwrxr��
s
zRegex.parseImplcsDytt|�j�Stk
r"YnX|jdkr>dt|j�|_|jS)NzRe:(%s))r�r'r�rKrUr�r)r�)rHrwrxr��
s
z
Regex.__str__)r)T)r�r�r�r�r�rdr
rr�r�r�r�rwrw)rHrxr'�
s
"

cs8eZdZdZd�fdd�	Zddd�Z�fd	d
�Z�ZS)
r%a�
    Token for matching strings that are delimited by quoting characters.
    
    Defined with the following parameters:
        - quoteChar - string of one or more characters defining the quote delimiting string
        - escChar - character to escape quotes, typically backslash (default=C{None})
        - escQuote - special quote sequence to escape an embedded quote string (such as SQL's "" to escape an embedded ") (default=C{None})
        - multiline - boolean indicating whether quotes can span multiple lines (default=C{False})
        - unquoteResults - boolean indicating whether the matched text should be unquoted (default=C{True})
        - endQuoteChar - string of one or more characters defining the end of the quote delimited string (default=C{None} => same as quoteChar)
        - convertWhitespaceEscapes - convert escaped whitespace (C{'\t'}, C{'\n'}, etc.) to actual whitespace (default=C{True})

    Example::
        qs = QuotedString('"')
        print(qs.searchString('lsjdf "This is the quote" sldjf'))
        complex_qs = QuotedString('{{', endQuoteChar='}}')
        print(complex_qs.searchString('lsjdf {{This is the "quote"}} sldjf'))
        sql_qs = QuotedString('"', escQuote='""')
        print(sql_qs.searchString('lsjdf "This is the quote with ""embedded"" quotes" sldjf'))
    prints::
        [['This is the quote']]
        [['This is the "quote"']]
        [['This is the quote with "embedded" quotes']]
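# Illustrative sketch: the escChar parameter described above, using a
# backslash to escape quotes embedded in the quoted string.
from pyparsing import QuotedString

qs = QuotedString('"', escChar='\\')
print(qs.parseString(r'"a \"quoted\" word"'))   # -> ['a "quoted" word']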
    NFTcsNtt��j�|j�}|s0tjdtdd�t��|dkr>|}n"|j�}|s`tjdtdd�t��|�_t	|��_
|d�_|�_t	|��_
|�_|�_|�_|�_|r�tjtjB�_dtj�j�t�jd�|dk	r�t|�p�df�_n<d�_dtj�j�t�jd�|dk	�rt|��pdf�_t	�j�d	k�rp�jd
dj�fdd
�tt	�j�d	dd�D��d7_|�r��jdtj|�7_|�r��jdtj|�7_tj�j�d�_�jdtj�j�7_ytj�j�j��_�j�_Wn0tjk
�r&tjd�jtdd��YnXt ���_!d�j!�_"d�_#d�_$dS)Nz$quoteChar cannot be the empty stringrq)r�z'endQuoteChar cannot be the empty stringrz%s(?:[^%s%s]r�z%s(?:[^%s\n\r%s]rrz|(?:z)|(?:c3s4|],}dtj�jd|��t�j|�fVqdS)z%s[^%s]N)rdr	�endQuoteCharr)r�r�)r�rwrxr�/sz(QuotedString.__init__.<locals>.<genexpr>�)z|(?:%s)z|(?:%s.)z(.)z)*%sz$invalid pattern (%s) passed to Regexz	Expected FTrs)%r�r%r�r�r�r�r��SyntaxError�	quoteCharr��quoteCharLen�firstQuoteCharr�endQuoteCharLen�escChar�escQuote�unquoteResults�convertWhitespaceEscapesrd�	MULTILINE�DOTALLrr	rrr�r��escCharReplacePatternr
rrrr�r�rar`r[)r�rr r!Z	multiliner"rr#)rH)r�rxr�sf




6

zQuotedString.__init__c	Cs�|||jkr|jj||�pd}|s4t|||j|��|j�}|j�}|jr�||j|j	�}t
|t�r�d|kr�|jr�ddddd�}x |j
�D]\}}|j||�}q�W|jr�tj|jd|�}|jr�|j|j|j�}||fS)N�\�	r��
)z\tz\nz\fz\rz\g<1>)rrdr�rrarrr"rrrzr�r#r�r�r r�r&r!r)	r�r-r�ror�r�Zws_mapZwslitZwscharrwrwrxr�Gs( 
zQuotedString.parseImplcsFytt|�j�Stk
r"YnX|jdkr@d|j|jf|_|jS)Nz.quoted string, starting with %s ending with %s)r�r%r�rKrUrr)r�)rHrwrxr�js
zQuotedString.__str__)NNFTNT)T)r�r�r�r�r�r�r�r�rwrw)rHrxr%�
sA
#cs8eZdZdZd�fdd�	Zddd�Z�fd	d
�Z�ZS)
r	a�
    Token for matching words composed of characters I{not} in a given set (will
    include whitespace in matched characters if not listed in the provided exclusion set - see example).
    Defined with string containing all disallowed characters, and an optional
    minimum, maximum, and/or exact length.  The default value for C{min} is 1 (a
    minimum value < 1 is not valid); the default values for C{max} and C{exact}
    are 0, meaning no maximum or exact length restriction.

    Example::
        # define a comma-separated-value as anything that is not a ','
        csv_value = CharsNotIn(',')
        print(delimitedList(csv_value).parseString("dkls,lsdkjf,s12 34,@!#,213"))
    prints::
        ['dkls', 'lsdkjf', 's12 34', '@!#', '213']
    rrrcs�tt|�j�d|_||_|dkr*td��||_|dkr@||_nt|_|dkrZ||_||_t	|�|_
d|j
|_|jdk|_d|_
dS)NFrrzfcannot specify a minimum length < 1; use Optional(CharsNotIn()) if zero-length char group is permittedrz	Expected )r�r	r�rX�notCharsr�rrr�r�r�rar[r`)r�r+rrr
)rHrwrxr��s 
zCharsNotIn.__init__TcCs�|||jkrt|||j|��|}|d7}|j}t||jt|��}x ||krd|||krd|d7}qFW|||jkr�t|||j|��||||�fS)Nrr)r+rrarrr�r)r�r-r�ror�Znotchars�maxlenrwrwrxr��s
zCharsNotIn.parseImplcsdytt|�j�Stk
r"YnX|jdkr^t|j�dkrRd|jdd�|_nd|j|_|jS)Nrz
!W:(%s...)z!W:(%s))r�r	r�rKrUr�r+)r�)rHrwrxr��s
zCharsNotIn.__str__)rrrr)T)r�r�r�r�r�r�r�r�rwrw)rHrxr	vs
cs<eZdZdZdddddd�Zd�fdd�	Zddd�Z�ZS)r.a�
    Special matching class for matching whitespace.  Normally, whitespace is ignored
    by pyparsing grammars.  This class is included when some whitespace structures
    are significant.  Define with a string containing the whitespace characters to be
    matched; default is C{" \t\r\n"}.  Also takes optional C{min}, C{max}, and C{exact} arguments,
    as defined for the C{L{Word}} class.
    z<SPC>z<TAB>z<LF>z<CR>z<FF>)r�r(rr*r)� 	
rrrcs�tt��j�|�_�jdj�fdd��jD���djdd��jD���_d�_d�j�_	|�_
|dkrt|�_nt�_|dkr�|�_|�_
dS)Nr�c3s|]}|�jkr|VqdS)N)�
matchWhite)r�r�)r�rwrxr��sz!White.__init__.<locals>.<genexpr>css|]}tj|VqdS)N)r.�	whiteStrs)r�r�rwrwrxr��sTz	Expected r)
r�r.r�r.r�r�rYr�r[rarrr�)r�Zwsrrr
)rH)r�rxr��s zWhite.__init__TcCs�|||jkrt|||j|��|}|d7}||j}t|t|��}x"||krd|||jkrd|d7}qDW|||jkr�t|||j|��||||�fS)Nrr)r.rrarrr�r)r�r-r�ror�r�rwrwrxr��s
zWhite.parseImpl)r-rrrr)T)r�r�r�r�r/r�r�r�rwrw)rHrxr.�scseZdZ�fdd�Z�ZS)�_PositionTokencs(tt|�j�|jj|_d|_d|_dS)NTF)r�r0r�rHr�r�r[r`)r�)rHrwrxr��s
z_PositionToken.__init__)r�r�r�r�r�rwrw)rHrxr0�sr0cs2eZdZdZ�fdd�Zdd�Zd	dd�Z�ZS)
rzb
    Token to advance to a specific column of input text; useful for tabular report scraping.
    cstt|�j�||_dS)N)r�rr�r9)r��colno)rHrwrxr��szGoToColumn.__init__cCs`t||�|jkr\t|�}|jr*|j||�}x0||krZ||j�rZt||�|jkrZ|d7}q,W|S)Nrr)r9r�r]r��isspace)r�r-r�r�rwrwrxr��s&zGoToColumn.preParseTcCsDt||�}||jkr"t||d|��||j|}|||�}||fS)NzText not in expected column)r9r)r�r-r�roZthiscolZnewlocr�rwrwrxr�s

zGoToColumn.parseImpl)T)r�r�r�r�r�r�r�r�rwrw)rHrxr�s	cs*eZdZdZ�fdd�Zddd�Z�ZS)ra�
    Matches if current position is at the beginning of a line within the parse string
    
    Example::
    
        test = '''        AAA this line
        AAA and this line
          AAA but not this one
        B AAA and definitely not this one
        '''

        for t in (LineStart() + 'AAA' + restOfLine).searchString(test):
            print(t)
    
    Prints::
        ['AAA', ' this line']
        ['AAA', ' and this line']    

    cstt|�j�d|_dS)NzExpected start of line)r�rr�ra)r�)rHrwrxr�&szLineStart.__init__TcCs*t||�dkr|gfSt|||j|��dS)Nrr)r9rra)r�r-r�rorwrwrxr�*szLineStart.parseImpl)T)r�r�r�r�r�r�r�rwrw)rHrxrscs*eZdZdZ�fdd�Zddd�Z�ZS)rzU
    Matches if current position is at the end of a line within the parse string
    cs,tt|�j�|jtjjdd��d|_dS)Nrr�zExpected end of line)r�rr�r�r$rNr�ra)r�)rHrwrxr�3szLineEnd.__init__TcCsb|t|�kr6||dkr$|ddfSt|||j|��n(|t|�krN|dgfSt|||j|��dS)Nrrr)r�rra)r�r-r�rorwrwrxr�8szLineEnd.parseImpl)T)r�r�r�r�r�r�r�rwrw)rHrxr/scs*eZdZdZ�fdd�Zddd�Z�ZS)r*zM
    Matches if current position is at the beginning of the parse string
    cstt|�j�d|_dS)NzExpected start of text)r�r*r�ra)r�)rHrwrxr�GszStringStart.__init__TcCs0|dkr(||j|d�kr(t|||j|��|gfS)Nr)r�rra)r�r-r�rorwrwrxr�KszStringStart.parseImpl)T)r�r�r�r�r�r�r�rwrw)rHrxr*Cscs*eZdZdZ�fdd�Zddd�Z�ZS)r)zG
    Matches if current position is at the end of the parse string
    cstt|�j�d|_dS)NzExpected end of text)r�r)r�ra)r�)rHrwrxr�VszStringEnd.__init__TcCs^|t|�krt|||j|��n<|t|�kr6|dgfS|t|�krJ|gfSt|||j|��dS)Nrr)r�rra)r�r-r�rorwrwrxr�ZszStringEnd.parseImpl)T)r�r�r�r�r�r�r�rwrw)rHrxr)Rscs.eZdZdZef�fdd�	Zddd�Z�ZS)r1ap
    Matches if the current position is at the beginning of a Word, and
    is not preceded by any character in a given set of C{wordChars}
    (default=C{printables}). To emulate the C{\b} behavior of regular expressions,
    use C{WordStart(alphanums)}. C{WordStart} will also match at the beginning of
    the string being parsed, or at the beginning of a line.
    cs"tt|�j�t|�|_d|_dS)NzNot at the start of a word)r�r1r�r��	wordCharsra)r�r3)rHrwrxr�ls
zWordStart.__init__TcCs@|dkr8||d|jks(|||jkr8t|||j|��|gfS)Nrrr)r3rra)r�r-r�rorwrwrxr�qs
zWordStart.parseImpl)T)r�r�r�r�rVr�r�r�rwrw)rHrxr1dscs.eZdZdZef�fdd�	Zddd�Z�ZS)r0aZ
    Matches if the current position is at the end of a Word, and
    is not followed by any character in a given set of C{wordChars}
    (default=C{printables}). To emulate the C{} behavior of regular expressions,
    use C{WordEnd(alphanums)}. C{WordEnd} will also match at the end of
    the string being parsed, or at the end of a line.
    cs(tt|�j�t|�|_d|_d|_dS)NFzNot at the end of a word)r�r0r�r�r3rXra)r�r3)rHrwrxr��s
zWordEnd.__init__TcCsPt|�}|dkrH||krH|||jks8||d|jkrHt|||j|��|gfS)Nrrr)r�r3rra)r�r-r�ror�rwrwrxr��szWordEnd.parseImpl)T)r�r�r�r�rVr�r�r�rwrw)rHrxr0xscs�eZdZdZd�fdd�	Zdd�Zdd�Zd	d
�Z�fdd�Z�fd
d�Z	�fdd�Z
d�fdd�	Zgfdd�Z�fdd�Z
�ZS)r z^
    Abstract subclass of ParserElement, for combining and post-processing parsed tokens.
    Fcs�tt|�j|�t|t�r"t|�}t|t�r<tj|�g|_	njt|t
j�rzt|�}tdd�|D��rnt
tj|�}t|�|_	n,yt|�|_	Wntk
r�|g|_	YnXd|_dS)Ncss|]}t|t�VqdS)N)rzr�)r�r.rwrwrxr��sz+ParseExpression.__init__.<locals>.<genexpr>F)r�r r�rzr�r�r�r$rQ�exprsr��Iterable�allrvr�re)r�r4rg)rHrwrxr��s

zParseExpression.__init__cCs
|j|S)N)r4)r�r�rwrwrxr��szParseExpression.__getitem__cCs|jj|�d|_|S)N)r4r�rU)r�r�rwrwrxr��szParseExpression.appendcCs4d|_dd�|jD�|_x|jD]}|j�q W|S)z~Extends C{leaveWhitespace} defined in base class, and also invokes C{leaveWhitespace} on
           all contained expressions.FcSsg|]}|j��qSrw)r�)r�r�rwrwrxr��sz3ParseExpression.leaveWhitespace.<locals>.<listcomp>)rXr4r�)r�r�rwrwrxr��s
zParseExpression.leaveWhitespacecszt|t�rF||jkrvtt|�j|�xP|jD]}|j|jd�q,Wn0tt|�j|�x|jD]}|j|jd�q^W|S)Nrrrsrs)rzr+r]r�r r�r4)r�r�r�)rHrwrxr��s

zParseExpression.ignorecsLytt|�j�Stk
r"YnX|jdkrFd|jjt|j�f|_|jS)Nz%s:(%s))	r�r r�rKrUrHr�r�r4)r�)rHrwrxr��s
zParseExpression.__str__cs0tt|�j�x|jD]}|j�qWt|j�dk�r|jd}t||j�r�|jr�|jdkr�|j	r�|jdd�|jdg|_d|_
|j|jO_|j|jO_|jd}t||j�o�|jo�|jdko�|j	�r|jdd�|jdd�|_d|_
|j|jO_|j|jO_dt
|�|_|S)Nrqrrrz	Expected rsrs)r�r r�r4r�rzrHrSrVr^rUr[r`r�ra)r�r�r�)rHrwrxr��s0




zParseExpression.streamlinecstt|�j||�}|S)N)r�r rm)r�r�rlr�)rHrwrxrm�szParseExpression.setResultsNamecCs:|dd�|g}x|jD]}|j|�qW|jg�dS)N)r4r�r�)r�r��tmpr�rwrwrxr��szParseExpression.validatecs$tt|�j�}dd�|jD�|_|S)NcSsg|]}|j��qSrw)r�)r�r�rwrwrxr��sz(ParseExpression.copy.<locals>.<listcomp>)r�r r�r4)r�r�)rHrwrxr��szParseExpression.copy)F)F)r�r�r�r�r�r�r�r�r�r�r�rmr�r�r�rwrw)rHrxr �s	
"csTeZdZdZGdd�de�Zd�fdd�	Zddd�Zd	d
�Zdd�Z	d
d�Z
�ZS)ra

    Requires all given C{ParseExpression}s to be found in the given order.
    Expressions may be separated by whitespace.
    May be constructed using the C{'+'} operator.
    May also be constructed using the C{'-'} operator, which will suppress backtracking.

    Example::
        integer = Word(nums)
        name_expr = OneOrMore(Word(alphas))

        expr = And([integer("id"),name_expr("name"),integer("age")])
        # more easily written as:
        expr = integer("id") + name_expr("name") + integer("age")
    cseZdZ�fdd�Z�ZS)zAnd._ErrorStopcs&ttj|�j||�d|_|j�dS)N�-)r�rr�r�r�r�)r�r�r�)rHrwrxr�
szAnd._ErrorStop.__init__)r�r�r�r�r�rwrw)rHrxr�
sr�TcsRtt|�j||�tdd�|jD��|_|j|jdj�|jdj|_d|_	dS)Ncss|]}|jVqdS)N)r[)r�r�rwrwrxr�
szAnd.__init__.<locals>.<genexpr>rT)
r�rr�r6r4r[r�rYrXre)r�r4rg)rHrwrxr�
s
zAnd.__init__c	Cs|jdj|||dd�\}}d}x�|jdd�D]�}t|tj�rFd}q0|r�y|j|||�\}}Wq�tk
rv�Yq�tk
r�}zd|_tj|��WYdd}~Xq�t	k
r�t|t
|�|j|��Yq�Xn|j|||�\}}|s�|j�r0||7}q0W||fS)NrF)rprrT)
r4rtrzrr�r#r�
__traceback__r�r�r�rar�)	r�r-r�ro�
resultlistZ	errorStopr�Z
exprtokensr�rwrwrxr�
s(z
And.parseImplcCst|t�rtj|�}|j|�S)N)rzr�r$rQr�)r�r�rwrwrxr5
s

zAnd.__iadd__cCs8|dd�|g}x |jD]}|j|�|jsPqWdS)N)r4r�r[)r�r��subRecCheckListr�rwrwrxr�:
s

zAnd.checkRecursioncCs@t|d�r|jS|jdkr:ddjdd�|jD��d|_|jS)Nr��{r�css|]}t|�VqdS)N)r�)r�r�rwrwrxr�F
szAnd.__str__.<locals>.<genexpr>�})r�r�rUr�r4)r�rwrwrxr�A
s


 zAnd.__str__)T)T)r�r�r�r�r
r�r�r�rr�r�r�rwrw)rHrxr�s
csDeZdZdZd�fdd�	Zddd�Zdd	�Zd
d�Zdd
�Z�Z	S)ra�
    Requires that at least one C{ParseExpression} is found.
    If two expressions match, the expression that matches the longest string will be used.
    May be constructed using the C{'^'} operator.

    Example::
        # construct Or using '^' operator
        
        number = Word(nums) ^ Combine(Word(nums) + '.' + Word(nums))
        print(number.searchString("123 3.1416 789"))
    prints::
        [['123'], ['3.1416'], ['789']]
    Fcs:tt|�j||�|jr0tdd�|jD��|_nd|_dS)Ncss|]}|jVqdS)N)r[)r�r�rwrwrxr�\
szOr.__init__.<locals>.<genexpr>T)r�rr�r4rr[)r�r4rg)rHrwrxr�Y
szOr.__init__TcCsTd}d}g}x�|jD]�}y|j||�}Wnvtk
rd}	z d|	_|	j|krT|	}|	j}WYdd}	~	Xqtk
r�t|�|kr�t|t|�|j|�}t|�}YqX|j||f�qW|�r*|j	dd�d�x`|D]X\}
}y|j
|||�Stk
�r$}	z"d|	_|	j|k�r|	}|	j}WYdd}	~	Xq�Xq�W|dk	�rB|j|_|�nt||d|��dS)NrrcSs
|dS)Nrrw)�xrwrwrxryu
szOr.parseImpl.<locals>.<lambda>)r�z no defined alternatives to matchrs)r4r�rr9r�r�r�rar��sortrtr�)r�r-r�ro�	maxExcLoc�maxExceptionr�r�Zloc2r��_rwrwrxr�`
s<

zOr.parseImplcCst|t�rtj|�}|j|�S)N)rzr�r$rQr�)r�r�rwrwrx�__ixor__�
s

zOr.__ixor__cCs@t|d�r|jS|jdkr:ddjdd�|jD��d|_|jS)Nr�r<z ^ css|]}t|�VqdS)N)r�)r�r�rwrwrxr��
szOr.__str__.<locals>.<genexpr>r=)r�r�rUr�r4)r�rwrwrxr��
s


 z
Or.__str__cCs0|dd�|g}x|jD]}|j|�qWdS)N)r4r�)r�r�r;r�rwrwrxr��
szOr.checkRecursion)F)T)
r�r�r�r�r�r�rCr�r�r�rwrw)rHrxrK
s

&	csDeZdZdZd�fdd�	Zddd�Zdd	�Zd
d�Zdd
�Z�Z	S)ra�
    Requires that at least one C{ParseExpression} is found.
    If two expressions match, the first one listed is the one that will match.
    May be constructed using the C{'|'} operator.

    Example::
        # construct MatchFirst using '|' operator
        
        # watch the order of expressions to match
        number = Word(nums) | Combine(Word(nums) + '.' + Word(nums))
        print(number.searchString("123 3.1416 789")) #  Fail! -> [['123'], ['3'], ['1416'], ['789']]

        # put more selective expression first
        number = Combine(Word(nums) + '.' + Word(nums)) | Word(nums)
        print(number.searchString("123 3.1416 789")) #  Better -> [['123'], ['3.1416'], ['789']]
    Fcs:tt|�j||�|jr0tdd�|jD��|_nd|_dS)Ncss|]}|jVqdS)N)r[)r�r�rwrwrxr��
sz&MatchFirst.__init__.<locals>.<genexpr>T)r�rr�r4rr[)r�r4rg)rHrwrxr��
szMatchFirst.__init__Tc	Cs�d}d}x�|jD]�}y|j|||�}|Stk
r\}z|j|krL|}|j}WYdd}~Xqtk
r�t|�|kr�t|t|�|j|�}t|�}YqXqW|dk	r�|j|_|�nt||d|��dS)Nrrz no defined alternatives to matchrs)r4rtrr�r�r�rar�)	r�r-r�ror@rAr�r�r�rwrwrxr��
s$
zMatchFirst.parseImplcCst|t�rtj|�}|j|�S)N)rzr�r$rQr�)r�r�rwrwrx�__ior__�
s

zMatchFirst.__ior__cCs@t|d�r|jS|jdkr:ddjdd�|jD��d|_|jS)Nr�r<z | css|]}t|�VqdS)N)r�)r�r�rwrwrxr��
sz%MatchFirst.__str__.<locals>.<genexpr>r=)r�r�rUr�r4)r�rwrwrxr��
s


 zMatchFirst.__str__cCs0|dd�|g}x|jD]}|j|�qWdS)N)r4r�)r�r�r;r�rwrwrxr��
szMatchFirst.checkRecursion)F)T)
r�r�r�r�r�r�rDr�r�r�rwrw)rHrxr�
s
	cs<eZdZdZd�fdd�	Zddd�Zdd�Zd	d
�Z�ZS)
ram
    Requires all given C{ParseExpression}s to be found, but in any order.
    Expressions may be separated by whitespace.
    May be constructed using the C{'&'} operator.

    Example::
        color = oneOf("RED ORANGE YELLOW GREEN BLUE PURPLE BLACK WHITE BROWN")
        shape_type = oneOf("SQUARE CIRCLE TRIANGLE STAR HEXAGON OCTAGON")
        integer = Word(nums)
        shape_attr = "shape:" + shape_type("shape")
        posn_attr = "posn:" + Group(integer("x") + ',' + integer("y"))("posn")
        color_attr = "color:" + color("color")
        size_attr = "size:" + integer("size")

        # use Each (using operator '&') to accept attributes in any order 
        # (shape and posn are required, color and size are optional)
        shape_spec = shape_attr & posn_attr & Optional(color_attr) & Optional(size_attr)

        shape_spec.runTests('''
            shape: SQUARE color: BLACK posn: 100, 120
            shape: CIRCLE size: 50 color: BLUE posn: 50,80
            color:GREEN size:20 shape:TRIANGLE posn:20,40
            '''
            )
    prints::
        shape: SQUARE color: BLACK posn: 100, 120
        ['shape:', 'SQUARE', 'color:', 'BLACK', 'posn:', ['100', ',', '120']]
        - color: BLACK
        - posn: ['100', ',', '120']
          - x: 100
          - y: 120
        - shape: SQUARE


        shape: CIRCLE size: 50 color: BLUE posn: 50,80
        ['shape:', 'CIRCLE', 'size:', '50', 'color:', 'BLUE', 'posn:', ['50', ',', '80']]
        - color: BLUE
        - posn: ['50', ',', '80']
          - x: 50
          - y: 80
        - shape: CIRCLE
        - size: 50


        color: GREEN size: 20 shape: TRIANGLE posn: 20,40
        ['color:', 'GREEN', 'size:', '20', 'shape:', 'TRIANGLE', 'posn:', ['20', ',', '40']]
        - color: GREEN
        - posn: ['20', ',', '40']
          - x: 20
          - y: 40
        - shape: TRIANGLE
        - size: 20
    Tcs8tt|�j||�tdd�|jD��|_d|_d|_dS)Ncss|]}|jVqdS)N)r[)r�r�rwrwrxr�sz Each.__init__.<locals>.<genexpr>T)r�rr�r6r4r[rX�initExprGroups)r�r4rg)rHrwrxr�sz
Each.__init__c
s�|jr�tdd�|jD��|_dd�|jD�}dd�|jD�}|||_dd�|jD�|_dd�|jD�|_dd�|jD�|_|j|j7_d	|_|}|jdd�}|jdd��g}d
}	x�|	�rp|�|j|j}
g}x~|
D]v}y|j||�}Wn t	k
�r|j
|�Yq�X|j
|jjt|�|��||k�rD|j
|�q�|�kr�j
|�q�Wt|�t|
�kr�d	}	q�W|�r�djdd�|D��}
t	||d
|
��|�fdd�|jD�7}g}x*|D]"}|j|||�\}}|j
|��q�Wt|tg��}||fS)Ncss&|]}t|t�rt|j�|fVqdS)N)rzrr�r.)r�r�rwrwrxr�sz!Each.parseImpl.<locals>.<genexpr>cSsg|]}t|t�r|j�qSrw)rzrr.)r�r�rwrwrxr�sz"Each.parseImpl.<locals>.<listcomp>cSs"g|]}|jrt|t�r|�qSrw)r[rzr)r�r�rwrwrxr�scSsg|]}t|t�r|j�qSrw)rzr2r.)r�r�rwrwrxr� scSsg|]}t|t�r|j�qSrw)rzrr.)r�r�rwrwrxr�!scSs g|]}t|tttf�s|�qSrw)rzrr2r)r�r�rwrwrxr�"sFTz, css|]}t|�VqdS)N)r�)r�r�rwrwrxr�=sz*Missing one or more required elements (%s)cs$g|]}t|t�r|j�kr|�qSrw)rzrr.)r�r�)�tmpOptrwrxr�As)rEr�r4Zopt1mapZ	optionalsZmultioptionalsZ
multirequiredZrequiredr�rr�r�r��remover�r�rt�sumr")r�r-r�roZopt1Zopt2ZtmpLocZtmpReqdZ
matchOrderZkeepMatchingZtmpExprsZfailedr�Zmissingr:r�ZfinalResultsrw)rFrxr�sP



zEach.parseImplcCs@t|d�r|jS|jdkr:ddjdd�|jD��d|_|jS)Nr�r<z & css|]}t|�VqdS)N)r�)r�r�rwrwrxr�PszEach.__str__.<locals>.<genexpr>r=)r�r�rUr�r4)r�rwrwrxr�Ks


 zEach.__str__cCs0|dd�|g}x|jD]}|j|�qWdS)N)r4r�)r�r�r;r�rwrwrxr�TszEach.checkRecursion)T)T)	r�r�r�r�r�r�r�r�r�rwrw)rHrxr�
s
5
1	csleZdZdZd�fdd�	Zddd�Zdd	�Z�fd
d�Z�fdd
�Zdd�Z	gfdd�Z
�fdd�Z�ZS)rza
    Abstract subclass of C{ParserElement}, for combining and post-processing parsed tokens.
    Fcs�tt|�j|�t|t�r@ttjt�r2tj|�}ntjt	|��}||_
d|_|dk	r�|j|_|j
|_
|j|j�|j|_|j|_|j|_|jj|j�dS)N)r�rr�rzr��
issubclassr$rQr,rr.rUr`r[r�rYrXrWrer]r�)r�r.rg)rHrwrxr�^s
zParseElementEnhance.__init__TcCs2|jdk	r|jj|||dd�Std||j|��dS)NF)rpr�)r.rtrra)r�r-r�rorwrwrxr�ps
zParseElementEnhance.parseImplcCs*d|_|jj�|_|jdk	r&|jj�|S)NF)rXr.r�r�)r�rwrwrxr�vs


z#ParseElementEnhance.leaveWhitespacecsrt|t�rB||jkrntt|�j|�|jdk	rn|jj|jd�n,tt|�j|�|jdk	rn|jj|jd�|S)Nrrrsrs)rzr+r]r�rr�r.)r�r�)rHrwrxr�}s



zParseElementEnhance.ignorecs&tt|�j�|jdk	r"|jj�|S)N)r�rr�r.)r�)rHrwrxr��s

zParseElementEnhance.streamlinecCsB||krt||g��|dd�|g}|jdk	r>|jj|�dS)N)r&r.r�)r�r�r;rwrwrxr��s

z"ParseElementEnhance.checkRecursioncCs6|dd�|g}|jdk	r(|jj|�|jg�dS)N)r.r�r�)r�r�r7rwrwrxr��s
zParseElementEnhance.validatecsVytt|�j�Stk
r"YnX|jdkrP|jdk	rPd|jjt|j�f|_|jS)Nz%s:(%s))	r�rr�rKrUr.rHr�r�)r�)rHrwrxr��szParseElementEnhance.__str__)F)T)
r�r�r�r�r�r�r�r�r�r�r�r�r�rwrw)rHrxrZs
cs*eZdZdZ�fdd�Zddd�Z�ZS)ra�
    Lookahead matching of the given parse expression.  C{FollowedBy}
    does I{not} advance the parsing position within the input string; it only
    verifies that the specified parse expression matches at the current
    position.  C{FollowedBy} always returns a null token list.

    Example::
        # use FollowedBy to match a label only if it is followed by a ':'
        data_word = Word(alphas)
        label = data_word + FollowedBy(':')
        attr_expr = Group(label + Suppress(':') + OneOrMore(data_word, stopOn=label).setParseAction(' '.join))
        
        OneOrMore(attr_expr).parseString("shape: SQUARE color: BLACK posn: upper left").pprint()
    prints::
        [['shape', 'SQUARE'], ['color', 'BLACK'], ['posn', 'upper left']]
    cstt|�j|�d|_dS)NT)r�rr�r[)r�r.)rHrwrxr��szFollowedBy.__init__TcCs|jj||�|gfS)N)r.r�)r�r-r�rorwrwrxr��szFollowedBy.parseImpl)T)r�r�r�r�r�r�r�rwrw)rHrxr�scs2eZdZdZ�fdd�Zd	dd�Zdd�Z�ZS)
ra�
    Lookahead to disallow matching with the given parse expression.  C{NotAny}
    does I{not} advance the parsing position within the input string; it only
    verifies that the specified parse expression does I{not} match at the current
    position.  Also, C{NotAny} does I{not} skip over leading whitespace. C{NotAny}
    always returns a null token list.  May be constructed using the '~' operator.

    Example::
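        # illustrative sketch: use '~' (i.e. NotAny) to keep a reserved word
        # from being parsed as an ordinary identifier
        identifier = ~Keyword("end") + Word(alphas)
        print(identifier.parseString("start"))   # -> ['start']
        # identifier.parseString("end") raises ParseException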
        
    cs0tt|�j|�d|_d|_dt|j�|_dS)NFTzFound unwanted token, )r�rr�rXr[r�r.ra)r�r.)rHrwrxr��szNotAny.__init__TcCs&|jj||�rt|||j|��|gfS)N)r.r�rra)r�r-r�rorwrwrxr��szNotAny.parseImplcCs4t|d�r|jS|jdkr.dt|j�d|_|jS)Nr�z~{r=)r�r�rUr�r.)r�rwrwrxr��s


zNotAny.__str__)T)r�r�r�r�r�r�r�r�rwrw)rHrxr�s

cs(eZdZd�fdd�	Zddd�Z�ZS)	�_MultipleMatchNcsFtt|�j|�d|_|}t|t�r.tj|�}|dk	r<|nd|_dS)NT)	r�rJr�rWrzr�r$rQ�	not_ender)r�r.�stopOnZender)rHrwrxr��s

z_MultipleMatch.__init__TcCs�|jj}|j}|jdk	}|r$|jj}|r2|||�||||dd�\}}yZ|j}	xJ|rb|||�|	rr|||�}
n|}
|||
|�\}}|s�|j�rT||7}qTWWnttfk
r�YnX||fS)NF)rp)	r.rtr�rKr�r]r�rr�)r�r-r�roZself_expr_parseZself_skip_ignorablesZcheck_enderZ
try_not_enderr�ZhasIgnoreExprsr�Z	tmptokensrwrwrxr��s,



z_MultipleMatch.parseImpl)N)T)r�r�r�r�r�r�rwrw)rHrxrJ�srJc@seZdZdZdd�ZdS)ra�
    Repetition of one or more of the given expression.
    
    Parameters:
     - expr - expression that must match one or more times
     - stopOn - (default=C{None}) - expression for a terminating sentinel
          (only required if the sentinel would ordinarily match the repetition 
          expression)          

    Example::
        data_word = Word(alphas)
        label = data_word + FollowedBy(':')
        attr_expr = Group(label + Suppress(':') + OneOrMore(data_word).setParseAction(' '.join))

        text = "shape: SQUARE posn: upper left color: BLACK"
        OneOrMore(attr_expr).parseString(text).pprint()  # Fail! read 'color' as data instead of next label -> [['shape', 'SQUARE color']]

        # use stopOn attribute for OneOrMore to avoid reading label string as part of the data
        attr_expr = Group(label + Suppress(':') + OneOrMore(data_word, stopOn=label).setParseAction(' '.join))
        OneOrMore(attr_expr).parseString(text).pprint() # Better -> [['shape', 'SQUARE'], ['posn', 'upper left'], ['color', 'BLACK']]
        
        # could also be written as
        (attr_expr * (1,)).parseString(text).pprint()
    cCs4t|d�r|jS|jdkr.dt|j�d|_|jS)Nr�r<z}...)r�r�rUr�r.)r�rwrwrxr�!s


zOneOrMore.__str__N)r�r�r�r�r�rwrwrwrxrscs8eZdZdZd
�fdd�	Zd�fdd�	Zdd	�Z�ZS)r2aw
    Optional repetition of zero or more of the given expression.
    
    Parameters:
     - expr - expression that must match zero or more times
     - stopOn - (default=C{None}) - expression for a terminating sentinel
          (only required if the sentinel would ordinarily match the repetition 
          expression)          

    Example: similar to L{OneOrMore}
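
    A minimal illustrative sketch (zero occurrences are acceptable)::
        label  = Word(alphas) + Suppress(':')
        counts = Group(ZeroOrMore(Word(nums)))
        print((label + counts).parseString("counts:"))        # -> ['counts', []]
        print((label + counts).parseString("counts: 1 2 3"))  # -> ['counts', ['1', '2', '3']]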
    Ncstt|�j||d�d|_dS)N)rLT)r�r2r�r[)r�r.rL)rHrwrxr�6szZeroOrMore.__init__Tcs6ytt|�j|||�Sttfk
r0|gfSXdS)N)r�r2r�rr�)r�r-r�ro)rHrwrxr�:szZeroOrMore.parseImplcCs4t|d�r|jS|jdkr.dt|j�d|_|jS)Nr�rz]...)r�r�rUr�r.)r�rwrwrxr�@s


zZeroOrMore.__str__)N)T)r�r�r�r�r�r�r�r�rwrw)rHrxr2*sc@s eZdZdd�ZeZdd�ZdS)�
_NullTokencCsdS)NFrw)r�rwrwrxr�Jsz_NullToken.__bool__cCsdS)Nr�rw)r�rwrwrxr�Msz_NullToken.__str__N)r�r�r�r�r'r�rwrwrwrxrMIsrMcs6eZdZdZef�fdd�	Zd	dd�Zdd�Z�ZS)
raa
    Optional matching of the given expression.

    Parameters:
     - expr - expression that may be matched zero or one time
     - default (optional) - value to be returned if the optional expression is not found.

    Example::
        # US postal code can be a 5-digit zip, plus optional 4-digit qualifier
        zip = Combine(Word(nums, exact=5) + Optional('-' + Word(nums, exact=4)))
        zip.runTests('''
            # traditional ZIP code
            12345
            
            # ZIP+4 form
            12101-0001
            
            # invalid ZIP
            98765-
            ''')
    prints::
        # traditional ZIP code
        12345
        ['12345']

        # ZIP+4 form
        12101-0001
        ['12101-0001']

        # invalid ZIP
        98765-
             ^
        FAIL: Expected end of text (at char 5), (line:1, col:6)
    cs.tt|�j|dd�|jj|_||_d|_dS)NF)rgT)r�rr�r.rWr�r[)r�r.r�)rHrwrxr�ts
zOptional.__init__TcCszy|jj|||dd�\}}WnTttfk
rp|jtk	rh|jjr^t|jg�}|j||jj<ql|jg}ng}YnX||fS)NF)rp)r.rtrr�r��_optionalNotMatchedrVr")r�r-r�ror�rwrwrxr�zs


zOptional.parseImplcCs4t|d�r|jS|jdkr.dt|j�d|_|jS)Nr�rr	)r�r�rUr�r.)r�rwrwrxr��s


zOptional.__str__)T)	r�r�r�r�rNr�r�r�r�rwrw)rHrxrQs"
cs,eZdZdZd	�fdd�	Zd
dd�Z�ZS)r(a�	
    Token for skipping over all undefined text until the matched expression is found.

    Parameters:
     - expr - target expression marking the end of the data to be skipped
     - include - (default=C{False}) if True, the target expression is also parsed 
          (the skipped text and target expression are returned as a 2-element list).
     - ignore - (default=C{None}) used to define grammars (typically quoted strings and 
          comments) that might contain false matches to the target expression
     - failOn - (default=C{None}) define expressions that are not allowed to be 
          included in the skipped text; if found before the target expression is found, 
          the SkipTo is not a match

    Example::
        report = '''
            Outstanding Issues Report - 1 Jan 2000

               # | Severity | Description                               |  Days Open
            -----+----------+-------------------------------------------+-----------
             101 | Critical | Intermittent system crash                 |          6
              94 | Cosmetic | Spelling error on Login ('log|n')         |         14
              79 | Minor    | System slow when running too many reports |         47
            '''
        integer = Word(nums)
        SEP = Suppress('|')
        # use SkipTo to simply match everything up until the next SEP
        # - ignore quoted strings, so that a '|' character inside a quoted string does not match
        # - parse action will call token.strip() for each matched token, i.e., the description body
        string_data = SkipTo(SEP, ignore=quotedString)
        string_data.setParseAction(tokenMap(str.strip))
        ticket_expr = (integer("issue_num") + SEP 
                      + string_data("sev") + SEP 
                      + string_data("desc") + SEP 
                      + integer("days_open"))
        
        for tkt in ticket_expr.searchString(report):
            print tkt.dump()
    prints::
        ['101', 'Critical', 'Intermittent system crash', '6']
        - days_open: 6
        - desc: Intermittent system crash
        - issue_num: 101
        - sev: Critical
        ['94', 'Cosmetic', "Spelling error on Login ('log|n')", '14']
        - days_open: 14
        - desc: Spelling error on Login ('log|n')
        - issue_num: 94
        - sev: Cosmetic
        ['79', 'Minor', 'System slow when running too many reports', '47']
        - days_open: 47
        - desc: System slow when running too many reports
        - issue_num: 79
        - sev: Minor
    FNcs`tt|�j|�||_d|_d|_||_d|_t|t	�rFt
j|�|_n||_dt
|j�|_dS)NTFzNo match found for )r�r(r��
ignoreExprr[r`�includeMatchr�rzr�r$rQ�failOnr�r.ra)r�r��includer�rQ)rHrwrxr��s
zSkipTo.__init__TcCs,|}t|�}|j}|jj}|jdk	r,|jjnd}|jdk	rB|jjnd}	|}
x�|
|kr�|dk	rh|||
�rhP|	dk	r�x*y|	||
�}
Wqrtk
r�PYqrXqrWy|||
ddd�Wn tt	fk
r�|
d7}
YqLXPqLWt|||j
|��|
}|||�}t|�}|j�r$||||dd�\}}
||
7}||fS)NF)rorprr)rp)
r�r.rtrQr�rOr�rrr�rar"rP)r�r-r�ror0r�r.Z
expr_parseZself_failOn_canParseNextZself_ignoreExpr_tryParseZtmplocZskiptextZ
skipresultr�rwrwrxr��s<

zSkipTo.parseImpl)FNN)T)r�r�r�r�r�r�r�rwrw)rHrxr(�s6
csbeZdZdZd�fdd�	Zdd�Zdd�Zd	d
�Zdd�Zgfd
d�Z	dd�Z
�fdd�Z�ZS)raK
    Forward declaration of an expression to be defined later -
    used for recursive grammars, such as algebraic infix notation.
    When the expression is known, it is assigned to the C{Forward} variable using the '<<' operator.

    Note: take care when assigning to C{Forward} not to overlook precedence of operators.
    Specifically, '|' has a lower precedence than '<<', so that::
        fwdExpr << a | b | c
    will actually be evaluated as::
        (fwdExpr << a) | b | c
    thereby leaving b and c out as parseable alternatives.  It is recommended that you
    explicitly group the values inserted into the C{Forward}::
        fwdExpr << (a | b | c)
    Converting to use the '<<=' operator instead will avoid this problem.

    See L{ParseResults.pprint} for an example of a recursive parser created using
    C{Forward}.
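
    A small illustrative sketch of a recursive grammar built on C{Forward}::
        expr = Forward()
        atom = Word(nums)
        expr <<= atom | Group(Suppress('(') + ZeroOrMore(expr) + Suppress(')'))
        print(expr.parseString("(1 (2 3))"))   # -> [['1', ['2', '3']]]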
    Ncstt|�j|dd�dS)NF)rg)r�rr�)r�r�)rHrwrxr�szForward.__init__cCsjt|t�rtj|�}||_d|_|jj|_|jj|_|j|jj	�|jj
|_
|jj|_|jj
|jj�|S)N)rzr�r$rQr.rUr`r[r�rYrXrWr]r�)r�r�rwrwrx�
__lshift__s





zForward.__lshift__cCs||>S)Nrw)r�r�rwrwrx�__ilshift__'szForward.__ilshift__cCs
d|_|S)NF)rX)r�rwrwrxr�*szForward.leaveWhitespacecCs$|js d|_|jdk	r |jj�|S)NT)r_r.r�)r�rwrwrxr�.s


zForward.streamlinecCs>||kr0|dd�|g}|jdk	r0|jj|�|jg�dS)N)r.r�r�)r�r�r7rwrwrxr�5s

zForward.validatecCs>t|d�r|jS|jjdSd}Wd|j|_X|jjd|S)Nr�z: ...�Nonez: )r�r�rHr�Z_revertClass�_ForwardNoRecurser.r�)r�Z	retStringrwrwrxr�<s

zForward.__str__cs.|jdk	rtt|�j�St�}||K}|SdS)N)r.r�rr�)r�r�)rHrwrxr�Ms

zForward.copy)N)
r�r�r�r�r�rSrTr�r�r�r�r�r�rwrw)rHrxrs
c@seZdZdd�ZdS)rVcCsdS)Nz...rw)r�rwrwrxr�Vsz_ForwardNoRecurse.__str__N)r�r�r�r�rwrwrwrxrVUsrVcs"eZdZdZd�fdd�	Z�ZS)r-zQ
    Abstract subclass of C{ParseExpression}, for converting parsed results.
    Fcstt|�j|�d|_dS)NF)r�r-r�rW)r�r.rg)rHrwrxr�]szTokenConverter.__init__)F)r�r�r�r�r�r�rwrw)rHrxr-Yscs6eZdZdZd
�fdd�	Z�fdd�Zdd	�Z�ZS)r
a�
    Converter to concatenate all matching tokens to a single string.
    By default, the matching patterns must also be contiguous in the input string;
    this can be disabled by specifying C{'adjacent=False'} in the constructor.

    Example::
        real = Word(nums) + '.' + Word(nums)
        print(real.parseString('3.1416')) # -> ['3', '.', '1416']
        # will also erroneously match the following
        print(real.parseString('3. 1416')) # -> ['3', '.', '1416']

        real = Combine(Word(nums) + '.' + Word(nums))
        print(real.parseString('3.1416')) # -> ['3.1416']
        # no match when there are internal spaces
        print(real.parseString('3. 1416')) # -> Exception: Expected W:(0123...)
    r�Tcs8tt|�j|�|r|j�||_d|_||_d|_dS)NT)r�r
r�r��adjacentrX�
joinStringre)r�r.rXrW)rHrwrxr�rszCombine.__init__cs(|jrtj||�ntt|�j|�|S)N)rWr$r�r�r
)r�r�)rHrwrxr�|szCombine.ignorecCsP|j�}|dd�=|tdj|j|j��g|jd�7}|jrH|j�rH|gS|SdS)Nr�)r�)r�r"r�r
rXrbrVr�)r�r-r�r�ZretToksrwrwrxr��s
"zCombine.postParse)r�T)r�r�r�r�r�r�r�r�rwrw)rHrxr
as
cs(eZdZdZ�fdd�Zdd�Z�ZS)ra�
    Converter to return the matched tokens as a list - useful for returning tokens of C{L{ZeroOrMore}} and C{L{OneOrMore}} expressions.

    Example::
        ident = Word(alphas)
        num = Word(nums)
        term = ident | num
        func = ident + Optional(delimitedList(term))
        print(func.parseString("fn a,b,100"))  # -> ['fn', 'a', 'b', '100']

        func = ident + Group(Optional(delimitedList(term)))
        print(func.parseString("fn a,b,100"))  # -> ['fn', ['a', 'b', '100']]
    cstt|�j|�d|_dS)NT)r�rr�rW)r�r.)rHrwrxr��szGroup.__init__cCs|gS)Nrw)r�r-r�r�rwrwrxr��szGroup.postParse)r�r�r�r�r�r�r�rwrw)rHrxr�s
cs(eZdZdZ�fdd�Zdd�Z�ZS)raW
    Converter to return a repetitive expression as a list, but also as a dictionary.
    Each element can also be referenced using the first token in the expression as its key.
    Useful for tabular report scraping when the first column can be used as an item key.

    Example::
        data_word = Word(alphas)
        label = data_word + FollowedBy(':')
        attr_expr = Group(label + Suppress(':') + OneOrMore(data_word).setParseAction(' '.join))

        text = "shape: SQUARE posn: upper left color: light blue texture: burlap"
        attr_expr = (label + Suppress(':') + OneOrMore(data_word, stopOn=label).setParseAction(' '.join))
        
        # print attributes as plain groups
        print(OneOrMore(attr_expr).parseString(text).dump())
        
        # instead of OneOrMore(expr), parse using Dict(OneOrMore(Group(expr))) - Dict will auto-assign names
        result = Dict(OneOrMore(Group(attr_expr))).parseString(text)
        print(result.dump())
        
        # access named fields as dict entries, or output as dict
        print(result['shape'])        
        print(result.asDict())
    prints::
        ['shape', 'SQUARE', 'posn', 'upper left', 'color', 'light blue', 'texture', 'burlap']

        [['shape', 'SQUARE'], ['posn', 'upper left'], ['color', 'light blue'], ['texture', 'burlap']]
        - color: light blue
        - posn: upper left
        - shape: SQUARE
        - texture: burlap
        SQUARE
        {'color': 'light blue', 'posn': 'upper left', 'texture': 'burlap', 'shape': 'SQUARE'}
    See more examples at L{ParseResults} of accessing fields by results name.
    cstt|�j|�d|_dS)NT)r�rr�rW)r�r.)rHrwrxr��sz
Dict.__init__cCs�x�t|�D]�\}}t|�dkr q
|d}t|t�rBt|d�j�}t|�dkr^td|�||<q
t|�dkr�t|dt�r�t|d|�||<q
|j�}|d=t|�dks�t|t�r�|j	�r�t||�||<q
t|d|�||<q
W|j
r�|gS|SdS)Nrrrr�rq)r�r�rzrur�r�r�r"r�r�rV)r�r-r�r�r��tokZikeyZ	dictvaluerwrwrxr��s$
zDict.postParse)r�r�r�r�r�r�r�rwrw)rHrxr�s#c@s eZdZdZdd�Zdd�ZdS)r+aV
    Converter for ignoring the results of a parsed expression.

    Example::
        source = "a, b, c,d"
        wd = Word(alphas)
        wd_list1 = wd + ZeroOrMore(',' + wd)
        print(wd_list1.parseString(source))

        # often, delimiters that are useful during parsing are just in the
        # way afterward - use Suppress to keep them out of the parsed output
        wd_list2 = wd + ZeroOrMore(Suppress(',') + wd)
        print(wd_list2.parseString(source))
    prints::
        ['a', ',', 'b', ',', 'c', ',', 'd']
        ['a', 'b', 'c', 'd']
    (See also L{delimitedList}.)
    cCsgS)Nrw)r�r-r�r�rwrwrxr��szSuppress.postParsecCs|S)Nrw)r�rwrwrxr��szSuppress.suppressN)r�r�r�r�r�r�rwrwrwrxr+�sc@s(eZdZdZdd�Zdd�Zdd�ZdS)	rzI
    Wrapper for parse actions, to ensure they are only called once.
    cCst|�|_d|_dS)NF)rM�callable�called)r�Z
methodCallrwrwrxr�s
zOnlyOnce.__init__cCs.|js|j|||�}d|_|St||d��dS)NTr�)r[rZr)r�r�r5rvr�rwrwrxr�s
zOnlyOnce.__call__cCs
d|_dS)NF)r[)r�rwrwrx�reset
szOnlyOnce.resetN)r�r�r�r�r�r�r\rwrwrwrxr�scs:t����fdd�}y�j|_Wntk
r4YnX|S)as
    Decorator for debugging parse actions. 
    
    When the parse action is called, this decorator will print C{">> entering I{method-name}(line:I{current_source_line}, I{parse_location}, I{matched_tokens})".}
    When the parse action completes, the decorator will print C{"<<"} followed by the returned value, or any exception that the parse action raised.

    Example::
        wd = Word(alphas)

        @traceParseAction
        def remove_duplicate_chars(tokens):
            return ''.join(sorted(set(''.join(tokens))))

        wds = OneOrMore(wd).setParseAction(remove_duplicate_chars)
        print(wds.parseString("slkdjs sld sldd sdlf sdljf"))
    prints::
        >>entering remove_duplicate_chars(line: 'slkdjs sld sldd sdlf sdljf', 0, (['slkdjs', 'sld', 'sldd', 'sdlf', 'sdljf'], {}))
        <<leaving remove_duplicate_chars (ret: 'dfjkls')
        ['dfjkls']
    cs��j}|dd�\}}}t|�dkr8|djjd|}tjjd|t||�||f�y�|�}Wn8tk
r�}ztjjd||f��WYdd}~XnXtjjd||f�|S)Nror�.z">>entering %s(line: '%s', %d, %r)
z<<leaving %s (exception: %s)
z<<leaving %s (ret: %r)
r9)r�r�rHr~�stderr�writerGrK)ZpaArgsZthisFuncr�r5rvr�r3)r�rwrx�z#sztraceParseAction.<locals>.z)rMr�r�)r�r`rw)r�rxrb
s
�,FcCs`t|�dt|�dt|�d}|rBt|t||��j|�S|tt|�|�j|�SdS)a�
    Helper to define a delimited list of expressions - the delimiter defaults to ','.
    By default, the list elements and delimiters can have intervening whitespace, and
    comments, but this can be overridden by passing C{combine=True} in the constructor.
    If C{combine} is set to C{True}, the matching tokens are returned as a single token
    string, with the delimiters included; otherwise, the matching tokens are returned
    as a list of tokens, with the delimiters suppressed.

    Example::
        delimitedList(Word(alphas)).parseString("aa,bb,cc") # -> ['aa', 'bb', 'cc']
        delimitedList(Word(hexnums), delim=':', combine=True).parseString("AA:BB:CC:DD:EE") # -> ['AA:BB:CC:DD:EE']
    z [r�z]...N)r�r
r2rir+)r.Zdelim�combineZdlNamerwrwrxr@9s
$csjt����fdd�}|dkr0tt�jdd��}n|j�}|jd�|j|dd�|�jd	t��d
�S)a:
    Helper to define a counted list of expressions.
    This helper defines a pattern of the form::
        integer expr expr expr...
    where the leading integer tells how many expr expressions follow.
    The matched tokens are returned as a list of expr tokens - the leading count token is suppressed.
    
    If C{intExpr} is specified, it should be a pyparsing expression that produces an integer value.

    Example::
        countedArray(Word(alphas)).parseString('2 ab cd ef')  # -> ['ab', 'cd']

        # in this parser, the leading integer value is given in binary,
        # '10' indicating that 2 values are in the array
        binaryConstant = Word('01').setParseAction(lambda t: int(t[0], 2))
        countedArray(Word(alphas), intExpr=binaryConstant).parseString('10 ab cd ef')  # -> ['ab', 'cd']
    cs.|d}�|r tt�g|��p&tt�>gS)Nr)rrrC)r�r5rvr�)�	arrayExprr.rwrx�countFieldParseAction_s"z+countedArray.<locals>.countFieldParseActionNcSst|d�S)Nr)ru)rvrwrwrxrydszcountedArray.<locals>.<lambda>ZarrayLenT)rfz(len) z...)rr/rRr�r�rirxr�)r.ZintExprrdrw)rcr.rxr<Ls
cCs:g}x0|D](}t|t�r(|jt|��q
|j|�q
W|S)N)rzr�r�r�r�)�Lr�r�rwrwrxr�ks

r�cs6t���fdd�}|j|dd��jdt|���S)a*
    Helper to define an expression that is indirectly defined from
    the tokens matched in a previous expression, that is, it looks
    for a 'repeat' of a previous expression.  For example::
        first = Word(nums)
        second = matchPreviousLiteral(first)
        matchExpr = first + ":" + second
    will match C{"1:1"}, but not C{"1:2"}.  Because this matches a
    previous literal, it will also match the leading C{"1:1"} in C{"1:10"}.
    If this is not desired, use C{matchPreviousExpr}.
    Do I{not} use with packrat parsing enabled.
    csP|rBt|�dkr�|d>qLt|j��}�tdd�|D��>n
�t�>dS)Nrrrcss|]}t|�VqdS)N)r)r��ttrwrwrxr��szDmatchPreviousLiteral.<locals>.copyTokenToRepeater.<locals>.<genexpr>)r�r�r�rr
)r�r5rvZtflat)�reprwrx�copyTokenToRepeater�sz1matchPreviousLiteral.<locals>.copyTokenToRepeaterT)rfz(prev) )rrxrir�)r.rhrw)rgrxrOts


csFt��|j�}�|K��fdd�}|j|dd��jdt|���S)aS
    Helper to define an expression that is indirectly defined from
    the tokens matched in a previous expression, that is, it looks
    for a 'repeat' of a previous expression.  For example::
        first = Word(nums)
        second = matchPreviousExpr(first)
        matchExpr = first + ":" + second
    will match C{"1:1"}, but not C{"1:2"}.  Because this matches by
    expressions, it will I{not} match the leading C{"1:1"} in C{"1:10"};
    the expressions are evaluated first, and then compared, so
    C{"1"} is compared with C{"10"}.
    Do I{not} use with packrat parsing enabled.
    cs*t|j����fdd�}�j|dd�dS)Ncs$t|j��}|�kr tddd��dS)Nr�r)r�r�r)r�r5rvZtheseTokens)�matchTokensrwrx�mustMatchTheseTokens�szLmatchPreviousExpr.<locals>.copyTokenToRepeater.<locals>.mustMatchTheseTokensT)rf)r�r�r�)r�r5rvrj)rg)rirxrh�sz.matchPreviousExpr.<locals>.copyTokenToRepeaterT)rfz(prev) )rr�rxrir�)r.Ze2rhrw)rgrxrN�scCs>xdD]}|j|t|�}qW|jdd�}|jdd�}t|�S)Nz\^-]rz\nr(z\t)r��_bslashr�)r�r�rwrwrxr�s

rTc
s�|rdd�}dd�}t�ndd�}dd�}t�g}t|t�rF|j�}n&t|tj�r\t|�}ntj	dt
dd�|svt�Sd	}x�|t|�d
k�r||}xnt
||d
d��D]N\}}	||	|�r�|||d
=Pq�|||	�r�|||d
=|j||	�|	}Pq�W|d
7}q|W|�r�|�r�yht|�tdj|��k�rZtd
djdd�|D���jdj|��Stdjdd�|D���jdj|��SWn&tk
�r�tj	dt
dd�YnXt�fdd�|D��jdj|��S)a�
    Helper to quickly define a set of alternative Literals, and makes sure to do
    longest-first testing when there is a conflict, regardless of the input order,
    but returns a C{L{MatchFirst}} for best performance.

    Parameters:
     - strs - a string of space-delimited literals, or a collection of string literals
     - caseless - (default=C{False}) - treat all literals as caseless
     - useRegex - (default=C{True}) - as an optimization, will generate a Regex
          object; otherwise, will generate a C{MatchFirst} object (if C{caseless=True}, or
          if creating a C{Regex} raises an exception)

    Example::
        comp_oper = oneOf("< = > <= >= !=")
        var = Word(alphas)
        number = Word(nums)
        term = var | number
        comparison_expr = term + comp_oper + term
        print(comparison_expr.searchString("B = 12  AA=23 B<=AA AA>12"))
    prints::
        [['B', '=', '12'], ['AA', '=', '23'], ['B', '<=', 'AA'], ['AA', '>', '12']]
    cSs|j�|j�kS)N)r�)r�brwrwrxry�szoneOf.<locals>.<lambda>cSs|j�j|j��S)N)r�r�)rrlrwrwrxry�scSs||kS)Nrw)rrlrwrwrxry�scSs
|j|�S)N)r�)rrlrwrwrxry�sz6Invalid argument to oneOf, expected string or iterablerq)r�rrrNr�z[%s]css|]}t|�VqdS)N)r)r��symrwrwrxr��szoneOf.<locals>.<genexpr>z | �|css|]}tj|�VqdS)N)rdr	)r�rmrwrwrxr��sz7Exception creating Regex for oneOf, building MatchFirstc3s|]}�|�VqdS)Nrw)r�rm)�parseElementClassrwrxr��s)rrrzr�r�r�r5r�r�r�r�rr�r�r�r�r'rirKr)
Zstrsr�ZuseRegexZisequalZmasksZsymbolsr�Zcurr�r�rw)rorxrS�sL





((cCsttt||���S)a�
    Helper to easily and clearly define a dictionary by specifying the respective patterns
    for the key and value.  Takes care of defining the C{L{Dict}}, C{L{ZeroOrMore}}, and C{L{Group}} tokens
    in the proper order.  The key pattern can include delimiting markers or punctuation,
    as long as they are suppressed, thereby leaving the significant key text.  The value
    pattern can include named results, so that the C{Dict} results can include named token
    fields.

    Example::
        text = "shape: SQUARE posn: upper left color: light blue texture: burlap"
        attr_expr = (label + Suppress(':') + OneOrMore(data_word, stopOn=label).setParseAction(' '.join))
        print(OneOrMore(attr_expr).parseString(text).dump())
        
        attr_label = label
        attr_value = Suppress(':') + OneOrMore(data_word, stopOn=label).setParseAction(' '.join)

        # similar to Dict, but simpler call format
        result = dictOf(attr_label, attr_value).parseString(text)
        print(result.dump())
        print(result['shape'])
        print(result.shape)  # object attribute access works too
        print(result.asDict())
    prints::
        [['shape', 'SQUARE'], ['posn', 'upper left'], ['color', 'light blue'], ['texture', 'burlap']]
        - color: light blue
        - posn: upper left
        - shape: SQUARE
        - texture: burlap
        SQUARE
        SQUARE
        {'color': 'light blue', 'shape': 'SQUARE', 'posn': 'upper left', 'texture': 'burlap'}
    )rr2r)r�r�rwrwrxrA�s!cCs^t�jdd��}|j�}d|_|d�||d�}|r@dd�}ndd�}|j|�|j|_|S)	a�
    Helper to return the original, untokenized text for a given expression.  Useful to
    restore the parsed fields of an HTML start tag into the raw tag text itself, or to
    revert separate tokens with intervening whitespace back to the original matching
    input text. By default, returns a string containing the original parsed text.
       
    If the optional C{asString} argument is passed as C{False}, then the return value is a 
    C{L{ParseResults}} containing any results names that were originally matched, and a 
    single token containing the original matched text from the input string.  So if 
    the expression passed to C{L{originalTextFor}} contains expressions with defined
    results names, you must set C{asString} to C{False} if you want to preserve those
    results name values.

    Example::
        src = "this is test <b> bold <i>text</i> </b> normal text "
        for tag in ("b","i"):
            opener,closer = makeHTMLTags(tag)
            patt = originalTextFor(opener + SkipTo(closer) + closer)
            print(patt.searchString(src)[0])
    prints::
        ['<b> bold <i>text</i> </b>']
        ['<i>text</i>']
    cSs|S)Nrw)r�r�rvrwrwrxry8sz!originalTextFor.<locals>.<lambda>F�_original_start�
_original_endcSs||j|j�S)N)rprq)r�r5rvrwrwrxry=scSs&||jd�|jd��g|dd�<dS)Nrprq)r�)r�r5rvrwrwrx�extractText?sz$originalTextFor.<locals>.extractText)r
r�r�rer])r.ZasStringZ	locMarkerZendlocMarker�	matchExprrrrwrwrxrg s

cCst|�jdd��S)zp
    Helper to undo pyparsing's default grouping of And expressions, even
    if all but one are non-empty.
    cSs|dS)Nrrw)rvrwrwrxryJszungroup.<locals>.<lambda>)r-r�)r.rwrwrxrhEscCs4t�jdd��}t|d�|d�|j�j�d��S)a�
    Helper to decorate a returned token with its starting and ending locations in the input string.
    This helper adds the following results names:
     - locn_start = location where matched expression begins
     - locn_end = location where matched expression ends
     - value = the actual parsed results

    Be careful if the input text contains C{<TAB>} characters, you may want to call
    C{L{ParserElement.parseWithTabs}}

    Example::
        wd = Word(alphas)
        for match in locatedExpr(wd).searchString("ljsdf123lksdjjf123lkkjj1222"):
            print(match)
    prints::
        [[0, 'ljsdf', 5]]
        [[8, 'lksdjjf', 15]]
        [[18, 'lkkjj', 23]]
    cSs|S)Nrw)r�r5rvrwrwrxry`szlocatedExpr.<locals>.<lambda>Z
locn_startr�Zlocn_end)r
r�rr�r�)r.ZlocatorrwrwrxrjLsz\[]-*.$+^?()~ )r
cCs|ddS)Nrrrrw)r�r5rvrwrwrxryksryz\\0?[xX][0-9a-fA-F]+cCstt|djd�d��S)Nrz\0x�)�unichrru�lstrip)r�r5rvrwrwrxrylsz	\\0[0-7]+cCstt|ddd�d��S)Nrrr�)ruru)r�r5rvrwrwrxrymsz\])r�r
z\wr8rr�Znegate�bodyr	csBdd��y dj�fdd�tj|�jD��Stk
r<dSXdS)a�
    Helper to easily define string ranges for use in Word construction.  Borrows
    syntax from regexp '[]' string range definitions::
        srange("[0-9]")   -> "0123456789"
        srange("[a-z]")   -> "abcdefghijklmnopqrstuvwxyz"
        srange("[a-z$_]") -> "abcdefghijklmnopqrstuvwxyz$_"
    The input string must be enclosed in []'s, and the returned string is the expanded
    character set joined into a single string.
    The values enclosed in the []'s may be:
     - a single character
     - an escaped character with a leading backslash (such as C{\-} or C{\]})
     - an escaped hex character with a leading C{'\x'} (C{\x21}, which is a C{'!'} character) 
         (C{\0x##} is also supported for backwards compatibility) 
     - an escaped octal character with a leading C{'\0'} (C{\041}, which is a C{'!'} character)
     - a range of any of the above, separated by a dash (C{'a-z'}, etc.)
     - any combination of the above (C{'aeiouy'}, C{'a-zA-Z0-9_$'}, etc.)
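
    A short illustrative sketch::
        vowels_and_digits = Word(srange("[aeiou0-9]"))
        print(vowels_and_digits.parseString("ae10x"))   # -> ['ae10']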
    cSs<t|t�s|Sdjdd�tt|d�t|d�d�D��S)Nr�css|]}t|�VqdS)N)ru)r�r�rwrwrxr��sz+srange.<locals>.<lambda>.<locals>.<genexpr>rrr)rzr"r�r��ord)�prwrwrxry�szsrange.<locals>.<lambda>r�c3s|]}�|�VqdS)Nrw)r��part)�	_expandedrwrxr��szsrange.<locals>.<genexpr>N)r��_reBracketExprr�rxrK)r�rw)r|rxr_rs
 cs�fdd�}|S)zt
    Helper method for defining parse actions that require matching at a specific
    column in the input text.
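
    A minimal illustrative sketch (column numbers are 1-based)::
        data = "1  2\n3  4"
        col4_num = Word(nums).setParseAction(matchOnlyAtCol(4))
        print(col4_num.searchString(data))   # -> [['2'], ['4']]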
    cs"t||��krt||d���dS)Nzmatched token not at column %d)r9r)r)Zlocnr1)r�rwrx�	verifyCol�sz!matchOnlyAtCol.<locals>.verifyColrw)r�r~rw)r�rxrM�scs�fdd�S)a�
    Helper method for common parse actions that simply return a literal value.  Especially
    useful when used with C{L{transformString<ParserElement.transformString>}()}.

    Example::
        num = Word(nums).setParseAction(lambda toks: int(toks[0]))
        na = oneOf("N/A NA").setParseAction(replaceWith(math.nan))
        term = na | num
        
        OneOrMore(term).parseString("324 234 N/A 234") # -> [324, 234, nan, 234]
    cs�gS)Nrw)r�r5rv)�replStrrwrxry�szreplaceWith.<locals>.<lambda>rw)rrw)rrxr\�scCs|ddd�S)a
    Helper parse action for removing quotation marks from parsed quoted strings.

    Example::
        # by default, quotation marks are included in parsed results
        quotedString.parseString("'Now is the Winter of our Discontent'") # -> ["'Now is the Winter of our Discontent'"]

        # use removeQuotes to strip quotation marks from parsed results
        quotedString.setParseAction(removeQuotes)
        quotedString.parseString("'Now is the Winter of our Discontent'") # -> ["Now is the Winter of our Discontent"]
    rrrrsrw)r�r5rvrwrwrxrZ�scsN��fdd�}yt�dt�d�j�}Wntk
rBt��}YnX||_|S)aG
    Helper to define a parse action by mapping a function to all elements of a ParseResults list. If any additional 
    args are passed, they are forwarded to the given function as additional arguments after
    the token, as in C{hex_integer = Word(hexnums).setParseAction(tokenMap(int, 16))}, which will convert the
    parsed data to an integer using base 16.

    Example (compare the last example to the one in L{ParserElement.transformString})::
        hex_ints = OneOrMore(Word(hexnums)).setParseAction(tokenMap(int, 16))
        hex_ints.runTests('''
            00 11 22 aa FF 0a 0d 1a
            ''')
        
        upperword = Word(alphas).setParseAction(tokenMap(str.upper))
        OneOrMore(upperword).runTests('''
            my kingdom for a horse
            ''')

        wd = Word(alphas).setParseAction(tokenMap(str.title))
        OneOrMore(wd).setParseAction(' '.join).runTests('''
            now is the winter of our discontent made glorious summer by this sun of york
            ''')
    prints::
        00 11 22 aa FF 0a 0d 1a
        [0, 17, 34, 170, 255, 10, 13, 26]

        my kingdom for a horse
        ['MY', 'KINGDOM', 'FOR', 'A', 'HORSE']

        now is the winter of our discontent made glorious summer by this sun of york
        ['Now Is The Winter Of Our Discontent Made Glorious Summer By This Sun Of York']
    cs��fdd�|D�S)Ncsg|]}�|f����qSrwrw)r�Ztokn)r�r6rwrxr��sz(tokenMap.<locals>.pa.<locals>.<listcomp>rw)r�r5rv)r�r6rwrxr}�sztokenMap.<locals>.par�rH)rJr�rKr{)r6r�r}rLrw)r�r6rxrm�s cCst|�j�S)N)r�r�)rvrwrwrxry�scCst|�j�S)N)r��lower)rvrwrwrxry�scCs�t|t�r|}t||d�}n|j}tttd�}|r�tj�j	t
�}td�|d�tt
t|td�|���tddgd�jd	�j	d
d��td�}n�d
jdd�tD��}tj�j	t
�t|�B}td�|d�tt
t|j	t�ttd�|����tddgd�jd	�j	dd��td�}ttd�|d�}|jdd
j|jdd�j�j���jd|�}|jdd
j|jdd�j�j���jd|�}||_||_||fS)zRInternal helper to construct opening and closing tag expressions, given a tag name)r�z_-:r�tag�=�/F)r�rCcSs|ddkS)Nrr�rw)r�r5rvrwrwrxry�sz_makeTags.<locals>.<lambda>rr�css|]}|dkr|VqdS)rNrw)r�r�rwrwrxr��sz_makeTags.<locals>.<genexpr>cSs|ddkS)Nrr�rw)r�r5rvrwrwrxry�sz</r��:r�z<%s>rz</%s>)rzr�rr�r/r4r3r>r�r�rZr+rr2rrrmr�rVrYrBr
�_Lr��titler�rir�)�tagStrZxmlZresnameZtagAttrNameZtagAttrValueZopenTagZprintablesLessRAbrackZcloseTagrwrwrx�	_makeTags�s"
T\..r�cCs
t|d�S)a 
    Helper to construct opening and closing tag expressions for HTML, given a tag name. Matches
    tags in either upper or lower case, attributes with namespaces and with quoted or unquoted values.

    Example::
        text = '<td>More info at the <a href="http://pyparsing.wikispaces.com">pyparsing</a> wiki page</td>'
        # makeHTMLTags returns pyparsing expressions for the opening and closing tags as a 2-tuple
        a,a_end = makeHTMLTags("A")
        link_expr = a + SkipTo(a_end)("link_text") + a_end
        
        for link in link_expr.searchString(text):
            # attributes in the <A> tag (like "href" shown here) are also accessible as named results
            print(link.link_text, '->', link.href)
    prints::
        pyparsing -> http://pyparsing.wikispaces.com
    F)r�)r�rwrwrxrK�scCs
t|d�S)z�
    Helper to construct opening and closing tag expressions for XML, given a tag name. Matches
    tags only in the given upper/lower case.

    Example: similar to L{makeHTMLTags}
    T)r�)r�rwrwrxrLscs8|r|dd��n|j��dd��D���fdd�}|S)a<
    Helper to create a validating parse action to be used with start tags created
    with C{L{makeXMLTags}} or C{L{makeHTMLTags}}. Use C{withAttribute} to qualify a starting tag
    with a required attribute value, to avoid false matches on common tags such as
    C{<TD>} or C{<DIV>}.

    Call C{withAttribute} with a series of attribute names and values. Specify the list
    of filter attributes names and values as:
     - keyword arguments, as in C{(align="right")}, or
     - as an explicit dict with C{**} operator, when an attribute name is also a Python
          reserved word, as in C{**{"class":"Customer", "align":"right"}}
     - a list of name-value tuples, as in ( ("ns1:class", "Customer"), ("ns2:align","right") )
    For attribute names with a namespace prefix, you must use the second form.  Attribute
    names are matched insensitive to upper/lower case.
       
    If just testing for C{class} (with or without a namespace), use C{L{withClass}}.

    To verify that the attribute exists, but without specifying a value, pass
    C{withAttribute.ANY_VALUE} as the value.

    Example::
        html = '''
            <div>
            Some text
            <div type="grid">1 4 0 1 0</div>
            <div type="graph">1,3 2,3 1,1</div>
            <div>this has no type</div>
            </div>
                
        '''
        div,div_end = makeHTMLTags("div")

        # only match div tag having a type attribute with value "grid"
        div_grid = div().setParseAction(withAttribute(type="grid"))
        grid_expr = div_grid + SkipTo(div | div_end)("body")
        for grid_header in grid_expr.searchString(html):
            print(grid_header.body)
        
        # construct a match with any div tag having a type attribute, regardless of the value
        div_any_type = div().setParseAction(withAttribute(type=withAttribute.ANY_VALUE))
        div_expr = div_any_type + SkipTo(div | div_end)("body")
        for div_header in div_expr.searchString(html):
            print(div_header.body)
    prints::
        1 4 0 1 0

        1 4 0 1 0
        1,3 2,3 1,1
    NcSsg|]\}}||f�qSrwrw)r�r�r�rwrwrxr�Qsz!withAttribute.<locals>.<listcomp>cs^xX�D]P\}}||kr&t||d|��|tjkr|||krt||d||||f��qWdS)Nzno matching attribute z+attribute '%s' has value '%s', must be '%s')rre�	ANY_VALUE)r�r5r�ZattrNameZ	attrValue)�attrsrwrxr}RszwithAttribute.<locals>.pa)r�)r�ZattrDictr}rw)r�rxres2cCs|rd|nd}tf||i�S)a�
    Simplified version of C{L{withAttribute}} when matching on a div class - made
    difficult because C{class} is a reserved word in Python.

    Example::
        html = '''
            <div>
            Some text
            <div class="grid">1 4 0 1 0</div>
            <div class="graph">1,3 2,3 1,1</div>
            <div>this &lt;div&gt; has no class</div>
            </div>
                
        '''
        div,div_end = makeHTMLTags("div")
        div_grid = div().setParseAction(withClass("grid"))
        
        grid_expr = div_grid + SkipTo(div | div_end)("body")
        for grid_header in grid_expr.searchString(html):
            print(grid_header.body)
        
        div_any_type = div().setParseAction(withClass(withAttribute.ANY_VALUE))
        div_expr = div_any_type + SkipTo(div | div_end)("body")
        for div_header in div_expr.searchString(html):
            print(div_header.body)
    prints::
        1 4 0 1 0

        1 4 0 1 0
        1,3 2,3 1,1
    z%s:class�class)re)Z	classname�	namespaceZ	classattrrwrwrxrk\s �(rcCs�t�}||||B}�x`t|�D�]R\}}|ddd�\}}	}
}|	dkrTd|nd|}|	dkr�|dksxt|�dkr�td��|\}
}t�j|�}|
tjk�rd|	dkr�t||�t|t	|��}n�|	dk�r|dk	�rt|||�t|t	||��}nt||�t|t	|��}nD|	dk�rZt||
|||�t||
|||�}ntd	��n�|
tj
k�rH|	dk�r�t|t��s�t|�}t|j
|�t||�}n�|	dk�r|dk	�r�t|||�t|t	||��}nt||�t|t	|��}nD|	dk�r>t||
|||�t||
|||�}ntd	��ntd
��|�r`|j|�||j|�|BK}|}q"W||K}|S)a�	
    Helper method for constructing grammars of expressions made up of
    operators working in a precedence hierarchy.  Operators may be unary or
    binary, left- or right-associative.  Parse actions can also be attached
    to operator expressions. The generated parser will also recognize the use 
    of parentheses to override operator precedences (see example below).
    
    Note: if you define a deep operator list, you may see performance issues
    when using infixNotation. See L{ParserElement.enablePackrat} for a
    mechanism to potentially improve your parser performance.

    Parameters:
     - baseExpr - expression representing the most basic element for the nested expression
     - opList - list of tuples, one for each operator precedence level in the
      expression grammar; each tuple is of the form
      (opExpr, numTerms, rightLeftAssoc, parseAction), where:
       - opExpr is the pyparsing expression for the operator;
          may also be a string, which will be converted to a Literal;
          if numTerms is 3, opExpr is a tuple of two expressions, for the
          two operators separating the 3 terms
       - numTerms is the number of terms for this operator (must
          be 1, 2, or 3)
       - rightLeftAssoc is the indicator whether the operator is
          right or left associative, using the pyparsing-defined
          constants C{opAssoc.RIGHT} and C{opAssoc.LEFT}.
       - parseAction is the parse action to be associated with
          expressions matching this operator expression (the
          parse action tuple member may be omitted)
     - lpar - expression for matching left-parentheses (default=C{Suppress('(')})
     - rpar - expression for matching right-parentheses (default=C{Suppress(')')})

    Example::
        # simple example of four-function arithmetic with ints and variable names
        integer = pyparsing_common.signed_integer
        varname = pyparsing_common.identifier 
        
        arith_expr = infixNotation(integer | varname,
            [
            ('-', 1, opAssoc.RIGHT),
            (oneOf('* /'), 2, opAssoc.LEFT),
            (oneOf('+ -'), 2, opAssoc.LEFT),
            ])
        
        arith_expr.runTests('''
            5+3*6
            (5+3)*6
            -2--11
            ''', fullDump=False)
    prints::
        5+3*6
        [[5, '+', [3, '*', 6]]]

        (5+3)*6
        [[[5, '+', 3], '*', 6]]

        -2--11
        [[['-', 2], '-', ['-', 11]]]
    Nrroz%s termz	%s%s termrqz@if numterms=3, opExpr must be a tuple or list of two expressionsrrz6operator must be unary (1), binary (2), or ternary (3)z2operator must indicate right or left associativity)N)rr�r�r�rirT�LEFTrrr�RIGHTrzrr.r�)ZbaseExprZopListZlparZrparr�ZlastExprr�ZoperDefZopExprZarityZrightLeftAssocr}ZtermNameZopExpr1ZopExpr2ZthisExprrsrwrwrxri�sR;

&




&


z4"(?:[^"\n\r\\]|(?:"")|(?:\\(?:[^x]|x[0-9a-fA-F]+)))*�"z string enclosed in double quotesz4'(?:[^'\n\r\\]|(?:'')|(?:\\(?:[^x]|x[0-9a-fA-F]+)))*�'z string enclosed in single quotesz*quotedString using single or double quotes�uzunicode string literalcCs�||krtd��|dk�r(t|t�o,t|t��r t|�dkr�t|�dkr�|dk	r�tt|t||tjdd���j	dd��}n$t
j�t||tj�j	dd��}nx|dk	r�tt|t|�t|�ttjdd���j	dd��}n4ttt|�t|�ttjdd���j	d	d��}ntd
��t
�}|dk	�rb|tt|�t||B|B�t|��K}n$|tt|�t||B�t|��K}|jd||f�|S)a~	
    Helper method for defining nested lists enclosed in opening and closing
    delimiters ("(" and ")" are the default).

    Parameters:
     - opener - opening character for a nested list (default=C{"("}); can also be a pyparsing expression
     - closer - closing character for a nested list (default=C{")"}); can also be a pyparsing expression
     - content - expression for items within the nested lists (default=C{None})
     - ignoreExpr - expression for ignoring opening and closing delimiters (default=C{quotedString})

    If an expression is not provided for the content argument, the nested
    expression will capture all whitespace-delimited content between delimiters
    as a list of separate values.

    Use the C{ignoreExpr} argument to define expressions that may contain
    opening or closing characters that should not be treated as opening
    or closing characters for nesting, such as quotedString or a comment
    expression.  Specify multiple expressions using an C{L{Or}} or C{L{MatchFirst}}.
    The default is L{quotedString}, but if no expressions are to be ignored,
    then pass C{None} for this argument.

    Example::
        data_type = oneOf("void int short long char float double")
        decl_data_type = Combine(data_type + Optional(Word('*')))
        ident = Word(alphas+'_', alphanums+'_')
        number = pyparsing_common.number
        arg = Group(decl_data_type + ident)
        LPAR,RPAR = map(Suppress, "()")

        code_body = nestedExpr('{', '}', ignoreExpr=(quotedString | cStyleComment))

        c_function = (decl_data_type("type") 
                      + ident("name")
                      + LPAR + Optional(delimitedList(arg), [])("args") + RPAR 
                      + code_body("body"))
        c_function.ignore(cStyleComment)
        
        source_code = '''
            int is_odd(int x) { 
                return (x%2); 
            }
                
            int dec_to_hex(char hchar) { 
                if (hchar >= '0' && hchar <= '9') { 
                    return (ord(hchar)-ord('0')); 
                } else { 
                    return (10+ord(hchar)-ord('A'));
                } 
            }
        '''
        for func in c_function.searchString(source_code):
            print("%(name)s (%(type)s) args: %(args)s" % func)

    prints::
        is_odd (int) args: [['int', 'x']]
        dec_to_hex (int) args: [['char', 'hchar']]
    z.opening and closing strings cannot be the sameNrr)r
cSs|dj�S)Nr)r�)rvrwrwrxry9sznestedExpr.<locals>.<lambda>cSs|dj�S)Nr)r�)rvrwrwrxry<scSs|dj�S)Nr)r�)rvrwrwrxryBscSs|dj�S)Nr)r�)rvrwrwrxryFszOopening and closing arguments must be strings if no content expression is givenznested %s%s expression)r�rzr�r�r
rr	r$rNr�rCr�rrrr+r2ri)�openerZcloserZcontentrOr�rwrwrxrP�s4:

*$cs��fdd�}�fdd�}�fdd�}tt�jd�j��}t�t�j|�jd�}t�j|�jd	�}t�j|�jd
�}	|r�tt|�|t|t|�t|��|	�}
n$tt|�t|t|�t|���}
|j	t
t��|
jd�S)a
	
    Helper method for defining space-delimited indentation blocks, such as
    those used to define block statements in Python source code.

    Parameters:
     - blockStatementExpr - expression defining syntax of statement that
            is repeated within the indented block
     - indentStack - list created by caller to manage indentation stack
            (multiple statementWithIndentedBlock expressions within a single grammar
            should share a common indentStack)
     - indent - boolean indicating whether block must be indented beyond the
            current level; set to False for a block of left-most statements
            (default=C{True})

    A valid block must contain at least one C{blockStatement}.

    Example::
        data = '''
        def A(z):
          A1
          B = 100
          G = A2
          A2
          A3
        B
        def BB(a,b,c):
          BB1
          def BBA():
            bba1
            bba2
            bba3
        C
        D
        def spam(x,y):
             def eggs(z):
                 pass
        '''


        indentStack = [1]
        stmt = Forward()

        identifier = Word(alphas, alphanums)
        funcDecl = ("def" + identifier + Group( "(" + Optional( delimitedList(identifier) ) + ")" ) + ":")
        func_body = indentedBlock(stmt, indentStack)
        funcDef = Group( funcDecl + func_body )

        rvalue = Forward()
        funcCall = Group(identifier + "(" + Optional(delimitedList(rvalue)) + ")")
        rvalue << (funcCall | identifier | Word(nums))
        assignment = Group(identifier + "=" + rvalue)
        stmt << ( funcDef | assignment | identifier )

        module_body = OneOrMore(stmt)

        parseTree = module_body.parseString(data)
        parseTree.pprint()
    prints::
        [['def',
          'A',
          ['(', 'z', ')'],
          ':',
          [['A1'], [['B', '=', '100']], [['G', '=', 'A2']], ['A2'], ['A3']]],
         'B',
         ['def',
          'BB',
          ['(', 'a', 'b', 'c', ')'],
          ':',
          [['BB1'], [['def', 'BBA', ['(', ')'], ':', [['bba1'], ['bba2'], ['bba3']]]]]],
         'C',
         'D',
         ['def',
          'spam',
          ['(', 'x', 'y', ')'],
          ':',
          [[['def', 'eggs', ['(', 'z', ')'], ':', [['pass']]]]]]] 
    csN|t|�krdSt||�}|�dkrJ|�dkr>t||d��t||d��dS)Nrrzillegal nestingznot a peer entryrsrs)r�r9r!r)r�r5rv�curCol)�indentStackrwrx�checkPeerIndent�s
z&indentedBlock.<locals>.checkPeerIndentcs2t||�}|�dkr"�j|�nt||d��dS)Nrrznot a subentryrs)r9r�r)r�r5rvr�)r�rwrx�checkSubIndent�s
z%indentedBlock.<locals>.checkSubIndentcsN|t|�krdSt||�}�o4|�dko4|�dksBt||d���j�dS)Nrrrqznot an unindentrsr:)r�r9rr�)r�r5rvr�)r�rwrx�
checkUnindent�s
z$indentedBlock.<locals>.checkUnindentz	 �INDENTr�ZUNINDENTzindented block)rrr�r�r
r�rirrr�rk)ZblockStatementExprr�rr�r�r�r!r�ZPEERZUNDENTZsmExprrw)r�rxrfQsN,z#[\0xc0-\0xd6\0xd8-\0xf6\0xf8-\0xff]z[\0xa1-\0xbf\0xd7\0xf7]z_:zany tagzgt lt amp nbsp quot aposz><& "'z&(?P<entity>rnz);zcommon HTML entitycCstj|j�S)zRHelper parser action to replace common HTML entities with their special characters)�_htmlEntityMapr�Zentity)rvrwrwrxr[�sz/\*(?:[^*]|\*(?!/))*z*/zC style commentz<!--[\s\S]*?-->zHTML commentz.*zrest of linez//(?:\\\n|[^\n])*z
// commentzC++ style commentz#.*zPython style comment)r�z 	�	commaItem)r�c@s�eZdZdZee�Zee�Ze	e
�jd�je�Z
e	e�jd�jeed��Zed�jd�je�Ze�je�de�je�jd�Zejd	d
��eeeed�j�e�Bjd�Zeje�ed
�jd�je�Zed�jd�je�ZeeBeBj�Zed�jd�je�Ze	eded�jd�Zed�jd�Z ed�jd�Z!e!de!djd�Z"ee!de!d>�dee!de!d?�jd�Z#e#j$d d
��d!e jd"�Z%e&e"e%Be#Bjd#��jd#�Z'ed$�jd%�Z(e)d@d'd(��Z*e)dAd*d+��Z+ed,�jd-�Z,ed.�jd/�Z-ed0�jd1�Z.e/j�e0j�BZ1e)d2d3��Z2e&e3e4d4�e5�e	e6d4d5�ee7d6����j�jd7�Z8e9ee:j;�e8Bd8d9��jd:�Z<e)ed;d
���Z=e)ed<d
���Z>d=S)Brna�

    Here are some common low-level expressions that may be useful in jump-starting parser development:
     - numeric forms (L{integers<integer>}, L{reals<real>}, L{scientific notation<sci_real>})
     - common L{programming identifiers<identifier>}
     - network addresses (L{MAC<mac_address>}, L{IPv4<ipv4_address>}, L{IPv6<ipv6_address>})
     - ISO8601 L{dates<iso8601_date>} and L{datetime<iso8601_datetime>}
     - L{UUID<uuid>}
     - L{comma-separated list<comma_separated_list>}
    Parse actions:
     - C{L{convertToInteger}}
     - C{L{convertToFloat}}
     - C{L{convertToDate}}
     - C{L{convertToDatetime}}
     - C{L{stripHTMLTags}}
     - C{L{upcaseTokens}}
     - C{L{downcaseTokens}}

    Example::
        pyparsing_common.number.runTests('''
            # any int or real number, returned as the appropriate type
            100
            -100
            +100
            3.14159
            6.02e23
            1e-12
            ''')

        pyparsing_common.fnumber.runTests('''
            # any int or real number, returned as float
            100
            -100
            +100
            3.14159
            6.02e23
            1e-12
            ''')

        pyparsing_common.hex_integer.runTests('''
            # hex numbers
            100
            FF
            ''')

        pyparsing_common.fraction.runTests('''
            # fractions
            1/2
            -3/4
            ''')

        pyparsing_common.mixed_integer.runTests('''
            # mixed fractions
            1
            1/2
            -3/4
            1-3/4
            ''')

        import uuid
        pyparsing_common.uuid.setParseAction(tokenMap(uuid.UUID))
        pyparsing_common.uuid.runTests('''
            # uuid
            12345678-1234-5678-1234-567812345678
            ''')
    prints::
        # any int or real number, returned as the appropriate type
        100
        [100]

        -100
        [-100]

        +100
        [100]

        3.14159
        [3.14159]

        6.02e23
        [6.02e+23]

        1e-12
        [1e-12]

        # any int or real number, returned as float
        100
        [100.0]

        -100
        [-100.0]

        +100
        [100.0]

        3.14159
        [3.14159]

        6.02e23
        [6.02e+23]

        1e-12
        [1e-12]

        # hex numbers
        100
        [256]

        FF
        [255]

        # fractions
        1/2
        [0.5]

        -3/4
        [-0.75]

        # mixed fractions
        1
        [1]

        1/2
        [0.5]

        -3/4
        [-0.75]

        1-3/4
        [1.75]

        # uuid
        12345678-1234-5678-1234-567812345678
        [UUID('12345678-1234-5678-1234-567812345678')]
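
    These expressions compose like any other pyparsing elements. A minimal, hand-written sketch (the field layout and result names below are invented for illustration and are not part of the original module)::

        from pyparsing import pyparsing_common as ppc

        # ppc.number yields an int or float as appropriate; ppc.ipv4_address yields the matched string
        record = ppc.number("count") + ppc.number("ratio") + ppc.ipv4_address("host")
        print(record.parseString("42 3.14 127.0.0.1").asList())   # -> [42, 3.14, '127.0.0.1']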
    [bytecode for the pyparsing_common expression definitions named in the surviving fragments: integer, hex_integer, signed_integer, fraction, mixed_integer, real, sci_real, fnumber, identifier, ipv4_address, the IPv6 sub-patterns, and mac_address; the convertToDate docstring follows:]
        Helper to create a parse action for converting parsed date string to Python datetime.date

        Params -
         - fmt - format to be passed to datetime.strptime (default=C{"%Y-%m-%d"})

        Example::
            date_expr = pyparsing_common.iso8601_date.copy()
            date_expr.setParseAction(pyparsing_common.convertToDate())
            print(date_expr.parseString("1999-12-31"))
        prints::
            [datetime.date(1999, 12, 31)]
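
        A failed conversion is reported as a ParseException rather than a bare ValueError. A short sketch (dates chosen arbitrarily for the example)::

            from pyparsing import pyparsing_common

            date_expr = pyparsing_common.iso8601_date.copy()
            date_expr.setParseAction(pyparsing_common.convertToDate())
            print(date_expr.parseString("2000-02-29"))   # -> [datetime.date(2000, 2, 29)]
            try:
                date_expr.parseString("2000-02-30")      # not a real date
            except Exception as err:
                print(err)                               # strptime's error, wrapped in a ParseException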
        [bytecode for the cvt_fn closure: datetime.strptime(...).date(), with ValueError re-raised as ParseException; the convertToDatetime docstring follows:]
        Helper to create a parse action for converting parsed datetime string to Python datetime.datetime

        Params -
         - fmt - format to be passed to datetime.strptime (default=C{"%Y-%m-%dT%H:%M:%S.%f"})

        Example::
            dt_expr = pyparsing_common.iso8601_datetime.copy()
            dt_expr.setParseAction(pyparsing_common.convertToDatetime())
            print(dt_expr.parseString("1999-12-31T23:59:59.999"))
        prints::
            [datetime.datetime(1999, 12, 31, 23, 59, 59, 999000)]
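
        The same pattern applies to full timestamps; a brief sketch (the value is arbitrary, and the fmt shown is simply the default made explicit)::

            from pyparsing import pyparsing_common

            dt_expr = pyparsing_common.iso8601_datetime.copy()
            dt_expr.setParseAction(pyparsing_common.convertToDatetime("%Y-%m-%dT%H:%M:%S.%f"))
            print(dt_expr.parseString("2016-10-07T01:31:00.000"))
            # -> [datetime.datetime(2016, 10, 7, 1, 31)]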
        [bytecode for the corresponding cvt_fn closure, plus the iso8601_date, iso8601_datetime and uuid regular expressions; the stripHTMLTags docstring follows:]
        Parse action to remove HTML tags from web page HTML source

        Example::
            # strip HTML links from normal text 
            text = '<td>More info at the <a href="http://pyparsing.wikispaces.com">pyparsing</a> wiki page</td>'
            td,td_end = makeHTMLTags("TD")
            table_text = td + SkipTo(td_end).setParseAction(pyparsing_common.stripHTMLTags)("body") + td_end
            
            print(table_text.parseString(text).body) # -> 'More info at the pyparsing wiki page'
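
        The same parse action also works with scanString when scrubbing several cells; a rough sketch (the sample markup is invented for the example)::

            from pyparsing import makeHTMLTags, SkipTo, pyparsing_common

            td, td_end = makeHTMLTags("TD")
            cell = td + SkipTo(td_end).setParseAction(pyparsing_common.stripHTMLTags)("body") + td_end
            html = '<td>plain</td><td>with <b>markup</b></td>'
            for tokens, start, end in cell.scanString(html):
                print(tokens.body)   # -> 'plain', then 'with markup'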
        [bytecode for the stripHTMLTags implementation, the comma_separated_list expression, upcaseTokens/downcaseTokens, and the attribute wiring of the pyparsing_common class; the module self-test (__main__) strings follow — first the inputs for a small SELECT-statement grammar:]
        # '*' as column list and dotted table name
        select * from SYS.XYZZY

        # caseless match on "SELECT", and casts back to "select"
        SELECT * from XYZZY, ABC

        # list of column names, and mixed case SELECT keyword
        Select AA,BB,CC from Sys.dual

        # multiple tables
        Select A, B, C from Sys.dual, Table2

        # invalid SELECT keyword - should fail
        Xelect A, B, C from Sys.dual

        # incomplete command - should fail
        Select

        # invalid column name - should fail
        Select ^^^ frox Sys.dual

        [next self-test input: numeric literals for pyparsing_common.number / fnumber]
        100
        -100
        +100
        3.14159
        6.02e23
        1e-12
        [next self-test input: hex numbers for pyparsing_common.hex_integer]
        100
        FF
        [next self-test input: a UUID for pyparsing_common.uuid]
        12345678-1234-5678-1234-567812345678
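
        The strings above are the inputs to pyparsing's built-in self-test: a small SELECT-statement grammar plus the pyparsing_common number/hex/uuid expressions. The grammar itself is not recoverable from the bytecode; the identifier names (selectToken, fromToken, columnName, columnSpec, tableName, simpleSQL) are visible in it, but the definitions below are reconstructed in the spirit of pyparsing's simpleSQL example and may differ from the original::

            from pyparsing import CaselessKeyword, Word, Group, delimitedList, alphas, alphanums

            selectToken = CaselessKeyword("select")
            fromToken = CaselessKeyword("from")
            ident = Word(alphas, alphanums + "_$").setName("identifier")
            columnName = delimitedList(ident, ".", combine=True)
            columnSpec = ('*' | Group(delimitedList(columnName)))
            tableName = delimitedList(ident, ".", combine=True)
            simpleSQL = (selectToken("command") + columnSpec("columns") +
                         fromToken + Group(delimitedList(tableName))("tables"))

            print(simpleSQL.parseString("Select AA,BB,CC from Sys.dual").dump())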
        [remaining unrecoverable bytecode for pyparsing's module epilogue: the __all__ wiring, Python 2/3 compatibility shims, and the simpleSQL self-test grammar exercised by the strings above]
site-packages/pkg_resources/_vendor/__pycache__/six.cpython-36.pyc

[unrecoverable compiled CPython 3.6 bytecode for the vendored six module (the Python 2/3 compatibility layer). The recoverable fragments are class and helper names: the six.moves lazy-import machinery (_LazyDescr, MovedModule, MovedAttribute, _SixMetaPathImporter, and the urllib_parse / urllib_error / urllib_request / urllib_response / urllib_robotparser shims) plus helpers such as print_, reraise, raise_from, exec_, with_metaclass, add_metaclass, and python_2_unicode_compatible.]
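For reference, the public surface this compatibility layer exposes; a brief hand-written sketch, not recovered from the bytecode, assuming the standalone six package is importable:

    import six
    from six.moves.urllib.parse import urlparse   # six.moves hides the Py2/Py3 renames

    print(six.PY2, six.PY3)                       # exactly one of these is True
    print(six.text_type("abc"))                   # unicode on Python 2, str on Python 3
    print(urlparse("https://example.org/path").netloc)   # -> example.org
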
site-packages/pkg_resources/_vendor/__pycache__/appdirs.cpython-36.pyc

[compiled CPython 3.6 bytecode for the vendored appdirs module; only the embedded docstrings are recoverable and are kept below.]

Utilities for determining application-specific dirs.

See <http://github.com/ActiveState/appdirs> for details and usage.

[bytecode for the version constants and platform-detection prologue; the user_data_dir docstring follows:]
Return full path to the user-specific data dir for this application.

        "appname" is the name of application.
            If None, just the system directory is returned.
        "appauthor" (only used on Windows) is the name of the
            appauthor or distributing body for this application. Typically
            it is the owning company name. This falls back to appname. You may
            pass False to disable it.
        "version" is an optional version path element to append to the
            path. You might want to use this if you want multiple versions
            of your app to be able to run independently. If used, this
            would typically be "<major>.<minor>".
            Only applied when appname is present.
        "roaming" (boolean, default False) can be set True to use the Windows
            roaming appdata directory. That means that for users on a Windows
            network setup for roaming profiles, this user data will be
            sync'd on login. See
            <http://technet.microsoft.com/en-us/library/cc766489(WS.10).aspx>
            for a discussion of issues.

    Typical user data directories are:
        Mac OS X:               ~/Library/Application Support/<AppName>
        Unix:                   ~/.local/share/<AppName>    # or in $XDG_DATA_HOME, if defined
        Win XP (not roaming):   C:\Documents and Settings\<username>\Application Data\<AppAuthor>\<AppName>
        Win XP (roaming):       C:\Documents and Settings\<username>\Local Settings\Application Data\<AppAuthor>\<AppName>
        Win 7  (not roaming):   C:\Users\<username>\AppData\Local\<AppAuthor>\<AppName>
        Win 7  (roaming):       C:\Users\<username>\AppData\Roaming\<AppAuthor>\<AppName>

    For Unix, we follow the XDG spec and support $XDG_DATA_HOME.
    That means, by default "~/.local/share/<AppName>".
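
    A short usage sketch (the application and author names are placeholders; this vendored copy imports as pkg_resources._vendor.appdirs, the standalone package simply as appdirs):

        import appdirs

        print(appdirs.user_data_dir("SuperApp", "Acme"))                 # e.g. ~/.local/share/SuperApp on Linux
        print(appdirs.user_data_dir("SuperApp", "Acme", roaming=True))   # roaming AppData dir on Windows
        print(appdirs.user_cache_dir("SuperApp", "Acme"))                # e.g. ~/.cache/SuperApp on Linux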
    [bytecode for the user_data_dir implementation; the site_data_dir docstring follows:]

Return full path to the user-shared data dir for this application.

        "appname" is the name of application.
            If None, just the system directory is returned.
        "appauthor" (only used on Windows) is the name of the
            appauthor or distributing body for this application. Typically
            it is the owning company name. This falls back to appname. You may
            pass False to disable it.
        "version" is an optional version path element to append to the
            path. You might want to use this if you want multiple versions
            of your app to be able to run independently. If used, this
            would typically be "<major>.<minor>".
            Only applied when appname is present.
        "multipath" is an optional parameter only applicable to *nix
            which indicates that the entire list of data dirs should be
            returned. By default, the first item from XDG_DATA_DIRS is
            returned, or '/usr/local/share/<AppName>',
            if XDG_DATA_DIRS is not set

    Typical user data directories are:
        Mac OS X:   /Library/Application Support/<AppName>
        Unix:       /usr/local/share/<AppName> or /usr/share/<AppName>
        Win XP:     C:\Documents and Settings\All Users\Application Data\<AppAuthor>\<AppName>
        Vista:      (Fail! "C:\ProgramData" is a hidden *system* directory on Vista.)
        Win 7:      C:\ProgramData\<AppAuthor>\<AppName>   # Hidden, but writeable on Win 7.

    For Unix, this is using the $XDG_DATA_DIRS[0] default.

    WARNING: Do not use this on Windows. See the Vista-Fail note above for why.
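
    A corresponding sketch for the site-wide variant; with multipath=True the *nix result is every entry of $XDG_DATA_DIRS joined with os.pathsep (placeholder names as above):

        import appdirs

        print(appdirs.site_data_dir("SuperApp", "Acme"))                  # e.g. /usr/local/share/SuperApp
        print(appdirs.site_data_dir("SuperApp", "Acme", multipath=True))  # all of $XDG_DATA_DIRS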
    [bytecode for the site_data_dir implementation; the user_config_dir docstring follows:]

Return full path to the user-specific config dir for this application.

        "appname" is the name of application.
            If None, just the system directory is returned.
        "appauthor" (only used on Windows) is the name of the
            appauthor or distributing body for this application. Typically
            it is the owning company name. This falls back to appname. You may
            pass False to disable it.
        "version" is an optional version path element to append to the
            path. You might want to use this if you want multiple versions
            of your app to be able to run independently. If used, this
            would typically be "<major>.<minor>".
            Only applied when appname is present.
        "roaming" (boolean, default False) can be set True to use the Windows
            roaming appdata directory. That means that for users on a Windows
            network setup for roaming profiles, this user data will be
            sync'd on login. See
            <http://technet.microsoft.com/en-us/library/cc766489(WS.10).aspx>
            for a discussion of issues.

    Typical user data directories are:
        Mac OS X:               same as user_data_dir
        Unix:                   ~/.config/<AppName>     # or in $XDG_CONFIG_HOME, if defined
        Win *:                  same as user_data_dir

    For Unix, we follow the XDG spec and support $XDG_CONFIG_HOME.
    That means, by default "~/.config/<AppName>".
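
    For example (same placeholder names as in the earlier sketches):

        import appdirs

        print(appdirs.user_config_dir("SuperApp", "Acme"))   # e.g. ~/.config/SuperApp on Linux
        print(appdirs.site_config_dir("SuperApp", "Acme"))   # e.g. /etc/xdg/SuperApp on Linux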
    [bytecode for the user_config_dir implementation; the site_config_dir docstring follows:]
Return full path to the user-shared config dir for this application.

        "appname" is the name of application.
            If None, just the system directory is returned.
        "appauthor" (only used on Windows) is the name of the
            appauthor or distributing body for this application. Typically
            it is the owning company name. This falls back to appname. You may
            pass False to disable it.
        "version" is an optional version path element to append to the
            path. You might want to use this if you want multiple versions
            of your app to be able to run independently. If used, this
            would typically be "<major>.<minor>".
            Only applied when appname is present.
        "multipath" is an optional parameter only applicable to *nix
            which indicates that the entire list of config dirs should be
            returned. By default, the first item from XDG_CONFIG_DIRS is
            returned, or '/etc/xdg/<AppName>', if XDG_CONFIG_DIRS is not set

    Typical user data directories are:
        Mac OS X:   same as site_data_dir
        Unix:       /etc/xdg/<AppName> or $XDG_CONFIG_DIRS[i]/<AppName> for each value in
                    $XDG_CONFIG_DIRS
        Win *:      same as site_data_dir
        Vista:      (Fail! "C:\ProgramData" is a hidden *system* directory on Vista.)

    For Unix, this is using the $XDG_CONFIG_DIRS[0] default, if multipath=False

    WARNING: Do not use this on Windows. See the Vista-Fail note above for why.
    rrZXDG_CONFIG_DIRSz/etc/xdgcSs g|]}tjj|jtj���qSr)rr
rrr)rrrrrr �sz#site_config_dir.<locals>.<listcomp>csg|]}tjj|�g��qSr)rrr)rr)rrrr �sr)rr)rr%rr
rrr"r!)rrrr#r
r$r)rr�site_config_dir�s
r'TcCs�tdkrd|dkr|}tjjtd��}|r�|dk	rBtjj|||�}ntjj||�}|r�tjj|d�}nNtdkr�tjjd�}|r�tjj||�}n&tjdtjjd	��}|r�tjj||�}|r�|r�tjj||�}|S)
aReturn full path to the user-specific cache dir for this application.

        "appname" is the name of application.
            If None, just the system directory is returned.
        "appauthor" (only used on Windows) is the name of the
            appauthor or distributing body for this application. Typically
            it is the owning company name. This falls back to appname. You may
            pass False to disable it.
        "version" is an optional version path element to append to the
            path. You might want to use this if you want multiple versions
            of your app to be able to run independently. If used, this
            would typically be "<major>.<minor>".
            Only applied when appname is present.
        "opinion" (boolean) can be False to disable the appending of
            "Cache" to the base app data dir for Windows. See
            discussion below.

    Typical user cache directories are:
        Mac OS X:   ~/Library/Caches/<AppName>
        Unix:       ~/.cache/<AppName> (XDG default)
        Win XP:     C:\Documents and Settings\<username>\Local Settings\Application Data\<AppAuthor>\<AppName>\Cache
        Vista:      C:\Users\<username>\AppData\Local\<AppAuthor>\<AppName>\Cache

    On Windows the only suggestion in the MSDN docs is that local settings go in
    the `CSIDL_LOCAL_APPDATA` directory. This is identical to the non-roaming
    app data dir (the default returned by `user_data_dir` above). Apps typically
    put cache data somewhere *under* the given dir here. Some examples:
        ...\Mozilla\Firefox\Profiles\<ProfileName>\Cache
        ...\Acme\SuperApp\Cache\1.0
    OPINION: This function appends "Cache" to the `CSIDL_LOCAL_APPDATA` value.
    This can be disabled with the `opinion=False` option.
    rNr
FZCacherz~/Library/CachesZXDG_CACHE_HOMEz~/.cache)rrr
rrrrr)rrr�opinionr
rrr�user_cache_dirs(!r)cCs�tdkr tjjtjjd�|�}nNtdkrLt|||�}d}|rntjj|d�}n"t|||�}d}|rntjj|d�}|r�|r�tjj||�}|S)a�Return full path to the user-specific log dir for this application.

        "appname" is the name of application.
            If None, just the system directory is returned.
        "appauthor" (only used on Windows) is the name of the
            appauthor or distributing body for this application. Typically
            it is the owning company name. This falls back to appname. You may
            pass False to disable it.
        "version" is an optional version path element to append to the
            path. You might want to use this if you want multiple versions
            of your app to be able to run independently. If used, this
            would typically be "<major>.<minor>".
            Only applied when appname is present.
        "opinion" (boolean) can be False to disable the appending of
            "Logs" to the base app data dir for Windows, and "log" to the
            base cache dir for Unix. See discussion below.

    Typical user cache directories are:
        Mac OS X:   ~/Library/Logs/<AppName>
        Unix:       ~/.cache/<AppName>/log  # or under $XDG_CACHE_HOME if defined
        Win XP:     C:\Documents and Settings\<username>\Local Settings\Application Data\<AppAuthor>\<AppName>\Logs
        Vista:      C:\Users\<username>\AppData\Local\<AppAuthor>\<AppName>\Logs

    On Windows the only suggestion in the MSDN docs is that local settings
    go in the `CSIDL_LOCAL_APPDATA` directory. (Note: I'm interested in
    examples of what some windows apps use for a logs dir.)

    OPINION: This function appends "Logs" to the `CSIDL_LOCAL_APPDATA`
    value for Windows and appends "log" to the user cache dir for Unix.
    This can be disabled with the `opinion=False` option.
    rz~/Library/LogsrFZLogs�log)rrr
rrrr))rrrr(r
rrr�user_log_dir:s  
r+c@sbeZdZdZddd�Zedd��Zedd	��Zed
d��Zedd
��Z	edd��Z
edd��ZdS)�AppDirsz1Convenience wrapper for getting application dirs.NFcCs"||_||_||_||_||_dS)N)rrrrr#)�selfrrrrr#rrr�__init__os
zAppDirs.__init__cCst|j|j|j|jd�S)N)rr)rrrrr)r-rrrrws
zAppDirs.user_data_dircCst|j|j|j|jd�S)N)rr#)r%rrrr#)r-rrrr%|s
zAppDirs.site_data_dircCst|j|j|j|jd�S)N)rr)r&rrrr)r-rrrr&�s
zAppDirs.user_config_dircCst|j|j|j|jd�S)N)rr#)r'rrrr#)r-rrrr'�s
zAppDirs.site_config_dircCst|j|j|jd�S)N)r)r)rrr)r-rrrr)�s
zAppDirs.user_cache_dircCst|j|j|jd�S)N)r)r+rrr)r-rrrr+�s
zAppDirs.user_log_dir)NNFF)�__name__�
__module__�__qualname__�__doc__r.�propertyrr%r&r'r)r+rrrrr,ms
r,cCs:ddl}dddd�|}|j|jd�}|j||�\}}|S)z�This is a fallback technique at best. I'm not sure if using the
    registry for this guarantees us the correct answer for all CSIDL_*
    names.
    rNZAppDatazCommon AppDataz
Local AppData)r	rr
z@Software\Microsoft\Windows\CurrentVersion\Explorer\Shell Folders)�_winreg�OpenKey�HKEY_CURRENT_USERZQueryValueEx)�
csidl_namer4Zshell_folder_name�key�dir�typerrr�_get_win_folder_from_registry�sr;cCs�ddlm}m}|jdt||�dd�}y`t|�}d}x|D]}t|�dkr:d}Pq:W|r�yddl}|j|�}Wnt	k
r�YnXWnt
k
r�YnX|S)Nr)�shellcon�shellF�T)�win32com.shellr<r=�SHGetFolderPath�getattr�unicode�ord�win32api�GetShortPathName�ImportError�UnicodeError)r7r<r=r9�
has_high_char�crDrrr�_get_win_folder_with_pywin32�s$

rJcCs�ddl}dddd�|}|jd�}|jjjd|dd|�d}x|D]}t|�dkrBd	}PqBW|r�|jd�}|jjj|j|d�r�|}|jS)
Nr��#�)r	rr
iFr>T)	�ctypesZcreate_unicode_buffer�windllZshell32ZSHGetFolderPathWrCZkernel32ZGetShortPathNameW�value)r7rNZcsidl_const�bufrHrIZbuf2rrr�_get_win_folder_with_ctypes�s"


rRcCs�ddl}ddlm}ddlm}|jjd}|jd|�}|jj	}|j
dt|j|�d|jj
|�|jj|j��jd�}d}x|D]}	t|	�dkr~d	}Pq~W|r�|jd|�}|jj	}
tj|||�r�|jj|j��jd�}|S)
Nr)�jna)r�rI�Fr>T)�arrayZcom.sunrSZcom.sun.jna.platformrZWinDefZMAX_PATHZzerosZShell32ZINSTANCEr@rAZShlObjZSHGFP_TYPE_CURRENTZNativeZtoStringZtostringrrCZKernel32ZkernalrE)r7rVrSrZbuf_sizerQr=r9rHrIZkernelrrr�_get_win_folder_with_jna�s&
rW)rO�__main__ZMyAppZ	MyCompanyz%-- app dirs (with optional 'version')z1.0)rz%s: %sz)
-- app dirs (without optional 'version')z+
-- app dirs (without optional 'appauthor')z(
-- app dirs (with disabled 'appauthor'))r)rrr)NNNF)NNNF)NNNF)NNNF)NNNT)NNNT)rr%r&r'r)r+),r2Z__version_info__r�map�str�__version__�sysr�version_infoZPY3rB�platform�
startswithZjava_verZos_namerrr%r&r'r)r+�objectr,r;rJrRrWr?Zwin32comrrFrNrOZcom.sun.jnaZcomr/rrZprops�print�dirsZproprArrrr�<module>	s~


7
B
(
3
9
3+






site-packages/pkg_resources/_vendor/__pycache__/__init__.cpython-36.pyc

[trivial compiled bytecode for the vendored package's empty __init__ module]

site-packages/pkg_resources/_vendor/__pycache__/appdirs.cpython-36.opt-1.pyc

[compiled bytecode for the optimized (-O) build of appdirs; this archive entry has the same recorded size as appdirs.cpython-36.pyc above and its recoverable docstrings are identical, so they are not repeated here.]
site-packages/pkg_resources/_vendor/__pycache__/pyparsing.cpython-36.opt-1.pyc

[compiled CPython 3.6 bytecode for the optimized (-O) build of the vendored pyparsing module; the readable text that follows consists of its embedded docstrings.]

��f��@s�dZdZdZdZddlZddlmZddlZddl	Z	ddl
Z
ddlZddlZddl
Z
ddlZddlZddlZddlmZyddlmZWn ek
r�ddlmZYnXydd	l
mZWn>ek
r�ydd	lmZWnek
r�dZYnXYnXd
ddd
ddddddddddddddddddd d!d"d#d$d%d&d'd(d)d*d+d,d-d.d/d0d1d2d3d4d5d6d7d8d9d:d;d<d=d>d?d@dAdBdCdDdEdFdGdHdIdJdKdLdMdNdOdPdQdRdSdTdUdVdWdXdYdZd[d\d]d^d_d`dadbdcdddedfdgdhdidjdkdldmdndodpdqdrgiZee	j�dds�ZeddskZe�r"e	jZe Z!e"Z#e Z$e%e&e'e(e)ee*e+e,e-e.gZ/nbe	j0Ze1Z2dtdu�Z$gZ/ddl3Z3xBdvj4�D]6Z5ye/j6e7e3e5��Wne8k
�r|�wJYnX�qJWe9dwdx�e2dy�D��Z:dzd{�Z;Gd|d}�d}e<�Z=ej>ej?Z@d~ZAeAdZBe@eAZCe"d��ZDd�jEd�dx�ejFD��ZGGd�d!�d!eH�ZIGd�d#�d#eI�ZJGd�d%�d%eI�ZKGd�d'�d'eK�ZLGd�d*�d*eH�ZMGd�d��d�e<�ZNGd�d&�d&e<�ZOe
jPjQeO�d�d=�ZRd�dN�ZSd�dK�ZTd�d��ZUd�d��ZVd�d��ZWd�dU�ZX�d/d�d��ZYGd�d(�d(e<�ZZGd�d0�d0eZ�Z[Gd�d�de[�Z\Gd�d�de[�Z]Gd�d�de[�Z^e^Z_e^eZ_`Gd�d�de[�ZaGd�d�de^�ZbGd�d�dea�ZcGd�dp�dpe[�ZdGd�d3�d3e[�ZeGd�d+�d+e[�ZfGd�d)�d)e[�ZgGd�d
�d
e[�ZhGd�d2�d2e[�ZiGd�d��d�e[�ZjGd�d�dej�ZkGd�d�dej�ZlGd�d�dej�ZmGd�d.�d.ej�ZnGd�d-�d-ej�ZoGd�d5�d5ej�ZpGd�d4�d4ej�ZqGd�d$�d$eZ�ZrGd�d
�d
er�ZsGd�d �d er�ZtGd�d�der�ZuGd�d�der�ZvGd�d"�d"eZ�ZwGd�d�dew�ZxGd�d�dew�ZyGd�d��d�ew�ZzGd�d�dez�Z{Gd�d6�d6ez�Z|Gd�d��d�e<�Z}e}�Z~Gd�d�dew�ZGd�d,�d,ew�Z�Gd�d�dew�Z�Gd�d��d�e��Z�Gd�d1�d1ew�Z�Gd�d�de��Z�Gd�d�de��Z�Gd�d�de��Z�Gd�d/�d/e��Z�Gd�d�de<�Z�d�df�Z��d0d�dD�Z��d1d�d@�Z�d�d΄Z�d�dS�Z�d�dR�Z�d�d҄Z��d2d�dW�Z�d�dE�Z��d3d�dk�Z�d�dl�Z�d�dn�Z�e\�j�dG�Z�el�j�dM�Z�em�j�dL�Z�en�j�de�Z�eo�j�dd�Z�eeeDd�d�dڍj�d�d܄�Z�efd݃j�d�d܄�Z�efd߃j�d�d܄�Z�e�e�Be�BeeeGd�dyd�Befd�ej��BZ�e�e�e�d�e��Z�e^d�ed�j�d�e�e{e�e�B��j�d�d�Z�d�dc�Z�d�dQ�Z�d�d`�Z�d�d^�Z�d�dq�Z�e�d�d܄�Z�e�d�d܄�Z�d�d�Z�d�dO�Z�d�dP�Z�d�di�Z�e<�e�_��d4d�do�Z�e=�Z�e<�e�_�e<�e�_�e�d��e�d��fd�dm�Z�e�Z�e�efd��d��j�d��Z�e�efd��d��j�d��Z�e�efd��d�efd��d�B�j��d�Z�e�e_�d�e�j��j��d�Z�d�d�de�j�f�ddT�Z��d5�ddj�Z�e��d�Z�e��d�Z�e�eee@eC�d�j��d��\Z�Z�e�e��d	j4��d
��Z�ef�d�djEe�j��d
�j��d�ZĐdd_�Z�e�ef�d��d�j��d�Z�ef�d�j��d�Z�ef�d�jȃj��d�Z�ef�d�j��d�Z�e�ef�d��de�B�j��d�Z�e�Z�ef�d�j��d�Z�e�e{eeeGdɐd�eee�d�e^dɃem����j΃j��d�Z�e�ee�j�e�Bd��d��j�d>�Z�G�d dr�dr�Z�eҐd!k�r�eb�d"�Z�eb�d#�Z�eee@eC�d$�Z�e�eՐd%dӐd&�j�e��Z�e�e�eփ�j��d'�Zאd(e�BZ�e�eՐd%dӐd&�j�e��Z�e�e�eك�j��d)�Z�eӐd*�eؐd'�e�eڐd)�Z�e�jܐd+�e�j�jܐd,�e�j�jܐd,�e�j�jܐd-�ddl�Z�e�j�j�e�e�j��e�j�jܐd.�dS(6aS
pyparsing module - Classes and methods to define and execute parsing grammars

The pyparsing module is an alternative approach to creating and executing simple grammars,
vs. the traditional lex/yacc approach, or the use of regular expressions.  With pyparsing, you
don't need to learn a new syntax for defining grammars or matching expressions - the parsing module
provides a library of classes that you use to construct the grammar directly in Python.

Here is a program to parse "Hello, World!" (or any greeting of the form 
C{"<salutation>, <addressee>!"}), built up using L{Word}, L{Literal}, and L{And} elements 
(L{'+'<ParserElement.__add__>} operator gives L{And} expressions, strings are auto-converted to
L{Literal} expressions)::

    from pyparsing import Word, alphas

    # define grammar of a greeting
    greet = Word(alphas) + "," + Word(alphas) + "!"

    hello = "Hello, World!"
    print (hello, "->", greet.parseString(hello))

The program outputs the following::

    Hello, World! -> ['Hello', ',', 'World', '!']

The Python representation of the grammar is quite readable, owing to the self-explanatory
class names, and the use of '+', '|' and '^' operators.

The L{ParseResults} object returned from L{ParserElement.parseString<ParserElement.parseString>} can be accessed as a nested list, a dictionary, or an
object with named attributes.

The pyparsing module handles some of the problems that are typically vexing when writing text parsers:
 - extra or missing whitespace (the above program will also handle "Hello,World!", "Hello  ,  World  !", etc.)
 - quoted strings
 - embedded comments
z2.1.10z07 Oct 2016 01:31 UTCz*Paul McGuire <ptmcg@users.sourceforge.net>�N)�ref)�datetime)�RLock)�OrderedDict�And�CaselessKeyword�CaselessLiteral�
CharsNotIn�Combine�Dict�Each�Empty�
FollowedBy�Forward�
GoToColumn�Group�Keyword�LineEnd�	LineStart�Literal�
MatchFirst�NoMatch�NotAny�	OneOrMore�OnlyOnce�Optional�Or�ParseBaseException�ParseElementEnhance�ParseException�ParseExpression�ParseFatalException�ParseResults�ParseSyntaxException�
ParserElement�QuotedString�RecursiveGrammarException�Regex�SkipTo�	StringEnd�StringStart�Suppress�Token�TokenConverter�White�Word�WordEnd�	WordStart�
ZeroOrMore�	alphanums�alphas�
alphas8bit�anyCloseTag�
anyOpenTag�
cStyleComment�col�commaSeparatedList�commonHTMLEntity�countedArray�cppStyleComment�dblQuotedString�dblSlashComment�
delimitedList�dictOf�downcaseTokens�empty�hexnums�htmlComment�javaStyleComment�line�lineEnd�	lineStart�lineno�makeHTMLTags�makeXMLTags�matchOnlyAtCol�matchPreviousExpr�matchPreviousLiteral�
nestedExpr�nullDebugAction�nums�oneOf�opAssoc�operatorPrecedence�
printables�punc8bit�pythonStyleComment�quotedString�removeQuotes�replaceHTMLEntity�replaceWith�
restOfLine�sglQuotedString�srange�	stringEnd�stringStart�traceParseAction�
unicodeString�upcaseTokens�
withAttribute�
indentedBlock�originalTextFor�ungroup�
infixNotation�locatedExpr�	withClass�
CloseMatch�tokenMap�pyparsing_common�cCs`t|t�r|Syt|�Stk
def _ustr(obj):
    """Drop-in replacement for str(obj) that tries to be Unicode friendly. It first tries
       str(obj). If that fails with a UnicodeEncodeError, then it tries unicode(obj). It
       then < returns the unicode object | encodes it with the default encoding | ... >.
    """
    if isinstance(obj, unicode):
        return obj
    try:
        return str(obj)
    except UnicodeEncodeError:
        # Else encode it
        ret = unicode(obj).encode(sys.getdefaultencoding(), 'xmlcharrefreplace')
        xmlcharref = Regex(r'&#\d+;')
        xmlcharref.setParseAction(lambda t: '\\u' + hex(int(t[0][2:-1]))[2:])
        return xmlcharref.transformString(ret)

# build list of single arg builtins, that can be used as parse actions
singleArgBuiltins = []
import __builtin__
for fname in "sum len sorted reversed list tuple set any all min max".split():
    try:
        singleArgBuiltins.append(getattr(__builtin__, fname))
    except AttributeError:
        continue

_generatorType = type((y for y in range(1)))

def _xml_escape(data):
    """Escape &, <, >, ", ', etc. in a string of data."""
    # ampersand must be replaced first
    from_symbols = '&><"\''
    to_symbols = ('&' + s + ';' for s in "amp gt lt quot apos".split())
    for from_, to_ in zip(from_symbols, to_symbols):
        data = data.replace(from_, to_)
    return data

class _Constants(object):
    pass

alphas = string.ascii_uppercase + string.ascii_lowercase
nums = "0123456789"
hexnums = nums + "ABCDEFabcdef"
alphanums = alphas + nums
_bslash = chr(92)
printables = "".join(c for c in string.printable if c not in string.whitespace)
whitespace)r��crwrwrxr��sc@sPeZdZdZddd�Zedd��Zdd	�Zd
d�Zdd
�Z	ddd�Z
dd�ZdS)rz7base exception class for all parsing runtime exceptionsrNcCs>||_|dkr||_d|_n||_||_||_|||f|_dS)Nr�)�loc�msg�pstr�
parserElement�args)�selfr�r�r��elemrwrwrx�__init__�szParseBaseException.__init__cCs||j|j|j|j�S)z�
        internal factory method to simplify creating one type of ParseException 
        from another - avoids having __init__ signature conflicts among subclasses
        )r�r�r�r�)�cls�perwrwrx�_from_exception�sz"ParseBaseException._from_exceptioncCsN|dkrt|j|j�S|dkr,t|j|j�S|dkrBt|j|j�St|��dS)z�supported attributes by name are:
            - lineno - returns the line number of the exception text
            - col - returns the column number of the exception text
            - line - returns the line containing the exception text
        rJr9�columnrGN)r9r�)rJr�r�r9rG�AttributeError)r�Zanamerwrwrx�__getattr__�szParseBaseException.__getattr__cCsd|j|j|j|jfS)Nz"%s (at char %d), (line:%d, col:%d))r�r�rJr�)r�rwrwrx�__str__�szParseBaseException.__str__cCst|�S)N)r�)r�rwrwrx�__repr__�szParseBaseException.__repr__�>!<cCs<|j}|jd}|r4dj|d|�|||d�f�}|j�S)z�Extracts the exception line from the input string, and marks
           the location of the exception with a special symbol.
        rrr�N)rGr��join�strip)r�ZmarkerStringZline_strZline_columnrwrwrx�
markInputline�s
z ParseBaseException.markInputlinecCsdj�tt|��S)Nzlineno col line)r��dir�type)r�rwrwrx�__dir__�szParseBaseException.__dir__)rNN)r�)r�r�r��__doc__r��classmethodr�r�r�r�r�r�rwrwrwrxr�s


c@seZdZdZdS)raN
    Exception thrown when parse expressions don't match class;
    supported attributes by name are:
     - lineno - returns the line number of the exception text
     - col - returns the column number of the exception text
     - line - returns the line containing the exception text
        
    Example::
        try:
            Word(nums).setName("integer").parseString("ABC")
        except ParseException as pe:
            print(pe)
            print("column: {}".format(pe.col))
            
    prints::
       Expected integer (at char 0), (line:1, col:1)
        column: 1
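
    The exception can also mark where in the input line the failure occurred::

        try:
            Word(nums).setName("integer").parseString("ABC")
        except ParseException as pe:
            print(pe.markInputline())

    prints::
       >!<ABC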
    N)r�r�r�r�rwrwrwrxr�sc@seZdZdZdS)r!znuser-throwable exception thrown when inconsistent parse content
       is found; stops all parsing immediatelyN)r�r�r�r�rwrwrwrxr!sc@seZdZdZdS)r#z�just like L{ParseFatalException}, but thrown internally when an
       L{ErrorStop<And._ErrorStop>} ('-' operator) indicates that parsing is to stop 
       immediately because an unbacktrackable syntax error has been foundN)r�r�r�r�rwrwrwrxr#sc@s eZdZdZdd�Zdd�ZdS)r&zZexception thrown by L{ParserElement.validate} if the grammar could be improperly recursivecCs
||_dS)N)�parseElementTrace)r��parseElementListrwrwrxr�sz"RecursiveGrammarException.__init__cCs
d|jS)NzRecursiveGrammarException: %s)r�)r�rwrwrxr� sz!RecursiveGrammarException.__str__N)r�r�r�r�r�r�rwrwrwrxr&sc@s,eZdZdd�Zdd�Zdd�Zdd�Zd	S)
�_ParseResultsWithOffsetcCs||f|_dS)N)�tup)r�Zp1Zp2rwrwrxr�$sz _ParseResultsWithOffset.__init__cCs
|j|S)N)r�)r��irwrwrx�__getitem__&sz#_ParseResultsWithOffset.__getitem__cCst|jd�S)Nr)�reprr�)r�rwrwrxr�(sz _ParseResultsWithOffset.__repr__cCs|jd|f|_dS)Nr)r�)r�r�rwrwrx�	setOffset*sz!_ParseResultsWithOffset.setOffsetN)r�r�r�r�r�r�r�rwrwrwrxr�#sr�c@s�eZdZdZd[dd�Zddddefdd�Zdd	�Zefd
d�Zdd
�Z	dd�Z
dd�Zdd�ZeZ
dd�Zdd�Zdd�Zdd�Zdd�Zer�eZeZeZn$eZeZeZdd�Zd d!�Zd"d#�Zd$d%�Zd&d'�Zd\d(d)�Zd*d+�Zd,d-�Zd.d/�Zd0d1�Z d2d3�Z!d4d5�Z"d6d7�Z#d8d9�Z$d:d;�Z%d<d=�Z&d]d?d@�Z'dAdB�Z(dCdD�Z)dEdF�Z*d^dHdI�Z+dJdK�Z,dLdM�Z-d_dOdP�Z.dQdR�Z/dSdT�Z0dUdV�Z1dWdX�Z2dYdZ�Z3dS)`r"aI
    Structured parse results, to provide multiple means of access to the parsed data:
       - as a list (C{len(results)})
       - by list index (C{results[0], results[1]}, etc.)
       - by attribute (C{results.<resultsName>} - see L{ParserElement.setResultsName})

    Example::
        integer = Word(nums)
        date_str = (integer.setResultsName("year") + '/' 
                        + integer.setResultsName("month") + '/' 
                        + integer.setResultsName("day"))
        # equivalent form:
        # date_str = integer("year") + '/' + integer("month") + '/' + integer("day")

        # parseString returns a ParseResults object
        result = date_str.parseString("1999/12/31")

        def test(s, fn=repr):
            print("%s -> %s" % (s, fn(eval(s))))
        test("list(result)")
        test("result[0]")
        test("result['month']")
        test("result.day")
        test("'month' in result")
        test("'minutes' in result")
        test("result.dump()", str)
    prints::
        list(result) -> ['1999', '/', '12', '/', '31']
        result[0] -> '1999'
        result['month'] -> '12'
        result.day -> '31'
        'month' in result -> True
        'minutes' in result -> False
        result.dump() -> ['1999', '/', '12', '/', '31']
        - day: 31
        - month: 12
        - year: 1999
    NTcCs"t||�r|Stj|�}d|_|S)NT)rz�object�__new__�_ParseResults__doinit)r��toklist�name�asList�modalZretobjrwrwrxr�Ts


zParseResults.__new__c
Cs`|jrvd|_d|_d|_i|_||_||_|dkr6g}||t�rP|dd�|_n||t�rft|�|_n|g|_t	�|_
|dk	o�|�r\|s�d|j|<||t�r�t|�}||_||t
d�ttf�o�|ddgfk�s\||t�r�|g}|�r&||t��rt|j�d�||<ntt|d�d�||<|||_n6y|d||<Wn$tttfk
�rZ|||<YnXdS)NFrr�)r��_ParseResults__name�_ParseResults__parent�_ParseResults__accumNames�_ParseResults__asList�_ParseResults__modal�list�_ParseResults__toklist�_generatorType�dict�_ParseResults__tokdictrur�r��
basestringr"r��copy�KeyError�	TypeError�
IndexError)r�r�r�r�r�rzrwrwrxr�]sB



$
zParseResults.__init__cCsPt|ttf�r|j|S||jkr4|j|ddStdd�|j|D��SdS)NrrrcSsg|]}|d�qS)rrw)r��vrwrwrx�
<listcomp>�sz,ParseResults.__getitem__.<locals>.<listcomp>rs)rzru�slicer�r�r�r")r�r�rwrwrxr��s


zParseResults.__getitem__cCs�||t�r0|jj|t��|g|j|<|d}nD||ttf�rN||j|<|}n&|jj|t��t|d�g|j|<|}||t�r�t|�|_	dS)Nr)
r�r��getr�rur�r�r"�wkrefr�)r��kr�rz�subrwrwrx�__setitem__�s


"
zParseResults.__setitem__c
Cs�t|ttf�r�t|j�}|j|=t|t�rH|dkr:||7}t||d�}tt|j|���}|j�x^|j	j
�D]F\}}x<|D]4}x.t|�D]"\}\}}	t||	|	|k�||<q�Wq|WqnWn|j	|=dS)Nrrr)
rzrur��lenr�r��range�indices�reverser��items�	enumerater�)
r�r�ZmylenZremovedr��occurrences�jr��value�positionrwrwrx�__delitem__�s


$zParseResults.__delitem__cCs
||jkS)N)r�)r�r�rwrwrx�__contains__�szParseResults.__contains__cCs
t|j�S)N)r�r�)r�rwrwrx�__len__�szParseResults.__len__cCs
|jS)N)r�)r�rwrwrx�__bool__�szParseResults.__bool__cCs
t|j�S)N)�iterr�)r�rwrwrx�__iter__�szParseResults.__iter__cCst|jddd��S)Nrrrs)r�r�)r�rwrwrx�__reversed__�szParseResults.__reversed__cCs$t|jd�r|jj�St|j�SdS)N�iterkeys)�hasattrr�r�r�)r�rwrwrx�	_iterkeys�s
zParseResults._iterkeyscs�fdd��j�D�S)Nc3s|]}�|VqdS)Nrw)r�r�)r�rwrxr��sz+ParseResults._itervalues.<locals>.<genexpr>)r�)r�rw)r�rx�_itervalues�szParseResults._itervaluescs�fdd��j�D�S)Nc3s|]}|�|fVqdS)Nrw)r�r�)r�rwrxr��sz*ParseResults._iteritems.<locals>.<genexpr>)r�)r�rw)r�rx�
_iteritems�szParseResults._iteritemscCst|j��S)zVReturns all named result keys (as a list in Python 2.x, as an iterator in Python 3.x).)r�r�)r�rwrwrx�keys�szParseResults.keyscCst|j��S)zXReturns all named result values (as a list in Python 2.x, as an iterator in Python 3.x).)r��
itervalues)r�rwrwrx�values�szParseResults.valuescCst|j��S)zfReturns all named result key-values (as a list of tuples in Python 2.x, as an iterator in Python 3.x).)r��	iteritems)r�rwrwrxr��szParseResults.itemscCs
t|j�S)z�Since keys() returns an iterator, this method is helpful in bypassing
           code that looks for the existence of any defined results names.)�boolr�)r�rwrwrx�haskeys�szParseResults.haskeyscOs�|s
dg}x6|j�D]*\}}|dkr2|d|f}qtd|��qWt|dt�sht|�dksh|d|kr�|d}||}||=|S|d}|SdS)a�
        Removes and returns item at specified index (default=C{last}).
        Supports both C{list} and C{dict} semantics for C{pop()}. If passed no
        argument or an integer argument, it will use C{list} semantics
        and pop tokens from the list of parsed tokens. If passed a 
        non-integer argument (most likely a string), it will use C{dict}
        semantics and pop the corresponding value from any defined 
        results names. A second default return value argument is 
        supported, just as in C{dict.pop()}.

        Example::
            def remove_first(tokens):
                tokens.pop(0)
            print(OneOrMore(Word(nums)).parseString("0 123 321")) # -> ['0', '123', '321']
            print(OneOrMore(Word(nums)).addParseAction(remove_first).parseString("0 123 321")) # -> ['123', '321']

            label = Word(alphas)
            patt = label("LABEL") + OneOrMore(Word(nums))
            print(patt.parseString("AAB 123 321").dump())

            # Use pop() in a parse action to remove named result (note that corresponding value is not
            # removed from list form of results)
            def remove_LABEL(tokens):
                tokens.pop("LABEL")
                return tokens
            patt.addParseAction(remove_LABEL)
            print(patt.parseString("AAB 123 321").dump())
        prints::
            ['AAB', '123', '321']
            - LABEL: AAB

            ['AAB', '123', '321']
        rr�defaultrz-pop() got an unexpected keyword argument '%s'Nrs)r�r�rzrur�)r�r��kwargsr�r��indexr�Zdefaultvaluerwrwrx�pop�s"zParseResults.popcCs||kr||S|SdS)ai
        Returns named result matching the given key, or if there is no
        such name, then returns the given C{defaultValue} or C{None} if no
        C{defaultValue} is specified.

        Similar to C{dict.get()}.
        
        Example::
            integer = Word(nums)
            date_str = integer("year") + '/' + integer("month") + '/' + integer("day")           

            result = date_str.parseString("1999/12/31")
            print(result.get("year")) # -> '1999'
            print(result.get("hour", "not specified")) # -> 'not specified'
            print(result.get("hour")) # -> None
        Nrw)r��key�defaultValuerwrwrxr�szParseResults.getcCsZ|jj||�xF|jj�D]8\}}x.t|�D]"\}\}}t||||k�||<q,WqWdS)a
        Inserts new element at location index in the list of parsed tokens.
        
        Similar to C{list.insert()}.

        Example::
            print(OneOrMore(Word(nums)).parseString("0 123 321")) # -> ['0', '123', '321']

            # use a parse action to insert the parse location in the front of the parsed results
            def insert_locn(locn, tokens):
                tokens.insert(0, locn)
            print(OneOrMore(Word(nums)).addParseAction(insert_locn).parseString("0 123 321")) # -> [0, '0', '123', '321']
        N)r��insertr�r�r�r�)r�r�ZinsStrr�r�r�r�r�rwrwrxr�2szParseResults.insertcCs|jj|�dS)a�
        Add single element to end of ParseResults list of elements.

        Example::
            print(OneOrMore(Word(nums)).parseString("0 123 321")) # -> ['0', '123', '321']
            
            # use a parse action to compute the sum of the parsed integers, and add it to the end
            def append_sum(tokens):
                tokens.append(sum(map(int, tokens)))
            print(OneOrMore(Word(nums)).addParseAction(append_sum).parseString("0 123 321")) # -> ['0', '123', '321', 444]
        N)r��append)r��itemrwrwrxr�FszParseResults.appendcCs$t|t�r||7}n|jj|�dS)a
        Add sequence of elements to end of ParseResults list of elements.

        Example::
            patt = OneOrMore(Word(alphas))
            
            # use a parse action to append the reverse of the matched strings, to make a palindrome
            def make_palindrome(tokens):
                tokens.extend(reversed([t[::-1] for t in tokens]))
                return ''.join(tokens)
            print(patt.addParseAction(make_palindrome).parseString("lskdj sdlkjf lksd")) # -> 'lskdjsdlkjflksddsklfjkldsjdksl'
        N)rzr"r��extend)r�Zitemseqrwrwrxr�Ts

zParseResults.extendcCs|jdd�=|jj�dS)z7
        Clear all elements and results names.
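
        Example::
            result = OneOrMore(Word(nums)).parseString("1 2 3")
            result.clear()
            print(result)   # -> []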
        N)r�r��clear)r�rwrwrxr�fszParseResults.clearcCsfy||Stk
rdSX||jkr^||jkrD|j|ddStdd�|j|D��SndSdS)Nr�rrrcSsg|]}|d�qS)rrw)r�r�rwrwrxr�wsz,ParseResults.__getattr__.<locals>.<listcomp>rs)r�r�r�r")r�r�rwrwrxr�ms

zParseResults.__getattr__cCs|j�}||7}|S)N)r�)r��otherr�rwrwrx�__add__{szParseResults.__add__cs�|jrnt|j���fdd��|jj�}�fdd�|D�}x4|D],\}}|||<t|dt�r>t|�|d_q>W|j|j7_|jj	|j�|S)Ncs|dkr�S|�S)Nrrw)�a)�offsetrwrxry�sz'ParseResults.__iadd__.<locals>.<lambda>c	s4g|],\}}|D]}|t|d�|d��f�qqS)rrr)r�)r�r��vlistr�)�	addoffsetrwrxr��sz)ParseResults.__iadd__.<locals>.<listcomp>r)
r�r�r�r�rzr"r�r�r��update)r�r�Z
otheritemsZotherdictitemsr�r�rw)rrrx�__iadd__�s


zParseResults.__iadd__cCs&t|t�r|dkr|j�S||SdS)Nr)rzrur�)r�r�rwrwrx�__radd__�szParseResults.__radd__cCsdt|j�t|j�fS)Nz(%s, %s))r�r�r�)r�rwrwrxr��szParseResults.__repr__cCsddjdd�|jD��dS)N�[z, css(|] }t|t�rt|�nt|�VqdS)N)rzr"r�r�)r�r�rwrwrxr��sz'ParseResults.__str__.<locals>.<genexpr>�])r�r�)r�rwrwrxr��szParseResults.__str__r�cCsPg}xF|jD]<}|r"|r"|j|�t|t�r:||j�7}q|jt|��qW|S)N)r�r�rzr"�
_asStringListr�)r��sep�outr�rwrwrxr
�s

zParseResults._asStringListcCsdd�|jD�S)a�
        Returns the parse results as a nested list of matching tokens, all converted to strings.

        Example::
            patt = OneOrMore(Word(alphas))
            result = patt.parseString("sldkj lsdkj sldkj")
            # even though the result prints in string-like form, it is actually a pyparsing ParseResults
            print(type(result), result) # -> <class 'pyparsing.ParseResults'> ['sldkj', 'lsdkj', 'sldkj']
            
            # Use asList() to create an actual list
            result_list = result.asList()
            print(type(result_list), result_list) # -> <class 'list'> ['sldkj', 'lsdkj', 'sldkj']
        cSs"g|]}t|t�r|j�n|�qSrw)rzr"r�)r��resrwrwrxr��sz'ParseResults.asList.<locals>.<listcomp>)r�)r�rwrwrxr��szParseResults.asListcs6tr|j}n|j}�fdd��t�fdd�|�D��S)a�
        Returns the named parse results as a nested dictionary.

        Example::
            integer = Word(nums)
            date_str = integer("year") + '/' + integer("month") + '/' + integer("day")
            
            result = date_str.parseString('12/31/1999')
            print(type(result), repr(result)) # -> <class 'pyparsing.ParseResults'> (['12', '/', '31', '/', '1999'], {'day': [('1999', 4)], 'year': [('12', 0)], 'month': [('31', 2)]})
            
            result_dict = result.asDict()
            print(type(result_dict), repr(result_dict)) # -> <class 'dict'> {'day': '1999', 'year': '12', 'month': '31'}

            # even though a ParseResults supports dict-like access, sometimes you just need to have a dict
            import json
            print(json.dumps(result)) # -> Exception: TypeError: ... is not JSON serializable
            print(json.dumps(result.asDict())) # -> {"month": "31", "day": "1999", "year": "12"}
        cs6t|t�r.|j�r|j�S�fdd�|D�Sn|SdS)Ncsg|]}�|��qSrwrw)r�r�)�toItemrwrxr��sz7ParseResults.asDict.<locals>.toItem.<locals>.<listcomp>)rzr"r��asDict)r�)rrwrxr�s

z#ParseResults.asDict.<locals>.toItemc3s|]\}}|�|�fVqdS)Nrw)r�r�r�)rrwrxr��sz&ParseResults.asDict.<locals>.<genexpr>)�PY_3r�r�r�)r�Zitem_fnrw)rrxr�s
	zParseResults.asDictcCs8t|j�}|jj�|_|j|_|jj|j�|j|_|S)zA
        Returns a new copy of a C{ParseResults} object.
        )r"r�r�r�r�r�rr�)r�r�rwrwrxr��s
zParseResults.copyFcCsPd}g}tdd�|jj�D��}|d}|s8d}d}d}d}	|dk	rJ|}	n|jrV|j}	|	sf|rbdSd}	|||d|	d	g7}x�t|j�D]�\}
}t|t�r�|
|kr�||j||
|o�|dk||�g7}n||jd|o�|dk||�g7}q�d}|
|kr�||
}|�s
|�rq�nd}t	t
|��}
|||d|d	|
d
|d	g	7}q�W|||d
|	d	g7}dj|�S)z�
        (Deprecated) Returns the parse results as XML. Tags are created for tokens and lists that have defined results names.
        �
css(|] \}}|D]}|d|fVqqdS)rrNrw)r�r�rr�rwrwrxr��sz%ParseResults.asXML.<locals>.<genexpr>z  r�NZITEM�<�>z</)r�r�r�r�r�r�rzr"�asXMLr�r�r�)r�ZdoctagZnamedItemsOnly�indentZ	formatted�nlrZ
namedItemsZnextLevelIndentZselfTagr�r
ZresTagZxmlBodyTextrwrwrxr�sT


zParseResults.asXMLcCs:x4|jj�D]&\}}x|D]\}}||kr|SqWqWdS)N)r�r�)r�r�r�rr�r�rwrwrxZ__lookup$s
zParseResults.__lookupcCs�|jr|jS|jr.|j�}|r(|j|�SdSnNt|�dkrxt|j�dkrxtt|jj���dddkrxtt|jj���SdSdS)a(
        Returns the results name for this token expression. Useful when several 
        different expressions might match at a particular location.

        Example::
            integer = Word(nums)
            ssn_expr = Regex(r"\d\d\d-\d\d-\d\d\d\d")
            house_number_expr = Suppress('#') + Word(nums, alphanums)
            user_data = (Group(house_number_expr)("house_number") 
                        | Group(ssn_expr)("ssn")
                        | Group(integer)("age"))
            user_info = OneOrMore(user_data)
            
            result = user_info.parseString("22 111-22-3333 #221B")
            for item in result:
                print(item.getName(), ':', item[0])
        prints::
            age : 22
            ssn : 111-22-3333
            house_number : 221B
        Nrrrrs)rrs)	r�r��_ParseResults__lookupr�r��nextr�r�r�)r��parrwrwrx�getName+s
zParseResults.getNamercCsbg}d}|j|t|j���|�rX|j�r�tdd�|j�D��}xz|D]r\}}|r^|j|�|jd|d||f�t|t�r�|r�|j|j||d��q�|jt|��qH|jt	|��qHWn�t
dd�|D���rX|}x~t|�D]r\}	}
t|
t��r*|jd|d||	|d|d|
j||d�f�q�|jd|d||	|d|dt|
�f�q�Wd	j|�S)
aH
        Diagnostic method for listing out the contents of a C{ParseResults}.
        Accepts an optional C{indent} argument so that this string can be embedded
        in a nested display of other data.

        Example::
            integer = Word(nums)
            date_str = integer("year") + '/' + integer("month") + '/' + integer("day")
            
            result = date_str.parseString('12/31/1999')
            print(result.dump())
        prints::
            ['12', '/', '31', '/', '1999']
            - day: 1999
            - month: 31
            - year: 12
        rcss|]\}}t|�|fVqdS)N)r{)r�r�r�rwrwrxr�gsz$ParseResults.dump.<locals>.<genexpr>z
%s%s- %s: z  rrcss|]}t|t�VqdS)N)rzr")r��vvrwrwrxr�ssz
%s%s[%d]:
%s%s%sr�)
r�r�r�r��sortedr�rzr"�dumpr��anyr�r�)r�r�depth�fullr�NLr�r�r�r�rrwrwrxrPs,

4.zParseResults.dumpcOstj|j�f|�|�dS)a�
        Pretty-printer for parsed results as a list, using the C{pprint} module.
        Accepts additional positional or keyword args as defined for the 
        C{pprint.pprint} method. (U{http://docs.python.org/3/library/pprint.html#pprint.pprint})

        Example::
            ident = Word(alphas, alphanums)
            num = Word(nums)
            func = Forward()
            term = ident | num | Group('(' + func + ')')
            func <<= ident + Group(Optional(delimitedList(term)))
            result = func.parseString("fna a,b,(fnb c,d,200),100")
            result.pprint(width=40)
        prints::
            ['fna',
             ['a',
              'b',
              ['(', 'fnb', ['c', 'd', '200'], ')'],
              '100']]
        N)�pprintr�)r�r�r�rwrwrxr"}szParseResults.pprintcCs.|j|jj�|jdk	r|j�p d|j|jffS)N)r�r�r�r�r�r�)r�rwrwrx�__getstate__�s
zParseResults.__getstate__cCsN|d|_|d\|_}}|_i|_|jj|�|dk	rDt|�|_nd|_dS)Nrrr)r�r�r�r�rr�r�)r��staterZinAccumNamesrwrwrx�__setstate__�s
zParseResults.__setstate__cCs|j|j|j|jfS)N)r�r�r�r�)r�rwrwrx�__getnewargs__�szParseResults.__getnewargs__cCstt|��t|j��S)N)r�r�r�r�)r�rwrwrxr��szParseResults.__dir__)NNTT)N)r�)NFr�T)r�rT)4r�r�r�r�r�rzr�r�r�r�r�r�r��__nonzero__r�r�r�r�r�rr�r�r�r�r�r�r�r�r�r�r�r�r�r�rrrr�r�r
r�rr�rrrrr"r#r%r&r�rwrwrwrxr"-sh&
	'	
4

#
=%
-
cCsF|}d|kot|�knr4||ddkr4dS||jdd|�S)aReturns current column within a string, counting newlines as line separators.
   The first column is number 1.

   Note: the default parsing behavior is to expand tabs in the input string
   before starting the parsing process.  See L{I{ParserElement.parseString}<ParserElement.parseString>} for more information
   on parsing strings containing C{<TAB>}s, and suggested methods to maintain a
   consistent view of the parsed string, the parse location, and line and column
   positions within the parsed string.
   rrrr)r��rfind)r��strgr�rwrwrxr9�s
cCs|jdd|�dS)aReturns current line number within a string, counting newlines as line separators.
   The first line is number 1.

   Note: the default parsing behavior is to expand tabs in the input string
   before starting the parsing process.  See L{I{ParserElement.parseString}<ParserElement.parseString>} for more information
   on parsing strings containing C{<TAB>}s, and suggested methods to maintain a
   consistent view of the parsed string, the parse location, and line and column
   positions within the parsed string.
   rrrr)�count)r�r)rwrwrxrJ�s
cCsF|jdd|�}|jd|�}|dkr2||d|�S||dd�SdS)zfReturns the line of text containing loc within a string, counting newlines as line separators.
       rrrrN)r(�find)r�r)ZlastCRZnextCRrwrwrxrG�s
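# Quick self-check sketch of how col, lineno and line relate; the sample string
# below is illustrative only and not part of the original module.
_sample_loc_demo = "abc\ndef"     # loc 5 points at the 'e' in "def"
assert lineno(5, _sample_loc_demo) == 2
assert col(5, _sample_loc_demo) == 2
assert line(5, _sample_loc_demo) == "def"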
cCs8tdt|�dt|�dt||�t||�f�dS)NzMatch z at loc z(%d,%d))�printr�rJr9)�instringr��exprrwrwrx�_defaultStartDebugAction�sr/cCs$tdt|�dt|j���dS)NzMatched z -> )r,r�r{r�)r-�startlocZendlocr.�toksrwrwrx�_defaultSuccessDebugAction�sr2cCstdt|��dS)NzException raised:)r,r�)r-r�r.�excrwrwrx�_defaultExceptionDebugAction�sr4cGsdS)zG'Do-nothing' debug action, to suppress debugging output during parsing.Nrw)r�rwrwrxrQ�srqcs��tkr�fdd�Sdg�dg�tdd�dkrFddd	�}dd
d��ntj}tj�d}|dd
�d}|d|d|f�������fdd�}d}yt�dt�d�j�}Wntk
r�t��}YnX||_|S)Ncs�|�S)Nrw)r��lrv)�funcrwrxry�sz_trim_arity.<locals>.<lambda>rFrqro�cSs8tdkrdnd	}tj||dd�|}|j|jfgS)
Nror7rrqrr)�limit)ror7r������)�system_version�	traceback�
extract_stack�filenamerJ)r8r�
frame_summaryrwrwrxr=sz"_trim_arity.<locals>.extract_stackcSs$tj||d�}|d}|j|jfgS)N)r8rrrs)r<�
extract_tbr>rJ)�tbr8Zframesr?rwrwrxr@sz_trim_arity.<locals>.extract_tb�)r8rrcs�x�y �|�dd��}d�d<|Stk
r��dr>�n4z.tj�d}�|dd�ddd��ksj�Wd~X�d�kr��dd7<w�YqXqWdS)NrTrrrq)r8rsrs)r�r~�exc_info)r�r�rA)r@�
foundArityr6r8�maxargs�pa_call_line_synthrwrx�wrappers"z_trim_arity.<locals>.wrapperz<parse action>r��	__class__)ror7)r)rrs)	�singleArgBuiltinsr;r<r=r@�getattrr��	Exceptionr{)r6rEr=Z	LINE_DIFFZ	this_linerG�	func_namerw)r@rDr6r8rErFrx�_trim_arity�s*
rMcs�eZdZdZdZdZedd��Zedd��Zd�dd	�Z	d
d�Z
dd
�Zd�dd�Zd�dd�Z
dd�Zdd�Zdd�Zdd�Zdd�Zdd�Zd�dd �Zd!d"�Zd�d#d$�Zd%d&�Zd'd(�ZGd)d*�d*e�Zed+k	r�Gd,d-�d-e�ZnGd.d-�d-e�ZiZe�Zd/d/gZ d�d0d1�Z!eZ"ed2d3��Z#dZ$ed�d5d6��Z%d�d7d8�Z&e'dfd9d:�Z(d;d<�Z)e'fd=d>�Z*e'dfd?d@�Z+dAdB�Z,dCdD�Z-dEdF�Z.dGdH�Z/dIdJ�Z0dKdL�Z1dMdN�Z2dOdP�Z3dQdR�Z4dSdT�Z5dUdV�Z6dWdX�Z7dYdZ�Z8d�d[d\�Z9d]d^�Z:d_d`�Z;dadb�Z<dcdd�Z=dedf�Z>dgdh�Z?d�didj�Z@dkdl�ZAdmdn�ZBdodp�ZCdqdr�ZDgfdsdt�ZEd�dudv�ZF�fdwdx�ZGdydz�ZHd{d|�ZId}d~�ZJdd��ZKd�d�d��ZLd�d�d��ZM�ZNS)�r$z)Abstract base level parser element class.z 
	
FcCs
|t_dS)a�
        Overrides the default whitespace chars

        Example::
            # default whitespace chars are space, <TAB> and newline
            OneOrMore(Word(alphas)).parseString("abc def\nghi jkl")  # -> ['abc', 'def', 'ghi', 'jkl']
            
            # change to just treat newline as significant
            ParserElement.setDefaultWhitespaceChars(" \t")
            OneOrMore(Word(alphas)).parseString("abc def\nghi jkl")  # -> ['abc', 'def']
        N)r$�DEFAULT_WHITE_CHARS)�charsrwrwrx�setDefaultWhitespaceChars=s
z'ParserElement.setDefaultWhitespaceCharscCs
|t_dS)a�
        Set class to be used for inclusion of string literals into a parser.
        
        Example::
            # default literal class used is Literal
            integer = Word(nums)
            date_str = integer("year") + '/' + integer("month") + '/' + integer("day")           

            date_str.parseString("1999/12/31")  # -> ['1999', '/', '12', '/', '31']


            # change to Suppress
            ParserElement.inlineLiteralsUsing(Suppress)
            date_str = integer("year") + '/' + integer("month") + '/' + integer("day")           

            date_str.parseString("1999/12/31")  # -> ['1999', '12', '31']
        N)r$�_literalStringClass)r�rwrwrx�inlineLiteralsUsingLsz!ParserElement.inlineLiteralsUsingcCs�t�|_d|_d|_d|_||_d|_tj|_	d|_
d|_d|_t�|_
d|_d|_d|_d|_d|_d|_d|_d|_d|_dS)NTFr�)NNN)r��parseAction�
failAction�strRepr�resultsName�
saveAsList�skipWhitespacer$rN�
whiteChars�copyDefaultWhiteChars�mayReturnEmpty�keepTabs�ignoreExprs�debug�streamlined�
mayIndexError�errmsg�modalResults�debugActions�re�callPreparse�
callDuringTry)r��savelistrwrwrxr�as(zParserElement.__init__cCs<tj|�}|jdd�|_|jdd�|_|jr8tj|_|S)a$
        Make a copy of this C{ParserElement}.  Useful for defining different parse actions
        for the same parsing pattern, using copies of the original parse element.
        
        Example::
            integer = Word(nums).setParseAction(lambda toks: int(toks[0]))
            integerK = integer.copy().addParseAction(lambda toks: toks[0]*1024) + Suppress("K")
            integerM = integer.copy().addParseAction(lambda toks: toks[0]*1024*1024) + Suppress("M")
            
            print(OneOrMore(integerK | integerM | integer).parseString("5K 100 640K 256M"))
        prints::
            [5120, 100, 655360, 268435456]
        Equivalent form of C{expr.copy()} is just C{expr()}::
            integerM = integer().addParseAction(lambda toks: toks[0]*1024*1024) + Suppress("M")
        N)r�rSr]rZr$rNrY)r�Zcpyrwrwrxr�xs
zParserElement.copycCs*||_d|j|_t|d�r&|j|j_|S)af
        Define name for this expression, makes debugging and exception messages clearer.
        
        Example::
            Word(nums).parseString("ABC")  # -> Exception: Expected W:(0123...) (at char 0), (line:1, col:1)
            Word(nums).setName("integer").parseString("ABC")  # -> Exception: Expected integer (at char 0), (line:1, col:1)
        z	Expected �	exception)r�rar�rhr�)r�r�rwrwrx�setName�s


zParserElement.setNamecCs4|j�}|jd�r"|dd�}d}||_||_|S)aP
        Define name for referencing matching tokens as a nested attribute
        of the returned parse results.
        NOTE: this returns a *copy* of the original C{ParserElement} object;
        this is so that the client can define a basic element, such as an
        integer, and reference it in multiple places with different names.

        You can also set results names using the abbreviated syntax,
        C{expr("name")} in place of C{expr.setResultsName("name")} - 
        see L{I{__call__}<__call__>}.

        Example::
            date_str = (integer.setResultsName("year") + '/' 
                        + integer.setResultsName("month") + '/' 
                        + integer.setResultsName("day"))

            # equivalent form:
            date_str = integer("year") + '/' + integer("month") + '/' + integer("day")
        �*NrrTrs)r��endswithrVrb)r�r��listAllMatchesZnewselfrwrwrx�setResultsName�s
zParserElement.setResultsNameTcs@|r&|j�d�fdd�	}�|_||_nt|jd�r<|jj|_|S)z�Method to invoke the Python pdb debugger when this element is
           about to be parsed. Set C{breakFlag} to True to enable, False to
           disable.
        Tcsddl}|j��||||�S)Nr)�pdbZ	set_trace)r-r��	doActions�callPreParsern)�_parseMethodrwrx�breaker�sz'ParserElement.setBreak.<locals>.breaker�_originalParseMethod)TT)�_parsersr�)r�Z	breakFlagrrrw)rqrx�setBreak�s
zParserElement.setBreakcOs&tttt|���|_|jdd�|_|S)a
        Define action to perform when successfully matching parse element definition.
        Parse action fn is a callable method with 0-3 arguments, called as C{fn(s,loc,toks)},
        C{fn(loc,toks)}, C{fn(toks)}, or just C{fn()}, where:
         - s   = the original string being parsed (see note below)
         - loc = the location of the matching substring
         - toks = a list of the matched tokens, packaged as a C{L{ParseResults}} object
        If the functions in fns modify the tokens, they can return them as the return
        value from fn, and the modified list of tokens will replace the original.
        Otherwise, fn does not need to return any value.

        Optional keyword arguments:
         - callDuringTry = (default=C{False}) indicate if parse action should be run during lookaheads and alternate testing

        Note: the default parsing behavior is to expand tabs in the input string
        before starting the parsing process.  See L{I{parseString}<parseString>} for more information
        on parsing strings containing C{<TAB>}s, and suggested methods to maintain a
        consistent view of the parsed string, the parse location, and line and column
        positions within the parsed string.
        
        Example::
            integer = Word(nums)
            date_str = integer + '/' + integer + '/' + integer

            date_str.parseString("1999/12/31")  # -> ['1999', '/', '12', '/', '31']

            # use parse action to convert to ints at parse time
            integer = Word(nums).setParseAction(lambda toks: int(toks[0]))
            date_str = integer + '/' + integer + '/' + integer

            # note that integer fields are now ints, not strings
            date_str.parseString("1999/12/31")  # -> [1999, '/', 12, '/', 31]
        rfF)r��maprMrSr�rf)r��fnsr�rwrwrxr��s"zParserElement.setParseActioncOs4|jtttt|���7_|jp,|jdd�|_|S)z�
        Add parse action to expression's list of parse actions. See L{I{setParseAction}<setParseAction>}.
        
        See examples in L{I{copy}<copy>}.
        rfF)rSr�rvrMrfr�)r�rwr�rwrwrx�addParseAction�szParserElement.addParseActioncsb|jdd��|jdd�rtnt�x(|D] ����fdd�}|jj|�q&W|jpZ|jdd�|_|S)a�Add a boolean predicate function to expression's list of parse actions. See 
        L{I{setParseAction}<setParseAction>} for function call signatures. Unlike C{setParseAction}, 
        functions passed to C{addCondition} need to return boolean success/fail of the condition.

        Optional keyword arguments:
         - message = define a custom message to be used in the raised exception
         - fatal   = if True, will raise ParseFatalException to stop parsing immediately; otherwise will raise ParseException
         
        Example::
            integer = Word(nums).setParseAction(lambda toks: int(toks[0]))
            year_int = integer.copy()
            year_int.addCondition(lambda toks: toks[0] >= 2000, message="Only support years 2000 and later")
            date_str = year_int + '/' + integer + '/' + integer

            result = date_str.parseString("1999/12/31")  # -> Exception: Only support years 2000 and later (at char 0), (line:1, col:1)
        �messagezfailed user-defined condition�fatalFcs$tt��|||��s �||���dS)N)r�rM)r�r5rv)�exc_type�fnr�rwrx�pasz&ParserElement.addCondition.<locals>.parf)r�r!rrSr�rf)r�rwr�r}rw)r{r|r�rx�addCondition�s
zParserElement.addConditioncCs
||_|S)aDefine action to perform if parsing fails at this expression.
           Fail action fn is a callable function that takes the arguments
           C{fn(s,loc,expr,err)} where:
            - s = string being parsed
            - loc = location where expression match was attempted and failed
            - expr = the parse expression that failed
            - err = the exception thrown
           The function returns no value.  It may throw C{L{ParseFatalException}}
           if it is desired to stop parsing immediately.)rT)r�r|rwrwrx�
setFailActions
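# Minimal sketch of a fail action with the signature documented above
# (the function name is illustrative, not part of the original module):
#
#   def report_failure(s, loc, expr, err):
#       print("no match for %s at line %d, col %d" % (expr, lineno(loc, s), col(loc, s)))
#
#   integer = Word(nums).setName("integer").setFailAction(report_failure)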
zParserElement.setFailActioncCsZd}xP|rTd}xB|jD]8}yx|j||�\}}d}qWWqtk
rLYqXqWqW|S)NTF)r]rtr)r�r-r�Z
exprsFound�eZdummyrwrwrx�_skipIgnorables#szParserElement._skipIgnorablescCsL|jr|j||�}|jrH|j}t|�}x ||krF|||krF|d7}q(W|S)Nrr)r]r�rXrYr�)r�r-r�Zwt�instrlenrwrwrx�preParse0szParserElement.preParsecCs|gfS)Nrw)r�r-r�rorwrwrx�	parseImpl<szParserElement.parseImplcCs|S)Nrw)r�r-r��	tokenlistrwrwrx�	postParse?szParserElement.postParsec"Cs�|j}|s|jr�|jdr,|jd|||�|rD|jrD|j||�}n|}|}yDy|j|||�\}}Wn(tk
r�t|t|�|j	|��YnXWnXt
k
r�}	z<|jdr�|jd||||	�|jr�|j||||	��WYdd}	~	XnXn�|o�|j�r|j||�}n|}|}|j�s$|t|�k�rhy|j|||�\}}Wn*tk
�rdt|t|�|j	|��YnXn|j|||�\}}|j|||�}t
||j|j|jd�}
|j�r�|�s�|j�r�|�rVyRxL|jD]B}||||
�}|dk	�r�t
||j|j�o�t|t
tf�|jd�}
�q�WWnFt
k
�rR}	z(|jd�r@|jd||||	��WYdd}	~	XnXnNxL|jD]B}||||
�}|dk	�r^t
||j|j�o�t|t
tf�|jd�}
�q^W|�r�|jd�r�|jd|||||
�||
fS)Nrrq)r�r�rr)r^rTrcrer�r�r�rr�rarr`r�r"rVrWrbrSrfrzr�)r�r-r�rorpZ	debugging�prelocZtokensStart�tokens�errZ	retTokensr|rwrwrx�
_parseNoCacheCsp





zParserElement._parseNoCachecCs>y|j||dd�dStk
r8t|||j|��YnXdS)NF)ror)rtr!rra)r�r-r�rwrwrx�tryParse�szParserElement.tryParsecCs2y|j||�Wnttfk
r(dSXdSdS)NFT)r�rr�)r�r-r�rwrwrx�canParseNext�s
zParserElement.canParseNextc@seZdZdd�ZdS)zParserElement._UnboundedCachecsdi�t�|_���fdd�}�fdd�}�fdd�}tj||�|_tj||�|_tj||�|_dS)Ncs�j|��S)N)r�)r�r�)�cache�not_in_cacherwrxr��sz3ParserElement._UnboundedCache.__init__.<locals>.getcs|�|<dS)Nrw)r�r�r�)r�rwrx�set�sz3ParserElement._UnboundedCache.__init__.<locals>.setcs�j�dS)N)r�)r�)r�rwrxr��sz5ParserElement._UnboundedCache.__init__.<locals>.clear)r�r��types�
MethodTyper�r�r�)r�r�r�r�rw)r�r�rxr��sz&ParserElement._UnboundedCache.__init__N)r�r�r�r�rwrwrwrx�_UnboundedCache�sr�Nc@seZdZdd�ZdS)zParserElement._FifoCachecsht�|_�t����fdd�}��fdd�}�fdd�}tj||�|_tj||�|_tj||�|_dS)Ncs�j|��S)N)r�)r�r�)r�r�rwrxr��sz.ParserElement._FifoCache.__init__.<locals>.getcs"|�|<t���kr�jd�dS)NF)r��popitem)r�r�r�)r��sizerwrxr��sz.ParserElement._FifoCache.__init__.<locals>.setcs�j�dS)N)r�)r�)r�rwrxr��sz0ParserElement._FifoCache.__init__.<locals>.clear)r�r��_OrderedDictr�r�r�r�r�)r�r�r�r�r�rw)r�r�r�rxr��sz!ParserElement._FifoCache.__init__N)r�r�r�r�rwrwrwrx�
_FifoCache�sr�c@seZdZdd�ZdS)zParserElement._FifoCachecsvt�|_�i�tjg�����fdd�}���fdd�}��fdd�}tj||�|_tj||�|_tj||�|_dS)Ncs�j|��S)N)r�)r�r�)r�r�rwrxr��sz.ParserElement._FifoCache.__init__.<locals>.getcs2|�|<t���kr$�j�j�d��j|�dS)N)r�r��popleftr�)r�r�r�)r��key_fifor�rwrxr��sz.ParserElement._FifoCache.__init__.<locals>.setcs�j��j�dS)N)r�)r�)r�r�rwrxr��sz0ParserElement._FifoCache.__init__.<locals>.clear)	r�r��collections�dequer�r�r�r�r�)r�r�r�r�r�rw)r�r�r�r�rxr��sz!ParserElement._FifoCache.__init__N)r�r�r�r�rwrwrwrxr��srcCs�d\}}|||||f}tj��tj}|j|�}	|	|jkr�tj|d7<y|j||||�}	Wn8tk
r�}
z|j||
j	|
j
���WYdd}
~
Xq�X|j||	d|	dj�f�|	Sn4tj|d7<t|	t
�r�|	�|	d|	dj�fSWdQRXdS)Nrrr)rrr)r$�packrat_cache_lock�
packrat_cacher�r��packrat_cache_statsr�rr�rHr�r�rzrK)r�r-r�rorpZHITZMISS�lookupr�r�r�rwrwrx�_parseCache�s$


zParserElement._parseCachecCs(tjj�dgttj�tjdd�<dS)Nr)r$r�r�r�r�rwrwrwrx�
resetCache�s
zParserElement.resetCache�cCs8tjs4dt_|dkr tj�t_ntj|�t_tjt_dS)a�Enables "packrat" parsing, which adds memoizing to the parsing logic.
           Repeated parse attempts at the same string location (which happens
           often in many complex grammars) can immediately return a cached value,
           instead of re-executing parsing/validating code.  Memoizing is done of
           both valid results and parsing exceptions.
           
           Parameters:
            - cache_size_limit - (default=C{128}) - if an integer value is provided
              will limit the size of the packrat cache; if None is passed, then
              the cache size will be unbounded; if 0 is passed, the cache will
              be effectively disabled.
            
           This speedup may break existing programs that use parse actions that
           have side-effects.  For this reason, packrat parsing is disabled when
           you first import pyparsing.  To activate the packrat feature, your
           program must call the class method C{ParserElement.enablePackrat()}.  If
           your program uses C{psyco} to "compile as you go", you must call
           C{enablePackrat} before calling C{psyco.full()}.  If you do not do this,
           Python will crash.  For best results, call C{enablePackrat()} immediately
           after importing pyparsing.
           
           Example::
               import pyparsing
               pyparsing.ParserElement.enablePackrat()
        TN)r$�_packratEnabledr�r�r�r�rt)Zcache_size_limitrwrwrx�
enablePackratszParserElement.enablePackratcCs�tj�|js|j�x|jD]}|j�qW|js<|j�}y<|j|d�\}}|rv|j||�}t	�t
�}|j||�Wn0tk
r�}ztjr��n|�WYdd}~XnX|SdS)aB
        Execute the parse expression with the given string.
        This is the main interface to the client code, once the complete
        expression has been built.

        If you want the grammar to require that the entire input string be
        successfully parsed, then set C{parseAll} to True (equivalent to ending
        the grammar with C{L{StringEnd()}}).

        Note: C{parseString} implicitly calls C{expandtabs()} on the input string,
        in order to report proper column numbers in parse actions.
        If the input string contains tabs and
        the grammar uses parse actions that use the C{loc} argument to index into the
        string being parsed, you can ensure you have a consistent view of the input
        string by:
         - calling C{parseWithTabs} on your grammar before calling C{parseString}
           (see L{I{parseWithTabs}<parseWithTabs>})
         - define your parse action using the full C{(s,loc,toks)} signature, and
           reference the input string using the parse action's C{s} argument
         - explictly expand the tabs in your input string before calling
           C{parseString}
        
        Example::
            Word('a').parseString('aaaaabaaa')  # -> ['aaaaa']
            Word('a').parseString('aaaaabaaa', parseAll=True)  # -> Exception: Expected end of text
        rN)
r$r�r_�
streamliner]r\�
expandtabsrtr�r
r)r�verbose_stacktrace)r�r-�parseAllr�r�r�Zser3rwrwrx�parseString#s$zParserElement.parseStringccs@|js|j�x|jD]}|j�qW|js8t|�j�}t|�}d}|j}|j}t	j
�d}	y�x�||kon|	|k�ry |||�}
|||
dd�\}}Wntk
r�|
d}Yq`X||kr�|	d7}	||
|fV|r�|||�}
|
|kr�|}q�|d7}n|}q`|
d}q`WWn4tk
�r:}zt	j
�r&�n|�WYdd}~XnXdS)a�
        Scan the input string for expression matches.  Each match will return the
        matching tokens, start location, and end location.  May be called with optional
        C{maxMatches} argument, to clip scanning after 'n' matches are found.  If
        C{overlap} is specified, then overlapping matches will be reported.

        Note that the start and end locations are reported relative to the string
        being parsed.  See L{I{parseString}<parseString>} for more information on parsing
        strings with embedded tabs.

        Example::
            source = "sldjf123lsdjjkf345sldkjf879lkjsfd987"
            print(source)
            for tokens,start,end in Word(alphas).scanString(source):
                print(' '*start + '^'*(end-start))
                print(' '*start + tokens[0])
        
        prints::
        
            sldjf123lsdjjkf345sldkjf879lkjsfd987
            ^^^^^
            sldjf
                    ^^^^^^^
                    lsdjjkf
                              ^^^^^^
                              sldkjf
                                       ^^^^^^
                                       lkjsfd
        rF)rprrN)r_r�r]r\r�r�r�r�rtr$r�rrr�)r�r-�
maxMatchesZoverlapr�r�r�Z
preparseFnZparseFn�matchesr�ZnextLocr�Znextlocr3rwrwrx�
scanStringUsB


zParserElement.scanStringcCs�g}d}d|_y�xh|j|�D]Z\}}}|j|||��|rrt|t�rT||j�7}nt|t�rh||7}n
|j|�|}qW|j||d��dd�|D�}djtt	t
|���Stk
r�}ztj
rȂn|�WYdd}~XnXdS)af
        Extension to C{L{scanString}}, to modify matching text with modified tokens that may
        be returned from a parse action.  To use C{transformString}, define a grammar and
        attach a parse action to it that modifies the returned token list.
        Invoking C{transformString()} on a target string will then scan for matches,
        and replace the matched text patterns according to the logic in the parse
        action.  C{transformString()} returns the resulting transformed string.
        
        Example::
            wd = Word(alphas)
            wd.setParseAction(lambda toks: toks[0].title())
            
            print(wd.transformString("now is the winter of our discontent made glorious summer by this sun of york."))
        Prints::
            Now Is The Winter Of Our Discontent Made Glorious Summer By This Sun Of York.
        rTNcSsg|]}|r|�qSrwrw)r��orwrwrxr��sz1ParserElement.transformString.<locals>.<listcomp>r�)r\r�r�rzr"r�r�r�rvr��_flattenrr$r�)r�r-rZlastErvr�r�r3rwrwrxr��s(



zParserElement.transformStringcCsPytdd�|j||�D��Stk
rJ}ztjr6�n|�WYdd}~XnXdS)a~
        Another extension to C{L{scanString}}, simplifying the access to the tokens found
        to match the given parse expression.  May be called with optional
        C{maxMatches} argument, to clip searching after 'n' matches are found.
        
        Example::
            # a capitalized word starts with an uppercase letter, followed by zero or more lowercase letters
            cap_word = Word(alphas.upper(), alphas.lower())
            
            print(cap_word.searchString("More than Iron, more than Lead, more than Gold I need Electricity"))
        prints::
            ['More', 'Iron', 'Lead', 'Gold', 'I']
        cSsg|]\}}}|�qSrwrw)r�rvr�r�rwrwrxr��sz.ParserElement.searchString.<locals>.<listcomp>N)r"r�rr$r�)r�r-r�r3rwrwrx�searchString�szParserElement.searchStringc	csXd}d}x<|j||d�D]*\}}}|||�V|r>|dV|}qW||d�VdS)a[
        Generator method to split a string using the given expression as a separator.
        May be called with optional C{maxsplit} argument, to limit the number of splits;
        and the optional C{includeSeparators} argument (default=C{False}), if the separating
        matching text should be included in the split results.
        
        Example::        
            punc = oneOf(list(".,;:/-!?"))
            print(list(punc.split("This, this?, this sentence, is badly punctuated!")))
        prints::
            ['This', ' this', '', ' this sentence', ' is badly punctuated', '']
        r)r�N)r�)	r�r-�maxsplitZincludeSeparatorsZsplitsZlastrvr�r�rwrwrxr��s

zParserElement.splitcCsFt|t�rtj|�}t|t�s:tjdt|�tdd�dSt||g�S)a�
        Implementation of + operator - returns C{L{And}}. Adding strings to a ParserElement
        converts them to L{Literal}s by default.
        
        Example::
            greet = Word(alphas) + "," + Word(alphas) + "!"
            hello = "Hello, World!"
            print (hello, "->", greet.parseString(hello))
        Prints::
            Hello, World! -> ['Hello', ',', 'World', '!']
        z4Cannot combine element of type %s with ParserElementrq)�
stacklevelN)	rzr�r$rQ�warnings�warnr��
SyntaxWarningr)r�r�rwrwrxr�s



zParserElement.__add__cCsBt|t�rtj|�}t|t�s:tjdt|�tdd�dS||S)z]
        Implementation of + operator when left operand is not a C{L{ParserElement}}
        z4Cannot combine element of type %s with ParserElementrq)r�N)rzr�r$rQr�r�r�r�)r�r�rwrwrxrs



zParserElement.__radd__cCsLt|t�rtj|�}t|t�s:tjdt|�tdd�dSt|tj	�|g�S)zQ
        Implementation of - operator, returns C{L{And}} with error stop
        z4Cannot combine element of type %s with ParserElementrq)r�N)
rzr�r$rQr�r�r�r�r�
_ErrorStop)r�r�rwrwrx�__sub__s



zParserElement.__sub__cCsBt|t�rtj|�}t|t�s:tjdt|�tdd�dS||S)z]
        Implementation of - operator when left operand is not a C{L{ParserElement}}
        z4Cannot combine element of type %s with ParserElementrq)r�N)rzr�r$rQr�r�r�r�)r�r�rwrwrx�__rsub__ s



zParserElement.__rsub__cs�t|t�r|d}}n�t|t�r�|ddd�}|ddkrHd|df}t|dt�r�|ddkr�|ddkrvt��S|ddkr�t��S�|dt��SnJt|dt�r�t|dt�r�|\}}||8}ntdt|d�t|d���ntdt|���|dk�rtd��|dk�rtd��||k�o2dkn�rBtd	��|�r���fd
d��|�r�|dk�rt��|�}nt�g|��|�}n�|�}n|dk�r��}nt�g|�}|S)
a�
        Implementation of * operator, allows use of C{expr * 3} in place of
        C{expr + expr + expr}.  Expressions may also me multiplied by a 2-integer
        tuple, similar to C{{min,max}} multipliers in regular expressions.  Tuples
        may also include C{None} as in:
         - C{expr*(n,None)} or C{expr*(n,)} is equivalent
              to C{expr*n + L{ZeroOrMore}(expr)}
              (read as "at least n instances of C{expr}")
         - C{expr*(None,n)} is equivalent to C{expr*(0,n)}
              (read as "0 to n instances of C{expr}")
         - C{expr*(None,None)} is equivalent to C{L{ZeroOrMore}(expr)}
         - C{expr*(1,None)} is equivalent to C{L{OneOrMore}(expr)}

        Note that C{expr*(None,n)} does not raise an exception if
        more than n exprs exist in the input stream; that is,
        C{expr*(None,n)} does not enforce a maximum number of expr
        occurrences.  If this behavior is desired, then write
        C{expr*(None,n) + ~expr}
        rNrqrrz7cannot multiply 'ParserElement' and ('%s','%s') objectsz0cannot multiply 'ParserElement' and '%s' objectsz/cannot multiply ParserElement by negative valuez@second tuple value must be greater or equal to first tuple valuez+cannot multiply ParserElement by 0 or (0,0)cs(|dkrt��|d��St��SdS)Nrr)r)�n)�makeOptionalListr�rwrxr�]sz/ParserElement.__mul__.<locals>.makeOptionalList)NN)	rzru�tupler2rr�r��
ValueErrorr)r�r�ZminElementsZoptElementsr�rw)r�r�rx�__mul__,sD







zParserElement.__mul__cCs
|j|�S)N)r�)r�r�rwrwrx�__rmul__pszParserElement.__rmul__cCsFt|t�rtj|�}t|t�s:tjdt|�tdd�dSt||g�S)zI
        Implementation of | operator - returns C{L{MatchFirst}}
        z4Cannot combine element of type %s with ParserElementrq)r�N)	rzr�r$rQr�r�r�r�r)r�r�rwrwrx�__or__ss



zParserElement.__or__cCsBt|t�rtj|�}t|t�s:tjdt|�tdd�dS||BS)z]
        Implementation of | operator when left operand is not a C{L{ParserElement}}
        z4Cannot combine element of type %s with ParserElementrq)r�N)rzr�r$rQr�r�r�r�)r�r�rwrwrx�__ror__s



zParserElement.__ror__cCsFt|t�rtj|�}t|t�s:tjdt|�tdd�dSt||g�S)zA
        Implementation of ^ operator - returns C{L{Or}}
        z4Cannot combine element of type %s with ParserElementrq)r�N)	rzr�r$rQr�r�r�r�r)r�r�rwrwrx�__xor__�s



zParserElement.__xor__cCsBt|t�rtj|�}t|t�s:tjdt|�tdd�dS||AS)z]
        Implementation of ^ operator when left operand is not a C{L{ParserElement}}
        z4Cannot combine element of type %s with ParserElementrq)r�N)rzr�r$rQr�r�r�r�)r�r�rwrwrx�__rxor__�s



zParserElement.__rxor__cCsFt|t�rtj|�}t|t�s:tjdt|�tdd�dSt||g�S)zC
        Implementation of & operator - returns C{L{Each}}
        z4Cannot combine element of type %s with ParserElementrq)r�N)	rzr�r$rQr�r�r�r�r)r�r�rwrwrx�__and__�s



zParserElement.__and__cCsBt|t�rtj|�}t|t�s:tjdt|�tdd�dS||@S)z]
        Implementation of & operator when left operand is not a C{L{ParserElement}}
        z4Cannot combine element of type %s with ParserElementrq)r�N)rzr�r$rQr�r�r�r�)r�r�rwrwrx�__rand__�s



zParserElement.__rand__cCst|�S)zE
        Implementation of ~ operator - returns C{L{NotAny}}
        )r)r�rwrwrx�
__invert__�szParserElement.__invert__cCs|dk	r|j|�S|j�SdS)a

        Shortcut for C{L{setResultsName}}, with C{listAllMatches=False}.
        
        If C{name} is given with a trailing C{'*'} character, then C{listAllMatches} will be
        passed as C{True}.
           
        If C{name} is omitted, same as calling C{L{copy}}.

        Example::
            # these are equivalent
            userdata = Word(alphas).setResultsName("name") + Word(nums+"-").setResultsName("socsecno")
            userdata = Word(alphas)("name") + Word(nums+"-")("socsecno")             
        N)rmr�)r�r�rwrwrx�__call__�s
zParserElement.__call__cCst|�S)z�
        Suppresses the output of this C{ParserElement}; useful to keep punctuation from
        cluttering up returned output.
        )r+)r�rwrwrx�suppress�szParserElement.suppresscCs
d|_|S)a
        Disables the skipping of whitespace before matching the characters in the
        C{ParserElement}'s defined pattern.  This is normally only used internally by
        the pyparsing module, but may be needed in some whitespace-sensitive grammars.
        F)rX)r�rwrwrx�leaveWhitespace�szParserElement.leaveWhitespacecCsd|_||_d|_|S)z8
        Overrides the default whitespace chars
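
        Example::
            # this element no longer treats newlines as ignorable whitespace
            w = Word(alphas).setWhitespaceChars(" \t")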
        TF)rXrYrZ)r�rOrwrwrx�setWhitespaceChars�sz ParserElement.setWhitespaceCharscCs
d|_|S)z�
        Overrides default behavior to expand C{<TAB>}s to spaces before parsing the input string.
        Must be called before C{parseString} when the input grammar contains elements that
        match C{<TAB>} characters.
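
        Example (one way to exercise this; C{White("\t")} only sees real tab
        characters when tab expansion is disabled)::

            tsv_pair = Word(printables) + White("\t").suppress() + Word(printables)
            tsv_pair.parseWithTabs().parseString("key\tvalue")  # -> ['key', 'value']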
        T)r\)r�rwrwrx�
parseWithTabs�szParserElement.parseWithTabscCsLt|t�rt|�}t|t�r4||jkrH|jj|�n|jjt|j���|S)a�
        Define expression to be ignored (e.g., comments) while doing pattern
        matching; may be called repeatedly, to define multiple comment or other
        ignorable patterns.
        
        Example::
            patt = OneOrMore(Word(alphas))
            patt.parseString('ablaj /* comment */ lskjd') # -> ['ablaj']
            
            patt.ignore(cStyleComment)
            patt.parseString('ablaj /* comment */ lskjd') # -> ['ablaj', 'lskjd']
        )rzr�r+r]r�r�)r�r�rwrwrx�ignore�s


zParserElement.ignorecCs"|pt|pt|ptf|_d|_|S)zT
        Enable display of debugging messages while doing pattern matching.
        T)r/r2r4rcr^)r�ZstartActionZ
successActionZexceptionActionrwrwrx�setDebugActions
s
zParserElement.setDebugActionscCs|r|jttt�nd|_|S)a�
        Enable display of debugging messages while doing pattern matching.
        Set C{flag} to True to enable, False to disable.

        Example::
            wd = Word(alphas).setName("alphaword")
            integer = Word(nums).setName("numword")
            term = wd | integer
            
            # turn on debugging for wd
            wd.setDebug()

            OneOrMore(term).parseString("abc 123 xyz 890")
        
        prints::
            Match alphaword at loc 0(1,1)
            Matched alphaword -> ['abc']
            Match alphaword at loc 3(1,4)
            Exception raised:Expected alphaword (at char 4), (line:1, col:5)
            Match alphaword at loc 7(1,8)
            Matched alphaword -> ['xyz']
            Match alphaword at loc 11(1,12)
            Exception raised:Expected alphaword (at char 12), (line:1, col:13)
            Match alphaword at loc 15(1,16)
            Exception raised:Expected alphaword (at char 15), (line:1, col:16)

        The output shown is that produced by the default debug actions - custom debug actions can be
        specified using L{setDebugActions}. Prior to attempting
        to match the C{wd} expression, the debugging message C{"Match <exprname> at loc <n>(<line>,<col>)"}
        is shown. Then if the parse succeeds, a C{"Matched"} message is shown, or an C{"Exception raised"}
        message is shown. Also note the use of L{setName} to assign a human-readable name to the expression,
        which makes debugging and exception messages easier to understand - for instance, the default
        name created for the C{Word} expression without calling C{setName} is C{"W:(ABCD...)"}.
        F)r�r/r2r4r^)r��flagrwrwrx�setDebugs#zParserElement.setDebugcCs|jS)N)r�)r�rwrwrxr�@szParserElement.__str__cCst|�S)N)r�)r�rwrwrxr�CszParserElement.__repr__cCsd|_d|_|S)NT)r_rU)r�rwrwrxr�FszParserElement.streamlinecCsdS)Nrw)r�r�rwrwrx�checkRecursionKszParserElement.checkRecursioncCs|jg�dS)zj
        Check defined expressions for valid structure, check for infinite recursive definitions.
        N)r�)r��
validateTracerwrwrx�validateNszParserElement.validatecCs�y|j�}Wn2tk
r>t|d��}|j�}WdQRXYnXy|j||�Stk
r|}ztjrh�n|�WYdd}~XnXdS)z�
        Execute the parse expression on the given file or filename.
        If a filename is specified (instead of a file object),
        the entire file is opened, read, and closed before parsing.
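
        Example::
            # assuming "data.txt" contains text matching the grammar in expr
            result = expr.parseFile("data.txt", parseAll=True)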
        �rN)�readr��openr�rr$r�)r�Zfile_or_filenamer�Z
file_contents�fr3rwrwrx�	parseFileTszParserElement.parseFilecsHt|t�r"||kp t|�t|�kSt|t�r6|j|�Stt|�|kSdS)N)rzr$�varsr�r��super)r�r�)rHrwrx�__eq__hs



zParserElement.__eq__cCs
||kS)Nrw)r�r�rwrwrx�__ne__pszParserElement.__ne__cCstt|��S)N)�hash�id)r�rwrwrx�__hash__sszParserElement.__hash__cCs||kS)Nrw)r�r�rwrwrx�__req__vszParserElement.__req__cCs
||kS)Nrw)r�r�rwrwrx�__rne__yszParserElement.__rne__cCs0y|jt|�|d�dStk
r*dSXdS)a�
        Method for quick testing of a parser against a test string. Good for simple 
        inline microtests of sub expressions while building up larger parser.
           
        Parameters:
         - testString - to test against this expression for a match
         - parseAll - (default=C{True}) - flag to pass to C{L{parseString}} when running tests
            
        Example::
            expr = Word(nums)
            assert expr.matches("100")
        )r�TFN)r�r�r)r�Z
testStringr�rwrwrxr�|s

zParserElement.matches�#cCs�t|t�r"tttj|j�j���}t|t�r4t|�}g}g}d}	�x�|D�]�}
|dk	rb|j	|
d�sl|rx|
rx|j
|
�qH|
s~qHdj|�|
g}g}y:|
jdd�}
|j
|
|d�}|j
|j|d��|	o�|}	Wn�tk
�rx}
z�t|
t�r�dnd	}d|
k�r0|j
t|
j|
��|j
d
t|
j|
�dd|�n|j
d
|
jd|�|j
d
t|
��|	�ob|}	|
}WYdd}
~
XnDtk
�r�}z&|j
dt|��|	�o�|}	|}WYdd}~XnX|�r�|�r�|j
d	�tdj|��|j
|
|f�qHW|	|fS)a3
        Execute the parse expression on a series of test strings, showing each
        test, the parsed results or where the parse failed. Quick and easy way to
        run a parse expression against a list of sample strings.
           
        Parameters:
         - tests - a list of separate test strings, or a multiline string of test strings
         - parseAll - (default=C{True}) - flag to pass to C{L{parseString}} when running tests           
         - comment - (default=C{'#'}) - expression for indicating embedded comments in the test 
              string; pass None to disable comment filtering
         - fullDump - (default=C{True}) - dump results as list followed by results names in nested outline;
              if False, only dump nested list
         - printResults - (default=C{True}) prints test output to stdout
         - failureTests - (default=C{False}) indicates if these tests are expected to fail parsing

        Returns: a (success, results) tuple, where success indicates that all tests succeeded
        (or failed if C{failureTests} is True), and the results contain a list of lines of each 
        test's output
        
        Example::
            number_expr = pyparsing_common.number.copy()

            result = number_expr.runTests('''
                # unsigned integer
                100
                # negative integer
                -100
                # float with scientific notation
                6.02e23
                # integer with scientific notation
                1e-12
                ''')
            print("Success" if result[0] else "Failed!")

            result = number_expr.runTests('''
                # stray character
                100Z
                # missing leading digit before '.'
                -.100
                # too many '.'
                3.14.159
                ''', failureTests=True)
            print("Success" if result[0] else "Failed!")
        prints::
            # unsigned integer
            100
            [100]

            # negative integer
            -100
            [-100]

            # float with scientific notation
            6.02e23
            [6.02e+23]

            # integer with scientific notation
            1e-12
            [1e-12]

            Success
            
            # stray character
            100Z
               ^
            FAIL: Expected end of text (at char 3), (line:1, col:4)

            # missing leading digit before '.'
            -.100
            ^
            FAIL: Expected {real number with scientific notation | real number | signed integer} (at char 0), (line:1, col:1)

            # too many '.'
            3.14.159
                ^
            FAIL: Expected end of text (at char 4), (line:1, col:5)

            Success

        Each test string must be on a single line. If you want to test a string that spans multiple
        lines, create a test like this::

            expr.runTest(r"this is a test\n of strings that spans \n 3 lines")
        
        (Note that this is a raw string literal, you must include the leading 'r'.)
        TNFrz\n)r�)r z(FATAL)r�� rr�^zFAIL: zFAIL-EXCEPTION: )rzr�r�rvr{r��rstrip�
splitlinesrr�r�r�r�r�rrr!rGr�r9rKr,)r�Ztestsr�ZcommentZfullDumpZprintResultsZfailureTestsZ
allResultsZcomments�successrvr�resultr�rzr3rwrwrx�runTests�sNW



$


zParserElement.runTests)F)F)T)T)TT)TT)r�)F)N)T)F)T)Tr�TTF)Or�r�r�r�rNr��staticmethodrPrRr�r�rirmrur�rxr~rr�r�r�r�r�r�r�r�r�r�r�r�rr�r�r�rtr�r�r�r��_MAX_INTr�r�r�r�rrr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r��
__classcell__rwrw)rHrxr$8s�


&




H
"
2G+D
			

)

cs eZdZdZ�fdd�Z�ZS)r,zT
    Abstract C{ParserElement} subclass, for defining atomic matching patterns.
    cstt|�jdd�dS)NF)rg)r�r,r�)r�)rHrwrxr�	szToken.__init__)r�r�r�r�r�r�rwrw)rHrxr,	scs eZdZdZ�fdd�Z�ZS)r
z,
    An empty token, will always match.
    cs$tt|�j�d|_d|_d|_dS)Nr
TF)r�r
r�r�r[r`)r�)rHrwrxr�	szEmpty.__init__)r�r�r�r�r�r�rwrw)rHrxr
	scs*eZdZdZ�fdd�Zddd�Z�ZS)rz(
    A token that will never match.
    cs*tt|�j�d|_d|_d|_d|_dS)NrTFzUnmatchable token)r�rr�r�r[r`ra)r�)rHrwrxr�*	s
zNoMatch.__init__TcCst|||j|��dS)N)rra)r�r-r�rorwrwrxr�1	szNoMatch.parseImpl)T)r�r�r�r�r�r�r�rwrw)rHrxr&	scs*eZdZdZ�fdd�Zddd�Z�ZS)ra�
    Token to exactly match a specified string.
    
    Example::
        Literal('blah').parseString('blah')  # -> ['blah']
        Literal('blah').parseString('blahfooblah')  # -> ['blah']
        Literal('blah').parseString('bla')  # -> Exception: Expected "blah"
    
    For case-insensitive matching, use L{CaselessLiteral}.
    
    For keyword matching (force word break before and after the matched string),
    use L{Keyword} or L{CaselessKeyword}.
    cs�tt|�j�||_t|�|_y|d|_Wn*tk
rVtj	dt
dd�t|_YnXdt
|j�|_d|j|_d|_d|_dS)Nrz2null string passed to Literal; use Empty() insteadrq)r�z"%s"z	Expected F)r�rr��matchr��matchLen�firstMatchCharr�r�r�r�r
rHr�r�rar[r`)r��matchString)rHrwrxr�C	s

zLiteral.__init__TcCsJ|||jkr6|jdks&|j|j|�r6||j|jfSt|||j|��dS)Nrr)r�r��
startswithr�rra)r�r-r�rorwrwrxr�V	szLiteral.parseImpl)T)r�r�r�r�r�r�r�rwrw)rHrxr5	s
csLeZdZdZedZd�fdd�	Zddd	�Z�fd
d�Ze	dd
��Z
�ZS)ra\
    Token to exactly match a specified string as a keyword, that is, it must be
    immediately followed by a non-keyword character.  Compare with C{L{Literal}}:
     - C{Literal("if")} will match the leading C{'if'} in C{'ifAndOnlyIf'}.
     - C{Keyword("if")} will not; it will only match the leading C{'if'} in C{'if x=1'}, or C{'if(y==2)'}
    Accepts two optional constructor arguments in addition to the keyword string:
     - C{identChars} is a string of characters that would be valid identifier characters,
          defaulting to all alphanumerics + "_" and "$"
     - C{caseless} allows case-insensitive matching, default is C{False}.
       
    Example::
        Keyword("start").parseString("start")  # -> ['start']
        Keyword("start").parseString("starting")  # -> Exception

    For case-insensitive matching, use L{CaselessKeyword}.
    z_$NFcs�tt|�j�|dkrtj}||_t|�|_y|d|_Wn$tk
r^t	j
dtdd�YnXd|j|_d|j|_
d|_d|_||_|r�|j�|_|j�}t|�|_dS)Nrz2null string passed to Keyword; use Empty() insteadrq)r�z"%s"z	Expected F)r�rr��DEFAULT_KEYWORD_CHARSr�r�r�r�r�r�r�r�r�rar[r`�caseless�upper�
caselessmatchr��
identChars)r�r�r�r�)rHrwrxr�q	s&

zKeyword.__init__TcCs|jr|||||j�j�|jkr�|t|�|jksL|||jj�|jkr�|dksj||dj�|jkr�||j|jfSnv|||jkr�|jdks�|j|j|�r�|t|�|jks�|||j|jkr�|dks�||d|jkr�||j|jfSt	|||j
|��dS)Nrrr)r�r�r�r�r�r�r�r�r�rra)r�r-r�rorwrwrxr��	s*&zKeyword.parseImplcstt|�j�}tj|_|S)N)r�rr�r�r�)r�r�)rHrwrxr��	szKeyword.copycCs
|t_dS)z,Overrides the default Keyword chars
        N)rr�)rOrwrwrx�setDefaultKeywordChars�	szKeyword.setDefaultKeywordChars)NF)T)r�r�r�r�r3r�r�r�r�r�r�r�rwrw)rHrxr^	s
cs*eZdZdZ�fdd�Zddd�Z�ZS)ral
    Token to match a specified string, ignoring case of letters.
    Note: the matched results will always be in the case of the given
    match string, NOT the case of the input text.

    Example::
        OneOrMore(CaselessLiteral("CMD")).parseString("cmd CMD Cmd10") # -> ['CMD', 'CMD', 'CMD']
        
    (Contrast with example for L{CaselessKeyword}.)
    cs6tt|�j|j��||_d|j|_d|j|_dS)Nz'%s'z	Expected )r�rr�r��returnStringr�ra)r�r�)rHrwrxr��	szCaselessLiteral.__init__TcCs@||||j�j�|jkr,||j|jfSt|||j|��dS)N)r�r�r�r�rra)r�r-r�rorwrwrxr��	szCaselessLiteral.parseImpl)T)r�r�r�r�r�r�r�rwrw)rHrxr�	s
cs,eZdZdZd�fdd�	Zd	dd�Z�ZS)
rz�
    Caseless version of L{Keyword}.

    Example::
        OneOrMore(CaselessKeyword("CMD")).parseString("cmd CMD Cmd10") # -> ['CMD', 'CMD']
        
    (Contrast with example for L{CaselessLiteral}.)
    Ncstt|�j||dd�dS)NT)r�)r�rr�)r�r�r�)rHrwrxr��	szCaselessKeyword.__init__TcCsj||||j�j�|jkrV|t|�|jksF|||jj�|jkrV||j|jfSt|||j|��dS)N)r�r�r�r�r�r�rra)r�r-r�rorwrwrxr��	s*zCaselessKeyword.parseImpl)N)T)r�r�r�r�r�r�r�rwrw)rHrxr�	scs,eZdZdZd�fdd�	Zd	dd�Z�ZS)
rlax
    A variation on L{Literal} which matches "close" matches, that is, 
    strings with at most 'n' mismatching characters. C{CloseMatch} takes parameters:
     - C{match_string} - string to be matched
     - C{maxMismatches} - (C{default=1}) maximum number of mismatches allowed to count as a match
    
    The results from a successful parse will contain the matched text from the input string and the following named results:
     - C{mismatches} - a list of the positions within the match_string where mismatches were found
     - C{original} - the original match_string used to compare against the input string
    
    If C{mismatches} is an empty list, then the match was an exact match.
    
    Example::
        patt = CloseMatch("ATCATCGAATGGA")
        patt.parseString("ATCATCGAAXGGA") # -> (['ATCATCGAAXGGA'], {'mismatches': [[9]], 'original': ['ATCATCGAATGGA']})
        patt.parseString("ATCAXCGAAXGGA") # -> Exception: Expected 'ATCATCGAATGGA' (with up to 1 mismatches) (at char 0), (line:1, col:1)

        # exact match
        patt.parseString("ATCATCGAATGGA") # -> (['ATCATCGAATGGA'], {'mismatches': [[]], 'original': ['ATCATCGAATGGA']})

        # close match allowing up to 2 mismatches
        patt = CloseMatch("ATCATCGAATGGA", maxMismatches=2)
        patt.parseString("ATCAXCGAAXGGA") # -> (['ATCAXCGAAXGGA'], {'mismatches': [[4, 9]], 'original': ['ATCATCGAATGGA']})
    rrcsBtt|�j�||_||_||_d|j|jf|_d|_d|_dS)Nz&Expected %r (with up to %d mismatches)F)	r�rlr�r��match_string�
maxMismatchesrar`r[)r�r�r�)rHrwrxr��	szCloseMatch.__init__TcCs�|}t|�}|t|j�}||kr�|j}d}g}	|j}
x�tt|||�|j��D]0\}}|\}}
||
krP|	j|�t|	�|
krPPqPW|d}t|||�g�}|j|d<|	|d<||fSt|||j|��dS)Nrrr�original�
mismatches)	r�r�r�r�r�r�r"rra)r�r-r�ro�startr��maxlocr�Zmatch_stringlocr�r�Zs_m�src�mat�resultsrwrwrxr��	s("

zCloseMatch.parseImpl)rr)T)r�r�r�r�r�r�r�rwrw)rHrxrl�	s	cs8eZdZdZd
�fdd�	Zdd	d
�Z�fdd�Z�ZS)r/a	
    Token for matching words composed of allowed character sets.
    Defined with string containing all allowed initial characters,
    an optional string containing allowed body characters (if omitted,
    defaults to the initial character set), and an optional minimum,
    maximum, and/or exact length.  The default value for C{min} is 1 (a
    minimum value < 1 is not valid); the default values for C{max} and C{exact}
    are 0, meaning no maximum or exact length restriction. An optional
    C{excludeChars} parameter can list characters that might be found in 
    the input C{bodyChars} string; useful to define a word of all printables
    except for one or two characters, for instance.
    
    L{srange} is useful for defining custom character set strings for defining 
    C{Word} expressions, using range notation from regular expression character sets.
    
    A common mistake is to use C{Word} to match a specific literal string, as in 
    C{Word("Address")}. Remember that C{Word} uses the string argument to define
    I{sets} of matchable characters. This expression would match "Add", "AAA",
    "dAred", or any other word made up of the characters 'A', 'd', 'r', 'e', and 's'.
    To match an exact literal string, use L{Literal} or L{Keyword}.

    pyparsing includes helper strings for building Words:
     - L{alphas}
     - L{nums}
     - L{alphanums}
     - L{hexnums}
     - L{alphas8bit} (alphabetic characters in ASCII range 128-255 - accented, tilded, umlauted, etc.)
     - L{punc8bit} (non-alphabetic characters in ASCII range 128-255 - currency, symbols, superscripts, diacriticals, etc.)
     - L{printables} (any non-whitespace character)

    Example::
        # a word composed of digits
        integer = Word(nums) # equivalent to Word("0123456789") or Word(srange("0-9"))
        
        # a word with a leading capital, and zero or more lowercase
        capital_word = Word(alphas.upper(), alphas.lower())

        # hostnames are alphanumeric, with leading alpha, and '-'
        hostname = Word(alphas, alphanums+'-')
        
        # roman numeral (not a strict parser, accepts invalid mix of characters)
        roman = Word("IVXLCDM")
        
        # any string of non-whitespace characters, except for ','
        csv_value = Word(printables, excludeChars=",")
    NrrrFcs�tt|�j��rFdj�fdd�|D��}|rFdj�fdd�|D��}||_t|�|_|rl||_t|�|_n||_t|�|_|dk|_	|dkr�t
d��||_|dkr�||_nt
|_|dkr�||_||_t|�|_d|j|_d	|_||_d
|j|jk�r�|dk�r�|dk�r�|dk�r�|j|jk�r8dt|j�|_nHt|j�dk�rfdtj|j�t|j�f|_nd
t|j�t|j�f|_|j�r�d|jd|_ytj|j�|_Wntk
�r�d|_YnXdS)Nr�c3s|]}|�kr|VqdS)Nrw)r�r�)�excludeCharsrwrxr�7
sz Word.__init__.<locals>.<genexpr>c3s|]}|�kr|VqdS)Nrw)r�r�)r�rwrxr�9
srrrzZcannot specify a minimum length < 1; use Optional(Word()) if zero-length word is permittedz	Expected Fr�z[%s]+z%s[%s]*z	[%s][%s]*z\b)r�r/r�r��
initCharsOrigr��	initChars�
bodyCharsOrig�	bodyChars�maxSpecifiedr��minLen�maxLenr�r�r�rar`�	asKeyword�_escapeRegexRangeChars�reStringr�rd�escape�compilerK)r�rr�min�max�exactrr�)rH)r�rxr�4
sT



0
z
Word.__init__Tc
CsD|jr<|jj||�}|s(t|||j|��|j�}||j�fS|||jkrZt|||j|��|}|d7}t|�}|j}||j	}t
||�}x ||kr�|||kr�|d7}q�Wd}	|||jkr�d}	|jr�||kr�|||kr�d}	|j
�r|dk�r||d|k�s||k�r|||k�rd}	|	�r4t|||j|��||||�fS)NrrFTr)rdr�rra�end�grouprr�rrrrrr)
r�r-r�ror�r�r�Z	bodycharsr�ZthrowExceptionrwrwrxr�j
s6

4zWord.parseImplcstytt|�j�Stk
r"YnX|jdkrndd�}|j|jkr^d||j�||j�f|_nd||j�|_|jS)NcSs$t|�dkr|dd�dS|SdS)N�z...)r�)r�rwrwrx�
charsAsStr�
sz Word.__str__.<locals>.charsAsStrz	W:(%s,%s)zW:(%s))r�r/r�rKrUr�r)r�r)rHrwrxr��
s
zWord.__str__)NrrrrFN)T)r�r�r�r�r�r�r�r�rwrw)rHrxr/
s.6
#csFeZdZdZeejd��Zd�fdd�	Zddd�Z	�fd	d
�Z
�ZS)
r'a�
    Token for matching strings that match a given regular expression.
    Defined with string specifying the regular expression in a form recognized by the inbuilt Python re module.
    If the given regex contains named groups (defined using C{(?P<name>...)}), these will be preserved as 
    named parse results.

    Example::
        realnum = Regex(r"[+-]?\d+\.\d*")
        date = Regex(r'(?P<year>\d{4})-(?P<month>\d\d?)-(?P<day>\d\d?)')
        # ref: http://stackoverflow.com/questions/267399/how-do-you-match-only-valid-roman-numerals-with-a-regular-expression
        roman = Regex(r"M{0,4}(CM|CD|D?C{0,3})(XC|XL|L?X{0,3})(IX|IV|V?I{0,3})")
    z[A-Z]rcs�tt|�j�t|t�r�|s,tjdtdd�||_||_	yt
j|j|j	�|_
|j|_Wq�t
jk
r�tjd|tdd��Yq�Xn2t|tj�r�||_
t|�|_|_||_	ntd��t|�|_d|j|_d|_d|_d	S)
z�The parameters C{pattern} and C{flags} are passed to the C{re.compile()} function as-is. See the Python C{re} module for an explanation of the acceptable patterns and flags.z0null string passed to Regex; use Empty() insteadrq)r�z$invalid pattern (%s) passed to RegexzCRegex may only be constructed with a string or a compiled RE objectz	Expected FTN)r�r'r�rzr�r�r�r��pattern�flagsrdr
r�
sre_constants�error�compiledREtyper{r�r�r�rar`r[)r�rr)rHrwrxr��
s.





zRegex.__init__TcCsd|jj||�}|s"t|||j|��|j�}|j�}t|j��}|r\x|D]}||||<qHW||fS)N)rdr�rrar�	groupdictr"r)r�r-r�ror��dr�r�rwrwrxr��
s
zRegex.parseImplcsDytt|�j�Stk
r"YnX|jdkr>dt|j�|_|jS)NzRe:(%s))r�r'r�rKrUr�r)r�)rHrwrxr��
s
z
Regex.__str__)r)T)r�r�r�r�r�rdr
rr�r�r�r�rwrw)rHrxr'�
s
"

cs8eZdZdZd�fdd�	Zddd�Z�fd	d
�Z�ZS)
r%a�
    Token for matching strings that are delimited by quoting characters.
    
    Defined with the following parameters:
        - quoteChar - string of one or more characters defining the quote delimiting string
        - escChar - character to escape quotes, typically backslash (default=C{None})
        - escQuote - special quote sequence to escape an embedded quote string (such as SQL's "" to escape an embedded ") (default=C{None})
        - multiline - boolean indicating whether quotes can span multiple lines (default=C{False})
        - unquoteResults - boolean indicating whether the matched text should be unquoted (default=C{True})
        - endQuoteChar - string of one or more characters defining the end of the quote delimited string (default=C{None} => same as quoteChar)
        - convertWhitespaceEscapes - convert escaped whitespace (C{'\t'}, C{'\n'}, etc.) to actual whitespace (default=C{True})

    Example::
        qs = QuotedString('"')
        print(qs.searchString('lsjdf "This is the quote" sldjf'))
        complex_qs = QuotedString('{{', endQuoteChar='}}')
        print(complex_qs.searchString('lsjdf {{This is the "quote"}} sldjf'))
        sql_qs = QuotedString('"', escQuote='""')
        print(sql_qs.searchString('lsjdf "This is the quote with ""embedded"" quotes" sldjf'))
    prints::
        [['This is the quote']]
        [['This is the "quote"']]
        [['This is the quote with "embedded" quotes']]
    NFTcsNtt��j�|j�}|s0tjdtdd�t��|dkr>|}n"|j�}|s`tjdtdd�t��|�_t	|��_
|d�_|�_t	|��_
|�_|�_|�_|�_|r�tjtjB�_dtj�j�t�jd�|dk	r�t|�p�df�_n<d�_dtj�j�t�jd�|dk	�rt|��pdf�_t	�j�d	k�rp�jd
dj�fdd
�tt	�j�d	dd�D��d7_|�r��jdtj|�7_|�r��jdtj|�7_tj�j�d�_�jdtj�j�7_ytj�j�j��_�j�_Wn0tjk
�r&tjd�jtdd��YnXt ���_!d�j!�_"d�_#d�_$dS)Nz$quoteChar cannot be the empty stringrq)r�z'endQuoteChar cannot be the empty stringrz%s(?:[^%s%s]r�z%s(?:[^%s\n\r%s]rrz|(?:z)|(?:c3s4|],}dtj�jd|��t�j|�fVqdS)z%s[^%s]N)rdr	�endQuoteCharr)r�r�)r�rwrxr�/sz(QuotedString.__init__.<locals>.<genexpr>�)z|(?:%s)z|(?:%s.)z(.)z)*%sz$invalid pattern (%s) passed to Regexz	Expected FTrs)%r�r%r�r�r�r�r��SyntaxError�	quoteCharr��quoteCharLen�firstQuoteCharr�endQuoteCharLen�escChar�escQuote�unquoteResults�convertWhitespaceEscapesrd�	MULTILINE�DOTALLrr	rrr�r��escCharReplacePatternr
rrrr�r�rar`r[)r�rr r!Z	multiliner"rr#)rH)r�rxr�sf




6

zQuotedString.__init__c	Cs�|||jkr|jj||�pd}|s4t|||j|��|j�}|j�}|jr�||j|j	�}t
|t�r�d|kr�|jr�ddddd�}x |j
�D]\}}|j||�}q�W|jr�tj|jd|�}|jr�|j|j|j�}||fS)N�\�	r��
)z\tz\nz\fz\rz\g<1>)rrdr�rrarrr"rrrzr�r#r�r�r r�r&r!r)	r�r-r�ror�r�Zws_mapZwslitZwscharrwrwrxr�Gs( 
zQuotedString.parseImplcsFytt|�j�Stk
r"YnX|jdkr@d|j|jf|_|jS)Nz.quoted string, starting with %s ending with %s)r�r%r�rKrUrr)r�)rHrwrxr�js
zQuotedString.__str__)NNFTNT)T)r�r�r�r�r�r�r�r�rwrw)rHrxr%�
sA
#cs8eZdZdZd�fdd�	Zddd�Z�fd	d
�Z�ZS)
r	a�
    Token for matching words composed of characters I{not} in a given set (will
    include whitespace in matched characters if not listed in the provided exclusion set - see example).
    Defined with string containing all disallowed characters, and an optional
    minimum, maximum, and/or exact length.  The default value for C{min} is 1 (a
    minimum value < 1 is not valid); the default values for C{max} and C{exact}
    are 0, meaning no maximum or exact length restriction.

    Example::
        # define a comma-separated-value as anything that is not a ','
        csv_value = CharsNotIn(',')
        print(delimitedList(csv_value).parseString("dkls,lsdkjf,s12 34,@!#,213"))
    prints::
        ['dkls', 'lsdkjf', 's12 34', '@!#', '213']
    rrrcs�tt|�j�d|_||_|dkr*td��||_|dkr@||_nt|_|dkrZ||_||_t	|�|_
d|j
|_|jdk|_d|_
dS)NFrrzfcannot specify a minimum length < 1; use Optional(CharsNotIn()) if zero-length char group is permittedrz	Expected )r�r	r�rX�notCharsr�rrr�r�r�rar[r`)r�r+rrr
)rHrwrxr��s 
zCharsNotIn.__init__TcCs�|||jkrt|||j|��|}|d7}|j}t||jt|��}x ||krd|||krd|d7}qFW|||jkr�t|||j|��||||�fS)Nrr)r+rrarrr�r)r�r-r�ror�Znotchars�maxlenrwrwrxr��s
zCharsNotIn.parseImplcsdytt|�j�Stk
r"YnX|jdkr^t|j�dkrRd|jdd�|_nd|j|_|jS)Nrz
!W:(%s...)z!W:(%s))r�r	r�rKrUr�r+)r�)rHrwrxr��s
zCharsNotIn.__str__)rrrr)T)r�r�r�r�r�r�r�r�rwrw)rHrxr	vs
cs<eZdZdZdddddd�Zd�fdd�	Zddd�Z�ZS)r.a�
    Special matching class for matching whitespace.  Normally, whitespace is ignored
    by pyparsing grammars.  This class is included when some whitespace structures
    are significant.  Define with a string containing the whitespace characters to be
    matched; default is C{" \t\r\n"}.  Also takes optional C{min}, C{max}, and C{exact} arguments,
    as defined for the C{L{Word}} class.
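
    Example (a minimal sketch in which a literal tab is significant)::
        tab_sep = Word(alphas) + White("\t").suppress() + Word(alphas)
        tab_sep.parseString("abc\tdef")   # -> ['abc', 'def']
        tab_sep.parseString("abc def")    # -> ParseException, since the space is not a tab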
    z<SPC>z<TAB>z<LF>z<CR>z<FF>)r�r(rr*r)� 	
rrrcs�tt��j�|�_�jdj�fdd��jD���djdd��jD���_d�_d�j�_	|�_
|dkrt|�_nt�_|dkr�|�_|�_
dS)Nr�c3s|]}|�jkr|VqdS)N)�
matchWhite)r�r�)r�rwrxr��sz!White.__init__.<locals>.<genexpr>css|]}tj|VqdS)N)r.�	whiteStrs)r�r�rwrwrxr��sTz	Expected r)
r�r.r�r.r�r�rYr�r[rarrr�)r�Zwsrrr
)rH)r�rxr��s zWhite.__init__TcCs�|||jkrt|||j|��|}|d7}||j}t|t|��}x"||krd|||jkrd|d7}qDW|||jkr�t|||j|��||||�fS)Nrr)r.rrarrr�r)r�r-r�ror�r�rwrwrxr��s
zWhite.parseImpl)r-rrrr)T)r�r�r�r�r/r�r�r�rwrw)rHrxr.�scseZdZ�fdd�Z�ZS)�_PositionTokencs(tt|�j�|jj|_d|_d|_dS)NTF)r�r0r�rHr�r�r[r`)r�)rHrwrxr��s
z_PositionToken.__init__)r�r�r�r�r�rwrw)rHrxr0�sr0cs2eZdZdZ�fdd�Zdd�Zd	dd�Z�ZS)
rzb
    Token to advance to a specific column of input text; useful for tabular report scraping.
    cstt|�j�||_dS)N)r�rr�r9)r��colno)rHrwrxr��szGoToColumn.__init__cCs`t||�|jkr\t|�}|jr*|j||�}x0||krZ||j�rZt||�|jkrZ|d7}q,W|S)Nrr)r9r�r]r��isspace)r�r-r�r�rwrwrxr��s&zGoToColumn.preParseTcCsDt||�}||jkr"t||d|��||j|}|||�}||fS)NzText not in expected column)r9r)r�r-r�roZthiscolZnewlocr�rwrwrxr�s

zGoToColumn.parseImpl)T)r�r�r�r�r�r�r�r�rwrw)rHrxr�s	cs*eZdZdZ�fdd�Zddd�Z�ZS)ra�
    Matches if current position is at the beginning of a line within the parse string
    
    Example::
    
        test = '''        AAA this line
        AAA and this line
          AAA but not this one
        B AAA and definitely not this one
        '''

        for t in (LineStart() + 'AAA' + restOfLine).searchString(test):
            print(t)
    
    Prints::
        ['AAA', ' this line']
        ['AAA', ' and this line']    

    cstt|�j�d|_dS)NzExpected start of line)r�rr�ra)r�)rHrwrxr�&szLineStart.__init__TcCs*t||�dkr|gfSt|||j|��dS)Nrr)r9rra)r�r-r�rorwrwrxr�*szLineStart.parseImpl)T)r�r�r�r�r�r�r�rwrw)rHrxrscs*eZdZdZ�fdd�Zddd�Z�ZS)rzU
    Matches if current position is at the end of a line within the parse string
    cs,tt|�j�|jtjjdd��d|_dS)Nrr�zExpected end of line)r�rr�r�r$rNr�ra)r�)rHrwrxr�3szLineEnd.__init__TcCsb|t|�kr6||dkr$|ddfSt|||j|��n(|t|�krN|dgfSt|||j|��dS)Nrrr)r�rra)r�r-r�rorwrwrxr�8szLineEnd.parseImpl)T)r�r�r�r�r�r�r�rwrw)rHrxr/scs*eZdZdZ�fdd�Zddd�Z�ZS)r*zM
    Matches if current position is at the beginning of the parse string
    cstt|�j�d|_dS)NzExpected start of text)r�r*r�ra)r�)rHrwrxr�GszStringStart.__init__TcCs0|dkr(||j|d�kr(t|||j|��|gfS)Nr)r�rra)r�r-r�rorwrwrxr�KszStringStart.parseImpl)T)r�r�r�r�r�r�r�rwrw)rHrxr*Cscs*eZdZdZ�fdd�Zddd�Z�ZS)r)zG
    Matches if current position is at the end of the parse string
    cstt|�j�d|_dS)NzExpected end of text)r�r)r�ra)r�)rHrwrxr�VszStringEnd.__init__TcCs^|t|�krt|||j|��n<|t|�kr6|dgfS|t|�krJ|gfSt|||j|��dS)Nrr)r�rra)r�r-r�rorwrwrxr�ZszStringEnd.parseImpl)T)r�r�r�r�r�r�r�rwrw)rHrxr)Rscs.eZdZdZef�fdd�	Zddd�Z�ZS)r1ap
    Matches if the current position is at the beginning of a Word, and
    is not preceded by any character in a given set of C{wordChars}
    (default=C{printables}). To emulate the C{\b} behavior of regular expressions,
    use C{WordStart(alphanums)}. C{WordStart} will also match at the beginning of
    the string being parsed, or at the beginning of a line.
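
    Example (an illustrative sketch)::
        # match 'cat' only where it starts a word, so the 'cat' inside 'concatenate' is skipped
        print((WordStart(alphas) + Literal("cat")).searchString("cat concatenate bobcat"))
        # -> [['cat']]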
    cs"tt|�j�t|�|_d|_dS)NzNot at the start of a word)r�r1r�r��	wordCharsra)r�r3)rHrwrxr�ls
zWordStart.__init__TcCs@|dkr8||d|jks(|||jkr8t|||j|��|gfS)Nrrr)r3rra)r�r-r�rorwrwrxr�qs
zWordStart.parseImpl)T)r�r�r�r�rVr�r�r�rwrw)rHrxr1dscs.eZdZdZef�fdd�	Zddd�Z�ZS)r0aZ
    Matches if the current position is at the end of a Word, and
    is not followed by any character in a given set of C{wordChars}
    (default=C{printables}). To emulate the C{\b} behavior of regular expressions,
    use C{WordEnd(alphanums)}. C{WordEnd} will also match at the end of
    the string being parsed, or at the end of a line.
    cs(tt|�j�t|�|_d|_d|_dS)NFzNot at the end of a word)r�r0r�r�r3rXra)r�r3)rHrwrxr��s
zWordEnd.__init__TcCsPt|�}|dkrH||krH|||jks8||d|jkrHt|||j|��|gfS)Nrrr)r�r3rra)r�r-r�ror�rwrwrxr��szWordEnd.parseImpl)T)r�r�r�r�rVr�r�r�rwrw)rHrxr0xscs�eZdZdZd�fdd�	Zdd�Zdd�Zd	d
�Z�fdd�Z�fd
d�Z	�fdd�Z
d�fdd�	Zgfdd�Z�fdd�Z
�ZS)r z^
    Abstract subclass of ParserElement, for combining and post-processing parsed tokens.
    Fcs�tt|�j|�t|t�r"t|�}t|t�r<tj|�g|_	njt|t
j�rzt|�}tdd�|D��rnt
tj|�}t|�|_	n,yt|�|_	Wntk
r�|g|_	YnXd|_dS)Ncss|]}t|t�VqdS)N)rzr�)r�r.rwrwrxr��sz+ParseExpression.__init__.<locals>.<genexpr>F)r�r r�rzr�r�r�r$rQ�exprsr��Iterable�allrvr�re)r�r4rg)rHrwrxr��s

zParseExpression.__init__cCs
|j|S)N)r4)r�r�rwrwrxr��szParseExpression.__getitem__cCs|jj|�d|_|S)N)r4r�rU)r�r�rwrwrxr��szParseExpression.appendcCs4d|_dd�|jD�|_x|jD]}|j�q W|S)z~Extends C{leaveWhitespace} defined in base class, and also invokes C{leaveWhitespace} on
           all contained expressions.FcSsg|]}|j��qSrw)r�)r�r�rwrwrxr��sz3ParseExpression.leaveWhitespace.<locals>.<listcomp>)rXr4r�)r�r�rwrwrxr��s
zParseExpression.leaveWhitespacecszt|t�rF||jkrvtt|�j|�xP|jD]}|j|jd�q,Wn0tt|�j|�x|jD]}|j|jd�q^W|S)Nrrrsrs)rzr+r]r�r r�r4)r�r�r�)rHrwrxr��s

zParseExpression.ignorecsLytt|�j�Stk
r"YnX|jdkrFd|jjt|j�f|_|jS)Nz%s:(%s))	r�r r�rKrUrHr�r�r4)r�)rHrwrxr��s
zParseExpression.__str__cs0tt|�j�x|jD]}|j�qWt|j�dk�r|jd}t||j�r�|jr�|jdkr�|j	r�|jdd�|jdg|_d|_
|j|jO_|j|jO_|jd}t||j�o�|jo�|jdko�|j	�r|jdd�|jdd�|_d|_
|j|jO_|j|jO_dt
|�|_|S)Nrqrrrz	Expected rsrs)r�r r�r4r�rzrHrSrVr^rUr[r`r�ra)r�r�r�)rHrwrxr��s0




zParseExpression.streamlinecstt|�j||�}|S)N)r�r rm)r�r�rlr�)rHrwrxrm�szParseExpression.setResultsNamecCs:|dd�|g}x|jD]}|j|�qW|jg�dS)N)r4r�r�)r�r��tmpr�rwrwrxr��szParseExpression.validatecs$tt|�j�}dd�|jD�|_|S)NcSsg|]}|j��qSrw)r�)r�r�rwrwrxr��sz(ParseExpression.copy.<locals>.<listcomp>)r�r r�r4)r�r�)rHrwrxr��szParseExpression.copy)F)F)r�r�r�r�r�r�r�r�r�r�r�rmr�r�r�rwrw)rHrxr �s	
"csTeZdZdZGdd�de�Zd�fdd�	Zddd�Zd	d
�Zdd�Z	d
d�Z
�ZS)ra

    Requires all given C{ParseExpression}s to be found in the given order.
    Expressions may be separated by whitespace.
    May be constructed using the C{'+'} operator.
    May also be constructed using the C{'-'} operator, which will suppress backtracking.

    Example::
        integer = Word(nums)
        name_expr = OneOrMore(Word(alphas))

        expr = And([integer("id"),name_expr("name"),integer("age")])
        # more easily written as:
        expr = integer("id") + name_expr("name") + integer("age")
    cseZdZ�fdd�Z�ZS)zAnd._ErrorStopcs&ttj|�j||�d|_|j�dS)N�-)r�rr�r�r�r�)r�r�r�)rHrwrxr�
szAnd._ErrorStop.__init__)r�r�r�r�r�rwrw)rHrxr�
sr�TcsRtt|�j||�tdd�|jD��|_|j|jdj�|jdj|_d|_	dS)Ncss|]}|jVqdS)N)r[)r�r�rwrwrxr�
szAnd.__init__.<locals>.<genexpr>rT)
r�rr�r6r4r[r�rYrXre)r�r4rg)rHrwrxr�
s
zAnd.__init__c	Cs|jdj|||dd�\}}d}x�|jdd�D]�}t|tj�rFd}q0|r�y|j|||�\}}Wq�tk
rv�Yq�tk
r�}zd|_tj|��WYdd}~Xq�t	k
r�t|t
|�|j|��Yq�Xn|j|||�\}}|s�|j�r0||7}q0W||fS)NrF)rprrT)
r4rtrzrr�r#r�
__traceback__r�r�r�rar�)	r�r-r�ro�
resultlistZ	errorStopr�Z
exprtokensr�rwrwrxr�
s(z
And.parseImplcCst|t�rtj|�}|j|�S)N)rzr�r$rQr�)r�r�rwrwrxr5
s

zAnd.__iadd__cCs8|dd�|g}x |jD]}|j|�|jsPqWdS)N)r4r�r[)r�r��subRecCheckListr�rwrwrxr�:
s

zAnd.checkRecursioncCs@t|d�r|jS|jdkr:ddjdd�|jD��d|_|jS)Nr��{r�css|]}t|�VqdS)N)r�)r�r�rwrwrxr�F
szAnd.__str__.<locals>.<genexpr>�})r�r�rUr�r4)r�rwrwrxr�A
s


 zAnd.__str__)T)T)r�r�r�r�r
r�r�r�rr�r�r�rwrw)rHrxr�s
csDeZdZdZd�fdd�	Zddd�Zdd	�Zd
d�Zdd
�Z�Z	S)ra�
    Requires that at least one C{ParseExpression} is found.
    If two expressions match, the expression that matches the longest string will be used.
    May be constructed using the C{'^'} operator.

    Example::
        # construct Or using '^' operator
        
        number = Word(nums) ^ Combine(Word(nums) + '.' + Word(nums))
        print(number.searchString("123 3.1416 789"))
    prints::
        [['123'], ['3.1416'], ['789']]
    Fcs:tt|�j||�|jr0tdd�|jD��|_nd|_dS)Ncss|]}|jVqdS)N)r[)r�r�rwrwrxr�\
szOr.__init__.<locals>.<genexpr>T)r�rr�r4rr[)r�r4rg)rHrwrxr�Y
szOr.__init__TcCsTd}d}g}x�|jD]�}y|j||�}Wnvtk
rd}	z d|	_|	j|krT|	}|	j}WYdd}	~	Xqtk
r�t|�|kr�t|t|�|j|�}t|�}YqX|j||f�qW|�r*|j	dd�d�x`|D]X\}
}y|j
|||�Stk
�r$}	z"d|	_|	j|k�r|	}|	j}WYdd}	~	Xq�Xq�W|dk	�rB|j|_|�nt||d|��dS)NrrcSs
|dS)Nrrw)�xrwrwrxryu
szOr.parseImpl.<locals>.<lambda>)r�z no defined alternatives to matchrs)r4r�rr9r�r�r�rar��sortrtr�)r�r-r�ro�	maxExcLoc�maxExceptionr�r�Zloc2r��_rwrwrxr�`
s<

zOr.parseImplcCst|t�rtj|�}|j|�S)N)rzr�r$rQr�)r�r�rwrwrx�__ixor__�
s

zOr.__ixor__cCs@t|d�r|jS|jdkr:ddjdd�|jD��d|_|jS)Nr�r<z ^ css|]}t|�VqdS)N)r�)r�r�rwrwrxr��
szOr.__str__.<locals>.<genexpr>r=)r�r�rUr�r4)r�rwrwrxr��
s


 z
Or.__str__cCs0|dd�|g}x|jD]}|j|�qWdS)N)r4r�)r�r�r;r�rwrwrxr��
szOr.checkRecursion)F)T)
r�r�r�r�r�r�rCr�r�r�rwrw)rHrxrK
s

&	csDeZdZdZd�fdd�	Zddd�Zdd	�Zd
d�Zdd
�Z�Z	S)ra�
    Requires that at least one C{ParseExpression} is found.
    If two expressions match, the first one listed is the one that will match.
    May be constructed using the C{'|'} operator.

    Example::
        # construct MatchFirst using '|' operator
        
        # watch the order of expressions to match
        number = Word(nums) | Combine(Word(nums) + '.' + Word(nums))
        print(number.searchString("123 3.1416 789")) #  Fail! -> [['123'], ['3'], ['1416'], ['789']]

        # put more selective expression first
        number = Combine(Word(nums) + '.' + Word(nums)) | Word(nums)
        print(number.searchString("123 3.1416 789")) #  Better -> [['123'], ['3.1416'], ['789']]
    Fcs:tt|�j||�|jr0tdd�|jD��|_nd|_dS)Ncss|]}|jVqdS)N)r[)r�r�rwrwrxr��
sz&MatchFirst.__init__.<locals>.<genexpr>T)r�rr�r4rr[)r�r4rg)rHrwrxr��
szMatchFirst.__init__Tc	Cs�d}d}x�|jD]�}y|j|||�}|Stk
r\}z|j|krL|}|j}WYdd}~Xqtk
r�t|�|kr�t|t|�|j|�}t|�}YqXqW|dk	r�|j|_|�nt||d|��dS)Nrrz no defined alternatives to matchrs)r4rtrr�r�r�rar�)	r�r-r�ror@rAr�r�r�rwrwrxr��
s$
zMatchFirst.parseImplcCst|t�rtj|�}|j|�S)N)rzr�r$rQr�)r�r�rwrwrx�__ior__�
s

zMatchFirst.__ior__cCs@t|d�r|jS|jdkr:ddjdd�|jD��d|_|jS)Nr�r<z | css|]}t|�VqdS)N)r�)r�r�rwrwrxr��
sz%MatchFirst.__str__.<locals>.<genexpr>r=)r�r�rUr�r4)r�rwrwrxr��
s


 zMatchFirst.__str__cCs0|dd�|g}x|jD]}|j|�qWdS)N)r4r�)r�r�r;r�rwrwrxr��
szMatchFirst.checkRecursion)F)T)
r�r�r�r�r�r�rDr�r�r�rwrw)rHrxr�
s
	cs<eZdZdZd�fdd�	Zddd�Zdd�Zd	d
�Z�ZS)
ram
    Requires all given C{ParseExpression}s to be found, but in any order.
    Expressions may be separated by whitespace.
    May be constructed using the C{'&'} operator.

    Example::
        color = oneOf("RED ORANGE YELLOW GREEN BLUE PURPLE BLACK WHITE BROWN")
        shape_type = oneOf("SQUARE CIRCLE TRIANGLE STAR HEXAGON OCTAGON")
        integer = Word(nums)
        shape_attr = "shape:" + shape_type("shape")
        posn_attr = "posn:" + Group(integer("x") + ',' + integer("y"))("posn")
        color_attr = "color:" + color("color")
        size_attr = "size:" + integer("size")

        # use Each (using operator '&') to accept attributes in any order 
        # (shape and posn are required, color and size are optional)
        shape_spec = shape_attr & posn_attr & Optional(color_attr) & Optional(size_attr)

        shape_spec.runTests('''
            shape: SQUARE color: BLACK posn: 100, 120
            shape: CIRCLE size: 50 color: BLUE posn: 50,80
            color:GREEN size:20 shape:TRIANGLE posn:20,40
            '''
            )
    prints::
        shape: SQUARE color: BLACK posn: 100, 120
        ['shape:', 'SQUARE', 'color:', 'BLACK', 'posn:', ['100', ',', '120']]
        - color: BLACK
        - posn: ['100', ',', '120']
          - x: 100
          - y: 120
        - shape: SQUARE


        shape: CIRCLE size: 50 color: BLUE posn: 50,80
        ['shape:', 'CIRCLE', 'size:', '50', 'color:', 'BLUE', 'posn:', ['50', ',', '80']]
        - color: BLUE
        - posn: ['50', ',', '80']
          - x: 50
          - y: 80
        - shape: CIRCLE
        - size: 50


        color: GREEN size: 20 shape: TRIANGLE posn: 20,40
        ['color:', 'GREEN', 'size:', '20', 'shape:', 'TRIANGLE', 'posn:', ['20', ',', '40']]
        - color: GREEN
        - posn: ['20', ',', '40']
          - x: 20
          - y: 40
        - shape: TRIANGLE
        - size: 20
    Tcs8tt|�j||�tdd�|jD��|_d|_d|_dS)Ncss|]}|jVqdS)N)r[)r�r�rwrwrxr�sz Each.__init__.<locals>.<genexpr>T)r�rr�r6r4r[rX�initExprGroups)r�r4rg)rHrwrxr�sz
Each.__init__c
s�|jr�tdd�|jD��|_dd�|jD�}dd�|jD�}|||_dd�|jD�|_dd�|jD�|_dd�|jD�|_|j|j7_d	|_|}|jdd�}|jdd��g}d
}	x�|	�rp|�|j|j}
g}x~|
D]v}y|j||�}Wn t	k
�r|j
|�Yq�X|j
|jjt|�|��||k�rD|j
|�q�|�kr�j
|�q�Wt|�t|
�kr�d	}	q�W|�r�djdd�|D��}
t	||d
|
��|�fdd�|jD�7}g}x*|D]"}|j|||�\}}|j
|��q�Wt|tg��}||fS)Ncss&|]}t|t�rt|j�|fVqdS)N)rzrr�r.)r�r�rwrwrxr�sz!Each.parseImpl.<locals>.<genexpr>cSsg|]}t|t�r|j�qSrw)rzrr.)r�r�rwrwrxr�sz"Each.parseImpl.<locals>.<listcomp>cSs"g|]}|jrt|t�r|�qSrw)r[rzr)r�r�rwrwrxr�scSsg|]}t|t�r|j�qSrw)rzr2r.)r�r�rwrwrxr� scSsg|]}t|t�r|j�qSrw)rzrr.)r�r�rwrwrxr�!scSs g|]}t|tttf�s|�qSrw)rzrr2r)r�r�rwrwrxr�"sFTz, css|]}t|�VqdS)N)r�)r�r�rwrwrxr�=sz*Missing one or more required elements (%s)cs$g|]}t|t�r|j�kr|�qSrw)rzrr.)r�r�)�tmpOptrwrxr�As)rEr�r4Zopt1mapZ	optionalsZmultioptionalsZ
multirequiredZrequiredr�rr�r�r��remover�r�rt�sumr")r�r-r�roZopt1Zopt2ZtmpLocZtmpReqdZ
matchOrderZkeepMatchingZtmpExprsZfailedr�Zmissingr:r�ZfinalResultsrw)rFrxr�sP



zEach.parseImplcCs@t|d�r|jS|jdkr:ddjdd�|jD��d|_|jS)Nr�r<z & css|]}t|�VqdS)N)r�)r�r�rwrwrxr�PszEach.__str__.<locals>.<genexpr>r=)r�r�rUr�r4)r�rwrwrxr�Ks


 zEach.__str__cCs0|dd�|g}x|jD]}|j|�qWdS)N)r4r�)r�r�r;r�rwrwrxr�TszEach.checkRecursion)T)T)	r�r�r�r�r�r�r�r�r�rwrw)rHrxr�
s
5
1	csleZdZdZd�fdd�	Zddd�Zdd	�Z�fd
d�Z�fdd
�Zdd�Z	gfdd�Z
�fdd�Z�ZS)rza
    Abstract subclass of C{ParserElement}, for combining and post-processing parsed tokens.
    Fcs�tt|�j|�t|t�r@ttjt�r2tj|�}ntjt	|��}||_
d|_|dk	r�|j|_|j
|_
|j|j�|j|_|j|_|j|_|jj|j�dS)N)r�rr�rzr��
issubclassr$rQr,rr.rUr`r[r�rYrXrWrer]r�)r�r.rg)rHrwrxr�^s
zParseElementEnhance.__init__TcCs2|jdk	r|jj|||dd�Std||j|��dS)NF)rpr�)r.rtrra)r�r-r�rorwrwrxr�ps
zParseElementEnhance.parseImplcCs*d|_|jj�|_|jdk	r&|jj�|S)NF)rXr.r�r�)r�rwrwrxr�vs


z#ParseElementEnhance.leaveWhitespacecsrt|t�rB||jkrntt|�j|�|jdk	rn|jj|jd�n,tt|�j|�|jdk	rn|jj|jd�|S)Nrrrsrs)rzr+r]r�rr�r.)r�r�)rHrwrxr�}s



zParseElementEnhance.ignorecs&tt|�j�|jdk	r"|jj�|S)N)r�rr�r.)r�)rHrwrxr��s

zParseElementEnhance.streamlinecCsB||krt||g��|dd�|g}|jdk	r>|jj|�dS)N)r&r.r�)r�r�r;rwrwrxr��s

z"ParseElementEnhance.checkRecursioncCs6|dd�|g}|jdk	r(|jj|�|jg�dS)N)r.r�r�)r�r�r7rwrwrxr��s
zParseElementEnhance.validatecsVytt|�j�Stk
r"YnX|jdkrP|jdk	rPd|jjt|j�f|_|jS)Nz%s:(%s))	r�rr�rKrUr.rHr�r�)r�)rHrwrxr��szParseElementEnhance.__str__)F)T)
r�r�r�r�r�r�r�r�r�r�r�r�r�rwrw)rHrxrZs
cs*eZdZdZ�fdd�Zddd�Z�ZS)ra�
    Lookahead matching of the given parse expression.  C{FollowedBy}
    does I{not} advance the parsing position within the input string, it only
    verifies that the specified parse expression matches at the current
    position.  C{FollowedBy} always returns a null token list.

    Example::
        # use FollowedBy to match a label only if it is followed by a ':'
        data_word = Word(alphas)
        label = data_word + FollowedBy(':')
        attr_expr = Group(label + Suppress(':') + OneOrMore(data_word, stopOn=label).setParseAction(' '.join))
        
        OneOrMore(attr_expr).parseString("shape: SQUARE color: BLACK posn: upper left").pprint()
    prints::
        [['shape', 'SQUARE'], ['color', 'BLACK'], ['posn', 'upper left']]
    cstt|�j|�d|_dS)NT)r�rr�r[)r�r.)rHrwrxr��szFollowedBy.__init__TcCs|jj||�|gfS)N)r.r�)r�r-r�rorwrwrxr��szFollowedBy.parseImpl)T)r�r�r�r�r�r�r�rwrw)rHrxr�scs2eZdZdZ�fdd�Zd	dd�Zdd�Z�ZS)
ra�
    Lookahead to disallow matching with the given parse expression.  C{NotAny}
    does I{not} advance the parsing position within the input string, it only
    verifies that the specified parse expression does I{not} match at the current
    position.  Also, C{NotAny} does I{not} skip over leading whitespace. C{NotAny}
    always returns a null token list.  May be constructed using the '~' operator.

    Example::
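        # a minimal sketch: use '~' (NotAny) to keep reserved words from matching as identifiers
        AND, OR, NOT = map(CaselessKeyword, "AND OR NOT".split())
        keyword = AND | OR | NOT
        ident = ~keyword + Word(alphas, alphanums + "_")
        ident.parseString("shape")   # -> ['shape']
        ident.parseString("NOT")     # -> ParseException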
        
    cs0tt|�j|�d|_d|_dt|j�|_dS)NFTzFound unwanted token, )r�rr�rXr[r�r.ra)r�r.)rHrwrxr��szNotAny.__init__TcCs&|jj||�rt|||j|��|gfS)N)r.r�rra)r�r-r�rorwrwrxr��szNotAny.parseImplcCs4t|d�r|jS|jdkr.dt|j�d|_|jS)Nr�z~{r=)r�r�rUr�r.)r�rwrwrxr��s


zNotAny.__str__)T)r�r�r�r�r�r�r�r�rwrw)rHrxr�s

cs(eZdZd�fdd�	Zddd�Z�ZS)	�_MultipleMatchNcsFtt|�j|�d|_|}t|t�r.tj|�}|dk	r<|nd|_dS)NT)	r�rJr�rWrzr�r$rQ�	not_ender)r�r.�stopOnZender)rHrwrxr��s

z_MultipleMatch.__init__TcCs�|jj}|j}|jdk	}|r$|jj}|r2|||�||||dd�\}}yZ|j}	xJ|rb|||�|	rr|||�}
n|}
|||
|�\}}|s�|j�rT||7}qTWWnttfk
r�YnX||fS)NF)rp)	r.rtr�rKr�r]r�rr�)r�r-r�roZself_expr_parseZself_skip_ignorablesZcheck_enderZ
try_not_enderr�ZhasIgnoreExprsr�Z	tmptokensrwrwrxr��s,



z_MultipleMatch.parseImpl)N)T)r�r�r�r�r�r�rwrw)rHrxrJ�srJc@seZdZdZdd�ZdS)ra�
    Repetition of one or more of the given expression.
    
    Parameters:
     - expr - expression that must match one or more times
     - stopOn - (default=C{None}) - expression for a terminating sentinel
          (only required if the sentinel would ordinarily match the repetition 
          expression)          

    Example::
        data_word = Word(alphas)
        label = data_word + FollowedBy(':')
        attr_expr = Group(label + Suppress(':') + OneOrMore(data_word).setParseAction(' '.join))

        text = "shape: SQUARE posn: upper left color: BLACK"
        OneOrMore(attr_expr).parseString(text).pprint()  # Fail! read 'color' as data instead of next label -> [['shape', 'SQUARE color']]

        # use stopOn attribute for OneOrMore to avoid reading label string as part of the data
        attr_expr = Group(label + Suppress(':') + OneOrMore(data_word, stopOn=label).setParseAction(' '.join))
        OneOrMore(attr_expr).parseString(text).pprint() # Better -> [['shape', 'SQUARE'], ['posn', 'upper left'], ['color', 'BLACK']]
        
        # could also be written as
        (attr_expr * (1,)).parseString(text).pprint()
    cCs4t|d�r|jS|jdkr.dt|j�d|_|jS)Nr�r<z}...)r�r�rUr�r.)r�rwrwrxr�!s


zOneOrMore.__str__N)r�r�r�r�r�rwrwrwrxrscs8eZdZdZd
�fdd�	Zd�fdd�	Zdd	�Z�ZS)r2aw
    Optional repetition of zero or more of the given expression.
    
    Parameters:
     - expr - expression that must match zero or more times
     - stopOn - (default=C{None}) - expression for a terminating sentinel
          (only required if the sentinel would ordinarily match the repetition 
          expression)          

    Example: similar to L{OneOrMore}
    Ncstt|�j||d�d|_dS)N)rLT)r�r2r�r[)r�r.rL)rHrwrxr�6szZeroOrMore.__init__Tcs6ytt|�j|||�Sttfk
r0|gfSXdS)N)r�r2r�rr�)r�r-r�ro)rHrwrxr�:szZeroOrMore.parseImplcCs4t|d�r|jS|jdkr.dt|j�d|_|jS)Nr�rz]...)r�r�rUr�r.)r�rwrwrxr�@s


zZeroOrMore.__str__)N)T)r�r�r�r�r�r�r�r�rwrw)rHrxr2*sc@s eZdZdd�ZeZdd�ZdS)�
_NullTokencCsdS)NFrw)r�rwrwrxr�Jsz_NullToken.__bool__cCsdS)Nr�rw)r�rwrwrxr�Msz_NullToken.__str__N)r�r�r�r�r'r�rwrwrwrxrMIsrMcs6eZdZdZef�fdd�	Zd	dd�Zdd�Z�ZS)
raa
    Optional matching of the given expression.

    Parameters:
     - expr - expression that may match zero or one time
     - default (optional) - value to be returned if the optional expression is not found.

    Example::
        # US postal code can be a 5-digit zip, plus optional 4-digit qualifier
        zip = Combine(Word(nums, exact=5) + Optional('-' + Word(nums, exact=4)))
        zip.runTests('''
            # traditional ZIP code
            12345
            
            # ZIP+4 form
            12101-0001
            
            # invalid ZIP
            98765-
            ''')
    prints::
        # traditional ZIP code
        12345
        ['12345']

        # ZIP+4 form
        12101-0001
        ['12101-0001']

        # invalid ZIP
        98765-
             ^
        FAIL: Expected end of text (at char 5), (line:1, col:6)
    cs.tt|�j|dd�|jj|_||_d|_dS)NF)rgT)r�rr�r.rWr�r[)r�r.r�)rHrwrxr�ts
zOptional.__init__TcCszy|jj|||dd�\}}WnTttfk
rp|jtk	rh|jjr^t|jg�}|j||jj<ql|jg}ng}YnX||fS)NF)rp)r.rtrr�r��_optionalNotMatchedrVr")r�r-r�ror�rwrwrxr�zs


zOptional.parseImplcCs4t|d�r|jS|jdkr.dt|j�d|_|jS)Nr�rr	)r�r�rUr�r.)r�rwrwrxr��s


zOptional.__str__)T)	r�r�r�r�rNr�r�r�r�rwrw)rHrxrQs"
cs,eZdZdZd	�fdd�	Zd
dd�Z�ZS)r(a�	
    Token for skipping over all undefined text until the matched expression is found.

    Parameters:
     - expr - target expression marking the end of the data to be skipped
     - include - (default=C{False}) if True, the target expression is also parsed 
          (the skipped text and target expression are returned as a 2-element list).
     - ignore - (default=C{None}) used to define grammars (typically quoted strings and 
          comments) that might contain false matches to the target expression
     - failOn - (default=C{None}) define expressions that are not allowed to be 
          included in the skipped text; if found before the target expression is found,
          the SkipTo is not a match

    Example::
        report = '''
            Outstanding Issues Report - 1 Jan 2000

               # | Severity | Description                               |  Days Open
            -----+----------+-------------------------------------------+-----------
             101 | Critical | Intermittent system crash                 |          6
              94 | Cosmetic | Spelling error on Login ('log|n')         |         14
              79 | Minor    | System slow when running too many reports |         47
            '''
        integer = Word(nums)
        SEP = Suppress('|')
        # use SkipTo to simply match everything up until the next SEP
        # - ignore quoted strings, so that a '|' character inside a quoted string does not match
        # - parse action will call token.strip() for each matched token, i.e., the description body
        string_data = SkipTo(SEP, ignore=quotedString)
        string_data.setParseAction(tokenMap(str.strip))
        ticket_expr = (integer("issue_num") + SEP 
                      + string_data("sev") + SEP 
                      + string_data("desc") + SEP 
                      + integer("days_open"))
        
        for tkt in ticket_expr.searchString(report):
            print(tkt.dump())
    prints::
        ['101', 'Critical', 'Intermittent system crash', '6']
        - days_open: 6
        - desc: Intermittent system crash
        - issue_num: 101
        - sev: Critical
        ['94', 'Cosmetic', "Spelling error on Login ('log|n')", '14']
        - days_open: 14
        - desc: Spelling error on Login ('log|n')
        - issue_num: 94
        - sev: Cosmetic
        ['79', 'Minor', 'System slow when running too many reports', '47']
        - days_open: 47
        - desc: System slow when running too many reports
        - issue_num: 79
        - sev: Minor
    FNcs`tt|�j|�||_d|_d|_||_d|_t|t	�rFt
j|�|_n||_dt
|j�|_dS)NTFzNo match found for )r�r(r��
ignoreExprr[r`�includeMatchr�rzr�r$rQ�failOnr�r.ra)r�r��includer�rQ)rHrwrxr��s
zSkipTo.__init__TcCs,|}t|�}|j}|jj}|jdk	r,|jjnd}|jdk	rB|jjnd}	|}
x�|
|kr�|dk	rh|||
�rhP|	dk	r�x*y|	||
�}
Wqrtk
r�PYqrXqrWy|||
ddd�Wn tt	fk
r�|
d7}
YqLXPqLWt|||j
|��|
}|||�}t|�}|j�r$||||dd�\}}
||
7}||fS)NF)rorprr)rp)
r�r.rtrQr�rOr�rrr�rar"rP)r�r-r�ror0r�r.Z
expr_parseZself_failOn_canParseNextZself_ignoreExpr_tryParseZtmplocZskiptextZ
skipresultr�rwrwrxr��s<

zSkipTo.parseImpl)FNN)T)r�r�r�r�r�r�r�rwrw)rHrxr(�s6
csbeZdZdZd�fdd�	Zdd�Zdd�Zd	d
�Zdd�Zgfd
d�Z	dd�Z
�fdd�Z�ZS)raK
    Forward declaration of an expression to be defined later -
    used for recursive grammars, such as algebraic infix notation.
    When the expression is known, it is assigned to the C{Forward} variable using the '<<' operator.

    Note: take care when assigning to C{Forward} not to overlook precedence of operators.
    Specifically, '|' has a lower precedence than '<<', so that::
        fwdExpr << a | b | c
    will actually be evaluated as::
        (fwdExpr << a) | b | c
    thereby leaving b and c out as parseable alternatives.  It is recommended that you
    explicitly group the values inserted into the C{Forward}::
        fwdExpr << (a | b | c)
    Converting to use the '<<=' operator instead will avoid this problem.

    See L{ParseResults.pprint} for an example of a recursive parser created using
    C{Forward}.
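
    Example (a minimal sketch of a recursive grammar)::
        LPAR, RPAR = map(Suppress, "()")
        value = Forward()
        value <<= Word(nums) | Group(LPAR + ZeroOrMore(value) + RPAR)
        print(value.parseString("(1 (2 3) 4)"))  # -> [['1', ['2', '3'], '4']]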
    Ncstt|�j|dd�dS)NF)rg)r�rr�)r�r�)rHrwrxr�szForward.__init__cCsjt|t�rtj|�}||_d|_|jj|_|jj|_|j|jj	�|jj
|_
|jj|_|jj
|jj�|S)N)rzr�r$rQr.rUr`r[r�rYrXrWr]r�)r�r�rwrwrx�
__lshift__s





zForward.__lshift__cCs||>S)Nrw)r�r�rwrwrx�__ilshift__'szForward.__ilshift__cCs
d|_|S)NF)rX)r�rwrwrxr�*szForward.leaveWhitespacecCs$|js d|_|jdk	r |jj�|S)NT)r_r.r�)r�rwrwrxr�.s


zForward.streamlinecCs>||kr0|dd�|g}|jdk	r0|jj|�|jg�dS)N)r.r�r�)r�r�r7rwrwrxr�5s

zForward.validatecCs>t|d�r|jS|jjdSd}Wd|j|_X|jjd|S)Nr�z: ...�Nonez: )r�r�rHr�Z_revertClass�_ForwardNoRecurser.r�)r�Z	retStringrwrwrxr�<s

zForward.__str__cs.|jdk	rtt|�j�St�}||K}|SdS)N)r.r�rr�)r�r�)rHrwrxr�Ms

zForward.copy)N)
r�r�r�r�r�rSrTr�r�r�r�r�r�rwrw)rHrxrs
c@seZdZdd�ZdS)rVcCsdS)Nz...rw)r�rwrwrxr�Vsz_ForwardNoRecurse.__str__N)r�r�r�r�rwrwrwrxrVUsrVcs"eZdZdZd�fdd�	Z�ZS)r-zQ
    Abstract subclass of C{ParseExpression}, for converting parsed results.
    Fcstt|�j|�d|_dS)NF)r�r-r�rW)r�r.rg)rHrwrxr�]szTokenConverter.__init__)F)r�r�r�r�r�r�rwrw)rHrxr-Yscs6eZdZdZd
�fdd�	Z�fdd�Zdd	�Z�ZS)r
a�
    Converter to concatenate all matching tokens to a single string.
    By default, the matching patterns must also be contiguous in the input string;
    this can be disabled by specifying C{'adjacent=False'} in the constructor.

    Example::
        real = Word(nums) + '.' + Word(nums)
        print(real.parseString('3.1416')) # -> ['3', '.', '1416']
        # will also erroneously match the following
        print(real.parseString('3. 1416')) # -> ['3', '.', '1416']

        real = Combine(Word(nums) + '.' + Word(nums))
        print(real.parseString('3.1416')) # -> ['3.1416']
        # no match when there are internal spaces
        print(real.parseString('3. 1416')) # -> Exception: Expected W:(0123...)
    r�Tcs8tt|�j|�|r|j�||_d|_||_d|_dS)NT)r�r
r�r��adjacentrX�
joinStringre)r�r.rXrW)rHrwrxr�rszCombine.__init__cs(|jrtj||�ntt|�j|�|S)N)rWr$r�r�r
)r�r�)rHrwrxr�|szCombine.ignorecCsP|j�}|dd�=|tdj|j|j��g|jd�7}|jrH|j�rH|gS|SdS)Nr�)r�)r�r"r�r
rXrbrVr�)r�r-r�r�ZretToksrwrwrxr��s
"zCombine.postParse)r�T)r�r�r�r�r�r�r�r�rwrw)rHrxr
as
cs(eZdZdZ�fdd�Zdd�Z�ZS)ra�
    Converter to return the matched tokens as a list - useful for returning tokens of C{L{ZeroOrMore}} and C{L{OneOrMore}} expressions.

    Example::
        ident = Word(alphas)
        num = Word(nums)
        term = ident | num
        func = ident + Optional(delimitedList(term))
        print(func.parseString("fn a,b,100"))  # -> ['fn', 'a', 'b', '100']

        func = ident + Group(Optional(delimitedList(term)))
        print(func.parseString("fn a,b,100"))  # -> ['fn', ['a', 'b', '100']]
    cstt|�j|�d|_dS)NT)r�rr�rW)r�r.)rHrwrxr��szGroup.__init__cCs|gS)Nrw)r�r-r�r�rwrwrxr��szGroup.postParse)r�r�r�r�r�r�r�rwrw)rHrxr�s
cs(eZdZdZ�fdd�Zdd�Z�ZS)raW
    Converter to return a repetitive expression as a list, but also as a dictionary.
    Each element can also be referenced using the first token in the expression as its key.
    Useful for tabular report scraping when the first column can be used as an item key.

    Example::
        data_word = Word(alphas)
        label = data_word + FollowedBy(':')
        attr_expr = Group(label + Suppress(':') + OneOrMore(data_word).setParseAction(' '.join))

        text = "shape: SQUARE posn: upper left color: light blue texture: burlap"
        attr_expr = (label + Suppress(':') + OneOrMore(data_word, stopOn=label).setParseAction(' '.join))
        
        # print attributes as plain groups
        print(OneOrMore(attr_expr).parseString(text).dump())
        
        # instead of OneOrMore(expr), parse using Dict(OneOrMore(Group(expr))) - Dict will auto-assign names
        result = Dict(OneOrMore(Group(attr_expr))).parseString(text)
        print(result.dump())
        
        # access named fields as dict entries, or output as dict
        print(result['shape'])        
        print(result.asDict())
    prints::
        ['shape', 'SQUARE', 'posn', 'upper left', 'color', 'light blue', 'texture', 'burlap']

        [['shape', 'SQUARE'], ['posn', 'upper left'], ['color', 'light blue'], ['texture', 'burlap']]
        - color: light blue
        - posn: upper left
        - shape: SQUARE
        - texture: burlap
        SQUARE
        {'color': 'light blue', 'posn': 'upper left', 'texture': 'burlap', 'shape': 'SQUARE'}
    See more examples at L{ParseResults} of accessing fields by results name.
    cstt|�j|�d|_dS)NT)r�rr�rW)r�r.)rHrwrxr��sz
Dict.__init__cCs�x�t|�D]�\}}t|�dkr q
|d}t|t�rBt|d�j�}t|�dkr^td|�||<q
t|�dkr�t|dt�r�t|d|�||<q
|j�}|d=t|�dks�t|t�r�|j	�r�t||�||<q
t|d|�||<q
W|j
r�|gS|SdS)Nrrrr�rq)r�r�rzrur�r�r�r"r�r�rV)r�r-r�r�r��tokZikeyZ	dictvaluerwrwrxr��s$
zDict.postParse)r�r�r�r�r�r�r�rwrw)rHrxr�s#c@s eZdZdZdd�Zdd�ZdS)r+aV
    Converter for ignoring the results of a parsed expression.

    Example::
        source = "a, b, c,d"
        wd = Word(alphas)
        wd_list1 = wd + ZeroOrMore(',' + wd)
        print(wd_list1.parseString(source))

        # often, delimiters that are useful during parsing are just in the
        # way afterward - use Suppress to keep them out of the parsed output
        wd_list2 = wd + ZeroOrMore(Suppress(',') + wd)
        print(wd_list2.parseString(source))
    prints::
        ['a', ',', 'b', ',', 'c', ',', 'd']
        ['a', 'b', 'c', 'd']
    (See also L{delimitedList}.)
    cCsgS)Nrw)r�r-r�r�rwrwrxr��szSuppress.postParsecCs|S)Nrw)r�rwrwrxr��szSuppress.suppressN)r�r�r�r�r�r�rwrwrwrxr+�sc@s(eZdZdZdd�Zdd�Zdd�ZdS)	rzI
    Wrapper for parse actions, to ensure they are only called once.
    cCst|�|_d|_dS)NF)rM�callable�called)r�Z
methodCallrwrwrxr�s
zOnlyOnce.__init__cCs.|js|j|||�}d|_|St||d��dS)NTr�)r[rZr)r�r�r5rvr�rwrwrxr�s
zOnlyOnce.__call__cCs
d|_dS)NF)r[)r�rwrwrx�reset
szOnlyOnce.resetN)r�r�r�r�r�r�r\rwrwrwrxr�scs:t����fdd�}y�j|_Wntk
r4YnX|S)as
    Decorator for debugging parse actions. 
    
    When the parse action is called, this decorator will print C{">> entering I{method-name}(line:I{current_source_line}, I{parse_location}, I{matched_tokens})".}
    When the parse action completes, the decorator will print C{"<<"} followed by the returned value, or any exception that the parse action raised.

    Example::
        wd = Word(alphas)

        @traceParseAction
        def remove_duplicate_chars(tokens):
            return ''.join(sorted(set(''.join(tokens))))

        wds = OneOrMore(wd).setParseAction(remove_duplicate_chars)
        print(wds.parseString("slkdjs sld sldd sdlf sdljf"))
    prints::
        >>entering remove_duplicate_chars(line: 'slkdjs sld sldd sdlf sdljf', 0, (['slkdjs', 'sld', 'sldd', 'sdlf', 'sdljf'], {}))
        <<leaving remove_duplicate_chars (ret: 'dfjkls')
        ['dfjkls']
    cs��j}|dd�\}}}t|�dkr8|djjd|}tjjd|t||�||f�y�|�}Wn8tk
r�}ztjjd||f��WYdd}~XnXtjjd||f�|S)Nror�.z">>entering %s(line: '%s', %d, %r)
z<<leaving %s (exception: %s)
z<<leaving %s (ret: %r)
r9)r�r�rHr~�stderr�writerGrK)ZpaArgsZthisFuncr�r5rvr�r3)r�rwrx�z#sztraceParseAction.<locals>.z)rMr�r�)r�r`rw)r�rxrb
s
�,FcCs`t|�dt|�dt|�d}|rBt|t||��j|�S|tt|�|�j|�SdS)a�
    Helper to define a delimited list of expressions - the delimiter defaults to ','.
    By default, the list elements and delimiters can have intervening whitespace, and
    comments, but this can be overridden by passing C{combine=True} in the constructor.
    If C{combine} is set to C{True}, the matching tokens are returned as a single token
    string, with the delimiters included; otherwise, the matching tokens are returned
    as a list of tokens, with the delimiters suppressed.

    Example::
        delimitedList(Word(alphas)).parseString("aa,bb,cc") # -> ['aa', 'bb', 'cc']
        delimitedList(Word(hexnums), delim=':', combine=True).parseString("AA:BB:CC:DD:EE") # -> ['AA:BB:CC:DD:EE']
    z [r�z]...N)r�r
r2rir+)r.Zdelim�combineZdlNamerwrwrxr@9s
$csjt����fdd�}|dkr0tt�jdd��}n|j�}|jd�|j|dd�|�jd	t��d
�S)a:
    Helper to define a counted list of expressions.
    This helper defines a pattern of the form::
        integer expr expr expr...
    where the leading integer tells how many expr expressions follow.
    The matched tokens returns the array of expr tokens as a list - the leading count token is suppressed.
    
    If C{intExpr} is specified, it should be a pyparsing expression that produces an integer value.

    Example::
        countedArray(Word(alphas)).parseString('2 ab cd ef')  # -> ['ab', 'cd']

        # in this parser, the leading integer value is given in binary,
        # '10' indicating that 2 values are in the array
        binaryConstant = Word('01').setParseAction(lambda t: int(t[0], 2))
        countedArray(Word(alphas), intExpr=binaryConstant).parseString('10 ab cd ef')  # -> ['ab', 'cd']
    cs.|d}�|r tt�g|��p&tt�>gS)Nr)rrrC)r�r5rvr�)�	arrayExprr.rwrx�countFieldParseAction_s"z+countedArray.<locals>.countFieldParseActionNcSst|d�S)Nr)ru)rvrwrwrxrydszcountedArray.<locals>.<lambda>ZarrayLenT)rfz(len) z...)rr/rRr�r�rirxr�)r.ZintExprrdrw)rcr.rxr<Ls
cCs:g}x0|D](}t|t�r(|jt|��q
|j|�q
W|S)N)rzr�r�r�r�)�Lr�r�rwrwrxr�ks

r�cs6t���fdd�}|j|dd��jdt|���S)a*
    Helper to define an expression that is indirectly defined from
    the tokens matched in a previous expression, that is, it looks
    for a 'repeat' of a previous expression.  For example::
        first = Word(nums)
        second = matchPreviousLiteral(first)
        matchExpr = first + ":" + second
    will match C{"1:1"}, but not C{"1:2"}.  Because this matches a
    previous literal, will also match the leading C{"1:1"} in C{"1:10"}.
    If this is not desired, use C{matchPreviousExpr}.
    Do I{not} use with packrat parsing enabled.
    csP|rBt|�dkr�|d>qLt|j��}�tdd�|D��>n
�t�>dS)Nrrrcss|]}t|�VqdS)N)r)r��ttrwrwrxr��szDmatchPreviousLiteral.<locals>.copyTokenToRepeater.<locals>.<genexpr>)r�r�r�rr
)r�r5rvZtflat)�reprwrx�copyTokenToRepeater�sz1matchPreviousLiteral.<locals>.copyTokenToRepeaterT)rfz(prev) )rrxrir�)r.rhrw)rgrxrOts


csFt��|j�}�|K��fdd�}|j|dd��jdt|���S)aS
    Helper to define an expression that is indirectly defined from
    the tokens matched in a previous expression, that is, it looks
    for a 'repeat' of a previous expression.  For example::
        first = Word(nums)
        second = matchPreviousExpr(first)
        matchExpr = first + ":" + second
    will match C{"1:1"}, but not C{"1:2"}.  Because this matches by
    expressions, will I{not} match the leading C{"1:1"} in C{"1:10"};
    the expressions are evaluated first, and then compared, so
    C{"1"} is compared with C{"10"}.
    Do I{not} use with packrat parsing enabled.
    cs*t|j����fdd�}�j|dd�dS)Ncs$t|j��}|�kr tddd��dS)Nr�r)r�r�r)r�r5rvZtheseTokens)�matchTokensrwrx�mustMatchTheseTokens�szLmatchPreviousExpr.<locals>.copyTokenToRepeater.<locals>.mustMatchTheseTokensT)rf)r�r�r�)r�r5rvrj)rg)rirxrh�sz.matchPreviousExpr.<locals>.copyTokenToRepeaterT)rfz(prev) )rr�rxrir�)r.Ze2rhrw)rgrxrN�scCs>xdD]}|j|t|�}qW|jdd�}|jdd�}t|�S)Nz\^-]rz\nr(z\t)r��_bslashr�)r�r�rwrwrxr�s

rTc
s�|rdd�}dd�}t�ndd�}dd�}t�g}t|t�rF|j�}n&t|tj�r\t|�}ntj	dt
dd�|svt�Sd	}x�|t|�d
k�r||}xnt
||d
d��D]N\}}	||	|�r�|||d
=Pq�|||	�r�|||d
=|j||	�|	}Pq�W|d
7}q|W|�r�|�r�yht|�tdj|��k�rZtd
djdd�|D���jdj|��Stdjdd�|D���jdj|��SWn&tk
�r�tj	dt
dd�YnXt�fdd�|D��jdj|��S)a�
    Helper to quickly define a set of alternative Literals, and makes sure to do
    longest-first testing when there is a conflict, regardless of the input order,
    but returns a C{L{MatchFirst}} for best performance.

    Parameters:
     - strs - a string of space-delimited literals, or a collection of string literals
     - caseless - (default=C{False}) - treat all literals as caseless
     - useRegex - (default=C{True}) - as an optimization, will generate a Regex
          object; otherwise, will generate a C{MatchFirst} object (if C{caseless=True}, or
          if creating a C{Regex} raises an exception)

    Example::
        comp_oper = oneOf("< = > <= >= !=")
        var = Word(alphas)
        number = Word(nums)
        term = var | number
        comparison_expr = term + comp_oper + term
        print(comparison_expr.searchString("B = 12  AA=23 B<=AA AA>12"))
    prints::
        [['B', '=', '12'], ['AA', '=', '23'], ['B', '<=', 'AA'], ['AA', '>', '12']]
    cSs|j�|j�kS)N)r�)r�brwrwrxry�szoneOf.<locals>.<lambda>cSs|j�j|j��S)N)r�r�)rrlrwrwrxry�scSs||kS)Nrw)rrlrwrwrxry�scSs
|j|�S)N)r�)rrlrwrwrxry�sz6Invalid argument to oneOf, expected string or iterablerq)r�rrrNr�z[%s]css|]}t|�VqdS)N)r)r��symrwrwrxr��szoneOf.<locals>.<genexpr>z | �|css|]}tj|�VqdS)N)rdr	)r�rmrwrwrxr��sz7Exception creating Regex for oneOf, building MatchFirstc3s|]}�|�VqdS)Nrw)r�rm)�parseElementClassrwrxr��s)rrrzr�r�r�r5r�r�r�r�rr�r�r�r�r'rirKr)
Zstrsr�ZuseRegexZisequalZmasksZsymbolsr�Zcurr�r�rw)rorxrS�sL





((cCsttt||���S)a�
    Helper to easily and clearly define a dictionary by specifying the respective patterns
    for the key and value.  Takes care of defining the C{L{Dict}}, C{L{ZeroOrMore}}, and C{L{Group}} tokens
    in the proper order.  The key pattern can include delimiting markers or punctuation,
    as long as they are suppressed, thereby leaving the significant key text.  The value
    pattern can include named results, so that the C{Dict} results can include named token
    fields.

    Example::
        text = "shape: SQUARE posn: upper left color: light blue texture: burlap"
        attr_expr = (label + Suppress(':') + OneOrMore(data_word, stopOn=label).setParseAction(' '.join))
        print(OneOrMore(attr_expr).parseString(text).dump())
        
        attr_label = label
        attr_value = Suppress(':') + OneOrMore(data_word, stopOn=label).setParseAction(' '.join)

        # similar to Dict, but simpler call format
        result = dictOf(attr_label, attr_value).parseString(text)
        print(result.dump())
        print(result['shape'])
        print(result.shape)  # object attribute access works too
        print(result.asDict())
    prints::
        [['shape', 'SQUARE'], ['posn', 'upper left'], ['color', 'light blue'], ['texture', 'burlap']]
        - color: light blue
        - posn: upper left
        - shape: SQUARE
        - texture: burlap
        SQUARE
        SQUARE
        {'color': 'light blue', 'shape': 'SQUARE', 'posn': 'upper left', 'texture': 'burlap'}
    )rr2r)r�r�rwrwrxrA�s!cCs^t�jdd��}|j�}d|_|d�||d�}|r@dd�}ndd�}|j|�|j|_|S)	a�
    Helper to return the original, untokenized text for a given expression.  Useful to
    restore the parsed fields of an HTML start tag into the raw tag text itself, or to
    revert separate tokens with intervening whitespace back to the original matching
    input text. By default, returns a string containing the original parsed text.
       
    If the optional C{asString} argument is passed as C{False}, then the return value is a 
    C{L{ParseResults}} containing any results names that were originally matched, and a 
    single token containing the original matched text from the input string.  So if 
    the expression passed to C{L{originalTextFor}} contains expressions with defined
    results names, you must set C{asString} to C{False} if you want to preserve those
    results name values.

    Example::
        src = "this is test <b> bold <i>text</i> </b> normal text "
        for tag in ("b","i"):
            opener,closer = makeHTMLTags(tag)
            patt = originalTextFor(opener + SkipTo(closer) + closer)
            print(patt.searchString(src)[0])
    prints::
        ['<b> bold <i>text</i> </b>']
        ['<i>text</i>']
    cSs|S)Nrw)r�r�rvrwrwrxry8sz!originalTextFor.<locals>.<lambda>F�_original_start�
_original_endcSs||j|j�S)N)rprq)r�r5rvrwrwrxry=scSs&||jd�|jd��g|dd�<dS)Nrprq)r�)r�r5rvrwrwrx�extractText?sz$originalTextFor.<locals>.extractText)r
r�r�rer])r.ZasStringZ	locMarkerZendlocMarker�	matchExprrrrwrwrxrg s

cCst|�jdd��S)zp
    Helper to undo pyparsing's default grouping of And expressions, even
    if all but one are non-empty.
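
    Example (a minimal sketch)::
        grouped = Group(Word(alphas) + Word(nums))
        print(grouped.parseString("abc 123"))           # -> [['abc', '123']]
        print(ungroup(grouped).parseString("abc 123"))  # -> ['abc', '123']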
    cSs|dS)Nrrw)rvrwrwrxryJszungroup.<locals>.<lambda>)r-r�)r.rwrwrxrhEscCs4t�jdd��}t|d�|d�|j�j�d��S)a�
    Helper to decorate a returned token with its starting and ending locations in the input string.
    This helper adds the following results names:
     - locn_start = location where matched expression begins
     - locn_end = location where matched expression ends
     - value = the actual parsed results

    Be careful if the input text contains C{<TAB>} characters, you may want to call
    C{L{ParserElement.parseWithTabs}}

    Example::
        wd = Word(alphas)
        for match in locatedExpr(wd).searchString("ljsdf123lksdjjf123lkkjj1222"):
            print(match)
    prints::
        [[0, 'ljsdf', 5]]
        [[8, 'lksdjjf', 15]]
        [[18, 'lkkjj', 23]]
    cSs|S)Nrw)r�r5rvrwrwrxry`szlocatedExpr.<locals>.<lambda>Z
locn_startr�Zlocn_end)r
r�rr�r�)r.ZlocatorrwrwrxrjLsz\[]-*.$+^?()~ )r
cCs|ddS)Nrrrrw)r�r5rvrwrwrxryksryz\\0?[xX][0-9a-fA-F]+cCstt|djd�d��S)Nrz\0x�)�unichrru�lstrip)r�r5rvrwrwrxrylsz	\\0[0-7]+cCstt|ddd�d��S)Nrrr�)ruru)r�r5rvrwrwrxrymsz\])r�r
z\wr8rr�Znegate�bodyr	csBdd��y dj�fdd�tj|�jD��Stk
r<dSXdS)a�
    Helper to easily define string ranges for use in Word construction.  Borrows
    syntax from regexp '[]' string range definitions::
        srange("[0-9]")   -> "0123456789"
        srange("[a-z]")   -> "abcdefghijklmnopqrstuvwxyz"
        srange("[a-z$_]") -> "abcdefghijklmnopqrstuvwxyz$_"
    The input string must be enclosed in []'s, and the returned string is the expanded
    character set joined into a single string.
    The values enclosed in the []'s may be:
     - a single character
     - an escaped character with a leading backslash (such as C{\-} or C{\]})
     - an escaped hex character with a leading C{'\x'} (C{\x21}, which is a C{'!'} character) 
         (C{\0x##} is also supported for backwards compatibility) 
     - an escaped octal character with a leading C{'\0'} (C{\041}, which is a C{'!'} character)
     - a range of any of the above, separated by a dash (C{'a-z'}, etc.)
     - any combination of the above (C{'aeiouy'}, C{'a-zA-Z0-9_$'}, etc.)
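
    Example of feeding the expanded set to C{Word} (an illustrative sketch, not part of
    the original docstring)::

        identifier = Word(srange("[a-zA-Z_]"), srange("[a-zA-Z0-9_]"))
        print(identifier.parseString("parse_me_42"))  # -> ['parse_me_42']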
    cSs<t|t�s|Sdjdd�tt|d�t|d�d�D��S)Nr�css|]}t|�VqdS)N)ru)r�r�rwrwrxr��sz+srange.<locals>.<lambda>.<locals>.<genexpr>rrr)rzr"r�r��ord)�prwrwrxry�szsrange.<locals>.<lambda>r�c3s|]}�|�VqdS)Nrw)r��part)�	_expandedrwrxr��szsrange.<locals>.<genexpr>N)r��_reBracketExprr�rxrK)r�rw)r|rxr_rs
 cs�fdd�}|S)zt
    Helper method for defining parse actions that require matching at a specific
    column in the input text.
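
    Example (an illustrative sketch, not from the original docstring)::

        # only accept an integer if it starts in column 1
        first_col_int = Word(nums).setParseAction(matchOnlyAtCol(1))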
    cs"t||��krt||d���dS)Nzmatched token not at column %d)r9r)r)Zlocnr1)r�rwrx�	verifyCol�sz!matchOnlyAtCol.<locals>.verifyColrw)r�r~rw)r�rxrM�scs�fdd�S)a�
    Helper method for common parse actions that simply return a literal value.  Especially
    useful when used with C{L{transformString<ParserElement.transformString>}()}.

    Example::
        num = Word(nums).setParseAction(lambda toks: int(toks[0]))
        na = oneOf("N/A NA").setParseAction(replaceWith(math.nan))
        term = na | num
        
        OneOrMore(term).parseString("324 234 N/A 234") # -> [324, 234, nan, 234]
    cs�gS)Nrw)r�r5rv)�replStrrwrxry�szreplaceWith.<locals>.<lambda>rw)rrw)rrxr\�scCs|ddd�S)a
    Helper parse action for removing quotation marks from parsed quoted strings.

    Example::
        # by default, quotation marks are included in parsed results
        quotedString.parseString("'Now is the Winter of our Discontent'") # -> ["'Now is the Winter of our Discontent'"]

        # use removeQuotes to strip quotation marks from parsed results
        quotedString.setParseAction(removeQuotes)
        quotedString.parseString("'Now is the Winter of our Discontent'") # -> ["Now is the Winter of our Discontent"]
    rrrrsrw)r�r5rvrwrwrxrZ�scsN��fdd�}yt�dt�d�j�}Wntk
rBt��}YnX||_|S)aG
    Helper to define a parse action by mapping a function to all elements of a ParseResults list. If any additional
    args are passed, they are forwarded to the given function as additional arguments after
    the token, as in C{hex_integer = Word(hexnums).setParseAction(tokenMap(int, 16))}, which will convert the
    parsed data to an integer using base 16.

    Example (compare the last example to the one in L{ParserElement.transformString})::
        hex_ints = OneOrMore(Word(hexnums)).setParseAction(tokenMap(int, 16))
        hex_ints.runTests('''
            00 11 22 aa FF 0a 0d 1a
            ''')
        
        upperword = Word(alphas).setParseAction(tokenMap(str.upper))
        OneOrMore(upperword).runTests('''
            my kingdom for a horse
            ''')

        wd = Word(alphas).setParseAction(tokenMap(str.title))
        OneOrMore(wd).setParseAction(' '.join).runTests('''
            now is the winter of our discontent made glorious summer by this sun of york
            ''')
    prints::
        00 11 22 aa FF 0a 0d 1a
        [0, 17, 34, 170, 255, 10, 13, 26]

        my kingdom for a horse
        ['MY', 'KINGDOM', 'FOR', 'A', 'HORSE']

        now is the winter of our discontent made glorious summer by this sun of york
        ['Now Is The Winter Of Our Discontent Made Glorious Summer By This Sun Of York']
    cs��fdd�|D�S)Ncsg|]}�|f����qSrwrw)r�Ztokn)r�r6rwrxr��sz(tokenMap.<locals>.pa.<locals>.<listcomp>rw)r�r5rv)r�r6rwrxr}�sztokenMap.<locals>.par�rH)rJr�rKr{)r6r�r}rLrw)r�r6rxrm�s cCst|�j�S)N)r�r�)rvrwrwrxry�scCst|�j�S)N)r��lower)rvrwrwrxry�scCs�t|t�r|}t||d�}n|j}tttd�}|r�tj�j	t
�}td�|d�tt
t|td�|���tddgd�jd	�j	d
d��td�}n�d
jdd�tD��}tj�j	t
�t|�B}td�|d�tt
t|j	t�ttd�|����tddgd�jd	�j	dd��td�}ttd�|d�}|jdd
j|jdd�j�j���jd|�}|jdd
j|jdd�j�j���jd|�}||_||_||fS)zRInternal helper to construct opening and closing tag expressions, given a tag name)r�z_-:r�tag�=�/F)r�rCcSs|ddkS)Nrr�rw)r�r5rvrwrwrxry�sz_makeTags.<locals>.<lambda>rr�css|]}|dkr|VqdS)rNrw)r�r�rwrwrxr��sz_makeTags.<locals>.<genexpr>cSs|ddkS)Nrr�rw)r�r5rvrwrwrxry�sz</r��:r�z<%s>rz</%s>)rzr�rr�r/r4r3r>r�r�rZr+rr2rrrmr�rVrYrBr
�_Lr��titler�rir�)�tagStrZxmlZresnameZtagAttrNameZtagAttrValueZopenTagZprintablesLessRAbrackZcloseTagrwrwrx�	_makeTags�s"
T\..r�cCs
t|d�S)a 
    Helper to construct opening and closing tag expressions for HTML, given a tag name. Matches
    tags in either upper or lower case, attributes with namespaces and with quoted or unquoted values.

    Example::
        text = '<td>More info at the <a href="http://pyparsing.wikispaces.com">pyparsing</a> wiki page</td>'
        # makeHTMLTags returns pyparsing expressions for the opening and closing tags as a 2-tuple
        a,a_end = makeHTMLTags("A")
        link_expr = a + SkipTo(a_end)("link_text") + a_end
        
        for link in link_expr.searchString(text):
            # attributes in the <A> tag (like "href" shown here) are also accessible as named results
            print(link.link_text, '->', link.href)
    prints::
        pyparsing -> http://pyparsing.wikispaces.com
    F)r�)r�rwrwrxrK�scCs
t|d�S)z�
    Helper to construct opening and closing tag expressions for XML, given a tag name. Matches
    tags only in the given upper/lower case.

    Example: similar to L{makeHTMLTags}
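
    A minimal sketch along those lines (not part of the original docstring)::

        body,body_end = makeXMLTags("body")
        text = '<body>Some text</body>'
        print((body + SkipTo(body_end)("content") + body_end).parseString(text).content)
    prints::
        Some text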
    T)r�)r�rwrwrxrLscs8|r|dd��n|j��dd��D���fdd�}|S)a<
    Helper to create a validating parse action to be used with start tags created
    with C{L{makeXMLTags}} or C{L{makeHTMLTags}}. Use C{withAttribute} to qualify a starting tag
    with a required attribute value, to avoid false matches on common tags such as
    C{<TD>} or C{<DIV>}.

    Call C{withAttribute} with a series of attribute names and values. Specify the list
    of filter attribute names and values as:
     - keyword arguments, as in C{(align="right")}, or
     - as an explicit dict with C{**} operator, when an attribute name is also a Python
          reserved word, as in C{**{"class":"Customer", "align":"right"}}
     - a list of name-value tuples, as in ( ("ns1:class", "Customer"), ("ns2:align","right") )
    For attribute names with a namespace prefix, you must use the second form.  Attribute
    names are matched case-insensitively.
       
    If just testing for C{class} (with or without a namespace), use C{L{withClass}}.

    To verify that the attribute exists, but without specifying a value, pass
    C{withAttribute.ANY_VALUE} as the value.

    Example::
        html = '''
            <div>
            Some text
            <div type="grid">1 4 0 1 0</div>
            <div type="graph">1,3 2,3 1,1</div>
            <div>this has no type</div>
            </div>
                
        '''
        div,div_end = makeHTMLTags("div")

        # only match div tag having a type attribute with value "grid"
        div_grid = div().setParseAction(withAttribute(type="grid"))
        grid_expr = div_grid + SkipTo(div | div_end)("body")
        for grid_header in grid_expr.searchString(html):
            print(grid_header.body)
        
        # construct a match with any div tag having a type attribute, regardless of the value
        div_any_type = div().setParseAction(withAttribute(type=withAttribute.ANY_VALUE))
        div_expr = div_any_type + SkipTo(div | div_end)("body")
        for div_header in div_expr.searchString(html):
            print(div_header.body)
    prints::
        1 4 0 1 0

        1 4 0 1 0
        1,3 2,3 1,1
    NcSsg|]\}}||f�qSrwrw)r�r�r�rwrwrxr�Qsz!withAttribute.<locals>.<listcomp>cs^xX�D]P\}}||kr&t||d|��|tjkr|||krt||d||||f��qWdS)Nzno matching attribute z+attribute '%s' has value '%s', must be '%s')rre�	ANY_VALUE)r�r5r�ZattrNameZ	attrValue)�attrsrwrxr}RszwithAttribute.<locals>.pa)r�)r�ZattrDictr}rw)r�rxres2cCs|rd|nd}tf||i�S)a�
    Simplified version of C{L{withAttribute}} when matching on a div class - made
    difficult because C{class} is a reserved word in Python.

    Example::
        html = '''
            <div>
            Some text
            <div class="grid">1 4 0 1 0</div>
            <div class="graph">1,3 2,3 1,1</div>
            <div>this &lt;div&gt; has no class</div>
            </div>
                
        '''
        div,div_end = makeHTMLTags("div")
        div_grid = div().setParseAction(withClass("grid"))
        
        grid_expr = div_grid + SkipTo(div | div_end)("body")
        for grid_header in grid_expr.searchString(html):
            print(grid_header.body)
        
        div_any_type = div().setParseAction(withClass(withAttribute.ANY_VALUE))
        div_expr = div_any_type + SkipTo(div | div_end)("body")
        for div_header in div_expr.searchString(html):
            print(div_header.body)
    prints::
        1 4 0 1 0

        1 4 0 1 0
        1,3 2,3 1,1
    z%s:class�class)re)Z	classname�	namespaceZ	classattrrwrwrxrk\s �(rcCs�t�}||||B}�x`t|�D�]R\}}|ddd�\}}	}
}|	dkrTd|nd|}|	dkr�|dksxt|�dkr�td��|\}
}t�j|�}|
tjk�rd|	dkr�t||�t|t	|��}n�|	dk�r|dk	�rt|||�t|t	||��}nt||�t|t	|��}nD|	dk�rZt||
|||�t||
|||�}ntd	��n�|
tj
k�rH|	dk�r�t|t��s�t|�}t|j
|�t||�}n�|	dk�r|dk	�r�t|||�t|t	||��}nt||�t|t	|��}nD|	dk�r>t||
|||�t||
|||�}ntd	��ntd
��|�r`|j|�||j|�|BK}|}q"W||K}|S)a�	
    Helper method for constructing grammars of expressions made up of
    operators working in a precedence hierarchy.  Operators may be unary or
    binary, left- or right-associative.  Parse actions can also be attached
    to operator expressions. The generated parser will also recognize the use 
    of parentheses to override operator precedences (see example below).
    
    Note: if you define a deep operator list, you may see performance issues
    when using infixNotation. See L{ParserElement.enablePackrat} for a
    mechanism to potentially improve your parser performance.

    Parameters:
     - baseExpr - expression representing the most basic element for the nested expression
     - opList - list of tuples, one for each operator precedence level in the
      expression grammar; each tuple is of the form
      (opExpr, numTerms, rightLeftAssoc, parseAction), where:
       - opExpr is the pyparsing expression for the operator;
          may also be a string, which will be converted to a Literal;
          if numTerms is 3, opExpr is a tuple of two expressions, for the
          two operators separating the 3 terms
       - numTerms is the number of terms for this operator (must
          be 1, 2, or 3)
       - rightLeftAssoc is the indicator whether the operator is
          right or left associative, using the pyparsing-defined
          constants C{opAssoc.RIGHT} and C{opAssoc.LEFT}.
       - parseAction is the parse action to be associated with
          expressions matching this operator expression (the
          parse action tuple member may be omitted)
     - lpar - expression for matching left-parentheses (default=C{Suppress('(')})
     - rpar - expression for matching right-parentheses (default=C{Suppress(')')})

    Example::
        # simple example of four-function arithmetic with ints and variable names
        integer = pyparsing_common.signed_integer
        varname = pyparsing_common.identifier 
        
        arith_expr = infixNotation(integer | varname,
            [
            ('-', 1, opAssoc.RIGHT),
            (oneOf('* /'), 2, opAssoc.LEFT),
            (oneOf('+ -'), 2, opAssoc.LEFT),
            ])
        
        arith_expr.runTests('''
            5+3*6
            (5+3)*6
            -2--11
            ''', fullDump=False)
    prints::
        5+3*6
        [[5, '+', [3, '*', 6]]]

        (5+3)*6
        [[[5, '+', 3], '*', 6]]

        -2--11
        [[['-', 2], '-', ['-', 11]]]
    Nrroz%s termz	%s%s termrqz@if numterms=3, opExpr must be a tuple or list of two expressionsrrz6operator must be unary (1), binary (2), or ternary (3)z2operator must indicate right or left associativity)N)rr�r�r�rirT�LEFTrrr�RIGHTrzrr.r�)ZbaseExprZopListZlparZrparr�ZlastExprr�ZoperDefZopExprZarityZrightLeftAssocr}ZtermNameZopExpr1ZopExpr2ZthisExprrsrwrwrxri�sR;

&




&


z4"(?:[^"\n\r\\]|(?:"")|(?:\\(?:[^x]|x[0-9a-fA-F]+)))*�"z string enclosed in double quotesz4'(?:[^'\n\r\\]|(?:'')|(?:\\(?:[^x]|x[0-9a-fA-F]+)))*�'z string enclosed in single quotesz*quotedString using single or double quotes�uzunicode string literalcCs�||krtd��|dk�r(t|t�o,t|t��r t|�dkr�t|�dkr�|dk	r�tt|t||tjdd���j	dd��}n$t
j�t||tj�j	dd��}nx|dk	r�tt|t|�t|�ttjdd���j	dd��}n4ttt|�t|�ttjdd���j	d	d��}ntd
��t
�}|dk	�rb|tt|�t||B|B�t|��K}n$|tt|�t||B�t|��K}|jd||f�|S)a~	
    Helper method for defining nested lists enclosed in opening and closing
    delimiters ("(" and ")" are the default).

    Parameters:
     - opener - opening character for a nested list (default=C{"("}); can also be a pyparsing expression
     - closer - closing character for a nested list (default=C{")"}); can also be a pyparsing expression
     - content - expression for items within the nested lists (default=C{None})
     - ignoreExpr - expression for ignoring opening and closing delimiters (default=C{quotedString})

    If an expression is not provided for the content argument, the nested
    expression will capture all whitespace-delimited content between delimiters
    as a list of separate values.

    Use the C{ignoreExpr} argument to define expressions that may contain
    opening or closing characters that should not be treated as opening
    or closing characters for nesting, such as quotedString or a comment
    expression.  Specify multiple expressions using an C{L{Or}} or C{L{MatchFirst}}.
    The default is L{quotedString}, but if no expressions are to be ignored,
    then pass C{None} for this argument.

    Example::
        data_type = oneOf("void int short long char float double")
        decl_data_type = Combine(data_type + Optional(Word('*')))
        ident = Word(alphas+'_', alphanums+'_')
        number = pyparsing_common.number
        arg = Group(decl_data_type + ident)
        LPAR,RPAR = map(Suppress, "()")

        code_body = nestedExpr('{', '}', ignoreExpr=(quotedString | cStyleComment))

        c_function = (decl_data_type("type") 
                      + ident("name")
                      + LPAR + Optional(delimitedList(arg), [])("args") + RPAR 
                      + code_body("body"))
        c_function.ignore(cStyleComment)
        
        source_code = '''
            int is_odd(int x) { 
                return (x%2); 
            }
                
            int dec_to_hex(char hchar) { 
                if (hchar >= '0' && hchar <= '9') { 
                    return (ord(hchar)-ord('0')); 
                } else { 
                    return (10+ord(hchar)-ord('A'));
                } 
            }
        '''
        for func in c_function.searchString(source_code):
            print("%(name)s (%(type)s) args: %(args)s" % func)

    prints::
        is_odd (int) args: [['int', 'x']]
        dec_to_hex (int) args: [['char', 'hchar']]
    z.opening and closing strings cannot be the sameNrr)r
cSs|dj�S)Nr)r�)rvrwrwrxry9sznestedExpr.<locals>.<lambda>cSs|dj�S)Nr)r�)rvrwrwrxry<scSs|dj�S)Nr)r�)rvrwrwrxryBscSs|dj�S)Nr)r�)rvrwrwrxryFszOopening and closing arguments must be strings if no content expression is givenznested %s%s expression)r�rzr�r�r
rr	r$rNr�rCr�rrrr+r2ri)�openerZcloserZcontentrOr�rwrwrxrP�s4:

*$cs��fdd�}�fdd�}�fdd�}tt�jd�j��}t�t�j|�jd�}t�j|�jd	�}t�j|�jd
�}	|r�tt|�|t|t|�t|��|	�}
n$tt|�t|t|�t|���}
|j	t
t��|
jd�S)a
	
    Helper method for defining space-delimited indentation blocks, such as
    those used to define block statements in Python source code.

    Parameters:
     - blockStatementExpr - expression defining syntax of statement that
            is repeated within the indented block
     - indentStack - list created by caller to manage indentation stack
            (multiple statementWithIndentedBlock expressions within a single grammar
            should share a common indentStack)
     - indent - boolean indicating whether block must be indented beyond the
            current level; set to False for a block of left-most statements
            (default=C{True})

    A valid block must contain at least one C{blockStatement}.

    Example::
        data = '''
        def A(z):
          A1
          B = 100
          G = A2
          A2
          A3
        B
        def BB(a,b,c):
          BB1
          def BBA():
            bba1
            bba2
            bba3
        C
        D
        def spam(x,y):
             def eggs(z):
                 pass
        '''


        indentStack = [1]
        stmt = Forward()

        identifier = Word(alphas, alphanums)
        funcDecl = ("def" + identifier + Group( "(" + Optional( delimitedList(identifier) ) + ")" ) + ":")
        func_body = indentedBlock(stmt, indentStack)
        funcDef = Group( funcDecl + func_body )

        rvalue = Forward()
        funcCall = Group(identifier + "(" + Optional(delimitedList(rvalue)) + ")")
        rvalue << (funcCall | identifier | Word(nums))
        assignment = Group(identifier + "=" + rvalue)
        stmt << ( funcDef | assignment | identifier )

        module_body = OneOrMore(stmt)

        parseTree = module_body.parseString(data)
        parseTree.pprint()
    prints::
        [['def',
          'A',
          ['(', 'z', ')'],
          ':',
          [['A1'], [['B', '=', '100']], [['G', '=', 'A2']], ['A2'], ['A3']]],
         'B',
         ['def',
          'BB',
          ['(', 'a', 'b', 'c', ')'],
          ':',
          [['BB1'], [['def', 'BBA', ['(', ')'], ':', [['bba1'], ['bba2'], ['bba3']]]]]],
         'C',
         'D',
         ['def',
          'spam',
          ['(', 'x', 'y', ')'],
          ':',
          [[['def', 'eggs', ['(', 'z', ')'], ':', [['pass']]]]]]] 
    csN|t|�krdSt||�}|�dkrJ|�dkr>t||d��t||d��dS)Nrrzillegal nestingznot a peer entryrsrs)r�r9r!r)r�r5rv�curCol)�indentStackrwrx�checkPeerIndent�s
z&indentedBlock.<locals>.checkPeerIndentcs2t||�}|�dkr"�j|�nt||d��dS)Nrrznot a subentryrs)r9r�r)r�r5rvr�)r�rwrx�checkSubIndent�s
z%indentedBlock.<locals>.checkSubIndentcsN|t|�krdSt||�}�o4|�dko4|�dksBt||d���j�dS)Nrrrqznot an unindentrsr:)r�r9rr�)r�r5rvr�)r�rwrx�
checkUnindent�s
z$indentedBlock.<locals>.checkUnindentz	 �INDENTr�ZUNINDENTzindented block)rrr�r�r
r�rirrr�rk)ZblockStatementExprr�rr�r�r�r!r�ZPEERZUNDENTZsmExprrw)r�rxrfQsN,z#[\0xc0-\0xd6\0xd8-\0xf6\0xf8-\0xff]z[\0xa1-\0xbf\0xd7\0xf7]z_:zany tagzgt lt amp nbsp quot aposz><& "'z&(?P<entity>rnz);zcommon HTML entitycCstj|j�S)zRHelper parser action to replace common HTML entities with their special characters)�_htmlEntityMapr�Zentity)rvrwrwrxr[�sz/\*(?:[^*]|\*(?!/))*z*/zC style commentz<!--[\s\S]*?-->zHTML commentz.*zrest of linez//(?:\\\n|[^\n])*z
// commentzC++ style commentz#.*zPython style comment)r�z 	�	commaItem)r�c@s�eZdZdZee�Zee�Ze	e
�jd�je�Z
e	e�jd�jeed��Zed�jd�je�Ze�je�de�je�jd�Zejd	d
��eeeed�j�e�Bjd�Zeje�ed
�jd�je�Zed�jd�je�ZeeBeBj�Zed�jd�je�Ze	eded�jd�Zed�jd�Z ed�jd�Z!e!de!djd�Z"ee!de!d>�dee!de!d?�jd�Z#e#j$d d
��d!e jd"�Z%e&e"e%Be#Bjd#��jd#�Z'ed$�jd%�Z(e)d@d'd(��Z*e)dAd*d+��Z+ed,�jd-�Z,ed.�jd/�Z-ed0�jd1�Z.e/j�e0j�BZ1e)d2d3��Z2e&e3e4d4�e5�e	e6d4d5�ee7d6����j�jd7�Z8e9ee:j;�e8Bd8d9��jd:�Z<e)ed;d
���Z=e)ed<d
���Z>d=S)Brna�

    Here are some common low-level expressions that may be useful in jump-starting parser development:
     - numeric forms (L{integers<integer>}, L{reals<real>}, L{scientific notation<sci_real>})
     - common L{programming identifiers<identifier>}
     - network addresses (L{MAC<mac_address>}, L{IPv4<ipv4_address>}, L{IPv6<ipv6_address>})
     - ISO8601 L{dates<iso8601_date>} and L{datetime<iso8601_datetime>}
     - L{UUID<uuid>}
     - L{comma-separated list<comma_separated_list>}
    Parse actions:
     - C{L{convertToInteger}}
     - C{L{convertToFloat}}
     - C{L{convertToDate}}
     - C{L{convertToDatetime}}
     - C{L{stripHTMLTags}}
     - C{L{upcaseTokens}}
     - C{L{downcaseTokens}}

    Example::
        pyparsing_common.number.runTests('''
            # any int or real number, returned as the appropriate type
            100
            -100
            +100
            3.14159
            6.02e23
            1e-12
            ''')

        pyparsing_common.fnumber.runTests('''
            # any int or real number, returned as float
            100
            -100
            +100
            3.14159
            6.02e23
            1e-12
            ''')

        pyparsing_common.hex_integer.runTests('''
            # hex numbers
            100
            FF
            ''')

        pyparsing_common.fraction.runTests('''
            # fractions
            1/2
            -3/4
            ''')

        pyparsing_common.mixed_integer.runTests('''
            # mixed fractions
            1
            1/2
            -3/4
            1-3/4
            ''')

        import uuid
        pyparsing_common.uuid.setParseAction(tokenMap(uuid.UUID))
        pyparsing_common.uuid.runTests('''
            # uuid
            12345678-1234-5678-1234-567812345678
            ''')
    prints::
        # any int or real number, returned as the appropriate type
        100
        [100]

        -100
        [-100]

        +100
        [100]

        3.14159
        [3.14159]

        6.02e23
        [6.02e+23]

        1e-12
        [1e-12]

        # any int or real number, returned as float
        100
        [100.0]

        -100
        [-100.0]

        +100
        [100.0]

        3.14159
        [3.14159]

        6.02e23
        [6.02e+23]

        1e-12
        [1e-12]

        # hex numbers
        100
        [256]

        FF
        [255]

        # fractions
        1/2
        [0.5]

        -3/4
        [-0.75]

        # mixed fractions
        1
        [1]

        1/2
        [0.5]

        -3/4
        [-0.75]

        1-3/4
        [1.75]

        # uuid
        12345678-1234-5678-1234-567812345678
        [UUID('12345678-1234-5678-1234-567812345678')]
    �integerzhex integerrtz[+-]?\d+zsigned integerr��fractioncCs|d|dS)Nrrrrsrw)rvrwrwrxry�szpyparsing_common.<lambda>r8z"fraction or mixed integer-fractionz
[+-]?\d+\.\d*zreal numberz+[+-]?\d+([eE][+-]?\d+|\.\d*([eE][+-]?\d+)?)z$real number with scientific notationz[+-]?\d+\.?\d*([eE][+-]?\d+)?�fnumberrB�
identifierzK(25[0-5]|2[0-4][0-9]|1?[0-9]{1,2})(\.(25[0-5]|2[0-4][0-9]|1?[0-9]{1,2})){3}zIPv4 addressz[0-9a-fA-F]{1,4}�hex_integerr��zfull IPv6 addressrrBz::zshort IPv6 addresscCstdd�|D��dkS)Ncss|]}tjj|�rdVqdS)rrN)rn�
_ipv6_partr�)r�rfrwrwrxr��sz,pyparsing_common.<lambda>.<locals>.<genexpr>rw)rH)rvrwrwrxry�sz::ffff:zmixed IPv6 addresszIPv6 addressz:[0-9a-fA-F]{2}([:.-])[0-9a-fA-F]{2}(?:\1[0-9a-fA-F]{2}){4}zMAC address�%Y-%m-%dcs�fdd�}|S)a�
        Helper to create a parse action for converting parsed date string to Python datetime.date

        Params -
         - fmt - format to be passed to datetime.strptime (default=C{"%Y-%m-%d"})

        Example::
            date_expr = pyparsing_common.iso8601_date.copy()
            date_expr.setParseAction(pyparsing_common.convertToDate())
            print(date_expr.parseString("1999-12-31"))
        prints::
            [datetime.date(1999, 12, 31)]
        csLytj|d��j�Stk
rF}zt||t|���WYdd}~XnXdS)Nr)r�strptimeZdater�rr{)r�r5rv�ve)�fmtrwrx�cvt_fn�sz.pyparsing_common.convertToDate.<locals>.cvt_fnrw)r�r�rw)r�rx�
convertToDate�szpyparsing_common.convertToDate�%Y-%m-%dT%H:%M:%S.%fcs�fdd�}|S)a
        Helper to create a parse action for converting parsed datetime string to Python datetime.datetime

        Params -
         - fmt - format to be passed to datetime.strptime (default=C{"%Y-%m-%dT%H:%M:%S.%f"})

        Example::
            dt_expr = pyparsing_common.iso8601_datetime.copy()
            dt_expr.setParseAction(pyparsing_common.convertToDatetime())
            print(dt_expr.parseString("1999-12-31T23:59:59.999"))
        prints::
            [datetime.datetime(1999, 12, 31, 23, 59, 59, 999000)]
        csHytj|d��Stk
rB}zt||t|���WYdd}~XnXdS)Nr)rr�r�rr{)r�r5rvr�)r�rwrxr��sz2pyparsing_common.convertToDatetime.<locals>.cvt_fnrw)r�r�rw)r�rx�convertToDatetime�sz"pyparsing_common.convertToDatetimez7(?P<year>\d{4})(?:-(?P<month>\d\d)(?:-(?P<day>\d\d))?)?zISO8601 datez�(?P<year>\d{4})-(?P<month>\d\d)-(?P<day>\d\d)[T ](?P<hour>\d\d):(?P<minute>\d\d)(:(?P<second>\d\d(\.\d*)?)?)?(?P<tz>Z|[+-]\d\d:?\d\d)?zISO8601 datetimez2[0-9a-fA-F]{8}(-[0-9a-fA-F]{4}){3}-[0-9a-fA-F]{12}�UUIDcCstjj|d�S)a
        Parse action to remove HTML tags from web page HTML source

        Example::
            # strip HTML links from normal text 
            text = '<td>More info at the <a href="http://pyparsing.wikispaces.com">pyparsing</a> wiki page</td>'
            td,td_end = makeHTMLTags("TD")
            table_text = td + SkipTo(td_end).setParseAction(pyparsing_common.stripHTMLTags)("body") + td_end
            
            print(table_text.parseString(text).body) # -> 'More info at the pyparsing wiki page'
        r)rn�_html_stripperr�)r�r5r�rwrwrx�
stripHTMLTags�s
zpyparsing_common.stripHTMLTagsra)r�z 	r�r�)r�zcomma separated listcCst|�j�S)N)r�r�)rvrwrwrxry�scCst|�j�S)N)r�r�)rvrwrwrxry�sN)rrB)rrB)r�)r�)?r�r�r�r�rmruZconvertToInteger�floatZconvertToFloatr/rRrir�r�rDr�r'Zsigned_integerr�rxrr�Z
mixed_integerrH�realZsci_realr��numberr�r4r3r�Zipv4_addressr�Z_full_ipv6_addressZ_short_ipv6_addressr~Z_mixed_ipv6_addressr
Zipv6_addressZmac_addressr�r�r�Ziso8601_dateZiso8601_datetime�uuidr7r6r�r�rrrrVr.�
_commasepitemr@rYr�Zcomma_separated_listrdrBrwrwrwrxrn�sN""
28�__main__Zselect�fromz_$r])rb�columnsrjZtablesZcommandaK
        # '*' as column list and dotted table name
        select * from SYS.XYZZY

        # caseless match on "SELECT", and casts back to "select"
        SELECT * from XYZZY, ABC

        # list of column names, and mixed case SELECT keyword
        Select AA,BB,CC from Sys.dual

        # multiple tables
        Select A, B, C from Sys.dual, Table2

        # invalid SELECT keyword - should fail
        Xelect A, B, C from Sys.dual

        # incomplete command - should fail
        Select

        # invalid column name - should fail
        Select ^^^ frox Sys.dual

        z]
        100
        -100
        +100
        3.14159
        6.02e23
        1e-12
        z 
        100
        FF
        z6
        12345678-1234-5678-1234-567812345678
        )rq)raF)N)FT)T)r�)T)�r��__version__Z__versionTime__�
__author__r��weakrefrr�r�r~r�rdrr�r"r<r�r�_threadr�ImportErrorZ	threadingrr�Zordereddict�__all__r��version_infor;r�maxsizer�r{r��chrrur�rHr�r�reversedr�r�rr6rrrIZmaxintZxranger�Z__builtin__r�Zfnamer�rJr�r�r�r�r�r�Zascii_uppercaseZascii_lowercaser4rRrDr3rkr�Z	printablerVrKrrr!r#r&r�r"�MutableMapping�registerr9rJrGr/r2r4rQrMr$r,r
rrr�rQrrrrlr/r'r%r	r.r0rrrr*r)r1r0r rrrrrrrrJrr2rMrNrr(rrVr-r
rrr+rrbr@r<r�rOrNrrSrArgrhrjrirCrIrHrar`r�Z_escapedPuncZ_escapedHexCharZ_escapedOctChar�UNICODEZ_singleCharZ
_charRangermr}r_rMr\rZrmrdrBr�rKrLrer�rkrTr�r�rirUr>r^rYrcrPrfr5rWr7r6r�r�r�r�r;r[r8rEr�r]r?r=rFrXr�r�r:rnr�ZselectTokenZ	fromTokenZidentZ
columnNameZcolumnNameListZ
columnSpecZ	tableNameZ
tableNameListZ	simpleSQLr�r�r�r�r�r�rwrwrwrx�<module>=s�









8


@d&A= I
G3pLOD|M &#@sQ,A,	I#%&0
,	?#kZr

 (
 0 


"site-packages/pkg_resources/_vendor/__pycache__/__init__.cpython-36.opt-1.pyc000064400000000161147511334660023137 0ustar003

��f�@sdS)N�rrr�/usr/lib/python3.6/__init__.py�<module>ssite-packages/pkg_resources/_vendor/__pycache__/six.cpython-36.opt-1.pyc000064400000057532147511334660022221 0ustar003

��f�u�I@srdZddlmZddlZddlZddlZddlZddlZdZdZ	ej
ddkZej
ddkZej
dd��dzkZ
er�efZefZefZeZeZejZn�efZeefZeejfZeZeZejjd	�r�e�d|�ZnLGdd
�d
e�Z ye!e ��Wn e"k
�re�d~�ZYnXe�d��Z[ dd�Z#dd�Z$Gdd�de�Z%Gdd�de%�Z&Gdd�dej'�Z(Gdd�de%�Z)Gdd�de�Z*e*e+�Z,Gdd�de(�Z-e)ddd d!�e)d"d#d$d%d"�e)d&d#d#d'd&�e)d(d)d$d*d(�e)d+d)d,�e)d-d#d$d.d-�e)d/d0d0d1d/�e)d2d0d0d/d2�e)d3d)d$d4d3�e)d5d)e
�rd6nd7d8�e)d9d)d:�e)d;d<d=d>�e)d!d!d �e)d?d?d@�e)dAdAd@�e)dBdBd@�e)d4d)d$d4d3�e)dCd#d$dDdC�e)dEd#d#dFdE�e&d$d)�e&dGdH�e&dIdJ�e&dKdLdM�e&dNdOdN�e&dPdQdR�e&dSdTdU�e&dVdWdX�e&dYdZd[�e&d\d]d^�e&d_d`da�e&dbdcdd�e&dedfdg�e&dhdidj�e&dkdkdl�e&dmdmdl�e&dndndl�e&dododp�e&dqdr�e&dsdt�e&dudv�e&dwdxdw�e&dydz�e&d{d|d}�e&d~dd��e&d�d�d��e&d�d�d��e&d�d�d��e&d�d�d��e&d�d�d��e&d�d�d��e&d�d�d��e&d�d�d��e&d�d�d��e&d�d�d��e&d�d�d��e&d�d�d��e&d�e+d�d��e&d�e+d�d��e&d�e+d�e+d��e&d�d�d��e&d�d�d��e&d�d�d��g>Z.ejd�k�rZe.e&d�d��g7Z.x:e.D]2Z/e0e-e/j1e/�e2e/e&��r`e,j3e/d�e/j1��q`W[/e.e-_.e-e+d��Z4e,j3e4d��Gd�d��d�e(�Z5e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d>d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��e)d�d�d��gZ6xe6D]Z/e0e5e/j1e/��q�W[/e6e5_.e,j3e5e+d��d�dӃGd�dՄd�e(�Z7e)d�d�d��e)d�d�d��e)d�d�d��gZ8xe8D]Z/e0e7e/j1e/��q$W[/e8e7_.e,j3e7e+d��d�d܃Gd�dބd�e(�Z9e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)d�d�d�e)�dd�d�g!Z:xe:D]Z/e0e9e/j1e/��q�W[/e:e9_.e,j3e9e+�d��d�d�G�d�d��de(�Z;e)�dd��d�e)�dd��d�e)�d	d��d�e)�d
d��d�gZ<xe<D]Z/e0e;e/j1e/��qTW[/e<e;_.e,j3e;e+�d��d�d
�G�d�d��de(�Z=e)�dd�d��gZ>xe>D]Z/e0e=e/j1e/��q�W[/e>e=_.e,j3e=e+�d��d�d�G�d�d��dej'�Z?e,j3e?e+d���d��d�d�Z@�d�d�ZAe�	rj�dZB�dZC�dZD�dZE�dZF�d ZGn$�d!ZB�d"ZC�d#ZD�d$ZE�d%ZF�d&ZGyeHZIWn"eJk
�	r��d'�d(�ZIYnXeIZHyeKZKWn"eJk
�	r��d)�d*�ZKYnXe�
r�d+�d,�ZLejMZN�d-�d.�ZOeZPn>�d/�d,�ZL�d0�d1�ZN�d2�d.�ZOG�d3�d4��d4e�ZPeKZKe#eL�d5�ejQeB�ZRejQeC�ZSejQeD�ZTejQeE�ZUejQeF�ZVejQeG�ZWe�
r��d6�d7�ZX�d8�d9�ZY�d:�d;�ZZ�d<�d=�Z[ej\�d>�Z]ej\�d?�Z^ej\�d@�Z_nT�dA�d7�ZX�dB�d9�ZY�dC�d;�ZZ�dD�d=�Z[ej\�dE�Z]ej\�dF�Z^ej\�dG�Z_e#eX�dH�e#eY�dI�e#eZ�dJ�e#e[�dK�e�r�dL�dM�Z`�dN�dO�ZaebZcddldZdedje�dP�jfZg[dejhd�ZiejjZkelZmddlnZnenjoZoenjpZp�dQZqej
d
d
k�r�dRZr�dSZsn�dTZr�dUZsnj�dV�dM�Z`�dW�dO�ZaecZcebZg�dX�dY�Zi�dZ�d[�Zkejtejuev�ZmddloZoeojoZoZp�d\Zq�dRZr�dSZse#e`�d]�e#ea�d^��d_�dQ�Zw�d`�dT�Zx�da�dU�Zye�r�eze4j{�db�Z|�d��dc�dd�Z}n�d��de�df�Z|e|�dg�ej
dd��d�k�
re|�dh�n.ej
dd��d�k�
r8e|�di�n�dj�dk�Z~eze4j{�dld�Zedk�
rj�dm�dn�Zej
dd��d�k�
r�eZ��do�dn�Ze#e}�dp�ej
dd��d�k�
r�ej�ej�f�dq�dr�Z�nej�Z��ds�dt�Z��du�dv�Z��dw�dx�Z�gZ�e+Z�e��j��dy�dk	�rge�_�ej��rbx>e�ej��D]0\Z�Z�ee��j+dk�r*e�j1e+k�r*ej�e�=P�q*W[�[�ej�j�e,�dS(�z6Utilities for writing code that runs on Python 2 and 3�)�absolute_importNz'Benjamin Peterson <benjamin@python.org>z1.10.0����java��c@seZdZdd�ZdS)�XcCsdS)Nrrl�)�selfr
r
�/usr/lib/python3.6/six.py�__len__>sz	X.__len__N)�__name__�
__module__�__qualname__r
r
r
r
rr	<sr	�?cCs
||_dS)z Add documentation to a function.N)�__doc__)�func�docr
r
r�_add_docKsrcCst|�tj|S)z7Import module, returning the module after the last dot.)�
__import__�sys�modules)�namer
r
r�_import_modulePsrc@seZdZdd�Zdd�ZdS)�
_LazyDescrcCs
||_dS)N)r)rrr
r
r�__init__Xsz_LazyDescr.__init__cCsB|j�}t||j|�yt|j|j�Wntk
r<YnX|S)N)�_resolve�setattrr�delattr�	__class__�AttributeError)r�obj�tp�resultr
r
r�__get__[sz_LazyDescr.__get__N)rrrrr%r
r
r
rrVsrcs.eZdZd�fdd�	Zdd�Zdd�Z�ZS)	�MovedModuleNcs2tt|�j|�tr(|dkr |}||_n||_dS)N)�superr&r�PY3�mod)rr�old�new)r r
rriszMovedModule.__init__cCs
t|j�S)N)rr))rr
r
rrrszMovedModule._resolvecCs"|j�}t||�}t|||�|S)N)r�getattrr)r�attr�_module�valuer
r
r�__getattr__us
zMovedModule.__getattr__)N)rrrrrr0�
__classcell__r
r
)r rr&gs	r&cs(eZdZ�fdd�Zdd�ZgZ�ZS)�_LazyModulecstt|�j|�|jj|_dS)N)r'r2rr r)rr)r r
rr~sz_LazyModule.__init__cCs ddg}|dd�|jD�7}|S)NrrcSsg|]
}|j�qSr
)r)�.0r-r
r
r�
<listcomp>�sz'_LazyModule.__dir__.<locals>.<listcomp>)�_moved_attributes)rZattrsr
r
r�__dir__�sz_LazyModule.__dir__)rrrrr6r5r1r
r
)r rr2|sr2cs&eZdZd�fdd�	Zdd�Z�ZS)�MovedAttributeNcsdtt|�j|�trH|dkr |}||_|dkr@|dkr<|}n|}||_n||_|dkrZ|}||_dS)N)r'r7rr(r)r-)rrZold_modZnew_modZold_attrZnew_attr)r r
rr�szMovedAttribute.__init__cCst|j�}t||j�S)N)rr)r,r-)r�moduler
r
rr�s
zMovedAttribute._resolve)NN)rrrrrr1r
r
)r rr7�sr7c@sVeZdZdZdd�Zdd�Zdd�Zdd	d
�Zdd�Zd
d�Z	dd�Z
dd�ZeZdS)�_SixMetaPathImporterz�
    A meta path importer to import six.moves and its submodules.

    This class implements a PEP302 finder and loader. It should be compatible
    with Python 2.5 and all existing versions of Python3
    cCs||_i|_dS)N)r�
known_modules)rZsix_module_namer
r
rr�sz_SixMetaPathImporter.__init__cGs&x |D]}||j|jd|<qWdS)N�.)r:r)rr)Z	fullnames�fullnamer
r
r�_add_module�s
z _SixMetaPathImporter._add_modulecCs|j|jd|S)Nr;)r:r)rr<r
r
r�_get_module�sz _SixMetaPathImporter._get_moduleNcCs||jkr|SdS)N)r:)rr<�pathr
r
r�find_module�s
z _SixMetaPathImporter.find_modulecCs0y
|j|Stk
r*td|��YnXdS)Nz!This loader does not know module )r:�KeyError�ImportError)rr<r
r
rZ__get_module�s
z!_SixMetaPathImporter.__get_modulecCsRy
tj|Stk
rYnX|j|�}t|t�r>|j�}n||_|tj|<|S)N)rrrA� _SixMetaPathImporter__get_module�
isinstancer&r�
__loader__)rr<r)r
r
r�load_module�s




z _SixMetaPathImporter.load_modulecCst|j|�d�S)z�
        Return true, if the named module is a package.

        We need this method to get correct spec objects with
        Python 3.4 (see PEP451)
        �__path__)�hasattrrC)rr<r
r
r�
is_package�sz_SixMetaPathImporter.is_packagecCs|j|�dS)z;Return None

        Required, if is_package is implementedN)rC)rr<r
r
r�get_code�s
z_SixMetaPathImporter.get_code)N)
rrrrrr=r>r@rCrFrIrJ�
get_sourcer
r
r
rr9�s
	r9c@seZdZdZgZdS)�_MovedItemszLazy loading of moved objectsN)rrrrrGr
r
r
rrL�srLZ	cStringIO�io�StringIO�filter�	itertools�builtinsZifilter�filterfalseZifilterfalse�inputZ__builtin__Z	raw_input�internr�map�imap�getcwd�osZgetcwdu�getcwdb�rangeZxrangeZ
reload_module�	importlibZimp�reload�reduce�	functoolsZshlex_quoteZpipesZshlexZquote�UserDict�collections�UserList�
UserString�zipZizip�zip_longestZizip_longestZconfigparserZConfigParser�copyregZcopy_regZdbm_gnuZgdbmzdbm.gnuZ
_dummy_threadZdummy_threadZhttp_cookiejarZ	cookielibzhttp.cookiejarZhttp_cookiesZCookiezhttp.cookiesZ
html_entitiesZhtmlentitydefsz
html.entitiesZhtml_parserZ
HTMLParserzhtml.parserZhttp_clientZhttplibzhttp.clientZemail_mime_multipartzemail.MIMEMultipartzemail.mime.multipartZemail_mime_nonmultipartzemail.MIMENonMultipartzemail.mime.nonmultipartZemail_mime_textzemail.MIMETextzemail.mime.textZemail_mime_basezemail.MIMEBasezemail.mime.baseZBaseHTTPServerzhttp.serverZ
CGIHTTPServerZSimpleHTTPServerZcPickle�pickleZqueueZQueue�reprlib�reprZsocketserverZSocketServer�_threadZthreadZtkinterZTkinterZtkinter_dialogZDialogztkinter.dialogZtkinter_filedialogZ
FileDialogztkinter.filedialogZtkinter_scrolledtextZScrolledTextztkinter.scrolledtextZtkinter_simpledialogZSimpleDialogztkinter.simpledialogZtkinter_tixZTixztkinter.tixZtkinter_ttkZttkztkinter.ttkZtkinter_constantsZTkconstantsztkinter.constantsZtkinter_dndZTkdndztkinter.dndZtkinter_colorchooserZtkColorChooserztkinter.colorchooserZtkinter_commondialogZtkCommonDialogztkinter.commondialogZtkinter_tkfiledialogZtkFileDialogZtkinter_fontZtkFontztkinter.fontZtkinter_messageboxZtkMessageBoxztkinter.messageboxZtkinter_tksimpledialogZtkSimpleDialogZurllib_parsez.moves.urllib_parsezurllib.parseZurllib_errorz.moves.urllib_errorzurllib.errorZurllibz
.moves.urllibZurllib_robotparser�robotparserzurllib.robotparserZ
xmlrpc_clientZ	xmlrpclibz
xmlrpc.clientZ
xmlrpc_serverZSimpleXMLRPCServerz
xmlrpc.serverZwin32�winreg�_winregzmoves.z.moves�movesc@seZdZdZdS)�Module_six_moves_urllib_parsez7Lazy loading of moved objects in six.moves.urllib_parseN)rrrrr
r
r
rrn@srnZParseResultZurlparseZSplitResultZparse_qsZ	parse_qslZ	urldefragZurljoinZurlsplitZ
urlunparseZ
urlunsplitZ
quote_plusZunquoteZunquote_plusZ	urlencodeZ
splitqueryZsplittagZ	splituserZ
uses_fragmentZuses_netlocZuses_paramsZ
uses_queryZ
uses_relativezmoves.urllib_parsezmoves.urllib.parsec@seZdZdZdS)�Module_six_moves_urllib_errorz7Lazy loading of moved objects in six.moves.urllib_errorN)rrrrr
r
r
rrohsroZURLErrorZurllib2Z	HTTPErrorZContentTooShortErrorz.moves.urllib.errorzmoves.urllib_errorzmoves.urllib.errorc@seZdZdZdS)�Module_six_moves_urllib_requestz9Lazy loading of moved objects in six.moves.urllib_requestN)rrrrr
r
r
rrp|srpZurlopenzurllib.requestZinstall_openerZbuild_openerZpathname2urlZurl2pathnameZ
getproxiesZRequestZOpenerDirectorZHTTPDefaultErrorHandlerZHTTPRedirectHandlerZHTTPCookieProcessorZProxyHandlerZBaseHandlerZHTTPPasswordMgrZHTTPPasswordMgrWithDefaultRealmZAbstractBasicAuthHandlerZHTTPBasicAuthHandlerZProxyBasicAuthHandlerZAbstractDigestAuthHandlerZHTTPDigestAuthHandlerZProxyDigestAuthHandlerZHTTPHandlerZHTTPSHandlerZFileHandlerZ
FTPHandlerZCacheFTPHandlerZUnknownHandlerZHTTPErrorProcessorZurlretrieveZ
urlcleanupZ	URLopenerZFancyURLopenerZproxy_bypassz.moves.urllib.requestzmoves.urllib_requestzmoves.urllib.requestc@seZdZdZdS)� Module_six_moves_urllib_responsez:Lazy loading of moved objects in six.moves.urllib_responseN)rrrrr
r
r
rrq�srqZaddbasezurllib.responseZaddclosehookZaddinfoZ
addinfourlz.moves.urllib.responsezmoves.urllib_responsezmoves.urllib.responsec@seZdZdZdS)�#Module_six_moves_urllib_robotparserz=Lazy loading of moved objects in six.moves.urllib_robotparserN)rrrrr
r
r
rrr�srrZRobotFileParserz.moves.urllib.robotparserzmoves.urllib_robotparserzmoves.urllib.robotparserc@sNeZdZdZgZejd�Zejd�Zejd�Z	ejd�Z
ejd�Zdd�Zd	S)
�Module_six_moves_urllibzICreate a six.moves.urllib namespace that resembles the Python 3 namespacezmoves.urllib_parsezmoves.urllib_errorzmoves.urllib_requestzmoves.urllib_responsezmoves.urllib_robotparsercCsdddddgS)N�parse�error�request�responserjr
)rr
r
rr6�szModule_six_moves_urllib.__dir__N)
rrrrrG�	_importerr>rtrurvrwrjr6r
r
r
rrs�s




rszmoves.urllibcCstt|j|�dS)zAdd an item to six.moves.N)rrLr)Zmover
r
r�add_move�srycCsXytt|�WnDtk
rRytj|=Wn"tk
rLtd|f��YnXYnXdS)zRemove item from six.moves.zno such move, %rN)rrLr!rm�__dict__rA)rr
r
r�remove_move�sr{�__func__�__self__�__closure__�__code__�__defaults__�__globals__�im_funcZim_selfZfunc_closureZ	func_codeZ
func_defaultsZfunc_globalscCs|j�S)N)�next)�itr
r
r�advance_iteratorsr�cCstdd�t|�jD��S)Ncss|]}d|jkVqdS)�__call__N)rz)r3�klassr
r
r�	<genexpr>szcallable.<locals>.<genexpr>)�any�type�__mro__)r"r
r
r�callablesr�cCs|S)Nr
)�unboundr
r
r�get_unbound_functionsr�cCs|S)Nr
)r�clsr
r
r�create_unbound_methodsr�cCs|jS)N)r�)r�r
r
rr�"scCstj|||j�S)N)�types�
MethodTyper )rr"r
r
r�create_bound_method%sr�cCstj|d|�S)N)r�r�)rr�r
r
rr�(sc@seZdZdd�ZdS)�IteratorcCst|�j|�S)N)r��__next__)rr
r
rr�-sz
Iterator.nextN)rrrr�r
r
r
rr�+sr�z3Get the function out of a possibly unbound functioncKst|jf|��S)N)�iter�keys)�d�kwr
r
r�iterkeys>sr�cKst|jf|��S)N)r��values)r�r�r
r
r�
itervaluesAsr�cKst|jf|��S)N)r��items)r�r�r
r
r�	iteritemsDsr�cKst|jf|��S)N)r�Zlists)r�r�r
r
r�	iterlistsGsr�r�r�r�cKs|jf|�S)N)r�)r�r�r
r
rr�PscKs|jf|�S)N)r�)r�r�r
r
rr�SscKs|jf|�S)N)r�)r�r�r
r
rr�VscKs|jf|�S)N)r�)r�r�r
r
rr�Ys�viewkeys�
viewvalues�	viewitemsz1Return an iterator over the keys of a dictionary.z3Return an iterator over the values of a dictionary.z?Return an iterator over the (key, value) pairs of a dictionary.zBReturn an iterator over the (key, [values]) pairs of a dictionary.cCs
|jd�S)Nzlatin-1)�encode)�sr
r
r�bksr�cCs|S)Nr
)r�r
r
r�unsr�z>B�assertCountEqualZassertRaisesRegexpZassertRegexpMatches�assertRaisesRegex�assertRegexcCs|S)Nr
)r�r
r
rr��scCst|jdd�d�S)Nz\\z\\\\Zunicode_escape)�unicode�replace)r�r
r
rr��scCst|d�S)Nr)�ord)Zbsr
r
r�byte2int�sr�cCst||�S)N)r�)Zbuf�ir
r
r�
indexbytes�sr�ZassertItemsEqualzByte literalzText literalcOst|t�||�S)N)r,�_assertCountEqual)r�args�kwargsr
r
rr��scOst|t�||�S)N)r,�_assertRaisesRegex)rr�r�r
r
rr��scOst|t�||�S)N)r,�_assertRegex)rr�r�r
r
rr��s�execcCs*|dkr|�}|j|k	r"|j|��|�dS)N)�
__traceback__�with_traceback)r#r/�tbr
r
r�reraise�s


r�cCsB|dkr*tjd�}|j}|dkr&|j}~n|dkr6|}td�dS)zExecute code in a namespace.Nrzexec _code_ in _globs_, _locs_)r�	_getframe�	f_globals�f_localsr�)Z_code_Z_globs_Z_locs_�framer
r
r�exec_�s
r�z9def reraise(tp, value, tb=None):
    raise tp, value, tb
zrdef raise_from(value, from_value):
    if from_value is None:
        raise value
    raise value from from_value
zCdef raise_from(value, from_value):
    raise value from from_value
cCs|�dS)Nr
)r/Z
from_valuer
r
r�
raise_from�sr��printc
s6|jdtj���dkrdS�fdd�}d}|jdd�}|dk	r`t|t�rNd}nt|t�s`td��|jd	d�}|dk	r�t|t�r�d}nt|t�s�td
��|r�td��|s�x|D]}t|t�r�d}Pq�W|r�td�}td
�}nd}d
}|dkr�|}|dk�r�|}x,t|�D] \}	}|	�r||�||��qW||�dS)z4The new-style print function for Python 2.4 and 2.5.�fileNcsdt|t�st|�}t�t�rVt|t�rV�jdk	rVt�dd�}|dkrHd}|j�j|�}�j|�dS)N�errors�strict)	rD�
basestring�strr�r��encodingr,r��write)�datar�)�fpr
rr��s



zprint_.<locals>.writeF�sepTzsep must be None or a string�endzend must be None or a stringz$invalid keyword arguments to print()�
� )�popr�stdoutrDr�r��	TypeError�	enumerate)
r�r�r�Zwant_unicoder�r��arg�newlineZspacer�r
)r�r�print_�sL







r�cOs<|jdtj�}|jdd�}t||�|r8|dk	r8|j�dS)Nr��flushF)�getrr�r��_printr�)r�r�r�r�r
r
rr�s

zReraise an exception.cs���fdd�}|S)Ncstj����|�}�|_|S)N)r^�wraps�__wrapped__)�f)�assigned�updated�wrappedr
r�wrapperszwraps.<locals>.wrapperr
)r�r�r�r�r
)r�r�r�rr�sr�cs&G��fdd�d��}tj|dfi�S)z%Create a base class with a metaclass.cseZdZ��fdd�ZdS)z!with_metaclass.<locals>.metaclasscs�|�|�S)Nr
)r�rZ
this_basesr�)�bases�metar
r�__new__'sz)with_metaclass.<locals>.metaclass.__new__N)rrrr�r
)r�r�r
r�	metaclass%sr�Ztemporary_class)r�r�)r�r�r�r
)r�r�r�with_metaclass sr�cs�fdd�}|S)z6Class decorator for creating a class with a metaclass.csl|jj�}|jd�}|dk	rDt|t�r,|g}x|D]}|j|�q2W|jdd�|jdd��|j|j|�S)N�	__slots__rz�__weakref__)rz�copyr�rDr�r�r�	__bases__)r�Z	orig_vars�slotsZ	slots_var)r�r
rr�.s



zadd_metaclass.<locals>.wrapperr
)r�r�r
)r�r�
add_metaclass,sr�cCs2tr.d|jkrtd|j��|j|_dd�|_|S)a
    A decorator that defines __unicode__ and __str__ methods under Python 2.
    Under Python 3 it does nothing.

    To support Python 2 and 3 with a single code base, define a __str__ method
    returning text and apply this decorator to the class.
    �__str__zY@python_2_unicode_compatible cannot be applied to %s because it doesn't define __str__().cSs|j�jd�S)Nzutf-8)�__unicode__r�)rr
r
r�<lambda>Jsz-python_2_unicode_compatible.<locals>.<lambda>)�PY2rz�
ValueErrorrr�r�)r�r
r
r�python_2_unicode_compatible<s


r��__spec__)rrli���li���ll����)N)NN)rr)rr)rr)rr)�rZ
__future__rr^rP�operatorrr��
__author__�__version__�version_infor�r(ZPY34r�Zstring_types�intZ
integer_typesr�Zclass_typesZ	text_type�bytesZbinary_type�maxsizeZMAXSIZEr�ZlongZ	ClassTyper��platform�
startswith�objectr	�len�
OverflowErrorrrrr&�
ModuleTyper2r7r9rrxrLr5r-rrrDr=rmrnZ_urllib_parse_moved_attributesroZ_urllib_error_moved_attributesrpZ _urllib_request_moved_attributesrqZ!_urllib_response_moved_attributesrrZ$_urllib_robotparser_moved_attributesrsryr{Z
_meth_funcZ
_meth_selfZ
_func_closureZ
_func_codeZ_func_defaultsZ
_func_globalsr�r��	NameErrorr�r�r�r�r�r��
attrgetterZget_method_functionZget_method_selfZget_function_closureZget_function_codeZget_function_defaultsZget_function_globalsr�r�r�r��methodcallerr�r�r�r�r��chrZunichr�struct�Struct�packZint2byte�
itemgetterr��getitemr�r�Z	iterbytesrMrN�BytesIOr�r�r��partialrVr�r�r�r�r,rQr�r�r�r�r��WRAPPER_ASSIGNMENTS�WRAPPER_UPDATESr�r�r�r�rG�__package__�globalsr�r��submodule_search_locations�	meta_pathr�r�Zimporter�appendr
r
r
r�<module>s�

>












































































































5site-packages/pkg_resources/_vendor/pyparsing.py000064400000700753147511334660016207 0ustar00# module pyparsing.py
#
# Copyright (c) 2003-2016  Paul T. McGuire
#
# Permission is hereby granted, free of charge, to any person obtaining
# a copy of this software and associated documentation files (the
# "Software"), to deal in the Software without restriction, including
# without limitation the rights to use, copy, modify, merge, publish,
# distribute, sublicense, and/or sell copies of the Software, and to
# permit persons to whom the Software is furnished to do so, subject to
# the following conditions:
#
# The above copyright notice and this permission notice shall be
# included in all copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
# EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
# MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
# IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY
# CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT,
# TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE
# SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
#

__doc__ = \
"""
pyparsing module - Classes and methods to define and execute parsing grammars

The pyparsing module is an alternative approach to creating and executing simple grammars,
vs. the traditional lex/yacc approach, or the use of regular expressions.  With pyparsing, you
don't need to learn a new syntax for defining grammars or matching expressions - the parsing module
provides a library of classes that you use to construct the grammar directly in Python.

Here is a program to parse "Hello, World!" (or any greeting of the form 
C{"<salutation>, <addressee>!"}), built up using L{Word}, L{Literal}, and L{And} elements 
(L{'+'<ParserElement.__add__>} operator gives L{And} expressions, strings are auto-converted to
L{Literal} expressions)::

    from pyparsing import Word, alphas

    # define grammar of a greeting
    greet = Word(alphas) + "," + Word(alphas) + "!"

    hello = "Hello, World!"
    print (hello, "->", greet.parseString(hello))

The program outputs the following::

    Hello, World! -> ['Hello', ',', 'World', '!']

The Python representation of the grammar is quite readable, owing to the self-explanatory
class names, and the use of '+', '|' and '^' operators.

The L{ParseResults} object returned from L{ParserElement.parseString<ParserElement.parseString>} can be accessed as a nested list, a dictionary, or an
object with named attributes.

The pyparsing module handles some of the problems that are typically vexing when writing text parsers:
 - extra or missing whitespace (the above program will also handle "Hello,World!", "Hello  ,  World  !", etc.)
 - quoted strings
 - embedded comments
"""

__version__ = "2.1.10"
__versionTime__ = "07 Oct 2016 01:31 UTC"
__author__ = "Paul McGuire <ptmcg@users.sourceforge.net>"

import string
from weakref import ref as wkref
import copy
import sys
import warnings
import re
import sre_constants
import collections
import pprint
import traceback
import types
from datetime import datetime

try:
    from _thread import RLock
except ImportError:
    from threading import RLock

try:
    from collections import OrderedDict as _OrderedDict
except ImportError:
    try:
        from ordereddict import OrderedDict as _OrderedDict
    except ImportError:
        _OrderedDict = None

#~ sys.stderr.write( "testing pyparsing module, version %s, %s\n" % (__version__,__versionTime__ ) )

__all__ = [
'And', 'CaselessKeyword', 'CaselessLiteral', 'CharsNotIn', 'Combine', 'Dict', 'Each', 'Empty',
'FollowedBy', 'Forward', 'GoToColumn', 'Group', 'Keyword', 'LineEnd', 'LineStart', 'Literal',
'MatchFirst', 'NoMatch', 'NotAny', 'OneOrMore', 'OnlyOnce', 'Optional', 'Or',
'ParseBaseException', 'ParseElementEnhance', 'ParseException', 'ParseExpression', 'ParseFatalException',
'ParseResults', 'ParseSyntaxException', 'ParserElement', 'QuotedString', 'RecursiveGrammarException',
'Regex', 'SkipTo', 'StringEnd', 'StringStart', 'Suppress', 'Token', 'TokenConverter', 
'White', 'Word', 'WordEnd', 'WordStart', 'ZeroOrMore',
'alphanums', 'alphas', 'alphas8bit', 'anyCloseTag', 'anyOpenTag', 'cStyleComment', 'col',
'commaSeparatedList', 'commonHTMLEntity', 'countedArray', 'cppStyleComment', 'dblQuotedString',
'dblSlashComment', 'delimitedList', 'dictOf', 'downcaseTokens', 'empty', 'hexnums',
'htmlComment', 'javaStyleComment', 'line', 'lineEnd', 'lineStart', 'lineno',
'makeHTMLTags', 'makeXMLTags', 'matchOnlyAtCol', 'matchPreviousExpr', 'matchPreviousLiteral',
'nestedExpr', 'nullDebugAction', 'nums', 'oneOf', 'opAssoc', 'operatorPrecedence', 'printables',
'punc8bit', 'pythonStyleComment', 'quotedString', 'removeQuotes', 'replaceHTMLEntity', 
'replaceWith', 'restOfLine', 'sglQuotedString', 'srange', 'stringEnd',
'stringStart', 'traceParseAction', 'unicodeString', 'upcaseTokens', 'withAttribute',
'indentedBlock', 'originalTextFor', 'ungroup', 'infixNotation','locatedExpr', 'withClass',
'CloseMatch', 'tokenMap', 'pyparsing_common',
]

system_version = tuple(sys.version_info)[:3]
PY_3 = system_version[0] == 3
if PY_3:
    _MAX_INT = sys.maxsize
    basestring = str
    unichr = chr
    _ustr = str

    # build list of single arg builtins, that can be used as parse actions
    singleArgBuiltins = [sum, len, sorted, reversed, list, tuple, set, any, all, min, max]

else:
    _MAX_INT = sys.maxint
    range = xrange

    def _ustr(obj):
        """Drop-in replacement for str(obj) that tries to be Unicode friendly. It first tries
           str(obj). If that fails with a UnicodeEncodeError, then it tries unicode(obj). It
           then encodes it with the default encoding, converting any characters it cannot
           encode into escaped representations, and returns the result.
        """
        if isinstance(obj,unicode):
            return obj

        try:
            # If this works, then _ustr(obj) has the same behaviour as str(obj), so
            # it won't break any existing code.
            return str(obj)

        except UnicodeEncodeError:
            # Else encode it
            ret = unicode(obj).encode(sys.getdefaultencoding(), 'xmlcharrefreplace')
            xmlcharref = Regex('&#\d+;')
            xmlcharref.setParseAction(lambda t: '\\u' + hex(int(t[0][2:-1]))[2:])
            return xmlcharref.transformString(ret)

    # build list of single arg builtins, tolerant of Python version, that can be used as parse actions
    singleArgBuiltins = []
    import __builtin__
    for fname in "sum len sorted reversed list tuple set any all min max".split():
        try:
            singleArgBuiltins.append(getattr(__builtin__,fname))
        except AttributeError:
            continue
            
_generatorType = type((y for y in range(1)))
 
def _xml_escape(data):
    """Escape &, <, >, ", ', etc. in a string of data."""

    # ampersand must be replaced first
    from_symbols = '&><"\''
    to_symbols = ('&'+s+';' for s in "amp gt lt quot apos".split())
    for from_,to_ in zip(from_symbols, to_symbols):
        data = data.replace(from_, to_)
    return data

class _Constants(object):
    pass

alphas     = string.ascii_uppercase + string.ascii_lowercase
nums       = "0123456789"
hexnums    = nums + "ABCDEFabcdef"
alphanums  = alphas + nums
_bslash    = chr(92)
printables = "".join(c for c in string.printable if c not in string.whitespace)

class ParseBaseException(Exception):
    """base exception class for all parsing runtime exceptions"""
    # Performance tuning: we construct a *lot* of these, so keep this
    # constructor as small and fast as possible
    def __init__( self, pstr, loc=0, msg=None, elem=None ):
        self.loc = loc
        if msg is None:
            self.msg = pstr
            self.pstr = ""
        else:
            self.msg = msg
            self.pstr = pstr
        self.parserElement = elem
        self.args = (pstr, loc, msg)

    @classmethod
    def _from_exception(cls, pe):
        """
        internal factory method to simplify creating one type of ParseException 
        from another - avoids having __init__ signature conflicts among subclasses
        """
        return cls(pe.pstr, pe.loc, pe.msg, pe.parserElement)

    def __getattr__( self, aname ):
        """supported attributes by name are:
            - lineno - returns the line number of the exception text
            - col - returns the column number of the exception text
            - line - returns the line containing the exception text
        """
        if( aname == "lineno" ):
            return lineno( self.loc, self.pstr )
        elif( aname in ("col", "column") ):
            return col( self.loc, self.pstr )
        elif( aname == "line" ):
            return line( self.loc, self.pstr )
        else:
            raise AttributeError(aname)

    def __str__( self ):
        return "%s (at char %d), (line:%d, col:%d)" % \
                ( self.msg, self.loc, self.lineno, self.column )
    def __repr__( self ):
        return _ustr(self)
    def markInputline( self, markerString = ">!<" ):
        """Extracts the exception line from the input string, and marks
           the location of the exception with a special symbol.
        """
        line_str = self.line
        line_column = self.column - 1
        if markerString:
            line_str = "".join((line_str[:line_column],
                                markerString, line_str[line_column:]))
        return line_str.strip()
    def __dir__(self):
        return "lineno col line".split() + dir(type(self))

class ParseException(ParseBaseException):
    """
    Exception thrown when parse expressions don't match the input string;
    supported attributes by name are:
     - lineno - returns the line number of the exception text
     - col - returns the column number of the exception text
     - line - returns the line containing the exception text
        
    Example::
        try:
            Word(nums).setName("integer").parseString("ABC")
        except ParseException as pe:
            print(pe)
            print("column: {}".format(pe.col))
            
    prints::
        Expected integer (at char 0), (line:1, col:1)
        column: 1
    """
    pass

class ParseFatalException(ParseBaseException):
    """user-throwable exception thrown when inconsistent parse content
       is found; stops all parsing immediately"""
    pass

class ParseSyntaxException(ParseFatalException):
    """just like L{ParseFatalException}, but thrown internally when an
       L{ErrorStop<And._ErrorStop>} ('-' operator) indicates that parsing is to stop 
       immediately because an unbacktrackable syntax error has been found"""
    pass
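
# Illustrative distinction (a sketch, not part of the original source; 'condition' stands
# for a hypothetical sub-expression):
#     Literal('if') + condition   # plain And - a failure in 'condition' simply backtracks
#     Literal('if') - condition   # '-' (ErrorStop) - a failure in 'condition' raises
#                                 # ParseSyntaxException and stops parsing immediately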

#~ class ReparseException(ParseBaseException):
    #~ """Experimental class - parse actions can raise this exception to cause
       #~ pyparsing to reparse the input string:
        #~ - with a modified input string, and/or
        #~ - with a modified start location
       #~ Set the values of the ReparseException in the constructor, and raise the
       #~ exception in a parse action to cause pyparsing to use the new string/location.
       #~ Setting the values as None causes no change to be made.
       #~ """
    #~ def __init_( self, newstring, restartLoc ):
        #~ self.newParseText = newstring
        #~ self.reparseLoc = restartLoc

class RecursiveGrammarException(Exception):
    """exception thrown by L{ParserElement.validate} if the grammar could be improperly recursive"""
    def __init__( self, parseElementList ):
        self.parseElementTrace = parseElementList

    def __str__( self ):
        return "RecursiveGrammarException: %s" % self.parseElementTrace

class _ParseResultsWithOffset(object):
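    # internal helper: pairs a stored results value (p1) with an integer offset (p2);
    # setOffset() updates that stored offset in place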
    def __init__(self,p1,p2):
        self.tup = (p1,p2)
    def __getitem__(self,i):
        return self.tup[i]
    def __repr__(self):
        return repr(self.tup[0])
    def setOffset(self,i):
        self.tup = (self.tup[0],i)

class ParseResults(object):
    """
    Structured parse results, to provide multiple means of access to the parsed data:
       - as a list (C{len(results)})
       - by list index (C{results[0], results[1]}, etc.)
       - by attribute (C{results.<resultsName>} - see L{ParserElement.setResultsName})

    Example::
        integer = Word(nums)
        date_str = (integer.setResultsName("year") + '/' 
                        + integer.setResultsName("month") + '/' 
                        + integer.setResultsName("day"))
        # equivalent form:
        # date_str = integer("year") + '/' + integer("month") + '/' + integer("day")

        # parseString returns a ParseResults object
        result = date_str.parseString("1999/12/31")

        def test(s, fn=repr):
            print("%s -> %s" % (s, fn(eval(s))))
        test("list(result)")
        test("result[0]")
        test("result['month']")
        test("result.day")
        test("'month' in result")
        test("'minutes' in result")
        test("result.dump()", str)
    prints::
        list(result) -> ['1999', '/', '12', '/', '31']
        result[0] -> '1999'
        result['month'] -> '12'
        result.day -> '31'
        'month' in result -> True
        'minutes' in result -> False
        result.dump() -> ['1999', '/', '12', '/', '31']
        - day: 31
        - month: 12
        - year: 1999
    """
    def __new__(cls, toklist=None, name=None, asList=True, modal=True ):
        if isinstance(toklist, cls):
            return toklist
        retobj = object.__new__(cls)
        retobj.__doinit = True
        return retobj

    # Performance tuning: we construct a *lot* of these, so keep this
    # constructor as small and fast as possible
    def __init__( self, toklist=None, name=None, asList=True, modal=True, isinstance=isinstance ):
        if self.__doinit:
            self.__doinit = False
            self.__name = None
            self.__parent = None
            self.__accumNames = {}
            self.__asList = asList
            self.__modal = modal
            if toklist is None:
                toklist = []
            if isinstance(toklist, list):
                self.__toklist = toklist[:]
            elif isinstance(toklist, _generatorType):
                self.__toklist = list(toklist)
            else:
                self.__toklist = [toklist]
            self.__tokdict = dict()

        if name is not None and name:
            if not modal:
                self.__accumNames[name] = 0
            if isinstance(name,int):
                name = _ustr(name) # will always return a str, but use _ustr for consistency
            self.__name = name
            if not (isinstance(toklist, (type(None), basestring, list)) and toklist in (None,'',[])):
                if isinstance(toklist,basestring):
                    toklist = [ toklist ]
                if asList:
                    if isinstance(toklist,ParseResults):
                        self[name] = _ParseResultsWithOffset(toklist.copy(),0)
                    else:
                        self[name] = _ParseResultsWithOffset(ParseResults(toklist[0]),0)
                    self[name].__name = name
                else:
                    try:
                        self[name] = toklist[0]
                    except (KeyError,TypeError,IndexError):
                        self[name] = toklist

    def __getitem__( self, i ):
        if isinstance( i, (int,slice) ):
            return self.__toklist[i]
        else:
            if i not in self.__accumNames:
                return self.__tokdict[i][-1][0]
            else:
                return ParseResults([ v[0] for v in self.__tokdict[i] ])

    def __setitem__( self, k, v, isinstance=isinstance ):
        if isinstance(v,_ParseResultsWithOffset):
            self.__tokdict[k] = self.__tokdict.get(k,list()) + [v]
            sub = v[0]
        elif isinstance(k,(int,slice)):
            self.__toklist[k] = v
            sub = v
        else:
            self.__tokdict[k] = self.__tokdict.get(k,list()) + [_ParseResultsWithOffset(v,0)]
            sub = v
        if isinstance(sub,ParseResults):
            sub.__parent = wkref(self)

    def __delitem__( self, i ):
        if isinstance(i,(int,slice)):
            mylen = len( self.__toklist )
            del self.__toklist[i]

            # convert int to slice
            if isinstance(i, int):
                if i < 0:
                    i += mylen
                i = slice(i, i+1)
            # get removed indices
            removed = list(range(*i.indices(mylen)))
            removed.reverse()
            # fixup indices in token dictionary
            for name,occurrences in self.__tokdict.items():
                for j in removed:
                    for k, (value, position) in enumerate(occurrences):
                        occurrences[k] = _ParseResultsWithOffset(value, position - (position > j))
        else:
            del self.__tokdict[i]

    def __contains__( self, k ):
        return k in self.__tokdict

    def __len__( self ): return len( self.__toklist )
    def __bool__(self): return ( not not self.__toklist )
    __nonzero__ = __bool__
    def __iter__( self ): return iter( self.__toklist )
    def __reversed__( self ): return iter( self.__toklist[::-1] )
    def _iterkeys( self ):
        if hasattr(self.__tokdict, "iterkeys"):
            return self.__tokdict.iterkeys()
        else:
            return iter(self.__tokdict)

    def _itervalues( self ):
        return (self[k] for k in self._iterkeys())
            
    def _iteritems( self ):
        return ((k, self[k]) for k in self._iterkeys())

    if PY_3:
        keys = _iterkeys       
        """Returns an iterator of all named result keys (Python 3.x only)."""

        values = _itervalues
        """Returns an iterator of all named result values (Python 3.x only)."""

        items = _iteritems
        """Returns an iterator of all named result key-value tuples (Python 3.x only)."""

    else:
        iterkeys = _iterkeys
        """Returns an iterator of all named result keys (Python 2.x only)."""

        itervalues = _itervalues
        """Returns an iterator of all named result values (Python 2.x only)."""

        iteritems = _iteritems
        """Returns an iterator of all named result key-value tuples (Python 2.x only)."""

        def keys( self ):
            """Returns all named result keys (as a list in Python 2.x, as an iterator in Python 3.x)."""
            return list(self.iterkeys())

        def values( self ):
            """Returns all named result values (as a list in Python 2.x, as an iterator in Python 3.x)."""
            return list(self.itervalues())
                
        def items( self ):
            """Returns all named result key-values (as a list of tuples in Python 2.x, as an iterator in Python 3.x)."""
            return list(self.iteritems())

    def haskeys( self ):
        """Since keys() returns an iterator, this method is helpful in bypassing
           code that looks for the existence of any defined results names."""
        return bool(self.__tokdict)
        
    def pop( self, *args, **kwargs):
        """
        Removes and returns item at specified index (default=C{last}).
        Supports both C{list} and C{dict} semantics for C{pop()}. If passed no
        argument or an integer argument, it will use C{list} semantics
        and pop tokens from the list of parsed tokens. If passed a 
        non-integer argument (most likely a string), it will use C{dict}
        semantics and pop the corresponding value from any defined 
        results names. A second default return value argument is 
        supported, just as in C{dict.pop()}.

        Example::
            def remove_first(tokens):
                tokens.pop(0)
            print(OneOrMore(Word(nums)).parseString("0 123 321")) # -> ['0', '123', '321']
            print(OneOrMore(Word(nums)).addParseAction(remove_first).parseString("0 123 321")) # -> ['123', '321']

            label = Word(alphas)
            patt = label("LABEL") + OneOrMore(Word(nums))
            print(patt.parseString("AAB 123 321").dump())

            # Use pop() in a parse action to remove named result (note that corresponding value is not
            # removed from list form of results)
            def remove_LABEL(tokens):
                tokens.pop("LABEL")
                return tokens
            patt.addParseAction(remove_LABEL)
            print(patt.parseString("AAB 123 321").dump())
        prints::
            ['AAB', '123', '321']
            - LABEL: AAB

            ['AAB', '123', '321']
        """
        if not args:
            args = [-1]
        for k,v in kwargs.items():
            if k == 'default':
                args = (args[0], v)
            else:
                raise TypeError("pop() got an unexpected keyword argument '%s'" % k)
        if (isinstance(args[0], int) or 
                        len(args) == 1 or 
                        args[0] in self):
            index = args[0]
            ret = self[index]
            del self[index]
            return ret
        else:
            defaultvalue = args[1]
            return defaultvalue

    def get(self, key, defaultValue=None):
        """
        Returns named result matching the given key, or if there is no
        such name, then returns the given C{defaultValue} or C{None} if no
        C{defaultValue} is specified.

        Similar to C{dict.get()}.
        
        Example::
            integer = Word(nums)
            date_str = integer("year") + '/' + integer("month") + '/' + integer("day")           

            result = date_str.parseString("1999/12/31")
            print(result.get("year")) # -> '1999'
            print(result.get("hour", "not specified")) # -> 'not specified'
            print(result.get("hour")) # -> None
        """
        if key in self:
            return self[key]
        else:
            return defaultValue

    def insert( self, index, insStr ):
        """
        Inserts new element at location index in the list of parsed tokens.
        
        Similar to C{list.insert()}.

        Example::
            print(OneOrMore(Word(nums)).parseString("0 123 321")) # -> ['0', '123', '321']

            # use a parse action to insert the parse location in the front of the parsed results
            def insert_locn(locn, tokens):
                tokens.insert(0, locn)
            print(OneOrMore(Word(nums)).addParseAction(insert_locn).parseString("0 123 321")) # -> [0, '0', '123', '321']
        """
        self.__toklist.insert(index, insStr)
        # fixup indices in token dictionary
        for name,occurrences in self.__tokdict.items():
            for k, (value, position) in enumerate(occurrences):
                occurrences[k] = _ParseResultsWithOffset(value, position + (position > index))

    def append( self, item ):
        """
        Add single element to end of ParseResults list of elements.

        Example::
            print(OneOrMore(Word(nums)).parseString("0 123 321")) # -> ['0', '123', '321']
            
            # use a parse action to compute the sum of the parsed integers, and add it to the end
            def append_sum(tokens):
                tokens.append(sum(map(int, tokens)))
            print(OneOrMore(Word(nums)).addParseAction(append_sum).parseString("0 123 321")) # -> ['0', '123', '321', 444]
        """
        self.__toklist.append(item)

    def extend( self, itemseq ):
        """
        Add sequence of elements to end of ParseResults list of elements.

        Example::
            patt = OneOrMore(Word(alphas))
            
            # use a parse action to append the reverse of the matched strings, to make a palindrome
            def make_palindrome(tokens):
                tokens.extend(reversed([t[::-1] for t in tokens]))
                return ''.join(tokens)
            print(patt.addParseAction(make_palindrome).parseString("lskdj sdlkjf lksd")) # -> 'lskdjsdlkjflksddsklfjkldsjdksl'
        """
        if isinstance(itemseq, ParseResults):
            self += itemseq
        else:
            self.__toklist.extend(itemseq)

    def clear( self ):
        """
        Clear all elements and results names.
        """
        del self.__toklist[:]
        self.__tokdict.clear()

    def __getattr__( self, name ):
        try:
            return self[name]
        except KeyError:
            return ""

    def __add__( self, other ):
        ret = self.copy()
        ret += other
        return ret

    def __iadd__( self, other ):
        if other.__tokdict:
            offset = len(self.__toklist)
            addoffset = lambda a: offset if a<0 else a+offset
            otheritems = other.__tokdict.items()
            otherdictitems = [(k, _ParseResultsWithOffset(v[0],addoffset(v[1])) )
                                for (k,vlist) in otheritems for v in vlist]
            for k,v in otherdictitems:
                self[k] = v
                if isinstance(v[0],ParseResults):
                    v[0].__parent = wkref(self)
            
        self.__toklist += other.__toklist
        self.__accumNames.update( other.__accumNames )
        return self

    def __radd__(self, other):
        if isinstance(other,int) and other == 0:
            # useful for merging many ParseResults using sum() builtin
            return self.copy()
        else:
            # this may raise a TypeError - so be it
            return other + self
        
    def __repr__( self ):
        return "(%s, %s)" % ( repr( self.__toklist ), repr( self.__tokdict ) )

    def __str__( self ):
        return '[' + ', '.join(_ustr(i) if isinstance(i, ParseResults) else repr(i) for i in self.__toklist) + ']'

    def _asStringList( self, sep='' ):
        out = []
        for item in self.__toklist:
            if out and sep:
                out.append(sep)
            if isinstance( item, ParseResults ):
                out += item._asStringList()
            else:
                out.append( _ustr(item) )
        return out

    def asList( self ):
        """
        Returns the parse results as a nested list of matching tokens.

        Example::
            patt = OneOrMore(Word(alphas))
            result = patt.parseString("sldkj lsdkj sldkj")
            # even though the result prints in string-like form, it is actually a pyparsing ParseResults
            print(type(result), result) # -> <class 'pyparsing.ParseResults'> ['sldkj', 'lsdkj', 'sldkj']
            
            # Use asList() to create an actual list
            result_list = result.asList()
            print(type(result_list), result_list) # -> <class 'list'> ['sldkj', 'lsdkj', 'sldkj']
        """
        return [res.asList() if isinstance(res,ParseResults) else res for res in self.__toklist]

    def asDict( self ):
        """
        Returns the named parse results as a nested dictionary.

        Example::
            integer = Word(nums)
            date_str = integer("year") + '/' + integer("month") + '/' + integer("day")
            
            result = date_str.parseString('12/31/1999')
            print(type(result), repr(result)) # -> <class 'pyparsing.ParseResults'> (['12', '/', '31', '/', '1999'], {'day': [('1999', 4)], 'year': [('12', 0)], 'month': [('31', 2)]})
            
            result_dict = result.asDict()
            print(type(result_dict), repr(result_dict)) # -> <class 'dict'> {'day': '1999', 'year': '12', 'month': '31'}

            # even though a ParseResults supports dict-like access, sometimes you just need to have a dict
            import json
            print(json.dumps(result)) # -> Exception: TypeError: ... is not JSON serializable
            print(json.dumps(result.asDict())) # -> {"month": "31", "day": "1999", "year": "12"}
        """
        if PY_3:
            item_fn = self.items
        else:
            item_fn = self.iteritems
            
        def toItem(obj):
            if isinstance(obj, ParseResults):
                if obj.haskeys():
                    return obj.asDict()
                else:
                    return [toItem(v) for v in obj]
            else:
                return obj
                
        return dict((k,toItem(v)) for k,v in item_fn())

    def copy( self ):
        """
        Returns a new copy of a C{ParseResults} object.
        """
        ret = ParseResults( self.__toklist )
        ret.__tokdict = self.__tokdict.copy()
        ret.__parent = self.__parent
        ret.__accumNames.update( self.__accumNames )
        ret.__name = self.__name
        return ret

    def asXML( self, doctag=None, namedItemsOnly=False, indent="", formatted=True ):
        """
        (Deprecated) Returns the parse results as XML. Tags are created for tokens and lists that have defined results names.
        """
        nl = "\n"
        out = []
        namedItems = dict((v[1],k) for (k,vlist) in self.__tokdict.items()
                                                            for v in vlist)
        nextLevelIndent = indent + "  "

        # collapse out indents if formatting is not desired
        if not formatted:
            indent = ""
            nextLevelIndent = ""
            nl = ""

        selfTag = None
        if doctag is not None:
            selfTag = doctag
        else:
            if self.__name:
                selfTag = self.__name

        if not selfTag:
            if namedItemsOnly:
                return ""
            else:
                selfTag = "ITEM"

        out += [ nl, indent, "<", selfTag, ">" ]

        for i,res in enumerate(self.__toklist):
            if isinstance(res,ParseResults):
                if i in namedItems:
                    out += [ res.asXML(namedItems[i],
                                        namedItemsOnly and doctag is None,
                                        nextLevelIndent,
                                        formatted)]
                else:
                    out += [ res.asXML(None,
                                        namedItemsOnly and doctag is None,
                                        nextLevelIndent,
                                        formatted)]
            else:
                # individual token, see if there is a name for it
                resTag = None
                if i in namedItems:
                    resTag = namedItems[i]
                if not resTag:
                    if namedItemsOnly:
                        continue
                    else:
                        resTag = "ITEM"
                xmlBodyText = _xml_escape(_ustr(res))
                out += [ nl, nextLevelIndent, "<", resTag, ">",
                                                xmlBodyText,
                                                "</", resTag, ">" ]

        out += [ nl, indent, "</", selfTag, ">" ]
        return "".join(out)

    def __lookup(self,sub):
        for k,vlist in self.__tokdict.items():
            for v,loc in vlist:
                if sub is v:
                    return k
        return None

    def getName(self):
        """
        Returns the results name for this token expression. Useful when several 
        different expressions might match at a particular location.

        Example::
            integer = Word(nums)
            ssn_expr = Regex(r"\d\d\d-\d\d-\d\d\d\d")
            house_number_expr = Suppress('#') + Word(nums, alphanums)
            user_data = (Group(house_number_expr)("house_number") 
                        | Group(ssn_expr)("ssn")
                        | Group(integer)("age"))
            user_info = OneOrMore(user_data)
            
            result = user_info.parseString("22 111-22-3333 #221B")
            for item in result:
                print(item.getName(), ':', item[0])
        prints::
            age : 22
            ssn : 111-22-3333
            house_number : 221B
        """
        if self.__name:
            return self.__name
        elif self.__parent:
            par = self.__parent()
            if par:
                return par.__lookup(self)
            else:
                return None
        elif (len(self) == 1 and
               len(self.__tokdict) == 1 and
               next(iter(self.__tokdict.values()))[0][1] in (0,-1)):
            return next(iter(self.__tokdict.keys()))
        else:
            return None

    def dump(self, indent='', depth=0, full=True):
        """
        Diagnostic method for listing out the contents of a C{ParseResults}.
        Accepts an optional C{indent} argument so that this string can be embedded
        in a nested display of other data.

        Example::
            integer = Word(nums)
            date_str = integer("year") + '/' + integer("month") + '/' + integer("day")
            
            result = date_str.parseString('12/31/1999')
            print(result.dump())
        prints::
            ['12', '/', '31', '/', '1999']
            - day: 1999
            - month: 31
            - year: 12
        """
        out = []
        NL = '\n'
        out.append( indent+_ustr(self.asList()) )
        if full:
            if self.haskeys():
                items = sorted((str(k), v) for k,v in self.items())
                for k,v in items:
                    if out:
                        out.append(NL)
                    out.append( "%s%s- %s: " % (indent,('  '*depth), k) )
                    if isinstance(v,ParseResults):
                        if v:
                            out.append( v.dump(indent,depth+1) )
                        else:
                            out.append(_ustr(v))
                    else:
                        out.append(repr(v))
            elif any(isinstance(vv,ParseResults) for vv in self):
                v = self
                for i,vv in enumerate(v):
                    if isinstance(vv,ParseResults):
                        out.append("\n%s%s[%d]:\n%s%s%s" % (indent,('  '*(depth)),i,indent,('  '*(depth+1)),vv.dump(indent,depth+1) ))
                    else:
                        out.append("\n%s%s[%d]:\n%s%s%s" % (indent,('  '*(depth)),i,indent,('  '*(depth+1)),_ustr(vv)))
            
        return "".join(out)

    def pprint(self, *args, **kwargs):
        """
        Pretty-printer for parsed results as a list, using the C{pprint} module.
        Accepts additional positional or keyword args as defined for the 
        C{pprint.pprint} method. (U{http://docs.python.org/3/library/pprint.html#pprint.pprint})

        Example::
            ident = Word(alphas, alphanums)
            num = Word(nums)
            func = Forward()
            term = ident | num | Group('(' + func + ')')
            func <<= ident + Group(Optional(delimitedList(term)))
            result = func.parseString("fna a,b,(fnb c,d,200),100")
            result.pprint(width=40)
        prints::
            ['fna',
             ['a',
              'b',
              ['(', 'fnb', ['c', 'd', '200'], ')'],
              '100']]
        """
        pprint.pprint(self.asList(), *args, **kwargs)

    # add support for pickle protocol
    def __getstate__(self):
        return ( self.__toklist,
                 ( self.__tokdict.copy(),
                   self.__parent is not None and self.__parent() or None,
                   self.__accumNames,
                   self.__name ) )

    def __setstate__(self,state):
        self.__toklist = state[0]
        (self.__tokdict,
         par,
         inAccumNames,
         self.__name) = state[1]
        self.__accumNames = {}
        self.__accumNames.update(inAccumNames)
        if par is not None:
            self.__parent = wkref(par)
        else:
            self.__parent = None

    def __getnewargs__(self):
        return self.__toklist, self.__name, self.__asList, self.__modal

    def __dir__(self):
        return (dir(type(self)) + list(self.keys()))

collections.MutableMapping.register(ParseResults)

def col (loc,strg):
    """Returns current column within a string, counting newlines as line separators.
   The first column is number 1.

   Note: the default parsing behavior is to expand tabs in the input string
   before starting the parsing process.  See L{I{ParserElement.parseString}<ParserElement.parseString>} for more information
   on parsing strings containing C{<TAB>}s, and suggested methods to maintain a
   consistent view of the parsed string, the parse location, and line and column
   positions within the parsed string.
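
   A minimal illustrative sketch (two-line input built with chr(10))::
       s = "abc" + chr(10) + "def"
       col(0, s)   # -> 1
       col(5, s)   # -> 2   ('e' is the 2nd column of the 2nd line)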
   """
    s = strg
    return 1 if 0<loc<len(s) and s[loc-1] == '\n' else loc - s.rfind("\n", 0, loc)

def lineno(loc,strg):
    """Returns current line number within a string, counting newlines as line separators.
   The first line is number 1.

   Note: the default parsing behavior is to expand tabs in the input string
   before starting the parsing process.  See L{I{ParserElement.parseString}<ParserElement.parseString>} for more information
   on parsing strings containing C{<TAB>}s, and suggested methods to maintain a
   consistent view of the parsed string, the parse location, and line and column
   positions within the parsed string.
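
   A minimal illustrative sketch (two-line input built with chr(10))::
       s = "abc" + chr(10) + "def"
       lineno(0, s)   # -> 1
       lineno(5, s)   # -> 2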
   """
    return strg.count("\n",0,loc) + 1

def line( loc, strg ):
    """Returns the line of text containing loc within a string, counting newlines as line separators.
       """
    lastCR = strg.rfind("\n", 0, loc)
    nextCR = strg.find("\n", loc)
    if nextCR >= 0:
        return strg[lastCR+1:nextCR]
    else:
        return strg[lastCR+1:]

def _defaultStartDebugAction( instring, loc, expr ):
    print (("Match " + _ustr(expr) + " at loc " + _ustr(loc) + "(%d,%d)" % ( lineno(loc,instring), col(loc,instring) )))

def _defaultSuccessDebugAction( instring, startloc, endloc, expr, toks ):
    print ("Matched " + _ustr(expr) + " -> " + str(toks.asList()))

def _defaultExceptionDebugAction( instring, loc, expr, exc ):
    print ("Exception raised:" + _ustr(exc))

def nullDebugAction(*args):
    """'Do-nothing' debug action, to suppress debugging output during parsing."""
    pass

# Only works on Python 3.x - nonlocal is toxic to Python 2 installs
#~ 'decorator to trim function calls to match the arity of the target'
#~ def _trim_arity(func, maxargs=3):
    #~ if func in singleArgBuiltins:
        #~ return lambda s,l,t: func(t)
    #~ limit = 0
    #~ foundArity = False
    #~ def wrapper(*args):
        #~ nonlocal limit,foundArity
        #~ while 1:
            #~ try:
                #~ ret = func(*args[limit:])
                #~ foundArity = True
                #~ return ret
            #~ except TypeError:
                #~ if limit == maxargs or foundArity:
                    #~ raise
                #~ limit += 1
                #~ continue
    #~ return wrapper

# this version is Python 2.x-3.x cross-compatible
'decorator to trim function calls to match the arity of the target'
def _trim_arity(func, maxargs=2):
    if func in singleArgBuiltins:
        return lambda s,l,t: func(t)
    limit = [0]
    foundArity = [False]
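    # strategy: call 'func' with the full (s,loc,toks) argument list; if that raises
    # a TypeError from the call site itself, drop the leading argument and retry, so
    # user parse actions may accept anywhere from 0 to 3 arguments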
    
    # traceback return data structure changed in Py3.5 - normalize back to plain tuples
    if system_version[:2] >= (3,5):
        def extract_stack(limit=0):
            # special handling for Python 3.5.0 - extra deep call stack by 1
            offset = -3 if system_version == (3,5,0) else -2
            frame_summary = traceback.extract_stack(limit=-offset+limit-1)[offset]
            return [(frame_summary.filename, frame_summary.lineno)]
        def extract_tb(tb, limit=0):
            frames = traceback.extract_tb(tb, limit=limit)
            frame_summary = frames[-1]
            return [(frame_summary.filename, frame_summary.lineno)]
    else:
        extract_stack = traceback.extract_stack
        extract_tb = traceback.extract_tb
    
    # synthesize what would be returned by traceback.extract_stack at the call to 
    # user's parse action 'func', so that we don't incur call penalty at parse time
    
    LINE_DIFF = 6
    # IF ANY CODE CHANGES, EVEN JUST COMMENTS OR BLANK LINES, BETWEEN THE NEXT LINE AND 
    # THE CALL TO FUNC INSIDE WRAPPER, LINE_DIFF MUST BE MODIFIED!!!!
    this_line = extract_stack(limit=2)[-1]
    pa_call_line_synth = (this_line[0], this_line[1]+LINE_DIFF)

    def wrapper(*args):
        while 1:
            try:
                ret = func(*args[limit[0]:])
                foundArity[0] = True
                return ret
            except TypeError:
                # re-raise TypeErrors if they did not come from our arity testing
                if foundArity[0]:
                    raise
                else:
                    try:
                        tb = sys.exc_info()[-1]
                        if not extract_tb(tb, limit=2)[-1][:2] == pa_call_line_synth:
                            raise
                    finally:
                        del tb

                if limit[0] <= maxargs:
                    limit[0] += 1
                    continue
                raise

    # copy func name to wrapper for sensible debug output
    func_name = "<parse action>"
    try:
        func_name = getattr(func, '__name__', 
                            getattr(func, '__class__').__name__)
    except Exception:
        func_name = str(func)
    wrapper.__name__ = func_name

    return wrapper

class ParserElement(object):
    """Abstract base level parser element class."""
    DEFAULT_WHITE_CHARS = " \n\t\r"
    verbose_stacktrace = False

    @staticmethod
    def setDefaultWhitespaceChars( chars ):
        r"""
        Overrides the default whitespace chars

        Example::
            # default whitespace chars are space, <TAB> and newline
            OneOrMore(Word(alphas)).parseString("abc def\nghi jkl")  # -> ['abc', 'def', 'ghi', 'jkl']
            
            # change to just treat newline as significant
            ParserElement.setDefaultWhitespaceChars(" \t")
            OneOrMore(Word(alphas)).parseString("abc def\nghi jkl")  # -> ['abc', 'def']
        """
        ParserElement.DEFAULT_WHITE_CHARS = chars

    @staticmethod
    def inlineLiteralsUsing(cls):
        """
        Set class to be used for inclusion of string literals into a parser.
        
        Example::
            # default literal class used is Literal
            integer = Word(nums)
            date_str = integer("year") + '/' + integer("month") + '/' + integer("day")           

            date_str.parseString("1999/12/31")  # -> ['1999', '/', '12', '/', '31']


            # change to Suppress
            ParserElement.inlineLiteralsUsing(Suppress)
            date_str = integer("year") + '/' + integer("month") + '/' + integer("day")           

            date_str.parseString("1999/12/31")  # -> ['1999', '12', '31']
        """
        ParserElement._literalStringClass = cls

    def __init__( self, savelist=False ):
        self.parseAction = list()
        self.failAction = None
        #~ self.name = "<unknown>"  # don't define self.name, let subclasses try/except upcall
        self.strRepr = None
        self.resultsName = None
        self.saveAsList = savelist
        self.skipWhitespace = True
        self.whiteChars = ParserElement.DEFAULT_WHITE_CHARS
        self.copyDefaultWhiteChars = True
        self.mayReturnEmpty = False # used when checking for left-recursion
        self.keepTabs = False
        self.ignoreExprs = list()
        self.debug = False
        self.streamlined = False
        self.mayIndexError = True # used to optimize exception handling for subclasses that don't advance parse index
        self.errmsg = ""
        self.modalResults = True # used to mark results names as modal (report only last) or cumulative (list all)
        self.debugActions = ( None, None, None ) #custom debug actions
        self.re = None
        self.callPreparse = True # used to avoid redundant calls to preParse
        self.callDuringTry = False

    def copy( self ):
        """
        Make a copy of this C{ParserElement}.  Useful for defining different parse actions
        for the same parsing pattern, using copies of the original parse element.
        
        Example::
            integer = Word(nums).setParseAction(lambda toks: int(toks[0]))
            integerK = integer.copy().addParseAction(lambda toks: toks[0]*1024) + Suppress("K")
            integerM = integer.copy().addParseAction(lambda toks: toks[0]*1024*1024) + Suppress("M")
            
            print(OneOrMore(integerK | integerM | integer).parseString("5K 100 640K 256M"))
        prints::
            [5120, 100, 655360, 268435456]
        Equivalent form of C{expr.copy()} is just C{expr()}::
            integerM = integer().addParseAction(lambda toks: toks[0]*1024*1024) + Suppress("M")
        """
        cpy = copy.copy( self )
        cpy.parseAction = self.parseAction[:]
        cpy.ignoreExprs = self.ignoreExprs[:]
        if self.copyDefaultWhiteChars:
            cpy.whiteChars = ParserElement.DEFAULT_WHITE_CHARS
        return cpy

    def setName( self, name ):
        """
        Define a name for this expression, to make debugging and exception messages clearer.
        
        Example::
            Word(nums).parseString("ABC")  # -> Exception: Expected W:(0123...) (at char 0), (line:1, col:1)
            Word(nums).setName("integer").parseString("ABC")  # -> Exception: Expected integer (at char 0), (line:1, col:1)
        """
        self.name = name
        self.errmsg = "Expected " + self.name
        if hasattr(self,"exception"):
            self.exception.msg = self.errmsg
        return self

    def setResultsName( self, name, listAllMatches=False ):
        """
        Define name for referencing matching tokens as a nested attribute
        of the returned parse results.
        NOTE: this returns a *copy* of the original C{ParserElement} object;
        this is so that the client can define a basic element, such as an
        integer, and reference it in multiple places with different names.

        You can also set results names using the abbreviated syntax,
        C{expr("name")} in place of C{expr.setResultsName("name")} - 
        see L{I{__call__}<__call__>}.

        Example::
            date_str = (integer.setResultsName("year") + '/' 
                        + integer.setResultsName("month") + '/' 
                        + integer.setResultsName("day"))

            # equivalent form:
            date_str = integer("year") + '/' + integer("month") + '/' + integer("day")
        """
        newself = self.copy()
        if name.endswith("*"):
            name = name[:-1]
            listAllMatches=True
        newself.resultsName = name
        newself.modalResults = not listAllMatches
        return newself

    def setBreak(self,breakFlag = True):
        """Method to invoke the Python pdb debugger when this element is
           about to be parsed. Set C{breakFlag} to True to enable, False to
           disable.
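
           A minimal illustrative sketch (running it drops into the pdb prompt
           just before the element is matched, so it is shown for reference only)::
               integer = Word(nums).setBreak()
               integer.parseString("123")   # pdb.set_trace() fires here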
        """
        if breakFlag:
            _parseMethod = self._parse
            def breaker(instring, loc, doActions=True, callPreParse=True):
                import pdb
                pdb.set_trace()
                return _parseMethod( instring, loc, doActions, callPreParse )
            breaker._originalParseMethod = _parseMethod
            self._parse = breaker
        else:
            if hasattr(self._parse,"_originalParseMethod"):
                self._parse = self._parse._originalParseMethod
        return self

    def setParseAction( self, *fns, **kwargs ):
        """
        Define action to perform when successfully matching parse element definition.
        Parse action fn is a callable method with 0-3 arguments, called as C{fn(s,loc,toks)},
        C{fn(loc,toks)}, C{fn(toks)}, or just C{fn()}, where:
         - s   = the original string being parsed (see note below)
         - loc = the location of the matching substring
         - toks = a list of the matched tokens, packaged as a C{L{ParseResults}} object
        If the functions in fns modify the tokens, they can return them as the return
        value from fn, and the modified list of tokens will replace the original.
        Otherwise, fn does not need to return any value.

        Optional keyword arguments:
         - callDuringTry = (default=C{False}) indicate if parse action should be run during lookaheads and alternate testing

        Note: the default parsing behavior is to expand tabs in the input string
        before starting the parsing process.  See L{I{parseString}<parseString>} for more information
        on parsing strings containing C{<TAB>}s, and suggested methods to maintain a
        consistent view of the parsed string, the parse location, and line and column
        positions within the parsed string.
        
        Example::
            integer = Word(nums)
            date_str = integer + '/' + integer + '/' + integer

            date_str.parseString("1999/12/31")  # -> ['1999', '/', '12', '/', '31']

            # use parse action to convert to ints at parse time
            integer = Word(nums).setParseAction(lambda toks: int(toks[0]))
            date_str = integer + '/' + integer + '/' + integer

            # note that integer fields are now ints, not strings
            date_str.parseString("1999/12/31")  # -> [1999, '/', 12, '/', 31]
        """
        self.parseAction = list(map(_trim_arity, list(fns)))
        self.callDuringTry = kwargs.get("callDuringTry", False)
        return self

    def addParseAction( self, *fns, **kwargs ):
        """
        Add parse action to expression's list of parse actions. See L{I{setParseAction}<setParseAction>}.
        
        See examples in L{I{copy}<copy>}.
        """
        self.parseAction += list(map(_trim_arity, list(fns)))
        self.callDuringTry = self.callDuringTry or kwargs.get("callDuringTry", False)
        return self

    def addCondition(self, *fns, **kwargs):
        """Add a boolean predicate function to expression's list of parse actions. See 
        L{I{setParseAction}<setParseAction>} for function call signatures. Unlike C{setParseAction}, 
        functions passed to C{addCondition} need to return boolean success/fail of the condition.

        Optional keyword arguments:
         - message = define a custom message to be used in the raised exception
         - fatal   = if True, will raise ParseFatalException to stop parsing immediately; otherwise will raise ParseException
         
        Example::
            integer = Word(nums).setParseAction(lambda toks: int(toks[0]))
            year_int = integer.copy()
            year_int.addCondition(lambda toks: toks[0] >= 2000, message="Only support years 2000 and later")
            date_str = year_int + '/' + integer + '/' + integer

            result = date_str.parseString("1999/12/31")  # -> Exception: Only support years 2000 and later (at char 0), (line:1, col:1)
        """
        msg = kwargs.get("message", "failed user-defined condition")
        exc_type = ParseFatalException if kwargs.get("fatal", False) else ParseException
        for fn in fns:
            def pa(s,l,t):
                if not bool(_trim_arity(fn)(s,l,t)):
                    raise exc_type(s,l,msg)
            self.parseAction.append(pa)
        self.callDuringTry = self.callDuringTry or kwargs.get("callDuringTry", False)
        return self

    def setFailAction( self, fn ):
        """Define action to perform if parsing fails at this expression.
           Fail action fn is a callable function that takes the arguments
           C{fn(s,loc,expr,err)} where:
            - s = string being parsed
            - loc = location where expression match was attempted and failed
            - expr = the parse expression that failed
            - err = the exception thrown
           The function returns no value.  It may throw C{L{ParseFatalException}}
           if it is desired to stop parsing immediately."""
        self.failAction = fn
        return self

    def _skipIgnorables( self, instring, loc ):
        exprsFound = True
        while exprsFound:
            exprsFound = False
            for e in self.ignoreExprs:
                try:
                    while 1:
                        loc,dummy = e._parse( instring, loc )
                        exprsFound = True
                except ParseException:
                    pass
        return loc

    def preParse( self, instring, loc ):
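        # advance loc past any ignorable expressions, then past leading whitespace
        # characters (those in self.whiteChars), and return the new location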
        if self.ignoreExprs:
            loc = self._skipIgnorables( instring, loc )

        if self.skipWhitespace:
            wt = self.whiteChars
            instrlen = len(instring)
            while loc < instrlen and instring[loc] in wt:
                loc += 1

        return loc

    def parseImpl( self, instring, loc, doActions=True ):
        return loc, []

    def postParse( self, instring, loc, tokenlist ):
        return tokenlist

    #~ @profile
    def _parseNoCache( self, instring, loc, doActions=True, callPreParse=True ):
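        # overall flow: optional preParse (ignorables/whitespace), parseImpl for the
        # actual match, postParse, wrap the tokens in a ParseResults, then run any
        # parse actions and debug actions; an IndexError from parseImpl is reported
        # as a ParseException at end-of-string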
        debugging = ( self.debug ) #and doActions )

        if debugging or self.failAction:
            #~ print ("Match",self,"at loc",loc,"(%d,%d)" % ( lineno(loc,instring), col(loc,instring) ))
            if (self.debugActions[0] ):
                self.debugActions[0]( instring, loc, self )
            if callPreParse and self.callPreparse:
                preloc = self.preParse( instring, loc )
            else:
                preloc = loc
            tokensStart = preloc
            try:
                try:
                    loc,tokens = self.parseImpl( instring, preloc, doActions )
                except IndexError:
                    raise ParseException( instring, len(instring), self.errmsg, self )
            except ParseBaseException as err:
                #~ print ("Exception raised:", err)
                if self.debugActions[2]:
                    self.debugActions[2]( instring, tokensStart, self, err )
                if self.failAction:
                    self.failAction( instring, tokensStart, self, err )
                raise
        else:
            if callPreParse and self.callPreparse:
                preloc = self.preParse( instring, loc )
            else:
                preloc = loc
            tokensStart = preloc
            if self.mayIndexError or loc >= len(instring):
                try:
                    loc,tokens = self.parseImpl( instring, preloc, doActions )
                except IndexError:
                    raise ParseException( instring, len(instring), self.errmsg, self )
            else:
                loc,tokens = self.parseImpl( instring, preloc, doActions )

        tokens = self.postParse( instring, loc, tokens )

        retTokens = ParseResults( tokens, self.resultsName, asList=self.saveAsList, modal=self.modalResults )
        if self.parseAction and (doActions or self.callDuringTry):
            if debugging:
                try:
                    for fn in self.parseAction:
                        tokens = fn( instring, tokensStart, retTokens )
                        if tokens is not None:
                            retTokens = ParseResults( tokens,
                                                      self.resultsName,
                                                      asList=self.saveAsList and isinstance(tokens,(ParseResults,list)),
                                                      modal=self.modalResults )
                except ParseBaseException as err:
                    #~ print "Exception raised in user parse action:", err
                    if (self.debugActions[2] ):
                        self.debugActions[2]( instring, tokensStart, self, err )
                    raise
            else:
                for fn in self.parseAction:
                    tokens = fn( instring, tokensStart, retTokens )
                    if tokens is not None:
                        retTokens = ParseResults( tokens,
                                                  self.resultsName,
                                                  asList=self.saveAsList and isinstance(tokens,(ParseResults,list)),
                                                  modal=self.modalResults )

        if debugging:
            #~ print ("Matched",self,"->",retTokens.asList())
            if (self.debugActions[1] ):
                self.debugActions[1]( instring, tokensStart, loc, self, retTokens )

        return loc, retTokens

    def tryParse( self, instring, loc ):
        try:
            return self._parse( instring, loc, doActions=False )[0]
        except ParseFatalException:
            raise ParseException( instring, loc, self.errmsg, self)
    
    def canParseNext(self, instring, loc):
        try:
            self.tryParse(instring, loc)
        except (ParseException, IndexError):
            return False
        else:
            return True
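
    # note: the cache classes below keep their storage in closure variables and bind
    # get/set/clear as instance methods via types.MethodType, so each cache instance
    # carries its own private state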

    class _UnboundedCache(object):
        def __init__(self):
            cache = {}
            self.not_in_cache = not_in_cache = object()

            def get(self, key):
                return cache.get(key, not_in_cache)

            def set(self, key, value):
                cache[key] = value

            def clear(self):
                cache.clear()

            self.get = types.MethodType(get, self)
            self.set = types.MethodType(set, self)
            self.clear = types.MethodType(clear, self)

    if _OrderedDict is not None:
        class _FifoCache(object):
            def __init__(self, size):
                self.not_in_cache = not_in_cache = object()

                cache = _OrderedDict()

                def get(self, key):
                    return cache.get(key, not_in_cache)

                def set(self, key, value):
                    cache[key] = value
                    if len(cache) > size:
                        cache.popitem(False)

                def clear(self):
                    cache.clear()

                self.get = types.MethodType(get, self)
                self.set = types.MethodType(set, self)
                self.clear = types.MethodType(clear, self)

    else:
        class _FifoCache(object):
            def __init__(self, size):
                self.not_in_cache = not_in_cache = object()

                cache = {}
                key_fifo = collections.deque([], size)

                def get(self, key):
                    return cache.get(key, not_in_cache)

                def set(self, key, value):
                    cache[key] = value
                    if len(cache) > size:
                        cache.pop(key_fifo.popleft(), None)
                    key_fifo.append(key)

                def clear(self):
                    cache.clear()
                    key_fifo.clear()

                self.get = types.MethodType(get, self)
                self.set = types.MethodType(set, self)
                self.clear = types.MethodType(clear, self)

    # argument cache for optimizing repeated calls when backtracking through recursive expressions
    packrat_cache = {} # this is set later by enablePackrat(); this is here so that resetCache() doesn't fail
    packrat_cache_lock = RLock()
    packrat_cache_stats = [0, 0]

    # this method gets repeatedly called during backtracking with the same arguments -
    # we can cache these arguments and save ourselves the trouble of re-parsing the contained expression
    def _parseCache( self, instring, loc, doActions=True, callPreParse=True ):
        HIT, MISS = 0, 1
        lookup = (self, instring, loc, callPreParse, doActions)
        with ParserElement.packrat_cache_lock:
            cache = ParserElement.packrat_cache
            value = cache.get(lookup)
            if value is cache.not_in_cache:
                ParserElement.packrat_cache_stats[MISS] += 1
                try:
                    value = self._parseNoCache(instring, loc, doActions, callPreParse)
                except ParseBaseException as pe:
                    # cache a copy of the exception, without the traceback
                    cache.set(lookup, pe.__class__(*pe.args))
                    raise
                else:
                    cache.set(lookup, (value[0], value[1].copy()))
                    return value
            else:
                ParserElement.packrat_cache_stats[HIT] += 1
                if isinstance(value, Exception):
                    raise value
                return (value[0], value[1].copy())

    _parse = _parseNoCache

    @staticmethod
    def resetCache():
        ParserElement.packrat_cache.clear()
        ParserElement.packrat_cache_stats[:] = [0] * len(ParserElement.packrat_cache_stats)

    _packratEnabled = False
    @staticmethod
    def enablePackrat(cache_size_limit=128):
        """Enables "packrat" parsing, which adds memoizing to the parsing logic.
           Repeated parse attempts at the same string location (which happens
           often in many complex grammars) can immediately return a cached value,
           instead of re-executing parsing/validating code.  Memoizing is done for
           both valid results and parsing exceptions.
           
           Parameters:
            - cache_size_limit - (default=C{128}) - if an integer value is provided
              will limit the size of the packrat cache; if None is passed, then
              the cache size will be unbounded; if 0 is passed, the cache will
              be effectively disabled.
            
           This speedup may break existing programs that use parse actions that
           have side-effects.  For this reason, packrat parsing is disabled when
           you first import pyparsing.  To activate the packrat feature, your
           program must call the class method C{ParserElement.enablePackrat()}.  If
           your program uses C{psyco} to "compile as you go", you must call
           C{enablePackrat} before calling C{psyco.full()}.  If you do not do this,
           Python will crash.  For best results, call C{enablePackrat()} immediately
           after importing pyparsing.
           
           Example::
               import pyparsing
               pyparsing.ParserElement.enablePackrat()
        """
        if not ParserElement._packratEnabled:
            ParserElement._packratEnabled = True
            if cache_size_limit is None:
                ParserElement.packrat_cache = ParserElement._UnboundedCache()
            else:
                ParserElement.packrat_cache = ParserElement._FifoCache(cache_size_limit)
            ParserElement._parse = ParserElement._parseCache

    def parseString( self, instring, parseAll=False ):
        """
        Execute the parse expression with the given string.
        This is the main interface to the client code, once the complete
        expression has been built.

        If you want the grammar to require that the entire input string be
        successfully parsed, then set C{parseAll} to True (equivalent to ending
        the grammar with C{L{StringEnd()}}).

        Note: C{parseString} implicitly calls C{expandtabs()} on the input string,
        in order to report proper column numbers in parse actions.
        If the input string contains tabs and
        the grammar uses parse actions that use the C{loc} argument to index into the
        string being parsed, you can ensure you have a consistent view of the input
        string by:
         - calling C{parseWithTabs} on your grammar before calling C{parseString}
           (see L{I{parseWithTabs}<parseWithTabs>})
         - define your parse action using the full C{(s,loc,toks)} signature, and
           reference the input string using the parse action's C{s} argument
         - explicitly expand the tabs in your input string before calling
           C{parseString}
        
        Example::
            Word('a').parseString('aaaaabaaa')  # -> ['aaaaa']
            Word('a').parseString('aaaaabaaa', parseAll=True)  # -> Exception: Expected end of text
        """
        ParserElement.resetCache()
        if not self.streamlined:
            self.streamline()
            #~ self.saveAsList = True
        for e in self.ignoreExprs:
            e.streamline()
        if not self.keepTabs:
            instring = instring.expandtabs()
        try:
            loc, tokens = self._parse( instring, 0 )
            if parseAll:
                loc = self.preParse( instring, loc )
                se = Empty() + StringEnd()
                se._parse( instring, loc )
        except ParseBaseException as exc:
            if ParserElement.verbose_stacktrace:
                raise
            else:
                # catch and re-raise exception from here, clears out pyparsing internal stack trace
                raise exc
        else:
            return tokens

    def scanString( self, instring, maxMatches=_MAX_INT, overlap=False ):
        """
        Scan the input string for expression matches.  Each match will return the
        matching tokens, start location, and end location.  May be called with optional
        C{maxMatches} argument, to clip scanning after 'n' matches are found.  If
        C{overlap} is specified, then overlapping matches will be reported.

        Note that the start and end locations are reported relative to the string
        being parsed.  See L{I{parseString}<parseString>} for more information on parsing
        strings with embedded tabs.

        Example::
            source = "sldjf123lsdjjkf345sldkjf879lkjsfd987"
            print(source)
            for tokens,start,end in Word(alphas).scanString(source):
                print(' '*start + '^'*(end-start))
                print(' '*start + tokens[0])
        
        prints::
        
            sldjf123lsdjjkf345sldkjf879lkjsfd987
            ^^^^^
            sldjf
                    ^^^^^^^
                    lsdjjkf
                              ^^^^^^
                              sldkjf
                                       ^^^^^^
                                       lkjsfd
        """
        if not self.streamlined:
            self.streamline()
        for e in self.ignoreExprs:
            e.streamline()

        if not self.keepTabs:
            instring = _ustr(instring).expandtabs()
        instrlen = len(instring)
        loc = 0
        preparseFn = self.preParse
        parseFn = self._parse
        ParserElement.resetCache()
        matches = 0
        try:
            while loc <= instrlen and matches < maxMatches:
                try:
                    preloc = preparseFn( instring, loc )
                    nextLoc,tokens = parseFn( instring, preloc, callPreParse=False )
                except ParseException:
                    loc = preloc+1
                else:
                    if nextLoc > loc:
                        matches += 1
                        yield tokens, preloc, nextLoc
                        if overlap:
                            nextloc = preparseFn( instring, loc )
                            if nextloc > loc:
                                loc = nextLoc
                            else:
                                loc += 1
                        else:
                            loc = nextLoc
                    else:
                        loc = preloc+1
        except ParseBaseException as exc:
            if ParserElement.verbose_stacktrace:
                raise
            else:
                # catch and re-raise exception from here, clears out pyparsing internal stack trace
                raise exc

    def transformString( self, instring ):
        """
        Extension to C{L{scanString}}, to modify matching text with modified tokens that may
        be returned from a parse action.  To use C{transformString}, define a grammar and
        attach a parse action to it that modifies the returned token list.
        Invoking C{transformString()} on a target string will then scan for matches,
        and replace the matched text patterns according to the logic in the parse
        action.  C{transformString()} returns the resulting transformed string.
        
        Example::
            wd = Word(alphas)
            wd.setParseAction(lambda toks: toks[0].title())
            
            print(wd.transformString("now is the winter of our discontent made glorious summer by this sun of york."))
        Prints::
            Now Is The Winter Of Our Discontent Made Glorious Summer By This Sun Of York.
        """
        out = []
        lastE = 0
        # force preservation of <TAB>s, to minimize unwanted transformation of string, and to
        # keep string locs straight between transformString and scanString
        self.keepTabs = True
        try:
            for t,s,e in self.scanString( instring ):
                out.append( instring[lastE:s] )
                if t:
                    if isinstance(t,ParseResults):
                        out += t.asList()
                    elif isinstance(t,list):
                        out += t
                    else:
                        out.append(t)
                lastE = e
            out.append(instring[lastE:])
            out = [o for o in out if o]
            return "".join(map(_ustr,_flatten(out)))
        except ParseBaseException as exc:
            if ParserElement.verbose_stacktrace:
                raise
            else:
                # catch and re-raise exception from here, clears out pyparsing internal stack trace
                raise exc

    def searchString( self, instring, maxMatches=_MAX_INT ):
        """
        Another extension to C{L{scanString}}, simplifying the access to the tokens found
        to match the given parse expression.  May be called with optional
        C{maxMatches} argument, to clip searching after 'n' matches are found.
        
        Example::
            # a capitalized word starts with an uppercase letter, followed by zero or more lowercase letters
            cap_word = Word(alphas.upper(), alphas.lower())
            
            print(cap_word.searchString("More than Iron, more than Lead, more than Gold I need Electricity"))
        prints::
            ['More', 'Iron', 'Lead', 'Gold', 'I']
        """
        try:
            return ParseResults([ t for t,s,e in self.scanString( instring, maxMatches ) ])
        except ParseBaseException as exc:
            if ParserElement.verbose_stacktrace:
                raise
            else:
                # catch and re-raise exception from here, clears out pyparsing internal stack trace
                raise exc

    def split(self, instring, maxsplit=_MAX_INT, includeSeparators=False):
        """
        Generator method to split a string using the given expression as a separator.
        May be called with optional C{maxsplit} argument, to limit the number of splits;
        and the optional C{includeSeparators} argument (default=C{False}), indicating whether
        the matching separator text should be included in the split results.
        
        Example::        
            punc = oneOf(list(".,;:/-!?"))
            print(list(punc.split("This, this?, this sentence, is badly punctuated!")))
        prints::
            ['This', ' this', '', ' this sentence', ' is badly punctuated', '']
        """
        splits = 0
        last = 0
        for t,s,e in self.scanString(instring, maxMatches=maxsplit):
            yield instring[last:s]
            if includeSeparators:
                yield t[0]
            last = e
        yield instring[last:]

    def __add__(self, other ):
        """
        Implementation of + operator - returns C{L{And}}. Adding strings to a ParserElement
        converts them to L{Literal}s by default.
        
        Example::
            greet = Word(alphas) + "," + Word(alphas) + "!"
            hello = "Hello, World!"
            print (hello, "->", greet.parseString(hello))
        Prints::
            Hello, World! -> ['Hello', ',', 'World', '!']
        """
        if isinstance( other, basestring ):
            other = ParserElement._literalStringClass( other )
        if not isinstance( other, ParserElement ):
            warnings.warn("Cannot combine element of type %s with ParserElement" % type(other),
                    SyntaxWarning, stacklevel=2)
            return None
        return And( [ self, other ] )

    def __radd__(self, other ):
        """
        Implementation of + operator when left operand is not a C{L{ParserElement}}
        """
        if isinstance( other, basestring ):
            other = ParserElement._literalStringClass( other )
        if not isinstance( other, ParserElement ):
            warnings.warn("Cannot combine element of type %s with ParserElement" % type(other),
                    SyntaxWarning, stacklevel=2)
            return None
        return other + self

    def __sub__(self, other):
        """
        Implementation of - operator, returns C{L{And}} with error stop
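
        Example (an illustrative sketch, assuming this module's C{Word}, C{alphas},
        and C{nums}; once the '-' point is passed, a failure raises a fatal
        C{ParseSyntaxException} instead of quietly backtracking)::
            host_port = Word(alphas) + ':' - Word(nums)
            host_port.parseString("localhost:8080")  # -> ['localhost', ':', '8080']
            host_port.parseString("localhost:")      # -> raises ParseSyntaxException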
        """
        if isinstance( other, basestring ):
            other = ParserElement._literalStringClass( other )
        if not isinstance( other, ParserElement ):
            warnings.warn("Cannot combine element of type %s with ParserElement" % type(other),
                    SyntaxWarning, stacklevel=2)
            return None
        return And( [ self, And._ErrorStop(), other ] )

    def __rsub__(self, other ):
        """
        Implementation of - operator when left operand is not a C{L{ParserElement}}
        """
        if isinstance( other, basestring ):
            other = ParserElement._literalStringClass( other )
        if not isinstance( other, ParserElement ):
            warnings.warn("Cannot combine element of type %s with ParserElement" % type(other),
                    SyntaxWarning, stacklevel=2)
            return None
        return other - self

    def __mul__(self,other):
        """
        Implementation of * operator, allows use of C{expr * 3} in place of
        C{expr + expr + expr}.  Expressions may also be multiplied by a 2-integer
        tuple, similar to C{{min,max}} multipliers in regular expressions.  Tuples
        may also include C{None} as in:
         - C{expr*(n,None)} or C{expr*(n,)} is equivalent
              to C{expr*n + L{ZeroOrMore}(expr)}
              (read as "at least n instances of C{expr}")
         - C{expr*(None,n)} is equivalent to C{expr*(0,n)}
              (read as "0 to n instances of C{expr}")
         - C{expr*(None,None)} is equivalent to C{L{ZeroOrMore}(expr)}
         - C{expr*(1,None)} is equivalent to C{L{OneOrMore}(expr)}

        Note that C{expr*(None,n)} does not raise an exception if
        more than n exprs exist in the input stream; that is,
        C{expr*(None,n)} does not enforce a maximum number of expr
        occurrences.  If this behavior is desired, then write
        C{expr*(None,n) + ~expr}
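
        Example (illustrative sketch)::
            item = Word(alphas, exact=1)
            (item*3).parseString("x y z")           # -> ['x', 'y', 'z']
            (item*(2,None)).parseString("a b c d")  # -> ['a', 'b', 'c', 'd'] (two or more)
            (item*(None,2)).parseString("a b c d")  # -> ['a', 'b'] (consumes at most two)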
        """
        if isinstance(other,int):
            minElements, optElements = other,0
        elif isinstance(other,tuple):
            other = (other + (None, None))[:2]
            if other[0] is None:
                other = (0, other[1])
            if isinstance(other[0],int) and other[1] is None:
                if other[0] == 0:
                    return ZeroOrMore(self)
                if other[0] == 1:
                    return OneOrMore(self)
                else:
                    return self*other[0] + ZeroOrMore(self)
            elif isinstance(other[0],int) and isinstance(other[1],int):
                minElements, optElements = other
                optElements -= minElements
            else:
                raise TypeError("cannot multiply 'ParserElement' and ('%s','%s') objects", type(other[0]),type(other[1]))
        else:
            raise TypeError("cannot multiply 'ParserElement' and '%s' objects", type(other))

        if minElements < 0:
            raise ValueError("cannot multiply ParserElement by negative value")
        if optElements < 0:
            raise ValueError("second tuple value must be greater or equal to first tuple value")
        if minElements == optElements == 0:
            raise ValueError("cannot multiply ParserElement by 0 or (0,0)")

        if (optElements):
            def makeOptionalList(n):
                if n>1:
                    return Optional(self + makeOptionalList(n-1))
                else:
                    return Optional(self)
            if minElements:
                if minElements == 1:
                    ret = self + makeOptionalList(optElements)
                else:
                    ret = And([self]*minElements) + makeOptionalList(optElements)
            else:
                ret = makeOptionalList(optElements)
        else:
            if minElements == 1:
                ret = self
            else:
                ret = And([self]*minElements)
        return ret

    def __rmul__(self, other):
        return self.__mul__(other)

    def __or__(self, other ):
        """
        Implementation of | operator - returns C{L{MatchFirst}}
        """
        if isinstance( other, basestring ):
            other = ParserElement._literalStringClass( other )
        if not isinstance( other, ParserElement ):
            warnings.warn("Cannot combine element of type %s with ParserElement" % type(other),
                    SyntaxWarning, stacklevel=2)
            return None
        return MatchFirst( [ self, other ] )

    def __ror__(self, other ):
        """
        Implementation of | operator when left operand is not a C{L{ParserElement}}
        """
        if isinstance( other, basestring ):
            other = ParserElement._literalStringClass( other )
        if not isinstance( other, ParserElement ):
            warnings.warn("Cannot combine element of type %s with ParserElement" % type(other),
                    SyntaxWarning, stacklevel=2)
            return None
        return other | self

    def __xor__(self, other ):
        """
        Implementation of ^ operator - returns C{L{Or}}
        """
        if isinstance( other, basestring ):
            other = ParserElement._literalStringClass( other )
        if not isinstance( other, ParserElement ):
            warnings.warn("Cannot combine element of type %s with ParserElement" % type(other),
                    SyntaxWarning, stacklevel=2)
            return None
        return Or( [ self, other ] )

    def __rxor__(self, other ):
        """
        Implementation of ^ operator when left operand is not a C{L{ParserElement}}
        """
        if isinstance( other, basestring ):
            other = ParserElement._literalStringClass( other )
        if not isinstance( other, ParserElement ):
            warnings.warn("Cannot combine element of type %s with ParserElement" % type(other),
                    SyntaxWarning, stacklevel=2)
            return None
        return other ^ self

    def __and__(self, other ):
        """
        Implementation of & operator - returns C{L{Each}}
        """
        if isinstance( other, basestring ):
            other = ParserElement._literalStringClass( other )
        if not isinstance( other, ParserElement ):
            warnings.warn("Cannot combine element of type %s with ParserElement" % type(other),
                    SyntaxWarning, stacklevel=2)
            return None
        return Each( [ self, other ] )

    def __rand__(self, other ):
        """
        Implementation of & operator when left operand is not a C{L{ParserElement}}
        """
        if isinstance( other, basestring ):
            other = ParserElement._literalStringClass( other )
        if not isinstance( other, ParserElement ):
            warnings.warn("Cannot combine element of type %s with ParserElement" % type(other),
                    SyntaxWarning, stacklevel=2)
            return None
        return other & self

    def __invert__( self ):
        """
        Implementation of ~ operator - returns C{L{NotAny}}
        """
        return NotAny( self )

    def __call__(self, name=None):
        """
        Shortcut for C{L{setResultsName}}, with C{listAllMatches=False}.
        
        If C{name} is given with a trailing C{'*'} character, then C{listAllMatches} will be
        passed as C{True}.
           
        If C{name} is omitted, same as calling C{L{copy}}.

        Example::
            # these are equivalent
            userdata = Word(alphas).setResultsName("name") + Word(nums+"-").setResultsName("socsecno")
            userdata = Word(alphas)("name") + Word(nums+"-")("socsecno")             
        """
        if name is not None:
            return self.setResultsName(name)
        else:
            return self.copy()

    def suppress( self ):
        """
        Suppresses the output of this C{ParserElement}; useful to keep punctuation from
        cluttering up returned output.
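
        Example (illustrative sketch)::
            wd = Word(alphas)
            patt = wd + Literal(',').suppress() + wd
            patt.parseString("Hello, World")  # -> ['Hello', 'World'] (no ',' token)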
        """
        return Suppress( self )

    def leaveWhitespace( self ):
        """
        Disables the skipping of whitespace before matching the characters in the
        C{ParserElement}'s defined pattern.  This is normally only used internally by
        the pyparsing module, but may be needed in some whitespace-sensitive grammars.
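
        Example (illustrative sketch; ordinarily leading whitespace is skipped automatically)::
            Word(alphas).parseString("  abc")                    # -> ['abc']
            Word(alphas).leaveWhitespace().parseString("  abc")  # -> raises ParseException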
        """
        self.skipWhitespace = False
        return self

    def setWhitespaceChars( self, chars ):
        """
        Overrides the default whitespace chars
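
        Example (illustrative sketch; keeps newlines significant by excluding them
        from the characters skipped before matching)::
            # only spaces and tabs are skipped; a '\\n' in the input is not silently consumed
            token = Word(alphas).setWhitespaceChars(" \\t")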
        """
        self.skipWhitespace = True
        self.whiteChars = chars
        self.copyDefaultWhiteChars = False
        return self

    def parseWithTabs( self ):
        """
        Overrides default behavior to expand C{<TAB>}s to spaces before parsing the input string.
        Must be called before C{parseString} when the input grammar contains elements that
        match C{<TAB>} characters.
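
        Example (illustrative sketch; without C{parseWithTabs}, the tab would be expanded
        to spaces before matching and the C{White} term would fail)::
            tab_sep = Word(alphas) + White("\\t") + Word(alphas)
            tab_sep.parseWithTabs().parseString("abc\\tdef")  # -> ['abc', '\\t', 'def']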
        """
        self.keepTabs = True
        return self

    def ignore( self, other ):
        """
        Define expression to be ignored (e.g., comments) while doing pattern
        matching; may be called repeatedly, to define multiple comment or other
        ignorable patterns.
        
        Example::
            patt = OneOrMore(Word(alphas))
            patt.parseString('ablaj /* comment */ lskjd') # -> ['ablaj']
            
            patt.ignore(cStyleComment)
            patt.parseString('ablaj /* comment */ lskjd') # -> ['ablaj', 'lskjd']
        """
        if isinstance(other, basestring):
            other = Suppress(other)

        if isinstance( other, Suppress ):
            if other not in self.ignoreExprs:
                self.ignoreExprs.append(other)
        else:
            self.ignoreExprs.append( Suppress( other.copy() ) )
        return self

    def setDebugActions( self, startAction, successAction, exceptionAction ):
        """
        Enable display of debugging messages while doing pattern matching.
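
        Example (an illustrative sketch; the callback signatures shown mirror the
        module's default debug actions)::
            def trace_start(instring, loc, expr):
                print("trying %s at loc %d" % (expr, loc))
            def trace_success(instring, startloc, endloc, expr, toks):
                print("matched %s -> %s" % (expr, toks.asList()))
            def trace_fail(instring, loc, expr, exc):
                print("failed %s: %s" % (expr, exc))

            Word(alphas).setDebugActions(trace_start, trace_success, trace_fail).parseString("abc")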
        """
        self.debugActions = (startAction or _defaultStartDebugAction,
                             successAction or _defaultSuccessDebugAction,
                             exceptionAction or _defaultExceptionDebugAction)
        self.debug = True
        return self

    def setDebug( self, flag=True ):
        """
        Enable display of debugging messages while doing pattern matching.
        Set C{flag} to True to enable, False to disable.

        Example::
            wd = Word(alphas).setName("alphaword")
            integer = Word(nums).setName("numword")
            term = wd | integer
            
            # turn on debugging for wd
            wd.setDebug()

            OneOrMore(term).parseString("abc 123 xyz 890")
        
        prints::
            Match alphaword at loc 0(1,1)
            Matched alphaword -> ['abc']
            Match alphaword at loc 3(1,4)
            Exception raised:Expected alphaword (at char 4), (line:1, col:5)
            Match alphaword at loc 7(1,8)
            Matched alphaword -> ['xyz']
            Match alphaword at loc 11(1,12)
            Exception raised:Expected alphaword (at char 12), (line:1, col:13)
            Match alphaword at loc 15(1,16)
            Exception raised:Expected alphaword (at char 15), (line:1, col:16)

        The output shown is that produced by the default debug actions - custom debug actions can be
        specified using L{setDebugActions}. Prior to attempting
        to match the C{wd} expression, the debugging message C{"Match <exprname> at loc <n>(<line>,<col>)"}
        is shown. Then if the parse succeeds, a C{"Matched"} message is shown, or an C{"Exception raised"}
        message is shown. Also note the use of L{setName} to assign a human-readable name to the expression,
        which makes debugging and exception messages easier to understand - for instance, the default
        name created for the C{Word} expression without calling C{setName} is C{"W:(ABCD...)"}.
        """
        if flag:
            self.setDebugActions( _defaultStartDebugAction, _defaultSuccessDebugAction, _defaultExceptionDebugAction )
        else:
            self.debug = False
        return self

    def __str__( self ):
        return self.name

    def __repr__( self ):
        return _ustr(self)

    def streamline( self ):
        self.streamlined = True
        self.strRepr = None
        return self

    def checkRecursion( self, parseElementList ):
        pass

    def validate( self, validateTrace=[] ):
        """
        Check defined expressions for valid structure, check for infinite recursive definitions.
        """
        self.checkRecursion( [] )

    def parseFile( self, file_or_filename, parseAll=False ):
        """
        Execute the parse expression on the given file or filename.
        If a filename is specified (instead of a file object),
        the entire file is opened, read, and closed before parsing.
        """
        try:
            file_contents = file_or_filename.read()
        except AttributeError:
            with open(file_or_filename, "r") as f:
                file_contents = f.read()
        try:
            return self.parseString(file_contents, parseAll)
        except ParseBaseException as exc:
            if ParserElement.verbose_stacktrace:
                raise
            else:
                # catch and re-raise exception from here, clears out pyparsing internal stack trace
                raise exc

    def __eq__(self,other):
        if isinstance(other, ParserElement):
            return self is other or vars(self) == vars(other)
        elif isinstance(other, basestring):
            return self.matches(other)
        else:
            return super(ParserElement, self).__eq__(other)

    def __ne__(self,other):
        return not (self == other)

    def __hash__(self):
        return hash(id(self))

    def __req__(self,other):
        return self == other

    def __rne__(self,other):
        return not (self == other)

    def matches(self, testString, parseAll=True):
        """
        Method for quick testing of a parser against a test string. Good for simple 
        inline microtests of sub expressions while building up larger parser.
           
        Parameters:
         - testString - string to test against this expression for a match
         - parseAll - (default=C{True}) - flag to pass to C{L{parseString}} when running tests
            
        Example::
            expr = Word(nums)
            assert expr.matches("100")
        """
        try:
            self.parseString(_ustr(testString), parseAll=parseAll)
            return True
        except ParseBaseException:
            return False
                
    def runTests(self, tests, parseAll=True, comment='#', fullDump=True, printResults=True, failureTests=False):
        """
        Execute the parse expression on a series of test strings, showing each
        test, the parsed results or where the parse failed. Quick and easy way to
        run a parse expression against a list of sample strings.
           
        Parameters:
         - tests - a list of separate test strings, or a multiline string of test strings
         - parseAll - (default=C{True}) - flag to pass to C{L{parseString}} when running tests           
         - comment - (default=C{'#'}) - expression for indicating embedded comments in the test 
              string; pass None to disable comment filtering
         - fullDump - (default=C{True}) - dump results as list followed by results names in nested outline;
              if False, only dump nested list
         - printResults - (default=C{True}) prints test output to stdout
         - failureTests - (default=C{False}) indicates if these tests are expected to fail parsing

        Returns: a (success, results) tuple, where success indicates that all tests succeeded
        (or all failed if C{failureTests} is True), and results is a list of
        (test string, result) pairs, where result is the parsed C{ParseResults} or the
        exception raised for that test.
        
        Example::
            number_expr = pyparsing_common.number.copy()

            result = number_expr.runTests('''
                # unsigned integer
                100
                # negative integer
                -100
                # float with scientific notation
                6.02e23
                # integer with scientific notation
                1e-12
                ''')
            print("Success" if result[0] else "Failed!")

            result = number_expr.runTests('''
                # stray character
                100Z
                # missing leading digit before '.'
                -.100
                # too many '.'
                3.14.159
                ''', failureTests=True)
            print("Success" if result[0] else "Failed!")
        prints::
            # unsigned integer
            100
            [100]

            # negative integer
            -100
            [-100]

            # float with scientific notation
            6.02e23
            [6.02e+23]

            # integer with scientific notation
            1e-12
            [1e-12]

            Success
            
            # stray character
            100Z
               ^
            FAIL: Expected end of text (at char 3), (line:1, col:4)

            # missing leading digit before '.'
            -.100
            ^
            FAIL: Expected {real number with scientific notation | real number | signed integer} (at char 0), (line:1, col:1)

            # too many '.'
            3.14.159
                ^
            FAIL: Expected end of text (at char 4), (line:1, col:5)

            Success

        Each test string must be on a single line. If you want to test a string that spans multiple
        lines, create a test like this::

            expr.runTests(r"this is a test\\n of strings that spans \\n 3 lines")
        
        (Note that this is a raw string literal; you must include the leading 'r'.)
        """
        if isinstance(tests, basestring):
            tests = list(map(str.strip, tests.rstrip().splitlines()))
        if isinstance(comment, basestring):
            comment = Literal(comment)
        allResults = []
        comments = []
        success = True
        for t in tests:
            if comment is not None and comment.matches(t, False) or comments and not t:
                comments.append(t)
                continue
            if not t:
                continue
            out = ['\n'.join(comments), t]
            comments = []
            try:
                t = t.replace(r'\n','\n')
                result = self.parseString(t, parseAll=parseAll)
                out.append(result.dump(full=fullDump))
                success = success and not failureTests
            except ParseBaseException as pe:
                fatal = "(FATAL)" if isinstance(pe, ParseFatalException) else ""
                if '\n' in t:
                    out.append(line(pe.loc, t))
                    out.append(' '*(col(pe.loc,t)-1) + '^' + fatal)
                else:
                    out.append(' '*pe.loc + '^' + fatal)
                out.append("FAIL: " + str(pe))
                success = success and failureTests
                result = pe
            except Exception as exc:
                out.append("FAIL-EXCEPTION: " + str(exc))
                success = success and failureTests
                result = exc

            if printResults:
                if fullDump:
                    out.append('')
                print('\n'.join(out))

            allResults.append((t, result))
        
        return success, allResults

        
class Token(ParserElement):
    """
    Abstract C{ParserElement} subclass, for defining atomic matching patterns.
    """
    def __init__( self ):
        super(Token,self).__init__( savelist=False )


class Empty(Token):
    """
    An empty token, will always match.
    """
    def __init__( self ):
        super(Empty,self).__init__()
        self.name = "Empty"
        self.mayReturnEmpty = True
        self.mayIndexError = False


class NoMatch(Token):
    """
    A token that will never match.
    """
    def __init__( self ):
        super(NoMatch,self).__init__()
        self.name = "NoMatch"
        self.mayReturnEmpty = True
        self.mayIndexError = False
        self.errmsg = "Unmatchable token"

    def parseImpl( self, instring, loc, doActions=True ):
        raise ParseException(instring, loc, self.errmsg, self)


class Literal(Token):
    """
    Token to exactly match a specified string.
    
    Example::
        Literal('blah').parseString('blah')  # -> ['blah']
        Literal('blah').parseString('blahfooblah')  # -> ['blah']
        Literal('blah').parseString('bla')  # -> Exception: Expected "blah"
    
    For case-insensitive matching, use L{CaselessLiteral}.
    
    For keyword matching (force word break before and after the matched string),
    use L{Keyword} or L{CaselessKeyword}.
    """
    def __init__( self, matchString ):
        super(Literal,self).__init__()
        self.match = matchString
        self.matchLen = len(matchString)
        try:
            self.firstMatchChar = matchString[0]
        except IndexError:
            warnings.warn("null string passed to Literal; use Empty() instead",
                            SyntaxWarning, stacklevel=2)
            self.__class__ = Empty
        self.name = '"%s"' % _ustr(self.match)
        self.errmsg = "Expected " + self.name
        self.mayReturnEmpty = False
        self.mayIndexError = False

    # Performance tuning: this routine gets called a *lot*
    # if this is a single character match string  and the first character matches,
    # short-circuit as quickly as possible, and avoid calling startswith
    #~ @profile
    def parseImpl( self, instring, loc, doActions=True ):
        if (instring[loc] == self.firstMatchChar and
            (self.matchLen==1 or instring.startswith(self.match,loc)) ):
            return loc+self.matchLen, self.match
        raise ParseException(instring, loc, self.errmsg, self)
_L = Literal
ParserElement._literalStringClass = Literal

class Keyword(Token):
    """
    Token to exactly match a specified string as a keyword, that is, it must be
    immediately followed by a non-keyword character.  Compare with C{L{Literal}}:
     - C{Literal("if")} will match the leading C{'if'} in C{'ifAndOnlyIf'}.
     - C{Keyword("if")} will not; it will only match the leading C{'if'} in C{'if x=1'}, or C{'if(y==2)'}
    Accepts two optional constructor arguments in addition to the keyword string:
     - C{identChars} is a string of characters that would be valid identifier characters,
          defaulting to all alphanumerics + "_" and "$"
     - C{caseless} allows case-insensitive matching, default is C{False}.
       
    Example::
        Keyword("start").parseString("start")  # -> ['start']
        Keyword("start").parseString("starting")  # -> Exception

    For case-insensitive matching, use L{CaselessKeyword}.
    """
    DEFAULT_KEYWORD_CHARS = alphanums+"_$"

    def __init__( self, matchString, identChars=None, caseless=False ):
        super(Keyword,self).__init__()
        if identChars is None:
            identChars = Keyword.DEFAULT_KEYWORD_CHARS
        self.match = matchString
        self.matchLen = len(matchString)
        try:
            self.firstMatchChar = matchString[0]
        except IndexError:
            warnings.warn("null string passed to Keyword; use Empty() instead",
                            SyntaxWarning, stacklevel=2)
        self.name = '"%s"' % self.match
        self.errmsg = "Expected " + self.name
        self.mayReturnEmpty = False
        self.mayIndexError = False
        self.caseless = caseless
        if caseless:
            self.caselessmatch = matchString.upper()
            identChars = identChars.upper()
        self.identChars = set(identChars)

    def parseImpl( self, instring, loc, doActions=True ):
        if self.caseless:
            if ( (instring[ loc:loc+self.matchLen ].upper() == self.caselessmatch) and
                 (loc >= len(instring)-self.matchLen or instring[loc+self.matchLen].upper() not in self.identChars) and
                 (loc == 0 or instring[loc-1].upper() not in self.identChars) ):
                return loc+self.matchLen, self.match
        else:
            if (instring[loc] == self.firstMatchChar and
                (self.matchLen==1 or instring.startswith(self.match,loc)) and
                (loc >= len(instring)-self.matchLen or instring[loc+self.matchLen] not in self.identChars) and
                (loc == 0 or instring[loc-1] not in self.identChars) ):
                return loc+self.matchLen, self.match
        raise ParseException(instring, loc, self.errmsg, self)

    def copy(self):
        c = super(Keyword,self).copy()
        c.identChars = Keyword.DEFAULT_KEYWORD_CHARS
        return c

    @staticmethod
    def setDefaultKeywordChars( chars ):
        """Overrides the default Keyword chars
        """
        Keyword.DEFAULT_KEYWORD_CHARS = chars

class CaselessLiteral(Literal):
    """
    Token to match a specified string, ignoring case of letters.
    Note: the matched results will always be in the case of the given
    match string, NOT the case of the input text.

    Example::
        OneOrMore(CaselessLiteral("CMD")).parseString("cmd CMD Cmd10") # -> ['CMD', 'CMD', 'CMD']
        
    (Contrast with example for L{CaselessKeyword}.)
    """
    def __init__( self, matchString ):
        super(CaselessLiteral,self).__init__( matchString.upper() )
        # Preserve the defining literal.
        self.returnString = matchString
        self.name = "'%s'" % self.returnString
        self.errmsg = "Expected " + self.name

    def parseImpl( self, instring, loc, doActions=True ):
        if instring[ loc:loc+self.matchLen ].upper() == self.match:
            return loc+self.matchLen, self.returnString
        raise ParseException(instring, loc, self.errmsg, self)

class CaselessKeyword(Keyword):
    """
    Caseless version of L{Keyword}.

    Example::
        OneOrMore(CaselessKeyword("CMD")).parseString("cmd CMD Cmd10") # -> ['CMD', 'CMD']
        
    (Contrast with example for L{CaselessLiteral}.)
    """
    def __init__( self, matchString, identChars=None ):
        super(CaselessKeyword,self).__init__( matchString, identChars, caseless=True )

    def parseImpl( self, instring, loc, doActions=True ):
        if ( (instring[ loc:loc+self.matchLen ].upper() == self.caselessmatch) and
             (loc >= len(instring)-self.matchLen or instring[loc+self.matchLen].upper() not in self.identChars) ):
            return loc+self.matchLen, self.match
        raise ParseException(instring, loc, self.errmsg, self)

class CloseMatch(Token):
    """
    A variation on L{Literal} which matches "close" matches, that is, 
    strings with at most 'n' mismatching characters. C{CloseMatch} takes parameters:
     - C{match_string} - string to be matched
     - C{maxMismatches} - (C{default=1}) maximum number of mismatches allowed to count as a match
    
    The results from a successful parse will contain the matched text from the input string and the following named results:
     - C{mismatches} - a list of the positions within the match_string where mismatches were found
     - C{original} - the original match_string used to compare against the input string
    
    If C{mismatches} is an empty list, then the match was an exact match.
    
    Example::
        patt = CloseMatch("ATCATCGAATGGA")
        patt.parseString("ATCATCGAAXGGA") # -> (['ATCATCGAAXGGA'], {'mismatches': [[9]], 'original': ['ATCATCGAATGGA']})
        patt.parseString("ATCAXCGAAXGGA") # -> Exception: Expected 'ATCATCGAATGGA' (with up to 1 mismatches) (at char 0), (line:1, col:1)

        # exact match
        patt.parseString("ATCATCGAATGGA") # -> (['ATCATCGAATGGA'], {'mismatches': [[]], 'original': ['ATCATCGAATGGA']})

        # close match allowing up to 2 mismatches
        patt = CloseMatch("ATCATCGAATGGA", maxMismatches=2)
        patt.parseString("ATCAXCGAAXGGA") # -> (['ATCAXCGAAXGGA'], {'mismatches': [[4, 9]], 'original': ['ATCATCGAATGGA']})
    """
    def __init__(self, match_string, maxMismatches=1):
        super(CloseMatch,self).__init__()
        self.name = match_string
        self.match_string = match_string
        self.maxMismatches = maxMismatches
        self.errmsg = "Expected %r (with up to %d mismatches)" % (self.match_string, self.maxMismatches)
        self.mayIndexError = False
        self.mayReturnEmpty = False

    def parseImpl( self, instring, loc, doActions=True ):
        start = loc
        instrlen = len(instring)
        maxloc = start + len(self.match_string)

        if maxloc <= instrlen:
            match_string = self.match_string
            match_stringloc = 0
            mismatches = []
            maxMismatches = self.maxMismatches

            for match_stringloc,s_m in enumerate(zip(instring[loc:maxloc], self.match_string)):
                src,mat = s_m
                if src != mat:
                    mismatches.append(match_stringloc)
                    if len(mismatches) > maxMismatches:
                        break
            else:
                # advance past the (possibly inexact) match, measured from the original start
                loc = start + match_stringloc + 1
                results = ParseResults([instring[start:loc]])
                results['original'] = self.match_string
                results['mismatches'] = mismatches
                return loc, results

        raise ParseException(instring, loc, self.errmsg, self)


class Word(Token):
    """
    Token for matching words composed of allowed character sets.
    Defined with string containing all allowed initial characters,
    an optional string containing allowed body characters (if omitted,
    defaults to the initial character set), and an optional minimum,
    maximum, and/or exact length.  The default value for C{min} is 1 (a
    minimum value < 1 is not valid); the default values for C{max} and C{exact}
    are 0, meaning no maximum or exact length restriction. An optional
    C{excludeChars} parameter can list characters that might be found in 
    the input C{bodyChars} string; useful to define a word of all printables
    except for one or two characters, for instance.
    
    L{srange} is useful for defining custom character set strings for defining 
    C{Word} expressions, using range notation from regular expression character sets.
    
    A common mistake is to use C{Word} to match a specific literal string, as in 
    C{Word("Address")}. Remember that C{Word} uses the string argument to define
    I{sets} of matchable characters. This expression would match "Add", "AAA",
    "dAred", or any other word made up of the characters 'A', 'd', 'r', 'e', and 's'.
    To match an exact literal string, use L{Literal} or L{Keyword}.

    pyparsing includes helper strings for building Words:
     - L{alphas}
     - L{nums}
     - L{alphanums}
     - L{hexnums}
     - L{alphas8bit} (alphabetic characters in the Latin-1 range 128-255 - accented, tilded, umlauted, etc.)
     - L{punc8bit} (non-alphabetic characters in the Latin-1 range 128-255 - currency, symbols, superscripts, diacriticals, etc.)
     - L{printables} (any non-whitespace character)

    Example::
        # a word composed of digits
        integer = Word(nums) # equivalent to Word("0123456789") or Word(srange("0-9"))
        
        # a word with a leading capital, and zero or more lowercase
        capital_word = Word(alphas.upper(), alphas.lower())

        # hostnames are alphanumeric, with leading alpha, and '-'
        hostname = Word(alphas, alphanums+'-')
        
        # roman numeral (not a strict parser, accepts invalid mix of characters)
        roman = Word("IVXLCDM")
        
        # any string of non-whitespace characters, except for ','
        csv_value = Word(printables, excludeChars=",")
    """
    def __init__( self, initChars, bodyChars=None, min=1, max=0, exact=0, asKeyword=False, excludeChars=None ):
        super(Word,self).__init__()
        if excludeChars:
            initChars = ''.join(c for c in initChars if c not in excludeChars)
            if bodyChars:
                bodyChars = ''.join(c for c in bodyChars if c not in excludeChars)
        self.initCharsOrig = initChars
        self.initChars = set(initChars)
        if bodyChars :
            self.bodyCharsOrig = bodyChars
            self.bodyChars = set(bodyChars)
        else:
            self.bodyCharsOrig = initChars
            self.bodyChars = set(initChars)

        self.maxSpecified = max > 0

        if min < 1:
            raise ValueError("cannot specify a minimum length < 1; use Optional(Word()) if zero-length word is permitted")

        self.minLen = min

        if max > 0:
            self.maxLen = max
        else:
            self.maxLen = _MAX_INT

        if exact > 0:
            self.maxLen = exact
            self.minLen = exact

        self.name = _ustr(self)
        self.errmsg = "Expected " + self.name
        self.mayIndexError = False
        self.asKeyword = asKeyword

        if ' ' not in self.initCharsOrig+self.bodyCharsOrig and (min==1 and max==0 and exact==0):
            if self.bodyCharsOrig == self.initCharsOrig:
                self.reString = "[%s]+" % _escapeRegexRangeChars(self.initCharsOrig)
            elif len(self.initCharsOrig) == 1:
                self.reString = "%s[%s]*" % \
                                      (re.escape(self.initCharsOrig),
                                      _escapeRegexRangeChars(self.bodyCharsOrig),)
            else:
                self.reString = "[%s][%s]*" % \
                                      (_escapeRegexRangeChars(self.initCharsOrig),
                                      _escapeRegexRangeChars(self.bodyCharsOrig),)
            if self.asKeyword:
                self.reString = r"\b"+self.reString+r"\b"
            try:
                self.re = re.compile( self.reString )
            except Exception:
                self.re = None

    def parseImpl( self, instring, loc, doActions=True ):
        if self.re:
            result = self.re.match(instring,loc)
            if not result:
                raise ParseException(instring, loc, self.errmsg, self)

            loc = result.end()
            return loc, result.group()

        if not(instring[ loc ] in self.initChars):
            raise ParseException(instring, loc, self.errmsg, self)

        start = loc
        loc += 1
        instrlen = len(instring)
        bodychars = self.bodyChars
        maxloc = start + self.maxLen
        maxloc = min( maxloc, instrlen )
        while loc < maxloc and instring[loc] in bodychars:
            loc += 1

        throwException = False
        if loc - start < self.minLen:
            throwException = True
        if self.maxSpecified and loc < instrlen and instring[loc] in bodychars:
            throwException = True
        if self.asKeyword:
            if (start>0 and instring[start-1] in bodychars) or (loc<instrlen and instring[loc] in bodychars):
                throwException = True

        if throwException:
            raise ParseException(instring, loc, self.errmsg, self)

        return loc, instring[start:loc]

    def __str__( self ):
        try:
            return super(Word,self).__str__()
        except Exception:
            pass


        if self.strRepr is None:

            def charsAsStr(s):
                if len(s)>4:
                    return s[:4]+"..."
                else:
                    return s

            if ( self.initCharsOrig != self.bodyCharsOrig ):
                self.strRepr = "W:(%s,%s)" % ( charsAsStr(self.initCharsOrig), charsAsStr(self.bodyCharsOrig) )
            else:
                self.strRepr = "W:(%s)" % charsAsStr(self.initCharsOrig)

        return self.strRepr


class Regex(Token):
    """
    Token for matching strings that match a given regular expression.
    Defined with string specifying the regular expression in a form recognized by the inbuilt Python re module.
    If the given regex contains named groups (defined using C{(?P<name>...)}), these will be preserved as 
    named parse results.

    Example::
        realnum = Regex(r"[+-]?\d+\.\d*")
        date = Regex(r'(?P<year>\d{4})-(?P<month>\d\d?)-(?P<day>\d\d?)')
        # ref: http://stackoverflow.com/questions/267399/how-do-you-match-only-valid-roman-numerals-with-a-regular-expression
        roman = Regex(r"M{0,4}(CM|CD|D?C{0,3})(XC|XL|L?X{0,3})(IX|IV|V?I{0,3})")
    """
    compiledREtype = type(re.compile("[A-Z]"))
    def __init__( self, pattern, flags=0):
        """The parameters C{pattern} and C{flags} are passed to the C{re.compile()} function as-is. See the Python C{re} module for an explanation of the acceptable patterns and flags."""
        super(Regex,self).__init__()

        if isinstance(pattern, basestring):
            if not pattern:
                warnings.warn("null string passed to Regex; use Empty() instead",
                        SyntaxWarning, stacklevel=2)

            self.pattern = pattern
            self.flags = flags

            try:
                self.re = re.compile(self.pattern, self.flags)
                self.reString = self.pattern
            except sre_constants.error:
                warnings.warn("invalid pattern (%s) passed to Regex" % pattern,
                    SyntaxWarning, stacklevel=2)
                raise

        elif isinstance(pattern, Regex.compiledREtype):
            self.re = pattern
            self.pattern = \
            self.reString = str(pattern)
            self.flags = flags
            
        else:
            raise ValueError("Regex may only be constructed with a string or a compiled RE object")

        self.name = _ustr(self)
        self.errmsg = "Expected " + self.name
        self.mayIndexError = False
        self.mayReturnEmpty = True

    def parseImpl( self, instring, loc, doActions=True ):
        result = self.re.match(instring,loc)
        if not result:
            raise ParseException(instring, loc, self.errmsg, self)

        loc = result.end()
        d = result.groupdict()
        ret = ParseResults(result.group())
        if d:
            for k in d:
                ret[k] = d[k]
        return loc,ret

    def __str__( self ):
        try:
            return super(Regex,self).__str__()
        except Exception:
            pass

        if self.strRepr is None:
            self.strRepr = "Re:(%s)" % repr(self.pattern)

        return self.strRepr


class QuotedString(Token):
    r"""
    Token for matching strings that are delimited by quoting characters.
    
    Defined with the following parameters:
        - quoteChar - string of one or more characters defining the quote delimiting string
        - escChar - character to escape quotes, typically backslash (default=C{None})
        - escQuote - special quote sequence to escape an embedded quote string (such as SQL's "" to escape an embedded ") (default=C{None})
        - multiline - boolean indicating whether quotes can span multiple lines (default=C{False})
        - unquoteResults - boolean indicating whether the matched text should be unquoted (default=C{True})
        - endQuoteChar - string of one or more characters defining the end of the quote delimited string (default=C{None} => same as quoteChar)
        - convertWhitespaceEscapes - convert escaped whitespace (C{'\t'}, C{'\n'}, etc.) to actual whitespace (default=C{True})

    Example::
        qs = QuotedString('"')
        print(qs.searchString('lsjdf "This is the quote" sldjf'))
        complex_qs = QuotedString('{{', endQuoteChar='}}')
        print(complex_qs.searchString('lsjdf {{This is the "quote"}} sldjf'))
        sql_qs = QuotedString('"', escQuote='""')
        print(sql_qs.searchString('lsjdf "This is the quote with ""embedded"" quotes" sldjf'))
    prints::
        [['This is the quote']]
        [['This is the "quote"']]
        [['This is the quote with "embedded" quotes']]
    """
    def __init__( self, quoteChar, escChar=None, escQuote=None, multiline=False, unquoteResults=True, endQuoteChar=None, convertWhitespaceEscapes=True):
        super(QuotedString,self).__init__()

        # remove whitespace from quote chars - won't work anyway
        quoteChar = quoteChar.strip()
        if not quoteChar:
            warnings.warn("quoteChar cannot be the empty string",SyntaxWarning,stacklevel=2)
            raise SyntaxError()

        if endQuoteChar is None:
            endQuoteChar = quoteChar
        else:
            endQuoteChar = endQuoteChar.strip()
            if not endQuoteChar:
                warnings.warn("endQuoteChar cannot be the empty string",SyntaxWarning,stacklevel=2)
                raise SyntaxError()

        self.quoteChar = quoteChar
        self.quoteCharLen = len(quoteChar)
        self.firstQuoteChar = quoteChar[0]
        self.endQuoteChar = endQuoteChar
        self.endQuoteCharLen = len(endQuoteChar)
        self.escChar = escChar
        self.escQuote = escQuote
        self.unquoteResults = unquoteResults
        self.convertWhitespaceEscapes = convertWhitespaceEscapes

        if multiline:
            self.flags = re.MULTILINE | re.DOTALL
            self.pattern = r'%s(?:[^%s%s]' % \
                ( re.escape(self.quoteChar),
                  _escapeRegexRangeChars(self.endQuoteChar[0]),
                  (escChar is not None and _escapeRegexRangeChars(escChar) or '') )
        else:
            self.flags = 0
            self.pattern = r'%s(?:[^%s\n\r%s]' % \
                ( re.escape(self.quoteChar),
                  _escapeRegexRangeChars(self.endQuoteChar[0]),
                  (escChar is not None and _escapeRegexRangeChars(escChar) or '') )
        if len(self.endQuoteChar) > 1:
            self.pattern += (
                '|(?:' + ')|(?:'.join("%s[^%s]" % (re.escape(self.endQuoteChar[:i]),
                                               _escapeRegexRangeChars(self.endQuoteChar[i]))
                                    for i in range(len(self.endQuoteChar)-1,0,-1)) + ')'
                )
        if escQuote:
            self.pattern += (r'|(?:%s)' % re.escape(escQuote))
        if escChar:
            self.pattern += (r'|(?:%s.)' % re.escape(escChar))
            self.escCharReplacePattern = re.escape(self.escChar)+"(.)"
        self.pattern += (r')*%s' % re.escape(self.endQuoteChar))

        try:
            self.re = re.compile(self.pattern, self.flags)
            self.reString = self.pattern
        except sre_constants.error:
            warnings.warn("invalid pattern (%s) passed to Regex" % self.pattern,
                SyntaxWarning, stacklevel=2)
            raise

        self.name = _ustr(self)
        self.errmsg = "Expected " + self.name
        self.mayIndexError = False
        self.mayReturnEmpty = True

    def parseImpl( self, instring, loc, doActions=True ):
        result = instring[loc] == self.firstQuoteChar and self.re.match(instring,loc) or None
        if not result:
            raise ParseException(instring, loc, self.errmsg, self)

        loc = result.end()
        ret = result.group()

        if self.unquoteResults:

            # strip off quotes
            ret = ret[self.quoteCharLen:-self.endQuoteCharLen]

            if isinstance(ret,basestring):
                # replace escaped whitespace
                if '\\' in ret and self.convertWhitespaceEscapes:
                    ws_map = {
                        r'\t' : '\t',
                        r'\n' : '\n',
                        r'\f' : '\f',
                        r'\r' : '\r',
                    }
                    for wslit,wschar in ws_map.items():
                        ret = ret.replace(wslit, wschar)

                # replace escaped characters
                if self.escChar:
                    ret = re.sub(self.escCharReplacePattern, r"\g<1>", ret)

                # replace escaped quotes
                if self.escQuote:
                    ret = ret.replace(self.escQuote, self.endQuoteChar)

        return loc, ret

    def __str__( self ):
        try:
            return super(QuotedString,self).__str__()
        except Exception:
            pass

        if self.strRepr is None:
            self.strRepr = "quoted string, starting with %s ending with %s" % (self.quoteChar, self.endQuoteChar)

        return self.strRepr


class CharsNotIn(Token):
    """
    Token for matching words composed of characters I{not} in a given set (will
    include whitespace in matched characters if not listed in the provided exclusion set - see example).
    Defined with string containing all disallowed characters, and an optional
    minimum, maximum, and/or exact length.  The default value for C{min} is 1 (a
    minimum value < 1 is not valid); the default values for C{max} and C{exact}
    are 0, meaning no maximum or exact length restriction.

    Example::
        # define a comma-separated-value as anything that is not a ','
        csv_value = CharsNotIn(',')
        print(delimitedList(csv_value).parseString("dkls,lsdkjf,s12 34,@!#,213"))
    prints::
        ['dkls', 'lsdkjf', 's12 34', '@!#', '213']
    """
    def __init__( self, notChars, min=1, max=0, exact=0 ):
        super(CharsNotIn,self).__init__()
        self.skipWhitespace = False
        self.notChars = notChars

        if min < 1:
            raise ValueError("cannot specify a minimum length < 1; use Optional(CharsNotIn()) if zero-length char group is permitted")

        self.minLen = min

        if max > 0:
            self.maxLen = max
        else:
            self.maxLen = _MAX_INT

        if exact > 0:
            self.maxLen = exact
            self.minLen = exact

        self.name = _ustr(self)
        self.errmsg = "Expected " + self.name
        self.mayReturnEmpty = ( self.minLen == 0 )
        self.mayIndexError = False

    def parseImpl( self, instring, loc, doActions=True ):
        if instring[loc] in self.notChars:
            raise ParseException(instring, loc, self.errmsg, self)

        start = loc
        loc += 1
        notchars = self.notChars
        maxlen = min( start+self.maxLen, len(instring) )
        while loc < maxlen and \
              (instring[loc] not in notchars):
            loc += 1

        if loc - start < self.minLen:
            raise ParseException(instring, loc, self.errmsg, self)

        return loc, instring[start:loc]

    def __str__( self ):
        try:
            return super(CharsNotIn, self).__str__()
        except Exception:
            pass

        if self.strRepr is None:
            if len(self.notChars) > 4:
                self.strRepr = "!W:(%s...)" % self.notChars[:4]
            else:
                self.strRepr = "!W:(%s)" % self.notChars

        return self.strRepr

class White(Token):
    """
    Special matching class for matching whitespace.  Normally, whitespace is ignored
    by pyparsing grammars.  This class is included when some whitespace structures
    are significant.  Define with a string containing the whitespace characters to be
    matched; default is C{" \\t\\r\\n"}.  Also takes optional C{min}, C{max}, and C{exact} arguments,
    as defined for the C{L{Word}} class.
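
    Example (illustrative sketch; captures the run of spaces instead of skipping it)::
        labeled = White(" ") + Word(alphas)
        labeled.parseString("   indented")  # -> ['   ', 'indented']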
    """
    whiteStrs = {
        " " : "<SPC>",
        "\t": "<TAB>",
        "\n": "<LF>",
        "\r": "<CR>",
        "\f": "<FF>",
        }
    def __init__(self, ws=" \t\r\n", min=1, max=0, exact=0):
        super(White,self).__init__()
        self.matchWhite = ws
        self.setWhitespaceChars( "".join(c for c in self.whiteChars if c not in self.matchWhite) )
        #~ self.leaveWhitespace()
        self.name = ("".join(White.whiteStrs[c] for c in self.matchWhite))
        self.mayReturnEmpty = True
        self.errmsg = "Expected " + self.name

        self.minLen = min

        if max > 0:
            self.maxLen = max
        else:
            self.maxLen = _MAX_INT

        if exact > 0:
            self.maxLen = exact
            self.minLen = exact

    def parseImpl( self, instring, loc, doActions=True ):
        if not(instring[ loc ] in self.matchWhite):
            raise ParseException(instring, loc, self.errmsg, self)
        start = loc
        loc += 1
        maxloc = start + self.maxLen
        maxloc = min( maxloc, len(instring) )
        while loc < maxloc and instring[loc] in self.matchWhite:
            loc += 1

        if loc - start < self.minLen:
            raise ParseException(instring, loc, self.errmsg, self)

        return loc, instring[start:loc]


class _PositionToken(Token):
    def __init__( self ):
        super(_PositionToken,self).__init__()
        self.name=self.__class__.__name__
        self.mayReturnEmpty = True
        self.mayIndexError = False

class GoToColumn(_PositionToken):
    """
    Token to advance to a specific column of input text; useful for tabular report scraping.
    """
    def __init__( self, colno ):
        super(GoToColumn,self).__init__()
        self.col = colno

    def preParse( self, instring, loc ):
        if col(loc,instring) != self.col:
            instrlen = len(instring)
            if self.ignoreExprs:
                loc = self._skipIgnorables( instring, loc )
            while loc < instrlen and instring[loc].isspace() and col( loc, instring ) != self.col :
                loc += 1
        return loc

    def parseImpl( self, instring, loc, doActions=True ):
        thiscol = col( loc, instring )
        if thiscol > self.col:
            raise ParseException( instring, loc, "Text not in expected column", self )
        newloc = loc + self.col - thiscol
        ret = instring[ loc: newloc ]
        return newloc, ret


class LineStart(_PositionToken):
    """
    Matches if current position is at the beginning of a line within the parse string
    
    Example::
    
        test = '''\
        AAA this line
        AAA and this line
          AAA but not this one
        B AAA and definitely not this one
        '''

        for t in (LineStart() + 'AAA' + restOfLine).searchString(test):
            print(t)
    
    Prints::
        ['AAA', ' this line']
        ['AAA', ' and this line']    

    """
    def __init__( self ):
        super(LineStart,self).__init__()
        self.errmsg = "Expected start of line"

    def parseImpl( self, instring, loc, doActions=True ):
        if col(loc, instring) == 1:
            return loc, []
        raise ParseException(instring, loc, self.errmsg, self)

class LineEnd(_PositionToken):
    """
    Matches if current position is at the end of a line within the parse string
    """
    def __init__( self ):
        super(LineEnd,self).__init__()
        self.setWhitespaceChars( ParserElement.DEFAULT_WHITE_CHARS.replace("\n","") )
        self.errmsg = "Expected end of line"

    def parseImpl( self, instring, loc, doActions=True ):
        if loc<len(instring):
            if instring[loc] == "\n":
                return loc+1, "\n"
            else:
                raise ParseException(instring, loc, self.errmsg, self)
        elif loc == len(instring):
            return loc+1, []
        else:
            raise ParseException(instring, loc, self.errmsg, self)

class StringStart(_PositionToken):
    """
    Matches if current position is at the beginning of the parse string
    """
    def __init__( self ):
        super(StringStart,self).__init__()
        self.errmsg = "Expected start of text"

    def parseImpl( self, instring, loc, doActions=True ):
        if loc != 0:
            # see if entire string up to here is just whitespace and ignoreables
            if loc != self.preParse( instring, 0 ):
                raise ParseException(instring, loc, self.errmsg, self)
        return loc, []

class StringEnd(_PositionToken):
    """
    Matches if current position is at the end of the parse string
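
    Example (illustrative sketch; requires that nothing but whitespace follow the number)::
        full_number = Word(nums) + StringEnd()
        full_number.parseString("123")      # -> ['123']
        full_number.parseString("123 abc")  # -> raises ParseException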
    """
    def __init__( self ):
        super(StringEnd,self).__init__()
        self.errmsg = "Expected end of text"

    def parseImpl( self, instring, loc, doActions=True ):
        if loc < len(instring):
            raise ParseException(instring, loc, self.errmsg, self)
        elif loc == len(instring):
            return loc+1, []
        elif loc > len(instring):
            return loc, []
        else:
            raise ParseException(instring, loc, self.errmsg, self)

class WordStart(_PositionToken):
    """
    Matches if the current position is at the beginning of a Word, and
    is not preceded by any character in a given set of C{wordChars}
    (default=C{printables}). To emulate the C{\\b} behavior of regular expressions,
    use C{WordStart(alphanums)}. C{WordStart} will also match at the beginning of
    the string being parsed, or at the beginning of a line.
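
    Example (illustrative sketch; matches 'cat' only at the start of a word, so the 'cat'
    buried inside 'scatter' is skipped while the leading 'cat' of 'catalog' still matches)::
        word_cat = WordStart(alphanums) + Literal("cat")
        print(word_cat.searchString("cat scatter catalog"))  # -> [['cat'], ['cat']]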
    """
    def __init__(self, wordChars = printables):
        super(WordStart,self).__init__()
        self.wordChars = set(wordChars)
        self.errmsg = "Not at the start of a word"

    def parseImpl(self, instring, loc, doActions=True ):
        if loc != 0:
            if (instring[loc-1] in self.wordChars or
                instring[loc] not in self.wordChars):
                raise ParseException(instring, loc, self.errmsg, self)
        return loc, []

class WordEnd(_PositionToken):
    """
    Matches if the current position is at the end of a Word, and
    is not followed by any character in a given set of C{wordChars}
    (default=C{printables}). To emulate the C{\b} behavior of regular expressions,
    use C{WordEnd(alphanums)}. C{WordEnd} will also match at the end of
    the string being parsed, or at the end of a line.
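
    Example (illustrative sketch)::
        # require 'km' to end a word, so '10km' is accepted but the
        # leading 'km' of 'kmh' is not
        unit = Literal('km') + WordEnd(alphanums)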
    """
    def __init__(self, wordChars = printables):
        super(WordEnd,self).__init__()
        self.wordChars = set(wordChars)
        self.skipWhitespace = False
        self.errmsg = "Not at the end of a word"

    def parseImpl(self, instring, loc, doActions=True ):
        instrlen = len(instring)
        if instrlen>0 and loc<instrlen:
            if (instring[loc] in self.wordChars or
                instring[loc-1] not in self.wordChars):
                raise ParseException(instring, loc, self.errmsg, self)
        return loc, []


class ParseExpression(ParserElement):
    """
    Abstract subclass of ParserElement, for combining and post-processing parsed tokens.
    """
    def __init__( self, exprs, savelist = False ):
        super(ParseExpression,self).__init__(savelist)
        if isinstance( exprs, _generatorType ):
            exprs = list(exprs)

        if isinstance( exprs, basestring ):
            self.exprs = [ ParserElement._literalStringClass( exprs ) ]
        elif isinstance( exprs, collections.Iterable ):
            exprs = list(exprs)
            # if sequence of strings provided, wrap with Literal
            if all(isinstance(expr, basestring) for expr in exprs):
                exprs = map(ParserElement._literalStringClass, exprs)
            self.exprs = list(exprs)
        else:
            try:
                self.exprs = list( exprs )
            except TypeError:
                self.exprs = [ exprs ]
        self.callPreparse = False

    def __getitem__( self, i ):
        return self.exprs[i]

    def append( self, other ):
        self.exprs.append( other )
        self.strRepr = None
        return self

    def leaveWhitespace( self ):
        """Extends C{leaveWhitespace} defined in base class, and also invokes C{leaveWhitespace} on
           all contained expressions."""
        self.skipWhitespace = False
        self.exprs = [ e.copy() for e in self.exprs ]
        for e in self.exprs:
            e.leaveWhitespace()
        return self

    def ignore( self, other ):
        if isinstance( other, Suppress ):
            if other not in self.ignoreExprs:
                super( ParseExpression, self).ignore( other )
                for e in self.exprs:
                    e.ignore( self.ignoreExprs[-1] )
        else:
            super( ParseExpression, self).ignore( other )
            for e in self.exprs:
                e.ignore( self.ignoreExprs[-1] )
        return self

    def __str__( self ):
        try:
            return super(ParseExpression,self).__str__()
        except Exception:
            pass

        if self.strRepr is None:
            self.strRepr = "%s:(%s)" % ( self.__class__.__name__, _ustr(self.exprs) )
        return self.strRepr

    def streamline( self ):
        super(ParseExpression,self).streamline()

        for e in self.exprs:
            e.streamline()

        # collapse nested And's of the form And( And( And( a,b), c), d) to And( a,b,c,d )
        # but only if there are no parse actions or resultsNames on the nested And's
        # (likewise for Or's and MatchFirst's)
        if ( len(self.exprs) == 2 ):
            other = self.exprs[0]
            if ( isinstance( other, self.__class__ ) and
                  not(other.parseAction) and
                  other.resultsName is None and
                  not other.debug ):
                self.exprs = other.exprs[:] + [ self.exprs[1] ]
                self.strRepr = None
                self.mayReturnEmpty |= other.mayReturnEmpty
                self.mayIndexError  |= other.mayIndexError

            other = self.exprs[-1]
            if ( isinstance( other, self.__class__ ) and
                  not(other.parseAction) and
                  other.resultsName is None and
                  not other.debug ):
                self.exprs = self.exprs[:-1] + other.exprs[:]
                self.strRepr = None
                self.mayReturnEmpty |= other.mayReturnEmpty
                self.mayIndexError  |= other.mayIndexError

        self.errmsg = "Expected " + _ustr(self)
        
        return self

    def setResultsName( self, name, listAllMatches=False ):
        ret = super(ParseExpression,self).setResultsName(name,listAllMatches)
        return ret

    def validate( self, validateTrace=[] ):
        tmp = validateTrace[:]+[self]
        for e in self.exprs:
            e.validate(tmp)
        self.checkRecursion( [] )
        
    def copy(self):
        ret = super(ParseExpression,self).copy()
        ret.exprs = [e.copy() for e in self.exprs]
        return ret

class And(ParseExpression):
    """
    Requires all given C{ParseExpression}s to be found in the given order.
    Expressions may be separated by whitespace.
    May be constructed using the C{'+'} operator.
    May also be constructed using the C{'-'} operator, which will suppress backtracking.

    Example::
        integer = Word(nums)
        name_expr = OneOrMore(Word(alphas))

        expr = And([integer("id"),name_expr("name"),integer("age")])
        # more easily written as:
        expr = integer("id") + name_expr("name") + integer("age")
    """

    class _ErrorStop(Empty):
        def __init__(self, *args, **kwargs):
            super(And._ErrorStop,self).__init__(*args, **kwargs)
            self.name = '-'
            self.leaveWhitespace()

    def __init__( self, exprs, savelist = True ):
        super(And,self).__init__(exprs, savelist)
        self.mayReturnEmpty = all(e.mayReturnEmpty for e in self.exprs)
        self.setWhitespaceChars( self.exprs[0].whiteChars )
        self.skipWhitespace = self.exprs[0].skipWhitespace
        self.callPreparse = True

    def parseImpl( self, instring, loc, doActions=True ):
        # pass False as last arg to _parse for first element, since we already
        # pre-parsed the string as part of our And pre-parsing
        loc, resultlist = self.exprs[0]._parse( instring, loc, doActions, callPreParse=False )
        errorStop = False
        for e in self.exprs[1:]:
            if isinstance(e, And._ErrorStop):
                errorStop = True
                continue
            if errorStop:
                try:
                    loc, exprtokens = e._parse( instring, loc, doActions )
                except ParseSyntaxException:
                    raise
                except ParseBaseException as pe:
                    pe.__traceback__ = None
                    raise ParseSyntaxException._from_exception(pe)
                except IndexError:
                    raise ParseSyntaxException(instring, len(instring), self.errmsg, self)
            else:
                loc, exprtokens = e._parse( instring, loc, doActions )
            if exprtokens or exprtokens.haskeys():
                resultlist += exprtokens
        return loc, resultlist

    def __iadd__(self, other ):
        if isinstance( other, basestring ):
            other = ParserElement._literalStringClass( other )
        return self.append( other ) #And( [ self, other ] )

    def checkRecursion( self, parseElementList ):
        subRecCheckList = parseElementList[:] + [ self ]
        for e in self.exprs:
            e.checkRecursion( subRecCheckList )
            if not e.mayReturnEmpty:
                break

    def __str__( self ):
        if hasattr(self,"name"):
            return self.name

        if self.strRepr is None:
            self.strRepr = "{" + " ".join(_ustr(e) for e in self.exprs) + "}"

        return self.strRepr


class Or(ParseExpression):
    """
    Requires that at least one C{ParseExpression} is found.
    If two expressions match, the expression that matches the longest string will be used.
    May be constructed using the C{'^'} operator.

    Example::
        # construct Or using '^' operator
        
        number = Word(nums) ^ Combine(Word(nums) + '.' + Word(nums))
        print(number.searchString("123 3.1416 789"))
    prints::
        [['123'], ['3.1416'], ['789']]
    """
    def __init__( self, exprs, savelist = False ):
        super(Or,self).__init__(exprs, savelist)
        if self.exprs:
            self.mayReturnEmpty = any(e.mayReturnEmpty for e in self.exprs)
        else:
            self.mayReturnEmpty = True

    def parseImpl( self, instring, loc, doActions=True ):
        maxExcLoc = -1
        maxException = None
        matches = []
        for e in self.exprs:
            try:
                loc2 = e.tryParse( instring, loc )
            except ParseException as err:
                err.__traceback__ = None
                if err.loc > maxExcLoc:
                    maxException = err
                    maxExcLoc = err.loc
            except IndexError:
                if len(instring) > maxExcLoc:
                    maxException = ParseException(instring,len(instring),e.errmsg,self)
                    maxExcLoc = len(instring)
            else:
                # save match among all matches, to retry longest to shortest
                matches.append((loc2, e))

        if matches:
            matches.sort(key=lambda x: -x[0])
            for _,e in matches:
                try:
                    return e._parse( instring, loc, doActions )
                except ParseException as err:
                    err.__traceback__ = None
                    if err.loc > maxExcLoc:
                        maxException = err
                        maxExcLoc = err.loc

        if maxException is not None:
            maxException.msg = self.errmsg
            raise maxException
        else:
            raise ParseException(instring, loc, "no defined alternatives to match", self)


    def __ixor__(self, other ):
        if isinstance( other, basestring ):
            other = ParserElement._literalStringClass( other )
        return self.append( other ) #Or( [ self, other ] )

    def __str__( self ):
        if hasattr(self,"name"):
            return self.name

        if self.strRepr is None:
            self.strRepr = "{" + " ^ ".join(_ustr(e) for e in self.exprs) + "}"

        return self.strRepr

    def checkRecursion( self, parseElementList ):
        subRecCheckList = parseElementList[:] + [ self ]
        for e in self.exprs:
            e.checkRecursion( subRecCheckList )


class MatchFirst(ParseExpression):
    """
    Requires that at least one C{ParseExpression} is found.
    If two expressions match, the first one listed is the one that will match.
    May be constructed using the C{'|'} operator.

    Example::
        # construct MatchFirst using '|' operator
        
        # watch the order of expressions to match
        number = Word(nums) | Combine(Word(nums) + '.' + Word(nums))
        print(number.searchString("123 3.1416 789")) #  Fail! -> [['123'], ['3'], ['1416'], ['789']]

        # put more selective expression first
        number = Combine(Word(nums) + '.' + Word(nums)) | Word(nums)
        print(number.searchString("123 3.1416 789")) #  Better -> [['123'], ['3.1416'], ['789']]
    """
    def __init__( self, exprs, savelist = False ):
        super(MatchFirst,self).__init__(exprs, savelist)
        if self.exprs:
            self.mayReturnEmpty = any(e.mayReturnEmpty for e in self.exprs)
        else:
            self.mayReturnEmpty = True

    def parseImpl( self, instring, loc, doActions=True ):
        maxExcLoc = -1
        maxException = None
        for e in self.exprs:
            try:
                ret = e._parse( instring, loc, doActions )
                return ret
            except ParseException as err:
                if err.loc > maxExcLoc:
                    maxException = err
                    maxExcLoc = err.loc
            except IndexError:
                if len(instring) > maxExcLoc:
                    maxException = ParseException(instring,len(instring),e.errmsg,self)
                    maxExcLoc = len(instring)

        # only got here if no expression matched, raise exception for match that made it the furthest
        else:
            if maxException is not None:
                maxException.msg = self.errmsg
                raise maxException
            else:
                raise ParseException(instring, loc, "no defined alternatives to match", self)

    def __ior__(self, other ):
        if isinstance( other, basestring ):
            other = ParserElement._literalStringClass( other )
        return self.append( other ) #MatchFirst( [ self, other ] )

    def __str__( self ):
        if hasattr(self,"name"):
            return self.name

        if self.strRepr is None:
            self.strRepr = "{" + " | ".join(_ustr(e) for e in self.exprs) + "}"

        return self.strRepr

    def checkRecursion( self, parseElementList ):
        subRecCheckList = parseElementList[:] + [ self ]
        for e in self.exprs:
            e.checkRecursion( subRecCheckList )


class Each(ParseExpression):
    """
    Requires all given C{ParseExpression}s to be found, but in any order.
    Expressions may be separated by whitespace.
    May be constructed using the C{'&'} operator.

    Example::
        color = oneOf("RED ORANGE YELLOW GREEN BLUE PURPLE BLACK WHITE BROWN")
        shape_type = oneOf("SQUARE CIRCLE TRIANGLE STAR HEXAGON OCTAGON")
        integer = Word(nums)
        shape_attr = "shape:" + shape_type("shape")
        posn_attr = "posn:" + Group(integer("x") + ',' + integer("y"))("posn")
        color_attr = "color:" + color("color")
        size_attr = "size:" + integer("size")

        # use Each (using operator '&') to accept attributes in any order 
        # (shape and posn are required, color and size are optional)
        shape_spec = shape_attr & posn_attr & Optional(color_attr) & Optional(size_attr)

        shape_spec.runTests('''
            shape: SQUARE color: BLACK posn: 100, 120
            shape: CIRCLE size: 50 color: BLUE posn: 50,80
            color:GREEN size:20 shape:TRIANGLE posn:20,40
            '''
            )
    prints::
        shape: SQUARE color: BLACK posn: 100, 120
        ['shape:', 'SQUARE', 'color:', 'BLACK', 'posn:', ['100', ',', '120']]
        - color: BLACK
        - posn: ['100', ',', '120']
          - x: 100
          - y: 120
        - shape: SQUARE


        shape: CIRCLE size: 50 color: BLUE posn: 50,80
        ['shape:', 'CIRCLE', 'size:', '50', 'color:', 'BLUE', 'posn:', ['50', ',', '80']]
        - color: BLUE
        - posn: ['50', ',', '80']
          - x: 50
          - y: 80
        - shape: CIRCLE
        - size: 50


        color: GREEN size: 20 shape: TRIANGLE posn: 20,40
        ['color:', 'GREEN', 'size:', '20', 'shape:', 'TRIANGLE', 'posn:', ['20', ',', '40']]
        - color: GREEN
        - posn: ['20', ',', '40']
          - x: 20
          - y: 40
        - shape: TRIANGLE
        - size: 20
    """
    def __init__( self, exprs, savelist = True ):
        super(Each,self).__init__(exprs, savelist)
        self.mayReturnEmpty = all(e.mayReturnEmpty for e in self.exprs)
        self.skipWhitespace = True
        self.initExprGroups = True

    def parseImpl( self, instring, loc, doActions=True ):
        if self.initExprGroups:
            self.opt1map = dict((id(e.expr),e) for e in self.exprs if isinstance(e,Optional))
            opt1 = [ e.expr for e in self.exprs if isinstance(e,Optional) ]
            opt2 = [ e for e in self.exprs if e.mayReturnEmpty and not isinstance(e,Optional)]
            self.optionals = opt1 + opt2
            self.multioptionals = [ e.expr for e in self.exprs if isinstance(e,ZeroOrMore) ]
            self.multirequired = [ e.expr for e in self.exprs if isinstance(e,OneOrMore) ]
            self.required = [ e for e in self.exprs if not isinstance(e,(Optional,ZeroOrMore,OneOrMore)) ]
            self.required += self.multirequired
            self.initExprGroups = False
        tmpLoc = loc
        tmpReqd = self.required[:]
        tmpOpt  = self.optionals[:]
        matchOrder = []

        keepMatching = True
        while keepMatching:
            tmpExprs = tmpReqd + tmpOpt + self.multioptionals + self.multirequired
            failed = []
            for e in tmpExprs:
                try:
                    tmpLoc = e.tryParse( instring, tmpLoc )
                except ParseException:
                    failed.append(e)
                else:
                    matchOrder.append(self.opt1map.get(id(e),e))
                    if e in tmpReqd:
                        tmpReqd.remove(e)
                    elif e in tmpOpt:
                        tmpOpt.remove(e)
            if len(failed) == len(tmpExprs):
                keepMatching = False

        if tmpReqd:
            missing = ", ".join(_ustr(e) for e in tmpReqd)
            raise ParseException(instring,loc,"Missing one or more required elements (%s)" % missing )

        # add any unmatched Optionals, in case they have default values defined
        matchOrder += [e for e in self.exprs if isinstance(e,Optional) and e.expr in tmpOpt]

        resultlist = []
        for e in matchOrder:
            loc,results = e._parse(instring,loc,doActions)
            resultlist.append(results)

        finalResults = sum(resultlist, ParseResults([]))
        return loc, finalResults

    def __str__( self ):
        if hasattr(self,"name"):
            return self.name

        if self.strRepr is None:
            self.strRepr = "{" + " & ".join(_ustr(e) for e in self.exprs) + "}"

        return self.strRepr

    def checkRecursion( self, parseElementList ):
        subRecCheckList = parseElementList[:] + [ self ]
        for e in self.exprs:
            e.checkRecursion( subRecCheckList )


class ParseElementEnhance(ParserElement):
    """
    Abstract subclass of C{ParserElement}, for combining and post-processing parsed tokens.
    """
    def __init__( self, expr, savelist=False ):
        super(ParseElementEnhance,self).__init__(savelist)
        if isinstance( expr, basestring ):
            if issubclass(ParserElement._literalStringClass, Token):
                expr = ParserElement._literalStringClass(expr)
            else:
                expr = ParserElement._literalStringClass(Literal(expr))
        self.expr = expr
        self.strRepr = None
        if expr is not None:
            self.mayIndexError = expr.mayIndexError
            self.mayReturnEmpty = expr.mayReturnEmpty
            self.setWhitespaceChars( expr.whiteChars )
            self.skipWhitespace = expr.skipWhitespace
            self.saveAsList = expr.saveAsList
            self.callPreparse = expr.callPreparse
            self.ignoreExprs.extend(expr.ignoreExprs)

    def parseImpl( self, instring, loc, doActions=True ):
        if self.expr is not None:
            return self.expr._parse( instring, loc, doActions, callPreParse=False )
        else:
            raise ParseException("",loc,self.errmsg,self)

    def leaveWhitespace( self ):
        self.skipWhitespace = False
        if self.expr is not None:
            # copy expr before propagating, so shared copies are unaffected
            self.expr = self.expr.copy()
            self.expr.leaveWhitespace()
        return self

    def ignore( self, other ):
        if isinstance( other, Suppress ):
            if other not in self.ignoreExprs:
                super( ParseElementEnhance, self).ignore( other )
                if self.expr is not None:
                    self.expr.ignore( self.ignoreExprs[-1] )
        else:
            super( ParseElementEnhance, self).ignore( other )
            if self.expr is not None:
                self.expr.ignore( self.ignoreExprs[-1] )
        return self

    def streamline( self ):
        super(ParseElementEnhance,self).streamline()
        if self.expr is not None:
            self.expr.streamline()
        return self

    def checkRecursion( self, parseElementList ):
        if self in parseElementList:
            raise RecursiveGrammarException( parseElementList+[self] )
        subRecCheckList = parseElementList[:] + [ self ]
        if self.expr is not None:
            self.expr.checkRecursion( subRecCheckList )

    def validate( self, validateTrace=[] ):
        tmp = validateTrace[:]+[self]
        if self.expr is not None:
            self.expr.validate(tmp)
        self.checkRecursion( [] )

    def __str__( self ):
        try:
            return super(ParseElementEnhance,self).__str__()
        except Exception:
            pass

        if self.strRepr is None and self.expr is not None:
            self.strRepr = "%s:(%s)" % ( self.__class__.__name__, _ustr(self.expr) )
        return self.strRepr


class FollowedBy(ParseElementEnhance):
    """
    Lookahead matching of the given parse expression.  C{FollowedBy}
    does I{not} advance the parsing position within the input string, it only
    verifies that the specified parse expression matches at the current
    position.  C{FollowedBy} always returns a null token list.

    Example::
        # use FollowedBy to match a label only if it is followed by a ':'
        data_word = Word(alphas)
        label = data_word + FollowedBy(':')
        attr_expr = Group(label + Suppress(':') + OneOrMore(data_word, stopOn=label).setParseAction(' '.join))
        
        OneOrMore(attr_expr).parseString("shape: SQUARE color: BLACK posn: upper left").pprint()
    prints::
        [['shape', 'SQUARE'], ['color', 'BLACK'], ['posn', 'upper left']]
    """
    def __init__( self, expr ):
        super(FollowedBy,self).__init__(expr)
        self.mayReturnEmpty = True

    def parseImpl( self, instring, loc, doActions=True ):
        self.expr.tryParse( instring, loc )
        return loc, []


class NotAny(ParseElementEnhance):
    """
    Lookahead to disallow matching with the given parse expression.  C{NotAny}
    does I{not} advance the parsing position within the input string, it only
    verifies that the specified parse expression does I{not} match at the current
    position.  Also, C{NotAny} does I{not} skip over leading whitespace. C{NotAny}
    always returns a null token list.  May be constructed using the '~' operator.

    Example::
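        # illustrative sketch: use '~' (NotAny) to keep reserved words
        # from being matched as ordinary identifiers
        AND, OR, NOT = map(CaselessKeyword, "AND OR NOT".split())
        ident = ~(AND | OR | NOT) + Word(alphas)
        boolean_term = Optional(NOT) + ident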
        
    """
    def __init__( self, expr ):
        super(NotAny,self).__init__(expr)
        #~ self.leaveWhitespace()
        self.skipWhitespace = False  # do NOT use self.leaveWhitespace(), don't want to propagate to exprs
        self.mayReturnEmpty = True
        self.errmsg = "Found unwanted token, "+_ustr(self.expr)

    def parseImpl( self, instring, loc, doActions=True ):
        if self.expr.canParseNext(instring, loc):
            raise ParseException(instring, loc, self.errmsg, self)
        return loc, []

    def __str__( self ):
        if hasattr(self,"name"):
            return self.name

        if self.strRepr is None:
            self.strRepr = "~{" + _ustr(self.expr) + "}"

        return self.strRepr

class _MultipleMatch(ParseElementEnhance):
    def __init__( self, expr, stopOn=None):
        super(_MultipleMatch, self).__init__(expr)
        self.saveAsList = True
        ender = stopOn
        if isinstance(ender, basestring):
            ender = ParserElement._literalStringClass(ender)
        self.not_ender = ~ender if ender is not None else None

    def parseImpl( self, instring, loc, doActions=True ):
        self_expr_parse = self.expr._parse
        self_skip_ignorables = self._skipIgnorables
        check_ender = self.not_ender is not None
        if check_ender:
            try_not_ender = self.not_ender.tryParse
        
        # must be at least one (but first see if we are the stopOn sentinel;
        # if so, fail)
        if check_ender:
            try_not_ender(instring, loc)
        loc, tokens = self_expr_parse( instring, loc, doActions, callPreParse=False )
        try:
            hasIgnoreExprs = (not not self.ignoreExprs)
            while 1:
                if check_ender:
                    try_not_ender(instring, loc)
                if hasIgnoreExprs:
                    preloc = self_skip_ignorables( instring, loc )
                else:
                    preloc = loc
                loc, tmptokens = self_expr_parse( instring, preloc, doActions )
                if tmptokens or tmptokens.haskeys():
                    tokens += tmptokens
        except (ParseException,IndexError):
            pass

        return loc, tokens
        
class OneOrMore(_MultipleMatch):
    """
    Repetition of one or more of the given expression.
    
    Parameters:
     - expr - expression that must match one or more times
     - stopOn - (default=C{None}) - expression for a terminating sentinel
          (only required if the sentinel would ordinarily match the repetition 
          expression)          

    Example::
        data_word = Word(alphas)
        label = data_word + FollowedBy(':')
        attr_expr = Group(label + Suppress(':') + OneOrMore(data_word).setParseAction(' '.join))

        text = "shape: SQUARE posn: upper left color: BLACK"
        OneOrMore(attr_expr).parseString(text).pprint()  # Fail! read 'color' as data instead of next label -> [['shape', 'SQUARE color']]

        # use stopOn attribute for OneOrMore to avoid reading label string as part of the data
        attr_expr = Group(label + Suppress(':') + OneOrMore(data_word, stopOn=label).setParseAction(' '.join))
        OneOrMore(attr_expr).parseString(text).pprint() # Better -> [['shape', 'SQUARE'], ['posn', 'upper left'], ['color', 'BLACK']]
        
        # could also be written as
        (attr_expr * (1,)).parseString(text).pprint()
    """

    def __str__( self ):
        if hasattr(self,"name"):
            return self.name

        if self.strRepr is None:
            self.strRepr = "{" + _ustr(self.expr) + "}..."

        return self.strRepr

class ZeroOrMore(_MultipleMatch):
    """
    Optional repetition of zero or more of the given expression.
    
    Parameters:
     - expr - expression that must match zero or more times
     - stopOn - (default=C{None}) - expression for a terminating sentinel
          (only required if the sentinel would ordinarily match the repetition 
          expression)          

    Example: similar to L{OneOrMore}
    """
    def __init__( self, expr, stopOn=None):
        super(ZeroOrMore,self).__init__(expr, stopOn=stopOn)
        self.mayReturnEmpty = True
        
    def parseImpl( self, instring, loc, doActions=True ):
        try:
            return super(ZeroOrMore, self).parseImpl(instring, loc, doActions)
        except (ParseException,IndexError):
            return loc, []

    def __str__( self ):
        if hasattr(self,"name"):
            return self.name

        if self.strRepr is None:
            self.strRepr = "[" + _ustr(self.expr) + "]..."

        return self.strRepr

class _NullToken(object):
    def __bool__(self):
        return False
    __nonzero__ = __bool__
    def __str__(self):
        return ""

_optionalNotMatched = _NullToken()
class Optional(ParseElementEnhance):
    """
    Optional matching of the given expression.

    Parameters:
     - expr - expression that may or may not match (it is matched at most once)
     - default (optional) - value to be returned if the optional expression is not found.

    Example::
        # US postal code can be a 5-digit zip, plus optional 4-digit qualifier
        zip = Combine(Word(nums, exact=5) + Optional('-' + Word(nums, exact=4)))
        zip.runTests('''
            # traditional ZIP code
            12345
            
            # ZIP+4 form
            12101-0001
            
            # invalid ZIP
            98765-
            ''')
    prints::
        # traditional ZIP code
        12345
        ['12345']

        # ZIP+4 form
        12101-0001
        ['12101-0001']

        # invalid ZIP
        98765-
             ^
        FAIL: Expected end of text (at char 5), (line:1, col:6)
    """
    def __init__( self, expr, default=_optionalNotMatched ):
        super(Optional,self).__init__( expr, savelist=False )
        self.saveAsList = self.expr.saveAsList
        self.defaultValue = default
        self.mayReturnEmpty = True

    def parseImpl( self, instring, loc, doActions=True ):
        try:
            loc, tokens = self.expr._parse( instring, loc, doActions, callPreParse=False )
        except (ParseException,IndexError):
            if self.defaultValue is not _optionalNotMatched:
                if self.expr.resultsName:
                    tokens = ParseResults([ self.defaultValue ])
                    tokens[self.expr.resultsName] = self.defaultValue
                else:
                    tokens = [ self.defaultValue ]
            else:
                tokens = []
        return loc, tokens

    def __str__( self ):
        if hasattr(self,"name"):
            return self.name

        if self.strRepr is None:
            self.strRepr = "[" + _ustr(self.expr) + "]"

        return self.strRepr

class SkipTo(ParseElementEnhance):
    """
    Token for skipping over all undefined text until the matched expression is found.

    Parameters:
     - expr - target expression marking the end of the data to be skipped
     - include - (default=C{False}) if True, the target expression is also parsed 
          (the skipped text and target expression are returned as a 2-element list).
     - ignore - (default=C{None}) used to define grammars (typically quoted strings and 
          comments) that might contain false matches to the target expression
     - failOn - (default=C{None}) define expressions that are not allowed to be 
          included in the skipped text; if found before the target expression is found, 
          the SkipTo is not a match

    Example::
        report = '''
            Outstanding Issues Report - 1 Jan 2000

               # | Severity | Description                               |  Days Open
            -----+----------+-------------------------------------------+-----------
             101 | Critical | Intermittent system crash                 |          6
              94 | Cosmetic | Spelling error on Login ('log|n')         |         14
              79 | Minor    | System slow when running too many reports |         47
            '''
        integer = Word(nums)
        SEP = Suppress('|')
        # use SkipTo to simply match everything up until the next SEP
        # - ignore quoted strings, so that a '|' character inside a quoted string does not match
        # - parse action will call token.strip() for each matched token, i.e., the description body
        string_data = SkipTo(SEP, ignore=quotedString)
        string_data.setParseAction(tokenMap(str.strip))
        ticket_expr = (integer("issue_num") + SEP 
                      + string_data("sev") + SEP 
                      + string_data("desc") + SEP 
                      + integer("days_open"))
        
        for tkt in ticket_expr.searchString(report):
            print tkt.dump()
    prints::
        ['101', 'Critical', 'Intermittent system crash', '6']
        - days_open: 6
        - desc: Intermittent system crash
        - issue_num: 101
        - sev: Critical
        ['94', 'Cosmetic', "Spelling error on Login ('log|n')", '14']
        - days_open: 14
        - desc: Spelling error on Login ('log|n')
        - issue_num: 94
        - sev: Cosmetic
        ['79', 'Minor', 'System slow when running too many reports', '47']
        - days_open: 47
        - desc: System slow when running too many reports
        - issue_num: 79
        - sev: Minor
    """
    def __init__( self, other, include=False, ignore=None, failOn=None ):
        super( SkipTo, self ).__init__( other )
        self.ignoreExpr = ignore
        self.mayReturnEmpty = True
        self.mayIndexError = False
        self.includeMatch = include
        self.asList = False
        if isinstance(failOn, basestring):
            self.failOn = ParserElement._literalStringClass(failOn)
        else:
            self.failOn = failOn
        self.errmsg = "No match found for "+_ustr(self.expr)

    def parseImpl( self, instring, loc, doActions=True ):
        startloc = loc
        instrlen = len(instring)
        expr = self.expr
        expr_parse = self.expr._parse
        self_failOn_canParseNext = self.failOn.canParseNext if self.failOn is not None else None
        self_ignoreExpr_tryParse = self.ignoreExpr.tryParse if self.ignoreExpr is not None else None
        
        tmploc = loc
        while tmploc <= instrlen:
            if self_failOn_canParseNext is not None:
                # break if failOn expression matches
                if self_failOn_canParseNext(instring, tmploc):
                    break
                    
            if self_ignoreExpr_tryParse is not None:
                # advance past ignore expressions
                while 1:
                    try:
                        tmploc = self_ignoreExpr_tryParse(instring, tmploc)
                    except ParseBaseException:
                        break
            
            try:
                expr_parse(instring, tmploc, doActions=False, callPreParse=False)
            except (ParseException, IndexError):
                # no match, advance loc in string
                tmploc += 1
            else:
                # matched skipto expr, done
                break

        else:
            # ran off the end of the input string without matching skipto expr, fail
            raise ParseException(instring, loc, self.errmsg, self)

        # build up return values
        loc = tmploc
        skiptext = instring[startloc:loc]
        skipresult = ParseResults(skiptext)
        
        if self.includeMatch:
            loc, mat = expr_parse(instring,loc,doActions,callPreParse=False)
            skipresult += mat

        return loc, skipresult

class Forward(ParseElementEnhance):
    """
    Forward declaration of an expression to be defined later -
    used for recursive grammars, such as algebraic infix notation.
    When the expression is known, it is assigned to the C{Forward} variable using the '<<' operator.

    Note: take care when assigning to C{Forward} not to overlook precedence of operators.
    Specifically, '|' has a lower precedence than '<<', so that::
        fwdExpr << a | b | c
    will actually be evaluated as::
        (fwdExpr << a) | b | c
    thereby leaving b and c out as parseable alternatives.  It is recommended that you
    explicitly group the values inserted into the C{Forward}::
        fwdExpr << (a | b | c)
    Converting to use the '<<=' operator instead will avoid this problem.

    See L{ParseResults.pprint} for an example of a recursive parser created using
    C{Forward}.
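
    Example (illustrative sketch)::
        # a minimal recursive grammar: a word, or a parenthesized nested item;
        # note the use of '<<=' so the whole alternation is assigned
        nested = Forward()
        nested <<= Word(alphas) | Group(Suppress('(') + nested + Suppress(')'))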
    """
    def __init__( self, other=None ):
        super(Forward,self).__init__( other, savelist=False )

    def __lshift__( self, other ):
        if isinstance( other, basestring ):
            other = ParserElement._literalStringClass(other)
        self.expr = other
        self.strRepr = None
        self.mayIndexError = self.expr.mayIndexError
        self.mayReturnEmpty = self.expr.mayReturnEmpty
        self.setWhitespaceChars( self.expr.whiteChars )
        self.skipWhitespace = self.expr.skipWhitespace
        self.saveAsList = self.expr.saveAsList
        self.ignoreExprs.extend(self.expr.ignoreExprs)
        return self
        
    def __ilshift__(self, other):
        return self << other
    
    def leaveWhitespace( self ):
        self.skipWhitespace = False
        return self

    def streamline( self ):
        if not self.streamlined:
            self.streamlined = True
            if self.expr is not None:
                self.expr.streamline()
        return self

    def validate( self, validateTrace=[] ):
        if self not in validateTrace:
            tmp = validateTrace[:]+[self]
            if self.expr is not None:
                self.expr.validate(tmp)
        self.checkRecursion([])

    def __str__( self ):
        if hasattr(self,"name"):
            return self.name
        return self.__class__.__name__ + ": ..."

        # stubbed out for now - creates awful memory and perf issues
        self._revertClass = self.__class__
        self.__class__ = _ForwardNoRecurse
        try:
            if self.expr is not None:
                retString = _ustr(self.expr)
            else:
                retString = "None"
        finally:
            self.__class__ = self._revertClass
        return self.__class__.__name__ + ": " + retString

    def copy(self):
        if self.expr is not None:
            return super(Forward,self).copy()
        else:
            ret = Forward()
            ret <<= self
            return ret

class _ForwardNoRecurse(Forward):
    def __str__( self ):
        return "..."

class TokenConverter(ParseElementEnhance):
    """
    Abstract subclass of C{ParseElementEnhance}, for converting parsed results.
    """
    def __init__( self, expr, savelist=False ):
        super(TokenConverter,self).__init__( expr )#, savelist )
        self.saveAsList = False

class Combine(TokenConverter):
    """
    Converter to concatenate all matching tokens to a single string.
    By default, the matching patterns must also be contiguous in the input string;
    this can be disabled by specifying C{'adjacent=False'} in the constructor.

    Example::
        real = Word(nums) + '.' + Word(nums)
        print(real.parseString('3.1416')) # -> ['3', '.', '1416']
        # will also erroneously match the following
        print(real.parseString('3. 1416')) # -> ['3', '.', '1416']

        real = Combine(Word(nums) + '.' + Word(nums))
        print(real.parseString('3.1416')) # -> ['3.1416']
        # no match when there are internal spaces
        print(real.parseString('3. 1416')) # -> Exception: Expected W:(0123...)
    """
    def __init__( self, expr, joinString="", adjacent=True ):
        super(Combine,self).__init__( expr )
        # suppress whitespace-stripping in contained parse expressions, but re-enable it on the Combine itself
        if adjacent:
            self.leaveWhitespace()
        self.adjacent = adjacent
        self.skipWhitespace = True
        self.joinString = joinString
        self.callPreparse = True

    def ignore( self, other ):
        if self.adjacent:
            ParserElement.ignore(self, other)
        else:
            super( Combine, self).ignore( other )
        return self

    def postParse( self, instring, loc, tokenlist ):
        retToks = tokenlist.copy()
        del retToks[:]
        retToks += ParseResults([ "".join(tokenlist._asStringList(self.joinString)) ], modal=self.modalResults)

        if self.resultsName and retToks.haskeys():
            return [ retToks ]
        else:
            return retToks

class Group(TokenConverter):
    """
    Converter to return the matched tokens as a list - useful for returning tokens of C{L{ZeroOrMore}} and C{L{OneOrMore}} expressions.

    Example::
        ident = Word(alphas)
        num = Word(nums)
        term = ident | num
        func = ident + Optional(delimitedList(term))
        print(func.parseString("fn a,b,100"))  # -> ['fn', 'a', 'b', '100']

        func = ident + Group(Optional(delimitedList(term)))
        print(func.parseString("fn a,b,100"))  # -> ['fn', ['a', 'b', '100']]
    """
    def __init__( self, expr ):
        super(Group,self).__init__( expr )
        self.saveAsList = True

    def postParse( self, instring, loc, tokenlist ):
        return [ tokenlist ]

class Dict(TokenConverter):
    """
    Converter to return a repetitive expression as a list, but also as a dictionary.
    Each element can also be referenced using the first token in the expression as its key.
    Useful for tabular report scraping when the first column can be used as an item key.

    Example::
        data_word = Word(alphas)
        label = data_word + FollowedBy(':')
        attr_expr = Group(label + Suppress(':') + OneOrMore(data_word).setParseAction(' '.join))

        text = "shape: SQUARE posn: upper left color: light blue texture: burlap"
        attr_expr = (label + Suppress(':') + OneOrMore(data_word, stopOn=label).setParseAction(' '.join))
        
        # print attributes as plain groups
        print(OneOrMore(attr_expr).parseString(text).dump())
        
        # instead of OneOrMore(expr), parse using Dict(OneOrMore(Group(expr))) - Dict will auto-assign names
        result = Dict(OneOrMore(Group(attr_expr))).parseString(text)
        print(result.dump())
        
        # access named fields as dict entries, or output as dict
        print(result['shape'])        
        print(result.asDict())
    prints::
        ['shape', 'SQUARE', 'posn', 'upper left', 'color', 'light blue', 'texture', 'burlap']

        [['shape', 'SQUARE'], ['posn', 'upper left'], ['color', 'light blue'], ['texture', 'burlap']]
        - color: light blue
        - posn: upper left
        - shape: SQUARE
        - texture: burlap
        SQUARE
        {'color': 'light blue', 'posn': 'upper left', 'texture': 'burlap', 'shape': 'SQUARE'}
    See more examples at L{ParseResults} of accessing fields by results name.
    """
    def __init__( self, expr ):
        super(Dict,self).__init__( expr )
        self.saveAsList = True

    def postParse( self, instring, loc, tokenlist ):
        for i,tok in enumerate(tokenlist):
            if len(tok) == 0:
                continue
            ikey = tok[0]
            if isinstance(ikey,int):
                ikey = _ustr(tok[0]).strip()
            if len(tok)==1:
                tokenlist[ikey] = _ParseResultsWithOffset("",i)
            elif len(tok)==2 and not isinstance(tok[1],ParseResults):
                tokenlist[ikey] = _ParseResultsWithOffset(tok[1],i)
            else:
                dictvalue = tok.copy() #ParseResults(i)
                del dictvalue[0]
                if len(dictvalue)!= 1 or (isinstance(dictvalue,ParseResults) and dictvalue.haskeys()):
                    tokenlist[ikey] = _ParseResultsWithOffset(dictvalue,i)
                else:
                    tokenlist[ikey] = _ParseResultsWithOffset(dictvalue[0],i)

        if self.resultsName:
            return [ tokenlist ]
        else:
            return tokenlist


class Suppress(TokenConverter):
    """
    Converter for ignoring the results of a parsed expression.

    Example::
        source = "a, b, c,d"
        wd = Word(alphas)
        wd_list1 = wd + ZeroOrMore(',' + wd)
        print(wd_list1.parseString(source))

        # often, delimiters that are useful during parsing are just in the
        # way afterward - use Suppress to keep them out of the parsed output
        wd_list2 = wd + ZeroOrMore(Suppress(',') + wd)
        print(wd_list2.parseString(source))
    prints::
        ['a', ',', 'b', ',', 'c', ',', 'd']
        ['a', 'b', 'c', 'd']
    (See also L{delimitedList}.)
    """
    def postParse( self, instring, loc, tokenlist ):
        return []

    def suppress( self ):
        return self


class OnlyOnce(object):
    """
    Wrapper for parse actions, to ensure they are only called once.
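
    Example (illustrative sketch)::
        # the wrapped action runs on the first match only; later matches
        # raise ParseException until reset() is called
        def announce(s, l, t):
            print("first match at char %d" % l)
        wd = Word(alphas).setParseAction(OnlyOnce(announce))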
    """
    def __init__(self, methodCall):
        self.callable = _trim_arity(methodCall)
        self.called = False
    def __call__(self,s,l,t):
        if not self.called:
            results = self.callable(s,l,t)
            self.called = True
            return results
        raise ParseException(s,l,"")
    def reset(self):
        self.called = False

def traceParseAction(f):
    """
    Decorator for debugging parse actions. 
    
    When the parse action is called, this decorator will print C{">> entering I{method-name}(line:I{current_source_line}, I{parse_location}, I{matched_tokens})".}
    When the parse action completes, the decorator will print C{"<<"} followed by the returned value, or any exception that the parse action raised.

    Example::
        wd = Word(alphas)

        @traceParseAction
        def remove_duplicate_chars(tokens):
            return ''.join(sorted(set(''.join(tokens))))

        wds = OneOrMore(wd).setParseAction(remove_duplicate_chars)
        print(wds.parseString("slkdjs sld sldd sdlf sdljf"))
    prints::
        >>entering remove_duplicate_chars(line: 'slkdjs sld sldd sdlf sdljf', 0, (['slkdjs', 'sld', 'sldd', 'sdlf', 'sdljf'], {}))
        <<leaving remove_duplicate_chars (ret: 'dfjkls')
        ['dfjkls']
    """
    f = _trim_arity(f)
    def z(*paArgs):
        thisFunc = f.__name__
        s,l,t = paArgs[-3:]
        if len(paArgs)>3:
            thisFunc = paArgs[0].__class__.__name__ + '.' + thisFunc
        sys.stderr.write( ">>entering %s(line: '%s', %d, %r)\n" % (thisFunc,line(l,s),l,t) )
        try:
            ret = f(*paArgs)
        except Exception as exc:
            sys.stderr.write( "<<leaving %s (exception: %s)\n" % (thisFunc,exc) )
            raise
        sys.stderr.write( "<<leaving %s (ret: %r)\n" % (thisFunc,ret) )
        return ret
    try:
        z.__name__ = f.__name__
    except AttributeError:
        pass
    return z

#
# global helpers
#
def delimitedList( expr, delim=",", combine=False ):
    """
    Helper to define a delimited list of expressions - the delimiter defaults to ','.
    By default, the list elements and delimiters can have intervening whitespace and
    comments, but this can be overridden by passing C{combine=True} in the constructor.
    If C{combine} is set to C{True}, the matching tokens are returned as a single token
    string, with the delimiters included; otherwise, the matching tokens are returned
    as a list of tokens, with the delimiters suppressed.

    Example::
        delimitedList(Word(alphas)).parseString("aa,bb,cc") # -> ['aa', 'bb', 'cc']
        delimitedList(Word(hexnums), delim=':', combine=True).parseString("AA:BB:CC:DD:EE") # -> ['AA:BB:CC:DD:EE']
    """
    dlName = _ustr(expr)+" ["+_ustr(delim)+" "+_ustr(expr)+"]..."
    if combine:
        return Combine( expr + ZeroOrMore( delim + expr ) ).setName(dlName)
    else:
        return ( expr + ZeroOrMore( Suppress( delim ) + expr ) ).setName(dlName)

def countedArray( expr, intExpr=None ):
    """
    Helper to define a counted list of expressions.
    This helper defines a pattern of the form::
        integer expr expr expr...
    where the leading integer tells how many expr expressions follow.
    The matched tokens are returned as a list of expr tokens; the leading count token is suppressed.
    
    If C{intExpr} is specified, it should be a pyparsing expression that produces an integer value.

    Example::
        countedArray(Word(alphas)).parseString('2 ab cd ef')  # -> ['ab', 'cd']

        # in this parser, the leading integer value is given in binary,
        # '10' indicating that 2 values are in the array
        binaryConstant = Word('01').setParseAction(lambda t: int(t[0], 2))
        countedArray(Word(alphas), intExpr=binaryConstant).parseString('10 ab cd ef')  # -> ['ab', 'cd']
    """
    arrayExpr = Forward()
    def countFieldParseAction(s,l,t):
        n = t[0]
        arrayExpr << (n and Group(And([expr]*n)) or Group(empty))
        return []
    if intExpr is None:
        intExpr = Word(nums).setParseAction(lambda t:int(t[0]))
    else:
        intExpr = intExpr.copy()
    intExpr.setName("arrayLen")
    intExpr.addParseAction(countFieldParseAction, callDuringTry=True)
    return ( intExpr + arrayExpr ).setName('(len) ' + _ustr(expr) + '...')

def _flatten(L):
    ret = []
    for i in L:
        if isinstance(i,list):
            ret.extend(_flatten(i))
        else:
            ret.append(i)
    return ret

def matchPreviousLiteral(expr):
    """
    Helper to define an expression that is indirectly defined from
    the tokens matched in a previous expression, that is, it looks
    for a 'repeat' of a previous expression.  For example::
        first = Word(nums)
        second = matchPreviousLiteral(first)
        matchExpr = first + ":" + second
    will match C{"1:1"}, but not C{"1:2"}.  Because this matches a
    previous literal, it will also match the leading C{"1:1"} in C{"1:10"}.
    If this is not desired, use C{matchPreviousExpr}.
    Do I{not} use with packrat parsing enabled.
    """
    rep = Forward()
    def copyTokenToRepeater(s,l,t):
        if t:
            if len(t) == 1:
                rep << t[0]
            else:
                # flatten t tokens
                tflat = _flatten(t.asList())
                rep << And(Literal(tt) for tt in tflat)
        else:
            rep << Empty()
    expr.addParseAction(copyTokenToRepeater, callDuringTry=True)
    rep.setName('(prev) ' + _ustr(expr))
    return rep

def matchPreviousExpr(expr):
    """
    Helper to define an expression that is indirectly defined from
    the tokens matched in a previous expression, that is, it looks
    for a 'repeat' of a previous expression.  For example::
        first = Word(nums)
        second = matchPreviousExpr(first)
        matchExpr = first + ":" + second
    will match C{"1:1"}, but not C{"1:2"}.  Because this matches by
    expressions, it will I{not} match the leading C{"1:1"} in C{"1:10"};
    the expressions are evaluated first, and then compared, so
    C{"1"} is compared with C{"10"}.
    Do I{not} use with packrat parsing enabled.
    """
    rep = Forward()
    e2 = expr.copy()
    rep <<= e2
    def copyTokenToRepeater(s,l,t):
        matchTokens = _flatten(t.asList())
        def mustMatchTheseTokens(s,l,t):
            theseTokens = _flatten(t.asList())
            if  theseTokens != matchTokens:
                raise ParseException("",0,"")
        rep.setParseAction( mustMatchTheseTokens, callDuringTry=True )
    expr.addParseAction(copyTokenToRepeater, callDuringTry=True)
    rep.setName('(prev) ' + _ustr(expr))
    return rep

def _escapeRegexRangeChars(s):
    #~  escape these chars: ^-]
    for c in r"\^-]":
        s = s.replace(c,_bslash+c)
    s = s.replace("\n",r"\n")
    s = s.replace("\t",r"\t")
    return _ustr(s)

def oneOf( strs, caseless=False, useRegex=True ):
    """
    Helper to quickly define a set of alternative Literals, and makes sure to do
    longest-first testing when there is a conflict, regardless of the input order,
    but returns a C{L{MatchFirst}} for best performance.

    Parameters:
     - strs - a string of space-delimited literals, or a collection of string literals
     - caseless - (default=C{False}) - treat all literals as caseless
     - useRegex - (default=C{True}) - as an optimization, will generate a Regex
          object; otherwise, will generate a C{MatchFirst} object (if C{caseless=True}, or
          if creating a C{Regex} raises an exception)

    Example::
        comp_oper = oneOf("< = > <= >= !=")
        var = Word(alphas)
        number = Word(nums)
        term = var | number
        comparison_expr = term + comp_oper + term
        print(comparison_expr.searchString("B = 12  AA=23 B<=AA AA>12"))
    prints::
        [['B', '=', '12'], ['AA', '=', '23'], ['B', '<=', 'AA'], ['AA', '>', '12']]
    """
    if caseless:
        isequal = ( lambda a,b: a.upper() == b.upper() )
        masks = ( lambda a,b: b.upper().startswith(a.upper()) )
        parseElementClass = CaselessLiteral
    else:
        isequal = ( lambda a,b: a == b )
        masks = ( lambda a,b: b.startswith(a) )
        parseElementClass = Literal

    symbols = []
    if isinstance(strs,basestring):
        symbols = strs.split()
    elif isinstance(strs, collections.Iterable):
        symbols = list(strs)
    else:
        warnings.warn("Invalid argument to oneOf, expected string or iterable",
                SyntaxWarning, stacklevel=2)
    if not symbols:
        return NoMatch()

    i = 0
    while i < len(symbols)-1:
        cur = symbols[i]
        for j,other in enumerate(symbols[i+1:]):
            if ( isequal(other, cur) ):
                del symbols[i+j+1]
                break
            elif ( masks(cur, other) ):
                del symbols[i+j+1]
                symbols.insert(i,other)
                cur = other
                break
        else:
            i += 1

    if not caseless and useRegex:
        #~ print (strs,"->", "|".join( [ _escapeRegexChars(sym) for sym in symbols] ))
        try:
            if len(symbols)==len("".join(symbols)):
                return Regex( "[%s]" % "".join(_escapeRegexRangeChars(sym) for sym in symbols) ).setName(' | '.join(symbols))
            else:
                return Regex( "|".join(re.escape(sym) for sym in symbols) ).setName(' | '.join(symbols))
        except Exception:
            warnings.warn("Exception creating Regex for oneOf, building MatchFirst",
                    SyntaxWarning, stacklevel=2)


    # last resort, just use MatchFirst
    return MatchFirst(parseElementClass(sym) for sym in symbols).setName(' | '.join(symbols))

def dictOf( key, value ):
    """
    Helper to easily and clearly define a dictionary by specifying the respective patterns
    for the key and value.  Takes care of defining the C{L{Dict}}, C{L{ZeroOrMore}}, and C{L{Group}} tokens
    in the proper order.  The key pattern can include delimiting markers or punctuation,
    as long as they are suppressed, thereby leaving the significant key text.  The value
    pattern can include named results, so that the C{Dict} results can include named token
    fields.

    Example::
        text = "shape: SQUARE posn: upper left color: light blue texture: burlap"
        attr_expr = (label + Suppress(':') + OneOrMore(data_word, stopOn=label).setParseAction(' '.join))
        print(OneOrMore(attr_expr).parseString(text).dump())
        
        attr_label = label
        attr_value = Suppress(':') + OneOrMore(data_word, stopOn=label).setParseAction(' '.join)

        # similar to Dict, but simpler call format
        result = dictOf(attr_label, attr_value).parseString(text)
        print(result.dump())
        print(result['shape'])
        print(result.shape)  # object attribute access works too
        print(result.asDict())
    prints::
        [['shape', 'SQUARE'], ['posn', 'upper left'], ['color', 'light blue'], ['texture', 'burlap']]
        - color: light blue
        - posn: upper left
        - shape: SQUARE
        - texture: burlap
        SQUARE
        SQUARE
        {'color': 'light blue', 'shape': 'SQUARE', 'posn': 'upper left', 'texture': 'burlap'}
    """
    return Dict( ZeroOrMore( Group ( key + value ) ) )

def originalTextFor(expr, asString=True):
    """
    Helper to return the original, untokenized text for a given expression.  Useful to
    restore the parsed fields of an HTML start tag into the raw tag text itself, or to
    revert separate tokens with intervening whitespace back to the original matching
    input text. By default, returns a string containing the original parsed text.
       
    If the optional C{asString} argument is passed as C{False}, then the return value is a 
    C{L{ParseResults}} containing any results names that were originally matched, and a 
    single token containing the original matched text from the input string.  So if 
    the expression passed to C{L{originalTextFor}} contains expressions with defined
    results names, you must set C{asString} to C{False} if you want to preserve those
    results name values.

    Example::
        src = "this is test <b> bold <i>text</i> </b> normal text "
        for tag in ("b","i"):
            opener,closer = makeHTMLTags(tag)
            patt = originalTextFor(opener + SkipTo(closer) + closer)
            print(patt.searchString(src)[0])
    prints::
        ['<b> bold <i>text</i> </b>']
        ['<i>text</i>']
    """
    locMarker = Empty().setParseAction(lambda s,loc,t: loc)
    endlocMarker = locMarker.copy()
    endlocMarker.callPreparse = False
    matchExpr = locMarker("_original_start") + expr + endlocMarker("_original_end")
    if asString:
        extractText = lambda s,l,t: s[t._original_start:t._original_end]
    else:
        def extractText(s,l,t):
            t[:] = [s[t.pop('_original_start'):t.pop('_original_end')]]
    matchExpr.setParseAction(extractText)
    matchExpr.ignoreExprs = expr.ignoreExprs
    return matchExpr

def ungroup(expr): 
    """
    Helper to undo pyparsing's default grouping of And expressions, even
    if all but one are non-empty.
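
    Example (illustrative sketch)::
        grouped = Group(Word(alphas))
        print(ungroup(grouped).parseString('abc'))  # should print ['abc']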
    """
    return TokenConverter(expr).setParseAction(lambda t:t[0])
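
# A minimal sketch of ungroup() in use (illustrative only; the names below are
# not part of this module):
def _ungroup_example():
    grouped = Group(Word(nums) + Word(alphas))
    print(grouped.parseString("123 abc"))           # -> [['123', 'abc']]
    print(ungroup(grouped).parseString("123 abc"))  # -> ['123', 'abc']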

def locatedExpr(expr):
    """
    Helper to decorate a returned token with its starting and ending locations in the input string.
    This helper adds the following results names:
     - locn_start = location where matched expression begins
     - locn_end = location where matched expression ends
     - value = the actual parsed results

    Be careful if the input text contains C{<TAB>} characters; you may want to call
    C{L{ParserElement.parseWithTabs}}

    Example::
        wd = Word(alphas)
        for match in locatedExpr(wd).searchString("ljsdf123lksdjjf123lkkjj1222"):
            print(match)
    prints::
        [[0, 'ljsdf', 5]]
        [[8, 'lksdjjf', 15]]
        [[18, 'lkkjj', 23]]
    """
    locator = Empty().setParseAction(lambda s,l,t: l)
    return Group(locator("locn_start") + expr("value") + locator.copy().leaveWhitespace()("locn_end"))


# convenience constants for positional expressions
empty       = Empty().setName("empty")
lineStart   = LineStart().setName("lineStart")
lineEnd     = LineEnd().setName("lineEnd")
stringStart = StringStart().setName("stringStart")
stringEnd   = StringEnd().setName("stringEnd")

_escapedPunc = Word( _bslash, r"\[]-*.$+^?()~ ", exact=2 ).setParseAction(lambda s,l,t:t[0][1])
_escapedHexChar = Regex(r"\\0?[xX][0-9a-fA-F]+").setParseAction(lambda s,l,t:unichr(int(t[0].lstrip(r'\0x'),16)))
_escapedOctChar = Regex(r"\\0[0-7]+").setParseAction(lambda s,l,t:unichr(int(t[0][1:],8)))
_singleChar = _escapedPunc | _escapedHexChar | _escapedOctChar | Word(printables, excludeChars=r'\]', exact=1) | Regex(r"\w", re.UNICODE)
_charRange = Group(_singleChar + Suppress("-") + _singleChar)
_reBracketExpr = Literal("[") + Optional("^").setResultsName("negate") + Group( OneOrMore( _charRange | _singleChar ) ).setResultsName("body") + "]"

def srange(s):
    r"""
    Helper to easily define string ranges for use in Word construction.  Borrows
    syntax from regexp '[]' string range definitions::
        srange("[0-9]")   -> "0123456789"
        srange("[a-z]")   -> "abcdefghijklmnopqrstuvwxyz"
        srange("[a-z$_]") -> "abcdefghijklmnopqrstuvwxyz$_"
    The input string must be enclosed in []'s, and the returned string is the expanded
    character set joined into a single string.
    The values enclosed in the []'s may be:
     - a single character
     - an escaped character with a leading backslash (such as C{\-} or C{\]})
     - an escaped hex character with a leading C{'\x'} (C{\x21}, which is a C{'!'} character) 
         (C{\0x##} is also supported for backwards compatibility) 
     - an escaped octal character with a leading C{'\0'} (C{\041}, which is a C{'!'} character)
     - a range of any of the above, separated by a dash (C{'a-z'}, etc.)
     - any combination of the above (C{'aeiouy'}, C{'a-zA-Z0-9_$'}, etc.)
    """
    _expanded = lambda p: p if not isinstance(p,ParseResults) else ''.join(unichr(c) for c in range(ord(p[0]),ord(p[1])+1))
    try:
        return "".join(_expanded(part) for part in _reBracketExpr.parseString(s).body)
    except Exception:
        return ""

def matchOnlyAtCol(n):
    """
    Helper method for defining parse actions that require matching at a specific
    column in the input text.
    """
    def verifyCol(strg,locn,toks):
        if col(locn,strg) != n:
            raise ParseException(strg,locn,"matched token not at column %d" % n)
    return verifyCol
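
# A minimal sketch of matchOnlyAtCol() (illustrative only; the names below are
# not part of this module):
def _match_only_at_col_example():
    # accept a number only when it begins in column 9 of its line (columns are 1-based)
    col9_number = Word(nums).setParseAction(matchOnlyAtCol(9))
    print(col9_number.parseString("        42"))  # eight leading spaces -> ['42']
    # col9_number.parseString("42") raises ParseException ("matched token not at column 9")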

def replaceWith(replStr):
    """
    Helper method for common parse actions that simply return a literal value.  Especially
    useful when used with C{L{transformString<ParserElement.transformString>}()}.

    Example::
        num = Word(nums).setParseAction(lambda toks: int(toks[0]))
        na = oneOf("N/A NA").setParseAction(replaceWith(math.nan))
        term = na | num
        
        OneOrMore(term).parseString("324 234 N/A 234") # -> [324, 234, nan, 234]
    """
    return lambda s,l,t: [replStr]

def removeQuotes(s,l,t):
    """
    Helper parse action for removing quotation marks from parsed quoted strings.

    Example::
        # by default, quotation marks are included in parsed results
        quotedString.parseString("'Now is the Winter of our Discontent'") # -> ["'Now is the Winter of our Discontent'"]

        # use removeQuotes to strip quotation marks from parsed results
        quotedString.setParseAction(removeQuotes)
        quotedString.parseString("'Now is the Winter of our Discontent'") # -> ["Now is the Winter of our Discontent"]
    """
    return t[0][1:-1]

def tokenMap(func, *args):
    """
    Helper to define a parse action by mapping a function to all elements of a ParseResults list. If any additional 
    args are passed, they are forwarded to the given function as additional arguments after
    the token, as in C{hex_integer = Word(hexnums).setParseAction(tokenMap(int, 16))}, which will convert the
    parsed data to an integer using base 16.

    Example (compare the last example to the one in L{ParserElement.transformString})::
        hex_ints = OneOrMore(Word(hexnums)).setParseAction(tokenMap(int, 16))
        hex_ints.runTests('''
            00 11 22 aa FF 0a 0d 1a
            ''')
        
        upperword = Word(alphas).setParseAction(tokenMap(str.upper))
        OneOrMore(upperword).runTests('''
            my kingdom for a horse
            ''')

        wd = Word(alphas).setParseAction(tokenMap(str.title))
        OneOrMore(wd).setParseAction(' '.join).runTests('''
            now is the winter of our discontent made glorious summer by this sun of york
            ''')
    prints::
        00 11 22 aa FF 0a 0d 1a
        [0, 17, 34, 170, 255, 10, 13, 26]

        my kingdom for a horse
        ['MY', 'KINGDOM', 'FOR', 'A', 'HORSE']

        now is the winter of our discontent made glorious summer by this sun of york
        ['Now Is The Winter Of Our Discontent Made Glorious Summer By This Sun Of York']
    """
    def pa(s,l,t):
        return [func(tokn, *args) for tokn in t]

    try:
        func_name = getattr(func, '__name__', 
                            getattr(func, '__class__').__name__)
    except Exception:
        func_name = str(func)
    pa.__name__ = func_name

    return pa

upcaseTokens = tokenMap(lambda t: _ustr(t).upper())
"""(Deprecated) Helper parse action to convert tokens to upper case. Deprecated in favor of L{pyparsing_common.upcaseTokens}"""

downcaseTokens = tokenMap(lambda t: _ustr(t).lower())
"""(Deprecated) Helper parse action to convert tokens to lower case. Deprecated in favor of L{pyparsing_common.downcaseTokens}"""
    
def _makeTags(tagStr, xml):
    """Internal helper to construct opening and closing tag expressions, given a tag name"""
    if isinstance(tagStr,basestring):
        resname = tagStr
        tagStr = Keyword(tagStr, caseless=not xml)
    else:
        resname = tagStr.name

    tagAttrName = Word(alphas,alphanums+"_-:")
    if (xml):
        tagAttrValue = dblQuotedString.copy().setParseAction( removeQuotes )
        openTag = Suppress("<") + tagStr("tag") + \
                Dict(ZeroOrMore(Group( tagAttrName + Suppress("=") + tagAttrValue ))) + \
                Optional("/",default=[False]).setResultsName("empty").setParseAction(lambda s,l,t:t[0]=='/') + Suppress(">")
    else:
        printablesLessRAbrack = "".join(c for c in printables if c not in ">")
        tagAttrValue = quotedString.copy().setParseAction( removeQuotes ) | Word(printablesLessRAbrack)
        openTag = Suppress("<") + tagStr("tag") + \
                Dict(ZeroOrMore(Group( tagAttrName.setParseAction(downcaseTokens) + \
                Optional( Suppress("=") + tagAttrValue ) ))) + \
                Optional("/",default=[False]).setResultsName("empty").setParseAction(lambda s,l,t:t[0]=='/') + Suppress(">")
    closeTag = Combine(_L("</") + tagStr + ">")

    openTag = openTag.setResultsName("start"+"".join(resname.replace(":"," ").title().split())).setName("<%s>" % resname)
    closeTag = closeTag.setResultsName("end"+"".join(resname.replace(":"," ").title().split())).setName("</%s>" % resname)
    openTag.tag = resname
    closeTag.tag = resname
    return openTag, closeTag

def makeHTMLTags(tagStr):
    """
    Helper to construct opening and closing tag expressions for HTML, given a tag name. Matches
    tags in either upper or lower case, attributes with namespaces and with quoted or unquoted values.

    Example::
        text = '<td>More info at the <a href="http://pyparsing.wikispaces.com">pyparsing</a> wiki page</td>'
        # makeHTMLTags returns pyparsing expressions for the opening and closing tags as a 2-tuple
        a,a_end = makeHTMLTags("A")
        link_expr = a + SkipTo(a_end)("link_text") + a_end
        
        for link in link_expr.searchString(text):
            # attributes in the <A> tag (like "href" shown here) are also accessible as named results
            print(link.link_text, '->', link.href)
    prints::
        pyparsing -> http://pyparsing.wikispaces.com
    """
    return _makeTags( tagStr, False )

def makeXMLTags(tagStr):
    """
    Helper to construct opening and closing tag expressions for XML, given a tag name. Matches
    tags only in the given upper/lower case.

    Example: similar to L{makeHTMLTags}
    """
    return _makeTags( tagStr, True )

def withAttribute(*args,**attrDict):
    """
    Helper to create a validating parse action to be used with start tags created
    with C{L{makeXMLTags}} or C{L{makeHTMLTags}}. Use C{withAttribute} to qualify a starting tag
    with a required attribute value, to avoid false matches on common tags such as
    C{<TD>} or C{<DIV>}.

    Call C{withAttribute} with a series of attribute names and values. Specify the list
    of filter attributes names and values as:
     - keyword arguments, as in C{(align="right")}, or
     - as an explicit dict with C{**} operator, when an attribute name is also a Python
          reserved word, as in C{**{"class":"Customer", "align":"right"}}
     - a list of name-value tuples, as in ( ("ns1:class", "Customer"), ("ns2:align","right") )
    For attribute names with a namespace prefix, you must use the second form.  Attribute
    names are matched insensitive to upper/lower case.
       
    If just testing for C{class} (with or without a namespace), use C{L{withClass}}.

    To verify that the attribute exists, but without specifying a value, pass
    C{withAttribute.ANY_VALUE} as the value.

    Example::
        html = '''
            <div>
            Some text
            <div type="grid">1 4 0 1 0</div>
            <div type="graph">1,3 2,3 1,1</div>
            <div>this has no type</div>
            </div>
                
        '''
        div,div_end = makeHTMLTags("div")

        # only match div tag having a type attribute with value "grid"
        div_grid = div().setParseAction(withAttribute(type="grid"))
        grid_expr = div_grid + SkipTo(div | div_end)("body")
        for grid_header in grid_expr.searchString(html):
            print(grid_header.body)
        
        # construct a match with any div tag having a type attribute, regardless of the value
        div_any_type = div().setParseAction(withAttribute(type=withAttribute.ANY_VALUE))
        div_expr = div_any_type + SkipTo(div | div_end)("body")
        for div_header in div_expr.searchString(html):
            print(div_header.body)
    prints::
        1 4 0 1 0

        1 4 0 1 0
        1,3 2,3 1,1
    """
    if args:
        attrs = args[:]
    else:
        attrs = attrDict.items()
    attrs = [(k,v) for k,v in attrs]
    def pa(s,l,tokens):
        for attrName,attrValue in attrs:
            if attrName not in tokens:
                raise ParseException(s,l,"no matching attribute " + attrName)
            if attrValue != withAttribute.ANY_VALUE and tokens[attrName] != attrValue:
                raise ParseException(s,l,"attribute '%s' has value '%s', must be '%s'" %
                                            (attrName, tokens[attrName], attrValue))
    return pa
withAttribute.ANY_VALUE = object()

def withClass(classname, namespace=''):
    """
    Simplified version of C{L{withAttribute}} when matching on a div class - made
    difficult because C{class} is a reserved word in Python.

    Example::
        html = '''
            <div>
            Some text
            <div class="grid">1 4 0 1 0</div>
            <div class="graph">1,3 2,3 1,1</div>
            <div>this &lt;div&gt; has no class</div>
            </div>
                
        '''
        div,div_end = makeHTMLTags("div")
        div_grid = div().setParseAction(withClass("grid"))
        
        grid_expr = div_grid + SkipTo(div | div_end)("body")
        for grid_header in grid_expr.searchString(html):
            print(grid_header.body)
        
        div_any_type = div().setParseAction(withClass(withAttribute.ANY_VALUE))
        div_expr = div_any_type + SkipTo(div | div_end)("body")
        for div_header in div_expr.searchString(html):
            print(div_header.body)
    prints::
        1 4 0 1 0

        1 4 0 1 0
        1,3 2,3 1,1
    """
    classattr = "%s:class" % namespace if namespace else "class"
    return withAttribute(**{classattr : classname})        

opAssoc = _Constants()
opAssoc.LEFT = object()
opAssoc.RIGHT = object()

def infixNotation( baseExpr, opList, lpar=Suppress('('), rpar=Suppress(')') ):
    """
    Helper method for constructing grammars of expressions made up of
    operators working in a precedence hierarchy.  Operators may be unary or
    binary, left- or right-associative.  Parse actions can also be attached
    to operator expressions. The generated parser will also recognize the use 
    of parentheses to override operator precedences (see example below).
    
    Note: if you define a deep operator list, you may see performance issues
    when using infixNotation. See L{ParserElement.enablePackrat} for a
    mechanism to potentially improve your parser performance.

    Parameters:
     - baseExpr - expression representing the most basic element for the nested
     - opList - list of tuples, one for each operator precedence level in the
      expression grammar; each tuple is of the form
      (opExpr, numTerms, rightLeftAssoc, parseAction), where:
       - opExpr is the pyparsing expression for the operator;
          may also be a string, which will be converted to a Literal;
          if numTerms is 3, opExpr is a tuple of two expressions, for the
          two operators separating the 3 terms
       - numTerms is the number of terms for this operator (must
          be 1, 2, or 3)
       - rightLeftAssoc is the indicator whether the operator is
          right or left associative, using the pyparsing-defined
          constants C{opAssoc.RIGHT} and C{opAssoc.LEFT}.
       - parseAction is the parse action to be associated with
          expressions matching this operator expression (the
          parse action tuple member may be omitted)
     - lpar - expression for matching left-parentheses (default=C{Suppress('(')})
     - rpar - expression for matching right-parentheses (default=C{Suppress(')')})

    Example::
        # simple example of four-function arithmetic with ints and variable names
        integer = pyparsing_common.signed_integer
        varname = pyparsing_common.identifier 
        
        arith_expr = infixNotation(integer | varname,
            [
            ('-', 1, opAssoc.RIGHT),
            (oneOf('* /'), 2, opAssoc.LEFT),
            (oneOf('+ -'), 2, opAssoc.LEFT),
            ])
        
        arith_expr.runTests('''
            5+3*6
            (5+3)*6
            -2--11
            ''', fullDump=False)
    prints::
        5+3*6
        [[5, '+', [3, '*', 6]]]

        (5+3)*6
        [[[5, '+', 3], '*', 6]]

        -2--11
        [[['-', 2], '-', ['-', 11]]]
    """
    ret = Forward()
    lastExpr = baseExpr | ( lpar + ret + rpar )
    for i,operDef in enumerate(opList):
        opExpr,arity,rightLeftAssoc,pa = (operDef + (None,))[:4]
        termName = "%s term" % opExpr if arity < 3 else "%s%s term" % opExpr
        if arity == 3:
            if opExpr is None or len(opExpr) != 2:
                raise ValueError("if numterms=3, opExpr must be a tuple or list of two expressions")
            opExpr1, opExpr2 = opExpr
        thisExpr = Forward().setName(termName)
        if rightLeftAssoc == opAssoc.LEFT:
            if arity == 1:
                matchExpr = FollowedBy(lastExpr + opExpr) + Group( lastExpr + OneOrMore( opExpr ) )
            elif arity == 2:
                if opExpr is not None:
                    matchExpr = FollowedBy(lastExpr + opExpr + lastExpr) + Group( lastExpr + OneOrMore( opExpr + lastExpr ) )
                else:
                    matchExpr = FollowedBy(lastExpr+lastExpr) + Group( lastExpr + OneOrMore(lastExpr) )
            elif arity == 3:
                matchExpr = FollowedBy(lastExpr + opExpr1 + lastExpr + opExpr2 + lastExpr) + \
                            Group( lastExpr + opExpr1 + lastExpr + opExpr2 + lastExpr )
            else:
                raise ValueError("operator must be unary (1), binary (2), or ternary (3)")
        elif rightLeftAssoc == opAssoc.RIGHT:
            if arity == 1:
                # try to avoid LR with this extra test
                if not isinstance(opExpr, Optional):
                    opExpr = Optional(opExpr)
                matchExpr = FollowedBy(opExpr.expr + thisExpr) + Group( opExpr + thisExpr )
            elif arity == 2:
                if opExpr is not None:
                    matchExpr = FollowedBy(lastExpr + opExpr + thisExpr) + Group( lastExpr + OneOrMore( opExpr + thisExpr ) )
                else:
                    matchExpr = FollowedBy(lastExpr + thisExpr) + Group( lastExpr + OneOrMore( thisExpr ) )
            elif arity == 3:
                matchExpr = FollowedBy(lastExpr + opExpr1 + thisExpr + opExpr2 + thisExpr) + \
                            Group( lastExpr + opExpr1 + thisExpr + opExpr2 + thisExpr )
            else:
                raise ValueError("operator must be unary (1), binary (2), or ternary (3)")
        else:
            raise ValueError("operator must indicate right or left associativity")
        if pa:
            matchExpr.setParseAction( pa )
        thisExpr <<= ( matchExpr.setName(termName) | lastExpr )
        lastExpr = thisExpr
    ret <<= lastExpr
    return ret

operatorPrecedence = infixNotation
"""(Deprecated) Former name of C{L{infixNotation}}, will be dropped in a future release."""

dblQuotedString = Combine(Regex(r'"(?:[^"\n\r\\]|(?:"")|(?:\\(?:[^x]|x[0-9a-fA-F]+)))*')+'"').setName("string enclosed in double quotes")
sglQuotedString = Combine(Regex(r"'(?:[^'\n\r\\]|(?:'')|(?:\\(?:[^x]|x[0-9a-fA-F]+)))*")+"'").setName("string enclosed in single quotes")
quotedString = Combine(Regex(r'"(?:[^"\n\r\\]|(?:"")|(?:\\(?:[^x]|x[0-9a-fA-F]+)))*')+'"'|
                       Regex(r"'(?:[^'\n\r\\]|(?:'')|(?:\\(?:[^x]|x[0-9a-fA-F]+)))*")+"'").setName("quotedString using single or double quotes")
unicodeString = Combine(_L('u') + quotedString.copy()).setName("unicode string literal")

def nestedExpr(opener="(", closer=")", content=None, ignoreExpr=quotedString.copy()):
    """
    Helper method for defining nested lists enclosed in opening and closing
    delimiters ("(" and ")" are the default).

    Parameters:
     - opener - opening character for a nested list (default=C{"("}); can also be a pyparsing expression
     - closer - closing character for a nested list (default=C{")"}); can also be a pyparsing expression
     - content - expression for items within the nested lists (default=C{None})
     - ignoreExpr - expression for ignoring opening and closing delimiters (default=C{quotedString})

    If an expression is not provided for the content argument, the nested
    expression will capture all whitespace-delimited content between delimiters
    as a list of separate values.

    Use the C{ignoreExpr} argument to define expressions that may contain
    opening or closing characters that should not be treated as opening
    or closing characters for nesting, such as quotedString or a comment
    expression.  Specify multiple expressions using an C{L{Or}} or C{L{MatchFirst}}.
    The default is L{quotedString}, but if no expressions are to be ignored,
    then pass C{None} for this argument.

    Example::
        data_type = oneOf("void int short long char float double")
        decl_data_type = Combine(data_type + Optional(Word('*')))
        ident = Word(alphas+'_', alphanums+'_')
        number = pyparsing_common.number
        arg = Group(decl_data_type + ident)
        LPAR,RPAR = map(Suppress, "()")

        code_body = nestedExpr('{', '}', ignoreExpr=(quotedString | cStyleComment))

        c_function = (decl_data_type("type") 
                      + ident("name")
                      + LPAR + Optional(delimitedList(arg), [])("args") + RPAR 
                      + code_body("body"))
        c_function.ignore(cStyleComment)
        
        source_code = '''
            int is_odd(int x) { 
                return (x%2); 
            }
                
            int dec_to_hex(char hchar) { 
                if (hchar >= '0' && hchar <= '9') { 
                    return (ord(hchar)-ord('0')); 
                } else { 
                    return (10+ord(hchar)-ord('A'));
                } 
            }
        '''
        for func in c_function.searchString(source_code):
            print("%(name)s (%(type)s) args: %(args)s" % func)

    prints::
        is_odd (int) args: [['int', 'x']]
        dec_to_hex (int) args: [['char', 'hchar']]
    """
    if opener == closer:
        raise ValueError("opening and closing strings cannot be the same")
    if content is None:
        if isinstance(opener,basestring) and isinstance(closer,basestring):
            if len(opener) == 1 and len(closer)==1:
                if ignoreExpr is not None:
                    content = (Combine(OneOrMore(~ignoreExpr +
                                    CharsNotIn(opener+closer+ParserElement.DEFAULT_WHITE_CHARS,exact=1))
                                ).setParseAction(lambda t:t[0].strip()))
                else:
                    content = (empty.copy()+CharsNotIn(opener+closer+ParserElement.DEFAULT_WHITE_CHARS
                                ).setParseAction(lambda t:t[0].strip()))
            else:
                if ignoreExpr is not None:
                    content = (Combine(OneOrMore(~ignoreExpr + 
                                    ~Literal(opener) + ~Literal(closer) +
                                    CharsNotIn(ParserElement.DEFAULT_WHITE_CHARS,exact=1))
                                ).setParseAction(lambda t:t[0].strip()))
                else:
                    content = (Combine(OneOrMore(~Literal(opener) + ~Literal(closer) +
                                    CharsNotIn(ParserElement.DEFAULT_WHITE_CHARS,exact=1))
                                ).setParseAction(lambda t:t[0].strip()))
        else:
            raise ValueError("opening and closing arguments must be strings if no content expression is given")
    ret = Forward()
    if ignoreExpr is not None:
        ret <<= Group( Suppress(opener) + ZeroOrMore( ignoreExpr | ret | content ) + Suppress(closer) )
    else:
        ret <<= Group( Suppress(opener) + ZeroOrMore( ret | content )  + Suppress(closer) )
    ret.setName('nested %s%s expression' % (opener,closer))
    return ret

def indentedBlock(blockStatementExpr, indentStack, indent=True):
    """
    Helper method for defining space-delimited indentation blocks, such as
    those used to define block statements in Python source code.

    Parameters:
     - blockStatementExpr - expression defining syntax of statement that
            is repeated within the indented block
     - indentStack - list created by caller to manage indentation stack
            (multiple statementWithIndentedBlock expressions within a single grammar
            should share a common indentStack)
     - indent - boolean indicating whether block must be indented beyond the
            current level; set to False for block of left-most statements
            (default=C{True})

    A valid block must contain at least one C{blockStatement}.

    Example::
        data = '''
        def A(z):
          A1
          B = 100
          G = A2
          A2
          A3
        B
        def BB(a,b,c):
          BB1
          def BBA():
            bba1
            bba2
            bba3
        C
        D
        def spam(x,y):
             def eggs(z):
                 pass
        '''


        indentStack = [1]
        stmt = Forward()

        identifier = Word(alphas, alphanums)
        funcDecl = ("def" + identifier + Group( "(" + Optional( delimitedList(identifier) ) + ")" ) + ":")
        func_body = indentedBlock(stmt, indentStack)
        funcDef = Group( funcDecl + func_body )

        rvalue = Forward()
        funcCall = Group(identifier + "(" + Optional(delimitedList(rvalue)) + ")")
        rvalue << (funcCall | identifier | Word(nums))
        assignment = Group(identifier + "=" + rvalue)
        stmt << ( funcDef | assignment | identifier )

        module_body = OneOrMore(stmt)

        parseTree = module_body.parseString(data)
        parseTree.pprint()
    prints::
        [['def',
          'A',
          ['(', 'z', ')'],
          ':',
          [['A1'], [['B', '=', '100']], [['G', '=', 'A2']], ['A2'], ['A3']]],
         'B',
         ['def',
          'BB',
          ['(', 'a', 'b', 'c', ')'],
          ':',
          [['BB1'], [['def', 'BBA', ['(', ')'], ':', [['bba1'], ['bba2'], ['bba3']]]]]],
         'C',
         'D',
         ['def',
          'spam',
          ['(', 'x', 'y', ')'],
          ':',
          [[['def', 'eggs', ['(', 'z', ')'], ':', [['pass']]]]]]] 
    """
    def checkPeerIndent(s,l,t):
        if l >= len(s): return
        curCol = col(l,s)
        if curCol != indentStack[-1]:
            if curCol > indentStack[-1]:
                raise ParseFatalException(s,l,"illegal nesting")
            raise ParseException(s,l,"not a peer entry")

    def checkSubIndent(s,l,t):
        curCol = col(l,s)
        if curCol > indentStack[-1]:
            indentStack.append( curCol )
        else:
            raise ParseException(s,l,"not a subentry")

    def checkUnindent(s,l,t):
        if l >= len(s): return
        curCol = col(l,s)
        if not(indentStack and curCol < indentStack[-1] and curCol <= indentStack[-2]):
            raise ParseException(s,l,"not an unindent")
        indentStack.pop()

    NL = OneOrMore(LineEnd().setWhitespaceChars("\t ").suppress())
    INDENT = (Empty() + Empty().setParseAction(checkSubIndent)).setName('INDENT')
    PEER   = Empty().setParseAction(checkPeerIndent).setName('')
    UNDENT = Empty().setParseAction(checkUnindent).setName('UNINDENT')
    if indent:
        smExpr = Group( Optional(NL) +
            #~ FollowedBy(blockStatementExpr) +
            INDENT + (OneOrMore( PEER + Group(blockStatementExpr) + Optional(NL) )) + UNDENT)
    else:
        smExpr = Group( Optional(NL) +
            (OneOrMore( PEER + Group(blockStatementExpr) + Optional(NL) )) )
    blockStatementExpr.ignore(_bslash + LineEnd())
    return smExpr.setName('indented block')

alphas8bit = srange(r"[\0xc0-\0xd6\0xd8-\0xf6\0xf8-\0xff]")
punc8bit = srange(r"[\0xa1-\0xbf\0xd7\0xf7]")

anyOpenTag,anyCloseTag = makeHTMLTags(Word(alphas,alphanums+"_:").setName('any tag'))
_htmlEntityMap = dict(zip("gt lt amp nbsp quot apos".split(),'><& "\''))
commonHTMLEntity = Regex('&(?P<entity>' + '|'.join(_htmlEntityMap.keys()) +");").setName("common HTML entity")
def replaceHTMLEntity(t):
    """Helper parser action to replace common HTML entities with their special characters"""
    return _htmlEntityMap.get(t.entity)
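
# A minimal sketch of replaceHTMLEntity() with transformString (illustrative
# only; the names below are not part of this module):
def _replace_html_entity_example():
    entity = commonHTMLEntity.copy().setParseAction(replaceHTMLEntity)
    print(entity.transformString("1 &lt; 2 &amp;&amp; 3 &gt; 2"))  # -> '1 < 2 && 3 > 2'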

# it's easy to get these comment structures wrong - they're very common, so may as well make them available
cStyleComment = Combine(Regex(r"/\*(?:[^*]|\*(?!/))*") + '*/').setName("C style comment")
"Comment of the form C{/* ... */}"

htmlComment = Regex(r"<!--[\s\S]*?-->").setName("HTML comment")
"Comment of the form C{<!-- ... -->}"

restOfLine = Regex(r".*").leaveWhitespace().setName("rest of line")
dblSlashComment = Regex(r"//(?:\\\n|[^\n])*").setName("// comment")
"Comment of the form C{// ... (to end of line)}"

cppStyleComment = Combine(Regex(r"/\*(?:[^*]|\*(?!/))*") + '*/'| dblSlashComment).setName("C++ style comment")
"Comment of either form C{L{cStyleComment}} or C{L{dblSlashComment}}"

javaStyleComment = cppStyleComment
"Same as C{L{cppStyleComment}}"

pythonStyleComment = Regex(r"#.*").setName("Python style comment")
"Comment of the form C{# ... (to end of line)}"

_commasepitem = Combine(OneOrMore(Word(printables, excludeChars=',') +
                                  Optional( Word(" \t") +
                                            ~Literal(",") + ~LineEnd() ) ) ).streamline().setName("commaItem")
commaSeparatedList = delimitedList( Optional( quotedString.copy() | _commasepitem, default="") ).setName("commaSeparatedList")
"""(Deprecated) Predefined expression of 1 or more printable words or quoted strings, separated by commas.
   This expression is deprecated in favor of L{pyparsing_common.comma_separated_list}."""

# some other useful expressions - using lower-case class name since we are really using this as a namespace
class pyparsing_common:
    """
    Here are some common low-level expressions that may be useful in jump-starting parser development:
     - numeric forms (L{integers<integer>}, L{reals<real>}, L{scientific notation<sci_real>})
     - common L{programming identifiers<identifier>}
     - network addresses (L{MAC<mac_address>}, L{IPv4<ipv4_address>}, L{IPv6<ipv6_address>})
     - ISO8601 L{dates<iso8601_date>} and L{datetime<iso8601_datetime>}
     - L{UUID<uuid>}
     - L{comma-separated list<comma_separated_list>}
    Parse actions:
     - C{L{convertToInteger}}
     - C{L{convertToFloat}}
     - C{L{convertToDate}}
     - C{L{convertToDatetime}}
     - C{L{stripHTMLTags}}
     - C{L{upcaseTokens}}
     - C{L{downcaseTokens}}

    Example::
        pyparsing_common.number.runTests('''
            # any int or real number, returned as the appropriate type
            100
            -100
            +100
            3.14159
            6.02e23
            1e-12
            ''')

        pyparsing_common.fnumber.runTests('''
            # any int or real number, returned as float
            100
            -100
            +100
            3.14159
            6.02e23
            1e-12
            ''')

        pyparsing_common.hex_integer.runTests('''
            # hex numbers
            100
            FF
            ''')

        pyparsing_common.fraction.runTests('''
            # fractions
            1/2
            -3/4
            ''')

        pyparsing_common.mixed_integer.runTests('''
            # mixed fractions
            1
            1/2
            -3/4
            1-3/4
            ''')

        import uuid
        pyparsing_common.uuid.setParseAction(tokenMap(uuid.UUID))
        pyparsing_common.uuid.runTests('''
            # uuid
            12345678-1234-5678-1234-567812345678
            ''')
    prints::
        # any int or real number, returned as the appropriate type
        100
        [100]

        -100
        [-100]

        +100
        [100]

        3.14159
        [3.14159]

        6.02e23
        [6.02e+23]

        1e-12
        [1e-12]

        # any int or real number, returned as float
        100
        [100.0]

        -100
        [-100.0]

        +100
        [100.0]

        3.14159
        [3.14159]

        6.02e23
        [6.02e+23]

        1e-12
        [1e-12]

        # hex numbers
        100
        [256]

        FF
        [255]

        # fractions
        1/2
        [0.5]

        -3/4
        [-0.75]

        # mixed fractions
        1
        [1]

        1/2
        [0.5]

        -3/4
        [-0.75]

        1-3/4
        [1.75]

        # uuid
        12345678-1234-5678-1234-567812345678
        [UUID('12345678-1234-5678-1234-567812345678')]
    """

    convertToInteger = tokenMap(int)
    """
    Parse action for converting parsed integers to Python int
    """

    convertToFloat = tokenMap(float)
    """
    Parse action for converting parsed numbers to Python float
    """

    integer = Word(nums).setName("integer").setParseAction(convertToInteger)
    """expression that parses an unsigned integer, returns an int"""

    hex_integer = Word(hexnums).setName("hex integer").setParseAction(tokenMap(int,16))
    """expression that parses a hexadecimal integer, returns an int"""

    signed_integer = Regex(r'[+-]?\d+').setName("signed integer").setParseAction(convertToInteger)
    """expression that parses an integer with optional leading sign, returns an int"""

    fraction = (signed_integer().setParseAction(convertToFloat) + '/' + signed_integer().setParseAction(convertToFloat)).setName("fraction")
    """fractional expression of an integer divided by an integer, returns a float"""
    fraction.addParseAction(lambda t: t[0]/t[-1])

    mixed_integer = (fraction | signed_integer + Optional(Optional('-').suppress() + fraction)).setName("fraction or mixed integer-fraction")
    """mixed integer of the form 'integer - fraction', with optional leading integer, returns float"""
    mixed_integer.addParseAction(sum)

    real = Regex(r'[+-]?\d+\.\d*').setName("real number").setParseAction(convertToFloat)
    """expression that parses a floating point number and returns a float"""

    sci_real = Regex(r'[+-]?\d+([eE][+-]?\d+|\.\d*([eE][+-]?\d+)?)').setName("real number with scientific notation").setParseAction(convertToFloat)
    """expression that parses a floating point number with optional scientific notation and returns a float"""

    # streamlining this expression makes the docs nicer-looking
    number = (sci_real | real | signed_integer).streamline()
    """any numeric expression, returns the corresponding Python type"""

    fnumber = Regex(r'[+-]?\d+\.?\d*([eE][+-]?\d+)?').setName("fnumber").setParseAction(convertToFloat)
    """any int or real number, returned as float"""
    
    identifier = Word(alphas+'_', alphanums+'_').setName("identifier")
    """typical code identifier (leading alpha or '_', followed by 0 or more alphas, nums, or '_')"""
    
    ipv4_address = Regex(r'(25[0-5]|2[0-4][0-9]|1?[0-9]{1,2})(\.(25[0-5]|2[0-4][0-9]|1?[0-9]{1,2})){3}').setName("IPv4 address")
    "IPv4 address (C{0.0.0.0 - 255.255.255.255})"

    _ipv6_part = Regex(r'[0-9a-fA-F]{1,4}').setName("hex_integer")
    _full_ipv6_address = (_ipv6_part + (':' + _ipv6_part)*7).setName("full IPv6 address")
    _short_ipv6_address = (Optional(_ipv6_part + (':' + _ipv6_part)*(0,6)) + "::" + Optional(_ipv6_part + (':' + _ipv6_part)*(0,6))).setName("short IPv6 address")
    _short_ipv6_address.addCondition(lambda t: sum(1 for tt in t if pyparsing_common._ipv6_part.matches(tt)) < 8)
    _mixed_ipv6_address = ("::ffff:" + ipv4_address).setName("mixed IPv6 address")
    ipv6_address = Combine((_full_ipv6_address | _mixed_ipv6_address | _short_ipv6_address).setName("IPv6 address")).setName("IPv6 address")
    "IPv6 address (long, short, or mixed form)"
    
    mac_address = Regex(r'[0-9a-fA-F]{2}([:.-])[0-9a-fA-F]{2}(?:\1[0-9a-fA-F]{2}){4}').setName("MAC address")
    "MAC address xx:xx:xx:xx:xx (may also have '-' or '.' delimiters)"

    @staticmethod
    def convertToDate(fmt="%Y-%m-%d"):
        """
        Helper to create a parse action for converting parsed date string to Python datetime.date

        Params -
         - fmt - format to be passed to datetime.strptime (default=C{"%Y-%m-%d"})

        Example::
            date_expr = pyparsing_common.iso8601_date.copy()
            date_expr.setParseAction(pyparsing_common.convertToDate())
            print(date_expr.parseString("1999-12-31"))
        prints::
            [datetime.date(1999, 12, 31)]
        """
        def cvt_fn(s,l,t):
            try:
                return datetime.strptime(t[0], fmt).date()
            except ValueError as ve:
                raise ParseException(s, l, str(ve))
        return cvt_fn

    @staticmethod
    def convertToDatetime(fmt="%Y-%m-%dT%H:%M:%S.%f"):
        """
        Helper to create a parse action for converting parsed datetime string to Python datetime.datetime

        Params -
         - fmt - format to be passed to datetime.strptime (default=C{"%Y-%m-%dT%H:%M:%S.%f"})

        Example::
            dt_expr = pyparsing_common.iso8601_datetime.copy()
            dt_expr.setParseAction(pyparsing_common.convertToDatetime())
            print(dt_expr.parseString("1999-12-31T23:59:59.999"))
        prints::
            [datetime.datetime(1999, 12, 31, 23, 59, 59, 999000)]
        """
        def cvt_fn(s,l,t):
            try:
                return datetime.strptime(t[0], fmt)
            except ValueError as ve:
                raise ParseException(s, l, str(ve))
        return cvt_fn

    iso8601_date = Regex(r'(?P<year>\d{4})(?:-(?P<month>\d\d)(?:-(?P<day>\d\d))?)?').setName("ISO8601 date")
    "ISO8601 date (C{yyyy-mm-dd})"

    iso8601_datetime = Regex(r'(?P<year>\d{4})-(?P<month>\d\d)-(?P<day>\d\d)[T ](?P<hour>\d\d):(?P<minute>\d\d)(:(?P<second>\d\d(\.\d*)?)?)?(?P<tz>Z|[+-]\d\d:?\d\d)?').setName("ISO8601 datetime")
    "ISO8601 datetime (C{yyyy-mm-ddThh:mm:ss.s(Z|+-00:00)}) - trailing seconds, milliseconds, and timezone optional; accepts separating C{'T'} or C{' '}"

    uuid = Regex(r'[0-9a-fA-F]{8}(-[0-9a-fA-F]{4}){3}-[0-9a-fA-F]{12}').setName("UUID")
    "UUID (C{xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx})"

    _html_stripper = anyOpenTag.suppress() | anyCloseTag.suppress()
    @staticmethod
    def stripHTMLTags(s, l, tokens):
        """
        Parse action to remove HTML tags from web page HTML source

        Example::
            # strip HTML links from normal text 
            text = '<td>More info at the <a href="http://pyparsing.wikispaces.com">pyparsing</a> wiki page</td>'
            td,td_end = makeHTMLTags("TD")
            table_text = td + SkipTo(td_end).setParseAction(pyparsing_common.stripHTMLTags)("body") + td_end
            
            print(table_text.parseString(text).body) # -> 'More info at the pyparsing wiki page'
        """
        return pyparsing_common._html_stripper.transformString(tokens[0])

    _commasepitem = Combine(OneOrMore(~Literal(",") + ~LineEnd() + Word(printables, excludeChars=',') 
                                        + Optional( White(" \t") ) ) ).streamline().setName("commaItem")
    comma_separated_list = delimitedList( Optional( quotedString.copy() | _commasepitem, default="") ).setName("comma separated list")
    """Predefined expression of 1 or more printable words or quoted strings, separated by commas."""

    upcaseTokens = staticmethod(tokenMap(lambda t: _ustr(t).upper()))
    """Parse action to convert tokens to upper case."""

    downcaseTokens = staticmethod(tokenMap(lambda t: _ustr(t).lower()))
    """Parse action to convert tokens to lower case."""


if __name__ == "__main__":

    selectToken    = CaselessLiteral("select")
    fromToken      = CaselessLiteral("from")

    ident          = Word(alphas, alphanums + "_$")

    columnName     = delimitedList(ident, ".", combine=True).setParseAction(upcaseTokens)
    columnNameList = Group(delimitedList(columnName)).setName("columns")
    columnSpec     = ('*' | columnNameList)

    tableName      = delimitedList(ident, ".", combine=True).setParseAction(upcaseTokens)
    tableNameList  = Group(delimitedList(tableName)).setName("tables")
    
    simpleSQL      = selectToken("command") + columnSpec("columns") + fromToken + tableNameList("tables")

    # demo runTests method, including embedded comments in test string
    simpleSQL.runTests("""
        # '*' as column list and dotted table name
        select * from SYS.XYZZY

        # caseless match on "SELECT", and casts back to "select"
        SELECT * from XYZZY, ABC

        # list of column names, and mixed case SELECT keyword
        Select AA,BB,CC from Sys.dual

        # multiple tables
        Select A, B, C from Sys.dual, Table2

        # invalid SELECT keyword - should fail
        Xelect A, B, C from Sys.dual

        # incomplete command - should fail
        Select

        # invalid column name - should fail
        Select ^^^ frox Sys.dual

        """)

    pyparsing_common.number.runTests("""
        100
        -100
        +100
        3.14159
        6.02e23
        1e-12
        """)

    # any int or real number, returned as float
    pyparsing_common.fnumber.runTests("""
        100
        -100
        +100
        3.14159
        6.02e23
        1e-12
        """)

    pyparsing_common.hex_integer.runTests("""
        100
        FF
        """)

    import uuid
    pyparsing_common.uuid.setParseAction(tokenMap(uuid.UUID))
    pyparsing_common.uuid.runTests("""
        12345678-1234-5678-1234-567812345678
        """)
site-packages/pkg_resources/_vendor/packaging/__init__.py000064400000001001147511334660017632 0ustar00# This file is dual licensed under the terms of the Apache License, Version
# 2.0, and the BSD License. See the LICENSE file in the root of this repository
# for complete details.
from __future__ import absolute_import, division, print_function

from .__about__ import (
    __author__, __copyright__, __email__, __license__, __summary__, __title__,
    __uri__, __version__
)

__all__ = [
    "__title__", "__summary__", "__uri__", "__version__", "__author__",
    "__email__", "__license__", "__copyright__",
]
site-packages/pkg_resources/_vendor/packaging/_compat.py000064400000001534147511334660017530 0ustar00# This file is dual licensed under the terms of the Apache License, Version
# 2.0, and the BSD License. See the LICENSE file in the root of this repository
# for complete details.
from __future__ import absolute_import, division, print_function

import sys


PY2 = sys.version_info[0] == 2
PY3 = sys.version_info[0] == 3

# flake8: noqa

if PY3:
    string_types = str,
else:
    string_types = basestring,


def with_metaclass(meta, *bases):
    """
    Create a base class with a metaclass.
    """
    # This requires a bit of explanation: the basic idea is to make a dummy
    # metaclass for one level of class instantiation that replaces itself with
    # the actual metaclass.
    class metaclass(meta):
        def __new__(cls, name, this_bases, d):
            return meta(name, bases, d)
    return type.__new__(metaclass, 'temporary_class', (), {})
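
# A minimal usage sketch (illustrative only; the class names are examples):
def _with_metaclass_example():
    class Meta(type):
        pass

    class Base(with_metaclass(Meta, object)):
        pass

    # on both Python 2 and Python 3, Base is constructed by Meta
    assert type(Base) is Meta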
site-packages/pkg_resources/_vendor/packaging/utils.py000064400000000645147511334660017250 0ustar00# This file is dual licensed under the terms of the Apache License, Version
# 2.0, and the BSD License. See the LICENSE file in the root of this repository
# for complete details.
from __future__ import absolute_import, division, print_function

import re


_canonicalize_regex = re.compile(r"[-_.]+")


def canonicalize_name(name):
    # This is taken from PEP 503.
    return _canonicalize_regex.sub("-", name).lower()
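
# A minimal usage sketch (illustrative only; the names are example inputs):
# runs of "-", "_", and "." collapse to a single "-" and the result is
# lower-cased, per PEP 503.
def _canonicalize_name_example():
    assert canonicalize_name("Django_Rest--Framework") == "django-rest-framework"
    assert canonicalize_name("zope.interface") == "zope-interface"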
site-packages/pkg_resources/_vendor/packaging/markers.py000064400000020070147511334660017546 0ustar00# This file is dual licensed under the terms of the Apache License, Version
# 2.0, and the BSD License. See the LICENSE file in the root of this repository
# for complete details.
from __future__ import absolute_import, division, print_function

import operator
import os
import platform
import sys

from pkg_resources.extern.pyparsing import ParseException, ParseResults, stringStart, stringEnd
from pkg_resources.extern.pyparsing import ZeroOrMore, Group, Forward, QuotedString
from pkg_resources.extern.pyparsing import Literal as L  # noqa

from ._compat import string_types
from .specifiers import Specifier, InvalidSpecifier


__all__ = [
    "InvalidMarker", "UndefinedComparison", "UndefinedEnvironmentName",
    "Marker", "default_environment",
]


class InvalidMarker(ValueError):
    """
    An invalid marker was found, users should refer to PEP 508.
    """


class UndefinedComparison(ValueError):
    """
    An invalid operation was attempted on a value that doesn't support it.
    """


class UndefinedEnvironmentName(ValueError):
    """
    A name was used that does not exist inside of the environment.
    """


class Node(object):

    def __init__(self, value):
        self.value = value

    def __str__(self):
        return str(self.value)

    def __repr__(self):
        return "<{0}({1!r})>".format(self.__class__.__name__, str(self))

    def serialize(self):
        raise NotImplementedError


class Variable(Node):

    def serialize(self):
        return str(self)


class Value(Node):

    def serialize(self):
        return '"{0}"'.format(self)


class Op(Node):

    def serialize(self):
        return str(self)


VARIABLE = (
    L("implementation_version") |
    L("platform_python_implementation") |
    L("implementation_name") |
    L("python_full_version") |
    L("platform_release") |
    L("platform_version") |
    L("platform_machine") |
    L("platform_system") |
    L("python_version") |
    L("sys_platform") |
    L("os_name") |
    L("os.name") |  # PEP-345
    L("sys.platform") |  # PEP-345
    L("platform.version") |  # PEP-345
    L("platform.machine") |  # PEP-345
    L("platform.python_implementation") |  # PEP-345
    L("python_implementation") |  # undocumented setuptools legacy
    L("extra")
)
ALIASES = {
    'os.name': 'os_name',
    'sys.platform': 'sys_platform',
    'platform.version': 'platform_version',
    'platform.machine': 'platform_machine',
    'platform.python_implementation': 'platform_python_implementation',
    'python_implementation': 'platform_python_implementation'
}
VARIABLE.setParseAction(lambda s, l, t: Variable(ALIASES.get(t[0], t[0])))

VERSION_CMP = (
    L("===") |
    L("==") |
    L(">=") |
    L("<=") |
    L("!=") |
    L("~=") |
    L(">") |
    L("<")
)

MARKER_OP = VERSION_CMP | L("not in") | L("in")
MARKER_OP.setParseAction(lambda s, l, t: Op(t[0]))

MARKER_VALUE = QuotedString("'") | QuotedString('"')
MARKER_VALUE.setParseAction(lambda s, l, t: Value(t[0]))

BOOLOP = L("and") | L("or")

MARKER_VAR = VARIABLE | MARKER_VALUE

MARKER_ITEM = Group(MARKER_VAR + MARKER_OP + MARKER_VAR)
MARKER_ITEM.setParseAction(lambda s, l, t: tuple(t[0]))

LPAREN = L("(").suppress()
RPAREN = L(")").suppress()

MARKER_EXPR = Forward()
MARKER_ATOM = MARKER_ITEM | Group(LPAREN + MARKER_EXPR + RPAREN)
MARKER_EXPR << MARKER_ATOM + ZeroOrMore(BOOLOP + MARKER_EXPR)

MARKER = stringStart + MARKER_EXPR + stringEnd
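
# A minimal parsing sketch (illustrative only; the marker string is an example):
# MARKER yields the (Variable, Op, Value) triples and the "and"/"or" connectives
# in the order they appear; _coerce_parse_result() below flattens the result
# into plain lists.
def _marker_grammar_example():
    return MARKER.parseString('os_name == "posix" or python_version >= "3.4"')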


def _coerce_parse_result(results):
    if isinstance(results, ParseResults):
        return [_coerce_parse_result(i) for i in results]
    else:
        return results


def _format_marker(marker, first=True):
    assert isinstance(marker, (list, tuple, string_types))

    # Sometimes we have a structure like [[...]] which is a single item list
    # where the single item is itself its own list. In that case we want to skip
    # the rest of this function so that we don't get extraneous () on the
    # outside.
    if (isinstance(marker, list) and len(marker) == 1 and
            isinstance(marker[0], (list, tuple))):
        return _format_marker(marker[0])

    if isinstance(marker, list):
        inner = (_format_marker(m, first=False) for m in marker)
        if first:
            return " ".join(inner)
        else:
            return "(" + " ".join(inner) + ")"
    elif isinstance(marker, tuple):
        return " ".join([m.serialize() for m in marker])
    else:
        return marker


_operators = {
    "in": lambda lhs, rhs: lhs in rhs,
    "not in": lambda lhs, rhs: lhs not in rhs,
    "<": operator.lt,
    "<=": operator.le,
    "==": operator.eq,
    "!=": operator.ne,
    ">=": operator.ge,
    ">": operator.gt,
}


def _eval_op(lhs, op, rhs):
    try:
        spec = Specifier("".join([op.serialize(), rhs]))
    except InvalidSpecifier:
        pass
    else:
        return spec.contains(lhs)

    oper = _operators.get(op.serialize())
    if oper is None:
        raise UndefinedComparison(
            "Undefined {0!r} on {1!r} and {2!r}.".format(op, lhs, rhs)
        )

    return oper(lhs, rhs)


_undefined = object()


def _get_env(environment, name):
    value = environment.get(name, _undefined)

    if value is _undefined:
        raise UndefinedEnvironmentName(
            "{0!r} does not exist in evaluation environment.".format(name)
        )

    return value


def _evaluate_markers(markers, environment):
    groups = [[]]
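    # evaluation is effectively in disjunctive normal form: each inner list in
    # ``groups`` collects an implicit "and" clause, a new clause is started
    # whenever an "or" connective is seen, and the final result is
    # any(all(clause)) over the clauses.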

    for marker in markers:
        assert isinstance(marker, (list, tuple, string_types))

        if isinstance(marker, list):
            groups[-1].append(_evaluate_markers(marker, environment))
        elif isinstance(marker, tuple):
            lhs, op, rhs = marker

            if isinstance(lhs, Variable):
                lhs_value = _get_env(environment, lhs.value)
                rhs_value = rhs.value
            else:
                lhs_value = lhs.value
                rhs_value = _get_env(environment, rhs.value)

            groups[-1].append(_eval_op(lhs_value, op, rhs_value))
        else:
            assert marker in ["and", "or"]
            if marker == "or":
                groups.append([])

    return any(all(item) for item in groups)


def format_full_version(info):
    version = '{0.major}.{0.minor}.{0.micro}'.format(info)
    kind = info.releaselevel
    if kind != 'final':
        version += kind[0] + str(info.serial)
    return version


def default_environment():
    if hasattr(sys, 'implementation'):
        iver = format_full_version(sys.implementation.version)
        implementation_name = sys.implementation.name
    else:
        iver = '0'
        implementation_name = ''

    return {
        "implementation_name": implementation_name,
        "implementation_version": iver,
        "os_name": os.name,
        "platform_machine": platform.machine(),
        "platform_release": platform.release(),
        "platform_system": platform.system(),
        "platform_version": platform.version(),
        "python_full_version": platform.python_version(),
        "platform_python_implementation": platform.python_implementation(),
        "python_version": platform.python_version()[:3],
        "sys_platform": sys.platform,
    }


class Marker(object):

    def __init__(self, marker):
        try:
            self._markers = _coerce_parse_result(MARKER.parseString(marker))
        except ParseException as e:
            err_str = "Invalid marker: {0!r}, parse error at {1!r}".format(
                marker, marker[e.loc:e.loc + 8])
            raise InvalidMarker(err_str)

    def __str__(self):
        return _format_marker(self._markers)

    def __repr__(self):
        return "<Marker({0!r})>".format(str(self))

    def evaluate(self, environment=None):
        """Evaluate a marker.

        Return the boolean from evaluating the given marker against the
        environment. environment is an optional argument to override all or
        part of the determined environment.

        The environment is determined from the current Python process.
        """
        current_environment = default_environment()
        if environment is not None:
            current_environment.update(environment)

        return _evaluate_markers(self._markers, current_environment)
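
# A minimal usage sketch (illustrative only; the marker strings are examples):
def _marker_example():
    m = Marker('python_version >= "2.7" and os_name == "posix"')
    m.evaluate()                   # evaluated against the running interpreter
    m.evaluate({"os_name": "nt"})  # override part of the detected environment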
site-packages/pkg_resources/_vendor/packaging/requirements.py000064400000010403147511334660020624 0ustar00# This file is dual licensed under the terms of the Apache License, Version
# 2.0, and the BSD License. See the LICENSE file in the root of this repository
# for complete details.
from __future__ import absolute_import, division, print_function

import string
import re

from pkg_resources.extern.pyparsing import stringStart, stringEnd, originalTextFor, ParseException
from pkg_resources.extern.pyparsing import ZeroOrMore, Word, Optional, Regex, Combine
from pkg_resources.extern.pyparsing import Literal as L  # noqa
from pkg_resources.extern.six.moves.urllib import parse as urlparse

from .markers import MARKER_EXPR, Marker
from .specifiers import LegacySpecifier, Specifier, SpecifierSet


class InvalidRequirement(ValueError):
    """
    An invalid requirement was found, users should refer to PEP 508.
    """


ALPHANUM = Word(string.ascii_letters + string.digits)

LBRACKET = L("[").suppress()
RBRACKET = L("]").suppress()
LPAREN = L("(").suppress()
RPAREN = L(")").suppress()
COMMA = L(",").suppress()
SEMICOLON = L(";").suppress()
AT = L("@").suppress()

PUNCTUATION = Word("-_.")
IDENTIFIER_END = ALPHANUM | (ZeroOrMore(PUNCTUATION) + ALPHANUM)
IDENTIFIER = Combine(ALPHANUM + ZeroOrMore(IDENTIFIER_END))

NAME = IDENTIFIER("name")
EXTRA = IDENTIFIER

URI = Regex(r'[^ ]+')("url")
URL = (AT + URI)

EXTRAS_LIST = EXTRA + ZeroOrMore(COMMA + EXTRA)
EXTRAS = (LBRACKET + Optional(EXTRAS_LIST) + RBRACKET)("extras")

VERSION_PEP440 = Regex(Specifier._regex_str, re.VERBOSE | re.IGNORECASE)
VERSION_LEGACY = Regex(LegacySpecifier._regex_str, re.VERBOSE | re.IGNORECASE)

VERSION_ONE = VERSION_PEP440 ^ VERSION_LEGACY
VERSION_MANY = Combine(VERSION_ONE + ZeroOrMore(COMMA + VERSION_ONE),
                       joinString=",", adjacent=False)("_raw_spec")
_VERSION_SPEC = Optional(((LPAREN + VERSION_MANY + RPAREN) | VERSION_MANY))
_VERSION_SPEC.setParseAction(lambda s, l, t: t._raw_spec or '')

VERSION_SPEC = originalTextFor(_VERSION_SPEC)("specifier")
VERSION_SPEC.setParseAction(lambda s, l, t: t[1])

MARKER_EXPR = originalTextFor(MARKER_EXPR())("marker")
MARKER_EXPR.setParseAction(
    lambda s, l, t: Marker(s[t._original_start:t._original_end])
)
MARKER_SEPERATOR = SEMICOLON
MARKER = MARKER_SEPERATOR + MARKER_EXPR

VERSION_AND_MARKER = VERSION_SPEC + Optional(MARKER)
URL_AND_MARKER = URL + Optional(MARKER)

NAMED_REQUIREMENT = \
    NAME + Optional(EXTRAS) + (URL_AND_MARKER | VERSION_AND_MARKER)

REQUIREMENT = stringStart + NAMED_REQUIREMENT + stringEnd


class Requirement(object):
    """Parse a requirement.

    Parse a given requirement string into its parts, such as name, specifier,
    URL, and extras. Raises InvalidRequirement on a badly-formed requirement
    string.
    """

    # TODO: Can we test whether something is contained within a requirement?
    #       If so how do we do that? Do we need to test against the _name_ of
    #       the thing as well as the version? What about the markers?
    # TODO: Can we normalize the name and extra name?

    def __init__(self, requirement_string):
        try:
            req = REQUIREMENT.parseString(requirement_string)
        except ParseException as e:
            raise InvalidRequirement(
                "Invalid requirement, parse error at \"{0!r}\"".format(
                    requirement_string[e.loc:e.loc + 8]))

        self.name = req.name
        if req.url:
            parsed_url = urlparse.urlparse(req.url)
            if not (parsed_url.scheme and parsed_url.netloc) or (
                    not parsed_url.scheme and not parsed_url.netloc):
                raise InvalidRequirement("Invalid URL given")
            self.url = req.url
        else:
            self.url = None
        self.extras = set(req.extras.asList() if req.extras else [])
        self.specifier = SpecifierSet(req.specifier)
        self.marker = req.marker if req.marker else None

    def __str__(self):
        parts = [self.name]

        if self.extras:
            parts.append("[{0}]".format(",".join(sorted(self.extras))))

        if self.specifier:
            parts.append(str(self.specifier))

        if self.url:
            parts.append("@ {0}".format(self.url))

        if self.marker:
            parts.append("; {0}".format(self.marker))

        return "".join(parts)

    def __repr__(self):
        return "<Requirement({0!r})>".format(str(self))
[18 tar entries follow: compiled CPython 3.6 bytecode caches (__pycache__/*.pyc) for the vendored packaging modules. Their binary contents cannot be rendered as text and are omitted; only the member names are kept.]
site-packages/pkg_resources/_vendor/packaging/__pycache__/specifiers.cpython-36.opt-1.pyc
site-packages/pkg_resources/_vendor/packaging/__pycache__/utils.cpython-36.pyc
site-packages/pkg_resources/_vendor/packaging/__pycache__/_compat.cpython-36.pyc
site-packages/pkg_resources/_vendor/packaging/__pycache__/_compat.cpython-36.opt-1.pyc
site-packages/pkg_resources/_vendor/packaging/__pycache__/_structures.cpython-36.opt-1.pyc
site-packages/pkg_resources/_vendor/packaging/__pycache__/version.cpython-36.pyc
site-packages/pkg_resources/_vendor/packaging/__pycache__/__init__.cpython-36.pyc
site-packages/pkg_resources/_vendor/packaging/__pycache__/__init__.cpython-36.opt-1.pyc
site-packages/pkg_resources/_vendor/packaging/__pycache__/version.cpython-36.opt-1.pyc
site-packages/pkg_resources/_vendor/packaging/__pycache__/utils.cpython-36.opt-1.pyc
site-packages/pkg_resources/_vendor/packaging/__pycache__/requirements.cpython-36.pyc
site-packages/pkg_resources/_vendor/packaging/__pycache__/markers.cpython-36.opt-1.pyc
site-packages/pkg_resources/_vendor/packaging/__pycache__/__about__.cpython-36.pyc
site-packages/pkg_resources/_vendor/packaging/__pycache__/__about__.cpython-36.opt-1.pyc
site-packages/pkg_resources/_vendor/packaging/__pycache__/specifiers.cpython-36.pyc
site-packages/pkg_resources/_vendor/packaging/__pycache__/markers.cpython-36.pyc
site-packages/pkg_resources/_vendor/packaging/__pycache__/requirements.cpython-36.opt-1.pyc
site-packages/pkg_resources/_vendor/packaging/__pycache__/_structures.cpython-36.pyc
site-packages/pkg_resources/_vendor/packaging/version.py000064400000026444147511334660017576 0ustar00# This file is dual licensed under the terms of the Apache License, Version
# 2.0, and the BSD License. See the LICENSE file in the root of this repository
# for complete details.
from __future__ import absolute_import, division, print_function

import collections
import itertools
import re

from ._structures import Infinity


__all__ = [
    "parse", "Version", "LegacyVersion", "InvalidVersion", "VERSION_PATTERN"
]


_Version = collections.namedtuple(
    "_Version",
    ["epoch", "release", "dev", "pre", "post", "local"],
)


def parse(version):
    """
    Parse the given version string and return either a :class:`Version` object
    or a :class:`LegacyVersion` object depending on whether the given version is
    a valid PEP 440 version or a legacy version.
    """
    try:
        return Version(version)
    except InvalidVersion:
        return LegacyVersion(version)


class InvalidVersion(ValueError):
    """
    An invalid version was found, users should refer to PEP 440.
    """


class _BaseVersion(object):

    def __hash__(self):
        return hash(self._key)

    def __lt__(self, other):
        return self._compare(other, lambda s, o: s < o)

    def __le__(self, other):
        return self._compare(other, lambda s, o: s <= o)

    def __eq__(self, other):
        return self._compare(other, lambda s, o: s == o)

    def __ge__(self, other):
        return self._compare(other, lambda s, o: s >= o)

    def __gt__(self, other):
        return self._compare(other, lambda s, o: s > o)

    def __ne__(self, other):
        return self._compare(other, lambda s, o: s != o)

    def _compare(self, other, method):
        if not isinstance(other, _BaseVersion):
            return NotImplemented

        return method(self._key, other._key)


class LegacyVersion(_BaseVersion):

    def __init__(self, version):
        self._version = str(version)
        self._key = _legacy_cmpkey(self._version)

    def __str__(self):
        return self._version

    def __repr__(self):
        return "<LegacyVersion({0})>".format(repr(str(self)))

    @property
    def public(self):
        return self._version

    @property
    def base_version(self):
        return self._version

    @property
    def local(self):
        return None

    @property
    def is_prerelease(self):
        return False

    @property
    def is_postrelease(self):
        return False


_legacy_version_component_re = re.compile(
    r"(\d+ | [a-z]+ | \.| -)", re.VERBOSE,
)

_legacy_version_replacement_map = {
    "pre": "c", "preview": "c", "-": "final-", "rc": "c", "dev": "@",
}


def _parse_version_parts(s):
    for part in _legacy_version_component_re.split(s):
        part = _legacy_version_replacement_map.get(part, part)

        if not part or part == ".":
            continue

        if part[:1] in "0123456789":
            # pad for numeric comparison
            yield part.zfill(8)
        else:
            yield "*" + part

    # ensure that alpha/beta/candidate are before final
    yield "*final"


def _legacy_cmpkey(version):
    # We hardcode an epoch of -1 here. A PEP 440 version can only have an epoch
    # greater than or equal to 0. This will effectively put the LegacyVersion,
    # which uses the de facto standard originally implemented by setuptools,
    # before all PEP 440 versions.
    epoch = -1

    # This scheme is taken from pkg_resources.parse_version in setuptools, prior
    # to its adoption of the packaging library.
    parts = []
    for part in _parse_version_parts(version.lower()):
        if part.startswith("*"):
            # remove "-" before a prerelease tag
            if part < "*final":
                while parts and parts[-1] == "*final-":
                    parts.pop()

            # remove trailing zeros from each series of numeric parts
            while parts and parts[-1] == "00000000":
                parts.pop()

        parts.append(part)
    parts = tuple(parts)

    return epoch, parts

# Deliberately not anchored to the start and end of the string, to make it
# easier for 3rd party code to reuse
VERSION_PATTERN = r"""
    v?
    (?:
        (?:(?P<epoch>[0-9]+)!)?                           # epoch
        (?P<release>[0-9]+(?:\.[0-9]+)*)                  # release segment
        (?P<pre>                                          # pre-release
            [-_\.]?
            (?P<pre_l>(a|b|c|rc|alpha|beta|pre|preview))
            [-_\.]?
            (?P<pre_n>[0-9]+)?
        )?
        (?P<post>                                         # post release
            (?:-(?P<post_n1>[0-9]+))
            |
            (?:
                [-_\.]?
                (?P<post_l>post|rev|r)
                [-_\.]?
                (?P<post_n2>[0-9]+)?
            )
        )?
        (?P<dev>                                          # dev release
            [-_\.]?
            (?P<dev_l>dev)
            [-_\.]?
            (?P<dev_n>[0-9]+)?
        )?
    )
    (?:\+(?P<local>[a-z0-9]+(?:[-_\.][a-z0-9]+)*))?       # local version
"""


class Version(_BaseVersion):

    _regex = re.compile(
        r"^\s*" + VERSION_PATTERN + r"\s*$",
        re.VERBOSE | re.IGNORECASE,
    )

    def __init__(self, version):
        # Validate the version and parse it into pieces
        match = self._regex.search(version)
        if not match:
            raise InvalidVersion("Invalid version: '{0}'".format(version))

        # Store the parsed out pieces of the version
        self._version = _Version(
            epoch=int(match.group("epoch")) if match.group("epoch") else 0,
            release=tuple(int(i) for i in match.group("release").split(".")),
            pre=_parse_letter_version(
                match.group("pre_l"),
                match.group("pre_n"),
            ),
            post=_parse_letter_version(
                match.group("post_l"),
                match.group("post_n1") or match.group("post_n2"),
            ),
            dev=_parse_letter_version(
                match.group("dev_l"),
                match.group("dev_n"),
            ),
            local=_parse_local_version(match.group("local")),
        )

        # Generate a key which will be used for sorting
        self._key = _cmpkey(
            self._version.epoch,
            self._version.release,
            self._version.pre,
            self._version.post,
            self._version.dev,
            self._version.local,
        )

    def __repr__(self):
        return "<Version({0})>".format(repr(str(self)))

    def __str__(self):
        parts = []

        # Epoch
        if self._version.epoch != 0:
            parts.append("{0}!".format(self._version.epoch))

        # Release segment
        parts.append(".".join(str(x) for x in self._version.release))

        # Pre-release
        if self._version.pre is not None:
            parts.append("".join(str(x) for x in self._version.pre))

        # Post-release
        if self._version.post is not None:
            parts.append(".post{0}".format(self._version.post[1]))

        # Development release
        if self._version.dev is not None:
            parts.append(".dev{0}".format(self._version.dev[1]))

        # Local version segment
        if self._version.local is not None:
            parts.append(
                "+{0}".format(".".join(str(x) for x in self._version.local))
            )

        return "".join(parts)

    @property
    def public(self):
        return str(self).split("+", 1)[0]

    @property
    def base_version(self):
        parts = []

        # Epoch
        if self._version.epoch != 0:
            parts.append("{0}!".format(self._version.epoch))

        # Release segment
        parts.append(".".join(str(x) for x in self._version.release))

        return "".join(parts)

    @property
    def local(self):
        version_string = str(self)
        if "+" in version_string:
            return version_string.split("+", 1)[1]

    @property
    def is_prerelease(self):
        return bool(self._version.dev or self._version.pre)

    @property
    def is_postrelease(self):
        return bool(self._version.post)
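
# Editor's note (illustrative, not part of the upstream source): for
# v = Version("1.2.3rc1+ubuntu.1") the properties above give
#     v.public        == "1.2.3rc1"
#     v.base_version  == "1.2.3"
#     v.local         == "ubuntu.1"
#     v.is_prerelease is True, v.is_postrelease is False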


def _parse_letter_version(letter, number):
    if letter:
        # We consider there to be an implicit 0 in a pre-release if there is
        # not a numeral associated with it.
        if number is None:
            number = 0

        # We normalize any letters to their lower case form
        letter = letter.lower()

        # We consider some words to be alternate spellings of other words and
        # in those cases we want to normalize the spellings to our preferred
        # spelling.
        if letter == "alpha":
            letter = "a"
        elif letter == "beta":
            letter = "b"
        elif letter in ["c", "pre", "preview"]:
            letter = "rc"
        elif letter in ["rev", "r"]:
            letter = "post"

        return letter, int(number)
    if not letter and number:
        # We assume that if we are given a number but not a letter, then this
        # is using the implicit post release syntax (e.g. 1.0-1)
        letter = "post"

        return letter, int(number)


_local_version_separators = re.compile(r"[\._-]")


def _parse_local_version(local):
    """
    Takes a string like abc.1.twelve and turns it into ("abc", 1, "twelve").
    """
    if local is not None:
        return tuple(
            part.lower() if not part.isdigit() else int(part)
            for part in _local_version_separators.split(local)
        )


def _cmpkey(epoch, release, pre, post, dev, local):
    # When we compare a release version, we want to compare it with all of the
    # trailing zeros removed. So we'll reverse the list, drop all the
    # now-leading zeros until we come to something non-zero, then take the
    # rest, re-reverse it back into the correct order, and make it a tuple to
    # use as our sorting key.
    release = tuple(
        reversed(list(
            itertools.dropwhile(
                lambda x: x == 0,
                reversed(release),
            )
        ))
    )

    # We need to "trick" the sorting algorithm to put 1.0.dev0 before 1.0a0.
    # We'll do this by abusing the pre segment, but we _only_ want to do this
    # if there is not a pre or a post segment. If we have one of those then
    # the normal sorting rules will handle this case correctly.
    if pre is None and post is None and dev is not None:
        pre = -Infinity
    # Versions without a pre-release (except as noted above) should sort after
    # those with one.
    elif pre is None:
        pre = Infinity

    # Versions without a post segment should sort before those with one.
    if post is None:
        post = -Infinity

    # Versions without a development segment should sort after those with one.
    if dev is None:
        dev = Infinity

    if local is None:
        # Versions without a local segment should sort before those with one.
        local = -Infinity
    else:
        # Versions with a local segment need that segment parsed to implement
        # the sorting rules in PEP440.
        # - Alpha numeric segments sort before numeric segments
        # - Alpha numeric segments sort lexicographically
        # - Numeric segments sort numerically
        # - Shorter versions sort before longer versions when the prefixes
        #   match exactly
        local = tuple(
            (i, "") if isinstance(i, int) else (-Infinity, i)
            for i in local
        )

    return epoch, release, pre, post, dev, local
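
# --- Editor's illustrative sketch (not part of the upstream packaging module).
# A minimal, hedged demonstration of parse() and Version ordering; it only runs
# when this file is executed directly, so importing the module is unaffected.
if __name__ == "__main__":
    v = parse("1.0.post1")
    assert isinstance(v, Version) and v.is_postrelease
    assert isinstance(parse("not a version"), LegacyVersion)
    # PEP 440 ordering: dev release < pre-release < final < post-release
    assert Version("1.0.dev0") < Version("1.0a1") < Version("1.0") < Version("1.0.post1")
    print(v, v.base_version, v.public)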
site-packages/pkg_resources/_vendor/packaging/specifiers.py
# This file is dual licensed under the terms of the Apache License, Version
# 2.0, and the BSD License. See the LICENSE file in the root of this repository
# for complete details.
from __future__ import absolute_import, division, print_function

import abc
import functools
import itertools
import re

from ._compat import string_types, with_metaclass
from .version import Version, LegacyVersion, parse


class InvalidSpecifier(ValueError):
    """
    An invalid specifier was found, users should refer to PEP 440.
    """


class BaseSpecifier(with_metaclass(abc.ABCMeta, object)):

    @abc.abstractmethod
    def __str__(self):
        """
        Returns the str representation of this Specifier like object. This
        should be representative of the Specifier itself.
        """

    @abc.abstractmethod
    def __hash__(self):
        """
        Returns a hash value for this Specifier like object.
        """

    @abc.abstractmethod
    def __eq__(self, other):
        """
        Returns a boolean representing whether or not the two Specifier like
        objects are equal.
        """

    @abc.abstractmethod
    def __ne__(self, other):
        """
        Returns a boolean representing whether or not the two Specifier like
        objects are not equal.
        """

    @abc.abstractproperty
    def prereleases(self):
        """
        Returns whether or not pre-releases as a whole are allowed by this
        specifier.
        """

    @prereleases.setter
    def prereleases(self, value):
        """
        Sets whether or not pre-releases as a whole are allowed by this
        specifier.
        """

    @abc.abstractmethod
    def contains(self, item, prereleases=None):
        """
        Determines if the given item is contained within this specifier.
        """

    @abc.abstractmethod
    def filter(self, iterable, prereleases=None):
        """
        Takes an iterable of items and filters them so that only items which
        are contained within this specifier are retained.
        """


class _IndividualSpecifier(BaseSpecifier):

    _operators = {}

    def __init__(self, spec="", prereleases=None):
        match = self._regex.search(spec)
        if not match:
            raise InvalidSpecifier("Invalid specifier: '{0}'".format(spec))

        self._spec = (
            match.group("operator").strip(),
            match.group("version").strip(),
        )

        # Store whether or not this Specifier should accept prereleases
        self._prereleases = prereleases

    def __repr__(self):
        pre = (
            ", prereleases={0!r}".format(self.prereleases)
            if self._prereleases is not None
            else ""
        )

        return "<{0}({1!r}{2})>".format(
            self.__class__.__name__,
            str(self),
            pre,
        )

    def __str__(self):
        return "{0}{1}".format(*self._spec)

    def __hash__(self):
        return hash(self._spec)

    def __eq__(self, other):
        if isinstance(other, string_types):
            try:
                other = self.__class__(other)
            except InvalidSpecifier:
                return NotImplemented
        elif not isinstance(other, self.__class__):
            return NotImplemented

        return self._spec == other._spec

    def __ne__(self, other):
        if isinstance(other, string_types):
            try:
                other = self.__class__(other)
            except InvalidSpecifier:
                return NotImplemented
        elif not isinstance(other, self.__class__):
            return NotImplemented

        return self._spec != other._spec

    def _get_operator(self, op):
        return getattr(self, "_compare_{0}".format(self._operators[op]))

    def _coerce_version(self, version):
        if not isinstance(version, (LegacyVersion, Version)):
            version = parse(version)
        return version

    @property
    def operator(self):
        return self._spec[0]

    @property
    def version(self):
        return self._spec[1]

    @property
    def prereleases(self):
        return self._prereleases

    @prereleases.setter
    def prereleases(self, value):
        self._prereleases = value

    def __contains__(self, item):
        return self.contains(item)

    def contains(self, item, prereleases=None):
        # Determine if prereleases are to be allowed or not.
        if prereleases is None:
            prereleases = self.prereleases

        # Normalize item to a Version or LegacyVersion. This allows us to have
        # a shortcut for ``"2.0" in Specifier(">=2")``.
        item = self._coerce_version(item)

        # Determine if we should be supporting prereleases in this specifier
        # or not. If we do not support prereleases, we can short-circuit the
        # logic if this version is a prerelease.
        if item.is_prerelease and not prereleases:
            return False

        # Actually do the comparison to determine if this item is contained
        # within this Specifier or not.
        return self._get_operator(self.operator)(item, self.version)

    def filter(self, iterable, prereleases=None):
        yielded = False
        found_prereleases = []

        kw = {"prereleases": prereleases if prereleases is not None else True}

        # Attempt to iterate over all the values in the iterable and if any of
        # them match, yield them.
        for version in iterable:
            parsed_version = self._coerce_version(version)

            if self.contains(parsed_version, **kw):
                # If our version is a prerelease, and we were not set to allow
                # prereleases, then we'll store it for later in case nothing
                # else matches this specifier.
                if (parsed_version.is_prerelease and not
                        (prereleases or self.prereleases)):
                    found_prereleases.append(version)
                # Either this is not a prerelease, or we should have been
                # accepting prereleases from the beginning.
                else:
                    yielded = True
                    yield version

        # Now that we've iterated over everything, determine if we've yielded
        # any values, and if we have not and we have any prereleases stored up
        # then we will go ahead and yield the prereleases.
        if not yielded and found_prereleases:
            for version in found_prereleases:
                yield version


class LegacySpecifier(_IndividualSpecifier):

    _regex_str = (
        r"""
        (?P<operator>(==|!=|<=|>=|<|>))
        \s*
        (?P<version>
            [^,;\s)]* # Since this is a "legacy" specifier, and the version
                      # string can be just about anything, we match everything
                      # except for whitespace, a semi-colon for marker support,
                      # a closing paren since versions can be enclosed in
                      # them, and a comma since it's a version separator.
        )
        """
    )

    _regex = re.compile(
        r"^\s*" + _regex_str + r"\s*$", re.VERBOSE | re.IGNORECASE)

    _operators = {
        "==": "equal",
        "!=": "not_equal",
        "<=": "less_than_equal",
        ">=": "greater_than_equal",
        "<": "less_than",
        ">": "greater_than",
    }

    def _coerce_version(self, version):
        if not isinstance(version, LegacyVersion):
            version = LegacyVersion(str(version))
        return version

    def _compare_equal(self, prospective, spec):
        return prospective == self._coerce_version(spec)

    def _compare_not_equal(self, prospective, spec):
        return prospective != self._coerce_version(spec)

    def _compare_less_than_equal(self, prospective, spec):
        return prospective <= self._coerce_version(spec)

    def _compare_greater_than_equal(self, prospective, spec):
        return prospective >= self._coerce_version(spec)

    def _compare_less_than(self, prospective, spec):
        return prospective < self._coerce_version(spec)

    def _compare_greater_than(self, prospective, spec):
        return prospective > self._coerce_version(spec)


def _require_version_compare(fn):
    @functools.wraps(fn)
    def wrapped(self, prospective, spec):
        if not isinstance(prospective, Version):
            return False
        return fn(self, prospective, spec)
    return wrapped


class Specifier(_IndividualSpecifier):

    _regex_str = (
        r"""
        (?P<operator>(~=|==|!=|<=|>=|<|>|===))
        (?P<version>
            (?:
                # The identity operators allow for an escape hatch that will
                # do an exact string match of the version you wish to install.
                # This will not be parsed by PEP 440 and we cannot determine
                # any semantic meaning from it. This operator is discouraged
                # but included entirely as an escape hatch.
                (?<====)  # Only match for the identity operator
                \s*
                [^\s]*    # We just match everything, except for whitespace
                          # since we are only testing for strict identity.
            )
            |
            (?:
                # The (non)equality operators allow for wild card and local
                # versions to be specified so we have to define these two
                # operators separately to enable that.
                (?<===|!=)            # Only match for equals and not equals

                \s*
                v?
                (?:[0-9]+!)?          # epoch
                [0-9]+(?:\.[0-9]+)*   # release
                (?:                   # pre release
                    [-_\.]?
                    (a|b|c|rc|alpha|beta|pre|preview)
                    [-_\.]?
                    [0-9]*
                )?
                (?:                   # post release
                    (?:-[0-9]+)|(?:[-_\.]?(post|rev|r)[-_\.]?[0-9]*)
                )?

                # You cannot use a wild card and a dev or local version
                # together so group them with a | and make them optional.
                (?:
                    (?:[-_\.]?dev[-_\.]?[0-9]*)?         # dev release
                    (?:\+[a-z0-9]+(?:[-_\.][a-z0-9]+)*)? # local
                    |
                    \.\*  # Wild card syntax of .*
                )?
            )
            |
            (?:
                # The compatible operator requires at least two digits in the
                # release segment.
                (?<=~=)               # Only match for the compatible operator

                \s*
                v?
                (?:[0-9]+!)?          # epoch
                [0-9]+(?:\.[0-9]+)+   # release  (We have a + instead of a *)
                (?:                   # pre release
                    [-_\.]?
                    (a|b|c|rc|alpha|beta|pre|preview)
                    [-_\.]?
                    [0-9]*
                )?
                (?:                                   # post release
                    (?:-[0-9]+)|(?:[-_\.]?(post|rev|r)[-_\.]?[0-9]*)
                )?
                (?:[-_\.]?dev[-_\.]?[0-9]*)?          # dev release
            )
            |
            (?:
                # All other operators only allow a sub set of what the
                # (non)equality operators do. Specifically they do not allow
                # local versions to be specified nor do they allow the prefix
                # matching wild cards.
                (?<!==|!=|~=)         # We have special cases for these
                                      # operators so we want to make sure they
                                      # don't match here.

                \s*
                v?
                (?:[0-9]+!)?          # epoch
                [0-9]+(?:\.[0-9]+)*   # release
                (?:                   # pre release
                    [-_\.]?
                    (a|b|c|rc|alpha|beta|pre|preview)
                    [-_\.]?
                    [0-9]*
                )?
                (?:                                   # post release
                    (?:-[0-9]+)|(?:[-_\.]?(post|rev|r)[-_\.]?[0-9]*)
                )?
                (?:[-_\.]?dev[-_\.]?[0-9]*)?          # dev release
            )
        )
        """
    )

    _regex = re.compile(
        r"^\s*" + _regex_str + r"\s*$", re.VERBOSE | re.IGNORECASE)

    _operators = {
        "~=": "compatible",
        "==": "equal",
        "!=": "not_equal",
        "<=": "less_than_equal",
        ">=": "greater_than_equal",
        "<": "less_than",
        ">": "greater_than",
        "===": "arbitrary",
    }

    @_require_version_compare
    def _compare_compatible(self, prospective, spec):
        # Compatible releases have an equivalent combination of >= and ==. That
        # is, ~=2.2 is equivalent to >=2.2,==2.*. This allows us to
        # implement this in terms of the other specifiers instead of
        # implementing it ourselves. The only thing we need to do is construct
        # the other specifiers.

        # We want everything but the last item in the version, but we want to
        # ignore post and dev releases and we want to treat the pre-release as
        # its own separate segment.
        prefix = ".".join(
            list(
                itertools.takewhile(
                    lambda x: (not x.startswith("post") and not
                               x.startswith("dev")),
                    _version_split(spec),
                )
            )[:-1]
        )

        # Add the prefix notation to the end of our string
        prefix += ".*"

        return (self._get_operator(">=")(prospective, spec) and
                self._get_operator("==")(prospective, prefix))
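
    # Editor's note (illustrative, not part of the upstream source): for the
    # spec "2.2" the computed prefix is "2.*", so ~=2.2 acts like >=2.2,==2.*;
    # for the spec "1.4.5" the prefix is "1.4.*", i.e. >=1.4.5,==1.4.*.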

    @_require_version_compare
    def _compare_equal(self, prospective, spec):
        # We need special logic to handle prefix matching
        if spec.endswith(".*"):
            # In the case of prefix matching we want to ignore local segment.
            prospective = Version(prospective.public)
            # Split the spec out by dots, and pretend that there is an implicit
            # dot in between a release segment and a pre-release segment.
            spec = _version_split(spec[:-2])  # Remove the trailing .*

            # Split the prospective version out by dots, and pretend that there
            # is an implicit dot in between a release segment and a pre-release
            # segment.
            prospective = _version_split(str(prospective))

            # Shorten the prospective version to be the same length as the spec
            # so that we can determine if the specifier is a prefix of the
            # prospective version or not.
            prospective = prospective[:len(spec)]

            # Pad out our two sides with zeros so that they both equal the same
            # length.
            spec, prospective = _pad_version(spec, prospective)
        else:
            # Convert our spec string into a Version
            spec = Version(spec)

            # If the specifier does not have a local segment, then we want to
            # act as if the prospective version also does not have a local
            # segment.
            if not spec.local:
                prospective = Version(prospective.public)

        return prospective == spec

    @_require_version_compare
    def _compare_not_equal(self, prospective, spec):
        return not self._compare_equal(prospective, spec)

    @_require_version_compare
    def _compare_less_than_equal(self, prospective, spec):
        return prospective <= Version(spec)

    @_require_version_compare
    def _compare_greater_than_equal(self, prospective, spec):
        return prospective >= Version(spec)

    @_require_version_compare
    def _compare_less_than(self, prospective, spec):
        # Convert our spec to a Version instance, since we'll want to work with
        # it as a version.
        spec = Version(spec)

        # Check to see if the prospective version is less than the spec
        # version. If it's not we can short circuit and just return False now
        # instead of doing extra unneeded work.
        if not prospective < spec:
            return False

        # This special case is here so that, unless the specifier itself
        # includes a pre-release version, we do not accept pre-release
        # versions for the version mentioned in the specifier (e.g. <3.1 should
        # not match 3.1.dev0, but should match 3.0.dev0).
        if not spec.is_prerelease and prospective.is_prerelease:
            if Version(prospective.base_version) == Version(spec.base_version):
                return False

        # If we've gotten to here, it means that prospective version is both
        # less than the spec version *and* it's not a pre-release of the same
        # version in the spec.
        return True

    @_require_version_compare
    def _compare_greater_than(self, prospective, spec):
        # Convert our spec to a Version instance, since we'll want to work with
        # it as a version.
        spec = Version(spec)

        # Check to see if the prospective version is greater than the spec
        # version. If it's not we can short circuit and just return False now
        # instead of doing extra unneeded work.
        if not prospective > spec:
            return False

        # This special case is here so that, unless the specifier itself
        # includes a post-release version, we do not accept post-release
        # versions for the version mentioned in the specifier
        # (e.g. >3.1 should not match 3.0.post0, but should match 3.2.post0).
        if not spec.is_postrelease and prospective.is_postrelease:
            if Version(prospective.base_version) == Version(spec.base_version):
                return False

        # Ensure that we do not allow a local version of the version mentioned
        # in the specifier, which is technically greater than, to match.
        if prospective.local is not None:
            if Version(prospective.base_version) == Version(spec.base_version):
                return False

        # If we've gotten to here, it means that prospective version is both
        # greater than the spec version *and* it's not a pre-release of the
        # same version in the spec.
        return True

    def _compare_arbitrary(self, prospective, spec):
        return str(prospective).lower() == str(spec).lower()

    @property
    def prereleases(self):
        # If there is an explicit prereleases set for this, then we'll just
        # blindly use that.
        if self._prereleases is not None:
            return self._prereleases

        # Look at all of our specifiers and determine if they are inclusive
        # operators, and if they are if they are including an explicit
        # prerelease.
        operator, version = self._spec
        if operator in ["==", ">=", "<=", "~=", "==="]:
            # The == specifier can include a trailing .*, and if it does we
            # want to remove it before parsing.
            if operator == "==" and version.endswith(".*"):
                version = version[:-2]

            # Parse the version, and if it is a pre-release then this
            # specifier allows pre-releases.
            if parse(version).is_prerelease:
                return True

        return False

    @prereleases.setter
    def prereleases(self, value):
        self._prereleases = value


_prefix_regex = re.compile(r"^([0-9]+)((?:a|b|c|rc)[0-9]+)$")


def _version_split(version):
    result = []
    for item in version.split("."):
        match = _prefix_regex.search(item)
        if match:
            result.extend(match.groups())
        else:
            result.append(item)
    return result


def _pad_version(left, right):
    left_split, right_split = [], []

    # Get the release segment of our versions
    left_split.append(list(itertools.takewhile(lambda x: x.isdigit(), left)))
    right_split.append(list(itertools.takewhile(lambda x: x.isdigit(), right)))

    # Get the rest of our versions
    left_split.append(left[len(left_split[0]):])
    right_split.append(right[len(right_split[0]):])

    # Insert our padding
    left_split.insert(
        1,
        ["0"] * max(0, len(right_split[0]) - len(left_split[0])),
    )
    right_split.insert(
        1,
        ["0"] * max(0, len(left_split[0]) - len(right_split[0])),
    )

    return (
        list(itertools.chain(*left_split)),
        list(itertools.chain(*right_split)),
    )
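
# Editor's note (illustrative, not part of the upstream source):
#     _pad_version(["1", "0"], ["1", "2", "3"])
# returns (["1", "0", "0"], ["1", "2", "3"]), so both sides of a prefix
# comparison end up with release segments of equal length.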


class SpecifierSet(BaseSpecifier):

    def __init__(self, specifiers="", prereleases=None):
        # Split on "," to break each individual specifier into its own item, and
        # strip each item to remove leading/trailing whitespace.
        specifiers = [s.strip() for s in specifiers.split(",") if s.strip()]

        # Parse each individual specifier, attempting first to make it a
        # Specifier and falling back to a LegacySpecifier.
        parsed = set()
        for specifier in specifiers:
            try:
                parsed.add(Specifier(specifier))
            except InvalidSpecifier:
                parsed.add(LegacySpecifier(specifier))

        # Turn our parsed specifiers into a frozen set and save them for later.
        self._specs = frozenset(parsed)

        # Store our prereleases value so we can use it later to determine if
        # we accept prereleases or not.
        self._prereleases = prereleases

    def __repr__(self):
        pre = (
            ", prereleases={0!r}".format(self.prereleases)
            if self._prereleases is not None
            else ""
        )

        return "<SpecifierSet({0!r}{1})>".format(str(self), pre)

    def __str__(self):
        return ",".join(sorted(str(s) for s in self._specs))

    def __hash__(self):
        return hash(self._specs)

    def __and__(self, other):
        if isinstance(other, string_types):
            other = SpecifierSet(other)
        elif not isinstance(other, SpecifierSet):
            return NotImplemented

        specifier = SpecifierSet()
        specifier._specs = frozenset(self._specs | other._specs)

        if self._prereleases is None and other._prereleases is not None:
            specifier._prereleases = other._prereleases
        elif self._prereleases is not None and other._prereleases is None:
            specifier._prereleases = self._prereleases
        elif self._prereleases == other._prereleases:
            specifier._prereleases = self._prereleases
        else:
            raise ValueError(
                "Cannot combine SpecifierSets with True and False prerelease "
                "overrides."
            )

        return specifier

    def __eq__(self, other):
        if isinstance(other, string_types):
            other = SpecifierSet(other)
        elif isinstance(other, _IndividualSpecifier):
            other = SpecifierSet(str(other))
        elif not isinstance(other, SpecifierSet):
            return NotImplemented

        return self._specs == other._specs

    def __ne__(self, other):
        if isinstance(other, string_types):
            other = SpecifierSet(other)
        elif isinstance(other, _IndividualSpecifier):
            other = SpecifierSet(str(other))
        elif not isinstance(other, SpecifierSet):
            return NotImplemented

        return self._specs != other._specs

    def __len__(self):
        return len(self._specs)

    def __iter__(self):
        return iter(self._specs)

    @property
    def prereleases(self):
        # If we have been given an explicit prerelease modifier, then we'll
        # pass that through here.
        if self._prereleases is not None:
            return self._prereleases

        # If we don't have any specifiers, and we don't have a forced value,
        # then we'll just return None since we don't know if this should have
        # pre-releases or not.
        if not self._specs:
            return None

        # Otherwise we'll see if any of the given specifiers accept
        # prereleases, if any of them do we'll return True, otherwise False.
        return any(s.prereleases for s in self._specs)

    @prereleases.setter
    def prereleases(self, value):
        self._prereleases = value

    def __contains__(self, item):
        return self.contains(item)

    def contains(self, item, prereleases=None):
        # Ensure that our item is a Version or LegacyVersion instance.
        if not isinstance(item, (LegacyVersion, Version)):
            item = parse(item)

        # Determine if we're forcing a prerelease or not; if we're not forcing
        # one for this particular call, then we'll use whatever the
        # SpecifierSet thinks about whether or not we should support prereleases.
        if prereleases is None:
            prereleases = self.prereleases

        # We can determine if we're going to allow pre-releases by looking to
        # see if any of the underlying items supports them. If none of them do
        # and this item is a pre-release then we do not allow it and we can
        # short circuit that here.
        # Note: This means that 1.0.dev1 would not be contained in something
        #       like >=1.0.devabc, however it would be in >=1.0.devabc,>0.0.dev0
        if not prereleases and item.is_prerelease:
            return False

        # We simply dispatch to the underlying specs here to make sure that the
        # given version is contained within all of them.
        # Note: This use of all() here means that an empty set of specifiers
        #       will always return True, this is an explicit design decision.
        return all(
            s.contains(item, prereleases=prereleases)
            for s in self._specs
        )

    def filter(self, iterable, prereleases=None):
        # Determine if we're forcing a prerelease or not, if we're not forcing
        # one for this particular filter call, then we'll use whatever the
        # SpecifierSet thinks for whether or not we should support prereleases.
        if prereleases is None:
            prereleases = self.prereleases

        # If we have any specifiers, then we want to wrap our iterable in the
        # filter method for each one, this will act as a logical AND amongst
        # each specifier.
        if self._specs:
            for spec in self._specs:
                iterable = spec.filter(iterable, prereleases=bool(prereleases))
            return iterable
        # If we do not have any specifiers, then we need to have a rough filter
        # which will filter out any pre-releases, unless there are no final
        # releases, and which will filter out LegacyVersion in general.
        else:
            filtered = []
            found_prereleases = []

            for item in iterable:
                # Ensure that we have some kind of Version class for this item.
                if not isinstance(item, (LegacyVersion, Version)):
                    parsed_version = parse(item)
                else:
                    parsed_version = item

                # Filter out any item which is parsed as a LegacyVersion
                if isinstance(parsed_version, LegacyVersion):
                    continue

                # Store any item which is a pre-release for later unless we've
                # already found a final version or we are accepting prereleases
                if parsed_version.is_prerelease and not prereleases:
                    if not filtered:
                        found_prereleases.append(item)
                else:
                    filtered.append(item)

            # If we've found no items except for pre-releases, then we'll go
            # ahead and use the pre-releases
            if not filtered and found_prereleases and prereleases is None:
                return found_prereleases

            return filtered
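
# --- Editor's illustrative sketch (not part of the upstream packaging module).
# A minimal, hedged demonstration of Specifier/SpecifierSet behaviour; it only
# runs when this file is executed directly.
if __name__ == "__main__":
    spec = SpecifierSet(">=1.0,<2.0")
    assert "1.5" in spec and "2.0" not in spec
    # Pre-releases are excluded unless explicitly enabled.
    assert not spec.contains("1.5.dev0")
    assert spec.contains("1.5.dev0", prereleases=True)
    assert list(spec.filter(["0.9", "1.0", "1.5.dev0", "1.9"])) == ["1.0", "1.9"]
    # ~=2.2 behaves like >=2.2,==2.*
    assert "2.5" in SpecifierSet("~=2.2") and "3.0" not in SpecifierSet("~=2.2")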
site-packages/pkg_resources/_vendor/packaging/__about__.py
# This file is dual licensed under the terms of the Apache License, Version
# 2.0, and the BSD License. See the LICENSE file in the root of this repository
# for complete details.
from __future__ import absolute_import, division, print_function

__all__ = [
    "__title__", "__summary__", "__uri__", "__version__", "__author__",
    "__email__", "__license__", "__copyright__",
]

__title__ = "packaging"
__summary__ = "Core utilities for Python packages"
__uri__ = "https://github.com/pypa/packaging"

__version__ = "16.8"

__author__ = "Donald Stufft and individual contributors"
__email__ = "donald@stufft.io"

__license__ = "BSD or Apache License, Version 2.0"
__copyright__ = "Copyright 2014-2016 %s" % __author__
site-packages/pkg_resources/_vendor/packaging/_structures.py
# This file is dual licensed under the terms of the Apache License, Version
# 2.0, and the BSD License. See the LICENSE file in the root of this repository
# for complete details.
from __future__ import absolute_import, division, print_function


class Infinity(object):

    def __repr__(self):
        return "Infinity"

    def __hash__(self):
        return hash(repr(self))

    def __lt__(self, other):
        return False

    def __le__(self, other):
        return False

    def __eq__(self, other):
        return isinstance(other, self.__class__)

    def __ne__(self, other):
        return not isinstance(other, self.__class__)

    def __gt__(self, other):
        return True

    def __ge__(self, other):
        return True

    def __neg__(self):
        return NegativeInfinity

Infinity = Infinity()


class NegativeInfinity(object):

    def __repr__(self):
        return "-Infinity"

    def __hash__(self):
        return hash(repr(self))

    def __lt__(self, other):
        return True

    def __le__(self, other):
        return True

    def __eq__(self, other):
        return isinstance(other, self.__class__)

    def __ne__(self, other):
        return not isinstance(other, self.__class__)

    def __gt__(self, other):
        return False

    def __ge__(self, other):
        return False

    def __neg__(self):
        return Infinity

NegativeInfinity = NegativeInfinity()
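
# --- Editor's illustrative sketch (not part of the upstream packaging module).
# Infinity and NegativeInfinity are sentinel singletons that compare above and
# below every other value; version._cmpkey() uses them as sort-key padding.
if __name__ == "__main__":
    assert Infinity > ("z", 99) and NegativeInfinity < ("", 0)
    assert -Infinity is NegativeInfinity
    assert -NegativeInfinity is Infinity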
site-packages/pkg_resources/__init__.py
# coding: utf-8
"""
Package resource API
--------------------

A resource is a logical file contained within a package, or a logical
subdirectory thereof.  The package resource API expects resource names
to have their path parts separated with ``/``, *not* whatever the local
path separator is.  Do not use os.path operations to manipulate resource
names being passed into the API.

The package resource API is designed to work with normal filesystem packages,
.egg files, and unpacked .egg files.  It can also work in a limited way with
.zip files and with custom PEP 302 loaders that support the ``get_data()``
method.
"""

from __future__ import absolute_import

import sys
import os
import io
import time
import re
import types
import zipfile
import zipimport
import warnings
import stat
import functools
import pkgutil
import operator
import platform
import collections
import plistlib
import email.parser
import errno
import tempfile
import textwrap
import itertools
import inspect
from pkgutil import get_importer

try:
    import _imp
except ImportError:
    # Python 3.2 compatibility
    import imp as _imp

from pkg_resources.extern import six
from pkg_resources.extern.six.moves import urllib, map, filter

# capture these to bypass sandboxing
from os import utime
try:
    from os import mkdir, rename, unlink
    WRITE_SUPPORT = True
except ImportError:
    # no write support, probably under GAE
    WRITE_SUPPORT = False

from os import open as os_open
from os.path import isdir, split

try:
    import importlib.machinery as importlib_machinery
    # access attribute to force import under delayed import mechanisms.
    importlib_machinery.__name__
except ImportError:
    importlib_machinery = None

from . import py31compat
from pkg_resources.extern import appdirs
from pkg_resources.extern import packaging
__import__('pkg_resources.extern.packaging.version')
__import__('pkg_resources.extern.packaging.specifiers')
__import__('pkg_resources.extern.packaging.requirements')
__import__('pkg_resources.extern.packaging.markers')


if (3, 0) < sys.version_info < (3, 3):
    raise RuntimeError("Python 3.3 or later is required")

if six.PY2:
    # Those builtin exceptions are only defined in Python 3
    PermissionError = None
    NotADirectoryError = None

# declare some globals that will be defined later to
# satisfy the linters.
require = None
working_set = None
add_activation_listener = None
resources_stream = None
cleanup_resources = None
resource_dir = None
resource_stream = None
set_extraction_path = None
resource_isdir = None
resource_string = None
iter_entry_points = None
resource_listdir = None
resource_filename = None
resource_exists = None
_distribution_finders = None
_namespace_handlers = None
_namespace_packages = None


class PEP440Warning(RuntimeWarning):
    """
    Used when there is an issue with a version or specifier not complying with
    PEP 440.
    """


def parse_version(v):
    try:
        return packaging.version.Version(v)
    except packaging.version.InvalidVersion:
        return packaging.version.LegacyVersion(v)
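
# Editor's note (illustrative, not part of the upstream source):
#     parse_version("1.0.post1")  -> packaging.version.Version instance
#     parse_version("2014j")      -> packaging.version.LegacyVersion instance
# LegacyVersion is the fallback for strings that are not valid PEP 440 versions.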


_state_vars = {}


def _declare_state(vartype, **kw):
    globals().update(kw)
    _state_vars.update(dict.fromkeys(kw, vartype))


def __getstate__():
    state = {}
    g = globals()
    for k, v in _state_vars.items():
        state[k] = g['_sget_' + v](g[k])
    return state


def __setstate__(state):
    g = globals()
    for k, v in state.items():
        g['_sset_' + _state_vars[k]](k, g[k], v)
    return state


def _sget_dict(val):
    return val.copy()


def _sset_dict(key, ob, state):
    ob.clear()
    ob.update(state)


def _sget_object(val):
    return val.__getstate__()


def _sset_object(key, ob, state):
    ob.__setstate__(state)


_sget_none = _sset_none = lambda *args: None
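
# Editor's note (illustrative, not part of the upstream source; the variable
# name below is hypothetical): a call such as
#     _declare_state('dict', _hypothetical_registry={})
# creates the module global and records its type, so __getstate__() snapshots
# it via _sget_dict() and __setstate__() restores it via _sset_dict().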


def get_supported_platform():
    """Return this platform's maximum compatible version.

    distutils.util.get_platform() normally reports the minimum version
    of Mac OS X that would be required to *use* extensions produced by
    distutils.  But what we want when checking compatibility is to know the
    version of Mac OS X that we are *running*.  To allow usage of packages that
    explicitly require a newer version of Mac OS X, we must also know the
    current version of the OS.

    If this condition occurs for any other platform with a version in its
    platform strings, this function should be extended accordingly.
    """
    plat = get_build_platform()
    m = macosVersionString.match(plat)
    if m is not None and sys.platform == "darwin":
        try:
            plat = 'macosx-%s-%s' % ('.'.join(_macosx_vers()[:2]), m.group(3))
        except ValueError:
            # not Mac OS X
            pass
    return plat


__all__ = [
    # Basic resource access and distribution/entry point discovery
    'require', 'run_script', 'get_provider', 'get_distribution',
    'load_entry_point', 'get_entry_map', 'get_entry_info',
    'iter_entry_points',
    'resource_string', 'resource_stream', 'resource_filename',
    'resource_listdir', 'resource_exists', 'resource_isdir',

    # Environmental control
    'declare_namespace', 'working_set', 'add_activation_listener',
    'find_distributions', 'set_extraction_path', 'cleanup_resources',
    'get_default_cache',

    # Primary implementation classes
    'Environment', 'WorkingSet', 'ResourceManager',
    'Distribution', 'Requirement', 'EntryPoint',

    # Exceptions
    'ResolutionError', 'VersionConflict', 'DistributionNotFound',
    'UnknownExtra', 'ExtractionError',

    # Warnings
    'PEP440Warning',

    # Parsing functions and string utilities
    'parse_requirements', 'parse_version', 'safe_name', 'safe_version',
    'get_platform', 'compatible_platforms', 'yield_lines', 'split_sections',
    'safe_extra', 'to_filename', 'invalid_marker', 'evaluate_marker',

    # filesystem utilities
    'ensure_directory', 'normalize_path',

    # Distribution "precedence" constants
    'EGG_DIST', 'BINARY_DIST', 'SOURCE_DIST', 'CHECKOUT_DIST', 'DEVELOP_DIST',

    # "Provider" interfaces, implementations, and registration/lookup APIs
    'IMetadataProvider', 'IResourceProvider', 'FileMetadata',
    'PathMetadata', 'EggMetadata', 'EmptyProvider', 'empty_provider',
    'NullProvider', 'EggProvider', 'DefaultProvider', 'ZipProvider',
    'register_finder', 'register_namespace_handler', 'register_loader_type',
    'fixup_namespace_packages', 'get_importer',

    # Deprecated/backward compatibility only
    'run_main', 'AvailableDistributions',
]


class ResolutionError(Exception):
    """Abstract base for dependency resolution errors"""

    def __repr__(self):
        return self.__class__.__name__ + repr(self.args)


class VersionConflict(ResolutionError):
    """
    An already-installed version conflicts with the requested version.

    Should be initialized with the installed Distribution and the requested
    Requirement.
    """

    _template = "{self.dist} is installed but {self.req} is required"

    @property
    def dist(self):
        return self.args[0]

    @property
    def req(self):
        return self.args[1]

    def report(self):
        return self._template.format(**locals())

    def with_context(self, required_by):
        """
        If required_by is non-empty, return a version of self that is a
        ContextualVersionConflict.
        """
        if not required_by:
            return self
        args = self.args + (required_by,)
        return ContextualVersionConflict(*args)


class ContextualVersionConflict(VersionConflict):
    """
    A VersionConflict that accepts a third parameter, the set of the
    requirements that required the installed Distribution.
    """

    _template = VersionConflict._template + ' by {self.required_by}'

    @property
    def required_by(self):
        return self.args[2]


class DistributionNotFound(ResolutionError):
    """A requested distribution was not found"""

    _template = ("The '{self.req}' distribution was not found "
                 "and is required by {self.requirers_str}")

    @property
    def req(self):
        return self.args[0]

    @property
    def requirers(self):
        return self.args[1]

    @property
    def requirers_str(self):
        if not self.requirers:
            return 'the application'
        return ', '.join(self.requirers)

    def report(self):
        return self._template.format(**locals())

    def __str__(self):
        return self.report()


class UnknownExtra(ResolutionError):
    """Distribution doesn't have an "extra feature" of the given name"""


_provider_factories = {}

PY_MAJOR = sys.version[:3]
EGG_DIST = 3
BINARY_DIST = 2
SOURCE_DIST = 1
CHECKOUT_DIST = 0
DEVELOP_DIST = -1


def register_loader_type(loader_type, provider_factory):
    """Register `provider_factory` to make providers for `loader_type`

    `loader_type` is the type or class of a PEP 302 ``module.__loader__``,
    and `provider_factory` is a function that, passed a *module* object,
    returns an ``IResourceProvider`` for that module.
    """
    _provider_factories[loader_type] = provider_factory


def get_provider(moduleOrReq):
    """Return an IResourceProvider for the named module or requirement"""
    if isinstance(moduleOrReq, Requirement):
        return working_set.find(moduleOrReq) or require(str(moduleOrReq))[0]
    try:
        module = sys.modules[moduleOrReq]
    except KeyError:
        __import__(moduleOrReq)
        module = sys.modules[moduleOrReq]
    loader = getattr(module, '__loader__', None)
    return _find_adapter(_provider_factories, loader)(module)


def _macosx_vers(_cache=[]):
    if not _cache:
        version = platform.mac_ver()[0]
        # fallback for MacPorts
        if version == '':
            plist = '/System/Library/CoreServices/SystemVersion.plist'
            if os.path.exists(plist):
                if hasattr(plistlib, 'readPlist'):
                    plist_content = plistlib.readPlist(plist)
                    if 'ProductVersion' in plist_content:
                        version = plist_content['ProductVersion']

        _cache.append(version.split('.'))
    return _cache[0]


def _macosx_arch(machine):
    return {'PowerPC': 'ppc', 'Power_Macintosh': 'ppc'}.get(machine, machine)


def get_build_platform():
    """Return this platform's string for platform-specific distributions

    XXX Currently this is the same as ``distutils.util.get_platform()``, but it
    needs some hacks for Linux and Mac OS X.
    """
    from sysconfig import get_platform

    plat = get_platform()
    if sys.platform == "darwin" and not plat.startswith('macosx-'):
        try:
            version = _macosx_vers()
            machine = os.uname()[4].replace(" ", "_")
            return "macosx-%d.%d-%s" % (
                int(version[0]), int(version[1]),
                _macosx_arch(machine),
            )
        except ValueError:
            # if someone is running a non-Mac darwin system, this will fall
            # through to the default implementation
            pass
    return plat


macosVersionString = re.compile(r"macosx-(\d+)\.(\d+)-(.*)")
darwinVersionString = re.compile(r"darwin-(\d+)\.(\d+)\.(\d+)-(.*)")
# XXX backward compat
get_platform = get_build_platform


def compatible_platforms(provided, required):
    """Can code for the `provided` platform run on the `required` platform?

    Returns true if either platform is ``None``, or the platforms are equal.

    XXX Needs compatibility checks for Linux and other unixy OSes.
    """
    if provided is None or required is None or provided == required:
        # easy case
        return True

    # Mac OS X special cases
    reqMac = macosVersionString.match(required)
    if reqMac:
        provMac = macosVersionString.match(provided)

        # is this a Mac package?
        if not provMac:
            # this is backwards compatibility for packages built before
            # setuptools 0.6. All packages built after this point will
            # use the new macosx designation.
            provDarwin = darwinVersionString.match(provided)
            if provDarwin:
                dversion = int(provDarwin.group(1))
                macosversion = "%s.%s" % (reqMac.group(1), reqMac.group(2))
                if dversion == 7 and macosversion >= "10.3" or \
                        dversion == 8 and macosversion >= "10.4":
                    return True
            # egg isn't macosx or legacy darwin
            return False

        # are they the same major version and machine type?
        if provMac.group(1) != reqMac.group(1) or \
                provMac.group(3) != reqMac.group(3):
            return False

        # is the required OS major update >= the provided one?
        if int(provMac.group(2)) > int(reqMac.group(2)):
            return False

        return True

    # XXX Linux and other platforms' special cases should go here
    return False
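
# Editor's note (illustrative, not part of the upstream source): an egg built
# for an older Mac OS X release is usable on a newer one, but not vice versa:
#     compatible_platforms("macosx-10.3-ppc", "macosx-10.4-ppc")  # True
#     compatible_platforms("macosx-10.4-ppc", "macosx-10.3-ppc")  # False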


def run_script(dist_spec, script_name):
    """Locate distribution `dist_spec` and run its `script_name` script"""
    ns = sys._getframe(1).f_globals
    name = ns['__name__']
    ns.clear()
    ns['__name__'] = name
    require(dist_spec)[0].run_script(script_name, ns)


# backward compatibility
run_main = run_script


def get_distribution(dist):
    """Return a current distribution object for a Requirement or string"""
    if isinstance(dist, six.string_types):
        dist = Requirement.parse(dist)
    if isinstance(dist, Requirement):
        dist = get_provider(dist)
    if not isinstance(dist, Distribution):
        raise TypeError("Expected string, Requirement, or Distribution", dist)
    return dist


def load_entry_point(dist, group, name):
    """Return `name` entry point of `group` for `dist` or raise ImportError"""
    return get_distribution(dist).load_entry_point(group, name)


def get_entry_map(dist, group=None):
    """Return the entry point map for `group`, or the full entry map"""
    return get_distribution(dist).get_entry_map(group)


def get_entry_info(dist, group, name):
    """Return the EntryPoint object for `group`+`name`, or ``None``"""
    return get_distribution(dist).get_entry_info(group, name)
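

# A minimal illustrative sketch of the entry-point convenience wrappers above;
# 'examplepkg' and the 'example-cli' script are hypothetical names, and the
# calls would only succeed if such a distribution were actually installed.
def _entry_point_examples():
    dist = get_distribution('examplepkg')                # Distribution object
    ep_map = get_entry_map(dist, 'console_scripts')      # {name: EntryPoint}
    ep = get_entry_info(dist, 'console_scripts', 'example-cli')  # or None
    if ep is not None:
        return load_entry_point(dist, 'console_scripts', 'example-cli')
    return ep_map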


class IMetadataProvider:
    def has_metadata(name):
        """Does the package's distribution contain the named metadata?"""

    def get_metadata(name):
        """The named metadata resource as a string"""

    def get_metadata_lines(name):
        """Yield named metadata resource as list of non-blank non-comment lines

        Leading and trailing whitespace is stripped from each line, and lines
        with ``#`` as the first non-blank character are omitted."""

    def metadata_isdir(name):
        """Is the named metadata a directory?  (like ``os.path.isdir()``)"""

    def metadata_listdir(name):
        """List of metadata names in the directory (like ``os.listdir()``)"""

    def run_script(script_name, namespace):
        """Execute the named script in the supplied namespace dictionary"""


class IResourceProvider(IMetadataProvider):
    """An object that provides access to package resources"""

    def get_resource_filename(manager, resource_name):
        """Return a true filesystem path for `resource_name`

        `manager` must be an ``IResourceManager``"""

    def get_resource_stream(manager, resource_name):
        """Return a readable file-like object for `resource_name`

        `manager` must be an ``IResourceManager``"""

    def get_resource_string(manager, resource_name):
        """Return a string containing the contents of `resource_name`

        `manager` must be an ``IResourceManager``"""

    def has_resource(resource_name):
        """Does the package contain the named resource?"""

    def resource_isdir(resource_name):
        """Is the named resource a directory?  (like ``os.path.isdir()``)"""

    def resource_listdir(resource_name):
        """List of resource names in the directory (like ``os.listdir()``)"""


class WorkingSet(object):
    """A collection of active distributions on sys.path (or a similar list)"""

    def __init__(self, entries=None):
        """Create working set from list of path entries (default=sys.path)"""
        self.entries = []
        self.entry_keys = {}
        self.by_key = {}
        self.callbacks = []

        if entries is None:
            entries = sys.path

        for entry in entries:
            self.add_entry(entry)

    @classmethod
    def _build_master(cls):
        """
        Prepare the master working set.
        """
        ws = cls()
        try:
            from __main__ import __requires__
        except ImportError:
            # The main program does not list any requirements
            return ws

        # ensure the requirements are met
        try:
            ws.require(__requires__)
        except VersionConflict:
            return cls._build_from_requirements(__requires__)

        return ws

    @classmethod
    def _build_from_requirements(cls, req_spec):
        """
        Build a working set from a requirement spec. Rewrites sys.path.
        """
        # try it without defaults already on sys.path
        # by starting with an empty path
        ws = cls([])
        reqs = parse_requirements(req_spec)
        dists = ws.resolve(reqs, Environment())
        for dist in dists:
            ws.add(dist)

        # add any missing entries from sys.path
        for entry in sys.path:
            if entry not in ws.entries:
                ws.add_entry(entry)

        # then copy back to sys.path
        sys.path[:] = ws.entries
        return ws

    def add_entry(self, entry):
        """Add a path item to ``.entries``, finding any distributions on it

        ``find_distributions(entry, True)`` is used to find distributions
        corresponding to the path entry, and they are added.  `entry` is
        always appended to ``.entries``, even if it is already present.
        (This is because ``sys.path`` can contain the same value more than
        once, and the ``.entries`` of the ``sys.path`` WorkingSet should always
        equal ``sys.path``.)
        """
        self.entry_keys.setdefault(entry, [])
        self.entries.append(entry)
        for dist in find_distributions(entry, True):
            self.add(dist, entry, False)

    def __contains__(self, dist):
        """True if `dist` is the active distribution for its project"""
        return self.by_key.get(dist.key) == dist

    def find(self, req):
        """Find a distribution matching requirement `req`

        If there is an active distribution for the requested project, this
        returns it as long as it meets the version requirement specified by
        `req`.  But, if there is an active distribution for the project and it
        does *not* meet the `req` requirement, ``VersionConflict`` is raised.
        If there is no active distribution for the requested project, ``None``
        is returned.
        """
        dist = self.by_key.get(req.key)
        if dist is not None and dist not in req:
            # XXX add more info
            raise VersionConflict(dist, req)
        return dist

    def iter_entry_points(self, group, name=None):
        """Yield entry point objects from `group` matching `name`

        If `name` is None, yields all entry points in `group` from all
        distributions in the working set, otherwise only ones matching
        both `group` and `name` are yielded (in distribution order).
        """
        for dist in self:
            entries = dist.get_entry_map(group)
            if name is None:
                for ep in entries.values():
                    yield ep
            elif name in entries:
                yield entries[name]

    def run_script(self, requires, script_name):
        """Locate distribution for `requires` and run `script_name` script"""
        ns = sys._getframe(1).f_globals
        name = ns['__name__']
        ns.clear()
        ns['__name__'] = name
        self.require(requires)[0].run_script(script_name, ns)

    def __iter__(self):
        """Yield distributions for non-duplicate projects in the working set

        The yield order is the order in which the items' path entries were
        added to the working set.
        """
        seen = {}
        for item in self.entries:
            if item not in self.entry_keys:
                # workaround a cache issue
                continue

            for key in self.entry_keys[item]:
                if key not in seen:
                    seen[key] = 1
                    yield self.by_key[key]

    def add(self, dist, entry=None, insert=True, replace=False):
        """Add `dist` to working set, associated with `entry`

        If `entry` is unspecified, it defaults to the ``.location`` of `dist`.
        On exit from this routine, `entry` is added to the end of the working
        set's ``.entries`` (if it wasn't already present).

        `dist` is only added to the working set if it's for a project that
        doesn't already have a distribution in the set, unless `replace=True`.
        If it's added, any callbacks registered with the ``subscribe()`` method
        will be called.
        """
        if insert:
            dist.insert_on(self.entries, entry, replace=replace)

        if entry is None:
            entry = dist.location
        keys = self.entry_keys.setdefault(entry, [])
        keys2 = self.entry_keys.setdefault(dist.location, [])
        if not replace and dist.key in self.by_key:
            # ignore hidden distros
            return

        self.by_key[dist.key] = dist
        if dist.key not in keys:
            keys.append(dist.key)
        if dist.key not in keys2:
            keys2.append(dist.key)
        self._added_new(dist)

    def resolve(self, requirements, env=None, installer=None,
                replace_conflicting=False, extras=None):
        """List all distributions needed to (recursively) meet `requirements`

        `requirements` must be a sequence of ``Requirement`` objects.  `env`,
        if supplied, should be an ``Environment`` instance.  If
        not supplied, it defaults to all distributions available within any
        entry or distribution in the working set.  `installer`, if supplied,
        will be invoked with each requirement that cannot be met by an
        already-installed distribution; it should return a ``Distribution`` or
        ``None``.

        Unless `replace_conflicting=True`, raises a VersionConflict exception
        if any requirements are found on the path that have the correct name
        but the wrong version.  Otherwise, if an `installer` is supplied it
        will be invoked to obtain the correct version of the requirement and
        activate it.

        `extras` is a list of the extras to be used with these requirements.
        This is important because extra requirements may look like `my_req;
        extra = "my_extra"`, which would otherwise be interpreted as a purely
        optional requirement.  Instead, we want to be able to assert that these
        requirements are truly required.
        """

        # set up the stack
        requirements = list(requirements)[::-1]
        # set of processed requirements
        processed = {}
        # key -> dist
        best = {}
        to_activate = []

        req_extras = _ReqExtras()

        # Mapping of requirement to set of distributions that required it;
        # useful for reporting info about conflicts.
        required_by = collections.defaultdict(set)

        while requirements:
            # process dependencies breadth-first
            req = requirements.pop(0)
            if req in processed:
                # Ignore cyclic or redundant dependencies
                continue

            if not req_extras.markers_pass(req, extras):
                continue

            dist = best.get(req.key)
            if dist is None:
                # Find the best distribution and add it to the map
                dist = self.by_key.get(req.key)
                if dist is None or (dist not in req and replace_conflicting):
                    ws = self
                    if env is None:
                        if dist is None:
                            env = Environment(self.entries)
                        else:
                            # Use an empty environment and workingset to avoid
                            # any further conflicts with the conflicting
                            # distribution
                            env = Environment([])
                            ws = WorkingSet([])
                    dist = best[req.key] = env.best_match(
                        req, ws, installer,
                        replace_conflicting=replace_conflicting
                    )
                    if dist is None:
                        requirers = required_by.get(req, None)
                        raise DistributionNotFound(req, requirers)
                to_activate.append(dist)
            if dist not in req:
                # Oops, the "best" so far conflicts with a dependency
                dependent_req = required_by[req]
                raise VersionConflict(dist, req).with_context(dependent_req)

            # push the new requirements onto the stack
            new_requirements = dist.requires(req.extras)[::-1]
            requirements.extend(new_requirements)

            # Register the new requirements needed by req
            for new_requirement in new_requirements:
                required_by[new_requirement].add(req.project_name)
                req_extras[new_requirement] = req.extras

            processed[req] = True

        # return list of distros to activate
        return to_activate

    def find_plugins(
            self, plugin_env, full_env=None, installer=None, fallback=True):
        """Find all activatable distributions in `plugin_env`

        Example usage::

            distributions, errors = working_set.find_plugins(
                Environment(plugin_dirlist)
            )
            # add plugins+libs to sys.path
            map(working_set.add, distributions)
            # display errors
            print('Could not load', errors)

        The `plugin_env` should be an ``Environment`` instance that contains
        only distributions that are in the project's "plugin directory" or
        directories. The `full_env`, if supplied, should be an ``Environment``
        that contains all currently-available distributions.  If `full_env` is
        not supplied, one is created automatically from the ``WorkingSet`` this
        method is called on, which will typically mean that every directory on
        ``sys.path`` will be scanned for distributions.

        `installer` is a standard installer callback as used by the
        ``resolve()`` method. The `fallback` flag indicates whether we should
        attempt to resolve older versions of a plugin if the newest version
        cannot be resolved.

        This method returns a 2-tuple: (`distributions`, `error_info`), where
        `distributions` is a list of the distributions found in `plugin_env`
        that were loadable, along with any other distributions that are needed
        to resolve their dependencies.  `error_info` is a dictionary mapping
        unloadable plugin distributions to an exception instance describing the
        error that occurred. Usually this will be a ``DistributionNotFound`` or
        ``VersionConflict`` instance.
        """

        plugin_projects = list(plugin_env)
        # scan project names in alphabetic order
        plugin_projects.sort()

        error_info = {}
        distributions = {}

        if full_env is None:
            env = Environment(self.entries)
            env += plugin_env
        else:
            env = full_env + plugin_env

        shadow_set = self.__class__([])
        # put all our entries in shadow_set
        list(map(shadow_set.add, self))

        for project_name in plugin_projects:

            for dist in plugin_env[project_name]:

                req = [dist.as_requirement()]

                try:
                    resolvees = shadow_set.resolve(req, env, installer)

                except ResolutionError as v:
                    # save error info
                    error_info[dist] = v
                    if fallback:
                        # try the next older version of project
                        continue
                    else:
                        # give up on this project, keep going
                        break

                else:
                    list(map(shadow_set.add, resolvees))
                    distributions.update(dict.fromkeys(resolvees))

                    # success, no need to try any more versions of this project
                    break

        distributions = list(distributions)
        distributions.sort()

        return distributions, error_info

    def require(self, *requirements):
        """Ensure that distributions matching `requirements` are activated

        `requirements` must be a string or a (possibly-nested) sequence
        thereof, specifying the distributions and versions required.  The
        return value is a sequence of the distributions that needed to be
        activated to fulfill the requirements; all relevant distributions are
        included, even if they were already activated in this working set.
        """
        needed = self.resolve(parse_requirements(requirements))

        for dist in needed:
            self.add(dist)

        return needed

    def subscribe(self, callback, existing=True):
        """Invoke `callback` for all distributions

        If `existing=True` (default), the callback is also invoked immediately
        for every distribution already in the working set.
        """
        if callback in self.callbacks:
            return
        self.callbacks.append(callback)
        if not existing:
            return
        for dist in self:
            callback(dist)

    def _added_new(self, dist):
        for callback in self.callbacks:
            callback(dist)

    def __getstate__(self):
        return (
            self.entries[:], self.entry_keys.copy(), self.by_key.copy(),
            self.callbacks[:]
        )

    def __setstate__(self, e_k_b_c):
        entries, keys, by_key, callbacks = e_k_b_c
        self.entries = entries[:]
        self.entry_keys = keys.copy()
        self.by_key = by_key.copy()
        self.callbacks = callbacks[:]
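

# A minimal illustrative sketch of typical WorkingSet usage.  'examplepkg' is
# hypothetical; require() would raise DistributionNotFound unless a matching
# distribution were actually present on sys.path.
def _working_set_examples():
    ws = WorkingSet()                      # snapshot built from sys.path
    ws.subscribe(lambda dist: None)        # notified of existing + future dists
    activated = ws.require('examplepkg>=1.0')
    names = [ep.name for ep in ws.iter_entry_points('console_scripts')]
    return activated, names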


class _ReqExtras(dict):
    """
    Map each requirement to the extras that demanded it.
    """

    def markers_pass(self, req, extras=None):
        """
        Evaluate markers for req against each extra that
        demanded it.

        Return False if the req has a marker and fails
        evaluation. Otherwise, return True.
        """
        extra_evals = (
            req.marker.evaluate({'extra': extra})
            for extra in self.get(req, ()) + (extras or (None,))
        )
        return not req.marker or any(extra_evals)


class Environment(object):
    """Searchable snapshot of distributions on a search path"""

    def __init__(
            self, search_path=None, platform=get_supported_platform(),
            python=PY_MAJOR):
        """Snapshot distributions available on a search path

        Any distributions found on `search_path` are added to the environment.
        `search_path` should be a sequence of ``sys.path`` items.  If not
        supplied, ``sys.path`` is used.

        `platform` is an optional string specifying the name of the platform
        that platform-specific distributions must be compatible with.  If
        unspecified, it defaults to the current platform.  `python` is an
        optional string naming the desired version of Python (e.g. ``'3.3'``);
        it defaults to the current version.

        You may explicitly set `platform` (and/or `python`) to ``None`` if you
        wish to map *all* distributions, not just those compatible with the
        running platform or Python version.
        """
        self._distmap = {}
        self.platform = platform
        self.python = python
        self.scan(search_path)

    def can_add(self, dist):
        """Is distribution `dist` acceptable for this environment?

        The distribution must match the platform and python version
        requirements specified when this environment was created, or False
        is returned.
        """
        py_compat = (
            self.python is None
            or dist.py_version is None
            or dist.py_version == self.python
        )
        return py_compat and compatible_platforms(dist.platform, self.platform)

    def remove(self, dist):
        """Remove `dist` from the environment"""
        self._distmap[dist.key].remove(dist)

    def scan(self, search_path=None):
        """Scan `search_path` for distributions usable in this environment

        Any distributions found are added to the environment.
        `search_path` should be a sequence of ``sys.path`` items.  If not
        supplied, ``sys.path`` is used.  Only distributions conforming to
        the platform/python version defined at initialization are added.
        """
        if search_path is None:
            search_path = sys.path

        for item in search_path:
            for dist in find_distributions(item):
                self.add(dist)

    def __getitem__(self, project_name):
        """Return a newest-to-oldest list of distributions for `project_name`

        Uses case-insensitive `project_name` comparison, assuming all the
        project's distributions use their project's name converted to all
        lowercase as their key.

        """
        distribution_key = project_name.lower()
        return self._distmap.get(distribution_key, [])

    def add(self, dist):
        """Add `dist` if we ``can_add()`` it and it has not already been added
        """
        if self.can_add(dist) and dist.has_version():
            dists = self._distmap.setdefault(dist.key, [])
            if dist not in dists:
                dists.append(dist)
                dists.sort(key=operator.attrgetter('hashcmp'), reverse=True)

    def best_match(
            self, req, working_set, installer=None, replace_conflicting=False):
        """Find distribution best matching `req` and usable on `working_set`

        This calls the ``find(req)`` method of the `working_set` to see if a
        suitable distribution is already active.  (This may raise
        ``VersionConflict`` if an unsuitable version of the project is already
        active in the specified `working_set`.)  If a suitable distribution
        isn't active, this method returns the newest distribution in the
        environment that meets the ``Requirement`` in `req`.  If no suitable
        distribution is found, and `installer` is supplied, then the result of
        calling the environment's ``obtain(req, installer)`` method will be
        returned.
        """
        try:
            dist = working_set.find(req)
        except VersionConflict:
            if not replace_conflicting:
                raise
            dist = None
        if dist is not None:
            return dist
        for dist in self[req.key]:
            if dist in req:
                return dist
        # try to download/install
        return self.obtain(req, installer)

    def obtain(self, requirement, installer=None):
        """Obtain a distribution matching `requirement` (e.g. via download)

        Obtain a distro that matches requirement (e.g. via download).  In the
        base ``Environment`` class, this routine just returns
        ``installer(requirement)``, unless `installer` is None, in which case
        None is returned instead.  This method is a hook that allows subclasses
        to attempt other ways of obtaining a distribution before falling back
        to the `installer` argument."""
        if installer is not None:
            return installer(requirement)

    def __iter__(self):
        """Yield the unique project names of the available distributions"""
        for key in self._distmap.keys():
            if self[key]:
                yield key

    def __iadd__(self, other):
        """In-place addition of a distribution or environment"""
        if isinstance(other, Distribution):
            self.add(other)
        elif isinstance(other, Environment):
            for project in other:
                for dist in other[project]:
                    self.add(dist)
        else:
            raise TypeError("Can't add %r to environment" % (other,))
        return self

    def __add__(self, other):
        """Add an environment or distribution to an environment"""
        new = self.__class__([], platform=None, python=None)
        for env in self, other:
            new += env
        return new
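

# A minimal illustrative sketch of Environment usage.  'examplepkg' is a
# hypothetical project name; best_match() simply returns None here when no
# suitable distribution is found and no installer is supplied.
def _environment_examples():
    env = Environment()                    # scan sys.path for distributions
    newest = {name: env[name][0] for name in env}  # lists are newest-to-oldest
    req = Requirement.parse('examplepkg>=1.0')
    best = env.best_match(req, WorkingSet([]))
    return newest, best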


# XXX backward compatibility
AvailableDistributions = Environment


class ExtractionError(RuntimeError):
    """An error occurred extracting a resource

    The following attributes are available from instances of this exception:

    manager
        The resource manager that raised this exception

    cache_path
        The base directory for resource extraction

    original_error
        The exception instance that caused extraction to fail
    """


class ResourceManager:
    """Manage resource extraction and packages"""
    extraction_path = None

    def __init__(self):
        self.cached_files = {}

    def resource_exists(self, package_or_requirement, resource_name):
        """Does the named resource exist?"""
        return get_provider(package_or_requirement).has_resource(resource_name)

    def resource_isdir(self, package_or_requirement, resource_name):
        """Is the named resource an existing directory?"""
        return get_provider(package_or_requirement).resource_isdir(
            resource_name
        )

    def resource_filename(self, package_or_requirement, resource_name):
        """Return a true filesystem path for specified resource"""
        return get_provider(package_or_requirement).get_resource_filename(
            self, resource_name
        )

    def resource_stream(self, package_or_requirement, resource_name):
        """Return a readable file-like object for specified resource"""
        return get_provider(package_or_requirement).get_resource_stream(
            self, resource_name
        )

    def resource_string(self, package_or_requirement, resource_name):
        """Return specified resource as a string"""
        return get_provider(package_or_requirement).get_resource_string(
            self, resource_name
        )

    def resource_listdir(self, package_or_requirement, resource_name):
        """List the contents of the named resource directory"""
        return get_provider(package_or_requirement).resource_listdir(
            resource_name
        )

    def extraction_error(self):
        """Give an error message for problems extracting file(s)"""

        old_exc = sys.exc_info()[1]
        cache_path = self.extraction_path or get_default_cache()

        tmpl = textwrap.dedent("""
            Can't extract file(s) to egg cache

            The following error occurred while trying to extract file(s)
            to the Python egg cache:

              {old_exc}

            The Python egg cache directory is currently set to:

              {cache_path}

            Perhaps your account does not have write access to this directory?
            You can change the cache directory by setting the PYTHON_EGG_CACHE
            environment variable to point to an accessible directory.
            """).lstrip()
        err = ExtractionError(tmpl.format(**locals()))
        err.manager = self
        err.cache_path = cache_path
        err.original_error = old_exc
        raise err

    def get_cache_path(self, archive_name, names=()):
        """Return absolute location in cache for `archive_name` and `names`

        The parent directory of the resulting path will be created if it does
        not already exist.  `archive_name` should be the base filename of the
        enclosing egg (which may not be the name of the enclosing zipfile!),
        including its ".egg" extension.  `names`, if provided, should be a
        sequence of path name parts "under" the egg's extraction location.

        This method should only be called by resource providers that need to
        obtain an extraction location, and only for names they intend to
        extract, as it tracks the generated names for possible cleanup later.
        """
        extract_path = self.extraction_path or get_default_cache()
        target_path = os.path.join(extract_path, archive_name + '-tmp', *names)
        try:
            _bypass_ensure_directory(target_path)
        except Exception:
            self.extraction_error()

        self._warn_unsafe_extraction_path(extract_path)

        self.cached_files[target_path] = 1
        return target_path

    @staticmethod
    def _warn_unsafe_extraction_path(path):
        """
        If the default extraction path is overridden and set to an insecure
        location, such as /tmp, it opens up an opportunity for an attacker to
        replace an extracted file with an unauthorized payload. Warn the user
        if a known insecure location is used.

        See Distribute #375 for more details.
        """
        if os.name == 'nt' and not path.startswith(os.environ['windir']):
            # On Windows, permissions are generally restrictive by default
            #  and temp directories are not writable by other users, so
            #  bypass the warning.
            return
        mode = os.stat(path).st_mode
        if mode & stat.S_IWOTH or mode & stat.S_IWGRP:
            msg = (
                "%s is writable by group/others and vulnerable to attack "
                "when "
                "used with get_resource_filename. Consider a more secure "
                "location (set with .set_extraction_path or the "
                "PYTHON_EGG_CACHE environment variable)." % path
            )
            warnings.warn(msg, UserWarning)

    def postprocess(self, tempname, filename):
        """Perform any platform-specific postprocessing of `tempname`

        This is where Mac header rewrites should be done; other platforms don't
        have anything special they should do.

        Resource providers should call this method ONLY after successfully
        extracting a compressed resource.  They must NOT call it on resources
        that are already in the filesystem.

        `tempname` is the current (temporary) name of the file, and `filename`
        is the name it will be renamed to by the caller after this routine
        returns.
        """

        if os.name == 'posix':
            # Make the resource executable
            mode = ((os.stat(tempname).st_mode) | 0o555) & 0o7777
            os.chmod(tempname, mode)

    def set_extraction_path(self, path):
        """Set the base path where resources will be extracted to, if needed.

        If you do not call this routine before any extractions take place, the
        path defaults to the return value of ``get_default_cache()``.  (Which
        is based on the ``PYTHON_EGG_CACHE`` environment variable, with various
        platform-specific fallbacks.  See that routine's documentation for more
        details.)

        Resources are extracted to subdirectories of this path based upon
        information given by the ``IResourceProvider``.  You may set this to a
        temporary directory, but then you must call ``cleanup_resources()`` to
        delete the extracted files when done.  There is no guarantee that
        ``cleanup_resources()`` will be able to remove all extracted files.

        (Note: you may not change the extraction path for a given resource
        manager once resources have been extracted, unless you first call
        ``cleanup_resources()``.)
        """
        if self.cached_files:
            raise ValueError(
                "Can't change extraction path, files already extracted"
            )

        self.extraction_path = path

    def cleanup_resources(self, force=False):
        """
        Delete all extracted resource files and directories, returning a list
        of the file and directory names that could not be successfully removed.
        This function does not have any concurrency protection, so it should
        generally only be called when the extraction path is a temporary
        directory exclusive to a single process.  This method is not
        automatically called; you must call it explicitly or register it as an
        ``atexit`` function if you wish to ensure cleanup of a temporary
        directory used for extractions.
        """
        # XXX
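

# A minimal illustrative sketch of direct ResourceManager usage; the package
# name 'examplepkg' and the resource path 'data/config.ini' are hypothetical.
def _resource_manager_example():
    manager = ResourceManager()
    if manager.resource_exists('examplepkg', 'data/config.ini'):
        data = manager.resource_string('examplepkg', 'data/config.ini')  # bytes
        path = manager.resource_filename('examplepkg', 'data/config.ini')
        return data, path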


def get_default_cache():
    """
    Return the ``PYTHON_EGG_CACHE`` environment variable
    or a platform-relevant user cache dir for an app
    named "Python-Eggs".
    """
    return (
        os.environ.get('PYTHON_EGG_CACHE')
        or appdirs.user_cache_dir(appname='Python-Eggs')
    )


def safe_name(name):
    """Convert an arbitrary string to a standard distribution name

    Any runs of characters other than letters, digits, or '.' are replaced
    with a single '-'.
    """
    return re.sub('[^A-Za-z0-9.]+', '-', name)


def safe_version(version):
    """
    Convert an arbitrary string to a standard version string
    """
    try:
        # normalize the version
        return str(packaging.version.Version(version))
    except packaging.version.InvalidVersion:
        version = version.replace(' ', '.')
        return re.sub('[^A-Za-z0-9.]+', '-', version)


def safe_extra(extra):
    """Convert an arbitrary string to a standard 'extra' name

    Any runs of characters other than letters, digits, '.' or '-' are replaced
    with a single '_', and the result is always lowercased.
    """
    return re.sub('[^A-Za-z0-9.-]+', '_', extra).lower()


def to_filename(name):
    """Convert a project or version name to its filename-escaped form

    Any '-' characters are currently replaced with '_'.
    """
    return name.replace('-', '_')
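

# A minimal illustrative sketch of the name/version normalization helpers
# above; the inputs are arbitrary example strings.
def _name_normalization_examples():
    assert safe_name('My Example_Package') == 'My-Example-Package'
    assert safe_version('1.0 beta 2') == '1.0.beta.2'
    assert safe_extra('Extra Feature!') == 'extra_feature_'
    assert to_filename('My-Example-Package') == 'My_Example_Package'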


def invalid_marker(text):
    """
    Validate text as a PEP 508 environment marker; return the resulting
    SyntaxError if the marker is invalid, or False otherwise.
    """
    try:
        evaluate_marker(text)
    except SyntaxError as e:
        e.filename = None
        e.lineno = None
        return e
    return False


def evaluate_marker(text, extra=None):
    """
    Evaluate a PEP 508 environment marker.
    Return a boolean indicating the marker result in this environment.
    Raise SyntaxError if marker is invalid.

    This implementation delegates to the 'packaging.markers' module.
    """
    try:
        marker = packaging.markers.Marker(text)
        return marker.evaluate()
    except packaging.markers.InvalidMarker as e:
        raise SyntaxError(e)
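

# A minimal illustrative sketch of marker validation/evaluation using the two
# helpers above; the marker strings are arbitrary examples.
def _marker_examples():
    result = evaluate_marker('python_version >= "2.7"')
    assert result in (True, False)
    # invalid_marker() returns the SyntaxError rather than raising it
    assert invalid_marker('this is not a marker')
    assert invalid_marker('os_name == "posix"') is False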


class NullProvider:
    """Try to implement resources and metadata for arbitrary PEP 302 loaders"""

    egg_name = None
    egg_info = None
    loader = None

    def __init__(self, module):
        self.loader = getattr(module, '__loader__', None)
        self.module_path = os.path.dirname(getattr(module, '__file__', ''))

    def get_resource_filename(self, manager, resource_name):
        return self._fn(self.module_path, resource_name)

    def get_resource_stream(self, manager, resource_name):
        return io.BytesIO(self.get_resource_string(manager, resource_name))

    def get_resource_string(self, manager, resource_name):
        return self._get(self._fn(self.module_path, resource_name))

    def has_resource(self, resource_name):
        return self._has(self._fn(self.module_path, resource_name))

    def has_metadata(self, name):
        return self.egg_info and self._has(self._fn(self.egg_info, name))

    def get_metadata(self, name):
        if not self.egg_info:
            return ""
        value = self._get(self._fn(self.egg_info, name))
        return value.decode('utf-8') if six.PY3 else value

    def get_metadata_lines(self, name):
        return yield_lines(self.get_metadata(name))

    def resource_isdir(self, resource_name):
        return self._isdir(self._fn(self.module_path, resource_name))

    def metadata_isdir(self, name):
        return self.egg_info and self._isdir(self._fn(self.egg_info, name))

    def resource_listdir(self, resource_name):
        return self._listdir(self._fn(self.module_path, resource_name))

    def metadata_listdir(self, name):
        if self.egg_info:
            return self._listdir(self._fn(self.egg_info, name))
        return []

    def run_script(self, script_name, namespace):
        script = 'scripts/' + script_name
        if not self.has_metadata(script):
            raise ResolutionError(
                "Script {script!r} not found in metadata at {self.egg_info!r}"
                .format(**locals()),
            )
        script_text = self.get_metadata(script).replace('\r\n', '\n')
        script_text = script_text.replace('\r', '\n')
        script_filename = self._fn(self.egg_info, script)
        namespace['__file__'] = script_filename
        if os.path.exists(script_filename):
            source = open(script_filename).read()
            code = compile(source, script_filename, 'exec')
            exec(code, namespace, namespace)
        else:
            from linecache import cache
            cache[script_filename] = (
                len(script_text), 0, script_text.split('\n'), script_filename
            )
            script_code = compile(script_text, script_filename, 'exec')
            exec(script_code, namespace, namespace)

    def _has(self, path):
        raise NotImplementedError(
            "Can't perform this operation for unregistered loader type"
        )

    def _isdir(self, path):
        raise NotImplementedError(
            "Can't perform this operation for unregistered loader type"
        )

    def _listdir(self, path):
        raise NotImplementedError(
            "Can't perform this operation for unregistered loader type"
        )

    def _fn(self, base, resource_name):
        if resource_name:
            return os.path.join(base, *resource_name.split('/'))
        return base

    def _get(self, path):
        if hasattr(self.loader, 'get_data'):
            return self.loader.get_data(path)
        raise NotImplementedError(
            "Can't perform this operation for loaders without 'get_data()'"
        )


register_loader_type(object, NullProvider)


class EggProvider(NullProvider):
    """Provider based on a virtual filesystem"""

    def __init__(self, module):
        NullProvider.__init__(self, module)
        self._setup_prefix()

    def _setup_prefix(self):
        # we assume here that our metadata may be nested inside a "basket"
        # of multiple eggs; that's why we use module_path instead of .archive
        path = self.module_path
        old = None
        while path != old:
            if _is_egg_path(path):
                self.egg_name = os.path.basename(path)
                self.egg_info = os.path.join(path, 'EGG-INFO')
                self.egg_root = path
                break
            old = path
            path, base = os.path.split(path)


class DefaultProvider(EggProvider):
    """Provides access to package resources in the filesystem"""

    def _has(self, path):
        return os.path.exists(path)

    def _isdir(self, path):
        return os.path.isdir(path)

    def _listdir(self, path):
        return os.listdir(path)

    def get_resource_stream(self, manager, resource_name):
        return open(self._fn(self.module_path, resource_name), 'rb')

    def _get(self, path):
        with open(path, 'rb') as stream:
            return stream.read()

    @classmethod
    def _register(cls):
        loader_names = 'SourceFileLoader', 'SourcelessFileLoader',
        for name in loader_names:
            loader_cls = getattr(importlib_machinery, name, type(None))
            register_loader_type(loader_cls, cls)


DefaultProvider._register()


class EmptyProvider(NullProvider):
    """Provider that returns nothing for all requests"""

    module_path = None

    _isdir = _has = lambda self, path: False

    def _get(self, path):
        return ''

    def _listdir(self, path):
        return []

    def __init__(self):
        pass


empty_provider = EmptyProvider()


class ZipManifests(dict):
    """
    zip manifest builder
    """

    @classmethod
    def build(cls, path):
        """
        Build a dictionary similar to the zipimport directory
        caches, except instead of tuples, store ZipInfo objects.

        Use a platform-specific path separator (os.sep) for the path keys
        for compatibility with pypy on Windows.
        """
        with zipfile.ZipFile(path) as zfile:
            items = (
                (
                    name.replace('/', os.sep),
                    zfile.getinfo(name),
                )
                for name in zfile.namelist()
            )
            return dict(items)

    load = build


class MemoizedZipManifests(ZipManifests):
    """
    Memoized zipfile manifests.
    """
    manifest_mod = collections.namedtuple('manifest_mod', 'manifest mtime')

    def load(self, path):
        """
        Load a manifest at path or return a suitable manifest already loaded.
        """
        path = os.path.normpath(path)
        mtime = os.stat(path).st_mtime

        if path not in self or self[path].mtime != mtime:
            manifest = self.build(path)
            self[path] = self.manifest_mod(manifest, mtime)

        return self[path].manifest


class ZipProvider(EggProvider):
    """Resource support for zips and eggs"""

    eagers = None
    _zip_manifests = MemoizedZipManifests()

    def __init__(self, module):
        EggProvider.__init__(self, module)
        self.zip_pre = self.loader.archive + os.sep

    def _zipinfo_name(self, fspath):
        # Convert a virtual filename (full path to file) into a zipfile subpath
        # usable with the zipimport directory cache for our target archive
        fspath = fspath.rstrip(os.sep)
        if fspath == self.loader.archive:
            return ''
        if fspath.startswith(self.zip_pre):
            return fspath[len(self.zip_pre):]
        raise AssertionError(
            "%s is not a subpath of %s" % (fspath, self.zip_pre)
        )

    def _parts(self, zip_path):
        # Convert a zipfile subpath into an egg-relative path part list.
        # pseudo-fs path
        fspath = self.zip_pre + zip_path
        if fspath.startswith(self.egg_root + os.sep):
            return fspath[len(self.egg_root) + 1:].split(os.sep)
        raise AssertionError(
            "%s is not a subpath of %s" % (fspath, self.egg_root)
        )

    @property
    def zipinfo(self):
        return self._zip_manifests.load(self.loader.archive)

    def get_resource_filename(self, manager, resource_name):
        if not self.egg_name:
            raise NotImplementedError(
                "resource_filename() only supported for .egg, not .zip"
            )
        # no need to lock for extraction, since we use temp names
        zip_path = self._resource_to_zip(resource_name)
        eagers = self._get_eager_resources()
        if '/'.join(self._parts(zip_path)) in eagers:
            for name in eagers:
                self._extract_resource(manager, self._eager_to_zip(name))
        return self._extract_resource(manager, zip_path)

    @staticmethod
    def _get_date_and_size(zip_stat):
        size = zip_stat.file_size
        # ymdhms+wday, yday, dst
        date_time = zip_stat.date_time + (0, 0, -1)
        # 1980 offset already done
        timestamp = time.mktime(date_time)
        return timestamp, size

    def _extract_resource(self, manager, zip_path):

        if zip_path in self._index():
            for name in self._index()[zip_path]:
                last = self._extract_resource(
                    manager, os.path.join(zip_path, name)
                )
            # return the extracted directory name
            return os.path.dirname(last)

        timestamp, size = self._get_date_and_size(self.zipinfo[zip_path])

        if not WRITE_SUPPORT:
            raise IOError('"os.rename" and "os.unlink" are not supported '
                          'on this platform')
        try:

            real_path = manager.get_cache_path(
                self.egg_name, self._parts(zip_path)
            )

            if self._is_current(real_path, zip_path):
                return real_path

            outf, tmpnam = _mkstemp(
                ".$extract",
                dir=os.path.dirname(real_path),
            )
            os.write(outf, self.loader.get_data(zip_path))
            os.close(outf)
            utime(tmpnam, (timestamp, timestamp))
            manager.postprocess(tmpnam, real_path)

            try:
                rename(tmpnam, real_path)

            except os.error:
                if os.path.isfile(real_path):
                    if self._is_current(real_path, zip_path):
                        # the file became current since it was checked above,
                        #  so proceed.
                        return real_path
                    # Windows, del old file and retry
                    elif os.name == 'nt':
                        unlink(real_path)
                        rename(tmpnam, real_path)
                        return real_path
                raise

        except os.error:
            # report a user-friendly error
            manager.extraction_error()

        return real_path

    def _is_current(self, file_path, zip_path):
        """
        Return True if the file_path is current for this zip_path
        """
        timestamp, size = self._get_date_and_size(self.zipinfo[zip_path])
        if not os.path.isfile(file_path):
            return False
        stat = os.stat(file_path)
        if stat.st_size != size or stat.st_mtime != timestamp:
            return False
        # check that the contents match
        zip_contents = self.loader.get_data(zip_path)
        with open(file_path, 'rb') as f:
            file_contents = f.read()
        return zip_contents == file_contents

    def _get_eager_resources(self):
        if self.eagers is None:
            eagers = []
            for name in ('native_libs.txt', 'eager_resources.txt'):
                if self.has_metadata(name):
                    eagers.extend(self.get_metadata_lines(name))
            self.eagers = eagers
        return self.eagers

    def _index(self):
        try:
            return self._dirindex
        except AttributeError:
            ind = {}
            for path in self.zipinfo:
                parts = path.split(os.sep)
                while parts:
                    parent = os.sep.join(parts[:-1])
                    if parent in ind:
                        ind[parent].append(parts[-1])
                        break
                    else:
                        ind[parent] = [parts.pop()]
            self._dirindex = ind
            return ind

    def _has(self, fspath):
        zip_path = self._zipinfo_name(fspath)
        return zip_path in self.zipinfo or zip_path in self._index()

    def _isdir(self, fspath):
        return self._zipinfo_name(fspath) in self._index()

    def _listdir(self, fspath):
        return list(self._index().get(self._zipinfo_name(fspath), ()))

    def _eager_to_zip(self, resource_name):
        return self._zipinfo_name(self._fn(self.egg_root, resource_name))

    def _resource_to_zip(self, resource_name):
        return self._zipinfo_name(self._fn(self.module_path, resource_name))


register_loader_type(zipimport.zipimporter, ZipProvider)


class FileMetadata(EmptyProvider):
    """Metadata handler for standalone PKG-INFO files

    Usage::

        metadata = FileMetadata("/path/to/PKG-INFO")

    This provider rejects all data and metadata requests except for PKG-INFO,
    which is treated as existing, and will be the contents of the file at
    the provided location.
    """

    def __init__(self, path):
        self.path = path

    def has_metadata(self, name):
        return name == 'PKG-INFO' and os.path.isfile(self.path)

    def get_metadata(self, name):
        if name != 'PKG-INFO':
            raise KeyError("No metadata except PKG-INFO is available")

        with io.open(self.path, encoding='utf-8', errors="replace") as f:
            metadata = f.read()
        self._warn_on_replacement(metadata)
        return metadata

    def _warn_on_replacement(self, metadata):
        # Python 2.7 compat for: replacement_char = '�'
        replacement_char = b'\xef\xbf\xbd'.decode('utf-8')
        if replacement_char in metadata:
            tmpl = "{self.path} could not be properly decoded in UTF-8"
            msg = tmpl.format(**locals())
            warnings.warn(msg)

    def get_metadata_lines(self, name):
        return yield_lines(self.get_metadata(name))


class PathMetadata(DefaultProvider):
    """Metadata provider for egg directories

    Usage::

        # Development eggs:

        egg_info = "/path/to/PackageName.egg-info"
        base_dir = os.path.dirname(egg_info)
        metadata = PathMetadata(base_dir, egg_info)
        dist_name = os.path.splitext(os.path.basename(egg_info))[0]
        dist = Distribution(basedir, project_name=dist_name, metadata=metadata)

        # Unpacked egg directories:

        egg_path = "/path/to/PackageName-ver-pyver-etc.egg"
        metadata = PathMetadata(egg_path, os.path.join(egg_path,'EGG-INFO'))
        dist = Distribution.from_filename(egg_path, metadata=metadata)
    """

    def __init__(self, path, egg_info):
        self.module_path = path
        self.egg_info = egg_info


class EggMetadata(ZipProvider):
    """Metadata provider for .egg files"""

    def __init__(self, importer):
        """Create a metadata provider from a zipimporter"""

        self.zip_pre = importer.archive + os.sep
        self.loader = importer
        if importer.prefix:
            self.module_path = os.path.join(importer.archive, importer.prefix)
        else:
            self.module_path = importer.archive
        self._setup_prefix()


_declare_state('dict', _distribution_finders={})


def register_finder(importer_type, distribution_finder):
    """Register `distribution_finder` to find distributions in sys.path items

    `importer_type` is the type or class of a PEP 302 "Importer" (sys.path item
    handler), and `distribution_finder` is a callable that, passed a path
    item and the importer instance, yields ``Distribution`` instances found on
    that path item.  See ``pkg_resources.find_on_path`` for an example."""
    _distribution_finders[importer_type] = distribution_finder
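

# A minimal illustrative sketch of registering a finder for a hypothetical
# importer type; the real registrations for zipimporter, ImpImporter and
# FileFinder appear elsewhere in this module.
def _register_finder_example():
    class _ExampleImporter(object):
        """Hypothetical PEP 302 importer type."""

    def _find_no_dists(importer, path_item, only=False):
        # a finder yields Distribution objects found on the path item
        return iter(())

    register_finder(_ExampleImporter, _find_no_dists)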


def find_distributions(path_item, only=False):
    """Yield distributions accessible via `path_item`"""
    importer = get_importer(path_item)
    finder = _find_adapter(_distribution_finders, importer)
    return finder(importer, path_item, only)
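

# A minimal illustrative sketch: enumerating the distributions visible on the
# current sys.path via the finder machinery above.
def _find_distributions_example():
    names = []
    for entry in sys.path:
        for dist in find_distributions(entry):
            names.append(dist.project_name)
    return sorted(set(names))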


def find_eggs_in_zip(importer, path_item, only=False):
    """
    Find eggs in zip files; possibly multiple nested eggs.
    """
    if importer.archive.endswith('.whl'):
        # wheels are not supported with this finder
        # they don't have PKG-INFO metadata, and won't ever contain eggs
        return
    metadata = EggMetadata(importer)
    if metadata.has_metadata('PKG-INFO'):
        yield Distribution.from_filename(path_item, metadata=metadata)
    if only:
        # don't yield nested distros
        return
    for subitem in metadata.resource_listdir('/'):
        if _is_egg_path(subitem):
            subpath = os.path.join(path_item, subitem)
            dists = find_eggs_in_zip(zipimport.zipimporter(subpath), subpath)
            for dist in dists:
                yield dist
        elif subitem.lower().endswith('.dist-info'):
            subpath = os.path.join(path_item, subitem)
            submeta = EggMetadata(zipimport.zipimporter(subpath))
            submeta.egg_info = subpath
            yield Distribution.from_location(path_item, subitem, submeta)


register_finder(zipimport.zipimporter, find_eggs_in_zip)


def find_nothing(importer, path_item, only=False):
    return ()


register_finder(object, find_nothing)


def _by_version_descending(names):
    """
    Given a list of filenames, return them in descending order
    by version number.

    >>> names = 'bar', 'foo', 'Python-2.7.10.egg', 'Python-2.7.2.egg'
    >>> _by_version_descending(names)
    ['Python-2.7.10.egg', 'Python-2.7.2.egg', 'foo', 'bar']
    >>> names = 'Setuptools-1.2.3b1.egg', 'Setuptools-1.2.3.egg'
    >>> _by_version_descending(names)
    ['Setuptools-1.2.3.egg', 'Setuptools-1.2.3b1.egg']
    >>> names = 'Setuptools-1.2.3b1.egg', 'Setuptools-1.2.3.post1.egg'
    >>> _by_version_descending(names)
    ['Setuptools-1.2.3.post1.egg', 'Setuptools-1.2.3b1.egg']
    """
    def _by_version(name):
        """
        Parse each component of the filename
        """
        name, ext = os.path.splitext(name)
        parts = itertools.chain(name.split('-'), [ext])
        return [packaging.version.parse(part) for part in parts]

    return sorted(names, key=_by_version, reverse=True)


def find_on_path(importer, path_item, only=False):
    """Yield distributions accessible on a sys.path directory"""
    path_item = _normalize_cached(path_item)

    if _is_unpacked_egg(path_item):
        yield Distribution.from_filename(
            path_item, metadata=PathMetadata(
                path_item, os.path.join(path_item, 'EGG-INFO')
            )
        )
        return

    entries = safe_listdir(path_item)

    # for performance, before sorting by version,
    # screen entries for only those that will yield
    # distributions
    filtered = (
        entry
        for entry in entries
        if dist_factory(path_item, entry, only)
    )

    # scan for .egg and .egg-info in directory
    path_item_entries = _by_version_descending(filtered)
    for entry in path_item_entries:
        fullpath = os.path.join(path_item, entry)
        factory = dist_factory(path_item, entry, only)
        for dist in factory(fullpath):
            yield dist


def dist_factory(path_item, entry, only):
    """
    Return a dist_factory for a path_item and entry
    """
    lower = entry.lower()
    is_meta = any(map(lower.endswith, ('.egg-info', '.dist-info')))
    return (
        distributions_from_metadata
        if is_meta else
        find_distributions
        if not only and _is_egg_path(entry) else
        resolve_egg_link
        if not only and lower.endswith('.egg-link') else
        NoDists()
    )


class NoDists:
    """
    >>> bool(NoDists())
    False

    >>> list(NoDists()('anything'))
    []
    """
    def __bool__(self):
        return False
    if six.PY2:
        __nonzero__ = __bool__

    def __call__(self, fullpath):
        return iter(())


def safe_listdir(path):
    """
    Attempt to list contents of path, but suppress some exceptions.
    """
    try:
        return os.listdir(path)
    except (PermissionError, NotADirectoryError):
        pass
    except OSError as e:
        # Ignore the entry if it does not exist, is not a directory, or
        # permission is denied
        ignorable = (
            e.errno in (errno.ENOTDIR, errno.EACCES, errno.ENOENT)
            # Python 2 on Windows needs to be handled this way :(
            or getattr(e, "winerror", None) == 267
        )
        if not ignorable:
            raise
    return ()


def distributions_from_metadata(path):
    root = os.path.dirname(path)
    if os.path.isdir(path):
        if len(os.listdir(path)) == 0:
            # empty metadata dir; skip
            return
        metadata = PathMetadata(root, path)
    else:
        metadata = FileMetadata(path)
    entry = os.path.basename(path)
    yield Distribution.from_location(
        root, entry, metadata, precedence=DEVELOP_DIST,
    )


def non_empty_lines(path):
    """
    Yield non-empty lines from file at path
    """
    with open(path) as f:
        for line in f:
            line = line.strip()
            if line:
                yield line


def resolve_egg_link(path):
    """
    Given a path to an .egg-link, resolve distributions
    present in the referenced path.
    """
    referenced_paths = non_empty_lines(path)
    resolved_paths = (
        os.path.join(os.path.dirname(path), ref)
        for ref in referenced_paths
    )
    dist_groups = map(find_distributions, resolved_paths)
    return next(dist_groups, ())


register_finder(pkgutil.ImpImporter, find_on_path)

if hasattr(importlib_machinery, 'FileFinder'):
    register_finder(importlib_machinery.FileFinder, find_on_path)

_declare_state('dict', _namespace_handlers={})
_declare_state('dict', _namespace_packages={})


def register_namespace_handler(importer_type, namespace_handler):
    """Register `namespace_handler` to declare namespace packages

    `importer_type` is the type or class of a PEP 302 "Importer" (sys.path item
    handler), and `namespace_handler` is a callable like this::

        def namespace_handler(importer, path_entry, moduleName, module):
            # return a path_entry to use for child packages

    Namespace handlers are only called if the importer object has already
    agreed that it can handle the relevant path item, and they should only
    return a subpath if the module __path__ does not already contain an
    equivalent subpath.  For an example namespace handler, see
    ``pkg_resources.file_ns_handler``.
    """
    _namespace_handlers[importer_type] = namespace_handler


def _handle_ns(packageName, path_item):
    """Ensure that named package includes a subpath of path_item (if needed)"""

    importer = get_importer(path_item)
    if importer is None:
        return None
    loader = importer.find_module(packageName)
    if loader is None:
        return None
    module = sys.modules.get(packageName)
    if module is None:
        module = sys.modules[packageName] = types.ModuleType(packageName)
        module.__path__ = []
        _set_parent_ns(packageName)
    elif not hasattr(module, '__path__'):
        raise TypeError("Not a package:", packageName)
    handler = _find_adapter(_namespace_handlers, importer)
    subpath = handler(importer, path_item, packageName, module)
    if subpath is not None:
        path = module.__path__
        path.append(subpath)
        loader.load_module(packageName)
        _rebuild_mod_path(path, packageName, module)
    return subpath


def _rebuild_mod_path(orig_path, package_name, module):
    """
    Rebuild module.__path__ ensuring that all entries are ordered
    corresponding to their sys.path order
    """
    sys_path = [_normalize_cached(p) for p in sys.path]

    def safe_sys_path_index(entry):
        """
        Workaround for #520 and #513.
        """
        try:
            return sys_path.index(entry)
        except ValueError:
            return float('inf')

    def position_in_sys_path(path):
        """
        Return the ordinal of the path based on its position in sys.path
        """
        path_parts = path.split(os.sep)
        module_parts = package_name.count('.') + 1
        parts = path_parts[:-module_parts]
        return safe_sys_path_index(_normalize_cached(os.sep.join(parts)))

    if not isinstance(orig_path, list):
        # Is this behavior useful when module.__path__ is not a list?
        return

    orig_path.sort(key=position_in_sys_path)
    module.__path__[:] = [_normalize_cached(p) for p in orig_path]


def declare_namespace(packageName):
    """Declare that package 'packageName' is a namespace package"""

    _imp.acquire_lock()
    try:
        if packageName in _namespace_packages:
            return

        path, parent = sys.path, None
        if '.' in packageName:
            parent = '.'.join(packageName.split('.')[:-1])
            declare_namespace(parent)
            if parent not in _namespace_packages:
                __import__(parent)
            try:
                path = sys.modules[parent].__path__
            except AttributeError:
                raise TypeError("Not a package:", parent)

        # Track what packages are namespaces, so when new path items are added,
        # they can be updated
        _namespace_packages.setdefault(parent, []).append(packageName)
        _namespace_packages.setdefault(packageName, [])

        for path_item in path:
            # Ensure all the parent's path items are reflected in the child,
            # if they apply
            _handle_ns(packageName, path_item)

    finally:
        _imp.release_lock()
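
# Typical usage (editor's note, not part of the original module): the
# ``__init__.py`` of an old-style namespace package traditionally contains::
#
#     __import__('pkg_resources').declare_namespace(__name__)
#
# so that every copy of the package directory found on sys.path contributes
# to the same package ``__path__``.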


def fixup_namespace_packages(path_item, parent=None):
    """Ensure that previously-declared namespace packages include path_item"""
    _imp.acquire_lock()
    try:
        for package in _namespace_packages.get(parent, ()):
            subpath = _handle_ns(package, path_item)
            if subpath:
                fixup_namespace_packages(subpath, package)
    finally:
        _imp.release_lock()


def file_ns_handler(importer, path_item, packageName, module):
    """Compute an ns-package subpath for a filesystem or zipfile importer"""

    subpath = os.path.join(path_item, packageName.split('.')[-1])
    normalized = _normalize_cached(subpath)
    for item in module.__path__:
        if _normalize_cached(item) == normalized:
            break
    else:
        # Only return the path if it's not already there
        return subpath


register_namespace_handler(pkgutil.ImpImporter, file_ns_handler)
register_namespace_handler(zipimport.zipimporter, file_ns_handler)

if hasattr(importlib_machinery, 'FileFinder'):
    register_namespace_handler(importlib_machinery.FileFinder, file_ns_handler)


def null_ns_handler(importer, path_item, packageName, module):
    return None


register_namespace_handler(object, null_ns_handler)


def normalize_path(filename):
    """Normalize a file/dir name for comparison purposes"""
    return os.path.normcase(os.path.realpath(filename))


def _normalize_cached(filename, _cache={}):
    try:
        return _cache[filename]
    except KeyError:
        _cache[filename] = result = normalize_path(filename)
        return result


def _is_egg_path(path):
    """
    Determine if given path appears to be an egg.
    """
    return path.lower().endswith('.egg')


def _is_unpacked_egg(path):
    """
    Determine if given path appears to be an unpacked egg.
    """
    return (
        _is_egg_path(path) and
        os.path.isfile(os.path.join(path, 'EGG-INFO', 'PKG-INFO'))
    )


def _set_parent_ns(packageName):
    parts = packageName.split('.')
    name = parts.pop()
    if parts:
        parent = '.'.join(parts)
        setattr(sys.modules[parent], name, sys.modules[packageName])


def yield_lines(strs):
    """Yield non-empty/non-comment lines of a string or sequence"""
    if isinstance(strs, six.string_types):
        for s in strs.splitlines():
            s = s.strip()
            # skip blank lines/comments
            if s and not s.startswith('#'):
                yield s
    else:
        for ss in strs:
            for s in yield_lines(ss):
                yield s
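

# Illustrative example (editor's sketch, not part of the original module)::
#
#     >>> list(yield_lines("a\n# comment\n\n  b  "))
#     ['a', 'b']
#
# Nested sequences of strings are flattened the same way.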


MODULE = re.compile(r"\w+(\.\w+)*$").match
EGG_NAME = re.compile(
    r"""
    (?P<name>[^-]+) (
        -(?P<ver>[^-]+) (
            -py(?P<pyver>[^-]+) (
                -(?P<plat>.+)
            )?
        )?
    )?
    """,
    re.VERBOSE | re.IGNORECASE,
).match
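

# Illustrative example (editor's note, not part of the original module):
# EGG_NAME splits a basename such as ``FooPkg-1.2-py3.6-linux_x86_64`` into the
# groups name='FooPkg', ver='1.2', pyver='3.6', plat='linux_x86_64'.  The
# trailing parts are optional, so a plain ``FooPkg`` matches with only the
# name group set.  ``FooPkg`` is a made-up project name.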


class EntryPoint(object):
    """Object representing an advertised importable object"""

    def __init__(self, name, module_name, attrs=(), extras=(), dist=None):
        if not MODULE(module_name):
            raise ValueError("Invalid module name", module_name)
        self.name = name
        self.module_name = module_name
        self.attrs = tuple(attrs)
        self.extras = tuple(extras)
        self.dist = dist

    def __str__(self):
        s = "%s = %s" % (self.name, self.module_name)
        if self.attrs:
            s += ':' + '.'.join(self.attrs)
        if self.extras:
            s += ' [%s]' % ','.join(self.extras)
        return s

    def __repr__(self):
        return "EntryPoint.parse(%r)" % str(self)

    def load(self, require=True, *args, **kwargs):
        """
        Require packages for this EntryPoint, then resolve it.
        """
        if not require or args or kwargs:
            warnings.warn(
                "Parameters to load are deprecated.  Call .resolve and "
                ".require separately.",
                DeprecationWarning,
                stacklevel=2,
            )
        if require:
            self.require(*args, **kwargs)
        return self.resolve()

    def resolve(self):
        """
        Resolve the entry point from its module and attrs.
        """
        module = __import__(self.module_name, fromlist=['__name__'], level=0)
        try:
            return functools.reduce(getattr, self.attrs, module)
        except AttributeError as exc:
            raise ImportError(str(exc))

    def require(self, env=None, installer=None):
        if self.extras and not self.dist:
            raise UnknownExtra("Can't require() without a distribution", self)

        # Get the requirements for this entry point with all its extras and
        # then resolve them. We have to pass `extras` along when resolving so
        # that the working set knows what extras we want. Otherwise, for
        # dist-info distributions, the working set will assume that the
        # requirements for that extra are purely optional and skip over them.
        reqs = self.dist.requires(self.extras)
        items = working_set.resolve(reqs, env, installer, extras=self.extras)
        list(map(working_set.add, items))

    pattern = re.compile(
        r'\s*'
        r'(?P<name>.+?)\s*'
        r'=\s*'
        r'(?P<module>[\w.]+)\s*'
        r'(:\s*(?P<attr>[\w.]+))?\s*'
        r'(?P<extras>\[.*\])?\s*$'
    )

    @classmethod
    def parse(cls, src, dist=None):
        """Parse a single entry point from string `src`

        Entry point syntax follows the form::

            name = some.module:some.attr [extra1, extra2]

        The entry name and module name are required, but the ``:attrs`` and
        ``[extras]`` parts are optional
        """
        m = cls.pattern.match(src)
        if not m:
            msg = "EntryPoint must be in 'name=module:attrs [extras]' format"
            raise ValueError(msg, src)
        res = m.groupdict()
        extras = cls._parse_extras(res['extras'])
        attrs = res['attr'].split('.') if res['attr'] else ()
        return cls(res['name'], res['module'], attrs, extras, dist)
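
    # Illustrative example (editor's sketch, not part of the original module)::
    #
    #     >>> ep = EntryPoint.parse('pytest = pytest:main [testing]')
    #     >>> ep.name, ep.module_name, ep.attrs, ep.extras
    #     ('pytest', 'pytest', ('main',), ('testing',))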

    @classmethod
    def _parse_extras(cls, extras_spec):
        if not extras_spec:
            return ()
        req = Requirement.parse('x' + extras_spec)
        if req.specs:
            raise ValueError()
        return req.extras

    @classmethod
    def parse_group(cls, group, lines, dist=None):
        """Parse an entry point group"""
        if not MODULE(group):
            raise ValueError("Invalid group name", group)
        this = {}
        for line in yield_lines(lines):
            ep = cls.parse(line, dist)
            if ep.name in this:
                raise ValueError("Duplicate entry point", group, ep.name)
            this[ep.name] = ep
        return this

    @classmethod
    def parse_map(cls, data, dist=None):
        """Parse a map of entry point groups"""
        if isinstance(data, dict):
            data = data.items()
        else:
            data = split_sections(data)
        maps = {}
        for group, lines in data:
            if group is None:
                if not lines:
                    continue
                raise ValueError("Entry points must be listed in groups")
            group = group.strip()
            if group in maps:
                raise ValueError("Duplicate group name", group)
            maps[group] = cls.parse_group(group, lines, dist)
        return maps
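
    # Illustrative example (editor's sketch, not part of the original module):
    # parse_map accepts either a dict or INI-style text::
    #
    #     >>> m = EntryPoint.parse_map({'console_scripts': ['pip = pip:main']})
    #     >>> sorted(m['console_scripts'])
    #     ['pip']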


def _remove_md5_fragment(location):
    if not location:
        return ''
    parsed = urllib.parse.urlparse(location)
    if parsed[-1].startswith('md5='):
        return urllib.parse.urlunparse(parsed[:-1] + ('',))
    return location


def _version_from_file(lines):
    """
    Given an iterable of lines from a Metadata file, return
    the value of the Version field, if present, or None otherwise.
    """
    def is_version_line(line):
        return line.lower().startswith('version:')
    version_lines = filter(is_version_line, lines)
    line = next(iter(version_lines), '')
    _, _, value = line.partition(':')
    return safe_version(value.strip()) or None
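

# Illustrative example (editor's sketch, not part of the original module)::
#
#     >>> _version_from_file(['Name: demo', 'Version: 1.0'])
#     '1.0'
#     >>> _version_from_file(['Name: demo']) is None
#     True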


class Distribution(object):
    """Wrap an actual or potential sys.path entry w/metadata"""
    PKG_INFO = 'PKG-INFO'

    def __init__(
            self, location=None, metadata=None, project_name=None,
            version=None, py_version=PY_MAJOR, platform=None,
            precedence=EGG_DIST):
        self.project_name = safe_name(project_name or 'Unknown')
        if version is not None:
            self._version = safe_version(version)
        self.py_version = py_version
        self.platform = platform
        self.location = location
        self.precedence = precedence
        self._provider = metadata or empty_provider

    @classmethod
    def from_location(cls, location, basename, metadata=None, **kw):
        project_name, version, py_version, platform = [None] * 4
        basename, ext = os.path.splitext(basename)
        if ext.lower() in _distributionImpl:
            cls = _distributionImpl[ext.lower()]

            match = EGG_NAME(basename)
            if match:
                project_name, version, py_version, platform = match.group(
                    'name', 'ver', 'pyver', 'plat'
                )
        return cls(
            location, metadata, project_name=project_name, version=version,
            py_version=py_version, platform=platform, **kw
        )._reload_version()

    def _reload_version(self):
        return self

    @property
    def hashcmp(self):
        return (
            self.parsed_version,
            self.precedence,
            self.key,
            _remove_md5_fragment(self.location),
            self.py_version or '',
            self.platform or '',
        )

    def __hash__(self):
        return hash(self.hashcmp)

    def __lt__(self, other):
        return self.hashcmp < other.hashcmp

    def __le__(self, other):
        return self.hashcmp <= other.hashcmp

    def __gt__(self, other):
        return self.hashcmp > other.hashcmp

    def __ge__(self, other):
        return self.hashcmp >= other.hashcmp

    def __eq__(self, other):
        if not isinstance(other, self.__class__):
            # It's not a Distribution, so they are not equal
            return False
        return self.hashcmp == other.hashcmp

    def __ne__(self, other):
        return not self == other

    # These properties have to be lazy so that we don't have to load any
    # metadata until/unless it's actually needed.  (i.e., some distributions
    # may not know their name or version without loading PKG-INFO)

    @property
    def key(self):
        try:
            return self._key
        except AttributeError:
            self._key = key = self.project_name.lower()
            return key

    @property
    def parsed_version(self):
        if not hasattr(self, "_parsed_version"):
            self._parsed_version = parse_version(self.version)

        return self._parsed_version

    def _warn_legacy_version(self):
        LV = packaging.version.LegacyVersion
        is_legacy = isinstance(self._parsed_version, LV)
        if not is_legacy:
            return

        # While an empty version is technically a legacy version and
        # is not a valid PEP 440 version, it is also unlikely to have been
        # supplied by a user; more likely it comes from setuptools
        # attempting to parse a filename and including it in the list.
        # For that reason, we gate this warning on whether the version
        # is anything at all.
        if not self.version:
            return

        tmpl = textwrap.dedent("""
            '{project_name} ({version})' is being parsed as a legacy,
            non PEP 440,
            version. You may find odd behavior and sort order.
            In particular it will be sorted as less than 0.0. It
            is recommended to migrate to PEP 440 compatible
            versions.
            """).strip().replace('\n', ' ')

        warnings.warn(tmpl.format(**vars(self)), PEP440Warning)

    @property
    def version(self):
        try:
            return self._version
        except AttributeError:
            version = _version_from_file(self._get_metadata(self.PKG_INFO))
            if version is None:
                tmpl = "Missing 'Version:' header and/or %s file"
                raise ValueError(tmpl % self.PKG_INFO, self)
            return version

    @property
    def _dep_map(self):
        """
        A map of extra to its list of (direct) requirements
        for this distribution, including the null extra.
        """
        try:
            return self.__dep_map
        except AttributeError:
            self.__dep_map = self._filter_extras(self._build_dep_map())
        return self.__dep_map

    @staticmethod
    def _filter_extras(dm):
        """
        Given a mapping of extras to dependencies, strip off
        environment markers and filter out any dependencies
        not matching the markers.
        """
        for extra in list(filter(None, dm)):
            new_extra = extra
            reqs = dm.pop(extra)
            new_extra, _, marker = extra.partition(':')
            fails_marker = marker and (
                invalid_marker(marker)
                or not evaluate_marker(marker)
            )
            if fails_marker:
                reqs = []
            new_extra = safe_extra(new_extra) or None

            dm.setdefault(new_extra, []).extend(reqs)
        return dm

    def _build_dep_map(self):
        dm = {}
        for name in 'requires.txt', 'depends.txt':
            for extra, reqs in split_sections(self._get_metadata(name)):
                dm.setdefault(extra, []).extend(parse_requirements(reqs))
        return dm

    def requires(self, extras=()):
        """List of Requirements needed for this distro if `extras` are used"""
        dm = self._dep_map
        deps = []
        deps.extend(dm.get(None, ()))
        for ext in extras:
            try:
                deps.extend(dm[safe_extra(ext)])
            except KeyError:
                raise UnknownExtra(
                    "%s has no such extra feature %r" % (self, ext)
                )
        return deps

    def _get_metadata(self, name):
        if self.has_metadata(name):
            for line in self.get_metadata_lines(name):
                yield line

    def activate(self, path=None, replace=False):
        """Ensure distribution is importable on `path` (default=sys.path)"""
        if path is None:
            path = sys.path
        self.insert_on(path, replace=replace)
        if path is sys.path:
            fixup_namespace_packages(self.location)
            for pkg in self._get_metadata('namespace_packages.txt'):
                if pkg in sys.modules:
                    declare_namespace(pkg)

    def egg_name(self):
        """Return what this distribution's standard .egg filename should be"""
        filename = "%s-%s-py%s" % (
            to_filename(self.project_name), to_filename(self.version),
            self.py_version or PY_MAJOR
        )

        if self.platform:
            filename += '-' + self.platform
        return filename

    def __repr__(self):
        if self.location:
            return "%s (%s)" % (self, self.location)
        else:
            return str(self)

    def __str__(self):
        try:
            version = getattr(self, 'version', None)
        except ValueError:
            version = None
        version = version or "[unknown version]"
        return "%s %s" % (self.project_name, version)

    def __getattr__(self, attr):
        """Delegate all unrecognized public attributes to .metadata provider"""
        if attr.startswith('_'):
            raise AttributeError(attr)
        return getattr(self._provider, attr)

    def __dir__(self):
        return list(
            set(super(Distribution, self).__dir__())
            | set(
                attr for attr in self._provider.__dir__()
                if not attr.startswith('_')
            )
        )

    if not hasattr(object, '__dir__'):
        # python 2.7 not supported
        del __dir__

    @classmethod
    def from_filename(cls, filename, metadata=None, **kw):
        return cls.from_location(
            _normalize_cached(filename), os.path.basename(filename), metadata,
            **kw
        )

    def as_requirement(self):
        """Return a ``Requirement`` that matches this distribution exactly"""
        if isinstance(self.parsed_version, packaging.version.Version):
            spec = "%s==%s" % (self.project_name, self.parsed_version)
        else:
            spec = "%s===%s" % (self.project_name, self.parsed_version)

        return Requirement.parse(spec)

    def load_entry_point(self, group, name):
        """Return the `name` entry point of `group` or raise ImportError"""
        ep = self.get_entry_info(group, name)
        if ep is None:
            raise ImportError("Entry point %r not found" % ((group, name),))
        return ep.load()

    def get_entry_map(self, group=None):
        """Return the entry point map for `group`, or the full entry map"""
        try:
            ep_map = self._ep_map
        except AttributeError:
            ep_map = self._ep_map = EntryPoint.parse_map(
                self._get_metadata('entry_points.txt'), self
            )
        if group is not None:
            return ep_map.get(group, {})
        return ep_map

    def get_entry_info(self, group, name):
        """Return the EntryPoint object for `group`+`name`, or ``None``"""
        return self.get_entry_map(group).get(name)

    def insert_on(self, path, loc=None, replace=False):
        """Ensure self.location is on path

        If replace=False (default):
            - If location is already in path anywhere, do nothing.
            - Else:
              - If it's an egg and its parent directory is on path,
                insert just ahead of the parent.
              - Else: add to the end of path.
        If replace=True:
            - If location is already on path anywhere (not eggs)
              or higher priority than its parent (eggs)
              do nothing.
            - Else:
              - If it's an egg and its parent directory is on path,
                insert just ahead of the parent,
                removing any lower-priority entries.
              - Else: add it to the front of path.
        """

        loc = loc or self.location
        if not loc:
            return

        nloc = _normalize_cached(loc)
        bdir = os.path.dirname(nloc)
        npath = [(p and _normalize_cached(p) or p) for p in path]

        for p, item in enumerate(npath):
            if item == nloc:
                if replace:
                    break
                else:
                    # don't modify path (even removing duplicates) if
                    # found and not replace
                    return
            elif item == bdir and self.precedence == EGG_DIST:
                # if it's an .egg, give it precedence over its directory
                # UNLESS it's already been added to sys.path and replace=False
                if (not replace) and nloc in npath[p:]:
                    return
                if path is sys.path:
                    self.check_version_conflict()
                path.insert(p, loc)
                npath.insert(p, nloc)
                break
        else:
            if path is sys.path:
                self.check_version_conflict()
            if replace:
                path.insert(0, loc)
            else:
                path.append(loc)
            return

        # p is the spot where we found or inserted loc; now remove duplicates
        while True:
            try:
                np = npath.index(nloc, p + 1)
            except ValueError:
                break
            else:
                del npath[np], path[np]
                # ha!
                p = np

        return

    def check_version_conflict(self):
        if self.key == 'setuptools':
            # ignore the inevitable setuptools self-conflicts  :(
            return

        nsp = dict.fromkeys(self._get_metadata('namespace_packages.txt'))
        loc = normalize_path(self.location)
        for modname in self._get_metadata('top_level.txt'):
            if (modname not in sys.modules or modname in nsp
                    or modname in _namespace_packages):
                continue
            if modname in ('pkg_resources', 'setuptools', 'site'):
                continue
            fn = getattr(sys.modules[modname], '__file__', None)
            if fn and (normalize_path(fn).startswith(loc) or
                       fn.startswith(self.location)):
                continue
            issue_warning(
                "Module %s was already imported from %s, but %s is being added"
                " to sys.path" % (modname, fn, self.location),
            )

    def has_version(self):
        try:
            self.version
        except ValueError:
            issue_warning("Unbuilt egg for " + repr(self))
            return False
        return True

    def clone(self, **kw):
        """Copy this distribution, substituting in any changed keyword args"""
        names = 'project_name version py_version platform location precedence'
        for attr in names.split():
            kw.setdefault(attr, getattr(self, attr, None))
        kw.setdefault('metadata', self._provider)
        return self.__class__(**kw)

    @property
    def extras(self):
        return [dep for dep in self._dep_map if dep]


class EggInfoDistribution(Distribution):
    def _reload_version(self):
        """
        Packages installed by distutils (e.g. numpy or scipy),
        which uses an old safe_version, and so
        their version numbers can get mangled when
        converted to filenames (e.g., 1.11.0.dev0+2329eae to
        1.11.0.dev0_2329eae). These distributions will not be
        parsed properly
        downstream by Distribution and safe_version, so
        take an extra step and try to get the version number from
        the metadata file itself instead of the filename.
        """
        md_version = _version_from_file(self._get_metadata(self.PKG_INFO))
        if md_version:
            self._version = md_version
        return self


class DistInfoDistribution(Distribution):
    """
    Wrap an actual or potential sys.path entry
    w/metadata, .dist-info style.
    """
    PKG_INFO = 'METADATA'
    EQEQ = re.compile(r"([\(,])\s*(\d.*?)\s*([,\)])")

    @property
    def _parsed_pkg_info(self):
        """Parse and cache metadata"""
        try:
            return self._pkg_info
        except AttributeError:
            metadata = self.get_metadata(self.PKG_INFO)
            self._pkg_info = email.parser.Parser().parsestr(metadata)
            return self._pkg_info

    @property
    def _dep_map(self):
        try:
            return self.__dep_map
        except AttributeError:
            self.__dep_map = self._compute_dependencies()
            return self.__dep_map

    def _compute_dependencies(self):
        """Recompute this distribution's dependencies."""
        dm = self.__dep_map = {None: []}

        reqs = []
        # Including any condition expressions
        for req in self._parsed_pkg_info.get_all('Requires-Dist') or []:
            reqs.extend(parse_requirements(req))

        def reqs_for_extra(extra):
            for req in reqs:
                if not req.marker or req.marker.evaluate({'extra': extra}):
                    yield req

        common = frozenset(reqs_for_extra(None))
        dm[None].extend(common)

        for extra in self._parsed_pkg_info.get_all('Provides-Extra') or []:
            s_extra = safe_extra(extra.strip())
            dm[s_extra] = list(frozenset(reqs_for_extra(extra)) - common)

        return dm


_distributionImpl = {
    '.egg': Distribution,
    '.egg-info': EggInfoDistribution,
    '.dist-info': DistInfoDistribution,
}


def issue_warning(*args, **kw):
    level = 1
    g = globals()
    try:
        # find the first stack frame that is *not* code in
        # the pkg_resources module, to use for the warning
        while sys._getframe(level).f_globals is g:
            level += 1
    except ValueError:
        pass
    warnings.warn(stacklevel=level + 1, *args, **kw)


class RequirementParseError(ValueError):
    def __str__(self):
        return ' '.join(self.args)


def parse_requirements(strs):
    """Yield ``Requirement`` objects for each specification in `strs`

    `strs` must be a string, or a (possibly-nested) iterable thereof.
    """
    # create a steppable iterator, so we can handle \-continuations
    lines = iter(yield_lines(strs))

    for line in lines:
        # Drop comments -- a hash without a space may be in a URL.
        if ' #' in line:
            line = line[:line.find(' #')]
        # If there is a line continuation, drop it, and append the next line.
        if line.endswith('\\'):
            line = line[:-2].strip()
            try:
                line += next(lines)
            except StopIteration:
                return
        yield Requirement(line)
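

# Illustrative example (editor's sketch, not part of the original module)::
#
#     >>> reqs = list(parse_requirements("foo>=1.0\n# a comment\nbar"))
#     >>> [r.project_name for r in reqs]
#     ['foo', 'bar']
#
# ``foo`` and ``bar`` are made-up project names.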


class Requirement(packaging.requirements.Requirement):
    def __init__(self, requirement_string):
        """DO NOT CALL THIS UNDOCUMENTED METHOD; use Requirement.parse()!"""
        try:
            super(Requirement, self).__init__(requirement_string)
        except packaging.requirements.InvalidRequirement as e:
            raise RequirementParseError(str(e))
        self.unsafe_name = self.name
        project_name = safe_name(self.name)
        self.project_name, self.key = project_name, project_name.lower()
        self.specs = [
            (spec.operator, spec.version) for spec in self.specifier]
        self.extras = tuple(map(safe_extra, self.extras))
        self.hashCmp = (
            self.key,
            self.specifier,
            frozenset(self.extras),
            str(self.marker) if self.marker else None,
        )
        self.__hash = hash(self.hashCmp)

    def __eq__(self, other):
        return (
            isinstance(other, Requirement) and
            self.hashCmp == other.hashCmp
        )

    def __ne__(self, other):
        return not self == other

    def __contains__(self, item):
        if isinstance(item, Distribution):
            if item.key != self.key:
                return False

            item = item.version

        # Allow prereleases always in order to match the previous behavior of
        # this method. In the future this should be smarter and follow PEP 440
        # more accurately.
        return self.specifier.contains(item, prereleases=True)

    def __hash__(self):
        return self.__hash

    def __repr__(self):
        return "Requirement.parse(%r)" % str(self)

    @staticmethod
    def parse(s):
        req, = parse_requirements(s)
        return req
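
    # Illustrative example (editor's sketch, not part of the original module)::
    #
    #     >>> req = Requirement.parse('foo>=1.0,<2.0')
    #     >>> req.key
    #     'foo'
    #     >>> sorted(req.specs)
    #     [('<', '2.0'), ('>=', '1.0')]
    #     >>> '1.5' in req, '2.1' in req
    #     (True, False)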


def _always_object(classes):
    """
    Ensure object appears in the mro even
    for old-style classes.
    """
    if object not in classes:
        return classes + (object,)
    return classes


def _find_adapter(registry, ob):
    """Return an adapter factory for `ob` from `registry`"""
    types = _always_object(inspect.getmro(getattr(ob, '__class__', type(ob))))
    for t in types:
        if t in registry:
            return registry[t]


def ensure_directory(path):
    """Ensure that the parent directory of `path` exists"""
    dirname = os.path.dirname(path)
    py31compat.makedirs(dirname, exist_ok=True)


def _bypass_ensure_directory(path):
    """Sandbox-bypassing version of ensure_directory()"""
    if not WRITE_SUPPORT:
        raise IOError('"os.mkdir" not supported on this platform.')
    dirname, filename = split(path)
    if dirname and filename and not isdir(dirname):
        _bypass_ensure_directory(dirname)
        mkdir(dirname, 0o755)


def split_sections(s):
    """Split a string or iterable thereof into (section, content) pairs

    Each ``section`` is a stripped version of the section header ("[section]")
    and each ``content`` is a list of stripped lines excluding blank lines and
    comment-only lines.  If there are any such lines before the first section
    header, they're returned in a first ``section`` of ``None``.
    """
    section = None
    content = []
    for line in yield_lines(s):
        if line.startswith("["):
            if line.endswith("]"):
                if section or content:
                    yield section, content
                section = line[1:-1].strip()
                content = []
            else:
                raise ValueError("Invalid section heading", line)
        else:
            content.append(line)

    # wrap up last segment
    yield section, content
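

# Illustrative example (editor's sketch, not part of the original module)::
#
#     >>> list(split_sections("ungrouped\n[extras]\nfoo\nbar"))
#     [(None, ['ungrouped']), ('extras', ['foo', 'bar'])]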


def _mkstemp(*args, **kw):
    old_open = os.open
    try:
        # temporarily bypass sandboxing
        os.open = os_open
        return tempfile.mkstemp(*args, **kw)
    finally:
        # and then put it back
        os.open = old_open


# Silence the PEP440Warning by default, so that end users don't get hit by it
# randomly just because they use pkg_resources. We want to append the rule
# because we want earlier uses of filterwarnings to take precedence over this
# one.
warnings.filterwarnings("ignore", category=PEP440Warning, append=True)


# from jaraco.functools 1.3
def _call_aside(f, *args, **kwargs):
    f(*args, **kwargs)
    return f


@_call_aside
def _initialize(g=globals()):
    "Set up global resource manager (deliberately not state-saved)"
    manager = ResourceManager()
    g['_manager'] = manager
    g.update(
        (name, getattr(manager, name))
        for name in dir(manager)
        if not name.startswith('_')
    )


@_call_aside
def _initialize_master_working_set():
    """
    Prepare the master working set and make the ``require()``
    API available.

    This function has explicit effects on the global state
    of pkg_resources. It is intended to be invoked once at
    the initialization of this module.

    Invocation by other packages is unsupported and done
    at their own risk.
    """
    working_set = WorkingSet._build_master()
    _declare_state('object', working_set=working_set)

    require = working_set.require
    iter_entry_points = working_set.iter_entry_points
    add_activation_listener = working_set.subscribe
    run_script = working_set.run_script
    # backward compatibility
    run_main = run_script
    # Activate all distributions already on sys.path with replace=False and
    # ensure that all distributions added to the working set in the future
    # (e.g. by calling ``require()``) will get activated as well,
    # with higher priority (replace=True).
    tuple(
        dist.activate(replace=False)
        for dist in working_set
    )
    add_activation_listener(
        lambda dist: dist.activate(replace=True),
        existing=False,
    )
    working_set.entries = []
    # match order
    list(map(working_set.add_entry, sys.path))
    globals().update(locals())
site-packages/pkg_resources/py31compat.py000064400000001130147511334660014516 0ustar00import os
import errno
import sys


def _makedirs_31(path, exist_ok=False):
    try:
        os.makedirs(path)
    except OSError as exc:
        if not exist_ok or exc.errno != errno.EEXIST:
            raise


# rely on compatibility behavior until mode considerations
#  and exist_ok considerations are disentangled.
# See https://github.com/pypa/setuptools/pull/1083#issuecomment-315168663
needs_makedirs = (
    sys.version_info < (3, 2, 5) or
    (3, 3) <= sys.version_info < (3, 3, 6) or
    (3, 4) <= sys.version_info < (3, 4, 1)
)
makedirs = _makedirs_31 if needs_makedirs else os.makedirs
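
# Illustrative usage (editor's sketch, not part of the original module);
# the path below is hypothetical::
#
#     makedirs('/tmp/example/nested/dir', exist_ok=True)  # no error if it exists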
site-packages/pkg_resources/__pycache__/__init__.cpython-36.pyc000064400000272700147511334660020556 0ustar003

[binary data omitted: compiled CPython 3.6 bytecode (__init__.cpython-36.pyc for pkg_resources); not representable as text]
||_||_|�dS)z5Give an error message for problems extracting file(s)ra
            Can't extract file(s) to egg cache

            The following error occurred while trying to extract file(s)
            to the Python egg cache:

              {old_exc}

            The Python egg cache directory is currently set to:

              {cache_path}

            Perhaps your account does not have write access to this directory?
            You can change the cache directory by setting the PYTHON_EGG_CACHE
            environment variable to point to an accessible directory.
            N)
r?�exc_info�extraction_pathr\�textwrap�dedent�lstriprgr�r�r��
cache_pathZoriginal_error)r��old_excr*�tmpl�errrrr�extraction_error�s
z ResourceManager.extraction_errorcCsf|jp
t�}tjj||df|��}yt|�Wntk
rL|j�YnX|j|�d|j	|<|S)a�Return absolute location in cache for `archive_name` and `names`

        The parent directory of the resulting path will be created if it does
        not already exist.  `archive_name` should be the base filename of the
        enclosing egg (which may not be the name of the enclosing zipfile!),
        including its ".egg" extension.  `names`, if provided, should be a
        sequence of path name parts "under" the egg's extraction location.

        This method should only be called by resource providers that need to
        obtain an extraction location, and only for names they intend to
        extract, as it tracks the generated names for possible cleanup later.
        z-tmpr)
r&r\r�r�rA�_bypass_ensure_directory�	Exceptionr.�_warn_unsafe_extraction_pathr#)r�Zarchive_name�namesZextract_pathZtarget_pathrrr�get_cache_path�s


zResourceManager.get_cache_pathcCsXtjdkr |jtjd�r dStj|�j}|tj@s@|tj@rTd|}tj	|t
�dS)aN
        If the default extraction path is overridden and set to an insecure
        location, such as /tmp, it opens up an opportunity for an attacker to
        replace an extracted file with an unauthorized payload. Warn the user
        if a known insecure location is used.

        See Distribute #375 for more details.
        �ntZwindirNz�%s is writable by group/others and vulnerable to attack when used with get_resource_filename. Consider a more secure location (set with .set_extraction_path or the PYTHON_EGG_CACHE environment variable).)r�r�r��environ�stat�st_mode�S_IWOTH�S_IWGRP�warnings�warn�UserWarning)r��mode�msgrrrr1�s
z,ResourceManager._warn_unsafe_extraction_pathcCs.tjdkr*tj|�jdBd@}tj||�dS)a4Perform any platform-specific postprocessing of `tempname`

        This is where Mac header rewrites should be done; other platforms don't
        have anything special they should do.

        Resource providers should call this method ONLY after successfully
        extracting a compressed resource.  They must NOT call it on resources
        that are already in the filesystem.

        `tempname` is the current (temporary) name of the file, and `filename`
        is the name it will be renamed to by the caller after this routine
        returns.
        �posiximi�N)r�r�r6r7�chmod)r�Ztempname�filenamer=rrr�postprocess�s
zResourceManager.postprocesscCs|jrtd��||_dS)a�Set the base path where resources will be extracted to, if needed.

        If you do not call this routine before any extractions take place, the
        path defaults to the return value of ``get_default_cache()``.  (Which
        is based on the ``PYTHON_EGG_CACHE`` environment variable, with various
        platform-specific fallbacks.  See that routine's documentation for more
        details.)

        Resources are extracted to subdirectories of this path based upon
        information given by the ``IResourceProvider``.  You may set this to a
        temporary directory, but then you must call ``cleanup_resources()`` to
        delete the extracted files when done.  There is no guarantee that
        ``cleanup_resources()`` will be able to remove all extracted files.

        (Note: you may not change the extraction path for a given resource
        manager once resources have been extracted, unless you first call
        ``cleanup_resources()``.)
        z5Can't change extraction path, files already extractedN)r#rDr&)r�r�rrrrZ�sz#ResourceManager.set_extraction_pathFcCsdS)aB
        Delete all extracted resource files and directories, returning a list
        of the file and directory names that could not be successfully removed.
        This function does not have any concurrency protection, so it should
        generally only be called when the extraction path is a temporary
        directory exclusive to a single process.  This method is not
        automatically called; you must call it explicitly or register it as an
        ``atexit`` function if you wish to ensure cleanup of a temporary
        directory used for extractions.
        Nr)r��forcerrrr[�sz!ResourceManager.cleanup_resources)F)rrrrr&r�rTrUrRrQrPrSr.r3�staticmethodr1rBrZr[rrrrr_YscCstjjd�ptjdd�S)z�
    Return the ``PYTHON_EGG_CACHE`` environment variable
    or a platform-relevant user cache dir for an app
    named "Python-Eggs".
    ZPYTHON_EGG_CACHEzPython-Eggs)Zappname)r�r5r�rZuser_cache_dirrrrrr\
scCstjdd|�S)z�Convert an arbitrary string to a standard distribution name

    Any runs of non-alphanumeric/. characters are replaced with a single '-'.
    z[^A-Za-z0-9.]+�-)�re�sub)r�rrrriscCsDyttjj|��Stjjk
r>|jdd�}tjdd|�SXdS)zB
    Convert an arbitrary string to a standard version string
    r�r:z[^A-Za-z0-9.]+rEN)r�rrrrr�rFrG)rrrrrj!s
cCstjdd|�j�S)z�Convert an arbitrary string to a standard 'extra' name

    Any runs of non-alphanumeric characters are replaced with a single '_',
    and the result is always lowercased.
    z[^A-Za-z0-9.-]+r�)rFrGr)r	rrrro-scCs|jdd�S)z|Convert a project or version name to its filename-escaped form

    Any '-' characters are currently replaced with '_'.
    rEr�)r�)r�rrrrp6scCs>yt|�Wn,tk
r8}zd|_d|_|Sd}~XnXdS)zo
    Validate text as a PEP 508 environment marker; return an exception
    if invalid or False otherwise.
    NF)rr�SyntaxErrorrA�lineno)�text�errrrq>scCsHytjj|�}|j�Stjjk
rB}zt|��WYdd}~XnXdS)z�
    Evaluate a PEP 508 environment marker.
    Return a boolean indicating the marker result in this environment.
    Raise SyntaxError if marker is invalid.

    This implementation uses the 'pyparsing' module.
    N)rZmarkersZMarkerrZ
InvalidMarkerrH)rJr	r
rKrrrrrLs
c@s�eZdZdZdZdZdZdd�Zdd�Zdd�Z	d	d
�Z
dd�Zd
d�Zdd�Z
dd�Zdd�Zdd�Zdd�Zdd�Zdd�Zdd�Zdd �Zd!d"�Zd#d$�Zd%d&�ZdS)'r�zETry to implement resources and metadata for arbitrary PEP 302 loadersNcCs(t|dd�|_tjjt|dd��|_dS)Nr��__file__r�)r�r�r�r��dirname�module_path)r�r�rrrr�bszNullProvider.__init__cCs|j|j|�S)N)�_fnrN)r�r�r�rrrr�fsz"NullProvider.get_resource_filenamecCstj|j||��S)N)�io�BytesIOr�)r�r�r�rrrr�isz NullProvider.get_resource_streamcCs|j|j|j|��S)N)�_getrOrN)r�r�r�rrrr�lsz NullProvider.get_resource_stringcCs|j|j|j|��S)N)�_hasrOrN)r�r�rrrr�oszNullProvider.has_resourcecCs|jo|j|j|j|��S)N)�egg_inforSrO)r�r�rrrr�rszNullProvider.has_metadatacCs2|js
dS|j|j|j|��}tjr.|jd�S|S)Nr�zutf-8)rTrRrOrZPY3�decode)r�r��valuerrrr�uszNullProvider.get_metadatacCst|j|��S)N)rmr�)r�r�rrrr�{szNullProvider.get_metadata_linescCs|j|j|j|��S)N)�_isdirrOrN)r�r�rrrrU~szNullProvider.resource_isdircCs|jo|j|j|j|��S)N)rTrWrO)r�r�rrrr��szNullProvider.metadata_isdircCs|j|j|j|��S)N)�_listdirrOrN)r�r�rrrrS�szNullProvider.resource_listdircCs|jr|j|j|j|��SgS)N)rTrXrO)r�r�rrrr��szNullProvider.metadata_listdirc
Cs�d|}|j|�s$tdjft����|j|�jdd�}|jdd�}|j|j|�}||d<tj	j
|�r�t|�j�}t
||d�}t|||�n>dd	lm}t|�d|jd�|f||<t
||d�}	t|	||�dS)
Nzscripts/z<Script {script!r} not found in metadata at {self.egg_info!r}z
�
�
rL�execr)�cache)r�rcr�r�r�r�rOrTr�r�r�r�read�compiler[�	linecacher\�lenr)
r�r�r�ZscriptZscript_textZscript_filename�source�coder\Zscript_coderrrrI�s"

zNullProvider.run_scriptcCstd��dS)Nz9Can't perform this operation for unregistered loader type)�NotImplementedError)r�r�rrrrS�szNullProvider._hascCstd��dS)Nz9Can't perform this operation for unregistered loader type)rc)r�r�rrrrW�szNullProvider._isdircCstd��dS)Nz9Can't perform this operation for unregistered loader type)rc)r�r�rrrrX�szNullProvider._listdircCs |rtjj|f|jd���S|S)N�/)r�r�rAr)r��baser�rrrrO�szNullProvider._fncCs$t|jd�r|jj|�Std��dS)N�get_dataz=Can't perform this operation for loaders without 'get_data()')r�r�rfrc)r�r�rrrrR�szNullProvider._get)rrrr�egg_namerTr�r�r�r�r�r�r�r�r�rUr�rSr�rIrSrWrXrOrRrrrrr�[s,c@s eZdZdZdd�Zdd�ZdS)r�z&Provider based on a virtual filesystemcCstj||�|j�dS)N)r�r��
_setup_prefix)r�r�rrrr��szEggProvider.__init__cCs^|j}d}xN||krXt|�rBtjj|�|_tjj|d�|_||_P|}tjj	|�\}}qWdS)NzEGG-INFO)
rN�_is_egg_pathr�r��basenamergrArT�egg_rootr)r�r��oldrerrrrh�s
zEggProvider._setup_prefixN)rrrrr�rhrrrrr��sc@sDeZdZdZdd�Zdd�Zdd�Zdd	�Zd
d�Ze	dd
��Z
dS)r�z6Provides access to package resources in the filesystemcCstjj|�S)N)r�r�r�)r�r�rrrrS�szDefaultProvider._hascCstjj|�S)N)r�r�r
)r�r�rrrrW�szDefaultProvider._isdircCs
tj|�S)N)r��listdir)r�r�rrrrX�szDefaultProvider._listdircCst|j|j|�d�S)N�rb)rrOrN)r�r�r�rrrr��sz#DefaultProvider.get_resource_streamc	Cst|d��
}|j�SQRXdS)Nrn)rr])r�r��streamrrrrR�szDefaultProvider._getcCs0d}x&|D]}tt|td��}t||�q
WdS)N�SourceFileLoader�SourcelessFileLoader)rprq)r��importlib_machinery�typer�)r�Zloader_namesr�Z
loader_clsrrr�	_register�s
zDefaultProvider._registerN)rrrrrSrWrXr�rRrrtrrrrr��sc@s8eZdZdZdZdd�ZZdd�Zdd�Zd	d
�Z	dS)rz.Provider that returns nothing for all requestsNcCsdS)NFr)r�r�rrrr8�szEmptyProvider.<lambda>cCsdS)Nr�r)r�r�rrrrR�szEmptyProvider._getcCsgS)Nr)r�r�rrrrXszEmptyProvider._listdircCsdS)Nr)r�rrrr�szEmptyProvider.__init__)
rrrrrNrWrSrRrXr�rrrrr�sc@s eZdZdZedd��ZeZdS)�ZipManifestsz
    zip manifest builder
    c
s4tj|�� ��fdd��j�D�}t|�SQRXdS)a
        Build a dictionary similar to the zipimport directory
        caches, except instead of tuples, store ZipInfo objects.

        Use a platform-specific path separator (os.sep) for the path keys
        for compatibility with pypy on Windows.
        c3s&|]}|jdtj��j|�fVqdS)rdN)r�r��sepZgetinfo)rr�)�zfilerrr
sz%ZipManifests.build.<locals>.<genexpr>N)�zipfileZZipFileZnamelistr$)r�r�r(r)rwr�builds	
zZipManifests.buildN)rrrrrry�loadrrrrru
sruc@s$eZdZdZejdd�Zdd�ZdS)�MemoizedZipManifestsz%
    Memoized zipfile manifests.
    �manifest_modzmanifest mtimecCsRtjj|�}tj|�j}||ks.||j|krH|j|�}|j||�||<||jS)zW
        Load a manifest at path or return a suitable manifest already loaded.
        )	r�r��normpathr6�st_mtime�mtimeryr|�manifest)r�r�rr�rrrrz+s
zMemoizedZipManifests.loadN)rrrrr��
namedtupler|rzrrrrr{%sr{c@s�eZdZdZdZe�Zdd�Zdd�Zdd�Z	e
d	d
��Zdd�Ze
d
d��Zdd�Zdd�Zdd�Zdd�Zdd�Zdd�Zdd�Zdd�Zdd �ZdS)!r�z"Resource support for zips and eggsNcCs tj||�|jjtj|_dS)N)r�r�r��archiver�rv�zip_pre)r�r�rrrr�?szZipProvider.__init__cCsP|jtj�}||jjkrdS|j|j�r:|t|j�d�Std||jf��dS)Nr�z%s is not a subpath of %s)	�rstripr�rvr�r�r�r�r`�AssertionError)r��fspathrrr�
_zipinfo_nameCszZipProvider._zipinfo_namecCsP|j|}|j|jtj�r:|t|j�dd�jtj�Std||jf��dS)Nrz%s is not a subpath of %s)r�r�rkr�rvr`rr�)r��zip_pathr�rrr�_partsOs

zZipProvider._partscCs|jj|jj�S)N)�_zip_manifestsrzr�r�)r�rrr�zipinfoYszZipProvider.zipinfocCs`|jstd��|j|�}|j�}dj|j|��|krTx|D]}|j||j|��q:W|j||�S)Nz5resource_filename() only supported for .egg, not .ziprd)rgrc�_resource_to_zip�_get_eager_resourcesrAr��_extract_resource�
_eager_to_zip)r�r�r�r��eagersr�rrrr�]s

z!ZipProvider.get_resource_filenamecCs"|j}|jd}tj|�}||fS)Nrrr�)rrr�)Z	file_size�	date_time�timeZmktime)Zzip_stat�sizer��	timestamprrr�_get_date_and_sizejs

zZipProvider._get_date_and_sizec
Csn||j�krDx*|j�|D]}|j|tjj||��}qWtjj|�S|j|j|�\}}tsdt	d��y�|j
|j|j|��}|j
||�r�|Stdtjj|�d�\}}	tj||jj|��tj|�t|	||f�|j|	|�yt|	|�Wn\tjk
�rDtjj|��r>|j
||��r|Stjdk�r>t|�t|	|�|S�YnXWn tjk
�rh|j�YnX|S)Nz>"os.rename" and "os.unlink" are not supported on this platformz	.$extract)�dirr4)�_indexr�r�r�rArMr�r��
WRITE_SUPPORT�IOErrorr3rgr��_is_current�_mkstemp�writer�rf�closerrBr
�error�isfiler�rr.)
r�r�r�r�Zlastr�r�Z	real_pathZoutfZtmpnamrrrr�ssD

zZipProvider._extract_resourcec		Csx|j|j|�\}}tjj|�s$dStj|�}|j|ksB|j|krFdS|jj	|�}t
|d��}|j�}WdQRX||kS)zK
        Return True if the file_path is current for this zip_path
        FrnN)r�r�r�r�r�r6�st_sizer~r�rfrr])	r�Z	file_pathr�r�r�r6Zzip_contents�fZ
file_contentsrrrr��s
zZipProvider._is_currentcCsB|jdkr<g}x&dD]}|j|�r|j|j|��qW||_|jS)N�native_libs.txt�eager_resources.txt)r�r�)r�r�r�r�)r�r�r�rrrr��s


z ZipProvider._get_eager_resourcescCs�y|jStk
r�i}xd|jD]Z}|jtj�}xH|rztjj|dd��}||krj||j|d�Pq4|j�g||<q4Wq"W||_|SXdS)Nrr�r�)	Z	_dirindex�AttributeErrorr�rr�rvrAr�r�)r�Zindr��parts�parentrrrr��szZipProvider._indexcCs |j|�}||jkp||j�kS)N)r�r�r�)r�r�r�rrrrS�s
zZipProvider._hascCs|j|�|j�kS)N)r�r�)r�r�rrrrW�szZipProvider._isdircCst|j�j|j|�f��S)N)r�r�r�r�)r�r�rrrrX�szZipProvider._listdircCs|j|j|j|��S)N)r�rOrk)r�r�rrrr��szZipProvider._eager_to_zipcCs|j|j|j|��S)N)r�rOrN)r�r�rrrr��szZipProvider._resource_to_zip)rrrrr�r{r�r�r�r�r�r�r�rDr�r�r�r�r�rSrWrXr�r�rrrrr�9s$

	7	c@s8eZdZdZdd�Zdd�Zdd�Zdd	�Zd
d�ZdS)
r|a*Metadata handler for standalone PKG-INFO files

    Usage::

        metadata = FileMetadata("/path/to/PKG-INFO")

    This provider rejects all data and metadata requests except for PKG-INFO,
    which is treated as existing, and will be the contents of the file at
    the provided location.
    cCs
||_dS)N)r�)r�r�rrrr��szFileMetadata.__init__cCs|dkotjj|j�S)NzPKG-INFO)r�r�r�)r�r�rrrr��szFileMetadata.has_metadatac	CsD|dkrtd��tj|jddd��}|j�}WdQRX|j|�|S)NzPKG-INFOz(No metadata except PKG-INFO is availablezutf-8r�)�encoding�errors)r�rPrr�r]�_warn_on_replacement)r�r�r��metadatarrrr��s
zFileMetadata.get_metadatacCs2djd�}||kr.d}|jft��}tj|�dS)Ns�zutf-8z2{self.path} could not be properly decoded in UTF-8)rUr�r�r:r;)r�r�Zreplacement_charr,r>rrrr�s

z!FileMetadata._warn_on_replacementcCst|j|��S)N)rmr�)r�r�rrrr�szFileMetadata.get_metadata_linesN)	rrrrr�r�r�r�r�rrrrr|�s
	c@seZdZdZdd�ZdS)r}asMetadata provider for egg directories

    Usage::

        # Development eggs:

        egg_info = "/path/to/PackageName.egg-info"
        base_dir = os.path.dirname(egg_info)
        metadata = PathMetadata(base_dir, egg_info)
        dist_name = os.path.splitext(os.path.basename(egg_info))[0]
        dist = Distribution(basedir, project_name=dist_name, metadata=metadata)

        # Unpacked egg directories:

        egg_path = "/path/to/PackageName-ver-pyver-etc.egg"
        metadata = PathMetadata(egg_path, os.path.join(egg_path,'EGG-INFO'))
        dist = Distribution.from_filename(egg_path, metadata=metadata)
    cCs||_||_dS)N)rNrT)r�r�rTrrrr�#szPathMetadata.__init__N)rrrrr�rrrrr}sc@seZdZdZdd�ZdS)r~z Metadata provider for .egg filescCsD|jtj|_||_|jr0tjj|j|j�|_n|j|_|j	�dS)z-Create a metadata provider from a zipimporterN)
r�r�rvr�r��prefixr�rArNrh)r��importerrrrr�+szEggMetadata.__init__N)rrrrr�rrrrr~(sr$)�_distribution_finderscCs|t|<dS)axRegister `distribution_finder` to find distributions in sys.path items

    `importer_type` is the type or class of a PEP 302 "Importer" (sys.path item
    handler), and `distribution_finder` is a callable that, passed a path
    item and the importer instance, yields ``Distribution`` instances found on
    that path item.  See ``pkg_resources.find_on_path`` for an example.N)r�)�
importer_typeZdistribution_finderrrrr�:scCst|�}tt|�}||||�S)z.Yield distributions accessible via `path_item`)rr�r�)�	path_item�onlyr��finderrrrrYDs
c	cs�|jjd�rdSt|�}|jd�r2tj||d�V|r:dSx�|jd�D]�}t|�r�tj	j
||�}ttj
|�|�}xT|D]
}|VqvWqF|j�jd�rFtj	j
||�}ttj
|��}||_tj|||�VqFWdS)z@
    Find eggs in zip files; possibly multiple nested eggs.
    z.whlNzPKG-INFO)r�rdz
.dist-info)r��endswithr~r�r`�
from_filenamerSrir�r�rA�find_eggs_in_zip�	zipimport�zipimporterrrT�
from_location)	r�r�r�r�Zsubitem�subpathr�r�Zsubmetarrrr�Ks$

r�cCsfS)Nr)r�r�r�rrr�find_nothingisr�cCsdd�}t||dd�S)aL
    Given a list of filenames, return them in descending order
    by version number.

    >>> names = 'bar', 'foo', 'Python-2.7.10.egg', 'Python-2.7.2.egg'
    >>> _by_version_descending(names)
    ['Python-2.7.10.egg', 'Python-2.7.2.egg', 'foo', 'bar']
    >>> names = 'Setuptools-1.2.3b1.egg', 'Setuptools-1.2.3.egg'
    >>> _by_version_descending(names)
    ['Setuptools-1.2.3.egg', 'Setuptools-1.2.3b1.egg']
    >>> names = 'Setuptools-1.2.3b1.egg', 'Setuptools-1.2.3.post1.egg'
    >>> _by_version_descending(names)
    ['Setuptools-1.2.3.post1.egg', 'Setuptools-1.2.3b1.egg']
    cSs2tjj|�\}}tj|jd�|g�}dd�|D�S)z6
        Parse each component of the filename
        rEcSsg|]}tjj|��qSr)rrr�)r�partrrr�
<listcomp>�sz?_by_version_descending.<locals>._by_version.<locals>.<listcomp>)r�r��splitext�	itertools�chainr)r��extr�rrr�_by_versionsz+_by_version_descending.<locals>._by_versionT)r2r)�sorted)r2r�rrr�_by_version_descendingpsr�c
#s�t���t��r4tj�t�tjj�d��d�VdSt��}��fdd�|D�}t	|�}x>|D]6}tjj�|�}t
�|��}x||�D]
}	|	Vq�Wq^WdS)z6Yield distributions accessible on a sys.path directoryzEGG-INFO)r�Nc3s|]}t�|��r|VqdS)N)�dist_factory)rr�)r�r�rrr
�szfind_on_path.<locals>.<genexpr>)�_normalize_cached�_is_unpacked_eggr`r�r}r�r�rA�safe_listdirr�r�)
r�r�r�r�ZfilteredZpath_item_entriesr��fullpath�factoryr�r)r�r�r�find_on_path�s
r�cCsL|j�}tt|jd��}|r tS|r2t|�r2tS|rF|jd�rFtSt�S)z9
    Return a dist_factory for a path_item and entry
    �	.egg-info�
.dist-infoz	.egg-link)r�r�)	rrrr��distributions_from_metadatarirY�resolve_egg_link�NoDists)r�r�r�rZis_metarrrr��sr�c@s*eZdZdZdd�ZejreZdd�ZdS)r�zS
    >>> bool(NoDists())
    False

    >>> list(NoDists()('anything'))
    []
    cCsdS)NFr)r�rrr�__bool__�szNoDists.__bool__cCstf�S)N)�iter)r�r�rrr�__call__�szNoDists.__call__N)	rrrrr�r�PY2Z__nonzero__r�rrrrr��s
r�cCsty
tj|�Sttfk
r"YnNtk
rn}z2|jtjtjtjfkpVt	|dd�dk}|s^�WYdd}~XnXfS)zI
    Attempt to list contents of path, but suppress some exceptions.
    ZwinerrorNi)
r�rm�PermissionError�NotADirectoryError�OSError�errno�ENOTDIRZEACCES�ENOENTr�)r�rKZ	ignorablerrrr��s
r�ccsftjj|�}tjj|�r:ttj|��dkr.dSt||�}nt|�}tjj|�}t	j
|||td�VdS)Nr)�
precedence)r�r�rMr
r`rmr}r|rjr`r�ry)r��rootr�r�rrrr��sr�c	cs8t|��&}x|D]}|j�}|r|VqWWdQRXdS)z1
    Yield non-empty lines from file at path
    N)r�strip)r�r��linerrr�non_empty_lines�s


r�cs.t��}�fdd�|D�}tt|�}t|f�S)za
    Given a path to an .egg-link, resolve distributions
    present in the referenced path.
    c3s$|]}tjjtjj��|�VqdS)N)r�r�rArM)r�ref)r�rrr
sz#resolve_egg_link.<locals>.<genexpr>)r�rrY�next)r�Zreferenced_pathsZresolved_pathsZdist_groupsr)r�rr��s


r��
FileFinder)�_namespace_handlers)�_namespace_packagescCs|t|<dS)a�Register `namespace_handler` to declare namespace packages

    `importer_type` is the type or class of a PEP 302 "Importer" (sys.path item
    handler), and `namespace_handler` is a callable like this::

        def namespace_handler(importer, path_entry, moduleName, module):
            # return a path_entry to use for child packages

    Namespace handlers are only called if the importer object has already
    agreed that it can handle the relevant path item, and they should only
    return a subpath if the module __path__ does not already contain an
    equivalent subpath.  For an example namespace handler, see
    ``pkg_resources.file_ns_handler``.
    N)r�)r�Znamespace_handlerrrrr�scCs�t|�}|dkrdS|j|�}|dkr*dStjj|�}|dkrbtj|�}tj|<g|_t|�nt	|d�svt
d|��tt|�}|||||�}|dk	r�|j}|j
|�|j|�t|||�|S)zEEnsure that named package includes a subpath of path_item (if needed)N�__path__zNot a package:)r�find_moduler?r�r��types�
ModuleTyper��_set_parent_nsr�r�r�r�r��load_module�_rebuild_mod_path)�packageNamer�r�r�r�Zhandlerr�r�rrr�
_handle_ns$s*






r�cs`dd�tjD���fdd����fdd�}t|t�s8dS|j|d�d	d�|D�|jdd�<dS)
zq
    Rebuild module.__path__ ensuring that all entries are ordered
    corresponding to their sys.path order
    cSsg|]}t|��qSr)r�)r�prrrr�Csz%_rebuild_mod_path.<locals>.<listcomp>cs(y
�j|�Stk
r"td�SXdS)z/
        Workaround for #520 and #513.
        �infN)�indexrD�float)r�)�sys_pathrr�safe_sys_path_indexEs
z._rebuild_mod_path.<locals>.safe_sys_path_indexcs<|jtj�}�jd�d}|d|�}�ttjj|���S)zR
        Return the ordinal of the path based on its position in sys.path
        r:rN)rr�rv�countr�rA)r��
path_partsZmodule_partsr�)�package_namer�rr�position_in_sys_pathNsz/_rebuild_mod_path.<locals>.position_in_sys_pathN)r2cSsg|]}t|��qSr)r�)rr�rrrr�\s)r?r�r�r�rr�)Z	orig_pathr�r�r�r)r�r�r�rr�>s		
r�cCs�tj�z�|tkrdStjd}}d|kr�dj|jd�dd��}t|�|tkrZt|�ytj	|j
}Wntk
r�td|��YnXtj
|g�j|�tj
|g�x|D]}t||�q�WWdtj�XdS)z9Declare that package 'packageName' is a namespace packageNr:rzNot a package:r�)�_imp�acquire_lockr�r?r�rArrVr�r�r�r�r�r�r�r��release_lock)r�r�r�r�rrrrV_s&
c
CsJtj�z2x,tj|f�D]}t||�}|rt||�qWWdtj�XdS)zDEnsure that previously-declared namespace packages include path_itemN)r�r�r�r�r�r�r�)r�r��packager�rrrr��s
cCsFtjj||jd�d�}t|�}x |jD]}t|�|kr(Pq(W|SdS)zBCompute an ns-package subpath for a filesystem or zipfile importerr:rNr�)r�r�rArr�r�)r�r�r�r�r�Z
normalizedr�rrr�file_ns_handler�sr�cCsdS)Nr)r�r�r�r�rrr�null_ns_handler�sr�cCstjjtjj|��S)z1Normalize a file/dir name for comparison purposes)r�r��normcase�realpath)rArrrrt�scCs2y||Stk
r,t|�||<}|SXdS)N)r�rt)rAr��resultrrrr��s
r�cCs|j�jd�S)z7
    Determine if given path appears to be an egg.
    z.egg)rr�)r�rrrri�sricCs t|�otjjtjj|dd��S)z@
    Determine if given path appears to be an unpacked egg.
    zEGG-INFOzPKG-INFO)rir�r�r�rA)r�rrrr��sr�cCs<|jd�}|j�}|r8dj|�}ttj||tj|�dS)Nr:)rr�rA�setattrr?r�)r�r�r�r�rrrr��s


r�ccsht|tj�r>xV|j�D]"}|j�}|r|jd�r|VqWn&x$|D]}xt|�D]
}|VqRWqDWdS)z9Yield non-empty/non-comment lines of a string or sequence�#N)r�rr��
splitlinesr�r�rm)�strs�sZssrrrrm�s
z\w+(\.\w+)*$z�
    (?P<name>[^-]+) (
        -(?P<ver>[^-]+) (
            -py(?P<pyver>[^-]+) (
                -(?P<plat>.+)
            )?
        )?
    )?
    c@s�eZdZdZffdfdd�Zdd�Zdd�Zdd
d�Zdd
�Zddd�Z	e
jd�Ze
ddd��Ze
dd��Ze
ddd��Ze
ddd��ZdS)rbz3Object representing an advertised importable objectNcCs<t|�std|��||_||_t|�|_t|�|_||_dS)NzInvalid module name)�MODULErDr��module_name�tuple�attrsr�r�)r�r�rrr�r�rrrr��s


zEntryPoint.__init__cCsHd|j|jf}|jr*|ddj|j�7}|jrD|ddj|j�7}|S)Nz%s = %s�:r:z [%s]�,)r�rrrAr�)r�rrrrr��szEntryPoint.__str__cCsdt|�S)NzEntryPoint.parse(%r))r�)r�rrrr��szEntryPoint.__repr__TcOs6|s|s|rtjdtdd�|r.|j||�|j�S)zH
        Require packages for this EntryPoint, then resolve it.
        zJParameters to load are deprecated.  Call .resolve and .require separately.r;)�
stacklevel)r:r;�DeprecationWarningrHr�)r�rHr7�kwargsrrrrz	szEntryPoint.loadcCsVt|jdgdd�}ytjt|j|�Stk
rP}ztt|���WYdd}~XnXdS)zD
        Resolve the entry point from its module and attrs.
        rr)�fromlist�levelN)	r�r�	functools�reducer�rr�r�r�)r�r��excrrrr�	s
zEntryPoint.resolvecCsN|jr|jrtd|��|jj|j�}tj||||jd�}tttj|��dS)Nz&Can't require() without a distribution)r�)	r�r�rfr�rWr�r�rr�)r�rrr�r(rrrrH	s

zEntryPoint.requirez]\s*(?P<name>.+?)\s*=\s*(?P<module>[\w.]+)\s*(:\s*(?P<attr>[\w.]+))?\s*(?P<extras>\[.*\])?\s*$cCsf|jj|�}|sd}t||��|j�}|j|d�}|drJ|djd�nf}||d|d|||�S)aParse a single entry point from string `src`

        Entry point syntax follows the form::

            name = some.module:some.attr [extra1, extra2]

        The entry name and module name are required, but the ``:attrs`` and
        ``[extras]`` parts are optional
        z9EntryPoint must be in 'name=module:attrs [extras]' formatr��attrr:r�r�)�patternr>rD�	groupdict�
_parse_extrasr)r��srcr�rFr>�resr�rrrrr�0	s
zEntryPoint.parsecCs(|sfStjd|�}|jr"t��|jS)N�x)rar��specsrDr�)r�Zextras_specr�rrrrD	szEntryPoint._parse_extrascCsZt|�std|��i}x>t|�D]2}|j||�}|j|krHtd||j��|||j<q W|S)zParse an entry point groupzInvalid group namezDuplicate entry point)rrDrmr�r�)r�rC�linesr��thisr�r�rrr�parse_groupM	s

zEntryPoint.parse_groupcCsxt|t�r|j�}nt|�}i}xR|D]J\}}|dkrD|s<q&td��|j�}||kr^td|��|j|||�||<q&W|S)z!Parse a map of entry point groupsNz%Entry points must be listed in groupszDuplicate group name)r�r$r(rnrDr�r)r��datar��mapsrCrrrr�	parse_mapZ	s


zEntryPoint.parse_map)T)NN)N)N)N)rrrrr�r�r�rzr�rHrFr^rrr�rrrrrrrrb�s 	



	cCs>|sdStjj|�}|djd�r:tjj|dd�d�S|S)Nr�rzmd5=r�r�)r�)rr�Zurlparser�Z
urlunparse)r�Zparsedrrr�_remove_md5_fragmentn	sr cCs@dd�}t||�}tt|�d�}|jd�\}}}t|j��p>dS)z�
    Given an iterable of lines from a Metadata file, return
    the value of the Version field, if present, or None otherwise.
    cSs|j�jd�S)Nzversion:)rr�)r�rrr�is_version_line|	sz+_version_from_file.<locals>.is_version_liner�rN)rr�r��	partitionrjr�)rr!Z
version_linesr�r�rVrrr�_version_from_filew	s

r#cs�eZdZdZdZddddedefdd�ZedNdd��Z	dd	�Z
ed
d��Zdd
�Z
dd�Zdd�Zdd�Zdd�Zdd�Zdd�Zedd��Zedd��Zdd�Zed d!��Zed"d#��Zed$d%��Zd&d'�Zffd(d)�Zd*d+�ZdOd-d.�Zd/d0�Zd1d2�Z d3d4�Z!d5d6�Z"�fd7d8�Z#e$e%d9��s&[#edPd:d;��Z&d<d=�Z'd>d?�Z(dQd@dA�Z)dBdC�Z*dRdDdE�Z+dFdG�Z,dHdI�Z-dJdK�Z.edLdM��Z/�Z0S)Sr`z5Wrap an actual or potential sys.path entry w/metadatazPKG-INFONcCsFt|pd�|_|dk	r t|�|_||_||_||_||_|p>t|_	dS)NZUnknown)
rir�rj�_versionrr@r�r�r��	_provider)r�r�r�r�rrr@r�rrrr��	s
zDistribution.__init__cKs~dgd\}}}}tjj|�\}}	|	j�tkr^t|	j�}t|�}
|
r^|
jdddd�\}}}}|||f||||d�|��j�S)Nr�r�ZverZpyverrE)r�rrr@)r�r�r�r�_distributionImpl�EGG_NAMErC�_reload_version)r�r�rjr�r&r�rrr@r�r>rrrr��	s
zDistribution.from_locationcCs|S)Nr)r�rrrr(�	szDistribution._reload_versioncCs(|j|j|jt|j�|jpd|jp$dfS)Nr�)�parsed_versionr�r2r r�rr@)r�rrrr�	szDistribution.hashcmpcCs
t|j�S)N)�hashr)r�rrr�__hash__�	szDistribution.__hash__cCs|j|jkS)N)r)r�rrrr�__lt__�	szDistribution.__lt__cCs|j|jkS)N)r)r�rrrr�__le__�	szDistribution.__le__cCs|j|jkS)N)r)r�rrrr�__gt__�	szDistribution.__gt__cCs|j|jkS)N)r)r�rrrr�__ge__�	szDistribution.__ge__cCst||j�sdS|j|jkS)NF)r�r�r)r�rrrr�__eq__�	szDistribution.__eq__cCs
||kS)Nr)r�rrrr�__ne__�	szDistribution.__ne__cCs0y|jStk
r*|jj�|_}|SXdS)N)Z_keyr�r�r)r�r2rrrr2�	s
zDistribution.keycCst|d�st|j�|_|jS)N�_parsed_version)r�r rr2)r�rrrr)�	s
zDistribution.parsed_versioncCsXtjj}t|j|�}|sdS|js&dStjd�j�jdd�}t	j
|jft|��t
�dS)Na>
            '{project_name} ({version})' is being parsed as a legacy,
            non PEP 440,
            version. You may find odd behavior and sort order.
            In particular it will be sorted as less than 0.0. It
            is recommended to migrate to PEP 440 compatible
            versions.
            rYr�)rrrr�r2r'r(r�r�r:r;r��varsr)r�ZLVZ	is_legacyr,rrr�_warn_legacy_version�	sz!Distribution._warn_legacy_versioncCsLy|jStk
rFt|j|j��}|dkrBd}t||j|��|SXdS)Nz(Missing 'Version:' header and/or %s file)r$r�r#�
_get_metadata�PKG_INFOrD)r�rr,rrrr�	szDistribution.versioncCs2y|jStk
r*|j|j��|_YnX|jS)z~
        A map of extra to its list of (direct) requirements
        for this distribution, including the null extra.
        )Z_Distribution__dep_mapr��_filter_extras�_build_dep_map)r�rrr�_dep_map
s
zDistribution._dep_mapcCsvxpttd|��D]^}|}|j|�}|jd�\}}}|oFt|�pFt|�}|rPg}t|�pZd}|j|g�j|�qW|S)z�
        Given a mapping of extras to dependencies, strip off
        environment markers and filter out any dependencies
        not matching the markers.
        Nr)	r�rr�r"rqrrror�r�)�dmr	Z	new_extrar�r�r
Zfails_markerrrrr7
s

zDistribution._filter_extrascCsHi}x>dD]6}x0t|j|��D]\}}|j|g�jt|��qWq
W|S)N�requires.txt�depends.txt)r;r<)rnr5r�r�rh)r�r:r�r	r�rrrr8&
s

zDistribution._build_dep_mapcCsj|j}g}|j|jdf��xH|D]@}y|j|t|��Wq"tk
r`td||f��Yq"Xq"W|S)z@List of Requirements needed for this distro if `extras` are usedNz%s has no such extra feature %r)r9r�r�ror�rf)r�r�r:Zdepsr�rrrr�-
s
zDistribution.requiresccs(|j|�r$x|j|�D]
}|VqWdS)N)r�r�)r�r�r�rrrr5;
s
zDistribution._get_metadataFcCsZ|dkrtj}|j||d�|tjkrVt|j�x$|jd�D]}|tjkr<t|�q<WdS)z>Ensure distribution is importable on `path` (default=sys.path)N)r�znamespace_packages.txt)r?r�r�r�r�r5r�rV)r�r�r�Zpkgrrr�activate@
s


zDistribution.activatecCs8dt|j�t|j�|jptf}|jr4|d|j7}|S)z@Return what this distribution's standard .egg filename should bez
%s-%s-py%srE)rpr�rrr"r@)r�rArrrrgK
szDistribution.egg_namecCs |jrd||jfSt|�SdS)Nz%s (%s))r�r�)r�rrrr�V
szDistribution.__repr__cCs@yt|dd�}Wntk
r(d}YnX|p0d}d|j|fS)Nrz[unknown version]z%s %s)r�rDr�)r�rrrrr�\
s
zDistribution.__str__cCs|jd�rt|��t|j|�S)zADelegate all unrecognized public attributes to .metadata providerr�)r�r�r�r%)r�rrrr�__getattr__d
s
zDistribution.__getattr__cs.tttt|�j��tdd�|jj�D��B�S)Ncss|]}|jd�s|VqdS)r�N)r�)rrrrrr
n
sz'Distribution.__dir__.<locals>.<genexpr>)r�r��superr`�__dir__r%)r�)r�rrr@j
szDistribution.__dir__r@cKs|jt|�tjj|�|f|�S)N)r�r�r�r�rj)r�rAr�r&rrrr�w
szDistribution.from_filenamecCs<t|jtjj�r"d|j|jf}nd|j|jf}tj|�S)z?Return a ``Requirement`` that matches this distribution exactlyz%s==%sz%s===%s)r�r)rrrr�rar�)r��specrrrr~
szDistribution.as_requirementcCs.|j||�}|dkr&td||ff��|j�S)z=Return the `name` entry point of `group` or raise ImportErrorNzEntry point %r not found)rNr�rz)r�rCr�r�rrrrL�
szDistribution.load_entry_pointcCsPy
|j}Wn,tk
r6tj|jd�|�}|_YnX|dk	rL|j|i�S|S)z=Return the entry point map for `group`, or the full entry mapzentry_points.txtN)Z_ep_mapr�rbrr5r�)r�rCZep_maprrrrM�
s
zDistribution.get_entry_mapcCs|j|�j|�S)z<Return the EntryPoint object for `group`+`name`, or ``None``)rMr�)r�rCr�rrrrN�
szDistribution.get_entry_infoc
Cs2|p|j}|sdSt|�}tjj|�}dd�|D�}x�t|�D]v\}}||kr\|rVPq�dSq>||kr>|jtkr>|r�|||d�kr�dS|tjkr�|j	�|j
||�|j
||�Pq>W|tjkr�|j	�|r�|j
d|�n
|j|�dSxBy|j||d�}	Wnt
k
�rPYq�X||	=||	=|	}q�WdS)a�Ensure self.location is on path

        If replace=False (default):
            - If location is already in path anywhere, do nothing.
            - Else:
              - If it's an egg and its parent directory is on path,
                insert just ahead of the parent.
              - Else: add to the end of path.
        If replace=True:
            - If location is already on path anywhere (not eggs)
              or higher priority than its parent (eggs)
              do nothing.
            - Else:
              - If it's an egg and its parent directory is on path,
                insert just ahead of the parent,
                removing any lower-priority entries.
              - Else: add it to the front of path.
        NcSsg|]}|rt|�p|�qSr)r�)rr�rrrr��
sz*Distribution.insert_on.<locals>.<listcomp>rr)r�r�r�r�rM�	enumerater�rur?�check_version_conflictr�r�r�rD)
r�r��locr�ZnlocZbdirZnpathr�r�Znprrrr��
sB



zDistribution.insert_oncCs�|jdkrdStj|jd��}t|j�}x~|jd�D]p}|tjks4||ks4|tkrTq4|dkr^q4t	tj|dd�}|r�t|�j
|�s4|j
|j�r�q4td|||jf�q4WdS)	N�
setuptoolsznamespace_packages.txtz
top_level.txt�
pkg_resources�siterLzIModule %s was already imported from %s, but %s is being added to sys.path)rFrErG)r2r$r%r5rtr�r?r�r�r�r��
issue_warning)r�ZnsprD�modname�fnrrrrC�
s"

z#Distribution.check_version_conflictcCs4y
|jWn$tk
r.tdt|��dSXdS)NzUnbuilt egg for FT)rrDrHr�)r�rrrr�
s
zDistribution.has_versioncKsDd}x$|j�D]}|j|t||d��qW|jd|j�|jf|�S)z@Copy this distribution, substituting in any changed keyword argsz<project_name version py_version platform location precedenceNr�)rr�r�r%r�)r�r&r2rrrr�clones
zDistribution.clonecCsdd�|jD�S)NcSsg|]}|r|�qSrr)rZdeprrrr�
sz'Distribution.extras.<locals>.<listcomp>)r9)r�rrrr�szDistribution.extras)N)NF)N)N)NF)1rrrrr6r"rur�rr�r(r�rr+r,r-r.r/r0r1r2r)r4rr9rDr7r8r�r5r=rgr�r�r>r@r��objectr�rrLrMrNr�rCrrKr��
__classcell__rr)r�rr`�	sX

		

Dc@seZdZdd�ZdS)�EggInfoDistributioncCst|j|j��}|r||_|S)a�
        Packages installed by distutils (e.g. numpy or scipy),
        which uses an old safe_version, and so
        their version numbers can get mangled when
        converted to filenames (e.g., 1.11.0.dev0+2329eae to
        1.11.0.dev0_2329eae). These distributions will not be
        parsed properly
        downstream by Distribution and safe_version, so
        take an extra step and try to get the version number from
        the metadata file itself instead of the filename.
        )r#r5r6r$)r�Z
md_versionrrrr(sz#EggInfoDistribution._reload_versionN)rrrr(rrrrrN
srNc@s>eZdZdZdZejd�Zedd��Z	edd��Z
dd	�Zd
S)�DistInfoDistributionzV
    Wrap an actual or potential sys.path entry
    w/metadata, .dist-info style.
    ZMETADATAz([\(,])\s*(\d.*?)\s*([,\)])cCs@y|jStk
r:|j|j�}tjj�j|�|_|jSXdS)zParse and cache metadataN)Z	_pkg_infor�r�r6�email�parserZParserZparsestr)r�r�rrr�_parsed_pkg_info(sz%DistInfoDistribution._parsed_pkg_infocCs,y|jStk
r&|j�|_|jSXdS)N)�_DistInfoDistribution__dep_mapr��_compute_dependencies)r�rrrr92s

zDistInfoDistribution._dep_mapcs�dgi}|_g�x&|jjd�p"gD]}�jt|��q$W�fdd�}t|d��}|dj|�x<|jjd�ppgD](}t|j��}tt||��|�||<qrW|S)z+Recompute this distribution's dependencies.Nz
Requires-Distc3s0x*�D]"}|js"|jjd|i�r|VqWdS)Nr	)r
r)r	r�)r�rr�reqs_for_extraCs
zBDistInfoDistribution._compute_dependencies.<locals>.reqs_for_extrazProvides-Extra)	rSrRZget_allr�rh�	frozensetror�r�)r�r:r�rU�commonr	Zs_extrar)r�rrT:sz*DistInfoDistribution._compute_dependenciesN)rrrrr6rFr^ZEQEQr�rRr9rTrrrrrO s

rO)z.eggz	.egg-infoz
.dist-infoc
Os^d}t�}y"xtj|�j|kr(|d7}qWWntk
r@YnXtj|d|di|��dS)Nrr
)r!r?r�r�rDr:r;)r7r&rr*rrrrHYsrHc@seZdZdd�ZdS)�RequirementParseErrorcCsdj|j�S)Nr�)rAr7)r�rrrr�gszRequirementParseError.__str__N)rrrr�rrrrrXfsrXccs�tt|��}xp|D]h}d|kr0|d|jd��}|jd�rp|dd�j�}y|t|�7}Wntk
rndSXt|�VqWdS)z�Yield ``Requirement`` objects for each specification in `strs`

    `strs` must be a string, or a (possibly-nested) iterable thereof.
    z #N�\r;���)r�rmr�r�r�r��
StopIterationra)rrr�rrrrhks

csPeZdZ�fdd�Zdd�Zdd�Zdd�Zd	d
�Zdd�Ze	d
d��Z
�ZS)racs�ytt|�j|�Wn2tjjk
rF}ztt|���WYdd}~XnX|j|_	t
|j�}||j�|_|_
dd�|jD�|_ttt|j��|_|j
|jt|j�|jr�t|j�ndf|_t|j�|_dS)z>DO NOT CALL THIS UNDOCUMENTED METHOD; use Requirement.parse()!NcSsg|]}|j|jf�qSr)rr)rrArrrr��sz(Requirement.__init__.<locals>.<listcomp>)r?rar�rr�ZInvalidRequirementrXr�r�Zunsafe_namerirr�r2�	specifierrrrror�rVr
�hashCmpr*�_Requirement__hash)r�Zrequirement_stringrKr�)r�rrr��s
zRequirement.__init__cCst|t�o|j|jkS)N)r�rar])r�rrrrr0�s
zRequirement.__eq__cCs
||kS)Nr)r�rrrrr1�szRequirement.__ne__cCs0t|t�r |j|jkrdS|j}|jj|dd�S)NFT)Zprereleases)r�r`r2rr\�contains)r�r�rrrr��s

zRequirement.__contains__cCs|jS)N)r^)r�rrrr+�szRequirement.__hash__cCsdt|�S)NzRequirement.parse(%r))r�)r�rrrr��szRequirement.__repr__cCst|�\}|S)N)rh)rr�rrrr��s
zRequirement.parse)rrrr�r0r1r�r+r�rDr�rMrr)r�rra�scCst|kr|tfS|S)zJ
    Ensure object appears in the mro even
    for old-style classes.
    )rL)�classesrrr�_always_object�s
racCs<ttjt|dt|����}x|D]}||kr ||Sq WdS)z2Return an adapter factory for `ob` from `registry`r�N)ra�inspectZgetmror�rs)�registryr3r��trrrr��s
r�cCstjj|�}tj|dd�dS)z1Ensure that the parent directory of `path` existsT)�exist_okN)r�r�rMr�makedirs)r�rMrrrrs�scCs@tstd��t|�\}}|r<|r<t|�r<t|�t|d�dS)z/Sandbox-bypassing version of ensure_directory()z*"os.mkdir" not supported on this platform.i�N)r�r�rr
r/r	)r�rMrArrrr/�sr/ccszd}g}xbt|�D]V}|jd�r^|jd�rR|s2|r<||fV|dd�j�}g}qhtd|��q|j|�qW||fVdS)asSplit a string or iterable thereof into (section, content) pairs

    Each ``section`` is a stripped version of the section header ("[section]")
    and each ``content`` is a list of stripped lines excluding blank lines and
    comment-only lines.  If there are any such lines before the first section
    header, they're returned in a first ``section`` of ``None``.
    N�[�]rzInvalid section headingr�)rmr�r�r�rDr�)rZsectionZcontentr�rrrrn�s


cOs&tj}ztt_tj||�S|t_XdS)N)r�r�os_open�tempfileZmkstemp)r7r&Zold_openrrrr��s
r��ignore)�categoryr�cOs|||�|S)Nr)r�r7rrrr�_call_asides
rmcs.t���|d<|j�fdd�t��D��dS)z=Set up global resource manager (deliberately not state-saved)Z_managerc3s&|]}|jd�s|t�|�fVqdS)r�N)r�r�)rr�)r�rrr
sz_initialize.<locals>.<genexpr>N)r_r"r�)r*r)r�r�_initializes

rncCs|tj�}td|d�|j}|j}|j}|j}|}tdd�|D��|dd�dd�g|_t	t
|jtj
��t�jt��d	S)
aE
    Prepare the master working set and make the ``require()``
    API available.

    This function has explicit effects on the global state
    of pkg_resources. It is intended to be invoked once at
    the initialization of this module.

    Invocation by other packages is unsupported and done
    at their own risk.
    rL)rWcss|]}|jdd�VqdS)F)r�N)r=)rr�rrrr
2sz1_initialize_master_working_set.<locals>.<genexpr>cSs|jdd�S)NT)r�)r=)r�rrrr86sz0_initialize_master_working_set.<locals>.<lambda>F)rN)r^r�r'rHrOrrIrr�r�rr�r?r�r!r"r�)rWrHrOrXrIr�rrr�_initialize_master_working_sets 

ro)rr)rrr�)N)N)F)F)F)F)N)�rZ
__future__rr?r�rPr�rFr�rxr�r:r6rZpkgutilrr@r�r�Zemail.parserrPr�rjr'r�rbrr�r�ZimpZpkg_resources.externrZpkg_resources.extern.six.movesrrrrr	r
rr�rriZos.pathr
rZimportlib.machinery�	machineryrrrr�rrrr��version_info�RuntimeErrorr�r�r�rHrWrXZresources_streamr[Zresource_dirrQrZrUrPrOrSrRrTr�r�r��RuntimeWarningrr r#r'r,r-r0r4r5r6Z
_sget_noneZ
_sset_nonerG�__all__r0rcrdr�rerfr�rr"rurvrwrxryr�rJrBr�r<r^r=r�rkrlrIr�rKrLrMrNrzr{rLr^r$r�r]r�rgr_r\rirjrorprqrrr�r�r�rtrr�rur{r�r�r|r}r~r�rYr�r�r�r�r�r�r�r�r�r�ZImpImporterr�r�r�r�r�rVr�r�r�rtr�rir�r�rmr>r�VERBOSE�
IGNORECASEr'rbr r#r`rNrOr&rHrDrXrhr�rarar�rsr/rnr��filterwarningsrmr!rnrorrrr�<module>s�




 




.

5	
d
-'




 !!


		
3
6

site-packages/pkg_resources/__pycache__/py31compat.cpython-36.opt-1.pyc000064400000001160147511334660021744 0ustar00
[unrecoverable binary data: compiled bytecode for pkg_resources' py31compat module (the os.makedirs exist_ok compatibility shim); not reconstructible as source]
site-packages/pkg_resources/__pycache__/py31compat.cpython-36.pyc000064400000001160147511334660021005 0ustar00
[unrecoverable binary data: compiled bytecode identical to the .opt-1 variant above]
site-packages/pkg_resources/__pycache__/__init__.cpython-36.opt-1.pyc000064400000272700147511334660021515 0ustar00
[unrecoverable binary data: compiled bytecode for pkg_resources/__init__. The embedded module docstring ("Package resource API": '/'-separated resource names; works with normal filesystem packages, .egg files, unpacked eggs, and PEP 302 loaders that support get_data()) and the class docstrings are legible, but the code itself is not reconstructible as source.]
        thereof, specifying the distributions and versions required.  The
        return value is a sequence of the distributions that needed to be
        activated to fulfill the requirements; all relevant distributions are
        included, even if they were already activated in this working set.
        )r�rhr�)r�r�Zneededr�rrrrHos	
zWorkingSet.requirecCs<||jkrdS|jj|�|s"dSx|D]}||�q(WdS)z�Invoke `callback` for all distributions

        If `existing=True` (default),
        call on all existing ones, as well.
        N)r�r�)r��callback�existingr�rrr�	subscribes

zWorkingSet.subscribecCsx|jD]}||�qWdS)N)r�)r�r�rrrrr��szWorkingSet._added_newcCs,|jdd�|jj�|jj�|jdd�fS)N)r�r�r.r�r�)r�rrrr,�szWorkingSet.__getstate__cCs@|\}}}}|dd�|_|j�|_|j�|_|dd�|_dS)N)r�r.r�r�r�)r�Ze_k_b_cr�r�r�r�rrrr-�s


zWorkingSet.__setstate__)N)N)NTF)NNFN)NNT)T)rrrrr��classmethodr�r�r�r�r�rOrIr�r�r�rrHrr�r,r-rrrrr^s&




\
S
c@seZdZdZddd�ZdS)r�z>
    Map each requirement to the extras that demanded it.
    Ncs2�fdd�|j�f�|pdD�}�jp0t|�S)z�
        Evaluate markers for req against each extra that
        demanded it.

        Return False if the req has a marker and fails
        evaluation. Otherwise, return True.
        c3s|]}�jjd|i�VqdS)�extraN)�marker�evaluate)�.0r	)r�rr�	<genexpr>�sz*_ReqExtras.markers_pass.<locals>.<genexpr>N)N)r�r
�any)r�r�r�Zextra_evalsr)r�rr��s	
z_ReqExtras.markers_pass)N)rrrrr�rrrrr��sr�c@sxeZdZdZde�efdd�Zdd�Zdd�Zdd	d
�Z	dd�Z
d
d�Zddd�Zddd�Z
dd�Zdd�Zdd�ZdS)r]z5Searchable snapshot of distributions on a search pathNcCs i|_||_||_|j|�dS)a!Snapshot distributions available on a search path

        Any distributions found on `search_path` are added to the environment.
        `search_path` should be a sequence of ``sys.path`` items.  If not
        supplied, ``sys.path`` is used.

        `platform` is an optional string specifying the name of the platform
        that platform-specific distributions must be compatible with.  If
        unspecified, it defaults to the current platform.  `python` is an
        optional string naming the desired version of Python (e.g. ``'3.3'``);
        it defaults to the current version.

        You may explicitly set `platform` (and/or `python`) to ``None`` if you
        wish to map *all* distributions, not just those compatible with the
        running platform or Python version.
        N)�_distmapr@�python�scan)r��search_pathr@rrrrr��szEnvironment.__init__cCs2|jdkp|jdkp|j|jk}|o0t|j|j�S)z�Is distribution `dist` acceptable for this environment?

        The distribution must match the platform and python version
        requirements specified when this environment was created, or False
        is returned.
        N)r�
py_versionrlr@)r�r�Z	py_compatrrr�can_add�s

zEnvironment.can_addcCs|j|jj|�dS)z"Remove `dist` from the environmentN)rr2�remove)r�r�rrrr�szEnvironment.removecCs<|dkrtj}x(|D] }xt|�D]}|j|�q"WqWdS)adScan `search_path` for distributions usable in this environment

        Any distributions found are added to the environment.
        `search_path` should be a sequence of ``sys.path`` items.  If not
        supplied, ``sys.path`` is used.  Only distributions conforming to
        the platform/python version defined at initialization are added.
        N)r?r�rYr�)r�rr�r�rrrr�s

zEnvironment.scancCs|j�}|jj|g�S)aReturn a newest-to-oldest list of distributions for `project_name`

        Uses case-insensitive `project_name` comparison, assuming all the
        project's distributions use their project's name converted to all
        lowercase as their key.

        )�lowerrr�)r�r�Zdistribution_keyrrr�__getitem__�szEnvironment.__getitem__cCsL|j|�rH|j�rH|jj|jg�}||krH|j|�|jtjd�dd�dS)zLAdd `dist` if we ``can_add()`` it and it has not already been added
        �hashcmpT)r2�reverseN)	r�has_versionrr�r2r�r�operator�
attrgetter)r�r�r�rrrr��s

zEnvironment.addFcCsfy|j|�}Wntk
r,|s$�d}YnX|dk	r:|Sx||jD]}||krF|SqFW|j||�S)a�Find distribution best matching `req` and usable on `working_set`

        This calls the ``find(req)`` method of the `working_set` to see if a
        suitable distribution is already active.  (This may raise
        ``VersionConflict`` if an unsuitable version of the project is already
        active in the specified `working_set`.)  If a suitable distribution
        isn't active, this method returns the newest distribution in the
        environment that meets the ``Requirement`` in `req`.  If no suitable
        distribution is found, and `installer` is supplied, then the result of
        calling the environment's ``obtain(req, installer)`` method will be
        returned.
        N)r�rdr2�obtain)r�r�rWrr�r�rrrr�s
zEnvironment.best_matchcCs|dk	r||�SdS)a�Obtain a distribution matching `requirement` (e.g. via download)

        Obtain a distro that matches requirement (e.g. via download).  In the
        base ``Environment`` class, this routine just returns
        ``installer(requirement)``, unless `installer` is None, in which case
        None is returned instead.  This method is a hook that allows subclasses
        to attempt other ways of obtaining a distribution before falling back
        to the `installer` argument.Nr)r�Zrequirementrrrrrs	zEnvironment.obtainccs&x |jj�D]}||r|VqWdS)z=Yield the unique project names of the available distributionsN)rr�)r�r2rrrr�+szEnvironment.__iter__cCs^t|t�r|j|�nDt|t�rLx8|D] }x||D]}|j|�q4Wq&Wntd|f��|S)z2In-place addition of a distribution or environmentzCan't add %r to environment)r�r`r�r]r�)r��otherZprojectr�rrr�__iadd__1s


zEnvironment.__iadd__cCs.|jgddd�}x||fD]}||7}qW|S)z4Add an environment or distribution to an environmentN)r@r)r�)r�r�newrrrr�__add__=szEnvironment.__add__)N)NF)N)rrrrrG�PY_MAJORr�rrrrr�r�rr�rr!rrrrr]�s



c@seZdZdZdS)rgaTAn error occurred extracting a resource

    The following attributes are available from instances of this exception:

    manager
        The resource manager that raised this exception

    cache_path
        The base directory for resource extraction

    original_error
        The exception instance that caused extraction to fail
    N)rrrrrrrrrgIs
c@s�eZdZdZdZdd�Zdd�Zdd�Zd	d
�Zdd�Z	d
d�Z
dd�Zdd�Zffdd�Z
edd��Zdd�Zdd�Zddd�ZdS)r_z'Manage resource extraction and packagesNcCs
i|_dS)N)�cached_files)r�rrrr�]szResourceManager.__init__cCst|�j|�S)zDoes the named resource exist?)rJr�)r��package_or_requirementr�rrrrT`szResourceManager.resource_existscCst|�j|�S)z,Is the named resource an existing directory?)rJrU)r�r$r�rrrrUdszResourceManager.resource_isdircCst|�j||�S)z4Return a true filesystem path for specified resource)rJr�)r�r$r�rrrrRjsz!ResourceManager.resource_filenamecCst|�j||�S)z9Return a readable file-like object for specified resource)rJr�)r�r$r�rrrrQpszResourceManager.resource_streamcCst|�j||�S)z%Return specified resource as a string)rJr�)r�r$r�rrrrPvszResourceManager.resource_stringcCst|�j|�S)z1List the contents of the named resource directory)rJrS)r�r$r�rrrrS|sz ResourceManager.resource_listdircCsRtj�d}|jpt�}tjd�j�}t|jft	���}||_
||_||_|�dS)z5Give an error message for problems extracting file(s)ra
            Can't extract file(s) to egg cache

            The following error occurred while trying to extract file(s)
            to the Python egg cache:

              {old_exc}

            The Python egg cache directory is currently set to:

              {cache_path}

            Perhaps your account does not have write access to this directory?
            You can change the cache directory by setting the PYTHON_EGG_CACHE
            environment variable to point to an accessible directory.
            N)
r?�exc_info�extraction_pathr\�textwrap�dedent�lstriprgr�r�r��
cache_pathZoriginal_error)r��old_excr*�tmpl�errrrr�extraction_error�s
z ResourceManager.extraction_errorcCsf|jp
t�}tjj||df|��}yt|�Wntk
rL|j�YnX|j|�d|j	|<|S)a�Return absolute location in cache for `archive_name` and `names`

        The parent directory of the resulting path will be created if it does
        not already exist.  `archive_name` should be the base filename of the
        enclosing egg (which may not be the name of the enclosing zipfile!),
        including its ".egg" extension.  `names`, if provided, should be a
        sequence of path name parts "under" the egg's extraction location.

        This method should only be called by resource providers that need to
        obtain an extraction location, and only for names they intend to
        extract, as it tracks the generated names for possible cleanup later.
        z-tmpr)
r&r\r�r�rA�_bypass_ensure_directory�	Exceptionr.�_warn_unsafe_extraction_pathr#)r�Zarchive_name�namesZextract_pathZtarget_pathrrr�get_cache_path�s


zResourceManager.get_cache_pathcCsXtjdkr |jtjd�r dStj|�j}|tj@s@|tj@rTd|}tj	|t
�dS)aN
        If the default extraction path is overridden and set to an insecure
        location, such as /tmp, it opens up an opportunity for an attacker to
        replace an extracted file with an unauthorized payload. Warn the user
        if a known insecure location is used.

        See Distribute #375 for more details.
        �ntZwindirNz�%s is writable by group/others and vulnerable to attack when used with get_resource_filename. Consider a more secure location (set with .set_extraction_path or the PYTHON_EGG_CACHE environment variable).)r�r�r��environ�stat�st_mode�S_IWOTH�S_IWGRP�warnings�warn�UserWarning)r��mode�msgrrrr1�s
z,ResourceManager._warn_unsafe_extraction_pathcCs.tjdkr*tj|�jdBd@}tj||�dS)a4Perform any platform-specific postprocessing of `tempname`

        This is where Mac header rewrites should be done; other platforms don't
        have anything special they should do.

        Resource providers should call this method ONLY after successfully
        extracting a compressed resource.  They must NOT call it on resources
        that are already in the filesystem.

        `tempname` is the current (temporary) name of the file, and `filename`
        is the name it will be renamed to by the caller after this routine
        returns.
        �posiximi�N)r�r�r6r7�chmod)r�Ztempname�filenamer=rrr�postprocess�s
zResourceManager.postprocesscCs|jrtd��||_dS)a�Set the base path where resources will be extracted to, if needed.

        If you do not call this routine before any extractions take place, the
        path defaults to the return value of ``get_default_cache()``.  (Which
        is based on the ``PYTHON_EGG_CACHE`` environment variable, with various
        platform-specific fallbacks.  See that routine's documentation for more
        details.)

        Resources are extracted to subdirectories of this path based upon
        information given by the ``IResourceProvider``.  You may set this to a
        temporary directory, but then you must call ``cleanup_resources()`` to
        delete the extracted files when done.  There is no guarantee that
        ``cleanup_resources()`` will be able to remove all extracted files.

        (Note: you may not change the extraction path for a given resource
        manager once resources have been extracted, unless you first call
        ``cleanup_resources()``.)
        z5Can't change extraction path, files already extractedN)r#rDr&)r�r�rrrrZ�sz#ResourceManager.set_extraction_pathFcCsdS)aB
        Delete all extracted resource files and directories, returning a list
        of the file and directory names that could not be successfully removed.
        This function does not have any concurrency protection, so it should
        generally only be called when the extraction path is a temporary
        directory exclusive to a single process.  This method is not
        automatically called; you must call it explicitly or register it as an
        ``atexit`` function if you wish to ensure cleanup of a temporary
        directory used for extractions.
        Nr)r��forcerrrr[�sz!ResourceManager.cleanup_resources)F)rrrrr&r�rTrUrRrQrPrSr.r3�staticmethodr1rBrZr[rrrrr_YscCstjjd�ptjdd�S)z�
    Return the ``PYTHON_EGG_CACHE`` environment variable
    or a platform-relevant user cache dir for an app
    named "Python-Eggs".
    ZPYTHON_EGG_CACHEzPython-Eggs)Zappname)r�r5r�rZuser_cache_dirrrrrr\
scCstjdd|�S)z�Convert an arbitrary string to a standard distribution name

    Any runs of non-alphanumeric/. characters are replaced with a single '-'.
    z[^A-Za-z0-9.]+�-)�re�sub)r�rrrriscCsDyttjj|��Stjjk
r>|jdd�}tjdd|�SXdS)zB
    Convert an arbitrary string to a standard version string
    r�r:z[^A-Za-z0-9.]+rEN)r�rrrrr�rFrG)rrrrrj!s
cCstjdd|�j�S)z�Convert an arbitrary string to a standard 'extra' name

    Any runs of non-alphanumeric characters are replaced with a single '_',
    and the result is always lowercased.
    z[^A-Za-z0-9.-]+r�)rFrGr)r	rrrro-scCs|jdd�S)z|Convert a project or version name to its filename-escaped form

    Any '-' characters are currently replaced with '_'.
    rEr�)r�)r�rrrrp6scCs>yt|�Wn,tk
r8}zd|_d|_|Sd}~XnXdS)zo
    Validate text as a PEP 508 environment marker; return an exception
    if invalid or False otherwise.
    NF)rr�SyntaxErrorrA�lineno)�text�errrrq>scCsHytjj|�}|j�Stjjk
rB}zt|��WYdd}~XnXdS)z�
    Evaluate a PEP 508 environment marker.
    Return a boolean indicating the marker result in this environment.
    Raise SyntaxError if marker is invalid.

    This implementation uses the 'pyparsing' module.
    N)rZmarkersZMarkerrZ
InvalidMarkerrH)rJr	r
rKrrrrrLs
c@s�eZdZdZdZdZdZdd�Zdd�Zdd�Z	d	d
�Z
dd�Zd
d�Zdd�Z
dd�Zdd�Zdd�Zdd�Zdd�Zdd�Zdd�Zdd �Zd!d"�Zd#d$�Zd%d&�ZdS)'r�zETry to implement resources and metadata for arbitrary PEP 302 loadersNcCs(t|dd�|_tjjt|dd��|_dS)Nr��__file__r�)r�r�r�r��dirname�module_path)r�r�rrrr�bszNullProvider.__init__cCs|j|j|�S)N)�_fnrN)r�r�r�rrrr�fsz"NullProvider.get_resource_filenamecCstj|j||��S)N)�io�BytesIOr�)r�r�r�rrrr�isz NullProvider.get_resource_streamcCs|j|j|j|��S)N)�_getrOrN)r�r�r�rrrr�lsz NullProvider.get_resource_stringcCs|j|j|j|��S)N)�_hasrOrN)r�r�rrrr�oszNullProvider.has_resourcecCs|jo|j|j|j|��S)N)�egg_inforSrO)r�r�rrrr�rszNullProvider.has_metadatacCs2|js
dS|j|j|j|��}tjr.|jd�S|S)Nr�zutf-8)rTrRrOrZPY3�decode)r�r��valuerrrr�uszNullProvider.get_metadatacCst|j|��S)N)rmr�)r�r�rrrr�{szNullProvider.get_metadata_linescCs|j|j|j|��S)N)�_isdirrOrN)r�r�rrrrU~szNullProvider.resource_isdircCs|jo|j|j|j|��S)N)rTrWrO)r�r�rrrr��szNullProvider.metadata_isdircCs|j|j|j|��S)N)�_listdirrOrN)r�r�rrrrS�szNullProvider.resource_listdircCs|jr|j|j|j|��SgS)N)rTrXrO)r�r�rrrr��szNullProvider.metadata_listdirc
Cs�d|}|j|�s$tdjft����|j|�jdd�}|jdd�}|j|j|�}||d<tj	j
|�r�t|�j�}t
||d�}t|||�n>dd	lm}t|�d|jd�|f||<t
||d�}	t|	||�dS)
Nzscripts/z<Script {script!r} not found in metadata at {self.egg_info!r}z
�
�
rL�execr)�cache)r�rcr�r�r�r�rOrTr�r�r�r�read�compiler[�	linecacher\�lenr)
r�r�r�ZscriptZscript_textZscript_filename�source�coder\Zscript_coderrrrI�s"

zNullProvider.run_scriptcCstd��dS)Nz9Can't perform this operation for unregistered loader type)�NotImplementedError)r�r�rrrrS�szNullProvider._hascCstd��dS)Nz9Can't perform this operation for unregistered loader type)rc)r�r�rrrrW�szNullProvider._isdircCstd��dS)Nz9Can't perform this operation for unregistered loader type)rc)r�r�rrrrX�szNullProvider._listdircCs |rtjj|f|jd���S|S)N�/)r�r�rAr)r��baser�rrrrO�szNullProvider._fncCs$t|jd�r|jj|�Std��dS)N�get_dataz=Can't perform this operation for loaders without 'get_data()')r�r�rfrc)r�r�rrrrR�szNullProvider._get)rrrr�egg_namerTr�r�r�r�r�r�r�r�r�rUr�rSr�rIrSrWrXrOrRrrrrr�[s,c@s eZdZdZdd�Zdd�ZdS)r�z&Provider based on a virtual filesystemcCstj||�|j�dS)N)r�r��
_setup_prefix)r�r�rrrr��szEggProvider.__init__cCs^|j}d}xN||krXt|�rBtjj|�|_tjj|d�|_||_P|}tjj	|�\}}qWdS)NzEGG-INFO)
rN�_is_egg_pathr�r��basenamergrArT�egg_rootr)r�r��oldrerrrrh�s
zEggProvider._setup_prefixN)rrrrr�rhrrrrr��sc@sDeZdZdZdd�Zdd�Zdd�Zdd	�Zd
d�Ze	dd
��Z
dS)r�z6Provides access to package resources in the filesystemcCstjj|�S)N)r�r�r�)r�r�rrrrS�szDefaultProvider._hascCstjj|�S)N)r�r�r
)r�r�rrrrW�szDefaultProvider._isdircCs
tj|�S)N)r��listdir)r�r�rrrrX�szDefaultProvider._listdircCst|j|j|�d�S)N�rb)rrOrN)r�r�r�rrrr��sz#DefaultProvider.get_resource_streamc	Cst|d��
}|j�SQRXdS)Nrn)rr])r�r��streamrrrrR�szDefaultProvider._getcCs0d}x&|D]}tt|td��}t||�q
WdS)N�SourceFileLoader�SourcelessFileLoader)rprq)r��importlib_machinery�typer�)r�Zloader_namesr�Z
loader_clsrrr�	_register�s
zDefaultProvider._registerN)rrrrrSrWrXr�rRrrtrrrrr��sc@s8eZdZdZdZdd�ZZdd�Zdd�Zd	d
�Z	dS)rz.Provider that returns nothing for all requestsNcCsdS)NFr)r�r�rrrr8�szEmptyProvider.<lambda>cCsdS)Nr�r)r�r�rrrrR�szEmptyProvider._getcCsgS)Nr)r�r�rrrrXszEmptyProvider._listdircCsdS)Nr)r�rrrr�szEmptyProvider.__init__)
rrrrrNrWrSrRrXr�rrrrr�sc@s eZdZdZedd��ZeZdS)�ZipManifestsz
    zip manifest builder
    c
s4tj|�� ��fdd��j�D�}t|�SQRXdS)a
        Build a dictionary similar to the zipimport directory
        caches, except instead of tuples, store ZipInfo objects.

        Use a platform-specific path separator (os.sep) for the path keys
        for compatibility with pypy on Windows.
        c3s&|]}|jdtj��j|�fVqdS)rdN)r�r��sepZgetinfo)rr�)�zfilerrr
sz%ZipManifests.build.<locals>.<genexpr>N)�zipfileZZipFileZnamelistr$)r�r�r(r)rwr�builds	
zZipManifests.buildN)rrrrrry�loadrrrrru
sruc@s$eZdZdZejdd�Zdd�ZdS)�MemoizedZipManifestsz%
    Memoized zipfile manifests.
    �manifest_modzmanifest mtimecCsRtjj|�}tj|�j}||ks.||j|krH|j|�}|j||�||<||jS)zW
        Load a manifest at path or return a suitable manifest already loaded.
        )	r�r��normpathr6�st_mtime�mtimeryr|�manifest)r�r�rr�rrrrz+s
zMemoizedZipManifests.loadN)rrrrr��
namedtupler|rzrrrrr{%sr{c@s�eZdZdZdZe�Zdd�Zdd�Zdd�Z	e
d	d
��Zdd�Ze
d
d��Zdd�Zdd�Zdd�Zdd�Zdd�Zdd�Zdd�Zdd�Zdd �ZdS)!r�z"Resource support for zips and eggsNcCs tj||�|jjtj|_dS)N)r�r�r��archiver�rv�zip_pre)r�r�rrrr�?szZipProvider.__init__cCsP|jtj�}||jjkrdS|j|j�r:|t|j�d�Std||jf��dS)Nr�z%s is not a subpath of %s)	�rstripr�rvr�r�r�r�r`�AssertionError)r��fspathrrr�
_zipinfo_nameCszZipProvider._zipinfo_namecCsP|j|}|j|jtj�r:|t|j�dd�jtj�Std||jf��dS)Nrz%s is not a subpath of %s)r�r�rkr�rvr`rr�)r��zip_pathr�rrr�_partsOs

zZipProvider._partscCs|jj|jj�S)N)�_zip_manifestsrzr�r�)r�rrr�zipinfoYszZipProvider.zipinfocCs`|jstd��|j|�}|j�}dj|j|��|krTx|D]}|j||j|��q:W|j||�S)Nz5resource_filename() only supported for .egg, not .ziprd)rgrc�_resource_to_zip�_get_eager_resourcesrAr��_extract_resource�
_eager_to_zip)r�r�r�r��eagersr�rrrr�]s

z!ZipProvider.get_resource_filenamecCs"|j}|jd}tj|�}||fS)Nrrr�)rrr�)Z	file_size�	date_time�timeZmktime)Zzip_stat�sizer��	timestamprrr�_get_date_and_sizejs

zZipProvider._get_date_and_sizec
Csn||j�krDx*|j�|D]}|j|tjj||��}qWtjj|�S|j|j|�\}}tsdt	d��y�|j
|j|j|��}|j
||�r�|Stdtjj|�d�\}}	tj||jj|��tj|�t|	||f�|j|	|�yt|	|�Wn\tjk
�rDtjj|��r>|j
||��r|Stjdk�r>t|�t|	|�|S�YnXWn tjk
�rh|j�YnX|S)Nz>"os.rename" and "os.unlink" are not supported on this platformz	.$extract)�dirr4)�_indexr�r�r�rArMr�r��
WRITE_SUPPORT�IOErrorr3rgr��_is_current�_mkstemp�writer�rf�closerrBr
�error�isfiler�rr.)
r�r�r�r�Zlastr�r�Z	real_pathZoutfZtmpnamrrrr�ssD

zZipProvider._extract_resourcec		Csx|j|j|�\}}tjj|�s$dStj|�}|j|ksB|j|krFdS|jj	|�}t
|d��}|j�}WdQRX||kS)zK
        Return True if the file_path is current for this zip_path
        FrnN)r�r�r�r�r�r6�st_sizer~r�rfrr])	r�Z	file_pathr�r�r�r6Zzip_contents�fZ
file_contentsrrrr��s
zZipProvider._is_currentcCsB|jdkr<g}x&dD]}|j|�r|j|j|��qW||_|jS)N�native_libs.txt�eager_resources.txt)r�r�)r�r�r�r�)r�r�r�rrrr��s


z ZipProvider._get_eager_resourcescCs�y|jStk
r�i}xd|jD]Z}|jtj�}xH|rztjj|dd��}||krj||j|d�Pq4|j�g||<q4Wq"W||_|SXdS)Nrr�r�)	Z	_dirindex�AttributeErrorr�rr�rvrAr�r�)r�Zindr��parts�parentrrrr��szZipProvider._indexcCs |j|�}||jkp||j�kS)N)r�r�r�)r�r�r�rrrrS�s
zZipProvider._hascCs|j|�|j�kS)N)r�r�)r�r�rrrrW�szZipProvider._isdircCst|j�j|j|�f��S)N)r�r�r�r�)r�r�rrrrX�szZipProvider._listdircCs|j|j|j|��S)N)r�rOrk)r�r�rrrr��szZipProvider._eager_to_zipcCs|j|j|j|��S)N)r�rOrN)r�r�rrrr��szZipProvider._resource_to_zip)rrrrr�r{r�r�r�r�r�r�r�rDr�r�r�r�r�rSrWrXr�r�rrrrr�9s$

	7	c@s8eZdZdZdd�Zdd�Zdd�Zdd	�Zd
d�ZdS)
r|a*Metadata handler for standalone PKG-INFO files

    Usage::

        metadata = FileMetadata("/path/to/PKG-INFO")

    This provider rejects all data and metadata requests except for PKG-INFO,
    which is treated as existing, and will be the contents of the file at
    the provided location.
    cCs
||_dS)N)r�)r�r�rrrr��szFileMetadata.__init__cCs|dkotjj|j�S)NzPKG-INFO)r�r�r�)r�r�rrrr��szFileMetadata.has_metadatac	CsD|dkrtd��tj|jddd��}|j�}WdQRX|j|�|S)NzPKG-INFOz(No metadata except PKG-INFO is availablezutf-8r�)�encoding�errors)r�rPrr�r]�_warn_on_replacement)r�r�r��metadatarrrr��s
zFileMetadata.get_metadatacCs2djd�}||kr.d}|jft��}tj|�dS)Ns�zutf-8z2{self.path} could not be properly decoded in UTF-8)rUr�r�r:r;)r�r�Zreplacement_charr,r>rrrr�s

z!FileMetadata._warn_on_replacementcCst|j|��S)N)rmr�)r�r�rrrr�szFileMetadata.get_metadata_linesN)	rrrrr�r�r�r�r�rrrrr|�s
	c@seZdZdZdd�ZdS)r}asMetadata provider for egg directories

    Usage::

        # Development eggs:

        egg_info = "/path/to/PackageName.egg-info"
        base_dir = os.path.dirname(egg_info)
        metadata = PathMetadata(base_dir, egg_info)
        dist_name = os.path.splitext(os.path.basename(egg_info))[0]
        dist = Distribution(basedir, project_name=dist_name, metadata=metadata)

        # Unpacked egg directories:

        egg_path = "/path/to/PackageName-ver-pyver-etc.egg"
        metadata = PathMetadata(egg_path, os.path.join(egg_path,'EGG-INFO'))
        dist = Distribution.from_filename(egg_path, metadata=metadata)
    cCs||_||_dS)N)rNrT)r�r�rTrrrr�#szPathMetadata.__init__N)rrrrr�rrrrr}sc@seZdZdZdd�ZdS)r~z Metadata provider for .egg filescCsD|jtj|_||_|jr0tjj|j|j�|_n|j|_|j	�dS)z-Create a metadata provider from a zipimporterN)
r�r�rvr�r��prefixr�rArNrh)r��importerrrrr�+szEggMetadata.__init__N)rrrrr�rrrrr~(sr$)�_distribution_finderscCs|t|<dS)axRegister `distribution_finder` to find distributions in sys.path items

    `importer_type` is the type or class of a PEP 302 "Importer" (sys.path item
    handler), and `distribution_finder` is a callable that, passed a path
    item and the importer instance, yields ``Distribution`` instances found on
    that path item.  See ``pkg_resources.find_on_path`` for an example.N)r�)�
importer_typeZdistribution_finderrrrr�:scCst|�}tt|�}||||�S)z.Yield distributions accessible via `path_item`)rr�r�)�	path_item�onlyr��finderrrrrYDs
c	cs�|jjd�rdSt|�}|jd�r2tj||d�V|r:dSx�|jd�D]�}t|�r�tj	j
||�}ttj
|�|�}xT|D]
}|VqvWqF|j�jd�rFtj	j
||�}ttj
|��}||_tj|||�VqFWdS)z@
    Find eggs in zip files; possibly multiple nested eggs.
    z.whlNzPKG-INFO)r�rdz
.dist-info)r��endswithr~r�r`�
from_filenamerSrir�r�rA�find_eggs_in_zip�	zipimport�zipimporterrrT�
from_location)	r�r�r�r�Zsubitem�subpathr�r�Zsubmetarrrr�Ks$

r�cCsfS)Nr)r�r�r�rrr�find_nothingisr�cCsdd�}t||dd�S)aL
    Given a list of filenames, return them in descending order
    by version number.

    >>> names = 'bar', 'foo', 'Python-2.7.10.egg', 'Python-2.7.2.egg'
    >>> _by_version_descending(names)
    ['Python-2.7.10.egg', 'Python-2.7.2.egg', 'foo', 'bar']
    >>> names = 'Setuptools-1.2.3b1.egg', 'Setuptools-1.2.3.egg'
    >>> _by_version_descending(names)
    ['Setuptools-1.2.3.egg', 'Setuptools-1.2.3b1.egg']
    >>> names = 'Setuptools-1.2.3b1.egg', 'Setuptools-1.2.3.post1.egg'
    >>> _by_version_descending(names)
    ['Setuptools-1.2.3.post1.egg', 'Setuptools-1.2.3b1.egg']
    cSs2tjj|�\}}tj|jd�|g�}dd�|D�S)z6
        Parse each component of the filename
        rEcSsg|]}tjj|��qSr)rrr�)r�partrrr�
<listcomp>�sz?_by_version_descending.<locals>._by_version.<locals>.<listcomp>)r�r��splitext�	itertools�chainr)r��extr�rrr�_by_versionsz+_by_version_descending.<locals>._by_versionT)r2r)�sorted)r2r�rrr�_by_version_descendingpsr�c
#s�t���t��r4tj�t�tjj�d��d�VdSt��}��fdd�|D�}t	|�}x>|D]6}tjj�|�}t
�|��}x||�D]
}	|	Vq�Wq^WdS)z6Yield distributions accessible on a sys.path directoryzEGG-INFO)r�Nc3s|]}t�|��r|VqdS)N)�dist_factory)rr�)r�r�rrr
�szfind_on_path.<locals>.<genexpr>)�_normalize_cached�_is_unpacked_eggr`r�r}r�r�rA�safe_listdirr�r�)
r�r�r�r�ZfilteredZpath_item_entriesr��fullpath�factoryr�r)r�r�r�find_on_path�s
r�cCsL|j�}tt|jd��}|r tS|r2t|�r2tS|rF|jd�rFtSt�S)z9
    Return a dist_factory for a path_item and entry
    �	.egg-info�
.dist-infoz	.egg-link)r�r�)	rrrr��distributions_from_metadatarirY�resolve_egg_link�NoDists)r�r�r�rZis_metarrrr��sr�c@s*eZdZdZdd�ZejreZdd�ZdS)r�zS
    >>> bool(NoDists())
    False

    >>> list(NoDists()('anything'))
    []
    cCsdS)NFr)r�rrr�__bool__�szNoDists.__bool__cCstf�S)N)�iter)r�r�rrr�__call__�szNoDists.__call__N)	rrrrr�r�PY2Z__nonzero__r�rrrrr��s
r�cCsty
tj|�Sttfk
r"YnNtk
rn}z2|jtjtjtjfkpVt	|dd�dk}|s^�WYdd}~XnXfS)zI
    Attempt to list contents of path, but suppress some exceptions.
    ZwinerrorNi)
r�rm�PermissionError�NotADirectoryError�OSError�errno�ENOTDIRZEACCES�ENOENTr�)r�rKZ	ignorablerrrr��s
r�ccsftjj|�}tjj|�r:ttj|��dkr.dSt||�}nt|�}tjj|�}t	j
|||td�VdS)Nr)�
precedence)r�r�rMr
r`rmr}r|rjr`r�ry)r��rootr�r�rrrr��sr�c	cs8t|��&}x|D]}|j�}|r|VqWWdQRXdS)z1
    Yield non-empty lines from file at path
    N)r�strip)r�r��linerrr�non_empty_lines�s


r�cs.t��}�fdd�|D�}tt|�}t|f�S)za
    Given a path to an .egg-link, resolve distributions
    present in the referenced path.
    c3s$|]}tjjtjj��|�VqdS)N)r�r�rArM)r�ref)r�rrr
sz#resolve_egg_link.<locals>.<genexpr>)r�rrY�next)r�Zreferenced_pathsZresolved_pathsZdist_groupsr)r�rr��s


r��
FileFinder)�_namespace_handlers)�_namespace_packagescCs|t|<dS)a�Register `namespace_handler` to declare namespace packages

    `importer_type` is the type or class of a PEP 302 "Importer" (sys.path item
    handler), and `namespace_handler` is a callable like this::

        def namespace_handler(importer, path_entry, moduleName, module):
            # return a path_entry to use for child packages

    Namespace handlers are only called if the importer object has already
    agreed that it can handle the relevant path item, and they should only
    return a subpath if the module __path__ does not already contain an
    equivalent subpath.  For an example namespace handler, see
    ``pkg_resources.file_ns_handler``.
    N)r�)r�Znamespace_handlerrrrr�scCs�t|�}|dkrdS|j|�}|dkr*dStjj|�}|dkrbtj|�}tj|<g|_t|�nt	|d�svt
d|��tt|�}|||||�}|dk	r�|j}|j
|�|j|�t|||�|S)zEEnsure that named package includes a subpath of path_item (if needed)N�__path__zNot a package:)r�find_moduler?r�r��types�
ModuleTyper��_set_parent_nsr�r�r�r�r��load_module�_rebuild_mod_path)�packageNamer�r�r�r�Zhandlerr�r�rrr�
_handle_ns$s*






r�cs`dd�tjD���fdd����fdd�}t|t�s8dS|j|d�d	d�|D�|jdd�<dS)
zq
    Rebuild module.__path__ ensuring that all entries are ordered
    corresponding to their sys.path order
    cSsg|]}t|��qSr)r�)r�prrrr�Csz%_rebuild_mod_path.<locals>.<listcomp>cs(y
�j|�Stk
r"td�SXdS)z/
        Workaround for #520 and #513.
        �infN)�indexrD�float)r�)�sys_pathrr�safe_sys_path_indexEs
z._rebuild_mod_path.<locals>.safe_sys_path_indexcs<|jtj�}�jd�d}|d|�}�ttjj|���S)zR
        Return the ordinal of the path based on its position in sys.path
        r:rN)rr�rv�countr�rA)r��
path_partsZmodule_partsr�)�package_namer�rr�position_in_sys_pathNsz/_rebuild_mod_path.<locals>.position_in_sys_pathN)r2cSsg|]}t|��qSr)r�)rr�rrrr�\s)r?r�r�r�rr�)Z	orig_pathr�r�r�r)r�r�r�rr�>s		
r�cCs�tj�z�|tkrdStjd}}d|kr�dj|jd�dd��}t|�|tkrZt|�ytj	|j
}Wntk
r�td|��YnXtj
|g�j|�tj
|g�x|D]}t||�q�WWdtj�XdS)z9Declare that package 'packageName' is a namespace packageNr:rzNot a package:r�)�_imp�acquire_lockr�r?r�rArrVr�r�r�r�r�r�r�r��release_lock)r�r�r�r�rrrrV_s&
c
CsJtj�z2x,tj|f�D]}t||�}|rt||�qWWdtj�XdS)zDEnsure that previously-declared namespace packages include path_itemN)r�r�r�r�r�r�r�)r�r��packager�rrrr��s
cCsFtjj||jd�d�}t|�}x |jD]}t|�|kr(Pq(W|SdS)zBCompute an ns-package subpath for a filesystem or zipfile importerr:rNr�)r�r�rArr�r�)r�r�r�r�r�Z
normalizedr�rrr�file_ns_handler�sr�cCsdS)Nr)r�r�r�r�rrr�null_ns_handler�sr�cCstjjtjj|��S)z1Normalize a file/dir name for comparison purposes)r�r��normcase�realpath)rArrrrt�scCs2y||Stk
r,t|�||<}|SXdS)N)r�rt)rAr��resultrrrr��s
r�cCs|j�jd�S)z7
    Determine if given path appears to be an egg.
    z.egg)rr�)r�rrrri�sricCs t|�otjjtjj|dd��S)z@
    Determine if given path appears to be an unpacked egg.
    zEGG-INFOzPKG-INFO)rir�r�r�rA)r�rrrr��sr�cCs<|jd�}|j�}|r8dj|�}ttj||tj|�dS)Nr:)rr�rA�setattrr?r�)r�r�r�r�rrrr��s


r�ccsht|tj�r>xV|j�D]"}|j�}|r|jd�r|VqWn&x$|D]}xt|�D]
}|VqRWqDWdS)z9Yield non-empty/non-comment lines of a string or sequence�#N)r�rr��
splitlinesr�r�rm)�strs�sZssrrrrm�s
z\w+(\.\w+)*$z�
    (?P<name>[^-]+) (
        -(?P<ver>[^-]+) (
            -py(?P<pyver>[^-]+) (
                -(?P<plat>.+)
            )?
        )?
    )?
    c@s�eZdZdZffdfdd�Zdd�Zdd�Zdd
d�Zdd
�Zddd�Z	e
jd�Ze
ddd��Ze
dd��Ze
ddd��Ze
ddd��ZdS)rbz3Object representing an advertised importable objectNcCs<t|�std|��||_||_t|�|_t|�|_||_dS)NzInvalid module name)�MODULErDr��module_name�tuple�attrsr�r�)r�r�rrr�r�rrrr��s


zEntryPoint.__init__cCsHd|j|jf}|jr*|ddj|j�7}|jrD|ddj|j�7}|S)Nz%s = %s�:r:z [%s]�,)r�rrrAr�)r�rrrrr��szEntryPoint.__str__cCsdt|�S)NzEntryPoint.parse(%r))r�)r�rrrr��szEntryPoint.__repr__TcOs6|s|s|rtjdtdd�|r.|j||�|j�S)zH
        Require packages for this EntryPoint, then resolve it.
        zJParameters to load are deprecated.  Call .resolve and .require separately.r;)�
stacklevel)r:r;�DeprecationWarningrHr�)r�rHr7�kwargsrrrrz	szEntryPoint.loadcCsVt|jdgdd�}ytjt|j|�Stk
rP}ztt|���WYdd}~XnXdS)zD
        Resolve the entry point from its module and attrs.
        rr)�fromlist�levelN)	r�r�	functools�reducer�rr�r�r�)r�r��excrrrr�	s
zEntryPoint.resolvecCsN|jr|jrtd|��|jj|j�}tj||||jd�}tttj|��dS)Nz&Can't require() without a distribution)r�)	r�r�rfr�rWr�r�rr�)r�rrr�r(rrrrH	s

zEntryPoint.requirez]\s*(?P<name>.+?)\s*=\s*(?P<module>[\w.]+)\s*(:\s*(?P<attr>[\w.]+))?\s*(?P<extras>\[.*\])?\s*$cCsf|jj|�}|sd}t||��|j�}|j|d�}|drJ|djd�nf}||d|d|||�S)aParse a single entry point from string `src`

        Entry point syntax follows the form::

            name = some.module:some.attr [extra1, extra2]

        The entry name and module name are required, but the ``:attrs`` and
        ``[extras]`` parts are optional
        z9EntryPoint must be in 'name=module:attrs [extras]' formatr��attrr:r�r�)�patternr>rD�	groupdict�
_parse_extrasr)r��srcr�rFr>�resr�rrrrr�0	s
zEntryPoint.parsecCs(|sfStjd|�}|jr"t��|jS)N�x)rar��specsrDr�)r�Zextras_specr�rrrrD	szEntryPoint._parse_extrascCsZt|�std|��i}x>t|�D]2}|j||�}|j|krHtd||j��|||j<q W|S)zParse an entry point groupzInvalid group namezDuplicate entry point)rrDrmr�r�)r�rC�linesr��thisr�r�rrr�parse_groupM	s

zEntryPoint.parse_groupcCsxt|t�r|j�}nt|�}i}xR|D]J\}}|dkrD|s<q&td��|j�}||kr^td|��|j|||�||<q&W|S)z!Parse a map of entry point groupsNz%Entry points must be listed in groupszDuplicate group name)r�r$r(rnrDr�r)r��datar��mapsrCrrrr�	parse_mapZ	s


zEntryPoint.parse_map)T)NN)N)N)N)rrrrr�r�r�rzr�rHrFr^rrr�rrrrrrrrb�s 	



	cCs>|sdStjj|�}|djd�r:tjj|dd�d�S|S)Nr�rzmd5=r�r�)r�)rr�Zurlparser�Z
urlunparse)r�Zparsedrrr�_remove_md5_fragmentn	sr cCs@dd�}t||�}tt|�d�}|jd�\}}}t|j��p>dS)z�
    Given an iterable of lines from a Metadata file, return
    the value of the Version field, if present, or None otherwise.
    cSs|j�jd�S)Nzversion:)rr�)r�rrr�is_version_line|	sz+_version_from_file.<locals>.is_version_liner�rN)rr�r��	partitionrjr�)rr!Z
version_linesr�r�rVrrr�_version_from_filew	s

r#cs�eZdZdZdZddddedefdd�ZedNdd��Z	dd	�Z
ed
d��Zdd
�Z
dd�Zdd�Zdd�Zdd�Zdd�Zdd�Zedd��Zedd��Zdd�Zed d!��Zed"d#��Zed$d%��Zd&d'�Zffd(d)�Zd*d+�ZdOd-d.�Zd/d0�Zd1d2�Z d3d4�Z!d5d6�Z"�fd7d8�Z#e$e%d9��s&[#edPd:d;��Z&d<d=�Z'd>d?�Z(dQd@dA�Z)dBdC�Z*dRdDdE�Z+dFdG�Z,dHdI�Z-dJdK�Z.edLdM��Z/�Z0S)Sr`z5Wrap an actual or potential sys.path entry w/metadatazPKG-INFONcCsFt|pd�|_|dk	r t|�|_||_||_||_||_|p>t|_	dS)NZUnknown)
rir�rj�_versionrr@r�r�r��	_provider)r�r�r�r�rrr@r�rrrr��	s
zDistribution.__init__cKs~dgd\}}}}tjj|�\}}	|	j�tkr^t|	j�}t|�}
|
r^|
jdddd�\}}}}|||f||||d�|��j�S)Nr�r�ZverZpyverrE)r�rrr@)r�r�r�r�_distributionImpl�EGG_NAMErC�_reload_version)r�r�rjr�r&r�rrr@r�r>rrrr��	s
zDistribution.from_locationcCs|S)Nr)r�rrrr(�	szDistribution._reload_versioncCs(|j|j|jt|j�|jpd|jp$dfS)Nr�)�parsed_versionr�r2r r�rr@)r�rrrr�	szDistribution.hashcmpcCs
t|j�S)N)�hashr)r�rrr�__hash__�	szDistribution.__hash__cCs|j|jkS)N)r)r�rrrr�__lt__�	szDistribution.__lt__cCs|j|jkS)N)r)r�rrrr�__le__�	szDistribution.__le__cCs|j|jkS)N)r)r�rrrr�__gt__�	szDistribution.__gt__cCs|j|jkS)N)r)r�rrrr�__ge__�	szDistribution.__ge__cCst||j�sdS|j|jkS)NF)r�r�r)r�rrrr�__eq__�	szDistribution.__eq__cCs
||kS)Nr)r�rrrr�__ne__�	szDistribution.__ne__cCs0y|jStk
r*|jj�|_}|SXdS)N)Z_keyr�r�r)r�r2rrrr2�	s
zDistribution.keycCst|d�st|j�|_|jS)N�_parsed_version)r�r rr2)r�rrrr)�	s
zDistribution.parsed_versioncCsXtjj}t|j|�}|sdS|js&dStjd�j�jdd�}t	j
|jft|��t
�dS)Na>
            '{project_name} ({version})' is being parsed as a legacy,
            non PEP 440,
            version. You may find odd behavior and sort order.
            In particular it will be sorted as less than 0.0. It
            is recommended to migrate to PEP 440 compatible
            versions.
            rYr�)rrrr�r2r'r(r�r�r:r;r��varsr)r�ZLVZ	is_legacyr,rrr�_warn_legacy_version�	sz!Distribution._warn_legacy_versioncCsLy|jStk
rFt|j|j��}|dkrBd}t||j|��|SXdS)Nz(Missing 'Version:' header and/or %s file)r$r�r#�
_get_metadata�PKG_INFOrD)r�rr,rrrr�	szDistribution.versioncCs2y|jStk
r*|j|j��|_YnX|jS)z~
        A map of extra to its list of (direct) requirements
        for this distribution, including the null extra.
        )Z_Distribution__dep_mapr��_filter_extras�_build_dep_map)r�rrr�_dep_map
s
zDistribution._dep_mapcCsvxpttd|��D]^}|}|j|�}|jd�\}}}|oFt|�pFt|�}|rPg}t|�pZd}|j|g�j|�qW|S)z�
        Given a mapping of extras to dependencies, strip off
        environment markers and filter out any dependencies
        not matching the markers.
        Nr)	r�rr�r"rqrrror�r�)�dmr	Z	new_extrar�r�r
Zfails_markerrrrr7
s

zDistribution._filter_extrascCsHi}x>dD]6}x0t|j|��D]\}}|j|g�jt|��qWq
W|S)N�requires.txt�depends.txt)r;r<)rnr5r�r�rh)r�r:r�r	r�rrrr8&
s

zDistribution._build_dep_mapcCsj|j}g}|j|jdf��xH|D]@}y|j|t|��Wq"tk
r`td||f��Yq"Xq"W|S)z@List of Requirements needed for this distro if `extras` are usedNz%s has no such extra feature %r)r9r�r�ror�rf)r�r�r:Zdepsr�rrrr�-
s
zDistribution.requiresccs(|j|�r$x|j|�D]
}|VqWdS)N)r�r�)r�r�r�rrrr5;
s
zDistribution._get_metadataFcCsZ|dkrtj}|j||d�|tjkrVt|j�x$|jd�D]}|tjkr<t|�q<WdS)z>Ensure distribution is importable on `path` (default=sys.path)N)r�znamespace_packages.txt)r?r�r�r�r�r5r�rV)r�r�r�Zpkgrrr�activate@
s


zDistribution.activatecCs8dt|j�t|j�|jptf}|jr4|d|j7}|S)z@Return what this distribution's standard .egg filename should bez
%s-%s-py%srE)rpr�rrr"r@)r�rArrrrgK
szDistribution.egg_namecCs |jrd||jfSt|�SdS)Nz%s (%s))r�r�)r�rrrr�V
szDistribution.__repr__cCs@yt|dd�}Wntk
r(d}YnX|p0d}d|j|fS)Nrz[unknown version]z%s %s)r�rDr�)r�rrrrr�\
s
zDistribution.__str__cCs|jd�rt|��t|j|�S)zADelegate all unrecognized public attributes to .metadata providerr�)r�r�r�r%)r�rrrr�__getattr__d
s
zDistribution.__getattr__cs.tttt|�j��tdd�|jj�D��B�S)Ncss|]}|jd�s|VqdS)r�N)r�)rrrrrr
n
sz'Distribution.__dir__.<locals>.<genexpr>)r�r��superr`�__dir__r%)r�)r�rrr@j
szDistribution.__dir__r@cKs|jt|�tjj|�|f|�S)N)r�r�r�r�rj)r�rAr�r&rrrr�w
szDistribution.from_filenamecCs<t|jtjj�r"d|j|jf}nd|j|jf}tj|�S)z?Return a ``Requirement`` that matches this distribution exactlyz%s==%sz%s===%s)r�r)rrrr�rar�)r��specrrrr~
szDistribution.as_requirementcCs.|j||�}|dkr&td||ff��|j�S)z=Return the `name` entry point of `group` or raise ImportErrorNzEntry point %r not found)rNr�rz)r�rCr�r�rrrrL�
szDistribution.load_entry_pointcCsPy
|j}Wn,tk
r6tj|jd�|�}|_YnX|dk	rL|j|i�S|S)z=Return the entry point map for `group`, or the full entry mapzentry_points.txtN)Z_ep_mapr�rbrr5r�)r�rCZep_maprrrrM�
s
zDistribution.get_entry_mapcCs|j|�j|�S)z<Return the EntryPoint object for `group`+`name`, or ``None``)rMr�)r�rCr�rrrrN�
szDistribution.get_entry_infoc
Cs2|p|j}|sdSt|�}tjj|�}dd�|D�}x�t|�D]v\}}||kr\|rVPq�dSq>||kr>|jtkr>|r�|||d�kr�dS|tjkr�|j	�|j
||�|j
||�Pq>W|tjkr�|j	�|r�|j
d|�n
|j|�dSxBy|j||d�}	Wnt
k
�rPYq�X||	=||	=|	}q�WdS)a�Ensure self.location is on path

        If replace=False (default):
            - If location is already in path anywhere, do nothing.
            - Else:
              - If it's an egg and its parent directory is on path,
                insert just ahead of the parent.
              - Else: add to the end of path.
        If replace=True:
            - If location is already on path anywhere (not eggs)
              or higher priority than its parent (eggs)
              do nothing.
            - Else:
              - If it's an egg and its parent directory is on path,
                insert just ahead of the parent,
                removing any lower-priority entries.
              - Else: add it to the front of path.
        NcSsg|]}|rt|�p|�qSr)r�)rr�rrrr��
sz*Distribution.insert_on.<locals>.<listcomp>rr)r�r�r�r�rM�	enumerater�rur?�check_version_conflictr�r�r�rD)
r�r��locr�ZnlocZbdirZnpathr�r�Znprrrr��
sB



zDistribution.insert_oncCs�|jdkrdStj|jd��}t|j�}x~|jd�D]p}|tjks4||ks4|tkrTq4|dkr^q4t	tj|dd�}|r�t|�j
|�s4|j
|j�r�q4td|||jf�q4WdS)	N�
setuptoolsznamespace_packages.txtz
top_level.txt�
pkg_resources�siterLzIModule %s was already imported from %s, but %s is being added to sys.path)rFrErG)r2r$r%r5rtr�r?r�r�r�r��
issue_warning)r�ZnsprD�modname�fnrrrrC�
s"

z#Distribution.check_version_conflictcCs4y
|jWn$tk
r.tdt|��dSXdS)NzUnbuilt egg for FT)rrDrHr�)r�rrrr�
s
zDistribution.has_versioncKsDd}x$|j�D]}|j|t||d��qW|jd|j�|jf|�S)z@Copy this distribution, substituting in any changed keyword argsz<project_name version py_version platform location precedenceNr�)rr�r�r%r�)r�r&r2rrrr�clones
zDistribution.clonecCsdd�|jD�S)NcSsg|]}|r|�qSrr)rZdeprrrr�
sz'Distribution.extras.<locals>.<listcomp>)r9)r�rrrr�szDistribution.extras)N)NF)N)N)NF)1rrrrr6r"rur�rr�r(r�rr+r,r-r.r/r0r1r2r)r4rr9rDr7r8r�r5r=rgr�r�r>r@r��objectr�rrLrMrNr�rCrrKr��
__classcell__rr)r�rr`�	sX

		

Dc@seZdZdd�ZdS)�EggInfoDistributioncCst|j|j��}|r||_|S)a�
        Packages installed by distutils (e.g. numpy or scipy),
        which uses an old safe_version, and so
        their version numbers can get mangled when
        converted to filenames (e.g., 1.11.0.dev0+2329eae to
        1.11.0.dev0_2329eae). These distributions will not be
        parsed properly
        downstream by Distribution and safe_version, so
        take an extra step and try to get the version number from
        the metadata file itself instead of the filename.
        )r#r5r6r$)r�Z
md_versionrrrr(sz#EggInfoDistribution._reload_versionN)rrrr(rrrrrN
srNc@s>eZdZdZdZejd�Zedd��Z	edd��Z
dd	�Zd
S)�DistInfoDistributionzV
    Wrap an actual or potential sys.path entry
    w/metadata, .dist-info style.
    ZMETADATAz([\(,])\s*(\d.*?)\s*([,\)])cCs@y|jStk
r:|j|j�}tjj�j|�|_|jSXdS)zParse and cache metadataN)Z	_pkg_infor�r�r6�email�parserZParserZparsestr)r�r�rrr�_parsed_pkg_info(sz%DistInfoDistribution._parsed_pkg_infocCs,y|jStk
r&|j�|_|jSXdS)N)�_DistInfoDistribution__dep_mapr��_compute_dependencies)r�rrrr92s

zDistInfoDistribution._dep_mapcs�dgi}|_g�x&|jjd�p"gD]}�jt|��q$W�fdd�}t|d��}|dj|�x<|jjd�ppgD](}t|j��}tt||��|�||<qrW|S)z+Recompute this distribution's dependencies.Nz
Requires-Distc3s0x*�D]"}|js"|jjd|i�r|VqWdS)Nr	)r
r)r	r�)r�rr�reqs_for_extraCs
zBDistInfoDistribution._compute_dependencies.<locals>.reqs_for_extrazProvides-Extra)	rSrRZget_allr�rh�	frozensetror�r�)r�r:r�rU�commonr	Zs_extrar)r�rrT:sz*DistInfoDistribution._compute_dependenciesN)rrrrr6rFr^ZEQEQr�rRr9rTrrrrrO s

rO)z.eggz	.egg-infoz
.dist-infoc
Os^d}t�}y"xtj|�j|kr(|d7}qWWntk
r@YnXtj|d|di|��dS)Nrr
)r!r?r�r�rDr:r;)r7r&rr*rrrrHYsrHc@seZdZdd�ZdS)�RequirementParseErrorcCsdj|j�S)Nr�)rAr7)r�rrrr�gszRequirementParseError.__str__N)rrrr�rrrrrXfsrXccs�tt|��}xp|D]h}d|kr0|d|jd��}|jd�rp|dd�j�}y|t|�7}Wntk
rndSXt|�VqWdS)z�Yield ``Requirement`` objects for each specification in `strs`

    `strs` must be a string, or a (possibly-nested) iterable thereof.
    z #N�\r;���)r�rmr�r�r�r��
StopIterationra)rrr�rrrrhks

csPeZdZ�fdd�Zdd�Zdd�Zdd�Zd	d
�Zdd�Ze	d
d��Z
�ZS)racs�ytt|�j|�Wn2tjjk
rF}ztt|���WYdd}~XnX|j|_	t
|j�}||j�|_|_
dd�|jD�|_ttt|j��|_|j
|jt|j�|jr�t|j�ndf|_t|j�|_dS)z>DO NOT CALL THIS UNDOCUMENTED METHOD; use Requirement.parse()!NcSsg|]}|j|jf�qSr)rr)rrArrrr��sz(Requirement.__init__.<locals>.<listcomp>)r?rar�rr�ZInvalidRequirementrXr�r�Zunsafe_namerirr�r2�	specifierrrrror�rVr
�hashCmpr*�_Requirement__hash)r�Zrequirement_stringrKr�)r�rrr��s
zRequirement.__init__cCst|t�o|j|jkS)N)r�rar])r�rrrrr0�s
zRequirement.__eq__cCs
||kS)Nr)r�rrrrr1�szRequirement.__ne__cCs0t|t�r |j|jkrdS|j}|jj|dd�S)NFT)Zprereleases)r�r`r2rr\�contains)r�r�rrrr��s

zRequirement.__contains__cCs|jS)N)r^)r�rrrr+�szRequirement.__hash__cCsdt|�S)NzRequirement.parse(%r))r�)r�rrrr��szRequirement.__repr__cCst|�\}|S)N)rh)rr�rrrr��s
zRequirement.parse)rrrr�r0r1r�r+r�rDr�rMrr)r�rra�scCst|kr|tfS|S)zJ
    Ensure object appears in the mro even
    for old-style classes.
    )rL)�classesrrr�_always_object�s
racCs<ttjt|dt|����}x|D]}||kr ||Sq WdS)z2Return an adapter factory for `ob` from `registry`r�N)ra�inspectZgetmror�rs)�registryr3r��trrrr��s
r�cCstjj|�}tj|dd�dS)z1Ensure that the parent directory of `path` existsT)�exist_okN)r�r�rMr�makedirs)r�rMrrrrs�scCs@tstd��t|�\}}|r<|r<t|�r<t|�t|d�dS)z/Sandbox-bypassing version of ensure_directory()z*"os.mkdir" not supported on this platform.i�N)r�r�rr
r/r	)r�rMrArrrr/�sr/ccszd}g}xbt|�D]V}|jd�r^|jd�rR|s2|r<||fV|dd�j�}g}qhtd|��q|j|�qW||fVdS)asSplit a string or iterable thereof into (section, content) pairs

    Each ``section`` is a stripped version of the section header ("[section]")
    and each ``content`` is a list of stripped lines excluding blank lines and
    comment-only lines.  If there are any such lines before the first section
    header, they're returned in a first ``section`` of ``None``.
    N�[�]rzInvalid section headingr�)rmr�r�r�rDr�)rZsectionZcontentr�rrrrn�s


cOs&tj}ztt_tj||�S|t_XdS)N)r�r�os_open�tempfileZmkstemp)r7r&Zold_openrrrr��s
r��ignore)�categoryr�cOs|||�|S)Nr)r�r7rrrr�_call_asides
rmcs.t���|d<|j�fdd�t��D��dS)z=Set up global resource manager (deliberately not state-saved)Z_managerc3s&|]}|jd�s|t�|�fVqdS)r�N)r�r�)rr�)r�rrr
sz_initialize.<locals>.<genexpr>N)r_r"r�)r*r)r�r�_initializes

rncCs|tj�}td|d�|j}|j}|j}|j}|}tdd�|D��|dd�dd�g|_t	t
|jtj
��t�jt��d	S)
aE
    Prepare the master working set and make the ``require()``
    API available.

    This function has explicit effects on the global state
    of pkg_resources. It is intended to be invoked once at
    the initialization of this module.

    Invocation by other packages is unsupported and done
    at their own risk.
    rL)rWcss|]}|jdd�VqdS)F)r�N)r=)rr�rrrr
2sz1_initialize_master_working_set.<locals>.<genexpr>cSs|jdd�S)NT)r�)r=)r�rrrr86sz0_initialize_master_working_set.<locals>.<lambda>F)rN)r^r�r'rHrOrrIrr�r�rr�r?r�r!r"r�)rWrHrOrXrIr�rrr�_initialize_master_working_sets 

ro)rr)rrr�)N)N)F)F)F)F)N)�rZ
__future__rr?r�rPr�rFr�rxr�r:r6rZpkgutilrr@r�r�Zemail.parserrPr�rjr'r�rbrr�r�ZimpZpkg_resources.externrZpkg_resources.extern.six.movesrrrrr	r
rr�rriZos.pathr
rZimportlib.machinery�	machineryrrrr�rrrr��version_info�RuntimeErrorr�r�r�rHrWrXZresources_streamr[Zresource_dirrQrZrUrPrOrSrRrTr�r�r��RuntimeWarningrr r#r'r,r-r0r4r5r6Z
_sget_noneZ
_sset_nonerG�__all__r0rcrdr�rerfr�rr"rurvrwrxryr�rJrBr�r<r^r=r�rkrlrIr�rKrLrMrNrzr{rLr^r$r�r]r�rgr_r\rirjrorprqrrr�r�r�rtrr�rur{r�r�r|r}r~r�rYr�r�r�r�r�r�r�r�r�r�ZImpImporterr�r�r�r�r�rVr�r�r�rtr�rir�r�rmr>r�VERBOSE�
IGNORECASEr'rbr r#r`rNrOr&rHrDrXrhr�rarar�rsr/rnr��filterwarningsrmr!rnrorrrr�<module>s�




 




.

5	
d
-'




 !!


		
3
6

site-packages/pkg_resources/extern/__pycache__/__init__.cpython-36.opt-1.pyc000064400000004451147511334660023016 0ustar003

��f�	�@s,ddlZGdd�d�ZdZeee�j�dS)	�Nc@sDeZdZdZfdfdd�Zedd��Zd
dd�Zd	d
�Zdd�Z	dS)�VendorImporterz�
    A PEP 302 meta path importer for finding optionally-vendored
    or otherwise naturally-installed packages from root_name.
    NcCs&||_t|�|_|p|jdd�|_dS)NZexternZ_vendor)�	root_name�set�vendored_names�replace�
vendor_pkg)�selfrrr�r	�/usr/lib/python3.6/__init__.py�__init__
s
zVendorImporter.__init__ccs|jdVdVdS)zL
        Search first the vendor package then as a natural package.
        �.�N)r)rr	r	r
�search_pathszVendorImporter.search_pathcCs8|j|jd�\}}}|rdStt|j|j��s4dS|S)z�
        Return self when fullname starts with root_name and the
        target module is one vendored through this importer.
        rN)�	partitionr�any�map�
startswithr)r�fullname�path�root�base�targetr	r	r
�find_moduleszVendorImporter.find_modulecCs�|j|jd�\}}}xp|jD]T}y:||}t|�tj|}|tj|<tjdkrZtj|=|Stk
rpYqXqWtdjft	����dS)zK
        Iterate over the search path to locate and load fullname.
        r�z�The '{target}' package is required; normally this is bundled with this package so if you get this warning, consult the packager of your distribution.N)rr)
rrr�
__import__�sys�modules�version_info�ImportError�format�locals)rrrrr�prefixZextant�modr	r	r
�load_module#s



zVendorImporter.load_modulecCs|tjkrtjj|�dS)zR
        Install this importer into sys.meta_path if not already present.
        N)r�	meta_path�append)rr	r	r
�install@s
zVendorImporter.install)N)
�__name__�
__module__�__qualname__�__doc__r�propertyrrr#r&r	r	r	r
rs
r�	packaging�	pyparsing�six�appdirs)r,r-r.r/)rr�namesr'r&r	r	r	r
�<module>sDsite-packages/pkg_resources/extern/__pycache__/__init__.cpython-36.pyc000064400000004451147511334660022057 0ustar003

��f�	�@s,ddlZGdd�d�ZdZeee�j�dS)	�Nc@sDeZdZdZfdfdd�Zedd��Zd
dd�Zd	d
�Zdd�Z	dS)�VendorImporterz�
    A PEP 302 meta path importer for finding optionally-vendored
    or otherwise naturally-installed packages from root_name.
    NcCs&||_t|�|_|p|jdd�|_dS)NZexternZ_vendor)�	root_name�set�vendored_names�replace�
vendor_pkg)�selfrrr�r	�/usr/lib/python3.6/__init__.py�__init__
s
zVendorImporter.__init__ccs|jdVdVdS)zL
        Search first the vendor package then as a natural package.
        �.�N)r)rr	r	r
�search_pathszVendorImporter.search_pathcCs8|j|jd�\}}}|rdStt|j|j��s4dS|S)z�
        Return self when fullname starts with root_name and the
        target module is one vendored through this importer.
        rN)�	partitionr�any�map�
startswithr)r�fullname�path�root�base�targetr	r	r
�find_moduleszVendorImporter.find_modulecCs�|j|jd�\}}}xp|jD]T}y:||}t|�tj|}|tj|<tjdkrZtj|=|Stk
rpYqXqWtdjft	����dS)zK
        Iterate over the search path to locate and load fullname.
        r�z�The '{target}' package is required; normally this is bundled with this package so if you get this warning, consult the packager of your distribution.N)rr)
rrr�
__import__�sys�modules�version_info�ImportError�format�locals)rrrrr�prefixZextant�modr	r	r
�load_module#s



zVendorImporter.load_modulecCs|tjkrtjj|�dS)zR
        Install this importer into sys.meta_path if not already present.
        N)r�	meta_path�append)rr	r	r
�install@s
zVendorImporter.install)N)
�__name__�
__module__�__qualname__�__doc__r�propertyrrr#r&r	r	r	r
rs
r�	packaging�	pyparsing�six�appdirs)r,r-r.r/)rr�namesr'r&r	r	r	r
�<module>sDsite-packages/pkg_resources/extern/__init__.py000064400000004667147511334660015604 0ustar00import sys


class VendorImporter:
    """
    A PEP 302 meta path importer for finding optionally-vendored
    or otherwise naturally-installed packages from root_name.
    """

    def __init__(self, root_name, vendored_names=(), vendor_pkg=None):
        self.root_name = root_name
        self.vendored_names = set(vendored_names)
        self.vendor_pkg = vendor_pkg or root_name.replace('extern', '_vendor')

    @property
    def search_path(self):
        """
        Search first the vendor package then as a natural package.
        """
        yield self.vendor_pkg + '.'
        yield ''

    def find_module(self, fullname, path=None):
        """
        Return self when fullname starts with root_name and the
        target module is one vendored through this importer.
        """
        root, base, target = fullname.partition(self.root_name + '.')
        if root:
            return
        if not any(map(target.startswith, self.vendored_names)):
            return
        return self

    def load_module(self, fullname):
        """
        Iterate over the search path to locate and load fullname.
        """
        root, base, target = fullname.partition(self.root_name + '.')
        for prefix in self.search_path:
            try:
                extant = prefix + target
                __import__(extant)
                mod = sys.modules[extant]
                sys.modules[fullname] = mod
                # mysterious hack:
                # Remove the reference to the extant package/module
                # on later Python versions to cause relative imports
                # in the vendor package to resolve the same modules
                # as those going through this importer.
                if sys.version_info > (3, 3):
                    del sys.modules[extant]
                return mod
            except ImportError:
                pass
        else:
            raise ImportError(
                "The '{target}' package is required; "
                "normally this is bundled with this package so if you get "
                "this warning, consult the packager of your "
                "distribution.".format(**locals())
            )

    def install(self):
        """
        Install this importer into sys.meta_path if not already present.
        """
        if self not in sys.meta_path:
            sys.meta_path.append(self)


names = 'packaging', 'pyparsing', 'six', 'appdirs'
VendorImporter(__name__, names).install()
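
Illustrative usage (editor's sketch, not part of the archived file): once the importer above is installed on sys.meta_path, a vendored dependency can be imported through the extern namespace; it resolves to the bundled copy under pkg_resources._vendor when present, otherwise to the system-wide package.

# Hypothetical interactive session; 'packaging' is one of the names registered above.
from pkg_resources.extern import packaging
print(packaging.__name__)  # 'pkg_resources._vendor.packaging' or plain 'packaging'
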
site-packages/isc-2.0-py3.6.egg-info000064400000000413147511334660012641 0ustar00
Metadata-Version: 1.1
Name: isc
Version: 2.0
Summary: Python functions to support BIND utilities
Home-page: https://www.isc.org/bind
Author: Internet Systems Consortium, Inc
Author-email: info@isc.org
License: MPL
Description: UNKNOWN
Platform: UNKNOWN
Requires: ply
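
A hedged sketch (not part of the archive): metadata like the file above is what pkg_resources turns into a Distribution object at runtime; this assumes the 'isc' distribution is visible to the interpreter that owns this site-packages directory.

import pkg_resources
dist = pkg_resources.get_distribution("isc")
print(dist.project_name, dist.version)  # expected output: isc 2.0
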
site-packages/tuned/daemon/daemon.py000064400000031671147511334660013505 0ustar00
import os
import errno
import threading
import tuned.logs
from tuned.exceptions import TunedException
from tuned.profiles.exceptions import InvalidProfileException
import tuned.consts as consts
from tuned.utils.commands import commands
from tuned import exports
from tuned.utils.profile_recommender import ProfileRecommender
import re

log = tuned.logs.get()


class Daemon(object):
	def __init__(self, unit_manager, profile_loader, profile_names=None, config=None, application=None):
		log.debug("initializing daemon")
		self._daemon = consts.CFG_DEF_DAEMON
		self._sleep_interval = int(consts.CFG_DEF_SLEEP_INTERVAL)
		self._update_interval = int(consts.CFG_DEF_UPDATE_INTERVAL)
		self._dynamic_tuning = consts.CFG_DEF_DYNAMIC_TUNING
		self._recommend_command = True
		self._rollback = consts.CFG_DEF_ROLLBACK
		if config is not None:
			self._daemon = config.get_bool(consts.CFG_DAEMON, consts.CFG_DEF_DAEMON)
			self._sleep_interval = int(config.get(consts.CFG_SLEEP_INTERVAL, consts.CFG_DEF_SLEEP_INTERVAL))
			self._update_interval = int(config.get(consts.CFG_UPDATE_INTERVAL, consts.CFG_DEF_UPDATE_INTERVAL))
			self._dynamic_tuning = config.get_bool(consts.CFG_DYNAMIC_TUNING, consts.CFG_DEF_DYNAMIC_TUNING)
			self._recommend_command = config.get_bool(consts.CFG_RECOMMEND_COMMAND, consts.CFG_DEF_RECOMMEND_COMMAND)
			self._rollback = config.get(consts.CFG_ROLLBACK, consts.CFG_DEF_ROLLBACK)
		self._application = application
		if self._sleep_interval <= 0:
			self._sleep_interval = int(consts.CFG_DEF_SLEEP_INTERVAL)
		if self._update_interval == 0:
			self._dynamic_tuning = False
		elif self._update_interval < self._sleep_interval:
			self._update_interval = self._sleep_interval
		self._sleep_cycles = self._update_interval // self._sleep_interval
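		# Example: a 1 s sleep interval with a 10 s update interval gives 10 sleep
		# cycles per dynamic-tuning pass; because of the integer division above, the
		# effective update interval is always a whole multiple of the sleep interval
		# (this is the value reported in the log message below).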
		log.info("using sleep interval of %d second(s)" % self._sleep_interval)
		if self._dynamic_tuning:
			log.info("dynamic tuning is enabled (can be overridden by plugins)")
			log.info("using update interval of %d second(s) (%d times of the sleep interval)" % (self._sleep_cycles * self._sleep_interval, self._sleep_cycles))

		self._profile_recommender = ProfileRecommender(is_hardcoded = not self._recommend_command)
		self._unit_manager = unit_manager
		self._profile_loader = profile_loader
		self._init_threads()
		self._cmd = commands()
		try:
			self._init_profile(profile_names)
		except TunedException as e:
			log.error("Cannot set initial profile. No tunings will be enabled: %s" % e)

	def _init_threads(self):
		self._thread = None
		self._terminate = threading.Event()
		# Flag which is set if terminating due to profile_switch
		self._terminate_profile_switch = threading.Event()
		# Flag which is set if there is no operation in progress
		self._not_used = threading.Event()
		# Flag which is set if SIGHUP is being processed
		self._sighup_processing = threading.Event()
		self._not_used.set()
		self._profile_applied = threading.Event()

	def reload_profile_config(self):
		"""Read configuration files again and load profile according to them"""
		self._init_profile(None)

	def _init_profile(self, profile_names):
		manual = True
		post_loaded_profile = self._cmd.get_post_loaded_profile()
		if profile_names is None:
			(profile_names, manual) = self._get_startup_profile()
			if profile_names is None:
				msg = "No profile is preset, running in manual mode. "
				if post_loaded_profile:
					msg += "Only post-loaded profile will be enabled"
				else:
					msg += "No profile will be enabled."
				log.info(msg)
		# Passed through '-p' cmdline option
		elif profile_names == "":
			if post_loaded_profile:
				log.info("Only post-loaded profile will be enabled")
			else:
				log.info("No profile will be enabled.")

		self._profile = None
		self._manual = None
		self._active_profiles = []
		self._post_loaded_profile = None
		self.set_all_profiles(profile_names, manual, post_loaded_profile)

	def _load_profiles(self, profile_names, manual):
		profile_names = profile_names or ""
		profile_list = profile_names.split()

		if self._post_loaded_profile:
			log.info("Using post-loaded profile '%s'"
				 % self._post_loaded_profile)
			profile_list = profile_list + [self._post_loaded_profile]
		for profile in profile_list:
			if profile not in self.profile_loader.profile_locator.get_known_names():
				errstr = "Requested profile '%s' doesn't exist." % profile
				self._notify_profile_changed(profile_names, False, errstr)
				raise TunedException(errstr)
		try:
			if profile_list:
				self._profile = self._profile_loader.load(profile_list)
			else:
				self._profile = None

			self._manual = manual
			self._active_profiles = profile_names.split()
		except InvalidProfileException as e:
			errstr = "Cannot load profile(s) '%s': %s" % (" ".join(profile_list), e)
			self._notify_profile_changed(profile_names, False, errstr)
			raise TunedException(errstr)

	def set_profile(self, profile_names, manual):
		if self.is_running():
			errstr = "Cannot set profile while the daemon is running."
			self._notify_profile_changed(profile_names, False,
						     errstr)
			raise TunedException(errstr)

		self._load_profiles(profile_names, manual)

	def _set_post_loaded_profile(self, profile_name):
		if not profile_name:
			self._post_loaded_profile = None
		elif len(profile_name.split()) > 1:
			errstr = "Whitespace is not allowed in profile names; only a single post-loaded profile is allowed."
			raise TunedException(errstr)
		else:
			self._post_loaded_profile = profile_name

	def set_all_profiles(self, active_profiles, manual, post_loaded_profile,
			     save_instantly=False):
		if self.is_running():
			errstr = "Cannot set profile while the daemon is running."
			self._notify_profile_changed(active_profiles, False,
						     errstr)
			raise TunedException(errstr)

		self._set_post_loaded_profile(post_loaded_profile)
		self._load_profiles(active_profiles, manual)

		if save_instantly:
			self._save_active_profile(active_profiles, manual)
			self._save_post_loaded_profile(post_loaded_profile)

	@property
	def profile(self):
		return self._profile

	@property
	def manual(self):
		return self._manual

	@property
	def post_loaded_profile(self):
		# Return the profile name only if the profile is active. If
		# the profile is not active, then the value is meaningless.
		return self._post_loaded_profile if self._profile else None

	@property
	def profile_recommender(self):
		return self._profile_recommender

	@property
	def profile_loader(self):
		return self._profile_loader

	# send notification when the profile is changed (everything is set up) or if an error occurred
	# result: True - OK, False - error occurred
	def _notify_profile_changed(self, profile_names, result, errstr):
		if self._application is not None:
			exports.send_signal(consts.SIGNAL_PROFILE_CHANGED, profile_names, result, errstr)
		return errstr

	def _full_rollback_required(self):
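		# A full rollback is wanted only when the system is not shutting down:
		# 'systemctl is-system-running' reporting 'stopping', or a queued
		# shutdown/reboot/halt/poweroff target in 'systemctl list-jobs', means the
		# system is going down and only a soft rollback should be performed.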
		retcode, out = self._cmd.execute(["systemctl", "is-system-running"], no_errors = [0])
		if retcode < 0:
			return False
		if out[:8] == "stopping":
			return False
		retcode, out = self._cmd.execute(["systemctl", "list-jobs"], no_errors = [0])
		return re.search(r"\b(shutdown|reboot|halt|poweroff)\.target.*start", out) is None and not retcode

	def _thread_code(self):
		if self._profile is None:
			raise TunedException("Cannot start the daemon without setting a profile.")

		self._unit_manager.create(self._profile.units)
		self._save_active_profile(" ".join(self._active_profiles),
					  self._manual)
		self._save_post_loaded_profile(self._post_loaded_profile)
		self._unit_manager.start_tuning()
		self._profile_applied.set()
		log.info("static tuning from profile '%s' applied" % self._profile.name)
		if self._daemon:
			exports.start()
		profile_names = " ".join(self._active_profiles)
		self._notify_profile_changed(profile_names, True, "OK")
		self._sighup_processing.clear()

		if self._daemon:
			# With a Python 2 interpreter carrying the patch for rhbz#917709 we need to poll
			# periodically, otherwise Python gets no chance to update events / locks (due to the GIL)
			# and e.g. the D-Bus control interface will not work. The polling interval of 1 second
			# (the default) is still much better than the 50 ms polling of an unpatched interpreter.
			# For more details see TuneD rhbz#917587.
			_sleep_cnt = self._sleep_cycles
			while not self._cmd.wait(self._terminate, self._sleep_interval):
				if self._dynamic_tuning:
					_sleep_cnt -= 1
					if _sleep_cnt <= 0:
						_sleep_cnt = self._sleep_cycles
						log.debug("updating monitors")
						self._unit_manager.update_monitors()
						log.debug("performing tunings")
						self._unit_manager.update_tuning()

		self._profile_applied.clear()

		# wait for others to complete their tasks, using a timeout of 3 x sleep_interval
		# to prevent deadlocks
		i = 0
		while not self._cmd.wait(self._not_used, self._sleep_interval) and i < 3:
			i += 1

		# if terminating due to profile switch
		if self._terminate_profile_switch.is_set():
			rollback = consts.ROLLBACK_FULL
		else:
			# Assume only soft rollback is needed. Soft rollback means reverting all
			# non-persistent tunings applied by a plugin instance. In contrast to full
			# rollback, information about what to revert is kept in RAM (volatile
			# memory) -- TuneD data structures.
			# With systemd, TuneD detects system shutdown and in such a case it doesn't
			# perform the full cleanup. If the system is not shutting down, it means that
			# TuneD was explicitly stopped by the user, and in that case the full cleanup
			# is performed. On systems without systemd, the full cleanup is never performed.
			rollback = consts.ROLLBACK_SOFT
			if not self._full_rollback_required():
				log.info("terminating TuneD due to system shutdown / reboot")
			elif self._rollback == "not_on_exit":
				# no rollback on TuneD exit whatsoever
				rollback = consts.ROLLBACK_NONE
				log.info("terminating TuneD and not rolling back any changes due to '%s' option in '%s'" % (consts.CFG_ROLLBACK, consts.GLOBAL_CONFIG_FILE))
			else:
				if self._daemon:
					log.info("terminating TuneD, rolling back all changes")
					rollback = consts.ROLLBACK_FULL
				else:
					log.info("terminating TuneD in one-shot mode")
		if self._daemon:
			self._unit_manager.stop_tuning(rollback)
		self._unit_manager.destroy_all()

	def _save_active_profile(self, profile_names, manual):
		try:
			self._cmd.save_active_profile(profile_names, manual)
		except TunedException as e:
			log.error(str(e))

	def _save_post_loaded_profile(self, profile_name):
		try:
			self._cmd.save_post_loaded_profile(profile_name)
		except TunedException as e:
			log.error(str(e))

	def _get_recommended_profile(self):
		log.info("Running in automatic mode, checking what profile is recommended for your configuration.")
		profile = self._profile_recommender.recommend()
		log.info("Using '%s' profile" % profile)
		return profile

	def _get_startup_profile(self):
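		# The stored active profile and manual flag are read back; when the manual flag
		# is missing, manual mode is assumed whenever a profile name is present. In
		# automatic (non-manual) mode the recommended profile is used instead.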
		profile, manual = self._cmd.get_active_profile()
		if manual is None:
			manual = profile is not None
		if not manual:
			profile = self._get_recommended_profile()
		return profile, manual

	def get_all_plugins(self):
		"""Return all accessible plugin classes"""
		return self._unit_manager.plugins_repository.load_all_plugins()

	def get_plugin_documentation(self, plugin_name):
		"""Return plugin class docstring"""
		try:
			plugin_class = self._unit_manager.plugins_repository.load_plugin(
				plugin_name
			)
		except ImportError:
			return ""
		return plugin_class.__doc__

	def get_plugin_hints(self, plugin_name):
		"""Return plugin's parameters and their hints

		Parameters:
		plugin_name -- name of the plugin

		Return:
		dictionary -- {parameter_name: hint}
		"""
		try:
			plugin_class = self._unit_manager.plugins_repository.load_plugin(
				plugin_name
			)
		except ImportError:
			return {}
		return plugin_class.get_config_options_hints()

	def is_enabled(self):
		return self._profile is not None

	def is_running(self):
		return self._thread is not None and self._thread.is_alive()

	def start(self):
		if self.is_running():
			return False

		if self._profile is None:
			return False

		log.info("starting tuning")
		self._not_used.set()
		self._thread = threading.Thread(target=self._thread_code)
		self._terminate_profile_switch.clear()
		self._terminate.clear()
		self._thread.start()
		return True

	def verify_profile(self, ignore_missing):
		if not self.is_running():
			log.error("TuneD is not running")
			return False

		if self._profile is None:
			log.error("no profile is set")
			return False

		if not self._profile_applied.is_set():
			log.error("profile is not applied")
			return False

		# when running as a daemon, the main loop must not exit before verification completes
		self._not_used.clear()
		log.info("verifying profile(s): %s" % self._profile.name)
		ret = self._unit_manager.verify_tuning(ignore_missing)
		# main loop is allowed to exit
		self._not_used.set()
		return ret

	# profile_switch is a helper flag telling plugins whether the stop is due to a profile switch
	def stop(self, profile_switch = False):
		if not self.is_running():
			return False
		log.info("stopping tuning")
		if profile_switch:
			self._terminate_profile_switch.set()
		self._terminate.set()
		self._thread.join()
		self._thread = None

		return True
site-packages/tuned/daemon/__init__.py000064400000000113147511334660013764 0ustar00from .application import *
from .controller import *
from .daemon import *
site-packages/tuned/daemon/application.py000064400000017227147511334660014546 0ustar00from tuned import storage, units, monitors, plugins, profiles, exports, hardware
from tuned.exceptions import TunedException
import tuned.logs
import tuned.version
from . import controller
from . import daemon
import signal
import os
import sys
import select
import struct
import tuned.consts as consts
from tuned.utils.global_config import GlobalConfig

log = tuned.logs.get()

__all__ = ["Application"]

class Application(object):
	def __init__(self, profile_name = None, config = None):
		# os.uname()[2] is used for Python 2.7 compatibility; it is the kernel release string,
		# e.g. '5.15.13-100.fc34.x86_64'
		log.info("TuneD: %s, kernel: %s" % (tuned.version.TUNED_VERSION_STR, os.uname()[2]))
		self._dbus_exporter = None
		self._unix_socket_exporter = None

		storage_provider = storage.PickleProvider()
		storage_factory = storage.Factory(storage_provider)

		self.config = GlobalConfig() if config is None else config
		if self.config.get_bool(consts.CFG_DYNAMIC_TUNING):
			log.info("dynamic tuning is enabled (can be overridden in plugins)")
		else:
			log.info("dynamic tuning is globally disabled")

		monitors_repository = monitors.Repository()
		udev_buffer_size = self.config.get_size("udev_buffer_size", consts.CFG_DEF_UDEV_BUFFER_SIZE)
		hardware_inventory = hardware.Inventory(buffer_size=udev_buffer_size)
		device_matcher = hardware.DeviceMatcher()
		device_matcher_udev = hardware.DeviceMatcherUdev()
		plugin_instance_factory = plugins.instance.Factory()
		self.variables = profiles.variables.Variables()

		plugins_repository = plugins.Repository(monitors_repository, storage_factory, hardware_inventory,\
			device_matcher, device_matcher_udev, plugin_instance_factory, self.config, self.variables)
		def_instance_priority = int(self.config.get(consts.CFG_DEFAULT_INSTANCE_PRIORITY, consts.CFG_DEF_DEFAULT_INSTANCE_PRIORITY))
		unit_manager = units.Manager(
				plugins_repository, monitors_repository,
				def_instance_priority, hardware_inventory, self.config)

		profile_factory = profiles.Factory()
		profile_merger = profiles.Merger()
		profile_locator = profiles.Locator(consts.LOAD_DIRECTORIES)
		profile_loader = profiles.Loader(profile_locator, profile_factory, profile_merger, self.config, self.variables)

		self._daemon = daemon.Daemon(unit_manager, profile_loader, profile_name, self.config, self)
		self._controller = controller.Controller(self._daemon, self.config)

		self._init_signals()

		self._pid_file = None

	def _handle_signal(self, signal_number, handler):
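		# Wrap the handler so it only fires for the signal it was registered for;
		# the frame argument supplied by the signal module is ignored.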
		def handler_wrapper(_signal_number, _frame):
			if signal_number == _signal_number:
				handler()
		signal.signal(signal_number, handler_wrapper)

	def _init_signals(self):
		self._handle_signal(signal.SIGHUP, self._controller.sighup)
		self._handle_signal(signal.SIGINT, self._controller.terminate)
		self._handle_signal(signal.SIGTERM, self._controller.terminate)

	def attach_to_dbus(self, bus_name, object_name, interface_name, namespace):
		if self._dbus_exporter is not None:
			raise TunedException("DBus interface is already initialized.")

		self._dbus_exporter = exports.dbus.DBusExporter(bus_name, interface_name, object_name, namespace)
		exports.register_exporter(self._dbus_exporter)

	def attach_to_unix_socket(self):
		if self._unix_socket_exporter is not None:
			raise TunedException("Unix socket interface is already initialized.")

		self._unix_socket_exporter = exports.unix_socket.UnixSocketExporter(self.config.get(consts.CFG_UNIX_SOCKET_PATH),
																			self.config.get(consts.CFG_UNIX_SOCKET_SIGNAL_PATHS),
																			self.config.get(consts.CFG_UNIX_SOCKET_OWNERSHIP),
																			self.config.get_int(consts.CFG_UNIX_SOCKET_PERMISIONS),
																			self.config.get_int(consts.CFG_UNIX_SOCKET_CONNECTIONS_BACKLOG))
		exports.register_exporter(self._unix_socket_exporter)

	def register_controller(self):
		exports.register_object(self._controller)

	def _daemonize_parent(self, parent_in_fd, child_out_fd):
		"""
		Wait until the child signals that its initialization is complete by writing
		some (uninteresting) data into the pipe.
		"""
		os.close(child_out_fd)
		(read_ready, drop, drop) = select.select([parent_in_fd], [], [], consts.DAEMONIZE_PARENT_TIMEOUT)

		if len(read_ready) != 1:
			os.close(parent_in_fd)
			raise TunedException("Cannot daemonize, timeout when waiting for the child process.")

		response = os.read(parent_in_fd, 8)
		os.close(parent_in_fd)

		if len(response) == 0:
			raise TunedException("Cannot daemonize, no response from child process received.")

		try:
			val = struct.unpack("?", response)[0]
		except struct.error:
			raise TunedException("Cannot daemonize, invalid response from child process received.")
		if not val:
			raise TunedException("Cannot daemonize, child process reports failure.")

	def write_pid_file(self, pid_file = consts.PID_FILE):
		self._pid_file = pid_file
		self._delete_pid_file()
		try:
			dir_name = os.path.dirname(self._pid_file)
			if not os.path.exists(dir_name):
				os.makedirs(dir_name)

			with os.fdopen(os.open(self._pid_file, os.O_CREAT|os.O_TRUNC|os.O_WRONLY, 0o644), "w") as f:
				f.write("%d" % os.getpid())
		except (OSError,IOError) as error:
			log.critical("cannot write the PID to %s: %s" % (self._pid_file, str(error)))

	def _delete_pid_file(self):
		if os.path.exists(self._pid_file):
			try:
				os.unlink(self._pid_file)
			except OSError as error:
				log.warning("cannot remove existing PID file %s, %s" % (self._pid_file, str(error)))

	def _daemonize_child(self, pid_file, parent_in_fd, child_out_fd):
		"""
		Finish the daemonizing process, write a PID file and signal to the parent
		that the initialization is complete.
		"""
		os.close(parent_in_fd)

		os.chdir("/")
		os.setsid()
		os.umask(0)

		try:
			pid = os.fork()
			if pid > 0:
				sys.exit(0)
		except OSError as error:
			log.critical("cannot daemonize, fork() error: %s" % str(error))
			val = struct.pack("?", False)
			os.write(child_out_fd, val)
			os.close(child_out_fd)
			raise TunedException("Cannot daemonize, second fork() failed.")

		fd = open("/dev/null", "w+")
		os.dup2(fd.fileno(), sys.stdin.fileno())
		os.dup2(fd.fileno(), sys.stdout.fileno())
		os.dup2(fd.fileno(), sys.stderr.fileno())

		self.write_pid_file(pid_file)

		log.debug("successfully daemonized")
		val = struct.pack("?", True)
		os.write(child_out_fd, val)
		os.close(child_out_fd)

	def daemonize(self, pid_file = consts.PID_FILE):
		"""
		Daemonize the application. In case of failure, TunedException is raised
		in the parent process. If the operation is successful, the main process
		is terminated and only the child process returns from this method.
		"""
		parent_child_fds = os.pipe()
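		# os.pipe() returns (read_fd, write_fd): the parent keeps the read end while
		# the child writes a single struct-packed boolean once its setup has finished.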
		try:
			child_pid = os.fork()
		except OSError as error:
			os.close(parent_child_fds[0])
			os.close(parent_child_fds[1])
			raise TunedException("Cannot daemonize, fork() failed.")

		try:
			if child_pid > 0:
				self._daemonize_parent(*parent_child_fds)
				sys.exit(0)
			else:
				self._daemonize_child(pid_file, *parent_child_fds)
		except:
			# pass exceptions only into parent process
			if child_pid > 0:
				raise
			else:
				sys.exit(1)

	@property
	def daemon(self):
		return self._daemon

	@property
	def controller(self):
		return self._controller

	def run(self, daemon):
		# override the global config if run from the command line with the daemon option (-d)
		if daemon:
			self.config.set(consts.CFG_DAEMON, True)
		if not self.config.get_bool(consts.CFG_DAEMON, consts.CFG_DEF_DAEMON):
			log.warn("Using one shot no daemon mode, most of the functionality will be not available, it can be changed in global config")
		result = self._controller.run()
		if self.config.get_bool(consts.CFG_DAEMON, consts.CFG_DEF_DAEMON):
			exports.stop()

		if self._pid_file is not None:
			self._delete_pid_file()

		return result
site-packages/tuned/daemon/controller.py000064400000031165147511334660014423 0ustar00from tuned import exports
import tuned.logs
import tuned.exceptions
from tuned.exceptions import TunedException
import threading
import tuned.consts as consts
from tuned.utils.commands import commands

__all__ = ["Controller"]

log = tuned.logs.get()

class TimerStore(object):
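	# Thread-safe bookkeeping of threading.Timer objects keyed by log-capture token,
	# so pending capture timeouts can be cancelled individually or all at once.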
	def __init__(self):
		self._timers = dict()
		self._timers_lock = threading.Lock()

	def store_timer(self, token, timer):
		with self._timers_lock:
			self._timers[token] = timer

	def drop_timer(self, token):
		with self._timers_lock:
			try:
				timer = self._timers[token]
				timer.cancel()
				del self._timers[token]
			except:
				pass

	def cancel_all(self):
		with self._timers_lock:
			for timer in self._timers.values():
				timer.cancel()
			self._timers.clear()

class Controller(tuned.exports.interfaces.ExportableInterface):
	"""
	Controller's purpose is to keep the program running, start/stop the tuning,
	and export the controller interface (currently only over D-Bus).
	"""

	def __init__(self, daemon, global_config):
		super(Controller, self).__init__()
		self._daemon = daemon
		self._global_config = global_config
		self._terminate = threading.Event()
		self._cmd = commands()
		self._timer_store = TimerStore()

	def run(self):
		"""
		Controller main loop. The call is blocking.
		"""
		log.info("starting controller")
		res = self.start()
		daemon = self._global_config.get_bool(consts.CFG_DAEMON, consts.CFG_DEF_DAEMON)
		if not res and daemon:
			exports.start()

		if daemon:
			self._terminate.clear()
			# we have to pass some timeout, otherwise signals will not work
			while not self._cmd.wait(self._terminate, 1):
				exports.period_check()

		log.info("terminating controller")
		self.stop()

	def terminate(self):
		self._terminate.set()

	def sighup(self):
		if not self._daemon._sighup_processing.is_set():
			self._daemon._sighup_processing.set()
			if not self.reload():
				self._daemon._sighup_processing.clear()

	@exports.signal("sbs")
	def profile_changed(self, profile_name, result, errstr):
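		# D-Bus signal declaration only -- the body is intentionally empty. The actual
		# emission goes through exports.send_signal(consts.SIGNAL_PROFILE_CHANGED, ...)
		# in the daemon, with (profile name, result, error string) matching the "sbs" signature.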
		pass

	# The exports decorator checks authorization (currently through polkit). The 'caller'
	# argument is None if no authorization was performed (i.e. the call should proceed as
	# authorized), a string identifying the caller (with D-Bus it is the caller's bus name)
	# if authorized, and an empty string if not authorized; 'caller' must be the last argument.

	def _log_capture_abort(self, token):
		tuned.logs.log_capture_finish(token)
		self._timer_store.drop_timer(token)

	@exports.export("ii", "s")
	def log_capture_start(self, log_level, timeout, caller = None):
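		# Start capturing log messages at the given level; if a positive timeout is given,
		# a timer calls _log_capture_abort() to finish the capture when it expires. The
		# returned token is later passed to log_capture_finish() to retrieve the capture.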
		if caller == "":
			return ""
		token = tuned.logs.log_capture_start(log_level)
		if token is None:
			return ""
		if timeout > 0:
			timer = threading.Timer(timeout,
					self._log_capture_abort, args = [token])
			self._timer_store.store_timer(token, timer)
			timer.start()
		return "" if token is None else token

	@exports.export("s", "s")
	def log_capture_finish(self, token, caller = None):
		if caller == "":
			return ""
		res = tuned.logs.log_capture_finish(token)
		self._timer_store.drop_timer(token)
		return "" if res is None else res

	@exports.export("", "b")
	def start(self, caller = None):
		if caller == "":
			return False
		if self._global_config.get_bool(consts.CFG_DAEMON, consts.CFG_DEF_DAEMON):
			if self._daemon.is_running():
				return True
			elif not self._daemon.is_enabled():
				return False
		return self._daemon.start()

	def _stop(self, profile_switch = False):
		if not self._daemon.is_running():
			res = True
		else:
			res = self._daemon.stop(profile_switch = profile_switch)
		self._timer_store.cancel_all()
		return res

	@exports.export("", "b")
	def stop(self, caller = None):
		if caller == "":
			return False
		return self._stop(profile_switch = False)

	@exports.export("", "b")
	def reload(self, caller = None):
		if caller == "":
			return False
		if self._daemon.is_running():
			stop_ok = self._stop(profile_switch = True)
			if not stop_ok:
				return False
		try:
			self._daemon.reload_profile_config()
		except TunedException as e:
			log.error("Failed to reload TuneD: %s" % e)
			return False
		return self.start()

	def _switch_profile(self, profile_name, manual):
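		# Stop the tuning if it is running, load the requested profile and start again.
		# If loading the new profile fails, the previously loaded profile is restarted
		# so the system is not left untuned.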
		was_running = self._daemon.is_running()
		msg = "OK"
		success = True
		reapply = False
		try:
			if was_running:
				self._daemon.stop(profile_switch = True)
			self._daemon.set_profile(profile_name, manual)
		except tuned.exceptions.TunedException as e:
			success = False
			msg = str(e)
			if was_running and self._daemon.profile.name == profile_name:
				log.error("Failed to reapply profile '%s'. Did it change on disk and break?" % profile_name)
				reapply = True
			else:
				log.error("Failed to apply profile '%s'" % profile_name)
		finally:
			if was_running:
				if reapply:
					log.warn("Applying previously applied (possibly out-dated) profile '%s'." % profile_name)
				elif not success:
					log.info("Applying previously applied profile.")
				self._daemon.start()

		return (success, msg)

	@exports.export("s", "(bs)")
	def switch_profile(self, profile_name, caller = None):
		if caller == "":
			return (False, "Unauthorized")
		return self._switch_profile(profile_name, True)

	@exports.export("", "(bs)")
	def auto_profile(self, caller = None):
		if caller == "":
			return (False, "Unauthorized")
		profile_name = self.recommend_profile()
		return self._switch_profile(profile_name, False)

	@exports.export("", "s")
	def active_profile(self, caller = None):
		if caller == "":
			return ""
		if self._daemon.profile is not None:
			return self._daemon.profile.name
		else:
			return ""

	@exports.export("", "(ss)")
	def profile_mode(self, caller = None):
		if caller == "":
			return "unknown", "Unauthorized"
		manual = self._daemon.manual
		if manual is None:
			# This means no profile is applied. Check the preset value.
			try:
				profile, manual = self._cmd.get_active_profile()
				if manual is None:
					manual = profile is not None
			except TunedException as e:
				mode = "unknown"
				error = str(e)
				return mode, error
		mode = consts.ACTIVE_PROFILE_MANUAL if manual else consts.ACTIVE_PROFILE_AUTO
		return mode, ""

	@exports.export("", "s")
	def post_loaded_profile(self, caller = None):
		if caller == "":
			return ""
		return self._daemon.post_loaded_profile or ""

	@exports.export("", "b")
	def disable(self, caller = None):
		if caller == "":
			return False
		if self._daemon.is_running():
			self._daemon.stop()
		if self._daemon.is_enabled():
			self._daemon.set_all_profiles(None, True, None,
						      save_instantly=True)
		return True

	@exports.export("", "b")
	def is_running(self, caller = None):
		if caller == "":
			return False
		return self._daemon.is_running()

	@exports.export("", "as")
	def profiles(self, caller = None):
		if caller == "":
			return []
		return self._daemon.profile_loader.profile_locator.get_known_names()

	@exports.export("", "a(ss)")
	def profiles2(self, caller = None):
		if caller == "":
			return []
		return self._daemon.profile_loader.profile_locator.get_known_names_summary()

	@exports.export("s", "(bsss)")
	def profile_info(self, profile_name, caller = None):
		if caller == "":
			return (False, "", "", "")
		if profile_name is None or profile_name == "":
			profile_name = self.active_profile()
		return tuple(self._daemon.profile_loader.profile_locator.get_profile_attrs(profile_name, [consts.PROFILE_ATTR_SUMMARY, consts.PROFILE_ATTR_DESCRIPTION], [""]))

	@exports.export("", "s")
	def recommend_profile(self, caller = None):
		if caller == "":
			return ""
		return self._daemon.profile_recommender.recommend()

	@exports.export("", "b")
	def verify_profile(self, caller = None):
		if caller == "":
			return False
		return self._daemon.verify_profile(ignore_missing = False)

	@exports.export("", "b")
	def verify_profile_ignore_missing(self, caller = None):
		if caller == "":
			return False
		return self._daemon.verify_profile(ignore_missing = True)

	@exports.export("", "a{sa{ss}}")
	def get_all_plugins(self, caller = None):
		"""Return dictionary with accesible plugins

		Return:
		dictionary -- {plugin_name: {parameter_name: default_value}}
		"""
		if caller == "":
			return False
		plugins = {}
		for plugin_class in self._daemon.get_all_plugins():
			plugin_name = plugin_class.__module__.split(".")[-1].split("_", 1)[1]
			conf_options = plugin_class._get_config_options()
			plugins[plugin_name] = {}
			for key, val in conf_options.items():
				plugins[plugin_name][key] = str(val)
		return plugins

	@exports.export("s","s")
	def get_plugin_documentation(self, plugin_name, caller = None):
		"""Return docstring of plugin's class"""
		if caller == "":
			return False
		return self._daemon.get_plugin_documentation(str(plugin_name))

	@exports.export("s","a{ss}")
	def get_plugin_hints(self, plugin_name, caller = None):
		"""Return dictionary with plugin's parameters and their hints

		Parameters:
		plugin_name -- name of plugin

		Return:
		dictionary -- {parameter_name: hint}
		"""
		if caller == "":
			return False
		return self._daemon.get_plugin_hints(str(plugin_name))

	@exports.export("s", "b")
	def register_socket_signal_path(self, path, caller = None):
		"""Allows to dynamically add sockets to send signals to

		Parameters:
		path -- path to socket to register for sending signals

		Return:
		bool -- True on success
		"""
		if caller == "":
			return False
		if self._daemon._application and self._daemon._application._unix_socket_exporter:
			self._daemon._application._unix_socket_exporter.register_signal_path(path)
			return True
		return False

	# devices - devices to migrate from other instances, a string of the form "dev1,dev2,dev3,..."
	#	or "cpulist:CPULIST", where CPULIST is e.g. "0-3,6,8-9"
	# instance_name - the instance to which the devices should be migrated
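	# Example (illustrative, names assumed): instance_acquire_devices("cpulist:0-3", "my_cpu_instance")
	# would try to move CPUs 0-3 to the instance named 'my_cpu_instance'; a (success, message)
	# pair is returned, matching the "(bs)" output signature below.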
	@exports.export("ss", "(bs)")
	def instance_acquire_devices(self, devices, instance_name, caller = None):
		if caller == "":
			return (False, "Unauthorized")
		found = False
		for instance_target in self._daemon._unit_manager.instances:
			if instance_target.name == instance_name:
				log.debug("Found instance '%s'." % instance_target.name)
				found = True
				break
		if not found:
			rets = "Instance '%s' not found" % instance_name
			log.error(rets)
			return (False, rets)
		devs = set(self._cmd.devstr2devs(devices))
		log.debug("Instance '%s' trying to acquire devices '%s'." % (instance_target.name, str(devs)))
		for instance in self._daemon._unit_manager.instances:
			devs_moving = instance.processed_devices & devs
			if len(devs_moving):
				devs -= devs_moving
				log.info("Moving devices '%s' from instance '%s' to instance '%s'." % (str(devs_moving),
					instance.name, instance_target.name))
				if (instance.plugin.name != instance_target.plugin.name):
					rets = "Target instance '%s' is of type '%s', but devices '%s' are currently handled by " \
						"instance '%s' which is of type '%s'." % (instance_target.name,
						instance_target.plugin.name, str(devs_moving), instance.name, instance.plugin.name)
					log.error(rets)
					return (False, rets)
				instance.plugin._remove_devices_nocheck(instance, devs_moving)
				instance_target.plugin._add_devices_nocheck(instance_target, devs_moving)
		if (len(devs)):
			rets = "Ignoring devices not handled by any instance '%s'." % str(devs)
			log.info(rets)
			return (False, rets)
		return (True, "OK")

	@exports.export("s", "(bsa(ss))")
	def get_instances(self, plugin_name, caller = None):
		"""Return a list of active instances of a plugin or all active instances

		Parameters:
		plugin_name -- name of the plugin or an empty string

		Return:
		bool -- True on success
		string -- error message or "OK"
		list of string pairs -- [(instance_name, plugin_name)]
		"""
		if caller == "":
			return (False, "Unauthorized", [])
		if plugin_name != "" and plugin_name not in self.get_all_plugins().keys():
			rets = "Plugin '%s' does not exist" % plugin_name
			log.error(rets)
			return (False, rets, [])
		instances = filter(lambda instance: instance.active, self._daemon._unit_manager.instances)
		if plugin_name != "":
			instances = filter(lambda instance: instance.plugin.name == plugin_name, instances)
		return (True, "OK", list(map(lambda instance: (instance.name, instance.plugin.name), instances)))

	@exports.export("s", "(bsas)")
	def instance_get_devices(self, instance_name, caller = None):
		"""Return a list of devices assigned to an instance

		Parameters:
		instance_name -- name of the instance

		Return:
		bool -- True on success
		string -- error message or "OK"
		list of strings -- device names
		"""
		if caller == "":
			return (False, "Unauthorized", [])
		for instance in self._daemon._unit_manager.instances:
			if instance.name == instance_name:
				return (True, "OK", sorted(list(instance.processed_devices)))
		rets = "Instance '%s' not found" % instance_name
		log.error(rets)
		return (False, rets, [])
site-packages/tuned/daemon/__pycache__/daemon.cpython-36.pyc000064400000025261147511334660017767 0ustar003

�<�e�3�@s�ddlZddlZddlZddlZddlmZddlmZddl	j
Z
ddlmZddlm
Z
ddlmZddlZejj�ZGdd�de�ZdS)	�N)�TunedException)�InvalidProfileException)�commands)�exports)�ProfileRecommenderc@seZdZd;dd�Zdd�Zdd�Zdd	�Zd
d�Zdd
�Zdd�Z	d<dd�Z
edd��Zedd��Z
edd��Zedd��Zedd��Zdd�Zdd �Zd!d"�Zd#d$�Zd%d&�Zd'd(�Zd)d*�Zd+d,�Zd-d.�Zd/d0�Zd1d2�Zd3d4�Zd5d6�Zd7d8�Zd=d9d:�ZdS)>�DaemonNcCs�tjd�tj|_ttj�|_ttj�|_	tj
|_d|_tj
|_|dk	r�|jtjtj�|_t|jtjtj��|_t|jtjtj��|_	|jtjtj
�|_|jtjtj�|_|jtjtj
�|_||_|jdkr�ttj�|_|j	dkr�d|_n|j	|jkr�|j|_	|j	|j|_tjd|j�|j�rJtjd�tjd|j|j|jf�t|jd�|_||_||_|j�t �|_!y|j"|�Wn2t#k
�r�}ztj$d	|�WYdd}~XnXdS)
Nzinitializing daemonTrFz$using sleep interval of %d second(s)z8dynamic tuning is enabled (can be overridden by plugins)zFusing update interval of %d second(s) (%d times of the sleep interval))Zis_hardcodedz:Cannot set initial profile. No tunings will be enabled: %s)%�log�debug�constsZCFG_DEF_DAEMON�_daemon�intZCFG_DEF_SLEEP_INTERVAL�_sleep_intervalZCFG_DEF_UPDATE_INTERVALZ_update_intervalZCFG_DEF_DYNAMIC_TUNING�_dynamic_tuningZ_recommend_commandZCFG_DEF_ROLLBACK�	_rollbackZget_boolZ
CFG_DAEMON�getZCFG_SLEEP_INTERVALZCFG_UPDATE_INTERVALZCFG_DYNAMIC_TUNINGZCFG_RECOMMEND_COMMANDZCFG_DEF_RECOMMEND_COMMAND�CFG_ROLLBACK�_application�
_sleep_cycles�infor�_profile_recommender�
_unit_manager�_profile_loader�
_init_threadsr�_cmd�
_init_profiler�error)�selfZunit_manager�profile_loader�
profile_names�configZapplication�e�r!�/usr/lib/python3.6/daemon.py�__init__sF



zDaemon.__init__cCsFd|_tj�|_tj�|_tj�|_tj�|_|jj�tj�|_dS)N)	�_thread�	threadingZEvent�
_terminate�_terminate_profile_switch�	_not_used�_sighup_processing�set�_profile_applied)rr!r!r"r7s




zDaemon._init_threadscCs|jd�dS)zARead configuration files again and load profile according to themN)r)rr!r!r"�reload_profile_configCszDaemon.reload_profile_configcCs�d}|jj�}|dkrP|j�\}}|dkrrd}|r<|d7}n|d7}tj|�n"|dkrr|rhtjd�n
tjd�d|_d|_g|_d|_|j	|||�dS)NTz.No profile is preset, running in manual mode. z(Only post-loaded profile will be enabledzNo profile will be enabled.�)
rZget_post_loaded_profile�_get_startup_profilerr�_profile�_manual�_active_profiles�_post_loaded_profile�set_all_profiles)rr�manual�post_loaded_profile�msgr!r!r"rGs&


zDaemon._init_profilecCs�|pd}|j�}|jr2tjd|j�||jg}x:|D]2}||jjj�kr8d|}|j|d|�t|��q8Wy.|r�|j	j
|�|_nd|_||_|j�|_
WnJtk
r�}z.ddj|�|f}|j|d|�t|��WYdd}~XnXdS)Nr-zUsing post-loaded profile '%s'z%Requested profile '%s' doesn't exist.FzCannot load profile(s) '%s': %s� )�splitr2rrrZprofile_locatorZget_known_names�_notify_profile_changedrr�loadr/r0r1r�join)rrr4Zprofile_list�profile�errstrr r!r!r"�_load_profiles`s*

zDaemon._load_profilescCs2|j�r"d}|j|d|�t|��|j||�dS)Nz/Cannot set profile while the daemon is running.F)�
is_runningr9rr>)rrr4r=r!r!r"�set_profilezszDaemon.set_profilecCs4|sd|_n$t|j��dkr*d}t|��n||_dS)N�zYWhitespace is not allowed in profile names; only a single post-loaded profile is allowed.)r2�lenr8r)r�profile_namer=r!r!r"�_set_post_loaded_profile�s
zDaemon._set_post_loaded_profileFcCsV|j�r"d}|j|d|�t|��|j|�|j||�|rR|j||�|j|�dS)Nz/Cannot set profile while the daemon is running.F)r?r9rrDr>�_save_active_profile�_save_post_loaded_profile)rZactive_profilesr4r5Zsave_instantlyr=r!r!r"r3�s
zDaemon.set_all_profilescCs|jS)N)r/)rr!r!r"r<�szDaemon.profilecCs|jS)N)r0)rr!r!r"r4�sz
Daemon.manualcCs|jr|jSdS)N)r/r2)rr!r!r"r5�szDaemon.post_loaded_profilecCs|jS)N)r)rr!r!r"�profile_recommender�szDaemon.profile_recommendercCs|jS)N)r)rr!r!r"r�szDaemon.profile_loadercCs |jdk	rtjtj|||�|S)N)rrZsend_signalr
ZSIGNAL_PROFILE_CHANGED)rr�resultr=r!r!r"r9�s
zDaemon._notify_profile_changedcCsj|jjddgdgd�\}}|dkr&dS|dd�dkr:dS|jjddgdgd�\}}tjd	|�dkoh|S)
NZ	systemctlzis-system-runningr)Z	no_errorsF�Zstoppingz	list-jobsz0\b(shutdown|reboot|halt|poweroff)\.target.*start)rZexecute�re�search)rZretcode�outr!r!r"�_full_rollback_required�szDaemon._full_rollback_requiredcCs�|jdkrtd��|jj|jj�|jdj|j�|j�|j	|j
�|jj�|jj
�tjd|jj�|jrxtj�dj|j�}|j|dd�|jj�|j�r|j}x\|jj|j|j��s|jr�|d8}|dkr�|j}tjd�|jj�tjd	�|jj�q�W|jj�d}x.|jj|j |j��rD|d
k�rD|d7}�qW|j!j"��rZt#j$}njt#j%}|j&��svtjd�nN|j'dk�r�t#j(}tjd
t#j)t#j*f�n$|j�r�tjd�t#j$}n
tjd�|j�r�|jj+|�|jj,�dS)Nz2Cannot start the daemon without setting a profile.r7z'static tuning from profile '%s' appliedTZOKrArzupdating monitorszperforming tunings�z1terminating TuneD due to system shutdown / rebootZnot_on_exitzMterminating TuneD and not rolling back any changes due to '%s' option in '%s'z+terminating TuneD, rolling back all changesz"terminating TuneD in one-shot mode)-r/rrZcreateZunitsrEr;r1r0rFr2Zstart_tuningr+r*rr�namerr�startr9r)�clearrr�waitr&r
rr	Zupdate_monitorsZ
update_tuningr(r'�is_setr
Z
ROLLBACK_FULLZ
ROLLBACK_SOFTrMrZ
ROLLBACK_NONErZGLOBAL_CONFIG_FILEZstop_tuningZdestroy_all)rrZ
_sleep_cnt�iZrollbackr!r!r"�_thread_code�sX







"



zDaemon._thread_codecCsHy|jj||�Wn0tk
rB}ztjt|��WYdd}~XnXdS)N)rZsave_active_profilerrr�str)rrr4r r!r!r"rEszDaemon._save_active_profilecCsFy|jj|�Wn0tk
r@}ztjt|��WYdd}~XnXdS)N)rZsave_post_loaded_profilerrrrV)rrCr r!r!r"rFsz Daemon._save_post_loaded_profilecCs&tjd�|jj�}tjd|�|S)NzWRunning in automatic mode, checking what profile is recommended for your configuration.zUsing '%s' profile)rrrZ	recommend)rr<r!r!r"�_get_recommended_profiles

zDaemon._get_recommended_profilecCs2|jj�\}}|dkr|dk	}|s*|j�}||fS)N)rZget_active_profilerW)rr<r4r!r!r"r.szDaemon._get_startup_profilecCs|jjj�S)z$Return all accessible plugin classes)r�plugins_repositoryZload_all_plugins)rr!r!r"�get_all_plugins"szDaemon.get_all_pluginscCs.y|jjj|�}Wntk
r&dSX|jS)zReturn plugin class docstringr-)rrX�load_plugin�ImportError�__doc__)r�plugin_name�plugin_classr!r!r"�get_plugin_documentation&s
zDaemon.get_plugin_documentationcCs0y|jjj|�}Wntk
r&iSX|j�S)z�Return plugin's parameters and their hints

		Parameters:
		plugin_name -- plugins name

		Return:
		dictionary -- {parameter_name: hint}
		)rrXrZr[Zget_config_options_hints)rr]r^r!r!r"�get_plugin_hints0s	
zDaemon.get_plugin_hintscCs
|jdk	S)N)r/)rr!r!r"�
is_enabledAszDaemon.is_enabledcCs|jdk	o|jj�S)N)r$Zis_alive)rr!r!r"r?DszDaemon.is_runningcCs`|j�rdS|jdkrdStjd�|jj�tj|jd�|_	|j
j�|jj�|j	j
�dS)NFzstarting tuning)�targetT)r?r/rrr(r*r%ZThreadrUr$r'rQr&rP)rr!r!r"rPGs





zDaemon.startcCs||j�stjd�dS|jdkr.tjd�dS|jj�sFtjd�dS|jj�tjd|jj	�|j
j|�}|jj�|S)NzTuneD is not runningFzno profile is setzprofile is not appliedzverifying profile(s): %s)
r?rrr/r+rSr(rQrrOrZ
verify_tuningr*)rZignore_missing�retr!r!r"�verify_profileVs






zDaemon.verify_profilecCsB|j�sdStjd�|r$|jj�|jj�|jj�d|_dS)NFzstopping tuningT)r?rrr'r*r&r$r;)rZprofile_switchr!r!r"�stopls



zDaemon.stop)NNN)F)F) �__name__�
__module__�__qualname__r#rr,rr>r@rDr3�propertyr<r4r5rGrr9rMrUrErFrWr.rYr_r`rar?rPrdrer!r!r!r"rs8
&	

	G
r)�os�errnor%Z
tuned.logsZtunedZtuned.exceptionsrZtuned.profiles.exceptionsrZtuned.constsr
Ztuned.utils.commandsrrZtuned.utils.profile_recommenderrrJZlogsrr�objectrr!r!r!r"�<module>s

site-packages/tuned/daemon/__pycache__/__init__.cpython-36.pyc000064400000000265147511334660020260 0ustar003

�<�eK�@sddlTddlTddlTdS)�)�*N)ZapplicationZ
controllerZdaemon�rr�/usr/lib/python3.6/__init__.py�<module>ssite-packages/tuned/daemon/__pycache__/__init__.cpython-36.opt-1.pyc000064400000000265147511334660021217 0ustar003

�<�eK�@sddlTddlTddlTdS)�)�*N)ZapplicationZ
controllerZdaemon�rr�/usr/lib/python3.6/__init__.py�<module>ssite-packages/tuned/daemon/__pycache__/application.cpython-36.pyc000064400000017516147511334660021033 0ustar003

�<�e��@s�ddlmZmZmZmZmZmZmZddlm	Z	ddl
ZddlZddlm
Z
ddlmZddlZddlZddlZddlZddlZddljZddlmZejj�ZdgZGd	d�de�ZdS)
�)�storage�units�monitors�plugins�profiles�exports�hardware)�TunedExceptionN�)�
controller)�daemon)�GlobalConfig�Applicationc@s�eZdZddd�Zdd�Zdd�Zdd	�Zd
d�Zdd
�Zdd�Z	e
jfdd�Zdd�Z
dd�Ze
jfdd�Zedd��Zedd��Zdd�ZdS)rNc	Csptjdtjjtj�df�d|_d|_t	j
�}t	j|�}|dkrJt�n||_
|j
jtj�rjtjd�n
tjd�tj�}|j
jdtj�}tj|d�}tj�}tj�}	tjj�}
tjj�|_tj|||||	|
|j
|j�}t|j
jtj tj!��}t"j#|||||j
�}
tj�}tj$�}tj%tj&�}tj'||||j
|j�}t(j)|
|||j
|�|_*t+j,|j*|j
�|_-|j.�d|_/dS)NzTuneD: %s, kernel: %s�z8dynamic tuning is enabled (can be overridden in plugins)z#dynamic tuning is globally disabled�udev_buffer_size)�buffer_size)0�log�info�tuned�versionZTUNED_VERSION_STR�os�uname�_dbus_exporter�_unix_socket_exporterrZPickleProviderZFactoryr
�config�get_bool�constsZCFG_DYNAMIC_TUNINGrZ
RepositoryZget_sizeZCFG_DEF_UDEV_BUFFER_SIZErZ	InventoryZ
DeviceMatcherZDeviceMatcherUdevr�instancer�	variables�	Variables�int�getZCFG_DEFAULT_INSTANCE_PRIORITYZ!CFG_DEF_DEFAULT_INSTANCE_PRIORITYrZManagerZMergerZLocatorZLOAD_DIRECTORIES�LoaderrZDaemon�_daemonrZ
Controller�_controller�
_init_signals�	_pid_file)�selfZprofile_namerZstorage_providerZstorage_factoryZmonitors_repositoryrZhardware_inventoryZdevice_matcherZdevice_matcher_udevZplugin_instance_factoryZplugins_repositoryZdef_instance_priorityZunit_managerZprofile_factoryZprofile_mergerZprofile_locatorZprofile_loader�r(�!/usr/lib/python3.6/application.py�__init__s<



zApplication.__init__cs��fdd�}tj�|�dS)Ncs�|kr��dS)Nr()Z_signal_numberZ_frame)�handler�
signal_numberr(r)�handler_wrapper@sz3Application._handle_signal.<locals>.handler_wrapper)�signal)r'r,r+r-r()r+r,r)�_handle_signal?szApplication._handle_signalcCs:|jtj|jj�|jtj|jj�|jtj|jj�dS)N)r/r.�SIGHUPr$Zsighup�SIGINTZ	terminate�SIGTERM)r'r(r(r)r%EszApplication._init_signalscCs6|jdk	rtd��tjj||||�|_tj|j�dS)Nz&DBus interface is already initialized.)rr	rZdbusZDBusExporter�register_exporter)r'Zbus_nameZobject_nameZinterface_name�	namespacer(r(r)�attach_to_dbusJs
zApplication.attach_to_dbuscCsj|jdk	rtd��tjj|jjtj�|jjtj	�|jjtj
�|jjtj�|jjtj
��|_tj|j�dS)Nz-Unix socket interface is already initialized.)rr	rZunix_socketZUnixSocketExporterrr!rZCFG_UNIX_SOCKET_PATHZCFG_UNIX_SOCKET_SIGNAL_PATHSZCFG_UNIX_SOCKET_OWNERSHIPZget_intZCFG_UNIX_SOCKET_PERMISIONSZ#CFG_UNIX_SOCKET_CONNECTIONS_BACKLOGr3)r'r(r(r)�attach_to_unix_socketQs
z!Application.attach_to_unix_socketcCstj|j�dS)N)rZregister_objectr$)r'r(r(r)�register_controller\szApplication.register_controllercCs�tj|�tj|gggtj�\}}}t|�dkrBtj|�td��tj|d�}tj|�t|�dkrltd��ytj	d|�d}Wntj
k
r�td��YnX|dkr�td	��d
S)z|
		Wait till the child signalizes that the initialization is complete by writing
		some uninteresting data into the pipe.
		r
z=Cannot daemonize, timeout when waiting for the child process.�rz:Cannot daemonize, no response from child process received.�?z?Cannot daemonize, invalid response from child process received.Tz0Cannot daemonize, child process reports failure.N)r�close�selectrZDAEMONIZE_PARENT_TIMEOUT�lenr	�read�struct�unpack�error)r'�parent_in_fd�child_out_fdZ
read_readyZdropZresponse�valr(r(r)�_daemonize_parent_s


zApplication._daemonize_parentcCs�||_|j�yltjj|j�}tjj|�s4tj|�tjtj|jtj	tj
BtjBd�d��}|jdtj
��WdQRXWn>ttfk
r�}ztjd|jt|�f�WYdd}~XnXdS)Ni��wz%dzcannot write the PID to %s: %s)r&�_delete_pid_filer�path�dirname�exists�makedirs�fdopen�open�O_CREAT�O_TRUNC�O_WRONLY�write�getpid�OSError�IOErrorr�critical�str)r'�pid_fileZdir_name�fr@r(r(r)�write_pid_filexs
( zApplication.write_pid_filecCs^tjj|j�rZytj|j�Wn:tk
rX}ztjd|jt|�f�WYdd}~XnXdS)Nz&cannot remove existing PID file %s, %s)	rrGrIr&�unlinkrRrZwarningrU)r'r@r(r(r)rF�s
zApplication._delete_pid_filecCs*tj|�tjd�tj�tjd�ytj�}|dkrBtjd�Wn^tk
r�}zBt	j
dt|��tj
dd�}tj||�tj|�td��WYdd}~XnXtdd	�}tj|j�tjj��tj|j�tjj��tj|j�tjj��|j|�t	jd
�tj
dd�}tj||�tj|�dS)zy
		Finishes daemonizing process, writes a PID file and signalizes to the parent
		that the initialization is complete.
		�/rz"cannot daemonize, fork() error: %sr9Fz'Cannot daemonize, second fork() failed.Nz	/dev/nullzw+zsuccessfully daemonizedT)rr:�chdir�setsid�umask�fork�sys�exitrRrrTrUr>�packrPr	rL�dup2�fileno�stdin�stdout�stderrrX�debug)r'rVrArB�pidr@rC�fdr(r(r)�_daemonize_child�s.






zApplication._daemonize_childcCs�tj�}ytj�}WnFtk
rZ}z*tj|d�tj|d�td��WYdd}~XnXy2|dkr||j|�tjd�n|j	|f|��Wn"|dkr��n
tjd�YnXdS)z�
		Daemonizes the application. In case of failure, TunedException is raised
		in the parent process. If the operation is successfull, the main process
		is terminated and only child process returns from this method.
		rr
z Cannot daemonize, fork() failed.N)
r�piper^rRr:r	rDr_r`rj)r'rVZparent_child_fdsZ	child_pidr@r(r(r)�	daemonize�s 
zApplication.daemonizecCs|jS)N)r#)r'r(r(r)r�szApplication.daemoncCs|jS)N)r$)r'r(r(r)r�szApplication.controllercCsj|r|jjtjd�|jjtjtj�s0tjd�|jj	�}|jjtjtj�rTt
j�|jdk	rf|j
�|S)NTzrUsing one shot no daemon mode, most of the functionality will be not available, it can be changed in global config)r�setrZ
CFG_DAEMONrZCFG_DEF_DAEMONr�warnr$�runr�stopr&rF)r'r�resultr(r(r)ro�s


zApplication.run)NN)�__name__�
__module__�__qualname__r*r/r%r5r6r7rDrZPID_FILErXrFrjrl�propertyrrror(r(r(r)rs
+
")rrrrrrrrZtuned.exceptionsr	Z
tuned.logsZ
tuned.version�rrr.rr_r;r>Ztuned.constsrZtuned.utils.global_configr
Zlogsr!r�__all__�objectrr(r(r(r)�<module>s$

site-packages/tuned/daemon/__pycache__/daemon.cpython-36.opt-1.pyc000064400000025261147511334660020726 0ustar003

�<�e�3�@s�ddlZddlZddlZddlZddlmZddlmZddl	j
Z
ddlmZddlm
Z
ddlmZddlZejj�ZGdd�de�ZdS)	�N)�TunedException)�InvalidProfileException)�commands)�exports)�ProfileRecommenderc@seZdZd;dd�Zdd�Zdd�Zdd	�Zd
d�Zdd
�Zdd�Z	d<dd�Z
edd��Zedd��Z
edd��Zedd��Zedd��Zdd�Zdd �Zd!d"�Zd#d$�Zd%d&�Zd'd(�Zd)d*�Zd+d,�Zd-d.�Zd/d0�Zd1d2�Zd3d4�Zd5d6�Zd7d8�Zd=d9d:�ZdS)>�DaemonNcCs�tjd�tj|_ttj�|_ttj�|_	tj
|_d|_tj
|_|dk	r�|jtjtj�|_t|jtjtj��|_t|jtjtj��|_	|jtjtj
�|_|jtjtj�|_|jtjtj
�|_||_|jdkr�ttj�|_|j	dkr�d|_n|j	|jkr�|j|_	|j	|j|_tjd|j�|j�rJtjd�tjd|j|j|jf�t|jd�|_||_||_|j�t �|_!y|j"|�Wn2t#k
�r�}ztj$d	|�WYdd}~XnXdS)
Nzinitializing daemonTrFz$using sleep interval of %d second(s)z8dynamic tuning is enabled (can be overridden by plugins)zFusing update interval of %d second(s) (%d times of the sleep interval))Zis_hardcodedz:Cannot set initial profile. No tunings will be enabled: %s)%�log�debug�constsZCFG_DEF_DAEMON�_daemon�intZCFG_DEF_SLEEP_INTERVAL�_sleep_intervalZCFG_DEF_UPDATE_INTERVALZ_update_intervalZCFG_DEF_DYNAMIC_TUNING�_dynamic_tuningZ_recommend_commandZCFG_DEF_ROLLBACK�	_rollbackZget_boolZ
CFG_DAEMON�getZCFG_SLEEP_INTERVALZCFG_UPDATE_INTERVALZCFG_DYNAMIC_TUNINGZCFG_RECOMMEND_COMMANDZCFG_DEF_RECOMMEND_COMMAND�CFG_ROLLBACK�_application�
_sleep_cycles�infor�_profile_recommender�
_unit_manager�_profile_loader�
_init_threadsr�_cmd�
_init_profiler�error)�selfZunit_manager�profile_loader�
profile_names�configZapplication�e�r!�/usr/lib/python3.6/daemon.py�__init__sF



zDaemon.__init__cCsFd|_tj�|_tj�|_tj�|_tj�|_|jj�tj�|_dS)N)	�_thread�	threadingZEvent�
_terminate�_terminate_profile_switch�	_not_used�_sighup_processing�set�_profile_applied)rr!r!r"r7s




zDaemon._init_threadscCs|jd�dS)zARead configuration files again and load profile according to themN)r)rr!r!r"�reload_profile_configCszDaemon.reload_profile_configcCs�d}|jj�}|dkrP|j�\}}|dkrrd}|r<|d7}n|d7}tj|�n"|dkrr|rhtjd�n
tjd�d|_d|_g|_d|_|j	|||�dS)NTz.No profile is preset, running in manual mode. z(Only post-loaded profile will be enabledzNo profile will be enabled.�)
rZget_post_loaded_profile�_get_startup_profilerr�_profile�_manual�_active_profiles�_post_loaded_profile�set_all_profiles)rr�manual�post_loaded_profile�msgr!r!r"rGs&


zDaemon._init_profilecCs�|pd}|j�}|jr2tjd|j�||jg}x:|D]2}||jjj�kr8d|}|j|d|�t|��q8Wy.|r�|j	j
|�|_nd|_||_|j�|_
WnJtk
r�}z.ddj|�|f}|j|d|�t|��WYdd}~XnXdS)Nr-zUsing post-loaded profile '%s'z%Requested profile '%s' doesn't exist.FzCannot load profile(s) '%s': %s� )�splitr2rrrZprofile_locatorZget_known_names�_notify_profile_changedrr�loadr/r0r1r�join)rrr4Zprofile_list�profile�errstrr r!r!r"�_load_profiles`s*

zDaemon._load_profilescCs2|j�r"d}|j|d|�t|��|j||�dS)Nz/Cannot set profile while the daemon is running.F)�
is_runningr9rr>)rrr4r=r!r!r"�set_profilezszDaemon.set_profilecCs4|sd|_n$t|j��dkr*d}t|��n||_dS)N�zYWhitespace is not allowed in profile names; only a single post-loaded profile is allowed.)r2�lenr8r)r�profile_namer=r!r!r"�_set_post_loaded_profile�s
zDaemon._set_post_loaded_profileFcCsV|j�r"d}|j|d|�t|��|j|�|j||�|rR|j||�|j|�dS)Nz/Cannot set profile while the daemon is running.F)r?r9rrDr>�_save_active_profile�_save_post_loaded_profile)rZactive_profilesr4r5Zsave_instantlyr=r!r!r"r3�s
zDaemon.set_all_profilescCs|jS)N)r/)rr!r!r"r<�szDaemon.profilecCs|jS)N)r0)rr!r!r"r4�sz
Daemon.manualcCs|jr|jSdS)N)r/r2)rr!r!r"r5�szDaemon.post_loaded_profilecCs|jS)N)r)rr!r!r"�profile_recommender�szDaemon.profile_recommendercCs|jS)N)r)rr!r!r"r�szDaemon.profile_loadercCs |jdk	rtjtj|||�|S)N)rrZsend_signalr
ZSIGNAL_PROFILE_CHANGED)rr�resultr=r!r!r"r9�s
zDaemon._notify_profile_changedcCsj|jjddgdgd�\}}|dkr&dS|dd�dkr:dS|jjddgdgd�\}}tjd	|�dkoh|S)
NZ	systemctlzis-system-runningr)Z	no_errorsF�Zstoppingz	list-jobsz0\b(shutdown|reboot|halt|poweroff)\.target.*start)rZexecute�re�search)rZretcode�outr!r!r"�_full_rollback_required�szDaemon._full_rollback_requiredcCs�|jdkrtd��|jj|jj�|jdj|j�|j�|j	|j
�|jj�|jj
�tjd|jj�|jrxtj�dj|j�}|j|dd�|jj�|j�r|j}x\|jj|j|j��s|jr�|d8}|dkr�|j}tjd�|jj�tjd	�|jj�q�W|jj�d}x.|jj|j |j��rD|d
k�rD|d7}�qW|j!j"��rZt#j$}njt#j%}|j&��svtjd�nN|j'dk�r�t#j(}tjd
t#j)t#j*f�n$|j�r�tjd�t#j$}n
tjd�|j�r�|jj+|�|jj,�dS)Nz2Cannot start the daemon without setting a profile.r7z'static tuning from profile '%s' appliedTZOKrArzupdating monitorszperforming tunings�z1terminating TuneD due to system shutdown / rebootZnot_on_exitzMterminating TuneD and not rolling back any changes due to '%s' option in '%s'z+terminating TuneD, rolling back all changesz"terminating TuneD in one-shot mode)-r/rrZcreateZunitsrEr;r1r0rFr2Zstart_tuningr+r*rr�namerr�startr9r)�clearrr�waitr&r
rr	Zupdate_monitorsZ
update_tuningr(r'�is_setr
Z
ROLLBACK_FULLZ
ROLLBACK_SOFTrMrZ
ROLLBACK_NONErZGLOBAL_CONFIG_FILEZstop_tuningZdestroy_all)rrZ
_sleep_cnt�iZrollbackr!r!r"�_thread_code�sX







"



zDaemon._thread_codecCsHy|jj||�Wn0tk
rB}ztjt|��WYdd}~XnXdS)N)rZsave_active_profilerrr�str)rrr4r r!r!r"rEszDaemon._save_active_profilecCsFy|jj|�Wn0tk
r@}ztjt|��WYdd}~XnXdS)N)rZsave_post_loaded_profilerrrrV)rrCr r!r!r"rFsz Daemon._save_post_loaded_profilecCs&tjd�|jj�}tjd|�|S)NzWRunning in automatic mode, checking what profile is recommended for your configuration.zUsing '%s' profile)rrrZ	recommend)rr<r!r!r"�_get_recommended_profiles

zDaemon._get_recommended_profilecCs2|jj�\}}|dkr|dk	}|s*|j�}||fS)N)rZget_active_profilerW)rr<r4r!r!r"r.szDaemon._get_startup_profilecCs|jjj�S)z$Return all accessible plugin classes)r�plugins_repositoryZload_all_plugins)rr!r!r"�get_all_plugins"szDaemon.get_all_pluginscCs.y|jjj|�}Wntk
r&dSX|jS)zReturn plugin class docstringr-)rrX�load_plugin�ImportError�__doc__)r�plugin_name�plugin_classr!r!r"�get_plugin_documentation&s
zDaemon.get_plugin_documentationcCs0y|jjj|�}Wntk
r&iSX|j�S)z�Return plugin's parameters and their hints

		Parameters:
		plugin_name -- plugins name

		Return:
		dictionary -- {parameter_name: hint}
		)rrXrZr[Zget_config_options_hints)rr]r^r!r!r"�get_plugin_hints0s	
zDaemon.get_plugin_hintscCs
|jdk	S)N)r/)rr!r!r"�
is_enabledAszDaemon.is_enabledcCs|jdk	o|jj�S)N)r$Zis_alive)rr!r!r"r?DszDaemon.is_runningcCs`|j�rdS|jdkrdStjd�|jj�tj|jd�|_	|j
j�|jj�|j	j
�dS)NFzstarting tuning)�targetT)r?r/rrr(r*r%ZThreadrUr$r'rQr&rP)rr!r!r"rPGs





zDaemon.startcCs||j�stjd�dS|jdkr.tjd�dS|jj�sFtjd�dS|jj�tjd|jj	�|j
j|�}|jj�|S)NzTuneD is not runningFzno profile is setzprofile is not appliedzverifying profile(s): %s)
r?rrr/r+rSr(rQrrOrZ
verify_tuningr*)rZignore_missing�retr!r!r"�verify_profileVs






zDaemon.verify_profilecCsB|j�sdStjd�|r$|jj�|jj�|jj�d|_dS)NFzstopping tuningT)r?rrr'r*r&r$r;)rZprofile_switchr!r!r"�stopls



zDaemon.stop)NNN)F)F) �__name__�
__module__�__qualname__r#rr,rr>r@rDr3�propertyr<r4r5rGrr9rMrUrErFrWr.rYr_r`rar?rPrdrer!r!r!r"rs8
&	

	G
r)�os�errnor%Z
tuned.logsZtunedZtuned.exceptionsrZtuned.profiles.exceptionsrZtuned.constsr
Ztuned.utils.commandsrrZtuned.utils.profile_recommenderrrJZlogsrr�objectrr!r!r!r"�<module>s

site-packages/tuned/daemon/__pycache__/controller.cpython-36.pyc000064400000031757147511334660020716 0ustar003

�<�eu2�@s�ddlmZddlZddlZddlmZddlZddljZddlm	Z	dgZ
ejj�Z
Gdd�de�ZGdd�dejjj�ZdS)	�)�exportsN)�TunedException)�commands�
Controllerc@s,eZdZdd�Zdd�Zdd�Zdd�Zd	S)
�
TimerStorecCst�|_tj�|_dS)N)�dict�_timers�	threadingZLock�_timers_lock)�self�r� /usr/lib/python3.6/controller.py�__init__szTimerStore.__init__c
Cs |j�||j|<WdQRXdS)N)r
r)r�token�timerrrr
�store_timerszTimerStore.store_timercCsB|j�2y|j|}|j�|j|=WnYnXWdQRXdS)N)r
r�cancel)rrrrrr
�
drop_timers
zTimerStore.drop_timerc	Cs<|j�,x|jj�D]}|j�qW|jj�WdQRXdS)N)r
r�valuesr�clear)rrrrr
�
cancel_allszTimerStore.cancel_allN)�__name__�
__module__�__qualname__rrrrrrrr
r
s	rcs�eZdZdZ�fdd�Zdd�Zdd�Zdd	�Zej	d
�dd��Z
d
d�Zejdd�dUdd��Z
ejdd�dVdd��Zejdd�dWdd��ZdXdd�Zejdd�dYdd��Zejdd�dZdd ��Zd!d"�Zejdd#�d[d$d%��Zejdd#�d\d&d'��Zejdd�d]d(d)��Zejdd*�d^d+d,��Zejdd�d_d-d.��Zejdd�d`d/d0��Zejdd�dad1d2��Zejdd3�dbd4d5��Zejdd6�dcd7d8��Zejdd9�ddd:d;��Zejdd�ded<d=��Zejdd�dfd>d?��Zejdd�dgd@dA��Z ejddB�dhdCdD��Z!ejdd�didEdF��Z"ejddG�djdHdI��Z#ejdd�dkdJdK��Z$ejdLd#�dldMdN��Z%ejddO�dmdPdQ��Z&ejddR�dndSdT��Z'�Z(S)orz�
	Controller's purpose is to keep the program running, start/stop the tuning,
	and export the controller interface (currently only over D-Bus).
	cs8tt|�j�||_||_tj�|_t�|_	t
�|_dS)N)�superrr�_daemon�_global_configr	ZEvent�
_terminater�_cmdr�_timer_store)r�daemonZ
global_config)�	__class__rr
r+s
zController.__init__cCsxtjd�|j�}|jjtjtj�}|r6|r6tj�|rb|j	j
�x|jj|j	d�s`tj
�qFWtjd�|j�dS)z1
		Controller main loop. The call is blocking.
		zstarting controller�zterminating controllerN)�log�info�startr�get_bool�consts�
CFG_DAEMON�CFG_DEF_DAEMONrrrr�waitZperiod_check�stop)r�resr rrr
�run3s



zController.runcCs|jj�dS)N)r�set)rrrr
�	terminateFszController.terminatecCs0|jjj�s,|jjj�|j�s,|jjj�dS)N)rZ_sighup_processingZis_setr.�reloadr)rrrr
�sighupIszController.sighupZsbscCsdS)Nr)r�profile_name�resultZerrstrrrr
�profile_changedOszController.profile_changedcCstjj|�|jj|�dS)N)�tuned�logs�log_capture_finishrr)rrrrr
�_log_capture_abortXszController._log_capture_abortZii�sNcCsf|dkrdStjj|�}|dkr$dS|dkrVtj||j|gd�}|jj||�|j�|dkrbdS|S)N�r)�args)	r5r6�log_capture_startr	ZTimerr8rrr%)rZ	log_levelZtimeout�callerrrrrr
r<\szController.log_capture_startcCs4|dkrdStjj|�}|jj|�|dkr0dS|S)Nr:)r5r6r7rr)rrr=r,rrr
r7js
zController.log_capture_finishr:�bcCsD|dkrdS|jjtjtj�r:|jj�r,dS|jj�s:dS|jj�S)Nr:FT)	rr&r'r(r)r�
is_running�
is_enabledr%)rr=rrr
r%rs

zController.startFcCs,|jj�sd}n|jj|d�}|jj�|S)NT)�profile_switch)rr?r+rr)rrAr,rrr
�_stop}s


zController._stopcCs|dkrdS|jdd�S)Nr:F)rA)rB)rr=rrr
r+�szController.stopcCsp|dkrdS|jj�r*|jdd�}|s*dSy|jj�Wn.tk
rf}ztjd|�dSd}~XnX|j�S)Nr:FT)rAzFailed to reload TuneD: %s)rr?rBZreload_profile_configrr#�errorr%)rr=Zstop_ok�errr
r0�s
zController.reloadcCs�|jj�}d}d}d}z�y$|r,|jjdd�|jj||�Wnftjjk
r�}zFd}t|�}|r�|jjj	|kr�t
jd|�d}nt
jd|�WYdd}~XnXWd|r�|r�t
jd|�n|s�t
j
d�|jj�X||fS)	N�OKTF)rAz@Failed to reapply profile '%s'. Did it change on disk and break?zFailed to apply profile '%s'z>Applying previously applied (possibly out-dated) profile '%s'.z$Applying previously applied profile.)rr?r+Zset_profiler5�
exceptionsr�str�profile�namer#rC�warnr$r%)rr2�manualZwas_running�msg�successZreapplyrDrrr
�_switch_profile�s,
$
zController._switch_profilez(bs)cCs|dkrdS|j|d�S)Nr:F�UnauthorizedT)FrO)rN)rr2r=rrr
�switch_profile�szController.switch_profilecCs |dkrdS|j�}|j|d�S)Nr:FrO)FrO)�recommend_profilerN)rr=r2rrr
�auto_profile�szController.auto_profilecCs*|dkrdS|jjdk	r"|jjjSdSdS)Nr:)rrHrI)rr=rrr
�active_profile�s

zController.active_profilez(ss)cCs�|dkrdS|jj}|dkrpy"|jj�\}}|dkr<|dk	}Wn0tk
rn}zd}t|�}||fSd}~XnX|rztjntj}|dfS)Nr:�unknownrO)rTrO)	rrKrZget_active_profilerrGr'ZACTIVE_PROFILE_MANUALZACTIVE_PROFILE_AUTO)rr=rKrHrD�moderCrrr
�profile_mode�szController.profile_modecCs|dkrdS|jjpdS)Nr:)r�post_loaded_profile)rr=rrr
rW�szController.post_loaded_profilecCsB|dkrdS|jj�r |jj�|jj�r>|jjddddd�dS)Nr:FT)Zsave_instantly)rr?r+r@Zset_all_profiles)rr=rrr
�disable�s


zController.disablecCs|dkrdS|jj�S)Nr:F)rr?)rr=rrr
r?�szController.is_running�ascCs|dkrgS|jjjj�S)Nr:)r�profile_loader�profile_locatorZget_known_names)rr=rrr
�profiles�szController.profilesza(ss)cCs|dkrgS|jjjj�S)Nr:)rrZr[Zget_known_names_summary)rr=rrr
�	profiles2�szController.profiles2z(bsss)cCsP|dkrtdddd�S|dks&|dkr.|j�}t|jjjj|tjtjgdg��S)Nr:F)	�tuplerSrrZr[Zget_profile_attrsr'ZPROFILE_ATTR_SUMMARYZPROFILE_ATTR_DESCRIPTION)rr2r=rrr
�profile_infos
zController.profile_infocCs|dkrdS|jjj�S)Nr:)rZprofile_recommenderZ	recommend)rr=rrr
rQszController.recommend_profilecCs|dkrdS|jjdd�S)Nr:F)�ignore_missing)r�verify_profile)rr=rrr
raszController.verify_profilecCs|dkrdS|jjdd�S)Nr:FT)r`)rra)rr=rrr
�verify_profile_ignore_missingsz(Controller.verify_profile_ignore_missingz	a{sa{ss}}cCsz|dkrdSi}xd|jj�D]V}|jjd�djdd�d}|j�}i||<x$|j�D]\}}t|�|||<qVWqW|S)zuReturn dictionary with accesible plugins

		Return:
		dictionary -- {plugin_name: {parameter_name: default_value}}
		r:F�.r"�_���)r�get_all_pluginsr�splitZ_get_config_options�itemsrG)rr=ZpluginsZplugin_class�plugin_nameZconf_options�key�valrrr
rfszController.get_all_pluginscCs|dkrdS|jjt|��S)z"Return docstring of plugin's classr:F)r�get_plugin_documentationrG)rrir=rrr
rl,sz#Controller.get_plugin_documentationza{ss}cCs|dkrdS|jjt|��S)z�Return dictionary with plugin's parameters and their hints

		Parameters:
		plugin_name -- name of plugin

		Return:
		dictionary -- {parameter_name: hint}
		r:F)r�get_plugin_hintsrG)rrir=rrr
rm3s
zController.get_plugin_hintscCs6|dkrdS|jjr2|jjjr2|jjjj|�dSdS)z�Allows to dynamically add sockets to send signals to

		Parameters:
		path -- path to socket to register for sending signals

		Return:
		bool -- True on success
		r:FT)rZ_applicationZ_unix_socket_exporterZregister_signal_path)r�pathr=rrr
�register_socket_signal_pathAs
z&Controller.register_socket_signal_pathZssc
Csb|dkrdSd}x2|jjjD]$}|j|krtjd|j�d}PqW|sbd|}tj|�d|fSt|jj	|��}tjd|jt
|�f�x�|jjjD]�}|j|@}	t|	�r�||	8}tj
dt
|	�|j|jf�|jj|jjk�rd	|j|jjt
|	�|j|jjf}tj|�d|fS|jj||	�|jj||	�q�Wt|��r^d
t
|�}tj
|�d|fSd
S)Nr:FrOzFound instance '%s'.TzInstance '%s' not foundz-Instance '%s' trying to acquire devices '%s'.z8Moving devices '%s' from instance '%s' to instance '%s'.ztTarget instance '%s' is of type '%s', but devices '%s' are currently handled by instance '%s' which is of type '%s'.z2Ignoring devices not handled by any instance '%s'.rE)FrO)TrE)r�
_unit_manager�	instancesrIr#�debugrCr.rZdevstr2devsrG�processed_devices�lenr$�pluginZ_remove_devices_nocheckZ_add_devices_nocheck)
rZdevices�
instance_namer=�foundZinstance_target�retsZdevs�instanceZdevs_movingrrr
�instance_acquire_devicesUsB





z#Controller.instance_acquire_devicesz	(bsa(ss))cs�|dkrddgfS�dkrF�|j�j�krFd�}tj|�d|gfStdd�|jjj�}�dkrtt�fdd�|�}dd	tt	d
d�|��fS)aReturn a list of active instances of a plugin or all active instances

		Parameters:
		plugin_name -- name of the plugin or an empty string

		Return:
		bool -- True on success
		string -- error message or "OK"
		list of string pairs -- [(instance_name, plugin_name)]
		r:FrOzPlugin '%s' does not existcSs|jS)N)Zactive)ryrrr
�<lambda>�sz*Controller.get_instances.<locals>.<lambda>cs|jj�kS)N)rurI)ry)rirr
r{�sTrEcSs|j|jjfS)N)rIru)ryrrr
r{�s)
rf�keysr#rC�filterrrprq�list�map)rrir=rxrqr)rir
�
get_instancesys


zController.get_instancesz(bsas)cCs`|dkrddgfSx0|jjjD]"}|j|krddtt|j��fSqWd|}tj|�d|gfS)z�Return a list of devices assigned to an instance

		Parameters:
		instance_name -- name of the instance

		Return:
		bool -- True on success
		string -- error message or "OK"
		list of strings -- device names
		r:FrOTrEzInstance '%s' not found)	rrprqrI�sortedr~rsr#rC)rrvr=ryrxrrr
�instance_get_devices�s


zController.instance_get_devices)N)N)N)F)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N))rrr�__doc__rr-r/r1r�signalr4r8Zexportr<r7r%rBr+r0rNrPrRrSrVrWrXr?r\r]r_rQrarbrfrlrmrorzr�r��
__classcell__rr)r!r
r%sv	

























#

)r5rZ
tuned.logsZtuned.exceptionsrr	Ztuned.constsr'Ztuned.utils.commandsr�__all__r6�getr#�objectrZ
interfacesZExportableInterfacerrrrr
�<module>s

site-packages/tuned/daemon/__pycache__/controller.cpython-36.opt-1.pyc  [binary .pyc content omitted]
site-packages/tuned/daemon/__pycache__/application.cpython-36.opt-1.pyc  [binary .pyc content omitted]
site-packages/tuned/logs.py
import atexit
import logging
import logging.handlers
import os
import os.path
import inspect
import tuned.consts as consts
import random
import string
import threading
try:
	from StringIO import StringIO
except:
	from io import StringIO

__all__ = ["get"]

root_logger = None

log_handlers = {}
log_handlers_lock = threading.Lock()

class LogHandler(object):
	def __init__(self, handler, stream):
		self.handler = handler
		self.stream = stream

def _random_string(length):
	r = random.SystemRandom()
	chars = string.ascii_letters + string.digits
	res = ""
	for i in range(length):
		res += r.choice(chars)
	return res

def log_capture_start(log_level):
	with log_handlers_lock:
		for i in range(10):
			token = _random_string(16)
			if token not in log_handlers:
				break
		else:
			return None
		stream = StringIO()
		handler = logging.StreamHandler(stream)
		handler.setLevel(log_level)
		formatter = logging.Formatter(
				"%(levelname)-8s %(name)s: %(message)s")
		handler.setFormatter(formatter)
		root_logger.addHandler(handler)
		log_handler = LogHandler(handler, stream)
		log_handlers[token] = log_handler
		root_logger.debug("Added log handler %s." % token)
		return token

def log_capture_finish(token):
	with log_handlers_lock:
		try:
			log_handler = log_handlers[token]
		except KeyError:
			return None
		content = log_handler.stream.getvalue()
		log_handler.stream.close()
		root_logger.removeHandler(log_handler.handler)
		del log_handlers[token]
		root_logger.debug("Removed log handler %s." % token)
		return content

def get():
	global root_logger
	if root_logger is None:
		root_logger = logging.getLogger("tuned")

	calling_module = inspect.currentframe().f_back
	name = calling_module.f_locals["__name__"]
	if name == "__main__":
		name = "tuned"
		return root_logger
	elif name.startswith("tuned."):
		(root, child) = name.split(".", 1)
		child_logger = root_logger.getChild(child)
		child_logger.remove_all_handlers()
		child_logger.setLevel("NOTSET")
		return child_logger
	else:
		assert False

class TunedLogger(logging.getLoggerClass()):
	"""Custom TuneD daemon logger class."""
	_formatter = logging.Formatter("%(asctime)s %(levelname)-8s %(name)s: %(message)s")
	_console_handler = None
	_file_handler = None

	def __init__(self, *args, **kwargs):
		super(TunedLogger, self).__init__(*args, **kwargs)
		self.setLevel(logging.INFO)
		self.switch_to_console()

	def console(self, msg, *args, **kwargs):
		self.log(consts.LOG_LEVEL_CONSOLE, msg, *args, **kwargs)

	def switch_to_console(self):
		self._setup_console_handler()
		self.remove_all_handlers()
		self.addHandler(self._console_handler)

	def switch_to_file(self, filename = consts.LOG_FILE, 
			   maxBytes = consts.LOG_FILE_MAXBYTES,
			   backupCount = consts.LOG_FILE_COUNT):
		self._setup_file_handler(filename, maxBytes, backupCount)
		self.remove_all_handlers()
		self.addHandler(self._file_handler)

	def remove_all_handlers(self):
		_handlers = self.handlers
		for handler in _handlers:
			self.removeHandler(handler)

	@classmethod
	def _setup_console_handler(cls):
		if cls._console_handler is not None:
			return

		cls._console_handler = logging.StreamHandler()
		cls._console_handler.setFormatter(cls._formatter)

	@classmethod
	def _setup_file_handler(cls, filename, maxBytes, backupCount):
		if cls._file_handler is not None:
			return

		log_directory = os.path.dirname(filename)
		if log_directory == '':
			log_directory = '.'
		if not os.path.exists(log_directory):
			os.makedirs(log_directory)

		cls._file_handler = logging.handlers.RotatingFileHandler(
			filename, maxBytes = int(maxBytes), backupCount = int(backupCount))
		cls._file_handler.setFormatter(cls._formatter)

logging.addLevelName(consts.LOG_LEVEL_CONSOLE, consts.LOG_LEVEL_CONSOLE_NAME)
logging.setLoggerClass(TunedLogger)
atexit.register(logging.shutdown)
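
The capture helpers above are what the daemon's log-capture calls build on: log_capture_start() attaches an in-memory StreamHandler to the root "tuned" logger and hands back a token, log_capture_finish() detaches it and returns everything it collected. A minimal usage sketch follows (not a file from the archive); it assumes the tuned package is importable and that it is run as a top-level script, since get() only returns the root logger for __main__ or tuned.* modules.

# Sketch (not from the archive): capture tuned log output in memory.
import logging
import tuned.logs

log = tuned.logs.get()                              # root "tuned" logger when run as a script
token = tuned.logs.log_capture_start(logging.INFO)  # None means no free capture slot was found
if token is not None:
	log.info("applying profile")
	captured = tuned.logs.log_capture_finish(token)
	print(captured)                                 # e.g. "INFO     tuned: applying profile"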
site-packages/tuned/units/manager.py
import collections
import os
import re
import traceback
import tuned.exceptions
import tuned.logs
import tuned.plugins.exceptions
import tuned.consts as consts
from tuned.utils.global_config import GlobalConfig
from tuned.utils.commands import commands

log = tuned.logs.get()

__all__ = ["Manager"]

class Manager(object):
	"""
	Manager creates plugin instances and keeps a track of them.
	"""

	def __init__(self, plugins_repository, monitors_repository,
			def_instance_priority, hardware_inventory, config = None):
		super(Manager, self).__init__()
		self._plugins_repository = plugins_repository
		self._monitors_repository = monitors_repository
		self._def_instance_priority = def_instance_priority
		self._hardware_inventory = hardware_inventory
		self._instances = []
		self._plugins = []
		self._config = config or GlobalConfig()
		self._cmd = commands()

	@property
	def plugins(self):
		return self._plugins

	@property
	def instances(self):
		return self._instances

	@property
	def plugins_repository(self):
		return self._plugins_repository

	def _unit_matches_cpuinfo(self, unit):
		if unit.cpuinfo_regex is None:
			return True
		cpuinfo_string = self._config.get(consts.CFG_CPUINFO_STRING)
		if cpuinfo_string is None:
			cpuinfo_string = self._cmd.read_file("/proc/cpuinfo")
		return re.search(unit.cpuinfo_regex, cpuinfo_string,
				re.MULTILINE) is not None

	def _unit_matches_uname(self, unit):
		if unit.uname_regex is None:
			return True
		uname_string = self._config.get(consts.CFG_UNAME_STRING)
		if uname_string is None:
			uname_string = " ".join(os.uname())
		return re.search(unit.uname_regex, uname_string,
				re.MULTILINE) is not None

	def create(self, instances_config):
		instance_info_list = []
		for instance_name, instance_info in list(instances_config.items()):
			if not instance_info.enabled:
				log.debug("skipping disabled instance '%s'" % instance_name)
				continue
			if not self._unit_matches_cpuinfo(instance_info):
				log.debug("skipping instance '%s', cpuinfo does not match" % instance_name)
				continue
			if not self._unit_matches_uname(instance_info):
				log.debug("skipping instance '%s', uname does not match" % instance_name)
				continue

			instance_info.options.setdefault("priority", self._def_instance_priority)
			instance_info.options["priority"] = int(instance_info.options["priority"])
			instance_info_list.append(instance_info)

		instance_info_list.sort(key=lambda x: x.options["priority"])
		plugins_by_name = collections.OrderedDict()
		for instance_info in instance_info_list:
			instance_info.options.pop("priority")
			plugins_by_name[instance_info.type] = None

		for plugin_name, none in list(plugins_by_name.items()):
			try:
				plugin = self._plugins_repository.create(plugin_name)
				plugins_by_name[plugin_name] = plugin
				self._plugins.append(plugin)
			except tuned.plugins.exceptions.NotSupportedPluginException:
				log.info("skipping plugin '%s', not supported on your system" % plugin_name)
				continue
			except Exception as e:
				log.error("failed to initialize plugin %s" % plugin_name)
				log.exception(e)
				continue

		instances = []
		for instance_info in instance_info_list:
			plugin = plugins_by_name[instance_info.type]
			if plugin is None:
				continue
			log.debug("creating '%s' (%s)" % (instance_info.name, instance_info.type))
			new_instance = plugin.create_instance(instance_info.name, instance_info.devices, instance_info.devices_udev_regex, \
				instance_info.script_pre, instance_info.script_post, instance_info.options)
			instances.append(new_instance)
		for instance in instances:
			instance.plugin.init_devices()
			instance.plugin.assign_free_devices(instance)
			instance.plugin.initialize_instance(instance)
		# At this point we should be able to start the HW events
		# monitoring/processing thread, without risking race conditions
		self._hardware_inventory.start_processing_events()
		self._instances.extend(instances)

	def _try_call(self, caller, exc_ret, f, *args, **kwargs):
		try:
			return f(*args, **kwargs)
		except Exception as e:
			trace = traceback.format_exc()
			log.error("BUG: Unhandled exception in %s: %s"
					% (caller, str(e)))
			log.error(trace)
			return exc_ret

	def destroy_all(self):
		for instance in self._instances:
			log.debug("destroying instance %s" % instance.name)
			self._try_call("destroy_all", None,
					instance.plugin.destroy_instance,
					instance)
		for plugin in self._plugins:
			log.debug("cleaning plugin '%s'" % plugin.name)
			self._try_call("destroy_all", None, plugin.cleanup)

		del self._plugins[:]
		del self._instances[:]

	def update_monitors(self):
		for monitor in self._monitors_repository.monitors:
			log.debug("updating monitor %s" % monitor)
			self._try_call("update_monitors", None, monitor.update)

	def start_tuning(self):
		for instance in self._instances:
			self._try_call("start_tuning", None,
					instance.apply_tuning)

	def verify_tuning(self, ignore_missing):
		ret = True
		for instance in self._instances:
			res = self._try_call("verify_tuning", False,
					instance.verify_tuning, ignore_missing)
			if res == False:
				ret = False
		return ret

	def update_tuning(self):
		for instance in self._instances:
			self._try_call("update_tuning", None,
					instance.update_tuning)

	# rollback parameter is a helper telling plugins whether soft or full
	# rollback is needed, e.g. for bootloader plugin we need grub.cfg
	# tuning to persist across reboots and restarts of the daemon, so in
	# this case the rollback is usually set to consts.ROLLBACK_SOFT,
	# but we also need to clean it all up when TuneD is disabled or the
	# profile is changed. In this case the rollback is set to
	# consts.ROLLBACK_FULL. In practice it means to remove all temporal
	# or helper files, unpatch third party config files, etc.
	def stop_tuning(self, rollback = consts.ROLLBACK_SOFT):
		self._hardware_inventory.stop_processing_events()
		for instance in reversed(self._instances):
			self._try_call("stop_tuning", None,
					instance.unapply_tuning, rollback)
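
The comment above distinguishes the two rollback modes. The sketch below (not a file from the archive) shows the call order a driver of an already-constructed Manager would follow; `mgr` and `instances_config` are assumed to be built elsewhere (the daemon normally wires the Manager to the plugin and monitor repositories), and only the method names and rollback constants are taken from manager.py.

# Sketch (not from the archive): driving a Manager through one tuning cycle.
import tuned.consts as consts

def run_once(mgr, instances_config, profile_change=False):
	mgr.create(instances_config)            # instantiate plugins/instances, sorted by priority
	mgr.start_tuning()                      # apply tuning on every instance
	ok = mgr.verify_tuning(ignore_missing=True)
	# ROLLBACK_SOFT keeps persistent tuning (e.g. bootloader changes) in place;
	# ROLLBACK_FULL also removes temporary/helper files, as used when TuneD is
	# disabled or the profile changes.
	rollback = consts.ROLLBACK_FULL if profile_change else consts.ROLLBACK_SOFT
	mgr.stop_tuning(rollback=rollback)
	mgr.destroy_all()
	return ok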
site-packages/tuned/units/__init__.py
from .manager import *
site-packages/tuned/units/__pycache__/__init__.cpython-36.pyc  [binary .pyc content omitted]
site-packages/tuned/units/__pycache__/manager.cpython-36.pyc  [binary .pyc content omitted]
site-packages/tuned/units/__pycache__/manager.cpython-36.opt-1.pyc  [binary .pyc content omitted]
site-packages/tuned/units/__pycache__/__init__.cpython-36.opt-1.pyc  [binary .pyc content omitted]
site-packages/tuned/exceptions.py
import tuned.logs
import sys
import traceback

exception_logger = tuned.logs.get()

class TunedException(Exception):
	"""
	"""

	def log(self, logger = None):
		if logger is None:
			logger = exception_logger
		logger.error(str(self))
		self._log_trace(logger)

	def _log_trace(self, logger):
		(exc_type, exc_value, exc_traceback) = sys.exc_info()
		if exc_value != self:
			logger.debug("stack trace is no longer available")
		else:
			exception_info = "".join(traceback.format_exception(exc_type, exc_value, exc_traceback)).rstrip()
			logger.debug(exception_info)
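
A short usage sketch (not a file from the archive): while the exception is still being handled, sys.exc_info() points at it, so log() can emit the message as an error and the formatted traceback at debug level.

# Sketch (not from the archive): logging a TunedException with its traceback.
from tuned.exceptions import TunedException

try:
	raise TunedException("profile could not be loaded")
except TunedException as e:
	e.log()    # error(str(e)) plus debug() with the stack trace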
site-packages/tuned/storage/__init__.py
from tuned.storage.storage import Storage
from tuned.storage.factory import Factory
from tuned.storage.pickle_provider import PickleProvider
site-packages/tuned/storage/pickle_provider.py
from . import interfaces
import tuned.logs
import pickle
import os
import tuned.consts as consts

log = tuned.logs.get()

class PickleProvider(interfaces.Provider):
	__slots__ = ["_path", "_data"]

	def __init__(self, path=None):
		if path is None:
			path = consts.DEFAULT_STORAGE_FILE
		self._path = path
		self._data = {}

	def set(self, namespace, option, value):
		self._data.setdefault(namespace, {})
		self._data[namespace][option] = value

	def get(self, namespace, option, default=None):
		self._data.setdefault(namespace, {})
		return self._data[namespace].get(option, default)

	def unset(self, namespace, option):
		self._data.setdefault(namespace, {})
		if option in self._data[namespace]:
			del self._data[namespace][option]

	def save(self):
		try:
			log.debug("Saving %s" % str(self._data))
			with open(self._path, "wb") as f:
				pickle.dump(self._data, f)
		except (OSError, IOError) as e:
			log.error("Error saving storage file '%s': %s" % (self._path, e))

	def load(self):
		try:
			with open(self._path, "rb") as f:
				self._data = pickle.load(f)
		except (OSError, IOError) as e:
			log.debug("Error loading storage file '%s': %s" % (self._path, e))
			self._data = {}
		except EOFError:
			self._data = {}

	def clear(self):
		self._data.clear()
		try:
			os.unlink(self._path)
		except (OSError, IOError) as e:
			log.debug("Error removing storage file '%s': %s" % (self._path, e))
site-packages/tuned/storage/factory.py
from . import interfaces
from . import storage

class Factory(interfaces.Factory):
	__slots__ = ["_storage_provider"]

	def __init__(self, storage_provider):
		self._storage_provider = storage_provider

	@property
	def provider(self):
		return self._storage_provider

	def create(self, namespace):
		return storage.Storage(self._storage_provider, namespace)
site-packages/tuned/storage/interfaces.py
class Factory(object):
	def create(self, namespace):
		raise NotImplementedError()

class Provider(object):
	def set(self, namespace, option, value):
		raise NotImplementedError()

	def get(self, namespace, option, default=None):
		raise NotImplementedError()

	def unset(self, namespace, option):
		raise NotImplementedError()

	def clear(self):
		raise NotImplementedError()

	def load(self):
		raise NotImplementedError()

	def save(self):
		raise NotImplementedError()
site-packages/tuned/storage/__pycache__/pickle_provider.cpython-36.opt-1.pyc  [binary .pyc content omitted]
site-packages/tuned/storage/__pycache__/storage.cpython-36.opt-1.pyc  [binary .pyc content omitted]
site-packages/tuned/storage/__pycache__/factory.cpython-36.pyc  [binary .pyc content omitted]
site-packages/tuned/storage/__pycache__/interfaces.cpython-36.pyc  [binary .pyc content omitted]
site-packages/tuned/storage/__pycache__/__init__.cpython-36.opt-1.pyc  [binary .pyc content omitted]
site-packages/tuned/storage/__pycache__/factory.cpython-36.opt-1.pyc  [binary .pyc content omitted]
site-packages/tuned/storage/__pycache__/interfaces.cpython-36.opt-1.pyc  [binary .pyc content omitted]
site-packages/tuned/storage/__pycache__/__init__.cpython-36.pyc  [binary .pyc content omitted]
site-packages/tuned/storage/__pycache__/pickle_provider.cpython-36.pyc  [binary .pyc content omitted]
site-packages/tuned/storage/__pycache__/storage.cpython-36.pyc  [binary .pyc content omitted]
site-packages/tuned/storage/storage.py
class Storage(object):
	__slots__ = ["_storage_provider", "_namespace"]

	def __init__(self, storage_provider, namespace):
		self._storage_provider = storage_provider
		self._namespace = namespace

	def set(self, option, value):
		self._storage_provider.set(self._namespace, option, value)

	def get(self, option, default=None):
		return self._storage_provider.get(self._namespace, option, default)

	def unset(self, option):
		self._storage_provider.unset(self._namespace, option)
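
Factory and Storage above just add a namespace on top of a single shared provider. A small sketch (not a file from the archive; the path and namespaces are illustrative only):

# Sketch (not from the archive): namespaced Storage views over one provider.
from tuned.storage import Factory, PickleProvider

provider = PickleProvider("/tmp/tuned-example.pickle")   # hypothetical path
provider.load()
factory = Factory(provider)

cpu = factory.create("cpu")       # Storage bound to the "cpu" namespace
disk = factory.create("disk")
cpu.set("governor", "performance")
disk.set("readahead", "4096")
provider.save()                   # both namespaces land in the same pickle file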
site-packages/tuned/__init__.py
#
# tuned: daemon for monitoring and adaptive tuning of system devices
#
# Copyright (C) 2008-2013 Red Hat, Inc.
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License
# as published by the Free Software Foundation; either version 2
# of the License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
#

__copyright__ = "Copyright 2008-2013 Red Hat, Inc."
__license__ = "GPLv2+"
__email__ = "power-management@lists.fedoraproject.org"
site-packages/tuned/ppd/__pycache__/config.cpython-36.opt-1.pyc  [binary .pyc content omitted]
site-packages/tuned/ppd/__pycache__/config.cpython-36.pyc  [binary .pyc content omitted]
site-packages/tuned/ppd/__pycache__/controller.cpython-36.opt-1.pyc  [binary .pyc content omitted]
site-packages/tuned/ppd/__pycache__/controller.cpython-36.pyc  [binary .pyc content omitted]
interfacesZExportableInterfacerBrrrr�<module>s9site-packages/tuned/ppd/dbus.conf000064400000001353147511334660013006 0ustar00<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE busconfig PUBLIC "-//freedesktop//DTD D-BUS Bus Configuration 1.0//EN"
  "http://www.freedesktop.org/standards/dbus/1.0/busconfig.dtd">

<busconfig>
  <policy user="root">
    <allow own="net.hadess.PowerProfiles"/>
  </policy>

  <policy context="default">
    <allow send_destination="net.hadess.PowerProfiles" send_interface="net.hadess.PowerProfiles"/>
    <allow send_destination="net.hadess.PowerProfiles" send_interface="org.freedesktop.DBus.Introspectable"/>
    <allow send_destination="net.hadess.PowerProfiles" send_interface="org.freedesktop.DBus.Properties"/>
    <allow send_destination="net.hadess.PowerProfiles" send_interface="org.freedesktop.DBus.Peer"/>
  </policy>
</busconfig>
site-packages/tuned/ppd/ppd.conf000064400000000227147511334660012633 0ustar00[main]
# The default PPD profile
default=balanced

[profiles]
# PPD = TuneD
power-saver=powersave
balanced=balanced
performance=throughput-performance
site-packages/tuned/ppd/config.py000064400000004521147511334660013021 0ustar00from tuned.utils.config_parser import ConfigParser, Error
from tuned.exceptions import TunedException
import os

PPD_POWER_SAVER = "power-saver"
PPD_PERFORMANCE = "performance"

MAIN_SECTION = "main"
PROFILES_SECTION = "profiles"
DEFAULT_PROFILE_OPTION = "default"


class PPDConfig:
    def __init__(self, config_file):
        self.load_from_file(config_file)

    @property
    def default_profile(self):
        return self._default_profile

    @property
    def ppd_to_tuned(self):
        return self._ppd_to_tuned

    @property
    def tuned_to_ppd(self):
        return self._tuned_to_ppd

    def load_from_file(self, config_file):
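        # The [profiles] mapping must be one-to-one: duplicate TuneD targets are rejected below so
        # that the reverse (TuneD-to-PPD) lookup stays unambiguous.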
        cfg = ConfigParser()

        if not os.path.isfile(config_file):
            raise TunedException("Configuration file '%s' does not exist" % config_file)
        try:
            cfg.read(config_file)
        except Error:
            raise TunedException("Error parsing the configuration file '%s'" % config_file)

        if PROFILES_SECTION not in cfg:
            raise TunedException("Missing profiles section in the configuration file '%s'" % config_file)
        self._ppd_to_tuned = dict(cfg[PROFILES_SECTION])

        if not all(isinstance(mapped_profile, str) for mapped_profile in self._ppd_to_tuned.values()):
            raise TunedException("Invalid profile mapping in the configuration file '%s'" % config_file)

        if len(set(self._ppd_to_tuned.values())) != len(self._ppd_to_tuned):
            raise TunedException("Duplicate profile mapping in the configuration file '%s'" % config_file)
        self._tuned_to_ppd = {v: k for k, v in self._ppd_to_tuned.items()}

        if PPD_POWER_SAVER not in self._ppd_to_tuned:
            raise TunedException("Missing power-saver profile in the configuration file '%s'" % config_file)

        if PPD_PERFORMANCE not in self._ppd_to_tuned:
            raise TunedException("Missing performance profile in the configuration file '%s'" % config_file)

        if MAIN_SECTION not in cfg or DEFAULT_PROFILE_OPTION not in cfg[MAIN_SECTION]:
            raise TunedException("Missing default profile in the configuration file '%s'" % config_file)

        self._default_profile = cfg[MAIN_SECTION][DEFAULT_PROFILE_OPTION]
        if self._default_profile not in self._ppd_to_tuned:
            raise TunedException("Unknown default profile '%s'" % self._default_profile)
site-packages/tuned/ppd/tuned-ppd.policy000064400000001740147511334660014323 0ustar00<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE policyconfig PUBLIC "-//freedesktop//DTD PolicyKit Policy Configuration 1.0//EN"
"http://www.freedesktop.org/standards/PolicyKit/1.0/policyconfig.dtd">
<policyconfig>

  <vendor>TuneD</vendor>
  <vendor_url>https://tuned-project.org/</vendor_url>

  <action id="net.hadess.PowerProfiles.HoldProfile">
    <description>Hold power profile</description>
    <message>Authentication is required to hold power profiles.</message>
    <defaults>
      <allow_any>no</allow_any>
      <allow_inactive>no</allow_inactive>
      <allow_active>yes</allow_active>
    </defaults>
  </action>
  
  <action id="net.hadess.PowerProfiles.ReleaseProfile">
    <description>Release power profile</description>
    <message>Authentication is required to release power profiles.</message>
    <defaults>
      <allow_any>no</allow_any>
      <allow_inactive>no</allow_inactive>
      <allow_active>yes</allow_active>
    </defaults>
  </action>

</policyconfig>
site-packages/tuned/ppd/tuned-ppd.service000064400000000454147511334660014465 0ustar00[Unit]
Description=PPD-to-TuneD API Translation Daemon
Requires=tuned.service
After=tuned.service
Before=multi-user.target display-manager.target

[Service]
Type=dbus
PIDFile=/run/tuned/tuned-ppd.pid
BusName=net.hadess.PowerProfiles
ExecStart=/usr/sbin/tuned-ppd

[Install]
WantedBy=graphical.target
site-packages/tuned/ppd/controller.py000064400000015676147511334660013754 0ustar00from tuned import exports, logs
from tuned.utils.commands import commands
from tuned.consts import PPD_CONFIG_FILE
from tuned.ppd.config import PPDConfig, PPD_PERFORMANCE, PPD_POWER_SAVER
from enum import StrEnum
import threading
import dbus
import os

log = logs.get()

DRIVER = "tuned"
NO_TURBO_PATH = "/sys/devices/system/cpu/intel_pstate/no_turbo"
LAP_MODE_PATH = "/sys/bus/platform/devices/thinkpad_acpi/dytc_lapmode"


class PerformanceDegraded(StrEnum):
    NONE = ""
    LAP_DETECTED = "lap-detected"
    HIGH_OPERATING_TEMPERATURE = "high-operating-temperature"


class ProfileHold(object):
    def __init__(self, profile, reason, app_id, watch):
        self.profile = profile
        self.reason = reason
        self.app_id = app_id
        self.watch = watch

    def as_dict(self):
        return {
            "Profile": self.profile,
            "Reason": self.reason,
            "ApplicationId": self.app_id,
        }


class ProfileHoldManager(object):
    def __init__(self, controller):
        self._holds = {}
        self._cookie_counter = 0
        self._controller = controller

    def _callback(self, cookie, app_id):
        def callback(name):
            if name == "":
                log.info("Application '%s' disappeared, releasing hold '%s'" % (app_id, cookie))
                self.remove(cookie)

        return callback

    def _effective_hold_profile(self):
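        # A power-saver hold takes precedence over performance holds when several are active.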
        if any(hold.profile == PPD_POWER_SAVER for hold in self._holds.values()):
            return PPD_POWER_SAVER
        return PPD_PERFORMANCE

    def _cancel(self, cookie):
        if cookie not in self._holds:
            return
        hold = self._holds.pop(cookie)
        hold.watch.cancel()
        exports.send_signal("ProfileReleased", cookie)
        exports.property_changed("ActiveProfileHolds", self.as_dbus_array())
        log.info("Releasing hold '%s': profile '%s' by application '%s'" % (cookie, hold.profile, hold.app_id))

    def as_dbus_array(self):
        return dbus.Array([hold.as_dict() for hold in self._holds.values()], signature="a{sv}")

    def add(self, profile, reason, app_id, caller):
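        # Issue a fresh cookie for this hold and watch the caller's D-Bus name so the hold is
        # released automatically if the holding application disappears.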
        cookie = self._cookie_counter
        self._cookie_counter += 1
        watch = self._controller.bus.watch_name_owner(caller, self._callback(cookie, app_id))
        log.info("Adding hold '%s': profile '%s' by application '%s'" % (cookie, profile, app_id))
        self._holds[cookie] = ProfileHold(profile, reason, app_id, watch)
        exports.property_changed("ActiveProfileHolds", self.as_dbus_array())
        self._controller.switch_profile(profile)
        return cookie

    def has(self, cookie):
        return cookie in self._holds

    def remove(self, cookie):
        self._cancel(cookie)
        if len(self._holds) != 0:
            new_profile = self._effective_hold_profile()
        else:
            new_profile = self._controller.base_profile
        self._controller.switch_profile(new_profile)

    def clear(self):
        # Iterate over a snapshot: _cancel() pops entries from self._holds during the loop.
        for cookie in list(self._holds):
            self._cancel(cookie)


class Controller(exports.interfaces.ExportableInterface):
    def __init__(self, bus, tuned_interface):
        super(Controller, self).__init__()
        self._bus = bus
        self._tuned_interface = tuned_interface
        self._profile_holds = ProfileHoldManager(self)
        self._performance_degraded = PerformanceDegraded.NONE
        self._cmd = commands()
        self._terminate = threading.Event()
        self.load_config()

    def _check_performance_degraded(self):
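        # Poll sysfs flags: intel_pstate's no_turbo and the ThinkPad lap-mode sensor each map to a
        # degraded state; when both are set, lap detection is the one reported.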
        performance_degraded = PerformanceDegraded.NONE
        if os.path.exists(NO_TURBO_PATH):
            if int(self._cmd.read_file(NO_TURBO_PATH)) == 1:
                performance_degraded = PerformanceDegraded.HIGH_OPERATING_TEMPERATURE
        if os.path.exists(LAP_MODE_PATH):
            if int(self._cmd.read_file(LAP_MODE_PATH)) == 1:
                performance_degraded = PerformanceDegraded.LAP_DETECTED
        if performance_degraded != self._performance_degraded:
            log.info("Performance degraded: %s" % performance_degraded)
            self._performance_degraded = performance_degraded
            exports.property_changed("PerformanceDegraded", performance_degraded)

    def run(self):
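        # Main loop: re-check the performance-degraded state roughly once per second until
        # terminate() is called.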
        exports.start()
        while not self._cmd.wait(self._terminate, 1):
            self._check_performance_degraded()
        exports.stop()

    @property
    def bus(self):
        return self._bus

    @property
    def base_profile(self):
        return self._base_profile

    def terminate(self):
        self._terminate.set()

    def load_config(self):
        self._config = PPDConfig(PPD_CONFIG_FILE)
        self._base_profile = self._config.default_profile
        self.switch_profile(self._config.default_profile)

    def switch_profile(self, profile):
        if self.active_profile() == profile:
            return
        tuned_profile = self._config.ppd_to_tuned[profile]
        log.info("Switching to profile '%s'" % tuned_profile)
        self._tuned_interface.switch_profile(tuned_profile)
        exports.property_changed("ActiveProfile", profile)

    def active_profile(self):
        tuned_profile = self._tuned_interface.active_profile()
        return self._config.tuned_to_ppd.get(tuned_profile, "unknown")

    @exports.export("sss", "u")
    def HoldProfile(self, profile, reason, app_id, caller):
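        # D-Bus method: accepts (profile, reason, app_id) and returns an unsigned-integer cookie;
        # the extra 'caller' argument is injected by the exports layer, not passed by the client.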
        if profile != PPD_POWER_SAVER and profile != PPD_PERFORMANCE:
            raise dbus.exceptions.DBusException(
                "Only '%s' and '%s' profiles may be held" % (PPD_POWER_SAVER, PPD_PERFORMANCE)
            )
        return self._profile_holds.add(profile, reason, app_id, caller)

    @exports.export("u", "")
    def ReleaseProfile(self, cookie, caller):
        if not self._profile_holds.has(cookie):
            raise dbus.exceptions.DBusException("No active hold for cookie '%s'" % cookie)
        self._profile_holds.remove(cookie)

    @exports.signal("u")
    def ProfileReleased(self, cookie):
        pass

    @exports.property_setter("ActiveProfile")
    def set_active_profile(self, profile):
        if profile not in self._config.ppd_to_tuned:
            raise dbus.exceptions.DBusException("Invalid profile '%s'" % profile)
        self._base_profile = profile
        self._profile_holds.clear()
        self.switch_profile(profile)

    @exports.property_getter("ActiveProfile")
    def get_active_profile(self):
        return self.active_profile()

    @exports.property_getter("Profiles")
    def get_profiles(self):
        return dbus.Array(
            [{"Profile": profile, "Driver": DRIVER} for profile in self._config.ppd_to_tuned.keys()],
            signature="a{sv}",
        )

    @exports.property_getter("Actions")
    def get_actions(self):
        return dbus.Array([], signature="s")

    @exports.property_getter("PerformanceDegraded")
    def get_performance_degraded(self):
        return self._performance_degraded

    @exports.property_getter("ActiveProfileHolds")
    def get_active_profile_holds(self):
        return self._profile_holds.as_dbus_array()
site-packages/tuned/ppd/tuned-ppd.dbus.service000064400000000151147511334660015413 0ustar00[D-BUS Service]
Name=net.hadess.PowerProfiles
Exec=/bin/false
User=root
SystemdService=tuned-ppd.service
site-packages/tuned/profiles/exceptions.py000064400000000137147511334660014774 0ustar00import tuned.exceptions

class InvalidProfileException(tuned.exceptions.TunedException):
	pass
site-packages/tuned/profiles/__init__.py000064400000000431147511334660014347 0ustar00from tuned.profiles.locator import *
from tuned.profiles.loader import *
from tuned.profiles.profile import *
from tuned.profiles.unit import *
from tuned.profiles.exceptions import *
from tuned.profiles.factory import *
from tuned.profiles.merger import *
from . import functions
site-packages/tuned/profiles/merger.py000064400000004210147511334660014070 0ustar00import collections
from functools import reduce

class Merger(object):
	"""
	Tool for merging multiple profiles into one.
	"""

	def __init__(self):
		pass

	def merge(self, configs):
		"""
		Merge multiple configurations into one. If there are multiple units of the same type, the 'devices'
		option is set for each unit so that duplicate devices are eliminated.
		"""
		merged_config = reduce(self._merge_two, configs)
		return merged_config

	def _merge_two(self, profile_a, profile_b):
		"""
		Merge two profiles. The configuration of a unit whose name matches a unit in the newer profile
		is updated with the newer unit's options. If the newer unit's 'replace' option is 'True', all
		options from the older unit are dropped.
		"""

		profile_a.options.update(profile_b.options)

		for unit_name, unit in list(profile_b.units.items()):
			if unit.replace or unit_name not in profile_a.units:
				profile_a.units[unit_name] = unit
			else:
				profile_a.units[unit_name].type = unit.type
				profile_a.units[unit_name].enabled = unit.enabled
				profile_a.units[unit_name].devices = unit.devices
				if unit.devices_udev_regex is not None:
					profile_a.units[unit_name].devices_udev_regex = unit.devices_udev_regex
				if unit.cpuinfo_regex is not None:
					profile_a.units[unit_name].cpuinfo_regex = unit.cpuinfo_regex
				if unit.uname_regex is not None:
					profile_a.units[unit_name].uname_regex = unit.uname_regex
				if unit.script_pre is not None:
					profile_a.units[unit_name].script_pre = unit.script_pre
				if unit.script_post is not None:
					profile_a.units[unit_name].script_post = unit.script_post
				if unit.drop is not None:
					for option in unit.drop:
						profile_a.units[unit_name].options.pop(option, None)
					unit.drop = None
				if unit_name == "script" and profile_a.units[unit_name].options.get("script", None) is not None:
					script = profile_a.units[unit_name].options.get("script", None)
					profile_a.units[unit_name].options.update(unit.options)
					profile_a.units[unit_name].options["script"] = script + profile_a.units[unit_name].options["script"]
				else:
					profile_a.units[unit_name].options.update(unit.options)

		return profile_a
site-packages/tuned/profiles/profile.py000064400000002156147511334660014256 0ustar00import tuned.profiles.unit
import tuned.consts as consts
import collections

class Profile(object):
	"""
	Representation of a tuning profile.
	"""

	__slots__ = ["_name", "_options", "_units"]

	def __init__(self, name, config):
		self._name = name
		self._init_options(config)
		self._init_units(config)

	def _init_options(self, config):
		self._options = {}
		if consts.PLUGIN_MAIN_UNIT_NAME in config:
			self._options = dict(config[consts.PLUGIN_MAIN_UNIT_NAME])

	def _init_units(self, config):
		self._units = collections.OrderedDict()
		for unit_name in config:
			if unit_name != consts.PLUGIN_MAIN_UNIT_NAME:
				new_unit = self._create_unit(unit_name, config[unit_name])
				self._units[unit_name] = new_unit

	def _create_unit(self, name, config):
		return tuned.profiles.unit.Unit(name, config)

	@property
	def name(self):
		"""
		Profile name.
		"""
		return self._name

	@name.setter
	def name(self, value):
		self._name = value

	@property
	def units(self):
		"""
		Units included in the profile.
		"""
		return self._units

	@property
	def options(self):
		"""
		Profile global options.
		"""
		return self._options
site-packages/tuned/profiles/unit.py000064400000004671147511334660013601 0ustar00import collections
import re

class Unit(object):
	"""
	Unit description.
	"""

	__slots__ = [ "_name", "_type", "_enabled", "_replace", "_drop", "_devices", "_devices_udev_regex", \
		"_cpuinfo_regex", "_uname_regex", "_script_pre", "_script_post", "_options" ]

	def __init__(self, name, config):
		self._name = name
		self._type = config.pop("type", self._name)
		self._enabled = config.pop("enabled", True) in [True, "True", "true", 1, "1"]
		self._replace = config.pop("replace", False) in [True, "True", "true", 1, "1"]
		self._drop = config.pop("drop", None)
		if self._drop is not None:
			self._drop = re.split(r"\b\s*[,;]\s*", str(self._drop))
		self._devices = config.pop("devices", "*")
		self._devices_udev_regex = config.pop("devices_udev_regex", None)
		self._cpuinfo_regex = config.pop("cpuinfo_regex", None)
		self._uname_regex = config.pop("uname_regex", None)
		self._script_pre = config.pop("script_pre", None)
		self._script_post = config.pop("script_post", None)
		self._options = collections.OrderedDict(config)

	@property
	def name(self):
		return self._name

	@property
	def type(self):
		return self._type

	@type.setter
	def type(self, value):
		self._type = value

	@property
	def enabled(self):
		return self._enabled

	@enabled.setter
	def enabled(self, value):
		self._enabled = value

	@property
	def replace(self):
		return self._replace

	@property
	def drop(self):
		return self._drop

	@drop.setter
	def drop(self, value):
		self._drop = value

	@property
	def devices(self):
		return self._devices

	@devices.setter
	def devices(self, value):
		self._devices = value

	@property
	def devices_udev_regex(self):
		return self._devices_udev_regex

	@devices_udev_regex.setter
	def devices_udev_regex(self, value):
		self._devices_udev_regex = value

	@property
	def cpuinfo_regex(self):
		return self._cpuinfo_regex

	@cpuinfo_regex.setter
	def cpuinfo_regex(self, value):
		self._cpuinfo_regex = value

	@property
	def uname_regex(self):
		return self._uname_regex

	@uname_regex.setter
	def uname_regex(self, value):
		self._uname_regex = value

	@property
	def script_pre(self):
		return self._script_pre

	@script_pre.setter
	def script_pre(self, value):
		self._script_pre = value

	@property
	def script_post(self):
		return self._script_post

	@script_post.setter
	def script_post(self, value):
		self._script_post = value

	@property
	def options(self):
		return self._options

	@options.setter
	def options(self, value):
		self._options = value
site-packages/tuned/profiles/locator.py000064400000007310147511334660014256 0ustar00import os
import tuned.consts as consts
from tuned.utils.config_parser import ConfigParser, Error

class Locator(object):
	"""
	Profiles locator and enumerator.
	"""

	__slots__ = ["_load_directories"]

	def __init__(self, load_directories):
		if type(load_directories) is not list:
			raise TypeError("load_directories parameter is not a list")
		self._load_directories = load_directories

	@property
	def load_directories(self):
		return self._load_directories

	def _get_config_filename(self, *path_parts):
		path_parts = list(path_parts) + ["tuned.conf"]
		config_name = os.path.join(*path_parts)
		return os.path.normpath(config_name)

	def get_config(self, profile_name, skip_files=None):
		ret = None
		conditional_load = profile_name[0:1] == "-"
		if conditional_load:
			profile_name = profile_name[1:]

		for dir_name in reversed(self._load_directories):
			# basename is protection not to get out of the path
			config_file = self._get_config_filename(dir_name, os.path.basename(profile_name))

			if skip_files is not None and config_file in skip_files:
				ret = ""
				continue

			if os.path.isfile(config_file):
				return config_file

		if conditional_load and ret is None:
			ret = ""

		return ret

	def check_profile_name_format(self, profile_name):
		return profile_name is not None and profile_name != "" and "/" not in profile_name

	def parse_config(self, profile_name):
		if not self.check_profile_name_format(profile_name):
			return None
		config_file = self.get_config(profile_name)
		if config_file is None:
			return None
		try:
			config = ConfigParser(delimiters=('='), inline_comment_prefixes=('#'), allow_no_value=True, strict=False)
			config.optionxform = str
			with open(config_file) as f:
				config.read_string("[" + consts.MAGIC_HEADER_NAME + "]\n" + f.read())
			return config
		except (IOError, OSError, Error) as e:
			return None

	# Get profile attributes (e.g. summary, description), attrs is list of requested attributes,
	# if it is not list it is converted to list, defvals is list of default values to return if
	# attribute is not found, it is also converted to list if it is not list.
	# Returns list of the following format [status, profile_name, attr1_val, attr2_val, ...],
	# status is boolean.
	def get_profile_attrs(self, profile_name, attrs, defvals = None):
		# check types
		try:
			attrs_len = len(attrs)
		except TypeError:
			attrs = [attrs]
			attrs_len = 1
		try:
			defvals_len = len(defvals)
		except TypeError:
			defvals = [defvals]
			defvals_len = 1
		# Extend defvals if needed, last value is used for extension
		if defvals_len < attrs_len:
			defvals = defvals + ([defvals[-1]] * (attrs_len - defvals_len))
		config = self.parse_config(profile_name)
		if config is None:
			return [False, "", "", ""]
		main_unit_in_config = consts.PLUGIN_MAIN_UNIT_NAME in config.sections()
		vals = [True, profile_name]
		for (attr, defval) in zip(attrs, defvals):
			if attr == "" or attr is None:
				vals[0] = False
				vals = vals + [""]
			elif main_unit_in_config and attr in config.options(consts.PLUGIN_MAIN_UNIT_NAME):
				vals = vals + [config.get(consts.PLUGIN_MAIN_UNIT_NAME, attr, raw=True)]
			else:
				vals = vals + [defval]
		return vals

	def list_profiles(self):
		profiles = set()
		for dir_name in self._load_directories:
			try:
				for profile_name in os.listdir(dir_name):
					config_file = self._get_config_filename(dir_name, profile_name)
					if os.path.isfile(config_file):
						profiles.add(profile_name)
			except OSError:
				pass
		return profiles

	def get_known_names(self):
		return sorted(self.list_profiles())

	def get_known_names_summary(self):
		return [(profile, self.get_profile_attrs(profile, [consts.PROFILE_ATTR_SUMMARY], [""])[2]) for profile in sorted(self.list_profiles())]
site-packages/tuned/profiles/factory.py000064400000000215147511334660014257 0ustar00import tuned.profiles.profile

class Factory(object):
	def create(self, name, config):
		return tuned.profiles.profile.Profile(name, config)
site-packages/tuned/profiles/__pycache__/ (loader, merger, profile, __init__, exceptions, variables, locator, unit, factory — .cpython-36.pyc and .cpython-36.opt-1.pyc)  [compiled Python bytecode omitted]
site-packages/tuned/profiles/loader.py
import tuned.profiles.profile
import tuned.profiles.variables
from tuned.utils.config_parser import ConfigParser, Error
import tuned.consts as consts
import os.path
import collections
import tuned.logs
import re
from tuned.profiles.exceptions import InvalidProfileException

log = tuned.logs.get()

class Loader(object):
	"""
	Profiles loader.
	"""

	__slots__ = ["_profile_locator", "_profile_merger", "_profile_factory", "_global_config", "_variables"]

	def __init__(self, profile_locator, profile_factory, profile_merger, global_config, variables):
		self._profile_locator = profile_locator
		self._profile_factory = profile_factory
		self._profile_merger = profile_merger
		self._global_config = global_config
		self._variables = variables

	def _create_profile(self, profile_name, config):
		return tuned.profiles.profile.Profile(profile_name, config)

	@classmethod
	def safe_name(cls, profile_name):
		return re.match(r'^[a-zA-Z0-9_.-]+$', profile_name)

	@property
	def profile_locator(self):
		return self._profile_locator

	def load(self, profile_names):
		if type(profile_names) is not list:
			profile_names = profile_names.split()

		profile_names = list(filter(self.safe_name, profile_names))
		if len(profile_names) == 0:
			raise InvalidProfileException("No profile or invalid profiles were specified.")

		if len(profile_names) > 1:
			log.info("loading profiles: %s" % ", ".join(profile_names))
		else:
			log.info("loading profile: %s" % profile_names[0])
		profiles = []
		processed_files = []
		self._load_profile(profile_names, profiles, processed_files)

		if len(profiles) > 1:
			final_profile = self._profile_merger.merge(profiles)
		else:
			final_profile = profiles[0]

		final_profile.name = " ".join(profile_names)
		if "variables" in final_profile.units:
			self._variables.add_from_cfg(final_profile.units["variables"].options)
			del(final_profile.units["variables"])
		# FIXME hack, do all variable expansions in one place
		self._expand_vars_in_devices(final_profile)
		self._expand_vars_in_regexes(final_profile)
		return final_profile

	def _expand_vars_in_devices(self, profile):
		for unit in profile.units:
			profile.units[unit].devices = self._variables.expand(profile.units[unit].devices)

	def _expand_vars_in_regexes(self, profile):
		for unit in profile.units:
			profile.units[unit].cpuinfo_regex = self._variables.expand(profile.units[unit].cpuinfo_regex)
			profile.units[unit].uname_regex = self._variables.expand(profile.units[unit].uname_regex)

	def _load_profile(self, profile_names, profiles, processed_files):
		for name in profile_names:
			filename = self._profile_locator.get_config(name, processed_files)
			if filename == "":
				continue
			if filename is None:
				raise InvalidProfileException("Cannot find profile '%s' in '%s'." % (name, list(reversed(self._profile_locator._load_directories))))
			processed_files.append(filename)

			config = self._load_config_data(filename)
			profile = self._profile_factory.create(name, config)
			if "include" in profile.options:
				include_names = re.split(r"\s*[,;]\s*", self._variables.expand(profile.options.pop("include")))
				self._load_profile(include_names, profiles, processed_files)

			profiles.append(profile)

	def _expand_profile_dir(self, profile_dir, string):
		return re.sub(r'(?<!\\)\$\{i:PROFILE_DIR\}', profile_dir, string)

	def _load_config_data(self, file_name):
		try:
			config_obj = ConfigParser(delimiters=('='), inline_comment_prefixes=('#'), strict=False)
			config_obj.optionxform=str
			with open(file_name) as f:
				config_obj.read_file(f, file_name)
		except Error.__bases__ as e:
			raise InvalidProfileException("Cannot parse '%s'." % file_name, e)

		config = collections.OrderedDict()
		dir_name = os.path.dirname(file_name)
		for section in list(config_obj.sections()):
			config[section] = collections.OrderedDict()
			for option in config_obj.options(section):
				config[section][option] = config_obj.get(section, option, raw=True)
				config[section][option] = self._expand_profile_dir(dir_name, config[section][option])
			if config[section].get("script") is not None:
				script_path = os.path.join(dir_name, config[section]["script"])
				config[section]["script"] = [os.path.normpath(script_path)]

		return config
site-packages/tuned/profiles/variables.py000064400000004444147511334660014570 0ustar00import os
import re
import tuned.logs
from .functions import functions as functions
import tuned.consts as consts
from tuned.utils.commands import commands
from tuned.utils.config_parser import ConfigParser, Error

log = tuned.logs.get()

class Variables():
	"""
	Storage and processing of variables used in profiles
	"""

	def __init__(self):
		self._cmd = commands()
		self._lookup_re = {}
		self._lookup_env = {}
		self._functions = functions.Functions()

	def _add_env_prefix(self, s, prefix):
		if s.find(prefix) == 0:
			return s
		return prefix + s

	def _check_var(self, variable):
		return re.match(r'\w+$',variable)

	def add_variable(self, variable, value):
		if value is None:
			return
		s = str(variable)
		if not self._check_var(variable):
			log.error("variable definition '%s' contains unallowed characters" % variable)
			return
		v = self.expand(value)
		# variables referenced by ${VAR}, $ can be escaped by two $,
		# i.e. the following will not expand: $${VAR}
		self._lookup_re[r'(?<!\\)\${' + re.escape(s) + r'}'] = v
		self._lookup_env[self._add_env_prefix(s, consts.ENV_PREFIX)] = v

	def add_from_file(self, filename):
		if not os.path.exists(filename):
			log.error("unable to find variables_file: '%s'" % filename)
			return
		try:
			config = ConfigParser(delimiters=('='), inline_comment_prefixes=('#'), allow_no_value=True, strict=False)
			config.optionxform = str
			with open(filename) as f:
				config.read_string("[" + consts.MAGIC_HEADER_NAME + "]\n" + f.read(), filename)
		except Error:
			log.error("error parsing variables_file: '%s'" % filename)
			return
		for s in config.sections():
			for o in config.options(s):
				self.add_variable(o, config.get(s, o, raw=True))

	def add_from_cfg(self, cfg):
		for item in cfg:
			if str(item) == "include":
				self.add_from_file(os.path.normpath(cfg[item]))
			else:
				self.add_variable(item, cfg[item])

	# expand static variables (no functions)
	def expand_static(self, value):
		return re.sub(r'\\(\${\w+})', r'\1', self._cmd.multiple_re_replace(self._lookup_re, value))

	def expand(self, value):
		if value is None:
			return None
		# expand variables and convert all \${VAR} to ${VAR} (unescape)
		s = self.expand_static(str(value))
		# expand built-in functions
		return self._functions.expand(s)

	def get_env(self):
		return self._lookup_env
site-packages/tuned/profiles/functions/function_cpulist2devs.py000064400000000714147511334660021160 0ustar00import tuned.logs
from . import base

log = tuned.logs.get()

class cpulist2devs(base.Function):
	"""
	Conversion function: converts CPU list to device strings
	"""
	def __init__(self):
		# arbitrary number of arguments
		super(cpulist2devs, self).__init__("cpulist2devs", 0)

	def execute(self, args):
		if not super(cpulist2devs, self).execute(args):
			return None
		return self._cmd.cpulist2string(self._cmd.cpulist_unpack(",".join(args)), prefix = "cpu")
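
# Illustrative sketch (not part of the original module): in a profile this is
# typically written as ${f:cpulist2devs:0-2}, so execute() receives ["0-2"],
# unpacks the list and prefixes each CPU with "cpu", yielding something like
# "cpu0, cpu1, cpu2" (the exact separator is whatever commands.cpulist2string uses).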
site-packages/tuned/profiles/functions/functions.py000064400000004142147511334660016633 0ustar00import os
import re
import glob
from . import repository
import tuned.logs
import tuned.consts as consts
from tuned.utils.commands import commands

log = tuned.logs.get()

cmd = commands()

class Functions():
	"""
	Built-in functions
	"""

	def __init__(self):
		self._repository = repository.Repository()
		self._parse_init()

	def _parse_init(self, s = ""):
		self._cnt = 0
		self._str = s
		self._len = len(s)
		self._stack = []
		self._esc = False

	def _curr_char(self):
		return self._str[self._cnt] if self._cnt < self._len else ""

	def _curr_substr(self, _len):
		return self._str[self._cnt:self._cnt + _len]

	def _push_pos(self, esc):
		self._stack.append((esc, self._cnt))

	def _sub(self, a, b, s):
		self._str = self._str[:a] + s + self._str[b + 1:]
		self._len = len(self._str)
		self._cnt += len(s) - (b - a + 1)
		if self._cnt < 0:
			self._cnt = 0

	def _process_func(self, _from):
		sl = re.split(r'(?<!\\):', self._str[_from:self._cnt])
		if sl[0] != "${f":
			return
		sl = [str(v).replace(r"\:", ":") for v in sl]
		if not re.match(r'\w+$', sl[1]):
			log.error("invalid function name '%s'" % sl[1])
			return
		try:
			f = self._repository.load_func(sl[1])
		except ImportError:
			log.error("function '%s' not implemented" % sl[1])
			return
		s = f.execute(sl[2:])
		if s is None:
			return
		self._sub(_from, self._cnt, s)

	def _process(self, s):
		self._parse_init(s)
		while self._cnt < self._len:
			if self._curr_char() == "}":
				try:
					si = self._stack.pop()
				except IndexError:
					log.error("invalid variable syntax, non pair '}' in: '%s'" % s)
					return self._str
				# if not escaped
				if not si[0]:
					self._process_func(si[1])
			elif self._curr_substr(2) == "${":
				self._push_pos(self._esc)
			if self._curr_char() == "\\":
				self._esc = True
			else:
				self._esc = False
			self._cnt += 1
		if len(self._stack):
			log.error("invalid variable syntax, non pair '{' in: '%s'" % s)
		return self._str

	def expand(self, s):
		if s is None or s == "":
			return s
		# expand functions and convert all \${f:*} to ${f:*} (unescape)
		return re.sub(r'\\(\${f:.*})', r'\1', self._process(s))
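
# Illustrative sketch (not part of the original module): expand() drives the
# ${f:NAME:ARG1:ARG2:...} syntax handled by _process()/_process_func() above,
# assuming the named plugin (here "strip") loads normally through the repository:
#
#   fns = Functions()
#   fns.expand("${f:strip:  hello  }")     # -> "hello"
#   fns.expand("\${f:strip:  hello  }")    # escaped -> "${f:strip:  hello  }" (left unexpanded)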
site-packages/tuned/profiles/functions/function_hex2cpulist.py000064400000000730147511334660021001 0ustar00import os
import tuned.logs
from . import base
from tuned.utils.commands import commands

log = tuned.logs.get()

class hex2cpulist(base.Function):
	"""
	Conversion function: converts hexadecimal CPU mask to CPU list
	"""
	def __init__(self):
		# 1 argument
		super(hex2cpulist, self).__init__("hex2cpulist", 1, 1)

	def execute(self, args):
		if not super(hex2cpulist, self).execute(args):
			return None
		return ",".join(str(v) for v in self._cmd.hex2cpulist(args[0]))
site-packages/tuned/profiles/functions/__pycache__/ [ compiled CPython 3.6 bytecode caches (*.cpython-36.pyc and *.cpython-36.opt-1.pyc, one pair per module in tuned/profiles/functions); binary content, not reproducible as text ]
site-packages/tuned/profiles/functions/function_s2kb.py000064400000000641147511334660017371 0ustar00import os
import tuned.logs
from . import base
from tuned.utils.commands import commands

class s2kb(base.Function):
	"""
	Conversion function: sectors to kbytes
	"""
	def __init__(self):
		# 1 argument
		super(s2kb, self).__init__("s2kb", 1, 1)

	def execute(self, args):
		if not super(s2kb, self).execute(args):
			return None
		try:
			return str(int(round(int(args[0]) / 2)))
		except ValueError:
			return None
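
# Illustrative sketch (not part of the original module): 512-byte sectors are
# halved into kbytes, e.g. ${f:s2kb:2048} -> execute(["2048"]) -> "1024".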
site-packages/tuned/profiles/functions/function_cpulist2hex.py000064400000000726147511334660021006 0ustar00import os
import tuned.logs
from . import base
from tuned.utils.commands import commands

log = tuned.logs.get()

class cpulist2hex(base.Function):
	"""
	Conversion function: converts CPU list to hexadecimal CPU mask
	"""
	def __init__(self):
		# arbitrary number of arguments
		super(cpulist2hex, self).__init__("cpulist2hex", 0)

	def execute(self, args):
		if not super(cpulist2hex, self).execute(args):
			return None
		return self._cmd.cpulist2hex(",,".join(args))
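
# Illustrative sketch (not part of the original module): ${f:cpulist2hex:0-3}
# results in execute(["0-3"]) and yields a mask with the four lowest bits set,
# e.g. something like "0000000f" (the exact padding/grouping is up to
# commands.cpulist2hex).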
site-packages/tuned/profiles/functions/repository.py000064400000002404147511334660017041 0ustar00from tuned.utils.plugin_loader import PluginLoader
from . import base
import tuned.logs
import tuned.consts as consts
from tuned.utils.commands import commands

log = tuned.logs.get()

class Repository(PluginLoader):

	def __init__(self):
		super(Repository, self).__init__()
		self._functions = {}

	@property
	def functions(self):
		return self._functions

	def _set_loader_parameters(self):
		self._namespace = "tuned.profiles.functions"
		self._prefix = consts.FUNCTION_PREFIX
		self._interface = tuned.profiles.functions.base.Function

	def create(self, function_name):
		log.debug("creating function %s" % function_name)
		function_cls = self.load_plugin(function_name)
		function_instance = function_cls()
		self._functions[function_name] = function_instance
		return function_instance

	# Loads the function from its plugin file and returns it.
	# If it is already loaded, the cached instance is returned instead of loading it again.
	def load_func(self, function_name):
		if not function_name in self._functions:
			return self.create(function_name)
		return self._functions[function_name]

	def delete(self, function):
		assert isinstance(function, self._interface)
		log.debug("removing function %s" % function)
		for k, v in list(self._functions.items()):
			if v == function:
				del self._functions[k]
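
# Illustrative sketch (not part of the original module): load_func() is the
# usual entry point; it imports a function plugin at most once and then serves
# it from the cache:
#
#   repo = Repository()
#   f1 = repo.load_func("strip")   # imported and instantiated via PluginLoader
#   f2 = repo.load_func("strip")   # returned from the cache, so f1 is f2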
site-packages/tuned/profiles/functions/function_regex_search_ternary.py000064400000001052147511334660022730 0ustar00import re
from . import base

class regex_search_ternary(base.Function):
	"""
	Ternary regex operator. It takes arguments in the following form:
	STR1, REGEX, STR2, STR3.
	If REGEX matches STR1 (re.search is used), STR2 is returned,
	otherwise STR3 is returned.
	"""
	def __init__(self):
		# 4 arguments
		super(regex_search_ternary, self).__init__("regex_search_ternary", 4, 4)

	def execute(self, args):
		if not super(regex_search_ternary, self).execute(args):
			return None
		if re.search(args[1], args[0]):
			return args[2]
		else:
			return args[3]
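
# Illustrative sketch (not part of the original module): the arguments are
# (STR1, REGEX, STR2, STR3), so
#   execute(["model name : Xeon", "Xeon", "matched", "fallback"])   # -> "matched"
#   execute(["model name : EPYC", "Xeon", "matched", "fallback"])   # -> "fallback"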
site-packages/tuned/profiles/functions/function_exec.py000064400000000747147511334660017463 0ustar00import os
import tuned.logs
from . import base
from tuned.utils.commands import commands

class execute(base.Function):
	"""
	Executes a process and substitutes its output.
	"""
	def __init__(self):
		# unlimited number of arguments, min 1 argument (the name of executable)
		super(execute, self).__init__("exec", 0, 1)

	def execute(self, args):
		if not super(execute, self).execute(args):
			return None
		(ret, out) = self._cmd.execute(args)
		if ret == 0:
			return out
		return None
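
# Illustrative sketch (not part of the original module): profile arguments are
# split on ':' by Functions, so ${f:exec:cat:/etc/hostname} ends up as
# execute(["cat", "/etc/hostname"]); on exit code 0 the command's output is
# substituted, otherwise None is returned and the reference is left as-is.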
site-packages/tuned/profiles/functions/function_check_net_queue_count.py000064400000001212147511334660023062 0ustar00import tuned.logs
from . import base

log = tuned.logs.get()

class check_net_queue_count(base.Function):
	"""
	Checks whether the user has specified a queue count for net devices.
	If not, returns the number of housekeeping CPUs.
	"""
	def __init__(self):
		# 1 argument
		super(check_net_queue_count, self).__init__("check_net_queue_count", 1, 1)

	def execute(self, args):
		if not super(check_net_queue_count, self).execute(args):
			return None
		if args[0].isdigit():
			return args[0]
		(ret, out) = self._cmd.execute(["nproc"])
		log.warn("net-dev queue count is not correctly specified, setting it to HK CPUs %s" % (out))
		return out
site-packages/tuned/profiles/functions/function_cpulist2hex_invert.py000064400000001160147511334660022366 0ustar00import os
import tuned.logs
from . import base
from tuned.utils.commands import commands

log = tuned.logs.get()

class cpulist2hex_invert(base.Function):
	"""
	Converts CPU list to hexadecimal CPU mask and inverts it
	"""
	def __init__(self):
		# arbitrary number of arguments
		super(cpulist2hex_invert, self).__init__("cpulist2hex_invert", 0)

	def execute(self, args):
		if not super(cpulist2hex_invert, self).execute(args):
			return None
		# current implementation inverts the CPU list and then converts it to hexmask
		return self._cmd.cpulist2hex(",".join(str(v) for v in self._cmd.cpulist_invert(",,".join(args))))
site-packages/tuned/profiles/functions/function_kb2s.py000064400000000625147511334660017373 0ustar00import os
import tuned.logs
from . import base
from tuned.utils.commands import commands

class kb2s(base.Function):
	"""
	Conversion function: kbytes to sectors
	"""
	def __init__(self):
		# 1 argument
		super(kb2s, self).__init__("kb2s", 1, 1)

	def execute(self, args):
		if not super(kb2s, self).execute(args):
			return None
		try:
			return str(int(args[0]) * 2)
		except ValueError:
			return None
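
# Illustrative sketch (not part of the original module): the inverse of s2kb,
# e.g. ${f:kb2s:512} -> execute(["512"]) -> "1024" (512 kbytes = 1024 sectors of 512 bytes).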
site-packages/tuned/profiles/functions/function_cpulist_unpack.py000064400000000777147511334660021566 0ustar00import os
import tuned.logs
from . import base
from tuned.utils.commands import commands

log = tuned.logs.get()

class cpulist_unpack(base.Function):
	"""
	Conversion function: unpacks CPU list in form 1-3,4 to 1,2,3,4
	"""
	def __init__(self):
		# arbitrary number of arguments
		super(cpulist_unpack, self).__init__("cpulist_unpack", 0)

	def execute(self, args):
		if not super(cpulist_unpack, self).execute(args):
			return None
		return ",".join(str(v) for v in self._cmd.cpulist_unpack(",,".join(args)))
site-packages/tuned/profiles/functions/function_virt_check.py000064400000001123147511334660020645 0ustar00import os
import tuned.logs
from . import base
from tuned.utils.commands import commands

class virt_check(base.Function):
	"""
	Checks whether we are running inside a virtual machine (VM) or on bare
	metal. If running inside a VM, it expands to argument 1; otherwise it
	expands to argument 2 (even on error).
	"""
	def __init__(self):
		# 2 arguments
		super(virt_check, self).__init__("virt_check", 2, 2)

	def execute(self, args):
		if not super(virt_check, self).execute(args):
			return None
		(ret, out) = self._cmd.execute(["virt-what"])
		if ret == 0 and len(out) > 0:
			return args[0]
		return args[1]
site-packages/tuned/profiles/functions/function_cpulist_online.py000064400000001213147511334660021553 0ustar00import os
import tuned.logs
from . import base
from tuned.utils.commands import commands

log = tuned.logs.get()

class cpulist_online(base.Function):
	"""
	Checks whether CPUs from list are online, returns list containing
	only online CPUs
	"""
	def __init__(self):
		# arbitrary number of arguments
		super(cpulist_online, self).__init__("cpulist_online", 0)

	def execute(self, args):
		if not super(cpulist_online, self).execute(args):
			return None
		cpus = self._cmd.cpulist_unpack(",".join(args))
		online = self._cmd.cpulist_unpack(self._cmd.read_file("/sys/devices/system/cpu/online"))
		return ",".join(str(v) for v in cpus if v in online)
site-packages/tuned/profiles/functions/function_cpulist_pack.py000064400000001175147511334660021214 0ustar00import os
import tuned.logs
from . import base
from tuned.utils.commands import commands

log = tuned.logs.get()

class cpulist_pack(base.Function):
	"""
	Conversion function: packs CPU list in form 1,2,3,5 to 1-3,5.
	The cpulist_unpack is used as a preprocessor, so it always returns
	optimal results. For details about input syntax see cpulist_unpack.
	"""
	def __init__(self):
		# arbitrary number of arguments
		super(cpulist_pack, self).__init__("cpulist_pack", 0)

	def execute(self, args):
		if not super(cpulist_pack, self).execute(args):
			return None
		return ",".join(str(v) for v in self._cmd.cpulist_pack(",,".join(args)))
site-packages/tuned/profiles/functions/function_cpulist_present.py000064400000001263147511334660021754 0ustar00import os
import tuned.logs
from . import base
from tuned.utils.commands import commands

log = tuned.logs.get()

class cpulist_present(base.Function):
	"""
	Checks whether CPUs from list are present, returns list containing
	only present CPUs
	"""
	def __init__(self):
		# arbitrary number of arguments
		super(cpulist_present, self).__init__("cpulist_present", 0)

	def execute(self, args):
		if not super(cpulist_present, self).execute(args):
			return None
		cpus = self._cmd.cpulist_unpack(",,".join(args))
		present = self._cmd.cpulist_unpack(self._cmd.read_file("/sys/devices/system/cpu/present"))
		return ",".join(str(v) for v in sorted(list(set(cpus).intersection(set(present)))))
site-packages/tuned/profiles/functions/function_calc_isolated_cores.py000064400000003153147511334660022512 0ustar00import os
import glob
import tuned.logs
from . import base
import tuned.consts as consts

log = tuned.logs.get()

class calc_isolated_cores(base.Function):
	"""
	Calculates and returns isolated cores. The argument specifies how many
	cores per socket to reserve for housekeeping. If not specified, 1 core
	per socket is reserved for housekeeping and the rest is isolated.
	"""
	def __init__(self):
		# max 1 argument
		super(calc_isolated_cores, self).__init__("calc_isolated_cores", 1)

	def execute(self, args):
		if not super(calc_isolated_cores, self).execute(args):
			return None
		cpus_reserve = 1
		if len(args) > 0:
			if not args[0].isdecimal() or int(args[0]) < 0:
				log.error("invalid argument '%s' for builtin function '%s', it must be non-negative integer" %
					(args[0], self._name))
				return None
			else:
				cpus_reserve = int(args[0])

		topo = {}
		for cpu in glob.iglob(os.path.join(consts.SYSFS_CPUS_PATH, "cpu*")):
			cpuid = os.path.basename(cpu)[3:]
			if cpuid.isdecimal():
				physical_package_id = os.path.join(cpu, "topology/physical_package_id")
				# Show no errors when the physical_package_id file does not exist -- the CPU may be offline.
				if not os.path.exists(physical_package_id):
					log.debug("file '%s' does not exist, cpu%s offline?" % (physical_package_id, cpuid))
					continue
				socket = self._cmd.read_file(physical_package_id).strip()
				if socket.isdecimal():
					topo[socket] = topo.get(socket, []) + [cpuid]

		isol_cpus = []
		for cpus in topo.values():
			cpus.sort(key = int)
			isol_cpus = isol_cpus + cpus[cpus_reserve:]
		isol_cpus.sort(key = int)
		return ",".join(isol_cpus)
site-packages/tuned/profiles/functions/function_cpuinfo_check.py000064400000001763147511334660021336 0ustar00import re
import tuned.logs
from . import base

log = tuned.logs.get()

class cpuinfo_check(base.Function):
	"""
	Checks regexes against /proc/cpuinfo. Accepts arguments in the
	following form: REGEX1, STR1, REGEX2, STR2, ...[, STR_FALLBACK]
	If REGEX1 matches something in /proc/cpuinfo it expands to STR1,
	if REGEX2 matches it expands to STR2. It stops on the first match,
	i.e. if REGEX1 matches, no more regexes are processed. If no
	regex matches, it expands to STR_FALLBACK. If there is no fallback,
	it expands to an empty string.
	"""
	def __init__(self):
		# unlimited number of arguments, min 2 arguments
		super(cpuinfo_check, self).__init__("cpuinfo_check", 0, 2)

	def execute(self, args):
		if not super(cpuinfo_check, self).execute(args):
			return None
		cpuinfo = self._cmd.read_file("/proc/cpuinfo")
		for i in range(0, len(args), 2):
			if i + 1 < len(args):
				if re.search(args[i], cpuinfo, re.MULTILINE):
					return args[i + 1]
		if len(args) % 2:
			return args[-1]
		else:
			return ""
site-packages/tuned/profiles/functions/function_cpulist_invert.py000064400000001223147511334660021577 0ustar00import os
import tuned.logs
from . import base
from tuned.utils.commands import commands

log = tuned.logs.get()

class cpulist_invert(base.Function):
	"""
	Inverts a list of CPUs (i.e. returns its complement). The complement is
	computed against the online CPUs read from /sys/devices/system/cpu/online;
	e.g. on a system with 4 CPUs (0-3), the inversion of the list "0,2,3" is
	"1"
	"""
	def __init__(self):
		# arbitrary number of arguments
		super(cpulist_invert, self).__init__("cpulist_invert", 0)

	def execute(self, args):
		if not super(cpulist_invert, self).execute(args):
			return None
		return ",".join(str(v) for v in self._cmd.cpulist_invert(",,".join(args)))
site-packages/tuned/profiles/functions/function_assertion_non_equal.py000064400000001453147511334660022602 0ustar00import os
import tuned.logs
from . import base
from tuned.utils.commands import commands
from tuned.profiles.exceptions import InvalidProfileException

log = tuned.logs.get()

class assertion_non_equal(base.Function):
	"""
	Assertion non-equal: compares argument 2 with argument 3. If they match,
	it logs the text from argument 1 and raises InvalidProfileException,
	which aborts profile loading.
	"""
	def __init__(self):
		# 3 arguments
		super(assertion_non_equal, self).__init__("assertion_non_equal", 3, 3)

	def execute(self, args):
		if not super(assertion_non_equal, self).execute(args):
			return None
		if args[1] == args[2]:
			log.error("assertion '%s' failed: '%s' == '%s'" % (args[0], args[1], args[2]))
			raise InvalidProfileException("Assertion '%s' failed." % args[0])
		return None
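# Illustrative use (variable names are examples only): abort profile loading
# if a variable was left at its placeholder value:
#
#   assert1 = ${f:assertion_non_equal:isolated_cores are set:${isolated_cores}:${isolated_cores_default}}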
site-packages/tuned/profiles/functions/function_strip.py000064400000000626147511334660017674 0ustar00import os
import tuned.logs
from . import base
from tuned.utils.commands import commands

class strip(base.Function):
	"""
	Joins all arguments into one string and strips it.
	"""
	def __init__(self):
		# unlimited number of arguments, min 1 argument
		super(strip, self).__init__("strip", 0, 1)

	def execute(self, args):
		if not super(strip, self).execute(args):
			return None
		return "".join(args).strip()
site-packages/tuned/profiles/functions/function_assertion.py000064400000001377147511334660020546 0ustar00import os
import tuned.logs
from . import base
from tuned.utils.commands import commands
from tuned.profiles.exceptions import InvalidProfileException

log = tuned.logs.get()

class assertion(base.Function):
	"""
	Assertion: compares argument 2 with argument 3. If they do not match,
	it logs the text from argument 1 and raises InvalidProfileException,
	which aborts profile loading.
	"""
	def __init__(self):
		# 3 arguments
		super(assertion, self).__init__("assertion", 3, 3)

	def execute(self, args):
		if not super(assertion, self).execute(args):
			return None
		if args[1] != args[2]:
			log.error("assertion '%s' failed: '%s' != '%s'" % (args[0], args[1], args[2]))
			raise InvalidProfileException("Assertion '%s' failed." % args[0])
		return None
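# Illustrative use (variable name and value are examples only): abort profile
# loading unless a variable holds the expected value:
#
#   assert2 = ${f:assertion:mode must be static:${mode}:static}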
site-packages/tuned/profiles/functions/__init__.py000064400000000043147511334660016356 0ustar00from .repository import Repository
site-packages/tuned/profiles/functions/base.py000064400000002020147511334660015526 0ustar00import os
import tuned.logs
from tuned.utils.commands import commands

log = tuned.logs.get()

class Function(object):
	"""
	Built-in function
	"""
	def __init__(self, name, nargs_max, nargs_min = None):
		self._name = name
		self._nargs_max = nargs_max
		self._nargs_min = nargs_min
		self._cmd = commands()

	# Checks arguments.
	# nargs_max - maximum number of arguments; there must not be more.
	#             If nargs_max is 0, the number of arguments is unlimited.
	# nargs_min - minimum number of arguments; if not None, at least this
	#             many arguments must be supplied.
	@classmethod
	def _check_args(cls, args, nargs_max, nargs_min = None):
		if args is None or nargs_max is None:
			return False
		la = len(args)
		return (nargs_max == 0 or nargs_max >= la) and (nargs_min is None or nargs_min <= la)

	def execute(self, args):
		if self._check_args(args, self._nargs_max, self._nargs_min):
			return True
		else:
			log.error("invalid number of arguments for builtin function '%s'" % self._name)
		return False
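# Minimal sketch of a new builtin (the "reverse" name and behaviour are made
# up for the example): a function lives in its own function_<name>.py module,
# subclasses Function and lets the base class validate the argument count.
#
#   from . import base
#
#   class reverse(base.Function):
#       def __init__(self):
#           # exactly one argument: nargs_max == 1, nargs_min == 1
#           super(reverse, self).__init__("reverse", 1, 1)
#
#       def execute(self, args):
#           if not super(reverse, self).execute(args):
#               return None
#           return args[0][::-1]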
site-packages/tuned/utils/config_parser.py000064400000003037147511334670014754 0ustar00# ConfigParser wrapper providing compatibility layer for python 2.7/3

try:
	python3 = True
	import configparser as cp
except ImportError:
	python3 = False
	import ConfigParser as cp
	from StringIO import StringIO
	import re

class Error(cp.Error):
	pass

if python3:

	class ConfigParser(cp.ConfigParser):
		pass

else:

	class ConfigParser(cp.ConfigParser):

		def __init__(self, delimiters=None, inline_comment_prefixes=None, strict=False, *args, **kwargs):
			delims = "".join(list(delimiters))
			# REs taken from the python-2.7 ConfigParser
			self.OPTCRE = re.compile(
				r'(?P<option>[^' + delims + r'\s][^' + delims + ']*)'
				r'\s*(?P<vi>[' + delims + r'])\s*'
				r'(?P<value>.*)$'
			)
			self.OPTCRE_NV = re.compile(
				r'(?P<option>[^' + delims + r'\s][^' + delims + ']*)'
				r'\s*(?:'
				r'(?P<vi>[' + delims + r'])\s*'
				r'(?P<value>.*))?$'
			)
			cp.ConfigParser.__init__(self, *args, **kwargs)
			self._inline_comment_prefixes = inline_comment_prefixes or []
			self._re = re.compile(r"\s+(%s).*" % ")|(".join(list(self._inline_comment_prefixes)))

		def read_string(self, string, source="<string>"):
			sfile = StringIO(string)
			self.read_file(sfile, source)

		def readfp(self, fp, filename=None):
			cp.ConfigParser.readfp(self, fp, filename)
			# remove inline comments
			all_sections = [self._defaults]
			all_sections.extend(self._sections.values())
			for options in all_sections:
				for name, val in options.items():
					options[name] = self._re.sub("", val)

		def read_file(self, f, source="<???>"):
			self.readfp(f, source)
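# Minimal usage sketch (values are examples only): the wrapper is instantiated
# the same way other modules in this package do it, with '=' as the delimiter
# and '#' starting inline comments; on both Python versions the inline comment
# is stripped from the stored value.
#
#   cfg = ConfigParser(delimiters=('='), inline_comment_prefixes=('#'), strict=False)
#   cfg.read_string("[main]\nprofile = balanced  # inline comment\n")
#   cfg.get("main", "profile")   # -> "balanced"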
site-packages/tuned/utils/profile_recommender.py000064400000013463147511334670016157 0ustar00import os
import re
import errno
import procfs
import subprocess
from tuned.utils.config_parser import ConfigParser, Error

try:
	import syspurpose.files
	have_syspurpose = True
except:
	have_syspurpose = False

import tuned.consts as consts
import tuned.logs
from tuned.utils.commands import commands

log = tuned.logs.get()

class ProfileRecommender:

	def __init__(self, is_hardcoded = False):
		self._is_hardcoded = is_hardcoded
		self._commands = commands()
		self._chassis_type = None

	def recommend(self):
		profile = consts.DEFAULT_PROFILE
		if self._is_hardcoded:
			return profile

		has_root = os.geteuid() == 0
		if not has_root:
			log.warning("Profile recommender is running without root privileges. Profiles with virt recommendation condition will be omitted.")
		matching = self.process_config(consts.RECOMMEND_CONF_FILE,
									   has_root=has_root)
		if matching is not None:
			return matching
		files = {}
		for directory in consts.RECOMMEND_DIRECTORIES:
			contents = []
			try:
				contents = os.listdir(directory)
			except OSError as e:
				if e.errno != errno.ENOENT:
					log.error("error accessing %s: %s" % (directory, e))
			for name in contents:
				path = os.path.join(directory, name)
				files[name] = path
		for name in sorted(files.keys()):
			path = files[name]
			matching = self.process_config(path, has_root=has_root)
			if matching is not None:
				return matching
		return profile

	def process_config(self, fname, has_root=True):
		matching_profile = None
		syspurpose_error_logged = False
		try:
			if not os.path.isfile(fname):
				return None
			config = ConfigParser(delimiters=('='), inline_comment_prefixes=('#'), strict=False)
			config.optionxform = str
			with open(fname) as f:
				config.read_file(f, fname)
			for section in config.sections():
				match = True
				for option in config.options(section):
					value = config.get(section, option, raw=True)
					if value == "":
						value = r"^$"
					if option == "virt":
						if not has_root:
							match = False
							break
						if not re.match(value,
								self._commands.execute(["virt-what"])[1], re.S):
							match = False
					elif option == "system":
						if not re.match(value,
								self._commands.read_file(
								consts.SYSTEM_RELEASE_FILE,
								no_error = True), re.S):
							match = False
					elif option[0] == "/":
						if not os.path.exists(option) or not re.match(value,
								self._commands.read_file(option), re.S):
							match = False
					elif option[0:7] == "process":
						ps = procfs.pidstats()
						ps.reload_threads()
						if len(ps.find_by_regex(re.compile(value))) == 0:
							match = False
					elif option == "chassis_type":
						chassis_type = self._get_chassis_type()

						if not re.match(value, chassis_type, re.IGNORECASE):
							match = False
					elif option == "syspurpose_role":
						role = ""
						if have_syspurpose:
							s = syspurpose.files.SyspurposeStore(
									syspurpose.files.USER_SYSPURPOSE,
									raise_on_error = True)
							try:
								s.read_file()
								role = s.contents["role"]
							except (IOError, OSError, KeyError) as e:
								if hasattr(e, "errno") and e.errno != errno.ENOENT:
									log.error("Failed to load the syspurpose\
										file: %s" % e)
						else:
							if not syspurpose_error_logged:
								log.error("Failed to process 'syspurpose_role' in '%s'\
									, the syspurpose module is not available" % fname)
								syspurpose_error_logged = True
						if re.match(value, role, re.IGNORECASE) is None:
							match = False

				if match:
					# remove the ",.*" suffix
					r = re.compile(r",[^,]*$")
					matching_profile = r.sub("", section)
					break
		except (IOError, OSError, Error) as e:
			log.error("error processing '%s', %s" % (fname, e))
		return matching_profile

	def _get_chassis_type(self):
		if self._chassis_type is not None:
			log.debug("returning cached chassis type '%s'" % self._chassis_type)
			return self._chassis_type

		# Check DMI sysfs first
		# Based on SMBios 3.3.0 specs (https://www.dmtf.org/sites/default/files/standards/documents/DSP0134_3.3.0.pdf)
		DMI_CHASSIS_TYPES = ["", "Other", "Unknown", "Desktop", "Low Profile Desktop", "Pizza Box", "Mini Tower", "Tower",
							"Portable", "Laptop", "Notebook", "Hand Held", "Docking Station", "All In One", "Sub Notebook",
							"Space-saving", "Lunch Box", "Main Server Chassis", "Expansion Chassis", "Sub Chassis",
							"Bus Expansion Chassis", "Peripheral Chassis", "RAID Chassis", "Rack Mount Chassis", "Sealed-case PC",
							"Multi-system", "CompactPCI", "AdvancedTCA", "Blade", "Blade Enclosing", "Tablet",
							"Convertible", "Detachable", "IoT Gateway", "Embedded PC", "Mini PC", "Stick PC"]
		try:
			with open('/sys/devices/virtual/dmi/id/chassis_type', 'r') as sysfs_chassis_type:
				chassis_type_id = int(sysfs_chassis_type.read())

			self._chassis_type = DMI_CHASSIS_TYPES[chassis_type_id]
		except IndexError:
			log.error("Unknown chassis type id read from dmi sysfs: %d" % chassis_type_id)
		except (OSError, IOError) as e:
			log.warn("error accessing dmi sysfs file: %s" % e)

		if self._chassis_type:
			log.debug("chassis type - %s" % self._chassis_type)
			return self._chassis_type

		# Fallback - try parsing dmidecode output
		try:
			p_dmi = subprocess.Popen(['dmidecode', '-s', 'chassis-type'],
				stdout=subprocess.PIPE, stderr=subprocess.PIPE,
				close_fds=True)

			(dmi_output, dmi_error) = p_dmi.communicate()

			if p_dmi.returncode:
				log.error("dmidecode finished with error (ret %d): '%s'" % (p_dmi.returncode, dmi_error))
			else:
				self._chassis_type = dmi_output.strip().decode()
		except (OSError, IOError) as e:
			log.warn("error executing dmidecode tool : %s" % e)

		if not self._chassis_type:
			log.debug("could not determine chassis type.")
			self._chassis_type = ""
		else:
			log.debug("chassis type - %s" % self._chassis_type)

		return self._chassis_type
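# Illustrative recommend.conf snippet (section names and regexes are examples
# only): the first section whose options all match selects the profile named
# by the section; anything after a trailing comma in the section name is
# dropped before the name is returned.
#
#   [virtual-guest]
#   virt=.*kvm.*
#
#   [throughput-performance,server-chassis]
#   chassis_type=.*(Server|Rack Mount).*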
site-packages/tuned/utils/__init__.py000064400000000000147511334670013655 0ustar00site-packages/tuned/utils/polkit.py000064400000002630147511334670013433 0ustar00import dbus
import tuned.logs

log = tuned.logs.get()

class polkit():
	def __init__(self):
		self._bus = dbus.SystemBus()
		self._proxy = self._bus.get_object('org.freedesktop.PolicyKit1', '/org/freedesktop/PolicyKit1/Authority', follow_name_owner_changes = True)
		self._authority = dbus.Interface(self._proxy, dbus_interface='org.freedesktop.PolicyKit1.Authority')

	def check_authorization(self, sender, action_id):
		"""Check authorization, return codes:
			1  - authorized
			2  - polkit error, but authorized with fallback method
			0  - unauthorized
			-1 - polkit error and unauthorized by the fallback method
			-2 - polkit error and unable to use the fallback method
		"""

		if sender is None or action_id is None:
			return False
		details = {}
		flags = 1            # AllowUserInteraction flag
		cancellation_id = "" # No cancellation id
		subject = ("system-bus-name", {"name" : sender})
		try:
			ret = self._authority.CheckAuthorization(subject, action_id, details, flags, cancellation_id)[0]
		except (dbus.exceptions.DBusException, ValueError) as e:
			log.error("error querying polkit: %s" % e)
			# No polkit or polkit error, fallback to always allow root
			try:
				uid = self._bus.get_unix_user(sender)
			except dbus.exceptions.DBusException as e:
				log.error("error using fallback authorization method: %s" % e)
				return -2
			if uid == 0:
				return 2
			else:
				return -1
		return 1 if ret else 0
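# Minimal usage sketch (the action id is an example only; `sender` is the
# caller's unique D-Bus bus name passed into a method handler):
#
#   authorizer = polkit()
#   ret = authorizer.check_authorization(sender, "org.tuned.example_action")
#   if ret in (1, 2):
#       pass  # authorized (2 = polkit failed, root fallback allowed)
#   else:
#       pass  # 0 / -1 / -2: unauthorized or a polkit/fallback error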
[binary content elided: site-packages/tuned/utils/__pycache__/ contains compiled CPython 3.6 bytecode (.cpython-36.pyc and .cpython-36.opt-1.pyc variants) for __init__, commands, config_parser, global_config, nettool, plugin_loader, polkit and profile_recommender; the raw bytecode is not human-readable and is omitted here]
site-packages/tuned/utils/plugin_loader.py000064400000003403147511334670014754 0ustar00import tuned.logs
import os

__all__ = ["PluginLoader"]

log = tuned.logs.get()

class PluginLoader(object):
	__slots__ = ["_namespace", "_prefix", "_interface"]

	def _set_loader_parameters(self):
		"""
		This method has to be implemented in a child class and should
		set the _namespace, _prefix, and _interface member attributes.
		"""
		raise NotImplementedError()

	def __init__(self):
		super(PluginLoader, self).__init__()

		self._namespace = None
		self._prefix = None
		self._interface = None
		self._set_loader_parameters()
		assert type(self._namespace) is str
		assert type(self._prefix) is str
		assert type(self._interface) is type and issubclass(self._interface, object)

	def load_plugin(self, plugin_name):
		assert type(plugin_name) is str
		module_name = "%s.%s%s" % (self._namespace, self._prefix, plugin_name)
		return self._get_class(module_name)

	def _get_class(self, module_name):
		log.debug("loading module %s" % module_name)
		module = __import__(module_name)
		path = module_name.split(".")
		path.pop(0)

		while len(path) > 0:
			module = getattr(module, path.pop(0))

		for name in module.__dict__:
			cls = getattr(module, name)
			if type(cls) is type and issubclass(cls, self._interface):
				return cls

		raise ImportError("Cannot find the plugin class.")

	def load_all_plugins(self):
		plugins_package = __import__(self._namespace)
		plugin_clss = []
		for module_name in os.listdir(plugins_package.plugins.__path__[0]):
			try:
				module_name = os.path.splitext(module_name)[0]
				if not module_name.startswith("plugin_"):
					continue
				plugin_class = self._get_class(
					"%s.%s" % (self._namespace, module_name)
					)
				if plugin_class not in plugin_clss:
					plugin_clss.append(plugin_class)
			except ImportError:
				pass
		return plugin_clss
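# A minimal sketch of a concrete loader (hypothetical names, not part of TuneD):
#
# class MyPluginLoader(PluginLoader):
# 	def _set_loader_parameters(self):
# 		self._namespace = "tuned.plugins"
# 		self._prefix = "plugin_"
# 		self._interface = base.Plugin
#
# MyPluginLoader().load_plugin("cpu") would then import tuned.plugins.plugin_cpu
# and return the class defined there that subclasses base.Plugin.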

site-packages/tuned/utils/commands.py000064400000037771147511334670013750 0ustar00import errno
import hashlib
import tuned.logs
import copy
import os
import shutil
import tuned.consts as consts
import re
from subprocess import *
from tuned.exceptions import TunedException

log = tuned.logs.get()

class commands:

	def __init__(self, logging = True):
		self._logging = logging

	def _error(self, msg):
		if self._logging:
			log.error(msg)

	def _debug(self, msg):
		if self._logging:
			log.debug(msg)

	def get_bool(self, value):
		v = str(value).upper().strip()
		return {"Y":"1", "YES":"1", "T":"1", "TRUE":"1", "N":"0", "NO":"0", "F":"0", "FALSE":"0"}.get(v, value)

	def remove_ws(self, s):
		return re.sub(r'\s+', ' ', str(s)).strip()

	def unquote(self, v):
		return re.sub("^\"(.*)\"$", r"\1", v)

	# escape the escape character (by default '\')
	def escape(self, s, what_escape = "\\", escape_by = "\\"):
		return s.replace(what_escape, "%s%s" % (escape_by, what_escape))

	# clear escape characters (by default '\')
	def unescape(self, s, escape_char = "\\"):
		return s.replace(escape_char, "")

	# add spaces to align s2 to pos, returns resulting string: s1 + spaces + s2
	def align_str(self, s1, pos, s2):
		return s1 + " " * (pos - len(s1)) + s2

	# convert dictionary 'd' to flat list and return it
	# it uses sort on the dictionary items to return consistent results
	# for dictionaries with different insert/delete history
	def dict2list(self, d):
		l = []
		if d is not None:
			for i in sorted(d.items()):
				l += list(i)
		return l

	# Compile regex to speedup multiple_re_replace or re_lookup
	def re_lookup_compile(self, d):
		if d is None:
			return None
		return re.compile("(%s)" % ")|(".join(list(d.keys())))

	# Do multiple regex replaces in 's' according to lookup table described by
	# dictionary 'd', e.g.: d = {"re1": "replace1", "re2": "replace2", ...}
	# r can be regex precompiled by re_lookup_compile for speedup
	def multiple_re_replace(self, d, s, r = None, flags = 0):
		if d is None:
			if r is None:
				return s
		else:
			if len(d) == 0 or s is None:
				return s
		if r is None:
			r = self.re_lookup_compile(d)
		return r.sub(lambda mo: list(d.values())[mo.lastindex - 1], s, flags)

	# Do regex lookup on 's' according to lookup table described by
	# dictionary 'd' and return corresponding value from the dictionary,
	# e.g.: d = {"re1": val1, "re2": val2, ...}
	# r can be regex precompiled by re_lookup_compile for speedup
	def re_lookup(self, d, s, r = None):
		if len(d) == 0 or s is None:
			return None
		if r is None:
			r = self.re_lookup_compile(d)
		mo = r.search(s)
		if mo:
			return list(d.values())[mo.lastindex - 1]
		return None

	def write_to_file(self, f, data, makedir = False, no_error = False):
		"""Write data to a file.

		Parameters:
		f -- filename where to write
		data -- data to write
		makedir -- if True and the path doesn't exist, it will be created
		no_error -- if True errors are silenced; it can also be a list of ignored errnos

		Return:
		bool -- True on success
		"""
		self._debug("Writing to file: '%s' < '%s'" % (f, data))
		if makedir:
			d = os.path.dirname(f)
			if os.path.isdir(d):
				makedir = False
		try:
			if makedir:
				os.makedirs(d)
			fd = open(f, "w")
			fd.write(str(data))
			fd.close()
			rc = True
		except (OSError, IOError) as e:
			rc = False
			if isinstance(no_error, bool) and not no_error or \
				isinstance(no_error, list) and e.errno not in no_error:
					self._error("Writing to file '%s' error: '%s'" % (f, e))
		return rc

	def read_file(self, f, err_ret = "", no_error = False):
		old_value = err_ret
		try:
			f = open(f, "r")
			old_value = f.read()
			f.close()
		except (OSError,IOError) as e:
			if not no_error:
				self._error("Error when reading file '%s': '%s'" % (f, e))
		self._debug("Read data from file: '%s' > '%s'" % (f, old_value))
		return old_value

	def rmtree(self, f, no_error = False):
		self._debug("Removing tree: '%s'" % f)
		if os.path.exists(f):
			try:
				shutil.rmtree(f, no_error)
			except OSError as error:
				if not no_error:
					log.error("cannot remove tree '%s': '%s'" % (f, str(error)))
				return False
		return True

	def unlink(self, f, no_error = False):
		self._debug("Removing file: '%s'" % f)
		if os.path.exists(f):
			try:
				os.unlink(f)
			except OSError as error:
				if not no_error:
					log.error("cannot remove file '%s': '%s'" % (f, str(error)))
				return False
		return True

	def rename(self, src, dst, no_error = False):
		self._debug("Renaming file '%s' to '%s'" % (src, dst))
		try:
			os.rename(src, dst)
		except OSError as error:
			if not no_error:
				log.error("cannot rename file '%s' to '%s': '%s'" % (src, dst, str(error)))
			return False
		return True

	def copy(self, src, dst, no_error = False):
		try:
			log.debug("copying file '%s' to '%s'" % (src, dst))
			shutil.copy(src, dst)
			return True
		except IOError as e:
			if not no_error:
				log.error("cannot copy file '%s' to '%s': %s" % (src, dst, e))
			return False

	def replace_in_file(self, f, pattern, repl):
		data = self.read_file(f)
		if len(data) <= 0:
			return False;
		return self.write_to_file(f, re.sub(pattern, repl, data, flags = re.MULTILINE))

	# do multiple replaces in file 'f' by using dictionary 'd',
	# e.g.: d = {"re1": val1, "re2": val2, ...}
	def multiple_replace_in_file(self, f, d):
		data = self.read_file(f)
		if len(data) <= 0:
			return False;
		return self.write_to_file(f, self.multiple_re_replace(d, data, flags = re.MULTILINE))

	# makes sure that options from 'd' are set to values from 'd' in file 'f',
	# when needed it edits options or add new options if they don't
	# exist and 'add' is set to True, 'd' has the following form:
	# d = {"option_1": value_1, "option_2": value_2, ...}
	def add_modify_option_in_file(self, f, d, add = True):
		data = self.read_file(f)
		for opt in d:
			o = str(opt)
			v = str(d[opt])
			if re.search(r"\b" + o + r"\s*=.*$", data, flags = re.MULTILINE) is None:
				if add:
					if len(data) > 0 and data[-1] != "\n":
						data += "\n"
					data += "%s=\"%s\"\n" % (o, v)
			else:
				data = re.sub(r"\b(" + o + r"\s*=).*$", r"\1" + "\"" + self.escape(v) + "\"", data, flags = re.MULTILINE)

		return self.write_to_file(f, data)
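	# Illustrative example (file and option names are hypothetical):
	# add_modify_option_in_file("/etc/default/grub", {"GRUB_TIMEOUT": 5}) rewrites
	# an existing 'GRUB_TIMEOUT=...' line to GRUB_TIMEOUT="5" or, when the option
	# is missing, appends GRUB_TIMEOUT="5" on a new line.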

	# calculates the md5sum of file 'f'
	def md5sum(self, f):
		data = self.read_file(f)
		return hashlib.md5(str(data).encode("utf-8")).hexdigest()

	# calculates the sha256sum of file 'f'
	def sha256sum(self, f):
		data = self.read_file(f)
		return hashlib.sha256(str(data).encode("utf-8")).hexdigest()

	# returns machine ID or empty string "" in case of error
	def get_machine_id(self, no_error = True):
		return self.read_file(consts.MACHINE_ID_FILE, no_error).strip()

	# "no_errors" can be a list of return codes not treated as errors; if 0 is in no_errors, all errors are ignored
	# returns (retcode, out), where retcode is the exit code of the executed process or -errno if
	# an OSError or IOError exception happened
	def execute(self, args, shell = False, cwd = None, env = {}, no_errors = [], return_err = False):
		retcode = 0
		_environment = os.environ.copy()
		_environment["LC_ALL"] = "C"
		_environment.update(env)

		self._debug("Executing %s." % str(args))
		out = ""
		err_msg = None
		try:
			proc = Popen(args, stdout = PIPE, stderr = PIPE, \
					env = _environment, \
					shell = shell, cwd = cwd, \
					close_fds = True, \
					universal_newlines = True)
			out, err = proc.communicate()

			retcode = proc.returncode
			if retcode and not retcode in no_errors and not 0 in no_errors:
				err_out = err[:-1]
				if len(err_out) == 0:
					err_out = out[:-1]
				err_msg = "Executing '%s' error: %s" % (' '.join(args), err_out)
				if not return_err:
					self._error(err_msg)
		except (OSError, IOError) as e:
			retcode = -e.errno if e.errno is not None else -1
			if not abs(retcode) in no_errors and not 0 in no_errors:
				err_msg = "Executing '%s' error: %s" % (' '.join(args), e)
				if not return_err:
					self._error(err_msg)
		if return_err:
			return retcode, out, err_msg
		else:
			return retcode, out
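	# Illustrative example: execute(["cat", "/proc/cmdline"]) returns (0, output)
	# on success; if the binary cannot be run, retcode is the negative errno,
	# e.g. (-2, "") for a non-existent command.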

	# Helper for parsing kernel options like:
	# [always] never
	# It will return 'always'
	def get_active_option(self, options, dosplit = True):
		m = re.match(r'.*\[([^\]]+)\].*', options)
		if m:
			return m.group(1)
		if dosplit:
			return options.split()[0]
		return options

	# Checks whether CPU is online
	def is_cpu_online(self, cpu):
		scpu = str(cpu)
		# CPU0 is always online
		return cpu == "0" or self.read_file("/sys/devices/system/cpu/cpu%s/online" % scpu, no_error = True).strip() == "1"

	# Converts hexadecimal CPU mask to CPU list
	def hex2cpulist(self, mask):
		if mask is None:
			return None
		mask = str(mask).replace(",", "")
		try:
			m = int(mask, 16)
		except ValueError:
			log.error("invalid hexadecimal mask '%s'" % str(mask))
			return []
		return self.bitmask2cpulist(m)
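	# Illustrative example: hex2cpulist("0x0000000f") -> [0, 1, 2, 3]
	# (the mask may contain commas, e.g. "00000001,00000000" -> [32])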

	# Converts an integer bitmask to a list of cpus (e.g. [0,3,4])
	def bitmask2cpulist(self, mask):
		cpu = 0
		cpus = []
		while mask > 0:
			if mask & 1:
				cpus.append(cpu)
			mask >>= 1
			cpu += 1
		return cpus

	# Unpacks a CPU list, i.e. 1-3 will be converted to 1, 2, 3. It supports
	# hexmasks, which need to be prefixed by "0x". Hexmasks can contain commas,
	# which will be removed. When combining hexmasks with a CPU list they need
	# to be separated by ",,", e.g.: 0-3, 0xf,, 6. It also supports negation of
	# cpus by specifying "^" or "!", e.g.: 0-5, ^3 will output the list as
	# "0,1,2,4,5" (excluding 3). Note: negation supports only cpu numbers.
	# If "strip_chars" is not None and l is not a list, we try to strip these
	# characters. It should be a string with the chars that are sent to the
	# string.strip method. Default is English single and double quotes ("') rhbz#1891036
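	# Brief illustrations of the rules above (not exhaustive):
	#   cpulist_unpack("1-3,5")   -> [1, 2, 3, 5]
	#   cpulist_unpack("0xf,,6")  -> [0, 1, 2, 3, 6]
	#   cpulist_unpack("0-5,^3")  -> [0, 1, 2, 4, 5]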
	def cpulist_unpack(self, l, strip_chars='\'"'):
		rl = []
		if l is None:
			return l
		ll = l
		if type(ll) is not list:
			if strip_chars is not None:
				ll = str(ll).strip(strip_chars)
			ll = str(ll).split(",")
		ll2 = []
		negation_list = []
		hexmask = False
		hv = ""
		# Remove commas from hexmasks
		for v in ll:
			sv = str(v)
			if hexmask:
				if len(sv) == 0:
					hexmask = False
					ll2.append(hv)
					hv = ""
				else:
					hv += sv
			else:
				if sv[0:2].lower() == "0x":
					hexmask = True
					hv = sv
				elif sv and (sv[0] == "^" or sv[0] == "!"):
					nl = sv[1:].split("-")
					try:
						if (len(nl) > 1):
							negation_list += list(range(
								int(nl[0]),
								int(nl[1]) + 1
								)
							)
						else:
							negation_list.append(int(sv[1:]))
					except ValueError:
						return []
				else:
					if len(sv) > 0:
						ll2.append(sv)
		if len(hv) > 0:
			ll2.append(hv)
		for v in ll2:
			vl = v.split("-")
			if v[0:2].lower() == "0x":
				rl += self.hex2cpulist(v)
			else:
				try:
					if len(vl) > 1:
						rl += list(range(int(vl[0]), int(vl[1]) + 1))
					else:
						rl.append(int(vl[0]))
				except ValueError:
					return []
		cpu_list = sorted(list(set(rl)))

		# Remove negated cpus after expanding
		for cpu in negation_list:
			if cpu in cpu_list:
				cpu_list.remove(cpu)
		return cpu_list

	# Packs CPU list, i.e. 1, 2, 3  will be converted to 1-3. It unpacks the
	# CPU list through cpulist_unpack first, so see its description about the
	# details of the input syntax
	def cpulist_pack(self, l):
		l = self.cpulist_unpack(l)
		if l is None or len(l) == 0:
			return l
		i = 0
		j = i
		rl = []
		while i + 1 < len(l):
			if l[i + 1] - l[i] != 1:
				if j != i:
					rl.append(str(l[j]) + "-" + str(l[i]))
				else:
					rl.append(str(l[i]))
				j = i + 1
			i += 1
		if j + 1 < len(l):
			rl.append(str(l[j]) + "-" + str(l[-1]))
		else:
			rl.append(str(l[-1]))
		return rl
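	# Illustrative examples: cpulist_pack("1,2,3,5") -> ["1-3", "5"],
	# cpulist_pack("0xf") -> ["0-3"]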

	# Inverts CPU list (i.e. makes its complement)
	def cpulist_invert(self, l):
		cpus = self.cpulist_unpack(l)
		online = self.cpulist_unpack(self.read_file("/sys/devices/system/cpu/online"))
		return list(set(online) - set(cpus))

	# Converts CPU list to hexadecimal CPU mask
	def cpulist2hex(self, l):
		if l is None:
			return None
		ul = self.cpulist_unpack(l)
		if ul is None:
			return None
		m = self.cpulist2bitmask(ul)
		s = "%x" % m
		ls = len(s)
		if ls % 8 != 0:
			ls += 8 - ls % 8
		s = s.zfill(ls)
		return ",".join(s[i:i + 8] for i in range(0, len(s), 8))
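	# Illustrative examples: cpulist2hex("0-3") -> "0000000f",
	# cpulist2hex("0-35") -> "0000000f,ffffffff"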

	def cpulist2bitmask(self, l):
		m = 0
		for v in l:
			m |= pow(2, v)
		return m

	def cpulist2string(self, l, prefix = ""):
		return ",".join((prefix + str(v)) for v in l)

	# Converts string s consisting of "dev1,dev2,dev3,..." to list ["dev1", "dev2, "dev3", ...],
	# whitespaces are ignored, cpu lists are supported with the prefix "cpulist:", e.g.
	# "cpulist:0-2,4" is converted to ["cpu0", "cpu1", "cpu2", "cpu4"]. If device name starts
	# with "cpulist:" write it as "cpulist:cpulist:". Escape commas in name with the "\,".
	def devstr2devs(self, s):
		if s[0:8].lower() == "cpulist:":
			s = s[8:]
			if s[0:8].lower() != "cpulist:":
				return [("cpu" + str(v)) for v in self.cpulist_unpack(s)]
		l = re.split(r"\s*(?<!\\),\s*", s)
		return [str(v).replace(r"\,", ",") for v in l]

	# Do not do balancing on a patched Python 2 interpreter (rhbz#1028122).
	# This means less CPU usage on a patched interpreter. On a non-patched
	# interpreter it is not allowed to sleep longer than 50 ms.
	def wait(self, terminate, time):
		try:
			return terminate.wait(time, False)
		except:
			return terminate.wait(time)

	def get_size(self, s):
		s = str(s).strip().upper()
		for unit in ["KB", "MB", "GB", ""]:
			unit_ix = s.rfind(unit)
			if unit_ix == -1:
				continue
			try:
				val = int(s[:unit_ix])
				u = s[unit_ix:]
				if u == "KB":
					val *= 1024
				elif u == "MB":
					val *= 1024 * 1024
				elif u == "GB":
					val *= 1024 * 1024 * 1024
				elif u != "":
					val = None
				return val
			except ValueError:
				return None

	def get_active_profile(self):
		profile_name = ""
		mode = ""
		try:
			with open(consts.ACTIVE_PROFILE_FILE, "r") as f:
				profile_name = f.read().strip()
		except IOError as e:
			if e.errno != errno.ENOENT:
				raise TunedException("Failed to read active profile: %s" % e)
		except (OSError, EOFError) as e:
			raise TunedException("Failed to read active profile: %s" % e)
		try:
			with open(consts.PROFILE_MODE_FILE, "r") as f:
				mode = f.read().strip()
				if mode not in ["", consts.ACTIVE_PROFILE_AUTO, consts.ACTIVE_PROFILE_MANUAL]:
					raise TunedException("Invalid value in file %s." % consts.PROFILE_MODE_FILE)
		except IOError as e:
			if e.errno != errno.ENOENT:
				raise TunedException("Failed to read profile mode: %s" % e)
		except (OSError, EOFError) as e:
			raise TunedException("Failed to read profile mode: %s" % e)
		if mode == "":
			manual = None
		else:
			manual = mode == consts.ACTIVE_PROFILE_MANUAL
		if profile_name == "":
			profile_name = None
		return (profile_name, manual)

	def save_active_profile(self, profile_name, manual):
		try:
			with open(consts.ACTIVE_PROFILE_FILE, "w") as f:
				if profile_name is not None:
					f.write(profile_name + "\n")
		except (OSError,IOError) as e:
			raise TunedException("Failed to save active profile: %s" % e.strerror)
		try:
			with open(consts.PROFILE_MODE_FILE, "w") as f:
				mode = consts.ACTIVE_PROFILE_MANUAL if manual else consts.ACTIVE_PROFILE_AUTO
				f.write(mode + "\n")
		except (OSError,IOError) as e:
			raise TunedException("Failed to save profile mode: %s" % e.strerror)

	def get_post_loaded_profile(self):
		profile_name = ""
		try:
			with open(consts.POST_LOADED_PROFILE_FILE, "r") as f:
				profile_name = f.read().strip()
		except IOError as e:
			if e.errno != errno.ENOENT:
				raise TunedException("Failed to read the active post-loaded profile: %s" % e)
		except (OSError, EOFError) as e:
			raise TunedException("Failed to read the active post-loaded profile: %s" % e)
		if profile_name == "":
			profile_name = None
		return profile_name

	def save_post_loaded_profile(self, profile_name):
		try:
			with open(consts.POST_LOADED_PROFILE_FILE, "w") as f:
				if profile_name is not None:
					f.write(profile_name + "\n")
		except (OSError,IOError) as e:
			raise TunedException("Failed to save the active post-loaded profile: %s" % e.strerror)

	# Translates characters in 'text' from 'source_chars' to 'dest_chars'
	def tr(self, text, source_chars, dest_chars):
		try:
			trans = str.maketrans(source_chars, dest_chars)
		except AttributeError:
			import string
			trans = string.maketrans(source_chars, dest_chars)
		return text.translate(trans)
site-packages/tuned/utils/global_config.py000064400000006225147511334670014722 0ustar00import tuned.logs
from tuned.utils.config_parser import ConfigParser, Error
from tuned.exceptions import TunedException
import tuned.consts as consts
from tuned.utils.commands import commands

__all__ = ["GlobalConfig"]

log = tuned.logs.get()

class GlobalConfig():

	def __init__(self,config_file = consts.GLOBAL_CONFIG_FILE):
		self._cfg = {}
		self.load_config(file_name=config_file)
		self._cmd = commands()

	@staticmethod
	def get_global_config_spec():
		"""
		Easy validation mimicking configobj.
		Returns two dicts. The first holds the default values (default None):
		global_default[consts.CFG_SOMETHING] = consts.CFG_DEF_SOMETHING or None
		The second holds the configobj function for the value type (default "get"
		for strings, others e.g. getboolean, getint):
		global_function[consts.CFG_SOMETHING] = consts.CFG_FUNC_SOMETHING or get
		"""
		options = [opt for opt in dir(consts)
				   if opt.startswith("CFG_") and
				   not opt.startswith("CFG_FUNC_") and
				   not opt.startswith("CFG_DEF_")]
		global_default = dict((getattr(consts, opt), getattr(consts, "CFG_DEF_" + opt[4:], None)) for opt in options)
		global_function = dict((getattr(consts, opt), getattr(consts, "CFG_FUNC_" + opt[4:], "get")) for opt in options)
		return global_default, global_function
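	# Illustrative example (the CFG_FOO constants are hypothetical): with
	# consts.CFG_FOO = "foo", consts.CFG_DEF_FOO = "1" and
	# consts.CFG_FUNC_FOO = "getboolean", the returned dicts contain
	# global_default["foo"] = "1" and global_function["foo"] = "getboolean".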

	def load_config(self, file_name = consts.GLOBAL_CONFIG_FILE):
		"""
		Loads global configuration file.
		"""
		log.debug("reading and parsing global configuration file '%s'" % file_name)
		try:
			config_parser = ConfigParser(delimiters=('='), inline_comment_prefixes=('#'), strict=False)
			config_parser.optionxform = str
			with open(file_name) as f:
				config_parser.read_string("[" + consts.MAGIC_HEADER_NAME + "]\n" + f.read(), file_name)
			self._cfg, _global_config_func = self.get_global_config_spec()
			for option in config_parser.options(consts.MAGIC_HEADER_NAME):
				if option in self._cfg:
					try:
						func = getattr(config_parser, _global_config_func[option])
						self._cfg[option] = func(consts.MAGIC_HEADER_NAME, option)
					except Error:
						raise TunedException("Global TuneD configuration file '%s' is not valid."
											 % file_name)
				else:
					log.info("Unknown option '%s' in global config file '%s'." % (option, file_name))
					self._cfg[option] = config_parser.get(consts.MAGIC_HEADER_NAME, option, raw=True)
		except IOError as e:
			raise TunedException("Global TuneD configuration file '%s' not found." % file_name)
		except Error as e:
			raise TunedException("Error parsing global TuneD configuration file '%s'." % file_name)

	def get(self, key, default = None):
		return self._cfg.get(key, default)

	def get_bool(self, key, default = None):
		if self._cmd.get_bool(self.get(key, default)) == "1":
			return True
		return False

	def get_int(self, key, default = 0):
		i = self._cfg.get(key, default)
		if i:
			if isinstance(i, int):
				return i
			else:
				return int(i, 0)
		return default

	def set(self, key, value):
		self._cfg[key] = value

	def get_size(self, key, default = None):
		val = self.get(key)
		if val is None:
			return default
		ret = self._cmd.get_size(val)
		if ret is None:
			log.error("Error parsing value '%s', using '%s'." %(val, default))
			return default
		else:
			return ret
site-packages/tuned/utils/nettool.py000064400000013106147511334670013615 0ustar00__all__ = ["ethcard"]

import tuned.logs
from subprocess import *
import re

log = tuned.logs.get()

class Nettool:

	_advertise_values = { # [ half, full ]
		10 : [ 0x001, 0x002 ],
		100 : [ 0x004, 0x008 ],
		1000 : [ 0x010, 0x020 ],
		2500 : [ 0, 0x8000 ], 
		10000 : [ 0, 0x1000 ],
		"auto" : 0x03F
	}

	_disabled = False

	def __init__(self, interface):
		self._interface = interface;
		self.update()

		log.debug("%s: speed %s, full duplex %s, autoneg %s, link %s" % (interface, self.speed, self.full_duplex, self.autoneg, self.link)) 
		log.debug("%s: supports: autoneg %s, modes %s" % (interface, self.supported_autoneg, self.supported_modes))
		log.debug("%s: advertises: autoneg %s, modes %s" % (interface, self.advertised_autoneg, self.advertised_modes))

#	def __del__(self):
#		if self.supported_autoneg:
#			self._set_advertise(self._advertise_values["auto"])

	def _clean_status(self):
		self.speed = 0
		self.full_duplex = False
		self.autoneg = False
		self.link = False

		self.supported_modes = []
		self.supported_autoneg = False

		self.advertised_modes = []
		self.advertised_autoneg = False

	def _calculate_mode(self, modes):
		mode = 0;
		for m in modes:
			mode += self._advertise_values[m[0]][ 1 if m[1] else 0 ]

		return mode

	def _set_autonegotiation(self, enable):
		if self.autoneg == enable:
			return True

		if not self.supported_autoneg:
			return False

		return 0 == call(["ethtool", "-s", self._interface, "autoneg", "on" if enable else "off"], close_fds=True)

	def _set_advertise(self, value):
		if not self._set_autonegotiation(True):
			return False

		return 0 == call(["ethtool", "-s", self._interface, "advertise", "0x%03x" % value], close_fds=True)

	def get_max_speed(self):
		max = 0
		for mode in self.supported_modes:
			if mode[0] > max: max = mode[0]

		if max > 0:
			return max
		else:
			return 1000

	def set_max_speed(self):
		if self._disabled or not self.supported_autoneg:
			return False

		#if self._set_advertise(self._calculateMode(self.supported_modes)):
		if self._set_advertise(self._advertise_values["auto"]):
			self.update()
			return True
		else:
			return False

	def set_speed(self, speed):
		if self._disabled or not self.supported_autoneg:
			return False

		mode = 0
		for am in self._advertise_values:
			if am == "auto": continue
			if am <= speed:
				mode += self._advertise_values[am][0];
				mode += self._advertise_values[am][1];

		effective_mode = mode & self._calculate_mode(self.supported_modes)

		log.debug("%s: set_speed(%d) - effective_mode 0x%03x" % (self._interface, speed, effective_mode))

		if self._set_advertise(effective_mode):
			self.update()
			return True
		else:
			return False
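	# Illustrative example (assumed card capabilities): set_speed(100) builds the
	# advertise mask for all 10/100 half and full duplex modes (0x00f), ANDs it
	# with the modes the card reports as supported and re-advertises the result,
	# so a gigabit-capable card is limited to at most 100 Mb/s.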

	def update(self):
		if self._disabled:
			return

		# run ethtool and preprocess output

		p_ethtool = Popen(["ethtool", self._interface], \
				stdout=PIPE, stderr=PIPE, close_fds=True, \
				universal_newlines = True)
		p_filter = Popen(["sed", r"s/^\s*//;s/:\s*/:\n/g"], \
				stdin=p_ethtool.stdout, stdout=PIPE, \
				universal_newlines = True, \
				close_fds=True)

		output = p_filter.communicate()[0]
		errors = p_ethtool.communicate()[1]

		if errors != "":
			log.warning("%s: some errors were reported by 'ethtool'" % self._interface)
			log.debug("%s: %s" % (self._interface, errors.replace("\n", r"\n")))
			self._clean_status()
			self._disabled = True
			return

		# parses output - kind of FSM

		self._clean_status()

		re_speed = re.compile(r"(\d+)")
		re_mode = re.compile(r"(\d+)baseT/(Half|Full)")

		state = "wait"

		for line in output.split("\n"):

			if line.endswith(":"):
				section = line[:-1]
				if section == "Speed": state = "speed"
				elif section == "Duplex": state = "duplex"
				elif section == "Auto-negotiation": state = "autoneg"
				elif section == "Link detected": state = "link"
				elif section == "Supported link modes": state = "supported_modes"
				elif section == "Supports auto-negotiation": state = "supported_autoneg"
				elif section == "Advertised link modes": state = "advertised_modes"
				elif section == "Advertised auto-negotiation": state = "advertised_autoneg"
				else: state = "wait"
				del section

			elif state == "speed":
				# Try to determine speed. If it fails, assume 1gbit ethernet
				try:
					self.speed = re_speed.match(line).group(1)
				except:
					self.speed = 1000
				state = "wait"

			elif state == "duplex":
				self.full_duplex = line == "Full"
				state = "wait"

			elif state == "autoneg":
				self.autoneg = (line == "yes" or line == "on")
				state = "wait"

			elif state == "link":
				self.link = line == "yes"
				state = "wait"

			elif state == "supported_modes":
				# Try to determine supported modes. If it fails, assume 1gbit ethernet full duplex works
				try:
					for m in line.split():
						(s, d) = re_mode.match(m).group(1,2)
						self.supported_modes.append( (int(s), d == "Full") )
					del m,s,d
				except:
					self.supported_modes.append((1000, True))
	

			elif state == "supported_autoneg":
				self.supported_autoneg = line == "Yes"
				state = "wait"

			elif state == "advertised_modes":
				# Try to determine advertised modes. If it fails, assume 1gbit ethernet full duplex works
				try:
					if line != "Not reported":
						for m in line.split():
							(s, d) = re_mode.match(m).group(1,2)
							self.advertised_modes.append( (int(s), d == "Full") )
						del m,s,d
				except:
					self.advertised_modes.append((1000, True))

			elif state == "advertised_autoneg":
				self.advertised_autoneg = line == "Yes"
				state = "wait"

def ethcard(interface):
	if not interface in ethcard.list:
		ethcard.list[interface] = Nettool(interface)

	return ethcard.list[interface]

ethcard.list = {}

site-packages/tuned/hardware/__init__.py000064400000000132147511334670014320 0ustar00from .inventory import *
from .device_matcher import *
from .device_matcher_udev import *
site-packages/tuned/hardware/device_matcher.py000064400000003050147511334670015525 0ustar00import fnmatch
import re

__all__ = ["DeviceMatcher"]

class DeviceMatcher(object):
	"""
	Device name matching against the devices specification in tuning profiles.

	The devices specification consists of multiple rules separated by spaces.
	The rules have a syntax of shell-style wildcards and are either positive
	or negative. The negative rules are prefixed with an exclamation mark.
	"""
	def match(self, rules, device_name):
		"""
		Match a device against the specification in the profile.

		If there is no positive rule in the specification, implicit rule
		which matches all devices is added. The device matches if and only
		if it matches some positive rule, but no negative rule.
		"""
		if isinstance(rules, str):
			rules = re.split(r"\s|,\s*", rules)

		positive_rules = [rule for rule in rules if not rule.startswith("!") and not rule.strip() == '']
		negative_rules = [rule[1:] for rule in rules if rule not in positive_rules]

		if len(positive_rules) == 0:
			positive_rules.append("*")

		matches = False
		for rule in positive_rules:
			if fnmatch.fnmatch(device_name, rule):
				matches = True
				break

		for rule in negative_rules:
			if fnmatch.fnmatch(device_name, rule):
				matches = False
				break

		return matches
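	# Illustrative example: match("sd* !sdb", "sda") -> True, while
	# match("sd* !sdb", "sdb") -> False, because the negative rule wins.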

	def match_list(self, rules, device_list):
		"""
		Match a device list against the specification in the profile. Returns
		the list, which is a subset of devices which match.
		"""
		matching_devices = []
		for device in device_list:
			if self.match(rules, device):
				matching_devices.append(device)

		return matching_devices
site-packages/tuned/hardware/__pycache__/device_matcher.cpython-36.opt-1.pyc
site-packages/tuned/hardware/__pycache__/device_matcher.cpython-36.pyc
site-packages/tuned/hardware/__pycache__/device_matcher_udev.cpython-36.pyc
site-packages/tuned/hardware/__pycache__/device_matcher_udev.cpython-36.opt-1.pyc
site-packages/tuned/hardware/__pycache__/inventory.cpython-36.pyc
site-packages/tuned/hardware/__pycache__/__init__.cpython-36.opt-1.pyc
site-packages/tuned/hardware/__pycache__/__init__.cpython-36.pyc
site-packages/tuned/hardware/__pycache__/inventory.cpython-36.opt-1.pyc
site-packages/tuned/hardware/inventory.py000064400000007463147511334670014624 0ustar00import pyudev
import tuned.logs
from tuned import consts

__all__ = ["Inventory"]

log = tuned.logs.get()

class Inventory(object):
	"""
	Inventory object can handle information about available hardware devices. It also informs the plugins
	about related hardware events.
	"""

	def __init__(self, udev_context=None, udev_monitor_cls=None, monitor_observer_factory=None, buffer_size=None, set_receive_buffer_size=True):
		if udev_context is not None:
			self._udev_context = udev_context
		else:
			self._udev_context = pyudev.Context()

		if udev_monitor_cls is None:
			udev_monitor_cls = pyudev.Monitor
		self._udev_monitor = udev_monitor_cls.from_netlink(self._udev_context)
		if buffer_size is None:
			buffer_size = consts.CFG_DEF_UDEV_BUFFER_SIZE

		if (set_receive_buffer_size):
			try:
				self._udev_monitor.set_receive_buffer_size(buffer_size)
			except EnvironmentError:
				log.warn("cannot set udev monitor receive buffer size, we are probably running inside a " +
					 "container or with limited capabilities, TuneD functionality may be limited")

		if monitor_observer_factory is None:
			monitor_observer_factory = _MonitorObserverFactory()
		self._monitor_observer_factory = monitor_observer_factory
		self._monitor_observer = None

		self._subscriptions = {}

	def get_device(self, subsystem, sys_name):
		"""Get a pyudev.Device object for the sys_name (e.g. 'sda')."""
		try:
			return pyudev.Devices.from_name(self._udev_context, subsystem, sys_name)
		# workaround for pyudev < 0.18
		except AttributeError:
			return pyudev.Device.from_name(self._udev_context, subsystem, sys_name)

	def get_devices(self, subsystem):
		"""Get list of devices on a given subsystem."""
		return self._udev_context.list_devices(subsystem=subsystem)

	def _handle_udev_event(self, event, device):
		if not device.subsystem in self._subscriptions:
			return

		for (plugin, callback) in self._subscriptions[device.subsystem]:
			try:
				callback(event, device)
			except Exception as e:
				log.error("Exception occurred in event handler of '%s'." % plugin)
				log.exception(e)

	def subscribe(self, plugin, subsystem, callback):
		"""Register handler of device events on a given subsystem."""
		log.debug("adding handler: %s (%s)" % (subsystem, plugin))
		callback_data = (plugin, callback)
		if subsystem in self._subscriptions:
			self._subscriptions[subsystem].append(callback_data)
		else:
			self._subscriptions[subsystem] = [callback_data, ]
			self._udev_monitor.filter_by(subsystem)
			# After start(), HW events begin to get queued up
			self._udev_monitor.start()
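		# Illustrative usage (the plugin object is hypothetical):
		#   inventory.subscribe(disk_plugin, "block", disk_plugin._hardware_events_callback)
		# After start_processing_events() the callback is invoked as
		# callback(event, device) for every udev event on the "block" subsystem.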

	def start_processing_events(self):
		if self._monitor_observer is None:
			log.debug("starting monitor observer")
			self._monitor_observer = self._monitor_observer_factory.create(self._udev_monitor, self._handle_udev_event)
			self._monitor_observer.start()

	def stop_processing_events(self):
		if self._monitor_observer is not None:
			log.debug("stopping monitor observer")
			self._monitor_observer.stop()
			self._monitor_observer = None

	def _unsubscribe_subsystem(self, plugin, subsystem):
		for callback_data in self._subscriptions[subsystem]:
			(_plugin, callback) = callback_data
			if plugin == _plugin:
				log.debug("removing handler: %s (%s)" % (subsystem, plugin))
				self._subscriptions[subsystem].remove(callback_data)

	def unsubscribe(self, plugin, subsystem=None):
		"""Unregister handler registered with subscribe method."""
		empty_subsystems = []
		for _subsystem in self._subscriptions:
			if subsystem is None or _subsystem == subsystem:
				self._unsubscribe_subsystem(plugin, _subsystem)
				if len(self._subscriptions[_subsystem]) == 0:
					empty_subsystems.append(_subsystem)

		for _subsystem in empty_subsystems:
			del self._subscriptions[_subsystem]

class _MonitorObserverFactory(object):
	def create(self, *args, **kwargs):
		return pyudev.MonitorObserver(*args, **kwargs)
site-packages/tuned/hardware/device_matcher_udev.py000064400000001021147511334670016544 0ustar00from . import device_matcher
import re

__all__ = ["DeviceMatcherUdev"]

class DeviceMatcherUdev(device_matcher.DeviceMatcher):
	def match(self, regex, device):
		"""
		Match a device against the udev regex in tuning profiles.

		device is a pyudev.Device object
		"""

		properties = ''

		try:
			items = device.properties.items()
		except AttributeError:
			items = device.items()

		for key, val in sorted(list(items)):
			properties += key + '=' + val + '\n'

		return re.search(regex, properties, re.MULTILINE) is not None
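	# Illustrative example (property values are hypothetical): a profile regex
	# such as r"ID_MODEL=Example" matches a pyudev device whose properties,
	# rendered one "KEY=value" line per property, contain ID_MODEL=Example.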
site-packages/tuned/plugins/plugin_scsi_host.py000064400000006121147511334670016025 0ustar00import errno
from . import hotplug
from .decorators import *
import tuned.logs
import tuned.consts as consts
from tuned.utils.commands import commands
import os
import re

log = tuned.logs.get()

class SCSIHostPlugin(hotplug.Plugin):
	"""
	`scsi_host`::
	
	Tunes options for SCSI hosts.
	+
	The plug-in sets Aggressive Link Power Management (ALPM) to the value specified
	by the [option]`alpm` option. The option takes one of three values:
	`min_power`, `medium_power` and `max_performance`.
	+
	NOTE: ALPM is only available on SATA controllers that use the Advanced
	Host Controller Interface (AHCI).
	+
	.ALPM setting when extended periods of idle time are expected
	====
	----
	[scsi_host]
	alpm=min_power
	----
	====
	"""

	def __init__(self, *args, **kwargs):
		super(SCSIHostPlugin, self).__init__(*args, **kwargs)

		self._cmd = commands()

	def _init_devices(self):
		super(SCSIHostPlugin, self)._init_devices()
		self._devices_supported = True
		self._free_devices = set()
		for device in self._hardware_inventory.get_devices("scsi"):
			if self._device_is_supported(device):
				self._free_devices.add(device.sys_name)

		self._assigned_devices = set()

	def _get_device_objects(self, devices):
		return [self._hardware_inventory.get_device("scsi", x) for x in devices]

	@classmethod
	def _device_is_supported(cls, device):
		return  device.device_type == "scsi_host"

	def _hardware_events_init(self):
		self._hardware_inventory.subscribe(self, "scsi", self._hardware_events_callback)

	def _hardware_events_cleanup(self):
		self._hardware_inventory.unsubscribe(self)

	def _hardware_events_callback(self, event, device):
		if self._device_is_supported(device):
			super(SCSIHostPlugin, self)._hardware_events_callback(event, device)

	def _added_device_apply_tuning(self, instance, device_name):
		super(SCSIHostPlugin, self)._added_device_apply_tuning(instance, device_name)

	def _removed_device_unapply_tuning(self, instance, device_name):
		super(SCSIHostPlugin, self)._removed_device_unapply_tuning(instance, device_name)

	@classmethod
	def _get_config_options(cls):
		return {
			"alpm"               : None,
		}

	def _instance_init(self, instance):
		instance._has_static_tuning = True
		instance._has_dynamic_tuning = False

	def _instance_cleanup(self, instance):
		pass

	def _get_alpm_policy_file(self, device):
		return os.path.join("/sys/class/scsi_host/", str(device), "link_power_management_policy")

	@command_set("alpm", per_device = True)
	def _set_alpm(self, policy, device, sim, remove):
		if policy is None:
			return None
		policy_file = self._get_alpm_policy_file(device)
		if not sim:
			if os.path.exists(policy_file):
				self._cmd.write_to_file(policy_file, policy, \
					no_error = [errno.ENOENT] if remove else False)
			else:
				log.info("ALPM control file ('%s') not found, skipping ALPM setting for '%s'" % (policy_file, str(device)))
				return None
		return policy

	@command_get("alpm")
	def _get_alpm(self, device, ignore_missing=False):
		policy_file = self._get_alpm_policy_file(device)
		policy = self._cmd.read_file(policy_file, no_error = True).strip()
		return policy if policy != "" else None
site-packages/tuned/plugins/plugin_uncore.py000064400000011163147511334670015324 0ustar00from . import hotplug
from .decorators import *
import tuned.logs
from tuned.utils.commands import commands

import os
import fnmatch

log = tuned.logs.get()
cmd = commands()

SYSFS_DIR = "/sys/devices/system/cpu/intel_uncore_frequency/"

IS_MIN = 0
IS_MAX = 1

class UncorePlugin(hotplug.Plugin):
	"""
	`uncore`::

	`max_freq_khz, min_freq_khz`:::
	Limit the maximum and minimum uncore frequency.

	These options are Intel-specific and correspond directly to `sysfs` files
	exposed by the Intel uncore frequency driver.
	====
	----
	[uncore]
	max_freq_khz=4000000
	----
	Using this option, *TuneD* will limit the maximum frequency of all uncore
	units on the Intel system to 4 GHz.
	====
	"""

	def _init_devices(self):
		self._devices_supported = True
		self._assigned_devices = set()
		self._free_devices = set()
		self._is_tpmi = False

		try:
			devices = os.listdir(SYSFS_DIR)
		except OSError:
			return

		# For new TPMI interface use only uncore devices
		tpmi_devices = fnmatch.filter(devices, 'uncore*')
		if len(tpmi_devices) > 0:
			self._is_tpmi = True  # Not used at present but can be useful in the future
			devices = tpmi_devices

		for d in devices:
			self._free_devices.add(d)

		log.debug("devices: %s", str(self._free_devices))

	def _instance_init(self, instance):
		instance._has_static_tuning = True
		instance._has_dynamic_tuning = False

	def _instance_cleanup(self, instance):
		pass

	def _get(self, dev_dir, file):
		sysfs_file = SYSFS_DIR + dev_dir + "/" + file
		value = cmd.read_file(sysfs_file)
		if len(value) > 0:
			return int(value)
		return None

	def _set(self, dev_dir, file, value):
		sysfs_file = SYSFS_DIR + dev_dir + "/" + file
		if cmd.write_to_file(sysfs_file, "%u" % value):
			return value
		return None

	@classmethod
	def _get_config_options(cls):
		return {
			"max_freq_khz": None,
			"min_freq_khz": None,
		}

	def _validate_value(self, device, min_or_max, value):
		try:
			freq_khz = int(value)
		except ValueError:
			log.error("value '%s' is not integer" % value)
			return None

		try:
			initial_max_freq_khz = self._get(device, "initial_max_freq_khz")
			initial_min_freq_khz = self._get(device, "initial_min_freq_khz")
			max_freq_khz = self._get(device, "max_freq_khz")
			min_freq_khz = self._get(device, "min_freq_khz")
		except (OSError, IOError):
			log.error("failed to read initial uncore frequency values")
			return None

		if min_or_max == IS_MAX:
			if freq_khz < min_freq_khz:
				log.error("%s: max_freq_khz %d value below min_freq_khz %d" % (device, freq_khz, min_freq_khz))
				return None

			if freq_khz > initial_max_freq_khz:
				log.info("%s: max_freq_khz %d value above initial_max_freq_khz - capped to %d" % (device, freq_khz, initial_max_freq_khz))
				freq_khz = initial_max_freq_khz

		elif min_or_max == IS_MIN:
			if freq_khz > max_freq_khz:
				log.error("%s: min_freq_khz %d value above max_freq_khz %d" % (device, freq_khz, max_freq_khz))
				return None

			if freq_khz < initial_min_freq_khz:
				log.info("%s: min_freq_khz %d value below initial_min_freq_khz - capped to %d" % (device, freq_khz, initial_min_freq_khz))
				freq_khz = initial_min_freq_khz

		else:
			return None

		return freq_khz
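	# Illustrative example (frequencies are hypothetical): with
	# initial_max_freq_khz = 4800000 and min_freq_khz = 800000,
	# _validate_value(dev, IS_MAX, "5000000") is capped and returns 4800000,
	# while _validate_value(dev, IS_MAX, "500000") returns None because the
	# requested value lies below min_freq_khz.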

	@command_set("max_freq_khz", per_device = True)
	def _set_max_freq_khz(self, value, device, sim, remove):
		max_freq_khz = self._validate_value(device, IS_MAX, value)
		if max_freq_khz is None:
			return None

		if sim:
			return max_freq_khz

		log.debug("%s: set max_freq_khz %d" % (device, max_freq_khz))
		return self._set(device, "max_freq_khz", max_freq_khz)

	@command_get("max_freq_khz")
	def _get_max_freq_khz(self, device, ignore_missing=False):
		if ignore_missing and not os.path.isdir(SYSFS_DIR):
			return None

		try:
			max_freq_khz = self._get(device, "max_freq_khz")
		except (OSError, IOError):
			log.error("failed to read uncore frequency values")
			return None

		log.debug("%s: get max_freq_khz %d" % (device, max_freq_khz))
		return max_freq_khz

	@command_set("min_freq_khz", per_device = True)
	def _set_min_freq_khz(self, value, device, sim, remove):
		min_freq_khz = self._validate_value(device, IS_MIN, value)
		if min_freq_khz is None:
			return None

		if sim:
			return min_freq_khz

		log.debug("%s: set min_freq_khz %d" % (device, min_freq_khz))
		return self._set(device, "min_freq_khz", min_freq_khz)

	@command_get("min_freq_khz")
	def _get_min_freq_khz(self, device, ignore_missing=False):
		if ignore_missing and not os.path.isdir(SYSFS_DIR):
			return None

		try:
			min_freq_khz = self._get(device, "min_freq_khz")
		except (OSError, IOError):
			log.error("failed to read uncore frequency values")
			return None

		log.debug("%s: get min_freq_khz %d" % (device, min_freq_khz))
		return min_freq_khz
site-packages/tuned/plugins/plugin_service.py000064400000024741147511334670015477 0ustar00from . import base
import collections
import tuned.consts as consts
from .decorators import *
import os
import re
import tuned.logs
from tuned.utils.commands import commands

log = tuned.logs.get()
cmd = commands()

class Service():
	def __init__(self, start = None, enable = None, cfg_file = None, runlevel = None):
		self.enable = enable
		self.start = start
		self.cfg_file = cfg_file
		self.runlevel = runlevel

class InitHandler():
	def runlevel_get(self):
		(retcode, out) = cmd.execute(["runlevel"])
		return out.split()[-1] if retcode == 0 else None

	def daemon_reload(self):
		cmd.execute(["telinit", "q"])

	def cfg_install(self, name, cfg_file):
		pass

	def cfg_uninstall(self, name, cfg_file):
		pass

	def cfg_verify(self, name, cfg_file):
		return None

# no enable/disable
class SysVBasicHandler(InitHandler):
	def start(self, name):
		cmd.execute(["service", name, "start"])

	def stop(self, name):
		cmd.execute(["service", name, "stop"])

	def enable(self, name, runlevel):
		raise NotImplementedError()

	def disable(self, name, runlevel):
		raise NotImplementedError()

	def is_running(self, name):
		(retcode, out) = cmd.execute(["service", name, "status"], no_errors = [0])
		return retcode == 0

	def is_enabled(self, name, runlevel):
		raise NotImplementedError()

class SysVHandler(SysVBasicHandler):
	def enable(self, name, runlevel):
		cmd.execute(["chkconfig", "--level", runlevel, name, "on"])

	def disable(self, name, runlevel):
		cmd.execute(["chkconfig", "--level", runlevel, name, "off"])

	def is_enabled(self, name, runlevel):
		(retcode, out) = cmd.execute(["chkconfig", "--list", name])
		return out.split("%s:" % str(runlevel))[1][:2] == "on" if retcode == 0 else None

class SysVRCHandler(SysVBasicHandler):
	def enable(self, name, runlevel):
		cmd.execute(["sysv-rc-conf", "--level", runlevel, name, "on"])

	def disable(self, name, runlevel):
		cmd.execute(["sysv-rc-conf", "--level", runlevel, name, "off"])

	def is_enabled(self, name, runlevel):
		(retcode, out) = cmd.execute(["sysv-rc-conf", "--list", name])
		return out.split("%s:" % str(runlevel))[1][:2] == "on" if retcode == 0 else None

class OpenRCHandler(InitHandler):
	def runlevel_get(self):
		(retcode, out) = cmd.execute(["rc-status", "-r"])
		return out.strip() if retcode == 0 else None

	def start(self, name):
		cmd.execute(["rc-service", name, "start"])

	def stop(self, name):
		cmd.execute(["rc-service", name, "stop"])

	def enable(self, name, runlevel):
		cmd.execute(["rc-update", "add", name, runlevel])

	def disable(self, name, runlevel):
		cmd.execute(["rc-update", "del", name, runlevel])

	def is_running(self, name):
		(retcode, out) = cmd.execute(["rc-service", name, "status"], no_errors = [0])
		return retcode == 0

	def is_enabled(self, name, runlevel):
		(retcode, out) = cmd.execute(["rc-update", "show", runlevel])
		return bool(re.search(r"\b" + re.escape(name) + r"\b", out))

class SystemdHandler(InitHandler):
	# runlevel not used
	def runlevel_get(self):
		return ""

	def start(self, name):
		cmd.execute(["systemctl", "restart", name])

	def stop(self, name):
		cmd.execute(["systemctl", "stop", name])

	def enable(self, name, runlevel):
		cmd.execute(["systemctl", "enable", name])

	def disable(self, name, runlevel):
		cmd.execute(["systemctl", "disable", name])

	def is_running(self, name):
		(retcode, out) = cmd.execute(["systemctl", "is-active", name], no_errors = [0])
		return retcode == 0

	def is_enabled(self, name, runlevel):
		(retcode, out) = cmd.execute(["systemctl", "is-enabled", name], no_errors = [0])
		status = out.strip()
		return True if status == "enabled" else False if status =="disabled" else None

	def cfg_install(self, name, cfg_file):
		log.info("installing service configuration overlay file '%s' for service '%s'" % (cfg_file, name))
		if not os.path.exists(cfg_file):
			log.error("Unable to find service configuration '%s'" % cfg_file)
			return
		dirpath = consts.SERVICE_SYSTEMD_CFG_PATH % name
		try:
			os.makedirs(dirpath, consts.DEF_SERVICE_CFG_DIR_MODE)
		except OSError as e:
			log.error("Unable to create directory '%s': %s" % (dirpath, e))
			return
		cmd.copy(cfg_file, dirpath)
		self.daemon_reload()

	def cfg_uninstall(self, name, cfg_file):
		log.info("uninstalling service configuration overlay file '%s' for service '%s'" % (cfg_file, name))
		dirpath = consts.SERVICE_SYSTEMD_CFG_PATH % name
		path = "%s/%s" % (dirpath, os.path.basename(cfg_file))
		cmd.unlink(path)
		self.daemon_reload()
		# remove the service dir if empty, do not check for errors
		try:
			os.rmdir(dirpath)
		except OSError:
			pass

	def cfg_verify(self, name, cfg_file):
		if cfg_file is None:
			return None
		path = "%s/%s" % (consts.SERVICE_SYSTEMD_CFG_PATH % name, os.path.basename(cfg_file))
		if not os.path.exists(cfg_file):
			log.error("Unable to find service '%s' configuration '%s'" % (name, cfg_file))
			return False
		if not os.path.exists(path):
			log.error("Service '%s' configuration not installed in '%s'" % (name, path))
			return False
		sha256sum1 = cmd.sha256sum(cfg_file)
		sha256sum2 = cmd.sha256sum(path)
		return sha256sum1 == sha256sum2

class ServicePlugin(base.Plugin):
	"""
	`service`::
	
	Plug-in for handling sysvinit, sysv-rc, openrc and systemd services.
	+
	The syntax is as follows:
	+
	[subs="+quotes,+macros"]
	----
	[service]
	service.__service_name__=__commands__[,file:__file__]
	----
	+
	Supported service-handling `_commands_` are `start`, `stop`, `enable`
	and `disable`. The optional `file:__file__` directive installs an overlay
	configuration file `__file__`. Multiple commands must be comma (`,`)
	or semicolon (`;`) separated. If the directives conflict, the last
	one is used.
	+
	The service plugin supports configuration overlays only for systemd.
	In other init systems, this directive is ignored. The configuration
	overlay files are copied to `/etc/systemd/system/__service_name__.service.d/`
	directories. Upon profile unloading, the directory is removed if it is empty.
	+
	With systemd, the `start` command is implemented by `restart` in order
	to allow loading of the service configuration file overlay.
	+
	NOTE: With non-systemd init systems, the plug-in operates on the
	current runlevel only.
	+
	.Start and enable the `sendmail` service with an overlay file
	====
	----
	[service]
	service.sendmail=start,enable,file:${i:PROFILE_DIR}/tuned-sendmail.conf
	----
	The internal variable `${i:PROFILE_DIR}` points to the directory
	from which the profile is loaded.
	====
	"""

	def __init__(self, *args, **kwargs):
		super(ServicePlugin, self).__init__(*args, **kwargs)
		self._has_dynamic_options = True
		self._init_handler = self._detect_init_system()

	def _check_cmd(self, command):
		(retcode, out) = cmd.execute(command, no_errors = [0])
		return retcode == 0

	def _detect_init_system(self):
		if self._check_cmd(["systemctl", "status"]):
			log.debug("detected systemd")
			return SystemdHandler()
		elif self._check_cmd(["chkconfig"]):
			log.debug("detected generic sysvinit")
			return SysVHandler()
		elif self._check_cmd(["update-rc.d", "-h"]):
			log.debug("detected sysv-rc")
			return SysVRCHandler()
		elif self._check_cmd(["rc-update", "-h"]):
			log.debug("detected openrc")
			return OpenRCHandler()
		else:
			raise exceptions.NotSupportedPluginException("Unable to detect your init system, disabling the plugin.")

	def _parse_service_options(self, name,  val):
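		# Illustrative example: val = "start,enable,file:/etc/foo.conf" (hypothetical
		# path) yields a Service with start=True, enable=True and
		# cfg_file="/etc/foo.conf"; conflicting directives are resolved last-wins.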
		l = re.split(r"\s*[,;]\s*", val)
		service = Service()
		for i in l:
			if i == "enable":
				service.enable = True
			elif i == "disable":
				service.enable = False
			elif i == "start":
				service.start = True
			elif i == "stop":
				service.start = False
			elif i[:5] == "file:":
				service.cfg_file = i[5:]
			else:
				log.error("service '%s': invalid service option: '%s'" % (name, i))
		return service

	def _instance_init(self, instance):
		instance._has_dynamic_tuning = False
		instance._has_static_tuning = True

		self._services = collections.OrderedDict([(option[8:], self._parse_service_options(option[8:], 
			self._variables.expand(value))) for option, value in instance.options.items()
			if option[:8] == "service." and len(option) > 8])
		instance._services_original = {}

	def _instance_cleanup(self, instance):
		pass

	def _process_service(self, name, start, enable, runlevel):
		if start:
			self._init_handler.start(name)
		elif start is not None:
			self._init_handler.stop(name)
		if enable:
			self._init_handler.enable(name, runlevel)
		elif enable is not None:
			self._init_handler.disable(name, runlevel)

	def _instance_apply_static(self, instance):
		runlevel = self._init_handler.runlevel_get()
		if runlevel is None:
			log.error("Cannot detect runlevel")
			return
		
		for service in self._services.items():
			is_enabled = self._init_handler.is_enabled(service[0], runlevel)
			is_running = self._init_handler.is_running(service[0])
			instance._services_original[service[0]] = Service(is_running, is_enabled, service[1].cfg_file, runlevel)
			if service[1].cfg_file:
				self._init_handler.cfg_install(service[0], service[1].cfg_file)
			self._process_service(service[0], service[1].start, service[1].enable, runlevel)

	def _instance_verify_static(self, instance, ignore_missing, devices):
		runlevel = self._init_handler.runlevel_get()
		if runlevel is None:
			log.error(consts.STR_VERIFY_PROFILE_FAIL % "cannot detect runlevel")
			return False

		ret = True
		for service in self._services.items():
			ret_cfg_verify = self._init_handler.cfg_verify(service[0], service[1].cfg_file)
			if ret_cfg_verify:
				log.info(consts.STR_VERIFY_PROFILE_OK % "service '%s' configuration '%s' matches" % (service[0], service[1].cfg_file))
			elif ret_cfg_verify is not None:
				log.error(consts.STR_VERIFY_PROFILE_FAIL % "service '%s' configuration '%s' differs" % (service[0], service[1].cfg_file))
				ret = False
			else:
				log.info(consts.STR_VERIFY_PROFILE_VALUE_MISSING % "service '%s' configuration '%s'" % (service[0], service[1].cfg_file))
			is_enabled = self._init_handler.is_enabled(service[0], runlevel)
			is_running = self._init_handler.is_running(service[0])
			if self._verify_value("%s running" % service[0], service[1].start, is_running, ignore_missing) is False:
				ret = False
			if self._verify_value("%s enabled" % service[0], service[1].enable, is_enabled, ignore_missing) is False:
				ret = False
		return ret

	def _instance_unapply_static(self, instance, rollback = consts.ROLLBACK_SOFT):
		for name, value in list(instance._services_original.items()):
			if value.cfg_file:
				self._init_handler.cfg_uninstall(name, value.cfg_file)
			self._process_service(name, value.start, value.enable, value.runlevel)
site-packages/tuned/plugins/plugin_eeepc_she.py
from . import base
from . import exceptions
import tuned.logs
from tuned.utils.commands import commands
import os

log = tuned.logs.get()

class EeePCSHEPlugin(base.Plugin):
	"""
	`eeepc_she`::
	
	Dynamically sets the front-side bus (FSB) speed according to the
	CPU load. This feature can be found on some netbooks and is also
	known as the Asus Super Hybrid Engine. If the CPU load is lower than or
	equal to the value specified by the [option]`load_threshold_powersave`
	option, the plug-in sets the FSB speed to the value specified by the
	[option]`she_powersave` option. If the CPU load is higher than or
	equal to the value specified by the [option]`load_threshold_normal`
	option, it sets the FSB speed to the value specified by the
	[option]`she_normal` option. Static tuning is not supported and the
	plug-in is transparently disabled if the hardware support for this
	feature is not detected.
	
	NOTE: For details about the FSB frequencies and corresponding values, see
	link:https://www.kernel.org/doc/Documentation/ABI/testing/sysfs-platform-eeepc-laptop[the kernel documentation].
	The provided defaults should work for most users.
	"""

	def __init__(self, *args, **kwargs):
		self._cmd = commands()
		self._control_file = "/sys/devices/platform/eeepc/cpufv"
		if not os.path.isfile(self._control_file):
			self._control_file = "/sys/devices/platform/eeepc-wmi/cpufv"
		if not os.path.isfile(self._control_file):
			raise exceptions.NotSupportedPluginException("Plugin is not supported on your hardware.")
		super(EeePCSHEPlugin, self).__init__(*args, **kwargs)

	@classmethod
	def _get_config_options(self):
		return {
			"load_threshold_normal"    : 0.6,
			"load_threshold_powersave" : 0.4,
			"she_powersave"            : 2,
			"she_normal"               : 1,
		}

	def _instance_init(self, instance):
		instance._has_static_tuning = False
		instance._has_dynamic_tuning = True
		instance._she_mode = None
		instance._load_monitor = self._monitors_repository.create("load", None)

	def _instance_cleanup(self, instance):
		if instance._load_monitor is not None:
			self._monitors_repository.delete(instance._load_monitor)
			instance._load_monitor = None

	def _instance_update_dynamic(self, instance, device):
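		# With the default thresholds (load_threshold_powersave=0.4,
		# load_threshold_normal=0.6) a load at or below 0.4 switches to powersave,
		# a load at or above 0.6 switches to normal, and the band in between
		# leaves the current mode unchanged (simple hysteresis).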
		load = instance._load_monitor.get_load()["system"]
		if load <= instance.options["load_threshold_powersave"]:
			self._set_she_mode(instance, "powersave")
		elif load >= instance.options["load_threshold_normal"]:
			self._set_she_mode(instance, "normal")

	def _instance_unapply_dynamic(self, instance, device):
		# FIXME: restore previous value
		self._set_she_mode(instance, "normal")

	def _set_she_mode(self, instance, new_mode):
		new_mode_numeric = int(instance.options["she_%s" % new_mode])
		if instance._she_mode != new_mode_numeric:
			log.info("new eeepc_she mode %s (%d) " % (new_mode, new_mode_numeric))
			self._cmd.write_to_file(self._control_file, "%s" % new_mode_numeric)
			instance._she_mode = new_mode_numeric
site-packages/tuned/plugins/plugin_disk.py
import errno
from . import hotplug
from .decorators import *
import tuned.logs
import tuned.consts as consts
from tuned.utils.commands import commands
import os
import re

log = tuned.logs.get()

class DiskPlugin(hotplug.Plugin):
	"""
	`disk`::
	
	Plug-in for tuning various block device options. This plug-in can also
	dynamically change the advanced power management and spindown timeout
	setting for a drive according to the current drive utilization. The
	dynamic tuning is controlled by the [option]`dynamic` and the global
	[option]`dynamic_tuning` option in `tuned-main.conf`.
	+
	The disk plug-in operates on all supported block devices unless a
	comma separated list of [option]`devices` is passed to it.
	+
	.Operate only on the sda block device
	====
	----
	[disk]
	# Comma separated list of devices, all devices if commented out.
	devices=sda
	----
	====
	+
	The [option]`elevator` option sets the Linux I/O scheduler.
	+
	.Use the bfq I/O scheduler on xvda block device
	====
	----
	[disk]
	device=xvda
	elevator=bfq
	----
	====
	+
	The [option]`scheduler_quantum` option only applies to the CFQ I/O
	scheduler. It defines the number of I/O requests that CFQ sends to
	one device at one time, essentially limiting queue depth. The default
	value is 8 requests. The device being used may support greater queue
	depth, but increasing the value of quantum will also increase latency,
	especially for large sequential write workloads.
	+
	The [option]`apm` option sets the Advanced Power Management feature
	on drives that support it. It corresponds to using the `-B` option of
	the `hdparm` utility. The [option]`spindown` option puts the drive
	into idle (low-power) mode, and also sets the standby (spindown)
	timeout for the drive. It corresponds to using `-S` option of the
	`hdparm` utility.
	+
	.Use a medium-aggressive power management setting with spindown
	====
	----
	[disk]
	apm=128
	spindown=6
	----
	====
	+
	The [option]`readahead` option controls how much extra data the
	operating system reads from disk when performing sequential
	I/O operations. Increasing the `readahead` value might improve
	performance in application environments where sequential reading of
	large files takes place. The default unit for readahead is KiB. This
	can be adjusted to sectors by specifying the suffix 's'. If the
	suffix is specified, there must be at least one space between the
	number and suffix (for example, `readahead=8192 s`).
	+
	.Set the `readahead` to 4MB unless already set to a higher value
	====
	----
	[disk]
	readahead=>4096
	----
	====
	The disk readahead value can be multiplied by the constant
	specified by the [option]`readahead_multiply` option.
	"""

	def __init__(self, *args, **kwargs):
		super(DiskPlugin, self).__init__(*args, **kwargs)

		self._power_levels = [254, 225, 195, 165, 145, 125, 105, 85, 70, 55, 30, 20]
		self._spindown_levels = [0, 250, 230, 210, 190, 170, 150, 130, 110, 90, 70, 60]
		self._levels = len(self._power_levels)
		self._level_steps = 6
		self._load_smallest = 0.01
		self._cmd = commands()

	def _init_devices(self):
		super(DiskPlugin, self)._init_devices()
		self._devices_supported = True
		self._use_hdparm = True
		self._free_devices = set()
		self._hdparm_apm_devices = set()
		for device in self._hardware_inventory.get_devices("block"):
			if self._device_is_supported(device):
				self._free_devices.add(device.sys_name)
				if self._use_hdparm and self._is_hdparm_apm_supported(device.sys_name):
					self._hdparm_apm_devices.add(device.sys_name)

		self._assigned_devices = set()

	def _get_device_objects(self, devices):
		return [self._hardware_inventory.get_device("block", x) for x in devices]

	def _is_hdparm_apm_supported(self, device):
		(rc, out, err_msg) = self._cmd.execute(["hdparm", "-C", "/dev/%s" % device], \
				no_errors = [errno.ENOENT], return_err=True)
		if rc == -errno.ENOENT:
			log.warn("hdparm command not found, ignoring for other devices")
			self._use_hdparm = False
			return False
		elif rc:
			log.info("Device '%s' not supported by hdparm" % device)
			log.debug("(rc: %s, msg: '%s')" % (rc, err_msg))
			return False
		elif "unknown" in out:
			log.info("Driver for device '%s' does not support apm command" % device)
			return False
		return True

	@classmethod
	def _device_is_supported(cls, device):
		return  device.device_type == "disk" and \
			device.attributes.get("removable", None) == b"0" and \
			(device.parent is None or \
					device.parent.subsystem in ["scsi", "virtio", "xen", "nvme"])

	def _hardware_events_init(self):
		self._hardware_inventory.subscribe(self, "block", self._hardware_events_callback)

	def _hardware_events_cleanup(self):
		self._hardware_inventory.unsubscribe(self)

	def _hardware_events_callback(self, event, device):
		if self._device_is_supported(device) or event == "remove":
			super(DiskPlugin, self)._hardware_events_callback(event, device)

	def _added_device_apply_tuning(self, instance, device_name):
		if instance._load_monitor is not None:
			instance._load_monitor.add_device(device_name)
		super(DiskPlugin, self)._added_device_apply_tuning(instance, device_name)

	def _removed_device_unapply_tuning(self, instance, device_name):
		if instance._load_monitor is not None:
			instance._load_monitor.remove_device(device_name)
		super(DiskPlugin, self)._removed_device_unapply_tuning(instance, device_name)

	@classmethod
	def _get_config_options(cls):
		return {
			"dynamic"            : True, # FIXME: do we want this default?
			"elevator"           : None,
			"apm"                : None,
			"spindown"           : None,
			"readahead"          : None,
			"readahead_multiply" : None,
			"scheduler_quantum"  : None,
		}

	@classmethod
	def _get_config_options_used_by_dynamic(cls):
		return [
			"apm",
			"spindown",
		]

	def _instance_init(self, instance):
		instance._has_static_tuning = True

		self._apm_errcnt = 0
		self._spindown_errcnt = 0

		if self._option_bool(instance.options["dynamic"]):
			instance._has_dynamic_tuning = True
			instance._load_monitor = \
					self._monitors_repository.create(
					"disk", instance.assigned_devices)
			instance._device_idle = {}
			instance._stats = {}
			instance._idle = {}
			instance._spindown_change_delayed = {}
		else:
			instance._has_dynamic_tuning = False
			instance._load_monitor = None

	def _instance_cleanup(self, instance):
		if instance._load_monitor is not None:
			self._monitors_repository.delete(instance._load_monitor)
			instance._load_monitor = None

	def _update_errcnt(self, rc, spindown):
		if spindown:
			s = "spindown"
			cnt = self._spindown_errcnt
		else:
			s = "apm"
			cnt = self._apm_errcnt
		if cnt >= consts.ERROR_THRESHOLD:
			return
		if rc == 0:
			cnt = 0
		elif rc == -errno.ENOENT:
			self._spindown_errcnt = self._apm_errcnt = consts.ERROR_THRESHOLD + 1
			log.warn("hdparm command not found, ignoring future set_apm / set_spindown commands")
			return
		else:
			cnt += 1
			if cnt == consts.ERROR_THRESHOLD:
				log.info("disabling set_%s command: too many consecutive errors" % s)
		if spindown:
			self._spindown_errcnt = cnt
		else:
			self._apm_errcnt = cnt

	def _change_spindown(self, instance, device, new_spindown_level):
		log.debug("changing spindown to %d" % new_spindown_level)
		(rc, out) = self._cmd.execute(["hdparm", "-S%d" % new_spindown_level, "/dev/%s" % device], no_errors = [errno.ENOENT])
		self._update_errcnt(rc, True)
		instance._spindown_change_delayed[device] = False

	def _drive_spinning(self, device):
		(rc, out) = self._cmd.execute(["hdparm", "-C", "/dev/%s" % device], no_errors = [errno.ENOENT])
		return not "standby" in out and not "sleeping" in out

	def _instance_update_dynamic(self, instance, device):
		if not device in self._hdparm_apm_devices:
			return
		load = instance._load_monitor.get_device_load(device)
		if load is None:
			return

		if not device in instance._stats:
			self._init_stats_and_idle(instance, device)

		self._update_stats(instance, device, load)
		self._update_idle(instance, device)

		stats = instance._stats[device]
		idle = instance._idle[device]

		# level change decision

		if idle["level"] + 1 < self._levels and idle["read"] >= self._level_steps and idle["write"] >= self._level_steps:
			level_change = 1
		elif idle["level"] > 0 and (idle["read"] == 0 or idle["write"] == 0):
			level_change = -1
		else:
			level_change = 0

		# change level if decided

		if level_change != 0:
			idle["level"] += level_change
			new_power_level = self._power_levels[idle["level"]]
			new_spindown_level = self._spindown_levels[idle["level"]]

			log.debug("tuning level changed to %d" % idle["level"])
			if self._spindown_errcnt < consts.ERROR_THRESHOLD:
				if not self._drive_spinning(device) and level_change > 0:
					log.debug("delaying spindown change to %d, drive has already spun down" % new_spindown_level)
					instance._spindown_change_delayed[device] = True
				else:
					self._change_spindown(instance, device, new_spindown_level)
			if self._apm_errcnt < consts.ERROR_THRESHOLD:
				log.debug("changing APM_level to %d" % new_power_level)
				(rc, out) = self._cmd.execute(["hdparm", "-B%d" % new_power_level, "/dev/%s" % device], no_errors = [errno.ENOENT])
				self._update_errcnt(rc, False)
		elif instance._spindown_change_delayed[device] and self._drive_spinning(device):
			new_spindown_level = self._spindown_levels[idle["level"]]
			self._change_spindown(instance, device, new_spindown_level)

		log.debug("%s load: read %0.2f, write %0.2f" % (device, stats["read"], stats["write"]))
		log.debug("%s idle: read %d, write %d, level %d" % (device, idle["read"], idle["write"], idle["level"]))

	def _init_stats_and_idle(self, instance, device):
		instance._stats[device] = { "new": 11 * [0], "old": 11 * [0], "max": 11 * [1] }
		instance._idle[device] = { "level": 0, "read": 0, "write": 0 }
		instance._spindown_change_delayed[device] = False

	def _update_stats(self, instance, device, new_load):
		instance._stats[device]["old"] = old_load = instance._stats[device]["new"]
		instance._stats[device]["new"] = new_load

		# load difference
		diff = [new_old[0] - new_old[1] for new_old in zip(new_load, old_load)]
		instance._stats[device]["diff"] = diff

		# adapt maximum expected load if the difference is higher
		old_max_load = instance._stats[device]["max"]
		max_load = [max(pair) for pair in zip(old_max_load, diff)]
		instance._stats[device]["max"] = max_load

		# read/write ratio
		instance._stats[device]["read"] =  float(diff[1]) / float(max_load[1])
		instance._stats[device]["write"] = float(diff[5]) / float(max_load[5])

	def _update_idle(self, instance, device):
		# increase counter if there is no load, otherwise reset the counter
		for operation in ["read", "write"]:
			if instance._stats[device][operation] < self._load_smallest:
				instance._idle[device][operation] += 1
			else:
				instance._idle[device][operation] = 0

	def _instance_apply_dynamic(self, instance, device):
		# At the moment, dynamic tuning is supported only for devices compatible with hdparm APM commands.
		# If dynamic functionality not tied to this command is added in the future,
		# this check will need to be changed.
		if device not in self._hdparm_apm_devices:
			log.info("There is no dynamic tuning available for device '%s' at time" % device)
		else:
			super(DiskPlugin, self)._instance_apply_dynamic(instance, device)

	def _instance_unapply_dynamic(self, instance, device):
		pass

	def _sysfs_path(self, device, suffix, prefix = "/sys/block/"):
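		# Device names containing '/' (e.g. 'cciss/c0d0') are assumed to appear in
		# sysfs with '!' instead ('/sys/block/cciss!c0d0/<suffix>'); if no such
		# path exists, fall back to the literal device name.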
		if "/" in device:
			dev = os.path.join(prefix, device.replace("/", "!"), suffix)
			if os.path.exists(dev):
				return dev
		return os.path.join(prefix, device, suffix)

	def _elevator_file(self, device):
		return self._sysfs_path(device, "queue/scheduler")

	@command_set("elevator", per_device=True)
	def _set_elevator(self, value, device, sim, remove):
		sys_file = self._elevator_file(device)
		if not sim:
			self._cmd.write_to_file(sys_file, value, \
				no_error = [errno.ENOENT] if remove else False)
		return value

	@command_get("elevator")
	def _get_elevator(self, device, ignore_missing=False):
		sys_file = self._elevator_file(device)
		# example of scheduler file content:
		# noop deadline [cfq]
		return self._cmd.get_active_option(self._cmd.read_file(sys_file, no_error=ignore_missing))

	@command_set("apm", per_device=True)
	def _set_apm(self, value, device, sim, remove):
		if device not in self._hdparm_apm_devices:
			if not sim:
				log.info("apm option is not supported for device '%s'" % device)
				return None
			else:
				return str(value)
		if self._apm_errcnt < consts.ERROR_THRESHOLD:
			if not sim:
				(rc, out) = self._cmd.execute(["hdparm", "-B", str(value), "/dev/" + device], no_errors = [errno.ENOENT])
				self._update_errcnt(rc, False)
			return str(value)
		else:
			return None

	@command_get("apm")
	def _get_apm(self, device, ignore_missing=False):
		if device not in self._hdparm_apm_devices:
			if not ignore_missing:
				log.info("apm option is not supported for device '%s'" % device)
			return None
		value = None
		err = False
		(rc, out) = self._cmd.execute(["hdparm", "-B", "/dev/" + device], no_errors = [errno.ENOENT])
		if rc == -errno.ENOENT:
			return None
		elif rc != 0:
			err = True
		else:
			m = re.match(r".*=\s*(\d+).*", out, re.S)
			if m:
				try:
					value = int(m.group(1))
				except ValueError:
					err = True
		if err:
			log.error("could not get current APM settings for device '%s'" % device)
		return value

	@command_set("spindown", per_device=True)
	def _set_spindown(self, value, device, sim, remove):
		if device not in self._hdparm_apm_devices:
			if not sim:
				log.info("spindown option is not supported for device '%s'" % device)
				return None
			else:
				return str(value)
		if self._spindown_errcnt < consts.ERROR_THRESHOLD:
			if not sim:
				(rc, out) = self._cmd.execute(["hdparm", "-S", str(value), "/dev/" + device], no_errors = [errno.ENOENT])
				self._update_errcnt(rc, True)
			return str(value)
		else:
			return None

	@command_get("spindown")
	def _get_spindown(self, device, ignore_missing=False):
		if device not in self._hdparm_apm_devices:
			if not ignore_missing:
				log.info("spindown option is not supported for device '%s'" % device)
			return None
		# There is no way to read the current/old spindown value, so the vendor-specific value 253 is hardcoded
		return 253

	def _readahead_file(self, device):
		return self._sysfs_path(device, "queue/read_ahead_kb")

	def _parse_ra(self, value):
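		# Illustrative conversions: "4096" -> 4096 (KiB), "8192 s" -> 4096
		# (sectors are assumed to be 512 bytes, so the value is halved to KiB);
		# anything non-numeric yields None.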
		val = str(value).split(None, 1)
		try:
			v = int(val[0])
		except ValueError:
			return None
		if len(val) > 1 and val[1][0] == "s":
			# v *= 512 / 1024
			v /= 2
		return v

	@command_set("readahead", per_device=True)
	def _set_readahead(self, value, device, sim, remove):
		sys_file = self._readahead_file(device)
		val = self._parse_ra(value)
		if val is None:
			log.error("Invalid readahead value '%s' for device '%s'" % (value, device))
		else:
			if not sim:
				self._cmd.write_to_file(sys_file, "%d" % val, \
					no_error = [errno.ENOENT] if remove else False)
		return val

	@command_get("readahead")
	def _get_readahead(self, device, ignore_missing=False):
		sys_file = self._readahead_file(device)
		value = self._cmd.read_file(sys_file, no_error=ignore_missing).strip()
		if len(value) == 0:
			return None
		return int(value)

	@command_custom("readahead_multiply", per_device=True)
	def _multiply_readahead(self, enabling, multiplier, device, verify, ignore_missing):
		if verify:
			return None
		storage_key = self._storage_key(
				command_name = "readahead_multiply",
				device_name = device)
		if enabling:
			old_readahead = self._get_readahead(device)
			if old_readahead is None:
				return
			new_readahead = int(float(multiplier) * old_readahead)
			self._storage.set(storage_key, old_readahead)
			self._set_readahead(new_readahead, device, False)
		else:
			old_readahead = self._storage.get(storage_key)
			if old_readahead is None:
				return
			self._set_readahead(old_readahead, device, False)
			self._storage.unset(storage_key)

	def _scheduler_quantum_file(self, device):
		return self._sysfs_path(device, "queue/iosched/quantum")

	@command_set("scheduler_quantum", per_device=True)
	def _set_scheduler_quantum(self, value, device, sim, remove):
		sys_file = self._scheduler_quantum_file(device)
		if not sim:
			self._cmd.write_to_file(sys_file, "%d" % int(value), \
				no_error = [errno.ENOENT] if remove else False)
		return value

	@command_get("scheduler_quantum")
	def _get_scheduler_quantum(self, device, ignore_missing=False):
		sys_file = self._scheduler_quantum_file(device)
		value = self._cmd.read_file(sys_file, no_error=ignore_missing).strip()
		if len(value) == 0:
			if not ignore_missing:
				log.info("disk_scheduler_quantum option is not supported for device '%s'" % device)
			return None
		return int(value)
site-packages/tuned/plugins/plugin_sysfs.py
from . import base
import glob
import re
import os.path
from .decorators import *
import tuned.logs
import tuned.consts as consts
from subprocess import *
from tuned.utils.commands import commands

log = tuned.logs.get()

class SysfsPlugin(base.Plugin):
	"""
	`sysfs`::
	
	Sets various `sysfs` settings specified by the plug-in options.
	+
	The syntax is `_name_=_value_`, where
	`_name_` is the `sysfs` path to use and `_value_` is
	the value to write. The `sysfs` path supports the shell-style
	wildcard characters (see `man 7 glob` for additional detail).
	+
	Use this plugin in case you need to change some settings that are
	not covered by other plug-ins. Prefer specific plug-ins if they
	cover the required settings.
	+
	.Ignore corrected errors and associated scans that cause latency spikes
	====
	----
	[sysfs]
	/sys/devices/system/machinecheck/machinecheck*/ignore_ce=1
	----
	====
	"""

	# TODO: resolve possible conflicts with sysctl settings from other plugins

	def __init__(self, *args, **kwargs):
		super(SysfsPlugin, self).__init__(*args, **kwargs)
		self._has_dynamic_options = True
		self._cmd = commands()

	def _instance_init(self, instance):
		instance._has_dynamic_tuning = False
		instance._has_static_tuning = True

		instance._sysfs = dict([(os.path.normpath(key_value[0]), key_value[1]) for key_value in list(instance.options.items())])
		instance._sysfs_original = {}

	def _instance_cleanup(self, instance):
		pass

	def _instance_apply_static(self, instance):
		for key, value in list(instance._sysfs.items()):
			v = self._variables.expand(value)
			for f in glob.iglob(key):
				if self._check_sysfs(f):
					instance._sysfs_original[f] = self._read_sysfs(f)
					self._write_sysfs(f, v)
				else:
					log.error("rejecting write to '%s' (not inside /sys)" % f)

	def _instance_verify_static(self, instance, ignore_missing, devices):
		ret = True
		for key, value in list(instance._sysfs.items()):
			v = self._variables.expand(value)
			for f in glob.iglob(key):
				if self._check_sysfs(f):
					curr_val = self._read_sysfs(f)
					if self._verify_value(f, v, curr_val, ignore_missing) == False:
						ret = False
		return ret

	def _instance_unapply_static(self, instance, rollback = consts.ROLLBACK_SOFT):
		for key, value in list(instance._sysfs_original.items()):
			self._write_sysfs(key, value)

	def _check_sysfs(self, sysfs_file):
		return re.match(r"^/sys/.*", sysfs_file)

	def _read_sysfs(self, sysfs_file):
		data = self._cmd.read_file(sysfs_file).strip()
		if len(data) > 0:
			return self._cmd.get_active_option(data, False)
		else:
			return None

	def _write_sysfs(self, sysfs_file, value):
		return self._cmd.write_to_file(sysfs_file, value)
site-packages/tuned/plugins/plugin_scheduler.py
# code for cores isolation was inspired by Tuna implementation
# perf code was borrowed from kernel/tools/perf/python/twatch.py
# thanks to Arnaldo Carvalho de Melo <acme@redhat.com>

from . import base
from .decorators import *
import tuned.logs
import re
from subprocess import *
import threading
import perf
import select
import tuned.consts as consts
import procfs
from tuned.utils.commands import commands
import errno
import os
import collections
import math
# Check existence of scheduler API in os module
try:
	os.SCHED_FIFO
except AttributeError:
	import schedutils

log = tuned.logs.get()

class SchedulerParams(object):
	def __init__(self, cmd, cmdline = None, scheduler = None,
			priority = None, affinity = None, cgroup = None):
		self._cmd = cmd
		self.cmdline = cmdline
		self.scheduler = scheduler
		self.priority = priority
		self.affinity = affinity
		self.cgroup = cgroup

	@property
	def affinity(self):
		if self._affinity is None:
			return None
		else:
			return self._cmd.bitmask2cpulist(self._affinity)

	@affinity.setter
	def affinity(self, value):
		if value is None:
			self._affinity = None
		else:
			self._affinity = self._cmd.cpulist2bitmask(value)

class IRQAffinities(object):
	def __init__(self):
		self.irqs = {}
		self.default = None
		# IRQs that don't support changing CPU affinity:
		self.unchangeable = []

class SchedulerUtils(object):
	"""
	Class encapsulating scheduler implementation in os module
	"""

	_dict_schedcfg2schedconst = {
		"f": "SCHED_FIFO",
		"b": "SCHED_BATCH",
		"r": "SCHED_RR",
		"o": "SCHED_OTHER",
		"i": "SCHED_IDLE",
	}

	def __init__(self):
		# {"f": os.SCHED_FIFO...}
		self._dict_schedcfg2num = dict((k, getattr(os, name)) for k, name in self._dict_schedcfg2schedconst.items())
		# { os.SCHED_FIFO: "SCHED_FIFO"... }
		self._dict_num2schedconst = dict((getattr(os, name), name) for name in self._dict_schedcfg2schedconst.values())

	def sched_cfg_to_num(self, str_scheduler):
		return self._dict_schedcfg2num.get(str_scheduler)

	# Reimplementation of schedstr from schedutils for logging purposes
	def sched_num_to_const(self, scheduler):
		return self._dict_num2schedconst.get(scheduler)

	def get_scheduler(self, pid):
		return os.sched_getscheduler(pid)

	def set_scheduler(self, pid, sched, prio):
		os.sched_setscheduler(pid, sched, os.sched_param(prio))

	def get_affinity(self, pid):
		return os.sched_getaffinity(pid)

	def set_affinity(self, pid, affinity):
		os.sched_setaffinity(pid, affinity)

	def get_priority(self, pid):
		return os.sched_getparam(pid).sched_priority

	def get_priority_min(self, sched):
		return os.sched_get_priority_min(sched)

	def get_priority_max(self, sched):
		return os.sched_get_priority_max(sched)

class SchedulerUtilsSchedutils(SchedulerUtils):
	"""
	Class encapsulating scheduler implementation in schedutils module
	"""
	def __init__(self):
		# { "f": schedutils.SCHED_FIFO... }
		self._dict_schedcfg2num = dict((k, getattr(schedutils, name)) for k, name in self._dict_schedcfg2schedconst.items())
		# { schedutils.SCHED_FIFO: "SCHED_FIFO"... }
		self._dict_num2schedconst = dict((getattr(schedutils, name), name) for name in self._dict_schedcfg2schedconst.values())

	def get_scheduler(self, pid):
		return schedutils.get_scheduler(pid)

	def set_scheduler(self, pid, sched, prio):
		schedutils.set_scheduler(pid, sched, prio)

	def get_affinity(self, pid):
		return schedutils.get_affinity(pid)

	def set_affinity(self, pid, affinity):
		schedutils.set_affinity(pid, affinity)

	def get_priority(self, pid):
		return schedutils.get_priority(pid)

	def get_priority_min(self, sched):
		return schedutils.get_priority_min(sched)

	def get_priority_max(self, sched):
		return schedutils.get_priority_max(sched)

class SchedulerPlugin(base.Plugin):
	r"""
	`scheduler`::
	
	Allows tuning of scheduling priorities, process/thread/IRQ
	affinities, and CPU isolation.
	+
	To prevent processes/threads/IRQs from using certain CPUs, use
	the [option]`isolated_cores` option. It changes process/thread
	affinities, IRQs affinities and it sets `default_smp_affinity`
	for IRQs. The CPU affinity mask is adjusted for all processes and
	threads matching [option]`ps_whitelist` option subject to success
	of the `sched_setaffinity()` system call. The default setting of
	the [option]`ps_whitelist` regular expression is `.*` to match all
	processes and thread names. To exclude certain processes and threads
	use [option]`ps_blacklist` option. The value of this option is also
	interpreted as a regular expression and process/thread names (`ps -eo
	cmd`) are matched against that expression. Profile rollback allows
	all matching processes and threads to run on all CPUs and restores
	the IRQ settings prior to the profile application.
	+
	Multiple regular expressions for [option]`ps_whitelist`
	and [option]`ps_blacklist` options are allowed and separated by
	`;`. Quoted semicolon `\;` is taken literally.
	+
	.Isolate CPUs 2-4
	====
	----
	[scheduler]
	isolated_cores=2-4
	ps_blacklist=.*pmd.*;.*PMD.*;^DPDK;.*qemu-kvm.*
	----
	Isolate CPUs 2-4 while ignoring processes and threads matching
	`ps_blacklist` regular expressions.
	====
	The [option]`irq_process` option controls whether the scheduler plugin
	applies the `isolated_cores` parameter to IRQ affinities. The default
	value is `true`, which means that the scheduler plugin will move all
	possible IRQs away from the isolated cores. When `irq_process` is set
	to `false`, the plugin will not change any IRQ affinities.
	+
	The [option]`default_irq_smp_affinity` option controls the values
	*TuneD* writes to `/proc/irq/default_smp_affinity`. The file specifies
	default affinity mask that applies to all non-active IRQs. Once an
	IRQ is allocated/activated its affinity bitmask will be set to the
	default mask.
	+
	The following values are supported:
	+
	--
	`calc`::
	Content of `/proc/irq/default_smp_affinity` will be calculated
	from the `isolated_cores` parameter. Non-isolated cores
	are calculated as an inversion of the `isolated_cores`. Then
	the intersection of the non-isolated cores and the previous
	content of `/proc/irq/default_smp_affinity` is written to
	`/proc/irq/default_smp_affinity`. If the intersection is
	an empty set, then just the non-isolated cores are written to
	`/proc/irq/default_smp_affinity`. This behavior is the default if
	the parameter `default_irq_smp_affinity` is omitted.
	`ignore`::
	*TuneD* will not touch `/proc/irq/default_smp_affinity`.
	explicit cpulist::
	The cpulist (such as 1,3-4) is unpacked and written directly to
	`/proc/irq/default_smp_affinity`.
	--
	+
	.An explicit CPU list to set the default IRQ smp affinity to CPUs 0 and 2
	====
	----
	[scheduler]
	isolated_cores=1,3
	default_irq_smp_affinity=0,2
	----
	====
	To adjust scheduling policy, priority and affinity for a group of
	processes/threads, use the following syntax.
	+
	[subs="+quotes,+macros"]
	----
	group.__groupname__=__rule_prio__:__sched__:__prio__:__affinity__:__regex__
	----
	+
	where `__rule_prio__` defines the internal *TuneD* priority of the
	rule. Rules are sorted based on priority. This is needed for
	inheritance to be able to reorder previously defined rules. Equal
	`__rule_prio__` rules should be processed in the order they were
	defined. However, this is Python interpreter dependent. To disable
	an inherited rule for `__groupname__` use:
	+
	[subs="+quotes,+macros"]
	----
	group.__groupname__=
	----
	+
	`__sched__` must be one of:
	*`f`* for FIFO,
	*`b`* for batch,
	*`r`* for round robin,
	*`o`* for other,
	*`*`* do not change.
	+
	`__affinity__` is CPU affinity in hexadecimal. Use `*` for no change.
	+
	`__prio__` scheduling priority (see `chrt -m`).
	+
	`__regex__` is Python regular expression. It is matched against the output of
	+
	[subs="+quotes,+macros"]
	----
	ps -eo cmd
	----
	+
	Any given process name may match more than one group. In such a case,
	the priority and scheduling policy are taken from the last matching
	`__regex__`.
	+
	.Setting scheduling policy and priorities to kernel threads and watchdog
	====
	----
	[scheduler]
	group.kthreads=0:*:1:*:\[.*\]$
	group.watchdog=0:f:99:*:\[watchdog.*\]
	----
	====
	+
	The scheduler plug-in uses perf event loop to catch newly created
	processes. By default it listens to `perf.RECORD_COMM` and
	`perf.RECORD_EXIT` events. By setting [option]`perf_process_fork`
	option to `true`, `perf.RECORD_FORK` events will also be listened
	to. In other words, child processes created by the `fork()` system
	call will be processed. Since child processes inherit CPU affinity
	from their parents, the scheduler plug-in usually does not need to
	explicitly process these events. As processing perf events can
	pose a significant CPU overhead, the [option]`perf_process_fork`
	option parameter is set to `false` by default. Due to this, child
	processes are not processed by the scheduler plug-in.
	+
	The CPU overhead of the scheduler plugin can be mitigated by using
	the scheduler [option]`runtime` option and setting it to `0`. This
	will completely disable the dynamic scheduler functionality and the
	perf events will not be monitored and acted upon. The disadvantage
	of this approach is that the process/thread tuning will be done only at
	profile application.
	+
	.Disabling the scheduler dynamic functionality
	====
	----
	[scheduler]
	runtime=0
	isolated_cores=1,3
	----
	====
	+
	NOTE: For perf events, a memory-mapped buffer is used. Under heavy load
	the buffer may overflow. In such cases the `scheduler` plug-in
	may start missing events and failing to process some newly created
	processes. Increasing the buffer size may help. The buffer size can
	be set with the [option]`perf_mmap_pages` option. The value of this
	parameter has to expressed in powers of 2. If it is not the power
	of 2, the nearest higher power of 2 value is calculated from it
	and this calculated value used. If the [option]`perf_mmap_pages`
	option is omitted, the default kernel value is used.
	+
	The scheduler plug-in supports process/thread confinement using
	cgroups v1.
	+
	[option]`cgroup_mount_point` option specifies the path to mount the
	cgroup filesystem or where *TuneD* expects it to be mounted. If unset,
	`/sys/fs/cgroup/cpuset` is expected.
	+
	If [option]`cgroup_groups_init` option is set to `1` *TuneD*
	will create (and remove) all cgroups defined with the `cgroup*`
	options. This is the default behavior. If it is set to `0` the
	cgroups need to be preset by other means.
	+
	If [option]`cgroup_mount_point_init` option is set to `1`,
	*TuneD* will create (and remove) the cgroup mountpoint. It implies
	`cgroup_groups_init = 1`. If set to `0` the cgroups mount point
	needs to be preset by other means. This is the default behavior.
	+
	The [option]`cgroup_for_isolated_cores` option is the cgroup
	name used for the [option]`isolated_cores` option functionality. For
	example, if a system has 4 CPUs, `isolated_cores=1` means that all
	processes/threads will be moved to CPUs 0,2-3.
	The scheduler plug-in will isolate the specified core by writing
	the calculated CPU affinity to the `cpuset.cpus` control file of
	the specified cgroup and move all the matching processes/threads to
	this group. If this option is unset, classic cpuset affinity using
	`sched_setaffinity()` will be used.
	+
	[option]`cgroup.__cgroup_name__` option defines affinities for
	arbitrary cgroups. Even hierarchic cgroups can be used, but the
	hierarchy needs to be specified in the correct order. Also *TuneD*
	does not do any sanity checks here, with the exception that it forces
	the cgroup to be under [option]`cgroup_mount_point`.
	+
	The syntax of the scheduler option starting with `group.` has been
	augmented to use `cgroup.__cgroup_name__` instead of the hexadecimal
	`__affinity__`. The matching processes will be moved to the cgroup
	`__cgroup_name__`. It is also possible to use cgroups which have
	not been defined by the [option]`cgroup.` option as described above,
	i.e. cgroups not managed by *TuneD*.
	+
	All cgroup names are sanitized by replacing all dots (`.`) with
	slashes (`/`). This is to prevent the plug-in from writing outside
	[option]`cgroup_mount_point`.
	+
	.Using cgroups v1 with the scheduler plug-in
	====
	----
	[scheduler]
	cgroup_mount_point=/sys/fs/cgroup/cpuset
	cgroup_mount_point_init=1
	cgroup_groups_init=1
	cgroup_for_isolated_cores=group
	cgroup.group1=2
	cgroup.group2=0,2
	
	group.ksoftirqd=0:f:2:cgroup.group1:ksoftirqd.*
	ps_blacklist=ksoftirqd.*;rcuc.*;rcub.*;ktimersoftd.*
	isolated_cores=1
	----
	Cgroup `group1` has the affinity set to CPU 2 and the cgroup `group2`
	to CPUs 0,2. Given a 4 CPU setup, the [option]`isolated_cores=1`
	option causes all processes/threads to be moved to CPU
	cores 0,2-3. Processes/threads that are blacklisted by the
	[option]`ps_blacklist` regular expression will not be moved.
	
	The scheduler plug-in will isolate the specified core by writing the
	CPU affinity 0,2-3 to the `cpuset.cpus` control file of the `group`
	and move all the matching processes/threads to this cgroup.
	====
	Option [option]`cgroup_ps_blacklist` allows excluding processes
	which belong to the blacklisted cgroups. The regular expression specified
	by this option is matched against cgroup hierarchies from
	`/proc/PID/cgroups`. Cgroups v1 hierarchies from `/proc/PID/cgroups`
	are separated by commas ',' prior to regular expression matching. The
	following is an example of content against which the regular expression
	is matched against: `10:hugetlb:/,9:perf_event:/,8:blkio:/`
	+
	Multiple regular expressions can be separated by semicolon ';'. The
	semicolon represents a logical 'or' operator.
	+
	.Cgroup-based exclusion of processes from the scheduler
	====
	----
	[scheduler]
	isolated_cores=1
	cgroup_ps_blacklist=:/daemons\b
	----
	
	The scheduler plug-in will move all processes away from core 1 except processes which
	belong to cgroup '/daemons'. The '\b' is a regular expression
	metacharacter that matches a word boundary.
	
	----
	[scheduler]
	isolated_cores=1
	cgroup_ps_blacklist=\b8:blkio:
	----
	
	The scheduler plug-in will exclude all processes which belong to a cgroup
	with hierarchy-ID 8 and controller-list blkio.
	====
	Recent kernels moved some `sched_` and `numa_balancing_` kernel run-time
	parameters from `/proc/sys/kernel`, managed by the `sysctl` utility, to
	`debugfs`, typically mounted under `/sys/kernel/debug`.  TuneD provides an
	abstraction mechanism for the following parameters via the scheduler plug-in:
	[option]`sched_min_granularity_ns`, [option]`sched_latency_ns`,
	[option]`sched_wakeup_granularity_ns`, [option]`sched_tunable_scaling`,
	[option]`sched_migration_cost_ns`, [option]`sched_nr_migrate`,
	[option]`numa_balancing_scan_delay_ms`,
	[option]`numa_balancing_scan_period_min_ms`,
	[option]`numa_balancing_scan_period_max_ms` and
	[option]`numa_balancing_scan_size_mb`.
	Based on the kernel used, TuneD will write the specified value to the correct
	location.
	+
	.Set tasks' "cache hot" value for migration decisions.
	====
	----
	[scheduler]
	sched_migration_cost_ns=500000
	----
	On the old kernels, this is equivalent to:
	----
	[sysctl]
	kernel.sched_migration_cost_ns=500000
	----
	that is, value `500000` will be written to `/proc/sys/kernel/sched_migration_cost_ns`.
	However, on more recent kernels, the value `500000` will be written to
	`/sys/kernel/debug/sched/migration_cost_ns`.
	====
	"""

	def __init__(self, monitor_repository, storage_factory, hardware_inventory, device_matcher, device_matcher_udev, plugin_instance_factory, global_cfg, variables):
		super(SchedulerPlugin, self).__init__(monitor_repository, storage_factory, hardware_inventory, device_matcher, device_matcher_udev, plugin_instance_factory, global_cfg, variables)
		self._has_dynamic_options = True
		self._daemon = consts.CFG_DEF_DAEMON
		self._sleep_interval = int(consts.CFG_DEF_SLEEP_INTERVAL)
		if global_cfg is not None:
			self._daemon = global_cfg.get_bool(consts.CFG_DAEMON, consts.CFG_DEF_DAEMON)
			self._sleep_interval = int(global_cfg.get(consts.CFG_SLEEP_INTERVAL, consts.CFG_DEF_SLEEP_INTERVAL))
		self._cmd = commands()
		# helper variable utilized for showing hint only once that the error may be caused by Secure Boot
		self._secure_boot_hint = None
		# paths cache for sched_ and numa_ tunings
		self._sched_knob_paths_cache = {}
		# default is to whitelist all and blacklist none
		self._ps_whitelist = ".*"
		self._ps_blacklist = ""
		self._cgroup_ps_blacklist_re = ""
		self._cpus = perf.cpu_map()
		self._scheduler_storage_key = self._storage_key(
				command_name = "scheduler")
		self._irq_process = True
		self._irq_storage_key = self._storage_key(
				command_name = "irq")
		self._evlist = None
		try:
			self._scheduler_utils = SchedulerUtils()
		except AttributeError:
			self._scheduler_utils = SchedulerUtilsSchedutils()

	def _calc_mmap_pages(self, mmap_pages):
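		# Illustrative values: None -> None (use the kernel default), "abc" -> 0
		# (invalid), 3 -> 4, 128 -> 128; positive values are rounded up to the
		# nearest power of two.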
		if mmap_pages is None:
			return None
		try:
			mp = int(mmap_pages)
		except ValueError:
			return 0
		if mp <= 0:
			return 0
		# round up to the nearest power of two value
		return int(2 ** math.ceil(math.log(mp, 2)))

	def _instance_init(self, instance):
		instance._evlist = None
		instance._has_dynamic_tuning = False
		instance._has_static_tuning = True
		# this is a hack; runtime_tuning should be covered by the dynamic_tuning configuration
		# TODO: add per plugin dynamic tuning configuration and use dynamic_tuning configuration
		# instead of runtime_tuning
		instance._runtime_tuning = True

		# FIXME: do we want to do this here?
		# recover original values in case of crash
		self._scheduler_original = self._storage.get(
				self._scheduler_storage_key, {})
		if len(self._scheduler_original) > 0:
			log.info("recovering scheduling settings from previous run")
			self._restore_ps_affinity()
			self._scheduler_original = {}
			self._storage.unset(self._scheduler_storage_key)

		self._cgroups_original_affinity = dict()

		# calculated by isolated_cores setter
		self._affinity = None

		self._cgroup_affinity_initialized = False
		self._cgroup = None
		self._cgroups = collections.OrderedDict([(self._sanitize_cgroup_path(option[7:]), self._variables.expand(affinity))
			for option, affinity in instance.options.items() if option[:7] == "cgroup." and len(option) > 7])

		instance._scheduler = instance.options

		perf_mmap_pages_raw = self._variables.expand(instance.options["perf_mmap_pages"])
		perf_mmap_pages = self._calc_mmap_pages(perf_mmap_pages_raw)
		if perf_mmap_pages == 0:
			log.error("Invalid 'perf_mmap_pages' value specified: '%s', using default kernel value" % perf_mmap_pages_raw)
			perf_mmap_pages = None
		if perf_mmap_pages is not None and str(perf_mmap_pages) != perf_mmap_pages_raw:
			log.info("'perf_mmap_pages' value has to be power of two, specified: '%s', using: '%d'" %
				(perf_mmap_pages_raw, perf_mmap_pages))
		for k in instance._scheduler:
			instance._scheduler[k] = self._variables.expand(instance._scheduler[k])
		if self._cmd.get_bool(instance._scheduler.get("runtime", 1)) == "0":
			instance._runtime_tuning = False
		instance._terminate = threading.Event()
		if self._daemon and instance._runtime_tuning:
			try:
				instance._threads = perf.thread_map()
				evsel = perf.evsel(type = perf.TYPE_SOFTWARE,
					config = perf.COUNT_SW_DUMMY,
					task = 1, comm = 1, mmap = 0, freq = 0,
					wakeup_events = 1, watermark = 1,
					sample_type = perf.SAMPLE_TID | perf.SAMPLE_CPU)
				evsel.open(cpus = self._cpus, threads = instance._threads)
				instance._evlist = perf.evlist(self._cpus, instance._threads)
				instance._evlist.add(evsel)
				if perf_mmap_pages is None:
					instance._evlist.mmap()
				else:
					instance._evlist.mmap(pages = perf_mmap_pages)
			# no perf
			except:
				instance._runtime_tuning = False

	def _instance_cleanup(self, instance):
		if instance._evlist:
			for fd in instance._evlist.get_pollfd():
				os.close(fd.name)

	@classmethod
	def _get_config_options(cls):
		return {
			"isolated_cores": None,
			"cgroup_mount_point": consts.DEF_CGROUP_MOUNT_POINT,
			"cgroup_mount_point_init": False,
			"cgroup_groups_init": True,
			"cgroup_for_isolated_cores": None,
			"cgroup_ps_blacklist": None,
			"ps_whitelist": None,
			"ps_blacklist": None,
			"irq_process": True,
			"default_irq_smp_affinity": "calc",
			"perf_mmap_pages": None,
			"perf_process_fork": "false",
			"sched_min_granularity_ns": None,
			"sched_latency_ns": None,
			"sched_wakeup_granularity_ns": None,
			"sched_tunable_scaling": None,
			"sched_migration_cost_ns": None,
			"sched_nr_migrate": None,
			"numa_balancing_scan_delay_ms": None,
			"numa_balancing_scan_period_min_ms": None,
			"numa_balancing_scan_period_max_ms": None,
			"numa_balancing_scan_size_mb": None
		}

	def _sanitize_cgroup_path(self, value):
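		# e.g. "group1.sub" -> "group1/sub"; dots in option suffixes are the only
		# way to express nesting, and the replacement keeps the resulting path
		# under cgroup_mount_point (see the class docstring).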
		return str(value).replace(".", "/") if value is not None else None

	# Raises OSError, IOError
	def _get_cmdline(self, process):
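		# Kernel threads are wrapped in brackets (e.g. "[ksoftirqd/0]") to mirror
		# 'ps -eo cmd' output, so profile group regexes like \[.*\]$ match them.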
		if not isinstance(process, procfs.process):
			pid = process
			process = procfs.process(pid)
		cmdline = procfs.process_cmdline(process)
		if self._is_kthread(process):
			cmdline = "[" + cmdline + "]"
		return cmdline

	# Raises OSError, IOError
	def get_processes(self):
		ps = procfs.pidstats()
		ps.reload_threads()
		processes = {}
		for proc in ps.values():
			try:
				cmd = self._get_cmdline(proc)
				pid = proc["pid"]
				processes[pid] = cmd
				if "threads" in proc:
					for pid in proc["threads"].keys():
						cmd = self._get_cmdline(proc)
						processes[pid] = cmd
			except (OSError, IOError) as e:
				if e.errno == errno.ENOENT \
						or e.errno == errno.ESRCH:
					continue
				else:
					raise
		return processes

	# Raises OSError
	# Raises SystemError with old (pre-0.4) python-schedutils
	# instead of OSError
	# If PID doesn't exist, errno == ESRCH
	def _get_rt(self, pid):
		scheduler = self._scheduler_utils.get_scheduler(pid)
		sched_str = self._scheduler_utils.sched_num_to_const(scheduler)
		priority = self._scheduler_utils.get_priority(pid)
		log.debug("Read scheduler policy '%s' and priority '%d' of PID '%d'"
				% (sched_str, priority, pid))
		return (scheduler, priority)

	def _set_rt(self, pid, sched, prio):
		sched_str = self._scheduler_utils.sched_num_to_const(sched)
		log.debug("Setting scheduler policy to '%s' and priority to '%d' of PID '%d'."
				% (sched_str, prio, pid))
		try:
			prio_min = self._scheduler_utils.get_priority_min(sched)
			prio_max = self._scheduler_utils.get_priority_max(sched)
			if prio < prio_min or prio > prio_max:
				log.error("Priority for %s must be in range %d - %d. '%d' was given."
						% (sched_str, prio_min,
						prio_max, prio))
		# Workaround for old (pre-0.4) python-schedutils which raised
		# SystemError instead of OSError
		except (SystemError, OSError) as e:
			log.error("Failed to get allowed priority range: %s"
					% e)
		try:
			self._scheduler_utils.set_scheduler(pid, sched, prio)
		except (SystemError, OSError) as e:
			if hasattr(e, "errno") and e.errno == errno.ESRCH:
				log.debug("Failed to set scheduling parameters of PID %d, the task vanished."
						% pid)
			else:
				log.error("Failed to set scheduling parameters of PID %d: %s"
						% (pid, e))

	# process is a procfs.process object
	# Raises OSError, IOError
	def _is_kthread(self, process):
		return process["stat"]["flags"] & procfs.pidstat.PF_KTHREAD != 0

	# Return codes:
	# 0 - Affinity is fixed
	# 1 - Affinity is changeable
	# -1 - Task vanished
	# -2 - Error
	def _affinity_changeable(self, pid):
		try:
			process = procfs.process(pid)
			if process["stat"].is_bound_to_cpu():
				if process["stat"]["state"] == "Z":
					log.debug("Affinity of zombie task with PID %d cannot be changed, the task's affinity mask is fixed."
							% pid)
				elif self._is_kthread(process):
					log.debug("Affinity of kernel thread with PID %d cannot be changed, the task's affinity mask is fixed."
							% pid)
				else:
					log.warn("Affinity of task with PID %d cannot be changed, the task's affinity mask is fixed."
							% pid)
				return 0
			else:
				return 1
		except (OSError, IOError) as e:
			if e.errno == errno.ENOENT or e.errno == errno.ESRCH:
				log.debug("Failed to get task info for PID %d, the task vanished."
						% pid)
				return -1
			else:
				log.error("Failed to get task info for PID %d: %s"
						% (pid, e))
				return -2
		except (AttributeError, KeyError) as e:
			log.error("Failed to get task info for PID %d: %s"
					% (pid, e))
			return -2

	def _store_orig_process_rt(self, pid, scheduler, priority):
		try:
			params = self._scheduler_original[pid]
		except KeyError:
			params = SchedulerParams(self._cmd)
			self._scheduler_original[pid] = params
		if params.scheduler is None and params.priority is None:
			params.scheduler = scheduler
			params.priority = priority

	def _tune_process_rt(self, pid, sched, prio):
		cont = True
		if sched is None and prio is None:
			return cont
		try:
			(prev_sched, prev_prio) = self._get_rt(pid)
			if sched is None:
				sched = prev_sched
			self._set_rt(pid, sched, prio)
			self._store_orig_process_rt(pid, prev_sched, prev_prio)
		except (SystemError, OSError) as e:
			if hasattr(e, "errno") and e.errno == errno.ESRCH:
				log.debug("Failed to read scheduler policy of PID %d, the task vanished."
						% pid)
				if pid in self._scheduler_original:
					del self._scheduler_original[pid]
				cont = False
			else:
				log.error("Refusing to set scheduler and priority of PID %d, reading original scheduling parameters failed: %s"
						% (pid, e))
		return cont

	def _is_cgroup_affinity(self, affinity):
		return str(affinity)[:7] == "cgroup."

	def _store_orig_process_affinity(self, pid, affinity, is_cgroup = False):
		try:
			params = self._scheduler_original[pid]
		except KeyError:
			params = SchedulerParams(self._cmd)
			self._scheduler_original[pid] = params
		if params.affinity is None and params.cgroup is None:
			if is_cgroup:
				params.cgroup = affinity
			else:
				params.affinity = affinity

	def _get_cgroup_affinity(self, pid):
		# we cannot use procfs, because it uses comma ',' delimiter which
		# can be ambiguous
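		# A /proc/<pid>/cgroup line is assumed to look like "7:cpuset:/group1";
		# splitting on ":cpuset:" and dropping the leading '/' yields "group1",
		# and an empty remainder means the root cgroup "/".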
		for l in self._cmd.read_file("%s/%s/%s" % (consts.PROCFS_MOUNT_POINT, str(pid), "cgroup"), no_error = True).split("\n"):
			try:
				cgroup = l.split(":cpuset:")[1][1:]
				return cgroup if cgroup != "" else "/"
			except IndexError:
				pass
		return "/"

	# it can be arbitrary cgroup even cgroup we didn't set, but it needs to be
	# under "cgroup_mount_point"
	def _set_cgroup(self, pid, cgroup):
		cgroup = self._sanitize_cgroup_path(cgroup)
		path = self._cgroup_mount_point
		if cgroup != "/":
			path = "%s/%s" % (path, cgroup)
		self._cmd.write_to_file("%s/tasks" % path, str(pid), no_error = True)

	def _parse_cgroup_affinity(self, cgroup):
		# "cgroup.CGROUP"
		cgroup = cgroup[7:]
		# this should be faster than string comparison
		is_cgroup = not isinstance(cgroup, list) and len(cgroup) > 0
		return is_cgroup, cgroup

	def _tune_process_affinity(self, pid, affinity, intersect = False):
		cont = True
		if affinity is None:
			return cont
		try:
			(is_cgroup, cgroup) = self._parse_cgroup_affinity(affinity)
			if is_cgroup:
				prev_affinity = self._get_cgroup_affinity(pid)
				self._set_cgroup(pid, cgroup)
			else:
				prev_affinity = self._get_affinity(pid)
				if intersect:
					affinity = self._get_intersect_affinity(
							prev_affinity, affinity,
							affinity)
				self._set_affinity(pid, affinity)
			self._store_orig_process_affinity(pid,
					prev_affinity, is_cgroup)
		except (SystemError, OSError) as e:
			if hasattr(e, "errno") and e.errno == errno.ESRCH:
				log.debug("Failed to read affinity of PID %d, the task vanished."
						% pid)
				if pid in self._scheduler_original:
					del self._scheduler_original[pid]
				cont = False
			else:
				log.error("Refusing to set CPU affinity of PID %d, reading original affinity failed: %s"
						% (pid, e))
		return cont

	# Tune the process and store the previous values for rollback.
	def _tune_process(self, pid, cmd, sched, prio, affinity):
		cont = self._tune_process_rt(pid, sched, prio)
		if not cont:
			return
		cont = self._tune_process_affinity(pid, affinity)
		if not cont or pid not in self._scheduler_original:
			return
		self._scheduler_original[pid].cmdline = cmd

	def _convert_sched_params(self, str_scheduler, str_priority):
		scheduler = self._scheduler_utils.sched_cfg_to_num(str_scheduler)
		if scheduler is None and str_scheduler != "*":
			log.error("Invalid scheduler: %s. Scheduler and priority will be ignored."
					% str_scheduler)
			return (None, None)
		else:
			try:
				priority = int(str_priority)
			except ValueError:
				log.error("Invalid priority: %s. Scheduler and priority will be ignored."
							% str_priority)
				return (None, None)
		return (scheduler, priority)

	def _convert_affinity(self, str_affinity):
		if str_affinity == "*":
			affinity = None
		elif self._is_cgroup_affinity(str_affinity):
			affinity = str_affinity
		else:
			affinity = self._cmd.hex2cpulist(str_affinity)
			if not affinity:
				log.error("Invalid affinity: %s. It will be ignored."
						% str_affinity)
				affinity = None
		return affinity

	def _convert_sched_cfg(self, vals):
		(rule_prio, scheduler, priority, affinity, regex) = vals
		(scheduler, priority) = self._convert_sched_params(
				scheduler, priority)
		affinity = self._convert_affinity(affinity)
		return (rule_prio, scheduler, priority, affinity, regex)

	def _cgroup_create_group(self, cgroup):
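		# Create the cpuset cgroup directory and seed its 'cpuset.mems'
		# from the root cpuset so that tasks can be attached to it.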
		path = "%s/%s" % (self._cgroup_mount_point, cgroup)
		try:
			os.mkdir(path, consts.DEF_CGROUP_MODE)
		except OSError as e:
			log.error("Unable to create cgroup '%s': %s" % (path, e))
		if (not self._cmd.write_to_file("%s/%s" % (path, "cpuset.mems"),
				self._cmd.read_file("%s/%s" % (self._cgroup_mount_point, "cpuset.mems"), no_error = True),
				no_error = True)):
					log.error("Unable to initialize 'cpuset.mems ' for cgroup '%s'" % path)

	def _cgroup_initialize_groups(self):
		if self._cgroup is not None and not self._cgroup in self._cgroups:
			self._cgroup_create_group(self._cgroup)
		for cg in self._cgroups:
			self._cgroup_create_group(cg)

	def _cgroup_initialize(self):
		log.debug("Initializing cgroups settings")
		try:
			os.makedirs(self._cgroup_mount_point, consts.DEF_CGROUP_MODE)
		except OSError as e:
			log.error("Unable to create cgroup mount point: %s" % e)
		(ret, out) = self._cmd.execute(["mount", "-t", "cgroup", "-o", "cpuset", "cpuset", self._cgroup_mount_point])
		if ret != 0:
			log.error("Unable to mount '%s'" % self._cgroup_mount_point)

	def _remove_dir(self, cgroup):
		try:
			os.rmdir(cgroup)
		except OSError as e:
			log.error("Unable to remove directory '%s': %s" % (cgroup, e))

	def _cgroup_finalize_groups(self):
		for cg in reversed(self._cgroups):
			self._remove_dir("%s/%s" % (self._cgroup_mount_point, cg))
		if self._cgroup is not None and not self._cgroup in self._cgroups:
			self._remove_dir("%s/%s" % (self._cgroup_mount_point, self._cgroup))

	def _cgroup_finalize(self):
		log.debug("Removing cgroups settings")
		(ret, out) = self._cmd.execute(["umount", self._cgroup_mount_point])
		if ret != 0:
			log.error("Unable to umount '%s'" % self._cgroup_mount_point)
			return False
		self._remove_dir(self._cgroup_mount_point)
		d = os.path.dirname(self._cgroup_mount_point)
		if (d != "/"):
			self._remove_dir(d)

	def _cgroup_set_affinity_one(self, cgroup, affinity, backup = False):
		if affinity != "":
			log.debug("Setting cgroup '%s' affinity to '%s'" % (cgroup, affinity))
		else:
			log.debug("Skipping cgroup '%s', empty affinity requested" % cgroup)
			return
		path = "%s/%s/%s" % (self._cgroup_mount_point, cgroup, "cpuset.cpus")
		if backup:
			orig_affinity = self._cmd.read_file(path, err_ret = "ERR", no_error = True).strip()
			if orig_affinity != "ERR":
				self._cgroups_original_affinity[cgroup] = orig_affinity
			else:
				log.error("Refusing to set affinity of cgroup '%s', reading original affinity failed" % cgroup)
				return
		if not self._cmd.write_to_file(path, affinity, no_error = True):
			log.error("Unable to set affinity '%s' for cgroup '%s'" % (affinity, cgroup))

	def _cgroup_set_affinity(self):
		if self._cgroup_affinity_initialized:
			return
		log.debug("Setting cgroups affinities")
		if self._affinity is not None and self._cgroup is not None and not self._cgroup in self._cgroups:
			self._cgroup_set_affinity_one(self._cgroup, self._affinity, backup = True)
		for cg in self._cgroups.items():
			self._cgroup_set_affinity_one(cg[0], cg[1], backup = True)
		self._cgroup_affinity_initialized = True

	def _cgroup_restore_affinity(self):
		log.debug("Restoring cgroups affinities")
		for cg in self._cgroups_original_affinity.items():
			self._cgroup_set_affinity_one(cg[0], cg[1])

	def _instance_apply_static(self, instance):
		# need to get "cgroup_mount_point_init", "cgroup_mount_point", "cgroup_groups_init",
		# "cgroup", and initialize mount point and cgroups before super class implementation call
		self._cgroup_mount_point = self._variables.expand(instance.options["cgroup_mount_point"])
		self._cgroup_mount_point_init = self._cmd.get_bool(self._variables.expand(
			instance.options["cgroup_mount_point_init"])) == "1"
		self._cgroup_groups_init = self._cmd.get_bool(self._variables.expand(
			instance.options["cgroup_groups_init"])) == "1"
		self._cgroup = self._sanitize_cgroup_path(self._variables.expand(
			instance.options["cgroup_for_isolated_cores"]))

		if self._cgroup_mount_point_init:
			self._cgroup_initialize()
		if self._cgroup_groups_init or self._cgroup_mount_point_init:
			self._cgroup_initialize_groups()

		super(SchedulerPlugin, self)._instance_apply_static(instance)

		self._cgroup_set_affinity()
		try:
			ps = self.get_processes()
		except (OSError, IOError) as e:
			log.error("error applying tuning, cannot get information about running processes: %s"
					% e)
			return
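		# each "group.<name>" option value has the form
		# <rule_prio>:<scheduler>:<priority>:<affinity>:<regex>;
		# rules are applied in ascending rule_prio order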
		sched_cfg = [(option, str(value).split(":", 4)) for option, value in instance._scheduler.items()]
		buf = [(option, self._convert_sched_cfg(vals))
				for option, vals in sched_cfg
				if re.match(r"group\.", option)
				and len(vals) == 5]
		sched_cfg = sorted(buf, key=lambda option_vals: option_vals[1][0])
		sched_all = dict()
		# for runtime tuning
		instance._sched_lookup = {}
		for option, (rule_prio, scheduler, priority, affinity, regex) \
				in sched_cfg:
			try:
				r = re.compile(regex)
			except re.error as e:
				log.error("error compiling regular expression: '%s'" % str(regex))
				continue
			processes = [(pid, cmd) for pid, cmd in ps.items() if re.search(r, cmd) is not None]
			#cmd - process name, option - group name
			sched = dict([(pid, (cmd, option, scheduler, priority, affinity, regex))
					for pid, cmd in processes])
			sched_all.update(sched)
			# make any contained regexes non-capturing: replace "(" with "(?:",
			# unless the "(" is preceded by "\" or followed by "?"
			regex = re.sub(r"(?<!\\)\((?!\?)", "(?:", str(regex))
			instance._sched_lookup[regex] = [scheduler, priority, affinity]
		for pid, (cmd, option, scheduler, priority, affinity, regex) \
				in sched_all.items():
			self._tune_process(pid, cmd, scheduler,
					priority, affinity)
		self._storage.set(self._scheduler_storage_key,
				self._scheduler_original)
		if self._daemon and instance._runtime_tuning:
			instance._thread = threading.Thread(target = self._thread_code, args = [instance])
			instance._thread.start()

	def _restore_ps_affinity(self):
		try:
			ps = self.get_processes()
		except (OSError, IOError) as e:
			log.error("error unapplying tuning, cannot get information about running processes: %s"
					% e)
			return
		for pid, orig_params in self._scheduler_original.items():
			# if command line for the pid didn't change, it's very probably the same process
			if pid not in ps or ps[pid] != orig_params.cmdline:
				continue
			if orig_params.scheduler is not None \
					and orig_params.priority is not None:
				self._set_rt(pid, orig_params.scheduler,
						orig_params.priority)
			if orig_params.cgroup is not None:
				self._set_cgroup(pid, orig_params.cgroup)
			elif orig_params.affinity is not None:
				self._set_affinity(pid, orig_params.affinity)
		self._scheduler_original = {}
		self._storage.unset(self._scheduler_storage_key)

	def _cgroup_cleanup_tasks_one(self, cgroup):
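		# Move any tasks still attached to the cgroup back to the root
		# cpuset (by writing their PIDs to the top-level 'tasks' file),
		# retrying a limited number of times so the group can be removed.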
		cnt = int(consts.CGROUP_CLEANUP_TASKS_RETRY)
		data = " "
		while data != "" and cnt > 0:
			data = self._cmd.read_file("%s/%s/%s" % (self._cgroup_mount_point, cgroup, "tasks"),
				err_ret = " ", no_error = True)
			if data not in ["", " "]:
				for l in data.split("\n"):
					self._cmd.write_to_file("%s/%s" % (self._cgroup_mount_point, "tasks"), l, no_error = True)
			cnt -= 1
		if cnt == 0:
			log.warn("Unable to cleanup tasks from cgroup '%s'" % cgroup)

	def _cgroup_cleanup_tasks(self):
		if self._cgroup is not None and not self._cgroup in self._cgroups:
			self._cgroup_cleanup_tasks_one(self._cgroup)
		for cg in self._cgroups:
			self._cgroup_cleanup_tasks_one(cg)

	def _instance_unapply_static(self, instance, rollback = consts.ROLLBACK_SOFT):
		super(SchedulerPlugin, self)._instance_unapply_static(instance, rollback)
		if self._daemon and instance._runtime_tuning:
			instance._terminate.set()
			instance._thread.join()
		self._restore_ps_affinity()
		self._cgroup_restore_affinity()
		self._cgroup_cleanup_tasks()
		if self._cgroup_groups_init or self._cgroup_mount_point_init:
			self._cgroup_finalize_groups()
		if self._cgroup_mount_point_init:
			self._cgroup_finalize()

	def _cgroup_verify_affinity_one(self, cgroup, affinity):
		log.debug("Verifying cgroup '%s' affinity" % cgroup)
		path = "%s/%s/%s" % (self._cgroup_mount_point, cgroup, "cpuset.cpus")
		current_affinity = self._cmd.read_file(path, err_ret = "ERR", no_error = True)
		if current_affinity == "ERR":
			return True
		current_affinity = self._cmd.cpulist2string(self._cmd.cpulist_pack(current_affinity))
		affinity = self._cmd.cpulist2string(self._cmd.cpulist_pack(affinity))
		affinity_description = "cgroup '%s' affinity" % cgroup
		if current_affinity == affinity:
			log.info(consts.STR_VERIFY_PROFILE_VALUE_OK
					% (affinity_description, current_affinity))
			return True
		else:
			log.error(consts.STR_VERIFY_PROFILE_VALUE_FAIL
					% (affinity_description, current_affinity,
					affinity))
			return False

	def _cgroup_verify_affinity(self):
		log.debug("Veryfying cgroups affinities")
		ret = True
		if self._affinity is not None and self._cgroup is not None and not self._cgroup in self._cgroups:
			ret = ret and self._cgroup_verify_affinity_one(self._cgroup, self._affinity)
		for cg in self._cgroups.items():
			ret = ret and self._cgroup_verify_affinity_one(cg[0], cg[1])
		return ret

	def _instance_verify_static(self, instance, ignore_missing, devices):
		ret1 = super(SchedulerPlugin, self)._instance_verify_static(instance, ignore_missing, devices)
		ret2 = self._cgroup_verify_affinity()
		return ret1 and ret2

	def _add_pid(self, instance, pid, r):
		try:
			cmd = self._get_cmdline(pid)
		except (OSError, IOError) as e:
			if e.errno == errno.ENOENT \
					or e.errno == errno.ESRCH:
				log.debug("Failed to get cmdline of PID %d, the task vanished."
						% pid)
			else:
				log.error("Failed to get cmdline of PID %d: %s"
						% (pid, e))
			return
		v = self._cmd.re_lookup(instance._sched_lookup, cmd, r)
		if v is not None and not pid in self._scheduler_original:
			log.debug("tuning new process '%s' with PID '%d' by '%s'" % (cmd, pid, str(v)))
			(sched, prio, affinity) = v
			self._tune_process(pid, cmd, sched, prio,
					affinity)
			self._storage.set(self._scheduler_storage_key,
					self._scheduler_original)

	def _remove_pid(self, instance, pid):
		if pid in self._scheduler_original:
			del self._scheduler_original[pid]
			log.debug("removed PID %d from the rollback database" % pid)
			self._storage.set(self._scheduler_storage_key,
					self._scheduler_original)

	def _thread_code(self, instance):
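		# Runtime tuning loop: poll perf for task events and tune newly
		# appearing tasks (RECORD_COMM/RECORD_FORK), or drop exited ones
		# (RECORD_EXIT) from the rollback database.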
		r = self._cmd.re_lookup_compile(instance._sched_lookup)
		poll = select.poll()
		# Store the file objects in a local variable so that they don't
		# go out of scope too soon. This is a workaround for
		# python3-perf bug rhbz#1659445.
		fds = instance._evlist.get_pollfd()
		for fd in fds:
			poll.register(fd)
		while not instance._terminate.is_set():
			# timeout to poll in milliseconds
			if len(poll.poll(self._sleep_interval * 1000)) > 0 and not instance._terminate.is_set():
				read_events = True
				while read_events:
					read_events = False
					for cpu in self._cpus:
						event = instance._evlist.read_on_cpu(cpu)
						if event:
							read_events = True
							if event.type == perf.RECORD_COMM or \
								(self._perf_process_fork_value and event.type == perf.RECORD_FORK):
								self._add_pid(instance, int(event.tid), r)
							elif event.type == perf.RECORD_EXIT:
								self._remove_pid(instance, int(event.tid))

	@command_custom("cgroup_ps_blacklist", per_device = False)
	def _cgroup_ps_blacklist(self, enabling, value, verify, ignore_missing):
		# currently unsupported
		if verify:
			return None
		if enabling and value is not None:
			self._cgroup_ps_blacklist_re = "|".join(["(%s)" % v for v in re.split(r"(?<!\\);", str(value))])

	@command_custom("ps_whitelist", per_device = False)
	def _ps_whitelist(self, enabling, value, verify, ignore_missing):
		# currently unsupported
		if verify:
			return None
		if enabling and value is not None:
			self._ps_whitelist = "|".join(["(%s)" % v for v in re.split(r"(?<!\\);", str(value))])

	@command_custom("ps_blacklist", per_device = False)
	def _ps_blacklist(self, enabling, value, verify, ignore_missing):
		# currently unsupported
		if verify:
			return None
		if enabling and value is not None:
			self._ps_blacklist = "|".join(["(%s)" % v for v in re.split(r"(?<!\\);", str(value))])

	@command_custom("irq_process", per_device = False)
	def _irq_process(self, enabling, value, verify, ignore_missing):
		# currently unsupported
		if verify:
			return None
		if enabling and value is not None:
			self._irq_process = self._cmd.get_bool(value) == "1"

	@command_custom("default_irq_smp_affinity", per_device = False)
	def _default_irq_smp_affinity(self, enabling, value, verify, ignore_missing):
		# currently unsupported
		if verify:
			return None
		if enabling and value is not None:
			if value in ["calc", "ignore"]:
				self._default_irq_smp_affinity_value = value
			else:
				self._default_irq_smp_affinity_value = self._cmd.cpulist_unpack(value)

	@command_custom("perf_process_fork", per_device = False)
	def _perf_process_fork(self, enabling, value, verify, ignore_missing):
		# currently unsupported
		if verify:
			return None
		if enabling and value is not None:
			self._perf_process_fork_value = self._cmd.get_bool(value) == "1"

	# Raises OSError
	# Raises SystemError with old (pre-0.4) python-schedutils
	# instead of OSError
	# If PID doesn't exist, errno == ESRCH
	def _get_affinity(self, pid):
		res = self._scheduler_utils.get_affinity(pid)
		log.debug("Read affinity '%s' of PID %d" % (res, pid))
		return res

	def _set_affinity(self, pid, affinity):
		log.debug("Setting CPU affinity of PID %d to '%s'." % (pid, affinity))
		try:
			self._scheduler_utils.set_affinity(pid, affinity)
			return True
		# Workaround for old python-schedutils (pre-0.4) which
		# incorrectly raised SystemError instead of OSError
		except (SystemError, OSError) as e:
			if hasattr(e, "errno") and e.errno == errno.ESRCH:
				log.debug("Failed to set affinity of PID %d, the task vanished."
						% pid)
			else:
				res = self._affinity_changeable(pid)
				if res == 1 or res == -2:
					log.error("Failed to set affinity of PID %d to '%s': %s"
							% (pid, affinity, e))
			return False

	# Returns the intersection of affinity1 with affinity2; if the intersection is empty, returns affinity3.
	def _get_intersect_affinity(self, affinity1, affinity2, affinity3):
		aff = set(affinity1).intersection(set(affinity2))
		if aff:
			return list(aff)
		return affinity3

	def _set_all_obj_affinity(self, objs, affinity, threads = False):
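		# Apply the affinity to every object matching the ps_whitelist and
		# not matching the ps_blacklist / cgroup_ps_blacklist regexes, then
		# recurse into the threads of each matched process.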
		psl = [v for v in objs if re.search(self._ps_whitelist,
				self._get_stat_comm(v)) is not None]
		if self._ps_blacklist != "":
			psl = [v for v in psl if re.search(self._ps_blacklist,
					self._get_stat_comm(v)) is None]
		if self._cgroup_ps_blacklist_re != "":
			psl = [v for v in psl if re.search(self._cgroup_ps_blacklist_re,
					self._get_stat_cgroup(v)) is None]
		psd = dict([(v.pid, v) for v in psl])
		for pid in psd:
			try:
				cmd = self._get_cmdline(psd[pid])
			except (OSError, IOError) as e:
				if e.errno == errno.ENOENT \
						or e.errno == errno.ESRCH:
					log.debug("Failed to get cmdline of PID %d, the task vanished."
							% pid)
				else:
					log.error("Refusing to set affinity of PID %d, failed to get its cmdline: %s"
							% (pid, e))
				continue
			cont = self._tune_process_affinity(pid, affinity,
					intersect = True)
			if not cont:
				continue
			if pid in self._scheduler_original:
				self._scheduler_original[pid].cmdline = cmd
			# process threads
			if not threads and "threads" in psd[pid]:
				self._set_all_obj_affinity(
						psd[pid]["threads"].values(),
						affinity, True)

	def _get_stat_cgroup(self, o):
		try:
			return o["cgroups"]
		except (OSError, IOError, KeyError):
			return ""

	def _get_stat_comm(self, o):
		try:
			return o["stat"]["comm"]
		except (OSError, IOError, KeyError):
			return ""

	def _set_ps_affinity(self, affinity):
		try:
			ps = procfs.pidstats()
			ps.reload_threads()
			self._set_all_obj_affinity(ps.values(), affinity, False)
		except (OSError, IOError) as e:
			log.error("error applying tuning, cannot get information about running processes: %s"
					% e)

	# Returns 0 on success, -2 if changing the affinity is not
	# supported, -1 if some other error occurs.
	def _set_irq_affinity(self, irq, affinity, restoring):
		try:
			affinity_hex = self._cmd.cpulist2hex(affinity)
			log.debug("Setting SMP affinity of IRQ %s to '%s'"
					% (irq, affinity_hex))
			filename = "/proc/irq/%s/smp_affinity" % irq
			with open(filename, "w") as f:
				f.write(affinity_hex)
			return 0
		except (OSError, IOError) as e:
			# EIO is returned by
			# kernel/irq/proc.c:write_irq_affinity() if changing
			# the affinity is not supported
			# (at least on kernels 3.10 and 4.18)
			if hasattr(e, "errno") and e.errno == errno.EIO \
					and not restoring:
				log.debug("Setting SMP affinity of IRQ %s is not supported"
						% irq)
				return -2
			else:
				log.error("Failed to set SMP affinity of IRQ %s to '%s': %s"
						% (irq, affinity_hex, e))
				return -1

	def _set_default_irq_affinity(self, affinity):
		try:
			affinity_hex = self._cmd.cpulist2hex(affinity)
			log.debug("Setting default SMP IRQ affinity to '%s'"
					% affinity_hex)
			with open("/proc/irq/default_smp_affinity", "w") as f:
				f.write(affinity_hex)
		except (OSError, IOError) as e:
			log.error("Failed to set default SMP IRQ affinity to '%s': %s"
					% (affinity_hex, e))

	def _set_all_irq_affinity(self, affinity):
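		# Intersect each IRQ's current SMP affinity with the requested one
		# and remember the original values (and unchangeable IRQs) for
		# rollback; then handle /proc/irq/default_smp_affinity according to
		# the default_irq_smp_affinity option ('calc', 'ignore', or an
		# explicit CPU list).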
		irq_original = IRQAffinities()
		irqs = procfs.interrupts()
		for irq in irqs.keys():
			try:
				prev_affinity = irqs[irq]["affinity"]
				log.debug("Read affinity of IRQ '%s': '%s'"
						% (irq, prev_affinity))
			except KeyError:
				continue
			_affinity = self._get_intersect_affinity(prev_affinity, affinity, affinity)
			if set(_affinity) == set(prev_affinity):
				continue
			res = self._set_irq_affinity(irq, _affinity, False)
			if res == 0:
				irq_original.irqs[irq] = prev_affinity
			elif res == -2:
				irq_original.unchangeable.append(irq)

		# default affinity
		prev_affinity_hex = self._cmd.read_file("/proc/irq/default_smp_affinity")
		prev_affinity = self._cmd.hex2cpulist(prev_affinity_hex)
		if self._default_irq_smp_affinity_value == "calc":
			_affinity = self._get_intersect_affinity(prev_affinity, affinity, affinity)
		elif self._default_irq_smp_affinity_value != "ignore":
			_affinity = self._default_irq_smp_affinity_value
		if self._default_irq_smp_affinity_value != "ignore":
			self._set_default_irq_affinity(_affinity)
			irq_original.default = prev_affinity
		self._storage.set(self._irq_storage_key, irq_original)

	def _restore_all_irq_affinity(self):
		irq_original = self._storage.get(self._irq_storage_key, None)
		if irq_original is None:
			return
		for irq, affinity in irq_original.irqs.items():
			self._set_irq_affinity(irq, affinity, True)
		if self._default_irq_smp_affinity_value != "ignore":
			affinity = irq_original.default
			self._set_default_irq_affinity(affinity)
		self._storage.unset(self._irq_storage_key)

	def _verify_irq_affinity(self, irq_description, correct_affinity,
			current_affinity):
		res = set(current_affinity).issubset(set(correct_affinity))
		if res:
			log.info(consts.STR_VERIFY_PROFILE_VALUE_OK
					% (irq_description, current_affinity))
		else:
			log.error(consts.STR_VERIFY_PROFILE_VALUE_FAIL
					% (irq_description, current_affinity,
					correct_affinity))
		return res

	def _verify_all_irq_affinity(self, correct_affinity, ignore_missing):
		irq_original = self._storage.get(self._irq_storage_key, None)
		irqs = procfs.interrupts()
		res = True
		for irq in irqs.keys():
			if irq in irq_original.unchangeable and ignore_missing:
				description = "IRQ %s does not support changing SMP affinity" % irq
				log.info(consts.STR_VERIFY_PROFILE_VALUE_MISSING % description)
				continue
			try:
				current_affinity = irqs[irq]["affinity"]
				log.debug("Read SMP affinity of IRQ '%s': '%s'"
						% (irq, current_affinity))
				irq_description = "SMP affinity of IRQ %s" % irq
				if not self._verify_irq_affinity(
						irq_description,
						correct_affinity,
						current_affinity):
					res = False
			except KeyError:
				continue

		current_affinity_hex = self._cmd.read_file(
				"/proc/irq/default_smp_affinity")
		current_affinity = self._cmd.hex2cpulist(current_affinity_hex)
		if self._default_irq_smp_affinity_value != "ignore" and not self._verify_irq_affinity("default IRQ SMP affinity",
				current_affinity, correct_affinity if self._default_irq_smp_affinity_value == "calc" else
				self._default_irq_smp_affinity_value):
			res = False
		return res

	@command_custom("isolated_cores", per_device = False, priority = 10)
	def _isolated_cores(self, enabling, value, verify, ignore_missing):
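		# The affinity for tuned processes is computed as all present CPUs
		# minus the isolated ones; it is applied to processes (directly or
		# via the configured cgroup) and, if irq_process is enabled, to
		# IRQs as well.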
		affinity = None
		self._affinity = None
		if value is not None:
			isolated = set(self._cmd.cpulist_unpack(value))
			present = set(self._cpus)
			if isolated.issubset(present):
				affinity = list(present - isolated)
				self._affinity = self._cmd.cpulist2string(affinity)
			else:
				str_cpus = self._cmd.cpulist2string(self._cpus)
				log.error("Invalid isolated_cores specified, '%s' does not match available cores '%s'"
						% (value, str_cpus))
		if (enabling or verify) and affinity is None:
			return None
		# currently only IRQ affinity verification is supported
		if verify:
			if self._irq_process:
				return self._verify_all_irq_affinity(affinity, ignore_missing)
			return True
		elif enabling:
			if self._cgroup:
				self._cgroup_set_affinity()
				ps_affinity = "cgroup.%s" % self._cgroup
			else:
				ps_affinity = affinity
			self._set_ps_affinity(ps_affinity)
			if self._irq_process:
				self._set_all_irq_affinity(affinity)
		else:
			# Restoring processes' affinity is done in
			# _instance_unapply_static()
			if self._irq_process:
				self._restore_all_irq_affinity()

	def _get_sched_knob_path(self, prefix, namespace, knob):
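		# Prefer the /proc/sys/kernel/<namespace>_<knob> path used by older
		# kernels; if it does not exist, fall back to the debugfs location
		# under /sys/kernel/debug (which may be inaccessible with Secure
		# Boot / kernel lockdown, hence the hint below).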
		key = "%s_%s_%s" % (prefix, namespace, knob)
		path = self._sched_knob_paths_cache.get(key)
		if path:
			return path
		path = "/proc/sys/kernel/%s_%s" % (namespace, knob)
		if not os.path.exists(path):
			if prefix == "":
				path = "%s/%s" % (namespace, knob)
			else:
				path = "%s/%s/%s" % (prefix, namespace, knob)
			path = "/sys/kernel/debug/%s" % path
			if self._secure_boot_hint is None:
				self._secure_boot_hint = True
		self._sched_knob_paths_cache[key] = path
		return path

	def _get_sched_knob(self, prefix, namespace, knob):
		data = self._cmd.read_file(self._get_sched_knob_path(prefix, namespace, knob), err_ret = None)
		if data is None:
			log.error("Error reading '%s'" % knob)
			if self._secure_boot_hint:
				log.error("This may not work with Secure Boot or kernel_lockdown (this hint is logged only once)")
				self._secure_boot_hint = False
		return data

	def _set_sched_knob(self, prefix, namespace, knob, value, sim, remove = False):
		if value is None:
			return None
		if not sim:
			if not self._cmd.write_to_file(self._get_sched_knob_path(prefix, namespace, knob), value, \
				no_error = [errno.ENOENT] if remove else False):
					log.error("Error writing value '%s' to '%s'" % (value, knob))
		return value

	@command_get("sched_min_granularity_ns")
	def _get_sched_min_granularity_ns(self):
		return self._get_sched_knob("", "sched", "min_granularity_ns")

	@command_set("sched_min_granularity_ns")
	def _set_sched_min_granularity_ns(self, value, sim, remove):
		return self._set_sched_knob("", "sched", "min_granularity_ns", value, sim, remove)

	@command_get("sched_latency_ns")
	def _get_sched_latency_ns(self):
		return self._get_sched_knob("", "sched", "latency_ns")

	@command_set("sched_latency_ns")
	def _set_sched_latency_ns(self, value, sim, remove):
		return self._set_sched_knob("", "sched", "latency_ns", value, sim, remove)

	@command_get("sched_wakeup_granularity_ns")
	def _get_sched_wakeup_granularity_ns(self):
		return self._get_sched_knob("", "sched", "wakeup_granularity_ns")

	@command_set("sched_wakeup_granularity_ns")
	def _set_sched_wakeup_granularity_ns(self, value, sim, remove):
		return self._set_sched_knob("", "sched", "wakeup_granularity_ns", value, sim, remove)

	@command_get("sched_tunable_scaling")
	def _get_sched_tunable_scaling(self):
		return self._get_sched_knob("", "sched", "tunable_scaling")

	@command_set("sched_tunable_scaling")
	def _set_sched_tunable_scaling(self, value, sim, remove):
		return self._set_sched_knob("", "sched", "tunable_scaling", value, sim, remove)

	@command_get("sched_migration_cost_ns")
	def _get_sched_migration_cost_ns(self):
		return self._get_sched_knob("", "sched", "migration_cost_ns")

	@command_set("sched_migration_cost_ns")
	def _set_sched_migration_cost_ns(self, value, sim, remove):
		return self._set_sched_knob("", "sched", "migration_cost_ns", value, sim, remove)

	@command_get("sched_nr_migrate")
	def _get_sched_nr_migrate(self):
		return self._get_sched_knob("", "sched", "nr_migrate")

	@command_set("sched_nr_migrate")
	def _set_sched_nr_migrate(self, value, sim, remove):
		return self._set_sched_knob("", "sched", "nr_migrate", value, sim, remove)

	@command_get("numa_balancing_scan_delay_ms")
	def _get_numa_balancing_scan_delay_ms(self):
		return self._get_sched_knob("sched", "numa_balancing", "scan_delay_ms")

	@command_set("numa_balancing_scan_delay_ms")
	def _set_numa_balancing_scan_delay_ms(self, value, sim, remove):
		return self._set_sched_knob("sched", "numa_balancing", "scan_delay_ms", value, sim, remove)

	@command_get("numa_balancing_scan_period_min_ms")
	def _get_numa_balancing_scan_period_min_ms(self):
		return self._get_sched_knob("sched", "numa_balancing", "scan_period_min_ms")

	@command_set("numa_balancing_scan_period_min_ms")
	def _set_numa_balancing_scan_period_min_ms(self, value, sim, remove):
		return self._set_sched_knob("sched", "numa_balancing", "scan_period_min_ms", value, sim, remove)

	@command_get("numa_balancing_scan_period_max_ms")
	def _get_numa_balancing_scan_period_max_ms(self):
		return self._get_sched_knob("sched", "numa_balancing", "scan_period_max_ms")

	@command_set("numa_balancing_scan_period_max_ms")
	def _set_numa_balancing_scan_period_max_ms(self, value, sim, remove):
		return self._set_sched_knob("sched", "numa_balancing", "scan_period_max_ms", value, sim, remove)

	@command_get("numa_balancing_scan_size_mb")
	def _get_numa_balancing_scan_size_mb(self):
		return self._get_sched_knob("sched", "numa_balancing", "scan_size_mb")

	@command_set("numa_balancing_scan_size_mb")
	def _set_numa_balancing_scan_size_mb(self, value, sim, remove):
		return self._set_sched_knob("sched", "numa_balancing", "scan_size_mb", value, sim, remove)
site-packages/tuned/plugins/plugin_selinux.py
import os
import errno
from . import base
from .decorators import *
import tuned.logs
from tuned.plugins import exceptions
from tuned.utils.commands import commands

log = tuned.logs.get()

class SelinuxPlugin(base.Plugin):
	"""
	`selinux`::
	
	Plug-in for tuning SELinux options.
	+
	SELinux decisions, such as allowing or denying access, are
	cached. This cache is known as the Access Vector Cache (AVC). When
	using these cached decisions, SELinux policy rules need to be checked
	less, which increases performance. The [option]`avc_cache_threshold`
	option allows adjusting the maximum number of AVC entries.
	+
	NOTE: Prior to changing the default value, evaluate the system
	performance with care. Increasing the value could potentially
	decrease the performance by making AVC slow.
	+
	.Increase the AVC cache threshold for hosts with containers.
	====
	----
	[selinux]
	avc_cache_threshold=8192
	----
	====
	"""

	@classmethod
	def _get_selinux_path(self):
		path = "/sys/fs/selinux"
		if not os.path.exists(path):
			path = "/selinux"
			if not os.path.exists(path):
				path = None
		return path

	def __init__(self, *args, **kwargs):
		self._cmd = commands()
		self._selinux_path = self._get_selinux_path()
		if self._selinux_path is None:
			raise exceptions.NotSupportedPluginException("SELinux is not enabled on your system or incompatible version is used.")
		self._cache_threshold_path = os.path.join(self._selinux_path, "avc", "cache_threshold")
		super(SelinuxPlugin, self).__init__(*args, **kwargs)

	@classmethod
	def _get_config_options(self):
		return {
			"avc_cache_threshold" : None,
		}

	def _instance_init(self, instance):
		instance._has_static_tuning = True
		instance._has_dynamic_tuning = False

	def _instance_cleanup(self, instance):
		pass

	@command_set("avc_cache_threshold")
	def _set_avc_cache_threshold(self, value, sim, remove):
		if value is None:
			return None
		threshold = int(value)
		if threshold >= 0:
			if not sim:
				self._cmd.write_to_file(self._cache_threshold_path, threshold, \
					no_error = [errno.ENOENT] if remove else False)
			return threshold
		else:
			return None

	@command_get("avc_cache_threshold")
	def _get_avc_cache_threshold(self):
		value = self._cmd.read_file(self._cache_threshold_path)
		if len(value) > 0:
			return int(value)
		return None
site-packages/tuned/plugins/plugin_irqbalance.py
from . import base
from .decorators import command_custom
from tuned import consts
import tuned.logs
import errno
import perf
import re

log = tuned.logs.get()

class IrqbalancePlugin(base.Plugin):
	"""
	`irqbalance`::
	
	Plug-in for irqbalance settings management. The plug-in
	configures CPUs which should be skipped when rebalancing IRQs in
	`/etc/sysconfig/irqbalance`. It then restarts irqbalance if and
	only if it was previously running.
	+
	The banned/skipped CPUs are specified as a CPU list via the
	[option]`banned_cpus` option.
	+
	.Skip CPUs 2,4 and 9-13 when rebalancing IRQs
	====
	----
	[irqbalance]
	banned_cpus=2,4,9-13
	----
	====
	"""

	def __init__(self, *args, **kwargs):
		super(IrqbalancePlugin, self).__init__(*args, **kwargs)
		self._cpus = perf.cpu_map()

	def _instance_init(self, instance):
		instance._has_dynamic_tuning = False
		instance._has_static_tuning = True

	def _instance_cleanup(self, instance):
		pass

	@classmethod
	def _get_config_options(cls):
		return {
			"banned_cpus": None,
		}

	def _read_irqbalance_sysconfig(self):
		try:
			with open(consts.IRQBALANCE_SYSCONFIG_FILE, "r") as f:
				return f.read()
		except IOError as e:
			if e.errno == errno.ENOENT:
				log.warn("irqbalance sysconfig file is missing. Is irqbalance installed?")
			else:
				log.error("Failed to read irqbalance sysconfig file: %s" % e)
			return None

	def _write_irqbalance_sysconfig(self, content):
		try:
			with open(consts.IRQBALANCE_SYSCONFIG_FILE, "w") as f:
				f.write(content)
			return True
		except IOError as e:
			log.error("Failed to write irqbalance sysconfig file: %s" % e)
			return False

	def _write_banned_cpus(self, sysconfig, banned_cpumask):
		return sysconfig + "IRQBALANCE_BANNED_CPUS=%s\n" % banned_cpumask

	def _clear_banned_cpus(self, sysconfig):
		lines = []
		for line in sysconfig.split("\n"):
			if not re.match(r"\s*IRQBALANCE_BANNED_CPUS=", line):
				lines.append(line)
		return "\n".join(lines)

	def _restart_irqbalance(self):
		# Exit code 5 means unit not found (see 'EXIT_NOTINSTALLED' in
		# systemd.exec(5))
		retcode, out = self._cmd.execute(
			["systemctl", "try-restart", "irqbalance"],
			no_errors=[5])
		if retcode != 0:
			log.warn("Failed to restart irqbalance. Is it installed?")

	def _set_banned_cpus(self, banned_cpumask):
		content = self._read_irqbalance_sysconfig()
		if content is None:
			return
		content = self._clear_banned_cpus(content)
		content = self._write_banned_cpus(content, banned_cpumask)
		if self._write_irqbalance_sysconfig(content):
			self._restart_irqbalance()

	def _restore_banned_cpus(self):
		content = self._read_irqbalance_sysconfig()
		if content is None:
			return
		content = self._clear_banned_cpus(content)
		if self._write_irqbalance_sysconfig(content):
			self._restart_irqbalance()

	@command_custom("banned_cpus", per_device=False)
	def _banned_cpus(self, enabling, value, verify, ignore_missing):
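		# Convert the banned_cpus CPU list to a hexadecimal mask (after
		# checking it against the CPUs present) and write or clear it in
		# the irqbalance sysconfig file.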
		banned_cpumask = None
		if value is not None:
			banned = set(self._cmd.cpulist_unpack(value))
			present = set(self._cpus)
			if banned.issubset(present):
				banned_cpumask = self._cmd.cpulist2hex(list(banned))
			else:
				str_cpus = ",".join([str(x) for x in self._cpus])
				log.error("Invalid banned_cpus specified, '%s' does not match available cores '%s'"
					  % (value, str_cpus))

		if (enabling or verify) and banned_cpumask is None:
			return None
		if verify:
			# Verification is currently not supported
			return None
		elif enabling:
			self._set_banned_cpus(banned_cpumask)
		else:
			self._restore_banned_cpus()
site-packages/tuned/plugins/plugin_rtentsk.py
from . import base
from .decorators import *
import tuned.logs
from tuned.utils.commands import commands
import glob
import socket
import time

log = tuned.logs.get()

class RTENTSKPlugin(base.Plugin):
	"""
	`rtentsk`::
	
	Plugin for avoiding interruptions due to static key IPIs caused by
	sockets being opened with timestamping enabled (by opening such a
	socket ourselves, the static key is kept enabled).
	"""

	def _instance_init(self, instance):
		instance._has_static_tuning = True
		instance._has_dynamic_tuning = False

		# Neither SO_TIMESTAMP nor SOF_TIMESTAMPING_OPT_TX_SWHW is defined
		# by the socket class

		SO_TIMESTAMP = 29 # see include/uapi/asm-generic/socket.h
		#define SO_TIMESTAMP 0x4012 # parisc!
		SOF_TIMESTAMPING_OPT_TX_SWHW = (1<<14) # see include/uapi/linux/net_tstamp.h

		s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
		s.setsockopt(socket.SOL_SOCKET, SO_TIMESTAMP, SOF_TIMESTAMPING_OPT_TX_SWHW)
		self.rtentsk_socket = s
		log.info("opened SOF_TIMESTAMPING_OPT_TX_SWHW socket")

	def _instance_cleanup(self, instance):
		s = self.rtentsk_socket
		s.close()
site-packages/tuned/plugins/plugin_script.py
import tuned.consts as consts
from . import base
import tuned.logs
import os
from subprocess import Popen, PIPE

log = tuned.logs.get()

class ScriptPlugin(base.Plugin):
	"""
	`script`::
	
	Executes an external script or binary when the profile is loaded or
	unloaded. You can choose an arbitrary executable.
	+
	IMPORTANT: The `script` plug-in is provided mainly for compatibility
	with earlier releases. Prefer other *TuneD* plug-ins if they cover
	the required functionality.
	+
	*TuneD* calls the executable with one of the following arguments:
	+
	--
	** `start` when loading the profile
	** `stop` when unloading the profile
	--
	+
	You need to correctly implement the `stop` action in your executable
	and revert all settings that you changed during the `start`
	action. Otherwise, the roll-back step after changing your *TuneD*
	profile will not work.
	+
	Bash scripts can import the [filename]`/usr/lib/tuned/functions`
	Bash library and use the functions defined there. Use these
	functions only for functionality that is not natively provided
	by *TuneD*. If a function name starts with an underscore, such as
	`_wifi_set_power_level`, consider the function private and do not
	use it in your scripts, because it might change in the future.
	+
	Specify the path to the executable using the `script` parameter in
	the plug-in configuration.
	+
	.Running a Bash script from a profile
	====
	To run a Bash script named `script.sh` that is located in the profile
	directory, use:
	
	----
	[script]
	script=${i:PROFILE_DIR}/script.sh
	----
	====
	"""

	@classmethod
	def _get_config_options(self):
		return {
			"script" : None,
		}

	def _instance_init(self, instance):
		instance._has_static_tuning = True
		instance._has_dynamic_tuning = False
		if instance.options["script"] is not None:
			# FIXME: this hack originated from profiles merger
			assert isinstance(instance.options["script"], list)
			instance._scripts = instance.options["script"]
		else:
			instance._scripts = []

	def _instance_cleanup(self, instance):
		pass

	def _call_scripts(self, scripts, arguments):
		ret = True
		for script in scripts:
			environ = os.environ
			environ.update(self._variables.get_env())
			log.info("calling script '%s' with arguments '%s'" % (script, str(arguments)))
			log.debug("using environment '%s'" % str(list(environ.items())))
			try:
				proc = Popen([script] +  arguments, \
						stdout=PIPE, stderr=PIPE, \
						close_fds=True, env=environ, \
						universal_newlines = True, \
						cwd = os.path.dirname(script))
				out, err = proc.communicate()
				if len(err):
					log.error("script '%s' error output: '%s'" % (script, err[:-1]))
				if proc.returncode:
					log.error("script '%s' returned error code: %d" % (script, proc.returncode))
					ret = False
			except (OSError,IOError) as e:
				log.error("script '%s' error: %s" % (script, e))
				ret = False
		return ret

	def _instance_apply_static(self, instance):
		super(ScriptPlugin, self)._instance_apply_static(instance)
		self._call_scripts(instance._scripts, ["start"])

	def _instance_verify_static(self, instance, ignore_missing, devices):
		ret = True
		if super(ScriptPlugin, self)._instance_verify_static(instance,
				ignore_missing, devices) == False:
			ret = False
		args = ["verify"]
		if ignore_missing:
			args += ["ignore_missing"]
		if self._call_scripts(instance._scripts, args) == True:
			log.info(consts.STR_VERIFY_PROFILE_OK % instance._scripts)
		else:
			log.error(consts.STR_VERIFY_PROFILE_FAIL % instance._scripts)
			ret = False
		return ret

	def _instance_unapply_static(self, instance, rollback = consts.ROLLBACK_SOFT):
		args = ["stop"]
		if rollback == consts.ROLLBACK_FULL:
			args = args + ["full_rollback"]
		self._call_scripts(reversed(instance._scripts), args)
		super(ScriptPlugin, self)._instance_unapply_static(instance, rollback)
site-packages/tuned/plugins/plugin_modules.py
import re
import os.path
from . import base
from .decorators import *
import tuned.logs
from subprocess import *
from tuned.utils.commands import commands
import tuned.consts as consts

log = tuned.logs.get()

class ModulesPlugin(base.Plugin):
	"""
	`modules`::
	
	Plug-in for applying custom kernel modules options.
	+
	This plug-in can set parameters to kernel modules. It creates
	`/etc/modprobe.d/tuned.conf` file. The syntax is
	`_module_=_option1=value1 option2=value2..._` where `_module_` is
	the module name and `_optionx=valuex_` are module options which may
	or may not be present.
	+
	.Load module `netrom` with module parameter `nr_ndevs=2`
	====
	----
	[modules]
	netrom=nr_ndevs=2
	----
	====
	Modules can also be forced to load/reload by using an additional
	`+r` option prefix.
	+
	.(Re)load module `netrom` with module parameter `nr_ndevs=2`
	====
	----
	[modules]
	netrom=+r nr_ndevs=2
	----
	====
	The `+r` switch will also cause *TuneD* to try and remove `netrom`
	module (if loaded) and try and (re)insert it with the specified
	parameters. The `+r` can be followed by an optional comma (`+r,`)
	for better readability.
	+
	When using `+r` the module will be loaded immediately by the *TuneD*
	daemon itself rather than waiting for the OS to load it with the
	specified parameters.
	"""

	def __init__(self, *args, **kwargs):
		super(ModulesPlugin, self).__init__(*args, **kwargs)
		self._has_dynamic_options = True
		self._cmd = commands()

	def _instance_init(self, instance):
		instance._has_dynamic_tuning = False
		instance._has_static_tuning = True
		instance._modules = instance.options

	def _instance_cleanup(self, instance):
		pass

	def _reload_modules(self, modules):
		for module in modules:
			retcode, out = self._cmd.execute(["modprobe", "-r", module])
			if retcode < 0:
				log.warn("'modprobe' command not found, cannot reload kernel modules, reboot is required")
				return
			elif retcode > 0:
				log.debug("cannot remove kernel module '%s': %s" % (module, out.strip()))
			retcode, out = self._cmd.execute(["modprobe", module])
			if retcode != 0:
				log.warn("cannot insert/reinsert module '%s', reboot is required: %s" % (module, out.strip()))

	def _instance_apply_static(self, instance):
		self._clear_modprobe_file()
		s = ""
		retcode = 0
		skip_check = False
		reload_list = []
		for option, value in list(instance._modules.items()):
			module = self._variables.expand(option)
			v = self._variables.expand(value)
			if not skip_check:
				retcode, out = self._cmd.execute(["modinfo", module])
				if retcode < 0:
					skip_check = True
					log.warn("'modinfo' command not found, not checking kernel modules")
				elif retcode > 0:
					log.error("kernel module '%s' not found, skipping it" % module)
			if skip_check or retcode == 0:
				if len(v) > 1 and v[0:2] == "+r":
					v = re.sub(r"^\s*\+r\s*,?\s*", "", v)
					reload_list.append(module)
				if len(v) > 0:
					s += "options " + module + " " + v + "\n"
				else:
					log.debug("module '%s' doesn't have any option specified, not writing it to modprobe.d" % module)
		self._cmd.write_to_file(consts.MODULES_FILE, s)
		l = len(reload_list)
		if l > 0:
			self._reload_modules(reload_list)
			if len(instance._modules) != l:
				log.info(consts.STR_HINT_REBOOT)

	def _unquote_path(self, path):
		return str(path).replace("/", "")

	def _instance_verify_static(self, instance, ignore_missing, devices):
		ret = True
		# not all modules export all of their parameters through sysfs, so hardcode the check with ignore_missing
		ignore_missing = True
		r = re.compile(r"\s+")
		for option, value in list(instance._modules.items()):
			module = self._variables.expand(option)
			v = self._variables.expand(value)
			v = re.sub(r"^\s*\+r\s*,?\s*", "", v)
			mpath = "/sys/module/%s" % module
			if not os.path.exists(mpath):
				ret = False
				log.error(consts.STR_VERIFY_PROFILE_FAIL % "module '%s' is not loaded" % module)
			else:
				log.info(consts.STR_VERIFY_PROFILE_OK % "module '%s' is loaded" % module)
				l = r.split(v)
				for item in l:
					arg = item.split("=")
					if len(arg) != 2:
						log.warn("unrecognized module option for module '%s': %s" % (module, item))
					else:
						if self._verify_value(arg[0], arg[1],
							self._cmd.read_file(mpath + "/parameters/" + self._unquote_path(arg[0]), err_ret = None, no_error = True),
							ignore_missing) == False:
								ret = False
		return ret

	def _instance_unapply_static(self, instance, rollback = consts.ROLLBACK_SOFT):
		if rollback == consts.ROLLBACK_FULL:
			self._clear_modprobe_file()

	def _clear_modprobe_file(self):
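		# Keep only the leading comment header of the tuned modprobe.d file
		# and drop everything after the first non-comment line.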
		s = self._cmd.read_file(consts.MODULES_FILE, no_error = True)
		l = s.split("\n")
		i = j = 0
		ll = len(l)
		r = re.compile(r"^\s*#")
		while i < ll:
			if r.search(l[i]) is None:
				j = i
				i = ll
			i += 1
		s = "\n".join(l[0:j])
		if len(s) > 0:
			s += "\n"
		self._cmd.write_to_file(consts.MODULES_FILE, s)
site-packages/tuned/plugins/__pycache__/plugin_scsi_host.cpython-36.pyc  [compiled CPython 3.6 bytecode omitted]
site-packages/tuned/plugins/__pycache__/plugin_script.cpython-36.pyc  [compiled CPython 3.6 bytecode omitted]
site-packages/tuned/plugins/__pycache__/plugin_video.cpython-36.pyc  [compiled CPython 3.6 bytecode omitted]
site-packages/tuned/plugins/__pycache__/plugin_irqbalance.cpython-36.opt-1.pyc  [compiled CPython 3.6 bytecode omitted]
site-packages/tuned/plugins/__pycache__/repository.cpython-36.opt-1.pyc  [compiled CPython 3.6 bytecode omitted]
site-packages/tuned/plugins/__pycache__/plugin_selinux.cpython-36.opt-1.pyc  [compiled CPython 3.6 bytecode omitted]
�_get_config_options5sz!SelinuxPlugin._get_config_optionscCsd|_d|_dS)NTF)Z_has_static_tuningZ_has_dynamic_tuning)r�instancerrr
�_instance_init;szSelinuxPlugin._instance_initcCsdS)Nr)rrrrr
�_instance_cleanup?szSelinuxPlugin._instance_cleanuprcCsL|dkrdSt|�}|dkrD|s@|jj|j||r8tjgndd�|SdSdS)NrF)Zno_error)�intrZ
write_to_filer�errno�ENOENT)r�valueZsim�removeZ	thresholdrrr
�_set_avc_cache_thresholdBsz&SelinuxPlugin._set_avc_cache_thresholdcCs&|jj|j�}t|�dkr"t|�SdS)Nr)rZ	read_filer�lenr)rrrrr
�_get_avc_cache_thresholdOsz&SelinuxPlugin._get_avc_cache_threshold)�__name__�
__module__�__qualname__�__doc__�classmethodrrrrrZcommand_setr!Zcommand_getr#�
__classcell__rr)rr
rs	
r)rr�rZ
decoratorsZ
tuned.logsZtunedZ
tuned.pluginsrZtuned.utils.commandsrZlogs�get�logZPluginrrrrr
�<module>s
site-packages/tuned/plugins/__pycache__/plugin_video.cpython-36.opt-1.pyc000064400000010206147511334670022357 0ustar003

�<�e��@s`ddlmZddlTddlZddlmZddlZddlZddl	Z	ej
j�ZGdd�dej
�ZdS)�)�base)�*�N)�commandsc@sjeZdZdZdd�Zdd�Zedd��Zdd	�Zd
d�Z	dd
�Z
eddd�dd��Ze
d�ddd��ZdS)�VideoPlugina�
	`video`::
	
	Sets various powersave levels on video cards. Currently, only the
	Radeon cards are supported. The powersave level can be specified
	by using the [option]`radeon_powersave` option. Supported values are:
	+
	--
	* `default`
	* `auto`
	* `low`
	* `mid`
	* `high`
	* `dynpm`
	* `dpm-battery`
	* `dpm-balanced`
	* `dpm-performance`
	--
	+
	For additional detail, see
	link:https://www.x.org/wiki/RadeonFeature/#kmspowermanagementoptions[KMS Power Management Options].
	+
	NOTE: This plug-in is experimental and the option might change in future releases.
	+
	.To set the powersave level for the Radeon video card to high
	====
	----
	[video]
	radeon_powersave=high
	----
	====
	cCsTd|_t�|_t�|_x0|jjd�jd�jdd�D]}|jj|j	�q2Wt
�|_dS)NT�drmzcard*ZDEVTYPEZ	drm_minor)Z_devices_supported�setZ
_free_devicesZ_assigned_devices�_hardware_inventoryZget_devicesZmatch_sys_nameZmatch_property�addZsys_namer�_cmd)�self�device�r�"/usr/lib/python3.6/plugin_video.py�
_init_devices-s zVideoPlugin._init_devicescs�fdd�|D�S)Ncsg|]}�jjd|��qS)r)r	Z
get_device)�.0�x)rrr�
<listcomp>9sz3VideoPlugin._get_device_objects.<locals>.<listcomp>r)rZdevicesr)rr�_get_device_objects8szVideoPlugin._get_device_objectscCsddiS)N�radeon_powersaver)rrrr�_get_config_options;szVideoPlugin._get_config_optionscCsd|_d|_dS)NFT)Z_has_dynamic_tuningZ_has_static_tuning)r�instancerrr�_instance_initAszVideoPlugin._instance_initcCsdS)Nr)rrrrr�_instance_cleanupEszVideoPlugin._instance_cleanupcCsd|d|d|d�S)Nz%/sys/class/drm/%s/device/power_methodz&/sys/class/drm/%s/device/power_profilez(/sys/class/drm/%s/device/power_dpm_state)�method�profile�	dpm_stater)rr
rrr�_radeon_powersave_filesHsz#VideoPlugin._radeon_powersave_filesrT)Z
per_devicec	Csl|j|�}ttjdd|��j�}tjj|d�sF|sFtj	d|�dS�x|D�]}|dkr�|s�|j
j|dd
|rztj
gndd�r�|j
j|d
||r�tj
gndd�r�|SqN|d
kr�|s�|j
j|dd
|r�tj
gndd�r�d
SqN|dk�rP|�sd|td�d�}|j
j|dd|�rtj
gndd��rd|j
j|d||�r@tj
gndd��rd|SqN|�s`tj	d�dSqNWdS)Nz#(\s*:\s*)|(\s+)|(\s*;\s*)|(\s*,\s*)� rz)radeon_powersave is not supported on '%s'�default�auto�low�mid�highrF)�no_error�dynpm�dpm-battery�dpm-balanced�dpm-performancezdpm-�dpmrz$Invalid option for radeon_powersave.)rr r!r"r#)r&r'r()r�str�re�sub�split�os�path�exists�log�warnrZ
write_to_file�errno�ENOENT�len)	r�valuer
Zsim�remove�	sys_filesZva�v�staterrr�_set_radeon_powersaveOs>


z!VideoPlugin._set_radeon_powersaveFcCsr|j|�}|jj|d|d�j�}|dkr>|jj|d�j�S|dkrJ|S|dkrjd|jj|d�j�SdSdS)Nr)r$rr%r)zdpm-r)rrZ	read_file�strip)rr
Zignore_missingr8rrrr�_get_radeon_powersavess
z!VideoPlugin._get_radeon_powersaveN)F)�__name__�
__module__�__qualname__�__doc__rr�classmethodrrrrZcommand_setr;Zcommand_getr=rrrrrs $r)�rZ
decoratorsZ
tuned.logsZtunedZtuned.utils.commandsrr.r3r+Zlogs�getr1ZPluginrrrrr�<module>s
site-packages/tuned/plugins/__pycache__/plugin_systemd.cpython-36.opt-1.pyc000064400000013657147511334670022756 0ustar003

�<�e4�@snddlmZddlTddlZddlmZddlmZddlj	Z	ddl
Z
ddlZejj
�ZGdd�dej�ZdS)	�)�base)�*�N)�
exceptions)�commandscs�eZdZdZ�fdd�Zdd�Zdd�Zedd	��Zd
d�Z	dd
�Z
dd�Zdd�Zdd�Z
dd�Zdd�Zejfdd�Zdd�Zeddd�dd ��Z�ZS)!�
SystemdPlugina<
	`systemd`::
	
	Plug-in for tuning systemd options.
	+
	The [option]`cpu_affinity` option allows setting CPUAffinity in
	`/etc/systemd/system.conf`. This configures the CPU affinity for the
	service manager as well as the default CPU affinity for all forked
	off processes. The option takes a comma-separated list of CPUs with
	optional CPU ranges specified by the minus sign (`-`).
	+
	.Set the CPUAffinity for `systemd` to `0 1 2 3`
	====
	----
	[systemd]
	cpu_affinity=0-3
	----
	====
	+
	NOTE: These tunings are unloaded only on profile change followed by a reboot.
	cs<tjjtj�stjdtj��tt|�j	||�t
�|_dS)NzERequired systemd '%s' configuration file not found, disabling plugin.)�os�path�isfile�consts�SYSTEMD_SYSTEM_CONF_FILErZNotSupportedPluginException�superr�__init__r�_cmd)�self�args�kwargs)�	__class__��$/usr/lib/python3.6/plugin_systemd.pyr$szSystemdPlugin.__init__cCsd|_d|_dS)NFT)Z_has_dynamic_tuningZ_has_static_tuning)r�instancerrr�_instance_init*szSystemdPlugin._instance_initcCsdS)Nr)rrrrr�_instance_cleanup.szSystemdPlugin._instance_cleanupcCsddiS)N�cpu_affinityr)�clsrrr�_get_config_options1sz!SystemdPlugin._get_config_optionscCsB|dk	r>tjd|d|tjd�}|dk	r>|jdkr>|jd�SdS)Nz^\s*z\s*=\s*(.*)$)�flagsr)�re�search�	MULTILINE�	lastindex�group)r�conf�key�morrr�_get_keyval7s

zSystemdPlugin._get_keyvalcCs~tjd|ddt|�|tjd�\}}|dkrzy|ddkrF|d7}Wntk
r\YnX||dt|�d7}|S|S)	Nz^(\s*z\s*=).*$z\g<1>)rr�
�=���)r�subn�strr�
IndexError)rr"r#�valZconf_newZnsubsrrr�_add_keyval?s(zSystemdPlugin._add_keyvalcCstjd|dd|tjd�S)Nz^\s*z\s*=.*\n�)r)r�subr)rr"r#rrr�_del_keyKszSystemdPlugin._del_keycCs,|jjtjdd�}|dkr(tjd�dS|S)N)�err_retz(error reading systemd configuration file)r�	read_filerr�log�error)rZsystemd_system_confrrr�_read_systemd_system_confNs

z'SystemdPlugin._read_systemd_system_confcCsptjtj}|jj||�s8tjd�|jj|dd�dS|jj|tj�sltjdtj�|jj|dd�dSdS)Nz(error writing systemd configuration fileT)�no_errorFz/error replacing systemd configuration file '%s')	rrZTMP_FILE_SUFFIXr�
write_to_filer3r4�unlink�rename)rr"Ztmpfilerrr�_write_systemd_system_confUs
z(SystemdPlugin._write_systemd_system_confcCstjjtj|j�S)N)rr	�joinrZPERSISTENT_STORAGE_DIR�name)rrrr�_get_storage_filenamecsz#SystemdPlugin._get_storage_filenamecCsl|j�}|dk	rh|j�}|jj|ddd�}|jj|�|dkrN|j|tj�}n|j|tj|�}|j	|�dS)NT)r1r6)
r5r=rr2r8r0r�SYSTEMD_CPUAFFINITY_VARr-r:)rr"�fname�cpu_affinity_savedrrr�_remove_systemd_tuningfsz$SystemdPlugin._remove_systemd_tuningcCs0|tjkr,tjdtj�|j�tjd�dS)Nz6removing '%s' systemd tuning previously added by TuneDz[you may need to manualy run 'dracut -f' to update the systemd configuration in initrd image)rZ
ROLLBACK_FULLr3�infor>rAZconsole)rrZrollbackrrr�_instance_unapply_staticrs
z&SystemdPlugin._instance_unapply_staticc
Cs<|dkrdSdjdd�|jjtjddtjdd|���D��S)Nr.� css|]}t|�VqdS)N)r*)�.0�vrrr�	<genexpr>|sz8SystemdPlugin._cpulist_convert_unpack.<locals>.<genexpr>z\s+�,z,\s+)r;r�cpulist_unpackrr/)rZcpulistrrr�_cpulist_convert_unpackysz%SystemdPlugin._cpulist_convert_unpackrF)Z
per_devicecCs�d}d}|jj|jj|jj|���}djdd�|jj|�D��}|j�}	|	dk	rh|j|	t	j
�}|j|�}|r||jd|||�S|r�|j
�}
|jj|
ddd�}|dk	r�|dkr�||kr�|jj|
|dd�tjdt	j
|t	jf�|j|j|	t	j
|��dS)	NrDcss|]}t|�VqdS)N)r*)rErFrrrrG�sz)SystemdPlugin._cmdline.<locals>.<genexpr>rT)r1r6)Zmakedirz setting '%s' to '%s' in the '%s')rZunescapeZ
_variables�expandZunquoter;rIr5r%rr>rJZ
_verify_valuer=r2r7r3rBrr:r-)rZenabling�valueZverifyZignore_missingZ
conf_affinityZconf_affinity_unpackedrFZ
v_unpackedr"r?r@rrr�_cmdline~s"
zSystemdPlugin._cmdline)�__name__�
__module__�__qualname__�__doc__rrr�classmethodrr%r-r0r5r:r=rArZ
ROLLBACK_SOFTrCrJZcommand_customrM�
__classcell__rr)rrr
sr)r.rZ
decoratorsZ
tuned.logsZtunedrZtuned.utils.commandsrZtuned.constsrrrZlogs�getr3ZPluginrrrrr�<module>s

site-packages/tuned/plugins/__pycache__/plugin_script.cpython-36.opt-1.pyc000064400000010407147511334670022560 0ustar003

�<�e�@sVddljZddlmZddlZddlZddlmZm	Z	ej
j�ZGdd�dej
�ZdS)�N�)�base)�Popen�PIPEcsbeZdZdZedd��Zdd�Zdd�Zdd	�Z�fd
d�Z	�fdd
�Z
ejf�fdd�	Z
�ZS)�ScriptPluginac
	`script`::
	
	Executes an external script or binary when the profile is loaded or
	unloaded. You can choose an arbitrary executable.
	+
	IMPORTANT: The `script` plug-in is provided mainly for compatibility
	with earlier releases. Prefer other *TuneD* plug-ins if they cover
	the required functionality.
	+
	*TuneD* calls the executable with one of the following arguments:
	+
	--
	** `start` when loading the profile
	** `stop` when unloading the profile
	--
	+
	You need to correctly implement the `stop` action in your executable
	and revert all settings that you changed during the `start`
	action. Otherwise, the roll-back step after changing your *TuneD*
	profile will not work.
	+
	Bash scripts can import the [filename]`/usr/lib/tuned/functions`
	Bash library and use the functions defined there. Use these
	functions only for functionality that is not natively provided
	by *TuneD*. If a function name starts with an underscore, such as
	`_wifi_set_power_level`, consider the function private and do not
	use it in your scripts, because it might change in the future.
	+
	Specify the path to the executable using the `script` parameter in
	the plug-in configuration.
	+
	.Running a Bash script from a profile
	====
	To run a Bash script named `script.sh` that is located in the profile
	directory, use:
	
	----
	[script]
	script=${i:PROFILE_DIR}/script.sh
	----
	====
	cCsddiS)N�script�)�selfrr�#/usr/lib/python3.6/plugin_script.py�_get_config_options6sz ScriptPlugin._get_config_optionscCs2d|_d|_|jddk	r(|jd|_ng|_dS)NTFr)Z_has_static_tuningZ_has_dynamic_tuningZoptions�_scripts)r	�instancerrr
�_instance_init<s
zScriptPlugin._instance_initcCsdS)Nr)r	r
rrr
�_instance_cleanupFszScriptPlugin._instance_cleanupc
Csd}�x|D�]�}tj}|j|jj��tjd|t|�f�tjdtt	|j
����ytt|g|ttd|dtj
j|�d�}|j�\}}t|�r�tjd||dd
�f�|jr�tjd||jf�d}Wqttfk
�r
}	ztjd	||	f�d}WYdd}	~	XqXqW|S)NTz'calling script '%s' with arguments '%s'zusing environment '%s')�stdout�stderrZ	close_fds�envZuniversal_newlines�cwdzscript '%s' error output: '%s'rz#script '%s' returned error code: %dFzscript '%s' error: %s���)�os�environ�updateZ
_variablesZget_env�log�info�str�debug�list�itemsrr�path�dirnameZcommunicate�len�error�
returncode�OSError�IOError)
r	�scriptsZ	arguments�retrr�proc�out�err�errr
�
_call_scriptsIs,
zScriptPlugin._call_scriptscs$tt|�j|�|j|jdg�dS)N�start)�superr�_instance_apply_staticr+r)r	r
)�	__class__rr
r.asz#ScriptPlugin._instance_apply_staticcstd}tt|�j|||�dkr d}dg}|r4|dg7}|j|j|�dkrZtjtj|j�ntj	tj
|j�d}|S)NTFZverify�ignore_missing)r-r�_instance_verify_staticr+rrr�constsZSTR_VERIFY_PROFILE_OKr!ZSTR_VERIFY_PROFILE_FAIL)r	r
r0Zdevicesr&�args)r/rr
r1es
z$ScriptPlugin._instance_verify_staticcsBdg}|tjkr|dg}|jt|j�|�tt|�j||�dS)N�stopZ
full_rollback)r2Z
ROLLBACK_FULLr+�reversedrr-r�_instance_unapply_static)r	r
Zrollbackr3)r/rr
r6ts


z%ScriptPlugin._instance_unapply_static)�__name__�
__module__�__qualname__�__doc__�classmethodrrrr+r.r1r2Z
ROLLBACK_SOFTr6�
__classcell__rr)r/r
r	s+
r)Ztuned.constsr2�rZ
tuned.logsZtunedr�
subprocessrrZlogs�getrZPluginrrrrr
�<module>s

site-packages/tuned/plugins/__pycache__/repository.cpython-36.pyc000064400000003657147511334670021167 0ustar003

�<�e��@s@ddlmZddlZddlZejj�ZdgZGdd�de�Z	dS)�)�PluginLoaderN�
Repositorycs@eZdZ�fdd�Zedd��Zdd�Zdd�Zd	d
�Z�Z	S)rc		sJtt|�j�t�|_||_||_||_||_||_	||_
||_||_dS)N)
�superr�__init__�set�_plugins�_monitor_repository�_storage_factory�_hardware_inventory�_device_matcher�_device_matcher_udev�_plugin_instance_factory�_global_cfg�
_variables)	�selfZmonitor_repositoryZstorage_factoryZhardware_inventoryZdevice_matcherZdevice_matcher_udevZplugin_instance_factoryZ
global_cfg�	variables)�	__class__�� /usr/lib/python3.6/repository.pyrszRepository.__init__cCs|jS)N)r)rrrr�pluginsszRepository.pluginscCsd|_d|_tjjj|_dS)Nz
tuned.pluginsZplugin_)Z
_namespace�_prefix�tunedr�baseZPlugin�
_interface)rrrr�_set_loader_parameterssz!Repository._set_loader_parametersc	CsNtjd|�|j|�}||j|j|j|j|j|j|j	|j
�}|jj|�|S)Nzcreating plugin %s)
�log�debugZload_pluginrr	r
rrr
rrr�add)rZplugin_nameZ
plugin_clsZplugin_instancerrr�create s
zRepository.createcCs6t||j�st�tjd|�|j�|jj|�dS)Nzremoving plugin %s)�
isinstancer�AssertionErrorrrZcleanupr�remove)rZpluginrrr�delete(szRepository.delete)
�__name__�
__module__�__qualname__r�propertyrrrr"�
__classcell__rr)rrr	s
)
Ztuned.utils.plugin_loaderrZtuned.plugins.baserZ
tuned.logsZlogs�getr�__all__rrrrr�<module>s

site-packages/tuned/plugins/__pycache__/plugin_scheduler.cpython-36.opt-1.pyc000064400000152706147511334670023243 0ustar003

�<�e���@s
ddlmZddlTddlZddlZddlTddlZddlZddl	Z	ddl
jZddlZddl
mZddlZddlZddlZddlZy
ejWnek
r�ddlZYnXejj�ZGdd�de�ZGdd	�d	e�ZGd
d�de�ZGdd
�d
e�ZGdd�dej�ZdS)�)�base)�*�N)�commandsc@s0eZdZddd�Zedd��Zejdd��ZdS)�SchedulerParamsNcCs(||_||_||_||_||_||_dS)N)�_cmd�cmdline�	scheduler�priority�affinity�cgroup)�self�cmdrr	r
rr�r�&/usr/lib/python3.6/plugin_scheduler.py�__init__szSchedulerParams.__init__cCs |jdkrdS|jj|j�SdS)N)�	_affinityrZbitmask2cpulist)r
rrrr&s
zSchedulerParams.affinitycCs"|dkrd|_n|jj|�|_dS)N)rrZcpulist2bitmask)r
�valuerrrr-s)NNNNN)�__name__�
__module__�__qualname__r�propertyr�setterrrrrrs
rc@seZdZdd�ZdS)�
IRQAffinitiescCsi|_d|_g|_dS)N)�irqs�default�unchangeable)r
rrrr5szIRQAffinities.__init__N)rrrrrrrrr4src@speZdZdZdddddd�Zdd	�Zd
d�Zdd
�Zdd�Zdd�Z	dd�Z
dd�Zdd�Zdd�Z
dd�ZdS)�SchedulerUtilsz=
	Class encapsulating scheduler implementation in os module
	�
SCHED_FIFO�SCHED_BATCH�SCHED_RR�SCHED_OTHER�
SCHED_IDLE)�f�b�r�o�icCs8tdd�|jj�D��|_tdd�|jj�D��|_dS)Ncss |]\}}|tt|�fVqdS)N)�getattr�os)�.0�k�namerrr�	<genexpr>Jsz*SchedulerUtils.__init__.<locals>.<genexpr>css|]}tt|�|fVqdS)N)r(r))r*r,rrrr-Ls)�dict�_dict_schedcfg2schedconst�items�_dict_schedcfg2num�values�_dict_num2schedconst)r
rrrrHszSchedulerUtils.__init__cCs|jj|�S)N)r1�get)r
�
str_schedulerrrr�sched_cfg_to_numNszSchedulerUtils.sched_cfg_to_numcCs|jj|�S)N)r3r4)r
r	rrr�sched_num_to_constRsz!SchedulerUtils.sched_num_to_constcCs
tj|�S)N)r)�sched_getscheduler)r
�pidrrr�
get_schedulerUszSchedulerUtils.get_schedulercCstj||tj|��dS)N)r)�sched_setscheduler�sched_param)r
r9�sched�priorrr�
set_schedulerXszSchedulerUtils.set_schedulercCs
tj|�S)N)r)�sched_getaffinity)r
r9rrr�get_affinity[szSchedulerUtils.get_affinitycCstj||�dS)N)r)�sched_setaffinity)r
r9rrrr�set_affinity^szSchedulerUtils.set_affinitycCstj|�jS)N)r)�sched_getparam�sched_priority)r
r9rrr�get_priorityaszSchedulerUtils.get_prioritycCs
tj|�S)N)r)�sched_get_priority_min)r
r=rrr�get_priority_mindszSchedulerUtils.get_priority_mincCs
tj|�S)N)r)�sched_get_priority_max)r
r=rrr�get_priority_maxgszSchedulerUtils.get_priority_maxN)rrr�__doc__r/rr6r7r:r?rArCrFrHrJrrrrr;s rc@sPeZdZdZdd�Zdd�Zdd�Zdd	�Zd
d�Zdd
�Z	dd�Z
dd�ZdS)�SchedulerUtilsSchedutilszE
	Class encapsulating scheduler implementation in schedutils module
	cCs8tdd�|jj�D��|_tdd�|jj�D��|_dS)Ncss |]\}}|tt|�fVqdS)N)r(�
schedutils)r*r+r,rrrr-psz4SchedulerUtilsSchedutils.__init__.<locals>.<genexpr>css|]}tt|�|fVqdS)N)r(rM)r*r,rrrr-rs)r.r/r0r1r2r3)r
rrrrnsz!SchedulerUtilsSchedutils.__init__cCs
tj|�S)N)rMr:)r
r9rrrr:tsz&SchedulerUtilsSchedutils.get_schedulercCstj|||�dS)N)rMr?)r
r9r=r>rrrr?wsz&SchedulerUtilsSchedutils.set_schedulercCs
tj|�S)N)rMrA)r
r9rrrrAzsz%SchedulerUtilsSchedutils.get_affinitycCstj||�dS)N)rMrC)r
r9rrrrrC}sz%SchedulerUtilsSchedutils.set_affinitycCs
tj|�S)N)rMrF)r
r9rrrrF�sz%SchedulerUtilsSchedutils.get_prioritycCs
tj|�S)N)rMrH)r
r=rrrrH�sz)SchedulerUtilsSchedutils.get_priority_mincCs
tj|�S)N)rMrJ)r
r=rrrrJ�sz)SchedulerUtilsSchedutils.get_priority_maxN)rrrrKrr:r?rArCrFrHrJrrrrrLjsrLcs�eZdZdZ�fdd�Zdd�Zdd�Zdd	�Zed
d��Z	dd
�Z
dd�Zdd�Zdd�Z
dd�Zdd�Zdd�Zdd�Zdd�Zdd�Zd�d!d"�Zd#d$�Zd%d&�Zd'd(�Zd�d)d*�Zd+d,�Zd-d.�Zd/d0�Zd1d2�Zd3d4�Zd5d6�Zd7d8�Zd9d:�Z d;d<�Z!d=d>�Z"d�d?d@�Z#dAdB�Z$dCdD�Z%�fdEdF�Z&dGdH�Z'dIdJ�Z(dKdL�Z)e*j+f�fdMdN�	Z,dOdP�Z-dQdR�Z.�fdSdT�Z/dUdV�Z0dWdX�Z1dYdZ�Z2e3d[d d\�d]d^��Z4e3d_d d\�d`da��Z5e3dbd d\�dcdd��Z6e3ded d\�dfdg��Z7e3dhd d\�didj��Z8e3dkd d\�dldm��Z9dndo�Z:dpdq�Z;drds�Z<d�dtdu�Z=dvdw�Z>dxdy�Z?dzd{�Z@d|d}�ZAd~d�ZBd�d��ZCd�d��ZDd�d��ZEd�d��ZFe3d�d d�d��d�d���ZGd�d��ZHd�d��ZId�d�d��ZJeKd��d�d���ZLeMd��d�d���ZNeKd��d�d���ZOeMd��d�d���ZPeKd��d�d���ZQeMd��d�d���ZReKd��d�d���ZSeMd��d�d���ZTeKd��d�d���ZUeMd��d�d���ZVeKd��d�d���ZWeMd��d�d���ZXeKd��d�d���ZYeMd��d�d���ZZeKd��d�d���Z[eMd��d�d���Z\eKd��d�d���Z]eMd��d�d���Z^eKd��d�d„�Z_eMd��d�dĄ�Z`�ZaS)��SchedulerPlugina]-
	`scheduler`::
	
	Allows tuning of scheduling priorities, process/thread/IRQ
	affinities, and CPU isolation.
	+
	To prevent processes/threads/IRQs from using certain CPUs, use
	the [option]`isolated_cores` option. It changes process/thread
	affinities, IRQs affinities and it sets `default_smp_affinity`
	for IRQs. The CPU affinity mask is adjusted for all processes and
	threads matching [option]`ps_whitelist` option subject to success
	of the `sched_setaffinity()` system call. The default setting of
	the [option]`ps_whitelist` regular expression is `.*` to match all
	processes and thread names. To exclude certain processes and threads
	use [option]`ps_blacklist` option. The value of this option is also
	interpreted as a regular expression and process/thread names (`ps -eo
	cmd`) are matched against that expression. Profile rollback allows
	all matching processes and threads to run on all CPUs and restores
	the IRQ settings prior to the profile application.
	+
	Multiple regular expressions for [option]`ps_whitelist`
	and [option]`ps_blacklist` options are allowed and separated by
	`;`. Quoted semicolon `\;` is taken literally.
	+
	.Isolate CPUs 2-4
	====
	----
	[scheduler]
	isolated_cores=2-4
	ps_blacklist=.*pmd.*;.*PMD.*;^DPDK;.*qemu-kvm.*
	----
	Isolate CPUs 2-4 while ignoring processes and threads matching
	`ps_blacklist` regular expressions.
	====
	The [option]`irq_process` option controls whether the scheduler plugin
	applies the `isolated_cores` parameter to IRQ affinities. The default
	value is `true`, which means that the scheduler plugin will move all
	possible IRQs away from the isolated cores. When `irq_process` is set
	to `false`, the plugin will not change any IRQ affinities.
	====
	The [option]`default_irq_smp_affinity` option controls the values
	*TuneD* writes to `/proc/irq/default_smp_affinity`. The file specifies
	default affinity mask that applies to all non-active IRQs. Once an
	IRQ is allocated/activated its affinity bitmask will be set to the
	default mask.
	+
	The following values are supported:
	+
	--
	`calc`::
	Content of `/proc/irq/default_smp_affinity` will be calculated
	from the `isolated_cores` parameter. Non-isolated cores
	are calculated as an inversion of the `isolated_cores`. Then
	the intersection of the non-isolated cores and the previous
	content of `/proc/irq/default_smp_affinity` is written to
	`/proc/irq/default_smp_affinity`. If the intersection is
	an empty set, then just the non-isolated cores are written to
	`/proc/irq/default_smp_affinity`. This behavior is the default if
	the parameter `default_irq_smp_affinity` is omitted.
	`ignore`::
	*TuneD* will not touch `/proc/irq/default_smp_affinity`.
	explicit cpulist::
	The cpulist (such as 1,3-4) is unpacked and written directly to
	`/proc/irq/default_smp_affinity`.
	--
	+
	.An explicit CPU list to set the default IRQ smp affinity to CPUs 0 and 2
	====
	----
	[scheduler]
	isolated_cores=1,3
	default_irq_smp_affinity=0,2
	----
	====
	To adjust scheduling policy, priority and affinity for a group of
	processes/threads, use the following syntax.
	+
	[subs="+quotes,+macros"]
	----
	group.__groupname__=__rule_prio__:__sched__:__prio__:__affinity__:__regex__
	----
	+
	where `__rule_prio__` defines internal *TuneD* priority of the
	rule. Rules are sorted based on priority. This is needed for
	inheritance to be able to reorder previously defined rules. Equal
	`__rule_prio__` rules should be processed in the order they were
	defined. However, this is Python interpreter dependent. To disable
	an inherited rule for `__groupname__` use:
	+
	[subs="+quotes,+macros"]
	----
	group.__groupname__=
	----
	+
	`__sched__` must be one of:
	*`f`* for FIFO,
	*`b`* for batch,
	*`r`* for round robin,
	*`o`* for other,
	*`*`* do not change.
	+
	`__affinity__` is CPU affinity in hexadecimal. Use `*` for no change.
	+
	`__prio__` scheduling priority (see `chrt -m`).
	+
	`__regex__` is a Python regular expression. It is matched against the output of
	+
	[subs="+quotes,+macros"]
	----
	ps -eo cmd
	----
	+
	Any given process name may match more than one group. In such a case,
	the priority and scheduling policy are taken from the last matching
	`__regex__`.
	+
	.Setting scheduling policy and priorities to kernel threads and watchdog
	====
	----
	[scheduler]
	group.kthreads=0:*:1:*:\[.*\]$
	group.watchdog=0:f:99:*:\[watchdog.*\]
	----
	====
	+
	The scheduler plug-in uses perf event loop to catch newly created
	processes. By default it listens to `perf.RECORD_COMM` and
	`perf.RECORD_EXIT` events. By setting [option]`perf_process_fork`
	option to `true`, `perf.RECORD_FORK` events will be also listened
	to. In other words, child processes created by the `fork()` system
	call will be processed. Since child processes inherit CPU affinity
	from their parents, the scheduler plug-in usually does not need to
	explicitly process these events. As processing perf events can
	pose a significant CPU overhead, the [option]`perf_process_fork`
	option parameter is set to `false` by default. Due to this, child
	processes are not processed by the scheduler plug-in.
	+
	The CPU overhead of the scheduler plugin can be mitigated by using
	the scheduler [option]`runtime` option and setting it to `0`. This
	will completely disable the dynamic scheduler functionality and the
	perf events will not be monitored and acted upon. The disadvantage
	of this approach is that the process/thread tuning will be done only at
	profile application.
	+
	.Disabling the scheduler dynamic functionality
	====
	----
	[scheduler]
	runtime=0
	isolated_cores=1,3
	----
	====
	+
	NOTE: For perf events, a memory-mapped buffer is used. Under heavy load
	the buffer may overflow. In such cases the `scheduler` plug-in
	may start missing events and failing to process some newly created
	processes. Increasing the buffer size may help. The buffer size can
	be set with the [option]`perf_mmap_pages` option. The value of this
	parameter has to be expressed as a power of 2. If it is not a power
	of 2, the nearest higher power of 2 is calculated from it
	and this calculated value is used. If the [option]`perf_mmap_pages`
	option is omitted, the default kernel value is used.
	+
	The scheduler plug-in supports process/thread confinement using
	cgroups v1.
	+
	[option]`cgroup_mount_point` option specifies the path to mount the
	cgroup filesystem or where *TuneD* expects it to be mounted. If unset,
	`/sys/fs/cgroup/cpuset` is expected.
	+
	If [option]`cgroup_groups_init` option is set to `1` *TuneD*
	will create (and remove) all cgroups defined with the `cgroup*`
	options. This is the default behavior. If it is set to `0` the
	cgroups need to be preset by other means.
	+
	If [option]`cgroup_mount_point_init` option is set to `1`,
	*TuneD* will create (and remove) the cgroup mountpoint. It implies
	`cgroup_groups_init = 1`. If set to `0` the cgroups mount point
	needs to be preset by other means. This is the default behavior.
	+
	The [option]`cgroup_for_isolated_cores` option is the cgroup
	name used for the [option]`isolated_cores` option functionality. For
	example, if a system has 4 CPUs, `isolated_cores=1` means that all
	processes/threads will be moved to CPUs 0,2-3.
	The scheduler plug-in will isolate the specified core by writing
	the calculated CPU affinity to the `cpuset.cpus` control file of
	the specified cgroup and move all the matching processes/threads to
	this group. If this option is unset, classic cpuset affinity using
	`sched_setaffinity()` will be used.
	+
	[option]`cgroup.__cgroup_name__` option defines affinities for
	arbitrary cgroups. Even hierarchic cgroups can be used, but the
	hierarchy needs to be specified in the correct order. Also *TuneD*
	does not do any sanity checks here, with the exception that it forces
	the cgroup to be under [option]`cgroup_mount_point`.
	+
	The syntax of the scheduler option starting with `group.` has been
	augmented to use `cgroup.__cgroup_name__` instead of the hexadecimal
	`__affinity__`. The matching processes will be moved to the cgroup
	`__cgroup_name__`. It is also possible to use cgroups which have
	not been defined by the [option]`cgroup.` option as described above,
	i.e. cgroups not managed by *TuneD*.
	+
	All cgroup names are sanitized by replacing all dots (`.`) with
	slashes (`/`). This is to prevent the plug-in from writing outside
	[option]`cgroup_mount_point`.
	+
	.Using cgroups v1 with the scheduler plug-in
	====
	----
	[scheduler]
	cgroup_mount_point=/sys/fs/cgroup/cpuset
	cgroup_mount_point_init=1
	cgroup_groups_init=1
	cgroup_for_isolated_cores=group
	cgroup.group1=2
	cgroup.group2=0,2
	
	group.ksoftirqd=0:f:2:cgroup.group1:ksoftirqd.*
	ps_blacklist=ksoftirqd.*;rcuc.*;rcub.*;ktimersoftd.*
	isolated_cores=1
	----
	Cgroup `group1` has the affinity set to CPU 2 and the cgroup `group2`
	to CPUs 0,2. Given a 4 CPU setup, the [option]`isolated_cores=1`
	option causes all processes/threads to be moved to CPU
	cores 0,2-3. Processes/threads that are blacklisted by the
	[option]`ps_blacklist` regular expression will not be moved.
	
	The scheduler plug-in will isolate the specified core by writing the
	CPU affinity 0,2-3 to the `cpuset.cpus` control file of the `group`
	and move all the matching processes/threads to this cgroup.
	====
	Option [option]`cgroup_ps_blacklist` allows excluding processes
	which belong to the blacklisted cgroups. The regular expression specified
	by this option is matched against cgroup hierarchies from
	`/proc/PID/cgroups`. Cgroups v1 hierarchies from `/proc/PID/cgroups`
	are separated by commas ',' prior to regular expression matching. The
	following is an example of content against which the regular expression
	is matched against: `10:hugetlb:/,9:perf_event:/,8:blkio:/`
	+
	Multiple regular expressions can be separated by semicolon ';'. The
	semicolon represents a logical 'or' operator.
	+
	.Cgroup-based exclusion of processes from the scheduler
	====
	----
	[scheduler]
	isolated_cores=1
	cgroup_ps_blacklist=:/daemons\b
	----
	
	The scheduler plug-in will move all processes away from core 1 except processes which
	belong to cgroup '/daemons'. The '\b' is a regular expression
	metacharacter that matches a word boundary.
	
	----
	[scheduler]
	isolated_cores=1
	cgroup_ps_blacklist=\b8:blkio:
	----
	
	The scheduler plug-in will exclude all processes which belong to a cgroup
	with hierarchy-ID 8 and controller-list blkio.
	====
	Recent kernels moved some `sched_` and `numa_balancing_` kernel run-time
	parameters from `/proc/sys/kernel`, managed by the `sysctl` utility, to
	`debugfs`, typically mounted under `/sys/kernel/debug`.  TuneD provides an
	abstraction mechanism for the following parameters via the scheduler plug-in:
	[option]`sched_min_granularity_ns`, [option]`sched_latency_ns`,
	[option]`sched_wakeup_granularity_ns`, [option]`sched_tunable_scaling`,
	[option]`sched_migration_cost_ns`, [option]`sched_nr_migrate`,
	[option]`numa_balancing_scan_delay_ms`,
	[option]`numa_balancing_scan_period_min_ms`,
	[option]`numa_balancing_scan_period_max_ms` and
	[option]`numa_balancing_scan_size_mb`.
	Based on the kernel used, TuneD will write the specified value to the correct
	location.
	+
	.Set tasks' "cache hot" value for migration decisions.
	====
	----
	[scheduler]
	sched_migration_cost_ns=500000
	----
	On the old kernels, this is equivalent to:
	----
	[sysctl]
	kernel.sched_migration_cost_ns=500000
	----
	that is, value `500000` will be written to `/proc/sys/kernel/sched_migration_cost_ns`.
	However, on more recent kernels, the value `500000` will be written to
	`/sys/kernel/debug/sched/migration_cost_ns`.
	====
	c		s�tt|�j||||||||�d|_tj|_ttj�|_	|dk	rh|j
tjtj�|_t|jtj
tj��|_	t�|_d|_i|_d|_d|_d|_tj�|_|jdd�|_d|_|jdd�|_d|_yt�|_Wntk
r�t �|_YnXdS)NTz.*�r	)Zcommand_name�irq)!�superrNrZ_has_dynamic_options�constsZCFG_DEF_DAEMON�_daemon�intZCFG_DEF_SLEEP_INTERVAL�_sleep_interval�get_boolZ
CFG_DAEMONr4ZCFG_SLEEP_INTERVALrr�_secure_boot_hint�_sched_knob_paths_cache�
_ps_whitelist�
_ps_blacklist�_cgroup_ps_blacklist_re�perfZcpu_map�_cpusZ_storage_key�_scheduler_storage_key�_irq_process�_irq_storage_key�_evlistr�_scheduler_utils�AttributeErrorrL)	r
Zmonitor_repositoryZstorage_factoryZhardware_inventoryZdevice_matcherZdevice_matcher_udevZplugin_instance_factoryZ
global_cfg�	variables)�	__class__rrr�s0


zSchedulerPlugin.__init__cCsT|dkrdSyt|�}Wntk
r,dSX|dkr:dStdtjtj|d���S)Nr�)rT�
ValueError�mathZceil�log)r
Z
mmap_pagesZmprrr�_calc_mmap_pages�sz SchedulerPlugin._calc_mmap_pagescsd|_d|_d|_d|_�jj�ji��_t�j�dkr^t	j
d��j�i�_�jj�j�t
��_d�_d�_d�_tj�fdd�|jj�D���_|j|_�jj|jd�}�j|�}|dkr�t	jd|�d}|dk	r�t|�|kr�t	j
d	||f�x(|jD]}�jj|j|�|j|<�qW�jj|jjd
d��dk�rHd|_tj �|_!�j"�r|j�ry�t#j$�|_%t#j&t#j't#j(ddddddt#j)t#j*Bd
�	}|j+�j,|j%d�t#j-�j,|j%�|_|jj.|�|dk�r�|jj/�n|jj/|d�Wnd|_YnXdS)NFTrz0recovering scheduling settings from previous runcsJg|]B\}}|dd�dkrt|�dkr�j|dd���jj|�f�qS)N�zcgroup.)�len�_sanitize_cgroup_path�
_variables�expand)r*�optionr)r
rr�
<listcomp>�sz2SchedulerPlugin._instance_init.<locals>.<listcomp>�perf_mmap_pageszKInvalid 'perf_mmap_pages' value specified: '%s', using default kernel valuezL'perf_mmap_pages' value has to be power of two, specified: '%s', using: '%d'Zruntimer�0)	�type�configZtask�comm�mmapZfreqZ
wakeup_eventsZ	watermarkZsample_type)Zcpus�threads)Zpages)0raZ_has_dynamic_tuningZ_has_static_tuning�_runtime_tuning�_storager4r^�_scheduler_originalrlri�info�_restore_ps_affinity�unsetr.�_cgroups_original_affinityr�_cgroup_affinity_initialized�_cgroup�collections�OrderedDict�optionsr0�_cgroups�
_schedulerrnrorj�error�strrrV�	threadingZEvent�
_terminaterSr\Z
thread_mapZ_threads�evselZ
TYPE_SOFTWAREZCOUNT_SW_DUMMYZ
SAMPLE_TIDZ
SAMPLE_CPU�openr]Zevlist�addrw)r
�instanceZperf_mmap_pages_rawrrr+r�r)r
r�_instance_init�s^




zSchedulerPlugin._instance_initcCs*|jr&x|jj�D]}tj|j�qWdS)N)ra�
get_pollfdr)�closer,)r
r��fdrrr�_instance_cleanupsz!SchedulerPlugin._instance_cleanupcCs4dtjddddddddddddddddddddd�S)NFT�calcZfalse)�isolated_cores�cgroup_mount_point�cgroup_mount_point_init�cgroup_groups_init�cgroup_for_isolated_cores�cgroup_ps_blacklist�ps_whitelist�ps_blacklist�irq_process�default_irq_smp_affinityrr�perf_process_fork�sched_min_granularity_ns�sched_latency_ns�sched_wakeup_granularity_ns�sched_tunable_scaling�sched_migration_cost_ns�sched_nr_migrate�numa_balancing_scan_delay_ms�!numa_balancing_scan_period_min_ms�!numa_balancing_scan_period_max_ms�numa_balancing_scan_size_mb)rRZDEF_CGROUP_MOUNT_POINT)�clsrrr�_get_config_optionss,z#SchedulerPlugin._get_config_optionscCs|dk	rt|�jdd�SdS)N�.�/)r��replace)r
rrrrrm9sz%SchedulerPlugin._sanitize_cgroup_pathcCs>t|tj�s|}tj|�}tj|�}|j|�r:d|d}|S)N�[�])�
isinstance�procfs�processZprocess_cmdline�_is_kthread)r
r�r9rrrr�_get_cmdline=s


zSchedulerPlugin._get_cmdlinecCs�tj�}|j�i}x�|j�D]�}yN|j|�}|d}|||<d|krnx&|dj�D]}|j|�}|||<qTWWqttfk
r�}z$|jtj	ks�|jtj
kr�wn�WYdd}~XqXqW|S)Nr9rx)r��pidstats�reload_threadsr2r��keys�OSError�IOError�errno�ENOENT�ESRCH)r
�ps�	processes�procrr9�errr�
get_processesGs$

zSchedulerPlugin.get_processescCs@|jj|�}|jj|�}|jj|�}tjd|||f�||fS)Nz8Read scheduler policy '%s' and priority '%d' of PID '%d')rbr:r7rFri�debug)r
r9r	�	sched_strr
rrr�_get_rt`szSchedulerPlugin._get_rtcCs|jj|�}tjd|||f�yB|jj|�}|jj|�}||ksJ||kr`tjd||||f�Wn4ttfk
r�}ztjd|�WYdd}~XnXy|jj	|||�Wn`ttfk
�r}z>t
|d�r�|jtjkr�tjd|�ntjd||f�WYdd}~XnXdS)NzBSetting scheduler policy to '%s' and priority to '%d' of PID '%d'.z9Priority for %s must be in range %d - %d. '%d' was given.z(Failed to get allowed priority range: %sr�zAFailed to set scheduling parameters of PID %d, the task vanished.z1Failed to set scheduling parameters of PID %d: %s)
rbr7rir�rHrJr��SystemErrorr�r?�hasattrr�r�)r
r9r=r>r�Zprio_minZprio_maxr�rrr�_set_rths*
zSchedulerPlugin._set_rtcCs|ddtjj@dkS)N�stat�flagsr)r�ZpidstatZ
PF_KTHREAD)r
r�rrrr��szSchedulerPlugin._is_kthreadcCsyjtj|�}|dj�rd|dddkr8tjd|�n(|j|�rRtjd|�ntjd|�dSdSWn�ttfk
r�}zF|j	t	j
ks�|j	t	jkr�tjd	|�dStjd
||f�d
SWYdd}~Xn8t
tfk
�r}ztjd
||f�dSd}~XnXdS)Nr��state�ZzYAffinity of zombie task with PID %d cannot be changed, the task's affinity mask is fixed.z[Affinity of kernel thread with PID %d cannot be changed, the task's affinity mask is fixed.zRAffinity of task with PID %d cannot be changed, the task's affinity mask is fixed.rrz6Failed to get task info for PID %d, the task vanished.z&Failed to get task info for PID %d: %srf������r�)r�r�Zis_bound_to_cpurir�r��warnr�r�r�r�r�r�rc�KeyError)r
r9r�r�rrr�_affinity_changeable�s2



z$SchedulerPlugin._affinity_changeablecCs\y|j|}Wn(tk
r6t|j�}||j|<YnX|jdkrX|jdkrX||_||_dS)N)r{r�rrr	r
)r
r9r	r
�paramsrrr�_store_orig_process_rt�s
z&SchedulerPlugin._store_orig_process_rtcCs�d}|dkr|dkr|Sy:|j|�\}}|dkr4|}|j|||�|j|||�Wntttfk
r�}zTt|d�r�|jtjkr�tj	d|�||j
kr�|j
|=d}ntjd||f�WYdd}~XnX|S)NTr�z=Failed to read scheduler policy of PID %d, the task vanished.FzcRefusing to set scheduler and priority of PID %d, reading original scheduling parameters failed: %s)r�r�r�r�r�r�r�r�rir�r{r�)r
r9r=r>�contZ
prev_schedZ	prev_prior�rrr�_tune_process_rt�s&
z SchedulerPlugin._tune_process_rtcCst|�dd�dkS)Nrkzcgroup.)r�)r
rrrr�_is_cgroup_affinity�sz#SchedulerPlugin._is_cgroup_affinityFcCsby|j|}Wn(tk
r6t|j�}||j|<YnX|jdkr^|jdkr^|rX||_n||_dS)N)r{r�rrrr)r
r9r�	is_cgroupr�rrr�_store_orig_process_affinity�s
z,SchedulerPlugin._store_orig_process_affinitycCspxj|jjdtjt|�dfdd�jd�D]@}y&|jd�ddd�}|dkrP|Sd	Stk
rfYq(Xq(Wd	S)
Nz%s/%s/%srT)�no_error�
z:cpuset:rrOr�)r�	read_filerRZPROCFS_MOUNT_POINTr��split�
IndexError)r
r9�lrrrr�_get_cgroup_affinity�s,
z$SchedulerPlugin._get_cgroup_affinitycCsB|j|�}|j}|dkr$d||f}|jjd|t|�dd�dS)Nr�z%s/%sz%s/tasksT)r�)rm�_cgroup_mount_pointr�
write_to_filer�)r
r9r�pathrrr�_set_cgroup�s

zSchedulerPlugin._set_cgroupcCs,|dd�}t|t�o"t|�dk}||fS)Nrkr)r��listrl)r
rr�rrr�_parse_cgroup_affinity�sz&SchedulerPlugin._parse_cgroup_affinityc	Cs�d}|dkr|Syd|j|�\}}|r<|j|�}|j||�n(|j|�}|rX|j|||�}|j||�|j|||�Wntttfk
r�}zTt	|d�r�|j
t
jkr�tj
d|�||jkr�|j|=d}ntjd||f�WYdd}~XnX|S)NTr�z5Failed to read affinity of PID %d, the task vanished.FzLRefusing to set CPU affinity of PID %d, reading original affinity failed: %s)r�r�r��
_get_affinity�_get_intersect_affinity�
_set_affinityr�r�r�r�r�r�rir�r{r�)	r
r9r�	intersectr�r�r�
prev_affinityr�rrr�_tune_process_affinity�s4


z&SchedulerPlugin._tune_process_affinitycCsF|j|||�}|sdS|j||�}|s2||jkr6dS||j|_dS)N)r�r�r{r)r
r9rr=r>rr�rrr�
_tune_processszSchedulerPlugin._tune_processcCsf|jj|�}|dkr.|dkr.tjd|�dSyt|�}Wn"tk
r\tjd|�dSX||fS)Nrz>Invalid scheduler: %s. Scheduler and priority will be ignored.z=Invalid priority: %s. Scheduler and priority will be ignored.)NN)NN)rbr6rir�rTrg)r
r5Zstr_priorityr	r
rrr�_convert_sched_paramssz%SchedulerPlugin._convert_sched_paramscCsD|dkrd}n2|j|�r|}n"|jj|�}|s@tjd|�d}|S)Nrz)Invalid affinity: %s. It will be ignored.)r�r�hex2cpulistrir�)r
Zstr_affinityrrrr�_convert_affinity+s
z!SchedulerPlugin._convert_affinitycCs6|\}}}}}|j||�\}}|j|�}|||||fS)N)r�r�)r
�vals�	rule_prior	r
r�regexrrr�_convert_sched_cfg8s

z"SchedulerPlugin._convert_sched_cfgcCs�d|j|f}ytj|tj�Wn4tk
rT}ztjd||f�WYdd}~XnX|jj	d|df|jj
d|jdfdd�dd�s�tjd|�dS)Nz%s/%sz Unable to create cgroup '%s': %szcpuset.memsT)r�z3Unable to initialize 'cpuset.mems ' for cgroup '%s')r�r)�mkdirrR�DEF_CGROUP_MODEr�rir�rr�r�)r
rr�r�rrr�_cgroup_create_group?s$z$SchedulerPlugin._cgroup_create_groupcCs@|jdk	r"|j|jkr"|j|j�x|jD]}|j|�q*WdS)N)r�r�r�)r
�cgrrr�_cgroup_initialize_groupsJsz)SchedulerPlugin._cgroup_initialize_groupscCs�tjd�ytj|jtj�Wn0tk
rN}ztjd|�WYdd}~XnX|j	j
dddddd|jg�\}}|dkr�tjd	|j�dS)
NzInitializing cgroups settingsz'Unable to create cgroup mount point: %sZmountz-trz-oZcpusetrzUnable to mount '%s')rir�r)�makedirsr�rRr�r�r�r�execute)r
r��ret�outrrr�_cgroup_initializePs
  z"SchedulerPlugin._cgroup_initializecCsHytj|�Wn4tk
rB}ztjd||f�WYdd}~XnXdS)Nz#Unable to remove directory '%s': %s)r)�rmdirr�rir�)r
rr�rrr�_remove_dirZszSchedulerPlugin._remove_dircCsXx&t|j�D]}|jd|j|f�qW|jdk	rT|j|jkrT|jd|j|jf�dS)Nz%s/%s)�reversedr�r�r�r�)r
r�rrr�_cgroup_finalize_groups`sz'SchedulerPlugin._cgroup_finalize_groupscCsltjd�|jjd|jg�\}}|dkr<tjd|j�dS|j|j�tjj	|j�}|dkrh|j|�dS)NzRemoving cgroups settingsZumountrzUnable to umount '%s'Fr�)
rir�rr�r�r�r�r)r��dirname)r
r�r��drrr�_cgroup_finalizefs
z SchedulerPlugin._cgroup_finalizecCs�|dkrtjd||f�ntjd|�dSd|j|df}|r~|jj|ddd�j�}|dkrl||j|<ntjd	|�dS|jj||dd
�s�tjd||f�dS)NrOz$Setting cgroup '%s' affinity to '%s'z.Skipping cgroup '%s', empty affinity requestedz%s/%s/%szcpuset.cpus�ERRT)�err_retr�zIRefusing to set affinity of cgroup '%s', reading original affinity failed)r�z+Unable to set affinity '%s' for cgroup '%s')	rir�r�rr��striprr�r�)r
rr�backupr�Z
orig_affinityrrr�_cgroup_set_affinity_oneqsz(SchedulerPlugin._cgroup_set_affinity_onecCs~|jr
dStjd�|jdk	rH|jdk	rH|j|jkrH|j|j|jdd�x*|jj�D]}|j|d|ddd�qTWd|_dS)NzSetting cgroups affinitiesT)rrr)r�rir�rr�r�r	r0)r
r�rrr�_cgroup_set_affinity�s
 z$SchedulerPlugin._cgroup_set_affinitycCs6tjd�x&|jj�D]}|j|d|d�qWdS)NzRestoring cgroups affinitiesrr)rir�rr0r	)r
r�rrr�_cgroup_restore_affinity�s
z(SchedulerPlugin._cgroup_restore_affinityc#sn�jj|jd��_�jj�jj|jd��dk�_�jj�jj|jd��dk�_�j�jj|jd���_	�jr|�j
��js��jr��j�tt
��j|��j�y�j�}Wn2ttfk
r�}ztjd|�dSd}~XnXdd�|jj�D�}�fd	d�|D�}t|d
d�d�}t�}i|_x�|D]�\�\}����ytj���Wn<tjk
�r�}ztjd
t����w0WYdd}~XnX�fdd�|j�D�}t�����fdd�|D��}	|j|	�tjddt�������g|j�<�q0Wx4|j�D](\}
\}������j|
|�����q�W�j j!�j"�j#��j$�rj|j%�rjt&j'�j(|gd�|_)|j)j*�dS)Nr�r��1r�r�zIerror applying tuning, cannot get information about running processes: %scSs$g|]\}}|t|�jdd�f�qS)�:�)r�r�)r*rprrrrrq�sz:SchedulerPlugin._instance_apply_static.<locals>.<listcomp>cs6g|].\}}tjd|�rt|�dkr|�j|�f�qS)zgroup\.�)�re�matchrlr�)r*rpr�)r
rrrq�scSs|ddS)Nrrr)Zoption_valsrrr�<lambda>�sz8SchedulerPlugin._instance_apply_static.<locals>.<lambda>)�keyz(error compiling regular expression: '%s'cs(g|] \}}tj�|�dk	r||f�qS)N)r�search)r*r9r)r%rrrq�sc	s$g|]\}}||�����ff�qSrr)r*r9r)rrpr
r�r	rrrq�sz(?<!\\)\((?!\?)z(?:)�target�args)+rnror�r�rrV�_cgroup_mount_point_init�_cgroup_groups_initrmr�r�r�rQrN�_instance_apply_staticr
r�r�r�rir�r�r0�sortedr.�
_sched_lookupr�compiler��update�subr�rz�setr^r{rSryr�ZThread�_thread_code�_thread�start)r
r�r�r�Z	sched_cfgZbufZ	sched_allr�r�r=r9r)re)rrpr
r%r�r	r
rr�s^





z&SchedulerPlugin._instance_apply_staticcCs�y|j�}Wn2ttfk
r>}ztjd|�dSd}~XnXx�|jj�D]x\}}||ksL|||jkrlqL|jdk	r�|j	dk	r�|j
||j|j	�|jdk	r�|j||j�qL|j
dk	rL|j||j
�qLWi|_|jj|j�dS)NzKerror unapplying tuning, cannot get information about running processes: %s)r�r�r�rir�r{r0rr	r
r�rr�rr�rzr~r^)r
r�r�r9Zorig_paramsrrrr}�s&




z$SchedulerPlugin._restore_ps_affinitycCs�ttj�}d}xr|dkr�|dkr�|jjd|j|dfddd�}|d
krvx.|jd�D] }|jjd	|jdf|dd
�qRW|d8}qW|dkr�tj	d|�dS)N� rOrz%s/%s/%sZtasksT)rr�r�z%s/%s)r�rz(Unable to cleanup tasks from cgroup '%s')rOr#)
rTrRZCGROUP_CLEANUP_TASKS_RETRYrr�r�r�r�rir�)r
rZcnt�datar�rrr�_cgroup_cleanup_tasks_one�s

 z)SchedulerPlugin._cgroup_cleanup_tasks_onecCs@|jdk	r"|j|jkr"|j|j�x|jD]}|j|�q*WdS)N)r�r�r%)r
r�rrr�_cgroup_cleanup_tasks�sz%SchedulerPlugin._cgroup_cleanup_taskscsptt|�j||�|jr2|jr2|jj�|jj�|j	�|j
�|j�|jsV|j
r^|j�|j
rl|j�dS)N)rQrN�_instance_unapply_staticrSryr�rr!�joinr}rr&rrrr)r
r�Zrollback)rerrr'�s

z(SchedulerPlugin._instance_unapply_staticcCs�tjd|�d|j|df}|jj|ddd�}|dkr<dS|jj|jj|��}|jj|jj|��}d|}||kr�tjtj	||f�dStj
tj|||f�dSdS)	NzVerifying cgroup '%s' affinityz%s/%s/%szcpuset.cpusrT)rr�zcgroup '%s' affinityF)rir�r�rr��cpulist2stringZcpulist_packr|rR�STR_VERIFY_PROFILE_VALUE_OKr��STR_VERIFY_PROFILE_VALUE_FAIL)r
rrr��current_affinityZaffinity_descriptionrrr�_cgroup_verify_affinity_ones 
z+SchedulerPlugin._cgroup_verify_affinity_onecCsrtjd�d}|jdk	rB|jdk	rB|j|jkrB|o@|j|j|j�}x*|jj�D]}|oh|j|d|d�}qNW|S)NzVeryfying cgroups affinitiesTrr)rir�rr�r�r-r0)r
r�r�rrr�_cgroup_verify_affinitys
 z'SchedulerPlugin._cgroup_verify_affinitycs$tt|�j|||�}|j�}|o"|S)N)rQrN�_instance_verify_staticr.)r
r��ignore_missingZdevicesZret1Zret2)rerrr/sz'SchedulerPlugin._instance_verify_staticc
Cs�y|j|�}Wn^ttfk
rl}z>|jtjks<|jtjkrLtjd|�ntjd||f�dSd}~XnX|j	j
|j||�}|dk	r�||jkr�tjd||t
|�f�|\}}}	|j|||||	�|jj|j|j�dS)Nz3Failed to get cmdline of PID %d, the task vanished.z#Failed to get cmdline of PID %d: %sz-tuning new process '%s' with PID '%d' by '%s')r�r�r�r�r�r�rir�r�rZ	re_lookuprr{r�r�rzrr^)
r
r�r9r%rr��vr=r>rrrr�_add_pid$s$


zSchedulerPlugin._add_pidcCs6||jkr2|j|=tjd|�|jj|j|j�dS)Nz)removed PID %d from the rollback database)r{rir�rzrr^)r
r�r9rrr�_remove_pid9s


zSchedulerPlugin._remove_pidc	Cs�|jj|j�}tj�}|jj�}x|D]}|j|�q&Wx�|jj	�s�t
|j|jd��dkr:|jj	�r:d}x�|r�d}xt|jD]j}|jj
|�}|r~d}|jtjks�|jr�|jtjkr�|j|t|j�|�q~|jtjkr~|j|t|j��q~WqnWq:WdS)Ni�rTF)rZre_lookup_compiler�select�pollrar��registerr�Zis_setrlrUr]Zread_on_cpurtr\ZRECORD_COMM�_perf_process_fork_valueZRECORD_FORKr2rT�tidZRECORD_EXITr3)	r
r�r%r5Zfdsr�Zread_eventsZcpuZeventrrrr @s&

$zSchedulerPlugin._thread_coder�)�
per_devicecCs:|rdS|r6|dk	r6djdd�tjdt|��D��|_dS)N�|cSsg|]}d|�qS)z(%s)r)r*r1rrrrq_sz8SchedulerPlugin._cgroup_ps_blacklist.<locals>.<listcomp>z(?<!\\);)r(rr�r�r[)r
�enablingr�verifyr0rrr�_cgroup_ps_blacklistYsz$SchedulerPlugin._cgroup_ps_blacklistr�cCs:|rdS|r6|dk	r6djdd�tjdt|��D��|_dS)Nr:cSsg|]}d|�qS)z(%s)r)r*r1rrrrqgsz1SchedulerPlugin._ps_whitelist.<locals>.<listcomp>z(?<!\\);)r(rr�r�rY)r
r;rr<r0rrrrYaszSchedulerPlugin._ps_whitelistr�cCs:|rdS|r6|dk	r6djdd�tjdt|��D��|_dS)Nr:cSsg|]}d|�qS)z(%s)r)r*r1rrrrqosz1SchedulerPlugin._ps_blacklist.<locals>.<listcomp>z(?<!\\);)r(rr�r�rZ)r
r;rr<r0rrrrZiszSchedulerPlugin._ps_blacklistr�cCs*|rdS|r&|dk	r&|jj|�dk|_dS)Nr)rrVr_)r
r;rr<r0rrrr_qszSchedulerPlugin._irq_processr�cCs6|rdS|r2|dk	r2|dkr$||_n|jj|�|_dS)Nr��ignore)r�r>)�_default_irq_smp_affinity_valuer�cpulist_unpack)r
r;rr<r0rrr�_default_irq_smp_affinityysz)SchedulerPlugin._default_irq_smp_affinityr�cCs*|rdS|r&|dk	r&|jj|�dk|_dS)Nr)rrVr7)r
r;rr<r0rrr�_perf_process_fork�sz"SchedulerPlugin._perf_process_forkcCs"|jj|�}tjd||f�|S)NzRead affinity '%s' of PID %d)rbrArir�)r
r9�resrrrr��szSchedulerPlugin._get_affinitycCs�tjd||f�y|jj||�dSttfk
r�}zXt|d�r`|jtjkr`tjd|�n.|j	|�}|dksz|d	kr�tj
d|||f�dSd}~XnXdS)
Nz'Setting CPU affinity of PID %d to '%s'.Tr�z4Failed to set affinity of PID %d, the task vanished.rrfz,Failed to set affinity of PID %d to '%s': %sFr�)rir�rbrCr�r�r�r�r�r�r�)r
r9rr�rCrrrr��s

zSchedulerPlugin._set_affinitycCs"t|�jt|��}|rt|�S|S)N)r�intersectionr�)r
Z	affinity1Z	affinity2Z	affinity3Zaffrrrr��sz'SchedulerPlugin._get_intersect_affinityc
s>�fdd�|D�}�jdkr.�fdd�|D�}�jdkrJ�fdd�|D�}tdd�|D��}x�|D]�}y�j||�}Wnbttfk
r�}zB|jtjks�|jtjkr�t	j
d|�nt	jd||f�wbWYdd}~XnX�j||d	d
�}	|	s�qb|�j
k�r
|�j
|_|rbd||krb�j||dj�|d	�qbWdS)Ncs(g|] }tj�j�j|��dk	r|�qS)N)rrrY�_get_stat_comm)r*r1)r
rrrq�s
z9SchedulerPlugin._set_all_obj_affinity.<locals>.<listcomp>rOcs(g|] }tj�j�j|��dkr|�qS)N)rrrZrE)r*r1)r
rrrq�s
cs(g|] }tj�j�j|��dkr|�qS)N)rrr[�_get_stat_cgroup)r*r1)r
rrrq�s
cSsg|]}|j|f�qSr)r9)r*r1rrrrq�sz3Failed to get cmdline of PID %d, the task vanished.zARefusing to set affinity of PID %d, failed to get its cmdline: %sT)r�rx)rZr[r.r�r�r�r�r�r�rir�r�r�r{r�_set_all_obj_affinityr2)
r
ZobjsrrxZpslZpsdr9rr�r�r)r
rrG�s6



z%SchedulerPlugin._set_all_obj_affinityc
Cs(y|dStttfk
r"dSXdS)NZcgroupsrO)r�r�r�)r
r&rrrrF�sz SchedulerPlugin._get_stat_cgroupc
Cs,y|ddStttfk
r&dSXdS)Nr�rvrO)r�r�r�)r
r&rrrrE�szSchedulerPlugin._get_stat_commcCs`y&tj�}|j�|j|j�|d�Wn4ttfk
rZ}ztjd|�WYdd}~XnXdS)NFzIerror applying tuning, cannot get information about running processes: %s)	r�r�r�rGr2r�r�rir�)r
rr�r�rrr�_set_ps_affinity�sz SchedulerPlugin._set_ps_affinitycCs�yJ|jj|�}tjd||f�d|}t|d��}|j|�WdQRXdSttfk
r�}zLt|d�r�|j	t	j
kr�|r�tjd|�d
Stjd|||f�dSWYdd}~XnXdS)Nz&Setting SMP affinity of IRQ %s to '%s'z/proc/irq/%s/smp_affinity�wrr�z/Setting SMP affinity of IRQ %s is not supportedrfz0Failed to set SMP affinity of IRQ %s to '%s': %srr�r�)r�cpulist2hexrir�r��writer�r�r�r�ZEIOr�)r
rPrZ	restoring�affinity_hex�filenamer#r�rrr�_set_irq_affinity�s"z!SchedulerPlugin._set_irq_affinitycCs|y>|jj|�}tjd|�tdd��}|j|�WdQRXWn8ttfk
rv}ztjd||f�WYdd}~XnXdS)Nz(Setting default SMP IRQ affinity to '%s'z/proc/irq/default_smp_affinityrIz2Failed to set default SMP IRQ affinity to '%s': %s)	rrJrir�r�rKr�r�r�)r
rrLr#r�rrr�_set_default_irq_affinity�sz)SchedulerPlugin._set_default_irq_affinityc	
Cs"t�}tj�}x�|j�D]�}y"||d}tjd||f�Wntk
rTwYnX|j|||�}t|�t|�krvq|j	||d�}|dkr�||j
|<q|d	kr|jj|�qW|j
jd�}|j
j|�}|jdkr�|j|||�}n|jdkr�|j}|jdk�r|j|�||_|jj|j|�dS)
NrzRead affinity of IRQ '%s': '%s'Frrfz/proc/irq/default_smp_affinityr�r>r�)rr��
interruptsr�rir�r�r�rrNrr�appendrr�r�r?rOrrzr`)	r
r�irq_originalrrPr�rrCZprev_affinity_hexrrr�_set_all_irq_affinity
s6


z%SchedulerPlugin._set_all_irq_affinitycCsn|jj|jd�}|dkrdSx$|jj�D]\}}|j||d�q(W|jdkr\|j}|j|�|jj	|j�dS)NTr>)
rzr4r`rr0rNr?rrOr~)r
rRrPrrrr�_restore_all_irq_affinity)s

z)SchedulerPlugin._restore_all_irq_affinitycCsFt|�jt|��}|r,tjtj||f�ntjtj|||f�|S)N)r�issubsetrir|rRr*r�r+)r
�irq_description�correct_affinityr,rCrrr�_verify_irq_affinity4s
z$SchedulerPlugin._verify_irq_affinitycCs�|jj|jd�}tj�}d}x�|j�D]�}||jkrR|rRd|}tjt	j
|�q&y<||d}tjd||f�d|}	|j|	||�s�d}Wq&t
k
r�w&Yq&Xq&W|jjd�}
|jj|
�}|jdkr�|jd	||jd
kr�|n|j�r�d}|S)NTz-IRQ %s does not support changing SMP affinityrz#Read SMP affinity of IRQ '%s': '%s'zSMP affinity of IRQ %sFz/proc/irq/default_smp_affinityr>zdefault IRQ SMP affinityr�)rzr4r`r�rPr�rrir|rRZ STR_VERIFY_PROFILE_VALUE_MISSINGr�rXr�rr�r�r?)r
rWr0rRrrCrP�descriptionr,rVZcurrent_affinity_hexrrr�_verify_all_irq_affinity@s8
z(SchedulerPlugin._verify_all_irq_affinityr��
)r9r
c
Cs�d}d|_|dk	rrt|jj|��}t|j�}|j|�rRt||�}|jj|�|_n |jj|j�}tj	d||f�|sz|r�|dkr�dS|r�|j
r�|j||�SdS|r�|jr�|j
�d|j}	n|}	|j|	�|j
r�|j|�n|j
r�|j�dS)NzJInvalid isolated_cores specified, '%s' does not match available cores '%s'Tz	cgroup.%s)rrrr@r]rUr�r)rir�r_rZr�r
rHrSrT)
r
r;rr<r0r�isolatedZpresentZstr_cpusZps_affinityrrr�_isolated_cores_s6


zSchedulerPlugin._isolated_corescCs�d|||f}|jj|�}|r"|Sd||f}tjj|�sv|dkrPd||f}nd|||f}d|}|jdkrvd|_||j|<|S)Nz%s_%s_%sz/proc/sys/kernel/%s_%srOz%s/%sz%s/%s/%sz/sys/kernel/debug/%sT)rXr4r)r��existsrW)r
�prefix�	namespace�knobrr�rrr�_get_sched_knob_path�s

z$SchedulerPlugin._get_sched_knob_pathcCsJ|jj|j|||�dd�}|dkrFtjd|�|jrFtjd�d|_|S)N)rzError reading '%s'zUThis may not work with Secure Boot or kernel_lockdown (this hint is logged only once)F)rr�rbrir�rW)r
r_r`rar$rrr�_get_sched_knob�s
zSchedulerPlugin._get_sched_knobcCsN|dkrdS|sJ|jj|j|||�||r0tjgndd�sJtjd||f�|S)NF)r�z Error writing value '%s' to '%s')rr�rbr�r�rir�)r
r_r`rar�sim�removerrr�_set_sched_knob�szSchedulerPlugin._set_sched_knobr�cCs|jddd�S)NrOr=�min_granularity_ns)rc)r
rrr�_get_sched_min_granularity_ns�sz-SchedulerPlugin._get_sched_min_granularity_nscCs|jddd|||�S)NrOr=rg)rf)r
rrdrerrr�_set_sched_min_granularity_ns�sz-SchedulerPlugin._set_sched_min_granularity_nsr�cCs|jddd�S)NrOr=�
latency_ns)rc)r
rrr�_get_sched_latency_ns�sz%SchedulerPlugin._get_sched_latency_nscCs|jddd|||�S)NrOr=rj)rf)r
rrdrerrr�_set_sched_latency_ns�sz%SchedulerPlugin._set_sched_latency_nsr�cCs|jddd�S)NrOr=�wakeup_granularity_ns)rc)r
rrr� _get_sched_wakeup_granularity_ns�sz0SchedulerPlugin._get_sched_wakeup_granularity_nscCs|jddd|||�S)NrOr=rm)rf)r
rrdrerrr� _set_sched_wakeup_granularity_ns�sz0SchedulerPlugin._set_sched_wakeup_granularity_nsr�cCs|jddd�S)NrOr=�tunable_scaling)rc)r
rrr�_get_sched_tunable_scaling�sz*SchedulerPlugin._get_sched_tunable_scalingcCs|jddd|||�S)NrOr=rp)rf)r
rrdrerrr�_set_sched_tunable_scaling�sz*SchedulerPlugin._set_sched_tunable_scalingr�cCs|jddd�S)NrOr=�migration_cost_ns)rc)r
rrr�_get_sched_migration_cost_ns�sz,SchedulerPlugin._get_sched_migration_cost_nscCs|jddd|||�S)NrOr=rs)rf)r
rrdrerrr�_set_sched_migration_cost_ns�sz,SchedulerPlugin._set_sched_migration_cost_nsr�cCs|jddd�S)NrOr=�
nr_migrate)rc)r
rrr�_get_sched_nr_migrate�sz%SchedulerPlugin._get_sched_nr_migratecCs|jddd|||�S)NrOr=rv)rf)r
rrdrerrr�_set_sched_nr_migrate�sz%SchedulerPlugin._set_sched_nr_migrater�cCs|jddd�S)Nr=�numa_balancing�
scan_delay_ms)rc)r
rrr�!_get_numa_balancing_scan_delay_ms�sz1SchedulerPlugin._get_numa_balancing_scan_delay_mscCs|jddd|||�S)Nr=ryrz)rf)r
rrdrerrr�!_set_numa_balancing_scan_delay_ms�sz1SchedulerPlugin._set_numa_balancing_scan_delay_msr�cCs|jddd�S)Nr=ry�scan_period_min_ms)rc)r
rrr�&_get_numa_balancing_scan_period_min_ms�sz6SchedulerPlugin._get_numa_balancing_scan_period_min_mscCs|jddd|||�S)Nr=ryr})rf)r
rrdrerrr�&_set_numa_balancing_scan_period_min_ms�sz6SchedulerPlugin._set_numa_balancing_scan_period_min_msr�cCs|jddd�S)Nr=ry�scan_period_max_ms)rc)r
rrr�&_get_numa_balancing_scan_period_max_ms�sz6SchedulerPlugin._get_numa_balancing_scan_period_max_mscCs|jddd|||�S)Nr=ryr�)rf)r
rrdrerrr�&_set_numa_balancing_scan_period_max_ms�sz6SchedulerPlugin._set_numa_balancing_scan_period_max_msr�cCs|jddd�S)Nr=ry�scan_size_mb)rc)r
rrr� _get_numa_balancing_scan_size_mb�sz0SchedulerPlugin._get_numa_balancing_scan_size_mbcCs|jddd|||�S)Nr=ryr�)rf)r
rrdrerrr� _set_numa_balancing_scan_size_mb�sz0SchedulerPlugin._set_numa_balancing_scan_size_mb)F)F)F)F)F)brrrrKrrjr�r��classmethodr�rmr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrr	r
rrr}r%r&rRZ
ROLLBACK_SOFTr'r-r.r/r2r3r Zcommand_customr=rYrZr_rArBr�r�r�rGrFrErHrNrOrSrTrXrZr]rbrcrfZcommand_getrhZcommand_setrirkrlrnrorqrrrtrurwrxr{r|r~rr�r�r�r��
__classcell__rr)rerrN�s�(?




	



<

	
"$	
	rN) rOrZ
decoratorsZ
tuned.logsZtunedr�
subprocessr�r\r4Ztuned.constsrRr�Ztuned.utils.commandsrr�r)r�rhrrcrMZlogsr4ri�objectrrrrLZPluginrNrrrr�<module>s0


/site-packages/tuned/plugins/__pycache__/plugin_modules.cpython-36.opt-1.pyc000064400000012711147511334670022724 0ustar003

�<�e=�@sjddlZddlZddlmZddlTddlZddlTddl	m
Z
ddljZej
j�ZGdd�dej�ZdS)�N�)�base)�*)�commandscsfeZdZdZ�fdd�Zdd�Zdd�Zdd	�Zd
d�Zdd
�Z	dd�Z
ejfdd�Z
dd�Z�ZS)�
ModulesPlugina!
	`modules`::
	
	Plug-in for applying custom kernel modules options.
	+
	This plug-in can set parameters to kernel modules. It creates
	`/etc/modprobe.d/tuned.conf` file. The syntax is
	`_module_=_option1=value1 option2=value2..._` where `_module_` is
	the module name and `_optionx=valuex_` are module options which may
	or may not be present.
	+
	.Load module `netrom` with module parameter `nr_ndevs=2`
	====
	----
	[modules]
	netrom=nr_ndevs=2
	----
	====
	Modules can also be forced to load/reload by using an additional
	`+r` option prefix.
	+
	.(Re)load module `netrom` with module parameter `nr_ndevs=2`
	====
	----
	[modules]
	netrom=+r nr_ndevs=2
	----
	====
	The `+r` switch will also cause *TuneD* to try and remove `netrom`
	module (if loaded) and try and (re)insert it with the specified
	parameters. The `+r` can be followed by an optional comma (`+r,`)
	for better readability.
	+
	When using `+r` the module will be loaded immediately by the *TuneD*
	daemon itself rather than waiting for the OS to load it with the
	specified parameters.
	cs$tt|�j||�d|_t�|_dS)NT)�superr�__init__Z_has_dynamic_optionsr�_cmd)�self�args�kwargs)�	__class__��$/usr/lib/python3.6/plugin_modules.pyr3szModulesPlugin.__init__cCsd|_d|_|j|_dS)NFT)Z_has_dynamic_tuningZ_has_static_tuningZoptions�_modules)r
�instancerrr�_instance_init8szModulesPlugin._instance_initcCsdS)Nr)r
rrrr�_instance_cleanup=szModulesPlugin._instance_cleanupcCs�x�|D]�}|jjdd|g�\}}|dkr6tjd�dS|dkrTtjd||j�f�|jjd|g�\}}|dkrtjd||j�f�qWdS)NZmodprobez-rrzN'modprobe' command not found, cannot reload kernel modules, reboot is requiredz$cannot remove kernel module '%s': %sz:cannot insert/reinsert module '%s', reboot is required: %s)r	�execute�log�warn�debug�strip)r
�modules�module�retcode�outrrr�_reload_modules@s

zModulesPlugin._reload_modulescCsR|j�d}d}d}g}x�t|jj��D]�\}}|jj|�}|jj|�}	|s�|jjd|g�\}}
|dkrxd}tj	d�n|dkr�tj
d|�|s�|dkr(t|	�dkr�|	dd	�d
kr�tj
dd|	�}	|j|�t|	�dkr�|d|d
|	d7}q(tjd|�q(W|jjtj|�t|�}|dk�rN|j|�t|j�|k�rNtjtj�dS)N�rFZmodinfoTz8'modinfo' command not found, not checking kernel modulesz)kernel module '%s' not found, skipping itr�z+rz^\s*\+r\s*,?\s*zoptions � �
zKmodule '%s' doesn't have any option specified, not writing it to modprobe.d)�_clear_modprobe_file�listr�items�
_variables�expandr	rrr�error�len�re�sub�appendr�
write_to_file�consts�MODULES_FILEr�infoZSTR_HINT_REBOOT)r
r�srZ
skip_checkZreload_list�option�valuer�vr�lrrr�_instance_apply_staticLs8


z$ModulesPlugin._instance_apply_staticcCst|�jdd�S)N�/r)�str�replace)r
�pathrrr�
_unquote_pathkszModulesPlugin._unquote_pathc
Csd}d}tjd�}�xt|jj��D]�\}}|jj|�}|jj|�}	tjdd|	�}	d|}
tj	j
|
�s�d}tjt
jd|�q$tjt
jd|�|j|	�}xv|D]n}|jd	�}
t|
�d
kr�tjd||f�q�|j|
d|
d
|jj|
d|j|
d�ddd�|�dkr�d}q�Wq$W|S)NTz\s+z^\s*\+r\s*,?\s*rz/sys/module/%sFzmodule '%s' is not loadedzmodule '%s' is loaded�=rz.unrecognized module option for module '%s': %srrz/parameters/)Zerr_ret�no_error)r)�compiler#rr$r%r&r*�osr9�existsrr'r-ZSTR_VERIFY_PROFILE_FAILr/ZSTR_VERIFY_PROFILE_OK�splitr(rZ
_verify_valuer	�	read_filer:)r
rZignore_missingZdevices�ret�rr1r2rr3Zmpathr4�item�argrrr�_instance_verify_staticns,



"
z%ModulesPlugin._instance_verify_staticcCs|tjkr|j�dS)N)r-Z
ROLLBACK_FULLr")r
rZrollbackrrr�_instance_unapply_static�s
z&ModulesPlugin._instance_unapply_staticcCs�|jjtjdd�}|jd�}d}}t|�}tjd�}x.||krd|j||�dkrZ|}|}|d7}q8Wdj	|d|��}t|�dkr�|d7}|jj
tj|�dS)NT)r<r!rz^\s*#r)r	rAr-r.r@r(r)r=�search�joinr,)r
r0r4�i�jZllrCrrrr"�s


z"ModulesPlugin._clear_modprobe_file)�__name__�
__module__�__qualname__�__doc__rrrrr5r:rFr-Z
ROLLBACK_SOFTrGr"�
__classcell__rr)r
rrs%r)r)Zos.pathr>rrZ
decoratorsZ
tuned.logsZtuned�
subprocessZtuned.utils.commandsrZtuned.constsr-Zlogs�getrZPluginrrrrr�<module>s

site-packages/tuned/plugins/__pycache__/base.cpython-36.pyc000064400000052210147511334670017647 0ustar003

�<�e�W�@slddlZddljZddlZddlZddlZddlmZddl	Z	ddl
mZmZej
j�ZGdd�de�ZdS)�N)�commands)�Popen�PIPEc@s&eZdZdZdd�Zdd�Zdd�Zedd	��Ze	d
d��Z
e	dd
��Ze	dd��Zdd�Z
dd�Zdd�Zdd�Zdd�Zdd�Zdd�Zdd�Zd d!�Zd"d#�Zd$d%�Zd&d'�Zd(d)�Zd*d+�Zd,d-�Zd.d/�Zd0d1�Zejfd2d3�Z d4d5�Z!d6d7�Z"d8d9�Z#ejfd:d;�Z$d<d=�Z%d>d?�Z&ejfd@dA�Z'dBdC�Z(dDdE�Z)dFdG�Z*dHdI�Z+dJdK�Z,dLdM�Z-d|dOdP�Z.d}dQdR�Z/d~dSdT�Z0ddUdV�Z1dWdX�Z2dYdZ�Z3d[d\�Z4d]d^�Z5d_d`�Z6d�dbdc�Z7d�ddde�Z8dfdg�Z9dhdi�Z:djdk�Z;d�dldm�Z<d�dndo�Z=dpdq�Z>drds�Z?dtdu�Z@d�dvdw�ZAd�dxdy�ZBdzd{�ZCdNS)��Plugina
	Base class for all plugins.

	Plugins change various system settings in order to get desired performance or power
	saving. Plugins use Monitor objects to get information from the running system.

	Intentionally a lot of logic is included in the plugin to increase plugin flexibility.
	c		Csn|j|jj�|_||_||_||_||_||_t	j
�|_|j�||_
||_d|_d|_|j�|_t�|_dS)zPlugin constructor.FN)�create�	__class__�__name__�_storageZ_monitors_repositoryZ_hardware_inventory�_device_matcher�_device_matcher_udev�_instance_factory�collections�OrderedDict�
_instances�_init_commands�_global_cfg�
_variables�_has_dynamic_options�_devices_inited�#_get_config_options_used_by_dynamic�_options_used_by_dynamicr�_cmd)	�selfZmonitors_repositoryZstorage_factoryZhardware_inventoryZdevice_matcherZdevice_matcher_udevZinstance_factoryZ
global_cfg�	variables�r�/usr/lib/python3.6/base.py�__init__s

zPlugin.__init__cCs|j�dS)N)�destroy_instances)rrrr�cleanup,szPlugin.cleanupcCs|js|j�d|_dS)NT)r�
_init_devices)rrrr�init_devices/szPlugin.init_devicescCs|jjjd�djdd�dS)N�.��_���)r�
__module__�split)rrrr�name4szPlugin.namecCsiS)z-Default configuration options for the plugin.r)rrrr�_get_config_options<szPlugin._get_config_optionscCsiS)z*Explanation of each config option functionr)�clsrrr�get_config_options_hintsAszPlugin.get_config_options_hintscCsgS)znList of config options used by dynamic tuning. Their previous values will be automatically saved and restored.r)rrrrrFsz*Plugin._get_config_options_used_by_dynamiccCsP|j�j�}x>|D]6}||ks$|jr2||||<qtjd||jjf�qW|S)z3Merge provided options with plugin default options.z$Unknown option '%s' for plugin '%s'.)r(�copyr�log�warnrr)r�optionsZ	effective�keyrrr�_get_effective_optionsKs
zPlugin._get_effective_optionscCs,t|�tkr|St|�j�}|dkp*|dkS)N�true�1)�type�bool�str�lower)r�valuerrr�_option_boolVszPlugin._option_boolc	CsF||jkrtd|��|j|�}|jj|||||||�}||j|<|S)z8Create new instance of the plugin and seize the devices.z.Plugin instance with name '%s' already exists.)r�	Exceptionr0rr)	rr'�devices_expression�devices_udev_regex�
script_pre�script_postr.Zeffective_options�instancerrr�create_instance`s



zPlugin.create_instancecCsV|j|krtd||f��|j|jkr2td|��|j|j}|j|�|j|j=dS)zDestroy existing instance.z9Plugin instance '%s' does not belong to this plugin '%s'.z+Plugin instance '%s' was already destroyed.N)Z_pluginr9r'r�_destroy_instance)rr>rrr�destroy_instancels

zPlugin.destroy_instancecCs$tjd|j|jf�|j|�dS)zInitialize an instance.zinitializing instance %s (%s)N)r,�debugr'�_instance_init)rr>rrr�initialize_instancewszPlugin.initialize_instancecCsFx6t|jj��D]$}tjd|j|jf�|j|�qW|jj�dS)zDestroy all instances.zdestroying instance %s (%s)N)�listr�valuesr,rBr'r@�clear)rr>rrrr|szPlugin.destroy_instancescCs|j|�|j|�dS)N)�release_devices�_instance_cleanup)rr>rrrr@�s
zPlugin._destroy_instancecCs
t��dS)N)�NotImplementedError)rr>rrrrC�szPlugin._instance_initcCs
t��dS)N)rJ)rr>rrrrI�szPlugin._instance_cleanupcCsd|_t�|_t�|_dS)NF)�_devices_supported�set�_assigned_devices�
_free_devices)rrrrr�szPlugin._init_devicescCsdS)z�Override this in a subclass to transform a list of device names (e.g. ['sda'])
		   to a list of pyudev.Device objects, if your plugin supports itNr)r�devicesrrr�_get_device_objects�szPlugin._get_device_objectscCsj|jdkrt|jj|j|��S|j|�}|dkrDtjd|j�t�S|j	j|j|�}tdd�|D��SdS)Nz<Plugin '%s' does not support the 'devices_udev_regex' optioncSsg|]
}|j�qSr)Zsys_name)�.0�xrrr�
<listcomp>�sz0Plugin._get_matching_devices.<locals>.<listcomp>)
r;rLr
Z
match_listr:rPr,�errorr'r)rr>rOZudev_devicesrrr�_get_matching_devices�s

zPlugin._get_matching_devicescCs�|js
dStjd|j�|j||j�}t|�dk|_|jsNtjd|j�n`|j}|j|jkrn|d|j7}tj	d|dj
|�f�|jj|�|j
|O_
|j|8_dS)Nz assigning devices to instance %srz*instance %s: no matching devices availablez (%s)z!instance %s: assigning devices %sz, )rKr,rBr'rUrN�len�activer-�info�join�assigned_devices�updaterM)rr>Z	to_assignr'rrr�assign_free_devices�szPlugin.assign_free_devicescCsV|js
dS|j|jB|j@}d|_|jj�|jj�|j|8_|j|O_dS)NF)rK�processed_devicesrZrMrWrGrN)rr>Z
to_releaserrrrH�s

zPlugin.release_devicescCs(|jsdg}x|D]}|||�qWdS)N)rK)rr>�callbackrO�devicerrr�_run_for_each_device�s
zPlugin._run_for_each_devicecCsdS)Nr)rr>�enablingrrr�_instance_pre_static�szPlugin._instance_pre_staticcCsdS)Nr)rr>rarrr�_instance_post_static�szPlugin._instance_post_staticcCsn|dkrdSt|�dkr0tjd|j|f�dS|jd�sHtjd�dStjj|�}d}�x|D�]}tj	}	|	j
|jj��|g}
|t
jkr�|
jd�|
j|�tjd	|t|
�f�tjd
tt|	j����yVt|g|
ttd|	|dd�}|j�\}}
|j�r$tjd||j|
dd�f�d}Wq`ttfk
�rd}ztjd||f�d}WYdd}~Xq`Xq`W|S)Nrz1Instance '%s': no device to call script '%s' for.�/z<Relative paths cannot be used in script_pre or script_post. zUse ${i:PROFILE_DIR}.FTZ
full_rollbackz'calling script '%s' with arguments '%s'zusing environment '%s')�stdout�stderrZ	close_fds�env�cwdZuniversal_newlineszscript '%s' error: %d, '%s'r"zscript '%s' error: %szQRelative paths cannot be used in script_pre or script_post. Use ${i:PROFILE_DIR}.r$)rVr,r-r'�
startswithrT�os�path�dirname�environr[rZget_env�constsZ
ROLLBACK_FULL�appendrXr5rBrE�itemsrrZcommunicate�
returncode�OSError�IOError)rr>Zscript�oprO�rollbackZdir_name�retZdevrmZ	arguments�proc�out�err�errr�_call_device_script�sB





zPlugin._call_device_scriptcCs�|js
dS|jrZ|j||jd|j�|j|d�|j|�|j|d�|j||jd|j�|j	r�|j
jtj
tj�r�|j||j|j�|jj|j�|jj�dS)zG
		Apply static and dynamic tuning if the plugin instance is active.
		NZapplyT)rW�has_static_tuningr{r<rZrb�_instance_apply_staticrcr=�has_dynamic_tuningr�getrn�CFG_DYNAMIC_TUNING�CFG_DEF_DYNAMIC_TUNINGr`�_instance_apply_dynamicr]r[rG)rr>rrr�instance_apply_tuning�s




zPlugin.instance_apply_tuningcCs�|js
dSt|j�dkr.tjddj|j��|jj�}|jr�|j	||j
d|�dkrXdS|j|||�dkrndS|j	||jd|�dkr�dSdSdSdS)z<
		Verify static tuning if the plugin instance is active.
		Nrz)BUG: Some devices have not been tuned: %sz, ZverifyFT)
rWrVrZr,rTrYr]r+r|r{r<�_instance_verify_staticr=)rr>�ignore_missingrOrrr�instance_verify_tunings
zPlugin.instance_verify_tuningcCs<|js
dS|jr8|jjtjtj�r8|j||j|j	j
��dS)z<
		Apply dynamic tuning if the plugin instance is active.
		N)rWr~rrrnr�r�r`�_instance_update_dynamicr]r+)rr>rrr�instance_update_tuning$szPlugin.instance_update_tuningcCs�|tjkrdS|jr8|jjtjtj�r8|j||j|j	�|j
r�|j||jd|j	|d�|j
|d�|j||�|j|d�|j||jd|j	|d�dS)z8
		Remove all tunings applied by the plugin instance.
		NZunapply)ruF)rnZ
ROLLBACK_NONEr~rrr�r�r`�_instance_unapply_dynamicr]r|r{r=rb�_instance_unapply_staticrcr<)rr>rurrr�instance_unapply_tuning-s

zPlugin.instance_unapply_tuningcCs|j|�|j||j�dS)N)� _execute_all_non_device_commands�_execute_all_device_commandsrZ)rr>rrrr}?s
zPlugin._instance_apply_staticcCs2d}|j||�dkrd}|j|||�dkr.d}|S)NTF)�_verify_all_non_device_commands�_verify_all_device_commands)rr>r�rOrvrrrr�CszPlugin._instance_verify_staticcCs|j||j�|j|�dS)N)�_cleanup_all_device_commandsr]� _cleanup_all_non_device_commands)rr>rurrrr�KszPlugin._instance_unapply_staticcsFx4���fdd��jD�D]}�j��j|��qW�j���dS)Ncs(g|] }�j��j|��dkr|�qS)N)�_storage_get�	_commands)rQ�opt)r_r>rrrrSQsz2Plugin._instance_apply_dynamic.<locals>.<listcomp>)r�_check_and_save_valuer�r�)rr>r_Zoptionr)r_r>rrr�PszPlugin._instance_apply_dynamiccCs
t��dS)N)rJ)rr>r_rrrr�Vsz Plugin._instance_unapply_dynamiccCs
t��dS)N)rJ)rr>r_rrrr�YszPlugin._instance_update_dynamiccCstj�|_|j�|j�dS)z
		Initialize commands.
		N)r
rr��_autoregister_commands�_check_commands)rrrrr`s
zPlugin._init_commandscCs�x�|jjD]�}|jd�rq
t||�}t|d�s0q
|jd}|jj|d|i�}d|jkr�d|d<||d<|jd|d<|jd|d<nBd	|jkr�||d	<n.d|jkr�||d<|jd|d<|jd|d<||j|<q
Wtj	t
t|jj��d
d�d��|_dS)
zd
		Register all commands marked using @command_set, @command_get, and @command_custom decorators.
		�__�_commandr'rLN�custom�
per_device�priorityrcSs|ddS)Nr"r�r)Z	name_inforrr�<lambda>�sz/Plugin._autoregister_commands.<locals>.<lambda>)r/)
r�__dict__ri�getattr�hasattrr�r�rr
r�sorted�iterrp)r�member_name�member�command_namerXrrrr�hs*







zPlugin._autoregister_commandscCsJxDt|jj��D]2\}}|jdd�r&qd|ks6d|krtd|��qWdS)z2
		Check if all commands are defined correctly.
		r�FrrLz,Plugin command '%s' is not defined correctlyN)rEr�rpr�	TypeError)rr��commandrrrr��s
zPlugin._check_commandsNcCsJt|�j}|dkrdn|}|dkr&dn|}|dkr6dn|}d||||fS)N�z%s/%s/%s/%s)r3r)rZ
instance_namer��device_name�
class_namerrr�_storage_key�s
zPlugin._storage_keycCs&|j|j|d|�}|jj||�dS)Nr')r�r'r	rL)rr>r�r7r�r/rrr�_storage_set�szPlugin._storage_setcCs |j|j|d|�}|jj|�S)Nr')r�r'r	r)rr>r�r�r/rrrr��szPlugin._storage_getcCs |j|j|d|�}|jj|�S)Nr')r�r'r	Zunset)rr>r�r�r/rrr�_storage_unset�szPlugin._storage_unsetcCsVxPdd�t|jj��D�D]4}|jj|jj|dd��}|dk	r|j|||�qWdS)NcSsg|]}|ds|�qS)r�r)rQr�rrrrS�sz;Plugin._execute_all_non_device_commands.<locals>.<listcomp>r')rEr�rFr�expandr.r�_execute_non_device_command)rr>r��	new_valuerrrr��sz'Plugin._execute_all_non_device_commandscCshxbdd�t|jj��D�D]F}|jj|jj|dd��}|dkrBqx|D]}|j||||�qHWqWdS)NcSsg|]}|dr|�qS)r�r)rQr�rrrrS�sz7Plugin._execute_all_device_commands.<locals>.<listcomp>r')rEr�rFrr�r.r�_execute_device_command)rr>rOr�r�r_rrrr��s
z#Plugin._execute_all_device_commandscCsdd}xZdd�t|jj��D�D]>}|jj|jj|dd��}|dk	r|j||||�dkrd}qW|S)NTcSsg|]}|ds|�qS)r�r)rQr�rrrrS�sz:Plugin._verify_all_non_device_commands.<locals>.<listcomp>r'F)rEr�rFrr�r.r�_verify_non_device_command)rr>r�rvr�r�rrrr��sz&Plugin._verify_all_non_device_commandscCsnd}xddd�t|jj��D�D]H}|jj|dd�}|dkr>qx&|D]}|j|||||�dkrDd}qDWqW|S)NTcSsg|]}|dr|�qS)r�r)rQr�rrrrS�sz6Plugin._verify_all_device_commands.<locals>.<listcomp>r'F)rEr�rFr.r�_verify_device_command)rr>rOr�rvr�r�r_rrrr��s
z"Plugin._verify_all_device_commandscCs�|dk	r�t|�}t|�dkr |S|dd�}|dd�}|dkrP|dkrL|S|SyF|dkrtt|�t|�krn|SdSn |dkr�t|�t|�kr�|SdSWn*tk
r�tjd||||f�YnX|S)Nr"�<�>zhcannot compare new value '%s' with current value '%s' by operator '%s', using '%s' directly as new value)r�r�)r5rV�int�
ValueErrorr,r-)rr��
current_valueZnwsrt�valrrr�_process_assignment_modifiers�s(z$Plugin._process_assignment_modifiersFcCs&|dk	r|d||d�S|d�SdS)Nr)r�r)rr�r_r�rrr�_get_current_value�szPlugin._get_current_valuecCs<|j||�}|j||�}|dk	r8|dk	r8|j||||�|S)N)r�r�r�)rr>r�r_r�r�rrrr��s
zPlugin._check_and_save_valuecCsR|ddk	r"|dd||dd�n,|j||||�}|dk	rN|d||ddd�dS)Nr�TFrL)�sim�remove)r�)rr>r�r_r�rrrr��s
zPlugin._execute_device_commandcCsN|ddk	r |dd|dd�n*|j||d|�}|dk	rJ|d|ddd�dS)Nr�TFrL)r�r�)r�)rr>r�r�rrrr��s
z"Plugin._execute_non_device_commandcCs.|jjt|��}tjd|�r*tjdd|�S|S)Nz\s*(0+,?)+([\da-fA-F]*,?)*\s*$z^\s*(0+,?)+r�)rZunquoter5�re�match�sub)rr7�vrrr�_norm_valueszPlugin._norm_valuec	Cs(|dkrdSd}|dkrN|rN|dkr6tjtj|�ntjtj||f�dS|dk	�r|j|�}|j|�}yt|�t|�k}Wn�tk
�ryt|d�t|d�k}Wn^tk
�rt|�t|�k}|�st|�j	d�}x"|D]}|j
�}||k}|r�Pq�WYnXYnX|j|||||d�|S)NFT��|)r_)r,rXrnZ STR_VERIFY_PROFILE_VALUE_MISSINGZ'STR_VERIFY_PROFILE_DEVICE_VALUE_MISSINGr�r�r�r5r&�strip�_log_verification_result)	rr'r�r�r�r_rv�valsr�rrr�
_verify_value
s8





zPlugin._verify_valuecCs�|rL|dkr*tjtj|t|�j�f�ntjtj||t|�j�f�dS|dkr|tjtj|t|�j�t|�j�f�n(tjtj	||t|�j�t|�j�f�dSdS)NTF)
r,rXrnZSTR_VERIFY_PROFILE_VALUE_OKr5r�Z"STR_VERIFY_PROFILE_DEVICE_VALUE_OKrTZSTR_VERIFY_PROFILE_VALUE_FAILZ$STR_VERIFY_PROFILE_DEVICE_VALUE_FAIL)rr'�successr�r�r_rrrr�-s((zPlugin._log_verification_resultcCsp|ddk	r |dd||d|�S|j|||d�}|j||�}|dkrHdS|d||dd�}|j|d||||�S)Nr�T)r�rLFr')r�r�r�)rr>r�r_r�r�r�rrrr�<szPlugin._verify_device_commandcCsd|ddk	r|dd|d|�S|j|�}|j||�}|dkr@dS|d|dd�}|j|d|||�S)Nr�TrLFr')r�r�r�)rr>r�r�r�r�rrrr�Fs
z!Plugin._verify_non_device_commandcCsZxTtdd�t|jj��D��D]4}|jj|dd�dk	sF|d|jkr|j||�qWdS)NcSsg|]}|ds|�qS)r�r)rQr�rrrrSQsz;Plugin._cleanup_all_non_device_commands.<locals>.<listcomp>r')�reversedrEr�rFr.rr�_cleanup_non_device_command)rr>r�rrrr�Ps"$z'Plugin._cleanup_all_non_device_commandscCslxftdd�t|jj��D��D]F}|jj|dd�dk	sF|d|jkrx|D]}|j||||�qLWqWdS)NcSsg|]}|dr|�qS)r�r)rQr�rrrrSVsz7Plugin._cleanup_all_device_commands.<locals>.<listcomp>r')r�rEr�rFr.rr�_cleanup_device_command)rr>rOr�r�r_rrrr�Us"$
z#Plugin._cleanup_all_device_commandscCs^|ddk	r"|ddd|dd�n8|j|||�}|dk	rL|d||d|d�|j|||�dS)Nr�FrL)r�r�)r�r�)rr>r�r_r��	old_valuerrrr�[szPlugin._cleanup_device_commandcCsV|ddk	r |ddddd�n2|j||�}|dk	rF|d|ddd�|j||�dS)Nr�FrL)r�r�)r�r�)rr>r�r�rrrr�dsz"Plugin._cleanup_non_device_command)NNN)N)N)N)NF)NN)N)N)F)F)Drr%�__qualname__�__doc__rrr �propertyr'�classmethodr(r*rr0r8r?rArDrr@rCrIrrPrUr\rHr`rbrcrnZ
ROLLBACK_SOFTr{r�r�r�r�r}r�r�r�r�r�rr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrrrrs|
#	



	


!




	r)r�Ztuned.constsrnZtuned.profiles.variablesZtunedZ
tuned.logsr
Ztuned.utils.commandsrrj�
subprocessrrZlogsrr,�objectrrrrr�<module>s

site-packages/tuned/plugins/__pycache__/__init__.cpython-36.pyc000064400000000255147511334670020476 0ustar003

�<�e1�@sddlTddlmZdS)�)�*)�instanceN)Z
repository�r�rr�/usr/lib/python3.6/__init__.py�<module>ssite-packages/tuned/plugins/__pycache__/base.cpython-36.opt-1.pyc000064400000052210147511334670020606 0ustar003

�<�e�W�@slddlZddljZddlZddlZddlZddlmZddl	Z	ddl
mZmZej
j�ZGdd�de�ZdS)�N)�commands)�Popen�PIPEc@s&eZdZdZdd�Zdd�Zdd�Zedd	��Ze	d
d��Z
e	dd
��Ze	dd��Zdd�Z
dd�Zdd�Zdd�Zdd�Zdd�Zdd�Zdd�Zd d!�Zd"d#�Zd$d%�Zd&d'�Zd(d)�Zd*d+�Zd,d-�Zd.d/�Zd0d1�Zejfd2d3�Z d4d5�Z!d6d7�Z"d8d9�Z#ejfd:d;�Z$d<d=�Z%d>d?�Z&ejfd@dA�Z'dBdC�Z(dDdE�Z)dFdG�Z*dHdI�Z+dJdK�Z,dLdM�Z-d|dOdP�Z.d}dQdR�Z/d~dSdT�Z0ddUdV�Z1dWdX�Z2dYdZ�Z3d[d\�Z4d]d^�Z5d_d`�Z6d�dbdc�Z7d�ddde�Z8dfdg�Z9dhdi�Z:djdk�Z;d�dldm�Z<d�dndo�Z=dpdq�Z>drds�Z?dtdu�Z@d�dvdw�ZAd�dxdy�ZBdzd{�ZCdNS)��Plugina
	Base class for all plugins.

	Plugins change various system settings in order to get desired performance or power
	saving. Plugins use Monitor objects to get information from the running system.

	Intentionally a lot of logic is included in the plugin to increase plugin flexibility.
	c		Csn|j|jj�|_||_||_||_||_||_t	j
�|_|j�||_
||_d|_d|_|j�|_t�|_dS)zPlugin constructor.FN)�create�	__class__�__name__�_storageZ_monitors_repositoryZ_hardware_inventory�_device_matcher�_device_matcher_udev�_instance_factory�collections�OrderedDict�
_instances�_init_commands�_global_cfg�
_variables�_has_dynamic_options�_devices_inited�#_get_config_options_used_by_dynamic�_options_used_by_dynamicr�_cmd)	�selfZmonitors_repositoryZstorage_factoryZhardware_inventoryZdevice_matcherZdevice_matcher_udevZinstance_factoryZ
global_cfg�	variables�r�/usr/lib/python3.6/base.py�__init__s

zPlugin.__init__cCs|j�dS)N)�destroy_instances)rrrr�cleanup,szPlugin.cleanupcCs|js|j�d|_dS)NT)r�
_init_devices)rrrr�init_devices/szPlugin.init_devicescCs|jjjd�djdd�dS)N�.��_���)r�
__module__�split)rrrr�name4szPlugin.namecCsiS)z-Default configuration options for the plugin.r)rrrr�_get_config_options<szPlugin._get_config_optionscCsiS)z*Explanation of each config option functionr)�clsrrr�get_config_options_hintsAszPlugin.get_config_options_hintscCsgS)znList of config options used by dynamic tuning. Their previous values will be automatically saved and restored.r)rrrrrFsz*Plugin._get_config_options_used_by_dynamiccCsP|j�j�}x>|D]6}||ks$|jr2||||<qtjd||jjf�qW|S)z3Merge provided options with plugin default options.z$Unknown option '%s' for plugin '%s'.)r(�copyr�log�warnrr)r�optionsZ	effective�keyrrr�_get_effective_optionsKs
zPlugin._get_effective_optionscCs,t|�tkr|St|�j�}|dkp*|dkS)N�true�1)�type�bool�str�lower)r�valuerrr�_option_boolVszPlugin._option_boolc	CsF||jkrtd|��|j|�}|jj|||||||�}||j|<|S)z8Create new instance of the plugin and seize the devices.z.Plugin instance with name '%s' already exists.)r�	Exceptionr0rr)	rr'�devices_expression�devices_udev_regex�
script_pre�script_postr.Zeffective_options�instancerrr�create_instance`s



zPlugin.create_instancecCsV|j|krtd||f��|j|jkr2td|��|j|j}|j|�|j|j=dS)zDestroy existing instance.z9Plugin instance '%s' does not belong to this plugin '%s'.z+Plugin instance '%s' was already destroyed.N)Z_pluginr9r'r�_destroy_instance)rr>rrr�destroy_instancels

zPlugin.destroy_instancecCs$tjd|j|jf�|j|�dS)zInitialize an instance.zinitializing instance %s (%s)N)r,�debugr'�_instance_init)rr>rrr�initialize_instancewszPlugin.initialize_instancecCsFx6t|jj��D]$}tjd|j|jf�|j|�qW|jj�dS)zDestroy all instances.zdestroying instance %s (%s)N)�listr�valuesr,rBr'r@�clear)rr>rrrr|szPlugin.destroy_instancescCs|j|�|j|�dS)N)�release_devices�_instance_cleanup)rr>rrrr@�s
zPlugin._destroy_instancecCs
t��dS)N)�NotImplementedError)rr>rrrrC�szPlugin._instance_initcCs
t��dS)N)rJ)rr>rrrrI�szPlugin._instance_cleanupcCsd|_t�|_t�|_dS)NF)�_devices_supported�set�_assigned_devices�
_free_devices)rrrrr�szPlugin._init_devicescCsdS)z�Override this in a subclass to transform a list of device names (e.g. ['sda'])
		   to a list of pyudev.Device objects, if your plugin supports itNr)r�devicesrrr�_get_device_objects�szPlugin._get_device_objectscCsj|jdkrt|jj|j|��S|j|�}|dkrDtjd|j�t�S|j	j|j|�}tdd�|D��SdS)Nz<Plugin '%s' does not support the 'devices_udev_regex' optioncSsg|]
}|j�qSr)Zsys_name)�.0�xrrr�
<listcomp>�sz0Plugin._get_matching_devices.<locals>.<listcomp>)
r;rLr
Z
match_listr:rPr,�errorr'r)rr>rOZudev_devicesrrr�_get_matching_devices�s

zPlugin._get_matching_devicescCs�|js
dStjd|j�|j||j�}t|�dk|_|jsNtjd|j�n`|j}|j|jkrn|d|j7}tj	d|dj
|�f�|jj|�|j
|O_
|j|8_dS)Nz assigning devices to instance %srz*instance %s: no matching devices availablez (%s)z!instance %s: assigning devices %sz, )rKr,rBr'rUrN�len�activer-�info�join�assigned_devices�updaterM)rr>Z	to_assignr'rrr�assign_free_devices�szPlugin.assign_free_devicescCsV|js
dS|j|jB|j@}d|_|jj�|jj�|j|8_|j|O_dS)NF)rK�processed_devicesrZrMrWrGrN)rr>Z
to_releaserrrrH�s

zPlugin.release_devicescCs(|jsdg}x|D]}|||�qWdS)N)rK)rr>�callbackrO�devicerrr�_run_for_each_device�s
zPlugin._run_for_each_devicecCsdS)Nr)rr>�enablingrrr�_instance_pre_static�szPlugin._instance_pre_staticcCsdS)Nr)rr>rarrr�_instance_post_static�szPlugin._instance_post_staticcCsn|dkrdSt|�dkr0tjd|j|f�dS|jd�sHtjd�dStjj|�}d}�x|D�]}tj	}	|	j
|jj��|g}
|t
jkr�|
jd�|
j|�tjd	|t|
�f�tjd
tt|	j����yVt|g|
ttd|	|dd�}|j�\}}
|j�r$tjd||j|
dd�f�d}Wq`ttfk
�rd}ztjd||f�d}WYdd}~Xq`Xq`W|S)Nrz1Instance '%s': no device to call script '%s' for.�/z<Relative paths cannot be used in script_pre or script_post. zUse ${i:PROFILE_DIR}.FTZ
full_rollbackz'calling script '%s' with arguments '%s'zusing environment '%s')�stdout�stderrZ	close_fds�env�cwdZuniversal_newlineszscript '%s' error: %d, '%s'r"zscript '%s' error: %szQRelative paths cannot be used in script_pre or script_post. Use ${i:PROFILE_DIR}.r$)rVr,r-r'�
startswithrT�os�path�dirname�environr[rZget_env�constsZ
ROLLBACK_FULL�appendrXr5rBrE�itemsrrZcommunicate�
returncode�OSError�IOError)rr>Zscript�oprO�rollbackZdir_name�retZdevrmZ	arguments�proc�out�err�errr�_call_device_script�sB





zPlugin._call_device_scriptcCs�|js
dS|jrZ|j||jd|j�|j|d�|j|�|j|d�|j||jd|j�|j	r�|j
jtj
tj�r�|j||j|j�|jj|j�|jj�dS)zG
		Apply static and dynamic tuning if the plugin instance is active.
		NZapplyT)rW�has_static_tuningr{r<rZrb�_instance_apply_staticrcr=�has_dynamic_tuningr�getrn�CFG_DYNAMIC_TUNING�CFG_DEF_DYNAMIC_TUNINGr`�_instance_apply_dynamicr]r[rG)rr>rrr�instance_apply_tuning�s




zPlugin.instance_apply_tuningcCs�|js
dSt|j�dkr.tjddj|j��|jj�}|jr�|j	||j
d|�dkrXdS|j|||�dkrndS|j	||jd|�dkr�dSdSdSdS)z<
		Verify static tuning if the plugin instance is active.
		Nrz)BUG: Some devices have not been tuned: %sz, ZverifyFT)
rWrVrZr,rTrYr]r+r|r{r<�_instance_verify_staticr=)rr>�ignore_missingrOrrr�instance_verify_tunings
zPlugin.instance_verify_tuningcCs<|js
dS|jr8|jjtjtj�r8|j||j|j	j
��dS)z<
		Apply dynamic tuning if the plugin instance is active.
		N)rWr~rrrnr�r�r`�_instance_update_dynamicr]r+)rr>rrr�instance_update_tuning$szPlugin.instance_update_tuningcCs�|tjkrdS|jr8|jjtjtj�r8|j||j|j	�|j
r�|j||jd|j	|d�|j
|d�|j||�|j|d�|j||jd|j	|d�dS)z8
		Remove all tunings applied by the plugin instance.
		NZunapply)ruF)rnZ
ROLLBACK_NONEr~rrr�r�r`�_instance_unapply_dynamicr]r|r{r=rb�_instance_unapply_staticrcr<)rr>rurrr�instance_unapply_tuning-s

zPlugin.instance_unapply_tuningcCs|j|�|j||j�dS)N)� _execute_all_non_device_commands�_execute_all_device_commandsrZ)rr>rrrr}?s
zPlugin._instance_apply_staticcCs2d}|j||�dkrd}|j|||�dkr.d}|S)NTF)�_verify_all_non_device_commands�_verify_all_device_commands)rr>r�rOrvrrrr�CszPlugin._instance_verify_staticcCs|j||j�|j|�dS)N)�_cleanup_all_device_commandsr]� _cleanup_all_non_device_commands)rr>rurrrr�KszPlugin._instance_unapply_staticcsFx4���fdd��jD�D]}�j��j|��qW�j���dS)Ncs(g|] }�j��j|��dkr|�qS)N)�_storage_get�	_commands)rQ�opt)r_r>rrrrSQsz2Plugin._instance_apply_dynamic.<locals>.<listcomp>)r�_check_and_save_valuer�r�)rr>r_Zoptionr)r_r>rrr�PszPlugin._instance_apply_dynamiccCs
t��dS)N)rJ)rr>r_rrrr�Vsz Plugin._instance_unapply_dynamiccCs
t��dS)N)rJ)rr>r_rrrr�YszPlugin._instance_update_dynamiccCstj�|_|j�|j�dS)z
		Initialize commands.
		N)r
rr��_autoregister_commands�_check_commands)rrrrr`s
zPlugin._init_commandscCs�x�|jjD]�}|jd�rq
t||�}t|d�s0q
|jd}|jj|d|i�}d|jkr�d|d<||d<|jd|d<|jd|d<nBd	|jkr�||d	<n.d|jkr�||d<|jd|d<|jd|d<||j|<q
Wtj	t
t|jj��d
d�d��|_dS)
zd
		Register all commands marked using @command_set, @command_get, and @command_custom decorators.
		�__�_commandr'rLN�custom�
per_device�priorityrcSs|ddS)Nr"r�r)Z	name_inforrr�<lambda>�sz/Plugin._autoregister_commands.<locals>.<lambda>)r/)
r�__dict__ri�getattr�hasattrr�r�rr
r�sorted�iterrp)r�member_name�member�command_namerXrrrr�hs*







zPlugin._autoregister_commandscCsJxDt|jj��D]2\}}|jdd�r&qd|ks6d|krtd|��qWdS)z2
		Check if all commands are defined correctly.
		r�FrrLz,Plugin command '%s' is not defined correctlyN)rEr�rpr�	TypeError)rr��commandrrrr��s
zPlugin._check_commandsNcCsJt|�j}|dkrdn|}|dkr&dn|}|dkr6dn|}d||||fS)N�z%s/%s/%s/%s)r3r)rZ
instance_namer��device_name�
class_namerrr�_storage_key�s
zPlugin._storage_keycCs&|j|j|d|�}|jj||�dS)Nr')r�r'r	rL)rr>r�r7r�r/rrr�_storage_set�szPlugin._storage_setcCs |j|j|d|�}|jj|�S)Nr')r�r'r	r)rr>r�r�r/rrrr��szPlugin._storage_getcCs |j|j|d|�}|jj|�S)Nr')r�r'r	Zunset)rr>r�r�r/rrr�_storage_unset�szPlugin._storage_unsetcCsVxPdd�t|jj��D�D]4}|jj|jj|dd��}|dk	r|j|||�qWdS)NcSsg|]}|ds|�qS)r�r)rQr�rrrrS�sz;Plugin._execute_all_non_device_commands.<locals>.<listcomp>r')rEr�rFr�expandr.r�_execute_non_device_command)rr>r��	new_valuerrrr��sz'Plugin._execute_all_non_device_commandscCshxbdd�t|jj��D�D]F}|jj|jj|dd��}|dkrBqx|D]}|j||||�qHWqWdS)NcSsg|]}|dr|�qS)r�r)rQr�rrrrS�sz7Plugin._execute_all_device_commands.<locals>.<listcomp>r')rEr�rFrr�r.r�_execute_device_command)rr>rOr�r�r_rrrr��s
z#Plugin._execute_all_device_commandscCsdd}xZdd�t|jj��D�D]>}|jj|jj|dd��}|dk	r|j||||�dkrd}qW|S)NTcSsg|]}|ds|�qS)r�r)rQr�rrrrS�sz:Plugin._verify_all_non_device_commands.<locals>.<listcomp>r'F)rEr�rFrr�r.r�_verify_non_device_command)rr>r�rvr�r�rrrr��sz&Plugin._verify_all_non_device_commandscCsnd}xddd�t|jj��D�D]H}|jj|dd�}|dkr>qx&|D]}|j|||||�dkrDd}qDWqW|S)NTcSsg|]}|dr|�qS)r�r)rQr�rrrrS�sz6Plugin._verify_all_device_commands.<locals>.<listcomp>r'F)rEr�rFr.r�_verify_device_command)rr>rOr�rvr�r�r_rrrr��s
z"Plugin._verify_all_device_commandscCs�|dk	r�t|�}t|�dkr |S|dd�}|dd�}|dkrP|dkrL|S|SyF|dkrtt|�t|�krn|SdSn |dkr�t|�t|�kr�|SdSWn*tk
r�tjd||||f�YnX|S)Nr"�<�>zhcannot compare new value '%s' with current value '%s' by operator '%s', using '%s' directly as new value)r�r�)r5rV�int�
ValueErrorr,r-)rr��
current_valueZnwsrt�valrrr�_process_assignment_modifiers�s(z$Plugin._process_assignment_modifiersFcCs&|dk	r|d||d�S|d�SdS)Nr)r�r)rr�r_r�rrr�_get_current_value�szPlugin._get_current_valuecCs<|j||�}|j||�}|dk	r8|dk	r8|j||||�|S)N)r�r�r�)rr>r�r_r�r�rrrr��s
zPlugin._check_and_save_valuecCsR|ddk	r"|dd||dd�n,|j||||�}|dk	rN|d||ddd�dS)Nr�TFrL)�sim�remove)r�)rr>r�r_r�rrrr��s
zPlugin._execute_device_commandcCsN|ddk	r |dd|dd�n*|j||d|�}|dk	rJ|d|ddd�dS)Nr�TFrL)r�r�)r�)rr>r�r�rrrr��s
z"Plugin._execute_non_device_commandcCs.|jjt|��}tjd|�r*tjdd|�S|S)Nz\s*(0+,?)+([\da-fA-F]*,?)*\s*$z^\s*(0+,?)+r�)rZunquoter5�re�match�sub)rr7�vrrr�_norm_valueszPlugin._norm_valuec	Cs(|dkrdSd}|dkrN|rN|dkr6tjtj|�ntjtj||f�dS|dk	�r|j|�}|j|�}yt|�t|�k}Wn�tk
�ryt|d�t|d�k}Wn^tk
�rt|�t|�k}|�st|�j	d�}x"|D]}|j
�}||k}|r�Pq�WYnXYnX|j|||||d�|S)NFT��|)r_)r,rXrnZ STR_VERIFY_PROFILE_VALUE_MISSINGZ'STR_VERIFY_PROFILE_DEVICE_VALUE_MISSINGr�r�r�r5r&�strip�_log_verification_result)	rr'r�r�r�r_rv�valsr�rrr�
_verify_value
s8





zPlugin._verify_valuecCs�|rL|dkr*tjtj|t|�j�f�ntjtj||t|�j�f�dS|dkr|tjtj|t|�j�t|�j�f�n(tjtj	||t|�j�t|�j�f�dSdS)NTF)
r,rXrnZSTR_VERIFY_PROFILE_VALUE_OKr5r�Z"STR_VERIFY_PROFILE_DEVICE_VALUE_OKrTZSTR_VERIFY_PROFILE_VALUE_FAILZ$STR_VERIFY_PROFILE_DEVICE_VALUE_FAIL)rr'�successr�r�r_rrrr�-s((zPlugin._log_verification_resultcCsp|ddk	r |dd||d|�S|j|||d�}|j||�}|dkrHdS|d||dd�}|j|d||||�S)Nr�T)r�rLFr')r�r�r�)rr>r�r_r�r�r�rrrr�<szPlugin._verify_device_commandcCsd|ddk	r|dd|d|�S|j|�}|j||�}|dkr@dS|d|dd�}|j|d|||�S)Nr�TrLFr')r�r�r�)rr>r�r�r�r�rrrr�Fs
z!Plugin._verify_non_device_commandcCsZxTtdd�t|jj��D��D]4}|jj|dd�dk	sF|d|jkr|j||�qWdS)NcSsg|]}|ds|�qS)r�r)rQr�rrrrSQsz;Plugin._cleanup_all_non_device_commands.<locals>.<listcomp>r')�reversedrEr�rFr.rr�_cleanup_non_device_command)rr>r�rrrr�Ps"$z'Plugin._cleanup_all_non_device_commandscCslxftdd�t|jj��D��D]F}|jj|dd�dk	sF|d|jkrx|D]}|j||||�qLWqWdS)NcSsg|]}|dr|�qS)r�r)rQr�rrrrSVsz7Plugin._cleanup_all_device_commands.<locals>.<listcomp>r')r�rEr�rFr.rr�_cleanup_device_command)rr>rOr�r�r_rrrr�Us"$
z#Plugin._cleanup_all_device_commandscCs^|ddk	r"|ddd|dd�n8|j|||�}|dk	rL|d||d|d�|j|||�dS)Nr�FrL)r�r�)r�r�)rr>r�r_r��	old_valuerrrr�[szPlugin._cleanup_device_commandcCsV|ddk	r |ddddd�n2|j||�}|dk	rF|d|ddd�|j||�dS)Nr�FrL)r�r�)r�r�)rr>r�r�rrrr�dsz"Plugin._cleanup_non_device_command)NNN)N)N)N)NF)NN)N)N)F)F)Drr%�__qualname__�__doc__rrr �propertyr'�classmethodr(r*rr0r8r?rArDrr@rCrIrrPrUr\rHr`rbrcrnZ
ROLLBACK_SOFTr{r�r�r�r�r}r�r�r�r�r�rr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrrrrs|
#	



	


!




	r)r�Ztuned.constsrnZtuned.profiles.variablesZtunedZ
tuned.logsr
Ztuned.utils.commandsrrj�
subprocessrrZlogsrr,�objectrrrrr�<module>s

site-packages/tuned/plugins/__pycache__/plugin_net.cpython-36.pyc000064400000056323147511334670021112 0ustar003

�<�e�Z�@spddlZddlmZddlTddlZddlmZddlm	Z	ddl
Z
ddlZejj
�ZdZGdd	�d	ej�ZdS)
�N�)�hotplug)�*)�ethcard)�commandsZpumbagsdcs.eZdZdZ�fdd�Zdd�Zdd�Zdd	�Zd
d�Zdd
�Z	dd�Z
edd��Zedd��Z
edd��Zedd��Zedd��Zdd�Zdd�Zdd�Zd d!�Zd"d#�Zd$d%�Zd&d'�Zed(d)��Zed*d+d,�d-d.��Zed*�did0d1��Zed2�d3d4��Zed2�d5d6��Zgfd7d8�Zdjd:d;�Z ed<d+d,�d=d>��Z!d?d@�Z"ed<�dkdAdB��Z#edCd+d,�dDdE��Z$edC�dldFdG��Z%dHdI�Z&dJdK�Z'dLdM�Z(dNdO�Z)dPdQ�Z*dRdS�Z+dTdU�Z,dmdVdW�Z-dXdY�Z.e/dZd+d,�d[d\��Z0e/d]d+d,�d^d_��Z1e/d`d+d,�dadb��Z2e/dcd+d,�ddde��Z3e/dfd+d,�dgdh��Z4�Z5S)n�NetTuningPlugina�
	`net`::
	
	Configures network driver, hardware and Netfilter settings.
	Dynamic change of the interface speed according to the interface
	utilization is also supported. The dynamic tuning is controlled by
	the [option]`dynamic` and the global [option]`dynamic_tuning`
	option in `tuned-main.conf`.
	+
	The [option]`wake_on_lan` option sets wake-on-lan to the specified
	value as when using the `ethtool` utility.
	+
	.Set Wake-on-LAN for device eth0 on MagicPacket(TM)
	====
	----
	[net]
	devices=eth0
	wake_on_lan=g
	----
	====
	+
	The [option]`coalesce` option allows changing coalescing settings
	for the specified network devices. The syntax is:
	+
	[subs="+quotes,+macros"]
	----
	coalesce=__param1__ __value1__ __param2__ __value2__ ... __paramN__ __valueN__
	----
	Note that not all the coalescing parameters are supported by all
	network cards. For the list of coalescing parameters of your network
	device, use `ethtool -c device`.
	+	
	.Setting coalescing parameters rx/tx-usecs for all network devices
	====
	----
	[net]
	coalesce=rx-usecs 3 tx-usecs 16
	----
	====
	+
	The [option]`features` option allows changing 
	the offload parameters and other features for the specified
	network devices. To query the features of your network device,
	use `ethtool -k device`. The syntax of the option is the same as
	the [option]`coalesce` option.
	+
	.Turn off TX checksumming, generic segmentation and receive offload 
	====
	----
	[net]
	features=tx off gso off gro off
	----
	====
	The [option]`pause` option allows changing the pause parameters for
	the specified network devices. To query the pause parameters of your
	network device, use `ethtool -a device`. The syntax of the option
	is the same as the [option]`coalesce` option.
	+
	.Disable autonegotiation
	====
	----
	[net]
	pause=autoneg off
	----
	====
	+
	The [option]`ring` option allows changing the rx/tx ring parameters
	for the specified network devices. To query the ring parameters of your
	network device, use `ethtool -g device`. The syntax of the option
	is the same as the [option]`coalesce` option.
	+	
	.Change the number of ring entries for the Rx/Tx rings to 1024/512 respectively
	=====
	-----
	[net]
	ring=rx 1024 tx 512
	-----
	=====
	+
	The [option]`channels` option allows changing the numbers of channels
	for the specified network device. A channel is an IRQ and the set
	of queues that can trigger that IRQ. To query the channels parameters of your
	network device, use `ethtool -l device`. The syntax of the option
	is the same as the [option]`coalesce` option.
	+
	.Set the number of multi-purpose channels to 16
	=====
	-----
	[net]
	channels=combined 16
	-----
	=====
	+   
	A network device either supports rx/tx or combined queue
	mode. The [option]`channels` option automatically adjusts the
	parameters based on the mode supported by the device as long as a
	valid configuration is requested.
	+
	The [option]`nf_conntrack_hashsize` option sets the size of the hash
	table which stores lists of conntrack entries by writing to
	`/sys/module/nf_conntrack/parameters/hashsize`.
	+
	.Adjust the size of the conntrack hash table
	====
	----
	[net]
	nf_conntrack_hashsize=131072
	----
	====
	+
	The [option]`txqueuelen` option allows changing txqueuelen (the length
	of the transmit queue). It uses the `ip` utility from the iproute package,
	which is recommended for TuneD, so the package needs to be installed for
	this option to work. To query the txqueuelen parameters of your network device
	use `ip link show` and the current value is shown after the qlen column.
	+
	.Adjust the length of the transmit queue
	====
	----
	[net]
	txqueuelen=5000
	----
	====
	+
	The [option]`mtu` option allows changing MTU (Maximum Transmission Unit).
	It uses the `ip` utility from the iproute package, which is recommended for
	TuneD, so the package needs to be installed for this option to work. To query
	the MTU parameters of your network device use `ip link show` and the
	current value is shown after the MTU column.
	+
	.Adjust the size of the MTU
	====
	----
	[net]
	mtu=9000
	----
	====
	cs6tt|�j||�d|_d|_t�|_i|_d|_dS)Ng�������?�T)	�superr�__init__�_load_smallest�_level_stepsr�_cmd�_re_ip_link_show�_use_ip)�self�args�kwargs)�	__class__�� /usr/lib/python3.6/plugin_net.pyr
�szNetTuningPlugin.__init__cCshd|_t�|_t�|_tjd�}x.|jjd�D]}|j|j	�r.|jj
|j�q.Wtj
dt|j��dS)NTz(?!.*/virtual/.*)�netzdevices: %s)Z_devices_supported�setZ
_free_devicesZ_assigned_devices�re�compile�_hardware_inventoryZget_devices�matchZdevice_path�addZsys_name�log�debug�str)rZre_not_virtual�devicerrr�
_init_devices�s
zNetTuningPlugin._init_devicescs�fdd�|D�S)Ncsg|]}�jjd|��qS)r)rZ
get_device)�.0�x)rrr�
<listcomp>�sz7NetTuningPlugin._get_device_objects.<locals>.<listcomp>r)rZdevicesr)rr�_get_device_objects�sz#NetTuningPlugin._get_device_objectscCsXd|_|j|jd�r<d|_|jjd|j�|_i|_i|_	nd|_d|_d|_d|_	dS)NT�dynamicrF)
Z_has_static_tuningZ_option_boolZoptionsZ_has_dynamic_tuning�_monitors_repositoryZcreateZassigned_devices�
_load_monitor�_idle�_stats)r�instancerrr�_instance_init�szNetTuningPlugin._instance_initcCs"|jdk	r|jj|j�d|_dS)N)r(r'�delete)rr+rrr�_instance_cleanup�s
z!NetTuningPlugin._instance_cleanupcCs|j||�dS)N)�_instance_update_dynamic)rr+r rrr�_instance_apply_dynamic�sz'NetTuningPlugin._instance_apply_dynamiccCs<dd�|jj|�D�}|dkr"dS||jkr8|j||�|j|||�|j||�|j|}|j|}|ddkr�|d|jkr�|d|jkr�d|d<tj	d|�t
|�jd	�nF|ddkr�|ddks�|ddkr�d|d<tj	d
|�t
|�j�tj
d||d|df�tj
d||d|d|df�dS)
NcSsg|]}t|��qSr)�int)r"�valuerrrr$�sz<NetTuningPlugin._instance_update_dynamic.<locals>.<listcomp>�levelr�read�writerz%s: setting 100Mbps�dz%s: setting max speedz %s load: read %0.2f, write %0.2fz$%s idle: read %d, write %d, level %d)r(Zget_device_loadr*�_init_stats_and_idle�
_update_stats�_update_idler)rr�inforZ	set_speed�
set_max_speedr)rr+r �loadZstatsZidlerrrr/�s&


($z(NetTuningPlugin._instance_update_dynamiccCs2ddddddddddddddddddddddd�S)N)zadaptive-rxzadaptive-txzrx-usecsz	rx-frameszrx-usecs-irqz
rx-frames-irqztx-usecsz	tx-framesztx-usecs-irqz
tx-frames-irqzstats-block-usecszpkt-rate-lowzrx-usecs-lowz
rx-frames-lowztx-usecs-lowz
tx-frames-lowz
pkt-rate-highz
rx-usecs-highzrx-frames-highz
tx-usecs-highztx-frames-highzsample-intervalr)�clsrrr�_get_config_options_coalesce�s,z,NetTuningPlugin._get_config_options_coalescecCsdddd�S)N)�autoneg�rx�txr)r=rrr�_get_config_options_pause�sz)NetTuningPlugin._get_config_options_pausecCsddddd�S)N)r@zrx-minizrx-jumborAr)r=rrr�_get_config_options_ringsz(NetTuningPlugin._get_config_options_ringcCsddddd�S)N)r@rA�other�combinedr)r=rrr�_get_config_options_channelssz,NetTuningPlugin._get_config_options_channelscCsddddddddddd�
S)NT)
r&�wake_on_lan�nf_conntrack_hashsize�features�coalesce�pause�ring�channels�
txqueuelen�mtur)r=rrr�_get_config_optionssz#NetTuningPlugin._get_config_optionscCsF|jt|�j��}ddgd|dgd�|j|<dddd�|j|<dS)N�r�r)�new�max)r3r4r5)�_calc_speedrZ
get_max_speedr*r))rr+r Z	max_speedrrrr7sz$NetTuningPlugin._init_stats_and_idlecCs�|j|d|j|d<}||j|d<dd�t||�D�}||j|d<|j|d}dd�t||�D�}||j|d<t|d�t|d�|j|d	<t|d
�t|d
�|j|d<dS)NrS�oldcSsg|]}|d|d�qS)rrr)r"Znew_oldrrrr$(sz1NetTuningPlugin._update_stats.<locals>.<listcomp>�diffrTcSsg|]}t|��qSr)rT)r"Zpairrrrr$-srr4rRr5)r*�zip�float)rr+r Znew_loadZold_loadrWZold_max_loadZmax_loadrrrr8"s"zNetTuningPlugin._update_statscCsLxFdD]>}|j|||jkr6|j||d7<qd|j||<qWdS)Nr4r5rr)r4r5)r*rr))rr+r Z	operationrrrr94s
zNetTuningPlugin._update_idlecCsH||jkrD|j|ddkrDd|j|d<tjd|�t|�j�dS)Nr3rz%s: setting max speed)r)rr:rr;)rr+r rrr�_instance_unapply_dynamic<sz)NetTuningPlugin._instance_unapply_dynamiccCstd|d�S)Ng333333�?i�g333333�@g333333#A)r1)rZspeedrrrrUBszNetTuningPlugin._calc_speedcCs�|jj|�}ttjdd|��j�}t|�}|ddkrPtjd|t|�f�dS|dkr^t	�St	t
t|ddd�|ddd����S)Nz (:\s*)|(\s+)|(\s*;\s*)|(\s*,\s*)� rRrzinvalid %s parameter: '%s'r)Z
_variables�expandrr�sub�split�lenr�error�dict�listrX)rr2�context�vZlvrrr�_parse_config_parametersKsz(NetTuningPlugin._parse_config_parameterscCs||jjddddddddd	d
ddd
dddddd�|�}dd�|jd�D�}t|�dkrXdStdd�dd�|dd�D�D��S)Nzadaptive-rx:z
adaptive-tx:zrx-frames-low:zrx-frames-high:ztx-frames-low:ztx-frames-high:zlro:zrx:ztx:zsg:ztso:zufo:zgso:zgro:zrxvlan:ztxvlan:zntuple:zrxhash:)zAdaptive RX:z\s+TX:z
rx-frame-low:zrx-frame-high:z
tx-frame-low:ztx-frame-high:zlarge-receive-offload:zrx-checksumming:ztx-checksumming:zscatter-gather:ztcp-segmentation-offload:zudp-fragmentation-offload:zgeneric-segmentation-offload:zgeneric-receive-offload:zrx-vlan-offload:ztx-vlan-offload:zntuple-filters:zreceive-hashing:cSs2g|]*}tt|��dkrtjdt|��r|�qS)rz
\[fixed\]$)r`rr�search)r"rerrrr$ssz<NetTuningPlugin._parse_device_parameters.<locals>.<listcomp>�
rRcSsg|]}t|�dkr|�qS)rR)r`)r"�urrrr$xscSsg|]}tjdt|���qS)z:\s*)rr_r)r"rerrrr$xsr)r
�multiple_re_replacer_r`rb)rr2Zvlrrr�_parse_device_parametersZs0z(NetTuningPlugin._parse_device_parameterscCsdS)Nz,/sys/module/nf_conntrack/parameters/hashsizer)rrrr�_nf_conntrack_hashsize_pathzsz+NetTuningPlugin._nf_conntrack_hashsize_pathrGT)Z
per_devicecCs^|dkrdStjddt|��}tjdtd|�s@tjd�dS|sZ|jjdd|d|g�|S)	N�0�dz^[z]+$zIncorrect 'wake_on_lan' value.�ethtoolz-sZwol)	rr^rr�
WOL_VALUESr�warnr
�execute)rr2r �sim�removerrr�_set_wake_on_lan~s
z NetTuningPlugin._set_wake_on_lanFcCsXd}y:tjdtd|jjd|g�dtj�}|r<|jd�}Wntk
rRYnX|S)Nz.*Wake-on:\s*([z]+).*ror)rrrpr
rr�S�group�IOError)rr �ignore_missingr2�mrrr�_get_wake_on_lan�s(z NetTuningPlugin._get_wake_on_lanrHcCsN|dkrdSt|�}|dkrF|sB|jj|j�||r:tjgndd�|SdSdS)NrF)Zno_error)r1r
Z
write_to_filerl�errno�ENOENT)rr2rsrtZhashsizerrr�_set_nf_conntrack_hashsize�sz*NetTuningPlugin._set_nf_conntrack_hashsizecCs(|jj|j��}t|�dkr$t|�SdS)Nr)r
Z	read_filerlr`r1)rr2rrr�_get_nf_conntrack_hashsize�sz*NetTuningPlugin._get_nf_conntrack_hashsizecCsz|js
dSddg|}|jj|tjgdd�\}}}|tjkrRtjd�d|_dS|rvtjd�tjd||f�dS|S)	NZip�linkT)�	no_errorsZ
return_errz0ip command not found, ignoring for other devicesFzProblem calling ip commandz(rc: %s, msg: '%s'))	rr
rrr|r}rrqr:r)rrZrc�outZerr_msgrrr�
_call_ip_link�s

zNetTuningPlugin._call_ip_linkNcCsdg}|r|j|�|j|�S)NZshow)�appendr�)rr rrrr�
_ip_link_show�s
zNetTuningPlugin._ip_link_showrNcCsr|dkrdSyt|�Wn"tk
r:tjd|�dSX|sn|jdd|d|g�}|dkrntjd|�dS|S)Nz$txqueuelen value '%s' is not integerr�devrNz%Cannot set txqueuelen for device '%s')r1�
ValueErrorrrqr�)rr2r rsrt�resrrr�_set_txqueuelen�szNetTuningPlugin._set_txqueuelencCs(||jkrtjd|�|j|<|j|S)z@
		Return regex for int arg value from "ip link show" command
		z.*\s+%s\s+(\d+))rrr)r�argrrr�_get_re_ip_link_show�s
z$NetTuningPlugin._get_re_ip_link_showcCs`|j|�}|dkr(|s$tjd|�dS|jd�j|�}|dkrV|sRtjd|�dS|jd�S)NzECannot get 'ip link show' result for txqueuelen value for device '%s'ZqlenzFCannot get txqueuelen value from 'ip link show' result for device '%s'r)r�rr:r�rgrw)rr ryr�r�rrr�_get_txqueuelen�s
zNetTuningPlugin._get_txqueuelenrOcCsr|dkrdSyt|�Wn"tk
r:tjd|�dSX|sn|jdd|d|g�}|dkrntjd|�dS|S)Nzmtu value '%s' is not integerrr�rOzCannot set mtu for device '%s')r1r�rrqr�)rr2r rsrtr�rrr�_set_mtu�szNetTuningPlugin._set_mtucCs`|j|�}|dkr(|s$tjd|�dS|jd�j|�}|dkrV|sRtjd|�dS|jd�S)Nz>Cannot get 'ip link show' result for mtu value for device '%s'rOz?Cannot get mtu value from 'ip link show' result for device '%s'r)r�rr:r�rgrw)rr ryr�r�rrr�_get_mtu�s
zNetTuningPlugin._get_mtucCsl|dkrdSt|j��}|j|j|j|jd�}t||�j��}|j|�shtjd|t	||�f�dSdS)NrIT)rJrKrLrMzunknown %s parameter(s): %sF)
r�keysr>rBrCrF�issubsetrrar)rrdrnZparamsZsupported_getterZ	supportedrrr�_check_parameters
s

z!NetTuningPlugin._check_parameterscCsR|jjdddd�|�}|jd�dd�}dd�|D�}td	d�d
d�|D�D��S)Nr?r@rA)Z
Autonegotiate�RX�TXrhrcSs&g|]}|dkrtjd|�r|�qS)�z	\[fixed\])rrg)r"r#rrrr$sz;NetTuningPlugin._parse_pause_parameters.<locals>.<listcomp>cSsg|]}t|�dkr|�qS)rR)r`)r"r#rrrr$ scSsg|]}tjd|��qS)z:\s*)rr_)r"r#rrrr$ s)r
rjr_rb)r�s�lrrr�_parse_pause_parameterssz'NetTuningPlugin._parse_pause_parameterscCsjtjd|tjd�}|d}|jjddddd�|�}|jd	�}d
d�|D�}dd�d
d�|D�D�}t|�S)Nz^Current hardware settings:$)�flagsrr@zrx-minizrx-jumborA)r�zRX MinizRX Jumbor�rhcSsg|]}|dkr|�qS)r�r)r"r#rrrr$,sz:NetTuningPlugin._parse_ring_parameters.<locals>.<listcomp>cSsg|]}t|�dkr|�qS)rR)r`)r"r#rrrr$-scSsg|]}tjd|��qS)z:\s*)rr_)r"r#rrrr$-s)rr_�	MULTILINEr
rjrb)rr��ar�rrr�_parse_ring_parameters#s
z&NetTuningPlugin._parse_ring_parameterscCsjtjd|tjd�}|d}|jjddddd�|�}|jd	�}d
d�|D�}dd�d
d�|D�D�}t|�S)Nz^Current hardware settings:$)r�rr@rArDrE)r�r�ZOtherZCombinedrhcSsg|]}|dkr|�qS)r�r)r"r#rrrr$:sz>NetTuningPlugin._parse_channels_parameters.<locals>.<listcomp>cSsg|]}t|�dkr|�qS)rR)r`)r"r#rrrr$;scSsg|]}tjd|��qS)z:\s*)rr_)r"r#rrrr$;s)rr_r�r
rjrb)rr�r�r�rrr�_parse_channels_parameters1s
z*NetTuningPlugin._parse_channels_parameterscCszg}d|kr(|jd|dd|dg�n,ttt|d�t|d���}|jd|g�ttt|ddd�|ddd����S)NrEr@rrA�rR)�extendrrTr1rbrcrX)rrdZparams_list�
dev_paramsZmod_params_listZcntrrr�_replace_channels_parameters>sz,NetTuningPlugin._replace_channels_parametersc	CsRt|j��}t|j��}||}x,|D]$}tjd|||f�|j|d�q&WdS)aFilter unsupported parameters and log warnings about it

		Positional parameters:
		context -- context of change
		parameters -- parameters to change
		device -- name of device on which should be parameters set
		dev_params -- dictionary of currently known parameters of device
		z-%s parameter %s is not supported by device %sN)rr�rZwarning�pop)	rrdZ
parametersr r�Zsupported_parametersZparameters_to_changeZunsupported_parameters�paramrrr�_check_device_supportGs	

z%NetTuningPlugin._check_device_supportc
Cs�dddddd�}||}|jjd||g�\}}|dksBt|�dkrFdS|j|j|j|j|jd�}||}||�}	|d	kr�|j||	�r�dS|	S)
Nz-cz-kz-az-gz-l)rJrIrKrLrMrorrJ)r
rrr`rkr�r�r�r�)
rrdr �context2opt�opt�retr2Zcontext2parser�parserrnrrr�_get_device_parameters^s 
z&NetTuningPlugin._get_device_parametersc	Cs�|dkst|�dkrdS|j||�}|dks:|j||�r>iS|r�|j||||�|dkr�t|tt|���dkr�|j||jj	|�|�}|r�t|�dkr�t
jd|t|�f�dddd	d
d�}||}|jjd||g|jj	|�d
gd�|S)NrrM�n/armzsetting %s: %sz-Cz-Kz-Az-Gz-L)rJrIrKrLrMro�P)r�)r�rm)
r`rfr�r�r�next�iterr�r
�	dict2listrrrr)	rrdr2r rsr�rnr�r�rrr�_set_device_parametersps  $z&NetTuningPlugin._set_device_parameterscs�|j||d�}|r�|j||�}|dks2t|�dkr6dS|j|||||d���dks^t��dkrbdS�fdd�|j�D�}t|�}|r�|jj��|jj|�k}	|j||	�||d�|	S|j	j
|dj|jj|���n|j	j|�}
|j||
|d�dS)	N)Zcommand_nameZdevice_namerF)r�cs g|]\}}|�kr||f�qSrr)r"r�r2)�
params_setrrr$�sz6NetTuningPlugin._custom_parameters.<locals>.<listcomp>)r r\)
Z_storage_keyr�r`r��itemsrbr
r�Z_log_verification_resultZ_storager�join�get)rrd�startr2r �verifyZstorage_keyZparams_currentZrelevant_params_currentr��original_valuer)r�r�_custom_parameters�s:

z"NetTuningPlugin._custom_parametersrIcCs|jd||||�S)NrI)r�)rr�r2r r�ryrrr�	_features�szNetTuningPlugin._featuresrJcCs|jd||||�S)NrJ)r�)rr�r2r r�ryrrr�	_coalesce�szNetTuningPlugin._coalescerKcCs|jd||||�S)NrK)r�)rr�r2r r�ryrrr�_pause�szNetTuningPlugin._pauserLcCs|jd||||�S)NrL)r�)rr�r2r r�ryrrr�_ring�szNetTuningPlugin._ringrMcCs|jd||||�S)NrM)r�)rr�r2r r�ryrrr�	_channels�szNetTuningPlugin._channels)F)N)F)F)N)6�__name__�
__module__�__qualname__�__doc__r
r!r%r,r.r0r/�classmethodr>rBrCrFrPr7r8r9rZrUrfrkrlZcommand_setruZcommand_getr{r~rr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�Zcommand_customr�r�r�r�r��
__classcell__rr)rrrsd
	 



	
&r)r|r�rZ
decoratorsZ
tuned.logsZtunedZtuned.utils.nettoolrZtuned.utils.commandsr�osrZlogsr�rrpZPluginrrrrr�<module>s
site-packages/tuned/plugins/__pycache__/plugin_acpi.cpython-36.opt-1.pyc000064400000006545147511334670022200 0ustar003

�<�e�	�@sXddlmZddlTddlZddlZddlZddlmZej	j
�ZGdd�dej�Z
dS)�)�base)�*�N)�ACPI_DIRcsveZdZdZ�fdd�Zedd��Zdd�Zdd	�Zed
d��Z	edd
��Z
ed�dd��Ze
d�ddd��Z�ZS)�
ACPIPlugina>
	`acpi`::

	Configures the ACPI driver.
	+
	The only currently supported option is
	[option]`platform_profile`, which sets the ACPI
	platform profile sysfs attribute,
	a generic power/performance preference API for other drivers.
	Multiple profiles can be specified, separated by `|`.
	The first available profile is selected.
	+
	--
	.Selecting a platform profile
	====
	----
	[acpi]
	platform_profile=balanced|low-power
	----
	Using this option, *TuneD* will try to set the platform profile
	to `balanced`. If that fails, it will try to set it to `low-power`.
	====
	--
	cstt|�j||�dS)N)�superr�__init__)�self�args�kwargs)�	__class__��!/usr/lib/python3.6/plugin_acpi.pyr$szACPIPlugin.__init__cCsddiS)N�platform_profiler
)�clsr
r
r�_get_config_options'szACPIPlugin._get_config_optionscCsd|_d|_dS)NTF)Z_has_static_tuningZ_has_dynamic_tuning)r	�instancer
r
r�_instance_init+szACPIPlugin._instance_initcCsdS)Nr
)r	rr
r
r�_instance_cleanup/szACPIPlugin._instance_cleanupcCstjjtd�S)NZplatform_profile_choices)�os�path�joinr)rr
r
r�_platform_profile_choices_path2sz)ACPIPlugin._platform_profile_choices_pathcCstjjtd�S)Nr)rrrr)rr
r
r�_platform_profile_path6sz!ACPIPlugin._platform_profile_pathrcCs�tjj|j��stjd�dSdd�|jd�D�}t|jj	|j
��j��}xZ|D]R}||kr�|s�tjd|�|jj|j�||r�t
jgndd�|Stjd|�qPWtjd	�dS)
Nz5ACPI platform_profile is not supported on this systemcSsg|]}|j��qSr
)�strip)�.0�profiler
r
r�
<listcomp>?sz4ACPIPlugin._set_platform_profile.<locals>.<listcomp>�|z Setting platform_profile to '%s'F)Zno_errorz+Requested platform_profile '%s' unavailablezDFailed to set platform_profile. Is the value in the profile correct?)rr�isfiler�log�debug�split�set�_cmd�	read_filer�infoZ
write_to_file�errno�ENOENT�warn�error)r	ZprofilesZsim�removeZavail_profilesrr
r
r�_set_platform_profile:s


z ACPIPlugin._set_platform_profileFcCs2tjj|j��stjd�dS|jj|j��j�S)Nz5ACPI platform_profile is not supported on this system)	rrrrr r!r$r%r)r	Zignore_missingr
r
r�_get_platform_profileLs
z ACPIPlugin._get_platform_profile)F)�__name__�
__module__�__qualname__�__doc__r�classmethodrrrrrZcommand_setr,Zcommand_getr-�
__classcell__r
r
)rrrsr)�rZ
decoratorsrr'Z
tuned.logsZtunedZtuned.constsrZlogs�getr ZPluginrr
r
r
r�<module>s
site-packages/tuned/plugins/__pycache__/decorators.cpython-36.pyc000064400000002077147511334670021110 0ustar003

�<�e��@s*dddgZd	dd�Zdd�Zd
dd�ZdS)�command_set�command_get�command_customF�cs���fdd�}|S)Ncsd���d�|_|S)NT)�set�name�
per_device�priority)�_command)�method)rrr�� /usr/lib/python3.6/decorators.py�wrappers

zcommand_set.<locals>.wrapperr)rrrr
r)rrrrrs	cs�fdd�}|S)Ncsd�d�|_|S)NT)�getr)r	)r
)rrrr
!s
zcommand_get.<locals>.wrapperr)rr
r)rrr scs���fdd�}|S)Ncsd���d�|_|S)NT)Zcustomrrr)r	)r
)rrrrrr
*s

zcommand_custom.<locals>.wrapperr)rrrr
r)rrrrr)sN)Fr)Fr)�__all__rrrrrrr�<module>s

	site-packages/tuned/plugins/__pycache__/plugin_usb.cpython-36.pyc000064400000005716147511334670021115 0ustar003

�<�e��@sXddlmZddlTddlZddlmZddlZddlZej	j
�ZGdd�dej�Z
dS)�)�base)�*�N)�commandsc@sjeZdZdZdd�Zdd�Zedd��Zdd	�Zd
d�Z	dd
�Z
eddd�dd��Ze
d�ddd��ZdS)�	USBPlugina�
	`usb`::
	
	Sets autosuspend timeout of USB devices to the value specified by the
	[option]`autosuspend` option in seconds. If the [option]`devices`
	option is specified, the [option]`autosuspend` option applies to only
	the USB devices specified, otherwise it applies to all USB devices.
	+
	The value `0` means that autosuspend is disabled.
	+
	.To turn off USB autosuspend for USB devices `1-1` and `1-2`
	====
	----
	[usb]
	devices=1-1,1-2
	autosuspend=0
	----
	====
	cCsNd|_t�|_t�|_x*|jjd�jdd�D]}|jj|j�q,Wt	�|_
dS)NT�usbZDEVTYPEZ
usb_device)Z_devices_supported�setZ
_free_devicesZ_assigned_devices�_hardware_inventoryZget_devicesZmatch_property�addZsys_namer�_cmd)�self�device�r� /usr/lib/python3.6/plugin_usb.py�
_init_devicesszUSBPlugin._init_devicescs�fdd�|D�S)Ncsg|]}�jjd|��qS)r)r	Z
get_device)�.0�x)rrr�
<listcomp>*sz1USBPlugin._get_device_objects.<locals>.<listcomp>r)rZdevicesr)rr�_get_device_objects)szUSBPlugin._get_device_objectscCsddiS)N�autosuspendr)rrrr�_get_config_options,szUSBPlugin._get_config_optionscCsd|_d|_dS)NTF)Z_has_static_tuningZ_has_dynamic_tuning)r�instancerrr�_instance_init2szUSBPlugin._instance_initcCsdS)Nr)rrrrr�_instance_cleanup6szUSBPlugin._instance_cleanupcCsd|S)Nz)/sys/bus/usb/devices/%s/power/autosuspendr)rr
rrr�_autosuspend_sysfile9szUSBPlugin._autosuspend_sysfilerT)Z
per_devicecCsR|j|�}|dkrdS|rdnd}|sN|j|�}|jj|||rFtjgndd�|S)N�1�0F)�no_error)Z_option_boolrrZ
write_to_file�errno�ENOENT)r�valuer
Zsim�remove�enable�val�sys_filerrr�_set_autosuspend<s


zUSBPlugin._set_autosuspendFcCs|j|�}|jj||d�j�S)N)r)rrZ	read_file�strip)rr
Zignore_missingr$rrr�_get_autosuspendIs
zUSBPlugin._get_autosuspendN)F)�__name__�
__module__�__qualname__�__doc__rr�classmethodrrrrZcommand_setr%Zcommand_getr'rrrrr
s

r)�rZ
decoratorsZ
tuned.logsZtunedZtuned.utils.commandsrZglobrZlogs�get�logZPluginrrrrr�<module>s
site-packages/tuned/plugins/__pycache__/plugin_mounts.cpython-36.pyc000064400000012773147511334670021652 0ustar003

�<�e��@spddljZddlmZddlTddlmZmZddlZ	ddl
mZddlZe	j
j�Ze�ZGdd�dej�ZdS)	�N�)�base)�*)�Popen�PIPE)�commandsc@steZdZdZedd��Zdd�Zedd��Zdd	�Zd
d�Z	dd
�Z
dd�Zdd�Zdd�Z
eddd�dd��ZdS)�MountsPluginaP
	`mounts`::
	
	Enables or disables barriers for mounts according to the value of the
	[option]`disable_barriers` option. The [option]`disable_barriers`
	option has an optional value `force` which disables barriers even
	on mountpoints with write back caches. Note that only extended file
	systems (ext) are supported by this plug-in.
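	+
	A minimal profile section using this option might look as follows; the value
	`force` could be used instead to also disable barriers on mountpoints with
	write back caches:
	+
	.Disable barriers on supported mountpoints
	====
	----
	[mounts]
	disable_barriers=true
	----
	====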
	cCs�i}d}tdddgttddd�j�\}}x�dd�|j�D�D]�}t|�d	krNq<|dd	�\}}}t|�d	krt|d	nd}	t|�d
kr�|d
nd}
|dkr�|}q<|dks<|dkr�q<|
dks<|
dkr�q<|j|
t�||	d��||
dj|�q<W||_dS)z�
		Gets the information about disks, partitions and mountpoints. Stores information about used filesystem and
		creates a list of all underlying devices (in case of LVM) for each mountpoint.
		NZlsblkz-rnozTYPE,RM,KNAME,FSTYPE,MOUNTPOINTT)�stdout�stderrZ	close_fdsZuniversal_newlinescSsg|]}|j��qS�)�split)�.0�linerr�#/usr/lib/python3.6/plugin_mounts.py�
<listcomp>$sz>MountsPlugin._generate_mountpoint_topology.<locals>.<listcomp>��Zdisk�1�part�lvmz[SWAP])�disks�device_name�
filesystemr)rr)	rrZcommunicate�
splitlines�len�
setdefault�set�add�_mountpoint_topology)�clsZmountpoint_topologyZcurrent_diskr	r
�columnsZdevice_typeZdevice_removablerr�
mountpointrrr�_generate_mountpoint_topologys,z*MountsPlugin._generate_mountpoint_topologycCs*|j�d|_t|jj��|_t�|_dS)NT)r"Z_devices_supportedrr�keysZ
_free_devicesZ_assigned_devices)�selfrrr�
_init_devices;szMountsPlugin._init_devicescCsddiS)N�disable_barriersr)r$rrr�_get_config_optionsAsz MountsPlugin._get_config_optionscCsd|_d|_dS)NFT)Z_has_dynamic_tuningZ_has_static_tuning)r$�instancerrr�_instance_initGszMountsPlugin._instance_initcCsdS)Nr)r$r(rrr�_instance_cleanupKszMountsPlugin._instance_cleanupcCs,tjd|�}x|D]}tj|�j�SWdS)zV
		Get device cache type. This will work only for devices on SCSI kernel subsystem.
		z+/sys/block/%s/device/scsi_disk/*/cache_typeN)�glob�cmdZ	read_file�strip)r$�deviceZsource_filenamesZsource_filenamerrr�_get_device_cache_typeNs
z#MountsPlugin._get_device_cache_typecCs.x(|j|dD]}|j|�dkrdSqWdS)zr
		Checks if the device has 'write back' cache. If the cache type cannot be determined, asume some other cache.
		rz
write backTF)rr/)r$r!r.rrr�_mountpoint_has_writeback_cacheWsz,MountsPlugin._mountpoint_has_writeback_cachecCs�td��H}x@|D]4}|j�}|dddkr.q|d|kr|d}PqWdSWdQRX|jd�}xH|D]<}|jd�\}}	}
|d	ks�|d
kr�|
dkr�dS|d
krfd
SqfWd
SdS)zP
		Checks if a given mountpoint is mounted with barriers enabled or disabled.
		z/proc/mountsr�/rrN�,�=Z	nobarrierZbarrier�0FT)�openr�	partition)r$r!Zmounts_filerr Zoption_list�optionsZoption�name�sep�valuerrr�_mountpoint_has_barriers`s"



z%MountsPlugin._mountpoint_has_barrierscCsd|dd|g}tj|�dS)z
		Remounts partition.
		z/usr/bin/mountz-oz
remount,%sN)r,Zexecute)r$r6r7Zremount_commandrrr�_remount_partition}szMountsPlugin._remount_partitionr&T)Z
per_devicec
CsZ|jd|d�}t|�j�dk}|p*|j|�}|�r|s:dSd}|j|djd�sXd}nl|rn|j|�rnd}nV|j|�}	|	dkr�d}n>|	d	kr�|r�tj	t
j|�d
Sd}n|r�tjt
j
|�d	S|dk	r�tj	d||f�dS|jj||	�tj	d
|�|j|d�nJ|�rdS|jj|�}	|	dk�r0dStj	d|�|j|d�|jj|�dS)Nr&)Zcommand_namer�forcerZextzfilesystem not supportedzdevice uses write back cachezunknown current settingFTzbarriers already disabledz#not disabling barriers on '%s' (%s)zdisabling barriers on '%s'z	barrier=0zenabling barriers on '%s'z	barrier=1)Z_storage_key�str�lowerZ_option_boolr�
startswithr0r;�log�info�constsZSTR_VERIFY_PROFILE_OK�errorZSTR_VERIFY_PROFILE_FAILZ_storagerr<�getZunset)
r$�startr:r!ZverifyZignore_missingZstorage_keyr=Z
reject_reason�original_valuerrr�_disable_barriers�sN

zMountsPlugin._disable_barriersN)�__name__�
__module__�__qualname__�__doc__�classmethodr"r%r'r)r*r/r0r;r<Zcommand_customrHrrrrrs	$		r)Ztuned.constsrC�rZ
decorators�
subprocessrrZ
tuned.logsZtunedZtuned.utils.commandsrr+ZlogsrErAr,ZPluginrrrrr�<module>s

site-packages/tuned/plugins/__pycache__/plugin_vm.cpython-36.pyc000064400000010156147511334670020740 0ustar003

�<�e�
�@snddlmZddlTddlZddlZddlZddlZddlZddl	m
Z
ejj�Z
e
�ZGdd�dej�ZdS)�)�base)�*�N)�commandsc@s�eZdZdZedd��Zdd�Zdd�Zedd	��Ze	d
�dd��Z
e	d
�dd��Zed
�dd��Z
ed
�dd��Ze	d�dd��Zed�dd��ZdS)�VMPlugina|
	`vm`::
	
	Enables or disables transparent huge pages depending on the value of the
	[option]`transparent_hugepages` option. The option can have one of three
	possible values: `always`, `madvise` and `never`.
	+
	.Disable transparent hugepages
	====
	----
	[vm]
	transparent_hugepages=never
	----
	====
	+
	The [option]`transparent_hugepage.defrag` option specifies the
	defragmentation policy. Possible values for this option are `always`,
	`defer`, `defer+madvise`, `madvise` and `never`. For a detailed
	explanation of these values refer to
	link:https://www.kernel.org/doc/Documentation/vm/transhuge.txt[Transparent Hugepage Support].
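	+
	For example, a profile could request one of the listed policies as follows
	(the `madvise` value is only illustrative):
	+
	.Select a defragmentation policy
	====
	----
	[vm]
	transparent_hugepage.defrag=madvise
	----
	====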
	cCsdddd�S)N)�transparent_hugepages�transparent_hugepageztransparent_hugepage.defrag�)�selfr	r	�/usr/lib/python3.6/plugin_vm.py�_get_config_options%szVMPlugin._get_config_optionscCsd|_d|_dS)NTF)Z_has_static_tuningZ_has_dynamic_tuning)r
�instancer	r	r�_instance_init-szVMPlugin._instance_initcCsdS)Nr	)r
r
r	r	r�_instance_cleanup1szVMPlugin._instance_cleanupcCsd}tjj|�sd}|S)Nz#/sys/kernel/mm/transparent_hugepagez*/sys/kernel/mm/redhat_transparent_hugepage)�os�path�exists)r
rr	r	r�	_thp_path4szVMPlugin._thp_pathrcCs�|dkr"|stjdt|��dStjddd�}|jd�d	krP|sLtjd
�dStjj	|j
�d�}tjj|�r�|s�tj|||r�t
jgndd�|S|s�tjd
�dSdS)N�always�never�madvisez-Incorrect 'transparent_hugepages' value '%s'.z
/proc/cmdlineT)�no_errorztransparent_hugepage=rzWtransparent_hugepage is already set in kernel boot cmdline, ignoring value from profile�enabledFzDOption 'transparent_hugepages' is not supported on current hardware.)rrr)�log�warn�str�cmd�	read_file�find�inforr�joinrr�
write_to_file�errno�ENOENT)r
�value�sim�removeZcmdline�sys_filer	r	r�_set_transparent_hugepages<s$

z#VMPlugin._set_transparent_hugepagesrcCs|j|||�dS)N)r()r
r$r%r&r	r	r�_set_transparent_hugepageUsz"VMPlugin._set_transparent_hugepagecCs6tjj|j�d�}tjj|�r.tjtj|��SdSdS)Nr)rrr rrr�get_active_optionr)r
r'r	r	r�_get_transparent_hugepagesYsz#VMPlugin._get_transparent_hugepagescCs|j�S)N)r+)r
r	r	r�_get_transparent_hugepagebsz"VMPlugin._get_transparent_hugepageztransparent_hugepage.defragcCsXtjj|j�d�}tjj|�rB|s>tj|||r6tjgndd�|S|sPt	j
d�dSdS)N�defragF)rzJOption 'transparent_hugepage.defrag' is not supported on current hardware.)rrr rrrr!r"r#rr)r
r$r%r&r'r	r	r� _set_transparent_hugepage_defragfs
z)VMPlugin._set_transparent_hugepage_defragcCs6tjj|j�d�}tjj|�r.tjtj|��SdSdS)Nr-)rrr rrrr*r)r
r'r	r	r� _get_transparent_hugepage_defragssz)VMPlugin._get_transparent_hugepage_defragN)�__name__�
__module__�__qualname__�__doc__�classmethodrrrrZcommand_setr(r)Zcommand_getr+r,r.r/r	r	r	rrs	
r)�rZ
decoratorsZ
tuned.logsZtunedrr"�structZglobZtuned.utils.commandsrZlogs�getrrZPluginrr	r	r	r�<module>s
site-packages/tuned/plugins/__pycache__/plugin_cpu.cpython-36.opt-1.pyc000064400000064037147511334670022053 0ustar003

�<�e:n�@s�ddlmZddlTddlZddlmZddljZddl	Z	ddl
Z
ddlZddl
Z
ddlZddl
Z
ejj�ZdZGdd�dej�ZdS)	�)�hotplug)�*�N)�commandsz$/sys/devices/system/cpu/cpu0/cpuidlecs$eZdZdZ�fdd�Zdd�Zdd�Zedd	��Zd
d�Z	dd
�Z
dd�Zdd�Zdd�Z
dd�Zdd�Zdd�Zdd�Zdd�Zdd�Zd d!�Zd"d#�Z�fd$d%�Zejf�fd&d'�	Zd(d)�Zd*d+�Zd,d-�Zd.d/�Zd0d1�Zdmd3d4�Zdnd5d6�Z dod7d8�Z!d9d:�Z"d;d<�Z#e$d=d>d?�d@dA��Z%e&d=�dpdBdC��Z'dqdEdF�Z(e$dGd>dHdI�dJdK��Z)e&dG�drdLdM��Z*dNdO�Z+dsdPdQ�Z,dRdS�Z-e$dTd>d?�dUdV��Z.dWdX�Z/dYdZ�Z0d[d\�Z1e&dT�dtd]d^��Z2d_d`�Z3dadb�Z4e$dcd>d?�ddde��Z5e&dc�dudfdg��Z6e$dhd>d?�didj��Z7e&dh�dvdkdl��Z8�Z9S)w�CPULatencyPlugina�
	`cpu`::
	
	Sets the CPU governor to the value specified by the [option]`governor`
	option and dynamically changes the Power Management Quality of
	Service (PM QoS) CPU Direct Memory Access (DMA) latency according
	to the CPU load.
	
	`governor`:::
	The [option]`governor` option of the 'cpu' plug-in supports specifying
	CPU governors. Multiple governors are separated using '|'. The '|'
	character is meant to represent a logical 'or' operator. Note that the
	same syntax is used for the [option]`energy_perf_bias` option. *TuneD*
	will set the first governor that is available on the system.
	+    
	For example, with the following profile, *TuneD* will set the 'ondemand'
	governor, if it is available. If it is not available, but the 'powersave'
	governor is available, 'powersave' will be set. If neither of them are
	available, the governor will not be changed.
	+
	.Specifying a CPU governor
	====
	----
	[cpu]
	governor=ondemand|powersave
	----
	====
	
	`sampling_down_factor`:::
	The sampling rate determines how frequently the governor checks
	to tune the CPU. The [option]`sampling_down_factor` is a tunable
	that multiplies the sampling rate when the CPU is at its highest
	clock frequency thereby delaying load evaluation and improving
	performance. Allowed values for sampling_down_factor are 1 to 100000.
	+
	.The recommended setting for jitter reduction
	====
	----
	[cpu]
	sampling_down_factor = 100
	----
	====
	
	`energy_perf_bias`:::
	[option]`energy_perf_bias` supports managing energy
	vs. performance policy via x86 Model Specific Registers using the
	`x86_energy_perf_policy` tool. Multiple alternative Energy Performance
	Bias (EPB) values are supported. The alternative values are separated
	using the '|' character. The following EPB values are supported
	starting with kernel 4.13: "performance", "balance-performance",
	"normal", "balance-power" and "power". On newer processors the value is
	written straight to the corresponding `sysfs` file (see rhbz#2095829).
	+
	.Specifying alternative Energy Performance Bias values
	====
	----
	[cpu]
	energy_perf_bias=powersave|power
	----
	
	*TuneD* will try to set EPB to 'powersave'. If that fails, it will
	try to set it to 'power'.
	====
	
	`energy_performance_preference`:::
	[option]`energy_performance_preference` supports managing energy
	vs. performance hints on newer Intel and AMD processors with active P-State
	CPU scaling drivers (intel_pstate or amd-pstate). Multiple alternative
	Energy Performance Preferences (EPP) values are supported. The alternative
	values are separated using the '|' character. Available values can be found
	in the `energy_performance_available_preferences` file in the `CPUFreq`
	policy directory in `sysfs`.
	+
	.Specifying alternative Energy Performance Hints values
	====
	----
	[cpu]
	energy_performance_preference=balance_power|power
	----
	
	*TuneD* will try to set EPP to 'balance_power'. If that fails, it will
	try to set it to 'power'.
	====
	
	`latency_low, latency_high, load_threshold`:::
	+
	If the CPU load is lower than the value specified by
	the [option]`load_threshold` option, the latency is set to the value
	specified either by the [option]`latency_high` option or by the
	[option]`latency_low` option.
	+
	`force_latency`:::
	You can also force the latency to a specific value and prevent it from
	dynamically changing further. To do so, set the [option]`force_latency`
	option to the required latency value.
	+
	The maximum latency value can be specified in several ways:
	+
	 * by a numerical value in microseconds (for example, `force_latency=10`)
	 * as the kernel CPU idle level ID of the maximum C-state allowed
	   (for example, force_latency = cstate.id:1)
	 * as a case sensitive name of the maximum C-state allowed
	   (for example, force_latency = cstate.name:C1)
	 * by using 'None' as a fallback value to prevent errors when alternative
	   C-state IDs/names do not exist. When 'None' is used in the alternatives
	   pipeline, all the alternatives that follow 'None' are ignored.
	+
	It is also possible to specify multiple fallback values separated by '|' as
	the C-state names and/or IDs may not be available on some systems.
	+
	.Specifying fallback C-state values
	====
	----
	[cpu]
	force_latency=cstate.name:C6|cstate.id:4|10
	----
	This configuration tries to obtain and set the latency of the C-state named C6.
	If the C-state C6 does not exist, kernel CPU idle level ID 4 (state4) latency
	is searched for in sysfs. Finally, if the state4 directory in sysfs is not found,
	the last latency fallback value is `10` us. The value is encoded and written into
	the kernel's PM QoS file `/dev/cpu_dma_latency`.
	====
	+
	.Specifying fallback C-state values using 'None'.
	====
	----
	[cpu]
	force_latency=cstate.name:XYZ|None
	----
	In this case, if a C-state with the name `XYZ` does not exist,
	no [option]`force_latency` value will be written into the
	kernel's PM QoS file, and no errors will be reported due to the
	presence of 'None'.
	====
	
	`min_perf_pct, max_perf_pct, no_turbo`:::
	These options set the internals of the Intel P-State driver exposed via the kernel's
	`sysfs` interface.
	+
	.Adjusting the configuration of the Intel P-State driver
	====
	----
	[cpu]
	min_perf_pct=100
	----
	Limit the minimum P-State that will be requested by the driver. It is
	expressed as a percentage of the maximum (non-turbo) performance level.
	====
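	+
	The [option]`no_turbo` option is not illustrated above; a minimal sketch
	(assuming the `intel_pstate` driver, where writing `1` to `no_turbo`
	disables turbo frequencies) could look like:
	+
	----
	[cpu]
	no_turbo=1
	----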
	+
	`pm_qos_resume_latency_us`:::
	This option allows setting a specific resume latency for all CPUs or for specific ones.
	====
	----
	[cpu]
	pm_qos_resume_latency_us=n/a
	----
	Special value that disables C-states completely.
	====
	----
	[cpu]
	pm_qos_resume_latency_us=0
	----
	Allows all C-states.
	====
	----
	[cpu]
	pm_qos_resume_latency_us=100
	----
	Allows any C-state with a resume latency lower than the specified value.
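	====
	+
	The dynamic behaviour controlled by [option]`load_threshold`,
	[option]`latency_low` and [option]`latency_high` is not illustrated
	above; a minimal sketch (the numeric values are illustrative assumptions,
	not recommended defaults) could look like:
	+
	----
	[cpu]
	load_threshold=0.2
	latency_low=100
	latency_high=1000
	----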
	csrtt|�j||�d|_d|_d|_d|_d|_d|_d|_	d|_
d|_d|_d|_
d|_i|_t�|_d|_dS)NT�x86_64F)�superr�__init__�_has_pm_qos�_archZ_is_x86�	_is_intel�_is_amd�_has_energy_perf_bias�_has_intel_pstate�_has_amd_pstate�_has_pm_qos_resume_latency_us�_min_perf_pct_save�_max_perf_pct_save�_no_turbo_save�_governors_mapr�_cmd�_flags)�self�args�kwargs)�	__class__�� /usr/lib/python3.6/plugin_cpu.pyr	�s zCPULatencyPlugin.__init__cCs>d|_t�|_x"|jjd�D]}|jj|j�qWt�|_dS)NT�cpu)Z_devices_supported�setZ
_free_devices�_hardware_inventoryZget_devices�addZsys_nameZ_assigned_devices)r�devicerrr�
_init_devices�s
zCPULatencyPlugin._init_devicescs�fdd�|D�S)Ncsg|]}�jjd|��qS)r)r Z
get_device)�.0�x)rrr�
<listcomp>�sz8CPULatencyPlugin._get_device_objects.<locals>.<listcomp>r)rZdevicesr)rr�_get_device_objects�sz$CPULatencyPlugin._get_device_objectsc
Csddddddddddddd�S)Ng�������?�di�)�load_threshold�latency_low�latency_high�
force_latency�governor�sampling_down_factor�energy_perf_bias�min_perf_pct�max_perf_pct�no_turbo�pm_qos_resume_latency_us�energy_performance_preferencer)rrrr�_get_config_options�sz$CPULatencyPlugin._get_config_optionscCs�dddddg}tj�|_|j|krttj�}|jjd�}|dkrFd|_n|d	ksV|d
kr^d|_nd|_t	j
d|�nt	j
d|j�|jr�|j�|j�|jr�|j
�dS)
NrZi686Zi585Zi486Zi386Z	vendor_idZGenuineIntelTZAuthenticAMDZHygonGenuinez$We are running on an x86 %s platformzWe are running on %s (non x86))�platform�machiner�procfs�cpuinfo�tags�getrr
�log�info�_check_energy_perf_bias�_check_intel_pstate�_check_amd_pstate)rZintel_archsrZvendorrrr�_check_arch�s"

zCPULatencyPlugin._check_archcCsbd|_d}|jjddgtj|gd�\}}|dkr@|dkr@d|_n|dkrTtjd	�n
tjd
�dS)NFr�x86_energy_perf_policyz-r)Z	no_errorsr�Tzgunable to run x86_energy_perf_policy tool, ignoring CPU energy performance bias, is the tool installed?zXyour CPU doesn't support MSR_IA32_ENERGY_PERF_BIAS, ignoring CPU energy performance bias)rr�execute�errno�ENOENTr<�warning)rZretcode_unsupported�retcode�outrrrr>sz(CPULatencyPlugin._check_energy_perf_biascCs"tjjd�|_|jrtjd�dS)Nz$/sys/devices/system/cpu/intel_pstatezintel_pstate detected)�os�path�existsrr<r=)rrrrr?sz$CPULatencyPlugin._check_intel_pstatecCs"tjjd�|_|jrtjd�dS)Nz"/sys/devices/system/cpu/amd_pstatezamd-pstate detected)rJrKrLrr<r=)rrrrr@#sz"CPULatencyPlugin._check_amd_pstatecCs$|jdkrtj�jjdg�|_|jS)N�flags)rr8r9r:r;)rrrr�_get_cpuinfo_flags(s
z#CPULatencyPlugin._get_cpuinfo_flagscCs t|�}|jjt|�jdd��S)NrrC)�strrZ
is_cpu_online�replace)rr"Zsdrrr�_is_cpu_online-szCPULatencyPlugin._is_cpu_onlinecCstjjd|�S)Nz3/sys/devices/system/cpu/%s/cpufreq/scaling_governor)rJrKrL)rr"rrr�_cpu_has_scaling_governor1sz*CPULatencyPlugin._cpu_has_scaling_governorcCs<|j|�stjd|�dS|j|�s8tjd|�dSdS)Nz'%s' is not online, skippingFz.there is no scaling governor fo '%s', skippingT)rQr<�debugrR)rr"rrr�_check_cpu_can_change_governor4s

z/CPULatencyPlugin._check_cpu_can_change_governorcCs�d|_d|_t|jj��d|kr�d|_ytjtj	tj
�|_Wn*tk
rht
jdtj	�d|_YnXd|_|jddkr�|jddkr�|jjdd�|_d|_nd|_|j�nd|_t
jd|j�yt|j�d|_Wntk
r�d|_YnXdS)	NTFrz-Unable to open '%s', disabling PM_QoS controlr,r3�loadzILatency settings from non-first CPU plugin instance '%s' will be ignored.)Z_has_static_tuningZ_has_dynamic_tuning�listZ
_instances�values�_first_instancerJ�open�constsZPATH_CPU_DMA_LATENCY�O_WRONLY�_cpu_latency_fd�OSErrorr<�errorr
�_latency�options�_monitors_repositoryZcreate�
_load_monitorrAr=�nameZassigned_devices�
_first_device�
IndexError)r�instancerrr�_instance_init=s*
zCPULatencyPlugin._instance_initcCs4|jr0|jrtj|j�|jdk	r0|jj|j�dS)N)rXr
rJ�closer\rbra�delete)rrfrrr�_instance_cleanup[s

z"CPULatencyPlugin._instance_cleanupcCs|jjd|d�j�S)Nz'/sys/devices/system/cpu/intel_pstate/%s)r�	read_file�strip)r�attrrrr�_get_intel_pstate_attrbsz'CPULatencyPlugin._get_intel_pstate_attrcCs|dk	r|jjd||�dS)Nz'/sys/devices/system/cpu/intel_pstate/%s)r�
write_to_file)rrm�valrrr�_set_intel_pstate_attresz'CPULatencyPlugin._set_intel_pstate_attrcCs&|dkrdS|j|�}|j||�|S)N)rnrq)rrm�value�vrrr�_getset_intel_pstate_attris

z*CPULatencyPlugin._getset_intel_pstate_attrcs�tt|�j|�|jsdS|jj|jd�}|dk	r>|j|�|jr�|jj|jd�}|j	d|�|_
|jj|jd�}|j	d|�|_|jj|jd�}|j	d|�|_dS)Nr,r0r1r2)
rr�_instance_apply_staticrXZ
_variables�expandr`�_set_latencyrrtrrr)rrfZforce_latency_valueZ	new_value)rrrrups(


z'CPULatencyPlugin._instance_apply_staticcsLtt|�j||�|jrH|jrH|jd|j�|jd|j�|jd|j�dS)Nr0r1r2)	rr�_instance_unapply_staticrXrrqrrr)rrfZrollback)rrrrx�s
z)CPULatencyPlugin._instance_unapply_staticcCs|j||�dS)N)�_instance_update_dynamic)rrfr"rrr�_instance_apply_dynamic�sz(CPULatencyPlugin._instance_apply_dynamiccCsP||jkrdS|jj�d}||jdkr<|j|jd�n|j|jd�dS)N�systemr)r+r*)rdrbZget_loadr`rw)rrfr"rUrrrry�s
z)CPULatencyPlugin._instance_update_dynamiccCsdS)Nr)rrfr"rrr�_instance_unapply_dynamic�sz*CPULatencyPlugin._instance_unapply_dynamiccCs&yt|�Sttfk
r dSXdS)N)�int�
ValueError�	TypeError)r�srrr�_str2int�szCPULatencyPlugin._str2intcCs�i|_xztjt�D]l}td|}|jj|dddd�}|jj|dddd�}|dk	r|dk	r|j|�}|dk	r||j|j�<qWdS)Nz/%s/rcT)�err_ret�no_error�latency)�cstates_latencyrJ�listdir�cpuidle_states_pathrrkr�rl)r�dZcstate_pathrcr�rrr�_read_cstates_latency�s
z&CPULatencyPlugin._read_cstates_latencyFcCshtjd|�|jdkr*tjd�|j�|jj|d�}|rR|dkrRtjd�dStjdt|��|S)Nz)getting latency for cstate with name '%s'zreading cstates latency tablerz"skipping latency 0 as set by paramz!cstate name mapped to latency: %s)r<rSr�r�r;rO)rrc�no_zeror�rrr�_get_latency_by_cstate_name�s


z,CPULatencyPlugin._get_latency_by_cstate_namecCs�tjdt|��|j|�}|dkr2tjd�dStdd|}|j|jj|ddd��}|rt|dkrttjd�dStjd	t|��|S)
Nz'getting latency for cstate with ID '%s'zcstate ID is invalidz/%s/latencyzstate%dT)r�r�rz"skipping latency 0 as set by paramzcstate ID mapped to latency: %s)r<rSrOr�r�rrk)rZlidr�Zlatency_pathr�rrr�_get_latency_by_cstate_id�s


z*CPULatencyPlugin._get_latency_by_cstate_idcCs`d|_t|�jd�}tjd||f��x.|D�]$}yt|�}tjd|�W�n�tk
�rH|dd�dkr�|j|dd�dd�}n�|dd	�d
kr�|j|d	d��}n�|dd�dkr�|j|dd�dd�}nn|dd
�dkr�|j|d
d��}nJ|dk�rtjd�dS|�r.|dk�r.tjd�ntjdt|��d}YnX|dk	r.Pq.W|dfS)N�|z#parsing latency '%s', allow_na '%s'z+parsed directly specified latency value: %dr�zcstate.id_no_zero:T)r��
z
cstate.id:�zcstate.name_no_zero:�zcstate.name:�none�Nonezlatency 'none' specifiedzn/azlatency 'n/a' specifiedzinvalid latency specified: '%s'F)r�r�)NT)	r�rO�splitr<rSr}r~r�r�)rr��allow_naZ	latenciesrrr�_parse_latency�s6



zCPULatencyPlugin._parse_latencycCsp|j|�\}}|rl|jrl|dkr4tjd�d|_n8|j|krltjd|�tjd|�}tj	|j
|�||_dS)Nztunable to evaluate latency value (probably wrong settings in the 'cpu' section of current profile), disabling PM QoSFzsetting new cpu latency %d�i)r�r
r<r^r_r=�struct�packrJ�writer\)rr��skipZlatency_binrrrrw�s

zCPULatencyPlugin._set_latencycCs|jjd|�j�j�S)Nz>/sys/devices/system/cpu/%s/cpufreq/scaling_available_governors)rrkrlr�)rr"rrr�_get_available_governors�sz)CPULatencyPlugin._get_available_governorsr-T)�
per_devicecCs�|j|�sdSt|�}|jd�}dd�|D�}x&|D]}t|�dkr4tjd�dSq4W|j|�}x~|D]^}||kr�|s�tjd||f�|jj	d|||r�t
jgndd	�Pqf|sftjd
||f�qfWtj
ddj|��d}|S)
Nr�cSsg|]}|j��qSr)rl)r$r-rrrr&sz2CPULatencyPlugin._set_governor.<locals>.<listcomp>rz.The 'governor' option contains an empty value.z!setting governor '%s' on cpu '%s'z3/sys/devices/system/cpu/%s/cpufreq/scaling_governorF)r�z7Ignoring governor '%s' on cpu '%s', it is not supportedz.None of the scaling governors is supported: %sz, )rTrOr��lenr<r^r�r=rrorErFrS�warn�join)rZ	governorsr"�sim�remover-Zavailable_governorsrrr�
_set_governor�s2





zCPULatencyPlugin._set_governorcCsTd}|j|�sdS|jjd||d�j�}t|�dkr:|}|dkrPtjd|�|S)Nz3/sys/devices/system/cpu/%s/cpufreq/scaling_governor)r�rz*could not get current governor on cpu '%s')rTrrkrlr�r<r^)rr"�ignore_missingr-�datarrr�
_get_governors
zCPULatencyPlugin._get_governor�ondemandcCsd|S)Nz7/sys/devices/system/cpu/cpufreq/%s/sampling_down_factorr)rr-rrr�_sampling_down_factor_path%sz+CPULatencyPlugin._sampling_down_factor_pathr.r�)r�ZprioritycCs�d}||jkr|jj�d|j|<|j|�}|dkrFtjd|�dS|t|jj��kr�||j|<|j|�}tj	j
|�s�tjd||f�dSt|�}|s�tjd||f�|j
j|||r�tjgndd�|S)NzIignoring sampling_down_factor setting for CPU '%s', cannot match governorzTignoring sampling_down_factor setting for CPU '%s', governor '%s' doesn't support itz6setting sampling_down_factor to '%s' for governor '%s'F)r�)r�clearr�r<rSrVrWr�rJrKrLrOr=rrorErF)rr.r"r�r�rpr-rKrrr�_set_sampling_down_factor(s&





z*CPULatencyPlugin._set_sampling_down_factorcCsD|j||d�}|dkrdS|j|�}tjj|�s4dS|jj|�j�S)N)r�)r�r�rJrKrLrrkrl)rr"r�r-rKrrr�_get_sampling_down_factorCs
z*CPULatencyPlugin._get_sampling_down_factorcCs*|jjdd|t|�gdd�\}}}||fS)NrBz-cT)Z
return_err)rrDrO)r�cpu_idrrrHrI�err_msgrrr�_try_set_energy_perf_biasMsz*CPULatencyPlugin._try_set_energy_perf_biascCsd||rdndfS)Nz>/sys/devices/system/cpu/cpufreq/policy%s/energy_performance_%sZavailable_preferencesZ
preferencer)rr�Z	availablerrr�_pstate_preference_pathVsz(CPULatencyPlugin._pstate_preference_pathcCsd|S)Nz4/sys/devices/system/cpu/cpu%s/power/energy_perf_biasr)rr�rrr�_energy_perf_bias_pathYsz'CPULatencyPlugin._energy_perf_bias_pathr/cCs||j|�stjd|�dS|jd�}|jd�}tj|j�kr�|j|�}t	j
j|�r�|s�xT|D]>}|j�}|j
j|||r�tjgndd�r^tjd||f�Pq^Wtjd|�t|�Stjd|�dSn�|j�rt|�slx�|D]|}|j�}tjd	||f�|j||�\}	}
|	d
k�r,tjd||f�Pq�|	d
k�rHtjd|
�Pq�tjd||f�q�Wtjd|�t|�SdSdS)
Nz%s is not online, skippingrr�F)r�z5energy_perf_bias successfully set to '%s' on cpu '%s'zPFailed to set energy_perf_bias on cpu '%s'. Is the value in the profile correct?zXFailed to set energy_perf_bias on cpu '%s' because energy_perf_bias file does not exist.z2Trying to set energy_perf_bias to '%s' on cpu '%s'rz"Failed to set energy_perf_bias: %szHCould not set energy_perf_bias to '%s' on cpu '%s', trying another value)rQr<rS�lstripr�rZ�CFG_CPU_EPP_FLAGrNr�rJrKrLrlrrorErFr=r^rOrr�)rr/r"r�r�r��vals�energy_perf_bias_pathrprHr�rrr�_set_energy_perf_bias\sX








z&CPULatencyPlugin._set_energy_perf_biascCsjyt|�}WnXtk
rd}z<yt|d�}Wn&tk
rR}z
|}WYdd}~XnXWYdd}~XnX|S)N�)r}r~)rr�rs�errr�_try_parse_num�s(zCPULatencyPlugin._try_parse_numcCsdddd�j|j|�|�S)N�performance�normalZ	powersave)r��)r;r�)rr�rrr�_energy_perf_policy_to_human�sz-CPULatencyPlugin._energy_perf_policy_to_humancCsdddddd�j|j|�|�S)Nr�zbalance-performancer�z
balance-powerZpower)r�r��r�)r;r�)rr�rrr�_energy_perf_policy_to_human_v2�sz0CPULatencyPlugin._energy_perf_policy_to_human_v2c
Cs�d}|j|�s tjd|�dS|jd�}tj|j�krb|j|�}tj	j
|�r�|j|jj
|��}nz|jr�|jjdd|dg�\}}|dkr�xR|j�D]F}|j�}	t|	�dkr�|j|	d�}Pq�t|	�d	kr�|j|	d�}Pq�W|S)
Nz%s is not online, skippingrrBz-cz-rr�r�)rQr<rSr�rZr�rNr�rJrKrLr�rrkrrD�
splitlinesr�r�r�)
rr"r�r/r�r�rH�lines�line�lrrr�_get_energy_perf_bias�s*


z&CPULatencyPlugin._get_energy_perf_biascCsd|S)Nz9/sys/devices/system/cpu/%s/power/pm_qos_resume_latency_usr)rr"rrr�_pm_qos_resume_latency_us_path�sz/CPULatencyPlugin._pm_qos_resume_latency_us_pathcCs4|jdkr.tjj|j|��|_|js.tjd�|jS)NzGOption 'pm_qos_resume_latency_us' is not supported on current hardware.)rrJrKrLr�r<r=)rr"rrr�_check_pm_qos_resume_latency_us�s


z0CPULatencyPlugin._check_pm_qos_resume_latency_usr3cCs�|j|�stjd|�dS|j|dd�\}}|s>|j|�rBdS|dksZ|dkrp|dkrptjd||f�dS|s�|jj|j|�||r�t	j
gndd�|S)	Nz%s is not online, skippingT)r�zn/arz<Invalid pm_qos_resume_latency_us specified: '%s', cpu: '%s'.F)r�)rQr<rSr�r�rGrror�rErF)rr3r"r�r�r�r�rrr�_set_pm_qos_resume_latency_us�s
z.CPULatencyPlugin._set_pm_qos_resume_latency_uscCsD|j|�stjd|�dS|j|�s*dS|jj|j|�|d�j�S)Nz%s is not online, skipping)r�)rQr<rSr�rrkr�rl)rr"r�rrr�_get_pm_qos_resume_latency_us�s

z.CPULatencyPlugin._get_pm_qos_resume_latency_usr4c	Cs�|j|�stjd|�dS|jd�}tjj|j|d��r�|jd�}|s�t	|j
j|j|d��j��}xn|D]X}||kr�|j
j|j|�||r�t
jgndd�tjd||f�Pqjtjd||f�qjWtjd	|�t|�Stjd
�dS)Nz%s is not online, skippingrTr�F)r�z=Setting energy_performance_preference value '%s' for cpu '%s'zAenergy_performance_preference value '%s' unavailable for cpu '%s'z]Failed to set energy_performance_preference on cpu '%s'. Is the value in the profile correct?zyenergy_performance_available_preferences file missing, which can happen if the system is booted without a P-state driver.)rQr<rSr�rJrKrLr�r�rrrkrorErFr=r�r^rO)	rr4r"r�r�r�r�Z
avail_valsrprrr�"_set_energy_performance_preference�s(




z3CPULatencyPlugin._set_energy_performance_preferencecCs^|j|�stjd|�dS|jd�}tjj|j|d��rP|jj	|j|��j
�Stjd�dS)Nz%s is not online, skippingrTzyenergy_performance_available_preferences file missing, which can happen if the system is booted without a P-state driver.)rQr<rSr�rJrKrLr�rrkrl)rr"r�r�rrr�"_get_energy_performance_preferences


z3CPULatencyPlugin._get_energy_performance_preference)F)F)F)F)r�)F)F)F)F)F):�__name__�
__module__�__qualname__�__doc__r	r#r'�classmethodr5rAr>r?r@rNrQrRrTrgrjrnrqrtrurZZ
ROLLBACK_SOFTrxrzryr|r�r�r�r�r�rwr�Zcommand_setr�Zcommand_getr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r��
__classcell__rr)rrrsn,		



		
8r)rCrZ
decoratorsZ
tuned.logsZtunedZtuned.utils.commandsrZtuned.constsrZrJrEr�r6r8Zlogsr;r<r�ZPluginrrrrr�<module>s

site-packages/tuned/plugins/__pycache__/plugin_eeepc_she.cpython-36.opt-1.pyc000064400000006666147511334670023210 0ustar003

�<�e��@sTddlmZddlmZddlZddlmZddlZejj	�Z
Gdd�dej�ZdS)�)�base)�
exceptions�N)�commandscsTeZdZdZ�fdd�Zedd��Zdd�Zdd	�Zd
d�Z	dd
�Z
dd�Z�ZS)�EeePCSHEPlugina�
	`eeepc_she`::
	
	Dynamically sets the front-side bus (FSB) speed according to the
	CPU load. This feature can be found on some netbooks and is also
	known as the Asus Super Hybrid Engine. If the CPU load is lower or
	equal to the value specified by the [option]`load_threshold_powersave`
	option, the plug-in sets the FSB speed to the value specified by the
	[option]`she_powersave` option. If the CPU load is higher or
	equal to the value specified by the [option]`load_threshold_normal`
	option, it sets the FSB speed to the value specified by the
	[option]`she_normal` option. Static tuning is not supported and the
	plug-in is transparently disabled if the hardware support for this
	feature is not detected.
	
	NOTE: For details about the FSB frequencies and corresponding values, see
	link:https://www.kernel.org/doc/Documentation/ABI/testing/sysfs-platform-eeepc-laptop[the kernel documentation].
	The provided defaults should work for most users.
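	The options named above are set in the `[eeepc_she]` profile section; a
	minimal sketch (all values are illustrative assumptions; consult the
	kernel documentation linked above for the valid FSB mode values) could
	look like:
	----
	[eeepc_she]
	load_threshold_powersave=0.2
	load_threshold_normal=0.7
	she_powersave=2
	she_normal=0
	----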
	csPt�|_d|_tjj|j�s"d|_tjj|j�s:tjd��tt	|�j
||�dS)Nz!/sys/devices/platform/eeepc/cpufvz%/sys/devices/platform/eeepc-wmi/cpufvz)Plugin is not supported on your hardware.)r�_cmd�
_control_file�os�path�isfilerZNotSupportedPluginException�superr�__init__)�self�args�kwargs)�	__class__��&/usr/lib/python3.6/plugin_eeepc_she.pyr
s
zEeePCSHEPlugin.__init__cCsddddd�S)Ng333333�?g�������?�r)�load_threshold_normal�load_threshold_powersaveZ
she_powersaveZ
she_normalr)rrrr�_get_config_options'sz"EeePCSHEPlugin._get_config_optionscCs&d|_d|_d|_|jjdd�|_dS)NFT�load)Z_has_static_tuningZ_has_dynamic_tuning�	_she_mode�_monitors_repositoryZcreate�
_load_monitor)r�instancerrr�_instance_init0szEeePCSHEPlugin._instance_initcCs"|jdk	r|jj|j�d|_dS)N)rr�delete)rrrrr�_instance_cleanup6s
z EeePCSHEPlugin._instance_cleanupcCsH|jj�d}||jdkr*|j|d�n||jdkrD|j|d�dS)N�systemrZ	powersaver�normal)rZget_load�options�
_set_she_mode)rr�devicerrrr�_instance_update_dynamic;s
z'EeePCSHEPlugin._instance_update_dynamiccCs|j|d�dS)Nr!)r#)rrr$rrr�_instance_unapply_dynamicBsz(EeePCSHEPlugin._instance_unapply_dynamiccCsLt|jd|�}|j|krHtjd||f�|jj|jd|�||_dS)Nzshe_%sznew eeepc_she mode %s (%d) z%s)�intr"r�log�inforZ
write_to_filer)rrZnew_modeZnew_mode_numericrrrr#Fs

zEeePCSHEPlugin._set_she_mode)
�__name__�
__module__�__qualname__�__doc__r
�classmethodrrrr%r&r#�
__classcell__rr)rrr	s		r)
�rrZ
tuned.logsZtunedZtuned.utils.commandsrr	Zlogs�getr(ZPluginrrrrr�<module>s
site-packages/tuned/plugins/__pycache__/plugin_sysctl.cpython-36.opt-1.pyc000064400000015310147511334670022573 0ustar003

�<�e��@s�ddlZddlmZddlTddlZddlTddlmZddl	j
Z
ddlZddlZej
j�ZddgZdd	gZGd
d�dej�ZdS)�N�)�base)�*)�commandsZbase_reachable_timeZretrans_timez
/run/sysctl.dz
/etc/sysctl.dcs�eZdZdZ�fdd�Zdd�Zdd�Zdd	�Zd
d�Ze	j
fdd
�Zdd�Zdd�Z
dd�Zdd�Zdd�Zddd�Z�ZS)�SysctlPluginaI
	`sysctl`::
	
	Sets various kernel parameters at runtime.
	+
	This plug-in is used for applying custom `sysctl` settings and should
	only be used to change system settings that are not covered by other
	*TuneD* plug-ins. If the settings are covered by other *TuneD* plug-ins,
	use those plug-ins instead.
	+
	The syntax for this plug-in is
	`_key_=_value_`, where
	`_key_` is the same as the key name provided by the
	`sysctl` utility.
	+
	.Adjusting the kernel.sched_min_granularity_ns value at runtime
	====
	----
	[sysctl]
	kernel.sched_min_granularity_ns=3000000
	----
	====
	cs$tt|�j||�d|_t�|_dS)NT)�superr�__init__Z_has_dynamic_optionsr�_cmd)�self�args�kwargs)�	__class__��#/usr/lib/python3.6/plugin_sysctl.pyr*szSysctlPlugin.__init__cCshd|_d|_|j|j�}|jj|i�|_t|j�dkr\tj	d�|j
|�i|_|jj|�|j|_
dS)NFTrz0recovering old sysctl settings from previous run)Z_has_dynamic_tuningZ_has_static_tuning�_storage_key�name�_storage�get�_sysctl_original�len�log�info�_instance_unapply_static�unsetZoptions�_sysctl)r
�instance�storage_keyrrr�_instance_init/s

zSysctlPlugin._instance_initcCs|j|j�}|jj|�dS)N)rrrr)r
rrrrr�_instance_cleanup?szSysctlPlugin._instance_cleanupcCs�xzt|jj��D]h\}}|j|�}|dkr:tjd|�q|jj|jj	|��}|j
||�}|dk	r||j|<|j||�qW|j
|j�}|jj||j�|jjtjtj�r�tjd�|j|j�dS)NzDsysctl option %s will not be set, failed to read the original value.zreapplying system sysctl)�listr�items�_read_sysctlr�error�
_variables�expandr	Zunquote�_process_assignment_modifiersr�
_write_sysctlrrr�setZ_global_cfgZget_bool�constsZCFG_REAPPLY_SYSCTLZCFG_DEF_REAPPLY_SYSCTLr�_apply_system_sysctl)r
r�option�value�original_valueZ	new_valuerrrr�_instance_apply_staticCs"



z#SysctlPlugin._instance_apply_staticcCsvd}d}xht|jj��D]V\}}|j|�}|j|jj|�|�}|dk	r|j||jj	|�|jj	|�|�dkrd}qW|S)NTF)
rrr r!r%r#r$Z
_verify_valuer	Z	remove_ws)r
r�ignore_missingZdevices�retr*r+Zcurr_valrrr�_instance_verify_staticYs
$z$SysctlPlugin._instance_verify_staticcCs,x&t|jj��D]\}}|j||�qWdS)N)rrr r&)r
rZrollbackr*r+rrrresz%SysctlPlugin._instance_unapply_staticc
Cs�i}x\tD]T}ytj|�}Wntk
r2w
YnXx(|D] }|jd�sJq:||kr:|||<q:Wq
Wx4t|j��D]$}||}d||f}|j||�qpW|jd|�dS)Nz.confz%s/%sz/etc/sysctl.conf)�SYSCTL_CONFIG_DIRS�os�listdir�OSError�endswith�sorted�keys�_apply_sysctl_config_file)r
�instance_sysctl�files�d�flistZfname�pathrrrr)is 


z!SysctlPlugin._apply_system_sysctlcCs�tjd|�yPt|d��.}x&t|d�D]\}}|j||||�q(WWdQRXtjd|�WnHttfk
r�}z(|jtjkr�tj	d|t
|�f�WYdd}~XnXdS)Nz%Applying sysctl settings from file %s�rrz.Finished applying sysctl settings from file %sz.Error reading sysctl settings from file %s: %s)r�debug�open�	enumerate�_apply_sysctl_config_liner4�IOError�errno�ENOENTr"�str)r
r=r9�f�lineno�line�errrr8|sz&SysctlPlugin._apply_sysctl_config_filec	Cs�|j�}t|�dks,|ddks,|ddkr0dS|jdd�}t|�dkr^tjd||f�dS|\}}|j�}t|�dkr�tjd||f�dS|j�}||kr�|jj||�}||kr�tjd|||f�|j||d	d
�dS)Nr�#�;�=r�z Syntax error in file %s, line %dz2Overriding sysctl parameter '%s' from '%s' to '%s'T)r.)	�stripr�splitrr"r#r$rr&)	r
r=rHrIr9Ztmpr*r+Zinstance_valuerrrrB�s*$z&SysctlPlugin._apply_sysctl_config_linecCsd|jj|dd�S)Nz/proc/sys/%sz./z/.)r	Ztr)r
r*rrr�_get_sysctl_path�szSysctlPlugin._get_sysctl_pathcCs�|j|�}yht|d��B}d}x.t|�D]"\}}|dkr&tjd|�dSq&W|j�}WdQRXtjd||f�|Sttfk
r�}z6|j	t	j
kr�tjd|�ntjd|t|�f�dSd}~XnXdS)Nr>�rzGFailed to read sysctl parameter '%s', multi-line values are unsupportedz&Value of sysctl parameter '%s' is '%s'zBFailed to read sysctl parameter '%s', the parameter does not existz(Failed to read sysctl parameter '%s': %s)rQr@rArr"rOr?r4rCrDrErF)r
r*r=rGrI�ir+rJrrrr!�s(

zSysctlPlugin._read_sysctlFcCs�|j|�}tjj|�tkr,tjd|�dSy6tjd||f�t|d��}|j	|�WdQRXdSt
tfk
r�}zJ|jtj
kr�|r�tjntj}|d||f�ntjd||t|�f�dSd}~XnXdS)Nz+Refusing to set deprecated sysctl option %sFz%Setting sysctl parameter '%s' to '%s'�wTzIFailed to set sysctl parameter '%s' to '%s', the parameter does not existz/Failed to set sysctl parameter '%s' to '%s': %s)rQr2r=�basename�DEPRECATED_SYSCTL_OPTIONSrr"r?r@�writer4rCrDrErF)r
r*r+r.r=rGrJZlog_funcrrrr&�s&
zSysctlPlugin._write_sysctl)F)�__name__�
__module__�__qualname__�__doc__rrrr-r0r(Z
ROLLBACK_SOFTrr)r8rBrQr!r&�
__classcell__rr)r
rrs
r)�rerRrZ
decoratorsZ
tuned.logsZtuned�
subprocessZtuned.utils.commandsrZtuned.constsr(rDr2ZlogsrrrVr1ZPluginrrrrr�<module>s

site-packages/tuned/plugins/__pycache__/plugin_irqbalance.cpython-36.pyc000064400000011013147511334670022410 0ustar003

�<�e�
�@sdddlmZddlmZddlmZddlZddlZddlZddl	Z	ej
j�ZGdd�dej
�ZdS)�)�base)�command_custom�)�constsNcs�eZdZdZ�fdd�Zdd�Zdd�Zedd	��Zd
d�Z	dd
�Z
dd�Zdd�Zdd�Z
dd�Zdd�Zeddd�dd��Z�ZS)�IrqbalancePlugina�
	`irqbalance`::
	
	Plug-in for irqbalance settings management. The plug-in
	configures CPUs which should be skipped when rebalancing IRQs in
	`/etc/sysconfig/irqbalance`. It then restarts irqbalance if and
	only if it was previously running.
	+
	The banned/skipped CPUs are specified as a CPU list via the
	[option]`banned_cpus` option.
	+
	.Skip CPUs 2,4 and 9-13 when rebalancing IRQs
	====
	----
	[irqbalance]
	banned_cpus=2,4,9-13
	----
	====
	cs tt|�j||�tj�|_dS)N)�superr�__init__�perfZcpu_map�_cpus)�self�args�kwargs)�	__class__��'/usr/lib/python3.6/plugin_irqbalance.pyr szIrqbalancePlugin.__init__cCsd|_d|_dS)NFT)Z_has_dynamic_tuningZ_has_static_tuning)r�instancerrr�_instance_init$szIrqbalancePlugin._instance_initcCsdS)Nr)rrrrr�_instance_cleanup(sz"IrqbalancePlugin._instance_cleanupcCsddiS)N�banned_cpusr)�clsrrr�_get_config_options+sz$IrqbalancePlugin._get_config_optionscCsly ttjd��
}|j�SQRXWnFtk
rf}z*|jtjkrJtjd�ntj	d|�dSd}~XnXdS)N�rz>irqbalance sysconfig file is missing. Is irqbalance installed?z,Failed to read irqbalance sysconfig file: %s)
�openr�IRQBALANCE_SYSCONFIG_FILE�read�IOError�errno�ENOENT�log�warn�error)r�f�errr�_read_irqbalance_sysconfig1sz+IrqbalancePlugin._read_irqbalance_sysconfigcCsZy&ttjd��}|j|�WdQRXdStk
rT}ztjd|�dSd}~XnXdS)N�wTz-Failed to write irqbalance sysconfig file: %sF)rrr�writerrr )r�contentr!r"rrr�_write_irqbalance_sysconfig<sz,IrqbalancePlugin._write_irqbalance_sysconfigcCs|d|S)NzIRQBALANCE_BANNED_CPUS=%s
r)r�	sysconfig�banned_cpumaskrrr�_write_banned_cpusEsz#IrqbalancePlugin._write_banned_cpuscCs8g}x(|jd�D]}tjd|�s|j|�qWdj|�S)N�
z\s*IRQBALANCE_BANNED_CPUS=)�split�re�match�append�join)rr(�lines�linerrr�_clear_banned_cpusHs
z#IrqbalancePlugin._clear_banned_cpuscCs2|jjdddgdgd�\}}|dkr.tjd�dS)NZ	systemctlztry-restartZ
irqbalance�)Z	no_errorsrz.Failed to restart irqbalance. Is it installed?)�_cmdZexecuterr)rZretcode�outrrr�_restart_irqbalanceOs
z$IrqbalancePlugin._restart_irqbalancecCs@|j�}|dkrdS|j|�}|j||�}|j|�r<|j�dS)N)r#r3r*r'r7)rr)r&rrr�_set_banned_cpusXs

z!IrqbalancePlugin._set_banned_cpuscCs4|j�}|dkrdS|j|�}|j|�r0|j�dS)N)r#r3r'r7)rr&rrr�_restore_banned_cpusas

z%IrqbalancePlugin._restore_banned_cpusrF)Z
per_devicec	Cs�d}|dk	rjt|jj|��}t|j�}|j|�rB|jjt|��}n(djdd�|jD��}tj	d||f�|sr|r~|dkr~dS|r�dS|r�|j
|�n|j�dS)N�,cSsg|]}t|��qSr)�str)�.0�xrrr�
<listcomp>rsz1IrqbalancePlugin._banned_cpus.<locals>.<listcomp>zGInvalid banned_cpus specified, '%s' does not match available cores '%s')�setr5Zcpulist_unpackr
�issubsetZcpulist2hex�listr0rr r8r9)	rZenabling�valueZverifyZignore_missingr)ZbannedZpresentZstr_cpusrrr�_banned_cpusis 

zIrqbalancePlugin._banned_cpus)�__name__�
__module__�__qualname__�__doc__rrr�classmethodrr#r'r*r3r7r8r9rrC�
__classcell__rr)rrrs			r)�rZ
decoratorsrZtunedrZ
tuned.logsrr	r-Zlogs�getrZPluginrrrrr�<module>s
site-packages/tuned/plugins/__pycache__/plugin_bootloader.cpython-36.opt-1.pyc000064400000060721147511334670023412 0ustar003

�<�e:e�@s�ddlmZddlTddlZddlmZddlmZddlj	Z	ddl
Z
ddlZddlZddl
mZejj�ZGdd	�d	ej�ZdS)
�)�base)�*�N)�
exceptions)�commands)�sleepcs�eZdZdZ�fdd�Zdd�Zdd�Zedd	��Ze	d[dd��Z
e	d
d��Zdd�Zdd�Z
iifdd�Zdd�Zdd�Zdd�Zdd�Zdd�Zdd �Zd!d"�Zejfd#d$�Zd%d&�Zd'd(�Zd)d*�Zd+d,�Zd-d.�Zd/d0�Zd1d2�Zd3d4�Z d5d6�Z!d7d8�Z"d9d:�Z#d;d<�Z$d=d>�Z%d?d@�Z&e'dA�dBdC��Z(e'dD�dEdF��Z)e'dG�dHdI��Z*e'dJdKdLdM�dNdO��Z+e'dPdKdLdM�dQdR��Z,e'dSdKdLdM�dTdU��Z-e'dVdKdLdM�dWdX��Z.dYdZ�Z/�Z0S)\�BootloaderPlugina�
	`bootloader`::
	
	Adds options to the kernel command line. This plug-in supports the
	GRUB 2 boot loader and the Boot Loader Specification (BLS).
	+
	NOTE: *TuneD* will not remove or replace kernel command line
	parameters added via other methods like *grubby*. *TuneD* will manage
	the kernel command line parameters added via *TuneD*. Please refer
	to your platform bootloader documentation about how to identify and
	manage kernel command line parameters set outside of *TuneD*.
	+
	Customized non-standard location of the GRUB 2 configuration file
	can be specified by the [option]`grub2_cfg_file` option.
	+
	The kernel options are added to the current GRUB configuration and
	its templates. Reboot the system for the kernel option to take effect.
	+
	Switching to another profile or manually stopping the `tuned`
	service removes the additional options. If you shut down or reboot
	the system, the kernel options persist in the [filename]`grub.cfg`
	file and grub environment files.
	+
	The kernel options can be specified by the following syntax:
	+
	[subs="+quotes,+macros"]
	----
	cmdline__suffix__=__arg1__ __arg2__ ... __argN__
	----
	+
	Or with an alternative, but equivalent syntax:
	+
	[subs="+quotes,+macros"]
	----
	cmdline__suffix__=+__arg1__ __arg2__ ... __argN__
	----
	+
	Where __suffix__ can be arbitrary (even empty) alphanumeric
	string which should be unique across all loaded profiles. It is
	recommended to use the profile name as the __suffix__
	(for example, [option]`cmdline_my_profile`). If there are multiple
	[option]`cmdline` options with the same suffix, during the profile
	load/merge the value which was assigned previously will be used. This
	is the same behavior as any other plug-in options. The final kernel
	command line is constructed by concatenating all the resulting
	[option]`cmdline` options.
	+
	It is also possible to remove kernel options by the following syntax:
	+
	[subs="+quotes,+macros"]
	----
	cmdline__suffix__=-__arg1__ __arg2__ ... __argN__
	----
	+
	Such kernel options will not be concatenated and thus removed during
	the final kernel command line construction.
	+
	.Modifying the kernel command line
	====
	For example, to add the [option]`quiet` kernel option to a *TuneD*
	profile, include the following lines in the [filename]`tuned.conf`
	file:
	
	----
	[bootloader]
	cmdline_my_profile=+quiet
	----
	
	An example of a custom profile `my_profile` that adds the
	[option]`isolcpus=2` option to the kernel command line:
	
	----
	[bootloader]
	cmdline_my_profile=isolcpus=2
	----
	
	An example of a custom profile `my_profile` that removes the
	[option]`rhgb quiet` options from the kernel command line (if
	previously added by *TuneD*):
	
	----
	[bootloader]
	cmdline_my_profile=-rhgb quiet
	----
	====
	+
	.Modifying the kernel command line, example with inheritance
	====
	For example, to add the [option]`rhgb quiet` kernel options to a
	*TuneD* profile `profile_1`:
	
	----
	[bootloader]
	cmdline_profile_1=+rhgb quiet
	----
	
	In the child profile `profile_2` drop the [option]`quiet` option
	from the kernel command line:
	
	----
	[main]
	include=profile_1
	
	[bootloader]
	cmdline_profile_2=-quiet
	----
	
	The final kernel command line will be [option]`rhgb`. In case the same
	[option]`cmdline` suffix as in the `profile_1` is used:
	
	----
	[main]
	include=profile_1
	
	[bootloader]
	cmdline_profile_1=-quiet
	----
	
	It will result in the empty kernel command line because the merge
	executes and the [option]`cmdline_profile_1` gets redefined to just
	[option]`-quiet`. Thus there is nothing to remove in the final kernel
	command line processing.
	====
	+
	The [option]`initrd_add_img=IMAGE` adds an initrd overlay file
	`IMAGE`. If the `IMAGE` file name begins with '/', the absolute path is
	used. Otherwise, the current profile directory is used as the base
	directory for the `IMAGE`.
	+
	The [option]`initrd_add_dir=DIR` creates an initrd image from the
	directory `DIR` and adds the resulting image as an overlay.
	If the `DIR` directory name begins with '/', the absolute path
	is used. Otherwise, the current profile directory is used as the
	base directory for the `DIR`.
	+
	The [option]`initrd_dst_img=PATHNAME` sets the name and location of
	the resulting initrd image. Typically, it is not necessary to use this
	option. By default, the location of initrd images is `/boot` and the
	name of the image is taken as the basename of `IMAGE` or `DIR`. This can
	be overridden by setting [option]`initrd_dst_img`.
	+
	The [option]`initrd_remove_dir=VALUE` removes the source directory
	from which the initrd image was built if `VALUE` is true. Only 'y',
	'yes', 't', 'true' and '1' (case insensitive) are accepted as true
	values for this option. Other values are interpreted as false.
	+
	.Adding an overlay initrd image
	====
	----
	[bootloader]
	initrd_remove_dir=True
	initrd_add_dir=/tmp/tuned-initrd.img
	----
	
	This creates an initrd image from the `/tmp/tuned-initrd.img` directory
	and then removes the `tuned-initrd.img` directory from `/tmp`.
	====
	+
	The [option]`skip_grub_config=VALUE` does not change grub
	configuration if `VALUE` is true. However, [option]`cmdline`
	options are still processed, and the result is used to verify the current
	cmdline. Only 'y', 'yes', 't', 'true' and '1' (case insensitive) are accepted
	as true values for this option. Other values are interpreted as false.
	+
	.Do not change grub configuration
	====
	----
	[bootloader]
	skip_grub_config=True
	cmdline=+systemd.cpu_affinity=1
	----
	====
	cs6tjjtj�stjd��tt|�j	||�t
�|_dS)Nz4Required GRUB2 template not found, disabling plugin.)�os�path�isfile�constsZGRUB2_TUNED_TEMPLATE_PATHrZNotSupportedPluginException�superr�__init__r�_cmd)�self�args�kwargs)�	__class__��'/usr/lib/python3.6/plugin_bootloader.pyr�s
zBootloaderPlugin.__init__cCsVd|_d|_d|_d|_d|_d|_d|_d|_|j�|_	|j
�|_|j�dk	|_
dS)NFT�)Z_has_dynamic_tuningZ_has_static_tuning�update_grub2_cfg�_skip_grub_config_val�_initrd_remove_dir�_initrd_dst_img_val�_cmdline_val�_initrd_val�_get_grub2_cfg_files�_grub2_cfg_file_names�_bls_enabled�_bls�_rpm_ostree_status�_rpm_ostree)r�instancerrr�_instance_init�s

zBootloaderPlugin._instance_initcCsdS)Nr)rr#rrr�_instance_cleanup�sz"BootloaderPlugin._instance_cleanupcCsdddddddd�S)N)�grub2_cfg_file�initrd_dst_img�initrd_add_img�initrd_add_dir�initrd_remove_dir�cmdline�skip_grub_configr)�clsrrr�_get_config_options�sz$BootloaderPlugin._get_config_optionsrcCs`i}|j�}xN|j�D]B}||kr|jdd�}|j|dg�jt|�dkrR|dnd�qW|S)z�
		Returns dict created from options
		e.g.: _options_to_dict("A=A A=B A B=A C=A", "A=B B=A B=B") returns {'A': ['A', None], 'C': ['A']}
		�=rrN)�split�
setdefault�append�len)�optionsZomit�d�oZarrrrr�_options_to_dict�s.z!BootloaderPlugin._options_to_dictcCsdjdd�|j�D��S)N� cSs2g|]*\}}|D]}|dk	r(|d|n|�qqS)Nr/r)�.0�k�vZv1rrr�
<listcomp>�sz5BootloaderPlugin._dict_to_options.<locals>.<listcomp>)�join�items)r5rrr�_dict_to_options�sz!BootloaderPlugin._dict_to_optionscCsr|jjddgdd�\}}}tjd||f�|dkr8dS|j�}t|�dksX|dd	krjtjd
|�dS|dS)zW
		Returns status of rpm-ostree transactions or None if not run on rpm-ostree system
		z
rpm-ostreeZstatusT)�
return_errz.rpm-ostree status output stdout:
%s
stderr:
%srN�zState:z2Exceptional format of rpm-ostree status result:
%sr)r�execute�log�debugr0r3�warn)r�rc�out�errZsplitedrrrr!�sz#BootloaderPlugin._rpm_ostree_statuscCsFd}d}x(t|�D]}|j�dkr&dSt|�qW|j�dkrBdSdS)N�
g�?ZidleTF)�ranger!r)rZsleep_cyclesZ
sleep_secs�irrr�_wait_till_idlesz BootloaderPlugin._wait_till_idlecs�|jjddgdd�\}}}tjd||f�|dkr8dS|j|�}|j�sXtjd�dSi}|j|�j�}x8|j	�D],\�}	x|	D]}
|�j
|
�q�W|	|�<qtWi}|j|�j�}x�|j	�D]~\�}	|j���r$tjd	�|�f�|j�g�j
|��|j
�fd
d�|�D��g|�<|j�g�j
|	�|	|�<q�W||k�rdtjd|�|||fStjd
||f�|jjddgdd�|D�dd�|D�dd�\}}
}|dk�r�tjd|�|j|�ddfS|||fSdS)z�
		Method for appending or deleting rpm-ostree karg
		returns None if rpm-ostree not present or is run on not ostree system
		or tuple with new kargs, appended kargs and deleted kargs
		z
rpm-ostree�kargsT)r@z'rpm-ostree output stdout:
%s
stderr:
%srNzCannot wait for transaction endz)adding rpm-ostree kargs %s: %s for deletecs$g|]}|dk	r�d|n��qS)Nr/r)r9r;)r:rrr<.sz6BootloaderPlugin._rpm_ostree_kargs.<locals>.<listcomp>z3skipping rpm-ostree kargs - append == deleting (%s)z2rpm-ostree kargs - appending: '%s'; deleting: '%s'cSsg|]}d|�qS)z--append=%sr)r9r;rrrr<9scSsg|]}d|�qS)z--delete=%sr)r9r;rrrr<:sz-Something went wrong with rpm-ostree kargs
%s)NNN)NNN)rrBrCrDr7rL�errorr?r0r>�remove�getr1�extend�info)rr2�deleterFrGrHrM�deletedZ
delete_params�valr;�appendedZ
append_params�_r)r:r�_rpm_ostree_kargs
sF





z"BootloaderPlugin._rpm_ostree_kargscCsV|j�j�}g}xR|D]J}t|�jd�r4|j|�q||krJ||||<qtjd||jjf�qWd}x�|D]�}||}|dksn|dkr�qn|d}|dd�}|dd�j	�}	|dks�|d	kr�|dkr�|	dkr�|d|	7}qn|d
k�r(|	dk�r4x@|	j
�D]&}
tj|
�}tj
d|d
d|�}�q�Wqn|d|7}qnW|j	�}|dk�rR||d<|S)zSMerge provided options with plugin default options and merge all cmdline.* options.r+z$Unknown option '%s' for plugin '%s'.rNrrrA�+�\�-r8z(\A|\s)z	(?=\Z|\s))rZrYr[)r.�copy�str�
startswithr2rCrEr�__name__�stripr0�re�escape�sub)rr4Z	effectiveZcmdline_keys�keyr+rU�opZop1�vals�pZregexrrr�_get_effective_optionsAs:





z'BootloaderPlugin._get_effective_optionscCs.g}x$tjD]}tjj|�r|j|�qW|S)N)rZGRUB2_CFG_FILESr	r
�existsr2)rZ	cfg_files�frrrrcs
z%BootloaderPlugin._get_grub2_cfg_filescCsH|jjtjdd�}t|�dkr2tjdtj�dStjd|tj	d�dk	S)NT)�no_errorrzcannot read '%s'Fz=^\s*GRUB_ENABLE_BLSCFG\s*=\s*\"?\s*[tT][rR][uU][eE]\s*\"?\s*$)�flags)
r�	read_filer�GRUB2_DEFAULT_ENV_FILEr3rCrRra�search�	MULTILINE)r�grub2_default_envrrrrjszBootloaderPlugin._bls_enabledcCs|jjtj|�S)N)r�add_modify_option_in_filer�BOOT_CMDLINE_FILE)rr5rrr�_patch_bootcmdlinessz#BootloaderPlugin._patch_bootcmdlinecCs�|jtjdtjdi�|js*tjd�dSx4|jD]*}|jj|dtj	ddtj
didd�q2W|jdk	r�tjd|j�|jj|j�dS)Nrzcannot find grub.cfg to patchzset\s+F)�addzremoving initrd image '%s')
rtr�BOOT_CMDLINE_TUNED_VAR�BOOT_CMDLINE_INITRD_ADD_VARrrCrRrrr�GRUB2_TUNED_VAR�GRUB2_TUNED_INITRD_VARr�unlink)rrjrrr�_remove_grub2_tuningvs
*
z%BootloaderPlugin._remove_grub2_tuningcCsf|jjtj�}tjtjd|tjd�}|r2|dnd}tjtjd|tjd�}|rZ|dnd}||fS)Nz	=\"(.*)\")rlrr)	rrmrrsrarorvrp�BOOT_CMDLINE_KARGS_DELETED_VAR)rrjrVrTrrr�_get_rpm_ostree_changes�sz(BootloaderPlugin._get_rpm_ostree_changescCs@|j�\}}|j|j|�|j|�d�|jtjdtjdi�dS)N)r2rSr)r}rXr7rtrrvr|)rrVrTrrr�_remove_rpm_ostree_tuning�sz*BootloaderPlugin._remove_rpm_ostree_tuningcCsR|tjkrN|jrN|jr,tjd�|j�n"tjd�|j�|jddd��dS)Nz4removing rpm-ostree tuning previously added by Tunedz/removing grub2 tuning previously added by Tunedr)�tuned_params�tuned_initrd)	rZ
ROLLBACK_FULLrr"rCrRr~r{�_update_grubenv)rr#Zrollbackrrr�_instance_unapply_static�s


z)BootloaderPlugin._instance_unapply_staticcCs�tjd�tjdtjdd|tjd�}tjdtjd|tjd�}tjdtjdd|tjd�}tjdtjd|tjd�}tjtjdd|tjd�}tjtj	dd|tjd�S)	Nzunpatching grub.cfgz
^\s*set\s+z\s*=.*
r)rlz *\$z\nz\n+)
rCrDrarcrrxrpry�GRUB2_TEMPLATE_HEADER_BEGIN�GRUB2_TEMPLATE_HEADER_END)r�	grub2_cfg�cfgrrr�_grub2_cfg_unpatch�s
z#BootloaderPlugin._grub2_cfg_unpatchcCs�tjd�dtjd}x8|D]0}|d|jj|�d|jj||�d7}qW|tjd7}tjd||tj	d	�}tj
tjd
�}xt|D]l}tjd|dd
|||tj	d	�}tjd|d||dd|tj	d	�}tjd|dd|tj	d	�}q�W|S)Nzinitial patching of grub.cfgz\1\n\n�
zset z="z"
z\nz+^(\s*###\s+END\s+[^#]+/00_header\s+### *)\n)rl)�linuxZinitrdz^(\s*z(16|efi)?\s+.*)$z\1 $z(?:16|efi)?\s+\S+rescue.*)\$z *(.*)$z\1\2z(?:16|efi)?\s+\S+rescue.*) +$z\1)rCrDrr�rrbr�rarcrprxry)rr�r5�s�optZd2rKrrr�_grub2_cfg_patch_initial�s

0
$( z)BootloaderPlugin._grub2_cfg_patch_initialcCs�|jjtj�}t|�dkr.tjdtj�dStjtjd�}d}xv|D]n}t	j
d|d||d|t	jd�dkrFd	}|ddkr�|d7}||d|d
|d||d7}qFW|r�tjdtj�|jj
tj|�d	S)Nrzcannot read '%s'F)ZGRUB_CMDLINE_LINUX_DEFAULTZGRUB_INITRD_OVERLAYz^[^#]*\bz
\s*=.*\\\$z\b.*$)rlTrr�z="${z:+$z }\$z"
z
patching '%s'���)rrmrrnr3rCrRrxryrarorprD�
write_to_file)rrqr5�writerKrrr�_grub2_default_env_patch�s 
*,z)BootloaderPlugin._grub2_default_env_patchcCs�|jjtj�}t|�dkr.tjdtj�dSd}tjdtj	d|tj
d�r�d}tjdtj	dd	|tj
d�}|d
dkr�|d7}|r�tjdtj�|jj
tj|�dS)Nrzcannot read '%s'Fzb^GRUB_CMDLINE_LINUX_DEFAULT=\"\$\{GRUB_CMDLINE_LINUX_DEFAULT:\+\$GRUB_CMDLINE_LINUX_DEFAULT \}\\\$z"$)rlTz"$
rrr�zunpatching '%s'r�)rrmrrnr3rCrRrarorxrprcrDr�)rrqr�r�rrr�_grub2_default_env_unpatch�s z+BootloaderPlugin._grub2_default_env_unpatchcCsZtjd�|jstjd�dS�x|jD�]}|jj|�}t|�dkrVtjd|�q(tjd|�|}d}xf|D]^}tjd|dd|jj	||�d
|tj
d�\}}|dks�tjd
||tj
d�dkrrd}qrWttjd
t
j|tj
d��ttjd
t
j|tj
d��k�rd}|�r*|j|j|�|�}|jj||�q(W|j�rN|j�n|j�dS)Nzpatching grub.cfgzcannot find grub.cfg to patchFrzcannot patch %sz+adding boot command line parameters to '%s'z	\b(set\s+z\s*=).*$z\1�")rlrz\$Tz\1")rCrDrrRrrmr3ra�subnrbrpro�findallrrxryr�r�r�r r�r�)rr5rjr�Z
grub2_cfg_newZ
patch_initialr�Znsubsrrr�_grub2_cfg_patch�s4


4" 
z!BootloaderPlugin._grub2_cfg_patchcCsb|j�\}}|j|j|�}|s"dS|j|d�\}}}|dkr@dS|jtj|jtj|j|�i�dS)N)r2)	r}r7rrXrtrrvr|r?)rrVrWZ
_cmdline_dictr5rrr�_rpm_ostree_update�sz#BootloaderPlugin._rpm_ostree_updatecCs8|jtj|jtj|ji�|jtj|jtj|ji�dS)N)	r�rrxrryrrtrvrw)rrrr�
_grub2_updateszBootloaderPlugin._grub2_updatecCstjjtj�S)N)r	r
rirZBLS_ENTRIES_PATH)rrrr�_has_blsszBootloaderPlugin._has_blscCs\tjdt|��dd�|j�D�}|jjdddg|�\}}|dkrXtjd|�d	Sd
S)Nzupdating grubenv, setting %scSs$g|]\}}dt|�t|�f�qS)z%s=%s)r])r9Zoption�valuerrrr<sz4BootloaderPlugin._update_grubenv.<locals>.<listcomp>z
grub2-editenvr[�setrzcannot update grubenv: '%s'FT)rCrDr]r>rrBrE)rr5�lrFrGrrrr�
sz BootloaderPlugin._update_grubenvcCsb|jj�}|dkrdStjdtj�|jjtjdgd|id�\}}|dkr^tjd|�dSd	S)
NrFz4running kernel update hook '%s' to patch BLS entriesruZKERNEL_INSTALL_MACHINE_ID)�envrzcannot patch BLS entries: '%s'T)rZget_machine_idrCrDrZKERNEL_UPDATE_HOOK_FILErBrE)rZ
machine_idrFrGrrr�_bls_entries_patch_initials
z+BootloaderPlugin._bls_entries_patch_initialcCs6tjd�|j�r2|j|j|jd��r2|j�r2dSdS)Nzupdating BLS)rr�TF)rCrDr�r�rrr�)rrrr�_bls_updates
zBootloaderPlugin._bls_updatecCs(|jdkr$tjjtjtjj|��|_dS)N)rr	r
r=r�BOOT_DIR�basename)r�namerrr�_init_initrd_dst_img&s
z%BootloaderPlugin._init_initrd_dst_imgcCstjjtj�S)N)r	r
�isdirrZPETITBOOT_DETECT_DIR)rrrr�_check_petitboot*sz!BootloaderPlugin._check_petitbootcCs�|jrtjd�dS|j�r&tjd�tjd|j�tjj|j�}|j	j
||j�sXdSd|_|j	jd�j
�}d}t|�}|r�tjdd	|�}t|�|kr�|}tjj||�|_dS)
Nz:Detected rpm-ostree which doesn't support initrd overlays.FzkDetected Petitboot which doesn't support initrd overlays. The initrd overlay will be ignored by bootloader.zinstalling initrd image as '%s'Tz
/proc/cmdline�/z)^\s*BOOT_IMAGE=\s*(?:\([^)]*\))?(\S*/).*$z\1)r"rCrEr�rRrr	r
r�rr\rrm�rstripr3rarcr=r)rZimgZimg_nameZcurr_cmdlineZinitrd_grubpathZlcr
rrr�_install_initrd-s&

z BootloaderPlugin._install_initrdr&cCs$|rdS|r |dk	r t|�g|_dS)N)r]r)r�enablingr��verify�ignore_missingrrr�_grub2_cfg_fileBsz BootloaderPlugin._grub2_cfg_filer'cCsR|rdS|rN|dk	rNt|�|_|jdkr,dS|jddkrNtjjtj|j�|_dS)NrFrr�)r]rr	r
r=rr�)rr�r�r�r�rrr�_initrd_dst_imgJs

z BootloaderPlugin._initrd_dst_imgr*cCs*|rdS|r&|dk	r&|jj|�dk|_dS)N�1)r�get_boolr)rr�r�r�r�rrrrVsz#BootloaderPlugin._initrd_remove_dirr(FrI)Z
per_deviceZprioritycCsD|rdS|r@|dk	r@t|�}|j|�|dkr2dS|j|�s@dSdS)NrF)r]r�r�)rr�r�r�r�Zsrc_imgrrr�_initrd_add_img^s

z BootloaderPlugin._initrd_add_imgr)c
Cs|rdS|o|dk	�rt|�}|j|�|dkr4dStjj|�sRtjd|�dStjd|�tj	ddd�\}}tj
d|�tj|�|jj
d	||d
d�\}}	tj
d|	�|d
kr�tjd�|jj|d
d�dS|j|�|jj|�|j�rtjd|�|jj|�dS)NrFzFerror: cannot create initrd image, source directory '%s' doesn't existz+generating initrd image from directory '%s'ztuned-bootloader-z.tmp)�prefix�suffixz+writing initrd image to temporary file '%s'zfind . | cpio -co > %sT)�cwd�shellzcpio log: %srzerror generating initrd image)rkzremoving directory '%s')r]r�r	r
r�rCrNrR�tempfileZmkstemprD�closerrBrzr�rZrmtree)
rr�r�r�r�Zsrc_dir�fdZtmpfilerFrGrrr�_initrd_add_dirks2



z BootloaderPlugin._initrd_add_dirr+cCsJ|jj|jj|��}|�r |jr8|j�d}|j|�}n|jjd�}t|�dkrTdSt	|j
��}t	|j
��}	|	|}
t|
�dkr�tjt
jdt|	�f�dSdd�|D�}xR|
D]J}|j
dd�d}
|
|kr�tjt
j|
|f�q�tjt
j||
|f�q�W|	|@}tjd	d
j|�f�dS|�rF|dk	�rFtjd�d|_||_dS)
Nrz
/proc/cmdliner+TcSsi|]}||jdd�d�qS)r/rr)r0)r9r;rrr�
<dictcomp>�sz-BootloaderPlugin._cmdline.<locals>.<dictcomp>r/rz2expected arguments that are present in cmdline: %sr8Fz;installing additional boot command line parameters to grub2)Z
_variables�expandrZunquoter"rXr?rmr3r�r0rCrRrZSTR_VERIFY_PROFILE_VALUE_OKr]rNZ'STR_VERIFY_PROFILE_CMDLINE_FAIL_MISSINGZSTR_VERIFY_PROFILE_CMDLINE_FAILr=rr)rr�r�r�r�r;Zrpm_ostree_kargsr+Zcmdline_setZ	value_setZmissing_setZcmdline_dict�m�argZpresent_setrrr�_cmdline�s6

zBootloaderPlugin._cmdliner,cCs8|rdS|r4|dk	r4|jj|�dkr4tjd�d|_dS)Nr�z(skipping any modification of grub configT)rr�rCrRr)rr�r�r�r�rrr�_skip_grub_config�s
z"BootloaderPlugin._skip_grub_configcCs�|rV|jrVt|j�dkr"tjd�t|j�dkr:tjd�|jtj|jtj	|ji�n0|r�|j
r�|jrp|j�n|j
�|j�d|_
dS)Nrz0requested changes to initrd will not be applied!z1requested changes to cmdline will not be applied!F)rr3rrCrErrtrrvrwrr"r�r�r�)rr#r�rrr�_instance_post_static�s




z&BootloaderPlugin._instance_post_static)r)1r_�
__module__�__qualname__�__doc__rr$r%�classmethodr.�staticmethodr7r?r!rLrXrhrrrtr{r}r~rZ
ROLLBACK_SOFTr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�Zcommand_customr�r�rr�r�r�r�r��
__classcell__rr)rrrsT.
4"	
	 
	
!	r)rrZ
decoratorsZ
tuned.logsZtunedrZtuned.utils.commandsrZtuned.constsrr	rar�ZtimerZlogsrPrCZPluginrrrrr�<module>s

site-packages/tuned/plugins/__pycache__/decorators.cpython-36.opt-1.pyc000064400000002077147511334670022047 0ustar003

�<�e��@s*dddgZd	dd�Zdd�Zd
dd�ZdS)�command_set�command_get�command_customF�cs���fdd�}|S)Ncsd���d�|_|S)NT)�set�name�
per_device�priority)�_command)�method)rrr�� /usr/lib/python3.6/decorators.py�wrappers

zcommand_set.<locals>.wrapperr)rrrr
r)rrrrrs	cs�fdd�}|S)Ncsd�d�|_|S)NT)�getr)r	)r
)rrrr
!s
zcommand_get.<locals>.wrapperr)rr
r)rrr scs���fdd�}|S)Ncsd���d�|_|S)NT)Zcustomrrr)r	)r
)rrrrrr
*s

zcommand_custom.<locals>.wrapperr)rrrr
r)rrrrr)sN)Fr)Fr)�__all__rrrrrrr�<module>s

	site-packages/tuned/plugins/__pycache__/plugin_audio.cpython-36.opt-1.pyc000064400000010041147511334670022347 0ustar003

�<�e��@snddlmZddlTddlZddlmZddlZddlZddl	Z	ddl
Z
ejj�Z
e�ZGdd�dej�ZdS)�)�hotplug)�*�N)�commandsc@s�eZdZdZdd�Zdd�Zdd�Zdd	�Zed
d��Z	dd
�Z
dd�Zeddd�dd��Z
ed�ddd��Zeddd�dd��Zed�ddd��ZdS) �AudioPlugina�
	`audio`::
	
	Sets audio cards power saving options. The plug-in sets the auto suspend
	timeout for audio codecs to the value specified by the [option]`timeout`
	option.
	+
	Currently, the `snd_hda_intel` and `snd_ac97_codec` codecs are
	supported and the [option]`timeout` value is in seconds. To disable
	auto suspend for these codecs, set the [option]`timeout` value
	to `0`. To enforce the controller reset, set the option
	[option]`reset_controller` to `true`. Note that power management
	is supported per module. Hence, the kernel module names are used as
	device names.
	+
	.Set the timeout value to 10s and enforce the controller reset
	====
	----
	[audio]
	timeout=10
	reset_controller=true
	----
	====
	cCsTd|_t�|_t�|_x8|jjd�jd�D]"}|j|�}|dkr*|jj|�q*WdS)NTZsoundzcard*�
snd_hda_intel�snd_ac97_codec)rr)	Z_devices_supported�setZ_assigned_devicesZ
_free_devicesZ_hardware_inventoryZget_devicesZmatch_sys_name�_device_module_name�add)�self�deviceZmodule_name�r�"/usr/lib/python3.6/plugin_audio.py�
_init_devices(s
zAudioPlugin._init_devicescCsd|_d|_dS)NTF)Z_has_static_tuningZ_has_dynamic_tuning)r�instancerrr�_instance_init2szAudioPlugin._instance_initcCsdS)Nr)rrrrr�_instance_cleanup6szAudioPlugin._instance_cleanupc	Csy|jjSdSdS)N)�parentZdriver)rr
rrrr
9szAudioPlugin._device_module_namecCs
ddd�S)NrF)�timeout�reset_controllerr)�clsrrr�_get_config_options?szAudioPlugin._get_config_optionscCsd|S)Nz$/sys/module/%s/parameters/power_saver)rr
rrr�
_timeout_pathFszAudioPlugin._timeout_pathcCsd|S)Nz//sys/module/%s/parameters/power_save_controllerr)rr
rrr�_reset_controller_pathIsz"AudioPlugin._reset_controller_pathrT)Z
per_devicec
Csryt|�}Wn"tk
r.tjd|�dSX|dkrj|j|�}|sftj|d||r^tjgndd�|SdSdS)Nz!timeout value '%s' is not integerrz%dF)�no_error)	�int�
ValueError�log�errorr�cmd�
write_to_file�errno�ENOENT)r�valuer
�sim�remover�sys_filerrr�_set_timeoutLs
zAudioPlugin._set_timeoutFcCs,|j|�}tj||d�}t|�dkr(|SdS)N)rr)rr �	read_file�len)rr
�ignore_missingr'r$rrr�_get_timeout\s

zAudioPlugin._get_timeoutrcCsHtj|�}|j|�}tjj|�rD|s@tj|||r8tjgndd�|SdS)NF)r)	r �get_boolr�os�path�existsr!r"r#)rr$r
r%r&�vr'rrr�_set_reset_controllerds

z!AudioPlugin._set_reset_controllercCs:|j|�}tjj|�r6tj|�}t|�dkr6tj|�SdS)Nr)rr.r/r0r r)r*r-)rr
r+r'r$rrr�_get_reset_controlleros


z!AudioPlugin._get_reset_controllerN)F)F)�__name__�
__module__�__qualname__�__doc__rrrr
�classmethodrrrZcommand_setr(Zcommand_getr,r2r3rrrrrs
r)�rZ
decoratorsZ
tuned.logsZtunedZtuned.utils.commandsrr.r"�structZglobZlogs�getrr ZPluginrrrrr�<module>s
site-packages/tuned/plugins/__pycache__/plugin_disk.cpython-36.opt-1.pyc000064400000040550147511334670022210 0ustar003

�<�e�A�@sjddlZddlmZddlTddlZddljZddlm	Z	ddl
Z
ddlZejj
�ZGdd�dej�ZdS)�N�)�hotplug)�*)�commandscs�eZdZdZ�fdd�Z�fdd�Zdd�Zdd	�Zed
d��Z	dd
�Z
dd�Z�fdd�Z�fdd�Z
�fdd�Zedd��Zedd��Zdd�Zdd�Zdd�Zd d!�Zd"d#�Zd$d%�Zd&d'�Zd(d)�Zd*d+�Z�fd,d-�Zd.d/�ZdZd1d2�Zd3d4�Zed5d6d7�d8d9��Ze d5�d[d;d<��Z!ed=d6d7�d>d?��Z"e d=�d\d@dA��Z#edBd6d7�dCdD��Z$e dB�d]dEdF��Z%dGdH�Z&dIdJ�Z'edKd6d7�dLdM��Z(e dK�d^dNdO��Z)e*dPd6d7�dQdR��Z+dSdT�Z,edUd6d7�dVdW��Z-e dU�d_dXdY��Z.�Z/S)`�
DiskPlugina�	
	`disk`::
	
	Plug-in for tuning various block device options. This plug-in can also
	dynamically change the advanced power management and spindown timeout
	setting for a drive according to the current drive utilization. The
	dynamic tuning is controlled by the [option]`dynamic` and the global
	[option]`dynamic_tuning` option in `tuned-main.conf`.
	+
	The disk plug-in operates on all supported block devices unless a
	comma separated list of [option]`devices` is passed to it.
	+
	.Operate only on the sda block device
	====
	----
	[disk]
	# Comma separated list of devices, all devices if commented out.
	devices=sda
	----
	====
	+
	The [option]`elevator` option sets the Linux I/O scheduler.
	+
	.Use the bfq I/O scheduler on xvda block device
	====
	----
	[disk]
	device=xvda
	elevator=bfq
	----
	====
	+
	The [option]`scheduler_quantum` option only applies to the CFQ I/O
	scheduler. It defines the number of I/O requests that CFQ sends to
	one device at one time, essentially limiting queue depth. The default
	value is 8 requests. The device being used may support greater queue
	depth, but increasing the value of quantum will also increase latency,
	especially for large sequential write work loads.
	+
	The [option]`apm` option sets the Advanced Power Management feature
	on drives that support it. It corresponds to using the `-B` option of
	the `hdparm` utility. The [option]`spindown` option puts the drive
	into idle (low-power) mode, and also sets the standby (spindown)
	timeout for the drive. It corresponds to using `-S` option of the
	`hdparm` utility.
	+
	.Use medium-aggressive power management with spindown
	====
	----
	[disk]
	apm=128
	spindown=6
	----
	====
	+
	The [option]`readahead` option controls how much extra data the
	operating system reads from disk when performing sequential
	I/O operations. Increasing the `readahead` value might improve
	performance in application environments where sequential reading of
	large files takes place. The default unit for readahead is KiB. This
	can be adjusted to sectors by specifying the suffix 's'. If the
	suffix is specified, there must be at least one space between the
	number and suffix (for example, `readahead=8192 s`).
	+
	.Set the `readahead` to 4MB unless already set to a higher value
	====
	----
	[disk]
	readahead=>4096
	----
	====
	The disk readahead value can be multiplied by the constant
	specified by the [option]`readahead_multiply` option.
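	+
	Neither [option]`readahead_multiply` nor [option]`scheduler_quantum` is
	illustrated above; a minimal sketch (the values shown are assumptions,
	not recommendations) could look like:
	+
	----
	[disk]
	readahead_multiply=2
	scheduler_quantum=16
	----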
	csrtt|�j||�ddddddddd	d
ddg|_d
dddddddddd	dg|_t|j�|_d|_d|_t	�|_
dS)N��������}�i�U�F�7��r����������n�Z�<�g{�G�z�?)�superr�__init__�
_power_levels�_spindown_levels�len�_levels�_level_steps�_load_smallestr�_cmd)�self�args�kwargs)�	__class__��!/usr/lib/python3.6/plugin_disk.pyrXszDiskPlugin.__init__cs�tt|�j�d|_d|_t�|_t�|_xL|jj	d�D]<}|j
|�r8|jj|j�|jr8|j
|j�r8|jj|j�q8Wt�|_dS)NT�block)rr�
_init_devicesZ_devices_supported�_use_hdparm�setZ
_free_devices�_hdparm_apm_devices�_hardware_inventoryZget_devices�_device_is_supported�addZsys_name�_is_hdparm_apm_supportedZ_assigned_devices)r'�device)r*r+r,r.bs
zDiskPlugin._init_devicescs�fdd�|D�S)Ncsg|]}�jjd|��qS)r-)r2Z
get_device)�.0�x)r'r+r,�
<listcomp>qsz2DiskPlugin._get_device_objects.<locals>.<listcomp>r+)r'Zdevicesr+)r'r,�_get_device_objectspszDiskPlugin._get_device_objectscCs�|jjddd|gtjgdd�\}}}|tjkrFtjd�d|_dS|rntjd|�tjd	||f�dSd
|kr�tjd|�dSdS)N�hdparmz-Cz/dev/%sT)�	no_errorsZ
return_errz4hdparm command not found, ignoring for other devicesFz#Device '%s' not supported by hdparmz(rc: %s, msg: '%s')�unknownz3Driver for device '%s' does not support apm command)	r&�execute�errno�ENOENT�log�warnr/�info�debug)r'r6�rc�outZerr_msgr+r+r,r5ss
z#DiskPlugin._is_hdparm_apm_supportedcCs2|jdko0|jjdd�dko0|jdkp0|jjdkS)	N�diskZ	removable�0�scsi�virtio�xen�nvme)rIrJrKrL)Zdevice_typeZ
attributes�get�parentZ	subsystem)�clsr6r+r+r,r3�s

zDiskPlugin._device_is_supportedcCs|jj|d|j�dS)Nr-)r2Z	subscribe�_hardware_events_callback)r'r+r+r,�_hardware_events_init�sz DiskPlugin._hardware_events_initcCs|jj|�dS)N)r2Zunsubscribe)r'r+r+r,�_hardware_events_cleanup�sz#DiskPlugin._hardware_events_cleanupcs(|j|�s|dkr$tt|�j||�dS)N�remove)r3rrrP)r'Zeventr6)r*r+r,rP�sz$DiskPlugin._hardware_events_callbackcs,|jdk	r|jj|�tt|�j||�dS)N)�
_load_monitorZ
add_devicerr�_added_device_apply_tuning)r'�instance�device_name)r*r+r,rU�s
z%DiskPlugin._added_device_apply_tuningcs,|jdk	r|jj|�tt|�j||�dS)N)rTZ
remove_devicerr�_removed_device_unapply_tuning)r'rVrW)r*r+r,rX�s
z)DiskPlugin._removed_device_unapply_tuningcCsdddddddd�S)NT)�dynamic�elevator�apm�spindown�	readahead�readahead_multiply�scheduler_quantumr+)rOr+r+r,�_get_config_options�szDiskPlugin._get_config_optionscCsddgS)Nr[r\r+)rOr+r+r,�#_get_config_options_used_by_dynamic�sz.DiskPlugin._get_config_options_used_by_dynamiccCsdd|_d|_d|_|j|jd�rTd|_|jjd|j�|_	i|_
i|_i|_i|_
nd|_d|_	dS)NTrrYrGF)Z_has_static_tuning�_apm_errcnt�_spindown_errcntZ_option_boolZoptionsZ_has_dynamic_tuning�_monitors_repositoryZcreateZassigned_devicesrTZ_device_idle�_stats�_idle�_spindown_change_delayed)r'rVr+r+r,�_instance_init�szDiskPlugin._instance_initcCs"|jdk	r|jj|j�d|_dS)N)rTrd�delete)r'rVr+r+r,�_instance_cleanup�s
zDiskPlugin._instance_cleanupcCs�|rd}|j}n
d}|j}|tjkr(dS|dkr6d}nL|tjkrbtjd|_|_tjd�dS|d7}|tjkr�tjd|�|r�||_n||_dS)Nr\r[rrzIhdparm command not found, ignoring future set_apm / set_spindown commandsz5disabling set_%s command: too many consecutive errors)	rcrb�consts�ERROR_THRESHOLDr?r@rArBrC)r'rEr\�sZcntr+r+r,�_update_errcnt�s&


zDiskPlugin._update_errcntcCsNtjd|�|jjdd|d|gtjgd�\}}|j|d�d|j|<dS)Nzchanging spindown to %dr;z-S%dz/dev/%s)r<TF)rArDr&r>r?r@rnrg)r'rVr6�new_spindown_levelrErFr+r+r,�_change_spindown�s&zDiskPlugin._change_spindowncCs2|jjddd|gtjgd�\}}d|ko0d|kS)Nr;z-Cz/dev/%s)r<ZstandbyZsleeping)r&r>r?r@)r'r6rErFr+r+r,�_drive_spinning�s"zDiskPlugin._drive_spinningcCs(||jkrdS|jj|�}|dkr&dS||jkr<|j||�|j|||�|j||�|j|}|j|}|dd|jkr�|d|j	kr�|d|j	kr�d}n.|ddkr�|ddks�|ddkr�d}nd}|dk�r�|d|7<|j
|d}|j|d}tj
d|d�|jtjk�rb|j|��rT|dk�rTtj
d|�d|j|<n|j|||�|jtjk�r�tj
d	|�|jjd
d|d|gtjgd
�\}	}
|j|	d�n4|j|�r�|j|��r�|j|d}|j|||�tj
d||d|df�tj
d||d|d|df�dS)N�levelr�read�writerztuning level changed to %dz;delaying spindown change to %d, drive has already spun downTzchanging APM_level to %dr;z-B%dz/dev/%s)r<Fz %s load: read %0.2f, write %0.2fz$%s idle: read %d, write %d, level %d���)r1rTZget_device_loadre�_init_stats_and_idle�
_update_stats�_update_idlerfr#r$r r!rArDrcrkrlrqrgrprbr&r>r?r@rn)r'rVr6�loadZstatsZidleZlevel_changeZnew_power_levelrorErFr+r+r,�_instance_update_dynamic�sF



.$
&z#DiskPlugin._instance_update_dynamiccCsDddgddgddgd�|j|<dddd�|j|<d|j|<dS)N�rr)�new�old�max)rrrsrtF)rerfrg)r'rVr6r+r+r,rvs$zDiskPlugin._init_stats_and_idlecCs�|j|d|j|d<}||j|d<dd�t||�D�}||j|d<|j|d}dd�t||�D�}||j|d<t|d�t|d�|j|d	<t|d
�t|d
�|j|d<dS)Nr|r}cSsg|]}|d|d�qS)rrr+)r7Znew_oldr+r+r,r9'sz,DiskPlugin._update_stats.<locals>.<listcomp>�diffr~cSsg|]}t|��qSr+)r~)r7Zpairr+r+r,r9,srrs�rt)re�zip�float)r'rVr6Znew_loadZold_loadrZold_max_loadZmax_loadr+r+r,rw"s"zDiskPlugin._update_statscCsLxFdD]>}|j|||jkr6|j||d7<qd|j||<qWdS)Nrsrtrr)rsrt)rer%rf)r'rVr6Z	operationr+r+r,rx3s
zDiskPlugin._update_idlecs0||jkrtjd|�ntt|�j||�dS)Nz<There is no dynamic tuning available for device '%s' at time)r1rArCrr�_instance_apply_dynamic)r'rVr6)r*r+r,r�;s
z"DiskPlugin._instance_apply_dynamiccCsdS)Nr+)r'rVr6r+r+r,�_instance_unapply_dynamicDsz$DiskPlugin._instance_unapply_dynamic�/sys/block/cCs@d|kr0tjj||jdd�|�}tjj|�r0|Stjj|||�S)N�/�!)�os�path�join�replace�exists)r'r6�suffix�prefixZdevr+r+r,�_sysfs_pathGs
zDiskPlugin._sysfs_pathcCs|j|d�S)Nzqueue/scheduler)r�)r'r6r+r+r,�_elevator_fileNszDiskPlugin._elevator_filerZT)Z
per_devicecCs0|j|�}|s,|jj|||r$tjgndd�|S)NF)�no_error)r�r&�
write_to_filer?r@)r'�valuer6�simrS�sys_filer+r+r,�
_set_elevatorQs


zDiskPlugin._set_elevatorFcCs"|j|�}|jj|jj||d��S)N)r�)r�r&Zget_active_option�	read_file)r'r6�ignore_missingr�r+r+r,�
_get_elevatorYs
zDiskPlugin._get_elevatorr[cCs|||jkr(|s tjd|�dSt|�S|jtjkrt|sl|jjddt|�d|gt	j
gd�\}}|j|d�t|�SdSdS)Nz+apm option is not supported for device '%s'r;z-Bz/dev/)r<F)r1rArC�strrbrkrlr&r>r?r@rn)r'r�r6r�rSrErFr+r+r,�_set_apm`s
(zDiskPlugin._set_apmcCs�||jkr |stjd|�dSd}d}|jjddd|gtjgd�\}}|tjkrZdS|dkrhd}n@tjd	|tj	�}|r�yt
|jd
��}Wntk
r�d}YnX|r�tj
d|�|S)Nz+apm option is not supported for device '%s'Fr;z-Bz/dev/)r<rTz
.*=\s*(\d+).*rz2could not get current APM settings for device '%s')r1rArCr&r>r?r@�re�match�S�int�group�
ValueError�error)r'r6r�r��errrErF�mr+r+r,�_get_apmps(
"
zDiskPlugin._get_apmr\cCs|||jkr(|s tjd|�dSt|�S|jtjkrt|sl|jjddt|�d|gt	j
gd�\}}|j|d�t|�SdSdS)Nz0spindown option is not supported for device '%s'r;z-Sz/dev/)r<T)r1rArCr�rcrkrlr&r>r?r@rn)r'r�r6r�rSrErFr+r+r,�
_set_spindown�s
(zDiskPlugin._set_spindowncCs$||jkr |stjd|�dSdS)Nz0spindown option is not supported for device '%s'�)r1rArC)r'r6r�r+r+r,�
_get_spindown�s

zDiskPlugin._get_spindowncCs|j|d�S)Nzqueue/read_ahead_kb)r�)r'r6r+r+r,�_readahead_file�szDiskPlugin._readahead_filecCs^t|�jdd�}yt|d�}Wntk
r4dSXt|�dkrZ|dddkrZ|d}|S)Nrrrm�)r��splitr�r�r")r'r��val�vr+r+r,�	_parse_ra�szDiskPlugin._parse_rar]cCsZ|j|�}|j|�}|dkr0tjd||f�n&|sV|jj|d||rNtjgndd�|S)Nz,Invalid readahead value '%s' for device '%s'z%dF)r�)r�r�rAr�r&r�r?r@)r'r�r6r�rSr�r�r+r+r,�_set_readahead�s

zDiskPlugin._set_readaheadcCs6|j|�}|jj||d�j�}t|�dkr.dSt|�S)N)r�r)r�r&r��stripr"r�)r'r6r�r�r�r+r+r,�_get_readahead�s

zDiskPlugin._get_readaheadr^c	Cs�|rdS|jd|d�}|r^|j|�}|dkr0dStt|�|�}|jj||�|j||d�n2|jj|�}|dkrvdS|j||d�|jj|�dS)Nr^)Zcommand_namerWF)	Z_storage_keyr�r�r�Z_storager0r�rMZunset)	r'ZenablingZ
multiplierr6Zverifyr�Zstorage_keyZ
old_readaheadZ
new_readaheadr+r+r,�_multiply_readahead�s"
zDiskPlugin._multiply_readaheadcCs|j|d�S)Nzqueue/iosched/quantum)r�)r'r6r+r+r,�_scheduler_quantum_file�sz"DiskPlugin._scheduler_quantum_filer_cCs8|j|�}|s4|jj|dt|�|r,tjgndd�|S)Nz%dF)r�)r�r&r�r�r?r@)r'r�r6r�rSr�r+r+r,�_set_scheduler_quantum�s

z!DiskPlugin._set_scheduler_quantumcCsH|j|�}|jj||d�j�}t|�dkr@|s<tjd|�dSt|�S)N)r�rz>disk_scheduler_quantum option is not supported for device '%s')r�r&r�r�r"rArCr�)r'r6r�r�r�r+r+r,�_get_scheduler_quantum�s
z!DiskPlugin._get_scheduler_quantum)r�)F)F)F)F)F)0�__name__�
__module__�__qualname__�__doc__rr.r:r5�classmethodr3rQrRrPrUrXr`rarhrjrnrprqrzrvrwrxr�r�r�r�Zcommand_setr�Zcommand_getr�r�r�r�r�r�r�r�r�Zcommand_customr�r�r�r��
__classcell__r+r+)r*r,rsZJ
2	
r)r?�rZ
decoratorsZ
tuned.logsZtunedZtuned.constsrkZtuned.utils.commandsrr�r�ZlogsrMrAZPluginrr+r+r+r,�<module>s

site-packages/tuned/plugins/__pycache__/plugin_service.cpython-36.opt-1.pyc000064400000033530147511334670022716 0ustar003

�<�e�)�@s�ddlmZddlZddljZddlTddlZddlZddlZ	ddl
mZe	jj
�Ze�ZGdd�d�ZGdd	�d	�ZGd
d�de�ZGdd
�d
e�ZGdd�de�ZGdd�de�ZGdd�de�ZGdd�dej�ZdS)�)�base�N)�*)�commandsc@seZdZddd�ZdS)�ServiceNcCs||_||_||_||_dS)N)�enable�start�cfg_file�runlevel)�selfrrr	r
�r�$/usr/lib/python3.6/plugin_service.py�__init__szService.__init__)NNNN)�__name__�
__module__�__qualname__rrrrr
r
src@s4eZdZdd�Zdd�Zdd�Zdd�Zd	d
�ZdS)�InitHandlercCs(tjdg�\}}|dkr$|j�dSdS)Nr
rr���)�cmd�execute�split)r�retcode�outrrr
�runlevel_getszInitHandler.runlevel_getcCstjddg�dS)NZtelinit�q)rr)rrrr
�
daemon_reloadszInitHandler.daemon_reloadcCsdS)Nr)r�namer	rrr
�cfg_installszInitHandler.cfg_installcCsdS)Nr)rrr	rrr
�
cfg_uninstallszInitHandler.cfg_uninstallcCsdS)Nr)rrr	rrr
�
cfg_verify"szInitHandler.cfg_verifyN)rrrrrrrrrrrr
rs
rc@s<eZdZdd�Zdd�Zdd�Zdd�Zd	d
�Zdd�Zd
S)�SysVBasicHandlercCstjd|dg�dS)N�servicer)rr)rrrrr
r'szSysVBasicHandler.startcCstjd|dg�dS)Nr!�stop)rr)rrrrr
r"*szSysVBasicHandler.stopcCs
t��dS)N)�NotImplementedError)rrr
rrr
r-szSysVBasicHandler.enablecCs
t��dS)N)r#)rrr
rrr
�disable0szSysVBasicHandler.disablecCs"tjd|dgdgd�\}}|dkS)Nr!�statusr)�	no_errors)rr)rrrrrrr
�
is_running3szSysVBasicHandler.is_runningcCs
t��dS)N)r#)rrr
rrr
�
is_enabled7szSysVBasicHandler.is_enabledN)	rrrrr"rr$r'r(rrrr
r &sr c@s$eZdZdd�Zdd�Zdd�ZdS)�SysVHandlercCstjdd||dg�dS)N�	chkconfigz--level�on)rr)rrr
rrr
r;szSysVHandler.enablecCstjdd||dg�dS)Nr*z--level�off)rr)rrr
rrr
r$>szSysVHandler.disablecCsBtjdd|g�\}}|dkr>|jdt|��ddd�dkSdS)Nr*z--listrz%s:r�r+)rrr�str)rrr
rrrrr
r(AszSysVHandler.is_enabledN)rrrrr$r(rrrr
r):sr)c@s$eZdZdd�Zdd�Zdd�ZdS)�
SysVRCHandlercCstjdd||dg�dS)Nzsysv-rc-confz--levelr+)rr)rrr
rrr
rFszSysVRCHandler.enablecCstjdd||dg�dS)Nzsysv-rc-confz--levelr,)rr)rrr
rrr
r$IszSysVRCHandler.disablecCsBtjdd|g�\}}|dkr>|jdt|��ddd�dkSdS)Nzsysv-rc-confz--listrz%s:rr-r+)rrrr.)rrr
rrrrr
r(LszSysVRCHandler.is_enabledN)rrrrr$r(rrrr
r/Esr/c@sDeZdZdd�Zdd�Zdd�Zdd�Zd	d
�Zdd�Zd
d�Z	dS)�
OpenRCHandlercCs&tjddg�\}}|dkr"|j�SdS)Nz	rc-statusz-rr)rr�strip)rrrrrr
rQszOpenRCHandler.runlevel_getcCstjd|dg�dS)Nz
rc-servicer)rr)rrrrr
rUszOpenRCHandler.startcCstjd|dg�dS)Nz
rc-servicer")rr)rrrrr
r"XszOpenRCHandler.stopcCstjdd||g�dS)Nz	rc-update�add)rr)rrr
rrr
r[szOpenRCHandler.enablecCstjdd||g�dS)Nz	rc-update�del)rr)rrr
rrr
r$^szOpenRCHandler.disablecCs"tjd|dgdgd�\}}|dkS)Nz
rc-servicer%r)r&)rr)rrrrrrr
r'aszOpenRCHandler.is_runningcCs2tjdd|g�\}}ttjdtj|�d|��S)Nz	rc-updateZshowz\b)rr�bool�re�search�escape)rrr
rrrrr
r(eszOpenRCHandler.is_enabledN)
rrrrrr"rr$r'r(rrrr
r0Psr0c@s\eZdZdd�Zdd�Zdd�Zdd�Zd	d
�Zdd�Zd
d�Z	dd�Z
dd�Zdd�ZdS)�SystemdHandlercCsdS)N�r)rrrr
rkszSystemdHandler.runlevel_getcCstjdd|g�dS)N�	systemctlZrestart)rr)rrrrr
rnszSystemdHandler.startcCstjdd|g�dS)Nr:r")rr)rrrrr
r"qszSystemdHandler.stopcCstjdd|g�dS)Nr:r)rr)rrr
rrr
rtszSystemdHandler.enablecCstjdd|g�dS)Nr:r$)rr)rrr
rrr
r$wszSystemdHandler.disablecCs"tjdd|gdgd�\}}|dkS)Nr:z	is-activer)r&)rr)rrrrrrr
r'zszSystemdHandler.is_runningcCs>tjdd|gdgd�\}}|j�}|dkr.dS|dkr:dSdS)	Nr:z
is-enabledr)r&ZenabledTZdisabledF)rrr1)rrr
rrr%rrr
r(~szSystemdHandler.is_enabledcCs�tjd||f�tjj|�s0tjd|�dStj|}ytj|tj	�Wn2t
k
r~}ztjd||f�dSd}~XnXtj||�|j
�dS)NzCinstalling service configuration overlay file '%s' for service '%s'z)Unable to find service configuration '%s'z#Unable to create directory '%s': %s)�log�info�os�path�exists�error�consts�SERVICE_SYSTEMD_CFG_PATH�makedirsZDEF_SERVICE_CFG_DIR_MODE�OSErrorr�copyr)rrr	�dirpath�errr
r�s
zSystemdHandler.cfg_installcCsntjd||f�tj|}d|tjj|�f}tj|�|j	�ytj
|�Wnttfk
rhYnXdS)NzEuninstalling service configuration overlay file '%s' for service '%s'z%s/%s)
r;r<rArBr=r>�basenamer�unlinkr�rmdir�FileNotFoundErrorrD)rrr	rFr>rrr
r�s

zSystemdHandler.cfg_uninstallcCs�|dkrdSdtj|tjj|�f}tjj|�sHtjd||f�dStjj|�sjtjd||f�dStj	|�}tj	|�}||kS)Nz%s/%sz.Unable to find service '%s' configuration '%s'Fz0Service '%s' configuration not installed in '%s')
rArBr=r>rHr?r;r@rZ	sha256sum)rrr	r>Z
sha256sum1Z
sha256sum2rrr
r�s

zSystemdHandler.cfg_verifyN)
rrrrrr"rr$r'r(rrrrrrr
r8isr8csneZdZdZ�fdd�Zdd�Zdd�Zdd	�Zd
d�Zdd
�Z	dd�Z
dd�Zdd�Ze
jfdd�Z�ZS)�
ServicePlugina1
	`service`::
	
	Plug-in for handling sysvinit, sysv-rc, openrc and systemd services.
	+
	The syntax is as follows:
	+
	[subs="+quotes,+macros"]
	----
	[service]
	service.__service_name__=__commands__[,file:__file__]
	----
	+
	Supported service-handling `_commands_` are `start`, `stop`, `enable`
	and `disable`. The optional `file:__file__` directive installs an overlay
	configuration file `__file__`. Multiple commands must be comma (`,`)
	or semicolon (`;`) separated. If the directives conflict, the last
	one is used.
	+
	The service plugin supports configuration overlays only for systemd.
	In other init systems, this directive is ignored. The configuration
	overlay files are copied to `/etc/systemd/system/__service_name__.service.d/`
	directories. Upon profile unloading, the directory is removed if it is empty.
	+
	With systemd, the `start` command is implemented by `restart` in order
	to allow loading of the service configuration file overlay.
	+
	NOTE: With non-systemd init systems, the plug-in operates on the
	current runlevel only.
	+
	.Start and enable the `sendmail` service with an overlay file
	====
	----
	[service]
	service.sendmail=start,enable,file:${i:PROFILE_DIR}/tuned-sendmail.conf
	----
	The internal variable `${i:PROFILE_DIR}` points to the directory
	from which the profile is loaded.
	====
	cs&tt|�j||�d|_|j�|_dS)NT)�superrLrZ_has_dynamic_options�_detect_init_system�
_init_handler)r�args�kwargs)�	__class__rr
r�szServicePlugin.__init__cCstj|dgd�\}}|dkS)Nr)r&)rr)rZcommandrrrrr
�
_check_cmd�szServicePlugin._check_cmdcCs�|jddg�rtjd�t�S|jdg�r:tjd�t�S|jddg�rXtjd�t�S|jd	dg�rvtjd
�t�Stjd��dS)Nr:r%zdetected systemdr*zdetected generic sysvinitzupdate-rc.dz-hzdetected sysv-rcz	rc-updatezdetected openrcz8Unable to detect your init system, disabling the plugin.)	rSr;�debugr8r)r/r0�
exceptionsZNotSupportedPluginException)rrrr
rN�s



z!ServicePlugin._detect_init_systemcCs�tjd|�}t�}x~|D]v}|dkr,d|_q|dkr<d|_q|dkrLd|_q|dkr\d|_q|dd�d	kr||dd�|_qtjd
||f�qW|S)Nz
\s*[,;]\s*rTr$Frr"�zfile:z*service '%s': invalid service option: '%s')r5rrrrr	r;r@)rr�val�lr!�irrr
�_parse_service_options�s
z$ServicePlugin._parse_service_optionscs6d|_d|_tj�fdd�|jj�D���_i|_dS)NFTcsTg|]L\}}|dd�dkrt|�dkr|dd��j|dd��jj|��f�qS)N�zservice.)�lenrZZ
_variables�expand)�.0Zoption�value)rrr
�
<listcomp>sz0ServicePlugin._instance_init.<locals>.<listcomp>)Z_has_dynamic_tuningZ_has_static_tuning�collections�OrderedDictZoptions�items�	_services�_services_original)r�instancer)rr
�_instance_inits
zServicePlugin._instance_initcCsdS)Nr)rrfrrr
�_instance_cleanup	szServicePlugin._instance_cleanupcCsT|r|jj|�n|dk	r&|jj|�|r:|jj||�n|dk	rP|jj||�dS)N)rOrr"rr$)rrrrr
rrr
�_process_serviceszServicePlugin._process_servicecCs�|jj�}|dkr tjd�dSx�|jj�D]�}|jj|d|�}|jj|d�}t|||dj	|�|j
|d<|dj	r�|jj|d|dj	�|j|d|dj
|dj|�q,WdS)NzCannot detect runlevelrr)rOrr;r@rdrcr(r'rr	rerrirr)rrfr
r!r(r'rrr
�_instance_apply_statics


z$ServicePlugin._instance_apply_staticc
CsH|jj�}|dkr&tjtjd�dSd}�x|jj�D�]}|jj|d|dj	�}|r~tj
tjd|d|dj	f�nR|dk	r�tjtjd|d|dj	f�d}n"tj
tjd|d|dj	f�|jj
|d|�}|jj|d�}	|jd	|d|dj|	|�dk�rd}|jd
|d|dj||�dkr8d}q8W|S)Nzcannot detect runlevelFTrrz'service '%s' configuration '%s' matchesz'service '%s' configuration '%s' differszservice '%s' configuration '%s'z
%s runningz
%s enabled)rOrr;r@rAZSTR_VERIFY_PROFILE_FAILrdrcrr	r<ZSTR_VERIFY_PROFILE_OKZ STR_VERIFY_PROFILE_VALUE_MISSINGr(r'Z
_verify_valuerr)
rrfZignore_missingZdevicesr
�retr!Zret_cfg_verifyr(r'rrr
�_instance_verify_static$s(
$""$"z%ServicePlugin._instance_verify_staticcCsLxFt|jj��D]4\}}|jr.|jj||j�|j||j|j|j	�qWdS)N)
�listrercr	rOrrirrr
)rrfZrollbackrr_rrr
�_instance_unapply_static<sz&ServicePlugin._instance_unapply_static)rrr�__doc__rrSrNrZrgrhrirjrlrAZ
ROLLBACK_SOFTrn�
__classcell__rr)rRr
rL�s(	
rL)r9rraZtuned.constsrAZ
decoratorsr=r5Z
tuned.logsZtunedZtuned.utils.commandsrZlogs�getr;rrrr r)r/r0r8ZPluginrLrrrr
�<module>s"

Bsite-packages/tuned/plugins/__pycache__/plugin_audio.cpython-36.pyc000064400000010041147511334670021410 0ustar003

�<�e��@snddlmZddlTddlZddlmZddlZddlZddl	Z	ddl
Z
ejj�Z
e�ZGdd�dej�ZdS)�)�hotplug)�*�N)�commandsc@s�eZdZdZdd�Zdd�Zdd�Zdd	�Zed
d��Z	dd
�Z
dd�Zeddd�dd��Z
ed�ddd��Zeddd�dd��Zed�ddd��ZdS) �AudioPlugina�
	`audio`::
	
	Sets audio cards' power saving options. The plug-in sets the auto suspend
	timeout for audio codecs to the value specified by the [option]`timeout`
	option.
	+
	Currently, the `snd_hda_intel` and `snd_ac97_codec` codecs are
	supported and the [option]`timeout` value is in seconds. To disable
	auto suspend for these codecs, set the [option]`timeout` value
	to `0`. To enforce the controller reset, set the option
	[option]`reset_controller` to `true`. Note that power management
	is supported per module. Hence, the kernel module names are used as
	device names.
	+
	.Set the timeout value to 10s and enforce the controller reset
	====
	----
	[audio]
	timeout=10
	reset_controller=true
	----
	====
	cCsTd|_t�|_t�|_x8|jjd�jd�D]"}|j|�}|dkr*|jj|�q*WdS)NTZsoundzcard*�
snd_hda_intel�snd_ac97_codec)rr)	Z_devices_supported�setZ_assigned_devicesZ
_free_devicesZ_hardware_inventoryZget_devicesZmatch_sys_name�_device_module_name�add)�self�deviceZmodule_name�r�"/usr/lib/python3.6/plugin_audio.py�
_init_devices(s
zAudioPlugin._init_devicescCsd|_d|_dS)NTF)Z_has_static_tuningZ_has_dynamic_tuning)r�instancerrr�_instance_init2szAudioPlugin._instance_initcCsdS)Nr)rrrrr�_instance_cleanup6szAudioPlugin._instance_cleanupc	Csy|jjSdSdS)N)�parentZdriver)rr
rrrr
9szAudioPlugin._device_module_namecCs
ddd�S)NrF)�timeout�reset_controllerr)�clsrrr�_get_config_options?szAudioPlugin._get_config_optionscCsd|S)Nz$/sys/module/%s/parameters/power_saver)rr
rrr�
_timeout_pathFszAudioPlugin._timeout_pathcCsd|S)Nz//sys/module/%s/parameters/power_save_controllerr)rr
rrr�_reset_controller_pathIsz"AudioPlugin._reset_controller_pathrT)Z
per_devicec
Csryt|�}Wn"tk
r.tjd|�dSX|dkrj|j|�}|sftj|d||r^tjgndd�|SdSdS)Nz!timeout value '%s' is not integerrz%dF)�no_error)	�int�
ValueError�log�errorr�cmd�
write_to_file�errno�ENOENT)r�valuer
�sim�remover�sys_filerrr�_set_timeoutLs
zAudioPlugin._set_timeoutFcCs,|j|�}tj||d�}t|�dkr(|SdS)N)rr)rr �	read_file�len)rr
�ignore_missingr'r$rrr�_get_timeout\s

zAudioPlugin._get_timeoutrcCsHtj|�}|j|�}tjj|�rD|s@tj|||r8tjgndd�|SdS)NF)r)	r �get_boolr�os�path�existsr!r"r#)rr$r
r%r&�vr'rrr�_set_reset_controllerds

z!AudioPlugin._set_reset_controllercCs:|j|�}tjj|�r6tj|�}t|�dkr6tj|�SdS)Nr)rr.r/r0r r)r*r-)rr
r+r'r$rrr�_get_reset_controlleros


z!AudioPlugin._get_reset_controllerN)F)F)�__name__�
__module__�__qualname__�__doc__rrrr
�classmethodrrrZcommand_setr(Zcommand_getr,r2r3rrrrrs
r)�rZ
decoratorsZ
tuned.logsZtunedZtuned.utils.commandsrr.r"�structZglobZlogs�getrr ZPluginrrrrr�<module>s
site-packages/tuned/plugins/__pycache__/plugin_acpi.cpython-36.pyc000064400000006545147511334670021241 0ustar003

�<�e�	�@sXddlmZddlTddlZddlZddlZddlmZej	j
�ZGdd�dej�Z
dS)�)�base)�*�N)�ACPI_DIRcsveZdZdZ�fdd�Zedd��Zdd�Zdd	�Zed
d��Z	edd
��Z
ed�dd��Ze
d�ddd��Z�ZS)�
ACPIPlugina>
	`acpi`::

	Configures the ACPI driver.
	+
	The only currently supported option is
	[option]`platform_profile`, which sets the ACPI
	platform profile sysfs attribute,
	a generic power/performance preference API for other drivers.
	Multiple profiles can be specified, separated by `|`.
	The first available profile is selected.
	+
	--
	.Selecting a platform profile
	====
	----
	[acpi]
	platform_profile=balanced|low-power
	----
	Using this option, *TuneD* will try to set the platform profile
	to `balanced`. If that fails, it will try to set it to `low-power`.
	====
	--
	cstt|�j||�dS)N)�superr�__init__)�self�args�kwargs)�	__class__��!/usr/lib/python3.6/plugin_acpi.pyr$szACPIPlugin.__init__cCsddiS)N�platform_profiler
)�clsr
r
r�_get_config_options'szACPIPlugin._get_config_optionscCsd|_d|_dS)NTF)Z_has_static_tuningZ_has_dynamic_tuning)r	�instancer
r
r�_instance_init+szACPIPlugin._instance_initcCsdS)Nr
)r	rr
r
r�_instance_cleanup/szACPIPlugin._instance_cleanupcCstjjtd�S)NZplatform_profile_choices)�os�path�joinr)rr
r
r�_platform_profile_choices_path2sz)ACPIPlugin._platform_profile_choices_pathcCstjjtd�S)Nr)rrrr)rr
r
r�_platform_profile_path6sz!ACPIPlugin._platform_profile_pathrcCs�tjj|j��stjd�dSdd�|jd�D�}t|jj	|j
��j��}xZ|D]R}||kr�|s�tjd|�|jj|j�||r�t
jgndd�|Stjd|�qPWtjd	�dS)
Nz5ACPI platform_profile is not supported on this systemcSsg|]}|j��qSr
)�strip)�.0�profiler
r
r�
<listcomp>?sz4ACPIPlugin._set_platform_profile.<locals>.<listcomp>�|z Setting platform_profile to '%s'F)Zno_errorz+Requested platform_profile '%s' unavailablezDFailed to set platform_profile. Is the value in the profile correct?)rr�isfiler�log�debug�split�set�_cmd�	read_filer�infoZ
write_to_file�errno�ENOENT�warn�error)r	ZprofilesZsim�removeZavail_profilesrr
r
r�_set_platform_profile:s


z ACPIPlugin._set_platform_profileFcCs2tjj|j��stjd�dS|jj|j��j�S)Nz5ACPI platform_profile is not supported on this system)	rrrrr r!r$r%r)r	Zignore_missingr
r
r�_get_platform_profileLs
z ACPIPlugin._get_platform_profile)F)�__name__�
__module__�__qualname__�__doc__r�classmethodrrrrrZcommand_setr,Zcommand_getr-�
__classcell__r
r
)rrrsr)�rZ
decoratorsrr'Z
tuned.logsZtunedZtuned.constsrZlogs�getr ZPluginrr
r
r
r�<module>s
site-packages/tuned/plugins/__pycache__/hotplug.cpython-36.opt-1.pyc000064400000010347147511334670021363 0ustar003

�<�eX�@s>ddlmZddljZddlZejj�ZGdd�dej	�Z	dS)�)�base�Ncs�eZdZdZ�fdd�Z�fdd�Zdd�Zdd	�Zd
d�Zdd
�Z	dd�Z
dd�Zdd�Zdd�Z
dd�Zdd�Zdd�Zdd�Z�ZS)�Pluginz:
	Base class for plugins with device hotpluging support.
	cstt|�j||�dS)N)�superr�__init__)�self�args�kwargs)�	__class__��/usr/lib/python3.6/hotplug.pyrszPlugin.__init__cstt|�j�|j�dS)N)rr�cleanup�_hardware_events_cleanup)r)r
rrr
szPlugin.cleanupcCsdS)Nr)rrrr�_hardware_events_initszPlugin._hardware_events_initcCsdS)Nr)rrrrrszPlugin._hardware_events_cleanupcCs|j�dS)N)r)rrrr�
_init_devicesszPlugin._init_devicescCsN|dkr&tjd|j�|j|j�n$|dkrJtjd|j�|j|j�dS)N�addzdevice '%s' added�removezdevice '%s' removed)�log�infoZsys_name�_add_device�_remove_device)rZeventZdevicerrr�_hardware_events_callbacksz Plugin._hardware_events_callbackcCsdtjd|j|f�|jj|�|j||jd|g�|j||�|j||jd|g�|j	j|�dS)Nz!instance %s: adding new device %sZapply)
rr�name�_assigned_devicesr�_call_device_script�
script_pre�_added_device_apply_tuning�script_post�processed_devices)r�instance�device_namerrr�_add_device_process$szPlugin._add_device_processcCsr||j|jBkrdSxXt|jj��D],\}}t|j||g��dkr$|j||�Pq$Wtj	d|�|jj
|�dS)Nrzno instance wants %s)r�
_free_devices�list�
_instances�items�lenZ_get_matching_devicesr!r�debugr)rr Z
instance_namerrrrr,szPlugin._add_devicecCs8x|D]}|j||�qWt|j�t|j�dk|_dS)zN
		Add devices specified by the set to the instance, no check is performed.
		rN)r!r&r�assigned_devices�active)rr�device_names�devrrr�_add_devices_nocheck8s
zPlugin._add_devices_nocheckcCsx||jkrt|j||jd|g�|j||�|j||jd|g�|jj|�t|j�t|j�dk|_|j	j|�dSdS)NZunapplyrTF)
rrr�_removed_device_unapply_tuningrrr&r(r)r)rrr rrr�_remove_device_processCs
zPlugin._remove_device_processcCsJ||j|jBkrdSx0t|jj��D]}|j||�r$Pq$W|jj|�dS)zVRemove device from the instance

		Parameters:
		device_name -- name of the device

		N)rr"r#r$�valuesr.r)rr rrrrrQszPlugin._remove_devicecCsx|D]}|j||�qWdS)zS
		Remove devices specified by the set from the instance, no check is performed.
		N)r.)rrr*r+rrr�_remove_devices_nocheckas
zPlugin._remove_devices_nocheckcCs6|j||g�|jr2|jjtjtj�r2|j||�dS)N)Z_execute_all_device_commands�has_dynamic_tuning�_global_cfg�get�consts�CFG_DYNAMIC_TUNING�CFG_DEF_DYNAMIC_TUNINGZ_instance_apply_dynamic)rrr rrrrhsz!Plugin._added_device_apply_tuningcCs:|jr$|jjtjtj�r$|j||�|j||gdd�dS)NT)r)r1r2r3r4r5r6Z_instance_unapply_dynamicZ_cleanup_all_device_commands)rrr rrrr-msz%Plugin._removed_device_unapply_tuning)�__name__�
__module__�__qualname__�__doc__rr
rrrrr!rr,r.rr0rr-�
__classcell__rr)r
rrsr)
�rZtuned.constsr4Z
tuned.logsZtunedZlogsr3rrrrrr�<module>s

site-packages/tuned/plugins/__pycache__/plugin_scsi_host.cpython-36.opt-1.pyc000064400000011074147511334670023253 0ustar003

�<�eQ�@sjddlZddlmZddlTddlZddljZddlm	Z	ddl
Z
ddlZejj
�ZGdd�dej�ZdS)�N�)�hotplug)�*)�commandscs�eZdZdZ�fdd�Z�fdd�Zdd�Zedd	��Zd
d�Z	dd
�Z
�fdd�Z�fdd�Z�fdd�Z
edd��Zdd�Zdd�Zdd�Zeddd�dd ��Zed�d$d"d#��Z�ZS)%�SCSIHostPlugina�
	`scsi_host`::
	
	Tunes options for SCSI hosts.
	+
	The plug-in sets Aggressive Link Power Management (ALPM) to the value specified
	by the [option]`alpm` option. The option takes one of three values:
	`min_power`, `medium_power` and `max_performance`.
	+
	NOTE: ALPM is only available on SATA controllers that use the Advanced
	Host Controller Interface (AHCI).
	+
	.ALPM setting when extended periods of idle time are expected
	====
	----
	[scsi_host]
	alpm=min_power
	----
	====
	cstt|�j||�t�|_dS)N)�superr�__init__r�_cmd)�self�args�kwargs)�	__class__��&/usr/lib/python3.6/plugin_scsi_host.pyr"szSCSIHostPlugin.__init__csVtt|�j�d|_t�|_x,|jjd�D]}|j|�r*|jj	|j
�q*Wt�|_dS)NT�scsi)rr�
_init_devicesZ_devices_supported�setZ
_free_devices�_hardware_inventoryZget_devices�_device_is_supported�addZsys_nameZ_assigned_devices)r
�device)r
rrr's
zSCSIHostPlugin._init_devicescs�fdd�|D�S)Ncsg|]}�jjd|��qS)r)rZ
get_device)�.0�x)r
rr�
<listcomp>2sz6SCSIHostPlugin._get_device_objects.<locals>.<listcomp>r)r
Zdevicesr)r
r�_get_device_objects1sz"SCSIHostPlugin._get_device_objectscCs
|jdkS)NZ	scsi_host)Zdevice_type)�clsrrrrr4sz#SCSIHostPlugin._device_is_supportedcCs|jj|d|j�dS)Nr)rZ	subscribe�_hardware_events_callback)r
rrr�_hardware_events_init8sz$SCSIHostPlugin._hardware_events_initcCs|jj|�dS)N)rZunsubscribe)r
rrr�_hardware_events_cleanup;sz'SCSIHostPlugin._hardware_events_cleanupcs |j|�rtt|�j||�dS)N)rrrr)r
Zeventr)r
rrr>s
z(SCSIHostPlugin._hardware_events_callbackcstt|�j||�dS)N)rr�_added_device_apply_tuning)r
�instance�device_name)r
rrrBsz)SCSIHostPlugin._added_device_apply_tuningcstt|�j||�dS)N)rr�_removed_device_unapply_tuning)r
r r!)r
rrr"Esz-SCSIHostPlugin._removed_device_unapply_tuningcCsddiS)N�alpmr)rrrr�_get_config_optionsHsz"SCSIHostPlugin._get_config_optionscCsd|_d|_dS)NTF)Z_has_static_tuningZ_has_dynamic_tuning)r
r rrr�_instance_initNszSCSIHostPlugin._instance_initcCsdS)Nr)r
r rrr�_instance_cleanupRsz SCSIHostPlugin._instance_cleanupcCstjjdt|�d�S)Nz/sys/class/scsi_host/Zlink_power_management_policy)�os�path�join�str)r
rrrr�_get_alpm_policy_fileUsz$SCSIHostPlugin._get_alpm_policy_filer#T)Z
per_devicecCsd|dkrdS|j|�}|s`tjj|�rF|jj|||r<tjgndd�ntj	d|t
|�f�dS|S)NF)�no_errorzBALPM control file ('%s') not found, skipping ALPM setting for '%s')r+r'r(�existsr	Z
write_to_file�errno�ENOENT�log�infor*)r
�policyrZsim�remove�policy_filerrr�	_set_alpmXs

zSCSIHostPlugin._set_alpmFcCs.|j|�}|jj|dd�j�}|dkr*|SdS)NT)r,�)r+r	Z	read_file�strip)r
rZignore_missingr4r2rrr�	_get_alpmfs
zSCSIHostPlugin._get_alpm)F)�__name__�
__module__�__qualname__�__doc__rrr�classmethodrrrrrr"r$r%r&r+Zcommand_setr5Zcommand_getr8�
__classcell__rr)r
rrs"
r)r.r6rZ
decoratorsZ
tuned.logsZtunedZtuned.constsZconstsZtuned.utils.commandsrr'�reZlogs�getr0ZPluginrrrrr�<module>s

site-packages/tuned/plugins/__pycache__/hotplug.cpython-36.pyc000064400000010347147511334670020424 0ustar003

�<�eX�@s>ddlmZddljZddlZejj�ZGdd�dej	�Z	dS)�)�base�Ncs�eZdZdZ�fdd�Z�fdd�Zdd�Zdd	�Zd
d�Zdd
�Z	dd�Z
dd�Zdd�Zdd�Z
dd�Zdd�Zdd�Zdd�Z�ZS)�Pluginz:
	Base class for plugins with device hotpluging support.
	cstt|�j||�dS)N)�superr�__init__)�self�args�kwargs)�	__class__��/usr/lib/python3.6/hotplug.pyrszPlugin.__init__cstt|�j�|j�dS)N)rr�cleanup�_hardware_events_cleanup)r)r
rrr
szPlugin.cleanupcCsdS)Nr)rrrr�_hardware_events_initszPlugin._hardware_events_initcCsdS)Nr)rrrrrszPlugin._hardware_events_cleanupcCs|j�dS)N)r)rrrr�
_init_devicesszPlugin._init_devicescCsN|dkr&tjd|j�|j|j�n$|dkrJtjd|j�|j|j�dS)N�addzdevice '%s' added�removezdevice '%s' removed)�log�infoZsys_name�_add_device�_remove_device)rZeventZdevicerrr�_hardware_events_callbacksz Plugin._hardware_events_callbackcCsdtjd|j|f�|jj|�|j||jd|g�|j||�|j||jd|g�|j	j|�dS)Nz!instance %s: adding new device %sZapply)
rr�name�_assigned_devicesr�_call_device_script�
script_pre�_added_device_apply_tuning�script_post�processed_devices)r�instance�device_namerrr�_add_device_process$szPlugin._add_device_processcCsr||j|jBkrdSxXt|jj��D],\}}t|j||g��dkr$|j||�Pq$Wtj	d|�|jj
|�dS)Nrzno instance wants %s)r�
_free_devices�list�
_instances�items�lenZ_get_matching_devicesr!r�debugr)rr Z
instance_namerrrrr,szPlugin._add_devicecCs8x|D]}|j||�qWt|j�t|j�dk|_dS)zN
		Add devices specified by the set to the instance, no check is performed.
		rN)r!r&r�assigned_devices�active)rr�device_names�devrrr�_add_devices_nocheck8s
zPlugin._add_devices_nocheckcCsx||jkrt|j||jd|g�|j||�|j||jd|g�|jj|�t|j�t|j�dk|_|j	j|�dSdS)NZunapplyrTF)
rrr�_removed_device_unapply_tuningrrr&r(r)r)rrr rrr�_remove_device_processCs
zPlugin._remove_device_processcCsJ||j|jBkrdSx0t|jj��D]}|j||�r$Pq$W|jj|�dS)zVRemove device from the instance

		Parameters:
		device_name -- name of the device

		N)rr"r#r$�valuesr.r)rr rrrrrQszPlugin._remove_devicecCsx|D]}|j||�qWdS)zS
		Remove devices specified by the set from the instance, no check is performed.
		N)r.)rrr*r+rrr�_remove_devices_nocheckas
zPlugin._remove_devices_nocheckcCs6|j||g�|jr2|jjtjtj�r2|j||�dS)N)Z_execute_all_device_commands�has_dynamic_tuning�_global_cfg�get�consts�CFG_DYNAMIC_TUNING�CFG_DEF_DYNAMIC_TUNINGZ_instance_apply_dynamic)rrr rrrrhsz!Plugin._added_device_apply_tuningcCs:|jr$|jjtjtj�r$|j||�|j||gdd�dS)NT)r)r1r2r3r4r5r6Z_instance_unapply_dynamicZ_cleanup_all_device_commands)rrr rrrr-msz%Plugin._removed_device_unapply_tuning)�__name__�
__module__�__qualname__�__doc__rr
rrrrr!rr,r.rr0rr-�
__classcell__rr)r
rrsr)
�rZtuned.constsr4Z
tuned.logsZtunedZlogsr3rrrrrr�<module>s

site-packages/tuned/plugins/__pycache__/plugin_rtentsk.cpython-36.opt-1.pyc000064400000002531147511334670022745 0ustar003

�<�eU�@s`ddlmZddlTddlZddlmZddlZddlZddl	Z	ej
j�ZGdd�dej
�ZdS)�)�base)�*�N)�commandsc@s eZdZdZdd�Zdd�ZdS)�
RTENTSKPluginz�
	`rtentsk`::
	
	Plugin for avoiding interruptions caused by static key IPIs that are
	triggered when a socket with timestamping enabled is opened (by opening
	such a socket ourselves, the static key is kept enabled).
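	+
	A rough sketch of the idea in plain Python (illustrative only; the two
	constants are the standard Linux ABI values and are assumed here rather
	than read from this compiled module):
	+
	----
	import socket
	
	SO_TIMESTAMPING = 37                     # Linux socket option number (assumed)
	SOF_TIMESTAMPING_OPT_TX_SWHW = 1 << 14   # timestamping flag (assumed)
	
	# Keeping this socket open keeps the kernel's timestamping static key
	# enabled, so later users of timestamping do not trigger static key IPIs.
	s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
	s.setsockopt(socket.SOL_SOCKET, SO_TIMESTAMPING, SOF_TIMESTAMPING_OPT_TX_SWHW)
	----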
	cCsLd|_d|_d}d}tjtjtjtj�}|jtj||�||_t	j
d�dS)NTF�r�z*opened SOF_TIMESTAMPING_OPT_TX_SWHW socketi@)Z_has_static_tuningZ_has_dynamic_tuning�socketZAF_INETZ
SOCK_DGRAMZIPPROTO_UDPZ
setsockoptZ
SOL_SOCKET�rtentsk_socket�log�info)�self�instanceZSO_TIMESTAMPZSOF_TIMESTAMPING_OPT_TX_SWHW�s�r�$/usr/lib/python3.6/plugin_rtentsk.py�_instance_initszRTENTSKPlugin._instance_initcCs|j}|j�dS)N)r
�close)r
rrrrr�_instance_cleanup$szRTENTSKPlugin._instance_cleanupN)�__name__�
__module__�__qualname__�__doc__rrrrrrrsr)�rZ
decoratorsZ
tuned.logsZtunedZtuned.utils.commandsrZglobr	ZtimeZlogs�getrZPluginrrrrr�<module>s
site-packages/tuned/plugins/__pycache__/exceptions.cpython-36.opt-1.pyc000064400000000542147511334670022056 0ustar003

�<�ec�@s ddlZGdd�dejj�ZdS)�Nc@seZdZdS)�NotSupportedPluginExceptionN)�__name__�
__module__�__qualname__�rr� /usr/lib/python3.6/exceptions.pyrsr)Ztuned.exceptionsZtuned�
exceptionsZTunedExceptionrrrrr�<module>ssite-packages/tuned/plugins/__pycache__/plugin_systemd.cpython-36.pyc000064400000013657147511334670022017 0ustar003

�<�e4�@snddlmZddlTddlZddlmZddlmZddlj	Z	ddl
Z
ddlZejj
�ZGdd�dej�ZdS)	�)�base)�*�N)�
exceptions)�commandscs�eZdZdZ�fdd�Zdd�Zdd�Zedd	��Zd
d�Z	dd
�Z
dd�Zdd�Zdd�Z
dd�Zdd�Zejfdd�Zdd�Zeddd�dd ��Z�ZS)!�
SystemdPlugina<
	`systemd`::
	
	Plug-in for tuning systemd options.
	+
	The [option]`cpu_affinity` option allows setting CPUAffinity in
	`/etc/systemd/system.conf`. This configures the CPU affinity for the
	service manager as well as the default CPU affinity for all forked
	off processes. The option takes a comma-separated list of CPUs with
	optional CPU ranges specified by the minus sign (`-`).
	+
	.Set the CPUAffinity for `systemd` to `0 1 2 3`
	====
	----
	[systemd]
	cpu_affinity=0-3
	----
	====
	+
	NOTE: These tunings are unloaded only on profile change followed by a reboot.
	cs<tjjtj�stjdtj��tt|�j	||�t
�|_dS)NzERequired systemd '%s' configuration file not found, disabling plugin.)�os�path�isfile�consts�SYSTEMD_SYSTEM_CONF_FILErZNotSupportedPluginException�superr�__init__r�_cmd)�self�args�kwargs)�	__class__��$/usr/lib/python3.6/plugin_systemd.pyr$szSystemdPlugin.__init__cCsd|_d|_dS)NFT)Z_has_dynamic_tuningZ_has_static_tuning)r�instancerrr�_instance_init*szSystemdPlugin._instance_initcCsdS)Nr)rrrrr�_instance_cleanup.szSystemdPlugin._instance_cleanupcCsddiS)N�cpu_affinityr)�clsrrr�_get_config_options1sz!SystemdPlugin._get_config_optionscCsB|dk	r>tjd|d|tjd�}|dk	r>|jdkr>|jd�SdS)Nz^\s*z\s*=\s*(.*)$)�flagsr)�re�search�	MULTILINE�	lastindex�group)r�conf�key�morrr�_get_keyval7s

zSystemdPlugin._get_keyvalcCs~tjd|ddt|�|tjd�\}}|dkrzy|ddkrF|d7}Wntk
r\YnX||dt|�d7}|S|S)	Nz^(\s*z\s*=).*$z\g<1>)rr�
�=���)r�subn�strr�
IndexError)rr"r#�valZconf_newZnsubsrrr�_add_keyval?s(zSystemdPlugin._add_keyvalcCstjd|dd|tjd�S)Nz^\s*z\s*=.*\n�)r)r�subr)rr"r#rrr�_del_keyKszSystemdPlugin._del_keycCs,|jjtjdd�}|dkr(tjd�dS|S)N)�err_retz(error reading systemd configuration file)r�	read_filerr�log�error)rZsystemd_system_confrrr�_read_systemd_system_confNs

z'SystemdPlugin._read_systemd_system_confcCsptjtj}|jj||�s8tjd�|jj|dd�dS|jj|tj�sltjdtj�|jj|dd�dSdS)Nz(error writing systemd configuration fileT)�no_errorFz/error replacing systemd configuration file '%s')	rrZTMP_FILE_SUFFIXr�
write_to_filer3r4�unlink�rename)rr"Ztmpfilerrr�_write_systemd_system_confUs
z(SystemdPlugin._write_systemd_system_confcCstjjtj|j�S)N)rr	�joinrZPERSISTENT_STORAGE_DIR�name)rrrr�_get_storage_filenamecsz#SystemdPlugin._get_storage_filenamecCsl|j�}|dk	rh|j�}|jj|ddd�}|jj|�|dkrN|j|tj�}n|j|tj|�}|j	|�dS)NT)r1r6)
r5r=rr2r8r0r�SYSTEMD_CPUAFFINITY_VARr-r:)rr"�fname�cpu_affinity_savedrrr�_remove_systemd_tuningfsz$SystemdPlugin._remove_systemd_tuningcCs0|tjkr,tjdtj�|j�tjd�dS)Nz6removing '%s' systemd tuning previously added by TuneDz[you may need to manualy run 'dracut -f' to update the systemd configuration in initrd image)rZ
ROLLBACK_FULLr3�infor>rAZconsole)rrZrollbackrrr�_instance_unapply_staticrs
z&SystemdPlugin._instance_unapply_staticc
Cs<|dkrdSdjdd�|jjtjddtjdd|���D��S)Nr.� css|]}t|�VqdS)N)r*)�.0�vrrr�	<genexpr>|sz8SystemdPlugin._cpulist_convert_unpack.<locals>.<genexpr>z\s+�,z,\s+)r;r�cpulist_unpackrr/)rZcpulistrrr�_cpulist_convert_unpackysz%SystemdPlugin._cpulist_convert_unpackrF)Z
per_devicecCs�d}d}|jj|jj|jj|���}djdd�|jj|�D��}|j�}	|	dk	rh|j|	t	j
�}|j|�}|r||jd|||�S|r�|j
�}
|jj|
ddd�}|dk	r�|dkr�||kr�|jj|
|dd�tjdt	j
|t	jf�|j|j|	t	j
|��dS)	NrDcss|]}t|�VqdS)N)r*)rErFrrrrG�sz)SystemdPlugin._cmdline.<locals>.<genexpr>rT)r1r6)Zmakedirz setting '%s' to '%s' in the '%s')rZunescapeZ
_variables�expandZunquoter;rIr5r%rr>rJZ
_verify_valuer=r2r7r3rBrr:r-)rZenabling�valueZverifyZignore_missingZ
conf_affinityZconf_affinity_unpackedrFZ
v_unpackedr"r?r@rrr�_cmdline~s"
zSystemdPlugin._cmdline)�__name__�
__module__�__qualname__�__doc__rrr�classmethodrr%r-r0r5r:r=rArZ
ROLLBACK_SOFTrCrJZcommand_customrM�
__classcell__rr)rrr
sr)r.rZ
decoratorsZ
tuned.logsZtunedrZtuned.utils.commandsrZtuned.constsrrrZlogs�getr3ZPluginrrrrr�<module>s

site-packages/tuned/plugins/__pycache__/exceptions.cpython-36.pyc000064400000000542147511334670021117 0ustar003

�<�ec�@s ddlZGdd�dejj�ZdS)�Nc@seZdZdS)�NotSupportedPluginExceptionN)�__name__�
__module__�__qualname__�rr� /usr/lib/python3.6/exceptions.pyrsr)Ztuned.exceptionsZtuned�
exceptionsZTunedExceptionrrrrr�<module>ssite-packages/tuned/plugins/__pycache__/plugin_cpu.cpython-36.pyc000064400000064100147511334670021103 0ustar003

�<�e:n�@s�ddlmZddlTddlZddlmZddljZddl	Z	ddl
Z
ddlZddl
Z
ddlZddl
Z
ejj�ZdZGdd�dej�ZdS)	�)�hotplug)�*�N)�commandsz$/sys/devices/system/cpu/cpu0/cpuidlecs$eZdZdZ�fdd�Zdd�Zdd�Zedd	��Zd
d�Z	dd
�Z
dd�Zdd�Zdd�Z
dd�Zdd�Zdd�Zdd�Zdd�Zdd�Zd d!�Zd"d#�Z�fd$d%�Zejf�fd&d'�	Zd(d)�Zd*d+�Zd,d-�Zd.d/�Zd0d1�Zdmd3d4�Zdnd5d6�Z dod7d8�Z!d9d:�Z"d;d<�Z#e$d=d>d?�d@dA��Z%e&d=�dpdBdC��Z'dqdEdF�Z(e$dGd>dHdI�dJdK��Z)e&dG�drdLdM��Z*dNdO�Z+dsdPdQ�Z,dRdS�Z-e$dTd>d?�dUdV��Z.dWdX�Z/dYdZ�Z0d[d\�Z1e&dT�dtd]d^��Z2d_d`�Z3dadb�Z4e$dcd>d?�ddde��Z5e&dc�dudfdg��Z6e$dhd>d?�didj��Z7e&dh�dvdkdl��Z8�Z9S)w�CPULatencyPlugina�
	`cpu`::
	
	Sets the CPU governor to the value specified by the [option]`governor`
	option and dynamically changes the Power Management Quality of
	Service (PM QoS) CPU Direct Memory Access (DMA) latency according
	to the CPU load.
	
	`governor`:::
	The [option]`governor` option of the 'cpu' plug-in supports specifying
	CPU governors. Multiple governors are separated using '|'. The '|'
	character is meant to represent a logical 'or' operator. Note that the
	same syntax is used for the [option]`energy_perf_bias` option. *TuneD*
	will set the first governor that is available on the system.
	+    
	For example, with the following profile, *TuneD* will set the 'ondemand'
	governor, if it is available. If it is not available, but the 'powersave'
	governor is available, 'powersave' will be set. If neither of them are
	available, the governor will not be changed.
	+
	.Specifying a CPU governor
	====
	----
	[cpu]
	governor=ondemand|powersave
	----
	====
	
	`sampling_down_factor`:::
	The sampling rate determines how frequently the governor checks
	to tune the CPU. The [option]`sampling_down_factor` is a tunable
	that multiplies the sampling rate when the CPU is at its highest
	clock frequency thereby delaying load evaluation and improving
	performance. Allowed values for sampling_down_factor are 1 to 100000.
	+
	.The recommended setting for jitter reduction
	====
	----
	[cpu]
	sampling_down_factor = 100
	----
	====
	
	`energy_perf_bias`:::
	[option]`energy_perf_bias` supports managing energy
	vs. performance policy via x86 Model Specific Registers using the
	`x86_energy_perf_policy` tool. Multiple alternative Energy Performance
	Bias (EPB) values are supported. The alternative values are separated
	using the '|' character. The following EPB values are supported
	starting with kernel 4.13: "performance", "balance-performance",
	"normal", "balance-power" and "power". On newer processors is value
	writen straight to file (see rhbz#2095829)
	+
	.Specifying alternative Energy Performance Bias values
	====
	----
	[cpu]
	energy_perf_bias=powersave|power
	----
	
	*TuneD* will try to set EPB to 'powersave'. If that fails, it will
	try to set it to 'power'.
	====
	
	`energy_performance_preference`:::
	[option]`energy_performance_preference` supports managing energy
	vs. performance hints on newer Intel and AMD processors with active P-State
	CPU scaling drivers (intel_pstate or amd-pstate). Multiple alternative
	Energy Performance Preferences (EPP) values are supported. The alternative
	values are separated using the '|' character. Available values can be found
	in `energy_performance_available_preferences` file in `CPUFreq` policy
	directory in `sysfs`.
	in
	+
	.Specifying alternative Energy Performance Hints values
	====
	----
	[cpu]
	energy_performance_preference=balance_power|power
	----
	
	*TuneD* will try to set EPP to 'balance_power'. If that fails, it will
	try to set it to 'power'.
	====
	
	`latency_low, latency_high, load_threshold`:::
	+
	If the CPU load is lower than the value specified by
	the[option]`load_threshold` option, the latency is set to the value
	specified either by the [option]`latency_high` option or by the
	[option]`latency_low` option.
	+
	`force_latency`:::
	You can also force the latency to a specific value and prevent it from
	dynamically changing further. To do so, set the [option]`force_latency`
	option to the required latency value.
	+
	The maximum latency value can be specified in several ways:
	+
	 * by a numerical value in microseconds (for example, `force_latency=10`)
	 * as the kernel CPU idle level ID of the maximum C-state allowed
	   (for example, force_latency = cstate.id:1)
	 * as a case sensitive name of the maximum C-state allowed
	   (for example, force_latency = cstate.name:C1)
	 * by using 'None' as a fallback value to prevent errors when alternative
	   C-state IDs/names do not exist. When 'None' is used in the alternatives
	   pipeline, all the alternatives that follow 'None' are ignored.
	+
	It is also possible to specify multiple fallback values separated by '|' as
	the C-state names and/or IDs may not be available on some systems.
	+
	.Specifying fallback C-state values
	====
	----
	[cpu]
	force_latency=cstate.name:C6|cstate.id:4|10
	----
	This configuration tries to obtain and set the latency of C-state named C6.
	If the C-state C6 does not exist, kernel CPU idle level ID 4 (state4) latency
	is searched for in sysfs. Finally, if the state4 directory in sysfs is not found,
	the last latency fallback value is `10` us. The value is encoded and written into
	the kernel's PM QoS file `/dev/cpu_dma_latency`.
	====
	+
	.Specifying fallback C-state values using 'None'.
	====
	----
	[cpu]
	force_latency=cstate.name:XYZ|None
	----
	In this case, if C-state with the name `XYZ` does not exist
	[option]`force_latency`, no latency value will be written into the
	kernel's PM QoS file, and no errors will be reported due to the
	presence of 'None'.
	====
	
	`min_perf_pct, max_perf_pct, no_turbo`:::
	These options set the internals of the Intel P-State driver exposed via the kernel's
	`sysfs` interface.
	+
	.Adjusting the configuration of the Intel P-State driver
	====
	----
	[cpu]
	min_perf_pct=100
	----
	Limit the minimum P-State that will be requested by the driver. It states
	it as a percentage of the max (non-turbo) performance level.
	====
	+
	`pm_qos_resume_latency_us`:::
	This option allow to set specific latency for all cpus or specific ones.
	====
	----
	[cpu]
	pm_qos_resume_latency_us=n/a
	----
	Special value that disables C-states completely.
	====
	----
	[cpu]
	pm_qos_resume_latency_us=0
	----
	Allows all C-states.
	====
	----
	[cpu]
	pm_qos_resume_latency_us=100
	----
	Allows any C-state with a resume latency less than value.
	csrtt|�j||�d|_d|_d|_d|_d|_d|_d|_	d|_
d|_d|_d|_
d|_i|_t�|_d|_dS)NT�x86_64F)�superr�__init__�_has_pm_qos�_archZ_is_x86�	_is_intel�_is_amd�_has_energy_perf_bias�_has_intel_pstate�_has_amd_pstate�_has_pm_qos_resume_latency_us�_min_perf_pct_save�_max_perf_pct_save�_no_turbo_save�_governors_mapr�_cmd�_flags)�self�args�kwargs)�	__class__�� /usr/lib/python3.6/plugin_cpu.pyr	�s zCPULatencyPlugin.__init__cCs>d|_t�|_x"|jjd�D]}|jj|j�qWt�|_dS)NT�cpu)Z_devices_supported�setZ
_free_devices�_hardware_inventoryZget_devices�addZsys_nameZ_assigned_devices)r�devicerrr�
_init_devices�s
zCPULatencyPlugin._init_devicescs�fdd�|D�S)Ncsg|]}�jjd|��qS)r)r Z
get_device)�.0�x)rrr�
<listcomp>�sz8CPULatencyPlugin._get_device_objects.<locals>.<listcomp>r)rZdevicesr)rr�_get_device_objects�sz$CPULatencyPlugin._get_device_objectsc
Csddddddddddddd�S)Ng�������?�di�)�load_threshold�latency_low�latency_high�
force_latency�governor�sampling_down_factor�energy_perf_bias�min_perf_pct�max_perf_pct�no_turbo�pm_qos_resume_latency_us�energy_performance_preferencer)rrrr�_get_config_options�sz$CPULatencyPlugin._get_config_optionscCs�dddddg}tj�|_|j|krttj�}|jjd�}|dkrFd|_n|d	ksV|d
kr^d|_nd|_t	j
d|�nt	j
d|j�|jr�|j�|j�|jr�|j
�dS)
NrZi686Zi585Zi486Zi386Z	vendor_idZGenuineIntelTZAuthenticAMDZHygonGenuinez$We are running on an x86 %s platformzWe are running on %s (non x86))�platform�machiner�procfs�cpuinfo�tags�getrr
�log�info�_check_energy_perf_bias�_check_intel_pstate�_check_amd_pstate)rZintel_archsrZvendorrrr�_check_arch�s"

zCPULatencyPlugin._check_archcCsbd|_d}|jjddgtj|gd�\}}|dkr@|dkr@d|_n|dkrTtjd	�n
tjd
�dS)NFr�x86_energy_perf_policyz-r)Z	no_errorsr�Tzgunable to run x86_energy_perf_policy tool, ignoring CPU energy performance bias, is the tool installed?zXyour CPU doesn't support MSR_IA32_ENERGY_PERF_BIAS, ignoring CPU energy performance bias)rr�execute�errno�ENOENTr<�warning)rZretcode_unsupported�retcode�outrrrr>sz(CPULatencyPlugin._check_energy_perf_biascCs"tjjd�|_|jrtjd�dS)Nz$/sys/devices/system/cpu/intel_pstatezintel_pstate detected)�os�path�existsrr<r=)rrrrr?sz$CPULatencyPlugin._check_intel_pstatecCs"tjjd�|_|jrtjd�dS)Nz"/sys/devices/system/cpu/amd_pstatezamd-pstate detected)rJrKrLrr<r=)rrrrr@#sz"CPULatencyPlugin._check_amd_pstatecCs$|jdkrtj�jjdg�|_|jS)N�flags)rr8r9r:r;)rrrr�_get_cpuinfo_flags(s
z#CPULatencyPlugin._get_cpuinfo_flagscCs t|�}|jjt|�jdd��S)NrrC)�strrZ
is_cpu_online�replace)rr"Zsdrrr�_is_cpu_online-szCPULatencyPlugin._is_cpu_onlinecCstjjd|�S)Nz3/sys/devices/system/cpu/%s/cpufreq/scaling_governor)rJrKrL)rr"rrr�_cpu_has_scaling_governor1sz*CPULatencyPlugin._cpu_has_scaling_governorcCs<|j|�stjd|�dS|j|�s8tjd|�dSdS)Nz'%s' is not online, skippingFz.there is no scaling governor fo '%s', skippingT)rQr<�debugrR)rr"rrr�_check_cpu_can_change_governor4s

z/CPULatencyPlugin._check_cpu_can_change_governorcCs�d|_d|_t|jj��d|kr�d|_ytjtj	tj
�|_Wn*tk
rht
jdtj	�d|_YnXd|_|jddkr�|jddkr�|jjdd�|_d|_nd|_|j�nd|_t
jd|j�yt|j�d|_Wntk
r�d|_YnXdS)	NTFrz-Unable to open '%s', disabling PM_QoS controlr,r3�loadzILatency settings from non-first CPU plugin instance '%s' will be ignored.)Z_has_static_tuningZ_has_dynamic_tuning�listZ
_instances�values�_first_instancerJ�open�constsZPATH_CPU_DMA_LATENCY�O_WRONLY�_cpu_latency_fd�OSErrorr<�errorr
�_latency�options�_monitors_repositoryZcreate�
_load_monitorrAr=�nameZassigned_devices�
_first_device�
IndexError)r�instancerrr�_instance_init=s*
zCPULatencyPlugin._instance_initcCs4|jr0|jrtj|j�|jdk	r0|jj|j�dS)N)rXr
rJ�closer\rbra�delete)rrfrrr�_instance_cleanup[s

z"CPULatencyPlugin._instance_cleanupcCs|jjd|d�j�S)Nz'/sys/devices/system/cpu/intel_pstate/%s)r�	read_file�strip)r�attrrrr�_get_intel_pstate_attrbsz'CPULatencyPlugin._get_intel_pstate_attrcCs|dk	r|jjd||�dS)Nz'/sys/devices/system/cpu/intel_pstate/%s)r�
write_to_file)rrm�valrrr�_set_intel_pstate_attresz'CPULatencyPlugin._set_intel_pstate_attrcCs&|dkrdS|j|�}|j||�|S)N)rnrq)rrm�value�vrrr�_getset_intel_pstate_attris

z*CPULatencyPlugin._getset_intel_pstate_attrcs�tt|�j|�|jsdS|jj|jd�}|dk	r>|j|�|jr�|jj|jd�}|j	d|�|_
|jj|jd�}|j	d|�|_|jj|jd�}|j	d|�|_dS)Nr,r0r1r2)
rr�_instance_apply_staticrXZ
_variables�expandr`�_set_latencyrrtrrr)rrfZforce_latency_valueZ	new_value)rrrrups(


z'CPULatencyPlugin._instance_apply_staticcsLtt|�j||�|jrH|jrH|jd|j�|jd|j�|jd|j�dS)Nr0r1r2)	rr�_instance_unapply_staticrXrrqrrr)rrfZrollback)rrrrx�s
z)CPULatencyPlugin._instance_unapply_staticcCs|j||�dS)N)�_instance_update_dynamic)rrfr"rrr�_instance_apply_dynamic�sz(CPULatencyPlugin._instance_apply_dynamiccCsZ|js
t�||jkrdS|jj�d}||jdkrF|j|jd�n|j|jd�dS)N�systemr)r+r*)rX�AssertionErrorrdrbZget_loadr`rw)rrfr"rUrrrry�s

z)CPULatencyPlugin._instance_update_dynamiccCsdS)Nr)rrfr"rrr�_instance_unapply_dynamic�sz*CPULatencyPlugin._instance_unapply_dynamiccCs&yt|�Sttfk
r dSXdS)N)�int�
ValueError�	TypeError)r�srrr�_str2int�szCPULatencyPlugin._str2intcCs�i|_xztjt�D]l}td|}|jj|dddd�}|jj|dddd�}|dk	r|dk	r|j|�}|dk	r||j|j�<qWdS)Nz/%s/rcT)�err_ret�no_error�latency)�cstates_latencyrJ�listdir�cpuidle_states_pathrrkr�rl)r�dZcstate_pathrcr�rrr�_read_cstates_latency�s
z&CPULatencyPlugin._read_cstates_latencyFcCshtjd|�|jdkr*tjd�|j�|jj|d�}|rR|dkrRtjd�dStjdt|��|S)Nz)getting latency for cstate with name '%s'zreading cstates latency tablerz"skipping latency 0 as set by paramz!cstate name mapped to latency: %s)r<rSr�r�r;rO)rrc�no_zeror�rrr�_get_latency_by_cstate_name�s


z,CPULatencyPlugin._get_latency_by_cstate_namecCs�tjdt|��|j|�}|dkr2tjd�dStdd|}|j|jj|ddd��}|rt|dkrttjd�dStjd	t|��|S)
Nz'getting latency for cstate with ID '%s'zcstate ID is invalidz/%s/latencyzstate%dT)r�r�rz"skipping latency 0 as set by paramzcstate ID mapped to latency: %s)r<rSrOr�r�rrk)rZlidr�Zlatency_pathr�rrr�_get_latency_by_cstate_id�s


z*CPULatencyPlugin._get_latency_by_cstate_idcCs`d|_t|�jd�}tjd||f��x.|D�]$}yt|�}tjd|�W�n�tk
�rH|dd�dkr�|j|dd�dd�}n�|dd	�d
kr�|j|d	d��}n�|dd�dkr�|j|dd�dd�}nn|dd
�dkr�|j|d
d��}nJ|dk�rtjd�dS|�r.|dk�r.tjd�ntjdt|��d}YnX|dk	r.Pq.W|dfS)N�|z#parsing latency '%s', allow_na '%s'z+parsed directly specified latency value: %dr�zcstate.id_no_zero:T)r��
z
cstate.id:�zcstate.name_no_zero:�zcstate.name:�none�Nonezlatency 'none' specifiedzn/azlatency 'n/a' specifiedzinvalid latency specified: '%s'F)r�r�)NT)	r�rO�splitr<rSr~rr�r�)rr��allow_naZ	latenciesrrr�_parse_latency�s6



zCPULatencyPlugin._parse_latencycCsp|j|�\}}|rl|jrl|dkr4tjd�d|_n8|j|krltjd|�tjd|�}tj	|j
|�||_dS)Nztunable to evaluate latency value (probably wrong settings in the 'cpu' section of current profile), disabling PM QoSFzsetting new cpu latency %d�i)r�r
r<r^r_r=�struct�packrJ�writer\)rr��skipZlatency_binrrrrw�s

zCPULatencyPlugin._set_latencycCs|jjd|�j�j�S)Nz>/sys/devices/system/cpu/%s/cpufreq/scaling_available_governors)rrkrlr�)rr"rrr�_get_available_governors�sz)CPULatencyPlugin._get_available_governorsr-T)�
per_devicecCs�|j|�sdSt|�}|jd�}dd�|D�}x&|D]}t|�dkr4tjd�dSq4W|j|�}x~|D]^}||kr�|s�tjd||f�|jj	d|||r�t
jgndd	�Pqf|sftjd
||f�qfWtj
ddj|��d}|S)
Nr�cSsg|]}|j��qSr)rl)r$r-rrrr&sz2CPULatencyPlugin._set_governor.<locals>.<listcomp>rz.The 'governor' option contains an empty value.z!setting governor '%s' on cpu '%s'z3/sys/devices/system/cpu/%s/cpufreq/scaling_governorF)r�z7Ignoring governor '%s' on cpu '%s', it is not supportedz.None of the scaling governors is supported: %sz, )rTrOr��lenr<r^r�r=rrorErFrS�warn�join)rZ	governorsr"�sim�remover-Zavailable_governorsrrr�
_set_governor�s2





zCPULatencyPlugin._set_governorcCsTd}|j|�sdS|jjd||d�j�}t|�dkr:|}|dkrPtjd|�|S)Nz3/sys/devices/system/cpu/%s/cpufreq/scaling_governor)r�rz*could not get current governor on cpu '%s')rTrrkrlr�r<r^)rr"�ignore_missingr-�datarrr�
_get_governors
zCPULatencyPlugin._get_governor�ondemandcCsd|S)Nz7/sys/devices/system/cpu/cpufreq/%s/sampling_down_factorr)rr-rrr�_sampling_down_factor_path%sz+CPULatencyPlugin._sampling_down_factor_pathr.r�)r�ZprioritycCs�d}||jkr|jj�d|j|<|j|�}|dkrFtjd|�dS|t|jj��kr�||j|<|j|�}tj	j
|�s�tjd||f�dSt|�}|s�tjd||f�|j
j|||r�tjgndd�|S)NzIignoring sampling_down_factor setting for CPU '%s', cannot match governorzTignoring sampling_down_factor setting for CPU '%s', governor '%s' doesn't support itz6setting sampling_down_factor to '%s' for governor '%s'F)r�)r�clearr�r<rSrVrWr�rJrKrLrOr=rrorErF)rr.r"r�r�rpr-rKrrr�_set_sampling_down_factor(s&





z*CPULatencyPlugin._set_sampling_down_factorcCsD|j||d�}|dkrdS|j|�}tjj|�s4dS|jj|�j�S)N)r�)r�r�rJrKrLrrkrl)rr"r�r-rKrrr�_get_sampling_down_factorCs
z*CPULatencyPlugin._get_sampling_down_factorcCs*|jjdd|t|�gdd�\}}}||fS)NrBz-cT)Z
return_err)rrDrO)r�cpu_idrrrHrI�err_msgrrr�_try_set_energy_perf_biasMsz*CPULatencyPlugin._try_set_energy_perf_biascCsd||rdndfS)Nz>/sys/devices/system/cpu/cpufreq/policy%s/energy_performance_%sZavailable_preferencesZ
preferencer)rr�Z	availablerrr�_pstate_preference_pathVsz(CPULatencyPlugin._pstate_preference_pathcCsd|S)Nz4/sys/devices/system/cpu/cpu%s/power/energy_perf_biasr)rr�rrr�_energy_perf_bias_pathYsz'CPULatencyPlugin._energy_perf_bias_pathr/cCs||j|�stjd|�dS|jd�}|jd�}tj|j�kr�|j|�}t	j
j|�r�|s�xT|D]>}|j�}|j
j|||r�tjgndd�r^tjd||f�Pq^Wtjd|�t|�Stjd|�dSn�|j�rt|�slx�|D]|}|j�}tjd	||f�|j||�\}	}
|	d
k�r,tjd||f�Pq�|	d
k�rHtjd|
�Pq�tjd||f�q�Wtjd|�t|�SdSdS)
Nz%s is not online, skippingrr�F)r�z5energy_perf_bias successfully set to '%s' on cpu '%s'zPFailed to set energy_perf_bias on cpu '%s'. Is the value in the profile correct?zXFailed to set energy_perf_bias on cpu '%s' because energy_perf_bias file does not exist.z2Trying to set energy_perf_bias to '%s' on cpu '%s'rz"Failed to set energy_perf_bias: %szHCould not set energy_perf_bias to '%s' on cpu '%s', trying another value)rQr<rS�lstripr�rZ�CFG_CPU_EPP_FLAGrNr�rJrKrLrlrrorErFr=r^rOrr�)rr/r"r�r�r��vals�energy_perf_bias_pathrprHr�rrr�_set_energy_perf_bias\sX








z&CPULatencyPlugin._set_energy_perf_biascCsjyt|�}WnXtk
rd}z<yt|d�}Wn&tk
rR}z
|}WYdd}~XnXWYdd}~XnX|S)N�)r~r)rr�rs�errr�_try_parse_num�s(zCPULatencyPlugin._try_parse_numcCsdddd�j|j|�|�S)N�performance�normalZ	powersave)r��)r;r�)rr�rrr�_energy_perf_policy_to_human�sz-CPULatencyPlugin._energy_perf_policy_to_humancCsdddddd�j|j|�|�S)Nr�zbalance-performancer�z
balance-powerZpower)r�r��r�)r;r�)rr�rrr�_energy_perf_policy_to_human_v2�sz0CPULatencyPlugin._energy_perf_policy_to_human_v2c
Cs�d}|j|�s tjd|�dS|jd�}tj|j�krb|j|�}tj	j
|�r�|j|jj
|��}nz|jr�|jjdd|dg�\}}|dkr�xR|j�D]F}|j�}	t|	�dkr�|j|	d�}Pq�t|	�d	kr�|j|	d�}Pq�W|S)
Nz%s is not online, skippingrrBz-cz-rr�r�)rQr<rSr�rZr�rNr�rJrKrLr�rrkrrD�
splitlinesr�r�r�)
rr"r�r/r�r�rH�lines�line�lrrr�_get_energy_perf_bias�s*


z&CPULatencyPlugin._get_energy_perf_biascCsd|S)Nz9/sys/devices/system/cpu/%s/power/pm_qos_resume_latency_usr)rr"rrr�_pm_qos_resume_latency_us_path�sz/CPULatencyPlugin._pm_qos_resume_latency_us_pathcCs4|jdkr.tjj|j|��|_|js.tjd�|jS)NzGOption 'pm_qos_resume_latency_us' is not supported on current hardware.)rrJrKrLr�r<r=)rr"rrr�_check_pm_qos_resume_latency_us�s


z0CPULatencyPlugin._check_pm_qos_resume_latency_usr3cCs�|j|�stjd|�dS|j|dd�\}}|s>|j|�rBdS|dksZ|dkrp|dkrptjd||f�dS|s�|jj|j|�||r�t	j
gndd�|S)	Nz%s is not online, skippingT)r�zn/arz<Invalid pm_qos_resume_latency_us specified: '%s', cpu: '%s'.F)r�)rQr<rSr�r�rGrror�rErF)rr3r"r�r�r�r�rrr�_set_pm_qos_resume_latency_us�s
z.CPULatencyPlugin._set_pm_qos_resume_latency_uscCsD|j|�stjd|�dS|j|�s*dS|jj|j|�|d�j�S)Nz%s is not online, skipping)r�)rQr<rSr�rrkr�rl)rr"r�rrr�_get_pm_qos_resume_latency_us�s

z.CPULatencyPlugin._get_pm_qos_resume_latency_usr4c	Cs�|j|�stjd|�dS|jd�}tjj|j|d��r�|jd�}|s�t	|j
j|j|d��j��}xn|D]X}||kr�|j
j|j|�||r�t
jgndd�tjd||f�Pqjtjd||f�qjWtjd	|�t|�Stjd
�dS)Nz%s is not online, skippingrTr�F)r�z=Setting energy_performance_preference value '%s' for cpu '%s'zAenergy_performance_preference value '%s' unavailable for cpu '%s'z]Failed to set energy_performance_preference on cpu '%s'. Is the value in the profile correct?zyenergy_performance_available_preferences file missing, which can happen if the system is booted without a P-state driver.)rQr<rSr�rJrKrLr�r�rrrkrorErFr=r�r^rO)	rr4r"r�r�r�r�Z
avail_valsrprrr�"_set_energy_performance_preference�s(




z3CPULatencyPlugin._set_energy_performance_preferencecCs^|j|�stjd|�dS|jd�}tjj|j|d��rP|jj	|j|��j
�Stjd�dS)Nz%s is not online, skippingrTzyenergy_performance_available_preferences file missing, which can happen if the system is booted without a P-state driver.)rQr<rSr�rJrKrLr�rrkrl)rr"r�r�rrr�"_get_energy_performance_preferences


z3CPULatencyPlugin._get_energy_performance_preference)F)F)F)F)r�)F)F)F)F)F):�__name__�
__module__�__qualname__�__doc__r	r#r'�classmethodr5rAr>r?r@rNrQrRrTrgrjrnrqrtrurZZ
ROLLBACK_SOFTrxrzryr}r�r�r�r�r�rwr�Zcommand_setr�Zcommand_getr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r��
__classcell__rr)rrrsn,		



		
8r)rCrZ
decoratorsZ
tuned.logsZtunedZtuned.utils.commandsrZtuned.constsrZrJrEr�r6r8Zlogsr;r<r�ZPluginrrrrr�<module>s

site-packages/tuned/plugins/__pycache__/plugin_vm.cpython-36.opt-1.pyc000064400000010156147511334670021677 0ustar003

�<�e�
�@snddlmZddlTddlZddlZddlZddlZddlZddl	m
Z
ejj�Z
e
�ZGdd�dej�ZdS)�)�base)�*�N)�commandsc@s�eZdZdZedd��Zdd�Zdd�Zedd	��Ze	d
�dd��Z
e	d
�dd��Zed
�dd��Z
ed
�dd��Ze	d�dd��Zed�dd��ZdS)�VMPlugina|
	`vm`::
	
	Enables or disables transparent huge pages depending on value of the
	[option]`transparent_hugepages` option. The option can have one of three
	possible values `always`, `madvise` and `never`.
	+
	.Disable transparent hugepages
	====
	----
	[vm]
	transparent_hugepages=never
	----
	====
	+
	The [option]`transparent_hugepage.defrag` option specifies the
	defragmentation policy. Possible values for this option are `always`,
	`defer`, `defer+madvise`, `madvise` and `never`. For a detailed
	explanation of these values refer to
	link:https://www.kernel.org/doc/Documentation/vm/transhuge.txt[Transparent Hugepage Support].
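	+
	A minimal profile snippet combining both options is sketched below
	(illustrative only; `defer+madvise` is an assumed example value, check
	the values your kernel actually exposes before using it):
	+
	----
	[vm]
	transparent_hugepages=madvise
	transparent_hugepage.defrag=defer+madvise
	----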
	cCsdddd�S)N)�transparent_hugepages�transparent_hugepageztransparent_hugepage.defrag�)�selfr	r	�/usr/lib/python3.6/plugin_vm.py�_get_config_options%szVMPlugin._get_config_optionscCsd|_d|_dS)NTF)Z_has_static_tuningZ_has_dynamic_tuning)r
�instancer	r	r�_instance_init-szVMPlugin._instance_initcCsdS)Nr	)r
r
r	r	r�_instance_cleanup1szVMPlugin._instance_cleanupcCsd}tjj|�sd}|S)Nz#/sys/kernel/mm/transparent_hugepagez*/sys/kernel/mm/redhat_transparent_hugepage)�os�path�exists)r
rr	r	r�	_thp_path4szVMPlugin._thp_pathrcCs�|dkr"|stjdt|��dStjddd�}|jd�d	krP|sLtjd
�dStjj	|j
�d�}tjj|�r�|s�tj|||r�t
jgndd�|S|s�tjd
�dSdS)N�always�never�madvisez-Incorrect 'transparent_hugepages' value '%s'.z
/proc/cmdlineT)�no_errorztransparent_hugepage=rzWtransparent_hugepage is already set in kernel boot cmdline, ignoring value from profile�enabledFzDOption 'transparent_hugepages' is not supported on current hardware.)rrr)�log�warn�str�cmd�	read_file�find�inforr�joinrr�
write_to_file�errno�ENOENT)r
�value�sim�removeZcmdline�sys_filer	r	r�_set_transparent_hugepages<s$

z#VMPlugin._set_transparent_hugepagesrcCs|j|||�dS)N)r()r
r$r%r&r	r	r�_set_transparent_hugepageUsz"VMPlugin._set_transparent_hugepagecCs6tjj|j�d�}tjj|�r.tjtj|��SdSdS)Nr)rrr rrr�get_active_optionr)r
r'r	r	r�_get_transparent_hugepagesYsz#VMPlugin._get_transparent_hugepagescCs|j�S)N)r+)r
r	r	r�_get_transparent_hugepagebsz"VMPlugin._get_transparent_hugepageztransparent_hugepage.defragcCsXtjj|j�d�}tjj|�rB|s>tj|||r6tjgndd�|S|sPt	j
d�dSdS)N�defragF)rzJOption 'transparent_hugepage.defrag' is not supported on current hardware.)rrr rrrr!r"r#rr)r
r$r%r&r'r	r	r� _set_transparent_hugepage_defragfs
z)VMPlugin._set_transparent_hugepage_defragcCs6tjj|j�d�}tjj|�r.tjtj|��SdSdS)Nr-)rrr rrrr*r)r
r'r	r	r� _get_transparent_hugepage_defragssz)VMPlugin._get_transparent_hugepage_defragN)�__name__�
__module__�__qualname__�__doc__�classmethodrrrrZcommand_setr(r)Zcommand_getr+r,r.r/r	r	r	rrs	
r)�rZ
decoratorsZ
tuned.logsZtunedrr"�structZglobZtuned.utils.commandsrZlogs�getrrZPluginrr	r	r	r�<module>s
site-packages/tuned/plugins/__pycache__/plugin_bootloader.cpython-36.pyc000064400000060721147511334670022453 0ustar003

�<�e:e�@s�ddlmZddlTddlZddlmZddlmZddlj	Z	ddl
Z
ddlZddlZddl
mZejj�ZGdd	�d	ej�ZdS)
�)�base)�*�N)�
exceptions)�commands)�sleepcs�eZdZdZ�fdd�Zdd�Zdd�Zedd	��Ze	d[dd��Z
e	d
d��Zdd�Zdd�Z
iifdd�Zdd�Zdd�Zdd�Zdd�Zdd�Zdd �Zd!d"�Zejfd#d$�Zd%d&�Zd'd(�Zd)d*�Zd+d,�Zd-d.�Zd/d0�Zd1d2�Zd3d4�Z d5d6�Z!d7d8�Z"d9d:�Z#d;d<�Z$d=d>�Z%d?d@�Z&e'dA�dBdC��Z(e'dD�dEdF��Z)e'dG�dHdI��Z*e'dJdKdLdM�dNdO��Z+e'dPdKdLdM�dQdR��Z,e'dSdKdLdM�dTdU��Z-e'dVdKdLdM�dWdX��Z.dYdZ�Z/�Z0S)\�BootloaderPlugina�
	`bootloader`::
	
	Adds options to the kernel command line. This plug-in supports the
	GRUB 2 boot loader and the Boot Loader Specification (BLS).
	+
	NOTE: *TuneD* will not remove or replace kernel command line
	parameters added via other methods like *grubby*. *TuneD* will manage
	the kernel command line parameters added via *TuneD*. Please refer
	to your platform bootloader documentation about how to identify and
	manage kernel command line parameters set outside of *TuneD*.
	+
	Customized non-standard location of the GRUB 2 configuration file
	can be specified by the [option]`grub2_cfg_file` option.
	+
	The kernel options are added to the current GRUB configuration and
	its templates. Reboot the system for the kernel option to take effect.
	+
	Switching to another profile or manually stopping the `tuned`
	service removes the additional options. If you shut down or reboot
	the system, the kernel options persist in the [filename]`grub.cfg`
	file and grub environment files.
	+
	The kernel options can be specified by the following syntax:
	+
	[subs="+quotes,+macros"]
	----
	cmdline__suffix__=__arg1__ __arg2__ ... __argN__
	----
	+
	Or with an alternative, but equivalent syntax:
	+
	[subs="+quotes,+macros"]
	----
	cmdline__suffix__=+__arg1__ __arg2__ ... __argN__
	----
	+
	Where __suffix__ can be arbitrary (even empty) alphanumeric
	string which should be unique across all loaded profiles. It is
	recommended to use the profile name as the __suffix__
	(for example, [option]`cmdline_my_profile`). If there are multiple
	[option]`cmdline` options with the same suffix, during the profile
	load/merge the value which was assigned previously will be used. This
	is the same behavior as any other plug-in options. The final kernel
	command line is constructed by concatenating all the resulting
	[option]`cmdline` options.
	+
	It is also possible to remove kernel options by the following syntax:
	+
	[subs="+quotes,+macros"]
	----
	cmdline__suffix__=-__arg1__ __arg2__ ... __argN__
	----
	+
	Such kernel options will not be concatenated and thus removed during
	the final kernel command line construction.
	+
	.Modifying the kernel command line
	====
	For example, to add the [option]`quiet` kernel option to a *TuneD*
	profile, include the following lines in the [filename]`tuned.conf`
	file:
	
	----
	[bootloader]
	cmdline_my_profile=+quiet
	----
	
	An example of a custom profile `my_profile` that adds the
	[option]`isolcpus=2` option to the kernel command line:
	
	----
	[bootloader]
	cmdline_my_profile=isolcpus=2
	----
	
	An example of a custom profile `my_profile` that removes the
	[option]`rhgb quiet` options from the kernel command line (if
	previously added by *TuneD*):
	
	----
	[bootloader]
	cmdline_my_profile=-rhgb quiet
	----
	====
	+
	.Modifying the kernel command line, example with inheritance
	====
	For example, to add the [option]`rhgb quiet` kernel options to a
	*TuneD* profile `profile_1`:
	
	----
	[bootloader]
	cmdline_profile_1=+rhgb quiet
	----
	
	In the child profile `profile_2` drop the [option]`quiet` option
	from the kernel command line:
	
	----
	[main]
	include=profile_1
	
	[bootloader]
	cmdline_profile_2=-quiet
	----
	
	The final kernel command line will be [option]`rhgb`. In case the same
	[option]`cmdline` suffix as in the `profile_1` is used:
	
	----
	[main]
	include=profile_1
	
	[bootloader]
	cmdline_profile_1=-quiet
	----
	
	This will result in an empty kernel command line, because during the
	merge the [option]`cmdline_profile_1` option gets redefined to just
	[option]`-quiet`. Thus there is nothing to remove during the final
	kernel command line processing.
	====
	+
	The [option]`initrd_add_img=IMAGE` adds an initrd overlay file
	`IMAGE`. If the `IMAGE` file name begins with '/', the absolute path is
	used. Otherwise, the current profile directory is used as the base
	directory for the `IMAGE`.
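	+
	.Adding a prebuilt initrd overlay image (the file name is illustrative)
	====
	----
	[bootloader]
	initrd_add_img=tuned-overlay.img
	----
	
	Because the file name does not begin with '/', the `tuned-overlay.img`
	file is looked up in the current profile directory.
	====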
	+
	The [option]`initrd_add_dir=DIR` creates an initrd image from the
	directory `DIR` and adds the resulting image as an overlay.
	If the `DIR` directory name begins with '/', the absolute path
	is used. Otherwise, the current profile directory is used as the
	base directory for the `DIR`.
	+
	The [option]`initrd_dst_img=PATHNAME` sets the name and location of
	the resulting initrd image. Typically, it is not necessary to use this
	option. By default, the location of initrd images is `/boot` and the
	name of the image is taken as the basename of `IMAGE` or `DIR`. This can
	be overridden by setting [option]`initrd_dst_img`.
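	+
	.Overriding the name and location of the resulting initrd image (illustrative paths)
	====
	----
	[bootloader]
	initrd_add_dir=overlay
	initrd_dst_img=/boot/tuned-overlay.img
	----
	
	The image built from the `overlay` directory of the current profile is
	written to `/boot/tuned-overlay.img` instead of the default
	`/boot/overlay`.
	====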
	+
	The [option]`initrd_remove_dir=VALUE` removes the source directory
	from which the initrd image was built if `VALUE` is true. Only 'y',
	'yes', 't', 'true' and '1' (case insensitive) are accepted as true
	values for this option. Other values are interpreted as false.
	+
	.Adding an overlay initrd image
	====
	----
	[bootloader]
	initrd_remove_dir=True
	initrd_add_dir=/tmp/tuned-initrd.img
	----
	
	This creates an initrd image from the `/tmp/tuned-initrd.img` directory
	and then removes the `tuned-initrd.img` directory from `/tmp`.
	====
	+
	The [option]`skip_grub_config=VALUE` does not change grub
	configuration if `VALUE` is true. However, [option]`cmdline`
	options are still processed, and the result is used to verify the current
	cmdline. Only 'y', 'yes', 't', 'true' and '1' (case insensitive) are accepted
	as true values for this option. Other values are interpreted as false.
	+
	.Do not change grub configuration
	====
	----
	[bootloader]
	skip_grub_config=True
	cmdline=+systemd.cpu_affinity=1
	----
	====
	cs6tjjtj�stjd��tt|�j	||�t
�|_dS)Nz4Required GRUB2 template not found, disabling plugin.)�os�path�isfile�constsZGRUB2_TUNED_TEMPLATE_PATHrZNotSupportedPluginException�superr�__init__r�_cmd)�self�args�kwargs)�	__class__��'/usr/lib/python3.6/plugin_bootloader.pyr�s
zBootloaderPlugin.__init__cCsVd|_d|_d|_d|_d|_d|_d|_d|_|j�|_	|j
�|_|j�dk	|_
dS)NFT�)Z_has_dynamic_tuningZ_has_static_tuning�update_grub2_cfg�_skip_grub_config_val�_initrd_remove_dir�_initrd_dst_img_val�_cmdline_val�_initrd_val�_get_grub2_cfg_files�_grub2_cfg_file_names�_bls_enabled�_bls�_rpm_ostree_status�_rpm_ostree)r�instancerrr�_instance_init�s

zBootloaderPlugin._instance_initcCsdS)Nr)rr#rrr�_instance_cleanup�sz"BootloaderPlugin._instance_cleanupcCsdddddddd�S)N)�grub2_cfg_file�initrd_dst_img�initrd_add_img�initrd_add_dir�initrd_remove_dir�cmdline�skip_grub_configr)�clsrrr�_get_config_options�sz$BootloaderPlugin._get_config_optionsrcCs`i}|j�}xN|j�D]B}||kr|jdd�}|j|dg�jt|�dkrR|dnd�qW|S)z�
		Returns dict created from options
		e.g.: _options_to_dict("A=A A=B A B=A C=A", "A=B B=A B=B") returns {'A': ['A', None], 'C': ['A']}
		�=rrN)�split�
setdefault�append�len)�optionsZomit�d�oZarrrrr�_options_to_dict�s.z!BootloaderPlugin._options_to_dictcCsdjdd�|j�D��S)N� cSs2g|]*\}}|D]}|dk	r(|d|n|�qqS)Nr/r)�.0�k�vZv1rrr�
<listcomp>�sz5BootloaderPlugin._dict_to_options.<locals>.<listcomp>)�join�items)r5rrr�_dict_to_options�sz!BootloaderPlugin._dict_to_optionscCsr|jjddgdd�\}}}tjd||f�|dkr8dS|j�}t|�dksX|dd	krjtjd
|�dS|dS)zW
		Returns status of rpm-ostree transactions or None if not run on rpm-ostree system
		z
rpm-ostreeZstatusT)�
return_errz.rpm-ostree status output stdout:
%s
stderr:
%srN�zState:z2Exceptional format of rpm-ostree status result:
%sr)r�execute�log�debugr0r3�warn)r�rc�out�errZsplitedrrrr!�sz#BootloaderPlugin._rpm_ostree_statuscCsFd}d}x(t|�D]}|j�dkr&dSt|�qW|j�dkrBdSdS)N�
g�?ZidleTF)�ranger!r)rZsleep_cyclesZ
sleep_secs�irrr�_wait_till_idlesz BootloaderPlugin._wait_till_idlecs�|jjddgdd�\}}}tjd||f�|dkr8dS|j|�}|j�sXtjd�dSi}|j|�j�}x8|j	�D],\�}	x|	D]}
|�j
|
�q�W|	|�<qtWi}|j|�j�}x�|j	�D]~\�}	|j���r$tjd	�|�f�|j�g�j
|��|j
�fd
d�|�D��g|�<|j�g�j
|	�|	|�<q�W||k�rdtjd|�|||fStjd
||f�|jjddgdd�|D�dd�|D�dd�\}}
}|dk�r�tjd|�|j|�ddfS|||fSdS)z�
		Method for appending or deleting rpm-ostree karg
		returns None if rpm-ostree not present or is run on not ostree system
		or tuple with new kargs, appended kargs and deleted kargs
		z
rpm-ostree�kargsT)r@z'rpm-ostree output stdout:
%s
stderr:
%srNzCannot wait for transaction endz)adding rpm-ostree kargs %s: %s for deletecs$g|]}|dk	r�d|n��qS)Nr/r)r9r;)r:rrr<.sz6BootloaderPlugin._rpm_ostree_kargs.<locals>.<listcomp>z3skipping rpm-ostree kargs - append == deleting (%s)z2rpm-ostree kargs - appending: '%s'; deleting: '%s'cSsg|]}d|�qS)z--append=%sr)r9r;rrrr<9scSsg|]}d|�qS)z--delete=%sr)r9r;rrrr<:sz-Something went wrong with rpm-ostree kargs
%s)NNN)NNN)rrBrCrDr7rL�errorr?r0r>�remove�getr1�extend�info)rr2�deleterFrGrHrM�deletedZ
delete_params�valr;�appendedZ
append_params�_r)r:r�_rpm_ostree_kargs
sF





z"BootloaderPlugin._rpm_ostree_kargscCsV|j�j�}g}xR|D]J}t|�jd�r4|j|�q||krJ||||<qtjd||jjf�qWd}x�|D]�}||}|dksn|dkr�qn|d}|dd�}|dd�j	�}	|dks�|d	kr�|dkr�|	dkr�|d|	7}qn|d
k�r(|	dk�r4x@|	j
�D]&}
tj|
�}tj
d|d
d|�}�q�Wqn|d|7}qnW|j	�}|dk�rR||d<|S)zSMerge provided options with plugin default options and merge all cmdline.* options.r+z$Unknown option '%s' for plugin '%s'.rNrrrA�+�\�-r8z(\A|\s)z	(?=\Z|\s))rZrYr[)r.�copy�str�
startswithr2rCrEr�__name__�stripr0�re�escape�sub)rr4Z	effectiveZcmdline_keys�keyr+rU�opZop1�vals�pZregexrrr�_get_effective_optionsAs:





z'BootloaderPlugin._get_effective_optionscCs.g}x$tjD]}tjj|�r|j|�qW|S)N)rZGRUB2_CFG_FILESr	r
�existsr2)rZ	cfg_files�frrrrcs
z%BootloaderPlugin._get_grub2_cfg_filescCsH|jjtjdd�}t|�dkr2tjdtj�dStjd|tj	d�dk	S)NT)�no_errorrzcannot read '%s'Fz=^\s*GRUB_ENABLE_BLSCFG\s*=\s*\"?\s*[tT][rR][uU][eE]\s*\"?\s*$)�flags)
r�	read_filer�GRUB2_DEFAULT_ENV_FILEr3rCrRra�search�	MULTILINE)r�grub2_default_envrrrrjszBootloaderPlugin._bls_enabledcCs|jjtj|�S)N)r�add_modify_option_in_filer�BOOT_CMDLINE_FILE)rr5rrr�_patch_bootcmdlinessz#BootloaderPlugin._patch_bootcmdlinecCs�|jtjdtjdi�|js*tjd�dSx4|jD]*}|jj|dtj	ddtj
didd�q2W|jdk	r�tjd|j�|jj|j�dS)Nrzcannot find grub.cfg to patchzset\s+F)�addzremoving initrd image '%s')
rtr�BOOT_CMDLINE_TUNED_VAR�BOOT_CMDLINE_INITRD_ADD_VARrrCrRrrr�GRUB2_TUNED_VAR�GRUB2_TUNED_INITRD_VARr�unlink)rrjrrr�_remove_grub2_tuningvs
*
z%BootloaderPlugin._remove_grub2_tuningcCsf|jjtj�}tjtjd|tjd�}|r2|dnd}tjtjd|tjd�}|rZ|dnd}||fS)Nz	=\"(.*)\")rlrr)	rrmrrsrarorvrp�BOOT_CMDLINE_KARGS_DELETED_VAR)rrjrVrTrrr�_get_rpm_ostree_changes�sz(BootloaderPlugin._get_rpm_ostree_changescCs@|j�\}}|j|j|�|j|�d�|jtjdtjdi�dS)N)r2rSr)r}rXr7rtrrvr|)rrVrTrrr�_remove_rpm_ostree_tuning�sz*BootloaderPlugin._remove_rpm_ostree_tuningcCsR|tjkrN|jrN|jr,tjd�|j�n"tjd�|j�|jddd��dS)Nz4removing rpm-ostree tuning previously added by Tunedz/removing grub2 tuning previously added by Tunedr)�tuned_params�tuned_initrd)	rZ
ROLLBACK_FULLrr"rCrRr~r{�_update_grubenv)rr#Zrollbackrrr�_instance_unapply_static�s


z)BootloaderPlugin._instance_unapply_staticcCs�tjd�tjdtjdd|tjd�}tjdtjd|tjd�}tjdtjdd|tjd�}tjdtjd|tjd�}tjtjdd|tjd�}tjtj	dd|tjd�S)	Nzunpatching grub.cfgz
^\s*set\s+z\s*=.*
r)rlz *\$z\nz\n+)
rCrDrarcrrxrpry�GRUB2_TEMPLATE_HEADER_BEGIN�GRUB2_TEMPLATE_HEADER_END)r�	grub2_cfg�cfgrrr�_grub2_cfg_unpatch�s
z#BootloaderPlugin._grub2_cfg_unpatchcCs�tjd�dtjd}x8|D]0}|d|jj|�d|jj||�d7}qW|tjd7}tjd||tj	d	�}tj
tjd
�}xt|D]l}tjd|dd
|||tj	d	�}tjd|d||dd|tj	d	�}tjd|dd|tj	d	�}q�W|S)Nzinitial patching of grub.cfgz\1\n\n�
zset z="z"
z\nz+^(\s*###\s+END\s+[^#]+/00_header\s+### *)\n)rl)�linuxZinitrdz^(\s*z(16|efi)?\s+.*)$z\1 $z(?:16|efi)?\s+\S+rescue.*)\$z *(.*)$z\1\2z(?:16|efi)?\s+\S+rescue.*) +$z\1)rCrDrr�rrbr�rarcrprxry)rr�r5�s�optZd2rKrrr�_grub2_cfg_patch_initial�s

0
$( z)BootloaderPlugin._grub2_cfg_patch_initialcCs�|jjtj�}t|�dkr.tjdtj�dStjtjd�}d}xv|D]n}t	j
d|d||d|t	jd�dkrFd	}|ddkr�|d7}||d|d
|d||d7}qFW|r�tjdtj�|jj
tj|�d	S)Nrzcannot read '%s'F)ZGRUB_CMDLINE_LINUX_DEFAULTZGRUB_INITRD_OVERLAYz^[^#]*\bz
\s*=.*\\\$z\b.*$)rlTrr�z="${z:+$z }\$z"
z
patching '%s'���)rrmrrnr3rCrRrxryrarorprD�
write_to_file)rrqr5�writerKrrr�_grub2_default_env_patch�s 
*,z)BootloaderPlugin._grub2_default_env_patchcCs�|jjtj�}t|�dkr.tjdtj�dSd}tjdtj	d|tj
d�r�d}tjdtj	dd	|tj
d�}|d
dkr�|d7}|r�tjdtj�|jj
tj|�dS)Nrzcannot read '%s'Fzb^GRUB_CMDLINE_LINUX_DEFAULT=\"\$\{GRUB_CMDLINE_LINUX_DEFAULT:\+\$GRUB_CMDLINE_LINUX_DEFAULT \}\\\$z"$)rlTz"$
rrr�zunpatching '%s'r�)rrmrrnr3rCrRrarorxrprcrDr�)rrqr�r�rrr�_grub2_default_env_unpatch�s z+BootloaderPlugin._grub2_default_env_unpatchcCsZtjd�|jstjd�dS�x|jD�]}|jj|�}t|�dkrVtjd|�q(tjd|�|}d}xf|D]^}tjd|dd|jj	||�d
|tj
d�\}}|dks�tjd
||tj
d�dkrrd}qrWttjd
t
j|tj
d��ttjd
t
j|tj
d��k�rd}|�r*|j|j|�|�}|jj||�q(W|j�rN|j�n|j�dS)Nzpatching grub.cfgzcannot find grub.cfg to patchFrzcannot patch %sz+adding boot command line parameters to '%s'z	\b(set\s+z\s*=).*$z\1�")rlrz\$Tz\1")rCrDrrRrrmr3ra�subnrbrpro�findallrrxryr�r�r�r r�r�)rr5rjr�Z
grub2_cfg_newZ
patch_initialr�Znsubsrrr�_grub2_cfg_patch�s4


4" 
z!BootloaderPlugin._grub2_cfg_patchcCsb|j�\}}|j|j|�}|s"dS|j|d�\}}}|dkr@dS|jtj|jtj|j|�i�dS)N)r2)	r}r7rrXrtrrvr|r?)rrVrWZ
_cmdline_dictr5rrr�_rpm_ostree_update�sz#BootloaderPlugin._rpm_ostree_updatecCs8|jtj|jtj|ji�|jtj|jtj|ji�dS)N)	r�rrxrryrrtrvrw)rrrr�
_grub2_updateszBootloaderPlugin._grub2_updatecCstjjtj�S)N)r	r
rirZBLS_ENTRIES_PATH)rrrr�_has_blsszBootloaderPlugin._has_blscCs\tjdt|��dd�|j�D�}|jjdddg|�\}}|dkrXtjd|�d	Sd
S)Nzupdating grubenv, setting %scSs$g|]\}}dt|�t|�f�qS)z%s=%s)r])r9Zoption�valuerrrr<sz4BootloaderPlugin._update_grubenv.<locals>.<listcomp>z
grub2-editenvr[�setrzcannot update grubenv: '%s'FT)rCrDr]r>rrBrE)rr5�lrFrGrrrr�
sz BootloaderPlugin._update_grubenvcCsb|jj�}|dkrdStjdtj�|jjtjdgd|id�\}}|dkr^tjd|�dSd	S)
NrFz4running kernel update hook '%s' to patch BLS entriesruZKERNEL_INSTALL_MACHINE_ID)�envrzcannot patch BLS entries: '%s'T)rZget_machine_idrCrDrZKERNEL_UPDATE_HOOK_FILErBrE)rZ
machine_idrFrGrrr�_bls_entries_patch_initials
z+BootloaderPlugin._bls_entries_patch_initialcCs6tjd�|j�r2|j|j|jd��r2|j�r2dSdS)Nzupdating BLS)rr�TF)rCrDr�r�rrr�)rrrr�_bls_updates
zBootloaderPlugin._bls_updatecCs(|jdkr$tjjtjtjj|��|_dS)N)rr	r
r=r�BOOT_DIR�basename)r�namerrr�_init_initrd_dst_img&s
z%BootloaderPlugin._init_initrd_dst_imgcCstjjtj�S)N)r	r
�isdirrZPETITBOOT_DETECT_DIR)rrrr�_check_petitboot*sz!BootloaderPlugin._check_petitbootcCs�|jrtjd�dS|j�r&tjd�tjd|j�tjj|j�}|j	j
||j�sXdSd|_|j	jd�j
�}d}t|�}|r�tjdd	|�}t|�|kr�|}tjj||�|_dS)
Nz:Detected rpm-ostree which doesn't support initrd overlays.FzkDetected Petitboot which doesn't support initrd overlays. The initrd overlay will be ignored by bootloader.zinstalling initrd image as '%s'Tz
/proc/cmdline�/z)^\s*BOOT_IMAGE=\s*(?:\([^)]*\))?(\S*/).*$z\1)r"rCrEr�rRrr	r
r�rr\rrm�rstripr3rarcr=r)rZimgZimg_nameZcurr_cmdlineZinitrd_grubpathZlcr
rrr�_install_initrd-s&

z BootloaderPlugin._install_initrdr&cCs$|rdS|r |dk	r t|�g|_dS)N)r]r)r�enablingr��verify�ignore_missingrrr�_grub2_cfg_fileBsz BootloaderPlugin._grub2_cfg_filer'cCsR|rdS|rN|dk	rNt|�|_|jdkr,dS|jddkrNtjjtj|j�|_dS)NrFrr�)r]rr	r
r=rr�)rr�r�r�r�rrr�_initrd_dst_imgJs

z BootloaderPlugin._initrd_dst_imgr*cCs*|rdS|r&|dk	r&|jj|�dk|_dS)N�1)r�get_boolr)rr�r�r�r�rrrrVsz#BootloaderPlugin._initrd_remove_dirr(FrI)Z
per_deviceZprioritycCsD|rdS|r@|dk	r@t|�}|j|�|dkr2dS|j|�s@dSdS)NrF)r]r�r�)rr�r�r�r�Zsrc_imgrrr�_initrd_add_img^s

z BootloaderPlugin._initrd_add_imgr)c
Cs|rdS|o|dk	�rt|�}|j|�|dkr4dStjj|�sRtjd|�dStjd|�tj	ddd�\}}tj
d|�tj|�|jj
d	||d
d�\}}	tj
d|	�|d
kr�tjd�|jj|d
d�dS|j|�|jj|�|j�rtjd|�|jj|�dS)NrFzFerror: cannot create initrd image, source directory '%s' doesn't existz+generating initrd image from directory '%s'ztuned-bootloader-z.tmp)�prefix�suffixz+writing initrd image to temporary file '%s'zfind . | cpio -co > %sT)�cwd�shellzcpio log: %srzerror generating initrd image)rkzremoving directory '%s')r]r�r	r
r�rCrNrR�tempfileZmkstemprD�closerrBrzr�rZrmtree)
rr�r�r�r�Zsrc_dir�fdZtmpfilerFrGrrr�_initrd_add_dirks2



z BootloaderPlugin._initrd_add_dirr+cCsJ|jj|jj|��}|�r |jr8|j�d}|j|�}n|jjd�}t|�dkrTdSt	|j
��}t	|j
��}	|	|}
t|
�dkr�tjt
jdt|	�f�dSdd�|D�}xR|
D]J}|j
dd�d}
|
|kr�tjt
j|
|f�q�tjt
j||
|f�q�W|	|@}tjd	d
j|�f�dS|�rF|dk	�rFtjd�d|_||_dS)
Nrz
/proc/cmdliner+TcSsi|]}||jdd�d�qS)r/rr)r0)r9r;rrr�
<dictcomp>�sz-BootloaderPlugin._cmdline.<locals>.<dictcomp>r/rz2expected arguments that are present in cmdline: %sr8Fz;installing additional boot command line parameters to grub2)Z
_variables�expandrZunquoter"rXr?rmr3r�r0rCrRrZSTR_VERIFY_PROFILE_VALUE_OKr]rNZ'STR_VERIFY_PROFILE_CMDLINE_FAIL_MISSINGZSTR_VERIFY_PROFILE_CMDLINE_FAILr=rr)rr�r�r�r�r;Zrpm_ostree_kargsr+Zcmdline_setZ	value_setZmissing_setZcmdline_dict�m�argZpresent_setrrr�_cmdline�s6

zBootloaderPlugin._cmdliner,cCs8|rdS|r4|dk	r4|jj|�dkr4tjd�d|_dS)Nr�z(skipping any modification of grub configT)rr�rCrRr)rr�r�r�r�rrr�_skip_grub_config�s
z"BootloaderPlugin._skip_grub_configcCs�|rV|jrVt|j�dkr"tjd�t|j�dkr:tjd�|jtj|jtj	|ji�n0|r�|j
r�|jrp|j�n|j
�|j�d|_
dS)Nrz0requested changes to initrd will not be applied!z1requested changes to cmdline will not be applied!F)rr3rrCrErrtrrvrwrr"r�r�r�)rr#r�rrr�_instance_post_static�s




z&BootloaderPlugin._instance_post_static)r)1r_�
__module__�__qualname__�__doc__rr$r%�classmethodr.�staticmethodr7r?r!rLrXrhrrrtr{r}r~rZ
ROLLBACK_SOFTr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�Zcommand_customr�r�rr�r�r�r�r��
__classcell__rr)rrrsT.
4"	
	 
	
!	r)rrZ
decoratorsZ
tuned.logsZtunedrZtuned.utils.commandsrZtuned.constsrr	rar�ZtimerZlogsrPrCZPluginrrrrr�<module>s

site-packages/tuned/plugins/__pycache__/plugin_modules.cpython-36.pyc000064400000012711147511334670021765 0ustar003

�<�e=�@sjddlZddlZddlmZddlTddlZddlTddl	m
Z
ddljZej
j�ZGdd�dej�ZdS)�N�)�base)�*)�commandscsfeZdZdZ�fdd�Zdd�Zdd�Zdd	�Zd
d�Zdd
�Z	dd�Z
ejfdd�Z
dd�Z�ZS)�
ModulesPlugina!
	`modules`::
	
	Plug-in for applying custom kernel modules options.
	+
	This plug-in can set parameters for kernel modules. It creates
	the `/etc/modprobe.d/tuned.conf` file. The syntax is
	`_module_=_option1=value1 option2=value2..._` where `_module_` is
	the module name and `_optionx=valuex_` are module options which may
	or may not be present.
	+
	.Load module `netrom` with module parameter `nr_ndevs=2`
	====
	----
	[modules]
	netrom=nr_ndevs=2
	----
	====
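	+
	.Setting multiple parameters for one module (module name and values are illustrative)
	====
	----
	[modules]
	thinkpad_acpi=fan_control=1 experimental=1
	----
	====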
	Modules can also be forced to load/reload by using an additional
	`+r` option prefix.
	+
	.(Re)load module `netrom` with module parameter `nr_ndevs=2`
	====
	----
	[modules]
	netrom=+r nr_ndevs=2
	----
	====
	The `+r` switch will also cause *TuneD* to try to remove the `netrom`
	module (if loaded) and to (re)insert it with the specified
	parameters. The `+r` can be followed by an optional comma (`+r,`)
	for better readability.
	+
	When using `+r` the module will be loaded immediately by the *TuneD*
	daemon itself rather than waiting for the OS to load it with the
	specified parameters.
	cs$tt|�j||�d|_t�|_dS)NT)�superr�__init__Z_has_dynamic_optionsr�_cmd)�self�args�kwargs)�	__class__��$/usr/lib/python3.6/plugin_modules.pyr3szModulesPlugin.__init__cCsd|_d|_|j|_dS)NFT)Z_has_dynamic_tuningZ_has_static_tuningZoptions�_modules)r
�instancerrr�_instance_init8szModulesPlugin._instance_initcCsdS)Nr)r
rrrr�_instance_cleanup=szModulesPlugin._instance_cleanupcCs�x�|D]�}|jjdd|g�\}}|dkr6tjd�dS|dkrTtjd||j�f�|jjd|g�\}}|dkrtjd||j�f�qWdS)NZmodprobez-rrzN'modprobe' command not found, cannot reload kernel modules, reboot is requiredz$cannot remove kernel module '%s': %sz:cannot insert/reinsert module '%s', reboot is required: %s)r	�execute�log�warn�debug�strip)r
�modules�module�retcode�outrrr�_reload_modules@s

zModulesPlugin._reload_modulescCsR|j�d}d}d}g}x�t|jj��D]�\}}|jj|�}|jj|�}	|s�|jjd|g�\}}
|dkrxd}tj	d�n|dkr�tj
d|�|s�|dkr(t|	�dkr�|	dd	�d
kr�tj
dd|	�}	|j|�t|	�dkr�|d|d
|	d7}q(tjd|�q(W|jjtj|�t|�}|dk�rN|j|�t|j�|k�rNtjtj�dS)N�rFZmodinfoTz8'modinfo' command not found, not checking kernel modulesz)kernel module '%s' not found, skipping itr�z+rz^\s*\+r\s*,?\s*zoptions � �
zKmodule '%s' doesn't have any option specified, not writing it to modprobe.d)�_clear_modprobe_file�listr�items�
_variables�expandr	rrr�error�len�re�sub�appendr�
write_to_file�consts�MODULES_FILEr�infoZSTR_HINT_REBOOT)r
r�srZ
skip_checkZreload_list�option�valuer�vr�lrrr�_instance_apply_staticLs8


z$ModulesPlugin._instance_apply_staticcCst|�jdd�S)N�/r)�str�replace)r
�pathrrr�
_unquote_pathkszModulesPlugin._unquote_pathc
Csd}d}tjd�}�xt|jj��D]�\}}|jj|�}|jj|�}	tjdd|	�}	d|}
tj	j
|
�s�d}tjt
jd|�q$tjt
jd|�|j|	�}xv|D]n}|jd	�}
t|
�d
kr�tjd||f�q�|j|
d|
d
|jj|
d|j|
d�ddd�|�dkr�d}q�Wq$W|S)NTz\s+z^\s*\+r\s*,?\s*rz/sys/module/%sFzmodule '%s' is not loadedzmodule '%s' is loaded�=rz.unrecognized module option for module '%s': %srrz/parameters/)Zerr_ret�no_error)r)�compiler#rr$r%r&r*�osr9�existsrr'r-ZSTR_VERIFY_PROFILE_FAILr/ZSTR_VERIFY_PROFILE_OK�splitr(rZ
_verify_valuer	�	read_filer:)r
rZignore_missingZdevices�ret�rr1r2rr3Zmpathr4�item�argrrr�_instance_verify_staticns,



"
z%ModulesPlugin._instance_verify_staticcCs|tjkr|j�dS)N)r-Z
ROLLBACK_FULLr")r
rZrollbackrrr�_instance_unapply_static�s
z&ModulesPlugin._instance_unapply_staticcCs�|jjtjdd�}|jd�}d}}t|�}tjd�}x.||krd|j||�dkrZ|}|}|d7}q8Wdj	|d|��}t|�dkr�|d7}|jj
tj|�dS)NT)r<r!rz^\s*#r)r	rAr-r.r@r(r)r=�search�joinr,)r
r0r4�i�jZllrCrrrr"�s


z"ModulesPlugin._clear_modprobe_file)�__name__�
__module__�__qualname__�__doc__rrrrr5r:rFr-Z
ROLLBACK_SOFTrGr"�
__classcell__rr)r
rrs%r)r)Zos.pathr>rrZ
decoratorsZ
tuned.logsZtuned�
subprocessZtuned.utils.commandsrZtuned.constsr-Zlogs�getrZPluginrrrrr�<module>s

site-packages/tuned/plugins/__pycache__/plugin_eeepc_she.cpython-36.pyc000064400000006666147511334670022251 0ustar003

�<�e��@sTddlmZddlmZddlZddlmZddlZejj	�Z
Gdd�dej�ZdS)�)�base)�
exceptions�N)�commandscsTeZdZdZ�fdd�Zedd��Zdd�Zdd	�Zd
d�Z	dd
�Z
dd�Z�ZS)�EeePCSHEPlugina�
	`eeepc_she`::
	
	Dynamically sets the front-side bus (FSB) speed according to the
	CPU load. This feature can be found on some netbooks and is also
	known as the Asus Super Hybrid Engine. If the CPU load is lower or
	equal to the value specified by the [option]`load_threshold_powersave`
	option, the plug-in sets the FSB speed to the value specified by the
	[option]`she_powersave` option. If the CPU load is higher or
	equal to the value specified by the [option]`load_threshold_normal`
	option, it sets the FSB speed to the value specified by the
	[option]`she_normal` option. Static tuning is not supported and the
	plug-in is transparently disabled if the hardware support for this
	feature is not detected.
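	
	.Adjusting the CPU load thresholds (the values shown are illustrative)
	====
	----
	[eeepc_she]
	load_threshold_powersave=0.2
	load_threshold_normal=0.6
	----
	====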
	
	NOTE: For details about the FSB frequencies and corresponding values, see
	link:https://www.kernel.org/doc/Documentation/ABI/testing/sysfs-platform-eeepc-laptop[the kernel documentation].
	The provided defaults should work for most users.
	csPt�|_d|_tjj|j�s"d|_tjj|j�s:tjd��tt	|�j
||�dS)Nz!/sys/devices/platform/eeepc/cpufvz%/sys/devices/platform/eeepc-wmi/cpufvz)Plugin is not supported on your hardware.)r�_cmd�
_control_file�os�path�isfilerZNotSupportedPluginException�superr�__init__)�self�args�kwargs)�	__class__��&/usr/lib/python3.6/plugin_eeepc_she.pyr
s
zEeePCSHEPlugin.__init__cCsddddd�S)Ng333333�?g�������?�r)�load_threshold_normal�load_threshold_powersaveZ
she_powersaveZ
she_normalr)rrrr�_get_config_options'sz"EeePCSHEPlugin._get_config_optionscCs&d|_d|_d|_|jjdd�|_dS)NFT�load)Z_has_static_tuningZ_has_dynamic_tuning�	_she_mode�_monitors_repositoryZcreate�
_load_monitor)r�instancerrr�_instance_init0szEeePCSHEPlugin._instance_initcCs"|jdk	r|jj|j�d|_dS)N)rr�delete)rrrrr�_instance_cleanup6s
z EeePCSHEPlugin._instance_cleanupcCsH|jj�d}||jdkr*|j|d�n||jdkrD|j|d�dS)N�systemrZ	powersaver�normal)rZget_load�options�
_set_she_mode)rr�devicerrrr�_instance_update_dynamic;s
z'EeePCSHEPlugin._instance_update_dynamiccCs|j|d�dS)Nr!)r#)rrr$rrr�_instance_unapply_dynamicBsz(EeePCSHEPlugin._instance_unapply_dynamiccCsLt|jd|�}|j|krHtjd||f�|jj|jd|�||_dS)Nzshe_%sznew eeepc_she mode %s (%d) z%s)�intr"r�log�inforZ
write_to_filer)rrZnew_modeZnew_mode_numericrrrr#Fs

zEeePCSHEPlugin._set_she_mode)
�__name__�
__module__�__qualname__�__doc__r
�classmethodrrrr%r&r#�
__classcell__rr)rrr	s		r)
�rrZ
tuned.logsZtunedZtuned.utils.commandsrr	Zlogs�getr(ZPluginrrrrr�<module>s
site-packages/tuned/plugins/__pycache__/plugin_rtentsk.cpython-36.pyc000064400000002531147511334670022006 0ustar003

�<�eU�@s`ddlmZddlTddlZddlmZddlZddlZddl	Z	ej
j�ZGdd�dej
�ZdS)�)�base)�*�N)�commandsc@s eZdZdZdd�Zdd�ZdS)�
RTENTSKPluginz�
	`rtentsk`::
	
	Plugin for avoiding interruptions due to static key IPIs due
        to opening socket with timestamping enabled (by opening a
        socket ourselves the static key is kept enabled).
	cCsLd|_d|_d}d}tjtjtjtj�}|jtj||�||_t	j
d�dS)NTF�r�z*opened SOF_TIMESTAMPING_OPT_TX_SWHW socketi@)Z_has_static_tuningZ_has_dynamic_tuning�socketZAF_INETZ
SOCK_DGRAMZIPPROTO_UDPZ
setsockoptZ
SOL_SOCKET�rtentsk_socket�log�info)�self�instanceZSO_TIMESTAMPZSOF_TIMESTAMPING_OPT_TX_SWHW�s�r�$/usr/lib/python3.6/plugin_rtentsk.py�_instance_initszRTENTSKPlugin._instance_initcCs|j}|j�dS)N)r
�close)r
rrrrr�_instance_cleanup$szRTENTSKPlugin._instance_cleanupN)�__name__�
__module__�__qualname__�__doc__rrrrrrrsr)�rZ
decoratorsZ
tuned.logsZtunedZtuned.utils.commandsrZglobr	ZtimeZlogs�getrZPluginrrrrr�<module>s
site-packages/tuned/plugins/__pycache__/plugin_sysfs.cpython-36.pyc000064400000007221147511334670021464 0ustar003

�<�e�
�@srddlmZddlZddlZddlZddlTddlZddl	j
Z
ddlTddlm
Z
ejj�ZGdd�dej�ZdS)�)�base�N)�*)�commandscsfeZdZdZ�fdd�Zdd�Zdd�Zdd	�Zd
d�Ze	j
fdd
�Zdd�Zdd�Z
dd�Z�ZS)�SysfsPlugina|
	`sysfs`::
	
	Sets various `sysfs` settings specified by the plug-in options.
	+
	The syntax is `_name_=_value_`, where
	`_name_` is the `sysfs` path to use and `_value_` is
	the value to write. The `sysfs` path supports the shell-style
	wildcard characters (see `man 7 glob` for additional detail).
	+
	Use this plugin in case you need to change some settings that are
	not covered by other plug-ins. Prefer specific plug-ins if they
	cover the required settings.
	+
	.Ignore corrected errors and associated scans that cause latency spikes
	====
	----
	[sysfs]
	/sys/devices/system/machinecheck/machinecheck*/ignore_ce=1
	----
	====
	cs$tt|�j||�d|_t�|_dS)NT)�superr�__init__Z_has_dynamic_optionsr�_cmd)�self�args�kwargs)�	__class__��"/usr/lib/python3.6/plugin_sysfs.pyr'szSysfsPlugin.__init__cCs4d|_d|_tdd�t|jj��D��|_i|_dS)NFTcSs$g|]}tjj|d�|df�qS)rr)�os�path�normpath)�.0�	key_valuerrr�
<listcomp>0sz.SysfsPlugin._instance_init.<locals>.<listcomp>)Z_has_dynamic_tuningZ_has_static_tuning�dict�listZoptions�items�_sysfs�_sysfs_original)r
�instancerrr�_instance_init,szSysfsPlugin._instance_initcCsdS)Nr)r
rrrr�_instance_cleanup3szSysfsPlugin._instance_cleanupcCsvxpt|jj��D]^\}}|jj|�}xHtj|�D]:}|j|�r\|j|�|j	|<|j
||�q0tjd|�q0WqWdS)Nz)rejecting write to '%s' (not inside /sys))
rrr�
_variables�expand�glob�iglob�_check_sysfs�_read_sysfsr�_write_sysfs�log�error)r
r�key�value�v�frrr�_instance_apply_static6s
z"SysfsPlugin._instance_apply_staticc
Cspd}xft|jj��D]T\}}|jj|�}x>tj|�D]0}|j|�r4|j|�}	|j	|||	|�dkr4d}q4WqW|S)NTF)
rrrrrr r!r"r#Z
_verify_value)
r
rZignore_missingZdevices�retr'r(r)r*Zcurr_valrrr�_instance_verify_static@s

z#SysfsPlugin._instance_verify_staticcCs,x&t|jj��D]\}}|j||�qWdS)N)rrrr$)r
rZrollbackr'r(rrr�_instance_unapply_staticKsz$SysfsPlugin._instance_unapply_staticcCstjd|�S)Nz^/sys/.*)�re�match)r
�
sysfs_filerrrr"OszSysfsPlugin._check_sysfscCs2|jj|�j�}t|�dkr*|jj|d�SdSdS)NrF)r	Z	read_file�strip�lenZget_active_option)r
r1�datarrrr#RszSysfsPlugin._read_sysfscCs|jj||�S)N)r	Z
write_to_file)r
r1r(rrrr$YszSysfsPlugin._write_sysfs)�__name__�
__module__�__qualname__�__doc__rrrr+r-�constsZ
ROLLBACK_SOFTr.r"r#r$�
__classcell__rr)r
rr
s
r)�rr r/Zos.pathrZ
decoratorsZ
tuned.logsZtunedZtuned.constsr9�
subprocessZtuned.utils.commandsrZlogs�getr%ZPluginrrrrr�<module>s

site-packages/tuned/plugins/__pycache__/plugin_uncore.cpython-36.pyc000064400000011404147511334670021606 0ustar003

�<�es�@sjddlmZddlTddlZddlmZddlZddlZej	j
�Ze�ZdZ
dZdZGdd�dej�ZdS)	�)�hotplug)�*�N)�commandsz//sys/devices/system/cpu/intel_uncore_frequency/c@s�eZdZdZdd�Zdd�Zdd�Zdd	�Zd
d�Ze	dd
��Z
dd�Zeddd�dd��Z
ed�ddd��Zeddd�dd��Zed�ddd��ZdS) �UncorePlugina|
	`uncore`::

	`max_freq_khz, min_freq_khz`:::
	Limit the maximum and minimum uncore frequency.

	These options are Intel-specific and correspond directly to `sysfs` files
	exposed by the Intel uncore frequency driver.
	====
	----
	[uncore]
	max_freq_khz=4000000
	----
	Using this option, *TuneD* will limit the maximum frequency of all uncore
	units on the Intel system to 4 GHz.
	====
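	====
	----
	[uncore]
	min_freq_khz=3000000
	----
	Similarly, [option]`min_freq_khz` can be used to keep the uncore frequency
	from dropping below a given value (3 GHz in this illustrative example).
	====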
	cCs�d|_t�|_t�|_d|_ytjt�}Wntk
r>dSXt	j
|d�}t|�dkrbd|_|}x|D]}|jj|�qhWt
jdt|j��dS)NTFzuncore*rzdevices: %s)Z_devices_supported�setZ_assigned_devicesZ
_free_devicesZ_is_tpmi�os�listdir�	SYSFS_DIR�OSError�fnmatch�filter�len�add�log�debug�str)�selfZdevicesZtpmi_devices�d�r�#/usr/lib/python3.6/plugin_uncore.py�
_init_devices$s
zUncorePlugin._init_devicescCsd|_d|_dS)NTF)Z_has_static_tuningZ_has_dynamic_tuning)r�instancerrr�_instance_init:szUncorePlugin._instance_initcCsdS)Nr)rrrrr�_instance_cleanup>szUncorePlugin._instance_cleanupcCs2t|d|}tj|�}t|�dkr.t|�SdS)N�/r)r
�cmdZ	read_filer�int)r�dev_dir�file�
sysfs_file�valuerrr�_getAs

zUncorePlugin._getcCs(t|d|}tj|d|�r$|SdS)Nrz%u)r
rZ
write_to_file)rrrr!r rrr�_setHszUncorePlugin._setcCs
ddd�S)N)�max_freq_khz�min_freq_khzr)�clsrrr�_get_config_optionsNsz UncorePlugin._get_config_optionsc	Cs*yt|�}Wn"tk
r.tjd|�dSXy4|j|d�}|j|d�}|j|d�}|j|d�}Wn"ttfk
r�tjd�dSX|tkr�||kr�tjd|||f�dS||kr�tjd|||f�|}nT|t	k�r"||k�r�tjd	|||f�dS||k�r&tjd
|||f�|}ndS|S)Nzvalue '%s' is not integer�initial_max_freq_khz�initial_min_freq_khzr$r%z+fail to read inital uncore frequency valuesz/%s: max_freq_khz %d value below min_freq_khz %dzC%s: max_freq_khz %d value above initial_max_freq_khz - capped to %dz/%s: min_freq_khz %d value above max_freq_khz %dzC%s: min_freq_khz %d value below initial_max_freq_khz - capped to %d)
r�
ValueErrorr�errorr"r�IOError�IS_MAX�info�IS_MIN)	r�deviceZ
min_or_maxr!Zfreq_khzr(r)r$r%rrr�_validate_valueUs:



zUncorePlugin._validate_valuer$T)Z
per_devicecCsB|j|t|�}|dkrdS|r"|Stjd||f�|j|d|�S)Nz%s: set max_freq_khz %dr$)r1r-rrr#)rr!r0�sim�remover$rrr�_set_max_freq_khz|szUncorePlugin._set_max_freq_khzFcCs`|rtjjt�rdSy|j|d�}Wn"ttfk
rHtjd�dSXtj	d||f�|S)Nr$z$fail to read uncore frequency valuesz%s: get max_freq_khz %d)
r�path�isdirr
r"rr,rr+r)rr0�ignore_missingr$rrr�_get_max_freq_khz�s
zUncorePlugin._get_max_freq_khzr%cCsB|j|t|�}|dkrdS|r"|Stjd||f�|j|d|�S)Nz%s: set min_freq_khz %dr%)r1r/rrr#)rr!r0r2r3r%rrr�_set_min_freq_khz�szUncorePlugin._set_min_freq_khzcCs`|rtjjt�rdSy|j|d�}Wn"ttfk
rHtjd�dSXtj	d||f�|S)Nr%z$fail to read uncore frequency valuesz%s: get min_freq_khz %d)
rr5r6r
r"rr,rr+r)rr0r7r%rrr�_get_min_freq_khz�s
zUncorePlugin._get_min_freq_khzN)F)F)�__name__�
__module__�__qualname__�__doc__rrrr"r#�classmethodr'r1Zcommand_setr4Zcommand_getr8r9r:rrrrrs'
r)�rZ
decoratorsZ
tuned.logsZtunedZtuned.utils.commandsrrrZlogs�getrrr
r/r-ZPluginrrrrr�<module>s
site-packages/tuned/plugins/__pycache__/plugin_net.cpython-36.opt-1.pyc000064400000056323147511334670022051 0ustar003

�<�e�Z�@spddlZddlmZddlTddlZddlmZddlm	Z	ddl
Z
ddlZejj
�ZdZGdd	�d	ej�ZdS)
�N�)�hotplug)�*)�ethcard)�commandsZpumbagsdcs.eZdZdZ�fdd�Zdd�Zdd�Zdd	�Zd
d�Zdd
�Z	dd�Z
edd��Zedd��Z
edd��Zedd��Zedd��Zdd�Zdd�Zdd�Zd d!�Zd"d#�Zd$d%�Zd&d'�Zed(d)��Zed*d+d,�d-d.��Zed*�did0d1��Zed2�d3d4��Zed2�d5d6��Zgfd7d8�Zdjd:d;�Z ed<d+d,�d=d>��Z!d?d@�Z"ed<�dkdAdB��Z#edCd+d,�dDdE��Z$edC�dldFdG��Z%dHdI�Z&dJdK�Z'dLdM�Z(dNdO�Z)dPdQ�Z*dRdS�Z+dTdU�Z,dmdVdW�Z-dXdY�Z.e/dZd+d,�d[d\��Z0e/d]d+d,�d^d_��Z1e/d`d+d,�dadb��Z2e/dcd+d,�ddde��Z3e/dfd+d,�dgdh��Z4�Z5S)n�NetTuningPlugina�
	`net`::
	
	Configures network driver, hardware and Netfilter settings.
	Dynamic change of the interface speed according to the interface
	utilization is also supported. The dynamic tuning is controlled by
	the [option]`dynamic` and the global [option]`dynamic_tuning`
	option in `tuned-main.conf`.
	+
	The [option]`wake_on_lan` option sets wake-on-lan to the specified
	value as when using the `ethtool` utility.
	+
	.Set Wake-on-LAN for device eth0 on MagicPacket(TM)
	====
	----
	[net]
	devices=eth0
	wake_on_lan=g
	----
	====
	+
	The [option]`coalesce` option allows changing coalescing settings
	for the specified network devices. The syntax is:
	+
	[subs="+quotes,+macros"]
	----
	coalesce=__param1__ __value1__ __param2__ __value2__ ... __paramN__ __valueN__
	----
	Note that not all the coalescing parameters are supported by all
	network cards. For the list of coalescing parameters of your network
	device, use `ethtool -c device`.
	+	
	.Setting coalescing parameters rx/tx-usecs for all network devices
	====
	----
	[net]
	coalesce=rx-usecs 3 tx-usecs 16
	----
	====
	+
	The [option]`features` option allows changing 
	the offload parameters and other features for the specified
	network devices. To query the features of your network device,
	use `ethtool -k device`. The syntax of the option is the same as
	the [option]`coalesce` option.
	+
	.Turn off TX checksumming, generic segmentation and receive offload 
	====
	----
	[net]
	features=tx off gso off gro off
	----
	====
	The [option]`pause` option allows changing the pause parameters for
	the specified network devices. To query the pause parameters of your
	network device, use `ethtool -a device`. The syntax of the option
	is the same as the [option]`coalesce` option.
	+
	.Disable autonegotiation
	====
	----
	[net]
	pause=autoneg off
	----
	====
	+
	The [option]`ring` option allows changing the rx/tx ring parameters
	for the specified network devices. To query the ring parameters of your
	network device, use `ethtool -g device`. The syntax of the option
	is the same as the [option]`coalesce` option.
	+	
	.Change the number of ring entries for the Rx/Tx rings to 1024/512 respectively
	=====
	-----
	[net]
	ring=rx 1024 tx 512
	-----
	=====
	+
	The [option]`channels` option allows changing the numbers of channels
	for the specified network device. A channel is an IRQ and the set
	of queues that can trigger that IRQ. To query the channels parameters of your
	network device, use `ethtool -l device`. The syntax of the option
	is the same as the [option]`coalesce` option.
	+
	.Set the number of multi-purpose channels to 16
	=====
	-----
	[net]
	channels=combined 16
	-----
	=====
	+   
	A network device either supports rx/tx or combined queue
	mode. The [option]`channels` option automatically adjusts the
	parameters based on the mode supported by the device as long as a
	valid configuration is requested.
	+
	The [option]`nf_conntrack_hashsize` option sets the size of the hash
	table which stores lists of conntrack entries by writing to
	`/sys/module/nf_conntrack/parameters/hashsize`.
	+
	.Adjust the size of the conntrack hash table
	====
	----
	[net]
	nf_conntrack_hashsize=131072
	----
	====
	+
	The [option]`txqueuelen` option allows changing txqueuelen (the length
	of the transmit queue). It uses the `ip` utility from the iproute package
	(recommended for TuneD), so the package needs to be installed for this
	option to work correctly. To query the txqueuelen parameters of your network device
	use `ip link show` and the current value is shown after the qlen column.
	+
	.Adjust the length of the transmit queue
	====
	----
	[net]
	txqueuelen=5000
	----
	====
	+
	The [option]`mtu` option allows changing the MTU (Maximum Transmission Unit).
	It uses the `ip` utility from the iproute package (recommended for TuneD), so
	the package needs to be installed for this option to work correctly. To query
	the MTU parameters of your network device use `ip link show` and the
	current value is shown after the MTU column.
	+
	.Adjust the size of the MTU
	====
	----
	[net]
	mtu=9000
	----
	====
	cs6tt|�j||�d|_d|_t�|_i|_d|_dS)Ng�������?�T)	�superr�__init__�_load_smallest�_level_stepsr�_cmd�_re_ip_link_show�_use_ip)�self�args�kwargs)�	__class__�� /usr/lib/python3.6/plugin_net.pyr
�szNetTuningPlugin.__init__cCshd|_t�|_t�|_tjd�}x.|jjd�D]}|j|j	�r.|jj
|j�q.Wtj
dt|j��dS)NTz(?!.*/virtual/.*)�netzdevices: %s)Z_devices_supported�setZ
_free_devicesZ_assigned_devices�re�compile�_hardware_inventoryZget_devices�matchZdevice_path�addZsys_name�log�debug�str)rZre_not_virtual�devicerrr�
_init_devices�s
zNetTuningPlugin._init_devicescs�fdd�|D�S)Ncsg|]}�jjd|��qS)r)rZ
get_device)�.0�x)rrr�
<listcomp>�sz7NetTuningPlugin._get_device_objects.<locals>.<listcomp>r)rZdevicesr)rr�_get_device_objects�sz#NetTuningPlugin._get_device_objectscCsXd|_|j|jd�r<d|_|jjd|j�|_i|_i|_	nd|_d|_d|_d|_	dS)NT�dynamicrF)
Z_has_static_tuningZ_option_boolZoptionsZ_has_dynamic_tuning�_monitors_repositoryZcreateZassigned_devices�
_load_monitor�_idle�_stats)r�instancerrr�_instance_init�szNetTuningPlugin._instance_initcCs"|jdk	r|jj|j�d|_dS)N)r(r'�delete)rr+rrr�_instance_cleanup�s
z!NetTuningPlugin._instance_cleanupcCs|j||�dS)N)�_instance_update_dynamic)rr+r rrr�_instance_apply_dynamic�sz'NetTuningPlugin._instance_apply_dynamiccCs<dd�|jj|�D�}|dkr"dS||jkr8|j||�|j|||�|j||�|j|}|j|}|ddkr�|d|jkr�|d|jkr�d|d<tj	d|�t
|�jd	�nF|ddkr�|ddks�|ddkr�d|d<tj	d
|�t
|�j�tj
d||d|df�tj
d||d|d|df�dS)
NcSsg|]}t|��qSr)�int)r"�valuerrrr$�sz<NetTuningPlugin._instance_update_dynamic.<locals>.<listcomp>�levelr�read�writerz%s: setting 100Mbps�dz%s: setting max speedz %s load: read %0.2f, write %0.2fz$%s idle: read %d, write %d, level %d)r(Zget_device_loadr*�_init_stats_and_idle�
_update_stats�_update_idler)rr�inforZ	set_speed�
set_max_speedr)rr+r �loadZstatsZidlerrrr/�s&


($z(NetTuningPlugin._instance_update_dynamiccCs2ddddddddddddddddddddddd�S)N)zadaptive-rxzadaptive-txzrx-usecsz	rx-frameszrx-usecs-irqz
rx-frames-irqztx-usecsz	tx-framesztx-usecs-irqz
tx-frames-irqzstats-block-usecszpkt-rate-lowzrx-usecs-lowz
rx-frames-lowztx-usecs-lowz
tx-frames-lowz
pkt-rate-highz
rx-usecs-highzrx-frames-highz
tx-usecs-highztx-frames-highzsample-intervalr)�clsrrr�_get_config_options_coalesce�s,z,NetTuningPlugin._get_config_options_coalescecCsdddd�S)N)�autoneg�rx�txr)r=rrr�_get_config_options_pause�sz)NetTuningPlugin._get_config_options_pausecCsddddd�S)N)r@zrx-minizrx-jumborAr)r=rrr�_get_config_options_ringsz(NetTuningPlugin._get_config_options_ringcCsddddd�S)N)r@rA�other�combinedr)r=rrr�_get_config_options_channelssz,NetTuningPlugin._get_config_options_channelscCsddddddddddd�
S)NT)
r&�wake_on_lan�nf_conntrack_hashsize�features�coalesce�pause�ring�channels�
txqueuelen�mtur)r=rrr�_get_config_optionssz#NetTuningPlugin._get_config_optionscCsF|jt|�j��}ddgd|dgd�|j|<dddd�|j|<dS)N�r�r)�new�max)r3r4r5)�_calc_speedrZ
get_max_speedr*r))rr+r Z	max_speedrrrr7sz$NetTuningPlugin._init_stats_and_idlecCs�|j|d|j|d<}||j|d<dd�t||�D�}||j|d<|j|d}dd�t||�D�}||j|d<t|d�t|d�|j|d	<t|d
�t|d
�|j|d<dS)NrS�oldcSsg|]}|d|d�qS)rrr)r"Znew_oldrrrr$(sz1NetTuningPlugin._update_stats.<locals>.<listcomp>�diffrTcSsg|]}t|��qSr)rT)r"Zpairrrrr$-srr4rRr5)r*�zip�float)rr+r Znew_loadZold_loadrWZold_max_loadZmax_loadrrrr8"s"zNetTuningPlugin._update_statscCsLxFdD]>}|j|||jkr6|j||d7<qd|j||<qWdS)Nr4r5rr)r4r5)r*rr))rr+r Z	operationrrrr94s
zNetTuningPlugin._update_idlecCsH||jkrD|j|ddkrDd|j|d<tjd|�t|�j�dS)Nr3rz%s: setting max speed)r)rr:rr;)rr+r rrr�_instance_unapply_dynamic<sz)NetTuningPlugin._instance_unapply_dynamiccCstd|d�S)Ng333333�?i�g333333�@g333333#A)r1)rZspeedrrrrUBszNetTuningPlugin._calc_speedcCs�|jj|�}ttjdd|��j�}t|�}|ddkrPtjd|t|�f�dS|dkr^t	�St	t
t|ddd�|ddd����S)Nz (:\s*)|(\s+)|(\s*;\s*)|(\s*,\s*)� rRrzinvalid %s parameter: '%s'r)Z
_variables�expandrr�sub�split�lenr�error�dict�listrX)rr2�context�vZlvrrr�_parse_config_parametersKsz(NetTuningPlugin._parse_config_parameterscCs||jjddddddddd	d
ddd
dddddd�|�}dd�|jd�D�}t|�dkrXdStdd�dd�|dd�D�D��S)Nzadaptive-rx:z
adaptive-tx:zrx-frames-low:zrx-frames-high:ztx-frames-low:ztx-frames-high:zlro:zrx:ztx:zsg:ztso:zufo:zgso:zgro:zrxvlan:ztxvlan:zntuple:zrxhash:)zAdaptive RX:z\s+TX:z
rx-frame-low:zrx-frame-high:z
tx-frame-low:ztx-frame-high:zlarge-receive-offload:zrx-checksumming:ztx-checksumming:zscatter-gather:ztcp-segmentation-offload:zudp-fragmentation-offload:zgeneric-segmentation-offload:zgeneric-receive-offload:zrx-vlan-offload:ztx-vlan-offload:zntuple-filters:zreceive-hashing:cSs2g|]*}tt|��dkrtjdt|��r|�qS)rz
\[fixed\]$)r`rr�search)r"rerrrr$ssz<NetTuningPlugin._parse_device_parameters.<locals>.<listcomp>�
rRcSsg|]}t|�dkr|�qS)rR)r`)r"�urrrr$xscSsg|]}tjdt|���qS)z:\s*)rr_r)r"rerrrr$xsr)r
�multiple_re_replacer_r`rb)rr2Zvlrrr�_parse_device_parametersZs0z(NetTuningPlugin._parse_device_parameterscCsdS)Nz,/sys/module/nf_conntrack/parameters/hashsizer)rrrr�_nf_conntrack_hashsize_pathzsz+NetTuningPlugin._nf_conntrack_hashsize_pathrGT)Z
per_devicecCs^|dkrdStjddt|��}tjdtd|�s@tjd�dS|sZ|jjdd|d|g�|S)	N�0�dz^[z]+$zIncorrect 'wake_on_lan' value.�ethtoolz-sZwol)	rr^rr�
WOL_VALUESr�warnr
�execute)rr2r �sim�removerrr�_set_wake_on_lan~s
z NetTuningPlugin._set_wake_on_lanFcCsXd}y:tjdtd|jjd|g�dtj�}|r<|jd�}Wntk
rRYnX|S)Nz.*Wake-on:\s*([z]+).*ror)rrrpr
rr�S�group�IOError)rr �ignore_missingr2�mrrr�_get_wake_on_lan�s(z NetTuningPlugin._get_wake_on_lanrHcCsN|dkrdSt|�}|dkrF|sB|jj|j�||r:tjgndd�|SdSdS)NrF)Zno_error)r1r
Z
write_to_filerl�errno�ENOENT)rr2rsrtZhashsizerrr�_set_nf_conntrack_hashsize�sz*NetTuningPlugin._set_nf_conntrack_hashsizecCs(|jj|j��}t|�dkr$t|�SdS)Nr)r
Z	read_filerlr`r1)rr2rrr�_get_nf_conntrack_hashsize�sz*NetTuningPlugin._get_nf_conntrack_hashsizecCsz|js
dSddg|}|jj|tjgdd�\}}}|tjkrRtjd�d|_dS|rvtjd�tjd||f�dS|S)	NZip�linkT)�	no_errorsZ
return_errz0ip command not found, ignoring for other devicesFzProblem calling ip commandz(rc: %s, msg: '%s'))	rr
rrr|r}rrqr:r)rrZrc�outZerr_msgrrr�
_call_ip_link�s

zNetTuningPlugin._call_ip_linkNcCsdg}|r|j|�|j|�S)NZshow)�appendr�)rr rrrr�
_ip_link_show�s
zNetTuningPlugin._ip_link_showrNcCsr|dkrdSyt|�Wn"tk
r:tjd|�dSX|sn|jdd|d|g�}|dkrntjd|�dS|S)Nz$txqueuelen value '%s' is not integerr�devrNz%Cannot set txqueuelen for device '%s')r1�
ValueErrorrrqr�)rr2r rsrt�resrrr�_set_txqueuelen�szNetTuningPlugin._set_txqueuelencCs(||jkrtjd|�|j|<|j|S)z@
		Return regex for int arg value from "ip link show" command
		z.*\s+%s\s+(\d+))rrr)r�argrrr�_get_re_ip_link_show�s
z$NetTuningPlugin._get_re_ip_link_showcCs`|j|�}|dkr(|s$tjd|�dS|jd�j|�}|dkrV|sRtjd|�dS|jd�S)NzECannot get 'ip link show' result for txqueuelen value for device '%s'ZqlenzFCannot get txqueuelen value from 'ip link show' result for device '%s'r)r�rr:r�rgrw)rr ryr�r�rrr�_get_txqueuelen�s
zNetTuningPlugin._get_txqueuelenrOcCsr|dkrdSyt|�Wn"tk
r:tjd|�dSX|sn|jdd|d|g�}|dkrntjd|�dS|S)Nzmtu value '%s' is not integerrr�rOzCannot set mtu for device '%s')r1r�rrqr�)rr2r rsrtr�rrr�_set_mtu�szNetTuningPlugin._set_mtucCs`|j|�}|dkr(|s$tjd|�dS|jd�j|�}|dkrV|sRtjd|�dS|jd�S)Nz>Cannot get 'ip link show' result for mtu value for device '%s'rOz?Cannot get mtu value from 'ip link show' result for device '%s'r)r�rr:r�rgrw)rr ryr�r�rrr�_get_mtu�s
zNetTuningPlugin._get_mtucCsl|dkrdSt|j��}|j|j|j|jd�}t||�j��}|j|�shtjd|t	||�f�dSdS)NrIT)rJrKrLrMzunknown %s parameter(s): %sF)
r�keysr>rBrCrF�issubsetrrar)rrdrnZparamsZsupported_getterZ	supportedrrr�_check_parameters
s

z!NetTuningPlugin._check_parameterscCsR|jjdddd�|�}|jd�dd�}dd�|D�}td	d�d
d�|D�D��S)Nr?r@rA)Z
Autonegotiate�RX�TXrhrcSs&g|]}|dkrtjd|�r|�qS)�z	\[fixed\])rrg)r"r#rrrr$sz;NetTuningPlugin._parse_pause_parameters.<locals>.<listcomp>cSsg|]}t|�dkr|�qS)rR)r`)r"r#rrrr$ scSsg|]}tjd|��qS)z:\s*)rr_)r"r#rrrr$ s)r
rjr_rb)r�s�lrrr�_parse_pause_parameterssz'NetTuningPlugin._parse_pause_parameterscCsjtjd|tjd�}|d}|jjddddd�|�}|jd	�}d
d�|D�}dd�d
d�|D�D�}t|�S)Nz^Current hardware settings:$)�flagsrr@zrx-minizrx-jumborA)r�zRX MinizRX Jumbor�rhcSsg|]}|dkr|�qS)r�r)r"r#rrrr$,sz:NetTuningPlugin._parse_ring_parameters.<locals>.<listcomp>cSsg|]}t|�dkr|�qS)rR)r`)r"r#rrrr$-scSsg|]}tjd|��qS)z:\s*)rr_)r"r#rrrr$-s)rr_�	MULTILINEr
rjrb)rr��ar�rrr�_parse_ring_parameters#s
z&NetTuningPlugin._parse_ring_parameterscCsjtjd|tjd�}|d}|jjddddd�|�}|jd	�}d
d�|D�}dd�d
d�|D�D�}t|�S)Nz^Current hardware settings:$)r�rr@rArDrE)r�r�ZOtherZCombinedrhcSsg|]}|dkr|�qS)r�r)r"r#rrrr$:sz>NetTuningPlugin._parse_channels_parameters.<locals>.<listcomp>cSsg|]}t|�dkr|�qS)rR)r`)r"r#rrrr$;scSsg|]}tjd|��qS)z:\s*)rr_)r"r#rrrr$;s)rr_r�r
rjrb)rr�r�r�rrr�_parse_channels_parameters1s
z*NetTuningPlugin._parse_channels_parameterscCszg}d|kr(|jd|dd|dg�n,ttt|d�t|d���}|jd|g�ttt|ddd�|ddd����S)NrEr@rrA�rR)�extendrrTr1rbrcrX)rrdZparams_list�
dev_paramsZmod_params_listZcntrrr�_replace_channels_parameters>sz,NetTuningPlugin._replace_channels_parametersc	CsRt|j��}t|j��}||}x,|D]$}tjd|||f�|j|d�q&WdS)aFilter unsupported parameters and log warnings about it

		Positional parameters:
		context -- context of change
		parameters -- parameters to change
		device -- name of device on which should be parameters set
		dev_params -- dictionary of currently known parameters of device
		z-%s parameter %s is not supported by device %sN)rr�rZwarning�pop)	rrdZ
parametersr r�Zsupported_parametersZparameters_to_changeZunsupported_parameters�paramrrr�_check_device_supportGs	

z%NetTuningPlugin._check_device_supportc
Cs�dddddd�}||}|jjd||g�\}}|dksBt|�dkrFdS|j|j|j|j|jd�}||}||�}	|d	kr�|j||	�r�dS|	S)
Nz-cz-kz-az-gz-l)rJrIrKrLrMrorrJ)r
rrr`rkr�r�r�r�)
rrdr �context2opt�opt�retr2Zcontext2parser�parserrnrrr�_get_device_parameters^s 
z&NetTuningPlugin._get_device_parametersc	Cs�|dkst|�dkrdS|j||�}|dks:|j||�r>iS|r�|j||||�|dkr�t|tt|���dkr�|j||jj	|�|�}|r�t|�dkr�t
jd|t|�f�dddd	d
d�}||}|jjd||g|jj	|�d
gd�|S)NrrM�n/armzsetting %s: %sz-Cz-Kz-Az-Gz-L)rJrIrKrLrMro�P)r�)r�rm)
r`rfr�r�r�next�iterr�r
�	dict2listrrrr)	rrdr2r rsr�rnr�r�rrr�_set_device_parametersps  $z&NetTuningPlugin._set_device_parameterscs�|j||d�}|r�|j||�}|dks2t|�dkr6dS|j|||||d���dks^t��dkrbdS�fdd�|j�D�}t|�}|r�|jj��|jj|�k}	|j||	�||d�|	S|j	j
|dj|jj|���n|j	j|�}
|j||
|d�dS)	N)Zcommand_nameZdevice_namerF)r�cs g|]\}}|�kr||f�qSrr)r"r�r2)�
params_setrrr$�sz6NetTuningPlugin._custom_parameters.<locals>.<listcomp>)r r\)
Z_storage_keyr�r`r��itemsrbr
r�Z_log_verification_resultZ_storager�join�get)rrd�startr2r �verifyZstorage_keyZparams_currentZrelevant_params_currentr��original_valuer)r�r�_custom_parameters�s:

z"NetTuningPlugin._custom_parametersrIcCs|jd||||�S)NrI)r�)rr�r2r r�ryrrr�	_features�szNetTuningPlugin._featuresrJcCs|jd||||�S)NrJ)r�)rr�r2r r�ryrrr�	_coalesce�szNetTuningPlugin._coalescerKcCs|jd||||�S)NrK)r�)rr�r2r r�ryrrr�_pause�szNetTuningPlugin._pauserLcCs|jd||||�S)NrL)r�)rr�r2r r�ryrrr�_ring�szNetTuningPlugin._ringrMcCs|jd||||�S)NrM)r�)rr�r2r r�ryrrr�	_channels�szNetTuningPlugin._channels)F)N)F)F)N)6�__name__�
__module__�__qualname__�__doc__r
r!r%r,r.r0r/�classmethodr>rBrCrFrPr7r8r9rZrUrfrkrlZcommand_setruZcommand_getr{r~rr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�Zcommand_customr�r�r�r�r��
__classcell__rr)rrrsd
	 



	
&r)r|r�rZ
decoratorsZ
tuned.logsZtunedZtuned.utils.nettoolrZtuned.utils.commandsr�osrZlogsr�rrpZPluginrrrrr�<module>s
site-packages/tuned/plugins/__pycache__/plugin_sysfs.cpython-36.opt-1.pyc000064400000007221147511334670022423 0ustar003

�<�e�
�@srddlmZddlZddlZddlZddlTddlZddl	j
Z
ddlTddlm
Z
ejj�ZGdd�dej�ZdS)�)�base�N)�*)�commandscsfeZdZdZ�fdd�Zdd�Zdd�Zdd	�Zd
d�Ze	j
fdd
�Zdd�Zdd�Z
dd�Z�ZS)�SysfsPlugina|
	`sysfs`::
	
	Sets various `sysfs` settings specified by the plug-in options.
	+
	The syntax is `_name_=_value_`, where
	`_name_` is the `sysfs` path to use and `_value_` is
	the value to write. The `sysfs` path supports the shell-style
	wildcard characters (see `man 7 glob` for additional detail).
	+
	Use this plugin in case you need to change some settings that are
	not covered by other plug-ins. Prefer specific plug-ins if they
	cover the required settings.
	+
	.Ignore corrected errors and associated scans that cause latency spikes
	====
	----
	[sysfs]
	/sys/devices/system/machinecheck/machinecheck*/ignore_ce=1
	----
	====
	cs$tt|�j||�d|_t�|_dS)NT)�superr�__init__Z_has_dynamic_optionsr�_cmd)�self�args�kwargs)�	__class__��"/usr/lib/python3.6/plugin_sysfs.pyr'szSysfsPlugin.__init__cCs4d|_d|_tdd�t|jj��D��|_i|_dS)NFTcSs$g|]}tjj|d�|df�qS)rr)�os�path�normpath)�.0�	key_valuerrr�
<listcomp>0sz.SysfsPlugin._instance_init.<locals>.<listcomp>)Z_has_dynamic_tuningZ_has_static_tuning�dict�listZoptions�items�_sysfs�_sysfs_original)r
�instancerrr�_instance_init,szSysfsPlugin._instance_initcCsdS)Nr)r
rrrr�_instance_cleanup3szSysfsPlugin._instance_cleanupcCsvxpt|jj��D]^\}}|jj|�}xHtj|�D]:}|j|�r\|j|�|j	|<|j
||�q0tjd|�q0WqWdS)Nz)rejecting write to '%s' (not inside /sys))
rrr�
_variables�expand�glob�iglob�_check_sysfs�_read_sysfsr�_write_sysfs�log�error)r
r�key�value�v�frrr�_instance_apply_static6s
z"SysfsPlugin._instance_apply_staticc
Cspd}xft|jj��D]T\}}|jj|�}x>tj|�D]0}|j|�r4|j|�}	|j	|||	|�dkr4d}q4WqW|S)NTF)
rrrrrr r!r"r#Z
_verify_value)
r
rZignore_missingZdevices�retr'r(r)r*Zcurr_valrrr�_instance_verify_static@s

z#SysfsPlugin._instance_verify_staticcCs,x&t|jj��D]\}}|j||�qWdS)N)rrrr$)r
rZrollbackr'r(rrr�_instance_unapply_staticKsz$SysfsPlugin._instance_unapply_staticcCstjd|�S)Nz^/sys/.*)�re�match)r
�
sysfs_filerrrr"OszSysfsPlugin._check_sysfscCs2|jj|�j�}t|�dkr*|jj|d�SdSdS)NrF)r	Z	read_file�strip�lenZget_active_option)r
r1�datarrrr#RszSysfsPlugin._read_sysfscCs|jj||�S)N)r	Z
write_to_file)r
r1r(rrrr$YszSysfsPlugin._write_sysfs)�__name__�
__module__�__qualname__�__doc__rrrr+r-�constsZ
ROLLBACK_SOFTr.r"r#r$�
__classcell__rr)r
rr
s
r)�rr r/Zos.pathrZ
decoratorsZ
tuned.logsZtunedZtuned.constsr9�
subprocessZtuned.utils.commandsrZlogs�getr%ZPluginrrrrr�<module>s

site-packages/tuned/plugins/__pycache__/plugin_scheduler.cpython-36.pyc000064400000152706147511334670022304 0ustar003

�<�e���@s
ddlmZddlTddlZddlZddlTddlZddlZddl	Z	ddl
jZddlZddl
mZddlZddlZddlZddlZy
ejWnek
r�ddlZYnXejj�ZGdd�de�ZGdd	�d	e�ZGd
d�de�ZGdd
�d
e�ZGdd�dej�ZdS)�)�base)�*�N)�commandsc@s0eZdZddd�Zedd��Zejdd��ZdS)�SchedulerParamsNcCs(||_||_||_||_||_||_dS)N)�_cmd�cmdline�	scheduler�priority�affinity�cgroup)�self�cmdrr	r
rr�r�&/usr/lib/python3.6/plugin_scheduler.py�__init__szSchedulerParams.__init__cCs |jdkrdS|jj|j�SdS)N)�	_affinityrZbitmask2cpulist)r
rrrr&s
zSchedulerParams.affinitycCs"|dkrd|_n|jj|�|_dS)N)rrZcpulist2bitmask)r
�valuerrrr-s)NNNNN)�__name__�
__module__�__qualname__r�propertyr�setterrrrrrs
rc@seZdZdd�ZdS)�
IRQAffinitiescCsi|_d|_g|_dS)N)�irqs�default�unchangeable)r
rrrr5szIRQAffinities.__init__N)rrrrrrrrr4src@speZdZdZdddddd�Zdd	�Zd
d�Zdd
�Zdd�Zdd�Z	dd�Z
dd�Zdd�Zdd�Z
dd�ZdS)�SchedulerUtilsz=
	Class encapsulating scheduler implementation in os module
	�
SCHED_FIFO�SCHED_BATCH�SCHED_RR�SCHED_OTHER�
SCHED_IDLE)�f�b�r�o�icCs8tdd�|jj�D��|_tdd�|jj�D��|_dS)Ncss |]\}}|tt|�fVqdS)N)�getattr�os)�.0�k�namerrr�	<genexpr>Jsz*SchedulerUtils.__init__.<locals>.<genexpr>css|]}tt|�|fVqdS)N)r(r))r*r,rrrr-Ls)�dict�_dict_schedcfg2schedconst�items�_dict_schedcfg2num�values�_dict_num2schedconst)r
rrrrHszSchedulerUtils.__init__cCs|jj|�S)N)r1�get)r
�
str_schedulerrrr�sched_cfg_to_numNszSchedulerUtils.sched_cfg_to_numcCs|jj|�S)N)r3r4)r
r	rrr�sched_num_to_constRsz!SchedulerUtils.sched_num_to_constcCs
tj|�S)N)r)�sched_getscheduler)r
�pidrrr�
get_schedulerUszSchedulerUtils.get_schedulercCstj||tj|��dS)N)r)�sched_setscheduler�sched_param)r
r9�sched�priorrr�
set_schedulerXszSchedulerUtils.set_schedulercCs
tj|�S)N)r)�sched_getaffinity)r
r9rrr�get_affinity[szSchedulerUtils.get_affinitycCstj||�dS)N)r)�sched_setaffinity)r
r9rrrr�set_affinity^szSchedulerUtils.set_affinitycCstj|�jS)N)r)�sched_getparam�sched_priority)r
r9rrr�get_priorityaszSchedulerUtils.get_prioritycCs
tj|�S)N)r)�sched_get_priority_min)r
r=rrr�get_priority_mindszSchedulerUtils.get_priority_mincCs
tj|�S)N)r)�sched_get_priority_max)r
r=rrr�get_priority_maxgszSchedulerUtils.get_priority_maxN)rrr�__doc__r/rr6r7r:r?rArCrFrHrJrrrrr;s rc@sPeZdZdZdd�Zdd�Zdd�Zdd	�Zd
d�Zdd
�Z	dd�Z
dd�ZdS)�SchedulerUtilsSchedutilszE
	Class encapsulating scheduler implementation in schedutils module
	cCs8tdd�|jj�D��|_tdd�|jj�D��|_dS)Ncss |]\}}|tt|�fVqdS)N)r(�
schedutils)r*r+r,rrrr-psz4SchedulerUtilsSchedutils.__init__.<locals>.<genexpr>css|]}tt|�|fVqdS)N)r(rM)r*r,rrrr-rs)r.r/r0r1r2r3)r
rrrrnsz!SchedulerUtilsSchedutils.__init__cCs
tj|�S)N)rMr:)r
r9rrrr:tsz&SchedulerUtilsSchedutils.get_schedulercCstj|||�dS)N)rMr?)r
r9r=r>rrrr?wsz&SchedulerUtilsSchedutils.set_schedulercCs
tj|�S)N)rMrA)r
r9rrrrAzsz%SchedulerUtilsSchedutils.get_affinitycCstj||�dS)N)rMrC)r
r9rrrrrC}sz%SchedulerUtilsSchedutils.set_affinitycCs
tj|�S)N)rMrF)r
r9rrrrF�sz%SchedulerUtilsSchedutils.get_prioritycCs
tj|�S)N)rMrH)r
r=rrrrH�sz)SchedulerUtilsSchedutils.get_priority_mincCs
tj|�S)N)rMrJ)r
r=rrrrJ�sz)SchedulerUtilsSchedutils.get_priority_maxN)rrrrKrr:r?rArCrFrHrJrrrrrLjsrLcs�eZdZdZ�fdd�Zdd�Zdd�Zdd	�Zed
d��Z	dd
�Z
dd�Zdd�Zdd�Z
dd�Zdd�Zdd�Zdd�Zdd�Zdd�Zd�d!d"�Zd#d$�Zd%d&�Zd'd(�Zd�d)d*�Zd+d,�Zd-d.�Zd/d0�Zd1d2�Zd3d4�Zd5d6�Zd7d8�Zd9d:�Z d;d<�Z!d=d>�Z"d�d?d@�Z#dAdB�Z$dCdD�Z%�fdEdF�Z&dGdH�Z'dIdJ�Z(dKdL�Z)e*j+f�fdMdN�	Z,dOdP�Z-dQdR�Z.�fdSdT�Z/dUdV�Z0dWdX�Z1dYdZ�Z2e3d[d d\�d]d^��Z4e3d_d d\�d`da��Z5e3dbd d\�dcdd��Z6e3ded d\�dfdg��Z7e3dhd d\�didj��Z8e3dkd d\�dldm��Z9dndo�Z:dpdq�Z;drds�Z<d�dtdu�Z=dvdw�Z>dxdy�Z?dzd{�Z@d|d}�ZAd~d�ZBd�d��ZCd�d��ZDd�d��ZEd�d��ZFe3d�d d�d��d�d���ZGd�d��ZHd�d��ZId�d�d��ZJeKd��d�d���ZLeMd��d�d���ZNeKd��d�d���ZOeMd��d�d���ZPeKd��d�d���ZQeMd��d�d���ZReKd��d�d���ZSeMd��d�d���ZTeKd��d�d���ZUeMd��d�d���ZVeKd��d�d���ZWeMd��d�d���ZXeKd��d�d���ZYeMd��d�d���ZZeKd��d�d���Z[eMd��d�d���Z\eKd��d�d���Z]eMd��d�d���Z^eKd��d�d„�Z_eMd��d�dĄ�Z`�ZaS)��SchedulerPlugina]-
	`scheduler`::
	
	Allows tuning of scheduling priorities, process/thread/IRQ
	affinities, and CPU isolation.
	+
	To prevent processes/threads/IRQs from using certain CPUs, use
	the [option]`isolated_cores` option. It changes process/thread
	affinities, IRQs affinities and it sets `default_smp_affinity`
	for IRQs. The CPU affinity mask is adjusted for all processes and
	threads matching [option]`ps_whitelist` option subject to success
	of the `sched_setaffinity()` system call. The default setting of
	the [option]`ps_whitelist` regular expression is `.*` to match all
	processes and thread names. To exclude certain processes and threads
	use [option]`ps_blacklist` option. The value of this option is also
	interpreted as a regular expression and process/thread names (`ps -eo
	cmd`) are matched against that expression. Profile rollback allows
	all matching processes and threads to run on all CPUs and restores
	the IRQ settings prior to the profile application.
	+
	Multiple regular expressions for [option]`ps_whitelist`
	and [option]`ps_blacklist` options are allowed and separated by
	`;`. Quoted semicolon `\;` is taken literally.
	+
	.Isolate CPUs 2-4
	====
	----
	[scheduler]
	isolated_cores=2-4
	ps_blacklist=.*pmd.*;.*PMD.*;^DPDK;.*qemu-kvm.*
	----
	Isolate CPUs 2-4 while ignoring processes and threads matching
	`ps_blacklist` regular expressions.
	====
	The [option]`irq_process` option controls whether the scheduler plugin
	applies the `isolated_cores` parameter to IRQ affinities. The default
	value is `true`, which means that the scheduler plugin will move all
	possible IRQs away from the isolated cores. When `irq_process` is set
	to `false`, the plugin will not change any IRQ affinities.
	====
	The [option]`default_irq_smp_affinity` option controls the values
	*TuneD* writes to `/proc/irq/default_smp_affinity`. The file specifies
	default affinity mask that applies to all non-active IRQs. Once an
	IRQ is allocated/activated its affinity bitmask will be set to the
	default mask.
	+
	The following values are supported:
	+
	--
	`calc`::
	Content of `/proc/irq/default_smp_affinity` will be calculated
	from the `isolated_cores` parameter. Non-isolated cores
	are calculated as an inversion of the `isolated_cores`. Then
	the intersection of the non-isolated cores and the previous
	content of `/proc/irq/default_smp_affinity` is written to
	`/proc/irq/default_smp_affinity`. If the intersection is
	an empty set, then just the non-isolated cores are written to
	`/proc/irq/default_smp_affinity`. This behavior is the default if
	the parameter `default_irq_smp_affinity` is omitted.
	`ignore`::
	*TuneD* will not touch `/proc/irq/default_smp_affinity`.
	explicit cpulist::
	The cpulist (such as 1,3-4) is unpacked and written directly to
	`/proc/irq/default_smp_affinity`.
	--
	+
	.An explicit CPU list to set the default IRQ smp affinity to CPUs 0 and 2
	====
	----
	[scheduler]
	isolated_cores=1,3
	default_irq_smp_affinity=0,2
	----
	====
	To adjust scheduling policy, priority and affinity for a group of
	processes/threads, use the following syntax.
	+
	[subs="+quotes,+macros"]
	----
	group.__groupname__=__rule_prio__:__sched__:__prio__:__affinity__:__regex__
	----
	+
	where `__rule_prio__` defines the internal *TuneD* priority of the
	rule. Rules are sorted based on priority. This is needed for
	inheritance to be able to reorder previously defined rules. Equal
	`__rule_prio__` rules should be processed in the order they were
	defined. However, this is Python interpreter dependent. To disable
	an inherited rule for `__groupname__` use:
	+
	[subs="+quotes,+macros"]
	----
	group.__groupname__=
	----
	+
	`__sched__` must be one of:
	*`f`* for FIFO,
	*`b`* for batch,
	*`r`* for round robin,
	*`o`* for other,
	*`*`* do not change.
	+
	`__affinity__` is CPU affinity in hexadecimal. Use `*` for no change.
	+
	`__prio__` scheduling priority (see `chrt -m`).
	+
	`__regex__` is Python regular expression. It is matched against the output of
	+
	[subs="+quotes,+macros"]
	----
	ps -eo cmd
	----
	+
	Any given process name may match more than one group. In such a case,
	the priority and scheduling policy are taken from the last matching
	`__regex__`.
	+
	.Setting scheduling policy and priorities to kernel threads and watchdog
	====
	----
	[scheduler]
	group.kthreads=0:*:1:*:\[.*\]$
	group.watchdog=0:f:99:*:\[watchdog.*\]
	----
	====
	+
	The scheduler plug-in uses perf event loop to catch newly created
	processes. By default it listens to `perf.RECORD_COMM` and
	`perf.RECORD_EXIT` events. By setting [option]`perf_process_fork`
	option to `true`, `perf.RECORD_FORK` events will be also listened
	to. In other words, child processes created by the `fork()` system
	call will be processed. Since child processes inherit CPU affinity
	from their parents, the scheduler plug-in usually does not need to
	explicitly process these events. As processing perf events can
	pose a significant CPU overhead, the [option]`perf_process_fork`
	option parameter is set to `false` by default. Due to this, child
	processes are not processed by the scheduler plug-in.
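	+
	.Enabling processing of fork events (at the cost of extra overhead)
	====
	----
	[scheduler]
	perf_process_fork=true
	----
	====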
	+
	The CPU overhead of the scheduler plugin can be mitigated by using
	the scheduler [option]`runtime` option and setting it to `0`. This
	will completely disable the dynamic scheduler functionality and the
	perf events will not be monitored and acted upon. The disadvantage
	of this approach is that the process/thread tuning will be done only at
	profile application.
	+
	.Disabling the scheduler dynamic functionality
	====
	----
	[scheduler]
	runtime=0
	isolated_cores=1,3
	----
	====
	+
	NOTE: For perf events, a memory mapped buffer is used. Under heavy load
	the buffer may overflow. In such cases the `scheduler` plug-in
	may start missing events and failing to process some newly created
	processes. Increasing the buffer size may help. The buffer size can
	be set with the [option]`perf_mmap_pages` option. The value of this
	parameter has to expressed in powers of 2. If it is not the power
	of 2, the nearest higher power of 2 value is calculated from it
	and this calculated value used. If the [option]`perf_mmap_pages`
	option is omitted, the default kernel value is used.
	+
	The scheduler plug-in supports process/thread confinement using
	cgroups v1.
	+
	[option]`cgroup_mount_point` option specifies the path to mount the
	cgroup filesystem or where *TuneD* expects it to be mounted. If unset,
	`/sys/fs/cgroup/cpuset` is expected.
	+
	If [option]`cgroup_groups_init` option is set to `1` *TuneD*
	will create (and remove) all cgroups defined with the `cgroup*`
	options. This is the default behavior. If it is set to `0` the
	cgroups need to be preset by other means.
	+
	If [option]`cgroup_mount_point_init` option is set to `1`,
	*TuneD* will create (and remove) the cgroup mountpoint. It implies
	`cgroup_groups_init = 1`. If set to `0` the cgroups mount point
	needs to be preset by other means. This is the default behavior.
	+
	The [option]`cgroup_for_isolated_cores` option is the cgroup
	name used for the [option]`isolated_cores` option functionality. For
	example, if a system has 4 CPUs, `isolated_cores=1` means that all
	processes/threads will be moved to CPUs 0,2-3.
	The scheduler plug-in will isolate the specified core by writing
	the calculated CPU affinity to the `cpuset.cpus` control file of
	the specified cgroup and move all the matching processes/threads to
	this group. If this option is unset, classic cpuset affinity using
	`sched_setaffinity()` will be used.
	+
	The [option]`cgroup.__cgroup_name__` option defines affinities for
	arbitrary cgroups. Even hierarchic cgroups can be used, but the
	hierarchy needs to be specified in the correct order. *TuneD* does
	not perform any sanity checks here, except that it forces the
	cgroup to be located under [option]`cgroup_mount_point`.
	+
	The syntax of the scheduler option starting with `group.` has been
	augmented to use `cgroup.__cgroup_name__` instead of the hexadecimal
	`__affinity__`. The matching processes will be moved to the cgroup
	`__cgroup_name__`. It is also possible to use cgroups which have
	not been defined by the [option]`cgroup.` option as described above,
	i.e. cgroups not managed by *TuneD*.
	+
	All cgroup names are sanitized by replacing all dots (`.`) with
	slashes (`/`). This prevents the plug-in from writing outside
	[option]`cgroup_mount_point`.
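	+
	For example, a hierarchical cgroup can be defined by using dots in the
	option name (the group names and affinities below are only
	illustrative); the name `parent.child` is sanitized to the path
	`parent/child` under the cgroup mount point:
	+
	.Defining a hierarchical cgroup
	====
	----
	[scheduler]
	cgroup.parent=0,1
	cgroup.parent.child=1
	----
	====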
	+
	.Using cgroups v1 with the scheduler plug-in
	====
	----
	[scheduler]
	cgroup_mount_point=/sys/fs/cgroup/cpuset
	cgroup_mount_point_init=1
	cgroup_groups_init=1
	cgroup_for_isolated_cores=group
	cgroup.group1=2
	cgroup.group2=0,2
	
	group.ksoftirqd=0:f:2:cgroup.group1:ksoftirqd.*
	ps_blacklist=ksoftirqd.*;rcuc.*;rcub.*;ktimersoftd.*
	isolated_cores=1
	----
	Cgroup `group1` has the affinity set to CPU 2 and the cgroup `group2`
	to CPUs 0,2. Given a 4 CPU setup, the [option]`isolated_cores=1`
	option causes all processes/threads to be moved to CPU
	cores 0,2-3. Processes/threads that are blacklisted by the
	[option]`ps_blacklist` regular expression will not be moved.
	
	The scheduler plug-in will isolate the specified core by writing the
	CPU affinity 0,2-3 to the `cpuset.cpus` control file of the `group`
	and move all the matching processes/threads to this cgroup.
	====
	The [option]`cgroup_ps_blacklist` option allows excluding processes
	which belong to the blacklisted cgroups. The regular expression specified
	by this option is matched against cgroup hierarchies from
	`/proc/PID/cgroups`. Cgroups v1 hierarchies from `/proc/PID/cgroups`
	are separated by commas ',' prior to regular expression matching. The
	following is an example of content against which the regular expression
	is matched: `10:hugetlb:/,9:perf_event:/,8:blkio:/`
	+
	Multiple regular expressions can be separated by semicolon ';'. The
	semicolon represents a logical 'or' operator.
	+
	.Cgroup-based exclusion of processes from the scheduler
	====
	----
	[scheduler]
	isolated_cores=1
	cgroup_ps_blacklist=:/daemons\b
	----
	
	The scheduler plug-in will move all processes away from core 1 except processes which
	belong to cgroup '/daemons'. The '\b' is a regular expression
	metacharacter that matches a word boundary.
	
	----
	[scheduler]
	isolated_cores=1
	cgroup_ps_blacklist=\b8:blkio:
	----
	
	The scheduler plug-in will exclude all processes which belong to a cgroup
	with hierarchy-ID 8 and controller-list blkio.
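
	Both patterns can also be combined into a single option using the
	semicolon separator; a process is then excluded when it matches either
	of them:

	----
	[scheduler]
	isolated_cores=1
	cgroup_ps_blacklist=:/daemons\b;\b8:blkio:
	----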
	====
	Recent kernels moved some `sched_` and `numa_balancing_` kernel run-time
	parameters from `/proc/sys/kernel`, managed by the `sysctl` utility, to
	`debugfs`, typically mounted under `/sys/kernel/debug`.  TuneD provides an
	abstraction mechanism for the following parameters via the scheduler plug-in:
	[option]`sched_min_granularity_ns`, [option]`sched_latency_ns`,
	[option]`sched_wakeup_granularity_ns`, [option]`sched_tunable_scaling`,
	[option]`sched_migration_cost_ns`, [option]`sched_nr_migrate`,
	[option]`numa_balancing_scan_delay_ms`,
	[option]`numa_balancing_scan_period_min_ms`,
	[option]`numa_balancing_scan_period_max_ms` and
	[option]`numa_balancing_scan_size_mb`.
	Based on the kernel used, TuneD will write the specified value to the correct
	location.
	+
	.Set tasks' "cache hot" value for migration decisions.
	====
	----
	[scheduler]
	sched_migration_cost_ns=500000
	----
	On older kernels, this is equivalent to:
	----
	[sysctl]
	kernel.sched_migration_cost_ns=500000
	----
	that is, the value `500000` will be written to `/proc/sys/kernel/sched_migration_cost_ns`.
	However, on more recent kernels, the value `500000` will be written to
	`/sys/kernel/debug/sched/migration_cost_ns`.
	====
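	+
	The same mechanism applies to the `numa_balancing_` options. For
	example (the value below is only illustrative):
	+
	.Adjusting the NUMA balancing scan delay
	====
	----
	[scheduler]
	numa_balancing_scan_delay_ms=1500
	----
	On kernels that still expose this parameter through `sysctl`, this is
	equivalent to:
	----
	[sysctl]
	kernel.numa_balancing_scan_delay_ms=1500
	----
	On more recent kernels, the value is written to the corresponding
	`debugfs` location instead.
	====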
	c		s�tt|�j||||||||�d|_tj|_ttj�|_	|dk	rh|j
tjtj�|_t|jtj
tj��|_	t�|_d|_i|_d|_d|_d|_tj�|_|jdd�|_d|_|jdd�|_d|_yt�|_Wntk
r�t �|_YnXdS)NTz.*�r	)Zcommand_name�irq)!�superrNrZ_has_dynamic_options�constsZCFG_DEF_DAEMON�_daemon�intZCFG_DEF_SLEEP_INTERVAL�_sleep_interval�get_boolZ
CFG_DAEMONr4ZCFG_SLEEP_INTERVALrr�_secure_boot_hint�_sched_knob_paths_cache�
_ps_whitelist�
_ps_blacklist�_cgroup_ps_blacklist_re�perfZcpu_map�_cpusZ_storage_key�_scheduler_storage_key�_irq_process�_irq_storage_key�_evlistr�_scheduler_utils�AttributeErrorrL)	r
Zmonitor_repositoryZstorage_factoryZhardware_inventoryZdevice_matcherZdevice_matcher_udevZplugin_instance_factoryZ
global_cfg�	variables)�	__class__rrr�s0


zSchedulerPlugin.__init__cCsT|dkrdSyt|�}Wntk
r,dSX|dkr:dStdtjtj|d���S)Nr�)rT�
ValueError�mathZceil�log)r
Z
mmap_pagesZmprrr�_calc_mmap_pages�sz SchedulerPlugin._calc_mmap_pagescsd|_d|_d|_d|_�jj�ji��_t�j�dkr^t	j
d��j�i�_�jj�j�t
��_d�_d�_d�_tj�fdd�|jj�D���_|j|_�jj|jd�}�j|�}|dkr�t	jd|�d}|dk	r�t|�|kr�t	j
d	||f�x(|jD]}�jj|j|�|j|<�qW�jj|jjd
d��dk�rHd|_tj �|_!�j"�r|j�ry�t#j$�|_%t#j&t#j't#j(ddddddt#j)t#j*Bd
�	}|j+�j,|j%d�t#j-�j,|j%�|_|jj.|�|dk�r�|jj/�n|jj/|d�Wnd|_YnXdS)NFTrz0recovering scheduling settings from previous runcsJg|]B\}}|dd�dkrt|�dkr�j|dd���jj|�f�qS)N�zcgroup.)�len�_sanitize_cgroup_path�
_variables�expand)r*�optionr)r
rr�
<listcomp>�sz2SchedulerPlugin._instance_init.<locals>.<listcomp>�perf_mmap_pageszKInvalid 'perf_mmap_pages' value specified: '%s', using default kernel valuezL'perf_mmap_pages' value has to be power of two, specified: '%s', using: '%d'Zruntimer�0)	�type�configZtask�comm�mmapZfreqZ
wakeup_eventsZ	watermarkZsample_type)Zcpus�threads)Zpages)0raZ_has_dynamic_tuningZ_has_static_tuning�_runtime_tuning�_storager4r^�_scheduler_originalrlri�info�_restore_ps_affinity�unsetr.�_cgroups_original_affinityr�_cgroup_affinity_initialized�_cgroup�collections�OrderedDict�optionsr0�_cgroups�
_schedulerrnrorj�error�strrrV�	threadingZEvent�
_terminaterSr\Z
thread_mapZ_threads�evselZ
TYPE_SOFTWAREZCOUNT_SW_DUMMYZ
SAMPLE_TIDZ
SAMPLE_CPU�openr]Zevlist�addrw)r
�instanceZperf_mmap_pages_rawrrr+r�r)r
r�_instance_init�s^




zSchedulerPlugin._instance_initcCs*|jr&x|jj�D]}tj|j�qWdS)N)ra�
get_pollfdr)�closer,)r
r��fdrrr�_instance_cleanupsz!SchedulerPlugin._instance_cleanupcCs4dtjddddddddddddddddddddd�S)NFT�calcZfalse)�isolated_cores�cgroup_mount_point�cgroup_mount_point_init�cgroup_groups_init�cgroup_for_isolated_cores�cgroup_ps_blacklist�ps_whitelist�ps_blacklist�irq_process�default_irq_smp_affinityrr�perf_process_fork�sched_min_granularity_ns�sched_latency_ns�sched_wakeup_granularity_ns�sched_tunable_scaling�sched_migration_cost_ns�sched_nr_migrate�numa_balancing_scan_delay_ms�!numa_balancing_scan_period_min_ms�!numa_balancing_scan_period_max_ms�numa_balancing_scan_size_mb)rRZDEF_CGROUP_MOUNT_POINT)�clsrrr�_get_config_optionss,z#SchedulerPlugin._get_config_optionscCs|dk	rt|�jdd�SdS)N�.�/)r��replace)r
rrrrrm9sz%SchedulerPlugin._sanitize_cgroup_pathcCs>t|tj�s|}tj|�}tj|�}|j|�r:d|d}|S)N�[�])�
isinstance�procfs�processZprocess_cmdline�_is_kthread)r
r�r9rrrr�_get_cmdline=s


zSchedulerPlugin._get_cmdlinecCs�tj�}|j�i}x�|j�D]�}yN|j|�}|d}|||<d|krnx&|dj�D]}|j|�}|||<qTWWqttfk
r�}z$|jtj	ks�|jtj
kr�wn�WYdd}~XqXqW|S)Nr9rx)r��pidstats�reload_threadsr2r��keys�OSError�IOError�errno�ENOENT�ESRCH)r
�ps�	processes�procrr9�errr�
get_processesGs$

zSchedulerPlugin.get_processescCs@|jj|�}|jj|�}|jj|�}tjd|||f�||fS)Nz8Read scheduler policy '%s' and priority '%d' of PID '%d')rbr:r7rFri�debug)r
r9r	�	sched_strr
rrr�_get_rt`szSchedulerPlugin._get_rtcCs|jj|�}tjd|||f�yB|jj|�}|jj|�}||ksJ||kr`tjd||||f�Wn4ttfk
r�}ztjd|�WYdd}~XnXy|jj	|||�Wn`ttfk
�r}z>t
|d�r�|jtjkr�tjd|�ntjd||f�WYdd}~XnXdS)NzBSetting scheduler policy to '%s' and priority to '%d' of PID '%d'.z9Priority for %s must be in range %d - %d. '%d' was given.z(Failed to get allowed priority range: %sr�zAFailed to set scheduling parameters of PID %d, the task vanished.z1Failed to set scheduling parameters of PID %d: %s)
rbr7rir�rHrJr��SystemErrorr�r?�hasattrr�r�)r
r9r=r>r�Zprio_minZprio_maxr�rrr�_set_rths*
zSchedulerPlugin._set_rtcCs|ddtjj@dkS)N�stat�flagsr)r�ZpidstatZ
PF_KTHREAD)r
r�rrrr��szSchedulerPlugin._is_kthreadcCsyjtj|�}|dj�rd|dddkr8tjd|�n(|j|�rRtjd|�ntjd|�dSdSWn�ttfk
r�}zF|j	t	j
ks�|j	t	jkr�tjd	|�dStjd
||f�d
SWYdd}~Xn8t
tfk
�r}ztjd
||f�dSd}~XnXdS)Nr��state�ZzYAffinity of zombie task with PID %d cannot be changed, the task's affinity mask is fixed.z[Affinity of kernel thread with PID %d cannot be changed, the task's affinity mask is fixed.zRAffinity of task with PID %d cannot be changed, the task's affinity mask is fixed.rrz6Failed to get task info for PID %d, the task vanished.z&Failed to get task info for PID %d: %srf������r�)r�r�Zis_bound_to_cpurir�r��warnr�r�r�r�r�r�rc�KeyError)r
r9r�r�rrr�_affinity_changeable�s2



z$SchedulerPlugin._affinity_changeablecCs\y|j|}Wn(tk
r6t|j�}||j|<YnX|jdkrX|jdkrX||_||_dS)N)r{r�rrr	r
)r
r9r	r
�paramsrrr�_store_orig_process_rt�s
z&SchedulerPlugin._store_orig_process_rtcCs�d}|dkr|dkr|Sy:|j|�\}}|dkr4|}|j|||�|j|||�Wntttfk
r�}zTt|d�r�|jtjkr�tj	d|�||j
kr�|j
|=d}ntjd||f�WYdd}~XnX|S)NTr�z=Failed to read scheduler policy of PID %d, the task vanished.FzcRefusing to set scheduler and priority of PID %d, reading original scheduling parameters failed: %s)r�r�r�r�r�r�r�r�rir�r{r�)r
r9r=r>�contZ
prev_schedZ	prev_prior�rrr�_tune_process_rt�s&
z SchedulerPlugin._tune_process_rtcCst|�dd�dkS)Nrkzcgroup.)r�)r
rrrr�_is_cgroup_affinity�sz#SchedulerPlugin._is_cgroup_affinityFcCsby|j|}Wn(tk
r6t|j�}||j|<YnX|jdkr^|jdkr^|rX||_n||_dS)N)r{r�rrrr)r
r9r�	is_cgroupr�rrr�_store_orig_process_affinity�s
z,SchedulerPlugin._store_orig_process_affinitycCspxj|jjdtjt|�dfdd�jd�D]@}y&|jd�ddd�}|dkrP|Sd	Stk
rfYq(Xq(Wd	S)
Nz%s/%s/%srT)�no_error�
z:cpuset:rrOr�)r�	read_filerRZPROCFS_MOUNT_POINTr��split�
IndexError)r
r9�lrrrr�_get_cgroup_affinity�s,
z$SchedulerPlugin._get_cgroup_affinitycCsB|j|�}|j}|dkr$d||f}|jjd|t|�dd�dS)Nr�z%s/%sz%s/tasksT)r�)rm�_cgroup_mount_pointr�
write_to_filer�)r
r9r�pathrrr�_set_cgroup�s

zSchedulerPlugin._set_cgroupcCs,|dd�}t|t�o"t|�dk}||fS)Nrkr)r��listrl)r
rr�rrr�_parse_cgroup_affinity�sz&SchedulerPlugin._parse_cgroup_affinityc	Cs�d}|dkr|Syd|j|�\}}|r<|j|�}|j||�n(|j|�}|rX|j|||�}|j||�|j|||�Wntttfk
r�}zTt	|d�r�|j
t
jkr�tj
d|�||jkr�|j|=d}ntjd||f�WYdd}~XnX|S)NTr�z5Failed to read affinity of PID %d, the task vanished.FzLRefusing to set CPU affinity of PID %d, reading original affinity failed: %s)r�r�r��
_get_affinity�_get_intersect_affinity�
_set_affinityr�r�r�r�r�r�rir�r{r�)	r
r9r�	intersectr�r�r�
prev_affinityr�rrr�_tune_process_affinity�s4


z&SchedulerPlugin._tune_process_affinitycCsF|j|||�}|sdS|j||�}|s2||jkr6dS||j|_dS)N)r�r�r{r)r
r9rr=r>rr�rrr�
_tune_processszSchedulerPlugin._tune_processcCsf|jj|�}|dkr.|dkr.tjd|�dSyt|�}Wn"tk
r\tjd|�dSX||fS)Nrz>Invalid scheduler: %s. Scheduler and priority will be ignored.z=Invalid priority: %s. Scheduler and priority will be ignored.)NN)NN)rbr6rir�rTrg)r
r5Zstr_priorityr	r
rrr�_convert_sched_paramssz%SchedulerPlugin._convert_sched_paramscCsD|dkrd}n2|j|�r|}n"|jj|�}|s@tjd|�d}|S)Nrz)Invalid affinity: %s. It will be ignored.)r�r�hex2cpulistrir�)r
Zstr_affinityrrrr�_convert_affinity+s
z!SchedulerPlugin._convert_affinitycCs6|\}}}}}|j||�\}}|j|�}|||||fS)N)r�r�)r
�vals�	rule_prior	r
r�regexrrr�_convert_sched_cfg8s

z"SchedulerPlugin._convert_sched_cfgcCs�d|j|f}ytj|tj�Wn4tk
rT}ztjd||f�WYdd}~XnX|jj	d|df|jj
d|jdfdd�dd�s�tjd|�dS)Nz%s/%sz Unable to create cgroup '%s': %szcpuset.memsT)r�z3Unable to initialize 'cpuset.mems ' for cgroup '%s')r�r)�mkdirrR�DEF_CGROUP_MODEr�rir�rr�r�)r
rr�r�rrr�_cgroup_create_group?s$z$SchedulerPlugin._cgroup_create_groupcCs@|jdk	r"|j|jkr"|j|j�x|jD]}|j|�q*WdS)N)r�r�r�)r
�cgrrr�_cgroup_initialize_groupsJsz)SchedulerPlugin._cgroup_initialize_groupscCs�tjd�ytj|jtj�Wn0tk
rN}ztjd|�WYdd}~XnX|j	j
dddddd|jg�\}}|dkr�tjd	|j�dS)
NzInitializing cgroups settingsz'Unable to create cgroup mount point: %sZmountz-trz-oZcpusetrzUnable to mount '%s')rir�r)�makedirsr�rRr�r�r�r�execute)r
r��ret�outrrr�_cgroup_initializePs
  z"SchedulerPlugin._cgroup_initializecCsHytj|�Wn4tk
rB}ztjd||f�WYdd}~XnXdS)Nz#Unable to remove directory '%s': %s)r)�rmdirr�rir�)r
rr�rrr�_remove_dirZszSchedulerPlugin._remove_dircCsXx&t|j�D]}|jd|j|f�qW|jdk	rT|j|jkrT|jd|j|jf�dS)Nz%s/%s)�reversedr�r�r�r�)r
r�rrr�_cgroup_finalize_groups`sz'SchedulerPlugin._cgroup_finalize_groupscCsltjd�|jjd|jg�\}}|dkr<tjd|j�dS|j|j�tjj	|j�}|dkrh|j|�dS)NzRemoving cgroups settingsZumountrzUnable to umount '%s'Fr�)
rir�rr�r�r�r�r)r��dirname)r
r�r��drrr�_cgroup_finalizefs
z SchedulerPlugin._cgroup_finalizecCs�|dkrtjd||f�ntjd|�dSd|j|df}|r~|jj|ddd�j�}|dkrl||j|<ntjd	|�dS|jj||dd
�s�tjd||f�dS)NrOz$Setting cgroup '%s' affinity to '%s'z.Skipping cgroup '%s', empty affinity requestedz%s/%s/%szcpuset.cpus�ERRT)�err_retr�zIRefusing to set affinity of cgroup '%s', reading original affinity failed)r�z+Unable to set affinity '%s' for cgroup '%s')	rir�r�rr��striprr�r�)r
rr�backupr�Z
orig_affinityrrr�_cgroup_set_affinity_oneqsz(SchedulerPlugin._cgroup_set_affinity_onecCs~|jr
dStjd�|jdk	rH|jdk	rH|j|jkrH|j|j|jdd�x*|jj�D]}|j|d|ddd�qTWd|_dS)NzSetting cgroups affinitiesT)rrr)r�rir�rr�r�r	r0)r
r�rrr�_cgroup_set_affinity�s
 z$SchedulerPlugin._cgroup_set_affinitycCs6tjd�x&|jj�D]}|j|d|d�qWdS)NzRestoring cgroups affinitiesrr)rir�rr0r	)r
r�rrr�_cgroup_restore_affinity�s
z(SchedulerPlugin._cgroup_restore_affinityc#sn�jj|jd��_�jj�jj|jd��dk�_�jj�jj|jd��dk�_�j�jj|jd���_	�jr|�j
��js��jr��j�tt
��j|��j�y�j�}Wn2ttfk
r�}ztjd|�dSd}~XnXdd�|jj�D�}�fd	d�|D�}t|d
d�d�}t�}i|_x�|D]�\�\}����ytj���Wn<tjk
�r�}ztjd
t����w0WYdd}~XnX�fdd�|j�D�}t�����fdd�|D��}	|j|	�tjddt�������g|j�<�q0Wx4|j�D](\}
\}������j|
|�����q�W�j j!�j"�j#��j$�rj|j%�rjt&j'�j(|gd�|_)|j)j*�dS)Nr�r��1r�r�zIerror applying tuning, cannot get information about running processes: %scSs$g|]\}}|t|�jdd�f�qS)�:�)r�r�)r*rprrrrrq�sz:SchedulerPlugin._instance_apply_static.<locals>.<listcomp>cs6g|].\}}tjd|�rt|�dkr|�j|�f�qS)zgroup\.�)�re�matchrlr�)r*rpr�)r
rrrq�scSs|ddS)Nrrr)Zoption_valsrrr�<lambda>�sz8SchedulerPlugin._instance_apply_static.<locals>.<lambda>)�keyz(error compiling regular expression: '%s'cs(g|] \}}tj�|�dk	r||f�qS)N)r�search)r*r9r)r%rrrq�sc	s$g|]\}}||�����ff�qSrr)r*r9r)rrpr
r�r	rrrq�sz(?<!\\)\((?!\?)z(?:)�target�args)+rnror�r�rrV�_cgroup_mount_point_init�_cgroup_groups_initrmr�r�r�rQrN�_instance_apply_staticr
r�r�r�rir�r�r0�sortedr.�
_sched_lookupr�compiler��update�subr�rz�setr^r{rSryr�ZThread�_thread_code�_thread�start)r
r�r�r�Z	sched_cfgZbufZ	sched_allr�r�r=r9r)re)rrpr
r%r�r	r
rr�s^





z&SchedulerPlugin._instance_apply_staticcCs�y|j�}Wn2ttfk
r>}ztjd|�dSd}~XnXx�|jj�D]x\}}||ksL|||jkrlqL|jdk	r�|j	dk	r�|j
||j|j	�|jdk	r�|j||j�qL|j
dk	rL|j||j
�qLWi|_|jj|j�dS)NzKerror unapplying tuning, cannot get information about running processes: %s)r�r�r�rir�r{r0rr	r
r�rr�rr�rzr~r^)r
r�r�r9Zorig_paramsrrrr}�s&




z$SchedulerPlugin._restore_ps_affinitycCs�ttj�}d}xr|dkr�|dkr�|jjd|j|dfddd�}|d
krvx.|jd�D] }|jjd	|jdf|dd
�qRW|d8}qW|dkr�tj	d|�dS)N� rOrz%s/%s/%sZtasksT)rr�r�z%s/%s)r�rz(Unable to cleanup tasks from cgroup '%s')rOr#)
rTrRZCGROUP_CLEANUP_TASKS_RETRYrr�r�r�r�rir�)r
rZcnt�datar�rrr�_cgroup_cleanup_tasks_one�s

 z)SchedulerPlugin._cgroup_cleanup_tasks_onecCs@|jdk	r"|j|jkr"|j|j�x|jD]}|j|�q*WdS)N)r�r�r%)r
r�rrr�_cgroup_cleanup_tasks�sz%SchedulerPlugin._cgroup_cleanup_taskscsptt|�j||�|jr2|jr2|jj�|jj�|j	�|j
�|j�|jsV|j
r^|j�|j
rl|j�dS)N)rQrN�_instance_unapply_staticrSryr�rr!�joinr}rr&rrrr)r
r�Zrollback)rerrr'�s

z(SchedulerPlugin._instance_unapply_staticcCs�tjd|�d|j|df}|jj|ddd�}|dkr<dS|jj|jj|��}|jj|jj|��}d|}||kr�tjtj	||f�dStj
tj|||f�dSdS)	NzVerifying cgroup '%s' affinityz%s/%s/%szcpuset.cpusrT)rr�zcgroup '%s' affinityF)rir�r�rr��cpulist2stringZcpulist_packr|rR�STR_VERIFY_PROFILE_VALUE_OKr��STR_VERIFY_PROFILE_VALUE_FAIL)r
rrr��current_affinityZaffinity_descriptionrrr�_cgroup_verify_affinity_ones 
z+SchedulerPlugin._cgroup_verify_affinity_onecCsrtjd�d}|jdk	rB|jdk	rB|j|jkrB|o@|j|j|j�}x*|jj�D]}|oh|j|d|d�}qNW|S)NzVeryfying cgroups affinitiesTrr)rir�rr�r�r-r0)r
r�r�rrr�_cgroup_verify_affinitys
 z'SchedulerPlugin._cgroup_verify_affinitycs$tt|�j|||�}|j�}|o"|S)N)rQrN�_instance_verify_staticr.)r
r��ignore_missingZdevicesZret1Zret2)rerrr/sz'SchedulerPlugin._instance_verify_staticc
Cs�y|j|�}Wn^ttfk
rl}z>|jtjks<|jtjkrLtjd|�ntjd||f�dSd}~XnX|j	j
|j||�}|dk	r�||jkr�tjd||t
|�f�|\}}}	|j|||||	�|jj|j|j�dS)Nz3Failed to get cmdline of PID %d, the task vanished.z#Failed to get cmdline of PID %d: %sz-tuning new process '%s' with PID '%d' by '%s')r�r�r�r�r�r�rir�r�rZ	re_lookuprr{r�r�rzrr^)
r
r�r9r%rr��vr=r>rrrr�_add_pid$s$


zSchedulerPlugin._add_pidcCs6||jkr2|j|=tjd|�|jj|j|j�dS)Nz)removed PID %d from the rollback database)r{rir�rzrr^)r
r�r9rrr�_remove_pid9s


zSchedulerPlugin._remove_pidc	Cs�|jj|j�}tj�}|jj�}x|D]}|j|�q&Wx�|jj	�s�t
|j|jd��dkr:|jj	�r:d}x�|r�d}xt|jD]j}|jj
|�}|r~d}|jtjks�|jr�|jtjkr�|j|t|j�|�q~|jtjkr~|j|t|j��q~WqnWq:WdS)Ni�rTF)rZre_lookup_compiler�select�pollrar��registerr�Zis_setrlrUr]Zread_on_cpurtr\ZRECORD_COMM�_perf_process_fork_valueZRECORD_FORKr2rT�tidZRECORD_EXITr3)	r
r�r%r5Zfdsr�Zread_eventsZcpuZeventrrrr @s&

$zSchedulerPlugin._thread_coder�)�
per_devicecCs:|rdS|r6|dk	r6djdd�tjdt|��D��|_dS)N�|cSsg|]}d|�qS)z(%s)r)r*r1rrrrq_sz8SchedulerPlugin._cgroup_ps_blacklist.<locals>.<listcomp>z(?<!\\);)r(rr�r�r[)r
�enablingr�verifyr0rrr�_cgroup_ps_blacklistYsz$SchedulerPlugin._cgroup_ps_blacklistr�cCs:|rdS|r6|dk	r6djdd�tjdt|��D��|_dS)Nr:cSsg|]}d|�qS)z(%s)r)r*r1rrrrqgsz1SchedulerPlugin._ps_whitelist.<locals>.<listcomp>z(?<!\\);)r(rr�r�rY)r
r;rr<r0rrrrYaszSchedulerPlugin._ps_whitelistr�cCs:|rdS|r6|dk	r6djdd�tjdt|��D��|_dS)Nr:cSsg|]}d|�qS)z(%s)r)r*r1rrrrqosz1SchedulerPlugin._ps_blacklist.<locals>.<listcomp>z(?<!\\);)r(rr�r�rZ)r
r;rr<r0rrrrZiszSchedulerPlugin._ps_blacklistr�cCs*|rdS|r&|dk	r&|jj|�dk|_dS)Nr)rrVr_)r
r;rr<r0rrrr_qszSchedulerPlugin._irq_processr�cCs6|rdS|r2|dk	r2|dkr$||_n|jj|�|_dS)Nr��ignore)r�r>)�_default_irq_smp_affinity_valuer�cpulist_unpack)r
r;rr<r0rrr�_default_irq_smp_affinityysz)SchedulerPlugin._default_irq_smp_affinityr�cCs*|rdS|r&|dk	r&|jj|�dk|_dS)Nr)rrVr7)r
r;rr<r0rrr�_perf_process_fork�sz"SchedulerPlugin._perf_process_forkcCs"|jj|�}tjd||f�|S)NzRead affinity '%s' of PID %d)rbrArir�)r
r9�resrrrr��szSchedulerPlugin._get_affinitycCs�tjd||f�y|jj||�dSttfk
r�}zXt|d�r`|jtjkr`tjd|�n.|j	|�}|dksz|d	kr�tj
d|||f�dSd}~XnXdS)
Nz'Setting CPU affinity of PID %d to '%s'.Tr�z4Failed to set affinity of PID %d, the task vanished.rrfz,Failed to set affinity of PID %d to '%s': %sFr�)rir�rbrCr�r�r�r�r�r�r�)r
r9rr�rCrrrr��s

zSchedulerPlugin._set_affinitycCs"t|�jt|��}|rt|�S|S)N)r�intersectionr�)r
Z	affinity1Z	affinity2Z	affinity3Zaffrrrr��sz'SchedulerPlugin._get_intersect_affinityc
s>�fdd�|D�}�jdkr.�fdd�|D�}�jdkrJ�fdd�|D�}tdd�|D��}x�|D]�}y�j||�}Wnbttfk
r�}zB|jtjks�|jtjkr�t	j
d|�nt	jd||f�wbWYdd}~XnX�j||d	d
�}	|	s�qb|�j
k�r
|�j
|_|rbd||krb�j||dj�|d	�qbWdS)Ncs(g|] }tj�j�j|��dk	r|�qS)N)rrrY�_get_stat_comm)r*r1)r
rrrq�s
z9SchedulerPlugin._set_all_obj_affinity.<locals>.<listcomp>rOcs(g|] }tj�j�j|��dkr|�qS)N)rrrZrE)r*r1)r
rrrq�s
cs(g|] }tj�j�j|��dkr|�qS)N)rrr[�_get_stat_cgroup)r*r1)r
rrrq�s
cSsg|]}|j|f�qSr)r9)r*r1rrrrq�sz3Failed to get cmdline of PID %d, the task vanished.zARefusing to set affinity of PID %d, failed to get its cmdline: %sT)r�rx)rZr[r.r�r�r�r�r�r�rir�r�r�r{r�_set_all_obj_affinityr2)
r
ZobjsrrxZpslZpsdr9rr�r�r)r
rrG�s6



z%SchedulerPlugin._set_all_obj_affinityc
Cs(y|dStttfk
r"dSXdS)NZcgroupsrO)r�r�r�)r
r&rrrrF�sz SchedulerPlugin._get_stat_cgroupc
Cs,y|ddStttfk
r&dSXdS)Nr�rvrO)r�r�r�)r
r&rrrrE�szSchedulerPlugin._get_stat_commcCs`y&tj�}|j�|j|j�|d�Wn4ttfk
rZ}ztjd|�WYdd}~XnXdS)NFzIerror applying tuning, cannot get information about running processes: %s)	r�r�r�rGr2r�r�rir�)r
rr�r�rrr�_set_ps_affinity�sz SchedulerPlugin._set_ps_affinitycCs�yJ|jj|�}tjd||f�d|}t|d��}|j|�WdQRXdSttfk
r�}zLt|d�r�|j	t	j
kr�|r�tjd|�d
Stjd|||f�dSWYdd}~XnXdS)Nz&Setting SMP affinity of IRQ %s to '%s'z/proc/irq/%s/smp_affinity�wrr�z/Setting SMP affinity of IRQ %s is not supportedrfz0Failed to set SMP affinity of IRQ %s to '%s': %srr�r�)r�cpulist2hexrir�r��writer�r�r�r�ZEIOr�)r
rPrZ	restoring�affinity_hex�filenamer#r�rrr�_set_irq_affinity�s"z!SchedulerPlugin._set_irq_affinitycCs|y>|jj|�}tjd|�tdd��}|j|�WdQRXWn8ttfk
rv}ztjd||f�WYdd}~XnXdS)Nz(Setting default SMP IRQ affinity to '%s'z/proc/irq/default_smp_affinityrIz2Failed to set default SMP IRQ affinity to '%s': %s)	rrJrir�r�rKr�r�r�)r
rrLr#r�rrr�_set_default_irq_affinity�sz)SchedulerPlugin._set_default_irq_affinityc	
Cs"t�}tj�}x�|j�D]�}y"||d}tjd||f�Wntk
rTwYnX|j|||�}t|�t|�krvq|j	||d�}|dkr�||j
|<q|d	kr|jj|�qW|j
jd�}|j
j|�}|jdkr�|j|||�}n|jdkr�|j}|jdk�r|j|�||_|jj|j|�dS)
NrzRead affinity of IRQ '%s': '%s'Frrfz/proc/irq/default_smp_affinityr�r>r�)rr��
interruptsr�rir�r�r�rrNrr�appendrr�r�r?rOrrzr`)	r
r�irq_originalrrPr�rrCZprev_affinity_hexrrr�_set_all_irq_affinity
s6


z%SchedulerPlugin._set_all_irq_affinitycCsn|jj|jd�}|dkrdSx$|jj�D]\}}|j||d�q(W|jdkr\|j}|j|�|jj	|j�dS)NTr>)
rzr4r`rr0rNr?rrOr~)r
rRrPrrrr�_restore_all_irq_affinity)s

z)SchedulerPlugin._restore_all_irq_affinitycCsFt|�jt|��}|r,tjtj||f�ntjtj|||f�|S)N)r�issubsetrir|rRr*r�r+)r
�irq_description�correct_affinityr,rCrrr�_verify_irq_affinity4s
z$SchedulerPlugin._verify_irq_affinitycCs�|jj|jd�}tj�}d}x�|j�D]�}||jkrR|rRd|}tjt	j
|�q&y<||d}tjd||f�d|}	|j|	||�s�d}Wq&t
k
r�w&Yq&Xq&W|jjd�}
|jj|
�}|jdkr�|jd	||jd
kr�|n|j�r�d}|S)NTz-IRQ %s does not support changing SMP affinityrz#Read SMP affinity of IRQ '%s': '%s'zSMP affinity of IRQ %sFz/proc/irq/default_smp_affinityr>zdefault IRQ SMP affinityr�)rzr4r`r�rPr�rrir|rRZ STR_VERIFY_PROFILE_VALUE_MISSINGr�rXr�rr�r�r?)r
rWr0rRrrCrP�descriptionr,rVZcurrent_affinity_hexrrr�_verify_all_irq_affinity@s8
z(SchedulerPlugin._verify_all_irq_affinityr��
)r9r
c
Cs�d}d|_|dk	rrt|jj|��}t|j�}|j|�rRt||�}|jj|�|_n |jj|j�}tj	d||f�|sz|r�|dkr�dS|r�|j
r�|j||�SdS|r�|jr�|j
�d|j}	n|}	|j|	�|j
r�|j|�n|j
r�|j�dS)NzJInvalid isolated_cores specified, '%s' does not match available cores '%s'Tz	cgroup.%s)rrrr@r]rUr�r)rir�r_rZr�r
rHrSrT)
r
r;rr<r0r�isolatedZpresentZstr_cpusZps_affinityrrr�_isolated_cores_s6


zSchedulerPlugin._isolated_corescCs�d|||f}|jj|�}|r"|Sd||f}tjj|�sv|dkrPd||f}nd|||f}d|}|jdkrvd|_||j|<|S)Nz%s_%s_%sz/proc/sys/kernel/%s_%srOz%s/%sz%s/%s/%sz/sys/kernel/debug/%sT)rXr4r)r��existsrW)r
�prefix�	namespace�knobrr�rrr�_get_sched_knob_path�s

z$SchedulerPlugin._get_sched_knob_pathcCsJ|jj|j|||�dd�}|dkrFtjd|�|jrFtjd�d|_|S)N)rzError reading '%s'zUThis may not work with Secure Boot or kernel_lockdown (this hint is logged only once)F)rr�rbrir�rW)r
r_r`rar$rrr�_get_sched_knob�s
zSchedulerPlugin._get_sched_knobcCsN|dkrdS|sJ|jj|j|||�||r0tjgndd�sJtjd||f�|S)NF)r�z Error writing value '%s' to '%s')rr�rbr�r�rir�)r
r_r`rar�sim�removerrr�_set_sched_knob�szSchedulerPlugin._set_sched_knobr�cCs|jddd�S)NrOr=�min_granularity_ns)rc)r
rrr�_get_sched_min_granularity_ns�sz-SchedulerPlugin._get_sched_min_granularity_nscCs|jddd|||�S)NrOr=rg)rf)r
rrdrerrr�_set_sched_min_granularity_ns�sz-SchedulerPlugin._set_sched_min_granularity_nsr�cCs|jddd�S)NrOr=�
latency_ns)rc)r
rrr�_get_sched_latency_ns�sz%SchedulerPlugin._get_sched_latency_nscCs|jddd|||�S)NrOr=rj)rf)r
rrdrerrr�_set_sched_latency_ns�sz%SchedulerPlugin._set_sched_latency_nsr�cCs|jddd�S)NrOr=�wakeup_granularity_ns)rc)r
rrr� _get_sched_wakeup_granularity_ns�sz0SchedulerPlugin._get_sched_wakeup_granularity_nscCs|jddd|||�S)NrOr=rm)rf)r
rrdrerrr� _set_sched_wakeup_granularity_ns�sz0SchedulerPlugin._set_sched_wakeup_granularity_nsr�cCs|jddd�S)NrOr=�tunable_scaling)rc)r
rrr�_get_sched_tunable_scaling�sz*SchedulerPlugin._get_sched_tunable_scalingcCs|jddd|||�S)NrOr=rp)rf)r
rrdrerrr�_set_sched_tunable_scaling�sz*SchedulerPlugin._set_sched_tunable_scalingr�cCs|jddd�S)NrOr=�migration_cost_ns)rc)r
rrr�_get_sched_migration_cost_ns�sz,SchedulerPlugin._get_sched_migration_cost_nscCs|jddd|||�S)NrOr=rs)rf)r
rrdrerrr�_set_sched_migration_cost_ns�sz,SchedulerPlugin._set_sched_migration_cost_nsr�cCs|jddd�S)NrOr=�
nr_migrate)rc)r
rrr�_get_sched_nr_migrate�sz%SchedulerPlugin._get_sched_nr_migratecCs|jddd|||�S)NrOr=rv)rf)r
rrdrerrr�_set_sched_nr_migrate�sz%SchedulerPlugin._set_sched_nr_migrater�cCs|jddd�S)Nr=�numa_balancing�
scan_delay_ms)rc)r
rrr�!_get_numa_balancing_scan_delay_ms�sz1SchedulerPlugin._get_numa_balancing_scan_delay_mscCs|jddd|||�S)Nr=ryrz)rf)r
rrdrerrr�!_set_numa_balancing_scan_delay_ms�sz1SchedulerPlugin._set_numa_balancing_scan_delay_msr�cCs|jddd�S)Nr=ry�scan_period_min_ms)rc)r
rrr�&_get_numa_balancing_scan_period_min_ms�sz6SchedulerPlugin._get_numa_balancing_scan_period_min_mscCs|jddd|||�S)Nr=ryr})rf)r
rrdrerrr�&_set_numa_balancing_scan_period_min_ms�sz6SchedulerPlugin._set_numa_balancing_scan_period_min_msr�cCs|jddd�S)Nr=ry�scan_period_max_ms)rc)r
rrr�&_get_numa_balancing_scan_period_max_ms�sz6SchedulerPlugin._get_numa_balancing_scan_period_max_mscCs|jddd|||�S)Nr=ryr�)rf)r
rrdrerrr�&_set_numa_balancing_scan_period_max_ms�sz6SchedulerPlugin._set_numa_balancing_scan_period_max_msr�cCs|jddd�S)Nr=ry�scan_size_mb)rc)r
rrr� _get_numa_balancing_scan_size_mb�sz0SchedulerPlugin._get_numa_balancing_scan_size_mbcCs|jddd|||�S)Nr=ryr�)rf)r
rrdrerrr� _set_numa_balancing_scan_size_mb�sz0SchedulerPlugin._set_numa_balancing_scan_size_mb)F)F)F)F)F)brrrrKrrjr�r��classmethodr�rmr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrr	r
rrr}r%r&rRZ
ROLLBACK_SOFTr'r-r.r/r2r3r Zcommand_customr=rYrZr_rArBr�r�r�rGrFrErHrNrOrSrTrXrZr]rbrcrfZcommand_getrhZcommand_setrirkrlrnrorqrrrtrurwrxr{r|r~rr�r�r�r��
__classcell__rr)rerrN�s�(?




	



<

	
"$	
	rN) rOrZ
decoratorsZ
tuned.logsZtunedr�
subprocessr�r\r4Ztuned.constsrRr�Ztuned.utils.commandsrr�r)r�rhrrcrMZlogsr4ri�objectrrrrLZPluginrNrrrr�<module>s0


/site-packages/tuned/plugins/__pycache__/plugin_mounts.cpython-36.opt-1.pyc000064400000012773147511334670022611 0ustar003

�<�e��@spddljZddlmZddlTddlmZmZddlZ	ddl
mZddlZe	j
j�Ze�ZGdd�dej�ZdS)	�N�)�base)�*)�Popen�PIPE)�commandsc@steZdZdZedd��Zdd�Zedd��Zdd	�Zd
d�Z	dd
�Z
dd�Zdd�Zdd�Z
eddd�dd��ZdS)�MountsPluginaP
	`mounts`::
	
	Enables or disables barriers for mounts according to the value of the
	[option]`disable_barriers` option. The [option]`disable_barriers`
	option has an optional value `force` which disables barriers even
	on mountpoints with write back caches. Note that only extended file
	systems (ext) are supported by this plug-in.
	cCs�i}d}tdddgttddd�j�\}}x�dd�|j�D�D]�}t|�d	krNq<|dd	�\}}}t|�d	krt|d	nd}	t|�d
kr�|d
nd}
|dkr�|}q<|dks<|dkr�q<|
dks<|
dkr�q<|j|
t�||	d��||
dj|�q<W||_dS)z�
		Gets the information about disks, partitions and mountpoints. Stores information about used filesystem and
		creates a list of all underlying devices (in case of LVM) for each mountpoint.
		NZlsblkz-rnozTYPE,RM,KNAME,FSTYPE,MOUNTPOINTT)�stdout�stderrZ	close_fdsZuniversal_newlinescSsg|]}|j��qS�)�split)�.0�linerr�#/usr/lib/python3.6/plugin_mounts.py�
<listcomp>$sz>MountsPlugin._generate_mountpoint_topology.<locals>.<listcomp>��Zdisk�1�part�lvmz[SWAP])�disks�device_name�
filesystemr)rr)	rrZcommunicate�
splitlines�len�
setdefault�set�add�_mountpoint_topology)�clsZmountpoint_topologyZcurrent_diskr	r
�columnsZdevice_typeZdevice_removablerr�
mountpointrrr�_generate_mountpoint_topologys,z*MountsPlugin._generate_mountpoint_topologycCs*|j�d|_t|jj��|_t�|_dS)NT)r"Z_devices_supportedrr�keysZ
_free_devicesZ_assigned_devices)�selfrrr�
_init_devices;szMountsPlugin._init_devicescCsddiS)N�disable_barriersr)r$rrr�_get_config_optionsAsz MountsPlugin._get_config_optionscCsd|_d|_dS)NFT)Z_has_dynamic_tuningZ_has_static_tuning)r$�instancerrr�_instance_initGszMountsPlugin._instance_initcCsdS)Nr)r$r(rrr�_instance_cleanupKszMountsPlugin._instance_cleanupcCs,tjd|�}x|D]}tj|�j�SWdS)zV
		Get device cache type. This will work only for devices on SCSI kernel subsystem.
		z+/sys/block/%s/device/scsi_disk/*/cache_typeN)�glob�cmdZ	read_file�strip)r$�deviceZsource_filenamesZsource_filenamerrr�_get_device_cache_typeNs
z#MountsPlugin._get_device_cache_typecCs.x(|j|dD]}|j|�dkrdSqWdS)zr
		Checks if the device has 'write back' cache. If the cache type cannot be determined, asume some other cache.
		rz
write backTF)rr/)r$r!r.rrr�_mountpoint_has_writeback_cacheWsz,MountsPlugin._mountpoint_has_writeback_cachecCs�td��H}x@|D]4}|j�}|dddkr.q|d|kr|d}PqWdSWdQRX|jd�}xH|D]<}|jd�\}}	}
|d	ks�|d
kr�|
dkr�dS|d
krfd
SqfWd
SdS)zP
		Checks if a given mountpoint is mounted with barriers enabled or disabled.
		z/proc/mountsr�/rrN�,�=Z	nobarrierZbarrier�0FT)�openr�	partition)r$r!Zmounts_filerr Zoption_list�optionsZoption�name�sep�valuerrr�_mountpoint_has_barriers`s"



z%MountsPlugin._mountpoint_has_barrierscCsd|dd|g}tj|�dS)z
		Remounts partition.
		z/usr/bin/mountz-oz
remount,%sN)r,Zexecute)r$r6r7Zremount_commandrrr�_remount_partition}szMountsPlugin._remount_partitionr&T)Z
per_devicec
CsZ|jd|d�}t|�j�dk}|p*|j|�}|�r|s:dSd}|j|djd�sXd}nl|rn|j|�rnd}nV|j|�}	|	dkr�d}n>|	d	kr�|r�tj	t
j|�d
Sd}n|r�tjt
j
|�d	S|dk	r�tj	d||f�dS|jj||	�tj	d
|�|j|d�nJ|�rdS|jj|�}	|	dk�r0dStj	d|�|j|d�|jj|�dS)Nr&)Zcommand_namer�forcerZextzfilesystem not supportedzdevice uses write back cachezunknown current settingFTzbarriers already disabledz#not disabling barriers on '%s' (%s)zdisabling barriers on '%s'z	barrier=0zenabling barriers on '%s'z	barrier=1)Z_storage_key�str�lowerZ_option_boolr�
startswithr0r;�log�info�constsZSTR_VERIFY_PROFILE_OK�errorZSTR_VERIFY_PROFILE_FAILZ_storagerr<�getZunset)
r$�startr:r!ZverifyZignore_missingZstorage_keyr=Z
reject_reason�original_valuerrr�_disable_barriers�sN

zMountsPlugin._disable_barriersN)�__name__�
__module__�__qualname__�__doc__�classmethodr"r%r'r)r*r/r0r;r<Zcommand_customrHrrrrrs	$		r)Ztuned.constsrC�rZ
decorators�
subprocessrrZ
tuned.logsZtunedZtuned.utils.commandsrr+ZlogsrErAr,ZPluginrrrrr�<module>s

site-packages/tuned/plugins/__pycache__/plugin_selinux.cpython-36.pyc000064400000006001147511334670021777 0ustar003

�<�e	�@sdddlZddlZddlmZddlTddlZddlmZddl	m
Z
ejj�Z
Gdd�dej�ZdS)	�N�)�base)�*)�
exceptions)�commandscsheZdZdZedd��Z�fdd�Zedd��Zdd	�Zd
d�Z	e
d�d
d��Zed�dd��Z
�ZS)�
SelinuxPlugina�
	`selinux`::
	
	Plug-in for tuning SELinux options.
	+
	SELinux decisions, such as allowing or denying access, are
	cached. This cache is known as the Access Vector Cache (AVC). When
	using these cached decisions, SELinux policy rules need to be checked
	less, which increases performance. The [option]`avc_cache_threshold`
	option allows adjusting the maximum number of AVC entries.
	+
	NOTE: Prior to changing the default value, evaluate the system
	performance with care. Increasing the value could potentially
	decrease the performance by making AVC slow.
	+
	.Increase the AVC cache threshold for hosts with containers.
	====
	----
	[selinux]
	avc_cache_threshold=8192
	----
	====
	cCs(d}tjj|�s$d}tjj|�s$d}|S)Nz/sys/fs/selinuxz/selinux)�os�path�exists)�selfr	�r�$/usr/lib/python3.6/plugin_selinux.py�_get_selinux_path$szSelinuxPlugin._get_selinux_pathcsPt�|_|j�|_|jdkr&tjd��tjj|jdd�|_	t
t|�j||�dS)NzFSELinux is not enabled on your system or incompatible version is used.ZavcZcache_threshold)
r�_cmdrZ
_selinux_pathrZNotSupportedPluginExceptionrr	�join�_cache_threshold_path�superr�__init__)r�args�kwargs)�	__class__rr
r-s


zSelinuxPlugin.__init__cCsddiS)N�avc_cache_thresholdr)rrrr
�_get_config_options5sz!SelinuxPlugin._get_config_optionscCsd|_d|_dS)NTF)Z_has_static_tuningZ_has_dynamic_tuning)r�instancerrr
�_instance_init;szSelinuxPlugin._instance_initcCsdS)Nr)rrrrr
�_instance_cleanup?szSelinuxPlugin._instance_cleanuprcCsL|dkrdSt|�}|dkrD|s@|jj|j||r8tjgndd�|SdSdS)NrF)Zno_error)�intrZ
write_to_filer�errno�ENOENT)r�valueZsim�removeZ	thresholdrrr
�_set_avc_cache_thresholdBsz&SelinuxPlugin._set_avc_cache_thresholdcCs&|jj|j�}t|�dkr"t|�SdS)Nr)rZ	read_filer�lenr)rrrrr
�_get_avc_cache_thresholdOsz&SelinuxPlugin._get_avc_cache_threshold)�__name__�
__module__�__qualname__�__doc__�classmethodrrrrrZcommand_setr!Zcommand_getr#�
__classcell__rr)rr
rs	
r)rr�rZ
decoratorsZ
tuned.logsZtunedZ
tuned.pluginsrZtuned.utils.commandsrZlogs�get�logZPluginrrrrr
�<module>s
site-packages/tuned/plugins/__pycache__/plugin_disk.cpython-36.pyc000064400000040550147511334670021251 0ustar003

�<�e�A�@sjddlZddlmZddlTddlZddljZddlm	Z	ddl
Z
ddlZejj
�ZGdd�dej�ZdS)�N�)�hotplug)�*)�commandscs�eZdZdZ�fdd�Z�fdd�Zdd�Zdd	�Zed
d��Z	dd
�Z
dd�Z�fdd�Z�fdd�Z
�fdd�Zedd��Zedd��Zdd�Zdd�Zdd�Zd d!�Zd"d#�Zd$d%�Zd&d'�Zd(d)�Zd*d+�Z�fd,d-�Zd.d/�ZdZd1d2�Zd3d4�Zed5d6d7�d8d9��Ze d5�d[d;d<��Z!ed=d6d7�d>d?��Z"e d=�d\d@dA��Z#edBd6d7�dCdD��Z$e dB�d]dEdF��Z%dGdH�Z&dIdJ�Z'edKd6d7�dLdM��Z(e dK�d^dNdO��Z)e*dPd6d7�dQdR��Z+dSdT�Z,edUd6d7�dVdW��Z-e dU�d_dXdY��Z.�Z/S)`�
DiskPlugina�	
	`disk`::
	
	Plug-in for tuning various block device options. This plug-in can also
	dynamically change the advanced power management and spindown timeout
	setting for a drive according to the current drive utilization. The
	dynamic tuning is controlled by the [option]`dynamic` and the global
	[option]`dynamic_tuning` option in `tuned-main.conf`.
	+
	The disk plug-in operates on all supported block devices unless a
	comma separated list of [option]`devices` is passed to it.
	+
	.Operate only on the sda block device
	====
	----
	[disk]
	# Comma separated list of devices, all devices if commented out.
	devices=sda
	----
	====
	+
	The [option]`elevator` option sets the Linux I/O scheduler.
	+
	.Use the bfq I/O scheduler on xvda block device
	====
	----
	[disk]
	device=xvda
	elevator=bfq
	----
	====
	+
	The [option]`scheduler_quantum` option only applies to the CFQ I/O
	scheduler. It defines the number of I/O requests that CFQ sends to
	one device at one time, essentially limiting queue depth. The default
	value is 8 requests. The device being used may support greater queue
	depth, but increasing the value of quantum will also increase latency,
	especially for large sequential write work loads.
	+
	The [option]`apm` option sets the Advanced Power Management feature
	on drives that support it. It corresponds to using the `-B` option of
	the `hdparm` utility. The [option]`spindown` option puts the drive
	into idle (low-power) mode, and also sets the standby (spindown)
	timeout for the drive. It corresponds to using `-S` option of the
	`hdparm` utility.
	+
	.Use a medium-agressive power management with spindown
	====
	----
	[disk]
	apm=128
	spindown=6
	----
	====
	+
	The [option]`readahead` option controls how much extra data the
	operating system reads from disk when performing sequential
	I/O operations. Increasing the `readahead` value might improve
	performance in application environments where sequential reading of
	large files takes place. The default unit for readahead is KiB. This
	can be adjusted to sectors by specifying the suffix 's'. If the
	suffix is specified, there must be at least one space between the
	number and suffix (for example, `readahead=8192 s`).
	+
	.Set the `readahead` to 4MB unless already set to a higher value
	====
	----
	[disk]
	readahead=>4096
	----
	====
	The disk readahead value can be multiplied by the constant
	specified by the [option]`readahead_multiply` option.
	csrtt|�j||�ddddddddd	d
ddg|_d
dddddddddd	dg|_t|j�|_d|_d|_t	�|_
dS)N��������}�i�U�F�7��r����������n�Z�<�g{�G�z�?)�superr�__init__�
_power_levels�_spindown_levels�len�_levels�_level_steps�_load_smallestr�_cmd)�self�args�kwargs)�	__class__��!/usr/lib/python3.6/plugin_disk.pyrXszDiskPlugin.__init__cs�tt|�j�d|_d|_t�|_t�|_xL|jj	d�D]<}|j
|�r8|jj|j�|jr8|j
|j�r8|jj|j�q8Wt�|_dS)NT�block)rr�
_init_devicesZ_devices_supported�_use_hdparm�setZ
_free_devices�_hdparm_apm_devices�_hardware_inventoryZget_devices�_device_is_supported�addZsys_name�_is_hdparm_apm_supportedZ_assigned_devices)r'�device)r*r+r,r.bs
zDiskPlugin._init_devicescs�fdd�|D�S)Ncsg|]}�jjd|��qS)r-)r2Z
get_device)�.0�x)r'r+r,�
<listcomp>qsz2DiskPlugin._get_device_objects.<locals>.<listcomp>r+)r'Zdevicesr+)r'r,�_get_device_objectspszDiskPlugin._get_device_objectscCs�|jjddd|gtjgdd�\}}}|tjkrFtjd�d|_dS|rntjd|�tjd	||f�dSd
|kr�tjd|�dSdS)N�hdparmz-Cz/dev/%sT)�	no_errorsZ
return_errz4hdparm command not found, ignoring for other devicesFz#Device '%s' not supported by hdparmz(rc: %s, msg: '%s')�unknownz3Driver for device '%s' does not support apm command)	r&�execute�errno�ENOENT�log�warnr/�info�debug)r'r6�rc�outZerr_msgr+r+r,r5ss
z#DiskPlugin._is_hdparm_apm_supportedcCs2|jdko0|jjdd�dko0|jdkp0|jjdkS)	N�diskZ	removable�0�scsi�virtio�xen�nvme)rIrJrKrL)Zdevice_typeZ
attributes�get�parentZ	subsystem)�clsr6r+r+r,r3�s

zDiskPlugin._device_is_supportedcCs|jj|d|j�dS)Nr-)r2Z	subscribe�_hardware_events_callback)r'r+r+r,�_hardware_events_init�sz DiskPlugin._hardware_events_initcCs|jj|�dS)N)r2Zunsubscribe)r'r+r+r,�_hardware_events_cleanup�sz#DiskPlugin._hardware_events_cleanupcs(|j|�s|dkr$tt|�j||�dS)N�remove)r3rrrP)r'Zeventr6)r*r+r,rP�sz$DiskPlugin._hardware_events_callbackcs,|jdk	r|jj|�tt|�j||�dS)N)�
_load_monitorZ
add_devicerr�_added_device_apply_tuning)r'�instance�device_name)r*r+r,rU�s
z%DiskPlugin._added_device_apply_tuningcs,|jdk	r|jj|�tt|�j||�dS)N)rTZ
remove_devicerr�_removed_device_unapply_tuning)r'rVrW)r*r+r,rX�s
z)DiskPlugin._removed_device_unapply_tuningcCsdddddddd�S)NT)�dynamic�elevator�apm�spindown�	readahead�readahead_multiply�scheduler_quantumr+)rOr+r+r,�_get_config_options�szDiskPlugin._get_config_optionscCsddgS)Nr[r\r+)rOr+r+r,�#_get_config_options_used_by_dynamic�sz.DiskPlugin._get_config_options_used_by_dynamiccCsdd|_d|_d|_|j|jd�rTd|_|jjd|j�|_	i|_
i|_i|_i|_
nd|_d|_	dS)NTrrYrGF)Z_has_static_tuning�_apm_errcnt�_spindown_errcntZ_option_boolZoptionsZ_has_dynamic_tuning�_monitors_repositoryZcreateZassigned_devicesrTZ_device_idle�_stats�_idle�_spindown_change_delayed)r'rVr+r+r,�_instance_init�szDiskPlugin._instance_initcCs"|jdk	r|jj|j�d|_dS)N)rTrd�delete)r'rVr+r+r,�_instance_cleanup�s
zDiskPlugin._instance_cleanupcCs�|rd}|j}n
d}|j}|tjkr(dS|dkr6d}nL|tjkrbtjd|_|_tjd�dS|d7}|tjkr�tjd|�|r�||_n||_dS)Nr\r[rrzIhdparm command not found, ignoring future set_apm / set_spindown commandsz5disabling set_%s command: too many consecutive errors)	rcrb�consts�ERROR_THRESHOLDr?r@rArBrC)r'rEr\�sZcntr+r+r,�_update_errcnt�s&


zDiskPlugin._update_errcntcCsNtjd|�|jjdd|d|gtjgd�\}}|j|d�d|j|<dS)Nzchanging spindown to %dr;z-S%dz/dev/%s)r<TF)rArDr&r>r?r@rnrg)r'rVr6�new_spindown_levelrErFr+r+r,�_change_spindown�s&zDiskPlugin._change_spindowncCs2|jjddd|gtjgd�\}}d|ko0d|kS)Nr;z-Cz/dev/%s)r<ZstandbyZsleeping)r&r>r?r@)r'r6rErFr+r+r,�_drive_spinning�s"zDiskPlugin._drive_spinningcCs(||jkrdS|jj|�}|dkr&dS||jkr<|j||�|j|||�|j||�|j|}|j|}|dd|jkr�|d|j	kr�|d|j	kr�d}n.|ddkr�|ddks�|ddkr�d}nd}|dk�r�|d|7<|j
|d}|j|d}tj
d|d�|jtjk�rb|j|��rT|dk�rTtj
d|�d|j|<n|j|||�|jtjk�r�tj
d	|�|jjd
d|d|gtjgd
�\}	}
|j|	d�n4|j|�r�|j|��r�|j|d}|j|||�tj
d||d|df�tj
d||d|d|df�dS)N�levelr�read�writerztuning level changed to %dz;delaying spindown change to %d, drive has already spun downTzchanging APM_level to %dr;z-B%dz/dev/%s)r<Fz %s load: read %0.2f, write %0.2fz$%s idle: read %d, write %d, level %d���)r1rTZget_device_loadre�_init_stats_and_idle�
_update_stats�_update_idlerfr#r$r r!rArDrcrkrlrqrgrprbr&r>r?r@rn)r'rVr6�loadZstatsZidleZlevel_changeZnew_power_levelrorErFr+r+r,�_instance_update_dynamic�sF



.$
&z#DiskPlugin._instance_update_dynamiccCsDddgddgddgd�|j|<dddd�|j|<d|j|<dS)N�rr)�new�old�max)rrrsrtF)rerfrg)r'rVr6r+r+r,rvs$zDiskPlugin._init_stats_and_idlecCs�|j|d|j|d<}||j|d<dd�t||�D�}||j|d<|j|d}dd�t||�D�}||j|d<t|d�t|d�|j|d	<t|d
�t|d
�|j|d<dS)Nr|r}cSsg|]}|d|d�qS)rrr+)r7Znew_oldr+r+r,r9'sz,DiskPlugin._update_stats.<locals>.<listcomp>�diffr~cSsg|]}t|��qSr+)r~)r7Zpairr+r+r,r9,srrs�rt)re�zip�float)r'rVr6Znew_loadZold_loadrZold_max_loadZmax_loadr+r+r,rw"s"zDiskPlugin._update_statscCsLxFdD]>}|j|||jkr6|j||d7<qd|j||<qWdS)Nrsrtrr)rsrt)rer%rf)r'rVr6Z	operationr+r+r,rx3s
zDiskPlugin._update_idlecs0||jkrtjd|�ntt|�j||�dS)Nz<There is no dynamic tuning available for device '%s' at time)r1rArCrr�_instance_apply_dynamic)r'rVr6)r*r+r,r�;s
z"DiskPlugin._instance_apply_dynamiccCsdS)Nr+)r'rVr6r+r+r,�_instance_unapply_dynamicDsz$DiskPlugin._instance_unapply_dynamic�/sys/block/cCs@d|kr0tjj||jdd�|�}tjj|�r0|Stjj|||�S)N�/�!)�os�path�join�replace�exists)r'r6�suffix�prefixZdevr+r+r,�_sysfs_pathGs
zDiskPlugin._sysfs_pathcCs|j|d�S)Nzqueue/scheduler)r�)r'r6r+r+r,�_elevator_fileNszDiskPlugin._elevator_filerZT)Z
per_devicecCs0|j|�}|s,|jj|||r$tjgndd�|S)NF)�no_error)r�r&�
write_to_filer?r@)r'�valuer6�simrS�sys_filer+r+r,�
_set_elevatorQs


zDiskPlugin._set_elevatorFcCs"|j|�}|jj|jj||d��S)N)r�)r�r&Zget_active_option�	read_file)r'r6�ignore_missingr�r+r+r,�
_get_elevatorYs
zDiskPlugin._get_elevatorr[cCs|||jkr(|s tjd|�dSt|�S|jtjkrt|sl|jjddt|�d|gt	j
gd�\}}|j|d�t|�SdSdS)Nz+apm option is not supported for device '%s'r;z-Bz/dev/)r<F)r1rArC�strrbrkrlr&r>r?r@rn)r'r�r6r�rSrErFr+r+r,�_set_apm`s
(zDiskPlugin._set_apmcCs�||jkr |stjd|�dSd}d}|jjddd|gtjgd�\}}|tjkrZdS|dkrhd}n@tjd	|tj	�}|r�yt
|jd
��}Wntk
r�d}YnX|r�tj
d|�|S)Nz+apm option is not supported for device '%s'Fr;z-Bz/dev/)r<rTz
.*=\s*(\d+).*rz2could not get current APM settings for device '%s')r1rArCr&r>r?r@�re�match�S�int�group�
ValueError�error)r'r6r�r��errrErF�mr+r+r,�_get_apmps(
"
zDiskPlugin._get_apmr\cCs|||jkr(|s tjd|�dSt|�S|jtjkrt|sl|jjddt|�d|gt	j
gd�\}}|j|d�t|�SdSdS)Nz0spindown option is not supported for device '%s'r;z-Sz/dev/)r<T)r1rArCr�rcrkrlr&r>r?r@rn)r'r�r6r�rSrErFr+r+r,�
_set_spindown�s
(zDiskPlugin._set_spindowncCs$||jkr |stjd|�dSdS)Nz0spindown option is not supported for device '%s'�)r1rArC)r'r6r�r+r+r,�
_get_spindown�s

zDiskPlugin._get_spindowncCs|j|d�S)Nzqueue/read_ahead_kb)r�)r'r6r+r+r,�_readahead_file�szDiskPlugin._readahead_filecCs^t|�jdd�}yt|d�}Wntk
r4dSXt|�dkrZ|dddkrZ|d}|S)Nrrrm�)r��splitr�r�r")r'r��val�vr+r+r,�	_parse_ra�szDiskPlugin._parse_rar]cCsZ|j|�}|j|�}|dkr0tjd||f�n&|sV|jj|d||rNtjgndd�|S)Nz,Invalid readahead value '%s' for device '%s'z%dF)r�)r�r�rAr�r&r�r?r@)r'r�r6r�rSr�r�r+r+r,�_set_readahead�s

zDiskPlugin._set_readaheadcCs6|j|�}|jj||d�j�}t|�dkr.dSt|�S)N)r�r)r�r&r��stripr"r�)r'r6r�r�r�r+r+r,�_get_readahead�s

zDiskPlugin._get_readaheadr^c	Cs�|rdS|jd|d�}|r^|j|�}|dkr0dStt|�|�}|jj||�|j||d�n2|jj|�}|dkrvdS|j||d�|jj|�dS)Nr^)Zcommand_namerWF)	Z_storage_keyr�r�r�Z_storager0r�rMZunset)	r'ZenablingZ
multiplierr6Zverifyr�Zstorage_keyZ
old_readaheadZ
new_readaheadr+r+r,�_multiply_readahead�s"
zDiskPlugin._multiply_readaheadcCs|j|d�S)Nzqueue/iosched/quantum)r�)r'r6r+r+r,�_scheduler_quantum_file�sz"DiskPlugin._scheduler_quantum_filer_cCs8|j|�}|s4|jj|dt|�|r,tjgndd�|S)Nz%dF)r�)r�r&r�r�r?r@)r'r�r6r�rSr�r+r+r,�_set_scheduler_quantum�s

z!DiskPlugin._set_scheduler_quantumcCsH|j|�}|jj||d�j�}t|�dkr@|s<tjd|�dSt|�S)N)r�rz>disk_scheduler_quantum option is not supported for device '%s')r�r&r�r�r"rArCr�)r'r6r�r�r�r+r+r,�_get_scheduler_quantum�s
z!DiskPlugin._get_scheduler_quantum)r�)F)F)F)F)F)0�__name__�
__module__�__qualname__�__doc__rr.r:r5�classmethodr3rQrRrPrUrXr`rarhrjrnrprqrzrvrwrxr�r�r�r�Zcommand_setr�Zcommand_getr�r�r�r�r�r�r�r�r�Zcommand_customr�r�r�r��
__classcell__r+r+)r*r,rsZJ
2	
r)r?�rZ
decoratorsZ
tuned.logsZtunedZtuned.constsrkZtuned.utils.commandsrr�r�ZlogsrMrAZPluginrr+r+r+r,�<module>s

site-packages/tuned/plugins/__pycache__/__init__.cpython-36.opt-1.pyc000064400000000255147511334670021435 0ustar003

�<�e1�@sddlTddlmZdS)�)�*)�instanceN)Z
repository�r�rr�/usr/lib/python3.6/__init__.py�<module>ssite-packages/tuned/plugins/__pycache__/plugin_usb.cpython-36.opt-1.pyc000064400000005716147511334670022054 0ustar003

�<�e��@sXddlmZddlTddlZddlmZddlZddlZej	j
�ZGdd�dej�Z
dS)�)�base)�*�N)�commandsc@sjeZdZdZdd�Zdd�Zedd��Zdd	�Zd
d�Z	dd
�Z
eddd�dd��Ze
d�ddd��ZdS)�	USBPlugina�
	`usb`::
	
	Sets autosuspend timeout of USB devices to the value specified by the
	[option]`autosuspend` option in seconds. If the [option]`devices`
	option is specified, the [option]`autosuspend` option applies to only
	the USB devices specified, otherwise it applies to all USB devices.
	+
	The value `0` means that autosuspend is disabled.
	+
	.To turn off USB autosuspend for USB devices `1-1` and `1-2`
	====
	----
	[usb]
	devices=1-1,1-2
	autosuspend=0
	----
	====
	cCsNd|_t�|_t�|_x*|jjd�jdd�D]}|jj|j�q,Wt	�|_
dS)NT�usbZDEVTYPEZ
usb_device)Z_devices_supported�setZ
_free_devicesZ_assigned_devices�_hardware_inventoryZget_devicesZmatch_property�addZsys_namer�_cmd)�self�device�r� /usr/lib/python3.6/plugin_usb.py�
_init_devicesszUSBPlugin._init_devicescs�fdd�|D�S)Ncsg|]}�jjd|��qS)r)r	Z
get_device)�.0�x)rrr�
<listcomp>*sz1USBPlugin._get_device_objects.<locals>.<listcomp>r)rZdevicesr)rr�_get_device_objects)szUSBPlugin._get_device_objectscCsddiS)N�autosuspendr)rrrr�_get_config_options,szUSBPlugin._get_config_optionscCsd|_d|_dS)NTF)Z_has_static_tuningZ_has_dynamic_tuning)r�instancerrr�_instance_init2szUSBPlugin._instance_initcCsdS)Nr)rrrrr�_instance_cleanup6szUSBPlugin._instance_cleanupcCsd|S)Nz)/sys/bus/usb/devices/%s/power/autosuspendr)rr
rrr�_autosuspend_sysfile9szUSBPlugin._autosuspend_sysfilerT)Z
per_devicecCsR|j|�}|dkrdS|rdnd}|sN|j|�}|jj|||rFtjgndd�|S)N�1�0F)�no_error)Z_option_boolrrZ
write_to_file�errno�ENOENT)r�valuer
Zsim�remove�enable�val�sys_filerrr�_set_autosuspend<s


zUSBPlugin._set_autosuspendFcCs|j|�}|jj||d�j�S)N)r)rrZ	read_file�strip)rr
Zignore_missingr$rrr�_get_autosuspendIs
zUSBPlugin._get_autosuspendN)F)�__name__�
__module__�__qualname__�__doc__rr�classmethodrrrrZcommand_setr%Zcommand_getr'rrrrr
s

r)�rZ
decoratorsZ
tuned.logsZtunedZtuned.utils.commandsrZglobrZlogs�get�logZPluginrrrrr�<module>s
site-packages/tuned/plugins/__pycache__/plugin_uncore.cpython-36.opt-1.pyc000064400000011404147511334670022545 0ustar003

�<�es�@sjddlmZddlTddlZddlmZddlZddlZej	j
�Ze�ZdZ
dZdZGdd�dej�ZdS)	�)�hotplug)�*�N)�commandsz//sys/devices/system/cpu/intel_uncore_frequency/c@s�eZdZdZdd�Zdd�Zdd�Zdd	�Zd
d�Ze	dd
��Z
dd�Zeddd�dd��Z
ed�ddd��Zeddd�dd��Zed�ddd��ZdS) �UncorePlugina|
	`uncore`::

	`max_freq_khz, min_freq_khz`:::
	Limit the maximum and minumum uncore frequency.

	Those options are Intel specific and correspond directly to `sysfs` files
	exposed by Intel uncore frequency driver.
	====
	----
	[uncore]
	max_freq_khz=4000000
	----
	Using this options *TuneD* will limit maximum frequency of all uncore units
	on the Intel system to 4 GHz.
	====
	cCs�d|_t�|_t�|_d|_ytjt�}Wntk
r>dSXt	j
|d�}t|�dkrbd|_|}x|D]}|jj|�qhWt
jdt|j��dS)NTFzuncore*rzdevices: %s)Z_devices_supported�setZ_assigned_devicesZ
_free_devicesZ_is_tpmi�os�listdir�	SYSFS_DIR�OSError�fnmatch�filter�len�add�log�debug�str)�selfZdevicesZtpmi_devices�d�r�#/usr/lib/python3.6/plugin_uncore.py�
_init_devices$s
zUncorePlugin._init_devicescCsd|_d|_dS)NTF)Z_has_static_tuningZ_has_dynamic_tuning)r�instancerrr�_instance_init:szUncorePlugin._instance_initcCsdS)Nr)rrrrr�_instance_cleanup>szUncorePlugin._instance_cleanupcCs2t|d|}tj|�}t|�dkr.t|�SdS)N�/r)r
�cmdZ	read_filer�int)r�dev_dir�file�
sysfs_file�valuerrr�_getAs

zUncorePlugin._getcCs(t|d|}tj|d|�r$|SdS)Nrz%u)r
rZ
write_to_file)rrrr!r rrr�_setHszUncorePlugin._setcCs
ddd�S)N)�max_freq_khz�min_freq_khzr)�clsrrr�_get_config_optionsNsz UncorePlugin._get_config_optionsc	Cs*yt|�}Wn"tk
r.tjd|�dSXy4|j|d�}|j|d�}|j|d�}|j|d�}Wn"ttfk
r�tjd�dSX|tkr�||kr�tjd|||f�dS||kr�tjd|||f�|}nT|t	k�r"||k�r�tjd	|||f�dS||k�r&tjd
|||f�|}ndS|S)Nzvalue '%s' is not integer�initial_max_freq_khz�initial_min_freq_khzr$r%z+fail to read inital uncore frequency valuesz/%s: max_freq_khz %d value below min_freq_khz %dzC%s: max_freq_khz %d value above initial_max_freq_khz - capped to %dz/%s: min_freq_khz %d value above max_freq_khz %dzC%s: min_freq_khz %d value below initial_max_freq_khz - capped to %d)
r�
ValueErrorr�errorr"r�IOError�IS_MAX�info�IS_MIN)	r�deviceZ
min_or_maxr!Zfreq_khzr(r)r$r%rrr�_validate_valueUs:



zUncorePlugin._validate_valuer$T)Z
per_devicecCsB|j|t|�}|dkrdS|r"|Stjd||f�|j|d|�S)Nz%s: set max_freq_khz %dr$)r1r-rrr#)rr!r0�sim�remover$rrr�_set_max_freq_khz|szUncorePlugin._set_max_freq_khzFcCs`|rtjjt�rdSy|j|d�}Wn"ttfk
rHtjd�dSXtj	d||f�|S)Nr$z$fail to read uncore frequency valuesz%s: get max_freq_khz %d)
r�path�isdirr
r"rr,rr+r)rr0�ignore_missingr$rrr�_get_max_freq_khz�s
zUncorePlugin._get_max_freq_khzr%cCsB|j|t|�}|dkrdS|r"|Stjd||f�|j|d|�S)Nz%s: set min_freq_khz %dr%)r1r/rrr#)rr!r0r2r3r%rrr�_set_min_freq_khz�szUncorePlugin._set_min_freq_khzcCs`|rtjjt�rdSy|j|d�}Wn"ttfk
rHtjd�dSXtj	d||f�|S)Nr%z$fail to read uncore frequency valuesz%s: get min_freq_khz %d)
rr5r6r
r"rr,rr+r)rr0r7r%rrr�_get_min_freq_khz�s
zUncorePlugin._get_min_freq_khzN)F)F)�__name__�
__module__�__qualname__�__doc__rrrr"r#�classmethodr'r1Zcommand_setr4Zcommand_getr8r9r:rrrrrs'
r)�rZ
decoratorsZ
tuned.logsZtunedZtuned.utils.commandsrrrZlogs�getrrr
r/r-ZPluginrrrrr�<module>s
site-packages/tuned/plugins/__pycache__/plugin_sysctl.cpython-36.pyc000064400000015310147511334670021634 0ustar003

�<�e��@s�ddlZddlmZddlTddlZddlTddlmZddl	j
Z
ddlZddlZej
j�ZddgZdd	gZGd
d�dej�ZdS)�N�)�base)�*)�commandsZbase_reachable_timeZretrans_timez
/run/sysctl.dz
/etc/sysctl.dcs�eZdZdZ�fdd�Zdd�Zdd�Zdd	�Zd
d�Ze	j
fdd
�Zdd�Zdd�Z
dd�Zdd�Zdd�Zddd�Z�ZS)�SysctlPluginaI
	`sysctl`::
	
	Sets various kernel parameters at runtime.
	+
	This plug-in is used for applying custom `sysctl` settings and should
	only be used to change system settings that are not covered by other
	*TuneD* plug-ins. If the settings are covered by other *TuneD* plug-ins,
	use those plug-ins instead.
	+
	The syntax for this plug-in is
	`_key_=_value_`, where
	`_key_` is the same as the key name provided by the
	`sysctl` utility.
	+
	.Adjusting the kernel runtime kernel.sched_min_granularity_ns value
	====
	----
	[sysctl]
	kernel.sched_min_granularity_ns=3000000
	----
	====
	cs$tt|�j||�d|_t�|_dS)NT)�superr�__init__Z_has_dynamic_optionsr�_cmd)�self�args�kwargs)�	__class__��#/usr/lib/python3.6/plugin_sysctl.pyr*szSysctlPlugin.__init__cCshd|_d|_|j|j�}|jj|i�|_t|j�dkr\tj	d�|j
|�i|_|jj|�|j|_
dS)NFTrz0recovering old sysctl settings from previous run)Z_has_dynamic_tuningZ_has_static_tuning�_storage_key�name�_storage�get�_sysctl_original�len�log�info�_instance_unapply_static�unsetZoptions�_sysctl)r
�instance�storage_keyrrr�_instance_init/s

zSysctlPlugin._instance_initcCs|j|j�}|jj|�dS)N)rrrr)r
rrrrr�_instance_cleanup?szSysctlPlugin._instance_cleanupcCs�xzt|jj��D]h\}}|j|�}|dkr:tjd|�q|jj|jj	|��}|j
||�}|dk	r||j|<|j||�qW|j
|j�}|jj||j�|jjtjtj�r�tjd�|j|j�dS)NzDsysctl option %s will not be set, failed to read the original value.zreapplying system sysctl)�listr�items�_read_sysctlr�error�
_variables�expandr	Zunquote�_process_assignment_modifiersr�
_write_sysctlrrr�setZ_global_cfgZget_bool�constsZCFG_REAPPLY_SYSCTLZCFG_DEF_REAPPLY_SYSCTLr�_apply_system_sysctl)r
r�option�value�original_valueZ	new_valuerrrr�_instance_apply_staticCs"



z#SysctlPlugin._instance_apply_staticcCsvd}d}xht|jj��D]V\}}|j|�}|j|jj|�|�}|dk	r|j||jj	|�|jj	|�|�dkrd}qW|S)NTF)
rrr r!r%r#r$Z
_verify_valuer	Z	remove_ws)r
r�ignore_missingZdevices�retr*r+Zcurr_valrrr�_instance_verify_staticYs
$z$SysctlPlugin._instance_verify_staticcCs,x&t|jj��D]\}}|j||�qWdS)N)rrr r&)r
rZrollbackr*r+rrrresz%SysctlPlugin._instance_unapply_staticc
Cs�i}x\tD]T}ytj|�}Wntk
r2w
YnXx(|D] }|jd�sJq:||kr:|||<q:Wq
Wx4t|j��D]$}||}d||f}|j||�qpW|jd|�dS)Nz.confz%s/%sz/etc/sysctl.conf)�SYSCTL_CONFIG_DIRS�os�listdir�OSError�endswith�sorted�keys�_apply_sysctl_config_file)r
�instance_sysctl�files�d�flistZfname�pathrrrr)is 


z!SysctlPlugin._apply_system_sysctlcCs�tjd|�yPt|d��.}x&t|d�D]\}}|j||||�q(WWdQRXtjd|�WnHttfk
r�}z(|jtjkr�tj	d|t
|�f�WYdd}~XnXdS)Nz%Applying sysctl settings from file %s�rrz.Finished applying sysctl settings from file %sz.Error reading sysctl settings from file %s: %s)r�debug�open�	enumerate�_apply_sysctl_config_liner4�IOError�errno�ENOENTr"�str)r
r=r9�f�lineno�line�errrr8|sz&SysctlPlugin._apply_sysctl_config_filec	Cs�|j�}t|�dks,|ddks,|ddkr0dS|jdd�}t|�dkr^tjd||f�dS|\}}|j�}t|�dkr�tjd||f�dS|j�}||kr�|jj||�}||kr�tjd|||f�|j||d	d
�dS)Nr�#�;�=r�z Syntax error in file %s, line %dz2Overriding sysctl parameter '%s' from '%s' to '%s'T)r.)	�stripr�splitrr"r#r$rr&)	r
r=rHrIr9Ztmpr*r+Zinstance_valuerrrrB�s*$z&SysctlPlugin._apply_sysctl_config_linecCsd|jj|dd�S)Nz/proc/sys/%sz./z/.)r	Ztr)r
r*rrr�_get_sysctl_path�szSysctlPlugin._get_sysctl_pathcCs�|j|�}yht|d��B}d}x.t|�D]"\}}|dkr&tjd|�dSq&W|j�}WdQRXtjd||f�|Sttfk
r�}z6|j	t	j
kr�tjd|�ntjd|t|�f�dSd}~XnXdS)Nr>�rzGFailed to read sysctl parameter '%s', multi-line values are unsupportedz&Value of sysctl parameter '%s' is '%s'zBFailed to read sysctl parameter '%s', the parameter does not existz(Failed to read sysctl parameter '%s': %s)rQr@rArr"rOr?r4rCrDrErF)r
r*r=rGrI�ir+rJrrrr!�s(

zSysctlPlugin._read_sysctlFcCs�|j|�}tjj|�tkr,tjd|�dSy6tjd||f�t|d��}|j	|�WdQRXdSt
tfk
r�}zJ|jtj
kr�|r�tjntj}|d||f�ntjd||t|�f�dSd}~XnXdS)Nz+Refusing to set deprecated sysctl option %sFz%Setting sysctl parameter '%s' to '%s'�wTzIFailed to set sysctl parameter '%s' to '%s', the parameter does not existz/Failed to set sysctl parameter '%s' to '%s': %s)rQr2r=�basename�DEPRECATED_SYSCTL_OPTIONSrr"r?r@�writer4rCrDrErF)r
r*r+r.r=rGrJZlog_funcrrrr&�s&
zSysctlPlugin._write_sysctl)F)�__name__�
__module__�__qualname__�__doc__rrrr-r0r(Z
ROLLBACK_SOFTrr)r8rBrQr!r&�
__classcell__rr)r
rrs
r)�rerRrZ
decoratorsZ
tuned.logsZtuned�
subprocessZtuned.utils.commandsrZtuned.constsr(rDr2ZlogsrrrVr1ZPluginrrrrr�<module>s

site-packages/tuned/plugins/__pycache__/plugin_service.cpython-36.pyc
[compiled CPython 3.6 bytecode omitted; the readable ServicePlugin docstring embedded in it is kept below]
	`service`::
	
	Plug-in for handling sysvinit, sysv-rc, openrc and systemd services.
	+
	The syntax is as follows:
	+
	[subs="+quotes,+macros"]
	----
	[service]
	service.__service_name__=__commands__[,file:__file__]
	----
	+
	Supported service-handling `_commands_` are `start`, `stop`, `enable`
	and `disable`. The optional `file:__file__` directive installs an overlay
	configuration file `__file__`. Multiple commands must be comma (`,`)
	or semicolon (`;`) separated. If the directives conflict, the last
	one is used.
	+
	The service plugin supports configuration overlays only for systemd.
	In other init systems, this directive is ignored. The configuration
	overlay files are copied to `/etc/systemd/system/__service_name__.service.d/`
	directories. Upon profile unloading, the directory is removed if it is empty.
	+
	With systemd, the `start` command is implemented by `restart` in order
	to allow loading of the service configuration file overlay.
	+
	NOTE: With non-systemd init systems, the plug-in operates on the
	current runlevel only.
	+
	.Start and enable the `sendmail` service with an overlay file
	====
	----
	[service]
	service.sendmail=start,enable,file:${i:PROFILE_DIR}/tuned-sendmail.conf
	----
	The internal variable `${i:PROFILE_DIR}` points to the directory
	from which the profile is loaded.
	====
site-packages/tuned/plugins/hotplug.py
from . import base
import tuned.consts as consts
import tuned.logs

log = tuned.logs.get()

class Plugin(base.Plugin):
	"""
	Base class for plugins with device hotplugging support.
	"""

	def __init__(self, *args, **kwargs):
		super(Plugin, self).__init__(*args, **kwargs)

	def cleanup(self):
		super(Plugin, self).cleanup()
		self._hardware_events_cleanup()

	def _hardware_events_init(self):
		pass

	def _hardware_events_cleanup(self):
		pass
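	# A minimal sketch (illustrative only, not part of the upstream code) of how a
	# subclass could hook hardware events; the subscribe()/unsubscribe() calls on the
	# hardware inventory are an assumption, since the inventory API is not shown in
	# this archive:
	#
	#	def _hardware_events_init(self):
	#		self._hardware_inventory.subscribe(self, "net", self._hardware_events_callback)
	#
	#	def _hardware_events_cleanup(self):
	#		self._hardware_inventory.unsubscribe(self)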

	def _init_devices(self):
		self._hardware_events_init()

	def _hardware_events_callback(self, event, device):
		if event == "add":
			log.info("device '%s' added" % device.sys_name)
			self._add_device(device.sys_name)
		elif event == "remove":
			log.info("device '%s' removed" % device.sys_name)
			self._remove_device(device.sys_name)

	def _add_device_process(self, instance, device_name):
		log.info("instance %s: adding new device %s" % (instance.name, device_name))
		self._assigned_devices.add(device_name)
		self._call_device_script(instance, instance.script_pre, "apply", [device_name])
		self._added_device_apply_tuning(instance, device_name)
		self._call_device_script(instance, instance.script_post, "apply", [device_name])
		instance.processed_devices.add(device_name)

	def _add_device(self, device_name):
		if device_name in (self._assigned_devices | self._free_devices):
			return

		for instance_name, instance in list(self._instances.items()):
			if len(self._get_matching_devices(instance, [device_name])) == 1:
				self._add_device_process(instance, device_name)
				break
		else:
			log.debug("no instance wants %s" % device_name)
			self._free_devices.add(device_name)

	def _add_devices_nocheck(self, instance, device_names):
		"""
		Add devices specified by the set to the instance, no check is performed.
		"""
		for dev in device_names:
			self._add_device_process(instance, dev)
		# This can be a bit racy (we can overcount),
		# but it shouldn't affect the boolean result
		instance.active = len(instance.processed_devices) \
				+ len(instance.assigned_devices) > 0

	def _remove_device_process(self, instance, device_name):
		if device_name in instance.processed_devices:
			self._call_device_script(instance, instance.script_post, "unapply", [device_name])
			self._removed_device_unapply_tuning(instance, device_name)
			self._call_device_script(instance, instance.script_pre, "unapply", [device_name])
			instance.processed_devices.remove(device_name)
			# This can be a bit racy (we can overcount),
			# but it shouldn't affect the boolean result
			instance.active = len(instance.processed_devices) \
					+ len(instance.assigned_devices) > 0
			self._assigned_devices.remove(device_name)
			return True
		return False

	def _remove_device(self, device_name):
		"""Remove device from the instance

		Parameters:
		device_name -- name of the device

		"""
		if device_name not in (self._assigned_devices | self._free_devices):
			return

		for instance in list(self._instances.values()):
			if self._remove_device_process(instance, device_name):
				break
		else:
			self._free_devices.remove(device_name)

	def _remove_devices_nocheck(self, instance, device_names):
		"""
		Remove devices specified by the set from the instance, no check is performed.
		"""
		for dev in device_names:
			self._remove_device_process(instance, dev)

	def _added_device_apply_tuning(self, instance, device_name):
		self._execute_all_device_commands(instance, [device_name])
		if instance.has_dynamic_tuning and self._global_cfg.get(consts.CFG_DYNAMIC_TUNING, consts.CFG_DEF_DYNAMIC_TUNING):
			self._instance_apply_dynamic(instance, device_name)

	def _removed_device_unapply_tuning(self, instance, device_name):
		if instance.has_dynamic_tuning and self._global_cfg.get(consts.CFG_DYNAMIC_TUNING, consts.CFG_DEF_DYNAMIC_TUNING):
			self._instance_unapply_dynamic(instance, device_name)
		self._cleanup_all_device_commands(instance, [device_name], remove = True)
site-packages/tuned/plugins/plugin_acpi.py
from . import base
from .decorators import *
import os
import errno
import tuned.logs
from tuned.consts import ACPI_DIR

log = tuned.logs.get()


class ACPIPlugin(base.Plugin):
	"""
	`acpi`::

	Configures the ACPI driver.
	+
	The only currently supported option is
	[option]`platform_profile`, which sets the ACPI
	platform profile sysfs attribute,
	a generic power/performance preference API for other drivers.
	Multiple profiles can be specified, separated by `|`.
	The first available profile is selected.
	+
	--
	.Selecting a platform profile
	====
	----
	[acpi]
	platform_profile=balanced|low-power
	----
	Using this option, *TuneD* will try to set the platform profile
	to `balanced`. If that fails, it will try to set it to `low-power`.
	====
	--
	"""
	def __init__(self, *args, **kwargs):
		super(ACPIPlugin, self).__init__(*args, **kwargs)

	@classmethod
	def _get_config_options(cls):
		return {"platform_profile": None}

	def _instance_init(self, instance):
		instance._has_static_tuning = True
		instance._has_dynamic_tuning = False

	def _instance_cleanup(self, instance):
		pass

	@classmethod
	def _platform_profile_choices_path(cls):
		return os.path.join(ACPI_DIR, "platform_profile_choices")

	@classmethod
	def _platform_profile_path(cls):
		return os.path.join(ACPI_DIR, "platform_profile")

	@command_set("platform_profile")
	def _set_platform_profile(self, profiles, sim, remove):
		if not os.path.isfile(self._platform_profile_path()):
			log.debug("ACPI platform_profile is not supported on this system")
			return None
		profiles = [profile.strip() for profile in profiles.split('|')]
		avail_profiles = set(self._cmd.read_file(self._platform_profile_choices_path()).split())
		for profile in profiles:
			if profile in avail_profiles:
				if not sim:
					log.info("Setting platform_profile to '%s'" % profile)
					self._cmd.write_to_file(self._platform_profile_path(), profile, \
						no_error=[errno.ENOENT] if remove else False)
				return profile
			log.warn("Requested platform_profile '%s' unavailable" % profile)
		log.error("Failed to set platform_profile. Is the value in the profile correct?")
		return None
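	# Example (illustrative): with "platform_profile=balanced|low-power" in the profile
	# and the platform_profile_choices file listing "low-power balanced performance",
	# "balanced" is matched first and written to the platform_profile attribute
	# under ACPI_DIR.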

	@command_get("platform_profile")
	def _get_platform_profile(self, ignore_missing=False):
		if not os.path.isfile(self._platform_profile_path()):
			log.debug("ACPI platform_profile is not supported on this system")
			return None
		return self._cmd.read_file(self._platform_profile_path()).strip()
site-packages/tuned/plugins/instance/__init__.py
from .instance import Instance
from .factory import Factory
site-packages/tuned/plugins/instance/factory.py
from .instance import Instance

class Factory(object):
	def create(self, *args, **kwargs):
		instance = Instance(*args, **kwargs)
		return instance
site-packages/tuned/plugins/instance/__pycache__/factory.cpython-36.opt-1.pyc
site-packages/tuned/plugins/instance/__pycache__/instance.cpython-36.pyc
site-packages/tuned/plugins/instance/__pycache__/__init__.cpython-36.pyc
site-packages/tuned/plugins/instance/__pycache__/factory.cpython-36.pyc
site-packages/tuned/plugins/instance/__pycache__/instance.cpython-36.opt-1.pyc
site-packages/tuned/plugins/instance/__pycache__/__init__.cpython-36.opt-1.pyc
[compiled CPython 3.6 bytecode omitted; the corresponding .py sources are included in this archive]
site-packages/tuned/plugins/instance/instance.py
import tuned.consts as consts

class Instance(object):
	"""
	"""

	def __init__(self, plugin, name, devices_expression, devices_udev_regex, script_pre, script_post, options):
		self._plugin = plugin
		self._name = name
		self._devices_expression = devices_expression
		self._devices_udev_regex = devices_udev_regex
		self._script_pre = script_pre
		self._script_post = script_post
		self._options = options

		self._active = True
		self._has_static_tuning = False
		self._has_dynamic_tuning = False
		self._assigned_devices = set()
		self._processed_devices = set()

	# properties

	@property
	def plugin(self):
		return self._plugin

	@property
	def name(self):
		return self._name

	@property
	def active(self):
		"""The instance performs some tuning (otherwise it is suspended)."""
		return self._active

	@active.setter
	def active(self, value):
		self._active = value

	@property
	def devices_expression(self):
		return self._devices_expression

	@property
	def assigned_devices(self):
		return self._assigned_devices

	@property
	def processed_devices(self):
		return self._processed_devices

	@property
	def devices_udev_regex(self):
		return self._devices_udev_regex

	@property
	def script_pre(self):
		return self._script_pre

	@property
	def script_post(self):
		return self._script_post

	@property
	def options(self):
		return self._options

	@property
	def has_static_tuning(self):
		return self._has_static_tuning

	@property
	def has_dynamic_tuning(self):
		return self._has_dynamic_tuning

	# methods

	def apply_tuning(self):
		self._plugin.instance_apply_tuning(self)

	def verify_tuning(self, ignore_missing):
		return self._plugin.instance_verify_tuning(self, ignore_missing)

	def update_tuning(self):
		self._plugin.instance_update_tuning(self)

	def unapply_tuning(self, rollback = consts.ROLLBACK_SOFT):
		self._plugin.instance_unapply_tuning(self, rollback)

	def destroy(self):
		self.unapply_tuning()
		self._plugin.destroy_instance(self)
site-packages/tuned/plugins/exceptions.py
import tuned.exceptions

class NotSupportedPluginException(tuned.exceptions.TunedException):
	pass
site-packages/tuned/plugins/plugin_usb.py
from . import base
from .decorators import *
import tuned.logs
from tuned.utils.commands import commands
import glob
import errno

log = tuned.logs.get()

class USBPlugin(base.Plugin):
	"""
	`usb`::
	
	Sets autosuspend timeout of USB devices to the value specified by the
	[option]`autosuspend` option in seconds. If the [option]`devices`
	option is specified, the [option]`autosuspend` option applies to only
	the USB devices specified, otherwise it applies to all USB devices.
	+
	The value `0` means that autosuspend is disabled.
	+
	.To turn off USB autosuspend for USB devices `1-1` and `1-2`
	====
	----
	[usb]
	devices=1-1,1-2
	autosuspend=0
	----
	====
	"""

	def _init_devices(self):
		self._devices_supported = True
		self._free_devices = set()
		self._assigned_devices = set()

		for device in self._hardware_inventory.get_devices("usb").match_property("DEVTYPE", "usb_device"):
			self._free_devices.add(device.sys_name)

		self._cmd = commands()

	def _get_device_objects(self, devices):
		return [self._hardware_inventory.get_device("usb", x) for x in devices]

	@classmethod
	def _get_config_options(self):
		return {
			"autosuspend" : None,
		}

	def _instance_init(self, instance):
		instance._has_static_tuning = True
		instance._has_dynamic_tuning = False

	def _instance_cleanup(self, instance):
		pass

	def _autosuspend_sysfile(self, device):
		return "/sys/bus/usb/devices/%s/power/autosuspend" % device

	@command_set("autosuspend", per_device=True)
	def _set_autosuspend(self, value, device, sim, remove):
		enable = self._option_bool(value)
		if enable is None:
			return None

		val = "1" if enable else "0"
		if not sim:
			sys_file = self._autosuspend_sysfile(device)
			self._cmd.write_to_file(sys_file, val, \
				no_error = [errno.ENOENT] if remove else False)
		return val
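	# Example (illustrative): with "autosuspend=0" in the profile, _option_bool()
	# returns False, so "0" is written to
	# /sys/bus/usb/devices/<device>/power/autosuspend for each matched device,
	# which the plugin documents as disabling autosuspend.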

	@command_get("autosuspend")
	def _get_autosuspend(self, device, ignore_missing=False):
		sys_file = self._autosuspend_sysfile(device)
		return self._cmd.read_file(sys_file, no_error=ignore_missing).strip()
site-packages/tuned/plugins/repository.py
from tuned.utils.plugin_loader import PluginLoader
import tuned.plugins.base
import tuned.logs

log = tuned.logs.get()

__all__ = ["Repository"]

class Repository(PluginLoader):

	def __init__(self, monitor_repository, storage_factory, hardware_inventory, device_matcher, device_matcher_udev, plugin_instance_factory, global_cfg, variables):
		super(Repository, self).__init__()
		self._plugins = set()
		self._monitor_repository = monitor_repository
		self._storage_factory = storage_factory
		self._hardware_inventory = hardware_inventory
		self._device_matcher = device_matcher
		self._device_matcher_udev = device_matcher_udev
		self._plugin_instance_factory = plugin_instance_factory
		self._global_cfg = global_cfg
		self._variables = variables

	@property
	def plugins(self):
		return self._plugins

	def _set_loader_parameters(self):
		self._namespace = "tuned.plugins"
		self._prefix = "plugin_"
		self._interface = tuned.plugins.base.Plugin

	def create(self, plugin_name):
		log.debug("creating plugin %s" % plugin_name)
		plugin_cls = self.load_plugin(plugin_name)
		plugin_instance = plugin_cls(self._monitor_repository, self._storage_factory, self._hardware_inventory, self._device_matcher,\
			self._device_matcher_udev, self._plugin_instance_factory, self._global_cfg, self._variables)
		self._plugins.add(plugin_instance)
		return plugin_instance
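	# Illustrative usage (a sketch based on the loader parameters above -- namespace
	# "tuned.plugins" and prefix "plugin_"): repository.create("net") is expected to
	# load the plugin_net module and instantiate its Plugin subclass with the shared
	# factories and configuration:
	#
	#	net_plugin = repository.create("net")
	#	repository.delete(net_plugin)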

	def delete(self, plugin):
		assert isinstance(plugin, self._interface)
		log.debug("removing plugin %s" % plugin)
		plugin.cleanup()
		self._plugins.remove(plugin)
site-packages/tuned/plugins/plugin_net.py
import errno
from . import hotplug
from .decorators import *
import tuned.logs
from tuned.utils.nettool import ethcard
from tuned.utils.commands import commands
import os
import re

log = tuned.logs.get()

WOL_VALUES = "pumbagsd"

class NetTuningPlugin(hotplug.Plugin):
	"""
	`net`::
	
	Configures network driver, hardware and Netfilter settings.
	Dynamic change of the interface speed according to the interface
	utilization is also supported. The dynamic tuning is controlled by
	the [option]`dynamic` and the global [option]`dynamic_tuning`
	option in `tuned-main.conf`.
	+
	The [option]`wake_on_lan` option sets wake-on-lan to the specified
	value as when using the `ethtool` utility.
	+
	.Set Wake-on-LAN for device eth0 on MagicPacket(TM)
	====
	----
	[net]
	devices=eth0
	wake_on_lan=g
	----
	====
	+
	The [option]`coalesce` option allows changing coalescing settings
	for the specified network devices. The syntax is:
	+
	[subs="+quotes,+macros"]
	----
	coalesce=__param1__ __value1__ __param2__ __value2__ ... __paramN__ __valueN__
	----
	Note that not all the coalescing parameters are supported by all
	network cards. For the list of coalescing parameters of your network
	device, use `ethtool -c device`.
	+	
	.Setting coalescing parameters rx/tx-usecs for all network devices
	====
	----
	[net]
	coalesce=rx-usecs 3 tx-usecs 16
	----
	====
	+
	The [option]`features` option allows changing 
	the offload parameters and other features for the specified
	network devices. To query the features of your network device,
	use `ethtool -k device`. The syntax of the option is the same as
	the [option]`coalesce` option.
	+
	.Turn off TX checksumming, generic segmentation and receive offload 
	====
	----
	[net]
	features=tx off gso off gro off
	----
	====
	The [option]`pause` option allows changing the pause parameters for
	the specified network devices. To query the pause parameters of your
	network device, use `ethtool -a device`. The syntax of the option
	is the same as the [option]`coalesce` option.
	+
	.Disable autonegotiation
	====
	----
	[net]
	pause=autoneg off
	----
	====
	+
	The [option]`ring` option allows changing the rx/tx ring parameters
	for the specified network devices. To query the ring parameters of your
	network device, use `ethtool -g device`. The syntax of the option
	is the same as the [option]`coalesce` option.
	+	
	.Change the number of ring entries for the Rx/Tx rings to 1024/512 respectively
	=====
	-----
	[net]
	ring=rx 1024 tx 512
	-----
	=====
	+
	The [option]`channels` option allows changing the numbers of channels
	for the specified network device. A channel is an IRQ and the set
	of queues that can trigger that IRQ. To query the channels parameters of your
	network device, use `ethtool -l device`. The syntax of the option
	is the same as the [option]`coalesce` option.
	+
	.Set the number of multi-purpose channels to 16
	=====
	-----
	[net]
	channels=combined 16
	-----
	=====
	+   
	A network device either supports rx/tx or combined queue
	mode. The [option]`channels` option automatically adjusts the
	parameters based on the mode supported by the device as long as a
	valid configuration is requested.
	+
	The [option]`nf_conntrack_hashsize` option sets the size of the hash
	table which stores lists of conntrack entries by writing to
	`/sys/module/nf_conntrack/parameters/hashsize`.
	+
	.Adjust the size of the conntrack hash table
	====
	----
	[net]
	nf_conntrack_hashsize=131072
	----
	====
	+
	The [option]`txqueuelen` option allows changing txqueuelen (the length
	of the transmit queue). It uses the `ip` utility from the iproute package
	(recommended for TuneD), so that package needs to be installed for this
	option to work correctly. To query the txqueuelen parameter of your network
	device, use `ip link show`; the current value is shown after the `qlen` keyword.
	+
	.Adjust the length of the transmit queue
	====
	----
	[net]
	txqueuelen=5000
	----
	====
	+
	The [option]`mtu` option allows changing the MTU (Maximum Transmission Unit).
	It uses the `ip` utility from the iproute package (recommended for TuneD), so
	that package needs to be installed for this option to work correctly. To query
	the MTU of your network device, use `ip link show`; the current value is shown
	after the `mtu` keyword.
	+
	.Adjust the size of the MTU
	====
	----
	[net]
	mtu=9000
	----
	====
	"""

	def __init__(self, *args, **kwargs):
		super(NetTuningPlugin, self).__init__(*args, **kwargs)
		self._load_smallest = 0.05
		self._level_steps = 6
		self._cmd = commands()
		self._re_ip_link_show = {}
		self._use_ip = True

	def _init_devices(self):
		self._devices_supported = True
		self._free_devices = set()
		self._assigned_devices = set()

		re_not_virtual = re.compile('(?!.*/virtual/.*)')
		for device in self._hardware_inventory.get_devices("net"):
			if re_not_virtual.match(device.device_path):
				self._free_devices.add(device.sys_name)

		log.debug("devices: %s" % str(self._free_devices));

	def _get_device_objects(self, devices):
		return [self._hardware_inventory.get_device("net", x) for x in devices]

	def _instance_init(self, instance):
		instance._has_static_tuning = True
		if self._option_bool(instance.options["dynamic"]):
			instance._has_dynamic_tuning = True
			instance._load_monitor = self._monitors_repository.create("net", instance.assigned_devices)
			instance._idle = {}
			instance._stats = {}
		else:
			instance._has_dynamic_tuning = False
			instance._load_monitor = None
			instance._idle = None
			instance._stats = None

	def _instance_cleanup(self, instance):
		if instance._load_monitor is not None:
			self._monitors_repository.delete(instance._load_monitor)
			instance._load_monitor = None

	def _instance_apply_dynamic(self, instance, device):
		self._instance_update_dynamic(instance, device)

	def _instance_update_dynamic(self, instance, device):
		load = [int(value) for value in instance._load_monitor.get_device_load(device)]
		if load is None:
			return

		if not device in instance._stats:
			self._init_stats_and_idle(instance, device)
		self._update_stats(instance, device, load)
		self._update_idle(instance, device)

		stats = instance._stats[device]
		idle = instance._idle[device]

		if idle["level"] == 0 and idle["read"] >= self._level_steps and idle["write"] >= self._level_steps:
			idle["level"] = 1
			log.info("%s: setting 100Mbps" % device)
			ethcard(device).set_speed(100)
		elif idle["level"] == 1 and (idle["read"] == 0 or idle["write"] == 0):
			idle["level"] = 0
			log.info("%s: setting max speed" % device)
			ethcard(device).set_max_speed()

		log.debug("%s load: read %0.2f, write %0.2f" % (device, stats["read"], stats["write"]))
		log.debug("%s idle: read %d, write %d, level %d" % (device, idle["read"], idle["write"], idle["level"]))

	@classmethod
	def _get_config_options_coalesce(cls):
		return {
			"adaptive-rx": None,
			"adaptive-tx": None,
			"rx-usecs": None,
			"rx-frames": None,
			"rx-usecs-irq": None,
			"rx-frames-irq": None,
			"tx-usecs": None,
			"tx-frames": None,
			"tx-usecs-irq": None,
			"tx-frames-irq": None,
			"stats-block-usecs": None,
			"pkt-rate-low": None,
			"rx-usecs-low": None,
			"rx-frames-low": None,
			"tx-usecs-low": None,
			"tx-frames-low": None,
			"pkt-rate-high": None,
			"rx-usecs-high": None,
			"rx-frames-high": None,
			"tx-usecs-high": None,
			"tx-frames-high": None,
			"sample-interval": None
			}

	@classmethod
	def _get_config_options_pause(cls):
		return { "autoneg": None,
			"rx": None,
			"tx": None }

	@classmethod
	def _get_config_options_ring(cls):
		return { "rx": None,
			"rx-mini": None,
			"rx-jumbo": None,
			"tx": None }

	@classmethod
	def _get_config_options_channels(cls):
		return { "rx": None,
			"tx": None,
			"other": None,
			"combined": None }

	@classmethod
	def _get_config_options(cls):
		return {
			"dynamic": True,
			"wake_on_lan": None,
			"nf_conntrack_hashsize": None,
			"features": None,
			"coalesce": None,
			"pause": None,
			"ring": None,
			"channels": None,
			"txqueuelen": None,
			"mtu": None,
		}

	def _init_stats_and_idle(self, instance, device):
		max_speed = self._calc_speed(ethcard(device).get_max_speed())
		instance._stats[device] = { "new": 4 * [0], "max": 2 * [max_speed, 1] }
		instance._idle[device] = { "level": 0, "read": 0, "write": 0 }

	def _update_stats(self, instance, device, new_load):
		# put new to old
		instance._stats[device]["old"] = old_load = instance._stats[device]["new"]
		instance._stats[device]["new"] = new_load

		# load difference
		diff = [new_old[0] - new_old[1] for new_old in zip(new_load, old_load)]
		instance._stats[device]["diff"] = diff

		# adapt maximum expected load if the difference is higher
		old_max_load = instance._stats[device]["max"]
		max_load = [max(pair) for pair in zip(old_max_load, diff)]
		instance._stats[device]["max"] = max_load

		# read/write ratio
		instance._stats[device]["read"] =  float(diff[0]) / float(max_load[0])
		instance._stats[device]["write"] = float(diff[2]) / float(max_load[2])

	def _update_idle(self, instance, device):
		# increase counter if there is no load, otherwise reset the counter
		for operation in ["read", "write"]:
			if instance._stats[device][operation] < self._load_smallest:
				instance._idle[device][operation] += 1
			else:
				instance._idle[device][operation] = 0

	def _instance_unapply_dynamic(self, instance, device):
		if device in instance._idle and instance._idle[device]["level"] > 0:
			instance._idle[device]["level"] = 0
			log.info("%s: setting max speed" % device)
			ethcard(device).set_max_speed()

	def _calc_speed(self, speed):
		# 0.6 is just a magic constant (empirical value): a typical workload on a network
		# card won't exceed that, and if it does, the code is smart enough to adapt to it.
		# 1024 * 1024 converts MB -> B
		# speed / 8 converts Mb -> MB
		return (int) (0.6 * 1024 * 1024 * speed / 8)
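	# Worked example (illustrative): for a 1000 Mb/s link this yields
	# int(0.6 * 1024 * 1024 * 1000 / 8) = 78643200, used as the expected maximum load.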

	# parse features/coalesce config parameters (those defined in profile configuration)
	# context is for error message
	def _parse_config_parameters(self, value, context):
		# expand config variables
		value = self._variables.expand(value)
		# split, supporting various delimiters
		v = str(re.sub(r"(:\s*)|(\s+)|(\s*;\s*)|(\s*,\s*)", " ", value)).split()
		lv = len(v)
		if lv % 2 != 0:
			log.error("invalid %s parameter: '%s'" % (context, str(value)))
			return None
		if lv == 0:
			return dict()
		# convert flat list to dict
		return dict(list(zip(v[::2], v[1::2])))
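	# Example (illustrative): the profile setting "coalesce=rx-usecs 3 tx-usecs 16"
	# arrives here as "rx-usecs 3 tx-usecs 16" and is returned as
	# {"rx-usecs": "3", "tx-usecs": "16"}; "rx-usecs: 3, tx-usecs: 16" parses identically.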

	# parse features/coalesce device parameters (those returned by ethtool)
	def _parse_device_parameters(self, value):
		# substitute "Adaptive RX: val1  TX: val2" to 'adaptive-rx: val1' and
		# 'adaptive-tx: val2' and workaround for ethtool inconsistencies
		# (rhbz#1225375)
		value = self._cmd.multiple_re_replace({
			"Adaptive RX:": "adaptive-rx:",
			"\\s+TX:": "\nadaptive-tx:",
			"rx-frame-low:": "rx-frames-low:",
			"rx-frame-high:": "rx-frames-high:",
			"tx-frame-low:": "tx-frames-low:",
			"tx-frame-high:": "tx-frames-high:",
			"large-receive-offload:": "lro:",
			"rx-checksumming:": "rx:",
			"tx-checksumming:": "tx:",
			"scatter-gather:": "sg:",
			"tcp-segmentation-offload:": "tso:",
			"udp-fragmentation-offload:": "ufo:",
			"generic-segmentation-offload:": "gso:",
			"generic-receive-offload:": "gro:",
			"rx-vlan-offload:": "rxvlan:",
			"tx-vlan-offload:": "txvlan:",
			"ntuple-filters:": "ntuple:",
			"receive-hashing:": "rxhash:",
		}, value)
		# remove empty lines, remove fixed parameters (those with "[fixed]")
		vl = [v for v in value.split('\n') if len(str(v)) > 0 and not re.search(r"\[fixed\]$", str(v))]
		if len(vl) < 2:
			return None
		# skip first line (device name), split to key/value,
		# remove pairs which are not key/value
		return dict([u for u in [re.split(r":\s*", str(v)) for v in vl[1:]] if len(u) == 2])

	@classmethod
	def _nf_conntrack_hashsize_path(self):
		return "/sys/module/nf_conntrack/parameters/hashsize"

	@command_set("wake_on_lan", per_device=True)
	def _set_wake_on_lan(self, value, device, sim, remove):
		if value is None:
			return None

		# see man ethtool for possible wol values, 0 added as an alias for 'd'
		value = re.sub(r"0", "d", str(value));
		if not re.match(r"^[" + WOL_VALUES + r"]+$", value):
			log.warn("Incorrect 'wake_on_lan' value.")
			return None

		if not sim:
			self._cmd.execute(["ethtool", "-s", device, "wol", value])
		return value

	@command_get("wake_on_lan")
	def _get_wake_on_lan(self, device, ignore_missing=False):
		value = None
		try:
			m = re.match(r".*Wake-on:\s*([" + WOL_VALUES + "]+).*", self._cmd.execute(["ethtool", device])[1], re.S)
			if m:
				value = m.group(1)
		except IOError:
			pass
		return value

	@command_set("nf_conntrack_hashsize")
	def _set_nf_conntrack_hashsize(self, value, sim, remove):
		if value is None:
			return None

		hashsize = int(value)
		if hashsize >= 0:
			if not sim:
				self._cmd.write_to_file(self._nf_conntrack_hashsize_path(), hashsize, \
					no_error = [errno.ENOENT] if remove else False)
			return hashsize
		else:
			return None

	@command_get("nf_conntrack_hashsize")
	def _get_nf_conntrack_hashsize(self):
		value = self._cmd.read_file(self._nf_conntrack_hashsize_path())
		if len(value) > 0:
			return int(value)
		return None

	def _call_ip_link(self, args=[]):
		if not self._use_ip:
			return None
		args = ["ip", "link"] + args
		(rc, out, err_msg) = self._cmd.execute(args, no_errors=[errno.ENOENT], return_err=True)
		if rc == -errno.ENOENT:
			log.warn("ip command not found, ignoring for other devices")
			self._use_ip = False
			return None
		elif rc:
			log.info("Problem calling ip command")
			log.debug("(rc: %s, msg: '%s')" % (rc, err_msg))
			return None
		return out

	def _ip_link_show(self, device=None):
		args = ["show"]
		if device:
			args.append(device)
		return self._call_ip_link(args)

	@command_set("txqueuelen", per_device=True)
	def _set_txqueuelen(self, value, device, sim, remove):
		if value is None:
			return None
		try:
			int(value)
		except ValueError:
			log.warn("txqueuelen value '%s' is not integer" % value)
			return None
		if not sim:
			# there is an inconsistency in "ip": the value is set via "txqueuelen" but shown as "qlen"
			res = self._call_ip_link(["set", "dev", device, "txqueuelen", value])
			if res is None:
				log.warn("Cannot set txqueuelen for device '%s'" % device)
				return None
		return value

	def _get_re_ip_link_show(self, arg):
		"""
		Return regex for int arg value from "ip link show" command
		"""
		if arg not in self._re_ip_link_show:
			self._re_ip_link_show[arg] = re.compile(r".*\s+%s\s+(\d+)" % arg)
		return self._re_ip_link_show[arg]

	@command_get("txqueuelen")
	def _get_txqueuelen(self, device, ignore_missing=False):
		out = self._ip_link_show(device)
		if out is None:
			if not ignore_missing:
				log.info("Cannot get 'ip link show' result for txqueuelen value for device '%s'" % device)
			return None
		res = self._get_re_ip_link_show("qlen").search(out)
		if res is None:
			# We can theoretically get device without qlen (http://linux-ip.net/gl/ip-cref/ip-cref-node17.html)
			if not ignore_missing:
				log.info("Cannot get txqueuelen value from 'ip link show' result for device '%s'" % device)
			return None
		return res.group(1)

	@command_set("mtu", per_device=True)
	def _set_mtu(self, value, device, sim, remove):
		if value is None:
			return None
		try:
			int(value)
		except ValueError:
			log.warn("mtu value '%s' is not integer" % value)
			return None
		if not sim:
			res = self._call_ip_link(["set", "dev", device, "mtu", value])
			if res is None:
				log.warn("Cannot set mtu for device '%s'" % device)
				return None
		return value

	@command_get("mtu")
	def _get_mtu(self, device, ignore_missing=False):
		out = self._ip_link_show(device)
		if out is None:
			if not ignore_missing:
				log.info("Cannot get 'ip link show' result for mtu value for device '%s'" % device)
			return None
		res = self._get_re_ip_link_show("mtu").search(out)
		if res is None:
			# mtu value should be always present, but it's better to have a test
			if not ignore_missing:
				log.info("Cannot get mtu value from 'ip link show' result for device '%s'" % device)
			return None
		return res.group(1)

	# d is dict: {parameter: value}
	def _check_parameters(self, context, d):
		if context == "features":
			return True
		params = set(d.keys())
		supported_getter = { "coalesce": self._get_config_options_coalesce, \
				"pause": self._get_config_options_pause, \
				"ring": self._get_config_options_ring, \
				"channels": self._get_config_options_channels }
		supported = set(supported_getter[context]().keys())
		if not params.issubset(supported):
			log.error("unknown %s parameter(s): %s" % (context, str(params - supported)))
			return False
		return True

	# parse output of ethtool -a
	def _parse_pause_parameters(self, s):
		s = self._cmd.multiple_re_replace(\
				{"Autonegotiate": "autoneg",
				"RX": "rx",
				"TX": "tx"}, s)
		l = s.split("\n")[1:]
		l = [x for x in l if x != '' and not re.search(r"\[fixed\]", x)]
		return dict([x for x in [re.split(r":\s*", x) for x in l] if len(x) == 2])

	# parse output of ethtool -g
	def _parse_ring_parameters(self, s):
		a = re.split(r"^Current hardware settings:$", s, flags=re.MULTILINE)
		s = a[1]
		s = self._cmd.multiple_re_replace(\
				{"RX": "rx",
				"RX Mini": "rx-mini",
				"RX Jumbo": "rx-jumbo",
				"TX": "tx"}, s)
		l = s.split("\n")
		l = [x for x in l if x != '']
		l = [x for x in [re.split(r":\s*", x) for x in l] if len(x) == 2]
		return dict(l)

	# parse output of ethtool -l
	def _parse_channels_parameters(self, s):
		a = re.split(r"^Current hardware settings:$", s, flags=re.MULTILINE)
		s = a[1]
		s = self._cmd.multiple_re_replace(\
				{"RX": "rx",
				"TX": "tx",
				"Other": "other",
				"Combined": "combined"}, s)
		l = s.split("\n")
		l = [x for x in l if x != '']
		l = [x for x in [re.split(r":\s*", x) for x in l] if len(x) == 2]
		return dict(l)

	def _replace_channels_parameters(self, context, params_list, dev_params):
		mod_params_list = []
		if "combined" in params_list:
			mod_params_list.extend(["rx", params_list[1], "tx", params_list[1]])
		else:
			cnt = str(max(int(params_list[1]), int(params_list[3])))
			mod_params_list.extend(["combined", cnt])
		return dict(list(zip(mod_params_list[::2], mod_params_list[1::2])))
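	# Example (illustrative): if the profile requests "combined 16" but the device only
	# exposes separate rx/tx queues, this returns {"rx": "16", "tx": "16"}; if the profile
	# requests "rx 8 tx 4" on a combined-only device, it returns {"combined": "8"}.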

	def _check_device_support(self, context, parameters, device, dev_params):
		"""Filter unsupported parameters and log warnings about it

		Positional parameters:
		context -- context of change
		parameters -- parameters to change
		device -- name of device on which should be parameters set
		dev_params -- dictionary of currently known parameters of device
		"""
		supported_parameters = set(dev_params.keys())
		parameters_to_change = set(parameters.keys())
		# if parameters_to_change contains unsupported parameter(s) then remove
		# it/them
		unsupported_parameters = (parameters_to_change
			- supported_parameters)
		for param in unsupported_parameters:
			log.warning("%s parameter %s is not supported by device %s" % (
				context,
				param,
				device,
			))
			parameters.pop(param, None)

	def _get_device_parameters(self, context, device):
		context2opt = { "coalesce": "-c", "features": "-k", "pause": "-a", "ring": "-g", \
				"channels": "-l"}
		opt = context2opt[context]
		ret, value = self._cmd.execute(["ethtool", opt, device])
		if ret != 0 or len(value) == 0:
			return None
		context2parser = { "coalesce": self._parse_device_parameters, \
				"features": self._parse_device_parameters, \
				"pause": self._parse_pause_parameters, \
				"ring": self._parse_ring_parameters, \
				"channels": self._parse_channels_parameters }
		parser = context2parser[context]
		d = parser(value)
		if context == "coalesce" and not self._check_parameters(context, d):
			return None
		return d

	def _set_device_parameters(self, context, value, device, sim,
				dev_params = None):
		if value is None or len(value) == 0:
			return None
		d = self._parse_config_parameters(value, context)
		if d is None or not self._check_parameters(context, d):
			return {}
		# check if device supports parameters and filter out unsupported ones
		if dev_params:
			self._check_device_support(context, d, device, dev_params)
			# replace the channel parameters based on the device support
			if context == "channels" and str(dev_params[next(iter(d))]) in ["n/a", "0"]:
				d = self._replace_channels_parameters(context, self._cmd.dict2list(d), dev_params)

		if not sim and len(d) != 0:
			log.debug("setting %s: %s" % (context, str(d)))
			context2opt = { "coalesce": "-C", "features": "-K", "pause": "-A", "ring": "-G", \
                                "channels": "-L"}
			opt = context2opt[context]
			# ignore ethtool return code 80, it means parameter is already set
			self._cmd.execute(["ethtool", opt, device] + self._cmd.dict2list(d), no_errors = [80])
		return d

	def _custom_parameters(self, context, start, value, device, verify):
		storage_key = self._storage_key(
				command_name = context,
				device_name = device)
		if start:
			params_current = self._get_device_parameters(context,
					device)
			if params_current is None or len(params_current) == 0:
				return False
			params_set = self._set_device_parameters(context,
					value, device, verify,
					dev_params = params_current)
			# if none of parameters passed checks then the command completely
			# failed
			if params_set is None or len(params_set) == 0:
				return False
			relevant_params_current = [(param, value) for param, value
					in params_current.items()
					if param in params_set]
			relevant_params_current = dict(relevant_params_current)
			if verify:
				res = (self._cmd.dict2list(params_set)
						== self._cmd.dict2list(relevant_params_current))
				self._log_verification_result(context, res,
						params_set,
						relevant_params_current,
						device = device)
				return res
			# saved are only those parameters which passed checks
			self._storage.set(storage_key, " ".join(
					self._cmd.dict2list(relevant_params_current)))
		else:
			original_value = self._storage.get(storage_key)
			# in storage are only those parameters which were already tested
			# so skip check for supported parameters
			self._set_device_parameters(context, original_value, device, False)
		return None

	@command_custom("features", per_device = True)
	def _features(self, start, value, device, verify, ignore_missing):
		return self._custom_parameters("features", start, value, device, verify)

	@command_custom("coalesce", per_device = True)
	def _coalesce(self, start, value, device, verify, ignore_missing):
		return self._custom_parameters("coalesce", start, value, device, verify)

	@command_custom("pause", per_device = True)
	def _pause(self, start, value, device, verify, ignore_missing):
		return self._custom_parameters("pause", start, value, device, verify)

	@command_custom("ring", per_device = True)
	def _ring(self, start, value, device, verify, ignore_missing):
		return self._custom_parameters("ring", start, value, device, verify)

	@command_custom("channels", per_device = True)
	def _channels(self, start, value, device, verify, ignore_missing):
		return self._custom_parameters("channels", start, value, device, verify)
site-packages/tuned/plugins/decorators.py
__all__ = ["command_set", "command_get", "command_custom"]

#	@command_set("scheduler", per_device=True)
#	def set_scheduler(self, value, device):
#		set_new_scheduler
#
#	@command_get("scheduler")
#	def get_scheduler(self, device):
#		return current_scheduler
#
#	@command_set("foo")
#	def set_foo(self, value):
#		set_new_foo
#
#	@command_get("foo")
#	def get_foo(self):
#		return current_foo
#
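#	An analogous sketch for @command_custom (illustrative), mirroring the way
#	plugin_net registers its ethtool-based commands:
#
#	@command_custom("features", per_device=True)
#	def _features(self, start, value, device, verify, ignore_missing):
#		handle_features
#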

def command_set(name, per_device=False, priority=0):
	def wrapper(method):
		method._command = {
			"set": True,
			"name": name,
			"per_device": per_device,
			"priority": priority,
		}
		return method

	return wrapper

def command_get(name):
	def wrapper(method):
		method._command = {
			"get": True,
			"name": name,
		}
		return method
	return wrapper

def command_custom(name, per_device=False, priority=0):
	def wrapper(method):
		method._command = {
			"custom": True,
			"name": name,
			"per_device": per_device,
			"priority": priority,
		}
		return method
	return wrapper
site-packages/tuned/plugins/base.py
import re
import tuned.consts as consts
import tuned.profiles.variables
import tuned.logs
import collections
from tuned.utils.commands import commands
import os
from subprocess import Popen, PIPE

log = tuned.logs.get()

class Plugin(object):
	"""
	Base class for all plugins.

	Plugins change various system settings in order to get desired performance or power
	saving. Plugins use Monitor objects to get information from the running system.

	Intentionally a lot of logic is included in the plugin to increase plugin flexibility.
	"""

	def __init__(self, monitors_repository, storage_factory, hardware_inventory, device_matcher, device_matcher_udev, instance_factory, global_cfg, variables):
		"""Plugin constructor."""

		self._storage = storage_factory.create(self.__class__.__name__)
		self._monitors_repository = monitors_repository
		self._hardware_inventory = hardware_inventory
		self._device_matcher = device_matcher
		self._device_matcher_udev = device_matcher_udev
		self._instance_factory = instance_factory

		self._instances = collections.OrderedDict()
		self._init_commands()

		self._global_cfg = global_cfg
		self._variables = variables
		self._has_dynamic_options = False
		self._devices_inited = False

		self._options_used_by_dynamic = self._get_config_options_used_by_dynamic()

		self._cmd = commands()

	def cleanup(self):
		self.destroy_instances()

	def init_devices(self):
		if not self._devices_inited:
			self._init_devices()
			self._devices_inited = True

	@property
	def name(self):
		return self.__class__.__module__.split(".")[-1].split("_", 1)[1]

	#
	# Plugin configuration manipulation and helpers.
	#

	@classmethod
	def _get_config_options(self):
		"""Default configuration options for the plugin."""
		return {}

	@classmethod
	def get_config_options_hints(cls):
		"""Explanation of each config option function"""
		return {}

	@classmethod
	def _get_config_options_used_by_dynamic(self):
		"""List of config options used by dynamic tuning. Their previous values will be automatically saved and restored."""
		return []

	def _get_effective_options(self, options):
		"""Merge provided options with plugin default options."""
		# TODO: _has_dynamic_options is a hack
		effective = self._get_config_options().copy()
		for key in options:
			if key in effective or self._has_dynamic_options:
				effective[key] = options[key]
			else:
				log.warn("Unknown option '%s' for plugin '%s'." % (key, self.__class__.__name__))
		return effective

	def _option_bool(self, value):
		if type(value) is bool:
			return value
		value = str(value).lower()
		return value == "true" or value == "1"

	#
	# Interface for manipulation with instances of the plugin.
	#

	def create_instance(self, name, devices_expression, devices_udev_regex, script_pre, script_post, options):
		"""Create new instance of the plugin and seize the devices."""
		if name in self._instances:
			raise Exception("Plugin instance with name '%s' already exists." % name)

		effective_options = self._get_effective_options(options)
		instance = self._instance_factory.create(self, name, devices_expression, devices_udev_regex, \
			script_pre, script_post, effective_options)
		self._instances[name] = instance

		return instance

	def destroy_instance(self, instance):
		"""Destroy existing instance."""
		if instance._plugin != self:
			raise Exception("Plugin instance '%s' does not belong to this plugin '%s'." % (instance, self))
		if instance.name not in self._instances:
			raise Exception("Plugin instance '%s' was already destroyed." % instance)

		instance = self._instances[instance.name]
		self._destroy_instance(instance)
		del self._instances[instance.name]

	def initialize_instance(self, instance):
		"""Initialize an instance."""
		log.debug("initializing instance %s (%s)" % (instance.name, self.name))
		self._instance_init(instance)

	def destroy_instances(self):
		"""Destroy all instances."""
		for instance in list(self._instances.values()):
			log.debug("destroying instance %s (%s)" % (instance.name, self.name))
			self._destroy_instance(instance)
		self._instances.clear()

	def _destroy_instance(self, instance):
		self.release_devices(instance)
		self._instance_cleanup(instance)

	def _instance_init(self, instance):
		raise NotImplementedError()

	def _instance_cleanup(self, instance):
		raise NotImplementedError()

	#
	# Devices handling
	#

	def _init_devices(self):
		self._devices_supported = False
		self._assigned_devices = set()
		self._free_devices = set()

	def _get_device_objects(self, devices):
		"""Override this in a subclass to transform a list of device names (e.g. ['sda'])
		   to a list of pyudev.Device objects, if your plugin supports it"""
		return None

	def _get_matching_devices(self, instance, devices):
		if instance.devices_udev_regex is None:
			return set(self._device_matcher.match_list(instance.devices_expression, devices))
		else:
			udev_devices = self._get_device_objects(devices)
			if udev_devices is None:
				log.error("Plugin '%s' does not support the 'devices_udev_regex' option", self.name)
				return set()
			udev_devices = self._device_matcher_udev.match_list(instance.devices_udev_regex, udev_devices)
			return set([x.sys_name for x in udev_devices])

	def assign_free_devices(self, instance):
		if not self._devices_supported:
			return

		log.debug("assigning devices to instance %s" % instance.name)
		to_assign = self._get_matching_devices(instance, self._free_devices)
		instance.active = len(to_assign) > 0
		if not instance.active:
			log.warn("instance %s: no matching devices available" % instance.name)
		else:
			name = instance.name
			if instance.name != self.name:
				name += " (%s)" % self.name
			log.info("instance %s: assigning devices %s" % (name, ", ".join(to_assign)))
			instance.assigned_devices.update(to_assign) # cannot use |=
			self._assigned_devices |= to_assign
			self._free_devices -= to_assign

	def release_devices(self, instance):
		if not self._devices_supported:
			return

		to_release = (instance.processed_devices \
				| instance.assigned_devices) \
				& self._assigned_devices

		instance.active = False
		instance.processed_devices.clear()
		instance.assigned_devices.clear()
		self._assigned_devices -= to_release
		self._free_devices |= to_release

	#
	# Tuning activation and deactivation.
	#

	def _run_for_each_device(self, instance, callback, devices):
		if not self._devices_supported:
			devices = [None, ]

		for device in devices:
			callback(instance, device)

	def _instance_pre_static(self, instance, enabling):
		pass

	def _instance_post_static(self, instance, enabling):
		pass

	def _call_device_script(self, instance, script, op, devices, rollback = consts.ROLLBACK_SOFT):
		if script is None:
			return None
		if len(devices) == 0:
			log.warn("Instance '%s': no device to call script '%s' for." % (instance.name, script))
			return None
		if not script.startswith("/"):
			log.error("Relative paths cannot be used in script_pre or script_post. " \
				+ "Use ${i:PROFILE_DIR}.")
			return False
		dir_name = os.path.dirname(script)
		ret = True
		for dev in devices:
			environ = os.environ
			environ.update(self._variables.get_env())
			arguments = [op]
			if rollback == consts.ROLLBACK_FULL:
				arguments.append("full_rollback")
			arguments.append(dev)
			log.info("calling script '%s' with arguments '%s'" % (script, str(arguments)))
			log.debug("using environment '%s'" % str(list(environ.items())))
			try:
				proc = Popen([script] +  arguments, \
						stdout=PIPE, stderr=PIPE, \
						close_fds=True, env=environ, \
						cwd = dir_name, universal_newlines = True)
				out, err = proc.communicate()
				if proc.returncode:
					log.error("script '%s' error: %d, '%s'" % (script, proc.returncode, err[:-1]))
					ret = False
			except (OSError,IOError) as e:
				log.error("script '%s' error: %s" % (script, e))
				ret = False
		return ret
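	# Example (illustrative): with script_pre=${i:PROFILE_DIR}/script.sh and device "sda",
	# applying the profile runs "script.sh apply sda"; a full rollback runs
	# "script.sh unapply full_rollback sda".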

	def instance_apply_tuning(self, instance):
		"""
		Apply static and dynamic tuning if the plugin instance is active.
		"""
		if not instance.active:
			return

		if instance.has_static_tuning:
			self._call_device_script(instance, instance.script_pre,
					"apply", instance.assigned_devices)
			self._instance_pre_static(instance, True)
			self._instance_apply_static(instance)
			self._instance_post_static(instance, True)
			self._call_device_script(instance, instance.script_post,
					"apply", instance.assigned_devices)
		if instance.has_dynamic_tuning and self._global_cfg.get(consts.CFG_DYNAMIC_TUNING, consts.CFG_DEF_DYNAMIC_TUNING):
			self._run_for_each_device(instance, self._instance_apply_dynamic, instance.assigned_devices)
		instance.processed_devices.update(instance.assigned_devices)
		instance.assigned_devices.clear()

	def instance_verify_tuning(self, instance, ignore_missing):
		"""
		Verify static tuning if the plugin instance is active.
		"""
		if not instance.active:
			return None

		if len(instance.assigned_devices) != 0:
			log.error("BUG: Some devices have not been tuned: %s"
					% ", ".join(instance.assigned_devices))
		devices = instance.processed_devices.copy()
		if instance.has_static_tuning:
			if self._call_device_script(instance, instance.script_pre, "verify", devices) == False:
				return False
			if self._instance_verify_static(instance, ignore_missing, devices) == False:
				return False
			if self._call_device_script(instance, instance.script_post, "verify", devices) == False:
				return False
			return True
		else:
			return None

	def instance_update_tuning(self, instance):
		"""
		Apply dynamic tuning if the plugin instance is active.
		"""
		if not instance.active:
			return
		if instance.has_dynamic_tuning and self._global_cfg.get(consts.CFG_DYNAMIC_TUNING, consts.CFG_DEF_DYNAMIC_TUNING):
			self._run_for_each_device(instance, self._instance_update_dynamic, instance.processed_devices.copy())

	def instance_unapply_tuning(self, instance, rollback = consts.ROLLBACK_SOFT):
		"""
		Remove all tunings applied by the plugin instance.
		"""
		if rollback == consts.ROLLBACK_NONE:
			return

		if instance.has_dynamic_tuning and self._global_cfg.get(consts.CFG_DYNAMIC_TUNING, consts.CFG_DEF_DYNAMIC_TUNING):
			self._run_for_each_device(instance, self._instance_unapply_dynamic, instance.processed_devices)
		if instance.has_static_tuning:
			self._call_device_script(instance, instance.script_post,
					"unapply", instance.processed_devices,
					rollback = rollback)
			self._instance_pre_static(instance, False)
			self._instance_unapply_static(instance, rollback)
			self._instance_post_static(instance, False)
			self._call_device_script(instance, instance.script_pre, "unapply", instance.processed_devices, rollback = rollback)

	def _instance_apply_static(self, instance):
		self._execute_all_non_device_commands(instance)
		self._execute_all_device_commands(instance, instance.assigned_devices)

	def _instance_verify_static(self, instance, ignore_missing, devices):
		ret = True
		if self._verify_all_non_device_commands(instance, ignore_missing) == False:
			ret = False
		if self._verify_all_device_commands(instance, devices, ignore_missing) == False:
			ret = False
		return ret

	def _instance_unapply_static(self, instance, rollback = consts.ROLLBACK_SOFT):
		self._cleanup_all_device_commands(instance,
				instance.processed_devices)
		self._cleanup_all_non_device_commands(instance)

	def _instance_apply_dynamic(self, instance, device):
		for option in [opt for opt in self._options_used_by_dynamic if self._storage_get(instance, self._commands[opt], device) is None]:
			self._check_and_save_value(instance, self._commands[option], device)

		self._instance_update_dynamic(instance, device)

	def _instance_unapply_dynamic(self, instance, device):
		raise NotImplementedError()

	def _instance_update_dynamic(self, instance, device):
		raise NotImplementedError()

	#
	# Registration of commands for static plugins.
	#

	def _init_commands(self):
		"""
		Initialize commands.
		"""
		self._commands = collections.OrderedDict()
		self._autoregister_commands()
		self._check_commands()

	def _autoregister_commands(self):
		"""
		Register all commands marked using @command_set, @command_get, and @command_custom decorators.
		"""
		for member_name in self.__class__.__dict__:
			if member_name.startswith("__"):
				continue
			member = getattr(self, member_name)
			if not hasattr(member, "_command"):
				continue

			command_name = member._command["name"]
			info = self._commands.get(command_name, {"name": command_name})

			if "set" in member._command:
				info["custom"] = None
				info["set"] = member
				info["per_device"] = member._command["per_device"]
				info["priority"] = member._command["priority"]
			elif "get" in member._command:
				info["get"] = member
			elif "custom" in member._command:
				info["custom"] = member
				info["per_device"] = member._command["per_device"]
				info["priority"] = member._command["priority"]

			self._commands[command_name] = info

		# sort commands by priority
		self._commands = collections.OrderedDict(sorted(iter(self._commands.items()), key=lambda name_info: name_info[1]["priority"]))
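
	# Illustrative use of the decorators (example names, a sketch only): a plugin
	# defines a matching get/set pair which _autoregister_commands() picks up, e.g.:
	#   @command_set("elevator", per_device=True)
	#   def _set_elevator(self, value, device, sim, remove): ...
	#   @command_get("elevator")
	#   def _get_elevator(self, device, ignore_missing=False): ...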

	def _check_commands(self):
		"""
		Check if all commands are defined correctly.
		"""
		for command_name, command in list(self._commands.items()):
			# do not check custom commands
			if command.get("custom", False):
				continue
			# automatic commands should have 'get' and 'set' functions
			if "get" not in command or "set" not in command:
				raise TypeError("Plugin command '%s' is not defined correctly" % command_name)

	#
	# Operations with persistent storage for status data.
	#

	def _storage_key(self, instance_name = None, command_name = None,
			device_name = None):
		class_name = type(self).__name__
		instance_name = "" if instance_name is None else instance_name
		command_name = "" if command_name is None else command_name
		device_name = "" if device_name is None else device_name
		return "%s/%s/%s/%s" % (class_name, instance_name,
				command_name, device_name)
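
	# Illustrative example (assumed names, not part of the logic): for a plugin
	# class DiskPlugin, an instance "sda_group", command "elevator" and device
	# "sda", the generated key would be "DiskPlugin/sda_group/elevator/sda";
	# omitted parts are left empty, e.g. "DiskPlugin/sda_group//".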

	def _storage_set(self, instance, command, value, device_name=None):
		key = self._storage_key(instance.name, command["name"], device_name)
		self._storage.set(key, value)

	def _storage_get(self, instance, command, device_name=None):
		key = self._storage_key(instance.name, command["name"], device_name)
		return self._storage.get(key)

	def _storage_unset(self, instance, command, device_name=None):
		key = self._storage_key(instance.name, command["name"], device_name)
		return self._storage.unset(key)

	#
	# Command execution, verification, and cleanup.
	#

	def _execute_all_non_device_commands(self, instance):
		for command in [command for command in list(self._commands.values()) if not command["per_device"]]:
			new_value = self._variables.expand(instance.options.get(command["name"], None))
			if new_value is not None:
				self._execute_non_device_command(instance, command, new_value)

	def _execute_all_device_commands(self, instance, devices):
		for command in [command for command in list(self._commands.values()) if command["per_device"]]:
			new_value = self._variables.expand(instance.options.get(command["name"], None))
			if new_value is None:
				continue
			for device in devices:
				self._execute_device_command(instance, command, device, new_value)

	def _verify_all_non_device_commands(self, instance, ignore_missing):
		ret = True
		for command in [command for command in list(self._commands.values()) if not command["per_device"]]:
			new_value = self._variables.expand(instance.options.get(command["name"], None))
			if new_value is not None:
				if self._verify_non_device_command(instance, command, new_value, ignore_missing) == False:
					ret = False
		return ret

	def _verify_all_device_commands(self, instance, devices, ignore_missing):
		ret = True
		for command in [command for command in list(self._commands.values()) if command["per_device"]]:
			new_value = instance.options.get(command["name"], None)
			if new_value is None:
				continue
			for device in devices:
				if self._verify_device_command(instance, command, device, new_value, ignore_missing) == False:
					ret = False
		return ret

	def _process_assignment_modifiers(self, new_value, current_value):
		if new_value is not None:
			nws = str(new_value)
			if len(nws) <= 1:
				return new_value
			op = nws[:1]
			val = nws[1:]
			if current_value is None:
				return val if op in ["<", ">"] else new_value
			try:
				if op == ">":
					if int(val) > int(current_value):
						return val
					else:
						return None
				elif op == "<":
					if int(val) < int(current_value):
						return val
					else:
						return None
			except ValueError:
				log.warn("cannot compare new value '%s' with current value '%s' by operator '%s', using '%s' directly as new value" % (val, current_value, op, new_value))
		return new_value
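
	# Illustrative behaviour of the '>' and '<' assignment modifiers
	# (example values, assuming integer-comparable strings):
	#   _process_assignment_modifiers(">1024", "512")  -> "1024" (current is lower, apply)
	#   _process_assignment_modifiers(">1024", "4096") -> None   (current is higher, skip)
	#   _process_assignment_modifiers("<60", "100")    -> "60"   (current is higher, apply)
	#   _process_assignment_modifiers("1024", "512")   -> "1024" (no modifier, always apply)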

	def _get_current_value(self, command, device = None, ignore_missing=False):
		if device is not None:
			return command["get"](device, ignore_missing=ignore_missing)
		else:
			return command["get"]()

	def _check_and_save_value(self, instance, command, device = None, new_value = None):
		current_value = self._get_current_value(command, device)
		new_value = self._process_assignment_modifiers(new_value, current_value)
		if new_value is not None and current_value is not None:
			self._storage_set(instance, command, current_value, device)
		return new_value

	def _execute_device_command(self, instance, command, device, new_value):
		if command["custom"] is not None:
			command["custom"](True, new_value, device, False, False)
		else:
			new_value = self._check_and_save_value(instance, command, device, new_value)
			if new_value is not None:
				command["set"](new_value, device, sim = False, remove = False)

	def _execute_non_device_command(self, instance, command, new_value):
		if command["custom"] is not None:
			command["custom"](True, new_value, False, False)
		else:
			new_value = self._check_and_save_value(instance, command, None, new_value)
			if new_value is not None:
				command["set"](new_value, sim = False, remove = False)

	def _norm_value(self, value):
		v = self._cmd.unquote(str(value))
		if re.match(r'\s*(0+,?)+([\da-fA-F]*,?)*\s*$', v):
			return re.sub(r'^\s*(0+,?)+', "", v)
		return v
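
	# Illustrative normalization (example values): hexadecimal bitmask strings
	# have their leading all-zero groups stripped so that equivalent masks
	# compare equal, e.g. "00000000,0003cf3f" -> "3cf3f", while values that do
	# not match the pattern (e.g. "2-5") are returned unchanged.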

	def _verify_value(self, name, new_value, current_value, ignore_missing, device = None):
		if new_value is None:
			return None
		ret = False
		if current_value is None and ignore_missing:
			if device is None:
				log.info(consts.STR_VERIFY_PROFILE_VALUE_MISSING % name)
			else:
				log.info(consts.STR_VERIFY_PROFILE_DEVICE_VALUE_MISSING % (device, name))
			return True

		if current_value is not None:
			current_value = self._norm_value(current_value)
			new_value = self._norm_value(new_value)
			try:
				ret = int(new_value) == int(current_value)
			except ValueError:
				try:
					ret = int(new_value, 16) == int(current_value, 16)
				except ValueError:
					ret = str(new_value) == str(current_value)
					if not ret:
						vals = str(new_value).split('|')
						for val in vals:
							val = val.strip()
							ret = val == current_value
							if ret:
								break
		self._log_verification_result(name, ret, new_value,
				current_value, device = device)
		return ret
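
	# Illustrative comparisons (example values): "1024" vs "1024" match as
	# integers, "0x10" vs "10" match when both parse as hexadecimal, and a
	# new value such as "performance|powersave" is accepted if the current
	# value equals any of the '|'-separated alternatives.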

	def _log_verification_result(self, name, success, new_value,
			current_value, device = None):
		if success:
			if device is None:
				log.info(consts.STR_VERIFY_PROFILE_VALUE_OK % (name, str(current_value).strip()))
			else:
				log.info(consts.STR_VERIFY_PROFILE_DEVICE_VALUE_OK % (device, name, str(current_value).strip()))
			return True
		else:
			if device is None:
				log.error(consts.STR_VERIFY_PROFILE_VALUE_FAIL % (name, str(current_value).strip(), str(new_value).strip()))
			else:
				log.error(consts.STR_VERIFY_PROFILE_DEVICE_VALUE_FAIL % (device, name, str(current_value).strip(), str(new_value).strip()))
			return False

	def _verify_device_command(self, instance, command, device, new_value, ignore_missing):
		if command["custom"] is not None:
			return command["custom"](True, new_value, device, True, ignore_missing)
		current_value = self._get_current_value(command, device, ignore_missing=ignore_missing)
		new_value = self._process_assignment_modifiers(new_value, current_value)
		if new_value is None:
			return None
		new_value = command["set"](new_value, device, True, False)
		return self._verify_value(command["name"], new_value, current_value, ignore_missing, device)

	def _verify_non_device_command(self, instance, command, new_value, ignore_missing):
		if command["custom"] is not None:
			return command["custom"](True, new_value, True, ignore_missing)
		current_value = self._get_current_value(command)
		new_value = self._process_assignment_modifiers(new_value, current_value)
		if new_value is None:
			return None
		new_value = command["set"](new_value, True, False)
		return self._verify_value(command["name"], new_value, current_value, ignore_missing)

	def _cleanup_all_non_device_commands(self, instance):
		for command in reversed([command for command in list(self._commands.values()) if not command["per_device"]]):
			if (instance.options.get(command["name"], None) is not None) or (command["name"] in self._options_used_by_dynamic):
				self._cleanup_non_device_command(instance, command)

	def _cleanup_all_device_commands(self, instance, devices, remove = False):
		for command in reversed([command for command in list(self._commands.values()) if command["per_device"]]):
			if (instance.options.get(command["name"], None) is not None) or (command["name"] in self._options_used_by_dynamic):
				for device in devices:
					self._cleanup_device_command(instance, command, device, remove)

	def _cleanup_device_command(self, instance, command, device, remove = False):
		if command["custom"] is not None:
			command["custom"](False, None, device, False, False)
		else:
			old_value = self._storage_get(instance, command, device)
			if old_value is not None:
				command["set"](old_value, device, sim = False, remove = remove)
			self._storage_unset(instance, command, device)

	def _cleanup_non_device_command(self, instance, command):
		if command["custom"] is not None:
			command["custom"](False, None, False, False)
		else:
			old_value = self._storage_get(instance, command)
			if old_value is not None:
				command["set"](old_value, sim = False, remove = False)
			self._storage_unset(instance, command)
site-packages/tuned/plugins/plugin_mounts.py
import tuned.consts as consts
from . import base
from .decorators import *
from subprocess import Popen,PIPE
import tuned.logs
from tuned.utils.commands import commands
import glob

log = tuned.logs.get()
cmd = commands()

class MountsPlugin(base.Plugin):
	"""
	`mounts`::
	
	Enables or disables barriers for mounts according to the value of the
	[option]`disable_barriers` option. The [option]`disable_barriers`
	option has an optional value `force` which disables barriers even
	on mountpoints with write back caches. Note that only extended file
	systems (ext) are supported by this plug-in.
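	+
	.Illustrative example: disable barriers on supported ext mountpoints
	====
	----
	[mounts]
	disable_barriers=true
	----
	====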
	"""

	@classmethod
	def _generate_mountpoint_topology(cls):
		"""
		Gathers information about disks, partitions and mountpoints. For each
		mountpoint it stores the filesystem in use and the set of underlying
		disks (to handle LVM setups).
		"""
		mountpoint_topology = {}
		current_disk = None

		stdout, stderr = Popen(["lsblk", "-rno", \
				"TYPE,RM,KNAME,FSTYPE,MOUNTPOINT"], \
				stdout=PIPE, stderr=PIPE, close_fds=True, \
				universal_newlines = True).communicate()
		for columns in [line.split() for line in stdout.splitlines()]:
			if len(columns) < 3:
				continue
			device_type, device_removable, device_name = columns[:3]
			filesystem = columns[3] if len(columns) > 3 else None
			mountpoint = columns[4] if len(columns) > 4 else None

			if device_type == "disk":
				current_disk = device_name
				continue

			# skip removable, skip nonpartitions
			if device_removable == "1" or device_type not in ["part", "lvm"]:
				continue

			if mountpoint is None or mountpoint == "[SWAP]":
				continue

			mountpoint_topology.setdefault(mountpoint, {"disks": set(), "device_name": device_name, "filesystem": filesystem})
			mountpoint_topology[mountpoint]["disks"].add(current_disk)

		cls._mountpoint_topology = mountpoint_topology

	def _init_devices(self):
		self._generate_mountpoint_topology()
		self._devices_supported = True
		self._free_devices = set(self._mountpoint_topology.keys())
		self._assigned_devices = set()

	@classmethod
	def _get_config_options(cls):
		return {
			"disable_barriers": None,
		}

	def _instance_init(self, instance):
		instance._has_dynamic_tuning = False
		instance._has_static_tuning = True

	def _instance_cleanup(self, instance):
		pass

	def _get_device_cache_type(self, device):
		"""
		Get the device cache type. This works only for devices handled by the SCSI kernel subsystem.
		"""
		source_filenames = glob.glob("/sys/block/%s/device/scsi_disk/*/cache_type" % device)
		for source_filename in source_filenames:
			return cmd.read_file(source_filename).strip()
		return None

	def _mountpoint_has_writeback_cache(self, mountpoint):
		"""
		Checks whether the device has a 'write back' cache. If the cache type cannot be determined, assume some other cache type.
		"""
		for device in self._mountpoint_topology[mountpoint]["disks"]:
			if self._get_device_cache_type(device) == "write back":
				return True
		return False

	def _mountpoint_has_barriers(self, mountpoint):
		"""
		Checks if a given mountpoint is mounted with barriers enabled or disabled.
		"""
		with open("/proc/mounts") as mounts_file:
			for line in mounts_file:
				# device mountpoint filesystem options dump check
				columns = line.split()
				if columns[0][0] != "/":
					continue
				if columns[1] == mountpoint:
					option_list = columns[3]
					break
			else:
				return None

		options = option_list.split(",")
		for option in options:
			(name, sep, value) = option.partition("=")
			# nobarrier barrier=0
			if name == "nobarrier" or (name == "barrier" and value == "0"):
				return False
			# barrier barrier=1
			elif name == "barrier":
				return True
		else:
			# default
			return True

	def _remount_partition(self, partition, options):
		"""
		Remounts partition.
		"""
		remount_command = ["/usr/bin/mount", partition, "-o", "remount,%s" % options]
		cmd.execute(remount_command)

	@command_custom("disable_barriers", per_device=True)
	def _disable_barriers(self, start, value, mountpoint, verify, ignore_missing):
		storage_key = self._storage_key(
				command_name = "disable_barriers",
				device_name = mountpoint)
		force = str(value).lower() == "force"
		value = force or self._option_bool(value)

		if start:
			if not value:
				return None

			reject_reason = None

			if not self._mountpoint_topology[mountpoint]["filesystem"].startswith("ext"):
				reject_reason = "filesystem not supported"
			elif not force and self._mountpoint_has_writeback_cache(mountpoint):
				reject_reason = "device uses write back cache"
			else:
				original_value = self._mountpoint_has_barriers(mountpoint)
				if original_value is None:
					reject_reason = "unknown current setting"
				elif original_value == False:
					if verify:
						log.info(consts.STR_VERIFY_PROFILE_OK % mountpoint)
						return True
					else:
						reject_reason = "barriers already disabled"
				elif verify:
					log.error(consts.STR_VERIFY_PROFILE_FAIL % mountpoint)
					return False

			if reject_reason is not None:
				log.info("not disabling barriers on '%s' (%s)" % (mountpoint, reject_reason))
				return None

			self._storage.set(storage_key, original_value)
			log.info("disabling barriers on '%s'" % mountpoint)
			self._remount_partition(mountpoint, "barrier=0")

		else:
			if verify:
				return None
			original_value = self._storage.get(storage_key)
			if original_value is None:
				return None

			log.info("enabling barriers on '%s'" % mountpoint)
			self._remount_partition(mountpoint, "barrier=1")
			self._storage.unset(storage_key)
		return None
site-packages/tuned/plugins/plugin_bootloader.py
from . import base
from .decorators import *
import tuned.logs
from . import exceptions
from tuned.utils.commands import commands
import tuned.consts as consts

import os
import re
import tempfile
from time import sleep

log = tuned.logs.get()

class BootloaderPlugin(base.Plugin):
	"""
	`bootloader`::
	
	Adds options to the kernel command line. This plug-in supports the
	GRUB 2 boot loader and the Boot Loader Specification (BLS).
	+
	NOTE: *TuneD* will not remove or replace kernel command line
	parameters added via other methods like *grubby*. *TuneD* only
	manages the kernel command line parameters that it added itself.
	Please refer to your platform bootloader documentation for how to
	identify and manage kernel command line parameters set outside of
	*TuneD*.
	+
	A customized, non-standard location of the GRUB 2 configuration
	file can be specified with the [option]`grub2_cfg_file` option.
	+
	The kernel options are added to the current GRUB configuration and
	its templates. Reboot the system for the kernel option to take effect.
	+
	Switching to another profile or manually stopping the `tuned`
	service removes the additional options. If you shut down or reboot
	the system, the kernel options persist in the [filename]`grub.cfg`
	file and grub environment files.
	+
	The kernel options can be specified by the following syntax:
	+
	[subs="+quotes,+macros"]
	----
	cmdline__suffix__=__arg1__ __arg2__ ... __argN__
	----
	+
	Or with an alternative, but equivalent syntax:
	+
	[subs="+quotes,+macros"]
	----
	cmdline__suffix__=+__arg1__ __arg2__ ... __argN__
	----
	+
	Where __suffix__ can be an arbitrary (even empty) alphanumeric
	string which should be unique across all loaded profiles. It is
	recommended to use the profile name as the __suffix__
	(for example, [option]`cmdline_my_profile`). If there are multiple
	[option]`cmdline` options with the same suffix, during the profile
	load/merge the value which was assigned previously will be used. This
	is the same behavior as any other plug-in options. The final kernel
	command line is constructed by concatenating all the resulting
	[option]`cmdline` options.
	+
	It is also possible to remove kernel options by the following syntax:
	+
	[subs="+quotes,+macros"]
	----
	cmdline__suffix__=-__arg1__ __arg2__ ... __argN__
	----
	+
	Such kernel options will not be concatenated and thus removed during
	the final kernel command line construction.
	+
	.Modifying the kernel command line
	====
	For example, to add the [option]`quiet` kernel option to a *TuneD*
	profile, include the following lines in the [filename]`tuned.conf`
	file:
	
	----
	[bootloader]
	cmdline_my_profile=+quiet
	----
	
	An example of a custom profile `my_profile` that adds the
	[option]`isolcpus=2` option to the kernel command line:
	
	----
	[bootloader]
	cmdline_my_profile=isolcpus=2
	----
	
	An example of a custom profile `my_profile` that removes the
	[option]`rhgb quiet` options from the kernel command line (if
	previously added by *TuneD*):
	
	----
	[bootloader]
	cmdline_my_profile=-rhgb quiet
	----
	====
	+
	.Modifying the kernel command line, example with inheritance
	====
	For example, to add the [option]`rhgb quiet` kernel options to a
	*TuneD* profile `profile_1`:
	
	----
	[bootloader]
	cmdline_profile_1=+rhgb quiet
	----
	
	In the child profile `profile_2` drop the [option]`quiet` option
	from the kernel command line:
	
	----
	[main]
	include=profile_1
	
	[bootloader]
	cmdline_profile_2=-quiet
	----
	
	The final kernel command line will be [option]`rhgb`. In case the same
	[option]`cmdline` suffix as in the `profile_1` is used:
	
	----
	[main]
	include=profile_1
	
	[bootloader]
	cmdline_profile_1=-quiet
	----
	
	This results in an empty kernel command line, because during the
	merge [option]`cmdline_profile_1` is redefined to just
	[option]`-quiet`, so there is nothing left to remove during the
	final kernel command line construction.
	====
	+
	The [option]`initrd_add_img=IMAGE` adds an initrd overlay file
	`IMAGE`. If the `IMAGE` file name begins with '/', the absolute path is
	used. Otherwise, the current profile directory is used as the base
	directory for the `IMAGE`.
	+
	The [option]`initrd_add_dir=DIR` creates an initrd image from the
	directory `DIR` and adds the resulting image as an overlay.
	If the `DIR` directory name begins with '/', the absolute path
	is used. Otherwise, the current profile directory is used as the
	base directory for the `DIR`.
	+
	The [option]`initrd_dst_img=PATHNAME` sets the name and location of
	the resulting initrd image. Typically, it is not necessary to use this
	option. By default, the location of initrd images is `/boot` and the
	name of the image is taken as the basename of `IMAGE` or `DIR`. This can
	be overridden by setting [option]`initrd_dst_img`.
	+
	The [option]`initrd_remove_dir=VALUE` removes the source directory
	from which the initrd image was built if `VALUE` is true. Only 'y',
	'yes', 't', 'true' and '1' (case insensitive) are accepted as true
	values for this option. Other values are interpreted as false.
	+
	.Adding an overlay initrd image
	====
	----
	[bootloader]
	initrd_remove_dir=True
	initrd_add_dir=/tmp/tuned-initrd.img
	----
	
	This creates an initrd image from the `/tmp/tuned-initrd.img` directory
	and then removes the `tuned-initrd.img` directory from `/tmp`.
	====
	+
	The [option]`skip_grub_config=VALUE` does not change grub
	configuration if `VALUE` is true. However, [option]`cmdline`
	options are still processed, and the result is used to verify the current
	cmdline. Only 'y', 'yes', 't', 'true' and '1' (case insensitive) are accepted
	as true values for this option. Other values are interpreted as false.
	+
	.Do not change grub configuration
	====
	----
	[bootloader]
	skip_grub_config=True
	cmdline=+systemd.cpu_affinity=1
	----
	====
	"""

	def __init__(self, *args, **kwargs):
		if not os.path.isfile(consts.GRUB2_TUNED_TEMPLATE_PATH):
			raise exceptions.NotSupportedPluginException("Required GRUB2 template not found, disabling plugin.")
		super(BootloaderPlugin, self).__init__(*args, **kwargs)
		self._cmd = commands()

	def _instance_init(self, instance):
		instance._has_dynamic_tuning = False
		instance._has_static_tuning = True
		# controls grub2_cfg rewrites in _instance_post_static
		self.update_grub2_cfg = False
		self._skip_grub_config_val = False
		self._initrd_remove_dir = False
		self._initrd_dst_img_val = None
		self._cmdline_val = ""
		self._initrd_val = ""
		self._grub2_cfg_file_names = self._get_grub2_cfg_files()
		self._bls = self._bls_enabled()

		self._rpm_ostree = self._rpm_ostree_status() is not None

	def _instance_cleanup(self, instance):
		pass

	@classmethod
	def _get_config_options(cls):
		return {
			"grub2_cfg_file": None,
			"initrd_dst_img": None,
			"initrd_add_img": None,
			"initrd_add_dir": None,
			"initrd_remove_dir": None,
			"cmdline": None,
			"skip_grub_config": None,
		}

	@staticmethod
	def _options_to_dict(options, omit=""):
		"""
		Returns dict created from options
		e.g.: _options_to_dict("A=A A=B A B=A C=A", "A=B B=A B=B") returns {'A': ['A', None], 'C': ['A']}
		"""
		d = {}
		omit = omit.split()
		for o in options.split():
			if o not in omit:
				arr = o.split('=', 1)
				d.setdefault(arr[0], []).append(arr[1] if len(arr) > 1 else None)
		return d

	@staticmethod
	def _dict_to_options(d):
		return " ".join([k + "=" + v1 if v1 is not None else k for k, v in d.items() for v1 in v])

	def _rpm_ostree_status(self):
		"""
		Returns the status of rpm-ostree transactions, or None if not running on an rpm-ostree system.
		"""
		(rc, out, err) = self._cmd.execute(['rpm-ostree', 'status'], return_err=True)
		log.debug("rpm-ostree status output stdout:\n%s\nstderr:\n%s" % (out, err))
		if rc != 0:
			return None
		split_out = out.split()
		if len(split_out) < 2 or split_out[0] != "State:":
			log.warn("Unexpected format of rpm-ostree status output:\n%s" % out)
			return None
		return split_out[1]

	def _wait_till_idle(self):
		sleep_cycles = 10
		sleep_secs = 1.0
		for i in range(sleep_cycles):
			if self._rpm_ostree_status() == "idle":
				return True
			sleep(sleep_secs)
		if self._rpm_ostree_status() == "idle":
			return True
		return False

	def _rpm_ostree_kargs(self, append={}, delete={}):
		"""
		Appends or deletes rpm-ostree kernel arguments (kargs).
		Returns (None, None, None) if rpm-ostree is not present or the system
		is not an ostree system, otherwise a tuple of (new kargs, appended
		kargs, deleted kargs).
		"""
		(rc, out, err) = self._cmd.execute(['rpm-ostree', 'kargs'], return_err=True)
		log.debug("rpm-ostree output stdout:\n%s\nstderr:\n%s" % (out, err))
		if rc != 0:
			return None, None, None
		kargs = self._options_to_dict(out)

		if not self._wait_till_idle():
			log.error("Cannot wait for transaction end")
			return None, None, None

		deleted = {}
		delete_params = self._dict_to_options(delete).split()
		# Delete kargs, e.g. kargs previously added by the profile
		for k, val in delete.items():
			for v in val:
				kargs[k].remove(v)
			deleted[k] = val

		appended = {}
		append_params = self._dict_to_options(append).split()
		# Append kargs, e.g. new kargs from the profile or kargs being restored after being replaced by the profile
		for k, val in append.items():
			if kargs.get(k):
				# If a karg that we are adding already exists with a different value,
				# delete the existing value and store it for restoring after profile unload
				log.debug("adding rpm-ostree kargs %s: %s for delete" % (k, kargs[k]))
				deleted.setdefault(k, []).extend(kargs[k])
				delete_params.extend([k + "=" + v if v is not None else k for v in kargs[k]])
				kargs[k] = []
			kargs.setdefault(k, []).extend(val)
			appended[k] = val

		if append_params == delete_params:
			log.info("skipping rpm-ostree kargs - append == deleting (%s)" % append_params)
			return kargs, appended, deleted

		log.info("rpm-ostree kargs - appending: '%s'; deleting: '%s'" % (append_params, delete_params))
		(rc, _, err) = self._cmd.execute(['rpm-ostree', 'kargs'] +
										 ['--append=%s' % v for v in append_params] +
										 ['--delete=%s' % v for v in delete_params], return_err=True)
		if rc != 0:
			log.error("Something went wrong with rpm-ostree kargs\n%s" % (err))
			return self._options_to_dict(out), None, None
		else:
			return kargs, appended, deleted
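
	# Illustrative call (example values): kargs are passed as dicts in the
	# _options_to_dict() format, where each value is a list and None means the
	# argument has no '=value' part, e.g.
	#   self._rpm_ostree_kargs(append={"isolcpus": ["2"]}, delete={"quiet": [None]})
	# would append "isolcpus=2" and delete "quiet" (assuming "quiet" is
	# currently present) from the kernel arguments.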

	def _get_effective_options(self, options):
		"""Merge provided options with plugin default options and merge all cmdline.* options."""
		effective = self._get_config_options().copy()
		cmdline_keys = []
		for key in options:
			if str(key).startswith("cmdline"):
				cmdline_keys.append(key)
			elif key in effective:
				effective[key] = options[key]
			else:
				log.warn("Unknown option '%s' for plugin '%s'." % (key, self.__class__.__name__))
		cmdline = ""
		for key in cmdline_keys:
			val = options[key]
			if val is None or val == "":
				continue
			op = val[0]
			op1 = val[1:2]
			vals = val[1:].strip()
			if op == "+" or (op == "\\" and op1 in ["\\", "+", "-"]):
				if vals != "":
					cmdline += " " + vals
			elif op == "-":
				if vals != "":
					for p in vals.split():
						regex = re.escape(p)
						cmdline = re.sub(r"(\A|\s)" + regex + r"(?=\Z|\s)", r"", cmdline)
			else:
				cmdline += " " + val
		cmdline = cmdline.strip()
		if cmdline != "":
			effective["cmdline"] = cmdline
		return effective
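
	# Illustrative merge (example values, processed in dictionary order):
	#   options = {"cmdline_a": "+quiet rhgb", "cmdline_b": "-rhgb"}
	# results in effective["cmdline"] == "quiet" -- the '+' arguments are
	# concatenated and the '-' arguments are removed from the result.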

	def _get_grub2_cfg_files(self):
		cfg_files = []
		for f in consts.GRUB2_CFG_FILES:
			if os.path.exists(f):
				cfg_files.append(f)
		return cfg_files

	def _bls_enabled(self):
		grub2_default_env = self._cmd.read_file(consts.GRUB2_DEFAULT_ENV_FILE, no_error = True)
		if len(grub2_default_env) <= 0:
			log.info("cannot read '%s'" % consts.GRUB2_DEFAULT_ENV_FILE)
			return False

		return re.search(r"^\s*GRUB_ENABLE_BLSCFG\s*=\s*\"?\s*[tT][rR][uU][eE]\s*\"?\s*$", grub2_default_env,
			flags = re.MULTILINE) is not None

	def _patch_bootcmdline(self, d):
		return self._cmd.add_modify_option_in_file(consts.BOOT_CMDLINE_FILE, d)

	def _remove_grub2_tuning(self):
		self._patch_bootcmdline({consts.BOOT_CMDLINE_TUNED_VAR : "", consts.BOOT_CMDLINE_INITRD_ADD_VAR : ""})
		if not self._grub2_cfg_file_names:
			log.info("cannot find grub.cfg to patch")
			return
		for f in self._grub2_cfg_file_names:
			self._cmd.add_modify_option_in_file(f, {r"set\s+" + consts.GRUB2_TUNED_VAR : "", r"set\s+" + consts.GRUB2_TUNED_INITRD_VAR : ""}, add = False)
		if self._initrd_dst_img_val is not None:
			log.info("removing initrd image '%s'" % self._initrd_dst_img_val)
			self._cmd.unlink(self._initrd_dst_img_val)

	def _get_rpm_ostree_changes(self):
		f = self._cmd.read_file(consts.BOOT_CMDLINE_FILE)
		appended = re.search(consts.BOOT_CMDLINE_TUNED_VAR + r"=\"(.*)\"", f, flags=re.MULTILINE)
		appended = appended[1] if appended else ""
		deleted = re.search(consts.BOOT_CMDLINE_KARGS_DELETED_VAR + r"=\"(.*)\"", f, flags=re.MULTILINE)
		deleted = deleted[1] if deleted else ""
		return appended, deleted

	def _remove_rpm_ostree_tuning(self):
		appended, deleted = self._get_rpm_ostree_changes()
		self._rpm_ostree_kargs(append=self._options_to_dict(deleted), delete=self._options_to_dict(appended))
		self._patch_bootcmdline({consts.BOOT_CMDLINE_TUNED_VAR: "", consts.BOOT_CMDLINE_KARGS_DELETED_VAR: ""})

	def _instance_unapply_static(self, instance, rollback = consts.ROLLBACK_SOFT):
		if rollback == consts.ROLLBACK_FULL and not self._skip_grub_config_val:
			if self._rpm_ostree:
				log.info("removing rpm-ostree tuning previously added by Tuned")
				self._remove_rpm_ostree_tuning()
			else:
				log.info("removing grub2 tuning previously added by Tuned")
				self._remove_grub2_tuning()
				self._update_grubenv({"tuned_params" : "", "tuned_initrd" : ""})

	def _grub2_cfg_unpatch(self, grub2_cfg):
		log.debug("unpatching grub.cfg")
		cfg = re.sub(r"^\s*set\s+" + consts.GRUB2_TUNED_VAR + "\\s*=.*\n", "", grub2_cfg, flags = re.MULTILINE)
		grub2_cfg = re.sub(r" *\$" + consts.GRUB2_TUNED_VAR, "", cfg, flags = re.MULTILINE)
		cfg = re.sub(r"^\s*set\s+" + consts.GRUB2_TUNED_INITRD_VAR + "\\s*=.*\n", "", grub2_cfg, flags = re.MULTILINE)
		grub2_cfg = re.sub(r" *\$" + consts.GRUB2_TUNED_INITRD_VAR, "", cfg, flags = re.MULTILINE)
		cfg = re.sub(consts.GRUB2_TEMPLATE_HEADER_BEGIN + r"\n", "", grub2_cfg, flags = re.MULTILINE)
		return re.sub(consts.GRUB2_TEMPLATE_HEADER_END + r"\n+", "", cfg, flags = re.MULTILINE)

	def _grub2_cfg_patch_initial(self, grub2_cfg, d):
		log.debug("initial patching of grub.cfg")
		s = r"\1\n\n" + consts.GRUB2_TEMPLATE_HEADER_BEGIN + "\n"
		for opt in d:
			s += r"set " + self._cmd.escape(opt) + "=\"" + self._cmd.escape(d[opt]) + "\"\n"
		s += consts.GRUB2_TEMPLATE_HEADER_END + r"\n"
		grub2_cfg = re.sub(r"^(\s*###\s+END\s+[^#]+/00_header\s+### *)\n", s, grub2_cfg, flags = re.MULTILINE)

		d2 = {"linux" : consts.GRUB2_TUNED_VAR, "initrd" : consts.GRUB2_TUNED_INITRD_VAR}
		for i in d2:
			# add TuneD parameters to all kernels
			grub2_cfg = re.sub(r"^(\s*" + i + r"(16|efi)?\s+.*)$", r"\1 $" + d2[i], grub2_cfg, flags = re.MULTILINE)
			# remove TuneD parameters from rescue kernels
			grub2_cfg = re.sub(r"^(\s*" + i + r"(?:16|efi)?\s+\S+rescue.*)\$" + d2[i] + r" *(.*)$", r"\1\2", grub2_cfg, flags = re.MULTILINE)
			# fix whitespaces in rescue kernels
			grub2_cfg = re.sub(r"^(\s*" + i + r"(?:16|efi)?\s+\S+rescue.*) +$", r"\1", grub2_cfg, flags = re.MULTILINE)
		return grub2_cfg

	def _grub2_default_env_patch(self):
		grub2_default_env = self._cmd.read_file(consts.GRUB2_DEFAULT_ENV_FILE)
		if len(grub2_default_env) <= 0:
			log.info("cannot read '%s'" % consts.GRUB2_DEFAULT_ENV_FILE)
			return False

		d = {"GRUB_CMDLINE_LINUX_DEFAULT" : consts.GRUB2_TUNED_VAR, "GRUB_INITRD_OVERLAY" : consts.GRUB2_TUNED_INITRD_VAR}
		write = False
		for i in d:
			if re.search(r"^[^#]*\b" + i + r"\s*=.*\\\$" + d[i] + r"\b.*$", grub2_default_env, flags = re.MULTILINE) is None:
				write = True
				if grub2_default_env[-1] != "\n":
					grub2_default_env += "\n"
				grub2_default_env += i + "=\"${" + i + ":+$" + i + r" }\$" + d[i] + "\"\n"
		if write:
			log.debug("patching '%s'" % consts.GRUB2_DEFAULT_ENV_FILE)
			self._cmd.write_to_file(consts.GRUB2_DEFAULT_ENV_FILE, grub2_default_env)
		return True

	def _grub2_default_env_unpatch(self):
		grub2_default_env = self._cmd.read_file(consts.GRUB2_DEFAULT_ENV_FILE)
		if len(grub2_default_env) <= 0:
			log.info("cannot read '%s'" % consts.GRUB2_DEFAULT_ENV_FILE)
			return False

		write = False
		if re.search(r"^GRUB_CMDLINE_LINUX_DEFAULT=\"\$\{GRUB_CMDLINE_LINUX_DEFAULT:\+\$GRUB_CMDLINE_LINUX_DEFAULT \}\\\$" +
			consts.GRUB2_TUNED_VAR + "\"$", grub2_default_env, flags = re.MULTILINE):
				write = True
				cfg = re.sub(r"^GRUB_CMDLINE_LINUX_DEFAULT=\"\$\{GRUB_CMDLINE_LINUX_DEFAULT:\+\$GRUB_CMDLINE_LINUX_DEFAULT \}\\\$" +
					consts.GRUB2_TUNED_VAR + "\"$\n", "", grub2_default_env, flags = re.MULTILINE)
				if cfg[-1] != "\n":
					cfg += "\n"
		if write:
			log.debug("unpatching '%s'" % consts.GRUB2_DEFAULT_ENV_FILE)
			self._cmd.write_to_file(consts.GRUB2_DEFAULT_ENV_FILE, cfg)
		return True

	def _grub2_cfg_patch(self, d):
		log.debug("patching grub.cfg")
		if not self._grub2_cfg_file_names:
			log.info("cannot find grub.cfg to patch")
			return False
		for f in self._grub2_cfg_file_names:
			grub2_cfg = self._cmd.read_file(f)
			if len(grub2_cfg) <= 0:
				log.info("cannot patch %s" % f)
				continue
			log.debug("adding boot command line parameters to '%s'" % f)
			grub2_cfg_new = grub2_cfg
			patch_initial = False
			for opt in d:
				(grub2_cfg_new, nsubs) = re.subn(r"\b(set\s+" + opt + r"\s*=).*$", r"\1" + "\"" + self._cmd.escape(d[opt]) + "\"", grub2_cfg_new, flags = re.MULTILINE)
				if nsubs < 1 or re.search(r"\$" + opt, grub2_cfg, flags = re.MULTILINE) is None:
					patch_initial = True

			# workaround for rhbz#1442117
			if len(re.findall(r"\$" + consts.GRUB2_TUNED_VAR, grub2_cfg, flags = re.MULTILINE)) != \
				len(re.findall(r"\$" + consts.GRUB2_TUNED_INITRD_VAR, grub2_cfg, flags = re.MULTILINE)):
					patch_initial = True

			if patch_initial:
				grub2_cfg_new = self._grub2_cfg_patch_initial(self._grub2_cfg_unpatch(grub2_cfg), d)
			self._cmd.write_to_file(f, grub2_cfg_new)
		if self._bls:
			self._grub2_default_env_unpatch()
		else:
			self._grub2_default_env_patch()
		return True

	def _rpm_ostree_update(self):
		appended, _ = self._get_rpm_ostree_changes()
		_cmdline_dict = self._options_to_dict(self._cmdline_val, appended)
		if not _cmdline_dict:
			return None
		(_, _, d) = self._rpm_ostree_kargs(append=_cmdline_dict)
		if d is None:
			return
		self._patch_bootcmdline({consts.BOOT_CMDLINE_TUNED_VAR : self._cmdline_val, consts.BOOT_CMDLINE_KARGS_DELETED_VAR : self._dict_to_options(d)})

	def _grub2_update(self):
		self._grub2_cfg_patch({consts.GRUB2_TUNED_VAR : self._cmdline_val, consts.GRUB2_TUNED_INITRD_VAR : self._initrd_val})
		self._patch_bootcmdline({consts.BOOT_CMDLINE_TUNED_VAR : self._cmdline_val, consts.BOOT_CMDLINE_INITRD_ADD_VAR : self._initrd_val})

	def _has_bls(self):
		return os.path.exists(consts.BLS_ENTRIES_PATH)

	def _update_grubenv(self, d):
		log.debug("updating grubenv, setting %s" % str(d))
		l = ["%s=%s" % (str(option), str(value)) for option, value in d.items()]
		(rc, out) = self._cmd.execute(["grub2-editenv", "-", "set"] + l)
		if rc != 0:
			log.warn("cannot update grubenv: '%s'" % out)
			return False
		return True

	def _bls_entries_patch_initial(self):
		machine_id = self._cmd.get_machine_id()
		if machine_id == "":
			return False
		log.debug("running kernel update hook '%s' to patch BLS entries" % consts.KERNEL_UPDATE_HOOK_FILE)
		(rc, out) = self._cmd.execute([consts.KERNEL_UPDATE_HOOK_FILE, "add"], env = {"KERNEL_INSTALL_MACHINE_ID" : machine_id})
		if rc != 0:
			log.warn("cannot patch BLS entries: '%s'" % out)
			return False
		return True

	def _bls_update(self):
		log.debug("updating BLS")
		if self._has_bls() and \
			self._update_grubenv({"tuned_params" : self._cmdline_val, "tuned_initrd" : self._initrd_val}) and \
			self._bls_entries_patch_initial():
				return True
		return False

	def _init_initrd_dst_img(self, name):
		if self._initrd_dst_img_val is None:
			self._initrd_dst_img_val = os.path.join(consts.BOOT_DIR, os.path.basename(name))

	def _check_petitboot(self):
		return os.path.isdir(consts.PETITBOOT_DETECT_DIR)

	def _install_initrd(self, img):
		if self._rpm_ostree:
			log.warn("Detected rpm-ostree which doesn't support initrd overlays.")
			return False
		if self._check_petitboot():
			log.warn("Detected Petitboot which doesn't support initrd overlays. The initrd overlay will be ignored by bootloader.")
		log.info("installing initrd image as '%s'" % self._initrd_dst_img_val)
		img_name = os.path.basename(self._initrd_dst_img_val)
		if not self._cmd.copy(img, self._initrd_dst_img_val):
			return False
		self.update_grub2_cfg = True
		curr_cmdline = self._cmd.read_file("/proc/cmdline").rstrip()
		initrd_grubpath = "/"
		lc = len(curr_cmdline)
		if lc:
			path = re.sub(r"^\s*BOOT_IMAGE=\s*(?:\([^)]*\))?(\S*/).*$", "\\1", curr_cmdline)
			if len(path) < lc:
				initrd_grubpath = path
		self._initrd_val = os.path.join(initrd_grubpath, img_name)
		return True

	@command_custom("grub2_cfg_file")
	def _grub2_cfg_file(self, enabling, value, verify, ignore_missing):
		# nothing to verify
		if verify:
			return None
		if enabling and value is not None:
			self._grub2_cfg_file_names = [str(value)]

	@command_custom("initrd_dst_img")
	def _initrd_dst_img(self, enabling, value, verify, ignore_missing):
		# nothing to verify
		if verify:
			return None
		if enabling and value is not None:
			self._initrd_dst_img_val = str(value)
			if self._initrd_dst_img_val == "":
				return False
			if self._initrd_dst_img_val[0] != "/":
				self._initrd_dst_img_val = os.path.join(consts.BOOT_DIR, self._initrd_dst_img_val)

	@command_custom("initrd_remove_dir")
	def _initrd_remove_dir(self, enabling, value, verify, ignore_missing):
		# nothing to verify
		if verify:
			return None
		if enabling and value is not None:
			self._initrd_remove_dir = self._cmd.get_bool(value) == "1"

	@command_custom("initrd_add_img", per_device = False, priority = 10)
	def _initrd_add_img(self, enabling, value, verify, ignore_missing):
		# nothing to verify
		if verify:
			return None
		if enabling and value is not None:
			src_img = str(value)
			self._init_initrd_dst_img(src_img)
			if src_img == "":
				return False
			if not self._install_initrd(src_img):
				return False

	@command_custom("initrd_add_dir", per_device = False, priority = 10)
	def _initrd_add_dir(self, enabling, value, verify, ignore_missing):
		# nothing to verify
		if verify:
			return None
		if enabling and value is not None:
			src_dir = str(value)
			self._init_initrd_dst_img(src_dir)
			if src_dir == "":
				return False
			if not os.path.isdir(src_dir):
				log.error("error: cannot create initrd image, source directory '%s' doesn't exist" % src_dir)
				return False

			log.info("generating initrd image from directory '%s'" % src_dir)
			(fd, tmpfile) = tempfile.mkstemp(prefix = "tuned-bootloader-", suffix = ".tmp")
			log.debug("writing initrd image to temporary file '%s'" % tmpfile)
			os.close(fd)
			(rc, out) = self._cmd.execute("find . | cpio -co > %s" % tmpfile, cwd = src_dir, shell = True)
			log.debug("cpio log: %s" % out)
			if rc != 0:
				log.error("error generating initrd image")
				self._cmd.unlink(tmpfile, no_error = True)
				return False
			self._install_initrd(tmpfile)
			self._cmd.unlink(tmpfile)
			if self._initrd_remove_dir:
				log.info("removing directory '%s'" % src_dir)
				self._cmd.rmtree(src_dir)

	@command_custom("cmdline", per_device = False, priority = 10)
	def _cmdline(self, enabling, value, verify, ignore_missing):
		v = self._variables.expand(self._cmd.unquote(value))
		if verify:
			if self._rpm_ostree:
				rpm_ostree_kargs = self._rpm_ostree_kargs()[0]
				cmdline = self._dict_to_options(rpm_ostree_kargs)
			else:
				cmdline = self._cmd.read_file("/proc/cmdline")
			if len(cmdline) == 0:
				return None
			cmdline_set = set(cmdline.split())
			value_set = set(v.split())
			missing_set = value_set - cmdline_set
			if len(missing_set) == 0:
				log.info(consts.STR_VERIFY_PROFILE_VALUE_OK % ("cmdline", str(value_set)))
				return True
			else:
				cmdline_dict = {v.split("=", 1)[0]: v for v in cmdline_set}
				for m in missing_set:
					arg = m.split("=", 1)[0]
					if not arg in cmdline_dict:
						log.error(consts.STR_VERIFY_PROFILE_CMDLINE_FAIL_MISSING % (arg, m))
					else:
						log.error(consts.STR_VERIFY_PROFILE_CMDLINE_FAIL % (cmdline_dict[arg], m))
				present_set = value_set & cmdline_set
				log.info("expected arguments that are present in cmdline: %s"%(" ".join(present_set),))
				return False
		if enabling and value is not None:
			log.info("installing additional boot command line parameters to grub2")
			self.update_grub2_cfg = True
			self._cmdline_val = v

	@command_custom("skip_grub_config", per_device = False, priority = 10)
	def _skip_grub_config(self, enabling, value, verify, ignore_missing):
		if verify:
			return None
		if enabling and value is not None:
			if self._cmd.get_bool(value) == "1":
				log.info("skipping any modification of grub config")
				self._skip_grub_config_val = True

	def _instance_post_static(self, instance, enabling):
		if enabling and self._skip_grub_config_val:
			if len(self._initrd_val) > 0:
				log.warn("requested changes to initrd will not be applied!")
			if len(self._cmdline_val) > 0:
				log.warn("requested changes to cmdline will not be applied!")
			# ensure that the desired cmdline is always written to BOOT_CMDLINE_FILE (/etc/tuned/bootcmdline)
			self._patch_bootcmdline({consts.BOOT_CMDLINE_TUNED_VAR : self._cmdline_val, consts.BOOT_CMDLINE_INITRD_ADD_VAR : self._initrd_val})
		elif enabling and self.update_grub2_cfg:
			if self._rpm_ostree:
				self._rpm_ostree_update()
			else:
				self._grub2_update()
				self._bls_update()
			self.update_grub2_cfg = False
site-packages/tuned/plugins/plugin_systemd.py
from . import base
from .decorators import *
import tuned.logs
from . import exceptions
from tuned.utils.commands import commands
import tuned.consts as consts

import os
import re

log = tuned.logs.get()

class SystemdPlugin(base.Plugin):
	"""
	`systemd`::
	
	Plug-in for tuning systemd options.
	+
	The [option]`cpu_affinity` option allows setting CPUAffinity in
	`/etc/systemd/system.conf`. This configures the CPU affinity for the
	service manager as well as the default CPU affinity for all forked
	off processes. The option takes a comma-separated list of CPUs with
	optional CPU ranges specified by the minus sign (`-`).
	+
	.Set the CPUAffinity for `systemd` to `0 1 2 3`
	====
	----
	[systemd]
	cpu_affinity=0-3
	----
	====
	+
	NOTE: These tunings are unloaded only on profile change followed by a reboot.
	"""

	def __init__(self, *args, **kwargs):
		if not os.path.isfile(consts.SYSTEMD_SYSTEM_CONF_FILE):
			raise exceptions.NotSupportedPluginException("Required systemd '%s' configuration file not found, disabling plugin." % consts.SYSTEMD_SYSTEM_CONF_FILE)
		super(SystemdPlugin, self).__init__(*args, **kwargs)
		self._cmd = commands()

	def _instance_init(self, instance):
		instance._has_dynamic_tuning = False
		instance._has_static_tuning = True

	def _instance_cleanup(self, instance):
		pass

	@classmethod
	def _get_config_options(cls):
		return {
			"cpu_affinity": None,
		}

	def _get_keyval(self, conf, key):
		if conf is not None:
			mo = re.search(r"^\s*" + key + r"\s*=\s*(.*)$", conf, flags = re.MULTILINE)
			if mo is not None and mo.lastindex == 1:
				return mo.group(1)
		return None

	# add/replace key with the value
	def _add_keyval(self, conf, key, val):
		(conf_new, nsubs) = re.subn(r"^(\s*" + key + r"\s*=).*$", r"\g<1>" + str(val), conf, flags = re.MULTILINE)
		if nsubs < 1:
			try:
				if conf[-1] != "\n":
					conf += "\n"
			except IndexError:
				pass
			conf += key + "=" + str(val) + "\n"
			return conf
		return conf_new

	def _del_key(self, conf, key):
		return re.sub(r"^\s*" + key + r"\s*=.*\n", "", conf, flags = re.MULTILINE)

	def _read_systemd_system_conf(self):
		systemd_system_conf = self._cmd.read_file(consts.SYSTEMD_SYSTEM_CONF_FILE, err_ret = None)
		if systemd_system_conf is None:
			log.error("error reading systemd configuration file")
			return None
		return systemd_system_conf

	def _write_systemd_system_conf(self, conf):
		tmpfile = consts.SYSTEMD_SYSTEM_CONF_FILE + consts.TMP_FILE_SUFFIX
		if not self._cmd.write_to_file(tmpfile, conf):
			log.error("error writing systemd configuration file")
			self._cmd.unlink(tmpfile, no_error = True)
			return False
		# Atomic replace. This doesn't work on Windows (AFAIK there is no way to
		# replace a file atomically on Windows), but it's unlikely this code will run there.
		if not self._cmd.rename(tmpfile, consts.SYSTEMD_SYSTEM_CONF_FILE):
			log.error("error replacing systemd configuration file '%s'" % consts.SYSTEMD_SYSTEM_CONF_FILE)
			self._cmd.unlink(tmpfile, no_error = True)
			return False
		return True

	def _get_storage_filename(self):
		return os.path.join(consts.PERSISTENT_STORAGE_DIR, self.name)

	def _remove_systemd_tuning(self):
		conf = self._read_systemd_system_conf()
		if (conf is not None):
			fname = self._get_storage_filename()
			cpu_affinity_saved = self._cmd.read_file(fname, err_ret = None, no_error = True)
			self._cmd.unlink(fname)
			if cpu_affinity_saved is None:
				conf = self._del_key(conf, consts.SYSTEMD_CPUAFFINITY_VAR)
			else:
				conf = self._add_keyval(conf, consts.SYSTEMD_CPUAFFINITY_VAR, cpu_affinity_saved)
			self._write_systemd_system_conf(conf)

	def _instance_unapply_static(self, instance, rollback = consts.ROLLBACK_SOFT):
		if rollback == consts.ROLLBACK_FULL:
			log.info("removing '%s' systemd tuning previously added by TuneD" % consts.SYSTEMD_CPUAFFINITY_VAR)
			self._remove_systemd_tuning()
			log.console("you may need to manualy run 'dracut -f' to update the systemd configuration in initrd image")

	# convert cpulist from systemd syntax to TuneD syntax and unpack it
	def _cpulist_convert_unpack(self, cpulist):
		if cpulist is None:
			return ""
		return " ".join(str(v) for v in self._cmd.cpulist_unpack(re.sub(r"\s+", r",", re.sub(r",\s+", r",", cpulist))))

	@command_custom("cpu_affinity", per_device = False)
	def _cmdline(self, enabling, value, verify, ignore_missing):
		conf_affinity = None
		conf_affinity_unpacked = None
		v = self._cmd.unescape(self._variables.expand(self._cmd.unquote(value)))
		v_unpacked = " ".join(str(v) for v in self._cmd.cpulist_unpack(v))
		conf = self._read_systemd_system_conf()
		if conf is not None:
			conf_affinity = self._get_keyval(conf, consts.SYSTEMD_CPUAFFINITY_VAR)
			conf_affinity_unpacked = self._cpulist_convert_unpack(conf_affinity)
		if verify:
			return self._verify_value("cpu_affinity", v_unpacked, conf_affinity_unpacked, ignore_missing)
		if enabling:
			fname = self._get_storage_filename()
			cpu_affinity_saved = self._cmd.read_file(fname, err_ret = None, no_error = True)
			if conf_affinity is not None and cpu_affinity_saved is None and v_unpacked != conf_affinity_unpacked:
				self._cmd.write_to_file(fname, conf_affinity, makedir = True)

			log.info("setting '%s' to '%s' in the '%s'" % (consts.SYSTEMD_CPUAFFINITY_VAR, v_unpacked, consts.SYSTEMD_SYSTEM_CONF_FILE))
			self._write_systemd_system_conf(self._add_keyval(conf, consts.SYSTEMD_CPUAFFINITY_VAR, v_unpacked))
site-packages/tuned/plugins/plugin_video.py
from . import base
from .decorators import *
import tuned.logs
from tuned.utils.commands import commands
import os
import errno
import re

log = tuned.logs.get()

class VideoPlugin(base.Plugin):
	"""
	`video`::
	
	Sets various powersave levels on video cards. Currently, only the
	Radeon cards are supported. The powersave level can be specified
	by using the [option]`radeon_powersave` option. Supported values are:
	+
	--
	* `default`
	* `auto`
	* `low`
	* `mid`
	* `high`
	* `dynpm`
	* `dpm-battery`
	* `dpm-balanced`
	* `dpm-performance`
	--
	+
	For additional detail, see
	link:https://www.x.org/wiki/RadeonFeature/#kmspowermanagementoptions[KMS Power Management Options].
	+
	NOTE: This plug-in is experimental and the option might change in future releases.
	+
	.To set the powersave level for the Radeon video card to high
	====
	----
	[video]
	radeon_powersave=high
	----
	====
	"""

	def _init_devices(self):
		self._devices_supported = True
		self._free_devices = set()
		self._assigned_devices = set()

		# FIXME: this is a blind shot, needs testing
		for device in self._hardware_inventory.get_devices("drm").match_sys_name("card*").match_property("DEVTYPE", "drm_minor"):
			self._free_devices.add(device.sys_name)

		self._cmd = commands()

	def _get_device_objects(self, devices):
		return [self._hardware_inventory.get_device("drm", x) for x in devices]

	@classmethod
	def _get_config_options(cls):
		return {
			"radeon_powersave" : None,
		}

	def _instance_init(self, instance):
		instance._has_dynamic_tuning = False
		instance._has_static_tuning = True

	def _instance_cleanup(self, instance):
		pass

	def _radeon_powersave_files(self, device):
		return {
			"method" : "/sys/class/drm/%s/device/power_method" % device,
			"profile": "/sys/class/drm/%s/device/power_profile" % device,
			"dpm_state": "/sys/class/drm/%s/device/power_dpm_state" % device
		}

	@command_set("radeon_powersave", per_device=True)
	def _set_radeon_powersave(self, value, device, sim, remove):
		sys_files = self._radeon_powersave_files(device)
		va = str(re.sub(r"(\s*:\s*)|(\s+)|(\s*;\s*)|(\s*,\s*)", " ", value)).split()
		if not os.path.exists(sys_files["method"]):
			if not sim:
				log.warn("radeon_powersave is not supported on '%s'" % device)
				return None
		for v in va:
			if v in ["default", "auto", "low", "mid", "high"]:
				if not sim:
					if (self._cmd.write_to_file(sys_files["method"], "profile", \
						no_error = [errno.ENOENT] if remove else False) and
						self._cmd.write_to_file(sys_files["profile"], v, \
							no_error = [errno.ENOENT] if remove else False)):
								return v
			elif v == "dynpm":
				if not sim:
					if (self._cmd.write_to_file(sys_files["method"], "dynpm", \
						no_error = [errno.ENOENT] if remove else False)):
							return "dynpm"
			# new DPM profiles, recommended to use if supported
			elif v in ["dpm-battery", "dpm-balanced", "dpm-performance"]:
				if not sim:
					state = v[len("dpm-"):]
					if (self._cmd.write_to_file(sys_files["method"], "dpm", \
						no_error = [errno.ENOENT] if remove else False) and
						self._cmd.write_to_file(sys_files["dpm_state"], state, \
							no_error = [errno.ENOENT] if remove else False)):
								return v
			else:
				if not sim:
					log.warn("Invalid option for radeon_powersave.")
				return None
		return None

	@command_get("radeon_powersave")
	def _get_radeon_powersave(self, device, ignore_missing = False):
		sys_files = self._radeon_powersave_files(device)
		method = self._cmd.read_file(sys_files["method"], no_error=ignore_missing).strip()
		if method == "profile":
			return self._cmd.read_file(sys_files["profile"]).strip()
		elif method == "dynpm":
			return method
		elif method == "dpm":
			return "dpm-" + self._cmd.read_file(sys_files["dpm_state"]).strip()
		else:
			return None
site-packages/tuned/plugins/plugin_vm.py
from . import base
from .decorators import *
import tuned.logs

import os
import errno
import struct
import glob
from tuned.utils.commands import commands

log = tuned.logs.get()
cmd = commands()

class VMPlugin(base.Plugin):
	"""
	`vm`::
	
	Enables or disables transparent huge pages depending on value of the
	[option]`transparent_hugepages` option. The option can have one of three
	possible values `always`, `madvise` and `never`.
	+
	.Disable transparent hugepages
	====
	----
	[vm]
	transparent_hugepages=never
	----
	====
	+
	The [option]`transparent_hugepage.defrag` option specifies the
	defragmentation policy. Possible values for this option are `always`,
	`defer`, `defer+madvise`, `madvise` and `never`. For a detailed
	explanation of these values refer to
	link:https://www.kernel.org/doc/Documentation/vm/transhuge.txt[Transparent Hugepage Support].
	"""

	@classmethod
	def _get_config_options(cls):
		return {
			"transparent_hugepages" : None,
			"transparent_hugepage" : None,
			"transparent_hugepage.defrag" : None,
		}

	def _instance_init(self, instance):
		instance._has_static_tuning = True
		instance._has_dynamic_tuning = False

	def _instance_cleanup(self, instance):
		pass

	@classmethod
	def _thp_path(cls):
		path = "/sys/kernel/mm/transparent_hugepage"
		if not os.path.exists(path):
			# RHEL-6 support
			path =  "/sys/kernel/mm/redhat_transparent_hugepage"
		return path

	@command_set("transparent_hugepages")
	def _set_transparent_hugepages(self, value, sim, remove):
		if value not in ["always", "never", "madvise"]:
			if not sim:
				log.warn("Incorrect 'transparent_hugepages' value '%s'." % str(value))
			return None

		cmdline = cmd.read_file("/proc/cmdline", no_error = True)
		if cmdline.find("transparent_hugepage=") > 0:
			if not sim:
				log.info("transparent_hugepage is already set in kernel boot cmdline, ignoring value from profile")
			return None

		sys_file = os.path.join(self._thp_path(), "enabled")
		if os.path.exists(sys_file):
			if not sim:
				cmd.write_to_file(sys_file, value, \
					no_error = [errno.ENOENT] if remove else False)
			return value
		else:
			if not sim:
				log.warn("Option 'transparent_hugepages' is not supported on current hardware.")
			return None

	# just an alias to transparent_hugepages
	@command_set("transparent_hugepage")
	def _set_transparent_hugepage(self, value, sim, remove):
		self._set_transparent_hugepages(value, sim, remove)

	@command_get("transparent_hugepages")
	def _get_transparent_hugepages(self):
		sys_file = os.path.join(self._thp_path(), "enabled")
		if os.path.exists(sys_file):
			return cmd.get_active_option(cmd.read_file(sys_file))
		else:
			return None

	# just an alias to transparent_hugepages
	@command_get("transparent_hugepage")
	def _get_transparent_hugepage(self):
		return self._get_transparent_hugepages()

	@command_set("transparent_hugepage.defrag")
	def _set_transparent_hugepage_defrag(self, value, sim, remove):
		sys_file = os.path.join(self._thp_path(), "defrag")
		if os.path.exists(sys_file):
			if not sim:
				cmd.write_to_file(sys_file, value, \
					no_error = [errno.ENOENT] if remove else False)
			return value
		else:
			if not sim:
				log.warn("Option 'transparent_hugepage.defrag' is not supported on current hardware.")
			return None

	@command_get("transparent_hugepage.defrag")
	def _get_transparent_hugepage_defrag(self):
		sys_file = os.path.join(self._thp_path(), "defrag")
		if os.path.exists(sys_file):
			return cmd.get_active_option(cmd.read_file(sys_file))
		else:
			return None
site-packages/tuned/plugins/plugin_sysctl.py
import re
from . import base
from .decorators import *
import tuned.logs
from subprocess import *
from tuned.utils.commands import commands
import tuned.consts as consts
import errno
import os

log = tuned.logs.get()

DEPRECATED_SYSCTL_OPTIONS = [ "base_reachable_time", "retrans_time" ]
SYSCTL_CONFIG_DIRS = [ "/run/sysctl.d",
		"/etc/sysctl.d" ]

class SysctlPlugin(base.Plugin):
	"""
	`sysctl`::
	
	Sets various kernel parameters at runtime.
	+
	This plug-in is used for applying custom `sysctl` settings and should
	only be used to change system settings that are not covered by other
	*TuneD* plug-ins. If the settings are covered by other *TuneD* plug-ins,
	use those plug-ins instead.
	+
	The syntax for this plug-in is
	`_key_=_value_`, where
	`_key_` is the same as the key name provided by the
	`sysctl` utility.
	+
	.Adjusting the runtime value of kernel.sched_min_granularity_ns
	====
	----
	[sysctl]
	kernel.sched_min_granularity_ns=3000000
	----
	====
	"""

	def __init__(self, *args, **kwargs):
		super(SysctlPlugin, self).__init__(*args, **kwargs)
		self._has_dynamic_options = True
		self._cmd = commands()

	def _instance_init(self, instance):
		instance._has_dynamic_tuning = False
		instance._has_static_tuning = True

		# FIXME: do we want to do this here?
		# recover original values in case of crash
		storage_key = self._storage_key(instance.name)
		instance._sysctl_original = self._storage.get(storage_key, {})
		if len(instance._sysctl_original) > 0:
			log.info("recovering old sysctl settings from previous run")
			self._instance_unapply_static(instance)
			instance._sysctl_original = {}
			self._storage.unset(storage_key)

		instance._sysctl = instance.options

	def _instance_cleanup(self, instance):
		storage_key = self._storage_key(instance.name)
		self._storage.unset(storage_key)

	def _instance_apply_static(self, instance):
		for option, value in list(instance._sysctl.items()):
			original_value = self._read_sysctl(option)
			if original_value is None:
				log.error("sysctl option %s will not be set, failed to read the original value."
						% option)
			else:
				new_value = self._variables.expand(
						self._cmd.unquote(value))
				new_value = self._process_assignment_modifiers(
						new_value, original_value)
				if new_value is not None:
					instance._sysctl_original[option] = original_value
					self._write_sysctl(option, new_value)

		storage_key = self._storage_key(instance.name)
		self._storage.set(storage_key, instance._sysctl_original)

		if self._global_cfg.get_bool(consts.CFG_REAPPLY_SYSCTL, consts.CFG_DEF_REAPPLY_SYSCTL):
			log.info("reapplying system sysctl")
			self._apply_system_sysctl(instance._sysctl)

	def _instance_verify_static(self, instance, ignore_missing, devices):
		ret = True
		# override, so always skip missing
		ignore_missing = True
		for option, value in list(instance._sysctl.items()):
			curr_val = self._read_sysctl(option)
			value = self._process_assignment_modifiers(self._variables.expand(value), curr_val)
			if value is not None:
				if self._verify_value(option, self._cmd.remove_ws(value), self._cmd.remove_ws(curr_val), ignore_missing) == False:
					ret = False
		return ret

	def _instance_unapply_static(self, instance, rollback = consts.ROLLBACK_SOFT):
		for option, value in list(instance._sysctl_original.items()):
			self._write_sysctl(option, value)

	def _apply_system_sysctl(self, instance_sysctl):
		files = {}
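		# Collect *.conf files from the config directories; a file name
		# seen in an earlier directory takes precedence over the same
		# name in a later one, and /etc/sysctl.conf is applied last.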
		for d in SYSCTL_CONFIG_DIRS:
			try:
				flist = os.listdir(d)
			except OSError:
				continue
			for fname in flist:
				if not fname.endswith(".conf"):
					continue
				if fname not in files:
					files[fname] = d

		for fname in sorted(files.keys()):
			d = files[fname]
			path = "%s/%s" % (d, fname)
			self._apply_sysctl_config_file(path, instance_sysctl)
		self._apply_sysctl_config_file("/etc/sysctl.conf", instance_sysctl)

	def _apply_sysctl_config_file(self, path, instance_sysctl):
		log.debug("Applying sysctl settings from file %s" % path)
		try:
			with open(path, "r") as f:
				for lineno, line in enumerate(f, 1):
					self._apply_sysctl_config_line(path, lineno, line, instance_sysctl)
			log.debug("Finished applying sysctl settings from file %s"
					% path)
		except (OSError, IOError) as e:
			if e.errno != errno.ENOENT:
				log.error("Error reading sysctl settings from file %s: %s"
						% (path, str(e)))

	def _apply_sysctl_config_line(self, path, lineno, line, instance_sysctl):
		line = line.strip()
		if len(line) == 0 or line[0] == "#" or line[0] == ";":
			return
		tmp = line.split("=", 1)
		if len(tmp) != 2:
			log.error("Syntax error in file %s, line %d"
					% (path, lineno))
			return
		option, value = tmp
		option = option.strip()
		if len(option) == 0:
			log.error("Syntax error in file %s, line %d"
					% (path, lineno))
			return
		value = value.strip()
		if option in instance_sysctl:
			instance_value = self._variables.expand(instance_sysctl[option])
			if instance_value != value:
				log.info("Overriding sysctl parameter '%s' from '%s' to '%s'"
						% (option, instance_value, value))
		self._write_sysctl(option, value, ignore_missing = True)

	def _get_sysctl_path(self, option):
		# The sysctl name used by the sysctl tool and the path under
		# /proc/sys differ. All dots (.) in the sysctl name are
		# represented by /proc/sys directories and all slashes (/) in
		# the name are converted to dots (.) in the /proc/sys filenames.
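		# For example, "net.ipv4.conf.eth0/100.rp_filter" maps to
		# "/proc/sys/net/ipv4/conf/eth0.100/rp_filter".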
		return "/proc/sys/%s" % self._cmd.tr(option, "./", "/.")

	def _read_sysctl(self, option):
		path = self._get_sysctl_path(option)
		try:
			with open(path, "r") as f:
				line = ""
				for i, line in enumerate(f):
					if i > 0:
						log.error("Failed to read sysctl parameter '%s', multi-line values are unsupported"
								% option)
						return None
				value = line.strip()
			log.debug("Value of sysctl parameter '%s' is '%s'"
					% (option, value))
			return value
		except (OSError, IOError) as e:
			if e.errno == errno.ENOENT:
				log.error("Failed to read sysctl parameter '%s', the parameter does not exist"
						% option)
			else:
				log.error("Failed to read sysctl parameter '%s': %s"
						% (option, str(e)))
			return None

	def _write_sysctl(self, option, value, ignore_missing = False):
		path = self._get_sysctl_path(option)
		if os.path.basename(path) in DEPRECATED_SYSCTL_OPTIONS:
			log.error("Refusing to set deprecated sysctl option %s"
					% option)
			return False
		try:
			log.debug("Setting sysctl parameter '%s' to '%s'"
					% (option, value))
			with open(path, "w") as f:
				f.write(value)
			return True
		except (OSError, IOError) as e:
			if e.errno == errno.ENOENT:
				log_func = log.debug if ignore_missing else log.error
				log_func("Failed to set sysctl parameter '%s' to '%s', the parameter does not exist"
						% (option, value))
			else:
				log.error("Failed to set sysctl parameter '%s' to '%s': %s"
						% (option, value, str(e)))
			return False
site-packages/tuned/plugins/__init__.py
from .repository import *
from . import instance
site-packages/tuned/plugins/plugin_cpu.py
from . import hotplug
from .decorators import *
import tuned.logs
from tuned.utils.commands import commands
import tuned.consts as consts

import os
import errno
import struct
import errno
import platform
import procfs

log = tuned.logs.get()

cpuidle_states_path = "/sys/devices/system/cpu/cpu0/cpuidle"

class CPULatencyPlugin(hotplug.Plugin):
	"""
	`cpu`::
	
	Sets the CPU governor to the value specified by the [option]`governor`
	option and dynamically changes the Power Management Quality of
	Service (PM QoS) CPU Direct Memory Access (DMA) latency according
	to the CPU load.
	
	`governor`:::
	The [option]`governor` option of the 'cpu' plug-in supports specifying
	CPU governors. Multiple governors are separated using '|'. The '|'
	character is meant to represent a logical 'or' operator. Note that the
	same syntax is used for the [option]`energy_perf_bias` option. *TuneD*
	will set the first governor that is available on the system.
	+    
	For example, with the following profile, *TuneD* will set the 'ondemand'
	governor, if it is available. If it is not available, but the 'powersave'
	governor is available, 'powersave' will be set. If neither of them are
	available, the governor will not be changed.
	+
	.Specifying a CPU governor
	====
	----
	[cpu]
	governor=ondemand|powersave
	----
	====
	
	`sampling_down_factor`:::
	The sampling rate determines how frequently the governor checks
	to tune the CPU. The [option]`sampling_down_factor` is a tunable
	that multiplies the sampling rate when the CPU is at its highest
	clock frequency thereby delaying load evaluation and improving
	performance. Allowed values for sampling_down_factor are 1 to 100000.
	+
	.The recommended setting for jitter reduction
	====
	----
	[cpu]
	sampling_down_factor = 100
	----
	====
	
	`energy_perf_bias`:::
	[option]`energy_perf_bias` supports managing energy
	vs. performance policy via x86 Model Specific Registers using the
	`x86_energy_perf_policy` tool. Multiple alternative Energy Performance
	Bias (EPB) values are supported. The alternative values are separated
	using the '|' character. The following EPB values are supported
	starting with kernel 4.13: "performance", "balance-performance",
	"normal", "balance-power" and "power". On newer processors is value
	writen straight to file (see rhbz#2095829)
	+
	.Specifying alternative Energy Performance Bias values
	====
	----
	[cpu]
	energy_perf_bias=powersave|power
	----
	
	*TuneD* will try to set EPB to 'powersave'. If that fails, it will
	try to set it to 'power'.
	====
	
	`energy_performance_preference`:::
	[option]`energy_performance_preference` supports managing energy
	vs. performance hints on newer Intel and AMD processors with active P-State
	CPU scaling drivers (intel_pstate or amd-pstate). Multiple alternative
	Energy Performance Preferences (EPP) values are supported. The alternative
	values are separated using the '|' character. Available values can be found
	in the `energy_performance_available_preferences` file in the `CPUFreq`
	policy directory in `sysfs`.
	+
	.Specifying alternative Energy Performance Hints values
	====
	----
	[cpu]
	energy_performance_preference=balance_power|power
	----
	
	*TuneD* will try to set EPP to 'balance_power'. If that fails, it will
	try to set it to 'power'.
	====
	
	`latency_low, latency_high, load_threshold`:::
	+
	If the CPU load is lower than the value specified by the
	[option]`load_threshold` option, the latency is set to the value
	specified by the [option]`latency_high` option, otherwise it is set
	to the value of the [option]`latency_low` option.
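	+
	.Dynamic latency tuning based on CPU load (plug-in defaults)
	====
	----
	[cpu]
	load_threshold=0.2
	latency_low=100
	latency_high=1000
	----
	With this configuration the latency stays at `1000` while the CPU load
	is below 20% and drops to `100` once the load reaches the threshold.
	====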
	+
	`force_latency`:::
	You can also force the latency to a specific value and prevent it from
	dynamically changing further. To do so, set the [option]`force_latency`
	option to the required latency value.
	+
	The maximum latency value can be specified in several ways:
	+
	 * by a numerical value in microseconds (for example, `force_latency=10`)
	 * as the kernel CPU idle level ID of the maximum C-state allowed
	   (for example, force_latency = cstate.id:1)
	 * as a case sensitive name of the maximum C-state allowed
	   (for example, force_latency = cstate.name:C1)
	 * by using 'None' as a fallback value to prevent errors when alternative
	   C-state IDs/names do not exist. When 'None' is used in the alternatives
	   pipeline, all the alternatives that follow 'None' are ignored.
	+
	It is also possible to specify multiple fallback values separated by '|' as
	the C-state names and/or IDs may not be available on some systems.
	+
	.Specifying fallback C-state values
	====
	----
	[cpu]
	force_latency=cstate.name:C6|cstate.id:4|10
	----
	This configuration tries to obtain and set the latency of C-state named C6.
	If the C-state C6 does not exist, kernel CPU idle level ID 4 (state4) latency
	is searched for in sysfs. Finally, if the state4 directory in sysfs is not found,
	the last latency fallback value is `10` us. The value is encoded and written into
	the kernel's PM QoS file `/dev/cpu_dma_latency`.
	====
	+
	.Specifying fallback C-state values using 'None'.
	====
	----
	[cpu]
	force_latency=cstate.name:XYZ|None
	----
	In this case, if a C-state with the name `XYZ` does not exist,
	[option]`force_latency` will not write any latency value into the
	kernel's PM QoS file, and no errors will be reported due to the
	presence of 'None'.
	====
	
	`min_perf_pct, max_perf_pct, no_turbo`:::
	These options set the internals of the Intel P-State driver exposed via the kernel's
	`sysfs` interface.
	+
	.Adjusting the configuration of the Intel P-State driver
	====
	----
	[cpu]
	min_perf_pct=100
	----
	Limits the minimum P-State that will be requested by the driver,
	expressed as a percentage of the maximum (non-turbo) performance level.
	====
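	+
	.Disabling turbo frequencies
	====
	----
	[cpu]
	no_turbo=1
	----
	Setting [option]`no_turbo` to `1` disables turbo frequencies in the
	Intel P-State driver.
	====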
	+
	`pm_qos_resume_latency_us`:::
	This option allows setting a specific PM QoS resume latency for all CPUs
	or only for specific ones.
	+
	.Disabling C-states completely
	====
	----
	[cpu]
	pm_qos_resume_latency_us=n/a
	----
	The special value `n/a` disables C-states completely.
	====
	+
	.Allowing all C-states
	====
	----
	[cpu]
	pm_qos_resume_latency_us=0
	----
	The value `0` allows all C-states.
	====
	+
	.Restricting C-states by resume latency
	====
	----
	[cpu]
	pm_qos_resume_latency_us=100
	----
	Allows any C-state with a resume latency less than the specified value
	(in microseconds).
	====
	"""

	def __init__(self, *args, **kwargs):
		super(CPULatencyPlugin, self).__init__(*args, **kwargs)

		self._has_pm_qos = True
		self._arch = "x86_64"
		self._is_x86 = False
		self._is_intel = False
		self._is_amd = False
		self._has_energy_perf_bias = False
		self._has_intel_pstate = False
		self._has_amd_pstate = False
		self._has_pm_qos_resume_latency_us = None

		self._min_perf_pct_save = None
		self._max_perf_pct_save = None
		self._no_turbo_save = None
		self._governors_map = {}
		self._cmd = commands()

		self._flags = None

	def _init_devices(self):
		self._devices_supported = True
		self._free_devices = set()
		# current list of devices
		for device in self._hardware_inventory.get_devices("cpu"):
			self._free_devices.add(device.sys_name)

		self._assigned_devices = set()

	def _get_device_objects(self, devices):
		return [self._hardware_inventory.get_device("cpu", x) for x in devices]

	@classmethod
	def _get_config_options(self):
		return {
			"load_threshold"       : 0.2,
			"latency_low"          : 100,
			"latency_high"         : 1000,
			"force_latency"        : None,
			"governor"             : None,
			"sampling_down_factor" : None,
			"energy_perf_bias"     : None,
			"min_perf_pct"         : None,
			"max_perf_pct"         : None,
			"no_turbo"             : None,
			"pm_qos_resume_latency_us": None,
			"energy_performance_preference" : None,
		}

	def _check_arch(self):
		intel_archs = [ "x86_64", "i686", "i586", "i486", "i386" ]
		self._arch = platform.machine()

		if self._arch in intel_archs:
			# Possible other x86 vendors (from arch/x86/kernel/cpu/*):
			# "CentaurHauls", "CyrixInstead", "Geode by NSC", "HygonGenuine", "GenuineTMx86",
			# "TransmetaCPU", "UMC UMC UMC"
			cpu = procfs.cpuinfo()
			vendor = cpu.tags.get("vendor_id")
			if vendor == "GenuineIntel":
				self._is_intel = True
			elif vendor == "AuthenticAMD" or vendor == "HygonGenuine":
				self._is_amd = True
			else:
				# We always assign Intel, unless we know better
				self._is_intel = True
			log.info("We are running on an x86 %s platform" % vendor)
		else:
			log.info("We are running on %s (non x86)" % self._arch)

		if self._is_intel:
			# Check for x86_energy_perf_policy, ignore if not available / supported
			self._check_energy_perf_bias()
			# Check for intel_pstate
			self._check_intel_pstate()

		if self._is_amd:
			# Check for amd-pstate
			self._check_amd_pstate()

	def _check_energy_perf_bias(self):
		self._has_energy_perf_bias = False
		retcode_unsupported = 1
		retcode, out = self._cmd.execute(["x86_energy_perf_policy", "-r"], no_errors = [errno.ENOENT, retcode_unsupported])
		# With recent versions of the tool, a zero exit code is
		# returned even if EPB is not supported. The output is empty
		# in that case, however.
		if retcode == 0 and out != "":
			self._has_energy_perf_bias = True
		elif retcode < 0:
			log.warning("unable to run x86_energy_perf_policy tool, ignoring CPU energy performance bias, is the tool installed?")
		else:
			log.warning("your CPU doesn't support MSR_IA32_ENERGY_PERF_BIAS, ignoring CPU energy performance bias")

	def _check_intel_pstate(self):
		self._has_intel_pstate = os.path.exists("/sys/devices/system/cpu/intel_pstate")
		if self._has_intel_pstate:
			log.info("intel_pstate detected")

	def _check_amd_pstate(self):
		self._has_amd_pstate = os.path.exists("/sys/devices/system/cpu/amd_pstate")
		if self._has_amd_pstate:
			log.info("amd-pstate detected")

	def _get_cpuinfo_flags(self):
		if self._flags is None:
			self._flags = procfs.cpuinfo().tags.get("flags", [])
		return self._flags

	def _is_cpu_online(self, device):
		return self._cmd.is_cpu_online(str(device).replace("cpu", ""))

	def _cpu_has_scaling_governor(self, device):
		return os.path.exists("/sys/devices/system/cpu/%s/cpufreq/scaling_governor" % device)

	def _check_cpu_can_change_governor(self, device):
		if not self._is_cpu_online(device):
			log.debug("'%s' is not online, skipping" % device)
			return False
		if not self._cpu_has_scaling_governor(device):
			log.debug("there is no scaling governor fo '%s', skipping" % device)
			return False
		return True

	def _instance_init(self, instance):
		instance._has_static_tuning = True
		instance._has_dynamic_tuning = False

		# only the first instance of the plugin can control the latency
		if list(self._instances.values())[0] == instance:
			instance._first_instance = True
			try:
				self._cpu_latency_fd = os.open(consts.PATH_CPU_DMA_LATENCY, os.O_WRONLY)
			except OSError:
				log.error("Unable to open '%s', disabling PM_QoS control" % consts.PATH_CPU_DMA_LATENCY)
				self._has_pm_qos = False
			self._latency = None

			if instance.options["force_latency"] is None and instance.options["pm_qos_resume_latency_us"] is None:
				instance._load_monitor = self._monitors_repository.create("load", None)
				instance._has_dynamic_tuning = True
			else:
				instance._load_monitor = None

			self._check_arch()
		else:
			instance._first_instance = False
			log.info("Latency settings from non-first CPU plugin instance '%s' will be ignored." % instance.name)

		try:
			instance._first_device = list(instance.assigned_devices)[0]
		except IndexError:
			instance._first_device = None

	def _instance_cleanup(self, instance):
		if instance._first_instance:
			if self._has_pm_qos:
				os.close(self._cpu_latency_fd)
			if instance._load_monitor is not None:
				self._monitors_repository.delete(instance._load_monitor)

	def _get_intel_pstate_attr(self, attr):
		return self._cmd.read_file("/sys/devices/system/cpu/intel_pstate/%s" % attr, None).strip()

	def _set_intel_pstate_attr(self, attr, val):
		if val is not None:
			self._cmd.write_to_file("/sys/devices/system/cpu/intel_pstate/%s" % attr, val)

	def _getset_intel_pstate_attr(self, attr, value):
		if value is None:
			return None
		v = self._get_intel_pstate_attr(attr)
		self._set_intel_pstate_attr(attr, value)
		return v

	def _instance_apply_static(self, instance):
		super(CPULatencyPlugin, self)._instance_apply_static(instance)

		if not instance._first_instance:
			return

		force_latency_value = self._variables.expand(
			instance.options["force_latency"])
		if force_latency_value is not None:
			self._set_latency(force_latency_value)
		if self._has_intel_pstate:
			new_value = self._variables.expand(
				instance.options["min_perf_pct"])
			self._min_perf_pct_save = self._getset_intel_pstate_attr(
				"min_perf_pct", new_value)
			new_value = self._variables.expand(
				instance.options["max_perf_pct"])
			self._max_perf_pct_save = self._getset_intel_pstate_attr(
				"max_perf_pct", new_value)
			new_value = self._variables.expand(
				instance.options["no_turbo"])
			self._no_turbo_save = self._getset_intel_pstate_attr(
				"no_turbo", new_value)

	def _instance_unapply_static(self, instance, rollback = consts.ROLLBACK_SOFT):
		super(CPULatencyPlugin, self)._instance_unapply_static(instance, rollback)

		if instance._first_instance and self._has_intel_pstate:
			self._set_intel_pstate_attr("min_perf_pct", self._min_perf_pct_save)
			self._set_intel_pstate_attr("max_perf_pct", self._max_perf_pct_save)
			self._set_intel_pstate_attr("no_turbo", self._no_turbo_save)

	def _instance_apply_dynamic(self, instance, device):
		self._instance_update_dynamic(instance, device)

	def _instance_update_dynamic(self, instance, device):
		assert(instance._first_instance)
		if device != instance._first_device:
			return

		load = instance._load_monitor.get_load()["system"]
		if load < instance.options["load_threshold"]:
			self._set_latency(instance.options["latency_high"])
		else:
			self._set_latency(instance.options["latency_low"])

	def _instance_unapply_dynamic(self, instance, device):
		pass

	def _str2int(self, s):
		try:
			return int(s)
		except (ValueError, TypeError):
			return None

	def _read_cstates_latency(self):
		self.cstates_latency = {}
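		# maps C-state names to their exit latencies in microseconds,
		# e.g. {"C1": 2, "C6": 133} -- actual names and values are
		# hardware specific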
		for d in os.listdir(cpuidle_states_path):
			cstate_path = cpuidle_states_path + "/%s/" % d
			name = self._cmd.read_file(cstate_path + "name", err_ret = None, no_error = True)
			latency = self._cmd.read_file(cstate_path + "latency", err_ret = None, no_error = True)
			if name is not None and latency is not None:
				latency = self._str2int(latency)
				if latency is not None:
					self.cstates_latency[name.strip()] = latency

	def _get_latency_by_cstate_name(self, name, no_zero=False):
		log.debug("getting latency for cstate with name '%s'" % name)
		if self.cstates_latency is None:
			log.debug("reading cstates latency table")
			self._read_cstates_latency()
		latency = self.cstates_latency.get(name, None)
		if no_zero and latency == 0:
			log.debug("skipping latency 0 as set by param")
			return None
		log.debug("cstate name mapped to latency: %s" % str(latency))
		return latency

	def _get_latency_by_cstate_id(self, lid, no_zero=False):
		log.debug("getting latency for cstate with ID '%s'" % str(lid))
		lid = self._str2int(lid)
		if lid is None:
			log.debug("cstate ID is invalid")
			return None
		latency_path = cpuidle_states_path + "/%s/latency" % ("state%d" % lid)
		latency = self._str2int(self._cmd.read_file(latency_path, err_ret = None, no_error = True))
		if no_zero and latency == 0:
			log.debug("skipping latency 0 as set by param")
			return None
		log.debug("cstate ID mapped to latency: %s" % str(latency))
		return latency

	# returns (latency, skip), skip means we want to skip latency settings
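	# 'latency' may contain multiple '|'-separated alternatives,
	# e.g. "cstate.name:C6|cstate.id:4|10" (see the class docstring)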
	def _parse_latency(self, latency, allow_na=False):
		self.cstates_latency = None
		latencies = str(latency).split("|")
		log.debug("parsing latency '%s', allow_na '%s'" % (latency, allow_na))
		for latency in latencies:
			try:
				latency = int(latency)
				log.debug("parsed directly specified latency value: %d" % latency)
			except ValueError:
				if latency[0:18] == "cstate.id_no_zero:":
					latency = self._get_latency_by_cstate_id(latency[18:], no_zero=True)
				elif latency[0:10] == "cstate.id:":
					latency = self._get_latency_by_cstate_id(latency[10:])
				elif latency[0:20] == "cstate.name_no_zero:":
					latency = self._get_latency_by_cstate_name(latency[20:], no_zero=True)
				elif latency[0:12] == "cstate.name:":
					latency = self._get_latency_by_cstate_name(latency[12:])
				elif latency in ["none", "None"]:
					log.debug("latency 'none' specified")
					return None, True
				elif allow_na and latency == "n/a":
					log.debug("latency 'n/a' specified")
					pass
				else:
					log.debug("invalid latency specified: '%s'" % str(latency))
					latency = None
			if latency is not None:
				break
		return latency, False

	def _set_latency(self, latency):
		latency, skip = self._parse_latency(latency)
		if not skip and self._has_pm_qos:
			if latency is None:
				log.error("unable to evaluate latency value (probably wrong settings in the 'cpu' section of current profile), disabling PM QoS")
				self._has_pm_qos = False
			elif self._latency != latency:
				log.info("setting new cpu latency %d" % latency)
				latency_bin = struct.pack("i", latency)
				os.write(self._cpu_latency_fd, latency_bin)
				self._latency = latency

	def _get_available_governors(self, device):
		return self._cmd.read_file("/sys/devices/system/cpu/%s/cpufreq/scaling_available_governors" % device).strip().split()

	@command_set("governor", per_device=True)
	def _set_governor(self, governors, device, sim, remove):
		if not self._check_cpu_can_change_governor(device):
			return None
		governors = str(governors)
		governors = governors.split("|")
		governors = [governor.strip() for governor in governors]
		for governor in governors:
			if len(governor) == 0:
				log.error("The 'governor' option contains an empty value.")
				return None
		available_governors = self._get_available_governors(device)
		for governor in governors:
			if governor in available_governors:
				if not sim:
					log.info("setting governor '%s' on cpu '%s'"
							% (governor, device))
					self._cmd.write_to_file("/sys/devices/system/cpu/%s/cpufreq/scaling_governor"
							% device, governor, no_error = [errno.ENOENT] if remove else False)
				break
			elif not sim:
				log.debug("Ignoring governor '%s' on cpu '%s', it is not supported"
						% (governor, device))
		else:
			log.warn("None of the scaling governors is supported: %s"
					% ", ".join(governors))
			governor = None
		return governor

	@command_get("governor")
	def _get_governor(self, device, ignore_missing=False):
		governor = None
		if not self._check_cpu_can_change_governor(device):
			return None
		data = self._cmd.read_file("/sys/devices/system/cpu/%s/cpufreq/scaling_governor" % device, no_error=ignore_missing).strip()
		if len(data) > 0:
			governor = data

		if governor is None:
			log.error("could not get current governor on cpu '%s'" % device)

		return governor

	def _sampling_down_factor_path(self, governor = "ondemand"):
		return "/sys/devices/system/cpu/cpufreq/%s/sampling_down_factor" % governor

	@command_set("sampling_down_factor", per_device = True, priority = 10)
	def _set_sampling_down_factor(self, sampling_down_factor, device, sim, remove):
		val = None

		# hack to clear governors map when the profile starts unloading
		# TODO: this should be handled better way, by e.g. currently non-implemented
		# Plugin.profile_load_finished() method
		if device in self._governors_map:
			self._governors_map.clear()

		self._governors_map[device] = None
		governor = self._get_governor(device)
		if governor is None:
			log.debug("ignoring sampling_down_factor setting for CPU '%s', cannot match governor" % device)
			return None
		if governor not in list(self._governors_map.values()):
			self._governors_map[device] = governor
			path = self._sampling_down_factor_path(governor)
			if not os.path.exists(path):
				log.debug("ignoring sampling_down_factor setting for CPU '%s', governor '%s' doesn't support it" % (device, governor))
				return None
			val = str(sampling_down_factor)
			if not sim:
				log.info("setting sampling_down_factor to '%s' for governor '%s'" % (val, governor))
				self._cmd.write_to_file(path, val, no_error = [errno.ENOENT] if remove else False)
		return val

	@command_get("sampling_down_factor")
	def _get_sampling_down_factor(self, device, ignore_missing=False):
		governor = self._get_governor(device, ignore_missing=ignore_missing)
		if governor is None:
			return None
		path = self._sampling_down_factor_path(governor)
		if not os.path.exists(path):
			return None
		return self._cmd.read_file(path).strip()

	def _try_set_energy_perf_bias(self, cpu_id, value):
		(retcode, out, err_msg) = self._cmd.execute(
				["x86_energy_perf_policy",
				"-c", cpu_id,
				str(value)
				],
				return_err = True)
		return (retcode, err_msg)

	def _pstate_preference_path(self, cpu_id, available = False):
		return "/sys/devices/system/cpu/cpufreq/policy%s/energy_performance_%s" % (cpu_id, "available_preferences" if available else "preference")

	def _energy_perf_bias_path(self, cpu_id):
		return "/sys/devices/system/cpu/cpu%s/power/energy_perf_bias" % cpu_id

	@command_set("energy_perf_bias", per_device=True)
	def _set_energy_perf_bias(self, energy_perf_bias, device, sim, remove):
		if not self._is_cpu_online(device):
			log.debug("%s is not online, skipping" % device)
			return None
		cpu_id = device.lstrip("cpu")
		vals = energy_perf_bias.split('|')

		# It should be written straight to the sysfs energy_perf_bias file if requested on newer processors
		# see rhbz#2095829
		if consts.CFG_CPU_EPP_FLAG in self._get_cpuinfo_flags():
			energy_perf_bias_path = self._energy_perf_bias_path(cpu_id)
			if os.path.exists(energy_perf_bias_path):
				if not sim:
					for val in vals:
						val = val.strip()
						if self._cmd.write_to_file(energy_perf_bias_path, val, \
							no_error = [errno.ENOENT] if remove else False):
								log.info("energy_perf_bias successfully set to '%s' on cpu '%s'"
										 % (val, device))
								break
					else:
						log.error("Failed to set energy_perf_bias on cpu '%s'. Is the value in the profile correct?"
								  % device)
					
				return str(energy_perf_bias)
			else:
				log.error("Failed to set energy_perf_bias on cpu '%s' because energy_perf_bias file does not exist."
						  % device)
				return None
		elif self._has_energy_perf_bias:
			if not sim:
				for val in vals:
					val = val.strip()
					log.debug("Trying to set energy_perf_bias to '%s' on cpu '%s'"
							% (val, device))
					(retcode, err_msg) = self._try_set_energy_perf_bias(
							cpu_id, val)
					if retcode == 0:
						log.info("energy_perf_bias successfully set to '%s' on cpu '%s'"
								% (val, device))
						break
					elif retcode < 0:
						log.error("Failed to set energy_perf_bias: %s"
								% err_msg)
						break
					else:
						log.debug("Could not set energy_perf_bias to '%s' on cpu '%s', trying another value"
								% (val, device))
				else:
					log.error("Failed to set energy_perf_bias on cpu '%s'. Is the value in the profile correct?"
							% device)
			return str(energy_perf_bias)
		else:
			return None

	def _try_parse_num(self, s):
		try:
			v = int(s)
		except ValueError as e:
			try:
				v = int(s, 16)
			except ValueError as e:
				v = s
		return v

	# Before Linux 4.13
	def _energy_perf_policy_to_human(self, s):
		return {0:"performance", 6:"normal", 15:"powersave"}.get(self._try_parse_num(s), s)

	# Since Linux 4.13
	def _energy_perf_policy_to_human_v2(self, s):
		return {0:"performance",
				4:"balance-performance",
				6:"normal",
				8:"balance-power",
				15:"power",
				}.get(self._try_parse_num(s), s)

	@command_get("energy_perf_bias")
	def _get_energy_perf_bias(self, device, ignore_missing=False):
		energy_perf_bias = None
		if not self._is_cpu_online(device):
			log.debug("%s is not online, skipping" % device)
			return None
		cpu_id = device.lstrip("cpu")
		if consts.CFG_CPU_EPP_FLAG in self._get_cpuinfo_flags():
			energy_perf_bias_path = self._energy_perf_bias_path(cpu_id)
			if os.path.exists(energy_perf_bias_path):
				energy_perf_bias = self._energy_perf_policy_to_human_v2(self._cmd.read_file(energy_perf_bias_path))
		elif self._has_energy_perf_bias:
			retcode, lines = self._cmd.execute(["x86_energy_perf_policy", "-c", cpu_id, "-r"])
			if retcode == 0:
				for line in lines.splitlines():
					l = line.split()
					if len(l) == 2:
						energy_perf_bias = self._energy_perf_policy_to_human(l[1])
						break
					elif len(l) == 3:
						energy_perf_bias = self._energy_perf_policy_to_human_v2(l[2])
						break

		return energy_perf_bias

	def _pm_qos_resume_latency_us_path(self, device):
		return "/sys/devices/system/cpu/%s/power/pm_qos_resume_latency_us" % device

	def _check_pm_qos_resume_latency_us(self, device):
		if self._has_pm_qos_resume_latency_us is None:
			self._has_pm_qos_resume_latency_us = os.path.exists(self._pm_qos_resume_latency_us_path(device))
			if not self._has_pm_qos_resume_latency_us:
				log.info("Option 'pm_qos_resume_latency_us' is not supported on current hardware.")
		return self._has_pm_qos_resume_latency_us

	@command_set("pm_qos_resume_latency_us", per_device=True)
	def _set_pm_qos_resume_latency_us(self, pm_qos_resume_latency_us, device, sim, remove):
		if not self._is_cpu_online(device):
			log.debug("%s is not online, skipping" % device)
			return None
		latency, skip = self._parse_latency(pm_qos_resume_latency_us, allow_na=True)
		if skip or not self._check_pm_qos_resume_latency_us(device):
			return None
		if latency is None or (latency != "n/a" and latency < 0):
			log.warning("Invalid pm_qos_resume_latency_us specified: '%s', cpu: '%s'." % (pm_qos_resume_latency_us, device))
			return None
		if not sim:
			self._cmd.write_to_file(self._pm_qos_resume_latency_us_path(device), latency, \
				no_error = [errno.ENOENT] if remove else False)
		return latency

	@command_get("pm_qos_resume_latency_us")
	def _get_pm_qos_resume_latency_us(self, device, ignore_missing=False):
		if not self._is_cpu_online(device):
			log.debug("%s is not online, skipping" % device)
			return None
		if not self._check_pm_qos_resume_latency_us(device):
			return None
		return self._cmd.read_file(self._pm_qos_resume_latency_us_path(device), no_error=ignore_missing).strip()

	@command_set("energy_performance_preference", per_device=True)
	def _set_energy_performance_preference(self, energy_performance_preference, device, sim, remove):
		if not self._is_cpu_online(device):
			log.debug("%s is not online, skipping" % device)
			return None
		cpu_id = device.lstrip("cpu")
		if os.path.exists(self._pstate_preference_path(cpu_id, True)):
			vals = energy_performance_preference.split('|')
			if not sim:
				avail_vals = set(self._cmd.read_file(self._pstate_preference_path(cpu_id, True)).split())
				for val in vals:
					if val in avail_vals:
						self._cmd.write_to_file(self._pstate_preference_path(cpu_id), val, \
							no_error = [errno.ENOENT] if remove else False)
						log.info("Setting energy_performance_preference value '%s' for cpu '%s'" % (val, device))
						break
					else:
						log.warn("energy_performance_preference value '%s' unavailable for cpu '%s'" % (val, device))
				else:
					log.error("Failed to set energy_performance_preference on cpu '%s'. Is the value in the profile correct?"
							  % device)
			return str(energy_performance_preference)
		else:
			log.debug("energy_performance_available_preferences file missing, which can happen if the system is booted without a P-state driver.")
		return None

	@command_get("energy_performance_preference")
	def _get_energy_performance_preference(self, device, ignore_missing=False):
		if not self._is_cpu_online(device):
			log.debug("%s is not online, skipping" % device)
			return None
		cpu_id = device.lstrip("cpu")
		# read the EPP hint used by the intel_pstate and amd-pstate CPU scaling drivers
		if os.path.exists(self._pstate_preference_path(cpu_id, True)):
			return self._cmd.read_file(self._pstate_preference_path(cpu_id)).strip()
		else:
			log.debug("energy_performance_available_preferences file missing, which can happen if the system is booted without a P-state driver.")
		return None
site-packages/tuned/plugins/plugin_audio.py
from . import hotplug
from .decorators import *
import tuned.logs
from tuned.utils.commands import commands

import os
import errno
import struct
import glob

log = tuned.logs.get()
cmd = commands()

class AudioPlugin(hotplug.Plugin):
	"""
	`audio`::
	
	Sets audio cards power saving options. The plug-in sets the auto suspend
	timeout for audio codecs to the value specified by the [option]`timeout`
	option.
	+
	Currently, the `snd_hda_intel` and `snd_ac97_codec` codecs are
	supported and the [option]`timeout` value is in seconds. To disable
	auto suspend for these codecs, set the [option]`timeout` value
	to `0`. To enforce the controller reset, set the option
	[option]`reset_controller` to `true`. Note that power management
	is supported per module. Hence, the kernel module names are used as
	device names.
	+
	.Set the timeout value to 10s and enforce the controller reset
	====
	----
	[audio]
	timeout=10
	reset_controller=true
	----
	====
	"""

	def _init_devices(self):
		self._devices_supported = True
		self._assigned_devices = set()
		self._free_devices = set()

		for device in self._hardware_inventory.get_devices("sound").match_sys_name("card*"):
			module_name = self._device_module_name(device)
			if module_name in ["snd_hda_intel", "snd_ac97_codec"]:
				self._free_devices.add(module_name)

	def _instance_init(self, instance):
		instance._has_static_tuning = True
		instance._has_dynamic_tuning = False

	def _instance_cleanup(self, instance):
		pass

	def _device_module_name(self, device):
		try:
			return device.parent.driver
		except:
			return None

	@classmethod
	def _get_config_options(cls):
		return {
			"timeout":          0,
			"reset_controller": False,
		}

	def _timeout_path(self, device):
		return "/sys/module/%s/parameters/power_save" % device

	def _reset_controller_path(self, device):
		return "/sys/module/%s/parameters/power_save_controller" % device

	@command_set("timeout", per_device = True)
	def _set_timeout(self, value, device, sim, remove):
		try:
			timeout = int(value)
		except ValueError:
			log.error("timeout value '%s' is not integer" % value)
			return None
		if timeout >= 0:
			sys_file = self._timeout_path(device)
			if not sim:
				cmd.write_to_file(sys_file, "%d" % timeout, \
					no_error = [errno.ENOENT] if remove else False)
			return timeout
		else:
			return None

	@command_get("timeout")
	def _get_timeout(self, device, ignore_missing=False):
		sys_file = self._timeout_path(device)
		value = cmd.read_file(sys_file, no_error=ignore_missing)
		if len(value) > 0:
			return value
		return None

	@command_set("reset_controller", per_device = True)
	def _set_reset_controller(self, value, device, sim, remove):
		v = cmd.get_bool(value)
		sys_file = self._reset_controller_path(device)
		if os.path.exists(sys_file):
			if not sim:
				cmd.write_to_file(sys_file, v, \
					no_error = [errno.ENOENT] if remove else False)
			return v
		return None

	@command_get("reset_controller")
	def _get_reset_controller(self, device, ignore_missing=False):
		sys_file = self._reset_controller_path(device)
		if os.path.exists(sys_file):
			value = cmd.read_file(sys_file)
			if len(value) > 0:
				return cmd.get_bool(value)
		return None
site-packages/tuned/version.py
TUNED_VERSION_MAJOR = 2
TUNED_VERSION_MINOR = 22
TUNED_VERSION_PATCH = 1

TUNED_VERSION_STR = "%d.%d.%d" % (TUNED_VERSION_MAJOR, TUNED_VERSION_MINOR, TUNED_VERSION_PATCH)
site-packages/tuned/admin/__init__.py
from .admin import *
from .exceptions import *
from .dbus_controller import *
site-packages/tuned/admin/exceptions.py
import tuned.exceptions

class TunedAdminDBusException(tuned.exceptions.TunedException):
	pass
site-packages/tuned/admin/admin.py
from __future__ import print_function
import tuned.admin
from tuned.utils.commands import commands
from tuned.profiles import Locator as profiles_locator
from .exceptions import TunedAdminDBusException
from tuned.exceptions import TunedException
import tuned.consts as consts
from tuned.utils.profile_recommender import ProfileRecommender
import os
import sys
import errno
import time
import threading
import logging

class Admin(object):
	def __init__(self, dbus = True, debug = False, asynco = False,
			timeout = consts.ADMIN_TIMEOUT,
			log_level = logging.ERROR):
		self._dbus = dbus
		self._debug = debug
		self._async = asynco
		self._timeout = timeout
		self._cmd = commands(debug)
		self._profiles_locator = profiles_locator(consts.LOAD_DIRECTORIES)
		self._daemon_action_finished = threading.Event()
		self._daemon_action_profile = ""
		self._daemon_action_result = True
		self._daemon_action_errstr = ""
		self._controller = None
		self._log_token = None
		self._log_level = log_level
		self._profile_recommender = ProfileRecommender()
		if self._dbus:
			self._controller = tuned.admin.DBusController(consts.DBUS_BUS, consts.DBUS_INTERFACE, consts.DBUS_OBJECT, debug)
			try:
				self._controller.set_signal_handler(consts.SIGNAL_PROFILE_CHANGED, self._signal_profile_changed_cb)
			except TunedAdminDBusException as e:
				self._error(e)
				self._dbus = False

	def _error(self, message):
		print(message, file=sys.stderr)

	def _signal_profile_changed_cb(self, profile_name, result, errstr):
		# ignore successive signals if the signal is not yet processed
		if not self._daemon_action_finished.is_set():
			self._daemon_action_profile = profile_name
			self._daemon_action_result = result
			self._daemon_action_errstr = errstr
			self._daemon_action_finished.set()

	def _tuned_is_running(self):
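		# Signal 0 only probes for the existence of the process.
		# EPERM means the PID exists but belongs to another user, so
		# the daemon is considered running in that case as well.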
		try:
			os.kill(int(self._cmd.read_file(consts.PID_FILE)), 0)
		except OSError as e:
			return e.errno == errno.EPERM
		except (ValueError, IOError) as e:
			return False
		return True

	# run the action specified by the action_name with args
	def action(self, action_name, *args, **kwargs):
		if action_name is None or action_name == "":
			return False
		action = None
		action_dbus = None
		res = False
		try:
			action_dbus = getattr(self, "_action_dbus_" + action_name)
		except AttributeError as e:
			self._dbus = False
		try:
			action = getattr(self, "_action_" + action_name)
		except AttributeError as e:
			if not self._dbus:
				self._error(str(e) + ", action '%s' is not implemented" % action_name)
				return False
		if self._dbus:
			try:
				self._controller.set_on_exit_action(
						self._log_capture_finish)
				self._controller.set_action(action_dbus, *args, **kwargs)
				res = self._controller.run()
			except TunedAdminDBusException as e:
				self._error(e)
				self._dbus = False

		if not self._dbus:
			res = action(*args, **kwargs)
		return res

	def _print_profiles(self, profile_names):
		print("Available profiles:")
		for profile in profile_names:
			if profile[1] is not None and profile[1] != "":
				print(self._cmd.align_str("- %s" % profile[0], 30, "- %s" % profile[1]))
			else:
				print("- %s" % profile[0])

	def _action_dbus_list_profiles(self):
		try:
			profile_names = self._controller.profiles2()
		except TunedAdminDBusException as e:
			# fallback to older API
			profile_names = [(profile, "") for profile in self._controller.profiles()]
		self._print_profiles(profile_names)
		self._action_dbus_active()
		return self._controller.exit(True)

	def _action_list_profiles(self):
		self._print_profiles(self._profiles_locator.get_known_names_summary())
		self._action_active()
		return True

	def _dbus_get_active_profile(self):
		profile_name = self._controller.active_profile()
		if profile_name == "":
			profile_name = None
		self._controller.exit(True)
		return profile_name

	def _get_active_profile(self):
		profile_name, manual = self._cmd.get_active_profile()
		return profile_name

	def _get_profile_mode(self):
		(profile, manual) = self._cmd.get_active_profile()
		if manual is None:
			manual = profile is not None
		return consts.ACTIVE_PROFILE_MANUAL if manual else consts.ACTIVE_PROFILE_AUTO

	def _dbus_get_post_loaded_profile(self):
		profile_name = self._controller.post_loaded_profile()
		if profile_name == "":
			profile_name = None
		return profile_name

	def _get_post_loaded_profile(self):
		profile_name = self._cmd.get_post_loaded_profile()
		return profile_name

	def _print_profile_info(self, profile, profile_info):
		if profile_info[0] == True:
			print("Profile name:")
			print(profile_info[1])
			print()
			print("Profile summary:")
			print(profile_info[2])
			print()
			print("Profile description:")
			print(profile_info[3])
			return True
		else:
			print("Unable to get information about profile '%s'" % profile)
			return False

	def _action_dbus_profile_info(self, profile = ""):
		if profile == "":
			profile = self._dbus_get_active_profile()
		if profile:
			res = self._print_profile_info(profile, self._controller.profile_info(profile))
		else:
			print("No current active profile.")
			res = False
		return self._controller.exit(res)

	def _action_profile_info(self, profile = ""):
		if profile == "":
			try:
				profile = self._get_active_profile()
				if profile is None:
					print("No current active profile.")
					return False
			except TunedException as e:
				self._error(str(e))
				return False
		return self._print_profile_info(profile, self._profiles_locator.get_profile_attrs(profile, [consts.PROFILE_ATTR_SUMMARY, consts.PROFILE_ATTR_DESCRIPTION], ["", ""]))

	def _print_profile_name(self, profile_name):
		if profile_name is None:
			print("No current active profile.")
			return False
		else:
			print("Current active profile: %s" % profile_name)
		return True

	def _print_post_loaded_profile(self, profile_name):
		if profile_name:
			print("Current post-loaded profile: %s" % profile_name)

	def _action_dbus_active(self):
		active_profile = self._dbus_get_active_profile()
		res = self._print_profile_name(active_profile)
		if res:
			post_loaded_profile = self._dbus_get_post_loaded_profile()
			self._print_post_loaded_profile(post_loaded_profile)
		return self._controller.exit(res)

	def _action_active(self):
		try:
			profile_name = self._get_active_profile()
			post_loaded_profile = self._get_post_loaded_profile()
			# The result of the DBus call active_profile includes
			# the post-loaded profile, so add it here as well
			if post_loaded_profile:
				if profile_name:
					profile_name += " "
				else:
					profile_name = ""
				profile_name += post_loaded_profile
		except TunedException as e:
			self._error(str(e))
			return False
		if profile_name is not None and not self._tuned_is_running():
			print("It seems that tuned daemon is not running, preset profile is not activated.")
			print("Preset profile: %s" % profile_name)
			if post_loaded_profile:
				print("Preset post-loaded profile: %s" % post_loaded_profile)
			return True
		res = self._print_profile_name(profile_name)
		self._print_post_loaded_profile(post_loaded_profile)
		return res

	def _print_profile_mode(self, mode):
		print("Profile selection mode: " + mode)

	def _action_dbus_profile_mode(self):
		mode, error = self._controller.profile_mode()
		self._print_profile_mode(mode)
		if error != "":
			self._error(error)
			return self._controller.exit(False)
		return self._controller.exit(True)

	def _action_profile_mode(self):
		try:
			mode = self._get_profile_mode()
			self._print_profile_mode(mode)
			return True
		except TunedException as e:
			self._error(str(e))
			return False

	def _profile_print_status(self, ret, msg):
		if ret:
			if not self._controller.is_running() and not self._controller.start():
				self._error("Cannot enable the tuning.")
				ret = False
		else:
			self._error("Unable to switch profile: %s" % msg)
		return ret

	def _action_dbus_wait_profile(self, profile_name):
		if time.time() >= self._timestamp + self._timeout:
			print("Operation timed out after waiting %d seconds(s), you may try to increase timeout by using --timeout command line option or using --async." % self._timeout)
			return self._controller.exit(False)
		if self._daemon_action_finished.is_set():
			if self._daemon_action_profile == profile_name:
				if not self._daemon_action_result:
					print("Error changing profile: %s" % self._daemon_action_errstr)
					return self._controller.exit(False)
				return self._controller.exit(True)
		return False

	def _log_capture_finish(self):
		if self._log_token is None or self._log_token == "":
			return
		try:
			log_msgs = self._controller.log_capture_finish(
					self._log_token)
			self._log_token = None
			print(log_msgs, end = "", file = sys.stderr)
			sys.stderr.flush()
		except TunedAdminDBusException as e:
			self._error("Error: Failed to stop log capture. Restart the TuneD daemon to prevent a memory leak.")

	def _action_dbus_profile(self, profiles):
		if len(profiles) == 0:
			return self._action_dbus_list()
		profile_name = " ".join(profiles)
		if profile_name == "":
			return self._controller.exit(False)
		self._daemon_action_finished.clear()
		if not self._async and self._log_level is not None:
			# 25 seconds default DBus timeout + 5 secs safety margin
			timeout = self._timeout + 25 + 5
			self._log_token = self._controller.log_capture_start(
					self._log_level, timeout)
		(ret, msg) = self._controller.switch_profile(profile_name)
		if self._async or not ret:
			return self._controller.exit(self._profile_print_status(ret, msg))
		else:
			self._timestamp = time.time()
			self._controller.set_action(self._action_dbus_wait_profile, profile_name)
		return self._profile_print_status(ret, msg)

	def _restart_tuned(self):
		print("Trying to (re)start tuned...")
		(ret, msg) = self._cmd.execute(["service", "tuned", "restart"])
		if ret == 0:
			print("TuneD (re)started, changes applied.")
		else:
			print("TuneD (re)start failed, you need to (re)start TuneD by hand for changes to apply.")

	def _set_profile(self, profile_name, manual):
		if profile_name in self._profiles_locator.get_known_names():
			try:
				self._cmd.save_active_profile(profile_name, manual)
				self._restart_tuned()
				return True
			except TunedException as e:
				self._error(str(e))
				self._error("Unable to switch profile.")
				return False
		else:
			self._error("Requested profile '%s' doesn't exist." % profile_name)
			return False

	def _action_profile(self, profiles):
		if len(profiles) == 0:
			return self._action_list_profiles()
		profile_name = " ".join(profiles)
		if profile_name == "":
			return False
		return self._set_profile(profile_name, True)

	def _action_dbus_auto_profile(self):
		profile_name = self._controller.recommend_profile()
		self._daemon_action_finished.clear()
		if not self._async and self._log_level is not None:
			# 25 seconds default DBus timeout + 5 secs safety margin
			timeout = self._timeout + 25 + 5
			self._log_token = self._controller.log_capture_start(
					self._log_level, timeout)
		(ret, msg) = self._controller.auto_profile()
		if self._async or not ret:
			return self._controller.exit(self._profile_print_status(ret, msg))
		else:
			self._timestamp = time.time()
			self._controller.set_action(self._action_dbus_wait_profile, profile_name)
		return self._profile_print_status(ret, msg)

	def _action_auto_profile(self):
		profile_name = self._profile_recommender.recommend()
		return self._set_profile(profile_name, False)

	def _action_dbus_recommend_profile(self):
		print(self._controller.recommend_profile())
		return self._controller.exit(True)

	def _action_recommend_profile(self):
		print(self._profile_recommender.recommend())
		return True

	def _action_dbus_verify_profile(self, ignore_missing):
		if ignore_missing:
			ret = self._controller.verify_profile_ignore_missing()
		else:
			ret = self._controller.verify_profile()
		if ret:
			print("Verification succeeded, current system settings match the preset profile.")
		else:
			print("Verification failed, current system settings differ from the preset profile.")
			print("You can mostly fix this by restarting the TuneD daemon, e.g.:")
			print("  systemctl restart tuned")
			print("or")
			print("  service tuned restart")
			print("Sometimes (if some plugins like bootloader are used) a reboot may be required.")
		print("See TuneD log file ('%s') for details." % consts.LOG_FILE)
		return self._controller.exit(ret)

	def _action_verify_profile(self, ignore_missing):
		print("Not supported in no_daemon mode.")
		return False

	def _action_dbus_off(self):
		# 25 seconds default DBus timeout + 5 secs safety margin
		timeout = 25 + 5
		self._log_token = self._controller.log_capture_start(
				self._log_level, timeout)
		ret = self._controller.off()
		if not ret:
			self._error("Cannot disable active profile.")
		return self._controller.exit(ret)

	def _action_off(self):
		print("Not supported in no_daemon mode.")
		return False

	def _action_dbus_list(self, list_choice="profiles", verbose=False):
		"""Print accessible profiles or plugins got from TuneD dbus api

		Keyword arguments:
		list_choice -- argument from command line deciding what will be listed
		verbose -- if True, list each plugin's config options and their hints
			when available. Only effective when listing plugins; it is
			ignored when listing profiles
		"""
		if list_choice == "profiles":
			return self._action_dbus_list_profiles()
		elif list_choice == "plugins":
			return self._action_dbus_list_plugins(verbose=verbose)

	def _action_list(self, list_choice="profiles", verbose=False):
		"""Print accessible profiles or plugins with no daemon mode

		Keyword arguments:
		list_choice -- argument from command line deciding what will be listed
		verbose -- plugins cannot be listed in this mode; the argument exists
			only because the argparse module always supplies the verbose
			option and omitting it here would result in an error
		"""
		if list_choice == "profiles":
			return self._action_list_profiles()
		elif list_choice == "plugins":
			return self._action_list_plugins(verbose=verbose)

	def _action_dbus_list_plugins(self, verbose=False):
		"""Print accessible plugins

		Keyword arguments:
		verbose -- if set to True, parameters and hints are printed
		"""
		plugins = self._controller.get_plugins()
		for plugin in plugins.keys():
			print(plugin)
			if not verbose or len(plugins[plugin]) == 0:
				continue
			hints = self._controller.get_plugin_hints(plugin)
			for parameter in plugins[plugin]:
				print("\t%s" %(parameter))
				hint = hints.get(parameter, None)
				if hint:
					print("\t\t%s" %(hint))
		return self._controller.exit(True)

	def _action_list_plugins(self, verbose=False):
		print("Not supported in no_daemon mode.")
		return False

	def _action_dbus_instance_acquire_devices(self, devices, instance):
		(ret, msg) = self._controller.instance_acquire_devices(devices, instance)
		if not ret:
			self._error("Unable to acquire devices: %s" % msg)
		return self._controller.exit(ret)

	def _action_instance_acquire_devices(self, devices, instance):
		print("Not supported in no_daemon mode.")
		return False

	def _action_dbus_get_instances(self, plugin_name):
		(ret, msg, pairs) = self._controller.get_instances(plugin_name)
		if not ret:
			self._error("Unable to list instances: %s" % msg)
			return self._controller.exit(False)
		for instance, plugin in pairs:
			print("%s (%s)" % (instance, plugin))
		return self._controller.exit(True)

	def _action_get_instances(self, plugin_name):
		print("Not supported in no_daemon mode.")
		return False

	def _action_dbus_instance_get_devices(self, instance):
		(ret, msg, devices) = self._controller.instance_get_devices(instance)
		if not ret:
			self._error("Unable to list devices: %s" % msg)
			return self._controller.exit(False)
		for device in devices:
			print(device)
		return self._controller.exit(True)

	def _action_instance_get_devices(self, instance):
		print("Not supported in no_daemon mode.")
		return False
site-packages/tuned/admin/dbus_controller.py000064400000011527147511334670015266 0ustar00import dbus
import dbus.exceptions
import time
from dbus.mainloop.glib import DBusGMainLoop
from gi.repository import GLib, GObject
from .exceptions import TunedAdminDBusException

__all__ = ["DBusController"]

class DBusController(object):
	def __init__(self, bus_name, interface_name, object_name, debug = False):
		self._bus_name = bus_name
		self._interface_name = interface_name
		self._object_name = object_name
		self._proxy = None
		self._interface = None
		self._debug = debug
		self._main_loop = None
		self._action = None
		self._on_exit_action = None
		self._ret = True
		self._exit = False
		self._exception = None

	def _init_proxy(self):
		try:
			if self._proxy is None:
				DBusGMainLoop(set_as_default=True)
				self._main_loop = GLib.MainLoop()
				bus = dbus.SystemBus()
				self._proxy = bus.get_object(self._bus_name, self._object_name)
				self._interface = dbus.Interface(self._proxy, dbus_interface = self._interface_name)
		except dbus.exceptions.DBusException:
			raise TunedAdminDBusException("Cannot talk to TuneD daemon via DBus. Is TuneD daemon running?")

	def _idle(self):
		if self._action is not None:
			# This may (and very probably will) run in a child thread, so catch exceptions and pass them to the main thread
			try:
				self._action_exit_code = self._action(*self._action_args, **self._action_kwargs)
			except TunedAdminDBusException as e:
				self._exception = e
				self._exit = True

		if self._exit:
			if self._on_exit_action is not None:
				self._on_exit_action(*self._on_exit_action_args,
						**self._on_exit_action_kwargs)
			self._main_loop.quit()
			return False
		else:
			time.sleep(1)
		return True

	def set_on_exit_action(self, action, *args, **kwargs):
		self._on_exit_action = action
		self._on_exit_action_args = args
		self._on_exit_action_kwargs = kwargs

	def set_action(self, action, *args, **kwargs):
		self._action = action
		self._action_args = args
		self._action_kwargs = kwargs

	def run(self):
		self._exception = None
		GLib.idle_add(self._idle)
		self._main_loop.run()
		# Pass an exception that happened in the child thread to the caller
		if self._exception is not None:
			raise self._exception
		return self._ret
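	# Usage sketch (illustrative): this mirrors how the Admin class elsewhere in
	# this package drives the controller; "consts" refers to tuned.consts,
	# imported by the caller, and some_callable/arg1/arg2 are placeholders.
	# The action callable is expected to finish by calling controller.exit(ret).
	#
	#   controller = DBusController(consts.DBUS_BUS, consts.DBUS_INTERFACE,
	#                               consts.DBUS_OBJECT)
	#   controller.set_action(some_callable, arg1, arg2)
	#   ret = controller.run()  # runs the GLib main loop; returns the value passed to exit()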

	def _call(self, method_name, *args, **kwargs):
		self._init_proxy()

		try:
			method = self._interface.get_dbus_method(method_name)
			return method(*args, timeout=40)
		except dbus.exceptions.DBusException as dbus_exception:
			err_str = "DBus call to TuneD daemon failed"
			if self._debug:
				err_str += " (%s)" % str(dbus_exception)
			raise TunedAdminDBusException(err_str)

	def set_signal_handler(self, signal, cb):
		self._init_proxy()
		self._proxy.connect_to_signal(signal, cb)

	def is_running(self):
		return self._call("is_running")

	def start(self):
		return self._call("start")

	def stop(self):
		return self._call("stop")

	def profiles(self):
		return self._call("profiles")

	def profiles2(self):
		return self._call("profiles2")

	def profile_info(self, profile_name):
		return self._call("profile_info", profile_name)

	def log_capture_start(self, log_level, timeout):
		return self._call("log_capture_start", log_level, timeout)

	def log_capture_finish(self, token):
		return self._call("log_capture_finish", token)

	def active_profile(self):
		return self._call("active_profile")

	def profile_mode(self):
		return self._call("profile_mode")

	def post_loaded_profile(self):
		return self._call("post_loaded_profile")

	def switch_profile(self, new_profile):
		if new_profile == "":
			return (False, "No profile specified")
		return self._call("switch_profile", new_profile)

	def auto_profile(self):
		return self._call("auto_profile")

	def recommend_profile(self):
		return self._call("recommend_profile")

	def verify_profile(self):
		return self._call("verify_profile")

	def verify_profile_ignore_missing(self):
		return self._call("verify_profile_ignore_missing")

	def off(self):
		return self._call("disable")

	def get_plugins(self):
		"""Return dict with plugin names and their hints

		Return:
		dictionary -- {plugin_name: {parameter_name: default_value}}
		"""
		return self._call("get_all_plugins")

	def get_plugin_documentation(self, plugin_name):
		"""Return docstring of plugin's class"""
		return self._call("get_plugin_documentation", plugin_name)

	def get_plugin_hints(self, plugin_name):
		"""Return dictionary with parameters of plugin and their hints

		Parameters:
		plugin_name -- name of plugin

		Return:
		dictionary -- {parameter_name: hint}
		"""
		return self._call("get_plugin_hints", plugin_name)

	def instance_acquire_devices(self, devices, instance):
		return self._call("instance_acquire_devices", devices, instance)

	def get_instances(self, plugin_name):
		return self._call("get_instances", plugin_name)

	def instance_get_devices(self, instance):
		return self._call("instance_get_devices", instance)

	def exit(self, ret):
		self.set_action(None)
		self._ret = ret
		self._exit = True
		return ret
site-packages/tuned/patterns.py000064400000000517147511334670012633 0ustar00class Singleton(object):
	"""
	Singleton design pattern.
	"""

	_instance = None

	def __init__(self):
		if self.__class__ is Singleton:
			raise TypeError("Cannot instantiate directly.")

	@classmethod
	def get_instance(cls):
		"""Get the class instance."""
		if cls._instance is None:
			cls._instance = cls()
		return cls._instance
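
# Usage sketch (illustrative): get_instance() lazily creates one instance per
# subclass and returns the same object on every later call.
#
#   class MyService(Singleton):
#       pass
#
#   a = MyService.get_instance()
#   b = MyService.get_instance()
#   assert a is b
#
# Instantiating Singleton directly raises TypeError.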
site-packages/tuned/consts.py000064400000017325147511334670012311 0ustar00import logging

GLOBAL_CONFIG_FILE = "/etc/tuned/tuned-main.conf"
ACTIVE_PROFILE_FILE = "/etc/tuned/active_profile"
PROFILE_MODE_FILE = "/etc/tuned/profile_mode"
POST_LOADED_PROFILE_FILE = "/etc/tuned/post_loaded_profile"
PROFILE_FILE = "tuned.conf"
RECOMMEND_CONF_FILE = "/etc/tuned/recommend.conf"
DAEMONIZE_PARENT_TIMEOUT = 5
NAMESPACE = "com.redhat.tuned"
DBUS_BUS = NAMESPACE
DBUS_INTERFACE = "com.redhat.tuned.control"
DBUS_OBJECT = "/Tuned"
DEFAULT_PROFILE = "balanced"
DEFAULT_STORAGE_FILE = "/run/tuned/save.pickle"
LOAD_DIRECTORIES = ["/usr/lib/tuned", "/etc/tuned"]
PERSISTENT_STORAGE_DIR = "/var/lib/tuned"
PLUGIN_MAIN_UNIT_NAME = "main"
# Magic section header because ConfigParser does not support "headerless" config
MAGIC_HEADER_NAME = "this_is_some_magic_section_header_because_of_compatibility"
RECOMMEND_DIRECTORIES = ["/usr/lib/tuned/recommend.d", "/etc/tuned/recommend.d"]

TMP_FILE_SUFFIX = ".tmp"
# maximum number of consecutive errors before giving up
ERROR_THRESHOLD = 3

# bootloader plugin configuration
BOOT_DIR = "/boot"
GRUB2_CFG_FILES = ["/etc/grub2.cfg", "/etc/grub2-efi.cfg"]
GRUB2_CFG_DIR = "/etc/grub.d"
GRUB2_TUNED_TEMPLATE_NAME = "00_tuned"
GRUB2_TUNED_TEMPLATE_PATH = GRUB2_CFG_DIR + "/" + GRUB2_TUNED_TEMPLATE_NAME
GRUB2_TEMPLATE_HEADER_BEGIN = "### BEGIN /etc/grub.d/" + GRUB2_TUNED_TEMPLATE_NAME +  " ###"
GRUB2_TEMPLATE_HEADER_END = "### END /etc/grub.d/" + GRUB2_TUNED_TEMPLATE_NAME +  " ###"
GRUB2_TUNED_VAR = "tuned_params"
GRUB2_TUNED_INITRD_VAR = "tuned_initrd"
GRUB2_DEFAULT_ENV_FILE = "/etc/default/grub"
INITRD_IMAGE_DIR = "/boot"
BOOT_CMDLINE_TUNED_VAR = "TUNED_BOOT_CMDLINE"
BOOT_CMDLINE_INITRD_ADD_VAR = "TUNED_BOOT_INITRD_ADD"
BOOT_CMDLINE_KARGS_DELETED_VAR = "TUNED_BOOT_KARGS_DELETED"
BOOT_CMDLINE_FILE = "/etc/tuned/bootcmdline"
PETITBOOT_DETECT_DIR = "/sys/firmware/opal"
MACHINE_ID_FILE = "/etc/machine-id"
KERNEL_UPDATE_HOOK_FILE = "/usr/lib/kernel/install.d/92-tuned.install"
BLS_ENTRIES_PATH = "/boot/loader/entries"

# scheduler plugin configuration
# how many times to retry moving tasks to the parent cgroup during cgroup cleanup
CGROUP_CLEANUP_TASKS_RETRY = 10
PROCFS_MOUNT_POINT = "/proc"
DEF_CGROUP_MOUNT_POINT = "/sys/fs/cgroup/cpuset"
DEF_CGROUP_MODE = 0o770

# service plugin configuration
SERVICE_SYSTEMD_CFG_PATH = "/etc/systemd/system/%s.service.d"
DEF_SERVICE_CFG_DIR_MODE = 0o755

# modules plugin configuration
MODULES_FILE = "/etc/modprobe.d/tuned.conf"

# systemd plugin configuration
SYSTEMD_SYSTEM_CONF_FILE = "/etc/systemd/system.conf"
SYSTEMD_CPUAFFINITY_VAR = "CPUAffinity"

# irqbalance plugin configuration
IRQBALANCE_SYSCONFIG_FILE = "/etc/sysconfig/irqbalance"

# acpi plugin configuration
ACPI_DIR = "/sys/firmware/acpi"

# built-in functions configuration
SYSFS_CPUS_PATH = "/sys/devices/system/cpu"

# number of backups
LOG_FILE_COUNT = 2
LOG_FILE_MAXBYTES = 100*1000
LOG_FILE = "/var/log/tuned/tuned.log"
PID_FILE = "/run/tuned/tuned.pid"
SYSTEM_RELEASE_FILE = "/etc/system-release-cpe"
# prefix for functions plugins
FUNCTION_PREFIX = "function_"
# prefix for exported environment variables when calling scripts
ENV_PREFIX = "TUNED_"
ROLLBACK_NONE = 0
ROLLBACK_SOFT = 1
ROLLBACK_FULL = 2

# tuned-gui
PREFIX_PROFILE_FACTORY = "System"
PREFIX_PROFILE_USER = "User"

# PPD-to-tuned API translation daemon configuration
PPD_NAMESPACE = "net.hadess.PowerProfiles"
PPD_DBUS_BUS = PPD_NAMESPACE
PPD_DBUS_OBJECT = "/net/hadess/PowerProfiles"
PPD_DBUS_INTERFACE = PPD_DBUS_BUS
PPD_CONFIG_FILE = "/etc/tuned/ppd.conf"

# After adding a new option to tuned-main.conf, add its name here with the CFG_ prefix,
# optionally a default value with the CFG_DEF_ prefix (the default is None),
# and a checking function with the CFG_FUNC_ prefix
# (see configobj for the available methods; the default is get, i.e. a string value)
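# Example (hypothetical option "foo_bar"):
#   CFG_FOO_BAR = "foo_bar"
#   CFG_DEF_FOO_BAR = False
#   CFG_FUNC_FOO_BAR = "getboolean"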
CFG_DAEMON = "daemon"
CFG_DYNAMIC_TUNING = "dynamic_tuning"
CFG_SLEEP_INTERVAL = "sleep_interval"
CFG_UPDATE_INTERVAL = "update_interval"
CFG_RECOMMEND_COMMAND = "recommend_command"
CFG_REAPPLY_SYSCTL = "reapply_sysctl"
CFG_DEFAULT_INSTANCE_PRIORITY = "default_instance_priority"
CFG_UDEV_BUFFER_SIZE = "udev_buffer_size"
CFG_LOG_FILE_COUNT = "log_file_count"
CFG_LOG_FILE_MAX_SIZE = "log_file_max_size"
CFG_UNAME_STRING = "uname_string"
CFG_CPUINFO_STRING = "cpuinfo_string"
CFG_ENABLE_DBUS = "enable_dbus"
CFG_ENABLE_UNIX_SOCKET = "enable_unix_socket"
CFG_UNIX_SOCKET_PATH = "unix_socket_path"
CFG_UNIX_SOCKET_SIGNAL_PATHS = "unix_socket_signal_paths"
CFG_UNIX_SOCKET_OWNERSHIP = "unix_socket_ownership"
CFG_UNIX_SOCKET_PERMISIONS = "unix_socket_permissions"
CFG_UNIX_SOCKET_CONNECTIONS_BACKLOG = "connections_backlog"
CFG_CPU_EPP_FLAG = "hwp_epp"
CFG_ROLLBACK = "rollback"

# no_daemon mode
CFG_DEF_DAEMON = True
CFG_FUNC_DAEMON = "getboolean"
# default configuration
CFG_DEF_DYNAMIC_TUNING = True
CFG_FUNC_DYNAMIC_TUNING = "getboolean"
# how long to sleep before checking for events (in seconds)
CFG_DEF_SLEEP_INTERVAL = 1
CFG_FUNC_SLEEP_INTERVAL = "getint"
# update interval for dynamic tuning (in seconds)
CFG_DEF_UPDATE_INTERVAL = 10
CFG_FUNC_UPDATE_INTERVAL = "getint"
# recommend command availability
CFG_DEF_RECOMMEND_COMMAND = True
CFG_FUNC_RECOMMEND_COMMAND = "getboolean"
# reapply system sysctl
CFG_DEF_REAPPLY_SYSCTL = True
CFG_FUNC_REAPPLY_SYSCTL = "getboolean"
# default instance priority
CFG_DEF_DEFAULT_INSTANCE_PRIORITY = 0
CFG_FUNC_DEFAULT_INSTANCE_PRIORITY = "getint"
# default pyudev.Monitor buffer size
CFG_DEF_UDEV_BUFFER_SIZE = 1024 * 1024
# default log file count
CFG_DEF_LOG_FILE_COUNT = 2
CFG_FUNC_LOG_FILE_COUNT = "getint"
# default log file max size
CFG_DEF_LOG_FILE_MAX_SIZE = 1024 * 1024
# default listening on dbus
CFG_DEF_ENABLE_DBUS = True
CFG_FUNC_ENABLE_DBUS = "getboolean"
# default listening on unix socket
# as it is not commonly used, it is disabled by default
CFG_DEF_ENABLE_UNIX_SOCKET = False
CFG_FUNC_ENABLE_UNIX_SOCKET = "getboolean"
# default unix socket path
CFG_DEF_UNIX_SOCKET_PATH = "/run/tuned/tuned.sock"
CFG_DEF_UNIX_SOCKET_SIGNAL_PATHS = ""
# default unix socket ownership
# (uid and gid, python2 does not support names out of box, -1 leaves default)
CFG_DEF_UNIX_SOCKET_OWNERSHIP = "-1 -1"
# default unix socket permissions
CFG_DEF_UNIX_SOCKET_PERMISIONS = "0o600"
# default unix socket connections backlog
CFG_DEF_UNIX_SOCKET_CONNECTIONS_BACKLOG = "1024"
CFG_FUNC_UNIX_SOCKET_CONNECTIONS_BACKLOG = "getint"
# default rollback strategy
CFG_DEF_ROLLBACK = "auto"

PATH_CPU_DMA_LATENCY = "/dev/cpu_dma_latency"

# profile attributes which can be specified in the main section
PROFILE_ATTR_SUMMARY = "summary"
PROFILE_ATTR_DESCRIPTION = "description"

SIGNAL_PROFILE_CHANGED = "profile_changed"

STR_HINT_REBOOT = "you need to reboot for changes to take effect"

STR_VERIFY_PROFILE_DEVICE_VALUE_OK = "verify: passed: device %s: '%s' = '%s'"
STR_VERIFY_PROFILE_VALUE_OK = "verify: passed: '%s' = '%s'"
STR_VERIFY_PROFILE_OK = "verify: passed: '%s'"
STR_VERIFY_PROFILE_DEVICE_VALUE_MISSING = "verify: skipped, missing: device %s: '%s'"
STR_VERIFY_PROFILE_VALUE_MISSING = "verify: skipped, missing: '%s'"
STR_VERIFY_PROFILE_DEVICE_VALUE_FAIL = "verify: failed: device %s: '%s' = '%s', expected '%s'"
STR_VERIFY_PROFILE_VALUE_FAIL = "verify: failed: '%s' = '%s', expected '%s'"
STR_VERIFY_PROFILE_CMDLINE_FAIL = "verify: failed: cmdline arg '%s', expected '%s'"
STR_VERIFY_PROFILE_CMDLINE_FAIL_MISSING = "verify: failed: cmdline arg '%s' is missing, expected '%s'"
STR_VERIFY_PROFILE_FAIL = "verify: failed: '%s'"

# timeout for tuned-adm operations in seconds
ADMIN_TIMEOUT = 600

# Strings for /etc/tuned/profile_mode specifying if the active profile
# was set automatically or manually
ACTIVE_PROFILE_AUTO = "auto"
ACTIVE_PROFILE_MANUAL = "manual"

LOG_LEVEL_CONSOLE = 60
LOG_LEVEL_CONSOLE_NAME = "CONSOLE"
CAPTURE_LOG_LEVEL = "console"
CAPTURE_LOG_LEVELS = {
		"debug": logging.DEBUG,
		"info": logging.INFO,
		"warn": logging.WARN,
		"error": logging.ERROR,
		"console": LOG_LEVEL_CONSOLE,
		"none": None,
		}
ROLLBACK_FULLZPREFIX_PROFILE_FACTORYZPREFIX_PROFILE_USERZ
PPD_NAMESPACEZPPD_DBUS_BUSZPPD_DBUS_OBJECTZPPD_DBUS_INTERFACEZPPD_CONFIG_FILEZ
CFG_DAEMONZCFG_DYNAMIC_TUNINGZCFG_SLEEP_INTERVALZCFG_UPDATE_INTERVALZCFG_RECOMMEND_COMMANDZCFG_REAPPLY_SYSCTLZCFG_DEFAULT_INSTANCE_PRIORITYZCFG_UDEV_BUFFER_SIZEZCFG_LOG_FILE_COUNTZCFG_LOG_FILE_MAX_SIZEZCFG_UNAME_STRINGZCFG_CPUINFO_STRINGZCFG_ENABLE_DBUSZCFG_ENABLE_UNIX_SOCKETZCFG_UNIX_SOCKET_PATHZCFG_UNIX_SOCKET_SIGNAL_PATHSZCFG_UNIX_SOCKET_OWNERSHIPZCFG_UNIX_SOCKET_PERMISIONSZ#CFG_UNIX_SOCKET_CONNECTIONS_BACKLOGZCFG_CPU_EPP_FLAGZCFG_ROLLBACKZCFG_DEF_DAEMONZCFG_FUNC_DAEMONZCFG_DEF_DYNAMIC_TUNINGZCFG_FUNC_DYNAMIC_TUNINGZCFG_DEF_SLEEP_INTERVALZCFG_FUNC_SLEEP_INTERVALZCFG_DEF_UPDATE_INTERVALZCFG_FUNC_UPDATE_INTERVALZCFG_DEF_RECOMMEND_COMMANDZCFG_FUNC_RECOMMEND_COMMANDZCFG_DEF_REAPPLY_SYSCTLZCFG_FUNC_REAPPLY_SYSCTLZ!CFG_DEF_DEFAULT_INSTANCE_PRIORITYZ"CFG_FUNC_DEFAULT_INSTANCE_PRIORITYZCFG_DEF_UDEV_BUFFER_SIZEZCFG_DEF_LOG_FILE_COUNTZCFG_FUNC_LOG_FILE_COUNTZCFG_DEF_LOG_FILE_MAX_SIZEZCFG_DEF_ENABLE_DBUSZCFG_FUNC_ENABLE_DBUSZCFG_DEF_ENABLE_UNIX_SOCKETZCFG_FUNC_ENABLE_UNIX_SOCKETZCFG_DEF_UNIX_SOCKET_PATHZ CFG_DEF_UNIX_SOCKET_SIGNAL_PATHSZCFG_DEF_UNIX_SOCKET_OWNERSHIPZCFG_DEF_UNIX_SOCKET_PERMISIONSZ'CFG_DEF_UNIX_SOCKET_CONNECTIONS_BACKLOGZ(CFG_FUNC_UNIX_SOCKET_CONNECTIONS_BACKLOGZCFG_DEF_ROLLBACKZPATH_CPU_DMA_LATENCYZPROFILE_ATTR_SUMMARYZPROFILE_ATTR_DESCRIPTIONZSIGNAL_PROFILE_CHANGEDZSTR_HINT_REBOOTZ"STR_VERIFY_PROFILE_DEVICE_VALUE_OKZSTR_VERIFY_PROFILE_VALUE_OKZSTR_VERIFY_PROFILE_OKZ'STR_VERIFY_PROFILE_DEVICE_VALUE_MISSINGZ STR_VERIFY_PROFILE_VALUE_MISSINGZ$STR_VERIFY_PROFILE_DEVICE_VALUE_FAILZSTR_VERIFY_PROFILE_VALUE_FAILZSTR_VERIFY_PROFILE_CMDLINE_FAILZ'STR_VERIFY_PROFILE_CMDLINE_FAIL_MISSINGZSTR_VERIFY_PROFILE_FAILZ
ADMIN_TIMEOUTZACTIVE_PROFILE_AUTOZACTIVE_PROFILE_MANUALZLOG_LEVEL_CONSOLEZLOG_LEVEL_CONSOLE_NAMEZCAPTURE_LOG_LEVEL�DEBUG�INFOZWARNZERRORZCAPTURE_LOG_LEVELS�rr�/usr/lib/python3.6/consts.py�<module>s"site-packages/tuned/__pycache__/version.cpython-36.pyc000064400000000370147511334670016741 0ustar003

�<�e��@sdZdZdZdeeefZdS)���z%d.%d.%dN)ZTUNED_VERSION_MAJORZTUNED_VERSION_MINORZTUNED_VERSION_PATCHZTUNED_VERSION_STR�rr�/usr/lib/python3.6/version.py�<module>ssite-packages/tuned/__pycache__/exceptions.cpython-36.pyc000064400000001662147511334670017442 0ustar003

�<�e8�@s6ddlZddlZddlZejj�ZGdd�de�ZdS)�Nc@s"eZdZdZddd�Zdd�ZdS)�TunedExceptionz
	NcCs(|dkrt}|jt|��|j|�dS)N)�exception_logger�error�str�
_log_trace)�self�logger�r	� /usr/lib/python3.6/exceptions.py�logszTunedException.logcCsHtj�\}}}||kr"|jd�n"djtj|||��j�}|j|�dS)Nz"stack trace is no longer available�)�sys�exc_info�debug�join�	traceback�format_exception�rstrip)rr�exc_type�	exc_value�
exc_tracebackZexception_infor	r	r
rs
zTunedException._log_trace)N)�__name__�
__module__�__qualname__�__doc__rrr	r	r	r
rs
r)	Z
tuned.logsZtunedr
rZlogs�getr�	Exceptionrr	r	r	r
�<module>s
site-packages/tuned/__pycache__/patterns.cpython-36.pyc000064400000001312147511334670017111 0ustar003

�<�eO�@sGdd�de�ZdS)c@s(eZdZdZdZdd�Zedd��ZdS)�	Singletonz
	Singleton design pattern.
	NcCs|jtkrtd��dS)NzCannot instantiate directly.)�	__class__r�	TypeError)�self�r�/usr/lib/python3.6/patterns.py�__init__s
zSingleton.__init__cCs|jdkr|�|_|jS)zGet the class instance.N)�	_instance)�clsrrr�get_instances
zSingleton.get_instance)�__name__�
__module__�__qualname__�__doc__rr�classmethodr
rrrrrsrN)�objectrrrrr�<module>ssite-packages/tuned/exports/controller.py000064400000006722147511334670014666 0ustar00from . import interfaces
import inspect
import tuned.patterns

class ExportsController(tuned.patterns.Singleton):
	"""
	Controls and manages object interface exporting.
	"""

	def __init__(self):
		super(ExportsController, self).__init__()
		self._exporters = []
		self._objects = []
		self._exports_initialized = False

	def register_exporter(self, instance):
		"""Register objects exporter."""
		self._exporters.append(instance)

	def register_object(self, instance):
		"""Register object to be exported."""
		self._objects.append(instance)

	def _is_exportable_method(self, method):
		"""Check if method was marked with @exports.export wrapper."""
		return inspect.ismethod(method) and hasattr(method, "export_params")

	def _is_exportable_signal(self, method):
		"""Check if method was marked with @exports.signal wrapper."""
		return inspect.ismethod(method) and hasattr(method, "signal_params")

	def _is_exportable_getter(self, method):
		"""Check if method was marked with @exports.get_property wrapper."""
		return inspect.ismethod(method) and hasattr(method, "property_get_params")

	def _is_exportable_setter(self, method):
		"""Check if method was marked with @exports.set_property wrapper."""
		return inspect.ismethod(method) and hasattr(method, "property_set_params")

	def _export_method(self, method):
		"""Register method to all exporters."""
		for exporter in self._exporters:
			args = method.export_params[0]
			kwargs = method.export_params[1]
			exporter.export(method, *args, **kwargs)

	def _export_signal(self, method):
		"""Register signal to all exporters."""
		for exporter in self._exporters:
			args = method.signal_params[0]
			kwargs = method.signal_params[1]
			exporter.signal(method, *args, **kwargs)

	def _export_getter(self, method):
		"""Register property getter to all exporters."""
		for exporter in self._exporters:
			args = method.property_get_params[0]
			kwargs = method.property_get_params[1]
			exporter.property_getter(method, *args, **kwargs)

	def _export_setter(self, method):
		"""Register property setter to all exporters."""
		for exporter in self._exporters:
			args = method.property_set_params[0]
			kwargs = method.property_set_params[1]
			exporter.property_setter(method, *args, **kwargs)

	def send_signal(self, signal, *args, **kwargs):
		"""Register signal to all exporters."""
		for exporter in self._exporters:
			exporter.send_signal(signal, *args, **kwargs)

	def property_changed(self, *args, **kwargs):
		for exporter in self._exporters:
			exporter.property_changed(*args, **kwargs)

	def period_check(self):
		"""Allows to perform checks on exporters without special thread."""
		for exporter in self._exporters:
			exporter.period_check()

	def _initialize_exports(self):
		if self._exports_initialized:
			return

		for instance in self._objects:
			for name, method in inspect.getmembers(instance, self._is_exportable_method):
				self._export_method(method)
			for name, method in inspect.getmembers(instance, self._is_exportable_signal):
				self._export_signal(method)
			for name, method in inspect.getmembers(instance, self._is_exportable_getter):
				self._export_getter(method)
			for name, method in inspect.getmembers(instance, self._is_exportable_setter):
				self._export_setter(method)

		self._exports_initialized = True

	def start(self):
		"""Start the exports."""
		self._initialize_exports()
		for exporter in self._exporters:
			exporter.start()

	def stop(self):
		"""Stop the exports."""
		for exporter in self._exporters:
			exporter.stop()
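# --- Illustrative sketch (not part of the original TuneD module) ---
# The controller above discovers exportable members purely through the
# marker attributes that the decorators in tuned.exports attach to
# functions. The class and method names below are hypothetical; the
# commented code only shows what _is_exportable_method() looks for.
#
# class Example(object):
# 	def ping(self):
# 		return "pong"
# 	# roughly what decorating ping with @exports.export("", "s") does:
# 	ping.export_params = [("", "s"), {}]
#
# # After register_object(Example()) and start(), _initialize_exports()
# # finds the marked bound method via inspect.getmembers() and calls
# # exporter.export() with it and the stored signatures on every
# # registered exporter.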
site-packages/tuned/exports/__init__.py000064400000003561147511334670014240 0ustar00from . import interfaces
from . import controller
from . import dbus_exporter as dbus
from . import dbus_exporter_with_properties as dbus_with_properties
from . import unix_socket_exporter as unix_socket

def export(*args, **kwargs):
	"""Decorator, use to mark exportable methods."""
	def wrapper(method):
		method.export_params = [ args, kwargs ]
		return method
	return wrapper

def signal(*args, **kwargs):
	"""Decorator, use to mark exportable signals."""
	def wrapper(method):
		method.signal_params = [ args, kwargs ]
		return method
	return wrapper

def property_setter(*args, **kwargs):
	"""Decorator, use to mark setters of exportable properties."""
	def wrapper(method):
		method.property_set_params = [ args, kwargs ]
		return method
	return wrapper

def property_getter(*args, **kwargs):
	"""Decorator, use to mark getters of exportable properties."""
	def wrapper(method):
		method.property_get_params = [ args, kwargs ]
		return method
	return wrapper

def property_changed(*args, **kwargs):
	ctl = controller.ExportsController.get_instance()
	return ctl.property_changed(*args, **kwargs)

def register_exporter(instance):
	if not isinstance(instance, interfaces.ExporterInterface):
		raise Exception()
	ctl = controller.ExportsController.get_instance()
	return ctl.register_exporter(instance)

def register_object(instance):
	if not isinstance(instance, interfaces.ExportableInterface):
		raise Exception()
	ctl = controller.ExportsController.get_instance()
	return ctl.register_object(instance)

def send_signal(*args, **kwargs):
	ctl = controller.ExportsController.get_instance()
	return ctl.send_signal(*args, **kwargs)

def start():
	ctl = controller.ExportsController.get_instance()
	return ctl.start()

def stop():
	ctl = controller.ExportsController.get_instance()
	return ctl.stop()

def period_check():
	ctl = controller.ExportsController.get_instance()
	return ctl.period_check()
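# --- Illustrative usage sketch (not part of the original TuneD module) ---
# Shows how the decorators and registration helpers above are meant to be
# combined. DummyExporter and Daemon are hypothetical stand-ins; real code
# registers a DBusExporter or UnixSocketExporter instead.
#
# from tuned import exports
# from tuned.exports import interfaces
#
# class DummyExporter(interfaces.ExporterInterface):
# 	def export(self, method, in_signature, out_signature):
# 		print("export %s(%s) -> %s" % (method.__name__, in_signature, out_signature))
# 	def signal(self, method, out_signature):
# 		print("signal %s -> %s" % (method.__name__, out_signature))
# 	def send_signal(self, signal, *args, **kwargs):
# 		print("emit %s %r" % (signal, args))
# 	def start(self):
# 		pass
# 	def stop(self):
# 		pass
#
# class Daemon(interfaces.ExportableInterface):
# 	@exports.export("s", "b")
# 	def switch_profile(self, profile_name):
# 		return True
# 	@exports.signal("s")
# 	def profile_changed(self, profile_name):
# 		pass
#
# exports.register_exporter(DummyExporter())
# exports.register_object(Daemon())
# exports.start()     # the controller introspects Daemon and exports switch_profile
# exports.send_signal("profile_changed", "balanced")
# exports.stop()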
site-packages/tuned/exports/interfaces.py000064400000001114147511334670014614 0ustar00class ExportableInterface(object):
	pass

class ExporterInterface(object):
	def export(self, method, in_signature, out_signature):
		# to be overridden by concrete implementation
		raise NotImplementedError()

	def signal(self, method, out_signature):
		# to be overridden by concrete implementation
		raise NotImplementedError()

	def send_signal(self, signal, *args, **kwargs):
		# to be overridden by concrete implementation
		raise NotImplementedError()

	def start(self):
		raise NotImplementedError()

	def stop(self):
		raise NotImplementedError()

	def period_check(self):
		pass
site-packages/tuned/exports/dbus_exporter_with_properties.py000064400000006052147511334670020673 0ustar00from inspect import ismethod
from dbus.service import method, signal
from dbus import PROPERTIES_IFACE
from dbus.exceptions import DBusException
from tuned.exports.dbus_exporter import DBusExporter


class DBusExporterWithProperties(DBusExporter):
    def __init__(self, bus_name, interface_name, object_name, namespace):
        super(DBusExporterWithProperties, self).__init__(bus_name, interface_name, object_name, namespace)
        self._property_setters = {}
        self._property_getters = {}

        def Get(_, interface_name, property_name):
            if interface_name != self._interface_name:
                raise DBusException("Unknown interface: %s" % interface_name)
            if property_name not in self._property_getters:
                raise DBusException("No such property: %s" % property_name)
            getter = self._property_getters[property_name]
            return getter()

        def Set(_, interface_name, property_name, value):
            if interface_name != self._interface_name:
                raise DBusException("Unknown interface: %s" % interface_name)
            if property_name not in self._property_setters:
                raise DBusException("No such property: %s" % property_name)
            setter = self._property_setters[property_name]
            setter(value)

        def GetAll(_, interface_name):
            if interface_name != self._interface_name:
                raise DBusException("Unknown interface: %s" % interface_name)
            return {name: getter() for name, getter in self._property_getters.items()}

        def PropertiesChanged(_, interface_name, changed_properties, invalidated_properties):
            if interface_name != self._interface_name:
                raise DBusException("Unknown interface: %s" % interface_name)

        self._dbus_methods["Get"] = method(PROPERTIES_IFACE, in_signature="ss", out_signature="v")(Get)
        self._dbus_methods["Set"] = method(PROPERTIES_IFACE, in_signature="ssv")(Set)
        self._dbus_methods["GetAll"] = method(PROPERTIES_IFACE, in_signature="s", out_signature="a{sv}")(GetAll)
        self._dbus_methods["PropertiesChanged"] = signal(PROPERTIES_IFACE, signature="sa{sv}as")(PropertiesChanged)
        self._signals.add("PropertiesChanged")

    def property_changed(self, property_name, value):
        self.send_signal("PropertiesChanged", self._interface_name, {property_name: value}, {})

    def property_getter(self, method, property_name):
        if not ismethod(method):
            raise Exception("Only bound methods can be exported.")
        if property_name in self._property_getters:
            raise Exception("A getter for this property is already registered.")
        self._property_getters[property_name] = method

    def property_setter(self, method, property_name):
        if not ismethod(method):
            raise Exception("Only bound methods can be exported.")
        if property_name in self._property_setters:
            raise Exception("A setter for this property is already registered.")
        self._property_setters[property_name] = method
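# --- Illustrative client-side sketch (not part of the original module) ---
# The handlers registered above implement the standard
# org.freedesktop.DBus.Properties interface, so a client can use plain
# python-dbus calls against them; property_changed() additionally emits the
# PropertiesChanged signal. The bus name, object path, interface name and
# property name below are assumptions for illustration only.
#
# import dbus
#
# bus = dbus.SystemBus()
# obj = bus.get_object("com.redhat.tuned", "/Tuned")
# props = dbus.Interface(obj, dbus.PROPERTIES_IFACE)
#
# value = props.Get("com.redhat.tuned.control", "some_property")    # -> Get() above
# props.Set("com.redhat.tuned.control", "some_property", "value")   # -> Set() above
# everything = props.GetAll("com.redhat.tuned.control")             # -> GetAll() above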
site-packages/tuned/exports/__pycache__/dbus_exporter_with_properties.cpython-36.opt-1.pyc	[binary .pyc data omitted]
site-packages/tuned/exports/__pycache__/controller.cpython-36.pyc	[binary .pyc data omitted]
site-packages/tuned/exports/__pycache__/dbus_exporter_with_properties.cpython-36.pyc	[binary .pyc data omitted]
site-packages/tuned/exports/__pycache__/unix_socket_exporter.cpython-36.pyc	[binary .pyc data omitted]
site-packages/tuned/exports/__pycache__/__init__.cpython-36.pyc	[binary .pyc data omitted]
site-packages/tuned/exports/__pycache__/interfaces.cpython-36.pyc	[binary .pyc data omitted]
site-packages/tuned/exports/__pycache__/dbus_exporter.cpython-36.pyc	[binary .pyc data omitted]
site-packages/tuned/exports/__pycache__/controller.cpython-36.opt-1.pyc	[binary .pyc data omitted]
site-packages/tuned/exports/__pycache__/__init__.cpython-36.opt-1.pyc	[binary .pyc data omitted]
site-packages/tuned/exports/__pycache__/unix_socket_exporter.cpython-36.opt-1.pyc	[binary .pyc data omitted]
site-packages/tuned/exports/__pycache__/dbus_exporter.cpython-36.opt-1.pyc	[binary .pyc data omitted]
site-packages/tuned/exports/__pycache__/interfaces.cpython-36.opt-1.pyc	[binary .pyc data omitted]
site-packages/tuned/exports/unix_socket_exporter.py
import os
import re
import pwd, grp

from . import interfaces
import tuned.logs
import tuned.consts as consts
from inspect import ismethod
import socket
import json
import select

log = tuned.logs.get()

class UnixSocketExporter(interfaces.ExporterInterface):
	"""
	Export method calls through Unix Domain Socket Interface.

	We take a method to be exported and create a simple wrapper function
	to call it. This is required as we need the original function to be
	bound to the original object instance, while the wrapper will be
	bound to an object we construct dynamically.
	"""

	def __init__(self, socket_path=consts.CFG_DEF_UNIX_SOCKET_PATH,
				 signal_paths=consts.CFG_DEF_UNIX_SOCKET_SIGNAL_PATHS,
				 ownership=consts.CFG_DEF_UNIX_SOCKET_OWNERSHIP,
				 permissions=consts.CFG_DEF_UNIX_SOCKET_PERMISIONS,
				 connections_backlog=consts.CFG_DEF_UNIX_SOCKET_CONNECTIONS_BACKLOG):

		self._socket_path = socket_path
		self._socket_object = None
		self._socket_signal_paths = re.split(r"[,;]", signal_paths) if signal_paths else []
		self._socket_signal_objects = []
		self._ownership = [-1, -1]
		if ownership:
			ownership = ownership.split()
			for i, o in enumerate(ownership[:2]):
				try:
					self._ownership[i] = int(o)
				except ValueError:
					try:
						# user
						if i == 0:
							self._ownership[i] = pwd.getpwnam(o).pw_uid
						# group
						else:
							self._ownership[i] = grp.getgrnam(o).gr_gid
					except KeyError:
						log.error("%s '%s' does not exists, leaving default" % ("User" if i == 0 else "Group", o))
		self._permissions = permissions
		self._connections_backlog = connections_backlog

		self._unix_socket_methods = {}
		self._signals = set()
		self._conn = None
		self._channel = None

	def running(self):
		return self._socket_object is not None

	def export(self, method, in_signature, out_signature):
		if not ismethod(method):
			raise Exception("Only bound methods can be exported.")

		method_name = method.__name__
		if method_name in self._unix_socket_methods:
			raise Exception("Method with this name (%s) is already exported." % method_name)

		class wrapper(object):
			def __init__(self, in_signature, out_signature):
				self._in_signature = in_signature
				self._out_signature = out_signature
				
			def __call__(self, *args, **kwargs):
				return method(*args, **kwargs)

		self._unix_socket_methods[method_name] = wrapper(in_signature, out_signature)

	def signal(self, method, out_signature):
		if not ismethod(method):
			raise Exception("Only bound methods can be exported.")
		
		method_name = method.__name__
		if method_name in self._unix_socket_methods:
			raise Exception("Method with this name (%s) is already exported." % method_name)
		
		class wrapper(object):
			def __init__(self, out_signature):
				self._out_signature = out_signature
			
			def __call__(self, *args, **kwargs):
				return method(*args, **kwargs)
		
		self._unix_socket_methods[method_name] = wrapper(out_signature)
		self._signals.add(method_name)

	def send_signal(self, signal, *args, **kwargs):
		if not signal in self._signals:
			raise Exception("Signal '%s' doesn't exist." % signal)
		for p in self._socket_signal_paths:
			log.debug("Sending signal on socket %s" % p)
			try:
				s = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
				s.setblocking(False)
				s.connect(p)
				self._send_data(s, {"jsonrpc": "2.0", "method": signal, "params": args})
				s.close()
			except OSError as e:
				log.warning("Error while sending signal '%s' to socket '%s': %s" % (signal, p, e))

	def register_signal_path(self, path):
		self._socket_signal_paths.append(path)

	def _construct_socket_object(self):
		if self._socket_path:
			if os.path.exists(self._socket_path):
				os.unlink(self._socket_path)
			self._socket_object = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
			self._socket_object.bind(self._socket_path)
			self._socket_object.listen(self._connections_backlog)
			os.chown(self._socket_path, self._ownership[0], self._ownership[1])
			if self._permissions:
				os.chmod(self._socket_path, self._permissions)

	def start(self):
		if self.running():
			return

		self.stop()
		self._construct_socket_object()

	def stop(self):
		if self._socket_object:
			self._socket_object.close()

	def _send_data(self, s, data):
		log.debug("Sending socket data: %s)" % data)
		try:
			s.send(json.dumps(data).encode("utf-8"))
		except Exception as e:
			log.warning("Failed to send data '%s': %s" % (data, e))

	def _create_response(self, data, id, error=False):
		res = {
			"jsonrpc": "2.0",
			"id": id
		}
		if error:
			res["error"] = data
		else:
			res["result"] = data
		return res

	def _create_error_responce(self, code, message, id=None, data=None):
		return self._create_response({
			"code": code,
			"message": message,
			"data": data,
		}, error=True, id=id)

	def _create_result_response(self, result, id):
		return self._create_response(result, id)

	def _check_id(self, data):
		if data.get("id"):
			return data
		return None

	def _process_request(self, req):
		if type(req) != dict or req.get("jsonrpc") != "2.0" or not req.get("method"):
			return self._create_error_responce(-32600, "Invalid Request")
		id = req.get("id")
		ret = None
		if req["method"] not in self._unix_socket_methods:
			return self._check_id(self._create_error_responce(-32601, "Method not found", id))
		try:
			if not req.get("params"):
				ret = self._unix_socket_methods[req["method"]]()
			elif type(req["params"]) in (list, tuple):
				ret = self._unix_socket_methods[req["method"]](*req["params"])
			elif type(req["params"]) == dict:
				ret = self._unix_socket_methods[req["method"]](**req["params"])
			else:
				return self._check_id(self._create_error_responce(-32600, "Invalid Request", id))
		except TypeError as e:
			return self._check_id(self._create_error_responce(-32602, "Invalid params", id, str(e)))
		except Exception as e:
			return self._check_id(self._create_error_responce(1, "Error", id, str(e)))
		return self._check_id(self._create_result_response(ret, id))

	def period_check(self):
		"""
		Periodically checks socket object for new calls. This allows to function without special thread.
		Interface is according JSON-RPC 2.0 Specification (see https://www.jsonrpc.org/specification)
		
		Example calls:
		
		printf '[{"jsonrpc": "2.0", "method": "active_profile", "id": 1}, {"jsonrpc": "2.0", "method": "profiles", "id": 2}]' | nc -U /run/tuned/tuned.sock
		printf '{"jsonrpc": "2.0", "method": "switch_profile", "params": {"profile_name": "balanced"}, "id": 1}' | nc -U /run/tuned/tuned.sock
		"""
		if not self.running():
			return
		while True:
			r, _, _ = select.select([self._socket_object], (), (), 0)
			if r:
				conn, _ = self._socket_object.accept()
				try:
					data = ""
					while True:
						rec_data = conn.recv(4096).decode()
						if not rec_data:
							break
						data += rec_data
				except Exception as e:
					log.error("Failed to load data of message: %s" % e)
					continue
				if data:
					try:
						data = json.loads(data)
					except Exception as e:
						log.error("Failed to load json data '%s': %s" % (data, e))
						self._send_data(conn, self._create_error_responce(-32700, "Parse error", data=str(e)))
						continue
					if type(data) not in (tuple, list, dict):
						log.error("Wrong format of call")
						self._send_data(conn, self._create_error_responce(-32700, "Parse error"))
						continue
					if type(data) in (tuple, list):
						if len(data) == 0:
							self._send_data(conn, self._create_error_responce(-32600, "Invalid Request"))
							continue
						res = []
						for req in data:
							r = self._process_request(req)
							if r:
								res.append(r)
						if res:
							self._send_data(conn, res)
					else:
						res = self._process_request(data)
						if res:
							self._send_data(conn, res)
			else:
				return
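
# A minimal example client (illustrative only) for the JSON-RPC interface
# handled by the period_check() method above. The socket path is taken from
# the docstring example and is an assumption about the local configuration.
if __name__ == "__main__":
	import json
	import socket

	request = {"jsonrpc": "2.0", "method": "active_profile", "id": 1}
	client = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
	client.connect("/run/tuned/tuned.sock")
	client.sendall(json.dumps(request).encode("utf-8"))
	# the exporter reads until EOF, so close the writing side before reading
	client.shutdown(socket.SHUT_WR)
	print(client.recv(4096).decode("utf-8"))
	client.close()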
		
site-packages/tuned/exports/dbus_exporter.py000064400000017117147511334670015370 0ustar00from . import interfaces
import dbus.service
import dbus.mainloop.glib
import dbus.exceptions
import threading
import signal
import tuned.logs
import tuned.consts as consts
import traceback
import logging
from inspect import ismethod
from tuned.utils.polkit import polkit
from gi.repository import GLib
from types import FunctionType
from dbus.exceptions import DBusException
from dbus.lowlevel import ErrorMessage

try:
	# Python3 version
	# getfullargspec is not present in Python2, so when we drop P2 support
	# replace "getargspec(func)" in code with "getfullargspec(func).args"
	from inspect import getfullargspec

	def getargspec(func):
		return getfullargspec(func)
except ImportError:
	# Python2 version, drop after support stops
	from inspect import getargspec


log = tuned.logs.get()

# This is mostly a copy of the code from the dbus.service module, without the
# code that sends tracebacks through the D-Bus (i.e. no library tracebacks
# are exposed on the D-Bus now).
def _method_reply_error(connection, message, exception):
    name = getattr(exception, '_dbus_error_name', None)

    if name is not None:
        pass
    elif getattr(exception, '__module__', '') in ('', '__main__'):
        name = 'org.freedesktop.DBus.Python.%s' % exception.__class__.__name__
    else:
        name = 'org.freedesktop.DBus.Python.%s.%s' % (exception.__module__, exception.__class__.__name__)

    if isinstance(exception, DBusException):
        contents = exception.get_dbus_message()
    else:
        contents = ''.join(traceback.format_exception_only(exception.__class__,
            exception))
    reply = ErrorMessage(message, name, contents)

    if not message.get_no_reply():
        connection.send_message(reply)

class DBusExporter(interfaces.ExporterInterface):
	"""
	Export method calls through DBus Interface.

	We take a method to be exported and create a simple wrapper function
	to call it. This is required as we need the original function to be
	bound to the original object instance. While the wrapper will be bound
	to an object we dynamically construct.
	"""

	def __init__(self, bus_name, interface_name, object_name, namespace):
		# Monkey patch the D-Bus library's _method_reply_error() so that tracebacks
		# are replied via D-Bus only in debug mode. There does not seem to be a simpler
		# way to cover all possible exceptions that could occur in the D-Bus library;
		# just setting exception.include_traceback to False does not help, because only
		# a subset of exceptions supports this flag.
		if log.getEffectiveLevel() != logging.DEBUG:
			dbus.service._method_reply_error = _method_reply_error

		dbus.mainloop.glib.DBusGMainLoop(set_as_default=True)

		self._dbus_object_cls = None
		self._dbus_object = None
		self._dbus_methods = {}
		self._signals = set()

		self._bus_name = bus_name
		self._interface_name = interface_name
		self._object_name = object_name
		self._namespace = namespace
		self._thread = None
		self._bus_object = None
		self._polkit = polkit()

		# dirty hack that fixes KeyboardInterrupt handling
		# creating the GLib.MainLoop can replace Python's default SIGINT handler and
		# break Ctrl-C, so save the original handler beforehand and restore it afterwards
		signal_handler = signal.getsignal(signal.SIGINT)
		self._main_loop = GLib.MainLoop()
		signal.signal(signal.SIGINT, signal_handler)

	@property
	def bus_name(self):
		return self._bus_name

	@property
	def interface_name(self):
		return self._interface_name

	@property
	def object_name(self):
		return self._object_name

	def running(self):
		return self._thread is not None

	def _prepare_for_dbus(self, method, wrapper):
		source = """def {name}({args}):
					return wrapper({args})
		""".format(name=method.__name__, args=', '.join(getargspec(method.__func__).args))
		code = compile(source, '<decorator-gen-%d>' % len(self._dbus_methods), 'exec')
		# https://docs.python.org/3.9/library/inspect.html
		# co_consts - tuple of constants used in the bytecode
		# example:
		# compile("e=2\ndef f(x):\n    return x*2\n", "X", 'exec').co_consts
		# (2, <code object f at 0x7f8c60c65330, file "X", line 2>, None)
		# Because we have only one object in code (our function), we can use code.co_consts[0]
		func = FunctionType(code.co_consts[0], locals(), method.__name__)
		return func

	def export(self, method, in_signature, out_signature):
		if not ismethod(method):
			raise Exception("Only bound methods can be exported.")

		method_name = method.__name__
		if method_name in self._dbus_methods:
			raise Exception("Method with this name is already exported.")

		def wrapper(owner, *args, **kwargs):
			action_id = self._namespace + "." + method.__name__
			caller = args[-1]
			log.debug("checking authorization for action '%s' requested by caller '%s'" % (action_id, caller))
			ret = self._polkit.check_authorization(caller, action_id)
			args_copy = args
			if ret == 1:
				log.debug("action '%s' requested by caller '%s' was successfully authorized by polkit" % (action_id, caller))
			elif ret == 2:
				log.warn("polkit error, but action '%s' requested by caller '%s' was successfully authorized by fallback method" % (action_id, caller))
			elif ret == 0:
				log.info("action '%s' requested by caller '%s' wasn't authorized, ignoring the request" % (action_id, caller))
				args_copy = list(args[:-1]) + [""]
			elif ret == -1:
				log.warn("polkit error and action '%s' requested by caller '%s' wasn't authorized by fallback method, ignoring the request" % (action_id, caller))
				args_copy = list(args[:-1]) + [""]
			else:
				log.error("polkit error and unable to use fallback method to authorize action '%s' requested by caller '%s', ignoring the request" % (action_id, caller))
				args_copy = list(args[:-1]) + [""]
			return method(*args_copy, **kwargs)

		wrapper = self._prepare_for_dbus(method, wrapper)
		wrapper = dbus.service.method(self._interface_name, in_signature, out_signature, sender_keyword = "caller")(wrapper)

		self._dbus_methods[method_name] = wrapper

	def signal(self, method, out_signature):
		if not ismethod(method):
			raise Exception("Only bound methods can be exported.")

		method_name = method.__name__
		if method_name in self._dbus_methods:
			raise Exception("Method with this name is already exported.")

		def wrapper(owner, *args, **kwargs):
			return method(*args, **kwargs)

		wrapper = self._prepare_for_dbus(method, wrapper)
		wrapper = dbus.service.signal(self._interface_name, out_signature)(wrapper)

		self._dbus_methods[method_name] = wrapper
		self._signals.add(method_name)

	def send_signal(self, signal, *args, **kwargs):
		err = False
		if not signal in self._signals or self._bus_object is None:
			err = True
		try:
			method = getattr(self._bus_object, signal)
		except AttributeError:
			err = True
		if err:
			raise Exception("Signal '%s' doesn't exist." % signal)
		else:
			method(*args, **kwargs)

	def _construct_dbus_object_class(self):
		if self._dbus_object_cls is not None:
			raise Exception("The exporter class was already build.")

		unique_name = "DBusExporter_%d" % id(self)
		cls = type(unique_name, (dbus.service.Object,), self._dbus_methods)

		self._dbus_object_cls = cls

	def start(self):
		if self.running():
			return
		if self._dbus_object_cls is None:
			self._construct_dbus_object_class()

		self.stop()
		bus = dbus.SystemBus()
		bus_name = dbus.service.BusName(self._bus_name, bus)
		self._bus_object = self._dbus_object_cls(bus, self._object_name, bus_name)
		self._thread = threading.Thread(target=self._thread_code)
		self._thread.start()

	def stop(self):
		if self._thread is not None and self._thread.is_alive():
			self._main_loop.quit()
			self._thread.join()
			self._thread = None

	def _thread_code(self):
		self._main_loop.run()
		del self._bus_object
		self._bus_object = None
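
# Illustrative sketch (not part of the tuned API): the compile()/FunctionType
# technique used by _prepare_for_dbus() above, shown in isolation. A function
# with an explicit, named argument list is generated from source text, which
# is what allows dbus-python to introspect its signature. The helper name and
# the demo values below are made up.
if __name__ == "__main__":
	def _make_named_wrapper(name, arg_names, target):
		source = "def {name}({args}):\n\treturn target({args})\n".format(
				name=name, args=", ".join(arg_names))
		code = compile(source, "<generated>", "exec")
		# co_consts[0] is the only code object produced: the generated function
		return FunctionType(code.co_consts[0], {"target": target}, name)

	echo = _make_named_wrapper("echo", ["self", "value"], lambda self, value: value)
	print(echo(None, "hello"))  # prints: hello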
site-packages/tuned/monitors/monitor_net.py000064400000002223147511334670015176 0ustar00import tuned.monitors
import os
import re
from tuned.utils.nettool import ethcard

class NetMonitor(tuned.monitors.Monitor):

	@classmethod
	def _init_available_devices(cls):
		available = []
		for root, dirs, files in os.walk("/sys/devices"):
			if root.endswith("/net") and not root.endswith("/virtual/net"):
				available += dirs
		
		cls._available_devices = set(available)

		for dev in available:
			#max_speed = cls._calcspeed(ethcard(dev).get_max_speed())
			cls._load[dev] = ['0', '0', '0', '0']

	@classmethod
	def _calcspeed(cls, speed):
		# 0.6 is an empirical constant: a typical network card workload will not
		# exceed it, and if it does, the code is able to adapt to it.
		# 1024 * 1024 converts MB -> B
		# speed / 8 converts Mb -> MB
		return int(0.6 * 1024 * 1024 * speed / 8)
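	# Worked example (illustrative): for a 1000 Mb/s card,
	# _calcspeed(1000) == int(0.6 * 1024 * 1024 * 1000 / 8) == 78643200 bytes.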

	@classmethod
	def _updateStat(cls, dev):
		files = ["rx_bytes", "rx_packets", "tx_bytes", "tx_packets"]
		for i,f in enumerate(files):
			with open("/sys/class/net/" + dev + "/statistics/" + f) as statfile:
				cls._load[dev][i] = statfile.read().strip()

	@classmethod
	def update(cls):
		for device in cls._updating_devices:
			cls._updateStat(device)

site-packages/tuned/monitors/base.py000064400000005552147511334670013563 0ustar00import tuned.logs
log = tuned.logs.get()

__all__ = ["Monitor"]

class Monitor(object):
	"""
	Base class for all monitors.

	Monitors provide data about the running system to Plugin objects, which use the data
	to tune system parameters.

	The following methods require reimplementation (an example subclass is
	sketched at the end of this module):
	  - _init_available_devices(cls)
	  - update(cls)
	"""

	# class properties

	@classmethod
	def _init_class(cls):
		cls._class_initialized = False
		cls._instances = set()
		cls._available_devices = set()
		cls._updating_devices = set()
		cls._load = {}

		cls._init_available_devices()
		assert isinstance(cls._available_devices, set)
		cls._class_initialized = True
		log.debug("available devices: %s" % ", ".join(cls._available_devices))

	@classmethod
	def _init_available_devices(cls):
		raise NotImplementedError()

	@classmethod
	def _update_available_devices(cls):
		cls._init_available_devices()
		log.debug("available devices updated to: %s"
				% ", ".join(cls._available_devices))

	@classmethod
	def get_available_devices(cls):
		return cls._available_devices

	@classmethod
	def update(cls):
		raise NotImplementedError()

	@classmethod
	def _register_instance(cls, instance):
		cls._instances.add(instance)

	@classmethod
	def _deregister_instance(cls, instance):
		cls._instances.remove(instance)

	@classmethod
	def _refresh_updating_devices(cls):
		new_updating = set()
		for instance in cls._instances:
			new_updating |= instance.devices
		cls._updating_devices.clear()
		cls._updating_devices.update(new_updating)

	@classmethod
	def instances(cls):
		return cls._instances

	# instance properties

	def __init__(self, devices = None):
		if not hasattr(self, "_class_initialized"):
			self._init_class()
			assert hasattr(self, "_class_initialized")

		self._register_instance(self)

		if devices is not None:
			self.devices = devices
		else:
			self.devices = self.get_available_devices()

		self.update()

	def __del__(self):
		try:
			self.cleanup()
		except:
			pass

	def cleanup(self):
		self._deregister_instance(self)
		self._refresh_updating_devices()

	@property
	def devices(self):
		return self._devices

	@devices.setter
	def devices(self, value):
		new_devices = self._available_devices & set(value)
		self._devices = new_devices
		self._refresh_updating_devices()

	def add_device(self, device):
		assert isinstance(device, str)
		self._update_available_devices()
		if device in self._available_devices:
			self._devices.add(device)
			self._updating_devices.add(device)

	def remove_device(self, device):
		assert isinstance(device, str)
		if device in self._devices:
			self._devices.remove(device)
			self._updating_devices.remove(device)

	def get_load(self):
		return dict([dev_load for dev_load in list(self._load.items()) if dev_load[0] in self._devices])

	def get_device_load(self, device):
		return self._load.get(device, None)
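
# Minimal example subclass (illustrative only, not shipped with tuned): a
# monitor only needs to provide the two classmethods named in the Monitor
# docstring above. The device name and the constant load value are made up.
class ExampleStaticMonitor(Monitor):
	@classmethod
	def _init_available_devices(cls):
		cls._available_devices = set(["example"])
		cls._load["example"] = 0

	@classmethod
	def update(cls):
		for device in cls._updating_devices:
			# a real monitor would read the current value from the system here
			cls._load[device] = 0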
site-packages/tuned/monitors/monitor_disk.py000064400000001566147511334670015353 0ustar00import tuned.monitors
import os

class DiskMonitor(tuned.monitors.Monitor):

	_supported_vendors = ["ATA", "SCSI"]

	@classmethod
	def _init_available_devices(cls):
		block_devices = os.listdir("/sys/block")
		available = set(filter(cls._is_device_supported, block_devices))
		cls._available_devices = available

		for d in available:
			cls._load[d] = [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]

	@classmethod
	def _is_device_supported(cls, device):
		vendor_file = "/sys/block/%s/device/vendor" % device
		try:
			vendor = open(vendor_file).read().strip()
		except IOError:
			return False

		return vendor in cls._supported_vendors

	@classmethod
	def update(cls):
		for device in cls._updating_devices:
			cls._update_disk(device)

	@classmethod
	def _update_disk(cls, dev):
		with open("/sys/block/" + dev + "/stat") as statfile:
			cls._load[dev] = list(map(int, statfile.read().split()))
site-packages/tuned/monitors/__init__.py000064400000000056147511334670014402 0ustar00from .base import *
from .repository import *
site-packages/tuned/monitors/monitor_load.py000064400000000462147511334670015332 0ustar00import tuned.monitors

class LoadMonitor(tuned.monitors.Monitor):
	@classmethod
	def _init_available_devices(cls):
		cls._available_devices = set(["system"])

	@classmethod
	def update(cls):
		with open("/proc/loadavg") as statfile:
			data = statfile.read().split()
		cls._load["system"] = float(data[0])
site-packages/tuned/monitors/repository.py000064400000001477147511334670015072 0ustar00import tuned.logs
import tuned.monitors
from tuned.utils.plugin_loader import PluginLoader

log = tuned.logs.get()

__all__ = ["Repository"]

class Repository(PluginLoader):

	def __init__(self):
		super(Repository, self).__init__()
		self._monitors = set()

	@property
	def monitors(self):
		return self._monitors

	def _set_loader_parameters(self):
		self._namespace = "tuned.monitors"
		self._prefix = "monitor_"
		self._interface = tuned.monitors.Monitor

	def create(self, plugin_name, devices):
		log.debug("creating monitor %s" % plugin_name)
		monitor_cls = self.load_plugin(plugin_name)
		monitor_instance = monitor_cls(devices)
		self._monitors.add(monitor_instance)
		return monitor_instance

	def delete(self, monitor):
		assert isinstance(monitor, self._interface)
		monitor.cleanup()
		self._monitors.remove(monitor)
site-packages/tuned/monitors/__pycache__/ [compiled .pyc bytecode omitted]
site-packages/firewall/command.py000064400000057301147511334670013102 0ustar00# -*- coding: utf-8 -*-
#
# Copyright (C) 2011-2016 Red Hat, Inc.
#
# Authors:
# Thomas Woerner <twoerner@redhat.com>
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program.  If not, see <http://www.gnu.org/licenses/>.
#

"""FirewallCommand class for command line client simplification"""

__all__ = [ "FirewallCommand" ]

import sys

from firewall import errors
from firewall.errors import FirewallError
from dbus.exceptions import DBusException
from firewall.functions import checkIPnMask, checkIP6nMask, check_mac, \
    check_port, check_single_address

class FirewallCommand(object):
    def __init__(self, quiet=False, verbose=False):
        self.quiet = quiet
        self.verbose = verbose
        self.__use_exception_handler = True
        self.fw = None

    def set_fw(self, fw):
        self.fw = fw

    def set_quiet(self, flag):
        self.quiet = flag

    def get_quiet(self):
        return self.quiet

    def set_verbose(self, flag):
        self.verbose = flag

    def get_verbose(self):
        return self.verbose

    def print_msg(self, msg=None):
        if msg is not None and not self.quiet:
            sys.stdout.write(msg + "\n")

    def print_error_msg(self, msg=None):
        if msg is not None and not self.quiet:
            sys.stderr.write(msg + "\n")

    def print_warning(self, msg=None):
        FAIL = '\033[91m'
        END = '\033[00m'
        if sys.stderr.isatty():
            msg = FAIL + msg + END
        self.print_error_msg(msg)

    def print_and_exit(self, msg=None, exit_code=0):
        #OK = '\033[92m'
        #END = '\033[00m'
        if exit_code > 1:
            self.print_warning(msg)
        else:
            #if sys.stdout.isatty():
            #   msg = OK + msg + END
            self.print_msg(msg)
        sys.exit(exit_code)

    def fail(self, msg=None):
        self.print_and_exit(msg, 2)

    def print_if_verbose(self, msg=None):
        if msg is not None and self.verbose:
            sys.stdout.write(msg + "\n")

    def __cmd_sequence(self, cmd_type, option, action_method, query_method, # pylint: disable=W0613, R0913, R0914
                       parse_method, message, start_args=None, end_args=None, # pylint: disable=W0613
                       no_exit=False):
        if self.fw is not None:
            self.fw.authorizeAll()
        items = [ ]
        _errors = 0
        _error_codes = [ ]
        for item in option:
            if parse_method is not None:
                try:
                    item = parse_method(item)
                except Exception as msg:
                    code = FirewallError.get_code(str(msg))
                    if len(option) > 1:
                        self.print_warning("Warning: %s" % msg)
                    else:
                        self.print_and_exit("Error: %s" % msg, code)
                    if code not in _error_codes:
                        _error_codes.append(code)
                    _errors += 1
                    continue

            items.append(item)

        for item in items:
            call_item = [ ]
            if start_args is not None:
                call_item += start_args
            if not isinstance(item, list) and not isinstance(item, tuple):
                call_item.append(item)
            else:
                call_item += item
            if end_args is not None:
                call_item += end_args
            self.deactivate_exception_handler()
            try:
                action_method(*call_item)
            except (DBusException, Exception) as msg:
                if isinstance(msg, DBusException):
                    self.fail_if_not_authorized(msg.get_dbus_name())
                    msg = msg.get_dbus_message()
                else:
                    msg = str(msg)
                code = FirewallError.get_code(msg)
                if code in [ errors.ALREADY_ENABLED, errors.NOT_ENABLED,
                             errors.ZONE_ALREADY_SET, errors.ALREADY_SET ]:
                    code = 0
                if len(option) > 1:
                    self.print_warning("Warning: %s" % msg)
                elif code == 0:
                    self.print_warning("Warning: %s" % msg)
                    return
                else:
                    self.print_and_exit("Error: %s" % msg, code)
                if code not in _error_codes:
                    _error_codes.append(code)
                _errors += 1
            self.activate_exception_handler()

        if not no_exit:
            if len(option) > _errors or 0 in _error_codes:
                # There have been more options than errors or there
                # was at least one error code 0, return.
                return
            elif len(_error_codes) == 1:
                # Exactly one error code, use it.
                sys.exit(_error_codes[0])
            elif len(_error_codes) > 1:
                # There is more than one error, exit using
                # UNKNOWN_ERROR. This could happen within sequences
                # where parsing failed with different errors like
                # INVALID_PORT and INVALID_PROTOCOL.
                sys.exit(errors.UNKNOWN_ERROR)

    def add_sequence(self, option, action_method, query_method, parse_method, # pylint: disable=R0913
                     message, no_exit=False):
        self.__cmd_sequence("add", option, action_method, query_method,
                            parse_method, message, no_exit=no_exit)

    def x_add_sequence(self, x, option, action_method, query_method, # pylint: disable=R0913
                       parse_method, message, no_exit=False):
        self.__cmd_sequence("add", option, action_method, query_method,
                            parse_method, message, start_args=[x],
                            no_exit=no_exit)

    def zone_add_timeout_sequence(self, zone, option, action_method, # pylint: disable=R0913
                                  query_method, parse_method, message,
                                  timeout, no_exit=False):
        self.__cmd_sequence("add", option, action_method, query_method,
                            parse_method, message, start_args=[zone],
                            end_args=[timeout], no_exit=no_exit)

    def remove_sequence(self, option, action_method, query_method, # pylint: disable=R0913
                        parse_method, message, no_exit=False):
        self.__cmd_sequence("remove", option, action_method, query_method,
                            parse_method, message, no_exit=no_exit)

    def x_remove_sequence(self, x, option, action_method, query_method, # pylint: disable=R0913
                          parse_method, message, no_exit=False):
        self.__cmd_sequence("remove", option, action_method, query_method,
                            parse_method, message, start_args=[x],
                            no_exit=no_exit)


    def __query_sequence(self, option, query_method, parse_method, message, # pylint: disable=R0913
                         start_args=None, no_exit=False):
        items = [ ]
        for item in option:
            if parse_method is not None:
                try:
                    item = parse_method(item)
                except Exception as msg:
                    if len(option) > 1:
                        self.print_warning("Warning: %s" % msg)
                        continue
                    else:
                        code = FirewallError.get_code(str(msg))
                        self.print_and_exit("Error: %s" % msg, code)
            items.append(item)

        for item in items:
            call_item = [ ]
            if start_args is not None:
                call_item += start_args
            if not isinstance(item, list) and not isinstance(item, tuple):
                call_item.append(item)
            else:
                call_item += item
            self.deactivate_exception_handler()
            try:
                res = query_method(*call_item)
            except DBusException as msg:
                self.fail_if_not_authorized(msg.get_dbus_name())
                code = FirewallError.get_code(msg.get_dbus_message())
                if len(option) > 1:
                    self.print_warning("Warning: %s" % msg.get_dbus_message())
                    continue
                else:
                    self.print_and_exit("Error: %s" % msg.get_dbus_message(),
                                        code)
            except Exception as msg:
                code = FirewallError.get_code(str(msg))
                if len(option) > 1:
                    self.print_warning("Warning: %s" % msg)
                else:
                    self.print_and_exit("Error: %s" % msg, code)
            self.activate_exception_handler()
            if len(option) > 1:
                self.print_msg("%s: %s" % (message % item, ("no", "yes")[res]))
            else:
                self.print_query_result(res)
        if not no_exit:
            sys.exit(0)

    def query_sequence(self, option, query_method, parse_method, message, # pylint: disable=R0913
                       no_exit=False):
        self.__query_sequence(option, query_method, parse_method,
                              message, no_exit=no_exit)

    def x_query_sequence(self, x, option, query_method, parse_method, # pylint: disable=R0913
                         message, no_exit=False):
        self.__query_sequence(option, query_method, parse_method,
                              message, start_args=[x], no_exit=no_exit)


    def parse_source(self, value):
        if not checkIPnMask(value) and not checkIP6nMask(value) \
           and not check_mac(value) and not \
           (value.startswith("ipset:") and len(value) > 6):
            raise FirewallError(errors.INVALID_ADDR,
                                "'%s' is no valid IPv4, IPv6 or MAC address, nor an ipset" % value)
        return value

    def parse_port(self, value, separator="/"):
        try:
            (port, proto) = value.split(separator)
        except ValueError:
            raise FirewallError(errors.INVALID_PORT, "bad port (most likely "
                                "missing protocol), correct syntax is "
                                "portid[-portid]%sprotocol" % separator)
        if not check_port(port):
            raise FirewallError(errors.INVALID_PORT, port)
        if proto not in [ "tcp", "udp", "sctp", "dccp" ]:
            raise FirewallError(errors.INVALID_PROTOCOL,
                                "'%s' not in {'tcp'|'udp'|'sctp'|'dccp'}" % \
                                proto)
        return (port, proto)

    def parse_forward_port(self, value, compat=False):
        port = None
        protocol = None
        toport = None
        toaddr = None
        i = 0
        while ("=" in value[i:]):
            opt = value[i:].split("=", 1)[0]
            i += len(opt) + 1
            if "=" in value[i:]:
                val = value[i:].split(":", 1)[0]
            else:
                val = value[i:]
            i += len(val) + 1

            if opt == "port":
                port = val
            elif opt == "proto":
                protocol = val
            elif opt == "toport":
                toport = val
            elif opt == "toaddr":
                toaddr = val
            elif opt == "if" and compat:
                # ignore if option in compat mode
                pass
            else:
                raise FirewallError(errors.INVALID_FORWARD,
                                    "invalid forward port arg '%s'" % (opt))
        if not port:
            raise FirewallError(errors.INVALID_FORWARD, "missing port")
        if not protocol:
            raise FirewallError(errors.INVALID_FORWARD, "missing protocol")
        if not (toport or toaddr):
            raise FirewallError(errors.INVALID_FORWARD, "missing destination")

        if not check_port(port):
            raise FirewallError(errors.INVALID_PORT, port)
        if protocol not in [ "tcp", "udp", "sctp", "dccp" ]:
            raise FirewallError(errors.INVALID_PROTOCOL,
                                "'%s' not in {'tcp'|'udp'|'sctp'|'dccp'}" % \
                                protocol)
        if toport and not check_port(toport):
            raise FirewallError(errors.INVALID_PORT, toport)
        if toaddr and not check_single_address("ipv4", toaddr):
            if compat or not check_single_address("ipv6", toaddr):
                raise FirewallError(errors.INVALID_ADDR, toaddr)

        return (port, protocol, toport, toaddr)

    def parse_ipset_option(self, value):
        args = value.split("=")
        if len(args) == 1:
            return (args[0], "")
        elif len(args) == 2:
            return args
        else:
            raise FirewallError(errors.INVALID_OPTION,
                                "invalid ipset option '%s'" % (value))

    def check_destination_ipv(self, value):
        ipvs = [ "ipv4", "ipv6", ]
        if value not in ipvs:
            raise FirewallError(errors.INVALID_IPV,
                                "invalid argument: %s (choose from '%s')" % \
                                (value, "', '".join(ipvs)))
        return value

    def parse_service_destination(self, value):
        try:
            (ipv, destination) = value.split(":", 1)
        except ValueError:
            raise FirewallError(errors.INVALID_DESTINATION,
                                "destination syntax is ipv:address[/mask]")
        return (self.check_destination_ipv(ipv), destination)

    def check_ipv(self, value):
        ipvs = [ "ipv4", "ipv6", "eb" ]
        if value not in ipvs:
            raise FirewallError(errors.INVALID_IPV,
                                "invalid argument: %s (choose from '%s')" % \
                                (value, "', '".join(ipvs)))
        return value

    def check_helper_family(self, value):
        ipvs = [ "", "ipv4", "ipv6" ]
        if value not in ipvs:
            raise FirewallError(errors.INVALID_IPV,
                                "invalid argument: %s (choose from '%s')" % \
                                (value, "', '".join(ipvs)))
        return value

    def check_module(self, value):
        if not value.startswith("nf_conntrack_"):
            raise FirewallError(
                errors.INVALID_MODULE,
                "'%s' does not start with 'nf_conntrack_'" % value)
        if len(value.replace("nf_conntrack_", "")) < 1:
            raise FirewallError(errors.INVALID_MODULE,
                                "Module name '%s' too short" % value)
        return value

    def print_zone_policy_info(self, zone, settings, default_zone=None, extra_interfaces=[], isPolicy=True): # pylint: disable=R0914
        target = settings.getTarget()
        services = settings.getServices()
        ports = settings.getPorts()
        protocols = settings.getProtocols()
        masquerade = settings.getMasquerade()
        forward_ports = settings.getForwardPorts()
        source_ports = settings.getSourcePorts()
        icmp_blocks = settings.getIcmpBlocks()
        rules = settings.getRichRules()
        description = settings.getDescription()
        short_description = settings.getShort()
        if isPolicy:
            ingress_zones = settings.getIngressZones()
            egress_zones = settings.getEgressZones()
            priority = settings.getPriority()
        else:
            icmp_block_inversion = settings.getIcmpBlockInversion()
            interfaces = sorted(set(settings.getInterfaces() + extra_interfaces))
            sources = settings.getSources()
            forward = settings.getForward()

        def rich_rule_sorted_key(rule):
            priority = 0
            search_str = "priority="
            try:
                i = rule.index(search_str)
            except ValueError:
                pass
            else:
                i += len(search_str)
                priority = int(rule[i:i+(rule[i:].index(" "))].replace("\"", ""))

            return priority

        attributes = []
        if default_zone is not None:
            if zone == default_zone:
                attributes.append("default")
        if (not isPolicy and (interfaces or sources)) or \
           (    isPolicy and ingress_zones and egress_zones):
            attributes.append("active")
        if attributes:
            zone = zone + " (%s)" % ", ".join(attributes)
        self.print_msg(zone)
        if self.verbose:
            self.print_msg("  summary: " + short_description)
            self.print_msg("  description: " + description)
        if isPolicy:
            self.print_msg("  priority: " + str(priority))
        self.print_msg("  target: " + target)
        if not isPolicy:
            self.print_msg("  icmp-block-inversion: %s" % \
                           ("yes" if icmp_block_inversion else "no"))
        if isPolicy:
            self.print_msg("  ingress-zones: " + " ".join(ingress_zones))
            self.print_msg("  egress-zones: " + " ".join(egress_zones))
        else:
            self.print_msg("  interfaces: " + " ".join(interfaces))
            self.print_msg("  sources: " + " ".join(sources))
        self.print_msg("  services: " + " ".join(sorted(services)))
        self.print_msg("  ports: " + " ".join(["%s/%s" % (port[0], port[1])
                                               for port in ports]))
        self.print_msg("  protocols: " + " ".join(sorted(protocols)))
        if not isPolicy:
            self.print_msg("  forward: %s" % ("yes" if forward else "no"))
        self.print_msg("  masquerade: %s" % ("yes" if masquerade else "no"))
        self.print_msg("  forward-ports: " + ("\n\t" if forward_ports else "") +
                       "\n\t".join(["port=%s:proto=%s:toport=%s:toaddr=%s" % \
                                    (port, proto, toport, toaddr)
                                    for (port, proto, toport, toaddr) in \
                                    forward_ports]))
        self.print_msg("  source-ports: " +
                       " ".join(["%s/%s" % (port[0], port[1])
                                 for port in source_ports]))
        self.print_msg("  icmp-blocks: " + " ".join(icmp_blocks))
        self.print_msg("  rich rules: " + ("\n\t" if rules else "") +
                            "\n\t".join(sorted(rules, key=rich_rule_sorted_key)))

    def print_zone_info(self, zone, settings, default_zone=None, extra_interfaces=[]):
        self.print_zone_policy_info(zone, settings, default_zone=default_zone, extra_interfaces=extra_interfaces, isPolicy=False)

    def print_policy_info(self, policy, settings, default_zone=None, extra_interfaces=[]):
        self.print_zone_policy_info(policy, settings, default_zone=default_zone, extra_interfaces=extra_interfaces, isPolicy=True)

    def print_service_info(self, service, settings):
        ports = settings.getPorts()
        protocols = settings.getProtocols()
        source_ports = settings.getSourcePorts()
        modules = settings.getModules()
        description = settings.getDescription()
        destinations = settings.getDestinations()
        short_description = settings.getShort()
        includes = settings.getIncludes()
        helpers = settings.getHelpers()
        self.print_msg(service)
        if self.verbose:
            self.print_msg("  summary: " + short_description)
            self.print_msg("  description: " + description)
        self.print_msg("  ports: " + " ".join(["%s/%s" % (port[0], port[1])
                                               for port in ports]))
        self.print_msg("  protocols: " + " ".join(protocols))
        self.print_msg("  source-ports: " +
                       " ".join(["%s/%s" % (port[0], port[1])
                                 for port in source_ports]))
        self.print_msg("  modules: " + " ".join(modules))
        self.print_msg("  destination: " +
                       " ".join(["%s:%s" % (k, v)
                                 for k, v in destinations.items()]))
        self.print_msg("  includes: " + " ".join(sorted(includes)))
        self.print_msg("  helpers: " + " ".join(sorted(helpers)))

    def print_icmptype_info(self, icmptype, settings):
        destinations = settings.getDestinations()
        description = settings.getDescription()
        short_description = settings.getShort()
        if len(destinations) == 0:
            destinations = [ "ipv4", "ipv6" ]
        self.print_msg(icmptype)
        if self.verbose:
            self.print_msg("  summary: " + short_description)
            self.print_msg("  description: " + description)
        self.print_msg("  destination: " + " ".join(destinations))

    def print_ipset_info(self, ipset, settings):
        ipset_type = settings.getType()
        options = settings.getOptions()
        entries = settings.getEntries()
        description = settings.getDescription()
        short_description = settings.getShort()
        self.print_msg(ipset)
        if self.verbose:
            self.print_msg("  summary: " + short_description)
            self.print_msg("  description: " + description)
        self.print_msg("  type: " + ipset_type)
        self.print_msg("  options: " + " ".join(["%s=%s" % (k, v) if v else k
                                                 for k, v in options.items()]))
        self.print_msg("  entries: " + " ".join(entries))

    def print_helper_info(self, helper, settings):
        ports = settings.getPorts()
        module = settings.getModule()
        family = settings.getFamily()
        description = settings.getDescription()
        short_description = settings.getShort()
        self.print_msg(helper)
        if self.verbose:
            self.print_msg("  summary: " + short_description)
            self.print_msg("  description: " + description)
        self.print_msg("  family: " + family)
        self.print_msg("  module: " + module)
        self.print_msg("  ports: " + " ".join(["%s/%s" % (port[0], port[1])
                                               for port in ports]))

    def print_query_result(self, value):
        if value:
            self.print_and_exit("yes")
        else:
            self.print_and_exit("no", 1)

    def exception_handler(self, exception_message):
        if not self.__use_exception_handler:
            raise
        self.fail_if_not_authorized(exception_message)
        code = FirewallError.get_code(str(exception_message))
        if code in [ errors.ALREADY_ENABLED, errors.NOT_ENABLED,
                     errors.ZONE_ALREADY_SET, errors.ALREADY_SET ]:
            self.print_warning("Warning: %s" % exception_message)
        else:
            self.print_and_exit("Error: %s" % exception_message, code)

    def fail_if_not_authorized(self, exception_message):
        if "NotAuthorizedException" in exception_message:
            msg = """Authorization failed.
    Make sure polkit agent is running or run the application as superuser."""
            self.print_and_exit(msg, errors.NOT_AUTHORIZED)

    def deactivate_exception_handler(self):
        self.__use_exception_handler = False

    def activate_exception_handler(self):
        self.__use_exception_handler = True

    def get_ipset_entries_from_file(self, filename):
        entries = [ ]
        entries_set = set()
        f = open(filename)
        for line in f:
            if not line:
                break
            line = line.strip()
            if len(line) < 1 or line[0] in ['#', ';']:
                continue
            if line not in entries_set:
                entries.append(line)
                entries_set.add(line)
        f.close()
        return entries
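
# Example usage (illustrative only): the parse/check helpers above can be
# exercised standalone; the values below are made up.
if __name__ == "__main__":
    cmd = FirewallCommand(quiet=False)
    print(cmd.parse_port("8080-8090/tcp"))        # ('8080-8090', 'tcp')
    print(cmd.parse_forward_port("port=80:proto=tcp:toport=8080"))
    print(cmd.parse_ipset_option("family=inet"))  # ['family', 'inet']
    print(cmd.check_module("nf_conntrack_ftp"))   # nf_conntrack_ftp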
site-packages/firewall/__init__.py000064400000000000147511334670013203 0ustar00site-packages/firewall/client.py000064400000410670147511334670012744 0ustar00# -*- coding: utf-8 -*-
#
# Copyright (C) 2009-2016 Red Hat, Inc.
#
# Authors:
# Thomas Woerner <twoerner@redhat.com>
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program.  If not, see <http://www.gnu.org/licenses/>.
#

from gi.repository import GLib, GObject

# force use of pygobject3 in python-slip
import sys
sys.modules['gobject'] = GObject

import dbus.mainloop.glib
import slip.dbus
from decorator import decorator

from firewall import config
from firewall.core.base import DEFAULT_ZONE_TARGET, DEFAULT_POLICY_TARGET, DEFAULT_POLICY_PRIORITY
from firewall.dbus_utils import dbus_to_python
from firewall.functions import b2u
from firewall.core.rich import Rich_Rule
from firewall.core.ipset import normalize_ipset_entry, check_entry_overlaps_existing, \
                                check_for_overlapping_entries
from firewall import errors
from firewall.errors import FirewallError

import dbus
import traceback

exception_handler = None
not_authorized_loop = False

@decorator
def handle_exceptions(func, *args, **kwargs):
    """Decorator to handle exceptions
    """
    authorized = False
    while not authorized:
        try:
            return func(*args, **kwargs)
        except dbus.exceptions.DBusException as e:
            dbus_message = e.get_dbus_message() # returns unicode
            dbus_name = e.get_dbus_name()
            if not exception_handler:
                raise
            if "NotAuthorizedException" in dbus_name:
                exception_handler("NotAuthorizedException")
            elif "org.freedesktop.DBus.Error" in dbus_name:
                # dbus error, try again
                exception_handler(dbus_message)
            else:
                authorized = True
                if dbus_message:
                    exception_handler(dbus_message)
                else:
                    exception_handler(b2u(str(e)))
        except FirewallError as e:
            if not exception_handler:
                raise
            else:
                exception_handler(b2u(str(e)))
        except Exception:
            if not exception_handler:
                raise
            else:
                exception_handler(b2u(traceback.format_exc()))
        if not not_authorized_loop:
            break
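
# Example (illustrative only, not part of the firewall client API): any
# module-level callable can reuse the decorator above. With the module globals
# exception_handler and not_authorized_loop left at their defaults, errors
# from the wrapped call simply propagate to the caller.
@handle_exceptions
def _example_getattr(obj, name):
    # attribute access routed through the decorator, so DBusException and
    # FirewallError are passed to exception_handler when one is registered
    return getattr(obj, name)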

# zone config settings

class FirewallClientZoneSettings(object):
    @handle_exceptions
    def __init__(self, settings = None):
        self.settings = ["", "", "", False, DEFAULT_ZONE_TARGET, [], [],
                         [], False, [], [], [], [], [], [], False, False]
        self.settings_name = ["version", "short", "description", "UNUSED",
                              "target", "services", "ports",
                              "icmp_blocks", "masquerade", "forward_ports",
                              "interfaces", "sources", "rules_str",
                              "protocols", "source_ports", "icmp_block_inversion",
                              "forward"]
        self.settings_dbus_type = ["s", "s", "s", "b",
                                   "s", "s", "(ss)",
                                   "s", "b", "(ssss)",
                                   "s", "s", "s",
                                   "s", "(ss)", "b",
                                   "b"]
        if settings:
            if isinstance(settings, list):
                for i, v in enumerate(settings):
                    self.settings[i] = v
            elif isinstance(settings, dict):
                self.setSettingsDict(settings)

    @handle_exceptions
    def __repr__(self):
        return '%s(%r)' % (self.__class__, self.settings)

    @handle_exceptions
    def getSettingsDict(self):
        settings = {}
        for key,value in zip(self.settings_name, self.settings):
            if key == 'UNUSED':
                continue
            settings[key] = value
        return settings
    @handle_exceptions
    def setSettingsDict(self, settings):
        for key in settings:
            self.settings[self.settings_name.index(key)] = settings[key]
    @handle_exceptions
    def getSettingsDbusDict(self):
        settings = {}
        for key,value,sig in zip(self.settings_name, self.settings, self.settings_dbus_type):
            if key == 'UNUSED':
                continue
            if type(value) is list:
                settings[key] = dbus.Array(value, signature=sig)
            elif type(value) is dict:
                settings[key] = dbus.Dictionary(value, signature=sig)
            else:
                settings[key] = value
        return settings

    @handle_exceptions
    def getRuntimeSettingsDict(self):
        settings = self.getSettingsDict()
        # These are not configurable at runtime:
        del settings['version']
        del settings['short']
        del settings['description']
        del settings['target']
        return settings
    @handle_exceptions
    def getRuntimeSettingsDbusDict(self):
        settings = self.getSettingsDbusDict()
        # These are not configurable at runtime:
        del settings['version']
        del settings['short']
        del settings['description']
        del settings['target']
        return settings

    @handle_exceptions
    def getVersion(self):
        return self.settings[0]
    @handle_exceptions
    def setVersion(self, version):
        self.settings[0] = version

    @handle_exceptions
    def getShort(self):
        return self.settings[1]
    @handle_exceptions
    def setShort(self, short):
        self.settings[1] = short

    @handle_exceptions
    def getDescription(self):
        return self.settings[2]
    @handle_exceptions
    def setDescription(self, description):
        self.settings[2] = description

    # self.settings[3] was used for 'immutable'

    @handle_exceptions
    def getTarget(self):
        return self.settings[4] if self.settings[4] != DEFAULT_ZONE_TARGET else "default"
    @handle_exceptions
    def setTarget(self, target):
        self.settings[4] = target if target != "default" else DEFAULT_ZONE_TARGET

    @handle_exceptions
    def getServices(self):
        return self.settings[5]
    @handle_exceptions
    def setServices(self, services):
        self.settings[5] = services
    @handle_exceptions
    def addService(self, service):
        if service not in self.settings[5]:
            self.settings[5].append(service)
        else:
            raise FirewallError(errors.ALREADY_ENABLED, service)
    @handle_exceptions
    def removeService(self, service):
        if service in self.settings[5]:
            self.settings[5].remove(service)
        else:
            raise FirewallError(errors.NOT_ENABLED, service)
    @handle_exceptions
    def queryService(self, service):
        return service in self.settings[5]

    @handle_exceptions
    def getPorts(self):
        return self.settings[6]
    @handle_exceptions
    def setPorts(self, ports):
        self.settings[6] = ports
    @handle_exceptions
    def addPort(self, port, protocol):
        if (port,protocol) not in self.settings[6]:
            self.settings[6].append((port,protocol))
        else:
            raise FirewallError(errors.ALREADY_ENABLED,
                                "'%s:%s'" % (port, protocol))
    @handle_exceptions
    def removePort(self, port, protocol):
        if (port,protocol) in self.settings[6]:
            self.settings[6].remove((port,protocol))
        else:
            raise FirewallError(errors.NOT_ENABLED,
                                "'%s:%s'" % (port, protocol))
    @handle_exceptions
    def queryPort(self, port, protocol):
        return (port,protocol) in self.settings[6]

    @handle_exceptions
    def getProtocols(self):
        return self.settings[13]
    @handle_exceptions
    def setProtocols(self, protocols):
        self.settings[13] = protocols
    @handle_exceptions
    def addProtocol(self, protocol):
        if protocol not in self.settings[13]:
            self.settings[13].append(protocol)
        else:
            raise FirewallError(errors.ALREADY_ENABLED, protocol)
    @handle_exceptions
    def removeProtocol(self, protocol):
        if protocol in self.settings[13]:
            self.settings[13].remove(protocol)
        else:
            raise FirewallError(errors.NOT_ENABLED, protocol)
    @handle_exceptions
    def queryProtocol(self, protocol):
        return protocol in self.settings[13]

    @handle_exceptions
    def getSourcePorts(self):
        return self.settings[14]
    @handle_exceptions
    def setSourcePorts(self, ports):
        self.settings[14] = ports
    @handle_exceptions
    def addSourcePort(self, port, protocol):
        if (port,protocol) not in self.settings[14]:
            self.settings[14].append((port,protocol))
        else:
            raise FirewallError(errors.ALREADY_ENABLED,
                                "'%s:%s'" % (port, protocol))
    @handle_exceptions
    def removeSourcePort(self, port, protocol):
        if (port,protocol) in self.settings[14]:
            self.settings[14].remove((port,protocol))
        else:
            raise FirewallError(errors.NOT_ENABLED,
                                "'%s:%s'" % (port, protocol))
    @handle_exceptions
    def querySourcePort(self, port, protocol):
        return (port,protocol) in self.settings[14]

    @handle_exceptions
    def getIcmpBlocks(self):
        return self.settings[7]
    @handle_exceptions
    def setIcmpBlocks(self, icmpblocks):
        self.settings[7] = icmpblocks
    @handle_exceptions
    def addIcmpBlock(self, icmptype):
        if icmptype not in self.settings[7]:
            self.settings[7].append(icmptype)
        else:
            raise FirewallError(errors.ALREADY_ENABLED, icmptype)
    @handle_exceptions
    def removeIcmpBlock(self, icmptype):
        if icmptype in self.settings[7]:
            self.settings[7].remove(icmptype)
        else:
            raise FirewallError(errors.NOT_ENABLED, icmptype)
    @handle_exceptions
    def queryIcmpBlock(self, icmptype):
        return icmptype in self.settings[7]

    @handle_exceptions
    def getIcmpBlockInversion(self):
        return self.settings[15]
    @handle_exceptions
    def setIcmpBlockInversion(self, flag):
        self.settings[15] = flag
    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def addIcmpBlockInversion(self):
        if not self.settings[15]:
            self.settings[15] = True
        else:
            raise FirewallError(errors.ALREADY_ENABLED, "icmp-block-inversion")
    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def removeIcmpBlockInversion(self):
        if self.settings[15]:
            self.settings[15] = False
        else:
            raise FirewallError(errors.NOT_ENABLED, "icmp-block-inversion")
    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def queryIcmpBlockInversion(self):
        return self.settings[15]

    @handle_exceptions
    def getForward(self):
        return self.settings[16]
    @handle_exceptions
    def setForward(self, forward):
        self.settings[16] = forward
    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def addForward(self):
        if not self.settings[16]:
            self.settings[16] = True
        else:
            raise FirewallError(errors.ALREADY_ENABLED, "forward")
    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def removeForward(self):
        if self.settings[16]:
            self.settings[16] = False
        else:
            raise FirewallError(errors.NOT_ENABLED, "forward")
    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def queryForward(self):
        return self.settings[16]

    @handle_exceptions
    def getMasquerade(self):
        return self.settings[8]
    @handle_exceptions
    def setMasquerade(self, masquerade):
        self.settings[8] = masquerade
    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def addMasquerade(self):
        if not self.settings[8]:
            self.settings[8] = True
        else:
            raise FirewallError(errors.ALREADY_ENABLED, "masquerade")
    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def removeMasquerade(self):
        if self.settings[8]:
            self.settings[8] = False
        else:
            raise FirewallError(errors.NOT_ENABLED, "masquerade")
    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def queryMasquerade(self):
        return self.settings[8]

    @handle_exceptions
    def getForwardPorts(self):
        return self.settings[9]
    @handle_exceptions
    def setForwardPorts(self, ports):
        self.settings[9] = ports
    @handle_exceptions
    def addForwardPort(self, port, protocol, to_port, to_addr):
        if to_port is None:
            to_port = ''
        if to_addr is None:
            to_addr = ''
        if (port,protocol,to_port,to_addr) not in self.settings[9]:
            self.settings[9].append((port,protocol,to_port,to_addr))
        else:
            raise FirewallError(errors.ALREADY_ENABLED, "'%s:%s:%s:%s'" % \
                                (port, protocol, to_port, to_addr))
    @handle_exceptions
    def removeForwardPort(self, port, protocol, to_port, to_addr):
        if to_port is None:
            to_port = ''
        if to_addr is None:
            to_addr = ''
        if (port,protocol,to_port,to_addr) in self.settings[9]:
            self.settings[9].remove((port,protocol,to_port,to_addr))
        else:
            raise FirewallError(errors.NOT_ENABLED, "'%s:%s:%s:%s'" % \
                                (port, protocol, to_port, to_addr))
    @handle_exceptions
    def queryForwardPort(self, port, protocol, to_port, to_addr):
        if to_port is None:
            to_port = ''
        if to_addr is None:
            to_addr = ''
        return (port,protocol,to_port,to_addr) in self.settings[9]

    @handle_exceptions
    def getInterfaces(self):
        return self.settings[10]
    @handle_exceptions
    def setInterfaces(self, interfaces):
        self.settings[10] = interfaces
    @handle_exceptions
    def addInterface(self, interface):
        if interface not in self.settings[10]:
            self.settings[10].append(interface)
        else:
            raise FirewallError(errors.ALREADY_ENABLED, interface)
    @handle_exceptions
    def removeInterface(self, interface):
        if interface in self.settings[10]:
            self.settings[10].remove(interface)
        else:
            raise FirewallError(errors.NOT_ENABLED, interface)
    @handle_exceptions
    def queryInterface(self, interface):
        return interface in self.settings[10]

    @handle_exceptions
    def getSources(self):
        return self.settings[11]
    @handle_exceptions
    def setSources(self, sources):
        self.settings[11] = sources
    @handle_exceptions
    def addSource(self, source):
        if source not in self.settings[11]:
            self.settings[11].append(source)
        else:
            raise FirewallError(errors.ALREADY_ENABLED, source)
    @handle_exceptions
    def removeSource(self, source):
        if source in self.settings[11]:
            self.settings[11].remove(source)
        else:
            raise FirewallError(errors.NOT_ENABLED, source)
    @handle_exceptions
    def querySource(self, source):
        return source in self.settings[11]

    @handle_exceptions
    def getRichRules(self):
        return self.settings[12]
    @handle_exceptions
    def setRichRules(self, rules):
        rules = [ str(Rich_Rule(rule_str=r)) for r in rules ]
        self.settings[12] = rules
    @handle_exceptions
    def addRichRule(self, rule):
        rule = str(Rich_Rule(rule_str=rule))
        if rule not in self.settings[12]:
            self.settings[12].append(rule)
        else:
            raise FirewallError(errors.ALREADY_ENABLED, rule)
    @handle_exceptions
    def removeRichRule(self, rule):
        rule = str(Rich_Rule(rule_str=rule))
        if rule in self.settings[12]:
            self.settings[12].remove(rule)
        else:
            raise FirewallError(errors.NOT_ENABLED, rule)
    @handle_exceptions
    def queryRichRule(self, rule):
        rule = str(Rich_Rule(rule_str=rule))
        return rule in self.settings[12]

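# A hedged usage sketch of FirewallClientZoneSettings (values below are
# illustrative, not a recommended configuration):
#
#   zone = FirewallClientZoneSettings()
#   zone.setShort("example")
#   zone.addService("ssh")
#   zone.addPort("8080", "tcp")
#   assert zone.queryService("ssh")
#   dbus_dict = zone.getSettingsDbusDict()   # suitable for update2()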

# zone config

class FirewallClientConfigZone(object):
    def __init__(self, bus, path):
        self.bus = bus
        self.path = path
        self.dbus_obj = self.bus.get_object(config.dbus.DBUS_INTERFACE, path)
        self.fw_zone = dbus.Interface(
            self.dbus_obj,
            dbus_interface=config.dbus.DBUS_INTERFACE_CONFIG_ZONE)
        self.fw_properties = dbus.Interface(
            self.dbus_obj, dbus_interface='org.freedesktop.DBus.Properties')
        #TODO: check interface version and revision (need to match client 
        # version)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def get_property(self, prop):
        return dbus_to_python(self.fw_properties.Get(
            config.dbus.DBUS_INTERFACE_CONFIG_ZONE, prop))

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def get_properties(self):
        return dbus_to_python(self.fw_properties.GetAll(
            config.dbus.DBUS_INTERFACE_CONFIG_ZONE))

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def set_property(self, prop, value):
        self.fw_properties.Set(config.dbus.DBUS_INTERFACE_CONFIG_ZONE,
                               prop, value)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getSettings(self):
        return FirewallClientZoneSettings(dbus_to_python(self.fw_zone.getSettings2()))

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def update(self, settings):
        self.fw_zone.update2(settings.getSettingsDbusDict())

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def loadDefaults(self):
        self.fw_zone.loadDefaults()

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def remove(self):
        self.fw_zone.remove()

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def rename(self, name):
        self.fw_zone.rename(name)

    # version

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getVersion(self):
        return self.fw_zone.getVersion()

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def setVersion(self, version):
        self.fw_zone.setVersion(version)

    # short

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getShort(self):
        return self.fw_zone.getShort()

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def setShort(self, short):
        self.fw_zone.setShort(short)

    # description

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getDescription(self):
        return self.fw_zone.getDescription()

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def setDescription(self, description):
        self.fw_zone.setDescription(description)

    # target

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getTarget(self):
        return self.fw_zone.getTarget()

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def setTarget(self, target):
        self.fw_zone.setTarget(target)

    # service

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getServices(self):
        return self.fw_zone.getServices()

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def setServices(self, services):
        self.fw_zone.setServices(services)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def addService(self, service):
        self.fw_zone.addService(service)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def removeService(self, service):
        self.fw_zone.removeService(service)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def queryService(self, service):
        return self.fw_zone.queryService(service)

    # port

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getPorts(self):
        return self.fw_zone.getPorts()

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def setPorts(self, ports):
        self.fw_zone.setPorts(ports)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def addPort(self, port, protocol):
        self.fw_zone.addPort(port, protocol)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def removePort(self, port, protocol):
        self.fw_zone.removePort(port, protocol)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def queryPort(self, port, protocol):
        return self.fw_zone.queryPort(port, protocol)

    # protocol

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getProtocols(self):
        return self.fw_zone.getProtocols()

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def setProtocols(self, protocols):
        self.fw_zone.setProtocols(protocols)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def addProtocol(self, protocol):
        self.fw_zone.addProtocol(protocol)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def removeProtocol(self, protocol):
        self.fw_zone.removeProtocol(protocol)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def queryProtocol(self, protocol):
        return self.fw_zone.queryProtocol(protocol)

    # source-port

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getSourcePorts(self):
        return self.fw_zone.getSourcePorts()

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def setSourcePorts(self, ports):
        self.fw_zone.setSourcePorts(ports)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def addSourcePort(self, port, protocol):
        self.fw_zone.addSourcePort(port, protocol)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def removeSourcePort(self, port, protocol):
        self.fw_zone.removeSourcePort(port, protocol)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def querySourcePort(self, port, protocol):
        return self.fw_zone.querySourcePort(port, protocol)

    # icmp block

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getIcmpBlocks(self):
        return self.fw_zone.getIcmpBlocks()

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def setIcmpBlocks(self, icmptypes):
        self.fw_zone.setIcmpBlocks(icmptypes)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def addIcmpBlock(self, icmptype):
        self.fw_zone.addIcmpBlock(icmptype)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def removeIcmpBlock(self, icmptype):
        self.fw_zone.removeIcmpBlock(icmptype)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def queryIcmpBlock(self, icmptype):
        return self.fw_zone.queryIcmpBlock(icmptype)

    # icmp-block-inversion

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getIcmpBlockInversion(self):
        return self.fw_zone.getIcmpBlockInversion()

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def setIcmpBlockInversion(self, inversion):
        self.fw_zone.setIcmpBlockInversion(inversion)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def addIcmpBlockInversion(self):
        self.fw_zone.addIcmpBlockInversion()

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def removeIcmpBlockInversion(self):
        self.fw_zone.removeIcmpBlockInversion()

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def queryIcmpBlockInversion(self):
        return self.fw_zone.queryIcmpBlockInversion()

    # forward

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getForward(self):
        return self.fw_zone.getSettings2()["forward"]

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def setForward(self, forward):
        self.fw_zone.update2({"forward": forward})

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def addForward(self):
        self.fw_zone.update2({"forward": True})

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def removeForward(self):
        self.fw_zone.update2({"forward": False})

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def queryForward(self):
        return self.fw_zone.getSettings2()["forward"]

    # masquerade

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getMasquerade(self):
        return self.fw_zone.getMasquerade()

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def setMasquerade(self, masquerade):
        self.fw_zone.setMasquerade(masquerade)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def addMasquerade(self):
        self.fw_zone.addMasquerade()

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def removeMasquerade(self):
        self.fw_zone.removeMasquerade()

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def queryMasquerade(self):
        return self.fw_zone.queryMasquerade()

    # forward port

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getForwardPorts(self):
        return self.fw_zone.getForwardPorts()

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def setForwardPorts(self, ports):
        self.fw_zone.setForwardPorts(ports)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def addForwardPort(self, port, protocol, toport, toaddr):
        if toport is None:
            toport = ''
        if toaddr is None:
            toaddr = ''
        self.fw_zone.addForwardPort(port, protocol, toport, toaddr)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def removeForwardPort(self, port, protocol, toport, toaddr):
        if toport is None:
            toport = ''
        if toaddr is None:
            toaddr = ''
        self.fw_zone.removeForwardPort(port, protocol, toport, toaddr)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def queryForwardPort(self, port, protocol, toport, toaddr):
        if toport is None:
            toport = ''
        if toaddr is None:
            toaddr = ''
        return self.fw_zone.queryForwardPort(port, protocol, toport, toaddr)

    # interface

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getInterfaces(self):
        return self.fw_zone.getInterfaces()

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def setInterfaces(self, interfaces):
        self.fw_zone.setInterfaces(interfaces)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def addInterface(self, interface):
        self.fw_zone.addInterface(interface)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def removeInterface(self, interface):
        self.fw_zone.removeInterface(interface)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def queryInterface(self, interface):
        return self.fw_zone.queryInterface(interface)

    # source

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getSources(self):
        return self.fw_zone.getSources()

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def setSources(self, sources):
        self.fw_zone.setSources(sources)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def addSource(self, source):
        self.fw_zone.addSource(source)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def removeSource(self, source):
        self.fw_zone.removeSource(source)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def querySource(self, source):
        return self.fw_zone.querySource(source)

    # rich rule

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getRichRules(self):
        return self.fw_zone.getRichRules()

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def setRichRules(self, rules):
        self.fw_zone.setRichRules(rules)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def addRichRule(self, rule):
        self.fw_zone.addRichRule(rule)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def removeRichRule(self, rule):
        self.fw_zone.removeRichRule(rule)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def queryRichRule(self, rule):
        return self.fw_zone.queryRichRule(rule)

class FirewallClientPolicySettings(object):
    @handle_exceptions
    def __init__(self, settings=None):
        self.settings = {"description": "",
                         "egress_zones": [],
                         "forward_ports": [],
                         "icmp_blocks": [],
                         "ingress_zones": [],
                         "masquerade": False,
                         "ports": [],
                         "priority": DEFAULT_POLICY_PRIORITY,
                         "protocols": [],
                         "rich_rules": [],
                         "services": [],
                         "short": "",
                         "source_ports": [],
                         "target": DEFAULT_POLICY_TARGET,
                         "version": "",
                         }
        self.settings_dbus_type = ["s", "s", "(ssss)", "s",
                                   "s", "b", "(ss)",
                                   "i", "s", "s",
                                   "s", "s", "(ss)",
                                   "s", "s"]
        if settings:
            self.setSettingsDict(settings)

    @handle_exceptions
    def __repr__(self):
        return '%s(%r)' % (self.__class__, self.settings)

    @handle_exceptions
    def getSettingsDict(self):
        return self.settings
    @handle_exceptions
    def setSettingsDict(self, settings):
        for key in settings:
            self.settings[key] = settings[key]
    @handle_exceptions
    def getSettingsDbusDict(self):
        settings = {}
        for key,sig in zip(self.settings, self.settings_dbus_type):
            value = self.settings[key]
            if type(value) is list:
                settings[key] = dbus.Array(value, signature=sig)
            elif type(value) is dict:
                settings[key] = dbus.Dictionary(value, signature=sig)
            else:
                settings[key] = value
        return settings
    @handle_exceptions
    def getRuntimeSettingsDbusDict(self):
        settings = self.getSettingsDbusDict()
        for key in ["version", "short", "description", "target"]:
            del settings[key]
        return settings

    @handle_exceptions
    def getVersion(self):
        return self.settings["version"]
    @handle_exceptions
    def setVersion(self, version):
        self.settings["version"] = version

    @handle_exceptions
    def getShort(self):
        return self.settings["short"]
    @handle_exceptions
    def setShort(self, short):
        self.settings["short"] = short

    @handle_exceptions
    def getDescription(self):
        return self.settings["description"]
    @handle_exceptions
    def setDescription(self, description):
        self.settings["description"] = description

    @handle_exceptions
    def getTarget(self):
        return self.settings["target"]
    @handle_exceptions
    def setTarget(self, target):
        self.settings["target"] = target

    @handle_exceptions
    def getServices(self):
        return self.settings["services"]
    @handle_exceptions
    def setServices(self, services):
        self.settings["services"] = services
    @handle_exceptions
    def addService(self, service):
        if service not in self.settings["services"]:
            self.settings["services"].append(service)
        else:
            raise FirewallError(errors.ALREADY_ENABLED, service)
    @handle_exceptions
    def removeService(self, service):
        if service in self.settings["services"]:
            self.settings["services"].remove(service)
        else:
            raise FirewallError(errors.NOT_ENABLED, service)
    @handle_exceptions
    def queryService(self, service):
        return service in self.settings["services"]

    @handle_exceptions
    def getPorts(self):
        return self.settings["ports"]
    @handle_exceptions
    def setPorts(self, ports):
        self.settings["ports"] = ports
    @handle_exceptions
    def addPort(self, port, protocol):
        if (port,protocol) not in self.settings["ports"]:
            self.settings["ports"].append((port,protocol))
        else:
            raise FirewallError(errors.ALREADY_ENABLED,
                                "'%s:%s'" % (port, protocol))
    @handle_exceptions
    def removePort(self, port, protocol):
        if (port,protocol) in self.settings["ports"]:
            self.settings["ports"].remove((port,protocol))
        else:
            raise FirewallError(errors.NOT_ENABLED,
                                "'%s:%s'" % (port, protocol))
    @handle_exceptions
    def queryPort(self, port, protocol):
        return (port,protocol) in self.settings["ports"]

    @handle_exceptions
    def getProtocols(self):
        return self.settings["protocols"]
    @handle_exceptions
    def setProtocols(self, protocols):
        self.settings["protocols"] = protocols
    @handle_exceptions
    def addProtocol(self, protocol):
        if protocol not in self.settings["protocols"]:
            self.settings["protocols"].append(protocol)
        else:
            raise FirewallError(errors.ALREADY_ENABLED, protocol)
    @handle_exceptions
    def removeProtocol(self, protocol):
        if protocol in self.settings["protocols"]:
            self.settings["protocols"].remove(protocol)
        else:
            raise FirewallError(errors.NOT_ENABLED, protocol)
    @handle_exceptions
    def queryProtocol(self, protocol):
        return protocol in self.settings["protocols"]

    @handle_exceptions
    def getSourcePorts(self):
        return self.settings["source_ports"]
    @handle_exceptions
    def setSourcePorts(self, ports):
        self.settings["source_ports"] = ports
    @handle_exceptions
    def addSourcePort(self, port, protocol):
        if (port,protocol) not in self.settings["source_ports"]:
            self.settings["source_ports"].append((port,protocol))
        else:
            raise FirewallError(errors.ALREADY_ENABLED,
                                "'%s:%s'" % (port, protocol))
    @handle_exceptions
    def removeSourcePort(self, port, protocol):
        if (port,protocol) in self.settings["source_ports"]:
            self.settings["source_ports"].remove((port,protocol))
        else:
            raise FirewallError(errors.NOT_ENABLED,
                                "'%s:%s'" % (port, protocol))
    @handle_exceptions
    def querySourcePort(self, port, protocol):
        return (port,protocol) in self.settings["source_ports"]

    @handle_exceptions
    def getIcmpBlocks(self):
        return self.settings["icmp_blocks"]
    @handle_exceptions
    def setIcmpBlocks(self, icmpblocks):
        self.settings["icmp_blocks"] = icmpblocks
    @handle_exceptions
    def addIcmpBlock(self, icmptype):
        if icmptype not in self.settings["icmp_blocks"]:
            self.settings["icmp_blocks"].append(icmptype)
        else:
            raise FirewallError(errors.ALREADY_ENABLED, icmptype)
    @handle_exceptions
    def removeIcmpBlock(self, icmptype):
        if icmptype in self.settings["icmp_blocks"]:
            self.settings["icmp_blocks"].remove(icmptype)
        else:
            raise FirewallError(errors.NOT_ENABLED, icmptype)
    @handle_exceptions
    def queryIcmpBlock(self, icmptype):
        return icmptype in self.settings["icmp_blocks"]

    @handle_exceptions
    def getMasquerade(self):
        return self.settings["masquerade"]
    @handle_exceptions
    def setMasquerade(self, masquerade):
        self.settings["masquerade"] = masquerade
    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def addMasquerade(self):
        if not self.settings["masquerade"]:
            self.settings["masquerade"] = True
        else:
            raise FirewallError(errors.ALREADY_ENABLED, "masquerade")
    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def removeMasquerade(self):
        if self.settings["masquerade"]:
            self.settings["masquerade"] = False
        else:
            raise FirewallError(errors.NOT_ENABLED, "masquerade")
    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def queryMasquerade(self):
        return self.settings["masquerade"]

    @handle_exceptions
    def getForwardPorts(self):
        return self.settings["forward_ports"]
    @handle_exceptions
    def setForwardPorts(self, ports):
        self.settings["forward_ports"] = ports
    @handle_exceptions
    def addForwardPort(self, port, protocol, to_port, to_addr):
        if to_port is None:
            to_port = ''
        if to_addr is None:
            to_addr = ''
        if (port,protocol,to_port,to_addr) not in self.settings["forward_ports"]:
            self.settings["forward_ports"].append((port,protocol,to_port,to_addr))
        else:
            raise FirewallError(errors.ALREADY_ENABLED, "'%s:%s:%s:%s'" % \
                                (port, protocol, to_port, to_addr))
    @handle_exceptions
    def removeForwardPort(self, port, protocol, to_port, to_addr):
        if to_port is None:
            to_port = ''
        if to_addr is None:
            to_addr = ''
        if (port,protocol,to_port,to_addr) in self.settings["forward_ports"]:
            self.settings["forward_ports"].remove((port,protocol,to_port,to_addr))
        else:
            raise FirewallError(errors.NOT_ENABLED, "'%s:%s:%s:%s'" % \
                                (port, protocol, to_port, to_addr))
    @handle_exceptions
    def queryForwardPort(self, port, protocol, to_port, to_addr):
        if to_port is None:
            to_port = ''
        if to_addr is None:
            to_addr = ''
        return (port,protocol,to_port,to_addr) in self.settings["forward_ports"]

    @handle_exceptions
    def getRichRules(self):
        return self.settings["rich_rules"]
    @handle_exceptions
    def setRichRules(self, rules):
        rules = [ str(Rich_Rule(rule_str=r)) for r in rules ]
        self.settings["rich_rules"] = rules
    @handle_exceptions
    def addRichRule(self, rule):
        rule = str(Rich_Rule(rule_str=rule))
        if rule not in self.settings["rich_rules"]:
            self.settings["rich_rules"].append(rule)
        else:
            raise FirewallError(errors.ALREADY_ENABLED, rule)
    @handle_exceptions
    def removeRichRule(self, rule):
        rule = str(Rich_Rule(rule_str=rule))
        if rule in self.settings["rich_rules"]:
            self.settings["rich_rules"].remove(rule)
        else:
            raise FirewallError(errors.NOT_ENABLED, rule)
    @handle_exceptions
    def queryRichRule(self, rule):
        rule = str(Rich_Rule(rule_str=rule))
        return rule in self.settings["rich_rules"]

    @handle_exceptions
    def getIngressZones(self):
        return self.settings["ingress_zones"]
    @handle_exceptions
    def setIngressZones(self, ingress_zones):
        self.settings["ingress_zones"] = ingress_zones
    @handle_exceptions
    def addIngressZone(self, ingress_zone):
        if ingress_zone not in self.settings["ingress_zones"]:
            self.settings["ingress_zones"].append(ingress_zone)
        else:
            raise FirewallError(errors.ALREADY_ENABLED, ingress_zone)
    @handle_exceptions
    def removeIngressZone(self, ingress_zone):
        if ingress_zone in self.settings["ingress_zones"]:
            self.settings["ingress_zones"].remove(ingress_zone)
        else:
            raise FirewallError(errors.NOT_ENABLED, ingress_zone)
    @handle_exceptions
    def queryIngressZone(self, ingress_zone):
        return ingress_zone in self.settings["ingress_zones"]

    @handle_exceptions
    def getEgressZones(self):
        return self.settings["egress_zones"]
    @handle_exceptions
    def setEgressZones(self, egress_zones):
        self.settings["egress_zones"] = egress_zones
    @handle_exceptions
    def addEgressZone(self, egress_zone):
        if egress_zone not in self.settings["egress_zones"]:
            self.settings["egress_zones"].append(egress_zone)
        else:
            raise FirewallError(errors.ALREADY_ENABLED, egress_zone)
    @handle_exceptions
    def removeEgressZone(self, egress_zone):
        if egress_zone in self.settings["egress_zones"]:
            self.settings["egress_zones"].remove(egress_zone)
        else:
            raise FirewallError(errors.NOT_ENABLED, egress_zone)
    @handle_exceptions
    def queryEgressZone(self, egress_zone):
        return egress_zone in self.settings["egress_zones"]

    @handle_exceptions
    def getPriority(self):
        return self.settings["priority"]
    @handle_exceptions
    def setPriority(self, priority):
        self.settings["priority"] = int(priority)

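# A hedged usage sketch of FirewallClientPolicySettings (zone names and the
# priority are illustrative):
#
#   policy = FirewallClientPolicySettings()
#   policy.addIngressZone("internal")
#   policy.addEgressZone("external")
#   policy.setPriority(-100)
#   policy.addService("dns")
#   dbus_dict = policy.getSettingsDbusDict()   # suitable for update()
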
class FirewallClientConfigPolicy(object):
    def __init__(self, bus, path):
        self.bus = bus
        self.path = path
        self.dbus_obj = self.bus.get_object(config.dbus.DBUS_INTERFACE, path)
        self.fw_policy = dbus.Interface(
            self.dbus_obj,
            dbus_interface=config.dbus.DBUS_INTERFACE_CONFIG_POLICY)
        self.fw_properties = dbus.Interface(
            self.dbus_obj, dbus_interface='org.freedesktop.DBus.Properties')

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def get_property(self, prop):
        return dbus_to_python(self.fw_properties.Get(
            config.dbus.DBUS_INTERFACE_CONFIG_POLICY, prop))

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def get_properties(self):
        return dbus_to_python(self.fw_properties.GetAll(
            config.dbus.DBUS_INTERFACE_CONFIG_POLICY))

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def set_property(self, prop, value):
        self.fw_properties.Set(config.dbus.DBUS_INTERFACE_CONFIG_POLICY,
                               prop, value)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getSettings(self):
        return FirewallClientPolicySettings(dbus_to_python(self.fw_policy.getSettings()))

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def update(self, settings):
        self.fw_policy.update(settings.getSettingsDbusDict())

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def loadDefaults(self):
        self.fw_policy.loadDefaults()

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def remove(self):
        self.fw_policy.remove()

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def rename(self, name):
        self.fw_policy.rename(name)

# service config settings

class FirewallClientServiceSettings(object):
    @handle_exceptions
    def __init__(self, settings=None):
        self.settings = ["", "", "", [], [], {}, [], [], [], []]
        self.settings_name = ["version", "short", "description", "ports",
                              "modules", "destination", "protocols",
                              "source_ports", "includes", "helpers"]
        self.settings_dbus_type = ["s", "s", "s", "(ss)",
                                   "s", "ss", "s",
                                   "(ss)", "s", "s"]
        if settings:
            if type(settings) is list:
                for i, v in enumerate(settings):
                    self.settings[i] = v
            elif type(settings) is dict:
                self.setSettingsDict(settings)

    @handle_exceptions
    def __repr__(self):
        return '%s(%r)' % (self.__class__, self.settings)

    @handle_exceptions
    def getSettingsDict(self):
        settings = {}
        for key,value in zip(self.settings_name, self.settings):
            settings[key] = value
        return settings
    @handle_exceptions
    def setSettingsDict(self, settings):
        for key in settings:
            self.settings[self.settings_name.index(key)] = settings[key]
    @handle_exceptions
    def getSettingsDbusDict(self):
        settings = {}
        for key,value,sig in zip(self.settings_name, self.settings, self.settings_dbus_type):
            if type(value) is list:
                settings[key] = dbus.Array(value, signature=sig)
            elif type(value) is dict:
                settings[key] = dbus.Dictionary(value, signature=sig)
            else:
                settings[key] = value
        return settings

    @handle_exceptions
    def getVersion(self):
        return self.settings[0]
    @handle_exceptions
    def setVersion(self, version):
        self.settings[0] = version

    @handle_exceptions
    def getShort(self):
        return self.settings[1]
    @handle_exceptions
    def setShort(self, short):
        self.settings[1] = short

    @handle_exceptions
    def getDescription(self):
        return self.settings[2]
    @handle_exceptions
    def setDescription(self, description):
        self.settings[2] = description

    @handle_exceptions
    def getPorts(self):
        return self.settings[3]
    @handle_exceptions
    def setPorts(self, ports):
        self.settings[3] = ports
    @handle_exceptions
    def addPort(self, port, protocol):
        if (port,protocol) not in self.settings[3]:
            self.settings[3].append((port,protocol))
        else:
            raise FirewallError(errors.ALREADY_ENABLED,
                                "'%s:%s'" % (port, protocol))
    @handle_exceptions
    def removePort(self, port, protocol):
        if (port,protocol) in self.settings[3]:
            self.settings[3].remove((port,protocol))
        else:
            raise FirewallError(errors.NOT_ENABLED,
                                "'%s:%s'" % (port, protocol))
    @handle_exceptions
    def queryPort(self, port, protocol):
        return (port,protocol) in self.settings[3]

    @handle_exceptions
    def getProtocols(self):
        return self.settings[6]
    @handle_exceptions
    def setProtocols(self, protocols):
        self.settings[6] = protocols
    @handle_exceptions
    def addProtocol(self, protocol):
        if protocol not in self.settings[6]:
            self.settings[6].append(protocol)
        else:
            raise FirewallError(errors.ALREADY_ENABLED, protocol)
    @handle_exceptions
    def removeProtocol(self, protocol):
        if protocol in self.settings[6]:
            self.settings[6].remove(protocol)
        else:
            raise FirewallError(errors.NOT_ENABLED, protocol)
    @handle_exceptions
    def queryProtocol(self, protocol):
        return protocol in self.settings[6]

    @handle_exceptions
    def getSourcePorts(self):
        return self.settings[7]
    @handle_exceptions
    def setSourcePorts(self, ports):
        self.settings[7] = ports
    @handle_exceptions
    def addSourcePort(self, port, protocol):
        if (port,protocol) not in self.settings[7]:
            self.settings[7].append((port,protocol))
        else:
            raise FirewallError(errors.ALREADY_ENABLED,
                                "'%s:%s'" % (port, protocol))
    @handle_exceptions
    def removeSourcePort(self, port, protocol):
        if (port,protocol) in self.settings[7]:
            self.settings[7].remove((port,protocol))
        else:
            raise FirewallError(errors.NOT_ENABLED,
                                "'%s:%s'" % (port, protocol))
    @handle_exceptions
    def querySourcePort(self, port, protocol):
        return (port,protocol) in self.settings[7]

    @handle_exceptions
    def getModules(self):
        return self.settings[4]
    @handle_exceptions
    def setModules(self, modules):
        self.settings[4] = modules
    @handle_exceptions
    def addModule(self, module):
        if module not in self.settings[4]:
            self.settings[4].append(module)
        else:
            raise FirewallError(errors.ALREADY_ENABLED, module)
    @handle_exceptions
    def removeModule(self, module):
        if module in self.settings[4]:
            self.settings[4].remove(module)
        else:
            raise FirewallError(errors.NOT_ENABLED, module)
    @handle_exceptions
    def queryModule(self, module):
        return module in self.settings[4]

    @handle_exceptions
    def getDestinations(self):
        return self.settings[5]
    @handle_exceptions
    def setDestinations(self, destinations):
        self.settings[5] = destinations
    @handle_exceptions
    def setDestination(self, dest_type, address):
        if dest_type not in self.settings[5] or \
           self.settings[5][dest_type] != address:
            self.settings[5][dest_type] = address
        else:
            raise FirewallError(errors.ALREADY_ENABLED, "'%s:%s'" % \
                                (dest_type, address))
    @handle_exceptions
    def removeDestination(self, dest_type, address=None):
        if dest_type in self.settings[5]:
            if address is not None and self.settings[5][dest_type] != address:
                raise FirewallError(errors.NOT_ENABLED, "'%s:%s'" % \
                                    (dest_type, address))
            del self.settings[5][dest_type]
        else:
            raise FirewallError(errors.NOT_ENABLED, "'%s'" % dest_type)
    @handle_exceptions
    def queryDestination(self, dest_type, address):
        return (dest_type in self.settings[5] and \
                    address == self.settings[5][dest_type])

    @handle_exceptions
    def getIncludes(self):
        return self.settings[8]
    @handle_exceptions
    def setIncludes(self, includes):
        self.settings[8] = includes
    @handle_exceptions
    def addInclude(self, include):
        if include not in self.settings[8]:
            self.settings[8].append(include)
        else:
            raise FirewallError(errors.ALREADY_ENABLED, include)
    @handle_exceptions
    def removeInclude(self, include):
        if include in self.settings[8]:
            self.settings[8].remove(include)
        else:
            raise FirewallError(errors.NOT_ENABLED, include)
    @handle_exceptions
    def queryInclude(self, include):
        return include in self.settings[8]

    @handle_exceptions
    def getHelpers(self):
        return self.settings[9]
    @handle_exceptions
    def setHelpers(self, helpers):
        self.settings[9] = helpers
    @handle_exceptions
    def addHelper(self, helper):
        if helper not in self.settings[9]:
            self.settings[9].append(helper)
        else:
            raise FirewallError(errors.ALREADY_ENABLED, helper)
    @handle_exceptions
    def removeHelper(self, helper):
        if helper in self.settings[9]:
            self.settings[9].remove(helper)
        else:
            raise FirewallError(errors.NOT_ENABLED, helper)
    @handle_exceptions
    def queryHelper(self, helper):
        return helper in self.settings[9]

# ipset config settings

class FirewallClientIPSetSettings(object):
    @handle_exceptions
    def __init__(self, settings=None):
        if settings:
            self.settings = settings
        else:
            self.settings = ["", "", "", "", {}, []]

    @handle_exceptions
    def __repr__(self):
        return '%s(%r)' % (self.__class__, self.settings)

    @handle_exceptions
    def getVersion(self):
        return self.settings[0]
    @handle_exceptions
    def setVersion(self, version):
        self.settings[0] = version

    @handle_exceptions
    def getShort(self):
        return self.settings[1]
    @handle_exceptions
    def setShort(self, short):
        self.settings[1] = short

    @handle_exceptions
    def getDescription(self):
        return self.settings[2]
    @handle_exceptions
    def setDescription(self, description):
        self.settings[2] = description

    @handle_exceptions
    def getType(self):
        return self.settings[3]
    @handle_exceptions
    def setType(self, ipset_type):
        self.settings[3] = ipset_type

    @handle_exceptions
    def getOptions(self):
        return self.settings[4]
    @handle_exceptions
    def setOptions(self, options):
        self.settings[4] = options
    @handle_exceptions
    def addOption(self, key, value):
        if key not in self.settings[4] or self.settings[4][key] != value:
            self.settings[4][key] = value
        else:
            raise FirewallError(errors.ALREADY_ENABLED, "'%s=%s'" % (key,value)
                                if value else key)
    @handle_exceptions
    def removeOption(self, key):
        if key in self.settings[4]:
            del self.settings[4][key]
        else:
            raise FirewallError(errors.NOT_ENABLED, key)
    @handle_exceptions
    def queryOption(self, key, value):
        return key in self.settings[4] and self.settings[4][key] == value

    @handle_exceptions
    def getEntries(self):
        return self.settings[5]
    @handle_exceptions
    def setEntries(self, entries):
        if "timeout" in self.settings[4] and \
           self.settings[4]["timeout"] != "0":
            raise FirewallError(errors.IPSET_WITH_TIMEOUT)
        check_for_overlapping_entries(entries)
        self.settings[5] = entries
    @handle_exceptions
    def addEntry(self, entry):
        if "timeout" in self.settings[4] and \
           self.settings[4]["timeout"] != "0":
            raise FirewallError(errors.IPSET_WITH_TIMEOUT)
        entry = normalize_ipset_entry(entry)
        if entry not in self.settings[5]:
            check_entry_overlaps_existing(entry, self.settings[5])
            self.settings[5].append(entry)
        else:
            raise FirewallError(errors.ALREADY_ENABLED, entry)
    @handle_exceptions
    def removeEntry(self, entry):
        if "timeout" in self.settings[4] and \
           self.settings[4]["timeout"] != "0":
            raise FirewallError(errors.IPSET_WITH_TIMEOUT)
        entry = normalize_ipset_entry(entry)
        if entry in self.settings[5]:
            self.settings[5].remove(entry)
        else:
            raise FirewallError(errors.NOT_ENABLED, entry)
    @handle_exceptions
    def queryEntry(self, entry):
        if "timeout" in self.settings[4] and \
           self.settings[4]["timeout"] != "0":
            raise FirewallError(errors.IPSET_WITH_TIMEOUT)
        entry = normalize_ipset_entry(entry)
        return entry in self.settings[5]

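# A hedged usage sketch of FirewallClientIPSetSettings; entries are normalized
# and checked for overlaps, and entry changes are refused for ipsets created
# with a non-zero timeout (values below are illustrative):
#
#   ipset = FirewallClientIPSetSettings()
#   ipset.setType("hash:ip")
#   ipset.addEntry("192.0.2.10")
#   assert ipset.queryEntry("192.0.2.10")
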
# ipset config

class FirewallClientConfigIPSet(object):
    @handle_exceptions
    def __init__(self, bus, path):
        self.bus = bus
        self.path = path
        self.dbus_obj = self.bus.get_object(config.dbus.DBUS_INTERFACE, path)
        self.fw_ipset = dbus.Interface(
            self.dbus_obj,
            dbus_interface=config.dbus.DBUS_INTERFACE_CONFIG_IPSET)
        self.fw_properties = dbus.Interface(
            self.dbus_obj, dbus_interface='org.freedesktop.DBus.Properties')

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def get_property(self, prop):
        return dbus_to_python(self.fw_properties.Get(
            config.dbus.DBUS_INTERFACE_CONFIG_IPSET, prop))

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def get_properties(self):
        return dbus_to_python(self.fw_properties.GetAll(
            config.dbus.DBUS_INTERFACE_CONFIG_IPSET))

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def set_property(self, prop, value):
        self.fw_properties.Set(config.dbus.DBUS_INTERFACE_CONFIG_IPSET,
                               prop, value)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getSettings(self):
        return FirewallClientIPSetSettings(list(dbus_to_python(\
                    self.fw_ipset.getSettings())))

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def update(self, settings):
        self.fw_ipset.update(tuple(settings.settings))

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def loadDefaults(self):
        self.fw_ipset.loadDefaults()

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def remove(self):
        self.fw_ipset.remove()

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def rename(self, name):
        self.fw_ipset.rename(name)

    # version

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getVersion(self):
        return self.fw_ipset.getVersion()

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def setVersion(self, version):
        self.fw_ipset.setVersion(version)

    # short

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getShort(self):
        return self.fw_ipset.getShort()

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def setShort(self, short):
        self.fw_ipset.setShort(short)

    # description

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getDescription(self):
        return self.fw_ipset.getDescription()

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def setDescription(self, description):
        self.fw_ipset.setDescription(description)

    # entry

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getEntries(self):
        return self.fw_ipset.getEntries()

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def setEntries(self, entries):
        self.fw_ipset.setEntries(entries)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def addEntry(self, entry):
        self.fw_ipset.addEntry(entry)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def removeEntry(self, entry):
        self.fw_ipset.removeEntry(entry)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def queryEntry(self, entry):
        return self.fw_ipset.queryEntry(entry)

# helper config settings

class FirewallClientHelperSettings(object):
    @handle_exceptions
    def __init__(self, settings=None):
        if settings:
            self.settings = settings
        else:
            self.settings = ["", "", "", "", "", [ ]]

    @handle_exceptions
    def __repr__(self):
        return '%s(%r)' % (self.__class__, self.settings)

    @handle_exceptions
    def getVersion(self):
        return self.settings[0]
    @handle_exceptions
    def setVersion(self, version):
        self.settings[0] = version

    @handle_exceptions
    def getShort(self):
        return self.settings[1]
    @handle_exceptions
    def setShort(self, short):
        self.settings[1] = short

    @handle_exceptions
    def getDescription(self):
        return self.settings[2]
    @handle_exceptions
    def setDescription(self, description):
        self.settings[2] = description

    @handle_exceptions
    def getFamily(self):
        return self.settings[3]
    @handle_exceptions
    def setFamily(self, ipv):
        if ipv is None:
            self.settings[3] = ""
        else:
            self.settings[3] = ipv

    @handle_exceptions
    def getModule(self):
        return self.settings[4]
    @handle_exceptions
    def setModule(self, module):
        self.settings[4] = module

    @handle_exceptions
    def getPorts(self):
        return self.settings[5]
    @handle_exceptions
    def setPorts(self, ports):
        self.settings[5] = ports
    @handle_exceptions
    def addPort(self, port, protocol):
        if (port,protocol) not in self.settings[5]:
            self.settings[5].append((port,protocol))
        else:
            raise FirewallError(errors.ALREADY_ENABLED,
                                "'%s:%s'" % (port, protocol))
    @handle_exceptions
    def removePort(self, port, protocol):
        if (port,protocol) in self.settings[5]:
            self.settings[5].remove((port,protocol))
        else:
            raise FirewallError(errors.NOT_ENABLED,
                                "'%s:%s'" % (port, protocol))
    @handle_exceptions
    def queryPort(self, port, protocol):
        return (port,protocol) in self.settings[5]
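
# Illustrative sketch (added for documentation, not part of the firewalld API):
# how the FirewallClientHelperSettings list maps onto its accessors.  The
# module name and port below are examples only; the function is never called
# by this module.
def _example_helper_settings():
    settings = FirewallClientHelperSettings()
    settings.setShort("ftp")                # stored in settings[1]
    settings.setModule("nf_conntrack_ftp")  # stored in settings[4]
    settings.addPort("21", "tcp")           # appended to settings[5]
    assert settings.queryPort("21", "tcp")
    # adding the same port again would raise FirewallError(ALREADY_ENABLED)
    return settings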

# helper config

class FirewallClientConfigHelper(object):
    @handle_exceptions
    def __init__(self, bus, path):
        self.bus = bus
        self.path = path
        self.dbus_obj = self.bus.get_object(config.dbus.DBUS_INTERFACE, path)
        self.fw_helper = dbus.Interface(
            self.dbus_obj,
            dbus_interface=config.dbus.DBUS_INTERFACE_CONFIG_HELPER)
        self.fw_properties = dbus.Interface(
            self.dbus_obj, dbus_interface='org.freedesktop.DBus.Properties')

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def get_property(self, prop):
        return dbus_to_python(self.fw_properties.Get(
            config.dbus.DBUS_INTERFACE_CONFIG_HELPER, prop))

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def get_properties(self):
        return dbus_to_python(self.fw_properties.GetAll(
            config.dbus.DBUS_INTERFACE_CONFIG_HELPER))

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def set_property(self, prop, value):
        self.fw_properties.Set(config.dbus.DBUS_INTERFACE_CONFIG_HELPER,
                               prop, value)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getSettings(self):
        return FirewallClientHelperSettings(list(dbus_to_python(\
                    self.fw_helper.getSettings())))

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def update(self, settings):
        self.fw_helper.update(tuple(settings.settings))

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def loadDefaults(self):
        self.fw_helper.loadDefaults()

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def remove(self):
        self.fw_helper.remove()

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def rename(self, name):
        self.fw_helper.rename(name)

    # version

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getVersion(self):
        return self.fw_helper.getVersion()

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def setVersion(self, version):
        self.fw_helper.setVersion(version)

    # short

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getShort(self):
        return self.fw_helper.getShort()

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def setShort(self, short):
        self.fw_helper.setShort(short)

    # description

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getDescription(self):
        return self.fw_helper.getDescription()

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def setDescription(self, description):
        self.fw_helper.setDescription(description)

    # port

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getPorts(self):
        return self.fw_helper.getPorts()

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def setPorts(self, ports):
        self.fw_helper.setPorts(ports)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def addPort(self, port, protocol):
        self.fw_helper.addPort(port, protocol)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def removePort(self, port, protocol):
        self.fw_helper.removePort(port, protocol)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def queryPort(self, port, protocol):
        return self.fw_helper.queryPort(port, protocol)

    # family

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getFamily(self):
        return self.fw_helper.getFamily()

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def setFamily(self, ipv):
        if ipv is None:
            self.fw_helper.setFamily("")
            return
        self.fw_helper.setFamily(ipv)

    # module

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getModule(self):
        return self.fw_helper.getModule()

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def setModule(self, module):
        self.fw_helper.setModule(module)
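
# Illustrative sketch (not part of the original module): the usual
# read-modify-write cycle on a permanent helper object.  `helper_conf` is
# assumed to be a FirewallClientConfigHelper, e.g. obtained via
# FirewallClientConfig.getHelperByName(); calling it requires a running
# firewalld.
def _example_update_helper(helper_conf):
    settings = helper_conf.getSettings()     # FirewallClientHelperSettings
    if not settings.queryPort("21", "tcp"):  # example port only
        settings.addPort("21", "tcp")
    helper_conf.update(settings)             # write the settings back over D-Bus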

# service config

class FirewallClientConfigService(object):
    @handle_exceptions
    def __init__(self, bus, path):
        self.bus = bus
        self.path = path
        self.dbus_obj = self.bus.get_object(config.dbus.DBUS_INTERFACE, path)
        self.fw_service = dbus.Interface(
            self.dbus_obj,
            dbus_interface=config.dbus.DBUS_INTERFACE_CONFIG_SERVICE)
        self.fw_properties = dbus.Interface(
            self.dbus_obj, dbus_interface='org.freedesktop.DBus.Properties')

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def get_property(self, prop):
        return dbus_to_python(self.fw_properties.Get(
            config.dbus.DBUS_INTERFACE_CONFIG_SERVICE, prop))

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def get_properties(self):
        return dbus_to_python(self.fw_properties.GetAll(
            config.dbus.DBUS_INTERFACE_CONFIG_SERVICE))

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def set_property(self, prop, value):
        self.fw_properties.Set(config.dbus.DBUS_INTERFACE_CONFIG_SERVICE,
                               prop, value)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getSettings(self):
        return FirewallClientServiceSettings(dbus_to_python(
                    self.fw_service.getSettings2()))

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def update(self, settings):
        self.fw_service.update2(settings.getSettingsDbusDict())

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def loadDefaults(self):
        self.fw_service.loadDefaults()

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def remove(self):
        self.fw_service.remove()

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def rename(self, name):
        self.fw_service.rename(name)

    # version

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getVersion(self):
        return self.fw_service.getVersion()

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def setVersion(self, version):
        self.fw_service.setVersion(version)

    # short

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getShort(self):
        return self.fw_service.getShort()

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def setShort(self, short):
        self.fw_service.setShort(short)

    # description

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getDescription(self):
        return self.fw_service.getDescription()

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def setDescription(self, description):
        self.fw_service.setDescription(description)

    # port

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getPorts(self):
        return self.fw_service.getPorts()

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def setPorts(self, ports):
        self.fw_service.setPorts(ports)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def addPort(self, port, protocol):
        self.fw_service.addPort(port, protocol)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def removePort(self, port, protocol):
        self.fw_service.removePort(port, protocol)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def queryPort(self, port, protocol):
        return self.fw_service.queryPort(port, protocol)

    # protocol

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getProtocols(self):
        return self.fw_service.getProtocols()

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def setProtocols(self, protocols):
        self.fw_service.setProtocols(protocols)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def addProtocol(self, protocol):
        self.fw_service.addProtocol(protocol)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def removeProtocol(self, protocol):
        self.fw_service.removeProtocol(protocol)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def queryProtocol(self, protocol):
        return self.fw_service.queryProtocol(protocol)

    # source-port

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getSourcePorts(self):
        return self.fw_service.getSourcePorts()

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def setSourcePorts(self, ports):
        self.fw_service.setSourcePorts(ports)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def addSourcePort(self, port, protocol):
        self.fw_service.addSourcePort(port, protocol)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def removeSourcePort(self, port, protocol):
        self.fw_service.removeSourcePort(port, protocol)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def querySourcePort(self, port, protocol):
        return self.fw_service.querySourcePort(port, protocol)

    # module

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getModules(self):
        return self.fw_service.getModules()

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def setModules(self, modules):
        self.fw_service.setModules(modules)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def addModule(self, module):
        self.fw_service.addModule(module)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def removeModule(self, module):
        self.fw_service.removeModule(module)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def queryModule(self, module):
        return self.fw_service.queryModule(module)

    # destination

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getDestinations(self):
        return self.fw_service.getDestinations()

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def setDestinations(self, destinations):
        self.fw_service.setDestinations(destinations)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getDestination(self, destination):
        return self.fw_service.getDestination(destination)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def setDestination(self, destination, address):
        self.fw_service.setDestination(destination, address)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def removeDestination(self, destination, address=None):
        if address is not None and self.getDestination(destination) != address:
            raise FirewallError(errors.NOT_ENABLED, "'%s:%s'" % \
                                (destination, address))
        self.fw_service.removeDestination(destination)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def queryDestination(self, destination, address):
        return self.fw_service.queryDestination(destination, address)

    # include

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getIncludes(self):
        return self.fw_service.getIncludes()

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def setIncludes(self, includes):
        self.fw_service.setIncludes(includes)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def addInclude(self, include):
        self.fw_service.addInclude(include)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def removeInclude(self, include):
        self.fw_service.removeInclude(include)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def queryInclude(self, include):
        return self.fw_service.queryInclude(include)
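
# Illustrative sketch (not part of the original module): permanent service
# objects follow the same read-modify-write cycle, but getSettings()/update()
# go through the dict based getSettings2()/update2() D-Bus calls shown above.
# `service_conf` is assumed to be a FirewallClientConfigService; calling it
# requires a running firewalld.
def _example_update_service(service_conf):
    settings = service_conf.getSettings()      # FirewallClientServiceSettings
    if not settings.queryPort("8080", "tcp"):  # example port only
        settings.addPort("8080", "tcp")
    service_conf.update(settings)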


# icmptype config settings

class FirewallClientIcmpTypeSettings(object):
    @handle_exceptions
    def __init__(self, settings=None):
        if settings:
            self.settings = settings
        else:
            self.settings = ["", "", "", []]

    @handle_exceptions
    def __repr__(self):
        return '%s(%r)' % (self.__class__, self.settings)

    @handle_exceptions
    def getVersion(self):
        return self.settings[0]
    @handle_exceptions
    def setVersion(self, version):
        self.settings[0] = version

    @handle_exceptions
    def getShort(self):
        return self.settings[1]
    @handle_exceptions
    def setShort(self, short):
        self.settings[1] = short

    @handle_exceptions
    def getDescription(self):
        return self.settings[2]
    @handle_exceptions
    def setDescription(self, description):
        self.settings[2] = description

    @handle_exceptions
    def getDestinations(self):
        return self.settings[3]
    @handle_exceptions
    def setDestinations(self, destinations):
        self.settings[3] = destinations
    @handle_exceptions
    def addDestination(self, destination):
        # empty means all
        if not self.settings[3]:
            raise FirewallError(errors.ALREADY_ENABLED, destination)
        elif destination not in self.settings[3]:
            self.settings[3].append(destination)
        else:
            raise FirewallError(errors.ALREADY_ENABLED, destination)
    @handle_exceptions
    def removeDestination(self, destination):
        if destination in self.settings[3]:
            self.settings[3].remove(destination)
        # empty means all
        elif not self.settings[3]:
            self.setDestinations(list(set(['ipv4','ipv6']) - \
                                      set([destination])))
        else:
            raise FirewallError(errors.NOT_ENABLED, destination)

    @handle_exceptions
    def queryDestination(self, destination):
        # empty means all
        return not self.settings[3] or \
               destination in self.settings[3]
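
# Illustrative sketch (not part of the original module): destination handling
# in FirewallClientIcmpTypeSettings, where an empty destination list means
# "both ipv4 and ipv6".  The function is never called by this module.
def _example_icmptype_destinations():
    settings = FirewallClientIcmpTypeSettings()
    assert settings.queryDestination("ipv4")  # empty list matches everything
    # removing one destination from the empty ("all") state keeps the other one
    settings.removeDestination("ipv4")
    assert settings.getDestinations() == ["ipv6"]
    return settings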

# icmptype config

class FirewallClientConfigIcmpType(object):
    @handle_exceptions
    def __init__(self, bus, path):
        self.bus = bus
        self.path = path
        self.dbus_obj = self.bus.get_object(config.dbus.DBUS_INTERFACE, path)
        self.fw_icmptype = dbus.Interface(
            self.dbus_obj,
            dbus_interface=config.dbus.DBUS_INTERFACE_CONFIG_ICMPTYPE)
        self.fw_properties = dbus.Interface(
            self.dbus_obj, dbus_interface='org.freedesktop.DBus.Properties')

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def get_property(self, prop):
        return dbus_to_python(self.fw_properties.Get(
            config.dbus.DBUS_INTERFACE_CONFIG_ICMPTYPE, prop))

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def get_properties(self):
        return dbus_to_python(self.fw_properties.GetAll(
            config.dbus.DBUS_INTERFACE_CONFIG_ICMPTYPE))

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def set_property(self, prop, value):
        self.fw_properties.Set(config.dbus.DBUS_INTERFACE_CONFIG_ICMPTYPE,
                               prop, value)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getSettings(self):
        return FirewallClientIcmpTypeSettings(list(dbus_to_python(\
                    self.fw_icmptype.getSettings())))

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def update(self, settings):
        self.fw_icmptype.update(tuple(settings.settings))

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def loadDefaults(self):
        self.fw_icmptype.loadDefaults()

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def remove(self):
        self.fw_icmptype.remove()

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def rename(self, name):
        self.fw_icmptype.rename(name)

    # version

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getVersion(self):
        return self.fw_icmptype.getVersion()

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def setVersion(self, version):
        self.fw_icmptype.setVersion(version)

    # short

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getShort(self):
        return self.fw_icmptype.getShort()

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def setShort(self, short):
        self.fw_icmptype.setShort(short)

    # description

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getDescription(self):
        return self.fw_icmptype.getDescription()

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def setDescription(self, description):
        self.fw_icmptype.setDescription(description)

    # destination

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getDestinations(self):
        return self.fw_icmptype.getDestinations()

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def setDestinations(self, destinations):
        self.fw_icmptype.setDestinations(destinations)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def addDestination(self, destination):
        self.fw_icmptype.addDestination(destination)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def removeDestination(self, destination):
        self.fw_icmptype.removeDestination(destination)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def queryDestination(self, destination):
        return self.fw_icmptype.queryDestination(destination)


# config.policies lockdown whitelist

class FirewallClientPoliciesLockdownWhitelist(object):
    @handle_exceptions
    def __init__(self, settings=None):
        if settings:
            self.settings = settings
        else:
            self.settings = [ [], [], [], [] ]

    @handle_exceptions
    def __repr__(self):
        return '%s(%r)' % (self.__class__, self.settings)

    @handle_exceptions
    def getCommands(self):
        return self.settings[0]
    @handle_exceptions
    def setCommands(self, commands):
        self.settings[0] = commands
    @handle_exceptions
    def addCommand(self, command):
        if command not in self.settings[0]:
            self.settings[0].append(command)
    @handle_exceptions
    def removeCommand(self, command):
        if command in self.settings[0]:
            self.settings[0].remove(command)
    @handle_exceptions
    def queryCommand(self, command):
        return command in self.settings[0]

    @handle_exceptions
    def getContexts(self):
        return self.settings[1]
    @handle_exceptions
    def setContexts(self, contexts):
        self.settings[1] = contexts
    @handle_exceptions
    def addContext(self, context):
        if context not in self.settings[1]:
            self.settings[1].append(context)
    @handle_exceptions
    def removeContext(self, context):
        if context in self.settings[1]:
            self.settings[1].remove(context)
    @handle_exceptions
    def queryContext(self, context):
        return context in self.settings[1]

    @handle_exceptions
    def getUsers(self):
        return self.settings[2]
    @handle_exceptions
    def setUsers(self, users):
        self.settings[2] = users
    @handle_exceptions
    def addUser(self, user):
        if user not in self.settings[2]:
            self.settings[2].append(user)
    @handle_exceptions
    def removeUser(self, user):
        if user in self.settings[2]:
            self.settings[2].remove(user)
    @handle_exceptions
    def queryUser(self, user):
        return user in self.settings[2]

    @handle_exceptions
    def getUids(self):
        return self.settings[3]
    @handle_exceptions
    def setUids(self, uids):
        self.settings[3] = uids
    @handle_exceptions
    def addUid(self, uid):
        if uid not in self.settings[3]:
            self.settings[3].append(uid)
    @handle_exceptions
    def removeUid(self, uid):
        if uid in self.settings[3]:
            self.settings[3].remove(uid)
    @handle_exceptions
    def queryUid(self, uid):
        return uid in self.settings[3]
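
# Illustrative sketch (not part of the original module): the lockdown whitelist
# settings object is four plain lists (commands, contexts, users, uids); the
# add/remove helpers silently ignore duplicates and missing entries.  The
# command below is an example only.
def _example_lockdown_whitelist_settings():
    wl = FirewallClientPoliciesLockdownWhitelist()
    wl.addCommand("/usr/bin/firewall-config")
    wl.addUid(0)
    wl.addUid(0)                # duplicate, silently ignored
    assert wl.getUids() == [0]
    return wl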

# config.policies

class FirewallClientConfigPolicies(object):
    @handle_exceptions
    def __init__(self, bus):
        self.bus = bus
        self.dbus_obj = self.bus.get_object(config.dbus.DBUS_INTERFACE,
                                            config.dbus.DBUS_PATH_CONFIG)
        self.fw_policies = dbus.Interface(
            self.dbus_obj,
            dbus_interface=config.dbus.DBUS_INTERFACE_CONFIG_POLICIES)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getLockdownWhitelist(self):
        return FirewallClientPoliciesLockdownWhitelist( \
            list(dbus_to_python(self.fw_policies.getLockdownWhitelist())))

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def setLockdownWhitelist(self, settings):
        self.fw_policies.setLockdownWhitelist(tuple(settings.settings))

    # command

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def addLockdownWhitelistCommand(self, command):
        self.fw_policies.addLockdownWhitelistCommand(command)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def removeLockdownWhitelistCommand(self, command):
        self.fw_policies.removeLockdownWhitelistCommand(command)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def queryLockdownWhitelistCommand(self, command):
        return dbus_to_python(self.fw_policies.queryLockdownWhitelistCommand(command))

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getLockdownWhitelistCommands(self):
        return dbus_to_python(self.fw_policies.getLockdownWhitelistCommands())

    # context

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def addLockdownWhitelistContext(self, context):
        self.fw_policies.addLockdownWhitelistContext(context)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def removeLockdownWhitelistContext(self, context):
        self.fw_policies.removeLockdownWhitelistContext(context)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def queryLockdownWhitelistContext(self, context):
        return dbus_to_python(self.fw_policies.queryLockdownWhitelistContext(context))

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getLockdownWhitelistContexts(self):
        return dbus_to_python(self.fw_policies.getLockdownWhitelistContexts())

    # user

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def addLockdownWhitelistUser(self, user):
        self.fw_policies.addLockdownWhitelistUser(user)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def removeLockdownWhitelistUser(self, user):
        self.fw_policies.removeLockdownWhitelistUser(user)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def queryLockdownWhitelistUser(self, user):
        return dbus_to_python(self.fw_policies.queryLockdownWhitelistUser(user))

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getLockdownWhitelistUsers(self):
        return dbus_to_python(self.fw_policies.getLockdownWhitelistUsers())

    # uid

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getLockdownWhitelistUids(self):
        return dbus_to_python(self.fw_policies.getLockdownWhitelistUids())

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def setLockdownWhitelistUids(self, uids):
        self.fw_policies.setLockdownWhitelistUids(uids)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def addLockdownWhitelistUid(self, uid):
        self.fw_policies.addLockdownWhitelistUid(uid)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def removeLockdownWhitelistUid(self, uid):
        self.fw_policies.removeLockdownWhitelistUid(uid)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def queryLockdownWhitelistUid(self, uid):
        return dbus_to_python(self.fw_policies.queryLockdownWhitelistUid(uid))
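
# Illustrative sketch (not part of the original module): read-modify-write of
# the permanent lockdown whitelist.  `policies_conf` is assumed to be a
# FirewallClientConfigPolicies (see FirewallClientConfig.policies()); calling
# it requires a running firewalld.
def _example_update_lockdown_whitelist(policies_conf):
    wl = policies_conf.getLockdownWhitelist()
    if not wl.queryUid(0):
        wl.addUid(0)
    policies_conf.setLockdownWhitelist(wl)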

# config.direct

class FirewallClientDirect(object):
    @handle_exceptions
    def __init__(self, settings=None):
        if settings:
            self.settings = settings
        else:
            self.settings = [ [], [], [], ]

    @handle_exceptions
    def __repr__(self):
        return '%s(%r)' % (self.__class__, self.settings)

    @handle_exceptions
    def getAllChains(self):
        return self.settings[0]
    @handle_exceptions
    def getChains(self, ipv, table):
        return [ entry[2] for entry in self.settings[0] \
                 if entry[0] == ipv and entry[1] == table ]
    @handle_exceptions
    def setAllChains(self, chains):
        self.settings[0] = chains
    @handle_exceptions
    def addChain(self, ipv, table, chain):
        idx = (ipv, table, chain)
        if idx not in self.settings[0]:
            self.settings[0].append(idx)
    @handle_exceptions
    def removeChain(self, ipv, table, chain):
        idx = (ipv, table, chain)
        if idx in self.settings[0]:
            self.settings[0].remove(idx)
    @handle_exceptions
    def queryChain(self, ipv, table, chain):
        idx = (ipv, table, chain)
        return idx in self.settings[0]

    @handle_exceptions
    def getAllRules(self):
        return self.settings[1]
    @handle_exceptions
    def getRules(self, ipv, table, chain):
        return [ entry[3:] for entry in self.settings[1] \
                 if entry[0] == ipv and entry[1] == table \
                 and entry[2] == chain ]
    @handle_exceptions
    def setAllRules(self, rules):
        self.settings[1] = rules
    @handle_exceptions
    def addRule(self, ipv, table, chain, priority, args):
        idx = (ipv, table, chain, priority, args)
        if idx not in self.settings[1]:
            self.settings[1].append(idx)
    @handle_exceptions
    def removeRule(self, ipv, table, chain, priority, args):
        idx = (ipv, table, chain, priority, args)
        if idx in self.settings[1]:
            self.settings[1].remove(idx)
    @handle_exceptions
    def removeRules(self, ipv, table, chain):
        for idx in list(self.settings[1]):
            if idx[0] == ipv and idx[1] == table and idx[2] == chain:
                self.settings[1].remove(idx)
    @handle_exceptions
    def queryRule(self, ipv, table, chain, priority, args):
        idx = (ipv, table, chain, priority, args)
        return idx in self.settings[1]

    @handle_exceptions
    def getAllPassthroughs(self):
        return self.settings[2]
    @handle_exceptions
    def setAllPassthroughs(self, passthroughs):
        self.settings[2] = passthroughs
    @handle_exceptions
    def removeAllPassthroughs(self):
        self.settings[2] = []
    @handle_exceptions
    def getPassthroughs(self, ipv):
        return [ entry[1] for entry in self.settings[2] \
                 if entry[0] == ipv ]
    @handle_exceptions
    def addPassthrough(self, ipv, args):
        idx = (ipv, args)
        if idx not in self.settings[2]:
            self.settings[2].append(idx)
    @handle_exceptions
    def removePassthrough(self, ipv, args):
        idx = (ipv, args)
        if idx in self.settings[2]:
            self.settings[2].remove(idx)
    @handle_exceptions
    def queryPassthrough(self, ipv, args):
        idx = (ipv, args)
        return idx in self.settings[2]
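
# Illustrative sketch (not part of the original module): the direct settings
# object holds three lists of tuples - chains (ipv, table, chain), rules
# (ipv, table, chain, priority, args) and passthroughs (ipv, args).  The chain
# name and rule below are examples only.
def _example_direct_settings():
    settings = FirewallClientDirect()
    settings.addChain("ipv4", "filter", "mychain")
    settings.addRule("ipv4", "filter", "mychain", 0, ("-j", "ACCEPT"))
    assert settings.queryChain("ipv4", "filter", "mychain")
    assert settings.getRules("ipv4", "filter", "mychain") == [(0, ("-j", "ACCEPT"))]
    return settings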

# config.direct

class FirewallClientConfigDirect(object):
    @handle_exceptions
    def __init__(self, bus):
        self.bus = bus
        self.dbus_obj = self.bus.get_object(config.dbus.DBUS_INTERFACE,
                                            config.dbus.DBUS_PATH_CONFIG)
        self.fw_direct = dbus.Interface(
            self.dbus_obj,
            dbus_interface=config.dbus.DBUS_INTERFACE_CONFIG_DIRECT)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getSettings(self):
        return FirewallClientDirect( \
            list(dbus_to_python(self.fw_direct.getSettings())))

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def update(self, settings):
        self.fw_direct.update(tuple(settings.settings))

    # direct chain

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def addChain(self, ipv, table, chain):
        self.fw_direct.addChain(ipv, table, chain)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def removeChain(self, ipv, table, chain):
        self.fw_direct.removeChain(ipv, table, chain)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def queryChain(self, ipv, table, chain):
        return dbus_to_python(self.fw_direct.queryChain(ipv, table, chain))

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getChains(self, ipv, table):
        return dbus_to_python(self.fw_direct.getChains(ipv, table))

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getAllChains(self):
        return dbus_to_python(self.fw_direct.getAllChains())

    # direct rule

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def addRule(self, ipv, table, chain, priority, args):
        self.fw_direct.addRule(ipv, table, chain, priority, args)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def removeRule(self, ipv, table, chain, priority, args):
        self.fw_direct.removeRule(ipv, table, chain, priority, args)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def removeRules(self, ipv, table, chain):
        self.fw_direct.removeRules(ipv, table, chain)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def queryRule(self, ipv, table, chain, priority, args):
        return dbus_to_python(self.fw_direct.queryRule(ipv, table, chain, priority, args))

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getRules(self, ipv, table, chain):
        return dbus_to_python(self.fw_direct.getRules(ipv, table, chain))

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getAllRules(self):
        return dbus_to_python(self.fw_direct.getAllRules())

    # tracked passthrough

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def addPassthrough(self, ipv, args):
        self.fw_direct.addPassthrough(ipv, args)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def removePassthrough(self, ipv, args):
        self.fw_direct.removePassthrough(ipv, args)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def queryPassthrough(self, ipv, args):
        return dbus_to_python(self.fw_direct.queryPassthrough(ipv, args))

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getPassthroughs(self, ipv):
        return dbus_to_python(self.fw_direct.getPassthroughs(ipv))

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getAllPassthroughs(self):
        return dbus_to_python(self.fw_direct.getAllPassthroughs())
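
# Illustrative sketch (not part of the original module): the permanent direct
# configuration mirrors FirewallClientDirect, but every call is an immediate
# D-Bus method.  `direct_conf` is assumed to be a FirewallClientConfigDirect
# (see FirewallClientConfig.direct()); calling it requires a running firewalld.
def _example_permanent_direct_chain(direct_conf):
    if not direct_conf.queryChain("ipv4", "filter", "mychain"):  # example names
        direct_conf.addChain("ipv4", "filter", "mychain")
    return direct_conf.getAllChains()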

# config

class FirewallClientConfig(object):
    @handle_exceptions
    def __init__(self, bus):
        self.bus = bus
        self.dbus_obj = self.bus.get_object(config.dbus.DBUS_INTERFACE,
                                            config.dbus.DBUS_PATH_CONFIG)
        self.fw_config = dbus.Interface(
            self.dbus_obj,
            dbus_interface=config.dbus.DBUS_INTERFACE_CONFIG)
        self.fw_properties = dbus.Interface(
            self.dbus_obj, dbus_interface='org.freedesktop.DBus.Properties')
        self._policies = FirewallClientConfigPolicies(self.bus)
        self._direct = FirewallClientConfigDirect(self.bus)

    # properties

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def get_property(self, prop):
        return dbus_to_python(self.fw_properties.Get(
            config.dbus.DBUS_INTERFACE_CONFIG, prop))

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def get_properties(self):
        return dbus_to_python(self.fw_properties.GetAll(
            config.dbus.DBUS_INTERFACE_CONFIG))

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def set_property(self, prop, value):
        self.fw_properties.Set(config.dbus.DBUS_INTERFACE_CONFIG, prop, value)

    # ipset

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getIPSetNames(self):
        return dbus_to_python(self.fw_config.getIPSetNames())

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def listIPSets(self):
        return dbus_to_python(self.fw_config.listIPSets())

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getIPSet(self, path):
        return FirewallClientConfigIPSet(self.bus, path)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getIPSetByName(self, name):
        path = dbus_to_python(self.fw_config.getIPSetByName(name))
        return FirewallClientConfigIPSet(self.bus, path)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def addIPSet(self, name, settings):
        if isinstance(settings, FirewallClientIPSetSettings):
            path = self.fw_config.addIPSet(name, tuple(settings.settings))
        else:
            path = self.fw_config.addIPSet(name, tuple(settings))
        return FirewallClientConfigIPSet(self.bus, path)

    # zone

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getZoneNames(self):
        return dbus_to_python(self.fw_config.getZoneNames())

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def listZones(self):
        return dbus_to_python(self.fw_config.listZones())

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getZone(self, path):
        return FirewallClientConfigZone(self.bus, path)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getZoneByName(self, name):
        path = dbus_to_python(self.fw_config.getZoneByName(name))
        return FirewallClientConfigZone(self.bus, path)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getZoneOfInterface(self, iface):
        return dbus_to_python(self.fw_config.getZoneOfInterface(iface))

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getZoneOfSource(self, source):
        return dbus_to_python(self.fw_config.getZoneOfSource(source))

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def addZone(self, name, settings):
        if isinstance(settings, FirewallClientZoneSettings):
            path = self.fw_config.addZone2(name, settings.getSettingsDbusDict())
        elif isinstance(settings, dict):
            path = self.fw_config.addZone2(name, settings)
        else:
            # tuple based dbus API has 16 elements. Slice what we're given down
            # to the expected size.
            path = self.fw_config.addZone(name, tuple(settings[:16]))
        return FirewallClientConfigZone(self.bus, path)

    # policy

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getPolicyNames(self):
        return dbus_to_python(self.fw_config.getPolicyNames())

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def listPolicies(self):
        return dbus_to_python(self.fw_config.listPolicies())

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getPolicy(self, path):
        return FirewallClientConfigPolicy(self.bus, path)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getPolicyByName(self, name):
        path = dbus_to_python(self.fw_config.getPolicyByName(name))
        return FirewallClientConfigPolicy(self.bus, path)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def addPolicy(self, name, settings):
        if isinstance(settings, FirewallClientPolicySettings):
            path = self.fw_config.addPolicy(name, settings.getSettingsDbusDict())
        else: # dict
            path = self.fw_config.addPolicy(name, settings)
        return FirewallClientConfigPolicy(self.bus, path)

    # service

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getServiceNames(self):
        return dbus_to_python(self.fw_config.getServiceNames())

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def listServices(self):
        return dbus_to_python(self.fw_config.listServices())

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getService(self, path):
        return FirewallClientConfigService(self.bus, path)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getServiceByName(self, name):
        path = dbus_to_python(self.fw_config.getServiceByName(name))
        return FirewallClientConfigService(self.bus, path)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def addService(self, name, settings):
        if isinstance(settings, FirewallClientServiceSettings):
            path = self.fw_config.addService2(name, settings.getSettingsDbusDict())
        elif type(settings) is dict:
            path = self.fw_config.addService2(name, settings)
        else:
            # tuple based dbus API has 8 elements. Slice what we're given down
            # to the expected size.
            path = self.fw_config.addService(name, tuple(settings[:8]))
        return FirewallClientConfigService(self.bus, path)

    # icmptype

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getIcmpTypeNames(self):
        return dbus_to_python(self.fw_config.getIcmpTypeNames())

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def listIcmpTypes(self):
        return dbus_to_python(self.fw_config.listIcmpTypes())

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getIcmpType(self, path):
        return FirewallClientConfigIcmpType(self.bus, path)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getIcmpTypeByName(self, name):
        path = dbus_to_python(self.fw_config.getIcmpTypeByName(name))
        return FirewallClientConfigIcmpType(self.bus, path)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def addIcmpType(self, name, settings):
        if isinstance(settings, FirewallClientIcmpTypeSettings):
            path = self.fw_config.addIcmpType(name, tuple(settings.settings))
        else:
            path = self.fw_config.addIcmpType(name, tuple(settings))
        return FirewallClientConfigIcmpType(self.bus, path)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def policies(self):
        return self._policies

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def direct(self):
        return self._direct

    # helper

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getHelperNames(self):
        return dbus_to_python(self.fw_config.getHelperNames())

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def listHelpers(self):
        return dbus_to_python(self.fw_config.listHelpers())

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getHelper(self, path):
        return FirewallClientConfigHelper(self.bus, path)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getHelperByName(self, name):
        path = dbus_to_python(self.fw_config.getHelperByName(name))
        return FirewallClientConfigHelper(self.bus, path)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def addHelper(self, name, settings):
        if isinstance(settings, FirewallClientHelperSettings):
            path = self.fw_config.addHelper(name, tuple(settings.settings))
        else:
            path = self.fw_config.addHelper(name, tuple(settings))
        return FirewallClientConfigHelper(self.bus, path)
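
# Illustrative sketch (not part of the original module): FirewallClientConfig
# is the entry point to all permanent configuration.  `fw` is assumed to be a
# connected FirewallClient instance and "http"/"8080" are examples only;
# calling this requires a running firewalld.
def _example_permanent_service_port(fw):
    cfg = fw.config()                          # FirewallClientConfig
    service = cfg.getServiceByName("http")     # FirewallClientConfigService
    settings = service.getSettings()
    if not settings.queryPort("8080", "tcp"):
        settings.addPort("8080", "tcp")
    service.update(settings)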

#

class FirewallClient(object):
    @handle_exceptions
    def __init__(self, bus=None, wait=0, quiet=True):
        if not bus:
            dbus.mainloop.glib.DBusGMainLoop(set_as_default=True)
            try:
                self.bus = slip.dbus.SystemBus()
                self.bus.default_timeout = None
            except Exception:
                try:
                    self.bus = dbus.SystemBus()
                except dbus.exceptions.DBusException as e:
                    raise FirewallError(errors.DBUS_ERROR,
                                        e.get_dbus_message())
                else:
                    print("Not using slip.dbus")
        else:
            self.bus = bus

        self.bus.add_signal_receiver(
            handler_function=self._dbus_connection_changed,
            signal_name="NameOwnerChanged",
            dbus_interface="org.freedesktop.DBus",
            arg0=config.dbus.DBUS_INTERFACE)

        for interface in [ config.dbus.DBUS_INTERFACE,
                           config.dbus.DBUS_INTERFACE_IPSET,
                           config.dbus.DBUS_INTERFACE_ZONE,
                           config.dbus.DBUS_INTERFACE_POLICY,
                           config.dbus.DBUS_INTERFACE_DIRECT,
                           config.dbus.DBUS_INTERFACE_POLICIES,
                           config.dbus.DBUS_INTERFACE_CONFIG,
                           config.dbus.DBUS_INTERFACE_CONFIG_IPSET,
                           config.dbus.DBUS_INTERFACE_CONFIG_ZONE,
                           config.dbus.DBUS_INTERFACE_CONFIG_POLICY,
                           config.dbus.DBUS_INTERFACE_CONFIG_SERVICE,
                           config.dbus.DBUS_INTERFACE_CONFIG_HELPER,
                           config.dbus.DBUS_INTERFACE_CONFIG_DIRECT,
                           config.dbus.DBUS_INTERFACE_CONFIG_ICMPTYPE,
                           config.dbus.DBUS_INTERFACE_CONFIG_POLICIES ]:
            self.bus.add_signal_receiver(self._signal_receiver,
                                         dbus_interface=interface,
                                         interface_keyword='interface',
                                         member_keyword='member',
                                         path_keyword='path')

        # callbacks
        self._callback = { }
        self._callbacks = {
            # client callbacks
            "connection-changed": "connection-changed",
            "connection-established": "connection-established",
            "connection-lost": "connection-lost",
            # firewalld callbacks
            "log-denied-changed": "LogDeniedChanged",
            "default-zone-changed": "DefaultZoneChanged",
            "panic-mode-enabled": "PanicModeEnabled",
            "panic-mode-disabled": "PanicModeDisabled",
            "reloaded": "Reloaded",
            "service-added": "ServiceAdded",
            "service-removed": "ServiceRemoved",
            "port-added": "PortAdded",
            "port-removed": "PortRemoved",
            "source-port-added": "SourcePortAdded",
            "source-port-removed": "SourcePortRemoved",
            "protocol-added": "ProtocolAdded",
            "protocol-removed": "ProtocolRemoved",
            "masquerade-added": "MasqueradeAdded",
            "masquerade-removed": "MasqueradeRemoved",
            "forward-port-added": "ForwardPortAdded",
            "forward-port-removed": "ForwardPortRemoved",
            "icmp-block-added": "IcmpBlockAdded",
            "icmp-block-removed": "IcmpBlockRemoved",
            "icmp-block-inversion-added": "IcmpBlockInversionAdded",
            "icmp-block-inversion-removed": "IcmpBlockInversionRemoved",
            "richrule-added": "RichRuleAdded",
            "richrule-removed": "RichRuleRemoved",
            "interface-added": "InterfaceAdded",
            "interface-removed": "InterfaceRemoved",
            "zone-changed": "ZoneOfInterfaceChanged", # DEPRECATED, use zone-of-interface-changed instead
            "zone-of-interface-changed": "ZoneOfInterfaceChanged",
            "source-added": "SourceAdded",
            "source-removed": "SourceRemoved",
            "zone-of-source-changed": "ZoneOfSourceChanged",
            "zone-updated": "ZoneUpdated",
            "policy-updated": "PolicyUpdated",
            # ipset callbacks
            "ipset-entry-added": "EntryAdded",
            "ipset-entry-removed": "EntryRemoved",
            # direct callbacks
            "direct:chain-added": "ChainAdded",
            "direct:chain-removed": "ChainRemoved",
            "direct:rule-added": "RuleAdded",
            "direct:rule-removed": "RuleRemoved",
            "direct:passthrough-added": "PassthroughAdded",
            "direct:passthrough-removed": "PassthroughRemoved",
            "config:direct:updated": "config:direct:Updated",
            # policy callbacks
            "lockdown-enabled": "LockdownEnabled",
            "lockdown-disabled": "LockdownDisabled",
            "lockdown-whitelist-command-added": "LockdownWhitelistCommandAdded",
            "lockdown-whitelist-command-removed": "LockdownWhitelistCommandRemoved",
            "lockdown-whitelist-context-added": "LockdownWhitelistContextAdded",
            "lockdown-whitelist-context-removed": "LockdownWhitelistContextRemoved",
            "lockdown-whitelist-uid-added": "LockdownWhitelistUidAdded",
            "lockdown-whitelist-uid-removed": "LockdownWhitelistUidRemoved",
            "lockdown-whitelist-user-added": "LockdownWhitelistUserAdded",
            "lockdown-whitelist-user-removed": "LockdownWhitelistUserRemoved",
            # firewalld.config callbacks
            "config:policies:lockdown-whitelist-updated": "config:policies:LockdownWhitelistUpdated",
            "config:ipset-added": "config:IPSetAdded",
            "config:ipset-updated": "config:IPSetUpdated",
            "config:ipset-removed": "config:IPSetRemoved",
            "config:ipset-renamed": "config:IPSetRenamed",
            "config:zone-added": "config:ZoneAdded",
            "config:zone-updated": "config:ZoneUpdated",
            "config:zone-removed": "config:ZoneRemoved",
            "config:zone-renamed": "config:ZoneRenamed",
            "config:policy-added": "config:PolicyAdded",
            "config:policy-updated": "config:PolicyUpdated",
            "config:policy-removed": "config:PolicyRemoved",
            "config:policy-renamed": "config:PolicyRenamed",
            "config:service-added": "config:ServiceAdded",
            "config:service-updated": "config:ServiceUpdated",
            "config:service-removed": "config:ServiceRemoved",
            "config:service-renamed": "config:ServiceRenamed",
            "config:icmptype-added": "config:IcmpTypeAdded",
            "config:icmptype-updated": "config:IcmpTypeUpdated",
            "config:icmptype-removed": "config:IcmpTypeRemoved",
            "config:icmptype-renamed": "config:IcmpTypeRenamed",
            "config:helper-added": "config:HelperAdded",
            "config:helper-updated": "config:HelperUpdated",
            "config:helper-removed": "config:HelperRemoved",
            "config:helper-renamed": "config:HelperRenamed",
            }

        # initialize variables used for connection
        self._init_vars()

        self.quiet = quiet

        if wait > 0:
            # delay the connection attempt by `wait` seconds
            GLib.timeout_add_seconds(wait, self._connection_established)
        else:
            self._connection_established()

    @handle_exceptions
    def _init_vars(self):
        self.fw = None
        self.fw_ipset = None
        self.fw_zone = None
        self.fw_policy = None
        self.fw_helper = None
        self.fw_direct = None
        self.fw_properties = None
        self._config = None
        self.connected = False

    @handle_exceptions
    def getExceptionHandler(self):
        return exception_handler

    @handle_exceptions
    def setExceptionHandler(self, handler):
        global exception_handler
        exception_handler = handler

    @handle_exceptions
    def getNotAuthorizedLoop(self):
        return not_authorized_loop

    @handle_exceptions
    def setNotAuthorizedLoop(self, enable):
        global not_authorized_loop
        not_authorized_loop = enable

    @handle_exceptions
    def connect(self, name, callback, *args):
        if name in self._callbacks:
            self._callback[self._callbacks[name]] = (callback, args)
        else:
            raise ValueError("Unknown callback name '%s'" % name)
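
    # Illustrative sketch (not part of the original module): registering a
    # callback for one of the names in self._callbacks.  A GLib main loop must
    # be running for signals to be delivered, e.g.:
    #
    #     def on_reloaded():
    #         print("firewalld was reloaded")
    #
    #     fw = FirewallClient()
    #     fw.connect("reloaded", on_reloaded)
    #     GLib.MainLoop().run()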

    @handle_exceptions
    def _dbus_connection_changed(self, name, old_owner, new_owner):
        if name != config.dbus.DBUS_INTERFACE:
            return

        if new_owner:
            # connection established
            self._connection_established()
        else:
            # connection lost
            self._connection_lost()

    @handle_exceptions
    def _connection_established(self):
        try:
            self.dbus_obj = self.bus.get_object(config.dbus.DBUS_INTERFACE,
                                                config.dbus.DBUS_PATH)
            self.fw = dbus.Interface(self.dbus_obj,
                                     dbus_interface=config.dbus.DBUS_INTERFACE)
            self.fw_ipset = dbus.Interface(
                self.dbus_obj, dbus_interface=config.dbus.DBUS_INTERFACE_IPSET)
            self.fw_zone = dbus.Interface(
                self.dbus_obj,
                dbus_interface=config.dbus.DBUS_INTERFACE_ZONE)
            self.fw_policy = dbus.Interface(
                self.dbus_obj,
                dbus_interface=config.dbus.DBUS_INTERFACE_POLICY)
            self.fw_direct = dbus.Interface(
                self.dbus_obj, dbus_interface=config.dbus.DBUS_INTERFACE_DIRECT)
            self.fw_policies = dbus.Interface(
                self.dbus_obj,
                dbus_interface=config.dbus.DBUS_INTERFACE_POLICIES)
            self.fw_properties = dbus.Interface(
                self.dbus_obj, dbus_interface='org.freedesktop.DBus.Properties')
        except dbus.exceptions.DBusException as e:
            # ignore dbus errors
            if not self.quiet:
                print ("DBusException", e.get_dbus_message())
            return
        except Exception as e:
            if not self.quiet:
                print ("Exception", e)
            return
        self._config = FirewallClientConfig(self.bus)
        self.connected = True
        self._signal_receiver(member="connection-established",
                              interface=config.dbus.DBUS_INTERFACE)
        self._signal_receiver(member="connection-changed",
                              interface=config.dbus.DBUS_INTERFACE)

    @handle_exceptions
    def _connection_lost(self):
        self._init_vars()
        self._signal_receiver(member="connection-lost",
                              interface=config.dbus.DBUS_INTERFACE)
        self._signal_receiver(member="connection-changed",
                              interface=config.dbus.DBUS_INTERFACE)

    @handle_exceptions
    def _signal_receiver(self, *args, **kwargs):
        if "member" not in kwargs or "interface" not in kwargs:
            return

        signal = kwargs["member"]
        interface = kwargs["interface"]

        # config signals need special treatment
        # pimp signal name
        if interface.startswith(config.dbus.DBUS_INTERFACE_CONFIG_ZONE):
            signal = "config:Zone" + signal
        if interface.startswith(config.dbus.DBUS_INTERFACE_CONFIG_POLICY):
            signal = "config:Policy" + signal
        elif interface.startswith(config.dbus.DBUS_INTERFACE_CONFIG_IPSET):
            signal = "config:IPSet" + signal
        elif interface.startswith(config.dbus.DBUS_INTERFACE_CONFIG_SERVICE):
            signal = "config:Service" + signal
        elif interface.startswith(config.dbus.DBUS_INTERFACE_CONFIG_ICMPTYPE):
            signal = "config:IcmpType" + signal
        elif interface.startswith(config.dbus.DBUS_INTERFACE_CONFIG_HELPER):
            signal = "config:Helper" + signal
        elif interface == config.dbus.DBUS_INTERFACE_CONFIG:
            signal = "config:" + signal
        elif interface == config.dbus.DBUS_INTERFACE_CONFIG_POLICIES:
            signal = "config:policies:" + signal
        elif interface == config.dbus.DBUS_INTERFACE_CONFIG_DIRECT:
            signal = "config:direct:" + signal

        cb = None
        for callback in self._callbacks:
            if self._callbacks[callback] == signal and \
                    self._callbacks[callback] in self._callback:
                cb = self._callback[self._callbacks[callback]]
        if cb is None:
            return

        # call back with args converted to python types ...
        cb_args = [ dbus_to_python(arg) for arg in args ]
        try:
            if cb[1]:
                # add call data
                cb_args.extend(cb[1])
            # call back
            cb[0](*cb_args)
        except Exception as msg:
            print(msg)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def config(self):
        return self._config

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def reload(self):
        self.fw.reload()

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def complete_reload(self):
        self.fw.completeReload()

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def runtimeToPermanent(self):
        self.fw.runtimeToPermanent()

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def checkPermanentConfig(self):
        self.fw.checkPermanentConfig()

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def get_property(self, prop):
        return dbus_to_python(self.fw_properties.Get(
            config.dbus.DBUS_INTERFACE, prop))

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def get_properties(self):
        return dbus_to_python(self.fw_properties.GetAll(
            config.dbus.DBUS_INTERFACE))

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def set_property(self, prop, value):
        self.fw_properties.Set(config.dbus.DBUS_INTERFACE, prop, value)

    # panic mode

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def enablePanicMode(self):
        self.fw.enablePanicMode()
    
    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def disablePanicMode(self):
        self.fw.disablePanicMode()

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def queryPanicMode(self):
        return dbus_to_python(self.fw.queryPanicMode())

    # list functions

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getZoneSettings(self, zone):
        return FirewallClientZoneSettings(dbus_to_python(self.fw_zone.getZoneSettings2(zone)))

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getIPSets(self):
        return dbus_to_python(self.fw_ipset.getIPSets())

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getIPSetSettings(self, ipset):
        return FirewallClientIPSetSettings(list(dbus_to_python(\
                    self.fw_ipset.getIPSetSettings(ipset))))

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def addEntry(self, ipset, entry):
        self.fw_ipset.addEntry(ipset, entry)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getEntries(self, ipset):
        return self.fw_ipset.getEntries(ipset)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def setEntries(self, ipset, entries):
        return self.fw_ipset.setEntries(ipset, entries)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def removeEntry(self, ipset, entry):
        self.fw_ipset.removeEntry(ipset, entry)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def queryEntry(self, ipset, entry):
        return dbus_to_python(self.fw_ipset.queryEntry(ipset, entry))

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def listServices(self):
        return dbus_to_python(self.fw.listServices())

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getServiceSettings(self, service):
        return FirewallClientServiceSettings(dbus_to_python(
                    self.fw.getServiceSettings2(service)))

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def listIcmpTypes(self):
        return dbus_to_python(self.fw.listIcmpTypes())

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getIcmpTypeSettings(self, icmptype):
        return FirewallClientIcmpTypeSettings(list(dbus_to_python(\
                    self.fw.getIcmpTypeSettings(icmptype))))

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getHelpers(self):
        return dbus_to_python(self.fw.getHelpers())

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getHelperSettings(self, helper):
        return FirewallClientHelperSettings(list(dbus_to_python(\
                    self.fw.getHelperSettings(helper))))

    # automatic helper setting

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getAutomaticHelpers(self):
        return dbus_to_python(self.fw.getAutomaticHelpers())

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def setAutomaticHelpers(self, value):
        self.fw.setAutomaticHelpers(value)

    # log denied

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getLogDenied(self):
        return dbus_to_python(self.fw.getLogDenied())

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def setLogDenied(self, value):
        self.fw.setLogDenied(value)

    # default zone

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getDefaultZone(self):
        return dbus_to_python(self.fw.getDefaultZone())

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def setDefaultZone(self, zone):
        self.fw.setDefaultZone(zone)

    # zone

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def setZoneSettings(self, zone, settings):
        self.fw_zone.setZoneSettings2(zone, settings.getRuntimeSettingsDbusDict())

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getZones(self):
        return dbus_to_python(self.fw_zone.getZones())

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getActiveZones(self):
        return dbus_to_python(self.fw_zone.getActiveZones())

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getZoneOfInterface(self, interface):
        return dbus_to_python(self.fw_zone.getZoneOfInterface(interface))

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getZoneOfSource(self, source):
        return dbus_to_python(self.fw_zone.getZoneOfSource(source))

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def isImmutable(self, zone):
        return dbus_to_python(self.fw_zone.isImmutable(zone))

    # policy

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getPolicySettings(self, policy):
        return FirewallClientPolicySettings(dbus_to_python(self.fw_policy.getPolicySettings(policy)))

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def setPolicySettings(self, policy, settings):
        self.fw_policy.setPolicySettings(policy, settings.getRuntimeSettingsDbusDict())

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getPolicies(self):
        return dbus_to_python(self.fw_policy.getPolicies())

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getActivePolicies(self):
        return dbus_to_python(self.fw_policy.getActivePolicies())

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def isPolicyImmutable(self, policy):
        return dbus_to_python(self.fw_policy.isImmutable(policy))

    # interfaces

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def addInterface(self, zone, interface):
        return dbus_to_python(self.fw_zone.addInterface(zone, interface))

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def changeZone(self, zone, interface): # DEPRECATED
        return dbus_to_python(self.fw_zone.changeZone(zone, interface))

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def changeZoneOfInterface(self, zone, interface):
        return dbus_to_python(self.fw_zone.changeZoneOfInterface(zone,
                                                                 interface))

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getInterfaces(self, zone):
        return dbus_to_python(self.fw_zone.getInterfaces(zone))

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def queryInterface(self, zone, interface):
        return dbus_to_python(self.fw_zone.queryInterface(zone, interface))

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def removeInterface(self, zone, interface):
        return dbus_to_python(self.fw_zone.removeInterface(zone, interface))

    # sources

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def addSource(self, zone, source):
        return dbus_to_python(self.fw_zone.addSource(zone, source))

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def changeZoneOfSource(self, zone, source):
        return dbus_to_python(self.fw_zone.changeZoneOfSource(zone, source))

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getSources(self, zone):
        return dbus_to_python(self.fw_zone.getSources(zone))

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def querySource(self, zone, source):
        return dbus_to_python(self.fw_zone.querySource(zone, source))

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def removeSource(self, zone, source):
        return dbus_to_python(self.fw_zone.removeSource(zone, source))

    # rich rules

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def addRichRule(self, zone, rule, timeout=0):
        return dbus_to_python(self.fw_zone.addRichRule(zone, rule, timeout))

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getRichRules(self, zone):
        return dbus_to_python(self.fw_zone.getRichRules(zone))

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def queryRichRule(self, zone, rule):
        return dbus_to_python(self.fw_zone.queryRichRule(zone, rule))

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def removeRichRule(self, zone, rule):
        return dbus_to_python(self.fw_zone.removeRichRule(zone, rule))

    # services

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def addService(self, zone, service, timeout=0):
        return dbus_to_python(self.fw_zone.addService(zone, service, timeout))

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getServices(self, zone):
        return dbus_to_python(self.fw_zone.getServices(zone))

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def queryService(self, zone, service):
        return dbus_to_python(self.fw_zone.queryService(zone, service))

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def removeService(self, zone, service):
        return dbus_to_python(self.fw_zone.removeService(zone, service))

    # ports

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def addPort(self, zone, port, protocol, timeout=0):
        return dbus_to_python(self.fw_zone.addPort(zone, port, protocol, timeout))

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getPorts(self, zone):
        return dbus_to_python(self.fw_zone.getPorts(zone))

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def queryPort(self, zone, port, protocol):
        return dbus_to_python(self.fw_zone.queryPort(zone, port, protocol))

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def removePort(self, zone, port, protocol):
        return dbus_to_python(self.fw_zone.removePort(zone, port, protocol))

    # protocols

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def addProtocol(self, zone, protocol, timeout=0):
        return dbus_to_python(self.fw_zone.addProtocol(zone, protocol, timeout))

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getProtocols(self, zone):
        return dbus_to_python(self.fw_zone.getProtocols(zone))

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def queryProtocol(self, zone, protocol):
        return dbus_to_python(self.fw_zone.queryProtocol(zone, protocol))

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def removeProtocol(self, zone, protocol):
        return dbus_to_python(self.fw_zone.removeProtocol(zone, protocol))

    # forward
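    # Zone-level forwarding has no dedicated add/remove D-Bus methods; it is
    # toggled through the zone's "forward" setting via get/setZoneSettings2,
    # as the three wrappers below show.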

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def addForward(self, zone):
        self.fw_zone.setZoneSettings2(zone, {"forward": True})

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def queryForward(self, zone):
        return dbus_to_python(self.fw_zone.getZoneSettings2(zone))["forward"]

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def removeForward(self, zone):
        self.fw_zone.setZoneSettings2(zone, {"forward": False})

    # masquerade

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def addMasquerade(self, zone, timeout=0):
        return dbus_to_python(self.fw_zone.addMasquerade(zone, timeout))

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def queryMasquerade(self, zone):
        return dbus_to_python(self.fw_zone.queryMasquerade(zone))

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def removeMasquerade(self, zone):
        return dbus_to_python(self.fw_zone.removeMasquerade(zone))

    # forward ports

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def addForwardPort(self, zone, port, protocol, toport, toaddr,
                       timeout=0):
        if toport is None:
            toport = ""
        if toaddr is None:
            toaddr = ""
        return dbus_to_python(self.fw_zone.addForwardPort(zone, port, protocol,
                                                          toport, toaddr,
                                                          timeout))

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getForwardPorts(self, zone):
        return dbus_to_python(self.fw_zone.getForwardPorts(zone))

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def queryForwardPort(self, zone, port, protocol, toport, toaddr):
        if toport is None:
            toport = ""
        if toaddr is None:
            toaddr = ""
        return dbus_to_python(self.fw_zone.queryForwardPort(zone,
                                                            port, protocol,
                                                            toport, toaddr))

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def removeForwardPort(self, zone, port, protocol, toport, toaddr):
        if toport is None:
            toport = ""
        if toaddr is None:
            toaddr = ""
        return dbus_to_python(self.fw_zone.removeForwardPort(zone,
                                                             port, protocol,
                                                             toport, toaddr))

    # source ports

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def addSourcePort(self, zone, port, protocol, timeout=0):
        return dbus_to_python(self.fw_zone.addSourcePort(zone, port, protocol,
                                                         timeout))

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getSourcePorts(self, zone):
        return dbus_to_python(self.fw_zone.getSourcePorts(zone))

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def querySourcePort(self, zone, port, protocol):
        return dbus_to_python(self.fw_zone.querySourcePort(zone, port, protocol))

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def removeSourcePort(self, zone, port, protocol):
        return dbus_to_python(self.fw_zone.removeSourcePort(zone, port,
                                                            protocol))

    # icmpblock

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def addIcmpBlock(self, zone, icmp, timeout=0):
        return dbus_to_python(self.fw_zone.addIcmpBlock(zone, icmp, timeout))

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getIcmpBlocks(self, zone):
        return dbus_to_python(self.fw_zone.getIcmpBlocks(zone))

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def queryIcmpBlock(self, zone, icmp):
        return dbus_to_python(self.fw_zone.queryIcmpBlock(zone, icmp))

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def removeIcmpBlock(self, zone, icmp):
        return dbus_to_python(self.fw_zone.removeIcmpBlock(zone, icmp))

    # icmp block inversion

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def addIcmpBlockInversion(self, zone):
        return dbus_to_python(self.fw_zone.addIcmpBlockInversion(zone))

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def queryIcmpBlockInversion(self, zone):
        return dbus_to_python(self.fw_zone.queryIcmpBlockInversion(zone))

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def removeIcmpBlockInversion(self, zone):
        return dbus_to_python(self.fw_zone.removeIcmpBlockInversion(zone))

    # direct chain
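    # The "direct" interface forwards chain, rule and passthrough arguments
    # almost verbatim to the underlying ip(6)tables/ebtables backend.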

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def addChain(self, ipv, table, chain):
        self.fw_direct.addChain(ipv, table, chain)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def removeChain(self, ipv, table, chain):
        self.fw_direct.removeChain(ipv, table, chain)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def queryChain(self, ipv, table, chain):
        return dbus_to_python(self.fw_direct.queryChain(ipv, table, chain))

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getChains(self, ipv, table):
        return dbus_to_python(self.fw_direct.getChains(ipv, table))

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getAllChains(self):
        return dbus_to_python(self.fw_direct.getAllChains())

    # direct rule

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def addRule(self, ipv, table, chain, priority, args):
        self.fw_direct.addRule(ipv, table, chain, priority, args)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def removeRule(self, ipv, table, chain, priority, args):
        self.fw_direct.removeRule(ipv, table, chain, priority, args)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def removeRules(self, ipv, table, chain):
        self.fw_direct.removeRules(ipv, table, chain)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def queryRule(self, ipv, table, chain, priority, args):
        return dbus_to_python(self.fw_direct.queryRule(ipv, table, chain, priority, args))

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getRules(self, ipv, table, chain):
        return dbus_to_python(self.fw_direct.getRules(ipv, table, chain))

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getAllRules(self):
        return dbus_to_python(self.fw_direct.getAllRules())

    # direct passthrough

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def passthrough(self, ipv, args):
        return dbus_to_python(self.fw_direct.passthrough(ipv, args))

    # tracked passthrough

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getAllPassthroughs(self):
        return dbus_to_python(self.fw_direct.getAllPassthroughs())

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def removeAllPassthroughs(self):
        self.fw_direct.removeAllPassthroughs()

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getPassthroughs(self, ipv):
        return dbus_to_python(self.fw_direct.getPassthroughs(ipv))

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def addPassthrough(self, ipv, args):
        self.fw_direct.addPassthrough(ipv, args)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def removePassthrough(self, ipv, args):
        self.fw_direct.removePassthrough(ipv, args)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def queryPassthrough(self, ipv, args):
        return dbus_to_python(self.fw_direct.queryPassthrough(ipv, args))

    # lockdown

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def enableLockdown(self):
        self.fw_policies.enableLockdown()

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def disableLockdown(self):
        self.fw_policies.disableLockdown()

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def queryLockdown(self):
        return dbus_to_python(self.fw_policies.queryLockdown())

    # policies

    # lockdown white list commands

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def addLockdownWhitelistCommand(self, command):
        self.fw_policies.addLockdownWhitelistCommand(command)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getLockdownWhitelistCommands(self):
        return dbus_to_python(self.fw_policies.getLockdownWhitelistCommands())

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def queryLockdownWhitelistCommand(self, command):
        return dbus_to_python(self.fw_policies.queryLockdownWhitelistCommand(command))

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def removeLockdownWhitelistCommand(self, command):
        self.fw_policies.removeLockdownWhitelistCommand(command)

    # lockdown white list contexts

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def addLockdownWhitelistContext(self, context):
        self.fw_policies.addLockdownWhitelistContext(context)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getLockdownWhitelistContexts(self):
        return dbus_to_python(self.fw_policies.getLockdownWhitelistContexts())

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def queryLockdownWhitelistContext(self, context):
        return dbus_to_python(self.fw_policies.queryLockdownWhitelistContext(context))

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def removeLockdownWhitelistContext(self, context):
        self.fw_policies.removeLockdownWhitelistContext(context)

    # lockdown white list uids

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def addLockdownWhitelistUid(self, uid):
        self.fw_policies.addLockdownWhitelistUid(uid)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getLockdownWhitelistUids(self):
        return dbus_to_python(self.fw_policies.getLockdownWhitelistUids())

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def queryLockdownWhitelistUid(self, uid):
        return dbus_to_python(self.fw_policies.queryLockdownWhitelistUid(uid))

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def removeLockdownWhitelistUid(self, uid):
        self.fw_policies.removeLockdownWhitelistUid(uid)

    # lockdown white list users

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def addLockdownWhitelistUser(self, user):
        self.fw_policies.addLockdownWhitelistUser(user)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def getLockdownWhitelistUsers(self):
        return dbus_to_python(self.fw_policies.getLockdownWhitelistUsers())

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def queryLockdownWhitelistUser(self, user):
        return dbus_to_python(self.fw_policies.queryLockdownWhitelistUser(user))

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def removeLockdownWhitelistUser(self, user):
        self.fw_policies.removeLockdownWhitelistUser(user)

    @slip.dbus.polkit.enable_proxy
    @handle_exceptions
    def authorizeAll(self):
        """ Authorize once for all polkit actions. """
        self.fw.authorizeAll()
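
# Illustrative usage sketch (not part of the module): assumes a running
# firewalld daemon and an instance of the D-Bus client class these methods
# belong to (FirewallClient in firewalld's client module).
#
#     fw = FirewallClient()
#     if not fw.queryPanicMode():
#         zone = fw.getDefaultZone()
#         fw.addService(zone, "https", timeout=60)  # runtime-only change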
site-packages/firewall/fw_types.py000064400000004220147511334670013314 0ustar00
# -*- coding: utf-8 -*-
#
# Copyright (C) 2013-2016 Red Hat, Inc.
#
# Authors:
# Thomas Woerner <twoerner@redhat.com>
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program.  If not, see <http://www.gnu.org/licenses/>.
#

__all__ = [ "LastUpdatedOrderedDict" ]

class LastUpdatedOrderedDict(object):
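    """Small ordered mapping that remembers the order in which keys were
    first inserted; indexing with an integer returns the key stored at
    that position rather than a value."""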
    def __init__(self, x=None):
        self._dict = { }
        self._list = [ ]
        if x:
            self.update(x)

    def clear(self):
        del self._list[:]
        self._dict.clear()

    def update(self, x):
        for key,value in x.items():
            self[key] = value

    def items(self):
        return [(key, self[key]) for key in self._list]

    def __delitem__(self, key):
        if key in self._dict:
            self._list.remove(key)
            del self._dict[key]

    def __repr__(self):
        return '%s([%s])' % (self.__class__.__name__, ', '.join(
                ['(%r, %r)' % (key, self[key]) for key in self._list]))

    def __setitem__(self, key, value):
        if key not in self._dict:
            self._list.append(key)
        self._dict[key] = value

    def __getitem__(self, key):
        if type(key) == int:
            return self._list[key]
        else:
            return self._dict[key]

    def __len__(self):
        return len(self._list)

    def copy(self):
        return LastUpdatedOrderedDict(self)

    def keys(self):
        return self._list[:]

    def values(self):
        return [ self[key] for key in self._list ]

    def setdefault(self, key, value=None):
        if key in self:
            return self[key]
        else:
            self[key] = value
            return value
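
# Example (illustrative only, not part of the module):
#
#     d = LastUpdatedOrderedDict({"a": 1})
#     d["b"] = 2
#     d["a"] = 3        # updates the value; "a" keeps its original position
#     d.keys()          # -> ["a", "b"]
#     d[0]              # -> "a" (integer index returns the key at that position)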
site-packages/firewall/core/__pycache__/fw_policies.cpython-36.pyc000064400000004432147511334670021200 0ustar003

]ûf�
�@sVdgZddlmZddlmZddlmZddlmZddlm	Z	Gdd�de
�ZdS)	�FirewallPolicies�)�config)�log)�LockdownWhitelist)�errors)�
FirewallErrorc@sDeZdZdd�Zdd�Zdd�Zdd�Zd	d
�Zdd�Zd
d�Z	dS)rcCsd|_ttj�|_dS)NF)�	_lockdownrrZLOCKDOWN_WHITELIST�lockdown_whitelist)�self�r�!/usr/lib/python3.6/fw_policies.py�__init__szFirewallPolicies.__init__cCsd|j|j|jfS)Nz
%s(%r, %r))�	__class__rr	)r
rrr�__repr__#s
zFirewallPolicies.__repr__cCsd|_|jj�dS)NF)rr	�cleanup)r
rrrr'szFirewallPolicies.cleanupcCs�|dkr2tjd|�|jj|�r�tjd�dSn�|dkrdtjd|�|jj|�r�tjd�dSnb|dkr�tjd	|�|jj|�r�tjd
�dSn0|dkr�tjd|�|jj|�r�tjd
�dSdS)N�contextz#Doing access check for context "%s"zcontext matches.TZuidzDoing access check for uid %dzuid matches.�userz Doing access check for user "%s"z
user matches.Zcommandz#Doing access check for command "%s"zcommand matches.F)rZdebug2r	Z
match_contextZdebug3Z	match_uidZ
match_userZ
match_command)r
�key�valuerrr�access_check-s*



zFirewallPolicies.access_checkcCs|jrttjd��d|_dS)Nzenable_lockdown()T)rrrZALREADY_ENABLED)r
rrr�enable_lockdownDsz FirewallPolicies.enable_lockdowncCs|jsttjd��d|_dS)Nzdisable_lockdown()F)rrrZNOT_ENABLED)r
rrr�disable_lockdownIsz!FirewallPolicies.disable_lockdowncCs|jS)N)r)r
rrr�query_lockdownNszFirewallPolicies.query_lockdownN)
�__name__�
__module__�__qualname__r
rrrrrrrrrrrsN)�__all__ZfirewallrZfirewall.core.loggerrZ#firewall.core.io.lockdown_whitelistrrZfirewall.errorsr�objectrrrrr�<module>ssite-packages/firewall/core/__pycache__/fw_config.cpython-36.pyc000064400000077161147511334670020647 0ustar003

]ûf��@s�dgZddlZddlZddlZddlZddlmZddlmZddl	m
Z
mZmZddl
mZmZmZddlmZmZmZddlmZmZmZdd	lmZmZmZdd
lmZmZm Z ddlm!Z!ddl"m#Z#Gd
d�de$�Z%dS)�FirewallConfig�N)�config)�log)�IcmpType�icmptype_reader�icmptype_writer)�Service�service_reader�service_writer)�Zone�zone_reader�zone_writer)�IPSet�ipset_reader�ipset_writer)�Helper�
helper_reader�
helper_writer)�Policy�
policy_reader�
policy_writer)�errors)�
FirewallErrorc@s$eZdZdd�Zdd�Zdd�Zdd�Zd	d
�Zdd�Zd
d�Z	dd�Z
dd�Zdd�Zdd�Z
dd�Zdd�Zdd�Zdd�Zdd �Zd!d"�Zd#d$�Zd%d&�Zd'd(�Zd)d*�Zd+d,�Zd-d.�Zd/d0�Zd1d2�Zd3d4�Zd5d6�Zd7d8�Zd9d:�Zd;d<�Z d=d>�Z!d?d@�Z"dAdB�Z#dCdD�Z$dEdF�Z%dGdH�Z&dIdJ�Z'dKdL�Z(dMdN�Z)dOdP�Z*dQdR�Z+dSdT�Z,dUdV�Z-dWdX�Z.dYdZ�Z/d[d\�Z0d]d^�Z1d_d`�Z2dadb�Z3dcdd�Z4dedf�Z5dgdh�Z6didj�Z7dkdl�Z8dmdn�Z9dodp�Z:dqdr�Z;dsdt�Z<dudv�Z=dwdx�Z>dydz�Z?d{d|�Z@d}d~�ZAdd��ZBd�d��ZCd�d��ZDd�d��ZEd�d��ZFd�d��ZGd�d��ZHd�d��ZId�d��ZJd�d��ZKd�d��ZLd�d��ZMd�d��ZNd�d��ZOd�d��ZPd�d��ZQd�d��ZRd�d��ZSd�d��ZTd�d��ZUd�d��ZVd�d��ZWd�d��ZXd�d��ZYd�d��ZZd�d��Z[d�d��Z\d�d��Z]d�d��Z^d�d��Z_d�d��Z`d�d��Zad�d��Zbd�d„Zcd�dĄZdd�dƄZed�S)�rcCs||_|j�dS)N)�_fw�_FirewallConfig__init_vars)�self�fw�r�/usr/lib/python3.6/fw_config.py�__init__(szFirewallConfig.__init__cCsHd|j|j|j|j|j|j|j|j|j|j	|j
|j|j|j
|j|jfS)Nz>%s(%r, %r, %r, %r, %r, %r, %r, %r, %r, %r, %r, %r, %r, %r, %r))�	__class__�_ipsets�
_icmptypes�	_services�_zones�_helpersZpolicy_objects�_builtin_ipsets�_builtin_icmptypes�_builtin_services�_builtin_zones�_builtin_helpers�_builtin_policy_objects�_firewalld_conf�	_policies�_direct)rrrr�__repr__,szFirewallConfig.__repr__cCs^i|_i|_i|_i|_i|_i|_i|_i|_i|_i|_	i|_
i|_d|_d|_
d|_dS)N)r!r"r#r$r%�_policy_objectsr&r'r(r)r*r+r,r-r.)rrrrZ__init_vars6szFirewallConfig.__init_varscCs4x,t|jj��D]}|j|j�|j|=qWx,t|jj��D]}|j|j�|j|=q>Wx,t|jj��D]}|j|j�|j|=qlWx,t|jj��D]}|j|j�|j|=q�Wx,t|jj��D]}|j|j�|j|=q�Wx,t|jj��D]}|j|j�|j|=q�Wx.t|j	j��D]}|j	|j�|j	|=�q$Wx.t|j
j��D]}|j
|j�|j
|=�qTWx.t|jj��D]}|j|j�|j|=�q�Wx.t|jj��D]}|j|j�|j|=�q�W|j
�r�|j
j�|`
d|_
|j�r|jj�|`d|_|j�r(|jj�|`d|_|j�dS)N)�listr&�keys�cleanupr!r'r"r(r#r)r$r*r%r,r-r.r)r�xrrrr3GsV


zFirewallConfig.cleanupcCs|jjj�S)N)r�policiesZquery_lockdown)rrrr�lockdown_enabled~szFirewallConfig.lockdown_enabledcCs|jjj||�S)N)rr5�access_check)r�key�valuerrrr7�szFirewallConfig.access_checkcCs
||_dS)N)r,)r�confrrr�set_firewalld_conf�sz!FirewallConfig.set_firewalld_confcCs|jS)N)r,)rrrr�get_firewalld_conf�sz!FirewallConfig.get_firewalld_confcCs(tjjtj�s|jj�n
|jj�dS)N)�os�path�existsrZFIREWALLD_CONFr,�clear�read)rrrr�update_firewalld_conf�sz$FirewallConfig.update_firewalld_confcCs
||_dS)N)r-)rr5rrr�set_policies�szFirewallConfig.set_policiescCs|jS)N)r-)rrrr�get_policies�szFirewallConfig.get_policiescCs,tjjtj�s|jjj�n|jjj�dS)N)	r=r>r?rZLOCKDOWN_WHITELISTr-Zlockdown_whitelistr3rA)rrrr�update_lockdown_whitelist�sz(FirewallConfig.update_lockdown_whitelistcCs
||_dS)N)r.)rZdirectrrr�
set_direct�szFirewallConfig.set_directcCs|jS)N)r.)rrrr�
get_direct�szFirewallConfig.get_directcCs(tjjtj�s|jj�n
|jj�dS)N)r=r>r?rZFIREWALLD_DIRECTr.r3rA)rrrr�
update_direct�szFirewallConfig.update_directcCs$ttt|jj��t|jj����S)N)�sorted�setr1r!r2r&)rrrr�
get_ipsets�szFirewallConfig.get_ipsetscCs$|jr||j|j<n||j|j<dS)N)�builtinr&�namer!)r�objrrr�	add_ipset�szFirewallConfig.add_ipsetcCs8||jkr|j|S||jkr(|j|Sttj|��dS)N)r!r&rr�
INVALID_IPSET)rrMrrr�	get_ipset�s




zFirewallConfig.get_ipsetcCst|j|jkrttj|j��nB|j|j|kr@ttjd|j��n|j|jkr^ttjd|j��|j|�|j|jS)Nzself._ipsets[%s] != objz'%s' not a built-in ipset)rMr!rr�NO_DEFAULTSr&�
_remove_ipset)rrNrrr�load_ipset_defaults�s
z"FirewallConfig.load_ipset_defaultscCs|j�S)N)�
export_config)rrNrrr�get_ipset_config�szFirewallConfig.get_ipset_configcCsj|jrPtj|�}|j|�tj|_d|_|j|jkr:d|_|j|�t|�|S|j|�t|�|SdS)NF)	rL�copy�
import_configr�ETC_FIREWALLD_IPSETSr>�defaultrOr)rrNr:r4rrr�set_ipset_config�s



zFirewallConfig.set_ipset_configcCsx||jks||jkr$ttjd|��t�}|j|�|j|�||_d||_	t
j|_d|_
d|_t|�|j|�|S)Nznew_ipset(): '%s'z%s.xmlFT)r!r&rr�
NAME_CONFLICTr�
check_namerXrM�filenamerrYr>rLrZrrO)rrMr:r4rrr�	new_ipset�s




zFirewallConfig.new_ipsetcCs�tjj|�}tjj|�}tjj|�s�|tjkr�x�|jj�D]D}|j|}|j	|kr:|j|=|j
|jkrvd|j|j
fSd|fSq:WnHxF|jj�D]8}|j|}|j	|kr�|j|=|j
|jkr�d|fSdSq�WdStj
d|�yt||�}Wn0tk
�r}ztjd||�dSd}~XnX|j
|jk�rJ|j
|jk�rJ|j|�d|fS|tjk�r�|j
|jk�r�|j|j
j|_||j|j
<d|fS|j
|jk�r�|j|j
=||j|j
<|j
|jk�r�d|fSd	Sd
S)N�update�removezLoading ipset file '%s'z"Failed to load ipset file '%s': %s�new)NN)NN)NN)NN)NN)r=r>�basename�dirnamer?rrYr!r2r^rMr&r�debug1r�	Exception�errorrOrZ)rrMr^r>r4rN�msgrrr�update_ipset_from_path�sP






z%FirewallConfig.update_ipset_from_pathcCs�|j|jkrttj|j��|jtjkr>ttjd|jtjf��d|j|jf}yt	j
|d|�Wn:tk
r�}ztj
d||�tj|�WYdd}~XnX|j|j=dS)Nz'%s' != '%s'z	%s/%s.xmlz%s.oldzBackup of file '%s' failed: %s)rMr!rrrPr>rrY�INVALID_DIRECTORY�shutil�moverfrrgr=ra)rrNrMrhrrrrS8szFirewallConfig._remove_ipsetcCs$|js|jr ttjd|j��dS)Nz'%s' is built-in ipset)rLrZrrZ
BUILTIN_IPSETrM)rrNrrr�check_builtin_ipsetIsz"FirewallConfig.check_builtin_ipsetcCs|j|�|j|�dS)N)rmrS)rrNrrr�remove_ipsetNs
zFirewallConfig.remove_ipsetcCs$|j|�|j||�}|j|�|S)N)rm�_copy_ipsetrS)rrNrMr_rrr�rename_ipsetRs

zFirewallConfig.rename_ipsetcCs|j||j��S)N)r_rU)rrNrMrrrroXszFirewallConfig._copy_ipsetcCs$ttt|jj��t|jj����S)N)rIrJr1r"r2r')rrrr�
get_icmptypes]szFirewallConfig.get_icmptypescCs$|jr||j|j<n||j|j<dS)N)rLr'rMr")rrNrrr�add_icmptypeaszFirewallConfig.add_icmptypecCs8||jkr|j|S||jkr(|j|Sttj|��dS)N)r"r'rr�INVALID_ICMPTYPE)rrMrrr�get_icmptypegs




zFirewallConfig.get_icmptypecCst|j|jkrttj|j��nB|j|j|kr@ttjd|j��n|j|jkr^ttjd|j��|j|�|j|jS)Nzself._icmptypes[%s] != objz'%s' not a built-in icmptype)rMr"rrrRr'�_remove_icmptype)rrNrrr�load_icmptype_defaultsns
z%FirewallConfig.load_icmptype_defaultscCs|j�S)N)rU)rrNrrr�get_icmptype_configzsz"FirewallConfig.get_icmptype_configcCsj|jrPtj|�}|j|�tj|_d|_|j|jkr:d|_|j|�t|�|S|j|�t|�|SdS)NF)	rLrWrXr�ETC_FIREWALLD_ICMPTYPESr>rZrrr)rrNr:r4rrr�set_icmptype_config}s



z"FirewallConfig.set_icmptype_configcCsx||jks||jkr$ttjd|��t�}|j|�|j|�||_d||_	t
j|_d|_
d|_t|�|j|�|S)Nznew_icmptype(): '%s'z%s.xmlFT)r"r'rrr\rr]rXrMr^rrxr>rLrZrrr)rrMr:r4rrr�new_icmptype�s




zFirewallConfig.new_icmptypecCs�tjj|�}tjj|�}tjj|�s�|tjkr�x�|jj�D]D}|j|}|j	|kr:|j|=|j
|jkrvd|j|j
fSd|fSq:WnHxF|jj�D]8}|j|}|j	|kr�|j|=|j
|jkr�d|fSdSq�WdStj
d|�yt||�}Wn0tk
�r}ztjd||�dSd}~XnX|j
|jk�rJ|j
|jk�rJ|j|�d|fS|tjk�r�|j
|jk�r�|j|j
j|_||j|j
<d|fS|j
|jk�r�|j|j
=||j|j
<|j
|jk�r�d|fSd	Sd
S)Nr`razLoading icmptype file '%s'z%Failed to load icmptype file '%s': %srb)NN)NN)NN)NN)NN)r=r>rcrdr?rrxr"r2r^rMr'rrerrfrgrrrZ)rrMr^r>r4rNrhrrr�update_icmptype_from_path�sP






z(FirewallConfig.update_icmptype_from_pathcCs�|j|jkrttj|j��|jtjkr>ttjd|jtjf��d|j|jf}yt	j
|d|�Wn:tk
r�}ztj
d||�tj|�WYdd}~XnX|j|j=dS)Nz'%s' != '%s'z	%s/%s.xmlz%s.oldzBackup of file '%s' failed: %s)rMr"rrrsr>rrxrjrkrlrfrrgr=ra)rrNrMrhrrrru�szFirewallConfig._remove_icmptypecCs$|js|jr ttjd|j��dS)Nz'%s' is built-in icmp type)rLrZrrZBUILTIN_ICMPTYPErM)rrNrrr�check_builtin_icmptype�sz%FirewallConfig.check_builtin_icmptypecCs|j|�|j|�dS)N)r|ru)rrNrrr�remove_icmptype�s
zFirewallConfig.remove_icmptypecCs$|j|�|j||�}|j|�|S)N)r|�_copy_icmptyperu)rrNrMrzrrr�rename_icmptype�s

zFirewallConfig.rename_icmptypecCs|j||j��S)N)rzrU)rrNrMrrrr~szFirewallConfig._copy_icmptypecCs$ttt|jj��t|jj����S)N)rIrJr1r#r2r()rrrr�get_services
szFirewallConfig.get_servicescCs$|jr||j|j<n||j|j<dS)N)rLr(rMr#)rrNrrr�add_serviceszFirewallConfig.add_servicecCs<||jkr|j|S||jkr(|j|Sttjd|��dS)Nzget_service(): '%s')r#r(rr�INVALID_SERVICE)rrMrrr�get_services




zFirewallConfig.get_servicecCst|j|jkrttj|j��nB|j|j|kr@ttjd|j��n|j|jkr^ttjd|j��|j|�|j|jS)Nzself._services[%s] != objz'%s' not a built-in service)rMr#rrrRr(�_remove_service)rrNrrr�load_service_defaultss
z$FirewallConfig.load_service_defaultscCsr|j�}g}x\td�D]P}|j|d|krN|jtjt||j|d���q|j||j|d�qWt|�S)N�r)�export_config_dict�range�IMPORT_EXPORT_STRUCTURE�appendrW�deepcopy�getattr�tuple)rrN�	conf_dict�	conf_list�irrr�get_service_config's"z!FirewallConfig.get_service_configcCs|j�S)N)r�)rrNrrr�get_service_config_dict3sz&FirewallConfig.get_service_config_dictcCs�i}x&t|�D]\}}|||j|d<qW|jr|tj|�}|j|�tj|_d|_|j|jkrfd|_|j	|�t
|�|S|j|�t
|�|SdS)NrF)�	enumerater�rLrW�import_config_dictr�ETC_FIREWALLD_SERVICESr>rZr�r
)rrNr:r�r�r9r4rrr�set_service_config6s 



z!FirewallConfig.set_service_configcCsj|jrPtj|�}|j|�tj|_d|_|j|jkr:d|_|j|�t|�|S|j|�t|�|SdS)NF)	rLrWr�rr�r>rZr�r
)rrNr:r4rrr�set_service_config_dictJs



z&FirewallConfig.set_service_config_dictcCs�||jks||jkr$ttjd|��i}x&t|�D]\}}||tj|d<q2Wt�}|j|�|j	|�||_
d||_tj
|_d|_d|_t|�|j|�|S)Nznew_service(): '%s'rz%s.xmlFT)r#r(rrr\r�rr�r]r�rMr^rr�r>rLrZr
r�)rrMr:r�r�r9r4rrr�new_serviceZs"




zFirewallConfig.new_servicecCsx||jks||jkr$ttjd|��t�}|j|�|j|�||_d||_	t
j|_d|_
d|_t|�|j|�|S)Nznew_service(): '%s'z%s.xmlFT)r#r(rrr\rr]r�rMr^rr�r>rLrZr
r�)rrMr:r4rrr�new_service_dictqs




zFirewallConfig.new_service_dictcCs�tjj|�}tjj|�}tjj|�s�|tjkr�x�|jj�D]D}|j|}|j	|kr:|j|=|j
|jkrvd|j|j
fSd|fSq:WnHxF|jj�D]8}|j|}|j	|kr�|j|=|j
|jkr�d|fSdSq�WdStj
d|�yt||�}Wn0tk
�r}ztjd||�dSd}~XnX|j
|jk�rJ|j
|jk�rJ|j|�d|fS|tjk�r�|j
|jk�r�|j|j
j|_||j|j
<d|fS|j
|jk�r�|j|j
=||j|j
<|j
|jk�r�d|fSd	Sd
S)Nr`razLoading service file '%s'z$Failed to load service file '%s': %srb)NN)NN)NN)NN)NN)r=r>rcrdr?rr�r#r2r^rMr(rrer	rfrgr�rZ)rrMr^r>r4rNrhrrr�update_service_from_path�sP






z'FirewallConfig.update_service_from_pathcCs�|j|jkrttj|j��|jtjkr>ttjd|jtjf��d|j|jf}yt	j
|d|�Wn:tk
r�}ztj
d||�tj|�WYdd}~XnX|j|j=dS)Nz'%s' != '%s'z	%s/%s.xmlz%s.oldzBackup of file '%s' failed: %s)rMr#rrr�r>rr�rjrkrlrfrrgr=ra)rrNrMrhrrrr��szFirewallConfig._remove_servicecCs$|js|jr ttjd|j��dS)Nz'%s' is built-in service)rLrZrrZBUILTIN_SERVICErM)rrNrrr�check_builtin_service�sz$FirewallConfig.check_builtin_servicecCs|j|�|j|�dS)N)r�r�)rrNrrr�remove_service�s
zFirewallConfig.remove_servicecCs$|j|�|j||�}|j|�|S)N)r��
_copy_servicer�)rrNrMr�rrr�rename_service�s

zFirewallConfig.rename_servicecCs|j||j��S)N)r�r�)rrNrMrrrr��szFirewallConfig._copy_servicecCs$ttt|jj��t|jj����S)N)rIrJr1r$r2r))rrrr�	get_zones�szFirewallConfig.get_zonescCs$|jr||j|j<n||j|j<dS)N)rLr)rMr$)rrNrrr�add_zone�szFirewallConfig.add_zonecCs(||jkr|j|=||jkr$|j|=dS)N)r)r$)rrMrrr�forget_zone�s

zFirewallConfig.forget_zonecCs<||jkr|j|S||jkr(|j|Sttjd|��dS)Nzget_zone(): %s)r$r)rr�INVALID_ZONE)rrMrrr�get_zone�s




zFirewallConfig.get_zonecCst|j|jkrttj|j��nB|j|j|kr@ttjd|j��n|j|jkr^ttjd|j��|j|�|j|jS)Nzself._zones[%s] != objz'%s' not a built-in zone)rMr$rrrRr)�_remove_zone)rrNrrr�load_zone_defaultss
z!FirewallConfig.load_zone_defaultscCsr|j�}g}x\td�D]P}|j|d|krN|jtjt||j|d���q|j||j|d�qWt|�S)N�r)r�r�r�r�rWr�r�r�)rrNr�r�r�rrr�get_zone_configs"zFirewallConfig.get_zone_configcCs|j�S)N)r�)rrNrrr�get_zone_config_dictsz#FirewallConfig.get_zone_config_dictcCs�i}x&t|�D]\}}|||j|d<qW|jr�tj|�}||_|j|�tj|_d|_|j|jkrld|_	|j
|�t|�|S||_|j|�t|�|SdS)NrF)r�r�rLrW�	fw_configr�r�ETC_FIREWALLD_ZONESr>rZr�r
)rrNr:r�r�r9r4rrr�set_zone_config s$



zFirewallConfig.set_zone_configcCsv|jrVtj|�}||_|j|�tj|_d|_|j|jkr@d|_|j|�t	|�|S||_|j|�t	|�|SdS)NF)
rLrWr�r�rr�r>rZr�r
)rrNr:r4rrr�set_zone_config_dict6s



z#FirewallConfig.set_zone_config_dictcCs�||jks||jkr$ttjd|��i}x&t|�D]\}}||tj|d<q2Wt�}||_|j	|�|j
|�||_d||_t
j|_d|_d|_t|�|j|�|S)Nznew_zone(): '%s'rz%s.xmlFT)r$r)rrr\r�rr�r�r]r�rMr^rr�r>rLrZr
r�)rrMr:r�r�r9r4rrr�new_zoneHs"



zFirewallConfig.new_zonecCs~||jks||jkr$ttjd|��t�}||_|j|�|j|�||_	d||_
tj|_
d|_d|_t|�|j|�|S)Nznew_zone(): '%s'z%s.xmlFT)r$r)rrr\rr�r]r�rMr^rr�r>rLrZr
r�)rrMr:r4rrr�
new_zone_dict_s



zFirewallConfig.new_zone_dictcCstjj|�}tjj|�}tjj|�s�|jtj�r�x�|jj	�D]D}|j|}|j
|kr<|j|=|j|jkrxd|j|jfSd|fSq<WnHxF|jj	�D]8}|j|}|j
|kr�|j|=|j|jkr�d|fSd	Sq�Wd
St
jd|�yt||�}Wn0tk
�r}zt
jd||�dSd}~XnX||_|jtj��rlt|�ttj�k�rldtjj|�tjj|�dd�f|_|j|jk�r�|j|jk�r�|j|�d|fS|jtj��r�|j|jk�r�|j|jj|_||j|j<d|fS|j|jk�r|j|j=||j|j<|j|jk�rd|fSd
SdS)Nr`razLoading zone file '%s'z!Failed to load zone file '%s': %sz%s/%sr�rb)NN)NN)NN���)NN)NN)r=r>rcrdr?�
startswithrr�r$r2r^rMr)rrerrfrgr��lenr�rZ)rrMr^r>r4rNrhrrr�update_zone_from_pathrsZ





z$FirewallConfig.update_zone_from_pathcCs�|j|jkrttj|j��|jjtj�s@ttj	d|jtjf��d|j|jf}yt
j|d|�Wn:tk
r�}zt
jd||�tj|�WYdd}~XnX|j|j=dS)Nz'%s' doesn't start with '%s'z	%s/%s.xmlz%s.oldzBackup of file '%s' failed: %s)rMr$rrr�r>r�rr�rjrkrlrfrrgr=ra)rrNrMrhrrrr��szFirewallConfig._remove_zonecCs$|js|jr ttjd|j��dS)Nz'%s' is built-in zone)rLrZrrZBUILTIN_ZONErM)rrNrrr�check_builtin_zone�sz!FirewallConfig.check_builtin_zonecCs|j|�|j|�dS)N)r�r�)rrNrrr�remove_zone�s
zFirewallConfig.remove_zonec	CsN|j|�|j�}|j|�y|j||�}Wn|j|j|��YnX|S)N)r�r�r�r�rM)rrNrMZobj_confr�rrr�rename_zone�s

zFirewallConfig.rename_zonecCs$ttt|jj��t|jj����S)N)rIrJr1r0r2r+)rrrr�get_policy_objects�sz!FirewallConfig.get_policy_objectscCs$|jr||j|j<n||j|j<dS)N)rLr+rMr0)rrNrrr�add_policy_object�sz FirewallConfig.add_policy_objectcCs<||jkr|j|S||jkr(|j|Sttjd|��dS)Nzget_policy_object(): %s)r0r+rr�INVALID_POLICY)rrMrrr�get_policy_object�s




z FirewallConfig.get_policy_objectcCst|j|jkrttj|j��nB|j|j|kr@ttjd|j��n|j|jkr^ttjd|j��|j|�|j|jS)Nzself._policy_objects[%s] != objz'%s' not a built-in policy)rMr0rrrRr+�_remove_policy_object)rrNrrr�load_policy_object_defaults�s
z*FirewallConfig.load_policy_object_defaultscCs|j�S)N)r�)rrNrrr�get_policy_object_config_dictsz,FirewallConfig.get_policy_object_config_dictcCsv|jrVtj|�}||_|j|�tj|_d|_|j|jkr@d|_|j|�t	|�|S||_|j|�t	|�|SdS)NF)
rLrWr�r�r�ETC_FIREWALLD_POLICIESr>rZr�r)rrNr:r4rrr�set_policy_object_config_dicts



z,FirewallConfig.set_policy_object_config_dictcCs~||jks||jkr$ttjd|��t�}||_|j|�|j|�||_	d||_
tj|_
d|_d|_t|�|j|�|S)Nznew_policy_object(): '%s'z%s.xmlFT)r0r+rrr\rr�r]r�rMr^rr�r>rLrZrr�)rrMr:r4rrr�new_policy_object_dicts



z%FirewallConfig.new_policy_object_dictcCstjj|�}tjj|�}tjj|�s�|jtj�r�x�|jj	�D]D}|j|}|j
|kr<|j|=|j|jkrxd|j|jfSd|fSq<WnHxF|jj	�D]8}|j|}|j
|kr�|j|=|j|jkr�d|fSd	Sq�Wd
St
jd|�yt||�}Wn0tk
�r}zt
jd||�dSd}~XnX||_|jtj��rlt|�ttj�k�rldtjj|�tjj|�dd�f|_|j|jk�r�|j|jk�r�|j|�d|fS|jtj��r�|j|jk�r�|j|jj|_||j|j<d|fS|j|jk�r|j|j=||j|j<|j|jk�rd|fSd
SdS)Nr`razLoading policy file '%s'z#Failed to load policy file '%s': %sz%s/%srr�rb)NN)NN)NNr�)NN)NN)r=r>rcrdr?r�rr�r0r2r^rMr+rrerrfrgr�r�r�rZ)rrMr^r>r4rNrhrrr�update_policy_object_from_path,sZ





z-FirewallConfig.update_policy_object_from_pathcCs�|j|jkrttj|j��|jjtj�s@ttj	d|jtjf��d|j|jf}yt
j|d|�Wn:tk
r�}zt
jd||�tj|�WYdd}~XnX|j|j=dS)Nz'%s' doesn't start with '%s'z	%s/%s.xmlz%s.oldzBackup of file '%s' failed: %s)rMr0rrr�r>r�rr�rjrkrlrfrrgr=ra)rrNrMrhrrrr�ysz$FirewallConfig._remove_policy_objectcCs$|js|jr ttjd|j��dS)Nz'%s' is built-in policy)rLrZrrZBUILTIN_POLICYrM)rrNrrr�check_builtin_policy_object�sz*FirewallConfig.check_builtin_policy_objectcCs|j|�|j|�dS)N)r�r�)rrNrrr�remove_policy_object�s
z#FirewallConfig.remove_policy_objectcCs$|j|�|j||�}|j|�|S)N)r��_copy_policy_objectr�)rrNrMZnew_policy_objectrrr�rename_policy_object�s

z#FirewallConfig.rename_policy_objectcCs|j||j��S)N)r�r�)rrNrMrrrr��sz"FirewallConfig._copy_policy_objectcCs$ttt|jj��t|jj����S)N)rIrJr1r%r2r*)rrrr�get_helpers�szFirewallConfig.get_helperscCs$|jr||j|j<n||j|j<dS)N)rLr*rMr%)rrNrrr�
add_helper�szFirewallConfig.add_helpercCs8||jkr|j|S||jkr(|j|Sttj|��dS)N)r%r*rr�INVALID_HELPER)rrMrrr�
get_helper�s




zFirewallConfig.get_helpercCst|j|jkrttj|j��nB|j|j|kr@ttjd|j��n|j|jkr^ttjd|j��|j|�|j|jS)Nzself._helpers[%s] != objz'%s' not a built-in helper)rMr%rrrRr*�_remove_helper)rrNrrr�load_helper_defaults�s
z#FirewallConfig.load_helper_defaultscCs|j�S)N)rU)rrNrrr�get_helper_config�sz FirewallConfig.get_helper_configcCsj|jrPtj|�}|j|�tj|_d|_|j|jkr:d|_|j|�t|�|S|j|�t|�|SdS)NF)	rLrWrXr�ETC_FIREWALLD_HELPERSr>rZr�r)rrNr:r4rrr�set_helper_config�s



z FirewallConfig.set_helper_configcCsx||jks||jkr$ttjd|��t�}|j|�|j|�||_d||_	t
j|_d|_
d|_t|�|j|�|S)Nznew_helper(): '%s'z%s.xmlFT)r%r*rrr\rr]rXrMr^rr�r>rLrZrr�)rrMr:r4rrr�
new_helper�s




zFirewallConfig.new_helpercCs�tjj|�}tjj|�}tjj|�s�|tjkr�x�|jj�D]D}|j|}|j	|kr:|j|=|j
|jkrvd|j|j
fSd|fSq:WnHxF|jj�D]8}|j|}|j	|kr�|j|=|j
|jkr�d|fSdSq�WdStj
d|�yt||�}Wn0tk
�r}ztjd||�dSd}~XnX|j
|jk�rJ|j
|jk�rJ|j|�d|fS|tjk�r�|j
|jk�r�|j|j
j|_||j|j
<d|fS|j
|jk�r�|j|j
=||j|j
<|j
|jk�r�d|fSd	Sd
S)Nr`razLoading helper file '%s'z#Failed to load helper file '%s': %srb)NN)NN)NN)NN)NN)r=r>rcrdr?rr�r%r2r^rMr*rrerrfrgr�rZ)rrMr^r>r4rNrhrrr�update_helper_from_path�sP






z&FirewallConfig.update_helper_from_pathcCs�|j|jkrttj|j��|jtjkr>ttjd|jtjf��d|j|jf}yt	j
|d|�Wn:tk
r�}ztj
d||�tj|�WYdd}~XnX|j|j=dS)Nz'%s' != '%s'z	%s/%s.xmlz%s.oldzBackup of file '%s' failed: %s)rMr%rrr�r>rr�rjrkrlrfrrgr=ra)rrNrMrhrrrr�&szFirewallConfig._remove_helpercCs$|js|jr ttjd|j��dS)Nz'%s' is built-in helper)rLrZrrZBUILTIN_HELPERrM)rrNrrr�check_builtin_helper7sz#FirewallConfig.check_builtin_helpercCs|j|�|j|�dS)N)r�r�)rrNrrr�
remove_helper<s
zFirewallConfig.remove_helpercCs$|j|�|j||�}|j|�|S)N)r��_copy_helperr�)rrNrMr�rrr�
rename_helper@s

zFirewallConfig.rename_helpercCs|j||j��S)N)r�rU)rrNrMrrrr�FszFirewallConfig._copy_helperN)f�__name__�
__module__�__qualname__rr/rr3r6r7r;r<rBrCrDrErFrGrHrKrOrQrTrVr[r_rirSrmrnrprorqrrrtrvrwryrzr{rur|r}rr~r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrrrr's�
7EEEMME)&�__all__rWr=Zos.pathrkZfirewallrZfirewall.core.loggerrZfirewall.core.io.icmptyperrrZfirewall.core.io.servicerr	r
Zfirewall.core.io.zonerrr
Zfirewall.core.io.ipsetrrrZfirewall.core.io.helperrrrZfirewall.core.io.policyrrrrZfirewall.errorsr�objectrrrrr�<module>ssite-packages/firewall/core/__pycache__/rich.cpython-36.pyc000064400000050640147511334670017624 0ustar003

]ûf8��@s�dddddddddd	d
ddd
ddddgZddlmZddlmZddlmZddlmZddlm	Z	Gdd�de
�ZGdd�de
�ZGdd�de
�Z
Gdd�de
�ZGdd�de�ZGdd�de
�ZGdd�de
�ZGdd�de
�ZGd d�de
�ZGd!d	�d	e
�ZGd"d
�d
e
�ZGd#d�de
�ZGd$d�de
�ZGd%d
�d
e
�ZGd&d�de�ZGd'd�de
�Zd(d)d/d1d+�ZGd,d�de
�ZGd-d�de
�Zd.S)2�Rich_Source�Rich_Destination�Rich_Service�	Rich_Port�
Rich_Protocol�Rich_Masquerade�Rich_IcmpBlock�
Rich_IcmpType�Rich_SourcePort�Rich_ForwardPort�Rich_Log�
Rich_Audit�Rich_Accept�Rich_Reject�	Rich_Drop�	Rich_Mark�
Rich_Limit�	Rich_Rule�)�	functions)�check_ipset_name)�REJECT_TYPES)�errors)�
FirewallErrorc@seZdZddd�Zdd�ZdS)rFcCs�||_|jdkrd|_||_|jdks0|jdkr8d|_n|jdk	rN|jj�|_||_|jdkrdd|_||_|jdkr�|jdkr�|jdkr�ttjd��dS)N�zno address, mac and ipset)�addr�mac�upper�ipset�invertrr�INVALID_RULE)�selfrrrr�r!�/usr/lib/python3.6/rich.py�__init__$s


zRich_Source.__init__cCsjd|jrdnd}|jdk	r*|d|jS|jdk	rB|d|jS|jdk	rZ|d|jSttjd��dS)Nz	source%s z NOTrzaddress="%s"zmac="%s"z
ipset="%s"zno address, mac and ipset)rrrrrrr)r �retr!r!r"�__str__5s


zRich_Source.__str__N)F)�__name__�
__module__�__qualname__r#r%r!r!r!r"r#s
c@seZdZddd�Zdd�ZdS)rFcCsV||_|jdkrd|_||_|jdkr,d|_||_|jdkrR|jdkrRttjd��dS)Nrzno address and ipset)rrrrrr)r rrrr!r!r"r#Bs

zRich_Destination.__init__cCsRd|jrdnd}|jdk	r*|d|jS|jdk	rB|d|jSttjd��dS)Nzdestination%s z NOTrzaddress="%s"z
ipset="%s"zno address and ipset)rrrrrr)r r$r!r!r"r%Ns

zRich_Destination.__str__N)F)r&r'r(r#r%r!r!r!r"rAs
c@seZdZdd�Zdd�ZdS)rcCs
||_dS)N)�name)r r)r!r!r"r#YszRich_Service.__init__cCs
d|jS)Nzservice name="%s")r))r r!r!r"r%\szRich_Service.__str__N)r&r'r(r#r%r!r!r!r"rXsc@seZdZdd�Zdd�ZdS)rcCs||_||_dS)N)�port�protocol)r r*r+r!r!r"r#`szRich_Port.__init__cCsd|j|jfS)Nzport port="%s" protocol="%s")r*r+)r r!r!r"r%dszRich_Port.__str__N)r&r'r(r#r%r!r!r!r"r_sc@seZdZdd�ZdS)r	cCsd|j|jfS)Nz#source-port port="%s" protocol="%s")r*r+)r r!r!r"r%hszRich_SourcePort.__str__N)r&r'r(r%r!r!r!r"r	gsc@seZdZdd�Zdd�ZdS)rcCs
||_dS)N)�value)r r,r!r!r"r#mszRich_Protocol.__init__cCs
d|jS)Nzprotocol value="%s")r,)r r!r!r"r%pszRich_Protocol.__str__N)r&r'r(r#r%r!r!r!r"rlsc@seZdZdd�Zdd�ZdS)rcCsdS)Nr!)r r!r!r"r#tszRich_Masquerade.__init__cCsdS)N�
masquerader!)r r!r!r"r%wszRich_Masquerade.__str__N)r&r'r(r#r%r!r!r!r"rssc@seZdZdd�Zdd�ZdS)rcCs
||_dS)N)r))r r)r!r!r"r#{szRich_IcmpBlock.__init__cCs
d|jS)Nzicmp-block name="%s")r))r r!r!r"r%~szRich_IcmpBlock.__str__N)r&r'r(r#r%r!r!r!r"rzsc@seZdZdd�Zdd�ZdS)rcCs
||_dS)N)r))r r)r!r!r"r#�szRich_IcmpType.__init__cCs
d|jS)Nzicmp-type name="%s")r))r r!r!r"r%�szRich_IcmpType.__str__N)r&r'r(r#r%r!r!r!r"r�sc@seZdZdd�Zdd�ZdS)r
cCs<||_||_||_||_|jdkr(d|_|jdkr8d|_dS)Nr)r*r+�to_port�
to_address)r r*r+r.r/r!r!r"r#�s

zRich_ForwardPort.__init__cCs<d|j|j|jdkrd|jnd|jdkr4d|jndfS)Nz(forward-port port="%s" protocol="%s"%s%srz
 to-port="%s"z
 to-addr="%s")r*r+r.r/)r r!r!r"r%�szRich_ForwardPort.__str__N)r&r'r(r#r%r!r!r!r"r
�sc@seZdZddd�Zdd�ZdS)rNcCs||_||_||_dS)N)�prefix�level�limit)r r0r1r2r!r!r"r#�szRich_Log.__init__cCs>d|jrd|jnd|jr$d|jnd|jr6d|jndfS)Nz	log%s%s%sz prefix="%s"rz level="%s"z %s)r0r1r2)r r!r!r"r%�szRich_Log.__str__)NNN)r&r'r(r#r%r!r!r!r"r�s
c@seZdZddd�Zdd�ZdS)rNcCs
||_dS)N)r2)r r2r!r!r"r#�szRich_Audit.__init__cCsd|jrd|jndS)Nzaudit%sz %sr)r2)r r!r!r"r%�szRich_Audit.__str__)N)r&r'r(r#r%r!r!r!r"r�s
c@seZdZddd�Zdd�ZdS)r
NcCs
||_dS)N)r2)r r2r!r!r"r#�szRich_Accept.__init__cCsd|jrd|jndS)Nzaccept%sz %sr)r2)r r!r!r"r%�szRich_Accept.__str__)N)r&r'r(r#r%r!r!r!r"r
�s
c@s&eZdZddd�Zdd�Zdd�ZdS)	rNcCs||_||_dS)N)�typer2)r Z_typer2r!r!r"r#�szRich_Reject.__init__cCs,d|jrd|jnd|jr$d|jndfS)Nz
reject%s%sz
 type="%s"rz %s)r3r2)r r!r!r"r%�szRich_Reject.__str__cCsT|jrP|sttjd��|dkrP|jt|krPdjt|�}ttjd|j|f��dS)Nz9When using reject type you must specify also rule family.�ipv4�ipv6z, z%Wrong reject type %s.
Use one of: %s.)r4r5)r3rrrr�join)r �familyZvalid_typesr!r!r"�check�szRich_Reject.check)NN)r&r'r(r#r%r8r!r!r!r"r�s
c@seZdZdd�ZdS)rcCsd|jrd|jndS)Nzdrop%sz %sr)r2)r r!r!r"r%�szRich_Drop.__str__N)r&r'r(r%r!r!r!r"r�sc@s&eZdZddd�Zdd�Zdd�ZdS)	rNcCs||_||_dS)N)�setr2)r Z_setr2r!r!r"r#�szRich_Mark.__init__cCsd|j|jrd|jndfS)Nz
mark set=%s%sz %sr)r9r2)r r!r!r"r%�szRich_Mark.__str__cCs�|jdk	r|j}nttjd��d|krv|jd�}t|�dkrHttj|��tj|d�shtj|d�r�ttj|��ntj|�s�ttj|��dS)Nzno value set�/�r�)r9rrZINVALID_MARK�split�lenrZcheckUINT32)r �x�splitsr!r!r"r8�s


zRich_Mark.check)N)r&r'r(r#r%r8r!r!r!r"r�s
r<�<�)�s�m�h�dc@s�eZdZddd�Zdd�Zedd��Zejdd��Zed	d
��Zejdd
��Ze	dd
��Z
dd�Ze	dd��Zdd�Z
dd�ZdS)rNcCs||_||_dS)N)r,�burst)r r,rGr!r!r"r#�szRich_Limit.__init__cCs|j�|j�dS)N)�value_parse�burst_parse)r r!r!r"r8�szRich_Limit.checkcCs|jS)N)�_value)r r!r!r"r,�szRich_Limit.valuecCsf|dkrd|_dSy|j|�\}}Wntk
r<|}YnX|�d|��}t|dd�|krb||_dS)Nr:rJ)rJ�_value_parser�getattr)r r,�rate�duration�vr!r!r"r,�s
cCs|jS)N)�_burst)r r!r!r"rGszRich_Limit.burstcCs\|dkrd|_dSy|j|�}Wntk
r8|}Yn
Xt|�}t|dd�|krX||_dS)NrP)rP�_burst_parser�strrL)r rG�br!r!r"rGs
cCs�d}d|kr|jd�}|s(t|�dkr4ttj|��|\}}yt|�}Wnttj|��YnX|dkrv|dd�}|dks�|dkr�ttj|��dt||d
kr�ttjd|f��|dkr�|dkr�ttjd|f��||fS)Nr:r;�second�minute�hour�dayr<rCrDrErFi'rz%s too fastz%s too slow)rTrUrVrW)rCrDrErF)r=r>rr�
INVALID_LIMIT�int�DURATION_TO_MULT)r,r@rMrNr!r!r"rKs&
zRich_Limit._value_parsecCs|j|j�S)N)rKrJ)r r!r!r"rH:szRich_Limit.value_parsec	CsR|dkrdSyt|�}Wnttj|��YnX|dksB|dkrNttj|��|S)Nr<i���)rYrrrX)rGrSr!r!r"rQ=szRich_Limit._burst_parsecCs|j|j�S)N)rQrP)r r!r!r"rIKszRich_Limit.burst_parsecCs,d|j�d�}|jdk	r(|d|j��7}|S)Nz
limit value="�"z burst=)rJrP)r rCr!r!r"r%Ns
zRich_Limit.__str__)N)r&r'r(r#r8�propertyr,�setterrG�staticmethodrKrHrQrIr%r!r!r!r"r�s
c@s>eZdZdZdZddd�Zdd�Zd	d
�Zdd�Zd
d�Z	dS)ri�i�NrcCsV|dk	rt|�|_nd|_||_d|_d|_d|_d|_d|_d|_|rR|j	|�dS)N)
rRr7�priority�source�destination�element�log�audit�action�_import_from_string)r r7�rule_strr_r!r!r"r#XszRich_Rule.__init__cCs�g}x|tj|�D]n}d|krp|jd�}t|�dksF|dsF|drVttjd|��|j|d|dd��q|jd|i�qW|jddi�|S)	z Lexical analysis �=r;rr<zinternal error in _lexer(): %s)�	attr_name�
attr_valuerb�EOL)rZ	splitArgsr=r>rrr�append)r rg�tokens�r�attrr!r!r"�_lexeris
 
zRich_Rule._lexercCs�|sttjd��tj|�}d|_d|_d|_d|_d|_	d|_
d|_d|_|j
|�}|rv|djd�dkrvttjd��i}g}d}�x`||jd�dko�|dgk�s�||jd�}||jd�}||jd�}|�r�|dHk�r�ttjd|���n�|dIk�r�|dk�r|j�rttjd+��n�|dk�r<|j�r<ttjd,��n�|dJk�rf|j	�rfttjd-||j	f��nh|d"k�r�|j
�r�ttjd.��nH|d#k�r�|j�r�ttjd/��n(|dKk�r�|j�r�ttjd0||jf��nttjd1|��t|�dk�r�|t|�d2nd3}	|	d3k�r�|�r`|�r`|d	k�r2ttjd4��n,|dk�rJttjd5��nttjd6||f��n*d|k�r�ttjd7||f��n
|jd��nL|	dk�rD|d	k�r�|dLk�r�ttjd:|��||_n||dk�ryt|�|_Wn&tk
�rttjd;|��YnXn:|�r6|dk�rd<}
nd=||f}
ttj|
��n
|j|��n�|	dk�r�|dMk�rb|||<nV|dNk�rvd>|d
<nBt|jd
�|jd�|jd�|jd
d?��|_|j�|j�|d2}�n|	dk�r,|dOk�r�|||<nN|dPk�r�d>|d
<n:t|jd
�|jd�|jd
d?��|_|j�|j�|d2}�n�|	dk�rd|dk�rTt|�|_	|j�nttjd@���nv|	dk�r�|dk�r�t|�|_	|j�nttjdA���n>|	dk�r�|dQk�r�|||<n0t|jd�|jd��|_	|j�|j�|d2}�n�|	dk�r&|dk�rt|�|_	|j�nttjdB���n�|	dk�r^|dk�rNt|�|_	|j�nttjdC���n||	dk�r�t�|_	|j�|j�|d2}�nN|	d k�r�|dRk�r�|||<n@t|jd�|jd�|jd�|jd��|_	|j�|j�|d2}�n�|	d!k�r@|dSk�r|||<n0t|jd�|jd��|_	|j�|j�|d2}�n�|	d"k�r�|dTk�r^|||<nN|d(k�rt|jd(�n8t |jd�|jd�|jd(��|_
|j�|j�|d2}�n*|	d#k�r�|d(k�r�|jd(�n(t!|jd(��|_|j�|j�|d2}�n�|	d$k�rH|d(k�r|jd(�n(t"|jd(��|_|j�|j�|d2}�n�|	d%k�r�|d(k�rh|jd(�n(t#|jd(��|_|j�|j�|d2}�nF|	d&k�r�|dk�r�|||<nF|d(k�r�|jd(�n0t$|jd�|jd(��|_|j�|j�|d2}n�|	d'k�r`|dk�r|||<nF|d(k�r.|jd(�n0t%|jd�|jd(��|_|j�|j�|d2}nz|	d(k�r�|dUk�r�||dD|��<nVdE|k�r�ttjdF��t&|dE|jdG��|d(<|jdEd�|jdGd�|j�|d2}|d2}q�W|j'�dS)VNz
empty rulerrbrk�rulerirjr_r7�addressrrrr,r*r+�to-port�to-addrr)r0r1r3r9rGzbad attribute '%s'r`ra�service�
icmp-block�	icmp-typer-�forward-port�source-portrcrd�accept�drop�reject�markr2�not�NOTzmore than one 'source' elementz#more than one 'destination' elementzFmore than one element. There cannot be both '%s' and '%s' in one rule.zmore than one 'log' elementzmore than one 'audit' elementzOmore than one 'action' element. There cannot be both '%s' and '%s' in one rule.zunknown element %sr<rz0'family' outside of rule. Use 'rule family=...'.z4'priority' outside of rule. Use 'rule priority=...'.z:'%s' outside of any element. Use 'rule <element> %s= ...'.z,'%s' outside of rule. Use 'rule ... %s ...'.r4r5zH'family' attribute cannot have '%s' value. Use 'ipv4' or 'ipv6' instead.z(invalid 'priority' attribute value '%s'.zdwrong 'protocol' usage. Use either 'rule protocol value=...' or  'rule [forward-]port protocol=...'.zDattribute '%s' outside of any element. Use 'rule <element> %s= ...'.TFzinvalid 'protocol' elementzinvalid 'service' elementzinvalid 'icmp-block' elementzinvalid 'icmp-type' elementzlimit.zlimit.valuezinvalid 'limit' elementzlimit.burst)r_r7rrrrrr,r*r+rsrtr)r0r1r3r9rG)rqr`rar+rur*rvrwr-rxryrcrdrzr{r|r}r2r~rrk)r+rur*rvrwr-rxry)rzr{r|r})r4r5)rrrrr)r~r)rrrr)r~r)r*r+)r*r+rsrt)r*r+)r0r1)r,rG)(rrrrZstripNonPrintableCharactersr_r7r`rarbrcrdrerp�getr>rlrY�
ValueError�INVALID_PRIORITYr�pop�clearrrrrrrrr
r	rrr
rrrrr8)r rgrmZattrsZin_elements�indexrbrirjZ
in_elementZerr_msgr!r!r"rfzs�

""













*




"






















(






 




















zRich_Rule._import_from_stringc	Cs`|jdk	r"|jd kr"ttj|j��|jdkrn|jdk	rB|jjdk	sL|jdk	rVttj��t|j	�t
krnttj��|j|jks�|j|j
kr�ttjd|j|j
f��|j	dko�|jdks�|jdk	o�|jdk�r
|jdkr�ttjd��|jdko�|jdko�|jdk�r
ttjd��t|j	�tt
tgk�rP|jdk�rP|jdk�rP|jdk�rPttjd��|jdk	�rj|jjdk	�r�|jdk�r�ttj��|jjdk	�r�ttjd��|jjdk	�r�ttjd	��tj|j|jj��sjttjt|jj���n�|jjdk	�r,|jjdk	�rttjd
��tj|jj��sjttjt|jj���n>|jjdk	�r^t|jj��sjttjt|jj���nttjd��|jdk	�r|jjdk	�r�|jdk�r�ttj��|jjdk	�r�ttjd	��tj|j|jj��sttjt|jj���n>|jjdk	�rt|jj��sttjt|jj���nttjd��t|j	�t k�rd|j	j!dk�sLt"|j	j!�d
k�r`ttj#t|j	j!����n�t|j	�t$k�r�tj%|j	j&��s�ttj'|j	j&��|j	j(d!k�r`ttj)|j	j(���n�t|j	�t*k�r�tj+|j	j,��s`ttj)|j	j,���nvt|j	�tk�r<|jdk	�rttjd��|jdk	�r`|jjdk	�r`ttjd���n$t|j	�tk�r�|j	j!dk�slt"|j	j!�d
k�r�ttj-t|j	j!���|j�r`ttjd���n�t|j	�t.k�r�|j	j!dk�s�t"|j	j!�d
k�r`ttj-t|j	j!����n�t|j	�t
k�r�tj%|j	j&��sttj'|j	j&��|j	j(d"k�r.ttj)|j	j(��|j	j/dk�rZ|j	j0dk�rZttj'|j	j/��|j	j/dk�r�tj%|j	j/��r�ttj'|j	j/��|j	j0dk�r�tj1|j|j	j0��r�ttj|j	j0��|jdk�r�ttj��|jdk	�r`ttjd��nrt|j	�t2k�r>tj%|j	j&��sttj'|j	j&��|j	j(d#k�r`ttj)|j	j(��n"|j	dk	�r`ttjdt|j	���|jdk	�r�|jj3�r�|jj3d$k�r�ttj4|jj3��|jj5dk	�r�|jj5j6�|jdk	�r�t|j�t7t8t9gk�r�ttj:t|j���|jj5dk	�r�|jj5j6�|jdk	�r\t|j�t8k�r(|jj6|j�nt|j�t;k�rB|jj6�|jj5dk	�r\|jj5j6�dS)%Nr4r5z/'priority' attribute must be between %d and %d.rzno element, no actionz%no element, no source, no destinationzno action, no log, no auditzaddress and maczaddress and ipsetz
mac and ipsetzinvalid sourcezinvalid destinationr<�tcp�udp�sctp�dccpzmasquerade and actionzmasquerade and mac sourcezicmp-block and actionrzforward-port and actionzUnknown element %s�emerg�alert�crit�error�warning�notice�info�debug)r4r5)r�r�r�r�)r�r�r�r�)r�r�r�r�)r�r�r�r�r�r�r�r�)<r7rrZINVALID_FAMILYr`rraZMISSING_FAMILYr3rbr
r_�priority_min�priority_maxr�rcrerrrrdrrrZ
check_addressZINVALID_ADDRrRZ	check_macZINVALID_MACrZ
INVALID_IPSETZINVALID_DESTINATIONrr)r>ZINVALID_SERVICErZ
check_portr*ZINVALID_PORTr+ZINVALID_PROTOCOLrZ
checkProtocolr,ZINVALID_ICMPTYPErr.r/Zcheck_single_addressr	r1ZINVALID_LOG_LEVELr2r8r
rrZINVALID_AUDIT_TYPEr)r r!r!r"r8hs�




 
 



   


zRich_Rule.checkcCs�d}|jr|d|j7}|jr,|d|j7}|jr@|d|j7}|jrT|d|j7}|jrh|d|j7}|jr||d|j7}|jr�|d|j7}|jr�|d|j7}tj	r�tj
|�S|S)Nrqz priority="%d"z family="%s"z %s)r_r7r`rarbrcrdrerZPY2Zu2b)r r$r!r!r"r%s$zRich_Rule.__str__i���)NNr)
r&r'r(r�r�r#rprfr8r%r!r!r!r"rTs
o-Nii�i�Q)�__all__ZfirewallrZfirewall.core.ipsetrZfirewall.core.baserrZfirewall.errorsr�objectrrrrr	rrrrr
rrr
rrrrZrrr!r!r!r"�<module>s@
dsite-packages/firewall/core/__pycache__/fw_transaction.cpython-36.pyc000064400000011650147511334670021716 0ustar003

]ûf��@sJdZdgZddlZddlmZddlmZddlmZGdd�de	�Z
dS)z!Transaction classes for firewalld�FirewallTransaction�N)�log)�errors)�
FirewallErrorc@s�eZdZdd�Zdd�Zdd�Zdd�Zd	d
�Zdd�Zd
d�Z	dd�Z
dd�Zdd�Zdd�Z
dd�Zdd�Zdd�Zdd�Zdd �Zd!d"�Zd#S)$rcCs(||_i|_g|_g|_g|_g|_dS)N)�fw�rules�	pre_funcs�
post_funcs�
fail_funcs�modules)�selfr�r
�$/usr/lib/python3.6/fw_transaction.py�__init__!szFirewallTransaction.__init__cCs2|jj�|jdd�=|jdd�=|jdd�=dS)N)r�clearrr	r
)rr
r
rr)s
zFirewallTransaction.clearcCs|jj|jg�j|�dS)N)r�
setdefault�name�append)r�backend�ruler
r
r�add_rule/szFirewallTransaction.add_rulecCsx|D]}|j||�qWdS)N)r)rrrrr
r
r�	add_rules2s
zFirewallTransaction.add_rulescCs|j|jko||j|jkS)N)rr)rrrr
r
r�
query_rule6szFirewallTransaction.query_rulecCs2|j|jkr.||j|jkr.|j|jj|�dS)N)rr�remove)rrrr
r
r�remove_rule9szFirewallTransaction.remove_rulecGs|jj||f�dS)N)rr)r�func�argsr
r
r�add_pre=szFirewallTransaction.add_precGs|jj||f�dS)N)r	r)rrrr
r
r�add_post@szFirewallTransaction.add_postcGs|jj||f�dS)N)r
r)rrrr
r
r�add_failCszFirewallTransaction.add_failcCs||jkr|jj|�dS)N)rr)r�moduler
r
r�
add_moduleFs
zFirewallTransaction.add_modulecCs||jkr|jj|�dS)N)rr)rr r
r
r�
remove_moduleJs
z!FirewallTransaction.remove_modulecCsx|D]}|j|�qWdS)N)r!)rrr r
r
r�add_modulesNs
zFirewallTransaction.add_modulescCsx|D]}|j|�qWdS)N)r")rrr r
r
r�remove_modulesRs
z"FirewallTransaction.remove_modulescCs�tjdt|�|df�i}|sjxp|jD]<}x6t|j|�D]$}|j|g�j|jj|�j	|��q<Wq(Wn(x&|jD]}|j|g�j
|j|�qrW||jfS)Nz%s.prepare(%s, %s)z...)r�debug4�typer�reversedrrr�get_backend_by_name�reverse_rule�extendr)r�enabler�backend_namerr
r
r�prepareVszFirewallTransaction.preparecCstjdt|�|f�|j|�\}}|j�d}d}g}xp|D]h}y|jj|||�WnBtk
r�}z&d}|}tjt	j
��tj|�WYdd}~Xq>X|j|�q>W|s�|jj
||�}	|	r�|	\}
}|
r�tj|�|�ri}xH|D]@}g||<x2t||�D]"}||j|jj|�j|���qWq�Wxb|D]Z}y|jj|||�Wn<tk
�r�}ztjt	j
��tj|�WYdd}~XnX�q0Wxh|jD]^\}
}y|
|�WnFtk
�r�}z(tjt	j
��tjd|
||f�WYdd}~XnX�q�Wttj|��|j�dS)Nz%s.execute(%s)F�Tz#Calling fail func %s(%s) failed: %s)rr%r&r-�prerr�	Exception�debug1�	traceback�
format_exc�errorrZhandle_modulesr'r(r)r
rrZCOMMAND_FAILED�post)rr+rrr4ZerrorMsg�doner,�msgZ
module_returnZstatusZ
undo_rulesrrrr
r
r�executefsV



"&zFirewallTransaction.executecCs|tjdt|��xd|jD]Z\}}y||�Wqtk
rr}z(tjtj��tjd|||f�WYdd}~XqXqWdS)Nz%s.pre()z"Calling pre func %s(%s) failed: %s)	rr%r&rr0r1r2r3r4)rrrr7r
r
rr/�szFirewallTransaction.precCs|tjdt|��xd|jD]Z\}}y||�Wqtk
rr}z(tjtj��tjd|||f�WYdd}~XqXqWdS)Nz	%s.post()z#Calling post func %s(%s) failed: %s)	rr%r&r	r0r1r2r3r4)rrrr7r
r
rr5�szFirewallTransaction.postN)�__name__�
__module__�__qualname__rrrrrrrrrr!r"r#r$r-r8r/r5r
r
r
rr s"@)�__doc__�__all__r2Zfirewall.core.loggerrZfirewallrZfirewall.errorsr�objectrr
r
r
site-packages/firewall/core/__pycache__/ipset.cpython-36.pyc  [tar member header and compiled CPython 3.6 bytecode; binary payload omitted]
Module docstring: "The ipset command wrapper"; __all__ = ipset, check_ipset_name, remove_default_create_options. Recoverable constants: IPSET_TYPES (hash:ip, hash:ip,port, hash:ip,port,ip, hash:ip,port,net, hash:ip,mark, hash:net, hash:net,net, hash:net,port, hash:net,port,net, hash:net,iface, hash:mac), IPSET_CREATE_OPTIONS (family inet|inet6, hashsize, maxelem, timeout in secs) and the defaults family=inet, hashsize=1024, maxelem=65536. The ipset class shells out to the ipset command (set_create, set_destroy, set_add, set_delete, test, set_list, set_get_active_terse, save, set_restore via a temporary "ipset restore" file with the -exist flag, set_flush, rename, swap, version) and validates set names and types (INVALID_NAME, INVALID_TYPE). Module helpers are documented as "Return true if ipset name is valid" (check_ipset_name), "Return only non default create options", "Normalize IP addresses in entry" (normalize_ipset_entry), "Check if entry overlaps any entry in the list of entries" (check_entry_overlaps_existing) and "Check if any entry overlaps any entry in the list of entries" (check_for_overlapping_entries), the last three built on ipaddress.ip_network(..., strict=False).
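The overlap and normalization helpers named above are readable only through their docstrings; a rough approximation of the documented behaviour (not the original code -- firewalld raises its own FirewallError/INVALID_ENTRY rather than ValueError) looks like this:

# Approximate sketch of the documented ipset entry helpers.
import ipaddress

def normalize_ipset_entry(entry):
    """Normalize the network parts of an entry such as '10.1.2.3/24,tcp:80'."""
    parts = []
    for part in entry.split(","):
        try:
            part.index("/")   # only parts that look like networks are rewritten
            parts.append(str(ipaddress.ip_network(part, strict=False)))
        except ValueError:
            parts.append(part)
    return ",".join(parts)

def check_for_overlapping_entries(entries):
    """Raise ValueError if any two entries overlap (assumes one address family)."""
    try:
        networks = [ipaddress.ip_network(x, strict=False) for x in entries]
    except ValueError:
        return   # non-network entries (ports, MACs, ...) are not checked here
    if not networks:
        return
    networks.sort()
    prev = networks.pop(0)
    for current in networks:
        if prev.overlaps(current):
            raise ValueError("Entry '{}' overlaps entry '{}'".format(current, prev))
        prev = current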
site-packages/firewall/core/__pycache__/fw.cpython-36.pyc  [tar member header and compiled CPython 3.6 bytecode; binary payload omitted]
Defines the central Firewall class. Recoverable strings show that it ties together the ip4tables, ip6tables, ebtables, ipset and nftables backends plus the FirewallIcmpType, FirewallService, FirewallZone, FirewallDirect, FirewallConfig, FirewallPolicies, FirewallIPSet, FirewallHelper and FirewallPolicy objects; reads firewalld.conf keys (DefaultZone, CleanupOnExit, CleanupModulesOnExit, Lockdown, IPv6_rpfilter, IndividualCalls, LogDenied, FirewallBackend, FlushAllOnReload, RFC3964_IPv4 and AllowZoneDrifting -- the last warned about as "an insecure configuration option" that "will be removed in a future release"); loads icmptype, service, zone, policy, ipset and helper XML definitions from the /usr/lib/firewalld and /etc/firewalld directories; falls back to the public or external zone when the configured default zone is missing ("Default zone '%s' is not valid. Using '%s'."); and implements start/stop/reload, panic mode, kernel module load/unload accounting and the "Flushing and applying took %f seconds" timing message. It also degrades gracefully when iptables, ip6tables, ebtables, ipset or the *-restore commands are unusable ("iptables is not usable.", "iptables-restore is missing, using individual calls for IPv4 firewall.").
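The recovered option handling compares firewalld.conf values against "yes"/"true" and "no"/"false" before setting flags such as CleanupOnExit or FlushAllOnReload. A hypothetical helper expressing that pattern (firewalld inlines these checks rather than providing such a function):

# Hypothetical helper: map a firewalld.conf string onto a boolean with a fallback.
def conf_bool(value, default):
    if value is None:
        return default
    value = value.lower()
    if value in ("yes", "true"):
        return True
    if value in ("no", "false"):
        return False
    return default

# Examples: CleanupOnExit defaults to True and is only disabled explicitly.
cleanup_on_exit = conf_bool("no", default=True)       # -> False
flush_all_on_reload = conf_bool(None, default=True)   # -> True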
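The next entry summarizes the iptables/ip6tables wrapper, whose common_reverse_rule() helper ("Inverse valid rule") turns an add rule into the matching delete. A simplified illustration of that idea -- not the original helper, which walks every replaceable option rather than stopping at the first match:

# Simplified sketch of reversing an iptables argument list (-A/-I -> -D, -N -> -X).
def reverse_rule(args):
    replace = {
        "-A": "-D", "--append": "--delete",
        "-I": "-D", "--insert": "--delete",
        "-N": "-X", "--new-chain": "--delete-chain",
    }
    out = list(args)
    for flag, inverse in replace.items():
        if flag in out:
            i = out.index(flag)
            out[i] = inverse
            # "-I CHAIN 3 ..." becomes "-D CHAIN ..." -- drop the numeric position
            if flag in ("-I", "--insert") and i + 2 < len(out) and out[i + 2].isdigit():
                del out[i + 2]
            break
    return out

# reverse_rule(["-t", "filter", "-A", "INPUT", "-j", "ACCEPT"])
#   -> ["-t", "filter", "-D", "INPUT", "-j", "ACCEPT"]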
Zfirewall.core.fw_configrZfirewall.core.fw_policiesrZfirewall.core.fw_ipsetrZfirewall.core.fw_transactionrZfirewall.core.fw_helperrZfirewall.core.fw_policyrZfirewall.core.fw_nmrrZfirewall.core.loggerrZfirewall.core.io.firewalld_confrZfirewall.core.io.directrZfirewall.core.io.servicerZfirewall.core.io.icmptyperZfirewall.core.io.zonerrZfirewall.core.io.ipsetrZfirewall.core.ipsetrZfirewall.core.io.helperrZfirewall.core.io.policyr r!Zfirewall.errorsr"�objectrr?r?r?r@�<module>sHsite-packages/firewall/core/__pycache__/ipXtables.cpython-36.opt-1.pyc000064400000104137147511334670021572 0ustar003

]ûf���@s(ddlZddlZddlmZddlmZddlmZm	Z	m
Z
mZmZm
Z
mZmZddlmZddlmZmZmZmZmZddlmZmZmZmZmZmZmZddl Z dZ!d	d
dgdd
gdd
d	d
dgdd
d
gd	d
dgd�Z"ddd�Z#ddd�Z$dd�Z%dd�Z&dd�Z'Gdd�de(�Z)Gdd�de)�Z*dS)�N)�runProg)�log)�tempFile�readfile�	splitArgs�	check_mac�portStr�check_single_address�
check_address�normalizeIP6)�config)�
FirewallError�INVALID_PASSTHROUGH�INVALID_RULE�
UNKNOWN_ERROR�INVALID_ADDR)�Rich_Accept�Rich_Reject�	Rich_Drop�	Rich_Mark�Rich_Masquerade�Rich_ForwardPort�Rich_IcmpBlock��INPUT�OUTPUT�FORWARD�
PREROUTING�POSTROUTING)�security�raw�mangle�nat�filterzicmp-host-prohibitedzicmp6-adm-prohibited)�ipv4�ipv6�icmpz	ipv6-icmpcCs�ddddddd�}|dd�}x~|D]v}y|j|�}Wntk
rLw$YnX|d
kr�yt||d	�Wntk
r~YnX|j|d	�||||<q$W|S)z Inverse valid rule z-Dz--deletez-Xz--delete-chain)z-Az--appendz-Iz--insertz-Nz--new-chainN�-I�--insert�)r'r()�index�	Exception�int�pop)�args�replace_args�ret_args�arg�idx�r3�/usr/lib/python3.6/ipXtables.py�common_reverse_rule9s(
r5cCs�ddddddd�}|dd�}x�|D]x}y|j|�}Wntk
rLw$YnX|dkr�yt||d	�Wntk
r~YnX|j|d	�||||<|SWttd
��dS)z Reverse valid passthough rule z-Dz--deletez-Xz--delete-chain)z-Az--appendz-Iz--insertz-Nz--new-chainN�-I�--insertr)zno '-A', '-I' or '-N' arg)r6r7)r*�
ValueErrorr,r-r
r)r.r/r0�xr2r3r3r4�common_reverse_passthrough^s,
r:cCs�t|�}tddddddddd	d
ddd
dddddddg�}t||@�dkrbttdt||@�d��tddddddg�}t||@�dkr�ttd��dS)zZ Check if passthough rule is valid (only add, insert and new chain
    rules are allowed) z-Cz--checkz-Dz--deletez-Rz	--replacez-Lz--listz-Sz--list-rulesz-Fz--flushz-Zz--zeroz-Xz--delete-chainz-Pz--policyz-Ez--rename-chainrzarg '%s' is not allowedz-Az--appendz-Iz--insertz-Nz--new-chainzno '-A', '-I' or '-N' argN)�set�lenr
r�list)r.Znot_allowedZneededr3r3r4�common_check_passthrough�s*

r>c@s�eZdZdZdZdZdd�Zdd�Zdd�Zd	d
�Z	dd�Z
d
d�Zdd�Zdd�Z
dd�Zdd�Zdd�Zdd�Zdd�Zdd�Zdd �Zdhd"d#�Zd$d%�Zd&d'�Zd(d)�Zd*d+�Zdid,d-�Zd.d/�Zdjd1d2�Zd3d4�Zd5d6�Zdkd8d9�Zdld:d;�Z d<d=�Z!d>d?�Z"d@dA�Z#dBdC�Z$dDdE�Z%dFdG�Z&dHdI�Z'dJdK�Z(dLdM�Z)dNdO�Z*dPdQ�Z+dmdRdS�Z,dndTdU�Z-dodVdW�Z.dXdY�Z/dpdZd[�Z0dqd\d]�Z1drd^d_�Z2dsd`da�Z3dbdc�Z4ddde�Z5dfdg�Z6d!S)t�	ip4tablesr$TcCsd||_tj|j|_tjd|j|_|j�|_|j�|_	|j
�g|_i|_i|_
g|_i|_dS)Nz
%s-restore)�_fwrZCOMMANDS�ipv�_command�_restore_command�_detect_wait_option�wait_option�_detect_restore_wait_option�restore_wait_option�fill_exists�available_tables�rich_rule_priority_counts�policy_priority_counts�zone_source_index_cache�
our_chains)�self�fwr3r3r4�__init__�s

zip4tables.__init__cCs$tjj|j�|_tjj|j�|_dS)N)�os�path�existsrBZcommand_existsrCZrestore_command_exists)rNr3r3r4rH�szip4tables.fill_existscCs�|jr(|j|kr(|jgdd�|D�}ndd�|D�}tjd|j|jdj|��t|j|�\}}|dkr�td|jdj|�|f��|S)NcSsg|]}d|�qS)z%sr3)�.0�itemr3r3r4�
<listcomp>�sz#ip4tables.__run.<locals>.<listcomp>cSsg|]}d|�qS)z%sr3)rTrUr3r3r4rV�sz	%s: %s %s� rz'%s %s' failed: %s)rEr�debug2�	__class__rB�joinrr8)rNr.Z_args�status�retr3r3r4Z__run�szip4tables.__runc
Cs<y|j|�}Wntk
r"dSX||||d�<dSdS)NF�T)r*r8)rN�rule�patternZreplacement�ir3r3r4�
_rule_replace�szip4tables._rule_replacecCs|tko|t|kS)N)�BUILT_IN_CHAINS)rNrA�table�chainr3r3r4�is_chain_builtin�szip4tables.is_chain_builtincCs2d|g}|r|jd�n
|jd�|j|�|gS)Nz-tz-Nz-X)�append)rN�addrcrdr^r3r3r4�build_chain_rules�s

zip4tables.build_chain_rulescCs8d|g}|r |d|t|�g7}n|d|g7}||7}|S)Nz-tz-Iz-D)�str)rNrgrcrdr*r.r^r3r3r4�
build_rule�szip4tables.build_rulecCst|�S)N)r5)rNr.r3r3r4�reverse_rule�szip4tables.reverse_rulecCst|�dS)N)r>)rNr.r3r3r4�check_passthrough�szip4tables.check_passthroughcCst|�S)N)r:)rNr.r3r3r4�reverse_passthrough�szip4tables.reverse_passthroughcCs�d}y|jd�}Wntk
r&YnXt|�|dkrD||d}d}xLd
D]D}y|j|�}Wntk
rtYqNXt|�|dkrN||d}qNW||fS)Nr#z-tr]�-A�--append�-I�--insert�-N�--new-chain)rnrorprqrrrs)r*r8r<)rNr.rcr`rd�optr3r3r4�passthrough_parse_table_chain�s$z'ip4tables.passthrough_parse_table_chaincCs4yH|jd�}|j|�|j|�}d|dkr:||df}n||df}WnFtk
r�y|jd�}|j|�d}Wntk
r�dSXYnXd}|ddkr�d}|r�|r�||kr�|j|�nn|�r0|�r�||kr�|j|�|jdd
�d�|j|�}n|jj�rd}nt|�}d|d<|j	dd|d�dS)Nz%%ZONE_SOURCE%%z-m���z%%ZONE_INTERFACE%%Tr�-D�--deleteFcSs|dS)Nrr3)r9r3r3r4�<lambda>&sz4ip4tables._run_replace_zone_source.<locals>.<lambda>)�keyz-Ir)z%dr])ryrz)
r*r-r8�removerf�sortr@�_allow_zone_driftingr<�insert)rNr^rLr`�zoneZzone_source�rule_addr*r3r3r4�_run_replace_zone_source	s>







z"ip4tables._run_replace_zone_sourcecCsy|j|�}Wntk
r$Y�n�Xd}d}d}|j|�|j|�}t|�tkr\ttd��d}	xLdD]D}
y|j|
�}Wntk
r�YqfXt|�|dkrf||d}	qfWxhdD]`}
y|j|
�}Wntk
r�Yq�Xt|�|dk�r�||d}|
dk�rd}|
dkr�d}q�W|	|f}|�sp||k�sP|||k�sP|||dk�rZttd��|||d8<n�||k�r�i||<|||k�r�d|||<d}
xHt	||j
��D]4}||k�r�|�r�P|
|||7}
||k�r�P�q�W|||d7<d
||<|j|dd|
�dS)a
        Change something like
          -t filter -I public_IN %%RICH_RULE_PRIORITY%% 123
        or
          -t filter -A public_IN %%RICH_RULE_PRIORITY%% 321
        into
          -t filter -I public_IN 4
        or
          -t filter -I public_IN
        TFr]z%priority must be followed by a numberr#�-t�--table�-A�--append�-I�--insert�-D�--deleterz*nonexistent or underflow of priority countr)z%dN���)r�r�)r�r�r�r�r�r�)r�r�)r�r�)r*r8r-�typer,r
rr<r�sorted�keysr�)rNr^Zpriority_counts�tokenr`r�r�Zinsert_add_index�priorityrcrt�jrdr*�pr3r3r4�_set_rule_replace_priority2sj








z$ip4tables._set_rule_replace_prioritycCsPt�}i}tj|j�}tj|j�}tj|j�}�x�|D�]�}|dd�}	|j|	dddt|jg�|j|	dt	|jg�y|	j
d�}
Wntk
r�Yn8X|dkr�q6|d$kr�d
dd|g|	|
|
d
�<n
|	j|
�|j
|	|d�|j
|	|d�|j|	|�d}xZd%D]R}y|	j
|�}
Wntk
�r,Yn(Xt|	�|
d
k�r|	j|
�|	j|
�}�qWxhtt|	��D]X}
xPtjD]F}
|
|	|
k�rt|	|
jd��o�|	|
jd��rtd|	|
|	|
<�qtW�qhW|j|g�j|	�q6WxR|D]J}||}|jd|�x"|D]}	|jdj|	�d��qW|jd��q�W|j�tj|j�}tjd|j|j d|j|j!f�g}|j"�rz|j|j"�|jd�t#|j ||jd�\}}tj$�dk�r
t%|j�}|dk	�r
d
}
xH|D]@}tj&d|
|fd
dd �|jd��s�tj&d!d
d"�|
d
7}
�q�Wtj'|j�|dk�r:td#|j dj|�|f��||_||_||_dS)&Nz
%%REJECT%%�REJECTz
--reject-withz%%ICMP%%z%%LOGTYPE%%�off�unicast�	broadcast�	multicastz-m�pkttypez
--pkt-typer]z%%RICH_RULE_PRIORITY%%z%%POLICY_PRIORITY%%r#�-t�--table�"z"%s"z*%s
rW�
zCOMMIT
z	%s: %s %sz%s: %dz-n)�stdinr)z%8d: %sr)�nofmt�nlr)r�z'%s %s' failed: %s)r�r�r�)r�r�)(r�copy�deepcopyrJrKrLra�DEFAULT_REJECT_TYPErA�ICMPr*r8r-r�r�r<�range�stringZ
whitespace�
startswith�endswith�
setdefaultrf�writerZ�closerQ�stat�namerrXrYrC�st_sizerGrZgetDebugLogLevelrZdebug3�unlink)rN�rules�
log_denied�	temp_fileZtable_rulesrJrKrLZ_ruler^r`rcrt�cr�r.r[r\�lines�liner3r3r4�	set_rules�s�









zip4tables.set_rulesc
Cs�|j|dddt|jg�|j|dt|jg�y|jd�}Wntk
rRYn:X|dkr`dS|dkr�ddd
|g|||d�<n
|j|�tj|j	�}tj|j
�}tj|j�}|j||d�|j||d�|j
||�|j|�}||_	||_
||_|S)Nz
%%REJECT%%r�z
--reject-withz%%ICMP%%z%%LOGTYPE%%r�rr�r�r�z-mr�z
--pkt-typer]z%%RICH_RULE_PRIORITY%%z%%POLICY_PRIORITY%%)r�r�r�)rar�rAr�r*r8r-r�r�rJrKrLr�r��_ip4tables__run)rNr^r�r`rJrKrL�outputr3r3r4�set_rule�s.

zip4tables.set_ruleNcCs�g}|r|gntj�}xx|D]p}||jkr6|j|�qy,|jd|ddg�|jj|�|j|�Wqtk
r�tjd|j|f�YqXqW|S)Nz-tz-Lz-nzA%s table '%s' does not exist (or not enough permission to check).)	rbr�rIrfr�r8r�debug1rA)rNrcr\Ztablesr3r3r4�get_available_tabless

zip4tables.get_available_tablescCs`d}t|jdddg�}|ddkr\d}t|jdddg�}|ddkrHd}tjd|j|j|�|S)Nrz-wz-Lz-nrz-w10z%s: %s will be using %s option.)rrBrrXrY)rNrEr\r3r3r4rDszip4tables._detect_wait_optioncCs�t�}|jd�|j�d}xJdD]B}t|j|g|jd�}|ddkr"d|dkr"d	|dkr"|}Pq"Wtjd
|j|j|�t	j
|j�|S)Nz#foor�-w�--wait=2)r�rzinvalid optionr]zunrecognized optionz%s: %s will be using %s option.)r�r�)rr�r�rrCr�rrXrYrQr�)rNr�rEZtest_optionr\r3r3r4rF"s

z%ip4tables._detect_restore_wait_optioncCsVi|_i|_g|_g}x:tj�D].}|j|�s0q xdD]}|jd||g�q6Wq W|S)N�-F�-X�-Zz-t)r�r�r�)rJrKrLrbr�r�rf)rNr�rc�flagr3r3r4�build_flush_rules5s

zip4tables.build_flush_rulescCsfg}|dkrdn|}xLtj�D]@}|j|�s.q|dkr8qx$t|D]}|jd|d||g�qBWqW|S)NZPANIC�DROPr"z-tz-P)rbr�r�rf)rN�policyr��_policyrcrdr3r3r4�build_set_policy_rulesDs
z ip4tables.build_set_policy_rulescCs g}d}y"|jd|jdkrdnddg�}WnJtk
rt}z.|jdkrVtjd|�ntjd|�WYd	d	}~XnX|j�}d
}x�|D]�}|r�|j�j�}|j�}xD|D]<}	|	j	d�r�|	j
d�r�|	d
d�}
n|	}
|
|kr�|j|
�q�W|jdko�|j	d��s|jdkr�|j	d�r�d}q�W|S)zQReturn ICMP types that are supported by the iptables/ip6tables command and kernelrz-pr$r&z	ipv6-icmpz--helpziptables error: %szip6tables error: %sNF�(�)r]zValid ICMP Types:r%zValid ICMPv6 Types:Tr�)r�rAr8rr��
splitlines�strip�lower�splitr�r�rf)rNrAr\r�Zexr�Zin_typesr�Zsplitsr�r9r3r3r4�supported_icmp_typesPs4
 

zip4tables.supported_icmp_typescCsgS)Nr3)rNr3r3r4�build_default_tablesqszip4tables.build_default_tablesr�c	Csi}|jd�rpg|d<t�|jd<xLtdD]@}|djd|�|djd||f�|jdjd|�q,W|jd��r\g|d<t�|jd<x�tdD]�}|djd|�|djd||f�|jdjd|�|dkr�xt|jjr�ddd	d
gndd	d
gD]R}|djd||f�|djd|||f�|jdjtd
||fg���qWq�W|jd��rNg|d<t�|jd<x�tdD]�}|djd|�|djd||f�|jdjd|�|dk�r�xv|jj�r�ddd	d
gndd	d
gD]R}|djd||f�|djd|||f�|jdjtd
||fg���q�W�q�W|jd��r@g|d<t�|jd<x�tdD]�}|djd|�|djd||f�|jdjd|�|d9k�rxxv|jj�r�ddd	d
gndd	d
gD]R}|djd||f�|djd|||f�|jdjtd
||fg���q�W�qxWg|d<t�|jd<|djd�|djd�|djd�|djd�|jdjtd��xf|jj�r�ddd	d
gndd	d
gD]B}|djd|�|djd|�|jdjtd|���q�W|dk�r |djd�|djd�|dk�rF|djd�|djd�|djd�|djd �|djd!�|djd"�|jdjtd#��xJd:D]B}|djd$|�|djd%|�|jdjtd&|���q�Wxzd;D]r}xj|jj�r
dd	gnd	gD]N}|djd)||f�|djd*||f�|jdjtd+||f���qW�q�WxJd<D]B}|djd$|�|djd%|�|jdjtd&|���qnW|dk�r�|djd,�|djd-�|dk�r�|djd.�|djd/�|dd0d1d2d3g7<|jdjtd4��xJd=D]B}|djd5|�|djd6|�|jdjtd7|���q2WxJd>D]B}|djd5|�|djd6|�|jdjtd7|���q~Wg}xJ|D]B}||j�k�r�q�x(||D]}|jd8|gt	|���q�W�q�W|S)?Nrz-N %s_directz-A %s -j %s_directz	%s_directr r�POLICIES_preZZONES_SOURCEZZONES�
POLICIES_postz-N %s_%sz-A %s -j %s_%sz%s_%sr!r"rr#zB-A INPUT -m conntrack --ctstate RELATED,ESTABLISHED,DNAT -j ACCEPTz-A INPUT -i lo -j ACCEPTz-N INPUT_directz-A INPUT -j INPUT_directZINPUT_directz-N INPUT_%sz-A INPUT -j INPUT_%szINPUT_%sr�z^-A INPUT -m conntrack --ctstate INVALID %%LOGTYPE%% -j LOG --log-prefix 'STATE_INVALID_DROP: 'z/-A INPUT -m conntrack --ctstate INVALID -j DROPz9-A INPUT %%LOGTYPE%% -j LOG --log-prefix 'FINAL_REJECT: 'z-A INPUT -j %%REJECT%%zD-A FORWARD -m conntrack --ctstate RELATED,ESTABLISHED,DNAT -j ACCEPTz-A FORWARD -i lo -j ACCEPTz-N FORWARD_directz-A FORWARD -j FORWARD_directZFORWARD_directz
-N FORWARD_%sz-A FORWARD -j FORWARD_%sz
FORWARD_%s�IN�OUTz-N FORWARD_%s_%sz-A FORWARD -j FORWARD_%s_%sz
FORWARD_%s_%sz`-A FORWARD -m conntrack --ctstate INVALID %%LOGTYPE%% -j LOG --log-prefix 'STATE_INVALID_DROP: 'z1-A FORWARD -m conntrack --ctstate INVALID -j DROPz;-A FORWARD %%LOGTYPE%% -j LOG --log-prefix 'FINAL_REJECT: 'z-A FORWARD -j %%REJECT%%z-N OUTPUT_directz>-A OUTPUT -m conntrack --ctstate RELATED,ESTABLISHED -j ACCEPTz-A OUTPUT -o lo -j ACCEPTz-A OUTPUT -j OUTPUT_directZ
OUTPUT_directz-N OUTPUT_%sz-A OUTPUT -j OUTPUT_%sz	OUTPUT_%sz-t)rr)r�)r�r�)r�)r�)r�)
r�r;rMrbrfrgr@r�updater)	rNr�Z
default_rulesrdZdispatch_suffix�	directionZfinal_default_rulesrcr^r3r3r4�build_default_rulesus�
$(
&*
&*&



(






"zip4tables.build_default_rulescCsf|dkrdddhS|dkr,d|j�kr,dhS|dkrHd|j�krHddhS|d	krbd	|j�krbdhSiS)
Nr#r�
FORWARD_IN�FORWARD_OUTr!rr"rr )r�)rNrcr3r3r4�get_zone_table_chains�s
zip4tables.get_zone_table_chainsc	s�|jjj|���jdkrdnd��dkr4�dkr4dnd}	|jjj|�t|	��g}
g}x|D]}|
jd|g�qZWx|D]}|jd	|g�qvWxB|D]:}
|jjj|
�}|dkr�|j	|�r�q�|
j|j
d|
��q�Wx\|D]T}
|jjj|
�}|dk�r|j	|��rq�t|
��r�dk�rq�|j|j
d|
��q�W������fdd�}g}|
�r�x�|
D]F}|�r�x8|D]}|j|||���qdWn|�r�n|j||d���qTWnH|�r�n@|�r�x8|D]}|j|d|���q�Wn|�r�n|j|dd��|S)Nr�pre�postr"rTFz-iz-or$r%z-sr�rz-dcsVddd��}d�|d��fd�jg}|r6|j|�|rD|j|�|jd�g�|S)Nz-Az-D)TFz-tz%s_POLICIES_%sz%%POLICY_PRIORITY%%z-j)r��extend)�ingress_fragment�egress_fragment�add_delr^)r�rd�chain_suffix�enable�p_objrcr3r4�_generate_policy_dispatch_rules


zSip4tables.build_policy_ingress_egress_rules.<locals>._generate_policy_dispatch_rule)r$r%)r$r%)rr�r)r@r�Z
get_policyr��policy_base_chain_name�POLICY_CHAIN_PREFIXrfr�Zcheck_source�is_ipv_supported�_rule_addr_fragmentr)rNr�r�rcrdZingress_interfacesZegress_interfacesZingress_sourcesZegress_sources�isSNATZingress_fragmentsZegress_fragments�	interface�addrrAr�r�r�r�r3)r�rdr�r�r�rcr4�!build_policy_ingress_egress_rules�sR






z+ip4tables.build_policy_ingress_egress_rulesFc
Cs�|dkr|dkrdnd}|jjj||t|d�}	ddddddd�|}
d	}|rb|rbd
d|dg}n,|rtd
d|g}ndd|g}|s�|dg7}|d||
|||	g7}|gS)Nr"rTF)r�z-iz-o)rrrr�r�rz-gz-Iz%s_ZONESz%%ZONE_INTERFACE%%z-Az-Dz-t)r@r�r�r�)
rNr�r�r�r�rcrdrfr�r�rt�actionr^r3r3r4�!build_zone_source_interface_rulesKs&

z+ip4tables.build_zone_source_interface_rulescCs�|jd�rP|dd�}|dkr$d}nd}dj|g|jjj|��}ddd	||gSt|�rz|dkrjttd
��ddd|j�gSt	d
|�r�t
|�}n,td
|�r�|jd�}t
|d�d|d}||gSdS)Nzipset:�z-d�dst�src�,z-mr;z--match-setzCan't match a destination MAC.�macz--mac-sourcer%�/rr])
r�rZr@�ipsetZ
get_dimensionrr
r�upperr	rr
r�)rNrt�address�invertr��flags�
addr_splitr3r3r4r�es"





zip4tables._rule_addr_fragmentc
Cs�ddd�|}|dkr"|dkr"dnd}|jjj||t|d�}	d	d
d	d	d
d
d�|}
|jjrdd|}nd
|}t|�r�|dkr�gS||d|d|g}|j|j|
|��|jd|	g�|gS)Nz-Iz-D)TFr"rTF)r�z-sz-d)rrrr�r�rz%s_ZONES_SOURCEz%s_ZONESr�rz%%ZONE_SOURCE%%z-tz-g)rr�r)r@r�r�r�rrr�r�)
rNr�r�r�r�rcrdr�r�r�rtZzone_dispatch_chainr^r3r3r4�build_zone_source_address_rules{s&
z)ip4tables.build_zone_source_address_rulescCs>ddd�|}ddd�|}|dkr0|dkr0dnd	}|jjj||t|d
�}|j|jt|d|d|d
|d|d|g��g}	|	j||d|g�|	j|d
|d|g�|	j|d|d|g�|	j|d|d|g�|	j|d|d|g�|	j|d|d|g�|	j||d|dd
|g�|	j||d|dd|g�|	j||d|dd|g�|	j||d|dd|g�|	j||d|dd|g�|jjj|j	}
|jj
�dk�r|dk�r|
dk�r�|	j||d|ddddd|g	�|
dk�r|	j||d|ddddd|g	�|dk�r,|
dk�r,|	j||d|d|
g�|�s:|	j�|	S)Nz-Nz-X)TFz-Az-Dr"rTF)r�z%s_logz%s_denyz%s_prez%s_postz%s_allowz-tz-jr�r#r��
%%REJECT%%z%%LOGTYPE%%�LOGz--log-prefixz
"%s_REJECT: "r�z"%s_DROP: "�ACCEPT)r�r�)r�r�r�r�)r@r�r�r�rMr�r;rfZ	_policies�target�get_log_denied�reverse)rNr�r�rcrdZ
add_del_chainZadd_del_ruler�r�r�r�r3r3r4�build_policy_chain_rules�sN




z"ip4tables.build_policy_chain_rulescCs2|sgSddd|jg}|jdk	r.|d|jg7}|S)Nz-m�limitz--limitz
--limit-burst)�valueZburst)rNr�sr3r3r4�_rule_limit�s
zip4tables._rule_limitcCs�t|j�tttgkrn<|jrHt|j�tttt	gkrRt
tdt|j���n
t
td��|jdkr�t|j�ttgks�t|j�tt	gkr�dSt|j�tgks�t|j�ttgkr�dSn|jdkr�dSdSdS)NzUnknown action %szNo rule action specified.r�allowZdenyr�r�)
r��elementrrrr�rrrrr
rr�)rN�	rich_ruler3r3r4�_rich_rule_chain_suffix�s 


z!ip4tables._rich_rule_chain_suffixcCs>|jr|jrttd��|jdkr(dS|jdkr6dSdSdS)NzNot log or auditrrr�r�)r�auditr
rr�)rNrr3r3r4� _rich_rule_chain_suffix_from_log�s


z*ip4tables._rich_rule_chain_suffix_from_logcCs|jdkrgSd|jgS)Nrz%%RICH_RULE_PRIORITY%%)r�)rNrr3r3r4�_rich_rule_priority_fragment�s
z&ip4tables._rich_rule_priority_fragmentc
Cs�|js
gS|jjj||t�}ddd�|}|j|�}d||d||fg}	|	|j|�7}	|	|ddg7}	|jjr�|	dd	|jjg7}	|jjr�|	d
d|jjg7}	|	|j	|jj
�7}	|	S)Nz-Az-D)TFz-tz%s_%sz-jr�z--log-prefixz'%s'z--log-levelz%s)rr@r�r�r�rr�prefix�levelrr)
rNr�rr�rc�
rule_fragmentr�r�r�r^r3r3r4�_rich_rule_log�s
zip4tables._rich_rule_logcCs�|js
gSddd�|}|jjj||t�}|j|�}d||d||fg}	|	|j|�7}	|	|7}	t|j�t	krrd}
n,t|j�t
kr�d}
nt|j�tkr�d}
nd	}
|	d
dd|
g7}	|	|j|jj
�7}	|	S)
Nz-Az-D)TFz-tz%s_%sZacceptZrejectZdrop�unknownz-jZAUDITz--type)r
r@r�r�r�rrr�r�rrrrr)rNr�rr�rcrr�r�r�r^Z_typer3r3r4�_rich_rule_audits$
zip4tables._rich_rule_auditcCs2|js
gSddd�|}|jjj||t�}|j|�}d||f}	t|j�tkrXddg}
n�t|j�tkr�ddg}
|jjr�|
d|jjg7}
nnt|j�t	kr�dd	g}
nVt|j�t
kr�d
}|jjj||t�}d||f}	ddd|jjg}
ntt
d
t|j���d|||	g}||j|�7}|||
7}||j|jj�7}|S)Nz-Az-D)TFz%s_%sz-jr�r�z
--reject-withr�r!�MARKz--set-xmarkzUnknown action %sz-t)r�r@r�r�r�r	r�rrrrr;r
rrrr)rNr�rr�rcrr�r�r�rdZrule_actionr^r3r3r4�_rich_rule_action$s4


zip4tables._rich_rule_actioncCs�|sgSg}|jr�|jr"|jd�td|j�rB|dt|j�g7}q�td|j�r||jjd�}|dt|d�d|dg7}q�|d|jg7}nD|jr�|ddg7}|jr�|jd�|jj	j
|jd	�}|d
|j|g7}|S)N�!r%z-dr�rr]z-mr;r�z--match-set)r�r�rfr	rr
r�r�r@r��_ipset_match_flags)rNZ	rich_destrr�r�r3r3r4�_rich_rule_destination_fragmentFs&
"
z)ip4tables._rich_rule_destination_fragmentcCs|sgSg}|jr�|jr"|jd�td|j�rB|dt|j�g7}nHtd|j�r||jjd�}|dt|d�d|dg7}n|d|jg7}n�t|d�r�|jr�|ddg7}|jr�|jd�|d	|jg7}nPt|d
�o�|j	�r|ddg7}|jr�|jd�|j
jj|j	d�}|d
|j	|g7}|S)Nrr%z-sr�rr]r�z-mz--mac-sourcer�r;r�z--match-set)
r�r�rfr	rr
r��hasattrr�r�r@r�r)rNZrich_sourcerr�r�r3r3r4�_rich_rule_source_fragment^s0
"

z$ip4tables._rich_rule_source_fragmentcCsddd�|}d}|jjj||t�}	d|g}
|rD|
ddt|�g7}
|rT|
d|g7}
|rx|
|j|j�7}
|
|j|j�7}
|s�t	|j
�tkr�|
d	d
ddg7}
g}|r�|j|j
|||||
��|j|j|||||
��|j|j|||||
��n"|j|d
|	d|g|
ddg�|S)Nz-Az-D)TFr#z-pz--dportz%sz-dz-m�	conntrackz	--ctstatez
NEW,UNTRACKEDz%s_allowz-tz-jr�)r@r�r�r�rr�destinationr�sourcer�r�rrfrrr)rNr�r��proto�portrrr�rcr�rr�r3r3r4�build_policy_ports_rules{s*z"ip4tables.build_policy_ports_rulescCs�ddd�|}d}|jjj||t�}d|g}	|r<|	d|g7}	|r`|	|j|j�7}	|	|j|j�7}	|stt|j	�t
kr�|	ddd	d
g7}	g}
|r�|
j|j|||||	��|
j|j
|||||	��|
j|j|||||	��n"|
j|d|d|g|	d
dg�|
S)Nz-Az-D)TFr#z-pz-dz-mrz	--ctstatez
NEW,UNTRACKEDz%s_allowz-tz-jr�)r@r�r�r�rrrrr�r�rrfrrr)rNr�r��protocolrrr�rcr�rr�r3r3r4�build_policy_protocol_rules�s&z%ip4tables.build_policy_protocol_rulescCsddd�|}d}|jjj||t�}	d|g}
|rD|
ddt|�g7}
|rT|
d|g7}
|rx|
|j|j�7}
|
|j|j�7}
|s�t	|j
�tkr�|
d	d
ddg7}
g}|r�|j|j
|||||
��|j|j|||||
��|j|j|||||
��n"|j|d
|	d|g|
ddg�|S)Nz-Az-D)TFr#z-pz--sportz%sz-dz-mrz	--ctstatez
NEW,UNTRACKEDz%s_allowz-tz-jr�)r@r�r�r�rrrrrr�r�rrfrrr)rNr�r�rrrrr�rcr�rr�r3r3r4�build_policy_source_ports_rules�s*z)ip4tables.build_policy_source_ports_rulescCsvd}|jjj||t�}	ddd�|}
|
d|	ddd|g}|rP|dd	t|�g7}|r`|d
|g7}|ddd
|g7}|gS)Nr z-Az-D)TFz%s_allowz-tz-pz--dportz%sz-dz-jZCTz--helper)r@r�r�r�r)rNr�r�rrrZhelper_nameZmodule_short_namercr�r�r^r3r3r4�build_policy_helper_ports_rules�sz)ip4tables.build_policy_helper_ports_rulesc
	Cs�ddd�|}|jjj||t�}g}	|rH|	jdd|d|d|dd	g�n6t|�rTgS|	jdd|d|g|jd
|�dd	g�|	S)Nz-Az-D)TFz-tr#z%s_allowz-oz-jr�z-d)r@r�r�r�rfrr�)
rNr�r�r�rcr�rr�r�r�r3r3r4�build_zone_forward_rules�sz"ip4tables.build_zone_forward_rulesc
Cs,d}|jjj||tdd�}ddd�|}g}|rj|j|�}||j|�7}||j|j�7}||j|j	�7}nd}g}	|	j
dd|d	||fg|d
ddd
dg�g}|r�|j|�}||j|�7}||j|j�7}||j|j	�7}nd}d}|jjj||t�}|	j
dd|d	||fg|ddddd
dg�|	S)Nr"T)r�z-Az-D)TFrz-tz%s_%srz-o�loz-jZ
MASQUERADEr#z-mrz	--ctstatez
NEW,UNTRACKEDr�)r@r�r�r�r	rrrrrrf)
rNr�r�rrcr�r�rr�r�r3r3r4�build_policy_masquerade_rules�s6

z'ip4tables.build_policy_masquerade_rulesc
Cs
d}|jjj||t�}	ddd�|}
d}|rPtd|�rH|dt|�7}n||7}|rn|dkrn|dt|d	�7}g}|r�|j|�}
|j|�}||j	|j
�7}||j|j�7}nd
}
g}|r�|j
|j|||d|��|j
dd|
d|	|
fg|d
|dt|�ddd|g�|S)Nr"z-Az-D)TFrr%z[%s]z:%s�-rz-tz%s_%sz-pz--dportz-jZDNATz--to-destination)r@r�r�r�r	rrr	rrrrrrfr)rNr�r�rr ZtoportZtoaddrrrcr�r�Ztorr�r�r3r3r4�build_policy_forward_port_ruless2


z)ip4tables.build_policy_forward_port_rulescCs�d}|jjj||t�}ddd�|}|jdkrFddg}ddd	|jg}	ndd
g}ddd|jg}	g}
|jjj|�r|d
|}d}nd|}d}g}
|r�|
|j|j�7}
|
|j	|j
�7}
|
||	7}
|�rP|
j|j|||||
��|
j|j
|||||
��|j�r|
j|j|||||
��n:|j|�}|
jd||d||fg|j|�|
ddg�n`|jj�dk�r�|dk�r�|
j||d|g|
ddddd|g�|
j||d|g|
d|g�|
S)Nr#z-Az-D)TFr$z-pr&z-mz--icmp-typez	ipv6-icmpZicmp6z
--icmpv6-typez%s_allowr�z%s_denyz
%%REJECT%%z-tz%s_%sz-jr�z%%LOGTYPE%%r�z--log-prefixz"%s_ICMP_BLOCK: ")r@r�r�r�rAr��query_icmp_block_inversionrrrrrfrrr�rr	rr�)rNr�r�Zictrrcr�r�r�matchr�Zfinal_chainZfinal_targetrr�r3r3r4�build_policy_icmp_block_rules3sJ

 z'ip4tables.build_policy_icmp_block_rulesc	Cs�d}|jjj||t�}g}d}|jjj|�r�d}|jj�dkr�|rRd|t|�g}nd|g}|d|dd	d
ddd
d|g	}|j|�|d7}nd}|r�d|t|�g}nd|g}|d|dd	d|g}|j|�|S)Nr#r�z
%%REJECT%%r�z-Iz-Dz-tz-pz%%ICMP%%z%%LOGTYPE%%z-jr�z--log-prefixz"%s_ICMP_BLOCK: "r]r�)r@r�r�r�r)r�rirf)	rNr�r�rcr�r�Zrule_idxZ
ibi_targetr^r3r3r4�'build_policy_icmp_block_inversion_rulesds.



z1ip4tables.build_policy_icmp_block_inversion_rulescCsxd}g}||j|j�7}||j|j�7}g}|j|j|||||��|j|j|||||��|j|j|||||��|S)Nr#)rrrrrfrrr)rNr�r�rrcrr�r3r3r4�*build_policy_rich_source_destination_rules�sz4ip4tables.build_policy_rich_source_destination_rulescCs
||jkS)N)rA)rNrAr3r3r4r��szip4tables.is_ipv_supported)N)N)r�)F)F)NN)NN)NN)NN)N)N)N)7�__name__�
__module__�__qualname__rAr�Zpolicies_supportedrPrHr�rarerhrjrkrlrmrur�r�r�r�r�rDrFr�r�r�r�r�r�r�r�r�r�rrr	rrrrrrrrr!r"r#r$r&r(r+r,r-r�r3r3r3r4r?�sh

			)Pa#

!
zN

0"




&
!
1"r?c@s&eZdZdZdZddd�Zdd�ZdS)	�	ip6tablesr%Fc
Cs�g}|jddddddddd	d
g
�|dkrL|jddddddddd	dd
dg�|jdddddddd	dg	�|jdddddddd	dg	�|S)Nz-Irz-tr!z-mZrpfilterz--invertz--validmarkz-jr�r�r�z--log-prefixzrpfilter_DROP: z-pz	ipv6-icmpz$--icmpv6-type=neighbour-solicitationr�z"--icmpv6-type=router-advertisement)rf)rNr�r�r3r3r4�build_rpfilter_rules�s$



zip6tables.build_rpfilter_rulescCs�ddddddddd	g	}d
}|jdj|�g}|jddd
|g�xT|D]L}|jddd|d|ddddg
�|jjdkrF|jddd|d|ddddg
�qFW|jdddddd|g�|jdddddd|g�|S)Nz::0.0.0.0/96z::ffff:0.0.0.0/96z2002:0000::/24z2002:0a00::/24z2002:7f00::/24z2002:ac10::/28z2002:c0a8::/32z2002:a9fe::/32z2002:e000::/19ZRFC3964_IPv4r#z-tz-Nz-Iz-dz-jr�z
--reject-withzaddr-unreachr��allr�z--log-prefixz"RFC3964_IPv4_REJECT: "r�4r)r�r3)rMrgrfr@Z_log_denied)rNZ
daddr_listZ
chain_namer�Zdaddrr3r3r4�build_rfc3964_ipv4_rules�s4



z"ip6tables.build_rfc3964_ipv4_rulesN)F)r.r/r0rAr�r2r5r3r3r3r4r1�s
r1)+Zos.pathrQr�Zfirewall.core.progrZfirewall.core.loggerrZfirewall.functionsrrrrrr	r
rZfirewallrZfirewall.errorsr
rrrrZfirewall.core.richrrrrrrrr�r�rbr�r�r5r:r>�objectr?r1r3r3r3r4�<module>s@($%* xsite-packages/firewall/core/__pycache__/icmp.cpython-36.opt-1.pyc000064400000004265147511334670020570 0ustar003

]ûf�#@s�ddddgZddddddd	d
ddd
dddddddddddddddddddd d!d"d#d$�"Zd%d&d'd(d)dddd*d+d,d,d-d-d.d/d0d0d1d1d2d3�Zd4d5�Zd6d�Zd7d8�Zd9d�Zd:S);�
ICMP_TYPES�ICMPV6_TYPES�check_icmp_type�check_icmpv6_typez0/0z3/0z3/1z3/2z3/3z3/4z3/5z3/6z3/7z3/9z3/10z3/11z3/12z3/13z3/14z3/15z4/0z5/0z5/1z5/2z5/3z8/0z9/0z10/0z11/0z11/1z12/0z12/1z13/0z14/0z17/0z18/0)"z
echo-reply�pongznetwork-unreachablezhost-unreachablezprotocol-unreachablezport-unreachablezfragmentation-neededzsource-route-failedznetwork-unknownzhost-unknownznetwork-prohibitedzhost-prohibitedzTOS-network-unreachablezTOS-host-unreachablezcommunication-prohibitedzhost-precedence-violationzprecedence-cutoffz
source-quenchznetwork-redirectz
host-redirectzTOS-network-redirectzTOS-host-redirectzecho-request�pingzrouter-advertisementzrouter-solicitationzttl-zero-during-transitzttl-zero-during-reassemblyz
ip-header-badzrequired-option-missingztimestamp-requestztimestamp-replyzaddress-mask-requestzaddress-mask-replyz1/0z1/1z1/3z1/4z2/0z4/1z4/2z128/0z129/0z133/0z134/0z135/0z136/0z137/0)zno-routezcommunication-prohibitedzaddress-unreachablezport-unreachablezpacket-too-bigzttl-zero-during-transitzttl-zero-during-reassemblyz
bad-headerzunknown-header-typezunknown-optionzecho-requestrz
echo-replyrzrouter-solicitationzrouter-advertisementzneighbour-solicitationzneigbour-solicitationzneighbour-advertisementzneigbour-advertisementZredirectcCs|tkrdSdS)NTF)r)�_name�r�/usr/lib/python3.6/icmp.py�check_icmp_nameVsr
cCs|tj�krdSdS)NTF)r�values)�_typerrr	r[scCs|tkrdSdS)NTF)r)rrrr	�check_icmpv6_name`sr
cCs|tj�krdSdS)NTF)rr)rrrr	resN)�__all__rrr
rr
rrrrr	�<module>sxsite-packages/firewall/core/__pycache__/fw_helper.cpython-36.pyc000064400000003705147511334670020652 0ustar003

]ûf)�@s6dZdgZddlmZddlmZGdd�de�ZdS)zhelper backend�FirewallHelper�)�errors)�
FirewallErrorc@s\eZdZdd�Zdd�Zdd�Zdd�Zd	d
�Zdd�Zd
d�Z	dd�Z
dd�Zdd�ZdS)rcCs||_i|_dS)N)Z_fw�_helpers)�self�fw�r�/usr/lib/python3.6/fw_helper.py�__init__szFirewallHelper.__init__cCsd|j|jfS)Nz%s(%r))�	__class__r)rrrr	�__repr__"szFirewallHelper.__repr__cCs|jj�dS)N)r�clear)rrrr	�cleanup'szFirewallHelper.cleanupcCs||j�krttj|��dS)N)�get_helpersrr�INVALID_HELPER)r�namerrr	�check_helper*szFirewallHelper.check_helpercCs||j�kS)N)r)rrrrr	�query_helper.szFirewallHelper.query_helpercCst|jj��S)N)�sortedr�keys)rrrr	r1szFirewallHelper.get_helperscCst|j�dkS)Nr)�lenr)rrrr	�has_helpers4szFirewallHelper.has_helperscCs|j|�|j|S)N)rr)rrrrr	�
get_helper7s
zFirewallHelper.get_helpercCs||j|j<dS)N)rr)r�objrrr	�
add_helper;szFirewallHelper.add_helpercCs"||jkrttj|��|j|=dS)N)rrrr)rrrrr	�
remove_helper>s
zFirewallHelper.remove_helperN)
�__name__�
__module__�__qualname__r
rrrrrrrrrrrrr	rsN)�__doc__�__all__ZfirewallrZfirewall.errorsr�objectrrrrr	�<module>ssite-packages/firewall/core/__pycache__/fw_direct.cpython-36.opt-1.pyc000064400000030436147511334670021605 0ustar003

]ûf
V�@sndgZddlmZddlmZddlmZddlmZddlm	Z	ddl
mZddlm
Z
Gd	d�de�Zd
S)�FirewallDirect�)�LastUpdatedOrderedDict)�	ipXtables)�ebtables)�FirewallTransaction)�log)�errors)�
FirewallErrorc@sDeZdZdd�Zdd�Zdd�Zdd�Zd	d
�Zdd�Zd
d�Z	dLdd�Z
dd�Zdd�ZdMdd�Z
dd�Zdd�Zdd�Zdd�ZdNd d!�ZdOd"d#�Zd$d%�Zd&d'�Zd(d)�ZdPd*d+�ZdQd,d-�Zd.d/�Zd0d1�Zd2d3�Zd4d5�Zd6d7�Zd8d9�ZdRd:d;�ZdSd<d=�Z d>d?�Z!d@dA�Z"dBdC�Z#dDdE�Z$dFdG�Z%dHdI�Z&dJdK�Z'dS)TrcCs||_|j�dS)N)�_fw�_FirewallDirect__init_vars)�self�fw�r�/usr/lib/python3.6/fw_direct.py�__init__'szFirewallDirect.__init__cCsd|j|j|j|jfS)Nz%s(%r, %r, %r))�	__class__�_chains�_rules�_rule_priority_positions)rrrr�__repr__+szFirewallDirect.__repr__cCs"i|_i|_i|_i|_d|_dS)N)rrr�
_passthroughs�_obj)rrrrZ__init_vars/s
zFirewallDirect.__init_varscCs|j�dS)N)r)rrrr�cleanup6szFirewallDirect.cleanupcCs
t|j�S)N)rr
)rrrr�new_transaction;szFirewallDirect.new_transactioncCs
||_dS)N)r)r�objrrr�set_permanent_config@sz#FirewallDirect.set_permanent_configcCs\t|j�t|j�t|j�dkr&dSt|jj��t|jj��t|jj��dkrXdSdS)NrTF)�lenrrrr�get_all_chains�
get_all_rules�get_all_passthroughs)rrrr�has_configurationCs"z FirewallDirect.has_configurationNcCsP|dkr|j�}n|}|j|jj�|jj�|jj�f|�|dkrL|jd�dS)NT)r�
set_configrrrr�execute)r�use_transaction�transactionrrr�apply_directLs

zFirewallDirect.apply_directcCsi}i}i}xL|jD]B}|\}}x4|j|D]&}|jj|||�s,|j|g�j|�q,WqWxf|jD]\}|\}}}xL|j|D]>\}	}
|jj||||	|
�s|||kr�t�||<|	|||	|
f<q|WqbWxP|jD]F}x@|j|D]2}
|jj	||
�s�||k�r�g||<||j|
�q�Wq�W|||fS)N)
rr�query_chain�
setdefault�appendr�
query_rulerr�query_passthrough)rZchains�rulesZpassthroughs�table_id�ipv�table�chain�chain_id�priority�argsrrr�get_runtime_config]s,


z!FirewallDirect.get_runtime_configcCs|j|j|jfS)N)rrr)rrrr�
get_config|szFirewallDirect.get_configcCs�|dkr|j�}n|}|\}}}x||D]t}|\}}	xf||D]Z}
|j||	|
�s<y|j||	|
|d�Wq<tk
r�}ztjt|��WYdd}~Xq<Xq<Wq&Wx�|D]�}|\}}	}
xt||D]h\}
}|j||	|
|
|�s�y|j||	|
|
||d�Wq�tk
�r"}ztjt|��WYdd}~Xq�Xq�Wq�Wxx|D]p}xh||D]\}|j	||��s@y|j
|||d�Wn2tk
�r�}ztjt|��WYdd}~XnX�q@W�q2W|dk�r�|jd�dS)N)r#T)rr&�	add_chainr	rZwarning�strr)�add_ruler*�add_passthroughr")rZconfr#r$rrrr,r-r.r/�errorr0r1r2rrrr!s@



(

(
,
zFirewallDirect.set_configcCs*dddg}||kr&ttjd||f��dS)N�ipv4�ipv6Zebz'%s' not in '%s')r	rZINVALID_IPV)rr-Zipvsrrr�
_check_ipv�s
zFirewallDirect._check_ipvcCsF|j|�|dkrtjj�ntjj�}||krBttjd||f��dS)Nr:r;z'%s' not in '%s')r:r;)r<r�BUILT_IN_CHAINS�keysrr	rZ
INVALID_TABLE)rr-r.Ztablesrrr�_check_ipv_table�s

zFirewallDirect._check_ipv_tablecCs�|dkr4tj|}|jjr i}qH|jj|�j|}ntj|}tj|}||kr`tt	j
d|��||krxtt	j
d|��|dkr�|jjj|�dk	r�tt	j
d|��dS)Nr:r;zchain '%s' is built-in chainzchain '%s' is reservedzChain '%s' is reserved)r:r;)r:r;)rr=r
�nftables_enabled�get_direct_backend_by_ipv�
our_chainsrZ
OUR_CHAINSr	rZ
BUILTIN_CHAIN�zoneZzone_from_chainZ
INVALID_CHAIN)rr-r.r/Zbuilt_in_chainsrBrrr�_check_builtin_chain�s"




z#FirewallDirect._check_builtin_chaincCsH|r|jj|g�j|�n*|j|j|�t|j|�dkrD|j|=dS)Nr)rr'r(�remover)rr,r/�addrrr�_register_chain�s
zFirewallDirect._register_chaincCs>|dkr|j�}n|}|jd||||�|dkr:|jd�dS)NT)r�_chainr")rr-r.r/r#r$rrrr5�s
zFirewallDirect.add_chaincCs>|dkr|j�}n|}|jd||||�|dkr:|jd�dS)NFT)rrHr")rr-r.r/r#r$rrr�remove_chain�s
zFirewallDirect.remove_chaincCs:|j||�|j|||�||f}||jko8||j|kS)N)r?rDr)rr-r.r/r,rrrr&�s

zFirewallDirect.query_chaincCs,|j||�||f}||jkr(|j|SgS)N)r?r)rr-r.r,rrr�
get_chains�s


zFirewallDirect.get_chainscCsDg}x:|jD]0}|\}}x"|j|D]}|j|||f�q$WqW|S)N)rr()r�r�keyr-r.r/rrrr�szFirewallDirect.get_all_chainscCsB|dkr|j�}n|}|jd||||||�|dkr>|jd�dS)NT)r�_ruler")rr-r.r/r1r2r#r$rrrr7s
zFirewallDirect.add_rulecCsB|dkr|j�}n|}|jd||||||�|dkr>|jd�dS)NFT)rrMr")rr-r.r/r1r2r#r$rrr�remove_rules
zFirewallDirect.remove_rulecCs2|j||�|||f}||jko0||f|j|kS)N)r?r)rr-r.r/r1r2r0rrrr)s

zFirewallDirect.query_rulecCs6|j||�|||f}||jkr2t|j|j��SgS)N)r?r�listr>)rr-r.r/r0rrr�	get_ruless


zFirewallDirect.get_rulesc	CsRg}xH|jD]>}|\}}}x.|j|D] \}}|j||||t|�f�q&WqW|S)N)rr(rO)rrKrLr-r.r/r1r2rrrr%s
 zFirewallDirect.get_all_rulescCs�|rr||jkrt�|j|<||j||<||jkr<i|j|<||j|krb|j|||7<q�||j||<n<|j||=t|j|�dkr�|j|=|j|||8<dS)Nr)rrrr)r�rule_idr0r1�enable�countrrr�_register_rule-s


zFirewallDirect._register_rulecCsVy|jj|jj|�j|�Stk
rP}ztj|�ttj	|��WYdd}~XnXdS)N)
r
�rulerA�name�	ExceptionrZdebug2r	rZCOMMAND_FAILED)rr-r2�msgrrr�passthroughAs

zFirewallDirect.passthroughcCsX|r*||jkrg|j|<|j|j|�n*|j|j|�t|j|�dkrT|j|=dS)Nr)rr(rEr)rr-r2rRrrr�_register_passthroughIs

z$FirewallDirect._register_passthroughcCs@|dkr|j�}n|}|jd|t|�|�|dkr<|jd�dS)NT)r�_passthroughrOr")rr-r2r#r$rrrr8Ss
zFirewallDirect.add_passthroughcCs@|dkr|j�}n|}|jd|t|�|�|dkr<|jd�dS)NFT)rr[rOr")rr-r2r#r$rrr�remove_passthrough^s
z!FirewallDirect.remove_passthroughcCs||jkot|�|j|kS)N)r�tuple)rr-r2rrrr*is
z FirewallDirect.query_passthroughcCs>g}x4|jD]*}x$|j|D]}|j|t|�f�qWqW|S)N)rr(rO)rrKr-r2rrrrms
z#FirewallDirect.get_all_passthroughscCs4g}||jkr0x |j|D]}|jt|��qW|S)N)rr(rO)rr-rKr2rrr�get_passthroughsts

zFirewallDirect.get_passthroughscCs�g}x�|D]�}d}x�|D]�}y|j|�}Wntk
r>YqXt|�|krd||dkrd}||djd�}x.|D]&}	|dd�}
|	|
|d<|j|
�qxWqW|s
|j|�q
W|S)z5Split values combined with commas for options in optsF�,�TN)�index�
ValueErrorr�splitr()rr+ZoptsZ	out_rulesrUZ	processed�opt�i�items�itemrMrrr�split_value{s$


zFirewallDirect.split_valuec
Cs*|j||�|jjr2|dkr2|jjj||||�|}|jj|�}	|jjrd|	j|||�rdd|}n:|jjr�|dd�dkr�|	j|||dd��r�|dd�}|||f}
||f}|r�|
|jkr�||j|
kr�tt	j
d||||f��nB|
|jk�s||j|
k�rtt	jd||||f��|j|
|}d}d	}
|
|jk�r�t
|j|
j��}d	}x@|t|�k�r�|||k�r�||j|
||7}|d7}�qTWt|�g}|j|d
dg�}|j|dd
g�}x<|D]4}|j|	|	j||||t|���|d7}|
d7}
�q�W|j||
|||
�|j|j||
|||
�dS)Nr:r;z	%s_direct�Z_directz"rule '%s' already is in '%s:%s:%s'zrule '%s' is not in '%s:%s:%s'r`rz-sz--sourcez-dz
--destination)r:r;i����i����i����)r?r
r@rC�create_zone_base_by_chainrAZis_chain_builtinrr	r�ALREADY_ENABLED�NOT_ENABLEDr�sortedr>rrOrhr7Z
build_ruler]rT�add_fail)rrRr-r.r/r1r2r$rH�backendr0rQrarSZ	positions�jZ	args_list�_argsrrrrM�sZ




(

zFirewallDirect._rulecCs�|j||�|j|||�||f}|rV||jkr�||j|kr�ttjd|||f��n.||jksn||j|kr�ttjd|||f��|jj|�}|j	||j
|||��|j|||�|j|j|||�dS)Nz chain '%s' already is in '%s:%s'zchain '%s' is not in '%s:%s')
r?rDrr	rrkrlr
rAZ	add_rulesZbuild_chain_rulesrGrn)rrFr-r.r/r$r,rorrrrHs$

zFirewallDirect._chainc
Cs�|j|�t|�}|rD||jkrp||j|krpttjd||f��n,||jks\||j|krpttjd||f��|jj|�}|r�|j	|�|dkr�|j
|�\}}|r�|r�|jjj|||�|}	n
|j
|�}	|j||	�|j|||�|j|j|||�dS)Nzpassthrough '%s', '%s'r:r;)r:r;)r<r]rr	rrkrlr
rAZcheck_passthroughZpassthrough_parse_table_chainrCrjZreverse_passthroughr7rZrn)
rrRr-r2r$Z
tuple_argsror.r/rqrrrr[s0




zFirewallDirect._passthrough)N)N)N)N)N)N)N)N)(�__name__�
__module__�__qualname__rrrrrrr r%r3r4r!r<r?rDrGr5rIr&rJrr7rNr)rPrrTrYrZr8r\r*rr^rhrMrHr[rrrrr&sJ	

'	

	




jN)�__all__Zfirewall.fw_typesrZ
firewall.corerrZfirewall.core.fw_transactionrZfirewall.core.loggerrZfirewallrZfirewall.errorsr	�objectrrrrr�<module>ssite-packages/firewall/core/__pycache__/__init__.cpython-36.opt-1.pyc000064400000000161147511334670021366 0ustar003

]ûf�@sdS)N�rrr�/usr/lib/python3.6/__init__.py�<module>ssite-packages/firewall/core/__pycache__/watcher.cpython-36.pyc000064400000005256147511334670020337 0ustar003

]ûf��@s*dgZddlmZmZGdd�de�ZdS)�Watcher�)�Gio�GLibc@sdeZdZdd�Zdd�Zdd�Zdd�Zd	d
�Zdd�Zd
d�Z	dd�Z
dd�Zdd�Zdd�Z
dS)rcCs"||_||_i|_i|_g|_dS)N)�	_callback�_timeout�	_monitors�	_timeouts�_blocked)�self�callbackZtimeout�r�/usr/lib/python3.6/watcher.py�__init__s
zWatcher.__init__cCs:tjj|�}|jtjjd�|j|<|j|jd|j�dS)N�changed)	r�File�new_for_pathZmonitor_directory�FileMonitorFlags�NONEr�connect�_file_changed_cb)r
Z	directory�gfilerrr
�
add_watch_dir"szWatcher.add_watch_dircCs:tjj|�}|jtjjd�|j|<|j|jd|j�dS)Nr)	rrrZmonitor_filerrrrr)r
�filenamerrrr
�add_watch_file(szWatcher.add_watch_filecCs
|jj�S)N)r�keys)r
rrr
�get_watches.szWatcher.get_watchescCs
||jkS)N)r)r
rrrr
�	has_watch1szWatcher.has_watchcCs|j|=dS)N)r)r
rrrr
�remove_watch4szWatcher.remove_watchcCs||jkr|jj|�dS)N)r	�append)r
rrrr
�block_source7s
zWatcher.block_sourcecCs||jkr|jj|�dS)N)r	�remove)r
rrrr
�unblock_source;s
zWatcher.unblock_sourcecCs4x.t|jj��D]}tj|j|�|j|=qWdS)N)�listrrr�
source_remove)r
rrrr
�clear_timeouts?szWatcher.clear_timeoutscCs ||jkr|j|�|j|=dS)N)r	rr)r
rrrr
�_call_callbackDs

zWatcher._call_callbackcCs�|j�}||jkr8||jkr4tj|j|�|j|=dS|tjjksh|tjjksh|tjj	ksh|tjj
kr�||jkr�tj|j|�|j|=tj|j|j
|�|j|<dS)N)Zget_parse_namer	rrr#rZFileMonitorEventZCHANGEDZCREATEDZDELETEDZATTRIBUTE_CHANGEDZtimeout_add_secondsrr%)r
ZmonitorZgio_fileZgio_other_fileZeventrrrr
rIs


zWatcher._file_changed_cbN)�__name__�
__module__�__qualname__rrrrrrrr!r$r%rrrrr
rsN)�__all__Z
gi.repositoryrr�objectrrrrr
�<module>ssite-packages/firewall/core/__pycache__/__init__.cpython-36.pyc000064400000000161147511334670020427 0ustar003

]ûf�@sdS)N�rrr�/usr/lib/python3.6/__init__.py�<module>ssite-packages/firewall/core/__pycache__/fw_service.cpython-36.opt-1.pyc000064400000003201147511334670021761 0ustar003

]ûfg�@s2dgZddlmZddlmZGdd�de�ZdS)�FirewallService�)�errors)�
FirewallErrorc@sLeZdZdd�Zdd�Zdd�Zdd�Zd	d
�Zdd�Zd
d�Z	dd�Z
dS)rcCs||_i|_dS)N)Z_fw�	_services)�self�fw�r� /usr/lib/python3.6/fw_service.py�__init__szFirewallService.__init__cCsd|j|jfS)Nz%s(%r))�	__class__r)rrrr	�__repr__ szFirewallService.__repr__cCs|jj�dS)N)r�clear)rrrr	�cleanup#szFirewallService.cleanupcCst|jj��S)N)�sortedr�keys)rrrr	�get_services(szFirewallService.get_servicescCs||jkrttj|��dS)N)rrrZINVALID_SERVICE)r�servicerrr	�
check_service+s
zFirewallService.check_servicecCs|j|�|j|S)N)rr)rrrrr	�get_service/s
zFirewallService.get_servicecCs||j|j<dS)N)r�name)r�objrrr	�add_service3szFirewallService.add_servicecCs|j|�|j|=dS)N)rr)rrrrr	�remove_service6s
zFirewallService.remove_serviceN)�__name__�
__module__�__qualname__r
rrrrrrrrrrr	rsN)�__all__ZfirewallrZfirewall.errorsr�objectrrrrr	�<module>ssite-packages/firewall/core/__pycache__/logger.cpython-36.opt-1.pyc000064400000055135147511334670021121 0ustar003

]ûf>y�@s�ddddgZddlZddlZddlZddlZddlZddlZddlZddlZddl	Z
ddl
Z
Gdd�de�ZGdd�de�Z
Gd	d
�d
e
�ZGdd�de�ZGd
d�de�ZGdd�de�Ze�ZdS)�	LogTarget�FileLog�Logger�log�Nc@s2eZdZdZdd�Zddd�Zdd�Zd	d
�ZdS)
rz% Abstract class for logging targets. cCs
d|_dS)N)�fd)�self�r�/usr/lib/python3.6/logger.py�__init__(szLogTarget.__init__rcCstd��dS)Nz%LogTarget.write is an abstract method)�NotImplementedError)r�data�level�logger�is_debugrrr	�write+szLogTarget.writecCstd��dS)Nz%LogTarget.flush is an abstract method)r)rrrr	�flush.szLogTarget.flushcCstd��dS)Nz%LogTarget.close is an abstract method)r)rrrr	�close1szLogTarget.closeN)r)�__name__�
__module__�__qualname__�__doc__r
rrrrrrr	r&s

c@s.eZdZdd�Zddd�Zdd�Zdd	�Zd
S)�
_StdoutLogcCstj|�tj|_dS)N)rr
�sys�stdoutr)rrrr	r
8s
z_StdoutLog.__init__rcCs|jj|�|j�dS)N)rrr)rrr
rrrrr	r<sz_StdoutLog.writecCs|j�dS)N)r)rrrr	rAsz_StdoutLog.closecCs|jj�dS)N)rr)rrrr	rDsz_StdoutLog.flushN)r)rrrr
rrrrrrr	r7s
rc@seZdZdd�ZdS)�
_StderrLogcCstj|�tj|_dS)N)rr
r�stderrr)rrrr	r
Ks
z_StderrLog.__init__N)rrrr
rrrr	rJsrc@s.eZdZdd�Zddd�Zdd�Zdd	�Zd
S)�
_SyslogLogcCs.tj|�tjtjjtjd�tj	tj
�dS)Nr)rr
�syslogZopenlog�os�path�basenamer�argvZLOG_PIDZ
LOG_DAEMON)rrrr	r
Ss
	z_SyslogLog.__init__rcCs�d}|rtj}nF||jkr"tj}n4||jkr4tj}n"||jkrFtj}n||jkrVtj	}|j
d�rt|dt|�d�}t|�dkr�|dkr�tj|�ntj||�dS)N�
�r)rZ	LOG_DEBUG�INFO1ZLOG_INFO�WARNINGZLOG_WARNING�ERRORZLOG_ERR�FATALZLOG_CRIT�endswith�len)rrr
rrZpriorityrrr	ras"




z_SyslogLog.writecCstj�dS)N)rZcloselog)rrrr	rwsz_SyslogLog.closecCsdS)Nr)rrrr	rzsz_SyslogLog.flushN)r)rrrr
rrrrrrr	rRs
rc@s<eZdZdZddd�Zdd�Zddd	�Zd
d�Zdd
�ZdS)rz< FileLog class.
    File will be opened on the first write. �wcCstj|�||_||_dS)N)rr
�filename�mode)rr+r,rrr	r
�s
zFileLog.__init__cCsv|jr
dStjtjB}|jjd�r,|tjO}tj|j|d�|_tj	|jd�tj
|j|j�|_tj|jtjtj
�dS)N�ai�)rr�O_CREAT�O_WRONLYr,�
startswith�O_APPEND�openr+�fchmod�fdopen�fcntlZF_SETFDZ
FD_CLOEXEC)r�flagsrrr	r2�s
zFileLog.openrcCs(|js|j�|jj|�|jj�dS)N)rr2rr)rrr
rrrrr	r�sz
FileLog.writecCs|js
dS|jj�d|_dS)N)rr)rrrr	r�s
z
FileLog.closecCs|js
dS|jj�dS)N)rr)rrrr	r�sz
FileLog.flushN)r*)r)	rrrrr
r2rrrrrrr	rs

c@s�eZdZdZd[Zd\Zd]Zd^Zd_ZdZ	e
�Ze�Z
e�Zd`d	d
�Zdd�Zdadd�Zdbdd�Zdcdd�Zdddd�Zdd�Zdd�Zdd�Zdd�Zdd�Zd d!�Zed"fd#d$�Zed"fd%d&�Zed"fd'd(�Zed"fd)d*�Zed"fd+d,�Z ed"fd-d.�Z!d/d0�Z"d1d2�Z#d3d4�Z$d5d6�Z%d7d8�Z&d9d:�Z'd;d<�Z(d=d>�Z)d?d@�Z*dAdB�Z+dCdD�Z,dedEdF�Z-dGdH�Z.dfdIdJ�Z/ed"dfdKdL�Z0ed"dfdMdN�Z1ed"dfdOdP�Z2dgdQdR�Z3dSdT�Z4dUdV�Z5dWdX�Z6dhdYdZ�Z7d"S)iraL	
    Format string:

    %(class)s      Calling class the function belongs to, else empty
    %(date)s       Date using Logger.date_format, see time module
    %(domain)s     Full Domain: %(module)s.%(class)s.%(function)s
    %(file)s       Filename of the module
    %(function)s   Function name, empty in __main__
    %(label)s      Label according to log function call from Logger.label
    %(level)d      Internal logging level
    %(line)d       Line number in module
    %(module)s     Module name
    %(message)s    Log message

    Standard levels:

    FATAL                 Fatal error messages
    ERROR                 Error messages
    WARNING               Warning messages
    INFOx, x in [1..5]    Information
    DEBUGy, y in [1..10]  Debug messages
    NO_INFO               No info output
    NO_DEBUG              No debug output
    INFO_MAX              Maximum info level
    DEBUG_MAX             Maximum debug level

    x and y depend on info_max and debug_max from Logger class
    initialization. See __init__ function.

    Default logging targets:

    stdout        Logs to stdout
    stderr        Logs to stderr
    syslog        Logs to syslog

    Additional arguments for logging functions (fatal, error, warning, info
    and debug):

    nl       Disable newline at the end with nl=0, default is nl=1.
    fmt      Format string for this logging entry, overloads global format
             string. Example: fmt="%(file)s:%(line)d %(message)s"
    nofmt    Only output message with nofmt=1. The nofmt argument wins over
             the fmt argument.

    Example:

    from logger import log
    log.setInfoLogLevel(log.INFO1)
    log.setDebugLogLevel(log.DEBUG1)
    for i in range(1, log.INFO_MAX+1):
        log.setInfoLogLabel(i, "INFO%d: " % i)
    log.setFormat("%(date)s %(module)s:%(line)d [%(domain)s] %(label)s: "
                  "%(level)d %(message)s")
    log.setDateFormat("%Y-%m-%d %H:%M:%S")

    fl = FileLog("/tmp/log", "a")
    log.addInfoLogging("*", fl)
    log.addDebugLogging("*", fl)
    log.addInfoLogging("*", log.syslog, fmt="%(label)s%(message)s")

    log.debug3("debug3")
    log.debug2("debug2")
    log.debug1("debug1")
    log.info2("info2")
    log.info1("info1")
    log.warning("warning\n", nl=0)
    log.error("error\n", nl=0)
    log.fatal("fatal")
    log.info(log.INFO1, "nofmt info", nofmt=1)
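
The docstring above is the only part of this member that survives as readable text. As a quick illustrative sketch (not part of the dumped module; the log file path and the messages are made-up placeholders), the calls it documents compose like this:

    from logger import log, FileLog

    log.setInfoLogLevel(log.INFO1)
    log.setDebugLogLevel(log.DEBUG1)
    log.setFormat("%(date)s %(module)s:%(line)d [%(domain)s] %(label)s%(message)s")
    log.setDateFormat("%Y-%m-%d %H:%M:%S")

    debug_file = FileLog("/tmp/firewalld-debug.log", "a")   # opened on first write
    log.addDebugLogging("*", debug_file, fmt="%(file)s:%(line)d %(message)s")

    log.info1("reloading configuration")     # rendered with the global format string
    log.warning("backend warning\n", nl=0)   # nl=0: message already ends in a newline
    log.debug1("raw detail", nofmt=1)        # nofmt=1: message only, no format applied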
[end of the Logger docstring; remainder of logger.cpython-36.opt-1.pyc: the compiled Logger implementation (info/debug level and label setters, set/add/del of Info- and Debug-logging targets, the fatal/error/warning/info/debug/exception methods and the internal _log()/_genDict() helpers) plus the module-level instance `log = Logger()`; binary bytecode omitted]
 4)�__all__rr�r�r�r�rr�r5Zos.pathr�objectrrrrrrrrrrr	�<module>s.-(*4site-packages/firewall/core/__pycache__/icmp.cpython-36.pyc000064400000004265147511334670017631 0ustar003
[site-packages/firewall/core/__pycache__/icmp.cpython-36.pyc -- compiled CPython 3.6 bytecode; binary payload omitted. Recoverable content: __all__ = ["ICMP_TYPES", "ICMPV6_TYPES", "check_icmp_type", "check_icmpv6_type"]; ICMP_TYPES and ICMPV6_TYPES map symbolic names to "type/code" strings (for example "echo-reply": "0/0", "echo-request"/"ping": "8/0", "port-unreachable": "3/3"), and check_icmp_name()/check_icmp_type() plus their ICMPv6 counterparts test membership in those tables]
rrrrr	�<module>sxsite-packages/firewall/core/__pycache__/modules.cpython-36.pyc000064400000005500147511334670020342 0ustar003
[site-packages/firewall/core/__pycache__/modules.cpython-36.pyc -- compiled CPython 3.6 bytecode; binary payload omitted. Recoverable content: the "modules backend", a modules class that wraps modprobe/rmmod via runProg() and provides loaded_modules() (parsed from /proc/modules), load_module(), unload_module(), get_deps(), get_firewall_modules() and unload_firewall_modules()]
site-packages/firewall/core/__pycache__/fw_policies.cpython-36.opt-1.pyc000064400000004432147511334670022137 0ustar003
[site-packages/firewall/core/__pycache__/fw_policies.cpython-36.opt-1.pyc -- compiled CPython 3.6 bytecode; binary payload omitted. Recoverable content: FirewallPolicies, which holds a LockdownWhitelist (config.LOCKDOWN_WHITELIST), performs access_check() by context, uid, user or command, and implements enable_lockdown(), disable_lockdown() and query_lockdown()]
rrrrrrrrrrrsN)�__all__ZfirewallrZfirewall.core.loggerrZ#firewall.core.io.lockdown_whitelistrrZfirewall.errorsr�objectrrrrr�<module>ssite-packages/firewall/core/__pycache__/prog.cpython-36.pyc000064400000001266147511334670017646 0ustar003
[site-packages/firewall/core/__pycache__/prog.cpython-36.pyc -- compiled CPython 3.6 bytecode; binary payload omitted. Recoverable content: a single helper, runProg(prog, argv=None, stdin=None), which runs the command with LANG=C via subprocess.Popen, merges stderr into stdout and returns (returncode, output); on OSError it returns (255, "")]
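
A hedged reconstruction sketch of that helper (the bytecode only reveals the calls listed above, so details such as the argument names are assumptions):

    import subprocess

    def run_prog(prog, argv=None, stdin=None):
        args = [prog] + (argv or [])
        input_bytes = None
        if stdin:                          # optional file whose contents feed stdin
            with open(stdin, "rb") as handle:
                input_bytes = handle.read()
        try:
            process = subprocess.Popen(args,
                                       stdin=subprocess.PIPE,
                                       stdout=subprocess.PIPE,
                                       stderr=subprocess.STDOUT,
                                       close_fds=True,
                                       env={"LANG": "C"})
        except OSError:
            return 255, ""
        output, _ = process.communicate(input_bytes)
        return process.returncode, output.decode("utf-8", "replace")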
)NN)r�__all__rrrrr�<module>ssite-packages/firewall/core/__pycache__/fw_config.cpython-36.opt-1.pyc000064400000077161147511334670021606 0ustar003
[site-packages/firewall/core/__pycache__/fw_config.cpython-36.opt-1.pyc -- compiled CPython 3.6 bytecode; binary payload omitted. Recoverable content: FirewallConfig, the permanent-configuration layer. It tracks built-in and user-defined ipsets, icmptypes, services, zones, policies and helpers, and exposes get/add/remove/rename/new_* methods, *_config getters and setters, load_*_defaults() and update_*_from_path() methods that read and write the XML files under the config.ETC_FIREWALLD_* directories]
Zfirewall.core.io.ipsetrrrZfirewall.core.io.helperrrrZfirewall.core.io.policyrrrrZfirewall.errorsr�objectrrrrr�<module>ssite-packages/firewall/core/__pycache__/fw_ipset.cpython-36.opt-1.pyc000064400000016503147511334670021456 0ustar003
[site-packages/firewall/core/__pycache__/fw_ipset.cpython-36.opt-1.pyc -- compiled CPython 3.6 bytecode; binary payload omitted. Recoverable content: the "ipset backend". FirewallIPSet manages runtime ipsets over the nftables or ipset backends with add_ipset()/remove_ipset()/apply_ipsets()/flush(), and handles entries via add_entry(), remove_entry(), query_entry(), get_entries() and set_entries()]
�objectrrrrr�<module>ssite-packages/firewall/core/__pycache__/fw_ifcfg.cpython-36.opt-1.pyc000064400000002613147511334670021405 0ustar003
[site-packages/firewall/core/__pycache__/fw_ifcfg.cpython-36.opt-1.pyc -- compiled CPython 3.6 bytecode; binary payload omitted. Recoverable content: "Functions to search for and change ifcfg files". search_ifcfg_of_interface() scans config.IFCFGDIR for an ifcfg-* file (skipping .bak/.orig/.rpmnew/.rpmorig/.rpmsave and -range copies) whose DEVICE= matches the interface, and ifcfg_set_zone_of_interface() rewrites its ZONE=<zone> entry]
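
A minimal sketch of that behaviour, assuming the conventional ifcfg location (the real module reads the directory from config.IFCFGDIR and matches on the DEVICE= key; this simplified version keys on the file name only):

    import os

    IFCFGDIR = "/etc/sysconfig/network-scripts"   # assumption, see note above

    def set_zone_of_interface(zone, interface):
        path = os.path.join(IFCFGDIR, "ifcfg-%s" % interface)
        if not os.path.exists(path):
            return False
        with open(path) as handle:
            lines = handle.readlines()
        with open(path, "w") as handle:
            for line in lines:
                if not line.startswith("ZONE="):   # drop any old ZONE= entry
                    handle.write(line)
            if zone:
                handle.write("ZONE=%s\n" % zone)   # append the new zone
        return True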
    (DEVICE=<interface>)N�ZZONEzSetting ZONE=%s in '%s')rrrZdebug1r�set�write)Zzonerrrrrr?s)�__doc__�__all__rZos.pathZfirewallrZfirewall.core.loggerrZfirewall.core.io.ifcfgrrrrrrr�<module>ssite-packages/firewall/core/__pycache__/ipset.cpython-36.opt-1.pyc000064400000021161147511334670020756 0ustar003
[site-packages/firewall/core/__pycache__/ipset.cpython-36.opt-1.pyc -- compiled CPython 3.6 bytecode; binary payload omitted. Recoverable content: "The ipset command wrapper". An ipset class shells out to the ipset(8) binary (set_create, set_destroy, set_add, set_delete, set_flush, set_list, set_restore via a temporary restore file, rename, swap, version); the module also carries the supported IPSET_TYPES list (hash:ip, hash:ip,port, hash:net, hash:mac, ...), an IPSET_MAXNAMELEN name-length check, and the helpers check_ipset_name(), remove_default_create_options(), normalize_ipset_entry(), check_entry_overlaps_existing() and check_for_overlapping_entries()]
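
As an illustration of the restore-file format that set_restore() appears to emit (reconstructed from the string constants visible in the bytecode; the set name, type and entries below are placeholders):

    def build_restore_payload(set_name, set_type, entries, options=None):
        create = ["create", set_name, set_type] + list(options or []) + ["-exist"]
        lines = [" ".join(create), "flush %s" % set_name]
        lines.extend("add %s %s" % (set_name, entry) for entry in entries)
        return "\n".join(lines) + "\n"

    payload = build_restore_payload("allowlist", "hash:ip",
                                    ["192.0.2.1", "192.0.2.7"])
    # feed `payload` to `ipset restore` on standard input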
P	site-packages/firewall/core/__pycache__/ipXtables.cpython-36.pyc000064400000104137147511334670020633 0ustar003

]ûf���@s(ddlZddlZddlmZddlmZddlmZm	Z	m
Z
mZmZm
Z
mZmZddlmZddlmZmZmZmZmZddlmZmZmZmZmZmZmZddl Z dZ!d	d
dgdd
gdd
d	d
dgdd
d
gd	d
dgd�Z"ddd�Z#ddd�Z$dd�Z%dd�Z&dd�Z'Gdd�de(�Z)Gdd�de)�Z*dS)�N)�runProg)�log)�tempFile�readfile�	splitArgs�	check_mac�portStr�check_single_address�
check_address�normalizeIP6)�config)�
FirewallError�INVALID_PASSTHROUGH�INVALID_RULE�
UNKNOWN_ERROR�INVALID_ADDR)�Rich_Accept�Rich_Reject�	Rich_Drop�	Rich_Mark�Rich_Masquerade�Rich_ForwardPort�Rich_IcmpBlock��INPUT�OUTPUT�FORWARD�
PREROUTING�POSTROUTING)�security�raw�mangle�nat�filterzicmp-host-prohibitedzicmp6-adm-prohibited)�ipv4�ipv6�icmpz	ipv6-icmpcCs�ddddddd�}|dd�}x~|D]v}y|j|�}Wntk
rLw$YnX|d
kr�yt||d	�Wntk
r~YnX|j|d	�||||<q$W|S)z Inverse valid rule z-Dz--deletez-Xz--delete-chain)z-Az--appendz-Iz--insertz-Nz--new-chainN�-I�--insert�)r'r()�index�	Exception�int�pop)�args�replace_args�ret_args�arg�idx�r3�/usr/lib/python3.6/ipXtables.py�common_reverse_rule9s(
r5cCs�ddddddd�}|dd�}x�|D]x}y|j|�}Wntk
rLw$YnX|dkr�yt||d	�Wntk
r~YnX|j|d	�||||<|SWttd
��dS)z Reverse valid passthough rule z-Dz--deletez-Xz--delete-chain)z-Az--appendz-Iz--insertz-Nz--new-chainN�-I�--insertr)zno '-A', '-I' or '-N' arg)r6r7)r*�
ValueErrorr,r-r
r)r.r/r0�xr2r3r3r4�common_reverse_passthrough^s,
r:cCs�t|�}tddddddddd	d
ddd
dddddddg�}t||@�dkrbttdt||@�d��tddddddg�}t||@�dkr�ttd��dS)zZ Check if passthough rule is valid (only add, insert and new chain
    rules are allowed) z-Cz--checkz-Dz--deletez-Rz	--replacez-Lz--listz-Sz--list-rulesz-Fz--flushz-Zz--zeroz-Xz--delete-chainz-Pz--policyz-Ez--rename-chainrzarg '%s' is not allowedz-Az--appendz-Iz--insertz-Nz--new-chainzno '-A', '-I' or '-N' argN)�set�lenr
r�list)r.Znot_allowedZneededr3r3r4�common_check_passthrough�s*

r>c@s�eZdZdZdZdZdd�Zdd�Zdd�Zd	d
�Z	dd�Z
d
d�Zdd�Zdd�Z
dd�Zdd�Zdd�Zdd�Zdd�Zdd�Zdd �Zdhd"d#�Zd$d%�Zd&d'�Zd(d)�Zd*d+�Zdid,d-�Zd.d/�Zdjd1d2�Zd3d4�Zd5d6�Zdkd8d9�Zdld:d;�Z d<d=�Z!d>d?�Z"d@dA�Z#dBdC�Z$dDdE�Z%dFdG�Z&dHdI�Z'dJdK�Z(dLdM�Z)dNdO�Z*dPdQ�Z+dmdRdS�Z,dndTdU�Z-dodVdW�Z.dXdY�Z/dpdZd[�Z0dqd\d]�Z1drd^d_�Z2dsd`da�Z3dbdc�Z4ddde�Z5dfdg�Z6d!S)t�	ip4tablesr$TcCsd||_tj|j|_tjd|j|_|j�|_|j�|_	|j
�g|_i|_i|_
g|_i|_dS)Nz
%s-restore)�_fwrZCOMMANDS�ipv�_command�_restore_command�_detect_wait_option�wait_option�_detect_restore_wait_option�restore_wait_option�fill_exists�available_tables�rich_rule_priority_counts�policy_priority_counts�zone_source_index_cache�
our_chains)�self�fwr3r3r4�__init__�s

zip4tables.__init__cCs$tjj|j�|_tjj|j�|_dS)N)�os�path�existsrBZcommand_existsrCZrestore_command_exists)rNr3r3r4rH�szip4tables.fill_existscCs�|jr(|j|kr(|jgdd�|D�}ndd�|D�}tjd|j|jdj|��t|j|�\}}|dkr�td|jdj|�|f��|S)NcSsg|]}d|�qS)z%sr3)�.0�itemr3r3r4�
<listcomp>�sz#ip4tables.__run.<locals>.<listcomp>cSsg|]}d|�qS)z%sr3)rTrUr3r3r4rV�sz	%s: %s %s� rz'%s %s' failed: %s)rEr�debug2�	__class__rB�joinrr8)rNr.Z_args�status�retr3r3r4Z__run�szip4tables.__runc
Cs<y|j|�}Wntk
r"dSX||||d�<dSdS)NF�T)r*r8)rN�rule�patternZreplacement�ir3r3r4�
_rule_replace�szip4tables._rule_replacecCs|tko|t|kS)N)�BUILT_IN_CHAINS)rNrA�table�chainr3r3r4�is_chain_builtin�szip4tables.is_chain_builtincCs2d|g}|r|jd�n
|jd�|j|�|gS)Nz-tz-Nz-X)�append)rN�addrcrdr^r3r3r4�build_chain_rules�s

zip4tables.build_chain_rulescCs8d|g}|r |d|t|�g7}n|d|g7}||7}|S)Nz-tz-Iz-D)�str)rNrgrcrdr*r.r^r3r3r4�
build_rule�szip4tables.build_rulecCst|�S)N)r5)rNr.r3r3r4�reverse_rule�szip4tables.reverse_rulecCst|�dS)N)r>)rNr.r3r3r4�check_passthrough�szip4tables.check_passthroughcCst|�S)N)r:)rNr.r3r3r4�reverse_passthrough�szip4tables.reverse_passthroughcCs�d}y|jd�}Wntk
r&YnXt|�|dkrD||d}d}xLd
D]D}y|j|�}Wntk
rtYqNXt|�|dkrN||d}qNW||fS)Nr#z-tr]�-A�--append�-I�--insert�-N�--new-chain)rnrorprqrrrs)r*r8r<)rNr.rcr`rd�optr3r3r4�passthrough_parse_table_chain�s$z'ip4tables.passthrough_parse_table_chaincCs4yH|jd�}|j|�|j|�}d|dkr:||df}n||df}WnFtk
r�y|jd�}|j|�d}Wntk
r�dSXYnXd}|ddkr�d}|r�|r�||kr�|j|�nn|�r0|�r�||kr�|j|�|jdd
�d�|j|�}n|jj�rd}nt|�}d|d<|j	dd|d�dS)Nz%%ZONE_SOURCE%%z-m���z%%ZONE_INTERFACE%%Tr�-D�--deleteFcSs|dS)Nrr3)r9r3r3r4�<lambda>&sz4ip4tables._run_replace_zone_source.<locals>.<lambda>)�keyz-Ir)z%dr])ryrz)
r*r-r8�removerf�sortr@�_allow_zone_driftingr<�insert)rNr^rLr`�zoneZzone_source�rule_addr*r3r3r4�_run_replace_zone_source	s>







z"ip4tables._run_replace_zone_sourcecCsy|j|�}Wntk
r$Y�n�Xd}d}d}|j|�|j|�}t|�tkr\ttd��d}	xLdD]D}
y|j|
�}Wntk
r�YqfXt|�|dkrf||d}	qfWxhdD]`}
y|j|
�}Wntk
r�Yq�Xt|�|dk�r�||d}|
dk�rd}|
dkr�d}q�W|	|f}|�sp||k�sP|||k�sP|||dk�rZttd��|||d8<n�||k�r�i||<|||k�r�d|||<d}
xHt	||j
��D]4}||k�r�|�r�P|
|||7}
||k�r�P�q�W|||d7<d
||<|j|dd|
�dS)a
        Change something like
          -t filter -I public_IN %%RICH_RULE_PRIORITY%% 123
        or
          -t filter -A public_IN %%RICH_RULE_PRIORITY%% 321
        into
          -t filter -I public_IN 4
        or
          -t filter -I public_IN
        TFr]z%priority must be followed by a numberr#�-t�--table�-A�--append�-I�--insert�-D�--deleterz*nonexistent or underflow of priority countr)z%dN���)r�r�)r�r�r�r�r�r�)r�r�)r�r�)r*r8r-�typer,r
rr<r�sorted�keysr�)rNr^Zpriority_counts�tokenr`r�r�Zinsert_add_index�priorityrcrt�jrdr*�pr3r3r4�_set_rule_replace_priority2sj








z$ip4tables._set_rule_replace_prioritycCsPt�}i}tj|j�}tj|j�}tj|j�}�x�|D�]�}|dd�}	|j|	dddt|jg�|j|	dt	|jg�y|	j
d�}
Wntk
r�Yn8X|dkr�q6|d$kr�d
dd|g|	|
|
d
�<n
|	j|
�|j
|	|d�|j
|	|d�|j|	|�d}xZd%D]R}y|	j
|�}
Wntk
�r,Yn(Xt|	�|
d
k�r|	j|
�|	j|
�}�qWxhtt|	��D]X}
xPtjD]F}
|
|	|
k�rt|	|
jd��o�|	|
jd��rtd|	|
|	|
<�qtW�qhW|j|g�j|	�q6WxR|D]J}||}|jd|�x"|D]}	|jdj|	�d��qW|jd��q�W|j�tj|j�}tjd|j|j d|j|j!f�g}|j"�rz|j|j"�|jd�t#|j ||jd�\}}tj$�dk�r
t%|j�}|dk	�r
d
}
xH|D]@}tj&d|
|fd
dd �|jd��s�tj&d!d
d"�|
d
7}
�q�Wtj'|j�|dk�r:td#|j dj|�|f��||_||_||_dS)&Nz
%%REJECT%%�REJECTz
--reject-withz%%ICMP%%z%%LOGTYPE%%�off�unicast�	broadcast�	multicastz-m�pkttypez
--pkt-typer]z%%RICH_RULE_PRIORITY%%z%%POLICY_PRIORITY%%r#�-t�--table�"z"%s"z*%s
rW�
zCOMMIT
z	%s: %s %sz%s: %dz-n)�stdinr)z%8d: %sr)�nofmt�nlr)r�z'%s %s' failed: %s)r�r�r�)r�r�)(r�copy�deepcopyrJrKrLra�DEFAULT_REJECT_TYPErA�ICMPr*r8r-r�r�r<�range�stringZ
whitespace�
startswith�endswith�
setdefaultrf�writerZ�closerQ�stat�namerrXrYrC�st_sizerGrZgetDebugLogLevelrZdebug3�unlink)rN�rules�
log_denied�	temp_fileZtable_rulesrJrKrLZ_ruler^r`rcrt�cr�r.r[r\�lines�liner3r3r4�	set_rules�s�









zip4tables.set_rulesc
Cs�|j|dddt|jg�|j|dt|jg�y|jd�}Wntk
rRYn:X|dkr`dS|dkr�ddd
|g|||d�<n
|j|�tj|j	�}tj|j
�}tj|j�}|j||d�|j||d�|j
||�|j|�}||_	||_
||_|S)Nz
%%REJECT%%r�z
--reject-withz%%ICMP%%z%%LOGTYPE%%r�rr�r�r�z-mr�z
--pkt-typer]z%%RICH_RULE_PRIORITY%%z%%POLICY_PRIORITY%%)r�r�r�)rar�rAr�r*r8r-r�r�rJrKrLr�r��_ip4tables__run)rNr^r�r`rJrKrL�outputr3r3r4�set_rule�s.

zip4tables.set_ruleNcCs�g}|r|gntj�}xx|D]p}||jkr6|j|�qy,|jd|ddg�|jj|�|j|�Wqtk
r�tjd|j|f�YqXqW|S)Nz-tz-Lz-nzA%s table '%s' does not exist (or not enough permission to check).)	rbr�rIrfr�r8r�debug1rA)rNrcr\Ztablesr3r3r4�get_available_tabless

zip4tables.get_available_tablescCs`d}t|jdddg�}|ddkr\d}t|jdddg�}|ddkrHd}tjd|j|j|�|S)Nrz-wz-Lz-nrz-w10z%s: %s will be using %s option.)rrBrrXrY)rNrEr\r3r3r4rDszip4tables._detect_wait_optioncCs�t�}|jd�|j�d}xJdD]B}t|j|g|jd�}|ddkr"d|dkr"d	|dkr"|}Pq"Wtjd
|j|j|�t	j
|j�|S)Nz#foor�-w�--wait=2)r�rzinvalid optionr]zunrecognized optionz%s: %s will be using %s option.)r�r�)rr�r�rrCr�rrXrYrQr�)rNr�rEZtest_optionr\r3r3r4rF"s

z%ip4tables._detect_restore_wait_optioncCsVi|_i|_g|_g}x:tj�D].}|j|�s0q xdD]}|jd||g�q6Wq W|S)N�-F�-X�-Zz-t)r�r�r�)rJrKrLrbr�r�rf)rNr�rc�flagr3r3r4�build_flush_rules5s

zip4tables.build_flush_rulescCsfg}|dkrdn|}xLtj�D]@}|j|�s.q|dkr8qx$t|D]}|jd|d||g�qBWqW|S)NZPANIC�DROPr"z-tz-P)rbr�r�rf)rN�policyr��_policyrcrdr3r3r4�build_set_policy_rulesDs
z ip4tables.build_set_policy_rulescCs g}d}y"|jd|jdkrdnddg�}WnJtk
rt}z.|jdkrVtjd|�ntjd|�WYd	d	}~XnX|j�}d
}x�|D]�}|r�|j�j�}|j�}xD|D]<}	|	j	d�r�|	j
d�r�|	d
d�}
n|	}
|
|kr�|j|
�q�W|jdko�|j	d��s|jdkr�|j	d�r�d}q�W|S)zQReturn ICMP types that are supported by the iptables/ip6tables command and kernelrz-pr$r&z	ipv6-icmpz--helpziptables error: %szip6tables error: %sNF�(�)r]zValid ICMP Types:r%zValid ICMPv6 Types:Tr�)r�rAr8rr��
splitlines�strip�lower�splitr�r�rf)rNrAr\r�Zexr�Zin_typesr�Zsplitsr�r9r3r3r4�supported_icmp_typesPs4
 

zip4tables.supported_icmp_typescCsgS)Nr3)rNr3r3r4�build_default_tablesqszip4tables.build_default_tablesr�c	Csi}|jd�rpg|d<t�|jd<xLtdD]@}|djd|�|djd||f�|jdjd|�q,W|jd��r\g|d<t�|jd<x�tdD]�}|djd|�|djd||f�|jdjd|�|dkr�xt|jjr�ddd	d
gndd	d
gD]R}|djd||f�|djd|||f�|jdjtd
||fg���qWq�W|jd��rNg|d<t�|jd<x�tdD]�}|djd|�|djd||f�|jdjd|�|dk�r�xv|jj�r�ddd	d
gndd	d
gD]R}|djd||f�|djd|||f�|jdjtd
||fg���q�W�q�W|jd��r@g|d<t�|jd<x�tdD]�}|djd|�|djd||f�|jdjd|�|d9k�rxxv|jj�r�ddd	d
gndd	d
gD]R}|djd||f�|djd|||f�|jdjtd
||fg���q�W�qxWg|d<t�|jd<|djd�|djd�|djd�|djd�|jdjtd��xf|jj�r�ddd	d
gndd	d
gD]B}|djd|�|djd|�|jdjtd|���q�W|dk�r |djd�|djd�|dk�rF|djd�|djd�|djd�|djd �|djd!�|djd"�|jdjtd#��xJd:D]B}|djd$|�|djd%|�|jdjtd&|���q�Wxzd;D]r}xj|jj�r
dd	gnd	gD]N}|djd)||f�|djd*||f�|jdjtd+||f���qW�q�WxJd<D]B}|djd$|�|djd%|�|jdjtd&|���qnW|dk�r�|djd,�|djd-�|dk�r�|djd.�|djd/�|dd0d1d2d3g7<|jdjtd4��xJd=D]B}|djd5|�|djd6|�|jdjtd7|���q2WxJd>D]B}|djd5|�|djd6|�|jdjtd7|���q~Wg}xJ|D]B}||j�k�r�q�x(||D]}|jd8|gt	|���q�W�q�W|S)?Nrz-N %s_directz-A %s -j %s_directz	%s_directr r�POLICIES_preZZONES_SOURCEZZONES�
POLICIES_postz-N %s_%sz-A %s -j %s_%sz%s_%sr!r"rr#zB-A INPUT -m conntrack --ctstate RELATED,ESTABLISHED,DNAT -j ACCEPTz-A INPUT -i lo -j ACCEPTz-N INPUT_directz-A INPUT -j INPUT_directZINPUT_directz-N INPUT_%sz-A INPUT -j INPUT_%szINPUT_%sr�z^-A INPUT -m conntrack --ctstate INVALID %%LOGTYPE%% -j LOG --log-prefix 'STATE_INVALID_DROP: 'z/-A INPUT -m conntrack --ctstate INVALID -j DROPz9-A INPUT %%LOGTYPE%% -j LOG --log-prefix 'FINAL_REJECT: 'z-A INPUT -j %%REJECT%%zD-A FORWARD -m conntrack --ctstate RELATED,ESTABLISHED,DNAT -j ACCEPTz-A FORWARD -i lo -j ACCEPTz-N FORWARD_directz-A FORWARD -j FORWARD_directZFORWARD_directz
-N FORWARD_%sz-A FORWARD -j FORWARD_%sz
FORWARD_%s�IN�OUTz-N FORWARD_%s_%sz-A FORWARD -j FORWARD_%s_%sz
FORWARD_%s_%sz`-A FORWARD -m conntrack --ctstate INVALID %%LOGTYPE%% -j LOG --log-prefix 'STATE_INVALID_DROP: 'z1-A FORWARD -m conntrack --ctstate INVALID -j DROPz;-A FORWARD %%LOGTYPE%% -j LOG --log-prefix 'FINAL_REJECT: 'z-A FORWARD -j %%REJECT%%z-N OUTPUT_directz>-A OUTPUT -m conntrack --ctstate RELATED,ESTABLISHED -j ACCEPTz-A OUTPUT -o lo -j ACCEPTz-A OUTPUT -j OUTPUT_directZ
OUTPUT_directz-N OUTPUT_%sz-A OUTPUT -j OUTPUT_%sz	OUTPUT_%sz-t)rr)r�)r�r�)r�)r�)r�)
r�r;rMrbrfrgr@r�updater)	rNr�Z
default_rulesrdZdispatch_suffix�	directionZfinal_default_rulesrcr^r3r3r4�build_default_rulesus�
$(
&*
&*&



(






"zip4tables.build_default_rulescCsf|dkrdddhS|dkr,d|j�kr,dhS|dkrHd|j�krHddhS|d	krbd	|j�krbdhSiS)
Nr#r�
FORWARD_IN�FORWARD_OUTr!rr"rr )r�)rNrcr3r3r4�get_zone_table_chains�s
zip4tables.get_zone_table_chainsc	s�|jjj|���jdkrdnd��dkr4�dkr4dnd}	|jjj|�t|	��g}
g}x|D]}|
jd|g�qZWx|D]}|jd	|g�qvWxB|D]:}
|jjj|
�}|dkr�|j	|�r�q�|
j|j
d|
��q�Wx\|D]T}
|jjj|
�}|dk�r|j	|��rq�t|
��r�dk�rq�|j|j
d|
��q�W������fdd�}g}|
�r�x�|
D]F}|�r�x8|D]}|j|||���qdWn|�r�n|j||d���qTWnH|�r�n@|�r�x8|D]}|j|d|���q�Wn|�r�n|j|dd��|S)Nr�pre�postr"rTFz-iz-or$r%z-sr�rz-dcsVddd��}d�|d��fd�jg}|r6|j|�|rD|j|�|jd�g�|S)Nz-Az-D)TFz-tz%s_POLICIES_%sz%%POLICY_PRIORITY%%z-j)r��extend)�ingress_fragment�egress_fragment�add_delr^)r�rd�chain_suffix�enable�p_objrcr3r4�_generate_policy_dispatch_rules


zSip4tables.build_policy_ingress_egress_rules.<locals>._generate_policy_dispatch_rule)r$r%)r$r%)rr�r)r@r�Z
get_policyr��policy_base_chain_name�POLICY_CHAIN_PREFIXrfr�Zcheck_source�is_ipv_supported�_rule_addr_fragmentr)rNr�r�rcrdZingress_interfacesZegress_interfacesZingress_sourcesZegress_sources�isSNATZingress_fragmentsZegress_fragments�	interface�addrrAr�r�r�r�r3)r�rdr�r�r�rcr4�!build_policy_ingress_egress_rules�sR






z+ip4tables.build_policy_ingress_egress_rulesFc
Cs�|dkr|dkrdnd}|jjj||t|d�}	ddddddd�|}
d	}|rb|rbd
d|dg}n,|rtd
d|g}ndd|g}|s�|dg7}|d||
|||	g7}|gS)Nr"rTF)r�z-iz-o)rrrr�r�rz-gz-Iz%s_ZONESz%%ZONE_INTERFACE%%z-Az-Dz-t)r@r�r�r�)
rNr�r�r�r�rcrdrfr�r�rt�actionr^r3r3r4�!build_zone_source_interface_rulesKs&

[binary .pyc bytecode, not representable as text: compiled ip4tables.build_zone_source_interface_rules, ip4tables._rule_addr_fragment and the body of ip4tables.build_zone_source_address_rules. Recoverable fragments include the "%s_ZONES" / "%s_ZONES_SOURCE" dispatch chains with the "%%ZONE_INTERFACE%%" and "%%ZONE_SOURCE%%" placeholders, handling of "ipset:"-prefixed sources via "--match-set" with src/dst flags, MAC sources via "-m mac --mac-source", plain "-s" / "-d" address matches, and the error string "Can't match a destination MAC.".]
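Based on the option strings above, the address fragments produced by this helper would look roughly like the following. The concrete values are placeholders chosen for the example; the helper's real signature and return format are not recoverable from the bytecode.

    # Illustrative match fragments only. The addresses, MAC and set name are
    # placeholder values, not taken from the dump.
    example_source_fragments = {
        "plain address": ["-s", "192.0.2.0/24"],
        "mac source":    ["-m", "mac", "--mac-source", "00:11:22:33:44:55"],
        "ipset source":  ["-m", "set", "--match-set", "allowlist", "src"],
    }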
�}|j|jt|d|d|d
|d|d|g��g}	|	j||d|g�|	j|d
|d|g�|	j|d|d|g�|	j|d|d|g�|	j|d|d|g�|	j|d|d|g�|	j||d|dd
|g�|	j||d|dd|g�|	j||d|dd|g�|	j||d|dd|g�|	j||d|dd|g�|jjj|j	}
|jj
�dk�r|dk�r|
dk�r�|	j||d|ddddd|g	�|
dk�r|	j||d|ddddd|g	�|dk�r,|
dk�r,|	j||d|d|
g�|�s:|	j�|	S)Nz-Nz-X)TFz-Az-Dr"rTF)r�z%s_logz%s_denyz%s_prez%s_postz%s_allowz-tz-jr�r#r��
%%REJECT%%z%%LOGTYPE%%�LOGz--log-prefixz
"%s_REJECT: "r�z"%s_DROP: "�ACCEPT)r�r�)r�r�r�r�)r@r�r�r�rMr�r;rfZ	_policies�target�get_log_denied�reverse)rNr�r�rcrdZ
add_del_chainZadd_del_ruler�r�r�r�r3r3r4�build_policy_chain_rules�sN




z"ip4tables.build_policy_chain_rulescCs2|sgSddd|jg}|jdk	r.|d|jg7}|S)Nz-m�limitz--limitz
--limit-burst)�valueZburst)rNr�sr3r3r4�_rule_limit�s
[binary .pyc bytecode, not representable as text: compiled ip4tables._rule_limit, _rich_rule_chain_suffix, _rich_rule_chain_suffix_from_log, _rich_rule_priority_fragment, _rich_rule_log, _rich_rule_audit, _rich_rule_action, _rich_rule_destination_fragment and _rich_rule_source_fragment. Recoverable fragments include rate limiting via "-m limit --limit ... --limit-burst ...", per-policy "%s_log" / "%s_deny" / "%s_allow" chain suffixes, rich-rule logging via "-j LOG --log-prefix ... --log-level ...", auditing via "-j AUDIT --type accept|reject|drop", actions mapped to ACCEPT, "REJECT --reject-with", DROP and "MARK --set-xmark", the "%%RICH_RULE_PRIORITY%%" ordering placeholder, and ipset / MAC / address source and destination fragments.]
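These helpers translate firewalld rich rules into iptables fragments. For orientation, a representative rich rule of the kind they handle is shown below; the rule itself is constructed for this example and is not taken from the dump, although every element it uses (family, source address, port, log, limit, accept) appears as a keyword in the compiled rich-rule parser later in this archive.

    # Example firewalld rich rule string (illustrative values only).
    example_rich_rule = (
        'rule family="ipv4" '
        'source address="192.0.2.0/24" '
        'port port="22" protocol="tcp" '
        'log prefix="ssh-access " level="notice" limit value="3/m" '
        'accept'
    )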
|rD|
ddt|�g7}
|rT|
d|g7}
|rx|
|j|j�7}
|
|j|j�7}
|s�t	|j
�tkr�|
d	d
ddg7}
g}|r�|j|j
|||||
��|j|j|||||
��|j|j|||||
��n"|j|d
|	d|g|
ddg�|S)Nz-Az-D)TFr#z-pz--dportz%sz-dz-m�	conntrackz	--ctstatez
NEW,UNTRACKEDz%s_allowz-tz-jr�)r@r�r�r�rr�destinationr�sourcer�r�rrfrrr)rNr�r��proto�portrrr�rcr�rr�r3r3r4�build_policy_ports_rules{s*z"ip4tables.build_policy_ports_rulescCs�ddd�|}d}|jjj||t�}d|g}	|r<|	d|g7}	|r`|	|j|j�7}	|	|j|j�7}	|stt|j	�t
kr�|	ddd	d
g7}	g}
|r�|
j|j|||||	��|
j|j
|||||	��|
j|j|||||	��n"|
j|d|d|g|	d
dg�|
S)Nz-Az-D)TFr#z-pz-dz-mrz	--ctstatez
NEW,UNTRACKEDz%s_allowz-tz-jr�)r@r�r�r�rrrrr�r�rrfrrr)rNr�r��protocolrrr�rcr�rr�r3r3r4�build_policy_protocol_rules�s&z%ip4tables.build_policy_protocol_rulescCsddd�|}d}|jjj||t�}	d|g}
|rD|
ddt|�g7}
|rT|
d|g7}
|rx|
|j|j�7}
|
|j|j�7}
|s�t	|j
�tkr�|
d	d
ddg7}
g}|r�|j|j
|||||
��|j|j|||||
��|j|j|||||
��n"|j|d
|	d|g|
ddg�|S)Nz-Az-D)TFr#z-pz--sportz%sz-dz-mrz	--ctstatez
NEW,UNTRACKEDz%s_allowz-tz-jr�)r@r�r�r�rrrrrr�r�rrfrrr)rNr�r�rrrrr�rcr�rr�r3r3r4�build_policy_source_ports_rules�s*z)ip4tables.build_policy_source_ports_rulescCsvd}|jjj||t�}	ddd�|}
|
d|	ddd|g}|rP|dd	t|�g7}|r`|d
|g7}|ddd
|g7}|gS)Nr z-Az-D)TFz%s_allowz-tz-pz--dportz%sz-dz-jZCTz--helper)r@r�r�r�r)rNr�r�rrrZhelper_nameZmodule_short_namercr�r�r^r3r3r4�build_policy_helper_ports_rules�sz)ip4tables.build_policy_helper_ports_rulesc
	Cs�ddd�|}|jjj||t�}g}	|rH|	jdd|d|d|dd	g�n6t|�rTgS|	jdd|d|g|jd
|�dd	g�|	S)Nz-Az-D)TFz-tr#z%s_allowz-oz-jr�z-d)r@r�r�r�rfrr�)
rNr�r�r�rcr�rr�r�r�r3r3r4�build_zone_forward_rules�sz"ip4tables.build_zone_forward_rulesc
Cs,d}|jjj||tdd�}ddd�|}g}|rj|j|�}||j|�7}||j|j�7}||j|j	�7}nd}g}	|	j
dd|d	||fg|d
ddd
dg�g}|r�|j|�}||j|�7}||j|j�7}||j|j	�7}nd}d}|jjj||t�}|	j
dd|d	||fg|ddddd
dg�|	S)Nr"T)r�z-Az-D)TFrz-tz%s_%srz-o�loz-jZ
MASQUERADEr#z-mrz	--ctstatez
NEW,UNTRACKEDr�)r@r�r�r�r	rrrrrrf)
rNr�r�rrcr�r�rr�r�r3r3r4�build_policy_masquerade_rules�s6

z'ip4tables.build_policy_masquerade_rulesc
Cs
d}|jjj||t�}	ddd�|}
d}|rPtd|�rH|dt|�7}n||7}|rn|dkrn|dt|d	�7}g}|r�|j|�}
|j|�}||j	|j
�7}||j|j�7}nd
}
g}|r�|j
|j|||d|��|j
dd|
d|	|
fg|d
|dt|�ddd|g�|S)Nr"z-Az-D)TFrr%z[%s]z:%s�-rz-tz%s_%sz-pz--dportz-jZDNATz--to-destination)r@r�r�r�r	rrr	rrrrrrfr)rNr�r�rr ZtoportZtoaddrrrcr�r�Ztorr�r�r3r3r4�build_policy_forward_port_ruless2


z)ip4tables.build_policy_forward_port_rulescCs�d}|jjj||t�}ddd�|}|jdkrFddg}ddd	|jg}	ndd
g}ddd|jg}	g}
|jjj|�r|d
|}d}nd|}d}g}
|r�|
|j|j�7}
|
|j	|j
�7}
|
||	7}
|�rP|
j|j|||||
��|
j|j
|||||
��|j�r|
j|j|||||
��n:|j|�}|
jd||d||fg|j|�|
ddg�n`|jj�dk�r�|dk�r�|
j||d|g|
ddddd|g�|
j||d|g|
d|g�|
S)Nr#z-Az-D)TFr$z-pr&z-mz--icmp-typez	ipv6-icmpZicmp6z
--icmpv6-typez%s_allowr�z%s_denyz
%%REJECT%%z-tz%s_%sz-jr�z%%LOGTYPE%%r�z--log-prefixz"%s_ICMP_BLOCK: ")r@r�r�r�rAr��query_icmp_block_inversionrrrrrfrrr�rr	rr�)rNr�r�Zictrrcr�r�r�matchr�Zfinal_chainZfinal_targetrr�r3r3r4�build_policy_icmp_block_rules3sJ

 z'ip4tables.build_policy_icmp_block_rulesc	Cs�d}|jjj||t�}g}d}|jjj|�r�d}|jj�dkr�|rRd|t|�g}nd|g}|d|dd	d
ddd
d|g	}|j|�|d7}nd}|r�d|t|�g}nd|g}|d|dd	d|g}|j|�|S)Nr#r�z
%%REJECT%%r�z-Iz-Dz-tz-pz%%ICMP%%z%%LOGTYPE%%z-jr�z--log-prefixz"%s_ICMP_BLOCK: "r]r�)r@r�r�r�r)r�rirf)	rNr�r�rcr�r�Zrule_idxZ
ibi_targetr^r3r3r4�'build_policy_icmp_block_inversion_rulesds.



z1ip4tables.build_policy_icmp_block_inversion_rulescCsxd}g}||j|j�7}||j|j�7}g}|j|j|||||��|j|j|||||��|j|j|||||��|S)Nr#)rrrrrfrrr)rNr�r�rrcrr�r3r3r4�*build_policy_rich_source_destination_rules�sz4ip4tables.build_policy_rich_source_destination_rulescCs
||jkS)N)rA)rNrAr3r3r4r��szip4tables.is_ipv_supported)N)N)r�)F)F)NN)NN)NN)NN)N)N)N)7�__name__�
__module__�__qualname__rAr�Zpolicies_supportedrPrHr�rarerhrjrkrlrmrur�r�r�r�r�rDrFr�r�r�r�r�r�r�r�r�r�rrr	rrrrrrrrr!r"r#r$r&r(r+r,r-r�r3r3r3r4r?�sh

			)Pa#

!
zN

0"




&
!
1"r?c@s&eZdZdZdZddd�Zdd�ZdS)	�	ip6tablesr%Fc
Cs�g}|jddddddddd	d
g
�|dkrL|jddddddddd	dd
dg�|jdddddddd	dg	�|jdddddddd	dg	�|S)Nz-Irz-tr!z-mZrpfilterz--invertz--validmarkz-jr�r�r�z--log-prefixzrpfilter_DROP: z-pz	ipv6-icmpz$--icmpv6-type=neighbour-solicitationr�z"--icmpv6-type=router-advertisement)rf)rNr�r�r3r3r4�build_rpfilter_rules�s$



zip6tables.build_rpfilter_rulescCs�ddddddddd	g	}d
}|jdj|�g}|jddd
|g�xT|D]L}|jddd|d|ddddg
�|jjdkrF|jddd|d|ddddg
�qFW|jdddddd|g�|jdddddd|g�|S)Nz::0.0.0.0/96z::ffff:0.0.0.0/96z2002:0000::/24z2002:0a00::/24z2002:7f00::/24z2002:ac10::/28z2002:c0a8::/32z2002:a9fe::/32z2002:e000::/19ZRFC3964_IPv4r#z-tz-Nz-Iz-dz-jr�z
--reject-withzaddr-unreachr��allr�z--log-prefixz"RFC3964_IPv4_REJECT: "r�4r)r�r3)rMrgrfr@Z_log_denied)rNZ
daddr_listZ
chain_namer�Zdaddrr3r3r4�build_rfc3964_ipv4_rules�s4



z"ip6tables.build_rfc3964_ipv4_rulesN)F)r.r/r0rAr�r2r5r3r3r3r4r1�s
r1)+Zos.pathrQr�Zfirewall.core.progrZfirewall.core.loggerrZfirewall.functionsrrrrrr	r
rZfirewallrZfirewall.errorsr
rrrrZfirewall.core.richrrrrrrrr�r�rbr�r�r5r:r>�objectr?r1r3r3r3r4�<module>s@($%* xsite-packages/firewall/core/__pycache__/rich.cpython-36.opt-1.pyc000064400000050640147511334670020563 0ustar003

]ûf8��@s�dddddddddd	d
ddd
ddddgZddlmZddlmZddlmZddlmZddlm	Z	Gdd�de
�ZGdd�de
�ZGdd�de
�Z
Gdd�de
�ZGdd�de�ZGdd�de
�ZGdd�de
�ZGdd�de
�ZGd d�de
�ZGd!d	�d	e
�ZGd"d
�d
e
�ZGd#d�de
�ZGd$d�de
�ZGd%d
�d
e
�ZGd&d�de�ZGd'd�de
�Zd(d)d/d1d+�ZGd,d�de
�ZGd-d�de
�Zd.S)2�Rich_Source�Rich_Destination�Rich_Service�	Rich_Port�
Rich_Protocol�Rich_Masquerade�Rich_IcmpBlock�
Rich_IcmpType�Rich_SourcePort�Rich_ForwardPort�Rich_Log�
Rich_Audit�Rich_Accept�Rich_Reject�	Rich_Drop�	Rich_Mark�
Rich_Limit�	Rich_Rule�)�	functions)�check_ipset_name)�REJECT_TYPES)�errors)�
FirewallErrorc@seZdZddd�Zdd�ZdS)rFcCs�||_|jdkrd|_||_|jdks0|jdkr8d|_n|jdk	rN|jj�|_||_|jdkrdd|_||_|jdkr�|jdkr�|jdkr�ttjd��dS)N�zno address, mac and ipset)�addr�mac�upper�ipset�invertrr�INVALID_RULE)�selfrrrr�r!�/usr/lib/python3.6/rich.py�__init__$s


zRich_Source.__init__cCsjd|jrdnd}|jdk	r*|d|jS|jdk	rB|d|jS|jdk	rZ|d|jSttjd��dS)Nz	source%s z NOTrzaddress="%s"zmac="%s"z
ipset="%s"zno address, mac and ipset)rrrrrrr)r �retr!r!r"�__str__5s


zRich_Source.__str__N)F)�__name__�
__module__�__qualname__r#r%r!r!r!r"r#s
c@seZdZddd�Zdd�ZdS)rFcCsV||_|jdkrd|_||_|jdkr,d|_||_|jdkrR|jdkrRttjd��dS)Nrzno address and ipset)rrrrrr)r rrrr!r!r"r#Bs

zRich_Destination.__init__cCsRd|jrdnd}|jdk	r*|d|jS|jdk	rB|d|jSttjd��dS)Nzdestination%s z NOTrzaddress="%s"z
ipset="%s"zno address and ipset)rrrrrr)r r$r!r!r"r%Ns

zRich_Destination.__str__N)F)r&r'r(r#r%r!r!r!r"rAs
c@seZdZdd�Zdd�ZdS)rcCs
||_dS)N)�name)r r)r!r!r"r#YszRich_Service.__init__cCs
d|jS)Nzservice name="%s")r))r r!r!r"r%\szRich_Service.__str__N)r&r'r(r#r%r!r!r!r"rXsc@seZdZdd�Zdd�ZdS)rcCs||_||_dS)N)�port�protocol)r r*r+r!r!r"r#`szRich_Port.__init__cCsd|j|jfS)Nzport port="%s" protocol="%s")r*r+)r r!r!r"r%dszRich_Port.__str__N)r&r'r(r#r%r!r!r!r"r_sc@seZdZdd�ZdS)r	cCsd|j|jfS)Nz#source-port port="%s" protocol="%s")r*r+)r r!r!r"r%hszRich_SourcePort.__str__N)r&r'r(r%r!r!r!r"r	gsc@seZdZdd�Zdd�ZdS)rcCs
||_dS)N)�value)r r,r!r!r"r#mszRich_Protocol.__init__cCs
d|jS)Nzprotocol value="%s")r,)r r!r!r"r%pszRich_Protocol.__str__N)r&r'r(r#r%r!r!r!r"rlsc@seZdZdd�Zdd�ZdS)rcCsdS)Nr!)r r!r!r"r#tszRich_Masquerade.__init__cCsdS)N�
masquerader!)r r!r!r"r%wszRich_Masquerade.__str__N)r&r'r(r#r%r!r!r!r"rssc@seZdZdd�Zdd�ZdS)rcCs
||_dS)N)r))r r)r!r!r"r#{szRich_IcmpBlock.__init__cCs
d|jS)Nzicmp-block name="%s")r))r r!r!r"r%~szRich_IcmpBlock.__str__N)r&r'r(r#r%r!r!r!r"rzsc@seZdZdd�Zdd�ZdS)rcCs
||_dS)N)r))r r)r!r!r"r#�szRich_IcmpType.__init__cCs
d|jS)Nzicmp-type name="%s")r))r r!r!r"r%�szRich_IcmpType.__str__N)r&r'r(r#r%r!r!r!r"r�sc@seZdZdd�Zdd�ZdS)r
cCs<||_||_||_||_|jdkr(d|_|jdkr8d|_dS)Nr)r*r+�to_port�
to_address)r r*r+r.r/r!r!r"r#�s

zRich_ForwardPort.__init__cCs<d|j|j|jdkrd|jnd|jdkr4d|jndfS)Nz(forward-port port="%s" protocol="%s"%s%srz
 to-port="%s"z
 to-addr="%s")r*r+r.r/)r r!r!r"r%�szRich_ForwardPort.__str__N)r&r'r(r#r%r!r!r!r"r
�sc@seZdZddd�Zdd�ZdS)rNcCs||_||_||_dS)N)�prefix�level�limit)r r0r1r2r!r!r"r#�szRich_Log.__init__cCs>d|jrd|jnd|jr$d|jnd|jr6d|jndfS)Nz	log%s%s%sz prefix="%s"rz level="%s"z %s)r0r1r2)r r!r!r"r%�szRich_Log.__str__)NNN)r&r'r(r#r%r!r!r!r"r�s
c@seZdZddd�Zdd�ZdS)rNcCs
||_dS)N)r2)r r2r!r!r"r#�szRich_Audit.__init__cCsd|jrd|jndS)Nzaudit%sz %sr)r2)r r!r!r"r%�szRich_Audit.__str__)N)r&r'r(r#r%r!r!r!r"r�s
c@seZdZddd�Zdd�ZdS)r
NcCs
||_dS)N)r2)r r2r!r!r"r#�szRich_Accept.__init__cCsd|jrd|jndS)Nzaccept%sz %sr)r2)r r!r!r"r%�szRich_Accept.__str__)N)r&r'r(r#r%r!r!r!r"r
�s
c@s&eZdZddd�Zdd�Zdd�ZdS)	rNcCs||_||_dS)N)�typer2)r Z_typer2r!r!r"r#�szRich_Reject.__init__cCs,d|jrd|jnd|jr$d|jndfS)Nz
reject%s%sz
 type="%s"rz %s)r3r2)r r!r!r"r%�szRich_Reject.__str__cCsT|jrP|sttjd��|dkrP|jt|krPdjt|�}ttjd|j|f��dS)Nz9When using reject type you must specify also rule family.�ipv4�ipv6z, z%Wrong reject type %s.
Use one of: %s.)r4r5)r3rrrr�join)r �familyZvalid_typesr!r!r"�check�szRich_Reject.check)NN)r&r'r(r#r%r8r!r!r!r"r�s
c@seZdZdd�ZdS)rcCsd|jrd|jndS)Nzdrop%sz %sr)r2)r r!r!r"r%�szRich_Drop.__str__N)r&r'r(r%r!r!r!r"r�sc@s&eZdZddd�Zdd�Zdd�ZdS)	rNcCs||_||_dS)N)�setr2)r Z_setr2r!r!r"r#�szRich_Mark.__init__cCsd|j|jrd|jndfS)Nz
mark set=%s%sz %sr)r9r2)r r!r!r"r%�szRich_Mark.__str__cCs�|jdk	r|j}nttjd��d|krv|jd�}t|�dkrHttj|��tj|d�shtj|d�r�ttj|��ntj|�s�ttj|��dS)Nzno value set�/�r�)r9rrZINVALID_MARK�split�lenrZcheckUINT32)r �x�splitsr!r!r"r8�s


zRich_Mark.check)N)r&r'r(r#r%r8r!r!r!r"r�s
r<�<�)�s�m�h�dc@s�eZdZddd�Zdd�Zedd��Zejdd��Zed	d
��Zejdd
��Ze	dd
��Z
dd�Ze	dd��Zdd�Z
dd�ZdS)rNcCs||_||_dS)N)r,�burst)r r,rGr!r!r"r#�szRich_Limit.__init__cCs|j�|j�dS)N)�value_parse�burst_parse)r r!r!r"r8�szRich_Limit.checkcCs|jS)N)�_value)r r!r!r"r,�szRich_Limit.valuecCsf|dkrd|_dSy|j|�\}}Wntk
r<|}YnX|�d|��}t|dd�|krb||_dS)Nr:rJ)rJ�_value_parser�getattr)r r,�rate�duration�vr!r!r"r,�s
cCs|jS)N)�_burst)r r!r!r"rGszRich_Limit.burstcCs\|dkrd|_dSy|j|�}Wntk
r8|}Yn
Xt|�}t|dd�|krX||_dS)NrP)rP�_burst_parser�strrL)r rG�br!r!r"rGs
cCs�d}d|kr|jd�}|s(t|�dkr4ttj|��|\}}yt|�}Wnttj|��YnX|dkrv|dd�}|dks�|dkr�ttj|��dt||d
kr�ttjd|f��|dkr�|dkr�ttjd|f��||fS)Nr:r;�second�minute�hour�dayr<rCrDrErFi'rz%s too fastz%s too slow)rTrUrVrW)rCrDrErF)r=r>rr�
INVALID_LIMIT�int�DURATION_TO_MULT)r,r@rMrNr!r!r"rKs&
zRich_Limit._value_parsecCs|j|j�S)N)rKrJ)r r!r!r"rH:szRich_Limit.value_parsec	CsR|dkrdSyt|�}Wnttj|��YnX|dksB|dkrNttj|��|S)Nr<i���)rYrrrX)rGrSr!r!r"rQ=szRich_Limit._burst_parsecCs|j|j�S)N)rQrP)r r!r!r"rIKszRich_Limit.burst_parsecCs,d|j�d�}|jdk	r(|d|j��7}|S)Nz
limit value="�"z burst=)rJrP)r rCr!r!r"r%Ns
zRich_Limit.__str__)N)r&r'r(r#r8�propertyr,�setterrG�staticmethodrKrHrQrIr%r!r!r!r"r�s
c@s>eZdZdZdZddd�Zdd�Zd	d
�Zdd�Zd
d�Z	dS)ri�i�NrcCsV|dk	rt|�|_nd|_||_d|_d|_d|_d|_d|_d|_|rR|j	|�dS)N)
rRr7�priority�source�destination�element�log�audit�action�_import_from_string)r r7�rule_strr_r!r!r"r#XszRich_Rule.__init__cCs�g}x|tj|�D]n}d|krp|jd�}t|�dksF|dsF|drVttjd|��|j|d|dd��q|jd|i�qW|jddi�|S)	z Lexical analysis �=r;rr<zinternal error in _lexer(): %s)�	attr_name�
attr_valuerb�EOL)rZ	splitArgsr=r>rrr�append)r rg�tokens�r�attrr!r!r"�_lexeris
 
zRich_Rule._lexercCs�|sttjd��tj|�}d|_d|_d|_d|_d|_	d|_
d|_d|_|j
|�}|rv|djd�dkrvttjd��i}g}d}�x`||jd�dko�|dgk�s�||jd�}||jd�}||jd�}|�r�|dHk�r�ttjd|���n�|dIk�r�|dk�r|j�rttjd+��n�|dk�r<|j�r<ttjd,��n�|dJk�rf|j	�rfttjd-||j	f��nh|d"k�r�|j
�r�ttjd.��nH|d#k�r�|j�r�ttjd/��n(|dKk�r�|j�r�ttjd0||jf��nttjd1|��t|�dk�r�|t|�d2nd3}	|	d3k�r�|�r`|�r`|d	k�r2ttjd4��n,|dk�rJttjd5��nttjd6||f��n*d|k�r�ttjd7||f��n
|jd��nL|	dk�rD|d	k�r�|dLk�r�ttjd:|��||_n||dk�ryt|�|_Wn&tk
�rttjd;|��YnXn:|�r6|dk�rd<}
nd=||f}
ttj|
��n
|j|��n�|	dk�r�|dMk�rb|||<nV|dNk�rvd>|d
<nBt|jd
�|jd�|jd�|jd
d?��|_|j�|j�|d2}�n|	dk�r,|dOk�r�|||<nN|dPk�r�d>|d
<n:t|jd
�|jd�|jd
d?��|_|j�|j�|d2}�n�|	dk�rd|dk�rTt|�|_	|j�nttjd@���nv|	dk�r�|dk�r�t|�|_	|j�nttjdA���n>|	dk�r�|dQk�r�|||<n0t|jd�|jd��|_	|j�|j�|d2}�n�|	dk�r&|dk�rt|�|_	|j�nttjdB���n�|	dk�r^|dk�rNt|�|_	|j�nttjdC���n||	dk�r�t�|_	|j�|j�|d2}�nN|	d k�r�|dRk�r�|||<n@t|jd�|jd�|jd�|jd��|_	|j�|j�|d2}�n�|	d!k�r@|dSk�r|||<n0t|jd�|jd��|_	|j�|j�|d2}�n�|	d"k�r�|dTk�r^|||<nN|d(k�rt|jd(�n8t |jd�|jd�|jd(��|_
|j�|j�|d2}�n*|	d#k�r�|d(k�r�|jd(�n(t!|jd(��|_|j�|j�|d2}�n�|	d$k�rH|d(k�r|jd(�n(t"|jd(��|_|j�|j�|d2}�n�|	d%k�r�|d(k�rh|jd(�n(t#|jd(��|_|j�|j�|d2}�nF|	d&k�r�|dk�r�|||<nF|d(k�r�|jd(�n0t$|jd�|jd(��|_|j�|j�|d2}n�|	d'k�r`|dk�r|||<nF|d(k�r.|jd(�n0t%|jd�|jd(��|_|j�|j�|d2}nz|	d(k�r�|dUk�r�||dD|��<nVdE|k�r�ttjdF��t&|dE|jdG��|d(<|jdEd�|jdGd�|j�|d2}|d2}q�W|j'�dS)VNz
empty rulerrbrk�rulerirjr_r7�addressrrrr,r*r+�to-port�to-addrr)r0r1r3r9rGzbad attribute '%s'r`ra�service�
icmp-block�	icmp-typer-�forward-port�source-portrcrd�accept�drop�reject�markr2�not�NOTzmore than one 'source' elementz#more than one 'destination' elementzFmore than one element. There cannot be both '%s' and '%s' in one rule.zmore than one 'log' elementzmore than one 'audit' elementzOmore than one 'action' element. There cannot be both '%s' and '%s' in one rule.zunknown element %sr<rz0'family' outside of rule. Use 'rule family=...'.z4'priority' outside of rule. Use 'rule priority=...'.z:'%s' outside of any element. Use 'rule <element> %s= ...'.z,'%s' outside of rule. Use 'rule ... %s ...'.r4r5zH'family' attribute cannot have '%s' value. Use 'ipv4' or 'ipv6' instead.z(invalid 'priority' attribute value '%s'.zdwrong 'protocol' usage. Use either 'rule protocol value=...' or  'rule [forward-]port protocol=...'.zDattribute '%s' outside of any element. Use 'rule <element> %s= ...'.TFzinvalid 'protocol' elementzinvalid 'service' elementzinvalid 'icmp-block' elementzinvalid 'icmp-type' elementzlimit.zlimit.valuezinvalid 'limit' elementzlimit.burst)r_r7rrrrrr,r*r+rsrtr)r0r1r3r9rG)rqr`rar+rur*rvrwr-rxryrcrdrzr{r|r}r2r~rrk)r+rur*rvrwr-rxry)rzr{r|r})r4r5)rrrrr)r~r)rrrr)r~r)r*r+)r*r+rsrt)r*r+)r0r1)r,rG)(rrrrZstripNonPrintableCharactersr_r7r`rarbrcrdrerp�getr>rlrY�
ValueError�INVALID_PRIORITYr�pop�clearrrrrrrrr
r	rrr
rrrrr8)r rgrmZattrsZin_elements�indexrbrirjZ
in_elementZerr_msgr!r!r"rfzs�

""













*




"






















(






 




















zRich_Rule._import_from_stringc	Cs`|jdk	r"|jd kr"ttj|j��|jdkrn|jdk	rB|jjdk	sL|jdk	rVttj��t|j	�t
krnttj��|j|jks�|j|j
kr�ttjd|j|j
f��|j	dko�|jdks�|jdk	o�|jdk�r
|jdkr�ttjd��|jdko�|jdko�|jdk�r
ttjd��t|j	�tt
tgk�rP|jdk�rP|jdk�rP|jdk�rPttjd��|jdk	�rj|jjdk	�r�|jdk�r�ttj��|jjdk	�r�ttjd��|jjdk	�r�ttjd	��tj|j|jj��sjttjt|jj���n�|jjdk	�r,|jjdk	�rttjd
��tj|jj��sjttjt|jj���n>|jjdk	�r^t|jj��sjttjt|jj���nttjd��|jdk	�r|jjdk	�r�|jdk�r�ttj��|jjdk	�r�ttjd	��tj|j|jj��sttjt|jj���n>|jjdk	�rt|jj��sttjt|jj���nttjd��t|j	�t k�rd|j	j!dk�sLt"|j	j!�d
k�r`ttj#t|j	j!����n�t|j	�t$k�r�tj%|j	j&��s�ttj'|j	j&��|j	j(d!k�r`ttj)|j	j(���n�t|j	�t*k�r�tj+|j	j,��s`ttj)|j	j,���nvt|j	�tk�r<|jdk	�rttjd��|jdk	�r`|jjdk	�r`ttjd���n$t|j	�tk�r�|j	j!dk�slt"|j	j!�d
k�r�ttj-t|j	j!���|j�r`ttjd���n�t|j	�t.k�r�|j	j!dk�s�t"|j	j!�d
k�r`ttj-t|j	j!����n�t|j	�t
k�r�tj%|j	j&��sttj'|j	j&��|j	j(d"k�r.ttj)|j	j(��|j	j/dk�rZ|j	j0dk�rZttj'|j	j/��|j	j/dk�r�tj%|j	j/��r�ttj'|j	j/��|j	j0dk�r�tj1|j|j	j0��r�ttj|j	j0��|jdk�r�ttj��|jdk	�r`ttjd��nrt|j	�t2k�r>tj%|j	j&��sttj'|j	j&��|j	j(d#k�r`ttj)|j	j(��n"|j	dk	�r`ttjdt|j	���|jdk	�r�|jj3�r�|jj3d$k�r�ttj4|jj3��|jj5dk	�r�|jj5j6�|jdk	�r�t|j�t7t8t9gk�r�ttj:t|j���|jj5dk	�r�|jj5j6�|jdk	�r\t|j�t8k�r(|jj6|j�nt|j�t;k�rB|jj6�|jj5dk	�r\|jj5j6�dS)%Nr4r5z/'priority' attribute must be between %d and %d.rzno element, no actionz%no element, no source, no destinationzno action, no log, no auditzaddress and maczaddress and ipsetz
mac and ipsetzinvalid sourcezinvalid destinationr<�tcp�udp�sctp�dccpzmasquerade and actionzmasquerade and mac sourcezicmp-block and actionrzforward-port and actionzUnknown element %s�emerg�alert�crit�error�warning�notice�info�debug)r4r5)r�r�r�r�)r�r�r�r�)r�r�r�r�)r�r�r�r�r�r�r�r�)<r7rrZINVALID_FAMILYr`rraZMISSING_FAMILYr3rbr
r_�priority_min�priority_maxr�rcrerrrrdrrrZ
check_addressZINVALID_ADDRrRZ	check_macZINVALID_MACrZ
INVALID_IPSETZINVALID_DESTINATIONrr)r>ZINVALID_SERVICErZ
check_portr*ZINVALID_PORTr+ZINVALID_PROTOCOLrZ
checkProtocolr,ZINVALID_ICMPTYPErr.r/Zcheck_single_addressr	r1ZINVALID_LOG_LEVELr2r8r
rrZINVALID_AUDIT_TYPEr)r r!r!r"r8hs�




 
 



   


zRich_Rule.checkcCs�d}|jr|d|j7}|jr,|d|j7}|jr@|d|j7}|jrT|d|j7}|jrh|d|j7}|jr||d|j7}|jr�|d|j7}|jr�|d|j7}tj	r�tj
|�S|S)Nrqz priority="%d"z family="%s"z %s)r_r7r`rarbrcrdrerZPY2Zu2b)r r$r!r!r"r%s$zRich_Rule.__str__i���)NNr)
r&r'r(r�r�r#rprfr8r%r!r!r!r"rTs
o-Nii�i�Q)�__all__ZfirewallrZfirewall.core.ipsetrZfirewall.core.baserrZfirewall.errorsr�objectrrrrr	rrrrr
rrr
rrrrZrrr!r!r!r"�<module>s@
dsite-packages/firewall/core/__pycache__/fw_policy.cpython-36.pyc000064400000154111147511334670020670 0ustar003

]ûf=V�@s�ddlZddlZddlmZddlmZmZmZmZm	Z	m
Z
mZmZm
Z
mZddlmZmZmZmZmZmZmZmZmZmZmZddlmZddlmZddlm Z ddl!m"Z"dd	l#m$Z$Gd
d�de%�Z&dS)�N)�log)
�portStr�checkIPnMask�
checkIP6nMask�
checkProtocol�enable_ip_forwarding�check_single_address�portInPortRange�get_nf_conntrack_short_name�coalescePortRange�breakPortRange)�	Rich_Rule�Rich_Accept�Rich_Service�	Rich_Port�
Rich_Protocol�Rich_Masquerade�Rich_ForwardPort�Rich_SourcePort�Rich_IcmpBlock�
Rich_IcmpType�	Rich_Mark)�FirewallTransaction)�errors)�
FirewallError)�LastUpdatedOrderedDict)�SOURCE_IPSET_TYPESc@s�eZdZdd�Zdd�Zdd�Zdd�Zd	d
�Zdd�Zd
d�Z	dd�Z
dd�Zdd�Zdd�Z
�ddd�Zdd�Zdd�Zdd�Z�dd d!�Z�d
d"d#�Z�dd$d%�Zd&d'�Zd(d)�Zd*d+�Zd,d-�Z�dd0d1�Zd2d3�Z�dd4d5�Zd6d7�Zd8d9�Zd:d;�Zd<d=�Zd>d?�Z �dd@dA�Z!dBdC�Z"�ddDdE�Z#dFdG�Z$dHdI�Z%dJdK�Z&dLdM�Z'dNdO�Z(dPdQ�Z)dRdS�Z*�ddTdU�Z+dVdW�Z,�ddXdY�Z-dZd[�Z.d\d]�Z/d^d_�Z0d`da�Z1dbdc�Z2�dddde�Z3dfdg�Z4�ddhdi�Z5djdk�Z6dldm�Z7dndo�Z8dpdq�Z9drds�Z:dtdu�Z;dvdw�Z<�ddxdy�Z=dzd{�Z>�dd|d}�Z?d~d�Z@d�d��ZAd�d��ZBd�d��ZCd�d��ZD�dd�d��ZEd�d��ZF�dd�d��ZGd�d��ZHd�d��ZId�d��ZJd�d��ZK�dd�d��ZLd�d��ZM�dd�d��ZNd�d��ZOd�d��ZPd�d��ZQd�d��ZR�dd�d��ZSd�d��ZT�dd�d��ZUd�d��ZVd�d��ZW�dd�d��ZX�d d�d��ZY�d!d�d��ZZd�d��Z[�d"d�d��Z\d�d��Z]�d#d�d��Z^d�d��Z_d�d��Z`d�d��Za�d$d�dÄZbd�dńZc�d%d�dDŽZdd�dɄZed�d˄Zfd�d̈́Zgd�dτZh�d&d�dфZid�dӄZjd�dՄZk�d'd�dׄZld�dلZmd�dۄZnd�d݄Zod�d߄Zpd�d�Zqd�d�Zrd�d�Zsd�d�Ztd�d�Zu�d(d�d�Zv�d)d�d�Zwd�d�Zxd�d�Zyd�d�Zzd�d��Z{�d*d�d��Z|d�d��Z}d�d��Z~d�d��Zd�d��Z��d�d�Z��d�d�Z��d�d�Z��d�d�Z��d+�d	�d
�Z�dS(,�FirewallPolicycCs||_i|_i|_dS)N)�_fw�_chains�	_policies)�self�fw�r#�/usr/lib/python3.6/fw_policy.py�__init__szFirewallPolicy.__init__cCsd|j|j|jfS)Nz
%s(%r, %r))�	__class__rr )r!r#r#r$�__repr__szFirewallPolicy.__repr__cCs|jj�|jj�dS)N)r�clearr )r!r#r#r$�cleanups
zFirewallPolicy.cleanupcCs
t|j�S)N)rr)r!r#r#r$�new_transaction$szFirewallPolicy.new_transactioncCst|jj��S)N)�sortedr �keys)r!r#r#r$�get_policies)szFirewallPolicy.get_policiescCs8g}x*|j�D]}|j|�}|js|j|�qWt|�S)N)r-�
get_policy�derived_from_zone�appendr+)r!Zpolicies�p�p_objr#r#r$�"get_policies_not_derived_from_zone,s
z1FirewallPolicy.get_policies_not_derived_from_zonecCs~g}xt|j�D]h}|j|�}t|d�t|jjj��tddg�B@rt|d�t|jjj��tddg�B@r|j|�qW|S)N�
ingress_zones�HOST�ANY�egress_zones)r3�get_settings�setr�zoneZget_active_zonesr0)r!Zactive_policies�policy�settingsr#r#r$�)get_active_policies_not_derived_from_zone4s
((z8FirewallPolicy.get_active_policies_not_derived_from_zonecCs|jj|�}|j|S)N)r�check_policyr )r!r;r1r#r#r$r.>szFirewallPolicy.get_policycCs,dd�dD�|_||j|j<|j|j�dS)NcSsi|]}t�|�qSr#)r)�.0�xr#r#r$�
<dictcomp>Csz-FirewallPolicy.add_policy.<locals>.<dictcomp>�services�ports�
masquerade�
forward_ports�source_ports�icmp_blocks�rules�	protocols�icmp_block_inversionr4r7)rBrCrDrErFrGrHrIrJr4r7)r<r �name�copy_permanent_to_runtime)r!�objr#r#r$�
add_policyBs
zFirewallPolicy.add_policycCs0|j|}|jr|j|�|jj�|j|=dS)N)r �applied�unapply_policy_settingsr<r()r!r;rMr#r#r$�
remove_policyNs



zFirewallPolicy.remove_policycCs�|j|}|jrdSx|jD]}|j||dd�qWx|jD]}|j||dd�q<Wx|jD]}|j||�q\Wx|jD]}|j	|f|��qxWx|j
D]}|j||�q�Wxf|jD]\}y|j
|f|��Wq�tk
�r}z$|jtjgkr�tj|�n|�WYdd}~Xq�Xq�Wx|jD]}|j||��qWxj|jD]`}y|j|f|��WnDtk
�r�}z&|jtjgk�r�tj|�n|�WYdd}~XnX�q:Wx|jD]}|j||��q�W|j�r�|j|�dS)NF)�allow_apply)r rOr4�add_ingress_zoner7�add_egress_zonerG�add_icmp_blockrE�add_forward_portrB�add_servicerC�add_portr�coder�ALREADY_ENABLEDr�warningrI�add_protocolrF�add_source_portrH�add_rulerD�add_masquerade)r!r;rM�args�errorr#r#r$rLUsB
z(FirewallPolicy.copy_permanent_to_runtimeNcCsNxH|j�D]<}|j|}|jr q
||j�kr
tjd|�|j||d�q
WdS)NzApplying policy '%s')�use_transaction)r-r r/r=rZdebug1�apply_policy_settings)r!rbr;r2r#r#r$�apply_policies|s
zFirewallPolicy.apply_policiescCs|j|}||_dS)N)r rO)r!r;rOrMr#r#r$�set_policy_applied�s
z!FirewallPolicy.set_policy_appliedcCstj�||d�}|S)N)Zdate�sender�timeout)�time)r!rgrf�retr#r#r$Z__gen_settings�szFirewallPolicy.__gen_settingscCs|j|�jS)N)r.r<)r!r;r#r#r$r8�szFirewallPolicy.get_settingscCsj|jj|�}|j|}|r |js.|r2|jr2dS|r<d|_|dkrN|j�}n|}|r�x8|jsh|j|�n|j|�D]\}}|j|d|||�qrW|j	|�}	|js�|j
|||��xV|	D�]L}
�xD|	|
D�]6}|
dkr�|j||||�q�|
dkr�q�q�|
dk�r|j|||f|��q�|
dk�r0|j
||||�q�|
dk�rV|j|||d|d|�q�|
d	k�rr|j||||�q�|
d
k�r�|j|||d|d|�q�|
dk�r�|j|||�q�|
dk�r�|j||t|d
�|�q�|
dk�r�q�q�|
dk�r�q�q�tjd||
|�q�Wq�W|�sRx<|j�s"|j|�n|j|�D]\}}|j|d|||��q,Wd|_|dk�rf|j|�dS)NTrGrJrErBrCr�rIrFrDrH)�rule_strr4r7z5Policy '%s': Unknown setting '%s:%s', unable to applyF)rr>r rOr*r/�%_get_table_chains_for_policy_dispatch�#_get_table_chains_for_zone_dispatch�gen_chain_rulesr8�_ingress_egress_zones�_icmp_block�
_forward_port�_service�_port�	_protocol�_source_port�_masquerade�_FirewallPolicy__ruler
rr[�execute)r!�enabler;rb�_policyrM�transaction�table�chainr<�keyr`r#r#r$�_policy_settings�sj













zFirewallPolicy._policy_settingscCs|jd||d�dS)NT)rb)r)r!r;rbr#r#r$rc�sz$FirewallPolicy.apply_policy_settingscCs|jd||d�dS)NF)rb)r)r!r;rbr#r#r$rP�sz&FirewallPolicy.unapply_policy_settingscCsr|j|�j�}|j|�|j|�|j|�|j|�|j|�|j|�|j|�|j	|�|j
|�|j|�d�
}|jj
||�S)zH
        :return: exported config updated with runtime settings
        )
rBrCrGrDrE�
rich_rulesrIrFr4r7)r.Zexport_config_dict�
list_services�
list_ports�list_icmp_blocks�query_masquerade�list_forward_ports�
list_rules�list_protocols�list_source_ports�list_ingress_zones�list_egress_zonesrZ'combine_runtime_with_permanent_settings)r!r;Z	permanentZruntimer#r#r$�get_config_with_settings_dict�sz,FirewallPolicy.get_config_with_settings_dictcs�ddlm�d
��fdd�	}��fdd�}�j�jf�j�jf�j�jf�j�j	f�j
�jf||f�j�j
f�j�jf�j�jf�j�jfd�
}�j|�}�jj||�\}}	xt|	D]l}
t|	|
t��rxV|	|
D]8}t|t�r�||
d|f|��q�||
d||�q�Wq�||
d|�q�Wx�|D]�}
t||
t��r�xn||
D]J}t|t��rv||
d|f|�d|d	��n||
d||d|d	��qFWn||
d|d|d	��q(WdS)Nr)r
cs�j|�|d�d|d�dS)N)rkr)rgrf)r^)r;rkrgrf)r
r!r#r$�add_rule_wrapper�szFFirewallPolicy.set_config_with_settings_dict.<locals>.add_rule_wrappercs�j|�|d��dS)N)rk)�remove_rule)r;rk)r
r!r#r$�remove_rule_wrapper�szIFirewallPolicy.set_config_with_settings_dict.<locals>.remove_rule_wrapper)
rBrCrGrDrEr�rIrFr4r7rj)rgrf)rN)�firewall.core.richr
rW�remove_servicerX�remove_portrU�remove_icmp_blockr_�remove_masqueraderV�remove_forward_portr\�remove_protocolr]�remove_source_portrS�remove_ingress_zonerT�remove_egress_zoner�rZget_added_and_removed_settings�
isinstance�list�tuple)r!r;r<rfr�r�Z
setting_to_fnZold_settingsZadd_settingsZremove_settingsr~r`r#)r
r!r$�set_config_with_settings_dict�s:











  z,FirewallPolicy.set_config_with_settings_dictcCs&|sttj��|dkr"|jj|�dS)Nr5r6)r5r6)rr�INVALID_ZONEr�
check_zone)r!r:r#r#r$�check_ingress_zones
z!FirewallPolicy.check_ingress_zonecCs|j|�|S)N)r�)r!r:r#r#r$Z__ingress_zone_id"s
z FirewallPolicy.__ingress_zone_idrTcCs�|jj|�}|jj|�|jj�|j|}|j|�}	|	|jdkrXttj	d||f��d|jdks�d|jdks�|dkr�|jdr�ttj
d��|dkr�d|jdkr�ttj
d��|dkr�|j�}
n|}
|�rJ|jr�|j
d||
�|j||	||�|
j|j||	�|j�s:||j�k�rH|j||
d	�|
j|j|d�n|j
d
||
�n |j||	||�|
j|j||	�|dk�r~|
jd
�dS)Nr4z'%s' already in '%s'r6r5zI'ingress-zones' may only contain one of: many regular zones, ANY, or HOSTr7zF'HOST' can only appear in either ingress or egress zones, but not bothF)rbT)r6r5)rr>�
check_timeout�check_panicr � _FirewallPolicy__ingress_zone_idr<rrrZr�r*rOro�&_FirewallPolicy__register_ingress_zone�add_fail�(_FirewallPolicy__unregister_ingress_zoner=rcrerx)r!r;r:rgrfrbrRrz�_obj�zone_idr{r#r#r$rS&s<




zFirewallPolicy.add_ingress_zonecCs|j||�|jd|<dS)Nr4)�_FirewallPolicy__gen_settingsr<)r!r�r�rgrfr#r#r$Z__register_ingress_zoneSsz&FirewallPolicy.__register_ingress_zonecCs�|jj|�}|jj�|j|}|j|�}||jdkrLttjd||f��|dkr^|j	�}n|}|j
r�t|jd�dkr�|j||�n|j
d||�|j||�|j|j||dd�||j�kr�|j
d||�n|j|j||�|dkr�|jd�|S)Nr4z'%s' not in '%s'rjFT)rr>r�r r�r<rr�NOT_ENABLEDr*rO�lenrPror�r�r�r=�add_postrx)r!r;r:rbrzr�r�r{r#r#r$r�Vs,




z"FirewallPolicy.remove_ingress_zonecCs||jdkr|jd|=dS)Nr4)r<)r!r�r�r#r#r$Z__unregister_ingress_zoneysz(FirewallPolicy.__unregister_ingress_zonecCs|j|�|j|�dkS)Nr4)r�r8)r!r;r:r#r#r$�query_ingress_zone}sz!FirewallPolicy.query_ingress_zonecCst|j|�dj��S)Nr4)r�r8r,)r!r;r#r#r$r��sz!FirewallPolicy.list_ingress_zonescCs&|sttj��|dkr"|jj|�dS)Nr5r6)r5r6)rrr�rr�)r!r:r#r#r$�check_egress_zone�s
z FirewallPolicy.check_egress_zonecCs|j|�|S)N)r�)r!r:r#r#r$Z__egress_zone_id�s
zFirewallPolicy.__egress_zone_idcCs�|jj|�}|jj|�|jj�|j|}|j|�}	|	|jdkrXttj	d||f��d|jdks�d|jdks�|dkr�|jdr�ttj
d��|dkr�d|jdkr�ttj
d��|dkr�|j�}
n|}
|�rJ|jr�|j
d||
�|j||	||�|
j|j||	�|j�s:||j�k�rH|j||
d	�|
j|j|d�n|j
d
||
�n |j||	||�|
j|j||	�|dk�r~|
jd
�dS)Nr7z'%s' already in '%s'r6r5zH'egress-zones' may only contain one of: many regular zones, ANY, or HOSTr4zF'HOST' can only appear in either ingress or egress zones, but not bothF)rbT)r6r5)rr>r�r�r �_FirewallPolicy__egress_zone_idr<rrrZr�r*rOro�%_FirewallPolicy__register_egress_zoner��'_FirewallPolicy__unregister_egress_zoner=rcrerx)r!r;r:rgrfrbrRrzr�r�r{r#r#r$rT�s<




zFirewallPolicy.add_egress_zonecCs|j||�|jd|<dS)Nr7)r�r<)r!r�r�rgrfr#r#r$Z__register_egress_zone�sz%FirewallPolicy.__register_egress_zonecCs�|jj|�}|jj�|j|}|j|�}||jdkrLttjd||f��|dkr^|j	�}n|}|j
r�t|jd�dkr�|j||�n|j
d||�|j||�|j|j||dd�||j�kr�|j
d||�n|j|j||�|dkr�|jd�|S)Nr7z'%s' not in '%s'rjFT)rr>r�r r�r<rrr�r*rOr�rPror�r�r�r=r�rx)r!r;r:rbrzr�r�r{r#r#r$r��s,




z!FirewallPolicy.remove_egress_zonecCs||jdkr|jd|=dS)Nr7)r<)r!r�r�r#r#r$Z__unregister_egress_zone�sz'FirewallPolicy.__unregister_egress_zonecCs|j|�|j|�dkS)Nr7)r�r8)r!r;r:r#r#r$�query_egress_zone�sz FirewallPolicy.query_egress_zonecCst|j|�dj��S)Nr7)r�r8r,)r!r;r#r#r$r��sz FirewallPolicy.list_egress_zonescCs|j�dS)N)Zcheck)r!�ruler#r#r$�
check_rule�szFirewallPolicy.check_rulecCs|j|�t|�S)N)r��str)r!r�r#r#r$Z	__rule_id�s
zFirewallPolicy.__rule_idcCsx|sdS|jr,t|j�rdSt|j�rtdSnHt|d�r@|jr@dSt|d�rt|jrt|j|j�|j|j�|j|j�SdS)N�ipv4�ipv6�mac��ipset)	Zaddrrr�hasattrr�r��_check_ipset_type_for_source�_check_ipset_applied�
_ipset_family)r!�sourcer#r#r$�_rule_source_ipv�s

zFirewallPolicy._rule_source_ipvcCs|j||||�dS)N)�
_rule_prepare)r!ryr;r�r{r#r#r$Z__ruleszFirewallPolicy.__rulecCsL|jj|�}|jj|�|jj�|j|}|j|�}||jdkrh|jrP|jn|}	tt	j
d||	f��|j�s�|jr�t|jt
�r�d|jdkr�tt	jd��d|jdkr�tt	jd��x6|jdD](}
|
dkr�q�|jjj|
�r�tt	jd	��q�W|j�r�t|jt��r�d|jdk�r,|jj�r�tt	jd
��nb|jd�r�|jj�sNtt	jd��x>|jdD]0}
|
dk�rl�qZ|jjj|
��rZtt	jd���qZW|j�r�t|jt��r�x>|jdD]0}
|
dk�rq�|jjj|
��r�tt	jd
���q�W|dk�r�|j�}n|}|j�r|jd|||�|j||||�|j|j||�|dk�rH|jd�|S)NrHz'%s' already in '%s'r5r7z.'masquerade' is invalid for egress zone 'HOST'r4z/'masquerade' is invalid for ingress zone 'HOST'r6zR'masquerade' cannot be used in a policy if an ingress zone has assigned interfaceszAA 'forward-port' with 'to-addr' is invalid for egress zone 'HOST'zC'forward-port' requires 'to-addr' if egress zone is 'ANY' or a zonezS'forward-port' cannot be used in a policy if an egress zone has assigned interfaceszR'mark' action cannot be used in a policy if an egress zone has assigned interfacesT)r6r5)rr>r�r�r �_FirewallPolicy__rule_idr<r/rrrZ�elementr�rr�r:�list_interfacesr�
to_address�INVALID_FORWARD�actionrr*rOrw�_FirewallPolicy__register_ruler�� _FirewallPolicy__unregister_rulerx)r!r;r�rgrfrbrzr��rule_id�_namer:r{r#r#r$r^
s`










zFirewallPolicy.add_rulecCs|j||�|jd|<dS)NrH)r�r<)r!r�r�rgrfr#r#r$Z__register_ruleEszFirewallPolicy.__register_rulec	Cs�|jj|�}|jj�|j|}|j|�}||jdkr\|jrD|jn|}ttj	d||f��|dkrn|j
�}n|}|jr�|jd|||�|j
|j||�|dkr�|jd�|S)NrHz'%s' not in '%s'FT)rr>r�r r�r<r/rrr�r*rOrwr�r�rx)	r!r;r�rbrzr�r�r�r{r#r#r$r�Is"




zFirewallPolicy.remove_rulecCs||jdkr|jd|=dS)NrH)r<)r!r�r�r#r#r$Z__unregister_ruledsz FirewallPolicy.__unregister_rulecCs|j|�|j|�dkS)NrH)r�r8)r!r;r�r#r#r$�
query_rulehszFirewallPolicy.query_rulecCst|j|�dj��S)NrH)r�r8r,)r!r;r#r#r$r�kszFirewallPolicy.list_rulescCs|jj|�dS)N)r�
check_service)r!�servicer#r#r$r�pszFirewallPolicy.check_servicecCs|j|�|S)N)r�)r!r�r#r#r$Z__service_idss
zFirewallPolicy.__service_idcCs�|jj|�}|jj|�|jj�|j|}|j|�}||jdkrh|jrP|jn|}	tt	j
d||	f��|dkrz|j�}
n|}
|jr�|j
d|||
�|j||||�|
j|j||�|dkr�|
jd�|S)NrBz'%s' already in '%s'T)rr>r�r�r �_FirewallPolicy__service_idr<r/rrrZr*rOrr�!_FirewallPolicy__register_servicer��#_FirewallPolicy__unregister_servicerx)r!r;r�rgrfrbrzr��
service_idr�r{r#r#r$rWws&




zFirewallPolicy.add_servicecCs|j||�|jd|<dS)NrB)r�r<)r!r�r�rgrfr#r#r$Z__register_service�sz!FirewallPolicy.__register_servicec	Cs�|jj|�}|jj�|j|}|j|�}||jdkr\|jrD|jn|}ttj	d||f��|dkrn|j
�}n|}|jr�|jd|||�|j
|j||�|dkr�|jd�|S)NrBz'%s' not in '%s'FT)rr>r�r r�r<r/rrr�r*rOrrr�r�rx)	r!r;r�rbrzr�r�r�r{r#r#r$r��s"




zFirewallPolicy.remove_servicecCs||jdkr|jd|=dS)NrB)r<)r!r�r�r#r#r$Z__unregister_service�sz#FirewallPolicy.__unregister_servicecCs|j|�|j|�dkS)NrB)r�r8)r!r;r�r#r#r$�
query_service�szFirewallPolicy.query_servicecCs|j|�dj�S)NrB)r8r,)r!r;r#r#r$r��szFirewallPolicy.list_servicescCsTg}xJ|D]B}y|jjj|�}Wn tk
r@ttj|��YnX|j|�q
W|S)N)r�helper�
get_helperrr�INVALID_HELPERr0)r!�helpers�_helpersr��_helperr#r#r$�get_helpers_for_service_helpers�s
z.FirewallPolicy.get_helpers_for_service_helperscCs�g}x�|D]�}y|jjj|�}Wn tk
r@ttj|��YnXt|j�dkr�t|j	�}y|jjj|�}|j
|�Wq�tk
r�|r�tjd|�w
Yq�Xq
|j
|�q
W|S)NrjzHelper '%s' is not available)
rr�r�rrr�r�rCr
�moduler0rr[)r!�modulesryr�r�r��_module_short_namer�r#r#r$�get_helpers_for_service_modules�s"


z.FirewallPolicy.get_helpers_for_service_modulescCs|jj|�|jj|�dS)N)r�
check_port�check_tcpudp)r!�port�protocolr#r#r$r��szFirewallPolicy.check_portcCs|j||�t|d�|fS)N�-)r�r)r!r�r�r#r#r$Z	__port_id�szFirewallPolicy.__port_idcs�|jj|�}|jj|�|jj�|j|}tt�fdd�|jd��}	x@|	D]8}
t||
d�rN|j	rl|j	n|}t
tjd|�|f��qNWt
|dd�|	D��\}}
|dkr�|j�}n|}|j�rx$|D]}|jd|t|d	��|�q�Wx$|
D]}|jd
|t|d	��|�q�Wx:|D]2}|j|��}
|j||
||�|j|j||
��qWx*|
D]"}|j|��}
|j|j||
��qNW|dk�r�|jd�|S)Ncs|d�kS)Nrjr#)r@)r�r#r$�<lambda>�sz)FirewallPolicy.add_port.<locals>.<lambda>rCrz'%s:%s' already in '%s'cSsg|]\}}|�qSr#r#)r?rsrtr#r#r$�
<listcomp>�sz+FirewallPolicy.add_port.<locals>.<listcomp>Tr�F)rr>r�r�r r��filterr<r	r/rrrZrr*rOrsr�_FirewallPolicy__port_id�_FirewallPolicy__register_portr�� _FirewallPolicy__unregister_portr�rx)r!r;r�r�rgrfrbrzr��existing_port_ids�port_idr��added_ranges�removed_rangesr{�ranger#)r�r$rX�s:









zFirewallPolicy.add_portcCs|j||�|jd|<dS)NrC)r�r<)r!r�r�rgrfr#r#r$Z__register_portszFirewallPolicy.__register_portcs�|jj|�}|jj�|j|}tt�fdd�|jd��}xB|D]}t||d�rBPqBW|jrf|jn|}	t	t
jd|�|	f��t|dd�|D��\}
}|dkr�|j
�}n|}|j�rx$|
D]}
|jd|t|
d	��|�q�Wx$|D]}
|jd
|t|
d	��|�q�Wx:|
D]2}
|j|
��}|j||dd�|j|j||��qWx*|D]"}
|j|
��}|j|j||��qDW|dk�r~|jd�|S)Ncs|d�kS)Nrjr#)r@)r�r#r$r�sz,FirewallPolicy.remove_port.<locals>.<lambda>rCrz'%s:%s' not in '%s'cSsg|]\}}|�qSr#r#)r?rsrtr#r#r$r�#sz.FirewallPolicy.remove_port.<locals>.<listcomp>Tr�F)rr>r�r r�r�r<r	r/rrr�rr*rOrsrr�r�r�r�r�rx)r!r;r�r�rbrzr�r�r�r�r�r�r{r�r#)r�r$r�s:









zFirewallPolicy.remove_portcCs||jdkr|jd|=dS)NrC)r<)r!r�r�r#r#r$Z__unregister_port=sz FirewallPolicy.__unregister_portcCs6x0|j|�dD]\}}t||�r||krdSqWdS)NrCTF)r8r	)r!r;r�r�rsrtr#r#r$�
query_portAszFirewallPolicy.query_portcCst|j|�dj��S)NrC)r�r8r,)r!r;r#r#r$r�HszFirewallPolicy.list_portscCst|�sttj|��dS)N)rrrZINVALID_PROTOCOL)r!r�r#r#r$�check_protocolMszFirewallPolicy.check_protocolcCs|j|�|S)N)r�)r!r�r#r#r$Z
__protocol_idQs
zFirewallPolicy.__protocol_idcCs�|jj|�}|jj|�|jj�|j|}|j|�}||jdkrh|jrP|jn|}	tt	j
d||	f��|dkrz|j�}
n|}
|jr�|j
d|||
�|j||||�|
j|j||�|dkr�|
jd�|S)NrIz'%s' already in '%s'T)rr>r�r�r �_FirewallPolicy__protocol_idr<r/rrrZr*rOrt�"_FirewallPolicy__register_protocolr��$_FirewallPolicy__unregister_protocolrx)r!r;r�rgrfrbrzr��protocol_idr�r{r#r#r$r\Us&




zFirewallPolicy.add_protocolcCs|j||�|jd|<dS)NrI)r�r<)r!r�r�rgrfr#r#r$Z__register_protocolrsz"FirewallPolicy.__register_protocolc	Cs�|jj|�}|jj�|j|}|j|�}||jdkr\|jrD|jn|}ttj	d||f��|dkrn|j
�}n|}|jr�|jd|||�|j
|j||�|dkr�|jd�|S)NrIz'%s' not in '%s'FT)rr>r�r r�r<r/rrr�r*rOrtr�r�rx)	r!r;r�rbrzr�r�r�r{r#r#r$r�vs$





zFirewallPolicy.remove_protocolcCs||jdkr|jd|=dS)NrI)r<)r!r�r�r#r#r$Z__unregister_protocol�sz$FirewallPolicy.__unregister_protocolcCs|j|�|j|�dkS)NrI)r�r8)r!r;r�r#r#r$�query_protocol�szFirewallPolicy.query_protocolcCst|j|�dj��S)NrI)r�r8r,)r!r;r#r#r$r��szFirewallPolicy.list_protocolscCs|j||�t|d�|fS)Nr�)r�r)r!r�r�r#r#r$Z__source_port_id�szFirewallPolicy.__source_port_idcs�|jj|�}|jj|�|jj�|j|}tt�fdd�|jd��}	x@|	D]8}
t||
d�rN|j	rl|j	n|}t
tjd|�|f��qNWt
|dd�|	D��\}}
|dkr�|j�}n|}|j�rx$|D]}|jd|t|d	��|�q�Wx$|
D]}|jd
|t|d	��|�q�Wx:|D]2}|j|��}
|j||
||�|j|j||
��qWx*|
D]"}|j|��}
|j|j||
��qNW|dk�r�|jd�|S)Ncs|d�kS)Nrjr#)r@)r�r#r$r��sz0FirewallPolicy.add_source_port.<locals>.<lambda>rFrz'%s:%s' already in '%s'cSsg|]\}}|�qSr#r#)r?rsrtr#r#r$r��sz2FirewallPolicy.add_source_port.<locals>.<listcomp>Tr�F)rr>r�r�r r�r�r<r	r/rrrZrr*rOrur�_FirewallPolicy__source_port_id�%_FirewallPolicy__register_source_portr��'_FirewallPolicy__unregister_source_portr�rx)r!r;r�r�rgrfrbrzr�r�r�r�r�r�r{r�r#)r�r$r]�s:









zFirewallPolicy.add_source_portcCs|j||�|jd|<dS)NrF)r�r<)r!r�r�rgrfr#r#r$Z__register_source_port�sz%FirewallPolicy.__register_source_portcs�|jj|�}|jj�|j|}tt�fdd�|jd��}xB|D]}t||d�rBPqBW|jrf|jn|}	t	t
jd|�|	f��t|dd�|D��\}
}|dkr�|j
�}n|}|j�rx$|
D]}
|jd|t|
d	��|�q�Wx$|D]}
|jd
|t|
d	��|�q�Wx:|
D]2}
|j|
��}|j||dd�|j|j||��qWx*|D]"}
|j|
��}|j|j||��qDW|dk�r~|jd�|S)Ncs|d�kS)Nrjr#)r@)r�r#r$r��sz3FirewallPolicy.remove_source_port.<locals>.<lambda>rFrz'%s:%s' not in '%s'cSsg|]\}}|�qSr#r#)r?rsrtr#r#r$r��sz5FirewallPolicy.remove_source_port.<locals>.<listcomp>Tr�F)rr>r�r r�r�r<r	r/rrr�rr*rOrurr�r�r�r�r�rx)r!r;r�r�rbrzr�r�r�r�r�r�r{r�r#)r�r$r��s:









z!FirewallPolicy.remove_source_portcCs||jdkr|jd|=dS)NrF)r<)r!r�r�r#r#r$Z__unregister_source_port�sz'FirewallPolicy.__unregister_source_portcCs6x0|j|�dD]\}}t||�r||krdSqWdS)NrFTF)r8r	)r!r;r�r�rsrtr#r#r$�query_source_port�sz FirewallPolicy.query_source_portcCst|j|�dj��S)NrF)r�r8r,)r!r;r#r#r$r�sz FirewallPolicy.list_source_portscCsdS)NTr#)r!r#r#r$Z__masquerade_idszFirewallPolicy.__masquerade_idcCs8|jj|�}|jj|�|jj�|j|}|j�}||jdkrb|jrN|jn|}tt	j
d|��|js�d|jdkr�tt	jd��d|jdkr�tt	jd��x6|jdD](}	|	dkr�q�|jjj
|	�r�tt	jd	��q�W|dkr�|j�}
n|}
|j�r|jd
||
�|j||||�|
j|j||�|dk�r4|
jd
�|S)NrDz"masquerade already enabled in '%s'r5r7z.'masquerade' is invalid for egress zone 'HOST'r4z/'masquerade' is invalid for ingress zone 'HOST'r6zR'masquerade' cannot be used in a policy if an ingress zone has assigned interfacesT)rr>r�r�r �_FirewallPolicy__masquerade_idr<r/rrrZr�r:r�r*rOrv�$_FirewallPolicy__register_masquerader��&_FirewallPolicy__unregister_masqueraderx)r!r;rgrfrbrzr��
masquerade_idr�r:r{r#r#r$r_
s:





zFirewallPolicy.add_masqueradecCs|j||�|jd|<dS)NrD)r�r<)r!r�r�rgrfr#r#r$Z__register_masquerade2sz$FirewallPolicy.__register_masqueradecCs�|jj|�}|jj�|j|}|j�}||jdkrV|jrB|jn|}ttj	d|��|dkrh|j
�}n|}|jr�|jd||�|j
|j||�|dkr�|jd�|S)NrDzmasquerade not enabled in '%s'FT)rr>r�r r�r<r/rrr�r*rOrvr�r�rx)r!r;rbrzr�r�r�r{r#r#r$r�6s"




z FirewallPolicy.remove_masqueradecCs||jdkr|jd|=dS)NrD)r<)r!r�r�r#r#r$Z__unregister_masqueradePsz&FirewallPolicy.__unregister_masqueradecCs|j�|j|�dkS)NrD)r�r8)r!r;r#r#r$r�TszFirewallPolicy.query_masqueradecCs^|jj|�|jj|�|r(|jj|�|rBt||�sBttj|��|rZ|rZttjd��dS)Nz.port-forwarding is missing to-port AND to-addr)rr�r�rrrZINVALID_ADDRr�)r!�ipvr�r��toport�toaddrr#r#r$�check_forward_portYs
z!FirewallPolicy.check_forward_portcCsLtd|�r|jd||||�n|jd||||�t|d�|t|d�t|�fS)Nr�r�r�)rrrr�)r!r�r�r�r�r#r#r$Z__forward_port_idfs


z FirewallPolicy.__forward_port_idc	CsZ|jj|�}	|jj|�|jj�|j|	}
|j||||�}||
jdkrt|
jrV|
jn|	}tt	j
d|||||f��|
js�d|
jdkr�|r�tt	jd��nR|
jdr�|s�tt	jd��x6|
jdD](}
|
dkr�q�|jjj
|
�r�tt	jd��q�W|dk�r|j�}n|}|
j�r"|jd	|	|||||�|j|
|||�|j|j|
|�|dk�rV|jd	�|	S)
NrEz'%s:%s:%s:%s' already in '%s'r5r7zAA 'forward-port' with 'to-addr' is invalid for egress zone 'HOST'zC'forward-port' requires 'to-addr' if egress zone is 'ANY' or a zoner6zS'forward-port' cannot be used in a policy if an egress zone has assigned interfacesT)rr>r�r�r � _FirewallPolicy__forward_port_idr<r/rrrZr�r:r�r�r*rOrq�&_FirewallPolicy__register_forward_portr��(_FirewallPolicy__unregister_forward_portrx)r!r;r�r�r�r�rgrfrbrzr��
forward_idr�r:r{r#r#r$rVnsB






zFirewallPolicy.add_forward_portcCs|j||�|jd|<dS)NrE)r�r<)r!r�rrgrfr#r#r$Z__register_forward_port�sz&FirewallPolicy.__register_forward_portcCs�|jj|�}|jj�|j|}|j||||�}	|	|jdkrh|jrJ|jn|}
ttj	d|||||
f��|dkrz|j
�}n|}|jr�|jd||||||�|j
|j||	�|dkr�|jd�|S)NrEz'%s:%s:%s:%s' not in '%s'FT)rr>r�r rr<r/rrr�r*rOrqr�rrx)r!r;r�r�r�r�rbrzr�rr�r{r#r#r$r��s&



z"FirewallPolicy.remove_forward_portcCs||jdkr|jd|=dS)NrE)r<)r!r�rr#r#r$Z__unregister_forward_port�sz(FirewallPolicy.__unregister_forward_portcCs"|j||||�}||j|�dkS)NrE)rr8)r!r;r�r�r�r�rr#r#r$�query_forward_port�sz!FirewallPolicy.query_forward_portcCst|j|�dj��S)NrE)r�r8r,)r!r;r#r#r$r��sz!FirewallPolicy.list_forward_portscCs|jj|�dS)N)rZcheck_icmptype)r!�icmpr#r#r$�check_icmp_block�szFirewallPolicy.check_icmp_blockcCs|j|�|S)N)r)r!rr#r#r$Z__icmp_block_id�s
zFirewallPolicy.__icmp_block_idcCs�|jj|�}|jj|�|jj�|j|}|j|�}||jdkrh|jrP|jn|}	tt	j
d||	f��|dkrz|j�}
n|}
|jr�|j
d|||
�|j||||�|
j|j||�|dkr�|
jd�|S)NrGz'%s' already in '%s'T)rr>r�r�r �_FirewallPolicy__icmp_block_idr<r/rrrZr*rOrp�$_FirewallPolicy__register_icmp_blockr��&_FirewallPolicy__unregister_icmp_blockrx)r!r;rrgrfrbrzr��icmp_idr�r{r#r#r$rU�s&




zFirewallPolicy.add_icmp_blockcCs|j||�|jd|<dS)NrG)r�r<)r!r�rrgrfr#r#r$Z__register_icmp_block�sz$FirewallPolicy.__register_icmp_blockc	Cs�|jj|�}|jj�|j|}|j|�}||jdkr\|jrD|jn|}ttj	d||f��|dkrn|j
�}n|}|jr�|jd|||�|j
|j||�|dkr�|jd�|S)NrGz'%s' not in '%s'FT)rr>r�r rr<r/rrr�r*rOrpr�r
rx)	r!r;rrbrzr�rr�r{r#r#r$r��s"




z FirewallPolicy.remove_icmp_blockcCs||jdkr|jd|=dS)NrG)r<)r!r�rr#r#r$Z__unregister_icmp_blocksz&FirewallPolicy.__unregister_icmp_blockcCs|j|�|j|�dkS)NrG)rr8)r!r;rr#r#r$�query_icmp_blockszFirewallPolicy.query_icmp_blockcCs|j|�dj�S)NrG)r8r,)r!r;r#r#r$r�szFirewallPolicy.list_icmp_blockscCsdS)NTr#)r!r#r#r$Z__icmp_block_inversion_idsz(FirewallPolicy.__icmp_block_inversion_idc
Cs|jj|�}|jj�|j|}|j�}||jdkrV|jrB|jn|}ttj	d|��|dkrh|j
�}n|}|jr�x&|j|�dD]}	|j
d||	|�q�W|jd||�|j|||�|j|j|||�|j�rx&|j|�dD]}	|j
d||	|�q�W|jd||�|dk�r|jd�|S)NrJz,icmp-block-inversion already enabled in '%s'rGFT)rr>r�r �(_FirewallPolicy__icmp_block_inversion_idr<r/rrrZr*rOr8rp�_icmp_block_inversion�._FirewallPolicy__register_icmp_block_inversionr��*_FirewallPolicy__undo_icmp_block_inversionrx)
r!r;rfrbrzr��icmp_block_inversion_idr�r{r`r#r#r$�add_icmp_block_inversions6





z'FirewallPolicy.add_icmp_block_inversioncCs|jd|�|jd|<dS)NrrJ)r�r<)r!r�rrfr#r#r$Z__register_icmp_block_inversionEsz.FirewallPolicy.__register_icmp_block_inversioncCs�|j�}|jr6x&|j|�dD]}|jd|||�qW||jdkrP|jd|=|jr~x&|j|�dD]}|jd|||�qfW|jd�dS)NrGFrJT)r*rOr8rpr<rx)r!rzr�rr{r`r#r#r$Z__undo_icmp_block_inversionJsz*FirewallPolicy.__undo_icmp_block_inversionc	Cs|jj|�}|jj�|j|}|j�}||jdkrV|jrB|jn|}ttj	d|��|dkrh|j
�}n|}|jr�x&|j|�dD]}|j
d|||�q�W|jd||�|j||�|j|j||d�|j�rx&|j|�dD]}|j
d|||�q�W|jd||�|dk�r|jd�|S)NrJz(icmp-block-inversion not enabled in '%s'rGFT)rr>r�r r
r<r/rrr�r*rOr8rpr�0_FirewallPolicy__unregister_icmp_block_inversionr�rrx)	r!r;rbrzr�rr�r{r`r#r#r$�remove_icmp_block_inversion\s6






z*FirewallPolicy.remove_icmp_block_inversioncCs||jdkr|jd|=dS)NrJ)r<)r!r�rr#r#r$Z!__unregister_icmp_block_inversion�sz0FirewallPolicy.__unregister_icmp_block_inversioncCs|j�|j|�dkS)NrJ)r
r8)r!r;r#r#r$�query_icmp_block_inversion�sz)FirewallPolicy.query_icmp_block_inversionc
Cs�|jjj|�}|jr*|jjj|jd}n|}|rT||jkrt||f|j|krtdSn ||jksp||f|j|krtdSx@|jj�D]2}|jr�||j	�kr�|j
||||�}	|j||	�q�W|j||||fg�|j
|j||||fg�dS)Nr)rr;r.r/r:Z_zone_policiesr�enabled_backends�policies_supportedZget_available_tablesZbuild_policy_chain_rules�	add_rules�_register_chainsr�)
r!r;�creater|r}r{rMZtracking_policy�backendrHr#r#r$rn�s$

zFirewallPolicy.gen_chain_rulescCsbx\|D]T\}}|r,|jj|g�j||f�q|j|j||f�t|j|�dkr|j|=qWdS)Nr)r�
setdefaultr0�remover�)r!r;rZtablesr|r}r#r#r$r�szFirewallPolicy._register_chainscCs$|jjj|�dkrdS|jjj|�S)Nzhash:mac)rr��get_typeZ
get_family)r!rKr#r#r$r��szFirewallPolicy._ipset_familycCs|jjj|�S)N)rr�r)r!rKr#r#r$Z__ipset_type�szFirewallPolicy.__ipset_typecCsdj|g|jjj|��S)N�,)�joinrr�Z
get_dimension)r!rK�flagr#r#r$�_ipset_match_flags�sz!FirewallPolicy._ipset_match_flagscCs|jjj|�S)N)rr�Z
check_applied)r!rKr#r#r$r��sz#FirewallPolicy._check_ipset_appliedcCs*|j|�}|tkr&ttjd||f��dS)Nz.ipset '%s' with type '%s' not usable as source)�_FirewallPolicy__ipset_typerrrZ
INVALID_IPSET)r!rKZ_typer#r#r$r��s
z+FirewallPolicy._check_ipset_type_for_sourcecs�t|j�tkr��jjj|jj�}|dkr2|jjg}xR|jD]H}||krHq:�j|�|j	|�t
j|�}||j_�j|||||d�q:Wg}	|j
r�|j
g}	nH|jr�t|jt�s�t|jt�r�jjj|jj���jr�fdd�dD�}	�j|j�}
|
�r&|j
�r |j
|
k�r&ttjd|
|j
f��n|
g}	|	�s4ddg}	�fdd�|	D�}	|	|_�x2t�fdd�|	D��D�]}t|j�tk�r��jjj|jj�}g}t|j�d	k�r�|j�r�ttjd
��xB|	D].}
|
|jk�r�|j|
��r�|j	|j|
��q�Wn
|j	d��x~|D�]�}t|j�tk�r�j|j |�}|�j!|j"�7}t#t|�dd�d
�}g}x�|D]�}|j$}t%|�}|j&dd�}|j	|�|j
dk�r�|j|j
��r��qTt|j'�dk�r�|j	|�n:x8|j'D].\}}|j(||||||j|�}|j)||��q�W�qTW|j*|�x4|j'D]*\}}|j+||||||�}|j)||��q
Wx.|j,D]$}|j-|||||�}|j)||��q@Wx4|j.D]*\}}|j/||||||�}|j)||��qpW�qW�qft|j�t0k�r�|jj1}|jj2}�j3||�|j+||||d|�}|j)||��qft|j�t4k�r<|jj5}�j6|�|j-|||d|�}|j)||��qft|j�t7k�r�|�rzx&|	D]}
|j|
��rX|j8t9|
��qXW|j:|||�}|j)||��qft|j�t;k�r4|jj1}|jj2}|jj<}|jj=}xD|	D]<}
|j|
��r�j>|
||||�|�r�|�r�|j8t9|
��q�W|j?|||||||�}|j)||��qft|j�t@k�r�|jj1}|jj2}�j3||�|j/||||d|�}|j)||�n�t|j�tk�s�t|j�tk�r>�jjj|jj��|j
�r�j�r�|j
�jk�r�ttjAd|j
|jjf��t|j�tk�r |j�r t|j�tk�r ttjd��|jB||�|�}|j)||�n>|jdk�rf|jC|||�}|j)||�nttjdt|j����qfWdS)N)�included_servicescsg|]}|�jkr|�qSr#)�destination)r?r�)�ictr#r$r��sz0FirewallPolicy._rule_prepare.<locals>.<listcomp>r�r�z;Source address family '%s' conflicts with rule family '%s'.csg|]}�jj|�r|�qSr#)r�is_ipv_enabled)r?r�)r!r#r$r��scsg|]}�jj|��qSr#)r�get_backend_by_ipv)r?r@)r!r#r$r��srz"Destination conflict with service.cSs|jS)N)rK)r@r#r#r$r�sz.FirewallPolicy._rule_prepare.<locals>.<lambda>)r~�	conntrack�natr�rjz3rich rule family '%s' conflicts with icmp type '%s'z'IcmpBlock not usable with accept actionzUnknown element %s)r�r�)D�typer�rrr��get_servicerK�includesr�r0�copy�deepcopyr��familyr�rr�config�get_icmptyper%r�r�rrZINVALID_RULE�ipvsr9r��is_ipv_supportedr�rr�r�r�r�r+r�r
�replacerC�build_policy_helper_ports_rulesrZadd_modules�build_policy_ports_rulesrI�build_policy_protocol_rulesrF�build_policy_source_ports_rulesrr�r�r�r�valuer�rr�r�build_policy_masquerade_rulesrZto_portr�r�build_policy_forward_port_rulesrZINVALID_ICMPTYPE�build_policy_icmp_block_rulesZ*build_policy_rich_source_destination_rules)r!ryr;r�r{r$�svc�includeZ_ruler3Z
source_ipvrZdestinationsr�r%r�r�r�r�r��
nat_moduler��protorHr�r�r�r#)r&r!r$r��s




 









zFirewallPolicy._rule_preparecCsb|jjj|�}|j|j|�}||j|j�7}tt|�dd�d�}|dkrN|g}x@|j	D]6}||krdqV|j
|�|j|�|j|||||d�qVWg}	xndD]f}
|jj
|
�s�q�|jj|
�}t|j�dkr�|
|jkr�|	j||j|
f�q�|df|	kr�|	j|df�q�W�xV|	D�]L\}}x�|D]�}
|
j}t|�}|
jjdd	�}|j|�|
jd
k�rf|j|
j��rf�qt|
j�dk�r�|j|�n:x8|
jD].\}}|j||||||
j|�}|j||��q�W�qWx2|jD](\}}|j|||||�}|j||��q�Wx,|jD]"}|j||||�}|j||��q�Wx2|jD](\}}|j|||||�}|j||��q,W�qWdS)
NcSs|jS)N)rK)r@r#r#r$r��sz)FirewallPolicy._service.<locals>.<lambda>)r~)r$r�r�rr)r*r�rj)r�r�) rr�r,r�r�r�r�r+r9r-r�r0rrr'r(r�r%r�r
r5Z
add_moduler0r4rCr6rKrr7rIr8rFr9)r!ryr;r�r{r$r>r�r?Zbackends_ipvr�rr%r�r�r�r@r�rArHr�r#r#r$rr�sb






zFirewallPolicy._servicecCs<x6|jj�D](}|jsq|j||||�}|j||�qWdS)N)rrrr7r)r!ryr;r�r�r{rrHr#r#r$rs�s
zFirewallPolicy._portcCs:x4|jj�D]&}|jsq|j|||�}|j||�qWdS)N)rrrr8r)r!ryr;r�r{rrHr#r#r$rt�s
zFirewallPolicy._protocolcCs<x6|jj�D](}|jsq|j||||�}|j||�qWdS)N)rrrr9r)r!ryr;r�r�r{rrHr#r#r$ru�s
zFirewallPolicy._source_portcCs8d}|jt|�|jj|�}|j||�}|j||�dS)Nr�)r�rrr(r;r)r!ryr;r{r�rrHr#r#r$rv�s
zFirewallPolicy._masqueradecCsXtd|�rd}nd}|r(|r(|jt|�|jj|�}	|	j||||||�}
|j|	|
�dS)Nr�r�)rr�rrr(r<r)r!ryr;r{r�r�r�r�r�rrHr#r#r$rq�s

zFirewallPolicy._forward_portc
Cs�|jjj|�}xl|jj�D]^}|js&qd}|jrXx&dD]}||jkr6|j|�s6d}Pq6W|r^q|j|||�}	|j||	�qWdS)NFr�r�T)r�r�)	rr1r2rrr%r4r=r)
r!ryr;rr{r&rZskip_backendr�rHr#r#r$rp�s


zFirewallPolicy._icmp_blockcCsh|j|j}|dkrdS|j|�r0|dkr0dSx2|jj�D]$}|jsHq<|j||�}|j||�q<WdS)N�DROP�
%%REJECT%%�REJECTZACCEPT)rBrCrD)r �targetrrrrZ'build_policy_icmp_block_inversion_rulesr)r!ryr;r{rErrHr#r#r$rsz$FirewallPolicy._icmp_block_inversionc	Cs�x|D]}|j|�qWx|D]}|j|�qWd|ks@d|krXt|�dkrXttjd��d|kshd|kr�t|�dkr�ttjd��|s�|r�|r�|r�d|kr�d|kr�ttjd|��|s�|r�|r�|r�d|kr�d|kr�ttjd|��dS)Nr6r5rjzI'ingress-zones' may only contain one of: many regular zones, ANY, or HOSTzH'egress-zones' may only contain one of: many regular zones, ANY, or HOSTzpolicy "%s" has no ingresszpolicy "%s" has no egress)r�r�r�rrr�)	r!r;r4r7�ingress_interfaces�egress_interfaces�ingress_sources�egress_sourcesr:r#r#r$�check_ingress_egress"s$

z#FirewallPolicy.check_ingress_egressc

Cs�|dkr&|dkr�|r�ttjd|��n�|dkrtd|krFttjd|��d|kr^ttjd|��|r�ttjd|��n||d	kr�d|kr�ttjd|��d|kr�ttjd|��nB|d
kr�d|kr�ttjd|��n |dkr�d|kr�ttjd
|��dS)N�
PREROUTING�rawzFpolicy "%s" egress-zones may not include a zone with added interfaces.�POSTROUTINGr5z/policy "%s" ingress-zones may not include HOST.z.policy "%s" egress-zones may not include HOST.zGpolicy "%s" ingress-zones may not include a zone with added interfaces.�FORWARD�INPUTz0policy "%s" egress-zones must include only HOST.�OUTPUTz1policy "%s" ingress-zones must include only HOST.)rrr�)
r!r;r|r}r4r7rFrGrHrIr#r#r$�check_ingress_egress_chain<s,z)FirewallPolicy.check_ingress_egress_chaincCs$|j�}|j|||�|jd�dS)NT)r*rorx)r!ryr;r{r#r#r$�!_ingress_egress_zones_transactionYsz0FirewallPolicy._ingress_egress_zones_transactioncCsL|j|}|jd}|jd}t�}t�}t�}	t�}
xB|D]:}|dkrJq<|t|jjj|��O}|	t|jjj|��O}	q<WxB|D]:}|dkr�q�|t|jjj|��O}|
t|jjj|��O}
q�W|j||||||	|
�xr|jj�D]d}|j	s�q�xV|j
|�D]H\}
}|j||
||||||	|
�	|j|||
||||	|
�}|j
||��q�Wq�WdS)Nr4r7r6r5)r6r5)r6r5)r r<r9rr:r�Zlist_sourcesrJrrrlrQZ!build_policy_ingress_egress_rulesr)r!ryr;r{rMr4r7rFrGrHrIr:rr|r}rHr#r#r$ro^s@






z$FirewallPolicy._ingress_egress_zonescCs6|j|}d|jdkrFd|jdkrFdddg}|jjsB|jd�|Sd|jdkrpdg}|jjsl|jd�|Sd|jdkr�dgSd|jdko�d|jdk�r�ddddg}|jj�s�|jd�|Sd|jdk�r.dddg}|jj�s�|jd�x4|jdD]}|jjj|�d�rP�qW|jd �|Sd|jdk�r�d!d"g}|jj�sZ|jd#�x>|jdD]}|jjj|�d�rfP�qfW|jd$�|jd%�|Sd&g}|jj�s�|jd'�x4|jdD]}|jjj|�d�r�P�q�W|jd(�x>|jdD]}|jjj|�d�r�P�q�W|jd)�|jd*�|SdS)+z:Create a list of (table, chain) needed for policy dispatchr6r4r5r7r�rOr*rK�manglerLrPrNrMZ
interfacesN)r�rO)r*rK)rSrK)rLrK)r�rO)rLrK)r�rP)r�rN)r*rK)r*rM)rSrK)rLrK)r�rN)r*rK)rSrK)rLrK)r*rM)r�rN)r*rM)rLrK)r*rK)rSrK)r�rN)rLrK)r*rM)r*rK)rSrK)r r<r�nftables_enabledr0r:r8)r!r;rM�tcr:r#r#r$rl�sj
















z4FirewallPolicy._get_table_chains_for_policy_dispatchcCsr|j|}d|jdkr4dg}|jjs0|jd�|Sd|jdkrLdddgSd|jdkrbddgStd|�SdS)z8Create a list of (table, chain) needed for zone dispatchr5r7r�rOrLrKr6�
FORWARD_INr*rSr4�FORWARD_OUTrMzInvalid policy: %sN)r�rO)rLrK)r�rV)r*rK)rSrK)r�rW)r*rM)r r<rrTr0r)r!r;rMrUr#r#r$rm�s

z2FirewallPolicy._get_table_chains_for_zone_dispatchFcCs�|jjj|�}|jr|j}n||}d|jdkrl|dkrBd|S|dkrRd|S|jsh|dkrhd|S�nJd|jd	kr�|js�|dkr�d
|S�n"d|jdk�r�|dkr�|jr�d|Sd
|Sn0|dkr�|r�d|Sd|Sn|dk�r�d|Sn�d|jd	k�rh|dk�r*|j�r d|Sd
|Sn<|dk�rL|�rBd|Sd|Sn|dk�r�|j�s�d|SnN|j�s�|dk�r�d
|S|dk�r�|�r�d|Sd|Sn|dk�r�d|Std|||f�S)Nr5r7r�ZIN_rLZPRE_rSr*r4ZOUT_r6ZFWDI_ZFWD_ZPOST_ZFWDO_z.Can't convert policy to chain name: %s, %s, %s)rSr*)rSrL)rSrL)rSrL)rr;r.r/r<r)r!r;r|Z
policy_prefixZisSNATrM�suffixr#r#r$�policy_base_chain_name�sb













z%FirewallPolicy.policy_base_chain_name)N)N)N)N)rNNT)N)rNNT)N)rNN)N)rNN)N)rNN)N)rNN)N)rNN)N)rNN)N)NN)NN)NNrNN)NNN)NN)rNN)N)NN)N)N)N)NN)F)��__name__�
__module__�__qualname__r%r'r)r*r-r3r=r.rNrQrLrdrer�r8rrcrPr�r�r�r�rSr�r�r�r�r�r�r�rTr�r�r�r�r�r�r�r�rwr^r�r�r�r�r�r�r�rWr�r�r�r�r�r�r�r�r�rXr�r�r�r�r�r�r�r\r�r�r�r�r�r�r]r�r�r�r�r�r�r_r�r�r�r�rrrVrr�rrr�rrrUr	r�r
rr�r
rrrrrrrnrr�r#r"r�r�r�rrrsrtrurvrqrprrJrQrRrorlrmrYr#r#r#r$rs$
'	?.,#,#:
'('('
+))@@		(Pr)'rhr.Zfirewall.core.loggerrZfirewall.functionsrrrrrrr	r
rrr�r
rrrrrrrrrrZfirewall.core.fw_transactionrZfirewallrZfirewall.errorsrZfirewall.fw_typesrZfirewall.core.baser�objectrr#r#r#r$�<module>s04site-packages/firewall/core/__pycache__/fw_service.cpython-36.pyc000064400000003201147511334670021022 0ustar003

]ûfg�@s2dgZddlmZddlmZGdd�de�ZdS)�FirewallService�)�errors)�
FirewallErrorc@sLeZdZdd�Zdd�Zdd�Zdd�Zd	d
�Zdd�Zd
d�Z	dd�Z
dS)rcCs||_i|_dS)N)Z_fw�	_services)�self�fw�r� /usr/lib/python3.6/fw_service.py�__init__szFirewallService.__init__cCsd|j|jfS)Nz%s(%r))�	__class__r)rrrr	�__repr__ szFirewallService.__repr__cCs|jj�dS)N)r�clear)rrrr	�cleanup#szFirewallService.cleanupcCst|jj��S)N)�sortedr�keys)rrrr	�get_services(szFirewallService.get_servicescCs||jkrttj|��dS)N)rrrZINVALID_SERVICE)r�servicerrr	�
check_service+s
zFirewallService.check_servicecCs|j|�|j|S)N)rr)rrrrr	�get_service/s
zFirewallService.get_servicecCs||j|j<dS)N)r�name)r�objrrr	�add_service3szFirewallService.add_servicecCs|j|�|j|=dS)N)rr)rrrrr	�remove_service6s
zFirewallService.remove_serviceN)�__name__�
__module__�__qualname__r
rrrrrrrrrrr	rsN)�__all__ZfirewallrZfirewall.errorsr�objectrrrrr	�<module>ssite-packages/firewall/core/__pycache__/ebtables.cpython-36.pyc000064400000016336147511334670020464 0ustar003

]ûf�$�@s&dgZddlZddlmZddlmZddlmZm	Z	m
Z
ddlmZddl
mZddlmZmZddlZd	gd
ddgd
ddgd�ZiZiZiZx�ej�D]tZgee<e�ee<x\eeD]PZeejde�eejdeef�eejde�eejde�q�Wq�WGdd�de�ZdS)�ebtables�N)�runProg)�log)�tempFile�readfile�	splitArgs)�COMMANDS)�	ipXtables)�
FirewallError�INVALID_IPVZBROUTINGZ
PREROUTINGZPOSTROUTINGZOUTPUTZINPUTZFORWARD)ZbrouteZnat�filterz-N %s_directz-I %s 1 -j %s_directz-I %s_direct 1 -j RETURNz	%s_directc@s�eZdZdZdZdZdd�Zdd�Zdd�Zd	d
�Z	dd�Z
d
d�Zdd�Zdd�Z
[unreadable binary: remainder of the compiled CPython 3.6 bytecode for the firewalld ebtables backend (/usr/lib/python3.6/ebtables.py); only marshalled identifiers survive, e.g. set_rules, set_rule, get_available_tables, build_flush_rules, build_set_policy_rules, build_default_rules, is_ipv_supported and the module imports]
OUR_CHAINSr\r3�setr4r&r7�objectrrrrr�<module>s.
site-packages/firewall/core/__pycache__/base.cpython-36.pyc000064400000002172147511334670017606 0ustar003

[unreadable binary: compiled CPython 3.6 bytecode for firewall/core/base.py ("Base firewall settings" constants module); only marshalled constant names and string values survive]
hash:net,portzhash:net,ifacezhash:macN���)	�__doc__ZDEFAULT_ZONE_TARGETZDEFAULT_POLICY_TARGETZDEFAULT_POLICY_PRIORITYZZONE_TARGETSZPOLICY_TARGETSZ	SHORTCUTSZREJECT_TYPESZSOURCE_IPSET_TYPES�rr�/usr/lib/python3.6/base.py�<module>s.site-packages/firewall/core/__pycache__/nftables.cpython-36.pyc000064400000130376147511334670020502 0ustar003

[unreadable binary: compiled CPython 3.6 bytecode for the firewalld nftables backend (firewall/core/nftables.py); only marshalled identifiers survive, e.g. the nftables class with its build_* rule-generation methods, the ipv4/ipv6 ICMP type fragment tables, and imports from firewall.core, firewall.functions, firewall.errors and nftables.nftables]





































site-packages/firewall/core/__pycache__/prog.cpython-36.opt-1.pyc000064400000001266147511334670020605 0ustar003

]ûf��@sddlZdgZddd�ZdS)�N�runProgc
Cs�|dkrg}|g|}d}|r@t|d��}|j�j�}WdQRXddi}y tj|tjtjtjd|d�}Wntk
r|d
SX|j|�\}}	|j	dd	�}|j
|fS)N�rZLANG�CT)�stdin�stderr�stdoutZ	close_fds�env��zutf-8�replace)r	r
)�open�read�encode�
subprocess�Popen�PIPEZSTDOUT�OSErrorZcommunicate�decode�
returncode)
�prog�argvr�argsZinput_stringZhandlerZprocess�outputZ
err_output�r�/usr/lib/python3.6/prog.pyrs$

)NN)r�__all__rrrrr�<module>ssite-packages/firewall/core/__pycache__/fw.cpython-36.opt-1.pyc000064400000064774147511334670020267 0ustar003

[unreadable binary: most of the compiled CPython 3.6 bytecode for the firewalld core Firewall class (firewall/core/fw.py); only marshalled identifiers survive, e.g. the Firewall class with its start/stop/reload, backend-selection and zone-checking methods]
rN}zttj|��WYdd}~XnXd|_dS)Nzpanic mode is not enabledr�F)rDr"r!ZNOT_ENABLEDr�rnr�)r>r�r?r?r@�disable_panic_modeZszFirewall.disable_panic_modecCs|jS)N)rD)r>r?r?r@�query_panic_modeeszFirewall.query_panic_modecCs|jS)N)rL)r>r?r?r@�get_log_deniedjszFirewall.get_log_deniedcCsb|tjkr&ttjd|djtj�f��||j�krR||_|jj	d|�|jj
�nttj|��dS)Nz'%s', choose from '%s'z','rh)rZLOG_DENIED_VALUESr"r!r��joinr�rLr$�set�writeZALREADY_SET)r>r�r?r?r@�set_log_deniedms
zFirewall.set_log_deniedcCs|jS)N)rE)r>r?r?r@r�|szFirewall.get_default_zonecCs�|j|�}||jkr�|j}||_|jjd|�|jj�|jj||�|jj|�}x@t|dj	��D]\}}|drd|jj
d|�qdWnttj
|��dS)Nrcr�r�rN)r~rEr$r�r�r8r�r�r�r�r�r"r!ZZONE_ALREADY_SET)r>r8r�r�Z_old_dz_settingsr�r�r?r?r@�set_default_zones


zFirewall.set_default_zonecCsH|j�}x:|j�D].\}}|s(t|t�r2|||<q||kr||=qW|S)N)rqr�r��bool)r>Z	permanentZruntimer��keyr�r?r?r@�'combine_runtime_with_permanent_settings�s

z0Firewall.combine_runtime_with_permanent_settingscCsi}i}x�t|j��t|j��BD]�}||kr"t||t�r�t||krN||ng�}tt||�|�||<t|t||�A|@�||<q"t||t�s�t||t�r�||r�||r�d||<q�||r�||r�d||<q"ttjdj	t
||�|���q"W||fS)NTFz Unhandled setting type {} key {})r�r�r�r�r�r�r"r!ZINVALID_SETTING�formatr�)r>Zold_settingsZnew_settingsZadd_settingsZremove_settingsr��oldr?r?r@�get_added_and_removed_settings�s

 z'Firewall.get_added_and_removed_settings)F)FF)F)N)N)N)N)F)2�__name__�
__module__�__qualname__rArMr=rZrbr�r�rvr�r�r�rsr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r~r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r?r?r?r@rBsd
(	;








 	
s)A�__all__Zos.pathr{rXrqrr�ZfirewallrrZ
firewall.corerrrrr	Zfirewall.core.fw_icmptyper
Zfirewall.core.fw_servicerZfirewall.core.fw_zonerZfirewall.core.fw_directr
Zfirewall.core.fw_configrZfirewall.core.fw_policiesrZfirewall.core.fw_ipsetrZfirewall.core.fw_transactionrZfirewall.core.fw_helperrZfirewall.core.fw_policyrZfirewall.core.fw_nmrrZfirewall.core.loggerrZfirewall.core.io.firewalld_confrZfirewall.core.io.directrZfirewall.core.io.servicerZfirewall.core.io.icmptyperZfirewall.core.io.zonerrZfirewall.core.io.ipsetrZfirewall.core.ipsetrZfirewall.core.io.helperrZfirewall.core.io.policyr r!Zfirewall.errorsr"�objectrr?r?r?r@�<module>sHsite-packages/firewall/core/__pycache__/helper.cpython-36.opt-1.pyc000064400000000256147511334670021113 0ustar003

]ûf$�@sdZdZdS)zThe helper maxnamelen� N)�__doc__ZHELPER_MAXNAMELEN�rr�/usr/lib/python3.6/helper.py�<module>ssite-packages/firewall/core/__pycache__/fw_ipset.cpython-36.pyc000064400000016503147511334670020517 0ustar003

]ûf�%�@sfdZdgZddlmZddlmZmZmZm	Z	ddl
mZddlm
Z
ddlmZGdd�de�Zd	S)
z
ipset backend�
FirewallIPSet�)�log)�remove_default_create_options�normalize_ipset_entry�check_entry_overlaps_existing�check_for_overlapping_entries)�IPSet)�errors)�
FirewallErrorc@s�eZdZdd�Zdd�Zdd�Zdd�Zd	d
�Zdd�Zd
d�Z	d4dd�Z
dd�Zdd�Zd5dd�Z
dd�Zdd�Zdd�Zd6dd �Zd!d"�Zd#d$�Zd%d&�Zd7d'd(�Zd)d*�Zd+d,�Zd-d.�Zd/d0�Zd1d2�Zd3S)8rcCs||_i|_dS)N)�_fw�_ipsets)�self�fw�r�/usr/lib/python3.6/fw_ipset.py�__init__#szFirewallIPSet.__init__cCsd|j|jfS)Nz%s(%r))�	__class__r)r
rrr�__repr__'szFirewallIPSet.__repr__cCs|jj�dS)N)r�clear)r
rrr�cleanup,szFirewallIPSet.cleanupcCs||j�krttj|��dS)N)�
get_ipsetsr
r	Z
INVALID_IPSET)r
�namerrr�check_ipset/szFirewallIPSet.check_ipsetcCs||j�kS)N)r)r
rrrr�query_ipset3szFirewallIPSet.query_ipsetcCst|jj��S)N)�sortedr�keys)r
rrrr6szFirewallIPSet.get_ipsetscCst|j�dkS)Nr)�lenr)r
rrr�
has_ipsets9szFirewallIPSet.has_ipsetsFcCs&|j|�|j|}|r"|j|�|S)N)rr�check_applied_obj)r
r�applied�objrrr�	get_ipset<s



zFirewallIPSet.get_ipsetcCs4g}|jjr|j|jj�|jjr0|j|jj�|S)N)rZnftables_enabled�appendZnftables_backendZ
ipset_enabledZ
ipset_backend)r
�backendsrrrr#CszFirewallIPSet.backendscCs0|j|jjkr ttjd|j��||j|j<dS)Nz'%s' is not supported by ipset.)�typerZipset_supported_typesr
r	ZINVALID_TYPErr)r
r rrr�	add_ipsetKszFirewallIPSet.add_ipsetcCs�|j|}|jrh|rhy x|j�D]}|j|�q"WWqttk
rd}zttj|��WYdd}~XqtXntj	d|�|j|=dS)Nz,Keeping ipset '%s' because of timeout option)
rrr#�set_destroy�	Exceptionr
r	�COMMAND_FAILEDr�debug1)r
rZkeepr �backend�msgrrr�remove_ipsetQs
 zFirewallIPSet.remove_ipsetc<Cs$|j|}�x|j�D�]}|jdkr�|j�}||kr�d|jksv|jddksv|j||dksvt|j�||dkr�y|j|�Wn.tk
r�}zt	t
j|��WYdd}~XnX|jj
�r�y|j|j|j|j�Wn0tk
�r}zt	t
j|��WYdd}~Xn&Xd|_d|jk�r,|jddk�r,qy|j|j�Wn0tk
�rl}zt	t
j|��WYdd}~XnXx�|jD]J}y|j|j|�Wn0tk
�r�}zt	t
j|��WYdd}~XnX�qvWqy|j|j|j|j|jd�Wn0tk
�r}zt	t
j|��WYdd}~XqXd|_qWdS)N�ipset�timeout�0r�T)rr#rZset_get_active_terse�optionsr$�rm_def_cr_optsr&r'r
r	r(r�_individual_callsZ
set_creater�	set_flush�entries�set_add�set_restore)r
rr r*Zactiver+�entryrrr�apply_ipset]sL


&
zFirewallIPSet.apply_ipsetcCs>x8|j�D],}|j|}d|_tjd|�|j|�q
WdS)NFzApplying ipset '%s')rrrrr)r9)r
rr rrr�apply_ipsets�s

zFirewallIPSet.apply_ipsetscCs�xz|j�D]n}|jdkrq
x\|j�D]P}y|j|�|j|�Wq$tk
rr}z|jtjkrb|�WYdd}~Xq$Xq$Wq
WdS)NZnftables)	r#rr�
check_appliedr&r
�coder	�NOT_APPLIED)r
r*r-r+rrr�flush�s

zFirewallIPSet.flushTcCs|j||d�jS)N)r)r!r$)r
rrrrr�get_type�szFirewallIPSet.get_typecCst|j|dd�jjd��S)NT)r�,)rr!r$�split)r
rrrr�
get_dimension�szFirewallIPSet.get_dimensioncCs|j|�}|j|�dS)N)r!r)r
rr rrrr;�s
zFirewallIPSet.check_appliedcCs|jsttj|j��dS)N)rr
r	r=r)r
r rrrr�szFirewallIPSet.check_applied_objcCs.|j||d�}d|jkr*|jddkr*dSdS)N)rZfamilyZinet6Zipv6Zipv4)r!r1)r
rrr rrr�
get_family�s

zFirewallIPSet.get_familycCs�|j|dd�}t|�}tj||j|j�||jkrFttj	d||f��t
||j�y$x|j�D]}|j|j
|�q^WWn.tk
r�}zttj|��WYdd}~Xn&Xd|jks�|jddkr�|jj|�dS)NT)rz'%s' already is in '%s'r.r/)r!rr�check_entryr1r$r5r
r	ZALREADY_ENABLEDrr#r6rr'r(r")r
rr8r r*r+rrr�	add_entry�s
zFirewallIPSet.add_entrycCs�|j|dd�}t|�}||jkr4ttjd||f��y$x|j�D]}|j|j|�q@WWn.t	k
r�}zttj
|��WYdd}~Xn&Xd|jks�|jddkr�|jj|�dS)NT)rz'%s' not in '%s'r.r/)
r!rr5r
r	ZNOT_ENABLEDr#Z
set_deleterr'r(r1�remove)r
rr8r r*r+rrr�remove_entry�s
zFirewallIPSet.remove_entrycCsD|j|dd�}t|�}d|jkr:|jddkr:ttj|��||jkS)NT)rr.r/)r!rr1r
r	ZIPSET_WITH_TIMEOUTr5)r
rr8r rrr�query_entry�s
zFirewallIPSet.query_entrycCs|j|dd�}|jS)NT)r)r!r5)r
rr rrr�get_entries�szFirewallIPSet.get_entriescCs@|j|dd�}t|�x|D]}tj||j|j�qWd|jksN|jddkrT||_y"x|j�D]}|j|j	�q`WWn.t
k
r�}zttj
|��WYdd}~XnXd|_yXxR|j�D]F}|jjr�x8|jD]}|j|j	|�q�Wq�|j|j	|j|j|jd�q�WWn0t
k
�r4}zttj
|��WYdd}~XnXd|_dS)NT)rr.r/)r!rrrDr1r$r5r#r4rr'r
r	r(rrr3r6r7)r
rr5r r8r*r+rrr�set_entries�s.
zFirewallIPSet.set_entriesN)F)F)T)T)�__name__�
__module__�__qualname__rrrrrrrr!r#r%r,r9r:r>r?rBr;rrCrErGrHrIrJrrrrr"s0

1

		N)�__doc__�__all__Zfirewall.core.loggerrZfirewall.core.ipsetrr2rrrZfirewall.core.io.ipsetrZfirewallr	Zfirewall.errorsr
�objectrrrrr�<module>ssite-packages/firewall/core/__pycache__/fw_direct.cpython-36.pyc000064400000030436147511334670020646 0ustar003

]ûf
V�@sndgZddlmZddlmZddlmZddlmZddlm	Z	ddl
mZddlm
Z
Gd	d�de�Zd
S)�FirewallDirect�)�LastUpdatedOrderedDict)�	ipXtables)�ebtables)�FirewallTransaction)�log)�errors)�
FirewallErrorc@sDeZdZdd�Zdd�Zdd�Zdd�Zd	d
�Zdd�Zd
d�Z	dLdd�Z
dd�Zdd�ZdMdd�Z
dd�Zdd�Zdd�Zdd�ZdNd d!�ZdOd"d#�Zd$d%�Zd&d'�Zd(d)�ZdPd*d+�ZdQd,d-�Zd.d/�Zd0d1�Zd2d3�Zd4d5�Zd6d7�Zd8d9�ZdRd:d;�ZdSd<d=�Z d>d?�Z!d@dA�Z"dBdC�Z#dDdE�Z$dFdG�Z%dHdI�Z&dJdK�Z'dS)TrcCs||_|j�dS)N)�_fw�_FirewallDirect__init_vars)�self�fw�r�/usr/lib/python3.6/fw_direct.py�__init__'szFirewallDirect.__init__cCsd|j|j|j|jfS)Nz%s(%r, %r, %r))�	__class__�_chains�_rules�_rule_priority_positions)rrrr�__repr__+szFirewallDirect.__repr__cCs"i|_i|_i|_i|_d|_dS)N)rrr�
_passthroughs�_obj)rrrrZ__init_vars/s
zFirewallDirect.__init_varscCs|j�dS)N)r)rrrr�cleanup6szFirewallDirect.cleanupcCs
t|j�S)N)rr
)rrrr�new_transaction;szFirewallDirect.new_transactioncCs
||_dS)N)r)r�objrrr�set_permanent_config@sz#FirewallDirect.set_permanent_configcCs\t|j�t|j�t|j�dkr&dSt|jj��t|jj��t|jj��dkrXdSdS)NrTF)�lenrrrr�get_all_chains�
get_all_rules�get_all_passthroughs)rrrr�has_configurationCs"z FirewallDirect.has_configurationNcCsP|dkr|j�}n|}|j|jj�|jj�|jj�f|�|dkrL|jd�dS)NT)r�
set_configrrrr�execute)r�use_transaction�transactionrrr�apply_directLs

zFirewallDirect.apply_directcCsi}i}i}xL|jD]B}|\}}x4|j|D]&}|jj|||�s,|j|g�j|�q,WqWxf|jD]\}|\}}}xL|j|D]>\}	}
|jj||||	|
�s|||kr�t�||<|	|||	|
f<q|WqbWxP|jD]F}x@|j|D]2}
|jj	||
�s�||k�r�g||<||j|
�q�Wq�W|||fS)N)
rr�query_chain�
setdefault�appendr�
query_rulerr�query_passthrough)rZchains�rulesZpassthroughs�table_id�ipv�table�chain�chain_id�priority�argsrrr�get_runtime_config]s,


z!FirewallDirect.get_runtime_configcCs|j|j|jfS)N)rrr)rrrr�
get_config|szFirewallDirect.get_configcCs�|dkr|j�}n|}|\}}}x||D]t}|\}}	xf||D]Z}
|j||	|
�s<y|j||	|
|d�Wq<tk
r�}ztjt|��WYdd}~Xq<Xq<Wq&Wx�|D]�}|\}}	}
xt||D]h\}
}|j||	|
|
|�s�y|j||	|
|
||d�Wq�tk
�r"}ztjt|��WYdd}~Xq�Xq�Wq�Wxx|D]p}xh||D]\}|j	||��s@y|j
|||d�Wn2tk
�r�}ztjt|��WYdd}~XnX�q@W�q2W|dk�r�|jd�dS)N)r#T)rr&�	add_chainr	rZwarning�strr)�add_ruler*�add_passthroughr")rZconfr#r$rrrr,r-r.r/�errorr0r1r2rrrr!s@



(

(
,
zFirewallDirect.set_configcCs*dddg}||kr&ttjd||f��dS)N�ipv4�ipv6Zebz'%s' not in '%s')r	rZINVALID_IPV)rr-Zipvsrrr�
_check_ipv�s
zFirewallDirect._check_ipvcCsF|j|�|dkrtjj�ntjj�}||krBttjd||f��dS)Nr:r;z'%s' not in '%s')r:r;)r<r�BUILT_IN_CHAINS�keysrr	rZ
INVALID_TABLE)rr-r.Ztablesrrr�_check_ipv_table�s

zFirewallDirect._check_ipv_tablecCs�|dkr4tj|}|jjr i}qH|jj|�j|}ntj|}tj|}||kr`tt	j
d|��||krxtt	j
d|��|dkr�|jjj|�dk	r�tt	j
d|��dS)Nr:r;zchain '%s' is built-in chainzchain '%s' is reservedzChain '%s' is reserved)r:r;)r:r;)rr=r
�nftables_enabled�get_direct_backend_by_ipv�
our_chainsrZ
OUR_CHAINSr	rZ
BUILTIN_CHAIN�zoneZzone_from_chainZ
INVALID_CHAIN)rr-r.r/Zbuilt_in_chainsrBrrr�_check_builtin_chain�s"




z#FirewallDirect._check_builtin_chaincCsH|r|jj|g�j|�n*|j|j|�t|j|�dkrD|j|=dS)Nr)rr'r(�remover)rr,r/�addrrr�_register_chain�s
zFirewallDirect._register_chaincCs>|dkr|j�}n|}|jd||||�|dkr:|jd�dS)NT)r�_chainr")rr-r.r/r#r$rrrr5�s
zFirewallDirect.add_chaincCs>|dkr|j�}n|}|jd||||�|dkr:|jd�dS)NFT)rrHr")rr-r.r/r#r$rrr�remove_chain�s
zFirewallDirect.remove_chaincCs:|j||�|j|||�||f}||jko8||j|kS)N)r?rDr)rr-r.r/r,rrrr&�s

zFirewallDirect.query_chaincCs,|j||�||f}||jkr(|j|SgS)N)r?r)rr-r.r,rrr�
get_chains�s


zFirewallDirect.get_chainscCsDg}x:|jD]0}|\}}x"|j|D]}|j|||f�q$WqW|S)N)rr()r�r�keyr-r.r/rrrr�szFirewallDirect.get_all_chainscCsB|dkr|j�}n|}|jd||||||�|dkr>|jd�dS)NT)r�_ruler")rr-r.r/r1r2r#r$rrrr7s
zFirewallDirect.add_rulecCsB|dkr|j�}n|}|jd||||||�|dkr>|jd�dS)NFT)rrMr")rr-r.r/r1r2r#r$rrr�remove_rules
zFirewallDirect.remove_rulecCs2|j||�|||f}||jko0||f|j|kS)N)r?r)rr-r.r/r1r2r0rrrr)s

zFirewallDirect.query_rulecCs6|j||�|||f}||jkr2t|j|j��SgS)N)r?r�listr>)rr-r.r/r0rrr�	get_ruless


zFirewallDirect.get_rulesc	CsRg}xH|jD]>}|\}}}x.|j|D] \}}|j||||t|�f�q&WqW|S)N)rr(rO)rrKrLr-r.r/r1r2rrrr%s
 zFirewallDirect.get_all_rulescCs�|rr||jkrt�|j|<||j||<||jkr<i|j|<||j|krb|j|||7<q�||j||<n<|j||=t|j|�dkr�|j|=|j|||8<dS)Nr)rrrr)r�rule_idr0r1�enable�countrrr�_register_rule-s


zFirewallDirect._register_rulecCsVy|jj|jj|�j|�Stk
rP}ztj|�ttj	|��WYdd}~XnXdS)N)
r
�rulerA�name�	ExceptionrZdebug2r	rZCOMMAND_FAILED)rr-r2�msgrrr�passthroughAs

zFirewallDirect.passthroughcCsX|r*||jkrg|j|<|j|j|�n*|j|j|�t|j|�dkrT|j|=dS)Nr)rr(rEr)rr-r2rRrrr�_register_passthroughIs

z$FirewallDirect._register_passthroughcCs@|dkr|j�}n|}|jd|t|�|�|dkr<|jd�dS)NT)r�_passthroughrOr")rr-r2r#r$rrrr8Ss
zFirewallDirect.add_passthroughcCs@|dkr|j�}n|}|jd|t|�|�|dkr<|jd�dS)NFT)rr[rOr")rr-r2r#r$rrr�remove_passthrough^s
z!FirewallDirect.remove_passthroughcCs||jkot|�|j|kS)N)r�tuple)rr-r2rrrr*is
z FirewallDirect.query_passthroughcCs>g}x4|jD]*}x$|j|D]}|j|t|�f�qWqW|S)N)rr(rO)rrKr-r2rrrrms
z#FirewallDirect.get_all_passthroughscCs4g}||jkr0x |j|D]}|jt|��qW|S)N)rr(rO)rr-rKr2rrr�get_passthroughsts

zFirewallDirect.get_passthroughscCs�g}x�|D]�}d}x�|D]�}y|j|�}Wntk
r>YqXt|�|krd||dkrd}||djd�}x.|D]&}	|dd�}
|	|
|d<|j|
�qxWqW|s
|j|�q
W|S)z5Split values combined with commas for options in optsF�,�TN)�index�
ValueErrorr�splitr()rr+ZoptsZ	out_rulesrUZ	processed�opt�i�items�itemrMrrr�split_value{s$


zFirewallDirect.split_valuec
Cs*|j||�|jjr2|dkr2|jjj||||�|}|jj|�}	|jjrd|	j|||�rdd|}n:|jjr�|dd�dkr�|	j|||dd��r�|dd�}|||f}
||f}|r�|
|jkr�||j|
kr�tt	j
d||||f��nB|
|jk�s||j|
k�rtt	jd||||f��|j|
|}d}d	}
|
|jk�r�t
|j|
j��}d	}x@|t|�k�r�|||k�r�||j|
||7}|d7}�qTWt|�g}|j|d
dg�}|j|dd
g�}x<|D]4}|j|	|	j||||t|���|d7}|
d7}
�q�W|j||
|||
�|j|j||
|||
�dS)Nr:r;z	%s_direct�Z_directz"rule '%s' already is in '%s:%s:%s'zrule '%s' is not in '%s:%s:%s'r`rz-sz--sourcez-dz
--destination)r:r;i����i����i����)r?r
r@rC�create_zone_base_by_chainrAZis_chain_builtinrr	r�ALREADY_ENABLED�NOT_ENABLEDr�sortedr>rrOrhr7Z
build_ruler]rT�add_fail)rrRr-r.r/r1r2r$rH�backendr0rQrarSZ	positions�jZ	args_list�_argsrrrrM�sZ




(

zFirewallDirect._rulecCs�|j||�|j|||�||f}|rV||jkr�||j|kr�ttjd|||f��n.||jksn||j|kr�ttjd|||f��|jj|�}|j	||j
|||��|j|||�|j|j|||�dS)Nz chain '%s' already is in '%s:%s'zchain '%s' is not in '%s:%s')
r?rDrr	rrkrlr
rAZ	add_rulesZbuild_chain_rulesrGrn)rrFr-r.r/r$r,rorrrrHs$

zFirewallDirect._chainc
Cs�|j|�t|�}|rD||jkrp||j|krpttjd||f��n,||jks\||j|krpttjd||f��|jj|�}|r�|j	|�|dkr�|j
|�\}}|r�|r�|jjj|||�|}	n
|j
|�}	|j||	�|j|||�|j|j|||�dS)Nzpassthrough '%s', '%s'r:r;)r:r;)r<r]rr	rrkrlr
rAZcheck_passthroughZpassthrough_parse_table_chainrCrjZreverse_passthroughr7rZrn)
rrRr-r2r$Z
tuple_argsror.r/rqrrrr[s0




zFirewallDirect._passthrough)N)N)N)N)N)N)N)N)(�__name__�
__module__�__qualname__rrrrrrr r%r3r4r!r<r?rDrGr5rIr&rJrr7rNr)rPrrTrYrZr8r\r*rr^rhrMrHr[rrrrr&sJ	

'	

	




jN)�__all__Zfirewall.fw_typesrZ
firewall.corerrZfirewall.core.fw_transactionrZfirewall.core.loggerrZfirewallrZfirewall.errorsr	�objectrrrrr�<module>ssite-packages/firewall/core/__pycache__/fw_zone.cpython-36.opt-1.pyc000064400000077522147511334670021315 0ustar003

]ûfy��@s�ddlZddlZddlmZmZmZddlmZddlm	Z	ddl
mZddlm
Z
mZmZmZmZmZmZmZmZddlmZmZmZddlmZdd	lmZdd
lmZGdd�de �Z!dS)
�N)�	SHORTCUTS�DEFAULT_ZONE_TARGET�SOURCE_IPSET_TYPES)�FirewallTransaction)�Policy)�log)	�Rich_Service�	Rich_Port�
Rich_Protocol�Rich_SourcePort�Rich_ForwardPort�Rich_IcmpBlock�
Rich_IcmpType�Rich_Masquerade�	Rich_Mark)�checkIPnMask�
checkIP6nMask�	check_mac)�errors)�
FirewallError)�LastUpdatedOrderedDictc@sNeZdZdZdd�Zdd�Zdd�Zdd	�Zd
d�Zdd
�Z	dd�Z
dd�Zdd�Zdd�Z
dd�Zdd�Zdd�Zdd�Zd�dd �Zd!d"�Zd#d$�Zd%d&�Zd�d'd(�Zd)d*�Zd+d,�Zd-d.�Zd�d/d0�Zd�d1d2�Zd3d4�Zd5d6�Zd7d8�Zd9d:�Zd;d<�Z d=d>�Z!d�d@dA�Z"dBdC�Z#d�dDdE�Z$d�dFdG�Z%d�dHdI�Z&dJdK�Z'dLdM�Z(dNdO�Z)d�dQdR�Z*d�dSdT�Z+d�dUdV�Z,dWdX�Z-d�dYdZ�Z.d�d[d\�Z/d]d^�Z0d_d`�Z1dadb�Z2d�dcdd�Z3dedf�Z4dgdh�Z5didj�Z6dkdl�Z7dmdn�Z8dodp�Z9d�dqdr�Z:dsdt�Z;dudv�Z<dwdx�Z=d�dydz�Z>d{d|�Z?d}d~�Z@dd��ZAd�d�d��ZBd�d��ZCd�d��ZDd�d��ZEd�d��ZFd�d�d��ZGd�d��ZHd�d��ZId�d��ZJd�d�d��ZKd�d��ZLd�d��ZMd�d��ZNd�d�d��ZOd�d��ZPd�d��ZQd�d�d��ZRd�d�d��ZSd�d�d��ZTd�d��ZUd�d�d��ZVd�d��ZWd�d��ZXd�d��ZYd�d�d��ZZd�d��Z[d�d��Z\d�d��Z]d�d��Z^d�d��Z_d�d�d��Z`d�d��Zad�d�d„Zbd�dĄZcd�dƄZddS)��FirewallZonercCs||_i|_i|_dS)N)�_fw�_zones�_zone_policies)�self�fw�r�/usr/lib/python3.6/fw_zone.py�__init__&szFirewallZone.__init__cCsd|j|jfS)Nz%s(%r))�	__class__r)rrrr�__repr__+szFirewallZone.__repr__cCs|jj�|jj�dS)N)r�clearr)rrrr�cleanup.s
zFirewallZone.cleanupcCs
t|j�S)N)rr)rrrr�new_transaction2szFirewallZone.new_transactioncCsdj||d�S)Nzzone_{fromZone}_{toZone})�fromZone�toZone)�format)rr%r&rrr�policy_name_from_zones5sz#FirewallZone.policy_name_from_zonescCst|jj��S)N)�sortedr�keys)rrrr�	get_zones:szFirewallZone.get_zonescCs8g}x.|j�D]"}|j|�s&|j|�r|j|�qW|S)N)r+�list_interfaces�list_sources�append)rZactive_zones�zonerrr�get_active_zones=s
zFirewallZone.get_active_zonescCs6|j|�}x&|jD]}||j|jdkr|SqWdS)N�
interfaces)�_FirewallZone__interface_idr�settings)r�	interface�interface_idr/rrr�get_zone_of_interfaceDs

z"FirewallZone.get_zone_of_interfacecCs6|j|�}x&|jD]}||j|jdkr|SqWdS)N�sources)�_FirewallZone__source_idrr3)r�source�	source_idr/rrr�get_zone_of_sourceLs

zFirewallZone.get_zone_of_sourcecCs|jj|�}|j|S)N)r�
check_zoner)rr/�zrrr�get_zoneTszFirewallZone.get_zonecCsBt�}|j|_|j||�|_|j|_|j|_|g|_|g|_�x�dD�]�}||jkr~|d	kr~|dkr~t	||t
jt||���qD|d
kr�||jkr�|d
kr�t	||t
jt||���qD||jko�|d
ko�|dk�r�t	||t
jt||���qD|dkrDg|_
xB|j
D]8}|j||�}||j|j|�k�r�|j
jt
j|���q�WqDW|S)N�services�ports�
masquerade�
forward_ports�source_ports�icmp_blocks�rules�	protocols�HOST�ANY)r?r@rArBrCrDrErF)r?r@rCrDrF)rA)rDrB)rE)r�nameZderived_from_zoner(�ZONE_POLICY_PRIORITYZpriority�targetZ
ingress_zonesZegress_zones�setattr�copy�deepcopy�getattrrE�_rich_rule_to_policiesr.)r�z_objr%r&�p_objZsetting�ruleZcurrent_policyrrr�policy_obj_from_zone_objXs6

z%FirewallZone.policy_obj_from_zone_objcCs�dd�d	D�|_||j|j<g|j|j<xX|jdfd|jf|jdfgD]8\}}|j|||�}|jjj|�|j|jj|j�qFW|j	|j�dS)
NcSsi|]}t�|�qSr)r)�.0�xrrr�
<dictcomp>sz)FirewallZone.add_zone.<locals>.<dictcomp>r1r7�icmp_block_inversion�forwardrGrH)r1r7rXrY)
r3rrIrrTr�policyZ
add_policyr.�copy_permanent_to_runtime)r�objr%r&rRrrr�add_zone~s

zFirewallZone.add_zonecCsn|j|}x|jD]}|j||dd�qWx|jD]}|j||dd�q2W|jrZ|j|�|jrj|j|�dS)NF)�allow_apply)	rr1�
add_interfacer7�
add_sourcerY�add_forwardrX�add_icmp_block_inversion)rr/r\�argrrrr[�s

z&FirewallZone.copy_permanent_to_runtimecCs8|j|}|jr|j|�|jj�|j|=|j|=dS)N)r�applied�unapply_zone_settingsr3r"r)rr/r\rrr�remove_zone�s


zFirewallZone.remove_zoneNcCsVxP|j�D]D}|j|}t|j�dks4t|j�dkr
tjd|�|j||d�q
WdS)NrzApplying zone '%s')�use_transaction)r+r�lenr1r7r�debug1�apply_zone_settings)rrgr/rQrrr�apply_zones�s

zFirewallZone.apply_zonescCs|j|}||_dS)N)rrd)rr/rdr\rrr�set_zone_applied�s
zFirewallZone.set_zone_appliedcCs�d|krdS|jd�}t|�dkr&dSd}x tD]}|dt|kr0|}q0W|dk	r�|d|j�krhdSt|�dks�t|�dkr�|ddkr�|d|fSdS)N�_�r���prer�deny�allow�post)rqrrrrsrt)�splitrhrr+)r�chainZsplits�_chainrVrrr�zone_from_chain�s 

zFirewallZone.zone_from_chaincCst|j|�}|dkrdS|\}}|d	kr0|}d}n4|d
krB|}d}n"|dkrTd}|}nttjd|��|j||�|fS)N�
PREROUTING�
FORWARD_INrH�INPUTrG�POSTROUTING�FORWARD_OUTz&chain '%s' can't be mapped to a policy)ryrz)r{)r|r})rxrrZ
INVALID_CHAINr()rrvrVr/rwr%r&rrr�policy_from_chain�s
zFirewallZone.policy_from_chainc	Csj|dkrf|j|�}|dk	rf|j|�\}}|dkr:|j�}n|}|jjj|d|||�|dkrf|jd�dS)N�ipv4�ipv6T)rr�)r~r$rrZZgen_chain_rules�execute)	r�ipv�tablervrgrVrZrw�transactionrrr�create_zone_base_by_chain�s

z&FirewallZone.create_zone_base_by_chaincCstj�||d�}|S)N)Zdate�sender�timeout)�time)rr�r��retrrrZ__gen_settings�szFirewallZone.__gen_settingscCs|j|�jS)N)r>r3)rr/rrr�get_settings�szFirewallZone.get_settingscCs�|j|�}x�|D]z}xt||D]h}|dkr<|j||||�q|dkr`|j|||d|d|�q|dkrlqq|dkrvqtjd|||�qWqW|r�|j|||�dS)Nr1r7rrorXrYz3Zone '%s': Unknown setting '%s:%s', unable to apply)r��
_interface�_sourcerZwarning�_icmp_block_inversion)r�enabler/r�r3�key�argsrrr�_zone_settingss

zFirewallZone._zone_settingscCs�|jj|�}|j|}|jr dSd|_|dkr8|j�}n|}x2|j|D]$}tjd||�|jjj	||d�qHW|j
d||�|dkr�|jd�dS)NTz+Applying policy (%s) derived from zone '%s')rg)rr<rrdr$rrrirZ�apply_policy_settingsr�r�)rr/rg�_zoner\r�rZrrrrjs

z FirewallZone.apply_zone_settingscCs�|jj|�}|j|}|js dS|dkr2|j�}n|}x$|j|D]}|jjj||d�qBW|jd||�|dkr||j	d�dS)N)rgFT)
rr<rrdr$rrZ�unapply_policy_settingsr�r�)rr/rgr�r\r�rZrrrre,s

z"FirewallZone.unapply_zone_settingscCs~|j|�}|j|�}g}x\td�D]P}|j|d|krZ|jtjt||j|d���q"|j||j|d�q"Wt|�S)zH
        :return: exported config updated with runtime settings
        �r)	r>�get_config_with_settings_dict�rangeZIMPORT_EXPORT_STRUCTUREr.rMrNrO�tuple)rr/r\Z	conf_dictZ	conf_list�irrr�get_config_with_settings?s

"z%FirewallZone.get_config_with_settingsc
Cs�|j|�j�}|dtkr"d|d<|j|�|j|�|j|�|j|�|j|�|j|�|j	|�|j
|�|j|�|j|�|j
|�|j|�d�}|jj||�S)zH
        :return: exported config updated with runtime settings
        rK�default)r?r@rDrArBr1r7�	rules_strrFrCrXrY)r>Zexport_config_dictr�
list_services�
list_ports�list_icmp_blocks�query_masquerade�list_forward_portsr,r-�
list_rules�list_protocols�list_source_ports�query_icmp_block_inversion�
query_forwardrZ'combine_runtime_with_permanent_settings)rr/Z	permanentZruntimerrrr�Os z*FirewallZone.get_config_with_settings_dictc
sddlm�d��fdd�	}��fdd�}�j�jf�j�jf�j�jf�j�j	f�j
�jf�j�j
f�j�jf||f�j�jf�j�jf�j�jf�j�jfd�}�j|�}�jj||�\}}	xv|	D]n}
t|	|
t��r$xX|	|
D]:}t|t��r||
d|f|��q�||
d||�q�Wq�||
d|�q�Wx�|D]�}
t||
t��r�x�||
D]l}|
dk�r�||
d|||d�nDt|t��r�||
d|f|�d|d��n||
d||d|d��q\Wn6|
dk�r�||
d||d�n||
d|d|d��q>WdS)Nr)�	Rich_Rulecs�j|�|d�d|d�dS)N)�rule_strr)r�r�)�add_rule)r/r�r�r�)r�rrr�add_rule_wrapperhszDFirewallZone.set_config_with_settings_dict.<locals>.add_rule_wrappercs�j|�|d��dS)N)r�)�remove_rule)r/r�)r�rrr�remove_rule_wrapperjszGFirewallZone.set_config_with_settings_dict.<locals>.remove_rule_wrapper)r?r@rDrArBr1r7r�rFrCrXrYror1r7)r�)r�r�rX)rN)r1r7)rX)�firewall.core.richr��add_service�remove_service�add_port�remove_port�add_icmp_block�remove_icmp_block�add_masquerade�remove_masquerade�add_forward_port�remove_forward_portr_�remove_interfacer`�
remove_source�add_protocol�remove_protocol�add_source_port�remove_source_portrb�remove_icmp_block_inversionra�remove_forwardr�rZget_added_and_removed_settings�
isinstance�listr�)rr/r3r�r�r�Z
setting_to_fnZold_settingsZadd_settingsZremove_settingsr�r�r)r�rr�set_config_with_settings_dictesF













  
z*FirewallZone.set_config_with_settings_dictcCs|jj|�dS)N)r�check_interface)rr4rrrr��szFirewallZone.check_interfacecCs\|jj|�}|j|}|j|�}||jdkrX|jd|}d|krX|ddk	rX|dSdS)Nr1r�)rr<rr2r3)rr/r4r��_objr5r3rrr�interface_get_sender�s

z!FirewallZone.interface_get_sendercCs|j|�|S)N)r�)rr4rrrZ__interface_id�s
zFirewallZone.__interface_idTc
Cs|jj�|jj|�}|j|}|j|�}||jdkrLttjd||f��|j	|�dk	rjttj
d|��tjd||f�|dkr�|j
�}	n|}	|jr�|r�|j||	d�|	j|j|d�|r�|jd|||	�|j||||�|	j|j||�|dk�r|	jd�|S)Nr1z'%s' already bound to '%s'z'%s' already bound to a zonez&Setting zone of interface '%s' to '%s')rgFT)r�check_panicr<rr2r3rr�ZONE_ALREADY_SETr6�
ZONE_CONFLICTrrir$rdrj�add_failrlr��!_FirewallZone__register_interface�#_FirewallZone__unregister_interfacer�)
rr/r4r�rgr^r�r�r5r�rrrr_�s8









zFirewallZone.add_interfacecCs6|jd|�|jd|<|p"|dk|jd|d<dS)Nrr1��__default__)�_FirewallZone__gen_settingsr3)rr�r5r/r�rrrZ__register_interface�sz!FirewallZone.__register_interfacecCsR|jj�|j|�}|jj|�}||kr,|S|dk	r@|j||�|j|||�}|S)N)rr�r6r<r�r_)rr/r4r��	_old_zone�	_new_zoner�rrr�change_zone_of_interface�s

z%FirewallZone.change_zone_of_interfacecCsz|jj�|dkr|j�}n|}|j||�|jd|d|dd�|dk	rd|dkrd|jd|d|dd�|dkrv|jd�dS)NT�+)r.r�F)rr�r$rjr�r�)rZold_zoneZnew_zonergr�rrr�change_default_zone�s

z FirewallZone.change_default_zonec	Cs�|jj�|j|�}|dkr,ttjd|��|dkr8|n
|jj|�}||krbttjd|||f��|dkrt|j�}n|}|j	|}|j
|�}|j|j||�|j
d|||�|dkr�|jd�|S)Nz'%s' is not in any zoner�z"remove_interface(%s, %s): zoi='%s'FT)rr�r6rrZUNKNOWN_INTERFACEr<r�r$rr2�add_postr�r�r�)	rr/r4rgZzoir�r�r�r5rrrr��s(






zFirewallZone.remove_interfacecCs||jdkr|jd|=dS)Nr1)r3)rr�r5rrrZ__unregister_interfacesz#FirewallZone.__unregister_interfacecCs|j|�|j|�dkS)Nr1)r2r�)rr/r4rrr�query_interfaceszFirewallZone.query_interfacecCs|j|�dj�S)Nr1)r�r*)rr/rrrr,"szFirewallZone.list_interfacesFcCsxt|�rdSt|�rdSt|�r$dS|jd�rh|j|dd��|rV|j|dd��|j|dd��Sttj	|��dS)Nrr�r�zipset:�)
rrr�
startswith�_check_ipset_type_for_source�_check_ipset_applied�
_ipset_familyrrZINVALID_ADDR)rr9rdrrr�check_source's
zFirewallZone.check_sourcecCs|j||d�}||fS)N)rd)r�)rr9rdr�rrrZ__source_id6szFirewallZone.__source_idc
Cs|jj�|jj|�}|j|}t|�r0|j�}|j||d�}||jdkr`tt	j
d||f��|j|�dk	r~tt	jd|��|dkr�|j
�}	n|}	|jr�|r�|j||	d�|	j|j|d�|r�|jd||d|d	|	�|j||||�|	j|j||�|dk�r|	jd�|S)
N)rdr7z'%s' already bound to '%s'z'%s' already bound to a zone)rgFTrro)rr�r<rr�upperr8r3rrr�r;r�r$rdrjr�rlr��_FirewallZone__register_source� _FirewallZone__unregister_sourcer�)
rr/r9r�rgr^r�r�r:r�rrrr`:s4





zFirewallZone.add_sourcecCs6|jd|�|jd|<|p"|dk|jd|d<dS)Nrr7r�r�)r�r3)rr�r:r/r�rrrZ__register_sourceaszFirewallZone.__register_sourcecCsb|jj�|j|�}|jj|�}||kr,|St|�r<|j�}|dk	rP|j||�|j|||�}|S)N)rr�r;r<rr�r�r`)rr/r9r�r�r�r�rrr�change_zone_of_sourcegs

z"FirewallZone.change_zone_of_sourcec	Cs�|jj�t|�r|j�}|j|�}|dkr<ttjd|��|dkrH|n
|jj|�}||krrttj	d|||f��|dkr�|j
�}n|}|j|}|j|�}|j
|j||�|jd||d|d|�|dkr�|jd�|S)Nz'%s' is not in any zoner�zremove_source(%s, %s): zos='%s'FrroT)rr�rr�r;rrZUNKNOWN_SOURCEr<r�r$rr8r�r�r�r�)	rr/r9rgZzosr�r�r�r:rrrr�ys,






zFirewallZone.remove_sourcecCs||jdkr|jd|=dS)Nr7)r3)rr�r:rrrZ__unregister_source�sz FirewallZone.__unregister_sourcecCs(t|�r|j�}|j|�|j|�dkS)Nr7)rr�r8r�)rr/r9rrr�query_source�szFirewallZone.query_sourcecCsdd�|j|�dj�D�S)NcSsg|]}|d�qS)ror)rU�krrr�
<listcomp>�sz-FirewallZone.list_sources.<locals>.<listcomp>r7)r�r*)rr/rrrr-�szFirewallZone.list_sourcescs�x��jj�D]�}|jsqxP�j|D]B}x<�jjj|�D]*\}}	|j||||||	|�}
|j||
�q8Wq$W�j|d�}�j	|�dr|d
kr|j
|||d|d�}
|j||
�qWxΈjjj�D]�}|�jjj|�kr�|�jjj
|�kr�q�|�jjj�k�rd�jjj|�j�rd|�r<t�j|��dk�r<�jjj||d�n&�jjjd	||�|j�fd
d�|�q�|r�|j�fdd�|�q�WdS)NrHrYr��*�filter)r4ro)rgFcs |�jjj�ko�jjjd|�S)NT)rrZ�)get_active_policies_not_derived_from_zone�!_ingress_egress_zones_transaction)�p)rrr�<lambda>�sz)FirewallZone._interface.<locals>.<lambda>cs|�jjj�ko�jjj|�S)N)rrZr�r�)r�)rrrr��s)r�r�)r�enabled_backends�policies_supportedrrZ�#_get_table_chains_for_zone_dispatchZ!build_zone_source_interface_rules�	add_rulesr(r��build_zone_forward_rules�"get_policies_not_derived_from_zone�list_ingress_zones�list_egress_zonesr��
get_policyrdrhr,r��_ingress_egress_zonesr�)rr�r/r4r�r.�backendrZr�rvrEr)rrr��s2$zFirewallZone._interfacecCs$|j|�dkrdS|jjj|dd�S)Nzhash:macF)rd)�_ipset_typer�ipsetZ
get_family)rrIrrrr��szFirewallZone._ipset_familycCs|jjj|dd�S)NF)rd)rr�Zget_type)rrIrrrr��szFirewallZone._ipset_typecCsdj|g|jjj|��S)N�,)�joinrr�Z
get_dimension)rrI�flagrrr�_ipset_match_flags�szFirewallZone._ipset_match_flagscCs|jjj|�S)N)rr�Z
check_applied)rrIrrrr��sz!FirewallZone._check_ipset_appliedcCs*|j|�}|tkr&ttjd||f��dS)Nz.ipset '%s' with type '%s' not usable as source)r�rrrZ
INVALID_IPSET)rrIZ_typerrrr��s
z)FirewallZone._check_ipset_type_for_sourcec
s�x�|r�jj|�gn�jj�D]�}|js*qxN�j|D]@}x:�jjj|�D](\}}	|j||||||	�}
|j||
�qJWq6W�j	|d�}�j
|�dr|j|||d|d�}
|j||
�qWxΈjjj�D]�}|�jjj
|�kr�|�jjj|�kr�q�|�jjj�k�rl�jjj|�j�rl|�rDt�j|��dk�rD�jjj||d�n&�jjjd||�|j�fdd	�|�q�|r�|j�fd
d	�|�q�WdS)NrHrYr�)r9ro)rgFcs |�jjj�ko�jjjd|�S)NT)rrZr�r�)r�)rrrr�sz&FirewallZone._source.<locals>.<lambda>cs|�jjj�ko�jjj|�S)N)rrZr�r�)r�)rrrr�
s)r�get_backend_by_ipvr�r�rrZr�Zbuild_zone_source_address_rulesr�r(r�r�r�r�r�r�r�rdrhr-r�r�r�)rr�r/r�r9r�r�rZr�rvrEr)rrr��s2"$zFirewallZone._sourcecCs0|jj|�}|j|d�}|jjj||||�|S)NrG)rr<r(rZr�)rr/�servicer�r��p_namerrrr�
szFirewallZone.add_servicecCs,|jj|�}|j|d�}|jjj||�|S)NrG)rr<r(rZr�)rr/r�r�rrrr�szFirewallZone.remove_servicecCs(|jj|�}|j|d�}|jjj||�S)NrG)rr<r(rZ�
query_service)rr/r�r�rrrr�szFirewallZone.query_servicecCs&|jj|�}|j|d�}|jjj|�S)NrG)rr<r(rZr�)rr/r�rrrr�szFirewallZone.list_servicescCs2|jj|�}|j|d�}|jjj|||||�|S)NrG)rr<r(rZr�)rr/�port�protocolr�r�r�rrrr�#szFirewallZone.add_portcCs.|jj|�}|j|d�}|jjj|||�|S)NrG)rr<r(rZr�)rr/r�r�r�rrrr�)szFirewallZone.remove_portcCs*|jj|�}|j|d�}|jjj|||�S)NrG)rr<r(rZ�
query_port)rr/r�r�r�rrrr/szFirewallZone.query_portcCs&|jj|�}|j|d�}|jjj|�S)NrG)rr<r(rZr�)rr/r�rrrr�4szFirewallZone.list_portscCs2|jj|�}|j|d�}|jjj|||||�|S)NrG)rr<r(rZr�)rr/�source_portr�r�r�r�rrrr�9szFirewallZone.add_source_portcCs.|jj|�}|j|d�}|jjj|||�|S)NrG)rr<r(rZr�)rr/rr�r�rrrr�?szFirewallZone.remove_source_portcCs*|jj|�}|j|d�}|jjj|||�S)NrG)rr<r(rZ�query_source_port)rr/rr�r�rrrrEszFirewallZone.query_source_portcCs&|jj|�}|j|d�}|jjj|�S)NrG)rr<r(rZr�)rr/r�rrrr�JszFirewallZone.list_source_portscCs�|jj|�}t|j�tkr(|j|d�gSt|j�ttt	t
gkrL|j|d�gSt|j�ttgkrv|j|d�|j|d�gSt|j�t
gkr�|j|d�gSt|j�tgkr�|jd|�gS|jdkr�|j|d�gStdt|j���dS)NrHrGz Rich rule type (%s) not handled.)rr<�type�actionrr(�elementrr	r
rr
rrrr)rr/rSrrrrPOs 

z#FirewallZone._rich_rule_to_policiescCs.x(|j||�D]}|jjj||||�qW|S)N)rPrrZr�)rr/rSr�r�r�rrrr�bszFirewallZone.add_rulecCs*x$|j||�D]}|jjj||�qW|S)N)rPrrZr�)rr/rSr�rrrr�gszFirewallZone.remove_rulecCs2d}x(|j||�D]}|o(|jjj||�}qW|S)NT)rPrrZ�
query_rule)rr/rSr�r�rrrrlszFirewallZone.query_rulecCs^|jj|�}t�}xB|j|d�|j|d�|jd|�gD]}|jt|jjj|���q6Wt|�S)NrHrG)rr<�setr(�updaterZr�r�)rr/r�r�rrrr�rs
zFirewallZone.list_rulescCs0|jj|�}|j|d�}|jjj||||�|S)NrG)rr<r(rZr�)rr/r�r�r�r�rrrr�{szFirewallZone.add_protocolcCs,|jj|�}|j|d�}|jjj||�|S)NrG)rr<r(rZr�)rr/r�r�rrrr��szFirewallZone.remove_protocolcCs(|jj|�}|j|d�}|jjj||�S)NrG)rr<r(rZ�query_protocol)rr/r�r�rrrr	�szFirewallZone.query_protocolcCs&|jj|�}|j|d�}|jjj|�S)NrG)rr<r(rZr�)rr/r�rrrr��szFirewallZone.list_protocolscCs.|jj|�}|jd|�}|jjj|||�|S)NrH)rr<r(rZr�)rr/r�r�r�rrrr��szFirewallZone.add_masqueradecCs*|jj|�}|jd|�}|jjj|�|S)NrH)rr<r(rZr�)rr/r�rrrr��szFirewallZone.remove_masqueradecCs&|jj|�}|jd|�}|jjj|�S)NrH)rr<r(rZr�)rr/r�rrrr��szFirewallZone.query_masqueradec	Cs6|jj|�}|j|d�}|jjj|||||||�|S)NrH)rr<r(rZr�)	rr/r�r��toport�toaddrr�r�r�rrrr��s
zFirewallZone.add_forward_portcCs2|jj|�}|j|d�}|jjj|||||�|S)NrH)rr<r(rZr�)rr/r�r�r
rr�rrrr��sz FirewallZone.remove_forward_portcCs.|jj|�}|j|d�}|jjj|||||�S)NrH)rr<r(rZ�query_forward_port)rr/r�r�r
rr�rrrr�szFirewallZone.query_forward_portcCs&|jj|�}|j|d�}|jjj|�S)NrH)rr<r(rZr�)rr/r�rrrr��szFirewallZone.list_forward_portscCsP|jj|�}|j|d�}|jjj||||�|j|d�}|jjj||||�|S)NrGrH)rr<r(rZr�)rr/�icmpr�r�r�rrrr��szFirewallZone.add_icmp_blockcCsH|jj|�}|j|d�}|jjj||�|j|d�}|jjj||�|S)NrGrH)rr<r(rZr�)rr/r
r�rrrr��szFirewallZone.remove_icmp_blockcCsD|jj|�}|j|d�}|j|d�}|jjj||�oB|jjj||�S)NrGrH)rr<r(rZ�query_icmp_block)rr/r
�p_name_host�
p_name_fwdrrrr�s
zFirewallZone.query_icmp_blockcCsH|jj|�}|j|d�}|j|d�}tt|jjj|�|jjj|���S)NrGrH)rr<r(r)rrZr�)rr/rrrrrr��s
zFirewallZone.list_icmp_blockscCsH|jj|�}|j|d�}|jjj||�|j|d�}|jjj||�|S)NrGrH)rr<r(rZrb)rr/r�r�rrrrb�sz%FirewallZone.add_icmp_block_inversioncCsL|jj|�}|j|d�}|jjj|||�|j|d�}|jjj|||�dS)NrGrH)rr<r(rZr�)rr�r/r�r�rrrr��s
z"FirewallZone._icmp_block_inversioncCsD|jj|�}|j|d�}|jjj|�|j|d�}|jjj|�|S)NrGrH)rr<r(rZr�)rr/r�rrrr��sz(FirewallZone.remove_icmp_block_inversioncCs@|jj|�}|j|d�}|j|d�}|jjj|�o>|jjj|�S)NrGrH)rr<r(rZr�)rr/rrrrrr��s
z'FirewallZone.query_icmp_block_inversionc
	Cs�|j|d�}xT|j|jdD]@}x:|jj�D],}|js:q.|j|||d|d�}|j||�q.WqWxj|j|jdD]V\}}	xL|r�|jj|�gn|jj�D],}|js�q�|j|||d|	d�}|j||�q�WqtWdS)NrHr1r�)r4r7)r9)	r(rr3rr�r�r�r�r�)
rr�r/r�r�r4r�rEr�r9rrr�_forward�s"zFirewallZone._forwardcCsdS)NTr)rrrrZ__forward_idszFirewallZone.__forward_idc	Cs�|jj|�}|jj|�|jj�|j|}|j�}||jdkrRttj	d|��|dkrd|j
�}n|}|jr||jd||�|j
||||�|j|j||�|dkr�|jd�|S)NrYzforward already enabled in '%s'T)rr<Z
check_timeoutr�r�_FirewallZone__forward_idr3rrZALREADY_ENABLEDr$rdr�_FirewallZone__register_forwardr��!_FirewallZone__unregister_forwardr�)	rr/r�r�rgr�r��
forward_idr�rrrras$




zFirewallZone.add_forwardcCs|j||�|jd|<dS)NrY)r�r3)rr�rr�r�rrrZ__register_forward.szFirewallZone.__register_forwardcCs�|jj|�}|jj�|j|}|j�}||jdkrFttjd|��|dkrX|j	�}n|}|j
rp|jd||�|j|j
||�|dkr�|jd�|S)NrYzforward not enabled in '%s'FT)rr<r�rrr3rrZNOT_ENABLEDr$rdrr�rr�)rr/rgr�r�rr�rrrr�2s 




zFirewallZone.remove_forwardcCs||jdkr|jd|=dS)NrY)r3)rr�rrrrZ__unregister_forwardKsz!FirewallZone.__unregister_forwardcCs|j�|j|�dkS)NrY)rr�)rr/rrrr�OszFirewallZone.query_forward)N)N)N)N)NNT)N)N)N)F)F)NNT)N)N)F)rN)rN)rN)rN)rN)rN)NNrN)NN)NN)rN)N)rNN)N)e�__name__�
__module__�__qualname__rJrr!r#r$r(r+r0r6r;r>rTr]r[rfrkrlrxr~r�r�r�r�rjrer�r�r�r�r�r2r_r�r�r�r�r�r�r,r�r8r`r�r�r�r�r�r-r�r�r�r�r�r�r�r�r�r�r�r�r�rr�r�r�rr�rPr�r�rr�r�r�r	r�r�r�r�r�r�rr�r�r�rr�rbr�r�r�rrrarr�rr�rrrrr#s�&



8
(





&


,(



	





		
		

r)"r�rMZfirewall.core.baserrrZfirewall.core.fw_transactionrZfirewall.core.io.policyrZfirewall.core.loggerrr�rr	r
rrr
rrrZfirewall.functionsrrrZfirewallrZfirewall.errorsrZfirewall.fw_typesr�objectrrrrr�<module>s,site-packages/firewall/core/__pycache__/modules.cpython-36.opt-1.pyc000064400000005500147511334670021301 0ustar003

]ûf��@sBdZdgZddlmZddlmZddlmZGdd�de�Z	dS)zmodules backend�modules�)�runProg)�log)�COMMANDSc@sLeZdZdd�Zdd�Zdd�Zdd�Zd	d
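The docstrings that survive in the dump describe a thin wrapper around /proc/modules, modprobe and rmmod. The sketch below illustrates that behaviour; it uses subprocess instead of firewalld's runProg helper, and the parsing details are assumptions.

```python
import subprocess

def loaded_modules():
    """Return (mods, deps) parsed from /proc/modules (sketch)."""
    mods, deps = [], {}
    try:
        with open("/proc/modules") as f:
            for line in f:
                fields = line.split()
                if not fields:
                    continue
                name = fields[0]
                mods.append(name)
                # Field 3 lists dependants as "a,b," or "-" if there are none.
                deps[name] = fields[3].split(",")[:-1] if fields[3] != "-" else []
    except FileNotFoundError:
        pass
    return mods, deps

def load_module(module):
    return subprocess.call(["modprobe", module])

def unload_module(module):
    return subprocess.call(["rmmod", module])
```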
site-packages/firewall/core/__pycache__/fw_transaction.cpython-36.opt-1.pyc

[CPython 3.6 compiled bytecode for firewall.core.fw_transaction ("Transaction classes for firewalld", __all__ = ["FirewallTransaction"]). Recoverable identifiers: per-backend rule lists plus pre_funcs, post_funcs, fail_funcs and modules; methods clear, add_rule, add_rules, query_rule, remove_rule, add_pre, add_post, add_fail, add_module(s), remove_module(s), prepare (which reverses rules via each backend's reverse_rule when disabling), execute (apply rules, roll back and run fail functions on error), pre and post.]
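The method names above outline a simple transactional apply-or-rollback pattern. The class below is an illustration of that pattern only; the backend objects (with name, set_rules and reverse_rule) and the method shapes are assumptions, not firewalld's implementation.

```python
class Transaction:
    def __init__(self):
        self.rules = {}        # backend name -> list of rule argument lists
        self.fail_funcs = []   # callbacks to run if execution fails

    def add_rule(self, backend, rule):
        self.rules.setdefault(backend.name, []).append(rule)

    def add_fail(self, func, *args):
        self.fail_funcs.append((func, args))

    def execute(self, backends):
        done = []
        try:
            for name, rules in self.rules.items():
                backends[name].set_rules(rules)
                done.append(name)
        except Exception:
            # Roll back whatever was already applied, newest first,
            # then run the registered failure hooks.
            for name in reversed(done):
                undo = [backends[name].reverse_rule(r) for r in self.rules[name]]
                backends[name].set_rules(undo)
            for func, args in self.fail_funcs:
                func(*args)
            raise
```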
site-packages/firewall/core/__pycache__/fw_nm.cpython-36.pyc

[CPython 3.6 compiled bytecode for firewall.core.fw_nm ("Functions for NetworkManager interaction"). Recoverable identifiers: __all__ = check_nm_imported, nm_is_imported, nm_get_zone_of_connection, nm_set_zone_of_connection, nm_get_connections, nm_get_connection_of_interface, nm_get_bus_name, nm_get_dbus_interface; additional helpers nm_get_client, nm_get_interfaces, nm_get_interfaces_in_zone, nm_get_device_by_ip_iface. The module imports GLib/NM from gi.repository (guarded, raising MISSING_IMPORT via FirewallError when unavailable) and dbus for the bus-name lookup.]
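The surviving docstrings ("Get zone of connection from NM", "Set the zone for a connection") describe lookups through the libnm GObject-introspection bindings. A rough sketch of that lookup is shown below; treat the exact calls as an approximation of what the module does.

```python
import gi
gi.require_version("NM", "1.0")
from gi.repository import NM

_client = None

def nm_get_client():
    """Return a cached NM.Client instance (sketch)."""
    global _client
    if _client is None:
        _client = NM.Client.new(None)
    return _client

def nm_get_zone_of_connection(uuid):
    """Return the zone of a connection, '' if unset, None if unknown."""
    con = nm_get_client().get_connection_by_uuid(uuid)
    if con is None:
        return None
    setting = con.get_setting_connection()
    if setting is None:
        return None
    return setting.get_zone() or ""
```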
site-packages/firewall/core/__pycache__/fw_ifcfg.cpython-36.pyc

[CPython 3.6 compiled bytecode for firewall.core.fw_ifcfg ("Functions to search for and change ifcfg files", __all__ = search_ifcfg_of_interface, ifcfg_set_zone_of_interface). Recoverable behaviour from the docstrings: scan config.IFCFGDIR for ifcfg-* files (skipping .bak, .orig, .rpmnew, .rpmorig, .rpmsave and -range entries), match the file whose DEVICE equals the interface, and set ZONE=<zone> in that file.]
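As an illustration of the ifcfg handling described above, the sketch below finds the ifcfg file whose DEVICE matches an interface and rewrites its ZONE= line. The directory path and the simplified key=value parsing are assumptions.

```python
import os

IFCFGDIR = "/etc/sysconfig/network-scripts"  # assumed default location

def search_ifcfg_of_interface(interface):
    """Return the path of the ifcfg file for `interface`, or None."""
    for filename in sorted(os.listdir(IFCFGDIR)):
        if not filename.startswith("ifcfg-"):
            continue
        path = os.path.join(IFCFGDIR, filename)
        settings = {}
        with open(path) as f:
            for line in f:
                if "=" in line and not line.lstrip().startswith("#"):
                    key, _, value = line.partition("=")
                    settings[key.strip()] = value.strip().strip('"')
        if settings.get("DEVICE") == interface:
            return path
    return None

def ifcfg_set_zone_of_interface(zone, interface):
    """Rewrite ZONE=<zone> in the ifcfg file that uses `interface`."""
    path = search_ifcfg_of_interface(interface)
    if path is None:
        return
    with open(path) as f:
        lines = [l for l in f if not l.startswith("ZONE=")]
    lines.append("ZONE=%s\n" % (zone or ""))
    with open(path, "w") as f:
        f.writelines(lines)
```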
site-packages/firewall/core/__pycache__/fw_zone.cpython-36.pyc

[Same compiled bytecode as the fw_zone.cpython-36.opt-1.pyc entry above; the archive carries both the optimized and the regular build of firewall.core.fw_zone.]

site-packages/firewall/core/__pycache__/fw_icmptype.cpython-36.opt-1.pyc

[CPython 3.6 compiled bytecode for firewall.core.fw_icmptype. Recoverable identifiers: class FirewallIcmpType with get_icmptypes, check_icmptype, get_icmptype, add_icmptype (logging "ICMP type '%s' is not supported by the kernel for %s" when a destination family is unavailable) and remove_icmptype.]

site-packages/firewall/core/__pycache__/fw_nm.cpython-36.opt-1.pyc

[Same compiled bytecode as the fw_nm.cpython-36.pyc entry above (optimized build of firewall.core.fw_nm).]

site-packages/firewall/core/__pycache__/base.cpython-36.opt-1.pyc

[CPython 3.6 compiled bytecode for firewall.core.base ("Base firewall settings"). Recoverable constants: zone/policy targets (ACCEPT, %%REJECT%%, DROP, default, REJECT), the SHORTCUTS chain map (PREROUTING/POSTROUTING/INPUT/FORWARD_IN/FORWARD_OUT/OUTPUT -> PRE/POST/IN/FWDI/FWDO/OUT), per-family REJECT_TYPES (icmp-host-prohibited, icmp-net-unreachable, tcp-reset, icmp6-adm-prohibited, ...) and SOURCE_IPSET_TYPES (hash:ip, hash:ip,port, hash:ip,mark, hash:net, hash:net,port, hash:net,iface, hash:mac).]

site-packages/firewall/core/__pycache__/fw_helper.cpython-36.opt-1.pyc

[CPython 3.6 compiled bytecode for firewall.core.fw_helper ("helper backend"). Recoverable identifiers: class FirewallHelper with check_helper, query_helper, get_helpers, has_helpers, get_helper, add_helper and remove_helper, raising INVALID_HELPER for unknown names.]

site-packages/firewall/core/__pycache__/logger.cpython-36.pyc

[CPython 3.6 compiled bytecode for firewall.core.logger (LogTarget, FileLog, Logger, log). The readable module docstring describing the Logger format strings, log levels and targets survives in the dump that follows.]

]ûf>y�@s�ddddgZddlZddlZddlZddlZddlZddlZddlZddlZddl	Z
ddl
Z
Gdd�de�ZGdd�de�Z
Gd	d
�d
e
�ZGdd�de�ZGd
d�de�ZGdd�de�Ze�ZdS)�	LogTarget�FileLog�Logger�log�Nc@s2eZdZdZdd�Zddd�Zdd�Zd	d
�ZdS)
rz% Abstract class for logging targets. cCs
d|_dS)N)�fd)�self�r�/usr/lib/python3.6/logger.py�__init__(szLogTarget.__init__rcCstd��dS)Nz%LogTarget.write is an abstract method)�NotImplementedError)r�data�level�logger�is_debugrrr	�write+szLogTarget.writecCstd��dS)Nz%LogTarget.flush is an abstract method)r)rrrr	�flush.szLogTarget.flushcCstd��dS)Nz%LogTarget.close is an abstract method)r)rrrr	�close1szLogTarget.closeN)r)�__name__�
__module__�__qualname__�__doc__r
rrrrrrr	r&s

c@s.eZdZdd�Zddd�Zdd�Zdd	�Zd
S)�
_StdoutLogcCstj|�tj|_dS)N)rr
�sys�stdoutr)rrrr	r
8s
z_StdoutLog.__init__rcCs|jj|�|j�dS)N)rrr)rrr
rrrrr	r<sz_StdoutLog.writecCs|j�dS)N)r)rrrr	rAsz_StdoutLog.closecCs|jj�dS)N)rr)rrrr	rDsz_StdoutLog.flushN)r)rrrr
rrrrrrr	r7s
rc@seZdZdd�ZdS)�
_StderrLogcCstj|�tj|_dS)N)rr
r�stderrr)rrrr	r
Ks
z_StderrLog.__init__N)rrrr
rrrr	rJsrc@s.eZdZdd�Zddd�Zdd�Zdd	�Zd
S)�
_SyslogLogcCs.tj|�tjtjjtjd�tj	tj
�dS)Nr)rr
�syslogZopenlog�os�path�basenamer�argvZLOG_PIDZ
LOG_DAEMON)rrrr	r
Ss
	z_SyslogLog.__init__rcCs�d}|rtj}nF||jkr"tj}n4||jkr4tj}n"||jkrFtj}n||jkrVtj	}|j
d�rt|dt|�d�}t|�dkr�|dkr�tj|�ntj||�dS)N�
�r)rZ	LOG_DEBUG�INFO1ZLOG_INFO�WARNINGZLOG_WARNING�ERRORZLOG_ERR�FATALZLOG_CRIT�endswith�len)rrr
rrZpriorityrrr	ras"




z_SyslogLog.writecCstj�dS)N)rZcloselog)rrrr	rwsz_SyslogLog.closecCsdS)Nr)rrrr	rzsz_SyslogLog.flushN)r)rrrr
rrrrrrr	rRs
rc@s<eZdZdZddd�Zdd�Zddd	�Zd
d�Zdd
�ZdS)rz< FileLog class.
    File will be opened on the first write. �wcCstj|�||_||_dS)N)rr
�filename�mode)rr+r,rrr	r
�s
zFileLog.__init__cCsv|jr
dStjtjB}|jjd�r,|tjO}tj|j|d�|_tj	|jd�tj
|j|j�|_tj|jtjtj
�dS)N�ai�)rr�O_CREAT�O_WRONLYr,�
startswith�O_APPEND�openr+�fchmod�fdopen�fcntlZF_SETFDZ
FD_CLOEXEC)r�flagsrrr	r2�s
zFileLog.openrcCs(|js|j�|jj|�|jj�dS)N)rr2rr)rrr
rrrrr	r�sz
FileLog.writecCs|js
dS|jj�d|_dS)N)rr)rrrr	r�s
z
FileLog.closecCs|js
dS|jj�dS)N)rr)rrrr	r�sz
FileLog.flushN)r*)r)	rrrrr
r2rrrrrrr	rs

c@s�eZdZdZd[Zd\Zd]Zd^Zd_ZdZ	e
�Ze�Z
e�Zd`d	d
�Zdd�Zdadd�Zdbdd�Zdcdd�Zdddd�Zdd�Zdd�Zdd�Zdd�Zdd�Zd d!�Zed"fd#d$�Zed"fd%d&�Zed"fd'd(�Zed"fd)d*�Zed"fd+d,�Z ed"fd-d.�Z!d/d0�Z"d1d2�Z#d3d4�Z$d5d6�Z%d7d8�Z&d9d:�Z'd;d<�Z(d=d>�Z)d?d@�Z*dAdB�Z+dCdD�Z,dedEdF�Z-dGdH�Z.dfdIdJ�Z/ed"dfdKdL�Z0ed"dfdMdN�Z1ed"dfdOdP�Z2dgdQdR�Z3dSdT�Z4dUdV�Z5dWdX�Z6dhdYdZ�Z7d"S)iraL	
    Format string:

    %(class)s      Calling class the function belongs to, else empty
    %(date)s       Date using Logger.date_format, see time module
    %(domain)s     Full Domain: %(module)s.%(class)s.%(function)s
    %(file)s       Filename of the module
    %(function)s   Function name, empty in __main__
    %(label)s      Label according to log function call from Logger.label
    %(level)d      Internal logging level
    %(line)d       Line number in module
    %(module)s     Module name
    %(message)s    Log message

    Standard levels:

    FATAL                 Fatal error messages
    ERROR                 Error messages
    WARNING               Warning messages
    INFOx, x in [1..5]    Information
    DEBUGy, y in [1..10]  Debug messages
    NO_INFO               No info output
    NO_DEBUG              No debug output
    INFO_MAX              Maximum info level
    DEBUG_MAX             Maximum debug level

    x and y depend on info_max and debug_max from Logger class
    initialization. See __init__ function.

    Default logging targets:

    stdout        Logs to stdout
    stderr        Logs to stderr
    syslog        Logs to syslog

    Additional arguments for logging functions (fatal, error, warning, info
    and debug):

    nl       Disable newline at the end with nl=0, default is nl=1.
    fmt      Format string for this logging entry, overloads global format
             string. Example: fmt="%(file)s:%(line)d %(message)s"
    nofmt    Only output message with nofmt=1. The nofmt argument wins over
             the fmt argument.

    Example:

    from logger import log
    log.setInfoLogLevel(log.INFO1)
    log.setDebugLogLevel(log.DEBUG1)
    for i in range(1, log.INFO_MAX+1):
        log.setInfoLogLabel(i, "INFO%d: " % i)
    log.setFormat("%(date)s %(module)s:%(line)d [%(domain)s] %(label)s: "
                  "%(level)d %(message)s")
    log.setDateFormat("%Y-%m-%d %H:%M:%S")

    fl = FileLog("/tmp/log", "a")
    log.addInfoLogging("*", fl)
    log.addDebugLogging("*", fl)
    log.addInfoLogging("*", log.syslog, fmt="%(label)s%(message)s")

    log.debug3("debug3")
    log.debug2("debug2")
    log.debug1("debug1")
    log.info2("info2")
    log.info1("info1")
    log.warning("warning\n", nl=0)
    log.error("error\n", nl=0)
    log.fatal("fatal")
    log.info(log.INFO1, "nofmt info", nofmt=1)

    ����r#r�
    [binary data: the remainder of this entry is CPython 3.6 bytecode for the
    Logger class documented above (initialization, close(), the log level,
    label, format and logging-target accessors, and the internal _log() and
    _genDict() helpers); it is not human-readable.]

site-packages/firewall/core/__pycache__/fw_icmptype.cpython-36.pyc
    [binary data: compiled bytecode of firewall.core.fw_icmptype, the
    FirewallIcmpType class (get/check/add/remove ICMP types); not
    human-readable.]

site-packages/firewall/core/__pycache__/watcher.cpython-36.opt-1.pyc
    [binary data: compiled bytecode of firewall.core.watcher, the Watcher class
    wrapping Gio file and directory monitors with GLib timeouts; not
    human-readable.]

site-packages/firewall/core/__pycache__/ebtables.cpython-36.opt-1.pyc
    [binary data: compiled bytecode of firewall.core.ebtables, the ebtables
    backend (chain handling plus default, flush and policy rule builders); not
    human-readable.]

site-packages/firewall/core/__pycache__/helper.cpython-36.pyc
    [binary data: compiled bytecode of firewall.core.helper, which defines
    HELPER_MAXNAMELEN; not human-readable.]

site-packages/firewall/core/__pycache__/fw_policy.cpython-36.opt-1.pyc
    [binary data: compiled bytecode of firewall.core.fw_policy, the
    FirewallPolicy class (policies, rich rules, services, ports, protocols,
    masquerade, forward ports and ICMP blocks); not human-readable.]

site-packages/firewall/core/__pycache__/nftables.cpython-36.opt-1.pyc
    [binary data: compiled bytecode of firewall.core.nftables, the nftables
    backend that builds rules as JSON for python-nftables; the bytecode is not
    human-readable and continues below.]
z nftables._pkttype_match_fragmentcCsdddd�idddd�idddd�idddd�idddd�idddd�idddd�idddd�idddd�idddd�iddd	d�iddd	d�iddd
d�iddd
d�iddd
d�idddd�idddd�iddd
d�iddd
d�idddd�idddd�idddiidddiid�}||S)Nr�r7zhost-prohibited)r+r}znet-prohibitedzadmin-prohibitedrFznet-unreachablezhost-unreachablezport-unreachabler�zprot-unreachablezaddr-unreachablezno-router+z	tcp reset)zicmp-host-prohibitedzhost-prohibzicmp-net-prohibitedz
net-prohibzicmp-admin-prohibitedzadmin-prohibzicmp6-adm-prohibitedzadm-prohibitedzicmp-net-unreachableznet-unreachzicmp-host-unreachablezhost-unreachzicmp-port-unreachablezicmp6-port-unreachablezport-unreachzicmp-proto-unreachablez
proto-unreachzicmp6-addr-unreachablezaddr-unreachzicmp6-no-routezno-routez	tcp-resetztcp-rstr4)rTZreject_typeZfragsr4r4r5�_reject_types_fragment�s0
znftables._reject_types_fragmentcCsdddd�iS)Nr�r�zadmin-prohibited)r+r}r4)rTr4r4r5r�sznftables._reject_fragmentcCs ddddiiddddgid	�iS)
Nr)r�r`�l4protoz==r�r7rF)r.r/r0r4)rTr4r4r5�_icmp_match_fragment"sznftables._icmp_match_fragmentcCsP|siSddddd�}|j�\}}|||d�}|j�}|dk	rH||d<d|iS)	N�secondZminuteZhourZday)�s�m�h�d)�rateZper�burst�limit)Zvalue_parseZburst_parse)rTr�Zrich_to_nftr�Zdurationr�r�r4r4r5�_rich_rule_limit_fragment'sz"nftables._rich_rule_limit_fragmentcCs�t|j�tttgkrn<|jrHt|j�tttt	gkrRt
tdt|j���n
t
td��|jdkr�t|j�ttgks�t|j�tt	gkr�dSt|j�tgks�t|j�ttgkr�dSn|jdkr�dSdSdS)NzUnknown action %szNo rule action specified.rr�r�r�r�)
r+�elementrrrr�rrrrr	rrr)rT�	rich_ruler4r4r5�_rich_rule_chain_suffix?s 


z nftables._rich_rule_chain_suffixcCs>|jr|jrttd��|jdkr(dS|jdkr6dSdSdS)NzNot log or auditrrr�r�)r�auditr	rrr)rTr�r4r4r5� _rich_rule_chain_suffix_from_logUs


z)nftables._rich_rule_chain_suffix_from_logcCsddiS)Nz%%ZONE_INTERFACE%%r4)rTr4r4r5r�`sz!nftables._zone_interface_fragmentcCsNtd|�rt|�}n,td|�r@|jd�}t|d�d|d}d||d�iS)NrH�/rr<z%%ZONE_SOURCE%%)r[r\)rrr�split)rTr[r\Z
addr_splitr4r4r5r�cs



znftables._zone_source_fragmentcCs
d|jiS)Nz%%POLICY_PRIORITY%%)rr)rTr�r4r4r5r�ksz"nftables._policy_priority_fragmentcCs|s|jdkriSd|jiS)Nrz%%RICH_RULE_PRIORITY%%)rr)rTr�r4r4r5�_rich_rule_priority_fragmentnsz%nftables._rich_rule_priority_fragmentcCs�|js
iS|jjj||t�}ddd�|}|j|�}i}	|jjrPd|jj|	d<|jjr|d|jjkrhdn|jj}
d|
|	d<d	td
|||f||j	|jj
�d|	igd�}|j|j|��|d
|iiS)NrWrY)TFz%sr�Zwarning�warn�levelrJz%s_%s_%sr)r]r~rmr}rZ)
rrMr�r�r�r�r�rr�r�r�r�r�)rTr�r�r�r~r�r�r�r�Zlog_optionsrrZr4r4r5�_rich_rule_logss&
znftables._rich_rule_logc
Cs�|js
iS|jjj||t�}ddd�|}|j|�}dtd|||f||j|jj�dddiigd	�}	|	j	|j
|��|d
|	iiS)NrWrY)TFrJz%s_%s_%srrr�)r]r~rmr}rZ)r�rMr�r�r�r�r�r�r�r�r�)
rTr�r�r�r~r�r�r�r�rZr4r4r5�_rich_rule_audit�s
znftables._rich_rule_auditc
Cs�|js
iS|jjj||t�}ddd�|}|j|�}d|||f}	t|j�tkr\ddi}
�nt|j�tkr�|jjr�|j	|jj�}
nddi}
n�t|j�t
kr�ddi}
n�t|j�tk�rHd}|jjj||t�}d|||f}	|jjj
d	�}t|�d
k�r,dddd
iiddddd
ii|d
gi|dgid�i}
ndddd
ii|dd�i}
nttdt|j���dt|	||j|jj�|
gd�}|j|j|��|d|iiS)NrWrY)TFz%s_%s_%sr�r�r�r&r�r<r�r`�mark�^�&r)r`�valuezUnknown action %srJ)r]r~rmr}rZ)r�rMr�r�r�r�r+rrr�rrr�r�rer	rr�r�r�r�r�)
rTr�r�r�r~r�r�r�r�rmZrule_actionrrZr4r4r5�_rich_rule_action�sB


,znftables._rich_rule_actioncCs�|jd�r0|j|td�d�d|kr(dnd|�St|�r>d}n�td|�rNd}nvtd|�r�d}tj|dd�}d	|jj	|j
d
�i}nDtd|�r�d}t|�}n,d}|jd
�}d	t|d�t
|d�d
�i}dd||d�i|r�dnd|d�iSdS)Nzipset:r�TF�etherrGrK)�strictr�)�addrrerHrLr�rr<r)r*)r,r-z!=z==)r.r/r0)r��_set_match_fragmentrerrr�	ipaddressZIPv4NetworkZnetwork_addressZ
compressedZ	prefixlenrr�rn)rTZ
addr_fieldr\�invertr]Znormalized_addressZaddr_lenr4r4r5r��s(
&





znftables._rule_addr_fragmentcCs6|siS|d
krttd|��ddddiid|d	�iS)NrGrHzInvalid familyr)r�r`�nfprotoz==)r.r/r0)rGrH)r	r)rTZrich_familyr4r4r5�_rich_rule_family_fragment�s
z#nftables._rich_rule_family_fragmentcCs8|siS|jr|j}n|jr&d|j}|jd||jd�S)Nzipset:r�)r)r�ipsetr�r)rTZ	rich_destr\r4r4r5�_rich_rule_destination_fragment�s
z(nftables._rich_rule_destination_fragmentcCsZ|siS|jr|j}n2t|d�r.|jr.|j}nt|d�rH|jrHd|j}|jd||jd�S)N�macrzipset:r�)r)r�hasattrrrr�r)rTZrich_sourcer\r4r4r5�_rich_rule_source_fragment�s
z#nftables._rich_rule_source_fragmentcCsPt|�}t|t�r$|dkr$tt��n(t|�dkr8|dSd|d|dgiSdS)Nrr<�range)r�
isinstancernr	rre)rT�portrr4r4r5�_port_fragments
znftables._port_fragmentc	Csbddd�|}d}|jjj||t�}	g}
|r>|
j|j|j��|rT|
j|jd|��|r||
j|j|j	��|
j|j
|j��|
jdd|dd	�id
|j|�d�i�|s�t
|j�tkr�|
jddd
diiddddgid�i�g}|�r0|j|j|||||
��|j|j|||||
��|j|j|||||
��n.|j|ddtd||	f|
ddigd�ii�|S)NrWrY)TFr(r�r)r*�dport)r,r-z==)r.r/r0r�r`r�r�r��new�	untrackedrZrJz%s_%s_allowr�)r]r~rmr})rMr�r�r�r2rr]r�r�destinationr�sourcerr+r�rrrrr�)rTr�r��protorrr�r�r~r�r�r�r4r4r5�build_policy_ports_ruless:


z!nftables.build_policy_ports_rulesc	CsZddd�|}d}|jjj||t�}g}	|r>|	j|j|j��|rT|	j|jd|��|r||	j|j|j	��|	j|j
|j��|	jdddd	iid
|d�i�|s�t|j
�tkr�|	jdddd
iiddddgid�i�g}
|�r(|
j|j|||||	��|
j|j|||||	��|
j|j|||||	��n.|
j|ddtd||f|	ddigd�ii�|
S)NrWrY)TFr(r�r)r�r`r�z==)r.r/r0r�r�r�r�rrrZrJz%s_%s_allowr�)r]r~rmr})rMr�r�r�r2rr]r�rrrrr+r�rrrrr�)rTr�r�r,rr�r�r~r�r�r�r4r4r5�build_policy_protocol_rules2s8

z$nftables.build_policy_protocol_rulesc	Csbddd�|}d}|jjj||t�}	g}
|r>|
j|j|j��|rT|
j|jd|��|r||
j|j|j	��|
j|j
|j��|
jdd|dd	�id
|j|�d�i�|s�t
|j�tkr�|
jddd
diiddddgid�i�g}|�r0|j|j|||||
��|j|j|||||
��|j|j|||||
��n.|j|ddtd||	f|
ddigd�ii�|S)NrWrY)TFr(r�r)r*�sport)r,r-z==)r.r/r0r�r`r�r�r�rrrZrJz%s_%s_allowr�)r]r~rmr})rMr�r�r�r2rr]r�rrrrrr+r�rrrrr�)rTr�r�rrrr�r�r~r�r�r�r4r4r5�build_policy_source_ports_rulesUs:


z(nftables.build_policy_source_ports_rulesc
	Cs�d}|jjj||t�}	ddd�|}
g}|rR|jdddtd||f||d�ii�g}|rl|j|jd	|��|jd
d|dd
�id|j|�d�i�|jdd||fi�|j|
ddtd|	|d�ii�|S)Nr(rWrY)TFz	ct helperrJzhelper-%s-%s)r]r~r�r+r,r�r)r*r)r,r-z==)r.r/r0rZzfilter_%s_allow)r]r~rmr})rMr�r�r�r2r�r�r)
rTr�r�rrrZhelper_nameZmodule_short_namer~r�r�r�r�r4r4r5�build_policy_helper_ports_ruleszs.



z(nftables.build_policy_helper_ports_rulescCs�ddd�|}|jjj||t�}g}	|rv|t|�ddkrT|dt|�d�d}ddd	d
iid|d�id
dig}
n|jd|�d
dig}
dtd||
d�}|	j|d|ii�|	S)NrWrY)TFr<r�r�r)r�r`r�z==)r.r/r0r�r�rJzfilter_%s_allow)r]r~rmr}rZ)rMr�r�r�rer�r�r2)rTr�r[r�r~r�rr�r�r�r}rZr4r4r5�build_zone_forward_rules�s"z!nftables.build_zone_forward_rulesc	Cs�d}|jjj||tdd�}ddd�|}g}|r`|j|j|j��|j|j|j��|j	|�}	nd}	|t
d||	f|d	d
ddiid
dd�iddigd�}
|
j|j|��|d|
iigS)Nr'T)r�rWrY)TFr�z	nat_%s_%sr)r�r`r�z!=r�)r.r/r0Z
masquerade)r]r~rmr}rZ)
rMr�r�r�r2rrrrr�r�r�r�)rTr�r�r]r�r~r�r�r�r�rZr4r4r5�"_build_policy_masquerade_nat_rules�s&
z+nftables._build_policy_masquerade_nat_rulesc
Cs^g}|rD|jr|jdks,|jrDtd|jj�rD|j|j||d|��nV|r�|jrX|jdksl|jr�td|jj�r�|j|j||d|��n|j|j||d|��d}|jjj||t	�}ddd�|}g}|r�|j
|j|j��|j
|j
|j��|j|�}	nd	}	d
td||	f|dd
ddiiddddgid�iddigd�}
|
j|j|��|j
|d|
ii�|S)NrHrLrGrKr(rWrY)TFr�rJzfilter_%s_%sr)r�r`r�r�r�rr)r.r/r0r�)r]r~rmr}rZ)r]rrrr�r&rMr�r�r�r2rrrr�r�r�r�)rTr�r�r�r�r~r�r�r�r�rZr4r4r5�build_policy_masquerade_rules�s8
z&nftables.build_policy_masquerade_rulesc	Cs$d}	|jjj||	t�}
ddd�|}g}|r\|j|j|j��|j|j|j��|j	|�}
nd}
|jdd|dd	�id
|j
|�d�i�|r�td|�r�t|�}|r�|d
kr�|jd||j
|�d�i�q�|jdd|ii�n|jdd|j
|�ii�|t
d|
|
f|d�}|j|j|��|d|iigS)Nr'rWrY)TFr�r)r*r)r,r-z==)r.r/r0rHr�r�)rrrr;rz	nat_%s_%s)r]r~rmr}rZ)rMr�r�r�r2rrrrr�rrrr�r�r�)rTr�r�rr,�toaddr�toportr]r�r~r�r�r�r�rZr4r4r5�$_build_policy_forward_port_nat_rules�s4


z-nftables._build_policy_forward_port_nat_rulesc	
Cs�g}|rF|jr|jdks&|rFtd|�rF|j|j||||||d|��n�|r�|jrZ|jdksh|r�td|�r�|j|j||||||d|��nL|r�td|�r�|j|j||||||d|��n|j|j||||||d|��|S)NrHrLrGrK)r]rr�r*)	rTr�r�rr,r)r(r�r�r4r4r5�build_policy_forward_port_rulessz(nftables.build_policy_forward_port_rulescCs2|t|krt||Sttd||j|f��dS)Nz)ICMP type '%s' not supported by %s for %s)r�r	rr�)rTr�Z	icmp_typer4r4r5�_icmp_types_to_nft_fragments(sz%nftables._icmp_types_to_nft_fragmentscCsBd}|jjj||t�}ddd�|}|r6|jr6|j}n<|jrjg}d|jkrT|jd�d|jkrr|jd�nddg}g}	�x�|D�]�}
|jjj|�r�d||f}ddi}nd	||f}|j�}g}
|r�|
j|j	|j
��|
j|j|j��|
j|j|j
��|
j|j|
|j��|�r�|	j|j|||||
��|	j|j|||||
��|j�rf|	j|j|||||
��nN|j|�}d
td|||f|
|j�gd�}|j|j|��|	j|d
|ii�q~|jj�dk�r|jjj|��r|	j|d
d
t||
|j|jj��ddd||fiigd�ii�|	j|d
d
t||
|gd�ii�q~W|	S)Nr(rWrY)TFrGrHz%s_%s_allowr�z
%s_%s_denyrJz%s_%s_%s)r]r~rmr}rZr�rr�z"%s_%s_ICMP_BLOCK: ")rMr�r�r��ipvsrr2�query_icmp_block_inversionr�rr]rrrr�r,r�rrr�rr�r�r�r�r�r�)rTr�r�Zictr�r~r�r�r-r�r�Zfinal_chainr�r�r�rZr4r4r5�build_policy_icmp_block_rules/sb





"
"
z&nftables.build_policy_icmp_block_rulescCs�d}|jjj||t�}g}ddd�|}|jjj|�r@|j�}nddi}|j|ddtd||fd	|j�|gd
�ii�|jj	�dkr�|jjj|�r�|j|ddtd||fd	|j�|j
|jj	��dd
d||fiigd
�ii�|S)Nr(rWrY)TFr�rZrJz%s_%sr9)r]r~rmrar}r�rr�z%s_%s_ICMP_BLOCK: )rMr�r�r�r.r�r2r�r�r�r�)rTr�r�r~r�r�r�r�r4r4r5�'build_policy_icmp_block_inversion_rulesks,




 z0nftables.build_policy_icmp_block_inversion_rulescCs�g}ddddiiddd�iddd	d
dgdd
�iddd�ig}|dkrV|jdddii�|jddi�|jdddtd|d�ii�|jdddtdddddd�iddddgid�id digd�ii�|S)!Nr)r�r`rz==rH)r.r/r0Zfibr�ZiifrZoif)�flags�resultFr�rr�zrpfilter_DROP: r�rXrZrJZfilter_PREROUTING)r]r~rmr}r*rFr+)r,r-r�znd-router-advertznd-neighbor-solicitr�)r2r�)rTr�r�r�r4r4r5�build_rpfilter_rules�s0

znftables.build_rpfilter_rulesc	Cs�ddddddddd	g	}d
d�|D�}dd
ddd�idd|id�ig}|jjd"krb|jdddii�|j|jd��g}|jdddtdd|d�ii�|jdddtd d!|d�ii�|S)#Nz::0.0.0.0/96z::ffff:0.0.0.0/96z2002:0000::/24z2002:0a00::/24z2002:7f00::/24z2002:ac10::/28z2002:c0a8::/32z2002:a9fe::/32z2002:e000::/19cSs2g|]*}d|jd�dt|jd�d�d�i�qS)r�r�rr<)rre)r�rn)�.0r^r4r4r5�
<listcomp>�sz5nftables.build_rfc3964_ipv4_rules.<locals>.<listcomp>r)r*rLr�)r,r-z==r�)r.r/r0r�r�rr�zRFC3964_IPv4_REJECT: zaddr-unreachrWrZrJr�r<)r]r~rmrar}Zfilter_FORWARDrB)r�r�)rMZ_log_deniedr2r�r�)rTZ	daddr_setr�r�r4r4r5�build_rfc3964_ipv4_rules�s:

z!nftables.build_rfc3964_ipv4_rulescCs�d}g}|j|j|j��|j|j|j��|j|j|j��g}|j|j|||||��|j|j|||||��|j|j	|||||��|S)Nr()
r2rr]rrrrrrr)rTr�r�r�r~r�r�r4r4r5�*build_policy_rich_source_destination_rules�sz3nftables.build_policy_rich_source_destination_rulescCs|dkrdSdS)NrGrH�ebTF)rGrHr8r4)rTr�r4r4r5�is_ipv_supported�sznftables.is_ipv_supportedc
Cs�ddd�}||||ddg||dd||g||dd||g||dg||||||g||ddg||dd||g||dgdd	�}||kr�||Sttd
|��dS)NZ	ipv4_addrZ	ipv6_addr)rGrHZ
inet_protoZinet_servicerZifnameZ
ether_addr)zhash:ipzhash:ip,portzhash:ip,port,ipzhash:ip,port,netzhash:ip,markzhash:netzhash:net,netz
hash:net,portzhash:net,port,netzhash:net,ifacezhash:macz!ipset type name '%s' is not valid)r	r
)rTr�r+Zipv_addr�typesr4r4r5�_set_type_list�s"

znftables._set_type_listc
Cs�|rd|kr|ddkrd}nd}t||j||�d�}x0|jd�djd�D]}|dkrLdg|d
<PqLW|r�d|kr�|d|d<d|kr�|d|d<g}x0dD](}d|i}	|	j|�|jdd|	ii�q�W|S)Nr]�inet6rHrG)r~r�r+�:r<�,rK�netrZintervalr1ZtimeoutZmaxelem�sizerJrLrWr�)rKr?r)rJrKrL)r�r;r�r�r2)
rTr�r+�optionsr�Zset_dict�tr�r]Z	rule_dictr4r4r5�build_set_create_rules�s*


znftables.build_set_create_rulescCs$|j|||�}|j||jj��dS)N)rCr�rMr�)rTr�r+rAr�r4r4r5�
set_createsznftables.set_createcCs8x2dD]*}dd|t|d�ii}|j||jj��qWdS)NrJrKrLrYr�)r]r~r�)rJrKrL)r�r�rMr�)rTr�r]rZr4r4r5�set_destroys

znftables.set_destroycCs6|jjj|�jjd�djd�}g}x�tt|��D]�}||dkrr|jdddii�|jdd	|rdd
ndd�i�q2||dkr�|jd|j|�|r�dndd�i�q2||dkr�|jdd|r�dndii�q2||dkr�|jdddii�q2t	d||��q2Wdt|�dk�rd|in|d|�r&dndd|d�iS)Nr=r<r>rr�r`r�r*Zthrr")r,r-rKr?rr�r�Zifacer�r�rz-Unsupported ipset type for match fragment: %sr)�concatrz!=z==�@)r.r/r0)rKr?r)
rMr�	get_ipsetr+r�rrer2r�r	)rTr�Z
match_destr�type_formatr3�ir4r4r5r s$ znftables._set_match_fragmentcCsN|jjj|�}|jjd�djd�}|jd�}t|�t|�krHttd��g}�x�tt|��D�]�}||dk�r,y||j	d�}Wn&t
k
r�|jd�||}	Yn,X|j||d|��|||dd�}	y|	j	d�}Wn t
k
�r|j|	�Yn(X|jd|	d|�|	|dd�gi�q\||dk�r d||k�rb|jd||jd�i�n�y||j	d�}WnLt
k
�r�||}
d|jk�r�|jdd
k�r�t
|
�}
|j|
�Yn^X||d|�}
d|jk�r�|jdd
k�r�t
|
�}
|jd|
t|||dd��d�i�q\|j||�q\Wt|�dk�rJd|igS|S)Nr=r<r>z+Number of values does not match ipset type.rZtcp�-rrKr?r�r]r<r�)rrerF)rKr?)rMrrHr+r�rer	rrrar�r2rArrn)rTr��entry�objrIZentry_tokensZfragmentrJraZport_strrr4r4r5�_set_entry_fragment7sL

("znftables._set_entry_fragmentc	Cs>g}|j||�}x(dD] }|jdd|t||d�ii�qW|S)NrJrKrLrWr�)r]r~r��elem)rJrKrL)rNr2r�)rTr�rLr�r�r]r4r4r5�build_set_add_rulesks

znftables.build_set_add_rulescCs"|j||�}|j||jj��dS)N)rPr�rMr�)rTr�rLr�r4r4r5�set_addusznftables.set_addcCsF|j||�}x4dD],}dd|t||d�ii}|j||jj��qWdS)NrJrKrLrYr�)r]r~r�rO)rJrKrL)rNr�r�rMr�)rTr�rLr�r]rZr4r4r5�
set_deleteys
znftables.set_deletecCs4g}x*dD]"}dd|t|d�ii}|j|�q
W|S)NrJrKrLr{r�)r]r~r�)rJrKrL)r�r2)rTr�r�r]rZr4r4r5�build_set_flush_rules�s
znftables.build_set_flush_rulescCs |j|�}|j||jj��dS)N)rSr�rMr�)rTr�r�r4r4r5�	set_flush�s
znftables.set_flushcCsJ|jjj|�}|jdkrd}n(|jrBd|jkrB|jddkrBd}nd}|S)Nzhash:macr	r]r<rLrK)rMrrHr+rA)rTr�rr]r4r4r5r��s
znftables._set_get_familyc	Cs�g}|j|j|||��|j|j|��d}x^|D]D}|j|j||��|d7}|dkr2|j||jj��|j�d}q2W|j||jj��dS)Nrr<i�)r�rCrSrPr�rMr��clear)	rTZset_nameZ	type_nameZentriesZcreate_optionsZ
entry_optionsr��chunkrLr4r4r5�set_restore�s
znftables.set_restore)N)N)r�)rJ)FrJ)rJ)rJ)F)NN)NN)NN)NN)N)N)N)N)N)F)N)N)F)NN)I�__name__�
__module__�__qualname__r�Zpolicies_supportedrVrhrlrtrzr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrrr�rrrrr r!r#r$r%r&r'r*r+r,r/r0r3r6r7r9r;rCrDrErrNrPrQrRrSrTr�rWr4r4r4r5rI�s�/.`

4


R
i
;
-
9
 +


	
$
$
$


'
$

<
#


4
		rIij���i����)N)(Z
__future__rrirwr
Zfirewall.core.loggerrZfirewall.functionsrrrrrZfirewall.errorsr	r
rrr
rrZfirewall.core.richrrrrrrrZnftables.nftablesrr�r�r�r�r�r6r��objectrIr4r4r4r5�<module>s�$$





































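
# Illustrative sketch (not part of the archive): the byte-compiled backend
# above assembles rule dictionaries in the libnftables JSON form. Table and
# chain names below are assumptions for illustration; only the general
# {"add": {"rule": ...}} shape and the ct-state match are taken from the
# string constants visible in the bytecode.
example_rule = {
    "add": {
        "rule": {
            "family": "inet",
            "table": "firewalld",
            "chain": "filter_INPUT",
            "expr": [
                {"match": {"left": {"ct": {"key": "state"}},
                           "op": "in",
                           "right": {"set": ["established", "related"]}}},
                {"accept": None},
            ],
        }
    }
}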
site-packages/firewall/core/icmp.py
# -*- coding: utf-8 -*-
#
# Copyright (C) 2017 Red Hat, Inc.
#
# Authors:
# Thomas Woerner <twoerner@redhat.com>
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program.  If not, see <http://www.gnu.org/licenses/>.
#

__all__ = [ "ICMP_TYPES", "ICMPV6_TYPES",
            "check_icmp_type", "check_icmpv6_type" ]

ICMP_TYPES = {
     "echo-reply": "0/0",
     "pong": "0/0",
     "network-unreachable": "3/0",
     "host-unreachable": "3/1",
     "protocol-unreachable": "3/2",
     "port-unreachable": "3/3",
     "fragmentation-needed": "3/4",
     "source-route-failed": "3/5",
     "network-unknown": "3/6",
     "host-unknown": "3/7",
     "network-prohibited": "3/9",
     "host-prohibited": "3/10",
     "TOS-network-unreachable": "3/11",
     "TOS-host-unreachable": "3/12",
     "communication-prohibited": "3/13",
     "host-precedence-violation": "3/14",
     "precedence-cutoff": "3/15",
     "source-quench": "4/0",
     "network-redirect": "5/0",
     "host-redirect": "5/1",
     "TOS-network-redirect": "5/2",
     "TOS-host-redirect": "5/3",
     "echo-request": "8/0",
     "ping": "8/0",
     "router-advertisement": "9/0",
     "router-solicitation": "10/0",
     "ttl-zero-during-transit": "11/0",
     "ttl-zero-during-reassembly": "11/1",
     "ip-header-bad": "12/0",
     "required-option-missing": "12/1",
     "timestamp-request": "13/0",
     "timestamp-reply": "14/0",
     "address-mask-request": "17/0",
     "address-mask-reply": "18/0",
}

ICMPV6_TYPES = {
    "no-route": "1/0",
    "communication-prohibited": "1/1",
    "address-unreachable": "1/3",
    "port-unreachable": "1/4",
    "packet-too-big": "2/0",
    "ttl-zero-during-transit": "3/0",
    "ttl-zero-during-reassembly": "3/1",
    "bad-header": "4/0",
    "unknown-header-type": "4/1",
    "unknown-option": "4/2",
    "echo-request": "128/0",
    "ping": "128/0",
    "echo-reply": "129/0",
    "pong": "129/0",
    "router-solicitation": "133/0",
    "router-advertisement": "134/0",
    "neighbour-solicitation": "135/0",
    "neigbour-solicitation": "135/0",
    "neighbour-advertisement": "136/0",
    "neigbour-advertisement": "136/0",
    "redirect": "137/0",
}

def check_icmp_name(_name):
    if _name in ICMP_TYPES:
        return True
    return False

def check_icmp_type(_type):
    if _type in ICMP_TYPES.values():
        return True
    return False

def check_icmpv6_name(_name):
    if _name in ICMPV6_TYPES:
        return True
    return False

def check_icmpv6_type(_type):
    if _type in ICMPV6_TYPES.values():
        return True
    return False
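
# Minimal usage sketch (not part of the original module): type names map to
# "type/code" strings, so name lookups and numeric checks can be combined.
if __name__ == '__main__':
    assert check_icmp_name("echo-request")
    assert ICMP_TYPES["echo-request"] == "8/0"
    assert check_icmp_type("8/0")
    assert check_icmpv6_type("128/0")
    assert not check_icmp_type("999/0")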
site-packages/firewall/core/logger.py
# -*- coding: utf-8 -*-
#
# Copyright (C) 2005-2007,2012 Red Hat, Inc.
#
# Authors:
# Thomas Woerner <twoerner@redhat.com>
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program.  If not, see <http://www.gnu.org/licenses/>.
#

__all__ = [ "LogTarget", "FileLog", "Logger", "log" ]

import sys
import types
import time
import inspect
import fnmatch
import syslog
import traceback
import fcntl
import os.path
import os

# ---------------------------------------------------------------------------

# abstract class for logging targets
class LogTarget(object):
    """ Abstract class for logging targets. """
    def __init__(self):
        self.fd = None

    def write(self, data, level, logger, is_debug=0):
        raise NotImplementedError("LogTarget.write is an abstract method")

    def flush(self):
        raise NotImplementedError("LogTarget.flush is an abstract method")

    def close(self):
        raise NotImplementedError("LogTarget.close is an abstract method")

# ---------------------------------------------------------------------------

# private class for stdout
class _StdoutLog(LogTarget):
    def __init__(self):
        LogTarget.__init__(self)
        self.fd = sys.stdout

    def write(self, data, level, logger, is_debug=0):
        # ignore level
        self.fd.write(data)
        self.flush()

    def close(self):
        self.flush()

    def flush(self):
        self.fd.flush()

# ---------------------------------------------------------------------------

# private class for stderr
class _StderrLog(_StdoutLog):
    def __init__(self):
        _StdoutLog.__init__(self)
        self.fd = sys.stderr

# ---------------------------------------------------------------------------

# private class for syslog
class _SyslogLog(LogTarget):
    def __init__(self):
        # Only initialize LogTarget here as fd should be None
        LogTarget.__init__(self)
        #
        # Derived from: https://github.com/canvon/firewalld/commit/af0edfee1cc1891b7b13f302ca5911b24e9b0f13
        #
        # Work around Python issue 27875, "Syslogs /usr/sbin/foo as /foo
        # instead of as foo"
        # (but using openlog explicitly might be better anyway)
        #
        # Set ident to basename, log PID as well, and log to facility "daemon".
        syslog.openlog(os.path.basename(sys.argv[0]),
                       syslog.LOG_PID, syslog.LOG_DAEMON)

    def write(self, data, level, logger, is_debug=0):
        priority = None
        if is_debug:
            priority = syslog.LOG_DEBUG
        else:
            if level >= logger.INFO1:
                priority = syslog.LOG_INFO
            elif level == logger.WARNING:
                priority = syslog.LOG_WARNING
            elif level == logger.ERROR:
                priority = syslog.LOG_ERR
            elif level == logger.FATAL:
                priority = syslog.LOG_CRIT

        if data.endswith("\n"):
            data = data[:len(data)-1]
        if len(data) > 0:
            if priority is None:
                syslog.syslog(data)
            else:
                syslog.syslog(priority, data)

    def close(self):
        syslog.closelog()

    def flush(self):
        pass

# ---------------------------------------------------------------------------

class FileLog(LogTarget):
    """ FileLog class.
    File will be opened on the first write. """
    def __init__(self, filename, mode="w"):
        LogTarget.__init__(self)
        self.filename = filename
        self.mode = mode

    def open(self):
        if self.fd:
            return
        flags = os.O_CREAT | os.O_WRONLY
        if self.mode.startswith('a'):
            flags |= os.O_APPEND
        self.fd = os.open(self.filename, flags, 0o640)
        # Make sure that existing file has correct perms
        os.fchmod(self.fd, 0o640)
        # Make it an object
        self.fd = os.fdopen(self.fd, self.mode)
        fcntl.fcntl(self.fd, fcntl.F_SETFD, fcntl.FD_CLOEXEC)

    def write(self, data, level, logger, is_debug=0):
        if not self.fd:
            self.open()
        self.fd.write(data)
        self.fd.flush()

    def close(self):
        if not self.fd:
            return
        self.fd.close()
        self.fd = None

    def flush(self):
        if not self.fd:
            return
        self.fd.flush()

# ---------------------------------------------------------------------------

class Logger(object):
    r"""
    Format string:

    %(class)s      Calling class the function belongs to, else empty
    %(date)s       Date using Logger.date_format, see time module
    %(domain)s     Full Domain: %(module)s.%(class)s.%(function)s
    %(file)s       Filename of the module
    %(function)s   Function name, empty in __main__
    %(label)s      Label according to log function call from Logger.label
    %(level)d      Internal logging level
    %(line)d       Line number in module
    %(module)s     Module name
    %(message)s    Log message

    Standard levels:

    FATAL                 Fatal error messages
    ERROR                 Error messages
    WARNING               Warning messages
    INFOx, x in [1..5]    Information
    DEBUGy, y in [1..10]  Debug messages
    NO_INFO               No info output
    NO_DEBUG              No debug output
    INFO_MAX              Maximum info level
    DEBUG_MAX             Maximum debug level

    x and y depend on info_max and debug_max from Logger class
    initialization. See __init__ function.

    Default logging targets:

    stdout        Logs to stdout
    stderr        Logs to stderr
    syslog        Logs to syslog

    Additional arguments for logging functions (fatal, error, warning, info
    and debug):

    nl       Disable newline at the end with nl=0, default is nl=1.
    fmt      Format string for this logging entry, overloads global format
             string. Example: fmt="%(file)s:%(line)d %(message)s"
    nofmt    Only output message with nofmt=1. The nofmt argument wins over
             the fmt argument.

    Example:

    from logger import log
    log.setInfoLogLevel(log.INFO1)
    log.setDebugLogLevel(log.DEBUG1)
    for i in range(1, log.INFO_MAX+1):
        log.setInfoLogLabel(i, "INFO%d: " % i)
    log.setFormat("%(date)s %(module)s:%(line)d [%(domain)s] %(label)s: "
                  "%(level)d %(message)s")
    log.setDateFormat("%Y-%m-%d %H:%M:%S")

    fl = FileLog("/tmp/log", "a")
    log.addInfoLogging("*", fl)
    log.addDebugLogging("*", fl)
    log.addInfoLogging("*", log.syslog, fmt="%(label)s%(message)s")

    log.debug3("debug3")
    log.debug2("debug2")
    log.debug1("debug1")
    log.info2("info2")
    log.info1("info1")
    log.warning("warning\n", nl=0)
    log.error("error\n", nl=0)
    log.fatal("fatal")
    log.info(log.INFO1, "nofmt info", nofmt=1)

    """

    ALL       = -5
    NOTHING   = -4
    FATAL     = -3
    TRACEBACK = -2
    ERROR     = -1
    WARNING   =  0

    # Additional levels are generated in class initialization

    stdout = _StdoutLog()
    stderr = _StderrLog()
    syslog = _SyslogLog()

    def __init__(self, info_max=5, debug_max=10):
        """ Logger class initialization """
        self._level = { }
        self._debug_level = { }
        self._format = ""
        self._date_format = ""
        self._label = { }
        self._debug_label = { }
        self._logging = { }
        self._debug_logging = { }
        self._domains = { }
        self._debug_domains = { }

        # INFO1 is required for standard log level
        if info_max < 1:
            raise ValueError("Logger: info_max %d is too low" % info_max)
        if debug_max < 0:
            raise ValueError("Logger: debug_max %d is too low" % debug_max)

        self.NO_INFO   = self.WARNING # = 0
        self.INFO_MAX  = info_max
        self.NO_DEBUG  = 0
        self.DEBUG_MAX = debug_max

        self.setInfoLogLabel(self.FATAL, "FATAL ERROR: ")
        self.setInfoLogLabel(self.TRACEBACK, "")
        self.setInfoLogLabel(self.ERROR, "ERROR: ")
        self.setInfoLogLabel(self.WARNING, "WARNING: ")

        # generate info levels and infox functions
        for _level in range(1, self.INFO_MAX+1):
            setattr(self, "INFO%d" % _level, _level)
            self.setInfoLogLabel(_level, "")
            setattr(self, "info%d" % (_level),
                    (lambda self, x:
                     lambda message, *args, **kwargs:
                     self.info(x, message, *args, **kwargs))(self, _level)) # pylint: disable=E0602

        # generate debug levels and debugx functions
        for _level in range(1, self.DEBUG_MAX+1):
            setattr(self, "DEBUG%d" % _level, _level)
            self.setDebugLogLabel(_level, "DEBUG%d: " % _level)
            setattr(self, "debug%d" % (_level),
                    (lambda self, x:
                     lambda message, *args, **kwargs:
                     self.debug(x, message, *args, **kwargs))(self, _level)) # pylint: disable=E0602

        # set initial log levels, formats and targets
        self.setInfoLogLevel(self.INFO1)
        self.setDebugLogLevel(self.NO_DEBUG)
        self.setFormat("%(label)s%(message)s")
        self.setDateFormat("%d %b %Y %H:%M:%S")
        self.setInfoLogging("*", self.stderr, [ self.FATAL, self.ERROR,
                                                self.WARNING ])
        self.setInfoLogging("*", self.stdout,
                            [ i for i in range(self.INFO1, self.INFO_MAX+1) ])
        self.setDebugLogging("*", self.stdout,
                             [ i for i in range(1, self.DEBUG_MAX+1) ])

    def close(self):
        """ Close all logging targets """
        for level in range(self.FATAL, self.DEBUG_MAX+1):
            if level not in self._logging:
                continue
            for (dummy, target, dummy) in self._logging[level]:
                target.close()

    def getInfoLogLevel(self, domain="*"):
        """ Get info log level. """
        self._checkDomain(domain)
        if domain in self._level:
            return self._level[domain]
        return self.NOTHING

    def setInfoLogLevel(self, level, domain="*"):
        """ Set log level [NOTHING .. INFO_MAX] """
        self._checkDomain(domain)
        if level < self.NOTHING:
            level = self.NOTHING
        if level > self.INFO_MAX:
            level = self.INFO_MAX
        self._level[domain] = level

    def getDebugLogLevel(self, domain="*"):
        """ Get debug log level. """
        self._checkDomain(domain)
        if domain in self._debug_level:
            return self._debug_level[domain] + self.NO_DEBUG
        return self.NO_DEBUG

    def setDebugLogLevel(self, level, domain="*"):
        """ Set debug log level [NO_DEBUG .. DEBUG_MAX] """
        self._checkDomain(domain)
        if level < 0:
            level = 0
        if level > self.DEBUG_MAX:
            level = self.DEBUG_MAX
        self._debug_level[domain] = level - self.NO_DEBUG

    def getFormat(self):
        return self._format

    def setFormat(self, _format):
        self._format = _format

    def getDateFormat(self):
        return self._date_format

    def setDateFormat(self, _format):
        self._date_format = _format

    def setInfoLogLabel(self, level, label):
        """ Set log label for level. Level can be a single level or an array
        of levels. """
        levels = self._getLevels(level)
        for level in levels:
            self._checkLogLevel(level, min_level=self.FATAL,
                                max_level=self.INFO_MAX)
            self._label[level] = label

    def setDebugLogLabel(self, level, label):
        """ Set log label for level. Level can be a single level or an array
        of levels. """
        levels = self._getLevels(level, is_debug=1)
        for level in levels:
            self._checkLogLevel(level, min_level=self.INFO1,
                                max_level=self.DEBUG_MAX)
            self._debug_label[level] = label

    def setInfoLogging(self, domain, target, level=ALL, fmt=None):
        """ Set info log target for domain and level. Level can be a single
        level or an array of levels. Use level ALL to set for all levels.
        If no format is specified, the default format will be used. """
        self._setLogging(domain, target, level, fmt, is_debug=0)

    def setDebugLogging(self, domain, target, level=ALL, fmt=None):
        """ Set debug log target for domain and level. Level can be a single
        level or an array of levels. Use level ALL to set for all levels.
        If no format is specified, the default format will be used. """
        self._setLogging(domain, target, level, fmt, is_debug=1)

    def addInfoLogging(self, domain, target, level=ALL, fmt=None):
        """ Add info log target for domain and level. Level can be a single
        level or an array of levels. Use level ALL to set for all levels.
        If no format is specified, the default format will be used. """
        self._addLogging(domain, target, level, fmt, is_debug=0)

    def addDebugLogging(self, domain, target, level=ALL, fmt=None):
        """ Add debug log target for domain and level. Level can be a single
        level or an array of levels. Use level ALL to set for all levels.
        If no format is specified, the default format will be used. """
        self._addLogging(domain, target, level, fmt, is_debug=1)

    def delInfoLogging(self, domain, target, level=ALL, fmt=None):
        """ Delete info log target for domain and level. Level can be a single
        level or an array of levels. Use level ALL to set for all levels.
        If no format is specified, the default format will be used. """
        self._delLogging(domain, target, level, fmt, is_debug=0)

    def delDebugLogging(self, domain, target, level=ALL, fmt=None):
        """ Delete debug log target for domain and level. Level can be a single
        level or an array of levels. Use level ALL to set for all levels.
        If no format is specified, the default format will be used. """
        self._delLogging(domain, target, level, fmt, is_debug=1)

    def isInfoLoggingHere(self, level):
        """ Is there currently any info logging for this log level (and
        domain)? """
        return self._isLoggingHere(level, is_debug=0)

    def isDebugLoggingHere(self, level):
        """ Is there currently any debug logging for this log level (and
        domain)? """
        return self._isLoggingHere(level, is_debug=1)

    ### log functions

    def fatal(self, _format, *args, **kwargs):
        """ Fatal error log. """
        self._checkKWargs(kwargs)
        kwargs["is_debug"] = 0
        self._log(self.FATAL, _format, *args, **kwargs)

    def error(self, _format, *args, **kwargs):
        """ Error log. """
        self._checkKWargs(kwargs)
        kwargs["is_debug"] = 0
        self._log(self.ERROR, _format, *args, **kwargs)

    def warning(self, _format, *args, **kwargs):
        """ Warning log. """
        self._checkKWargs(kwargs)
        kwargs["is_debug"] = 0
        self._log(self.WARNING, _format, *args, **kwargs)

    def info(self, level, _format, *args, **kwargs):
        """ Information log using info level [1..info_max].
        There are additional infox functions according to info_max from
        __init__"""
        self._checkLogLevel(level, min_level=1, max_level=self.INFO_MAX)
        self._checkKWargs(kwargs)
        kwargs["is_debug"] = 0
        self._log(level+self.NO_INFO, _format, *args, **kwargs)

    def debug(self, level, _format, *args, **kwargs):
        """ Debug log using debug level [1..debug_max].
        There are additional debugx functions according to debug_max
        from __init__"""
        self._checkLogLevel(level, min_level=1, max_level=self.DEBUG_MAX)
        self._checkKWargs(kwargs)
        kwargs["is_debug"] = 1
        self._log(level, _format, *args, **kwargs)

    def exception(self):
        self._log(self.TRACEBACK, traceback.format_exc(), args=[], kwargs={})

    ### internal functions

    def _checkLogLevel(self, level, min_level, max_level):
        if level < min_level or level > max_level:
            raise ValueError("Level %d out of range, should be [%d..%d]." % \
                             (level, min_level, max_level))

    def _checkKWargs(self, kwargs):
        if not kwargs:
            return
        for key in kwargs.keys():
            if key not in [ "nl", "fmt", "nofmt" ]:
                raise ValueError("Key '%s' is not allowed as argument for logging." % key)

    def _checkDomain(self, domain):
        if not domain or domain == "":
            raise ValueError("Domain '%s' is not valid." % domain)

    def _getLevels(self, level, is_debug=0):
        """ Generate log level array. """
        if level != self.ALL:
            if isinstance(level, list) or isinstance(level, tuple):
                levels = level
            else:
                levels = [ level ]
            for level in levels:
                if is_debug:
                    self._checkLogLevel(level, min_level=1,
                                        max_level=self.DEBUG_MAX)
                else:
                    self._checkLogLevel(level, min_level=self.FATAL,
                                        max_level=self.INFO_MAX)
        else:
            if is_debug:
                levels = [ i for i in range(self.DEBUG1, self.DEBUG_MAX) ]
            else:
                levels = [ i for i in range(self.FATAL, self.INFO_MAX) ]
        return levels

    def _getTargets(self, target):
        """ Generate target array. """
        if isinstance(target, list) or isinstance(target, tuple):
            targets = target
        else:
            targets = [ target ]
        for _target in targets:
            if not issubclass(_target.__class__, LogTarget):
                raise ValueError("'%s' is no valid logging target." % \
                      _target.__class__.__name__)
        return targets

    def _genDomains(self, is_debug=0):
        # private method for self._domains array creation, speeds up
        """ Generate dict with domain by level. """
        if is_debug:
            _domains = self._debug_domains
            _logging = self._debug_logging
            _range = ( 1, self.DEBUG_MAX+1 )
        else:
            _domains = self._domains
            _logging = self._logging
            _range = ( self.FATAL, self.INFO_MAX+1 )

        if len(_domains) > 0:
            _domains.clear()

        for level in range(_range[0], _range[1]):
            if level not in _logging:
                continue
            for (domain, dummy, dummy) in _logging[level]:
                if domain not in _domains:
                    _domains.setdefault(level, [ ]).append(domain)

    def _setLogging(self, domain, target, level=ALL, fmt=None, is_debug=0):
        self._checkDomain(domain)
        levels = self._getLevels(level, is_debug)
        targets = self._getTargets(target)

        if is_debug:
            _logging = self._debug_logging
        else:
            _logging = self._logging

        for level in levels:
            for target in targets:
                _logging[level] = [ (domain, target, fmt) ]
        self._genDomains(is_debug)

    def _addLogging(self, domain, target, level=ALL, fmt=None, is_debug=0):
        self._checkDomain(domain)
        levels = self._getLevels(level, is_debug)
        targets = self._getTargets(target)

        if is_debug:
            _logging = self._debug_logging
        else:
            _logging = self._logging

        for level in levels:
            for target in targets:
                _logging.setdefault(level, [ ]).append((domain, target, fmt))
        self._genDomains(is_debug)

    def _delLogging(self, domain, target, level=ALL, fmt=None, is_debug=0):
        self._checkDomain(domain)
        levels = self._getLevels(level, is_debug)
        targets = self._getTargets(target)

        if is_debug:
            _logging = self._debug_logging
        else:
            _logging = self._logging

        for _level in levels:
            for target in targets:
                if _level not in _logging:
                    continue
                if (domain, target, fmt) in _logging[_level]:
                    _logging[_level].remove( (domain, target, fmt) )
                    if len(_logging[_level]) == 0:
                        del _logging[_level]
                        continue
                if level != self.ALL:
                    raise ValueError("No matching logging for " \
                          "level %d, domain %s, target %s and format %s." % \
                          (_level, domain, target.__class__.__name__, fmt))
        self._genDomains(is_debug)

    def _isLoggingHere(self, level, is_debug=0):
        _dict = self._genDict(level, is_debug)
        if not _dict:
            return False

        point_domain = _dict["domain"] + "."

        if is_debug:
            _logging = self._debug_logging
        else:
            _logging = self._logging

        # do we need to log?
        for (domain, dummy, dummy) in _logging[level]:
            if domain == "*" or \
                   point_domain.startswith(domain) or \
                   fnmatch.fnmatchcase(_dict["domain"], domain):
                return True
        return False

    def _getClass(self, frame):
        """ Function to get calling class. Returns class or None. """
        # get class by first function argument, if there are any
        if frame.f_code.co_argcount > 0:
            selfname = frame.f_code.co_varnames[0]
            if selfname in frame.f_locals:
                _self = frame.f_locals[selfname]
                obj = self._getClass2(_self.__class__, frame.f_code)
                if obj:
                    return obj

        module = inspect.getmodule(frame.f_code)
        code = frame.f_code

        # function in module?
        if code.co_name in module.__dict__:
            if hasattr(module.__dict__[code.co_name], "func_code") and \
                   module.__dict__[code.co_name].__code__  == code:
                return None

        # class in module
        for (dummy, obj) in module.__dict__.items():
            if isinstance(obj, types.ClassType):
                if hasattr(obj, code.co_name):
                    value = getattr(obj, code.co_name)
                    if isinstance(value, types.FunctionType):
                        if value.__code__ == code:
                            return obj

        # nothing found
        return None

    def _getClass2(self, obj, code):
        """ Internal function to get calling class. Returns class or None. """
        for value in obj.__dict__.values():
            if isinstance(value, types.FunctionType):
                if value.__code__ == code:
                    return obj

        for base in obj.__bases__:
            _obj = self._getClass2(base, code)
            if _obj:
                return _obj
        return None

    # internal log class
    def _log(self, level, _format, *args, **kwargs):
        is_debug = 0
        if "is_debug" in kwargs:
            is_debug = kwargs["is_debug"]

        nl = 1
        if "nl" in kwargs:
            nl = kwargs["nl"]

        nofmt = 0
        if "nofmt" in kwargs:
            nofmt = kwargs["nofmt"]

        _dict = self._genDict(level, is_debug)
        if not _dict:
            return

        if len(args) > 1:
            _dict['message'] = _format % args
        elif len(args) == 1:  # needed for _format % _dict
            _dict['message'] = _format % args[0]
        else:
            _dict['message'] = _format

        point_domain = _dict["domain"] + "."

        if is_debug:
            _logging = self._debug_logging
        else:
            _logging = self._logging

        used_targets = [ ]
        # log to target(s)
        for (domain, target, _format) in _logging[level]:
            if target in used_targets:
                continue
            if domain == "*" \
                   or point_domain.startswith(domain+".") \
                   or fnmatch.fnmatchcase(_dict["domain"], domain):
                if not _format:
                    _format = self._format
                if "fmt" in kwargs:
                    _format = kwargs["fmt"]
                if nofmt:
                    target.write(_dict["message"], level, self, is_debug)
                else:
                    target.write(_format % _dict, level, self, is_debug)
                if nl: # newline
                    target.write("\n", level, self, is_debug)
                used_targets.append(target)

    # internal function to generate the dict, needed for logging
    def _genDict(self, level, is_debug=0):
        """ Internal function. """
        check_domains = [ ]
        simple_match = False

        if is_debug:
            _dict = self._debug_level
            _domains = self._debug_domains
            _label = self._debug_label
        else:
            _dict = self._level
            _domains = self._domains
            _label = self._label

        # no debug
        for domain in _dict:
            if domain == "*":
                # '*' matches everything: simple match
                if _dict[domain] >= level:
                    simple_match = True
                    if len(check_domains) > 0:
                        check_domains = [ ]
                    break
            else:
                if _dict[domain] >= level:
                    check_domains.append(domain)

        if not simple_match and len(check_domains) < 1:
            return None

        if level not in _domains:
            return None

        f = inspect.currentframe()

        # go outside of logger module as long as there is a lower frame
        while f and f.f_back and f.f_globals["__name__"] == self.__module__:
            f = f.f_back

        if not f:
            raise ValueError("Frame information not available.")

        # get module name
        module_name = f.f_globals["__name__"]

        # simple module match test for all entries of check_domain
        point_module = module_name + "."
        for domain in check_domains:
            if point_module.startswith(domain):
                # found domain in module name
                check_domains = [ ]
                break

        # get code
        co = f.f_code

        # optimization: bail out early if domain can not match at all
        _len = len(module_name)
        for domain in _domains[level]:
            i = domain.find("*")
            if i == 0:
                continue
            elif i > 0:
                d = domain[:i]
            else:
                d = domain
            if _len >= len(d):
                if not module_name.startswith(d):
                    return None
            else:
                if not d.startswith(module_name):
                    return None

        # generate _dict for format output
        level_str = ""
        if level in _label:
            level_str = _label[level]
        _dict = { 'file': co.co_filename,
                  'line': f.f_lineno,
                  'module': module_name,
                  'class': '',
                  'function': co.co_name,
                  'domain': '',
                  'label' : level_str,
                  'level' : level,
                  'date' : time.strftime(self._date_format, time.localtime()) }
        if _dict["function"] == "?":
            _dict["function"] = ""

        # domain match needed?
        domain_needed = False
        for domain in _domains[level]:
            # standard domain, matches everything
            if domain == "*":
                continue
            # domain is needed
            domain_needed = True
            break

        # do we need to get the class object?
        if self._format.find("%(domain)") >= 0 or \
               self._format.find("%(class)") >= 0 or \
               domain_needed or \
               len(check_domains) > 0:
            obj = self._getClass(f)
            if obj:
                _dict["class"] = obj.__name__

        # build domain string
        _dict["domain"] = "" + _dict["module"]
        if _dict["class"] != "":
            _dict["domain"] += "." + _dict["class"]
        if _dict["function"] != "":
            _dict["domain"] += "." + _dict["function"]

        if len(check_domains) < 1:
            return _dict

        point_domain = _dict["domain"] + "."
        for domain in check_domains:
            if point_domain.startswith(domain) or \
                   fnmatch.fnmatchcase(_dict["domain"], domain):
                return _dict

        return None

# ---------------------------------------------------------------------------

# Global logging object.
log = Logger()

# ---------------------------------------------------------------------------

"""
# Example
if __name__ == '__main__':
    log.setInfoLogLevel(log.INFO2)
    log.setDebugLogLevel(log.DEBUG5)
    for i in range(log.INFO1, log.INFO_MAX+1):
        log.setInfoLogLabel(i, "INFO%d: " % i)
    for i in range(log.DEBUG1, log.DEBUG_MAX+1):
        log.setDebugLogLabel(i, "DEBUG%d: " % i)

    log.setFormat("%(date)s %(module)s:%(line)d %(label)s"
                  "%(message)s")
    log.setDateFormat("%Y-%m-%d %H:%M:%S")

    fl = FileLog("/tmp/log", "a")
    log.addInfoLogging("*", fl)
    log.delDebugLogging("*", log.stdout)
    log.setDebugLogging("*", log.stdout, [ log.DEBUG1, log.DEBUG2 ] )
    log.addDebugLogging("*", fl)
#    log.addInfoLogging("*", log.syslog, fmt="%(label)s%(message)s")
#    log.addDebugLogging("*", log.syslog, fmt="%(label)s%(message)s")

    log.debug10("debug10")
    log.debug9("debug9")
    log.debug8("debug8")
    log.debug7("debug7")
    log.debug6("debug6")
    log.debug5("debug5")
    log.debug4("debug4")
    log.debug3("debug3")
    log.debug2("debug2", fmt="%(file)s:%(line)d %(message)s")
    log.debug1("debug1", nofmt=1)
    log.info5("info5")
    log.info4("info4")
    log.info3("info3")
    log.info2("info2")
    log.info1("info1")
    log.warning("warning\n", nl=0)
    log.error("error ", nl=0)
    log.error("error", nofmt=1)
    log.fatal("fatal")
    log.info(log.INFO1, "nofmt info", nofmt=1)
    log.info(log.INFO2, "info2 fmt", fmt="%(file)s:%(line)d %(message)s")

    try:
        a = b
    except Exception as e:
        log.exception()
"""

# vim:ts=4:sw=4:showmatch:expandtab
site-packages/firewall/core/io/helper.py
# -*- coding: utf-8 -*-
#
# Copyright (C) 2011-2016 Red Hat, Inc.
#
# Authors:
# Thomas Woerner <twoerner@redhat.com>
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program.  If not, see <http://www.gnu.org/licenses/>.
#

__all__ = [ "Helper", "helper_reader", "helper_writer" ]

import xml.sax as sax
import os
import io
import shutil

from firewall import config
from firewall.functions import u2b_if_py2
from firewall.core.io.io_object import PY2, IO_Object, \
    IO_Object_ContentHandler, IO_Object_XMLGenerator, check_port, \
    check_tcpudp
from firewall.core.logger import log
from firewall import errors
from firewall.errors import FirewallError

class Helper(IO_Object):
    IMPORT_EXPORT_STRUCTURE = (
        ( "version",  "" ),                   # s
        ( "short", "" ),                      # s
        ( "description", "" ),                # s
        ( "family", "", ),                    # s
        ( "module", "", ),                    # s
        ( "ports", [ ( "", "" ), ], ),        # a(ss)
        )
    DBUS_SIGNATURE = '(sssssa(ss))'
    ADDITIONAL_ALNUM_CHARS = [ "-", "." ]
    PARSER_REQUIRED_ELEMENT_ATTRS = {
        "short": None,
        "description": None,
        "helper": [ "module" ],
        }
    PARSER_OPTIONAL_ELEMENT_ATTRS = {
        "helper": [ "name", "version", "family" ],
        "port": [ "port", "protocol" ],
        }

    def __init__(self):
        super(Helper, self).__init__()
        self.version = ""
        self.short = ""
        self.description = ""
        self.module = ""
        self.family = ""
        self.ports = [ ]

    def cleanup(self):
        self.version = ""
        self.short = ""
        self.description = ""
        self.module = ""
        self.family = ""
        del self.ports[:]

    def encode_strings(self):
        """ HACK. I haven't been able to make sax parser return
            strings encoded (because of python 2) instead of in unicode.
            Get rid of it once we throw out python 2 support."""
        self.version = u2b_if_py2(self.version)
        self.short = u2b_if_py2(self.short)
        self.description = u2b_if_py2(self.description)
        self.module = u2b_if_py2(self.module)
        self.family = u2b_if_py2(self.family)
        self.ports = [(u2b_if_py2(po),u2b_if_py2(pr)) for (po,pr) in self.ports]

    def check_ipv(self, ipv):
        ipvs = [ 'ipv4', 'ipv6' ]
        if ipv not in ipvs:
            raise FirewallError(errors.INVALID_IPV,
                                "'%s' not in '%s'" % (ipv, ipvs))

    def _check_config(self, config, item, all_config):
        if item == "ports":
            for port in config:
                check_port(port[0])
                check_tcpudp(port[1])
        elif item == "module":
            if not config.startswith("nf_conntrack_"):
                raise FirewallError(
                    errors.INVALID_MODULE,
                    "'%s' does not start with 'nf_conntrack_'" % config)
            if len(config.replace("nf_conntrack_", "")) < 1:
                raise FirewallError(errors.INVALID_MODULE,
                                    "Module name '%s' too short" % config)

# PARSER

class helper_ContentHandler(IO_Object_ContentHandler):
    def startElement(self, name, attrs):
        IO_Object_ContentHandler.startElement(self, name, attrs)
        self.item.parser_check_element_attrs(name, attrs)
        if name == "helper":
            if "version" in attrs:
                self.item.version = attrs["version"]
            if "family" in attrs:
                self.item.check_ipv(attrs["family"])
                self.item.family = attrs["family"]
            if "module" in attrs:
                if not attrs["module"].startswith("nf_conntrack_"):
                    raise FirewallError(
                        errors.INVALID_MODULE,
                        "'%s' does not start with 'nf_conntrack_'" % \
                        attrs["module"])
                if len(attrs["module"].replace("nf_conntrack_", "")) < 1:
                    raise FirewallError(
                        errors.INVALID_MODULE,
                        "Module name '%s' too short" % attrs["module"])
                self.item.module = attrs["module"]
        elif name == "short":
            pass
        elif name == "description":
            pass
        elif name == "port":
            check_port(attrs["port"])
            check_tcpudp(attrs["protocol"])
            entry = (attrs["port"], attrs["protocol"])
            if entry not in self.item.ports:
                self.item.ports.append(entry)
            else:
                log.warning("Port '%s/%s' already set, ignoring.",
                            attrs["port"], attrs["protocol"])

def helper_reader(filename, path):
    helper = Helper()
    if not filename.endswith(".xml"):
        raise FirewallError(errors.INVALID_NAME,
                            "'%s' is missing .xml suffix" % filename)
    helper.name = filename[:-4]
    helper.check_name(helper.name)
    helper.filename = filename
    helper.path = path
    helper.builtin = False if path.startswith(config.ETC_FIREWALLD) else True
    helper.default = helper.builtin
    handler = helper_ContentHandler(helper)
    parser = sax.make_parser()
    parser.setContentHandler(handler)
    name = "%s/%s" % (path, filename)
    with open(name, "rb") as f:
        source = sax.InputSource(None)
        source.setByteStream(f)
        try:
            parser.parse(source)
        except sax.SAXParseException as msg:
            raise FirewallError(errors.INVALID_HELPER,
                                "not a valid helper file: %s" % \
                                msg.getException())
    del handler
    del parser
    if PY2:
        helper.encode_strings()
    return helper

def helper_writer(helper, path=None):
    _path = path if path else helper.path

    if helper.filename:
        name = "%s/%s" % (_path, helper.filename)
    else:
        name = "%s/%s.xml" % (_path, helper.name)

    if os.path.exists(name):
        try:
            shutil.copy2(name, "%s.old" % name)
        except Exception as msg:
            log.error("Backup of file '%s' failed: %s", name, msg)

    dirpath = os.path.dirname(name)
    if dirpath.startswith(config.ETC_FIREWALLD) and not os.path.exists(dirpath):
        if not os.path.exists(config.ETC_FIREWALLD):
            os.mkdir(config.ETC_FIREWALLD, 0o750)
        os.mkdir(dirpath, 0o750)

    f = io.open(name, mode='wt', encoding='UTF-8')
    handler = IO_Object_XMLGenerator(f)
    handler.startDocument()

    # start helper element
    attrs = {}
    attrs["module"] = helper.module
    if helper.version and helper.version != "":
        attrs["version"] = helper.version
    if helper.family and helper.family != "":
        attrs["family"] = helper.family
    handler.startElement("helper", attrs)
    handler.ignorableWhitespace("\n")

    # short
    if helper.short and helper.short != "":
        handler.ignorableWhitespace("  ")
        handler.startElement("short", { })
        handler.characters(helper.short)
        handler.endElement("short")
        handler.ignorableWhitespace("\n")

    # description
    if helper.description and helper.description != "":
        handler.ignorableWhitespace("  ")
        handler.startElement("description", { })
        handler.characters(helper.description)
        handler.endElement("description")
        handler.ignorableWhitespace("\n")

    # ports
    for port in helper.ports:
        handler.ignorableWhitespace("  ")
        handler.simpleElement("port", { "port": port[0], "protocol": port[1] })
        handler.ignorableWhitespace("\n")

    # end helper element
    handler.endElement('helper')
    handler.ignorableWhitespace("\n")
    handler.endDocument()
    f.close()
    del handler
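
# Illustrative usage sketch (not part of the original module); the file name
# and helper directory below are assumptions:
if __name__ == "__main__":  # pragma: no cover - example only
    h = helper_reader("ftp.xml", config.ETC_FIREWALLD + "/helpers")
    print(h.name, h.module, h.ports)
    helper_writer(h, "/tmp")  # writes /tmp/ftp.xml, backing up any old copy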
site-packages/firewall/core/io/firewalld_conf.py
# -*- coding: utf-8 -*-
#
# Copyright (C) 2011-2012 Red Hat, Inc.
#
# Authors:
# Thomas Woerner <twoerner@redhat.com>
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program.  If not, see <http://www.gnu.org/licenses/>.
#

import os.path
import io
import tempfile
import shutil

from firewall import config
from firewall.core.logger import log
from firewall.functions import b2u, u2b, PY2

valid_keys = [ "DefaultZone", "MinimalMark", "CleanupOnExit",
               "CleanupModulesOnExit", "Lockdown", "IPv6_rpfilter",
               "IndividualCalls", "LogDenied", "AutomaticHelpers",
               "FirewallBackend", "FlushAllOnReload", "RFC3964_IPv4",
               "AllowZoneDrifting" ]

class firewalld_conf(object):
    def __init__(self, filename):
        self._config = { }
        self._deleted = [ ]
        self.filename = filename
        self.clear()

    def clear(self):
        self._config = { }
        self._deleted = [ ]

    def cleanup(self):
        self._config.clear()
        self._deleted = [ ]

    def get(self, key):
        return self._config.get(key.strip())

    def set(self, key, value):
        _key = b2u(key.strip())
        self._config[_key] = b2u(value.strip())
        if _key in self._deleted:
            self._deleted.remove(_key)

    def __str__(self):
        s = ""
        for (key,value) in self._config.items():
            if s:
                s += '\n'
            s += '%s=%s' % (key, value)
        return u2b(s) if PY2 else s

    # load self.filename
    def read(self):
        self.clear()
        try:
            f = open(self.filename, "r")
        except Exception as msg:
            log.error("Failed to load '%s': %s", self.filename, msg)
            self.set("DefaultZone", config.FALLBACK_ZONE)
            self.set("MinimalMark", str(config.FALLBACK_MINIMAL_MARK))
            self.set("CleanupOnExit", "yes" if config.FALLBACK_CLEANUP_ON_EXIT else "no")
            self.set("CleanupModulesOnExit", "yes" if config.FALLBACK_CLEANUP_MODULES_ON_EXIT else "no")
            self.set("Lockdown", "yes" if config.FALLBACK_LOCKDOWN else "no")
            self.set("IPv6_rpfilter","yes" if config.FALLBACK_IPV6_RPFILTER else "no")
            self.set("IndividualCalls", "yes" if config.FALLBACK_INDIVIDUAL_CALLS else "no")
            self.set("LogDenied", config.FALLBACK_LOG_DENIED)
            self.set("AutomaticHelpers", config.FALLBACK_AUTOMATIC_HELPERS)
            self.set("FirewallBackend", config.FALLBACK_FIREWALL_BACKEND)
            self.set("FlushAllOnReload", "yes" if config.FALLBACK_FLUSH_ALL_ON_RELOAD else "no")
            self.set("RFC3964_IPv4", "yes" if config.FALLBACK_RFC3964_IPV4 else "no")
            self.set("AllowZoneDrifting", "yes" if config.FALLBACK_ALLOW_ZONE_DRIFTING else "no")
            raise

        for line in f:
            if not line:
                break
            line = line.strip()
            if len(line) < 1 or line[0] in ['#', ';']:
                continue
            # get key/value pair
            pair = [ x.strip() for x in line.split("=") ]
            if len(pair) != 2:
                log.error("Invalid option definition: '%s'", line.strip())
                continue
            elif pair[0] not in valid_keys:
                log.error("Invalid option: '%s'", line.strip())
                continue
            elif pair[1] == '':
                log.error("Missing value: '%s'", line.strip())
                continue
            elif self._config.get(pair[0]) is not None:
                log.error("Duplicate option definition: '%s'", line.strip())
                continue
            self._config[pair[0]] = pair[1]
        f.close()

        # check default zone
        if not self.get("DefaultZone"):
            log.error("DefaultZone is not set, using default value '%s'",
                      config.FALLBACK_ZONE)
            self.set("DefaultZone", str(config.FALLBACK_ZONE))

        # check minimal mark
        value = self.get("MinimalMark")
        try:
            int(value)
        except (ValueError, TypeError):
            if value is not None:
                log.warning("MinimalMark '%s' is not valid, using default "
                            "value '%d'", value if value else '',
                            config.FALLBACK_MINIMAL_MARK)
            self.set("MinimalMark", str(config.FALLBACK_MINIMAL_MARK))

        # check cleanup on exit
        value = self.get("CleanupOnExit")
        if not value or value.lower() not in [ "no", "false", "yes", "true" ]:
            if value is not None:
                log.warning("CleanupOnExit '%s' is not valid, using default "
                            "value %s", value if value else '',
                            config.FALLBACK_CLEANUP_ON_EXIT)
            self.set("CleanupOnExit", "yes" if config.FALLBACK_CLEANUP_ON_EXIT else "no")

        # check module cleanup on exit
        value = self.get("CleanupModulesOnExit")
        if not value or value.lower() not in [ "no", "false", "yes", "true" ]:
            if value is not None:
                log.warning("CleanupModulesOnExit '%s' is not valid, using default "
                            "value %s", value if value else '',
                            config.FALLBACK_CLEANUP_MODULES_ON_EXIT)
            self.set("CleanupModulesOnExit", "yes" if config.FALLBACK_CLEANUP_MODULES_ON_EXIT else "no")

        # check lockdown
        value = self.get("Lockdown")
        if not value or value.lower() not in [ "yes", "true", "no", "false" ]:
            if value is not None:
                log.warning("Lockdown '%s' is not valid, using default "
                            "value %s", value if value else '',
                            config.FALLBACK_LOCKDOWN)
            self.set("Lockdown", "yes" if config.FALLBACK_LOCKDOWN else "no")

        # check ipv6_rpfilter
        value = self.get("IPv6_rpfilter")
        if not value or value.lower() not in [ "yes", "true", "no", "false" ]:
            if value is not None:
                log.warning("IPv6_rpfilter '%s' is not valid, using default "
                            "value %s", value if value else '',
                            config.FALLBACK_IPV6_RPFILTER)
            self.set("IPv6_rpfilter","yes" if config.FALLBACK_IPV6_RPFILTER else "no")

        # check individual calls
        value = self.get("IndividualCalls")
        if not value or value.lower() not in [ "yes", "true", "no", "false" ]:
            if value is not None:
                log.warning("IndividualCalls '%s' is not valid, using default "
                            "value %s", value if value else '',
                            config.FALLBACK_INDIVIDUAL_CALLS)
            self.set("IndividualCalls", "yes" if config.FALLBACK_INDIVIDUAL_CALLS else "no")

        # check log denied
        value = self.get("LogDenied")
        if not value or value not in config.LOG_DENIED_VALUES:
            if value is not None:
                log.warning("LogDenied '%s' is invalid, using default value '%s'",
                            value, config.FALLBACK_LOG_DENIED)
            self.set("LogDenied", str(config.FALLBACK_LOG_DENIED))

        # check automatic helpers
        value = self.get("AutomaticHelpers")
        if not value or value.lower() not in config.AUTOMATIC_HELPERS_VALUES:
            if value is not None:
                log.warning("AutomaticHelpers '%s' is not valid, using default "
                            "value %s", value if value else '',
                            config.FALLBACK_AUTOMATIC_HELPERS)
            self.set("AutomaticHelpers", str(config.FALLBACK_AUTOMATIC_HELPERS))

        value = self.get("FirewallBackend")
        if not value or value.lower() not in config.FIREWALL_BACKEND_VALUES:
            if value is not None:
                log.warning("FirewallBackend '%s' is not valid, using default "
                            "value %s", value if value else '',
                            config.FALLBACK_FIREWALL_BACKEND)
            self.set("FirewallBackend", str(config.FALLBACK_FIREWALL_BACKEND))

        value = self.get("FlushAllOnReload")
        if not value or value.lower() not in [ "yes", "true", "no", "false" ]:
            if value is not None:
                log.warning("FlushAllOnReload '%s' is not valid, using default "
                            "value %s", value if value else '',
                            config.FALLBACK_FLUSH_ALL_ON_RELOAD)
            self.set("FlushAllOnReload", str(config.FALLBACK_FLUSH_ALL_ON_RELOAD))

        value = self.get("RFC3964_IPv4")
        if not value or value.lower() not in [ "yes", "true", "no", "false" ]:
            if value is not None:
                log.warning("RFC3964_IPv4 '%s' is not valid, using default "
                            "value %s", value if value else '',
                            config.FALLBACK_RFC3964_IPV4)
            self.set("RFC3964_IPv4", str(config.FALLBACK_RFC3964_IPV4))

        value = self.get("AllowZoneDrifting")
        if not value or value.lower() not in [ "yes", "true", "no", "false" ]:
            if value is not None:
                log.warning("AllowZoneDrifting '%s' is not valid, using default "
                            "value %s", value if value else '',
                            config.FALLBACK_ALLOW_ZONE_DRIFTING)
            self.set("AllowZoneDrifting", str(config.FALLBACK_ALLOW_ZONE_DRIFTING))

    # save to self.filename if there are key/value changes
    def write(self):
        if len(self._config) < 1:
            # no changes: nothing to do
            return

        # handled keys
        done = [ ]

        if not os.path.exists(config.ETC_FIREWALLD):
            os.mkdir(config.ETC_FIREWALLD, 0o750)

        try:
            temp_file = tempfile.NamedTemporaryFile(mode='wt',
                             prefix="%s." % os.path.basename(self.filename),
                             dir=os.path.dirname(self.filename), delete=False)
        except Exception as msg:
            log.error("Failed to open temporary file: %s" % msg)
            raise

        modified = False
        empty = False
        try:
            f = io.open(self.filename, mode='rt', encoding='UTF-8')
        except Exception as msg:
            if os.path.exists(self.filename):
                log.error("Failed to open '%s': %s" % (self.filename, msg))
                raise
            else:
                f = None
        else:
            for line in f:
                if not line:
                    break
                # remove newline
                line = line.strip("\n")

                if len(line) < 1:
                    if not empty:
                        temp_file.write(u"\n")
                        empty = True
                elif line[0] == '#':
                    empty = False
                    temp_file.write(line)
                    temp_file.write(u"\n")
                else:
                    p = line.split("=")
                    if len(p) != 2:
                        empty = False
                        temp_file.write(line+u"\n")
                        continue
                    key = p[0].strip()
                    value = p[1].strip()
                    # check for modified key/value pairs
                    if key not in done:
                        if (key in self._config and \
                                self._config[key] != value):
                            empty = False
                            temp_file.write(u'%s=%s\n' %
                                            (key, self._config[key]))
                            modified = True
                        elif key in self._deleted:
                            modified = True
                        else:
                            empty = False
                            temp_file.write(line+u"\n")
                        done.append(key)
                    else:
                        modified = True

        # write remaining key/value pairs
        if len(self._config) > 0:
            for (key,value) in self._config.items():
                if key in done:
                    continue
                if key in ["MinimalMark", "AutomaticHelpers"]: # omit deprecated from new config
                    continue
                if not empty:
                    temp_file.write(u"\n")
                    empty = True
                temp_file.write(u'%s=%s\n' % (key, value))
                modified = True

        if f:
            f.close()
        temp_file.close()

        if not modified: # not modified: remove tempfile
            os.remove(temp_file.name)
            return
        # make backup
        if os.path.exists(self.filename):
            try:
                shutil.copy2(self.filename, "%s.old" % self.filename)
            except Exception as msg:
                os.remove(temp_file.name)
                raise IOError("Backup of '%s' failed: %s" % (self.filename, msg))

        # copy tempfile
        try:
            shutil.move(temp_file.name, self.filename)
        except Exception as msg:
            os.remove(temp_file.name)
            raise IOError("Failed to create '%s': %s" % (self.filename, msg))
        else:
            os.chmod(self.filename, 0o600)
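
# Illustrative usage sketch (not part of the original module); the path below
# is the usual default location but is an assumption here:
if __name__ == "__main__":  # pragma: no cover - example only
    conf = firewalld_conf(config.ETC_FIREWALLD + "/firewalld.conf")
    try:
        conf.read()   # read() falls back to defaults and re-raises on failure
    except Exception:
        pass
    conf.set("LogDenied", "all")
    conf.write()      # rewrites the file only if something actually changed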
site-packages/firewall/core/io/policy.py
# -*- coding: utf-8 -*-
#
# SPDX-License-Identifier: GPL-2.0-or-later

__all__ = [ "Policy", "policy_reader", "policy_writer" ]

import xml.sax as sax
import os
import io
import shutil

from firewall import config
from firewall.functions import checkIP, checkIP6
from firewall.functions import uniqify, max_policy_name_len, portStr
from firewall.core.base import DEFAULT_POLICY_TARGET, POLICY_TARGETS, DEFAULT_POLICY_PRIORITY
from firewall.core.io.io_object import IO_Object, \
    IO_Object_ContentHandler, IO_Object_XMLGenerator, check_port, \
    check_tcpudp, check_protocol
from firewall.core import rich
from firewall.core.logger import log
from firewall import errors
from firewall.errors import FirewallError


def common_startElement(obj, name, attrs):
    if name == "short":
        pass
    elif name == "description":
        pass

    elif name == "service":
        if obj._rule:
            if obj._rule.element:
                log.warning("Invalid rule: More than one element in rule '%s', ignoring.",
                            str(obj._rule))
                obj._rule_error = True
                return True
            obj._rule.element = rich.Rich_Service(attrs["name"])
            return True
        if attrs["name"] not in obj.item.services:
            obj.item.services.append(attrs["name"])
        else:
            log.warning("Service '%s' already set, ignoring.",
                        attrs["name"])

    elif name == "port":
        if obj._rule:
            if obj._rule.element:
                log.warning("Invalid rule: More than one element in rule '%s', ignoring.",
                            str(obj._rule))
                obj._rule_error = True
                return True
            obj._rule.element = rich.Rich_Port(attrs["port"],
                                                attrs["protocol"])
            return True
        check_port(attrs["port"])
        check_tcpudp(attrs["protocol"])
        entry = (portStr(attrs["port"], "-"), attrs["protocol"])
        if entry not in obj.item.ports:
            obj.item.ports.append(entry)
        else:
            log.warning("Port '%s/%s' already set, ignoring.",
                        attrs["port"], attrs["protocol"])

    elif name == "protocol":
        if obj._rule:
            if obj._rule.element:
                log.warning("Invalid rule: More than one element in rule '%s', ignoring.",
                            str(obj._rule))
                obj._rule_error = True
                return True
            obj._rule.element = rich.Rich_Protocol(attrs["value"])
        else:
            check_protocol(attrs["value"])
            if attrs["value"] not in obj.item.protocols:
                obj.item.protocols.append(attrs["value"])
            else:
                log.warning("Protocol '%s' already set, ignoring.",
                            attrs["value"])
    elif name == "icmp-block":
        if obj._rule:
            if obj._rule.element:
                log.warning("Invalid rule: More than one element in rule '%s', ignoring.",
                            str(obj._rule))
                obj._rule_error = True
                return True
            obj._rule.element = rich.Rich_IcmpBlock(attrs["name"])
            return True
        if attrs["name"] not in obj.item.icmp_blocks:
            obj.item.icmp_blocks.append(attrs["name"])
        else:
            log.warning("icmp-block '%s' already set, ignoring.",
                        attrs["name"])

    elif name == "icmp-type":
        if obj._rule:
            if obj._rule.element:
                log.warning("Invalid rule: More than one element in rule '%s', ignoring.",
                            str(obj._rule))
                obj._rule_error = True
                return True
            obj._rule.element = rich.Rich_IcmpType(attrs["name"])
            return True
        else:
            log.warning("Invalid rule: icmp-block '%s' outside of rule",
                        attrs["name"])

    elif name == "masquerade":
        if obj._rule:
            if obj._rule.element:
                log.warning("Invalid rule: More than one element in rule '%s', ignoring.",
                            str(obj._rule))
                obj._rule_error = True
                return True
            obj._rule.element = rich.Rich_Masquerade()
        else:
            if obj.item.masquerade:
                log.warning("Masquerade already set, ignoring.")
            else:
                obj.item.masquerade = True

    elif name == "forward-port":
        to_port = ""
        if "to-port" in attrs:
            to_port = attrs["to-port"]
        to_addr = ""
        if "to-addr" in attrs:
            to_addr = attrs["to-addr"]

        if obj._rule:
            if obj._rule.element:
                log.warning("Invalid rule: More than one element in rule '%s', ignoring.",
                            str(obj._rule))
                obj._rule_error = True
                return True
            obj._rule.element = rich.Rich_ForwardPort(attrs["port"],
                                                       attrs["protocol"],
                                                       to_port, to_addr)
            return True

        check_port(attrs["port"])
        check_tcpudp(attrs["protocol"])
        if to_port:
            check_port(to_port)
        if to_addr:
            if not checkIP(to_addr) and not checkIP6(to_addr):
                raise FirewallError(errors.INVALID_ADDR,
                                    "to-addr '%s' is not a valid address" \
                                    % to_addr)
        entry = (portStr(attrs["port"], "-"), attrs["protocol"],
                 portStr(to_port, "-"), str(to_addr))
        if entry not in obj.item.forward_ports:
            obj.item.forward_ports.append(entry)
        else:
            log.warning("Forward port %s/%s%s%s already set, ignoring.",
                        attrs["port"], attrs["protocol"],
                        " >%s" % to_port if to_port else "",
                        " @%s" % to_addr if to_addr else "")

    elif name == "source-port":
        if obj._rule:
            if obj._rule.element:
                log.warning("Invalid rule: More than one element in rule '%s', ignoring.",
                            str(obj._rule))
                obj._rule_error = True
                return True
            obj._rule.element = rich.Rich_SourcePort(attrs["port"],
                                                      attrs["protocol"])
            return True
        check_port(attrs["port"])
        check_tcpudp(attrs["protocol"])
        entry = (portStr(attrs["port"], "-"), attrs["protocol"])
        if entry not in obj.item.source_ports:
            obj.item.source_ports.append(entry)
        else:
            log.warning("Source port '%s/%s' already set, ignoring.",
                        attrs["port"], attrs["protocol"])

    elif name == "destination":
        if not obj._rule:
            log.warning('Invalid rule: Destination outside of rule')
            obj._rule_error = True
            return True
        if obj._rule.destination:
            log.warning("Invalid rule: More than one destination in rule '%s', ignoring.",
                        str(obj._rule))
            return True
        invert = False
        address = None
        if "address" in attrs:
            address = attrs["address"]
        ipset = None
        if "ipset" in attrs:
            ipset = attrs["ipset"]
        if "invert" in attrs and \
                attrs["invert"].lower() in [ "yes", "true" ]:
            invert = True
        obj._rule.destination = rich.Rich_Destination(address,
                                                      ipset,
                                                      invert)

    elif name in [ "accept", "reject", "drop", "mark" ]:
        if not obj._rule:
            log.warning('Invalid rule: Action outside of rule')
            obj._rule_error = True
            return True
        if obj._rule.action:
            log.warning('Invalid rule: More than one action')
            obj._rule_error = True
            return True
        if name == "accept":
            obj._rule.action = rich.Rich_Accept()
        elif name == "reject":
            _type = None
            if "type" in attrs:
                _type = attrs["type"]
            obj._rule.action = rich.Rich_Reject(_type)
        elif name == "drop":
            obj._rule.action = rich.Rich_Drop()
        elif name == "mark":
            _set = attrs["set"]
            obj._rule.action = rich.Rich_Mark(_set)
        obj._limit_ok = obj._rule.action

    elif name == "log":
        if not obj._rule:
            log.warning('Invalid rule: Log outside of rule')
            return True
        if obj._rule.log:
            log.warning('Invalid rule: More than one log')
            return True
        level = None
        if "level" in attrs:
            level = attrs["level"]
            if level not in [ "emerg", "alert", "crit", "error",
                              "warning", "notice", "info", "debug" ]:
                log.warning('Invalid rule: Invalid log level')
                obj._rule_error = True
                return True
        prefix = attrs["prefix"] if "prefix" in attrs else None
        obj._rule.log = rich.Rich_Log(prefix, level)
        obj._limit_ok = obj._rule.log

    elif name == "audit":
        if not obj._rule:
            log.warning('Invalid rule: Audit outside of rule')
            return True
        if obj._rule.audit:
            log.warning("Invalid rule: More than one audit in rule '%s', ignoring.",
                        str(obj._rule))
            obj._rule_error = True
            return True
        obj._rule.audit = rich.Rich_Audit()
        obj._limit_ok = obj._rule.audit

    elif name == "rule":
        family = None
        priority = 0
        if "family" in attrs:
            family = attrs["family"]
            if family not in [ "ipv4", "ipv6" ]:
                log.warning('Invalid rule: Rule family "%s" invalid',
                            attrs["family"])
                obj._rule_error = True
                return True
        if "priority" in attrs:
            priority = int(attrs["priority"])
        obj._rule = rich.Rich_Rule(family=family, priority=priority)

    elif name == "limit":
        if not obj._limit_ok:
            log.warning('Invalid rule: Limit outside of action, log and audit')
            obj._rule_error = True
            return True
        if obj._limit_ok.limit:
            log.warning("Invalid rule: More than one limit in rule '%s', ignoring.",
                        str(obj._rule))
            obj._rule_error = True
            return True
        value = attrs["value"]
        obj._limit_ok.limit = rich.Rich_Limit(value, attrs.get("burst"))
    else:
        return False

    return True

def common_endElement(obj, name):
    if name == "rule":
        if not obj._rule_error:
            try:
                obj._rule.check()
            except Exception as e:
                log.warning("%s: %s", e, str(obj._rule))
            else:
                if str(obj._rule) not in obj.item.rules_str:
                    obj.item.rules.append(obj._rule)
                    obj.item.rules_str.append(str(obj._rule))
                else:
                    log.warning("Rule '%s' already set, ignoring.",
                                str(obj._rule))
        obj._rule = None
        obj._rule_error = False
    elif name in [ "accept", "reject", "drop", "mark", "log", "audit" ]:
        obj._limit_ok = None

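# Example of the rich-rule markup handled by common_startElement /
# common_endElement above (illustrative only); when </rule> is reached the
# assembled rich.Rich_Rule is checked and, if valid, appended to
# obj.item.rules:
#
#   <rule family="ipv4">
#     <service name="ssh"/>
#     <log prefix="ssh "><limit value="3/m"/></log>
#     <accept/>
#   </rule>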
def common_check_config(obj, config, item, all_config):
    obj_type = "Policy" if isinstance(obj, Policy) else "Zone"

    if item == "services" and obj.fw_config:
        existing_services = obj.fw_config.get_services()
        for service in config:
            if service not in existing_services:
                raise FirewallError(errors.INVALID_SERVICE,
                                    "'%s' not among existing services" % \
                                    service)
    elif item == "ports":
        for port in config:
            check_port(port[0])
            check_tcpudp(port[1])
    elif item == "protocols":
        for proto in config:
            check_protocol(proto)
    elif item == "icmp_blocks" and obj.fw_config:
        existing_icmptypes = obj.fw_config.get_icmptypes()
        for icmptype in config:
            if icmptype not in existing_icmptypes:
                raise FirewallError(errors.INVALID_ICMPTYPE,
                                    "'%s' not among existing icmp types" % \
                                    icmptype)
    elif item == "forward_ports":
        for fwd_port in config:
            check_port(fwd_port[0])
            check_tcpudp(fwd_port[1])
            if not fwd_port[2] and not fwd_port[3]:
                raise FirewallError(
                    errors.INVALID_FORWARD,
                    "'%s' is missing to-port AND to-addr " % fwd_port)
            if fwd_port[2]:
                check_port(fwd_port[2])
            if fwd_port[3]:
                if not checkIP(fwd_port[3]) and not checkIP6(fwd_port[3]):
                    raise FirewallError(
                        errors.INVALID_ADDR,
                        "to-addr '%s' is not a valid address" % fwd_port[3])
    elif item == "source_ports":
        for port in config:
            check_port(port[0])
            check_tcpudp(port[1])
    elif item in ["rules_str", "rich_rules"]:
        for rule in config:
            obj_rich = rich.Rich_Rule(rule_str=rule)
            if obj.fw_config and obj_rich.element and (isinstance(obj_rich.element, rich.Rich_IcmpBlock) or
                                                       isinstance(obj_rich.element, rich.Rich_IcmpType)):
                existing_icmptypes = obj.fw_config.get_icmptypes()
                if obj_rich.element.name not in existing_icmptypes:
                    raise FirewallError(errors.INVALID_ICMPTYPE,
                                        "'%s' not among existing icmp types" % \
                                        obj_rich.element.name)
                elif obj_rich.family:
                    ict = obj.fw_config.get_icmptype(obj_rich.element.name)
                    if ict.destination and obj_rich.family not in ict.destination:
                        raise FirewallError(errors.INVALID_ICMPTYPE,
                                            "rich rule family '%s' conflicts with icmp type '%s'" % \
                                            (obj_rich.family, obj_rich.element.name))
            elif obj.fw_config and isinstance(obj_rich.element, rich.Rich_Service):
                existing_services = obj.fw_config.get_services()
                if obj_rich.element.name not in existing_services:
                    raise FirewallError(
                        errors.INVALID_SERVICE,
                        "{} '{}': '{}' not among existing services".format(
                            obj_type, obj.name, obj_rich.element.name
                        ),
                    )


def _handler_add_rich_limit(handler, limit):
    d = {"value": limit.value}
    burst = limit.burst
    if burst is not None:
        d["burst"] = burst
    handler.simpleElement("limit", d)

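# For a rich limit with value "3/m" and burst 10, the helper above emits
#   <limit value="3/m" burst="10"/>
# and omits the burst attribute when burst is None (illustrative example).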

def common_writer(obj, handler):
    # short
    if obj.short and obj.short != "":
        handler.ignorableWhitespace("  ")
        handler.startElement("short", { })
        handler.characters(obj.short)
        handler.endElement("short")
        handler.ignorableWhitespace("\n")

    # description
    if obj.description and obj.description != "":
        handler.ignorableWhitespace("  ")
        handler.startElement("description", { })
        handler.characters(obj.description)
        handler.endElement("description")
        handler.ignorableWhitespace("\n")

    # services
    for service in uniqify(obj.services):
        handler.ignorableWhitespace("  ")
        handler.simpleElement("service", { "name": service })
        handler.ignorableWhitespace("\n")

    # ports
    for port in uniqify(obj.ports):
        handler.ignorableWhitespace("  ")
        handler.simpleElement("port", { "port": port[0], "protocol": port[1] })
        handler.ignorableWhitespace("\n")

    # protocols
    for protocol in uniqify(obj.protocols):
        handler.ignorableWhitespace("  ")
        handler.simpleElement("protocol", { "value": protocol })
        handler.ignorableWhitespace("\n")

    # icmp-blocks
    for icmp in uniqify(obj.icmp_blocks):
        handler.ignorableWhitespace("  ")
        handler.simpleElement("icmp-block", { "name": icmp })
        handler.ignorableWhitespace("\n")

    # masquerade
    if obj.masquerade:
        handler.ignorableWhitespace("  ")
        handler.simpleElement("masquerade", { })
        handler.ignorableWhitespace("\n")

    # forward-ports
    for forward in uniqify(obj.forward_ports):
        handler.ignorableWhitespace("  ")
        attrs = { "port": forward[0], "protocol": forward[1] }
        if forward[2] and forward[2] != "" :
            attrs["to-port"] = forward[2]
        if forward[3] and forward[3] != "" :
            attrs["to-addr"] = forward[3]
        handler.simpleElement("forward-port", attrs)
        handler.ignorableWhitespace("\n")

    # source-ports
    for port in uniqify(obj.source_ports):
        handler.ignorableWhitespace("  ")
        handler.simpleElement("source-port", { "port": port[0],
                                               "protocol": port[1] })
        handler.ignorableWhitespace("\n")

    # rules
    for rule in obj.rules:
        attrs = { }
        if rule.family:
            attrs["family"] = rule.family
        if rule.priority != 0:
            attrs["priority"] = str(rule.priority)
        handler.ignorableWhitespace("  ")
        handler.startElement("rule", attrs)
        handler.ignorableWhitespace("\n")

        # source
        if rule.source:
            attrs = { }
            if rule.source.addr:
                attrs["address"] = rule.source.addr
            if rule.source.mac:
                attrs["mac"] = rule.source.mac
            if rule.source.ipset:
                attrs["ipset"] = rule.source.ipset
            if rule.source.invert:
                attrs["invert"] = "True"
            handler.ignorableWhitespace("    ")
            handler.simpleElement("source", attrs)
            handler.ignorableWhitespace("\n")

        # destination
        if rule.destination:
            attrs = { }
            if rule.destination.addr:
                attrs["address"] = rule.destination.addr
            if rule.destination.ipset:
                attrs["ipset"] = rule.destination.ipset
            if rule.destination.invert:
                attrs["invert"] = "True"
            handler.ignorableWhitespace("    ")
            handler.simpleElement("destination", attrs)
            handler.ignorableWhitespace("\n")

        # element
        if rule.element:
            element = ""
            attrs = { }

            if type(rule.element) == rich.Rich_Service:
                element = "service"
                attrs["name"] = rule.element.name
            elif type(rule.element) == rich.Rich_Port:
                element = "port"
                attrs["port"] = rule.element.port
                attrs["protocol"] = rule.element.protocol
            elif type(rule.element) == rich.Rich_Protocol:
                element = "protocol"
                attrs["value"] = rule.element.value
            elif type(rule.element) == rich.Rich_Masquerade:
                element = "masquerade"
            elif type(rule.element) == rich.Rich_IcmpBlock:
                element = "icmp-block"
                attrs["name"] = rule.element.name
            elif type(rule.element) == rich.Rich_IcmpType:
                element = "icmp-type"
                attrs["name"] = rule.element.name
            elif type(rule.element) == rich.Rich_ForwardPort:
                element = "forward-port"
                attrs["port"] = rule.element.port
                attrs["protocol"] = rule.element.protocol
                if rule.element.to_port != "":
                    attrs["to-port"] = rule.element.to_port
                if rule.element.to_address != "":
                    attrs["to-addr"] = rule.element.to_address
            elif type(rule.element) == rich.Rich_SourcePort:
                element = "source-port"
                attrs["port"] = rule.element.port
                attrs["protocol"] = rule.element.protocol
            else:
                raise FirewallError(
                    errors.INVALID_OBJECT,
                    "Unknown element '%s' in obj_writer" % type(rule.element))

            handler.ignorableWhitespace("    ")
            handler.simpleElement(element, attrs)
            handler.ignorableWhitespace("\n")

        # rule.element

        # log
        if rule.log:
            attrs = { }
            if rule.log.prefix:
                attrs["prefix"] = rule.log.prefix
            if rule.log.level:
                attrs["level"] = rule.log.level
            if rule.log.limit:
                handler.ignorableWhitespace("    ")
                handler.startElement("log", attrs)
                handler.ignorableWhitespace("\n      ")
                _handler_add_rich_limit(handler, rule.log.limit)
                handler.ignorableWhitespace("\n    ")
                handler.endElement("log")
            else:
                handler.ignorableWhitespace("    ")
                handler.simpleElement("log", attrs)
            handler.ignorableWhitespace("\n")

        # audit
        if rule.audit:
            attrs = {}
            if rule.audit.limit:
                handler.ignorableWhitespace("    ")
                handler.startElement("audit", { })
                handler.ignorableWhitespace("\n      ")
                _handler_add_rich_limit(handler, rule.audit.limit)
                handler.ignorableWhitespace("\n    ")
                handler.endElement("audit")
            else:
                handler.ignorableWhitespace("    ")
                handler.simpleElement("audit", attrs)
            handler.ignorableWhitespace("\n")

        # action
        if rule.action:
            action = ""
            attrs = { }
            if type(rule.action) == rich.Rich_Accept:
                action = "accept"
            elif type(rule.action) == rich.Rich_Reject:
                action = "reject"
                if rule.action.type:
                    attrs["type"] = rule.action.type
            elif type(rule.action) == rich.Rich_Drop:
                action = "drop"
            elif type(rule.action) == rich.Rich_Mark:
                action = "mark"
                attrs["set"] = rule.action.set
            else:
                log.warning("Unknown action '%s'", type(rule.action))
            if rule.action.limit:
                handler.ignorableWhitespace("    ")
                handler.startElement(action, attrs)
                handler.ignorableWhitespace("\n      ")
                _handler_add_rich_limit(handler, rule.action.limit)
                handler.ignorableWhitespace("\n    ")
                handler.endElement(action)
            else:
                handler.ignorableWhitespace("    ")
                handler.simpleElement(action, attrs)
            handler.ignorableWhitespace("\n")

        handler.ignorableWhitespace("  ")
        handler.endElement("rule")
        handler.ignorableWhitespace("\n")


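# Illustrative fragment of the body common_writer emits for an object with a
# short description, one service and one port (example values only):
#
#   <short>Example</short>
#   <service name="ssh"/>
#   <port port="8080" protocol="tcp"/>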
class Policy(IO_Object):
    priority_min = -32768
    priority_max =  32767
    priority_default = DEFAULT_POLICY_PRIORITY
    priority_reserved = [0]

    IMPORT_EXPORT_STRUCTURE = (
        ( "version",  "" ),                            # s
        ( "short", "" ),                               # s
        ( "description", "" ),                         # s
        ( "target", "" ),                              # s
        ( "services", [ "", ], ),                      # as
        ( "ports", [ ( "", "" ), ], ),                 # a(ss)
        ( "icmp_blocks", [ "", ], ),                   # as
        ( "masquerade", False ),                       # b
        ( "forward_ports", [ ( "", "", "", "" ), ], ), # a(ssss)
        ( "rich_rules", [ "" ] ),                      # as
        ( "protocols", [ "", ], ),                     # as
        ( "source_ports", [ ( "", "" ), ], ),          # a(ss)
        ( "priority", 0 ),                             # i
        ( "ingress_zones", [ "" ] ),                   # as
        ( "egress_zones", [ "" ] ),                    # as
        )
    ADDITIONAL_ALNUM_CHARS = [ "_", "-", "/" ]
    PARSER_REQUIRED_ELEMENT_ATTRS = {
        "short": None,
        "description": None,
        "policy": ["target"],
        "service": [ "name" ],
        "port": [ "port", "protocol" ],
        "icmp-block": [ "name" ],
        "icmp-type": [ "name" ],
        "masquerade": None,
        "forward-port": [ "port", "protocol" ],
        "rule": None,
        "source": None,
        "destination": None,
        "protocol": [ "value" ],
        "source-port": [ "port", "protocol" ],
        "log":  None,
        "audit": None,
        "accept": None,
        "reject": None,
        "drop": None,
        "mark": [ "set" ],
        "limit": [ "value" ],
        "ingress-zone": [ "name" ],
        "egress-zone": [ "name" ],
        }
    PARSER_OPTIONAL_ELEMENT_ATTRS = {
        "policy": [ "version", "priority" ],
        "forward-port": [ "to-port", "to-addr" ],
        "rule": [ "family", "priority" ],
        "source": [ "address", "mac", "invert", "family", "ipset" ],
        "destination": [ "address", "invert", "ipset" ],
        "log": [ "prefix", "level" ],
        "reject": [ "type" ],
        "limit": ["burst"],
        }

    def __init__(self):
        super(Policy, self).__init__()
        self.version = ""
        self.short = ""
        self.description = ""
        self.target = DEFAULT_POLICY_TARGET
        self.services = [ ]
        self.ports = [ ]
        self.protocols = [ ]
        self.icmp_blocks = [ ]
        self.masquerade = False
        self.forward_ports = [ ]
        self.source_ports = [ ]
        self.fw_config = None # to be able to check services and icmp_blocks
        self.rules = [ ]
        self.rules_str = [ ]
        self.applied = False
        self.priority = self.priority_default
        self.derived_from_zone = None
        self.ingress_zones = []
        self.egress_zones = []

    def cleanup(self):
        self.version = ""
        self.short = ""
        self.description = ""
        self.target = DEFAULT_POLICY_TARGET
        del self.services[:]
        del self.ports[:]
        del self.protocols[:]
        del self.icmp_blocks[:]
        self.masquerade = False
        del self.forward_ports[:]
        del self.source_ports[:]
        self.fw_config = None # to be able to check services and icmp_blocks
        del self.rules[:]
        del self.rules_str[:]
        self.applied = False
        self.priority = self.priority_default
        del self.ingress_zones[:]
        del self.egress_zones[:]

    def __getattr__(self, name):
        if name == "rich_rules":
            return self.rules_str
        else:
            return getattr(super(Policy, self), name)

    def __setattr__(self, name, value):
        if name == "rich_rules":
            self.rules = [rich.Rich_Rule(rule_str=s) for s in value]
            # must convert back to string to get the canonical string.
            self.rules_str = [str(s) for s in self.rules]
        else:
            super(Policy, self).__setattr__(name, value)

    def _check_config(self, config, item, all_config):
        common_check_config(self, config, item, all_config)

        if item == "target":
            if config not in POLICY_TARGETS:
                raise FirewallError(errors.INVALID_TARGET, "'%s' is invalid target" % (config))
        elif item == "priority":
            if config in self.priority_reserved or \
               config > self.priority_max or \
               config < self.priority_min:
                raise FirewallError(errors.INVALID_PRIORITY, "%d is invalid priority. Must be in range [%d, %d]. The following are reserved: %s" %
                                                             (config, self.priority_min, self.priority_max, self.priority_reserved))
        elif item in ["ingress_zones", "egress_zones"]:
            existing_zones = ["ANY", "HOST"]
            if self.fw_config:
                existing_zones += self.fw_config.get_zones()
            for zone in config:
                if zone not in existing_zones:
                    raise FirewallError(errors.INVALID_ZONE,
                                        "'%s' not among existing zones" % (zone))
                if ((zone not in ["ANY", "HOST"] and (set(["ANY", "HOST"]) & set(config))) or \
                   (zone in ["ANY", "HOST"] and (set(config) - set([zone])))):
                    raise FirewallError(errors.INVALID_ZONE,
                                        "'%s' may only contain one of: many regular zones, ANY, or HOST" % (item))
                if zone == "HOST" and \
                   ((item == "ingress_zones" and "egress_zones" in all_config and "HOST" in all_config["egress_zones"]) or \
                   (item == "egress_zones" and "ingress_zones" in all_config and "HOST" in all_config["ingress_zones"])):
                    raise FirewallError(errors.INVALID_ZONE,
                                        "'HOST' can only appear in either ingress or egress zones, but not both")
        elif item == "masquerade" and config:
            if "egress_zones" in all_config and "HOST" in all_config["egress_zones"]:
                raise FirewallError(errors.INVALID_ZONE, "'masquerade' is invalid for egress zone 'HOST'")
            elif "ingress_zones" in all_config:
                if "HOST" in all_config["ingress_zones"]:
                    raise FirewallError(errors.INVALID_ZONE, "'masquerade' is invalid for ingress zone 'HOST'")
                for zone in all_config["ingress_zones"]:
                    if zone == "ANY":
                        continue
                    z_obj = self.fw_config.get_zone(zone)
                    if self.fw_config and "interfaces" in self.fw_config.get_zone_config_dict(z_obj):
                        raise FirewallError(errors.INVALID_ZONE, "'masquerade' cannot be used in a policy if an ingress zone has assigned interfaces")
        elif item == "rich_rules":
            for rule in config:
                obj = rich.Rich_Rule(rule_str=rule)
                if obj.element and isinstance(obj.element, rich.Rich_Masquerade):
                    if "egress_zones" in all_config and "HOST" in all_config["egress_zones"]:
                        raise FirewallError(errors.INVALID_ZONE, "'masquerade' is invalid for egress zone 'HOST'")
                    elif "ingress_zones" in all_config:
                        if "HOST" in all_config["ingress_zones"]:
                            raise FirewallError(errors.INVALID_ZONE, "'masquerade' is invalid for ingress zone 'HOST'")
                        for zone in all_config["ingress_zones"]:
                            if zone == "ANY":
                                continue
                            z_obj = self.fw_config.get_zone(zone)
                            if self.fw_config and "interfaces" in self.fw_config.get_zone_config_dict(z_obj):
                                raise FirewallError(errors.INVALID_ZONE, "'masquerade' cannot be used in a policy if an ingress zone has assigned interfaces")
                elif obj.element and isinstance(obj.element, rich.Rich_ForwardPort):
                    if "egress_zones" in all_config:
                        if "HOST" in all_config["egress_zones"]:
                            if obj.element.to_address:
                                raise FirewallError(errors.INVALID_FORWARD, "A 'forward-port' with 'to-addr' is invalid for egress zone 'HOST'")
                        elif all_config["egress_zones"]:
                            if not obj.element.to_address:
                                raise FirewallError(errors.INVALID_FORWARD, "'forward-port' requires 'to-addr' if egress zone is 'ANY' or a zone")
                            if "ANY" not in all_config["egress_zones"]:
                                for zone in all_config["egress_zones"]:
                                    z_obj = self.fw_config.get_zone(zone)
                                    if self.fw_config and "interfaces" in self.fw_config.get_zone_config_dict(z_obj):
                                        raise FirewallError(errors.INVALID_ZONE, "'forward-port' cannot be used in a policy if an egress zone has assigned interfaces")
                elif obj.action and isinstance(obj.action, rich.Rich_Mark):
                    if "egress_zones" in all_config:
                        for zone in all_config["egress_zones"]:
                            if zone in ["ANY", "HOST"]:
                                continue
                            z_obj = self.fw_config.get_zone(zone)
                            if self.fw_config and "interfaces" in self.fw_config.get_zone_config_dict(z_obj):
                                raise FirewallError(errors.INVALID_ZONE, "'mark' action cannot be used in a policy if an egress zone has assigned interfaces")
        elif item == "forward_ports":
            for fwd_port in config:
                if "ingress_zones" in all_config and "HOST" in all_config["ingress_zones"]:
                    raise FirewallError(errors.INVALID_ZONE, "'forward-port' is invalid for ingress zone 'HOST'")
                elif "egress_zones" in all_config:
                    if "HOST" in all_config["egress_zones"]:
                        if fwd_port[3]:
                            raise FirewallError(errors.INVALID_FORWARD, "A 'forward-port' with 'to-addr' is invalid for egress zone 'HOST'")
                    elif all_config["egress_zones"]:
                        if not fwd_port[3]:
                            raise FirewallError(errors.INVALID_FORWARD, "'forward-port' requires 'to-addr' if egress zone is 'ANY' or a zone")
                        if "ANY" not in all_config["egress_zones"]:
                            for zone in all_config["egress_zones"]:
                                z_obj = self.fw_config.get_zone(zone)
                                if self.fw_config and "interfaces" in self.fw_config.get_zone_config_dict(z_obj):
                                    raise FirewallError(errors.INVALID_ZONE, "'forward-port' cannot be used in a policy if an egress zone has assigned interfaces")

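    # Illustrative outcomes of the ingress/egress zone checks above
    # (zone names are examples only):
    #   ingress_zones=["internal", "dmz"]              -> allowed
    #   ingress_zones=["ANY"]                          -> allowed
    #   ingress_zones=["ANY", "internal"]              -> INVALID_ZONE (ANY must stand alone)
    #   ingress_zones=["HOST"], egress_zones=["HOST"]  -> INVALID_ZONE (HOST on one side only)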
    def check_name(self, name):
        super(Policy, self).check_name(name)
        if name.startswith('/'):
            raise FirewallError(errors.INVALID_NAME,
                                "'%s' can't start with '/'" % name)
        elif name.endswith('/'):
            raise FirewallError(errors.INVALID_NAME,
                                "'%s' can't end with '/'" % name)
        elif name.count('/') > 1:
            raise FirewallError(errors.INVALID_NAME,
                                "more than one '/' in '%s'" % name)
        else:
            if "/" in name:
                checked_name = name[:name.find('/')]
            else:
                checked_name = name
            if len(checked_name) > max_policy_name_len():
                raise FirewallError(errors.INVALID_NAME,
                                    "Policy of '%s' has %d chars, max is %d" % (
                                    name, len(checked_name),
                                    max_policy_name_len()))
            if self.fw_config:
                if checked_name in self.fw_config.get_zones():
                    raise FirewallError(errors.NAME_CONFLICT, "Policies can't have the same name as a zone.")

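# Name validation summary for Policy.check_name() above (illustrative): a name
# may contain at most one '/', which may appear neither first nor last, the
# part before any '/' must not exceed max_policy_name_len(), and the name must
# not collide with an existing zone name.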
# PARSER

class policy_ContentHandler(IO_Object_ContentHandler):
    def __init__(self, item):
        IO_Object_ContentHandler.__init__(self, item)
        self._rule = None
        self._rule_error = False
        self._limit_ok = None

    def startElement(self, name, attrs):
        IO_Object_ContentHandler.startElement(self, name, attrs)
        if self._rule_error:
            return

        self.item.parser_check_element_attrs(name, attrs)

        if common_startElement(self, name, attrs):
            return

        elif name == "policy":
            if "version" in attrs:
                self.item.version = attrs["version"]
            if "priority" in attrs:
                self.item.priority = int(attrs["priority"])
            if "target" in attrs:
                target = attrs["target"]
                if target not in POLICY_TARGETS:
                    raise FirewallError(errors.INVALID_TARGET, target)
                if target:
                    self.item.target = target

        elif name == "ingress-zone":
            if attrs["name"] not in self.item.ingress_zones:
                self.item.ingress_zones.append(attrs["name"])
            else:
                log.warning("Ingress zone '%s' already set, ignoring.", attrs["name"])

        elif name == "egress-zone":
            if attrs["name"] not in self.item.egress_zones:
                self.item.egress_zones.append(attrs["name"])
            else:
                log.warning("Egress zone '%s' already set, ignoring.", attrs["name"])

        elif name == "source":
            if not self._rule:
                log.warning('Invalid rule: Source outside of rule')
                self._rule_error = True
                return

            if self._rule.source:
                log.warning("Invalid rule: More than one source in rule '%s', ignoring.",
                            str(self._rule))
                self._rule_error = True
                return
            invert = False
            if "invert" in attrs and \
                    attrs["invert"].lower() in [ "yes", "true" ]:
                invert = True
            addr = mac = ipset = None
            if "address" in attrs:
                addr = attrs["address"]
            if "mac" in attrs:
                mac = attrs["mac"]
            if "ipset" in attrs:
                ipset = attrs["ipset"]
            self._rule.source = rich.Rich_Source(addr, mac, ipset,
                                                 invert=invert)
            return

        else:
            log.warning("Unknown XML element '%s'", name)
            return

    def endElement(self, name):
        IO_Object_ContentHandler.endElement(self, name)

        common_endElement(self, name)

def policy_reader(filename, path, no_check_name=False):
    policy = Policy()
    if not filename.endswith(".xml"):
        raise FirewallError(errors.INVALID_NAME,
                            "'%s' is missing .xml suffix" % filename)
    policy.name = filename[:-4]
    if not no_check_name:
        policy.check_name(policy.name)
    policy.filename = filename
    policy.path = path
    policy.builtin = False if path.startswith(config.ETC_FIREWALLD) else True
    policy.default = policy.builtin
    handler = policy_ContentHandler(policy)
    parser = sax.make_parser()
    parser.setContentHandler(handler)
    name = "%s/%s" % (path, filename)
    with open(name, "rb") as f:
        source = sax.InputSource(None)
        source.setByteStream(f)
        try:
            parser.parse(source)
        except sax.SAXParseException as msg:
            raise FirewallError(errors.INVALID_POLICY,
                                "not a valid policy file: %s" % \
                                msg.getException())
    del handler
    del parser
    return policy

def policy_writer(policy, path=None):
    _path = path if path else policy.path

    if policy.filename:
        name = "%s/%s" % (_path, policy.filename)
    else:
        name = "%s/%s.xml" % (_path, policy.name)

    if os.path.exists(name):
        try:
            shutil.copy2(name, "%s.old" % name)
        except Exception as msg:
            log.error("Backup of file '%s' failed: %s", name, msg)

    dirpath = os.path.dirname(name)
    if dirpath.startswith(config.ETC_FIREWALLD) and not os.path.exists(dirpath):
        if not os.path.exists(config.ETC_FIREWALLD):
            os.mkdir(config.ETC_FIREWALLD, 0o750)
        os.mkdir(dirpath, 0o750)

    f = io.open(name, mode='wt', encoding='UTF-8')
    handler = IO_Object_XMLGenerator(f)
    handler.startDocument()

    # start policy element
    attrs = {}
    if policy.version and policy.version != "":
        attrs["version"] = policy.version
    if policy.priority != policy.priority_default:
        attrs["priority"] = str(policy.priority)
    attrs["target"] = policy.target
    handler.startElement("policy", attrs)
    handler.ignorableWhitespace("\n")

    common_writer(policy, handler)

    # ingress-zones
    for zone in uniqify(policy.ingress_zones):
        handler.ignorableWhitespace("  ")
        handler.simpleElement("ingress-zone", { "name": zone })
        handler.ignorableWhitespace("\n")

    # egress-zones
    for zone in uniqify(policy.egress_zones):
        handler.ignorableWhitespace("  ")
        handler.simpleElement("egress-zone", { "name": zone })
        handler.ignorableWhitespace("\n")

    # end policy element
    handler.endElement("policy")
    handler.ignorableWhitespace("\n")
    handler.endDocument()
    f.close()
    del handler
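
# Minimal usage sketch (illustrative only; the directory below and the policy
# name are assumptions based on firewalld's default layout, not defined here):
#
#     from firewall.core.io.policy import policy_reader, policy_writer
#
#     # Parse an existing policy definition, e.g. one under /etc/firewalld/policies
#     policy = policy_reader("allow-host-ipv6.xml", "/etc/firewalld/policies")
#
#     # Adjust it and write it back; the previous file is kept as "<name>.xml.old"
#     policy.target = "ACCEPT"
#     policy_writer(policy, "/etc/firewalld/policies")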

[binary data omitted] The remainder of this archive dump is CPython 3.6 bytecode and is not
readable as text. Compiled members under site-packages/firewall/core/io/__pycache__/:
    firewalld_conf.cpython-36.pyc, direct.cpython-36.pyc, lockdown_whitelist.cpython-36.pyc,
    ipset.cpython-36.opt-1.pyc, ifcfg.cpython-36.opt-1.pyc, icmptype.cpython-36.opt-1.pyc,
    ifcfg.cpython-36.pyc, policy.cpython-36.opt-1.pyc, firewalld_conf.cpython-36.opt-1.pyc,
    ipset.cpython-36.pyc, helper.cpython-36.opt-1.pyc, io_object.cpython-36.pyc

]ûf5�@sdZddddddddgZd	d
ljZd	d
ljjZd	d
lZd	d
lZd	dlm	Z	d	dl
mZd	d
lm
Z
d	dl
mZd	dlmZejdkZGdd�de�ZGdd�de�ZGdd�de�ZGdd�de�ZGdd�dejj�ZGdd�dej�Zdd�Zdd�Zdd�Z dd�Z!d
S)z5Generic io_object handler, io specific check methods.�PY2�	IO_Object�IO_Object_ContentHandler�IO_Object_XMLGenerator�
check_port�check_tcpudp�check_protocol�
check_address�N)�OrderedDict)�	functions)�b2u)�errors)�
FirewallError�3c@s|eZdZdZfZdZgZiZiZdd�Z	dd�Z
dd�Zd	d
�Zdd�Z
d
d�Zdd�Zdd�Zdd�Zdd�Zdd�ZdS)rz; Abstract IO_Object as base for icmptype, service and zone z()cCs"d|_d|_d|_d|_d|_dS)N�F)�filename�path�name�defaultZbuiltin)�self�r�/usr/lib/python3.6/io_object.py�__init__2s
zIO_Object.__init__cCs6g}x(|jD]}|jtjt||d���qWt|�S)Nr	)�IMPORT_EXPORT_STRUCTURE�append�copy�deepcopy�getattr�tuple)r�ret�xrrr�
export_config9szIO_Object.export_configcCsXi}tdd�|jD��}x:|D]2}t||�s<tt||�t�rtjt||��||<qW|S)NcSsg|]}|d|df�qS)r	�r)�.0r rrr�
<listcomp>Asz0IO_Object.export_config_dict.<locals>.<listcomp>)�dictrr�
isinstance�boolrr)r�conf�type_formats�keyrrr�export_config_dict?s
zIO_Object.export_config_dictcCs�|j|�x�t|j�D]~\}\}}t||t�r~g}t�}x,||D] }||krD|j|�|j|�qDW~t||t	j
|��qt||t	j
||��qWdS)N)�check_config�	enumeraterr&�list�setr�add�setattrrr)rr(�i�elementZdummyZ_confZ_setr rrr�
import_configGs

zIO_Object.import_configc	Cs~|j|�xn|D]f}t||�s0ttjdj|���t||t�r`t||tt	j
tj||����qt||tj||��qWdS)Nz-Internal error. '{}' is not a valid attribute)
�check_config_dict�hasattrrr
Z
UNKNOWN_ERROR�formatr&r.r1r
�fromkeysrr)rr(r*rrr�import_config_dictWs


"zIO_Object.import_config_dictcCszt|t�s(ttjd|td�t|�f��t|�dkr@ttjd��x4|D],}|j�rF||j	krFttjd||f��qFWdS)Nz'%s' not of type %s, but %srr"zname can't be emptyz'%s' is not allowed in '%s')
r&�strrr
�INVALID_TYPE�type�lenZINVALID_NAME�isalnum�ADDITIONAL_ALNUM_CHARS)rr�charrrr�
check_namecs


zIO_Object.check_namecCsjt|�t|j�kr0ttjdt|�t|j�f��i}x&t|j�D]\}\}}||||<q@W|j|�dS)Nz structure size mismatch %d != %d)r=rrr
r;r-r5)rr(Z	conf_dictr2r �yrrrr,pszIO_Object.check_configcCsrtdd�|jD��}xX|D]P}|dd�|jD�krDttjdj|���|j||||�|j||||�qWdS)NcSsg|]}|d|df�qS)r	r"r)r#r rrrr$|sz/IO_Object.check_config_dict.<locals>.<listcomp>cSsg|]\}}|�qSrr)r#r rBrrrr$~szoption '{}' is not valid)r%rrr
ZINVALID_OPTIONr7�_check_config_structure�
_check_config)rr(r)r*rrrr5{s
zIO_Object.check_config_dictcCsdS)Nr)rZdummy1Zdummy2Zdummy3rrrrD�szIO_Object._check_configc	Cs`t|t|��s,ttjd|t|�t|�f��t|t�rrt|�dkrRttjd|��x|D]}|j||d�qXWn�t|t�r�t|�t|�kr�ttjd|t|�f��x�t	|�D]\}}|j|||�q�Wn�t|t
��r\t|j��d\}}xn|j�D]b\}}t|t|���s,ttjd|t|�t|�f��t|t|��s�ttjd|t|�t|�f��q�WdS)Nz'%s' not of type %s, but %sr"zlen('%s') != 1r	zlen('%s') != %d)r&r<rr
r;r.r=rCrr-r%�items)	rr(Z	structurer r2�valueZskeyZsvaluer*rrrrC�s8



z!IO_Object._check_config_structurecCs�|j�}d}||jkrdd}|j|dk	rdx:|j|D],}||krL|j|�q4ttjd||f��q4W||jkr�d}x$|j|D]}||kr~|j|�q~W|s�ttjd|��x |D]}ttjd||f��q�WdS)NFTzMissing attribute %s for %szUnexpected element %sz%s: Unexpected attribute %s)ZgetNames�PARSER_REQUIRED_ELEMENT_ATTRS�removerr
ZPARSE_ERROR�PARSER_OPTIONAL_ELEMENT_ATTRS)rr�attrsZ_attrs�foundr rrr�parser_check_element_attrs�s,



z$IO_Object.parser_check_element_attrsN)�__name__�
__module__�__qualname__�__doc__rZDBUS_SIGNATUREr?rGrIrr!r+r4r9rAr,r5rDrCrLrrrrr)s"
!cs$eZdZ�fdd�Zdd�Z�ZS)�UnexpectedElementErrorcstt|�j�||_dS)N)�superrQrr)rr)�	__class__rrr�szUnexpectedElementError.__init__cCs
d|jS)NzUnexpected element '%s')r)rrrr�__str__�szUnexpectedElementError.__str__)rMrNrOrrT�
__classcell__rr)rSrrQ�srQcs$eZdZ�fdd�Zdd�Z�ZS)�MissingAttributeErrorcstt|�j�||_||_dS)N)rRrVrr�	attribute)rrrW)rSrrr�szMissingAttributeError.__init__cCsd|j|jfS)Nz$Element '%s': missing '%s' attribute)rrW)rrrrrT�szMissingAttributeError.__str__)rMrNrOrrTrUrr)rSrrV�srVcs$eZdZ�fdd�Zdd�Z�ZS)�UnexpectedAttributeErrorcstt|�j�||_||_dS)N)rRrXrrrW)rrrW)rSrrr�sz!UnexpectedAttributeError.__init__cCsd|j|jfS)Nz'Element '%s': unexpected attribute '%s')rrW)rrrrrT�sz UnexpectedAttributeError.__str__)rMrNrOrrTrUrr)rSrrX�srXc@s4eZdZdd�Zdd�Zdd�Zdd�Zd	d
�ZdS)rcCs||_d|_dS)Nr)�item�_element)rrYrrrr�sz!IO_Object_ContentHandler.__init__cCs
d|_dS)Nr)rZ)rrrr�
startDocument�sz&IO_Object_ContentHandler.startDocumentcCs
d|_dS)Nr)rZ)rrrJrrr�startElement�sz%IO_Object_ContentHandler.startElementcCs*|dkr|j|j_n|dkr&|j|j_dS)N�short�description)rZrYr]r^)rrrrr�
endElement�sz#IO_Object_ContentHandler.endElementcCs|j|jdd�7_dS)N�
� )rZ�replace)r�contentrrr�
characters�sz#IO_Object_ContentHandler.charactersN)rMrNrOrr[r\r_rdrrrrr�s
c@s<eZdZdd�Zdd�Zdd�Zdd�Zd	d
�Zdd�Zd
S)rcCsNtjjj|�|j|_|j|_ig|_|jd|_	g|_
d|_d|_d|_
dS)Nr"zutf-8F���)�sax�handler�ContentHandlerr�write�_write�flushZ_flushZ_ns_contextsZ_current_contextZ_undeclared_ns_mapsZ	_encodingZ_pending_start_elementZ_short_empty_elements)r�outrrrr�szIO_Object_XMLGenerator.__init__cCs*trdd�|j�D�}tjj|||�dS)a saxutils.XMLGenerator.startElement() expects name and attrs to be
            unicode and bad things happen if any of them is (utf-8) encoded.
            We override the method here to sanitize this case.
            Can be removed once we drop Python2 support.
        cSsi|]\}}t|�t|��qSr)r)r#rrFrrr�
<dictcomp>sz7IO_Object_XMLGenerator.startElement.<locals>.<dictcomp>N)rrE�saxutils�XMLGeneratorr\)rrrJrrrr\sz#IO_Object_XMLGenerator.startElementcCs�trX|jdt|��x4|j�D](\}}|jdt|�tjt|��f�q W|jd�nF|jd|�x,|j�D] \}}|jd|tj|�f�qpW|jd�dS)z* slightly modified startElement()
        �<z %s=%sz/>N)rrjrrErnZ	quoteattr)rrrJrFrrr�
simpleElementsz$IO_Object_XMLGenerator.simpleElementcCstjj|t|��dS)z� saxutils.XMLGenerator.endElement() expects name to be
            unicode and bad things happen if it's (utf-8) encoded.
            We override the method here to sanitize this case.
            Can be removed once we drop Python2 support.
        N)rnror_r)rrrrrr_sz!IO_Object_XMLGenerator.endElementcCstjj|t|��dS)z� saxutils.XMLGenerator.characters() expects content to be
            unicode and bad things happen if it's (utf-8) encoded.
            We override the method here to sanitize this case.
            Can be removed once we drop Python2 support.
        N)rnrordr)rrcrrrrd%sz!IO_Object_XMLGenerator.characterscCstjj|t|��dS)a saxutils.XMLGenerator.ignorableWhitespace() expects content to be
            unicode and bad things happen if it's (utf-8) encoded.
            We override the method here to sanitize this case.
            Can be removed once we drop Python2 support.
        N)rnro�ignorableWhitespacer)rrcrrrrr-sz*IO_Object_XMLGenerator.ignorableWhitespaceN)	rMrNrOrr\rqr_rdrrrrrrr�s
cCs�tj|�}|dkr$ttjd|��n`|dkr>ttjd|��nF|dkrXttjd|��n,t|�dkr�|d|dkr�ttjd|��dS)	N�zport number in '%s' is too bigr"z'%s' is invalid port rangezport range '%s' is ambiguousr	���re)rZgetPortRangerr
ZINVALID_PORTr=)ZportZ
port_rangerrrr5s
cCs|dkrttjd|��dS)N�tcp�udp�sctp�dccpz)'%s' not from {'tcp'|'udp'|'sctp'|'dccp'})rurvrwrx)rr
�INVALID_PROTOCOL)�protocolrrrrDscCstj|�sttj|��dS)N)rZ
checkProtocolrr
ry)rzrrrrJs
cCs$tj||�s ttjd||f��dS)Nz'%s' is not valid %s address)rrrr
ZINVALID_ADDR)ZipvZaddrrrrrNs)"rP�__all__Zxml.saxrfZxml.sax.saxutilsrnr�sys�collectionsr
ZfirewallrZfirewall.functionsrr
Zfirewall.errorsr�versionr�objectr�	ExceptionrQrVrXrgrhrrorrrrrrrrr�<module>s0

		Csite-packages/firewall/core/io/__pycache__/service.cpython-36.pyc000064400000020377147511334670020752 0ustar003

]ûf�2�@s�dddgZddljZddlZddlZddlZddlmZddlm	Z	ddl
mZmZm
Z
mZmZmZmZmZddlmZdd	lmZdd
lmZGdd�de�ZGdd
�d
e
�Zdd�Zddd�ZdS)�Service�service_reader�service_writer�N)�config)�
u2b_if_py2)�PY2�	IO_Object�IO_Object_ContentHandler�IO_Object_XMLGenerator�
check_port�check_tcpudp�check_protocol�
check_address)�log)�errors)�
FirewallErrorcs�eZdZd d!d"dd#gfddgfdddifddgfd	d$gfd
dgfddgff
Zdd
gZdddd�Zddgddgdgdgddgddgdgdgd�Z�fdd�Zdd�Zdd�Z	dd�Z
�ZS)%r�version��short�description�ports�modules�destination�	protocols�source_ports�includes�helpers�_�-N)rr�service�name�port�protocol�value�ipv4�ipv6r)rr!r"�modulerzsource-port�include�helpercsNtt|�j�d|_d|_d|_g|_g|_g|_i|_	g|_
g|_g|_dS)Nr)
�superr�__init__rrrrrrrrrr)�self)�	__class__��/usr/lib/python3.6/service.pyr*DszService.__init__cCshd|_d|_d|_|jdd�=|jdd�=|jdd�=|jj�|jdd�=|j	dd�=|j
dd�=dS)Nr)rrrrrrr�clearrrr)r+r-r-r.�cleanupQs
zService.cleanupcCs�t|j�|_t|j�|_t|j�|_dd�|jD�|_dd�|jD�|_dd�|jj�D�|_dd�|jD�|_dd�|j	D�|_	dd�|j
D�|_
d	d�|jD�|_d
S)z� HACK. I haven't been able to make sax parser return
            strings encoded (because of python 2) instead of in unicode.
            Get rid of it once we throw out python 2 support.cSs g|]\}}t|�t|�f�qSr-)r)�.0�po�prr-r-r.�
<listcomp>dsz*Service.encode_strings.<locals>.<listcomp>cSsg|]}t|��qSr-)r)r1�mr-r-r.r4escSsi|]\}}t|�t|��qSr-)r)r1�k�vr-r-r.�
<dictcomp>fsz*Service.encode_strings.<locals>.<dictcomp>cSsg|]}t|��qSr-)r)r1r3r-r-r.r4gscSs g|]\}}t|�t|�f�qSr-)r)r1r2r3r-r-r.r4hscSsg|]}t|��qSr-)r)r1�sr-r-r.r4jscSsg|]}t|��qSr-)r)r1r9r-r-r.r4ksN)rrrrrrr�itemsrrrr)r+r-r-r.�encode_strings]szService.encode_stringscCs:|dkrJx>|D]6}|ddkr8t|d�t|d�qt|d�qWn�|dkrjx�|D]}t|�qXWn�|dkr�x�|D]}t|d�t|d�qxWn�|dkr�x�|D]*}|dkr�ttjd
|��t|||�q�Wn^|dk�r6xR|D]J}|jd��r|jdd�}d
|k�r|jd
d�}t	|�dkr�ttj
|��q�WdS)Nrrr�rrrr$r%z'%s' not in {'ipv4'|'ipv6'}r�
nf_conntrack_rr�)r$r%)rrr
rrZINVALID_DESTINATIONr�
startswith�replace�lenZINVALID_MODULE)r+r�itemZ
all_configr!�protorr&r-r-r.�
_check_configms8






zService._check_config)rr)rr)rr)rr)rr)�__name__�
__module__�__qualname__ZIMPORT_EXPORT_STRUCTUREZADDITIONAL_ALNUM_CHARSZPARSER_REQUIRED_ELEMENT_ATTRSZPARSER_OPTIONAL_ELEMENT_ATTRSr*r0r;rD�
__classcell__r-r-)r,r.r&s4


c@seZdZdd�ZdS)�service_ContentHandlercCs0tj|||�|jj||�|dkrTd|kr<tjd|d�d|krP|d|j_�n�|dkr`�n�|dkrl�n�|dk�r$|ddkr�t|d�t|d	�|d|d	f}||jj	kr�|jj	j
|�ntjd
|d|d	�nBt|d	�|d	|jjk�r|jjj
|d	�ntjd|d	��n|d	k�rtt|d�|d|jjk�r`|jjj
|d�ntjd|d��n�|d
k�r�t|d�t|d	�|d|d	f}||jj
k�r�|jj
j
|�ntjd|d|d	��nN|dk�r>xRdD]J}||k�r�t|||�||jjk�r&tjd|�n|||jj|<�q�Wn�|dk�r�|d}|jd��r~|jdd�}d|k�r~|jdd�}||jjk�r�|jjj
|�ntjd|�n�|dk�r�|d|jjk�r�|jjj
|d�ntjd|d�n@|dk�r,|d|jjk�r|jjj
|d�ntjd|d�dS)Nrr z'Ignoring deprecated attribute name='%s'rrrr!rr"z#Port '%s/%s' already set, ignoring.z$Protocol '%s' already set, ignoring.r#zsource-portz)SourcePort '%s/%s' already set, ignoring.rr$r%z2Destination address for '%s' already set, ignoringr&r=rrz"Module '%s' already set, ignoring.r'z#Include '%s' already set, ignoring.r(z"Helper '%s' already set, ignoring.)r$r%)r	�startElementrBZparser_check_element_attrsrZwarningrrrr�appendr
rrrrr?r@rrr)r+r �attrs�entry�xr&r-r-r.rJ�s�










z#service_ContentHandler.startElementN)rErFrGrJr-r-r-r.rI�srIc	Cst�}|jd�s ttjd|��|dd	�|_|j|j�||_||_|j	t
j�rVdnd|_|j|_
t|�}tj�}|j|�d||f}t|d��b}tjd�}|j|�y|j|�Wn8tjk
r�}zttjd|j���WYdd}~XnXWdQRX~~t�r|j�|S)
Nz.xmlz'%s' is missing .xml suffix�FTz%s/%s�rbznot a valid service file: %s���)r�endswithrrZINVALID_NAMEr Z
check_name�filename�pathr?r�
ETC_FIREWALLDZbuiltin�defaultrI�saxZmake_parserZsetContentHandler�openZInputSourceZ
setByteStream�parseZSAXParseExceptionZINVALID_SERVICEZgetExceptionrr;)	rSrTr�handler�parserr �f�source�msgr-r-r.r�s8




(cCsr|r|n|j}|jr$d||jf}nd||jf}tjj|�r�ytj|d|�Wn0tk
r�}ztj	d||�WYdd}~XnXtjj
|�}|jtj
�r�tjj|�r�tjjtj
�s�tjtj
d�tj|d�tj|ddd�}t|�}|j�i}|j�r|jd	k�r|j|d
<|jd|�|jd�|j�rt|jd	k�rt|jd
�|jdi�|j|j�|jd�|jd�|j�r�|jd	k�r�|jd
�|jdi�|j|j�|jd�|jd�x>|jD]4}	|jd
�|jd|	d|	dd��|jd��q�Wx4|jD]*}
|jd
�|jdd|
i�|jd��qWx>|jD]4}	|jd
�|jd|	d|	dd��|jd��q<Wx4|jD]*}|jd
�|jdd|i�|jd��q|Wt|j �dk�r�|jd
�|jd|j �|jd�x4|j!D]*}|jd
�|jdd|i�|jd��q�Wx4|j"D]*}
|jd
�|jdd|
i�|jd��qW|jd�|jd�|j#�|j$�~dS)Nz%s/%sz	%s/%s.xmlz%s.oldzBackup of file '%s' failed: %si�ZwtzUTF-8)�mode�encodingrrr�
z  rrr!rr<)r!r"r"r#zsource-portr&r rr'r()%rTrSr �os�exists�shutilZcopy2�	Exceptionr�error�dirnamer?rrU�mkdir�iorXr
Z
startDocumentrrJZignorableWhitespacerZ
charactersZ
endElementrrZ
simpleElementrrrrArrrZendDocument�close)rrT�_pathr r^�dirpathr\rZrLr!r"r&r'r(r-r-r.rs� 

















)N)�__all__Zxml.saxrWrbrirdZfirewallrZfirewall.functionsrZfirewall.core.io.io_objectrrr	r
rrr
rZfirewall.core.loggerrrZfirewall.errorsrrrIrrr-r-r-r.�<module>s

(mQsite-packages/firewall/core/io/__pycache__/functions.cpython-36.opt-1.pyc000064400000005256147511334670022260 0ustar003

]ûfy�@s�ddlZddlmZddlmZddlmZddlmZddl	m
Z
ddlmZddl
mZdd	lmZdd
lmZddlmZddlmZdd
lmZdd�ZdS)�N)�config)�
FirewallError)�FirewallConfig)�zone_reader)�service_reader)�ipset_reader)�icmptype_reader)�
helper_reader)�
policy_reader)�Direct)�LockdownWhitelist)�firewalld_confc	-Cs|t|�}t|jtjtjgd�t|jtjtj	gd�t
|jtjtj
gd�t|jtjtjgd�t|jtjtjgd�t|jtjtjgd�d�}�x
|j�D�]�}x�||dD]�}tjj|�s�q�x�ttj|��D]�}|j d�r�yD||d||�}|d
k�r�||_!|j"|j#��||d|�Wq�t$k
�rT}zt$|j%d	||j&f��WYdd}~Xq�t'k
�r�}zt'd	||f��WYdd}~Xq�Xq�Wq�Wq�Wtjj(tj)��r:y$t*tj)�}|j+�|j,|j-��Wnpt$k
�r}zt$|j%d	tj)|j&f��WYdd}~Xn6t'k
�r8}zt'd	tj)|f��WYdd}~XnXtjj(tj.��r�y$t/tj.�}|j+�|j,|j-��Wnpt$k
�r�}zt$|j%d	tj.|j&f��WYdd}~Xn6t'k
�r�}zt'd	tj.|f��WYdd}~XnXtjj(tj0��rxyt1tj0�}|j+�Wnpt$k
�rB}zt$|j%d	tj0|j&f��WYdd}~Xn6t'k
�rv}zt'd	tj0|f��WYdd}~XnXdS)N)�reader�add�dirs)Zipset�helperZicmptypeZservice�zone�policyrz.xmlrrrrz'%s': %s)rr)2rrZ	add_ipsetrZFIREWALLD_IPSETSZETC_FIREWALLD_IPSETSr	Z
add_helperZFIREWALLD_HELPERSZETC_FIREWALLD_HELPERSrZadd_icmptypeZFIREWALLD_ICMPTYPESZETC_FIREWALLD_ICMPTYPESrZadd_serviceZFIREWALLD_SERVICESZETC_FIREWALLD_SERVICESrZadd_zoneZFIREWALLD_ZONESZETC_FIREWALLD_ZONESr
Zadd_policy_objectZFIREWALLD_POLICIESZETC_FIREWALLD_POLICIES�keys�os�path�isdir�sorted�listdir�endswith�	fw_configZcheck_config_dictZexport_config_dictr�code�msg�	Exception�isfileZFIREWALLD_DIRECTr�read�check_configZ
export_configZLOCKDOWN_WHITELISTrZFIREWALLD_CONFr
)	�fwrZreadersrZ_dir�file�obj�errorr�r&�/usr/lib/python3.6/functions.pyr!&sz

&.
($
($
(r!)rZfirewallrZfirewall.errorsrZfirewall.core.fw_configrZfirewall.core.io.zonerZfirewall.core.io.servicerZfirewall.core.io.ipsetrZfirewall.core.io.icmptyperZfirewall.core.io.helperr	Zfirewall.core.io.policyr
Zfirewall.core.io.directrZ#firewall.core.io.lockdown_whitelistrZfirewall.core.io.firewalld_confr
r!r&r&r&r'�<module>ssite-packages/firewall/core/io/__pycache__/icmptype.cpython-36.pyc000064400000011635147511334670021141 0ustar003

]ûf��@s�dddgZddljZddlZddlZddlZddlmZddlm	Z	ddl
mZmZm
Z
mZddlmZdd	lmZdd
lmZGdd�de�ZGdd
�d
e
�Zdd�Zddd�ZdS)�IcmpType�icmptype_reader�icmptype_writer�N)�config)�
u2b_if_py2)�PY2�	IO_Object�IO_Object_ContentHandler�IO_Object_XMLGenerator)�log)�errors)�
FirewallErrorcspeZdZdddddgffZdZddgZd	d	d	d
�Zddgdd
gd�Z�fdd�Zdd�Z	dd�Z
dd�Z�ZS)r�version��short�description�destinationz(sssas)�_�-N)rr�icmptype�name�ipv4�ipv6)rrcs*tt|�j�d|_d|_d|_g|_dS)Nr)�superr�__init__rrrr)�self)�	__class__��/usr/lib/python3.6/icmptype.pyr8s
zIcmpType.__init__cCs"d|_d|_d|_|jdd�=dS)Nr)rrrr)rrrr�cleanup?szIcmpType.cleanupcCs:t|j�|_t|j�|_t|j�|_dd�|jD�|_dS)z� HACK. I haven't been able to make sax parser return
            strings encoded (because of python 2) instead of in unicode.
            Get rid of it once we throw out python 2 support.cSsg|]}t|��qSr)r)�.0�mrrr�
<listcomp>Lsz+IcmpType.encode_strings.<locals>.<listcomp>N)rrrrr)rrrr�encode_stringsEszIcmpType.encode_stringscCs2|dkr.x$|D]}|dkrttjd|��qWdS)Nrrrz'%s' not from {'ipv4'|'ipv6'})rr)r
rZINVALID_DESTINATION)rr�itemZ
all_configrrrr�
_check_configNs
zIcmpType._check_config)rr)rr)rr)
�__name__�
__module__�__qualname__ZIMPORT_EXPORT_STRUCTUREZDBUS_SIGNATUREZADDITIONAL_ALNUM_CHARSZPARSER_REQUIRED_ELEMENT_ATTRSZPARSER_OPTIONAL_ELEMENT_ATTRSrrr#r%�
__classcell__rr)rrr%s	c@seZdZdd�ZdS)�icmptype_ContentHandlercCs�tj|||�|jj||�|dkrTd|kr>tjd|d�d|kr�|d|j_nT|dkr^nJ|dkrhn@|dkr�x6dD].}||krv||j�d
krv|jjj	t
|��qvWdS)Nrrz'Ignoring deprecated attribute name='%s'rrrrrr�yes�true)rr)r+r,)r	�startElementr$Zparser_check_element_attrsrZwarningr�lowerr�append�str)rr�attrs�xrrrr-Ys"
z$icmptype_ContentHandler.startElementN)r&r'r(r-rrrrr*Xsr*c	Cst�}|jd�s ttjd|��|dd	�|_|j|j�||_||_|j	t
j�rVdnd|_|j|_
t|�}tj�}|j|�d||f}t|d��b}tjd�}|j|�y|j|�Wn8tjk
r�}zttjd|j���WYdd}~XnXWdQRX~~t�r|j�|S)
Nz.xmlz%s is missing .xml suffix�FTz%s/%s�rbznot a valid icmptype file: %s���)r�endswithr
rZINVALID_NAMErZ
check_name�filename�path�
startswithr�
ETC_FIREWALLDZbuiltin�defaultr*�saxZmake_parserZsetContentHandler�openZInputSourceZ
setByteStream�parseZSAXParseExceptionZINVALID_ICMPTYPEZgetExceptionrr#)	r7r8r�handler�parserr�f�source�msgrrrrms8




(c
Cs.|r|n|j}|jr$d||jf}nd||jf}tjj|�r�ytj|d|�Wn0tk
r�}ztj	d||�WYdd}~XnXtjj
|�}|jtj
�r�tjj|�r�tjjtj
�s�tjtj
d�tj|d�tj|ddd�}t|�}|j�i}|j�r|jd	k�r|j|d
<|jd|�|jd�|j�rt|jd	k�rt|jd
�|jdi�|j|j�|jd�|jd�|j�r�|jd	k�r�|jd
�|jdi�|j|j�|jd�|jd�|j�r|jd
�i}x|jD]}	d||	<�q�W|jd|�|jd�|jd�|jd�|j�|j�~dS)Nz%s/%sz	%s/%s.xmlz%s.oldzBackup of file '%s' failed: %si�ZwtzUTF-8)�mode�encodingrrr�
z  rrr+r)r8r7r�os�exists�shutilZcopy2�	Exceptionr�error�dirnamer9rr:�mkdir�ior=r
Z
startDocumentrr-ZignorableWhitespacerZ
charactersZ
endElementrrZ
simpleElementZendDocument�close)
rr8�_pathrrC�dirpathrAr?r1r2rrrr�s\ 











)N)�__all__Zxml.saxr<rGrNrIZfirewallrZfirewall.functionsrZfirewall.core.io.io_objectrrr	r
Zfirewall.core.loggerrrZfirewall.errorsr
rr*rrrrrr�<module>s

3site-packages/firewall/core/io/__pycache__/direct.cpython-36.opt-1.pyc000064400000027263147511334700021516 0ustar003

]ûf�=�@s�ddljZddlZddlZddlZddlmZddlmZddl	m
Z
mZmZddl
mZmZmZddlmZddlmZddlmZdd	lmZdd
lmZGdd�de�ZGd
d�de�ZdS)�N)�config)�LastUpdatedOrderedDict)�	splitArgs�joinArgs�
u2b_if_py2)�	IO_Object�IO_Object_ContentHandler�IO_Object_XMLGenerator)�log)�	ipXtables)�ebtables)�errors)�
FirewallErrorc@s$eZdZdd�Zdd�Zdd�ZdS)�direct_ContentHandlercCstj||�d|_dS)NF)r�__init__�direct)�self�item�r�/usr/lib/python3.6/direct.pyr(szdirect_ContentHandler.__init__cCs�tj|||�|jj||�|dkr@|jr6ttjd��d|_�n>|dkr�|js\tj	d�dS|d}|d}|d}|jj
t|�t|�t|��n�|dk�r6|js�tj	d	�dS|d}|dkr�ttjd
|��|d}|d}yt
|d�}Wn(tk
�rtj	d|d�dSXt|�t|�t|�|g|_nH|dk�rl|j�sVtj	d�dS|d}t|�g|_ntj	d|�dSdS)NrzMore than one direct tag.T�chainz$Parse Error: chain outside of direct�ipv�table�rulez#Parse Error: rule outside of direct�ipv4�ipv6�ebz"'%s' not from {'ipv4'|'ipv6'|'eb'}�priorityz'Parse Error: %s is not a valid priority�passthroughz&Parse Error: command outside of directzUnknown XML element %s)rrr)r�startElementrZparser_check_element_attrsrrr
ZPARSE_ERRORr
�error�	add_chainr�INVALID_IPV�int�
ValueError�_rule�_passthrough)r�nameZattrsrrrrrrrr,sT






z"direct_ContentHandler.startElementcCs�tj||�|dkrX|jrF|jjdd�t|j�D��|jj|j�n
tj	d�d|_nJ|dkr�|jr�|j
jdd�t|j�D��|jj|j
�n
tj	d	�d|_
dS)
NrcSsg|]}t|��qSr)r)�.0�xrrr�
<listcomp>dsz4direct_ContentHandler.endElement.<locals>.<listcomp>z2Error: rule does not have any arguments, ignoring.rcSsg|]}t|��qSr)r)r(r)rrrr*msz0Error: passthrough does not have any arguments, z	ignoring.z9Error: passthrough does not have any arguments, ignoring.)r�
endElementZ_elementr%�appendrr�add_ruler
r r&�add_passthrough)rr'rrrr+^s 
z direct_ContentHandler.endElementN)�__name__�
__module__�__qualname__rrr+rrrrr's2rcs<eZdZdZddBgfddddddgfgfdddgfgffZdZdd	d
dgd	d
ddgd	gd
�ZiZ�fdd�Zdd�Z	dd�Z
dd�Zdd�Zdd�Z
dd�Zdd�Zdd�Zd d!�Zd"d#�Zd$d%�Zd&d'�Zd(d)�Zd*d+�Zd,d-�Zd.d/�Zd0d1�Zd2d3�Zd4d5�Zd6d7�Zd8d9�Zd:d;�Zd<d=�Zd>d?�Z d@dA�Z!�Z"S)C�Directz Direct class �chains��rulesr�passthroughsz(a(sss)a(sssias)a(sas))Nrrrr)rrrrcs0tt|�j�||_t�|_t�|_t�|_dS)N)�superr2r�filenamerr3r5r6)rr8)�	__class__rrr�s
zDirect.__init__cCsdS)Nr)r�confrZall_confrrr�
_check_config�szDirect._check_configcCsg}g}x>|jD]4}x.|j|D] }|jtt|�t|g���q WqW|j|�g}xR|jD]H}xB|j|D]4}|jt|d|d|d|dt|d�f��qnWq^W|j|�g}x8|jD].}x(|j|D]}|jt|t|�f��q�Wq�W|j|�t|�S)Nr��)r3r,�tuple�listr5r6)r�retr)�keyrrrrr�
export_config�s$$


zDirect.export_configcCs�|j�|j|�x�t|j�D]x\}\}}|dkrNx||D]}|j|�q<W|dkrrx||D]}|j|�q`W|dkrx||D]}|j|�q�WqWdS)Nr3r5r6)�cleanupZcheck_config�	enumerate�IMPORT_EXPORT_STRUCTUREr!r-r.)rr:�i�elementZdummyr)rrr�
import_config�s
zDirect.import_configcCs"|jj�|jj�|jj�dS)N)r3�clearr5r6)rrrrrC�s

zDirect.cleanupcCs�td�x4|jD]*}td|d|ddj|j|�f�qWtd�xZ|jD]P}td|d|d|df�x,|j|D]\}}td	|d
j|�f�q|WqNWtd�x@|jD]6}td|�x$|j|D]}td
d
j|��q�Wq�WdS)Nr3z  (%s, %s): %srr<�,r5z  (%s, %s, %s):r=z    (%d, ('%s'))z','r6z  %s:z
    ('%s'))�printr3�joinr5r6)rrAr�argsrrr�output�sz
Direct.outputcCs*dddg}||kr&ttjd||f��dS)Nrrrz'%s' not in '%s')rr
r")rrZipvsrrr�
_check_ipv�s
zDirect._check_ipvcCsF|j|�|dkrtjj�ntjj�}||krBttjd||f��dS)Nrrz'%s' not in '%s')rr)rOrZBUILT_IN_CHAINS�keysrrr
Z
INVALID_TABLE)rrrZtablesrrr�_check_ipv_table�s

zDirect._check_ipv_tablecCsd|j||�||f}||jkr(g|j|<||j|krH|j|j|�ntjd|||fd�dS)Nz(Chain '%s' for table '%s' with ipv '%s' zalready in list, ignoring)rQr3r,r
�warning)rrrrrArrrr!�s

zDirect.add_chaincCsn|j||�||f}||jkrX||j|krX|j|j|�t|j|�dkrj|j|=ntd|||f��dS)Nrz4Chain '%s' with table '%s' with ipv '%s' not in list)rQr3�remove�lenr$)rrrrrArrr�remove_chain�s
zDirect.remove_chaincCs,|j||�||f}||jko*||j|kS)N)rQr3)rrrrrArrr�query_chain�szDirect.query_chaincCs<|j||�||f}||jkr(|j|Std||f��dS)Nz&No chains for table '%s' with ipv '%s')rQr3r$)rrrrArrr�
get_chains�s

zDirect.get_chainscCs|jS)N)r3)rrrr�get_all_chainsszDirect.get_all_chainscCs�|j||�|||f}||jkr,t�|j|<|t|�f}||j|krV||j||<n*tjddj|�||fd||fd�dS)Nz(Rule '%s' for table '%s' and chain '%s' z',zwith ipv '%s' and priority %d zalready in list, ignoring)rQr5rr>r
rRrL)rrrrrrMrA�valuerrrr-s

zDirect.add_rulecCs�|j||�|||f}|t|�f}||jkrb||j|krb|j||=t|j|�dkr�|j|=n$tddj|�||fd||f��dS)Nrz(Rule '%s' for table '%s' and chain '%s' z',z)with ipv '%s' and priority %d not in list)rQr>r5rTr$rL)rrrrrrMrArYrrr�remove_rules

zDirect.remove_rulecCsb|j||�|||f}||jkr^x"|j|j�D]}|j||=q0Wt|j|�dkr^|j|=dS)Nr)rQr5rPrT)rrrrrArYrrr�remove_rules"s

zDirect.remove_rulescCs:|j||�|||f}|t|�f}||jko8||j|kS)N)rQr>r5)rrrrrrMrArYrrr�
query_rule+s
zDirect.query_rulecCsF|j||�|||f}||jkr*|j|Std||fd|��dS)Nz'No rules for table '%s' and chain '%s' z
with ipv '%s')rQr5r$)rrrrrArrr�	get_rules1s


zDirect.get_rulescCs|jS)N)r5)rrrr�
get_all_rules:szDirect.get_all_rulescCs^|j|�||jkrg|j|<||j|kr>|j|j|�ntjddj|�|fd�dS)NzPassthrough '%s' for ipv '%s'z',zalready in list, ignoring)rOr6r,r
rRrL)rrrMrrrr.?s


zDirect.add_passthroughcCsl|j|�||jkrN||j|krN|j|j|�t|j|�dkrh|j|=ntddj|�|fd��dS)NrzPassthrough '%s' for ipv '%s'z',znot in list)rOr6rSrTr$rL)rrrMrrr�remove_passthroughIs

zDirect.remove_passthroughcCs"|j|�||jko ||j|kS)N)rOr6)rrrMrrr�query_passthroughSs
zDirect.query_passthroughcCs.|j|�||jkr|j|Std|��dS)NzNo passthroughs for ipv '%s')rOr6r$)rrrrr�get_passthroughsWs


zDirect.get_passthroughscCs|jS)N)r6)rrrr�get_all_passthroughs^szDirect.get_all_passthroughscCs�|j�|jjd�s&ttjd|j��t|�}tj�}|j	|�t
|jd��b}tjd�}|j|�y|j
|�Wn8tjk
r�}zttjd|j���WYdd}~XnXWdQRXdS)Nz.xmlz'%s' is missing .xml suffix�rbzNot a valid file: %s)rCr8�endswithrr
ZINVALID_NAMEr�saxZmake_parserZsetContentHandler�openZInputSourceZ
setByteStream�parseZSAXParseExceptionZINVALID_TYPEZgetException)r�handler�parser�f�source�msgrrr�readcs 


zDirect.readc
CsBtjj|j�r\ytj|jd|j�Wn4tk
rZ}ztd|j|f��WYdd}~XnXtjjtj	�sxtj
tj	d�tj|jddd�}t
|�}|j�|jdi�|jd�xR|jD]H}|\}}x:|j|D],}|jd	�|jd
|||d��|jd�q�Wq�Wx�|jD]�}|\}}}xx|j|D]j\}}	t|	�dk�r@�q&|jd	�|jd
|||d|d��|jtjjt|	���|jd
�|jd��q&W�qWx||jD]r}xj|j|D]\}	t|	�dk�rȐq�|jd	�|jdd|i�|jtjjt|	���|jd�|jd��q�W�q�W|jd�|jd�|j�|j�~dS)Nz%s.oldzBackup of '%s' failed: %si�ZwtzUTF-8)�mode�encodingr�
z  r)rrrr<rz%d)rrrrrr)�os�path�existsr8�shutilZcopy2�	Exception�IOErrorrZ
ETC_FIREWALLD�mkdir�iorfr	Z
startDocumentrZignorableWhitespacer3Z
simpleElementr5rTreZsaxutils�escaperr+r6ZendDocument�close)
rrlrjrhrArrrrrMrrr�writeusZ$











zDirect.write)r4r4r4)#r/r0r1�__doc__rEZDBUS_SIGNATUREZPARSER_REQUIRED_ELEMENT_ATTRSZPARSER_OPTIONAL_ELEMENT_ATTRSrr;rBrHrCrNrOrQr!rUrVrWrXr-rZr[r\r]r^r.r_r`rarbrmr{�
__classcell__rr)r9rr2usH

	
		

r2)Zxml.saxrerqrxrtZfirewallrZfirewall.fw_typesrZfirewall.functionsrrrZfirewall.core.io.io_objectrrr	Zfirewall.core.loggerr
Z
firewall.corerrr
Zfirewall.errorsrrr2rrrr�<module>s
Nsite-packages/firewall/core/io/__pycache__/zone.cpython-36.pyc000064400000031307147511334700020252 0ustar003

]ûf�M�@sdddgZddljZddlZddlZddlZddlmZddlm	Z	m
Z
mZmZm
Z
mZmZddlmZmZddlmZmZmZmZdd	lmZmZmZmZdd
lmZddlm Z ddlm!Z!dd
l"m#Z#Gdd�de�Z$Gdd�de�Z%ddd�Z&ddd�Z'dS)�Zone�zone_reader�zone_writer�N)�config)�checkIPnMask�
checkIP6nMask�checkInterface�uniqify�max_zone_name_len�
u2b_if_py2�	check_mac)�DEFAULT_ZONE_TARGET�ZONE_TARGETS)�PY2�	IO_Object�IO_Object_ContentHandler�IO_Object_XMLGenerator)�common_startElement�common_endElement�common_check_config�
common_writer)�rich)�log)�errors)�
FirewallErrorcsfeZdZdZd@dAdBdCdDd	dgfd
dEgfddgfdFd
dGgfddgfddgfddgfddgfddHgfdIdJfZdddgZddddgddgdgdgdddgdgddddgddgddddddgdgdd�Zddddgd gd!d"gd#d$gd%d&d'd#d(gd%d'd(gd)d*gd+gd,gd-�	Zed.d/��Z	�fd0d1�Z
d2d3�Zd4d5�Z�fd6d7�Z
�fd8d9�Zd:d;�Z�fd<d=�Zd>d?�Z�ZS)Krz Zone class �version��short�description�UNUSEDF�target�services�ports�icmp_blocks�
masquerade�
forward_ports�
interfaces�sources�	rules_str�	protocols�source_ports�icmp_block_inversion�forward�_�-�/N�name�port�protocol�value�set)rr�zone�servicer1z
icmp-blockz	icmp-typer,zforward-port�	interface�rule�source�destinationr2zsource-portrZauditZaccept�rejectZdropZmark�limitzicmp-block-inversion�	immutableZenabledzto-portzto-addr�familyZpriority�address�mac�invert�ipset�prefix�level�typeZburst)	r5r$zforward-portr8r9r:rr;r<cCs8x&ttj�D]\}\}}||kr|SqWttjd��dS)Nz
index_of())�	enumerater�IMPORT_EXPORT_STRUCTURErrZ
UNKNOWN_ERROR)�element�iZelZdummy�rJ�/usr/lib/python3.6/zone.py�index_ofdsz
Zone.index_ofcs�tt|�j�d|_d|_d|_d|_t|_g|_	g|_
g|_g|_d|_
d|_g|_g|_g|_g|_d|_g|_g|_d|_d|_d|_dS)NrF)�superr�__init__rrrrr
r r!r"r)r#r,r$r%r*r&r'�	fw_config�rulesr(r+�combined�applied)�self)�	__class__rJrKrNks,z
Zone.__init__cCs�d|_d|_d|_d|_t|_|jdd�=|jdd�=|jdd�=|j	dd�=d|_
d|_|jdd�=|j
dd�=|jdd�=|jdd�=d|_|jdd�=|jdd�=d|_d|_d|_dS)NrF)rrrrr
r r!r"r)r#r,r$r%r*r&r'rOrPr(r+rQrR)rSrJrJrK�cleanup�s*zZone.cleanupcCs�t|j�|_t|j�|_t|j�|_t|j�|_dd�|jD�|_dd�|jD�|_dd�|jD�|_dd�|jD�|_dd�|j	D�|_	dd�|j
D�|_
dd�|jD�|_d	d�|jD�|_d
d�|j
D�|_
dd�|jD�|_dS)
z� HACK. I haven't been able to make sax parser return
            strings encoded (because of python 2) instead of in unicode.
            Get rid of it once we throw out python 2 support.cSsg|]}t|��qSrJ)r)�.0�srJrJrK�
<listcomp>�sz'Zone.encode_strings.<locals>.<listcomp>cSs g|]\}}t|�t|�f�qSrJ)r)rV�po�prrJrJrKrX�scSsg|]}t|��qSrJ)r)rVrZrJrJrKrX�scSsg|]}t|��qSrJ)r)rVrIrJrJrKrX�scSs0g|](\}}}}t|�t|�t|�t|�f�qSrJ)r)rVZp1Zp2Zp3Zp4rJrJrKrX�scSs g|]\}}t|�t|�f�qSrJ)r)rVrYrZrJrJrKrX�scSsg|]}t|��qSrJ)r)rVrIrJrJrKrX�scSsg|]}t|��qSrJ)r)rVrWrJrJrKrX�scSsg|]}t|��qSrJ)r)rVrWrJrJrKrX�scSsg|]}t|��qSrJ)r)rVrWrJrJrKrX�sN)rrrrr r!r"r)r#r%r*r&r'rPr()rSrJrJrK�encode_strings�szZone.encode_stringscsN|dkr8dd�|D�|_tt|�j|dd�|jD��ntt|�j||�dS)Nr(cSsg|]}tj|d��qS))Zrule_str)rZ	Rich_Rule)rVrWrJrJrKrX�sz$Zone.__setattr__.<locals>.<listcomp>cSsg|]}t|��qSrJ)�str)rVrWrJrJrKrX�s)rPrMr�__setattr__)rSr0r3)rTrJrKr]�s zZone.__setattr__cstt|�j�}|d=|S)Nr)rMr�export_config_dict)rSZconf)rTrJrKr^�szZone.export_config_dictcCsLt||||�|dkr.|tkr*ttj|���n|dkr�xl|D]d}t|�sTttj|��|jr<xD|jj�D]6}||j	krvqf||jj
|�jkrfttjdj||���qfWq<Wn�|dk�rHx�|D]�}t
|�r�t|�r�t|�r�|jd�r�ttj|��|jr�xL|jj�D]>}||j	k�r�q||jj
|�jk�rttjdj||����qWq�WdS)Nr r&z)interface '{}' already bound to zone '{}'r'zipset:z&source '{}' already bound to zone '{}')rrrr�INVALID_TARGETrZINVALID_INTERFACErOZ	get_zonesr0Zget_zoner&�formatrrr�
startswith�INVALID_ADDRr')rSr�itemZ
all_configr7r5r9rJrJrK�
_check_config�s6



zZone._check_configcs�tt|�j|�|jd�r,ttjd|��n�|jd�rHttjd|��n�|jd�dkrhttjd|��nnd|kr�|d|j	d��}n|}t
|�t�kr�ttjd|t
|�t�|jf��|j
r�||j
j�kr�ttjd��dS)Nr/z'%s' can't start with '/'z'%s' can't end with '/'�zmore than one '/' in '%s'z'Zone of '%s' has %d chars, max is %d %sz+Zones can't have the same name as a policy.)rMr�
check_namerarr�INVALID_NAME�endswith�count�find�lenr
rQrOZget_policy_objectsZ
NAME_CONFLICT)rSr0Zchecked_name)rTrJrKrf�s,

zZone.check_namec
Cs�d|_d|_d|_d|_d|_x$|jD]}||jkr&|jj|�q&Wx$|jD]}||jkrL|jj|�qLWx$|jD]}||jkrr|jj|�qrWx$|j	D]}||j	kr�|j	j|�q�Wx$|j
D]}||j
kr�|j
j|�q�Wx$|jD]}||jkr�|jj|�q�W|j�rd|_|j
�rd|_
x(|jD]}||jk�r&|jj|��q&Wx(|jD]}||jk�rP|jj|��qPWx,|jD]"}	|jj|	�|jjt|	���qzW|j�r�d|_dS)NTr)rQ�filenamerrrr&�appendr'r!r"r)r#r,r$r%r*rPr(r\r+)
rSr5r7r9r6r1�protoZicmpr,r8rJrJrK�combine�sL





zZone.combine)rr)rr)rr)rF)r r)rr)r$F)rrrr)rr)r+F)r,F)�__name__�
__module__�__qualname__�__doc__rGZADDITIONAL_ALNUM_CHARSZPARSER_REQUIRED_ELEMENT_ATTRSZPARSER_OPTIONAL_ELEMENT_ATTRS�staticmethodrLrNrUr[r]r^rdrfro�
__classcell__rJrJ)rTrKr(sx


c@s$eZdZdd�Zdd�Zdd�ZdS)�zone_ContentHandlercCs"tj||�d|_d|_d|_dS)NF)rrN�_rule�_rule_errorZ	_limit_ok)rSrcrJrJrKrN szzone_ContentHandler.__init__c	Cs�tj|||�|jrdS|jj||�t|||�r6dS|dkr�d|krVtjd|d�d|krj|d|j_d|kr�tjd|d�d|kr�|d}|t	kr�t
tj|��|dkr�|t
kr�||j_�n�|d	kr�|jjr�tjd
�nd|j_�n�|dk�rh|j�rtjd
�d|_dSd|k�r.tjd�d|_dS|d|jjk�rT|jjj|d�ntjd|d��n8|dk�rf|j�r |jj�r�tjdt|j��d|_dSd}d|k�r�|dj�d$k�r�d}d}}}d|k�r�|d}d|k�r�|d}d|k�r|d}tj||||d�|j_dSd|k�rBd|k�rBtjd�dSd|k�rdd|k�rdtjd�dSd|k�r~tjd|d�d|k�r�tjd�dSd|k�r�t|d��r�t|d��r�t|d��r�t
tj|d��d|k�r$d|d}||jjk�r|jjj|�ntjd |d�d|k�r�|d}||jjk�rT|jjj|�ntjd |d�n:|d!k�r�|jj�r�tjd"�nd|j_ntjd#|�dSdS)%Nr5r0z'Ignoring deprecated attribute name='%s'rr=z,Ignoring deprecated attribute immutable='%s'r rr,zForward already set, ignoring.Tr7z$Invalid rule: interface use in rule.z Invalid interface: Name missing.z%Interface '%s' already set, ignoring.r9z:Invalid rule: More than one source in rule '%s', ignoring.FrA�yes�truer?r@rB)rAz$Invalid source: No address no ipset.z"Invalid source: Address and ipset.r>z)Ignoring deprecated attribute family='%s'z+Invalid source: Invertion not allowed here.zipset:%sz"Source '%s' already set, ignoring.zicmp-block-inversionz+Icmp-Block-Inversion already set, ignoring.zUnknown XML element '%s')ryrz)r�startElementrxrcZparser_check_element_attrsrrZwarningrrrrr_r
r r,rwr&rmr9r\�lowerrZRich_Sourcerrrrbr'r+)	rSr0�attrsr rAZaddrr@rB�entryrJrJrKr{&s�

























z zone_ContentHandler.startElementcCstj||�t||�dS)N)r�
endElementr)rSr0rJrJrKr�szzone_ContentHandler.endElementN)rprqrrrNr{rrJrJrJrKrvsprvFc
Cst�}|jd�s ttjd|��|dd	�|_|s>|j|j�||_||_|j	t
j�rZdnd|_|j|_
t|�}tj�}|j|�d||f}t|d��b}tjd�}|j|�y|j|�Wn8tjk
r�}	zttjd|	j���WYdd}	~	XnXWdQRX~~t�r|j�|S)
Nz.xmlz'%s' is missing .xml suffix�FTz%s/%s�rbznot a valid zone file: %s���)rrhrrrgr0rfrl�pathrar�
ETC_FIREWALLDZbuiltin�defaultrv�saxZmake_parserZsetContentHandler�openZInputSourceZ
setByteStream�parseZSAXParseExceptionZINVALID_ZONEZgetExceptionrr[)
rlr�Z
no_check_namer5�handler�parserr0�fr9�msgrJrJrKr�s:




(cCs\|r|n|j}|jr$d||jf}nd||jf}tjj|�r�ytj|d|�Wn0tk
r�}ztj	d||�WYdd}~XnXtjj
|�}|jtj
�r�tjj|�r�tjjtj
�s�tjtj
d�tj|d�tj|ddd�}t|�}|j�i}|j�r|jd	k�r|j|d
<|jtk�r*|j|d<|jd|�|jd
�t||�x8t|j�D]*}	|jd�|jdd|	i�|jd
��qVWx\t|j�D]N}
|jd�d|
k�r�|jdd|
dd�i�n|jdd|
i�|jd
��q�W|j�r
|jd�|jdi�|jd
�|j�r2|jd�|jdi�|jd
�|jd�|jd
�|j �|j!�~dS)Nz%s/%sz	%s/%s.xmlz%s.oldzBackup of file '%s' failed: %si�ZwtzUTF-8)�mode�encodingrrr r5�
z  r7r0zipset:r9rB�r?zicmp-block-inversionr,)"r�rlr0�os�exists�shutilZcopy2�	Exceptionr�error�dirnamerarr��mkdir�ior�rZ
startDocumentrr r
r{ZignorableWhitespacerr	r&Z
simpleElementr'r+r,rZendDocument�close)r5r��_pathr0r��dirpathr�r�r}r7r9rJrJrKr�s` 












)F)N)(�__all__Zxml.saxr�r�r�r�ZfirewallrZfirewall.functionsrrrr	r
rrZfirewall.core.baser
rZfirewall.core.io.io_objectrrrrZfirewall.core.io.policyrrrrZ
firewall.corerZfirewall.core.loggerrrZfirewall.errorsrrrvrrrJrJrJrK�<module>s$

$x|
site-packages/firewall/core/io/__pycache__/functions.cpython-36.pyc000064400000005256147511334700021313 0ustar003

]ûfy�@s�ddlZddlmZddlmZddlmZddlmZddl	m
Z
ddlmZddl
mZdd	lmZdd
lmZddlmZddlmZdd
lmZdd�ZdS)�N)�config)�
FirewallError)�FirewallConfig)�zone_reader)�service_reader)�ipset_reader)�icmptype_reader)�
helper_reader)�
policy_reader)�Direct)�LockdownWhitelist)�firewalld_confc	-Cs|t|�}t|jtjtjgd�t|jtjtj	gd�t
|jtjtj
gd�t|jtjtjgd�t|jtjtjgd�t|jtjtjgd�d�}�x
|j�D�]�}x�||dD]�}tjj|�s�q�x�ttj|��D]�}|j d�r�yD||d||�}|d
k�r�||_!|j"|j#��||d|�Wq�t$k
�rT}zt$|j%d	||j&f��WYdd}~Xq�t'k
�r�}zt'd	||f��WYdd}~Xq�Xq�Wq�Wq�Wtjj(tj)��r:y$t*tj)�}|j+�|j,|j-��Wnpt$k
�r}zt$|j%d	tj)|j&f��WYdd}~Xn6t'k
�r8}zt'd	tj)|f��WYdd}~XnXtjj(tj.��r�y$t/tj.�}|j+�|j,|j-��Wnpt$k
�r�}zt$|j%d	tj.|j&f��WYdd}~Xn6t'k
�r�}zt'd	tj.|f��WYdd}~XnXtjj(tj0��rxyt1tj0�}|j+�Wnpt$k
�rB}zt$|j%d	tj0|j&f��WYdd}~Xn6t'k
�rv}zt'd	tj0|f��WYdd}~XnXdS)N)�reader�add�dirs)Zipset�helperZicmptypeZservice�zone�policyrz.xmlrrrrz'%s': %s)rr)2rrZ	add_ipsetrZFIREWALLD_IPSETSZETC_FIREWALLD_IPSETSr	Z
add_helperZFIREWALLD_HELPERSZETC_FIREWALLD_HELPERSrZadd_icmptypeZFIREWALLD_ICMPTYPESZETC_FIREWALLD_ICMPTYPESrZadd_serviceZFIREWALLD_SERVICESZETC_FIREWALLD_SERVICESrZadd_zoneZFIREWALLD_ZONESZETC_FIREWALLD_ZONESr
Zadd_policy_objectZFIREWALLD_POLICIESZETC_FIREWALLD_POLICIES�keys�os�path�isdir�sorted�listdir�endswith�	fw_configZcheck_config_dictZexport_config_dictr�code�msg�	Exception�isfileZFIREWALLD_DIRECTr�read�check_configZ
export_configZLOCKDOWN_WHITELISTrZFIREWALLD_CONFr
)	�fwrZreadersrZ_dir�file�obj�errorr�r&�/usr/lib/python3.6/functions.pyr!&sz

&.
($
($
(r!)rZfirewallrZfirewall.errorsrZfirewall.core.fw_configrZfirewall.core.io.zonerZfirewall.core.io.servicerZfirewall.core.io.ipsetrZfirewall.core.io.icmptyperZfirewall.core.io.helperr	Zfirewall.core.io.policyr
Zfirewall.core.io.directrZ#firewall.core.io.lockdown_whitelistrZfirewall.core.io.firewalld_confr
r!r&r&r&r'�<module>ssite-packages/firewall/core/io/__pycache__/__init__.cpython-36.pyc000064400000001227147511334700021034 0ustar003

]ûf<�@sjddlZdejkrfddlmZeed�s<dd�Zeede�ddlmZeed�sfd	d
�Z	eede	�dS)�NZ_xmlplus)�AttributesImpl�__contains__cCs|t|d�kS)NZ_attrs)�getattr)�self�name�r�/usr/lib/python3.6/__init__.py�__AttributesImpl__contains__sr	)�XMLGeneratorZ_writecCst|d�j|�dS)NZ_out)r�write)r�textrrr�__XMLGenerator_write$sr
)
Zxml�__file__Zxml.sax.xmlreaderr�hasattrr	�setattrZxml.sax.saxutilsr
r
rrrr�<module>s


site-packages/firewall/core/io/__pycache__/helper.cpython-36.pyc000064400000013451147511334700020556 0ustar003

]ûf� �@s�dddgZddljZddlZddlZddlZddlmZddlm	Z	ddl
mZmZm
Z
mZmZmZddlmZdd	lmZdd
lmZGdd�de�ZGdd
�d
e
�Zdd�Zddd�ZdS)�Helper�
helper_reader�
helper_writer�N)�config)�
u2b_if_py2)�PY2�	IO_Object�IO_Object_ContentHandler�IO_Object_XMLGenerator�
check_port�check_tcpudp)�log)�errors)�
FirewallErrorcs�eZdZddddddd gffZdZd	d
gZdddgd�Zd
ddgddgd�Z�fdd�Zdd�Z	dd�Z
dd�Zdd�Z�Z
S)!r�version��short�description�family�module�portsz(sssssa(ss))�-�.N)rr�helper�name�port�protocol)rrcs6tt|�j�d|_d|_d|_d|_d|_g|_dS)Nr)	�superr�__init__rrrrrr)�self)�	__class__��/usr/lib/python3.6/helper.pyr;szHelper.__init__cCs.d|_d|_d|_d|_d|_|jdd�=dS)Nr)rrrrrr)rr!r!r"�cleanupDszHelper.cleanupcCsRt|j�|_t|j�|_t|j�|_t|j�|_t|j�|_dd�|jD�|_dS)z� HACK. I haven't been able to make sax parser return
            strings encoded (because of python 2) instead of in unicode.
            Get rid of it once we throw out python 2 support.cSs g|]\}}t|�t|�f�qSr!)r)�.0ZpoZprr!r!r"�
<listcomp>Usz)Helper.encode_strings.<locals>.<listcomp>N)rrrrrrr)rr!r!r"�encode_stringsLszHelper.encode_stringscCs(ddg}||kr$ttjd||f��dS)NZipv4Zipv6z'%s' not in '%s')rrZINVALID_IPV)rZipvZipvsr!r!r"�	check_ipvWszHelper.check_ipvcCsz|dkr0xl|D]}t|d�t|d�qWnF|dkrv|jd�sRttjd|��t|jdd��dkrvttjd|��dS)	Nrr�r�
nf_conntrack_z('%s' does not start with 'nf_conntrack_'rzModule name '%s' too short)rr�
startswithrr�INVALID_MODULE�len�replace)rr�itemZ
all_configrr!r!r"�
_check_config]s


zHelper._check_config)rr)rr)rr)rr)rr)rr)�__name__�
__module__�__qualname__ZIMPORT_EXPORT_STRUCTUREZDBUS_SIGNATUREZADDITIONAL_ALNUM_CHARSZPARSER_REQUIRED_ELEMENT_ATTRSZPARSER_OPTIONAL_ELEMENT_ATTRSrr#r&r'r/�
__classcell__r!r!)r r"r&s$
	c@seZdZdd�ZdS)�helper_ContentHandlercCs>tj|||�|jj||�|dkr�d|kr8|d|j_d|kr\|jj|d�|d|j_d|kr�|djd�s�tt	j
d|d��t|djdd��dkr�tt	j
d	|d��|d|j_
nz|d
kr�np|dkr�nf|dk�r:t|d�t|d
�|d|d
f}||jjk�r$|jjj|�ntjd|d|d
�dS)Nrrrrr)z('%s' does not start with 'nf_conntrack_'rr(zModule name '%s' too shortrrrrz#Port '%s/%s' already set, ignoring.)r	�startElementr.Zparser_check_element_attrsrr'rr*rrr+r,r-rrrr�appendr
Zwarning)rr�attrs�entryr!r!r"r5ns>
z"helper_ContentHandler.startElementN)r0r1r2r5r!r!r!r"r4msr4c	Cst�}|jd�s ttjd|��|dd	�|_|j|j�||_||_|j	t
j�rVdnd|_|j|_
t|�}tj�}|j|�d||f}t|d��b}tjd�}|j|�y|j|�Wn8tjk
r�}zttjd|j���WYdd}~XnXWdQRX~~t�r|j�|S)
Nz.xmlz'%s' is missing .xml suffix�FTz%s/%s�rbznot a valid helper file: %s���)r�endswithrrZINVALID_NAMErZ
check_name�filename�pathr*r�
ETC_FIREWALLDZbuiltin�defaultr4�saxZmake_parserZsetContentHandler�openZInputSourceZ
setByteStream�parseZSAXParseExceptionZINVALID_HELPERZgetExceptionrr&)	r=r>r�handler�parserr�f�source�msgr!r!r"r�s8




(c
CsP|r|n|j}|jr$d||jf}nd||jf}tjj|�r�ytj|d|�Wn0tk
r�}ztj	d||�WYdd}~XnXtjj
|�}|jtj
�r�tjj|�r�tjjtj
�s�tjtj
d�tj|d�tj|ddd�}t|�}|j�i}|j|d	<|j�r|jd
k�r|j|d<|j�r<|jd
k�r<|j|d<|jd
|�|jd�|j�r�|jd
k�r�|jd�|jdi�|j|j�|jd�|jd�|j�r�|jd
k�r�|jd�|jdi�|j|j�|jd�|jd�x>|jD]4}	|jd�|jd|	d|	dd��|jd��q�W|jd
�|jd�|j�|j�~dS)Nz%s/%sz	%s/%s.xmlz%s.oldzBackup of file '%s' failed: %si�ZwtzUTF-8)�mode�encodingrrrrr�
z  rrrrr()rr) r>r=r�os�exists�shutilZcopy2�	Exceptionr
�error�dirnamer*rr?�mkdir�iorBr
Z
startDocumentrrrr5ZignorableWhitespacerZ
charactersZ
endElementrrZ
simpleElementZendDocument�close)
rr>�_pathrrH�dirpathrFrDr7rr!r!r"r�s\ 












)N)�__all__Zxml.saxrArLrSrNZfirewallrZfirewall.functionsrZfirewall.core.io.io_objectrrr	r
rrZfirewall.core.loggerr
rZfirewall.errorsrrr4rrr!r!r!r"�<module>s

 G#site-packages/firewall/core/io/__pycache__/__init__.cpython-36.opt-1.pyc000064400000001227147511334700021773 0ustar003

]ûf<�@sjddlZdejkrfddlmZeed�s<dd�Zeede�ddlmZeed�sfd	d
�Z	eede	�dS)�NZ_xmlplus)�AttributesImpl�__contains__cCs|t|d�kS)NZ_attrs)�getattr)�self�name�r�/usr/lib/python3.6/__init__.py�__AttributesImpl__contains__sr	)�XMLGeneratorZ_writecCst|d�j|�dS)NZ_out)r�write)r�textrrr�__XMLGenerator_write$sr
)
Zxml�__file__Zxml.sax.xmlreaderr�hasattrr	�setattrZxml.sax.saxutilsr
r
rrrr�<module>s


site-packages/firewall/core/io/__pycache__/service.cpython-36.opt-1.pyc000064400000020377147511334700021703 0ustar003

]ûf�2�@s�dddgZddljZddlZddlZddlZddlmZddlm	Z	ddl
mZmZm
Z
mZmZmZmZmZddlmZdd	lmZdd
lmZGdd�de�ZGdd
�d
e
�Zdd�Zddd�ZdS)�Service�service_reader�service_writer�N)�config)�
u2b_if_py2)�PY2�	IO_Object�IO_Object_ContentHandler�IO_Object_XMLGenerator�
check_port�check_tcpudp�check_protocol�
check_address)�log)�errors)�
FirewallErrorcs�eZdZd d!d"dd#gfddgfdddifddgfd	d$gfd
dgfddgff
Zdd
gZdddd�Zddgddgdgdgddgddgdgdgd�Z�fdd�Zdd�Zdd�Z	dd�Z
�ZS)%r�version��short�description�ports�modules�destination�	protocols�source_ports�includes�helpers�_�-N)rr�service�name�port�protocol�value�ipv4�ipv6r)rr!r"�modulerzsource-port�include�helpercsNtt|�j�d|_d|_d|_g|_g|_g|_i|_	g|_
g|_g|_dS)Nr)
�superr�__init__rrrrrrrrrr)�self)�	__class__��/usr/lib/python3.6/service.pyr*DszService.__init__cCshd|_d|_d|_|jdd�=|jdd�=|jdd�=|jj�|jdd�=|j	dd�=|j
dd�=dS)Nr)rrrrrrr�clearrrr)r+r-r-r.�cleanupQs
zService.cleanupcCs�t|j�|_t|j�|_t|j�|_dd�|jD�|_dd�|jD�|_dd�|jj�D�|_dd�|jD�|_dd�|j	D�|_	dd�|j
D�|_
d	d�|jD�|_d
S)z� HACK. I haven't been able to make sax parser return
            strings encoded (because of python 2) instead of in unicode.
            Get rid of it once we throw out python 2 support.cSs g|]\}}t|�t|�f�qSr-)r)�.0�po�prr-r-r.�
<listcomp>dsz*Service.encode_strings.<locals>.<listcomp>cSsg|]}t|��qSr-)r)r1�mr-r-r.r4escSsi|]\}}t|�t|��qSr-)r)r1�k�vr-r-r.�
<dictcomp>fsz*Service.encode_strings.<locals>.<dictcomp>cSsg|]}t|��qSr-)r)r1r3r-r-r.r4gscSs g|]\}}t|�t|�f�qSr-)r)r1r2r3r-r-r.r4hscSsg|]}t|��qSr-)r)r1�sr-r-r.r4jscSsg|]}t|��qSr-)r)r1r9r-r-r.r4ksN)rrrrrrr�itemsrrrr)r+r-r-r.�encode_strings]szService.encode_stringscCs:|dkrJx>|D]6}|ddkr8t|d�t|d�qt|d�qWn�|dkrjx�|D]}t|�qXWn�|dkr�x�|D]}t|d�t|d�qxWn�|dkr�x�|D]*}|dkr�ttjd
|��t|||�q�Wn^|dk�r6xR|D]J}|jd��r|jdd�}d
|k�r|jd
d�}t	|�dkr�ttj
|��q�WdS)Nrrr�rrrr$r%z'%s' not in {'ipv4'|'ipv6'}r�
nf_conntrack_rr�)r$r%)rrr
rrZINVALID_DESTINATIONr�
startswith�replace�lenZINVALID_MODULE)r+r�itemZ
all_configr!�protorr&r-r-r.�
_check_configms8






zService._check_config)rr)rr)rr)rr)rr)�__name__�
__module__�__qualname__ZIMPORT_EXPORT_STRUCTUREZADDITIONAL_ALNUM_CHARSZPARSER_REQUIRED_ELEMENT_ATTRSZPARSER_OPTIONAL_ELEMENT_ATTRSr*r0r;rD�
__classcell__r-r-)r,r.r&s4


c@seZdZdd�ZdS)�service_ContentHandlercCs0tj|||�|jj||�|dkrTd|kr<tjd|d�d|krP|d|j_�n�|dkr`�n�|dkrl�n�|dk�r$|ddkr�t|d�t|d	�|d|d	f}||jj	kr�|jj	j
|�ntjd
|d|d	�nBt|d	�|d	|jjk�r|jjj
|d	�ntjd|d	��n|d	k�rtt|d�|d|jjk�r`|jjj
|d�ntjd|d��n�|d
k�r�t|d�t|d	�|d|d	f}||jj
k�r�|jj
j
|�ntjd|d|d	��nN|dk�r>xRdD]J}||k�r�t|||�||jjk�r&tjd|�n|||jj|<�q�Wn�|dk�r�|d}|jd��r~|jdd�}d|k�r~|jdd�}||jjk�r�|jjj
|�ntjd|�n�|dk�r�|d|jjk�r�|jjj
|d�ntjd|d�n@|dk�r,|d|jjk�r|jjj
|d�ntjd|d�dS)Nrr z'Ignoring deprecated attribute name='%s'rrrr!rr"z#Port '%s/%s' already set, ignoring.z$Protocol '%s' already set, ignoring.r#zsource-portz)SourcePort '%s/%s' already set, ignoring.rr$r%z2Destination address for '%s' already set, ignoringr&r=rrz"Module '%s' already set, ignoring.r'z#Include '%s' already set, ignoring.r(z"Helper '%s' already set, ignoring.)r$r%)r	�startElementrBZparser_check_element_attrsrZwarningrrrr�appendr
rrrrr?r@rrr)r+r �attrs�entry�xr&r-r-r.rJ�s�










z#service_ContentHandler.startElementN)rErFrGrJr-r-r-r.rI�srIc	Cst�}|jd�s ttjd|��|dd	�|_|j|j�||_||_|j	t
j�rVdnd|_|j|_
t|�}tj�}|j|�d||f}t|d��b}tjd�}|j|�y|j|�Wn8tjk
r�}zttjd|j���WYdd}~XnXWdQRX~~t�r|j�|S)
Nz.xmlz'%s' is missing .xml suffix�FTz%s/%s�rbznot a valid service file: %s���)r�endswithrrZINVALID_NAMEr Z
check_name�filename�pathr?r�
ETC_FIREWALLDZbuiltin�defaultrI�saxZmake_parserZsetContentHandler�openZInputSourceZ
setByteStream�parseZSAXParseExceptionZINVALID_SERVICEZgetExceptionrr;)	rSrTr�handler�parserr �f�source�msgr-r-r.r�s8




(cCsr|r|n|j}|jr$d||jf}nd||jf}tjj|�r�ytj|d|�Wn0tk
r�}ztj	d||�WYdd}~XnXtjj
|�}|jtj
�r�tjj|�r�tjjtj
�s�tjtj
d�tj|d�tj|ddd�}t|�}|j�i}|j�r|jd	k�r|j|d
<|jd|�|jd�|j�rt|jd	k�rt|jd
�|jdi�|j|j�|jd�|jd�|j�r�|jd	k�r�|jd
�|jdi�|j|j�|jd�|jd�x>|jD]4}	|jd
�|jd|	d|	dd��|jd��q�Wx4|jD]*}
|jd
�|jdd|
i�|jd��qWx>|jD]4}	|jd
�|jd|	d|	dd��|jd��q<Wx4|jD]*}|jd
�|jdd|i�|jd��q|Wt|j �dk�r�|jd
�|jd|j �|jd�x4|j!D]*}|jd
�|jdd|i�|jd��q�Wx4|j"D]*}
|jd
�|jdd|
i�|jd��qW|jd�|jd�|j#�|j$�~dS)Nz%s/%sz	%s/%s.xmlz%s.oldzBackup of file '%s' failed: %si�ZwtzUTF-8)�mode�encodingrrr�
z  rrr!rr<)r!r"r"r#zsource-portr&r rr'r()%rTrSr �os�exists�shutilZcopy2�	Exceptionr�error�dirnamer?rrU�mkdir�iorXr
Z
startDocumentrrJZignorableWhitespacerZ
charactersZ
endElementrrZ
simpleElementrrrrArrrZendDocument�close)rrT�_pathr r^�dirpathr\rZrLr!r"r&r'r(r-r-r.rs� 

















)N)�__all__Zxml.saxrWrbrirdZfirewallrZfirewall.functionsrZfirewall.core.io.io_objectrrr	r
rrr
rZfirewall.core.loggerrrZfirewall.errorsrrrIrrr-r-r-r.�<module>s

(mQsite-packages/firewall/core/io/__pycache__/io_object.cpython-36.opt-1.pyc000064400000027540147511334700022177 0ustar003

]ûf5�@sdZddddddddgZd	d
ljZd	d
ljjZd	d
lZd	d
lZd	dlm	Z	d	dl
mZd	d
lm
Z
d	dl
mZd	dlmZejdkZGdd�de�ZGdd�de�ZGdd�de�ZGdd�de�ZGdd�dejj�ZGdd�dej�Zdd�Zdd�Zdd�Z dd�Z!d
S)z5Generic io_object handler, io specific check methods.�PY2�	IO_Object�IO_Object_ContentHandler�IO_Object_XMLGenerator�
check_port�check_tcpudp�check_protocol�
check_address�N)�OrderedDict)�	functions)�b2u)�errors)�
FirewallError�3c@s|eZdZdZfZdZgZiZiZdd�Z	dd�Z
dd�Zd	d
�Zdd�Z
d
d�Zdd�Zdd�Zdd�Zdd�Zdd�ZdS)rz; Abstract IO_Object as base for icmptype, service and zone z()cCs"d|_d|_d|_d|_d|_dS)N�F)�filename�path�name�defaultZbuiltin)�self�r�/usr/lib/python3.6/io_object.py�__init__2s
zIO_Object.__init__cCs6g}x(|jD]}|jtjt||d���qWt|�S)Nr	)�IMPORT_EXPORT_STRUCTURE�append�copy�deepcopy�getattr�tuple)r�ret�xrrr�
export_config9szIO_Object.export_configcCsXi}tdd�|jD��}x:|D]2}t||�s<tt||�t�rtjt||��||<qW|S)NcSsg|]}|d|df�qS)r	�r)�.0r rrr�
<listcomp>Asz0IO_Object.export_config_dict.<locals>.<listcomp>)�dictrr�
isinstance�boolrr)r�conf�type_formats�keyrrr�export_config_dict?s
zIO_Object.export_config_dictcCs�|j|�x�t|j�D]~\}\}}t||t�r~g}t�}x,||D] }||krD|j|�|j|�qDW~t||t	j
|��qt||t	j
||��qWdS)N)�check_config�	enumeraterr&�list�setr�add�setattrrr)rr(�i�elementZdummyZ_confZ_setr rrr�
import_configGs

zIO_Object.import_configc	Cs~|j|�xn|D]f}t||�s0ttjdj|���t||t�r`t||tt	j
tj||����qt||tj||��qWdS)Nz-Internal error. '{}' is not a valid attribute)
�check_config_dict�hasattrrr
Z
UNKNOWN_ERROR�formatr&r.r1r
�fromkeysrr)rr(r*rrr�import_config_dictWs


"zIO_Object.import_config_dictcCszt|t�s(ttjd|td�t|�f��t|�dkr@ttjd��x4|D],}|j�rF||j	krFttjd||f��qFWdS)Nz'%s' not of type %s, but %srr"zname can't be emptyz'%s' is not allowed in '%s')
r&�strrr
�INVALID_TYPE�type�lenZINVALID_NAME�isalnum�ADDITIONAL_ALNUM_CHARS)rr�charrrr�
check_namecs


zIO_Object.check_namecCsjt|�t|j�kr0ttjdt|�t|j�f��i}x&t|j�D]\}\}}||||<q@W|j|�dS)Nz structure size mismatch %d != %d)r=rrr
r;r-r5)rr(Z	conf_dictr2r �yrrrr,pszIO_Object.check_configcCsrtdd�|jD��}xX|D]P}|dd�|jD�krDttjdj|���|j||||�|j||||�qWdS)NcSsg|]}|d|df�qS)r	r"r)r#r rrrr$|sz/IO_Object.check_config_dict.<locals>.<listcomp>cSsg|]\}}|�qSrr)r#r rBrrrr$~szoption '{}' is not valid)r%rrr
ZINVALID_OPTIONr7�_check_config_structure�
_check_config)rr(r)r*rrrr5{s
zIO_Object.check_config_dictcCsdS)Nr)rZdummy1Zdummy2Zdummy3rrrrD�szIO_Object._check_configc	Cs`t|t|��s,ttjd|t|�t|�f��t|t�rrt|�dkrRttjd|��x|D]}|j||d�qXWn�t|t�r�t|�t|�kr�ttjd|t|�f��x�t	|�D]\}}|j|||�q�Wn�t|t
��r\t|j��d\}}xn|j�D]b\}}t|t|���s,ttjd|t|�t|�f��t|t|��s�ttjd|t|�t|�f��q�WdS)Nz'%s' not of type %s, but %sr"zlen('%s') != 1r	zlen('%s') != %d)r&r<rr
r;r.r=rCrr-r%�items)	rr(Z	structurer r2�valueZskeyZsvaluer*rrrrC�s8



z!IO_Object._check_config_structurecCs�|j�}d}||jkrdd}|j|dk	rdx:|j|D],}||krL|j|�q4ttjd||f��q4W||jkr�d}x$|j|D]}||kr~|j|�q~W|s�ttjd|��x |D]}ttjd||f��q�WdS)NFTzMissing attribute %s for %szUnexpected element %sz%s: Unexpected attribute %s)ZgetNames�PARSER_REQUIRED_ELEMENT_ATTRS�removerr
ZPARSE_ERROR�PARSER_OPTIONAL_ELEMENT_ATTRS)rr�attrsZ_attrs�foundr rrr�parser_check_element_attrs�s,



z$IO_Object.parser_check_element_attrsN)�__name__�
__module__�__qualname__�__doc__rZDBUS_SIGNATUREr?rGrIrr!r+r4r9rAr,r5rDrCrLrrrrr)s"
!cs$eZdZ�fdd�Zdd�Z�ZS)�UnexpectedElementErrorcstt|�j�||_dS)N)�superrQrr)rr)�	__class__rrr�szUnexpectedElementError.__init__cCs
d|jS)NzUnexpected element '%s')r)rrrr�__str__�szUnexpectedElementError.__str__)rMrNrOrrT�
__classcell__rr)rSrrQ�srQcs$eZdZ�fdd�Zdd�Z�ZS)�MissingAttributeErrorcstt|�j�||_||_dS)N)rRrVrr�	attribute)rrrW)rSrrr�szMissingAttributeError.__init__cCsd|j|jfS)Nz$Element '%s': missing '%s' attribute)rrW)rrrrrT�szMissingAttributeError.__str__)rMrNrOrrTrUrr)rSrrV�srVcs$eZdZ�fdd�Zdd�Z�ZS)�UnexpectedAttributeErrorcstt|�j�||_||_dS)N)rRrXrrrW)rrrW)rSrrr�sz!UnexpectedAttributeError.__init__cCsd|j|jfS)Nz'Element '%s': unexpected attribute '%s')rrW)rrrrrT�sz UnexpectedAttributeError.__str__)rMrNrOrrTrUrr)rSrrX�srXc@s4eZdZdd�Zdd�Zdd�Zdd�Zd	d
�ZdS)rcCs||_d|_dS)Nr)�item�_element)rrYrrrr�sz!IO_Object_ContentHandler.__init__cCs
d|_dS)Nr)rZ)rrrr�
startDocument�sz&IO_Object_ContentHandler.startDocumentcCs
d|_dS)Nr)rZ)rrrJrrr�startElement�sz%IO_Object_ContentHandler.startElementcCs*|dkr|j|j_n|dkr&|j|j_dS)N�short�description)rZrYr]r^)rrrrr�
endElement�sz#IO_Object_ContentHandler.endElementcCs|j|jdd�7_dS)N�
� )rZ�replace)r�contentrrr�
characters�sz#IO_Object_ContentHandler.charactersN)rMrNrOrr[r\r_rdrrrrr�s
c@s<eZdZdd�Zdd�Zdd�Zdd�Zd	d
�Zdd�Zd
S)rcCsNtjjj|�|j|_|j|_ig|_|jd|_	g|_
d|_d|_d|_
dS)Nr"zutf-8F���)�sax�handler�ContentHandlerr�write�_write�flushZ_flushZ_ns_contextsZ_current_contextZ_undeclared_ns_mapsZ	_encodingZ_pending_start_elementZ_short_empty_elements)r�outrrrr�szIO_Object_XMLGenerator.__init__cCs*trdd�|j�D�}tjj|||�dS)a saxutils.XMLGenerator.startElement() expects name and attrs to be
            unicode and bad things happen if any of them is (utf-8) encoded.
            We override the method here to sanitize this case.
            Can be removed once we drop Python2 support.
        cSsi|]\}}t|�t|��qSr)r)r#rrFrrr�
<dictcomp>sz7IO_Object_XMLGenerator.startElement.<locals>.<dictcomp>N)rrE�saxutils�XMLGeneratorr\)rrrJrrrr\sz#IO_Object_XMLGenerator.startElementcCs�trX|jdt|��x4|j�D](\}}|jdt|�tjt|��f�q W|jd�nF|jd|�x,|j�D] \}}|jd|tj|�f�qpW|jd�dS)z* slightly modified startElement()
        �<z %s=%sz/>N)rrjrrErnZ	quoteattr)rrrJrFrrr�
simpleElementsz$IO_Object_XMLGenerator.simpleElementcCstjj|t|��dS)z� saxutils.XMLGenerator.endElement() expects name to be
            unicode and bad things happen if it's (utf-8) encoded.
            We override the method here to sanitize this case.
            Can be removed once we drop Python2 support.
        N)rnror_r)rrrrrr_sz!IO_Object_XMLGenerator.endElementcCstjj|t|��dS)z� saxutils.XMLGenerator.characters() expects content to be
            unicode and bad things happen if it's (utf-8) encoded.
            We override the method here to sanitize this case.
            Can be removed once we drop Python2 support.
        N)rnrordr)rrcrrrrd%sz!IO_Object_XMLGenerator.characterscCstjj|t|��dS)a saxutils.XMLGenerator.ignorableWhitespace() expects content to be
            unicode and bad things happen if it's (utf-8) encoded.
            We override the method here to sanitize this case.
            Can be removed once we drop Python2 support.
        N)rnro�ignorableWhitespacer)rrcrrrrr-sz*IO_Object_XMLGenerator.ignorableWhitespaceN)	rMrNrOrr\rqr_rdrrrrrrr�s
cCs�tj|�}|dkr$ttjd|��n`|dkr>ttjd|��nF|dkrXttjd|��n,t|�dkr�|d|dkr�ttjd|��dS)	N�zport number in '%s' is too bigr"z'%s' is invalid port rangezport range '%s' is ambiguousr	���re)rZgetPortRangerr
ZINVALID_PORTr=)ZportZ
port_rangerrrr5s
cCs|dkrttjd|��dS)N�tcp�udp�sctp�dccpz)'%s' not from {'tcp'|'udp'|'sctp'|'dccp'})rurvrwrx)rr
�INVALID_PROTOCOL)�protocolrrrrDscCstj|�sttj|��dS)N)rZ
checkProtocolrr
ry)rzrrrrJs
cCs$tj||�s ttjd||f��dS)Nz'%s' is not valid %s address)rrrr
ZINVALID_ADDR)ZipvZaddrrrrrNs)"rP�__all__Zxml.saxrfZxml.sax.saxutilsrnr�sys�collectionsr
ZfirewallrZfirewall.functionsrr
Zfirewall.errorsr�versionr�objectr�	ExceptionrQrVrXrgrhrrorrrrrrrrr�<module>s0

		Csite-packages/firewall/core/io/__pycache__/policy.cpython-36.pyc000064400000051212147511334700020573 0ustar003

]ûfϢ�@s dddgZddljZddlZddlZddlZddlmZddlm	Z	m
Z
ddlmZmZm
Z
ddlmZmZmZdd	lmZmZmZmZmZmZdd
lmZddlmZddlmZdd
lmZdd�Z dd�Z!dd�Z"dd�Z#dd�Z$Gdd�de�Z%Gdd�de�Z&ddd�Z'ddd�Z(dS) �Policy�
policy_reader�
policy_writer�N)�config)�checkIP�checkIP6)�uniqify�max_policy_name_len�portStr)�DEFAULT_POLICY_TARGET�POLICY_TARGETS�DEFAULT_POLICY_PRIORITY)�	IO_Object�IO_Object_ContentHandler�IO_Object_XMLGenerator�
[binary data omitted: remainder of a compiled CPython 3.6 bytecode file, apparently the __pycache__ entry for firewall/core/io/policy.py; not reproducible as text]
site-packages/firewall/core/io/__pycache__/zone.cpython-36.opt-1.pyc000064400000031307147511334700021211 0ustar00
[binary data omitted: compiled CPython 3.6 bytecode of firewall/core/io/zone.py; not reproducible as text]
site-packages/firewall/core/io/__pycache__/lockdown_whitelist.cpython-36.opt-1.pyc000064400000022550147511334700024152 0ustar00
[binary data omitted: compiled CPython 3.6 bytecode of firewall/core/io/lockdown_whitelist.py; not reproducible as text]
site-packages/firewall/core/io/direct.py000064400000036740147511334700014273 0ustar00
# -*- coding: utf-8 -*-
#
# Copyright (C) 2011-2016 Red Hat, Inc.
#
# Authors:
# Thomas Woerner <twoerner@redhat.com>
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program.  If not, see <http://www.gnu.org/licenses/>.
#

import xml.sax as sax
import os
import io
import shutil

from firewall import config
from firewall.fw_types import LastUpdatedOrderedDict
from firewall.functions import splitArgs, joinArgs, u2b_if_py2
from firewall.core.io.io_object import IO_Object, IO_Object_ContentHandler, \
    IO_Object_XMLGenerator
from firewall.core.logger import log
from firewall.core import ipXtables
from firewall.core import ebtables
from firewall import errors
from firewall.errors import FirewallError


class direct_ContentHandler(IO_Object_ContentHandler):
    def __init__(self, item):
        IO_Object_ContentHandler.__init__(self, item)
        self.direct = False

    def startElement(self, name, attrs):
        IO_Object_ContentHandler.startElement(self, name, attrs)
        self.item.parser_check_element_attrs(name, attrs)

        if name == "direct":
            if self.direct:
                raise FirewallError(errors.PARSE_ERROR,
                                    "More than one direct tag.")
            self.direct = True

        elif name == "chain":
            if not self.direct:
                log.error("Parse Error: chain outside of direct")
                return
            ipv = attrs["ipv"]
            table = attrs["table"]
            chain = attrs["chain"]
            self.item.add_chain(u2b_if_py2(ipv), u2b_if_py2(table),
                                u2b_if_py2(chain))

        elif name == "rule":
            if not self.direct:
                log.error("Parse Error: rule outside of direct")
                return
            ipv = attrs["ipv"]
            if ipv not in [ "ipv4", "ipv6", "eb" ]:
                raise FirewallError(errors.INVALID_IPV,
                                    "'%s' not from {'ipv4'|'ipv6'|'eb'}" % ipv)
            table = attrs["table"]
            chain = attrs["chain"]
            try:
                priority = int(attrs["priority"])
            except ValueError:
                log.error("Parse Error: %s is not a valid priority" %
                          attrs["priority"])
                return
            self._rule = [ u2b_if_py2(ipv), u2b_if_py2(table),
                           u2b_if_py2(chain), priority ]

        elif name == "passthrough":
            if not self.direct:
                log.error("Parse Error: command outside of direct")
                return
            ipv = attrs["ipv"]
            self._passthrough = [ u2b_if_py2(ipv) ]

        else:
            log.error('Unknown XML element %s' % name)
            return

    def endElement(self, name):
        IO_Object_ContentHandler.endElement(self, name)

        if name == "rule":
            if self._element:
                # add arguments
                self._rule.append([ u2b_if_py2(x)
                                    for x in splitArgs(self._element) ])
                self.item.add_rule(*self._rule)
            else:
                log.error("Error: rule does not have any arguments, ignoring.")
            self._rule = None
        elif name == "passthrough":
            if self._element:
                # add arguments
                self._passthrough.append([ u2b_if_py2(x)
                                           for x in splitArgs(self._element) ])
                self.item.add_passthrough(*self._passthrough)
            else:
                log.error("Error: passthrough does not have any arguments, " +
                          "ignoring.")
            self._passthrough = None

class Direct(IO_Object):
    """ Direct class """

    IMPORT_EXPORT_STRUCTURE = (
        # chain: [ ipv, table, [ chain ] ]
        ( "chains", [ ( "", "", "" ), ], ),                   # a(sss)
        # rule: [ ipv, table, chain, [ priority, [ arg ] ] ]
        ( "rules", [ ( "", "", "", 0, [ "" ] ), ], ),         # a(sssias)
        # passthrough: [ ipv, [ [ arg ] ] ]
        ( "passthroughs", [ ( "", [ "" ]), ], ),              # a(sas)
        )
    DBUS_SIGNATURE = '(a(sss)a(sssias)a(sas))'
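    # Editor's note, a hedged illustration (values made up, not from this
    # file): for one chain, one rule and one passthrough, export_config()
    # below is expected to return a tuple matching the structure above, e.g.
    #   ( [("ipv4", "filter", "mychain")],
    #     [("ipv4", "filter", "mychain", 0, ["-j", "ACCEPT"])],
    #     [("ipv4", ["-A", "INPUT", "-j", "ACCEPT"])] )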
    PARSER_REQUIRED_ELEMENT_ATTRS = {
        "direct": None,
        "chain": [ "ipv", "table", "chain" ],
        "rule": [ "ipv", "table", "chain", "priority" ],
        "passthrough": [ "ipv" ]
        }
    PARSER_OPTIONAL_ELEMENT_ATTRS = {
        }

    def __init__(self, filename):
        super(Direct, self).__init__()
        self.filename = filename
        self.chains = LastUpdatedOrderedDict()
        self.rules = LastUpdatedOrderedDict()
        self.passthroughs = LastUpdatedOrderedDict()

    def _check_config(self, conf, item, all_conf):
        pass
        # check arg lists

    def export_config(self):
        ret = [ ]
        x = [ ]
        for key in self.chains:
            for chain in self.chains[key]:
                x.append(tuple(list(key) + list([chain])))
        ret.append(x)
        x = [ ]
        for key in self.rules:
            for rule in self.rules[key]:
                x.append(tuple((key[0], key[1], key[2], rule[0],
                                list(rule[1]))))
        ret.append(x)
        x = [ ]
        for key in self.passthroughs:
            for rule in self.passthroughs[key]:
                x.append(tuple((key, list(rule))))
        ret.append(x)
        return tuple(ret)

    def import_config(self, conf):
        self.cleanup()
        self.check_config(conf)
        for i,(element,dummy) in enumerate(self.IMPORT_EXPORT_STRUCTURE):
            if element == "chains":
                for x in conf[i]:
                    self.add_chain(*x)
            if element == "rules":
                for x in conf[i]:
                    self.add_rule(*x)
            if element == "passthroughs":
                for x in conf[i]:
                    self.add_passthrough(*x)

    def cleanup(self):
        self.chains.clear()
        self.rules.clear()
        self.passthroughs.clear()

    def output(self):
        print("chains")
        for key in self.chains:
            print("  (%s, %s): %s" % (key[0], key[1],
                                      ",".join(self.chains[key])))
        print("rules")
        for key in self.rules:
            print("  (%s, %s, %s):" % (key[0], key[1], key[2]))
            for (priority,args) in self.rules[key]:
                print("    (%d, ('%s'))" % (priority, "','".join(args)))
        print("passthroughs")
        for key in self.passthroughs:
            print("  %s:" % (key))
            for args in self.passthroughs[key]:
                print("    ('%s')" % ("','".join(args)))

    def _check_ipv(self, ipv):
        ipvs = ['ipv4', 'ipv6', 'eb']
        if ipv not in ipvs:
            raise FirewallError(errors.INVALID_IPV,
                                "'%s' not in '%s'" % (ipv, ipvs))

    def _check_ipv_table(self, ipv, table):
        self._check_ipv(ipv)

        tables = ipXtables.BUILT_IN_CHAINS.keys() if ipv in ['ipv4', 'ipv6'] \
                                         else ebtables.BUILT_IN_CHAINS.keys()
        if table not in tables:
            raise FirewallError(errors.INVALID_TABLE,
                                "'%s' not in '%s'" % (table, tables))

    # chains

    def add_chain(self, ipv, table, chain):
        self._check_ipv_table(ipv, table)
        key = (ipv, table)
        if key not in self.chains:
            self.chains[key] = [ ]
        if chain not in self.chains[key]:
            self.chains[key].append(chain)
        else:
            log.warning("Chain '%s' for table '%s' with ipv '%s' " % \
                        (chain, table, ipv) + "already in list, ignoring")

    def remove_chain(self, ipv, table, chain):
        self._check_ipv_table(ipv, table)
        key = (ipv, table)
        if key in self.chains and chain in self.chains[key]:
            self.chains[key].remove(chain)
            if len(self.chains[key]) == 0:
                del self.chains[key]
        else:
            raise ValueError( \
                "Chain '%s' with table '%s' with ipv '%s' not in list" % \
                (chain, table, ipv))

    def query_chain(self, ipv, table, chain):
        self._check_ipv_table(ipv, table)
        key = (ipv, table)
        return (key in self.chains and chain in self.chains[key])

    def get_chains(self, ipv, table):
        self._check_ipv_table(ipv, table)
        key = (ipv, table)
        if key in self.chains:
            return self.chains[key]
        else:
            raise ValueError("No chains for table '%s' with ipv '%s'" % \
                             (table, ipv))

    def get_all_chains(self):
        return self.chains

    # rules

    def add_rule(self, ipv, table, chain, priority, args):
        self._check_ipv_table(ipv, table)
        key = (ipv, table, chain)
        if key not in self.rules:
            self.rules[key] = LastUpdatedOrderedDict()
        value = (priority, tuple(args))
        if value not in self.rules[key]:
            self.rules[key][value] = priority
        else:
            log.warning("Rule '%s' for table '%s' and chain '%s' " % \
                        ("',".join(args), table, chain) +
                        "with ipv '%s' and priority %d " % (ipv, priority) +
                        "already in list, ignoring")

    def remove_rule(self, ipv, table, chain, priority, args):
        self._check_ipv_table(ipv, table)
        key = (ipv, table, chain)
        value = (priority, tuple(args))
        if key in self.rules and value in self.rules[key]:
            del self.rules[key][value]
            if len(self.rules[key]) == 0:
                del self.rules[key]
        else:
            raise ValueError("Rule '%s' for table '%s' and chain '%s' " % \
                ("',".join(args), table, chain) + \
                "with ipv '%s' and priority %d not in list" % (ipv, priority))

    def remove_rules(self, ipv, table, chain):
        self._check_ipv_table(ipv, table)
        key = (ipv, table, chain)
        if key in self.rules:
            for value in self.rules[key].keys():
                del self.rules[key][value]
            if len(self.rules[key]) == 0:
                del self.rules[key]

    def query_rule(self, ipv, table, chain, priority, args):
        self._check_ipv_table(ipv, table)
        key = (ipv, table, chain)
        value = (priority, tuple(args))
        return (key in self.rules and value in self.rules[key])

    def get_rules(self, ipv, table, chain):
        self._check_ipv_table(ipv, table)
        key = (ipv, table, chain)
        if key in self.rules:
            return self.rules[key]
        else:
            raise ValueError("No rules for table '%s' and chain '%s' " %\
                             (table, chain) + "with ipv '%s'" % (ipv))

    def get_all_rules(self):
        return self.rules

    # passthroughs
    def add_passthrough(self, ipv, args):
        self._check_ipv(ipv)
        if ipv not in self.passthroughs:
            self.passthroughs[ipv] = [ ]
        if args not in self.passthroughs[ipv]:
            self.passthroughs[ipv].append(args)
        else:
            log.warning("Passthrough '%s' for ipv '%s'" % \
                        ("',".join(args), ipv) + "already in list, ignoring")

    def remove_passthrough(self, ipv, args):
        self._check_ipv(ipv)
        if ipv in self.passthroughs and args in self.passthroughs[ipv]:
            self.passthroughs[ipv].remove(args)
            if len(self.passthroughs[ipv]) == 0:
                del self.passthroughs[ipv]
        else:
            raise ValueError("Passthrough '%s' for ipv '%s'" % \
                             ("',".join(args), ipv) + "not in list")

    def query_passthrough(self, ipv, args):
        self._check_ipv(ipv)
        return ipv in self.passthroughs and args in self.passthroughs[ipv]

    def get_passthroughs(self, ipv):
        self._check_ipv(ipv)
        if ipv in self.passthroughs:
            return self.passthroughs[ipv]
        else:
            raise ValueError("No passthroughs for ipv '%s'" % (ipv))

    def get_all_passthroughs(self):
        return self.passthroughs

    # read

    def read(self):
        self.cleanup()
        if not self.filename.endswith(".xml"):
            raise FirewallError(errors.INVALID_NAME,
                                "'%s' is missing .xml suffix" % self.filename)
        handler = direct_ContentHandler(self)
        parser = sax.make_parser()
        parser.setContentHandler(handler)
        with open(self.filename, "rb") as f:
            source = sax.InputSource(None)
            source.setByteStream(f)
            try:
                parser.parse(source)
            except sax.SAXParseException as msg:
                raise FirewallError(errors.INVALID_TYPE,
                                    "Not a valid file: %s" % \
                                    msg.getException())

    def write(self):
        if os.path.exists(self.filename):
            try:
                shutil.copy2(self.filename, "%s.old" % self.filename)
            except Exception as msg:
                raise IOError("Backup of '%s' failed: %s" % (self.filename, msg))

        if not os.path.exists(config.ETC_FIREWALLD):
            os.mkdir(config.ETC_FIREWALLD, 0o750)

        f = io.open(self.filename, mode='wt', encoding='UTF-8')
        handler = IO_Object_XMLGenerator(f)
        handler.startDocument()

        # start whitelist element
        handler.startElement("direct", { })
        handler.ignorableWhitespace("\n")

        # chains
        for key in self.chains:
            (ipv, table) = key
            for chain in self.chains[key]:
                handler.ignorableWhitespace("  ")
                handler.simpleElement("chain", { "ipv": ipv, "table": table,
                                                 "chain": chain })
                handler.ignorableWhitespace("\n")

        # rules
        for key in self.rules:
            (ipv, table, chain) = key
            for (priority, args) in self.rules[key]:
                if len(args) < 1:
                    continue
                handler.ignorableWhitespace("  ")
                handler.startElement("rule", { "ipv": ipv, "table": table,
                                               "chain": chain,
                                               "priority": "%d" % priority })
                handler.ignorableWhitespace(sax.saxutils.escape(joinArgs(args)))
                handler.endElement("rule")
                handler.ignorableWhitespace("\n")

        # passthroughs
        for ipv in self.passthroughs:
            for args in self.passthroughs[ipv]:
                if len(args) < 1:
                    continue
                handler.ignorableWhitespace("  ")
                handler.startElement("passthrough", { "ipv": ipv })
                handler.ignorableWhitespace(sax.saxutils.escape(joinArgs(args)))
                handler.endElement("passthrough")
                handler.ignorableWhitespace("\n")

        # end direct element
        handler.endElement("direct")
        handler.ignorableWhitespace("\n")
        handler.endDocument()
        f.close()
        del handler
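
# --- editor's usage sketch, not part of upstream firewalld ---
# A minimal, hedged example of reading and dumping a direct configuration.
# The path below is the usual location but is treated as hypothetical here;
# the demo only runs when this module is executed directly and the file exists.
if __name__ == "__main__":
    _demo_path = "/etc/firewalld/direct.xml"
    if os.path.exists(_demo_path):
        _demo_direct = Direct(_demo_path)
        _demo_direct.read()
        _demo_direct.output()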
site-packages/firewall/core/io/service.py000064400000031325147511334700014453 0ustar00
# -*- coding: utf-8 -*-
#
# Copyright (C) 2011-2016 Red Hat, Inc.
#
# Authors:
# Thomas Woerner <twoerner@redhat.com>
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program.  If not, see <http://www.gnu.org/licenses/>.
#

__all__ = [ "Service", "service_reader", "service_writer" ]

import xml.sax as sax
import os
import io
import shutil

from firewall import config
from firewall.functions import u2b_if_py2
from firewall.core.io.io_object import PY2, IO_Object, \
    IO_Object_ContentHandler, IO_Object_XMLGenerator, check_port, \
    check_tcpudp, check_protocol, check_address
from firewall.core.logger import log
from firewall import errors
from firewall.errors import FirewallError

class Service(IO_Object):
    IMPORT_EXPORT_STRUCTURE = (
        ( "version",  "" ),
        ( "short", "" ),
        ( "description", "" ),
        ( "ports", [ ( "", "" ), ], ),
        ( "modules", [ "", ], ),
        ( "destination", { "": "", }, ),
        ( "protocols", [ "", ], ),
        ( "source_ports", [ ( "", "" ), ], ),
        ( "includes", [ "" ], ),
        ( "helpers", [ "", ], ),
        )
    ADDITIONAL_ALNUM_CHARS = [ "_", "-" ]
    PARSER_REQUIRED_ELEMENT_ATTRS = {
        "short": None,
        "description": None,
        "service": None,
        }
    PARSER_OPTIONAL_ELEMENT_ATTRS = {
        "service": [ "name", "version" ],
        "port": [ "port", "protocol" ],
        "protocol": [ "value" ],
        "module": [ "name" ],
        "destination": [ "ipv4", "ipv6" ],
        "source-port": [ "port", "protocol" ],
        "include": [ "service" ],
        "helper": [ "name" ],
        }
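    # Editor's note, a hedged illustration: the tables above list the XML
    # elements and attributes the parser accepts. A service file combining
    # them could look roughly like the following (names and values are made
    # up for illustration only):
    #   <service version="1.0">
    #     <short>Example</short>
    #     <description>Illustrative service definition.</description>
    #     <port port="1234" protocol="tcp"/>
    #     <protocol value="gre"/>
    #     <include service="ssh"/>
    #   </service>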

    def __init__(self):
        super(Service, self).__init__()
        self.version = ""
        self.short = ""
        self.description = ""
        self.ports = [ ]
        self.protocols = [ ]
        self.modules = [ ]
        self.destination = { }
        self.source_ports = [ ]
        self.includes = [ ]
        self.helpers = [ ]

    def cleanup(self):
        self.version = ""
        self.short = ""
        self.description = ""
        del self.ports[:]
        del self.protocols[:]
        del self.modules[:]
        self.destination.clear()
        del self.source_ports[:]
        del self.includes[:]
        del self.helpers[:]

    def encode_strings(self):
        """ HACK. I haven't been able to make sax parser return
            strings encoded (because of python 2) instead of in unicode.
            Get rid of it once we throw out python 2 support."""
        self.version = u2b_if_py2(self.version)
        self.short = u2b_if_py2(self.short)
        self.description = u2b_if_py2(self.description)
        self.ports = [(u2b_if_py2(po),u2b_if_py2(pr)) for (po,pr) in self.ports]
        self.modules = [u2b_if_py2(m) for m in self.modules]
        self.destination = {u2b_if_py2(k):u2b_if_py2(v) for k,v in self.destination.items()}
        self.protocols = [u2b_if_py2(pr) for pr in self.protocols]
        self.source_ports = [(u2b_if_py2(po),u2b_if_py2(pr)) for (po,pr)
                             in self.source_ports]
        self.includes = [u2b_if_py2(s) for s in self.includes]
        self.helpers = [u2b_if_py2(s) for s in self.helpers]

    def _check_config(self, config, item, all_config):
        if item == "ports":
            for port in config:
                if port[0] != "":
                    check_port(port[0])
                    check_tcpudp(port[1])
                else:
                    # only protocol
                    check_protocol(port[1])

        elif item == "protocols":
            for proto in config:
                check_protocol(proto)

        elif item == "source_ports":
            for port in config:
                check_port(port[0])
                check_tcpudp(port[1])

        elif item == "destination":
            for destination in config:
                if destination not in [ "ipv4", "ipv6" ]:
                    raise FirewallError(errors.INVALID_DESTINATION,
                                        "'%s' not in {'ipv4'|'ipv6'}" % \
                                        destination)
                check_address(destination, config[destination])

        elif item == "modules":
            for module in config:
                if module.startswith("nf_conntrack_"):
                    module = module.replace("nf_conntrack_", "")
                    if "_" in module:
                        module = module.replace("_", "-")
                if len(module) < 2:
                    raise FirewallError(errors.INVALID_MODULE, module)
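    # Editor's note, a hedged illustration (inputs made up): _check_config()
    # above validates one item at a time, e.g.
    #   svc._check_config([("22", "tcp")], "ports", {})             # passes
    #   svc._check_config({"ipv4": "192.0.2.1"}, "destination", {})  # passes
    # while a destination keyed by anything other than "ipv4"/"ipv6" raises
    # INVALID_DESTINATION.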

# PARSER

class service_ContentHandler(IO_Object_ContentHandler):
    def startElement(self, name, attrs):
        IO_Object_ContentHandler.startElement(self, name, attrs)
        self.item.parser_check_element_attrs(name, attrs)
        if name == "service":
            if "name" in attrs:
                log.warning("Ignoring deprecated attribute name='%s'",
                            attrs["name"])
            if "version" in attrs:
                self.item.version = attrs["version"]
        elif name == "short":
            pass
        elif name == "description":
            pass
        elif name == "port":
            if attrs["port"] != "":
                check_port(attrs["port"])
                check_tcpudp(attrs["protocol"])
                entry = (attrs["port"], attrs["protocol"])
                if entry not in self.item.ports:
                    self.item.ports.append(entry)
                else:
                    log.warning("Port '%s/%s' already set, ignoring.",
                                attrs["port"], attrs["protocol"])
            else:
                check_protocol(attrs["protocol"])
                if attrs["protocol"] not in self.item.protocols:
                    self.item.protocols.append(attrs["protocol"])
                else:
                    log.warning("Protocol '%s' already set, ignoring.",
                                attrs["protocol"])
        elif name == "protocol":
            check_protocol(attrs["value"])
            if attrs["value"] not in self.item.protocols:
                self.item.protocols.append(attrs["value"])
            else:
                log.warning("Protocol '%s' already set, ignoring.",
                            attrs["value"])
        elif name == "source-port":
            check_port(attrs["port"])
            check_tcpudp(attrs["protocol"])
            entry = (attrs["port"], attrs["protocol"])
            if entry not in self.item.source_ports:
                self.item.source_ports.append(entry)
            else:
                log.warning("SourcePort '%s/%s' already set, ignoring.",
                            attrs["port"], attrs["protocol"])
        elif name == "destination":
            for x in [ "ipv4", "ipv6" ]:
                if x in attrs:
                    check_address(x, attrs[x])
                    if x in self.item.destination:
                        log.warning("Destination address for '%s' already set, ignoring",
                                    x)
                    else:
                        self.item.destination[x] = attrs[x]
        elif name == "module":
            module = attrs["name"]
            if module.startswith("nf_conntrack_"):
                module = module.replace("nf_conntrack_", "")
                if "_" in module:
                    module = module.replace("_", "-")
            if module not in self.item.modules:
                self.item.modules.append(module)
            else:
                log.warning("Module '%s' already set, ignoring.",
                            module)
        elif name == "include":
            if attrs["service"] not in self.item.includes:
                self.item.includes.append(attrs["service"])
            else:
                log.warning("Include '%s' already set, ignoring.",
                            attrs["service"])
        elif name == "helper":
            if attrs["name"] not in self.item.helpers:
                self.item.helpers.append(attrs["name"])
            else:
                log.warning("Helper '%s' already set, ignoring.",
                            attrs["name"])


def service_reader(filename, path):
    service = Service()
    if not filename.endswith(".xml"):
        raise FirewallError(errors.INVALID_NAME,
                            "'%s' is missing .xml suffix" % filename)
    service.name = filename[:-4]
    service.check_name(service.name)
    service.filename = filename
    service.path = path
    service.builtin = False if path.startswith(config.ETC_FIREWALLD) else True
    service.default = service.builtin
    handler = service_ContentHandler(service)
    parser = sax.make_parser()
    parser.setContentHandler(handler)
    name = "%s/%s" % (path, filename)
    with open(name, "rb") as f:
        source = sax.InputSource(None)
        source.setByteStream(f)
        try:
            parser.parse(source)
        except sax.SAXParseException as msg:
            raise FirewallError(errors.INVALID_SERVICE,
                                "not a valid service file: %s" % \
                                msg.getException())
    del handler
    del parser
    if PY2:
        service.encode_strings()
    return service

def service_writer(service, path=None):
    _path = path if path else service.path

    if service.filename:
        name = "%s/%s" % (_path, service.filename)
    else:
        name = "%s/%s.xml" % (_path, service.name)

    if os.path.exists(name):
        try:
            shutil.copy2(name, "%s.old" % name)
        except Exception as msg:
            log.error("Backup of file '%s' failed: %s", name, msg)

    dirpath = os.path.dirname(name)
    if dirpath.startswith(config.ETC_FIREWALLD) and not os.path.exists(dirpath):
        if not os.path.exists(config.ETC_FIREWALLD):
            os.mkdir(config.ETC_FIREWALLD, 0o750)
        os.mkdir(dirpath, 0o750)

    f = io.open(name, mode='wt', encoding='UTF-8')
    handler = IO_Object_XMLGenerator(f)
    handler.startDocument()

    # start service element
    attrs = {}
    if service.version and service.version != "":
        attrs["version"] = service.version
    handler.startElement("service", attrs)
    handler.ignorableWhitespace("\n")

    # short
    if service.short and service.short != "":
        handler.ignorableWhitespace("  ")
        handler.startElement("short", { })
        handler.characters(service.short)
        handler.endElement("short")
        handler.ignorableWhitespace("\n")

    # description
    if service.description and service.description != "":
        handler.ignorableWhitespace("  ")
        handler.startElement("description", { })
        handler.characters(service.description)
        handler.endElement("description")
        handler.ignorableWhitespace("\n")

    # ports
    for port in service.ports:
        handler.ignorableWhitespace("  ")
        handler.simpleElement("port", { "port": port[0], "protocol": port[1] })
        handler.ignorableWhitespace("\n")

    # protocols
    for protocol in service.protocols:
        handler.ignorableWhitespace("  ")
        handler.simpleElement("protocol", { "value": protocol })
        handler.ignorableWhitespace("\n")

    # source ports
    for port in service.source_ports:
        handler.ignorableWhitespace("  ")
        handler.simpleElement("source-port", { "port": port[0],
                                               "protocol": port[1] })
        handler.ignorableWhitespace("\n")

    # modules
    for module in service.modules:
        handler.ignorableWhitespace("  ")
        handler.simpleElement("module", { "name": module })
        handler.ignorableWhitespace("\n")

    # destination
    if len(service.destination) > 0:
        handler.ignorableWhitespace("  ")
        handler.simpleElement("destination", service.destination)
        handler.ignorableWhitespace("\n")

    # includes
    for include in service.includes:
        handler.ignorableWhitespace("  ")
        handler.simpleElement("include", { "service": include })
        handler.ignorableWhitespace("\n")

    # helpers
    for helper in service.helpers:
        handler.ignorableWhitespace("  ")
        handler.simpleElement("helper", { "name": helper })
        handler.ignorableWhitespace("\n")

    # end service element
    handler.endElement('service')
    handler.ignorableWhitespace("\n")
    handler.endDocument()
    f.close()
    del handler
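
# --- editor's usage sketch, not part of upstream firewalld ---
# A minimal, hedged round-trip example: read a service definition and write it
# back out under /tmp. The file name and directories are hypothetical; the
# demo only runs when this module is executed directly and the file exists.
if __name__ == "__main__":
    _demo_name = "ssh.xml"
    _demo_dir = os.path.join(config.ETC_FIREWALLD, "services")
    if os.path.exists(os.path.join(_demo_dir, _demo_name)):
        _demo_service = service_reader(_demo_name, _demo_dir)
        service_writer(_demo_service, path="/tmp")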
site-packages/firewall/core/io/ipset.py000064400000051205147511334700014136 0ustar00
# -*- coding: utf-8 -*-
#
# Copyright (C) 2015-2016 Red Hat, Inc.
#
# Authors:
# Thomas Woerner <twoerner@redhat.com>
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program.  If not, see <http://www.gnu.org/licenses/>.
#

"""ipset io XML handler, reader, writer"""

__all__ = [ "IPSet", "ipset_reader", "ipset_writer" ]

import xml.sax as sax
import os
import io
import shutil

from firewall import config
from firewall.functions import checkIP, checkIP6, checkIPnMask, \
    checkIP6nMask, u2b_if_py2, check_mac, check_port, checkInterface, \
    checkProtocol
from firewall.core.io.io_object import PY2, IO_Object, \
    IO_Object_ContentHandler, IO_Object_XMLGenerator
from firewall.core.ipset import IPSET_TYPES, IPSET_CREATE_OPTIONS
from firewall.core.icmp import check_icmp_name, check_icmp_type, \
    check_icmpv6_name, check_icmpv6_type
from firewall.core.logger import log
from firewall import errors
from firewall.errors import FirewallError

class IPSet(IO_Object):
    IMPORT_EXPORT_STRUCTURE = (
        ( "version",  "" ),              # s
        ( "short", "" ),                 # s
        ( "description", "" ),           # s
        ( "type", "" ),                  # s
        ( "options", { "": "", }, ),     # a{ss}
        ( "entries", [ "" ], ),          # as
    )
    DBUS_SIGNATURE = '(ssssa{ss}as)'
    ADDITIONAL_ALNUM_CHARS = [ "_", "-", ":", "." ]
    PARSER_REQUIRED_ELEMENT_ATTRS = {
        "short": None,
        "description": None,
        "ipset": [ "type" ],
        "option": [ "name" ],
        "entry": None,
    }
    PARSER_OPTIONAL_ELEMENT_ATTRS = {
        "ipset": [ "version" ],
        "option": [ "value" ],
    }
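    # Editor's note, a hedged illustration (values made up): serialized via the
    # structure above, an instance is expected to export as something roughly
    # like ("1.0", "short text", "description", "hash:ip",
    # {"family": "inet", "timeout": "60"}, ["192.0.2.1", "192.0.2.2"]),
    # matching DBUS_SIGNATURE (ssssa{ss}as).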

    def __init__(self):
        super(IPSet, self).__init__()
        self.version = ""
        self.short = ""
        self.description = ""
        self.type = ""
        self.entries = [ ]
        self.options = { }
        self.applied = False

    def cleanup(self):
        self.version = ""
        self.short = ""
        self.description = ""
        self.type = ""
        del self.entries[:]
        self.options.clear()
        self.applied = False

    def encode_strings(self):
        """ HACK. I haven't been able to make sax parser return
            strings encoded (because of python 2) instead of in unicode.
            Get rid of it once we throw out python 2 support."""
        self.version = u2b_if_py2(self.version)
        self.short = u2b_if_py2(self.short)
        self.description = u2b_if_py2(self.description)
        self.type = u2b_if_py2(self.type)
        self.options = { u2b_if_py2(k):u2b_if_py2(v)
                         for k, v in self.options.items() }
        self.entries = [ u2b_if_py2(e) for e in self.entries ]
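    # Editor's note, a hedged illustration (entries made up): check_entry()
    # below validates a single entry string against the comma-separated flags
    # of the ipset type, e.g.
    #   check_entry("192.0.2.1", {}, "hash:ip")              # plain IPv4, ok
    #   check_entry("00:11:22:33:44:55", {}, "hash:mac")     # MAC address, ok
    #   check_entry("192.0.2.1,80", {}, "hash:ip")           # raises: one flag
    #                                                        # but two items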

    @staticmethod
    def check_entry(entry, options, ipset_type):
        family = "ipv4"
        if "family" in options:
            if options["family"] == "inet6":
                family = "ipv6"

        if not ipset_type.startswith("hash:"):
            raise FirewallError(errors.INVALID_IPSET,
                                "ipset type '%s' not usable" % ipset_type)
        flags = ipset_type[5:].split(",")
        items = entry.split(",")

        if len(flags) != len(items) or len(flags) < 1:
            raise FirewallError(
                errors.INVALID_ENTRY,
                "entry '%s' does not match ipset type '%s'" % \
                (entry, ipset_type))

        for i in range(len(flags)):
            flag = flags[i]
            item = items[i]

            if flag == "ip":
                if "-" in item and family == "ipv4":
                    # IP ranges only with plain IPs, no masks
                    if i > 1:
                        raise FirewallError(
                            errors.INVALID_ENTRY,
                            "invalid address '%s' in '%s'[%d]" % \
                            (item, entry, i))
                    splits = item.split("-")
                    if len(splits) != 2:
                        raise FirewallError(
                            errors.INVALID_ENTRY,
                            "invalid address range '%s' in '%s' for %s (%s)" % \
                            (item, entry, ipset_type, family))
                    for _split in splits:
                        if (family == "ipv4" and not checkIP(_split)) or \
                           (family == "ipv6" and not checkIP6(_split)):
                            raise FirewallError(
                                errors.INVALID_ENTRY,
                                "invalid address '%s' in '%s' for %s (%s)" % \
                                (_split, entry, ipset_type, family))
                else:
                    # IPs with mask only allowed in the first
                    # position of the type
                    if family == "ipv4":
                        if item == "0.0.0.0":
                            raise FirewallError(
                                errors.INVALID_ENTRY,
                                "invalid address '%s' in '%s' for %s (%s)" % \
                                (item, entry, ipset_type, family))
                        if i == 0:
                            ip_check = checkIPnMask
                        else:
                            ip_check = checkIP
                    else:
                        ip_check = checkIP6
                    if not ip_check(item):
                        raise FirewallError(
                            errors.INVALID_ENTRY,
                            "invalid address '%s' in '%s' for %s (%s)" % \
                            (item, entry, ipset_type, family))
            elif flag == "net":
                if "-" in item:
                    # IP ranges only with plain IPs, no masks
                    splits = item.split("-")
                    if len(splits) != 2:
                        raise FirewallError(
                            errors.INVALID_ENTRY,
                            "invalid address range '%s' in '%s' for %s (%s)" % \
                            (item, entry, ipset_type, family))
                    # First part can only be a plain IP
                    if (family == "ipv4" and not checkIP(splits[0])) or \
                       (family == "ipv6" and not checkIP6(splits[0])):
                        raise FirewallError(
                            errors.INVALID_ENTRY,
                            "invalid address '%s' in '%s' for %s (%s)" % \
                            (splits[0], entry, ipset_type, family))
                    # Second part can also have a mask
                    if (family == "ipv4" and not checkIPnMask(splits[1])) or \
                       (family == "ipv6" and not checkIP6nMask(splits[1])):
                        raise FirewallError(
                            errors.INVALID_ENTRY,
                            "invalid address '%s' in '%s' for %s (%s)" % \
                            (splits[1], entry, ipset_type, family))
                else:
                    # IPs with mask allowed in all positions, but no /0
                    if item.endswith("/0"):
                        if not (family == "ipv6" and i == 0 and
                                ipset_type == "hash:net,iface"):
                            raise FirewallError(
                                errors.INVALID_ENTRY,
                                "invalid address '%s' in '%s' for %s (%s)" % \
                                (item, entry, ipset_type, family))
                    if (family == "ipv4" and not checkIPnMask(item)) or \
                       (family == "ipv6" and not checkIP6nMask(item)):
                        raise FirewallError(
                            errors.INVALID_ENTRY,
                            "invalid address '%s' in '%s' for %s (%s)" % \
                            (item, entry, ipset_type, family))
            elif flag == "mac":
                # ipset does not allow adding 00:00:00:00:00:00
                if not check_mac(item) or item == "00:00:00:00:00:00":
                    raise FirewallError(
                        errors.INVALID_ENTRY,
                        "invalid mac address '%s' in '%s'" % (item, entry))
            elif flag == "port":
                if ":" in item:
                    splits = item.split(":")
                    if len(splits) != 2:
                        raise FirewallError(
                            errors.INVALID_ENTRY,
                            "invalid port '%s'" % (item))
                    if splits[0] == "icmp":
                        if family != "ipv4":
                            raise FirewallError(
                                errors.INVALID_ENTRY,
                                "invalid protocol for family '%s' in '%s'" % \
                                (family, entry))
                        if not check_icmp_name(splits[1]) and not \
                           check_icmp_type(splits[1]):
                            raise FirewallError(
                                errors.INVALID_ENTRY,
                                "invalid icmp type '%s' in '%s'" % \
                                (splits[1], entry))
                    elif splits[0] in [ "icmpv6", "ipv6-icmp" ]:
                        if family != "ipv6":
                            raise FirewallError(
                                errors.INVALID_ENTRY,
                                "invalid protocol for family '%s' in '%s'" % \
                                (family, entry))
                        if not check_icmpv6_name(splits[1]) and not \
                           check_icmpv6_type(splits[1]):
                            raise FirewallError(
                                errors.INVALID_ENTRY,
                                "invalid icmpv6 type '%s' in '%s'" % \
                                (splits[1], entry))
                    elif splits[0] not in [ "tcp", "sctp", "udp", "udplite" ] \
                       and not checkProtocol(splits[0]):
                        raise FirewallError(
                            errors.INVALID_ENTRY,
                            "invalid protocol '%s' in '%s'" % (splits[0],
                                                               entry))
                    elif not check_port(splits[1]):
                        raise FirewallError(
                            errors.INVALID_ENTRY,
                            "invalid port '%s'in '%s'" % (splits[1], entry))
                else:
                    if not check_port(item):
                        raise FirewallError(
                            errors.INVALID_ENTRY,
                            "invalid port '%s' in '%s'" % (item, entry))
            elif flag == "mark":
                if item.startswith("0x"):
                    try:
                        int_val = int(item, 16)
                    except ValueError:
                        raise FirewallError(
                            errors.INVALID_ENTRY,
                            "invalid mark '%s' in '%s'" % (item, entry))
                else:
                    try:
                        int_val = int(item)
                    except ValueError:
                        raise FirewallError(
                            errors.INVALID_ENTRY,
                            "invalid mark '%s' in '%s'" % (item, entry))
                if int_val < 0 or int_val > 4294967295:
                    raise FirewallError(
                        errors.INVALID_ENTRY,
                        "invalid mark '%s' in '%s'" % (item, entry))
            elif flag == "iface":
                if not checkInterface(item) or len(item) > 15:
                    raise FirewallError(
                        errors.INVALID_ENTRY,
                        "invalid interface '%s' in '%s'" % (item, entry))
            else:
                raise FirewallError(errors.INVALID_IPSET,
                                    "ipset type '%s' not usable" % ipset_type)

    def _check_config(self, config, item, all_config):
        if item == "type":
            if config not in IPSET_TYPES:
                raise FirewallError(errors.INVALID_TYPE,
                                    "'%s' is not valid ipset type" % config)
        if item == "options":
            for key in config.keys():
                if key not in IPSET_CREATE_OPTIONS:
                    raise FirewallError(errors.INVALID_IPSET,
                                        "ipset invalid option '%s'" % key)
                if key in [ "timeout", "hashsize", "maxelem" ]:
                    try:
                        int_value = int(config[key])
                    except ValueError:
                        raise FirewallError(
                            errors.INVALID_VALUE,
                            "Option '%s': Value '%s' is not an integer" % \
                            (key, config[key]))
                    if int_value < 0:
                        raise FirewallError(
                            errors.INVALID_VALUE,
                            "Option '%s': Value '%s' is negative" % \
                            (key, config[key]))
                elif key == "family" and \
                     config[key] not in [ "inet", "inet6" ]:
                    raise FirewallError(errors.INVALID_FAMILY, config[key])

    def import_config(self, config):
        if "timeout" in config[4] and config[4]["timeout"] != "0":
            if len(config[5]) != 0:
                raise FirewallError(errors.IPSET_WITH_TIMEOUT)
        for entry in config[5]:
            IPSet.check_entry(entry, config[4], config[3])
        super(IPSet, self).import_config(config)
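    # Note (assumption, based on the indices used above): config is the
    # import/export tuple (version, short, description, type, options,
    # entries), so config[3] is the ipset type, config[4] the options dict
    # and config[5] the entry list; entries are rejected for ipsets that
    # carry a non-zero timeout.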

# PARSER

class ipset_ContentHandler(IO_Object_ContentHandler):
    def startElement(self, name, attrs):
        IO_Object_ContentHandler.startElement(self, name, attrs)
        self.item.parser_check_element_attrs(name, attrs)
        if name == "ipset":
            if "type" in attrs:
                if attrs["type"] not in IPSET_TYPES:
                    raise FirewallError(errors.INVALID_TYPE, "%s" % attrs["type"])
                self.item.type = attrs["type"]
            if "version" in attrs:
                self.item.version = attrs["version"]
        elif name == "short":
            pass
        elif name == "description":
            pass
        elif name == "option":
            value = ""
            if "value" in attrs:
                value = attrs["value"]

            if attrs["name"] not in \
               [ "family", "timeout", "hashsize", "maxelem" ]:
                raise FirewallError(
                    errors.INVALID_OPTION,
                    "Unknown option '%s'" % attrs["name"])
            if self.item.type == "hash:mac" and attrs["name"] in [ "family" ]:
                raise FirewallError(
                    errors.INVALID_OPTION,
                    "Unsupported option '%s' for type '%s'" % \
                    (attrs["name"], self.item.type))
            if attrs["name"] in [ "family", "timeout", "hashsize", "maxelem" ] \
               and not value:
                raise FirewallError(
                    errors.INVALID_OPTION,
                    "Missing mandatory value of option '%s'" % attrs["name"])
            if attrs["name"] in [ "timeout", "hashsize", "maxelem" ]:
                try:
                    int_value = int(value)
                except ValueError:
                    raise FirewallError(
                        errors.INVALID_VALUE,
                        "Option '%s': Value '%s' is not an integer" % \
                        (attrs["name"], value))
                if int_value < 0:
                    raise FirewallError(
                        errors.INVALID_VALUE,
                        "Option '%s': Value '%s' is negative" % \
                        (attrs["name"], value))
            if attrs["name"] == "family" and value not in [ "inet", "inet6" ]:
                raise FirewallError(errors.INVALID_FAMILY, value)
            if attrs["name"] not in self.item.options:
                self.item.options[attrs["name"]] = value
            else:
                log.warning("Option %s already set, ignoring.", attrs["name"])
        # nothing to do for entry and entries here

    def endElement(self, name):
        IO_Object_ContentHandler.endElement(self, name)
        if name == "entry":
            self.item.entries.append(self._element)

def ipset_reader(filename, path):
    ipset = IPSet()
    if not filename.endswith(".xml"):
        raise FirewallError(errors.INVALID_NAME,
                            "'%s' is missing .xml suffix" % filename)
    ipset.name = filename[:-4]
    ipset.check_name(ipset.name)
    ipset.filename = filename
    ipset.path = path
    ipset.builtin = False if path.startswith(config.ETC_FIREWALLD) else True
    ipset.default = ipset.builtin
    handler = ipset_ContentHandler(ipset)
    parser = sax.make_parser()
    parser.setContentHandler(handler)
    name = "%s/%s" % (path, filename)
    with open(name, "rb") as f:
        source = sax.InputSource(None)
        source.setByteStream(f)
        try:
            parser.parse(source)
        except sax.SAXParseException as msg:
            raise FirewallError(errors.INVALID_IPSET,
                                "not a valid ipset file: %s" % \
                                msg.getException())
    del handler
    del parser
    if "timeout" in ipset.options and ipset.options["timeout"] != "0" and \
       len(ipset.entries) > 0:
        # no entries visible for ipsets with timeout
        log.warning("ipset '%s': timeout option is set, entries are ignored",
                    ipset.name)
        del ipset.entries[:]
    i = 0
    entries_set = set()
    while i < len(ipset.entries):
        if ipset.entries[i] in entries_set:
            log.warning("Entry %s already set, ignoring.", ipset.entries[i])
            ipset.entries.pop(i)
        else:
            try:
                ipset.check_entry(ipset.entries[i], ipset.options, ipset.type)
            except FirewallError as e:
                log.warning("%s, ignoring.", e)
                ipset.entries.pop(i)
            else:
                entries_set.add(ipset.entries[i])
                i += 1
    del entries_set
    if PY2:
        ipset.encode_strings()

    return ipset
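# Illustrative usage (hypothetical file name):
#   ipset = ipset_reader("myset.xml", config.ETC_FIREWALLD_IPSETS)
# parses the XML file, drops duplicate or invalid entries with a warning and
# returns an IPSet whose name is the file name without the ".xml" suffix.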

def ipset_writer(ipset, path=None):
    _path = path if path else ipset.path

    if ipset.filename:
        name = "%s/%s" % (_path, ipset.filename)
    else:
        name = "%s/%s.xml" % (_path, ipset.name)

    if os.path.exists(name):
        try:
            shutil.copy2(name, "%s.old" % name)
        except Exception as msg:
            log.error("Backup of file '%s' failed: %s", name, msg)

    dirpath = os.path.dirname(name)
    if dirpath.startswith(config.ETC_FIREWALLD) and not os.path.exists(dirpath):
        if not os.path.exists(config.ETC_FIREWALLD):
            os.mkdir(config.ETC_FIREWALLD, 0o750)
        os.mkdir(dirpath, 0o750)

    f = io.open(name, mode='wt', encoding='UTF-8')
    handler = IO_Object_XMLGenerator(f)
    handler.startDocument()

    # start ipset element
    attrs = { "type": ipset.type }
    if ipset.version and ipset.version != "":
        attrs["version"] = ipset.version
    handler.startElement("ipset", attrs)
    handler.ignorableWhitespace("\n")

    # short
    if ipset.short and ipset.short != "":
        handler.ignorableWhitespace("  ")
        handler.startElement("short", { })
        handler.characters(ipset.short)
        handler.endElement("short")
        handler.ignorableWhitespace("\n")

    # description
    if ipset.description and ipset.description != "":
        handler.ignorableWhitespace("  ")
        handler.startElement("description", { })
        handler.characters(ipset.description)
        handler.endElement("description")
        handler.ignorableWhitespace("\n")

    # options
    for key,value in ipset.options.items():
        handler.ignorableWhitespace("  ")
        if value != "":
            handler.simpleElement("option", { "name": key, "value": value })
        else:
            handler.simpleElement("option", { "name": key })
        handler.ignorableWhitespace("\n")

    # entries
    for entry in ipset.entries:
        handler.ignorableWhitespace("  ")
        handler.startElement("entry", { })
        handler.characters(entry)
        handler.endElement("entry")
        handler.ignorableWhitespace("\n")

    # end ipset element
    handler.endElement('ipset')
    handler.ignorableWhitespace("\n")
    handler.endDocument()
    f.close()
    del handler
site-packages/firewall/core/io/icmptype.py
# -*- coding: utf-8 -*-
#
# Copyright (C) 2011-2016 Red Hat, Inc.
#
# Authors:
# Thomas Woerner <twoerner@redhat.com>
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program.  If not, see <http://www.gnu.org/licenses/>.
#

__all__ = [ "IcmpType", "icmptype_reader", "icmptype_writer" ]

import xml.sax as sax
import os
import io
import shutil

from firewall import config
from firewall.functions import u2b_if_py2
from firewall.core.io.io_object import PY2, IO_Object, \
    IO_Object_ContentHandler, IO_Object_XMLGenerator
from firewall.core.logger import log
from firewall import errors
from firewall.errors import FirewallError

class IcmpType(IO_Object):
    IMPORT_EXPORT_STRUCTURE = (
        ( "version",  "" ),          # s
        ( "short", "" ),             # s
        ( "description", "" ),       # s
        ( "destination", [ "", ], ), # as
        )
    DBUS_SIGNATURE = '(sssas)'
    ADDITIONAL_ALNUM_CHARS = [ "_", "-" ]
    PARSER_REQUIRED_ELEMENT_ATTRS = {
        "short": None,
        "description": None,
        "icmptype": None,
        }
    PARSER_OPTIONAL_ELEMENT_ATTRS = {
        "icmptype": [ "name", "version" ],
        "destination": [ "ipv4", "ipv6" ],
        }

    def __init__(self):
        super(IcmpType, self).__init__()
        self.version = ""
        self.short = ""
        self.description = ""
        self.destination = [ ]

    def cleanup(self):
        self.version = ""
        self.short = ""
        self.description = ""
        del self.destination[:]

    def encode_strings(self):
        """ HACK. I haven't been able to make sax parser return
            strings encoded (because of python 2) instead of in unicode.
            Get rid of it once we throw out python 2 support."""
        self.version = u2b_if_py2(self.version)
        self.short = u2b_if_py2(self.short)
        self.description = u2b_if_py2(self.description)
        self.destination = [u2b_if_py2(m) for m in self.destination]

    def _check_config(self, config, item, all_config):
        if item == "destination":
            for destination in config:
                if destination not in [ "ipv4", "ipv6" ]:
                    raise FirewallError(errors.INVALID_DESTINATION,
                                        "'%s' not from {'ipv4'|'ipv6'}" % \
                                        destination)
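    # Illustrative example (not part of the original source): a destination
    # list such as ["ipv4"] or ["ipv4", "ipv6"] passes the check above, while
    # any other value raises FirewallError(INVALID_DESTINATION).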

# PARSER

class icmptype_ContentHandler(IO_Object_ContentHandler):
    def startElement(self, name, attrs):
        IO_Object_ContentHandler.startElement(self, name, attrs)
        self.item.parser_check_element_attrs(name, attrs)

        if name == "icmptype":
            if "name" in attrs:
                log.warning("Ignoring deprecated attribute name='%s'" %
                            attrs["name"])
            if "version" in attrs:
                self.item.version = attrs["version"]
        elif name == "short":
            pass
        elif name == "description":
            pass
        elif name == "destination":
            for x in [ "ipv4", "ipv6" ]:
                if x in attrs and \
                        attrs[x].lower() in [ "yes", "true" ]:
                    self.item.destination.append(str(x))

def icmptype_reader(filename, path):
    icmptype = IcmpType()
    if not filename.endswith(".xml"):
        raise FirewallError(errors.INVALID_NAME,
                            "%s is missing .xml suffix" % filename)
    icmptype.name = filename[:-4]
    icmptype.check_name(icmptype.name)
    icmptype.filename = filename
    icmptype.path = path
    icmptype.builtin = False if path.startswith(config.ETC_FIREWALLD) else True
    icmptype.default = icmptype.builtin
    handler = icmptype_ContentHandler(icmptype)
    parser = sax.make_parser()
    parser.setContentHandler(handler)
    name = "%s/%s" % (path, filename)
    with open(name, "rb") as f:
        source = sax.InputSource(None)
        source.setByteStream(f)
        try:
            parser.parse(source)
        except sax.SAXParseException as msg:
            raise FirewallError(errors.INVALID_ICMPTYPE,
                                "not a valid icmptype file: %s" % \
                                msg.getException())
    del handler
    del parser
    if PY2:
        icmptype.encode_strings()
    return icmptype

def icmptype_writer(icmptype, path=None):
    _path = path if path else icmptype.path

    if icmptype.filename:
        name = "%s/%s" % (_path, icmptype.filename)
    else:
        name = "%s/%s.xml" % (_path, icmptype.name)

    if os.path.exists(name):
        try:
            shutil.copy2(name, "%s.old" % name)
        except Exception as msg:
            log.error("Backup of file '%s' failed: %s", name, msg)

    dirpath = os.path.dirname(name)
    if dirpath.startswith(config.ETC_FIREWALLD) and not os.path.exists(dirpath):
        if not os.path.exists(config.ETC_FIREWALLD):
            os.mkdir(config.ETC_FIREWALLD, 0o750)
        os.mkdir(dirpath, 0o750)

    f = io.open(name, mode='wt', encoding='UTF-8')
    handler = IO_Object_XMLGenerator(f)
    handler.startDocument()

    # start icmptype element
    attrs = {}
    if icmptype.version and icmptype.version != "":
        attrs["version"] = icmptype.version
    handler.startElement("icmptype", attrs)
    handler.ignorableWhitespace("\n")

    # short
    if icmptype.short and icmptype.short != "":
        handler.ignorableWhitespace("  ")
        handler.startElement("short", { })
        handler.characters(icmptype.short)
        handler.endElement("short")
        handler.ignorableWhitespace("\n")

    # description
    if icmptype.description and icmptype.description != "":
        handler.ignorableWhitespace("  ")
        handler.startElement("description", { })
        handler.characters(icmptype.description)
        handler.endElement("description")
        handler.ignorableWhitespace("\n")

    # destination
    if icmptype.destination:
        handler.ignorableWhitespace("  ")
        attrs = { }
        for x in icmptype.destination:
            attrs[x] = "yes"
        handler.simpleElement("destination", attrs)
        handler.ignorableWhitespace("\n")

    # end icmptype element
    handler.endElement('icmptype')
    handler.ignorableWhitespace("\n")
    handler.endDocument()
    f.close()
    del handler
site-packages/firewall/core/io/zone.py
# -*- coding: utf-8 -*-
#
# Copyright (C) 2011-2016 Red Hat, Inc.
#
# Authors:
# Thomas Woerner <twoerner@redhat.com>
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program.  If not, see <http://www.gnu.org/licenses/>.
#

__all__ = [ "Zone", "zone_reader", "zone_writer" ]

import xml.sax as sax
import os
import io
import shutil

from firewall import config
from firewall.functions import checkIPnMask, checkIP6nMask, checkInterface, uniqify, max_zone_name_len, u2b_if_py2, check_mac
from firewall.core.base import DEFAULT_ZONE_TARGET, ZONE_TARGETS
from firewall.core.io.io_object import PY2, IO_Object, \
    IO_Object_ContentHandler, IO_Object_XMLGenerator
from firewall.core.io.policy import common_startElement, common_endElement, common_check_config, common_writer
from firewall.core import rich
from firewall.core.logger import log
from firewall import errors
from firewall.errors import FirewallError

class Zone(IO_Object):
    """ Zone class """

    IMPORT_EXPORT_STRUCTURE = (
        ( "version",  "" ),                            # s
        ( "short", "" ),                               # s
        ( "description", "" ),                         # s
        ( "UNUSED", False ),                           # b
        ( "target", "" ),                              # s
        ( "services", [ "", ], ),                      # as
        ( "ports", [ ( "", "" ), ], ),                 # a(ss)
        ( "icmp_blocks", [ "", ], ),                   # as
        ( "masquerade", False ),                       # b
        ( "forward_ports", [ ( "", "", "", "" ), ], ), # a(ssss)
        ( "interfaces", [ "" ] ),                      # as
        ( "sources", [ "" ] ),                         # as
        ( "rules_str", [ "" ] ),                       # as
        ( "protocols", [ "", ], ),                     # as
        ( "source_ports", [ ( "", "" ), ], ),          # a(ss)
        ( "icmp_block_inversion", False ),             # b
        ( "forward", False ),                          # b
        )
    ADDITIONAL_ALNUM_CHARS = [ "_", "-", "/" ]
    PARSER_REQUIRED_ELEMENT_ATTRS = {
        "short": None,
        "description": None,
        "zone": None,
        "service": [ "name" ],
        "port": [ "port", "protocol" ],
        "icmp-block": [ "name" ],
        "icmp-type": [ "name" ],
        "forward": None,
        "forward-port": [ "port", "protocol" ],
        "interface": [ "name" ],
        "rule": None,
        "source": None,
        "destination": None,
        "protocol": [ "value" ],
        "source-port": [ "port", "protocol" ],
        "log":  None,
        "audit": None,
        "accept": None,
        "reject": None,
        "drop": None,
        "mark": [ "set" ],
        "limit": [ "value" ],
        "icmp-block-inversion": None,
        }
    PARSER_OPTIONAL_ELEMENT_ATTRS = {
        "zone": [ "name", "immutable", "target", "version" ],
        "masquerade": [ "enabled" ],
        "forward-port": [ "to-port", "to-addr" ],
        "rule": [ "family", "priority" ],
        "source": [ "address", "mac", "invert", "family", "ipset" ],
        "destination": [ "address", "invert", "ipset" ],
        "log": [ "prefix", "level" ],
        "reject": [ "type" ],
        "limit": ["burst"],
        }

    @staticmethod
    def index_of(element):
        for i, (el, dummy) in enumerate(Zone.IMPORT_EXPORT_STRUCTURE):
            if el == element:
                return i
        raise FirewallError(errors.UNKNOWN_ERROR, "index_of()")
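    # Example (derived from IMPORT_EXPORT_STRUCTURE above): index_of("services")
    # returns 5 and index_of("interfaces") returns 10; an unknown element name
    # raises UNKNOWN_ERROR.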

    def __init__(self):
        super(Zone, self).__init__()
        self.version = ""
        self.short = ""
        self.description = ""
        self.UNUSED = False
        self.target = DEFAULT_ZONE_TARGET
        self.services = [ ]
        self.ports = [ ]
        self.protocols = [ ]
        self.icmp_blocks = [ ]
        self.forward = False
        self.masquerade = False
        self.forward_ports = [ ]
        self.source_ports = [ ]
        self.interfaces = [ ]
        self.sources = [ ]
        self.fw_config = None # to be able to check services and icmp_blocks
        self.rules = [ ]
        self.rules_str = [ ]
        self.icmp_block_inversion = False
        self.combined = False
        self.applied = False

    def cleanup(self):
        self.version = ""
        self.short = ""
        self.description = ""
        self.UNUSED = False
        self.target = DEFAULT_ZONE_TARGET
        del self.services[:]
        del self.ports[:]
        del self.protocols[:]
        del self.icmp_blocks[:]
        self.forward = False
        self.masquerade = False
        del self.forward_ports[:]
        del self.source_ports[:]
        del self.interfaces[:]
        del self.sources[:]
        self.fw_config = None # to be able to check services and icmp_blocks
        del self.rules[:]
        del self.rules_str[:]
        self.icmp_block_inversion = False
        self.combined = False
        self.applied = False

    def encode_strings(self):
        """ HACK. I haven't been able to make sax parser return
            strings encoded (because of python 2) instead of in unicode.
            Get rid of it once we throw out python 2 support."""
        self.version = u2b_if_py2(self.version)
        self.short = u2b_if_py2(self.short)
        self.description = u2b_if_py2(self.description)
        self.target = u2b_if_py2(self.target)
        self.services = [u2b_if_py2(s) for s in self.services]
        self.ports = [(u2b_if_py2(po),u2b_if_py2(pr)) for (po,pr) in self.ports]
        self.protocols = [u2b_if_py2(pr) for pr in self.protocols]
        self.icmp_blocks = [u2b_if_py2(i) for i in self.icmp_blocks]
        self.forward_ports = [(u2b_if_py2(p1),u2b_if_py2(p2),u2b_if_py2(p3),u2b_if_py2(p4)) for (p1,p2,p3,p4) in self.forward_ports]
        self.source_ports = [(u2b_if_py2(po),u2b_if_py2(pr)) for (po,pr)
                             in self.source_ports]
        self.interfaces = [u2b_if_py2(i) for i in self.interfaces]
        self.sources = [u2b_if_py2(s) for s in self.sources]
        self.rules = [u2b_if_py2(s) for s in self.rules]
        self.rules_str = [u2b_if_py2(s) for s in self.rules_str]

    def __setattr__(self, name, value):
        if name == "rules_str":
            self.rules = [rich.Rich_Rule(rule_str=s) for s in value]
            # must convert back to string to get the canonical string.
            super(Zone, self).__setattr__(name, [str(s) for s in self.rules])
        else:
            super(Zone, self).__setattr__(name, value)

    def export_config_dict(self):
        conf = super(Zone, self).export_config_dict()
        del conf["UNUSED"]
        return conf

    def _check_config(self, config, item, all_config):
        common_check_config(self, config, item, all_config)

        if item == "target":
            if config not in ZONE_TARGETS:
                raise FirewallError(errors.INVALID_TARGET, config)
        elif item == "interfaces":
            for interface in config:
                if not checkInterface(interface):
                    raise FirewallError(errors.INVALID_INTERFACE, interface)
                if self.fw_config:
                    for zone in self.fw_config.get_zones():
                        if zone == self.name:
                            continue
                        if interface in self.fw_config.get_zone(zone).interfaces:
                            raise FirewallError(errors.INVALID_INTERFACE,
                                    "interface '{}' already bound to zone '{}'".format(interface, zone))
        elif item == "sources":
            for source in config:
                if not checkIPnMask(source) and not checkIP6nMask(source) and \
                   not check_mac(source) and not source.startswith("ipset:"):
                    raise FirewallError(errors.INVALID_ADDR, source)
                if self.fw_config:
                    for zone in self.fw_config.get_zones():
                        if zone == self.name:
                            continue
                        if source in self.fw_config.get_zone(zone).sources:
                            raise FirewallError(errors.INVALID_ADDR,
                                    "source '{}' already bound to zone '{}'".format(source, zone))


    def check_name(self, name):
        super(Zone, self).check_name(name)
        if name.startswith('/'):
            raise FirewallError(errors.INVALID_NAME,
                                "'%s' can't start with '/'" % name)
        elif name.endswith('/'):
            raise FirewallError(errors.INVALID_NAME,
                                "'%s' can't end with '/'" % name)
        elif name.count('/') > 1:
            raise FirewallError(errors.INVALID_NAME,
                                "more than one '/' in '%s'" % name)
        else:
            if "/" in name:
                checked_name = name[:name.find('/')]
            else:
                checked_name = name
            if len(checked_name) > max_zone_name_len():
                raise FirewallError(errors.INVALID_NAME,
                                    "Zone of '%s' has %d chars, max is %d %s" % (
                                    name, len(checked_name),
                                    max_zone_name_len(),
                                    self.combined))
            if self.fw_config:
                if checked_name in self.fw_config.get_policy_objects():
                    raise FirewallError(errors.NAME_CONFLICT, "Zones can't have the same name as a policy.")

    def combine(self, zone):
        self.combined = True
        self.filename = None
        self.version = ""
        self.short = ""
        self.description = ""

        for interface in zone.interfaces:
            if interface not in self.interfaces:
                self.interfaces.append(interface)
        for source in zone.sources:
            if source not in self.sources:
                self.sources.append(source)
        for service in zone.services:
            if service not in self.services:
                self.services.append(service)
        for port in zone.ports:
            if port not in self.ports:
                self.ports.append(port)
        for proto in zone.protocols:
            if proto not in self.protocols:
                self.protocols.append(proto)
        for icmp in zone.icmp_blocks:
            if icmp not in self.icmp_blocks:
                self.icmp_blocks.append(icmp)
        if zone.forward:
            self.forward = True
        if zone.masquerade:
            self.masquerade = True
        for forward in zone.forward_ports:
            if forward not in self.forward_ports:
                self.forward_ports.append(forward)
        for port in zone.source_ports:
            if port not in self.source_ports:
                self.source_ports.append(port)
        for rule in zone.rules:
            self.rules.append(rule)
            self.rules_str.append(str(rule))
        if zone.icmp_block_inversion:
            self.icmp_block_inversion = True

# PARSER

class zone_ContentHandler(IO_Object_ContentHandler):
    def __init__(self, item):
        IO_Object_ContentHandler.__init__(self, item)
        self._rule = None
        self._rule_error = False
        self._limit_ok = None

    def startElement(self, name, attrs):
        IO_Object_ContentHandler.startElement(self, name, attrs)
        if self._rule_error:
            return

        self.item.parser_check_element_attrs(name, attrs)

        if common_startElement(self, name, attrs):
            return

        elif name == "zone":
            if "name" in attrs:
                log.warning("Ignoring deprecated attribute name='%s'",
                            attrs["name"])
            if "version" in attrs:
                self.item.version = attrs["version"]
            if "immutable" in attrs:
                log.warning("Ignoring deprecated attribute immutable='%s'",
                            attrs["immutable"])
            if "target" in attrs:
                target = attrs["target"]
                if target not in ZONE_TARGETS:
                    raise FirewallError(errors.INVALID_TARGET, target)
                if target != "" and target != DEFAULT_ZONE_TARGET:
                    self.item.target = target

        elif name == "forward":
            if self.item.forward:
                log.warning("Forward already set, ignoring.")
            else:
                self.item.forward = True

        elif name == "interface":
            if self._rule:
                log.warning('Invalid rule: interface element used inside rule.')
                self._rule_error = True
                return
            # zone bound to interface
            if "name" not in attrs:
                log.warning('Invalid interface: Name missing.')
                self._rule_error = True
                return
            if attrs["name"] not in self.item.interfaces:
                self.item.interfaces.append(attrs["name"])
            else:
                log.warning("Interface '%s' already set, ignoring.",
                            attrs["name"])

        elif name == "source":
            if self._rule:
                if self._rule.source:
                    log.warning("Invalid rule: More than one source in rule '%s', ignoring.",
                                str(self._rule))
                    self._rule_error = True
                    return
                invert = False
                if "invert" in attrs and \
                        attrs["invert"].lower() in [ "yes", "true" ]:
                    invert = True
                addr = mac = ipset = None
                if "address" in attrs:
                    addr = attrs["address"]
                if "mac" in attrs:
                    mac = attrs["mac"]
                if "ipset" in attrs:
                    ipset = attrs["ipset"]
                self._rule.source = rich.Rich_Source(addr, mac, ipset,
                                                     invert=invert)
                return
            # zone bound to source
            if "address" not in attrs and "ipset" not in attrs:
                log.warning('Invalid source: No address and no ipset.')
                return
            if "address" in attrs and "ipset" in attrs:
                log.warning('Invalid source: Both address and ipset given.')
                return
            if "family" in attrs:
                log.warning("Ignoring deprecated attribute family='%s'",
                            attrs["family"])
            if "invert" in attrs:
                log.warning('Invalid source: Inversion not allowed here.')
                return
            if "address" in attrs:
                if not checkIPnMask(attrs["address"]) and \
                   not checkIP6nMask(attrs["address"]) and \
                   not check_mac(attrs["address"]):
                    raise FirewallError(errors.INVALID_ADDR, attrs["address"])
            if "ipset" in attrs:
                entry = "ipset:%s" % attrs["ipset"]
                if entry not in self.item.sources:
                    self.item.sources.append(entry)
                else:
                    log.warning("Source '%s' already set, ignoring.",
                                attrs["address"])
            if "address" in attrs:
                entry = attrs["address"]
                if entry not in self.item.sources:
                    self.item.sources.append(entry)
                else:
                    log.warning("Source '%s' already set, ignoring.",
                                attrs["address"])

        elif name == "icmp-block-inversion":
            if self.item.icmp_block_inversion:
                log.warning("Icmp-Block-Inversion already set, ignoring.")
            else:
                self.item.icmp_block_inversion = True

        else:
            log.warning("Unknown XML element '%s'", name)
            return

    def endElement(self, name):
        IO_Object_ContentHandler.endElement(self, name)

        common_endElement(self, name)

def zone_reader(filename, path, no_check_name=False):
    zone = Zone()
    if not filename.endswith(".xml"):
        raise FirewallError(errors.INVALID_NAME,
                            "'%s' is missing .xml suffix" % filename)
    zone.name = filename[:-4]
    if not no_check_name:
        zone.check_name(zone.name)
    zone.filename = filename
    zone.path = path
    zone.builtin = False if path.startswith(config.ETC_FIREWALLD) else True
    zone.default = zone.builtin
    handler = zone_ContentHandler(zone)
    parser = sax.make_parser()
    parser.setContentHandler(handler)
    name = "%s/%s" % (path, filename)
    with open(name, "rb") as f:
        source = sax.InputSource(None)
        source.setByteStream(f)
        try:
            parser.parse(source)
        except sax.SAXParseException as msg:
            raise FirewallError(errors.INVALID_ZONE,
                                "not a valid zone file: %s" % \
                                msg.getException())
    del handler
    del parser
    if PY2:
        zone.encode_strings()
    return zone

def zone_writer(zone, path=None):
    _path = path if path else zone.path

    if zone.filename:
        name = "%s/%s" % (_path, zone.filename)
    else:
        name = "%s/%s.xml" % (_path, zone.name)

    if os.path.exists(name):
        try:
            shutil.copy2(name, "%s.old" % name)
        except Exception as msg:
            log.error("Backup of file '%s' failed: %s", name, msg)

    dirpath = os.path.dirname(name)
    if dirpath.startswith(config.ETC_FIREWALLD) and not os.path.exists(dirpath):
        if not os.path.exists(config.ETC_FIREWALLD):
            os.mkdir(config.ETC_FIREWALLD, 0o750)
        os.mkdir(dirpath, 0o750)

    f = io.open(name, mode='wt', encoding='UTF-8')
    handler = IO_Object_XMLGenerator(f)
    handler.startDocument()

    # start zone element
    attrs = {}
    if zone.version and zone.version != "":
        attrs["version"] = zone.version
    if zone.target != DEFAULT_ZONE_TARGET:
        attrs["target"] = zone.target
    handler.startElement("zone", attrs)
    handler.ignorableWhitespace("\n")

    common_writer(zone, handler)

    # interfaces
    for interface in uniqify(zone.interfaces):
        handler.ignorableWhitespace("  ")
        handler.simpleElement("interface", { "name": interface })
        handler.ignorableWhitespace("\n")

    # source
    for source in uniqify(zone.sources):
        handler.ignorableWhitespace("  ")
        if "ipset:" in source:
            handler.simpleElement("source", { "ipset": source[6:] })
        else:
            handler.simpleElement("source", { "address": source })
        handler.ignorableWhitespace("\n")

    # icmp-block-inversion
    if zone.icmp_block_inversion:
        handler.ignorableWhitespace("  ")
        handler.simpleElement("icmp-block-inversion", { })
        handler.ignorableWhitespace("\n")

    # forward
    if zone.forward:
        handler.ignorableWhitespace("  ")
        handler.simpleElement("forward", { })
        handler.ignorableWhitespace("\n")

    # end zone element
    handler.endElement("zone")
    handler.ignorableWhitespace("\n")
    handler.endDocument()
    f.close()
    del handler
site-packages/firewall/core/io/__init__.py
# -*- coding: utf-8 -*-
#
# Copyright (C) 2012 Red Hat, Inc.
#
# Authors:
# Thomas Woerner <twoerner@redhat.com>
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program.  If not, see <http://www.gnu.org/licenses/>.
#

# fix xmlplus to be compatible with the python xml sax parser and python 3
# by adding __contains__ to xml.sax.xmlreader.AttributesImpl
import xml
if "_xmlplus" in xml.__file__:
    from xml.sax.xmlreader import AttributesImpl
    if not hasattr(AttributesImpl, "__contains__"):
        # this is missing:
        def __AttributesImpl__contains__(self, name):
            return name in getattr(self, "_attrs")
        # add it using the name __contains__
        setattr(AttributesImpl, "__contains__", __AttributesImpl__contains__)
    from xml.sax.saxutils import XMLGenerator
    if not hasattr(XMLGenerator, "_write"):
        # this is missing:
        def __XMLGenerator_write(self, text):
            getattr(self, "_out").write(text)
        # add it using the name _write
        setattr(XMLGenerator, "_write", __XMLGenerator_write)
site-packages/firewall/core/io/functions.py
# -*- coding: utf-8 -*-
#
# Copyright (C) 2018 Red Hat, Inc.
#
# Authors:
# Eric Garver <egarver@redhat.com>
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program.  If not, see <http://www.gnu.org/licenses/>.
#

import os

from firewall import config
from firewall.errors import FirewallError

from firewall.core.fw_config import FirewallConfig
from firewall.core.io.zone import zone_reader
from firewall.core.io.service import service_reader
from firewall.core.io.ipset import ipset_reader
from firewall.core.io.icmptype import icmptype_reader
from firewall.core.io.helper import helper_reader
from firewall.core.io.policy import policy_reader
from firewall.core.io.direct import Direct
from firewall.core.io.lockdown_whitelist import LockdownWhitelist
from firewall.core.io.firewalld_conf import firewalld_conf

def check_config(fw):
    fw_config = FirewallConfig(fw)
    readers = {
        "ipset":    {"reader": ipset_reader,
                     "add": fw_config.add_ipset,
                     "dirs": [config.FIREWALLD_IPSETS, config.ETC_FIREWALLD_IPSETS],
                    },
        "helper":   {"reader": helper_reader,
                     "add": fw_config.add_helper,
                     "dirs": [config.FIREWALLD_HELPERS, config.ETC_FIREWALLD_HELPERS],
                    },
        "icmptype": {"reader": icmptype_reader,
                     "add": fw_config.add_icmptype,
                     "dirs": [config.FIREWALLD_ICMPTYPES, config.ETC_FIREWALLD_ICMPTYPES],
                    },
        "service":  {"reader": service_reader,
                     "add": fw_config.add_service,
                     "dirs": [config.FIREWALLD_SERVICES, config.ETC_FIREWALLD_SERVICES],
                    },
        "zone":     {"reader": zone_reader,
                     "add": fw_config.add_zone,
                     "dirs": [config.FIREWALLD_ZONES, config.ETC_FIREWALLD_ZONES],
                    },
        "policy":   {"reader": policy_reader,
                     "add": fw_config.add_policy_object,
                     "dirs": [config.FIREWALLD_POLICIES, config.ETC_FIREWALLD_POLICIES],
                    },
    }
    for reader in readers.keys():
        for _dir in readers[reader]["dirs"]:
            if not os.path.isdir(_dir):
                continue
            for file in sorted(os.listdir(_dir)):
                if file.endswith(".xml"):
                    try:
                        obj = readers[reader]["reader"](file, _dir)
                        if reader in ["zone", "policy"]:
                            obj.fw_config = fw_config
                        obj.check_config_dict(obj.export_config_dict())
                        readers[reader]["add"](obj)
                    except FirewallError as error:
                        raise FirewallError(error.code, "'%s': %s" % (file, error.msg))
                    except Exception as msg:
                        raise Exception("'%s': %s" % (file, msg))
    if os.path.isfile(config.FIREWALLD_DIRECT):
        try:
            obj = Direct(config.FIREWALLD_DIRECT)
            obj.read()
            obj.check_config(obj.export_config())
        except FirewallError as error:
            raise FirewallError(error.code, "'%s': %s" % (config.FIREWALLD_DIRECT, error.msg))
        except Exception as msg:
            raise Exception("'%s': %s" % (config.FIREWALLD_DIRECT, msg))
    if os.path.isfile(config.LOCKDOWN_WHITELIST):
        try:
            obj = LockdownWhitelist(config.LOCKDOWN_WHITELIST)
            obj.read()
            obj.check_config(obj.export_config())
        except FirewallError as error:
            raise FirewallError(error.code, "'%s': %s" % (config.LOCKDOWN_WHITELIST, error.msg))
        except Exception as msg:
            raise Exception("'%s': %s" % (config.LOCKDOWN_WHITELIST, msg))
    if os.path.isfile(config.FIREWALLD_CONF):
        try:
            obj = firewalld_conf(config.FIREWALLD_CONF)
            obj.read()
        except FirewallError as error:
            raise FirewallError(error.code, "'%s': %s" % (config.FIREWALLD_CONF, error.msg))
        except Exception as msg:
            raise Exception("'%s': %s" % (config.FIREWALLD_CONF, msg))
site-packages/firewall/core/io/lockdown_whitelist.py
# -*- coding: utf-8 -*-
#
# Copyright (C) 2011-2016 Red Hat, Inc.
#
# Authors:
# Thomas Woerner <twoerner@redhat.com>
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program.  If not, see <http://www.gnu.org/licenses/>.
#

import xml.sax as sax
import os
import io
import shutil

from firewall import config
from firewall.core.io.io_object import PY2, IO_Object, \
                    IO_Object_ContentHandler, IO_Object_XMLGenerator
from firewall.core.logger import log
from firewall.functions import uniqify, checkUser, checkUid, checkCommand, \
                               checkContext, u2b_if_py2
from firewall import errors
from firewall.errors import FirewallError

class lockdown_whitelist_ContentHandler(IO_Object_ContentHandler):
    def __init__(self, item):
        IO_Object_ContentHandler.__init__(self, item)
        self.whitelist = False

    def startElement(self, name, attrs):
        IO_Object_ContentHandler.startElement(self, name, attrs)
        self.item.parser_check_element_attrs(name, attrs)

        if name == "whitelist":
            if self.whitelist:
                raise FirewallError(errors.PARSE_ERROR,
                                    "More than one whitelist.")
            self.whitelist = True

        elif name == "command":
            if not self.whitelist:
                log.error("Parse Error: command outside of whitelist")
                return
            command = attrs["name"]
            self.item.add_command(command)

        elif name == "user":
            if not self.whitelist:
                log.error("Parse Error: user outside of whitelist")
                return
            if "id" in attrs:
                try:
                    uid = int(attrs["id"])
                except ValueError:
                    log.error("Parse Error: %s is not a valid uid" % 
                              attrs["id"])
                    return
                self.item.add_uid(uid)
            elif "name" in attrs:
                self.item.add_user(attrs["name"])

        elif name == "selinux":
            if not self.whitelist:
                log.error("Parse Error: selinux outside of whitelist")
                return
            if "context" not in attrs:
                log.error("Parse Error: no context")
                return
            self.item.add_context(attrs["context"])

        else:
            log.error('Unknown XML element %s', name)
            return

class LockdownWhitelist(IO_Object):
    """ LockdownWhitelist class """

    IMPORT_EXPORT_STRUCTURE = (
        ( "commands", [ "" ] ),   # as
        ( "contexts", [ "" ] ),   # as
        ( "users", [ "" ] ),      # as
        ( "uids", [ 0 ] )         # ai
        )
    DBUS_SIGNATURE = '(asasasai)'
    ADDITIONAL_ALNUM_CHARS = [ "_" ]
    PARSER_REQUIRED_ELEMENT_ATTRS = {
        "whitelist": None,
        "command": [ "name" ],
        "user": None,
#        "group": None,
        "selinux": [ "context" ],
        }
    PARSER_OPTIONAL_ELEMENT_ATTRS = {
        "user": [ "id", "name" ],
#        "group": [ "id", "name" ],
        }

    def __init__(self, filename):
        super(LockdownWhitelist, self).__init__()
        self.filename = filename
        self.parser = None
        self.commands = [ ]
        self.contexts = [ ]
        self.users = [ ]
        self.uids = [ ]
#        self.gids = [ ]
#        self.groups = [ ]

    def _check_config(self, config, item, all_config):
        if item in [ "commands", "contexts", "users", "uids" ]:
            for x in config:
                self._check_config(x, item[:-1], all_config)
        elif item == "command":
            if not checkCommand(config):
                raise FirewallError(errors.INVALID_COMMAND, config)
        elif item == "context":
            if not checkContext(config):
                raise FirewallError(errors.INVALID_CONTEXT, config)
        elif item == "user":
            if not checkUser(config):
                raise FirewallError(errors.INVALID_USER, config)
        elif item == "uid":
            if not checkUid(config):
                raise FirewallError(errors.INVALID_UID, config)

    def cleanup(self):
        del self.commands[:]
        del self.contexts[:]
        del self.users[:]
        del self.uids[:]
#        del self.gids[:]
#        del self.groups[:]

    def encode_strings(self):
        """ HACK. I haven't been able to make sax parser return
            strings encoded (because of python 2) instead of in unicode.
            Get rid of it once we throw out python 2 support."""
        self.commands = [ u2b_if_py2(x) for x in self.commands ]
        self.contexts = [ u2b_if_py2(x) for x in self.contexts ]
        self.users = [ u2b_if_py2(x) for x in self.users ]

    # commands

    def add_command(self, command):
        if not checkCommand(command):
            raise FirewallError(errors.INVALID_COMMAND, command)
        if command not in self.commands:
            self.commands.append(command)
        else:
            raise FirewallError(errors.ALREADY_ENABLED,
                                'Command "%s" already in whitelist' % command)

    def remove_command(self, command):
        if command in self.commands:
            self.commands.remove(command)
        else:
            raise FirewallError(errors.NOT_ENABLED,
                                'Command "%s" not in whitelist.' % command)

    def has_command(self, command):
        return (command in self.commands)

    def match_command(self, command):
        for _command in self.commands:
            if _command.endswith("*"):
                if command.startswith(_command[:-1]):
                    return True
            else:
                if _command == command:
                    return True
        return False
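    # Illustrative example (hypothetical entries): with self.commands set to
    # ["/usr/bin/firewall-config", "/usr/libexec/*"],
    # match_command("/usr/libexec/foo") is True via the trailing-'*' prefix
    # match, while has_command("/usr/libexec/foo") is False.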

    def get_commands(self):
        return self.commands

    # user ids

    def add_uid(self, uid):
        if not checkUid(uid):
            raise FirewallError(errors.INVALID_UID, str(uid))
        if uid not in self.uids:
            self.uids.append(uid)
        else:
            raise FirewallError(errors.ALREADY_ENABLED,
                                'Uid "%s" already in whitelist' % uid)


    def remove_uid(self, uid):
        if uid in self.uids:
            self.uids.remove(uid)
        else:
            raise FirewallError(errors.NOT_ENABLED,
                                'Uid "%s" not in whitelist.' % uid)

    def has_uid(self, uid):
        return (uid in self.uids)

    def match_uid(self, uid):
        return (uid in self.uids)

    def get_uids(self):
        return self.uids

    # users

    def add_user(self, user):
        if not checkUser(user):
            raise FirewallError(errors.INVALID_USER, user)
        if user not in self.users:
            self.users.append(user)
        else:
            raise FirewallError(errors.ALREADY_ENABLED,
                                'User "%s" already in whitelist' % user)


    def remove_user(self, user):
        if user in self.users:
            self.users.remove(user)
        else:
            raise FirewallError(errors.NOT_ENABLED,
                                'User "%s" not in whitelist.' % user)

    def has_user(self, user):
        return (user in self.users)

    def match_user(self, user):
        return (user in self.users)

    def get_users(self):
        return self.users

#    # group ids
#
#    def add_gid(self, gid):
#        if gid not in self.gids:
#            self.gids.append(gid)
#
#    def remove_gid(self, gid):
#        if gid in self.gids:
#            self.gids.remove(gid)
#        else:
#            raise FirewallError(errors.NOT_ENABLED,
#                                'Gid "%s" not in whitelist.' % gid)
#
#    def has_gid(self, gid):
#        return (gid in self.gids)
#
#    def match_gid(self, gid):
#        return (gid in self.gids)
#
#    def get_gids(self):
#        return self.gids

#    # groups
#
#    def add_group(self, group):
#        if group not in self.groups:
#            self.groups.append(group)
#
#    def remove_group(self, group):
#        if group in self.groups:
#            self.groups.remove(group)
#        else:
#            raise FirewallError(errors.NOT_ENABLED,
#                                'Group "%s" not in whitelist.' % group)
#
#    def has_group(self, group):
#        return (group in self.groups)
#
#    def match_group(self, group):
#        return (group in self.groups)
#
#    def get_groups(self):
#        return self.groups

    # selinux contexts

    def add_context(self, context):
        if not checkContext(context):
            raise FirewallError(errors.INVALID_CONTEXT, context)
        if context not in self.contexts:
            self.contexts.append(context)
        else:
            raise FirewallError(errors.ALREADY_ENABLED,
                                'Context "%s" already in whitelist' % context)


    def remove_context(self, context):
        if context in self.contexts:
            self.contexts.remove(context)
        else:
            raise FirewallError(errors.NOT_ENABLED,
                                'Context "%s" not in whitelist.' % context)

    def has_context(self, context):
        return (context in self.contexts)

    def match_context(self, context):
        return (context in self.contexts)

    def get_contexts(self):
        return self.contexts

    # read and write

    def read(self):
        self.cleanup()
        if not self.filename.endswith(".xml"):
            raise FirewallError(errors.INVALID_NAME,
                                "'%s' is missing .xml suffix" % self.filename)
        handler = lockdown_whitelist_ContentHandler(self)
        parser = sax.make_parser()
        parser.setContentHandler(handler)
        try:
            parser.parse(self.filename)
        except sax.SAXParseException as msg:
            raise FirewallError(errors.INVALID_TYPE,
                                "Not a valid file: %s" % \
                                msg.getException())
        del handler
        del parser
        if PY2:
            self.encode_strings()

    def write(self):
        if os.path.exists(self.filename):
            try:
                shutil.copy2(self.filename, "%s.old" % self.filename)
            except Exception as msg:
                raise IOError("Backup of '%s' failed: %s" % (self.filename, msg))

        if not os.path.exists(config.ETC_FIREWALLD):
            os.mkdir(config.ETC_FIREWALLD, 0o750)

        f = io.open(self.filename, mode='wt', encoding='UTF-8')
        handler = IO_Object_XMLGenerator(f)
        handler.startDocument()

        # start whitelist element
        handler.startElement("whitelist", { })
        handler.ignorableWhitespace("\n")

        # commands
        for command in uniqify(self.commands):
            handler.ignorableWhitespace("  ")
            handler.simpleElement("command", { "name": command })
            handler.ignorableWhitespace("\n")

        for uid in uniqify(self.uids):
            handler.ignorableWhitespace("  ")
            handler.simpleElement("user", { "id": str(uid) })
            handler.ignorableWhitespace("\n")

        for user in uniqify(self.users):
            handler.ignorableWhitespace("  ")
            handler.simpleElement("user", { "name": user })
            handler.ignorableWhitespace("\n")

#        for gid in uniqify(self.gids):
#            handler.ignorableWhitespace("  ")
#            handler.simpleElement("user", { "id": str(gid) })
#            handler.ignorableWhitespace("\n")

#        for group in uniqify(self.groups):
#            handler.ignorableWhitespace("  ")
#            handler.simpleElement("group", { "name": group })
#            handler.ignorableWhitespace("\n")

        for context in uniqify(self.contexts):
            handler.ignorableWhitespace("  ")
            handler.simpleElement("selinux", { "context": context })
            handler.ignorableWhitespace("\n")

        # end whitelist element
        handler.endElement("whitelist")
        handler.ignorableWhitespace("\n")
        handler.endDocument()
        f.close()
        del handler
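
# For reference, write() produces a lockdown whitelist file roughly of this
# shape (illustrative entries):
#
#   <?xml version="1.0" encoding="utf-8"?>
#   <whitelist>
#     <command name="/usr/bin/foo*"/>
#     <user id="815"/>
#     <user name="alice"/>
#     <selinux context="system_u:system_r:NetworkManager_t:s0"/>
#   </whitelist>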
site-packages/firewall/core/io/io_object.py
# -*- coding: utf-8 -*-
#
# Copyright (C) 2011-2016 Red Hat, Inc.
#
# Authors:
# Thomas Woerner <twoerner@redhat.com>
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program.  If not, see <http://www.gnu.org/licenses/>.
#

"""Generic io_object handler, io specific check methods."""

__all__ = [ "PY2",
            "IO_Object", "IO_Object_ContentHandler", "IO_Object_XMLGenerator",
            "check_port", "check_tcpudp", "check_protocol", "check_address" ]

import xml.sax as sax
import xml.sax.saxutils as saxutils
import copy
import sys
from collections import OrderedDict

from firewall import functions
from firewall.functions import b2u
from firewall import errors
from firewall.errors import FirewallError

PY2 = sys.version < '3'

class IO_Object(object):
    """ Abstract IO_Object as base for icmptype, service and zone """

    IMPORT_EXPORT_STRUCTURE = ( )
    DBUS_SIGNATURE = '()'
    ADDITIONAL_ALNUM_CHARS = [ ] # additional to alnum
    PARSER_REQUIRED_ELEMENT_ATTRS = { }
    PARSER_OPTIONAL_ELEMENT_ATTRS = { }

    def __init__(self):
        self.filename = ""
        self.path = ""
        self.name = ""
        self.default = False
        self.builtin = False

    def export_config(self):
        ret = [ ]
        for x in self.IMPORT_EXPORT_STRUCTURE:
            ret.append(copy.deepcopy(getattr(self, x[0])))
        return tuple(ret)

    def export_config_dict(self):
        conf = {}
        type_formats = dict([(x[0], x[1]) for x in self.IMPORT_EXPORT_STRUCTURE])
        for key in type_formats:
            if getattr(self, key) or isinstance(getattr(self, key), bool):
                conf[key] = copy.deepcopy(getattr(self, key))
        return conf

    def import_config(self, conf):
        self.check_config(conf)
        for i,(element,dummy) in enumerate(self.IMPORT_EXPORT_STRUCTURE):
            if isinstance(conf[i], list):
                # remove duplicates without changing the order
                _conf = [ ]
                _set = set()
                for x in conf[i]:
                    if x not in _set:
                        _conf.append(x)
                        _set.add(x)
                del _set
                setattr(self, element, copy.deepcopy(_conf))
            else:
                setattr(self, element, copy.deepcopy(conf[i]))

    def import_config_dict(self, conf):
        self.check_config_dict(conf)

        for key in conf:
            if not hasattr(self, key):
                raise FirewallError(errors.UNKNOWN_ERROR, "Internal error. '{}' is not a valid attribute".format(key))
            if isinstance(conf[key], list):
                # maintain list order while removing duplicates
                setattr(self, key, list(OrderedDict.fromkeys(copy.deepcopy(conf[key]))))
            else:
                setattr(self, key, copy.deepcopy(conf[key]))

    def check_name(self, name):
        if not isinstance(name, str):
            raise FirewallError(errors.INVALID_TYPE,
                                "'%s' not of type %s, but %s" % (name, type(""),
                                                                 type(name)))
        if len(name) < 1:
            raise FirewallError(errors.INVALID_NAME, "name can't be empty")
        for char in name:
            if not char.isalnum() and char not in self.ADDITIONAL_ALNUM_CHARS:
                raise FirewallError(
                    errors.INVALID_NAME,
                    "'%s' is not allowed in '%s'" % ((char, name)))

    def check_config(self, conf):
        if len(conf) != len(self.IMPORT_EXPORT_STRUCTURE):
            raise FirewallError(
                errors.INVALID_TYPE,
                "structure size mismatch %d != %d" % \
                (len(conf), len(self.IMPORT_EXPORT_STRUCTURE)))
        conf_dict = {}
        for i,(x,y) in enumerate(self.IMPORT_EXPORT_STRUCTURE):
            conf_dict[x] = conf[i]
        self.check_config_dict(conf_dict)

    def check_config_dict(self, conf):
        type_formats = dict([(x[0], x[1]) for x in self.IMPORT_EXPORT_STRUCTURE])
        for key in conf:
            if key not in [x for (x,y) in self.IMPORT_EXPORT_STRUCTURE]:
                raise FirewallError(errors.INVALID_OPTION, "option '{}' is not valid".format(key))
            self._check_config_structure(conf[key], type_formats[key])
            self._check_config(conf[key], key, conf)

    def _check_config(self, dummy1, dummy2, dummy3):
        # to be overloaded by sub classes
        return

    def _check_config_structure(self, conf, structure):
        if not isinstance(conf, type(structure)):
            raise FirewallError(errors.INVALID_TYPE,
                                "'%s' not of type %s, but %s" % \
                                (conf, type(structure), type(conf)))
        if isinstance(structure, list):
            # same type elements, else struct
            if len(structure) != 1:
                raise FirewallError(errors.INVALID_TYPE,
                                    "len('%s') != 1" % structure)
            for x in conf:
                self._check_config_structure(x, structure[0])
        elif isinstance(structure, tuple):
            if len(structure) != len(conf):
                raise FirewallError(errors.INVALID_TYPE,
                                    "len('%s') != %d" % (conf,
                                                         len(structure)))
            for i,value in enumerate(structure):
                self._check_config_structure(conf[i], value)
        elif isinstance(structure, dict):
            # only one key value pair in structure
            (skey, svalue) = list(structure.items())[0]
            for (key, value) in conf.items():
                if not isinstance(key, type(skey)):
                    raise FirewallError(errors.INVALID_TYPE,
                                        "'%s' not of type %s, but %s" % (\
                            key, type(skey), type(key)))
                if not isinstance(value, type(svalue)):
                    raise FirewallError(errors.INVALID_TYPE,
                                        "'%s' not of type %s, but %s" % (\
                            value, type(svalue), type(value)))
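
    # Illustrative structure templates (the concrete IMPORT_EXPORT_STRUCTURE
    # tuples live in subclasses such as Service or Zone; the field names here
    # are only examples):
    #
    #   ("short", "")             - a plain string field
    #   ("ports", [("", "")])     - a list whose entries are (str, str) tuples
    #   ("destination", {"": ""}) - a dict with str keys and str values
    #
    # For example, _check_config_structure([("80", "tcp")], [("", "")]) passes,
    # while _check_config_structure([("80", 80)], [("", "")]) raises
    # FirewallError(INVALID_TYPE) because 80 is not a string.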

    # check required elements and attributes and also optional attributes
    def parser_check_element_attrs(self, name, attrs):
        _attrs = attrs.getNames()

        found = False
        if name in self.PARSER_REQUIRED_ELEMENT_ATTRS:
            found = True
            if self.PARSER_REQUIRED_ELEMENT_ATTRS[name] is not None:
                for x in self.PARSER_REQUIRED_ELEMENT_ATTRS[name]:
                    if x in _attrs:
                        _attrs.remove(x)
                    else:
                        raise FirewallError(
                            errors.PARSE_ERROR,
                            "Missing attribute %s for %s" % (x, name))
        if name in self.PARSER_OPTIONAL_ELEMENT_ATTRS:
            found = True
            for x in self.PARSER_OPTIONAL_ELEMENT_ATTRS[name]:
                if x in _attrs:
                    _attrs.remove(x)
        if not found:
            raise FirewallError(errors.PARSE_ERROR,
                                "Unexpected element %s" % name)
        # raise attributes[0]
        for x in _attrs:
            raise FirewallError(errors.PARSE_ERROR,
                                "%s: Unexpected attribute %s" % (name, x))

# PARSER

class UnexpectedElementError(Exception):
    def __init__(self, name):
        super(UnexpectedElementError, self).__init__()
        self.name = name
    def __str__(self):
        return "Unexpected element '%s'" % (self.name)

class MissingAttributeError(Exception):
    def __init__(self, name, attribute):
        super(MissingAttributeError, self).__init__()
        self.name = name
        self.attribute = attribute
    def __str__(self):
        return "Element '%s': missing '%s' attribute" % \
            (self.name, self.attribute)

class UnexpectedAttributeError(Exception):
    def __init__(self, name, attribute):
        super(UnexpectedAttributeError, self).__init__()
        self.name = name
        self.attribute = attribute
    def __str__(self):
        return "Element '%s': unexpected attribute '%s'" % \
            (self.name, self.attribute)

class IO_Object_ContentHandler(sax.handler.ContentHandler):
    def __init__(self, item):
        self.item = item
        self._element = ""

    def startDocument(self):
        self._element = ""

    def startElement(self, name, attrs):
        self._element = ""

    def endElement(self, name):
        if name == "short":
            self.item.short = self._element
        elif name == "description":
            self.item.description = self._element

    def characters(self, content):
        self._element += content.replace('\n', ' ')

class IO_Object_XMLGenerator(saxutils.XMLGenerator):
    def __init__(self, out):
        # fix memory leak in saxutils.XMLGenerator.__init__:
        #   out = _gettextwriter(out, encoding)
        # creates unbound object results in garbage in gc
        #
        # saxutils.XMLGenerator.__init__(self, out, "utf-8")
        #   replaced by modified saxutils.XMLGenerator.__init__ code:
        sax.handler.ContentHandler.__init__(self)
        self._write = out.write
        self._flush = out.flush
        self._ns_contexts = [{}] # contains uri -> prefix dicts
        self._current_context = self._ns_contexts[-1]
        self._undeclared_ns_maps = []
        self._encoding = "utf-8"
        self._pending_start_element = False
        self._short_empty_elements = False

    def startElement(self, name, attrs):
        """ saxutils.XMLGenerator.startElement() expects name and attrs to be
            unicode and bad things happen if any of them is (utf-8) encoded.
            We override the method here to sanitize this case.
            Can be removed once we drop Python2 support.
        """
        if PY2:
            attrs = { b2u(name):b2u(value) for name, value in attrs.items() }
        saxutils.XMLGenerator.startElement(self, name, attrs)

    def simpleElement(self, name, attrs):
        """ slightly modified startElement()
        """
        if PY2:
            self._write(u'<' + b2u(name))
            for (name, value) in attrs.items():
                self._write(u' %s=%s' % (b2u(name),
                                         saxutils.quoteattr(b2u(value))))
            self._write(u'/>')
        else:
            self._write('<' + name)
            for (name, value) in attrs.items():
                self._write(' %s=%s' % (name, saxutils.quoteattr(value)))
            self._write('/>')

    def endElement(self, name):
        """ saxutils.XMLGenerator.endElement() expects name to be
            unicode and bad things happen if it's (utf-8) encoded.
            We override the method here to sanitize this case.
            Can be removed once we drop Python2 support.
        """
        saxutils.XMLGenerator.endElement(self, b2u(name))

    def characters(self, content):
        """ saxutils.XMLGenerator.characters() expects content to be
            unicode and bad things happen if it's (utf-8) encoded.
            We override the method here to sanitize this case.
            Can be removed once we drop Python2 support.
        """
        saxutils.XMLGenerator.characters(self, b2u(content))

    def ignorableWhitespace(self, content):
        """ saxutils.XMLGenerator.ignorableWhitespace() expects content to be
            unicode and bad things happen if it's (utf-8) encoded.
            We override the method here to sanitize this case.
            Can be removed once we drop Python2 support.
        """
        saxutils.XMLGenerator.ignorableWhitespace(self, b2u(content))
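
    # Illustrative: simpleElement("command", {"name": "/usr/bin/foo"}) writes
    #
    #   <command name="/usr/bin/foo"/>
    #
    # as one self-closing tag, whereas startElement()/endElement() would emit
    # separate opening and closing tags.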

def check_port(port):
    port_range = functions.getPortRange(port)
    if port_range == -2:
        raise FirewallError(errors.INVALID_PORT,
                            "port number in '%s' is too big" % port)
    elif port_range == -1:
        raise FirewallError(errors.INVALID_PORT,
                            "'%s' is invalid port range" % port)
    elif port_range is None:
        raise FirewallError(errors.INVALID_PORT,
                            "port range '%s' is ambiguous" % port)
    elif len(port_range) == 2 and port_range[0] >= port_range[1]:
        raise FirewallError(errors.INVALID_PORT,
                            "'%s' is invalid port range" % port)

def check_tcpudp(protocol):
    if protocol not in [ "tcp", "udp", "sctp", "dccp" ]:
        raise FirewallError(errors.INVALID_PROTOCOL,
                            "'%s' not from {'tcp'|'udp'|'sctp'|'dccp'}" % \
                            protocol)

def check_protocol(protocol):
    if not functions.checkProtocol(protocol):
        raise FirewallError(errors.INVALID_PROTOCOL, protocol)

def check_address(ipv, addr):
    if not functions.check_address(ipv, addr):
        raise FirewallError(errors.INVALID_ADDR,
                            "'%s' is not valid %s address" % (addr, ipv))

site-packages/firewall/core/io/ifcfg.py
# -*- coding: utf-8 -*-
#
# Copyright (C) 2011-2016 Red Hat, Inc.
#
# Authors:
# Thomas Woerner <twoerner@redhat.com>
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program.  If not, see <http://www.gnu.org/licenses/>.
#

"""ifcfg file parser"""

__all__ = [ "ifcfg" ]

import os.path
import io
import tempfile
import shutil

from firewall.core.logger import log
from firewall.functions import b2u, u2b, PY2

class ifcfg(object):
    def __init__(self, filename):
        self._config = { }
        self._deleted = [ ]
        self.filename = filename
        self.clear()

    def clear(self):
        self._config = { }
        self._deleted = [ ]

    def cleanup(self):
        self._config.clear()

    def get(self, key):
        return self._config.get(key.strip())

    def set(self, key, value):
        _key = b2u(key.strip())
        self._config[_key] = b2u(value.strip())
        if _key in self._deleted:
            self._deleted.remove(_key)

    def __str__(self):
        s = ""
        for (key, value) in self._config.items():
            if s:
                s += '\n'
            s += '%s=%s' % (key, value)
        return u2b(s) if PY2 else s

    # load self.filename
    def read(self):
        self.clear()
        try:
            f = open(self.filename, "r")
        except Exception as msg:
            log.error("Failed to load '%s': %s", self.filename, msg)
            raise

        for line in f:
            if not line:
                break
            line = line.strip()
            if len(line) < 1 or line[0] in ['#', ';']:
                continue
            # get key/value pair
            pair = [ x.strip() for x in line.split("=", 1) ]
            if len(pair) != 2:
                continue
            if len(pair[1]) >= 2 and \
               pair[1].startswith('"') and pair[1].endswith('"'):
                pair[1] = pair[1][1:-1]
            if pair[1] == '':
                continue
            elif self._config.get(pair[0]) is not None:
                log.warning("%s: Duplicate option definition: '%s'", self.filename, line.strip())
                continue
            self._config[pair[0]] = pair[1]
        f.close()
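
    # Illustrative: an ifcfg file containing (hypothetical values)
    #
    #   # created by anaconda
    #   DEVICE=eth0
    #   ZONE="public"
    #
    # parses into self._config == {"DEVICE": "eth0", "ZONE": "public"}; the
    # comment line is skipped and surrounding double quotes are stripped.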

    def write(self):
        if len(self._config) < 1:
            # no changes: nothing to do
            return

        # handled keys
        done = [ ]

        try:
            temp_file = tempfile.NamedTemporaryFile(
                mode='wt',
                prefix="%s." % os.path.basename(self.filename),
                dir=os.path.dirname(self.filename), delete=False)
        except Exception as msg:
            log.error("Failed to open temporary file: %s" % msg)
            raise

        modified = False
        empty = False
        try:
            f = io.open(self.filename, mode='rt', encoding='UTF-8')
        except Exception as msg:
            if os.path.exists(self.filename):
                log.error("Failed to open '%s': %s" % (self.filename, msg))
                raise
            else:
                f = None
        else:
            for line in f:
                if not line:
                    break
                # remove newline
                line = line.strip("\n")

                if len(line) < 1:
                    if not empty:
                        temp_file.write(u"\n")
                        empty = True
                elif line[0] == '#':
                    empty = False
                    temp_file.write(line)
                    temp_file.write(u"\n")
                else:
                    p = line.split("=", 1)
                    if len(p) != 2:
                        empty = False
                        temp_file.write(line+u"\n")
                        continue
                    key = p[0].strip()
                    value = p[1].strip()
                    if len(value) >= 2 and \
                       value.startswith('"') and value.endswith('"'):
                        value = value[1:-1]
                    # check for modified key/value pairs
                    if key not in done:
                        if key in self._config and self._config[key] != value:
                            empty = False
                            temp_file.write(u'%s=%s\n' % (key,
                                                          self._config[key]))
                            modified = True
                        elif key in self._deleted:
                            modified = True
                        else:
                            empty = False
                            temp_file.write(line+u"\n")
                        done.append(key)
                    else:
                        modified = True

        # write remaining key/value pairs
        if len(self._config) > 0:
            for (key, value) in self._config.items():
                if key in done:
                    continue
                if not empty:
                    temp_file.write(u"\n")
                    empty = True
                temp_file.write(u'%s=%s\n' % (key, value))
                modified = True

        if f:
            f.close()
        temp_file.close()

        if not modified: # not modified: remove tempfile
            os.remove(temp_file.name)
            return
        # make backup
        if os.path.exists(self.filename):
            try:
                shutil.copy2(self.filename, "%s.bak" % self.filename)
            except Exception as msg:
                os.remove(temp_file.name)
                raise IOError("Backup of '%s' failed: %s" % (self.filename, msg))

        # copy tempfile
        try:
            shutil.move(temp_file.name, self.filename)
        except Exception as msg:
            os.remove(temp_file.name)
            raise IOError("Failed to create '%s': %s" % (self.filename, msg))
        else:
            os.chmod(self.filename, 0o600)
site-packages/firewall/core/modules.py
# -*- coding: utf-8 -*-
#
# Copyright (C) 2010-2016 Red Hat, Inc.
#
# Authors:
# Thomas Woerner <twoerner@redhat.com>
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program.  If not, see <http://www.gnu.org/licenses/>.
#

"""modules backend"""

__all__ = [ "modules" ]

from firewall.core.prog import runProg
from firewall.core.logger import log
from firewall.config import COMMANDS

class modules(object):
    def __init__(self):
        self._load_command = COMMANDS["modprobe"]
        # Use rmmod instead of modprobe -r (RHBZ#1031102)
        self._unload_command = COMMANDS["rmmod"]

    def __repr__(self):
        return '%s' % (self.__class__)

    def loaded_modules(self):
        """ get all loaded kernel modules and their dependencies """
        mods = [ ]
        deps = { }
        try:
            with open("/proc/modules", "r") as f:
                for line in f:
                    if not line:
                        break
                    line = line.strip()
                    splits = line.split()
                    mods.append(splits[0])
                    if splits[3] != "-":
                        deps[splits[0]] = splits[3].split(",")[:-1]
                    else:
                        deps[splits[0]] = [ ]
        except FileNotFoundError:
            pass

        return mods, deps # [loaded modules], {module:[dependants]}
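
    # A /proc/modules line looks roughly like (illustrative):
    #
    #   nf_nat 49152 3 nf_nat_ipv4,nf_nat_ipv6,xt_nat, Live 0xffffffffc0000000
    #
    # splits[0] is the module name; splits[3] is either "-" (no dependants) or
    # a comma-terminated list, so split(",")[:-1] drops the empty trailing
    # element.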

    def load_module(self, module):
        log.debug2("%s: %s %s", self.__class__, self._load_command, module)
        return runProg(self._load_command, [ module ])

    def unload_module(self, module):
        log.debug2("%s: %s %s", self.__class__, self._unload_command, module)
        return runProg(self._unload_command, [ module ])

    def get_deps(self, module, deps, ret):
        """ get all dependants of a module """
        if module not in deps:
            return
        for mod in deps[module]:
            self.get_deps(mod, deps, ret)
            if mod not in ret:
                ret.append(mod)
        if module not in ret:
            ret.append(module)
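
    # Illustrative: with deps = {"a": ["b"], "b": ["c"], "c": []},
    # get_deps("a", deps, ret) fills ret with ["c", "b", "a"] - dependants
    # first, so they can be unloaded before the modules they depend on.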

    def get_firewall_modules(self):
        """ get all loaded firewall-related modules """
        mods = [ ]
        (mods2, deps) = self.loaded_modules()

        self.get_deps("nf_conntrack", deps, mods)
        # these modules don't have dependants listed in /proc/modules
        for bad_bad_module in ["nf_conntrack_ipv4", "nf_conntrack_ipv6"]:
            if bad_bad_module in mods:
                # move them to end of list, so we'll remove them later
                mods.remove(bad_bad_module)
                mods.append(bad_bad_module)

        for mod in mods2:
            if mod in [ "ip_tables", "ip6_tables", "ebtables" ] or \
               mod.startswith("iptable_") or mod.startswith("ip6table_") or \
               mod.startswith("nf_") or mod.startswith("xt_") or \
               mod.startswith("ipt_") or mod.startswith("ip6t_") :
                self.get_deps(mod, deps, mods)
        return mods

    def unload_firewall_modules(self):
        """ unload all firewall-related modules """
        for module in self.get_firewall_modules():
            (status, ret) = self.unload_module(module)
            if status != 0:
                log.debug1("Failed to unload module '%s': %s" %(module, ret))
site-packages/firewall/core/fw_config.py
# -*- coding: utf-8 -*-
#
# Copyright (C) 2011-2016 Red Hat, Inc.
#
# Authors:
# Thomas Woerner <twoerner@redhat.com>
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program.  If not, see <http://www.gnu.org/licenses/>.
#

__all__ = [ "FirewallConfig" ]

import copy
import os
import os.path
import shutil
from firewall import config
from firewall.core.logger import log
from firewall.core.io.icmptype import IcmpType, icmptype_reader, icmptype_writer
from firewall.core.io.service import Service, service_reader, service_writer
from firewall.core.io.zone import Zone, zone_reader, zone_writer
from firewall.core.io.ipset import IPSet, ipset_reader, ipset_writer
from firewall.core.io.helper import Helper, helper_reader, helper_writer
from firewall.core.io.policy import Policy, policy_reader, policy_writer
from firewall import errors
from firewall.errors import FirewallError

class FirewallConfig(object):
    def __init__(self, fw):
        self._fw = fw
        self.__init_vars()

    def __repr__(self):
        return '%s(%r, %r, %r, %r, %r, %r, %r, %r, %r, %r, %r, %r, %r, %r, %r)' % \
            (self.__class__,
             self._ipsets, self._icmptypes, self._services, self._zones,
             self._helpers, self._policy_objects,
             self._builtin_ipsets, self._builtin_icmptypes,
             self._builtin_services, self._builtin_zones, self._builtin_helpers,
             self._builtin_policy_objects,
             self._firewalld_conf, self._policies, self._direct)

    def __init_vars(self):
        self._ipsets = { }
        self._icmptypes = { }
        self._services = { }
        self._zones = { }
        self._helpers = { }
        self._policy_objects = { }
        self._builtin_ipsets = { }
        self._builtin_icmptypes = { }
        self._builtin_services = { }
        self._builtin_zones = { }
        self._builtin_helpers = { }
        self._builtin_policy_objects = { }
        self._firewalld_conf = None
        self._policies = None
        self._direct = None

    def cleanup(self):
        for x in list(self._builtin_ipsets.keys()):
            self._builtin_ipsets[x].cleanup()
            del self._builtin_ipsets[x]
        for x in list(self._ipsets.keys()):
            self._ipsets[x].cleanup()
            del self._ipsets[x]

        for x in list(self._builtin_icmptypes.keys()):
            self._builtin_icmptypes[x].cleanup()
            del self._builtin_icmptypes[x]
        for x in list(self._icmptypes.keys()):
            self._icmptypes[x].cleanup()
            del self._icmptypes[x]

        for x in list(self._builtin_services.keys()):
            self._builtin_services[x].cleanup()
            del self._builtin_services[x]
        for x in list(self._services.keys()):
            self._services[x].cleanup()
            del self._services[x]

        for x in list(self._builtin_zones.keys()):
            self._builtin_zones[x].cleanup()
            del self._builtin_zones[x]
        for x in list(self._zones.keys()):
            self._zones[x].cleanup()
            del self._zones[x]

        for x in list(self._builtin_helpers.keys()):
            self._builtin_helpers[x].cleanup()
            del self._builtin_helpers[x]
        for x in list(self._helpers.keys()):
            self._helpers[x].cleanup()
            del self._helpers[x]

        if self._firewalld_conf:
            self._firewalld_conf.cleanup()
            del self._firewalld_conf
            self._firewalld_conf = None

        if self._policies:
            self._policies.cleanup()
            del self._policies
            self._policies = None

        if self._direct:
            self._direct.cleanup()
            del self._direct
            self._direct = None

        self.__init_vars()

    # access check

    def lockdown_enabled(self):
        return self._fw.policies.query_lockdown()

    def access_check(self, key, value):
        return self._fw.policies.access_check(key, value)

    # firewalld_conf

    def set_firewalld_conf(self, conf):
        self._firewalld_conf = conf

    def get_firewalld_conf(self):
        return self._firewalld_conf

    def update_firewalld_conf(self):
        if not os.path.exists(config.FIREWALLD_CONF):
            self._firewalld_conf.clear()
        else:
            self._firewalld_conf.read()

    # policies

    def set_policies(self, policies):
        self._policies = policies

    def get_policies(self):
        return self._policies

    def update_lockdown_whitelist(self):
        if not os.path.exists(config.LOCKDOWN_WHITELIST):
            self._policies.lockdown_whitelist.cleanup()
        else:
            self._policies.lockdown_whitelist.read()

    # direct

    def set_direct(self, direct):
        self._direct = direct

    def get_direct(self):
        return self._direct

    def update_direct(self):
        if not os.path.exists(config.FIREWALLD_DIRECT):
            self._direct.cleanup()
        else:
            self._direct.read()

    # ipset

    def get_ipsets(self):
        return sorted(set(list(self._ipsets.keys()) + \
                          list(self._builtin_ipsets.keys())))

    def add_ipset(self, obj):
        if obj.builtin:
            self._builtin_ipsets[obj.name] = obj
        else:
            self._ipsets[obj.name] = obj

    def get_ipset(self, name):
        if name in self._ipsets:
            return self._ipsets[name]
        elif name in self._builtin_ipsets:
            return self._builtin_ipsets[name]
        raise FirewallError(errors.INVALID_IPSET, name)

    def load_ipset_defaults(self, obj):
        if obj.name not in self._ipsets:
            raise FirewallError(errors.NO_DEFAULTS, obj.name)
        elif self._ipsets[obj.name] != obj:
            raise FirewallError(errors.NO_DEFAULTS,
                                "self._ipsets[%s] != obj" % obj.name)
        elif obj.name not in self._builtin_ipsets:
            raise FirewallError(errors.NO_DEFAULTS,
                            "'%s' not a built-in ipset" % obj.name)
        self._remove_ipset(obj)
        return self._builtin_ipsets[obj.name]

    def get_ipset_config(self, obj):
        return obj.export_config()

    def set_ipset_config(self, obj, conf):
        if obj.builtin:
            x = copy.copy(obj)
            x.import_config(conf)
            x.path = config.ETC_FIREWALLD_IPSETS
            x.builtin = False
            if obj.path != x.path:
                x.default = False
            self.add_ipset(x)
            ipset_writer(x)
            return x
        else:
            obj.import_config(conf)
            ipset_writer(obj)
            return obj

    def new_ipset(self, name, conf):
        if name in self._ipsets or name in self._builtin_ipsets:
            raise FirewallError(errors.NAME_CONFLICT,
                                "new_ipset(): '%s'" % name)

        x = IPSet()
        x.check_name(name)
        x.import_config(conf)
        x.name = name
        x.filename = "%s.xml" % name
        x.path = config.ETC_FIREWALLD_IPSETS
        # It is not possible to add a new one with the name of a builtin
        x.builtin = False
        x.default = True

        ipset_writer(x)
        self.add_ipset(x)
        return x

    def update_ipset_from_path(self, name):
        filename = os.path.basename(name)
        path = os.path.dirname(name)

        if not os.path.exists(name):
            # removed file

            if path == config.ETC_FIREWALLD_IPSETS:
                # removed custom ipset
                for x in self._ipsets.keys():
                    obj = self._ipsets[x]
                    if obj.filename == filename:
                        del self._ipsets[x]
                        if obj.name in self._builtin_ipsets:
                            return ("update", self._builtin_ipsets[obj.name])
                        return ("remove", obj)
            else:
                # removed builtin ipset
                for x in self._builtin_ipsets.keys():
                    obj = self._builtin_ipsets[x]
                    if obj.filename == filename:
                        del self._builtin_ipsets[x]
                        if obj.name not in self._ipsets:
                            # update dbus ipset
                            return ("remove", obj)
                        else:
                            # builtin hidden, no update needed
                            return (None, None)

            # ipset not known to firewalld, yet (timeout, ..)
            return (None, None)

        # new or updated file

        log.debug1("Loading ipset file '%s'", name)
        try:
            obj = ipset_reader(filename, path)
        except Exception as msg:
            log.error("Failed to load ipset file '%s': %s", filename, msg)
            return (None, None)

        # new ipset
        if obj.name not in self._builtin_ipsets and obj.name not in self._ipsets:
            self.add_ipset(obj)
            return ("new", obj)

        # updated ipset
        if path == config.ETC_FIREWALLD_IPSETS:
            # custom ipset update
            if obj.name in self._ipsets:
                obj.default = self._ipsets[obj.name].default
                self._ipsets[obj.name] = obj
            return ("update", obj)
        else:
            if obj.name in self._builtin_ipsets:
                # builtin ipset update
                del self._builtin_ipsets[obj.name]
                self._builtin_ipsets[obj.name] = obj

                if obj.name not in self._ipsets:
                    # update dbus ipset
                    return ("update", obj)
                else:
                    # builtin hidden, no update needed
                    return (None, None)

        # ipset not known to firewalld, yet (timeout, ..)
        return (None, None)
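
    # update_ipset_from_path() (and the analogous icmptype/service/zone
    # methods below) returns a pair describing what changed for the caller:
    #
    #   ("new", obj)     - a previously unknown ipset file appeared
    #   ("update", obj)  - an existing ipset changed (or a builtin became
    #                      visible again after its override was removed)
    #   ("remove", obj)  - the file for a known ipset was deleted
    #   (None, None)     - nothing the caller needs to act on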

    def _remove_ipset(self, obj):
        if obj.name not in self._ipsets:
            raise FirewallError(errors.INVALID_IPSET, obj.name)
        if obj.path != config.ETC_FIREWALLD_IPSETS:
            raise FirewallError(errors.INVALID_DIRECTORY,
                                "'%s' != '%s'" % (obj.path,
                                                  config.ETC_FIREWALLD_IPSETS))

        name = "%s/%s.xml" % (obj.path, obj.name)
        try:
            shutil.move(name, "%s.old" % name)
        except Exception as msg:
            log.error("Backup of file '%s' failed: %s", name, msg)
            os.remove(name)

        del self._ipsets[obj.name]

    def check_builtin_ipset(self, obj):
        if obj.builtin or not obj.default:
            raise FirewallError(errors.BUILTIN_IPSET,
                                "'%s' is built-in ipset" % obj.name)

    def remove_ipset(self, obj):
        self.check_builtin_ipset(obj)
        self._remove_ipset(obj)

    def rename_ipset(self, obj, name):
        self.check_builtin_ipset(obj)
        new_ipset = self._copy_ipset(obj, name)
        self._remove_ipset(obj)
        return new_ipset

    def _copy_ipset(self, obj, name):
        return self.new_ipset(name, obj.export_config())

    # icmptypes

    def get_icmptypes(self):
        return sorted(set(list(self._icmptypes.keys()) + \
                          list(self._builtin_icmptypes.keys())))

    def add_icmptype(self, obj):
        if obj.builtin:
            self._builtin_icmptypes[obj.name] = obj
        else:
            self._icmptypes[obj.name] = obj

    def get_icmptype(self, name):
        if name in self._icmptypes:
            return self._icmptypes[name]
        elif name in self._builtin_icmptypes:
            return self._builtin_icmptypes[name]
        raise FirewallError(errors.INVALID_ICMPTYPE, name)

    def load_icmptype_defaults(self, obj):
        if obj.name not in self._icmptypes:
            raise FirewallError(errors.NO_DEFAULTS, obj.name)
        elif self._icmptypes[obj.name] != obj:
            raise FirewallError(errors.NO_DEFAULTS,
                                "self._icmptypes[%s] != obj" % obj.name)
        elif obj.name not in self._builtin_icmptypes:
            raise FirewallError(errors.NO_DEFAULTS,
                                "'%s' not a built-in icmptype" % obj.name)
        self._remove_icmptype(obj)
        return self._builtin_icmptypes[obj.name]

    def get_icmptype_config(self, obj):
        return obj.export_config()

    def set_icmptype_config(self, obj, conf):
        if obj.builtin:
            x = copy.copy(obj)
            x.import_config(conf)
            x.path = config.ETC_FIREWALLD_ICMPTYPES
            x.builtin = False
            if obj.path != x.path:
                x.default = False
            self.add_icmptype(x)
            icmptype_writer(x)
            return x
        else:
            obj.import_config(conf)
            icmptype_writer(obj)
            return obj

    def new_icmptype(self, name, conf):
        if name in self._icmptypes or name in self._builtin_icmptypes:
            raise FirewallError(errors.NAME_CONFLICT,
                                "new_icmptype(): '%s'" % name)

        x = IcmpType()
        x.check_name(name)
        x.import_config(conf)
        x.name = name
        x.filename = "%s.xml" % name
        x.path = config.ETC_FIREWALLD_ICMPTYPES
        # It is not possible to add a new one with the name of a builtin
        x.builtin = False
        x.default = True

        icmptype_writer(x)
        self.add_icmptype(x)
        return x

    def update_icmptype_from_path(self, name):
        filename = os.path.basename(name)
        path = os.path.dirname(name)

        if not os.path.exists(name):
            # removed file

            if path == config.ETC_FIREWALLD_ICMPTYPES:
                # removed custom icmptype
                for x in self._icmptypes.keys():
                    obj = self._icmptypes[x]
                    if obj.filename == filename:
                        del self._icmptypes[x]
                        if obj.name in self._builtin_icmptypes:
                            return ("update", self._builtin_icmptypes[obj.name])
                        return ("remove", obj)
            else:
                # removed builtin icmptype
                for x in self._builtin_icmptypes.keys():
                    obj = self._builtin_icmptypes[x]
                    if obj.filename == filename:
                        del self._builtin_icmptypes[x]
                        if obj.name not in self._icmptypes:
                            # update dbus icmptype
                            return ("remove", obj)
                        else:
                            # builtin hidden, no update needed
                            return (None, None)

            # icmptype not known to firewalld, yet (timeout, ..)
            return (None, None)

        # new or updated file

        log.debug1("Loading icmptype file '%s'", name)
        try:
            obj = icmptype_reader(filename, path)
        except Exception as msg:
            log.error("Failed to load icmptype file '%s': %s", filename, msg)
            return (None, None)

        # new icmptype
        if obj.name not in self._builtin_icmptypes and obj.name not in self._icmptypes:
            self.add_icmptype(obj)
            return ("new", obj)

        # updated icmptype
        if path == config.ETC_FIREWALLD_ICMPTYPES:
            # custom icmptype update
            if obj.name in self._icmptypes:
                obj.default = self._icmptypes[obj.name].default
                self._icmptypes[obj.name] = obj
            return ("update", obj)
        else:
            if obj.name in self._builtin_icmptypes:
                # builtin icmptype update
                del self._builtin_icmptypes[obj.name]
                self._builtin_icmptypes[obj.name] = obj

                if obj.name not in self._icmptypes:
                    # update dbus icmptype
                    return ("update", obj)
                else:
                    # builtin hidden, no update needed
                    return (None, None)
            
        # icmptype not known to firewalld, yet (timeout, ..)
        return (None, None)

    def _remove_icmptype(self, obj):
        if obj.name not in self._icmptypes:
            raise FirewallError(errors.INVALID_ICMPTYPE, obj.name)
        if obj.path != config.ETC_FIREWALLD_ICMPTYPES:
            raise FirewallError(errors.INVALID_DIRECTORY,
                                "'%s' != '%s'" % \
                                (obj.path, config.ETC_FIREWALLD_ICMPTYPES))

        name = "%s/%s.xml" % (obj.path, obj.name)
        try:
            shutil.move(name, "%s.old" % name)
        except Exception as msg:
            log.error("Backup of file '%s' failed: %s", name, msg)
            os.remove(name)

        del self._icmptypes[obj.name]

    def check_builtin_icmptype(self, obj):
        if obj.builtin or not obj.default:
            raise FirewallError(errors.BUILTIN_ICMPTYPE,
                                "'%s' is built-in icmp type" % obj.name)

    def remove_icmptype(self, obj):
        self.check_builtin_icmptype(obj)
        self._remove_icmptype(obj)

    def rename_icmptype(self, obj, name):
        self.check_builtin_icmptype(obj)
        new_icmptype = self._copy_icmptype(obj, name)
        self._remove_icmptype(obj)
        return new_icmptype

    def _copy_icmptype(self, obj, name):
        return self.new_icmptype(name, obj.export_config())

    # services

    def get_services(self):
        return sorted(set(list(self._services.keys()) + \
                          list(self._builtin_services.keys())))

    def add_service(self, obj):
        if obj.builtin:
            self._builtin_services[obj.name] = obj
        else:
            self._services[obj.name] = obj

    def get_service(self, name):
        if name in self._services:
            return self._services[name]
        elif name in self._builtin_services:
            return self._builtin_services[name]
        raise FirewallError(errors.INVALID_SERVICE, "get_service(): '%s'" % name)

    def load_service_defaults(self, obj):
        if obj.name not in self._services:
            raise FirewallError(errors.NO_DEFAULTS, obj.name)
        elif self._services[obj.name] != obj:
            raise FirewallError(errors.NO_DEFAULTS,
                                "self._services[%s] != obj" % obj.name)
        elif obj.name not in self._builtin_services:
            raise FirewallError(errors.NO_DEFAULTS,
                                "'%s' not a built-in service" % obj.name)
        self._remove_service(obj)
        return self._builtin_services[obj.name]

    def get_service_config(self, obj):
        conf_dict = obj.export_config_dict()
        conf_list = []
        for i in range(8): # tuple based dbus API has 8 elements
            if obj.IMPORT_EXPORT_STRUCTURE[i][0] not in conf_dict:
                # old API needs the empty elements as well. Grab it from the
                # object otherwise we don't know the type.
                conf_list.append(copy.deepcopy(getattr(obj, obj.IMPORT_EXPORT_STRUCTURE[i][0])))
            else:
                conf_list.append(conf_dict[obj.IMPORT_EXPORT_STRUCTURE[i][0]])
        return tuple(conf_list)
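
    # get_service_config() serves the legacy tuple-based D-Bus API: it
    # flattens export_config_dict() into the 8-element tuple in
    # IMPORT_EXPORT_STRUCTURE order, filling fields missing from the dict with
    # the object's (empty) defaults so callers always see all eight positions.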

    def get_service_config_dict(self, obj):
        return obj.export_config_dict()

    def set_service_config(self, obj, conf):
        conf_dict = {}
        for i,value in enumerate(conf):
            conf_dict[obj.IMPORT_EXPORT_STRUCTURE[i][0]] = value

        if obj.builtin:
            x = copy.copy(obj)
            x.import_config_dict(conf_dict)
            x.path = config.ETC_FIREWALLD_SERVICES
            x.builtin = False
            if obj.path != x.path:
                x.default = False
            self.add_service(x)
            service_writer(x)
            return x
        else:
            obj.import_config_dict(conf_dict)
            service_writer(obj)
            return obj

    def set_service_config_dict(self, obj, conf):
        if obj.builtin:
            x = copy.copy(obj)
            x.import_config_dict(conf)
            x.path = config.ETC_FIREWALLD_SERVICES
            x.builtin = False
            if obj.path != x.path:
                x.default = False
            self.add_service(x)
            service_writer(x)
            return x
        else:
            obj.import_config_dict(conf)
            service_writer(obj)
            return obj

    def new_service(self, name, conf):
        if name in self._services or name in self._builtin_services:
            raise FirewallError(errors.NAME_CONFLICT,
                                "new_service(): '%s'" % name)

        conf_dict = {}
        for i,value in enumerate(conf):
            conf_dict[Service.IMPORT_EXPORT_STRUCTURE[i][0]] = value

        x = Service()
        x.check_name(name)
        x.import_config_dict(conf_dict)
        x.name = name
        x.filename = "%s.xml" % name
        x.path = config.ETC_FIREWALLD_SERVICES
        # It is not possible to add a new one with the name of a builtin
        x.builtin = False
        x.default = True

        service_writer(x)
        self.add_service(x)
        return x

    def new_service_dict(self, name, conf):
        if name in self._services or name in self._builtin_services:
            raise FirewallError(errors.NAME_CONFLICT,
                                "new_service(): '%s'" % name)

        x = Service()
        x.check_name(name)
        x.import_config_dict(conf)
        x.name = name
        x.filename = "%s.xml" % name
        x.path = config.ETC_FIREWALLD_SERVICES
        # It is not possible to add a new one with the name of a builtin
        x.builtin = False
        x.default = True

        service_writer(x)
        self.add_service(x)
        return x

    def update_service_from_path(self, name):
        filename = os.path.basename(name)
        path = os.path.dirname(name)

        if not os.path.exists(name):
            # removed file

            if path == config.ETC_FIREWALLD_SERVICES:
                # removed custom service
                for x in self._services.keys():
                    obj = self._services[x]
                    if obj.filename == filename:
                        del self._services[x]
                        if obj.name in self._builtin_services:
                            return ("update", self._builtin_services[obj.name])
                        return ("remove", obj)
            else:
                # removed builtin service
                for x in self._builtin_services.keys():
                    obj = self._builtin_services[x]
                    if obj.filename == filename:
                        del self._builtin_services[x]
                        if obj.name not in self._services:
                            # update dbus service
                            return ("remove", obj)
                        else:
                            # builtin hidden, no update needed
                            return (None, None)

            # service not known to firewalld, yet (timeout, ..)
            return (None, None)

        # new or updated file

        log.debug1("Loading service file '%s'", name)
        try:
            obj = service_reader(filename, path)
        except Exception as msg:
            log.error("Failed to load service file '%s': %s", filename, msg)
            return (None, None)

        # new service
        if obj.name not in self._builtin_services and obj.name not in self._services:
            self.add_service(obj)
            return ("new", obj)

        # updated service
        if path == config.ETC_FIREWALLD_SERVICES:
            # custom service update
            if obj.name in self._services:
                obj.default = self._services[obj.name].default
                self._services[obj.name] = obj
            return ("update", obj)
        else:
            if obj.name in self._builtin_services:
                # builtin service update
                del self._builtin_services[obj.name]
                self._builtin_services[obj.name] = obj

                if obj.name not in self._services:
                    # update dbus service
                    return ("update", obj)
                else:
                    # builtin hidden, no update needed
                    return (None, None)
            
        # service not known to firewalld, yet (timeout, ..)
        return (None, None)

    def _remove_service(self, obj):
        if obj.name not in self._services:
            raise FirewallError(errors.INVALID_SERVICE, obj.name)
        if obj.path != config.ETC_FIREWALLD_SERVICES:
            raise FirewallError(errors.INVALID_DIRECTORY,
                                "'%s' != '%s'" % \
                                (obj.path, config.ETC_FIREWALLD_SERVICES))

        name = "%s/%s.xml" % (obj.path, obj.name)
        try:
            shutil.move(name, "%s.old" % name)
        except Exception as msg:
            log.error("Backup of file '%s' failed: %s", name, msg)
            os.remove(name)

        del self._services[obj.name]

    def check_builtin_service(self, obj):
        if obj.builtin or not obj.default:
            raise FirewallError(errors.BUILTIN_SERVICE,
                                "'%s' is built-in service" % obj.name)

    def remove_service(self, obj):
        self.check_builtin_service(obj)
        self._remove_service(obj)

    def rename_service(self, obj, name):
        self.check_builtin_service(obj)
        new_service = self._copy_service(obj, name)
        self._remove_service(obj)
        return new_service

    def _copy_service(self, obj, name):
        return self.new_service_dict(name, obj.export_config_dict())

    # zones

    def get_zones(self):
        return sorted(set(list(self._zones.keys()) + \
                          list(self._builtin_zones.keys())))

    def add_zone(self, obj):
        if obj.builtin:
            self._builtin_zones[obj.name] = obj
        else:
            self._zones[obj.name] = obj

    def forget_zone(self, name):
        if name in self._builtin_zones:
            del self._builtin_zones[name]
        if name in self._zones:
            del self._zones[name]

    def get_zone(self, name):
        if name in self._zones:
            return self._zones[name]
        elif name in self._builtin_zones:
            return self._builtin_zones[name]
        raise FirewallError(errors.INVALID_ZONE, "get_zone(): %s" % name)

    def load_zone_defaults(self, obj):
        if obj.name not in self._zones:
            raise FirewallError(errors.NO_DEFAULTS, obj.name)
        elif self._zones[obj.name] != obj:
            raise FirewallError(errors.NO_DEFAULTS,
                                "self._zones[%s] != obj" % obj.name)
        elif obj.name not in self._builtin_zones:
            raise FirewallError(errors.NO_DEFAULTS,
                                "'%s' not a built-in zone" % obj.name)
        self._remove_zone(obj)
        return self._builtin_zones[obj.name]

    def get_zone_config(self, obj):
        conf_dict = obj.export_config_dict()
        conf_list = []
        for i in range(16): # tuple based dbus API has 16 elements
            if obj.IMPORT_EXPORT_STRUCTURE[i][0] not in conf_dict:
                # old API needs the empty elements as well. Grab it from the
                # object otherwise we don't know the type.
                conf_list.append(copy.deepcopy(getattr(obj, obj.IMPORT_EXPORT_STRUCTURE[i][0])))
            else:
                conf_list.append(conf_dict[obj.IMPORT_EXPORT_STRUCTURE[i][0]])
        return tuple(conf_list)

    def get_zone_config_dict(self, obj):
        return obj.export_config_dict()

    def set_zone_config(self, obj, conf):
        conf_dict = {}
        for i,value in enumerate(conf):
            conf_dict[obj.IMPORT_EXPORT_STRUCTURE[i][0]] = value

        if obj.builtin:
            x = copy.copy(obj)
            x.fw_config = self
            x.import_config_dict(conf_dict)
            x.path = config.ETC_FIREWALLD_ZONES
            x.builtin = False
            if obj.path != x.path:
                x.default = False
            self.add_zone(x)
            zone_writer(x)
            return x
        else:
            obj.fw_config = self
            obj.import_config_dict(conf_dict)
            zone_writer(obj)
            return obj

    def set_zone_config_dict(self, obj, conf):
        if obj.builtin:
            x = copy.copy(obj)
            x.fw_config = self
            x.import_config_dict(conf)
            x.path = config.ETC_FIREWALLD_ZONES
            x.builtin = False
            if obj.path != x.path:
                x.default = False
            self.add_zone(x)
            zone_writer(x)
            return x
        else:
            obj.fw_config = self
            obj.import_config_dict(conf)
            zone_writer(obj)
            return obj

    def new_zone(self, name, conf):
        if name in self._zones or name in self._builtin_zones:
            raise FirewallError(errors.NAME_CONFLICT, "new_zone(): '%s'" % name)

        conf_dict = {}
        for i,value in enumerate(conf):
            conf_dict[Zone.IMPORT_EXPORT_STRUCTURE[i][0]] = value

        x = Zone()
        x.fw_config = self
        x.check_name(name)
        x.import_config_dict(conf_dict)
        x.name = name
        x.filename = "%s.xml" % name
        x.path = config.ETC_FIREWALLD_ZONES
        # It is not possible to add a new one with the name of a builtin
        x.builtin = False
        x.default = True

        zone_writer(x)
        self.add_zone(x)
        return x

    def new_zone_dict(self, name, conf):
        if name in self._zones or name in self._builtin_zones:
            raise FirewallError(errors.NAME_CONFLICT, "new_zone(): '%s'" % name)

        x = Zone()
        x.fw_config = self
        x.check_name(name)
        x.import_config_dict(conf)
        x.name = name
        x.filename = "%s.xml" % name
        x.path = config.ETC_FIREWALLD_ZONES
        # It is not possible to add a new one with the name of a builtin
        x.builtin = False
        x.default = True

        zone_writer(x)
        self.add_zone(x)
        return x

    def update_zone_from_path(self, name):
        filename = os.path.basename(name)
        path = os.path.dirname(name)

        if not os.path.exists(name):
            # removed file

            if path.startswith(config.ETC_FIREWALLD_ZONES):
                # removed custom zone
                for x in self._zones.keys():
                    obj = self._zones[x]
                    if obj.filename == filename:
                        del self._zones[x]
                        if obj.name in self._builtin_zones:
                            return ("update", self._builtin_zones[obj.name])
                        return ("remove", obj)
            else:
                # removed builtin zone
                for x in self._builtin_zones.keys():
                    obj = self._builtin_zones[x]
                    if obj.filename == filename:
                        del self._builtin_zones[x]
                        if obj.name not in self._zones:
                            # update dbus zone
                            return ("remove", obj)
                        else:
                            # builtin hidden, no update needed
                            return (None, None)

            # zone not known to firewalld, yet (timeout, ..)
            return (None, None)

        # new or updated file

        log.debug1("Loading zone file '%s'", name)
        try:
            obj = zone_reader(filename, path)
        except Exception as msg:
            log.error("Failed to load zone file '%s': %s", filename, msg)
            return (None, None)

        obj.fw_config = self

        if path.startswith(config.ETC_FIREWALLD_ZONES) and \
           len(path) > len(config.ETC_FIREWALLD_ZONES):
            # custom combined zone part
            obj.name = "%s/%s" % (os.path.basename(path),
                                  os.path.basename(filename)[0:-4])

        # new zone
        if obj.name not in self._builtin_zones and obj.name not in self._zones:
            self.add_zone(obj)
            return ("new", obj)

        # updated zone
        if path.startswith(config.ETC_FIREWALLD_ZONES):
            # custom zone update
            if obj.name in self._zones:
                obj.default = self._zones[obj.name].default
                self._zones[obj.name] = obj
            return ("update", obj)
        else:
            if obj.name in self._builtin_zones:
                # builtin zone update
                del self._builtin_zones[obj.name]
                self._builtin_zones[obj.name] = obj

                if obj.name not in self._zones:
                    # update dbus zone
                    return ("update", obj)
                else:
                    # builtin hidden, no update needed
                    return (None, None)
            
        # zone not known to firewalld, yet (timeout, ..)
        return (None, None)

    def _remove_zone(self, obj):
        if obj.name not in self._zones:
            raise FirewallError(errors.INVALID_ZONE, obj.name)
        if not obj.path.startswith(config.ETC_FIREWALLD_ZONES):
            raise FirewallError(errors.INVALID_DIRECTORY,
                                "'%s' doesn't start with '%s'" % \
                                (obj.path, config.ETC_FIREWALLD_ZONES))

        name = "%s/%s.xml" % (obj.path, obj.name)
        try:
            shutil.move(name, "%s.old" % name)
        except Exception as msg:
            log.error("Backup of file '%s' failed: %s", name, msg)
            os.remove(name)

        del self._zones[obj.name]

    def check_builtin_zone(self, obj):
        if obj.builtin or not obj.default:
            raise FirewallError(errors.BUILTIN_ZONE,
                                "'%s' is built-in zone" % obj.name)

    def remove_zone(self, obj):
        self.check_builtin_zone(obj)
        self._remove_zone(obj)

    def rename_zone(self, obj, name):
        self.check_builtin_zone(obj)
        obj_conf = obj.export_config_dict()
        self._remove_zone(obj)
        try:
            new_zone = self.new_zone_dict(name, obj_conf)
        except:
            # re-add original if rename failed
            self.new_zone_dict(obj.name, obj_conf)
            raise
        return new_zone

    # policy objects

    def get_policy_objects(self):
        return sorted(set(list(self._policy_objects.keys()) + \
                          list(self._builtin_policy_objects.keys())))

    def add_policy_object(self, obj):
        if obj.builtin:
            self._builtin_policy_objects[obj.name] = obj
        else:
            self._policy_objects[obj.name] = obj

    def get_policy_object(self, name):
        if name in self._policy_objects:
            return self._policy_objects[name]
        elif name in self._builtin_policy_objects:
            return self._builtin_policy_objects[name]
        raise FirewallError(errors.INVALID_POLICY, "get_policy_object(): %s" % name)

    def load_policy_object_defaults(self, obj):
        if obj.name not in self._policy_objects:
            raise FirewallError(errors.NO_DEFAULTS, obj.name)
        elif self._policy_objects[obj.name] != obj:
            raise FirewallError(errors.NO_DEFAULTS,
                                "self._policy_objects[%s] != obj" % obj.name)
        elif obj.name not in self._builtin_policy_objects:
            raise FirewallError(errors.NO_DEFAULTS,
                                "'%s' not a built-in policy" % obj.name)
        self._remove_policy_object(obj)
        return self._builtin_policy_objects[obj.name]

    def get_policy_object_config_dict(self, obj):
        return obj.export_config_dict()

    def set_policy_object_config_dict(self, obj, conf):
        if obj.builtin:
            x = copy.copy(obj)
            x.fw_config = self
            x.import_config_dict(conf)
            x.path = config.ETC_FIREWALLD_POLICIES
            x.builtin = False
            if obj.path != x.path:
                x.default = False
            self.add_policy_object(x)
            policy_writer(x)
            return x
        else:
            obj.fw_config = self
            obj.import_config_dict(conf)
            policy_writer(obj)
            return obj

    def new_policy_object_dict(self, name, conf):
        if name in self._policy_objects or name in self._builtin_policy_objects:
            raise FirewallError(errors.NAME_CONFLICT, "new_policy_object(): '%s'" % name)

        x = Policy()
        x.fw_config = self
        x.check_name(name)
        x.import_config_dict(conf)
        x.name = name
        x.filename = "%s.xml" % name
        x.path = config.ETC_FIREWALLD_POLICIES
        # It is not possible to add a new one with the name of a builtin
        x.builtin = False
        x.default = True

        policy_writer(x)
        self.add_policy_object(x)
        return x

    def update_policy_object_from_path(self, name):
        filename = os.path.basename(name)
        path = os.path.dirname(name)

        if not os.path.exists(name):
            # removed file

            if path.startswith(config.ETC_FIREWALLD_POLICIES):
                # removed custom policy_object
                for x in self._policy_objects.keys():
                    obj = self._policy_objects[x]
                    if obj.filename == filename:
                        del self._policy_objects[x]
                        if obj.name in self._builtin_policy_objects:
                            return ("update", self._builtin_policy_objects[obj.name])
                        return ("remove", obj)
            else:
                # removed builtin policy_object
                for x in self._builtin_policy_objects.keys():
                    obj = self._builtin_policy_objects[x]
                    if obj.filename == filename:
                        del self._builtin_policy_objects[x]
                        if obj.name not in self._policy_objects:
                            # update dbus policy_object
                            return ("remove", obj)
                        else:
                            # builtin hidden, no update needed
                            return (None, None)

            # policy_object not known to firewalld, yet (timeout, ..)
            return (None, None)

        # new or updated file

        log.debug1("Loading policy file '%s'", name)
        try:
            obj = policy_reader(filename, path)
        except Exception as msg:
            log.error("Failed to load policy file '%s': %s", filename, msg)
            return (None, None)

        obj.fw_config = self

        if path.startswith(config.ETC_FIREWALLD_POLICIES) and \
           len(path) > len(config.ETC_FIREWALLD_POLICIES):
            # custom combined policy_object part
            obj.name = "%s/%s" % (os.path.basename(path),
                                  os.path.basename(filename)[0:-4])

        # new policy_object
        if obj.name not in self._builtin_policy_objects and obj.name not in self._policy_objects:
            self.add_policy_object(obj)
            return ("new", obj)

        # updated policy_object
        if path.startswith(config.ETC_FIREWALLD_POLICIES):
            # custom policy_object update
            if obj.name in self._policy_objects:
                obj.default = self._policy_objects[obj.name].default
                self._policy_objects[obj.name] = obj
            return ("update", obj)
        else:
            if obj.name in self._builtin_policy_objects:
                # builtin policy_object update
                del self._builtin_policy_objects[obj.name]
                self._builtin_policy_objects[obj.name] = obj

                if obj.name not in self._policy_objects:
                    # update dbus policy_object
                    return ("update", obj)
                else:
                    # builtin hidden, no update needed
                    return (None, None)

        # policy_object not known to firewalld, yet (timeout, ..)
        return (None, None)

    def _remove_policy_object(self, obj):
        if obj.name not in self._policy_objects:
            raise FirewallError(errors.INVALID_POLICY, obj.name)
        if not obj.path.startswith(config.ETC_FIREWALLD_POLICIES):
            raise FirewallError(errors.INVALID_DIRECTORY,
                                "'%s' doesn't start with '%s'" % \
                                (obj.path, config.ETC_FIREWALLD_POLICIES))

        name = "%s/%s.xml" % (obj.path, obj.name)
        try:
            shutil.move(name, "%s.old" % name)
        except Exception as msg:
            log.error("Backup of file '%s' failed: %s", name, msg)
            os.remove(name)

        del self._policy_objects[obj.name]

    def check_builtin_policy_object(self, obj):
        if obj.builtin or not obj.default:
            raise FirewallError(errors.BUILTIN_POLICY,
                                "'%s' is built-in policy" % obj.name)

    def remove_policy_object(self, obj):
        self.check_builtin_policy_object(obj)
        self._remove_policy_object(obj)

    def rename_policy_object(self, obj, name):
        self.check_builtin_policy_object(obj)
        new_policy_object = self._copy_policy_object(obj, name)
        self._remove_policy_object(obj)
        return new_policy_object

    def _copy_policy_object(self, obj, name):
        return self.new_policy_object_dict(name, obj.export_config_dict())

    # helper

    def get_helpers(self):
        return sorted(set(list(self._helpers.keys()) + \
                          list(self._builtin_helpers.keys())))

    def add_helper(self, obj):
        if obj.builtin:
            self._builtin_helpers[obj.name] = obj
        else:
            self._helpers[obj.name] = obj

    def get_helper(self, name):
        if name in self._helpers:
            return self._helpers[name]
        elif name in self._builtin_helpers:
            return self._builtin_helpers[name]
        raise FirewallError(errors.INVALID_HELPER, name)

    def load_helper_defaults(self, obj):
        if obj.name not in self._helpers:
            raise FirewallError(errors.NO_DEFAULTS, obj.name)
        elif self._helpers[obj.name] != obj:
            raise FirewallError(errors.NO_DEFAULTS,
                                "self._helpers[%s] != obj" % obj.name)
        elif obj.name not in self._builtin_helpers:
            raise FirewallError(errors.NO_DEFAULTS,
                            "'%s' not a built-in helper" % obj.name)
        self._remove_helper(obj)
        return self._builtin_helpers[obj.name]

    def get_helper_config(self, obj):
        return obj.export_config()

    def set_helper_config(self, obj, conf):
        if obj.builtin:
            x = copy.copy(obj)
            x.import_config(conf)
            x.path = config.ETC_FIREWALLD_HELPERS
            x.builtin = False
            if obj.path != x.path:
                x.default = False
            self.add_helper(x)
            helper_writer(x)
            return x
        else:
            obj.import_config(conf)
            helper_writer(obj)
            return obj

    def new_helper(self, name, conf):
        if name in self._helpers or name in self._builtin_helpers:
            raise FirewallError(errors.NAME_CONFLICT,
                                "new_helper(): '%s'" % name)

        x = Helper()
        x.check_name(name)
        x.import_config(conf)
        x.name = name
        x.filename = "%s.xml" % name
        x.path = config.ETC_FIREWALLD_HELPERS
        # It is not possible to add a new one with the name of a builtin
        x.builtin = False
        x.default = True

        helper_writer(x)
        self.add_helper(x)
        return x

    def update_helper_from_path(self, name):
        filename = os.path.basename(name)
        path = os.path.dirname(name)

        if not os.path.exists(name):
            # removed file

            if path == config.ETC_FIREWALLD_HELPERS:
                # removed custom helper
                for x in self._helpers.keys():
                    obj = self._helpers[x]
                    if obj.filename == filename:
                        del self._helpers[x]
                        if obj.name in self._builtin_helpers:
                            return ("update", self._builtin_helpers[obj.name])
                        return ("remove", obj)
            else:
                # removed builtin helper
                for x in self._builtin_helpers.keys():
                    obj = self._builtin_helpers[x]
                    if obj.filename == filename:
                        del self._builtin_helpers[x]
                        if obj.name not in self._helpers:
                            # update dbus helper
                            return ("remove", obj)
                        else:
                            # builtin hidden, no update needed
                            return (None, None)

            # helper not known to firewalld, yet (timeout, ..)
            return (None, None)

        # new or updated file

        log.debug1("Loading helper file '%s'", name)
        try:
            obj = helper_reader(filename, path)
        except Exception as msg:
            log.error("Failed to load helper file '%s': %s", filename, msg)
            return (None, None)

        # new helper
        if obj.name not in self._builtin_helpers and obj.name not in self._helpers:
            self.add_helper(obj)
            return ("new", obj)

        # updated helper
        if path == config.ETC_FIREWALLD_HELPERS:
            # custom helper update
            if obj.name in self._helpers:
                obj.default = self._helpers[obj.name].default
                self._helpers[obj.name] = obj
            return ("update", obj)
        else:
            if obj.name in self._builtin_helpers:
                # builtin helper update
                del self._builtin_helpers[obj.name]
                self._builtin_helpers[obj.name] = obj

                if obj.name not in self._helpers:
                    # update dbus helper
                    return ("update", obj)
                else:
                    # builtin hidden, no update needed
                    return (None, None)

        # helper not known to firewalld, yet (timeout, ..)
        return (None, None)

    def _remove_helper(self, obj):
        if obj.name not in self._helpers:
            raise FirewallError(errors.INVALID_HELPER, obj.name)
        if obj.path != config.ETC_FIREWALLD_HELPERS:
            raise FirewallError(errors.INVALID_DIRECTORY,
                                "'%s' != '%s'" % (obj.path,
                                                  config.ETC_FIREWALLD_HELPERS))

        name = "%s/%s.xml" % (obj.path, obj.name)
        try:
            shutil.move(name, "%s.old" % name)
        except Exception as msg:
            log.error("Backup of file '%s' failed: %s", name, msg)
            os.remove(name)

        del self._helpers[obj.name]

    def check_builtin_helper(self, obj):
        if obj.builtin or not obj.default:
            raise FirewallError(errors.BUILTIN_HELPER,
                                "'%s' is built-in helper" % obj.name)

    def remove_helper(self, obj):
        self.check_builtin_helper(obj)
        self._remove_helper(obj)

    def rename_helper(self, obj, name):
        self.check_builtin_helper(obj)
        new_helper = self._copy_helper(obj, name)
        self._remove_helper(obj)
        return new_helper

    def _copy_helper(self, obj, name):
        return self.new_helper(name, obj.export_config())
site-packages/firewall/core/rich.py
# -*- coding: utf-8 -*-
#
# Copyright (C) 2013-2016 Red Hat, Inc.
#
# Authors:
# Thomas Woerner <twoerner@redhat.com>
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program.  If not, see <http://www.gnu.org/licenses/>.
#

__all__ = [ "Rich_Source", "Rich_Destination", "Rich_Service", "Rich_Port",
            "Rich_Protocol", "Rich_Masquerade", "Rich_IcmpBlock",
            "Rich_IcmpType",
            "Rich_SourcePort", "Rich_ForwardPort", "Rich_Log", "Rich_Audit",
            "Rich_Accept", "Rich_Reject", "Rich_Drop", "Rich_Mark",
            "Rich_Limit", "Rich_Rule" ]

from firewall import functions
from firewall.core.ipset import check_ipset_name
from firewall.core.base import REJECT_TYPES
from firewall import errors
from firewall.errors import FirewallError

class Rich_Source(object):
    def __init__(self, addr, mac, ipset, invert=False):
        self.addr = addr
        if self.addr == "":
            self.addr = None
        self.mac = mac
        if self.mac == "" or self.mac is None:
            self.mac = None
        else:
            self.mac = self.mac.upper()
        self.ipset = ipset
        if self.ipset == "":
            self.ipset = None
        self.invert = invert
        if self.addr is None and self.mac is None and self.ipset is None:
            raise FirewallError(errors.INVALID_RULE,
                                "no address, mac and ipset")

    def __str__(self):
        ret = 'source%s ' % (" NOT" if self.invert else "")
        if self.addr is not None:
            return ret + 'address="%s"' % self.addr
        elif self.mac is not None:
            return ret + 'mac="%s"' % self.mac
        elif self.ipset is not None:
            return ret + 'ipset="%s"' % self.ipset
        else:
            raise FirewallError(errors.INVALID_RULE,
                                "no address, mac and ipset")

class Rich_Destination(object):
    def __init__(self, addr, ipset, invert=False):
        self.addr = addr
        if self.addr == "":
            self.addr = None
        self.ipset = ipset
        if self.ipset == "":
            self.ipset = None
        self.invert = invert
        if self.addr is None and self.ipset is None:
            raise FirewallError(errors.INVALID_RULE,
                                "no address and ipset")

    def __str__(self):
        ret = 'destination%s ' % (" NOT" if self.invert else "")
        if self.addr is not None:
            return ret + 'address="%s"' % self.addr
        elif self.ipset is not None:
            return ret + 'ipset="%s"' % self.ipset
        else:
            raise FirewallError(errors.INVALID_RULE,
                                "no address and ipset")

class Rich_Service(object):
    def __init__(self, name):
        self.name = name

    def __str__(self):
        return 'service name="%s"' % (self.name)

class Rich_Port(object):
    def __init__(self, port, protocol):
        self.port = port
        self.protocol = protocol

    def __str__(self):
        return 'port port="%s" protocol="%s"' % (self.port, self.protocol)

class Rich_SourcePort(Rich_Port):
    def __str__(self):
        return 'source-port port="%s" protocol="%s"' % (self.port,
                                                        self.protocol)

class Rich_Protocol(object):
    def __init__(self, value):
        self.value = value

    def __str__(self):
        return 'protocol value="%s"' % (self.value)

class Rich_Masquerade(object):
    def __init__(self):
        pass

    def __str__(self):
        return 'masquerade'

class Rich_IcmpBlock(object):
    def __init__(self, name):
        self.name = name

    def __str__(self):
        return 'icmp-block name="%s"' % (self.name)

class Rich_IcmpType(object):
    def __init__(self, name):
        self.name = name

    def __str__(self):
        return 'icmp-type name="%s"' % (self.name)

class Rich_ForwardPort(object):
    def __init__(self, port, protocol, to_port, to_address):
        self.port = port
        self.protocol = protocol
        self.to_port = to_port
        self.to_address = to_address
        # replace None with "" in to_port and/or to_address
        if self.to_port is None:
            self.to_port = ""
        if self.to_address is None:
            self.to_address = ""

    def __str__(self):
        return 'forward-port port="%s" protocol="%s"%s%s' % \
            (self.port, self.protocol,
             ' to-port="%s"' % self.to_port if self.to_port != "" else '',
             ' to-addr="%s"' % self.to_address if self.to_address != "" else '')

class Rich_Log(object):
    def __init__(self, prefix=None, level=None, limit=None):
        #TODO check default level in iptables
        self.prefix = prefix
        self.level = level
        self.limit = limit

    def __str__(self):
        return 'log%s%s%s' % \
            (' prefix="%s"' % (self.prefix) if self.prefix else "",
             ' level="%s"' % (self.level) if self.level else "",
             " %s" % self.limit if self.limit else "")

class Rich_Audit(object):
    def __init__(self, limit=None):
        #TODO check default level in iptables
        self.limit = limit

    def __str__(self):
        return 'audit%s' % (" %s" % self.limit if self.limit else "")

class Rich_Accept(object):
    def __init__(self, limit=None):
        self.limit = limit

    def __str__(self):
        return "accept%s" % (" %s" % self.limit if self.limit else "")

class Rich_Reject(object):
    def __init__(self, _type=None, limit=None):
        self.type = _type
        self.limit = limit

    def __str__(self):
        return "reject%s%s" % (' type="%s"' % self.type if self.type else "",
                               " %s" % self.limit if self.limit else "")

    def check(self, family):
        if self.type:
            if not family:
                raise FirewallError(errors.INVALID_RULE, "When using reject type you must specify also rule family.")
            if family in ['ipv4', 'ipv6'] and \
               self.type not in REJECT_TYPES[family]:
                valid_types = ", ".join(REJECT_TYPES[family])
                raise FirewallError(errors.INVALID_RULE, "Wrong reject type %s.\nUse one of: %s." % (self.type, valid_types))

class Rich_Drop(Rich_Accept):
    def __str__(self):
        return "drop%s" % (" %s" % self.limit if self.limit else "")


class Rich_Mark(object):
    def __init__(self, _set, limit=None):
        self.set = _set
        self.limit = limit

    def __str__(self):
        return "mark set=%s%s" % (self.set,
                                  " %s" % self.limit if self.limit else "")

    def check(self):
        if self.set is not None:
            x = self.set
        else:
            raise FirewallError(errors.INVALID_MARK, "no value set")

        if "/" in x:
            splits = x.split("/")
            if len(splits) != 2:
                raise FirewallError(errors.INVALID_MARK, x)
            if not functions.checkUINT32(splits[0]) or \
               not functions.checkUINT32(splits[1]):
                # value and mask are uint32
                raise FirewallError(errors.INVALID_MARK, x)
        else:
            if not functions.checkUINT32(x):
                # value is uint32
                raise FirewallError(errors.INVALID_MARK, x)
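
# Editor's sketch (not part of upstream firewalld): Rich_Mark.check() accepts a
# plain uint32 mark value or a "value/mask" pair where both parts are uint32.
# Guarded so it never runs on import:
if __name__ == "__main__":  # pragma: no cover
    assert str(Rich_Mark("256")) == "mark set=256"
    Rich_Mark("256").check()         # single value
    Rich_Mark("256/65280").check()   # value/mask pair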

DURATION_TO_MULT = {
    "s": 1,
    "m": 60,
    "h": 60 * 60,
    "d": 24 * 60 * 60,
}

class Rich_Limit(object):
    def __init__(self, value, burst=None):
        self.value = value
        self.burst = burst

    def check(self):
        self.value_parse()
        self.burst_parse()

    @property
    def value(self):
        return self._value

    @value.setter
    def value(self, value):
        if value is None:
            self._value = None
            return
        try:
            rate, duration = self._value_parse(value)
        except FirewallError:
            # The value is invalid. We cannot normalize it.
            v = value
        else:
            v = f"{rate}/{duration}"
        if getattr(self, "_value", None) != v:
            self._value = v

    @property
    def burst(self):
        return self._burst

    @burst.setter
    def burst(self, burst):
        if burst is None:
            self._burst = None
            return
        try:
            b = self._burst_parse(burst)
        except FirewallError:
            # The value is invalid. We cannot normalize it.
            b = burst
        else:
            b = str(b)  # keep the normalized value, mirroring the value setter
        if getattr(self, "_burst", None) != b:
            self._burst = b

    @staticmethod
    def _value_parse(value):
        splits = None
        if "/" in value:
            splits = value.split("/")
        if not splits or len(splits) != 2:
            raise FirewallError(errors.INVALID_LIMIT, value)
        (rate, duration) = splits
        try:
            rate = int(rate)
        except (ValueError, TypeError):
            raise FirewallError(errors.INVALID_LIMIT, value)

        if duration in ["second", "minute", "hour", "day"]:
            duration = duration[:1]

        if rate < 1 or duration not in ["s", "m", "h", "d"]:
            raise FirewallError(errors.INVALID_LIMIT, value)

        if 10000 * DURATION_TO_MULT[duration] // rate == 0:
            raise FirewallError(errors.INVALID_LIMIT, "%s too fast" % (value,))

        if rate == 1 and duration == "d":
            # iptables (v1.4.21) doesn't accept 1/d
            raise FirewallError(errors.INVALID_LIMIT, "%s too slow" % (value,))

        return rate, duration

    def value_parse(self):
        return self._value_parse(self._value)

    @staticmethod
    def _burst_parse(burst):
        if burst is None:
            return None
        try:
            b = int(burst)
        except (ValueError, TypeError):
            raise FirewallError(errors.INVALID_LIMIT, burst)

        if b < 1 or b > 10_000_000:
            raise FirewallError(errors.INVALID_LIMIT, burst)

        return b

    def burst_parse(self):
        return self._burst_parse(self._burst)

    def __str__(self):
        s = f'limit value="{self._value}"'
        if self._burst is not None:
            s += f" burst={self._burst}"
        return s
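
# Editor's sketch (not part of upstream firewalld): Rich_Limit normalizes
# spelled-out durations ("second", "minute", "hour", "day") to "s"/"m"/"h"/"d"
# and validates rate and burst in check().  Guarded so it never runs on import:
if __name__ == "__main__":  # pragma: no cover
    _limit = Rich_Limit("3/hour", burst=10)
    assert _limit.value == "3/h"                         # duration normalized
    assert str(_limit) == 'limit value="3/h" burst=10'
    _limit.check()                                       # raises FirewallError if invalid
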

class Rich_Rule(object):
    priority_min = -32768
    priority_max =  32767

    def __init__(self, family=None, rule_str=None, priority=0):
        if family is not None:
            self.family = str(family)
        else:
            self.family = None

        self.priority = priority
        self.source = None
        self.destination = None
        self.element = None
        self.log = None
        self.audit = None
        self.action = None

        if rule_str:
            self._import_from_string(rule_str)

    def _lexer(self, rule_str):
        """ Lexical analysis """
        tokens = []

        for r in functions.splitArgs(rule_str):
            if "=" in r:
                attr = r.split('=')
                if len(attr) != 2 or not attr[0] or not attr[1]:
                    raise FirewallError(errors.INVALID_RULE,
                                        'internal error in _lexer(): %s' % r)
                tokens.append({'attr_name':attr[0], 'attr_value':attr[1]})
            else:
                tokens.append({'element':r})
        tokens.append({'element':'EOL'})

        return tokens

    def _import_from_string(self, rule_str):
        if not rule_str:
            raise FirewallError(errors.INVALID_RULE, 'empty rule')

        rule_str = functions.stripNonPrintableCharacters(rule_str)

        self.priority = 0
        self.family = None
        self.source = None
        self.destination = None
        self.element = None
        self.log = None
        self.audit = None
        self.action = None

        tokens = self._lexer(rule_str)
        if tokens and tokens[0].get('element')  == 'EOL':
            raise FirewallError(errors.INVALID_RULE, 'empty rule')

        attrs = {}       # attributes of elements
        in_elements = [] # stack with elements we are in
        index = 0        # index into tokens
        while not (tokens[index].get('element')  == 'EOL' and in_elements == ['rule']):
            element = tokens[index].get('element')
            attr_name = tokens[index].get('attr_name')
            attr_value = tokens[index].get('attr_value')
            #print ("in_elements: ", in_elements)
            #print ("index: %s, element: %s, attribute: %s=%s" % (index, element, attr_name, attr_value))
            if attr_name:     # attribute
                if attr_name not in ['priority', 'family', 'address', 'mac', 'ipset',
                                     'invert', 'value',
                                     'port', 'protocol', 'to-port', 'to-addr',
                                     'name', 'prefix', 'level', 'type',
                                     'set', 'burst']:
                    raise FirewallError(errors.INVALID_RULE, "bad attribute '%s'" % attr_name)
            else:             # element
                if element in ['rule', 'source', 'destination', 'protocol',
                               'service', 'port', 'icmp-block', 'icmp-type', 'masquerade',
                               'forward-port', 'source-port', 'log', 'audit',
                               'accept', 'drop', 'reject', 'mark', 'limit', 'not', 'NOT', 'EOL']:
                    if element == 'source' and self.source:
                        raise FirewallError(errors.INVALID_RULE, "more than one 'source' element")
                    elif element == 'destination' and self.destination:
                        raise FirewallError(errors.INVALID_RULE, "more than one 'destination' element")
                    elif element in ['protocol', 'service', 'port',
                                     'icmp-block', 'icmp-type',
                                     'masquerade', 'forward-port',
                                     'source-port'] and self.element:
                        raise FirewallError(errors.INVALID_RULE, "more than one element. There cannot be both '%s' and '%s' in one rule." % (element, self.element))
                    elif element == 'log' and self.log:
                        raise FirewallError(errors.INVALID_RULE, "more than one 'log' element")
                    elif element == 'audit' and self.audit:
                        raise FirewallError(errors.INVALID_RULE, "more than one 'audit' element")
                    elif element in ['accept', 'drop', 'reject', 'mark'] and self.action:
                        raise FirewallError(errors.INVALID_RULE, "more than one 'action' element. There cannot be both '%s' and '%s' in one rule." % (element, self.action))
                else:
                    raise FirewallError(errors.INVALID_RULE, "unknown element %s" % element)

            in_element = in_elements[len(in_elements)-1] if len(in_elements) > 0 else ''

            if in_element == '':
                if not element and attr_name:
                    if attr_name == 'family':
                        raise FirewallError(errors.INVALID_RULE, "'family' outside of rule. Use 'rule family=...'.")
                    elif attr_name == 'priority':
                        raise FirewallError(errors.INVALID_RULE, "'priority' outside of rule. Use 'rule priority=...'.")
                    else:
                        raise FirewallError(errors.INVALID_RULE, "'%s' outside of any element. Use 'rule <element> %s= ...'." % (attr_name, attr_name))
                elif 'rule' not in element:
                    raise FirewallError(errors.INVALID_RULE, "'%s' outside of rule. Use 'rule ... %s ...'." % (element, element))
                else:
                    in_elements.append('rule') # push into stack
            elif in_element == 'rule':
                if attr_name == 'family':
                    if attr_value not in ['ipv4', 'ipv6']:
                        raise FirewallError(errors.INVALID_RULE, "'family' attribute cannot have '%s' value. Use 'ipv4' or 'ipv6' instead." % attr_value)
                    self.family = attr_value
                elif attr_name == 'priority':
                    try:
                        self.priority = int(attr_value)
                    except ValueError:
                        raise FirewallError(errors.INVALID_PRIORITY, "invalid 'priority' attribute value '%s'." % attr_value)
                elif attr_name:
                    if attr_name == 'protocol':
                        err_msg = "wrong 'protocol' usage. Use either 'rule protocol value=...' or  'rule [forward-]port protocol=...'."
                    else:
                        err_msg = "attribute '%s' outside of any element. Use 'rule <element> %s= ...'." % (attr_name, attr_name)
                    raise FirewallError(errors.INVALID_RULE, err_msg)
                else:
                    in_elements.append(element) # push into stack
            elif in_element == 'source':
                if attr_name in ['address', 'mac', 'ipset', 'invert']:
                    attrs[attr_name] = attr_value
                elif element in ['not', 'NOT']:
                    attrs['invert'] = True
                else:
                    self.source = Rich_Source(attrs.get('address'), attrs.get('mac'), attrs.get('ipset'), attrs.get('invert', False))
                    in_elements.pop() # source
                    attrs.clear()
                    index = index -1 # return token to input
            elif in_element == 'destination':
                if attr_name in ['address', 'ipset', 'invert']:
                    attrs[attr_name] = attr_value
                elif element in ['not', 'NOT']:
                    attrs['invert'] = True
                else:
                    self.destination = Rich_Destination(attrs.get('address'), attrs.get('ipset'), attrs.get('invert', False))
                    in_elements.pop() # destination
                    attrs.clear()
                    index = index -1 # return token to input
            elif in_element == 'protocol':
                if attr_name == 'value':
                    self.element = Rich_Protocol(attr_value)
                    in_elements.pop() # protocol
                else:
                    raise FirewallError(errors.INVALID_RULE, "invalid 'protocol' element")
            elif in_element == 'service':
                if attr_name == 'name':
                    self.element = Rich_Service(attr_value)
                    in_elements.pop() # service
                else:
                    raise FirewallError(errors.INVALID_RULE, "invalid 'service' element")
            elif in_element == 'port':
                if attr_name in ['port', 'protocol']:
                    attrs[attr_name] = attr_value
                else:
                    self.element = Rich_Port(attrs.get('port'), attrs.get('protocol'))
                    in_elements.pop() # port
                    attrs.clear()
                    index = index -1 # return token to input
            elif in_element == 'icmp-block':
                if attr_name == 'name':
                    self.element = Rich_IcmpBlock(attr_value)
                    in_elements.pop() # icmp-block
                else:
                    raise FirewallError(errors.INVALID_RULE, "invalid 'icmp-block' element")
            elif in_element == 'icmp-type':
                if attr_name == 'name':
                    self.element = Rich_IcmpType(attr_value)
                    in_elements.pop() # icmp-type
                else:
                    raise FirewallError(errors.INVALID_RULE, "invalid 'icmp-type' element")
            elif in_element == 'masquerade':
                self.element = Rich_Masquerade()
                in_elements.pop()
                attrs.clear()
                index = index -1 # return token to input
            elif in_element == 'forward-port':
                if attr_name in ['port', 'protocol', 'to-port', 'to-addr']:
                    attrs[attr_name] = attr_value
                else:
                    self.element = Rich_ForwardPort(attrs.get('port'), attrs.get('protocol'), attrs.get('to-port'), attrs.get('to-addr'))
                    in_elements.pop() # forward-port
                    attrs.clear()
                    index = index -1 # return token to input
            elif in_element == 'source-port':
                if attr_name in ['port', 'protocol']:
                    attrs[attr_name] = attr_value
                else:
                    self.element = Rich_SourcePort(attrs.get('port'), attrs.get('protocol'))
                    in_elements.pop() # source-port
                    attrs.clear()
                    index = index -1 # return token to input
            elif in_element == 'log':
                if attr_name in ['prefix', 'level']:
                    attrs[attr_name] = attr_value
                elif element == 'limit':
                    in_elements.append('limit')
                else:
                    self.log = Rich_Log(attrs.get('prefix'), attrs.get('level'), attrs.get('limit'))
                    in_elements.pop() # log
                    attrs.clear()
                    index = index -1 # return token to input
            elif in_element == 'audit':
                if element == 'limit':
                    in_elements.append('limit')
                else:
                    self.audit = Rich_Audit(attrs.get('limit'))
                    in_elements.pop() # audit
                    attrs.clear()
                    index = index -1 # return token to input
            elif in_element == 'accept':
                if element == 'limit':
                    in_elements.append('limit')
                else:
                    self.action = Rich_Accept(attrs.get('limit'))
                    in_elements.pop() # accept
                    attrs.clear()
                    index = index -1 # return token to input
            elif in_element == 'drop':
                if element == 'limit':
                    in_elements.append('limit')
                else:
                    self.action = Rich_Drop(attrs.get('limit'))
                    in_elements.pop() # drop
                    attrs.clear()
                    index = index -1 # return token to input
            elif in_element == 'reject':
                if attr_name == 'type':
                    attrs[attr_name] = attr_value
                elif element == 'limit':
                    in_elements.append('limit')
                else:
                    self.action = Rich_Reject(attrs.get('type'), attrs.get('limit'))
                    in_elements.pop() # reject
                    attrs.clear()
                    index = index -1 # return token to input
            elif in_element == 'mark':
                if attr_name == 'set':
                    attrs[attr_name] = attr_value
                elif element == 'limit':
                    in_elements.append('limit')
                else:
                    self.action = Rich_Mark(attrs.get('set'),
                                            attrs.get('limit'))
                    in_elements.pop() # mark
                    attrs.clear()
                    index = index -1 # return token to input
            elif in_element == 'limit':
                if attr_name in ["value", "burst"]:
                    attrs[f"limit.{attr_name}"] = attr_value
                else:
                    if "limit.value" not in attrs:
                        raise FirewallError(
                            errors.INVALID_RULE, "invalid 'limit' element"
                        )
                    attrs["limit"] = Rich_Limit(
                        attrs["limit.value"], attrs.get("limit.burst")
                    )
                    attrs.pop("limit.value", None)
                    attrs.pop("limit.burst", None)
                    in_elements.pop()  # limit
                    index = index - 1  # return token to input

            index = index + 1

        self.check()

    def check(self):
        if self.family is not None and self.family not in [ "ipv4", "ipv6" ]:
            raise FirewallError(errors.INVALID_FAMILY, self.family)
        if self.family is None:
            if (self.source is not None and self.source.addr is not None) or \
               self.destination is not None:
                raise FirewallError(errors.MISSING_FAMILY)
            if type(self.element) == Rich_ForwardPort:
                raise FirewallError(errors.MISSING_FAMILY)

        if self.priority < self.priority_min or self.priority > self.priority_max:
            raise FirewallError(errors.INVALID_PRIORITY, "'priority' attribute must be between %d and %d." \
                                                         % (self.priority_min, self.priority_max))

        if self.element is None and \
           (self.log is None or (self.log is not None and self.priority == 0)):
            if self.action is None:
                raise FirewallError(errors.INVALID_RULE, "no element, no action")
            if self.source is None and self.destination is None and self.priority == 0:
                raise FirewallError(errors.INVALID_RULE, "no element, no source, no destination")

        if type(self.element) not in [ Rich_IcmpBlock,
                                       Rich_ForwardPort,
                                       Rich_Masquerade ]:
            if self.log is None and self.audit is None and \
                    self.action is None:
                raise FirewallError(errors.INVALID_RULE, "no action, no log, no audit")

        # source
        if self.source is not None:
            if self.source.addr is not None:
                if self.family is None:
                    raise FirewallError(errors.INVALID_FAMILY)
                if self.source.mac is not None:
                    raise FirewallError(errors.INVALID_RULE, "address and mac")
                if self.source.ipset is not None:
                    raise FirewallError(errors.INVALID_RULE, "address and ipset")
                if not functions.check_address(self.family, self.source.addr):
                    raise FirewallError(errors.INVALID_ADDR, str(self.source.addr))

            elif self.source.mac is not None:
                if self.source.ipset is not None:
                    raise FirewallError(errors.INVALID_RULE, "mac and ipset")
                if not functions.check_mac(self.source.mac):
                    raise FirewallError(errors.INVALID_MAC, str(self.source.mac))

            elif self.source.ipset is not None:
                if not check_ipset_name(self.source.ipset):
                    raise FirewallError(errors.INVALID_IPSET, str(self.source.ipset))

            else:
                raise FirewallError(errors.INVALID_RULE, "invalid source")

        # destination
        if self.destination is not None:
            if self.destination.addr is not None:
                if self.family is None:
                    raise FirewallError(errors.INVALID_FAMILY)
                if self.destination.ipset is not None:
                    raise FirewallError(errors.INVALID_DESTINATION, "address and ipset")
                if not functions.check_address(self.family, self.destination.addr):
                    raise FirewallError(errors.INVALID_ADDR, str(self.destination.addr))

            elif self.destination.ipset is not None:
                if not check_ipset_name(self.destination.ipset):
                    raise FirewallError(errors.INVALID_IPSET, str(self.destination.ipset))

            else:
                raise FirewallError(errors.INVALID_RULE, "invalid destination")

        # service
        if type(self.element) == Rich_Service:
            # service availability needs to be checked in Firewall; there is no
            # knowledge about it here, therefore only a simple check
            if self.element.name is None or len(self.element.name) < 1:
                raise FirewallError(errors.INVALID_SERVICE, str(self.element.name))

        # port
        elif type(self.element) == Rich_Port:
            if not functions.check_port(self.element.port):
                raise FirewallError(errors.INVALID_PORT, self.element.port)
            if self.element.protocol not in [ "tcp", "udp", "sctp", "dccp" ]:
                raise FirewallError(errors.INVALID_PROTOCOL, self.element.protocol)

        # protocol
        elif type(self.element) == Rich_Protocol:
            if not functions.checkProtocol(self.element.value):
                raise FirewallError(errors.INVALID_PROTOCOL, self.element.value)

        # masquerade
        elif type(self.element) == Rich_Masquerade:
            if self.action is not None:
                raise FirewallError(errors.INVALID_RULE, "masquerade and action")
            if self.source is not None and self.source.mac is not None:
                raise FirewallError(errors.INVALID_RULE, "masquerade and mac source")

        # icmp-block
        elif type(self.element) == Rich_IcmpBlock:
            # icmp type availability needs to be checked in Firewall; there is no
            # knowledge about it here, therefore only a simple check
            if self.element.name is None or len(self.element.name) < 1:
                raise FirewallError(errors.INVALID_ICMPTYPE, str(self.element.name))
            if self.action:
                raise FirewallError(errors.INVALID_RULE, "icmp-block and action")

        # icmp-type
        elif type(self.element) == Rich_IcmpType:
            # icmp type availability needs to be checked in Firewall; there is no
            # knowledge about it here, therefore only a simple check
            if self.element.name is None or len(self.element.name) < 1:
                raise FirewallError(errors.INVALID_ICMPTYPE, str(self.element.name))

        # forward-port
        elif type(self.element) == Rich_ForwardPort:
            if not functions.check_port(self.element.port):
                raise FirewallError(errors.INVALID_PORT, self.element.port)
            if self.element.protocol not in [ "tcp", "udp", "sctp", "dccp" ]:
                raise FirewallError(errors.INVALID_PROTOCOL, self.element.protocol)
            if self.element.to_port == "" and self.element.to_address == "":
                raise FirewallError(errors.INVALID_PORT, self.element.to_port)
            if self.element.to_port != "" and \
                    not functions.check_port(self.element.to_port):
                raise FirewallError(errors.INVALID_PORT, self.element.to_port)
            if self.element.to_address != "" and \
                    not functions.check_single_address(self.family,
                                                       self.element.to_address):
                raise FirewallError(errors.INVALID_ADDR, self.element.to_address)
            if self.family is None:
                raise FirewallError(errors.INVALID_FAMILY)
            if self.action is not None:
                raise FirewallError(errors.INVALID_RULE, "forward-port and action")

        # source-port
        elif type(self.element) == Rich_SourcePort:
            if not functions.check_port(self.element.port):
                raise FirewallError(errors.INVALID_PORT, self.element.port)
            if self.element.protocol not in [ "tcp", "udp", "sctp", "dccp" ]:
                raise FirewallError(errors.INVALID_PROTOCOL, self.element.protocol)

        # other element and not empty?
        elif self.element is not None:
            raise FirewallError(errors.INVALID_RULE, "Unknown element %s" % 
                                type(self.element))

        # log
        if self.log is not None:
            if self.log.level and \
               self.log.level not in [ "emerg", "alert", "crit", "error",
                                       "warning", "notice", "info", "debug" ]:
                raise FirewallError(errors.INVALID_LOG_LEVEL, self.log.level)

            if self.log.limit is not None:
                self.log.limit.check()

        # audit
        if self.audit is not None:
            if type(self.action) not in [ Rich_Accept, Rich_Reject, Rich_Drop ]:
                raise FirewallError(errors.INVALID_AUDIT_TYPE, type(self.action))

            if self.audit.limit is not None:
                self.audit.limit.check()

        # action
        if self.action is not None:
            if type(self.action) == Rich_Reject:
                self.action.check(self.family)
            elif type(self.action) == Rich_Mark:
                self.action.check()

            if self.action.limit is not None:
                self.action.limit.check()

    def __str__(self):
        ret = 'rule'
        if self.priority:
            ret += ' priority="%d"' % self.priority
        if self.family:
            ret += ' family="%s"' % self.family
        if self.source:
            ret += " %s" % self.source
        if self.destination:
            ret += " %s" % self.destination
        if self.element:
            ret += " %s" % self.element
        if self.log:
            ret += " %s" % self.log
        if self.audit:
            ret += " %s" % self.audit
        if self.action:
            ret += " %s" % self.action

        return (functions.u2b(ret)) if functions.PY2 else ret


#class Rich_RawRule(object):
#class Rich_RuleSet(object):
#class Rich_AddressList(object):
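
# Editor's sketch (not part of upstream firewalld): Rich_Rule parses the string
# form used by rich rules (e.g. firewall-cmd --add-rich-rule) and renders it
# back via __str__.  Guarded demo, assuming the firewall package is importable:
if __name__ == "__main__":  # pragma: no cover
    _r = Rich_Rule(rule_str='rule service name="ssh" accept')
    assert isinstance(_r.element, Rich_Service)
    assert isinstance(_r.action, Rich_Accept)
    assert str(_r) == 'rule service name="ssh" accept'
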
site-packages/firewall/core/fw_direct.py
# -*- coding: utf-8 -*-
#
# Copyright (C) 2010-2016 Red Hat, Inc.
#
# Authors:
# Thomas Woerner <twoerner@redhat.com>
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program.  If not, see <http://www.gnu.org/licenses/>.
#

__all__ = [ "FirewallDirect" ]

from firewall.fw_types import LastUpdatedOrderedDict
from firewall.core import ipXtables
from firewall.core import ebtables
from firewall.core.fw_transaction import FirewallTransaction
from firewall.core.logger import log
from firewall import errors
from firewall.errors import FirewallError

############################################################################
#
# class Firewall
#
############################################################################

class FirewallDirect(object):
    def __init__(self, fw):
        self._fw = fw
        self.__init_vars()

    def __repr__(self):
        return '%s(%r, %r, %r)' % (self.__class__, self._chains, self._rules,
                                   self._rule_priority_positions)

    def __init_vars(self):
        self._chains = { }
        self._rules = { }
        self._rule_priority_positions = { }
        self._passthroughs = { }
        self._obj = None

    def cleanup(self):
        self.__init_vars()

    # transaction

    def new_transaction(self):
        return FirewallTransaction(self._fw)

    # configuration

    def set_permanent_config(self, obj):
        self._obj = obj

    def has_configuration(self):
        if len(self._chains) + len(self._rules) + len(self._passthroughs) > 0:
            return True
        if len(self._obj.get_all_chains()) + \
           len(self._obj.get_all_rules()) + \
           len(self._obj.get_all_passthroughs()) > 0:
            return True
        return False

    def apply_direct(self, use_transaction=None):
        if use_transaction is None:
            transaction = self.new_transaction()
        else:
            transaction = use_transaction

        # Apply permanent configuration and save the obj to be able to
        # remove permanent configuration settings within get_runtime_config
        # for use in firewalld reload.
        self.set_config((self._obj.get_all_chains(),
                         self._obj.get_all_rules(),
                         self._obj.get_all_passthroughs()),
                        transaction)

        if use_transaction is None:
            transaction.execute(True)

    def get_runtime_config(self):
        # Return only runtime changes
        # Remove all chains, rules and passthroughs that are in self._obj
        # (the permanent configuration applied in firewalld _start).
        chains = { }
        rules = { }
        passthroughs = { }

        for table_id in self._chains:
            (ipv, table) = table_id
            for chain in self._chains[table_id]:
                if not self._obj.query_chain(ipv, table, chain):
                    chains.setdefault(table_id, [ ]).append(chain)

        for chain_id in self._rules:
            (ipv, table, chain) = chain_id
            for (priority, args) in self._rules[chain_id]:
                if not self._obj.query_rule(ipv, table, chain, priority, args):
                    if chain_id not in rules:
                        rules[chain_id] = LastUpdatedOrderedDict()
                    rules[chain_id][(priority, args)] = priority

        for ipv in self._passthroughs:
            for args in self._passthroughs[ipv]:
                if not self._obj.query_passthrough(ipv, args):
                    if ipv not in passthroughs:
                        passthroughs[ipv] = [ ]
                    passthroughs[ipv].append(args)

        return (chains, rules, passthroughs)

    def get_config(self):
        return (self._chains, self._rules, self._passthroughs)

    def set_config(self, conf, use_transaction=None):
        if use_transaction is None:
            transaction = self.new_transaction()
        else:
            transaction = use_transaction

        (_chains, _rules, _passthroughs) = conf
        for table_id in _chains:
            (ipv, table) = table_id
            for chain in _chains[table_id]:
                if not self.query_chain(ipv, table, chain):
                    try:
                        self.add_chain(ipv, table, chain,
                                       use_transaction=transaction)
                    except FirewallError as error:
                        log.warning(str(error))

        for chain_id in _rules:
            (ipv, table, chain) = chain_id
            for (priority, args) in _rules[chain_id]:
                if not self.query_rule(ipv, table, chain, priority, args):
                    try:
                        self.add_rule(ipv, table, chain, priority, args,
                                      use_transaction=transaction)
                    except FirewallError as error:
                        log.warning(str(error))

        for ipv in _passthroughs:
            for args in _passthroughs[ipv]:
                if not self.query_passthrough(ipv, args):
                    try:
                        self.add_passthrough(ipv, args,
                                             use_transaction=transaction)
                    except FirewallError as error:
                        log.warning(str(error))

        if use_transaction is None:
            transaction.execute(True)

    def _check_ipv(self, ipv):
        ipvs = ['ipv4', 'ipv6', 'eb']
        if ipv not in ipvs:
            raise FirewallError(errors.INVALID_IPV,
                                "'%s' not in '%s'" % (ipv, ipvs))

    def _check_ipv_table(self, ipv, table):
        self._check_ipv(ipv)

        tables = ipXtables.BUILT_IN_CHAINS.keys() if ipv in [ 'ipv4', 'ipv6' ] \
                                         else ebtables.BUILT_IN_CHAINS.keys()
        if table not in tables:
            raise FirewallError(errors.INVALID_TABLE,
                                "'%s' not in '%s'" % (table, tables))

    def _check_builtin_chain(self, ipv, table, chain):
        if ipv in ['ipv4', 'ipv6']:
            built_in_chains = ipXtables.BUILT_IN_CHAINS[table]
            if self._fw.nftables_enabled:
                our_chains = {}
            else:
                our_chains = self._fw.get_direct_backend_by_ipv(ipv).our_chains[table]
        else:
            built_in_chains = ebtables.BUILT_IN_CHAINS[table]
            our_chains = ebtables.OUR_CHAINS[table]
        if chain in built_in_chains:
            raise FirewallError(errors.BUILTIN_CHAIN,
                                "chain '%s' is built-in chain" % chain)
        if chain in our_chains:
            raise FirewallError(errors.BUILTIN_CHAIN,
                                "chain '%s' is reserved" % chain)
        if ipv in [ "ipv4", "ipv6" ]:
            if self._fw.zone.zone_from_chain(chain) is not None:
                raise FirewallError(errors.INVALID_CHAIN,
                                    "Chain '%s' is reserved" % chain)


    def _register_chain(self, table_id, chain, add):
        if add:
            self._chains.setdefault(table_id, [ ]).append(chain)
        else:
            self._chains[table_id].remove(chain)
            if len(self._chains[table_id]) == 0:
                del self._chains[table_id]

    def add_chain(self, ipv, table, chain, use_transaction=None):
        if use_transaction is None:
            transaction = self.new_transaction()
        else:
            transaction = use_transaction

        #TODO: policy="ACCEPT"
        self._chain(True, ipv, table, chain, transaction)

        if use_transaction is None:
            transaction.execute(True)

    def remove_chain(self, ipv, table, chain, use_transaction=None):
        if use_transaction is None:
            transaction = self.new_transaction()
        else:
            transaction = use_transaction

        self._chain(False, ipv, table, chain, transaction)

        if use_transaction is None:
            transaction.execute(True)

    def query_chain(self, ipv, table, chain):
        self._check_ipv_table(ipv, table)
        self._check_builtin_chain(ipv, table, chain)
        table_id = (ipv, table)
        return (table_id in self._chains and
                chain in self._chains[table_id])

    def get_chains(self, ipv, table):
        self._check_ipv_table(ipv, table)
        table_id = (ipv, table)
        if table_id in self._chains:
            return self._chains[table_id]
        return [ ]

    def get_all_chains(self):
        r = [ ]
        for key in self._chains:
            (ipv, table) = key
            for chain in self._chains[key]:
                r.append((ipv, table, chain))
        return r


    def add_rule(self, ipv, table, chain, priority, args, use_transaction=None):
        if use_transaction is None:
            transaction = self.new_transaction()
        else:
            transaction = use_transaction

        self._rule(True, ipv, table, chain, priority, args, transaction)

        if use_transaction is None:
            transaction.execute(True)

    def remove_rule(self, ipv, table, chain, priority, args,
                    use_transaction=None):
        if use_transaction is None:
            transaction = self.new_transaction()
        else:
            transaction = use_transaction

        self._rule(False, ipv, table, chain, priority, args, transaction)

        if use_transaction is None:
            transaction.execute(True)

    def query_rule(self, ipv, table, chain, priority, args):
        self._check_ipv_table(ipv, table)
        chain_id = (ipv, table, chain)
        return chain_id in self._rules and \
            (priority, args) in self._rules[chain_id]

    def get_rules(self, ipv, table, chain):
        self._check_ipv_table(ipv, table)
        chain_id = (ipv, table, chain)
        if chain_id in self._rules:
            return list(self._rules[chain_id].keys())
        return [ ]

    def get_all_rules(self):
        r = [ ]
        for key in self._rules:
            (ipv, table, chain) = key
            for (priority, args) in self._rules[key]:
                r.append((ipv, table, chain, priority, list(args)))
        return r

    def _register_rule(self, rule_id, chain_id, priority, enable, count):
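        # _rule_priority_positions[chain_id][priority] counts how many direct
        # rules are currently installed per priority; _rule() uses these
        # counts to compute the ip*tables insert index of a new rule.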
        if enable:
            if chain_id not in self._rules:
                self._rules[chain_id] = LastUpdatedOrderedDict()
            self._rules[chain_id][rule_id] = priority
            if chain_id not in self._rule_priority_positions:
                self._rule_priority_positions[chain_id] = { }

            if priority in self._rule_priority_positions[chain_id]:
                self._rule_priority_positions[chain_id][priority] += count
            else:
                self._rule_priority_positions[chain_id][priority] = count
        else:
            del self._rules[chain_id][rule_id]
            if len(self._rules[chain_id]) == 0:
                del self._rules[chain_id]
            self._rule_priority_positions[chain_id][priority] -= count

    # DIRECT PASSTHROUGH (untracked)
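    # passthrough() below runs the given arguments immediately through the
    # direct backend for this ipv and does not record them in
    # self._passthroughs (hence "untracked").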

    def passthrough(self, ipv, args):
        try:
            return self._fw.rule(self._fw.get_direct_backend_by_ipv(ipv).name, args)
        except Exception as msg:
            log.debug2(msg)
            raise FirewallError(errors.COMMAND_FAILED, msg)


    def _register_passthrough(self, ipv, args, enable):
        if enable:
            if ipv not in self._passthroughs:
                self._passthroughs[ipv] = [ ]
            self._passthroughs[ipv].append(args)
        else:
            self._passthroughs[ipv].remove(args)
            if len(self._passthroughs[ipv]) == 0:
                del self._passthroughs[ipv]

    def add_passthrough(self, ipv, args, use_transaction=None):
        if use_transaction is None:
            transaction = self.new_transaction()
        else:
            transaction = use_transaction

        self._passthrough(True, ipv, list(args), transaction)

        if use_transaction is None:
            transaction.execute(True)

    def remove_passthrough(self, ipv, args, use_transaction=None):
        if use_transaction is None:
            transaction = self.new_transaction()
        else:
            transaction = use_transaction

        self._passthrough(False, ipv, list(args), transaction)

        if use_transaction is None:
            transaction.execute(True)

    def query_passthrough(self, ipv, args):
        return ipv in self._passthroughs and \
            tuple(args) in self._passthroughs[ipv]

    def get_all_passthroughs(self):
        r = [ ]
        for ipv in self._passthroughs:
            for args in self._passthroughs[ipv]:
                r.append((ipv, list(args)))
        return r

    def get_passthroughs(self, ipv):
        r = [ ]
        if ipv in self._passthroughs:
            for args in self._passthroughs[ipv]:
                r.append(list(args))
        return r

    def split_value(self, rules, opts):
        """Split values combined with commas for options in opts"""

        out_rules = [ ]
        for rule in rules:
            processed = False
            for opt in opts:
                try:
                    i = rule.index(opt)
                except ValueError:
                    pass
                else:
                    if len(rule) > i+1 and "," in rule[i+1]:
                        # For all items in the comma separated list in index
                        # i of the rule, a new rule is created with a single
                        # item from this list
                        processed = True
                        items = rule[i+1].split(",")
                        for item in items:
                            _rule = rule[:]
                            _rule[i+1] = item
                            out_rules.append(_rule)
            if not processed:
                out_rules.append(rule)

        return out_rules
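    # Illustrative example (hypothetical arguments):
    #   split_value([["-p", "tcp", "-s", "10.0.0.1,10.0.0.2", "-j", "ACCEPT"]],
    #               ["-s", "--source"])
    # returns two rules, one per source address:
    #   ["-p", "tcp", "-s", "10.0.0.1", "-j", "ACCEPT"]
    #   ["-p", "tcp", "-s", "10.0.0.2", "-j", "ACCEPT"]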


    def _rule(self, enable, ipv, table, chain, priority, args, transaction):
        self._check_ipv_table(ipv, table)
        # Do not create zone chains if we're using nftables. Only allow direct
        # rules in the built-in chains.
        if not self._fw.nftables_enabled \
           and ipv in [ "ipv4", "ipv6" ]:
            self._fw.zone.create_zone_base_by_chain(ipv, table, chain,
                                                    transaction)

        _chain = chain

        backend = self._fw.get_direct_backend_by_ipv(ipv)

        # if nftables is in use, just put the direct rules in the chain
        # specified by the user. i.e. don't append _direct.
        if not self._fw.nftables_enabled \
           and backend.is_chain_builtin(ipv, table, chain):
            _chain = "%s_direct" % (chain)
        elif self._fw.nftables_enabled and chain[-7:] == "_direct" \
             and backend.is_chain_builtin(ipv, table, chain[:-7]):
            # strip _direct suffix. If we're using nftables we don't bother
            # creating the *_direct chains for builtin chains.
            _chain = chain[:-7]

        chain_id = (ipv, table, chain)
        rule_id = (priority, args)

        if enable:
            if chain_id in self._rules and \
                    rule_id in self._rules[chain_id]:
                raise FirewallError(errors.ALREADY_ENABLED,
                                    "rule '%s' already is in '%s:%s:%s'" % \
                                    (args, ipv, table, chain))
        else:
            if chain_id not in self._rules or \
                    rule_id not in self._rules[chain_id]:
                raise FirewallError(errors.NOT_ENABLED,
                                    "rule '%s' is not in '%s:%s:%s'" % \
                                    (args, ipv, table, chain))

            # get priority of rule
            priority = self._rules[chain_id][rule_id]

        # If a rule gets added, the initial rule index position within the
        # ipv, table and chain combination (chain_id) is 1.
        # If the chain_id exists in _rule_priority_positions, there are already
        # other rules for this chain_id. The number of rules for a priority
        # less or equal to the priority of the new rule will increase the
        # index of the new rule. The index is the ip*tables -I insert rule
        # number.
        #
        # Example: We have the following rules for chain_id (ipv4, filter,
        # INPUT) already:
        #   ipv4, filter, INPUT, 1, -i, foo1, -j, ACCEPT
        #   ipv4, filter, INPUT, 2, -i, foo2, -j, ACCEPT
        #   ipv4, filter, INPUT, 2, -i, foo2_1, -j, ACCEPT
        #   ipv4, filter, INPUT, 3, -i, foo3, -j, ACCEPT
        # This results in the following _rule_priority_positions structure:
        #   _rule_priority_positions[(ipv4,filter,INPUT)][1] = 1
        #   _rule_priority_positions[(ipv4,filter,INPUT)][2] = 2
        #   _rule_priority_positions[(ipv4,filter,INPUT)][3] = 1
        # The new rule
        #   ipv4, filter, INPUT, 2, -i, foo2_2, -j, ACCEPT
        # has the same priority as the second rule before and will be added
        # right after it.
        # The initial index is 1 and the chain_id is already in
        # _rule_priority_positions. Therefore the index will increase for
        # the number of rules in every rule position in
        # _rule_priority_positions[(ipv4,filter,INPUT)].keys()
        # where position is smaller or equal to the entry in keys.
        # With the example from above:
        # The priority of the new rule is 2. Therefore for all keys in
        # _rule_priority_positions[chain_id] where priority is 1 or 2, the
        # number of the rules will increase the index of the rule.
        # For _rule_priority_positions[chain_id][1]: index += 1
        # _rule_priority_positions[chain_id][2]: index += 2
        # index will be 4 in the end and the rule in the table chain
        # combination will be added at index 4.
        # If there are no rules in the table chain combination, a new rule
        # has index 1.
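        #
        # Illustrative trace of the loop below, reusing the example above:
        # with positions {1: 1, 2: 2, 3: 1} and a new rule of priority 2, the
        # loop visits priorities 1 and 2, so index = 1 + 1 + 2 = 4.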

        index = 1
        count = 0
        if chain_id in self._rule_priority_positions:
            positions = sorted(self._rule_priority_positions[chain_id].keys())
            j = 0
            while j < len(positions) and priority >= positions[j]:
                index += self._rule_priority_positions[chain_id][positions[j]]
                j += 1

        # split the direct rule in some cases as iptables-restore can't handle
        # compound args.
        #
        args_list = [list(args)]
        args_list = self.split_value(args_list, [ "-s", "--source" ])
        args_list = self.split_value(args_list, [ "-d", "--destination" ])

        for _args in args_list:
            transaction.add_rule(backend, backend.build_rule(enable, table, _chain, index, tuple(_args)))
            index += 1
            count += 1

        self._register_rule(rule_id, chain_id, priority, enable, count)
        transaction.add_fail(self._register_rule,
                             rule_id, chain_id, priority, not enable, count)

    def _chain(self, add, ipv, table, chain, transaction):
        self._check_ipv_table(ipv, table)
        self._check_builtin_chain(ipv, table, chain)
        table_id = (ipv, table)

        if add:
            if table_id in self._chains and \
                    chain in self._chains[table_id]:
                raise FirewallError(errors.ALREADY_ENABLED,
                                    "chain '%s' already is in '%s:%s'" % \
                                    (chain, ipv, table))
        else:
            if table_id not in self._chains or \
                    chain not in self._chains[table_id]:
                raise FirewallError(errors.NOT_ENABLED,
                                    "chain '%s' is not in '%s:%s'" % \
                                    (chain, ipv, table))

        backend = self._fw.get_direct_backend_by_ipv(ipv)
        transaction.add_rules(backend, backend.build_chain_rules(add, table, chain))

        self._register_chain(table_id, chain, add)
        transaction.add_fail(self._register_chain, table_id, chain, not add)

    def _passthrough(self, enable, ipv, args, transaction):
        self._check_ipv(ipv)

        tuple_args = tuple(args)
        if enable:
            if ipv in self._passthroughs and \
               tuple_args in self._passthroughs[ipv]:
                raise FirewallError(errors.ALREADY_ENABLED,
                                    "passthrough '%s', '%s'" % (ipv, args))
        else:
            if ipv not in self._passthroughs or \
               tuple_args not in self._passthroughs[ipv]:
                raise FirewallError(errors.NOT_ENABLED,
                                    "passthrough '%s', '%s'" % (ipv, args))

        backend = self._fw.get_direct_backend_by_ipv(ipv)

        if enable:
            backend.check_passthrough(args)
            # try to find out if a zone chain should be used
            if ipv in [ "ipv4", "ipv6" ]:
                table, chain = backend.passthrough_parse_table_chain(args)
                if table and chain:
                    self._fw.zone.create_zone_base_by_chain(ipv, table, chain)
            _args = args
        else:
            _args = backend.reverse_passthrough(args)

        transaction.add_rule(backend, _args)

        self._register_passthrough(ipv, tuple_args, enable)
        transaction.add_fail(self._register_passthrough, ipv, tuple_args,
                             not enable)
site-packages/firewall/core/fw_ipset.py
# -*- coding: utf-8 -*-
#
# Copyright (C) 2015-2016 Red Hat, Inc.
#
# Authors:
# Thomas Woerner <twoerner@redhat.com>
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program.  If not, see <http://www.gnu.org/licenses/>.
#

"""ipset backend"""

__all__ = [ "FirewallIPSet" ]

from firewall.core.logger import log
from firewall.core.ipset import remove_default_create_options as rm_def_cr_opts, \
                                normalize_ipset_entry, check_entry_overlaps_existing, \
                                check_for_overlapping_entries
from firewall.core.io.ipset import IPSet
from firewall import errors
from firewall.errors import FirewallError
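
# Minimal illustrative usage (hypothetical names), assuming a configured
# Firewall instance `fw` and an ipset configuration object `obj` whose type
# is supported by the enabled backends:
#
#   fwi = FirewallIPSet(fw)
#   fwi.add_ipset(obj)             # register the ipset object
#   fwi.apply_ipset(obj.name)      # create/restore it via the backends
#   fwi.add_entry(obj.name, "192.0.2.1")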

class FirewallIPSet(object):
    def __init__(self, fw):
        self._fw = fw
        self._ipsets = { }

    def __repr__(self):
        return '%s(%r)' % (self.__class__, self._ipsets)

    # ipsets

    def cleanup(self):
        self._ipsets.clear()

    def check_ipset(self, name):
        if name not in self.get_ipsets():
            raise FirewallError(errors.INVALID_IPSET, name)

    def query_ipset(self, name):
        return name in self.get_ipsets()

    def get_ipsets(self):
        return sorted(self._ipsets.keys())

    def has_ipsets(self):
        return len(self._ipsets) > 0

    def get_ipset(self, name, applied=False):
        self.check_ipset(name)
        obj = self._ipsets[name]
        if applied:
            self.check_applied_obj(obj)
        return obj

    def backends(self):
        backends = []
        if self._fw.nftables_enabled:
            backends.append(self._fw.nftables_backend)
        if self._fw.ipset_enabled:
            backends.append(self._fw.ipset_backend)
        return backends

    def add_ipset(self, obj):
        if obj.type not in self._fw.ipset_supported_types:
            raise FirewallError(errors.INVALID_TYPE,
                                "'%s' is not supported by ipset." % obj.type)
        self._ipsets[obj.name] = obj

    def remove_ipset(self, name, keep=False):
        obj = self._ipsets[name]
        if obj.applied and not keep:
            try:
                for backend in self.backends():
                    backend.set_destroy(name)
            except Exception as msg:
                raise FirewallError(errors.COMMAND_FAILED, msg)
        else:
            log.debug1("Keeping ipset '%s' because of timeout option", name)
        del self._ipsets[name]

    def apply_ipset(self, name):
        obj = self._ipsets[name]

        for backend in self.backends():
            if backend.name == "ipset":
                active = backend.set_get_active_terse()
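                # Destroy an existing kernel set that cannot simply be reused:
                # the configured set has no (non-zero) timeout, or its type or
                # create options differ from the active set (see check below).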

                if name in active and ("timeout" not in obj.options or \
                                       obj.options["timeout"] == "0" or \
                                       obj.type != active[name][0] or \
                                       rm_def_cr_opts(obj.options) != \
                                       active[name][1]):
                    try:
                        backend.set_destroy(name)
                    except Exception as msg:
                        raise FirewallError(errors.COMMAND_FAILED, msg)

            if self._fw._individual_calls:
                try:
                    backend.set_create(obj.name, obj.type, obj.options)
                except Exception as msg:
                    raise FirewallError(errors.COMMAND_FAILED, msg)
                else:
                    obj.applied = True
                    if "timeout" in obj.options and \
                       obj.options["timeout"] != "0":
                        # no entries visible for ipsets with timeout
                        continue

                try:
                    backend.set_flush(obj.name)
                except Exception as msg:
                    raise FirewallError(errors.COMMAND_FAILED, msg)

                for entry in obj.entries:
                    try:
                        backend.set_add(obj.name, entry)
                    except Exception as msg:
                        raise FirewallError(errors.COMMAND_FAILED, msg)
            else:
                try:
                    backend.set_restore(obj.name, obj.type,
                                                   obj.entries, obj.options,
                                                   None)
                except Exception as msg:
                    raise FirewallError(errors.COMMAND_FAILED, msg)
                else:
                    obj.applied = True

    def apply_ipsets(self):
        for name in self.get_ipsets():
            obj = self._ipsets[name]
            obj.applied = False

            log.debug1("Applying ipset '%s'" % name)
            self.apply_ipset(name)

    def flush(self):
        for backend in self.backends():
            # nftables sets are part of the normal firewall ruleset.
            if backend.name == "nftables":
                continue
            for ipset in self.get_ipsets():
                try:
                    self.check_applied(ipset)
                    backend.set_destroy(ipset)
                except FirewallError as msg:
                    if msg.code != errors.NOT_APPLIED:
                        raise msg

    # TYPE

    def get_type(self, name, applied=True):
        return self.get_ipset(name, applied=applied).type

    # DIMENSION
    def get_dimension(self, name):
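        # e.g. ipset type "hash:ip" has dimension 1, "hash:ip,port" has 2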
        return len(self.get_ipset(name, applied=True).type.split(","))

    def check_applied(self, name):
        obj = self.get_ipset(name)
        self.check_applied_obj(obj)

    def check_applied_obj(self, obj):
        if not obj.applied:
            raise FirewallError(
                errors.NOT_APPLIED, obj.name)

    # OPTIONS

    def get_family(self, name, applied=True):
        obj = self.get_ipset(name, applied=applied)
        if "family" in obj.options:
            if obj.options["family"] == "inet6":
                return "ipv6"
        return "ipv4"

    # ENTRIES

    def add_entry(self, name, entry):
        obj = self.get_ipset(name, applied=True)
        entry = normalize_ipset_entry(entry)

        IPSet.check_entry(entry, obj.options, obj.type)
        if entry in obj.entries:
            raise FirewallError(errors.ALREADY_ENABLED,
                                "'%s' already is in '%s'" % (entry, name))
        check_entry_overlaps_existing(entry, obj.entries)

        try:
            for backend in self.backends():
                backend.set_add(obj.name, entry)
        except Exception as msg:
            raise FirewallError(errors.COMMAND_FAILED, msg)
        else:
            if "timeout" not in obj.options or obj.options["timeout"] == "0":
                # no entries visible for ipsets with timeout
                obj.entries.append(entry)

    def remove_entry(self, name, entry):
        obj = self.get_ipset(name, applied=True)
        entry = normalize_ipset_entry(entry)

        # no entry check for removal
        if entry not in obj.entries:
            raise FirewallError(errors.NOT_ENABLED,
                                "'%s' not in '%s'" % (entry, name))
        try:
            for backend in self.backends():
                backend.set_delete(obj.name, entry)
        except Exception as msg:
            raise FirewallError(errors.COMMAND_FAILED, msg)
        else:
            if "timeout" not in obj.options or obj.options["timeout"] == "0":
                # no entries visible for ipsets with timeout
                obj.entries.remove(entry)

    def query_entry(self, name, entry):
        obj = self.get_ipset(name, applied=True)
        entry = normalize_ipset_entry(entry)
        if "timeout" in obj.options and obj.options["timeout"] != "0":
            # no entries visible for ipsets with timeout
            raise FirewallError(errors.IPSET_WITH_TIMEOUT, name)

        return entry in obj.entries

    def get_entries(self, name):
        obj = self.get_ipset(name, applied=True)
        return obj.entries

    def set_entries(self, name, entries):
        obj = self.get_ipset(name, applied=True)

        check_for_overlapping_entries(entries)

        for entry in entries:
            IPSet.check_entry(entry, obj.options, obj.type)
        if "timeout" not in obj.options or obj.options["timeout"] == "0":
            # no entries visible for ipsets with timeout
            obj.entries = entries

        try:
            for backend in self.backends():
                backend.set_flush(obj.name)
        except Exception as msg:
            raise FirewallError(errors.COMMAND_FAILED, msg)
        else:
            obj.applied = True

        try:
            for backend in self.backends():
                if self._fw._individual_calls:
                    for entry in obj.entries:
                        backend.set_add(obj.name, entry)
                else:
                    backend.set_restore(obj.name, obj.type, obj.entries,
                                                   obj.options, None)
        except Exception as msg:
            raise FirewallError(errors.COMMAND_FAILED, msg)
        else:
            obj.applied = True

        return
site-packages/firewall/core/fw_policy.py
# -*- coding: utf-8 -*-
#
# SPDX-License-Identifier: GPL-2.0-or-later

import time
import copy
from firewall.core.logger import log
from firewall.functions import portStr, checkIPnMask, checkIP6nMask, \
    checkProtocol, enable_ip_forwarding, check_single_address, \
    portInPortRange, get_nf_conntrack_short_name, coalescePortRange, breakPortRange
from firewall.core.rich import Rich_Rule, Rich_Accept, \
    Rich_Service, Rich_Port, Rich_Protocol, \
    Rich_Masquerade, Rich_ForwardPort, Rich_SourcePort, Rich_IcmpBlock, \
    Rich_IcmpType, Rich_Mark
from firewall.core.fw_transaction import FirewallTransaction
from firewall import errors
from firewall.errors import FirewallError
from firewall.fw_types import LastUpdatedOrderedDict
from firewall.core.base import SOURCE_IPSET_TYPES

class FirewallPolicy(object):
    def __init__(self, fw):
        self._fw = fw
        self._chains = { }
        self._policies = { }

    def __repr__(self):
        return '%s(%r, %r)' % (self.__class__, self._chains, self._policies)

    def cleanup(self):
        self._chains.clear()
        self._policies.clear()

    # transaction

    def new_transaction(self):
        return FirewallTransaction(self._fw)

    # policies

    def get_policies(self):
        return sorted(self._policies.keys())

    def get_policies_not_derived_from_zone(self):
        policies = []
        for p in self.get_policies():
            p_obj = self.get_policy(p)
            if not p_obj.derived_from_zone:
                policies.append(p)
        return sorted(policies)

    def get_active_policies_not_derived_from_zone(self):
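        # A policy counts as active when at least one of its ingress zones and
        # at least one of its egress zones is an active zone or one of the
        # special endpoints "HOST"/"ANY".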
        active_policies = []
        for policy in self.get_policies_not_derived_from_zone():
            settings = self.get_settings(policy)
            if (set(settings["ingress_zones"]) & (set(self._fw.zone.get_active_zones()) | set(["HOST", "ANY"]))) and \
               (set(settings["egress_zones"]) & (set(self._fw.zone.get_active_zones()) | set(["HOST", "ANY"]))):
                active_policies.append(policy)

        return active_policies

    def get_policy(self, policy):
        p = self._fw.check_policy(policy)
        return self._policies[p]

    def add_policy(self, obj):
        obj.settings = { x : LastUpdatedOrderedDict()
                         for x in [ "services", "ports",
                                    "masquerade", "forward_ports",
                                    "source_ports",
                                    "icmp_blocks", "rules",
                                    "protocols", "icmp_block_inversion",
                                    "ingress_zones", "egress_zones" ] }

        self._policies[obj.name] = obj
        self.copy_permanent_to_runtime(obj.name)

    def remove_policy(self, policy):
        obj = self._policies[policy]
        if obj.applied:
            self.unapply_policy_settings(policy)
        obj.settings.clear()
        del self._policies[policy]

    def copy_permanent_to_runtime(self, policy):
        obj = self._policies[policy]

        if obj.applied:
            return

        for args in obj.ingress_zones:
            self.add_ingress_zone(policy, args, allow_apply=False)
        for args in obj.egress_zones:
            self.add_egress_zone(policy, args, allow_apply=False)
        for args in obj.icmp_blocks:
            self.add_icmp_block(policy, args)
        for args in obj.forward_ports:
            self.add_forward_port(policy, *args)
        for args in obj.services:
            self.add_service(policy, args)
        for args in obj.ports:
            try:
                self.add_port(policy, *args)
            except FirewallError as error:
                if error.code in [errors.ALREADY_ENABLED]:
                    log.warning(error)
                else:
                    raise error
        for args in obj.protocols:
            self.add_protocol(policy, args)
        for args in obj.source_ports:
            try:
                self.add_source_port(policy, *args)
            except FirewallError as error:
                if error.code in [errors.ALREADY_ENABLED]:
                    log.warning(error)
                else:
                    raise error
        for args in obj.rules:
            self.add_rule(policy, args)
        if obj.masquerade:
            self.add_masquerade(policy)

    def apply_policies(self, use_transaction=None):
        for policy in self.get_policies():
            p_obj = self._policies[policy]
            if p_obj.derived_from_zone:
                continue
            if policy in self.get_active_policies_not_derived_from_zone():
                log.debug1("Applying policy '%s'", policy)
                self.apply_policy_settings(policy, use_transaction=use_transaction)

    def set_policy_applied(self, policy, applied):
        obj = self._policies[policy]
        obj.applied = applied

    # settings

    # generate settings record with sender, timeout
    def __gen_settings(self, timeout, sender):
        ret = {
            "date": time.time(),
            "sender": sender,
            "timeout": timeout,
        }
        return ret

    def get_settings(self, policy):
        return self.get_policy(policy).settings

    def _policy_settings(self, enable, policy, use_transaction=None):
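        # Single worker behind apply_policy_settings()/unapply_policy_settings():
        # build (or tear down) the policy's base chain layout, then walk every
        # stored setting and dispatch to the matching low-level helper
        # (_service, _port, _masquerade, __rule, ...) within one transaction.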
        _policy = self._fw.check_policy(policy)
        obj = self._policies[_policy]
        if (enable and obj.applied) or (not enable and not obj.applied):
            return
        if enable:
            obj.applied = True

        if use_transaction is None:
            transaction = self.new_transaction()
        else:
            transaction = use_transaction

        if enable:
            # build the base chain layout of the policy
            for (table, chain) in self._get_table_chains_for_policy_dispatch(policy) if not obj.derived_from_zone \
                             else self._get_table_chains_for_zone_dispatch(policy):
                self.gen_chain_rules(policy, True, table, chain, transaction)

        settings = self.get_settings(policy)
        if not obj.derived_from_zone:
            self._ingress_egress_zones(enable, _policy, transaction)
        for key in settings:
            for args in settings[key]:
                if key == "icmp_blocks":
                    self._icmp_block(enable, _policy, args, transaction)
                elif key == "icmp_block_inversion":
                    continue
                elif key == "forward_ports":
                    self._forward_port(enable, _policy, transaction,
                                       *args)
                elif key == "services":
                    self._service(enable, _policy, args, transaction)
                elif key == "ports":
                    self._port(enable, _policy, args[0], args[1],
                                transaction)
                elif key == "protocols":
                    self._protocol(enable, _policy, args, transaction)
                elif key == "source_ports":
                    self._source_port(enable, _policy, args[0], args[1],
                                       transaction)
                elif key == "masquerade":
                    self._masquerade(enable, _policy, transaction)
                elif key == "rules":
                    self.__rule(enable, _policy, Rich_Rule(rule_str=args),
                                transaction)
                elif key == "ingress_zones":
                    continue
                elif key == "egress_zones":
                    continue
                else:
                    log.warning("Policy '%s': Unknown setting '%s:%s', "
                                "unable to apply", policy, key, args)

        if not enable:
            for (table, chain) in self._get_table_chains_for_policy_dispatch(policy) if not obj.derived_from_zone \
                             else self._get_table_chains_for_zone_dispatch(policy):
                self.gen_chain_rules(policy, False, table, chain, transaction)
            obj.applied = False

        if use_transaction is None:
            transaction.execute(enable)

    def apply_policy_settings(self, policy, use_transaction=None):
        self._policy_settings(True, policy, use_transaction=use_transaction)

    def unapply_policy_settings(self, policy, use_transaction=None):
        self._policy_settings(False, policy, use_transaction=use_transaction)

    def get_config_with_settings_dict(self, policy):
        """
        :return: exported config updated with runtime settings
        """
        permanent = self.get_policy(policy).export_config_dict()
        runtime = { "services": self.list_services(policy),
                    "ports": self.list_ports(policy),
                    "icmp_blocks": self.list_icmp_blocks(policy),
                    "masquerade": self.query_masquerade(policy),
                    "forward_ports": self.list_forward_ports(policy),
                    "rich_rules": self.list_rules(policy),
                    "protocols": self.list_protocols(policy),
                    "source_ports": self.list_source_ports(policy),
                    "ingress_zones": self.list_ingress_zones(policy),
                    "egress_zones": self.list_egress_zones(policy),
                    }
        return self._fw.combine_runtime_with_permanent_settings(permanent, runtime)

    def set_config_with_settings_dict(self, policy, settings, sender):
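        # Diff the currently effective settings against the requested ones,
        # then call the matching remove_*() for everything that went away and
        # the matching add_*() (with timeout=0 and the caller as sender) for
        # everything that was added, using the dispatch table below.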
        # stupid wrappers to convert rich rule string to rich rule object
        from firewall.core.rich import Rich_Rule
        def add_rule_wrapper(policy, rule_str, timeout=0, sender=None):
            self.add_rule(policy, Rich_Rule(rule_str=rule_str), timeout=0, sender=sender)
        def remove_rule_wrapper(policy, rule_str):
            self.remove_rule(policy, Rich_Rule(rule_str=rule_str))

        setting_to_fn = {
            "services": (self.add_service, self.remove_service),
            "ports": (self.add_port, self.remove_port),
            "icmp_blocks": (self.add_icmp_block, self.remove_icmp_block),
            "masquerade": (self.add_masquerade, self.remove_masquerade),
            "forward_ports": (self.add_forward_port, self.remove_forward_port),
            "rich_rules": (add_rule_wrapper, remove_rule_wrapper),
            "protocols": (self.add_protocol, self.remove_protocol),
            "source_ports": (self.add_source_port, self.remove_source_port),
            "ingress_zones": (self.add_ingress_zone, self.remove_ingress_zone),
            "egress_zones": (self.add_egress_zone, self.remove_egress_zone),
        }

        old_settings = self.get_config_with_settings_dict(policy)
        (add_settings, remove_settings) = self._fw.get_added_and_removed_settings(old_settings, settings)

        for key in remove_settings:
            if isinstance(remove_settings[key], list):
                for args in remove_settings[key]:
                    if isinstance(args, tuple):
                        setting_to_fn[key][1](policy, *args)
                    else:
                        setting_to_fn[key][1](policy, args)
            else: # bool
                setting_to_fn[key][1](policy)

        for key in add_settings:
            if isinstance(add_settings[key], list):
                for args in add_settings[key]:
                    if isinstance(args, tuple):
                        setting_to_fn[key][0](policy, *args, timeout=0, sender=sender)
                    else:
                        setting_to_fn[key][0](policy, args, timeout=0, sender=sender)
            else: # bool
                setting_to_fn[key][0](policy, timeout=0, sender=sender)

    # ingress zones

    def check_ingress_zone(self, zone):
        if not zone:
            raise FirewallError(errors.INVALID_ZONE)
        if zone not in ["HOST", "ANY"]:
            self._fw.check_zone(zone)

    def __ingress_zone_id(self, zone):
        self.check_ingress_zone(zone)
        return zone

    def add_ingress_zone(self, policy, zone, timeout=0, sender=None,
                         use_transaction=None, allow_apply=True):
        _policy = self._fw.check_policy(policy)
        self._fw.check_timeout(timeout)
        self._fw.check_panic()
        _obj = self._policies[_policy]

        zone_id = self.__ingress_zone_id(zone)
        if zone_id in _obj.settings["ingress_zones"]:
            raise FirewallError(errors.ALREADY_ENABLED,
                                "'%s' already in '%s'" % (zone, _policy))

        if "ANY" in _obj.settings["ingress_zones"] or "HOST" in _obj.settings["ingress_zones"] or \
           zone in ["ANY", "HOST"] and _obj.settings["ingress_zones"]:
            raise FirewallError(errors.INVALID_ZONE, "'ingress-zones' may only contain one of: many regular zones, ANY, or HOST")

        if zone == "HOST" and "HOST" in _obj.settings["egress_zones"]:
            raise FirewallError(errors.INVALID_ZONE, "'HOST' can only appear in either ingress or egress zones, but not both")
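        # Net effect of the checks above: the ingress-zone set is either a set
        # of regular zones or exactly one of "ANY"/"HOST", and "HOST" may not
        # appear on both the ingress and the egress side of the same policy.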

        if use_transaction is None:
            transaction = self.new_transaction()
        else:
            transaction = use_transaction

        if allow_apply:
            if _obj.applied:
                self._ingress_egress_zones(False, _policy, transaction)

            # register early so backends can access updated zone list
            self.__register_ingress_zone(_obj, zone_id, timeout, sender)
            transaction.add_fail(self.__unregister_ingress_zone, _obj, zone_id)

            if not _obj.applied:
                if _policy in self.get_active_policies_not_derived_from_zone():
                    self.apply_policy_settings(_policy, use_transaction=transaction)
                    transaction.add_fail(self.set_policy_applied, _policy, False)
            else:
                self._ingress_egress_zones(True, _policy, transaction)
        else:
            self.__register_ingress_zone(_obj, zone_id, timeout, sender)
            transaction.add_fail(self.__unregister_ingress_zone, _obj, zone_id)

        if use_transaction is None:
            transaction.execute(True)

    def __register_ingress_zone(self, _obj, zone_id, timeout, sender):
        _obj.settings["ingress_zones"][zone_id] = self.__gen_settings(timeout, sender)

    def remove_ingress_zone(self, policy, zone, use_transaction=None):
        _policy = self._fw.check_policy(policy)
        self._fw.check_panic()
        _obj = self._policies[_policy]

        zone_id = self.__ingress_zone_id(zone)
        if zone_id not in _obj.settings["ingress_zones"]:
            raise FirewallError(errors.NOT_ENABLED,
                                "'%s' not in '%s'" % (zone, _policy))

        if use_transaction is None:
            transaction = self.new_transaction()
        else:
            transaction = use_transaction

        if _obj.applied:
            if len(_obj.settings["ingress_zones"]) == 1:
                self.unapply_policy_settings(_policy, transaction)
            else:
                self._ingress_egress_zones(False, _policy, transaction)

            # unregister early so backends have updated zone list
            self.__unregister_ingress_zone(_obj, zone_id)
            transaction.add_fail(self.__register_ingress_zone, _obj, zone_id, None, None)

            if _policy in self.get_active_policies_not_derived_from_zone():
                self._ingress_egress_zones(True, _policy, transaction)
        else:
            transaction.add_post(self.__unregister_ingress_zone, _obj, zone_id)

        if use_transaction is None:
            transaction.execute(True)

        return _policy

    def __unregister_ingress_zone(self, _obj, zone_id):
        if zone_id in _obj.settings["ingress_zones"]:
            del _obj.settings["ingress_zones"][zone_id]

    def query_ingress_zone(self, policy, zone):
        return self.__ingress_zone_id(zone) in self.get_settings(policy)["ingress_zones"]

    def list_ingress_zones(self, policy):
        return list(self.get_settings(policy)["ingress_zones"].keys())

    # egress zones

    def check_egress_zone(self, zone):
        if not zone:
            raise FirewallError(errors.INVALID_ZONE)
        if zone not in ["HOST", "ANY"]:
            self._fw.check_zone(zone)

    def __egress_zone_id(self, zone):
        self.check_egress_zone(zone)
        return zone

    def add_egress_zone(self, policy, zone, timeout=0, sender=None,
                         use_transaction=None, allow_apply=True):
        _policy = self._fw.check_policy(policy)
        self._fw.check_timeout(timeout)
        self._fw.check_panic()
        _obj = self._policies[_policy]

        zone_id = self.__egress_zone_id(zone)
        if zone_id in _obj.settings["egress_zones"]:
            raise FirewallError(errors.ALREADY_ENABLED,
                                "'%s' already in '%s'" % (zone, _policy))

        if "ANY" in _obj.settings["egress_zones"] or "HOST" in _obj.settings["egress_zones"] or \
           zone in ["ANY", "HOST"] and _obj.settings["egress_zones"]:
            raise FirewallError(errors.INVALID_ZONE, "'egress-zones' may only contain one of: many regular zones, ANY, or HOST")

        if zone == "HOST" and "HOST" in _obj.settings["ingress_zones"]:
            raise FirewallError(errors.INVALID_ZONE, "'HOST' can only appear in either ingress or egress zones, but not both")

        if use_transaction is None:
            transaction = self.new_transaction()
        else:
            transaction = use_transaction

        if allow_apply:
            if _obj.applied:
                self._ingress_egress_zones(False, _policy, transaction)

            # register early so backends can access updated zone list
            self.__register_egress_zone(_obj, zone_id, timeout, sender)
            transaction.add_fail(self.__unregister_egress_zone, _obj, zone_id)

            if not _obj.applied:
                if _policy in self.get_active_policies_not_derived_from_zone():
                    self.apply_policy_settings(_policy, use_transaction=transaction)
                    transaction.add_fail(self.set_policy_applied, _policy, False)
            else:
                self._ingress_egress_zones(True, _policy, transaction)
        else:
            self.__register_egress_zone(_obj, zone_id, timeout, sender)
            transaction.add_fail(self.__unregister_egress_zone, _obj, zone_id)

        if use_transaction is None:
            transaction.execute(True)

    def __register_egress_zone(self, _obj, zone_id, timeout, sender):
        _obj.settings["egress_zones"][zone_id] = self.__gen_settings(timeout, sender)

    def remove_egress_zone(self, policy, zone, use_transaction=None):
        _policy = self._fw.check_policy(policy)
        self._fw.check_panic()
        _obj = self._policies[_policy]

        zone_id = self.__egress_zone_id(zone)
        if zone_id not in _obj.settings["egress_zones"]:
            raise FirewallError(errors.NOT_ENABLED,
                                "'%s' not in '%s'" % (zone, _policy))

        if use_transaction is None:
            transaction = self.new_transaction()
        else:
            transaction = use_transaction

        if _obj.applied:
            if len(_obj.settings["egress_zones"]) == 1:
                self.unapply_policy_settings(_policy, transaction)
            else:
                self._ingress_egress_zones(False, _policy, transaction)

            # unregister early so backends have updated zone list
            self.__unregister_egress_zone(_obj, zone_id)
            transaction.add_fail(self.__register_egress_zone, _obj, zone_id, None, None)

            if _policy in self.get_active_policies_not_derived_from_zone():
                self._ingress_egress_zones(True, _policy, transaction)
        else:
            transaction.add_post(self.__unregister_egress_zone, _obj, zone_id)

        if use_transaction is None:
            transaction.execute(True)

        return _policy

    def __unregister_egress_zone(self, _obj, zone_id):
        if zone_id in _obj.settings["egress_zones"]:
            del _obj.settings["egress_zones"][zone_id]

    def query_egress_zone(self, policy, zone):
        return self.__egress_zone_id(zone) in self.get_settings(policy)["egress_zones"]

    def list_egress_zones(self, policy):
        return list(self.get_settings(policy)["egress_zones"].keys())

    # RICH LANGUAGE

    def check_rule(self, rule):
        rule.check()

    def __rule_id(self, rule):
        self.check_rule(rule)
        return str(rule)

    def _rule_source_ipv(self, source):
        if not source:
            return None

        if source.addr:
            if checkIPnMask(source.addr):
                return "ipv4"
            elif checkIP6nMask(source.addr):
                return "ipv6"
        elif hasattr(source, "mac") and source.mac:
            return ""
        elif hasattr(source, "ipset") and source.ipset:
            self._check_ipset_type_for_source(source.ipset)
            self._check_ipset_applied(source.ipset)
            return self._ipset_family(source.ipset)

        return None

    def __rule(self, enable, policy, rule, transaction):
        self._rule_prepare(enable, policy, rule, transaction)

    def add_rule(self, policy, rule, timeout=0, sender=None,
                 use_transaction=None):
        _policy = self._fw.check_policy(policy)
        self._fw.check_timeout(timeout)
        self._fw.check_panic()
        _obj = self._policies[_policy]

        rule_id = self.__rule_id(rule)
        if rule_id in _obj.settings["rules"]:
            _name = _obj.derived_from_zone if _obj.derived_from_zone else _policy
            raise FirewallError(errors.ALREADY_ENABLED,
                                "'%s' already in '%s'" % (rule, _name))

        if not _obj.derived_from_zone:
            if rule.element and isinstance(rule.element, Rich_Masquerade):
                if "HOST" in _obj.settings["egress_zones"]:
                    raise FirewallError(errors.INVALID_ZONE, "'masquerade' is invalid for egress zone 'HOST'")
                if "HOST" in _obj.settings["ingress_zones"]:
                    raise FirewallError(errors.INVALID_ZONE, "'masquerade' is invalid for ingress zone 'HOST'")
                for zone in _obj.settings["ingress_zones"]:
                    if zone == "ANY":
                        continue
                    if self._fw.zone.list_interfaces(zone):
                        raise FirewallError(errors.INVALID_ZONE, "'masquerade' cannot be used in a policy if an ingress zone has assigned interfaces")
            if rule.element and isinstance(rule.element, Rich_ForwardPort):
                if "HOST" in _obj.settings["egress_zones"]:
                    if rule.element.to_address:
                        raise FirewallError(errors.INVALID_FORWARD, "A 'forward-port' with 'to-addr' is invalid for egress zone 'HOST'")
                elif _obj.settings["egress_zones"]:
                    if not rule.element.to_address:
                        raise FirewallError(errors.INVALID_FORWARD, "'forward-port' requires 'to-addr' if egress zone is 'ANY' or a zone")
                    for zone in _obj.settings["egress_zones"]:
                        if zone == "ANY":
                            continue
                        if self._fw.zone.list_interfaces(zone):
                            raise FirewallError(errors.INVALID_ZONE, "'forward-port' cannot be used in a policy if an egress zone has assigned interfaces")
            if rule.action and isinstance(rule.action, Rich_Mark):
                for zone in _obj.settings["egress_zones"]:
                    if zone in ["ANY", "HOST"]:
                        continue
                    if self._fw.zone.list_interfaces(zone):
                        raise FirewallError(errors.INVALID_ZONE, "'mark' action cannot be used in a policy if an egress zone has assigned interfaces")

        if use_transaction is None:
            transaction = self.new_transaction()
        else:
            transaction = use_transaction

        if _obj.applied:
            self.__rule(True, _policy, rule, transaction)

        self.__register_rule(_obj, rule_id, timeout, sender)
        transaction.add_fail(self.__unregister_rule, _obj, rule_id)

        if use_transaction is None:
            transaction.execute(True)

        return _policy

    def __register_rule(self, _obj, rule_id, timeout, sender):
        _obj.settings["rules"][rule_id] = self.__gen_settings(
            timeout, sender)

    def remove_rule(self, policy, rule,
                    use_transaction=None):
        _policy = self._fw.check_policy(policy)
        self._fw.check_panic()
        _obj = self._policies[_policy]

        rule_id = self.__rule_id(rule)
        if rule_id not in _obj.settings["rules"]:
            _name = _obj.derived_from_zone if _obj.derived_from_zone else _policy
            raise FirewallError(errors.NOT_ENABLED,
                                "'%s' not in '%s'" % (rule, _name))

        if use_transaction is None:
            transaction = self.new_transaction()
        else:
            transaction = use_transaction

        if _obj.applied:
            self.__rule(False, _policy, rule, transaction)

        transaction.add_post(self.__unregister_rule, _obj, rule_id)

        if use_transaction is None:
            transaction.execute(True)

        return _policy

    def __unregister_rule(self, _obj, rule_id):
        if rule_id in _obj.settings["rules"]:
            del _obj.settings["rules"][rule_id]

    def query_rule(self, policy, rule):
        return self.__rule_id(rule) in self.get_settings(policy)["rules"]

    def list_rules(self, policy):
        return list(self.get_settings(policy)["rules"].keys())

    # SERVICES

    def check_service(self, service):
        self._fw.check_service(service)

    def __service_id(self, service):
        self.check_service(service)
        return service

    def add_service(self, policy, service, timeout=0, sender=None,
                    use_transaction=None):
        _policy = self._fw.check_policy(policy)
        self._fw.check_timeout(timeout)
        self._fw.check_panic()
        _obj = self._policies[_policy]

        service_id = self.__service_id(service)
        if service_id in _obj.settings["services"]:
            _name = _obj.derived_from_zone if _obj.derived_from_zone else _policy
            raise FirewallError(errors.ALREADY_ENABLED,
                                "'%s' already in '%s'" % (service, _name))

        if use_transaction is None:
            transaction = self.new_transaction()
        else:
            transaction = use_transaction

        if _obj.applied:
            self._service(True, _policy, service, transaction)

        self.__register_service(_obj, service_id, timeout, sender)
        transaction.add_fail(self.__unregister_service, _obj, service_id)

        if use_transaction is None:
            transaction.execute(True)

        return _policy

    def __register_service(self, _obj, service_id, timeout, sender):
        _obj.settings["services"][service_id] = \
            self.__gen_settings(timeout, sender)

    def remove_service(self, policy, service,
                       use_transaction=None):
        _policy = self._fw.check_policy(policy)
        self._fw.check_panic()
        _obj = self._policies[_policy]

        service_id = self.__service_id(service)
        if service_id not in _obj.settings["services"]:
            _name = _obj.derived_from_zone if _obj.derived_from_zone else _policy
            raise FirewallError(errors.NOT_ENABLED,
                                "'%s' not in '%s'" % (service, _name))

        if use_transaction is None:
            transaction = self.new_transaction()
        else:
            transaction = use_transaction

        if _obj.applied:
            self._service(False, _policy, service, transaction)

        transaction.add_post(self.__unregister_service, _obj, service_id)

        if use_transaction is None:
            transaction.execute(True)

        return _policy

    def __unregister_service(self, _obj, service_id):
        if service_id in _obj.settings["services"]:
            del _obj.settings["services"][service_id]

    def query_service(self, policy, service):
        return self.__service_id(service) in self.get_settings(policy)["services"]

    def list_services(self, policy):
        return self.get_settings(policy)["services"].keys()

    def get_helpers_for_service_helpers(self, helpers):
        _helpers = [ ]
        for helper in helpers:
            try:
                _helper = self._fw.helper.get_helper(helper)
            except FirewallError:
                raise FirewallError(errors.INVALID_HELPER, helper)
            _helpers.append(_helper)
        return _helpers

    def get_helpers_for_service_modules(self, modules, enable):
        # If automatic helper assignment is turned off, a helper whose
        # definition has no ports is replaced by the helper named after the
        # short name of its nf_conntrack module, if such a helper is available.
        _helpers = [ ]
        for module in modules:
            try:
                helper = self._fw.helper.get_helper(module)
            except FirewallError:
                raise FirewallError(errors.INVALID_HELPER, module)
            if len(helper.ports) < 1:
                _module_short_name = get_nf_conntrack_short_name(helper.module)
                try:
                    _helper = self._fw.helper.get_helper(_module_short_name)
                    _helpers.append(_helper)
                except FirewallError:
                    if enable:
                        log.warning("Helper '%s' is not available" % _module_short_name)
                    continue
            else:
                _helpers.append(helper)
        return _helpers

    # PORTS

    def check_port(self, port, protocol):
        self._fw.check_port(port)
        self._fw.check_tcpudp(protocol)

    def __port_id(self, port, protocol):
        self.check_port(port, protocol)
        return (portStr(port, "-"), protocol)

    def add_port(self, policy, port, protocol, timeout=0, sender=None,
                 use_transaction=None):
        _policy = self._fw.check_policy(policy)
        self._fw.check_timeout(timeout)
        self._fw.check_panic()
        _obj = self._policies[_policy]

        existing_port_ids = list(filter(lambda x: x[1] == protocol, _obj.settings["ports"]))
        for port_id in existing_port_ids:
            if portInPortRange(port, port_id[0]):
                _name = _obj.derived_from_zone if _obj.derived_from_zone else _policy
                raise FirewallError(errors.ALREADY_ENABLED,
                                    "'%s:%s' already in '%s'" % (port, protocol, _name))

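        # Coalesce the new port with overlapping or adjacent existing ranges
        # for this protocol: added_ranges are the merged ranges to apply,
        # removed_ranges are old entries made redundant by the merge.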
        added_ranges, removed_ranges = coalescePortRange(port, [_port for (_port, _protocol) in existing_port_ids])

        if use_transaction is None:
            transaction = self.new_transaction()
        else:
            transaction = use_transaction

        if _obj.applied:
            for range in added_ranges:
                self._port(True, _policy, portStr(range, "-"), protocol, transaction)
            for range in removed_ranges:
                self._port(False, _policy, portStr(range, "-"), protocol, transaction)

        for range in added_ranges:
            port_id = self.__port_id(range, protocol)
            self.__register_port(_obj, port_id, timeout, sender)
            transaction.add_fail(self.__unregister_port, _obj, port_id)
        for range in removed_ranges:
            port_id = self.__port_id(range, protocol)
            transaction.add_post(self.__unregister_port, _obj, port_id)

        if use_transaction is None:
            transaction.execute(True)

        return _policy

    def __register_port(self, _obj, port_id, timeout, sender):
        _obj.settings["ports"][port_id] = \
            self.__gen_settings(timeout, sender)

    def remove_port(self, policy, port, protocol,
                    use_transaction=None):
        _policy = self._fw.check_policy(policy)
        self._fw.check_panic()
        _obj = self._policies[_policy]

        existing_port_ids = list(filter(lambda x: x[1] == protocol, _obj.settings["ports"]))
        for port_id in existing_port_ids:
            if portInPortRange(port, port_id[0]):
                break
        else:
            _name = _obj.derived_from_zone if _obj.derived_from_zone else _policy
            raise FirewallError(errors.NOT_ENABLED,
                                "'%s:%s' not in '%s'" % (port, protocol, _name))

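        # Removing a port may split an existing range: removed_ranges are the
        # ranges to drop, added_ranges are the leftover pieces to re-apply.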
        added_ranges, removed_ranges = breakPortRange(port, [_port for (_port, _protocol) in existing_port_ids])

        if use_transaction is None:
            transaction = self.new_transaction()
        else:
            transaction = use_transaction

        if _obj.applied:
            for range in added_ranges:
                self._port(True, _policy, portStr(range, "-"), protocol, transaction)
            for range in removed_ranges:
                self._port(False, _policy, portStr(range, "-"), protocol, transaction)

        for range in added_ranges:
            port_id = self.__port_id(range, protocol)
            self.__register_port(_obj, port_id, 0, None)
            transaction.add_fail(self.__unregister_port, _obj, port_id)
        for range in removed_ranges:
            port_id = self.__port_id(range, protocol)
            transaction.add_post(self.__unregister_port, _obj, port_id)

        if use_transaction is None:
            transaction.execute(True)

        return _policy

    def __unregister_port(self, _obj, port_id):
        if port_id in _obj.settings["ports"]:
            del _obj.settings["ports"][port_id]

    def query_port(self, policy, port, protocol):
        for (_port, _protocol) in self.get_settings(policy)["ports"]:
            if portInPortRange(port, _port) and protocol == _protocol:
                return True

        return False

    def list_ports(self, policy):
        return list(self.get_settings(policy)["ports"].keys())

    # PROTOCOLS

    def check_protocol(self, protocol):
        if not checkProtocol(protocol):
            raise FirewallError(errors.INVALID_PROTOCOL, protocol)

    def __protocol_id(self, protocol):
        self.check_protocol(protocol)
        return protocol

    def add_protocol(self, policy, protocol, timeout=0, sender=None,
                     use_transaction=None):
        _policy = self._fw.check_policy(policy)
        self._fw.check_timeout(timeout)
        self._fw.check_panic()
        _obj = self._policies[_policy]

        protocol_id = self.__protocol_id(protocol)
        if protocol_id in _obj.settings["protocols"]:
            _name = _obj.derived_from_zone if _obj.derived_from_zone else _policy
            raise FirewallError(errors.ALREADY_ENABLED,
                                "'%s' already in '%s'" % (protocol, _name))

        if use_transaction is None:
            transaction = self.new_transaction()
        else:
            transaction = use_transaction

        if _obj.applied:
            self._protocol(True, _policy, protocol, transaction)

        self.__register_protocol(_obj, protocol_id, timeout, sender)
        transaction.add_fail(self.__unregister_protocol, _obj, protocol_id)

        if use_transaction is None:
            transaction.execute(True)

        return _policy

    def __register_protocol(self, _obj, protocol_id, timeout, sender):
        _obj.settings["protocols"][protocol_id] = \
            self.__gen_settings(timeout, sender)

    def remove_protocol(self, policy, protocol,
                        use_transaction=None):
        _policy = self._fw.check_policy(policy)
        self._fw.check_panic()
        _obj = self._policies[_policy]

        protocol_id = self.__protocol_id(protocol)
        if protocol_id not in _obj.settings["protocols"]:
            _name = _obj.derived_from_zone if _obj.derived_from_zone else _policy
            raise FirewallError(errors.NOT_ENABLED,
                                "'%s' not in '%s'" % (protocol, _name))

        if use_transaction is None:
            transaction = self.new_transaction()
        else:
            transaction = use_transaction

        if _obj.applied:
            self._protocol(False, _policy, protocol, transaction)

        transaction.add_post(self.__unregister_protocol, _obj,
                             protocol_id)

        if use_transaction is None:
            transaction.execute(True)

        return _policy

    def __unregister_protocol(self, _obj, protocol_id):
        if protocol_id in _obj.settings["protocols"]:
            del _obj.settings["protocols"][protocol_id]

    def query_protocol(self, policy, protocol):
        return self.__protocol_id(protocol) in self.get_settings(policy)["protocols"]

    def list_protocols(self, policy):
        return list(self.get_settings(policy)["protocols"].keys())

    # SOURCE PORTS

    def __source_port_id(self, port, protocol):
        self.check_port(port, protocol)
        return (portStr(port, "-"), protocol)

    def add_source_port(self, policy, port, protocol, timeout=0, sender=None,
                        use_transaction=None):
        _policy = self._fw.check_policy(policy)
        self._fw.check_timeout(timeout)
        self._fw.check_panic()
        _obj = self._policies[_policy]

        existing_port_ids = list(filter(lambda x: x[1] == protocol, _obj.settings["source_ports"]))
        for port_id in existing_port_ids:
            if portInPortRange(port, port_id[0]):
                _name = _obj.derived_from_zone if _obj.derived_from_zone else _policy
                raise FirewallError(errors.ALREADY_ENABLED,
                                    "'%s:%s' already in '%s'" % (port, protocol, _name))

        added_ranges, removed_ranges = coalescePortRange(port, [_port for (_port, _protocol) in existing_port_ids])

        if use_transaction is None:
            transaction = self.new_transaction()
        else:
            transaction = use_transaction

        if _obj.applied:
            for range in added_ranges:
                self._source_port(True, _policy, portStr(range, "-"), protocol, transaction)
            for range in removed_ranges:
                self._source_port(False, _policy, portStr(range, "-"), protocol, transaction)

        for range in added_ranges:
            port_id = self.__source_port_id(range, protocol)
            self.__register_source_port(_obj, port_id, timeout, sender)
            transaction.add_fail(self.__unregister_source_port, _obj, port_id)
        for range in removed_ranges:
            port_id = self.__source_port_id(range, protocol)
            transaction.add_post(self.__unregister_source_port, _obj, port_id)

        if use_transaction is None:
            transaction.execute(True)

        return _policy

    def __register_source_port(self, _obj, port_id, timeout, sender):
        _obj.settings["source_ports"][port_id] = \
            self.__gen_settings(timeout, sender)

    def remove_source_port(self, policy, port, protocol,
                           use_transaction=None):
        _policy = self._fw.check_policy(policy)
        self._fw.check_panic()
        _obj = self._policies[_policy]

        existing_port_ids = list(filter(lambda x: x[1] == protocol, _obj.settings["source_ports"]))
        for port_id in existing_port_ids:
            if portInPortRange(port, port_id[0]):
                break
        else:
            _name = _obj.derived_from_zone if _obj.derived_from_zone else _policy
            raise FirewallError(errors.NOT_ENABLED,
                                "'%s:%s' not in '%s'" % (port, protocol, _name))

        added_ranges, removed_ranges = breakPortRange(port, [_port for (_port, _protocol) in existing_port_ids])

        if use_transaction is None:
            transaction = self.new_transaction()
        else:
            transaction = use_transaction

        if _obj.applied:
            for range in added_ranges:
                self._source_port(True, _policy, portStr(range, "-"), protocol, transaction)
            for range in removed_ranges:
                self._source_port(False, _policy, portStr(range, "-"), protocol, transaction)

        for range in added_ranges:
            port_id = self.__source_port_id(range, protocol)
            self.__register_source_port(_obj, port_id, 0, None)
            transaction.add_fail(self.__unregister_source_port, _obj, port_id)
        for range in removed_ranges:
            port_id = self.__source_port_id(range, protocol)
            transaction.add_post(self.__unregister_source_port, _obj, port_id)

        if use_transaction is None:
            transaction.execute(True)

        return _policy

    def __unregister_source_port(self, _obj, port_id):
        if port_id in _obj.settings["source_ports"]:
            del _obj.settings["source_ports"][port_id]

    def query_source_port(self, policy, port, protocol):
        for (_port, _protocol) in self.get_settings(policy)["source_ports"]:
            if portInPortRange(port, _port) and protocol == _protocol:
                return True

        return False

    def list_source_ports(self, policy):
        return list(self.get_settings(policy)["source_ports"].keys())

    # MASQUERADE

    def __masquerade_id(self):
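        # masquerade is a plain on/off flag; a constant id lets it share the
        # same settings bookkeeping as the other per-policy entries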
        return True

    def add_masquerade(self, policy, timeout=0, sender=None,
                       use_transaction=None):
        _policy = self._fw.check_policy(policy)
        self._fw.check_timeout(timeout)
        self._fw.check_panic()
        _obj = self._policies[_policy]

        masquerade_id = self.__masquerade_id()
        if masquerade_id in _obj.settings["masquerade"]:
            _name = _obj.derived_from_zone if _obj.derived_from_zone else _policy
            raise FirewallError(errors.ALREADY_ENABLED,
                                "masquerade already enabled in '%s'" % _name)

        if not _obj.derived_from_zone:
            if "HOST" in _obj.settings["egress_zones"]:
                raise FirewallError(errors.INVALID_ZONE, "'masquerade' is invalid for egress zone 'HOST'")
            if "HOST" in _obj.settings["ingress_zones"]:
                raise FirewallError(errors.INVALID_ZONE, "'masquerade' is invalid for ingress zone 'HOST'")
            for zone in _obj.settings["ingress_zones"]:
                if zone == "ANY":
                    continue
                if self._fw.zone.list_interfaces(zone):
                    raise FirewallError(errors.INVALID_ZONE, "'masquerade' cannot be used in a policy if an ingress zone has assigned interfaces")

        if use_transaction is None:
            transaction = self.new_transaction()
        else:
            transaction = use_transaction

        if _obj.applied:
            self._masquerade(True, _policy, transaction)

        self.__register_masquerade(_obj, masquerade_id, timeout, sender)
        transaction.add_fail(self.__unregister_masquerade, _obj, masquerade_id)

        if use_transaction is None:
            transaction.execute(True)

        return _policy

    def __register_masquerade(self, _obj, masquerade_id, timeout, sender):
        _obj.settings["masquerade"][masquerade_id] = \
            self.__gen_settings(timeout, sender)

    def remove_masquerade(self, policy, use_transaction=None):
        _policy = self._fw.check_policy(policy)
        self._fw.check_panic()
        _obj = self._policies[_policy]

        masquerade_id = self.__masquerade_id()
        if masquerade_id not in _obj.settings["masquerade"]:
            _name = _obj.derived_from_zone if _obj.derived_from_zone else _policy
            raise FirewallError(errors.NOT_ENABLED,
                                "masquerade not enabled in '%s'" % _name)

        if use_transaction is None:
            transaction = self.new_transaction()
        else:
            transaction = use_transaction

        if _obj.applied:
            self._masquerade(False, _policy, transaction)

        transaction.add_post(self.__unregister_masquerade, _obj, masquerade_id)

        if use_transaction is None:
            transaction.execute(True)

        return _policy

    def __unregister_masquerade(self, _obj, masquerade_id):
        if masquerade_id in _obj.settings["masquerade"]:
            del _obj.settings["masquerade"][masquerade_id]

    def query_masquerade(self, policy):
        return self.__masquerade_id() in self.get_settings(policy)["masquerade"]

    # PORT FORWARDING

    def check_forward_port(self, ipv, port, protocol, toport=None, toaddr=None):
        self._fw.check_port(port)
        self._fw.check_tcpudp(protocol)
        if toport:
            self._fw.check_port(toport)
        if toaddr:
            if not check_single_address(ipv, toaddr):
                raise FirewallError(errors.INVALID_ADDR, toaddr)
        if not toport and not toaddr:
            raise FirewallError(
                errors.INVALID_FORWARD,
                "port-forwarding is missing to-port AND to-addr")

    def __forward_port_id(self, port, protocol, toport=None, toaddr=None):
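        # infer the address family from to-addr (IPv6 if it parses as a single
        # IPv6 address, otherwise IPv4) before validating and building the id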
        if check_single_address("ipv6", toaddr):
            self.check_forward_port("ipv6", port, protocol, toport, toaddr)
        else:
            self.check_forward_port("ipv4", port, protocol, toport, toaddr)
        return (portStr(port, "-"), protocol,
                portStr(toport, "-"), str(toaddr))

    def add_forward_port(self, policy, port, protocol, toport=None,
                         toaddr=None, timeout=0, sender=None,
                         use_transaction=None):
        _policy = self._fw.check_policy(policy)
        self._fw.check_timeout(timeout)
        self._fw.check_panic()
        _obj = self._policies[_policy]

        forward_id = self.__forward_port_id(port, protocol, toport, toaddr)
        if forward_id in _obj.settings["forward_ports"]:
            _name = _obj.derived_from_zone if _obj.derived_from_zone else _policy
            raise FirewallError(errors.ALREADY_ENABLED,
                                "'%s:%s:%s:%s' already in '%s'" % \
                                (port, protocol, toport, toaddr, _name))

        if not _obj.derived_from_zone:
            if "HOST" in _obj.settings["egress_zones"]:
                if toaddr:
                    raise FirewallError(errors.INVALID_FORWARD, "A 'forward-port' with 'to-addr' is invalid for egress zone 'HOST'")
            elif _obj.settings["egress_zones"]:
                if not toaddr:
                    raise FirewallError(errors.INVALID_FORWARD, "'forward-port' requires 'to-addr' if egress zone is 'ANY' or a zone")
                for zone in _obj.settings["egress_zones"]:
                    if zone == "ANY":
                        continue
                    if self._fw.zone.list_interfaces(zone):
                        raise FirewallError(errors.INVALID_ZONE, "'forward-port' cannot be used in a policy if an egress zone has assigned interfaces")

        if use_transaction is None:
            transaction = self.new_transaction()
        else:
            transaction = use_transaction

        if _obj.applied:
            self._forward_port(True, _policy, transaction, port, protocol,
                               toport, toaddr)

        self.__register_forward_port(_obj, forward_id, timeout, sender)
        transaction.add_fail(self.__unregister_forward_port, _obj, forward_id)

        if use_transaction is None:
            transaction.execute(True)

        return _policy

    def __register_forward_port(self, _obj, forward_id, timeout, sender):
        _obj.settings["forward_ports"][forward_id] = \
            self.__gen_settings(timeout, sender)

    def remove_forward_port(self, policy, port, protocol, toport=None,
                            toaddr=None, use_transaction=None):
        _policy = self._fw.check_policy(policy)
        self._fw.check_panic()
        _obj = self._policies[_policy]

        forward_id = self.__forward_port_id(port, protocol, toport, toaddr)
        if forward_id not in _obj.settings["forward_ports"]:
            _name = _obj.derived_from_zone if _obj.derived_from_zone else _policy
            raise FirewallError(errors.NOT_ENABLED,
                                "'%s:%s:%s:%s' not in '%s'" % \
                                (port, protocol, toport, toaddr, _name))

        if use_transaction is None:
            transaction = self.new_transaction()
        else:
            transaction = use_transaction

        if _obj.applied:
            self._forward_port(False, _policy, transaction, port, protocol,
                               toport, toaddr)

        transaction.add_post(self.__unregister_forward_port, _obj, forward_id)

        if use_transaction is None:
            transaction.execute(True)

        return _policy

    def __unregister_forward_port(self, _obj, forward_id):
        if forward_id in _obj.settings["forward_ports"]:
            del _obj.settings["forward_ports"][forward_id]

    def query_forward_port(self, policy, port, protocol, toport=None,
                           toaddr=None):
        forward_id = self.__forward_port_id(port, protocol, toport, toaddr)
        return forward_id in self.get_settings(policy)["forward_ports"]

    def list_forward_ports(self, policy):
        return list(self.get_settings(policy)["forward_ports"].keys())

    # ICMP BLOCK

    def check_icmp_block(self, icmp):
        self._fw.check_icmptype(icmp)

    def __icmp_block_id(self, icmp):
        self.check_icmp_block(icmp)
        return icmp

    def add_icmp_block(self, policy, icmp, timeout=0, sender=None,
                       use_transaction=None):
        _policy = self._fw.check_policy(policy)
        self._fw.check_timeout(timeout)
        self._fw.check_panic()
        _obj = self._policies[_policy]

        icmp_id = self.__icmp_block_id(icmp)
        if icmp_id in _obj.settings["icmp_blocks"]:
            _name = _obj.derived_from_zone if _obj.derived_from_zone else _policy
            raise FirewallError(errors.ALREADY_ENABLED,
                                "'%s' already in '%s'" % (icmp, _name))

        if use_transaction is None:
            transaction = self.new_transaction()
        else:
            transaction = use_transaction

        if _obj.applied:
            self._icmp_block(True, _policy, icmp, transaction)

        self.__register_icmp_block(_obj, icmp_id, timeout, sender)
        transaction.add_fail(self.__unregister_icmp_block, _obj, icmp_id)

        if use_transaction is None:
            transaction.execute(True)

        return _policy

    def __register_icmp_block(self, _obj, icmp_id, timeout, sender):
        _obj.settings["icmp_blocks"][icmp_id] = \
            self.__gen_settings(timeout, sender)

    def remove_icmp_block(self, policy, icmp, use_transaction=None):
        _policy = self._fw.check_policy(policy)
        self._fw.check_panic()
        _obj = self._policies[_policy]

        icmp_id = self.__icmp_block_id(icmp)
        if icmp_id not in _obj.settings["icmp_blocks"]:
            _name = _obj.derived_from_zone if _obj.derived_from_zone else _policy
            raise FirewallError(errors.NOT_ENABLED,
                                "'%s' not in '%s'" % (icmp, _name))

        if use_transaction is None:
            transaction = self.new_transaction()
        else:
            transaction = use_transaction

        if _obj.applied:
            self._icmp_block(False, _policy, icmp, transaction)

        transaction.add_post(self.__unregister_icmp_block, _obj, icmp_id)

        if use_transaction is None:
            transaction.execute(True)

        return _policy

    def __unregister_icmp_block(self, _obj, icmp_id):
        if icmp_id in _obj.settings["icmp_blocks"]:
            del _obj.settings["icmp_blocks"][icmp_id]

    def query_icmp_block(self, policy, icmp):
        return self.__icmp_block_id(icmp) in self.get_settings(policy)["icmp_blocks"]

    def list_icmp_blocks(self, policy):
        return self.get_settings(policy)["icmp_blocks"].keys()

    # ICMP BLOCK INVERSION

    def __icmp_block_inversion_id(self):
        return True

    def add_icmp_block_inversion(self, policy, sender=None,
                                 use_transaction=None):
        _policy = self._fw.check_policy(policy)
        self._fw.check_panic()
        _obj = self._policies[_policy]

        icmp_block_inversion_id = self.__icmp_block_inversion_id()
        if icmp_block_inversion_id in _obj.settings["icmp_block_inversion"]:
            _name = _obj.derived_from_zone if _obj.derived_from_zone else _policy
            raise FirewallError(
                errors.ALREADY_ENABLED,
                "icmp-block-inversion already enabled in '%s'" % _name)

        if use_transaction is None:
            transaction = self.new_transaction()
        else:
            transaction = use_transaction

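        # The rules generated for individual icmp blocks depend on the
        # inversion state, so remove them first, register the inversion, then
        # re-apply them below.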
        if _obj.applied:
            # undo icmp blocks
            for args in self.get_settings(_policy)["icmp_blocks"]:
                self._icmp_block(False, _policy, args, transaction)

            self._icmp_block_inversion(False, _policy, transaction)

        self.__register_icmp_block_inversion(_obj, icmp_block_inversion_id,
                                             sender)
        transaction.add_fail(self.__undo_icmp_block_inversion, _policy, _obj,
                             icmp_block_inversion_id)

        # redo icmp blocks
        if _obj.applied:
            for args in self.get_settings(_policy)["icmp_blocks"]:
                self._icmp_block(True, _policy, args, transaction)

            self._icmp_block_inversion(True, _policy, transaction)

        if use_transaction is None:
            transaction.execute(True)

        return _policy

    def __register_icmp_block_inversion(self, _obj, icmp_block_inversion_id,
                                        sender):
        _obj.settings["icmp_block_inversion"][icmp_block_inversion_id] = \
            self.__gen_settings(0, sender)

    def __undo_icmp_block_inversion(self, _policy, _obj, icmp_block_inversion_id):
        transaction = self.new_transaction()

        # undo icmp blocks
        if _obj.applied:
            for args in self.get_settings(_policy)["icmp_blocks"]:
                self._icmp_block(False, _policy, args, transaction)

        if icmp_block_inversion_id in _obj.settings["icmp_block_inversion"]:
            del _obj.settings["icmp_block_inversion"][icmp_block_inversion_id]

        # redo icmp blocks
        if _obj.applied:
            for args in self.get_settings(_policy)["icmp_blocks"]:
                self._icmp_block(True, _policy, args, transaction)

        transaction.execute(True)

    def remove_icmp_block_inversion(self, policy, use_transaction=None):
        _policy = self._fw.check_policy(policy)
        self._fw.check_panic()
        _obj = self._policies[_policy]

        icmp_block_inversion_id = self.__icmp_block_inversion_id()
        if icmp_block_inversion_id not in _obj.settings["icmp_block_inversion"]:
            _name = _obj.derived_from_zone if _obj.derived_from_zone else _policy
            raise FirewallError(
                errors.NOT_ENABLED,
                "icmp-block-inversion not enabled in '%s'" % _name)

        if use_transaction is None:
            transaction = self.new_transaction()
        else:
            transaction = use_transaction

        if _obj.applied:
            # undo icmp blocks
            for args in self.get_settings(_policy)["icmp_blocks"]:
                self._icmp_block(False, _policy, args, transaction)

            self._icmp_block_inversion(False, _policy, transaction)

        self.__unregister_icmp_block_inversion(_obj,
                                               icmp_block_inversion_id)
        transaction.add_fail(self.__register_icmp_block_inversion, _obj,
                             icmp_block_inversion_id, None)

        # redo icmp blocks
        if _obj.applied:
            for args in self.get_settings(_policy)["icmp_blocks"]:
                self._icmp_block(True, _policy, args, transaction)

            self._icmp_block_inversion(True, _policy, transaction)

        if use_transaction is None:
            transaction.execute(True)

        return _policy

    def __unregister_icmp_block_inversion(self, _obj, icmp_block_inversion_id):
        if icmp_block_inversion_id in _obj.settings["icmp_block_inversion"]:
            del _obj.settings["icmp_block_inversion"][icmp_block_inversion_id]

    def query_icmp_block_inversion(self, policy):
        return self.__icmp_block_inversion_id() in \
            self.get_settings(policy)["icmp_block_inversion"]

    def gen_chain_rules(self, policy, create, table, chain, transaction):
        obj = self._fw.policy.get_policy(policy)
        if obj.derived_from_zone:
            # For policies derived from zones, use only the first policy in the
            # list to track chain creation. The chain names are converted to
            # zone-based names as such they're "global" for all zone derived
            # policies.
            tracking_policy = self._fw.zone._zone_policies[obj.derived_from_zone][0]
        else:
            tracking_policy = policy

        if create:
            if tracking_policy in self._chains and  \
               (table, chain) in self._chains[tracking_policy]:
                return
        else:
            if tracking_policy not in self._chains or \
               (table, chain) not in self._chains[tracking_policy]:
                return

        for backend in self._fw.enabled_backends():
            if backend.policies_supported and \
               table in backend.get_available_tables():
                rules = backend.build_policy_chain_rules(create, policy, table, chain)
                transaction.add_rules(backend, rules)

        self._register_chains(tracking_policy, create, [(table, chain)])
        transaction.add_fail(self._register_chains, tracking_policy, not create, [(table, chain)])

    def _register_chains(self, policy, create, tables):
        for (table, chain) in tables:
            if create:
                self._chains.setdefault(policy, []).append((table, chain))
            else:
                self._chains[policy].remove((table, chain))
                if len(self._chains[policy]) == 0:
                    del self._chains[policy]

    # IPSETS

    def _ipset_family(self, name):
        if self._fw.ipset.get_type(name) == "hash:mac":
            return None
        return self._fw.ipset.get_family(name)

    def __ipset_type(self, name):
        return self._fw.ipset.get_type(name)

    def _ipset_match_flags(self, name, flag):
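        # repeat the flag once per ipset dimension, e.g. "src,src" for a
        # two-dimensional set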
        return ",".join([flag] * self._fw.ipset.get_dimension(name))

    def _check_ipset_applied(self, name):
        return self._fw.ipset.check_applied(name)

    def _check_ipset_type_for_source(self, name):
        _type = self.__ipset_type(name)
        if _type not in SOURCE_IPSET_TYPES:
            raise FirewallError(
                errors.INVALID_IPSET,
                "ipset '%s' with type '%s' not usable as source" % \
                (name, _type))

    def _rule_prepare(self, enable, policy, rule, transaction, included_services=None):
        # First apply any services this service may include
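        # (included_services tracks already-expanded services so circular
        # includes do not recurse forever)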
        if type(rule.element) == Rich_Service:
            svc = self._fw.service.get_service(rule.element.name)
            if included_services is None:
                included_services = [rule.element.name]
            for include in svc.includes:
                if include in included_services:
                    continue
                self.check_service(include)
                included_services.append(include)
                _rule = copy.deepcopy(rule)
                _rule.element.name = include
                self._rule_prepare(enable, policy, _rule, transaction, included_services=included_services)

        ipvs = []
        if rule.family:
            ipvs = [ rule.family ]
        elif rule.element and (isinstance(rule.element, Rich_IcmpBlock) or isinstance(rule.element, Rich_IcmpType)):
            ict = self._fw.config.get_icmptype(rule.element.name)
            if ict.destination:
                ipvs = [ipv for ipv in ["ipv4", "ipv6"] if ipv in ict.destination]

        source_ipv = self._rule_source_ipv(rule.source)
        if source_ipv:
            if rule.family:
                # rule family is defined by user, no way to change it
                if rule.family != source_ipv:
                    raise FirewallError(errors.INVALID_RULE,
                                        "Source address family '%s' conflicts with rule family '%s'." % (source_ipv, rule.family))
            else:
                # use the source family as rule family
                ipvs = [ source_ipv ]

        if not ipvs:
            ipvs = ["ipv4", "ipv6"]

        # clamp ipvs to those that are actually enabled.
        ipvs = [ipv for ipv in ipvs if self._fw.is_ipv_enabled(ipv)]

        # add an element to object to allow backends to know what ipvs this applies to
        rule.ipvs = ipvs

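        # deduplicate backends: a single backend (e.g. nftables) may handle
        # both ipv4 and ipv6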
        for backend in set([self._fw.get_backend_by_ipv(x) for x in ipvs]):
            # SERVICE
            if type(rule.element) == Rich_Service:
                svc = self._fw.service.get_service(rule.element.name)

                destinations = []
                if len(svc.destination) > 0:
                    if rule.destination:
                        # we can not use two destinations at the same time
                        raise FirewallError(errors.INVALID_RULE,
                                            "Destination conflict with service.")
                    for ipv in ipvs:
                        if ipv in svc.destination and backend.is_ipv_supported(ipv):
                            destinations.append(svc.destination[ipv])
                else:
                    # dummy for the following for loop
                    destinations.append(None)

                for destination in destinations:
                    if type(rule.action) == Rich_Accept:
                        # only load modules for accept action
                        helpers = self.get_helpers_for_service_modules(svc.modules,
                                                                       enable)
                        helpers += self.get_helpers_for_service_helpers(svc.helpers)
                        helpers = sorted(set(helpers), key=lambda x: x.name)

                        modules = [ ]
                        for helper in helpers:
                            module = helper.module
                            _module_short_name = get_nf_conntrack_short_name(module)
                            nat_module = module.replace("conntrack", "nat")
                            modules.append(nat_module)
                            if helper.family != "" and not backend.is_ipv_supported(helper.family):
                                # no support for family ipv, continue
                                continue
                            if len(helper.ports) < 1:
                                modules.append(module)
                            else:
                                for (port,proto) in helper.ports:
                                    rules = backend.build_policy_helper_ports_rules(
                                                    enable, policy, proto, port,
                                                    destination, helper.name, _module_short_name)
                                    transaction.add_rules(backend, rules)
                        transaction.add_modules(modules)

                    # create rules
                    for (port,proto) in svc.ports:
                        rules = backend.build_policy_ports_rules(
                                    enable, policy, proto, port, destination, rule)
                        transaction.add_rules(backend, rules)

                    for proto in svc.protocols:
                        rules = backend.build_policy_protocol_rules(
                                    enable, policy, proto, destination, rule)
                        transaction.add_rules(backend, rules)

                    # create rules
                    for (port,proto) in svc.source_ports:
                        rules = backend.build_policy_source_ports_rules(
                                    enable, policy, proto, port, destination, rule)
                        transaction.add_rules(backend, rules)

            # PORT
            elif type(rule.element) == Rich_Port:
                port = rule.element.port
                protocol = rule.element.protocol
                self.check_port(port, protocol)

                rules = backend.build_policy_ports_rules(
                            enable, policy, protocol, port, None, rule)
                transaction.add_rules(backend, rules)

            # PROTOCOL
            elif type(rule.element) == Rich_Protocol:
                protocol = rule.element.value
                self.check_protocol(protocol)

                rules = backend.build_policy_protocol_rules(
                            enable, policy, protocol, None, rule)
                transaction.add_rules(backend, rules)

            # MASQUERADE
            elif type(rule.element) == Rich_Masquerade:
                if enable:
                    for ipv in ipvs:
                        if backend.is_ipv_supported(ipv):
                            transaction.add_post(enable_ip_forwarding, ipv)

                rules = backend.build_policy_masquerade_rules(enable, policy, rule)
                transaction.add_rules(backend, rules)

            # FORWARD PORT
            elif type(rule.element) == Rich_ForwardPort:
                port = rule.element.port
                protocol = rule.element.protocol
                toport = rule.element.to_port
                toaddr = rule.element.to_address
                for ipv in ipvs:
                    if backend.is_ipv_supported(ipv):
                        self.check_forward_port(ipv, port, protocol, toport, toaddr)
                    if toaddr and enable:
                        transaction.add_post(enable_ip_forwarding, ipv)

                rules = backend.build_policy_forward_port_rules(
                                    enable, policy, port, protocol, toport,
                                    toaddr, rule)
                transaction.add_rules(backend, rules)

            # SOURCE PORT
            elif type(rule.element) == Rich_SourcePort:
                port = rule.element.port
                protocol = rule.element.protocol
                self.check_port(port, protocol)

                rules = backend.build_policy_source_ports_rules(
                            enable, policy, protocol, port, None, rule)
                transaction.add_rules(backend, rules)

            # ICMP BLOCK and ICMP TYPE
            elif type(rule.element) == Rich_IcmpBlock or \
                 type(rule.element) == Rich_IcmpType:
                ict = self._fw.config.get_icmptype(rule.element.name)

                if rule.family and ict.destination and \
                   rule.family not in ict.destination:
                    raise FirewallError(errors.INVALID_ICMPTYPE,
                                        "rich rule family '%s' conflicts with icmp type '%s'" % \
                                        (rule.family, rule.element.name))

                if type(rule.element) == Rich_IcmpBlock and \
                   rule.action and type(rule.action) == Rich_Accept:
                    # icmp block might have reject or drop action, but not accept
                    raise FirewallError(errors.INVALID_RULE,
                                        "IcmpBlock not usable with accept action")

                rules = backend.build_policy_icmp_block_rules(enable, policy, ict, rule)
                transaction.add_rules(backend, rules)

            elif rule.element is None:
                rules = backend.build_policy_rich_source_destination_rules(
                            enable, policy, rule)
                transaction.add_rules(backend, rules)

            # EVERYTHING ELSE
            else:
                raise FirewallError(errors.INVALID_RULE, "Unknown element %s" %
                                    type(rule.element))

    def _service(self, enable, policy, service, transaction, included_services=None):
        svc = self._fw.service.get_service(service)
        helpers = self.get_helpers_for_service_modules(svc.modules, enable)
        helpers += self.get_helpers_for_service_helpers(svc.helpers)
        helpers = sorted(set(helpers), key=lambda x: x.name)

        # First apply any services this service may include
        if included_services is None:
            included_services = [service]
        for include in svc.includes:
            if include in included_services:
                continue
            self.check_service(include)
            included_services.append(include)
            self._service(enable, policy, include, transaction, included_services=included_services)

        # build a list of (backend, destination). The destination is an IPv4
        # or IPv6 address from the service definition, or None if the service
        # does not define destinations.
        #
        backends_ipv = []
        for ipv in ["ipv4", "ipv6"]:
            if not self._fw.is_ipv_enabled(ipv):
                continue
            backend = self._fw.get_backend_by_ipv(ipv)
            if len(svc.destination) > 0:
                if ipv in svc.destination:
                    backends_ipv.append((backend, svc.destination[ipv]))
            else:
                if (backend, None) not in backends_ipv:
                    backends_ipv.append((backend, None))

        for (backend,destination) in backends_ipv:
            for helper in helpers:
                module = helper.module
                _module_short_name = get_nf_conntrack_short_name(module)
                nat_module = helper.module.replace("conntrack", "nat")
                transaction.add_module(nat_module)
                if helper.family != "" and not backend.is_ipv_supported(helper.family):
                    # no support for family ipv, continue
                    continue
                if len(helper.ports) < 1:
                    transaction.add_module(module)
                else:
                    for (port,proto) in helper.ports:
                        rules = backend.build_policy_helper_ports_rules(
                                        enable, policy, proto, port,
                                        destination, helper.name, _module_short_name)
                        transaction.add_rules(backend, rules)

            for (port,proto) in svc.ports:
                rules = backend.build_policy_ports_rules(enable, policy, proto,
                                                       port, destination)
                transaction.add_rules(backend, rules)

            for protocol in svc.protocols:
                rules = backend.build_policy_protocol_rules(
                                    enable, policy, protocol, destination)
                transaction.add_rules(backend, rules)

            for (port,proto) in svc.source_ports:
                rules = backend.build_policy_source_ports_rules(
                                    enable, policy, proto, port, destination)
                transaction.add_rules(backend, rules)

    def _port(self, enable, policy, port, protocol, transaction):
        for backend in self._fw.enabled_backends():
            if not backend.policies_supported:
                continue

            rules = backend.build_policy_ports_rules(enable, policy, protocol,
                                                   port)
            transaction.add_rules(backend, rules)

    def _protocol(self, enable, policy, protocol, transaction):
        for backend in self._fw.enabled_backends():
            if not backend.policies_supported:
                continue

            rules = backend.build_policy_protocol_rules(enable, policy, protocol)
            transaction.add_rules(backend, rules)

    def _source_port(self, enable, policy, port, protocol, transaction):
        for backend in self._fw.enabled_backends():
            if not backend.policies_supported:
                continue

            rules = backend.build_policy_source_ports_rules(enable, policy, protocol, port)
            transaction.add_rules(backend, rules)

    def _masquerade(self, enable, policy, transaction):
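        # masquerade is applied for IPv4 only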
        ipv = "ipv4"
        transaction.add_post(enable_ip_forwarding, ipv)

        backend = self._fw.get_backend_by_ipv(ipv)
        rules = backend.build_policy_masquerade_rules(enable, policy)
        transaction.add_rules(backend, rules)

    def _forward_port(self, enable, policy, transaction, port, protocol,
                       toport=None, toaddr=None):
        if check_single_address("ipv6", toaddr):
            ipv = "ipv6"
        else:
            ipv = "ipv4"

        if toaddr and enable:
            transaction.add_post(enable_ip_forwarding, ipv)
        backend = self._fw.get_backend_by_ipv(ipv)
        rules = backend.build_policy_forward_port_rules(
                            enable, policy, port, protocol, toport,
                            toaddr)
        transaction.add_rules(backend, rules)

    def _icmp_block(self, enable, policy, icmp, transaction):
        ict = self._fw.config.get_icmptype(icmp)

        for backend in self._fw.enabled_backends():
            if not backend.policies_supported:
                continue
            skip_backend = False

            if ict.destination:
                for ipv in ["ipv4", "ipv6"]:
                    if ipv in ict.destination:
                        if not backend.is_ipv_supported(ipv):
                            skip_backend = True
                            break

            if skip_backend:
                continue

            rules = backend.build_policy_icmp_block_rules(enable, policy, ict)
            transaction.add_rules(backend, rules)

    def _icmp_block_inversion(self, enable, policy, transaction):
        target = self._policies[policy].target

        # Do not add general icmp accept rules into a trusted, block or drop
        # policy.
        if target in [ "DROP", "%%REJECT%%", "REJECT" ]:
            return
        if not self.query_icmp_block_inversion(policy) and target == "ACCEPT":
            # ibi target and policy target are ACCEPT, no need to add an extra
            # rule
            return

        for backend in self._fw.enabled_backends():
            if not backend.policies_supported:
                continue

            rules = backend.build_policy_icmp_block_inversion_rules(enable, policy)
            transaction.add_rules(backend, rules)

    def check_ingress_egress(self, policy, ingress_zones, egress_zones,
                                           ingress_interfaces, egress_interfaces,
                                           ingress_sources, egress_sources):
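        # a policy needs a usable ingress and egress side, and ANY/HOST may not
        # be combined with regular zones on either side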
        for zone in ingress_zones:
            self.check_ingress_zone(zone)
        for zone in egress_zones:
            self.check_egress_zone(zone)

        if ("ANY" in ingress_zones or "HOST" in ingress_zones) and \
           len(ingress_zones) > 1:
            raise FirewallError(errors.INVALID_ZONE, "'ingress-zones' may only contain one of: many regular zones, ANY, or HOST")

        if ("ANY" in egress_zones or "HOST" in egress_zones) and \
           len(egress_zones) > 1:
            raise FirewallError(errors.INVALID_ZONE, "'egress-zones' may only contain one of: many regular zones, ANY, or HOST")

        if (egress_interfaces or egress_sources) and \
           not ingress_interfaces and not ingress_sources and \
           "HOST" not in ingress_zones and "ANY" not in ingress_zones:
            raise FirewallError(errors.INVALID_ZONE, "policy \"%s\" has no ingress" % (policy))

        if (ingress_interfaces or ingress_sources) and \
           not egress_interfaces and not egress_sources and \
           "HOST" not in egress_zones and "ANY" not in egress_zones:
            raise FirewallError(errors.INVALID_ZONE, "policy \"%s\" has no egress" % (policy))

    def check_ingress_egress_chain(self, policy, table, chain,
                                   ingress_zones, egress_zones,
                                   ingress_interfaces, egress_interfaces,
                                   ingress_sources, egress_sources):
        if chain == "PREROUTING":
            # raw,prerouting is used for conntrack helpers (services), so we
            # need to allow it if egress-zones contains an actual zone
            if table != "raw":
                if egress_interfaces:
                    raise FirewallError(errors.INVALID_ZONE, "policy \"%s\" egress-zones may not include a zone with added interfaces." % (policy))
        elif chain == "POSTROUTING":
            if "HOST" in ingress_zones:
                raise FirewallError(errors.INVALID_ZONE, "policy \"%s\" ingress-zones may not include HOST." % (policy))
            if "HOST" in egress_zones:
                raise FirewallError(errors.INVALID_ZONE, "policy \"%s\" egress-zones may not include HOST." % (policy))
            if ingress_interfaces:
                raise FirewallError(errors.INVALID_ZONE, "policy \"%s\" ingress-zones may not include a zone with added interfaces." % (policy))
        elif chain == "FORWARD":
            if "HOST" in ingress_zones:
                raise FirewallError(errors.INVALID_ZONE, "policy \"%s\" ingress-zones may not include HOST." % (policy))
            if "HOST" in egress_zones:
                raise FirewallError(errors.INVALID_ZONE, "policy \"%s\" egress-zones may not include HOST." % (policy))
        elif chain == "INPUT":
            if "HOST" not in egress_zones:
                raise FirewallError(errors.INVALID_ZONE, "policy \"%s\" egress-zones must include only HOST." % (policy))
        elif chain == "OUTPUT":
            if "HOST" not in ingress_zones:
                raise FirewallError(errors.INVALID_ZONE, "policy \"%s\" ingress-zones must include only HOST." % (policy))

    def _ingress_egress_zones_transaction(self, enable, policy):
        transaction = self.new_transaction()
        self._ingress_egress_zones(enable, policy, transaction)
        transaction.execute(True)

    def _ingress_egress_zones(self, enable, policy, transaction):
        obj = self._policies[policy]

        ingress_zones = obj.settings["ingress_zones"]
        egress_zones = obj.settings["egress_zones"]

        ingress_interfaces = set()
        egress_interfaces = set()
        ingress_sources = set()
        egress_sources = set()

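        # collect interfaces and sources from the concrete zones; ANY and HOST
        # have no bindings of their own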
        for zone in ingress_zones:
            if zone in ["ANY", "HOST"]:
                continue
            ingress_interfaces |= set(self._fw.zone.list_interfaces(zone))
            ingress_sources |= set(self._fw.zone.list_sources(zone))
        for zone in egress_zones:
            if zone in ["ANY", "HOST"]:
                continue
            egress_interfaces |= set(self._fw.zone.list_interfaces(zone))
            egress_sources |= set(self._fw.zone.list_sources(zone))

        self.check_ingress_egress(policy, ingress_zones, egress_zones,
                                          ingress_interfaces, egress_interfaces,
                                          ingress_sources, egress_sources)

        for backend in self._fw.enabled_backends():
            if not backend.policies_supported:
                continue

            for (table, chain) in self._get_table_chains_for_policy_dispatch(policy):
                self.check_ingress_egress_chain(policy, table, chain,
                                                ingress_zones, egress_zones,
                                                ingress_interfaces, egress_interfaces,
                                                ingress_sources, egress_sources)
                rules = backend.build_policy_ingress_egress_rules(enable, policy, table, chain,
                                                                  ingress_interfaces, egress_interfaces,
                                                                  ingress_sources, egress_sources)
                transaction.add_rules(backend, rules)

    def _get_table_chains_for_policy_dispatch(self, policy):
        """Create a list of (table, chain) needed for policy dispatch"""
        obj = self._policies[policy]
        if "ANY" in obj.settings["ingress_zones"] and "HOST" in obj.settings["egress_zones"]:
            # any --> HOST
            tc = [("filter", "INPUT"), ("nat", "PREROUTING"),
                  ("mangle", "PREROUTING")]
            # iptables backend needs to put conntrack helper rules in raw
            # prerouting.
            if not self._fw.nftables_enabled:
                tc.append(("raw", "PREROUTING"))
            return tc
        elif "HOST" in obj.settings["egress_zones"]:
            # zone --> HOST
            tc = [("filter", "INPUT")]
            # iptables backend needs to put conntrack helper rules in raw
            # prerouting.
            if not self._fw.nftables_enabled:
                tc.append(("raw", "PREROUTING"))
            return tc
        elif "HOST" in obj.settings["ingress_zones"]:
            # HOST --> zone/any
            return [("filter", "OUTPUT")]
        elif "ANY" in obj.settings["ingress_zones"] and "ANY" in obj.settings["egress_zones"]:
            # any --> any
            tc = [("filter", "FORWARD"), ("nat", "PREROUTING"),
                  ("nat", "POSTROUTING"), ("mangle", "PREROUTING")]
            # iptables backend needs to put conntrack helper rules in raw
            # prerouting.
            if not self._fw.nftables_enabled:
                tc.append(("raw", "PREROUTING"))
            return tc
        elif "ANY" in obj.settings["egress_zones"]:
            # zone --> any
            tc = [("filter", "FORWARD"), ("nat", "PREROUTING"),
                  ("mangle", "PREROUTING")]
            # iptables backend needs to put conntrack helper rules in raw
            # prerouting.
            if not self._fw.nftables_enabled:
                tc.append(("raw", "PREROUTING"))
            for zone in obj.settings["ingress_zones"]:
                if self._fw.zone.get_settings(zone)["interfaces"]:
                    break
            else:
                tc.append(("nat", "POSTROUTING"))
            return tc
        elif "ANY" in obj.settings["ingress_zones"]:
            # any --> zone
            tc = [("filter", "FORWARD"), ("nat", "POSTROUTING")]
            # iptables backend needs to put conntrack helper rules in raw
            # prerouting.
            if not self._fw.nftables_enabled:
                tc.append(("raw", "PREROUTING"))
            for zone in obj.settings["egress_zones"]:
                if self._fw.zone.get_settings(zone)["interfaces"]:
                    break
            else:
                tc.append(("nat", "PREROUTING"))
                tc.append(("mangle", "PREROUTING"))
            return tc
        else:
            # zone -> zone
            tc = [("filter", "FORWARD")]
            # iptables backend needs to put conntrack helper rules in raw
            # prerouting.
            if not self._fw.nftables_enabled:
                tc.append(("raw", "PREROUTING"))
            for zone in obj.settings["ingress_zones"]:
                if self._fw.zone.get_settings(zone)["interfaces"]:
                    break
            else:
                tc.append(("nat", "POSTROUTING"))
            for zone in obj.settings["egress_zones"]:
                if self._fw.zone.get_settings(zone)["interfaces"]:
                    break
            else:
                tc.append(("nat", "PREROUTING"))
                tc.append(("mangle", "PREROUTING"))
            return tc

    def _get_table_chains_for_zone_dispatch(self, policy):
        """Create a list of (table, chain) needed for zone dispatch"""
        obj = self._policies[policy]
        if "HOST" in obj.settings["egress_zones"]:
            # zone --> Host
            tc = [("filter", "INPUT")]
            # iptables backend needs to put conntrack helper rules in raw
            # prerouting.
            if not self._fw.nftables_enabled:
                tc.append(("raw", "PREROUTING"))
            return tc
        elif "ANY" in obj.settings["egress_zones"]:
            # zone --> any
            return [("filter", "FORWARD_IN"), ("nat", "PREROUTING"),
                    ("mangle", "PREROUTING")]
        elif "ANY" in obj.settings["ingress_zones"]:
            # any --> zone
            return [("filter", "FORWARD_OUT"), ("nat", "POSTROUTING")]
        else:
            return FirewallError("Invalid policy: %s" % (policy))

    def policy_base_chain_name(self, policy, table, policy_prefix, isSNAT=False):
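        """Return the name of the policy's base chain in the given table.

        Zone derived policies keep the zone name as the suffix (e.g.
        "IN_public"); real policies use policy_prefix + policy name. The
        direction prefix (IN_/OUT_/FWD_/FWDI_/FWDO_/PRE_/POST_) is derived
        from the policy's ingress/egress zones, the table and, for nat,
        whether the chain is used for SNAT.
        """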
        obj = self._fw.policy.get_policy(policy)
        if obj.derived_from_zone:
            suffix = obj.derived_from_zone
        else:
            suffix = policy_prefix + policy

        if "HOST" in obj.settings["egress_zones"]:
            # zone/any --> Host
            if table == "filter":
                return "IN_" + suffix
            if table == "raw":
                # NOTE: nftables doesn't actually use this. Only iptables
                return "PRE_" + suffix
            if not obj.derived_from_zone:
                if table in ["mangle", "nat"]:
                    return "PRE_" + suffix
        elif "HOST" in obj.settings["ingress_zones"]:
            # HOST --> zone/any
            if not obj.derived_from_zone:
                if table == "filter":
                    return "OUT_" + suffix
        elif "ANY" in obj.settings["egress_zones"]:
            # zone/any --> any
            if table == "filter":
                if obj.derived_from_zone:
                    return "FWDI_" + suffix
                else:
                    return "FWD_" + suffix
            elif table == "nat":
                if isSNAT:
                    return "POST_" + suffix
                else:
                    return "PRE_" + suffix
            elif table in ["mangle", "raw"]:
                return "PRE_" + suffix
        elif "ANY" in obj.settings["ingress_zones"]:
            # any --> zone
            if table == "filter":
                if obj.derived_from_zone:
                    return "FWDO_" + suffix
                else:
                    return "FWD_" + suffix
            elif table == "nat":
                if isSNAT:
                    return "POST_" + suffix
                else:
                    return "PRE_" + suffix
            elif table in ["mangle", "raw"]:
                if not obj.derived_from_zone:
                    return "PRE_" + suffix
        elif not obj.derived_from_zone:
            # zone --> zone
            if table == "filter":
                return "FWD_" + suffix
            elif table == "nat":
                if isSNAT:
                    return "POST_" + suffix
                else:
                    return "PRE_" + suffix
            elif table in ["mangle", "raw"]:
                return "PRE_" + suffix
        return FirewallError("Can't convert policy to chain name: %s, %s, %s" % (policy, table, isSNAT))
site-packages/firewall/core/fw_icmptype.py
# -*- coding: utf-8 -*-
#
# Copyright (C) 2011-2016 Red Hat, Inc.
#
# Authors:
# Thomas Woerner <twoerner@redhat.com>
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program.  If not, see <http://www.gnu.org/licenses/>.
#

__all__ = [ "FirewallIcmpType" ]

from firewall.core.logger import log
from firewall import errors
from firewall.errors import FirewallError

class FirewallIcmpType(object):
    def __init__(self, fw):
        self._fw = fw
        self._icmptypes = { }

    def __repr__(self):
        return '%s(%r)' % (self.__class__, self._icmptypes)

    def cleanup(self):
        self._icmptypes.clear()

    # icmp types

    def get_icmptypes(self):
        return sorted(self._icmptypes.keys())

    def check_icmptype(self, icmptype):
        if icmptype not in self._icmptypes:
            raise FirewallError(errors.INVALID_ICMPTYPE, icmptype)

    def get_icmptype(self, icmptype):
        self.check_icmptype(icmptype)
        return self._icmptypes[icmptype]

    def add_icmptype(self, obj):
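        """Register an icmp type object, logging an informational message for
        each requested IP version that the kernel does not support."""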
        orig_ipvs = obj.destination
        if len(orig_ipvs) == 0:
            orig_ipvs = [ "ipv4", "ipv6" ]
        for ipv in orig_ipvs:
            if ipv == "ipv4":
                if not self._fw.ip4tables_enabled and not self._fw.nftables_enabled:
                    continue
                supported_icmps = self._fw.ipv4_supported_icmp_types
            elif ipv == "ipv6":
                if not self._fw.ip6tables_enabled and not self._fw.nftables_enabled:
                    continue
                supported_icmps = self._fw.ipv6_supported_icmp_types
            else:
                supported_icmps = [ ]
            if obj.name.lower() not in supported_icmps:
                log.info1("ICMP type '%s' is not supported by the kernel for %s." % (obj.name, ipv))
        self._icmptypes[obj.name] = obj

    def remove_icmptype(self, icmptype):
        self.check_icmptype(icmptype)
        del self._icmptypes[icmptype]
site-packages/firewall/core/nftables.py
# -*- coding: utf-8 -*-
#
# Copyright (C) 2018 Red Hat, Inc.
#
# Authors:
# Eric Garver <e@erig.me>
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program.  If not, see <http://www.gnu.org/licenses/>.
#
from __future__ import absolute_import

import copy
import json
import ipaddress

from firewall.core.logger import log
from firewall.functions import check_mac, getPortRange, normalizeIP6, \
                               check_single_address, check_address
from firewall.errors import FirewallError, UNKNOWN_ERROR, INVALID_RULE, \
                            INVALID_ICMPTYPE, INVALID_TYPE, INVALID_ENTRY, \
                            INVALID_PORT
from firewall.core.rich import Rich_Accept, Rich_Reject, Rich_Drop, Rich_Mark, \
                               Rich_Masquerade, Rich_ForwardPort, Rich_IcmpBlock
from nftables.nftables import Nftables

TABLE_NAME = "firewalld"
TABLE_NAME_POLICY = TABLE_NAME + "_" + "policy_drop"
POLICY_CHAIN_PREFIX = "policy_"

# Map iptables (table, chain) to hooks and priorities.
# These are well defined by NF_IP_PRI_* defines in netfilter.
#
# This is analogous to ipXtables.BUILT_IN_CHAINS, but we omit the chains that
# are only used for direct rules.
#
# Note: All hooks use their standard position + NFT_HOOK_OFFSET. This means
# iptables will have DROP precedence. It also means that even if iptables
# ACCEPTs a packet it may still be dropped later by firewalld's rules.
#
NFT_HOOK_OFFSET = 10
IPTABLES_TO_NFT_HOOK = {
    #"security": {
    #    "INPUT": ("input", 50 + NFT_HOOK_OFFSET),
    #    "OUTPUT": ("output", 50 + NFT_HOOK_OFFSET),
    #    "FORWARD": ("forward", 50 + NFT_HOOK_OFFSET),
    #},
    "raw": {
    #   "PREROUTING": ("prerouting", -300 + NFT_HOOK_OFFSET),
    #   "OUTPUT": ("output", -300 + NFT_HOOK_OFFSET),
    },
    "mangle": {
        "PREROUTING": ("prerouting", -150 + NFT_HOOK_OFFSET),
    #    "POSTROUTING": ("postrouting", -150 + NFT_HOOK_OFFSET),
    #    "INPUT": ("input", -150 + NFT_HOOK_OFFSET),
    #    "OUTPUT": ("output", -150 + NFT_HOOK_OFFSET),
    #    "FORWARD": ("forward", -150 + NFT_HOOK_OFFSET),
    },
    "nat": {
        "PREROUTING": ("prerouting", -100 + NFT_HOOK_OFFSET),
        "POSTROUTING": ("postrouting", 100 + NFT_HOOK_OFFSET),
    #    "INPUT": ("input", 100 + NFT_HOOK_OFFSET),
    #    "OUTPUT": ("output", -100 + NFT_HOOK_OFFSET),
    },
    "filter": {
        "PREROUTING": ("prerouting", 0 + NFT_HOOK_OFFSET),
        "INPUT": ("input", 0 + NFT_HOOK_OFFSET),
        "FORWARD": ("forward", 0 + NFT_HOOK_OFFSET),
        "OUTPUT": ("output", 0 + NFT_HOOK_OFFSET),
    },
}

def _icmp_types_fragments(protocol, type, code=None):
    fragments = [{"match": {"left": {"payload": {"protocol": protocol, "field": "type"}},
                            "op": "==",
                            "right": type}}]
    if code is not None:
        fragments.append({"match": {"left": {"payload": {"protocol": protocol, "field": "code"}},
                                    "op": "==",
                                    "right": code}})
    return fragments

# Most ICMP types are provided by nft, but for the codes we have to use numeric
# values.
#
ICMP_TYPES_FRAGMENTS = {
    "ipv4": {
        "communication-prohibited":     _icmp_types_fragments("icmp", "destination-unreachable", 13),
        "destination-unreachable":      _icmp_types_fragments("icmp", "destination-unreachable"),
        "echo-reply":                   _icmp_types_fragments("icmp", "echo-reply"),
        "echo-request":                 _icmp_types_fragments("icmp", "echo-request"),
        "fragmentation-needed":         _icmp_types_fragments("icmp", "destination-unreachable", 4),
        "host-precedence-violation":    _icmp_types_fragments("icmp", "destination-unreachable", 14),
        "host-prohibited":              _icmp_types_fragments("icmp", "destination-unreachable", 10),
        "host-redirect":                _icmp_types_fragments("icmp", "redirect", 1),
        "host-unknown":                 _icmp_types_fragments("icmp", "destination-unreachable", 7),
        "host-unreachable":             _icmp_types_fragments("icmp", "destination-unreachable", 1),
        "ip-header-bad":                _icmp_types_fragments("icmp", "parameter-problem", 1),
        "network-prohibited":           _icmp_types_fragments("icmp", "destination-unreachable", 8),
        "network-redirect":             _icmp_types_fragments("icmp", "redirect", 0),
        "network-unknown":              _icmp_types_fragments("icmp", "destination-unreachable", 6),
        "network-unreachable":          _icmp_types_fragments("icmp", "destination-unreachable", 0),
        "parameter-problem":            _icmp_types_fragments("icmp", "parameter-problem"),
        "port-unreachable":             _icmp_types_fragments("icmp", "destination-unreachable", 3),
        "precedence-cutoff":            _icmp_types_fragments("icmp", "destination-unreachable", 15),
        "protocol-unreachable":         _icmp_types_fragments("icmp", "destination-unreachable", 2),
        "redirect":                     _icmp_types_fragments("icmp", "redirect"),
        "required-option-missing":      _icmp_types_fragments("icmp", "parameter-problem", 1),
        "router-advertisement":         _icmp_types_fragments("icmp", "router-advertisement"),
        "router-solicitation":          _icmp_types_fragments("icmp", "router-solicitation"),
        "source-quench":                _icmp_types_fragments("icmp", "source-quench"),
        "source-route-failed":          _icmp_types_fragments("icmp", "destination-unreachable", 5),
        "time-exceeded":                _icmp_types_fragments("icmp", "time-exceeded"),
        "timestamp-reply":              _icmp_types_fragments("icmp", "timestamp-reply"),
        "timestamp-request":            _icmp_types_fragments("icmp", "timestamp-request"),
        "tos-host-redirect":            _icmp_types_fragments("icmp", "redirect", 3),
        "tos-host-unreachable":         _icmp_types_fragments("icmp", "destination-unreachable", 12),
        "tos-network-redirect":         _icmp_types_fragments("icmp", "redirect", 2),
        "tos-network-unreachable":      _icmp_types_fragments("icmp", "destination-unreachable", 11),
        "ttl-zero-during-reassembly":   _icmp_types_fragments("icmp", "time-exceeded", 1),
        "ttl-zero-during-transit":      _icmp_types_fragments("icmp", "time-exceeded", 0),
    },

    "ipv6": {
        "address-unreachable":          _icmp_types_fragments("icmpv6", "destination-unreachable", 3),
        "bad-header":                   _icmp_types_fragments("icmpv6", "parameter-problem", 0),
        "beyond-scope":                 _icmp_types_fragments("icmpv6", "destination-unreachable", 2),
        "communication-prohibited":     _icmp_types_fragments("icmpv6", "destination-unreachable", 1),
        "destination-unreachable":      _icmp_types_fragments("icmpv6", "destination-unreachable"),
        "echo-reply":                   _icmp_types_fragments("icmpv6", "echo-reply"),
        "echo-request":                 _icmp_types_fragments("icmpv6", "echo-request"),
        "failed-policy":                _icmp_types_fragments("icmpv6", "destination-unreachable", 5),
        "mld-listener-done":            _icmp_types_fragments("icmpv6", "mld-listener-done"),
        "mld-listener-query":           _icmp_types_fragments("icmpv6", "mld-listener-query"),
        "mld-listener-report":          _icmp_types_fragments("icmpv6", "mld-listener-report"),
        "mld2-listener-report":         _icmp_types_fragments("icmpv6", "mld2-listener-report"),
        "neighbour-advertisement":      _icmp_types_fragments("icmpv6", "nd-neighbor-advert"),
        "neighbour-solicitation":       _icmp_types_fragments("icmpv6", "nd-neighbor-solicit"),
        "no-route":                     _icmp_types_fragments("icmpv6", "destination-unreachable", 0),
        "packet-too-big":               _icmp_types_fragments("icmpv6", "packet-too-big"),
        "parameter-problem":            _icmp_types_fragments("icmpv6", "parameter-problem"),
        "port-unreachable":             _icmp_types_fragments("icmpv6", "destination-unreachable", 4),
        "redirect":                     _icmp_types_fragments("icmpv6", "nd-redirect"),
        "reject-route":                 _icmp_types_fragments("icmpv6", "destination-unreachable", 6),
        "router-advertisement":         _icmp_types_fragments("icmpv6", "nd-router-advert"),
        "router-solicitation":          _icmp_types_fragments("icmpv6", "nd-router-solicit"),
        "time-exceeded":                _icmp_types_fragments("icmpv6", "time-exceeded"),
        "ttl-zero-during-reassembly":   _icmp_types_fragments("icmpv6", "time-exceeded", 1),
        "ttl-zero-during-transit":      _icmp_types_fragments("icmpv6", "time-exceeded", 0),
        "unknown-header-type":          _icmp_types_fragments("icmpv6", "parameter-problem", 1),
        "unknown-option":               _icmp_types_fragments("icmpv6", "parameter-problem", 2),
    }
}

class nftables(object):
    name = "nftables"
    policies_supported = True

    def __init__(self, fw):
        self._fw = fw
        self.restore_command_exists = True
        self.available_tables = []
        self.rule_to_handle = {}
        self.rule_ref_count = {}
        self.rich_rule_priority_counts = {}
        self.policy_priority_counts = {}
        self.zone_source_index_cache = {}
        self.created_tables = {"inet": [], "ip": [], "ip6": []}

        self.nftables = Nftables()
        self.nftables.set_echo_output(True)
        self.nftables.set_handle_output(True)

    def _run_replace_zone_source(self, rule, zone_source_index_cache):
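        """Expand the %%ZONE_SOURCE%%/%%ZONE_INTERFACE%% placeholder in a rule.

        Source based zone dispatch rules are kept ordered by zone name, so the
        rule's verb is rewritten to an "insert" (index 0) or an indexed "add".
        zone_source_index_cache tracks the current per-family ordering.
        """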
        for verb in ["add", "insert", "delete"]:
            if verb in rule:
                break

        if "%%ZONE_SOURCE%%" in rule[verb]["rule"]:
            zone_source = (rule[verb]["rule"]["%%ZONE_SOURCE%%"]["zone"],
                           rule[verb]["rule"]["%%ZONE_SOURCE%%"]["address"])
            del rule[verb]["rule"]["%%ZONE_SOURCE%%"]
        elif "%%ZONE_INTERFACE%%" in rule[verb]["rule"]:
            zone_source = None
            del rule[verb]["rule"]["%%ZONE_INTERFACE%%"]
        else:
            return

        family = rule[verb]["rule"]["family"]

        if zone_source and verb == "delete":
            if family in zone_source_index_cache and \
               zone_source in zone_source_index_cache[family]:
                zone_source_index_cache[family].remove(zone_source)
        elif verb != "delete":
            if family not in zone_source_index_cache:
                zone_source_index_cache[family] = []

            if zone_source:
                # order source based dispatch by zone name
                if zone_source not in zone_source_index_cache[family]:
                    zone_source_index_cache[family].append(zone_source)
                    zone_source_index_cache[family].sort(key=lambda x: x[0])

                index = zone_source_index_cache[family].index(zone_source)
            else:
                if self._fw._allow_zone_drifting:
                    index = 0
                else:
                    index = len(zone_source_index_cache[family])

            _verb_snippet = rule[verb]
            del rule[verb]
            if index == 0:
                rule["insert"] = _verb_snippet
            else:
                index -= 1 # point to the rule before insertion point
                rule["add"] = _verb_snippet
                rule["add"]["rule"]["index"] = index

    def reverse_rule(self, dict):
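        """Return the "delete" counterpart of an "add"/"insert" rule."""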
        if "insert" in dict:
            return {"delete": copy.deepcopy(dict["insert"])}
        elif "add" in dict:
            return {"delete": copy.deepcopy(dict["add"])}
        else:
            raise FirewallError(UNKNOWN_ERROR, "Failed to reverse rule")

    def _set_rule_replace_priority(self, rule, priority_counts, token):
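        """Expand a priority placeholder (e.g. %%RICH_RULE_PRIORITY%%).

        The rule's verb is rewritten to an "insert" or an indexed "add" so
        that rules in a chain stay sorted by priority. priority_counts keeps,
        per (family, chain), the number of rules at each priority value.
        """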
        for verb in ["add", "insert", "delete"]:
            if verb in rule:
                break

        if token in rule[verb]["rule"]:
            priority = rule[verb]["rule"][token]
            del rule[verb]["rule"][token]
            if type(priority) != int:
                raise FirewallError(INVALID_RULE, "priority must be followed by a number")
            chain = (rule[verb]["rule"]["family"], rule[verb]["rule"]["chain"]) # family, chain
            # Add the rule to the priority counts. We don't need to store the
            # rule, just bump the ref count for the priority value.
            if verb == "delete":
                if chain not in priority_counts or \
                   priority not in priority_counts[chain] or \
                   priority_counts[chain][priority] <= 0:
                    raise FirewallError(UNKNOWN_ERROR, "nonexistent or underflow of priority count")

                priority_counts[chain][priority] -= 1
            else:
                if chain not in priority_counts:
                    priority_counts[chain] = {}
                if priority not in priority_counts[chain]:
                    priority_counts[chain][priority] = 0

                # calculate index of new rule
                index = 0
                for p in sorted(priority_counts[chain].keys()):
                    if p == priority and verb == "insert":
                        break
                    index += priority_counts[chain][p]
                    if p == priority and verb == "add":
                        break

                priority_counts[chain][priority] += 1

                _verb_snippet = rule[verb]
                del rule[verb]
                if index == 0:
                    rule["insert"] = _verb_snippet
                else:
                    index -= 1 # point to the rule before insertion point
                    rule["add"] = _verb_snippet
                    rule["add"]["rule"]["index"] = index

    def _get_rule_key(self, rule):
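        """Return a stable JSON string identifying a rule, or None if the
        object is not a rule (table, chain, etc.). Index, handle and position
        are stripped so that logically identical rules map to the same key."""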
        for verb in ["add", "insert", "delete"]:
            if verb in rule and "rule" in rule[verb]:
                rule_key = copy.deepcopy(rule[verb]["rule"])
                for non_key in ["index", "handle", "position"]:
                    if non_key in rule_key:
                        del rule_key[non_key]
                # str(rule_key) is insufficient because dictionary order is
                # not stable, so abuse the JSON library instead
                rule_key = json.dumps(rule_key, sort_keys=True)
                return rule_key
        # Not a rule (it's a table, chain, etc)
        return None

    def set_rules(self, rules, log_denied):
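        """Apply a list of rules as a single libnftables JSON transaction.

        Duplicate rules are reference counted instead of re-added, priority
        and zone dispatch placeholders are expanded, deletes are rewritten to
        use the stored rule handle, and the handles echoed back by libnftables
        are recorded for later deletion.
        """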
        _valid_verbs = ["add", "insert", "delete", "flush", "replace"]
        _valid_add_verbs = ["add", "insert", "replace"]
        _deduplicated_rules = []
        _executed_rules = []
        rich_rule_priority_counts = copy.deepcopy(self.rich_rule_priority_counts)
        policy_priority_counts = copy.deepcopy(self.policy_priority_counts)
        zone_source_index_cache = copy.deepcopy(self.zone_source_index_cache)
        rule_ref_count = self.rule_ref_count.copy()
        for rule in rules:
            if type(rule) != dict:
                raise FirewallError(UNKNOWN_ERROR, "rule must be a dictionary, rule: %s" % (rule))

            for verb in _valid_verbs:
                if verb in rule:
                    break
            if verb not in rule:
                raise FirewallError(INVALID_RULE, "no valid verb found, rule: %s" % (rule))

            rule_key = self._get_rule_key(rule)

            # rule deduplication
            if rule_key in rule_ref_count:
                log.debug2("%s: prev rule ref cnt %d, %s", self.__class__,
                           rule_ref_count[rule_key], rule_key)
                if verb != "delete":
                    rule_ref_count[rule_key] += 1
                    continue
                elif rule_ref_count[rule_key] > 1:
                    rule_ref_count[rule_key] -= 1
                    continue
                elif rule_ref_count[rule_key] == 1:
                    rule_ref_count[rule_key] -= 1
                else:
                    raise FirewallError(UNKNOWN_ERROR, "rule ref count bug: rule_key '%s', cnt %d"
                                                       % (rule_key, rule_ref_count[rule_key]))
            elif rule_key and verb != "delete":
                rule_ref_count[rule_key] = 1

            _deduplicated_rules.append(rule)

            _rule = copy.deepcopy(rule)
            if rule_key:
                # filter out empty rule expressions. Rich rules emit quite a
                # few of them because it keeps that code simpler, but
                # libnftables does not tolerate them.
                _rule[verb]["rule"]["expr"] = list(filter(None, _rule[verb]["rule"]["expr"]))

                self._set_rule_replace_priority(_rule, rich_rule_priority_counts, "%%RICH_RULE_PRIORITY%%")
                self._set_rule_replace_priority(_rule, policy_priority_counts, "%%POLICY_PRIORITY%%")
                self._run_replace_zone_source(_rule, zone_source_index_cache)

                # delete using rule handle
                if verb == "delete":
                    _rule = {"delete": {"rule": {"family": _rule["delete"]["rule"]["family"],
                                                 "table": _rule["delete"]["rule"]["table"],
                                                 "chain": _rule["delete"]["rule"]["chain"],
                                                 "handle": self.rule_to_handle[rule_key]}}}

            _executed_rules.append(_rule)

        json_blob = {"nftables": [{"metainfo": {"json_schema_version": 1}}] + _executed_rules}
        if log.getDebugLogLevel() >= 3:
            # guarded with if statement because json.dumps() is expensive.
            log.debug3("%s: calling python-nftables with JSON blob: %s", self.__class__,
                       json.dumps(json_blob))
        rc, output, error = self.nftables.json_cmd(json_blob)
        if rc != 0:
            raise ValueError("'%s' failed: %s\nJSON blob:\n%s" % ("python-nftables", error, json.dumps(json_blob)))

        self.rich_rule_priority_counts = rich_rule_priority_counts
        self.policy_priority_counts = policy_priority_counts
        self.zone_source_index_cache = zone_source_index_cache
        self.rule_ref_count = rule_ref_count

        index = 0
        for rule in _deduplicated_rules:
            index += 1 # +1 due to metainfo
            rule_key = self._get_rule_key(rule)

            if not rule_key:
                continue

            if "delete" in rule:
                del self.rule_to_handle[rule_key]
                del self.rule_ref_count[rule_key]
                continue

            for verb in _valid_add_verbs:
                if verb in output["nftables"][index]:
                    break
            if verb not in output["nftables"][index]:
                continue

            self.rule_to_handle[rule_key] = output["nftables"][index][verb]["rule"]["handle"]

    def set_rule(self, rule, log_denied):
        self.set_rules([rule], log_denied)
        return ""

    def get_available_tables(self, table=None):
        # Tables always exist in nftables
        return [table] if table else IPTABLES_TO_NFT_HOOK.keys()

    def _build_delete_table_rules(self, table):
        # To avoid nftables returning ENOENT we always add the table before
        # deleting to guarantee it will exist.
        #
        # In the future, this add+delete should be replaced with "destroy", but
        # that verb is too new to rely upon.
        rules = []
        for family in ["inet", "ip", "ip6"]:
            rules.append({"add": {"table": {"family": family,
                                            "name": table}}})
            rules.append({"delete": {"table": {"family": family,
                                               "name": table}}})
        return rules

    def build_flush_rules(self):
        # Policy is stashed in a separate table that we're _not_ going to
        # flush. As such, we retain the policy rule handles and ref counts.
        saved_rule_to_handle = {}
        saved_rule_ref_count = {}
        for rule in self._build_set_policy_rules_ct_rules(True):
            policy_key = self._get_rule_key(rule)
            if policy_key in self.rule_to_handle:
                saved_rule_to_handle[policy_key] = self.rule_to_handle[policy_key]
                saved_rule_ref_count[policy_key] = self.rule_ref_count[policy_key]

        self.rule_to_handle = saved_rule_to_handle
        self.rule_ref_count = saved_rule_ref_count
        self.rich_rule_priority_counts = {}
        self.policy_priority_counts = {}
        self.zone_source_index_cache = {}

        for family in ["inet", "ip", "ip6"]:
            if TABLE_NAME in self.created_tables[family]:
                self.created_tables[family].remove(TABLE_NAME)

        return self._build_delete_table_rules(TABLE_NAME)

    def _build_set_policy_rules_ct_rules(self, enable):
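        """Build the rules that accept established/related connections in the
        filter chains of the policy (DROP) table."""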
        add_del = { True: "add", False: "delete" }[enable]
        rules = []
        for hook in ["input", "forward", "output"]:
            rules.append({add_del: {"rule": {"family": "inet",
                                             "table": TABLE_NAME_POLICY,
                                             "chain": "%s_%s" % ("filter", hook),
                                             "expr": [{"match": {"left": {"ct": {"key": "state"}},
                                                                 "op": "in",
                                                                 "right": {"set": ["established", "related"]}}},
                                                      {"accept": None}]}}})
        return rules

    def build_set_policy_rules(self, policy):
        # Policy is not exposed to the user. It's only to make sure we DROP
        # packets while reloading and for panic mode. As such, using hooks with
        # a higher priority than our base chains is sufficient.
        rules = []
        if policy == "PANIC":
            rules.append({"add": {"table": {"family": "inet",
                                            "name": TABLE_NAME_POLICY}}})
            self.created_tables["inet"].append(TABLE_NAME_POLICY)

            # Use "raw" priority for panic mode. This occurs before
            # conntrack, mangle, nat, etc
            for hook in ["prerouting", "output"]:
                rules.append({"add": {"chain": {"family": "inet",
                                                "table": TABLE_NAME_POLICY,
                                                "name": "%s_%s" % ("raw", hook),
                                                "type": "filter",
                                                "hook": hook,
                                                "prio": -300 + NFT_HOOK_OFFSET - 1,
                                                "policy": "drop"}}})
        if policy == "DROP":
            rules.append({"add": {"table": {"family": "inet",
                                            "name": TABLE_NAME_POLICY}}})
            self.created_tables["inet"].append(TABLE_NAME_POLICY)

            # To drop everything except existing connections we use
            # "filter" because it occurs _after_ conntrack.
            for hook in ["input", "forward", "output"]:
                rules.append({"add": {"chain": {"family": "inet",
                                                "table": TABLE_NAME_POLICY,
                                                "name": "%s_%s" % ("filter", hook),
                                                "type": "filter",
                                                "hook": hook,
                                                "prio": 0 + NFT_HOOK_OFFSET - 1,
                                                "policy": "drop"}}})

            rules += self._build_set_policy_rules_ct_rules(True)
        elif policy == "ACCEPT":
            for rule in self._build_set_policy_rules_ct_rules(False):
                policy_key = self._get_rule_key(rule)
                if policy_key in self.rule_to_handle:
                    rules.append(rule)

            rules += self._build_delete_table_rules(TABLE_NAME_POLICY)

            if TABLE_NAME_POLICY in self.created_tables["inet"]:
                self.created_tables["inet"].remove(TABLE_NAME_POLICY)
        else:
            raise FirewallError(UNKNOWN_ERROR, "not implemented")

        return rules

    def supported_icmp_types(self, ipv=None):
        # nftables supports any icmp_type via arbitrary type/code matching.
        # We just need a translation for it in ICMP_TYPES_FRAGMENTS.
        supported = set()

        for _ipv in [ipv] if ipv else ICMP_TYPES_FRAGMENTS.keys():
            supported.update(ICMP_TYPES_FRAGMENTS[_ipv].keys())

        return list(supported)

    def build_default_tables(self):
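        """Create the firewalld table in the inet, ip and ip6 families."""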
        default_tables = []
        for family in ["inet", "ip", "ip6"]:
            default_tables.append({"add": {"table": {"family": family,
                                                     "name": TABLE_NAME}}})
            self.created_tables[family].append(TABLE_NAME)
        return default_tables

    def build_default_rules(self, log_denied="off"):
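        """Build the base chains hooked into netfilter and the dispatch
        chains (POLICIES_pre, ZONES[_SOURCE], POLICIES_post) that policy and
        zone rules jump to, along with the default conntrack, loopback,
        logging and reject rules."""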
        default_rules = []
        for chain in IPTABLES_TO_NFT_HOOK["mangle"].keys():
            default_rules.append({"add": {"chain": {"family": "inet",
                                                    "table": TABLE_NAME,
                                                    "name": "mangle_%s" % chain,
                                                    "type": "filter",
                                                    "hook": "%s" % IPTABLES_TO_NFT_HOOK["mangle"][chain][0],
                                                    "prio": IPTABLES_TO_NFT_HOOK["mangle"][chain][1]}}})
            for dispatch_suffix in ["POLICIES_pre", "ZONES_SOURCE", "ZONES", "POLICIES_post"] if self._fw._allow_zone_drifting else ["POLICIES_pre", "ZONES", "POLICIES_post"]:
                default_rules.append({"add": {"chain": {"family": "inet",
                                                        "table": TABLE_NAME,
                                                        "name": "mangle_%s_%s" % (chain, dispatch_suffix)}}})
                default_rules.append({"add": {"rule":  {"family": "inet",
                                                        "table": TABLE_NAME,
                                                        "chain": "mangle_%s" % chain,
                                                        "expr": [{"jump": {"target": "mangle_%s_%s" % (chain, dispatch_suffix)}}]}}})

        for family in ["ip", "ip6"]:
            for chain in IPTABLES_TO_NFT_HOOK["nat"].keys():
                default_rules.append({"add": {"chain": {"family": family,
                                                        "table": TABLE_NAME,
                                                        "name": "nat_%s" % chain,
                                                        "type": "nat",
                                                        "hook": "%s" % IPTABLES_TO_NFT_HOOK["nat"][chain][0],
                                                        "prio": IPTABLES_TO_NFT_HOOK["nat"][chain][1]}}})

                for dispatch_suffix in ["POLICIES_pre", "ZONES_SOURCE", "ZONES", "POLICIES_post"] if self._fw._allow_zone_drifting else ["POLICIES_pre", "ZONES", "POLICIES_post"]:
                    default_rules.append({"add": {"chain": {"family": family,
                                                            "table": TABLE_NAME,
                                                            "name": "nat_%s_%s" % (chain, dispatch_suffix)}}})
                    default_rules.append({"add": {"rule":  {"family": family,
                                                            "table": TABLE_NAME,
                                                            "chain": "nat_%s" % chain,
                                                            "expr": [{"jump": {"target": "nat_%s_%s" % (chain, dispatch_suffix)}}]}}})

        for chain in IPTABLES_TO_NFT_HOOK["filter"].keys():
            default_rules.append({"add": {"chain": {"family": "inet",
                                                    "table": TABLE_NAME,
                                                    "name": "filter_%s" % chain,
                                                    "type": "filter",
                                                    "hook": "%s" % IPTABLES_TO_NFT_HOOK["filter"][chain][0],
                                                    "prio": IPTABLES_TO_NFT_HOOK["filter"][chain][1]}}})

        # filter, INPUT
        default_rules.append({"add": {"rule":  {"family": "inet",
                                                "table": TABLE_NAME,
                                                "chain": "filter_%s" % "INPUT",
                                                "expr": [{"match": {"left": {"ct": {"key": "state"}},
                                                                    "op": "in",
                                                                    "right": {"set": ["established", "related"]}}},
                                                         {"accept": None}]}}})
        default_rules.append({"add": {"rule":  {"family": "inet",
                                                "table": TABLE_NAME,
                                                "chain": "filter_%s" % "INPUT",
                                                "expr": [{"match": {"left": {"ct": {"key": "status"}},
                                                                    "op": "in",
                                                                    "right": "dnat"}},
                                                         {"accept": None}]}}})
        default_rules.append({"add": {"rule":  {"family": "inet",
                                                "table": TABLE_NAME,
                                                "chain": "filter_%s" % "INPUT",
                                                "expr": [{"match": {"left": {"meta": {"key": "iifname"}},
                                                                    "op": "==",
                                                                    "right": "lo"}},
                                                         {"accept": None}]}}})
        for dispatch_suffix in ["POLICIES_pre", "ZONES_SOURCE", "ZONES", "POLICIES_post"] if self._fw._allow_zone_drifting else ["POLICIES_pre", "ZONES", "POLICIES_post"]:
            default_rules.append({"add": {"chain": {"family": "inet",
                                                    "table": TABLE_NAME,
                                                    "name": "filter_%s_%s" % ("INPUT", dispatch_suffix)}}})
            default_rules.append({"add": {"rule":  {"family": "inet",
                                                    "table": TABLE_NAME,
                                                    "chain": "filter_%s" % "INPUT",
                                                    "expr": [{"jump": {"target": "filter_%s_%s" % ("INPUT", dispatch_suffix)}}]}}})
        if log_denied != "off":
            default_rules.append({"add": {"rule":  {"family": "inet",
                                                    "table": TABLE_NAME,
                                                    "chain": "filter_%s" % "INPUT",
                                                    "expr": [{"match": {"left": {"ct": {"key": "state"}},
                                                                        "op": "in",
                                                                        "right": {"set": ["invalid"]}}},
                                                             self._pkttype_match_fragment(log_denied),
                                                             {"log": {"prefix": "STATE_INVALID_DROP: "}}]}}})
        default_rules.append({"add": {"rule":  {"family": "inet",
                                                "table": TABLE_NAME,
                                                "chain": "filter_%s" % "INPUT",
                                                "expr": [{"match": {"left": {"ct": {"key": "state"}},
                                                                    "op": "in",
                                                                    "right": {"set": ["invalid"]}}},
                                                         {"drop": None}]}}})
        if log_denied != "off":
            default_rules.append({"add": {"rule":  {"family": "inet",
                                                    "table": TABLE_NAME,
                                                    "chain": "filter_%s" % "INPUT",
                                                    "expr": [self._pkttype_match_fragment(log_denied),
                                                             {"log": {"prefix": "FINAL_REJECT: "}}]}}})
        default_rules.append({"add": {"rule":  {"family": "inet",
                                                "table": TABLE_NAME,
                                                "chain": "filter_%s" % "INPUT",
                                                "expr": [{"reject": {"type": "icmpx", "expr": "admin-prohibited"}}]}}})

        # filter, FORWARD
        default_rules.append({"add": {"rule":  {"family": "inet",
                                                "table": TABLE_NAME,
                                                "chain": "filter_%s" % "FORWARD",
                                                "expr": [{"match": {"left": {"ct": {"key": "state"}},
                                                                    "op": "in",
                                                                    "right": {"set": ["established", "related"]}}},
                                                         {"accept": None}]}}})
        default_rules.append({"add": {"rule":  {"family": "inet",
                                                "table": TABLE_NAME,
                                                "chain": "filter_%s" % "FORWARD",
                                                "expr": [{"match": {"left": {"ct": {"key": "status"}},
                                                                    "op": "in",
                                                                    "right": "dnat"}},
                                                         {"accept": None}]}}})
        default_rules.append({"add": {"rule":  {"family": "inet",
                                                "table": TABLE_NAME,
                                                "chain": "filter_%s" % "FORWARD",
                                                "expr": [{"match": {"left": {"meta": {"key": "iifname"}},
                                                                    "op": "==",
                                                                    "right": "lo"}},
                                                         {"accept": None}]}}})
        for dispatch_suffix in ["POLICIES_pre"]:
            default_rules.append({"add": {"chain": {"family": "inet",
                                                    "table": TABLE_NAME,
                                                    "name": "filter_%s_%s" % ("FORWARD", dispatch_suffix)}}})
            default_rules.append({"add": {"rule":  {"family": "inet",
                                                    "table": TABLE_NAME,
                                                    "chain": "filter_%s" % "FORWARD",
                                                    "expr": [{"jump": {"target": "filter_%s_%s" % ("FORWARD", dispatch_suffix)}}]}}})
        for direction in ["IN", "OUT"]:
            for dispatch_suffix in ["ZONES_SOURCE", "ZONES"] if self._fw._allow_zone_drifting else ["ZONES"]:
                default_rules.append({"add": {"chain": {"family": "inet",
                                                        "table": TABLE_NAME,
                                                        "name": "filter_%s_%s_%s" % ("FORWARD", direction, dispatch_suffix)}}})
                default_rules.append({"add": {"rule":  {"family": "inet",
                                                        "table": TABLE_NAME,
                                                        "chain": "filter_%s" % "FORWARD",
                                                        "expr": [{"jump": {"target": "filter_%s_%s_%s" % ("FORWARD", direction, dispatch_suffix)}}]}}})
        for dispatch_suffix in ["POLICIES_post"]:
            default_rules.append({"add": {"chain": {"family": "inet",
                                                    "table": TABLE_NAME,
                                                    "name": "filter_%s_%s" % ("FORWARD", dispatch_suffix)}}})
            default_rules.append({"add": {"rule":  {"family": "inet",
                                                    "table": TABLE_NAME,
                                                    "chain": "filter_%s" % "FORWARD",
                                                    "expr": [{"jump": {"target": "filter_%s_%s" % ("FORWARD", dispatch_suffix)}}]}}})
        if log_denied != "off":
            default_rules.append({"add": {"rule":  {"family": "inet",
                                                    "table": TABLE_NAME,
                                                    "chain": "filter_%s" % "FORWARD",
                                                    "expr": [{"match": {"left": {"ct": {"key": "state"}},
                                                                        "op": "in",
                                                                        "right": {"set": ["invalid"]}}},
                                                             self._pkttype_match_fragment(log_denied),
                                                             {"log": {"prefix": "STATE_INVALID_DROP: "}}]}}})
        default_rules.append({"add": {"rule":  {"family": "inet",
                                                "table": TABLE_NAME,
                                                "chain": "filter_%s" % "FORWARD",
                                                "expr": [{"match": {"left": {"ct": {"key": "state"}},
                                                                    "op": "in",
                                                                    "right": {"set": ["invalid"]}}},
                                                         {"drop": None}]}}})
        if log_denied != "off":
            default_rules.append({"add": {"rule":  {"family": "inet",
                                                    "table": TABLE_NAME,
                                                    "chain": "filter_%s" % "FORWARD",
                                                    "expr": [self._pkttype_match_fragment(log_denied),
                                                             {"log": {"prefix": "FINAL_REJECT: "}}]}}})
        default_rules.append({"add": {"rule":  {"family": "inet",
                                                "table": TABLE_NAME,
                                                "chain": "filter_%s" % "FORWARD",
                                                "expr": [{"reject": {"type": "icmpx", "expr": "admin-prohibited"}}]}}})

        # filter, OUTPUT
        default_rules.append({"add": {"rule":  {"family": "inet",
                                                "table": TABLE_NAME,
                                                "chain": "filter_%s" % "OUTPUT",
                                                "expr": [{"match": {"left": {"ct": {"key": "state"}},
                                                                    "op": "in",
                                                                    "right": {"set": ["established", "related"]}}},
                                                         {"accept": None}]}}})
        default_rules.append({"add": {"rule":  {"family": "inet",
                                                "table": TABLE_NAME,
                                                "chain": "filter_OUTPUT",
                                                "expr": [{"match": {"left": {"meta": {"key": "oifname"}},
                                                          "op": "==",
                                                          "right": "lo"}},
                                                         {"accept": None}]}}})
        for dispatch_suffix in ["POLICIES_pre"]:
            default_rules.append({"add": {"chain": {"family": "inet",
                                                    "table": TABLE_NAME,
                                                    "name": "filter_%s_%s" % ("OUTPUT", dispatch_suffix)}}})
            default_rules.append({"add": {"rule":  {"family": "inet",
                                                    "table": TABLE_NAME,
                                                    "chain": "filter_%s" % "OUTPUT",
                                                    "expr": [{"jump": {"target": "filter_%s_%s" % ("OUTPUT", dispatch_suffix)}}]}}})
        for dispatch_suffix in ["POLICIES_post"]:
            default_rules.append({"add": {"chain": {"family": "inet",
                                                    "table": TABLE_NAME,
                                                    "name": "filter_%s_%s" % ("OUTPUT", dispatch_suffix)}}})
            default_rules.append({"add": {"rule":  {"family": "inet",
                                                    "table": TABLE_NAME,
                                                    "chain": "filter_%s" % "OUTPUT",
                                                    "expr": [{"jump": {"target": "filter_%s_%s" % ("OUTPUT", dispatch_suffix)}}]}}})

        return default_rules

    def get_zone_table_chains(self, table):
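        """Return the chains used for zone dispatch in the given table."""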
        if table == "filter":
            return ["INPUT", "FORWARD_IN", "FORWARD_OUT"]
        if table == "mangle":
            return ["PREROUTING"]
        if table == "nat":
            return ["PREROUTING", "POSTROUTING"]

        return []

    def build_policy_ingress_egress_rules(self, enable, policy, table, chain,
                                          ingress_interfaces, egress_interfaces,
                                          ingress_sources, egress_sources,
                                          family="inet"):
        # nat tables need to use ip/ip6 family
        if table == "nat" and family == "inet":
            rules = []
            rules.extend(self.build_policy_ingress_egress_rules(enable, policy, table, chain,
                                          ingress_interfaces, egress_interfaces,
                                          ingress_sources, egress_sources,
                                          family="ip"))
            rules.extend(self.build_policy_ingress_egress_rules(enable, policy, table, chain,
                                          ingress_interfaces, egress_interfaces,
                                          ingress_sources, egress_sources,
                                          family="ip6"))
            return rules

        p_obj = self._fw.policy.get_policy(policy)
        chain_suffix = "pre" if p_obj.priority < 0 else "post"
        isSNAT = table == "nat" and chain == "POSTROUTING"
        _policy = self._fw.policy.policy_base_chain_name(policy, table, POLICY_CHAIN_PREFIX, isSNAT)

        ingress_fragments = []
        egress_fragments = []
        if ingress_interfaces:
            ingress_fragments.append({"match": {"left": {"meta": {"key": "iifname"}},
                                                "op": "==",
                                                "right": {"set": list(ingress_interfaces)}}})
        if egress_interfaces:
            egress_fragments.append({"match": {"left": {"meta": {"key": "oifname"}},
                                               "op": "==",
                                               "right": {"set": list(egress_interfaces)}}})
        ipv_to_family = {"ipv4": "ip", "ipv6": "ip6"}
        if ingress_sources:
            for src in ingress_sources:
                # skip if this source doesn't apply to the current family.
                if table == "nat":
                    ipv = self._fw.zone.check_source(src)
                    if ipv in ipv_to_family and family != ipv_to_family[ipv]:
                        continue

                ingress_fragments.append(self._rule_addr_fragment("saddr", src))
        if egress_sources:
            for dst in egress_sources:
                # skip if this source doesn't apply to the current family.
                if table == "nat":
                    ipv = self._fw.zone.check_source(dst)
                    if ipv in ipv_to_family and family != ipv_to_family[ipv]:
                        continue

                egress_fragments.append(self._rule_addr_fragment("daddr", dst))

        def _generate_policy_dispatch_rule(ingress_fragment, egress_fragment):
            expr_fragments = []
            if ingress_fragment:
                expr_fragments.append(ingress_fragment)
            if egress_fragment:
                expr_fragments.append(egress_fragment)
            expr_fragments.append({"jump": {"target": "%s_%s" % (table, _policy)}})

            rule = {"family": family,
                    "table": TABLE_NAME,
                    "chain": "%s_%s_POLICIES_%s" % (table, chain, chain_suffix),
                    "expr": expr_fragments}
            rule.update(self._policy_priority_fragment(p_obj))

            if enable:
                return {"add": {"rule": rule}}
            else:
                return {"delete": {"rule": rule}}

        rules = []
        if ingress_fragments: # zone --> [zone, ANY, HOST]
            for ingress_fragment in ingress_fragments:
                if egress_fragments:
                    # zone --> zone
                    for egress_fragment in egress_fragments:
                        rules.append(_generate_policy_dispatch_rule(ingress_fragment, egress_fragment))
                elif table =="nat" and egress_sources:
                    # if the egress source is not for the current family (there
                    # are no egress fragments), then avoid creating an invalid
                    # catch all rule.
                    pass
                else:
                    # zone --> [ANY, HOST]
                    rules.append(_generate_policy_dispatch_rule(ingress_fragment, None))
        elif table =="nat" and ingress_sources:
            # if the ingress source is not for the current family (there are no
            # ingress fragments), then avoid creating an invalid catch all
            # rule.
            pass
        else: # [ANY, HOST] --> [zone, ANY, HOST]
            if egress_fragments:
                # [ANY, HOST] --> zone
                for egress_fragment in egress_fragments:
                    rules.append(_generate_policy_dispatch_rule(None, egress_fragment))
            elif table =="nat" and egress_sources:
                # if the egress source is not for the current family (there are
                # no egress fragments), then avoid creating an invalid catch
                # all rule.
                pass
            else:
                # [ANY, HOST] --> [ANY, HOST]
                rules.append(_generate_policy_dispatch_rule(None, None))

        return rules

    def build_zone_source_interface_rules(self, enable, zone, policy, interface,
                                          table, chain, append=False,
                                          family="inet"):
        # nat tables need to use ip/ip6 family
        if table == "nat" and family == "inet":
            rules = []
            rules.extend(self.build_zone_source_interface_rules(enable, zone, policy,
                            interface, table, chain, append, "ip"))
            rules.extend(self.build_zone_source_interface_rules(enable, zone, policy,
                            interface, table, chain, append, "ip6"))
            return rules

        isSNAT = table == "nat" and chain == "POSTROUTING"
        _policy = self._fw.policy.policy_base_chain_name(policy, table, POLICY_CHAIN_PREFIX, isSNAT=isSNAT)
        opt = {
            "PREROUTING": "iifname",
            "POSTROUTING": "oifname",
            "INPUT": "iifname",
            "FORWARD_IN": "iifname",
            "FORWARD_OUT": "oifname",
            "OUTPUT": "oifname",
        }[chain]

        if interface[len(interface)-1] == "+":
            interface = interface[:len(interface)-1] + "*"

        action = "goto"

        if interface == "*":
            expr_fragments = [{action: {"target": "%s_%s" % (table, _policy)}}]
        else:
            expr_fragments = [{"match": {"left": {"meta": {"key": opt}},
                                         "op": "==",
                                         "right": interface}},
                              {action: {"target": "%s_%s" % (table, _policy)}}]

        if enable and not append:
            verb = "insert"
            rule = {"family": family,
                    "table": TABLE_NAME,
                    "chain": "%s_%s_ZONES" % (table, chain),
                    "expr": expr_fragments}
            rule.update(self._zone_interface_fragment())
        elif enable:
            verb = "add"
            rule = {"family": family,
                    "table": TABLE_NAME,
                    "chain": "%s_%s_ZONES" % (table, chain),
                    "expr": expr_fragments}
        else:
            verb = "delete"
            rule = {"family": family,
                    "table": TABLE_NAME,
                    "chain": "%s_%s_ZONES" % (table, chain),
                    "expr": expr_fragments}
            if not append:
                rule.update(self._zone_interface_fragment())

        return [{verb: {"rule": rule}}]

    def build_zone_source_address_rules(self, enable, zone, policy,
                                        address, table, chain, family="inet"):
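        """Build the dispatch rule that sends traffic from/to an address, MAC
        or ipset to the zone's policy chain via goto."""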
        # nat tables need to use ip/ip6 family
        if table == "nat" and family == "inet":
            rules = []
            if address.startswith("ipset:"):
                ipset_family = self._set_get_family(address[len("ipset:"):])
            else:
                ipset_family = None

            if check_address("ipv4", address) or check_mac(address) or ipset_family == "ip":
                rules.extend(self.build_zone_source_address_rules(enable, zone, policy,
                                    address, table, chain, "ip"))
            if check_address("ipv6", address) or check_mac(address) or ipset_family == "ip6":
                rules.extend(self.build_zone_source_address_rules(enable, zone, policy,
                                    address, table, chain, "ip6"))
            return rules

        isSNAT = True if (table == "nat" and chain == "POSTROUTING") else False
        _policy = self._fw.policy.policy_base_chain_name(policy, table, POLICY_CHAIN_PREFIX, isSNAT=isSNAT)
        add_del = { True: "insert", False: "delete" }[enable]

        opt = {
            "PREROUTING": "saddr",
            "POSTROUTING": "daddr",
            "INPUT": "saddr",
            "FORWARD_IN": "saddr",
            "FORWARD_OUT": "daddr",
            "OUTPUT": "daddr",
        }[chain]

        if self._fw._allow_zone_drifting:
            zone_dispatch_chain = "%s_%s_ZONES_SOURCE" % (table, chain)
        else:
            zone_dispatch_chain = "%s_%s_ZONES" % (table, chain)

        action = "goto"

        rule = {"family": family,
                "table": TABLE_NAME,
                "chain": zone_dispatch_chain,
                "expr": [self._rule_addr_fragment(opt, address),
                         {action: {"target": "%s_%s" % (table, _policy)}}]}
        rule.update(self._zone_source_fragment(zone, address))
        return [{add_del: {"rule": rule}}]

    def build_policy_chain_rules(self, enable, policy, table, chain, family="inet"):
        # nat tables need to use the ip/ip6 family
        if table == "nat" and family == "inet":
            rules = []
            rules.extend(self.build_policy_chain_rules(enable, policy, table, chain, "ip"))
            rules.extend(self.build_policy_chain_rules(enable, policy, table, chain, "ip6"))
            return rules

        add_del = { True: "add", False: "delete" }[enable]
        isSNAT = True if (table == "nat" and chain == "POSTROUTING") else False
        _policy = self._fw.policy.policy_base_chain_name(policy, table, POLICY_CHAIN_PREFIX, isSNAT=isSNAT)

        rules = []
        rules.append({add_del: {"chain": {"family": family,
                                          "table": TABLE_NAME,
                                          "name": "%s_%s" % (table, _policy)}}})
        for chain_suffix in ["pre", "log", "deny", "allow", "post"]:
            rules.append({add_del: {"chain": {"family": family,
                                              "table": TABLE_NAME,
                                              "name": "%s_%s_%s" % (table, _policy, chain_suffix)}}})

        for chain_suffix in ["pre", "log", "deny", "allow", "post"]:
            rules.append({add_del: {"rule": {"family": family,
                                             "table": TABLE_NAME,
                                             "chain": "%s_%s" % (table, _policy),
                                             "expr": [{"jump": {"target": "%s_%s_%s" % (table, _policy, chain_suffix)}}]}}})

        target = self._fw.policy._policies[policy].target

        if self._fw.get_log_denied() != "off":
            if table == "filter":
                if target in ["REJECT", "%%REJECT%%", "DROP"]:
                    log_suffix = target
                    if target == "%%REJECT%%":
                        log_suffix = "REJECT"
                    rules.append({add_del: {"rule": {"family": family,
                                                     "table": TABLE_NAME,
                                                     "chain": "%s_%s" % (table, _policy),
                                                     "expr": [self._pkttype_match_fragment(self._fw.get_log_denied()),
                                                              {"log": {"prefix": "\"filter_%s_%s: \"" % (_policy, log_suffix)}}]}}})

        if table == "filter" and \
           target in ["ACCEPT", "REJECT", "%%REJECT%%", "DROP"]:
            if target in ["%%REJECT%%", "REJECT"]:
                target_fragment = self._reject_fragment()
            else:
                target_fragment = {target.lower(): None}
            rules.append({add_del: {"rule": {"family": family,
                                             "table": TABLE_NAME,
                                             "chain": "%s_%s" % (table, _policy),
                                             "expr": [target_fragment]}}})

        if not enable:
            rules.reverse()

        return rules
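
    # Illustrative layout (not upstream code): for a filter-table policy the
    # rules above create the base chain plus five sub-chains and jump to them
    # in order ("<policy>" stands for the resolved base chain name):
    #
    #   filter_<policy>
    #       jump filter_<policy>_pre
    #       jump filter_<policy>_log
    #       jump filter_<policy>_deny
    #       jump filter_<policy>_allow
    #       jump filter_<policy>_post
    #       [optional log rule and REJECT/DROP/ACCEPT verdict, per the policy target]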

    def _pkttype_match_fragment(self, pkttype):
        if pkttype == "all":
            return {}
        elif pkttype in ["unicast", "broadcast", "multicast"]:
            return {"match": {"left": {"meta": {"key": "pkttype"}},
                               "op": "==",
                               "right": pkttype}}

        raise FirewallError(INVALID_RULE, "Invalid pkttype \"%s\"" % pkttype)

    def _reject_types_fragment(self, reject_type):
        frags = {
            # REJECT_TYPES              : <nft reject rule fragment>
            "icmp-host-prohibited"      : {"reject": {"type": "icmp", "expr": "host-prohibited"}},
            "host-prohib"               : {"reject": {"type": "icmp", "expr": "host-prohibited"}},
            "icmp-net-prohibited"       : {"reject": {"type": "icmp", "expr": "net-prohibited"}},
            "net-prohib"                : {"reject": {"type": "icmp", "expr": "net-prohibited"}},
            "icmp-admin-prohibited"     : {"reject": {"type": "icmp", "expr": "admin-prohibited"}},
            "admin-prohib"              : {"reject": {"type": "icmp", "expr": "admin-prohibited"}},
            "icmp6-adm-prohibited"      : {"reject": {"type": "icmpv6", "expr": "admin-prohibited"}},
            "adm-prohibited"            : {"reject": {"type": "icmpv6", "expr": "admin-prohibited"}},

            "icmp-net-unreachable"      : {"reject": {"type": "icmp", "expr": "net-unreachable"}},
            "net-unreach"               : {"reject": {"type": "icmp", "expr": "net-unreachable"}},
            "icmp-host-unreachable"     : {"reject": {"type": "icmp", "expr": "host-unreachable"}},
            "host-unreach"              : {"reject": {"type": "icmp", "expr": "host-unreachable"}},
            "icmp-port-unreachable"     : {"reject": {"type": "icmp", "expr": "port-unreachable"}},
            "icmp6-port-unreachable"    : {"reject": {"type": "icmpv6", "expr": "port-unreachable"}},
            "port-unreach"              : {"reject": {"type": "icmpx", "expr": "port-unreachable"}},
            "icmp-proto-unreachable"    : {"reject": {"type": "icmp", "expr": "prot-unreachable"}},
            "proto-unreach"             : {"reject": {"type": "icmp", "expr": "prot-unreachable"}},
            "icmp6-addr-unreachable"    : {"reject": {"type": "icmpv6", "expr": "addr-unreachable"}},
            "addr-unreach"              : {"reject": {"type": "icmpv6", "expr": "addr-unreachable"}},

            "icmp6-no-route"            : {"reject": {"type": "icmpv6", "expr": "no-route"}},
            "no-route"                  : {"reject": {"type": "icmpv6", "expr": "no-route"}},

            "tcp-reset"                 : {"reject": {"type": "tcp reset"}},
            "tcp-rst"                   : {"reject": {"type": "tcp reset"}},
        }
        return frags[reject_type]

    def _reject_fragment(self):
        return {"reject": {"type": "icmpx",
                           "expr": "admin-prohibited"}}

    def _icmp_match_fragment(self):
        return {"match": {"left": {"meta": {"key": "l4proto"}},
                          "op": "==",
                          "right": {"set": ["icmp", "icmpv6"]}}}

    def _rich_rule_limit_fragment(self, limit):
        if not limit:
            return {}

        rich_to_nft = {
            "s" : "second",
            "m" : "minute",
            "h" : "hour",
            "d" : "day",
        }

        rate, duration = limit.value_parse()

        d = {
            "rate": rate,
            "per": rich_to_nft[duration],
        }

        burst = limit.burst_parse()
        if burst is not None:
            d["burst"] = burst

        return {"limit": d}

    def _rich_rule_chain_suffix(self, rich_rule):
        if type(rich_rule.element) in [Rich_Masquerade, Rich_ForwardPort, Rich_IcmpBlock]:
            # These are special and don't have an explicit action
            pass
        elif rich_rule.action:
            if type(rich_rule.action) not in [Rich_Accept, Rich_Reject, Rich_Drop, Rich_Mark]:
                raise FirewallError(INVALID_RULE, "Unknown action %s" % type(rich_rule.action))
        else:
            raise FirewallError(INVALID_RULE, "No rule action specified.")

        if rich_rule.priority == 0:
            if type(rich_rule.element) in [Rich_Masquerade, Rich_ForwardPort] or \
               type(rich_rule.action) in [Rich_Accept, Rich_Mark]:
                return "allow"
            elif type(rich_rule.element) in [Rich_IcmpBlock] or \
                 type(rich_rule.action) in [Rich_Reject, Rich_Drop]:
                return "deny"
        elif rich_rule.priority < 0:
            return "pre"
        else:
            return "post"

    def _rich_rule_chain_suffix_from_log(self, rich_rule):
        if not rich_rule.log and not rich_rule.audit:
            raise FirewallError(INVALID_RULE, "Not log or audit")

        if rich_rule.priority == 0:
            return "log"
        elif rich_rule.priority < 0:
            return "pre"
        else:
            return "post"

    def _zone_interface_fragment(self):
        return {"%%ZONE_INTERFACE%%": None}

    def _zone_source_fragment(self, zone, address):
        if check_single_address("ipv6", address):
            address = normalizeIP6(address)
        elif check_address("ipv6", address):
            addr_split = address.split("/")
            address = normalizeIP6(addr_split[0]) + "/" + addr_split[1]
        return {"%%ZONE_SOURCE%%": {"zone": zone, "address": address}}

    def _policy_priority_fragment(self, policy):
        return {"%%POLICY_PRIORITY%%": policy.priority}

    def _rich_rule_priority_fragment(self, rich_rule):
        if not rich_rule or rich_rule.priority == 0:
            return {}
        return {"%%RICH_RULE_PRIORITY%%": rich_rule.priority}

    def _rich_rule_log(self, policy, rich_rule, enable, table, expr_fragments):
        if not rich_rule.log:
            return {}

        _policy = self._fw.policy.policy_base_chain_name(policy, table, POLICY_CHAIN_PREFIX)

        add_del = { True: "add", False: "delete" }[enable]

        chain_suffix = self._rich_rule_chain_suffix_from_log(rich_rule)

        log_options = {}
        if rich_rule.log.prefix:
            log_options["prefix"] = "%s" % rich_rule.log.prefix
        if rich_rule.log.level:
            level = "warn" if "warning" == rich_rule.log.level else rich_rule.log.level
            log_options["level"] = "%s" % level

        rule = {"family": "inet",
                "table": TABLE_NAME,
                "chain": "%s_%s_%s" % (table, _policy, chain_suffix),
                "expr": expr_fragments +
                        [self._rich_rule_limit_fragment(rich_rule.log.limit),
                         {"log": log_options}]}
        rule.update(self._rich_rule_priority_fragment(rich_rule))
        return {add_del: {"rule": rule}}

    def _rich_rule_audit(self, policy, rich_rule, enable, table, expr_fragments):
        if not rich_rule.audit:
            return {}

        _policy = self._fw.policy.policy_base_chain_name(policy, table, POLICY_CHAIN_PREFIX)

        add_del = { True: "add", False: "delete" }[enable]

        chain_suffix = self._rich_rule_chain_suffix_from_log(rich_rule)
        rule = {"family": "inet",
                "table": TABLE_NAME,
                "chain": "%s_%s_%s" % (table, _policy, chain_suffix),
                "expr": expr_fragments +
                        [self._rich_rule_limit_fragment(rich_rule.audit.limit),
                         {"log": {"level": "audit"}}]}
        rule.update(self._rich_rule_priority_fragment(rich_rule))
        return {add_del: {"rule": rule}}

    def _rich_rule_action(self, policy, rich_rule, enable, table, expr_fragments):
        if not rich_rule.action:
            return {}

        _policy = self._fw.policy.policy_base_chain_name(policy, table, POLICY_CHAIN_PREFIX)

        add_del = { True: "add", False: "delete" }[enable]

        chain_suffix = self._rich_rule_chain_suffix(rich_rule)
        chain = "%s_%s_%s" % (table, _policy, chain_suffix)
        if type(rich_rule.action) == Rich_Accept:
            rule_action = {"accept": None}
        elif type(rich_rule.action) == Rich_Reject:
            if rich_rule.action.type:
                rule_action = self._reject_types_fragment(rich_rule.action.type)
            else:
                rule_action = {"reject": None}
        elif type(rich_rule.action) ==  Rich_Drop:
            rule_action = {"drop": None}
        elif type(rich_rule.action) == Rich_Mark:
            table = "mangle"
            _policy = self._fw.policy.policy_base_chain_name(policy, table, POLICY_CHAIN_PREFIX)
            chain = "%s_%s_%s" % (table, _policy, chain_suffix)
            value = rich_rule.action.set.split("/")
            if len(value) > 1:
                rule_action = {"mangle": {"key": {"meta": {"key": "mark"}},
                                          "value": {"^": [{"&": [{"meta": {"key": "mark"}}, value[1]]}, value[0]]}}}
            else:
                rule_action = {"mangle": {"key": {"meta": {"key": "mark"}},
                                          "value": value[0]}}

        else:
            raise FirewallError(INVALID_RULE,
                                "Unknown action %s" % type(rich_rule.action))

        rule = {"family": "inet",
                "table": TABLE_NAME,
                "chain": chain,
                "expr": expr_fragments +
                        [self._rich_rule_limit_fragment(rich_rule.action.limit), rule_action]}
        rule.update(self._rich_rule_priority_fragment(rich_rule))
        return {add_del: {"rule": rule}}

    def _rule_addr_fragment(self, addr_field, address, invert=False):
        if address.startswith("ipset:"):
            return self._set_match_fragment(address[len("ipset:"):], True if "daddr" == addr_field else False, invert)
        else:
            if check_mac(address):
                family = "ether"
            elif check_single_address("ipv4", address):
                family = "ip"
            elif check_address("ipv4", address):
                family = "ip"
                normalized_address = ipaddress.IPv4Network(address, strict=False)
                address = {"prefix": {"addr": normalized_address.network_address.compressed, "len": normalized_address.prefixlen}}
            elif check_single_address("ipv6", address):
                family = "ip6"
                address = normalizeIP6(address)
            else:
                family = "ip6"
                addr_len = address.split("/")
                address = {"prefix": {"addr": normalizeIP6(addr_len[0]), "len": int(addr_len[1])}}

            return {"match": {"left": {"payload": {"protocol": family,
                                                   "field": addr_field}},
                              "op": "!=" if invert else "==",
                              "right": address}}

    def _rich_rule_family_fragment(self, rich_family):
        if not rich_family:
            return {}
        if rich_family not in ["ipv4", "ipv6"]:
            raise FirewallError(INVALID_RULE,
                                "Invalid family" % rich_family)

        return {"match": {"left": {"meta": {"key": "nfproto"}},
                          "op": "==",
                          "right": rich_family}}

    def _rich_rule_destination_fragment(self, rich_dest):
        if not rich_dest:
            return {}
        if rich_dest.addr:
            address = rich_dest.addr
        elif rich_dest.ipset:
            address = "ipset:" + rich_dest.ipset

        return self._rule_addr_fragment("daddr", address, invert=rich_dest.invert)

    def _rich_rule_source_fragment(self, rich_source):
        if not rich_source:
            return {}

        if rich_source.addr:
            address = rich_source.addr
        elif hasattr(rich_source, "mac") and rich_source.mac:
            address = rich_source.mac
        elif hasattr(rich_source, "ipset") and rich_source.ipset:
            address = "ipset:" + rich_source.ipset

        return self._rule_addr_fragment("saddr", address, invert=rich_source.invert)

    def _port_fragment(self, port):
        range = getPortRange(port)
        if isinstance(range, int) and range < 0:
            raise FirewallError(INVALID_PORT)
        elif len(range) == 1:
            return range[0]
        else:
            return {"range": [range[0], range[1]]}

    def build_policy_ports_rules(self, enable, policy, proto, port, destination=None, rich_rule=None):
        add_del = { True: "add", False: "delete" }[enable]
        table = "filter"
        _policy = self._fw.policy.policy_base_chain_name(policy, table, POLICY_CHAIN_PREFIX)

        expr_fragments = []
        if rich_rule:
            expr_fragments.append(self._rich_rule_family_fragment(rich_rule.family))
        if destination:
            expr_fragments.append(self._rule_addr_fragment("daddr", destination))
        if rich_rule:
            expr_fragments.append(self._rich_rule_destination_fragment(rich_rule.destination))
            expr_fragments.append(self._rich_rule_source_fragment(rich_rule.source))

        expr_fragments.append({"match": {"left": {"payload": {"protocol": proto,
                                                              "field": "dport"}},
                                         "op": "==",
                                         "right": self._port_fragment(port)}})
        if not rich_rule or type(rich_rule.action) != Rich_Mark:
            expr_fragments.append({"match": {"left": {"ct": {"key": "state"}},
                                             "op": "in",
                                             "right": {"set": ["new", "untracked"]}}})

        rules = []
        if rich_rule:
            rules.append(self._rich_rule_log(policy, rich_rule, enable, table, expr_fragments))
            rules.append(self._rich_rule_audit(policy, rich_rule, enable, table, expr_fragments))
            rules.append(self._rich_rule_action(policy, rich_rule, enable, table, expr_fragments))
        else:
            rules.append({add_del: {"rule": {"family": "inet",
                                             "table": TABLE_NAME,
                                             "chain": "%s_%s_allow" % (table, _policy),
                                             "expr": expr_fragments + [{"accept": None}]}}})

        return rules
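
    # Minimal illustrative call (not upstream code): without a rich rule,
    # build_policy_ports_rules(True, policy, "tcp", "443") yields one verb of
    # the form
    #   {"add": {"rule": {"family": "inet", "table": TABLE_NAME,
    #                     "chain": "filter_<policy>_allow",
    #                     "expr": [<tcp dport 443 match>,
    #                              <ct state {new, untracked} match>,
    #                              {"accept": None}]}}}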

    def build_policy_protocol_rules(self, enable, policy, protocol, destination=None, rich_rule=None):
        add_del = { True: "add", False: "delete" }[enable]
        table = "filter"
        _policy = self._fw.policy.policy_base_chain_name(policy, table, POLICY_CHAIN_PREFIX)

        expr_fragments = []
        if rich_rule:
            expr_fragments.append(self._rich_rule_family_fragment(rich_rule.family))
        if destination:
            expr_fragments.append(self._rule_addr_fragment("daddr", destination))
        if rich_rule:
            expr_fragments.append(self._rich_rule_destination_fragment(rich_rule.destination))
            expr_fragments.append(self._rich_rule_source_fragment(rich_rule.source))

        expr_fragments.append({"match": {"left": {"meta": {"key": "l4proto"}},
                                         "op": "==",
                                         "right": protocol}})
        if not rich_rule or type(rich_rule.action) != Rich_Mark:
            expr_fragments.append({"match": {"left": {"ct": {"key": "state"}},
                                             "op": "in",
                                             "right": {"set": ["new", "untracked"]}}})

        rules = []
        if rich_rule:
            rules.append(self._rich_rule_log(policy, rich_rule, enable, table, expr_fragments))
            rules.append(self._rich_rule_audit(policy, rich_rule, enable, table, expr_fragments))
            rules.append(self._rich_rule_action(policy, rich_rule, enable, table, expr_fragments))
        else:
            rules.append({add_del: {"rule": {"family": "inet",
                                             "table": TABLE_NAME,
                                             "chain": "%s_%s_allow" % (table, _policy),
                                             "expr": expr_fragments + [{"accept": None}]}}})

        return rules

    def build_policy_source_ports_rules(self, enable, policy, proto, port,
                                      destination=None, rich_rule=None):
        add_del = { True: "add", False: "delete" }[enable]
        table = "filter"
        _policy = self._fw.policy.policy_base_chain_name(policy, table, POLICY_CHAIN_PREFIX)

        expr_fragments = []
        if rich_rule:
            expr_fragments.append(self._rich_rule_family_fragment(rich_rule.family))
        if destination:
            expr_fragments.append(self._rule_addr_fragment("daddr", destination))
        if rich_rule:
            expr_fragments.append(self._rich_rule_destination_fragment(rich_rule.destination))
            expr_fragments.append(self._rich_rule_source_fragment(rich_rule.source))

        expr_fragments.append({"match": {"left": {"payload": {"protocol": proto,
                                                              "field": "sport"}},
                                         "op": "==",
                                         "right": self._port_fragment(port)}})
        if not rich_rule or type(rich_rule.action) != Rich_Mark:
            expr_fragments.append({"match": {"left": {"ct": {"key": "state"}},
                                             "op": "in",
                                             "right": {"set": ["new", "untracked"]}}})

        rules = []
        if rich_rule:
            rules.append(self._rich_rule_log(policy, rich_rule, enable, table, expr_fragments))
            rules.append(self._rich_rule_audit(policy, rich_rule, enable, table, expr_fragments))
            rules.append(self._rich_rule_action(policy, rich_rule, enable, table, expr_fragments))
        else:
            rules.append({add_del: {"rule": {"family": "inet",
                                             "table": TABLE_NAME,
                                             "chain": "%s_%s_allow" % (table, _policy),
                                             "expr": expr_fragments + [{"accept": None}]}}})

        return rules

    def build_policy_helper_ports_rules(self, enable, policy, proto, port,
                                        destination, helper_name, module_short_name):
        table = "filter"
        _policy = self._fw.policy.policy_base_chain_name(policy, table, POLICY_CHAIN_PREFIX)
        add_del = { True: "add", False: "delete" }[enable]
        rules = []

        if enable:
            rules.append({"add": {"ct helper": {"family": "inet",
                                                "table": TABLE_NAME,
                                                "name": "helper-%s-%s" % (helper_name, proto),
                                                "type": module_short_name,
                                                "protocol": proto}}})

        expr_fragments = []
        if destination:
            expr_fragments.append(self._rule_addr_fragment("daddr", destination))
        expr_fragments.append({"match": {"left": {"payload": {"protocol": proto,
                                                              "field": "dport"}},
                                         "op": "==",
                                         "right": self._port_fragment(port)}})
        expr_fragments.append({"ct helper": "helper-%s-%s" % (helper_name, proto)})
        rules.append({add_del: {"rule": {"family": "inet",
                                         "table": TABLE_NAME,
                                         "chain": "filter_%s_allow" % (_policy),
                                         "expr": expr_fragments}}})

        return rules

    def build_zone_forward_rules(self, enable, zone, policy, table, interface=None, source=None):
        add_del = { True: "add", False: "delete" }[enable]
        _policy = self._fw.policy.policy_base_chain_name(policy, table, POLICY_CHAIN_PREFIX)

        rules = []

        if interface:
            if interface[len(interface)-1] == "+":
                interface = interface[:len(interface)-1] + "*"

            expr = [{"match": {"left": {"meta": {"key": "oifname"}},
                               "op": "==",
                               "right": interface}},
                    {"accept": None}]
        else: # source
            expr = [self._rule_addr_fragment("daddr", source), {"accept": None}]

        rule = {"family": "inet",
                "table": TABLE_NAME,
                "chain": "filter_%s_allow" % (_policy),
                "expr": expr}
        rules.append({add_del: {"rule": rule}})

        return rules

    def _build_policy_masquerade_nat_rules(self, enable, policy, family, rich_rule=None):
        table = "nat"
        _policy = self._fw.policy.policy_base_chain_name(policy, table, POLICY_CHAIN_PREFIX, isSNAT=True)

        add_del = { True: "add", False: "delete" }[enable]

        expr_fragments = []
        if rich_rule:
            expr_fragments.append(self._rich_rule_destination_fragment(rich_rule.destination))
            expr_fragments.append(self._rich_rule_source_fragment(rich_rule.source))
            chain_suffix = self._rich_rule_chain_suffix(rich_rule)
        else:
            chain_suffix = "allow"

        rule = {"family": family,
                "table": TABLE_NAME,
                "chain": "nat_%s_%s" % (_policy, chain_suffix),
                "expr": expr_fragments +
                        [{"match": {"left": {"meta": {"key": "oifname"}},
                                    "op": "!=",
                                    "right": "lo"}},
                         {"masquerade": None}]}
        rule.update(self._rich_rule_priority_fragment(rich_rule))
        return [{add_del: {"rule": rule}}]

    def build_policy_masquerade_rules(self, enable, policy, rich_rule=None):
        # nat tables need to use the ip/ip6 family
        rules = []
        if rich_rule and (rich_rule.family and rich_rule.family == "ipv6"
           or rich_rule.source and check_address("ipv6", rich_rule.source.addr)):
            rules.extend(self._build_policy_masquerade_nat_rules(enable, policy, "ip6", rich_rule))
        elif rich_rule and (rich_rule.family and rich_rule.family == "ipv4"
           or rich_rule.source and check_address("ipv4", rich_rule.source.addr)):
            rules.extend(self._build_policy_masquerade_nat_rules(enable, policy, "ip", rich_rule))
        else:
            rules.extend(self._build_policy_masquerade_nat_rules(enable, policy, "ip", rich_rule))

        table = "filter"
        _policy = self._fw.policy.policy_base_chain_name(policy, table, POLICY_CHAIN_PREFIX)
        add_del = { True: "add", False: "delete" }[enable]

        expr_fragments = []
        if rich_rule:
            expr_fragments.append(self._rich_rule_destination_fragment(rich_rule.destination))
            expr_fragments.append(self._rich_rule_source_fragment(rich_rule.source))
            chain_suffix = self._rich_rule_chain_suffix(rich_rule)
        else:
            chain_suffix = "allow"

        rule = {"family": "inet",
                "table": TABLE_NAME,
                "chain": "filter_%s_%s" % (_policy, chain_suffix),
                "expr": expr_fragments +
                        [{"match": {"left": {"ct": {"key": "state"}},
                                    "op": "in",
                                    "right": {"set": ["new", "untracked"]}}},
                         {"accept": None}]}
        rule.update(self._rich_rule_priority_fragment(rich_rule))
        rules.append({add_del: {"rule": rule}})

        return rules

    def _build_policy_forward_port_nat_rules(self, enable, policy, port, protocol,
                                           toaddr, toport, family,
                                           rich_rule=None):
        table = "nat"
        _policy = self._fw.policy.policy_base_chain_name(policy, table, POLICY_CHAIN_PREFIX)
        add_del = { True: "add", False: "delete" }[enable]

        expr_fragments = []
        if rich_rule:
            expr_fragments.append(self._rich_rule_destination_fragment(rich_rule.destination))
            expr_fragments.append(self._rich_rule_source_fragment(rich_rule.source))
            chain_suffix = self._rich_rule_chain_suffix(rich_rule)
        else:
            chain_suffix = "allow"

        expr_fragments.append({"match": {"left": {"payload": {"protocol": protocol,
                                                              "field": "dport"}},
                                         "op": "==",
                                         "right": self._port_fragment(port)}})

        if toaddr:
            if check_single_address("ipv6", toaddr):
                toaddr = normalizeIP6(toaddr)
            if toport and toport != "":
                expr_fragments.append({"dnat": {"addr": toaddr, "port": self._port_fragment(toport)}})
            else:
                expr_fragments.append({"dnat": {"addr": toaddr}})
        else:
            expr_fragments.append({"redirect": {"port": self._port_fragment(toport)}})

        rule = {"family": family,
                "table": TABLE_NAME,
                "chain": "nat_%s_%s" % (_policy, chain_suffix),
                "expr": expr_fragments}
        rule.update(self._rich_rule_priority_fragment(rich_rule))
        return [{add_del: {"rule": rule}}]

    def build_policy_forward_port_rules(self, enable, policy, port,
                                      protocol, toport, toaddr, rich_rule=None):
        rules = []
        if rich_rule and (rich_rule.family and rich_rule.family == "ipv6"
           or toaddr and check_single_address("ipv6", toaddr)):
            rules.extend(self._build_policy_forward_port_nat_rules(enable, policy,
                                port, protocol, toaddr, toport, "ip6", rich_rule))
        elif rich_rule and (rich_rule.family and rich_rule.family == "ipv4"
           or toaddr and check_single_address("ipv4", toaddr)):
            rules.extend(self._build_policy_forward_port_nat_rules(enable, policy,
                                port, protocol, toaddr, toport, "ip", rich_rule))
        else:
            if toaddr and check_single_address("ipv6", toaddr):
                rules.extend(self._build_policy_forward_port_nat_rules(enable, policy,
                                    port, protocol, toaddr, toport, "ip6", rich_rule))
            else:
                rules.extend(self._build_policy_forward_port_nat_rules(enable, policy,
                                    port, protocol, toaddr, toport, "ip", rich_rule))

        return rules

    def _icmp_types_to_nft_fragments(self, ipv, icmp_type):
        if icmp_type in ICMP_TYPES_FRAGMENTS[ipv]:
            return ICMP_TYPES_FRAGMENTS[ipv][icmp_type]
        else:
            raise FirewallError(INVALID_ICMPTYPE,
                                "ICMP type '%s' not supported by %s for %s" % (icmp_type, self.name, ipv))

    def build_policy_icmp_block_rules(self, enable, policy, ict, rich_rule=None):
        table = "filter"
        _policy = self._fw.policy.policy_base_chain_name(policy, table, POLICY_CHAIN_PREFIX)
        add_del = { True: "add", False: "delete" }[enable]

        if rich_rule and rich_rule.ipvs:
            ipvs = rich_rule.ipvs
        elif ict.destination:
            ipvs = []
            if "ipv4" in ict.destination:
                ipvs.append("ipv4")
            if "ipv6" in ict.destination:
                ipvs.append("ipv6")
        else:
            ipvs = ["ipv4", "ipv6"]

        rules = []
        for ipv in ipvs:
            if self._fw.policy.query_icmp_block_inversion(policy):
                final_chain = "%s_%s_allow" % (table, _policy)
                target_fragment = {"accept": None}
            else:
                final_chain = "%s_%s_deny" % (table, _policy)
                target_fragment = self._reject_fragment()

            expr_fragments = []
            if rich_rule:
                expr_fragments.append(self._rich_rule_family_fragment(rich_rule.family))
                expr_fragments.append(self._rich_rule_destination_fragment(rich_rule.destination))
                expr_fragments.append(self._rich_rule_source_fragment(rich_rule.source))
            expr_fragments.extend(self._icmp_types_to_nft_fragments(ipv, ict.name))

            if rich_rule:
                rules.append(self._rich_rule_log(policy, rich_rule, enable, table, expr_fragments))
                rules.append(self._rich_rule_audit(policy, rich_rule, enable, table, expr_fragments))
                if rich_rule.action:
                    rules.append(self._rich_rule_action(policy, rich_rule, enable, table, expr_fragments))
                else:
                    chain_suffix = self._rich_rule_chain_suffix(rich_rule)
                    rule = {"family": "inet",
                            "table": TABLE_NAME,
                            "chain": "%s_%s_%s" % (table, _policy, chain_suffix),
                            "expr": expr_fragments + [self._reject_fragment()]}
                    rule.update(self._rich_rule_priority_fragment(rich_rule))
                    rules.append({add_del: {"rule": rule}})
            else:
                if self._fw.get_log_denied() != "off" and not self._fw.policy.query_icmp_block_inversion(policy):
                    rules.append({add_del: {"rule": {"family": "inet",
                                                     "table": TABLE_NAME,
                                                     "chain": final_chain,
                                                     "expr": (expr_fragments +
                                                              [self._pkttype_match_fragment(self._fw.get_log_denied()),
                                                               {"log": {"prefix": "\"%s_%s_ICMP_BLOCK: \"" % (table, policy)}}])}}})
                rules.append({add_del: {"rule": {"family": "inet",
                                                 "table": TABLE_NAME,
                                                 "chain": final_chain,
                                                 "expr": expr_fragments + [target_fragment]}}})

        return rules

    def build_policy_icmp_block_inversion_rules(self, enable, policy):
        table = "filter"
        _policy = self._fw.policy.policy_base_chain_name(policy, table, POLICY_CHAIN_PREFIX)
        rules = []
        add_del = { True: "add", False: "delete" }[enable]

        if self._fw.policy.query_icmp_block_inversion(policy):
            target_fragment = self._reject_fragment()
        else:
            target_fragment = {"accept": None}

        # WARN: The "index" used here must be kept in sync with
        # build_policy_chain_rules()
        #
        rules.append({add_del: {"rule": {"family": "inet",
                                         "table": TABLE_NAME,
                                         "chain": "%s_%s" % (table, _policy),
                                         "index": 4,
                                         "expr": [self._icmp_match_fragment(),
                                                  target_fragment]}}})

        if self._fw.get_log_denied() != "off" and self._fw.policy.query_icmp_block_inversion(policy):
            rules.append({add_del: {"rule": {"family": "inet",
                                             "table": TABLE_NAME,
                                             "chain": "%s_%s" % (table, _policy),
                                             "index": 4,
                                             "expr": [self._icmp_match_fragment(),
                                                      self._pkttype_match_fragment(self._fw.get_log_denied()),
                                                      {"log": {"prefix": "%s_%s_ICMP_BLOCK: " % (table, policy)}}]}}})
        return rules

    def build_rpfilter_rules(self, log_denied=False):
        rules = []
        expr_fragments = [{"match": {"left": {"meta": {"key": "nfproto"}},
                                     "op": "==",
                                     "right": "ipv6"}},
                          {"match": {"left": {"fib": {"flags": ["saddr", "iif", "mark"],
                                                      "result": "oif"}},
                                     "op": "==",
                                     "right": False}}]
        if log_denied != "off":
            expr_fragments.append({"log": {"prefix": "rpfilter_DROP: "}})
        expr_fragments.append({"drop": None})

        rules.append({"insert": {"rule": {"family": "inet",
                                          "table": TABLE_NAME,
                                          "chain": "filter_PREROUTING",
                                          "expr": expr_fragments}}})
        # RHBZ#1058505, RHBZ#1575431 (bug in kernel 4.16-4.17)
        rules.append({"insert": {"rule": {"family": "inet",
                                          "table": TABLE_NAME,
                                          "chain": "filter_PREROUTING",
                                          "expr": [{"match": {"left": {"payload": {"protocol": "icmpv6",
                                                                                   "field": "type"}},
                                                              "op": "==",
                                                              "right": {"set": ["nd-router-advert", "nd-neighbor-solicit"]}}},
                                                   {"accept": None}]}}})
        return rules

    def build_rfc3964_ipv4_rules(self):
        daddr_set = ["::0.0.0.0/96", # IPv4 compatible
                     "::ffff:0.0.0.0/96", # IPv4 mapped
                     "2002:0000::/24", # 0.0.0.0/8 (the system has no address assigned yet)
                     "2002:0a00::/24", # 10.0.0.0/8 (private)
                     "2002:7f00::/24", # 127.0.0.0/8 (loopback)
                     "2002:ac10::/28", # 172.16.0.0/12 (private)
                     "2002:c0a8::/32", # 192.168.0.0/16 (private)
                     "2002:a9fe::/32", # 169.254.0.0/16 (IANA Assigned DHCP link-local)
                     "2002:e000::/19", # 224.0.0.0/4 (multicast), 240.0.0.0/4 (reserved and broadcast)
                     ]
        daddr_set = [{"prefix": {"addr": x.split("/")[0], "len": int(x.split("/")[1])}} for x in daddr_set]

        expr_fragments = [{"match": {"left": {"payload": {"protocol": "ip6",
                                                          "field": "daddr"}},
                                     "op": "==",
                                     "right": {"set": daddr_set}}}]
        if self._fw._log_denied in ["unicast", "all"]:
            expr_fragments.append({"log": {"prefix": "RFC3964_IPv4_REJECT: "}})
        expr_fragments.append(self._reject_types_fragment("addr-unreach"))

        rules = []
        # WARN: index must be kept in sync with build_default_rules()
        rules.append({"add": {"rule": {"family": "inet",
                                       "table": TABLE_NAME,
                                       "chain": "filter_OUTPUT",
                                       "index": 1,
                                       "expr": expr_fragments}}})
        rules.append({"add": {"rule": {"family": "inet",
                                       "table": TABLE_NAME,
                                       "chain": "filter_FORWARD",
                                       "index": 2,
                                       "expr": expr_fragments}}})
        return rules

    def build_policy_rich_source_destination_rules(self, enable, policy, rich_rule):
        table = "filter"

        expr_fragments = []
        expr_fragments.append(self._rich_rule_family_fragment(rich_rule.family))
        expr_fragments.append(self._rich_rule_destination_fragment(rich_rule.destination))
        expr_fragments.append(self._rich_rule_source_fragment(rich_rule.source))

        rules = []
        rules.append(self._rich_rule_log(policy, rich_rule, enable, table, expr_fragments))
        rules.append(self._rich_rule_audit(policy, rich_rule, enable, table, expr_fragments))
        rules.append(self._rich_rule_action(policy, rich_rule, enable, table, expr_fragments))

        return rules

    def is_ipv_supported(self, ipv):
        if ipv in ["ipv4", "ipv6", "eb"]:
            return True
        return False

    def _set_type_list(self, ipv, type):
        ipv_addr = {
            "ipv4" : "ipv4_addr",
            "ipv6" : "ipv6_addr",
        }
        types = {
            "hash:ip" : ipv_addr[ipv],
            "hash:ip,port" : [ipv_addr[ipv], "inet_proto", "inet_service"],
            "hash:ip,port,ip" : [ipv_addr[ipv], "inet_proto", "inet_service", ipv_addr[ipv]],
            "hash:ip,port,net" : [ipv_addr[ipv], "inet_proto", "inet_service", ipv_addr[ipv]],
            "hash:ip,mark" : [ipv_addr[ipv], "mark"],

            "hash:net" : ipv_addr[ipv],
            "hash:net,net" : [ipv_addr[ipv], ipv_addr[ipv]],
            "hash:net,port" : [ipv_addr[ipv], "inet_proto", "inet_service"],
            "hash:net,port,net" : [ipv_addr[ipv], "inet_proto", "inet_service", ipv_addr[ipv]],
            "hash:net,iface" : [ipv_addr[ipv], "ifname"],

            "hash:mac" : "ether_addr",
        }
        if type in types:
            return types[type]
        else:
            raise FirewallError(INVALID_TYPE,
                                "ipset type name '%s' is not valid" % type)

    def build_set_create_rules(self, name, type, options=None):
        if options and "family" in options and options["family"] == "inet6":
            ipv = "ipv6"
        else:
            ipv = "ipv4"

        set_dict = {"table": TABLE_NAME,
                    "name": name,
                    "type": self._set_type_list(ipv, type)}

        # Some types need the interval flag
        for t in type.split(":")[1].split(","):
            if t in ["ip", "net", "port"]:
                set_dict["flags"] = ["interval"]
                break

        if options:
            if "timeout" in options:
                set_dict["timeout"] = options["timeout"]
            if "maxelem" in options:
                set_dict["size"] = options["maxelem"]

        rules = []
        for family in ["inet", "ip", "ip6"]:
            rule_dict = {"family": family}
            rule_dict.update(set_dict)
            rules.append({"add": {"set": rule_dict}})

        return rules
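
    # Illustrative output (not upstream code):
    # build_set_create_rules("foo", "hash:ip", {"timeout": 60}) yields three
    # "add set" commands, one per family ("inet", "ip", "ip6"), each roughly
    #   {"add": {"set": {"family": <family>, "table": TABLE_NAME, "name": "foo",
    #                    "type": "ipv4_addr", "flags": ["interval"],
    #                    "timeout": 60}}}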

    def set_create(self, name, type, options=None):
        rules = self.build_set_create_rules(name, type, options)
        self.set_rules(rules, self._fw.get_log_denied())

    def set_destroy(self, name):
        for family in ["inet", "ip", "ip6"]:
            rule = {"delete": {"set": {"family": family,
                                       "table": TABLE_NAME,
                                       "name": name}}}
            self.set_rule(rule, self._fw.get_log_denied())

    def _set_match_fragment(self, name, match_dest, invert=False):
        type_format = self._fw.ipset.get_ipset(name).type.split(":")[1].split(",")

        fragments = []
        for i in range(len(type_format)):
            if type_format[i] == "port":
                fragments.append({"meta": {"key": "l4proto"}})
                fragments.append({"payload": {"protocol": "th",
                                              "field": "dport" if match_dest else "sport"}})
            elif type_format[i] in ["ip", "net", "mac"]:
                fragments.append({"payload": {"protocol": self._set_get_family(name),
                                              "field": "daddr" if match_dest else "saddr"}})
            elif type_format[i] == "iface":
                fragments.append({"meta": {"key": "iifname" if match_dest else "oifname"}})
            elif type_format[i] == "mark":
                fragments.append({"meta": {"key": "mark"}})
            else:
                raise FirewallError("Unsupported ipset type for match fragment: %s" % (type_format[i]))

        return {"match": {"left": {"concat": fragments} if len(type_format) > 1 else fragments[0],
                          "op": "!=" if invert else "==",
                          "right": "@" + name}}

    def _set_entry_fragment(self, name, entry):
        # convert something like
        #    1.2.3.4,sctp:8080 (type hash:ip,port)
        # to
        #    ["1.2.3.4", "sctp", "8080"]
        obj = self._fw.ipset.get_ipset(name)
        type_format = obj.type.split(":")[1].split(",")
        entry_tokens = entry.split(",")
        if len(type_format) != len(entry_tokens):
            raise FirewallError(INVALID_ENTRY,
                                "Number of values does not match ipset type.")
        fragment = []
        for i in range(len(type_format)):
            if type_format[i] == "port":
                try:
                    index = entry_tokens[i].index(":")
                except ValueError:
                    # no protocol means default tcp
                    fragment.append("tcp")
                    port_str = entry_tokens[i]
                else:
                    fragment.append(entry_tokens[i][:index])
                    port_str = entry_tokens[i][index+1:]

                try:
                    index = port_str.index("-")
                except ValueError:
                    fragment.append(port_str)
                else:
                    fragment.append({"range": [port_str[:index], port_str[index+1:]]})

            elif type_format[i] in ["ip", "net"]:
                if '-' in entry_tokens[i]:
                    fragment.append({"range": entry_tokens[i].split('-') })
                else:
                    try:
                        index = entry_tokens[i].index("/")
                    except ValueError:
                        addr = entry_tokens[i]
                        if "family" in obj.options and obj.options["family"] == "inet6":
                            addr = normalizeIP6(addr)
                        fragment.append(addr)
                    else:
                        addr = entry_tokens[i][:index]
                        if "family" in obj.options and obj.options["family"] == "inet6":
                            addr = normalizeIP6(addr)
                        fragment.append({"prefix": {"addr": addr,
                                                    "len": int(entry_tokens[i][index+1:])}})
            else:
                fragment.append(entry_tokens[i])
        return [{"concat": fragment}] if len(type_format) > 1 else fragment

    def build_set_add_rules(self, name, entry):
        rules = []
        element = self._set_entry_fragment(name, entry)
        for family in ["inet", "ip", "ip6"]:
            rules.append({"add": {"element": {"family": family,
                                              "table": TABLE_NAME,
                                              "name": name,
                                              "elem": element}}})
        return rules

    def set_add(self, name, entry):
        rules = self.build_set_add_rules(name, entry)
        self.set_rules(rules, self._fw.get_log_denied())

    def set_delete(self, name, entry):
        element = self._set_entry_fragment(name, entry)
        for family in ["inet", "ip", "ip6"]:
            rule = {"delete": {"element": {"family": family,
                                           "table": TABLE_NAME,
                                           "name": name,
                                           "elem": element}}}
            self.set_rule(rule, self._fw.get_log_denied())

    def build_set_flush_rules(self, name):
        rules = []
        for family in ["inet", "ip", "ip6"]:
            rule = {"flush": {"set": {"family": family,
                                      "table": TABLE_NAME,
                                      "name": name}}}
            rules.append(rule)
        return rules

    def set_flush(self, name):
        rules = self.build_set_flush_rules(name)
        self.set_rules(rules, self._fw.get_log_denied())

    def _set_get_family(self, name):
        ipset = self._fw.ipset.get_ipset(name)

        if ipset.type == "hash:mac":
            family = "ether"
        elif ipset.options and "family" in ipset.options \
             and ipset.options["family"] == "inet6":
            family = "ip6"
        else:
            family = "ip"

        return family

    def set_restore(self, set_name, type_name, entries,
                    create_options=None, entry_options=None):
        rules = []
        rules.extend(self.build_set_create_rules(set_name, type_name, create_options))
        rules.extend(self.build_set_flush_rules(set_name))

        # avoid large memory usage by chunking the entries
        chunk = 0
        for entry in entries:
            rules.extend(self.build_set_add_rules(set_name, entry))
            chunk += 1
            if chunk >= 1000:
                self.set_rules(rules, self._fw.get_log_denied())
                rules.clear()
                chunk = 0
        else:  # for ... else: no break above, so any remaining partial chunk is always flushed
            self.set_rules(rules, self._fw.get_log_denied())
site-packages/firewall/core/fw_helper.py
# -*- coding: utf-8 -*-
#
# Copyright (C) 2015-2016 Red Hat, Inc.
#
# Authors:
# Thomas Woerner <twoerner@redhat.com>
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program.  If not, see <http://www.gnu.org/licenses/>.
#

"""helper backend"""

__all__ = [ "FirewallHelper" ]

from firewall import errors
from firewall.errors import FirewallError

class FirewallHelper(object):
    def __init__(self, fw):
        self._fw = fw
        self._helpers = { }

    def __repr__(self):
        return '%s(%r)' % (self.__class__, self._helpers)

    # helpers

    def cleanup(self):
        self._helpers.clear()

    def check_helper(self, name):
        if name not in self.get_helpers():
            raise FirewallError(errors.INVALID_HELPER, name)

    def query_helper(self, name):
        return name in self.get_helpers()

    def get_helpers(self):
        return sorted(self._helpers.keys())

    def has_helpers(self):
        return len(self._helpers) > 0

    def get_helper(self, name):
        self.check_helper(name)
        return self._helpers[name]

    def add_helper(self, obj):
        self._helpers[obj.name] = obj

    def remove_helper(self, name):
        if name not in self._helpers:
            raise FirewallError(errors.INVALID_HELPER, name)
        del self._helpers[name]
site-packages/firewall/core/__init__.py
site-packages/firewall/core/fw_zone.py
# -*- coding: utf-8 -*-
#
# Copyright (C) 2011-2016 Red Hat, Inc.
#
# Authors:
# Thomas Woerner <twoerner@redhat.com>
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program.  If not, see <http://www.gnu.org/licenses/>.
#

import time
import copy
from firewall.core.base import SHORTCUTS, DEFAULT_ZONE_TARGET, SOURCE_IPSET_TYPES
from firewall.core.fw_transaction import FirewallTransaction
from firewall.core.io.policy import Policy
from firewall.core.logger import log
from firewall.core.rich import Rich_Service, Rich_Port, Rich_Protocol, Rich_SourcePort, Rich_ForwardPort, \
                               Rich_IcmpBlock, Rich_IcmpType, Rich_Masquerade, Rich_Mark
from firewall.functions import checkIPnMask, checkIP6nMask, check_mac
from firewall import errors
from firewall.errors import FirewallError
from firewall.fw_types import LastUpdatedOrderedDict

class FirewallZone(object):
    ZONE_POLICY_PRIORITY = 0

    def __init__(self, fw):
        self._fw = fw
        self._zones = { }
        self._zone_policies = { }

    def __repr__(self):
        return '%s(%r)' % (self.__class__, self._zones)

    def cleanup(self):
        self._zones.clear()
        self._zone_policies.clear()

    def new_transaction(self):
        return FirewallTransaction(self._fw)

    def policy_name_from_zones(self, fromZone, toZone):
        return "zone_{fromZone}_{toZone}".format(fromZone=fromZone, toZone=toZone)

    # zones

    def get_zones(self):
        return sorted(self._zones.keys())

    def get_active_zones(self):
        active_zones = []
        for zone in self.get_zones():
            if self.list_interfaces(zone) or self.list_sources(zone):
                active_zones.append(zone)
        return active_zones

    def get_zone_of_interface(self, interface):
        interface_id = self.__interface_id(interface)
        for zone in self._zones:
            if interface_id in self._zones[zone].settings["interfaces"]:
                # an interface can only be part of one zone
                return zone
        return None

    def get_zone_of_source(self, source):
        source_id = self.__source_id(source)
        for zone in self._zones:
            if source_id in self._zones[zone].settings["sources"]:
                # a source_id can only be part of one zone
                return zone
        return None

    def get_zone(self, zone):
        z = self._fw.check_zone(zone)
        return self._zones[z]

    def policy_obj_from_zone_obj(self, z_obj, fromZone, toZone):
        p_obj = Policy()
        p_obj.derived_from_zone = z_obj.name
        p_obj.name = self.policy_name_from_zones(fromZone, toZone)
        p_obj.priority = self.ZONE_POLICY_PRIORITY
        p_obj.target = z_obj.target
        p_obj.ingress_zones = [fromZone]
        p_obj.egress_zones = [toZone]

        # copy zone permanent config to policy permanent config
        # WARN: This assumes the same attribute names.
        #
        for setting in ["services", "ports",
                        "masquerade", "forward_ports",
                        "source_ports",
                        "icmp_blocks", "rules",
                        "protocols"]:
            if fromZone == z_obj.name and toZone == "HOST" and \
               setting in ["services", "ports", "source_ports", "icmp_blocks", "protocols"]:
                # zone --> HOST
                setattr(p_obj, setting, copy.deepcopy(getattr(z_obj, setting)))
            elif fromZone == "ANY" and toZone == z_obj.name and setting in ["masquerade"]:
                # any zone --> zone
                setattr(p_obj, setting, copy.deepcopy(getattr(z_obj, setting)))
            elif fromZone == z_obj.name and toZone == "ANY" and \
                 setting in ["icmp_blocks", "forward_ports"]:
                # zone --> any zone
                setattr(p_obj, setting, copy.deepcopy(getattr(z_obj, setting)))
            elif setting in ["rules"]:
                p_obj.rules = []
                for rule in z_obj.rules:
                    current_policy = self.policy_name_from_zones(fromZone, toZone)

                    if current_policy in self._rich_rule_to_policies(z_obj.name, rule):
                        p_obj.rules.append(copy.deepcopy(rule))

        return p_obj

    def add_zone(self, obj):
        obj.settings = { x : LastUpdatedOrderedDict()
                         for x in ["interfaces", "sources",
                                   "icmp_block_inversion",
                                   "forward"] }
        self._zones[obj.name] = obj
        self._zone_policies[obj.name] = []

        # Create policy objects, will need many:
        #   - (zone --> HOST) - ports, service, etc
        #   - (any zone --> zone) - masquerade
        #   - (zone --> any zone) - ICMP block, icmp block inversion
        #       - also includes forward-ports because it works on (nat,
        #       PREROUTING) and therefore applies to redirects to the local
        #       host or dnat to a different host.
        #       - also includes rich rule "mark" action for the same reason
        #
        for fromZone,toZone in [(obj.name, "HOST"),
                                ("ANY", obj.name), (obj.name, "ANY")]:
            p_obj = self.policy_obj_from_zone_obj(obj, fromZone, toZone)
            self._fw.policy.add_policy(p_obj)
            self._zone_policies[obj.name].append(p_obj.name)
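        # Example (illustrative): for a zone object named "public" the loop
        # above creates and registers three derived policies:
        #   zone_public_HOST (zone --> HOST: services, ports, source_ports,
        #                     icmp_blocks, protocols)
        #   zone_ANY_public  (any zone --> zone: masquerade)
        #   zone_public_ANY  (zone --> any zone: icmp_blocks, forward_ports)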

        self.copy_permanent_to_runtime(obj.name)

    def copy_permanent_to_runtime(self, zone):
        obj = self._zones[zone]

        for arg in obj.interfaces:
            self.add_interface(zone, arg, allow_apply=False)
        for arg in obj.sources:
            self.add_source(zone, arg, allow_apply=False)
        if obj.forward:
            self.add_forward(zone)
        if obj.icmp_block_inversion:
            self.add_icmp_block_inversion(zone)

    def remove_zone(self, zone):
        obj = self._zones[zone]
        if obj.applied:
            self.unapply_zone_settings(zone)
        obj.settings.clear()
        del self._zones[zone]
        del self._zone_policies[zone]

    def apply_zones(self, use_transaction=None):
        for zone in self.get_zones():
            z_obj = self._zones[zone]
            if len(z_obj.interfaces) > 0 or len(z_obj.sources) > 0:
                log.debug1("Applying zone '%s'", zone)
                self.apply_zone_settings(zone, use_transaction=use_transaction)

    def set_zone_applied(self, zone, applied):
        obj = self._zones[zone]
        obj.applied = applied

    # zone from chain

    def zone_from_chain(self, chain):
        if "_" not in chain:
            # no zone chain
            return None
        splits = chain.split("_")
        if len(splits) < 2:
            return None
        _chain = None
        for x in SHORTCUTS:
            if splits[0] == SHORTCUTS[x]:
                _chain = x
        if _chain is not None:
            # next part needs to be zone name
            if splits[1] not in self.get_zones():
                return None
            if len(splits) == 2 or \
               (len(splits) == 3 and splits[2] in [ "pre", "log", "deny", "allow", "post" ]):
                return (splits[1], _chain)
        return None
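        # Examples (illustrative, assuming SHORTCUTS maps "INPUT" to "IN" and
        # a zone named "public" exists):
        #   zone_from_chain("IN_public_allow") -> ("public", "INPUT")
        #   zone_from_chain("IN_public")       -> ("public", "INPUT")
        #   zone_from_chain("FORWARD")         -> None  (not a zone chain)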

    def policy_from_chain(self, chain):
        x = self.zone_from_chain(chain)
        if x is None:
            return None

        (zone, _chain) = x
        # derived from _get_table_chains_for_zone_dispatch()
        if _chain in ["PREROUTING", "FORWARD_IN"]:
            fromZone = zone
            toZone = "ANY"
        elif _chain in ["INPUT"]:
            fromZone = zone
            toZone = "HOST"
        elif _chain in ["POSTROUTING", "FORWARD_OUT"]:
            fromZone = "ANY"
            toZone = zone
        else:
            raise FirewallError(errors.INVALID_CHAIN, "chain '%s' can't be mapped to a policy" % (chain))

        return (self.policy_name_from_zones(fromZone, toZone), _chain)
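        # Examples (illustrative, additionally assuming SHORTCUTS maps
        # "PREROUTING" to "PRE"):
        #   policy_from_chain("IN_public_allow") -> ("zone_public_HOST", "INPUT")
        #   policy_from_chain("PRE_public")      -> ("zone_public_ANY", "PREROUTING")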

    def create_zone_base_by_chain(self, ipv, table, chain,
                                  use_transaction=None):

        # Create zone base chains if the chain is reserved for a zone
        if ipv in [ "ipv4", "ipv6" ]:
            x = self.policy_from_chain(chain)
            if x is not None:
                (policy, _chain) = self.policy_from_chain(chain)
                if use_transaction is None:
                    transaction = self.new_transaction()
                else:
                    transaction = use_transaction

                self._fw.policy.gen_chain_rules(policy, True, table, _chain,
                                                transaction)

                if use_transaction is None:
                    transaction.execute(True)

    # settings

    # generate settings record with sender, timeout
    def __gen_settings(self, timeout, sender):
        ret = {
            "date": time.time(),
            "sender": sender,
            "timeout": timeout,
        }
        return ret
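        # Illustrative shape of the returned record (the sender is typically a
        # D-Bus sender name, or None for internal callers):
        #   {"date": 1700000000.0, "sender": ":1.42", "timeout": 0}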

    def get_settings(self, zone):
        return self.get_zone(zone).settings

    def _zone_settings(self, enable, zone, transaction):
        settings = self.get_settings(zone)
        for key in settings:
            for args in settings[key]:
                if key == "interfaces":
                    self._interface(enable, zone, args, transaction)
                elif key == "sources":
                    self._source(enable, zone, args[0], args[1], transaction)
                elif key == "icmp_block_inversion":
                    continue
                elif key == "forward":
                    # no need to call this when applying the zone as the rules
                    # will be generated when adding the interfaces/sources
                    pass
                else:
                    log.warning("Zone '%s': Unknown setting '%s:%s', "
                                "unable to apply", zone, key, args)
        # ICMP-block-inversion is always applied
        if enable:
            self._icmp_block_inversion(enable, zone, transaction)

    def apply_zone_settings(self, zone, use_transaction=None):
        _zone = self._fw.check_zone(zone)
        obj = self._zones[_zone]
        if obj.applied:
            return
        obj.applied = True

        if use_transaction is None:
            transaction = self.new_transaction()
        else:
            transaction = use_transaction

        for policy in self._zone_policies[_zone]:
            log.debug1("Applying policy (%s) derived from zone '%s'", policy, zone)
            self._fw.policy.apply_policy_settings(policy, use_transaction=transaction)

        self._zone_settings(True, _zone, transaction)

        if use_transaction is None:
            transaction.execute(True)

    def unapply_zone_settings(self, zone, use_transaction=None):
        _zone = self._fw.check_zone(zone)
        obj = self._zones[_zone]
        if not obj.applied:
            return

        if use_transaction is None:
            transaction = self.new_transaction()
        else:
            transaction = use_transaction

        for policy in self._zone_policies[_zone]:
            self._fw.policy.unapply_policy_settings(policy, use_transaction=transaction)

        self._zone_settings(False, _zone, transaction)

        if use_transaction is None:
            transaction.execute(True)

    def get_config_with_settings(self, zone):
        """
        :return: exported config updated with runtime settings
        """
        obj = self.get_zone(zone)
        conf_dict = self.get_config_with_settings_dict(zone)
        conf_list = []
        for i in range(16): # tuple based API has 16 elements
            if obj.IMPORT_EXPORT_STRUCTURE[i][0] not in conf_dict:
                # old API needs the empty elements as well. Grab it from the
                # class otherwise we don't know the type.
                conf_list.append(copy.deepcopy(getattr(obj, obj.IMPORT_EXPORT_STRUCTURE[i][0])))
            else:
                conf_list.append(conf_dict[obj.IMPORT_EXPORT_STRUCTURE[i][0]])
        return tuple(conf_list)

    def get_config_with_settings_dict(self, zone):
        """
        :return: exported config updated with runtime settings
        """
        permanent = self.get_zone(zone).export_config_dict()
        if permanent["target"] == DEFAULT_ZONE_TARGET:
            permanent["target"] = "default"
        runtime = { "services": self.list_services(zone),
                    "ports": self.list_ports(zone),
                    "icmp_blocks": self.list_icmp_blocks(zone),
                    "masquerade": self.query_masquerade(zone),
                    "forward_ports": self.list_forward_ports(zone),
                    "interfaces": self.list_interfaces(zone),
                    "sources": self.list_sources(zone),
                    "rules_str": self.list_rules(zone),
                    "protocols": self.list_protocols(zone),
                    "source_ports": self.list_source_ports(zone),
                    "icmp_block_inversion": self.query_icmp_block_inversion(zone),
                    "forward": self.query_forward(zone),
                    }
        return self._fw.combine_runtime_with_permanent_settings(permanent, runtime)

    def set_config_with_settings_dict(self, zone, settings, sender):
        # stupid wrappers to convert rich rule string to rich rule object
        from firewall.core.rich import Rich_Rule
        def add_rule_wrapper(zone, rule_str, timeout=0, sender=None):
            self.add_rule(zone, Rich_Rule(rule_str=rule_str), timeout=timeout, sender=sender)
        def remove_rule_wrapper(zone, rule_str):
            self.remove_rule(zone, Rich_Rule(rule_str=rule_str))

        setting_to_fn = {
            "services": (self.add_service, self.remove_service),
            "ports": (self.add_port, self.remove_port),
            "icmp_blocks": (self.add_icmp_block, self.remove_icmp_block),
            "masquerade": (self.add_masquerade, self.remove_masquerade),
            "forward_ports": (self.add_forward_port, self.remove_forward_port),
            "interfaces": (self.add_interface, self.remove_interface),
            "sources": (self.add_source, self.remove_source),
            "rules_str": (add_rule_wrapper, remove_rule_wrapper),
            "protocols": (self.add_protocol, self.remove_protocol),
            "source_ports": (self.add_source_port, self.remove_source_port),
            "icmp_block_inversion": (self.add_icmp_block_inversion, self.remove_icmp_block_inversion),
            "forward": (self.add_forward, self.remove_forward),
        }

        old_settings = self.get_config_with_settings_dict(zone)
        (add_settings, remove_settings) = self._fw.get_added_and_removed_settings(old_settings, settings)

        for key in remove_settings:
            if isinstance(remove_settings[key], list):
                for args in remove_settings[key]:
                    if isinstance(args, tuple):
                        setting_to_fn[key][1](zone, *args)
                    else:
                        setting_to_fn[key][1](zone, args)
            else: # bool
                setting_to_fn[key][1](zone)

        for key in add_settings:
            if isinstance(add_settings[key], list):
                for args in add_settings[key]:
                    if key in ["interfaces", "sources"]:
                        # no timeout arg
                        setting_to_fn[key][0](zone, args, sender=sender)
                    else:
                        if isinstance(args, tuple):
                            setting_to_fn[key][0](zone, *args, timeout=0, sender=sender)
                        else:
                            setting_to_fn[key][0](zone, args, timeout=0, sender=sender)
            else: # bool
                if key in ["icmp_block_inversion"]:
                    # no timeout arg
                    setting_to_fn[key][0](zone, sender=sender)
                else:
                    setting_to_fn[key][0](zone, timeout=0, sender=sender)
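        # Sketch of the flow (illustrative): if the old runtime settings have
        # services ["ssh"] and the new settings dict has services
        # ["ssh", "http"], the diff puts "http" into add_settings, so
        # setting_to_fn["services"][0] (i.e. self.add_service) is called as
        # add_service(zone, "http", timeout=0, sender=sender).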

    # INTERFACES

    def check_interface(self, interface):
        self._fw.check_interface(interface)

    def interface_get_sender(self, zone, interface):
        _zone = self._fw.check_zone(zone)
        _obj = self._zones[_zone]
        interface_id = self.__interface_id(interface)

        if interface_id in _obj.settings["interfaces"]:
            settings = _obj.settings["interfaces"][interface_id]
            if "sender" in settings and settings["sender"] is not None:
                return settings["sender"]

        return None

    def __interface_id(self, interface):
        self.check_interface(interface)
        return interface

    def add_interface(self, zone, interface, sender=None,
                      use_transaction=None, allow_apply=True):
        self._fw.check_panic()
        _zone = self._fw.check_zone(zone)
        _obj = self._zones[_zone]

        interface_id = self.__interface_id(interface)

        if interface_id in _obj.settings["interfaces"]:
            raise FirewallError(errors.ZONE_ALREADY_SET,
                                "'%s' already bound to '%s'" % (interface,
                                                                zone))
        if self.get_zone_of_interface(interface) is not None:
            raise FirewallError(errors.ZONE_CONFLICT,
                                "'%s' already bound to a zone" % interface)

        log.debug1("Setting zone of interface '%s' to '%s'" % (interface,
                                                               _zone))

        if use_transaction is None:
            transaction = self.new_transaction()
        else:
            transaction = use_transaction

        if not _obj.applied and allow_apply:
            self.apply_zone_settings(zone,
                                     use_transaction=transaction)
            transaction.add_fail(self.set_zone_applied, _zone, False)

        if allow_apply:
            self._interface(True, _zone, interface, transaction)

        self.__register_interface(_obj, interface_id, zone, sender)
        transaction.add_fail(self.__unregister_interface, _obj,
                                  interface_id)

        if use_transaction is None:
            transaction.execute(True)

        return _zone

    def __register_interface(self, _obj, interface_id, zone, sender):
        _obj.settings["interfaces"][interface_id] = \
            self.__gen_settings(0, sender)
        # add information whether we add to default or specific zone
        _obj.settings["interfaces"][interface_id]["__default__"] = \
            (not zone or zone == "")

    def change_zone_of_interface(self, zone, interface, sender=None):
        self._fw.check_panic()
        _old_zone = self.get_zone_of_interface(interface)
        _new_zone = self._fw.check_zone(zone)

        if _new_zone == _old_zone:
            return _old_zone

        if _old_zone is not None:
            self.remove_interface(_old_zone, interface)

        _zone = self.add_interface(zone, interface, sender)

        return _zone

    def change_default_zone(self, old_zone, new_zone, use_transaction=None):
        self._fw.check_panic()

        if use_transaction is None:
            transaction = self.new_transaction()
        else:
            transaction = use_transaction

        self.apply_zone_settings(new_zone, transaction)
        self._interface(True, new_zone, "+", transaction, append=True)
        if old_zone is not None and old_zone != "":
            self._interface(False, old_zone, "+", transaction, append=True)

        if use_transaction is None:
            transaction.execute(True)

    def remove_interface(self, zone, interface,
                         use_transaction=None):
        self._fw.check_panic()
        zoi = self.get_zone_of_interface(interface)
        if zoi is None:
            raise FirewallError(errors.UNKNOWN_INTERFACE,
                                "'%s' is not in any zone" % interface)
        _zone = zoi if zone == "" else self._fw.check_zone(zone)
        if zoi != _zone:
            raise FirewallError(errors.ZONE_CONFLICT,
                                "remove_interface(%s, %s): zoi='%s'" % \
                                (zone, interface, zoi))

        if use_transaction is None:
            transaction = self.new_transaction()
        else:
            transaction = use_transaction

        _obj = self._zones[_zone]
        interface_id = self.__interface_id(interface)
        transaction.add_post(self.__unregister_interface, _obj, interface_id)
        self._interface(False, _zone, interface, transaction)

        if use_transaction is None:
            transaction.execute(True)

        return _zone

    def __unregister_interface(self, _obj, interface_id):
        if interface_id in _obj.settings["interfaces"]:
            del _obj.settings["interfaces"][interface_id]

    def query_interface(self, zone, interface):
        return self.__interface_id(interface) in self.get_settings(zone)["interfaces"]

    def list_interfaces(self, zone):
        return self.get_settings(zone)["interfaces"].keys()

    # SOURCES

    def check_source(self, source, applied=False):
        if checkIPnMask(source):
            return "ipv4"
        elif checkIP6nMask(source):
            return "ipv6"
        elif check_mac(source):
            return ""
        elif source.startswith("ipset:"):
            self._check_ipset_type_for_source(source[6:])
            if applied:
                self._check_ipset_applied(source[6:])
            return self._ipset_family(source[6:])
        else:
            raise FirewallError(errors.INVALID_ADDR, source)
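        # Examples (illustrative):
        #   check_source("192.168.1.0/24")    -> "ipv4"
        #   check_source("fd00::/8")          -> "ipv6"
        #   check_source("01:23:45:67:89:AB") -> ""   (MAC: no fixed family)
        #   check_source("ipset:foo")         -> family of ipset "foo"
        #                                        (None for a hash:mac set)
        #   anything else raises FirewallError(errors.INVALID_ADDR, source)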

    def __source_id(self, source, applied=False):
        ipv = self.check_source(source, applied=applied)
        return (ipv, source)

    def add_source(self, zone, source, sender=None, use_transaction=None,
                   allow_apply=True):
        self._fw.check_panic()
        _zone = self._fw.check_zone(zone)
        _obj = self._zones[_zone]

        if check_mac(source):
            source = source.upper()

        source_id = self.__source_id(source, applied=allow_apply)

        if source_id in _obj.settings["sources"]:
            raise FirewallError(errors.ZONE_ALREADY_SET,
                            "'%s' already bound to '%s'" % (source, _zone))
        if self.get_zone_of_source(source) is not None:
            raise FirewallError(errors.ZONE_CONFLICT,
                                "'%s' already bound to a zone" % source)

        if use_transaction is None:
            transaction = self.new_transaction()
        else:
            transaction = use_transaction

        if not _obj.applied and allow_apply:
            self.apply_zone_settings(zone,
                                     use_transaction=transaction)
            transaction.add_fail(self.set_zone_applied, _zone, False)

        if allow_apply:
            self._source(True, _zone, source_id[0], source_id[1], transaction)

        self.__register_source(_obj, source_id, zone, sender)
        transaction.add_fail(self.__unregister_source, _obj, source_id)

        if use_transaction is None:
            transaction.execute(True)

        return _zone

    def __register_source(self, _obj, source_id, zone, sender):
        _obj.settings["sources"][source_id] = \
            self.__gen_settings(0, sender)
        # add information whether we add to default or specific zone
        _obj.settings["sources"][source_id]["__default__"] = (not zone or zone == "")

    def change_zone_of_source(self, zone, source, sender=None):
        self._fw.check_panic()
        _old_zone = self.get_zone_of_source(source)
        _new_zone = self._fw.check_zone(zone)

        if _new_zone == _old_zone:
            return _old_zone

        if check_mac(source):
            source = source.upper()

        if _old_zone is not None:
            self.remove_source(_old_zone, source)

        _zone = self.add_source(zone, source, sender)

        return _zone

    def remove_source(self, zone, source,
                      use_transaction=None):
        self._fw.check_panic()
        if check_mac(source):
            source = source.upper()
        zos = self.get_zone_of_source(source)
        if zos is None:
            raise FirewallError(errors.UNKNOWN_SOURCE,
                                "'%s' is not in any zone" % source)
        _zone = zos if zone == "" else self._fw.check_zone(zone)
        if zos != _zone:
            raise FirewallError(errors.ZONE_CONFLICT,
                                "remove_source(%s, %s): zos='%s'" % \
                                (zone, source, zos))

        if use_transaction is None:
            transaction = self.new_transaction()
        else:
            transaction = use_transaction

        _obj = self._zones[_zone]
        source_id = self.__source_id(source)
        transaction.add_post(self.__unregister_source, _obj, source_id)
        self._source(False, _zone, source_id[0], source_id[1], transaction)

        if use_transaction is None:
            transaction.execute(True)

        return _zone

    def __unregister_source(self, _obj, source_id):
        if source_id in _obj.settings["sources"]:
            del _obj.settings["sources"][source_id]

    def query_source(self, zone, source):
        if check_mac(source):
            source = source.upper()
        return self.__source_id(source) in self.get_settings(zone)["sources"]

    def list_sources(self, zone):
        return [ k[1] for k in self.get_settings(zone)["sources"].keys() ]

    def _interface(self, enable, zone, interface, transaction, append=False):
        for backend in self._fw.enabled_backends():
            if not backend.policies_supported:
                continue
            for policy in self._zone_policies[zone]:
                for (table, chain) in self._fw.policy._get_table_chains_for_zone_dispatch(policy):
                    rules = backend.build_zone_source_interface_rules(enable,
                                    zone, policy, interface, table, chain, append)
                    transaction.add_rules(backend, rules)

            # intra zone forward
            policy = self.policy_name_from_zones(zone, "ANY")
            # Skip adding wildcard/catch-all interface (for default
            # zone). Otherwise it would allow forwarding from interface
            # in default zone -> interface not in default zone (but in
            # a different zone).
            if self.get_settings(zone)["forward"] and interface not in ["+", "*"]:
                rules = backend.build_zone_forward_rules(enable, zone, policy, "filter", interface=interface)
                transaction.add_rules(backend, rules)

        # update policy dispatch for any policy using this zone in ingress
        # or egress
        for policy in self._fw.policy.get_policies_not_derived_from_zone():
            if zone not in self._fw.policy.list_ingress_zones(policy) and \
               zone not in self._fw.policy.list_egress_zones(policy):
                continue
            if policy in self._fw.policy.get_active_policies_not_derived_from_zone() and self._fw.policy.get_policy(policy).applied:
                # first remove the old set of interfaces using the current zone
                # settings.
                if not enable and len(self.list_interfaces(zone)) == 1:
                    self._fw.policy.unapply_policy_settings(policy, use_transaction=transaction)
                else:
                    self._fw.policy._ingress_egress_zones(False, policy, transaction)
                    # after the transaction ends and therefore the interface
                    # has been added to the zone's settings, update the
                    # dependent policies
                    transaction.add_post(lambda p: (p in self._fw.policy.get_active_policies_not_derived_from_zone()) and \
                                                   self._fw.policy._ingress_egress_zones_transaction(True, p), policy)
            elif enable:
                transaction.add_post(lambda p: (p in self._fw.policy.get_active_policies_not_derived_from_zone()) and \
                                               self._fw.policy.apply_policy_settings(p), policy)

    # IPSETS

    def _ipset_family(self, name):
        if self._ipset_type(name) == "hash:mac":
            return None
        return self._fw.ipset.get_family(name, applied=False)

    def _ipset_type(self, name):
        return self._fw.ipset.get_type(name, applied=False)

    def _ipset_match_flags(self, name, flag):
        return ",".join([flag] * self._fw.ipset.get_dimension(name))

    def _check_ipset_applied(self, name):
        return self._fw.ipset.check_applied(name)

    def _check_ipset_type_for_source(self, name):
        _type = self._ipset_type(name)
        if _type not in SOURCE_IPSET_TYPES:
            raise FirewallError(
                errors.INVALID_IPSET,
                "ipset '%s' with type '%s' not usable as source" % \
                (name, _type))

    def _source(self, enable, zone, ipv, source, transaction):
        # For MAC source bindings, ipv is an empty string; the MAC source will
        # be added for both ipv4 and ipv6.
        for backend in [self._fw.get_backend_by_ipv(ipv)] if ipv else self._fw.enabled_backends():
            if not backend.policies_supported:
                continue
            for policy in self._zone_policies[zone]:
                for (table, chain) in self._fw.policy._get_table_chains_for_zone_dispatch(policy):
                    rules = backend.build_zone_source_address_rules(enable, zone,
                                            policy, source, table, chain)
                    transaction.add_rules(backend, rules)

            # intra zone forward
            policy = self.policy_name_from_zones(zone, "ANY")
            if self.get_settings(zone)["forward"]:
                rules = backend.build_zone_forward_rules(enable, zone, policy, "filter", source=source)
                transaction.add_rules(backend, rules)

        # update policy dispatch for any policy using this zone in ingress
        # or egress
        for policy in self._fw.policy.get_policies_not_derived_from_zone():
            if zone not in self._fw.policy.list_ingress_zones(policy) and \
               zone not in self._fw.policy.list_egress_zones(policy):
                continue
            if policy in self._fw.policy.get_active_policies_not_derived_from_zone() and self._fw.policy.get_policy(policy).applied:
                # first remove the old set of sources using the current zone
                # settings.
                if not enable and len(self.list_sources(zone)) == 1:
                    self._fw.policy.unapply_policy_settings(policy, use_transaction=transaction)
                else:
                    self._fw.policy._ingress_egress_zones(False, policy, transaction)
                    # after the transaction ends, and therefore the source has
                    # been added to the zone's settings, update the dependent
                    # policies
                    transaction.add_post(lambda p: (p in self._fw.policy.get_active_policies_not_derived_from_zone()) and \
                                                   self._fw.policy._ingress_egress_zones_transaction(True, p), policy)
            elif enable:
                transaction.add_post(lambda p: (p in self._fw.policy.get_active_policies_not_derived_from_zone()) and \
                                               self._fw.policy.apply_policy_settings(p), policy)

    def add_service(self, zone, service, timeout=0, sender=None):
        zone = self._fw.check_zone(zone)
        p_name = self.policy_name_from_zones(zone, "HOST")
        self._fw.policy.add_service(p_name, service, timeout, sender)
        return zone

    def remove_service(self, zone, service):
        zone = self._fw.check_zone(zone)
        p_name = self.policy_name_from_zones(zone, "HOST")
        self._fw.policy.remove_service(p_name, service)
        return zone

    def query_service(self, zone, service):
        zone = self._fw.check_zone(zone)
        p_name = self.policy_name_from_zones(zone, "HOST")
        return self._fw.policy.query_service(p_name, service)

    def list_services(self, zone):
        zone = self._fw.check_zone(zone)
        p_name = self.policy_name_from_zones(zone, "HOST")
        return self._fw.policy.list_services(p_name)

    def add_port(self, zone, port, protocol, timeout=0, sender=None):
        zone = self._fw.check_zone(zone)
        p_name = self.policy_name_from_zones(zone, "HOST")
        self._fw.policy.add_port(p_name, port, protocol, timeout, sender)
        return zone

    def remove_port(self, zone, port, protocol):
        zone = self._fw.check_zone(zone)
        p_name = self.policy_name_from_zones(zone, "HOST")
        self._fw.policy.remove_port(p_name, port, protocol)
        return zone

    def query_port(self, zone, port, protocol):
        zone = self._fw.check_zone(zone)
        p_name = self.policy_name_from_zones(zone, "HOST")
        return self._fw.policy.query_port(p_name, port, protocol)

    def list_ports(self, zone):
        zone = self._fw.check_zone(zone)
        p_name = self.policy_name_from_zones(zone, "HOST")
        return self._fw.policy.list_ports(p_name)

    def add_source_port(self, zone, source_port, protocol, timeout=0, sender=None):
        zone = self._fw.check_zone(zone)
        p_name = self.policy_name_from_zones(zone, "HOST")
        self._fw.policy.add_source_port(p_name, source_port, protocol, timeout, sender)
        return zone

    def remove_source_port(self, zone, source_port, protocol):
        zone = self._fw.check_zone(zone)
        p_name = self.policy_name_from_zones(zone, "HOST")
        self._fw.policy.remove_source_port(p_name, source_port, protocol)
        return zone

    def query_source_port(self, zone, source_port, protocol):
        zone = self._fw.check_zone(zone)
        p_name = self.policy_name_from_zones(zone, "HOST")
        return self._fw.policy.query_source_port(p_name, source_port, protocol)

    def list_source_ports(self, zone):
        zone = self._fw.check_zone(zone)
        p_name = self.policy_name_from_zones(zone, "HOST")
        return self._fw.policy.list_source_ports(p_name)

    def _rich_rule_to_policies(self, zone, rule):
        zone = self._fw.check_zone(zone)
        if type(rule.action) == Rich_Mark:
            return [self.policy_name_from_zones(zone, "ANY")]
        elif type(rule.element) in [Rich_Service, Rich_Port, Rich_Protocol,
                                    Rich_SourcePort]:
            return [self.policy_name_from_zones(zone, "HOST")]
        elif type(rule.element) in [Rich_IcmpBlock, Rich_IcmpType]:
            return [self.policy_name_from_zones(zone, "HOST"),
                    self.policy_name_from_zones(zone, "ANY")]
        elif type(rule.element) in [Rich_ForwardPort]:
            return [self.policy_name_from_zones(zone, "ANY")]
        elif type(rule.element) in [Rich_Masquerade]:
            return [self.policy_name_from_zones("ANY", zone)]
        elif rule.element is None:
            return [self.policy_name_from_zones(zone, "HOST")]
        else:
            raise FirewallError("Rich rule type (%s) not handled." % (type(rule.element)))

    def add_rule(self, zone, rule, timeout=0, sender=None):
        for p_name in self._rich_rule_to_policies(zone, rule):
            self._fw.policy.add_rule(p_name, rule, timeout, sender)
        return zone

    def remove_rule(self, zone, rule):
        for p_name in self._rich_rule_to_policies(zone, rule):
            self._fw.policy.remove_rule(p_name, rule)
        return zone

    def query_rule(self, zone, rule):
        ret = True
        for p_name in self._rich_rule_to_policies(zone, rule):
            ret = ret and self._fw.policy.query_rule(p_name, rule)
        return ret

    def list_rules(self, zone):
        zone = self._fw.check_zone(zone)
        ret = set()
        for p_name in [self.policy_name_from_zones(zone, "ANY"),
                       self.policy_name_from_zones(zone, "HOST"),
                       self.policy_name_from_zones("ANY", zone)]:
            ret.update(set(self._fw.policy.list_rules(p_name)))
        return list(ret)

    def add_protocol(self, zone, protocol, timeout=0, sender=None):
        zone = self._fw.check_zone(zone)
        p_name = self.policy_name_from_zones(zone, "HOST")
        self._fw.policy.add_protocol(p_name, protocol, timeout, sender)
        return zone

    def remove_protocol(self, zone, protocol):
        zone = self._fw.check_zone(zone)
        p_name = self.policy_name_from_zones(zone, "HOST")
        self._fw.policy.remove_protocol(p_name, protocol)
        return zone

    def query_protocol(self, zone, protocol):
        zone = self._fw.check_zone(zone)
        p_name = self.policy_name_from_zones(zone, "HOST")
        return self._fw.policy.query_protocol(p_name, protocol)

    def list_protocols(self, zone):
        zone = self._fw.check_zone(zone)
        p_name = self.policy_name_from_zones(zone, "HOST")
        return self._fw.policy.list_protocols(p_name)

    def add_masquerade(self, zone, timeout=0, sender=None):
        zone = self._fw.check_zone(zone)
        p_name = self.policy_name_from_zones("ANY", zone)
        self._fw.policy.add_masquerade(p_name, timeout, sender)
        return zone

    def remove_masquerade(self, zone):
        zone = self._fw.check_zone(zone)
        p_name = self.policy_name_from_zones("ANY", zone)
        self._fw.policy.remove_masquerade(p_name)
        return zone

    def query_masquerade(self, zone):
        zone = self._fw.check_zone(zone)
        p_name = self.policy_name_from_zones("ANY", zone)
        return self._fw.policy.query_masquerade(p_name)

    def add_forward_port(self, zone, port, protocol, toport=None,
                         toaddr=None, timeout=0, sender=None):
        zone = self._fw.check_zone(zone)
        p_name = self.policy_name_from_zones(zone, "ANY")
        self._fw.policy.add_forward_port(p_name, port, protocol, toport, toaddr,
                                         timeout, sender)
        return zone

    def remove_forward_port(self, zone, port, protocol, toport=None,
                            toaddr=None):
        zone = self._fw.check_zone(zone)
        p_name = self.policy_name_from_zones(zone, "ANY")
        self._fw.policy.remove_forward_port(p_name, port, protocol, toport, toaddr)
        return zone

    def query_forward_port(self, zone, port, protocol, toport=None,
                           toaddr=None):
        zone = self._fw.check_zone(zone)
        p_name = self.policy_name_from_zones(zone, "ANY")
        return self._fw.policy.query_forward_port(p_name, port, protocol, toport,
                                                  toaddr)

    def list_forward_ports(self, zone):
        zone = self._fw.check_zone(zone)
        p_name = self.policy_name_from_zones(zone, "ANY")
        return self._fw.policy.list_forward_ports(p_name)

    def add_icmp_block(self, zone, icmp, timeout=0, sender=None):
        zone = self._fw.check_zone(zone)
        p_name = self.policy_name_from_zones(zone, "HOST")
        self._fw.policy.add_icmp_block(p_name, icmp, timeout, sender)

        p_name = self.policy_name_from_zones(zone, "ANY")
        self._fw.policy.add_icmp_block(p_name, icmp, timeout, sender)
        return zone

    def remove_icmp_block(self, zone, icmp):
        zone = self._fw.check_zone(zone)
        p_name = self.policy_name_from_zones(zone, "HOST")
        self._fw.policy.remove_icmp_block(p_name, icmp)

        p_name = self.policy_name_from_zones(zone, "ANY")
        self._fw.policy.remove_icmp_block(p_name, icmp)
        return zone

    def query_icmp_block(self, zone, icmp):
        zone = self._fw.check_zone(zone)
        p_name_host = self.policy_name_from_zones(zone, "HOST")
        p_name_fwd = self.policy_name_from_zones(zone, "ANY")
        return self._fw.policy.query_icmp_block(p_name_host, icmp) and \
               self._fw.policy.query_icmp_block(p_name_fwd, icmp)

    def list_icmp_blocks(self, zone):
        zone = self._fw.check_zone(zone)
        p_name_host = self.policy_name_from_zones(zone, "HOST")
        p_name_fwd = self.policy_name_from_zones(zone, "ANY")
        return sorted(set(self._fw.policy.list_icmp_blocks(p_name_host) +
                          self._fw.policy.list_icmp_blocks(p_name_fwd)))

    def add_icmp_block_inversion(self, zone, sender=None):
        zone = self._fw.check_zone(zone)
        p_name = self.policy_name_from_zones(zone, "HOST")
        self._fw.policy.add_icmp_block_inversion(p_name, sender)

        p_name = self.policy_name_from_zones(zone, "ANY")
        self._fw.policy.add_icmp_block_inversion(p_name, sender)
        return zone

    def _icmp_block_inversion(self, enable, zone, transaction):
        zone = self._fw.check_zone(zone)
        p_name = self.policy_name_from_zones(zone, "HOST")
        self._fw.policy._icmp_block_inversion(enable, p_name, transaction)

        p_name = self.policy_name_from_zones(zone, "ANY")
        self._fw.policy._icmp_block_inversion(enable, p_name, transaction)

    def remove_icmp_block_inversion(self, zone):
        zone = self._fw.check_zone(zone)
        p_name = self.policy_name_from_zones(zone, "HOST")
        self._fw.policy.remove_icmp_block_inversion(p_name)

        p_name = self.policy_name_from_zones(zone, "ANY")
        self._fw.policy.remove_icmp_block_inversion(p_name)
        return zone

    def query_icmp_block_inversion(self, zone):
        zone = self._fw.check_zone(zone)
        p_name_host = self.policy_name_from_zones(zone, "HOST")
        p_name_fwd = self.policy_name_from_zones(zone, "ANY")
        return self._fw.policy.query_icmp_block_inversion(p_name_host) and \
               self._fw.policy.query_icmp_block_inversion(p_name_fwd)

    def _forward(self, enable, zone, transaction):
        p_name = self.policy_name_from_zones(zone, "ANY")

        for interface in self._zones[zone].settings["interfaces"]:
            for backend in self._fw.enabled_backends():
                if not backend.policies_supported:
                    continue
                rules = backend.build_zone_forward_rules(enable, zone, p_name, "filter", interface=interface)
                transaction.add_rules(backend, rules)

        for ipv,source in self._zones[zone].settings["sources"]:
            for backend in [self._fw.get_backend_by_ipv(ipv)] if ipv else self._fw.enabled_backends():
                if not backend.policies_supported:
                    continue
                rules = backend.build_zone_forward_rules(enable, zone, p_name, "filter", source=source)
                transaction.add_rules(backend, rules)

    def __forward_id(self):
        return True

    def add_forward(self, zone, timeout=0, sender=None,
                    use_transaction=None):
        _zone = self._fw.check_zone(zone)
        self._fw.check_timeout(timeout)
        self._fw.check_panic()
        _obj = self._zones[_zone]

        forward_id = self.__forward_id()
        if forward_id in _obj.settings["forward"]:
            raise FirewallError(errors.ALREADY_ENABLED,
                                "forward already enabled in '%s'" % _zone)

        if use_transaction is None:
            transaction = self.new_transaction()
        else:
            transaction = use_transaction

        if _obj.applied:
            self._forward(True, _zone, transaction)

        self.__register_forward(_obj, forward_id, timeout, sender)
        transaction.add_fail(self.__unregister_forward, _obj, forward_id)

        if use_transaction is None:
            transaction.execute(True)

        return _zone

    def __register_forward(self, _obj, forward_id, timeout, sender):
        _obj.settings["forward"][forward_id] = \
            self.__gen_settings(timeout, sender)

    def remove_forward(self, zone, use_transaction=None):
        _zone = self._fw.check_zone(zone)
        self._fw.check_panic()
        _obj = self._zones[_zone]

        forward_id = self.__forward_id()
        if forward_id not in _obj.settings["forward"]:
            raise FirewallError(errors.NOT_ENABLED,
                                "forward not enabled in '%s'" % _zone)

        if use_transaction is None:
            transaction = self.new_transaction()
        else:
            transaction = use_transaction

        if _obj.applied:
            self._forward(False, _zone, transaction)

        transaction.add_post(self.__unregister_forward, _obj, forward_id)

        if use_transaction is None:
            transaction.execute(True)

        return _zone

    def __unregister_forward(self, _obj, forward_id):
        if forward_id in _obj.settings["forward"]:
            del _obj.settings["forward"][forward_id]

    def query_forward(self, zone):
        return self.__forward_id() in self.get_settings(zone)["forward"]
site-packages/firewall/core/ipXtables.py000064400000170666147511334700014353 0ustar00# -*- coding: utf-8 -*-
#
# Copyright (C) 2010-2016 Red Hat, Inc.
#
# Authors:
# Thomas Woerner <twoerner@redhat.com>
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program.  If not, see <http://www.gnu.org/licenses/>.
#

import os.path
import copy

from firewall.core.prog import runProg
from firewall.core.logger import log
from firewall.functions import tempFile, readfile, splitArgs, check_mac, portStr, \
                               check_single_address, check_address, normalizeIP6
from firewall import config
from firewall.errors import FirewallError, INVALID_PASSTHROUGH, INVALID_RULE, UNKNOWN_ERROR, INVALID_ADDR
from firewall.core.rich import Rich_Accept, Rich_Reject, Rich_Drop, Rich_Mark, \
                               Rich_Masquerade, Rich_ForwardPort, Rich_IcmpBlock
import string

POLICY_CHAIN_PREFIX = ""

BUILT_IN_CHAINS = {
    "security": [ "INPUT", "OUTPUT", "FORWARD" ],
    "raw": [ "PREROUTING", "OUTPUT" ],
    "mangle": [ "PREROUTING", "POSTROUTING", "INPUT", "OUTPUT", "FORWARD" ],
    "nat": [ "PREROUTING", "POSTROUTING", "OUTPUT" ],
    "filter": [ "INPUT", "OUTPUT", "FORWARD" ],
}

DEFAULT_REJECT_TYPE = {
    "ipv4": "icmp-host-prohibited",
    "ipv6": "icmp6-adm-prohibited",
}

ICMP = {
    "ipv4": "icmp",
    "ipv6": "ipv6-icmp",
}

# ipv ebtables also uses this
#
def common_reverse_rule(args):
    """ Inverse valid rule """

    replace_args = {
        # Append
        "-A": "-D",
        "--append": "--delete",
        # Insert
        "-I": "-D",
        "--insert": "--delete",
        # New chain
        "-N": "-X",
        "--new-chain": "--delete-chain",
    }

    ret_args = args[:]

    for arg in replace_args:
        try:
            idx = ret_args.index(arg)
        except Exception:
            continue

        if arg in [ "-I", "--insert" ]:
            # For insert with an optional rulenum, remove the rulenum if it is
            # a number. Opt at position idx, chain at position idx+1,
            # [rulenum] at position idx+2
            try:
                int(ret_args[idx+2])
            except Exception:
                pass
            else:
                ret_args.pop(idx+2)

        ret_args[idx] = replace_args[arg]
    return ret_args
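    # Examples (illustrative):
    #   common_reverse_rule(["-t", "filter", "-A", "INPUT", "-j", "ACCEPT"])
    #     -> ["-t", "filter", "-D", "INPUT", "-j", "ACCEPT"]
    #   common_reverse_rule(["-t", "filter", "-I", "INPUT", "3", "-j", "ACCEPT"])
    #     -> ["-t", "filter", "-D", "INPUT", "-j", "ACCEPT"]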

def common_reverse_passthrough(args):
    """ Reverse valid passthough rule """

    replace_args = {
        # Append
        "-A": "-D",
        "--append": "--delete",
        # Insert
        "-I": "-D",
        "--insert": "--delete",
        # New chain
        "-N": "-X",
        "--new-chain": "--delete-chain",
    }

    ret_args = args[:]

    for x in replace_args:
        try:
            idx = ret_args.index(x)
        except ValueError:
            continue

        if x in [ "-I", "--insert" ]:
            # For insert with an optional rulenum, remove the rulenum if it is
            # a number. Opt at position idx, chain at position idx+1,
            # [rulenum] at position idx+2
            try:
                int(ret_args[idx+2])
            except ValueError:
                pass
            else:
                ret_args.pop(idx+2)

        ret_args[idx] = replace_args[x]
        return ret_args

    raise FirewallError(INVALID_PASSTHROUGH,
                        "no '-A', '-I' or '-N' arg")

# ipv ebtables also uses this
#
def common_check_passthrough(args):
    """ Check if passthough rule is valid (only add, insert and new chain
    rules are allowed) """

    args = set(args)
    not_allowed = set(["-C", "--check",           # check rule
                       "-D", "--delete",          # delete rule
                       "-R", "--replace",         # replace rule
                       "-L", "--list",            # list rule
                       "-S", "--list-rules",      # print rules
                       "-F", "--flush",           # flush rules
                       "-Z", "--zero",            # zero rules
                       "-X", "--delete-chain",    # delete chain
                       "-P", "--policy",          # policy
                       "-E", "--rename-chain"])   # rename chain)
    # intersection of args and not_allowed is not empty, i.e.
    # something from args is not allowed
    if len(args & not_allowed) > 0:
        raise FirewallError(INVALID_PASSTHROUGH,
                            "arg '%s' is not allowed" %
                            list(args & not_allowed)[0])

    # args need to contain one of -A, -I, -N
    needed = set(["-A", "--append",
                  "-I", "--insert",
                  "-N", "--new-chain"])
    # empty intersection of args and needed, i.e.
    # args contains none of the needed commands
    if len(args & needed) == 0:
        raise FirewallError(INVALID_PASSTHROUGH,
                            "no '-A', '-I' or '-N' arg")

class ip4tables(object):
    ipv = "ipv4"
    name = "ip4tables"
    policies_supported = True

    def __init__(self, fw):
        self._fw = fw
        self._command = config.COMMANDS[self.ipv]
        self._restore_command = config.COMMANDS["%s-restore" % self.ipv]
        self.wait_option = self._detect_wait_option()
        self.restore_wait_option = self._detect_restore_wait_option()
        self.fill_exists()
        self.available_tables = []
        self.rich_rule_priority_counts = {}
        self.policy_priority_counts = {}
        self.zone_source_index_cache = []
        self.our_chains = {} # chains created by firewalld

    def fill_exists(self):
        self.command_exists = os.path.exists(self._command)
        self.restore_command_exists = os.path.exists(self._restore_command)

    def __run(self, args):
        # convert to string list
        if self.wait_option and self.wait_option not in args:
            _args = [self.wait_option] + ["%s" % item for item in args]
        else:
            _args = ["%s" % item for item in args]
        log.debug2("%s: %s %s", self.__class__, self._command, " ".join(_args))
        (status, ret) = runProg(self._command, _args)
        if status != 0:
            raise ValueError("'%s %s' failed: %s" % (self._command,
                                                     " ".join(_args), ret))
        return ret

    def _rule_replace(self, rule, pattern, replacement):
        try:
            i = rule.index(pattern)
        except ValueError:
            return False
        else:
            rule[i:i+1] = replacement
            return True

    def is_chain_builtin(self, ipv, table, chain):
        return table in BUILT_IN_CHAINS and \
               chain in BUILT_IN_CHAINS[table]

    def build_chain_rules(self, add, table, chain):
        rule = [ "-t", table ]
        if add:
            rule.append("-N")
        else:
            rule.append("-X")
        rule.append(chain)
        return [rule]

    def build_rule(self, add, table, chain, index, args):
        rule = [ "-t", table ]
        if add:
            rule += [ "-I", chain, str(index) ]
        else:
            rule += [ "-D", chain ]
        rule += args
        return rule
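        # Examples (illustrative; the chain name is hypothetical):
        #   build_rule(True, "filter", "foo_chain", 1, ["-j", "ACCEPT"])
        #     -> ["-t", "filter", "-I", "foo_chain", "1", "-j", "ACCEPT"]
        #   build_rule(False, "filter", "foo_chain", 1, ["-j", "ACCEPT"])
        #     -> ["-t", "filter", "-D", "foo_chain", "-j", "ACCEPT"]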

    def reverse_rule(self, args):
        return common_reverse_rule(args)

    def check_passthrough(self, args):
        common_check_passthrough(args)

    def reverse_passthrough(self, args):
        return common_reverse_passthrough(args)

    def passthrough_parse_table_chain(self, args):
        table = "filter"
        try:
            i = args.index("-t")
        except ValueError:
            pass
        else:
            if len(args) > i+1:
                table = args[i+1]
        chain = None
        for opt in [ "-A", "--append",
                     "-I", "--insert",
                     "-N", "--new-chain" ]:
            try:
                i = args.index(opt)
            except ValueError:
                pass
            else:
                if len(args) > i+1:
                    chain = args[i+1]
        return (table, chain)
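        # Examples (illustrative):
        #   passthrough_parse_table_chain(["-t", "mangle", "-A", "PREROUTING", "-j", "ACCEPT"])
        #     -> ("mangle", "PREROUTING")
        #   passthrough_parse_table_chain(["-A", "INPUT", "-j", "DROP"])
        #     -> ("filter", "INPUT")   # "filter" is the default table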

    def _run_replace_zone_source(self, rule, zone_source_index_cache):
        try:
            i = rule.index("%%ZONE_SOURCE%%")
            rule.pop(i)
            zone = rule.pop(i)
            if "-m" == rule[4]: # ipset/mac
                zone_source = (zone, rule[7]) # (zone, address)
            else:
                zone_source = (zone, rule[5]) # (zone, address)
        except ValueError:
            try:
                i = rule.index("%%ZONE_INTERFACE%%")
                rule.pop(i)
                zone_source = None
            except ValueError:
                return

        rule_add = True
        if rule[0] in ["-D", "--delete"]:
            rule_add = False

        if zone_source and not rule_add:
            if zone_source in zone_source_index_cache:
                zone_source_index_cache.remove(zone_source)
        elif rule_add:
            if zone_source:
                # order source based dispatch by zone name
                if zone_source not in zone_source_index_cache:
                    zone_source_index_cache.append(zone_source)
                    zone_source_index_cache.sort(key=lambda x: x[0])

                index = zone_source_index_cache.index(zone_source)
            else:
                if self._fw._allow_zone_drifting:
                    index = 0
                else:
                    index = len(zone_source_index_cache)

            rule[0] = "-I"
            rule.insert(2, "%d" % (index + 1))

    def _set_rule_replace_priority(self, rule, priority_counts, token):
        """
        Change something like
          -t filter -I public_IN %%RICH_RULE_PRIORITY%% 123
        or
          -t filter -A public_IN %%RICH_RULE_PRIORITY%% 321
        into
          -t filter -I public_IN 4
        or
          -t filter -I public_IN
        """
        try:
            i = rule.index(token)
        except ValueError:
            pass
        else:
            rule_add = True
            insert = False
            insert_add_index = -1
            rule.pop(i)
            priority = rule.pop(i)
            if type(priority) != int:
                raise FirewallError(INVALID_RULE, "priority must be followed by a number")

            table = "filter"
            for opt in [ "-t", "--table" ]:
                try:
                    j = rule.index(opt)
                except ValueError:
                    pass
                else:
                    if len(rule) >= j+1:
                        table = rule[j+1]
            for opt in [ "-A", "--append",
                         "-I", "--insert",
                         "-D", "--delete" ]:
                try:
                    insert_add_index = rule.index(opt)
                except ValueError:
                    pass
                else:
                    if len(rule) >= insert_add_index+1:
                        chain = rule[insert_add_index+1]

                    if opt in [ "-I", "--insert" ]:
                        insert = True
                    if opt in [ "-D", "--delete" ]:
                        rule_add = False

            chain = (table, chain)

            # Add the rule to the priority counts. We don't need to store the
            # rule, just bump the ref count for the priority value.
            if not rule_add:
                if chain not in priority_counts or \
                   priority not in priority_counts[chain] or \
                   priority_counts[chain][priority] <= 0:
                    raise FirewallError(UNKNOWN_ERROR, "nonexistent or underflow of priority count")

                priority_counts[chain][priority] -= 1
            else:
                if chain not in priority_counts:
                    priority_counts[chain] = {}
                if priority not in priority_counts[chain]:
                    priority_counts[chain][priority] = 0

                # calculate index of new rule
                index = 1
                for p in sorted(priority_counts[chain].keys()):
                    if p == priority and insert:
                        break
                    index += priority_counts[chain][p]
                    if p == priority:
                        break

                priority_counts[chain][priority] += 1

                rule[insert_add_index] = "-I"
                rule.insert(insert_add_index+2, "%d" % index)

    def set_rules(self, rules, log_denied):
        temp_file = tempFile()

        table_rules = { }
        rich_rule_priority_counts = copy.deepcopy(self.rich_rule_priority_counts)
        policy_priority_counts = copy.deepcopy(self.policy_priority_counts)
        zone_source_index_cache = copy.deepcopy(self.zone_source_index_cache)
        for _rule in rules:
            rule = _rule[:]

            # replace %%REJECT%%
            self._rule_replace(rule, "%%REJECT%%", \
                    ["REJECT", "--reject-with", DEFAULT_REJECT_TYPE[self.ipv]])

            # replace %%ICMP%%
            self._rule_replace(rule, "%%ICMP%%", [ICMP[self.ipv]])

            # replace %%LOGTYPE%%
            try:
                i = rule.index("%%LOGTYPE%%")
            except ValueError:
                pass
            else:
                if log_denied == "off":
                    continue
                if log_denied in [ "unicast", "broadcast", "multicast" ]:
                    rule[i:i+1] = [ "-m", "pkttype", "--pkt-type", log_denied ]
                else:
                    rule.pop(i)

            self._set_rule_replace_priority(rule, rich_rule_priority_counts, "%%RICH_RULE_PRIORITY%%")
            self._set_rule_replace_priority(rule, policy_priority_counts, "%%POLICY_PRIORITY%%")
            self._run_replace_zone_source(rule, zone_source_index_cache)

            table = "filter"
            # get table from rule
            for opt in [ "-t", "--table" ]:
                try:
                    i = rule.index(opt)
                except ValueError:
                    pass
                else:
                    if len(rule) >= i+1:
                        rule.pop(i)
                        table = rule.pop(i)

            # we cannot use joinArgs here, because it would use "'" instead
            # of '"' for the start and end of the string; this breaks
            # iptables-restore
            for i in range(len(rule)):
                for c in string.whitespace:
                    if c in rule[i] and not (rule[i].startswith('"') and
                                             rule[i].endswith('"')):
                        rule[i] = '"%s"' % rule[i]

            table_rules.setdefault(table, []).append(rule)

        for table in table_rules:
            rules = table_rules[table]

            temp_file.write("*%s\n" % table)
            for rule in rules:
                temp_file.write(" ".join(rule) + "\n")
            temp_file.write("COMMIT\n")

        temp_file.close()

        stat = os.stat(temp_file.name)
        log.debug2("%s: %s %s", self.__class__, self._restore_command,
                   "%s: %d" % (temp_file.name, stat.st_size))
        args = [ ]
        if self.restore_wait_option:
            args.append(self.restore_wait_option)
        args.append("-n")

        (status, ret) = runProg(self._restore_command, args,
                                stdin=temp_file.name)

        if log.getDebugLogLevel() > 2:
            lines = readfile(temp_file.name)
            if lines is not None:
                i = 1
                for line in lines:
                    log.debug3("%8d: %s" % (i, line), nofmt=1, nl=0)
                    if not line.endswith("\n"):
                        log.debug3("", nofmt=1)
                    i += 1

        os.unlink(temp_file.name)

        if status != 0:
            raise ValueError("'%s %s' failed: %s" % (self._restore_command,
                                                     " ".join(args), ret))
        self.rich_rule_priority_counts = rich_rule_priority_counts
        self.policy_priority_counts = policy_priority_counts
        self.zone_source_index_cache = zone_source_index_cache

    def set_rule(self, rule, log_denied):
        # replace %%REJECT%%
        self._rule_replace(rule, "%%REJECT%%", \
                ["REJECT", "--reject-with", DEFAULT_REJECT_TYPE[self.ipv]])

        # replace %%ICMP%%
        self._rule_replace(rule, "%%ICMP%%", [ICMP[self.ipv]])

        # replace %%LOGTYPE%%
        try:
            i = rule.index("%%LOGTYPE%%")
        except ValueError:
            pass
        else:
            if log_denied == "off":
                return ""
            if log_denied in [ "unicast", "broadcast", "multicast" ]:
                rule[i:i+1] = [ "-m", "pkttype", "--pkt-type", log_denied ]
            else:
                rule.pop(i)

        rich_rule_priority_counts = copy.deepcopy(self.rich_rule_priority_counts)
        policy_priority_counts = copy.deepcopy(self.policy_priority_counts)
        zone_source_index_cache = copy.deepcopy(self.zone_source_index_cache)
        self._set_rule_replace_priority(rule, rich_rule_priority_counts, "%%RICH_RULE_PRIORITY%%")
        self._set_rule_replace_priority(rule, policy_priority_counts, "%%POLICY_PRIORITY%%")
        self._run_replace_zone_source(rule, zone_source_index_cache)

        output = self.__run(rule)

        self.rich_rule_priority_counts = rich_rule_priority_counts
        self.policy_priority_counts = policy_priority_counts
        self.zone_source_index_cache = zone_source_index_cache
        return output

    def get_available_tables(self, table=None):
        ret = []
        tables = [ table ] if table else BUILT_IN_CHAINS.keys()
        for table in tables:
            if table in self.available_tables:
                ret.append(table)
            else:
                try:
                    self.__run(["-t", table, "-L", "-n"])
                    self.available_tables.append(table)
                    ret.append(table)
                except ValueError:
                    log.debug1("%s table '%s' does not exist (or not enough permission to check)." % (self.ipv, table))

        return ret

    def _detect_wait_option(self):
        wait_option = ""
        ret = runProg(self._command, ["-w", "-L", "-n"])  # since iptables-1.4.20
        if ret[0] == 0:
            wait_option = "-w"  # wait for xtables lock
            ret = runProg(self._command, ["-w10", "-L", "-n"])  # since iptables > 1.4.21
            if ret[0] == 0:
                wait_option = "-w10"  # wait max 10 seconds
            log.debug2("%s: %s will be using %s option.", self.__class__, self._command, wait_option)

        return wait_option

    def _detect_restore_wait_option(self):
        temp_file = tempFile()
        temp_file.write("#foo")
        temp_file.close()

        wait_option = ""
        for test_option in ["-w", "--wait=2"]:
            ret = runProg(self._restore_command, [test_option], stdin=temp_file.name)
            if ret[0] == 0 and "invalid option" not in ret[1] \
                           and "unrecognized option" not in ret[1]:
                wait_option = test_option
                break

        log.debug2("%s: %s will be using %s option.", self.__class__, self._restore_command, wait_option)

        os.unlink(temp_file.name)

        return wait_option

    def build_flush_rules(self):
        self.rich_rule_priority_counts = {}
        self.policy_priority_counts = {}
        self.zone_source_index_cache = []
        rules = []
        for table in BUILT_IN_CHAINS.keys():
            if not self.get_available_tables(table):
                continue
            # Flush firewall rules: -F
            # Delete firewall chains: -X
            # Set counter to zero: -Z
            for flag in [ "-F", "-X", "-Z" ]:
                rules.append(["-t", table, flag])
        return rules
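    # Illustrative sketch (editor's note): for the filter table the flush rules
    # built above expand to
    #   ["-t", "filter", "-F"], ["-t", "filter", "-X"], ["-t", "filter", "-Z"]
    # and likewise for every other available built-in table.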

    def build_set_policy_rules(self, policy):
        rules = []
        _policy = "DROP" if policy == "PANIC" else policy
        for table in BUILT_IN_CHAINS.keys():
            if not self.get_available_tables(table):
                continue
            if table == "nat":
                continue
            for chain in BUILT_IN_CHAINS[table]:
                rules.append(["-t", table, "-P", chain, _policy])
        return rules
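    # Illustrative sketch (editor's note): build_set_policy_rules("PANIC") maps
    # PANIC to DROP and yields e.g. ["-t", "filter", "-P", "INPUT", "DROP"] for
    # each built-in chain of every available table except nat.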

    def supported_icmp_types(self, ipv=None):
        """Return ICMP types that are supported by the iptables/ip6tables command and kernel"""
        ret = [ ]
        output = ""
        try:
            output = self.__run(["-p",
                                 "icmp" if self.ipv == "ipv4" else "ipv6-icmp",
                                 "--help"])
        except ValueError as ex:
            if self.ipv == "ipv4":
                log.debug1("iptables error: %s" % ex)
            else:
                log.debug1("ip6tables error: %s" % ex)
        lines = output.splitlines()

        in_types = False
        for line in lines:
            #print(line)
            if in_types:
                line = line.strip().lower()
                splits = line.split()
                for split in splits:
                    if split.startswith("(") and split.endswith(")"):
                        x = split[1:-1]
                    else:
                        x = split
                    if x not in ret:
                        ret.append(x)
            if (self.ipv == "ipv4" and line.startswith("Valid ICMP Types:")) or \
               (self.ipv == "ipv6" and line.startswith("Valid ICMPv6 Types:")):
                in_types = True
        return ret
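    # Illustrative sketch (editor's note): the parser above reads the output of
    # "iptables -p icmp --help" after the "Valid ICMP Types:" banner; a line
    # such as "echo-reply (pong)" contributes both "echo-reply" and "pong".
    # The exact help text depends on the installed iptables version.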

    def build_default_tables(self):
        # nothing to do, they always exist
        return []

    def build_default_rules(self, log_denied="off"):
        default_rules = {}

        if self.get_available_tables("security"):
            default_rules["security"] = [ ]
            self.our_chains["security"] = set()
            for chain in BUILT_IN_CHAINS["security"]:
                default_rules["security"].append("-N %s_direct" % chain)
                default_rules["security"].append("-A %s -j %s_direct" % (chain, chain))
                self.our_chains["security"].add("%s_direct" % chain)

        if self.get_available_tables("raw"):
            default_rules["raw"] = [ ]
            self.our_chains["raw"] = set()
            for chain in BUILT_IN_CHAINS["raw"]:
                default_rules["raw"].append("-N %s_direct" % chain)
                default_rules["raw"].append("-A %s -j %s_direct" % (chain, chain))
                self.our_chains["raw"].add("%s_direct" % chain)

                if chain == "PREROUTING":
                    for dispatch_suffix in ["POLICIES_pre", "ZONES_SOURCE", "ZONES", "POLICIES_post"] if self._fw._allow_zone_drifting else ["POLICIES_pre", "ZONES", "POLICIES_post"]:
                        default_rules["raw"].append("-N %s_%s" % (chain, dispatch_suffix))
                        default_rules["raw"].append("-A %s -j %s_%s" % (chain, chain, dispatch_suffix))
                        self.our_chains["raw"].update(set(["%s_%s" % (chain, dispatch_suffix)]))

        if self.get_available_tables("mangle"):
            default_rules["mangle"] = [ ]
            self.our_chains["mangle"] = set()
            for chain in BUILT_IN_CHAINS["mangle"]:
                default_rules["mangle"].append("-N %s_direct" % chain)
                default_rules["mangle"].append("-A %s -j %s_direct" % (chain, chain))
                self.our_chains["mangle"].add("%s_direct" % chain)

                if chain == "PREROUTING":
                    for dispatch_suffix in ["POLICIES_pre", "ZONES_SOURCE", "ZONES", "POLICIES_post"] if self._fw._allow_zone_drifting else ["POLICIES_pre", "ZONES", "POLICIES_post"]:
                        default_rules["mangle"].append("-N %s_%s" % (chain, dispatch_suffix))
                        default_rules["mangle"].append("-A %s -j %s_%s" % (chain, chain, dispatch_suffix))
                        self.our_chains["mangle"].update(set(["%s_%s" % (chain, dispatch_suffix)]))

        if self.get_available_tables("nat"):
            default_rules["nat"] = [ ]
            self.our_chains["nat"] = set()
            for chain in BUILT_IN_CHAINS["nat"]:
                default_rules["nat"].append("-N %s_direct" % chain)
                default_rules["nat"].append("-A %s -j %s_direct" % (chain, chain))
                self.our_chains["nat"].add("%s_direct" % chain)

                if chain in [ "PREROUTING", "POSTROUTING" ]:
                    for dispatch_suffix in ["POLICIES_pre", "ZONES_SOURCE", "ZONES", "POLICIES_post"] if self._fw._allow_zone_drifting else ["POLICIES_pre", "ZONES", "POLICIES_post"]:
                        default_rules["nat"].append("-N %s_%s" % (chain, dispatch_suffix))
                        default_rules["nat"].append("-A %s -j %s_%s" % (chain, chain, dispatch_suffix))
                        self.our_chains["nat"].update(set(["%s_%s" % (chain, dispatch_suffix)]))

        default_rules["filter"] = []
        self.our_chains["filter"] = set()
        default_rules["filter"].append("-A INPUT -m conntrack --ctstate RELATED,ESTABLISHED,DNAT -j ACCEPT")
        default_rules["filter"].append("-A INPUT -i lo -j ACCEPT")
        default_rules["filter"].append("-N INPUT_direct")
        default_rules["filter"].append("-A INPUT -j INPUT_direct")
        self.our_chains["filter"].update(set(["INPUT_direct"]))
        for dispatch_suffix in ["POLICIES_pre", "ZONES_SOURCE", "ZONES", "POLICIES_post"] if self._fw._allow_zone_drifting else ["POLICIES_pre", "ZONES", "POLICIES_post"]:
            default_rules["filter"].append("-N INPUT_%s" % (dispatch_suffix))
            default_rules["filter"].append("-A INPUT -j INPUT_%s" % (dispatch_suffix))
            self.our_chains["filter"].update(set(["INPUT_%s" % (dispatch_suffix)]))
        if log_denied != "off":
            default_rules["filter"].append("-A INPUT -m conntrack --ctstate INVALID %%LOGTYPE%% -j LOG --log-prefix 'STATE_INVALID_DROP: '")
        default_rules["filter"].append("-A INPUT -m conntrack --ctstate INVALID -j DROP")
        if log_denied != "off":
            default_rules["filter"].append("-A INPUT %%LOGTYPE%% -j LOG --log-prefix 'FINAL_REJECT: '")
        default_rules["filter"].append("-A INPUT -j %%REJECT%%")

        default_rules["filter"].append("-A FORWARD -m conntrack --ctstate RELATED,ESTABLISHED,DNAT -j ACCEPT")
        default_rules["filter"].append("-A FORWARD -i lo -j ACCEPT")
        default_rules["filter"].append("-N FORWARD_direct")
        default_rules["filter"].append("-A FORWARD -j FORWARD_direct")
        self.our_chains["filter"].update(set(["FORWARD_direct"]))
        for dispatch_suffix in ["POLICIES_pre"]:
            default_rules["filter"].append("-N FORWARD_%s" % (dispatch_suffix))
            default_rules["filter"].append("-A FORWARD -j FORWARD_%s" % (dispatch_suffix))
            self.our_chains["filter"].update(set(["FORWARD_%s" % (dispatch_suffix)]))
        for direction in ["IN", "OUT"]:
            for dispatch_suffix in ["ZONES_SOURCE", "ZONES"] if self._fw._allow_zone_drifting else ["ZONES"]:
                default_rules["filter"].append("-N FORWARD_%s_%s" % (direction, dispatch_suffix))
                default_rules["filter"].append("-A FORWARD -j FORWARD_%s_%s" % (direction, dispatch_suffix))
                self.our_chains["filter"].update(set(["FORWARD_%s_%s" % (direction, dispatch_suffix)]))
        for dispatch_suffix in ["POLICIES_post"]:
            default_rules["filter"].append("-N FORWARD_%s" % (dispatch_suffix))
            default_rules["filter"].append("-A FORWARD -j FORWARD_%s" % (dispatch_suffix))
            self.our_chains["filter"].update(set(["FORWARD_%s" % (dispatch_suffix)]))
        if log_denied != "off":
            default_rules["filter"].append("-A FORWARD -m conntrack --ctstate INVALID %%LOGTYPE%% -j LOG --log-prefix 'STATE_INVALID_DROP: '")
        default_rules["filter"].append("-A FORWARD -m conntrack --ctstate INVALID -j DROP")
        if log_denied != "off":
            default_rules["filter"].append("-A FORWARD %%LOGTYPE%% -j LOG --log-prefix 'FINAL_REJECT: '")
        default_rules["filter"].append("-A FORWARD -j %%REJECT%%")

        default_rules["filter"] += [
            "-N OUTPUT_direct",

            "-A OUTPUT -m conntrack --ctstate RELATED,ESTABLISHED -j ACCEPT",
            "-A OUTPUT -o lo -j ACCEPT",
            "-A OUTPUT -j OUTPUT_direct",
        ]
        self.our_chains["filter"].update(set(["OUTPUT_direct"]))
        for dispatch_suffix in ["POLICIES_pre"]:
            default_rules["filter"].append("-N OUTPUT_%s" % (dispatch_suffix))
            default_rules["filter"].append("-A OUTPUT -j OUTPUT_%s" % (dispatch_suffix))
            self.our_chains["filter"].update(set(["OUTPUT_%s" % (dispatch_suffix)]))
        for dispatch_suffix in ["POLICIES_post"]:
            default_rules["filter"].append("-N OUTPUT_%s" % (dispatch_suffix))
            default_rules["filter"].append("-A OUTPUT -j OUTPUT_%s" % (dispatch_suffix))
            self.our_chains["filter"].update(set(["OUTPUT_%s" % (dispatch_suffix)]))

        final_default_rules = []
        for table in default_rules:
            if table not in self.get_available_tables():
                continue
            for rule in default_rules[table]:
                final_default_rules.append(["-t", table] + splitArgs(rule))

        return final_default_rules

    def get_zone_table_chains(self, table):
        if table == "filter":
            return { "INPUT", "FORWARD_IN", "FORWARD_OUT" }
        if table == "mangle":
            if "mangle" in self.get_available_tables():
                return { "PREROUTING" }
        if table == "nat":
            if "nat" in self.get_available_tables():
                return { "PREROUTING", "POSTROUTING" }
        if table == "raw":
            if "raw" in self.get_available_tables():
                return { "PREROUTING" }

        return set()

    def build_policy_ingress_egress_rules(self, enable, policy, table, chain,
                                          ingress_interfaces, egress_interfaces,
                                          ingress_sources, egress_sources):
        p_obj = self._fw.policy.get_policy(policy)
        chain_suffix = "pre" if p_obj.priority < 0 else "post"
        isSNAT = True if (table == "nat" and chain == "POSTROUTING") else False
        _policy = self._fw.policy.policy_base_chain_name(policy, table, POLICY_CHAIN_PREFIX, isSNAT)

        ingress_fragments = []
        egress_fragments = []
        for interface in ingress_interfaces:
            ingress_fragments.append(["-i", interface])
        for interface in egress_interfaces:
            egress_fragments.append(["-o", interface])
        for addr in ingress_sources:
            ipv = self._fw.zone.check_source(addr)
            if ipv in ["ipv4", "ipv6"] and not self.is_ipv_supported(ipv):
                continue
            ingress_fragments.append(self._rule_addr_fragment("-s", addr))
        for addr in egress_sources:
            ipv = self._fw.zone.check_source(addr)
            if ipv in ["ipv4", "ipv6"] and not self.is_ipv_supported(ipv):
                continue
            # iptables can not match destination MAC
            if check_mac(addr) and chain in ["POSTROUTING", "FORWARD_OUT", "OUTPUT"]:
                continue

            egress_fragments.append(self._rule_addr_fragment("-d", addr))

        def _generate_policy_dispatch_rule(ingress_fragment, egress_fragment):
            add_del = {True: "-A", False: "-D" }[enable]
            rule = ["-t", table, add_del, "%s_POLICIES_%s" % (chain, chain_suffix),
                    "%%POLICY_PRIORITY%%", p_obj.priority]
            if ingress_fragment:
                rule.extend(ingress_fragment)
            if egress_fragment:
                rule.extend(egress_fragment)
            rule.extend(["-j", _policy])

            return rule

        rules = []
        if ingress_fragments:
            # zone --> [zone, ANY, HOST]
            for ingress_fragment in ingress_fragments:
                # zone --> zone
                if egress_fragments:
                    for egress_fragment in egress_fragments:
                        rules.append(_generate_policy_dispatch_rule(ingress_fragment, egress_fragment))
                elif egress_sources:
                    # if the egress source is not for the current family (there
                    # are no egress fragments), then avoid creating an invalid
                    # catch all rule.
                    pass
                else:
                    rules.append(_generate_policy_dispatch_rule(ingress_fragment, None))
        elif ingress_sources:
            # if the ingress source is not for the current family (there are no
            # ingress fragments), then avoid creating an invalid catch all
            # rule.
            pass
        else: # [ANY, HOST] --> [zone, ANY, HOST]
            # [ANY, HOST] --> zone
            if egress_fragments:
                for egress_fragment in egress_fragments:
                    rules.append(_generate_policy_dispatch_rule(None, egress_fragment))
            elif egress_sources:
                # if the egress source is not for the current family (there
                # are no egress fragments), then avoid creating an invalid
                # catch all rule.
                pass
            else:
                # [ANY, HOST] --> [ANY, HOST]
                rules.append(_generate_policy_dispatch_rule(None, None))

        return rules
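    # Illustrative sketch (editor's note), with hypothetical names: enabling a
    # policy of priority -10 that routes ingress interface "eth0" to egress
    # interface "eth1" through the filter/FORWARD dispatch chain yields roughly
    #   ["-t", "filter", "-A", "FORWARD_POLICIES_pre", "%%POLICY_PRIORITY%%", -10,
    #    "-i", "eth0", "-o", "eth1", "-j", <policy base chain>]
    # where the base chain name comes from policy_base_chain_name().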

    def build_zone_source_interface_rules(self, enable, zone, policy, interface,
                                          table, chain, append=False):
        isSNAT = True if (table == "nat" and chain == "POSTROUTING") else False
        _policy = self._fw.policy.policy_base_chain_name(policy, table, POLICY_CHAIN_PREFIX, isSNAT=isSNAT)
        opt = {
            "PREROUTING": "-i",
            "POSTROUTING": "-o",
            "INPUT": "-i",
            "FORWARD_IN": "-i",
            "FORWARD_OUT": "-o",
            "OUTPUT": "-o",
        }[chain]

        action = "-g"

        if enable and not append:
            rule = [ "-I", "%s_ZONES" % chain, "%%ZONE_INTERFACE%%" ]
        elif enable:
            rule = [ "-A", "%s_ZONES" % chain ]
        else:
            rule = [ "-D", "%s_ZONES" % chain ]
            if not append:
                rule += ["%%ZONE_INTERFACE%%"]
        rule += [ "-t", table, opt, interface, action, _policy ]
        return [rule]

    def _rule_addr_fragment(self, opt, address, invert=False):
        if address.startswith("ipset:"):
            name = address[6:]
            if opt == "-d":
                opt = "dst"
            else:
                opt = "src"
            flags = ",".join([opt] * self._fw.ipset.get_dimension(name))
            return ["-m", "set", "--match-set", name, flags]
        elif check_mac(address):
            # iptables can not match a destination (outgoing) MAC
            if opt == "-d":
                raise FirewallError(INVALID_ADDR, "Can't match a destination MAC.")
            return ["-m", "mac", "--mac-source", address.upper()]
        else:
            if check_single_address("ipv6", address):
                address = normalizeIP6(address)
            elif check_address("ipv6", address):
                addr_split = address.split("/")
                address = normalizeIP6(addr_split[0]) + "/" + addr_split[1]
            return [opt, address]
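    # Illustrative sketches (editor's note), assuming a one-dimensional ipset
    # named "foo":
    #   ("-s", "ipset:foo")          -> ["-m", "set", "--match-set", "foo", "src"]
    #   ("-s", "00:11:22:33:44:55")  -> ["-m", "mac", "--mac-source", "00:11:22:33:44:55"]
    #   ("-d", "192.0.2.0/24")       -> ["-d", "192.0.2.0/24"]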

    def build_zone_source_address_rules(self, enable, zone, policy,
                                        address, table, chain):
        add_del = { True: "-I", False: "-D" }[enable]

        isSNAT = True if (table == "nat" and chain == "POSTROUTING") else False
        _policy = self._fw.policy.policy_base_chain_name(policy, table, POLICY_CHAIN_PREFIX, isSNAT=isSNAT)
        opt = {
            "PREROUTING": "-s",
            "POSTROUTING": "-d",
            "INPUT": "-s",
            "FORWARD_IN": "-s",
            "FORWARD_OUT": "-d",
            "OUTPUT": "-d",
        }[chain]

        if self._fw._allow_zone_drifting:
            zone_dispatch_chain = "%s_ZONES_SOURCE" % (chain)
        else:
            zone_dispatch_chain = "%s_ZONES" % (chain)

        # iptables can not match destination MAC
        if check_mac(address) and chain in ["POSTROUTING", "FORWARD_OUT", "OUTPUT"]:
            return []

        rule = [add_del, zone_dispatch_chain, "%%ZONE_SOURCE%%", zone, "-t", table]
        rule.extend(self._rule_addr_fragment(opt, address))
        rule.extend(["-g", _policy])

        return [rule]

    def build_policy_chain_rules(self, enable, policy, table, chain):
        add_del_chain = { True: "-N", False: "-X" }[enable]
        add_del_rule = { True: "-A", False: "-D" }[enable]
        isSNAT = True if (table == "nat" and chain == "POSTROUTING") else False
        _policy = self._fw.policy.policy_base_chain_name(policy, table, POLICY_CHAIN_PREFIX, isSNAT=isSNAT)

        self.our_chains[table].update(set([_policy,
                                      "%s_log" % _policy,
                                      "%s_deny" % _policy,
                                      "%s_pre" % _policy,
                                      "%s_post" % _policy,
                                      "%s_allow" % _policy]))

        rules = []
        rules.append([ add_del_chain, _policy, "-t", table ])
        rules.append([ add_del_chain, "%s_pre" % _policy, "-t", table ])
        rules.append([ add_del_chain, "%s_log" % _policy, "-t", table ])
        rules.append([ add_del_chain, "%s_deny" % _policy, "-t", table ])
        rules.append([ add_del_chain, "%s_allow" % _policy, "-t", table ])
        rules.append([ add_del_chain, "%s_post" % _policy, "-t", table ])
        rules.append([ add_del_rule, _policy, "-t", table, "-j", "%s_pre" % _policy ])
        rules.append([ add_del_rule, _policy, "-t", table, "-j", "%s_log" % _policy ])
        rules.append([ add_del_rule, _policy, "-t", table, "-j", "%s_deny" % _policy ])
        rules.append([ add_del_rule, _policy, "-t", table, "-j", "%s_allow" % _policy ])
        rules.append([ add_del_rule, _policy, "-t", table, "-j", "%s_post" % _policy ])

        target = self._fw.policy._policies[policy].target

        if self._fw.get_log_denied() != "off":
            if table == "filter":
                if target in [ "REJECT", "%%REJECT%%" ]:
                    rules.append([ add_del_rule, _policy, "-t", table, "%%LOGTYPE%%",
                                   "-j", "LOG", "--log-prefix",
                                   "\"%s_REJECT: \"" % _policy ])
                if target == "DROP":
                    rules.append([ add_del_rule, _policy, "-t", table, "%%LOGTYPE%%",
                                   "-j", "LOG", "--log-prefix",
                                   "\"%s_DROP: \"" % _policy ])

        if table == "filter" and \
           target in [ "ACCEPT", "REJECT", "%%REJECT%%", "DROP" ]:
            rules.append([ add_del_rule, _policy, "-t", table, "-j", target ])

        if not enable:
            rules.reverse()

        return rules
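    # Illustrative sketch (editor's note): for an enabled policy the rules above
    # create the base chain plus its _pre, _log, _deny, _allow and _post
    # sub-chains and jump through them in that order, so traffic traverses
    # pre -> log -> deny -> allow -> post before hitting the policy target.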

    def _rule_limit(self, limit):
        if not limit:
            return []
        s = ["-m", "limit", "--limit", limit.value]
        if limit.burst is not None:
            s += ["--limit-burst", limit.burst]
        return s
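    # Illustrative sketch (editor's note): a rich rule limit with value "3/m"
    # and burst 10 renders as
    #   ["-m", "limit", "--limit", "3/m", "--limit-burst", 10]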

    def _rich_rule_chain_suffix(self, rich_rule):
        if type(rich_rule.element) in [Rich_Masquerade, Rich_ForwardPort, Rich_IcmpBlock]:
            # These are special and don't have an explicit action
            pass
        elif rich_rule.action:
            if type(rich_rule.action) not in [Rich_Accept, Rich_Reject, Rich_Drop, Rich_Mark]:
                raise FirewallError(INVALID_RULE, "Unknown action %s" % type(rich_rule.action))
        else:
            raise FirewallError(INVALID_RULE, "No rule action specified.")

        if rich_rule.priority == 0:
            if type(rich_rule.element) in [Rich_Masquerade, Rich_ForwardPort] or \
               type(rich_rule.action) in [Rich_Accept, Rich_Mark]:
                return "allow"
            elif type(rich_rule.element) in [Rich_IcmpBlock] or \
                 type(rich_rule.action) in [Rich_Reject, Rich_Drop]:
                return "deny"
        elif rich_rule.priority < 0:
            return "pre"
        else:
            return "post"

    def _rich_rule_chain_suffix_from_log(self, rich_rule):
        if not rich_rule.log and not rich_rule.audit:
            raise FirewallError(INVALID_RULE, "Neither log nor audit")

        if rich_rule.priority == 0:
            return "log"
        elif rich_rule.priority < 0:
            return "pre"
        else:
            return "post"

    def _rich_rule_priority_fragment(self, rich_rule):
        if rich_rule.priority == 0:
            return []
        return ["%%RICH_RULE_PRIORITY%%", rich_rule.priority]

    def _rich_rule_log(self, policy, rich_rule, enable, table, rule_fragment):
        if not rich_rule.log:
            return []

        _policy = self._fw.policy.policy_base_chain_name(policy, table, POLICY_CHAIN_PREFIX)

        add_del = { True: "-A", False: "-D" }[enable]

        chain_suffix = self._rich_rule_chain_suffix_from_log(rich_rule)
        rule = ["-t", table, add_del, "%s_%s" % (_policy, chain_suffix)]
        rule += self._rich_rule_priority_fragment(rich_rule)
        rule += rule_fragment + [ "-j", "LOG" ]
        if rich_rule.log.prefix:
            rule += [ "--log-prefix", "'%s'" % rich_rule.log.prefix ]
        if rich_rule.log.level:
            rule += [ "--log-level", "%s" % rich_rule.log.level ]
        rule += self._rule_limit(rich_rule.log.limit)

        return rule

    def _rich_rule_audit(self, policy, rich_rule, enable, table, rule_fragment):
        if not rich_rule.audit:
            return []

        add_del = { True: "-A", False: "-D" }[enable]

        _policy = self._fw.policy.policy_base_chain_name(policy, table, POLICY_CHAIN_PREFIX)

        chain_suffix = self._rich_rule_chain_suffix_from_log(rich_rule)
        rule = ["-t", table, add_del, "%s_%s" % (_policy, chain_suffix)]
        rule += self._rich_rule_priority_fragment(rich_rule)
        rule += rule_fragment
        if type(rich_rule.action) == Rich_Accept:
            _type = "accept"
        elif type(rich_rule.action) == Rich_Reject:
            _type = "reject"
        elif type(rich_rule.action) ==  Rich_Drop:
            _type = "drop"
        else:
            _type = "unknown"
        rule += [ "-j", "AUDIT", "--type", _type ]
        rule += self._rule_limit(rich_rule.audit.limit)

        return rule

    def _rich_rule_action(self, policy, rich_rule, enable, table, rule_fragment):
        if not rich_rule.action:
            return []

        add_del = { True: "-A", False: "-D" }[enable]

        _policy = self._fw.policy.policy_base_chain_name(policy, table, POLICY_CHAIN_PREFIX)

        chain_suffix = self._rich_rule_chain_suffix(rich_rule)
        chain = "%s_%s" % (_policy, chain_suffix)
        if type(rich_rule.action) == Rich_Accept:
            rule_action = [ "-j", "ACCEPT" ]
        elif type(rich_rule.action) == Rich_Reject:
            rule_action = [ "-j", "REJECT" ]
            if rich_rule.action.type:
                rule_action += [ "--reject-with", rich_rule.action.type ]
        elif type(rich_rule.action) ==  Rich_Drop:
            rule_action = [ "-j", "DROP" ]
        elif type(rich_rule.action) == Rich_Mark:
            table = "mangle"
            _policy = self._fw.policy.policy_base_chain_name(policy, table, POLICY_CHAIN_PREFIX)
            chain = "%s_%s" % (_policy, chain_suffix)
            rule_action = [ "-j", "MARK", "--set-xmark", rich_rule.action.set ]
        else:
            raise FirewallError(INVALID_RULE,
                                "Unknown action %s" % type(rich_rule.action))

        rule = ["-t", table, add_del, chain]
        rule += self._rich_rule_priority_fragment(rich_rule)
        rule += rule_fragment + rule_action
        rule += self._rule_limit(rich_rule.action.limit)

        return rule

    def _rich_rule_destination_fragment(self, rich_dest):
        if not rich_dest:
            return []

        rule_fragment = []
        if rich_dest.addr:
            if rich_dest.invert:
                rule_fragment.append("!")
            if check_single_address("ipv6", rich_dest.addr):
                rule_fragment += [ "-d", normalizeIP6(rich_dest.addr) ]
            elif check_address("ipv6", rich_dest.addr):
                addr_split = rich_dest.addr.split("/")
                rule_fragment += [ "-d", normalizeIP6(addr_split[0]) + "/" + addr_split[1] ]
            else:
                rule_fragment += [ "-d", rich_dest.addr ]
        elif rich_dest.ipset:
            rule_fragment += [ "-m", "set" ]
            if rich_dest.invert:
                rule_fragment.append("!")
            flags = self._fw.zone._ipset_match_flags(rich_dest.ipset, "dst")
            rule_fragment += [ "--match-set", rich_dest.ipset, flags ]

        return rule_fragment

    def _rich_rule_source_fragment(self, rich_source):
        if not rich_source:
            return []

        rule_fragment = []
        if rich_source.addr:
            if rich_source.invert:
                rule_fragment.append("!")
            if check_single_address("ipv6", rich_source.addr):
                rule_fragment += [ "-s", normalizeIP6(rich_source.addr) ]
            elif check_address("ipv6", rich_source.addr):
                addr_split = rich_source.addr.split("/")
                rule_fragment += [ "-s", normalizeIP6(addr_split[0]) + "/" + addr_split[1] ]
            else:
                rule_fragment += [ "-s", rich_source.addr ]
        elif hasattr(rich_source, "mac") and rich_source.mac:
            rule_fragment += [ "-m", "mac" ]
            if rich_source.invert:
                rule_fragment.append("!")
            rule_fragment += [ "--mac-source", rich_source.mac ]
        elif hasattr(rich_source, "ipset") and rich_source.ipset:
            rule_fragment += [ "-m", "set" ]
            if rich_source.invert:
                rule_fragment.append("!")
            flags = self._fw.zone._ipset_match_flags(rich_source.ipset, "src")
            rule_fragment += [ "--match-set", rich_source.ipset, flags ]

        return rule_fragment

    def build_policy_ports_rules(self, enable, policy, proto, port, destination=None, rich_rule=None):
        add_del = { True: "-A", False: "-D" }[enable]
        table = "filter"
        _policy = self._fw.policy.policy_base_chain_name(policy, table, POLICY_CHAIN_PREFIX)

        rule_fragment = [ "-p", proto ]
        if port:
            rule_fragment += [ "--dport", "%s" % portStr(port) ]
        if destination:
            rule_fragment += [ "-d", destination ]
        if rich_rule:
            rule_fragment += self._rich_rule_destination_fragment(rich_rule.destination)
            rule_fragment += self._rich_rule_source_fragment(rich_rule.source)
        if not rich_rule or type(rich_rule.action) != Rich_Mark:
            rule_fragment += [ "-m", "conntrack", "--ctstate", "NEW,UNTRACKED" ]

        rules = []
        if rich_rule:
            rules.append(self._rich_rule_log(policy, rich_rule, enable, table, rule_fragment))
            rules.append(self._rich_rule_audit(policy, rich_rule, enable, table, rule_fragment))
            rules.append(self._rich_rule_action(policy, rich_rule, enable, table, rule_fragment))
        else:
            rules.append([add_del, "%s_allow" % (_policy), "-t", table] +
                         rule_fragment + [ "-j", "ACCEPT" ])

        return rules

    def build_policy_protocol_rules(self, enable, policy, protocol, destination=None, rich_rule=None):
        add_del = { True: "-A", False: "-D" }[enable]
        table = "filter"
        _policy = self._fw.policy.policy_base_chain_name(policy, table, POLICY_CHAIN_PREFIX)

        rule_fragment = [ "-p", protocol ]
        if destination:
            rule_fragment += [ "-d", destination ]
        if rich_rule:
            rule_fragment += self._rich_rule_destination_fragment(rich_rule.destination)
            rule_fragment += self._rich_rule_source_fragment(rich_rule.source)
        if not rich_rule or type(rich_rule.action) != Rich_Mark:
            rule_fragment += [ "-m", "conntrack", "--ctstate", "NEW,UNTRACKED" ]

        rules = []
        if rich_rule:
            rules.append(self._rich_rule_log(policy, rich_rule, enable, table, rule_fragment))
            rules.append(self._rich_rule_audit(policy, rich_rule, enable, table, rule_fragment))
            rules.append(self._rich_rule_action(policy, rich_rule, enable, table, rule_fragment))
        else:
            rules.append([add_del, "%s_allow" % (_policy), "-t", table] +
                         rule_fragment + [ "-j", "ACCEPT" ])

        return rules

    def build_policy_source_ports_rules(self, enable, policy, proto, port,
                                     destination=None, rich_rule=None):
        add_del = { True: "-A", False: "-D" }[enable]
        table = "filter"
        _policy = self._fw.policy.policy_base_chain_name(policy, table, POLICY_CHAIN_PREFIX)

        rule_fragment = [ "-p", proto ]
        if port:
            rule_fragment += [ "--sport", "%s" % portStr(port) ]
        if destination:
            rule_fragment += [ "-d", destination ]
        if rich_rule:
            rule_fragment += self._rich_rule_destination_fragment(rich_rule.destination)
            rule_fragment += self._rich_rule_source_fragment(rich_rule.source)
        if not rich_rule or type(rich_rule.action) != Rich_Mark:
            rule_fragment += [ "-m", "conntrack", "--ctstate", "NEW,UNTRACKED" ]

        rules = []
        if rich_rule:
            rules.append(self._rich_rule_log(policy, rich_rule, enable, table, rule_fragment))
            rules.append(self._rich_rule_audit(policy, rich_rule, enable, table, rule_fragment))
            rules.append(self._rich_rule_action(policy, rich_rule, enable, table, rule_fragment))
        else:
            rules.append([add_del, "%s_allow" % (_policy), "-t", table] +
                         rule_fragment + [ "-j", "ACCEPT" ])

        return rules

    def build_policy_helper_ports_rules(self, enable, policy, proto, port,
                                      destination, helper_name, module_short_name):
        table = "raw"
        _policy = self._fw.policy.policy_base_chain_name(policy, table, POLICY_CHAIN_PREFIX)
        add_del = { True: "-A", False: "-D" }[enable]

        rule = [ add_del, "%s_allow" % (_policy), "-t", "raw", "-p", proto ]
        if port:
            rule += [ "--dport", "%s" % portStr(port) ]
        if destination:
            rule += [ "-d",  destination ]
        rule += [ "-j", "CT", "--helper", module_short_name ]

        return [rule]

    def build_zone_forward_rules(self, enable, zone, policy, table, interface=None, source=None):
        add_del = { True: "-A", False: "-D" }[enable]
        _policy = self._fw.policy.policy_base_chain_name(policy, table, POLICY_CHAIN_PREFIX)

        rules = []
        if interface:
            rules.append(["-t", "filter", add_del, "%s_allow" % _policy,
                          "-o", interface, "-j", "ACCEPT"])
        else: # source
            # iptables can not match destination MAC
            if check_mac(source):
                return []

            rules.append(["-t", "filter", add_del, "%s_allow" % _policy]
                         + self._rule_addr_fragment("-d", source) +
                         ["-j", "ACCEPT"])
        return rules

    def build_policy_masquerade_rules(self, enable, policy, rich_rule=None):
        table = "nat"
        _policy = self._fw.policy.policy_base_chain_name(policy, table, POLICY_CHAIN_PREFIX, isSNAT=True)

        add_del = { True: "-A", False: "-D" }[enable]

        rule_fragment = []
        if rich_rule:
            chain_suffix = self._rich_rule_chain_suffix(rich_rule)
            rule_fragment += self._rich_rule_priority_fragment(rich_rule)
            rule_fragment += self._rich_rule_destination_fragment(rich_rule.destination)
            rule_fragment += self._rich_rule_source_fragment(rich_rule.source)
        else:
            chain_suffix = "allow"

        rules = []
        rules.append(["-t", "nat", add_del, "%s_%s" % (_policy, chain_suffix)]
                     + rule_fragment +
                     [ "!", "-o", "lo", "-j", "MASQUERADE" ])

        rule_fragment = []
        if rich_rule:
            chain_suffix = self._rich_rule_chain_suffix(rich_rule)
            rule_fragment += self._rich_rule_priority_fragment(rich_rule)
            rule_fragment += self._rich_rule_destination_fragment(rich_rule.destination)
            rule_fragment += self._rich_rule_source_fragment(rich_rule.source)
        else:
            chain_suffix = "allow"

        table = "filter"
        _policy = self._fw.policy.policy_base_chain_name(policy, table, POLICY_CHAIN_PREFIX)
        rules.append(["-t", "filter", add_del, "%s_%s" % (_policy, chain_suffix)]
                     + rule_fragment +
                     ["-m", "conntrack", "--ctstate", "NEW,UNTRACKED", "-j", "ACCEPT" ])

        return rules

    def build_policy_forward_port_rules(self, enable, policy, port,
                                      protocol, toport, toaddr, rich_rule=None):
        table = "nat"
        _policy = self._fw.policy.policy_base_chain_name(policy, table, POLICY_CHAIN_PREFIX)
        add_del = { True: "-A", False: "-D" }[enable]

        to = ""
        if toaddr:
            if check_single_address("ipv6", toaddr):
                to += "[%s]" % normalizeIP6(toaddr)
            else:
                to += toaddr
        if toport and toport != "":
            to += ":%s" % portStr(toport, "-")

        rule_fragment = []
        if rich_rule:
            chain_suffix = self._rich_rule_chain_suffix(rich_rule)
            rule_fragment = self._rich_rule_priority_fragment(rich_rule)
            rule_fragment += self._rich_rule_destination_fragment(rich_rule.destination)
            rule_fragment += self._rich_rule_source_fragment(rich_rule.source)
        else:
            chain_suffix = "allow"

        rules = []
        if rich_rule:
            rules.append(self._rich_rule_log(policy, rich_rule, enable, "nat", rule_fragment))
        rules.append(["-t", "nat", add_del, "%s_%s" % (_policy, chain_suffix)]
                     + rule_fragment +
                     ["-p", protocol, "--dport", portStr(port),
                      "-j", "DNAT", "--to-destination", to])

        return rules
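    # Illustrative sketches (editor's note) of the DNAT destination built above:
    #   toaddr="10.0.0.1", toport="8080"    -> "--to-destination 10.0.0.1:8080"
    #   toaddr="2001:db8::1", toport="8080" -> "--to-destination [2001:db8::1]:8080"
    #   toport="8080" only                  -> "--to-destination :8080"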

    def build_policy_icmp_block_rules(self, enable, policy, ict, rich_rule=None):
        table = "filter"
        _policy = self._fw.policy.policy_base_chain_name(policy, table, POLICY_CHAIN_PREFIX)
        add_del = { True: "-A", False: "-D" }[enable]

        if self.ipv == "ipv4":
            proto = [ "-p", "icmp" ]
            match = [ "-m", "icmp", "--icmp-type", ict.name ]
        else:
            proto = [ "-p", "ipv6-icmp" ]
            match = [ "-m", "icmp6", "--icmpv6-type", ict.name ]

        rules = []
        if self._fw.policy.query_icmp_block_inversion(policy):
            final_chain = "%s_allow" % (_policy)
            final_target = "ACCEPT"
        else:
            final_chain = "%s_deny" % (_policy)
            final_target = "%%REJECT%%"

        rule_fragment = []
        if rich_rule:
            rule_fragment += self._rich_rule_destination_fragment(rich_rule.destination)
            rule_fragment += self._rich_rule_source_fragment(rich_rule.source)
        rule_fragment += proto + match

        if rich_rule:
            rules.append(self._rich_rule_log(policy, rich_rule, enable, table, rule_fragment))
            rules.append(self._rich_rule_audit(policy, rich_rule, enable, table, rule_fragment))
            if rich_rule.action:
                rules.append(self._rich_rule_action(policy, rich_rule, enable, table, rule_fragment))
            else:
                chain_suffix = self._rich_rule_chain_suffix(rich_rule)
                rules.append(["-t", table, add_del, "%s_%s" % (_policy, chain_suffix)]
                             + self._rich_rule_priority_fragment(rich_rule)
                             + rule_fragment +
                             [ "-j", "%%REJECT%%" ])
        else:
            if self._fw.get_log_denied() != "off" and final_target != "ACCEPT":
                rules.append([ add_del, final_chain, "-t", table ]
                             + rule_fragment +
                             [ "%%LOGTYPE%%", "-j", "LOG",
                               "--log-prefix", "\"%s_ICMP_BLOCK: \"" % policy ])
            rules.append([ add_del, final_chain, "-t", table ]
                         + rule_fragment +
                         [ "-j", final_target ])

        return rules

    def build_policy_icmp_block_inversion_rules(self, enable, policy):
        table = "filter"
        _policy = self._fw.policy.policy_base_chain_name(policy, table, POLICY_CHAIN_PREFIX)

        rules = []
        rule_idx = 6

        if self._fw.policy.query_icmp_block_inversion(policy):
            ibi_target = "%%REJECT%%"

            if self._fw.get_log_denied() != "off":
                if enable:
                    rule = [ "-I", _policy, str(rule_idx) ]
                else:
                    rule = [ "-D", _policy ]

                rule = rule + [ "-t", table, "-p", "%%ICMP%%",
                              "%%LOGTYPE%%",
                              "-j", "LOG", "--log-prefix",
                              "\"%s_ICMP_BLOCK: \"" % _policy ]
                rules.append(rule)
                rule_idx += 1
        else:
            ibi_target = "ACCEPT"

        if enable:
            rule = [ "-I", _policy, str(rule_idx) ]
        else:
            rule = [ "-D", _policy ]
        rule = rule + [ "-t", table, "-p", "%%ICMP%%", "-j", ibi_target ]
        rules.append(rule)

        return rules

    def build_policy_rich_source_destination_rules(self, enable, policy, rich_rule):
        table = "filter"

        rule_fragment = []
        rule_fragment += self._rich_rule_destination_fragment(rich_rule.destination)
        rule_fragment += self._rich_rule_source_fragment(rich_rule.source)

        rules = []
        rules.append(self._rich_rule_log(policy, rich_rule, enable, table, rule_fragment))
        rules.append(self._rich_rule_audit(policy, rich_rule, enable, table, rule_fragment))
        rules.append(self._rich_rule_action(policy, rich_rule, enable, table, rule_fragment))

        return rules

    def is_ipv_supported(self, ipv):
        return ipv == self.ipv

class ip6tables(ip4tables):
    ipv = "ipv6"
    name = "ip6tables"

    def build_rpfilter_rules(self, log_denied=False):
        rules = []
        rules.append([ "-I", "PREROUTING", "-t", "mangle",
                       "-m", "rpfilter", "--invert", "--validmark",
                       "-j", "DROP" ])
        if log_denied != "off":
            rules.append([ "-I", "PREROUTING", "-t", "mangle",
                           "-m", "rpfilter", "--invert", "--validmark",
                           "-j", "LOG",
                           "--log-prefix", "rpfilter_DROP: " ])
        rules.append([ "-I", "PREROUTING", "-t", "mangle",
                       "-p", "ipv6-icmp",
                       "--icmpv6-type=neighbour-solicitation",
                       "-j", "ACCEPT" ]) # RHBZ#1575431, kernel bug in 4.16-4.17
        rules.append([ "-I", "PREROUTING", "-t", "mangle",
                       "-p", "ipv6-icmp",
                       "--icmpv6-type=router-advertisement",
                       "-j", "ACCEPT" ]) # RHBZ#1058505
        return rules

    def build_rfc3964_ipv4_rules(self):
        daddr_list = [
                     "::0.0.0.0/96", # IPv4 compatible
                     "::ffff:0.0.0.0/96", # IPv4 mapped
                     "2002:0000::/24", # 0.0.0.0/8 (the system has no address assigned yet)
                     "2002:0a00::/24", # 10.0.0.0/8 (private)
                     "2002:7f00::/24", # 127.0.0.0/8 (loopback)
                     "2002:ac10::/28", # 172.16.0.0/12 (private)
                     "2002:c0a8::/32", # 192.168.0.0/16 (private)
                     "2002:a9fe::/32", # 169.254.0.0/16 (IANA Assigned DHCP link-local)
                     "2002:e000::/19", # 224.0.0.0/4 (multicast), 240.0.0.0/4 (reserved and broadcast)
                     ]

        chain_name = "RFC3964_IPv4"
        self.our_chains["filter"].add(chain_name)

        rules = []
        rules.append(["-t", "filter", "-N", chain_name])
        for daddr in daddr_list:
            rules.append(["-t", "filter", "-I", chain_name,
                          "-d", daddr, "-j", "REJECT", "--reject-with",
                          "addr-unreach"])
            if self._fw._log_denied in ["unicast", "all"]:
                rules.append(["-t", "filter", "-I", chain_name,
                              "-d", daddr, "-j", "LOG",
                              "--log-prefix", "\"RFC3964_IPv4_REJECT: \""])

        # Inject into FORWARD and OUTPUT chains
        rules.append(["-t", "filter", "-I", "OUTPUT", "4",
                      "-j", chain_name])
        rules.append(["-t", "filter", "-I", "FORWARD", "4",
                      "-j", chain_name])
        return rules
site-packages/firewall/core/ipset.py
# -*- coding: utf-8 -*-
#
# Copyright (C) 2015-2016 Red Hat, Inc.
#
# Authors:
# Thomas Woerner <twoerner@redhat.com>
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program.  If not, see <http://www.gnu.org/licenses/>.
#

"""The ipset command wrapper"""

__all__ = [ "ipset", "check_ipset_name", "remove_default_create_options" ]

import os.path
import ipaddress

from firewall import errors
from firewall.errors import FirewallError
from firewall.core.prog import runProg
from firewall.core.logger import log
from firewall.functions import tempFile, readfile
from firewall.config import COMMANDS

IPSET_MAXNAMELEN = 32
IPSET_TYPES = [
    # bitmap and set types are currently not supported
    # "bitmap:ip",
    # "bitmap:ip,mac",
    # "bitmap:port",
    # "list:set",

    "hash:ip",
    "hash:ip,port",
    "hash:ip,port,ip",
    "hash:ip,port,net",
    "hash:ip,mark",

    "hash:net",
    "hash:net,net",
    "hash:net,port",
    "hash:net,port,net",
    "hash:net,iface",

    "hash:mac",
]
IPSET_CREATE_OPTIONS = {
    "family": "inet|inet6",
    "hashsize": "value",
    "maxelem": "value",
    "timeout": "value in secs",
    #"counters": None,
    #"comment": None,
}
IPSET_DEFAULT_CREATE_OPTIONS = {
    "family": "inet",
    "hashsize": "1024",
    "maxelem": "65536",
}

class ipset(object):
    """ipset command wrapper class"""

    def __init__(self):
        self._command = COMMANDS["ipset"]
        self.name = "ipset"

    def __run(self, args):
        """Call ipset with args"""
        # convert to string list
        _args = ["%s" % item for item in args]
        log.debug2("%s: %s %s", self.__class__, self._command, " ".join(_args))
        (status, ret) = runProg(self._command, _args)
        if status != 0:
            raise ValueError("'%s %s' failed: %s" % (self._command,
                                                     " ".join(_args), ret))
        return ret

    def check_name(self, name):
        """Check ipset name"""
        if len(name) > IPSET_MAXNAMELEN:
            raise FirewallError(errors.INVALID_NAME,
                                "ipset name '%s' is not valid" % name)

    def set_supported_types(self):
        """Return types that are supported by the ipset command and kernel"""
        ret = [ ]
        output = ""
        try:
            output = self.__run(["--help"])
        except ValueError as ex:
            log.debug1("ipset error: %s" % ex)
        lines = output.splitlines()

        in_types = False
        for line in lines:
            #print(line)
            if in_types:
                splits = line.strip().split(None, 2)
                if splits[0] not in ret and splits[0] in IPSET_TYPES:
                    ret.append(splits[0])
            if line.startswith("Supported set types:"):
                in_types = True
        return ret

    def check_type(self, type_name):
        """Check ipset type"""
        if len(type_name) > IPSET_MAXNAMELEN or type_name not in IPSET_TYPES:
            raise FirewallError(errors.INVALID_TYPE,
                                "ipset type name '%s' is not valid" % type_name)

    def set_create(self, set_name, type_name, options=None):
        """Create an ipset with name, type and options"""
        self.check_name(set_name)
        self.check_type(type_name)

        args = [ "create", set_name, type_name ]
        if isinstance(options, dict):
            for key, val in options.items():
                args.append(key)
                if val != "":
                    args.append(val)
        return self.__run(args)

    def set_destroy(self, set_name):
        self.check_name(set_name)
        return self.__run([ "destroy", set_name ])

    def set_add(self, set_name, entry):
        args = [ "add", set_name, entry ]
        return self.__run(args)

    def set_delete(self, set_name, entry):
        args = [ "del", set_name, entry ]
        return self.__run(args)

    def test(self, set_name, entry, options=None):
        args = [ "test", set_name, entry ]
        if options:
            args.append("%s" % " ".join(options))
        return self.__run(args)

    def set_list(self, set_name=None, options=None):
        args = [ "list" ]
        if set_name:
            args.append(set_name)
        if options:
            args.extend(options)
        return self.__run(args).split("\n")

    def set_get_active_terse(self):
        """ Get active ipsets (only headers) """
        lines = self.set_list(options=["-terse"])

        ret = { }
        _name = _type = None
        _options = { }
        for line in lines:
            if len(line) < 1:
                continue
            pair = [ x.strip() for x in line.split(":", 1) ]
            if len(pair) != 2:
                continue
            elif pair[0] == "Name":
                _name = pair[1]
            elif pair[0] == "Type":
                _type = pair[1]
            elif pair[0] == "Header":
                splits = pair[1].split()
                i = 0
                while i < len(splits):
                    opt = splits[i]
                    if opt in [ "family", "hashsize", "maxelem", "timeout",
                                "netmask" ]:
                        if len(splits) > i+1:
                            i += 1
                            _options[opt] = splits[i]
                        else:
                            log.error("Malformed ipset list -terse output: %s",
                                      line)
                            return { }
                    i += 1
                if _name and _type:
                    ret[_name] = (_type,
                                  remove_default_create_options(_options))
                _name = _type = None
                _options.clear()
        return ret
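    # Illustrative sketch (editor's note): "ipset list -terse" prints blocks
    # such as
    #   Name: foo
    #   Type: hash:ip
    #   Header: family inet hashsize 1024 maxelem 65536
    # which the parser above turns into {"foo": ("hash:ip", {})} once the
    # default create options have been stripped.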

    def save(self, set_name=None):
        args = [ "save" ]
        if set_name:
            args.append(set_name)
        return self.__run(args)

    def set_restore(self, set_name, type_name, entries,
                create_options=None, entry_options=None):
        self.check_name(set_name)
        self.check_type(type_name)

        temp_file = tempFile()

        if ' ' in set_name:
            set_name = "'%s'" % set_name
        args = [ "create", set_name, type_name, "-exist" ]
        if create_options:
            for key, val in create_options.items():
                args.append(key)
                if val != "":
                    args.append(val)
        temp_file.write("%s\n" % " ".join(args))
        temp_file.write("flush %s\n" % set_name)

        for entry in entries:
            if ' ' in entry:
                entry = "'%s'" % entry
            if entry_options:
                temp_file.write("add %s %s %s\n" % \
                                (set_name, entry, " ".join(entry_options)))
            else:
                temp_file.write("add %s %s\n" % (set_name, entry))
        temp_file.close()

        stat = os.stat(temp_file.name)
        log.debug2("%s: %s restore %s", self.__class__, self._command,
                   "%s: %d" % (temp_file.name, stat.st_size))

        args = [ "restore" ]
        (status, ret) = runProg(self._command, args,
                                stdin=temp_file.name)

        if log.getDebugLogLevel() > 2:
            try:
                readfile(temp_file.name)
            except Exception:
                pass
            else:
                i = 1
                for line in readfile(temp_file.name):
                    log.debug3("%8d: %s" % (i, line), nofmt=1, nl=0)
                    if not line.endswith("\n"):
                        log.debug3("", nofmt=1)
                    i += 1

        os.unlink(temp_file.name)

        if status != 0:
            raise ValueError("'%s %s' failed: %s" % (self._command,
                                                     " ".join(args), ret))
        return ret
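    # Illustrative sketch (editor's note): set_restore("foo", "hash:ip",
    # ["192.0.2.1", "192.0.2.2"], {"family": "inet"}) feeds "ipset restore" a
    # temporary file containing
    #   create foo hash:ip -exist family inet
    #   flush foo
    #   add foo 192.0.2.1
    #   add foo 192.0.2.2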

    def set_flush(self, set_name):
        args = [ "flush" ]
        if set_name:
            args.append(set_name)
        return self.__run(args)

    def rename(self, old_set_name, new_set_name):
        return self.__run([ "rename", old_set_name, new_set_name ])

    def swap(self, set_name_1, set_name_2):
        return self.__run([ "swap", set_name_1, set_name_2 ])

    def version(self):
        return self.__run([ "version" ])


def check_ipset_name(name):
    """Return true if ipset name is valid"""
    if len(name) > IPSET_MAXNAMELEN:
        return False
    return True

def remove_default_create_options(options):
    """ Return only non default create options """
    _options = options.copy()
    for opt in IPSET_DEFAULT_CREATE_OPTIONS:
        if opt in _options and \
           IPSET_DEFAULT_CREATE_OPTIONS[opt] == _options[opt]:
            del _options[opt]
    return _options
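# Illustrative sketch (editor's note):
#   remove_default_create_options({"family": "inet", "timeout": "60"})
# returns {"timeout": "60"}, because "family inet" is one of the defaults above.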

def normalize_ipset_entry(entry):
    """ Normalize IP addresses in entry """
    _entry = []
    for _part in entry.split(","):
        try:
            _part.index("/")
            _entry.append(str(ipaddress.ip_network(_part, strict=False)))
        except ValueError:
            _entry.append(_part)

    return ",".join(_entry)
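# Illustrative sketch (editor's note): only parts containing a "/" are rewritten:
#   normalize_ipset_entry("10.0.1.0/22")       -> "10.0.0.0/22"
#   normalize_ipset_entry("10.0.1.1,tcp:8080") -> "10.0.1.1,tcp:8080"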

def check_entry_overlaps_existing(entry, entries):
    """ Check if entry overlaps any entry in the list of entries """
    # Only check simple types
    if len(entry.split(",")) > 1:
        return

    try:
        entry_network = ipaddress.ip_network(entry, strict=False)
    except ValueError:
        # could not parse the new IP address, maybe a MAC
        return

    for itr in entries:
        if entry_network.overlaps(ipaddress.ip_network(itr, strict=False)):
            raise FirewallError(errors.INVALID_ENTRY, "Entry '{}' overlaps with existing entry '{}'".format(entry, itr))

def check_for_overlapping_entries(entries):
    """ Check if any entry overlaps any entry in the list of entries """
    try:
        entries = [ipaddress.ip_network(x, strict=False) for x in entries]
    except ValueError:
        # at least one entry can not be parsed
        return

    if len(entries) == 0:
        return

    # We can take advantage of some facts of IPv4Network/IPv6Network and
    # how Python sorts the networks to quickly detect overlaps.
    #
    # Facts:
    #
    #   1. IPv{4,6}Network are normalized to remove host bits, e.g.
    #     10.1.1.0/16 will become 10.1.0.0/16.
    #
    #   2. IPv{4,6}Network objects are sorted by:
    #     a. IP address (network bits)
    #   then
    #     b. netmask (significant bits count)
    #
    # Because of the above we have these properties:
    #
    #   1. big networks (netA) are sorted before smaller networks (netB)
    #      that overlap the big network (netA)
    #     - e.g. 10.1.128.0/17 (netA) sorts before 10.1.129.0/24 (netB)
    #   2. same value addresses (network bits) are grouped together even
    #      if the number of network bits vary. e.g. /16 vs /24
    #     - recall that addresses are normalized to remove host bits
    #     - e.g. 10.1.128.0/17 (netA) sorts before 10.1.128.0/24 (netC)
    #   3. non-overlapping networks (netD, netE) are always sorted before or
    #      after networks that overlap (netB, netC) the current one (netA)
    #     - e.g. 10.1.128.0/17 (netA) sorts before 10.2.128.0/16 (netD)
    #     - e.g. 10.1.128.0/17 (netA) sorts after 9.1.128.0/17 (netE)
    #     - e.g. 9.1.128.0/17 (netE) sorts before 10.1.129.0/24 (netB)
    #
    # With this we know the sorted list looks like:
    #
    #   list: [ netE, netA, netB, netC, netD ]
    #
    #   netE = non-overlapping network
    #   netA = big network
    #   netB = smaller network that overlaps netA (subnet)
    #   netC = smaller network that overlaps netA (subnet)
    #   netD = non-overlapping network
    #
    #   If networks netB and netC exist in the list, they overlap and are
    #   adjacent to netA.
    #
    # Checking for overlaps on a sorted list is thus:
    #
    #   1. compare adjacent elements in the list for overlaps
    #
    # Recall that we only need to detect a single overlap. We do not need to
    # detect them all.
    #
    entries.sort()
    prev_network = entries.pop(0)
    for current_network in entries:
        if prev_network.overlaps(current_network):
            raise FirewallError(errors.INVALID_ENTRY, "Entry '{}' overlaps entry '{}'".format(prev_network, current_network))
        prev_network = current_network
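# Usage sketch for the sort-based overlap check above (illustrative only):
#
#   check_for_overlapping_entries(["10.1.0.0/16", "192.168.1.0/24"])  # passes
#
#   check_for_overlapping_entries(["10.1.0.0/16", "10.1.129.0/24"])
#   # raises FirewallError: Entry '10.1.0.0/16' overlaps entry '10.1.129.0/24'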
site-packages/firewall/core/base.py
# -*- coding: utf-8 -*-
#
# Copyright (C) 2011-2016 Red Hat, Inc.
#
# Authors:
# Thomas Woerner <twoerner@redhat.com>
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program.  If not, see <http://www.gnu.org/licenses/>.
#

"""Base firewall settings"""

DEFAULT_ZONE_TARGET = "{chain}_{zone}"
DEFAULT_POLICY_TARGET = "CONTINUE"
DEFAULT_POLICY_PRIORITY = -1

ZONE_TARGETS = [ "ACCEPT", "%%REJECT%%", "DROP", DEFAULT_ZONE_TARGET,
                 "default" ]

POLICY_TARGETS = [ "ACCEPT", "REJECT", "DROP", "CONTINUE" ]

SHORTCUTS = {
    "PREROUTING": "PRE",
    "POSTROUTING": "POST",
    "INPUT": "IN",
    "FORWARD_IN": "FWDI",
    "FORWARD_OUT": "FWDO",
    "OUTPUT": "OUT",
}

REJECT_TYPES = {
    "ipv4": [ "icmp-host-prohibited", "host-prohib", "icmp-net-unreachable",
              "net-unreach", "icmp-host-unreachable", "host-unreach",
              "icmp-port-unreachable", "port-unreach", "icmp-proto-unreachable",
              "proto-unreach", "icmp-net-prohibited", "net-prohib", "tcp-reset",
              "tcp-rst", "icmp-admin-prohibited", "admin-prohib" ],
    "ipv6": [ "icmp6-adm-prohibited", "adm-prohibited", "icmp6-no-route",
              "no-route", "icmp6-addr-unreachable", "addr-unreach",
              "icmp6-port-unreachable", "port-unreach", "tcp-reset" ]
}

# ipset types that can be used as a source in zones
# The match-set option will be src or src,src according to the
# dimension of the ipset.
SOURCE_IPSET_TYPES = [
    "hash:ip", "hash:ip,port", "hash:ip,mark",
    "hash:net", "hash:net,port", "hash:net,iface",
    "hash:mac"
]
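# Illustrative sketch (an assumption, not firewalld code): the dimension of an
# ipset type is the number of comma separated data types after "hash:"; it is
# what decides between a single "src" and "src,src" for a match-set option.
#
#   def _match_set_flags(ipset_type):
#       dimension = len(ipset_type.split(":", 1)[1].split(","))
#       return ",".join(["src"] * dimension)
#
#   _match_set_flags("hash:ip")       # -> "src"
#   _match_set_flags("hash:ip,port")  # -> "src,src"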
site-packages/firewall/core/prog.py
# -*- coding: utf-8 -*-
#
# Copyright (C) 2010-2016 Red Hat, Inc.
#
# Authors:
# Thomas Woerner <twoerner@redhat.com>
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program.  If not, see <http://www.gnu.org/licenses/>.
#

import subprocess


__all__ = ["runProg"]


def runProg(prog, argv=None, stdin=None):
    if argv is None:
        argv = []

    args = [prog] + argv

    input_string = None
    if stdin:
        with open(stdin, 'r') as handle:
            input_string = handle.read().encode()

    env = {'LANG': 'C'}
    try:
        process = subprocess.Popen(args, stdin=subprocess.PIPE,
                                   stderr=subprocess.STDOUT,
                                   stdout=subprocess.PIPE,
                                   close_fds=True, env=env)
    except OSError:
        return (255, '')

    (output, err_output) = process.communicate(input_string)
    output = output.decode('utf-8', 'replace')
    return (process.returncode, output)
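# Example call (illustrative): runProg returns the exit status and the merged
# stdout/stderr of the command, or (255, '') if the binary could not be run.
#
#   (status, output) = runProg("/usr/bin/env", ["true"])
#   if status != 0:
#       print("command failed: %s" % output)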
site-packages/firewall/core/helper.py
# -*- coding: utf-8 -*-
#
# Copyright (C) 2016 Red Hat, Inc.
#
# Authors:
# Thomas Woerner <twoerner@redhat.com>
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program.  If not, see <http://www.gnu.org/licenses/>.
#

"""The helper maxnamelen"""

HELPER_MAXNAMELEN = 32
site-packages/firewall/core/fw_ifcfg.py
# -*- coding: utf-8 -*-
#
# Copyright (C) 2010-2016 Red Hat, Inc.
#
# Authors:
# Thomas Woerner <twoerner@redhat.com>
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program.  If not, see <http://www.gnu.org/licenses/>.
#

"""Functions to search for and change ifcfg files"""

__all__ = [ "search_ifcfg_of_interface", "ifcfg_set_zone_of_interface" ]

import os
import os.path

from firewall import config
from firewall.core.logger import log
from firewall.core.io.ifcfg import ifcfg

def search_ifcfg_of_interface(interface):
    """search ifcfg file for the interface in config.IFCFGDIR"""

    # Return quickly if config.IFCFGDIR does not exist
    if not os.path.exists(config.IFCFGDIR):
        return None

    for filename in sorted(os.listdir(config.IFCFGDIR)):
        if not filename.startswith("ifcfg-"):
            continue
        for ignored in [ ".bak", ".orig", ".rpmnew", ".rpmorig", ".rpmsave",
                         "-range" ]:
            if filename.endswith(ignored):
                continue
        if "." in filename:
            continue
        ifcfg_file = ifcfg("%s/%s" % (config.IFCFGDIR, filename))
        ifcfg_file.read()
        if ifcfg_file.get("DEVICE") == interface:
            return ifcfg_file

    # Wasn't found above, so assume filename matches the device we want
    filename = "%s/ifcfg-%s" % (config.IFCFGDIR, interface)
    if os.path.exists(filename):
        ifcfg_file = ifcfg(filename)
        ifcfg_file.read()
        return ifcfg_file

    return None

def ifcfg_set_zone_of_interface(zone, interface):
    """Set zone (ZONE=<zone>) in the ifcfg file that uses the interface
    (DEVICE=<interface>)"""

    if zone is None:
        zone = ""

    ifcfg_file = search_ifcfg_of_interface(interface)
    if ifcfg_file is not None and ifcfg_file.get("ZONE") != zone and not \
       (ifcfg_file.get("ZONE") is None and zone == ""):
        log.debug1("Setting ZONE=%s in '%s'" % (zone, ifcfg_file.filename))
        ifcfg_file.set("ZONE", zone)
        ifcfg_file.write()
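# Example (illustrative; requires an ifcfg file for the interface below
# config.IFCFGDIR): write ZONE=internal into the ifcfg file of device "eth0"
# if the value differs.
#
#   ifcfg_set_zone_of_interface("internal", "eth0")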
site-packages/firewall/core/fw.py
# -*- coding: utf-8 -*-
#
# Copyright (C) 2010-2016 Red Hat, Inc.
#
# Authors:
# Thomas Woerner <twoerner@redhat.com>
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program.  If not, see <http://www.gnu.org/licenses/>.
#

__all__ = [ "Firewall" ]

import os.path
import sys
import copy
import time
import traceback
from firewall import config
from firewall import functions
from firewall.core import ipXtables
from firewall.core import ebtables
from firewall.core import nftables
from firewall.core import ipset
from firewall.core import modules
from firewall.core.fw_icmptype import FirewallIcmpType
from firewall.core.fw_service import FirewallService
from firewall.core.fw_zone import FirewallZone
from firewall.core.fw_direct import FirewallDirect
from firewall.core.fw_config import FirewallConfig
from firewall.core.fw_policies import FirewallPolicies
from firewall.core.fw_ipset import FirewallIPSet
from firewall.core.fw_transaction import FirewallTransaction
from firewall.core.fw_helper import FirewallHelper
from firewall.core.fw_policy import FirewallPolicy
from firewall.core.fw_nm import nm_get_bus_name, nm_get_interfaces_in_zone
from firewall.core.logger import log
from firewall.core.io.firewalld_conf import firewalld_conf
from firewall.core.io.direct import Direct
from firewall.core.io.service import service_reader
from firewall.core.io.icmptype import icmptype_reader
from firewall.core.io.zone import zone_reader, Zone
from firewall.core.io.ipset import ipset_reader
from firewall.core.ipset import IPSET_TYPES
from firewall.core.io.helper import helper_reader
from firewall.core.io.policy import policy_reader
from firewall import errors
from firewall.errors import FirewallError

############################################################################
#
# class Firewall
#
############################################################################

class Firewall(object):
    def __init__(self, offline=False):
        self._firewalld_conf = firewalld_conf(config.FIREWALLD_CONF)
        self._offline = offline

        if self._offline:
            self.ip4tables_enabled = False
            self.ip6tables_enabled = False
            self.ebtables_enabled = False
            self.ipset_enabled = False
            self.ipset_supported_types = IPSET_TYPES
            self.nftables_enabled = False
        else:
            self.ip4tables_backend = ipXtables.ip4tables(self)
            self.ip4tables_enabled = True
            self.ipv4_supported_icmp_types = [ ]
            self.ip6tables_backend = ipXtables.ip6tables(self)
            self.ip6tables_enabled = True
            self.ipv6_supported_icmp_types = [ ]
            self.ebtables_backend = ebtables.ebtables()
            self.ebtables_enabled = True
            self.ipset_backend = ipset.ipset()
            self.ipset_enabled = True
            self.ipset_supported_types = [ ]
            self.nftables_backend = nftables.nftables(self)
            self.nftables_enabled = True

            self.modules_backend = modules.modules()

        self.icmptype = FirewallIcmpType(self)
        self.service = FirewallService(self)
        self.zone = FirewallZone(self)
        self.direct = FirewallDirect(self)
        self.config = FirewallConfig(self)
        self.policies = FirewallPolicies()
        self.ipset = FirewallIPSet(self)
        self.helper = FirewallHelper(self)
        self.policy = FirewallPolicy(self)

        self.__init_vars()

    def __repr__(self):
        return '%s(%r, %r, %r, %r, %r, %r, %r, %r, %r, %r, %r, %r, %r, %r)' % \
            (self.__class__, self.ip4tables_enabled, self.ip6tables_enabled,
             self.ebtables_enabled, self._state, self._panic,
             self._default_zone, self._module_refcount, self._marks,
             self.cleanup_on_exit, self.cleanup_modules_on_exit,
             self.ipv6_rpfilter_enabled, self.ipset_enabled,
             self._individual_calls, self._log_denied)

    def __init_vars(self):
        self._state = "INIT"
        self._panic = False
        self._default_zone = ""
        self._module_refcount = { }
        self._marks = [ ]
        # fallback settings will be overloaded by firewalld.conf
        self.cleanup_on_exit = config.FALLBACK_CLEANUP_ON_EXIT
        self.cleanup_modules_on_exit = config.FALLBACK_CLEANUP_MODULES_ON_EXIT
        self.ipv6_rpfilter_enabled = config.FALLBACK_IPV6_RPFILTER
        self._individual_calls = config.FALLBACK_INDIVIDUAL_CALLS
        self._log_denied = config.FALLBACK_LOG_DENIED
        self._firewall_backend = config.FALLBACK_FIREWALL_BACKEND
        self._flush_all_on_reload = config.FALLBACK_FLUSH_ALL_ON_RELOAD
        self._rfc3964_ipv4 = config.FALLBACK_RFC3964_IPV4
        self._allow_zone_drifting = config.FALLBACK_ALLOW_ZONE_DRIFTING

    def _check_tables(self):
        # check if iptables, ip6tables and ebtables are usable, else disable
        if self.ip4tables_enabled and \
           "filter" not in self.ip4tables_backend.get_available_tables():
            log.info1("iptables is not usable.")
            self.ip4tables_enabled = False

        if self.ip6tables_enabled and \
           "filter" not in self.ip6tables_backend.get_available_tables():
            log.info1("ip6tables is not usable.")
            self.ip6tables_enabled = False

        if self.ebtables_enabled and \
           "filter" not in self.ebtables_backend.get_available_tables():
            log.info1("ebtables is not usable.")
            self.ebtables_enabled = False

        # is there at least support for ipv4 or ipv6
        if not self.ip4tables_enabled and not self.ip6tables_enabled \
           and not self.nftables_enabled:
            log.fatal("No IPv4 and IPv6 firewall.")
            sys.exit(1)

    def _start_check(self):
        try:
            self.ipset_backend.set_list()
        except ValueError:
            log.warning("ipset not usable, disabling ipset usage in firewall.")
            # ipset is not usable, no supported types
            self.ipset_enabled = False
            self.ipset_supported_types = [ ]
        else:
            # ipset is usable, get all supported types
            self.ipset_supported_types = self.ipset_backend.set_supported_types()

        self.ip4tables_backend.fill_exists()
        if not self.ip4tables_backend.restore_command_exists:
            if self.ip4tables_backend.command_exists:
                log.warning("iptables-restore is missing, using "
                            "individual calls for IPv4 firewall.")
            else:
                log.warning("iptables-restore and iptables are missing, "
                            "disabling IPv4 firewall.")
                self.ip4tables_enabled = False
        if self.nftables_enabled:
            self.ipv4_supported_icmp_types = self.nftables_backend.supported_icmp_types("ipv4")
        else:
            if self.ip4tables_enabled:
                self.ipv4_supported_icmp_types = self.ip4tables_backend.supported_icmp_types()
            else:
                self.ipv4_supported_icmp_types = [ ]
        self.ip6tables_backend.fill_exists()
        if not self.ip6tables_backend.restore_command_exists:
            if self.ip6tables_backend.command_exists:
                log.warning("ip6tables-restore is missing, using "
                            "individual calls for IPv6 firewall.")
            else:
                log.warning("ip6tables-restore and ip6tables are missing, "
                            "disabling IPv6 firewall.")
                self.ip6tables_enabled = False
        if self.nftables_enabled:
            self.ipv6_supported_icmp_types = self.nftables_backend.supported_icmp_types("ipv6")
        else:
            if self.ip6tables_enabled:
                self.ipv6_supported_icmp_types = self.ip6tables_backend.supported_icmp_types()
            else:
                self.ipv6_supported_icmp_types = [ ]
        self.ebtables_backend.fill_exists()
        if not self.ebtables_backend.restore_command_exists:
            if self.ebtables_backend.command_exists:
                log.warning("ebtables-restore is missing, using "
                            "individual calls for bridge firewall.")
            else:
                log.warning("ebtables-restore and ebtables are missing, "
                            "disabling bridge firewall.")
                self.ebtables_enabled = False

        if self.ebtables_enabled and not self._individual_calls and \
           not self.ebtables_backend.restore_noflush_option:
            log.debug1("ebtables-restore is not supporting the --noflush "
                       "option, will therefore not be used")

    def _start(self, reload=False, complete_reload=False):
        # initialize firewall
        default_zone = config.FALLBACK_ZONE

        # load firewalld config
        log.debug1("Loading firewalld config file '%s'", config.FIREWALLD_CONF)
        try:
            self._firewalld_conf.read()
        except Exception as msg:
            log.warning(msg)
            log.warning("Using fallback firewalld configuration settings.")
        else:
            if self._firewalld_conf.get("DefaultZone"):
                default_zone = self._firewalld_conf.get("DefaultZone")

            if self._firewalld_conf.get("CleanupOnExit"):
                value = self._firewalld_conf.get("CleanupOnExit")
                if value is not None and value.lower() in [ "no", "false" ]:
                    self.cleanup_on_exit = False
                log.debug1("CleanupOnExit is set to '%s'",
                           self.cleanup_on_exit)

            if self._firewalld_conf.get("CleanupModulesOnExit"):
                value = self._firewalld_conf.get("CleanupModulesOnExit")
                if value is not None and value.lower() in [ "yes", "true" ]:
                    self.cleanup_modules_on_exit = True
                if value is not None and value.lower() in [ "no", "false" ]:
                    self.cleanup_modules_on_exit = False
                log.debug1("CleanupModulesOnExit is set to '%s'",
                           self.cleanup_modules_on_exit)

            if self._firewalld_conf.get("Lockdown"):
                value = self._firewalld_conf.get("Lockdown")
                if value is not None and value.lower() in [ "yes", "true" ]:
                    log.debug1("Lockdown is enabled")
                    try:
                        self.policies.enable_lockdown()
                    except FirewallError:
                        # already enabled, this is probably reload
                        pass

            if self._firewalld_conf.get("IPv6_rpfilter"):
                value = self._firewalld_conf.get("IPv6_rpfilter")
                if value is not None:
                    if value.lower() in [ "no", "false" ]:
                        self.ipv6_rpfilter_enabled = False
                    if value.lower() in [ "yes", "true" ]:
                        self.ipv6_rpfilter_enabled = True
            if self.ipv6_rpfilter_enabled:
                log.debug1("IPv6 rpfilter is enabled")
            else:
                log.debug1("IPV6 rpfilter is disabled")

            if self._firewalld_conf.get("IndividualCalls"):
                value = self._firewalld_conf.get("IndividualCalls")
                if value is not None and value.lower() in [ "yes", "true" ]:
                    log.debug1("IndividualCalls is enabled")
                    self._individual_calls = True

            if self._firewalld_conf.get("LogDenied"):
                value = self._firewalld_conf.get("LogDenied")
                if value is None or value.lower() == "no":
                    self._log_denied = "off"
                else:
                    self._log_denied = value.lower()
                log.debug1("LogDenied is set to '%s'", self._log_denied)

            if self._firewalld_conf.get("FirewallBackend"):
                self._firewall_backend = self._firewalld_conf.get("FirewallBackend")
                log.debug1("FirewallBackend is set to '%s'",
                           self._firewall_backend)

            if self._firewalld_conf.get("FlushAllOnReload"):
                value = self._firewalld_conf.get("FlushAllOnReload")
                if value.lower() in [ "no", "false" ]:
                    self._flush_all_on_reload = False
                else:
                    self._flush_all_on_reload = True
                log.debug1("FlushAllOnReload is set to '%s'",
                           self._flush_all_on_reload)

            if self._firewalld_conf.get("RFC3964_IPv4"):
                value = self._firewalld_conf.get("RFC3964_IPv4")
                if value.lower() in [ "no", "false" ]:
                    self._rfc3964_ipv4 = False
                else:
                    self._rfc3964_ipv4 = True
                log.debug1("RFC3964_IPv4 is set to '%s'",
                           self._rfc3964_ipv4)

            if self._firewalld_conf.get("AllowZoneDrifting"):
                value = self._firewalld_conf.get("AllowZoneDrifting")
                if value.lower() in [ "no", "false" ]:
                    self._allow_zone_drifting = False
                else:
                    self._allow_zone_drifting = True
                    if not self._offline:
                        log.warning("AllowZoneDrifting is enabled. This is considered "
                                    "an insecure configuration option. It will be "
                                    "removed in a future release. Please consider "
                                    "disabling it now.")
                log.debug1("AllowZoneDrifting is set to '%s'",
                           self._allow_zone_drifting)

        self.config.set_firewalld_conf(copy.deepcopy(self._firewalld_conf))

        self._select_firewall_backend(self._firewall_backend)

        if not self._offline:
            self._start_check()

        # load lockdown whitelist
        log.debug1("Loading lockdown whitelist")
        try:
            self.policies.lockdown_whitelist.read()
        except Exception as msg:
            if self.policies.query_lockdown():
                log.error("Failed to load lockdown whitelist '%s': %s",
                          self.policies.lockdown_whitelist.filename, msg)
            else:
                log.debug1("Failed to load lockdown whitelist '%s': %s",
                           self.policies.lockdown_whitelist.filename, msg)

        # copy policies to config interface
        self.config.set_policies(copy.deepcopy(self.policies))

        # load ipset files
        self._loader(config.FIREWALLD_IPSETS, "ipset")
        self._loader(config.ETC_FIREWALLD_IPSETS, "ipset")

        # load icmptype files
        self._loader(config.FIREWALLD_ICMPTYPES, "icmptype")
        self._loader(config.ETC_FIREWALLD_ICMPTYPES, "icmptype")

        if len(self.icmptype.get_icmptypes()) == 0:
            log.error("No icmptypes found.")

        # load helper files
        self._loader(config.FIREWALLD_HELPERS, "helper")
        self._loader(config.ETC_FIREWALLD_HELPERS, "helper")

        # load service files
        self._loader(config.FIREWALLD_SERVICES, "service")
        self._loader(config.ETC_FIREWALLD_SERVICES, "service")

        if len(self.service.get_services()) == 0:
            log.error("No services found.")

        # load zone files
        self._loader(config.FIREWALLD_ZONES, "zone")
        self._loader(config.ETC_FIREWALLD_ZONES, "zone")

        if len(self.zone.get_zones()) == 0:
            log.fatal("No zones found.")
            sys.exit(1)

        # load policy files
        self._loader(config.FIREWALLD_POLICIES, "policy")
        self._loader(config.ETC_FIREWALLD_POLICIES, "policy")

        # check minimum required zones
        error = False
        for z in [ "block", "drop", "trusted" ]:
            if z not in self.zone.get_zones():
                log.fatal("Zone '%s' is not available.", z)
                error = True
        if error:
            sys.exit(1)

        # check if default_zone is a valid zone
        if default_zone not in self.zone.get_zones():
            if "public" in self.zone.get_zones():
                zone = "public"
            elif "external" in self.zone.get_zones():
                zone = "external"
            else:
                zone = "block" # block is a base zone, therefore it has to exist

            log.error("Default zone '%s' is not valid. Using '%s'.",
                      default_zone, zone)
            default_zone = zone
        else:
            log.debug1("Using default zone '%s'", default_zone)

        # load direct rules
        obj = Direct(config.FIREWALLD_DIRECT)
        if os.path.exists(config.FIREWALLD_DIRECT):
            log.debug1("Loading direct rules file '%s'" % \
                       config.FIREWALLD_DIRECT)
            try:
                obj.read()
            except Exception as msg:
                log.error("Failed to load direct rules file '%s': %s",
                          config.FIREWALLD_DIRECT, msg)
        self.direct.set_permanent_config(obj)
        self.config.set_direct(copy.deepcopy(obj))

        self._default_zone = self.check_zone(default_zone)

        if self._offline:
            return

        # check if needed tables are there
        self._check_tables()

        if log.getDebugLogLevel() > 0:
            # get time before flushing and applying
            tm1 = time.time()

        # Start transaction
        transaction = FirewallTransaction(self)

        # flush rules
        self.flush(use_transaction=transaction)

        # If modules need to be unloaded in complete reload or if there are
        # ipsets to get applied, limit the transaction to flush.
        #
        # Future optimization for the ipset case in reload: The transaction
        # only needs to be split here if there are conflicting ipset types in
        # existing ipsets and the configuration in firewalld.
        if (reload and complete_reload) or \
           (self.ipset_enabled and self.ipset.has_ipsets()):
            transaction.execute(True)
            transaction.clear()

        # complete reload: unload modules also
        if reload and complete_reload:
            log.debug1("Unloading firewall modules")
            self.modules_backend.unload_firewall_modules()

        self.apply_default_tables(use_transaction=transaction)
        transaction.execute(True)
        transaction.clear()

        # apply settings for loaded ipsets while reloading here
        if self.ipset_enabled and self.ipset.has_ipsets():
            log.debug1("Applying ipsets")
            self.ipset.apply_ipsets()

        # Start or continue with transaction

        # apply default rules
        log.debug1("Applying default rule set")
        self.apply_default_rules(use_transaction=transaction)

        # apply settings for loaded zones
        log.debug1("Applying used zones")
        self.zone.apply_zones(use_transaction=transaction)

        self.zone.change_default_zone(None, self._default_zone,
                                      use_transaction=transaction)

        # apply policies
        log.debug1("Applying used policies")
        self.policy.apply_policies(use_transaction=transaction)

        # Execute transaction
        transaction.execute(True)

        # Start new transaction for direct rules
        transaction.clear()

        # apply direct chains, rules and passthrough rules
        if self.direct.has_configuration():
            log.debug1("Applying direct chains rules and passthrough rules")
            self.direct.apply_direct(transaction)

            # since it is easy to make syntax errors in direct rules, highlight
            # the cause if the transaction fails.
            try:
                transaction.execute(True)
                transaction.clear()
            except FirewallError as e:
                raise FirewallError(e.code, "Direct: %s" % (e.msg if e.msg else ""))
            except Exception:
                raise

        del transaction

        if log.getDebugLogLevel() > 1:
            # get time after flushing and applying
            tm2 = time.time()
            log.debug2("Flushing and applying took %f seconds" % (tm2 - tm1))

    def start(self):
        try:
            self._start()
        except Exception:
            self._state = "FAILED"
            self.set_policy("ACCEPT")
            raise
        else:
            self._state = "RUNNING"
            self.set_policy("ACCEPT")

    def _loader(self, path, reader_type, combine=False):
        # combine: several zone files are getting combined into one obj
        if not os.path.isdir(path):
            return

        if combine:
            if path.startswith(config.ETC_FIREWALLD) and reader_type == "zone":
                combined_zone = Zone()
                combined_zone.name = os.path.basename(path)
                combined_zone.check_name(combined_zone.name)
                combined_zone.path = path
                combined_zone.default = False
            else:
                combine = False

        for filename in sorted(os.listdir(path)):
            if not filename.endswith(".xml"):
                if path.startswith(config.ETC_FIREWALLD) and \
                        reader_type == "zone" and \
                        os.path.isdir("%s/%s" % (path, filename)):
                    self._loader("%s/%s" % (path, filename), reader_type,
                                 combine=True)
                continue

            name = "%s/%s" % (path, filename)
            log.debug1("Loading %s file '%s'", reader_type, name)
            try:
                if reader_type == "icmptype":
                    obj = icmptype_reader(filename, path)
                    if obj.name in self.icmptype.get_icmptypes():
                        orig_obj = self.icmptype.get_icmptype(obj.name)
                        log.debug1("  Overloads %s '%s' ('%s/%s')", reader_type,
                                   orig_obj.name, orig_obj.path,
                                   orig_obj.filename)
                        self.icmptype.remove_icmptype(orig_obj.name)
                    elif obj.path.startswith(config.ETC_FIREWALLD):
                        obj.default = True
                    try:
                        self.icmptype.add_icmptype(obj)
                    except FirewallError as error:
                        log.info1("%s: %s, ignoring for run-time." % \
                                    (obj.name, str(error)))
                    # add a deep copy to the configuration interface
                    self.config.add_icmptype(copy.deepcopy(obj))
                elif reader_type == "service":
                    obj = service_reader(filename, path)
                    if obj.name in self.service.get_services():
                        orig_obj = self.service.get_service(obj.name)
                        log.debug1("  Overloads %s '%s' ('%s/%s')", reader_type,
                                   orig_obj.name, orig_obj.path,
                                   orig_obj.filename)
                        self.service.remove_service(orig_obj.name)
                    elif obj.path.startswith(config.ETC_FIREWALLD):
                        obj.default = True
                    self.service.add_service(obj)
                    # add a deep copy to the configuration interface
                    self.config.add_service(copy.deepcopy(obj))
                elif reader_type == "zone":
                    obj = zone_reader(filename, path, no_check_name=combine)
                    if combine:
                        # Change name for permanent configuration
                        obj.name = "%s/%s" % (
                            os.path.basename(path),
                            os.path.basename(filename)[0:-4])
                        obj.check_name(obj.name)
                    # Copy object before combine
                    config_obj = copy.deepcopy(obj)
                    if obj.name in self.zone.get_zones():
                        orig_obj = self.zone.get_zone(obj.name)
                        self.zone.remove_zone(orig_obj.name)
                        if orig_obj.combined:
                            log.debug1("  Combining %s '%s' ('%s/%s')",
                                        reader_type, obj.name,
                                        path, filename)
                            obj.combine(orig_obj)
                        else:
                            log.debug1("  Overloads %s '%s' ('%s/%s')",
                                       reader_type,
                                       orig_obj.name, orig_obj.path,
                                       orig_obj.filename)
                    elif obj.path.startswith(config.ETC_FIREWALLD):
                        obj.default = True
                        config_obj.default = True
                    self.config.add_zone(config_obj)
                    if combine:
                        log.debug1("  Combining %s '%s' ('%s/%s')",
                                   reader_type, combined_zone.name,
                                   path, filename)
                        combined_zone.combine(obj)
                    else:
                        self.zone.add_zone(obj)
                elif reader_type == "ipset":
                    obj = ipset_reader(filename, path)
                    if obj.name in self.ipset.get_ipsets():
                        orig_obj = self.ipset.get_ipset(obj.name)
                        log.debug1("  Overloads %s '%s' ('%s/%s')", reader_type,
                                   orig_obj.name, orig_obj.path,
                                   orig_obj.filename)
                        self.ipset.remove_ipset(orig_obj.name)
                    elif obj.path.startswith(config.ETC_FIREWALLD):
                        obj.default = True
                    try:
                        self.ipset.add_ipset(obj)
                    except FirewallError as error:
                        log.warning("%s: %s, ignoring for run-time." % \
                                    (obj.name, str(error)))
                    # add a deep copy to the configuration interface
                    self.config.add_ipset(copy.deepcopy(obj))
                elif reader_type == "helper":
                    obj = helper_reader(filename, path)
                    if obj.name in self.helper.get_helpers():
                        orig_obj = self.helper.get_helper(obj.name)
                        log.debug1("  Overloads %s '%s' ('%s/%s')", reader_type,
                                   orig_obj.name, orig_obj.path,
                                   orig_obj.filename)
                        self.helper.remove_helper(orig_obj.name)
                    elif obj.path.startswith(config.ETC_FIREWALLD):
                        obj.default = True
                    self.helper.add_helper(obj)
                    # add a deep copy to the configuration interface
                    self.config.add_helper(copy.deepcopy(obj))
                elif reader_type == "policy":
                    obj = policy_reader(filename, path)
                    if obj.name in self.policy.get_policies():
                        orig_obj = self.policy.get_policy(obj.name)
                        log.debug1("  Overloads %s '%s' ('%s/%s')", reader_type,
                                   orig_obj.name, orig_obj.path,
                                   orig_obj.filename)
                        self.policy.remove_policy(orig_obj.name)
                    elif obj.path.startswith(config.ETC_FIREWALLD):
                        obj.default = True
                    self.policy.add_policy(obj)
                    # add a deep copy to the configuration interface
                    self.config.add_policy_object(copy.deepcopy(obj))
                else:
                    log.fatal("Unknown reader type %s", reader_type)
            except FirewallError as msg:
                log.error("Failed to load %s file '%s': %s", reader_type,
                          name, msg)
            except Exception:
                log.error("Failed to load %s file '%s':", reader_type, name)
                log.exception()

        if combine and combined_zone.combined:
            if combined_zone.name in self.zone.get_zones():
                orig_obj = self.zone.get_zone(combined_zone.name)
                log.debug1("  Overloading and deactivating %s '%s' ('%s/%s')",
                           reader_type, orig_obj.name, orig_obj.path,
                           orig_obj.filename)
                try:
                    self.zone.remove_zone(combined_zone.name)
                except Exception:
                    pass
                self.config.forget_zone(combined_zone.name)
            self.zone.add_zone(combined_zone)

    def cleanup(self):
        self.icmptype.cleanup()
        self.service.cleanup()
        self.zone.cleanup()
        self.ipset.cleanup()
        self.helper.cleanup()
        self.config.cleanup()
        self.direct.cleanup()
        self.policies.cleanup()
        self.policy.cleanup()
        self._firewalld_conf.cleanup()
        self.__init_vars()

    def stop(self):
        if not self._offline:
            if self.cleanup_on_exit:
                self.flush()
                self.ipset.flush()
                self.set_policy("ACCEPT")

            if self.cleanup_modules_on_exit:
                log.debug1('Unloading firewall kernel modules')
                self.modules_backend.unload_firewall_modules()

        self.cleanup()

    # handle modules

    def handle_modules(self, _modules, enable):
        num_failed = 0
        error_msgs = ""
        for i,module in enumerate(_modules):
            if enable:
                (status, msg) = self.modules_backend.load_module(module)
            else:
                if self._module_refcount[module] > 1:
                    status = 0 # module referenced more than once, do not unload
                else:
                    (status, msg) = self.modules_backend.unload_module(module)
            if status != 0:
                num_failed += 1
                error_msgs += msg
                continue

            if enable:
                self._module_refcount.setdefault(module, 0)
                self._module_refcount[module] += 1
            else:
                if module in self._module_refcount:
                    self._module_refcount[module] -= 1
                    if self._module_refcount[module] == 0:
                        del self._module_refcount[module]
        return (num_failed, error_msgs)

    def _select_firewall_backend(self, backend):
        if backend != "nftables":
            self.nftables_enabled = False
        # even if using nftables, the other backends are enabled for use with
        # the direct interface. nftables is used for the firewalld primitives.

    def get_backend_by_name(self, name):
        for backend in self.all_backends():
            if backend.name == name:
                return backend
        raise FirewallError(errors.UNKNOWN_ERROR,
                            "'%s' backend does not exist" % name)

    def get_backend_by_ipv(self, ipv):
        if self.nftables_enabled:
            return self.nftables_backend
        if ipv == "ipv4" and self.ip4tables_enabled:
            return self.ip4tables_backend
        elif ipv == "ipv6" and self.ip6tables_enabled:
            return self.ip6tables_backend
        elif ipv == "eb" and self.ebtables_enabled:
            return self.ebtables_backend
        raise FirewallError(errors.INVALID_IPV,
                            "'%s' is not a valid backend or is unavailable" % ipv)

    def get_direct_backend_by_ipv(self, ipv):
        if ipv == "ipv4" and self.ip4tables_enabled:
            return self.ip4tables_backend
        elif ipv == "ipv6" and self.ip6tables_enabled:
            return self.ip6tables_backend
        elif ipv == "eb" and self.ebtables_enabled:
            return self.ebtables_backend
        raise FirewallError(errors.INVALID_IPV,
                            "'%s' is not a valid backend or is unavailable" % ipv)

    def is_backend_enabled(self, name):
        if name == "ip4tables":
            return self.ip4tables_enabled
        elif name == "ip6tables":
            return self.ip6tables_enabled
        elif name == "ebtables":
            return self.ebtables_enabled
        elif name == "nftables":
            return self.nftables_enabled
        return False

    def is_ipv_enabled(self, ipv):
        if self.nftables_enabled:
            return True
        if ipv == "ipv4":
            return self.ip4tables_enabled
        elif ipv == "ipv6":
            return self.ip6tables_enabled
        elif ipv == "eb":
            return self.ebtables_enabled
        return False

    def enabled_backends(self):
        backends = []
        if self.nftables_enabled:
            backends.append(self.nftables_backend)
        else:
            if self.ip4tables_enabled:
                backends.append(self.ip4tables_backend)
            if self.ip6tables_enabled:
                backends.append(self.ip6tables_backend)
            if self.ebtables_enabled:
                backends.append(self.ebtables_backend)
        return backends

    def all_backends(self):
        backends = []
        if self.ip4tables_enabled:
            backends.append(self.ip4tables_backend)
        if self.ip6tables_enabled:
            backends.append(self.ip6tables_backend)
        if self.ebtables_enabled:
            backends.append(self.ebtables_backend)
        if self.nftables_enabled:
            backends.append(self.nftables_backend)
        return backends

    def apply_default_tables(self, use_transaction=None):
        if use_transaction is None:
            transaction = FirewallTransaction(self)
        else:
            transaction = use_transaction

        for backend in self.enabled_backends():
            transaction.add_rules(backend, backend.build_default_tables())

        if use_transaction is None:
            transaction.execute(True)

    def apply_default_rules(self, use_transaction=None):
        if use_transaction is None:
            transaction = FirewallTransaction(self)
        else:
            transaction = use_transaction

        for backend in self.enabled_backends():
            rules = backend.build_default_rules(self._log_denied)
            transaction.add_rules(backend, rules)

        if self.is_ipv_enabled("ipv6"):
            ipv6_backend = self.get_backend_by_ipv("ipv6")
            if "raw" in ipv6_backend.get_available_tables():
                if self.ipv6_rpfilter_enabled:
                    rules = ipv6_backend.build_rpfilter_rules(self._log_denied)
                    transaction.add_rules(ipv6_backend, rules)

        if self.is_ipv_enabled("ipv6") and self._rfc3964_ipv4:
            rules = ipv6_backend.build_rfc3964_ipv4_rules()
            transaction.add_rules(ipv6_backend, rules)

        if use_transaction is None:
            transaction.execute(True)

    # flush and policy

    def flush(self, use_transaction=None):
        if use_transaction is None:
            transaction = FirewallTransaction(self)
        else:
            transaction = use_transaction

        log.debug1("Flushing rule set")

        for backend in self.all_backends():
            rules = backend.build_flush_rules()
            transaction.add_rules(backend, rules)

        if use_transaction is None:
            transaction.execute(True)

    def set_policy(self, policy, use_transaction=None):
        if use_transaction is None:
            transaction = FirewallTransaction(self)
        else:
            transaction = use_transaction

        log.debug1("Setting policy to '%s'", policy)

        for backend in self.enabled_backends():
            rules = backend.build_set_policy_rules(policy)
            transaction.add_rules(backend, rules)

        if use_transaction is None:
            transaction.execute(True)

    # rule function used in handle_ functions

    def rule(self, backend_name, rule):
        if not rule:
            return ""

        backend = self.get_backend_by_name(backend_name)
        if not backend:
            raise FirewallError(errors.INVALID_IPV,
                                "'%s' is not a valid backend" % backend_name)

        if not self.is_backend_enabled(backend_name):
            return ""

        return backend.set_rule(rule, self._log_denied)

    def rules(self, backend_name, rules):
        _rules = list(filter(None, rules))

        backend = self.get_backend_by_name(backend_name)
        if not backend:
            raise FirewallError(errors.INVALID_IPV,
                                "'%s' is not a valid backend" % backend_name)

        if not self.is_backend_enabled(backend_name):
            return

        if self._individual_calls or \
           not backend.restore_command_exists or \
           (backend_name == "ebtables" and not self.ebtables_backend.restore_noflush_option):
            for i,rule in enumerate(_rules):
                try:
                    backend.set_rule(rule, self._log_denied)
                except Exception as msg:
                    log.debug1(traceback.format_exc())
                    log.error(msg)
                    for rule in reversed(_rules[:i]):
                        try:
                            backend.set_rule(backend.reverse_rule(rule), self._log_denied)
                        except Exception:
                            # ignore errors here
                            pass
                    raise msg
        else:
            backend.set_rules(_rules, self._log_denied)

    # check functions

    def check_panic(self):
        if self._panic:
            raise FirewallError(errors.PANIC_MODE)

    def check_policy(self, policy):
        _policy = policy
        if _policy not in self.policy.get_policies():
            raise FirewallError(errors.INVALID_POLICY, _policy)
        return _policy

    def check_zone(self, zone):
        _zone = zone
        if not _zone or _zone == "":
            _zone = self.get_default_zone()
        if _zone not in self.zone.get_zones():
            raise FirewallError(errors.INVALID_ZONE, _zone)
        return _zone

    def check_interface(self, interface):
        if not functions.checkInterface(interface):
            raise FirewallError(errors.INVALID_INTERFACE, interface)

    def check_service(self, service):
        self.service.check_service(service)

    def check_port(self, port):
        if not functions.check_port(port):
            raise FirewallError(errors.INVALID_PORT, port)

    def check_tcpudp(self, protocol):
        if not protocol:
            raise FirewallError(errors.MISSING_PROTOCOL)
        if protocol not in [ "tcp", "udp", "sctp", "dccp" ]:
            raise FirewallError(errors.INVALID_PROTOCOL,
                                "'%s' not in {'tcp'|'udp'|'sctp'|'dccp'}" % \
                                protocol)

    def check_ip(self, ip):
        if not functions.checkIP(ip):
            raise FirewallError(errors.INVALID_ADDR, ip)

    def check_address(self, ipv, source):
        if ipv == "ipv4":
            if not functions.checkIPnMask(source):
                raise FirewallError(errors.INVALID_ADDR, source)
        elif ipv == "ipv6":
            if not functions.checkIP6nMask(source):
                raise FirewallError(errors.INVALID_ADDR, source)
        else:
            raise FirewallError(errors.INVALID_IPV,
                                "'%s' not in {'ipv4'|'ipv6'}")

    def check_icmptype(self, icmp):
        self.icmptype.check_icmptype(icmp)

    def check_timeout(self, timeout):
        if not isinstance(timeout, int):
            raise TypeError("%s is %s, expected int" % (timeout, type(timeout)))
        if int(timeout) < 0:
            raise FirewallError(errors.INVALID_VALUE,
                                "timeout '%d' is not positive number" % timeout)

    # RELOAD

    def reload(self, stop=False):
        _panic = self._panic

        # must stash this. The value may change after _start()
        flush_all = self._flush_all_on_reload

        if not flush_all:
            # save zone interfaces
            _zone_interfaces = { }
            for zone in self.zone.get_zones():
                _zone_interfaces[zone] = self.zone.get_settings(zone)["interfaces"]
            # save direct config
            _direct_config = self.direct.get_runtime_config()
            _old_dz = self.get_default_zone()

        _ipset_objs = []
        for _name in self.ipset.get_ipsets():
            _ipset_objs.append(self.ipset.get_ipset(_name))

        if not _panic:
            self.set_policy("DROP")

        # stop
        self.cleanup()

        start_exception = None
        try:
            self._start(reload=True, complete_reload=stop)
        except Exception as e:
            # save the exception for later, but continue restoring interfaces,
            # etc. We'll re-raise it at the end.
            start_exception = e

        # destroy ipsets no longer in the permanent configuration
        if flush_all:
            for obj in _ipset_objs:
                if not self.ipset.query_ipset(obj.name):
                    for backend in self.ipset.backends():
                        # nftables sets are part of the normal firewall ruleset.
                        if backend.name == "nftables":
                            continue
                        backend.set_destroy(obj.name)

        if not flush_all:
            # handle interfaces in the default zone and move them to the new
            # default zone if it changed
            _new_dz = self.get_default_zone()
            if _new_dz != _old_dz:
                # if _new_dz has been introduced with the reload, we need to add it
                # https://github.com/firewalld/firewalld/issues/53
                if _new_dz not in _zone_interfaces:
                    _zone_interfaces[_new_dz] = { }
                # default zone changed. Move interfaces from old default zone to
                # the new one.
                for iface, settings in list(_zone_interfaces[_old_dz].items()):
                    if settings["__default__"]:
                        # move only those that were added to default zone
                        # (not those that were added to specific zone same as
                        # default)
                        _zone_interfaces[_new_dz][iface] = \
                            _zone_interfaces[_old_dz][iface]
                        del _zone_interfaces[_old_dz][iface]

            # add interfaces to zones again
            for zone in self.zone.get_zones():
                if zone in _zone_interfaces:

                    for interface_id in _zone_interfaces[zone]:
                        self.zone.change_zone_of_interface(zone, interface_id,
                                                           _zone_interfaces[zone][interface_id]["sender"])

                    del _zone_interfaces[zone]
                else:
                    log.info1("New zone '%s'.", zone)
            if len(_zone_interfaces) > 0:
                for zone in list(_zone_interfaces.keys()):
                    log.info1("Lost zone '%s', zone interfaces dropped.", zone)
                    del _zone_interfaces[zone]
            del _zone_interfaces

            # restore runtime-only ipsets
            for obj in _ipset_objs:
                if self.ipset.query_ipset(obj.name):
                    for entry in obj.entries:
                        try:
                            self.ipset.add_entry(obj.name, entry)
                        except FirewallError as msg:
                            if msg.code != errors.ALREADY_ENABLED:
                                raise msg
                else:
                    self.ipset.add_ipset(obj)
                    self.ipset.apply_ipset(obj.name)

            # restore direct config
            self.direct.set_config(_direct_config)

        # Restore permanent interfaces from NetworkManager
        nm_bus_name = nm_get_bus_name()
        if nm_bus_name:
            for zone in self.zone.get_zones() + [""]:
                for interface in nm_get_interfaces_in_zone(zone):
                    self.zone.change_zone_of_interface(zone, interface, sender=nm_bus_name)

        self._panic = _panic
        if not self._panic:
            self.set_policy("ACCEPT")

        if start_exception:
            self._state = "FAILED"
            raise start_exception
        else:
            self._state = "RUNNING"

    # STATE

    def get_state(self):
        return self._state

    # PANIC MODE

    def enable_panic_mode(self):
        if self._panic:
            raise FirewallError(errors.ALREADY_ENABLED,
                                "panic mode already enabled")

        try:
            self.set_policy("PANIC")
        except Exception as msg:
            raise FirewallError(errors.COMMAND_FAILED, msg)
        self._panic = True

    def disable_panic_mode(self):
        if not self._panic:
            raise FirewallError(errors.NOT_ENABLED,
                                "panic mode is not enabled")

        try:
            self.set_policy("ACCEPT")
        except Exception as msg:
            raise FirewallError(errors.COMMAND_FAILED, msg)
        self._panic = False

    def query_panic_mode(self):
        return self._panic

    # LOG DENIED

    def get_log_denied(self):
        return self._log_denied

    def set_log_denied(self, value):
        if value not in config.LOG_DENIED_VALUES:
            raise FirewallError(errors.INVALID_VALUE,
                                "'%s', choose from '%s'" % \
                                (value, "','".join(config.LOG_DENIED_VALUES)))

        if value != self.get_log_denied():
            self._log_denied = value
            self._firewalld_conf.set("LogDenied", value)
            self._firewalld_conf.write()
        else:
            raise FirewallError(errors.ALREADY_SET, value)

    # DEFAULT ZONE

    def get_default_zone(self):
        return self._default_zone

    def set_default_zone(self, zone):
        _zone = self.check_zone(zone)
        if _zone != self._default_zone:
            _old_dz = self._default_zone
            self._default_zone = _zone
            self._firewalld_conf.set("DefaultZone", _zone)
            self._firewalld_conf.write()

            # remove old default zone from ZONES and add new default zone
            self.zone.change_default_zone(_old_dz, _zone)

            # Move interfaces from old default zone to the new one.
            _old_dz_settings = self.zone.get_settings(_old_dz)
            for iface, settings in list(_old_dz_settings["interfaces"].items()):
                if settings["__default__"]:
                    # move only those that were added to default zone
                    # (not those that were added to specific zone same as default)
                    self.zone.change_zone_of_interface("", iface)
        else:
            raise FirewallError(errors.ZONE_ALREADY_SET, _zone)

    def combine_runtime_with_permanent_settings(self, permanent, runtime):
        combined = permanent.copy()

        for key,value in runtime.items():
            # omit empty entries
            if value or isinstance(value, bool):
                combined[key] = value
            # make sure to remove values that were in permanent, but no
            # longer in runtime.
            elif key in combined:
                del combined[key]

        return combined
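
    # Illustrative sketch (not part of upstream firewalld): how runtime values
    # override permanent ones.  Non-empty runtime values (and booleans) win;
    # an empty runtime value drops a key that was present in permanent.
    #
    #   permanent = {"services": ["ssh"], "ports": [("443", "tcp")]}
    #   runtime   = {"services": ["ssh", "http"], "ports": [], "masquerade": False}
    #   self.combine_runtime_with_permanent_settings(permanent, runtime)
    #   # -> {"services": ["ssh", "http"], "masquerade": False}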

    def get_added_and_removed_settings(self, old_settings, new_settings):
        add_settings = {}
        remove_settings = {}
        for key in (set(old_settings.keys()) | set(new_settings.keys())):
            if key in new_settings:
                if isinstance(new_settings[key], list):
                    old = set(old_settings[key] if key in old_settings else [])
                    add_settings[key] = list(set(new_settings[key]) - old)
                    remove_settings[key] = list((old ^ set(new_settings[key])) & old)
                # check for bool or int because dbus.Boolean is a subclass of
                # int (because bool can't be subclassed).
                elif isinstance(new_settings[key], bool) or isinstance(new_settings[key], int):
                    if not old_settings[key] and new_settings[key]:
                        add_settings[key] = True
                    elif old_settings[key] and not new_settings[key]:
                        remove_settings[key] = False
                else:
                    raise FirewallError(errors.INVALID_SETTING, "Unhandled setting type {} key {}".format(type(new_settings[key]), key))

        return (add_settings, remove_settings)
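
    # Illustrative sketch (not part of upstream firewalld): diffing two settings
    # dicts.  List values are diffed element-wise; boolean values report the
    # direction of the change.
    #
    #   old = {"services": ["ssh"], "masquerade": True}
    #   new = {"services": ["ssh", "http"], "masquerade": False}
    #   self.get_added_and_removed_settings(old, new)
    #   # -> ({"services": ["http"]}, {"services": [], "masquerade": False})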
site-packages/firewall/core/fw_nm.py
# -*- coding: utf-8 -*-
#
# Copyright (C) 2010-2016 Red Hat, Inc.
#
# Authors:
# Thomas Woerner <twoerner@redhat.com>
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program.  If not, see <http://www.gnu.org/licenses/>.
#

"""Functions for NetworkManager interaction"""

__all__ = [ "check_nm_imported", "nm_is_imported",
            "nm_get_zone_of_connection", "nm_set_zone_of_connection",
            "nm_get_connections", "nm_get_connection_of_interface",
            "nm_get_bus_name", "nm_get_dbus_interface" ]

import gi
from gi.repository import GLib
try:
    gi.require_version('NM', '1.0')
except ValueError:
    _nm_imported = False
else:
    try:
        from gi.repository import NM
        _nm_imported = True
    except (ImportError, ValueError, GLib.Error):
        _nm_imported = False
_nm_client = None

from firewall import errors
from firewall.errors import FirewallError
from firewall.core.logger import log
import dbus

def check_nm_imported():
    """Check function to raise a MISSING_IMPORT error if the import of NM failed
    """
    if not _nm_imported:
        raise FirewallError(errors.MISSING_IMPORT, "gi.repository.NM = 1.0")

def nm_is_imported():
    """Returns true if NM has been properly imported
    @return True if import was successful, False otherwise
    """
    return _nm_imported

def nm_get_client():
    """Returns the NM client object or None if the import of NM failed
    @return NM.Client instance if import was successful, None otherwise
    """
    global _nm_client
    if not _nm_client:
        _nm_client = NM.Client.new(None)
    return _nm_client

def nm_get_zone_of_connection(connection):
    """Get zone of connection from NM
    @param connection name
    @return zone string setting of connection, empty string if not set, None if connection is unknown
    """
    check_nm_imported()

    con = nm_get_client().get_connection_by_uuid(connection)
    if con is None:
        return None

    setting_con = con.get_setting_connection()
    if setting_con is None:
        return None

    try:
        if con.get_flags() & (NM.SettingsConnectionFlags.NM_GENERATED
                              | NM.SettingsConnectionFlags.NM_VOLATILE):
            return ""
    except AttributeError:
        # Prior to NetworkManager 1.12, we can only guess
        # that a connection was generated/volatile.
        if con.get_unsaved():
            return ""

    zone = setting_con.get_zone()
    if zone is None:
        zone = ""
    return zone

def nm_set_zone_of_connection(zone, connection):
    """Set the zone for a connection
    @param zone name
    @param connection name
    @return True if zone was set, else False
    """
    check_nm_imported()

    con = nm_get_client().get_connection_by_uuid(connection)
    if con is None:
        return False

    setting_con = con.get_setting_connection()
    if setting_con is None:
        return False

    if zone == "":
        zone = None
    setting_con.set_property("zone", zone)
    return con.commit_changes(True, None)
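
# Illustrative usage sketch (assumes NetworkManager with GObject introspection
# is available; the interface and zone names are placeholders):
#
#   uuid = nm_get_connection_of_interface("eth0")
#   if uuid is not None:
#       zone = nm_get_zone_of_connection(uuid)   # "" means "use default zone"
#       if zone != "dmz":
#           nm_set_zone_of_connection("dmz", uuid)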

def nm_get_connections(connections, connections_name):
    """Get active connections from NM
    @param connections return dict
    @param connections_name return dict
    """

    connections.clear()
    connections_name.clear()

    check_nm_imported()

    active_connections = nm_get_client().get_active_connections()

    for active_con in active_connections:
        # ignore vpn devices for now
        if active_con.get_vpn():
            continue

        name = active_con.get_id()
        uuid = active_con.get_uuid()
        devices = active_con.get_devices()

        connections_name[uuid] = name
        for dev in devices:
            ip_iface = dev.get_ip_iface()
            if ip_iface:
                connections[ip_iface] = uuid

def nm_get_interfaces():
    """Get active interfaces from NM
    @returns list of interface names
    """

    check_nm_imported()

    active_interfaces = []

    for active_con in nm_get_client().get_active_connections():
        # ignore vpn devices for now
        if active_con.get_vpn():
            continue

        try:
            con = active_con.get_connection()
            if con.get_flags() & (NM.SettingsConnectionFlags.NM_GENERATED
                                  | NM.SettingsConnectionFlags.NM_VOLATILE):
                continue
        except AttributeError:
            # Prior to NetworkManager 1.12, we can only guess
            # that a connection was generated/volatile.
            if con.get_unsaved():
                continue

        for dev in active_con.get_devices():
            ip_iface = dev.get_ip_iface()
            if ip_iface:
                active_interfaces.append(ip_iface)

    return active_interfaces

def nm_get_interfaces_in_zone(zone):
    interfaces = []
    for interface in nm_get_interfaces():
        conn = nm_get_connection_of_interface(interface)
        if zone == nm_get_zone_of_connection(conn):
            interfaces.append(interface)

    return interfaces

def nm_get_device_by_ip_iface(interface):
    """Get device from NM which has the given IP interface
    @param interface name
    @returns NM.Device instance or None
    """
    check_nm_imported()

    for device in nm_get_client().get_devices():
        ip_iface = device.get_ip_iface()
        if ip_iface is None:
            continue
        if ip_iface == interface:
            return device

    return None

def nm_get_connection_of_interface(interface):
    """Get connection from NM that is using the interface
    @param interface name
    @returns connection that is using interface or None
    """
    check_nm_imported()

    device = nm_get_device_by_ip_iface(interface)
    if device is None:
        return None

    active_con = device.get_active_connection()
    if active_con is None:
        return None

    try:
        con = active_con.get_connection()
        if con.get_flags() & NM.SettingsConnectionFlags.NM_GENERATED:
            return None
    except AttributeError:
        # Prior to NetworkManager 1.12, we can only guess
        # that a connection was generated.
        if con.get_unsaved():
            return None

    return active_con.get_uuid()

def nm_get_bus_name():
    if not _nm_imported:
        return None
    try:
        bus = dbus.SystemBus()
        obj = bus.get_object(NM.DBUS_INTERFACE, NM.DBUS_PATH)
        name = obj.bus_name
        del obj, bus
        return name
    except Exception:
        log.debug2("Failed to get bus name of NetworkManager")
    return None

def nm_get_dbus_interface():
    if not _nm_imported:
        return ""
    return NM.DBUS_INTERFACE
site-packages/firewall/core/watcher.py
# -*- coding: utf-8 -*-
#
# Copyright (C) 2012-2016 Red Hat, Inc.
#
# Authors:
# Thomas Woerner <twoerner@redhat.com>
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program.  If not, see <http://www.gnu.org/licenses/>.
#

__all__ = [ "Watcher" ]

from gi.repository import Gio, GLib

class Watcher(object):
    def __init__(self, callback, timeout):
        self._callback = callback
        self._timeout = timeout
        self._monitors = { }
        self._timeouts = { }
        self._blocked = [ ]

    def add_watch_dir(self, directory):
        gfile = Gio.File.new_for_path(directory)
        self._monitors[directory] = gfile.monitor_directory(\
            Gio.FileMonitorFlags.NONE, None)
        self._monitors[directory].connect("changed", self._file_changed_cb)

    def add_watch_file(self, filename):
        gfile = Gio.File.new_for_path(filename)
        self._monitors[filename] = gfile.monitor_file(\
            Gio.FileMonitorFlags.NONE, None)
        self._monitors[filename].connect("changed", self._file_changed_cb)

    def get_watches(self):
        return self._monitors.keys()
        
    def has_watch(self, filename):
        return filename in self._monitors

    def remove_watch(self, filename):
        del self._monitors[filename]

    def block_source(self, filename):
        if filename not in self._blocked:
            self._blocked.append(filename)

    def unblock_source(self, filename):
        if filename in self._blocked:
            self._blocked.remove(filename)

    def clear_timeouts(self):
        for filename in list(self._timeouts.keys()):
            GLib.source_remove(self._timeouts[filename])
            del self._timeouts[filename]

    def _call_callback(self, filename):
        if filename not in self._blocked:
            self._callback(filename)
        del self._timeouts[filename]

    def _file_changed_cb(self, monitor, gio_file, gio_other_file, event):
        filename = gio_file.get_parse_name()
        if filename in self._blocked:
            if filename in self._timeouts:
                GLib.source_remove(self._timeouts[filename])
                del self._timeouts[filename]
            return

        if event == Gio.FileMonitorEvent.CHANGED or \
                event == Gio.FileMonitorEvent.CREATED or \
                event == Gio.FileMonitorEvent.DELETED or \
                event == Gio.FileMonitorEvent.ATTRIBUTE_CHANGED:
            if filename in self._timeouts:
                GLib.source_remove(self._timeouts[filename])
                del self._timeouts[filename]
            self._timeouts[filename] = GLib.timeout_add_seconds(\
                self._timeout, self._call_callback, filename)
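
# Illustrative usage sketch (not part of upstream firewalld): watching a
# configuration directory with a 5 second debounce.  Requires a running GLib
# main loop; the callback and path below are placeholders.
#
#   def on_change(filename):
#       print("configuration file changed:", filename)
#
#   watcher = Watcher(on_change, 5)
#   watcher.add_watch_dir("/etc/firewalld/zones")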
site-packages/firewall/core/fw_policies.py
# -*- coding: utf-8 -*-
#
# Copyright (C) 2011-2016 Red Hat, Inc.
#
# Authors:
# Thomas Woerner <twoerner@redhat.com>
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program.  If not, see <http://www.gnu.org/licenses/>.
#

__all__ = [ "FirewallPolicies" ]

from firewall import config
from firewall.core.logger import log
from firewall.core.io.lockdown_whitelist import LockdownWhitelist
from firewall import errors
from firewall.errors import FirewallError

class FirewallPolicies(object):
    def __init__(self):
        self._lockdown = False
        self.lockdown_whitelist = LockdownWhitelist(config.LOCKDOWN_WHITELIST)

    def __repr__(self):
        return '%s(%r, %r)' % (self.__class__, self._lockdown,
                                           self.lockdown_whitelist)

    def cleanup(self):
        self._lockdown = False
        self.lockdown_whitelist.cleanup()

    # lockdown

    def access_check(self, key, value):
        if key == "context":
            log.debug2('Doing access check for context "%s"' % value)
            if self.lockdown_whitelist.match_context(value):
                log.debug3('context matches.')
                return True
        elif key == "uid":
            log.debug2('Doing access check for uid %d' % value)
            if self.lockdown_whitelist.match_uid(value):
                log.debug3('uid matches.')
                return True
        elif key == "user":
            log.debug2('Doing access check for user "%s"' % value)
            if self.lockdown_whitelist.match_user(value):
                log.debug3('user matches.')
                return True
        elif key == "command":
            log.debug2('Doing access check for command "%s"' % value)
            if self.lockdown_whitelist.match_command(value):
                log.debug3('command matches.')
                return True
        return False

    def enable_lockdown(self):
        if self._lockdown:
            raise FirewallError(errors.ALREADY_ENABLED, "enable_lockdown()")
        self._lockdown = True

    def disable_lockdown(self):
        if not self._lockdown:
            raise FirewallError(errors.NOT_ENABLED, "disable_lockdown()")
        self._lockdown = False

    def query_lockdown(self):
        return self._lockdown

site-packages/firewall/core/fw_service.py
# -*- coding: utf-8 -*-
#
# Copyright (C) 2011-2016 Red Hat, Inc.
#
# Authors:
# Thomas Woerner <twoerner@redhat.com>
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program.  If not, see <http://www.gnu.org/licenses/>.
#

__all__ = [ "FirewallService" ]

from firewall import errors
from firewall.errors import FirewallError

class FirewallService(object):
    def __init__(self, fw):
        self._fw = fw
        self._services = { }

    def __repr__(self):
        return '%s(%r)' % (self.__class__, self._services)

    def cleanup(self):
        self._services.clear()

    # zones

    def get_services(self):
        return sorted(self._services.keys())

    def check_service(self, service):
        if service not in self._services:
            raise FirewallError(errors.INVALID_SERVICE, service)

    def get_service(self, service):
        self.check_service(service)
        return self._services[service]

    def add_service(self, obj):
        self._services[obj.name] = obj

    def remove_service(self, service):
        self.check_service(service)
        del self._services[service]
site-packages/firewall/core/fw_transaction.py
# -*- coding: utf-8 -*-
#
# Copyright (C) 2016 Red Hat, Inc.
#
# Authors:
# Thomas Woerner <twoerner@redhat.com>
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program.  If not, see <http://www.gnu.org/licenses/>.
#

"""Transaction classes for firewalld"""

__all__ = [ "FirewallTransaction" ]

import traceback

from firewall.core.logger import log
from firewall import errors
from firewall.errors import FirewallError

class FirewallTransaction(object):
    def __init__(self, fw):
        self.fw = fw
        self.rules = { } # [ ( backend.name, [ rule,.. ] ),.. ]
        self.pre_funcs = [ ] # [ (func, args),.. ]
        self.post_funcs = [ ] # [ (func, args),.. ]
        self.fail_funcs = [ ] # [ (func, args),.. ]
        self.modules = [ ] # [ module,.. ]

    def clear(self):
        self.rules.clear()
        del self.pre_funcs[:]
        del self.post_funcs[:]
        del self.fail_funcs[:]

    def add_rule(self, backend, rule):
        self.rules.setdefault(backend.name, [ ]).append(rule)

    def add_rules(self, backend, rules):
        for rule in rules:
            self.add_rule(backend, rule)

    def query_rule(self, backend, rule):
        return backend.name in self.rules and rule in self.rules[backend.name]

    def remove_rule(self, backend, rule):
        if backend.name in self.rules and rule in self.rules[backend.name]:
            self.rules[backend.name].remove(rule)

    def add_pre(self, func, *args):
        self.pre_funcs.append((func, args))

    def add_post(self, func, *args):
        self.post_funcs.append((func, args))

    def add_fail(self, func, *args):
        self.fail_funcs.append((func, args))

    def add_module(self, module):
        if module not in self.modules:
            self.modules.append(module)

    def remove_module(self, module):
        if module in self.modules:
            self.modules.remove(module)

    def add_modules(self, modules):
        for module in modules:
            self.add_module(module)

    def remove_modules(self, modules):
        for module in modules:
            self.remove_module(module)

    def prepare(self, enable):
        log.debug4("%s.prepare(%s, %s)" % (type(self), enable, "..."))

        rules = { }
        if not enable:
            # reverse rule order for cleanup
            for backend_name in self.rules:
                for rule in reversed(self.rules[backend_name]):
                    rules.setdefault(backend_name, [ ]).append(
                        self.fw.get_backend_by_name(backend_name).reverse_rule(rule))
        else:
            for backend_name in self.rules:
                rules.setdefault(backend_name, [ ]).extend(self.rules[backend_name])

        return rules, self.modules

    def execute(self, enable):
        log.debug4("%s.execute(%s)" % (type(self), enable))

        rules, modules = self.prepare(enable)

        # pre
        self.pre()

        # stage 1: apply rules
        error = False
        errorMsg = ""
        done = [ ]
        for backend_name in rules:
            try:
                self.fw.rules(backend_name, rules[backend_name])
            except Exception as msg:
                error = True
                errorMsg = msg
                log.debug1(traceback.format_exc())
                log.error(msg)
            else:
                done.append(backend_name)

        # stage 2: load modules
        if not error:
            module_return = self.fw.handle_modules(modules, enable)
            if module_return:
                # Debug log about issues loading modules, but don't error. The
                # modules may be builtin or CONFIG_MODULES=n, in which case
                # modprobe will fail. Or we may be running inside a container
                # that doesn't have sufficient privileges. Unfortunately there
                # is no way for us to know.
                (status, msg) = module_return
                if status:
                    log.debug1(msg)

        # error case: revert rules
        if error:
            undo_rules = { }
            for backend_name in done:
                undo_rules[backend_name] = [ ]
                for rule in reversed(rules[backend_name]):
                    undo_rules[backend_name].append(
                        self.fw.get_backend_by_name(backend_name).reverse_rule(rule))
            for backend_name in undo_rules:
                try:
                    self.fw.rules(backend_name, undo_rules[backend_name])
                except Exception as msg:
                    log.debug1(traceback.format_exc())
                    log.error(msg)
            # call failure functions
            for (func, args) in self.fail_funcs:
                try:
                    func(*args)
                except Exception as msg:
                    log.debug1(traceback.format_exc())
                    log.error("Calling fail func %s(%s) failed: %s" % \
                              (func, args, msg))

            raise FirewallError(errors.COMMAND_FAILED, errorMsg)

        # post
        self.post()

    def pre(self):
        log.debug4("%s.pre()" % type(self))

        for (func, args) in self.pre_funcs:
            try:
                func(*args)
            except Exception as msg:
                log.debug1(traceback.format_exc())
                log.error("Calling pre func %s(%s) failed: %s" % \
                          (func, args, msg))

    def post(self):
        log.debug4("%s.post()" % type(self))

        for (func, args) in self.post_funcs:
            try:
                func(*args)
            except Exception as msg:
                log.debug1(traceback.format_exc())
                log.error("Calling post func %s(%s) failed: %s" % \
                          (func, args, msg))
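
# Illustrative usage sketch (not part of upstream firewalld): queueing rules
# and executing them as one transaction.  Here `fw` stands for the Firewall
# core object and `backend` for one of its rule backends (e.g. an ipXtables
# instance); both are assumptions made for this example.
#
#   tx = FirewallTransaction(fw)
#   tx.add_rule(backend, ["-t", "filter", "-I", "INPUT", "1", "-j", "ACCEPT"])
#   tx.add_fail(log.error, "rules could not be applied and were reverted")
#   tx.execute(True)   # enable=True applies the rules; on error they are
#                      # reversed and the fail functions are called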
site-packages/firewall/core/ebtables.py
# -*- coding: utf-8 -*-
#
# Copyright (C) 2010-2016 Red Hat, Inc.
#
# Authors:
# Thomas Woerner <twoerner@redhat.com>
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program.  If not, see <http://www.gnu.org/licenses/>.
#

__all__ = [ "ebtables" ]

import os.path
from firewall.core.prog import runProg
from firewall.core.logger import log
from firewall.functions import tempFile, readfile, splitArgs
from firewall.config import COMMANDS
from firewall.core import ipXtables # some common stuff lives there
from firewall.errors import FirewallError, INVALID_IPV
import string

BUILT_IN_CHAINS = {
    "broute": [ "BROUTING" ],
    "nat": [ "PREROUTING", "POSTROUTING", "OUTPUT" ],
    "filter": [ "INPUT", "OUTPUT", "FORWARD" ],
}

DEFAULT_RULES = { }
LOG_RULES = { }
OUR_CHAINS = {}  # chains created by firewalld

for table in BUILT_IN_CHAINS.keys():
    DEFAULT_RULES[table] = [ ]
    OUR_CHAINS[table] = set()
    for chain in BUILT_IN_CHAINS[table]:
        DEFAULT_RULES[table].append("-N %s_direct" % chain)
        DEFAULT_RULES[table].append("-I %s 1 -j %s_direct" % (chain, chain))
        DEFAULT_RULES[table].append("-I %s_direct 1 -j RETURN" % chain)
        OUR_CHAINS[table].add("%s_direct" % chain)

class ebtables(object):
    ipv = "eb"
    name = "ebtables"
    policies_supported = False # ebtables only supported with direct interface

    def __init__(self):
        self._command = COMMANDS[self.ipv]
        self._restore_command = COMMANDS["%s-restore" % self.ipv]
        self.restore_noflush_option = self._detect_restore_noflush_option()
        self.concurrent_option = self._detect_concurrent_option()
        self.fill_exists()
        self.available_tables = []

    def fill_exists(self):
        self.command_exists = os.path.exists(self._command)
        self.restore_command_exists = os.path.exists(self._restore_command)

    def _detect_concurrent_option(self):
        # Do not change any rules, just try to use the --concurrent option
        # with -L
        concurrent_option = ""
        ret = runProg(self._command, ["--concurrent", "-L"])
        if ret[0] == 0:
            concurrent_option = "--concurrent"  # concurrent for ebtables lock

        return concurrent_option

    def _detect_restore_noflush_option(self):
        # Do not change any rules, just try to use the restore command
        # with --noflush
        rules = [ ]
        try:
            self.set_rules(rules, "off")
        except ValueError:
            return False
        return True

    def __run(self, args):
        # convert to string list
        _args = [ ]
        if self.concurrent_option and self.concurrent_option not in args:
            _args.append(self.concurrent_option)
        _args += ["%s" % item for item in args]
        log.debug2("%s: %s %s", self.__class__, self._command, " ".join(_args))
        (status, ret) = runProg(self._command, _args)
        if status != 0:
            raise ValueError("'%s %s' failed: %s" % (self._command,
                                                     " ".join(args), ret))
        return ret

    def _rule_validate(self, rule):
        for str in ["%%REJECT%%", "%%ICMP%%", "%%LOGTYPE%%"]:
            if str in rule:
                raise FirewallError(INVALID_IPV,
                        "'%s' invalid for ebtables" % str)

    def is_chain_builtin(self, ipv, table, chain):
        return table in BUILT_IN_CHAINS and \
               chain in BUILT_IN_CHAINS[table]

    def build_chain_rules(self, add, table, chain):
        rules = []

        if add:
            rules.append([ "-t", table, "-N", chain ])
            rules.append([ "-t", table, "-I", chain, "1", "-j", "RETURN" ])
        else:
            rules.append([ "-t", table, "-X", chain ])

        return rules

    def build_rule(self, add, table, chain, index, args):
        rule = [ "-t", table ]
        if add:
            rule += [ "-I", chain, str(index) ]
        else:
            rule += [ "-D", chain ]
        rule += args
        return rule

    def reverse_rule(self, args):
        return ipXtables.common_reverse_rule(args)

    def check_passthrough(self, args):
        ipXtables.common_check_passthrough(args)

    def reverse_passthrough(self, args):
        return ipXtables.common_reverse_passthrough(args)

    def set_rules(self, rules, log_denied):
        temp_file = tempFile()

        table = "filter"
        table_rules = { }
        for _rule in rules:
            rule = _rule[:]

            self._rule_validate(rule)

            # get table from rule
            for opt in [ "-t", "--table" ]:
                try:
                    i = rule.index(opt)
                except ValueError:
                    pass
                else:
                    if len(rule) >= i+1:
                        rule.pop(i)
                        table = rule.pop(i)

            # we can not use joinArgs here, because it would use "'" instead
            # of '"' for the start and end of the string, this breaks
            # iptables-restore
            for i in range(len(rule)):
                for c in string.whitespace:
                    if c in rule[i] and not (rule[i].startswith('"') and
                                             rule[i].endswith('"')):
                        rule[i] = '"%s"' % rule[i]

            table_rules.setdefault(table, []).append(rule)

        for table in table_rules:
            temp_file.write("*%s\n" % table)
            for rule in table_rules[table]:
                temp_file.write(" ".join(rule) + "\n")

        temp_file.close()

        stat = os.stat(temp_file.name)
        log.debug2("%s: %s %s", self.__class__, self._restore_command,
                   "%s: %d" % (temp_file.name, stat.st_size))
        args = [ ]
        args.append("--noflush")

        (status, ret) = runProg(self._restore_command, args,
                                stdin=temp_file.name)

        if log.getDebugLogLevel() > 2:
            lines = readfile(temp_file.name)
            if lines is not None:
                i = 1
                for line in lines:
                    log.debug3("%8d: %s" % (i, line), nofmt=1, nl=0)
                    if not line.endswith("\n"):
                        log.debug3("", nofmt=1)
                    i += 1

        os.unlink(temp_file.name)

        if status != 0:
            raise ValueError("'%s %s' failed: %s" % (self._restore_command,
                                                     " ".join(args), ret))

    def set_rule(self, rule, log_denied):
        self._rule_validate(rule)
        return self.__run(rule)

    def get_available_tables(self, table=None):
        ret = []
        tables = [ table ] if table else BUILT_IN_CHAINS.keys()
        for table in tables:
            if table in self.available_tables:
                ret.append(table)
            else:
                try:
                    self.__run(["-t", table, "-L"])
                    self.available_tables.append(table)
                    ret.append(table)
                except ValueError:
                    log.debug1("ebtables table '%s' does not exist." % table)

        return ret

    def get_zone_table_chains(self, table):
        return {}

    def build_flush_rules(self):
        rules = []
        for table in BUILT_IN_CHAINS.keys():
            if table not in self.get_available_tables():
                continue
            # Flush firewall rules: -F
            # Delete firewall chains: -X
            # Set counter to zero: -Z
            for flag in [ "-F", "-X", "-Z" ]:
                rules.append(["-t", table, flag])
        return rules

    def build_set_policy_rules(self, policy):
        rules = []
        _policy = "DROP" if policy == "PANIC" else policy
        for table in BUILT_IN_CHAINS.keys():
            if table not in self.get_available_tables():
                continue
            for chain in BUILT_IN_CHAINS[table]:
                rules.append(["-t", table, "-P", chain, _policy])
        return rules
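
    # Illustrative sketch of the generated arguments (assuming only the
    # "filter" table is available): a "PANIC" policy is translated to DROP
    # for every built-in chain.
    #
    #   ebtables().build_set_policy_rules("PANIC")
    #   # -> [["-t", "filter", "-P", "INPUT",   "DROP"],
    #   #     ["-t", "filter", "-P", "OUTPUT",  "DROP"],
    #   #     ["-t", "filter", "-P", "FORWARD", "DROP"]]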

    def build_default_tables(self):
        # nothing to do, they always exist
        return []

    def build_default_rules(self, log_denied="off"):
        default_rules = []
        for table in DEFAULT_RULES:
            if table not in self.get_available_tables():
                continue
            _default_rules = DEFAULT_RULES[table][:]
            if log_denied != "off" and table in LOG_RULES:
                _default_rules.extend(LOG_RULES[table])
            prefix = [ "-t", table ]
            for rule in _default_rules:
                if type(rule) == list:
                    default_rules.append(prefix + rule)
                else:
                    default_rules.append(prefix + splitArgs(rule))
        return default_rules

    def is_ipv_supported(self, ipv):
        return ipv == self.ipv
site-packages/firewall/functions.py
# -*- coding: utf-8 -*-
#
# Copyright (C) 2007,2008,2011,2012 Red Hat, Inc.
#
# Authors:
# Thomas Woerner <twoerner@redhat.com>
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program.  If not, see <http://www.gnu.org/licenses/>.
#

__all__ = [ "PY2", "getPortID", "getPortRange", "portStr", "getServiceName",
            "checkIP", "checkIP6", "checkIPnMask", "checkIP6nMask",
            "checkProtocol", "checkInterface", "checkUINT32",
            "firewalld_is_active", "tempFile", "readfile", "writefile",
            "enable_ip_forwarding", "check_port", "check_address",
            "check_single_address", "check_mac", "uniqify", "ppid_of_pid",
            "max_zone_name_len", "checkUser", "checkUid", "checkCommand",
            "checkContext", "joinArgs", "splitArgs",
            "b2u", "u2b", "u2b_if_py2", "max_policy_name_len",
            "stripNonPrintableCharacters"]

import socket
import os
import os.path
import shlex
import pipes
import string
import sys
import tempfile
from firewall.core.logger import log
from firewall.config import FIREWALLD_TEMPDIR, FIREWALLD_PIDFILE

PY2 = sys.version < '3'

NOPRINT_TRANS_TABLE = {
    # Limit to C0 and C1 code points. Building entries for all unicode code
    # points requires too much memory.
    # C0 = [0, 31]
    # C1 = [127, 159]
    #
    i: None for i in range(0, 160) if not (i > 31 and i < 127)
}

def getPortID(port):
    """ Check and Get port id from port string or port id using socket.getservbyname

    @param port port string or port id
    @return Port id if valid, -1 if port can not be found and -2 if port is too big
    """

    if isinstance(port, int):
        _id = port
    else:
        if port:
            port = port.strip()
        try:
            _id = int(port)
        except ValueError:
            try:
                _id = socket.getservbyname(port)
            except socket.error:
                return -1
    if _id > 65535:
        return -2
    return _id

def getPortRange(ports):
    """ Get port range for port range string or single port id

    @param ports an integer or port string or port range string
    @return tuple (start, end) for a valid range, (port,) for a single port, -1 if a port can not be found, -2 if a port is too big, or None if the range is ambiguous
    """

    # (port, port)  or [port, port] case
    if isinstance(ports, tuple) or isinstance(ports, list):
        return ports

    # "<port-id>" case
    if isinstance(ports, int) or ports.isdigit():
        id1 = getPortID(ports)
        if id1 >= 0:
            return (id1,)
        return id1

    splits = ports.split("-")

    # "<port-id>-<port-id>" case
    if len(splits) == 2 and splits[0].isdigit() and splits[1].isdigit():
        id1 = getPortID(splits[0])
        id2 = getPortID(splits[1])
        if id1 >= 0 and id2 >= 0:
            if id1 < id2:
                return (id1, id2)
            elif id1 > id2:
                return (id2, id1)
            else: # ids are the same
                return (id1,)

    # everything else "<port-str>[-<port-str>]"
    matched = [ ]
    for i in range(len(splits), 0, -1):
        id1 = getPortID("-".join(splits[:i]))
        port2 = "-".join(splits[i:])
        if len(port2) > 0:
            id2 = getPortID(port2)
            if id1 >= 0 and id2 >= 0:
                if id1 < id2:
                    matched.append((id1, id2))
                elif id1 > id2:
                    matched.append((id2, id1))
                else:
                    matched.append((id1, ))
        else:
            if id1 >= 0:
                matched.append((id1,))
                if i == len(splits):
                    # full match, stop here
                    break
    if len(matched) < 1:
        return -1
    elif len(matched) > 1:
        return None
    return matched[0]
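
# Illustrative examples (comments only, not upstream code) of the possible
# return values:
#
#   getPortRange("80")          # -> (80,)
#   getPortRange("1024-2048")   # -> (1024, 2048)
#   getPortRange("ssh")         # -> (22,) via socket.getservbyname()
#   getPortRange("70000")       # -> -2   (port is too big)
#   getPortRange("no-such")     # -> -1   (port can not be found)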

def portStr(port, delimiter=":"):
    """ Create port and port range string

    @param port port or port range int or [int, int]
    @param delimiter of the output string for port ranges, default ':'
    @return Port or port range string, empty string if port isn't specified, None if port or port range is not valid
    """
    if port == "":
        return ""

    _range = getPortRange(port)
    if isinstance(_range, int) and _range < 0:
        return None
    elif len(_range) == 1:
        return "%s" % _range
    else:
        return "%s%s%s" % (_range[0], delimiter, _range[1])

def portInPortRange(port, range):
    _port = getPortRange(port)
    _range = getPortRange(range)

    if len(_port) == 1:
        if len(_range) == 1:
            return getPortID(_port[0]) == getPortID(_range[0])
        if len(_range) == 2 and \
           getPortID(_port[0]) >= getPortID(_range[0]) and getPortID(_port[0]) <= getPortID(_range[1]):
            return True
    elif len(_port) == 2:
        if len(_range) == 2 and \
           getPortID(_port[0]) >= getPortID(_range[0]) and getPortID(_port[0]) <= getPortID(_range[1]) and \
           getPortID(_port[1]) >= getPortID(_range[0]) and getPortID(_port[1]) <= getPortID(_range[1]):
            return True

    return False

def coalescePortRange(new_range, ranges):
    """ Coalesce a port range with existing list of port ranges

        @param new_range tuple/list/string
        @param ranges list of tuple/list/string
        @return tuple of (list of ranges added after coalescing, list of removed original ranges)
    """

    coalesced_range = getPortRange(new_range)
    # normalize singleton ranges, e.g. (x,) --> (x,x)
    if len(coalesced_range) == 1:
        coalesced_range = (coalesced_range[0], coalesced_range[0])
    _ranges = map(getPortRange, ranges)
    _ranges = sorted(map(lambda x: (x[0],x[0]) if len(x) == 1 else x, _ranges), key=lambda x: x[0])

    removed_ranges = []
    for range in _ranges:
        if coalesced_range[0] <= range[0] and coalesced_range[1] >= range[1]:
            # new range covers this
            removed_ranges.append(range)
        elif coalesced_range[0] <= range[0] and coalesced_range[1] <  range[1] and \
                                                coalesced_range[1] >= range[0]:
            # expand beginning of range
            removed_ranges.append(range)
            coalesced_range = (coalesced_range[0], range[1])
        elif coalesced_range[0] >  range[0] and coalesced_range[1] >= range[1] and \
                                                coalesced_range[0] <= range[1]:
            # expand end of range
            removed_ranges.append(range)
            coalesced_range = (range[0], coalesced_range[1])

    # normalize singleton ranges, e.g. (x,x) --> (x,)
    removed_ranges = list(map(lambda x: (x[0],) if x[0] == x[1] else x, removed_ranges))
    if coalesced_range[0] == coalesced_range[1]:
        coalesced_range = (coalesced_range[0],)

    return ([coalesced_range], removed_ranges)
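
# Illustrative example (comments only, not upstream code): coalescing an
# overlapping range into the existing ones.
#
#   coalescePortRange("80-90", ["85-100", "10-20"])
#   # -> ([(80, 100)], [(85, 100)])
#   #    the new range absorbed 85-100; 10-20 is untouched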

def breakPortRange(remove_range, ranges):
    """ break a port range from existing list of port ranges

        @param remove_range tuple/list/string
        @param ranges list of tuple/list/string
        @return tuple of (list of ranges added after breaking up, list of removed original ranges)
    """

    remove_range = getPortRange(remove_range)
    # normalize singleton ranges, e.g. (x,) --> (x,x)
    if len(remove_range) == 1:
        remove_range = (remove_range[0], remove_range[0])
    _ranges = map(getPortRange, ranges)
    _ranges = sorted(map(lambda x: (x[0],x[0]) if len(x) == 1 else x, _ranges), key=lambda x: x[0])

    removed_ranges = []
    added_ranges = []
    for range in _ranges:
        if remove_range[0] <= range[0] and remove_range[1] >= range[1]:
            # remove entire range
            removed_ranges.append(range)
        elif remove_range[0] <= range[0] and remove_range[1] <  range[1] and \
                                             remove_range[1] >= range[0]:
            # remove from beginning of range
            removed_ranges.append(range)
            added_ranges.append((remove_range[1] + 1, range[1]))
        elif remove_range[0] >  range[0] and remove_range[1] >= range[1] and \
                                             remove_range[0] <= range[1]:
            # remove from end of range
            removed_ranges.append(range)
            added_ranges.append((range[0], remove_range[0] - 1))
        elif remove_range[0] > range[0] and remove_range[1] < range[1]:
            # remove inside range
            removed_ranges.append(range)
            added_ranges.append((range[0], remove_range[0] - 1))
            added_ranges.append((remove_range[1] + 1, range[1]))

    # normalize singleton ranges, e.g. (x,x) --> (x,)
    removed_ranges = list(map(lambda x: (x[0],) if x[0] == x[1] else x, removed_ranges))
    added_ranges = list(map(lambda x: (x[0],) if x[0] == x[1] else x, added_ranges))

    return (added_ranges, removed_ranges)

def getServiceName(port, proto):
    """ Check and Get service name from port and proto string combination using socket.getservbyport

    @param port string or id
    @param protocol string
    @return Service name if port and protocol are valid, else None
    """

    try:
        name = socket.getservbyport(int(port), proto)
    except socket.error:
        return None
    return name

def checkIP(ip):
    """ Check IPv4 address.
    
    @param ip address string
    @return True if address is valid, else False
    """

    try:
        socket.inet_pton(socket.AF_INET, ip)
    except socket.error:
        return False
    return True

def normalizeIP6(ip):
    """ Normalize the IPv6 address

    This is mostly about converting URL-like IPv6 addresses to normal ones.
    e.g. [1234::4321] --> 1234::4321
    """
    return ip.strip("[]")

def checkIP6(ip):
    """ Check IPv6 address.
    
    @param ip address string
    @return True if address is valid, else False
    """

    try:
        socket.inet_pton(socket.AF_INET6, normalizeIP6(ip))
    except socket.error:
        return False
    return True

def checkIPnMask(ip):
    if "/" in ip:
        addr = ip[:ip.index("/")]
        mask = ip[ip.index("/")+1:]
        if len(addr) < 1 or len(mask) < 1:
            return False
    else:
        addr = ip
        mask = None
    if not checkIP(addr):
        return False
    if mask:
        if "." in mask:
            return checkIP(mask)
        else:
            try:
                i = int(mask)
            except ValueError:
                return False
            if i < 0 or i > 32:
                return False
    return True

def stripNonPrintableCharacters(rule_str):
    return rule_str.translate(NOPRINT_TRANS_TABLE)
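
# Illustrative example (comments only, not upstream code): C0/C1 control
# characters are dropped, printable ASCII is kept.
#
#   stripNonPrintableCharacters("rule\x00with\x1fcontrols")
#   # -> 'rulewithcontrols'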

def checkIP6nMask(ip):
    if "/" in ip:
        addr = ip[:ip.index("/")]
        mask = ip[ip.index("/")+1:]
        if len(addr) < 1 or len(mask) < 1:
            return False
    else:
        addr = ip
        mask = None
    if not checkIP6(addr):
        return False
    if mask:
        try:
            i = int(mask)
        except ValueError:
            return False
        if i < 0 or i > 128:
            return False

    return True

def checkProtocol(protocol):
    try:
        i = int(protocol)
    except ValueError:
        # string
        try:
            socket.getprotobyname(protocol)
        except socket.error:
            return False
    else:
        if i < 0 or i > 255:
            return False

    return True

def checkInterface(iface):
    """ Check interface string

    @param interface string
    @return True if interface is valid (maximum 16 chars and does not contain ' ', '/', '!', ':', '*'), else False
    """

    if not iface or len(iface) > 16:
        return False
    for ch in [ ' ', '/', '!', '*' ]:
        # !:* are limits for iptables <= 1.4.5
        if ch in iface:
            return False
    # disabled old iptables check
    #if iface == "+":
    #    # limit for iptables <= 1.4.5
    #    return False
    return True

def checkUINT32(val):
    try:
        x = int(val, 0)
    except ValueError:
        return False
    else:
        if x >= 0 and x <= 4294967295:
            return True
    return False

def firewalld_is_active():
    """ Check if firewalld is active

    @return True if there is a firewalld pid file and the pid is used by firewalld
    """

    if not os.path.exists(FIREWALLD_PIDFILE):
        return False

    try:
        with open(FIREWALLD_PIDFILE, "r") as fd:
            pid = fd.readline()
    except Exception:
        return False

    if not os.path.exists("/proc/%s" % pid):
        return False

    try:
        with open("/proc/%s/cmdline" % pid, "r") as fd:
            cmdline = fd.readline()
    except Exception:
        return False

    if "firewalld" in cmdline:
        return True

    return False

def tempFile():
    try:
        if not os.path.exists(FIREWALLD_TEMPDIR):
            os.mkdir(FIREWALLD_TEMPDIR, 0o750)

        return tempfile.NamedTemporaryFile(mode='wt', prefix="temp.",
                                           dir=FIREWALLD_TEMPDIR, delete=False)
    except Exception as msg:
        log.error("Failed to create temporary file: %s" % msg)
        raise
    return None

def readfile(filename):
    try:
        with open(filename, "r") as f:
            return f.readlines()
    except Exception as e:
        log.error('Failed to read file "%s": %s' % (filename, e))
    return None

def writefile(filename, line):
    try:
        with open(filename, "w") as f:
            f.write(line)
    except Exception as e:
        log.error('Failed to write to file "%s": %s' % (filename, e))
        return False
    return True

def enable_ip_forwarding(ipv):
    if ipv == "ipv4":
        return writefile("/proc/sys/net/ipv4/ip_forward", "1\n")
    elif ipv == "ipv6":
        return writefile("/proc/sys/net/ipv6/conf/all/forwarding", "1\n")
    return False

def get_nf_conntrack_short_name(module):
    return module.replace("_","-").replace("nf-conntrack-", "")

def check_port(port):
    _range = getPortRange(port)
    if _range == -2 or _range == -1 or _range is None or \
            (len(_range) == 2 and _range[0] >= _range[1]):
        if _range == -2:
            log.debug2("'%s': port > 65535" % port)
        elif _range == -1:
            log.debug2("'%s': port is invalid" % port)
        elif _range is None:
            log.debug2("'%s': port is ambiguous" % port)
        elif len(_range) == 2 and _range[0] >= _range[1]:
            log.debug2("'%s': range start >= end" % port)
        return False
    return True

def check_address(ipv, source):
    if ipv == "ipv4":
        return checkIPnMask(source)
    elif ipv == "ipv6":
        return checkIP6nMask(source)
    else:
        return False

def check_single_address(ipv, source):
    if ipv == "ipv4":
        return checkIP(source)
    elif ipv == "ipv6":
        return checkIP6(source)
    else:
        return False

def check_mac(mac):
    if len(mac) == 12+5:
        # 0 1 : 3 4 : 6 7 : 9 10 : 12 13 : 15 16
        for i in (2, 5, 8, 11, 14):
            if mac[i] != ":":
                return False
        for i in (0, 1, 3, 4, 6, 7, 9, 10, 12, 13, 15, 16):
            if mac[i] not in string.hexdigits:
                return False
        return True
    return False

def uniqify(_list):
    # removes duplicates from list, whilst preserving order
    output = []
    for x in _list:
        if x not in output:
            output.append(x)
    return output

def ppid_of_pid(pid):
    """ Get parent for pid """
    try:
        f = os.popen("ps -o ppid -h -p %d 2>/dev/null" % pid)
        pid = int(f.readlines()[0].strip())
        f.close()
    except Exception:
        return None
    return pid

def max_policy_name_len():
    """
    iptables limits length of chain to (currently) 28 chars.
    The longest chain we create is POST_<policy>_allow,
    which leaves 28 - 11 = 17 chars for <policy>.
    """
    from firewall.core.ipXtables import POLICY_CHAIN_PREFIX
    from firewall.core.base import SHORTCUTS
    longest_shortcut = max(map(len, SHORTCUTS.values()))
    return 28 - (longest_shortcut + len(POLICY_CHAIN_PREFIX) + len("_allow"))

def max_zone_name_len():
    """
    Netfilter limits length of chain to (currently) 28 chars.
    The longest chain we create is FWDI_<zone>_allow,
    which leaves 28 - 11 = 17 chars for <zone>.
    """
    from firewall.core.base import SHORTCUTS
    longest_shortcut = max(map(len, SHORTCUTS.values()))
    return 28 - (longest_shortcut + len("__allow"))
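
# Worked example (an assumption about the current SHORTCUTS table): with
# "FWDI"/"FWDO" as the longest shortcuts (4 chars), the limit is
# 28 - (4 + len("__allow")) = 28 - 11 = 17 characters for a zone name.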

def checkUser(user):
    if len(user) < 1 or len(user) > os.sysconf('SC_LOGIN_NAME_MAX'):
        return False
    for c in user:
        if c not in string.ascii_letters and \
           c not in string.digits and \
           c not in [ ".", "-", "_", "$" ]:
            return False
    return True

def checkUid(uid):
    if isinstance(uid, str):
        try:
            uid = int(uid)
        except ValueError:
            return False
    if uid >= 0 and uid <= 2**31-1:
        return True
    return False

def checkCommand(command):
    if len(command) < 1 or len(command) > 1024:
        return False
    for ch in [ "|", "\n", "\0" ]:
        if ch in command:
            return False
    if command[0] != "/":
        return False
    return True

def checkContext(context):
    splits = context.split(":")
    if len(splits) not in [4, 5]:
        return False
    # user ends with _u if not root
    if splits[0] != "root" and splits[0][-2:] != "_u":
        return False
    # role ends with _r
    if splits[1][-2:] != "_r":
        return False
    # type ends with _t
    if splits[2][-2:] != "_t":
        return False
    # level might also contain :
    if len(splits[3]) < 1:
        return False
    return True
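
# Illustrative examples (comments only, not upstream code):
#
#   checkContext("system_u:system_r:firewalld_t:s0")                       # True
#   checkContext("unconfined_u:unconfined_r:unconfined_t:s0-s0:c0.c1023")  # True
#   checkContext("not-a-context")                                          # False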

def joinArgs(args):
    if "quote" in dir(shlex):
        return " ".join(shlex.quote(a) for a in args)
    else:
        return " ".join(pipes.quote(a) for a in args)

def splitArgs(_string):
    if PY2 and isinstance(_string, unicode): # noqa: F821
        # Python2's shlex doesn't like unicode
        _string = u2b(_string)
        splits = shlex.split(_string)
        return map(b2u, splits)
    else:
        return shlex.split(_string)

def b2u(_string):
    """ bytes to unicode """
    if isinstance(_string, bytes):
        return _string.decode('UTF-8', 'replace')
    return _string

def u2b(_string):
    """ unicode to bytes """
    if not isinstance(_string, bytes):
        return _string.encode('UTF-8', 'replace')
    return _string

def u2b_if_py2(_string):
    """ unicode to bytes only if Python 2"""
    if PY2 and isinstance(_string, unicode): # noqa: F821
        return _string.encode('UTF-8', 'replace')
    return _string
site-packages/firewall/errors.py
# -*- coding: utf-8 -*-
#
# Copyright (C) 2010-2012 Red Hat, Inc.
#
# Authors:
# Thomas Woerner <twoerner@redhat.com>
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program.  If not, see <http://www.gnu.org/licenses/>.
#

ALREADY_ENABLED     =   11
NOT_ENABLED         =   12
COMMAND_FAILED      =   13
NO_IPV6_NAT         =   14
PANIC_MODE          =   15
ZONE_ALREADY_SET    =   16
UNKNOWN_INTERFACE   =   17
ZONE_CONFLICT       =   18
BUILTIN_CHAIN       =   19
EBTABLES_NO_REJECT  =   20
NOT_OVERLOADABLE    =   21
NO_DEFAULTS         =   22
BUILTIN_ZONE        =   23
BUILTIN_SERVICE     =   24
BUILTIN_ICMPTYPE    =   25
NAME_CONFLICT       =   26
NAME_MISMATCH       =   27
PARSE_ERROR         =   28
ACCESS_DENIED       =   29
UNKNOWN_SOURCE      =   30
RT_TO_PERM_FAILED   =   31
IPSET_WITH_TIMEOUT  =   32
BUILTIN_IPSET       =   33
ALREADY_SET         =   34
MISSING_IMPORT      =   35
DBUS_ERROR          =   36
BUILTIN_HELPER      =   37
NOT_APPLIED         =   38

INVALID_ACTION      =  100
INVALID_SERVICE     =  101
INVALID_PORT        =  102
INVALID_PROTOCOL    =  103
INVALID_INTERFACE   =  104
INVALID_ADDR        =  105
INVALID_FORWARD     =  106
INVALID_ICMPTYPE    =  107
INVALID_TABLE       =  108
INVALID_CHAIN       =  109
INVALID_TARGET      =  110
INVALID_IPV         =  111
INVALID_ZONE        =  112
INVALID_PROPERTY    =  113
INVALID_VALUE       =  114
INVALID_OBJECT      =  115
INVALID_NAME        =  116
INVALID_FILENAME    =  117
INVALID_DIRECTORY   =  118
INVALID_TYPE        =  119
INVALID_SETTING     =  120
INVALID_DESTINATION =  121
INVALID_RULE        =  122
INVALID_LIMIT       =  123
INVALID_FAMILY      =  124
INVALID_LOG_LEVEL   =  125
INVALID_AUDIT_TYPE  =  126
INVALID_MARK        =  127
INVALID_CONTEXT     =  128
INVALID_COMMAND     =  129
INVALID_USER        =  130
INVALID_UID         =  131
INVALID_MODULE      =  132
INVALID_PASSTHROUGH =  133
INVALID_MAC         =  134
INVALID_IPSET       =  135
INVALID_ENTRY       =  136
INVALID_OPTION      =  137
INVALID_HELPER      =  138
INVALID_PRIORITY    =  139
INVALID_POLICY      =  140

MISSING_TABLE       =  200
MISSING_CHAIN       =  201
MISSING_PORT        =  202
MISSING_PROTOCOL    =  203
MISSING_ADDR        =  204
MISSING_NAME        =  205
MISSING_SETTING     =  206
MISSING_FAMILY      =  207

RUNNING_BUT_FAILED  =  251
NOT_RUNNING         =  252
NOT_AUTHORIZED      =  253
UNKNOWN_ERROR       =  254

import sys

class FirewallError(Exception):
    def __init__(self, code, msg=None):
        self.code = code
        if msg is not None:
            # escape msg if needed
            if sys.version < '3':
                try:
                    x = str(msg) # noqa: F841
                except UnicodeEncodeError:
                    msg = unicode(msg).encode("unicode_escape") # noqa: F821
        self.msg = msg

    def __repr__(self):
        return '%s(%r, %r)' % (self.__class__, self.code, self.msg)

    def __str__(self):
        if self.msg:
            return "%s: %s" % (self.errors[self.code], self.msg)
        return self.errors[self.code]

    def get_code(msg):
        if ":" in msg:
            idx = msg.index(":")
            ecode = msg[:idx]
        else:
            ecode = msg

        try:
            code = FirewallError.codes[ecode]
        except KeyError:
            code = UNKNOWN_ERROR

        return code

    get_code = staticmethod(get_code)

mod = sys.modules[FirewallError.__module__]
FirewallError.errors = { getattr(mod,varname) : varname
                         for varname in dir(mod)
                         if not varname.startswith("_") and \
                         type(getattr(mod,varname)) == int }
FirewallError.codes =  { FirewallError.errors[code] : code
                         for code in FirewallError.errors }
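
# Illustrative examples (comments only, not upstream code) of the generated
# lookup tables and the get_code() helper:
#
#   FirewallError.errors[INVALID_ZONE]             # -> 'INVALID_ZONE'
#   FirewallError.codes['INVALID_ZONE']            # -> 112
#   FirewallError.get_code("INVALID_ZONE: foo")    # -> 112
#   str(FirewallError(INVALID_ZONE, "foo"))        # -> 'INVALID_ZONE: foo'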
site-packages/firewall/dbus_utils.py
# -*- coding: utf-8 -*-
#
# Copyright (C) 2011-2016 Red Hat, Inc.
#
# Authors:
# Thomas Woerner <twoerner@redhat.com>
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program.  If not, see <http://www.gnu.org/licenses/>.
#

__all__ = [ "command_of_pid", "pid_of_sender", "uid_of_sender", "user_of_uid",
            "context_of_sender", "command_of_sender", "user_of_sender",
            "dbus_to_python", "dbus_signature",
            "dbus_introspection_prepare_properties",
            "dbus_introspection_add_properties" ]

import dbus
import pwd
import sys
from xml.dom import minidom

from firewall.core.logger import log

PY2 = sys.version < '3'

def command_of_pid(pid):
    """ Get command for pid from /proc """
    try:
        with open("/proc/%d/cmdline" % pid, "r") as f:
            cmd = f.readlines()[0].replace('\0', " ").strip()
    except Exception:
        return None
    return cmd

def pid_of_sender(bus, sender):
    """ Get pid from sender string using 
    org.freedesktop.DBus.GetConnectionUnixProcessID """

    dbus_obj = bus.get_object('org.freedesktop.DBus', '/org/freedesktop/DBus')
    dbus_iface = dbus.Interface(dbus_obj, 'org.freedesktop.DBus')

    try:
        pid = int(dbus_iface.GetConnectionUnixProcessID(sender))
    except ValueError:
        return None
    return pid

def uid_of_sender(bus, sender):
    """ Get user id from sender string using 
    org.freedesktop.DBus.GetConnectionUnixUser """

    dbus_obj = bus.get_object('org.freedesktop.DBus', '/org/freedesktop/DBus')
    dbus_iface = dbus.Interface(dbus_obj, 'org.freedesktop.DBus')

    try:
        uid = int(dbus_iface.GetConnectionUnixUser(sender))
    except ValueError:
        return None
    return uid

def user_of_uid(uid):
    """ Get user for uid from pwd """

    try:
        pws = pwd.getpwuid(uid)
    except Exception:
        return None
    return pws[0]

def context_of_sender(bus, sender):
    """ Get SELinux context from sender string using 
    org.freedesktop.DBus.GetConnectionSELinuxSecurityContext """

    dbus_obj = bus.get_object('org.freedesktop.DBus', '/org/freedesktop/DBus')
    dbus_iface = dbus.Interface(dbus_obj, 'org.freedesktop.DBus')

    try:
        context = dbus_iface.GetConnectionSELinuxSecurityContext(sender)
    except Exception:
        return None

    return "".join(map(chr, dbus_to_python(context)))

def command_of_sender(bus, sender):
    """ Return command of D-Bus sender """

    return command_of_pid(pid_of_sender(bus, sender))

def user_of_sender(bus, sender):
    return user_of_uid(uid_of_sender(bus, sender))
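
# Sketch: pid_of_sender()/uid_of_sender() resolve a D-Bus sender to a PID or
# UID, and command_of_sender()/user_of_sender() chain them through the /proc
# and pwd lookups above.  Outside of a D-Bus method call there is no sender
# string available, so this demo only exercises the local lookups on the
# current process.
if __name__ == "__main__":
    import os
    print(command_of_pid(os.getpid()))   # e.g. the python command line
    print(user_of_uid(os.getuid()))      # current user name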

def dbus_to_python(obj, expected_type=None):
    if obj is None:
        python_obj = obj
    elif isinstance(obj, dbus.Boolean):
        python_obj = bool(obj)
    elif isinstance(obj, dbus.String):
        python_obj = obj.encode('utf-8') if PY2 else str(obj)
    elif PY2 and isinstance(obj, dbus.UTF8String):  # Python3 has no UTF8String
        python_obj = str(obj)
    elif isinstance(obj, dbus.ObjectPath):
        python_obj = str(obj)
    elif isinstance(obj, dbus.Byte) or \
            isinstance(obj, dbus.Int16) or \
            isinstance(obj, dbus.Int32) or \
            isinstance(obj, dbus.Int64) or \
            isinstance(obj, dbus.UInt16) or \
            isinstance(obj, dbus.UInt32) or \
            isinstance(obj, dbus.UInt64):
        python_obj = int(obj)
    elif isinstance(obj, dbus.Double):
        python_obj = float(obj)
    elif isinstance(obj, dbus.Array):
        python_obj = [dbus_to_python(x) for x in obj]
    elif isinstance(obj, dbus.Struct):
        python_obj = tuple([dbus_to_python(x) for x in obj])
    elif isinstance(obj, dbus.Dictionary):
        python_obj = {dbus_to_python(k): dbus_to_python(v) for k, v in obj.items()}
    elif isinstance(obj, bool) or \
         isinstance(obj, str) or isinstance(obj, bytes) or \
         isinstance(obj, int) or isinstance(obj, float) or \
         isinstance(obj, list) or isinstance(obj, tuple) or \
         isinstance(obj, dict):
        python_obj = obj
    else:
        raise TypeError("Unhandled %s" % repr(obj))

    if expected_type is not None:
        if (expected_type == bool and not isinstance(python_obj, bool)) or \
           (expected_type == str and not isinstance(python_obj, str)) or \
           (expected_type == int and not isinstance(python_obj, int)) or \
           (expected_type == float and not isinstance(python_obj, float)) or \
           (expected_type == list and not isinstance(python_obj, list)) or \
           (expected_type == tuple and not isinstance(python_obj, tuple)) or \
           (expected_type == dict and not isinstance(python_obj, dict)):
            raise TypeError("%s is %s, expected %s" % (python_obj, type(python_obj), expected_type))

    return python_obj
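
# Sketch: dbus_to_python() recursively unwraps dbus container and scalar
# types, so a reply such as a dbus.Dictionary of dbus.String -> dbus.UInt32
# becomes a plain dict of str -> int; the optional second argument enforces
# the expected native type.
if __name__ == "__main__":
    _reply = dbus.Dictionary({dbus.String("tcp"): dbus.UInt32(443)},
                             signature="su")
    print(dbus_to_python(_reply))                    # {'tcp': 443}
    print(dbus_to_python(dbus.Boolean(True), bool))  # True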

def dbus_signature(obj):
    if isinstance(obj, dbus.Boolean):
        return 'b'
    elif isinstance(obj, dbus.String):
        return 's'
    elif isinstance(obj, dbus.ObjectPath):
        return 'o'
    elif isinstance(obj, dbus.Byte):
        return 'y'
    elif isinstance(obj, dbus.Int16):
        return 'n'
    elif isinstance(obj, dbus.Int32):
        return 'i'
    elif isinstance(obj, dbus.Int64):
        return 'x'
    elif isinstance(obj, dbus.UInt16):
        return 'q'
    elif isinstance(obj, dbus.UInt32):
        return 'u'
    elif isinstance(obj, dbus.UInt64):
        return 't'
    elif isinstance(obj, dbus.Double):
        return 'd'
    elif isinstance(obj, dbus.Array):
        if len(obj.signature) > 1:
            return 'a(%s)' % obj.signature
        else:
            return 'a%s' % obj.signature
    elif isinstance(obj, dbus.Struct):
        return '(%s)' % obj.signature
    elif isinstance(obj, dbus.Dictionary):
        return 'a{%s}' % obj.signature
    elif PY2 and isinstance(obj, dbus.UTF8String):
        return 's'
    else:
        raise TypeError("Unhandled %s" % repr(obj))

def dbus_introspection_prepare_properties(obj, interface, access=None):
    if access is None:
        access = { }

    if not hasattr(obj, "_fw_dbus_properties"):
        setattr(obj, "_fw_dbus_properties", { })
    dip = getattr(obj, "_fw_dbus_properties")
    dip[interface] = { }

    try:
        _dict = obj.GetAll(interface)
    except Exception:
        _dict = { }
    for key,value in _dict.items():
        dip[interface][key] = { "type": dbus_signature(value) }
        if key in access:
            dip[interface][key]["access"] = access[key]
        else:
            dip[interface][key]["access"] = "read"

def dbus_introspection_add_properties(obj, data, interface):
    doc = minidom.parseString(data)

    if hasattr(obj, "_fw_dbus_properties"):
        for node in doc.getElementsByTagName("interface"):
            if node.hasAttribute("name") and \
               node.getAttribute("name") == interface:
                dip = { }
                if getattr(obj, "_fw_dbus_properties"):
                    dip = getattr(obj, "_fw_dbus_properties")
                if interface in dip:
                    for key,value in dip[interface].items():
                        prop = doc.createElement("property")
                        prop.setAttribute("name", key)
                        prop.setAttribute("type", value["type"])
                        prop.setAttribute("access", value["access"])
                        node.appendChild(prop)

    log.debug10(doc.toxml())
    new_data = doc.toxml()
    doc.unlink()
    return new_data
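
# Sketch of how the two helpers above cooperate: the service object exposes
# GetAll(), dbus_introspection_prepare_properties() records each property's
# signature and access mode, and dbus_introspection_add_properties() later
# injects matching <property> elements into the XML produced by Introspect().
# The object and interface name below are illustrative only.
if __name__ == "__main__":
    class _Demo(object):
        def GetAll(self, interface):
            return {"name": dbus.String("demo"),
                    "default": dbus.Boolean(False)}

    _obj = _Demo()
    _iface = "org.example.Demo"   # hypothetical interface name
    dbus_introspection_prepare_properties(_obj, _iface)
    _xml = '<node><interface name="%s"/></node>' % _iface
    print(dbus_introspection_add_properties(_obj, _xml, _iface))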
site-packages/firewall/server/__init__.py
site-packages/firewall/server/config_ipset.py
# -*- coding: utf-8 -*-
#
# Copyright (C) 2015-2016 Red Hat, Inc.
#
# Authors:
# Thomas Woerner <twoerner@redhat.com>
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program.  If not, see <http://www.gnu.org/licenses/>.

# force use of pygobject3 in python-slip
from gi.repository import GObject
import sys
sys.modules['gobject'] = GObject

import dbus
import dbus.service
import slip.dbus
import slip.dbus.service

from firewall import config
from firewall.dbus_utils import dbus_to_python, \
    dbus_introspection_prepare_properties, \
    dbus_introspection_add_properties
from firewall.core.io.ipset import IPSet
from firewall.core.ipset import IPSET_TYPES, normalize_ipset_entry, \
                                check_entry_overlaps_existing, \
                                check_for_overlapping_entries
from firewall.core.logger import log
from firewall.server.decorators import handle_exceptions, \
    dbus_handle_exceptions, dbus_service_method
from firewall import errors
from firewall.errors import FirewallError

############################################################################
#
# class FirewallDConfigIPSet
#
############################################################################

class FirewallDConfigIPSet(slip.dbus.service.Object):
    """FirewallD main class"""

    persistent = True
    """ Make FirewallD persistent. """
    default_polkit_auth_required = config.dbus.PK_ACTION_CONFIG
    """ Use PK_ACTION_INFO as a default """

    @handle_exceptions
    def __init__(self, parent, conf, ipset, item_id, *args, **kwargs):
        super(FirewallDConfigIPSet, self).__init__(*args, **kwargs)
        self.parent = parent
        self.config = conf
        self.obj = ipset
        self.item_id = item_id
        self.busname = args[0]
        self.path = args[1]
        self._log_prefix = "config.ipset.%d" % self.item_id
        dbus_introspection_prepare_properties(
            self, config.dbus.DBUS_INTERFACE_CONFIG_IPSET)

    @dbus_handle_exceptions
    def __del__(self):
        pass

    @dbus_handle_exceptions
    def unregister(self):
        self.remove_from_connection()

    # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # #

    # P R O P E R T I E S

    @dbus_handle_exceptions
    def _get_property(self, property_name):
        if property_name == "name":
            return dbus.String(self.obj.name)
        elif property_name == "filename":
            return dbus.String(self.obj.filename)
        elif property_name == "path":
            return dbus.String(self.obj.path)
        elif property_name == "default":
            return dbus.Boolean(self.obj.default)
        elif property_name == "builtin":
            return dbus.Boolean(self.obj.builtin)
        else:
            raise dbus.exceptions.DBusException(
                "org.freedesktop.DBus.Error.InvalidArgs: "
                "Property '%s' does not exist" % property_name)

    @dbus_service_method(dbus.PROPERTIES_IFACE, in_signature='ss',
                         out_signature='v')
    @dbus_handle_exceptions
    def Get(self, interface_name, property_name, sender=None): # pylint: disable=W0613
        # get a property
        interface_name = dbus_to_python(interface_name, str)
        property_name = dbus_to_python(property_name, str)
        log.debug1("%s.Get('%s', '%s')", self._log_prefix,
                   interface_name, property_name)

        if interface_name != config.dbus.DBUS_INTERFACE_CONFIG_IPSET:
            raise dbus.exceptions.DBusException(
                "org.freedesktop.DBus.Error.UnknownInterface: "
                "Interface '%s' does not exist" % interface_name)

        return self._get_property(property_name)

    @dbus_service_method(dbus.PROPERTIES_IFACE, in_signature='s',
                         out_signature='a{sv}')
    @dbus_handle_exceptions
    def GetAll(self, interface_name, sender=None): # pylint: disable=W0613
        interface_name = dbus_to_python(interface_name, str)
        log.debug1("%s.GetAll('%s')", self._log_prefix, interface_name)

        if interface_name != config.dbus.DBUS_INTERFACE_CONFIG_IPSET:
            raise dbus.exceptions.DBusException(
                "org.freedesktop.DBus.Error.UnknownInterface: "
                "Interface '%s' does not exist" % interface_name)

        ret = { }
        for x in [ "name", "filename", "path", "default", "builtin" ]:
            ret[x] = self._get_property(x)
        return dbus.Dictionary(ret, signature="sv")

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_CONFIG)
    @dbus_service_method(dbus.PROPERTIES_IFACE, in_signature='ssv')
    @dbus_handle_exceptions
    def Set(self, interface_name, property_name, new_value, sender=None):
        interface_name = dbus_to_python(interface_name, str)
        property_name = dbus_to_python(property_name, str)
        new_value = dbus_to_python(new_value)
        log.debug1("%s.Set('%s', '%s', '%s')", self._log_prefix,
                   interface_name, property_name, new_value)
        self.parent.accessCheck(sender)

        if interface_name != config.dbus.DBUS_INTERFACE_CONFIG_IPSET:
            raise dbus.exceptions.DBusException(
                "org.freedesktop.DBus.Error.UnknownInterface: "
                "Interface '%s' does not exist" % interface_name)

        raise dbus.exceptions.DBusException(
            "org.freedesktop.DBus.Error.PropertyReadOnly: "
            "Property '%s' is read-only" % property_name)

    @dbus.service.signal(dbus.PROPERTIES_IFACE, signature='sa{sv}as')
    def PropertiesChanged(self, interface_name, changed_properties,
                          invalidated_properties):
        interface_name = dbus_to_python(interface_name, str)
        changed_properties = dbus_to_python(changed_properties)
        invalidated_properties = dbus_to_python(invalidated_properties)
        log.debug1("%s.PropertiesChanged('%s', '%s', '%s')", self._log_prefix,
                   interface_name, changed_properties, invalidated_properties)

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_INFO)
    @dbus_service_method(dbus.INTROSPECTABLE_IFACE, out_signature='s')
    @dbus_handle_exceptions
    def Introspect(self, sender=None): # pylint: disable=W0613
        log.debug2("%s.Introspect()", self._log_prefix)

        data = super(FirewallDConfigIPSet, self).Introspect(
            self.path, self.busname.get_bus())

        return dbus_introspection_add_properties(
            self, data, config.dbus.DBUS_INTERFACE_CONFIG_IPSET)

    # S E T T I N G S

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_IPSET,
                         out_signature=IPSet.DBUS_SIGNATURE)
    @dbus_handle_exceptions
    def getSettings(self, sender=None): # pylint: disable=W0613
        """get settings for ipset
        """
        log.debug1("%s.getSettings()", self._log_prefix)
        return self.config.get_ipset_config(self.obj)
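
    # Positional layout of the settings tuple (IPSet.DBUS_SIGNATURE) as used
    # by the accessors below: [0] version, [1] short, [2] description,
    # [3] type, [4] options (dict), [5] entries (list).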

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_IPSET,
                         in_signature=IPSet.DBUS_SIGNATURE)
    @dbus_handle_exceptions
    def update(self, settings, sender=None):
        """update settings for ipset
        """
        settings = dbus_to_python(settings)
        log.debug1("%s.update('...')", self._log_prefix)
        self.parent.accessCheck(sender)
        self.obj = self.config.set_ipset_config(self.obj, settings)
        self.Updated(self.obj.name)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_IPSET)
    @dbus_handle_exceptions
    def loadDefaults(self, sender=None):
        """load default settings for builtin ipset
        """
        log.debug1("%s.loadDefaults()", self._log_prefix)
        self.parent.accessCheck(sender)
        self.obj = self.config.load_ipset_defaults(self.obj)
        self.Updated(self.obj.name)
        #self.PropertiesChanged(config.dbus.DBUS_INTERFACE_CONFIG_IPSET,
        #                       { "default": True }, [ ])

    @dbus.service.signal(config.dbus.DBUS_INTERFACE_CONFIG_IPSET, signature='s')
    @dbus_handle_exceptions
    def Updated(self, name):
        log.debug1("%s.Updated('%s')" % (self._log_prefix, name))

    # R E M O V E

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_IPSET)
    @dbus_handle_exceptions
    def remove(self, sender=None):
        """remove ipset
        """
        log.debug1("%s.remove()", self._log_prefix)
        self.parent.accessCheck(sender)
        self.config.remove_ipset(self.obj)
        self.parent.removeIPSet(self.obj)

    @dbus.service.signal(config.dbus.DBUS_INTERFACE_CONFIG_IPSET, signature='s')
    @dbus_handle_exceptions
    def Removed(self, name):
        log.debug1("%s.Removed('%s')" % (self._log_prefix, name))

    # R E N A M E

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_IPSET,
                         in_signature='s')
    @dbus_handle_exceptions
    def rename(self, name, sender=None):
        """rename ipset
        """
        name = dbus_to_python(name, str)
        log.debug1("%s.rename('%s')", self._log_prefix, name)
        self.parent.accessCheck(sender)
        self.obj = self.config.rename_ipset(self.obj, name)
        self.Renamed(name)
        #self.PropertiesChanged(config.dbus.DBUS_INTERFACE_CONFIG_IPSET,
        #                       { "name": name }, [ ])

    @dbus.service.signal(config.dbus.DBUS_INTERFACE_CONFIG_IPSET, signature='s')
    @dbus_handle_exceptions
    def Renamed(self, name):
        log.debug1("%s.Renamed('%s')" % (self._log_prefix, name))

    # version

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_IPSET,
                         out_signature='s')
    @dbus_handle_exceptions
    def getVersion(self, sender=None): # pylint: disable=W0613
        log.debug1("%s.getVersion()", self._log_prefix)
        return self.getSettings()[0]

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_IPSET,
                         in_signature='s')
    @dbus_handle_exceptions
    def setVersion(self, version, sender=None):
        version = dbus_to_python(version, str)
        log.debug1("%s.setVersion('%s')", self._log_prefix, version)
        self.parent.accessCheck(sender)
        settings = list(self.getSettings())
        settings[0] = version
        self.update(settings)

    # short

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_IPSET,
                         out_signature='s')
    @dbus_handle_exceptions
    def getShort(self, sender=None): # pylint: disable=W0613
        log.debug1("%s.getShort()", self._log_prefix)
        return self.getSettings()[1]

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_IPSET,
                         in_signature='s')
    @dbus_handle_exceptions
    def setShort(self, short, sender=None):
        short = dbus_to_python(short, str)
        log.debug1("%s.setShort('%s')", self._log_prefix, short)
        self.parent.accessCheck(sender)
        settings = list(self.getSettings())
        settings[1] = short
        self.update(settings)

    # description

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_IPSET,
                         out_signature='s')
    @dbus_handle_exceptions
    def getDescription(self, sender=None): # pylint: disable=W0613
        log.debug1("%s.getDescription()", self._log_prefix)
        return self.getSettings()[2]

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_IPSET,
                         in_signature='s')
    @dbus_handle_exceptions
    def setDescription(self, description, sender=None):
        description = dbus_to_python(description, str)
        log.debug1("%s.setDescription('%s')", self._log_prefix,
                   description)
        self.parent.accessCheck(sender)
        settings = list(self.getSettings())
        settings[2] = description
        self.update(settings)

    # type

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_IPSET,
                         out_signature='s')
    @dbus_handle_exceptions
    def getType(self, sender=None): # pylint: disable=W0613
        log.debug1("%s.getType()", self._log_prefix)
        return self.getSettings()[3]

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_IPSET,
                         in_signature='s')
    @dbus_handle_exceptions
    def setType(self, ipset_type, sender=None):
        ipset_type = dbus_to_python(ipset_type, str)
        log.debug1("%s.setType('%s')", self._log_prefix, ipset_type)
        self.parent.accessCheck(sender)
        if ipset_type not in IPSET_TYPES:
            raise FirewallError(errors.INVALID_TYPE, ipset_type)
        settings = list(self.getSettings())
        settings[3] = ipset_type
        self.update(settings)

    # options

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_IPSET,
                         out_signature='a{ss}')
    @dbus_handle_exceptions
    def getOptions(self, sender=None): # pylint: disable=W0613
        log.debug1("%s.getOptions()", self._log_prefix)
        return self.getSettings()[4]

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_IPSET,
                         in_signature='a{ss}')
    @dbus_handle_exceptions
    def setOptions(self, options, sender=None):
        options = dbus_to_python(options, dict)
        log.debug1("%s.setOptions('[%s]')", self._log_prefix,
                   repr(options))
        self.parent.accessCheck(sender)
        settings = list(self.getSettings())
        settings[4] = options
        self.update(settings)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_IPSET,
                         in_signature='ss')
    @dbus_handle_exceptions
    def addOption(self, key, value, sender=None):
        key = dbus_to_python(key, str)
        value = dbus_to_python(value, str)
        log.debug1("%s.addOption('%s', '%s')", self._log_prefix, key, value)
        self.parent.accessCheck(sender)
        settings = list(self.getSettings())
        if key in settings[4] and settings[4][key] == value:
            raise FirewallError(errors.ALREADY_ENABLED,
                                "'%s': '%s'" % (key, value))
        settings[4][key] = value
        self.update(settings)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_IPSET,
                         in_signature='s')
    @dbus_handle_exceptions
    def removeOption(self, key, sender=None):
        key = dbus_to_python(key, str)
        log.debug1("%s.removeOption('%s')", self._log_prefix, key)
        self.parent.accessCheck(sender)
        settings = list(self.getSettings())
        if key not in settings[4]:
            raise FirewallError(errors.NOT_ENABLED, key)
        del settings[4][key]
        self.update(settings)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_IPSET,
                         in_signature='ss',
                         out_signature='b')
    @dbus_handle_exceptions
    def queryOption(self, key, value, sender=None): # pylint: disable=W0613
        key = dbus_to_python(key, str)
        value = dbus_to_python(value, str)
        log.debug1("%s.queryOption('%s', '%s')", self._log_prefix, key,
                   value)
        settings = list(self.getSettings())
        return (key in settings[4] and settings[4][key] == value)

    # entries

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_IPSET,
                         out_signature='as')
    @dbus_handle_exceptions
    def getEntries(self, sender=None): # pylint: disable=W0613
        log.debug1("%s.getEntries()", self._log_prefix)
        return self.getSettings()[5]

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_IPSET,
                         in_signature='as')
    @dbus_handle_exceptions
    def setEntries(self, entries, sender=None):
        entries = dbus_to_python(entries, list)
        check_for_overlapping_entries(entries)
        log.debug1("%s.setEntries('[%s]')", self._log_prefix,
                   ",".join(entries))
        self.parent.accessCheck(sender)
        settings = list(self.getSettings())
        if "timeout" in settings[4] and settings[4]["timeout"] != "0":
            raise FirewallError(errors.IPSET_WITH_TIMEOUT)
        settings[5] = entries
        self.update(settings)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_IPSET,
                         in_signature='s')
    @dbus_handle_exceptions
    def addEntry(self, entry, sender=None):
        entry = dbus_to_python(entry, str)
        entry = normalize_ipset_entry(entry)
        log.debug1("%s.addEntry('%s')", self._log_prefix, entry)
        self.parent.accessCheck(sender)
        settings = list(self.getSettings())
        if "timeout" in settings[4] and settings[4]["timeout"] != "0":
            raise FirewallError(errors.IPSET_WITH_TIMEOUT)
        if entry in settings[5]:
            raise FirewallError(errors.ALREADY_ENABLED, entry)
        check_entry_overlaps_existing(entry, settings[5])
        settings[5].append(entry)
        self.update(settings)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_IPSET,
                         in_signature='s')
    @dbus_handle_exceptions
    def removeEntry(self, entry, sender=None):
        entry = dbus_to_python(entry, str)
        entry = normalize_ipset_entry(entry)
        log.debug1("%s.removeEntry('%s')", self._log_prefix, entry)
        self.parent.accessCheck(sender)
        settings = list(self.getSettings())
        if "timeout" in settings[4] and settings[4]["timeout"] != "0":
            raise FirewallError(errors.IPSET_WITH_TIMEOUT)
        if entry not in settings[5]:
            raise FirewallError(errors.NOT_ENABLED, entry)
        settings[5].remove(entry)
        self.update(settings)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_IPSET,
                         in_signature='s', out_signature='b')
    @dbus_handle_exceptions
    def queryEntry(self, entry, sender=None): # pylint: disable=W0613
        entry = dbus_to_python(entry, str)
        entry = normalize_ipset_entry(entry)
        log.debug1("%s.queryEntry('%s')", self._log_prefix, entry)
        settings = list(self.getSettings())
        if "timeout" in settings[4] and settings[4]["timeout"] != "0":
            raise FirewallError(errors.IPSET_WITH_TIMEOUT)
        return entry in settings[5]
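
# Client-side sketch: how a D-Bus client could talk to one of these config
# ipset objects.  The object path below is an assumed example; real paths are
# allocated by the parent config object, and the bus name is the one
# registered in firewall.server.server.
if __name__ == "__main__":
    _bus = dbus.SystemBus()
    _path = "/org/fedoraproject/FirewallD1/config/ipset/0"  # assumed example path
    _obj = _bus.get_object(config.dbus.DBUS_INTERFACE, _path)
    _ipset = dbus.Interface(_obj, config.dbus.DBUS_INTERFACE_CONFIG_IPSET)
    print(_ipset.getSettings())
    _ipset.addEntry("192.0.2.1")   # example entry in TEST-NET-1 space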
site-packages/firewall/server/server.py
# -*- coding: utf-8 -*-
#
# Copyright (C) 2010-2016 Red Hat, Inc.
#
# Authors:
# Thomas Woerner <twoerner@redhat.com>
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program.  If not, see <http://www.gnu.org/licenses/>.
#
# signal handling and run_server derived from setroubleshoot
# Copyright (C) 2006,2007,2008,2009 Red Hat, Inc.
# Authors:
#   John Dennis <jdennis@redhat.com>
#   Thomas Liu  <tliu@redhat.com>
#   Dan Walsh <dwalsh@redhat.com>

__all__ = [ "run_server" ]

import sys
import signal

# force use of pygobject3 in python-slip
from gi.repository import GObject, GLib
sys.modules['gobject'] = GObject

import dbus
import dbus.service
import dbus.mainloop.glib
import slip.dbus

from firewall import config
from firewall.core.logger import log
from firewall.server.firewalld import FirewallD

############################################################################
#
# signal handlers
#
############################################################################

def sighup(service):
    service.reload()
    return True

def sigterm(mainloop):
    mainloop.quit()

############################################################################
#
# run_server function
#
############################################################################

def run_server(debug_gc=False):
    """ Main function for firewall server. Handles D-Bus and GLib mainloop.
    """
    service = None
    if debug_gc:
        from pprint import pformat
        import gc
        gc.enable()
        gc.set_debug(gc.DEBUG_LEAK)

        gc_timeout = 10
        def gc_collect():
            gc.collect()
            if len(gc.garbage) > 0:
                print("\n>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>"
                      ">>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>\n")
                print("GARBAGE OBJECTS (%d):\n" % len(gc.garbage))
                for x in gc.garbage:
                    print(type(x), "\n  ",)
                    print(pformat(x))
                print("\n<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<"
                      "<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<\n")
            GLib.timeout_add_seconds(gc_timeout, gc_collect)

    try:
        dbus.mainloop.glib.DBusGMainLoop(set_as_default=True)
        bus = dbus.SystemBus()
        name = dbus.service.BusName(config.dbus.DBUS_INTERFACE, bus=bus)
        service = FirewallD(name, config.dbus.DBUS_PATH)

        mainloop = GLib.MainLoop()
        slip.dbus.service.set_mainloop(mainloop)
        if debug_gc:
            GLib.timeout_add_seconds(gc_timeout, gc_collect)

        # use unix_signal_add if available, else unix_signal_add_full
        if hasattr(GLib, 'unix_signal_add'):
            unix_signal_add = GLib.unix_signal_add
        else:
            unix_signal_add = GLib.unix_signal_add_full

        unix_signal_add(GLib.PRIORITY_HIGH, signal.SIGHUP,
                        sighup, service)
        unix_signal_add(GLib.PRIORITY_HIGH, signal.SIGTERM,
                        sigterm, mainloop)

        mainloop.run()

    except KeyboardInterrupt:
        log.debug1("Stopping..")

    except SystemExit:
        log.error("Raising SystemExit in run_server")

    except Exception as e:
        log.error("Exception %s: %s", e.__class__.__name__, str(e))

    if service:
        service.stop()
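
# Entry-point sketch: the firewalld daemon ultimately does something
# equivalent to the call below (argument parsing, logging setup and
# privilege handling omitted).
if __name__ == "__main__":
    run_server(debug_gc=False)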
site-packages/firewall/server/config_service.py
# -*- coding: utf-8 -*-
#
# Copyright (C) 2010-2016 Red Hat, Inc.
#
# Authors:
# Thomas Woerner <twoerner@redhat.com>
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program.  If not, see <http://www.gnu.org/licenses/>.

# force use of pygobject3 in python-slip
from gi.repository import GObject
import sys
sys.modules['gobject'] = GObject

import dbus
import dbus.service
import slip.dbus
import slip.dbus.service

from firewall import config
from firewall.dbus_utils import dbus_to_python, \
    dbus_introspection_prepare_properties, \
    dbus_introspection_add_properties
from firewall.core.logger import log
from firewall.server.decorators import handle_exceptions, \
    dbus_handle_exceptions, dbus_service_method
from firewall import errors
from firewall.errors import FirewallError

############################################################################
#
# class FirewallDConfig
#
############################################################################

class FirewallDConfigService(slip.dbus.service.Object):
    """FirewallD main class"""

    persistent = True
    """ Make FirewallD persistent. """
    default_polkit_auth_required = config.dbus.PK_ACTION_CONFIG
    """ Use PK_ACTION_INFO as a default """

    @handle_exceptions
    def __init__(self, parent, conf, service, item_id, *args, **kwargs):
        super(FirewallDConfigService, self).__init__(*args, **kwargs)
        self.parent = parent
        self.config = conf
        self.obj = service
        self.item_id = item_id
        self.busname = args[0]
        self.path = args[1]
        self._log_prefix = "config.service.%d" % self.item_id
        dbus_introspection_prepare_properties(
            self, config.dbus.DBUS_INTERFACE_CONFIG_SERVICE)

    @dbus_handle_exceptions
    def __del__(self):
        pass

    @dbus_handle_exceptions
    def unregister(self):
        self.remove_from_connection()

    # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # #

    # P R O P E R T I E S

    @dbus_handle_exceptions
    def _get_property(self, property_name):
        if property_name == "name":
            return dbus.String(self.obj.name)
        elif property_name == "filename":
            return dbus.String(self.obj.filename)
        elif property_name == "path":
            return dbus.String(self.obj.path)
        elif property_name == "default":
            return dbus.Boolean(self.obj.default)
        elif property_name == "builtin":
            return dbus.Boolean(self.obj.builtin)
        else:
            raise dbus.exceptions.DBusException(
                "org.freedesktop.DBus.Error.InvalidArgs: "
                "Property '%s' does not exist" % property_name)

    @dbus_service_method(dbus.PROPERTIES_IFACE, in_signature='ss',
                         out_signature='v')
    @dbus_handle_exceptions
    def Get(self, interface_name, property_name, sender=None): # pylint: disable=W0613
        # get a property
        interface_name = dbus_to_python(interface_name, str)
        property_name = dbus_to_python(property_name, str)
        log.debug1("%s.Get('%s', '%s')", self._log_prefix,
                   interface_name, property_name)

        if interface_name != config.dbus.DBUS_INTERFACE_CONFIG_SERVICE:
            raise dbus.exceptions.DBusException(
                "org.freedesktop.DBus.Error.UnknownInterface: "
                "Interface '%s' does not exist" % interface_name)

        return self._get_property(property_name)

    @dbus_service_method(dbus.PROPERTIES_IFACE, in_signature='s',
                         out_signature='a{sv}')
    @dbus_handle_exceptions
    def GetAll(self, interface_name, sender=None): # pylint: disable=W0613
        interface_name = dbus_to_python(interface_name, str)
        log.debug1("%s.GetAll('%s')", self._log_prefix, interface_name)

        if interface_name != config.dbus.DBUS_INTERFACE_CONFIG_SERVICE:
            raise dbus.exceptions.DBusException(
                "org.freedesktop.DBus.Error.UnknownInterface: "
                "Interface '%s' does not exist" % interface_name)

        ret = { }
        for x in [ "name", "filename", "path", "default", "builtin" ]:
            ret[x] = self._get_property(x)
        return dbus.Dictionary(ret, signature="sv")

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_CONFIG)
    @dbus_service_method(dbus.PROPERTIES_IFACE, in_signature='ssv')
    @dbus_handle_exceptions
    def Set(self, interface_name, property_name, new_value, sender=None):
        interface_name = dbus_to_python(interface_name, str)
        property_name = dbus_to_python(property_name, str)
        new_value = dbus_to_python(new_value)
        log.debug1("%s.Set('%s', '%s', '%s')", self._log_prefix,
                   interface_name, property_name, new_value)
        self.parent.accessCheck(sender)

        if interface_name != config.dbus.DBUS_INTERFACE_CONFIG_SERVICE:
            raise dbus.exceptions.DBusException(
                "org.freedesktop.DBus.Error.UnknownInterface: "
                "Interface '%s' does not exist" % interface_name)

        raise dbus.exceptions.DBusException(
            "org.freedesktop.DBus.Error.PropertyReadOnly: "
            "Property '%s' is read-only" % property_name)

    @dbus.service.signal(dbus.PROPERTIES_IFACE, signature='sa{sv}as')
    def PropertiesChanged(self, interface_name, changed_properties,
                          invalidated_properties):
        interface_name = dbus_to_python(interface_name, str)
        changed_properties = dbus_to_python(changed_properties)
        invalidated_properties = dbus_to_python(invalidated_properties)
        log.debug1("%s.PropertiesChanged('%s', '%s', '%s')", self._log_prefix,
                   interface_name, changed_properties, invalidated_properties)

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_INFO)
    @dbus_service_method(dbus.INTROSPECTABLE_IFACE, out_signature='s')
    @dbus_handle_exceptions
    def Introspect(self, sender=None): # pylint: disable=W0613
        log.debug2("%s.Introspect()", self._log_prefix)

        data = super(FirewallDConfigService, self).Introspect(
            self.path, self.busname.get_bus())

        return dbus_introspection_add_properties(
            self, data, config.dbus.DBUS_INTERFACE_CONFIG_SERVICE)

    # S E T T I N G S

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_SERVICE,
                         out_signature='(sssa(ss)asa{ss}asa(ss))')
    @dbus_handle_exceptions
    def getSettings(self, sender=None): # pylint: disable=W0613
        """get settings for service
        """
        log.debug1("%s.getSettings()", self._log_prefix)
        return self.config.get_service_config(self.obj)
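
    # Positional layout of the settings tuple ('(sssa(ss)asa{ss}asa(ss))') as
    # used by the accessors below: [0] version, [1] short, [2] description,
    # [3] ports [(port, protocol)], [4] modules, [5] destinations {family: addr},
    # [6] protocols, [7] source ports [(port, protocol)].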

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_SERVICE,
                         out_signature='a{sv}')
    @dbus_handle_exceptions
    def getSettings2(self, sender=None):
        """get settings for service
        """
        log.debug1("%s.getSettings2()", self._log_prefix)
        return self.config.get_service_config_dict(self.obj)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_SERVICE,
                         in_signature='(sssa(ss)asa{ss}asa(ss))')
    @dbus_handle_exceptions
    def update(self, settings, sender=None):
        """update settings for service
        """
        settings = dbus_to_python(settings)
        log.debug1("%s.update('...')", self._log_prefix)
        self.parent.accessCheck(sender)
        self.obj = self.config.set_service_config(self.obj, settings)
        self.Updated(self.obj.name)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_SERVICE,
                         in_signature='a{sv}')
    @dbus_handle_exceptions
    def update2(self, settings, sender=None):
        settings = dbus_to_python(settings)
        log.debug1("%s.update2('...')", self._log_prefix)
        self.parent.accessCheck(sender)
        self.obj = self.config.set_service_config_dict(self.obj, settings)
        self.Updated(self.obj.name)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_SERVICE)
    @dbus_handle_exceptions
    def loadDefaults(self, sender=None):
        """load default settings for builtin service
        """
        log.debug1("%s.loadDefaults()", self._log_prefix)
        self.parent.accessCheck(sender)
        self.obj = self.config.load_service_defaults(self.obj)
        self.Updated(self.obj.name)

    @dbus.service.signal(config.dbus.DBUS_INTERFACE_CONFIG_SERVICE,
                         signature='s')
    @dbus_handle_exceptions
    def Updated(self, name):
        log.debug1("%s.Updated('%s')" % (self._log_prefix, name))

    # R E M O V E

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_SERVICE)
    @dbus_handle_exceptions
    def remove(self, sender=None):
        """remove service
        """
        log.debug1("%s.removeService()", self._log_prefix)
        self.parent.accessCheck(sender)
        self.config.remove_service(self.obj)
        self.parent.removeService(self.obj)

    @dbus.service.signal(config.dbus.DBUS_INTERFACE_CONFIG_SERVICE,
                         signature='s')
    @dbus_handle_exceptions
    def Removed(self, name):
        log.debug1("%s.Removed('%s')" % (self._log_prefix, name))

    # R E N A M E

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_SERVICE,
                         in_signature='s')
    @dbus_handle_exceptions
    def rename(self, name, sender=None):
        """rename service
        """
        name = dbus_to_python(name, str)
        log.debug1("%s.rename('%s')", self._log_prefix, name)
        self.parent.accessCheck(sender)
        self.obj = self.config.rename_service(self.obj, name)
        self.Renamed(name)

    @dbus.service.signal(config.dbus.DBUS_INTERFACE_CONFIG_SERVICE,
                         signature='s')
    @dbus_handle_exceptions
    def Renamed(self, name):
        log.debug1("%s.Renamed('%s')" % (self._log_prefix, name))

    # version

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_SERVICE,
                         out_signature='s')
    @dbus_handle_exceptions
    def getVersion(self, sender=None): # pylint: disable=W0613
        log.debug1("%s.getVersion()", self._log_prefix)
        return self.getSettings()[0]

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_SERVICE,
                         in_signature='s')
    @dbus_handle_exceptions
    def setVersion(self, version, sender=None):
        version = dbus_to_python(version, str)
        log.debug1("%s.setVersion('%s')", self._log_prefix, version)
        self.parent.accessCheck(sender)
        settings = list(self.getSettings())
        settings[0] = version
        self.update(settings)

    # short

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_SERVICE,
                         out_signature='s')
    @dbus_handle_exceptions
    def getShort(self, sender=None): # pylint: disable=W0613
        log.debug1("%s.getShort()", self._log_prefix)
        return self.getSettings()[1]

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_SERVICE,
                         in_signature='s')
    @dbus_handle_exceptions
    def setShort(self, short, sender=None):
        short = dbus_to_python(short, str)
        log.debug1("%s.setShort('%s')", self._log_prefix, short)
        self.parent.accessCheck(sender)
        settings = list(self.getSettings())
        settings[1] = short
        self.update(settings)

    # description

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_SERVICE,
                         out_signature='s')
    @dbus_handle_exceptions
    def getDescription(self, sender=None): # pylint: disable=W0613
        log.debug1("%s.getDescription()", self._log_prefix)
        return self.getSettings()[2]

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_SERVICE,
                         in_signature='s')
    @dbus_handle_exceptions
    def setDescription(self, description, sender=None):
        description = dbus_to_python(description, str)
        log.debug1("%s.setDescription('%s')", self._log_prefix,
                   description)
        self.parent.accessCheck(sender)
        settings = list(self.getSettings())
        settings[2] = description
        self.update(settings)

    # port

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_SERVICE,
                         out_signature='a(ss)')
    @dbus_handle_exceptions
    def getPorts(self, sender=None): # pylint: disable=W0613
        log.debug1("%s.getPorts()", self._log_prefix)
        return self.getSettings()[3]

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_SERVICE,
                         in_signature='a(ss)')
    @dbus_handle_exceptions
    def setPorts(self, ports, sender=None):
        _ports = [ ]
        # convert embedded lists to tuples
        for port in dbus_to_python(ports, list):
            if isinstance(port, list):
                _ports.append(tuple(port))
            else:
                _ports.append(port)
        ports = _ports
        log.debug1("%s.setPorts('[%s]')", self._log_prefix,
                   ",".join("('%s, '%s')" % (port[0], port[1]) for port in ports))
        self.parent.accessCheck(sender)
        settings = list(self.getSettings())
        settings[3] = ports
        self.update(settings)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_SERVICE,
                         in_signature='ss')
    @dbus_handle_exceptions
    def addPort(self, port, protocol, sender=None):
        port = dbus_to_python(port, str)
        protocol = dbus_to_python(protocol, str)
        log.debug1("%s.addPort('%s', '%s')", self._log_prefix, port,
                   protocol)
        self.parent.accessCheck(sender)
        settings = list(self.getSettings())
        if (port,protocol) in settings[3]:
            raise FirewallError(errors.ALREADY_ENABLED,
                                "%s:%s" % (port, protocol))
        settings[3].append((port,protocol))
        self.update(settings)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_SERVICE,
                         in_signature='ss')
    @dbus_handle_exceptions
    def removePort(self, port, protocol, sender=None):
        port = dbus_to_python(port, str)
        protocol = dbus_to_python(protocol, str)
        log.debug1("%s.removePort('%s', '%s')", self._log_prefix, port,
                   protocol)
        self.parent.accessCheck(sender)
        settings = list(self.getSettings())
        if (port,protocol) not in settings[3]:
            raise FirewallError(errors.NOT_ENABLED, "%s:%s" % (port, protocol))
        settings[3].remove((port,protocol))
        self.update(settings)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_SERVICE,
                         in_signature='ss', out_signature='b')
    @dbus_handle_exceptions
    def queryPort(self, port, protocol, sender=None): # pylint: disable=W0613
        port = dbus_to_python(port, str)
        protocol = dbus_to_python(protocol, str)
        log.debug1("%s.queryPort('%s', '%s')", self._log_prefix, port,
                   protocol)
        return (port,protocol) in self.getSettings()[3]

    # protocol

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_SERVICE,
                         out_signature='as')
    @dbus_handle_exceptions
    def getProtocols(self, sender=None): # pylint: disable=W0613
        log.debug1("%s.getProtocols()", self._log_prefix)
        return self.getSettings()[6]

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_SERVICE,
                         in_signature='as')
    @dbus_handle_exceptions
    def setProtocols(self, protocols, sender=None):
        protocols = dbus_to_python(protocols, list)
        log.debug1("%s.setProtocols('[%s]')", self._log_prefix,
                   ",".join(protocols))
        self.parent.accessCheck(sender)
        settings = list(self.getSettings())
        settings[6] = protocols
        self.update(settings)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_SERVICE,
                         in_signature='s')
    @dbus_handle_exceptions
    def addProtocol(self, protocol, sender=None):
        protocol = dbus_to_python(protocol, str)
        log.debug1("%s.addProtocol('%s')", self._log_prefix, protocol)
        self.parent.accessCheck(sender)
        settings = list(self.getSettings())
        if protocol in settings[6]:
            raise FirewallError(errors.ALREADY_ENABLED, protocol)
        settings[6].append(protocol)
        self.update(settings)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_SERVICE,
                         in_signature='s')
    @dbus_handle_exceptions
    def removeProtocol(self, protocol, sender=None):
        protocol = dbus_to_python(protocol, str)
        log.debug1("%s.removeProtocol('%s')", self._log_prefix, protocol)
        self.parent.accessCheck(sender)
        settings = list(self.getSettings())
        if protocol not in settings[6]:
            raise FirewallError(errors.NOT_ENABLED, protocol)
        settings[6].remove(protocol)
        self.update(settings)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_SERVICE,
                         in_signature='s', out_signature='b')
    @dbus_handle_exceptions
    def queryProtocol(self, protocol, sender=None): # pylint: disable=W0613
        protocol = dbus_to_python(protocol, str)
        log.debug1("%s.queryProtocol(%s')", self._log_prefix, protocol)
        return protocol in self.getSettings()[6]

    # source port

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_SERVICE,
                         out_signature='a(ss)')
    @dbus_handle_exceptions
    def getSourcePorts(self, sender=None): # pylint: disable=W0613
        log.debug1("%s.getSourcePorts()", self._log_prefix)
        return self.getSettings()[7]

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_SERVICE,
                         in_signature='a(ss)')
    @dbus_handle_exceptions
    def setSourcePorts(self, ports, sender=None):
        _ports = [ ]
        # convert embedded lists to tuples
        for port in dbus_to_python(ports, list):
            if isinstance(port, list):
                _ports.append(tuple(port))
            else:
                _ports.append(port)
        ports = _ports
        log.debug1("%s.setSourcePorts('[%s]')", self._log_prefix,
                   ",".join("('%s, '%s')" % (port[0], port[1]) for port in ports))
        self.parent.accessCheck(sender)
        settings = list(self.getSettings())
        settings[7] = ports
        self.update(settings)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_SERVICE,
                         in_signature='ss')
    @dbus_handle_exceptions
    def addSourcePort(self, port, protocol, sender=None):
        port = dbus_to_python(port, str)
        protocol = dbus_to_python(protocol, str)
        log.debug1("%s.addSourcePort('%s', '%s')", self._log_prefix, port,
                   protocol)
        self.parent.accessCheck(sender)
        settings = list(self.getSettings())
        if (port,protocol) in settings[7]:
            raise FirewallError(errors.ALREADY_ENABLED,
                                "%s:%s" % (port, protocol))
        settings[7].append((port,protocol))
        self.update(settings)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_SERVICE,
                         in_signature='ss')
    @dbus_handle_exceptions
    def removeSourcePort(self, port, protocol, sender=None):
        port = dbus_to_python(port, str)
        protocol = dbus_to_python(protocol, str)
        log.debug1("%s.removeSourcePort('%s', '%s')", self._log_prefix, port,
                   protocol)
        self.parent.accessCheck(sender)
        settings = list(self.getSettings())
        if (port,protocol) not in settings[7]:
            raise FirewallError(errors.NOT_ENABLED, "%s:%s" % (port, protocol))
        settings[7].remove((port,protocol))
        self.update(settings)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_SERVICE,
                         in_signature='ss', out_signature='b')
    @dbus_handle_exceptions
    def querySourcePort(self, port, protocol, sender=None): # pylint: disable=W0613
        port = dbus_to_python(port, str)
        protocol = dbus_to_python(protocol, str)
        log.debug1("%s.querySourcePort('%s', '%s')", self._log_prefix, port,
                   protocol)
        return (port,protocol) in self.getSettings()[7]

    # module

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_SERVICE,
                         out_signature='as')
    @dbus_handle_exceptions
    def getModules(self, sender=None): # pylint: disable=W0613
        log.debug1("%s.getModules()", self._log_prefix)
        return self.getSettings()[4]

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_SERVICE,
                         in_signature='as')
    @dbus_handle_exceptions
    def setModules(self, modules, sender=None):
        modules = dbus_to_python(modules, list)
        _modules = [ ]
        for module in modules:
            if module.startswith("nf_conntrack_"):
                module = module.replace("nf_conntrack_", "")
                if "_" in module:
                    module = module.replace("_", "-")
            _modules.append(module)
        modules = _modules
        log.debug1("%s.setModules('[%s]')", self._log_prefix,
                   ",".join(modules))
        self.parent.accessCheck(sender)
        settings = list(self.getSettings())
        settings[4] = modules
        self.update(settings)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_SERVICE,
                         in_signature='s')
    @dbus_handle_exceptions
    def addModule(self, module, sender=None):
        module = dbus_to_python(module, str)
        if module.startswith("nf_conntrack_"):
            module = module.replace("nf_conntrack_", "")
            if "_" in module:
                module = module.replace("_", "-")
        log.debug1("%s.addModule('%s')", self._log_prefix, module)
        self.parent.accessCheck(sender)
        settings = list(self.getSettings())
        if module in settings[4]:
            raise FirewallError(errors.ALREADY_ENABLED, module)
        settings[4].append(module)
        self.update(settings)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_SERVICE,
                         in_signature='s')
    @dbus_handle_exceptions
    def removeModule(self, module, sender=None):
        module = dbus_to_python(module, str)
        if module.startswith("nf_conntrack_"):
            module = module.replace("nf_conntrack_", "")
            if "_" in module:
                module = module.replace("_", "-")
        log.debug1("%s.removeModule('%s')", self._log_prefix, module)
        self.parent.accessCheck(sender)
        settings = list(self.getSettings())
        if module not in settings[4]:
            raise FirewallError(errors.NOT_ENABLED, module)
        settings[4].remove(module)
        self.update(settings)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_SERVICE,
                         in_signature='s', out_signature='b')
    @dbus_handle_exceptions
    def queryModule(self, module, sender=None): # pylint: disable=W0613
        module = dbus_to_python(module, str)
        if module.startswith("nf_conntrack_"):
            module = module.replace("nf_conntrack_", "")
            if "_" in module:
                module = module.replace("_", "-")
        log.debug1("%s.queryModule('%s')", self._log_prefix, module)
        return module in self.getSettings()[4]

    # destination

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_SERVICE,
                         out_signature='a{ss}')
    @dbus_handle_exceptions
    def getDestinations(self, sender=None): # pylint: disable=W0613
        log.debug1("%s.getDestinations()", self._log_prefix)
        return self.getSettings()[5]

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_SERVICE,
                         in_signature='a{ss}')
    @dbus_handle_exceptions
    def setDestinations(self, destinations, sender=None):
        destinations = dbus_to_python(destinations, dict)
        log.debug1("%s.setDestinations({ipv4:'%s', ipv6:'%s'})",
                   self._log_prefix, destinations.get('ipv4'),
                   destinations.get('ipv6'))
        self.parent.accessCheck(sender)
        settings = list(self.getSettings())
        settings[5] = destinations
        self.update(settings)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_SERVICE,
                         in_signature='s', out_signature='s')
    @dbus_handle_exceptions
    def getDestination(self, family, sender=None):
        family = dbus_to_python(family, str)
        log.debug1("%s.getDestination('%s')", self._log_prefix,
                   family)
        self.parent.accessCheck(sender)
        settings = list(self.getSettings())
        if family not in settings[5]:
            raise FirewallError(errors.NOT_ENABLED, family)
        return settings[5][family]

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_SERVICE,
                         in_signature='ss')
    @dbus_handle_exceptions
    def setDestination(self, family, address, sender=None):
        family = dbus_to_python(family, str)
        address = dbus_to_python(address, str)
        log.debug1("%s.setDestination('%s', '%s')", self._log_prefix,
                   family, address)
        self.parent.accessCheck(sender)
        settings = list(self.getSettings())
        if family in settings[5] and settings[5][family] == address:
            raise FirewallError(errors.ALREADY_ENABLED,
                                "'%s': '%s'" % (family, address))
        settings[5][family] = address
        self.update(settings)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_SERVICE,
                         in_signature='s')
    @dbus_handle_exceptions
    def removeDestination(self, family, sender=None):
        family = dbus_to_python(family, str)
        log.debug1("%s.removeDestination('%s')", self._log_prefix,
                   family)
        self.parent.accessCheck(sender)
        settings = list(self.getSettings())
        if family not in settings[5]:
            raise FirewallError(errors.NOT_ENABLED, family)
        del settings[5][family]
        self.update(settings)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_SERVICE,
                         in_signature='ss', out_signature='b')
    @dbus_handle_exceptions
    def queryDestination(self, family, address, sender=None): # pylint: disable=W0613
        family = dbus_to_python(family, str)
        address = dbus_to_python(address, str)
        log.debug1("%s.queryDestination('%s', '%s')", self._log_prefix,
                   family, address)
        settings = self.getSettings()
        return (family in settings[5] and
                address == settings[5][family])

    # includes

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_SERVICE,
                         out_signature='as')
    @dbus_handle_exceptions
    def getIncludes(self, sender=None):
        log.debug1("%s.getIncludes()", self._log_prefix)
        self.parent.accessCheck(sender)
        settings = self.config.get_service_config_dict(self.obj)
        return settings["includes"] if "includes" in settings else []

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_SERVICE,
                         in_signature='as')
    @dbus_handle_exceptions
    def setIncludes(self, includes, sender=None):
        includes = dbus_to_python(includes, list)
        log.debug1("%s.setIncludes('%s')", self._log_prefix, includes)
        self.parent.accessCheck(sender)
        settings = {"includes": includes[:]}
        self.obj = self.config.set_service_config_dict(self.obj, settings)
        self.Updated(self.obj.name)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_SERVICE,
                         in_signature='s')
    @dbus_handle_exceptions
    def addInclude(self, include, sender=None):
        include = dbus_to_python(include, str)
        log.debug1("%s.addInclude('%s')", self._log_prefix, include)
        self.parent.accessCheck(sender)
        settings = self.config.get_service_config_dict(self.obj)
        settings.setdefault("includes", []).append(include)
        self.obj = self.config.set_service_config_dict(self.obj, settings)
        self.Updated(self.obj.name)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_SERVICE,
                         in_signature='s')
    @dbus_handle_exceptions
    def removeInclude(self, include, sender=None):
        include = dbus_to_python(include, str)
        log.debug1("%s.removeInclude('%s')", self._log_prefix, include)
        self.parent.accessCheck(sender)
        settings = self.config.get_service_config_dict(self.obj)
        settings["includes"].remove(include)
        self.obj = self.config.set_service_config_dict(self.obj, settings)
        self.Updated(self.obj.name)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_SERVICE,
                         in_signature='s', out_signature='b')
    @dbus_handle_exceptions
    def queryInclude(self, include, sender=None):
        include = dbus_to_python(include, str)
        log.debug1("%s.queryInclude('%s')", self._log_prefix, include)
        settings = self.config.get_service_config_dict(self.obj)
        return include in settings["includes"] if "includes" in settings else False
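The methods above are exposed on firewalld's per-service configuration objects over D-Bus. As a minimal client-side sketch -- assuming firewalld's usual well-known bus name org.fedoraproject.FirewallD1, the config object path /org/fedoraproject/FirewallD1/config and the org.fedoraproject.FirewallD1.config[.service] interface names (assumptions here, not taken from this listing) -- reading and editing a service's includes could look like this:

# Client sketch using dbus-python. Bus name, object path and interface
# strings below are assumptions about a standard firewalld installation.
import dbus

bus = dbus.SystemBus()
cfg_obj = bus.get_object('org.fedoraproject.FirewallD1',
                         '/org/fedoraproject/FirewallD1/config')
config = dbus.Interface(cfg_obj, 'org.fedoraproject.FirewallD1.config')

# Resolve the per-service config object, then talk to the
# ...config.service interface that carries the methods defined above.
path = config.getServiceByName('ssh')
svc = dbus.Interface(bus.get_object('org.fedoraproject.FirewallD1', path),
                     'org.fedoraproject.FirewallD1.config.service')

print(list(svc.getIncludes()))
if not svc.queryInclude('some-other-service'):
    # addInclude() raises a D-Bus error if the include is already present
    svc.addInclude('some-other-service')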
site-packages/firewall/server/decorators.py
# -*- coding: utf-8 -*-
#
# Copyright (C) 2012-2016 Red Hat, Inc.
#
# Authors:
# Thomas Woerner <twoerner@redhat.com>
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program.  If not, see <http://www.gnu.org/licenses/>.

"""This module contains decorators for use with and without D-Bus"""

__all__ = ["FirewallDBusException", "handle_exceptions",
           "dbus_handle_exceptions", "dbus_service_method"]

import dbus
import dbus.service
import traceback
from dbus.exceptions import DBusException
from decorator import decorator

from firewall import config
from firewall.errors import FirewallError
from firewall import errors
from firewall.core.logger import log

############################################################################
#
# Exception handler decorators
#
############################################################################

class FirewallDBusException(dbus.DBusException):
    """FirewallDBusException"""
    _dbus_error_name = "%s.Exception" % config.dbus.DBUS_INTERFACE

@decorator
def handle_exceptions(func, *args, **kwargs):
    """Decorator to handle exceptions and log them. Used if not conneced
    to D-Bus.
    """
    try:
        return func(*args, **kwargs)
    except FirewallError as error:
        log.debug1(traceback.format_exc())
        log.error(error)
    except Exception:  # pylint: disable=W0703
        log.exception()

@decorator
def dbus_handle_exceptions(func, *args, **kwargs):
    """Decorator to handle exceptions, log and report them into D-Bus

    :Raises DBusException: on a firewall error code problems.
    """
    try:
        return func(*args, **kwargs)
    except FirewallError as error:
        code = FirewallError.get_code(str(error))
        if code in [ errors.ALREADY_ENABLED, errors.NOT_ENABLED,
                     errors.ZONE_ALREADY_SET, errors.ALREADY_SET ]:
            log.warning(str(error))
        else:
            log.debug1(traceback.format_exc())
            log.error(str(error))
        raise FirewallDBusException(str(error))
    except DBusException as ex:
        # do not log DBusExceptions again here; re-raise them unchanged
        raise ex
    except Exception as ex:
        log.exception()
        raise FirewallDBusException(str(ex))

def dbus_service_method(*args, **kwargs):
    """Add sender argument for D-Bus"""
    kwargs.setdefault("sender_keyword", "sender")
    return dbus.service.method(*args, **kwargs)
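
Taken together, the helpers above are meant to be stacked on D-Bus service methods: dbus_service_method() injects the caller as a sender keyword argument, and dbus_handle_exceptions() converts FirewallError (and unexpected exceptions) into FirewallDBusException for the client. A minimal sketch of that pattern follows; the interface name and method are made up purely for illustration, while the real services in this package use the config.dbus.DBUS_INTERFACE_* constants.

# Illustrative only: EXAMPLE_INTERFACE and setLabel() are hypothetical.
import dbus.service

from firewall import errors
from firewall.errors import FirewallError
from firewall.server.decorators import (dbus_handle_exceptions,
                                        dbus_service_method)

EXAMPLE_INTERFACE = "org.example.Demo1"

class Demo(dbus.service.Object):
    # Same stacking order as the services above: the D-Bus method wrapper
    # is outermost, the exception translator innermost.
    @dbus_service_method(EXAMPLE_INTERFACE, in_signature='s')
    @dbus_handle_exceptions
    def setLabel(self, label, sender=None):
        # 'sender' is filled in by dbus_service_method(); a FirewallError
        # raised here reaches the client as FirewallDBusException.
        if not label:
            raise FirewallError(errors.INVALID_VALUE, "empty label")

(Exporting the object on a bus connection is omitted from this sketch.)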
site-packages/firewall/server/__pycache__/config_zone.cpython-36.opt-1.pyc000064400000101501147511334700022477 0ustar003

]ûfW��@s�ddlmZddlZeejd<ddlZddlZddlZddlZddl	m
Z
ddlmZm
Z
mZddlmZddlmZddlmZdd	lmZdd
lmZddlmZmZmZddl	mZdd
lmZddl m!Z!m"Z"m#Z#m$Z$Gdd�dejj%j&�Z'dS)�)�GObjectNZgobject)�config)�dbus_to_python�%dbus_introspection_prepare_properties�!dbus_introspection_add_properties)�Zone)�ifcfg_set_zone_of_interface)�DEFAULT_ZONE_TARGET)�	Rich_Rule)�log)�handle_exceptions�dbus_handle_exceptions�dbus_service_method)�errors)�
FirewallError)�portStr�portInPortRange�coalescePortRange�breakPortRangecs�	eZdZdZdZejjZe	�fdd��Z
edd��Zedd��Z
ed	d
��Zeejddd
�ed�dd���Zeejddd
�ed�dd���Zejjjejj�eejdd�ed�dd����Zejjejdd�dd��Zejjjejj�eejdd�ed��fdd�	���Zeejjd d�ed�d!d"���Zeejjdd�ed�d#d$���Zd%d&�Z eejjd d�ed�d'd(���Z!eejjdd�ed�d)d*���Z"eejj�ed�d+d,���Z#ejjejjdd�ed-d.���Z$eejj�ed�d/d0���Z%ejjejjdd�ed1d2���Z&eejjdd�ed�d3d4���Z'ejjejjdd�ed5d6���Z(eejjdd�ed�d7d8���Z)eejjdd�ed�d9d:���Z*eejjdd�ed�d;d<���Z+eejjdd�ed�d=d>���Z,eejjdd�ed�d?d@���Z-eejjdd�ed�dAdB���Z.eejjdd�ed�dCdD���Z/eejjdd�ed�dEdF���Z0eejjdGd�ed�dHdI���Z1eejjdGd�ed�dJdK���Z2eejjdd�ed�dLdM���Z3eejjdd�ed�dNdO���Z4eejjddPd
�ed�dQdR���Z5eejjdSd�ed�dTdU���Z6eejjdSd�ed�dVdW���Z7eejjdd�ed�dXdY���Z8eejjdd�ed�dZd[���Z9eejjddPd
�ed�d\d]���Z:eejjdGd�ed�d^d_���Z;eejjdGd�ed�d`da���Z<eejjdd�ed�dbdc���Z=eejjdd�ed�ddde���Z>eejjddPd
�ed�dfdg���Z?eejjdSd�ed�dhdi���Z@eejjdSd�ed�djdk���ZAeejjdd�ed�dldm���ZBeejjdd�ed�dndo���ZCeejjddPd
�ed�dpdq���ZDeejjdGd�ed�drds���ZEeejjdGd�ed�dtdu���ZFeejjdd�ed�dvdw���ZGeejjdd�ed�dxdy���ZHeejjddPd
�ed�dzd{���ZIeejjdPd�ed�d|d}���ZJeejjdPd�ed�d~d���ZKeejj�ed�d�d����ZLeejj�ed�d�d����ZMeejjdPd�ed�d�d����ZNeejjdPd�ed�d�d����ZOeejjdPd�ed�d�d����ZPeejj�ed�d�d����ZQeejj�ed�d�d����ZReejjdPd�ed�d�d����ZSeejjd�d�ed�d�d����ZTeejjd�d�ed�d�d����ZUeejjd�d�ed�d�d����ZVeejjd�d�ed�d�d����ZWeejjd�dPd
�ed�d�d����ZXeejjdGd�ed�d�d����ZYeejjdGd�ed�d�d����ZZeejjdd�ed�d�d����Z[eejjdd�ed�d�d����Z\eejjddPd
�ed�d�d����Z]eejjdGd�ed�d�d����Z^eejjdGd�ed�d�d����Z_eejjdd�ed�d�d����Z`eejjdd�ed�d�d����ZaeejjddPd
�ed�d�d����ZbeejjdGd�ed�d�d����ZceejjdGd�e�dd�d����Zdeejjdd�e�dd�d����Zeeejjdd�e�dd�d����ZfeejjddPd
�e�dd�d����Zg�ZhS(�FirewallDConfigZonezFirewallD main classTcs\tt|�j||�||_||_||_||_|d|_|d|_d|j|_	t
|tjj�dS)Nr�zconfig.zone.%d)
�superr�__init__�parentr�obj�item_id�busname�path�_log_prefixr�dbus�DBUS_INTERFACE_CONFIG_ZONE)�selfrZconfZzoner�args�kwargs)�	__class__��!/usr/lib/python3.6/config_zone.pyr=s

zFirewallDConfigZone.__init__cCsdS)Nr%)r!r%r%r&�__del__JszFirewallDConfigZone.__del__cCs|j�dS)N)Zremove_from_connection)r!r%r%r&�
unregisterNszFirewallDConfigZone.unregistercCs�|dkrtj|jj�S|dkr,tj|jj�S|dkrBtj|jj�S|dkrXtj|jj�S|dkrntj|jj�Stj	j
d|��dS)N�name�filenamer�default�builtinzDorg.freedesktop.DBus.Error.InvalidArgs: Property '%s' does not exist)r�Stringrr)r*rZBooleanr+r,�
exceptions�
DBusException)r!�
property_namer%r%r&�
_get_propertyVsz!FirewallDConfigZone._get_propertyZss�v)�in_signature�
out_signatureNcCsLt|t�}t|t�}tjd|j||�|tjjkrBtjj	d|��|j
|�S)Nz%s.Get('%s', '%s')zJorg.freedesktop.DBus.Error.UnknownInterface: Interface '%s' does not exist)r�strr�debug1rrrr r.r/r1)r!�interface_namer0�senderr%r%r&�Getgs


zFirewallDConfigZone.Get�sza{sv}cCsdt|t�}tjd|j|�|tjjkr6tjj	d|��i}xd
D]}|j
|�||<q@Wtj|dd	�S)Nz%s.GetAll('%s')zJorg.freedesktop.DBus.Error.UnknownInterface: Interface '%s' does not existr)r*rr+r,Zsv)�	signature)r)r*rr+r,)rr5rr6rrrr r.r/r1Z
Dictionary)r!r7r8�ret�xr%r%r&�GetAllxs

zFirewallDConfigZone.GetAllZssv)r3cCslt|t�}t|t�}t|�}tjd|j|||�|jj|�|tjj	krXtj
jd|��tj
jd|��dS)Nz%s.Set('%s', '%s', '%s')zJorg.freedesktop.DBus.Error.UnknownInterface: Interface '%s' does not existzGorg.freedesktop.DBus.Error.PropertyReadOnly: Property '%s' is read-only)rr5rr6rr�accessCheckrrr r.r/)r!r7r0Z	new_valuer8r%r%r&�Set�s



zFirewallDConfigZone.Setzsa{sv}as)r;cCs2t|t�}t|�}t|�}tjd|j|||�dS)Nz&%s.PropertiesChanged('%s', '%s', '%s'))rr5rr6r)r!r7Zchanged_propertiesZinvalidated_propertiesr%r%r&�PropertiesChanged�s


z%FirewallDConfigZone.PropertiesChanged)r4cs8tjd|j�tt|�j|j|jj��}t	||t
jj�S)Nz%s.Introspect())
rZdebug2rrr�
IntrospectrrZget_busrrrr )r!r8�data)r$r%r&rB�s

zFirewallDConfigZone.Introspectz&(sssbsasa(ss)asba(ssss)asasasasa(ss)b)cCsDtjd|j�|jj|j�}|dtkr@t|�}d|d<t|�}|S)zget settings for zone
        z%s.getSettings()�r+)	rr6rrZget_zone_configrr	�list�tuple)r!r8�settings�	_settingsr%r%r&�getSettings�szFirewallDConfigZone.getSettingscCs4tjd|j�|jj|j�}|dtkr0d|d<|S)zget settings for zone
        z%s.getSettings2()�targetr+)rr6rr�get_zone_config_dictrr	)r!r8rGr%r%r&�getSettings2�s
z FirewallDConfigZone.getSettings2cCs|jj|j�}d|kr"t|d�nt�}d|kr<t|d�nt�}t|t�rzt|tjd��|}t|tjd��|}nDd|kr�t|d�nt�}d|kr�t|d�nt�}||}||}x$|D]}	|jj	|	�r�t
tj|	��q�Wx$|D]}
|jj
|
�r�t
tj|
��q�WdS)a
Assignment of interfaces/sources to zones is different from other
           zone settings in the sense that particular interface/zone can be
           part of only one zone. So make sure added interfaces/sources have
           not already been bound to another zone.�
interfaces�sourcesN)rrKr�set�
isinstancerFrZindex_ofrZgetZoneOfInterfacerrZ
ZONE_CONFLICTZgetZoneOfSource)r!rGZold_settingsZ
old_ifacesZold_sourcesZadded_ifacesZ
added_sourcesZ
new_ifacesZnew_sourcesZiface�sourcer%r%r&� _checkDuplicateInterfacesSources�s 


z4FirewallDConfigZone._checkDuplicateInterfacesSourcescCstt|�}tjd|j�|jj|�|ddkrFt|�}t|d<t|�}|j	|�|j
j|j|�|_|j
|jj�dS)z!update settings for zone
        z%s.update('...')rDr+N)rrr6rrr?rEr	rFrRrZset_zone_configr�Updatedr))r!rGr8rHr%r%r&�update�s
zFirewallDConfigZone.updatecCslt|�}tjd|j�|jj|�d|kr>|ddkr>t|d<|j|�|jj	|j
|�|_
|j|j
j�dS)z!update settings for zone
        z%s.update2('...')rJr+N)
rrr6rrr?r	rRrZset_zone_config_dictrrSr))r!rGr8r%r%r&�update2�s
zFirewallDConfigZone.update2cCs<tjd|j�|jj|�|jj|j�|_|j|jj	�dS)z/load default settings for builtin zone
        z%s.loadDefaults()N)
rr6rrr?rZload_zone_defaultsrrSr))r!r8r%r%r&�loadDefaultssz FirewallDConfigZone.loadDefaultscCstjd|j|f�dS)Nz%s.Updated('%s'))rr6r)r!r)r%r%r&rSszFirewallDConfigZone.UpdatedcCs:tjd|j�|jj|�|jj|j�|jj|j�dS)zremove zone
        z%s.removeZone()N)	rr6rrr?rZremove_zonerZ
removeZone)r!r8r%r%r&�removeszFirewallDConfigZone.removecCstjd|j|f�dS)Nz%s.Removed('%s'))rr6r)r!r)r%r%r&�Removed#szFirewallDConfigZone.RemovedcCsFt|t�}tjd|j|�|jj|�|jj|j	|�|_	|j
|�dS)zrename zone
        z%s.rename('%s')N)rr5rr6rrr?rZrename_zoner�Renamed)r!r)r8r%r%r&�rename*s

zFirewallDConfigZone.renamecCstjd|j|f�dS)Nz%s.Renamed('%s'))rr6r)r!r)r%r%r&rY6szFirewallDConfigZone.RenamedcCstjd|j�|j�dS)Nz%s.getVersion()r)rr6rrI)r!r8r%r%r&�
getVersion=szFirewallDConfigZone.getVersioncCsHt|t�}tjd|j|�|jj|�t|j��}||d<|j	|�dS)Nz%s.setVersion('%s')r)
rr5rr6rrr?rErIrT)r!�versionr8rGr%r%r&�
setVersionDs
zFirewallDConfigZone.setVersioncCstjd|j�|j�dS)Nz
%s.getShort()r)rr6rrI)r!r8r%r%r&�getShortQszFirewallDConfigZone.getShortcCsHt|t�}tjd|j|�|jj|�t|j��}||d<|j	|�dS)Nz%s.setShort('%s')r)
rr5rr6rrr?rErIrT)r!Zshortr8rGr%r%r&�setShortXs
zFirewallDConfigZone.setShortcCstjd|j�|j�dS)Nz%s.getDescription()�)rr6rrI)r!r8r%r%r&�getDescriptionesz"FirewallDConfigZone.getDescriptioncCsHt|t�}tjd|j|�|jj|�t|j��}||d<|j	|�dS)Nz%s.setDescription('%s')r`)
rr5rr6rrr?rErIrT)r!�descriptionr8rGr%r%r&�setDescriptionls
z"FirewallDConfigZone.setDescriptioncCs.tjd|j�|j�}|dtkr*|dSdS)Nz%s.getTarget()rDr+)rr6rrIr	)r!r8rGr%r%r&�	getTarget|szFirewallDConfigZone.getTargetcCsTt|t�}tjd|j|�|jj|�t|j��}|dkr>|nt	|d<|j
|�dS)Nz%s.setTarget('%s')r+rD)rr5rr6rrr?rErIr	rT)r!rJr8rGr%r%r&�	setTarget�s
zFirewallDConfigZone.setTarget�ascCstjd|j�|j�dS)Nz%s.getServices()�)rr6rrI)r!r8r%r%r&�getServices�szFirewallDConfigZone.getServicescCsNt|t�}tjd|jdj|��|jj|�t|j��}||d<|j	|�dS)Nz%s.setServices('[%s]')�,rg)
rrErr6r�joinrr?rIrT)r!Zservicesr8rGr%r%r&�setServices�s

zFirewallDConfigZone.setServicescCsft|t�}tjd|j|�|jj|�t|j��}||dkrJt	t
j|��|dj|�|j
|�dS)Nz%s.addService('%s')rg)rr5rr6rrr?rErIrr�ALREADY_ENABLED�appendrT)r!�servicer8rGr%r%r&�
addService�s
zFirewallDConfigZone.addServicecCsft|t�}tjd|j|�|jj|�t|j��}||dkrJt	t
j|��|dj|�|j
|�dS)Nz%s.removeService('%s')rg)rr5rr6rrr?rErIrr�NOT_ENABLEDrWrT)r!rnr8rGr%r%r&�
removeService�s
z!FirewallDConfigZone.removeService�bcCs*t|t�}tjd|j|�||j�dkS)Nz%s.queryService('%s')rg)rr5rr6rrI)r!rnr8r%r%r&�queryService�s
z FirewallDConfigZone.queryServiceza(ss)cCstjd|j�|j�dS)Nz
%s.getPorts()�)rr6rrI)r!r8r%r%r&�getPorts�szFirewallDConfigZone.getPortscCs�g}x6t|t�D](}t|t�r.|jt|��q|j|�qW|}tjd|jdjdd�|D���|j	j
|�t|j��}||d<|j|�dS)Nz%s.setPorts('[%s]')ricss"|]}d|d|dfVqdS)z('%s, '%s')rrNr%)�.0�portr%r%r&�	<genexpr>�sz/FirewallDConfigZone.setPorts.<locals>.<genexpr>rt)
rrErPrmrFrr6rrjrr?rIrT)r!�portsr8�_portsrwrGr%r%r&�setPorts�s

zFirewallDConfigZone.setPortsc
s�t|t�}t�t��tjd|j|��|jj|�t|j��}tt	�fdd�|d��}x.|D]&}t
||d�r^ttj
d|�f��q^Wt|dd�|D��\}}x$|D]}	|djt|	d	��f�q�Wx$|D]}	|djt|	d	��f�q�W|j|�dS)
Nz%s.addPort('%s', '%s')cs|d�kS)Nrr%)r=)�protocolr%r&�<lambda>�sz-FirewallDConfigZone.addPort.<locals>.<lambda>rtrz%s:%scSsg|]\}}|�qSr%r%)rv�_port�	_protocolr%r%r&�
<listcomp>�sz/FirewallDConfigZone.addPort.<locals>.<listcomp>�-)rr5rr6rrr?rErI�filterrrrrlrrWrrmrT)
r!rwr|r8rG�existing_port_ids�port_id�added_ranges�removed_ranges�ranger%)r|r&�addPort�s"




zFirewallDConfigZone.addPortc
s�t|t�}t�t��tjd|j|��|jj|�t|j��}tt	�fdd�|d��}x0|D]}t
||d�r^Pq^Wttj
d|�f��t|dd�|D��\}}x$|D]}	|djt|	d	��f�q�Wx$|D]}	|djt|	d	��f�q�W|j|�dS)
Nz%s.removePort('%s', '%s')cs|d�kS)Nrr%)r=)r|r%r&r}sz0FirewallDConfigZone.removePort.<locals>.<lambda>rtrz%s:%scSsg|]\}}|�qSr%r%)rvr~rr%r%r&r�sz2FirewallDConfigZone.removePort.<locals>.<listcomp>r�)rr5rr6rrr?rErIr�rrrrprrWrrmrT)
r!rwr|r8rGr�r�r�r�r�r%)r|r&�
removePort�s"




zFirewallDConfigZone.removePortcCsZt|t�}t|t�}tjd|j||�x.|j�dD]\}}t||�r4||kr4dSq4WdS)Nz%s.queryPort('%s', '%s')rtTF)rr5rr6rrIr)r!rwr|r8r~rr%r%r&�	queryPorts

zFirewallDConfigZone.queryPortcCstjd|j�|j�dS)Nz%s.getProtocols()�
)rr6rrI)r!r8r%r%r&�getProtocolssz FirewallDConfigZone.getProtocolscCsNt|t�}tjd|jdj|��|jj|�t|j��}||d<|j	|�dS)Nz%s.setProtocols('[%s]')rir�)
rrErr6rrjrr?rIrT)r!Z	protocolsr8rGr%r%r&�setProtocols&s

z FirewallDConfigZone.setProtocolscCsft|t�}tjd|j|�|jj|�t|j��}||dkrJt	t
j|��|dj|�|j
|�dS)Nz%s.addProtocol('%s')r�)rr5rr6rrr?rErIrrrlrmrT)r!r|r8rGr%r%r&�addProtocol2s
zFirewallDConfigZone.addProtocolcCsft|t�}tjd|j|�|jj|�t|j��}||dkrJt	t
j|��|dj|�|j
|�dS)Nz%s.removeProtocol('%s')r�)rr5rr6rrr?rErIrrrprWrT)r!r|r8rGr%r%r&�removeProtocol?s
z"FirewallDConfigZone.removeProtocolcCs*t|t�}tjd|j|�||j�dkS)Nz%s.queryProtocol('%s')r�)rr5rr6rrI)r!r|r8r%r%r&�
queryProtocolLs
z!FirewallDConfigZone.queryProtocolcCstjd|j�|j�dS)Nz%s.getSourcePorts()�)rr6rrI)r!r8r%r%r&�getSourcePortsVsz"FirewallDConfigZone.getSourcePortscCs�g}x6t|t�D](}t|t�r.|jt|��q|j|�qW|}tjd|jdjdd�|D���|j	j
|�t|j��}||d<|j|�dS)Nz%s.setSourcePorts('[%s]')ricss"|]}d|d|dfVqdS)z('%s, '%s')rrNr%)rvrwr%r%r&rxjsz5FirewallDConfigZone.setSourcePorts.<locals>.<genexpr>r�)
rrErPrmrFrr6rrjrr?rIrT)r!ryr8rzrwrGr%r%r&�setSourcePorts]s

z"FirewallDConfigZone.setSourcePortsc
s�t|t�}t�t��tjd|j|��|jj|�t|j��}tt	�fdd�|d��}x.|D]&}t
||d�r^ttj
d|�f��q^Wt|dd�|D��\}}x$|D]}	|djt|	d	��f�q�Wx$|D]}	|djt|	d	��f�q�W|j|�dS)
Nz%s.addSourcePort('%s', '%s')cs|d�kS)Nrr%)r=)r|r%r&r}zsz3FirewallDConfigZone.addSourcePort.<locals>.<lambda>r�rz%s:%scSsg|]\}}|�qSr%r%)rvr~rr%r%r&r�sz5FirewallDConfigZone.addSourcePort.<locals>.<listcomp>r�)rr5rr6rrr?rErIr�rrrrlrrWrrmrT)
r!rwr|r8rGr�r�r�r�r�r%)r|r&�
addSourcePortps"




z!FirewallDConfigZone.addSourcePortc
s�t|t�}t�t��tjd|j|��|jj|�t|j��}tt	�fdd�|d��}x0|D]}t
||d�r^Pq^Wttj
d|�f��t|dd�|D��\}}x$|D]}	|djt|	d	��f�q�Wx$|D]}	|djt|	d	��f�q�W|j|�dS)
Nz%s.removeSourcePort('%s', '%s')cs|d�kS)Nrr%)r=)r|r%r&r}�sz6FirewallDConfigZone.removeSourcePort.<locals>.<lambda>r�rz%s:%scSsg|]\}}|�qSr%r%)rvr~rr%r%r&r��sz8FirewallDConfigZone.removeSourcePort.<locals>.<listcomp>r�)rr5rr6rrr?rErIr�rrrrprrWrrmrT)
r!rwr|r8rGr�r�r�r�r�r%)r|r&�removeSourcePort�s"




z$FirewallDConfigZone.removeSourcePortcCsZt|t�}t|t�}tjd|j||�x.|j�dD]\}}t||�r4||kr4dSq4WdS)Nz%s.querySourcePort('%s', '%s')r�TF)rr5rr6rrIr)r!rwr|r8r~rr%r%r&�querySourcePort�s

z#FirewallDConfigZone.querySourcePortcCstjd|j�|j�dS)Nz%s.getIcmpBlocks()�)rr6rrI)r!r8r%r%r&�
getIcmpBlocks�sz!FirewallDConfigZone.getIcmpBlockscCsNt|t�}tjd|jdj|��|jj|�t|j��}||d<|j	|�dS)Nz%s.setIcmpBlocks('[%s]')rir�)
rrErr6rrjrr?rIrT)r!Z	icmptypesr8rGr%r%r&�
setIcmpBlocks�s

z!FirewallDConfigZone.setIcmpBlockscCsft|t�}tjd|j|�|jj|�t|j��}||dkrJt	t
j|��|dj|�|j
|�dS)Nz%s.addIcmpBlock('%s')r�)rr5rr6rrr?rErIrrrlrmrT)r!�icmptyper8rGr%r%r&�addIcmpBlock�s
z FirewallDConfigZone.addIcmpBlockcCsft|t�}tjd|j|�|jj|�t|j��}||dkrJt	t
j|��|dj|�|j
|�dS)Nz%s.removeIcmpBlock('%s')r�)rr5rr6rrr?rErIrrrprWrT)r!r�r8rGr%r%r&�removeIcmpBlock�s
z#FirewallDConfigZone.removeIcmpBlockcCs*t|t�}tjd|j|�||j�dkS)Nz%s.queryIcmpBlock('%s')r�)rr5rr6rrI)r!r�r8r%r%r&�queryIcmpBlock�s
z"FirewallDConfigZone.queryIcmpBlockcCstjd|j�|j�dS)Nz%s.getIcmpBlockInversion()�)rr6rrI)r!r8r%r%r&�getIcmpBlockInversion�sz)FirewallDConfigZone.getIcmpBlockInversioncCsHt|t�}tjd|j|�|jj|�t|j��}||d<|j	|�dS)Nz%s.setIcmpBlockInversion('%s')r�)
r�boolrr6rrr?rErIrT)r!�flagr8rGr%r%r&�setIcmpBlockInversion�s
z)FirewallDConfigZone.setIcmpBlockInversioncCsPtjd|j�|jj|�t|j��}|dr:ttj	d��d|d<|j
|�dS)Nz%s.addIcmpBlockInversion()r�zicmp-block-inversionT)rr6rrr?rErIrrrlrT)r!r8rGr%r%r&�addIcmpBlockInversion�sz)FirewallDConfigZone.addIcmpBlockInversioncCsPtjd|j�|jj|�t|j��}|ds:ttj	d��d|d<|j
|�dS)Nz%s.removeIcmpBlockInversion()r�zicmp-block-inversionF)rr6rrr?rErIrrrprT)r!r8rGr%r%r&�removeIcmpBlockInversionsz,FirewallDConfigZone.removeIcmpBlockInversioncCstjd|j�|j�dS)Nz%s.queryIcmpBlockInversion()r�)rr6rrI)r!r8r%r%r&�queryIcmpBlockInversionsz+FirewallDConfigZone.queryIcmpBlockInversioncCstjd|j�|j�dS)Nz%s.getMasquerade()�)rr6rrI)r!r8r%r%r&�
getMasqueradesz!FirewallDConfigZone.getMasqueradecCsHt|t�}tjd|j|�|jj|�t|j��}||d<|j	|�dS)Nz%s.setMasquerade('%s')r�)
rr�rr6rrr?rErIrT)r!�
masquerader8rGr%r%r&�
setMasquerades
z!FirewallDConfigZone.setMasqueradecCsPtjd|j�|jj|�t|j��}|dr:ttj	d��d|d<|j
|�dS)Nz%s.addMasquerade()r�r�T)rr6rrr?rErIrrrlrT)r!r8rGr%r%r&�
addMasquerade'sz!FirewallDConfigZone.addMasqueradecCsPtjd|j�|jj|�t|j��}|ds:ttj	d��d|d<|j
|�dS)Nz%s.removeMasquerade()r�r�F)rr6rrr?rErIrrrprT)r!r8rGr%r%r&�removeMasquerade2sz$FirewallDConfigZone.removeMasqueradecCstjd|j�|j�dS)Nz%s.queryMasquerade()r�)rr6rrI)r!r8r%r%r&�queryMasquerade=sz#FirewallDConfigZone.queryMasqueradeza(ssss)cCstjd|j�|j�dS)Nz%s.getForwardPorts()�	)rr6rrI)r!r8r%r%r&�getForwardPortsFsz#FirewallDConfigZone.getForwardPortscCs�g}x6t|t�D](}t|t�r.|jt|��q|j|�qW|}tjd|jdjdd�|D���|j	j
|�t|j��}||d<|j|�dS)Nz%s.setForwardPorts('[%s]')ricss.|]&}d|d|d|d|dfVqdS)z('%s, '%s', '%s', '%s')rrr`�Nr%)rvrwr%r%r&rxZsz6FirewallDConfigZone.setForwardPorts.<locals>.<genexpr>r�)
rrErPrmrFrr6rrjrr?rIrT)r!ryr8rzrwrGr%r%r&�setForwardPortsMs


z#FirewallDConfigZone.setForwardPortsZsssscCs�t|t�}t|t�}t|t�}t|t�}tjd|j||||�|jj|�||t|�t|�f}t|j��}||dkr�t	t
jd||||f��|dj|�|j
|�dS)Nz)%s.addForwardPort('%s', '%s', '%s', '%s')r�z%s:%s:%s:%s)rr5rr6rrr?rErIrrrlrmrT)r!rwr|�toport�toaddrr8�fwp_idrGr%r%r&�addForwardPortas




z"FirewallDConfigZone.addForwardPortcCs�t|t�}t|t�}t|t�}t|t�}tjd|j||||�|jj|�||t|�t|�f}t|j��}||dkr�t	t
jd||||f��|dj|�|j
|�dS)Nz,%s.removeForwardPort('%s', '%s', '%s', '%s')r�z%s:%s:%s:%s)rr5rr6rrr?rErIrrrprWrT)r!rwr|r�r�r8r�rGr%r%r&�removeForwardPortus




z%FirewallDConfigZone.removeForwardPortcCsbt|t�}t|t�}t|t�}t|t�}tjd|j||||�||t|�t|�f}||j�dkS)Nz+%s.queryForwardPort('%s', '%s', '%s', '%s')r�)rr5rr6rrI)r!rwr|r�r�r8r�r%r%r&�queryForwardPort�s



z$FirewallDConfigZone.queryForwardPortcCstjd|j�|j�dS)Nz%s.getInterfaces()�
)rr6rrI)r!r8r%r%r&�
getInterfaces�sz!FirewallDConfigZone.getInterfacescCsNt|t�}tjd|jdj|��|jj|�t|j��}||d<|j	|�dS)Nz%s.setInterfaces('[%s]')rir�)
rrErr6rrjrr?rIrT)r!rMr8rGr%r%r&�
setInterfaces�s

z!FirewallDConfigZone.setInterfacescCstt|t�}tjd|j|�|jj|�t|j��}||dkrJt	t
j|��|dj|�|j
|�t|jj|�dS)Nz%s.addInterface('%s')r�)rr5rr6rrr?rErIrrrlrmrTrrr))r!�	interfacer8rGr%r%r&�addInterface�s

z FirewallDConfigZone.addInterfacecCspt|t�}tjd|j|�|jj|�t|j��}||dkrJt	t
j|��|dj|�|j
|�td|�dS)Nz%s.removeInterface('%s')r��)rr5rr6rrr?rErIrrrprWrTr)r!r�r8rGr%r%r&�removeInterface�s

z#FirewallDConfigZone.removeInterfacecCs*t|t�}tjd|j|�||j�dkS)Nz%s.queryInterface('%s')r�)rr5rr6rrI)r!r�r8r%r%r&�queryInterface�s
z"FirewallDConfigZone.queryInterfacecCstjd|j�|j�dS)Nz%s.getSources()�)rr6rrI)r!r8r%r%r&�
getSources�szFirewallDConfigZone.getSourcescCsNt|t�}tjd|jdj|��|jj|�t|j��}||d<|j	|�dS)Nz%s.setSources('[%s]')rir�)
rrErr6rrjrr?rIrT)r!rNr8rGr%r%r&�
setSources�s

zFirewallDConfigZone.setSourcescCsft|t�}tjd|j|�|jj|�t|j��}||dkrJt	t
j|��|dj|�|j
|�dS)Nz%s.addSource('%s')r�)rr5rr6rrr?rErIrrrlrmrT)r!rQr8rGr%r%r&�	addSource�s
zFirewallDConfigZone.addSourcecCsft|t�}tjd|j|�|jj|�t|j��}||dkrJt	t
j|��|dj|�|j
|�dS)Nz%s.removeSource('%s')r�)rr5rr6rrr?rErIrrrprWrT)r!rQr8rGr%r%r&�removeSource�s
z FirewallDConfigZone.removeSourcecCs*t|t�}tjd|j|�||j�dkS)Nz%s.querySource('%s')r�)rr5rr6rrI)r!rQr8r%r%r&�querySources
zFirewallDConfigZone.querySourcecCstjd|j�|j�dS)Nz%s.getRichRules()�)rr6rrI)r!r8r%r%r&�getRichRulessz FirewallDConfigZone.getRichRulescCs\t|t�}tjd|jdj|��|jj|�t|j��}dd�|D�}||d<|j	|�dS)Nz%s.setRichRules('[%s]')ricSsg|]}tt|d���qS))�rule_str)r5r
)rv�rr%r%r&r�sz4FirewallDConfigZone.setRichRules.<locals>.<listcomp>r�)
rrErr6rrjrr?rIrT)r!Zrulesr8rGr%r%r&�setRichRuless

z FirewallDConfigZone.setRichRulescCstt|t�}tjd|j|�|jj|�t|j��}tt	|d��}||dkrXt
tj|��|dj
|�|j|�dS)Nz%s.addRichRule('%s'))r�r�)rr5rr6rrr?rErIr
rrrlrmrT)r!�ruler8rGr�r%r%r&�addRichRule s
zFirewallDConfigZone.addRichRulecCstt|t�}tjd|j|�|jj|�t|j��}tt	|d��}||dkrXt
tj|��|dj
|�|j|�dS)Nz%s.removeRichRule('%s'))r�r�)rr5rr6rrr?rErIr
rrrprWrT)r!r�r8rGr�r%r%r&�removeRichRule.s
z"FirewallDConfigZone.removeRichRulecCs8t|t�}tjd|j|�tt|d��}||j�dkS)Nz%s.queryRichRule('%s'))r�r�)rr5rr6rr
rI)r!r�r8r�r%r%r&�
queryRichRule<s
z!FirewallDConfigZone.queryRichRule)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)i�__name__�
__module__�__qualname__�__doc__Z
persistentrrZPK_ACTION_CONFIGZdefault_polkit_auth_requiredrrr
r'r(r1rZPROPERTIES_IFACEr9r>�slipZpolkitZrequire_authr@rn�signalrAZPK_ACTION_INFOZINTROSPECTABLE_IFACErBr rIrLrRrTrUrVrSrWrXrZrYr[r]r^r_rarcrdrerhrkrorqrsrur{r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r��
__classcell__r%r%)r$r&r5sf
		

	



	


	


	


	
	
	
			


r)(Z
gi.repositoryr�sys�modulesrZdbus.serviceZ	slip.dbusr�Zslip.dbus.serviceZfirewallrZfirewall.dbus_utilsrrrZfirewall.core.io.zonerZfirewall.core.fw_ifcfgrZfirewall.core.baser	Zfirewall.core.richr
Zfirewall.core.loggerrZfirewall.server.decoratorsrr
rrZfirewall.errorsrZfirewall.functionsrrrrrnZObjectrr%r%r%r&�<module>s$
	site-packages/firewall/server/__pycache__/config_ipset.cpython-36.pyc000064400000031650147511334700021720 0ustar003

]ûfzI�@s�ddlmZddlZeejd<ddlZddlZddlZddlZddl	m
Z
ddlmZm
Z
mZddlmZddlmZmZmZmZddlmZdd	lmZmZmZdd
l	mZddlmZGdd
�d
ejjj �Z!dS)�)�GObjectNZgobject)�config)�dbus_to_python�%dbus_introspection_prepare_properties�!dbus_introspection_add_properties)�IPSet)�IPSET_TYPES�normalize_ipset_entry�check_entry_overlaps_existing�check_for_overlapping_entries)�log)�handle_exceptions�dbus_handle_exceptions�dbus_service_method)�errors)�
FirewallErrorcseZdZdZdZejjZe	�fdd��Z
edd��Zedd��Z
ed	d
��Zeejddd
�edWdd���Zeejddd
�edXdd���Zejjjejj�eejdd�edYdd����Zejjejdd�dd��Zejjjejj�eejdd�edZ�fdd�	���Zeejjejd�ed[d d!���Z eejjejd�ed\d"d#���Z!eejj�ed]d$d%���Z"ejjejjdd�ed&d'���Z#eejj�ed^d(d)���Z$ejjejjdd�ed*d+���Z%eejjdd�ed_d,d-���Z&ejjejjdd�ed.d/���Z'eejjdd�ed`d0d1���Z(eejjdd�edad2d3���Z)eejjdd�edbd4d5���Z*eejjdd�edcd6d7���Z+eejjdd�eddd8d9���Z,eejjdd�eded:d;���Z-eejjdd�edfd<d=���Z.eejjdd�edgd>d?���Z/eejjd@d�edhdAdB���Z0eejjd@d�edidCdD���Z1eejjdd�edjdEdF���Z2eejjdd�edkdGdH���Z3eejjddId
�edldJdK���Z4eejjdLd�edmdMdN���Z5eejjdLd�edndOdP���Z6eejjdd�edodQdR���Z7eejjdd�edpdSdT���Z8eejjddId
�edqdUdV���Z9�Z:S)r�FirewallDConfigIPSetzFirewallD main classTcs\tt|�j||�||_||_||_||_|d|_|d|_d|j|_	t
|tjj�dS)Nr�zconfig.ipset.%d)
�superr�__init__�parentr�obj�item_id�busname�path�_log_prefixr�dbus�DBUS_INTERFACE_CONFIG_IPSET)�selfrZconfZipsetr�args�kwargs)�	__class__��"/usr/lib/python3.6/config_ipset.pyr;s

zFirewallDConfigIPSet.__init__cCsdS)Nr")rr"r"r#�__del__HszFirewallDConfigIPSet.__del__cCs|j�dS)N)Zremove_from_connection)rr"r"r#�
unregisterLszFirewallDConfigIPSet.unregistercCs�|dkrtj|jj�S|dkr,tj|jj�S|dkrBtj|jj�S|dkrXtj|jj�S|dkrntj|jj�Stj	j
d|��dS)N�name�filenamer�default�builtinzDorg.freedesktop.DBus.Error.InvalidArgs: Property '%s' does not exist)r�Stringrr&r'rZBooleanr(r)�
exceptions�
DBusException)r�
property_namer"r"r#�
_get_propertyTsz"FirewallDConfigIPSet._get_propertyZss�v)�in_signature�
out_signatureNcCsLt|t�}t|t�}tjd|j||�|tjjkrBtjj	d|��|j
|�S)Nz%s.Get('%s', '%s')zJorg.freedesktop.DBus.Error.UnknownInterface: Interface '%s' does not exist)r�strr�debug1rrrrr+r,r.)r�interface_namer-�senderr"r"r#�Getes


zFirewallDConfigIPSet.Get�sza{sv}cCsdt|t�}tjd|j|�|tjjkr6tjj	d|��i}xd
D]}|j
|�||<q@Wtj|dd	�S)Nz%s.GetAll('%s')zJorg.freedesktop.DBus.Error.UnknownInterface: Interface '%s' does not existr&r'rr(r)Zsv)�	signature)r&r'rr(r))rr2rr3rrrrr+r,r.Z
Dictionary)rr4r5�ret�xr"r"r#�GetAllvs

zFirewallDConfigIPSet.GetAllZssv)r0cCslt|t�}t|t�}t|�}tjd|j|||�|jj|�|tjj	krXtj
jd|��tj
jd|��dS)Nz%s.Set('%s', '%s', '%s')zJorg.freedesktop.DBus.Error.UnknownInterface: Interface '%s' does not existzGorg.freedesktop.DBus.Error.PropertyReadOnly: Property '%s' is read-only)rr2rr3rr�accessCheckrrrr+r,)rr4r-Z	new_valuer5r"r"r#�Set�s



zFirewallDConfigIPSet.Setzsa{sv}as)r8cCs2t|t�}t|�}t|�}tjd|j|||�dS)Nz&%s.PropertiesChanged('%s', '%s', '%s'))rr2rr3r)rr4Zchanged_propertiesZinvalidated_propertiesr"r"r#�PropertiesChanged�s


z&FirewallDConfigIPSet.PropertiesChanged)r1cs8tjd|j�tt|�j|j|jj��}t	||t
jj�S)Nz%s.Introspect())
rZdebug2rrr�
IntrospectrrZget_busrrrr)rr5�data)r!r"r#r?�s

zFirewallDConfigIPSet.IntrospectcCstjd|j�|jj|j�S)zget settings for ipset
        z%s.getSettings())rr3rrZget_ipset_configr)rr5r"r"r#�getSettings�sz FirewallDConfigIPSet.getSettingscCsFt|�}tjd|j�|jj|�|jj|j|�|_|j	|jj
�dS)z"update settings for ipset
        z%s.update('...')N)rrr3rrr<rZset_ipset_configr�Updatedr&)r�settingsr5r"r"r#�update�s
zFirewallDConfigIPSet.updatecCs<tjd|j�|jj|�|jj|j�|_|j|jj	�dS)z0load default settings for builtin ipset
        z%s.loadDefaults()N)
rr3rrr<rZload_ipset_defaultsrrBr&)rr5r"r"r#�loadDefaults�sz!FirewallDConfigIPSet.loadDefaultscCstjd|j|f�dS)Nz%s.Updated('%s'))rr3r)rr&r"r"r#rB�szFirewallDConfigIPSet.UpdatedcCs:tjd|j�|jj|�|jj|j�|jj|j�dS)zremove ipset
        z%s.remove()N)	rr3rrr<rZremove_ipsetrZremoveIPSet)rr5r"r"r#�remove�szFirewallDConfigIPSet.removecCstjd|j|f�dS)Nz%s.Removed('%s'))rr3r)rr&r"r"r#�Removed�szFirewallDConfigIPSet.RemovedcCsFt|t�}tjd|j|�|jj|�|jj|j	|�|_	|j
|�dS)zrename ipset
        z%s.rename('%s')N)rr2rr3rrr<rZrename_ipsetr�Renamed)rr&r5r"r"r#�rename�s

zFirewallDConfigIPSet.renamecCstjd|j|f�dS)Nz%s.Renamed('%s'))rr3r)rr&r"r"r#rH�szFirewallDConfigIPSet.RenamedcCstjd|j�|j�dS)Nz%s.getVersion()r)rr3rrA)rr5r"r"r#�
getVersionszFirewallDConfigIPSet.getVersioncCsHt|t�}tjd|j|�|jj|�t|j��}||d<|j	|�dS)Nz%s.setVersion('%s')r)
rr2rr3rrr<�listrArD)r�versionr5rCr"r"r#�
setVersions
zFirewallDConfigIPSet.setVersioncCstjd|j�|j�dS)Nz
%s.getShort()r)rr3rrA)rr5r"r"r#�getShortszFirewallDConfigIPSet.getShortcCsHt|t�}tjd|j|�|jj|�t|j��}||d<|j	|�dS)Nz%s.setShort('%s')r)
rr2rr3rrr<rKrArD)rZshortr5rCr"r"r#�setShorts
zFirewallDConfigIPSet.setShortcCstjd|j�|j�dS)Nz%s.getDescription()�)rr3rrA)rr5r"r"r#�getDescription(sz#FirewallDConfigIPSet.getDescriptioncCsHt|t�}tjd|j|�|jj|�t|j��}||d<|j	|�dS)Nz%s.setDescription('%s')rP)
rr2rr3rrr<rKrArD)r�descriptionr5rCr"r"r#�setDescription/s

z#FirewallDConfigIPSet.setDescriptioncCstjd|j�|j�dS)Nz%s.getType()�)rr3rrA)rr5r"r"r#�getType=szFirewallDConfigIPSet.getTypecCs\t|t�}tjd|j|�|jj|�|tkr:tt	j
|��t|j��}||d<|j
|�dS)Nz%s.setType('%s')rT)rr2rr3rrr<rrrZINVALID_TYPErKrArD)rZ
ipset_typer5rCr"r"r#�setTypeDs
zFirewallDConfigIPSet.setTypeza{ss}cCstjd|j�|j�dS)Nz%s.getOptions()�)rr3rrA)rr5r"r"r#�
getOptionsSszFirewallDConfigIPSet.getOptionscCsLt|t�}tjd|jt|��|jj|�t|j	��}||d<|j
|�dS)Nz%s.setOptions('[%s]')rW)r�dictrr3r�reprrr<rKrArD)rZoptionsr5rCr"r"r#�
setOptionsZs


zFirewallDConfigIPSet.setOptionscCs�t|t�}t|t�}tjd|j||�|jj|�t|j��}||dkrn|d||krnt	t
jd||f��||d|<|j|�dS)Nz%s.addOption('%s', '%s')rWz
'%s': '%s')
rr2rr3rrr<rKrArr�ALREADY_ENABLEDrD)r�key�valuer5rCr"r"r#�	addOptionfs

zFirewallDConfigIPSet.addOptioncCsbt|t�}tjd|j|�|jj|�t|j��}||dkrJt	t
j|��|d|=|j|�dS)Nz%s.removeOption('%s')rW)
rr2rr3rrr<rKrArr�NOT_ENABLEDrD)rr]r5rCr"r"r#�removeOptionus

z!FirewallDConfigIPSet.removeOption�bcCsNt|t�}t|t�}tjd|j||�t|j��}||dkoL|d||kS)Nz%s.queryOption('%s', '%s')rW)rr2rr3rrKrA)rr]r^r5rCr"r"r#�queryOption�s

z FirewallDConfigIPSet.queryOption�ascCstjd|j�|j�dS)Nz%s.getEntries()�)rr3rrA)rr5r"r"r#�
getEntries�szFirewallDConfigIPSet.getEntriescCs|t|t�}t|�tjd|jdj|��|jj|�t|j	��}d|dkrf|dddkrft
tj��||d<|j
|�dS)Nz%s.setEntries('[%s]')�,�timeoutrW�0re)rrKrrr3r�joinrr<rArr�IPSET_WITH_TIMEOUTrD)rZentriesr5rCr"r"r#�
setEntries�s


zFirewallDConfigIPSet.setEntriescCs�t|t�}t|�}tjd|j|�|jj|�t|j	��}d|dkr`|dddkr`t
tj��||dkrxt
tj
|��t||d�|dj|�|j|�dS)Nz%s.addEntry('%s')rhrWrire)rr2r	rr3rrr<rKrArrrkr\r
�appendrD)r�entryr5rCr"r"r#�addEntry�s

zFirewallDConfigIPSet.addEntrycCs�t|t�}t|�}tjd|j|�|jj|�t|j	��}d|dkr`|dddkr`t
tj��||dkrxt
tj
|��|dj|�|j|�dS)Nz%s.removeEntry('%s')rhrWrire)rr2r	rr3rrr<rKrArrrkr`rFrD)rrnr5rCr"r"r#�removeEntry�s

z FirewallDConfigIPSet.removeEntrycCs`t|t�}t|�}tjd|j|�t|j��}d|dkrT|dddkrTtt	j
��||dkS)Nz%s.queryEntry('%s')rhrWrire)rr2r	rr3rrKrArrrk)rrnr5rCr"r"r#�
queryEntry�s

zFirewallDConfigIPSet.queryEntry)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N);�__name__�
__module__�__qualname__�__doc__Z
persistentrrZPK_ACTION_CONFIGZdefault_polkit_auth_requiredr
rrr$r%r.rZPROPERTIES_IFACEr6r;�slipZpolkitZrequire_authr=�service�signalr>ZPK_ACTION_INFOZINTROSPECTABLE_IFACEr?rrZDBUS_SIGNATURErArDrErBrFrGrIrHrJrMrNrOrQrSrUrVrXr[r_rarcrfrlrorprq�
__classcell__r"r")r!r#r3s�
		




	


r)"Z
gi.repositoryr�sys�modulesrZdbus.serviceZ	slip.dbusrvZslip.dbus.serviceZfirewallrZfirewall.dbus_utilsrrrZfirewall.core.io.ipsetrZfirewall.core.ipsetrr	r
rZfirewall.core.loggerrZfirewall.server.decoratorsr
rrrZfirewall.errorsrrwZObjectrr"r"r"r#�<module>s
site-packages/firewall/server/__pycache__/config.cpython-36.opt-1.pyc000064400000127767147511334700021472 0ustar003

]ûf��@sdddlmZddlZeejd<ddlZddlZddlZddlZddl	Zddl
mZddlm
Z
ddlmZddlmZddlmZmZmZdd	lmZdd
lmZddlmZddlmZdd
lmZddl m!Z!ddl"m#Z#ddl$m%Z%ddl&m'Z'ddl(m)Z)ddl*m+Z+ddl,m-Z-m.Z.m/Z/m0Z0m1Z1m2Z2m3Z3ddl
m4Z4ddl5m6Z6Gdd�dejj7j8�Z9dS)�)�GObjectNZgobject)�config)�DEFAULT_ZONE_TARGET)�Watcher)�log)�handle_exceptions�dbus_handle_exceptions�dbus_service_method)�FirewallDConfigIcmpType)�FirewallDConfigService)�FirewallDConfigZone)�FirewallDConfigPolicy)�FirewallDConfigIPSet)�FirewallDConfigHelper)�IcmpType)�IPSet)�Helper)�LockdownWhitelist)�Direct)�dbus_to_python�command_of_sender�context_of_sender�
uid_of_sender�user_of_uid�%dbus_introspection_prepare_properties�!dbus_introspection_add_properties)�errors)�
FirewallErrorcs@eZdZdZdZejjZe	�fdd��Z
e	dd��Ze	dd��Ze	d	d
��Z
e	dd��Ze	d
d��Ze	dd��Ze	dd��Ze	dd��Ze	dd��Ze	dd��Ze	dd��Ze	dd��Ze	dd��Ze	dd ��Ze	d!d"��Ze	d#d$��Ze	d%d&��Ze	d'd(��Ze	d)d*��Ze	d+d,��Ze	d-d.��Ze	d/d0��Z e!d1d2��Z"e!d3d4��Z#e!d5d6��Z$e%ej&d7d8d9�e!d�d;d<���Z'e%ej&d=d>d9�e!d�d?d@���Z(e)jj*j+ejj�e%ej&dAdB�e!d�dCdD����Z,ej-j.ej&dEdF�dGdH��Z/e)jj*j+ejj0�e%ej1d=dI�e!d�fdJdK�	���Z2e%ejj3e4j5dI�e!d�dLdM���Z6e%ejj3e4j5dB�e!d�dNdO���Z7ej-j.ejj3�e!dPdQ���Z8e%ejj3d=dB�e!d�dRdS���Z9e%ejj3d=dB�e!d�dTdU���Z:e%ejj3d=dVd9�e!d�dWdX���Z;e%ejj3dYdI�e!d�dZd[���Z<e%ejj3d=dB�e!d�d\d]���Z=e%ejj3d=dB�e!d�d^d_���Z>e%ejj3d=dVd9�e!d�d`da���Z?e%ejj3dYdI�e!d�dbdc���Z@e%ejj3d=dB�e!d�ddde���ZAe%ejj3d=dB�e!d�dfdg���ZBe%ejj3d=dVd9�e!d�dhdi���ZCe%ejj3dYdI�e!d�djdk���ZDe%ejj3dldB�e!�ddmdn���ZEe%ejj3dldB�e!�ddodp���ZFe%ejj3dldVd9�e!�ddqdr���ZGe%ejj3dsdI�e!�ddtdu���ZHe%ejjIdvdI�e!�ddwdx���ZJe%ejjIdYdI�e!�ddydz���ZKe%ejjId=d{d9�e!�dd|d}���ZLe%ejjId=eMj5d{d9�e!�dd~d���ZNej-j.ejjId=dF�e!d�d����ZOe%ejjIdvdI�e!�dd�d����ZPe%ejjIdYdI�e!�d	d�d����ZQe%ejjId=d{d9�e!�d
d�d����ZRe%ejjId=eSj5d{d9�e!�dd�d����ZTej-j.ejjId=dF�e!d�d����ZUe%ejjIdvdI�e!�dd�d����ZVe%ejjIdYdI�e!�d
d�d����ZWe%ejjId=d{d9�e!�dd�d����ZXe%ejjId�d{d9�e!�dd�d����ZYe%ejjId�d{d9�e!�dd�d����ZZej-j.ejjId=dF�e!d�d����Z[e%ejjIdvdI�e!�dd�d����Z\e%ejjIdYdI�e!�dd�d����Z]e%ejjId=d{d9�e!�dd�d����Z^e%ejjId=d=d9�e!�dd�d����Z_e%ejjId=d=d9�e!�dd�d����Z`e%ejjId�d{d9�e!�dd�d����Zae%ejjId�d{d9�e!�dd�d����Zbej-j.ejjId=dF�e!d�d����Zce%ejjIdvdI�e!�dd�d����Zde%ejjIdYdI�e!�dd�d����Zee%ejjId=d{d9�e!�dd�d����Zfe%ejjId�d{d9�e!�dd�d����Zgej-j.ejjId=dF�e!d�d����Zhe%ejjIdvdI�e!�dd�d����Zie%ejjIdYdI�e!�dd�d����Zje%ejjId=d{d9�e!�dd�d����Zke%ejjId=elj5d{d9�e!�dd�d����Zmej-j.ejjId=dF�e!d�d����Zne%ejjoepj5dI�e!�d d�d����Zqe%ejjoepj5dB�e!�d!d�d„��Zrej-j.ejjo�e!d�dĄ��Zse%ejjod�dB�e!�d"d�dDŽ��Zte%ejjod�dB�e!�d#d�dɄ��Zue%ejjod�dVd9�e!�d$d�d˄��Zve%ejjod7dYd9�e!�d%d�d̈́��Zwe%ejjod�d�d9�e!�d&d�dф��Zxe%ejjod�dB�e!�d'd�dԄ��Zye%ejjod�dB�e!�d(d�dք��Zze%ejjod�dVd9�e!�d)d�d؄��Z{e%ejjod�dB�e!�d*d�dڄ��Z|e%ejjod�d�d9�e!�d+d�d݄��Z}e%ejjod�d�d9�e!�d,d�d���Z~e%ejjod�dB�e!�d-d�d���Ze%ejjod�dB�e!�d.d�d���Z�e%ejjod�dVd9�e!�d/d�d���Z�e%ejjod=d�d9�e!�d0d�d���Z�e%ejjod�dI�e!�d1d�d���Z��Z�S(2�FirewallDConfigzFirewallD main classTcs�tt|�j||�||_|d|_|d|_|j�t|jd�|_	|j	j
tj�|j	j
tj�|j	j
tj
�|j	j
tj�|j	j
tj�|j	j
tj�|j	j
tj�|j	j
tj�|j	j
tj�|j	j
tj�|j	j
tj�|j	j
tj�tjjtj��r>xBttjtj��D].}dtj|f}tjj|��r|j	j
|��qW|j	jtj�|j	jtj�|j	jtj�t |tj!j"ddddddddddddd��dS)Nr��z%s/%sZ	readwrite)�
CleanupOnExit�CleanupModulesOnExit�
IPv6_rpfilter�Lockdown�MinimalMark�IndividualCalls�	LogDenied�AutomaticHelpers�FirewallBackend�FlushAllOnReload�RFC3964_IPv4�AllowZoneDrifting)#�superr�__init__r�busname�path�
_init_varsr�
watch_updater�watcher�
add_watch_dir�FIREWALLD_IPSETS�ETC_FIREWALLD_IPSETS�FIREWALLD_ICMPTYPES�ETC_FIREWALLD_ICMPTYPES�FIREWALLD_HELPERS�ETC_FIREWALLD_HELPERS�FIREWALLD_SERVICES�ETC_FIREWALLD_SERVICES�FIREWALLD_ZONES�ETC_FIREWALLD_ZONES�FIREWALLD_POLICIES�ETC_FIREWALLD_POLICIES�os�exists�sorted�listdir�isdirZadd_watch_file�LOCKDOWN_WHITELIST�FIREWALLD_DIRECT�FIREWALLD_CONFr�dbus�DBUS_INTERFACE_CONFIG)�selfZconf�args�kwargs�filenamer0)�	__class__��/usr/lib/python3.6/config.pyr.FsP

zFirewallDConfig.__init__cCs2g|_d|_g|_d|_g|_d|_g|_d|_g|_d|_	g|_
d|_x$|jj
�D]}|j|jj|��qTWx$|jj�D]}|j|jj|��qzWx$|jj�D]}|j|jj|��q�Wx$|jj�D]}|j|jj|��q�Wx$|jj�D]}|j|jj|��q�Wx&|jj�D]}|j|jj|���qWdS)Nr)�ipsets�	ipset_idx�	icmptypes�icmptype_idx�services�service_idx�zones�zone_idx�helpers�
helper_idx�policy_objects�policy_object_idxrZ
get_ipsets�	_addIPSetZ	get_ipsetZ
get_icmptypes�_addIcmpTypeZget_icmptypeZget_services�_addServiceZget_serviceZ	get_zones�_addZoneZget_zoneZget_helpers�
_addHelperZ
get_helperZget_policy_objects�
_addPolicyZget_policy_object)rK�ipset�icmptype�service�zone�helper�policyrPrPrQr1ts0zFirewallDConfig._init_varscCsdS)NrP)rKrPrPrQ�__del__�szFirewallDConfig.__del__cCs�x&t|j�dkr&|jj�}|j�~qWx&t|j�dkrN|jj�}|j�~q*Wx&t|j�dkrv|jj�}|j�~qRWx&t|j�dkr�|jj�}|j�~qzWx&t|j�dkr�|jj�}|j�~q�Wx&t|j�dkr�|jj�}|j�~q�W|j	�dS)Nr)
�lenrR�pop�
unregisterrTrVrXrZr\r1)rK�itemrPrPrQ�reload�s2





zFirewallDConfig.reloadc	CsJ|tjkr�|jtjj�}tjdtj�y|jj�Wn2tk
rf}ztj	d||f�dSd}~XnX|jtjj�j
�}x2t|j��D]"}||kr�||||kr�||=q�Wt
|�dkr�|jtjj|g�dS|jtj�s�|jtj�o�|jd��r�y|jj|�\}}Wn4tk
�r<}ztj	d||f�dSd}~XnX|dk�rT|j|�n*|dk�rj|j|�n|dk�rF|j|��n�|jtj��s�|jtj��r8|jd��r8y|jj|�\}}Wn4tk
�r�}ztj	d	||f�dSd}~XnX|dk�r
|j|�n*|dk�r |j|�n|dk�rF|j|��n|jtj��sT|jtj��rr|jd��r�y|jj|�\}}Wn4tk
�r�}ztj	d
||f�dSd}~XnX|dk�r�|j |�n*|dk�r�|j!|�n|dk�rn|j"|�n�|jtj��rF|j#tjd�j$d�}t
|�d
k�s&d|k�r*dSt%j&j'|��rT|j(j)|��sn|j(j*|�n|j(j)|��rF|j(j+|��n�|jtj,��s�|jtj-��r(|jd��r(y|jj.|�\}}Wn4tk
�r�}ztj	d||f�dSd}~XnX|dk�r�|j/|�n*|dk�r|j0|�n|dk�rF|j1|��n|jtj2��sD|jtj3��r�|jd��r�y|jj4|�\}}Wn4tk
�r�}ztj	d||f�dSd}~XnX|dk�r�|j5|�n*|dk�r�|j6|�n|dk�rF|j7|��nh|tj8k�r:y|jj9�Wn4tk
�r,}ztj	d||f�dSd}~XnX|j:��n|tj;k�r�y|jj<�Wn4tk
�r�}ztj	d||f�dSd}~XnX|j=�n�|jtj>��s�|jtj?��rF|jd��rFy|jj@|�\}}Wn4tk
�r}ztj	d||f�dSd}~XnX|dk�r|jA|�n*|dk�r2|jB|�n|dk�rF|jC|�dS)Nz,config: Reloading firewalld config file '%s'z+Failed to load firewalld.conf file '%s': %srz.xmlz%Failed to load icmptype file '%s': %s�new�remove�updatez$Failed to load service file '%s': %sz!Failed to load zone file '%s': %s��/rz"Failed to load ipset file '%s': %sz#Failed to load helper file '%s': %sz/Failed to load lockdown whitelist file '%s': %sz)Failed to load direct rules file '%s': %sz#Failed to load policy file '%s': %s)DrrH�GetAllrIrJr�debug1Zupdate_firewalld_conf�	Exception�error�copy�list�keysrk�PropertiesChanged�
startswithr7r8�endswithZupdate_icmptype_from_pathr_�removeIcmpType�_updateIcmpTyper;r<Zupdate_service_from_pathr`�
removeService�_updateServicer=r>Zupdate_zone_from_pathra�
removeZone�_updateZone�replace�striprAr0rEr3Z	has_watchr4Zremove_watchr5r6Zupdate_ipset_from_pathr^�removeIPSet�_updateIPSetr9r:Zupdate_helper_from_pathrb�removeHelper�
_updateHelperrFZupdate_lockdown_whitelist�LockdownWhitelistUpdatedrGZ
update_direct�Updatedr?r@Zupdate_policy_object_from_pathrc�removePolicy�
_updatePolicy)	rK�nameZ	old_props�msgZprops�keyZwhat�obj�_namerPrPrQr2�s
























zFirewallDConfig.watch_updaterc	CsPt||j||j|jdtjj|jf�}|jj|�|jd7_|j|j	�|S)Nz%s/%dr)
r
rrUr/rIZDBUS_PATH_CONFIG_ICMPTYPErT�append�
IcmpTypeAddedr�)rKr��config_icmptyperPrPrQr_AszFirewallDConfig._addIcmpTypecCsPxJ|jD]@}|jj|jkr|jj|jkr|jj|jkr||_|j|j�qWdS)N)rTr�r�r0rNr�)rKr�rerPrPrQr�MszFirewallDConfig._updateIcmpTypecCs�d}xT|jD]J}|j�}|j||kr||j|j�|jj|j|�|_|j|jj�qWx\|jD]R}|j�}d|krb|j|dkrb|dj|j�|jj	|j|�|_|j|jj�qbWx:|j
D]0}|j|kr�|j|j�|j�|j
j|�~q�WdS)N�Zicmp_blocks)
rX�getSettingsr�rqr�set_zone_configr�r�r\�set_policy_object_config_dictrT�Removedrm)rKr��indexrg�settingsrirerPrPrQrVs&
zFirewallDConfig.removeIcmpTypec	CsPt||j||j|jdtjj|jf�}|jj|�|jd7_|j|j	�|S)Nz%s/%dr)
rrrWr/rIZDBUS_PATH_CONFIG_SERVICErVr��ServiceAddedr�)rKr��config_servicerPrPrQr`pszFirewallDConfig._addServicecCsPxJ|jD]@}|jj|jkr|jj|jkr|jj|jkr||_|j|j�qWdS)N)rVr�r�r0rNr�)rKr�rfrPrPrQr�{szFirewallDConfig._updateServicecCs�d}xT|jD]J}|j�}|j||kr||j|j�|jj|j|�|_|j|jj�qWx\|jD]R}|j�}d|krb|j|dkrb|dj|j�|jj	|j|�|_|j|jj�qbWx:|j
D]0}|j|kr�|j|j�|j�|j
j|�~q�WdS)Nr rV)
rXr�r�rqrr�r�r�r\r�rVr�rm)rKr�r�rgr�rirfrPrPrQr��s&
zFirewallDConfig.removeServicec	CsPt||j||j|jdtjj|jf�}|jj|�|jd7_|j|j	�|S)Nz%s/%dr)
rrrYr/rIZDBUS_PATH_CONFIG_ZONErXr��	ZoneAddedr�)rKr��config_zonerPrPrQra�szFirewallDConfig._addZonecCsPxJ|jD]@}|jj|jkr|jj|jkr|jj|jkr||_|j|j�qWdS)N)rXr�r�r0rNr�)rKr�rgrPrPrQr��s
zFirewallDConfig._updateZonecCs@x:|jD]0}|j|kr|j|j�|j�|jj|�~qWdS)N)rXr�r�r�rmrq)rKr�rgrPrPrQr��s
zFirewallDConfig.removeZonec	CsPt||j||j|jdtjj|jf�}|jj|�|jd7_|j|j	�|S)Nz%s/%dr)
r
rr]r/rIZDBUS_PATH_CONFIG_POLICYr\r��PolicyAddedr�)rKr��
config_policyrPrPrQrc�szFirewallDConfig._addPolicycCsPxJ|jD]@}|jj|jkr|jj|jkr|jj|jkr||_|j|j�qWdS)N)r\r�r�r0rNr�)rKr�rirPrPrQr��s
zFirewallDConfig._updatePolicycCs@x:|jD]0}|j|kr|j|j�|j�|jj|�~qWdS)N)r\r�r�r�rmrq)rKr�rirPrPrQr��s
zFirewallDConfig.removePolicyc	CsPt||j||j|jdtjj|jf�}|jj|�|jd7_|j|j	�|S)Nz%s/%dr)
rrrSr/rIZDBUS_PATH_CONFIG_IPSETrRr��
IPSetAddedr�)rKr��config_ipsetrPrPrQr^�szFirewallDConfig._addIPSetcCsPxJ|jD]@}|jj|jkr|jj|jkr|jj|jkr||_|j|j�qWdS)N)rRr�r�r0rNr�)rKr�rdrPrPrQr��s
zFirewallDConfig._updateIPSetcCs@x:|jD]0}|j|kr|j|j�|j�|jj|�~qWdS)N)rRr�r�r�rmrq)rKr�rdrPrPrQr��s
zFirewallDConfig.removeIPSetc	CsPt||j||j|jdtjj|jf�}|jj|�|jd7_|j|j	�|S)Nz%s/%dr)
rrr[r/rIZDBUS_PATH_CONFIG_HELPERrZr��HelperAddedr�)rKr��
config_helperrPrPrQrb�szFirewallDConfig._addHelpercCsPxJ|jD]@}|jj|jkr|jj|jkr|jj|jkr||_|j|j�qWdS)N)rZr�r�r0rNr�)rKr�rhrPrPrQr��s
zFirewallDConfig._updateHelpercCs@x:|jD]0}|j|kr|j|j�|j�|jj|�~qWdS)N)rZr�r�r�rmrq)rKr�rhrPrPrQr�s
zFirewallDConfig.removeHelpercCs�|jj�r�|dkr tjd�dStj�}t||�}|jjd|�rDdSt||�}|jjd|�r`dSt	|�}|jjd|�rzdSt
||�}|jjd|�r�dSttj
d��dS)Nz&Lockdown not possible, sender not set.�context�uid�user�commandzlockdown is enabled)rZlockdown_enabledrrxrIZ	SystemBusrZaccess_checkrrrrrZ
ACCESS_DENIED)rK�senderZbusr�r�r�r�rPrPrQ�accessChecks$




zFirewallDConfig.accessCheckcCsF|dkrtjjd|��|jj�j|�}|dkrH|dkr>tj}tj|�S|dkrr|dkr`tj}nt	|�}tj
|�S|dkr�|dkr�tjr�dnd}tj|�S|dkr�|dkr�tjr�dnd}tj|�S|dk�r�|dk�r�tj
�r�dnd}tj|�S|dk�r|dk�rtj�rdnd}tj|�S|dk�rL|dk�rBtj�r>dnd}tj|�S|dk�rp|dk�rftj}tj|�S|d	k�r�|dk�r�tj}tj|�S|d
k�r�|dk�r�tj}tj|�S|dk�r�|dk�r�tj�r�dnd}tj|�S|dk�r|dk�r
tj�rdnd}tj|�S|d
k�rB|dk�r8tj�r4dnd}tj|�SdS)N�DefaultZoner%r!r"r$r#r&r'r(r)r*r+r,zDorg.freedesktop.DBus.Error.InvalidArgs: Property '%s' does not exist�yes�no)
r�r%r!r"r$r#r&r'r(r)r*r+r,)rI�
exceptions�
DBusExceptionr�get_firewalld_conf�getZ
FALLBACK_ZONE�StringZFALLBACK_MINIMAL_MARK�int�Int32ZFALLBACK_CLEANUP_ON_EXITZ FALLBACK_CLEANUP_MODULES_ON_EXITZFALLBACK_LOCKDOWNZFALLBACK_IPV6_RPFILTERZFALLBACK_INDIVIDUAL_CALLSZFALLBACK_LOG_DENIEDZFALLBACK_AUTOMATIC_HELPERSZFALLBACK_FIREWALL_BACKENDZFALLBACK_FLUSH_ALL_ON_RELOADZFALLBACK_RFC3964_IPV4ZFALLBACK_ALLOW_ZONE_DRIFTING)rK�prop�valuerPrPrQ�
_get_property+s|





























zFirewallDConfig._get_propertycCsT|dkrtj|j|��S|dkr0tj|j|��S|dkrHtj|j|��S|dkr`tj|j|��S|dkrxtj|j|��S|dkr�tj|j|��S|dkr�tj|j|��S|dkr�tj|j|��S|d	kr�tj|j|��S|d
k�r�tj|j|��S|dk�rtj|j|��S|dk�r&tj|j|��S|d
k�r@tj|j|��Stjjd|��dS)Nr�r%r!r"r$r#r&r'r(r)r*r+r,zDorg.freedesktop.DBus.Error.InvalidArgs: Property '%s' does not exist)rIr�r�r�r�r�)rKr�rPrPrQ�_get_dbus_propertyos:



z"FirewallDConfig._get_dbus_propertyZss�v)�in_signature�
out_signatureNcCsxt|t�}t|t�}tjd||�|tjjkr8|j|�S|tjjtjj	gkr^tj
jd|��ntj
jd|��|j|�S)Nzconfig.Get('%s', '%s')zDorg.freedesktop.DBus.Error.InvalidArgs: Property '%s' does not existzJorg.freedesktop.DBus.Error.UnknownInterface: Interface '%s' does not exist)r�strrrvrrIrJr��DBUS_INTERFACE_CONFIG_DIRECT�DBUS_INTERFACE_CONFIG_POLICIESr�r�)rK�interface_name�
property_namer�rPrPrQ�Get�s



zFirewallDConfig.Get�sza{sv}c
Csxt|t�}tjd|�i}|tjjkrDxBdD]}|j|�||<q,Wn&|tjjtjj	gkrZntj
jd|��tj|dd�S)Nzconfig.GetAll('%s')r�r%r!r"r$r#r&r'r(r)r*r+r,zJorg.freedesktop.DBus.Error.UnknownInterface: Interface '%s' does not existZsv)�	signature)
r�r%r!r"r$r#r&r'r(r)r*r+r,)
rr�rrvrrIrJr�r�r�r�r�Z
Dictionary)rKr�r��ret�xrPrPrQru�s"
zFirewallDConfig.GetAllZssv)r�cCs�t|t�}t|t�}t|�}tjd|||�|j|�|tjjk�r�|dk�rz|dkrv|j�dkrvt	t
jd||f��|dkr�|tjkr�t	t
jd||f��|dkr�|tj
kr�t	t
jd||f��|d	k�r�|j�dk�r�t	t
jd||f��|d
k�r|j�dk�rt	t
jd||f��|dk�rF|j�dk�rFt	t
jd||f��|jj�j||�|jj�j�|j|||ig�n|dk�r�ntjjd|��n8|tjjtjjgk�r�tjjd|��ntjjd|��dS)Nzconfig.Set('%s', '%s', '%s')r!r$r"r#r&r'r)r*r+r,r�r��true�falsez'%s' for %sr%r(zDorg.freedesktop.DBus.Error.InvalidArgs: Property '%s' does not existzJorg.freedesktop.DBus.Error.UnknownInterface: Interface '%s' does not exist)
r!r$r"r#r&r'r)r*r+r,)r!r$r"r#r&)r�r�r�r�)r�r�r�r�)r�r�r�r�)r�r�r�r�)r%r()rr�rrvr�rrIrJ�lowerrrZ
INVALID_VALUEZLOG_DENIED_VALUESZFIREWALL_BACKEND_VALUESr��set�writer|r�r�r�r�)rKr�r�Z	new_valuer�rPrPrQ�Set�sz










zFirewallDConfig.Setzsa{sv}as)r�cCs.t|t�}t|�}t|�}tjd|||�dS)Nz*config.PropertiesChanged('%s', '%s', '%s'))rr�rrv)rKr�Zchanged_propertiesZinvalidated_propertiesrPrPrQr|s

z!FirewallDConfig.PropertiesChanged)r�cs4tjd�tt|�j|j|jj��}t||t	j
j�S)Nzconfig.Introspect())rZdebug2r-r�
Introspectr0r/Zget_busrrrIrJ)rKr��data)rOrPrQr�s

zFirewallDConfig.IntrospectcCstjd�|jj�jj�S)Nz&config.policies.getLockdownWhitelist())rrvr�get_policies�lockdown_whitelist�
export_config)rKr�rPrPrQ�getLockdownWhitelists
z$FirewallDConfig.getLockdownWhitelistcCs@tjd�t|�}|jj�jj|�|jj�jj�|j�dS)Nz)config.policies.setLockdownWhitelist(...))	rrvrrr�r��
import_configr�r�)rKr�r�rPrPrQ�setLockdownWhitelist&s

z$FirewallDConfig.setLockdownWhitelistcCstjd�dS)Nz*config.policies.LockdownWhitelistUpdated())rrv)rKrPrPrQr�0sz(FirewallDConfig.LockdownWhitelistUpdatedcCs^t|�}tjd|�|j|�t|j��}||dkrBttj|��|dj	|�|j
|�dS)Nz1config.policies.addLockdownWhitelistCommand('%s')r)rrrvr�rzr�rr�ALREADY_ENABLEDr�r�)rKr�r�r�rPrPrQ�addLockdownWhitelistCommand7s
z+FirewallDConfig.addLockdownWhitelistCommandcCs^t|�}tjd|�|j|�t|j��}||dkrBttj|��|dj	|�|j
|�dS)Nz4config.policies.removeLockdownWhitelistCommand('%s')r)rrrvr�rzr�rr�NOT_ENABLEDrqr�)rKr�r�r�rPrPrQ�removeLockdownWhitelistCommandDs
z.FirewallDConfig.removeLockdownWhitelistCommand�bcCs$t|�}tjd|�||j�dkS)Nz3config.policies.queryLockdownWhitelistCommand('%s')r)rrrvr�)rKr�r�rPrPrQ�queryLockdownWhitelistCommandRsz-FirewallDConfig.queryLockdownWhitelistCommand�ascCstjd�|j�dS)Nz.config.policies.getLockdownWhitelistCommands()r)rrvr�)rKr�rPrPrQ�getLockdownWhitelistCommands[s
z,FirewallDConfig.getLockdownWhitelistCommandscCs^t|�}tjd|�|j|�t|j��}||dkrBttj|��|dj	|�|j
|�dS)Nz1config.policies.addLockdownWhitelistContext('%s')r)rrrvr�rzr�rrr�r�r�)rKr�r�r�rPrPrQ�addLockdownWhitelistContextds
z+FirewallDConfig.addLockdownWhitelistContextcCs^t|�}tjd|�|j|�t|j��}||dkrBttj|��|dj	|�|j
|�dS)Nz4config.policies.removeLockdownWhitelistContext('%s')r)rrrvr�rzr�rrr�rqr�)rKr�r�r�rPrPrQ�removeLockdownWhitelistContextqs
z.FirewallDConfig.removeLockdownWhitelistContextcCs$t|�}tjd|�||j�dkS)Nz3config.policies.queryLockdownWhitelistContext('%s')r)rrrvr�)rKr�r�rPrPrQ�queryLockdownWhitelistContextsz-FirewallDConfig.queryLockdownWhitelistContextcCstjd�|j�dS)Nz.config.policies.getLockdownWhitelistContexts()r)rrvr�)rKr�rPrPrQ�getLockdownWhitelistContexts�s
z,FirewallDConfig.getLockdownWhitelistContextscCs^t|�}tjd|�|j|�t|j��}||dkrBttj|��|dj	|�|j
|�dS)Nz.config.policies.addLockdownWhitelistUser('%s')�)rrrvr�rzr�rrr�r�r�)rKr�r�r�rPrPrQ�addLockdownWhitelistUser�s
z(FirewallDConfig.addLockdownWhitelistUsercCs^t|�}tjd|�|j|�t|j��}||dkrBttj|��|dj	|�|j
|�dS)Nz1config.policies.removeLockdownWhitelistUser('%s')r�)rrrvr�rzr�rrr�rqr�)rKr�r�r�rPrPrQ�removeLockdownWhitelistUser�s
z+FirewallDConfig.removeLockdownWhitelistUsercCs$t|�}tjd|�||j�dkS)Nz0config.policies.queryLockdownWhitelistUser('%s')r�)rrrvr�)rKr�r�rPrPrQ�queryLockdownWhitelistUser�sz*FirewallDConfig.queryLockdownWhitelistUsercCstjd�|j�dS)Nz+config.policies.getLockdownWhitelistUsers()r�)rrvr�)rKr�rPrPrQ�getLockdownWhitelistUsers�s
z)FirewallDConfig.getLockdownWhitelistUsers�icCs^t|�}tjd|�|j|�t|j��}||dkrBttj|��|dj	|�|j
|�dS)Nz+config.policies.addLockdownWhitelistUid(%d)�)rrrvr�rzr�rrr�r�r�)rKr�r�r�rPrPrQ�addLockdownWhitelistUid�s
z'FirewallDConfig.addLockdownWhitelistUidcCs^t|�}tjd|�|j|�t|j��}||dkrBttj|��|dj	|�|j
|�dS)Nz.config.policies.removeLockdownWhitelistUid(%d)r�)rrrvr�rzr�rrr�rqr�)rKr�r�r�rPrPrQ�removeLockdownWhitelistUid�s
z*FirewallDConfig.removeLockdownWhitelistUidcCs$t|�}tjd|�||j�dkS)Nz-config.policies.queryLockdownWhitelistUid(%d)r�)rrrvr�)rKr�r�rPrPrQ�queryLockdownWhitelistUid�sz)FirewallDConfig.queryLockdownWhitelistUidZaicCstjd�|j�dS)Nz*config.policies.getLockdownWhitelistUids()r�)rrvr�)rKr�rPrPrQ�getLockdownWhitelistUids�s
z(FirewallDConfig.getLockdownWhitelistUidsZaocCstjd�|jS)z"list ipsets objects paths
        zconfig.listIPSets())rrvrR)rKr�rPrPrQ�
listIPSets�s
zFirewallDConfig.listIPSetscCs4tjd�g}x|jD]}|j|jj�qWt|�S)zget ipset names
        zconfig.getIPSetNames())rrvrRr�r�r�rC)rKr�rRr�rPrPrQ�
getIPSetNames�s

zFirewallDConfig.getIPSetNames�ocCsFt|t�}tjd|�x|jD]}|jj|kr|SqWttj	|��dS)z-object path of ipset with given name
        zconfig.getIPSetByName('%s')N)
rr�rrvrRr�r�rrZ
INVALID_IPSET)rKrdr�r�rPrPrQ�getIPSetByName�s
zFirewallDConfig.getIPSetByNamecCsDt|t�}t|�}tjd|�|j|�|jj||�}|j|�}|S)z/add ipset with given name and settings
        zconfig.addIPSet('%s'))rr�rrvr�rZ	new_ipsetr^)rKrdr�r�r�r�rPrPrQ�addIPSet	s


zFirewallDConfig.addIPSetcCst|t�}tjd|�dS)Nzconfig.IPSetAdded('%s'))rr�rrv)rKrdrPrPrQr�s
zFirewallDConfig.IPSetAddedcCstjd�|jS)z%list icmptypes objects paths
        zconfig.listIcmpTypes())rrvrT)rKr�rPrPrQ�
listIcmpTypes s
zFirewallDConfig.listIcmpTypescCs4tjd�g}x|jD]}|j|jj�qWt|�S)zget icmptype names
        zconfig.getIcmpTypeNames())rrvrTr�r�r�rC)rKr�rTr�rPrPrQ�getIcmpTypeNames(s

z FirewallDConfig.getIcmpTypeNamescCsFt|t�}tjd|�x|jD]}|jj|kr|SqWttj	|��dS)z0object path of icmptype with given name
        zconfig.getIcmpTypeByName('%s')N)
rr�rrvrTr�r�rrZINVALID_ICMPTYPE)rKrer�r�rPrPrQ�getIcmpTypeByName3s
z!FirewallDConfig.getIcmpTypeByNamecCsDt|t�}t|�}tjd|�|j|�|jj||�}|j|�}|S)z2add icmptype with given name and settings
        zconfig.addIcmpType('%s'))rr�rrvr�rZnew_icmptyper_)rKrer�r�r�r�rPrPrQ�addIcmpType@s


zFirewallDConfig.addIcmpTypecCstjd|�dS)Nzconfig.IcmpTypeAdded('%s'))rrv)rKrerPrPrQr�OszFirewallDConfig.IcmpTypeAddedcCstjd�|jS)z$list services objects paths
        zconfig.listServices())rrvrV)rKr�rPrPrQ�listServicesVs
zFirewallDConfig.listServicescCs4tjd�g}x|jD]}|j|jj�qWt|�S)zget service names
        zconfig.getServiceNames())rrvrVr�r�r�rC)rKr�rVr�rPrPrQ�getServiceNames^s

zFirewallDConfig.getServiceNamescCsFt|t�}tjd|�x|jD]}|jj|kr|SqWttj	|��dS)z/object path of service with given name
        zconfig.getServiceByName('%s')N)
rr�rrvrVr�r�rrZINVALID_SERVICE)rKrfr�r�rPrPrQ�getServiceByNameis
z FirewallDConfig.getServiceByNamezs(sssa(ss)asa{ss}asa(ss))cCsDt|t�}t|�}tjd|�|j|�|jj||�}|j|�}|S)z1add service with given name and settings
        zconfig.addService('%s'))rr�rrvr�rZnew_servicer`)rKrfr�r�r�r�rPrPrQ�
addServicevs


zFirewallDConfig.addServicezsa{sv}cCsDt|t�}t|�}tjd|�|j|�|jj||�}|j|�}|S)z1add service with given name and settings
        zconfig.addService2('%s'))rr�rrvr�rZnew_service_dictr`)rKrfr�r�r�r�rPrPrQ�addService2�s


zFirewallDConfig.addService2cCstjd|�dS)Nzconfig.ServiceAdded('%s'))rrv)rKrfrPrPrQr��szFirewallDConfig.ServiceAddedcCstjd�|jS)z!list zones objects paths
        zconfig.listZones())rrvrX)rKr�rPrPrQ�	listZones�s
zFirewallDConfig.listZonescCs4tjd�g}x|jD]}|j|jj�qWt|�S)zget zone names
        zconfig.getZoneNames())rrvrXr�r�r�rC)rKr�rXr�rPrPrQ�getZoneNames�s

zFirewallDConfig.getZoneNamescCsFt|t�}tjd|�x|jD]}|jj|kr|SqWttj	|��dS)z,object path of zone with given name
        zconfig.getZoneByName('%s')N)
rr�rrvrXr�r�rrZINVALID_ZONE)rKrgr�r�rPrPrQ�
getZoneByName�s
zFirewallDConfig.getZoneByNamecCszt|t�}tjd|�g}x(|jD]}||jjkr"|j|jj�q"Wt	|�dkrjdj
|�d|t	|�fS|rv|dSdS)z4name of zone the given interface belongs to
[ compiled CPython 3.6 bytecode (binary), tail of the firewall/server config module, class FirewallDConfig. The readable strings in this part cover the end of the permanent-configuration D-Bus interface: zone lookups (getZoneOfInterface, getZoneOfSource), object creation and signals (addZone, addZone2 / ZoneAdded, addPolicy / PolicyAdded, addHelper / HelperAdded), listing of policies and helpers (listPolicies, getPolicyNames, getPolicyByName, listHelpers, getHelperNames, getHelperByName), the permanent direct configuration (addChain, removeChain, queryChain, getChains, getAllChains, addRule, removeRule, queryRule, removeRules, getRules, getAllRules, addPassthrough, removePassthrough, queryPassthrough, getPassthroughs, getAllPassthroughs), and the module's import table (gi.repository, dbus.service, slip.dbus, firewall.core.*, firewall.server.config_* and firewall.core.io.* loaders). ]
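The strings above belong to firewalld's permanent-configuration D-Bus object. A minimal client-side sketch of how such methods are reached over the system bus follows; it is an illustration only, assuming the standard firewalld bus name org.fedoraproject.FirewallD1 and config object path /org/fedoraproject/FirewallD1/config, a running daemon with sufficient polkit authorization, and using "eth0" and "my-chain" purely as placeholder values.

import dbus

bus = dbus.SystemBus()
config_obj = bus.get_object("org.fedoraproject.FirewallD1",
                            "/org/fedoraproject/FirewallD1/config")

# Permanent configuration interface: which zone is an interface bound to?
config = dbus.Interface(config_obj,
                        dbus_interface="org.fedoraproject.FirewallD1.config")
print(config.getZoneOfInterface("eth0"))       # empty string if not bound

# The permanent direct configuration lives on the same object, sub-interface.
direct = dbus.Interface(config_obj,
                        dbus_interface="org.fedoraproject.FirewallD1.config.direct")
direct.addChain("ipv4", "filter", "my-chain")  # persisted; applied on next reload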
site-packages/firewall/server/__pycache__/config_service.cpython-36.opt-1.pyc
[ compiled CPython 3.6 bytecode (binary tar member), firewall/server/config_service module, class FirewallDConfigService: one D-Bus object per permanent service definition. Readable strings show the property plumbing (Get, GetAll, Set, PropertiesChanged, Introspect for the name, filename, path, default and builtin properties) and the per-service configuration methods: getSettings/getSettings2, update/update2, loadDefaults, remove, rename, plus get/set and add/remove/query pairs for version, short, description, ports, protocols, source ports, modules, destinations and includes, each change emitting an Updated signal after the service XML file is rewritten. ]
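As a rough illustration (not taken from the dump itself) of how a client reaches one of these per-service objects: the service is looked up through the parent config interface, which returns the object path served by this class. Bus, path and interface names are the standard firewalld ones and are assumed here; "http", "8080" and "tcp" are placeholder values, and the calls need enough privileges to edit the permanent configuration.

import dbus

bus = dbus.SystemBus()
config = dbus.Interface(
    bus.get_object("org.fedoraproject.FirewallD1",
                   "/org/fedoraproject/FirewallD1/config"),
    dbus_interface="org.fedoraproject.FirewallD1.config")

# getServiceByName() returns the object path of the permanent service definition.
path = config.getServiceByName("http")
service = dbus.Interface(
    bus.get_object("org.fedoraproject.FirewallD1", path),
    dbus_interface="org.fedoraproject.FirewallD1.config.service")

service.addPort("8080", "tcp")   # written back to the service XML file
print(service.getSettings())     # (version, short, description, ports, ...) tuple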
site-packages/firewall/server/__pycache__/config.cpython-36.pyc000064400000127767147511334700020533 0ustar003

]ûf��@sdddlmZddlZeejd<ddlZddlZddlZddlZddl	Zddl
mZddlm
Z
ddlmZddlmZddlmZmZmZdd	lmZdd
lmZddlmZddlmZdd
lmZddl m!Z!ddl"m#Z#ddl$m%Z%ddl&m'Z'ddl(m)Z)ddl*m+Z+ddl,m-Z-m.Z.m/Z/m0Z0m1Z1m2Z2m3Z3ddl
m4Z4ddl5m6Z6Gdd�dejj7j8�Z9dS)�)�GObjectNZgobject)�config)�DEFAULT_ZONE_TARGET)�Watcher)�log)�handle_exceptions�dbus_handle_exceptions�dbus_service_method)�FirewallDConfigIcmpType)�FirewallDConfigService)�FirewallDConfigZone)�FirewallDConfigPolicy)�FirewallDConfigIPSet)�FirewallDConfigHelper)�IcmpType)�IPSet)�Helper)�LockdownWhitelist)�Direct)�dbus_to_python�command_of_sender�context_of_sender�
uid_of_sender�user_of_uid�%dbus_introspection_prepare_properties�!dbus_introspection_add_properties)�errors)�
FirewallErrorcs@eZdZdZdZejjZe	�fdd��Z
e	dd��Ze	dd��Ze	d	d
��Z
e	dd��Ze	d
d��Ze	dd��Ze	dd��Ze	dd��Ze	dd��Ze	dd��Ze	dd��Ze	dd��Ze	dd��Ze	dd ��Ze	d!d"��Ze	d#d$��Ze	d%d&��Ze	d'd(��Ze	d)d*��Ze	d+d,��Ze	d-d.��Ze	d/d0��Z e!d1d2��Z"e!d3d4��Z#e!d5d6��Z$e%ej&d7d8d9�e!d�d;d<���Z'e%ej&d=d>d9�e!d�d?d@���Z(e)jj*j+ejj�e%ej&dAdB�e!d�dCdD����Z,ej-j.ej&dEdF�dGdH��Z/e)jj*j+ejj0�e%ej1d=dI�e!d�fdJdK�	���Z2e%ejj3e4j5dI�e!d�dLdM���Z6e%ejj3e4j5dB�e!d�dNdO���Z7ej-j.ejj3�e!dPdQ���Z8e%ejj3d=dB�e!d�dRdS���Z9e%ejj3d=dB�e!d�dTdU���Z:e%ejj3d=dVd9�e!d�dWdX���Z;e%ejj3dYdI�e!d�dZd[���Z<e%ejj3d=dB�e!d�d\d]���Z=e%ejj3d=dB�e!d�d^d_���Z>e%ejj3d=dVd9�e!d�d`da���Z?e%ejj3dYdI�e!d�dbdc���Z@e%ejj3d=dB�e!d�ddde���ZAe%ejj3d=dB�e!d�dfdg���ZBe%ejj3d=dVd9�e!d�dhdi���ZCe%ejj3dYdI�e!d�djdk���ZDe%ejj3dldB�e!�ddmdn���ZEe%ejj3dldB�e!�ddodp���ZFe%ejj3dldVd9�e!�ddqdr���ZGe%ejj3dsdI�e!�ddtdu���ZHe%ejjIdvdI�e!�ddwdx���ZJe%ejjIdYdI�e!�ddydz���ZKe%ejjId=d{d9�e!�dd|d}���ZLe%ejjId=eMj5d{d9�e!�dd~d���ZNej-j.ejjId=dF�e!d�d����ZOe%ejjIdvdI�e!�dd�d����ZPe%ejjIdYdI�e!�d	d�d����ZQe%ejjId=d{d9�e!�d
d�d����ZRe%ejjId=eSj5d{d9�e!�dd�d����ZTej-j.ejjId=dF�e!d�d����ZUe%ejjIdvdI�e!�dd�d����ZVe%ejjIdYdI�e!�d
d�d����ZWe%ejjId=d{d9�e!�dd�d����ZXe%ejjId�d{d9�e!�dd�d����ZYe%ejjId�d{d9�e!�dd�d����ZZej-j.ejjId=dF�e!d�d����Z[e%ejjIdvdI�e!�dd�d����Z\e%ejjIdYdI�e!�dd�d����Z]e%ejjId=d{d9�e!�dd�d����Z^e%ejjId=d=d9�e!�dd�d����Z_e%ejjId=d=d9�e!�dd�d����Z`e%ejjId�d{d9�e!�dd�d����Zae%ejjId�d{d9�e!�dd�d����Zbej-j.ejjId=dF�e!d�d����Zce%ejjIdvdI�e!�dd�d����Zde%ejjIdYdI�e!�dd�d����Zee%ejjId=d{d9�e!�dd�d����Zfe%ejjId�d{d9�e!�dd�d����Zgej-j.ejjId=dF�e!d�d����Zhe%ejjIdvdI�e!�dd�d����Zie%ejjIdYdI�e!�dd�d����Zje%ejjId=d{d9�e!�dd�d����Zke%ejjId=elj5d{d9�e!�dd�d����Zmej-j.ejjId=dF�e!d�d����Zne%ejjoepj5dI�e!�d d�d����Zqe%ejjoepj5dB�e!�d!d�d„��Zrej-j.ejjo�e!d�dĄ��Zse%ejjod�dB�e!�d"d�dDŽ��Zte%ejjod�dB�e!�d#d�dɄ��Zue%ejjod�dVd9�e!�d$d�d˄��Zve%ejjod7dYd9�e!�d%d�d̈́��Zwe%ejjod�d�d9�e!�d&d�dф��Zxe%ejjod�dB�e!�d'd�dԄ��Zye%ejjod�dB�e!�d(d�dք��Zze%ejjod�dVd9�e!�d)d�d؄��Z{e%ejjod�dB�e!�d*d�dڄ��Z|e%ejjod�d�d9�e!�d+d�d݄��Z}e%ejjod�d�d9�e!�d,d�d���Z~e%ejjod�dB�e!�d-d�d���Ze%ejjod�dB�e!�d.d�d���Z�e%ejjod�dVd9�e!�d/d�d���Z�e%ejjod=d�d9�e!�d0d�d���Z�e%ejjod�dI�e!�d1d�d���Z��Z�S(2�FirewallDConfigzFirewallD main classTcs�tt|�j||�||_|d|_|d|_|j�t|jd�|_	|j	j
tj�|j	j
tj�|j	j
tj
�|j	j
tj�|j	j
tj�|j	j
tj�|j	j
tj�|j	j
tj�|j	j
tj�|j	j
tj�|j	j
tj�|j	j
tj�tjjtj��r>xBttjtj��D].}dtj|f}tjj|��r|j	j
|��qW|j	jtj�|j	jtj�|j	jtj�t |tj!j"ddddddddddddd��dS)Nr��z%s/%sZ	readwrite)�
CleanupOnExit�CleanupModulesOnExit�
IPv6_rpfilter�Lockdown�MinimalMark�IndividualCalls�	LogDenied�AutomaticHelpers�FirewallBackend�FlushAllOnReload�RFC3964_IPv4�AllowZoneDrifting)#�superr�__init__r�busname�path�
_init_varsr�
watch_updater�watcher�
add_watch_dir�FIREWALLD_IPSETS�ETC_FIREWALLD_IPSETS�FIREWALLD_ICMPTYPES�ETC_FIREWALLD_ICMPTYPES�FIREWALLD_HELPERS�ETC_FIREWALLD_HELPERS�FIREWALLD_SERVICES�ETC_FIREWALLD_SERVICES�FIREWALLD_ZONES�ETC_FIREWALLD_ZONES�FIREWALLD_POLICIES�ETC_FIREWALLD_POLICIES�os�exists�sorted�listdir�isdirZadd_watch_file�LOCKDOWN_WHITELIST�FIREWALLD_DIRECT�FIREWALLD_CONFr�dbus�DBUS_INTERFACE_CONFIG)�selfZconf�args�kwargs�filenamer0)�	__class__��/usr/lib/python3.6/config.pyr.FsP

zFirewallDConfig.__init__cCs2g|_d|_g|_d|_g|_d|_g|_d|_g|_d|_	g|_
d|_x$|jj
�D]}|j|jj|��qTWx$|jj�D]}|j|jj|��qzWx$|jj�D]}|j|jj|��q�Wx$|jj�D]}|j|jj|��q�Wx$|jj�D]}|j|jj|��q�Wx&|jj�D]}|j|jj|���qWdS)Nr)�ipsets�	ipset_idx�	icmptypes�icmptype_idx�services�service_idx�zones�zone_idx�helpers�
helper_idx�policy_objects�policy_object_idxrZ
get_ipsets�	_addIPSetZ	get_ipsetZ
get_icmptypes�_addIcmpTypeZget_icmptypeZget_services�_addServiceZget_serviceZ	get_zones�_addZoneZget_zoneZget_helpers�
_addHelperZ
get_helperZget_policy_objects�
_addPolicyZget_policy_object)rK�ipset�icmptype�service�zone�helper�policyrPrPrQr1ts0zFirewallDConfig._init_varscCsdS)NrP)rKrPrPrQ�__del__�szFirewallDConfig.__del__cCs�x&t|j�dkr&|jj�}|j�~qWx&t|j�dkrN|jj�}|j�~q*Wx&t|j�dkrv|jj�}|j�~qRWx&t|j�dkr�|jj�}|j�~qzWx&t|j�dkr�|jj�}|j�~q�Wx&t|j�dkr�|jj�}|j�~q�W|j	�dS)Nr)
�lenrR�pop�
unregisterrTrVrXrZr\r1)rK�itemrPrPrQ�reload�s2





zFirewallDConfig.reloadc	CsJ|tjkr�|jtjj�}tjdtj�y|jj�Wn2tk
rf}ztj	d||f�dSd}~XnX|jtjj�j
�}x2t|j��D]"}||kr�||||kr�||=q�Wt
|�dkr�|jtjj|g�dS|jtj�s�|jtj�o�|jd��r�y|jj|�\}}Wn4tk
�r<}ztj	d||f�dSd}~XnX|dk�rT|j|�n*|dk�rj|j|�n|dk�rF|j|��n�|jtj��s�|jtj��r8|jd��r8y|jj|�\}}Wn4tk
�r�}ztj	d	||f�dSd}~XnX|dk�r
|j|�n*|dk�r |j|�n|dk�rF|j|��n|jtj��sT|jtj��rr|jd��r�y|jj|�\}}Wn4tk
�r�}ztj	d
||f�dSd}~XnX|dk�r�|j |�n*|dk�r�|j!|�n|dk�rn|j"|�n�|jtj��rF|j#tjd�j$d�}t
|�d
k�s&d|k�r*dSt%j&j'|��rT|j(j)|��sn|j(j*|�n|j(j)|��rF|j(j+|��n�|jtj,��s�|jtj-��r(|jd��r(y|jj.|�\}}Wn4tk
�r�}ztj	d||f�dSd}~XnX|dk�r�|j/|�n*|dk�r|j0|�n|dk�rF|j1|��n|jtj2��sD|jtj3��r�|jd��r�y|jj4|�\}}Wn4tk
�r�}ztj	d||f�dSd}~XnX|dk�r�|j5|�n*|dk�r�|j6|�n|dk�rF|j7|��nh|tj8k�r:y|jj9�Wn4tk
�r,}ztj	d||f�dSd}~XnX|j:��n|tj;k�r�y|jj<�Wn4tk
�r�}ztj	d||f�dSd}~XnX|j=�n�|jtj>��s�|jtj?��rF|jd��rFy|jj@|�\}}Wn4tk
�r}ztj	d||f�dSd}~XnX|dk�r|jA|�n*|dk�r2|jB|�n|dk�rF|jC|�dS)Nz,config: Reloading firewalld config file '%s'z+Failed to load firewalld.conf file '%s': %srz.xmlz%Failed to load icmptype file '%s': %s�new�remove�updatez$Failed to load service file '%s': %sz!Failed to load zone file '%s': %s��/rz"Failed to load ipset file '%s': %sz#Failed to load helper file '%s': %sz/Failed to load lockdown whitelist file '%s': %sz)Failed to load direct rules file '%s': %sz#Failed to load policy file '%s': %s)DrrH�GetAllrIrJr�debug1Zupdate_firewalld_conf�	Exception�error�copy�list�keysrk�PropertiesChanged�
startswithr7r8�endswithZupdate_icmptype_from_pathr_�removeIcmpType�_updateIcmpTyper;r<Zupdate_service_from_pathr`�
removeService�_updateServicer=r>Zupdate_zone_from_pathra�
removeZone�_updateZone�replace�striprAr0rEr3Z	has_watchr4Zremove_watchr5r6Zupdate_ipset_from_pathr^�removeIPSet�_updateIPSetr9r:Zupdate_helper_from_pathrb�removeHelper�
_updateHelperrFZupdate_lockdown_whitelist�LockdownWhitelistUpdatedrGZ
update_direct�Updatedr?r@Zupdate_policy_object_from_pathrc�removePolicy�
_updatePolicy)	rK�nameZ	old_props�msgZprops�keyZwhat�obj�_namerPrPrQr2�s
























zFirewallDConfig.watch_updaterc	CsPt||j||j|jdtjj|jf�}|jj|�|jd7_|j|j	�|S)Nz%s/%dr)
r
rrUr/rIZDBUS_PATH_CONFIG_ICMPTYPErT�append�
IcmpTypeAddedr�)rKr��config_icmptyperPrPrQr_AszFirewallDConfig._addIcmpTypecCsPxJ|jD]@}|jj|jkr|jj|jkr|jj|jkr||_|j|j�qWdS)N)rTr�r�r0rNr�)rKr�rerPrPrQr�MszFirewallDConfig._updateIcmpTypecCs�d}xT|jD]J}|j�}|j||kr||j|j�|jj|j|�|_|j|jj�qWx\|jD]R}|j�}d|krb|j|dkrb|dj|j�|jj	|j|�|_|j|jj�qbWx:|j
D]0}|j|kr�|j|j�|j�|j
j|�~q�WdS)N�Zicmp_blocks)
rX�getSettingsr�rqr�set_zone_configr�r�r\�set_policy_object_config_dictrT�Removedrm)rKr��indexrg�settingsrirerPrPrQrVs&
zFirewallDConfig.removeIcmpTypec	CsPt||j||j|jdtjj|jf�}|jj|�|jd7_|j|j	�|S)Nz%s/%dr)
rrrWr/rIZDBUS_PATH_CONFIG_SERVICErVr��ServiceAddedr�)rKr��config_servicerPrPrQr`pszFirewallDConfig._addServicecCsPxJ|jD]@}|jj|jkr|jj|jkr|jj|jkr||_|j|j�qWdS)N)rVr�r�r0rNr�)rKr�rfrPrPrQr�{szFirewallDConfig._updateServicecCs�d}xT|jD]J}|j�}|j||kr||j|j�|jj|j|�|_|j|jj�qWx\|jD]R}|j�}d|krb|j|dkrb|dj|j�|jj	|j|�|_|j|jj�qbWx:|j
D]0}|j|kr�|j|j�|j�|j
j|�~q�WdS)Nr rV)
rXr�r�rqrr�r�r�r\r�rVr�rm)rKr�r�rgr�rirfrPrPrQr��s&
zFirewallDConfig.removeServicec	CsPt||j||j|jdtjj|jf�}|jj|�|jd7_|j|j	�|S)Nz%s/%dr)
rrrYr/rIZDBUS_PATH_CONFIG_ZONErXr��	ZoneAddedr�)rKr��config_zonerPrPrQra�szFirewallDConfig._addZonecCsPxJ|jD]@}|jj|jkr|jj|jkr|jj|jkr||_|j|j�qWdS)N)rXr�r�r0rNr�)rKr�rgrPrPrQr��s
zFirewallDConfig._updateZonecCs@x:|jD]0}|j|kr|j|j�|j�|jj|�~qWdS)N)rXr�r�r�rmrq)rKr�rgrPrPrQr��s
zFirewallDConfig.removeZonec	CsPt||j||j|jdtjj|jf�}|jj|�|jd7_|j|j	�|S)Nz%s/%dr)
r
rr]r/rIZDBUS_PATH_CONFIG_POLICYr\r��PolicyAddedr�)rKr��
config_policyrPrPrQrc�szFirewallDConfig._addPolicycCsPxJ|jD]@}|jj|jkr|jj|jkr|jj|jkr||_|j|j�qWdS)N)r\r�r�r0rNr�)rKr�rirPrPrQr��s
zFirewallDConfig._updatePolicycCs@x:|jD]0}|j|kr|j|j�|j�|jj|�~qWdS)N)r\r�r�r�rmrq)rKr�rirPrPrQr��s
zFirewallDConfig.removePolicyc	CsPt||j||j|jdtjj|jf�}|jj|�|jd7_|j|j	�|S)Nz%s/%dr)
rrrSr/rIZDBUS_PATH_CONFIG_IPSETrRr��
IPSetAddedr�)rKr��config_ipsetrPrPrQr^�szFirewallDConfig._addIPSetcCsPxJ|jD]@}|jj|jkr|jj|jkr|jj|jkr||_|j|j�qWdS)N)rRr�r�r0rNr�)rKr�rdrPrPrQr��s
zFirewallDConfig._updateIPSetcCs@x:|jD]0}|j|kr|j|j�|j�|jj|�~qWdS)N)rRr�r�r�rmrq)rKr�rdrPrPrQr��s
zFirewallDConfig.removeIPSetc	CsPt||j||j|jdtjj|jf�}|jj|�|jd7_|j|j	�|S)Nz%s/%dr)
rrr[r/rIZDBUS_PATH_CONFIG_HELPERrZr��HelperAddedr�)rKr��
config_helperrPrPrQrb�szFirewallDConfig._addHelpercCsPxJ|jD]@}|jj|jkr|jj|jkr|jj|jkr||_|j|j�qWdS)N)rZr�r�r0rNr�)rKr�rhrPrPrQr��s
zFirewallDConfig._updateHelpercCs@x:|jD]0}|j|kr|j|j�|j�|jj|�~qWdS)N)rZr�r�r�rmrq)rKr�rhrPrPrQr�s
zFirewallDConfig.removeHelpercCs�|jj�r�|dkr tjd�dStj�}t||�}|jjd|�rDdSt||�}|jjd|�r`dSt	|�}|jjd|�rzdSt
||�}|jjd|�r�dSttj
d��dS)Nz&Lockdown not possible, sender not set.�context�uid�user�commandzlockdown is enabled)rZlockdown_enabledrrxrIZ	SystemBusrZaccess_checkrrrrrZ
ACCESS_DENIED)rK�senderZbusr�r�r�r�rPrPrQ�accessChecks$




zFirewallDConfig.accessCheckcCsF|dkrtjjd|��|jj�j|�}|dkrH|dkr>tj}tj|�S|dkrr|dkr`tj}nt	|�}tj
|�S|dkr�|dkr�tjr�dnd}tj|�S|dkr�|dkr�tjr�dnd}tj|�S|dk�r�|dk�r�tj
�r�dnd}tj|�S|dk�r|dk�rtj�rdnd}tj|�S|dk�rL|dk�rBtj�r>dnd}tj|�S|dk�rp|dk�rftj}tj|�S|d	k�r�|dk�r�tj}tj|�S|d
k�r�|dk�r�tj}tj|�S|dk�r�|dk�r�tj�r�dnd}tj|�S|dk�r|dk�r
tj�rdnd}tj|�S|d
k�rB|dk�r8tj�r4dnd}tj|�SdS)N�DefaultZoner%r!r"r$r#r&r'r(r)r*r+r,zDorg.freedesktop.DBus.Error.InvalidArgs: Property '%s' does not exist�yes�no)
r�r%r!r"r$r#r&r'r(r)r*r+r,)rI�
exceptions�
DBusExceptionr�get_firewalld_conf�getZ
FALLBACK_ZONE�StringZFALLBACK_MINIMAL_MARK�int�Int32ZFALLBACK_CLEANUP_ON_EXITZ FALLBACK_CLEANUP_MODULES_ON_EXITZFALLBACK_LOCKDOWNZFALLBACK_IPV6_RPFILTERZFALLBACK_INDIVIDUAL_CALLSZFALLBACK_LOG_DENIEDZFALLBACK_AUTOMATIC_HELPERSZFALLBACK_FIREWALL_BACKENDZFALLBACK_FLUSH_ALL_ON_RELOADZFALLBACK_RFC3964_IPV4ZFALLBACK_ALLOW_ZONE_DRIFTING)rK�prop�valuerPrPrQ�
_get_property+s|





























zFirewallDConfig._get_propertycCsT|dkrtj|j|��S|dkr0tj|j|��S|dkrHtj|j|��S|dkr`tj|j|��S|dkrxtj|j|��S|dkr�tj|j|��S|dkr�tj|j|��S|dkr�tj|j|��S|d	kr�tj|j|��S|d
k�r�tj|j|��S|dk�rtj|j|��S|dk�r&tj|j|��S|d
k�r@tj|j|��Stjjd|��dS)Nr�r%r!r"r$r#r&r'r(r)r*r+r,zDorg.freedesktop.DBus.Error.InvalidArgs: Property '%s' does not exist)rIr�r�r�r�r�)rKr�rPrPrQ�_get_dbus_propertyos:



z"FirewallDConfig._get_dbus_propertyZss�v)�in_signature�
out_signatureNcCsxt|t�}t|t�}tjd||�|tjjkr8|j|�S|tjjtjj	gkr^tj
jd|��ntj
jd|��|j|�S)Nzconfig.Get('%s', '%s')zDorg.freedesktop.DBus.Error.InvalidArgs: Property '%s' does not existzJorg.freedesktop.DBus.Error.UnknownInterface: Interface '%s' does not exist)r�strrrvrrIrJr��DBUS_INTERFACE_CONFIG_DIRECT�DBUS_INTERFACE_CONFIG_POLICIESr�r�)rK�interface_name�
property_namer�rPrPrQ�Get�s



zFirewallDConfig.Get�sza{sv}c
Csxt|t�}tjd|�i}|tjjkrDxBdD]}|j|�||<q,Wn&|tjjtjj	gkrZntj
jd|��tj|dd�S)Nzconfig.GetAll('%s')r�r%r!r"r$r#r&r'r(r)r*r+r,zJorg.freedesktop.DBus.Error.UnknownInterface: Interface '%s' does not existZsv)�	signature)
r�r%r!r"r$r#r&r'r(r)r*r+r,)
rr�rrvrrIrJr�r�r�r�r�Z
Dictionary)rKr�r��ret�xrPrPrQru�s"
zFirewallDConfig.GetAllZssv)r�cCs�t|t�}t|t�}t|�}tjd|||�|j|�|tjjk�r�|dk�rz|dkrv|j�dkrvt	t
jd||f��|dkr�|tjkr�t	t
jd||f��|dkr�|tj
kr�t	t
jd||f��|d	k�r�|j�dk�r�t	t
jd||f��|d
k�r|j�dk�rt	t
jd||f��|dk�rF|j�dk�rFt	t
jd||f��|jj�j||�|jj�j�|j|||ig�n|dk�r�ntjjd|��n8|tjjtjjgk�r�tjjd|��ntjjd|��dS)Nzconfig.Set('%s', '%s', '%s')r!r$r"r#r&r'r)r*r+r,r�r��true�falsez'%s' for %sr%r(zDorg.freedesktop.DBus.Error.InvalidArgs: Property '%s' does not existzJorg.freedesktop.DBus.Error.UnknownInterface: Interface '%s' does not exist)
r!r$r"r#r&r'r)r*r+r,)r!r$r"r#r&)r�r�r�r�)r�r�r�r�)r�r�r�r�)r�r�r�r�)r%r()rr�rrvr�rrIrJ�lowerrrZ
INVALID_VALUEZLOG_DENIED_VALUESZFIREWALL_BACKEND_VALUESr��set�writer|r�r�r�r�)rKr�r�Z	new_valuer�rPrPrQ�Set�sz










zFirewallDConfig.Setzsa{sv}as)r�cCs.t|t�}t|�}t|�}tjd|||�dS)Nz*config.PropertiesChanged('%s', '%s', '%s'))rr�rrv)rKr�Zchanged_propertiesZinvalidated_propertiesrPrPrQr|s

z!FirewallDConfig.PropertiesChanged)r�cs4tjd�tt|�j|j|jj��}t||t	j
j�S)Nzconfig.Introspect())rZdebug2r-r�
Introspectr0r/Zget_busrrrIrJ)rKr��data)rOrPrQr�s

zFirewallDConfig.IntrospectcCstjd�|jj�jj�S)Nz&config.policies.getLockdownWhitelist())rrvr�get_policies�lockdown_whitelist�
export_config)rKr�rPrPrQ�getLockdownWhitelists
z$FirewallDConfig.getLockdownWhitelistcCs@tjd�t|�}|jj�jj|�|jj�jj�|j�dS)Nz)config.policies.setLockdownWhitelist(...))	rrvrrr�r��
import_configr�r�)rKr�r�rPrPrQ�setLockdownWhitelist&s

z$FirewallDConfig.setLockdownWhitelistcCstjd�dS)Nz*config.policies.LockdownWhitelistUpdated())rrv)rKrPrPrQr�0sz(FirewallDConfig.LockdownWhitelistUpdatedcCs^t|�}tjd|�|j|�t|j��}||dkrBttj|��|dj	|�|j
|�dS)Nz1config.policies.addLockdownWhitelistCommand('%s')r)rrrvr�rzr�rr�ALREADY_ENABLEDr�r�)rKr�r�r�rPrPrQ�addLockdownWhitelistCommand7s
z+FirewallDConfig.addLockdownWhitelistCommandcCs^t|�}tjd|�|j|�t|j��}||dkrBttj|��|dj	|�|j
|�dS)Nz4config.policies.removeLockdownWhitelistCommand('%s')r)rrrvr�rzr�rr�NOT_ENABLEDrqr�)rKr�r�r�rPrPrQ�removeLockdownWhitelistCommandDs
z.FirewallDConfig.removeLockdownWhitelistCommand�bcCs$t|�}tjd|�||j�dkS)Nz3config.policies.queryLockdownWhitelistCommand('%s')r)rrrvr�)rKr�r�rPrPrQ�queryLockdownWhitelistCommandRsz-FirewallDConfig.queryLockdownWhitelistCommand�ascCstjd�|j�dS)Nz.config.policies.getLockdownWhitelistCommands()r)rrvr�)rKr�rPrPrQ�getLockdownWhitelistCommands[s
z,FirewallDConfig.getLockdownWhitelistCommandscCs^t|�}tjd|�|j|�t|j��}||dkrBttj|��|dj	|�|j
|�dS)Nz1config.policies.addLockdownWhitelistContext('%s')r)rrrvr�rzr�rrr�r�r�)rKr�r�r�rPrPrQ�addLockdownWhitelistContextds
z+FirewallDConfig.addLockdownWhitelistContextcCs^t|�}tjd|�|j|�t|j��}||dkrBttj|��|dj	|�|j
|�dS)Nz4config.policies.removeLockdownWhitelistContext('%s')r)rrrvr�rzr�rrr�rqr�)rKr�r�r�rPrPrQ�removeLockdownWhitelistContextqs
z.FirewallDConfig.removeLockdownWhitelistContextcCs$t|�}tjd|�||j�dkS)Nz3config.policies.queryLockdownWhitelistContext('%s')r)rrrvr�)rKr�r�rPrPrQ�queryLockdownWhitelistContextsz-FirewallDConfig.queryLockdownWhitelistContextcCstjd�|j�dS)Nz.config.policies.getLockdownWhitelistContexts()r)rrvr�)rKr�rPrPrQ�getLockdownWhitelistContexts�s
z,FirewallDConfig.getLockdownWhitelistContextscCs^t|�}tjd|�|j|�t|j��}||dkrBttj|��|dj	|�|j
|�dS)Nz.config.policies.addLockdownWhitelistUser('%s')�)rrrvr�rzr�rrr�r�r�)rKr�r�r�rPrPrQ�addLockdownWhitelistUser�s
z(FirewallDConfig.addLockdownWhitelistUsercCs^t|�}tjd|�|j|�t|j��}||dkrBttj|��|dj	|�|j
|�dS)Nz1config.policies.removeLockdownWhitelistUser('%s')r�)rrrvr�rzr�rrr�rqr�)rKr�r�r�rPrPrQ�removeLockdownWhitelistUser�s
z+FirewallDConfig.removeLockdownWhitelistUsercCs$t|�}tjd|�||j�dkS)Nz0config.policies.queryLockdownWhitelistUser('%s')r�)rrrvr�)rKr�r�rPrPrQ�queryLockdownWhitelistUser�sz*FirewallDConfig.queryLockdownWhitelistUsercCstjd�|j�dS)Nz+config.policies.getLockdownWhitelistUsers()r�)rrvr�)rKr�rPrPrQ�getLockdownWhitelistUsers�s
z)FirewallDConfig.getLockdownWhitelistUsers�icCs^t|�}tjd|�|j|�t|j��}||dkrBttj|��|dj	|�|j
|�dS)Nz+config.policies.addLockdownWhitelistUid(%d)�)rrrvr�rzr�rrr�r�r�)rKr�r�r�rPrPrQ�addLockdownWhitelistUid�s
z'FirewallDConfig.addLockdownWhitelistUidcCs^t|�}tjd|�|j|�t|j��}||dkrBttj|��|dj	|�|j
|�dS)Nz.config.policies.removeLockdownWhitelistUid(%d)r�)rrrvr�rzr�rrr�rqr�)rKr�r�r�rPrPrQ�removeLockdownWhitelistUid�s
z*FirewallDConfig.removeLockdownWhitelistUidcCs$t|�}tjd|�||j�dkS)Nz-config.policies.queryLockdownWhitelistUid(%d)r�)rrrvr�)rKr�r�rPrPrQ�queryLockdownWhitelistUid�sz)FirewallDConfig.queryLockdownWhitelistUidZaicCstjd�|j�dS)Nz*config.policies.getLockdownWhitelistUids()r�)rrvr�)rKr�rPrPrQ�getLockdownWhitelistUids�s
z(FirewallDConfig.getLockdownWhitelistUidsZaocCstjd�|jS)z"list ipsets objects paths
        zconfig.listIPSets())rrvrR)rKr�rPrPrQ�
listIPSets�s
zFirewallDConfig.listIPSetscCs4tjd�g}x|jD]}|j|jj�qWt|�S)zget ipset names
        zconfig.getIPSetNames())rrvrRr�r�r�rC)rKr�rRr�rPrPrQ�
getIPSetNames�s

zFirewallDConfig.getIPSetNames�ocCsFt|t�}tjd|�x|jD]}|jj|kr|SqWttj	|��dS)z-object path of ipset with given name
        zconfig.getIPSetByName('%s')N)
rr�rrvrRr�r�rrZ
INVALID_IPSET)rKrdr�r�rPrPrQ�getIPSetByName�s
zFirewallDConfig.getIPSetByNamecCsDt|t�}t|�}tjd|�|j|�|jj||�}|j|�}|S)z/add ipset with given name and settings
        zconfig.addIPSet('%s'))rr�rrvr�rZ	new_ipsetr^)rKrdr�r�r�r�rPrPrQ�addIPSet	s


zFirewallDConfig.addIPSetcCst|t�}tjd|�dS)Nzconfig.IPSetAdded('%s'))rr�rrv)rKrdrPrPrQr�s
zFirewallDConfig.IPSetAddedcCstjd�|jS)z%list icmptypes objects paths
        zconfig.listIcmpTypes())rrvrT)rKr�rPrPrQ�
listIcmpTypes s
zFirewallDConfig.listIcmpTypescCs4tjd�g}x|jD]}|j|jj�qWt|�S)zget icmptype names
        zconfig.getIcmpTypeNames())rrvrTr�r�r�rC)rKr�rTr�rPrPrQ�getIcmpTypeNames(s

z FirewallDConfig.getIcmpTypeNamescCsFt|t�}tjd|�x|jD]}|jj|kr|SqWttj	|��dS)z0object path of icmptype with given name
        zconfig.getIcmpTypeByName('%s')N)
rr�rrvrTr�r�rrZINVALID_ICMPTYPE)rKrer�r�rPrPrQ�getIcmpTypeByName3s
z!FirewallDConfig.getIcmpTypeByNamecCsDt|t�}t|�}tjd|�|j|�|jj||�}|j|�}|S)z2add icmptype with given name and settings
        zconfig.addIcmpType('%s'))rr�rrvr�rZnew_icmptyper_)rKrer�r�r�r�rPrPrQ�addIcmpType@s


zFirewallDConfig.addIcmpTypecCstjd|�dS)Nzconfig.IcmpTypeAdded('%s'))rrv)rKrerPrPrQr�OszFirewallDConfig.IcmpTypeAddedcCstjd�|jS)z$list services objects paths
        zconfig.listServices())rrvrV)rKr�rPrPrQ�listServicesVs
zFirewallDConfig.listServicescCs4tjd�g}x|jD]}|j|jj�qWt|�S)zget service names
        zconfig.getServiceNames())rrvrVr�r�r�rC)rKr�rVr�rPrPrQ�getServiceNames^s

zFirewallDConfig.getServiceNamescCsFt|t�}tjd|�x|jD]}|jj|kr|SqWttj	|��dS)z/object path of service with given name
        zconfig.getServiceByName('%s')N)
rr�rrvrVr�r�rrZINVALID_SERVICE)rKrfr�r�rPrPrQ�getServiceByNameis
z FirewallDConfig.getServiceByNamezs(sssa(ss)asa{ss}asa(ss))cCsDt|t�}t|�}tjd|�|j|�|jj||�}|j|�}|S)z1add service with given name and settings
        zconfig.addService('%s'))rr�rrvr�rZnew_servicer`)rKrfr�r�r�r�rPrPrQ�
addServicevs


zFirewallDConfig.addServicezsa{sv}cCsDt|t�}t|�}tjd|�|j|�|jj||�}|j|�}|S)z1add service with given name and settings
        zconfig.addService2('%s'))rr�rrvr�rZnew_service_dictr`)rKrfr�r�r�r�rPrPrQ�addService2�s


zFirewallDConfig.addService2cCstjd|�dS)Nzconfig.ServiceAdded('%s'))rrv)rKrfrPrPrQr��szFirewallDConfig.ServiceAddedcCstjd�|jS)z!list zones objects paths
        zconfig.listZones())rrvrX)rKr�rPrPrQ�	listZones�s
zFirewallDConfig.listZonescCs4tjd�g}x|jD]}|j|jj�qWt|�S)zget zone names
        zconfig.getZoneNames())rrvrXr�r�r�rC)rKr�rXr�rPrPrQ�getZoneNames�s

zFirewallDConfig.getZoneNamescCsFt|t�}tjd|�x|jD]}|jj|kr|SqWttj	|��dS)z,object path of zone with given name
        zconfig.getZoneByName('%s')N)
rr�rrvrXr�r�rrZINVALID_ZONE)rKrgr�r�rPrPrQ�
getZoneByName�s
zFirewallDConfig.getZoneByNamecCszt|t�}tjd|�g}x(|jD]}||jjkr"|j|jj�q"Wt	|�dkrjdj
|�d|t	|�fS|rv|dSdS)z4name of zone the given interface belongs to
        zconfig.getZoneOfInterface('%s')r� zE  (ERROR: interface '%s' is in %s zone XML files, can be only in one)rrs)rr�rrvrXr�Z
interfacesr�r�rk�join)rKZifacer�r�r�rPrPrQ�getZoneOfInterface�s
z"FirewallDConfig.getZoneOfInterfacecCszt|t�}tjd|�g}x(|jD]}||jjkr"|j|jj�q"Wt	|�dkrjdj
|�d|t	|�fS|rv|dSdS)z1name of zone the given source belongs to
        zconfig.getZoneOfSource('%s')rr�zB  (ERROR: source '%s' is in %s zone XML files, can be only in one)rrs)rr�rrvrXr�Zsourcesr�r�rkr)rK�sourcer�r�r�rPrPrQ�getZoneOfSource�s
zFirewallDConfig.getZoneOfSourcez's(sssbsasa(ss)asba(ssss)asasasasa(ss)b)cCsht|t�}t|�}tjd|�|j|�|ddkrLt|�}t|d<t|�}|jj	||�}|j
|�}|S)z.add zone with given name and settings
        zconfig.addZone('%s')��default)rr�rrvr�rzr�tuplerZnew_zonera)rKrgr�r�Z	_settingsr�r�rPrPrQ�addZone�s


zFirewallDConfig.addZonecCs`t|t�}t|�}tjd|�|j|�d|krD|ddkrDt|d<|jj||�}|j|�}|S)z.add zone with given name and settings
        zconfig.addZone('%s')�targetr)	rr�rrvr�rrZ
new_zone_dictra)rKrgr�r�r�r�rPrPrQ�addZone2�s


zFirewallDConfig.addZone2cCstjd|�dS)Nzconfig.ZoneAdded('%s'))rrv)rKrgrPrPrQr�szFirewallDConfig.ZoneAddedcCstjd�|jS)z$list policies objects paths
        zconfig.listPolicies())rrvr\)rKr�rPrPrQ�listPoliciess
zFirewallDConfig.listPoliciescCs4tjd�g}x|jD]}|j|jj�qWt|�S)zget policy names
        zconfig.getPolicyNames())rrvr\r�r�r�rC)rKr�Zpoliciesr�rPrPrQ�getPolicyNamess

zFirewallDConfig.getPolicyNamescCsFt|t�}tjd|�x|jD]}|jj|kr|SqWttj	|��dS)z.object path of policy with given name
        zconfig.getPolicyByName('%s')N)
rr�rrvr\r�r�rrZINVALID_POLICY)rKrir�r�rPrPrQ�getPolicyByName"s
zFirewallDConfig.getPolicyByNamecCsDt|t�}t|�}tjd|�|j|�|jj||�}|j|�}|S)z0add policy with given name and settings
        zconfig.addPolicy('%s'))rr�rrvr�rZnew_policy_object_dictrc)rKrir�r�r�r�rPrPrQ�	addPolicy/s


zFirewallDConfig.addPolicycCstjd|�dS)Nzconfig.PolicyAdded('%s'))rrv)rKrirPrPrQr�>szFirewallDConfig.PolicyAddedcCstjd�|jS)z#list helpers objects paths
        zconfig.listHelpers())rrvrZ)rKr�rPrPrQ�listHelpersGs
zFirewallDConfig.listHelperscCs4tjd�g}x|jD]}|j|jj�qWt|�S)zget helper names
        zconfig.getHelperNames())rrvrZr�r�r�rC)rKr�rZr�rPrPrQ�getHelperNamesOs

zFirewallDConfig.getHelperNamescCsFt|t�}tjd|�x|jD]}|jj|kr|SqWttj	|��dS)z.object path of helper with given name
        zconfig.getHelperByName('%s')N)
rr�rrvrZr�r�rrZINVALID_HELPER)rKrhr�r�rPrPrQ�getHelperByNameZs
zFirewallDConfig.getHelperByNamecCsDt|t�}t|�}tjd|�|j|�|jj||�}|j|�}|S)z0add helper with given name and settings
        zconfig.addHelper('%s'))rr�rrvr�rZ
new_helperrb)rKrhr�r�r�r�rPrPrQ�	addHelpergs


zFirewallDConfig.addHelpercCst|t�}tjd|�dS)Nzconfig.HelperAdded('%s'))rr�rrv)rKrhrPrPrQr�vs
zFirewallDConfig.HelperAddedcCstjd�|jj�j�S)Nzconfig.direct.getSettings())rrvr�
get_directr�)rKr�rPrPrQr�s
zFirewallDConfig.getSettingscCs<tjd�t|�}|jj�j|�|jj�j�|j�dS)Nzconfig.direct.update())rrvrrrr�r�r�)rKr�r�rPrPrQrr�s

zFirewallDConfig.updatecCstjd�dS)Nzconfig.direct.Updated())rrv)rKrPrPrQr��szFirewallDConfig.UpdatedZssscCs�t|�}t|�}t|�}tjd|||f�|j|�t|||f�}t|j��}||dkrrttj	d|||f��|dj
|�|j|�dS)Nz(config.direct.addChain('%s', '%s', '%s')rz chain '%s' already is in '%s:%s')rrrvr�rrzr�rrr�r�rr)rK�ipv�table�chainr��idxr�rPrPrQ�addChain�s
zFirewallDConfig.addChaincCs�t|�}t|�}t|�}tjd|||f�|j|�t|||f�}t|j��}||dkrrttj	d|||f��|dj
|�|j|�dS)Nz+config.direct.removeChain('%s', '%s', '%s')rzchain '%s' is not in '%s:%s')rrrvr�rrzr�rrr�rqrr)rKrrrr�rr�rPrPrQ�removeChain�s

zFirewallDConfig.removeChaincCsJt|�}t|�}t|�}tjd|||f�t|||f�}||j�dkS)Nz*config.direct.queryChain('%s', '%s', '%s')r)rrrvrr�)rKrrrr�rrPrPrQ�
queryChain�szFirewallDConfig.queryChaincCsft|�}t|�}tjd||f�g}x:|j�dD]*}|d|kr4|d|kr4|j|d�q4W|S)Nz#config.direct.getChains('%s', '%s')rrr�)rrrvr�r�)rKrrr�r�rrPrPrQ�	getChains�szFirewallDConfig.getChainsrsza(sss)cCstjd�|j�dS)Nzconfig.direct.getAllChains()r)rrvr�)rKr�rPrPrQ�getAllChains�s
zFirewallDConfig.getAllChainsZsssiasc	Cs�t|�}t|�}t|�}t|�}t|�}tjd||||dj|�f�|j|�|||||f}t|j��}||dkr�ttj	d||||f��|dj
|�|jt|��dS)Nz1config.direct.addRule('%s', '%s', '%s', %d, '%s')z','rz"rule '%s' already is in '%s:%s:%s')
rrrvrr�rzr�rrr�r�rrr)	rKrrr�priorityrLr�rr�rPrPrQ�addRule�s 
zFirewallDConfig.addRulec	Cs�t|�}t|�}t|�}t|�}t|�}tjd||||dj|�f�|j|�|||||f}t|j��}||dkr�ttj	d||||f��|dj
|�|jt|��dS)Nz4config.direct.removeRule('%s', '%s', '%s', %d, '%s')z','rzrule '%s' is not in '%s:%s:%s')
rrrvrr�rzr�rrr�rqrrr)	rKrrrrrLr�rr�rPrPrQ�
removeRule�s 
zFirewallDConfig.removeRulecCsdt|�}t|�}t|�}t|�}t|�}tjd||||dj|�f�|||||f}||j�dkS)Nz3config.direct.queryRule('%s', '%s', '%s', %d, '%s')z','r)rrrvrr�)rKrrrrrLr�rrPrPrQ�	queryRuleszFirewallDConfig.queryRulecCs�t|�}t|�}t|�}tjd|||f�|j|�t|j��}xF|ddd�D]2}|||f|d|d|dfkrT|dj|�qTW|jt|��dS)Nz+config.direct.removeRules('%s', '%s', '%s')rrr�)	rrrvr�rzr�rqrrr)rKrrrr�r�ZrulerPrPrQ�removeRuless
 zFirewallDConfig.removeRulesza(ias)cCs�t|�}t|�}t|�}tjd|||f�g}xN|j�dD]>}|d|kr>|d|kr>|d|kr>|j|d|df�q>W|S)Nz(config.direct.getRules('%s', '%s', '%s')rrr�r�r)rrrvr�r�)rKrrrr�r�rrPrPrQ�getRules)s$zFirewallDConfig.getRulesz	a(sssias)cCstjd�|j�dS)Nzconfig.direct.getAllRules()r)rrvr�)rKr�rPrPrQ�getAllRules8s
zFirewallDConfig.getAllRulesZsascCs�t|�}t|�}tjd|dj|�f�|j|�||f}t|j��}||dkrfttj	d||f��|dj
|�|j|�dS)Nz(config.direct.addPassthrough('%s', '%s')z','r�zpassthrough '%s', '%s')rrrvrr�rzr�rrr�r�rr)rKrrLr�rr�rPrPrQ�addPassthroughAs
zFirewallDConfig.addPassthroughcCs�t|�}t|�}tjd|dj|�f�|j|�||f}t|j��}||dkrfttj	d||f��|dj
|�|j|�dS)Nz+config.direct.removePassthrough('%s', '%s')z','r�zpassthrough '%s', '%s')rrrvrr�rzr�rrr�rqrr)rKrrLr�rr�rPrPrQ�removePassthroughSs
z!FirewallDConfig.removePassthroughcCs@t|�}t|�}tjd|dj|�f�||f}||j�dkS)Nz*config.direct.queryPassthrough('%s', '%s')z','r�)rrrvrr�)rKrrLr�rrPrPrQ�queryPassthroughdsz FirewallDConfig.queryPassthroughZaascCsNt|�}tjd|�g}x.|j�dD]}|d|kr(|j|d�q(W|S)Nz#config.direct.getPassthroughs('%s')r�rr)rrrvr�r�)rKrr�r�rrPrPrQ�getPassthroughsoszFirewallDConfig.getPassthroughsza(sas)cCstjd�|j�dS)Nz"config.direct.getAllPassthroughs()r�)rrvr�)rKr�rPrPrQ�getAllPassthroughs{s
z"FirewallDConfig.getAllPassthroughs)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)��__name__�
__module__�__qualname__�__doc__Z
persistentrrIZPK_ACTION_CONFIGZdefault_polkit_auth_requiredrr.r1rjror2r_r�rr`r�r�rar�r�rcr�r�r^r�r�rbr�r�rr�r�r�r	ZPROPERTIES_IFACEr�ru�slipZpolkitZrequire_authr�rf�signalr|ZPK_ACTION_INFOZINTROSPECTABLE_IFACEr�r�rZDBUS_SIGNATUREr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rJr�r�r�rr�r�r�r�r�rr�r�r�r�r�r�r�r�r�r�r�rrrr	r�r
rrr
r�rrrrrr�r�rr�rrr�rrrrrrrrr r!r"r#r$r%r&r'�
__classcell__rPrP)rOrQr>sv.				D!D	





	

	

	

	




	

	

	

	r):Z
gi.repositoryr�sys�modulesrArIZdbus.serviceZ	slip.dbusr,Zslip.dbus.serviceZfirewallrZfirewall.core.baserZfirewall.core.watcherrZfirewall.core.loggerrZfirewall.server.decoratorsrrr	Zfirewall.server.config_icmptyper
Zfirewall.server.config_servicerZfirewall.server.config_zonerZfirewall.server.config_policyr
Zfirewall.server.config_ipsetrZfirewall.server.config_helperrZfirewall.core.io.icmptyperZfirewall.core.io.ipsetrZfirewall.core.io.helperrZ#firewall.core.io.lockdown_whitelistrZfirewall.core.io.directrZfirewall.dbus_utilsrrrrrrrrZfirewall.errorsrrfZObjectrrPrPrPrQ�<module>s6
$site-packages/firewall/server/__pycache__/server.cpython-36.opt-1.pyc000064400000004547147511334700021521 0ustar003

]ûf��@s�dgZddlZddlZddlmZmZeejd<ddlZddlZddl	Zddl
Zddlm
Z
ddlmZddlmZdd	�Zd
d�Zdd
d�ZdS)�
run_server�N)�GObject�GLibZgobject)�config)�log)�	FirewallDcCs|j�dS)NT)�reload)�service�r
�/usr/lib/python3.6/server.py�sighup4srcCs|j�dS)N)�quit)�mainloopr
r
r�sigterm8srFcsxd}|rFddlm�ddl��j��j�j�d�����fdd��y�tjjj	dd�tj
�}tjjt
jj|d	�}t|t
jj�}tj�}tjjj|�|r�tj���ttd
�r�tj}ntj}|tjtjt|�|tjtjt|�|j�Wnvt k
�rt!j"d�YnXt#k
�r,t!j$d�Yn:t%k
�rd}zt!j$d
|j&j't(|��WYdd}~XnX|�rt|j)�dS)zI Main function for firewall server. Handles D-Bus and GLib mainloop.
    Nr)�pformat�
csr�j�t�j�dkrbtd�tdt�j��x(�jD]}tt|�d�t�|��q8Wtd�tj���dS)NrzP
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
zGARBAGE OBJECTS (%d):
z
  zP
<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
)Zcollect�lenZgarbage�print�typer�timeout_add_seconds)�x)�gc�
gc_collect�
gc_timeoutrr
rrLszrun_server.<locals>.gc_collectT)Zset_as_default)�bus�unix_signal_addz
Stopping..z Raising SystemExit in run_serverzException %s: %s)*�pprintrr�enableZ	set_debugZ
DEBUG_LEAK�dbusrZglibZ
DBusGMainLoopZ	SystemBusr	ZBusNamerZDBUS_INTERFACErZ	DBUS_PATHrZMainLoop�slipZset_mainloopr�hasattrrZunix_signal_add_fullZ
PRIORITY_HIGH�signal�SIGHUPr�SIGTERMrZrun�KeyboardInterruptrZdebug1�
SystemExit�error�	Exception�	__class__�__name__�str�stop)Zdebug_gcr	r�namerr�er
)rrrrrrAsB



()F)�__all__�sysr!Z
gi.repositoryrr�modulesrZdbus.serviceZdbus.mainloop.glibZ	slip.dbusrZfirewallrZfirewall.core.loggerrZfirewall.server.firewalldrrrrr
r
r
r�<module>s
	site-packages/firewall/server/__pycache__/__init__.cpython-36.pyc000064400000000161147511334700020777 0ustar003

]ûf�@sdS)N�rrr�/usr/lib/python3.6/__init__.py�<module>ssite-packages/firewall/server/__pycache__/firewalld.cpython-36.opt-1.pyc000064400000217455147511334700022170 0ustar003

]ûfb��@sVdgZddlmZmZddlZeejd<ddlZddlZddlZddl	Z
ddlZ
ddlm
Z
ddlmZddlmZddlmZdd	lmZdd
lmZmZmZmZddlmZddlmZmZm Z m!Z!m"Z"m#Z#m$Z$dd
l%m&Z&ddl'm(Z(ddl)m*Z*ddl+m,Z,ddl-m.Z.m/Z/m0Z0ddl1m2Z2ddlm3Z3ddl4m5Z5Gdd�de
jj6j7�Z8dS)�	FirewallD�)�GLib�GObjectNZgobject)�config)�Firewall)�	Rich_Rule)�log)�FirewallClientZoneSettings)�dbus_handle_exceptions�dbus_service_method�handle_exceptions�FirewallDBusException)�FirewallDConfig)�dbus_to_python�command_of_sender�context_of_sender�
uid_of_sender�user_of_uid�%dbus_introspection_prepare_properties�!dbus_introspection_add_properties)�check_config)�IPSet)�IcmpType)�Helper)�nm_get_bus_name�nm_get_connection_of_interface�nm_set_zone_of_connection)�ifcfg_set_zone_of_interface)�errors)�
FirewallErrorcs�"eZdZdZdZejjZe	�fdd��Z
dd�Ze	dd��Ze	d	d
��Z
edd��Zed
d��Zedd��Zedd��Zedd��Zeejddd�e�d�dd���Zeejddd�e�d�dd���Zejjjejj�eejdd �e�d�d!d"����Zejjejd#d$�d%d&��Zejjjejj�eej dd'�e�d��fd(d)�	���Z!ejjjejj�eejj"d*d*d�e�d�d+d,����Z#ejjjejj�eejj"d*d*d�e�d�d-d.����Z$ejjejj"�ed/d0���Z%ejjjejj�eejj"d*d*d�e�d�d1d2����Z&ejjjejj�eejj"d*d*d�e�d�d3d4����Z'ejjjejj(�eejj)d*d*d�e�d�d5d6����Z*ejjjejj(�eejj)d*d*d�e�d�d7d8����Z+ejjjejj,�eejj)d*d9d�e�d�d:d;����Z-ejjejj)d*d$�ed<d=���Z.ejjejj)d*d$�ed>d?���Z/ejjjejj(�eejj)dd*d�e�d�d@dA����Z0ejjjejj(�eejj)dd*d�e�d�dBdC����Z1ejjjejj,�eejj)dd9d�e�d�dDdE����Z2ejjjejj,�eejj)d*dFd�e�d�dGdH����Z3ejjejj)dd$�edIdJ���Z4ejjejj)dd$�edKdL���Z5ejjjejj(�eejj)dMd*d�e�d�dNdO����Z6ejjjejj(�eejj)dMd*d�e�d�dPdQ����Z7ejjjejj,�eejj)dMd9d�e�d�dRdS����Z8ejjjejj,�eejj)d*dTd�e�d�dUdV����Z9ejjejj)dMd$�edWdX���Z:ejjejj)dMd$�edYdZ���Z;ejjjejj(�eejj)dd*d�e�d�d[d\����Z<ejjjejj(�eejj)dd*d�e�d�d]d^����Z=ejjjejj,�eejj)dd9d�e�d�d_d`����Z>ejjjejj,�eejj)d*dFd�e�d�dadb����Z?ejjejj)dd$�edcdd���Z@ejjejj)dd$�ededf���ZAejjjejj(�eejj)dd*d�e�d�dgdh����ZBejjjejj(�eejj)dd*d�e�d�didj����ZCejjjejj,�eejj)dd9d�e�d�dkdl����ZDejjjejj,�eejj)d*dFd�e�d�dmdn����ZEejjejj)dd$�edodp���ZFejjejj)dd$�edqdr���ZGejjjejj�eejj"d*d*d�e�d�dsdt����ZHejjjejj�eejj"d*d*d�e�d�dudv����ZIejjjejj�eejj"d*d9d�e�d�dwdx����ZJejjejj"d*d$�edydz���ZKejjejj"d*d$�ed{d|���ZLejjjejjM�eejj"dd}d�e�d�d~d����ZNejjjejjM�eejjOddd�e�d�d�d�����ZPejjjejjM�eejjOd�d �e�d�d�d�����ZQejjejjOd�d$�ed�d����ZRejjjejjM�eejjSddd�e�d�d�d�����ZTejjjejjM�eejjSd�d �e�d�d�d�����ZUejjejjSd�d$�ed�d����ZVejjjejj�eejj"d*dFd�e�d�d�d�����ZWejjjejjM�eejj"dd�d�e�d�d�d�����ZXejjjejjM�eejj"ddd�e�d�d�d�����ZYejjjejj�eejj"d*dFd�e�d�d�d�����ZZejjjejjM�eejj"de[j\d�e�d�d�d�����Z]ejjjejjM�eejj"d*dd�e�d�d�d�����Z^ejjjejj�eejj"dd*d�e�d�d�d�����Z_ejjejj"dd$�ed�d����Z`ejjjejjM�eejj"d*dd�e�d�d�d�����Zaejjjejj�eejj"dd*d�e�d�d�d�����Zbejjejj"dd$�ed�d����Zcejjjejj�eejj"d*dd�e�d�d�d�����Zdejjjejj�eejj"dd*d�e�d�d�d�����Zeejjejj"dd$�ed�d����Zfejjjejj�eejjSd*dFd�e�d�d�d�����Zgejjjejj�eejjSd*d�d�e�d�d�d�����Zhejjjejj�eejjOd*dFd�e�d�d�d�����Ziejjjejj�eejjOd*d�d�e�d�d�d�����Zjejjjejj�eejjOddd�e�d�d�d�����Zkejjjejj�eejjOddd�e�d�d�d�����ZlejjjejjM�eejjOdd9d�e�d�d�d�����Zmejjjejj�eejjOddd�e�d�d�d�����Znejjjejj�eejjOddd�e�d�d�d�����Zoejjjejj�eejjOddd�e�d�d�d�����Zpejjjejj�eejjOddd�e�d�d�d�����ZqejjjejjM�eejjOdd9d�e�d�d�d„���ZrejjjejjM�eejjOddFd�e�d�d�dĄ���ZsejjejjOdd$�ed�dƄ��ZtejjejjOdd$�ed�dȄ��ZuejjejjOdd$�ed�dʄ��ZvejjejjOdd$�ed�d̄��Zwejjjejj�eejjOddd�e�d�d�d΄���Zxejjjejj�eejjOddd�e�d�d�dЄ���Zyejjjejj�eejjOddd�e�d�d�d҄���ZzejjjejjM�eejjOdd9d�e�d�d�dԄ���Z{ejjjejjM�eejjOddFd�e�d�d�dք���Z|ejjejjOdd$�ed�d؄��Z}ejjejjOdd$�ed�dڄ��Z~ejjejjOdd$�ed�d܄��Zed�dބ�Z�ejjjejj�eejjOd�dd�e�d�d�d����Z�ejjjejj�eejjOddd�e�d�d�d����Z�ejjjejjM�eejjOdd9d�e�d�d�d����Z�ejjjejjM�eejjOddFd�e�d�d�d����Z�ejjejjOd�d$�ed�d���Z�ejjejjOdd$�ed�d���Z�ed�d��Z�ejjjejj�eejjOd�dd�e�d�d�d����Z�ejjjejj�eejjOddd�e�d�d�d����Z�ejjjejjM�eejjOdd9d�e�d�d�d����Z�ejjjejjM�eejjOddFd�e�d�d�d�����Z�ejjejjOd�d$�ed�d����Z�ejjejjOdd$�ed�d����Z�ed�d���Z�ejjjejj�eejjOd�dd�e�d�d�d�����Z�ejjjejj�eejjOd�dd�e�d��d�d����Z�ejjjejjM�eejjOd�d9d�e�d��d�d����Z�ejjjejjM�eejjOd�dd�e�d��d�d����Z�ejjejjOd�d$�e�d��d�d	���Z�ejjejjOd�d$�e�d
�d���Z�e�d�d
��Z�ejjjejj�eejjOd�dd�e�d��d�d����Z�ejjjejj�eejjOddd�e�d��d�d����Z�ejjjejjM�eejjOdd9d�e�d��d�d����Z�ejjjejjM�eejjOddFd�e�d��d�d����Z�ejjejjOd�d$�e�d��d�d���Z�ejjejjOdd$�e�d�d���Z�e�d�d��Z�ejjjejj�eejjOd�dd�e�d��d�d����Z�ejjjejj�eejjOd�dd�e�d�d�d����Z�ejjjejjM�eejjOd�d9d�e�d�d �d!����Z�ejjjejjM�eejjOd�dd�e�d�d"�d#����Z�ejjejjOd�d$�e�d�d$�d%���Z�ejjejjOd�d$�e�d&�d'���Z�e�d(�d)��Z�ejjjejj�eejjO�d*dd�e�d�d+�d,����Z�ejjjejj�eejjOddd�e�d�d-�d.����Z�ejjjejjM�eejjOdd9d�e�d�d/�d0����Z�ejjejjO�d*d$�e�d�d1�d2���Z�ejjejjOdd$�e�d3�d4���Z�e�d5�d6��Z�ejjjejj�eejjO�d7dd�e�d�d8�d9����Z�ejjjejj�eejjO�d:dd�e�d	�d;�d<����Z�ejjjejjM�eejjO�d:d9d�e�d
�d=�d>����Z�ejjjejjM�eejjOd�dd�e�d�d?�d@����Z�ejjejjO�d7d$�e�d�dA�dB���Z�ejjejjO�d:d$�e�dC�dD���Z�e�dE�dF��Z�ejjjejj�eejjOd�dd�e�d
�dG�dH����Z�ejjjejj�eejjOddd�e�d�dI�dJ����Z�ejjjejjM�eejjOdd9d�e�d�dK�dL����Z�ejjjejjM�eejjOddFd�e�d�dM�dN����Z�ejjejjOd�d$�e�d�dO�dP���Z�ejjejjOdd$�e�dQ�dR���Z�ejjjejj�eejjOddd�e�d�dS�dT����Z�ejjjejj�eejjOddd�e�d�dU�dV����Z�ejjjejjM�eejjOdd9d�e�d�dW�dX����Z�ejjejjOdd$�e�dY�dZ���Z�ejjejjOdd$�e�d[�d\���Z�ejjjejj��eejj�d�d*d�e�d�d]�d^����Z�ejjjejj��eejj�d�d*d�e�d�d_�d`����Z�ejjjejj��eejj�d�d9d�e�d�da�db����Z�ejjjejj��eejj�ddFd�e�d�dc�dd����Z�ejjjejj��eejj�d*�ded�e�d�df�dg����Z�ejjejj�d�d$�e�dh�di���Z�ejjejj�d�d$�e�dj�dk���Z�ejjjejj��eejj��dld*d�e�d�dm�dn����Z�ejjjejj��eejj��dld*d�e�d�do�dp����Z�ejjjejj��eejj�d�d*d�e�d�dq�dr����Z�ejjjejj��eejj��dld9d�e�d�ds�dt����Z�ejjjejj��eejj�d��dud�e�d�dv�dw����Z�ejjjejj��eejj�d*�dxd�e�d�dy�dz����Z�ejjejj��dld$�e�d{�d|���Z�ejjejj��dld$�e�d}�d~���Z�ejjjejj��eejj��ddd�e�d �d��d�����Z�ejjjejj��eejj��dd*d�e�d!�d��d�����Z�ejjjejj��eejj��dd*d�e�d"�d��d�����Z�ejjjejj��eejj��dd9d�e�d#�d��d�����Z�ejjjejj��eejj�d*�d�d�e�d$�d��d�����Z�ejjjejj��eejj�d*d*d�e�d%�d��d�����Z�ejjjejj��eejj�d�dd�e�d&�d��d�����Z�ejjejj��dd$�e�d��d����Z�ejjejj��dd$�e�d��d����Z�ejjjejj׃eejj"d*d*d�e�d'�d��d�����Z�ejjjejj�eejj�dd9d�e�d(�d��d�����Z�ejjjejj�eejj�d*dFd�e�d)�d��d�����Z�ejjjejjM�eejj�de�j\d�e�d*�d��d�����Z�ejjjejj�eejj�dd*d�e�d+�d��d�����Z�ejjjejj�eejj�dd*d�e�d,�d��d�����Z�ejjjejj�eejj�dd9d�e�d-�d��d�����Z�ejjjejj�eejj�ddFd�e�d.�d��d�����Z�ejjjejj�eejjِdd �e�d/�d��d�����Z�ejjejj�dd$�e�d��d����Z�ejjejj�dd$�e�d��d����Z�ejjjejj�eejj"d*dFd�e�d0�d��d�����Z�ejjjejjM�eejj"de�j\d�e�d1�d��d�����Z�Z�S(2rzFirewallD main classTcs`tt|�j||�t�|_|d|_|d|_|j�t|t	j
j�t|jj	|jt	j
j
�|_	dS)Nr�)�superr�__init__r�fw�busname�path�startrr�dbus�DBUS_INTERFACErZDBUS_PATH_CONFIG)�self�args�kwargs)�	__class__��/usr/lib/python3.6/firewalld.pyr"Is

zFirewallD.__init__cCs|j�dS)N)�stop)r)r-r-r.�__del__TszFirewallD.__del__cCstjd�i|_|jj�S)Nzstart())r�debug1�	_timeoutsr#r&)r)r-r-r.r&Ws
zFirewallD.startcCstjd�|jj�S)Nzstop())rr1r#r/)r)r-r-r.r/_s
zFirewallD.stopcCs�|jjj�r�|dkr"tjd�dStj�}t||�}|jjjd|�rHdSt	||�}|jjjd|�rfdSt
|�}|jjjd|�r�dSt||�}|jjjd|�r�dStt
jd��dS)Nz&Lockdown not possible, sender not set.�context�uid�user�commandzlockdown is enabled)r#�policies�query_lockdownr�errorr'Z	SystemBusrZaccess_checkrrrrrZ
ACCESS_DENIED)r)�senderZbusr3r4r5r6r-r-r.�accessCheckhs$



zFirewallD.accessCheckcCs&||jkri|j|<||j||<dS)N)r2)r)�zone�x�tagr-r-r.�
addTimeouts

zFirewallD.addTimeoutcCs<||jkr8||j|kr8tj|j||�|j||=dS)N)r2r�
source_remove)r)r<r=r-r-r.�
removeTimeout�szFirewallD.removeTimeoutcCsTxD|jD]:}x&|j|D]}tj|j||�qW|j|j�qW|jj�dS)N)r2rr@�clear)r)r<r=r-r-r.�cleanup_timeouts�s
zFirewallD.cleanup_timeoutscCsd|dkrtjtj�S|dkr6tjdtjjtjjf�S|dkrNtj|jj��S|dkrhtj|jj	d��S|dkr�tj
|jjd�S|d	kr�tj|jj	d
��S|dkr�tj|jj�S|dkr�tj
|jj
d�S|d
kr�tj|jj�S|dk�r�tj|jj�S|dk�rtj
|jjd�S|dk�r$tjd�S|dk�r:tjid�S|dk�rPtjid�Stjjd|��dS)N�version�interface_versionz%d.%d�state�IPv4�ipv4�
IPv4ICMPTypes�s�IPv6�ipv6�
IPv6_rpfilter�
IPv6ICMPTypes�BRIDGEr�
IPSetTypes�nf_conntrack_helper_settingF�nf_conntrack_helpers�sas�nf_nat_helperszDorg.freedesktop.DBus.Error.InvalidArgs: Property '%s' does not exist)r'�Stringr�VERSIONZDBUS_INTERFACE_VERSIONZDBUS_INTERFACE_REVISIONr#Z	get_stateZBooleanZis_ipv_enabledZArrayZipv4_supported_icmp_typesZipv6_rpfilter_enabledZipv6_supported_icmp_typesZebtables_enabledZ
ipset_enabledZipset_supported_types�
Dictionary�
exceptions�
DBusException)r)Zpropr-r-r.�
_get_property�s@





zFirewallD._get_propertyZss�v)�in_signature�
out_signatureNcCs~t|t�}t|t�}tjd||�|tjjkr8|j|�S|tjjtjj	tjj
tjjgkrjtjj
d|��ntjj
d|��dS)NzGet('%s', '%s')zDorg.freedesktop.DBus.Error.InvalidArgs: Property '%s' does not existzJorg.freedesktop.DBus.Error.UnknownInterface: Interface '%s' does not exist)r�strrr1rr'r(rZ�DBUS_INTERFACE_ZONE�DBUS_INTERFACE_DIRECT�DBUS_INTERFACE_POLICIES�DBUS_INTERFACE_IPSETrXrY)r)�interface_name�
property_namer:r-r-r.�Get�s



z
FirewallD.GetrJza{sv}cCs�t|t�}tjd|�i}|tjjkrDxNdD]}|j|�||<q,Wn2|tjjtjj	tjj
tjjgkrfntjj
d|��tj|dd�S)NzGetAll('%s')rDrErFrGrKrMrOrrPrQrRrTrIrNzJorg.freedesktop.DBus.Error.UnknownInterface: Interface '%s' does not existZsv)�	signature)rDrErFrGrKrMrOrrPrQrRrTrIrN)rr^rr1rr'r(rZr_r`rarbrXrYrW)r)rcr:�retr=r-r-r.�GetAll�s&
zFirewallD.GetAllZssv)r\cCs�t|t�}t|t�}t|�}tjd|||�|j|�|tjjkrn|dkr\tjj	d|��q�tjj	d|��nB|tjj
tjjtjjtjj
gkr�tjj	d|��ntjj	d|��dS)NzSet('%s', '%s', '%s')rDrErFrGrKrMrOrrPrQrRrTrIrNzGorg.freedesktop.DBus.Error.PropertyReadOnly: Property '%s' is read-onlyzDorg.freedesktop.DBus.Error.InvalidArgs: Property '%s' does not existzJorg.freedesktop.DBus.Error.UnknownInterface: Interface '%s' does not exist)rDrErFrGrKrMrOrrPrQrRrTrIrN)rr^rr1r;rr'r(rXrYr_r`rarb)r)rcrdZ	new_valuer:r-r-r.�Set�s:






z
FirewallD.Setzsa{sv}as)rfcCs.t|t�}t|�}t|�}tjd|||�dS)Nz#PropertiesChanged('%s', '%s', '%s'))rr^rr1)r)rcZchanged_propertiesZinvalidated_propertiesr-r-r.�PropertiesChangeds

zFirewallD.PropertiesChanged)r]cs4tjd�tt|�j|j|jj��}t||t	j
j�S)NzIntrospect())rZdebug2r!r�
Introspectr%r$Zget_busrrr'r()r)r:�data)r,r-r.rk&s

zFirewallD.Introspect�cCs*tjd�|jj�|jj�|j�dS)z#Reload the firewall rules.
        zreload()N)rr1r#�reloadr�Reloaded)r)r:r-r-r.rn4s


zFirewallD.reloadcCs,tjd�|jjd�|jj�|j�dS)z�Completely reload the firewall.

        Completely reload the firewall: Stops firewall, unloads modules and 
        starts the firewall again.
        zcompleteReload()TN)rr1r#rnrro)r)r:r-r-r.�completeReloadCs


zFirewallD.completeReloadcCstjd�dS)Nz
Reloaded())rr1)r)r-r-r.roSszFirewallD.ReloadedcCstjd�t|j�dS)z&Check permanent configuration
        zcheckPermanentConfig()N)rr1rr#)r)r:r-r-r.�checkPermanentConfigXs
zFirewallD.checkPermanentConfigc
runtimeToPermanent() ("Make runtime configuration permanent"): walks the runtime services, icmptypes, ipsets, zones (skipping interface bindings that were added by NetworkManager), policies, helpers, the direct configuration and the lockdown whitelist, copying each one into the permanent configuration. Per object it logs either "... is identical, ignoring." or "Creating ...", records failures as "Runtime To Permanent failed on ...", and raises RT_TO_PERM_FAILED if anything went wrong.
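A short sketch of driving that method from a client, again assuming the standard firewalld bus name and object path rather than anything read from this dump:

import dbus

bus = dbus.SystemBus()
proxy = bus.get_object("org.fedoraproject.FirewallD1",
                       "/org/fedoraproject/FirewallD1")
fw = dbus.Interface(proxy, dbus_interface="org.fedoraproject.FirewallD1")

fw.checkPermanentConfig()   # raises a D-Bus error if the permanent configuration is broken
fw.runtimeToPermanent()     # copy the current runtime state into the permanent configuration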
Lockdown handling: enableLockdown(), disableLockdown() and queryLockdown() (whose docstring "Retuns True if lockdown is enabled" should read "Returns") with the LockdownEnabled/LockdownDisabled signals, followed by the lockdown whitelist: add/remove/query/get methods and Added/Removed signals for commands, UIDs, users and SELinux contexts. The block ends with enablePanicMode() ("Enable panic mode. All ingoing and outgoing connections and packets will be blocked.").
Next come disablePanicMode() ("Enables normal mode: Allowed ingoing and outgoing connections will not be blocked anymore"), queryPanicMode() and the PanicModeEnabled/PanicModeDisabled signals; runtime zone and policy settings (getZoneSettings, getZoneSettings2, setZoneSettings2, getPolicySettings, setPolicySettings with the ZoneUpdated/PolicyUpdated signals); listServices, getServiceSettings, getServiceSettings2, listIcmpTypes and getIcmpTypeSettings; getLogDenied/setLogDenied (the setter triggers a reload and LogDeniedChanged), getAutomaticHelpers/setAutomaticHelpers (the getter simply returns "no"); getDefaultZone/setDefaultZone with DefaultZoneChanged; getPolicies/getActivePolicies, getZones/getActiveZones, getZoneOfInterface ("Return the zone an interface belongs to.") and getZoneOfSource. The remainder is the per-zone runtime API: interfaces, sources, rich rules, services, ports, protocols, source ports, masquerade, forward ports, ICMP blocks and ICMP block inversion, each with add/remove/query/get methods, matching Added/Removed signals, and optional timeouts implemented through timeout_add_seconds plus the disableTimed* callbacks.
The last part of the class covers the direct interface (addChain/removeChain/queryChain/getChains/getAllChains; addRule/removeRule/removeRules/queryRule/getRules/getAllRules; passthrough, addPassthrough, removePassthrough, queryPassthrough, getAllPassthroughs, removeAllPassthroughs, getPassthroughs; plus the ChainAdded/Removed, RuleAdded/Removed and PassthroughAdded/Removed signals), authorizeAll() ("PK_ACTION_ALL implies all other actions, i.e. once a subject is authorized for PK_ACTION_ALL it's also authorized for any other action. Use-case is GUI (RHBZ#994729)."), the runtime ipset calls (queryIPSet, getIPSets, getIPSetSettings, addEntry, removeEntry, queryEntry, getEntries, setEntries and the EntryAdded/EntryRemoved signals) and getHelpers/getHelperSettings. It is followed by the compiled method and signature table of the class and the module footer: __all__ plus imports of gi.repository, dbus.service, slip.dbus, firewall.config, firewall.core.fw, firewall.core.rich, firewall.core.logger, firewall.client, firewall.server.decorators, firewall.server.config, firewall.dbus_utils, firewall.core.io.functions/ipset/icmptype/helper, firewall.core.fw_nm, firewall.core.fw_ifcfg and firewall.errors.
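A sketch of the direct interface summarised above, adding and checking one iptables-style rule. The interface name "org.fedoraproject.FirewallD1.direct" is an assumption; the argument order (ipv, table, chain, priority, args) mirrors the addRule/queryRule fragments visible in the dump.

import dbus

bus = dbus.SystemBus()
proxy = bus.get_object("org.fedoraproject.FirewallD1",
                       "/org/fedoraproject/FirewallD1")
direct = dbus.Interface(proxy,
                        dbus_interface="org.fedoraproject.FirewallD1.direct")

args = ["-p", "tcp", "--dport", "9000", "-j", "ACCEPT"]
direct.addRule("ipv4", "filter", "INPUT", 0, args)          # priority 0 rule in the INPUT chain
print(direct.queryRule("ipv4", "filter", "INPUT", 0, args)) # True once the rule is present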
site-packages/firewall/server/__pycache__/config_icmptype.cpython-36.pyc
[compiled bytecode: FirewallDConfigIcmpType, the per-icmptype configuration object (a slip.dbus.service.Object). Readable fragments: the read-only properties name, filename, path, default and builtin served through Get/GetAll/Set and PropertiesChanged; Introspect(); getSettings() ("get settings for icmptype"), update() ("update settings for icmptype"), loadDefaults() ("load default settings for builtin icmptype"), remove() ("remove icmptype") and rename() ("rename icmptype") with the Updated/Removed/Renamed signals; and get/set plus add/remove/query helpers for version, short, description and destinations (destination handling is limited to ipv4 and ipv6). The footer imports firewall.config, firewall.dbus_utils, firewall.core.io.icmptype, firewall.core.logger, firewall.server.decorators and firewall.errors.]
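A sketch of reading those per-icmptype objects from a client. The config object path, the interface names and the use of listIcmpTypes() to obtain object paths follow the usual firewalld D-Bus layout and are assumptions here; only the property names (name, builtin, ...) are readable in the dump.

import dbus

bus = dbus.SystemBus()
cfg = bus.get_object("org.fedoraproject.FirewallD1",
                     "/org/fedoraproject/FirewallD1/config")
config = dbus.Interface(cfg, dbus_interface="org.fedoraproject.FirewallD1.config")

for path in config.listIcmpTypes():                 # one object path per configured icmptype
    obj = bus.get_object("org.fedoraproject.FirewallD1", path)
    props = dbus.Interface(obj, dbus_interface="org.freedesktop.DBus.Properties")
    name = props.Get("org.fedoraproject.FirewallD1.config.icmptype", "name")
    builtin = props.Get("org.fedoraproject.FirewallD1.config.icmptype", "builtin")
    print(name, "(builtin)" if builtin else "(custom)")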
site-packages/firewall/server/__pycache__/config_icmptype.cpython-36.opt-1.pyc
[compiled bytecode: the optimisation-level-1 variant of the file above; its readable content is identical to config_icmptype.cpython-36.pyc, so it is not repeated here.]
site-packages/firewall/server/__pycache__/config_zone.cpython-36.pyc
[compiled bytecode: FirewallDConfigZone, the per-zone configuration object. Readable fragments: the same read-only properties (name, filename, path, default, builtin) and Properties/Introspect plumbing; getSettings()/getSettings2(), which report DEFAULT_ZONE_TARGET as "default"; update()/update2(), both guarded by _checkDuplicateInterfacesSources(), whose docstring survives: "Assignment of interfaces/sources to zones is different from other zone settings in the sense that particular interface/zone can be part of only one zone. So make sure added interfaces/sources have not already been bound to another zone." (violations raise ZONE_CONFLICT); loadDefaults(), remove() and rename() with the Updated/Removed/Renamed signals; and get/set plus add/remove/query helpers for version, short, description, target, services, ports and source ports (merging ranges via portInPortRange, coalescePortRange and breakPortRange), protocols, ICMP blocks, ICMP block inversion, masquerade, forward ports and interfaces (addInterface/removeInterface also call ifcfg_set_zone_of_interface to keep ifcfg files in sync). The dump is cut off inside queryInterface().]
z"FirewallDConfigZone.queryInterfacecCstjd|j�|j�dS)Nz%s.getSources()�)rr6rrI)r!r8r%r%r&�
getSources�szFirewallDConfigZone.getSourcescCsNt|t�}tjd|jdj|��|jj|�t|j��}||d<|j	|�dS)Nz%s.setSources('[%s]')rir�)
rrErr6rrjrr?rIrT)r!rNr8rGr%r%r&�
setSources�s

zFirewallDConfigZone.setSourcescCsft|t�}tjd|j|�|jj|�t|j��}||dkrJt	t
j|��|dj|�|j
|�dS)Nz%s.addSource('%s')r�)rr5rr6rrr?rErIrrrlrmrT)r!rQr8rGr%r%r&�	addSource�s
zFirewallDConfigZone.addSourcecCsft|t�}tjd|j|�|jj|�t|j��}||dkrJt	t
j|��|dj|�|j
|�dS)Nz%s.removeSource('%s')r�)rr5rr6rrr?rErIrrrprWrT)r!rQr8rGr%r%r&�removeSource�s
z FirewallDConfigZone.removeSourcecCs*t|t�}tjd|j|�||j�dkS)Nz%s.querySource('%s')r�)rr5rr6rrI)r!rQr8r%r%r&�querySources
zFirewallDConfigZone.querySourcecCstjd|j�|j�dS)Nz%s.getRichRules()�)rr6rrI)r!r8r%r%r&�getRichRulessz FirewallDConfigZone.getRichRulescCs\t|t�}tjd|jdj|��|jj|�t|j��}dd�|D�}||d<|j	|�dS)Nz%s.setRichRules('[%s]')ricSsg|]}tt|d���qS))�rule_str)r5r
)rv�rr%r%r&r�sz4FirewallDConfigZone.setRichRules.<locals>.<listcomp>r�)
rrErr6rrjrr?rIrT)r!Zrulesr8rGr%r%r&�setRichRuless

z FirewallDConfigZone.setRichRulescCstt|t�}tjd|j|�|jj|�t|j��}tt	|d��}||dkrXt
tj|��|dj
|�|j|�dS)Nz%s.addRichRule('%s'))r�r�)rr5rr6rrr?rErIr
rrrlrmrT)r!�ruler8rGr�r%r%r&�addRichRule s
zFirewallDConfigZone.addRichRulecCstt|t�}tjd|j|�|jj|�t|j��}tt	|d��}||dkrXt
tj|��|dj
|�|j|�dS)Nz%s.removeRichRule('%s'))r�r�)rr5rr6rrr?rErIr
rrrprWrT)r!r�r8rGr�r%r%r&�removeRichRule.s
z"FirewallDConfigZone.removeRichRulecCs8t|t�}tjd|j|�tt|d��}||j�dkS)Nz%s.queryRichRule('%s'))r�r�)rr5rr6rr
rI)r!r�r8r�r%r%r&�
queryRichRule<s
z!FirewallDConfigZone.queryRichRule)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)i�__name__�
__module__�__qualname__�__doc__Z
persistentrrZPK_ACTION_CONFIGZdefault_polkit_auth_requiredrrr
r'r(r1rZPROPERTIES_IFACEr9r>�slipZpolkitZrequire_authr@rn�signalrAZPK_ACTION_INFOZINTROSPECTABLE_IFACErBr rIrLrRrTrUrVrSrWrXrZrYr[r]r^r_rarcrdrerhrkrorqrsrur{r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r��
__classcell__r%r%)r$r&r5sf
		

	



	


	


	


	
	
	
			


r)(Z
gi.repositoryr�sys�modulesrZdbus.serviceZ	slip.dbusr�Zslip.dbus.serviceZfirewallrZfirewall.dbus_utilsrrrZfirewall.core.io.zonerZfirewall.core.fw_ifcfgrZfirewall.core.baser	Zfirewall.core.richr
Zfirewall.core.loggerrZfirewall.server.decoratorsrr
rrZfirewall.errorsrZfirewall.functionsrrrrrnZObjectrr%r%r%r&�<module>s$
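The port methods of this zone config object cannot be read from the bytecode, but the strings that survive in it (the "%s.addPort('%s', '%s')" log format, the ALREADY_ENABLED/NOT_ENABLED error codes, the added_ranges/removed_ranges symbols) point to the usual firewalld config-object pattern: convert the D-Bus arguments, run the parent's access check, edit one slot of the settings tuple, then call update(). A minimal sketch of that pattern follows; the settings index and the omission of the real method's port-range coalescing are assumptions, not a decompilation.

# Sketch only -- reconstructed from strings visible in the bytecode above.
# Requires firewalld's own Python package; the real class provides parent,
# getSettings(), update() and _log_prefix.
from firewall import errors
from firewall.errors import FirewallError
from firewall.dbus_utils import dbus_to_python
from firewall.core.logger import log

class ZonePortSketch:
    PORTS = 6   # assumed position of the port list in the zone settings tuple

    def addPort(self, port, protocol, sender=None):
        port = dbus_to_python(port, str)
        protocol = dbus_to_python(protocol, str)
        log.debug1("%s.addPort('%s', '%s')" % (self._log_prefix, port, protocol))
        self.parent.accessCheck(sender)
        settings = list(self.getSettings())
        if (port, protocol) in settings[self.PORTS]:
            raise FirewallError(errors.ALREADY_ENABLED, "%s:%s" % (port, protocol))
        settings[self.PORTS].append((port, protocol))
        self.update(settings)

    def removePort(self, port, protocol, sender=None):
        port = dbus_to_python(port, str)
        protocol = dbus_to_python(protocol, str)
        log.debug1("%s.removePort('%s', '%s')" % (self._log_prefix, port, protocol))
        self.parent.accessCheck(sender)
        settings = list(self.getSettings())
        if (port, protocol) not in settings[self.PORTS]:
            raise FirewallError(errors.NOT_ENABLED, "%s:%s" % (port, protocol))
        settings[self.PORTS].remove((port, protocol))
        self.update(settings)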
[Binary data — the remaining compiled modules in this archive cannot be rendered as text. The tar entries, and the classes and functions their embedded strings reveal, are:

site-packages/firewall/server/__pycache__/config_helper.cpython-36.opt-1.pyc
    FirewallDConfigHelper: D-Bus config object for a connection-tracking helper (version, short, description, family, module, ports; getSettings/update/loadDefaults/remove/rename and the standard property accessors).

site-packages/firewall/server/__pycache__/config_policy.cpython-36.opt-1.pyc
    FirewallDConfigPolicy: D-Bus config object for a policy (getSettings/update/loadDefaults/remove/rename plus property access).

site-packages/firewall/server/__pycache__/server.cpython-36.pyc
    run_server(): "Main function for firewall server. Handles D-Bus and GLib mainloop." Sets up the system bus, instantiates FirewallD, and wires SIGHUP to reload and SIGTERM to stop.

site-packages/firewall/server/__pycache__/config_service.cpython-36.pyc
    FirewallDConfigService: D-Bus config object for a service (ports, protocols, source ports, modules, destinations, includes).

site-packages/firewall/server/__pycache__/config_policy.cpython-36.pyc
    Non-optimized build of the FirewallDConfigPolicy module above.

site-packages/firewall/server/__pycache__/config_ipset.cpython-36.opt-1.pyc
    FirewallDConfigIPSet: D-Bus config object for an ipset (type, options, entries, with IPSET_WITH_TIMEOUT and entry-overlap checks).

site-packages/firewall/server/__pycache__/__init__.cpython-36.opt-1.pyc
    Empty package initializer.

site-packages/firewall/server/__pycache__/decorators.cpython-36.opt-1.pyc
    "This module contains decorators for use with and without D-Bus": FirewallDBusException, handle_exceptions, dbus_handle_exceptions, dbus_service_method.

site-packages/firewall/server/__pycache__/firewalld.cpython-36.pyc
    FirewallD: the main daemon D-Bus object; the remainder of this dump is its bytecode (see the notes and sketches below).]
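The decorators module listed above is small enough that its docstrings survive intact in the bytecode ("Decorator to handle exceptions and log them. Used if not conneced to D-Bus.", "Add sender argument for D-Bus"). A sketch of the source it was most likely compiled from is below; the module also defines a dbus_handle_exceptions decorator that translates FirewallError codes into D-Bus exceptions (omitted here), and the exact logging calls are assumptions.

# Sketch of firewall/server/decorators.py as suggested by the strings in
# decorators.cpython-36.opt-1.pyc above; details are assumptions.
import traceback

import dbus
import dbus.service
from decorator import decorator

from firewall import config
from firewall.errors import FirewallError
from firewall.core.logger import log

class FirewallDBusException(dbus.DBusException):
    """%s.Exception"""
    _dbus_error_name = "%s.Exception" % config.dbus.DBUS_INTERFACE

@decorator
def handle_exceptions(func, *args, **kwargs):
    """Decorator to handle exceptions and log them. Used if not connected to D-Bus."""
    try:
        return func(*args, **kwargs)
    except FirewallError as error:
        log.debug1(traceback.format_exc())
        log.error(error)
    except Exception:
        log.exception()

def dbus_service_method(*args, **kwargs):
    """Add sender argument for D-Bus."""
    kwargs.setdefault("sender_keyword", "sender")
    return dbus.service.method(*args, **kwargs)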

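server.cpython-36.pyc, listed above, is the piece that instantiates the FirewallD object summarized below. Its surviving strings (DBusGMainLoop, SystemBus, BusName, MainLoop, unix_signal_add, "Stopping..") suggest a run_server() along these lines; the signal-handler wiring is an assumption, and the real module also carries a fallback for GLib builds without unix_signal_add and an optional GC-debugging mode.

# Sketch of run_server() from server.py, based on the symbols visible in
# server.cpython-36.pyc; a sketch, not a decompilation.
import signal

from gi.repository import GLib
import dbus
import dbus.service
import dbus.mainloop.glib
import slip.dbus.service

from firewall import config
from firewall.core.logger import log
from firewall.server.firewalld import FirewallD

def run_server():
    """Main function for firewall server. Handles D-Bus and GLib mainloop."""
    service = None
    try:
        dbus.mainloop.glib.DBusGMainLoop(set_as_default=True)
        bus = dbus.SystemBus()
        name = dbus.service.BusName(config.dbus.DBUS_INTERFACE, bus=bus)
        service = FirewallD(name, config.dbus.DBUS_PATH)

        mainloop = GLib.MainLoop()
        slip.dbus.service.set_mainloop(mainloop)

        def _sighup(*args):
            service.reload()
            return True     # keep the signal source installed

        def _sigterm(*args):
            mainloop.quit()
            return True

        # SIGHUP reloads the firewall, SIGTERM stops the main loop.
        GLib.unix_signal_add(GLib.PRIORITY_HIGH, signal.SIGHUP, _sighup)
        GLib.unix_signal_add(GLib.PRIORITY_HIGH, signal.SIGTERM, _sigterm)
        mainloop.run()
    except KeyboardInterrupt:
        log.debug1("Stopping..")
    finally:
        if service is not None:
            service.stop()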
[Binary data — compiled bytecode of the FirewallD main daemon object (firewalld.cpython-36.pyc). The readable strings cover the class docstring "FirewallD main class"; D-Bus property handling (Get/GetAll/Set/PropertiesChanged/Introspect for version, interface_version, state, IPv4/IPv6/BRIDGE support, ICMP types, IPSet types and conntrack helper settings); accessCheck() against the lockdown whitelist (context, uid, user, command); reload/completeReload/Reloaded; checkPermanentConfig; the long runtimeToPermanent method that copies runtime services, icmptypes, ipsets, zones, policies, helpers, direct rules and the lockdown whitelist into permanent configuration; the lockdown enable/disable/query methods with their signals; and the lockdown whitelist command and uid accessors. The raw bytecode continues below and is omitted here.]
|�ntjd|�|jj/||�Wn:tk
�r�}ztj
d||f�d}WYdd}~XnX�qFW|jj0�}x�|jj1j2�D]�}|j3|�}yn||k�rN|jj4|�}|j	�|k�r>tjd|�|j
|�ntjd|�ntjd|�|jj5||�Wn:tk
�r�}ztj
d||f�d}WYdd}~XnX�q�W|jj6j7�|jj6j8�|jj6j9�f}y6|jj	�|k�r�tjd�|jj
|�n
tjd�Wn6tk
�r<}ztj
d|�d}WYdd}~XnX|jj:j;j<�}y6|jj	�|k�rvtjd�|jj=|�n
tjd�Wn6tk
�r�}ztj
d |�d}WYdd}~XnX|�r�t>t?j@��dS)!z-Make runtime configuration permanent
        zcopyRuntimeToPermanent()FzCopying service '%s' settingsz$Service '%s' is identical, ignoring.zCreating service '%s'z/Runtime To Permanent failed on service '%s': %sTNzCopying icmptype '%s' settingsz%IcmpType '%s' is identical, ignoring.zCreating icmptype '%s'z0Runtime To Permanent failed on icmptype '%s': %szCopying ipset '%s' settingsz"IPSet '%s' is identical, ignoring.zCreating ipset '%s'z-Runtime To Permanent failed on ipset '%s': %szEZone '%s': interface binding for '%s' has been added by NM, ignoring.zCopying zone '%s' settingszCreating zone '%s'z,Runtime To Permanent failed on zone '%s': %szCreating policy '%s'z.Runtime To Permanent failed on policy '%s': %szCopying helper '%s' settingsz#Helper '%s' is identical, ignoring.zCreating helper '%s'z.Runtime To Permanent failed on helper '%s': %szCopying direct configurationz,Direct configuration is identical, ignoring.z7Runtime To Permanent failed on direct configuration: %szCopying policies configurationz.Policies configuration is identical, ignoring.z9Runtime To Permanent failed on policies configuration: %s)Arr1rZgetServiceNamesr#�service�get_services�getServiceSettingsZgetServiceByNameZgetSettings�update�
addService�	Exception�warningZgetIcmpTypeNames�icmptype�
get_icmptypes�getIcmpTypeSettingsZgetIcmpTypeByNameZaddIcmpTypeZ
getIPSetNames�ipset�
get_ipsets�getIPSetSettingsZgetIPSetByNameZaddIPSetZgetZoneNamesrr<�	get_zones�getZoneSettings2r	�
getInterfacesZinterface_get_sender�removeInterfacerrZgetSettingsDictrZ
getZoneByNameZupdate2ZaddZone2ZgetPolicyNames�policy�"get_policies_not_derived_from_zone�getPolicySettingsZgetPolicyByNameZ	addPolicyZgetHelperNames�helper�get_helpers�getHelperSettingsZgetHelperByNameZ	addHelper�direct�get_all_chains�
get_all_rules�get_all_passthroughsr7�lockdown_whitelist�
export_configZsetLockdownWhitelistrrZRT_TO_PERM_FAILED)
r)r:r9Zconfig_names�nameZconfZconf_obj�eZnm_bus_name�settingsZchanged�	interfaceZ
connectionr-r-r.�runtimeToPermanentds$


























zFirewallD.runtimeToPermanentcCs,tjd�|j|�|jjj�|j�dS)z!Enable lockdown policies
        zpolicies.enableLockdown()N)rr1r;r#r7Zenable_lockdown�LockdownEnabled)r)r:r-r-r.�enableLockdown2s

zFirewallD.enableLockdowncCs,tjd�|j|�|jjj�|j�dS)z"Disable lockdown policies
        zpolicies.disableLockdown()N)rr1r;r#r7Zdisable_lockdown�LockdownDisabled)r)r:r-r-r.�disableLockdown>s

zFirewallD.disableLockdown�bcCstjd�|jjj�S)z+Retuns True if lockdown is enabled
        zpolicies.queryLockdown())rr1r#r7r8)r)r:r-r-r.�
queryLockdownJs
zFirewallD.queryLockdowncCstjd�dS)NzLockdownEnabled())rr1)r)r-r-r.r�UszFirewallD.LockdownEnabledcCstjd�dS)NzLockdownDisabled())rr1)r)r-r-r.r�ZszFirewallD.LockdownDisabledcCs@t|t�}tjd|�|j|�|jjjj|�|j	|�dS)zAdd lockdown command
        z*policies.addLockdownWhitelistCommand('%s')N)
rr^rr1r;r#r7r�Zadd_command�LockdownWhitelistCommandAdded)r)r6r:r-r-r.�addLockdownWhitelistCommandcs


z%FirewallD.addLockdownWhitelistCommandcCs@t|t�}tjd|�|j|�|jjjj|�|j	|�dS)z Remove lockdown command
        z-policies.removeLockdownWhitelistCommand('%s')N)
rr^rr1r;r#r7r�Zremove_command�LockdownWhitelistCommandRemoved)r)r6r:r-r-r.�removeLockdownWhitelistCommandps


z(FirewallD.removeLockdownWhitelistCommandcCs(t|t�}tjd|�|jjjj|�S)zQuery lockdown command
        z,policies.queryLockdownWhitelistCommand('%s'))rr^rr1r#r7r�Zhas_command)r)r6r:r-r-r.�queryLockdownWhitelistCommand}s
z'FirewallD.queryLockdownWhitelistCommand�ascCstjd�|jjjj�S)zAdd lockdown command
        z'policies.getLockdownWhitelistCommands())rr1r#r7r�Zget_commands)r)r:r-r-r.�getLockdownWhitelistCommands�s
z&FirewallD.getLockdownWhitelistCommandscCstjd|�dS)Nz#LockdownWhitelistCommandAdded('%s'))rr1)r)r6r-r-r.r��sz'FirewallD.LockdownWhitelistCommandAddedcCstjd|�dS)Nz%LockdownWhitelistCommandRemoved('%s'))rr1)r)r6r-r-r.r��sz)FirewallD.LockdownWhitelistCommandRemoved�icCs@t|t�}tjd|�|j|�|jjjj|�|j	|�dS)zAdd lockdown uid
        z&policies.addLockdownWhitelistUid('%s')N)
r�intrr1r;r#r7r�Zadd_uid�LockdownWhitelistUidAdded)r)r4r:r-r-r.�addLockdownWhitelistUid�s


z!FirewallD.addLockdownWhitelistUidcCs@t|t�}tjd|�|j|�|jjjj|�|j	|�dS)zRemove lockdown uid
        z)policies.removeLockdownWhitelistUid('%s')N)
rr�rr1r;r#r7r�Z
remove_uid�LockdownWhitelistUidRemoved)r)r4r:r-r-r.�removeLockdownWhitelistUid�s


z$FirewallD.removeLockdownWhitelistUidcCs(t|t�}tjd|�|jjjj|�S)zQuery lockdown uid
        z(policies.queryLockdownWhitelistUid('%s'))rr�rr1r#r7r�Zhas_uid)r)r4r:r-r-r.�queryLockdownWhitelistUid�s
z#FirewallD.queryLockdownWhitelistUidZaicCstjd�|jjjj�S)zAdd lockdown uid
        z#policies.getLockdownWhitelistUids())rr1r#r7r�Zget_uids)r)r:r-r-r.�getLockdownWhitelistUids�s
z"FirewallD.getLockdownWhitelistUidscCstjd|�dS)NzLockdownWhitelistUidAdded(%d))rr1)r)r4r-r-r.r��sz#FirewallD.LockdownWhitelistUidAddedcCstjd|�dS)NzLockdownWhitelistUidRemoved(%d))rr1)r)r4r-r-r.r��sz%FirewallD.LockdownWhitelistUidRemovedcCs@t|t�}tjd|�|j|�|jjjj|�|j	|�dS)zAdd lockdown user
        z'policies.addLockdownWhitelistUser('%s')N)
rr^rr1r;r#r7r�Zadd_user�LockdownWhitelistUserAdded)r)r5r:r-r-r.�addLockdownWhitelistUser�s


z"FirewallD.addLockdownWhitelistUsercCs@t|t�}tjd|�|j|�|jjjj|�|j	|�dS)zRemove lockdown user
        z*policies.removeLockdownWhitelistUser('%s')N)
rr^rr1r;r#r7r�Zremove_user�LockdownWhitelistUserRemoved)r)r5r:r-r-r.�removeLockdownWhitelistUser�s


z%FirewallD.removeLockdownWhitelistUsercCs(t|t�}tjd|�|jjjj|�S)zQuery lockdown user
        z)policies.queryLockdownWhitelistUser('%s'))rr^rr1r#r7r�Zhas_user)r)r5r:r-r-r.�queryLockdownWhitelistUser�s
z$FirewallD.queryLockdownWhitelistUsercCstjd�|jjjj�S)zAdd lockdown user
        z$policies.getLockdownWhitelistUsers())rr1r#r7r�Z	get_users)r)r:r-r-r.�getLockdownWhitelistUserss
z#FirewallD.getLockdownWhitelistUserscCstjd|�dS)Nz LockdownWhitelistUserAdded('%s'))rr1)r)r5r-r-r.r�sz$FirewallD.LockdownWhitelistUserAddedcCstjd|�dS)Nz"LockdownWhitelistUserRemoved('%s'))rr1)r)r5r-r-r.r�sz&FirewallD.LockdownWhitelistUserRemovedcCs@t|t�}tjd|�|j|�|jjjj|�|j	|�dS)zAdd lockdown context
        z*policies.addLockdownWhitelistContext('%s')N)
rr^rr1r;r#r7r�Zadd_context�LockdownWhitelistContextAdded)r)r3r:r-r-r.�addLockdownWhitelistContexts


z%FirewallD.addLockdownWhitelistContextcCs@t|t�}tjd|�|j|�|jjjj|�|j	|�dS)z Remove lockdown context
        z-policies.removeLockdownWhitelistContext('%s')N)
rr^rr1r;r#r7r�Zremove_context�LockdownWhitelistContextRemoved)r)r3r:r-r-r.�removeLockdownWhitelistContext's


z(FirewallD.removeLockdownWhitelistContextcCs(t|t�}tjd|�|jjjj|�S)zQuery lockdown context
        z,policies.queryLockdownWhitelistContext('%s'))rr^rr1r#r7r�Zhas_context)r)r3r:r-r-r.�queryLockdownWhitelistContext4s
z'FirewallD.queryLockdownWhitelistContextcCstjd�|jjjj�S)zAdd lockdown context
        z'policies.getLockdownWhitelistContexts())rr1r#r7r�Zget_contexts)r)r:r-r-r.�getLockdownWhitelistContexts@s
z&FirewallD.getLockdownWhitelistContextscCstjd|�dS)Nz#LockdownWhitelistContextAdded('%s'))rr1)r)r3r-r-r.r�Ksz'FirewallD.LockdownWhitelistContextAddedcCstjd|�dS)Nz%LockdownWhitelistContextRemoved('%s'))rr1)r)r3r-r-r.r�Psz)FirewallD.LockdownWhitelistContextRemovedcCs*tjd�|j|�|jj�|j�dS)znEnable panic mode.
        
        All ingoing and outgoing connections and packets will be blocked.
        zenablePanicMode()N)rr1r;r#Zenable_panic_mode�PanicModeEnabled)r)r:r-r-r.�enablePanicModeYs	


zFirewallD.enablePanicModecCs*tjd�|j|�|jj�|j�dS)z�Disable panic mode.

        Enables normal mode: Allowed ingoing and outgoing connections 
        will not be blocked anymore
        zdisablePanicMode()N)rr1r;r#Zdisable_panic_mode�PanicModeDisabled)r)r:r-r-r.�disablePanicModegs



zFirewallD.disablePanicModecCstjd�|jj�S)NzqueryPanicMode())rr1r#Zquery_panic_mode)r)r:r-r-r.�queryPanicModevs
zFirewallD.queryPanicModecCstjd�dS)NzPanicModeEnabled())rr1)r)r-r-r.r�szFirewallD.PanicModeEnabledcCstjd�dS)NzPanicModeDisabled())rr1)r)r-r-r.r��szFirewallD.PanicModeDisabledz&(sssbsasa(ss)asba(ssss)asasasasa(ss)b)cCs$t|t�}tjd|�|jjj|�S)NzgetZoneSettings(%s))rr^rr1r#r<Zget_config_with_settings)r)r<r:r-r-r.�getZoneSettings�s
zFirewallD.getZoneSettingscCs$t|t�}tjd|�|jjj|�S)NzgetZoneSettings2(%s))rr^rr1r#r<�get_config_with_settings_dict)r)r<r:r-r-r.r��s
zFirewallD.getZoneSettings2zsa{sv}cCsBt|t�}tjd|�|j|�|jjj|||�|j||�dS)NzsetZoneSettings2(%s))	rr^rr1r;r#r<�set_config_with_settings_dict�ZoneUpdated)r)r<r�r:r-r-r.�setZoneSettings2�s


zFirewallD.setZoneSettings2cCstjd||f�dS)Nzzone.ZoneUpdated('%s', '%s'))rr1)r)r<r�r-r-r.r��szFirewallD.ZoneUpdatedcCs$t|t�}tjd|�|jjj|�S)Nzpolicy.getPolicySettings(%s))rr^rr1r#r�r�)r)r�r:r-r-r.r��s
zFirewallD.getPolicySettingscCsBt|t�}tjd|�|j|�|jjj|||�|j||�dS)Nzpolicy.setPolicySettings(%s))	rr^rr1r;r#r�r��
PolicyUpdated)r)r�r�r:r-r-r.�setPolicySettings�s


zFirewallD.setPolicySettingscCstjd||f�dS)Nz policy.PolicyUpdated('%s', '%s'))rr1)r)r�r�r-r-r.r��szFirewallD.PolicyUpdatedcCstjd�|jjj�S)NzlistServices())rr1r#rrrs)r)r:r-r-r.�listServices�s
zFirewallD.listServicesz(sssa(ss)asa{ss}asa(ss))cCs�t|t�}tjd|�|jjj|�}|j�}g}x\td�D]P}|j	|d|krr|j
tjt
||j	|d���q:|j
||j	|d�q:Wt|�S)NzgetServiceSettings(%s)�r)rr^rr1r#rr�get_service�export_config_dict�rangeZIMPORT_EXPORT_STRUCTURE�append�copy�deepcopy�getattr�tuple)r)rrr:�objZ	conf_dictZ	conf_listr�r-r-r.rt�s
"zFirewallD.getServiceSettingscCs,t|t�}tjd|�|jjj|�}|j�S)NzgetServiceSettings2(%s))rr^rr1r#rrr�r�)r)rrr:r�r-r-r.�getServiceSettings2�s
zFirewallD.getServiceSettings2cCstjd�|jjj�S)NzlistIcmpTypes())rr1r#ryrz)r)r:r-r-r.�
listIcmpTypes�s
zFirewallD.listIcmpTypescCs(t|t�}tjd|�|jjj|�j�S)NzgetIcmpTypeSettings(%s))rr^rr1r#ryZget_icmptyper�)r)ryr:r-r-r.r{�s
zFirewallD.getIcmpTypeSettingscCstjd�|jj�S)NzgetLogDenied())rr1r#Zget_log_denied)r)r:r-r-r.�getLogDenied	s
zFirewallD.getLogDeniedcCsXt|t�}tjd|�|j|�|jj|�|j|�|jj�|j	j�|j
�dS)NzsetLogDenied('%s'))rr^rr1r;r#Zset_log_denied�LogDeniedChangedrnrro)r)�valuer:r-r-r.�setLogDenieds




zFirewallD.setLogDeniedcCstjd|�dS)NzLogDeniedChanged('%s'))rr1)r)r�r-r-r.r�"szFirewallD.LogDeniedChangedcCstjd�dS)NzgetAutomaticHelpers()�no)rr1)r)r:r-r-r.�getAutomaticHelpers+s
zFirewallD.getAutomaticHelperscCs&t|t�}tjd|�|j|�dS)NzsetAutomaticHelpers('%s'))rr^rr1r;)r)r�r:r-r-r.�setAutomaticHelpers6s
zFirewallD.setAutomaticHelperscCstjd|�dS)NzAutomaticHelpersChanged('%s'))rr1)r)r�r-r-r.�AutomaticHelpersChangedBsz!FirewallD.AutomaticHelpersChangedcCstjd�|jj�S)NzgetDefaultZone())rr1r#Zget_default_zone)r)r:r-r-r.�getDefaultZoneKs
zFirewallD.getDefaultZonecCs<t|t�}tjd|�|j|�|jj|�|j|�dS)NzsetDefaultZone('%s'))rr^rr1r;r#Zset_default_zone�DefaultZoneChanged)r)r<r:r-r-r.�setDefaultZoneTs


zFirewallD.setDefaultZonecCstjd|�dS)NzDefaultZoneChanged('%s'))rr1)r)r<r-r-r.r�`szFirewallD.DefaultZoneChangedcCstjd�|jjj�S)Nzpolicy.getPolicies())rr1r#r�r�)r)r:r-r-r.�getPoliciesks
zFirewallD.getPoliciesz
a{sa{sas}}cCs\tjd�i}xH|jjj�D]8}i||<|jjj|�||d<|jjj|�||d<qW|S)Nzpolicy.getActivePolicies()Z
ingress_zonesZegress_zones)rr1r#r�Z)get_active_policies_not_derived_from_zoneZlist_ingress_zonesZlist_egress_zones)r)r:r7r�r-r-r.�getActivePoliciesss
zFirewallD.getActivePoliciescCstjd�|jjj�S)Nzzone.getZones())rr1r#r<r)r)r:r-r-r.�getZones�s
zFirewallD.getZonescCs�tjd�i}x||jjj�D]l}|jjj|�}|jjj|�}t|�t|�dkri||<t|�dkrp|||d<t|�dkr|||d<qW|S)Nzzone.getActiveZones()r�
interfaces�sources)rr1r#r<r�list_interfaces�list_sources�len)r)r:Zzonesr<r�r�r-r-r.�getActiveZones�s
zFirewallD.getActiveZonescCs2t|t�}tjd|�|jjj|�}|r.|SdS)z�Return the zone an interface belongs to.

        :Parameters:
            `interface` : str
                Name of the interface
        :Returns: str. The name of the zone.
        zzone.getZoneOfInterface('%s')rm)rr^rr1r#r<Zget_zone_of_interface)r)r�r:r<r-r-r.�getZoneOfInterface�s
zFirewallD.getZoneOfInterfacecCs2t|t�}tjd|�|jjj|�}|r.|SdS)Nzzone.getZoneOfSource('%s')rm)rr^rr1r#r<Zget_zone_of_source)r)�sourcer:r<r-r-r.�getZoneOfSource�s
zFirewallD.getZoneOfSourcecCsdS)NFr-)r)r<r:r-r-r.�isImmutable�szFirewallD.isImmutablecCsRt|t�}t|t�}tjd||f�|j|�|jjj|||�}|j||�|S)zPAdd an interface to a zone.
        If zone is empty, use default zone.
        zzone.addInterface('%s', '%s'))	rr^rr1r;r#r<Z
add_interface�InterfaceAdded)r)r<r�r:�_zoner-r-r.�addInterface�s


zFirewallD.addInterfacecCs"t|t�}t|t�}|j|||�S)z�Change a zone an interface is part of.
        If zone is empty, use default zone.

        This function is deprecated, use changeZoneOfInterface instead
        )rr^�changeZoneOfInterface)r)r<r�r:r-r-r.�
changeZone�s


zFirewallD.changeZonecCsRt|t�}t|t�}tjd||f�|j|�|jjj|||�}|j||�|S)z[Change a zone an interface is part of.
        If zone is empty, use default zone.
        z&zone.changeZoneOfInterface('%s', '%s'))	rr^rr1r;r#r<Zchange_zone_of_interface�ZoneOfInterfaceChanged)r)r<r�r:r�r-r-r.r��s


zFirewallD.changeZoneOfInterfacecCsPt|t�}t|t�}tjd||f�|j|�|jjj||�}|j||�|S)zkRemove interface from a zone.
        If zone is empty, remove from zone the interface belongs to.
        z zone.removeInterface('%s', '%s'))	rr^rr1r;r#r<Zremove_interface�InterfaceRemoved)r)r<r�r:r�r-r-r.r��s


zFirewallD.removeInterfacecCs6t|t�}t|t�}tjd||f�|jjj||�S)z^Return true if an interface is in a zone.
        If zone is empty, use default zone.
        zzone.queryInterface('%s', '%s'))rr^rr1r#r<Zquery_interface)r)r<r�r:r-r-r.�queryInterfaces

zFirewallD.queryInterfacecCs&t|t�}tjd|�|jjj|�S)z]Return the list of interfaces of a zone.
        If zone is empty, use default zone.
        zzone.getInterfaces('%s'))rr^rr1r#r<r�)r)r<r:r-r-r.r�s

zFirewallD.getInterfacescCstjd||f�dS)Nzzone.InterfaceAdded('%s', '%s'))rr1)r)r<r�r-r-r.r�+szFirewallD.InterfaceAddedcCstjd||f�dS)z,
        This signal is deprecated.
        zzone.ZoneChanged('%s', '%s')N)rr1)r)r<r�r-r-r.�ZoneChanged0szFirewallD.ZoneChangedcCs"tjd||f�|j||�dS)Nz'zone.ZoneOfInterfaceChanged('%s', '%s'))rr1r�)r)r<r�r-r-r.r�8s
z FirewallD.ZoneOfInterfaceChangedcCstjd||f�dS)Nz!zone.InterfaceRemoved('%s', '%s'))rr1)r)r<r�r-r-r.r�?szFirewallD.InterfaceRemovedcCsRt|t�}t|t�}tjd||f�|j|�|jjj|||�}|j||�|S)zLAdd a source to a zone.
        If zone is empty, use default zone.
        zzone.addSource('%s', '%s'))	rr^rr1r;r#r<Z
add_source�SourceAdded)r)r<r�r:r�r-r-r.�	addSourceHs


zFirewallD.addSourcecCsRt|t�}t|t�}tjd||f�|j|�|jjj|||�}|j||�|S)zXChange a zone an source is part of.
        If zone is empty, use default zone.
        z#zone.changeZoneOfSource('%s', '%s'))	rr^rr1r;r#r<Zchange_zone_of_source�ZoneOfSourceChanged)r)r<r�r:r�r-r-r.�changeZoneOfSourceYs


zFirewallD.changeZoneOfSourcecCsPt|t�}t|t�}tjd||f�|j|�|jjj||�}|j||�|S)zeRemove source from a zone.
        If zone is empty, remove from zone the source belongs to.
        zzone.removeSource('%s', '%s'))	rr^rr1r;r#r<Z
remove_source�
SourceRemoved)r)r<r�r:r�r-r-r.�removeSourcejs


zFirewallD.removeSourcecCs6t|t�}t|t�}tjd||f�|jjj||�S)z[Return true if an source is in a zone.
        If zone is empty, use default zone.
        zzone.querySource('%s', '%s'))rr^rr1r#r<Zquery_source)r)r<r�r:r-r-r.�querySource{s

zFirewallD.querySourcecCs&t|t�}tjd|�|jjj|�S)zZReturn the list of sources of a zone.
        If zone is empty, use default zone.
        zzone.getSources('%s'))rr^rr1r#r<r�)r)r<r:r-r-r.�
getSources�s

zFirewallD.getSourcescCstjd||f�dS)Nzzone.SourceAdded('%s', '%s'))rr1)r)r<r�r-r-r.r��szFirewallD.SourceAddedcCstjd||f�dS)Nz$zone.ZoneOfSourceChanged('%s', '%s'))rr1)r)r<r�r-r-r.r��szFirewallD.ZoneOfSourceChangedcCstjd||f�dS)Nzzone.SourceRemoved('%s', '%s'))rr1)r)r<r�r-r-r.r��szFirewallD.SourceRemovedcCsHtjd||f�|j||=t|d�}|jjj||�|j||�dS)Nz%zone.disableTimedRichRule('%s', '%s'))�rule_str)rr1r2rr#r<�remove_rule�RichRuleRemoved)r)r<�ruler�r-r-r.�disableTimedRichRule�s

zFirewallD.disableTimedRichRuleZssicCs�t|t�}t|t�}t|t�}tjd||f�t|d�}|jjj|||�}|dkrtt	j
||j||�}|j|||�|j
|||�|S)Nzzone.addRichRule('%s', '%s'))r�r)rr^r�rr1rr#r<�add_ruler�timeout_add_secondsr�r?�
RichRuleAdded)r)r<r��timeoutr:r�r�r>r-r-r.�addRichRule�s




zFirewallD.addRichRulecCs\t|t�}t|t�}tjd||f�t|d�}|jjj||�}|j||�|j	||�|S)Nzzone.removeRichRule('%s', '%s'))r�)
rr^rr1rr#r<r�rAr�)r)r<r�r:r�r�r-r-r.�removeRichRule�s


zFirewallD.removeRichRulecCs@t|t�}t|t�}tjd||f�t|d�}|jjj||�S)Nzzone.queryRichRule('%s', '%s'))r�)rr^rr1rr#r<�
query_rule)r)r<r�r:r�r-r-r.�
queryRichRule�s



zFirewallD.queryRichRulecCs&t|t�}tjd|�|jjj|�S)Nzzone.getRichRules('%s'))rr^rr1r#r<Z
list_rules)r)r<r:r-r-r.�getRichRules�s
zFirewallD.getRichRulescCstjd|||f�dS)Nz"zone.RichRuleAdded('%s', '%s', %d))rr1)r)r<r�r�r-r-r.r��szFirewallD.RichRuleAddedcCstjd||f�dS)Nz zone.RichRuleRemoved('%s', '%s'))rr1)r)r<r�r-r-r.r��szFirewallD.RichRuleRemovedcCs>tjd||f�|j||=|jjj||�|j||�dS)Nz$zone.disableTimedService('%s', '%s'))rr1r2r#r<�remove_service�ServiceRemoved)r)r<rrr-r-r.�disableTimedService�szFirewallD.disableTimedServicecCs�t|t�}t|t�}t|t�}tjd|||f�|j|�|jjj||||�}|dkrxt	j
||j||�}|j|||�|j
|||�|S)Nzzone.addService('%s', '%s', %d)r)rr^r�rr1r;r#r<Zadd_servicerr�rr?�ServiceAdded)r)r<rrr�r:r�r>r-r-r.rv�s




zFirewallD.addServicecCs\t|t�}t|t�}tjd||f�|j|�|jjj||�}|j||�|j	||�|S)Nzzone.removeService('%s', '%s'))
rr^rr1r;r#r<rrAr)r)r<rrr:r�r-r-r.�
removeServices


zFirewallD.removeServicecCs6t|t�}t|t�}tjd||f�|jjj||�S)Nzzone.queryService('%s', '%s'))rr^rr1r#r<Z
query_service)r)r<rrr:r-r-r.�queryService&s

zFirewallD.queryServicecCs&t|t�}tjd|�|jjj|�S)Nzzone.getServices('%s'))rr^rr1r#r<Z
list_services)r)r<r:r-r-r.�getServices1s
zFirewallD.getServicescCstjd|||f�dS)Nz!zone.ServiceAdded('%s', '%s', %d))rr1)r)r<rrr�r-r-r.r=szFirewallD.ServiceAddedcCstjd||f�dS)Nzzone.ServiceRemoved('%s', '%s'))rr1)r)r<rrr-r-r.rCszFirewallD.ServiceRemovedcCsHtjd|||f�|j|||f=|jjj|||�|j|||�dS)Nz'zone.disableTimedPort('%s', '%s', '%s'))rr1r2r#r<�remove_port�PortRemoved)r)r<�port�protocolr-r-r.�disableTimedPortLs
zFirewallD.disableTimedPortZsssicCs�t|t�}t|t�}t|t�}t|t�}tjd|||f�|j|�|jjj|||||�}|dkr�t	j
||j|||�}|j|||f|�|j
||||�|S)Nzzone.addPort('%s', '%s', '%s')r)rr^r�rr1r;r#r<Zadd_portrr�rr?�	PortAdded)r)r<rrr�r:r�r>r-r-r.�addPortTs






zFirewallD.addPortZssscCspt|t�}t|t�}t|t�}tjd|||f�|j|�|jjj|||�}|j|||f�|j	|||�|S)Nz!zone.removePort('%s', '%s', '%s'))
rr^rr1r;r#r<rrAr
)r)r<rrr:r�r-r-r.�
removePortks



zFirewallD.removePortcCsDt|t�}t|t�}t|t�}tjd|||f�|jjj|||�S)Nz zone.queryPort('%s', '%s', '%s'))rr^rr1r#r<Z
query_port)r)r<rrr:r-r-r.�	queryPort}s



zFirewallD.queryPortZaascCs&t|t�}tjd|�|jjj|�S)Nzzone.getPorts('%s'))rr^rr1r#r<Z
list_ports)r)r<r:r-r-r.�getPorts�s
zFirewallD.getPortsrcCstjd||||f�dS)Nz$zone.PortAdded('%s', '%s', '%s', %d))rr1)r)r<rrr�r-r-r.r�szFirewallD.PortAddedcCstjd|||f�dS)Nz"zone.PortRemoved('%s', '%s', '%s'))rr1)r)r<rrr-r-r.r
�szFirewallD.PortRemovedcCs>tjd||f�|j||=|jjj||�|j||�dS)Nz%zone.disableTimedProtocol('%s', '%s'))rr1r2r#r<�remove_protocol�ProtocolRemoved)r)r<rr-r-r.�disableTimedProtocol�szFirewallD.disableTimedProtocolcCs�t|t�}t|t�}t|t�}tjd||f�|j|�|jjj||||�}|dkrvt	j
||j||�}|j|||�|j
|||�|S)Nzzone.enableProtocol('%s', '%s')r)rr^r�rr1r;r#r<Zadd_protocolrr�rr?�
ProtocolAdded)r)r<rr�r:r�r>r-r-r.�addProtocol�s




zFirewallD.addProtocolcCs\t|t�}t|t�}tjd||f�|j|�|jjj||�}|j||�|j	||�|S)Nzzone.removeProtocol('%s', '%s'))
rr^rr1r;r#r<rrAr)r)r<rr:r�r-r-r.�removeProtocol�s


zFirewallD.removeProtocolcCs6t|t�}t|t�}tjd||f�|jjj||�S)Nzzone.queryProtocol('%s', '%s'))rr^rr1r#r<Zquery_protocol)r)r<rr:r-r-r.�
queryProtocol�s

zFirewallD.queryProtocolcCs&t|t�}tjd|�|jjj|�S)Nzzone.getProtocols('%s'))rr^rr1r#r<Zlist_protocols)r)r<r:r-r-r.�getProtocols�s
zFirewallD.getProtocolscCstjd|||f�dS)Nz"zone.ProtocolAdded('%s', '%s', %d))rr1)r)r<rr�r-r-r.r�szFirewallD.ProtocolAddedcCstjd||f�dS)Nz zone.ProtocolRemoved('%s', '%s'))rr1)r)r<rr-r-r.r�szFirewallD.ProtocolRemovedcCsJtjd|||f�|j|d||f=|jjj|||�|j|||�dS)Nz-zone.disableTimedSourcePort('%s', '%s', '%s')�sport)rr1r2r#r<�remove_source_port�SourcePortRemoved)r)r<rrr-r-r.�disableTimedSourcePort�s
z FirewallD.disableTimedSourcePortcCs�t|t�}t|t�}t|t�}t|t�}tjd|||f�|j|�|jjj|||||�}|dkr�t	j
||j|||�}|j|d||f|�|j
||||�|S)Nz$zone.addSourcePort('%s', '%s', '%s')rr)rr^r�rr1r;r#r<Zadd_source_portrr�r!r?�SourcePortAdded)r)r<rrr�r:r�r>r-r-r.�
addSourcePort�s








zFirewallD.addSourcePortcCsrt|t�}t|t�}t|t�}tjd|||f�|j|�|jjj|||�}|j|d||f�|j	|||�|S)Nz'zone.removeSourcePort('%s', '%s', '%s')r)
rr^rr1r;r#r<rrAr )r)r<rrr:r�r-r-r.�removeSourcePorts





zFirewallD.removeSourcePortcCsDt|t�}t|t�}t|t�}tjd|||f�|jjj|||�S)Nz&zone.querySourcePort('%s', '%s', '%s'))rr^rr1r#r<Zquery_source_port)r)r<rrr:r-r-r.�querySourcePort)s




zFirewallD.querySourcePortcCs&t|t�}tjd|�|jjj|�S)Nzzone.getSourcePorts('%s'))rr^rr1r#r<Zlist_source_ports)r)r<r:r-r-r.�getSourcePorts6s
zFirewallD.getSourcePortscCstjd||||f�dS)Nz*zone.SourcePortAdded('%s', '%s', '%s', %d))rr1)r)r<rrr�r-r-r.r"BszFirewallD.SourcePortAddedcCstjd|||f�dS)Nz(zone.SourcePortRemoved('%s', '%s', '%s'))rr1)r)r<rrr-r-r.r Hs
zFirewallD.SourcePortRemovedcCs(|j|d=|jjj|�|j|�dS)N�
masquerade)r2r#r<�remove_masquerade�MasqueradeRemoved)r)r<r-r-r.�disableTimedMasqueradeRsz FirewallD.disableTimedMasqueradeZsicCstt|t�}t|t�}tjd|�|j|�|jjj|||�}|dkrdt	j
||j|�}|j|d|�|j
||�|S)Nzzone.addMasquerade('%s')rr')rr^r�rr1r;r#r<Zadd_masqueraderr�r*r?�MasqueradeAdded)r)r<r�r:r�r>r-r-r.�
addMasqueradeXs



zFirewallD.addMasqueradecCsJt|t�}tjd|�|j|�|jjj|�}|j|d�|j	|�|S)Nzzone.removeMasquerade('%s')r')
rr^rr1r;r#r<r(rAr))r)r<r:r�r-r-r.�removeMasqueradels


zFirewallD.removeMasqueradecCs&t|t�}tjd|�|jjj|�S)Nzzone.queryMasquerade('%s'))rr^rr1r#r<Zquery_masquerade)r)r<r:r-r-r.�queryMasquerade{s
zFirewallD.queryMasqueradecCstjd||f�dS)Nzzone.MasqueradeAdded('%s', %d))rr1)r)r<r�r-r-r.r+�szFirewallD.MasqueradeAddedcCstjd|�dS)Nzzone.MasqueradeRemoved('%s'))rr1)r)r<r-r-r.r)�szFirewallD.MasqueradeRemovedcCs@|j|||||f=|jjj|||||�|j|||||�dS)N)r2r#r<�remove_forward_port�ForwardPortRemoved)r)r<rr�toport�toaddrr-r-r.�disable_forward_port�szFirewallD.disable_forward_portZsssssic
Cs�t|t�}t|t�}t|t�}t|t�}t|t�}t|t�}tjd|||||f�|j|�|jjj|||||||�}|dkr�t	j
||j|||||�}	|j|||||f|	�|j
||||||�|S)Nz1zone.addForwardPort('%s', '%s', '%s', '%s', '%s')r)rr^r�rr1r;r#r<Zadd_forward_portrr�r3r?�ForwardPortAdded)
r)r<rrr1r2r�r:r�r>r-r-r.�addForwardPort�s&







zFirewallD.addForwardPortZssssscCs�t|t�}t|t�}t|t�}t|t�}t|t�}tjd|||||f�|j|�|jjj|||||�}|j|||||f�|j	|||||�|S)Nz4zone.removeForwardPort('%s', '%s', '%s', '%s', '%s'))
rr^rr1r;r#r<r/rAr0)r)r<rrr1r2r:r�r-r-r.�removeForwardPort�s





zFirewallD.removeForwardPortcCs`t|t�}t|t�}t|t�}t|t�}t|t�}tjd|||||f�|jjj|||||�S)Nz3zone.queryForwardPort('%s', '%s', '%s', '%s', '%s'))rr^rr1r#r<Zquery_forward_port)r)r<rrr1r2r:r-r-r.�queryForwardPort�s




zFirewallD.queryForwardPortcCs&t|t�}tjd|�|jjj|�S)Nzzone.getForwardPorts('%s'))rr^rr1r#r<Zlist_forward_ports)r)r<r:r-r-r.�getForwardPorts�s
zFirewallD.getForwardPortscCstjd||||||f�dS)Nz7zone.ForwardPortAdded('%s', '%s', '%s', '%s', '%s', %d))rr1)r)r<rrr1r2r�r-r-r.r4�szFirewallD.ForwardPortAddedcCstjd|||||f�dS)Nz5zone.ForwardPortRemoved('%s', '%s', '%s', '%s', '%s'))rr1)r)r<rrr1r2r-r-r.r0�szFirewallD.ForwardPortRemovedcCs>tjd||f�|j||=|jjj||�|j||�dS)Nz&zone.disableTimedIcmpBlock('%s', '%s'))rr1r2r#r<�remove_icmp_block�IcmpBlockRemoved)r)r<�icmpr:r-r-r.�disableTimedIcmpBlock�szFirewallD.disableTimedIcmpBlockcCs�t|t�}t|t�}t|t�}tjd||f�|j|�|jjj||||�}|dkrxt	j
||j|||�}|j|||�|j
|||�|S)Nz zone.enableIcmpBlock('%s', '%s')r)rr^r�rr1r;r#r<Zadd_icmp_blockrr�r<r?�IcmpBlockAdded)r)r<r;r�r:r�r>r-r-r.�addIcmpBlocks





zFirewallD.addIcmpBlockcCs\t|t�}t|t�}tjd||f�|j|�|jjj||�}|j||�|j	||�|S)Nz zone.removeIcmpBlock('%s', '%s'))
rr^rr1r;r#r<r9rAr:)r)r<r;r:r�r-r-r.�removeIcmpBlocks


zFirewallD.removeIcmpBlockcCs6t|t�}t|t�}tjd||f�|jjj||�S)Nzzone.queryIcmpBlock('%s', '%s'))rr^rr1r#r<Zquery_icmp_block)r)r<r;r:r-r-r.�queryIcmpBlock&s

zFirewallD.queryIcmpBlockcCs&t|t�}tjd|�|jjj|�S)Nzzone.getIcmpBlocks('%s'))rr^rr1r#r<Zlist_icmp_blocks)r)r<r:r-r-r.�
getIcmpBlocks1s
zFirewallD.getIcmpBlockscCstjd|||f�dS)Nz#zone.IcmpBlockAdded('%s', '%s', %d))rr1)r)r<r;r�r-r-r.r==szFirewallD.IcmpBlockAddedcCstjd||f�dS)Nz!zone.IcmpBlockRemoved('%s', '%s'))rr1)r)r<r;r-r-r.r:CszFirewallD.IcmpBlockRemovedcCs@t|t�}tjd|�|j|�|jjj||�}|j|�|S)Nz zone.addIcmpBlockInversion('%s'))	rr^rr1r;r#r<Zadd_icmp_block_inversion�IcmpBlockInversionAdded)r)r<r:r�r-r-r.�addIcmpBlockInversionLs


zFirewallD.addIcmpBlockInversioncCs>t|t�}tjd|�|j|�|jjj|�}|j|�|S)Nz#zone.removeIcmpBlockInversion('%s'))	rr^rr1r;r#r<Zremove_icmp_block_inversion�IcmpBlockInversionRemoved)r)r<r:r�r-r-r.�removeIcmpBlockInversionZs


z"FirewallD.removeIcmpBlockInversioncCs&t|t�}tjd|�|jjj|�S)Nz"zone.queryIcmpBlockInversion('%s'))rr^rr1r#r<Zquery_icmp_block_inversion)r)r<r:r-r-r.�queryIcmpBlockInversionhs
z!FirewallD.queryIcmpBlockInversioncCstjd|�dS)Nz"zone.IcmpBlockInversionAdded('%s'))rr1)r)r<r-r-r.rBrsz!FirewallD.IcmpBlockInversionAddedcCstjd|�dS)Nz$zone.IcmpBlockInversionRemoved('%s'))rr1)r)r<r-r-r.rDwsz#FirewallD.IcmpBlockInversionRemovedcCs`t|t�}t|t�}t|t�}tjd|||f�|j|�|jjj|||�|j|||�dS)Nz!direct.addChain('%s', '%s', '%s'))	rr^rr1r;r#r�Z	add_chain�
ChainAdded)r)�ipv�table�chainr:r-r-r.�addChain�s



zFirewallD.addChaincCs`t|t�}t|t�}t|t�}tjd|||f�|j|�|jjj|||�|j|||�dS)Nz$direct.removeChain('%s', '%s', '%s'))	rr^rr1r;r#r�Zremove_chain�ChainRemoved)r)rHrIrJr:r-r-r.�removeChain�s



zFirewallD.removeChaincCsDt|t�}t|t�}t|t�}tjd|||f�|jjj|||�S)Nz#direct.queryChain('%s', '%s', '%s'))rr^rr1r#r�Zquery_chain)r)rHrIrJr:r-r-r.�
queryChain�s



zFirewallD.queryChaincCs6t|t�}t|t�}tjd||f�|jjj||�S)Nzdirect.getChains('%s', '%s'))rr^rr1r#r�Z
get_chains)r)rHrIr:r-r-r.�	getChains�s

zFirewallD.getChainsza(sss)cCstjd�|jjj�S)Nzdirect.getAllChains())rr1r#r�r�)r)r:r-r-r.�getAllChains�s
zFirewallD.getAllChainscCstjd|||f�dS)Nz#direct.ChainAdded('%s', '%s', '%s'))rr1)r)rHrIrJr-r-r.rG�szFirewallD.ChainAddedcCstjd|||f�dS)Nz%direct.ChainRemoved('%s', '%s', '%s'))rr1)r)rHrIrJr-r-r.rL�s
zFirewallD.ChainRemovedZsssiascCs�t|t�}t|t�}t|t�}t|t�}tdd�|D��}tjd||||dj|�f�|j|�|jj	j
|||||�|j|||||�dS)Ncss|]}t|t�VqdS)N)rr^)�.0r�r-r-r.�	<genexpr>�sz$FirewallD.addRule.<locals>.<genexpr>z*direct.addRule('%s', '%s', '%s', %d, '%s')z',')rr^r�r�rr1�joinr;r#r�r��	RuleAdded)r)rHrIrJ�priorityr*r:r-r-r.�addRule�s




zFirewallD.addRulecCs�t|t�}t|t�}t|t�}t|t�}tdd�|D��}tjd||||dj|�f�|j|�|jj	j
|||||�|j|||||�dS)Ncss|]}t|t�VqdS)N)rr^)rQr�r-r-r.rR�sz'FirewallD.removeRule.<locals>.<genexpr>z-direct.removeRule('%s', '%s', '%s', %d, '%s')z',')rr^r�r�rr1rSr;r#r�r��RuleRemoved)r)rHrIrJrUr*r:r-r-r.�
removeRule�s




zFirewallD.removeRulecCs�t|t�}t|t�}t|t�}tjd|||f�|j|�xF|jjj|||�D]0\}}|jjj|||||�|j	|||||�qPWdS)Nz$direct.removeRules('%s', '%s', '%s'))
rr^rr1r;r#r��	get_rulesr�rW)r)rHrIrJr:rUr*r-r-r.�removeRules�s



zFirewallD.removeRulescCsnt|t�}t|t�}t|t�}t|t�}tdd�|D��}tjd||||dj|�f�|jjj	|||||�S)Ncss|]}t|t�VqdS)N)rr^)rQr�r-r-r.rR	sz&FirewallD.queryRule.<locals>.<genexpr>z,direct.queryRule('%s', '%s', '%s', %d, '%s')z',')
rr^r�r�rr1rSr#r�r)r)rHrIrJrUr*r:r-r-r.�	queryRule�s



zFirewallD.queryRuleza(ias)cCsDt|t�}t|t�}t|t�}tjd|||f�|jjj|||�S)Nz!direct.getRules('%s', '%s', '%s'))rr^rr1r#r�rY)r)rHrIrJr:r-r-r.�getRules
	s



zFirewallD.getRulesz	a(sssias)cCstjd�|jjj�S)Nzdirect.getAllRules())rr1r#r�r�)r)r:r-r-r.�getAllRules	s
zFirewallD.getAllRulescCs"tjd||||dj|�f�dS)Nz,direct.RuleAdded('%s', '%s', '%s', %d, '%s')z',')rr1rS)r)rHrIrJrUr*r-r-r.rT"	szFirewallD.RuleAddedcCs"tjd||||dj|�f�dS)Nz.direct.RuleRemoved('%s', '%s', '%s', %d, '%s')z',')rr1rS)r)rHrIrJrUr*r-r-r.rW(	szFirewallD.RuleRemovedrScCs�t|t�}tdd�|D��}tjd|dj|�f�|j|�y|jjj	||�St
k
r�}zh|dkrztddd	d
g�}ntd	d
g�}t|�}|jt
jkr�tt|�|@�dkr�tj|�t|���WYdd}~XnXdS)
Ncss|]}t|t�VqdS)N)rr^)rQr�r-r-r.rR9	sz(FirewallD.passthrough.<locals>.<genexpr>zdirect.passthrough('%s', '%s')z','rHrLz-Cz--checkz-Lz--listr)rHrL)rr^r�rr1rSr;r#r��passthroughr�set�coderZCOMMAND_FAILEDr�rxr
)r)rHr*r:r9Z
query_args�msgr-r-r.r^2	s"


zFirewallD.passthroughcCs\t|�}tdd�|D��}tjd|dj|�f�|j|�|jjj||�|j	||�dS)Ncss|]}t|�VqdS)N)r)rQr�r-r-r.rRT	sz+FirewallD.addPassthrough.<locals>.<genexpr>z!direct.addPassthrough('%s', '%s')z',')
rr�rr1rSr;r#r�Zadd_passthrough�PassthroughAdded)r)rHr*r:r-r-r.�addPassthroughM	s
zFirewallD.addPassthroughcCs\t|�}tdd�|D��}tjd|dj|�f�|j|�|jjj||�|j	||�dS)Ncss|]}t|�VqdS)N)r)rQr�r-r-r.rRb	sz.FirewallD.removePassthrough.<locals>.<genexpr>z$direct.removePassthrough('%s', '%s')z',')
rr�rr1rSr;r#r�Zremove_passthrough�PassthroughRemoved)r)rHr*r:r-r-r.�removePassthrough[	s
zFirewallD.removePassthroughcCsBt|�}tdd�|D��}tjd|dj|�f�|jjj||�S)Ncss|]}t|�VqdS)N)r)rQr�r-r-r.rRp	sz-FirewallD.queryPassthrough.<locals>.<genexpr>z#direct.queryPassthrough('%s', '%s')z',')rr�rr1rSr#r�Zquery_passthrough)r)rHr*r:r-r-r.�queryPassthroughi	s
zFirewallD.queryPassthroughza(sas)cCstjd�|jjj�S)Nzdirect.getAllPassthroughs())rr1r#r�r�)r)r:r-r-r.�getAllPassthroughsu	s
zFirewallD.getAllPassthroughscCs.tjd�xt|j��D]}|j|�qWdS)Nzdirect.removeAllPassthroughs())rr1�reversedrgre)r)r:r^r-r-r.�removeAllPassthroughs~	s
zFirewallD.removeAllPassthroughscCs"t|�}tjd|�|jjj|�S)Nzdirect.getPassthroughs('%s'))rrr1r#r�Zget_passthroughs)r)rHr:r-r-r.�getPassthroughs�	szFirewallD.getPassthroughscCstjd|dj|�f�dS)Nz#direct.PassthroughAdded('%s', '%s')z',')rr1rS)r)rHr*r-r-r.rb�	szFirewallD.PassthroughAddedcCstjd|dj|�f�dS)Nz%direct.PassthroughRemoved('%s', '%s')z',')rr1rS)r)rHr*r-r-r.rd�	szFirewallD.PassthroughRemovedcCsdS)z� PK_ACTION_ALL implies all other actions, i.e. once a subject is
            authorized for PK_ACTION_ALL it's also authorized for any other action.
            Use-case is GUI (RHBZ#994729).
        Nr-)r)r:r-r-r.�authorizeAll�	s	zFirewallD.authorizeAllcCs$t|�}tjd|�|jjj|�S)Nzipset.queryIPSet('%s'))rrr1r#r|Zquery_ipset)r)r|r:r-r-r.�
queryIPSet�	szFirewallD.queryIPSetcCstjd�|jjj�S)Nzipsets.getIPSets())rr1r#r|r})r)r:r-r-r.�	getIPSets�	s
zFirewallD.getIPSetscCs(t|t�}tjd|�|jjj|�j�S)NzgetIPSetSettings(%s))rr^rr1r#r|Z	get_ipsetr�)r)r|r:r-r-r.r~�	s
zFirewallD.getIPSetSettingscCsLt|�}t|�}tjd||f�|j|�|jjj||�|j||�dS)Nzipset.addEntry('%s', '%s'))rrr1r;r#r|Z	add_entry�
EntryAdded)r)r|�entryr:r-r-r.�addEntry�	s
zFirewallD.addEntrycCsLt|�}t|�}tjd||f�|j|�|jjj||�|j||�dS)Nzipset.removeEntry('%s', '%s'))rrr1r;r#r|Zremove_entry�EntryRemoved)r)r|ror:r-r-r.�removeEntry�	s
zFirewallD.removeEntrycCs2t|�}t|�}tjd||f�|jjj||�S)Nzipset.queryEntry('%s', '%s'))rrr1r#r|Zquery_entry)r)r|ror:r-r-r.�
queryEntry�	szFirewallD.queryEntrycCs$t|�}tjd|�|jjj|�S)Nzipset.getEntries('%s'))rrr1r#r|�get_entries)r)r|r:r-r-r.�
getEntries�	szFirewallD.getEntriescCs�t|�}t|t�}tjd|dj|��|jjj|�}|jjj||�t	|�}t	|�}x||D]}|j
||�q^Wx||D]}|j||�q|WdS)Nzipset.setEntries('%s', '[%s]')�,)r�listrr1rSr#r|rtZset_entriesr_rnrq)r)r|Zentriesr:Zold_entriesZold_entries_setZentries_setror-r-r.�
setEntries�	s
zFirewallD.setEntriescCs&t|�}t|�}tjd||f�dS)Nzipset.EntryAdded('%s', '%s'))rrr1)r)r|ror-r-r.rn
szFirewallD.EntryAddedcCs&t|�}t|�}tjd||f�dS)Nzipset.EntryRemoved('%s', '%s'))rrr1)r)r|ror-r-r.rq
szFirewallD.EntryRemovedcCstjd�|jjj�S)Nzhelpers.getHelpers())rr1r#r�r�)r)r:r-r-r.�
getHelpers!
s
zFirewallD.getHelperscCs(t|t�}tjd|�|jjj|�j�S)NzgetHelperSettings(%s))rr^rr1r#r�Z
get_helperr�)r)r�r:r-r-r.r�*
s
zFirewallD.getHelperSettings)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)r)N)N)N)N)r)N)N)N)N)r)N)N)N)r)N)N)N)N)r)N)N)N)N)r)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)N)��__name__�
__module__�__qualname__�__doc__Z
persistentrr'ZPK_ACTION_CONFIGZdefault_polkit_auth_requiredrr"r0r&r/r
r;r?rArCrZrZPROPERTIES_IFACErerh�slipZpolkitZrequire_authrirr�signalrjZPK_ACTION_INFOZINTROSPECTABLE_IFACErkr(rnrprorqr�ZPK_ACTION_POLICIESrar�r�ZPK_ACTION_POLICIES_INFOr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�ZPK_ACTION_CONFIG_INFOr�r_r�r�r�ZDBUS_INTERFACE_POLICYr�r�r�r�rtr�r�rZDBUS_SIGNATUREr{r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rrrrr�r�rrvr	r
rrrrrrrrrr
rrrrrrrr!r#r$r%r&r"r r*r,r-r.r+r)r3r5r6r7r8r4r0r<r>r?r@rAr=r:rCrErFrBrDZPK_ACTION_DIRECTr`rKrMZPK_ACTION_DIRECT_INFOrNrOrPrGrLrVrXrZr[r\r]rTrWr^rcrerfrgrirjrbrdZ
PK_ACTION_ALLrkrbrlrmrr~rprrrsrurxrnrqryrr��
__classcell__r-r-)r,r.rAs�	0"	



K



	
	


	
	


	
	


	
	



























	









	








	















	






	
	


	
















	







	









	
	




)9�__all__Z
gi.repositoryrr�sys�modulesr�r'Zdbus.serviceZ	slip.dbusr~Zslip.dbus.serviceZfirewallrZfirewall.core.fwrZfirewall.core.richrZfirewall.core.loggerrZfirewall.clientr	Zfirewall.server.decoratorsr
rrr
Zfirewall.server.configrZfirewall.dbus_utilsrrrrrrrZfirewall.core.io.functionsrZfirewall.core.io.ipsetrZfirewall.core.io.icmptyperZfirewall.core.io.helperrZfirewall.core.fw_nmrrrZfirewall.core.fw_ifcfgrrZfirewall.errorsrrrZObjectrr-r-r-r.�<module>s2
site-packages/firewall/server/__pycache__/decorators.cpython-36.pyc000064400000004023147511334700021406 0ustar00
[unreadable binary data omitted: compiled CPython 3.6 bytecode of firewall/server/decorators.py ("decorators for use with and without D-Bus")]
site-packages/firewall/server/__pycache__/config_helper.cpython-36.pyc000064400000030613147511334700022051 0ustar00
[unreadable binary data omitted: compiled CPython 3.6 bytecode of firewall/server/config_helper.py (the FirewallDConfigHelper D-Bus object)]
site-packages/firewall/server/firewalld.py000064400000340142147511334700014733 0ustar00# -*- coding: utf-8 -*-
#
# Copyright (C) 2010-2016 Red Hat, Inc.
#
# Authors:
# Thomas Woerner <twoerner@redhat.com>
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program.  If not, see <http://www.gnu.org/licenses/>.

__all__ = [ "FirewallD" ]

from gi.repository import GLib, GObject

# force use of pygobject3 in python-slip
import sys
sys.modules['gobject'] = GObject

import copy
import dbus
import dbus.service
import slip.dbus
import slip.dbus.service

from firewall import config
from firewall.core.fw import Firewall
from firewall.core.rich import Rich_Rule
from firewall.core.logger import log
from firewall.client import FirewallClientZoneSettings
from firewall.server.decorators import dbus_handle_exceptions, \
                                       dbus_service_method, \
                                       handle_exceptions, \
                                       FirewallDBusException
from firewall.server.config import FirewallDConfig
from firewall.dbus_utils import dbus_to_python, \
    command_of_sender, context_of_sender, uid_of_sender, user_of_uid, \
    dbus_introspection_prepare_properties, \
    dbus_introspection_add_properties
from firewall.core.io.functions import check_config
from firewall.core.io.ipset import IPSet
from firewall.core.io.icmptype import IcmpType
from firewall.core.io.helper import Helper
from firewall.core.fw_nm import nm_get_bus_name, nm_get_connection_of_interface, \
                                nm_set_zone_of_connection
from firewall.core.fw_ifcfg import ifcfg_set_zone_of_interface
from firewall import errors
from firewall.errors import FirewallError

############################################################################
#
# class FirewallD
#
############################################################################

class FirewallD(slip.dbus.service.Object):
    """FirewallD main class"""

    persistent = True
    """ Make FirewallD persistent. """
    default_polkit_auth_required = config.dbus.PK_ACTION_CONFIG
    """ Use config.dbus.PK_ACTION_CONFIG as a default """

    @handle_exceptions
    def __init__(self, *args, **kwargs):
        super(FirewallD, self).__init__(*args, **kwargs)
        self.fw = Firewall()
        self.busname = args[0]
        self.path = args[1]
        self.start()
        dbus_introspection_prepare_properties(self, config.dbus.DBUS_INTERFACE)
        self.config = FirewallDConfig(self.fw.config, self.busname,
                                      config.dbus.DBUS_PATH_CONFIG)

    def __del__(self):
        self.stop()

    @handle_exceptions
    def start(self):
        # tests if iptables and ip6tables are usable using test functions
        # loads default firewall rules for iptables and ip6tables
        log.debug1("start()")
        self._timeouts = { }
        return self.fw.start()

    @handle_exceptions
    def stop(self):
        # stops firewall: unloads firewall modules, flushes chains and tables,
        #   resets policies
        log.debug1("stop()")
        return self.fw.stop()

    # lockdown functions

    @dbus_handle_exceptions
    def accessCheck(self, sender):
        if self.fw.policies.query_lockdown():
            if sender is None:
                log.error("Lockdown not possible, sender not set.")
                return
            bus = dbus.SystemBus()
            context = context_of_sender(bus, sender)
            if self.fw.policies.access_check("context", context):
                return
            uid = uid_of_sender(bus, sender)
            if self.fw.policies.access_check("uid", uid):
                return
            user = user_of_uid(uid)
            if self.fw.policies.access_check("user", user):
                return
            command = command_of_sender(bus, sender)
            if self.fw.policies.access_check("command", command):
                return
            raise FirewallError(errors.ACCESS_DENIED, "lockdown is enabled")
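
    # accessCheck() is the lockdown gate: when lockdown mode is active, the
    # D-Bus sender must match the lockdown whitelist by context, uid, user
    # name or command (checked in that order above); otherwise a
    # FirewallError(ACCESS_DENIED) is raised. Configuration-changing methods
    # pass their D-Bus sender through this check (see e.g. Set() below)
    # before touching the firewall.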

    # timeout functions

    @dbus_handle_exceptions
    def addTimeout(self, zone, x, tag):
        if zone not in self._timeouts:
            self._timeouts[zone] = { }
        self._timeouts[zone][x] = tag

    @dbus_handle_exceptions
    def removeTimeout(self, zone, x):
        if zone in self._timeouts and x in self._timeouts[zone]:
            GLib.source_remove(self._timeouts[zone][x])
            del self._timeouts[zone][x]

    @dbus_handle_exceptions
    def cleanup_timeouts(self):
        # cleanup timeouts
        for zone in self._timeouts:
            for x in self._timeouts[zone]:
                GLib.source_remove(self._timeouts[zone][x])
            self._timeouts[zone].clear()
        self._timeouts.clear()
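
    # The three helpers above book-keep GLib timeout source tags per zone so
    # that timed runtime changes can be cancelled when an item is removed
    # early or when the daemon cleans up. A minimal sketch of the intended
    # pattern (illustrative only, assuming a running GLib main loop and an
    # expiry callback such as one of the disableTimed* handlers):
    #
    #   tag = GLib.timeout_add_seconds(timeout, expiry_cb, zone, item)
    #   self.addTimeout(zone, item, tag)   # remember the source tag
    #   ...
    #   self.removeTimeout(zone, item)     # cancels via GLib.source_remove()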

    # property handling

    @dbus_handle_exceptions
    def _get_property(self, prop):
        if prop == "version":
            return dbus.String(config.VERSION)
        elif prop == "interface_version":
            return dbus.String("%d.%d" % (config.dbus.DBUS_INTERFACE_VERSION,
                                          config.dbus.DBUS_INTERFACE_REVISION))
        elif prop == "state":
            return dbus.String(self.fw.get_state())

        elif prop == "IPv4":
            return dbus.Boolean(self.fw.is_ipv_enabled("ipv4"))

        elif prop == "IPv4ICMPTypes":
            return dbus.Array(self.fw.ipv4_supported_icmp_types, "s")

        elif prop == "IPv6":
            return dbus.Boolean(self.fw.is_ipv_enabled("ipv6"))

        elif prop == "IPv6_rpfilter":
            return dbus.Boolean(self.fw.ipv6_rpfilter_enabled)

        elif prop == "IPv6ICMPTypes":
            return dbus.Array(self.fw.ipv6_supported_icmp_types, "s")

        elif prop == "BRIDGE":
            return dbus.Boolean(self.fw.ebtables_enabled)

        elif prop == "IPSet":
            return dbus.Boolean(self.fw.ipset_enabled)

        elif prop == "IPSetTypes":
            return dbus.Array(self.fw.ipset_supported_types, "s")

        elif prop == "nf_conntrack_helper_setting":
            return dbus.Boolean(False)

        elif prop == "nf_conntrack_helpers":
            return dbus.Dictionary({}, "sas")

        elif prop == "nf_nat_helpers":
            return dbus.Dictionary({}, "sas")

        else:
            raise dbus.exceptions.DBusException(
                "org.freedesktop.DBus.Error.InvalidArgs: "
                "Property '%s' does not exist" % prop)

    @dbus_service_method(dbus.PROPERTIES_IFACE, in_signature='ss',
                         out_signature='v')
    @dbus_handle_exceptions
    def Get(self, interface_name, property_name, sender=None): # pylint: disable=W0613
        # get a property
        interface_name = dbus_to_python(interface_name, str)
        property_name = dbus_to_python(property_name, str)
        log.debug1("Get('%s', '%s')", interface_name, property_name)

        if interface_name == config.dbus.DBUS_INTERFACE:
            return self._get_property(property_name)
        elif interface_name in [ config.dbus.DBUS_INTERFACE_ZONE,
                                 config.dbus.DBUS_INTERFACE_DIRECT,
                                 config.dbus.DBUS_INTERFACE_POLICIES,
                                 config.dbus.DBUS_INTERFACE_IPSET ]:
            raise dbus.exceptions.DBusException(
                "org.freedesktop.DBus.Error.InvalidArgs: "
                "Property '%s' does not exist" % property_name)
        else:
            raise dbus.exceptions.DBusException(
                "org.freedesktop.DBus.Error.UnknownInterface: "
                "Interface '%s' does not exist" % interface_name)

    @dbus_service_method(dbus.PROPERTIES_IFACE, in_signature='s',
                         out_signature='a{sv}')
    @dbus_handle_exceptions
    def GetAll(self, interface_name, sender=None): # pylint: disable=W0613
        interface_name = dbus_to_python(interface_name, str)
        log.debug1("GetAll('%s')", interface_name)

        ret = { }
        if interface_name == config.dbus.DBUS_INTERFACE:
            for x in [ "version", "interface_version", "state",
                       "IPv4", "IPv6", "IPv6_rpfilter", "BRIDGE",
                       "IPSet", "IPSetTypes", "nf_conntrack_helper_setting",
                       "nf_conntrack_helpers", "nf_nat_helpers",
                       "IPv4ICMPTypes", "IPv6ICMPTypes" ]:
                ret[x] = self._get_property(x)
        elif interface_name in [ config.dbus.DBUS_INTERFACE_ZONE,
                                 config.dbus.DBUS_INTERFACE_DIRECT,
                                 config.dbus.DBUS_INTERFACE_POLICIES,
                                 config.dbus.DBUS_INTERFACE_IPSET ]:
            pass
        else:
            raise dbus.exceptions.DBusException(
                "org.freedesktop.DBus.Error.UnknownInterface: "
                "Interface '%s' does not exist" % interface_name)

        return dbus.Dictionary(ret, signature="sv")
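
    # Client-side illustration (a sketch, not part of this module): the
    # properties served by Get()/GetAll() above can be read with plain
    # dbus-python. The well-known bus name and object path below are the
    # ones firewalld normally registers and are assumed here:
    #
    #   import dbus
    #   bus = dbus.SystemBus()
    #   obj = bus.get_object("org.fedoraproject.FirewallD1",
    #                        "/org/fedoraproject/FirewallD1")
    #   props = dbus.Interface(obj, dbus.PROPERTIES_IFACE)
    #   state = props.Get("org.fedoraproject.FirewallD1", "state")
    #
    # Unknown property names raise org.freedesktop.DBus.Error.InvalidArgs,
    # as implemented in _get_property() above.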

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_CONFIG)
    @dbus_service_method(dbus.PROPERTIES_IFACE, in_signature='ssv')
    @dbus_handle_exceptions
    def Set(self, interface_name, property_name, new_value, sender=None):
        interface_name = dbus_to_python(interface_name, str)
        property_name = dbus_to_python(property_name, str)
        new_value = dbus_to_python(new_value)
        log.debug1("Set('%s', '%s', '%s')", interface_name, property_name,
                   new_value)
        self.accessCheck(sender)

        if interface_name == config.dbus.DBUS_INTERFACE:
            if property_name in [ "version", "interface_version", "state",
                                  "IPv4", "IPv6", "IPv6_rpfilter", "BRIDGE",
                                  "IPSet", "IPSetTypes",
                                  "nf_conntrack_helper_setting",
                                  "nf_conntrack_helpers", "nf_nat_helpers",
                                  "IPv4ICMPTypes", "IPv6ICMPTypes" ]:
                raise dbus.exceptions.DBusException(
                    "org.freedesktop.DBus.Error.PropertyReadOnly: "
                    "Property '%s' is read-only" % property_name)
            else:
                raise dbus.exceptions.DBusException(
                    "org.freedesktop.DBus.Error.InvalidArgs: "
                    "Property '%s' does not exist" % property_name)
        elif interface_name in [ config.dbus.DBUS_INTERFACE_ZONE,
                                 config.dbus.DBUS_INTERFACE_DIRECT,
                                 config.dbus.DBUS_INTERFACE_POLICIES,
                                 config.dbus.DBUS_INTERFACE_IPSET ]:
            raise dbus.exceptions.DBusException(
                "org.freedesktop.DBus.Error.InvalidArgs: "
                "Property '%s' does not exist" % property_name)
        else:
            raise dbus.exceptions.DBusException(
                "org.freedesktop.DBus.Error.UnknownInterface: "
                "Interface '%s' does not exist" % interface_name)

    @dbus.service.signal(dbus.PROPERTIES_IFACE, signature='sa{sv}as')
    def PropertiesChanged(self, interface_name, changed_properties,
                          invalidated_properties):
        interface_name = dbus_to_python(interface_name, str)
        changed_properties = dbus_to_python(changed_properties)
        invalidated_properties = dbus_to_python(invalidated_properties)
        log.debug1("PropertiesChanged('%s', '%s', '%s')",
                   interface_name, changed_properties, invalidated_properties)

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_INFO)
    @dbus_service_method(dbus.INTROSPECTABLE_IFACE, out_signature='s')
    @dbus_handle_exceptions
    def Introspect(self, sender=None): # pylint: disable=W0613
        log.debug2("Introspect()")

        data = super(FirewallD, self).Introspect(self.path,
                                                 self.busname.get_bus())

        return dbus_introspection_add_properties(self, data,
                                                 config.dbus.DBUS_INTERFACE)

    # reload

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_CONFIG)
    @dbus_service_method(config.dbus.DBUS_INTERFACE, in_signature='',
                         out_signature='')
    @dbus_handle_exceptions
    def reload(self, sender=None): # pylint: disable=W0613
        """Reload the firewall rules.
        """
        log.debug1("reload()")

        self.fw.reload()
        self.config.reload()
        self.Reloaded()

    # complete_reload

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_CONFIG)
    @dbus_service_method(config.dbus.DBUS_INTERFACE, in_signature='',
                         out_signature='')
    @dbus_handle_exceptions
    def completeReload(self, sender=None): # pylint: disable=W0613
        """Completely reload the firewall.

        Completely reload the firewall: Stops firewall, unloads modules and 
        starts the firewall again.
        """
        log.debug1("completeReload()")

        self.fw.reload(True)
        self.config.reload()
        self.Reloaded()

    @dbus.service.signal(config.dbus.DBUS_INTERFACE)
    @dbus_handle_exceptions
    def Reloaded(self):
        log.debug1("Reloaded()")

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_CONFIG)
    @dbus_service_method(config.dbus.DBUS_INTERFACE, in_signature='',
                         out_signature='')
    @dbus_handle_exceptions
    def checkPermanentConfig(self, sender=None): # pylint: disable=W0613
        """Check permanent configuration
        """
        log.debug1("checkPermanentConfig()")
        check_config(self.fw)

    # runtime to permanent

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_CONFIG)
    @dbus_service_method(config.dbus.DBUS_INTERFACE, in_signature='',
                         out_signature='')
    @dbus_handle_exceptions
    def runtimeToPermanent(self, sender=None): # pylint: disable=W0613
        """Make runtime configuration permanent
        """
        log.debug1("copyRuntimeToPermanent()")

        error = False

        # Services or icmptypes cannot be modified at runtime, but they can
        # be removed or modified in the permanent environment. Therefore
        # services and icmptypes also need to be copied to permanent.

        # services

        config_names = self.config.getServiceNames()
        for name in self.fw.service.get_services():
            conf = self.getServiceSettings(name)
            try:
                if name in config_names:
                    conf_obj = self.config.getServiceByName(name)
                    if conf_obj.getSettings() != conf:
                        log.debug1("Copying service '%s' settings" % name)
                        conf_obj.update(conf)
                    else:
                        log.debug1("Service '%s' is identical, ignoring." % name)
                else:
                    log.debug1("Creating service '%s'" % name)
                    self.config.addService(name, conf)
            except Exception as e:
                log.warning(
                    "Runtime To Permanent failed on service '%s': %s" % \
                    (name, e))
                error = True

        # icmptypes

        config_names = self.config.getIcmpTypeNames()
        for name in self.fw.icmptype.get_icmptypes():
            conf = self.getIcmpTypeSettings(name)
            try:
                if name in config_names:
                    conf_obj = self.config.getIcmpTypeByName(name)
                    if conf_obj.getSettings() != conf:
                        log.debug1("Copying icmptype '%s' settings" % name)
                        conf_obj.update(conf)
                    else:
                        log.debug1("IcmpType '%s' is identical, ignoring." % name)
                else:
                    log.debug1("Creating icmptype '%s'" % name)
                    self.config.addIcmpType(name, conf)
            except Exception as e:
                log.warning(
                    "Runtime To Permanent failed on icmptype '%s': %s" % \
                    (name, e))
                error = True

        # ipsets

        config_names = self.config.getIPSetNames()
        for name in self.fw.ipset.get_ipsets():
            try:
                conf = self.getIPSetSettings(name)
                if name in config_names:
                    conf_obj = self.config.getIPSetByName(name)
                    if conf_obj.getSettings() != conf:
                        log.debug1("Copying ipset '%s' settings" % name)
                        conf_obj.update(conf)
                    else:
                        log.debug1("IPSet '%s' is identical, ignoring." % name)
                else:
                    log.debug1("Creating ipset '%s'" % name)
                    self.config.addIPSet(name, conf)
            except Exception as e:
                log.warning(
                    "Runtime To Permanent failed on ipset '%s': %s" % \
                    (name, e))
                error = True

        # zones

        config_names = self.config.getZoneNames()
        nm_bus_name = nm_get_bus_name()
        for name in self.fw.zone.get_zones():
            conf = self.getZoneSettings2(name)
            settings = FirewallClientZoneSettings(conf)
            if nm_bus_name is not None:
                changed = False
                for interface in settings.getInterfaces():
                    if self.fw.zone.interface_get_sender(name, interface) == nm_bus_name:
                        log.debug1("Zone '%s': interface binding for '%s' has been added by NM, ignoring." % (name, interface))
                        settings.removeInterface(interface)
                        changed = True
                # For the remaining interfaces, attempt to let NM manage them
                for interface in settings.getInterfaces():
                    try:
                        connection = nm_get_connection_of_interface(interface)
                        if connection and nm_set_zone_of_connection(name, connection):
                            settings.removeInterface(interface)
                            changed = True
                    except Exception:
                        pass

                if changed:
                    del conf
                    conf = settings.getSettingsDict()
            # For the remaining try to update the ifcfg files
            for interface in settings.getInterfaces():
                ifcfg_set_zone_of_interface(name, interface)
            try:
                if name in config_names:
                    conf_obj = self.config.getZoneByName(name)
                    log.debug1("Copying zone '%s' settings" % name)
                    conf_obj.update2(conf)
                else:
                    log.debug1("Creating zone '%s'" % name)
                    self.config.addZone2(name, conf)
            except Exception as e:
                log.warning(
                    "Runtime To Permanent failed on zone '%s': %s" % \
                    (name, e))
                error = True

        # policies

        config_names = self.config.getPolicyNames()
        for name in self.fw.policy.get_policies_not_derived_from_zone():
            conf = self.getPolicySettings(name)
            try:
                if name in config_names:
                    conf_obj = self.config.getPolicyByName(name)
                    conf_obj.update(conf)
                else:
                    log.debug1("Creating policy '%s'" % name)
                    self.config.addPolicy(name, conf)
            except Exception as e:
                log.warning(
                    "Runtime To Permanent failed on policy '%s': %s" % \
                    (name, e))
                error = True

        # helpers

        config_names = self.config.getHelperNames()
        for name in self.fw.helper.get_helpers():
            conf = self.getHelperSettings(name)
            try:
                if name in config_names:
                    conf_obj = self.config.getHelperByName(name)
                    if conf_obj.getSettings() != conf:
                        log.debug1("Copying helper '%s' settings" % name)
                        conf_obj.update(conf)
                    else:
                        log.debug1("Helper '%s' is identical, ignoring." % name)
                else:
                    log.debug1("Creating helper '%s'" % name)
                    self.config.addHelper(name, conf)
            except Exception as e:
                log.warning(
                    "Runtime To Permanent failed on helper '%s': %s" % \
                    (name, e))
                error = True

        # direct

        # rt_config = self.fw.direct.get_config()
        conf = ( self.fw.direct.get_all_chains(),
                 self.fw.direct.get_all_rules(),
                 self.fw.direct.get_all_passthroughs() )
        try:
            if self.config.getSettings() != conf:
                log.debug1("Copying direct configuration")
                self.config.update(conf)
            else:
                log.debug1("Direct configuration is identical, ignoring.")
        except Exception as e:
            log.warning(
                "Runtime To Permanent failed on direct configuration: %s" % e)
            error = True

        # lockdown whitelist (policies)

        conf = self.fw.policies.lockdown_whitelist.export_config()
        try:
            if self.config.getSettings() != conf:
                log.debug1("Copying policies configuration")
                self.config.setLockdownWhitelist(conf)
            else:
                log.debug1("Policies configuration is identical, ignoring.")
        except Exception as e:
            log.warning(
                "Runtime To Permanent failed on policies configuration: %s" % \
                e)
            error = True

        if error:
            raise FirewallError(errors.RT_TO_PERM_FAILED)
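
    # Note: this is the backend of firewall-cmd --runtime-to-permanent.  Errors
    # for individual objects are only logged; RT_TO_PERM_FAILED is raised once
    # at the end, so a single broken object does not abort the whole copy.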

    # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # #
    # POLICIES
    # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # #

    # lockdown

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_POLICIES)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_POLICIES, in_signature='',
                         out_signature='')
    @dbus_handle_exceptions
    def enableLockdown(self, sender=None):
        """Enable lockdown policies
        """
        log.debug1("policies.enableLockdown()")
        self.accessCheck(sender)
        self.fw.policies.enable_lockdown()
        self.LockdownEnabled()

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_POLICIES)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_POLICIES, in_signature='',
                         out_signature='')
    @dbus_handle_exceptions
    def disableLockdown(self, sender=None):
        """Disable lockdown policies
        """
        log.debug1("policies.disableLockdown()")
        self.accessCheck(sender)
        self.fw.policies.disable_lockdown()
        self.LockdownDisabled()

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_POLICIES_INFO)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_POLICIES, in_signature='',
                         out_signature='b')
    @dbus_handle_exceptions
    def queryLockdown(self, sender=None): # pylint: disable=W0613
        """Retuns True if lockdown is enabled
        """
        log.debug1("policies.queryLockdown()")
        # no access check here
        return self.fw.policies.query_lockdown()

    @dbus.service.signal(config.dbus.DBUS_INTERFACE_POLICIES, signature='')
    @dbus_handle_exceptions
    def LockdownEnabled(self):
        log.debug1("LockdownEnabled()")

    @dbus.service.signal(config.dbus.DBUS_INTERFACE_POLICIES, signature='')
    @dbus_handle_exceptions
    def LockdownDisabled(self):
        log.debug1("LockdownDisabled()")

    # lockdown whitelist

    # command

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_POLICIES)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_POLICIES, in_signature='s',
                         out_signature='')
    @dbus_handle_exceptions
    def addLockdownWhitelistCommand(self, command, sender=None):
        """Add lockdown command
        """
        command = dbus_to_python(command, str)
        log.debug1("policies.addLockdownWhitelistCommand('%s')" % command)
        self.accessCheck(sender)
        self.fw.policies.lockdown_whitelist.add_command(command)
        self.LockdownWhitelistCommandAdded(command)

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_POLICIES)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_POLICIES, in_signature='s',
                         out_signature='')
    @dbus_handle_exceptions
    def removeLockdownWhitelistCommand(self, command, sender=None):
        """Remove lockdown command
        """
        command = dbus_to_python(command, str)
        log.debug1("policies.removeLockdownWhitelistCommand('%s')" % command)
        self.accessCheck(sender)
        self.fw.policies.lockdown_whitelist.remove_command(command)
        self.LockdownWhitelistCommandRemoved(command)

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_POLICIES_INFO)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_POLICIES, in_signature='s',
                         out_signature='b')
    @dbus_handle_exceptions
    def queryLockdownWhitelistCommand(self, command, sender=None): # pylint: disable=W0613
        """Query lockdown command
        """
        command = dbus_to_python(command, str)
        log.debug1("policies.queryLockdownWhitelistCommand('%s')" % command)
        # no access check here
        return self.fw.policies.lockdown_whitelist.has_command(command)

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_POLICIES_INFO)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_POLICIES, in_signature='',
                         out_signature='as')
    @dbus_handle_exceptions
    def getLockdownWhitelistCommands(self, sender=None): # pylint: disable=W0613
        """Add lockdown command
        """
        log.debug1("policies.getLockdownWhitelistCommands()")
        # no access check here
        return self.fw.policies.lockdown_whitelist.get_commands()

    @dbus.service.signal(config.dbus.DBUS_INTERFACE_POLICIES, signature='s')
    @dbus_handle_exceptions
    def LockdownWhitelistCommandAdded(self, command):
        log.debug1("LockdownWhitelistCommandAdded('%s')" % command)

    @dbus.service.signal(config.dbus.DBUS_INTERFACE_POLICIES, signature='s')
    @dbus_handle_exceptions
    def LockdownWhitelistCommandRemoved(self, command):
        log.debug1("LockdownWhitelistCommandRemoved('%s')" % command)

    # uid

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_POLICIES)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_POLICIES, in_signature='i',
                         out_signature='')
    @dbus_handle_exceptions
    def addLockdownWhitelistUid(self, uid, sender=None):
        """Add lockdown uid
        """
        uid = dbus_to_python(uid, int)
        log.debug1("policies.addLockdownWhitelistUid('%s')" % uid)
        self.accessCheck(sender)
        self.fw.policies.lockdown_whitelist.add_uid(uid)
        self.LockdownWhitelistUidAdded(uid)

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_POLICIES)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_POLICIES, in_signature='i',
                         out_signature='')
    @dbus_handle_exceptions
    def removeLockdownWhitelistUid(self, uid, sender=None):
        """Remove lockdown uid
        """
        uid = dbus_to_python(uid, int)
        log.debug1("policies.removeLockdownWhitelistUid('%s')" % uid)
        self.accessCheck(sender)
        self.fw.policies.lockdown_whitelist.remove_uid(uid)
        self.LockdownWhitelistUidRemoved(uid)

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_POLICIES_INFO)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_POLICIES, in_signature='i',
                         out_signature='b')
    @dbus_handle_exceptions
    def queryLockdownWhitelistUid(self, uid, sender=None): # pylint: disable=W0613
        """Query lockdown uid
        """
        uid = dbus_to_python(uid, int)
        log.debug1("policies.queryLockdownWhitelistUid('%s')" % uid)
        # no access check here
        return self.fw.policies.lockdown_whitelist.has_uid(uid)

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_POLICIES_INFO)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_POLICIES, in_signature='',
                         out_signature='ai')
    @dbus_handle_exceptions
    def getLockdownWhitelistUids(self, sender=None): # pylint: disable=W0613
        """Add lockdown uid
        """
        log.debug1("policies.getLockdownWhitelistUids()")
        # no access check here
        return self.fw.policies.lockdown_whitelist.get_uids()

    @dbus.service.signal(config.dbus.DBUS_INTERFACE_POLICIES, signature='i')
    @dbus_handle_exceptions
    def LockdownWhitelistUidAdded(self, uid):
        log.debug1("LockdownWhitelistUidAdded(%d)" % uid)

    @dbus.service.signal(config.dbus.DBUS_INTERFACE_POLICIES, signature='i')
    @dbus_handle_exceptions
    def LockdownWhitelistUidRemoved(self, uid):
        log.debug1("LockdownWhitelistUidRemoved(%d)" % uid)

    # user

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_POLICIES)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_POLICIES, in_signature='s',
                         out_signature='')
    @dbus_handle_exceptions
    def addLockdownWhitelistUser(self, user, sender=None):
        """Add lockdown user
        """
        user = dbus_to_python(user, str)
        log.debug1("policies.addLockdownWhitelistUser('%s')" % user)
        self.accessCheck(sender)
        self.fw.policies.lockdown_whitelist.add_user(user)
        self.LockdownWhitelistUserAdded(user)

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_POLICIES)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_POLICIES, in_signature='s',
                         out_signature='')
    @dbus_handle_exceptions
    def removeLockdownWhitelistUser(self, user, sender=None):
        """Remove lockdown user
        """
        user = dbus_to_python(user, str)
        log.debug1("policies.removeLockdownWhitelistUser('%s')" % user)
        self.accessCheck(sender)
        self.fw.policies.lockdown_whitelist.remove_user(user)
        self.LockdownWhitelistUserRemoved(user)

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_POLICIES_INFO)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_POLICIES, in_signature='s',
                         out_signature='b')
    @dbus_handle_exceptions
    def queryLockdownWhitelistUser(self, user, sender=None): # pylint: disable=W0613
        """Query lockdown user
        """
        user = dbus_to_python(user, str)
        log.debug1("policies.queryLockdownWhitelistUser('%s')" % user)
        # no access check here
        return self.fw.policies.lockdown_whitelist.has_user(user)

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_POLICIES_INFO)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_POLICIES, in_signature='',
                         out_signature='as')
    @dbus_handle_exceptions
    def getLockdownWhitelistUsers(self, sender=None): # pylint: disable=W0613
        """Add lockdown user
        """
        log.debug1("policies.getLockdownWhitelistUsers()")
        # no access check here
        return self.fw.policies.lockdown_whitelist.get_users()

    @dbus.service.signal(config.dbus.DBUS_INTERFACE_POLICIES, signature='s')
    @dbus_handle_exceptions
    def LockdownWhitelistUserAdded(self, user):
        log.debug1("LockdownWhitelistUserAdded('%s')" % user)

    @dbus.service.signal(config.dbus.DBUS_INTERFACE_POLICIES, signature='s')
    @dbus_handle_exceptions
    def LockdownWhitelistUserRemoved(self, user):
        log.debug1("LockdownWhitelistUserRemoved('%s')" % user)

    # context

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_POLICIES)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_POLICIES, in_signature='s',
                         out_signature='')
    @dbus_handle_exceptions
    def addLockdownWhitelistContext(self, context, sender=None):
        """Add lockdown context
        """
        context = dbus_to_python(context, str)
        log.debug1("policies.addLockdownWhitelistContext('%s')" % context)
        self.accessCheck(sender)
        self.fw.policies.lockdown_whitelist.add_context(context)
        self.LockdownWhitelistContextAdded(context)

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_POLICIES)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_POLICIES, in_signature='s',
                         out_signature='')
    @dbus_handle_exceptions
    def removeLockdownWhitelistContext(self, context, sender=None):
        """Remove lockdown context
        """
        context = dbus_to_python(context, str)
        log.debug1("policies.removeLockdownWhitelistContext('%s')" % context)
        self.accessCheck(sender)
        self.fw.policies.lockdown_whitelist.remove_context(context)
        self.LockdownWhitelistContextRemoved(context)

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_POLICIES_INFO)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_POLICIES, in_signature='s',
                         out_signature='b')
    @dbus_handle_exceptions
    def queryLockdownWhitelistContext(self, context, sender=None): # pylint: disable=W0613
        """Query lockdown context
        """
        context = dbus_to_python(context, str)
        log.debug1("policies.queryLockdownWhitelistContext('%s')" % context)
        # no access check here
        return self.fw.policies.lockdown_whitelist.has_context(context)

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_POLICIES_INFO)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_POLICIES, in_signature='',
                         out_signature='as')
    @dbus_handle_exceptions
    def getLockdownWhitelistContexts(self, sender=None): # pylint: disable=W0613
        """Add lockdown context
        """
        log.debug1("policies.getLockdownWhitelistContexts()")
        # no access check here
        return self.fw.policies.lockdown_whitelist.get_contexts()

    @dbus.service.signal(config.dbus.DBUS_INTERFACE_POLICIES, signature='s')
    @dbus_handle_exceptions
    def LockdownWhitelistContextAdded(self, context):
        log.debug1("LockdownWhitelistContextAdded('%s')" % context)

    @dbus.service.signal(config.dbus.DBUS_INTERFACE_POLICIES, signature='s')
    @dbus_handle_exceptions
    def LockdownWhitelistContextRemoved(self, context):
        log.debug1("LockdownWhitelistContextRemoved('%s')" % context)

    # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # #

    # PANIC

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_CONFIG)
    @dbus_service_method(config.dbus.DBUS_INTERFACE, in_signature='',
                         out_signature='')
    @dbus_handle_exceptions
    def enablePanicMode(self, sender=None):
        """Enable panic mode.
        
        All incoming and outgoing connections and packets will be blocked.
        """
        log.debug1("enablePanicMode()")
        self.accessCheck(sender)
        self.fw.enable_panic_mode()
        self.PanicModeEnabled()

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_CONFIG)
    @dbus_service_method(config.dbus.DBUS_INTERFACE, in_signature='',
                         out_signature='')
    @dbus_handle_exceptions
    def disablePanicMode(self, sender=None):
        """Disable panic mode.

        Enables normal mode: allowed incoming and outgoing connections
        will no longer be blocked.
        """
        log.debug1("disablePanicMode()")
        self.accessCheck(sender)
        self.fw.disable_panic_mode()
        self.PanicModeDisabled()

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_INFO)
    @dbus_service_method(config.dbus.DBUS_INTERFACE, in_signature='',
                         out_signature='b')
    @dbus_handle_exceptions
    def queryPanicMode(self, sender=None): # pylint: disable=W0613
        # returns True if in panic mode
        log.debug1("queryPanicMode()")
        return self.fw.query_panic_mode()

    @dbus.service.signal(config.dbus.DBUS_INTERFACE, signature='')
    @dbus_handle_exceptions
    def PanicModeEnabled(self):
        log.debug1("PanicModeEnabled()")

    @dbus.service.signal(config.dbus.DBUS_INTERFACE, signature='')
    @dbus_handle_exceptions
    def PanicModeDisabled(self):
        log.debug1("PanicModeDisabled()")

    # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # #

    # list functions

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_CONFIG_INFO)
    @dbus_service_method(config.dbus.DBUS_INTERFACE, in_signature='s',
                         out_signature="(sssbsasa(ss)asba(ssss)asasasasa(ss)b)")
    @dbus_handle_exceptions
    def getZoneSettings(self, zone, sender=None): # pylint: disable=W0613
        # returns zone settings for zone
        zone = dbus_to_python(zone, str)
        log.debug1("getZoneSettings(%s)", zone)
        return self.fw.zone.get_config_with_settings(zone)

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_CONFIG_INFO)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_ZONE, in_signature='s',
                         out_signature="a{sv}")
    @dbus_handle_exceptions
    def getZoneSettings2(self, zone, sender=None):
        zone = dbus_to_python(zone, str)
        log.debug1("getZoneSettings2(%s)", zone)
        return self.fw.zone.get_config_with_settings_dict(zone)

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_CONFIG_INFO)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_ZONE, in_signature='sa{sv}')
    @dbus_handle_exceptions
    def setZoneSettings2(self, zone, settings, sender=None):
        zone = dbus_to_python(zone, str)
        log.debug1("setZoneSettings2(%s)", zone)
        self.accessCheck(sender)
        self.fw.zone.set_config_with_settings_dict(zone, settings, sender)
        self.ZoneUpdated(zone, settings)

    @dbus.service.signal(config.dbus.DBUS_INTERFACE_ZONE, signature='sa{sv}')
    @dbus_handle_exceptions
    def ZoneUpdated(self, zone, settings):
        log.debug1("zone.ZoneUpdated('%s', '%s')" % (zone, settings))

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_CONFIG_INFO)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_POLICY, in_signature='s',
                         out_signature="a{sv}")
    @dbus_handle_exceptions
    def getPolicySettings(self, policy, sender=None):
        policy = dbus_to_python(policy, str)
        log.debug1("policy.getPolicySettings(%s)", policy)
        return self.fw.policy.get_config_with_settings_dict(policy)

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_CONFIG_INFO)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_POLICY, in_signature='sa{sv}')
    @dbus_handle_exceptions
    def setPolicySettings(self, policy, settings, sender=None):
        policy = dbus_to_python(policy, str)
        log.debug1("policy.setPolicySettings(%s)", policy)
        self.accessCheck(sender)
        self.fw.policy.set_config_with_settings_dict(policy, settings, sender)
        self.PolicyUpdated(policy, settings)

    @dbus.service.signal(config.dbus.DBUS_INTERFACE_POLICY, signature='sa{sv}')
    @dbus_handle_exceptions
    def PolicyUpdated(self, policy, settings):
        log.debug1("policy.PolicyUpdated('%s', '%s')" % (policy, settings))

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_INFO)
    @dbus_service_method(config.dbus.DBUS_INTERFACE, in_signature='',
                         out_signature='as')
    @dbus_handle_exceptions
    def listServices(self, sender=None): # pylint: disable=W0613
        # returns the list of services
        # TODO: should be renamed to getServices()
        # because it is called by firewall-cmd --get-services
        log.debug1("listServices()")
        return self.fw.service.get_services()

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_CONFIG_INFO)
    @dbus_service_method(config.dbus.DBUS_INTERFACE, in_signature='s',
                         out_signature='(sssa(ss)asa{ss}asa(ss))')
    @dbus_handle_exceptions
    def getServiceSettings(self, service, sender=None): # pylint: disable=W0613
        # returns service settings for service
        service = dbus_to_python(service, str)
        log.debug1("getServiceSettings(%s)", service)
        obj = self.fw.service.get_service(service)
        conf_dict = obj.export_config_dict()
        conf_list = []
        for i in range(8): # tuple based dbus API has 8 elements
            if obj.IMPORT_EXPORT_STRUCTURE[i][0] not in conf_dict:
                # old API needs the empty elements as well. Grab it from the
                # object otherwise we don't know the type.
                conf_list.append(copy.deepcopy(getattr(obj, obj.IMPORT_EXPORT_STRUCTURE[i][0])))
            else:
                conf_list.append(conf_dict[obj.IMPORT_EXPORT_STRUCTURE[i][0]])
        return tuple(conf_list)

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_CONFIG_INFO)
    @dbus_service_method(config.dbus.DBUS_INTERFACE, in_signature='s',
                         out_signature='a{sv}')
    @dbus_handle_exceptions
    def getServiceSettings2(self, service, sender=None): # pylint: disable=W0613
        service = dbus_to_python(service, str)
        log.debug1("getServiceSettings2(%s)", service)
        obj = self.fw.service.get_service(service)
        return obj.export_config_dict()

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_INFO)
    @dbus_service_method(config.dbus.DBUS_INTERFACE, in_signature='',
                         out_signature='as')
    @dbus_handle_exceptions
    def listIcmpTypes(self, sender=None): # pylint: disable=W0613
        # returns the list of icmp types
        # TODO: should be renamed to getIcmptypes()
        # because it is called by firewall-cmd --get-icmptypes
        log.debug1("listIcmpTypes()")
        return self.fw.icmptype.get_icmptypes()

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_CONFIG_INFO)
    @dbus_service_method(config.dbus.DBUS_INTERFACE, in_signature='s',
                         out_signature=IcmpType.DBUS_SIGNATURE)
    @dbus_handle_exceptions
    def getIcmpTypeSettings(self, icmptype, sender=None): # pylint: disable=W0613
        # returns icmptype settings for icmptype
        icmptype = dbus_to_python(icmptype, str)
        log.debug1("getIcmpTypeSettings(%s)", icmptype)
        return self.fw.icmptype.get_icmptype(icmptype).export_config()

    # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # #

    # LOG DENIED

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_CONFIG_INFO)
    @dbus_service_method(config.dbus.DBUS_INTERFACE, in_signature='',
                         out_signature='s')
    @dbus_handle_exceptions
    def getLogDenied(self, sender=None): # pylint: disable=W0613
        # returns the log denied value
        log.debug1("getLogDenied()")
        return self.fw.get_log_denied()

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_CONFIG)
    @dbus_service_method(config.dbus.DBUS_INTERFACE, in_signature='s',
                         out_signature='')
    @dbus_handle_exceptions
    def setLogDenied(self, value, sender=None):
        # set the log denied value
        value = dbus_to_python(value, str)
        log.debug1("setLogDenied('%s')" % value)
        self.accessCheck(sender)
        self.fw.set_log_denied(value)
        self.LogDeniedChanged(value)
        # must reload the firewall as well
        self.fw.reload()
        self.config.reload()
        self.Reloaded()

    @dbus.service.signal(config.dbus.DBUS_INTERFACE, signature='s')
    @dbus_handle_exceptions
    def LogDeniedChanged(self, value):
        log.debug1("LogDeniedChanged('%s')" % (value))

    # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # #

    # AUTOMATIC HELPER ASSIGNMENT

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_CONFIG_INFO)
    @dbus_service_method(config.dbus.DBUS_INTERFACE, in_signature='',
                         out_signature='s')
    @dbus_handle_exceptions
    def getAutomaticHelpers(self, sender=None): # pylint: disable=W0613
        # returns the automatic helpers value
        log.debug1("getAutomaticHelpers()")
        # NOTE: This feature was removed and is now a noop. We retain the dbus
        # call to keep the API stable.
        return "no"

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_CONFIG)
    @dbus_service_method(config.dbus.DBUS_INTERFACE, in_signature='s',
                         out_signature='')
    @dbus_handle_exceptions
    def setAutomaticHelpers(self, value, sender=None):
        # set the automatic helpers value
        value = dbus_to_python(value, str)
        log.debug1("setAutomaticHelpers('%s')" % value)
        self.accessCheck(sender)
        # NOTE: This feature was removed and is now a noop. We retain the dbus
        # call to keep the API stable.

    @dbus.service.signal(config.dbus.DBUS_INTERFACE, signature='s')
    @dbus_handle_exceptions
    def AutomaticHelpersChanged(self, value):
        log.debug1("AutomaticHelpersChanged('%s')" % (value))

    # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # #

    # DEFAULT ZONE

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_INFO)
    @dbus_service_method(config.dbus.DBUS_INTERFACE, in_signature='',
                         out_signature='s')
    @dbus_handle_exceptions
    def getDefaultZone(self, sender=None): # pylint: disable=W0613
        # returns the system default zone
        log.debug1("getDefaultZone()")
        return self.fw.get_default_zone()

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_CONFIG)
    @dbus_service_method(config.dbus.DBUS_INTERFACE, in_signature='s',
                         out_signature='')
    @dbus_handle_exceptions
    def setDefaultZone(self, zone, sender=None):
        # set the system default zone
        zone = dbus_to_python(zone, str)
        log.debug1("setDefaultZone('%s')" % zone)
        self.accessCheck(sender)
        self.fw.set_default_zone(zone)
        self.DefaultZoneChanged(zone)

    @dbus.service.signal(config.dbus.DBUS_INTERFACE, signature='s')
    @dbus_handle_exceptions
    def DefaultZoneChanged(self, zone):
        log.debug1("DefaultZoneChanged('%s')" % (zone))

    # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # #
    # POLICY INTERFACE
    # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # #

    # POLICIES

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_INFO)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_POLICY, in_signature='',
                         out_signature='as')
    @dbus_handle_exceptions
    def getPolicies(self, sender=None):
        log.debug1("policy.getPolicies()")
        return self.fw.policy.get_policies_not_derived_from_zone()

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_INFO)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_POLICY, in_signature='',
                         out_signature='a{sa{sas}}')
    @dbus_handle_exceptions
    def getActivePolicies(self, sender=None):
        log.debug1("policy.getActivePolicies()")
        policies = { }
        for policy in self.fw.policy.get_active_policies_not_derived_from_zone():
            policies[policy] = { }
            policies[policy]["ingress_zones"] = self.fw.policy.list_ingress_zones(policy)
            policies[policy]["egress_zones"] = self.fw.policy.list_egress_zones(policy)
        return policies
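
    # Returned structure (sketch):
    #   {"my-policy": {"ingress_zones": ["internal"], "egress_zones": ["external"]}}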

    # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # #
    # ZONE INTERFACE
    # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # #

    # ZONES

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_INFO)
    # TODO: shouldn't this be in DBUS_INTERFACE instead of DBUS_INTERFACE_ZONE ?
    @dbus_service_method(config.dbus.DBUS_INTERFACE_ZONE, in_signature='',
                         out_signature='as')
    @dbus_handle_exceptions
    def getZones(self, sender=None): # pylint: disable=W0613
        # returns the list of zones
        log.debug1("zone.getZones()")
        return self.fw.zone.get_zones()

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_INFO)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_ZONE, in_signature='',
                         out_signature='a{sa{sas}}')
    @dbus_handle_exceptions
    def getActiveZones(self, sender=None): # pylint: disable=W0613
        # returns the list of active zones
        log.debug1("zone.getActiveZones()")
        zones = { }
        for zone in self.fw.zone.get_zones():
            interfaces = self.fw.zone.list_interfaces(zone)
            sources = self.fw.zone.list_sources(zone)
            if len(interfaces) + len(sources) > 0:
                zones[zone] = { }
                if len(interfaces) > 0:
                    zones[zone]["interfaces"] = interfaces
                if len(sources) > 0:
                    zones[zone]["sources"] = sources
        return zones
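
    # Returned structure (sketch):
    #   {"public": {"interfaces": ["eth0"]}, "trusted": {"sources": ["192.0.2.0/24"]}}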

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_INFO)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_ZONE, in_signature='s',
                         out_signature='s')
    @dbus_handle_exceptions
    def getZoneOfInterface(self, interface, sender=None): # pylint: disable=W0613
        """Return the zone an interface belongs to.

        :Parameters:
            `interface` : str
                Name of the interface
        :Returns: str. The name of the zone.
        """
        interface = dbus_to_python(interface, str)
        log.debug1("zone.getZoneOfInterface('%s')" % interface)
        zone = self.fw.zone.get_zone_of_interface(interface)
        if zone:
            return zone
        return ""

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_INFO)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_ZONE, in_signature='s',
                         out_signature='s')
    @dbus_handle_exceptions
    def getZoneOfSource(self, source, sender=None): # pylint: disable=W0613
        # Return the zone a source belongs to.
        source = dbus_to_python(source, str)
        log.debug1("zone.getZoneOfSource('%s')" % source)
        zone = self.fw.zone.get_zone_of_source(source)
        if zone:
            return zone
        return ""

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_CONFIG_INFO)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_ZONE, in_signature='s',
                         out_signature='b')
    @dbus_handle_exceptions
    def isImmutable(self, zone, sender=None): # pylint: disable=W0613
        # no immutable zones anymore
        return False

    # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # #

    # INTERFACES

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_CONFIG)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_ZONE, in_signature='ss',
                         out_signature='s')
    @dbus_handle_exceptions
    def addInterface(self, zone, interface, sender=None):
        """Add an interface to a zone.
        If zone is empty, use default zone.
        """
        zone = dbus_to_python(zone, str)
        interface = dbus_to_python(interface, str)
        log.debug1("zone.addInterface('%s', '%s')" % (zone, interface))
        self.accessCheck(sender)
        _zone = self.fw.zone.add_interface(zone, interface, sender)

        self.InterfaceAdded(_zone, interface)
        return _zone

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_CONFIG)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_ZONE, in_signature='ss',
                         out_signature='s')
    @dbus_handle_exceptions
    def changeZone(self, zone, interface, sender=None):
        """Change a zone an interface is part of.
        If zone is empty, use default zone.

        This function is deprecated, use changeZoneOfInterface instead
        """
        zone = dbus_to_python(zone, str)
        interface = dbus_to_python(interface, str)
        return self.changeZoneOfInterface(zone, interface, sender)

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_CONFIG)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_ZONE, in_signature='ss',
                         out_signature='s')
    @dbus_handle_exceptions
    def changeZoneOfInterface(self, zone, interface, sender=None):
        """Change a zone an interface is part of.
        If zone is empty, use default zone.
        """
        zone = dbus_to_python(zone, str)
        interface = dbus_to_python(interface, str)
        log.debug1("zone.changeZoneOfInterface('%s', '%s')" % (zone, interface))
        self.accessCheck(sender)
        _zone = self.fw.zone.change_zone_of_interface(zone, interface, sender)

        self.ZoneOfInterfaceChanged(_zone, interface)
        return _zone

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_CONFIG)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_ZONE, in_signature='ss',
                         out_signature='s')
    @dbus_handle_exceptions
    def removeInterface(self, zone, interface, sender=None):
        """Remove interface from a zone.
        If zone is empty, remove from zone the interface belongs to.
        """
        zone = dbus_to_python(zone, str)
        interface = dbus_to_python(interface, str)
        log.debug1("zone.removeInterface('%s', '%s')" % (zone, interface))
        self.accessCheck(sender)
        _zone = self.fw.zone.remove_interface(zone, interface)

        self.InterfaceRemoved(_zone, interface)
        return _zone

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_CONFIG_INFO)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_ZONE, in_signature='ss',
                         out_signature='b')
    @dbus_handle_exceptions
    def queryInterface(self, zone, interface, sender=None): # pylint: disable=W0613
        """Return true if an interface is in a zone.
        If zone is empty, use default zone.
        """
        zone = dbus_to_python(zone, str)
        interface = dbus_to_python(interface, str)
        log.debug1("zone.queryInterface('%s', '%s')" % (zone, interface))
        return self.fw.zone.query_interface(zone, interface)

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_CONFIG_INFO)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_ZONE, in_signature='s',
                         out_signature='as')
    @dbus_handle_exceptions
    def getInterfaces(self, zone, sender=None): # pylint: disable=W0613
        """Return the list of interfaces of a zone.
        If zone is empty, use default zone.
        """
        # TODO: should be renamed to listInterfaces()
        # because it is called by firewall-cmd --zone --list-interfaces
        zone = dbus_to_python(zone, str)
        log.debug1("zone.getInterfaces('%s')" % (zone))
        return self.fw.zone.list_interfaces(zone)

    @dbus.service.signal(config.dbus.DBUS_INTERFACE_ZONE, signature='ss')
    @dbus_handle_exceptions
    def InterfaceAdded(self, zone, interface):
        log.debug1("zone.InterfaceAdded('%s', '%s')" % (zone, interface))

    @dbus.service.signal(config.dbus.DBUS_INTERFACE_ZONE, signature='ss')
    @dbus_handle_exceptions
    def ZoneChanged(self, zone, interface):
        """
        This signal is deprecated.
        """
        log.debug1("zone.ZoneChanged('%s', '%s')" % (zone, interface))

    @dbus.service.signal(config.dbus.DBUS_INTERFACE_ZONE, signature='ss')
    @dbus_handle_exceptions
    def ZoneOfInterfaceChanged(self, zone, interface):
        log.debug1("zone.ZoneOfInterfaceChanged('%s', '%s')" % (zone,
                                                                interface))
        self.ZoneChanged(zone, interface)

    @dbus.service.signal(config.dbus.DBUS_INTERFACE_ZONE, signature='ss')
    @dbus_handle_exceptions
    def InterfaceRemoved(self, zone, interface):
        log.debug1("zone.InterfaceRemoved('%s', '%s')" % (zone, interface))

    # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # #

    # SOURCES

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_CONFIG)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_ZONE, in_signature='ss',
                         out_signature='s')
    @dbus_handle_exceptions
    def addSource(self, zone, source, sender=None):
        """Add a source to a zone.
        If zone is empty, use default zone.
        """
        zone = dbus_to_python(zone, str)
        source = dbus_to_python(source, str)
        log.debug1("zone.addSource('%s', '%s')" % (zone, source))
        self.accessCheck(sender)
        _zone = self.fw.zone.add_source(zone, source, sender)

        self.SourceAdded(_zone, source)
        return _zone

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_CONFIG)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_ZONE, in_signature='ss',
                         out_signature='s')
    @dbus_handle_exceptions
    def changeZoneOfSource(self, zone, source, sender=None):
        """Change a zone an source is part of.
        If zone is empty, use default zone.
        """
        zone = dbus_to_python(zone, str)
        source = dbus_to_python(source, str)
        log.debug1("zone.changeZoneOfSource('%s', '%s')" % (zone, source))
        self.accessCheck(sender)
        _zone = self.fw.zone.change_zone_of_source(zone, source, sender)

        self.ZoneOfSourceChanged(_zone, source)
        return _zone

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_CONFIG)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_ZONE, in_signature='ss',
                         out_signature='s')
    @dbus_handle_exceptions
    def removeSource(self, zone, source, sender=None):
        """Remove source from a zone.
        If zone is empty, remove from zone the source belongs to.
        """
        zone = dbus_to_python(zone, str)
        source = dbus_to_python(source, str)
        log.debug1("zone.removeSource('%s', '%s')" % (zone, source))
        self.accessCheck(sender)
        _zone = self.fw.zone.remove_source(zone, source)

        self.SourceRemoved(_zone, source)
        return _zone

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_CONFIG_INFO)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_ZONE, in_signature='ss',
                         out_signature='b')
    @dbus_handle_exceptions
    def querySource(self, zone, source, sender=None): # pylint: disable=W0613
        """Return true if an source is in a zone.
        If zone is empty, use default zone.
        """
        zone = dbus_to_python(zone, str)
        source = dbus_to_python(source, str)
        log.debug1("zone.querySource('%s', '%s')" % (zone, source))
        return self.fw.zone.query_source(zone, source)

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_CONFIG_INFO)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_ZONE, in_signature='s',
                         out_signature='as')
    @dbus_handle_exceptions
    def getSources(self, zone, sender=None): # pylint: disable=W0613
        """Return the list of sources of a zone.
        If zone is empty, use default zone.
        """
        # TODO: should be renamed to listSources()
        # because it is called by firewall-cmd --zone --list-sources
        zone = dbus_to_python(zone, str)
        log.debug1("zone.getSources('%s')" % (zone))
        return self.fw.zone.list_sources(zone)

    @dbus.service.signal(config.dbus.DBUS_INTERFACE_ZONE, signature='ss')
    @dbus_handle_exceptions
    def SourceAdded(self, zone, source):
        log.debug1("zone.SourceAdded('%s', '%s')" % (zone, source))

    @dbus.service.signal(config.dbus.DBUS_INTERFACE_ZONE, signature='ss')
    @dbus_handle_exceptions
    def ZoneOfSourceChanged(self, zone, source):
        log.debug1("zone.ZoneOfSourceChanged('%s', '%s')" % (zone, source))

    @dbus.service.signal(config.dbus.DBUS_INTERFACE_ZONE, signature='ss')
    @dbus_handle_exceptions
    def SourceRemoved(self, zone, source):
        log.debug1("zone.SourceRemoved('%s', '%s')" % (zone, source))

    # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # #

    # RICH RULES

    @dbus_handle_exceptions
    def disableTimedRichRule(self, zone, rule):
        log.debug1("zone.disableTimedRichRule('%s', '%s')" % (zone, rule))
        del self._timeouts[zone][rule]
        obj = Rich_Rule(rule_str=rule)
        self.fw.zone.remove_rule(zone, obj)
        self.RichRuleRemoved(zone, rule)

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_CONFIG)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_ZONE, in_signature='ssi',
                         out_signature='s')
    @dbus_handle_exceptions
    def addRichRule(self, zone, rule, timeout, sender=None): # pylint: disable=W0613
        zone = dbus_to_python(zone, str)
        rule = dbus_to_python(rule, str)
        timeout = dbus_to_python(timeout, int)
        log.debug1("zone.addRichRule('%s', '%s')" % (zone, rule))
        obj = Rich_Rule(rule_str=rule)
        _zone = self.fw.zone.add_rule(zone, obj, timeout)

        if timeout > 0:
            tag = GLib.timeout_add_seconds(timeout, self.disableTimedRichRule,
                                           _zone, rule)
            self.addTimeout(_zone, rule, tag)

        self.RichRuleAdded(_zone, rule, timeout)
        return _zone
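
    # Illustrative rich rule string (see firewalld.richlanguage(5) for syntax):
    #   'rule family="ipv4" source address="192.0.2.0/24" service name="ssh" accept'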

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_CONFIG)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_ZONE, in_signature='ss',
                         out_signature='s')
    @dbus_handle_exceptions
    def removeRichRule(self, zone, rule, sender=None): # pylint: disable=W0613
        zone = dbus_to_python(zone, str)
        rule = dbus_to_python(rule, str)
        log.debug1("zone.removeRichRule('%s', '%s')" % (zone, rule))
        obj = Rich_Rule(rule_str=rule)
        _zone = self.fw.zone.remove_rule(zone, obj)
        self.removeTimeout(_zone, rule)
        self.RichRuleRemoved(_zone, rule)
        return _zone

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_CONFIG_INFO)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_ZONE, in_signature='ss',
                         out_signature='b')
    @dbus_handle_exceptions
    def queryRichRule(self, zone, rule, sender=None): # pylint: disable=W0613
        zone = dbus_to_python(zone, str)
        rule = dbus_to_python(rule, str)
        log.debug1("zone.queryRichRule('%s', '%s')" % (zone, rule))
        obj = Rich_Rule(rule_str=rule)
        return self.fw.zone.query_rule(zone, obj)

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_CONFIG_INFO)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_ZONE, in_signature='s',
                         out_signature='as')
    @dbus_handle_exceptions
    def getRichRules(self, zone, sender=None): # pylint: disable=W0613
        # returns the list of enabled rich rules for zone
        # TODO: should be renamed to listRichRules()
        # because it is called by firewall-cmd --zone --list-rich-rules
        zone = dbus_to_python(zone, str)
        log.debug1("zone.getRichRules('%s')" % (zone))
        return self.fw.zone.list_rules(zone)

    @dbus.service.signal(config.dbus.DBUS_INTERFACE_ZONE, signature='ssi')
    @dbus_handle_exceptions
    def RichRuleAdded(self, zone, rule, timeout):
        log.debug1("zone.RichRuleAdded('%s', '%s', %d)" % (zone, rule, timeout))

    @dbus.service.signal(config.dbus.DBUS_INTERFACE_ZONE, signature='ss')
    @dbus_handle_exceptions
    def RichRuleRemoved(self, zone, rule):
        log.debug1("zone.RichRuleRemoved('%s', '%s')" % (zone, rule))

    # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # #

    # SERVICES

    @dbus_handle_exceptions
    def disableTimedService(self, zone, service):
        log.debug1("zone.disableTimedService('%s', '%s')" % (zone, service))
        del self._timeouts[zone][service]
        self.fw.zone.remove_service(zone, service)
        self.ServiceRemoved(zone, service)

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_CONFIG)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_ZONE, in_signature='ssi',
                         out_signature='s')
    @dbus_handle_exceptions
    def addService(self, zone, service, timeout, sender=None):
        # enables service <service> if not enabled already for zone
        zone = dbus_to_python(zone, str)
        service = dbus_to_python(service, str)
        timeout = dbus_to_python(timeout, int)
        log.debug1("zone.addService('%s', '%s', %d)" % (zone, service, timeout))
        self.accessCheck(sender)

        _zone = self.fw.zone.add_service(zone, service, timeout, sender)

        if timeout > 0:
            tag = GLib.timeout_add_seconds(timeout, self.disableTimedService,
                                           _zone, service)
            self.addTimeout(_zone, service, tag)

        self.ServiceAdded(_zone, service, timeout)
        return _zone
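
    # A timeout of 0 keeps the runtime change until the next reload or restart;
    # a positive timeout schedules disableTimedService() via GLib so the
    # service is removed again after that many seconds.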

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_CONFIG)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_ZONE, in_signature='ss',
                         out_signature='s')
    @dbus_handle_exceptions
    def removeService(self, zone, service, sender=None):
        # disables service for zone
        zone = dbus_to_python(zone, str)
        service = dbus_to_python(service, str)
        log.debug1("zone.removeService('%s', '%s')" % (zone, service))
        self.accessCheck(sender)

        _zone = self.fw.zone.remove_service(zone, service)

        self.removeTimeout(_zone, service)
        self.ServiceRemoved(_zone, service)
        return _zone

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_CONFIG_INFO)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_ZONE, in_signature='ss',
                         out_signature='b')
    @dbus_handle_exceptions
    def queryService(self, zone, service, sender=None): # pylint: disable=W0613
        # returns true if a service is enabled for zone
        zone = dbus_to_python(zone, str)
        service = dbus_to_python(service, str)
        log.debug1("zone.queryService('%s', '%s')" % (zone, service))
        return self.fw.zone.query_service(zone, service)

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_CONFIG_INFO)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_ZONE, in_signature='s',
                         out_signature='as')
    @dbus_handle_exceptions
    def getServices(self, zone, sender=None): # pylint: disable=W0613
        # returns the list of enabled services for zone
        # TODO: should be renamed to listServices()
        # because it is called by firewall-cmd --zone --list-services
        zone = dbus_to_python(zone, str)
        log.debug1("zone.getServices('%s')" % (zone))
        return self.fw.zone.list_services(zone)

    @dbus.service.signal(config.dbus.DBUS_INTERFACE_ZONE, signature='ssi')
    @dbus_handle_exceptions
    def ServiceAdded(self, zone, service, timeout):
        log.debug1("zone.ServiceAdded('%s', '%s', %d)" % \
                       (zone, service, timeout))

    @dbus.service.signal(config.dbus.DBUS_INTERFACE_ZONE, signature='ss')
    @dbus_handle_exceptions
    def ServiceRemoved(self, zone, service):
        log.debug1("zone.ServiceRemoved('%s', '%s')" % (zone, service))

    # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # #

    # PORTS

    @dbus_handle_exceptions
    def disableTimedPort(self, zone, port, protocol):
        log.debug1("zone.disableTimedPort('%s', '%s', '%s')" % \
                       (zone, port, protocol))
        del self._timeouts[zone][(port, protocol)]
        self.fw.zone.remove_port(zone, port, protocol)
        self.PortRemoved(zone, port, protocol)

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_CONFIG)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_ZONE, in_signature='sssi',
                         out_signature='s')
    @dbus_handle_exceptions
    def addPort(self, zone, port, protocol, timeout, sender=None): # pylint: disable=R0913
        # adds port <port> <protocol> if not enabled already to zone
        zone = dbus_to_python(zone, str)
        port = dbus_to_python(port, str)
        protocol = dbus_to_python(protocol, str)
        timeout = dbus_to_python(timeout, int)
        log.debug1("zone.addPort('%s', '%s', '%s')" % \
                       (zone, port, protocol))
        self.accessCheck(sender)
        _zone = self.fw.zone.add_port(zone, port, protocol, timeout, sender)

        if timeout > 0:
            tag = GLib.timeout_add_seconds(timeout, self.disableTimedPort,
                                           _zone, port, protocol)
            self.addTimeout(_zone, (port, protocol), tag)

        self.PortAdded(_zone, port, protocol, timeout)
        return _zone
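
    # "port" may be a single port ("443") or a range ("1024-2048"); "protocol"
    # is e.g. "tcp" or "udp".  Illustrative call:
    #   addPort("public", "8080", "tcp", 0)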

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_CONFIG)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_ZONE, in_signature='sss',
                         out_signature='s')
    @dbus_handle_exceptions
    def removePort(self, zone, port, protocol, sender=None): # pylint: disable=R0913
        # removes port<port> <protocol> if enabled from zone
        zone = dbus_to_python(zone, str)
        port = dbus_to_python(port, str)
        protocol = dbus_to_python(protocol, str)
        log.debug1("zone.removePort('%s', '%s', '%s')" % \
                       (zone, port, protocol))
        self.accessCheck(sender)
        _zone = self.fw.zone.remove_port(zone, port, protocol)

        self.removeTimeout(_zone, (port, protocol))
        self.PortRemoved(_zone, port, protocol)
        return _zone

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_CONFIG_INFO)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_ZONE, in_signature='sss',
                         out_signature='b')
    @dbus_handle_exceptions
    def queryPort(self, zone, port, protocol, sender=None): # pylint: disable=W0613, R0913
        # returns true if a port is enabled for zone
        zone = dbus_to_python(zone, str)
        port = dbus_to_python(port, str)
        protocol = dbus_to_python(protocol, str)
        log.debug1("zone.queryPort('%s', '%s', '%s')" % (zone, port, protocol))
        return self.fw.zone.query_port(zone, port, protocol)

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_CONFIG_INFO)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_ZONE, in_signature='s',
                         out_signature='aas')
    @dbus_handle_exceptions
    def getPorts(self, zone, sender=None): # pylint: disable=W0613
        # returns the list of enabled ports
        # TODO: should be renamed to listPorts()
        # because it is called by firewall-cmd --zone --list-ports
        zone = dbus_to_python(zone, str)
        log.debug1("zone.getPorts('%s')" % (zone))
        return self.fw.zone.list_ports(zone)

    @dbus.service.signal(config.dbus.DBUS_INTERFACE_ZONE, signature='sssi')
    @dbus_handle_exceptions
    def PortAdded(self, zone, port, protocol, timeout=0):
        log.debug1("zone.PortAdded('%s', '%s', '%s', %d)" % \
                       (zone, port, protocol, timeout))

    @dbus.service.signal(config.dbus.DBUS_INTERFACE_ZONE, signature='sss')
    @dbus_handle_exceptions
    def PortRemoved(self, zone, port, protocol):
        log.debug1("zone.PortRemoved('%s', '%s', '%s')" % \
                       (zone, port, protocol))

    # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # #

    # PROTOCOLS

    @dbus_handle_exceptions
    def disableTimedProtocol(self, zone, protocol):
        log.debug1("zone.disableTimedProtocol('%s', '%s')" % (zone, protocol))
        del self._timeouts[zone][protocol]
        self.fw.zone.remove_protocol(zone, protocol)
        self.ProtocolRemoved(zone, protocol)

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_CONFIG)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_ZONE, in_signature='ssi',
                         out_signature='s')
    @dbus_handle_exceptions
    def addProtocol(self, zone, protocol, timeout, sender=None):
        # adds protocol <protocol> if not enabled already to zone
        zone = dbus_to_python(zone, str)
        protocol = dbus_to_python(protocol, str)
        timeout = dbus_to_python(timeout, int)
        log.debug1("zone.enableProtocol('%s', '%s')" % (zone, protocol))
        self.accessCheck(sender)
        _zone = self.fw.zone.add_protocol(zone, protocol, timeout, sender)

        if timeout > 0:
            tag = GLib.timeout_add_seconds(timeout, self.disableTimedProtocol,
                                           _zone, protocol)
            self.addTimeout(_zone, protocol, tag)

        self.ProtocolAdded(_zone, protocol, timeout)
        return _zone

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_CONFIG)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_ZONE, in_signature='ss',
                         out_signature='s')
    @dbus_handle_exceptions
    def removeProtocol(self, zone, protocol, sender=None):
        # removes protocol <protocol> from zone if enabled
        zone = dbus_to_python(zone, str)
        protocol = dbus_to_python(protocol, str)
        log.debug1("zone.removeProtocol('%s', '%s')" % (zone, protocol))
        self.accessCheck(sender)
        _zone = self.fw.zone.remove_protocol(zone, protocol)

        self.removeTimeout(_zone, protocol)
        self.ProtocolRemoved(_zone, protocol)
        return _zone

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_CONFIG_INFO)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_ZONE, in_signature='ss',
                         out_signature='b')
    @dbus_handle_exceptions
    def queryProtocol(self, zone, protocol, sender=None): # pylint: disable=W0613
        # returns true if a protocol is enabled for zone
        zone = dbus_to_python(zone, str)
        protocol = dbus_to_python(protocol, str)
        log.debug1("zone.queryProtocol('%s', '%s')" % (zone, protocol))
        return self.fw.zone.query_protocol(zone, protocol)

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_CONFIG_INFO)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_ZONE, in_signature='s',
                         out_signature='as')
    @dbus_handle_exceptions
    def getProtocols(self, zone, sender=None): # pylint: disable=W0613
        # returns the list of enabled protocols
        # TODO: should be renamed to listProtocols()
        # because it is called by firewall-cmd --zone --list-protocols
        zone = dbus_to_python(zone, str)
        log.debug1("zone.getProtocols('%s')" % (zone))
        return self.fw.zone.list_protocols(zone)

    @dbus.service.signal(config.dbus.DBUS_INTERFACE_ZONE, signature='ssi')
    @dbus_handle_exceptions
    def ProtocolAdded(self, zone, protocol, timeout=0):
        log.debug1("zone.ProtocolAdded('%s', '%s', %d)" % \
                       (zone, protocol, timeout))

    @dbus.service.signal(config.dbus.DBUS_INTERFACE_ZONE, signature='ss')
    @dbus_handle_exceptions
    def ProtocolRemoved(self, zone, protocol):
        log.debug1("zone.ProtocolRemoved('%s', '%s')" % (zone, protocol))

    # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # #

    # SOURCE PORTS

    @dbus_handle_exceptions
    def disableTimedSourcePort(self, zone, port, protocol):
        log.debug1("zone.disableTimedSourcePort('%s', '%s', '%s')" % \
                   (zone, port, protocol))
        del self._timeouts[zone][("sport", port, protocol)]
        self.fw.zone.remove_source_port(zone, port, protocol)
        self.SourcePortRemoved(zone, port, protocol)

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_CONFIG)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_ZONE, in_signature='sssi',
                         out_signature='s')
    @dbus_handle_exceptions
    def addSourcePort(self, zone, port, protocol, timeout, sender=None): # pylint: disable=R0913
        # adds source port <port>/<protocol> to zone if not already enabled
        zone = dbus_to_python(zone, str)
        port = dbus_to_python(port, str)
        protocol = dbus_to_python(protocol, str)
        timeout = dbus_to_python(timeout, int)
        log.debug1("zone.addSourcePort('%s', '%s', '%s')" % (zone, port,
                                                             protocol))
        self.accessCheck(sender)
        _zone = self.fw.zone.add_source_port(zone, port, protocol, timeout,
                                             sender)

        if timeout > 0:
            tag = GLib.timeout_add_seconds(timeout, self.disableTimedSourcePort,
                                           _zone, port, protocol)
            self.addTimeout(_zone, ("sport", port, protocol), tag)

        self.SourcePortAdded(_zone, port, protocol, timeout)
        return _zone

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_CONFIG)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_ZONE, in_signature='sss',
                         out_signature='s')
    @dbus_handle_exceptions
    def removeSourcePort(self, zone, port, protocol, sender=None): # pylint: disable=R0913
        # removes source port <port>/<protocol> from zone if enabled
        zone = dbus_to_python(zone, str)
        port = dbus_to_python(port, str)
        protocol = dbus_to_python(protocol, str)
        log.debug1("zone.removeSourcePort('%s', '%s', '%s')" % (zone, port,
                                                                protocol))
        self.accessCheck(sender)
        _zone = self.fw.zone.remove_source_port(zone, port, protocol)

        self.removeTimeout(_zone, ("sport", port, protocol))
        self.SourcePortRemoved(_zone, port, protocol)
        return _zone

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_CONFIG_INFO)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_ZONE, in_signature='sss',
                         out_signature='b')
    @dbus_handle_exceptions
    def querySourcePort(self, zone, port, protocol, sender=None): # pylint: disable=W0613, R0913
        # returns true if a source port is enabled for zone
        zone = dbus_to_python(zone, str)
        port = dbus_to_python(port, str)
        protocol = dbus_to_python(protocol, str)
        log.debug1("zone.querySourcePort('%s', '%s', '%s')" % (zone, port,
                                                               protocol))
        return self.fw.zone.query_source_port(zone, port, protocol)

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_CONFIG_INFO)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_ZONE, in_signature='s',
                         out_signature='aas')
    @dbus_handle_exceptions
    def getSourcePorts(self, zone, sender=None): # pylint: disable=W0613
        # returns the list of enabled source ports
        # TODO: should be renamed to listSourcePorts()
        # because it is called by firewall-cmd --zone --list-source-ports
        zone = dbus_to_python(zone, str)
        log.debug1("zone.getSourcePorts('%s')" % (zone))
        return self.fw.zone.list_source_ports(zone)

    @dbus.service.signal(config.dbus.DBUS_INTERFACE_ZONE, signature='sssi')
    @dbus_handle_exceptions
    def SourcePortAdded(self, zone, port, protocol, timeout=0):
        log.debug1("zone.SourcePortAdded('%s', '%s', '%s', %d)" % \
                   (zone, port, protocol, timeout))

    @dbus.service.signal(config.dbus.DBUS_INTERFACE_ZONE, signature='sss')
    @dbus_handle_exceptions
    def SourcePortRemoved(self, zone, port, protocol):
        log.debug1("zone.SourcePortRemoved('%s', '%s', '%s')" % (zone, port,
                                                                 protocol))

    # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # #

    # MASQUERADE

    @dbus_handle_exceptions
    def disableTimedMasquerade(self, zone):
        del self._timeouts[zone]["masquerade"]
        self.fw.zone.remove_masquerade(zone)
        self.MasqueradeRemoved(zone)

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_CONFIG)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_ZONE, in_signature='si',
                         out_signature='s')
    @dbus_handle_exceptions
    def addMasquerade(self, zone, timeout, sender=None):
        # adds masquerade if not added already
        zone = dbus_to_python(zone, str)
        timeout = dbus_to_python(timeout, int)
        log.debug1("zone.addMasquerade('%s')" % (zone))
        self.accessCheck(sender)
        _zone = self.fw.zone.add_masquerade(zone, timeout, sender)
        
        if timeout > 0:
            tag = GLib.timeout_add_seconds(timeout, self.disableTimedMasquerade,
                                           _zone)
            self.addTimeout(_zone, "masquerade", tag)

        self.MasqueradeAdded(_zone, timeout)
        return _zone

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_CONFIG)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_ZONE, in_signature='s',
                         out_signature='s')
    @dbus_handle_exceptions
    def removeMasquerade(self, zone, sender=None):
        # removes masquerade
        zone = dbus_to_python(zone, str)
        log.debug1("zone.removeMasquerade('%s')" % (zone))
        self.accessCheck(sender)
        _zone = self.fw.zone.remove_masquerade(zone)

        self.removeTimeout(_zone, "masquerade")
        self.MasqueradeRemoved(_zone)
        return _zone

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_CONFIG_INFO)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_ZONE, in_signature='s',
                         out_signature='b')
    @dbus_handle_exceptions
    def queryMasquerade(self, zone, sender=None): # pylint: disable=W0613
        # returns true if masquerade is enabled for zone
        zone = dbus_to_python(zone, str)
        log.debug1("zone.queryMasquerade('%s')" % (zone))
        return self.fw.zone.query_masquerade(zone)

    @dbus.service.signal(config.dbus.DBUS_INTERFACE_ZONE, signature='si')
    @dbus_handle_exceptions
    def MasqueradeAdded(self, zone, timeout=0):
        log.debug1("zone.MasqueradeAdded('%s', %d)" % (zone, timeout))

    @dbus.service.signal(config.dbus.DBUS_INTERFACE_ZONE, signature='s')
    @dbus_handle_exceptions
    def MasqueradeRemoved(self, zone):
        log.debug1("zone.MasqueradeRemoved('%s')" % (zone))

    # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # #

    # FORWARD PORT

    @dbus_handle_exceptions
    def disable_forward_port(self, zone, port, protocol, toport, toaddr): # pylint: disable=R0913
        del self._timeouts[zone][(port, protocol, toport, toaddr)]
        self.fw.zone.remove_forward_port(zone, port, protocol, toport, toaddr)
        self.ForwardPortRemoved(zone, port, protocol, toport, toaddr)

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_CONFIG)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_ZONE, in_signature='sssssi',
                         out_signature='s')
    @dbus_handle_exceptions
    def addForwardPort(self, zone, port, protocol, toport, toaddr, timeout,
                       sender=None): # pylint: disable=R0913
        # adds forward port to zone if not already enabled
        zone = dbus_to_python(zone, str)
        port = dbus_to_python(port, str)
        protocol = dbus_to_python(protocol, str)
        toport = dbus_to_python(toport, str)
        toaddr = dbus_to_python(toaddr, str)
        timeout = dbus_to_python(timeout, int)
        log.debug1("zone.addForwardPort('%s', '%s', '%s', '%s', '%s')" % \
                       (zone, port, protocol, toport, toaddr))
        self.accessCheck(sender)
        _zone = self.fw.zone.add_forward_port(zone, port, protocol, toport,
                                              toaddr, timeout, sender)

        if timeout > 0:
            tag = GLib.timeout_add_seconds(timeout,
                                           self.disable_forward_port,
                                           _zone, port, protocol, toport,
                                           toaddr)
            self.addTimeout(_zone, (port, protocol, toport, toaddr), tag)

        self.ForwardPortAdded(_zone, port, protocol, toport, toaddr, timeout)
        return _zone

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_CONFIG)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_ZONE, in_signature='sssss',
                         out_signature='s')
    @dbus_handle_exceptions
    def removeForwardPort(self, zone, port, protocol, toport, toaddr,
                          sender=None): # pylint: disable=R0913
        # remove forward port from zone
        zone = dbus_to_python(zone, str)
        port = dbus_to_python(port, str)
        protocol = dbus_to_python(protocol, str)
        toport = dbus_to_python(toport, str)
        toaddr = dbus_to_python(toaddr, str)
        log.debug1("zone.removeForwardPort('%s', '%s', '%s', '%s', '%s')" % \
                       (zone, port, protocol, toport, toaddr))
        self.accessCheck(sender)
        _zone = self.fw.zone.remove_forward_port(zone, port, protocol, toport,
                                                 toaddr)

        self.removeTimeout(_zone, (port, protocol, toport, toaddr))
        self.ForwardPortRemoved(_zone, port, protocol, toport, toaddr)
        return _zone

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_CONFIG_INFO)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_ZONE, in_signature='sssss',
                         out_signature='b')
    @dbus_handle_exceptions
    def queryForwardPort(self, zone, port, protocol, toport, toaddr,
                         sender=None): # pylint: disable=W0613, R0913
        # returns true if a forward port is enabled for zone
        zone = dbus_to_python(zone, str)
        port = dbus_to_python(port, str)
        protocol = dbus_to_python(protocol, str)
        toport = dbus_to_python(toport, str)
        toaddr = dbus_to_python(toaddr, str)
        log.debug1("zone.queryForwardPort('%s', '%s', '%s', '%s', '%s')" % \
                       (zone, port, protocol, toport, toaddr))
        return self.fw.zone.query_forward_port(zone, port, protocol, toport,
                                               toaddr)

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_CONFIG_INFO)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_ZONE, in_signature='s',
                         out_signature='aas')
    @dbus_handle_exceptions
    def getForwardPorts(self, zone, sender=None): # pylint: disable=W0613
        # returns the list of enabled forward ports for zone
        # TODO: should be renamed to listForwardPorts()
        # because it is called by firewall-cmd --zone --list-forward-ports
        zone = dbus_to_python(zone, str)
        log.debug1("zone.getForwardPorts('%s')" % (zone))
        return self.fw.zone.list_forward_ports(zone)

    @dbus.service.signal(config.dbus.DBUS_INTERFACE_ZONE, signature='sssssi')
    @dbus_handle_exceptions
    def ForwardPortAdded(self, zone, port, protocol, toport, toaddr,
                         timeout=0): # pylint: disable=R0913
        log.debug1("zone.ForwardPortAdded('%s', '%s', '%s', '%s', '%s', %d)" % \
                       (zone, port, protocol, toport, toaddr, timeout))

    @dbus.service.signal(config.dbus.DBUS_INTERFACE_ZONE, signature='sssss')
    @dbus_handle_exceptions
    def ForwardPortRemoved(self, zone, port, protocol, toport, toaddr): # pylint: disable=R0913
        log.debug1("zone.ForwardPortRemoved('%s', '%s', '%s', '%s', '%s')" % \
                       (zone, port, protocol, toport, toaddr))
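
    # Example with illustrative values: forwarding TCP port 443 of the
    # "public" zone to port 8443 on 192.0.2.10 for 300 seconds would arrive
    # here (e.g. via the zone_if proxy sketched above) as
    #
    #   addForwardPort("public", "443", "tcp", "8443", "192.0.2.10", 300)
    #
    # with toport or toaddr passed as an empty string when only a port
    # redirect or only an address redirect is wanted.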

    # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # #

    # ICMP BLOCK

    @dbus_handle_exceptions
    def disableTimedIcmpBlock(self, zone, icmp, sender): # pylint: disable=W0613
        log.debug1("zone.disableTimedIcmpBlock('%s', '%s')" % (zone, icmp))
        del self._timeouts[zone][icmp]
        self.fw.zone.remove_icmp_block(zone, icmp)
        self.IcmpBlockRemoved(zone, icmp)

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_CONFIG)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_ZONE, in_signature='ssi',
                         out_signature='s')
    @dbus_handle_exceptions
    def addIcmpBlock(self, zone, icmp, timeout, sender=None):
        # adds icmp block <icmp> to zone if not already enabled
        zone = dbus_to_python(zone, str)
        icmp = dbus_to_python(icmp, str)
        timeout = dbus_to_python(timeout, int)
        log.debug1("zone.enableIcmpBlock('%s', '%s')" % (zone, icmp))
        self.accessCheck(sender)
        _zone = self.fw.zone.add_icmp_block(zone, icmp, timeout, sender)

        if timeout > 0:
            tag = GLib.timeout_add_seconds(timeout, self.disableTimedIcmpBlock,
                                           _zone, icmp, sender)
            self.addTimeout(_zone, icmp, tag)

        self.IcmpBlockAdded(_zone, icmp, timeout)
        return _zone

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_CONFIG)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_ZONE, in_signature='ss',
                         out_signature='s')
    @dbus_handle_exceptions
    def removeIcmpBlock(self, zone, icmp, sender=None):
        # removes icmpBlock from zone
        zone = dbus_to_python(zone, str)
        icmp = dbus_to_python(icmp, str)
        log.debug1("zone.removeIcmpBlock('%s', '%s')" % (zone, icmp))
        self.accessCheck(sender)
        _zone = self.fw.zone.remove_icmp_block(zone, icmp)

        self.removeTimeout(_zone, icmp)
        self.IcmpBlockRemoved(_zone, icmp)
        return _zone

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_CONFIG_INFO)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_ZONE, in_signature='ss',
                         out_signature='b')
    @dbus_handle_exceptions
    def queryIcmpBlock(self, zone, icmp, sender=None): # pylint: disable=W0613
        # returns true if an icmp block is enabled for zone
        zone = dbus_to_python(zone, str)
        icmp = dbus_to_python(icmp, str)
        log.debug1("zone.queryIcmpBlock('%s', '%s')" % (zone, icmp))
        return self.fw.zone.query_icmp_block(zone, icmp)

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_CONFIG_INFO)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_ZONE, in_signature='s',
                         out_signature='as')
    @dbus_handle_exceptions
    def getIcmpBlocks(self, zone, sender=None): # pylint: disable=W0613
        # returns the list of enabled icmpblocks
        # TODO: should be renamed to listIcmpBlocks()
        # because it is called by firewall-cmd --zone --list-icmp-blocks
        zone = dbus_to_python(zone, str)
        log.debug1("zone.getIcmpBlocks('%s')" % (zone))
        return self.fw.zone.list_icmp_blocks(zone)

    @dbus.service.signal(config.dbus.DBUS_INTERFACE_ZONE, signature='ssi')
    @dbus_handle_exceptions
    def IcmpBlockAdded(self, zone, icmp, timeout=0):
        log.debug1("zone.IcmpBlockAdded('%s', '%s', %d)" % \
                       (zone, icmp, timeout))

    @dbus.service.signal(config.dbus.DBUS_INTERFACE_ZONE, signature='ss')
    @dbus_handle_exceptions
    def IcmpBlockRemoved(self, zone, icmp):
        log.debug1("zone.IcmpBlockRemoved('%s', '%s')" % (zone, icmp))

    # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # #

    # ICMP BLOCK INVERSION

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_CONFIG)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_ZONE, in_signature='s',
                         out_signature='s')
    @dbus_handle_exceptions
    def addIcmpBlockInversion(self, zone, sender=None):
        # adds icmpBlockInversion if not added already
        zone = dbus_to_python(zone, str)
        log.debug1("zone.addIcmpBlockInversion('%s')" % (zone))
        self.accessCheck(sender)
        _zone = self.fw.zone.add_icmp_block_inversion(zone, sender)
        
        self.IcmpBlockInversionAdded(_zone)
        return _zone

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_CONFIG)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_ZONE, in_signature='s',
                         out_signature='s')
    @dbus_handle_exceptions
    def removeIcmpBlockInversion(self, zone, sender=None):
        # removes icmpBlockInversion
        zone = dbus_to_python(zone, str)
        log.debug1("zone.removeIcmpBlockInversion('%s')" % (zone))
        self.accessCheck(sender)
        _zone = self.fw.zone.remove_icmp_block_inversion(zone)

        self.IcmpBlockInversionRemoved(_zone)
        return _zone

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_CONFIG_INFO)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_ZONE, in_signature='s',
                         out_signature='b')
    @dbus_handle_exceptions
    def queryIcmpBlockInversion(self, zone, sender=None): # pylint: disable=W0613
        # returns true if icmp block inversion is enabled for zone
        zone = dbus_to_python(zone, str)
        log.debug1("zone.queryIcmpBlockInversion('%s')" % (zone))
        return self.fw.zone.query_icmp_block_inversion(zone)

    @dbus.service.signal(config.dbus.DBUS_INTERFACE_ZONE, signature='s')
    @dbus_handle_exceptions
    def IcmpBlockInversionAdded(self, zone):
        log.debug1("zone.IcmpBlockInversionAdded('%s')" % (zone))

    @dbus.service.signal(config.dbus.DBUS_INTERFACE_ZONE, signature='s')
    @dbus_handle_exceptions
    def IcmpBlockInversionRemoved(self, zone):
        log.debug1("zone.IcmpBlockInversionRemoved('%s')" % (zone))

    # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # #
    # DIRECT INTERFACE
    # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # #

    # DIRECT CHAIN

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_DIRECT)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_DIRECT, in_signature='sss',
                         out_signature='')
    @dbus_handle_exceptions
    def addChain(self, ipv, table, chain, sender=None):
        # inserts direct chain
        ipv = dbus_to_python(ipv, str)
        table = dbus_to_python(table, str)
        chain = dbus_to_python(chain, str)
        log.debug1("direct.addChain('%s', '%s', '%s')" % (ipv, table, chain))
        self.accessCheck(sender)
        self.fw.direct.add_chain(ipv, table, chain)
        self.ChainAdded(ipv, table, chain)

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_DIRECT)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_DIRECT, in_signature='sss',
                         out_signature='')
    @dbus_handle_exceptions
    def removeChain(self, ipv, table, chain, sender=None):
        # removes direct chain
        ipv = dbus_to_python(ipv, str)
        table = dbus_to_python(table, str)
        chain = dbus_to_python(chain, str)
        log.debug1("direct.removeChain('%s', '%s', '%s')" % (ipv, table, chain))
        self.accessCheck(sender)
        self.fw.direct.remove_chain(ipv, table, chain)
        self.ChainRemoved(ipv, table, chain)
    
    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_DIRECT_INFO)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_DIRECT, in_signature='sss',
                         out_signature='b')
    @dbus_handle_exceptions
    def queryChain(self, ipv, table, chain, sender=None): # pylint: disable=W0613
        # returns true if a chain is enabled
        ipv = dbus_to_python(ipv, str)
        table = dbus_to_python(table, str)
        chain = dbus_to_python(chain, str)
        log.debug1("direct.queryChain('%s', '%s', '%s')" % (ipv, table, chain))
        return self.fw.direct.query_chain(ipv, table, chain)

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_DIRECT_INFO)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_DIRECT, in_signature='ss',
                         out_signature='as')
    @dbus_handle_exceptions
    def getChains(self, ipv, table, sender=None): # pylint: disable=W0613
        # returns list of added chains
        ipv = dbus_to_python(ipv, str)
        table = dbus_to_python(table, str)
        log.debug1("direct.getChains('%s', '%s')" % (ipv, table))
        return self.fw.direct.get_chains(ipv, table)

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_DIRECT_INFO)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_DIRECT, in_signature='',
                         out_signature='a(sss)')
    @dbus_handle_exceptions
    def getAllChains(self, sender=None): # pylint: disable=W0613
        # returns list of added chains
        log.debug1("direct.getAllChains()")
        return self.fw.direct.get_all_chains()

    @dbus.service.signal(config.dbus.DBUS_INTERFACE_DIRECT, signature='sss')
    @dbus_handle_exceptions
    def ChainAdded(self, ipv, table, chain):
        log.debug1("direct.ChainAdded('%s', '%s', '%s')" % (ipv, table, chain))

    @dbus.service.signal(config.dbus.DBUS_INTERFACE_DIRECT, signature='sss')
    @dbus_handle_exceptions
    def ChainRemoved(self, ipv, table, chain):
        log.debug1("direct.ChainRemoved('%s', '%s', '%s')" % (ipv, table,
                                                              chain))

    # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # #

    # DIRECT RULE

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_DIRECT)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_DIRECT,
                         in_signature='sssias', out_signature='')
    @dbus_handle_exceptions
    def addRule(self, ipv, table, chain, priority, args, sender=None): # pylint: disable=R0913
        # inserts direct rule
        ipv = dbus_to_python(ipv, str)
        table = dbus_to_python(table, str)
        chain = dbus_to_python(chain, str)
        priority = dbus_to_python(priority, int)
        args = tuple( dbus_to_python(i, str) for i in args )
        log.debug1("direct.addRule('%s', '%s', '%s', %d, '%s')" % \
                       (ipv, table, chain, priority, "','".join(args)))
        self.accessCheck(sender)
        self.fw.direct.add_rule(ipv, table, chain, priority, args)
        self.RuleAdded(ipv, table, chain, priority, args)

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_DIRECT)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_DIRECT,
                         in_signature='sssias', out_signature='')
    @dbus_handle_exceptions
    def removeRule(self, ipv, table, chain, priority, args, sender=None): # pylint: disable=R0913
        # removes direct rule
        ipv = dbus_to_python(ipv, str)
        table = dbus_to_python(table, str)
        chain = dbus_to_python(chain, str)
        priority = dbus_to_python(priority, int)
        args = tuple( dbus_to_python(i, str) for i in args )
        log.debug1("direct.removeRule('%s', '%s', '%s', %d, '%s')" % \
                       (ipv, table, chain, priority, "','".join(args)))
        self.accessCheck(sender)
        self.fw.direct.remove_rule(ipv, table, chain, priority, args)
        self.RuleRemoved(ipv, table, chain, priority, args)
    
    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_DIRECT)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_DIRECT, in_signature='sss',
                         out_signature='')
    @dbus_handle_exceptions
    def removeRules(self, ipv, table, chain, sender=None):
        # removes all direct rules from the given chain
        ipv = dbus_to_python(ipv, str)
        table = dbus_to_python(table, str)
        chain = dbus_to_python(chain, str)
        log.debug1("direct.removeRules('%s', '%s', '%s')" % (ipv, table, chain))
        self.accessCheck(sender)
        for (priority, args) in self.fw.direct.get_rules(ipv, table, chain):
            self.fw.direct.remove_rule(ipv, table, chain, priority, args)
            self.RuleRemoved(ipv, table, chain, priority, args)

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_DIRECT_INFO)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_DIRECT,
                         in_signature='sssias', out_signature='b')
    @dbus_handle_exceptions
    def queryRule(self, ipv, table, chain, priority, args, sender=None): # pylint: disable=W0613, R0913
        # returns true if a rule is enabled
        ipv = dbus_to_python(ipv, str)
        table = dbus_to_python(table, str)
        chain = dbus_to_python(chain, str)
        priority = dbus_to_python(priority, int)
        args = tuple( dbus_to_python(i, str) for i in args )
        log.debug1("direct.queryRule('%s', '%s', '%s', %d, '%s')" % \
                       (ipv, table, chain, priority, "','".join(args)))
        return self.fw.direct.query_rule(ipv, table, chain, priority, args)

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_DIRECT_INFO)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_DIRECT, in_signature='sss',
                         out_signature='a(ias)')
    @dbus_handle_exceptions
    def getRules(self, ipv, table, chain, sender=None): # pylint: disable=W0613
        # returns list of added rules
        ipv = dbus_to_python(ipv, str)
        table = dbus_to_python(table, str)
        chain = dbus_to_python(chain, str)
        log.debug1("direct.getRules('%s', '%s', '%s')" % (ipv, table, chain))
        return self.fw.direct.get_rules(ipv, table, chain)

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_DIRECT_INFO)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_DIRECT, in_signature='',
                         out_signature='a(sssias)')
    @dbus_handle_exceptions
    def getAllRules(self, sender=None): # pylint: disable=W0613
        # returns list of added rules
        log.debug1("direct.getAllRules()")
        return self.fw.direct.get_all_rules()

    @dbus.service.signal(config.dbus.DBUS_INTERFACE_DIRECT, signature='sssias')
    @dbus_handle_exceptions
    def RuleAdded(self, ipv, table, chain, priority, args): # pylint: disable=R0913
        log.debug1("direct.RuleAdded('%s', '%s', '%s', %d, '%s')" % \
                       (ipv, table, chain, priority, "','".join(args)))

    @dbus.service.signal(config.dbus.DBUS_INTERFACE_DIRECT, signature='sssias')
    @dbus_handle_exceptions
    def RuleRemoved(self, ipv, table, chain, priority, args): # pylint: disable=R0913
        log.debug1("direct.RuleRemoved('%s', '%s', '%s', %d, '%s')" % \
                       (ipv, table, chain, priority, "','".join(args)))
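
    # Illustrative mapping: the command line
    #   firewall-cmd --direct --add-rule ipv4 filter INPUT 0 \
    #       -p tcp --dport 22 -j ACCEPT
    # is expected to arrive at addRule() above as
    #   addRule("ipv4", "filter", "INPUT", 0,
    #           ["-p", "tcp", "--dport", "22", "-j", "ACCEPT"])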

    # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # #

    # DIRECT PASSTHROUGH (untracked)

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_DIRECT)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_DIRECT, in_signature='sas',
                         out_signature='s')
    @dbus_handle_exceptions
    def passthrough(self, ipv, args, sender=None):
        # executes a direct passthrough command (untracked)
        ipv = dbus_to_python(ipv, str)
        args = tuple( dbus_to_python(i, str) for i in args )
        log.debug1("direct.passthrough('%s', '%s')" % (ipv, "','".join(args)))
        self.accessCheck(sender)
        try:
            return self.fw.direct.passthrough(ipv, args)
        except FirewallError as error:
            if ipv in ["ipv4", "ipv6"]:
                query_args = set(["-C", "--check",
                                  "-L", "--list"])
            else:
                query_args = set(["-L", "--list"])
            msg = str(error)
            if error.code == errors.COMMAND_FAILED:
                if len(set(args) & query_args) <= 0:
                    log.warning(msg)
                raise FirewallDBusException(msg)
            raise
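
    # Note on the error handling above: a COMMAND_FAILED error is always
    # re-raised as FirewallDBusException, but the warning log is skipped
    # when the passthrough looks like a query (args containing -C/--check
    # or -L/--list), since e.g. a failing iptables "-C" check is an
    # expected outcome rather than a problem worth warning about.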

    # DIRECT PASSTHROUGH (tracked)

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_DIRECT)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_DIRECT, in_signature='sas',
                         out_signature='')
    @dbus_handle_exceptions
    def addPassthrough(self, ipv, args, sender=None):
        # adds direct passthrough (tracked)
        ipv = dbus_to_python(ipv)
        args = tuple( dbus_to_python(i) for i in args )
        log.debug1("direct.addPassthrough('%s', '%s')" % \
                   (ipv, "','".join(args)))
        self.accessCheck(sender)
        self.fw.direct.add_passthrough(ipv, args)
        self.PassthroughAdded(ipv, args)

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_DIRECT)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_DIRECT, in_signature='sas',
                         out_signature='')
    @dbus_handle_exceptions
    def removePassthrough(self, ipv, args, sender=None):
        # removes direct passthrough
        ipv = dbus_to_python(ipv)
        args = tuple( dbus_to_python(i) for i in args )
        log.debug1("direct.removePassthrough('%s', '%s')" % \
                       (ipv, "','".join(args)))
        self.accessCheck(sender)
        self.fw.direct.remove_passthrough(ipv, args)
        self.PassthroughRemoved(ipv, args)

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_DIRECT_INFO)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_DIRECT, in_signature='sas',
                         out_signature='b')
    @dbus_handle_exceptions
    def queryPassthrough(self, ipv, args, sender=None): # pylint: disable=W0613
        # returns true if a passthrough is enabled
        ipv = dbus_to_python(ipv)
        args = tuple( dbus_to_python(i) for i in args )
        log.debug1("direct.queryPassthrough('%s', '%s')" % \
                       (ipv, "','".join(args)))
        return self.fw.direct.query_passthrough(ipv, args)

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_DIRECT_INFO)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_DIRECT, in_signature='',
                         out_signature='a(sas)')
    @dbus_handle_exceptions
    def getAllPassthroughs(self, sender=None): # pylint: disable=W0613
        # returns list of all added passthroughs
        log.debug1("direct.getAllPassthroughs()")
        return self.fw.direct.get_all_passthroughs()

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_DIRECT)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_DIRECT, in_signature='',
                         out_signature='')
    @dbus_handle_exceptions
    def removeAllPassthroughs(self, sender=None): # pylint: disable=W0613
        # removes all passthroughs
        log.debug1("direct.removeAllPassthroughs()")
        # remove in reverse order to avoid removing non-empty chains
        for passthrough in reversed(self.getAllPassthroughs()):
            self.removePassthrough(*passthrough)

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_DIRECT_INFO)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_DIRECT, in_signature='s',
                         out_signature='aas')
    @dbus_handle_exceptions
    def getPassthroughs(self, ipv, sender=None): # pylint: disable=W0613
        # returns list of all added passthroughs for the given ipv
        ipv = dbus_to_python(ipv)
        log.debug1("direct.getPassthroughs('%s')", ipv)
        return self.fw.direct.get_passthroughs(ipv)

    @dbus.service.signal(config.dbus.DBUS_INTERFACE_DIRECT, signature='sas')
    @dbus_handle_exceptions
    def PassthroughAdded(self, ipv, args):
        log.debug1("direct.PassthroughAdded('%s', '%s')" % \
                       (ipv, "','".join(args)))

    @dbus.service.signal(config.dbus.DBUS_INTERFACE_DIRECT, signature='sas')
    @dbus_handle_exceptions
    def PassthroughRemoved(self, ipv, args):
        log.debug1("direct.PassthroughRemoved('%s', '%s')" % \
                       (ipv, "','".join(args)))

    # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # #

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_ALL)
    @dbus_service_method(config.dbus.DBUS_INTERFACE, in_signature='',
                         out_signature='')
    @dbus_handle_exceptions
    def authorizeAll(self, sender=None): # pylint: disable=W0613
        """ PK_ACTION_ALL implies all other actions, i.e. once a subject is
            authorized for PK_ACTION_ALL it's also authorized for any other action.
            Use-case is GUI (RHBZ#994729).
        """
        pass

    # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # #
    # IPSETS
    # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # #

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_INFO)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_IPSET, in_signature='s',
                         out_signature='b')
    @dbus_handle_exceptions
    def queryIPSet(self, ipset, sender=None): # pylint: disable=W0613
        # returns true if a set with the name exists
        ipset = dbus_to_python(ipset)
        log.debug1("ipset.queryIPSet('%s')" % (ipset))
        return self.fw.ipset.query_ipset(ipset)

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_INFO)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_IPSET, in_signature='',
                         out_signature='as')
    @dbus_handle_exceptions
    def getIPSets(self, sender=None): # pylint: disable=W0613
        # returns list of added sets
        log.debug1("ipsets.getIPSets()")
        return self.fw.ipset.get_ipsets()

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_CONFIG_INFO)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_IPSET, in_signature='s',
                         out_signature=IPSet.DBUS_SIGNATURE)
    @dbus_handle_exceptions
    def getIPSetSettings(self, ipset, sender=None): # pylint: disable=W0613
        # returns ipset settings for ipset
        ipset = dbus_to_python(ipset, str)
        log.debug1("getIPSetSettings(%s)", ipset)
        return self.fw.ipset.get_ipset(ipset).export_config()

    # set entries # # # # # # # # # # # # # # # # # # # # # # # # # # # # # #

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_CONFIG)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_IPSET, in_signature='ss',
                         out_signature='')
    @dbus_handle_exceptions
    def addEntry(self, ipset, entry, sender=None):
        # adds ipset entry
        ipset = dbus_to_python(ipset)
        entry = dbus_to_python(entry)
        log.debug1("ipset.addEntry('%s', '%s')" % (ipset, entry))
        self.accessCheck(sender)
        self.fw.ipset.add_entry(ipset, entry)
        self.EntryAdded(ipset, entry)

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_CONFIG)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_IPSET, in_signature='ss',
                         out_signature='')
    @dbus_handle_exceptions
    def removeEntry(self, ipset, entry, sender=None):
        # removes ipset entry
        ipset = dbus_to_python(ipset)
        entry = dbus_to_python(entry)
        log.debug1("ipset.removeEntry('%s', '%s')" % (ipset, entry))
        self.accessCheck(sender)
        self.fw.ipset.remove_entry(ipset, entry)
        self.EntryRemoved(ipset, entry)

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_INFO)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_IPSET, in_signature='ss',
                         out_signature='b')
    @dbus_handle_exceptions
    def queryEntry(self, ipset, entry, sender=None): # pylint: disable=W0613
        # returns true if the entry exists in the ipset
        ipset = dbus_to_python(ipset)
        entry = dbus_to_python(entry)
        log.debug1("ipset.queryEntry('%s', '%s')" % (ipset, entry))
        return self.fw.ipset.query_entry(ipset, entry)

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_INFO)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_IPSET, in_signature='s',
                         out_signature='as')
    @dbus_handle_exceptions
    def getEntries(self, ipset, sender=None): # pylint: disable=W0613
        # returns list of added entries for the ipset
        ipset = dbus_to_python(ipset)
        log.debug1("ipset.getEntries('%s')" % ipset)
        return self.fw.ipset.get_entries(ipset)

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_CONFIG)
    @dbus_service_method(config.dbus.DBUS_INTERFACE_IPSET, in_signature='sas')
    @dbus_handle_exceptions
    def setEntries(self, ipset, entries, sender=None): # pylint: disable=W0613
        # replaces the entries of the ipset, signalling the differences
        ipset = dbus_to_python(ipset)
        entries = dbus_to_python(entries, list)
        log.debug1("ipset.setEntries('%s', '[%s]')", ipset, ",".join(entries))
        old_entries = self.fw.ipset.get_entries(ipset)
        self.fw.ipset.set_entries(ipset, entries)
        old_entries_set = set(old_entries)
        entries_set = set(entries)
        for entry in entries_set - old_entries_set:
            self.EntryAdded(ipset, entry)
        for entry in old_entries_set - entries_set:
            self.EntryRemoved(ipset, entry)
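
    # Illustrative example of the diff-based signals above (made-up names
    # and addresses): replacing the entries of "myipset" from
    # ["10.0.0.1", "10.0.0.2"] to ["10.0.0.2", "10.0.0.3"] emits
    # EntryAdded("myipset", "10.0.0.3") and EntryRemoved("myipset",
    # "10.0.0.1"); the unchanged entry produces no signal.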

    @dbus.service.signal(config.dbus.DBUS_INTERFACE_IPSET, signature='ss')
    @dbus_handle_exceptions
    def EntryAdded(self, ipset, entry):
        ipset = dbus_to_python(ipset)
        entry = dbus_to_python(entry)
        log.debug1("ipset.EntryAdded('%s', '%s')" % (ipset, entry))

    @dbus.service.signal(config.dbus.DBUS_INTERFACE_IPSET, signature='ss')
    @dbus_handle_exceptions
    def EntryRemoved(self, ipset, entry):
        ipset = dbus_to_python(ipset)
        entry = dbus_to_python(entry)
        log.debug1("ipset.EntryRemoved('%s', '%s')" % (ipset, entry))

    # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # #
    # HELPERS
    # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # #

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_INFO)
    @dbus_service_method(config.dbus.DBUS_INTERFACE, in_signature='',
                         out_signature='as')
    @dbus_handle_exceptions
    def getHelpers(self, sender=None): # pylint: disable=W0613
        # returns list of helpers
        log.debug1("helpers.getHelpers()")
        return self.fw.helper.get_helpers()

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_CONFIG_INFO)
    @dbus_service_method(config.dbus.DBUS_INTERFACE, in_signature='s',
                         out_signature=Helper.DBUS_SIGNATURE)
    @dbus_handle_exceptions
    def getHelperSettings(self, helper, sender=None): # pylint: disable=W0613
        # returns helper settings for helper
        helper = dbus_to_python(helper, str)
        log.debug1("getHelperSettings(%s)", helper)
        return self.fw.helper.get_helper(helper).export_config()

site-packages/firewall/server/config.py
# -*- coding: utf-8 -*-
#
# Copyright (C) 2010-2016 Red Hat, Inc.
#
# Authors:
# Thomas Woerner <twoerner@redhat.com>
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program.  If not, see <http://www.gnu.org/licenses/>.

# force use of pygobject3 in python-slip
from gi.repository import GObject
import sys
sys.modules['gobject'] = GObject
import os

import dbus
import dbus.service
import slip.dbus
import slip.dbus.service

from firewall import config
from firewall.core.base import DEFAULT_ZONE_TARGET
from firewall.core.watcher import Watcher
from firewall.core.logger import log
from firewall.server.decorators import handle_exceptions, \
    dbus_handle_exceptions, dbus_service_method
from firewall.server.config_icmptype import FirewallDConfigIcmpType
from firewall.server.config_service import FirewallDConfigService
from firewall.server.config_zone import FirewallDConfigZone
from firewall.server.config_policy import FirewallDConfigPolicy
from firewall.server.config_ipset import FirewallDConfigIPSet
from firewall.server.config_helper import FirewallDConfigHelper
from firewall.core.io.icmptype import IcmpType
from firewall.core.io.ipset import IPSet
from firewall.core.io.helper import Helper
from firewall.core.io.lockdown_whitelist import LockdownWhitelist
from firewall.core.io.direct import Direct
from firewall.dbus_utils import dbus_to_python, \
    command_of_sender, context_of_sender, uid_of_sender, user_of_uid, \
    dbus_introspection_prepare_properties, \
    dbus_introspection_add_properties
from firewall import errors
from firewall.errors import FirewallError

############################################################################
#
# class FirewallDConfig
#
############################################################################

class FirewallDConfig(slip.dbus.service.Object):
    """FirewallD main class"""

    persistent = True
    """ Make FirewallD persistent. """
    default_polkit_auth_required = config.dbus.PK_ACTION_CONFIG
    """ Use config.dbus.PK_ACTION_INFO as a default """

    @handle_exceptions
    def __init__(self, conf, *args, **kwargs):
        super(FirewallDConfig, self).__init__(*args, **kwargs)
        self.config = conf
        self.busname = args[0]
        self.path = args[1]
        self._init_vars()
        self.watcher = Watcher(self.watch_updater, 5)
        self.watcher.add_watch_dir(config.FIREWALLD_IPSETS)
        self.watcher.add_watch_dir(config.ETC_FIREWALLD_IPSETS)
        self.watcher.add_watch_dir(config.FIREWALLD_ICMPTYPES)
        self.watcher.add_watch_dir(config.ETC_FIREWALLD_ICMPTYPES)
        self.watcher.add_watch_dir(config.FIREWALLD_HELPERS)
        self.watcher.add_watch_dir(config.ETC_FIREWALLD_HELPERS)
        self.watcher.add_watch_dir(config.FIREWALLD_SERVICES)
        self.watcher.add_watch_dir(config.ETC_FIREWALLD_SERVICES)
        self.watcher.add_watch_dir(config.FIREWALLD_ZONES)
        self.watcher.add_watch_dir(config.ETC_FIREWALLD_ZONES)
        self.watcher.add_watch_dir(config.FIREWALLD_POLICIES)
        self.watcher.add_watch_dir(config.ETC_FIREWALLD_POLICIES)
        # Add watches for combined zone directories
        if os.path.exists(config.ETC_FIREWALLD_ZONES):
            for filename in sorted(os.listdir(config.ETC_FIREWALLD_ZONES)):
                path = "%s/%s" % (config.ETC_FIREWALLD_ZONES, filename)
                if os.path.isdir(path):
                    self.watcher.add_watch_dir(path)
        self.watcher.add_watch_file(config.LOCKDOWN_WHITELIST)
        self.watcher.add_watch_file(config.FIREWALLD_DIRECT)
        self.watcher.add_watch_file(config.FIREWALLD_CONF)

        dbus_introspection_prepare_properties(self,
                                              config.dbus.DBUS_INTERFACE_CONFIG,
                                              { "CleanupOnExit": "readwrite",
                                                "CleanupModulesOnExit": "readwrite",
                                                "IPv6_rpfilter": "readwrite",
                                                "Lockdown": "readwrite",
                                                "MinimalMark": "readwrite",
                                                "IndividualCalls": "readwrite",
                                                "LogDenied": "readwrite",
                                                "AutomaticHelpers": "readwrite",
                                                "FirewallBackend": "readwrite",
                                                "FlushAllOnReload": "readwrite",
                                                "RFC3964_IPv4": "readwrite",
                                                "AllowZoneDrifting": "readwrite",
                                              })
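
    # The Watcher created above is assumed to monitor the registered files
    # and directories and to call watch_updater() with the path that
    # changed; watch_updater() then dispatches on that path to reload the
    # matching configuration objects and emit the corresponding signals.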

    @handle_exceptions
    def _init_vars(self):
        self.ipsets = [ ]
        self.ipset_idx = 0
        self.icmptypes = [ ]
        self.icmptype_idx = 0
        self.services = [ ]
        self.service_idx = 0
        self.zones = [ ]
        self.zone_idx = 0
        self.helpers = [ ]
        self.helper_idx = 0
        self.policy_objects = [ ]
        self.policy_object_idx = 0

        for ipset in self.config.get_ipsets():
            self._addIPSet(self.config.get_ipset(ipset))
        for icmptype in self.config.get_icmptypes():
            self._addIcmpType(self.config.get_icmptype(icmptype))
        for service in self.config.get_services():
            self._addService(self.config.get_service(service))
        for zone in self.config.get_zones():
            self._addZone(self.config.get_zone(zone))
        for helper in self.config.get_helpers():
            self._addHelper(self.config.get_helper(helper))
        for policy in self.config.get_policy_objects():
            self._addPolicy(self.config.get_policy_object(policy))

    @handle_exceptions
    def __del__(self):
        pass

    @handle_exceptions
    def reload(self):
        while len(self.ipsets) > 0:
            item = self.ipsets.pop()
            item.unregister()
            del item
        while len(self.icmptypes) > 0:
            item = self.icmptypes.pop()
            item.unregister()
            del item
        while len(self.services) > 0:
            item = self.services.pop()
            item.unregister()
            del item
        while len(self.zones) > 0:
            item = self.zones.pop()
            item.unregister()
            del item
        while len(self.helpers) > 0:
            item = self.helpers.pop()
            item.unregister()
            del item
        while len(self.policy_objects) > 0:
            item = self.policy_objects.pop()
            item.unregister()
            del item
        self._init_vars()

    @handle_exceptions
    def watch_updater(self, name):
        if name == config.FIREWALLD_CONF:
            old_props = self.GetAll(config.dbus.DBUS_INTERFACE_CONFIG)
            log.debug1("config: Reloading firewalld config file '%s'",
                       config.FIREWALLD_CONF)
            try:
                self.config.update_firewalld_conf()
            except Exception as msg:
                log.error("Failed to load firewalld.conf file '%s': %s" % \
                          (name, msg))
                return
            props = self.GetAll(config.dbus.DBUS_INTERFACE_CONFIG).copy()
            for key in list(props.keys()):
                if key in old_props and old_props[key] == props[key]:
                    del props[key]
            if len(props) > 0:
                self.PropertiesChanged(config.dbus.DBUS_INTERFACE_CONFIG,
                                       props, [])
            return

        if (name.startswith(config.FIREWALLD_ICMPTYPES) or \
            name.startswith(config.ETC_FIREWALLD_ICMPTYPES)) and \
           name.endswith(".xml"):
            try:
                (what, obj) = self.config.update_icmptype_from_path(name)
            except Exception as msg:
                log.error("Failed to load icmptype file '%s': %s" % (name, msg))
                return
            if what == "new":
                self._addIcmpType(obj)
            elif what == "remove":
                self.removeIcmpType(obj)
            elif what == "update":
                self._updateIcmpType(obj)

        elif (name.startswith(config.FIREWALLD_SERVICES) or \
              name.startswith(config.ETC_FIREWALLD_SERVICES)) and \
             name.endswith(".xml"):
            try:
                (what, obj) = self.config.update_service_from_path(name)
            except Exception as msg:
                log.error("Failed to load service file '%s': %s" % (name, msg))
                return
            if what == "new":
                self._addService(obj)
            elif what == "remove":
                self.removeService(obj)
            elif what == "update":
                self._updateService(obj)

        elif name.startswith(config.FIREWALLD_ZONES) or \
             name.startswith(config.ETC_FIREWALLD_ZONES):
            if name.endswith(".xml"):
                try:
                    (what, obj) = self.config.update_zone_from_path(name)
                except Exception as msg:
                    log.error("Failed to load zone file '%s': %s" % (name, msg))
                    return
                if what == "new":
                    self._addZone(obj)
                elif what == "remove":
                    self.removeZone(obj)
                elif what == "update":
                    self._updateZone(obj)
            elif name.startswith(config.ETC_FIREWALLD_ZONES):
                # possible combined zone base directory
                _name = name.replace(config.ETC_FIREWALLD_ZONES, "").strip("/")
                if len(_name) < 1 or "/" in _name:
                    # if there is a / in _name, it is a sub-sub-directory:
                    # ignore it
                    return
                if os.path.isdir(name):
                    if not self.watcher.has_watch(name):
                        self.watcher.add_watch_dir(name)
                elif self.watcher.has_watch(name):
                    self.watcher.remove_watch(name)

        elif (name.startswith(config.FIREWALLD_IPSETS) or \
              name.startswith(config.ETC_FIREWALLD_IPSETS)) and \
             name.endswith(".xml"):
            try:
                (what, obj) = self.config.update_ipset_from_path(name)
            except Exception as msg:
                log.error("Failed to load ipset file '%s': %s" % (name,
                                                                  msg))

                return
            if what == "new":
                self._addIPSet(obj)
            elif what == "remove":
                self.removeIPSet(obj)
            elif what == "update":
                self._updateIPSet(obj)

        elif (name.startswith(config.FIREWALLD_HELPERS) or \
              name.startswith(config.ETC_FIREWALLD_HELPERS)) and \
             name.endswith(".xml"):
            try:
                (what, obj) = self.config.update_helper_from_path(name)
            except Exception as msg:
                log.error("Failed to load helper file '%s': %s" % (name,
                                                                  msg))

                return
            if what == "new":
                self._addHelper(obj)
            elif what == "remove":
                self.removeHelper(obj)
            elif what == "update":
                self._updateHelper(obj)

        elif name == config.LOCKDOWN_WHITELIST:
            try:
                self.config.update_lockdown_whitelist()
            except Exception as msg:
                log.error("Failed to load lockdown whitelist file '%s': %s" % \
                          (name, msg))
                return
            self.LockdownWhitelistUpdated()

        elif name == config.FIREWALLD_DIRECT:
            try:
                self.config.update_direct()
            except Exception as msg:
                log.error("Failed to load direct rules file '%s': %s" % (name,
                                                                         msg))
                return
            self.Updated()

        elif (name.startswith(config.FIREWALLD_POLICIES) or \
              name.startswith(config.ETC_FIREWALLD_POLICIES)) and \
             name.endswith(".xml"):
            try:
                (what, obj) = self.config.update_policy_object_from_path(name)
            except Exception as msg:
                log.error("Failed to load policy file '%s': %s" % (name, msg))
                return
            if what == "new":
                self._addPolicy(obj)
            elif what == "remove":
                self.removePolicy(obj)
            elif what == "update":
                self._updatePolicy(obj)
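
    # Each config.update_*_from_path() call above is assumed to return a
    # ("new" | "remove" | "update", obj) pair describing how the file on
    # disk changed; the _add*/remove*/_update* helpers below then keep the
    # exported D-Bus config objects and their signals in sync.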

    @handle_exceptions
    def _addIcmpType(self, obj):
        # TODO: check for idx overflow
        config_icmptype = FirewallDConfigIcmpType(
            self, self.config, obj, self.icmptype_idx, self.busname,
            "%s/%d" % (config.dbus.DBUS_PATH_CONFIG_ICMPTYPE,
                       self.icmptype_idx))
        self.icmptypes.append(config_icmptype)
        self.icmptype_idx += 1
        self.IcmpTypeAdded(obj.name)
        return config_icmptype

    @handle_exceptions
    def _updateIcmpType(self, obj):
        for icmptype in self.icmptypes:
            if icmptype.obj.name == obj.name and \
                    icmptype.obj.path == obj.path and \
                    icmptype.obj.filename == obj.filename:
                icmptype.obj = obj
                icmptype.Updated(obj.name)

    @handle_exceptions
    def removeIcmpType(self, obj):
        index = 7 # see IMPORT_EXPORT_STRUCTURE in class Zone(IO_Object)
        for zone in self.zones:
            settings = zone.getSettings()
            # if this IcmpType is used in a zone remove it from that zone first
            if obj.name in settings[index]:
                settings[index].remove(obj.name)
                zone.obj = self.config.set_zone_config(zone.obj, settings)
                zone.Updated(zone.obj.name)

        for policy in self.policy_objects:
            settings = policy.getSettings()
            # if this IcmpType is used in a policy remove it from that policy first
            if "icmp_blocks" in settings and obj.name in settings["icmp_blocks"]:
                settings["icmp_blocks"].remove(obj.name)
                policy.obj = self.config.set_policy_object_config_dict(policy.obj, settings)
                policy.Updated(policy.obj.name)

        for icmptype in self.icmptypes:
            if icmptype.obj == obj:
                icmptype.Removed(obj.name)
                icmptype.unregister()
                self.icmptypes.remove(icmptype)
                del icmptype

    @handle_exceptions
    def _addService(self, obj):
        # TODO: check for idx overflow
        config_service = FirewallDConfigService(
            self, self.config, obj, self.service_idx, self.busname,
            "%s/%d" % (config.dbus.DBUS_PATH_CONFIG_SERVICE, self.service_idx))
        self.services.append(config_service)
        self.service_idx += 1
        self.ServiceAdded(obj.name)
        return config_service

    @handle_exceptions
    def _updateService(self, obj):
        for service in self.services:
            if service.obj.name == obj.name and \
                    service.obj.path == obj.path and \
                    service.obj.filename == obj.filename:
                service.obj = obj
                service.Updated(obj.name)

    @handle_exceptions
    def removeService(self, obj):
        index = 5 # see IMPORT_EXPORT_STRUCTURE in class Zone(IO_Object)
        for zone in self.zones:
            settings = zone.getSettings()
            # if this Service is used in a zone remove it from that zone first
            if obj.name in settings[index]:
                settings[index].remove(obj.name)
                zone.obj = self.config.set_zone_config(zone.obj, settings)
                zone.Updated(zone.obj.name)

        for policy in self.policy_objects:
            settings = policy.getSettings()
            # if this Service is used in a policy remove it from that policy first
            if "services" in settings and obj.name in settings["services"]:
                settings["services"].remove(obj.name)
                policy.obj = self.config.set_policy_object_config_dict(policy.obj, settings)
                policy.Updated(policy.obj.name)

        for service in self.services:
            if service.obj == obj:
                service.Removed(obj.name)
                service.unregister()
                self.services.remove(service)
                del service

    @handle_exceptions
    def _addZone(self, obj):
        # TODO: check for idx overflow
        config_zone = FirewallDConfigZone(
            self, self.config, obj, self.zone_idx, self.busname,
            "%s/%d" % (config.dbus.DBUS_PATH_CONFIG_ZONE, self.zone_idx))
        self.zones.append(config_zone)
        self.zone_idx += 1
        self.ZoneAdded(obj.name)
        return config_zone

    @handle_exceptions
    def _updateZone(self, obj):
        for zone in self.zones:
            if zone.obj.name == obj.name and zone.obj.path == obj.path and \
                    zone.obj.filename == obj.filename:
                zone.obj = obj
                zone.Updated(obj.name)

    @handle_exceptions
    def removeZone(self, obj):
        for zone in self.zones:
            if zone.obj == obj:
                zone.Removed(obj.name)
                zone.unregister()
                self.zones.remove(zone)
                del zone

    @handle_exceptions
    def _addPolicy(self, obj):
        # TODO: check for idx overflow
        config_policy = FirewallDConfigPolicy(
            self, self.config, obj, self.policy_object_idx, self.busname,
            "%s/%d" % (config.dbus.DBUS_PATH_CONFIG_POLICY, self.policy_object_idx))
        self.policy_objects.append(config_policy)
        self.policy_object_idx += 1
        self.PolicyAdded(obj.name)
        return config_policy

    @handle_exceptions
    def _updatePolicy(self, obj):
        for policy in self.policy_objects:
            if policy.obj.name == obj.name and policy.obj.path == obj.path and \
                    policy.obj.filename == obj.filename:
                policy.obj = obj
                policy.Updated(obj.name)

    @handle_exceptions
    def removePolicy(self, obj):
        for policy in self.policy_objects:
            if policy.obj == obj:
                policy.Removed(obj.name)
                policy.unregister()
                self.policy_objects.remove(policy)
                del policy

    @handle_exceptions
    def _addIPSet(self, obj):
        # TODO: check for idx overflow
        config_ipset = FirewallDConfigIPSet(
            self, self.config, obj, self.ipset_idx, self.busname,
            "%s/%d" % (config.dbus.DBUS_PATH_CONFIG_IPSET, self.ipset_idx))
        self.ipsets.append(config_ipset)
        self.ipset_idx += 1
        self.IPSetAdded(obj.name)
        return config_ipset

    @handle_exceptions
    def _updateIPSet(self, obj):
        for ipset in self.ipsets:
            if ipset.obj.name == obj.name and ipset.obj.path == obj.path and \
                    ipset.obj.filename == obj.filename:
                ipset.obj = obj
                ipset.Updated(obj.name)

    @handle_exceptions
    def removeIPSet(self, obj):
        for ipset in self.ipsets:
            if ipset.obj == obj:
                ipset.Removed(obj.name)
                ipset.unregister()
                self.ipsets.remove(ipset)
                del ipset

    # access check

    @handle_exceptions
    def _addHelper(self, obj):
        # TODO: check for idx overflow
        config_helper = FirewallDConfigHelper(
            self, self.config, obj, self.helper_idx, self.busname,
            "%s/%d" % (config.dbus.DBUS_PATH_CONFIG_HELPER, self.helper_idx))
        self.helpers.append(config_helper)
        self.helper_idx += 1
        self.HelperAdded(obj.name)
        return config_helper

    @handle_exceptions
    def _updateHelper(self, obj):
        for helper in self.helpers:
            if helper.obj.name == obj.name and helper.obj.path == obj.path and \
                    helper.obj.filename == obj.filename:
                helper.obj = obj
                helper.Updated(obj.name)

    @handle_exceptions
    def removeHelper(self, obj):
        for helper in self.helpers:
            if helper.obj == obj:
                helper.Removed(obj.name)
                helper.unregister()
                self.helpers.remove(helper)
                del helper

    # access check
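    # accessCheck() enforces lockdown: when lockdown is enabled, the caller
    # is allowed through if its context, uid, user name or command (checked
    # in that order) matches the lockdown whitelist; otherwise ACCESS_DENIED
    # is raised.  It is called by the methods that modify configuration.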

    @dbus_handle_exceptions
    def accessCheck(self, sender):
        if self.config.lockdown_enabled():
            if sender is None:
                log.error("Lockdown not possible, sender not set.")
                return
            bus = dbus.SystemBus()
            context = context_of_sender(bus, sender)
            if self.config.access_check("context", context):
                return
            uid = uid_of_sender(bus, sender)
            if self.config.access_check("uid", uid):
                return
            user = user_of_uid(uid)
            if self.config.access_check("user", user):
                return
            command = command_of_sender(bus, sender)
            if self.config.access_check("command", command):
                return
            raise FirewallError(errors.ACCESS_DENIED, "lockdown is enabled")

    # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # #

    # P R O P E R T I E S
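    #
    # Property values are read from firewalld.conf; keys not set in the file
    # fall back to the config.FALLBACK_* defaults before being converted to
    # D-Bus types.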

    @dbus_handle_exceptions
    def _get_property(self, prop):
        if prop not in [ "DefaultZone", "MinimalMark", "CleanupOnExit",
                         "CleanupModulesOnExit", "Lockdown", "IPv6_rpfilter",
                         "IndividualCalls", "LogDenied", "AutomaticHelpers",
                         "FirewallBackend", "FlushAllOnReload", "RFC3964_IPv4",
                         "AllowZoneDrifting" ]:
            raise dbus.exceptions.DBusException(
                "org.freedesktop.DBus.Error.InvalidArgs: "
                "Property '%s' does not exist" % prop)

        value = self.config.get_firewalld_conf().get(prop)

        if prop == "DefaultZone":
            if value is None:
                value = config.FALLBACK_ZONE
            return dbus.String(value)
        elif prop == "MinimalMark":
            if value is None:
                value = config.FALLBACK_MINIMAL_MARK
            else:
                value = int(value)
            return dbus.Int32(value)
        elif prop == "CleanupOnExit":
            if value is None:
                value = "yes" if config.FALLBACK_CLEANUP_ON_EXIT else "no"
            return dbus.String(value)
        elif prop == "CleanupModulesOnExit":
            if value is None:
                value = "yes" if config.FALLBACK_CLEANUP_MODULES_ON_EXIT else "no"
            return dbus.String(value)
        elif prop == "Lockdown":
            if value is None:
                value = "yes" if config.FALLBACK_LOCKDOWN else "no"
            return dbus.String(value)
        elif prop == "IPv6_rpfilter":
            if value is None:
                value = "yes" if config.FALLBACK_IPV6_RPFILTER else "no"
            return dbus.String(value)
        elif prop == "IndividualCalls":
            if value is None:
                value = "yes" if config.FALLBACK_INDIVIDUAL_CALLS else "no"
            return dbus.String(value)
        elif prop == "LogDenied":
            if value is None:
                value = config.FALLBACK_LOG_DENIED
            return dbus.String(value)
        elif prop == "AutomaticHelpers":
            if value is None:
                value = config.FALLBACK_AUTOMATIC_HELPERS
            return dbus.String(value)
        elif prop == "FirewallBackend":
            if value is None:
                value = config.FALLBACK_FIREWALL_BACKEND
            return dbus.String(value)
        elif prop == "FlushAllOnReload":
            if value is None:
                value = "yes" if config.FALLBACK_FLUSH_ALL_ON_RELOAD else "no"
            return dbus.String(value)
        elif prop == "RFC3964_IPv4":
            if value is None:
                value = "yes" if config.FALLBACK_RFC3964_IPV4 else "no"
            return dbus.String(value)
        elif prop == "AllowZoneDrifting":
            if value is None:
                value = "yes" if config.FALLBACK_ALLOW_ZONE_DRIFTING else "no"
            return dbus.String(value)

    @dbus_handle_exceptions
    def _get_dbus_property(self, prop):
        if prop == "DefaultZone":
            return dbus.String(self._get_property(prop))
        elif prop == "MinimalMark":
            return dbus.Int32(self._get_property(prop))
        elif prop == "CleanupOnExit":
            return dbus.String(self._get_property(prop))
        elif prop == "CleanupModulesOnExit":
            return dbus.String(self._get_property(prop))
        elif prop == "Lockdown":
            return dbus.String(self._get_property(prop))
        elif prop == "IPv6_rpfilter":
            return dbus.String(self._get_property(prop))
        elif prop == "IndividualCalls":
            return dbus.String(self._get_property(prop))
        elif prop == "LogDenied":
            return dbus.String(self._get_property(prop))
        elif prop == "AutomaticHelpers":
            return dbus.String(self._get_property(prop))
        elif prop == "FirewallBackend":
            return dbus.String(self._get_property(prop))
        elif prop == "FlushAllOnReload":
            return dbus.String(self._get_property(prop))
        elif prop == "RFC3964_IPv4":
            return dbus.String(self._get_property(prop))
        elif prop == "AllowZoneDrifting":
            return dbus.String(self._get_property(prop))
        else:
            raise dbus.exceptions.DBusException(
                "org.freedesktop.DBus.Error.InvalidArgs: "
                "Property '%s' does not exist" % prop)

    @dbus_service_method(dbus.PROPERTIES_IFACE, in_signature='ss',
                         out_signature='v')
    @dbus_handle_exceptions
    def Get(self, interface_name, property_name, sender=None): # pylint: disable=W0613
        # get a property
        interface_name = dbus_to_python(interface_name, str)
        property_name = dbus_to_python(property_name, str)
        log.debug1("config.Get('%s', '%s')", interface_name, property_name)

        if interface_name == config.dbus.DBUS_INTERFACE_CONFIG:
            return self._get_dbus_property(property_name)
        elif interface_name in [ config.dbus.DBUS_INTERFACE_CONFIG_DIRECT,
                                 config.dbus.DBUS_INTERFACE_CONFIG_POLICIES ]:
            raise dbus.exceptions.DBusException(
                "org.freedesktop.DBus.Error.InvalidArgs: "
                "Property '%s' does not exist" % property_name)
        else:
            raise dbus.exceptions.DBusException(
                "org.freedesktop.DBus.Error.UnknownInterface: "
                "Interface '%s' does not exist" % interface_name)

        return self._get_dbus_property(property_name)

    @dbus_service_method(dbus.PROPERTIES_IFACE, in_signature='s',
                         out_signature='a{sv}')
    @dbus_handle_exceptions
    def GetAll(self, interface_name, sender=None): # pylint: disable=W0613
        interface_name = dbus_to_python(interface_name, str)
        log.debug1("config.GetAll('%s')", interface_name)

        ret = { }
        if interface_name == config.dbus.DBUS_INTERFACE_CONFIG:
            for x in [ "DefaultZone", "MinimalMark", "CleanupOnExit",
                       "CleanupModulesOnExit", "Lockdown", "IPv6_rpfilter",
                       "IndividualCalls", "LogDenied", "AutomaticHelpers",
                       "FirewallBackend", "FlushAllOnReload", "RFC3964_IPv4",
                       "AllowZoneDrifting" ]:
                ret[x] = self._get_property(x)
        elif interface_name in [ config.dbus.DBUS_INTERFACE_CONFIG_DIRECT,
                                 config.dbus.DBUS_INTERFACE_CONFIG_POLICIES ]:
            pass
        else:
            raise dbus.exceptions.DBusException(
                "org.freedesktop.DBus.Error.UnknownInterface: "
                "Interface '%s' does not exist" % interface_name)

        return dbus.Dictionary(ret, signature="sv")
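
    # Illustrative client call (not part of this module), assuming
    # firewalld's usual well-known bus name and config object path:
    #
    #   dbus-send --system --print-reply \
    #     --dest=org.fedoraproject.FirewallD1 \
    #     /org/fedoraproject/FirewallD1/config \
    #     org.freedesktop.DBus.Properties.GetAll \
    #     string:"org.fedoraproject.FirewallD1.config"
    #
    # Set() below additionally requires polkit authorization for
    # PK_ACTION_CONFIG and passes accessCheck() when lockdown is enabled.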

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_CONFIG)
    @dbus_service_method(dbus.PROPERTIES_IFACE, in_signature='ssv')
    @dbus_handle_exceptions
    def Set(self, interface_name, property_name, new_value, sender=None):
        interface_name = dbus_to_python(interface_name, str)
        property_name = dbus_to_python(property_name, str)
        new_value = dbus_to_python(new_value)
        log.debug1("config.Set('%s', '%s', '%s')", interface_name,
                   property_name, new_value)
        self.accessCheck(sender)

        if interface_name == config.dbus.DBUS_INTERFACE_CONFIG:
            if property_name in [ "CleanupOnExit", "Lockdown", "CleanupModulesOnExit",
                                  "IPv6_rpfilter", "IndividualCalls",
                                  "LogDenied",
                                  "FirewallBackend", "FlushAllOnReload",
                                  "RFC3964_IPv4", "AllowZoneDrifting" ]:
                if property_name in [ "CleanupOnExit", "Lockdown", "CleanupModulesOnExit",
                                      "IPv6_rpfilter", "IndividualCalls" ]:
                    if new_value.lower() not in [ "yes", "no",
                                                  "true", "false" ]:
                        raise FirewallError(errors.INVALID_VALUE,
                                            "'%s' for %s" % \
                                            (new_value, property_name))
                if property_name == "LogDenied":
                    if new_value not in config.LOG_DENIED_VALUES:
                        raise FirewallError(errors.INVALID_VALUE,
                                            "'%s' for %s" % \
                                            (new_value, property_name))
                if property_name == "FirewallBackend":
                    if new_value not in config.FIREWALL_BACKEND_VALUES:
                        raise FirewallError(errors.INVALID_VALUE,
                                            "'%s' for %s" % \
                                            (new_value, property_name))
                if property_name == "FlushAllOnReload":
                    if new_value.lower() not in ["yes", "true", "no", "false"]:
                        raise FirewallError(errors.INVALID_VALUE,
                                            "'%s' for %s" % \
                                            (new_value, property_name))
                if property_name == "RFC3964_IPv4":
                    if new_value.lower() not in ["yes", "true", "no", "false"]:
                        raise FirewallError(errors.INVALID_VALUE,
                                            "'%s' for %s" % \
                                            (new_value, property_name))
                if property_name == "AllowZoneDrifting":
                    if new_value.lower() not in ["yes", "true", "no", "false"]:
                        raise FirewallError(errors.INVALID_VALUE,
                                            "'%s' for %s" % \
                                            (new_value, property_name))

                self.config.get_firewalld_conf().set(property_name, new_value)
                self.config.get_firewalld_conf().write()
                self.PropertiesChanged(interface_name,
                                       { property_name: new_value }, [ ])
            elif property_name in ["MinimalMark", "AutomaticHelpers"]:
                # deprecated fields. Ignore setting them.
                pass
            else:
                raise dbus.exceptions.DBusException(
                    "org.freedesktop.DBus.Error.InvalidArgs: "
                    "Property '%s' does not exist" % property_name)
        elif interface_name in [ config.dbus.DBUS_INTERFACE_CONFIG_DIRECT,
                                 config.dbus.DBUS_INTERFACE_CONFIG_POLICIES ]:
            raise dbus.exceptions.DBusException(
                "org.freedesktop.DBus.Error.InvalidArgs: "
                "Property '%s' does not exist" % property_name)
        else:
            raise dbus.exceptions.DBusException(
                "org.freedesktop.DBus.Error.UnknownInterface: "
                "Interface '%s' does not exist" % interface_name)

    @dbus.service.signal(dbus.PROPERTIES_IFACE, signature='sa{sv}as')
    def PropertiesChanged(self, interface_name, changed_properties,
                          invalidated_properties):
        interface_name = dbus_to_python(interface_name, str)
        changed_properties = dbus_to_python(changed_properties)
        invalidated_properties = dbus_to_python(invalidated_properties)
        log.debug1("config.PropertiesChanged('%s', '%s', '%s')",
                   interface_name, changed_properties, invalidated_properties)

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_INFO)
    @dbus_service_method(dbus.INTROSPECTABLE_IFACE, out_signature='s')
    @dbus_handle_exceptions
    def Introspect(self, sender=None): # pylint: disable=W0613
        log.debug2("config.Introspect()")

        data = super(FirewallDConfig, self).Introspect(self.path,
                                                       self.busname.get_bus())
        return dbus_introspection_add_properties(
            self, data, config.dbus.DBUS_INTERFACE_CONFIG)

    # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # #

    # policies

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_POLICIES,
                         out_signature=LockdownWhitelist.DBUS_SIGNATURE)
    @dbus_handle_exceptions
    def getLockdownWhitelist(self, sender=None): # pylint: disable=W0613
        log.debug1("config.policies.getLockdownWhitelist()")
        return self.config.get_policies().lockdown_whitelist.export_config()

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_POLICIES,
                         in_signature=LockdownWhitelist.DBUS_SIGNATURE)
    @dbus_handle_exceptions
    def setLockdownWhitelist(self, settings, sender=None): # pylint: disable=W0613
        log.debug1("config.policies.setLockdownWhitelist(...)")
        settings = dbus_to_python(settings)
        self.config.get_policies().lockdown_whitelist.import_config(settings)
        self.config.get_policies().lockdown_whitelist.write()
        self.LockdownWhitelistUpdated()

    @dbus.service.signal(config.dbus.DBUS_INTERFACE_CONFIG_POLICIES)
    @dbus_handle_exceptions
    def LockdownWhitelistUpdated(self):
        log.debug1("config.policies.LockdownWhitelistUpdated()")

    # command

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_POLICIES,
                         in_signature='s')
    @dbus_handle_exceptions
    def addLockdownWhitelistCommand(self, command, sender=None):
        command = dbus_to_python(command)
        log.debug1("config.policies.addLockdownWhitelistCommand('%s')", command)
        self.accessCheck(sender)
        settings = list(self.getLockdownWhitelist())
        if command in settings[0]:
            raise FirewallError(errors.ALREADY_ENABLED, command)
        settings[0].append(command)
        self.setLockdownWhitelist(settings)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_POLICIES,
                         in_signature='s')
    @dbus_handle_exceptions
    def removeLockdownWhitelistCommand(self, command, sender=None):
        command = dbus_to_python(command)
        log.debug1("config.policies.removeLockdownWhitelistCommand('%s')",
                   command)
        self.accessCheck(sender)
        settings = list(self.getLockdownWhitelist())
        if command not in settings[0]:
            raise FirewallError(errors.NOT_ENABLED, command)
        settings[0].remove(command)
        self.setLockdownWhitelist(settings)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_POLICIES,
                         in_signature='s', out_signature='b')
    @dbus_handle_exceptions
    def queryLockdownWhitelistCommand(self, command, sender=None): # pylint: disable=W0613
        command = dbus_to_python(command)
        log.debug1("config.policies.queryLockdownWhitelistCommand('%s')",
                   command)
        return command in self.getLockdownWhitelist()[0]

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_POLICIES,
                         out_signature='as')
    @dbus_handle_exceptions
    def getLockdownWhitelistCommands(self, sender=None): # pylint: disable=W0613
        log.debug1("config.policies.getLockdownWhitelistCommands()")
        return self.getLockdownWhitelist()[0]

    # context

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_POLICIES,
                         in_signature='s')
    @dbus_handle_exceptions
    def addLockdownWhitelistContext(self, context, sender=None):
        context = dbus_to_python(context)
        log.debug1("config.policies.addLockdownWhitelistContext('%s')", context)
        self.accessCheck(sender)
        settings = list(self.getLockdownWhitelist())
        if context in settings[1]:
            raise FirewallError(errors.ALREADY_ENABLED, context)
        settings[1].append(context)
        self.setLockdownWhitelist(settings)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_POLICIES,
                         in_signature='s')
    @dbus_handle_exceptions
    def removeLockdownWhitelistContext(self, context, sender=None):
        context = dbus_to_python(context)
        log.debug1("config.policies.removeLockdownWhitelistContext('%s')",
                   context)
        self.accessCheck(sender)
        settings = list(self.getLockdownWhitelist())
        if context not in settings[1]:
            raise FirewallError(errors.NOT_ENABLED, context)
        settings[1].remove(context)
        self.setLockdownWhitelist(settings)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_POLICIES,
                         in_signature='s', out_signature='b')
    @dbus_handle_exceptions
    def queryLockdownWhitelistContext(self, context, sender=None): # pylint: disable=W0613
        context = dbus_to_python(context)
        log.debug1("config.policies.queryLockdownWhitelistContext('%s')",
                   context)
        return context in self.getLockdownWhitelist()[1]

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_POLICIES,
                         out_signature='as')
    @dbus_handle_exceptions
    def getLockdownWhitelistContexts(self, sender=None): # pylint: disable=W0613
        log.debug1("config.policies.getLockdownWhitelistContexts()")
        return self.getLockdownWhitelist()[1]

    # user

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_POLICIES,
                         in_signature='s')
    @dbus_handle_exceptions
    def addLockdownWhitelistUser(self, user, sender=None):
        user = dbus_to_python(user)
        log.debug1("config.policies.addLockdownWhitelistUser('%s')", user)
        self.accessCheck(sender)
        settings = list(self.getLockdownWhitelist())
        if user in settings[2]:
            raise FirewallError(errors.ALREADY_ENABLED, user)
        settings[2].append(user)
        self.setLockdownWhitelist(settings)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_POLICIES,
                         in_signature='s')
    @dbus_handle_exceptions
    def removeLockdownWhitelistUser(self, user, sender=None):
        user = dbus_to_python(user)
        log.debug1("config.policies.removeLockdownWhitelistUser('%s')", user)
        self.accessCheck(sender)
        settings = list(self.getLockdownWhitelist())
        if user not in settings[2]:
            raise FirewallError(errors.NOT_ENABLED, user)
        settings[2].remove(user)
        self.setLockdownWhitelist(settings)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_POLICIES,
                         in_signature='s', out_signature='b')
    @dbus_handle_exceptions
    def queryLockdownWhitelistUser(self, user, sender=None): # pylint: disable=W0613
        user = dbus_to_python(user)
        log.debug1("config.policies.queryLockdownWhitelistUser('%s')", user)
        return user in self.getLockdownWhitelist()[2]

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_POLICIES,
                         out_signature='as')
    @dbus_handle_exceptions
    def getLockdownWhitelistUsers(self, sender=None): # pylint: disable=W0613
        log.debug1("config.policies.getLockdownWhitelistUsers()")
        return self.getLockdownWhitelist()[2]

    # uid

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_POLICIES,
                         in_signature='i')
    @dbus_handle_exceptions
    def addLockdownWhitelistUid(self, uid, sender=None):
        uid = dbus_to_python(uid)
        log.debug1("config.policies.addLockdownWhitelistUid(%d)", uid)
        self.accessCheck(sender)
        settings = list(self.getLockdownWhitelist())
        if uid in settings[3]:
            raise FirewallError(errors.ALREADY_ENABLED, uid)
        settings[3].append(uid)
        self.setLockdownWhitelist(settings)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_POLICIES,
                         in_signature='i')
    @dbus_handle_exceptions
    def removeLockdownWhitelistUid(self, uid, sender=None):
        uid = dbus_to_python(uid)
        log.debug1("config.policies.removeLockdownWhitelistUid(%d)", uid)
        self.accessCheck(sender)
        settings = list(self.getLockdownWhitelist())
        if uid not in settings[3]:
            raise FirewallError(errors.NOT_ENABLED, uid)
        settings[3].remove(uid)
        self.setLockdownWhitelist(settings)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_POLICIES,
                         in_signature='i', out_signature='b')
    @dbus_handle_exceptions
    def queryLockdownWhitelistUid(self, uid, sender=None): # pylint: disable=W0613
        uid = dbus_to_python(uid)
        log.debug1("config.policies.queryLockdownWhitelistUid(%d)", uid)
        return uid in self.getLockdownWhitelist()[3]

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_POLICIES,
                         out_signature='ai')
    @dbus_handle_exceptions
    def getLockdownWhitelistUids(self, sender=None): # pylint: disable=W0613
        log.debug1("config.policies.getLockdownWhitelistUids()")
        return self.getLockdownWhitelist()[3]
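
    # Illustrative client call (not part of this module), assuming
    # firewalld's usual well-known bus name, config object path and
    # policies interface:
    #
    #   dbus-send --system --print-reply \
    #     --dest=org.fedoraproject.FirewallD1 \
    #     /org/fedoraproject/FirewallD1/config \
    #     org.fedoraproject.FirewallD1.config.policies.addLockdownWhitelistCommand \
    #     string:"/usr/bin/example-tool"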

    # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # #

    # I P S E T S

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG, out_signature='ao')
    @dbus_handle_exceptions
    def listIPSets(self, sender=None): # pylint: disable=W0613
        """list ipsets objects paths
        """
        log.debug1("config.listIPSets()")
        return self.ipsets

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG, out_signature='as')
    @dbus_handle_exceptions
    def getIPSetNames(self, sender=None): # pylint: disable=W0613
        """get ipset names
        """
        log.debug1("config.getIPSetNames()")
        ipsets = [ ]
        for obj in self.ipsets:
            ipsets.append(obj.obj.name)
        return sorted(ipsets)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG, in_signature='s',
                         out_signature='o')
    @dbus_handle_exceptions
    def getIPSetByName(self, ipset, sender=None): # pylint: disable=W0613
        """object path of ipset with given name
        """
        ipset = dbus_to_python(ipset, str)
        log.debug1("config.getIPSetByName('%s')", ipset)
        for obj in self.ipsets:
            if obj.obj.name == ipset:
                return obj
        raise FirewallError(errors.INVALID_IPSET, ipset)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG,
                         in_signature='s'+IPSet.DBUS_SIGNATURE,
                         out_signature='o')
    @dbus_handle_exceptions
    def addIPSet(self, ipset, settings, sender=None):
        """add ipset with given name and settings
        """
        ipset = dbus_to_python(ipset, str)
        settings = dbus_to_python(settings)
        log.debug1("config.addIPSet('%s')", ipset)
        self.accessCheck(sender)
        obj = self.config.new_ipset(ipset, settings)
        config_ipset = self._addIPSet(obj)
        return config_ipset

    @dbus.service.signal(config.dbus.DBUS_INTERFACE_CONFIG, signature='s')
    @dbus_handle_exceptions
    def IPSetAdded(self, ipset):
        ipset = dbus_to_python(ipset, str)
        log.debug1("config.IPSetAdded('%s')" % (ipset))

    # I C M P T Y P E S

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG, out_signature='ao')
    @dbus_handle_exceptions
    def listIcmpTypes(self, sender=None): # pylint: disable=W0613
        """list icmptypes objects paths
        """
        log.debug1("config.listIcmpTypes()")
        return self.icmptypes

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG, out_signature='as')
    @dbus_handle_exceptions
    def getIcmpTypeNames(self, sender=None): # pylint: disable=W0613
        """get icmptype names
        """
        log.debug1("config.getIcmpTypeNames()")
        icmptypes = [ ]
        for obj in self.icmptypes:
            icmptypes.append(obj.obj.name)
        return sorted(icmptypes)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG, in_signature='s',
                         out_signature='o')
    @dbus_handle_exceptions
    def getIcmpTypeByName(self, icmptype, sender=None): # pylint: disable=W0613
        """object path of icmptype with given name
        """
        icmptype = dbus_to_python(icmptype, str)
        log.debug1("config.getIcmpTypeByName('%s')", icmptype)
        for obj in self.icmptypes:
            if obj.obj.name == icmptype:
                return obj
        raise FirewallError(errors.INVALID_ICMPTYPE, icmptype)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG,
                         in_signature='s'+IcmpType.DBUS_SIGNATURE,
                         out_signature='o')
    @dbus_handle_exceptions
    def addIcmpType(self, icmptype, settings, sender=None):
        """add icmptype with given name and settings
        """
        icmptype = dbus_to_python(icmptype, str)
        settings = dbus_to_python(settings)
        log.debug1("config.addIcmpType('%s')", icmptype)
        self.accessCheck(sender)
        obj = self.config.new_icmptype(icmptype, settings)
        config_icmptype = self._addIcmpType(obj)
        return config_icmptype

    @dbus.service.signal(config.dbus.DBUS_INTERFACE_CONFIG, signature='s')
    @dbus_handle_exceptions
    def IcmpTypeAdded(self, icmptype):
        log.debug1("config.IcmpTypeAdded('%s')" % (icmptype))

    # S E R V I C E S

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG, out_signature='ao')
    @dbus_handle_exceptions
    def listServices(self, sender=None): # pylint: disable=W0613
        """list services objects paths
        """
        log.debug1("config.listServices()")
        return self.services

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG, out_signature='as')
    @dbus_handle_exceptions
    def getServiceNames(self, sender=None): # pylint: disable=W0613
        """get service names
        """
        log.debug1("config.getServiceNames()")
        services = [ ]
        for obj in self.services:
            services.append(obj.obj.name)
        return sorted(services)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG, in_signature='s',
                         out_signature='o')
    @dbus_handle_exceptions
    def getServiceByName(self, service, sender=None): # pylint: disable=W0613
        """object path of service with given name
        """
        service = dbus_to_python(service, str)
        log.debug1("config.getServiceByName('%s')", service)
        for obj in self.services:
            if obj.obj.name == service:
                return obj
        raise FirewallError(errors.INVALID_SERVICE, service)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG,
                         in_signature='s(sssa(ss)asa{ss}asa(ss))',
                         out_signature='o')
    @dbus_handle_exceptions
    def addService(self, service, settings, sender=None):
        """add service with given name and settings
        """
        service = dbus_to_python(service, str)
        settings = dbus_to_python(settings)
        log.debug1("config.addService('%s')", service)
        self.accessCheck(sender)
        obj = self.config.new_service(service, settings)
        config_service = self._addService(obj)
        return config_service
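
    # addService2() below is the dict-based (a{sv}) counterpart of
    # addService(); the fixed-struct signature above is presumably kept for
    # compatibility with older clients.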

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG,
                         in_signature='sa{sv}',
                         out_signature='o')
    @dbus_handle_exceptions
    def addService2(self, service, settings, sender=None):
        """add service with given name and settings
        """
        service = dbus_to_python(service, str)
        settings = dbus_to_python(settings)
        log.debug1("config.addService2('%s')", service)
        self.accessCheck(sender)
        obj = self.config.new_service_dict(service, settings)
        config_service = self._addService(obj)
        return config_service

    @dbus.service.signal(config.dbus.DBUS_INTERFACE_CONFIG, signature='s')
    @dbus_handle_exceptions
    def ServiceAdded(self, service):
        log.debug1("config.ServiceAdded('%s')" % (service))

    # Z O N E S

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG, out_signature='ao')
    @dbus_handle_exceptions
    def listZones(self, sender=None): # pylint: disable=W0613
        """list zones objects paths
        """
        log.debug1("config.listZones()")
        return self.zones

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG, out_signature='as')
    @dbus_handle_exceptions
    def getZoneNames(self, sender=None): # pylint: disable=W0613
        """get zone names
        """
        log.debug1("config.getZoneNames()")
        zones = [ ]
        for obj in self.zones:
            zones.append(obj.obj.name)
        return sorted(zones)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG, in_signature='s',
                         out_signature='o')
    @dbus_handle_exceptions
    def getZoneByName(self, zone, sender=None): # pylint: disable=W0613
        """object path of zone with given name
        """
        zone = dbus_to_python(zone, str)
        log.debug1("config.getZoneByName('%s')", zone)
        for obj in self.zones:
            if obj.obj.name == zone:
                return obj
        raise FirewallError(errors.INVALID_ZONE, zone)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG, in_signature='s',
                         out_signature='s')
    @dbus_handle_exceptions
    def getZoneOfInterface(self, iface, sender=None): # pylint: disable=W0613
        """name of zone the given interface belongs to
        """
        iface = dbus_to_python(iface, str)
        log.debug1("config.getZoneOfInterface('%s')", iface)
        ret = []
        for obj in self.zones:
            if iface in obj.obj.interfaces:
                ret.append(obj.obj.name)
        if len(ret) > 1:
            # Even though it shouldn't happen, it is possible for the same
            # interface to appear in several zone XML files
            return " ".join(ret) + \
                "  (ERROR: interface '%s' is in %s zone XML files, can be only in one)" % \
                (iface, len(ret))
        return ret[0] if ret else ""

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG, in_signature='s',
                         out_signature='s')
    @dbus_handle_exceptions
    def getZoneOfSource(self, source, sender=None): # pylint: disable=W0613
        """name of zone the given source belongs to
        """
        source = dbus_to_python(source, str)
        log.debug1("config.getZoneOfSource('%s')", source)
        ret = []
        for obj in self.zones:
            if source in obj.obj.sources:
                ret.append(obj.obj.name)
        if len(ret) > 1:
            # Even though it shouldn't happen, it is possible for the same
            # source to appear in several zone XML files
            return " ".join(ret) + \
                "  (ERROR: source '%s' is in %s zone XML files, can be only in one)" % \
                (source, len(ret))
        return ret[0] if ret else ""

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG,
                         in_signature="s(sssbsasa(ss)asba(ssss)asasasasa(ss)b)",
                         out_signature='o')
    @dbus_handle_exceptions
    def addZone(self, zone, settings, sender=None):
        """add zone with given name and settings
        """
        zone = dbus_to_python(zone, str)
        settings = dbus_to_python(settings)
        log.debug1("config.addZone('%s')", zone)
        self.accessCheck(sender)
        if settings[4] == "default":
            # convert to list, fix target, convert back to tuple
            _settings = list(settings)
            _settings[4] = DEFAULT_ZONE_TARGET
            settings = tuple(_settings)
        obj = self.config.new_zone(zone, settings)
        config_zone = self._addZone(obj)
        return config_zone
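
    # addZone2() below is the dict-based (a{sv}) counterpart of addZone();
    # like addZone() it maps the "default" target to DEFAULT_ZONE_TARGET
    # before creating the zone object.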

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG,
                         in_signature="sa{sv}",
                         out_signature='o')
    @dbus_handle_exceptions
    def addZone2(self, zone, settings, sender=None):
        """add zone with given name and settings
        """
        zone = dbus_to_python(zone, str)
        settings = dbus_to_python(settings)
        log.debug1("config.addZone('%s')", zone)
        self.accessCheck(sender)
        if "target" in settings and settings["target"] == "default":
            settings["target"] = DEFAULT_ZONE_TARGET
        obj = self.config.new_zone_dict(zone, settings)
        config_zone = self._addZone(obj)
        return config_zone

    @dbus.service.signal(config.dbus.DBUS_INTERFACE_CONFIG, signature='s')
    @dbus_handle_exceptions
    def ZoneAdded(self, zone):
        log.debug1("config.ZoneAdded('%s')" % (zone))

    # policies

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG, out_signature='ao')
    @dbus_handle_exceptions
    def listPolicies(self, sender=None):
        """list policies objects paths
        """
        log.debug1("config.listPolicies()")
        return self.policy_objects

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG, out_signature='as')
    @dbus_handle_exceptions
    def getPolicyNames(self, sender=None):
        """get policy names
        """
        log.debug1("config.getPolicyNames()")
        policies = [ ]
        for obj in self.policy_objects:
            policies.append(obj.obj.name)
        return sorted(policies)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG, in_signature='s',
                         out_signature='o')
    @dbus_handle_exceptions
    def getPolicyByName(self, policy, sender=None):
        """object path of policy with given name
        """
        policy = dbus_to_python(policy, str)
        log.debug1("config.getPolicyByName('%s')", policy)
        for obj in self.policy_objects:
            if obj.obj.name == policy:
                return obj
        raise FirewallError(errors.INVALID_POLICY, policy)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG,
                         in_signature="sa{sv}",
                         out_signature='o')
    @dbus_handle_exceptions
    def addPolicy(self, policy, settings, sender=None):
        """add policy with given name and settings
        """
        policy = dbus_to_python(policy, str)
        settings = dbus_to_python(settings)
        log.debug1("config.addPolicy('%s')", policy)
        self.accessCheck(sender)
        obj = self.config.new_policy_object_dict(policy, settings)
        config_policy = self._addPolicy(obj)
        return config_policy

    @dbus.service.signal(config.dbus.DBUS_INTERFACE_CONFIG, signature='s')
    @dbus_handle_exceptions
    def PolicyAdded(self, policy):
        log.debug1("config.PolicyAdded('%s')" % (policy))

    # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # #

    # H E L P E R S

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG, out_signature='ao')
    @dbus_handle_exceptions
    def listHelpers(self, sender=None): # pylint: disable=W0613
        """list helpers objects paths
        """
        log.debug1("config.listHelpers()")
        return self.helpers

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG, out_signature='as')
    @dbus_handle_exceptions
    def getHelperNames(self, sender=None): # pylint: disable=W0613
        """get helper names
        """
        log.debug1("config.getHelperNames()")
        helpers = [ ]
        for obj in self.helpers:
            helpers.append(obj.obj.name)
        return sorted(helpers)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG, in_signature='s',
                         out_signature='o')
    @dbus_handle_exceptions
    def getHelperByName(self, helper, sender=None): # pylint: disable=W0613
        """object path of helper with given name
        """
        helper = dbus_to_python(helper, str)
        log.debug1("config.getHelperByName('%s')", helper)
        for obj in self.helpers:
            if obj.obj.name == helper:
                return obj
        raise FirewallError(errors.INVALID_HELPER, helper)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG,
                         in_signature='s'+Helper.DBUS_SIGNATURE,
                         out_signature='o')
    @dbus_handle_exceptions
    def addHelper(self, helper, settings, sender=None):
        """add helper with given name and settings
        """
        helper = dbus_to_python(helper, str)
        settings = dbus_to_python(settings)
        log.debug1("config.addHelper('%s')", helper)
        self.accessCheck(sender)
        obj = self.config.new_helper(helper, settings)
        config_helper = self._addHelper(obj)
        return config_helper

    @dbus.service.signal(config.dbus.DBUS_INTERFACE_CONFIG, signature='s')
    @dbus_handle_exceptions
    def HelperAdded(self, helper):
        helper = dbus_to_python(helper, str)
        log.debug1("config.HelperAdded('%s')" % (helper))

    # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # #
    # DIRECT

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_DIRECT,
                         out_signature=Direct.DBUS_SIGNATURE)
    @dbus_handle_exceptions
    def getSettings(self, sender=None): # pylint: disable=W0613
        # returns the direct configuration as (chains, rules, passthroughs)
        log.debug1("config.direct.getSettings()")
        return self.config.get_direct().export_config()

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_DIRECT,
                         in_signature=Direct.DBUS_SIGNATURE)
    @dbus_handle_exceptions
    def update(self, settings, sender=None): # pylint: disable=W0613
        # replaces the direct configuration with the given settings
        log.debug1("config.direct.update()")
        settings = dbus_to_python(settings)
        self.config.get_direct().import_config(settings)
        self.config.get_direct().write()
        self.Updated()

    @dbus.service.signal(config.dbus.DBUS_INTERFACE_CONFIG_DIRECT)
    @dbus_handle_exceptions
    def Updated(self):
        log.debug1("config.direct.Updated()")

    # chain

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_DIRECT,
                         in_signature='sss')
    @dbus_handle_exceptions
    def addChain(self, ipv, table, chain, sender=None):
        ipv = dbus_to_python(ipv)
        table = dbus_to_python(table)
        chain = dbus_to_python(chain)
        log.debug1("config.direct.addChain('%s', '%s', '%s')" % \
                   (ipv, table, chain))
        self.accessCheck(sender)
        idx = tuple((ipv, table, chain))
        settings = list(self.getSettings())
        if idx in settings[0]:
            raise FirewallError(errors.ALREADY_ENABLED,
                                "chain '%s' already is in '%s:%s'" % \
                                (chain, ipv, table))
        settings[0].append(idx)
        self.update(settings)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_DIRECT,
                         in_signature='sss')
    @dbus_handle_exceptions
    def removeChain(self, ipv, table, chain, sender=None):
        ipv = dbus_to_python(ipv)
        table = dbus_to_python(table)
        chain = dbus_to_python(chain)
        log.debug1("config.direct.removeChain('%s', '%s', '%s')" % \
                   (ipv, table, chain))
        self.accessCheck(sender)
        idx = tuple((ipv, table, chain))
        settings = list(self.getSettings())
        if idx not in settings[0]:
            raise FirewallError(errors.NOT_ENABLED,
                                "chain '%s' is not in '%s:%s'" % (chain, ipv,
                                                                  table))
        settings[0].remove(idx)
        self.update(settings)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_DIRECT,
                         in_signature='sss', out_signature='b')
    @dbus_handle_exceptions
    def queryChain(self, ipv, table, chain, sender=None): # pylint: disable=W0613
        ipv = dbus_to_python(ipv)
        table = dbus_to_python(table)
        chain = dbus_to_python(chain)
        log.debug1("config.direct.queryChain('%s', '%s', '%s')" % \
                   (ipv, table, chain))
        idx = tuple((ipv, table, chain))
        return idx in self.getSettings()[0]

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_DIRECT,
                         in_signature='ss', out_signature='as')
    @dbus_handle_exceptions
    def getChains(self, ipv, table, sender=None): # pylint: disable=W0613
        ipv = dbus_to_python(ipv)
        table = dbus_to_python(table)
        log.debug1("config.direct.getChains('%s', '%s')" % (ipv, table))
        ret = [ ]
        for idx in self.getSettings()[0]:
            if idx[0] == ipv and idx[1] == table:
                ret.append(idx[2])
        return ret

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_DIRECT,
                         in_signature='', out_signature='a(sss)')
    @dbus_handle_exceptions
    def getAllChains(self, sender=None): # pylint: disable=W0613
        log.debug1("config.direct.getAllChains()")
        return self.getSettings()[0]

    # rule

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_DIRECT,
                         in_signature='sssias')
    @dbus_handle_exceptions
    def addRule(self, ipv, table, chain, priority, args, sender=None): # pylint: disable=R0913
        ipv = dbus_to_python(ipv)
        table = dbus_to_python(table)
        chain = dbus_to_python(chain)
        priority = dbus_to_python(priority)
        args = dbus_to_python(args)
        log.debug1("config.direct.addRule('%s', '%s', '%s', %d, '%s')" % \
                   (ipv, table, chain, priority, "','".join(args)))
        self.accessCheck(sender)
        idx = (ipv, table, chain, priority, args)
        settings = list(self.getSettings())
        if idx in settings[1]:
            raise FirewallError(errors.ALREADY_ENABLED,
                                "rule '%s' already is in '%s:%s:%s'" % \
                                (args, ipv, table, chain))
        settings[1].append(idx)
        self.update(tuple(settings))

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_DIRECT,
                         in_signature='sssias')
    @dbus_handle_exceptions
    def removeRule(self, ipv, table, chain, priority, args, sender=None): # pylint: disable=R0913
        ipv = dbus_to_python(ipv)
        table = dbus_to_python(table)
        chain = dbus_to_python(chain)
        priority = dbus_to_python(priority)
        args = dbus_to_python(args)
        log.debug1("config.direct.removeRule('%s', '%s', '%s', %d, '%s')" % \
                   (ipv, table, chain, priority, "','".join(args)))
        self.accessCheck(sender)
        idx = (ipv, table, chain, priority, args)
        settings = list(self.getSettings())
        if idx not in settings[1]:
            raise FirewallError(errors.NOT_ENABLED,
                                "rule '%s' is not in '%s:%s:%s'" % \
                                (args, ipv, table, chain))
        settings[1].remove(idx)
        self.update(tuple(settings))

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_DIRECT,
                         in_signature='sssias', out_signature='b')
    @dbus_handle_exceptions
    def queryRule(self, ipv, table, chain, priority, args, sender=None): # pylint: disable=W0613,R0913
        ipv = dbus_to_python(ipv)
        table = dbus_to_python(table)
        chain = dbus_to_python(chain)
        priority = dbus_to_python(priority)
        args = dbus_to_python(args)
        log.debug1("config.direct.queryRule('%s', '%s', '%s', %d, '%s')" % \
                   (ipv, table, chain, priority, "','".join(args)))
        idx = (ipv, table, chain, priority, args)
        return idx in self.getSettings()[1]

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_DIRECT,
                         in_signature='sss')
    @dbus_handle_exceptions
    def removeRules(self, ipv, table, chain, sender=None):
        ipv = dbus_to_python(ipv)
        table = dbus_to_python(table)
        chain = dbus_to_python(chain)
        log.debug1("config.direct.removeRules('%s', '%s', '%s')" % \
                   (ipv, table, chain, ))
        self.accessCheck(sender)
        settings = list(self.getSettings())
        for rule in settings[1][:]:
            if (ipv, table, chain) == (rule[0], rule[1], rule[2]):
                settings[1].remove(rule)
        self.update(tuple(settings))

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_DIRECT,
                         in_signature='sss', out_signature='a(ias)')
    @dbus_handle_exceptions
    def getRules(self, ipv, table, chain, sender=None): # pylint: disable=W0613
        ipv = dbus_to_python(ipv)
        table = dbus_to_python(table)
        chain = dbus_to_python(chain)
        log.debug1("config.direct.getRules('%s', '%s', '%s')" % \
                   (ipv, table, chain))
        ret = [ ]
        for idx in self.getSettings()[1]:
            if idx[0] == ipv and idx[1] == table and idx[2] == chain:
                ret.append((idx[3], idx[4]))
        return ret

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_DIRECT,
                         in_signature='', out_signature='a(sssias)')
    @dbus_handle_exceptions
    def getAllRules(self, sender=None): # pylint: disable=W0613
        log.debug1("config.direct.getAllRules()")
        return self.getSettings()[1]

    # passthrough

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_DIRECT,
                         in_signature='sas')
    @dbus_handle_exceptions
    def addPassthrough(self, ipv, args, sender=None):
        ipv = dbus_to_python(ipv)
        args = dbus_to_python(args)
        log.debug1("config.direct.addPassthrough('%s', '%s')" % \
                   (ipv, "','".join(args)))
        self.accessCheck(sender)
        idx = (ipv, args)
        settings = list(self.getSettings())
        if idx in settings[2]:
            raise FirewallError(errors.ALREADY_ENABLED,
                                "passthrough '%s', '%s'" % (ipv, args))
        settings[2].append(idx)
        self.update(settings)


    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_DIRECT,
                         in_signature='sas')
    @dbus_handle_exceptions
    def removePassthrough(self, ipv, args, sender=None):
        ipv = dbus_to_python(ipv)
        args = dbus_to_python(args)
        log.debug1("config.direct.removePassthrough('%s', '%s')" % \
                   (ipv, "','".join(args)))
        self.accessCheck(sender)
        idx = (ipv, args)
        settings = list(self.getSettings())
        if idx not in settings[2]:
            raise FirewallError(errors.NOT_ENABLED,
                                "passthrough '%s', '%s'" % (ipv, args))
        settings[2].remove(idx)
        self.update(settings)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_DIRECT,
                         in_signature='sas', out_signature='b')
    @dbus_handle_exceptions
    def queryPassthrough(self, ipv, args, sender=None): # pylint: disable=W0613
        ipv = dbus_to_python(ipv)
        args = dbus_to_python(args)
        log.debug1("config.direct.queryPassthrough('%s', '%s')" % \
                   (ipv, "','".join(args)))
        idx = (ipv, args)
        return idx in self.getSettings()[2]

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_DIRECT,
                         in_signature='s', out_signature='aas')
    @dbus_handle_exceptions
    def getPassthroughs(self, ipv, sender=None): # pylint: disable=W0613
        ipv = dbus_to_python(ipv)
        log.debug1("config.direct.getPassthroughs('%s')" % (ipv))
        ret = [ ]
        for idx in self.getSettings()[2]:
            if idx[0] == ipv:
                ret.append(idx[1])
        return ret

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_DIRECT,
                         out_signature='a(sas)')
    @dbus_handle_exceptions
    def getAllPassthroughs(self, sender=None): # pylint: disable=W0613
        log.debug1("config.direct.getAllPassthroughs()")
        return self.getSettings()[2]
site-packages/firewall/server/config_helper.py
# -*- coding: utf-8 -*-
#
# Copyright (C) 2010-2016 Red Hat, Inc.
#
# Authors:
# Thomas Woerner <twoerner@redhat.com>
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program.  If not, see <http://www.gnu.org/licenses/>.

# force use of pygobject3 in python-slip
from gi.repository import GObject
import sys
sys.modules['gobject'] = GObject

import dbus
import dbus.service
import slip.dbus
import slip.dbus.service

from firewall import config
from firewall.dbus_utils import dbus_to_python, \
    dbus_introspection_prepare_properties, \
    dbus_introspection_add_properties
from firewall.core.io.helper import Helper
from firewall.core.logger import log
from firewall.server.decorators import handle_exceptions, \
    dbus_handle_exceptions, dbus_service_method
from firewall import errors
from firewall.errors import FirewallError

############################################################################
#
# class FirewallDConfigHelper
#
############################################################################

class FirewallDConfigHelper(slip.dbus.service.Object):
    """FirewallD main class"""

    persistent = True
    """ Make FirewallD persistent. """
    default_polkit_auth_required = config.dbus.PK_ACTION_CONFIG
    """ Use PK_ACTION_INFO as a default """

    @handle_exceptions
    def __init__(self, parent, conf, helper, item_id, *args, **kwargs):
        super(FirewallDConfigHelper, self).__init__(*args, **kwargs)
        self.parent = parent
        self.config = conf
        self.obj = helper
        self.item_id = item_id
        self.busname = args[0]
        self.path = args[1]
        self._log_prefix = "config.helper.%d" % self.item_id
        dbus_introspection_prepare_properties(
            self, config.dbus.DBUS_INTERFACE_CONFIG_HELPER)

    @dbus_handle_exceptions
    def __del__(self):
        pass

    @dbus_handle_exceptions
    def unregister(self):
        self.remove_from_connection()

    # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # #

    # P R O P E R T I E S

    @dbus_handle_exceptions
    def _get_property(self, property_name):
        if property_name == "name":
            return dbus.String(self.obj.name)
        elif property_name == "filename":
            return dbus.String(self.obj.filename)
        elif property_name == "path":
            return dbus.String(self.obj.path)
        elif property_name == "default":
            return dbus.Boolean(self.obj.default)
        elif property_name == "builtin":
            return dbus.Boolean(self.obj.builtin)
        else:
            raise dbus.exceptions.DBusException(
                "org.freedesktop.DBus.Error.InvalidArgs: "
                "Property '%s' does not exist" % property_name)

    @dbus_service_method(dbus.PROPERTIES_IFACE, in_signature='ss',
                         out_signature='v')
    @dbus_handle_exceptions
    def Get(self, interface_name, property_name, sender=None): # pylint: disable=W0613
        # get a property
        interface_name = dbus_to_python(interface_name, str)
        property_name = dbus_to_python(property_name, str)
        log.debug1("%s.Get('%s', '%s')", self._log_prefix,
                   interface_name, property_name)

        if interface_name != config.dbus.DBUS_INTERFACE_CONFIG_HELPER:
            raise dbus.exceptions.DBusException(
                "org.freedesktop.DBus.Error.UnknownInterface: "
                "Interface '%s' does not exist" % interface_name)

        return self._get_property(property_name)

    @dbus_service_method(dbus.PROPERTIES_IFACE, in_signature='s',
                         out_signature='a{sv}')
    @dbus_handle_exceptions
    def GetAll(self, interface_name, sender=None): # pylint: disable=W0613
        interface_name = dbus_to_python(interface_name, str)
        log.debug1("%s.GetAll('%s')", self._log_prefix, interface_name)

        if interface_name != config.dbus.DBUS_INTERFACE_CONFIG_HELPER:
            raise dbus.exceptions.DBusException(
                "org.freedesktop.DBus.Error.UnknownInterface: "
                "Interface '%s' does not exist" % interface_name)

        ret = { }
        for x in [ "name", "filename", "path", "default", "builtin" ]:
            ret[x] = self._get_property(x)
        return dbus.Dictionary(ret, signature="sv")
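
    # Get() and GetAll() above (together with Set() below) implement the
    # standard org.freedesktop.DBus.Properties interface for this object.
    # A minimal client-side sketch, assuming "helper_obj" is a proxy for
    # this object obtained from the system bus:
    #
    #     props = dbus.Interface(helper_obj, dbus.PROPERTIES_IFACE)
    #     name = props.Get(config.dbus.DBUS_INTERFACE_CONFIG_HELPER, "name")
    #     all_props = props.GetAll(config.dbus.DBUS_INTERFACE_CONFIG_HELPER)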

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_CONFIG)
    @dbus_service_method(dbus.PROPERTIES_IFACE, in_signature='ssv')
    @dbus_handle_exceptions
    def Set(self, interface_name, property_name, new_value, sender=None):
        interface_name = dbus_to_python(interface_name, str)
        property_name = dbus_to_python(property_name, str)
        new_value = dbus_to_python(new_value)
        log.debug1("%s.Set('%s', '%s', '%s')", self._log_prefix,
                   interface_name, property_name, new_value)
        self.parent.accessCheck(sender)

        if interface_name != config.dbus.DBUS_INTERFACE_CONFIG_HELPER:
            raise dbus.exceptions.DBusException(
                "org.freedesktop.DBus.Error.UnknownInterface: "
                "Interface '%s' does not exist" % interface_name)

        raise dbus.exceptions.DBusException(
            "org.freedesktop.DBus.Error.PropertyReadOnly: "
            "Property '%s' is read-only" % property_name)

    @dbus.service.signal(dbus.PROPERTIES_IFACE, signature='sa{sv}as')
    def PropertiesChanged(self, interface_name, changed_properties,
                          invalidated_properties):
        interface_name = dbus_to_python(interface_name, str)
        changed_properties = dbus_to_python(changed_properties)
        invalidated_properties = dbus_to_python(invalidated_properties)
        log.debug1("%s.PropertiesChanged('%s', '%s', '%s')", self._log_prefix,
                   interface_name, changed_properties, invalidated_properties)

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_INFO)
    @dbus_service_method(dbus.INTROSPECTABLE_IFACE, out_signature='s')
    @dbus_handle_exceptions
    def Introspect(self, sender=None): # pylint: disable=W0613
        log.debug2("%s.Introspect()", self._log_prefix)

        data = super(FirewallDConfigHelper, self).Introspect(
            self.path, self.busname.get_bus())

        return dbus_introspection_add_properties(
            self, data, config.dbus.DBUS_INTERFACE_CONFIG_HELPER)

    # S E T T I N G S

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_HELPER,
                         out_signature=Helper.DBUS_SIGNATURE)
    @dbus_handle_exceptions
    def getSettings(self, sender=None): # pylint: disable=W0613
        """get settings for helper
        """
        log.debug1("%s.getSettings()", self._log_prefix)
        return self.config.get_helper_config(self.obj)
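
    # For reference, the settings tuple returned here (Helper.DBUS_SIGNATURE)
    # is indexed by the accessors further down in this class:
    #   0 version, 1 short, 2 description, 3 family, 4 module,
    #   5 ports as a list of (port, protocol) string pairs.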

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_HELPER,
                         in_signature=Helper.DBUS_SIGNATURE)
    @dbus_handle_exceptions
    def update(self, settings, sender=None):
        """update settings for helper
        """
        settings = dbus_to_python(settings)
        log.debug1("%s.update('...')", self._log_prefix)
        self.parent.accessCheck(sender)
        self.obj = self.config.set_helper_config(self.obj, settings)
        self.Updated(self.obj.name)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_HELPER)
    @dbus_handle_exceptions
    def loadDefaults(self, sender=None):
        """load default settings for builtin helper
        """
        log.debug1("%s.loadDefaults()", self._log_prefix)
        self.parent.accessCheck(sender)
        self.obj = self.config.load_helper_defaults(self.obj)
        self.Updated(self.obj.name)

    @dbus.service.signal(config.dbus.DBUS_INTERFACE_CONFIG_HELPER,
                         signature='s')
    @dbus_handle_exceptions
    def Updated(self, name):
        log.debug1("%s.Updated('%s')" % (self._log_prefix, name))

    # R E M O V E

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_HELPER)
    @dbus_handle_exceptions
    def remove(self, sender=None):
        """remove helper
        """
        log.debug1("%s.removeHelper()", self._log_prefix)
        self.parent.accessCheck(sender)
        self.config.remove_helper(self.obj)
        self.parent.removeHelper(self.obj)

    @dbus.service.signal(config.dbus.DBUS_INTERFACE_CONFIG_HELPER,
                         signature='s')
    @dbus_handle_exceptions
    def Removed(self, name):
        log.debug1("%s.Removed('%s')" % (self._log_prefix, name))

    # R E N A M E

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_HELPER,
                         in_signature='s')
    @dbus_handle_exceptions
    def rename(self, name, sender=None):
        """rename helper
        """
        name = dbus_to_python(name, str)
        log.debug1("%s.rename('%s')", self._log_prefix, name)
        self.parent.accessCheck(sender)
        self.obj = self.config.rename_helper(self.obj, name)
        self.Renamed(name)

    @dbus.service.signal(config.dbus.DBUS_INTERFACE_CONFIG_HELPER,
                         signature='s')
    @dbus_handle_exceptions
    def Renamed(self, name):
        log.debug1("%s.Renamed('%s')" % (self._log_prefix, name))

    # version

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_HELPER,
                         out_signature='s')
    @dbus_handle_exceptions
    def getVersion(self, sender=None): # pylint: disable=W0613
        log.debug1("%s.getVersion()", self._log_prefix)
        return self.getSettings()[0]

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_HELPER,
                         in_signature='s')
    @dbus_handle_exceptions
    def setVersion(self, version, sender=None):
        version = dbus_to_python(version, str)
        log.debug1("%s.setVersion('%s')", self._log_prefix, version)
        self.parent.accessCheck(sender)
        settings = list(self.getSettings())
        settings[0] = version
        self.update(settings)

    # short

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_HELPER,
                         out_signature='s')
    @dbus_handle_exceptions
    def getShort(self, sender=None): # pylint: disable=W0613
        log.debug1("%s.getShort()", self._log_prefix)
        return self.getSettings()[1]

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_HELPER,
                         in_signature='s')
    @dbus_handle_exceptions
    def setShort(self, short, sender=None):
        short = dbus_to_python(short, str)
        log.debug1("%s.setShort('%s')", self._log_prefix, short)
        self.parent.accessCheck(sender)
        settings = list(self.getSettings())
        settings[1] = short
        self.update(settings)

    # description

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_HELPER,
                         out_signature='s')
    @dbus_handle_exceptions
    def getDescription(self, sender=None): # pylint: disable=W0613
        log.debug1("%s.getDescription()", self._log_prefix)
        return self.getSettings()[2]

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_HELPER,
                         in_signature='s')
    @dbus_handle_exceptions
    def setDescription(self, description, sender=None):
        description = dbus_to_python(description, str)
        log.debug1("%s.setDescription('%s')", self._log_prefix,
                   description)
        self.parent.accessCheck(sender)
        settings = list(self.getSettings())
        settings[2] = description
        self.update(settings)

    # family

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_HELPER,
                         out_signature='s')
    @dbus_handle_exceptions
    def getFamily(self, sender=None):
        log.debug1("%s.getFamily()", self._log_prefix)
        self.parent.accessCheck(sender)
        settings = list(self.getSettings())
        return settings[3]

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_HELPER,
                         in_signature='s')
    @dbus_handle_exceptions
    def setFamily(self, ipv, sender=None):
        ipv = dbus_to_python(ipv, str)
        log.debug1("%s.setFamily('%s')", self._log_prefix, ipv)
        self.parent.accessCheck(sender)
        settings = list(self.getSettings())
        if settings[3] == ipv:
            raise FirewallError(errors.ALREADY_ENABLED, "'%s'" % ipv)
        settings[3] = ipv
        self.update(settings)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_HELPER,
                         in_signature='s', out_signature='b')
    @dbus_handle_exceptions
    def queryFamily(self, ipv, sender=None): # pylint: disable=W0613
        ipv = dbus_to_python(ipv, str)
        log.debug1("%s.queryFamily('%s')", self._log_prefix, ipv)
        settings = self.getSettings()
        return (settings[3] == ipv)

    # module

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_HELPER,
                         out_signature='s')
    @dbus_handle_exceptions
    def getModule(self, sender=None):
        log.debug1("%s.getModule()", self._log_prefix)
        self.parent.accessCheck(sender)
        settings = list(self.getSettings())
        return settings[4]

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_HELPER,
                         in_signature='s')
    @dbus_handle_exceptions
    def setModule(self, module, sender=None):
        module = dbus_to_python(module, str)
        log.debug1("%s.setModule('%s')", self._log_prefix, module)
        self.parent.accessCheck(sender)
        settings = list(self.getSettings())
        if settings[4] == module:
            raise FirewallError(errors.ALREADY_ENABLED, "'%s'" % module)
        settings[4] = module
        self.update(settings)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_HELPER,
                         in_signature='s', out_signature='b')
    @dbus_handle_exceptions
    def queryModule(self, module, sender=None): # pylint: disable=W0613
        module = dbus_to_python(module, str)
        log.debug1("%s.queryModule('%s')", self._log_prefix, module)
        settings = self.getSettings()
        return (settings[4] == module)

    # port

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_HELPER,
                         out_signature='a(ss)')
    @dbus_handle_exceptions
    def getPorts(self, sender=None): # pylint: disable=W0613
        log.debug1("%s.getPorts()", self._log_prefix)
        return self.getSettings()[5]

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_HELPER,
                         in_signature='a(ss)')
    @dbus_handle_exceptions
    def setPorts(self, ports, sender=None):
        _ports = [ ]
        # convert embedded lists to tuples
        for port in dbus_to_python(ports, list):
            if isinstance(port, list):
                _ports.append(tuple(port))
            else:
                _ports.append(port)
        ports = _ports
        log.debug1("%s.setPorts('[%s]')", self._log_prefix,
                   ",".join("('%s, '%s')" % (port[0], port[1]) for port in ports))
        self.parent.accessCheck(sender)
        settings = list(self.getSettings())
        settings[5] = ports
        self.update(settings)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_HELPER,
                         in_signature='ss')
    @dbus_handle_exceptions
    def addPort(self, port, protocol, sender=None):
        port = dbus_to_python(port, str)
        protocol = dbus_to_python(protocol, str)
        log.debug1("%s.addPort('%s', '%s')", self._log_prefix, port,
                   protocol)
        self.parent.accessCheck(sender)
        settings = list(self.getSettings())
        if (port,protocol) in settings[5]:
            raise FirewallError(errors.ALREADY_ENABLED,
                                "%s:%s" % (port, protocol))
        settings[5].append((port,protocol))
        self.update(settings)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_HELPER,
                         in_signature='ss')
    @dbus_handle_exceptions
    def removePort(self, port, protocol, sender=None):
        port = dbus_to_python(port, str)
        protocol = dbus_to_python(protocol, str)
        log.debug1("%s.removePort('%s', '%s')", self._log_prefix, port,
                   protocol)
        self.parent.accessCheck(sender)
        settings = list(self.getSettings())
        if (port,protocol) not in settings[5]:
            raise FirewallError(errors.NOT_ENABLED, "%s:%s" % (port, protocol))
        settings[5].remove((port,protocol))
        self.update(settings)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_HELPER,
                         in_signature='ss', out_signature='b')
    @dbus_handle_exceptions
    def queryPort(self, port, protocol, sender=None): # pylint: disable=W0613
        port = dbus_to_python(port, str)
        protocol = dbus_to_python(protocol, str)
        log.debug1("%s.queryPort('%s', '%s')", self._log_prefix, port,
                   protocol)
        return (port,protocol) in self.getSettings()[5]
site-packages/firewall/server/config_zone.py
# -*- coding: utf-8 -*-
#
# Copyright (C) 2010-2016 Red Hat, Inc.
#
# Authors:
# Thomas Woerner <twoerner@redhat.com>
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program.  If not, see <http://www.gnu.org/licenses/>.

# force use of pygobject3 in python-slip
from gi.repository import GObject
import sys
sys.modules['gobject'] = GObject

import dbus
import dbus.service
import slip.dbus
import slip.dbus.service

from firewall import config
from firewall.dbus_utils import dbus_to_python, \
    dbus_introspection_prepare_properties, \
    dbus_introspection_add_properties
from firewall.core.io.zone import Zone
from firewall.core.fw_ifcfg import ifcfg_set_zone_of_interface
from firewall.core.base import DEFAULT_ZONE_TARGET
from firewall.core.rich import Rich_Rule
from firewall.core.logger import log
from firewall.server.decorators import handle_exceptions, \
    dbus_handle_exceptions, dbus_service_method
from firewall import errors
from firewall.errors import FirewallError
from firewall.functions import portStr, portInPortRange, coalescePortRange, \
                               breakPortRange

############################################################################
#
# class FirewallDConfigZone
#
############################################################################

class FirewallDConfigZone(slip.dbus.service.Object):
    """FirewallD main class"""

    persistent = True
    """ Make FirewallD persistent. """
    default_polkit_auth_required = config.dbus.PK_ACTION_CONFIG
    """ Use PK_ACTION_INFO as a default """

    @handle_exceptions
    def __init__(self, parent, conf, zone, item_id, *args, **kwargs):
        super(FirewallDConfigZone, self).__init__(*args, **kwargs)
        self.parent = parent
        self.config = conf
        self.obj = zone
        self.item_id = item_id
        self.busname = args[0]
        self.path = args[1]
        self._log_prefix = "config.zone.%d" % self.item_id
        dbus_introspection_prepare_properties(
            self, config.dbus.DBUS_INTERFACE_CONFIG_ZONE)

    @dbus_handle_exceptions
    def __del__(self):
        pass

    @dbus_handle_exceptions
    def unregister(self):
        self.remove_from_connection()

    # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # #

    # P R O P E R T I E S

    @dbus_handle_exceptions
    def _get_property(self, property_name):
        if property_name == "name":
            return dbus.String(self.obj.name)
        elif property_name == "filename":
            return dbus.String(self.obj.filename)
        elif property_name == "path":
            return dbus.String(self.obj.path)
        elif property_name == "default":
            return dbus.Boolean(self.obj.default)
        elif property_name == "builtin":
            return dbus.Boolean(self.obj.builtin)
        else:
            raise dbus.exceptions.DBusException(
                "org.freedesktop.DBus.Error.InvalidArgs: "
                "Property '%s' does not exist" % property_name)

    @dbus_service_method(dbus.PROPERTIES_IFACE, in_signature='ss',
                         out_signature='v')
    @dbus_handle_exceptions
    def Get(self, interface_name, property_name, sender=None): # pylint: disable=W0613
        # get a property
        interface_name = dbus_to_python(interface_name, str)
        property_name = dbus_to_python(property_name, str)
        log.debug1("%s.Get('%s', '%s')", self._log_prefix,
                   interface_name, property_name)

        if interface_name != config.dbus.DBUS_INTERFACE_CONFIG_ZONE:
            raise dbus.exceptions.DBusException(
                "org.freedesktop.DBus.Error.UnknownInterface: "
                "Interface '%s' does not exist" % interface_name)

        return self._get_property(property_name)

    @dbus_service_method(dbus.PROPERTIES_IFACE, in_signature='s',
                         out_signature='a{sv}')
    @dbus_handle_exceptions
    def GetAll(self, interface_name, sender=None): # pylint: disable=W0613
        interface_name = dbus_to_python(interface_name, str)
        log.debug1("%s.GetAll('%s')", self._log_prefix, interface_name)

        if interface_name != config.dbus.DBUS_INTERFACE_CONFIG_ZONE:
            raise dbus.exceptions.DBusException(
                "org.freedesktop.DBus.Error.UnknownInterface: "
                "Interface '%s' does not exist" % interface_name)

        ret = { }
        for x in [ "name", "filename", "path", "default", "builtin" ]:
            ret[x] = self._get_property(x)
        return dbus.Dictionary(ret, signature="sv")

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_CONFIG)
    @dbus_service_method(dbus.PROPERTIES_IFACE, in_signature='ssv')
    @dbus_handle_exceptions
    def Set(self, interface_name, property_name, new_value, sender=None):
        interface_name = dbus_to_python(interface_name, str)
        property_name = dbus_to_python(property_name, str)
        new_value = dbus_to_python(new_value)
        log.debug1("%s.Set('%s', '%s', '%s')", self._log_prefix,
                   interface_name, property_name, new_value)
        self.parent.accessCheck(sender)

        if interface_name != config.dbus.DBUS_INTERFACE_CONFIG_ZONE:
            raise dbus.exceptions.DBusException(
                "org.freedesktop.DBus.Error.UnknownInterface: "
                "Interface '%s' does not exist" % interface_name)

        raise dbus.exceptions.DBusException(
            "org.freedesktop.DBus.Error.PropertyReadOnly: "
            "Property '%s' is read-only" % property_name)

    @dbus.service.signal(dbus.PROPERTIES_IFACE, signature='sa{sv}as')
    def PropertiesChanged(self, interface_name, changed_properties,
                          invalidated_properties):
        interface_name = dbus_to_python(interface_name, str)
        changed_properties = dbus_to_python(changed_properties)
        invalidated_properties = dbus_to_python(invalidated_properties)
        log.debug1("%s.PropertiesChanged('%s', '%s', '%s')", self._log_prefix,
                   interface_name, changed_properties, invalidated_properties)

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_INFO)
    @dbus_service_method(dbus.INTROSPECTABLE_IFACE, out_signature='s')
    @dbus_handle_exceptions
    def Introspect(self, sender=None): # pylint: disable=W0613
        log.debug2("%s.Introspect()", self._log_prefix)

        data = super(FirewallDConfigZone, self).Introspect(
            self.path, self.busname.get_bus())

        return dbus_introspection_add_properties(
            self, data, config.dbus.DBUS_INTERFACE_CONFIG_ZONE)

    # S E T T I N G S

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_ZONE,
                         out_signature="(sssbsasa(ss)asba(ssss)asasasasa(ss)b)")
    @dbus_handle_exceptions
    def getSettings(self, sender=None): # pylint: disable=W0613
        """get settings for zone
        """
        log.debug1("%s.getSettings()", self._log_prefix)
        settings = self.config.get_zone_config(self.obj)
        if settings[4] == DEFAULT_ZONE_TARGET:
            # convert to list, fix target, convert back to tuple
            _settings = list(settings)
            _settings[4] = "default"
            settings = tuple(_settings)
        return settings
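
    # For reference, the zone settings tuple used throughout this class is
    # indexed by the accessors further down:
    #   0 version, 1 short, 2 description, 3 unused (was 'immutable'),
    #   4 target, 5 services, 6 ports, 7 icmp blocks, 8 masquerade,
    #   9 forward ports, 10 interfaces, 11 sources, 12 rich rules,
    #   13 protocols, 14 source ports, 15 icmp block inversion.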

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_ZONE,
                         out_signature="a{sv}")
    @dbus_handle_exceptions
    def getSettings2(self, sender=None):
        """get settings for zone
        """
        log.debug1("%s.getSettings2()", self._log_prefix)
        settings = self.config.get_zone_config_dict(self.obj)
        if settings["target"] == DEFAULT_ZONE_TARGET:
            settings["target"] = "default"
        return settings

    def _checkDuplicateInterfacesSources(self, settings):
        """Assignment of interfaces/sources to zones is different from other
           zone settings in the sense that a particular interface/source can be
           part of only one zone. So make sure added interfaces/sources have
           not already been bound to another zone."""
        old_settings = self.config.get_zone_config_dict(self.obj)
        old_ifaces = set(old_settings["interfaces"]) if "interfaces" in old_settings else set()
        old_sources = set(old_settings["sources"]) if "sources" in old_settings else set()
        if isinstance(settings, tuple):
            added_ifaces = set(settings[Zone.index_of("interfaces")]) - old_ifaces
            added_sources = set(settings[Zone.index_of("sources")]) - old_sources
        else: # dict
            new_ifaces = set(settings["interfaces"]) if "interfaces" in settings else set()
            new_sources = set(settings["sources"]) if "sources" in settings else set()
            added_ifaces = new_ifaces - old_ifaces
            added_sources = new_sources - old_sources

        for iface in added_ifaces:
            if self.parent.getZoneOfInterface(iface):
                raise FirewallError(errors.ZONE_CONFLICT, iface)  # or move to new zone ?
        for source in added_sources:
            if self.parent.getZoneOfSource(source):
                raise FirewallError(errors.ZONE_CONFLICT, source) # or move to new zone ?
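
    # Example of the constraint enforced above (illustrative names): if eth0
    # is already bound to zone 'public', an update that adds eth0 to this
    # zone raises FirewallError(ZONE_CONFLICT, 'eth0') rather than silently
    # re-assigning the interface; the same applies to sources such as
    # '192.168.1.0/24'.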

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_ZONE,
                         in_signature="(sssbsasa(ss)asba(ssss)asasasasa(ss)b)")
    @dbus_handle_exceptions
    def update(self, settings, sender=None):
        """update settings for zone
        """
        settings = dbus_to_python(settings)
        log.debug1("%s.update('...')", self._log_prefix)
        self.parent.accessCheck(sender)
        if settings[4] == "default":
            # convert to list, fix target, convert back to tuple
            _settings = list(settings)
            _settings[4] = DEFAULT_ZONE_TARGET
            settings = tuple(_settings)
        self._checkDuplicateInterfacesSources(settings)
        self.obj = self.config.set_zone_config(self.obj, settings)
        self.Updated(self.obj.name)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_ZONE,
                         in_signature="a{sv}")
    @dbus_handle_exceptions
    def update2(self, settings, sender=None):
        """update settings for zone
        """
        settings = dbus_to_python(settings)
        log.debug1("%s.update2('...')", self._log_prefix)
        self.parent.accessCheck(sender)
        if "target" in settings and settings["target"] == "default":
            settings["target"] = DEFAULT_ZONE_TARGET
        self._checkDuplicateInterfacesSources(settings)
        self.obj = self.config.set_zone_config_dict(self.obj, settings)
        self.Updated(self.obj.name)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_ZONE)
    @dbus_handle_exceptions
    def loadDefaults(self, sender=None):
        """load default settings for builtin zone
        """
        log.debug1("%s.loadDefaults()", self._log_prefix)
        self.parent.accessCheck(sender)
        self.obj = self.config.load_zone_defaults(self.obj)
        self.Updated(self.obj.name)

    @dbus.service.signal(config.dbus.DBUS_INTERFACE_CONFIG_ZONE, signature='s')
    @dbus_handle_exceptions
    def Updated(self, name):
        log.debug1("%s.Updated('%s')" % (self._log_prefix, name))

    # R E M O V E

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_ZONE)
    @dbus_handle_exceptions
    def remove(self, sender=None):
        """remove zone
        """
        log.debug1("%s.removeZone()", self._log_prefix)
        self.parent.accessCheck(sender)
        self.config.remove_zone(self.obj)
        self.parent.removeZone(self.obj)

    @dbus.service.signal(config.dbus.DBUS_INTERFACE_CONFIG_ZONE, signature='s')
    @dbus_handle_exceptions
    def Removed(self, name):
        log.debug1("%s.Removed('%s')" % (self._log_prefix, name))

    # R E N A M E

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_ZONE,
                         in_signature='s')
    @dbus_handle_exceptions
    def rename(self, name, sender=None):
        """rename zone
        """
        name = dbus_to_python(name, str)
        log.debug1("%s.rename('%s')", self._log_prefix, name)
        self.parent.accessCheck(sender)
        self.obj = self.config.rename_zone(self.obj, name)
        self.Renamed(name)

    @dbus.service.signal(config.dbus.DBUS_INTERFACE_CONFIG_ZONE, signature='s')
    @dbus_handle_exceptions
    def Renamed(self, name):
        log.debug1("%s.Renamed('%s')" % (self._log_prefix, name))

    # version

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_ZONE,
                         out_signature='s')
    @dbus_handle_exceptions
    def getVersion(self, sender=None): # pylint: disable=W0613
        log.debug1("%s.getVersion()", self._log_prefix)
        return self.getSettings()[0]

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_ZONE,
                         in_signature='s')
    @dbus_handle_exceptions
    def setVersion(self, version, sender=None):
        version = dbus_to_python(version, str)
        log.debug1("%s.setVersion('%s')", self._log_prefix, version)
        self.parent.accessCheck(sender)
        settings = list(self.getSettings())
        settings[0] = version
        self.update(settings)

    # short

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_ZONE,
                         out_signature='s')
    @dbus_handle_exceptions
    def getShort(self, sender=None): # pylint: disable=W0613
        log.debug1("%s.getShort()", self._log_prefix)
        return self.getSettings()[1]

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_ZONE,
                         in_signature='s')
    @dbus_handle_exceptions
    def setShort(self, short, sender=None):
        short = dbus_to_python(short, str)
        log.debug1("%s.setShort('%s')", self._log_prefix, short)
        self.parent.accessCheck(sender)
        settings = list(self.getSettings())
        settings[1] = short
        self.update(settings)

    # description

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_ZONE,
                         out_signature='s')
    @dbus_handle_exceptions
    def getDescription(self, sender=None): # pylint: disable=W0613
        log.debug1("%s.getDescription()", self._log_prefix)
        return self.getSettings()[2]

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_ZONE,
                         in_signature='s')
    @dbus_handle_exceptions
    def setDescription(self, description, sender=None):
        description = dbus_to_python(description, str)
        log.debug1("%s.setDescription('%s')", self._log_prefix, description)
        self.parent.accessCheck(sender)
        settings = list(self.getSettings())
        settings[2] = description
        self.update(settings)

    # immutable (deprecated)
    # settings[3] was used for 'immutable'

    # target

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_ZONE,
                         out_signature='s')
    @dbus_handle_exceptions
    def getTarget(self, sender=None): # pylint: disable=W0613
        log.debug1("%s.getTarget()", self._log_prefix)
        settings = self.getSettings()
        return settings[4] if settings[4] != DEFAULT_ZONE_TARGET else "default"

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_ZONE,
                         in_signature='s')
    @dbus_handle_exceptions
    def setTarget(self, target, sender=None):
        target = dbus_to_python(target, str)
        log.debug1("%s.setTarget('%s')", self._log_prefix, target)
        self.parent.accessCheck(sender)
        settings = list(self.getSettings())
        settings[4] = target if target != "default" else DEFAULT_ZONE_TARGET
        self.update(settings)

    # service

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_ZONE,
                         out_signature='as')
    @dbus_handle_exceptions
    def getServices(self, sender=None): # pylint: disable=W0613
        log.debug1("%s.getServices()", self._log_prefix)
        return self.getSettings()[5]

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_ZONE,
                         in_signature='as')
    @dbus_handle_exceptions
    def setServices(self, services, sender=None):
        services = dbus_to_python(services, list)
        log.debug1("%s.setServices('[%s]')", self._log_prefix,
                   ",".join(services))
        self.parent.accessCheck(sender)
        settings = list(self.getSettings())
        settings[5] = services
        self.update(settings)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_ZONE,
                         in_signature='s')
    @dbus_handle_exceptions
    def addService(self, service, sender=None):
        service = dbus_to_python(service, str)
        log.debug1("%s.addService('%s')", self._log_prefix, service)
        self.parent.accessCheck(sender)
        settings = list(self.getSettings())
        if service in settings[5]:
            raise FirewallError(errors.ALREADY_ENABLED, service)
        settings[5].append(service)
        self.update(settings)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_ZONE,
                         in_signature='s')
    @dbus_handle_exceptions
    def removeService(self, service, sender=None):
        service = dbus_to_python(service, str)
        log.debug1("%s.removeService('%s')", self._log_prefix, service)
        self.parent.accessCheck(sender)
        settings = list(self.getSettings())
        if service not in settings[5]:
            raise FirewallError(errors.NOT_ENABLED, service)
        settings[5].remove(service)
        self.update(settings)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_ZONE,
                         in_signature='s', out_signature='b')
    @dbus_handle_exceptions
    def queryService(self, service, sender=None): # pylint: disable=W0613
        service = dbus_to_python(service, str)
        log.debug1("%s.queryService('%s')", self._log_prefix, service)
        return service in self.getSettings()[5]

    # port

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_ZONE,
                         out_signature='a(ss)')
    @dbus_handle_exceptions
    def getPorts(self, sender=None): # pylint: disable=W0613
        log.debug1("%s.getPorts()", self._log_prefix)
        return self.getSettings()[6]

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_ZONE,
                         in_signature='a(ss)')
    @dbus_handle_exceptions
    def setPorts(self, ports, sender=None):
        _ports = [ ]
        # convert embedded lists to tuples
        for port in dbus_to_python(ports, list):
            if isinstance(port, list):
                _ports.append(tuple(port))
            else:
                _ports.append(port)
        ports = _ports
        log.debug1("%s.setPorts('[%s]')", self._log_prefix,
                   ",".join("('%s, '%s')" % (port[0], port[1]) for port in ports))
        self.parent.accessCheck(sender)
        settings = list(self.getSettings())
        settings[6] = ports
        self.update(settings)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_ZONE,
                         in_signature='ss')
    @dbus_handle_exceptions
    def addPort(self, port, protocol, sender=None):
        port = dbus_to_python(port, str)
        protocol = dbus_to_python(protocol, str)
        log.debug1("%s.addPort('%s', '%s')", self._log_prefix, port,
                   protocol)
        self.parent.accessCheck(sender)
        settings = list(self.getSettings())
        existing_port_ids = list(filter(lambda x: x[1] == protocol, settings[6]))
        for port_id in existing_port_ids:
            if portInPortRange(port, port_id[0]):
                raise FirewallError(errors.ALREADY_ENABLED,
                                    "%s:%s" % (port, protocol))
        added_ranges, removed_ranges = coalescePortRange(port, [_port for (_port, _protocol) in existing_port_ids])
        for range in removed_ranges:
            settings[6].remove((portStr(range, "-"), protocol))
        for range in added_ranges:
            settings[6].append((portStr(range, "-"), protocol))
        self.update(settings)
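
    # Note on addPort() above: instead of appending the new (port, protocol)
    # pair verbatim, coalescePortRange() merges it with overlapping existing
    # ranges for the same protocol, so e.g. adding "8075-8090"/tcp to a zone
    # that already has "8070-8080"/tcp should leave a single "8070-8090"/tcp
    # entry (a sketch of the intent; the exact merging rules live in
    # firewall.functions.coalescePortRange).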

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_ZONE,
                         in_signature='ss')
    @dbus_handle_exceptions
    def removePort(self, port, protocol, sender=None):
        port = dbus_to_python(port, str)
        protocol = dbus_to_python(protocol, str)
        log.debug1("%s.removePort('%s', '%s')", self._log_prefix, port,
                   protocol)
        self.parent.accessCheck(sender)
        settings = list(self.getSettings())
        existing_port_ids = list(filter(lambda x: x[1] == protocol, settings[6]))
        for port_id in existing_port_ids:
            if portInPortRange(port, port_id[0]):
                break
        else:
            raise FirewallError(errors.NOT_ENABLED, "%s:%s" % (port, protocol))
        added_ranges, removed_ranges = breakPortRange(port, [_port for (_port, _protocol) in existing_port_ids])
        for range in removed_ranges:
            settings[6].remove((portStr(range, "-"), protocol))
        for range in added_ranges:
            settings[6].append((portStr(range, "-"), protocol))
        self.update(settings)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_ZONE,
                         in_signature='ss', out_signature='b')
    @dbus_handle_exceptions
    def queryPort(self, port, protocol, sender=None): # pylint: disable=W0613
        port = dbus_to_python(port, str)
        protocol = dbus_to_python(protocol, str)
        log.debug1("%s.queryPort('%s', '%s')", self._log_prefix, port,
                   protocol)
        for (_port, _protocol) in self.getSettings()[6]:
            if portInPortRange(port, _port) and protocol == _protocol:
                return True

        return False

    # protocol

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_ZONE,
                         out_signature='as')
    @dbus_handle_exceptions
    def getProtocols(self, sender=None): # pylint: disable=W0613
        log.debug1("%s.getProtocols()", self._log_prefix)
        return self.getSettings()[13]

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_ZONE,
                         in_signature='as')
    @dbus_handle_exceptions
    def setProtocols(self, protocols, sender=None):
        protocols = dbus_to_python(protocols, list)
        log.debug1("%s.setProtocols('[%s]')", self._log_prefix,
                   ",".join(protocols))
        self.parent.accessCheck(sender)
        settings = list(self.getSettings())
        settings[13] = protocols
        self.update(settings)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_ZONE,
                         in_signature='s')
    @dbus_handle_exceptions
    def addProtocol(self, protocol, sender=None):
        protocol = dbus_to_python(protocol, str)
        log.debug1("%s.addProtocol('%s')", self._log_prefix, protocol)
        self.parent.accessCheck(sender)
        settings = list(self.getSettings())
        if protocol in settings[13]:
            raise FirewallError(errors.ALREADY_ENABLED, protocol)
        settings[13].append(protocol)
        self.update(settings)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_ZONE,
                         in_signature='s')
    @dbus_handle_exceptions
    def removeProtocol(self, protocol, sender=None):
        protocol = dbus_to_python(protocol, str)
        log.debug1("%s.removeProtocol('%s')", self._log_prefix, protocol)
        self.parent.accessCheck(sender)
        settings = list(self.getSettings())
        if protocol not in settings[13]:
            raise FirewallError(errors.NOT_ENABLED, protocol)
        settings[13].remove(protocol)
        self.update(settings)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_ZONE,
                         in_signature='s', out_signature='b')
    @dbus_handle_exceptions
    def queryProtocol(self, protocol, sender=None): # pylint: disable=W0613
        protocol = dbus_to_python(protocol, str)
        log.debug1("%s.queryProtocol('%s')", self._log_prefix, protocol)
        return protocol in self.getSettings()[13]

    # source port

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_ZONE,
                         out_signature='a(ss)')
    @dbus_handle_exceptions
    def getSourcePorts(self, sender=None): # pylint: disable=W0613
        log.debug1("%s.getSourcePorts()", self._log_prefix)
        return self.getSettings()[14]

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_ZONE,
                         in_signature='a(ss)')
    @dbus_handle_exceptions
    def setSourcePorts(self, ports, sender=None):
        _ports = [ ]
        # convert embedded lists to tuples
        for port in dbus_to_python(ports, list):
            if isinstance(port, list):
                _ports.append(tuple(port))
            else:
                _ports.append(port)
        ports = _ports
        log.debug1("%s.setSourcePorts('[%s]')", self._log_prefix,
                   ",".join("('%s, '%s')" % (port[0], port[1]) for port in ports))
        self.parent.accessCheck(sender)
        settings = list(self.getSettings())
        settings[14] = ports
        self.update(settings)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_ZONE,
                         in_signature='ss')
    @dbus_handle_exceptions
    def addSourcePort(self, port, protocol, sender=None):
        port = dbus_to_python(port, str)
        protocol = dbus_to_python(protocol, str)
        log.debug1("%s.addSourcePort('%s', '%s')", self._log_prefix, port,
                   protocol)
        self.parent.accessCheck(sender)
        settings = list(self.getSettings())
        existing_port_ids = list(filter(lambda x: x[1] == protocol, settings[14]))
        for port_id in existing_port_ids:
            if portInPortRange(port, port_id[0]):
                raise FirewallError(errors.ALREADY_ENABLED,
                                    "%s:%s" % (port, protocol))
        added_ranges, removed_ranges = coalescePortRange(port, [_port for (_port, _protocol) in existing_port_ids])
        for range in removed_ranges:
            settings[14].remove((portStr(range, "-"), protocol))
        for range in added_ranges:
            settings[14].append((portStr(range, "-"), protocol))
        self.update(settings)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_ZONE,
                         in_signature='ss')
    @dbus_handle_exceptions
    def removeSourcePort(self, port, protocol, sender=None):
        port = dbus_to_python(port, str)
        protocol = dbus_to_python(protocol, str)
        log.debug1("%s.removeSourcePort('%s', '%s')", self._log_prefix, port,
                   protocol)
        self.parent.accessCheck(sender)
        settings = list(self.getSettings())
        existing_port_ids = list(filter(lambda x: x[1] == protocol, settings[14]))
        for port_id in existing_port_ids:
            if portInPortRange(port, port_id[0]):
                break
        else:
            raise FirewallError(errors.NOT_ENABLED, "%s:%s" % (port, protocol))
        added_ranges, removed_ranges = breakPortRange(port, [_port for (_port, _protocol) in existing_port_ids])
        for range in removed_ranges:
            settings[14].remove((portStr(range, "-"), protocol))
        for range in added_ranges:
            settings[14].append((portStr(range, "-"), protocol))
        self.update(settings)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_ZONE,
                         in_signature='ss', out_signature='b')
    @dbus_handle_exceptions
    def querySourcePort(self, port, protocol, sender=None): # pylint: disable=W0613
        port = dbus_to_python(port, str)
        protocol = dbus_to_python(protocol, str)
        log.debug1("%s.querySourcePort('%s', '%s')", self._log_prefix, port,
                   protocol)
        for (_port, _protocol) in self.getSettings()[14]:
            if portInPortRange(port, _port) and protocol == _protocol:
                return True

        return False

    # icmp block

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_ZONE,
                         out_signature='as')
    @dbus_handle_exceptions
    def getIcmpBlocks(self, sender=None): # pylint: disable=W0613
        log.debug1("%s.getIcmpBlocks()", self._log_prefix)
        return self.getSettings()[7]

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_ZONE,
                         in_signature='as')
    @dbus_handle_exceptions
    def setIcmpBlocks(self, icmptypes, sender=None):
        icmptypes = dbus_to_python(icmptypes, list)
        log.debug1("%s.setIcmpBlocks('[%s]')", self._log_prefix,
                   ",".join(icmptypes))
        self.parent.accessCheck(sender)
        settings = list(self.getSettings())
        settings[7] = icmptypes
        self.update(settings)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_ZONE,
                         in_signature='s')
    @dbus_handle_exceptions
    def addIcmpBlock(self, icmptype, sender=None):
        icmptype = dbus_to_python(icmptype, str)
        log.debug1("%s.addIcmpBlock('%s')", self._log_prefix, icmptype)
        self.parent.accessCheck(sender)
        settings = list(self.getSettings())
        if icmptype in settings[7]:
            raise FirewallError(errors.ALREADY_ENABLED, icmptype)
        settings[7].append(icmptype)
        self.update(settings)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_ZONE,
                         in_signature='s')
    @dbus_handle_exceptions
    def removeIcmpBlock(self, icmptype, sender=None):
        icmptype = dbus_to_python(icmptype, str)
        log.debug1("%s.removeIcmpBlock('%s')", self._log_prefix, icmptype)
        self.parent.accessCheck(sender)
        settings = list(self.getSettings())
        if icmptype not in settings[7]:
            raise FirewallError(errors.NOT_ENABLED, icmptype)
        settings[7].remove(icmptype)
        self.update(settings)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_ZONE,
                         in_signature='s', out_signature='b')
    @dbus_handle_exceptions
    def queryIcmpBlock(self, icmptype, sender=None): # pylint: disable=W0613
        icmptype = dbus_to_python(icmptype, str)
        log.debug1("%s.queryIcmpBlock('%s')", self._log_prefix, icmptype)
        return icmptype in self.getSettings()[7]

    # icmp block inversion

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_ZONE,
                         out_signature='b')
    @dbus_handle_exceptions
    def getIcmpBlockInversion(self, sender=None): # pylint: disable=W0613
        log.debug1("%s.getIcmpBlockInversion()", self._log_prefix)
        return self.getSettings()[15]

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_ZONE,
                         in_signature='b')
    @dbus_handle_exceptions
    def setIcmpBlockInversion(self, flag, sender=None):
        flag = dbus_to_python(flag, bool)
        log.debug1("%s.setIcmpBlockInversion('%s')", self._log_prefix, flag)
        self.parent.accessCheck(sender)
        settings = list(self.getSettings())
        settings[15] = flag
        self.update(settings)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_ZONE)
    @dbus_handle_exceptions
    def addIcmpBlockInversion(self, sender=None):
        log.debug1("%s.addIcmpBlockInversion()", self._log_prefix)
        self.parent.accessCheck(sender)
        settings = list(self.getSettings())
        if settings[15]:
            raise FirewallError(errors.ALREADY_ENABLED, "icmp-block-inversion")
        settings[15] = True
        self.update(settings)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_ZONE)
    @dbus_handle_exceptions
    def removeIcmpBlockInversion(self, sender=None):
        log.debug1("%s.removeIcmpBlockInversion()", self._log_prefix)
        self.parent.accessCheck(sender)
        settings = list(self.getSettings())
        if not settings[15]:
            raise FirewallError(errors.NOT_ENABLED, "icmp-block-inversion")
        settings[15] = False
        self.update(settings)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_ZONE,
                         out_signature='b')
    @dbus_handle_exceptions
    def queryIcmpBlockInversion(self, sender=None): # pylint: disable=W0613
        log.debug1("%s.queryIcmpBlockInversion()", self._log_prefix)
        return self.getSettings()[15]

    # masquerade

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_ZONE,
                         out_signature='b')
    @dbus_handle_exceptions
    def getMasquerade(self, sender=None): # pylint: disable=W0613
        log.debug1("%s.getMasquerade()", self._log_prefix)
        return self.getSettings()[8]

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_ZONE,
                         in_signature='b')
    @dbus_handle_exceptions
    def setMasquerade(self, masquerade, sender=None):
        masquerade = dbus_to_python(masquerade, bool)
        log.debug1("%s.setMasquerade('%s')", self._log_prefix, masquerade)
        self.parent.accessCheck(sender)
        settings = list(self.getSettings())
        settings[8] = masquerade
        self.update(settings)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_ZONE)
    @dbus_handle_exceptions
    def addMasquerade(self, sender=None):
        log.debug1("%s.addMasquerade()", self._log_prefix)
        self.parent.accessCheck(sender)
        settings = list(self.getSettings())
        if settings[8]:
            raise FirewallError(errors.ALREADY_ENABLED, "masquerade")
        settings[8] = True
        self.update(settings)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_ZONE)
    @dbus_handle_exceptions
    def removeMasquerade(self, sender=None):
        log.debug1("%s.removeMasquerade()", self._log_prefix)
        self.parent.accessCheck(sender)
        settings = list(self.getSettings())
        if not settings[8]:
            raise FirewallError(errors.NOT_ENABLED, "masquerade")
        settings[8] = False
        self.update(settings)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_ZONE,
                         out_signature='b')
    @dbus_handle_exceptions
    def queryMasquerade(self, sender=None): # pylint: disable=W0613
        log.debug1("%s.queryMasquerade()", self._log_prefix)
        return self.getSettings()[8]

    # forward port

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_ZONE,
                         out_signature='a(ssss)')
    @dbus_handle_exceptions
    def getForwardPorts(self, sender=None): # pylint: disable=W0613
        log.debug1("%s.getForwardPorts()", self._log_prefix)
        return self.getSettings()[9]

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_ZONE,
                         in_signature='a(ssss)')
    @dbus_handle_exceptions
    def setForwardPorts(self, ports, sender=None):
        _ports = [ ]
        # convert embedded lists to tuples
        for port in dbus_to_python(ports, list):
            if isinstance(port, list):
                _ports.append(tuple(port))
            else:
                _ports.append(port)
        ports = _ports
        log.debug1("%s.setForwardPorts('[%s]')", self._log_prefix,
                   ",".join("('%s, '%s', '%s', '%s')" % (port[0], port[1], \
                                                         port[2], port[3]) for port in ports))
        self.parent.accessCheck(sender)
        settings = list(self.getSettings())
        settings[9] = ports
        self.update(settings)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_ZONE,
                         in_signature='ssss')
    @dbus_handle_exceptions
    def addForwardPort(self, port, protocol, toport, toaddr, sender=None): # pylint: disable=R0913
        port = dbus_to_python(port, str)
        protocol = dbus_to_python(protocol, str)
        toport = dbus_to_python(toport, str)
        toaddr = dbus_to_python(toaddr, str)
        log.debug1("%s.addForwardPort('%s', '%s', '%s', '%s')",
                   self._log_prefix, port, protocol, toport, toaddr)
        self.parent.accessCheck(sender)
        fwp_id = (port, protocol, str(toport), str(toaddr))
        settings = list(self.getSettings())
        if fwp_id in settings[9]:
            raise FirewallError(errors.ALREADY_ENABLED,
                                "%s:%s:%s:%s" % (port, protocol, toport,
                                                 toaddr))
        settings[9].append(fwp_id)
        self.update(settings)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_ZONE,
                         in_signature='ssss')
    @dbus_handle_exceptions
    def removeForwardPort(self, port, protocol, toport, toaddr, sender=None): # pylint: disable=R0913
        port = dbus_to_python(port, str)
        protocol = dbus_to_python(protocol, str)
        toport = dbus_to_python(toport, str)
        toaddr = dbus_to_python(toaddr, str)
        log.debug1("%s.removeForwardPort('%s', '%s', '%s', '%s')",
                   self._log_prefix, port, protocol, toport, toaddr)
        self.parent.accessCheck(sender)
        fwp_id = (port, protocol, str(toport), str(toaddr))
        settings = list(self.getSettings())
        if fwp_id not in settings[9]:
            raise FirewallError(errors.NOT_ENABLED,
                                "%s:%s:%s:%s" % (port, protocol, toport,
                                                 toaddr))
        settings[9].remove(fwp_id)
        self.update(settings)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_ZONE,
                         in_signature='ssss',
                         out_signature='b')
    @dbus_handle_exceptions
    def queryForwardPort(self, port, protocol, toport, toaddr, sender=None): # pylint: disable=W0613, R0913
        port = dbus_to_python(port, str)
        protocol = dbus_to_python(protocol, str)
        toport = dbus_to_python(toport, str)
        toaddr = dbus_to_python(toaddr, str)
        log.debug1("%s.queryForwardPort('%s', '%s', '%s', '%s')",
                   self._log_prefix, port, protocol, toport, toaddr)
        fwp_id = (port, protocol, str(toport), str(toaddr))
        return fwp_id in self.getSettings()[9]
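
    # A forward port entry is the 4-tuple (port, protocol, toport, toaddr),
    # e.g. ("443", "tcp", "8443", "10.0.0.1"); the example values are
    # illustrative only.  port and protocol identify the traffic matched in
    # this zone, toport/toaddr describe where it is forwarded to.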

    # interface

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_ZONE,
                         out_signature='as')
    @dbus_handle_exceptions
    def getInterfaces(self, sender=None): # pylint: disable=W0613
        log.debug1("%s.getInterfaces()", self._log_prefix)
        return self.getSettings()[10]

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_ZONE,
                         in_signature='as')
    @dbus_handle_exceptions
    def setInterfaces(self, interfaces, sender=None):
        interfaces = dbus_to_python(interfaces, list)
        log.debug1("%s.setInterfaces('[%s]')", self._log_prefix,
                   ",".join(interfaces))
        self.parent.accessCheck(sender)
        settings = list(self.getSettings())
        settings[10] = interfaces
        self.update(settings)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_ZONE,
                         in_signature='s')
    @dbus_handle_exceptions
    def addInterface(self, interface, sender=None):
        interface = dbus_to_python(interface, str)
        log.debug1("%s.addInterface('%s')", self._log_prefix, interface)
        self.parent.accessCheck(sender)
        settings = list(self.getSettings())
        if interface in settings[10]:
            raise FirewallError(errors.ALREADY_ENABLED, interface)
        settings[10].append(interface)
        self.update(settings)

        ifcfg_set_zone_of_interface(self.obj.name, interface)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_ZONE,
                         in_signature='s')
    @dbus_handle_exceptions
    def removeInterface(self, interface, sender=None):
        interface = dbus_to_python(interface, str)
        log.debug1("%s.removeInterface('%s')", self._log_prefix, interface)
        self.parent.accessCheck(sender)
        settings = list(self.getSettings())
        if interface not in settings[10]:
            raise FirewallError(errors.NOT_ENABLED, interface)
        settings[10].remove(interface)
        self.update(settings)

        ifcfg_set_zone_of_interface("", interface)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_ZONE,
                         in_signature='s',
                         out_signature='b')
    @dbus_handle_exceptions
    def queryInterface(self, interface, sender=None): # pylint: disable=W0613
        interface = dbus_to_python(interface, str)
        log.debug1("%s.queryInterface('%s')", self._log_prefix, interface)
        return interface in self.getSettings()[10]

    # source

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_ZONE,
                         out_signature='as')
    @dbus_handle_exceptions
    def getSources(self, sender=None): # pylint: disable=W0613
        log.debug1("%s.getSources()", self._log_prefix)
        return self.getSettings()[11]

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_ZONE,
                         in_signature='as')
    @dbus_handle_exceptions
    def setSources(self, sources, sender=None):
        sources = dbus_to_python(sources, list)
        log.debug1("%s.setSources('[%s]')", self._log_prefix,
                   ",".join(sources))
        self.parent.accessCheck(sender)
        settings = list(self.getSettings())
        settings[11] = sources
        self.update(settings)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_ZONE,
                         in_signature='s')
    @dbus_handle_exceptions
    def addSource(self, source, sender=None):
        source = dbus_to_python(source, str)
        log.debug1("%s.addSource('%s')", self._log_prefix, source)
        self.parent.accessCheck(sender)
        settings = list(self.getSettings())
        if source in settings[11]:
            raise FirewallError(errors.ALREADY_ENABLED, source)
        settings[11].append(source)
        self.update(settings)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_ZONE,
                         in_signature='s')
    @dbus_handle_exceptions
    def removeSource(self, source, sender=None):
        source = dbus_to_python(source, str)
        log.debug1("%s.removeSource('%s')", self._log_prefix, source)
        self.parent.accessCheck(sender)
        settings = list(self.getSettings())
        if source not in settings[11]:
            raise FirewallError(errors.NOT_ENABLED, source)
        settings[11].remove(source)
        self.update(settings)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_ZONE,
                         in_signature='s', out_signature='b')
    @dbus_handle_exceptions
    def querySource(self, source, sender=None): # pylint: disable=W0613
        source = dbus_to_python(source, str)
        log.debug1("%s.querySource('%s')", self._log_prefix, source)
        return source in self.getSettings()[11]

    # rich rule

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_ZONE,
                         out_signature='as')
    @dbus_handle_exceptions
    def getRichRules(self, sender=None): # pylint: disable=W0613
        log.debug1("%s.getRichRules()", self._log_prefix)
        return self.getSettings()[12]

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_ZONE,
                         in_signature='as')
    @dbus_handle_exceptions
    def setRichRules(self, rules, sender=None):
        rules = dbus_to_python(rules, list)
        log.debug1("%s.setRichRules('[%s]')", self._log_prefix,
                   ",".join(rules))
        self.parent.accessCheck(sender)
        settings = list(self.getSettings())
        rules = [ str(Rich_Rule(rule_str=r)) for r in rules ]
        settings[12] = rules
        self.update(settings)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_ZONE,
                         in_signature='s')
    @dbus_handle_exceptions
    def addRichRule(self, rule, sender=None):
        rule = dbus_to_python(rule, str)
        log.debug1("%s.addRichRule('%s')", self._log_prefix, rule)
        self.parent.accessCheck(sender)
        settings = list(self.getSettings())
        rule_str = str(Rich_Rule(rule_str=rule))
        if rule_str in settings[12]:
            raise FirewallError(errors.ALREADY_ENABLED, rule)
        settings[12].append(rule_str)
        self.update(settings)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_ZONE,
                         in_signature='s')
    @dbus_handle_exceptions
    def removeRichRule(self, rule, sender=None):
        rule = dbus_to_python(rule, str)
        log.debug1("%s.removeRichRule('%s')", self._log_prefix, rule)
        self.parent.accessCheck(sender)
        settings = list(self.getSettings())
        rule_str = str(Rich_Rule(rule_str=rule))
        if rule_str not in settings[12]:
            raise FirewallError(errors.NOT_ENABLED, rule)
        settings[12].remove(rule_str)
        self.update(settings)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_ZONE,
                         in_signature='s', out_signature='b')
    @dbus_handle_exceptions
    def queryRichRule(self, rule, sender=None): # pylint: disable=W0613
        rule = dbus_to_python(rule, str)
        log.debug1("%s.queryRichRule('%s')", self._log_prefix, rule)
        rule_str = str(Rich_Rule(rule_str=rule))
        return rule_str in self.getSettings()[12]
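# Illustrative sketch, not part of config_zone.py above: one way a client could
# drive the permanent-zone methods defined by this interface using plain
# dbus-python.  queryForwardPort() and addForwardPort() are the methods shown
# above; the bus name, object paths and the getZoneByName() lookup are
# assumptions based on the usual firewalld D-Bus layout.
if __name__ == "__main__":
    import dbus

    bus = dbus.SystemBus()
    config_obj = bus.get_object("org.fedoraproject.FirewallD1",          # assumed bus name
                                "/org/fedoraproject/FirewallD1/config")  # assumed path
    cfg = dbus.Interface(config_obj, "org.fedoraproject.FirewallD1.config")
    zone_path = cfg.getZoneByName("public")   # assumed lookup of the zone object path
    zone = dbus.Interface(bus.get_object("org.fedoraproject.FirewallD1", zone_path),
                          "org.fedoraproject.FirewallD1.config.zone")
    # Empty toport/toaddr mean "unset"; the server builds fwp_id with str() on
    # both, so the same empty strings are passed for the query and the add.
    if not zone.queryForwardPort("443", "tcp", "8443", ""):
        zone.addForwardPort("443", "tcp", "8443", "")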
site-packages/firewall/server/config_icmptype.py
# -*- coding: utf-8 -*-
#
# Copyright (C) 2010-2016 Red Hat, Inc.
#
# Authors:
# Thomas Woerner <twoerner@redhat.com>
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program.  If not, see <http://www.gnu.org/licenses/>.

# force use of pygobject3 in python-slip
from gi.repository import GObject
import sys
sys.modules['gobject'] = GObject

import dbus
import dbus.service
import slip.dbus
import slip.dbus.service

from firewall import config
from firewall.dbus_utils import dbus_to_python, \
    dbus_introspection_prepare_properties, \
    dbus_introspection_add_properties
from firewall.core.io.icmptype import IcmpType
from firewall.core.logger import log
from firewall.server.decorators import handle_exceptions, \
    dbus_handle_exceptions, dbus_service_method
from firewall import errors
from firewall.errors import FirewallError

############################################################################
#
# class FirewallDConfigIcmpType
#
############################################################################

class FirewallDConfigIcmpType(slip.dbus.service.Object):
    """FirewallD main class"""

    persistent = True
    """ Make FirewallD persistent. """
    default_polkit_auth_required = config.dbus.PK_ACTION_CONFIG
    """ Use PK_ACTION_INFO as a default """

    @handle_exceptions
    def __init__(self, parent, conf, icmptype, item_id, *args, **kwargs):
        super(FirewallDConfigIcmpType, self).__init__(*args, **kwargs)
        self.parent = parent
        self.config = conf
        self.obj = icmptype
        self.item_id = item_id
        self.busname = args[0]
        self.path = args[1]
        self._log_prefix = "config.icmptype.%d" % self.item_id
        dbus_introspection_prepare_properties(
            self, config.dbus.DBUS_INTERFACE_CONFIG_ICMPTYPE)

    @dbus_handle_exceptions
    def __del__(self):
        pass

    @dbus_handle_exceptions
    def unregister(self):
        self.remove_from_connection()

    # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # #

    # P R O P E R T I E S

    @dbus_handle_exceptions
    def _get_property(self, property_name):
        if property_name == "name":
            return dbus.String(self.obj.name)
        elif property_name == "filename":
            return dbus.String(self.obj.filename)
        elif property_name == "path":
            return dbus.String(self.obj.path)
        elif property_name == "default":
            return dbus.Boolean(self.obj.default)
        elif property_name == "builtin":
            return dbus.Boolean(self.obj.builtin)
        else:
            raise dbus.exceptions.DBusException(
                "org.freedesktop.DBus.Error.InvalidArgs: "
                "Property '%s' does not exist" % property_name)

    @dbus_service_method(dbus.PROPERTIES_IFACE, in_signature='ss',
                         out_signature='v')
    @dbus_handle_exceptions
    def Get(self, interface_name, property_name, sender=None): # pylint: disable=W0613
        # get a property
        interface_name = dbus_to_python(interface_name, str)
        property_name = dbus_to_python(property_name, str)
        log.debug1("%s.Get('%s', '%s')", self._log_prefix,
                   interface_name, property_name)

        if interface_name != config.dbus.DBUS_INTERFACE_CONFIG_ICMPTYPE:
            raise dbus.exceptions.DBusException(
                "org.freedesktop.DBus.Error.UnknownInterface: "
                "Interface '%s' does not exist" % interface_name)

        return self._get_property(property_name)

    @dbus_service_method(dbus.PROPERTIES_IFACE, in_signature='s',
                         out_signature='a{sv}')
    @dbus_handle_exceptions
    def GetAll(self, interface_name, sender=None): # pylint: disable=W0613
        interface_name = dbus_to_python(interface_name, str)
        log.debug1("%s.GetAll('%s')", self._log_prefix, interface_name)

        if interface_name != config.dbus.DBUS_INTERFACE_CONFIG_ICMPTYPE:
            raise dbus.exceptions.DBusException(
                "org.freedesktop.DBus.Error.UnknownInterface: "
                "Interface '%s' does not exist" % interface_name)

        ret = { }
        for x in [ "name", "filename", "path", "default", "builtin" ]:
            ret[x] = self._get_property(x)
        return dbus.Dictionary(ret, signature="sv")

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_CONFIG)
    @dbus_service_method(dbus.PROPERTIES_IFACE, in_signature='ssv')
    @dbus_handle_exceptions
    def Set(self, interface_name, property_name, new_value, sender=None):
        interface_name = dbus_to_python(interface_name, str)
        property_name = dbus_to_python(property_name, str)
        new_value = dbus_to_python(new_value)
        log.debug1("%s.Set('%s', '%s', '%s')", self._log_prefix,
                   interface_name, property_name, new_value)
        self.parent.accessCheck(sender)

        if interface_name != config.dbus.DBUS_INTERFACE_CONFIG_ICMPTYPE:
            raise dbus.exceptions.DBusException(
                "org.freedesktop.DBus.Error.UnknownInterface: "
                "Interface '%s' does not exist" % interface_name)

        raise dbus.exceptions.DBusException(
            "org.freedesktop.DBus.Error.PropertyReadOnly: "
            "Property '%s' is read-only" % property_name)

    @dbus.service.signal(dbus.PROPERTIES_IFACE, signature='sa{sv}as')
    def PropertiesChanged(self, interface_name, changed_properties,
                          invalidated_properties):
        interface_name = dbus_to_python(interface_name, str)
        changed_properties = dbus_to_python(changed_properties)
        invalidated_properties = dbus_to_python(invalidated_properties)
        log.debug1("%s.PropertiesChanged('%s', '%s', '%s')", self._log_prefix,
                   interface_name, changed_properties, invalidated_properties)

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_INFO)
    @dbus_service_method(dbus.INTROSPECTABLE_IFACE, out_signature='s')
    @dbus_handle_exceptions
    def Introspect(self, sender=None): # pylint: disable=W0613
        log.debug2("%s.Introspect()", self._log_prefix)

        data = super(FirewallDConfigIcmpType, self).Introspect(
            self.path, self.busname.get_bus())

        return dbus_introspection_add_properties(
            self, data, config.dbus.DBUS_INTERFACE_CONFIG_ICMPTYPE)

    # S E T T I N G S

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_ICMPTYPE,
                         out_signature=IcmpType.DBUS_SIGNATURE)
    @dbus_handle_exceptions
    def getSettings(self, sender=None): # pylint: disable=W0613
        """get settings for icmptype
        """
        log.debug1("%s.getSettings()", self._log_prefix)
        return self.config.get_icmptype_config(self.obj)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_ICMPTYPE,
                         in_signature=IcmpType.DBUS_SIGNATURE)
    @dbus_handle_exceptions
    def update(self, settings, sender=None):
        """update settings for icmptype
        """
        settings = dbus_to_python(settings)
        log.debug1("%s.update('...')", self._log_prefix)
        self.parent.accessCheck(sender)
        self.obj = self.config.set_icmptype_config(self.obj, settings)
        self.Updated(self.obj.name)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_ICMPTYPE)
    @dbus_handle_exceptions
    def loadDefaults(self, sender=None):
        """load default settings for builtin icmptype
        """
        log.debug1("%s.loadDefaults()", self._log_prefix)
        self.parent.accessCheck(sender)
        self.obj = self.config.load_icmptype_defaults(self.obj)
        self.Updated(self.obj.name)

    @dbus.service.signal(config.dbus.DBUS_INTERFACE_CONFIG_ICMPTYPE,
                         signature='s')
    @dbus_handle_exceptions
    def Updated(self, name):
        log.debug1("%s.Updated('%s')" % (self._log_prefix, name))

    # R E M O V E

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_ICMPTYPE)
    @dbus_handle_exceptions
    def remove(self, sender=None):
        """remove icmptype
        """
        log.debug1("%s.removeIcmpType()", self._log_prefix)
        self.parent.accessCheck(sender)
        self.config.remove_icmptype(self.obj)
        self.parent.removeIcmpType(self.obj)

    @dbus.service.signal(config.dbus.DBUS_INTERFACE_CONFIG_ICMPTYPE,
                         signature='s')
    @dbus_handle_exceptions
    def Removed(self, name):
        log.debug1("%s.Removed('%s')" % (self._log_prefix, name))

    # R E N A M E

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_ICMPTYPE,
                         in_signature='s')
    @dbus_handle_exceptions
    def rename(self, name, sender=None):
        """rename icmptype
        """
        name = dbus_to_python(name, str)
        log.debug1("%s.rename('%s')", self._log_prefix, name)
        self.parent.accessCheck(sender)
        self.obj = self.config.rename_icmptype(self.obj, name)
        self.Renamed(name)

    @dbus.service.signal(config.dbus.DBUS_INTERFACE_CONFIG_ICMPTYPE,
                         signature='s')
    @dbus_handle_exceptions
    def Renamed(self, name):
        log.debug1("%s.Renamed('%s')" % (self._log_prefix, name))

    # version

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_ICMPTYPE,
                         out_signature='s')
    @dbus_handle_exceptions
    def getVersion(self, sender=None): # pylint: disable=W0613
        log.debug1("%s.getVersion()", self._log_prefix)
        return self.getSettings()[0]

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_ICMPTYPE,
                         in_signature='s')
    @dbus_handle_exceptions
    def setVersion(self, version, sender=None):
        version = dbus_to_python(version, str)
        log.debug1("%s.setVersion('%s')", self._log_prefix, version)
        self.parent.accessCheck(sender)
        settings = list(self.getSettings())
        settings[0] = version
        self.update(settings)

    # short

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_ICMPTYPE,
                         out_signature='s')
    @dbus_handle_exceptions
    def getShort(self, sender=None): # pylint: disable=W0613
        log.debug1("%s.getShort()", self._log_prefix)
        return self.getSettings()[1]

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_ICMPTYPE,
                         in_signature='s')
    @dbus_handle_exceptions
    def setShort(self, short, sender=None):
        short = dbus_to_python(short, str)
        log.debug1("%s.setShort('%s')", self._log_prefix, short)
        self.parent.accessCheck(sender)
        settings = list(self.getSettings())
        settings[1] = short
        self.update(settings)

    # description

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_ICMPTYPE,
                         out_signature='s')
    @dbus_handle_exceptions
    def getDescription(self, sender=None): # pylint: disable=W0613
        log.debug1("%s.getDescription()", self._log_prefix)
        return self.getSettings()[2]

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_ICMPTYPE,
                         in_signature='s')
    @dbus_handle_exceptions
    def setDescription(self, description, sender=None):
        description = dbus_to_python(description, str)
        log.debug1("%s.setDescription('%s')", self._log_prefix,
                   description)
        self.parent.accessCheck(sender)
        settings = list(self.getSettings())
        settings[2] = description
        self.update(settings)

    # destination

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_ICMPTYPE,
                         out_signature='as')
    @dbus_handle_exceptions
    def getDestinations(self, sender=None): # pylint: disable=W0613
        log.debug1("%s.getDestinations()", self._log_prefix)
        return sorted(self.getSettings()[3])

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_ICMPTYPE,
                         in_signature='as')
    @dbus_handle_exceptions
    def setDestinations(self, destinations, sender=None):
        destinations = dbus_to_python(destinations, list)
        log.debug1("%s.setDestinations('[%s]')", self._log_prefix,
                   ",".join(destinations))
        self.parent.accessCheck(sender)
        settings = list(self.getSettings())
        settings[3] = destinations
        self.update(settings)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_ICMPTYPE,
                         in_signature='s')
    @dbus_handle_exceptions
    def addDestination(self, destination, sender=None):
        destination = dbus_to_python(destination, str)
        log.debug1("%s.addDestination('%s')", self._log_prefix,
                   destination)
        self.parent.accessCheck(sender)
        settings = list(self.getSettings())
        if destination in settings[3]:
            raise FirewallError(errors.ALREADY_ENABLED, destination)
        settings[3].append(destination)
        self.update(settings)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_ICMPTYPE,
                         in_signature='s')
    @dbus_handle_exceptions
    def removeDestination(self, destination, sender=None):
        destination = dbus_to_python(destination, str)
        log.debug1("%s.removeDestination('%s')", self._log_prefix,
                   destination)
        self.parent.accessCheck(sender)
        settings = list(self.getSettings())
        if settings[3]:
            if destination not in settings[3]:
                raise FirewallError(errors.NOT_ENABLED, destination)
            else:
                settings[3].remove(destination)
        else:  # empty means all
            settings[3] = list(set(['ipv4', 'ipv6']) -
                               set([destination]))
        self.update(settings)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_ICMPTYPE,
                         in_signature='s', out_signature='b')
    @dbus_handle_exceptions
    def queryDestination(self, destination, sender=None): # pylint: disable=W0613
        destination = dbus_to_python(destination, str)
        log.debug1("%s.queryDestination('%s')", self._log_prefix,
                   destination)
        settings = self.getSettings()
        # empty means all
        return (not settings[3] or
                destination in settings[3])
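# Illustrative sketch, not part of config_icmptype.py above: the destination
# handling in removeDestination()/queryDestination() treats an empty list as
# "both ipv4 and ipv6".  The helper below is hypothetical and only restates
# that set logic in isolation so the semantics are easy to follow.
def _remove_destination(destinations, destination):
    """Return the destination list after removing 'destination'."""
    if destinations:
        if destination not in destinations:
            # the server raises FirewallError(errors.NOT_ENABLED, ...) here
            raise ValueError(destination)
        return [d for d in destinations if d != destination]
    # empty means all: removing one family leaves the other explicitly listed
    return list(set(['ipv4', 'ipv6']) - set([destination]))

# _remove_destination([], 'ipv4')        == ['ipv6']
# _remove_destination(['ipv6'], 'ipv6')  == []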
site-packages/firewall/server/config_policy.py
# -*- coding: utf-8 -*-
#
# SPDX-License-Identifier: GPL-2.0-or-later

# force use of pygobject3 in python-slip
from gi.repository import GObject
import sys
sys.modules['gobject'] = GObject

import dbus
import dbus.service
import slip.dbus
import slip.dbus.service

from firewall import config
from firewall.dbus_utils import dbus_to_python, \
    dbus_introspection_prepare_properties, \
    dbus_introspection_add_properties
from firewall.core.logger import log
from firewall.server.decorators import handle_exceptions, \
    dbus_handle_exceptions, dbus_service_method

class FirewallDConfigPolicy(slip.dbus.service.Object):
    persistent = True
    default_polkit_auth_required = config.dbus.PK_ACTION_CONFIG

    @handle_exceptions
    def __init__(self, parent, conf, policy, item_id, *args, **kwargs):
        super(FirewallDConfigPolicy, self).__init__(*args, **kwargs)
        self.parent = parent
        self.config = conf
        self.obj = policy
        self.item_id = item_id
        self.busname = args[0]
        self.path = args[1]
        self._log_prefix = "config.policy.%d" % self.item_id
        dbus_introspection_prepare_properties(
            self, config.dbus.DBUS_INTERFACE_CONFIG_POLICY)

    @dbus_handle_exceptions
    def __del__(self):
        pass

    @dbus_handle_exceptions
    def unregister(self):
        self.remove_from_connection()

    # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # #

    # P R O P E R T I E S

    @dbus_handle_exceptions
    def _get_property(self, property_name):
        if property_name == "name":
            return dbus.String(self.obj.name)
        elif property_name == "filename":
            return dbus.String(self.obj.filename)
        elif property_name == "path":
            return dbus.String(self.obj.path)
        elif property_name == "default":
            return dbus.Boolean(self.obj.default)
        elif property_name == "builtin":
            return dbus.Boolean(self.obj.builtin)
        else:
            raise dbus.exceptions.DBusException(
                "org.freedesktop.DBus.Error.InvalidArgs: "
                "Property '%s' does not exist" % property_name)

    @dbus_service_method(dbus.PROPERTIES_IFACE, in_signature='ss',
                         out_signature='v')
    @dbus_handle_exceptions
    def Get(self, interface_name, property_name, sender=None):
        # get a property
        interface_name = dbus_to_python(interface_name, str)
        property_name = dbus_to_python(property_name, str)
        log.debug1("%s.Get('%s', '%s')", self._log_prefix,
                   interface_name, property_name)

        if interface_name != config.dbus.DBUS_INTERFACE_CONFIG_POLICY:
            raise dbus.exceptions.DBusException(
                "org.freedesktop.DBus.Error.UnknownInterface: "
                "Interface '%s' does not exist" % interface_name)

        return self._get_property(property_name)

    @dbus_service_method(dbus.PROPERTIES_IFACE, in_signature='s',
                         out_signature='a{sv}')
    @dbus_handle_exceptions
    def GetAll(self, interface_name, sender=None):
        interface_name = dbus_to_python(interface_name, str)
        log.debug1("%s.GetAll('%s')", self._log_prefix, interface_name)

        if interface_name != config.dbus.DBUS_INTERFACE_CONFIG_POLICY:
            raise dbus.exceptions.DBusException(
                "org.freedesktop.DBus.Error.UnknownInterface: "
                "Interface '%s' does not exist" % interface_name)

        ret = { }
        for x in [ "name", "filename", "path", "default", "builtin" ]:
            ret[x] = self._get_property(x)
        return dbus.Dictionary(ret, signature="sv")

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_CONFIG)
    @dbus_service_method(dbus.PROPERTIES_IFACE, in_signature='ssv')
    @dbus_handle_exceptions
    def Set(self, interface_name, property_name, new_value, sender=None):
        interface_name = dbus_to_python(interface_name, str)
        property_name = dbus_to_python(property_name, str)
        new_value = dbus_to_python(new_value)
        log.debug1("%s.Set('%s', '%s', '%s')", self._log_prefix,
                   interface_name, property_name, new_value)
        self.parent.accessCheck(sender)

        if interface_name != config.dbus.DBUS_INTERFACE_CONFIG_POLICY:
            raise dbus.exceptions.DBusException(
                "org.freedesktop.DBus.Error.UnknownInterface: "
                "Interface '%s' does not exist" % interface_name)

        raise dbus.exceptions.DBusException(
            "org.freedesktop.DBus.Error.PropertyReadOnly: "
            "Property '%s' is read-only" % property_name)

    @dbus.service.signal(dbus.PROPERTIES_IFACE, signature='sa{sv}as')
    def PropertiesChanged(self, interface_name, changed_properties,
                          invalidated_properties):
        interface_name = dbus_to_python(interface_name, str)
        changed_properties = dbus_to_python(changed_properties)
        invalidated_properties = dbus_to_python(invalidated_properties)
        log.debug1("%s.PropertiesChanged('%s', '%s', '%s')", self._log_prefix,
                   interface_name, changed_properties, invalidated_properties)

    @slip.dbus.polkit.require_auth(config.dbus.PK_ACTION_INFO)
    @dbus_service_method(dbus.INTROSPECTABLE_IFACE, out_signature='s')
    @dbus_handle_exceptions
    def Introspect(self, sender=None):
        log.debug2("%s.Introspect()", self._log_prefix)

        data = super(FirewallDConfigPolicy, self).Introspect(
            self.path, self.busname.get_bus())

        return dbus_introspection_add_properties(
            self, data, config.dbus.DBUS_INTERFACE_CONFIG_POLICY)

    # S E T T I N G S

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_POLICY,
                         out_signature="a{sv}")
    @dbus_handle_exceptions
    def getSettings(self, sender=None):
        """get settings for policy
        """
        log.debug1("%s.getSettings()", self._log_prefix)
        settings = self.config.get_policy_object_config_dict(self.obj)
        return settings

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_POLICY,
                         in_signature="a{sv}")
    @dbus_handle_exceptions
    def update(self, settings, sender=None):
        """update settings for policy
        """
        settings = dbus_to_python(settings)
        log.debug1("%s.update('...')", self._log_prefix)
        self.parent.accessCheck(sender)
        self.obj = self.config.set_policy_object_config_dict(self.obj, settings)
        self.Updated(self.obj.name)

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_POLICY)
    @dbus_handle_exceptions
    def loadDefaults(self, sender=None):
        """load default settings for builtin policy
        """
        log.debug1("%s.loadDefaults()", self._log_prefix)
        self.parent.accessCheck(sender)
        self.obj = self.config.load_policy_object_defaults(self.obj)
        self.Updated(self.obj.name)

    @dbus.service.signal(config.dbus.DBUS_INTERFACE_CONFIG_POLICY, signature='s')
    @dbus_handle_exceptions
    def Updated(self, name):
        log.debug1("%s.Updated('%s')" % (self._log_prefix, name))

    # R E M O V E

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_POLICY)
    @dbus_handle_exceptions
    def remove(self, sender=None):
        """remove policy
        """
        log.debug1("%s.removePolicy()", self._log_prefix)
        self.parent.accessCheck(sender)
        self.config.remove_policy_object(self.obj)
        self.parent.removePolicy(self.obj)

    @dbus.service.signal(config.dbus.DBUS_INTERFACE_CONFIG_POLICY, signature='s')
    @dbus_handle_exceptions
    def Removed(self, name):
        log.debug1("%s.Removed('%s')" % (self._log_prefix, name))

    # R E N A M E

    @dbus_service_method(config.dbus.DBUS_INTERFACE_CONFIG_POLICY,
                         in_signature='s')
    @dbus_handle_exceptions
    def rename(self, name, sender=None):
        """rename policy
        """
        name = dbus_to_python(name, str)
        log.debug1("%s.rename('%s')", self._log_prefix, name)
        self.parent.accessCheck(sender)
        self.obj = self.config.rename_policy_object(self.obj, name)
        self.Renamed(name)

    @dbus.service.signal(config.dbus.DBUS_INTERFACE_CONFIG_POLICY, signature='s')
    @dbus_handle_exceptions
    def Renamed(self, name):
        log.debug1("%s.Renamed('%s')" % (self._log_prefix, name))
site-packages/firewall/__pycache__/client.cpython-36.opt-1.pyc
# [Compiled CPython 3.6 bytecode of firewall/client.py; the binary payload is
#  not reproduced here.  The readable marshalled names include the
#  handle_exceptions decorator and the classes FirewallClientZoneSettings,
#  FirewallClientConfigZone, FirewallClientPolicySettings,
#  FirewallClientConfigPolicy, FirewallClientServiceSettings,
#  FirewallClientIPSetSettings and FirewallClientConfigIPSet.]
dS)N)r�zorg.freedesktop.DBus.Properties)r�r�r�rrr�r�r��DBUS_INTERFACE_CONFIG_IPSET�fw_ipsetr�)r=r�r�rrr r@wsz"FirewallClientConfigIPSet.__init__cCst|jjtjj|��S)N)r	r�r�rrr#)r=r�rrr r��sz&FirewallClientConfigIPSet.get_propertycCst|jjtjj��S)N)r	r�r�rrr#)r=rrr r��sz(FirewallClientConfigIPSet.get_propertiescCs|jjtjj||�dS)N)r�r�rrr#)r=r�rErrr r��sz&FirewallClientConfigIPSet.set_propertycCsttt|jj����S)N)rr9r	r$r�)r=rrr r��sz%FirewallClientConfigIPSet.getSettingscCs|jjt|j��dS)N)r$r��tupler5)r=r5rrr r��sz FirewallClientConfigIPSet.updatecCs|jj�dS)N)r$r�)r=rrr r��sz&FirewallClientConfigIPSet.loadDefaultscCs|jj�dS)N)r$rc)r=rrr rc�sz FirewallClientConfigIPSet.removecCs|jj|�dS)N)r$r�)r=r�rrr r��sz FirewallClientConfigIPSet.renamecCs
|jj�S)N)r$rP)r=rrr rP�sz$FirewallClientConfigIPSet.getVersioncCs|jj|�dS)N)r$rQ)r=r$rrr rQ�sz$FirewallClientConfigIPSet.setVersioncCs
|jj�S)N)r$rS)r=rrr rS�sz"FirewallClientConfigIPSet.getShortcCs|jj|�dS)N)r$rT)r=r%rrr rT�sz"FirewallClientConfigIPSet.setShortcCs
|jj�S)N)r$rV)r=rrr rV�sz(FirewallClientConfigIPSet.getDescriptioncCs|jj|�dS)N)r$rW)r=r&rrr rW�sz(FirewallClientConfigIPSet.setDescriptioncCs
|jj�S)N)r$r)r=rrr r�sz$FirewallClientConfigIPSet.getEntriescCs|jj|�dS)N)r$r)r=rrrr r�sz$FirewallClientConfigIPSet.setEntriescCs|jj|�dS)N)r$r)r=rrrr r�sz"FirewallClientConfigIPSet.addEntrycCs|jj|�dS)N)r$r )r=rrrr r �sz%FirewallClientConfigIPSet.removeEntrycCs|jj|�S)N)r$r!)r=rrrr r!�sz$FirewallClientConfigIPSet.queryEntryN)r�r�r�r!r@r�rr�r�r�r�r�r�r�r�rcr�rPrQrSrTrVrWrrrr r!rrrr r"vsNr"c@s�eZdZed$dd��Zedd��Zedd��Zedd	��Zed
d��Zedd
��Z	edd��Z
edd��Zedd��Zedd��Z
edd��Zedd��Zedd��Zedd��Zedd��Zed d!��Zed"d#��ZdS)%�FirewallClientHelperSettingsNcCs"|r||_ndddddgg|_dS)Nr#)r5)r=r5rrr r@�sz%FirewallClientHelperSettings.__init__cCsd|j|jfS)Nz%s(%r))rAr5)r=rrr rB�sz%FirewallClientHelperSettings.__repr__cCs
|jdS)Nr)r5)r=rrr rP�sz'FirewallClientHelperSettings.getVersioncCs||jd<dS)Nr)r5)r=r$rrr rQ�sz'FirewallClientHelperSettings.setVersioncCs
|jdS)NrR)r5)r=rrr rSsz%FirewallClientHelperSettings.getShortcCs||jd<dS)NrR)r5)r=r%rrr rTsz%FirewallClientHelperSettings.setShortcCs
|jdS)NrU)r5)r=rrr rV	sz+FirewallClientHelperSettings.getDescriptioncCs||jd<dS)NrU)r5)r=r&rrr rWsz+FirewallClientHelperSettings.setDescriptioncCs
|jdS)Nr�)r5)r=rrr �	getFamilysz&FirewallClientHelperSettings.getFamilycCs |dkrd|jd<||jd<dS)Nr#r�)r5)r=�ipvrrr �	setFamilys
z&FirewallClientHelperSettings.setFamilycCs
|jdS)NrX)r5)r=rrr �	getModulesz&FirewallClientHelperSettings.getModulecCs||jd<dS)NrX)r5)r=r�rrr �	setModulesz&FirewallClientHelperSettings.setModulecCs
|jdS)Nr\)r5)r=rrr rh sz%FirewallClientHelperSettings.getPortscCs||jd<dS)Nr\)r5)r=r*rrr ri#sz%FirewallClientHelperSettings.setPortscCs@||f|jdkr(|jdj||f�nttjd||f��dS)Nr\z'%s:%s')r5r_rrr`)r=rjrkrrr rl&sz$FirewallClientHelperSettings.addPortcCs@||f|jdkr(|jdj||f�nttjd||f��dS)Nr\z'%s:%s')r5rcrrrd)r=rjrkrrr rm-sz'FirewallClientHelperSettings.removePortcCs||f|jdkS)Nr\)r5)r=rjrkrrr rn4sz&FirewallClientHelperSettings.queryPort)N)r�r�r�r!r@rBrPrQrSrTrVrWr'r)r*r+rhrirlrmrnrrrr r&�s$r&c@seZdZedd��Zejjjedd���Z	ejjjedd���Z
ejjjedd���Zejjjed	d
���Zejjjedd���Z
ejjjed
d���Zejjjedd���Zejjjedd���Zejjjedd���Zejjjedd���Zejjjedd���Zejjjedd���Zejjjedd���Zejjjedd���Zejjjedd ���Zejjjed!d"���Zejjjed#d$���Zejjjed%d&���Zejjjed'd(���Zejjjed)d*���Zejjjed+d,���Zejjjed-d.���Zejjjed/d0���Zd1S)2�FirewallClientConfigHelpercCsL||_||_|jjtjj|�|_tj|jtjjd�|_	tj|jdd�|_
dS)N)r�zorg.freedesktop.DBus.Properties)r�r�r�rrr�r�r��DBUS_INTERFACE_CONFIG_HELPER�	fw_helperr�)r=r�r�rrr r@;sz#FirewallClientConfigHelper.__init__cCst|jjtjj|��S)N)r	r�r�rrr-)r=r�rrr r�Fsz'FirewallClientConfigHelper.get_propertycCst|jjtjj��S)N)r	r�r�rrr-)r=rrr r�Lsz)FirewallClientConfigHelper.get_propertiescCs|jjtjj||�dS)N)r�r�rrr-)r=r�rErrr r�Rsz'FirewallClientConfigHelper.set_propertycCsttt|jj����S)N)r&r9r	r.r�)r=rrr r�Xsz&FirewallClientConfigHelper.getSettingscCs|jjt|j��dS)N)r.r�r%r5)r=r5rrr r�^sz!FirewallClientConfigHelper.updatecCs|jj�dS)N)r.r�)r=rrr r�csz'FirewallClientConfigHelper.loadDefaultscCs|jj�dS)N)r.rc)r=rrr rchsz!FirewallClientConfigHelper.removecCs|jj|�dS)N)r.r�)r=r�rrr r�msz!FirewallClientConfigHelper.renamecCs
|jj�S)N)r.rP)r=rrr rPtsz%FirewallClientConfigHelper.getVersioncCs|jj|�dS)N)r.rQ)r=r$rrr rQysz%FirewallClientConfigHelper.setVersioncCs
|jj�S)N)r.rS)r=rrr rS�sz#FirewallClientConfigHelper.getShortcCs|jj|�dS)N)r.rT)r=r%rrr rT�sz#FirewallClientConfigHelper.setShortcCs
|jj�S)N)r.rV)r=rrr rV�sz)FirewallClientConfigHelper.getDescriptioncCs|jj|�dS)N)r.rW)r=r&rrr rW�sz)FirewallClientConfigHelper.setDescriptioncCs
|jj�S)N)r.rh)r=rrr rh�sz#FirewallClientConfigHelper.getPortscCs|jj|�dS)N)r.ri)r=r*rrr ri�sz#FirewallClientConfigHelper.setPortscCs|jj||�dS)N)r.rl)r=rjrkrrr rl�sz"FirewallClientConfigHelper.addPortcCs|jj||�dS)N)r.rm)r=rjrkrrr rm�sz%FirewallClientConfigHelper.removePortcCs|jj||�S)N)r.rn)r=rjrkrrr rn�sz$FirewallClientConfigHelper.queryPortcCs
|jj�S)N)r.r')r=rrr r'�sz$FirewallClientConfigHelper.getFamilycCs$|dkr|jjd�|jj|�dS)Nr#)r.r))r=r(rrr r)�sz$FirewallClientConfigHelper.setFamilycCs
|jj�S)N)r.r*)r=rrr r*�sz$FirewallClientConfigHelper.getModulecCs|jj|�dS)N)r.r+)r=r�rrr r+�sz$FirewallClientConfigHelper.setModuleN) r�r�r�r!r@r�rr�r�r�r�r�r�r�r�rcr�rPrQrSrTrVrWrhrirlrmrnr'r)r*r+rrrr r,:s^r,c@s�eZdZedd��Zejjjedd���Z	ejjjedd���Z
ejjjedd���Zejjjed	d
���Zejjjedd���Z
ejjjed
d���Zejjjedd���Zejjjedd���Zejjjedd���Zejjjedd���Zejjjedd���Zejjjedd���Zejjjedd���Zejjjedd���Zejjjedd ���Zejjjed!d"���Zejjjed#d$���Zejjjed%d&���Zejjjed'd(���Zejjjed)d*���Zejjjed+d,���Zejjjed-d.���Zejjjed/d0���Zejjjed1d2���Z ejjjed3d4���Z!ejjjed5d6���Z"ejjjed7d8���Z#ejjjed9d:���Z$ejjjed;d<���Z%ejjjed=d>���Z&ejjjed?d@���Z'ejjjedAdB���Z(ejjjedCdD���Z)ejjjedEdF���Z*ejjjedGdH���Z+ejjjedIdJ���Z,ejjjedKdL���Z-ejjjedMdN���Z.ejjjed^dPdQ���Z/ejjjedRdS���Z0ejjjedTdU���Z1ejjjedVdW���Z2ejjjedXdY���Z3ejjjedZd[���Z4ejjjed\d]���Z5dOS)_�FirewallClientConfigServicecCsL||_||_|jjtjj|�|_tj|jtjjd�|_	tj|jdd�|_
dS)N)r�zorg.freedesktop.DBus.Properties)r�r�r�rrr�r�r��DBUS_INTERFACE_CONFIG_SERVICE�
fw_servicer�)r=r�r�rrr r@�sz$FirewallClientConfigService.__init__cCst|jjtjj|��S)N)r	r�r�rrr0)r=r�rrr r��sz(FirewallClientConfigService.get_propertycCst|jjtjj��S)N)r	r�r�rrr0)r=rrr r��sz*FirewallClientConfigService.get_propertiescCs|jjtjj||�dS)N)r�r�rrr0)r=r�rErrr r��sz(FirewallClientConfigService.set_propertycCstt|jj���S)N)r�r	r1r�)r=rrr r��sz'FirewallClientConfigService.getSettingscCs|jj|j��dS)N)r1r�rM)r=r5rrr r��sz"FirewallClientConfigService.updatecCs|jj�dS)N)r1r�)r=rrr r��sz(FirewallClientConfigService.loadDefaultscCs|jj�dS)N)r1rc)r=rrr rc�sz"FirewallClientConfigService.removecCs|jj|�dS)N)r1r�)r=r�rrr r�sz"FirewallClientConfigService.renamecCs
|jj�S)N)r1rP)r=rrr rPsz&FirewallClientConfigService.getVersioncCs|jj|�dS)N)r1rQ)r=r$rrr rQsz&FirewallClientConfigService.setVersioncCs
|jj�S)N)r1rS)r=rrr rSsz$FirewallClientConfigService.getShortcCs|jj|�dS)N)r1rT)r=r%rrr rTsz$FirewallClientConfigService.setShortcCs
|jj�S)N)r1rV)r=rrr rVsz*FirewallClientConfigService.getDescriptioncCs|jj|�dS)N)r1rW)r=r&rrr rW$sz*FirewallClientConfigService.setDescriptioncCs
|jj�S)N)r1rh)r=rrr rh+sz$FirewallClientConfigService.getPortscCs|jj|�dS)N)r1ri)r=r*rrr ri0sz$FirewallClientConfigService.setPortscCs|jj||�dS)N)r1rl)r=rjrkrrr rl5sz#FirewallClientConfigService.addPortcCs|jj||�dS)N)r1rm)r=rjrkrrr rm:sz&FirewallClientConfigService.removePortcCs|jj||�S)N)r1rn)r=rjrkrrr rn?sz%FirewallClientConfigService.queryPortcCs
|jj�S)N)r1rp)r=rrr rpFsz(FirewallClientConfigService.getProtocolscCs|jj|�dS)N)r1rq)r=r0rrr rqKsz(FirewallClientConfigService.setProtocolscCs|jj|�dS)N)r1rr)r=rkrrr rrPsz'FirewallClientConfigService.addProtocolcCs|jj|�dS)N)r1rs)r=rkrrr rsUsz*FirewallClientConfigService.removeProtocolcCs|jj|�S)N)r1rt)r=rkrrr rtZsz)FirewallClientConfigService.queryProtocolcCs
|jj�S)N)r1rv)r=rrr rvasz*FirewallClientConfigService.getSourcePortscCs|jj|�dS)N)r1rw)r=r*rrr rwfsz*FirewallClientConfigService.setSourcePortscCs|jj||�dS)N)r1rx)r=rjrkrrr rxksz)FirewallClientConfigService.addSourcePortcCs|jj||�dS)N)r1ry)r=rjrkrrr rypsz,FirewallClientConfigService.removeSourcePortcCs|jj||�S)N)r1rz)r=rjrkrrr rzusz+FirewallClientConfigService.querySourcePortcCs
|jj�S)N)r1r�)r=rrr r�|sz&FirewallClientConfigService.getModulescCs|jj|�dS)N)r1r�)r=r�rrr r��sz&FirewallClientConfigService.setModulescCs|jj|�dS)N)r1r�)r=r�rrr r��sz%FirewallClientConfigService.addModulecCs|jj|�dS)N)r1r�)r=r�rrr r��sz(FirewallClientConfigService.removeModulecCs|jj|�S)N)r1r�)r=r�rrr r��sz'FirewallClientConfigService.queryModulecCs
|jj�S)N)r1r�)r=rrr r��sz+FirewallClientConfigService.getDestinationscCs|jj|�dS)N)r1r�)r=r�rrr r��sz+FirewallClientConfigService.setDestinationscCs|jj|�S)N)r1�getDestination)r=r�rrr r2�sz*FirewallClientConfigService.getDestinationcCs|jj||�dS)N)r1r)r=r�rrrr r�sz*FirewallClientConfigService.setDestinationNcCs:|dk	r*|j|�|kr*ttjd||f��|jj|�dS)Nz'%s:%s')r2rrrdr1r)r=r�rrrr r�sz-FirewallClientConfigService.removeDestinationcCs|jj||�S)N)r1r)r=r�rrrr r�sz,FirewallClientConfigService.queryDestinationcCs
|jj�S)N)r1r)r=rrr r�sz'FirewallClientConfigService.getIncludescCs|jj|�dS)N)r1r)r=r�rrr r�sz'FirewallClientConfigService.setIncludescCs|jj|�dS)N)r1r)r=rrrr r�sz&FirewallClientConfigService.addIncludecCs|jj|�dS)N)r1r)r=rrrr r�sz)FirewallClientConfigService.removeIncludecCs|jj|�S)N)r1r	)r=rrrr r	�sz(FirewallClientConfigService.queryInclude)N)6r�r�r�r!r@r�rr�r�r�r�r�r�r�r�rcr�rPrQrSrTrVrWrhrirlrmrnrprqrrrsrtrvrwrxryrzr�r�r�r�r�r�r�r2rrrrrrrr	rrrr r/�s�r/c@s�eZdZeddd��Zedd��Zedd��Zedd	��Zed
d��Zedd
��Z	edd��Z
edd��Zedd��Zedd��Z
edd��Zedd��Zedd��ZdS)�FirewallClientIcmpTypeSettingsNcCs|r||_ndddgg|_dS)Nr#)r5)r=r5rrr r@�sz'FirewallClientIcmpTypeSettings.__init__cCsd|j|jfS)Nz%s(%r))rAr5)r=rrr rB�sz'FirewallClientIcmpTypeSettings.__repr__cCs
|jdS)Nr)r5)r=rrr rP�sz)FirewallClientIcmpTypeSettings.getVersioncCs||jd<dS)Nr)r5)r=r$rrr rQ�sz)FirewallClientIcmpTypeSettings.setVersioncCs
|jdS)NrR)r5)r=rrr rS�sz'FirewallClientIcmpTypeSettings.getShortcCs||jd<dS)NrR)r5)r=r%rrr rT�sz'FirewallClientIcmpTypeSettings.setShortcCs
|jdS)NrU)r5)r=rrr rV�sz-FirewallClientIcmpTypeSettings.getDescriptioncCs||jd<dS)NrU)r5)r=r&rrr rW�sz-FirewallClientIcmpTypeSettings.setDescriptioncCs
|jdS)Nr�)r5)r=rrr r��sz.FirewallClientIcmpTypeSettings.getDestinationscCs||jd<dS)Nr�)r5)r=r�rrr r��sz.FirewallClientIcmpTypeSettings.setDestinationscCsH|jdsttj|��n,||jdkr8|jdj|�nttj|��dS)Nr�)r5rrr`r_)r=r�rrr �addDestination�s

z-FirewallClientIcmpTypeSettings.addDestinationcCs\||jdkr |jdj|�n8|jdsL|jttddg�t|g���nttj|��dS)Nr�Zipv4Zipv6)r5rcr�r9�setrrrd)r=r�rrr r	s
z0FirewallClientIcmpTypeSettings.removeDestinationcCs|jdp||jdkS)Nr�)r5)r=r�rrr r	sz/FirewallClientIcmpTypeSettings.queryDestination)N)r�r�r�r!r@rBrPrQrSrTrVrWr�r�r4rrrrrr r3�s	r3c@s�eZdZedd��Zejjjedd���Z	ejjjedd���Z
ejjjedd���Zejjjed	d
���Zejjjedd���Z
ejjjed
d���Zejjjedd���Zejjjedd���Zejjjedd���Zejjjedd���Zejjjedd���Zejjjedd���Zejjjedd���Zejjjedd���Zejjjedd ���Zejjjed!d"���Zejjjed#d$���Zejjjed%d&���Zejjjed'd(���Zd)S)*�FirewallClientConfigIcmpTypecCsL||_||_|jjtjj|�|_tj|jtjjd�|_	tj|jdd�|_
dS)N)r�zorg.freedesktop.DBus.Properties)r�r�r�rrr�r�r��DBUS_INTERFACE_CONFIG_ICMPTYPE�fw_icmptyper�)r=r�r�rrr r@	sz%FirewallClientConfigIcmpType.__init__cCst|jjtjj|��S)N)r	r�r�rrr7)r=r�rrr r�%	sz)FirewallClientConfigIcmpType.get_propertycCst|jjtjj��S)N)r	r�r�rrr7)r=rrr r�+	sz+FirewallClientConfigIcmpType.get_propertiescCs|jjtjj||�dS)N)r�r�rrr7)r=r�rErrr r�1	sz)FirewallClientConfigIcmpType.set_propertycCsttt|jj����S)N)r3r9r	r8r�)r=rrr r�7	sz(FirewallClientConfigIcmpType.getSettingscCs|jjt|j��dS)N)r8r�r%r5)r=r5rrr r�=	sz#FirewallClientConfigIcmpType.updatecCs|jj�dS)N)r8r�)r=rrr r�B	sz)FirewallClientConfigIcmpType.loadDefaultscCs|jj�dS)N)r8rc)r=rrr rcG	sz#FirewallClientConfigIcmpType.removecCs|jj|�dS)N)r8r�)r=r�rrr r�L	sz#FirewallClientConfigIcmpType.renamecCs
|jj�S)N)r8rP)r=rrr rPS	sz'FirewallClientConfigIcmpType.getVersioncCs|jj|�dS)N)r8rQ)r=r$rrr rQX	sz'FirewallClientConfigIcmpType.setVersioncCs
|jj�S)N)r8rS)r=rrr rS_	sz%FirewallClientConfigIcmpType.getShortcCs|jj|�dS)N)r8rT)r=r%rrr rTd	sz%FirewallClientConfigIcmpType.setShortcCs
|jj�S)N)r8rV)r=rrr rVk	sz+FirewallClientConfigIcmpType.getDescriptioncCs|jj|�dS)N)r8rW)r=r&rrr rWp	sz+FirewallClientConfigIcmpType.setDescriptioncCs
|jj�S)N)r8r�)r=rrr r�w	sz,FirewallClientConfigIcmpType.getDestinationscCs|jj|�dS)N)r8r�)r=r�rrr r�|	sz,FirewallClientConfigIcmpType.setDestinationscCs|jj|�dS)N)r8r4)r=r�rrr r4�	sz+FirewallClientConfigIcmpType.addDestinationcCs|jj|�dS)N)r8r)r=r�rrr r�	sz.FirewallClientConfigIcmpType.removeDestinationcCs|jj|�S)N)r8r)r=r�rrr r�	sz-FirewallClientConfigIcmpType.queryDestinationN)r�r�r�r!r@r�rr�r�r�r�r�r�r�r�rcr�rPrQrSrTrVrWr�r�r4rrrrrr r6	sNr6c@seZdZed.dd��Zedd��Zedd��Zedd	��Zed
d��Zedd
��Z	edd��Z
edd��Zedd��Zedd��Z
edd��Zedd��Zedd��Zedd��Zedd��Zed d!��Zed"d#��Zed$d%��Zed&d'��Zed(d)��Zed*d+��Zed,d-��ZdS)/�'FirewallClientPoliciesLockdownWhitelistNcCs|r||_nggggg|_dS)N)r5)r=r5rrr r@�	sz0FirewallClientPoliciesLockdownWhitelist.__init__cCsd|j|jfS)Nz%s(%r))rAr5)r=rrr rB�	sz0FirewallClientPoliciesLockdownWhitelist.__repr__cCs
|jdS)Nr)r5)r=rrr �getCommands�	sz3FirewallClientPoliciesLockdownWhitelist.getCommandscCs||jd<dS)Nr)r5)r=Zcommandsrrr �setCommands�	sz3FirewallClientPoliciesLockdownWhitelist.setCommandscCs"||jdkr|jdj|�dS)Nr)r5r_)r=�commandrrr �
addCommand�	sz2FirewallClientPoliciesLockdownWhitelist.addCommandcCs"||jdkr|jdj|�dS)Nr)r5rc)r=r<rrr �
removeCommand�	sz5FirewallClientPoliciesLockdownWhitelist.removeCommandcCs||jdkS)Nr)r5)r=r<rrr �queryCommand�	sz4FirewallClientPoliciesLockdownWhitelist.queryCommandcCs
|jdS)NrR)r5)r=rrr �getContexts�	sz3FirewallClientPoliciesLockdownWhitelist.getContextscCs||jd<dS)NrR)r5)r=Zcontextsrrr �setContexts�	sz3FirewallClientPoliciesLockdownWhitelist.setContextscCs"||jdkr|jdj|�dS)NrR)r5r_)r=�contextrrr �
addContext�	sz2FirewallClientPoliciesLockdownWhitelist.addContextcCs"||jdkr|jdj|�dS)NrR)r5rc)r=rBrrr �
removeContext�	sz5FirewallClientPoliciesLockdownWhitelist.removeContextcCs||jdkS)NrR)r5)r=rBrrr �queryContext�	sz4FirewallClientPoliciesLockdownWhitelist.queryContextcCs
|jdS)NrU)r5)r=rrr �getUsers�	sz0FirewallClientPoliciesLockdownWhitelist.getUserscCs||jd<dS)NrU)r5)r=Zusersrrr �setUsers�	sz0FirewallClientPoliciesLockdownWhitelist.setUserscCs"||jdkr|jdj|�dS)NrU)r5r_)r=�userrrr �addUser�	sz/FirewallClientPoliciesLockdownWhitelist.addUsercCs"||jdkr|jdj|�dS)NrU)r5rc)r=rHrrr �
removeUser�	sz2FirewallClientPoliciesLockdownWhitelist.removeUsercCs||jdkS)NrU)r5)r=rHrrr �	queryUser�	sz1FirewallClientPoliciesLockdownWhitelist.queryUsercCs
|jdS)Nr�)r5)r=rrr �getUids�	sz/FirewallClientPoliciesLockdownWhitelist.getUidscCs||jd<dS)Nr�)r5)r=�uidsrrr �setUids�	sz/FirewallClientPoliciesLockdownWhitelist.setUidscCs"||jdkr|jdj|�dS)Nr�)r5r_)r=�uidrrr �addUid�	sz.FirewallClientPoliciesLockdownWhitelist.addUidcCs"||jdkr|jdj|�dS)Nr�)r5rc)r=rOrrr �	removeUid�	sz1FirewallClientPoliciesLockdownWhitelist.removeUidcCs||jdkS)Nr�)r5)r=rOrrr �queryUid�	sz0FirewallClientPoliciesLockdownWhitelist.queryUid)N)r�r�r�r!r@rBr:r;r=r>r?r@rArCrDrErFrGrIrJrKrLrNrPrQrRrrrr r9�	s.r9c@s�eZdZedd��Zejjjedd���Z	ejjjedd���Z
ejjjedd���Zejjjed	d
���Zejjjedd���Z
ejjjed
d���Zejjjedd���Zejjjedd���Zejjjedd���Zejjjedd���Zejjjedd���Zejjjedd���Zejjjedd���Zejjjedd���Zejjjedd ���Zejjjed!d"���Zejjjed#d$���Zejjjed%d&���Zejjjed'd(���Zd)S)*�FirewallClientConfigPoliciescCs8||_|jjtjjtjj�|_tj|jtjjd�|_	dS)N)r�)
r�r�rrr��DBUS_PATH_CONFIGr�r��DBUS_INTERFACE_CONFIG_POLICIES�fw_policies)r=r�rrr r@�	sz%FirewallClientConfigPolicies.__init__cCsttt|jj����S)N)r9r9r	rV�getLockdownWhitelist)r=rrr rW�	sz1FirewallClientConfigPolicies.getLockdownWhitelistcCs|jjt|j��dS)N)rV�setLockdownWhitelistr%r5)r=r5rrr rX�	sz1FirewallClientConfigPolicies.setLockdownWhitelistcCs|jj|�dS)N)rV�addLockdownWhitelistCommand)r=r<rrr rY
sz8FirewallClientConfigPolicies.addLockdownWhitelistCommandcCs|jj|�dS)N)rV�removeLockdownWhitelistCommand)r=r<rrr rZ
sz;FirewallClientConfigPolicies.removeLockdownWhitelistCommandcCst|jj|��S)N)r	rV�queryLockdownWhitelistCommand)r=r<rrr r[

sz:FirewallClientConfigPolicies.queryLockdownWhitelistCommandcCst|jj��S)N)r	rV�getLockdownWhitelistCommands)r=rrr r\
sz9FirewallClientConfigPolicies.getLockdownWhitelistCommandscCs|jj|�dS)N)rV�addLockdownWhitelistContext)r=rBrrr r]
sz8FirewallClientConfigPolicies.addLockdownWhitelistContextcCs|jj|�dS)N)rV�removeLockdownWhitelistContext)r=rBrrr r^
sz;FirewallClientConfigPolicies.removeLockdownWhitelistContextcCst|jj|��S)N)r	rV�queryLockdownWhitelistContext)r=rBrrr r_ 
sz:FirewallClientConfigPolicies.queryLockdownWhitelistContextcCst|jj��S)N)r	rV�getLockdownWhitelistContexts)r=rrr r`%
sz9FirewallClientConfigPolicies.getLockdownWhitelistContextscCs|jj|�dS)N)rV�addLockdownWhitelistUser)r=rHrrr ra,
sz5FirewallClientConfigPolicies.addLockdownWhitelistUsercCs|jj|�dS)N)rV�removeLockdownWhitelistUser)r=rHrrr rb1
sz8FirewallClientConfigPolicies.removeLockdownWhitelistUsercCst|jj|��S)N)r	rV�queryLockdownWhitelistUser)r=rHrrr rc6
sz7FirewallClientConfigPolicies.queryLockdownWhitelistUsercCst|jj��S)N)r	rV�getLockdownWhitelistUsers)r=rrr rd;
sz6FirewallClientConfigPolicies.getLockdownWhitelistUserscCst|jj��S)N)r	rV�getLockdownWhitelistUids)r=rrr reB
sz5FirewallClientConfigPolicies.getLockdownWhitelistUidscCs|jj|�dS)N)rV�setLockdownWhitelistUids)r=rMrrr rfG
sz5FirewallClientConfigPolicies.setLockdownWhitelistUidscCs|jj|�dS)N)rV�addLockdownWhitelistUid)r=rOrrr rgL
sz4FirewallClientConfigPolicies.addLockdownWhitelistUidcCs|jj|�dS)N)rV�removeLockdownWhitelistUid)r=rOrrr rhQ
sz7FirewallClientConfigPolicies.removeLockdownWhitelistUidcCst|jj|��S)N)r	rV�queryLockdownWhitelistUid)r=rOrrr riV
sz6FirewallClientConfigPolicies.queryLockdownWhitelistUidN)r�r�r�r!r@r�rr�r�rWrXrYrZr[r\r]r^r_r`rarbrcrdrerfrgrhrirrrr rS�	sN	rSc@seZdZed.dd��Zedd��Zedd��Zedd	��Zed
d��Zedd
��Z	edd��Z
edd��Zedd��Zedd��Z
edd��Zedd��Zedd��Zedd��Zedd��Zed d!��Zed"d#��Zed$d%��Zed&d'��Zed(d)��Zed*d+��Zed,d-��ZdS)/�FirewallClientDirectNcCs|r||_ngggg|_dS)N)r5)r=r5rrr r@^
szFirewallClientDirect.__init__cCsd|j|jfS)Nz%s(%r))rAr5)r=rrr rBe
szFirewallClientDirect.__repr__cCs
|jdS)Nr)r5)r=rrr �getAllChainsi
sz!FirewallClientDirect.getAllChainscs��fdd�|jdD�S)Ncs,g|]$}|d�kr|d�kr|d�qS)rrRrUr)r�r)r(�tablerr r�n
sz2FirewallClientDirect.getChains.<locals>.<listcomp>r)r5)r=r(rlr)r(rlr �	getChainsl
szFirewallClientDirect.getChainscCs||jd<dS)Nr)r5)r=Zchainsrrr �setAllChainsp
sz!FirewallClientDirect.setAllChainscCs,|||f}||jdkr(|jdj|�dS)Nr)r5r_)r=r(rl�chain�idxrrr �addChains
s
zFirewallClientDirect.addChaincCs,|||f}||jdkr(|jdj|�dS)Nr)r5rc)r=r(rlrorprrr �removeChainx
s
z FirewallClientDirect.removeChaincCs|||f}||jdkS)Nr)r5)r=r(rlrorprrr �
queryChain}
s
zFirewallClientDirect.queryChaincCs
|jdS)NrR)r5)r=rrr �getAllRules�
sz FirewallClientDirect.getAllRulescs���fdd�|jdD�S)Ncs<g|]4}|d�kr|d�kr|d�kr|dd��qS)rrRrUr�Nr)r�r)ror(rlrr r��
sz1FirewallClientDirect.getRules.<locals>.<listcomp>rR)r5)r=r(rlror)ror(rlr �getRules�
szFirewallClientDirect.getRulescCs||jd<dS)NrR)r5)r=r�rrr �setAllRules�
sz FirewallClientDirect.setAllRulescCs0|||||f}||jdkr,|jdj|�dS)NrR)r5r_)r=r(rlror�rrprrr �addRule�
szFirewallClientDirect.addRulecCs0|||||f}||jdkr,|jdj|�dS)NrR)r5rc)r=r(rlror�rrprrr �
removeRule�
szFirewallClientDirect.removeRulecCsPxJt|jd�D]8}|d|kr|d|kr|d|kr|jdj|�qWdS)NrRrrU)r9r5rc)r=r(rlrorprrr �removeRules�
s$z FirewallClientDirect.removeRulescCs|||||f}||jdkS)NrR)r5)r=r(rlror�rrprrr �	queryRule�
szFirewallClientDirect.queryRulecCs
|jdS)NrU)r5)r=rrr �getAllPassthroughs�
sz'FirewallClientDirect.getAllPassthroughscCs||jd<dS)NrU)r5)r=Zpassthroughsrrr �setAllPassthroughs�
sz'FirewallClientDirect.setAllPassthroughscCsg|jd<dS)NrU)r5)r=rrr �removeAllPassthroughs�
sz*FirewallClientDirect.removeAllPassthroughscs�fdd�|jdD�S)Ncs g|]}|d�kr|d�qS)rrRr)r�r)r(rr r��
sz8FirewallClientDirect.getPassthroughs.<locals>.<listcomp>rU)r5)r=r(r)r(r �getPassthroughs�
sz$FirewallClientDirect.getPassthroughscCs*||f}||jdkr&|jdj|�dS)NrU)r5r_)r=r(rrprrr �addPassthrough�
sz#FirewallClientDirect.addPassthroughcCs*||f}||jdkr&|jdj|�dS)NrU)r5rc)r=r(rrprrr �removePassthrough�
sz&FirewallClientDirect.removePassthroughcCs||f}||jdkS)NrU)r5)r=r(rrprrr �queryPassthrough�
sz%FirewallClientDirect.queryPassthrough)N)r�r�r�r!r@rBrkrmrnrqrrrsrtrurvrwrxryrzr{r|r}r~rr�r�rrrr rj]
s.rjc@s�eZdZedd��Zejjjedd���Z	ejjjedd���Z
ejjjedd���Zejjjed	d
���Zejjjedd���Z
ejjjed
d���Zejjjedd���Zejjjedd���Zejjjedd���Zejjjedd���Zejjjedd���Zejjjedd���Zejjjedd���Zejjjedd���Zejjjedd ���Zejjjed!d"���Zejjjed#d$���Zejjjed%d&���Zd'S)(�FirewallClientConfigDirectcCs8||_|jjtjjtjj�|_tj|jtjjd�|_	dS)N)r�)
r�r�rrr�rTr�r��DBUS_INTERFACE_CONFIG_DIRECT�	fw_direct)r=r�rrr r@�
sz#FirewallClientConfigDirect.__init__cCsttt|jj����S)N)rjr9r	r�r�)r=rrr r��
sz&FirewallClientConfigDirect.getSettingscCs|jjt|j��dS)N)r�r�r%r5)r=r5rrr r��
sz!FirewallClientConfigDirect.updatecCs|jj|||�dS)N)r�rq)r=r(rlrorrr rq�
sz#FirewallClientConfigDirect.addChaincCs|jj|||�dS)N)r�rr)r=r(rlrorrr rr�
sz&FirewallClientConfigDirect.removeChaincCst|jj|||��S)N)r	r�rs)r=r(rlrorrr rs�
sz%FirewallClientConfigDirect.queryChaincCst|jj||��S)N)r	r�rm)r=r(rlrrr rm�
sz$FirewallClientConfigDirect.getChainscCst|jj��S)N)r	r�rk)r=rrr rk�
sz'FirewallClientConfigDirect.getAllChainscCs|jj|||||�dS)N)r�rw)r=r(rlror�rrrr rw�
sz"FirewallClientConfigDirect.addRulecCs|jj|||||�dS)N)r�rx)r=r(rlror�rrrr rx�
sz%FirewallClientConfigDirect.removeRulecCs|jj|||�dS)N)r�ry)r=r(rlrorrr ry�
sz&FirewallClientConfigDirect.removeRulescCst|jj|||||��S)N)r	r�rz)r=r(rlror�rrrr rzsz$FirewallClientConfigDirect.queryRulecCst|jj|||��S)N)r	r�ru)r=r(rlrorrr rusz#FirewallClientConfigDirect.getRulescCst|jj��S)N)r	r�rt)r=rrr rt
sz&FirewallClientConfigDirect.getAllRulescCs|jj||�dS)N)r�r)r=r(rrrr rsz)FirewallClientConfigDirect.addPassthroughcCs|jj||�dS)N)r�r�)r=r(rrrr r�sz,FirewallClientConfigDirect.removePassthroughcCst|jj||��S)N)r	r�r�)r=r(rrrr r�sz+FirewallClientConfigDirect.queryPassthroughcCst|jj|��S)N)r	r�r~)r=r(rrr r~ sz*FirewallClientConfigDirect.getPassthroughscCst|jj��S)N)r	r�r{)r=rrr r{%sz-FirewallClientConfigDirect.getAllPassthroughsN)r�r�r�r!r@r�rr�r�r�r�rqrrrsrmrkrwrxryrzrurtrr�r�r~r{rrrr r��
sJ	r�c@sFeZdZedd��Zejjjedd���Z	ejjjedd���Z
ejjjedd���Zejjjed	d
���Zejjjedd���Z
ejjjed
d���Zejjjedd���Zejjjedd���Zejjjedd���Zejjjedd���Zejjjedd���Zejjjedd���Zejjjedd���Zejjjedd���Zejjjedd ���Zejjjed!d"���Zejjjed#d$���Zejjjed%d&���Zejjjed'd(���Zejjjed)d*���Zejjjed+d,���Zejjjed-d.���Zejjjed/d0���Zejjjed1d2���Z ejjjed3d4���Z!ejjjed5d6���Z"ejjjed7d8���Z#ejjjed9d:���Z$ejjjed;d<���Z%ejjjed=d>���Z&ejjjed?d@���Z'ejjjedAdB���Z(ejjjedCdD���Z)ejjjedEdF���Z*ejjjedGdH���Z+ejjjedIdJ���Z,ejjjedKdL���Z-dMS)N�FirewallClientConfigcCsb||_|jjtjjtjj�|_tj|jtjjd�|_	tj|jdd�|_
t|j�|_t
|j�|_dS)N)r�zorg.freedesktop.DBus.Properties)r�r�rrr�rTr�r��DBUS_INTERFACE_CONFIG�	fw_configr�rS�	_policiesr��_direct)r=r�rrr r@-szFirewallClientConfig.__init__cCst|jjtjj|��S)N)r	r�r�rrr�)r=r�rrr r�<sz!FirewallClientConfig.get_propertycCst|jjtjj��S)N)r	r�r�rrr�)r=rrr r�Bsz#FirewallClientConfig.get_propertiescCs|jjtjj||�dS)N)r�r�rrr�)r=r�rErrr r�Hsz!FirewallClientConfig.set_propertycCst|jj��S)N)r	r��
getIPSetNames)r=rrr r�Osz"FirewallClientConfig.getIPSetNamescCst|jj��S)N)r	r��
listIPSets)r=rrr r�TszFirewallClientConfig.listIPSetscCst|j|�S)N)r"r�)r=r�rrr �getIPSetYszFirewallClientConfig.getIPSetcCst|jj|��}t|j|�S)N)r	r��getIPSetByNamer"r�)r=r�r�rrr r�^sz#FirewallClientConfig.getIPSetByNamecCs>t|t�r |jj|t|j��}n|jj|t|��}t|j|�S)N)r8rr��addIPSetr%r5r"r�)r=r�r5r�rrr r�ds
zFirewallClientConfig.addIPSetcCst|jj��S)N)r	r��getZoneNames)r=rrr r�osz!FirewallClientConfig.getZoneNamescCst|jj��S)N)r	r��	listZones)r=rrr r�tszFirewallClientConfig.listZonescCst|j|�S)N)r�r�)r=r�rrr �getZoneyszFirewallClientConfig.getZonecCst|jj|��}t|j|�S)N)r	r��
getZoneByNamer�r�)r=r�r�rrr r�~sz"FirewallClientConfig.getZoneByNamecCst|jj|��S)N)r	r��getZoneOfInterface)r=Zifacerrr r��sz'FirewallClientConfig.getZoneOfInterfacecCst|jj|��S)N)r	r��getZoneOfSource)r=r�rrr r��sz$FirewallClientConfig.getZoneOfSourcecCs^t|t�r|jj||j��}n4t|t�r8|jj||�}n|jj|t|dd���}t|j	|�S)Nr�)
r8r"r�ZaddZone2rMr;�addZoner%r�r�)r=r�r5r�rrr r��s

zFirewallClientConfig.addZonecCst|jj��S)N)r	r��getPolicyNames)r=rrr r��sz#FirewallClientConfig.getPolicyNamescCst|jj��S)N)r	r��listPolicies)r=rrr r��sz!FirewallClientConfig.listPoliciescCst|j|�S)N)r�r�)r=r�rrr �	getPolicy�szFirewallClientConfig.getPolicycCst|jj|��}t|j|�S)N)r	r��getPolicyByNamer�r�)r=r�r�rrr r��sz$FirewallClientConfig.getPolicyByNamecCs8t|t�r|jj||j��}n|jj||�}t|j|�S)N)r8r�r��	addPolicyrMr�r�)r=r�r5r�rrr r��s
zFirewallClientConfig.addPolicycCst|jj��S)N)r	r��getServiceNames)r=rrr r��sz$FirewallClientConfig.getServiceNamescCst|jj��S)N)r	r��listServices)r=rrr r��sz!FirewallClientConfig.listServicescCst|j|�S)N)r/r�)r=r�rrr �
getService�szFirewallClientConfig.getServicecCst|jj|��}t|j|�S)N)r	r��getServiceByNamer/r�)r=r�r�rrr r��sz%FirewallClientConfig.getServiceByNamecCs`t|t�r|jj||j��}n6t|�tkr:|jj||�}n|jj|t|dd���}t	|j
|�S)Nr�)r8r�r�ZaddService2rMrIr;rbr%r/r�)r=r�r5r�rrr rb�s
zFirewallClientConfig.addServicecCst|jj��S)N)r	r��getIcmpTypeNames)r=rrr r��sz%FirewallClientConfig.getIcmpTypeNamescCst|jj��S)N)r	r��
listIcmpTypes)r=rrr r��sz"FirewallClientConfig.listIcmpTypescCst|j|�S)N)r6r�)r=r�rrr �getIcmpType�sz FirewallClientConfig.getIcmpTypecCst|jj|��}t|j|�S)N)r	r��getIcmpTypeByNamer6r�)r=r�r�rrr r��sz&FirewallClientConfig.getIcmpTypeByNamecCs>t|t�r |jj|t|j��}n|jj|t|��}t|j|�S)N)r8r3r��addIcmpTyper%r5r6r�)r=r�r5r�rrr r��s
z FirewallClientConfig.addIcmpTypecCs|jS)N)r�)r=rrr �policies�szFirewallClientConfig.policiescCs|jS)N)r�)r=rrr �directszFirewallClientConfig.directcCst|jj��S)N)r	r��getHelperNames)r=rrr r�sz#FirewallClientConfig.getHelperNamescCst|jj��S)N)r	r��listHelpers)r=rrr r�sz FirewallClientConfig.listHelperscCst|j|�S)N)r,r�)r=r�rrr �	getHelperszFirewallClientConfig.getHelpercCst|jj|��}t|j|�S)N)r	r��getHelperByNamer,r�)r=r�r�rrr r�sz$FirewallClientConfig.getHelperByNamecCs>t|t�r |jj|t|j��}n|jj|t|��}t|j|�S)N)r8r&r�r
r%r5r,r�)r=r�r5r�rrr r
 s
zFirewallClientConfig.addHelperN).r�r�r�r!r@r�rr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rbr�r�r�r�r�r�r�r�r�r�r�r
rrrr r�,s�

r�c@s�eZdZe�ddd��Zedd��Zedd	��Zed
d��Zedd
��Zedd��Z	edd��Z
edd��Zedd��Zedd��Z
edd��Zejjjedd���Zejjjedd���Zejjjedd���Zejjjed d!���Zejjjed"d#���Zejjjed$d%���Zejjjed&d'���Zejjjed(d)���Zejjjed*d+���Zejjjed,d-���Zejjjed.d/���Zejjjed0d1���Zejjjed2d3���Zejjjed4d5���Z ejjjed6d7���Z!ejjjed8d9���Z"ejjjed:d;���Z#ejjjed<d=���Z$ejjjed>d?���Z%ejjjed@dA���Z&ejjjedBdC���Z'ejjjedDdE���Z(ejjjedFdG���Z)ejjjedHdI���Z*ejjjedJdK���Z+ejjjedLdM���Z,ejjjedNdO���Z-ejjjedPdQ���Z.ejjjedRdS���Z/ejjjedTdU���Z0ejjjedVdW���Z1ejjjedXdY���Z2ejjjedZd[���Z3ejjjed\d]���Z4ejjjed^d_���Z5ejjjed`da���Z6ejjjedbdc���Z7ejjjeddde���Z8ejjjedfdg���Z9ejjjedhdi���Z:ejjjedjdk���Z;ejjjedldm���Z<ejjjedndo���Z=ejjjedpdq���Z>ejjjedrds���Z?ejjjedtdu���Z@ejjjedvdw���ZAejjjedxdy���ZBejjjedzd{���ZCejjjed|d}���ZDejjjed~d���ZEejjjed�d����ZFejjjed�d����ZGejjje�dd�d����ZHejjjed�d����ZIejjjed�d����ZJejjjed�d����ZKejjje�dd�d����ZLejjjed�d����ZMejjjed�d����ZNejjjed�d����ZOejjje�dd�d����ZPejjjed�d����ZQejjjed�d����ZRejjjed�d����ZSejjje�dd�d����ZTejjjed�d����ZUejjjed�d����ZVejjjed�d����ZWejjjed�d����ZXejjjed�d����ZYejjjed�d����ZZejjje�dd�d����Z[ejjjed�d����Z\ejjjed�d����Z]ejjje�d d�d����Z^ejjjed�d����Z_ejjjed�d����Z`ejjjed�d����Zaejjje�d!d�d����Zbejjjed�d����Zcejjjed�d����Zdejjjed�d����Zeejjje�d"d�d����Zfejjjed�dÄ��Zgejjjed�dń��Zhejjjed�dDŽ��Ziejjjed�dɄ��Zjejjjed�d˄��Zkejjjed�d̈́��Zlejjjed�dτ��Zmejjjed�dф��Znejjjed�dӄ��Zoejjjed�dՄ��Zpejjjed�dׄ��Zqejjjed�dل��Zrejjjed�dۄ��Zsejjjed�d݄��Ztejjjed�d߄��Zuejjjed�d���Zvejjjed�d���Zwejjjed�d���Zxejjjed�d���Zyejjjed�d���Zzejjjed�d���Z{ejjjed�d���Z|ejjjed�d���Z}ejjjed�d���Z~ejjjed�d���Zejjjed�d����Z�ejjjed�d����Z�ejjjed�d����Z�ejjjed�d����Z�ejjjed�d����Z�ejjjed�d����Z�ejjje�d�d���Z�ejjje�d�d���Z�ejjje�d�d���Z�ejjje�d�d���Z�ejjje�d�d	���Z�ejjje�d
�d���Z�ejjje�d�d
���Z�ejjje�d�d���Z�ejjje�d�d���Z�ejjje�d�d���Z�ejjje�d�d���Z�ejjje�d�d���Z�ejjje�d�d���Z�dS(#�FirewallClientNrTcdCs|s�tjjjdd�ytjj�|_d|j_Wq�tk
r�ytj�|_Wn6tj	j
k
r�}zttj
|j���WYdd}~Xn
Xtd�Yq�Xn||_|jj|jddtjjd�x�tjjtjjtjjtjjtjjtjjtjjtjjtjjtjjtjjtjjtjjtjj tjj!gD]}|jj|j"|ddd	d
��qWi|_#ddd
ddddddddddddddddddd d!d"d#d$d%d&d'd'd(d)d*d+d,d-d.d/d0d1d2d3d4d5d6d7d8d9d:d;d<d=d>d?d@dAdBdCdDdEdFdGdHdIdJdKdLdMdNdOdPdQdRdSdTdUdVdWdXdY�O|_$|j%�||_&|dZk�rt'j(||j)�n|j)�dS)[NT)Zset_as_defaultzNot using slip.dbusZNameOwnerChangedzorg.freedesktop.DBus)Zhandler_functionZsignal_namer�Zarg0r��memberr�)r�Zinterface_keywordZmember_keywordZpath_keywordzconnection-changedzconnection-establishedzconnection-lostZLogDeniedChangedZDefaultZoneChangedZPanicModeEnabledZPanicModeDisabledZReloadedZServiceAddedZServiceRemovedZ	PortAddedZPortRemovedZSourcePortAddedZSourcePortRemovedZ
ProtocolAddedZProtocolRemovedZMasqueradeAddedZMasqueradeRemovedZForwardPortAddedZForwardPortRemovedZIcmpBlockAddedZIcmpBlockRemovedZIcmpBlockInversionAddedZIcmpBlockInversionRemovedZ
RichRuleAddedZRichRuleRemovedZInterfaceAddedZInterfaceRemovedZZoneOfInterfaceChangedZSourceAddedZ
SourceRemovedZZoneOfSourceChangedZZoneUpdatedZ
PolicyUpdatedZ
EntryAddedZEntryRemovedZ
ChainAddedZChainRemovedZ	RuleAddedZRuleRemovedZPassthroughAddedZPassthroughRemovedzconfig:direct:UpdatedZLockdownEnabledZLockdownDisabledZLockdownWhitelistCommandAddedZLockdownWhitelistCommandRemovedZLockdownWhitelistContextAddedZLockdownWhitelistContextRemovedZLockdownWhitelistUidAddedZLockdownWhitelistUidRemovedZLockdownWhitelistUserAddedZLockdownWhitelistUserRemovedz(config:policies:LockdownWhitelistUpdatedzconfig:IPSetAddedzconfig:IPSetUpdatedzconfig:IPSetRemovedzconfig:IPSetRenamedzconfig:ZoneAddedzconfig:ZoneUpdatedzconfig:ZoneRemovedzconfig:ZoneRenamedzconfig:PolicyAddedzconfig:PolicyUpdatedzconfig:PolicyRemovedzconfig:PolicyRenamedzconfig:ServiceAddedzconfig:ServiceUpdatedzconfig:ServiceRemovedzconfig:ServiceRenamedzconfig:IcmpTypeAddedzconfig:IcmpTypeUpdatedzconfig:IcmpTypeRemovedzconfig:IcmpTypeRenamedzconfig:HelperAddedzconfig:HelperUpdatedzconfig:HelperRemovedzconfig:HelperRenamed)Ozconnection-changedzconnection-establishedzconnection-lostzlog-denied-changedzdefault-zone-changedzpanic-mode-enabledzpanic-mode-disabledZreloadedz
service-addedzservice-removedz
port-addedzport-removedzsource-port-addedzsource-port-removedzprotocol-addedzprotocol-removedzmasquerade-addedzmasquerade-removedzforward-port-addedzforward-port-removedzicmp-block-addedzicmp-block-removedzicmp-block-inversion-addedzicmp-block-inversion-removedzrichrule-addedzrichrule-removedzinterface-addedzinterface-removedzzone-changedzzone-of-interface-changedzsource-addedzsource-removedzzone-of-source-changedzzone-updatedzpolicy-updatedzipset-entry-addedzipset-entry-removedzdirect:chain-addedzdirect:chain-removedzdirect:rule-addedzdirect:rule-removedzdirect:passthrough-addedzdirect:passthrough-removedzconfig:direct:updatedzlockdown-enabledzlockdown-disabledz lockdown-whitelist-command-addedz"lockdown-whitelist-command-removedz lockdown-whitelist-context-addedz"lockdown-whitelist-context-removedzlockdown-whitelist-uid-addedzlockdown-whitelist-uid-removedzlockdown-whitelist-user-addedzlockdown-whitelist-user-removedz*config:policies:lockdown-whitelist-updatedzconfig:ipset-addedzconfig:ipset-updatedzconfig:ipset-removedzconfig:ipset-renamedzconfig:zone-addedzconfig:zone-updatedzconfig:zone-removedzconfig:zone-renamedzconfig:policy-addedzconfig:policy-updatedzconfig:policy-removedzconfig:policy-renamedzconfig:service-addedzconfig:service-updatedzconfig:service-removedzconfig:service-renamedzconfig:icmptype-addedzconfig:icmptype-updatedzconfig:icmptype-removedzconfig:icmptype-renamedzconfig:helper-addedzconfig:helper-updatedzconfig:helper-removedzconfig:helper-renamedr)*rZmainloopZglibZ
DBusGMainLoopr�Z	SystemBusr�Zdefault_timeoutrrrrrZ
DBUS_ERRORr�printZadd_signal_receiver�_dbus_connection_changedrr��DBUS_INTERFACE_IPSET�DBUS_INTERFACE_ZONE�DBUS_INTERFACE_POLICY�DBUS_INTERFACE_DIRECT�DBUS_INTERFACE_POLICIESr�r#r�r�r0r-r�r7rU�_signal_receiver�	_callback�
_callbacks�
_init_vars�quietrZtimeout_add_seconds�_connection_established)r=r��waitr�rr�rrr r@,s�


zFirewallClient.__init__cCs:d|_d|_d|_d|_d|_d|_d|_d|_d|_dS)NF)	�fwr$r�r�r.r�r��_config�	connected)r=rrr r��szFirewallClient._init_varscCstS)N)r)r=rrr �getExceptionHandler�sz"FirewallClient.getExceptionHandlercCs|adS)N)r)r=Zhandlerrrr �setExceptionHandler�sz"FirewallClient.setExceptionHandlercCstS)N)r)r=rrr �getNotAuthorizedLoop�sz#FirewallClient.getNotAuthorizedLoopcCs|adS)N)r)r=�enablerrr �setNotAuthorizedLoop�sz#FirewallClient.setNotAuthorizedLoopcGs0||jkr ||f|j|j|<ntd|��dS)NzUnknown callback name '%s')r�r��
ValueError)r=r��callbackrrrr �connect�s
zFirewallClient.connectcCs*|tjjkrdS|r|j�n|j�dS)N)rrr�r��_connection_lost)r=r�Z	old_ownerZ	new_ownerrrr r��s

z'FirewallClient._dbus_connection_changedcCsXy�|jjtjjtjj�|_tj|jtjjd�|_tj|jtjj	d�|_
tj|jtjjd�|_tj|jtjj
d�|_tj|jtjjd�|_tj|jtjjd�|_tj|jdd�|_Wnjtjjk
r�}z|js�td|j��dSd}~Xn4tk
�r}z|j�std|�dSd}~XnXt|j�|_d|_|jdtjjd�|jdtjjd�dS)	N)r�zorg.freedesktop.DBus.PropertiesrrTzconnection-established)r�r�zconnection-changed)r�r�rrr�Z	DBUS_PATHr�r�r�r�r$r�r�r�r�r�r�r�rVr�rrr�r�rrr�r�r�r�)r=rrrr r��sD
z&FirewallClient._connection_establishedcCs0|j�|jdtjjd�|jdtjjd�dS)Nzconnection-lost)r�r�zconnection-changed)r�r�rrr�)r=rrr r�
s
zFirewallClient._connection_lostc	Os�d|ksd|krdS|d}|d}|jtjj�r:d|}|jtjj�rRd|}n�|jtjj�rjd|}n�|jtjj�r�d|}np|jtjj�r�d|}nX|jtjj�r�d|}n@|tjj	kr�d	|}n*|tjj
kr�d
|}n|tjjkr�d|}d}x<|jD]2}|j||kr�|j||j
kr�|j
|j|}q�W|dk�rBdSdd
�|D�}y(|d�rj|j|d�|d|�Wn,tk
�r�}zt|�WYdd}~XnXdS)Nr�r�zconfig:Zonez
config:Policyzconfig:IPSetzconfig:Servicezconfig:IcmpTypez
config:Helperzconfig:zconfig:policies:zconfig:direct:cSsg|]}t|��qSr)r	)r��argrrr r�C
sz3FirewallClient._signal_receiver.<locals>.<listcomp>rRr)�
startswithrrr�r�r#r0r7r-r�rUr�r�r��extendrr�)	r=rr�signalr��cbr�Zcb_args�msgrrr r�
sH








zFirewallClient._signal_receivercCs|jS)N)r�)r=rrr rM
szFirewallClient.configcCs|jj�dS)N)r��reload)r=rrr r�R
szFirewallClient.reloadcCs|jj�dS)N)r�ZcompleteReload)r=rrr �complete_reloadW
szFirewallClient.complete_reloadcCs|jj�dS)N)r��runtimeToPermanent)r=rrr r�\
sz!FirewallClient.runtimeToPermanentcCs|jj�dS)N)r��checkPermanentConfig)r=rrr r�a
sz#FirewallClient.checkPermanentConfigcCst|jjtjj|��S)N)r	r�r�rrr�)r=r�rrr r�f
szFirewallClient.get_propertycCst|jjtjj��S)N)r	r�r�rrr�)r=rrr r�l
szFirewallClient.get_propertiescCs|jjtjj||�dS)N)r�r�rrr�)r=r�rErrr r�r
szFirewallClient.set_propertycCs|jj�dS)N)r��enablePanicMode)r=rrr r�y
szFirewallClient.enablePanicModecCs|jj�dS)N)r��disablePanicMode)r=rrr r�~
szFirewallClient.disablePanicModecCst|jj��S)N)r	r��queryPanicMode)r=rrr r��
szFirewallClient.queryPanicModecCstt|jj|���S)N)r"r	r��getZoneSettings2)r=�zonerrr �getZoneSettings�
szFirewallClient.getZoneSettingscCst|jj��S)N)r	r$�	getIPSets)r=rrr r��
szFirewallClient.getIPSetscCsttt|jj|����S)N)rr9r	r$�getIPSetSettings)r=�ipsetrrr r��
szFirewallClient.getIPSetSettingscCs|jj||�dS)N)r$r)r=r�rrrr r�
szFirewallClient.addEntrycCs|jj|�S)N)r$r)r=r�rrr r�
szFirewallClient.getEntriescCs|jj||�S)N)r$r)r=r�rrrr r�
szFirewallClient.setEntriescCs|jj||�dS)N)r$r )r=r�rrrr r �
szFirewallClient.removeEntrycCst|jj||��S)N)r	r$r!)r=r�rrrr r!�
szFirewallClient.queryEntrycCst|jj��S)N)r	r�r�)r=rrr r��
szFirewallClient.listServicescCstt|jj|���S)N)r�r	r�ZgetServiceSettings2)r=rarrr �getServiceSettings�
sz!FirewallClient.getServiceSettingscCst|jj��S)N)r	r�r�)r=rrr r��
szFirewallClient.listIcmpTypescCsttt|jj|����S)N)r3r9r	r��getIcmpTypeSettings)r=rrrr r��
sz"FirewallClient.getIcmpTypeSettingscCst|jj��S)N)r	r�r
)r=rrr r
�
szFirewallClient.getHelperscCsttt|jj|����S)N)r&r9r	r��getHelperSettings)r=rrrr r��
sz FirewallClient.getHelperSettingscCst|jj��S)N)r	r��getAutomaticHelpers)r=rrr r��
sz"FirewallClient.getAutomaticHelperscCs|jj|�dS)N)r��setAutomaticHelpers)r=rErrr r��
sz"FirewallClient.setAutomaticHelperscCst|jj��S)N)r	r��getLogDenied)r=rrr r��
szFirewallClient.getLogDeniedcCs|jj|�dS)N)r��setLogDenied)r=rErrr r��
szFirewallClient.setLogDeniedcCst|jj��S)N)r	r��getDefaultZone)r=rrr r��
szFirewallClient.getDefaultZonecCs|jj|�dS)N)r��setDefaultZone)r=r�rrr r��
szFirewallClient.setDefaultZonecCs|jj||j��dS)N)r��setZoneSettings2rO)r=r�r5rrr �setZoneSettings�
szFirewallClient.setZoneSettingscCst|jj��S)N)r	r��getZones)r=rrr r��
szFirewallClient.getZonescCst|jj��S)N)r	r��getActiveZones)r=rrr r�szFirewallClient.getActiveZonescCst|jj|��S)N)r	r�r�)r=r�rrr r�	sz!FirewallClient.getZoneOfInterfacecCst|jj|��S)N)r	r�r�)r=r�rrr r�szFirewallClient.getZoneOfSourcecCst|jj|��S)N)r	r��isImmutable)r=r�rrr r�szFirewallClient.isImmutablecCstt|jj|���S)N)r�r	r��getPolicySettings)r=�policyrrr r�sz FirewallClient.getPolicySettingscCs|jj||j��dS)N)r��setPolicySettingsrO)r=r�r5rrr r�sz FirewallClient.setPolicySettingscCst|jj��S)N)r	r��getPolicies)r=rrr r�$szFirewallClient.getPoliciescCst|jj��S)N)r	r��getActivePolicies)r=rrr r�)sz FirewallClient.getActivePoliciescCst|jj|��S)N)r	r�r�)r=r�rrr �isPolicyImmutable.sz FirewallClient.isPolicyImmutablecCst|jj||��S)N)r	r�r�)r=r�r�rrr r�5szFirewallClient.addInterfacecCst|jj||��S)N)r	r��
changeZone)r=r�r�rrr r�:szFirewallClient.changeZonecCst|jj||��S)N)r	r��changeZoneOfInterface)r=r�r�rrr r�?s
z$FirewallClient.changeZoneOfInterfacecCst|jj|��S)N)r	r�r�)r=r�rrr r�EszFirewallClient.getInterfacescCst|jj||��S)N)r	r�r�)r=r�r�rrr r�JszFirewallClient.queryInterfacecCst|jj||��S)N)r	r�r�)r=r�r�rrr r�OszFirewallClient.removeInterfacecCst|jj||��S)N)r	r�r�)r=r�r�rrr r�VszFirewallClient.addSourcecCst|jj||��S)N)r	r��changeZoneOfSource)r=r�r�rrr r�[sz!FirewallClient.changeZoneOfSourcecCst|jj|��S)N)r	r�r�)r=r�rrr r�`szFirewallClient.getSourcescCst|jj||��S)N)r	r�r�)r=r�r�rrr r�eszFirewallClient.querySourcecCst|jj||��S)N)r	r�r�)r=r�r�rrr r�jszFirewallClient.removeSourcecCst|jj|||��S)N)r	r�r�)r=r�r�rrrr r�qszFirewallClient.addRichRulecCst|jj|��S)N)r	r�r�)r=r�rrr r�vszFirewallClient.getRichRulescCst|jj||��S)N)r	r�r�)r=r�r�rrr r�{szFirewallClient.queryRichRulecCst|jj||��S)N)r	r�r�)r=r�r�rrr r��szFirewallClient.removeRichRulecCst|jj|||��S)N)r	r�rb)r=r�rarrrr rb�szFirewallClient.addServicecCst|jj|��S)N)r	r�r])r=r�rrr r]�szFirewallClient.getServicescCst|jj||��S)N)r	r�rf)r=r�rarrr rf�szFirewallClient.queryServicecCst|jj||��S)N)r	r�re)r=r�rarrr re�szFirewallClient.removeServicecCst|jj||||��S)N)r	r�rl)r=r�rjrkrrrr rl�szFirewallClient.addPortcCst|jj|��S)N)r	r�rh)r=r�rrr rh�szFirewallClient.getPortscCst|jj|||��S)N)r	r�rn)r=r�rjrkrrr rn�szFirewallClient.queryPortcCst|jj|||��S)N)r	r�rm)r=r�rjrkrrr rm�szFirewallClient.removePortcCst|jj|||��S)N)r	r�rr)r=r�rkrrrr rr�szFirewallClient.addProtocolcCst|jj|��S)N)r	r�rp)r=r�rrr rp�szFirewallClient.getProtocolscCst|jj||��S)N)r	r�rt)r=r�rkrrr rt�szFirewallClient.queryProtocolcCst|jj||��S)N)r	r�rs)r=r�rkrrr rs�szFirewallClient.removeProtocolcCs|jj|ddi�dS)Nr2T)r�r�)r=r�rrr r��szFirewallClient.addForwardcCst|jj|��dS)Nr2)r	r�r�)r=r�rrr r��szFirewallClient.queryForwardcCs|jj|ddi�dS)Nr2F)r�r�)r=r�rrr r��szFirewallClient.removeForwardcCst|jj||��S)N)r	r�r�)r=r�rrrr r��szFirewallClient.addMasqueradecCst|jj|��S)N)r	r�r�)r=r�rrr r��szFirewallClient.queryMasqueradecCst|jj|��S)N)r	r�r�)r=r�rrr r��szFirewallClient.removeMasqueradecCs2|dkrd}|dkrd}t|jj||||||��S)Nr#)r	r�r�)r=r�rjrkr�r�rrrr r��szFirewallClient.addForwardPortcCst|jj|��S)N)r	r�r�)r=r�rrr r��szFirewallClient.getForwardPortscCs0|dkrd}|dkrd}t|jj|||||��S)Nr#)r	r�r�)r=r�rjrkr�r�rrr r��s
zFirewallClient.queryForwardPortcCs0|dkrd}|dkrd}t|jj|||||��S)Nr#)r	r�r�)r=r�rjrkr�r�rrr r�s
z FirewallClient.removeForwardPortcCst|jj||||��S)N)r	r�rx)r=r�rjrkrrrr rxszFirewallClient.addSourcePortcCst|jj|��S)N)r	r�rv)r=r�rrr rvszFirewallClient.getSourcePortscCst|jj|||��S)N)r	r�rz)r=r�rjrkrrr rzszFirewallClient.querySourcePortcCst|jj|||��S)N)r	r�ry)r=r�rjrkrrr ry$szFirewallClient.removeSourcePortcCst|jj|||��S)N)r	r�r�)r=r��icmprrrr r�,szFirewallClient.addIcmpBlockcCst|jj|��S)N)r	r�r|)r=r�rrr r|1szFirewallClient.getIcmpBlockscCst|jj||��S)N)r	r�r�)r=r�r�rrr r�6szFirewallClient.queryIcmpBlockcCst|jj||��S)N)r	r�r�)r=r�r�rrr r�;szFirewallClient.removeIcmpBlockcCst|jj|��S)N)r	r�r�)r=r�rrr r�Bsz$FirewallClient.addIcmpBlockInversioncCst|jj|��S)N)r	r�r�)r=r�rrr r�Gsz&FirewallClient.queryIcmpBlockInversioncCst|jj|��S)N)r	r�r�)r=r�rrr r�Lsz'FirewallClient.removeIcmpBlockInversioncCs|jj|||�dS)N)r�rq)r=r(rlrorrr rqSszFirewallClient.addChaincCs|jj|||�dS)N)r�rr)r=r(rlrorrr rrXszFirewallClient.removeChaincCst|jj|||��S)N)r	r�rs)r=r(rlrorrr rs]szFirewallClient.queryChaincCst|jj||��S)N)r	r�rm)r=r(rlrrr rmbszFirewallClient.getChainscCst|jj��S)N)r	r�rk)r=rrr rkgszFirewallClient.getAllChainscCs|jj|||||�dS)N)r�rw)r=r(rlror�rrrr rwnszFirewallClient.addRulecCs|jj|||||�dS)N)r�rx)r=r(rlror�rrrr rxsszFirewallClient.removeRulecCs|jj|||�dS)N)r�ry)r=r(rlrorrr ryxszFirewallClient.removeRulescCst|jj|||||��S)N)r	r�rz)r=r(rlror�rrrr rz}szFirewallClient.queryRulecCst|jj|||��S)N)r	r�ru)r=r(rlrorrr ru�szFirewallClient.getRulescCst|jj��S)N)r	r�rt)r=rrr rt�szFirewallClient.getAllRulescCst|jj||��S)N)r	r��passthrough)r=r(rrrr r��szFirewallClient.passthroughcCst|jj��S)N)r	r�r{)r=rrr r{�sz!FirewallClient.getAllPassthroughscCs|jj�dS)N)r�r})r=rrr r}�sz$FirewallClient.removeAllPassthroughscCst|jj|��S)N)r	r�r~)r=r(rrr r~�szFirewallClient.getPassthroughscCs|jj||�dS)N)r�r)r=r(rrrr r�szFirewallClient.addPassthroughcCs|jj||�dS)N)r�r�)r=r(rrrr r��sz FirewallClient.removePassthroughcCst|jj||��S)N)r	r�r�)r=r(rrrr r��szFirewallClient.queryPassthroughcCs|jj�dS)N)rV�enableLockdown)r=rrr r��szFirewallClient.enableLockdowncCs|jj�dS)N)rV�disableLockdown)r=rrr r��szFirewallClient.disableLockdowncCst|jj��S)N)r	rV�
queryLockdown)r=rrr r��szFirewallClient.queryLockdowncCs|jj|�dS)N)rVrY)r=r<rrr rY�sz*FirewallClient.addLockdownWhitelistCommandcCst|jj��S)N)r	rVr\)r=rrr r\�sz+FirewallClient.getLockdownWhitelistCommandscCst|jj|��S)N)r	rVr[)r=r<rrr r[�sz,FirewallClient.queryLockdownWhitelistCommandcCs|jj|�dS)N)rVrZ)r=r<rrr rZ�sz-FirewallClient.removeLockdownWhitelistCommandcCs|jj|�dS)N)rVr])r=rBrrr r]�sz*FirewallClient.addLockdownWhitelistContextcCst|jj��S)N)r	rVr`)r=rrr r`�sz+FirewallClient.getLockdownWhitelistContextscCst|jj|��S)N)r	rVr_)r=rBrrr r_�sz,FirewallClient.queryLockdownWhitelistContextcCs|jj|�dS)N)rVr^)r=rBrrr r^�sz-FirewallClient.removeLockdownWhitelistContextcCs|jj|�dS)N)rVrg)r=rOrrr rg�sz&FirewallClient.addLockdownWhitelistUidcCst|jj��S)N)r	rVre)r=rrr re�sz'FirewallClient.getLockdownWhitelistUidscCst|jj|��S)N)r	rVri)r=rOrrr ri�sz(FirewallClient.queryLockdownWhitelistUidcCs|jj|�dS)N)rVrh)r=rOrrr rhsz)FirewallClient.removeLockdownWhitelistUidcCs|jj|�dS)N)rVra)r=rHrrr ra
sz'FirewallClient.addLockdownWhitelistUsercCst|jj��S)N)r	rVrd)r=rrr rdsz(FirewallClient.getLockdownWhitelistUserscCst|jj|��S)N)r	rVrc)r=rHrrr rcsz)FirewallClient.queryLockdownWhitelistUsercCs|jj|�dS)N)rVrb)r=rHrrr rbsz*FirewallClient.removeLockdownWhitelistUsercCs|jj�dS)z( Authorize once for all polkit actions. N)r��authorizeAll)r=rrr r�szFirewallClient.authorizeAll)NrT)r)r)r)r)r)r)r)r)�r�r�r�r!r@r�r�r�r�r�r�r�r�r�r�r�rr�r�rr�r�r�r�r�r�r�r�r�r�r�r�r�rrrr r!r�r�r�r�r
r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rbr]rfrerlrhrnrmrrrprtrsr�r�r�r�r�r�r�r�r�r�rxrvrzryr�r|r�r�r�r�r�rqrrrsrmrkrwrxryrzrurtr�r{r}r~rr�r�r�r�r�rYr\r[rZr]r`r_r^rgrerirhrardrcrbr�rrrr r�+s*&0	
r�)4Z
gi.repositoryrr�sysr�Zdbus.mainloop.glibrZ	slip.dbusr�rZfirewallrZfirewall.core.baserrrZfirewall.dbus_utilsr	Zfirewall.functionsr
Zfirewall.core.richrZfirewall.core.ipsetrr
rrZfirewall.errorsrrrrr!�objectr"r�r�r�r�rr"r&r,r/r3r6r9rSrjr�r�r�rrrr �<module>sd
';R8ghyKCzVtbmsite-packages/firewall/__pycache__/functions.cpython-36.pyc000064400000036460147511334700017755 0ustar003

]ûf'K�#@sdddddddddd	d
ddd
ddddddddddddddddddd d!d"g#Zd#d$lZd#d$lZd#d$lZd#d$lZd#d$lZd#d$lZd#d$lZd#d$lZd#d%l	m
Z
d#d&lmZm
Z
ejd'kZd(d)�ed#d*�D�Zd+d�Zd,d�ZdXd.d�Zd/d0�Zd1d2�Zd3d4�Zd5d�Zd6d�Zd7d8�Zd9d�Zd:d�Zd;d"�Zd<d�Zd=d	�Zd>d
�Z d?d�Z!d@d�Z"dAd
�Z#dBd�Z$dCd�Z%dDd�Z&dEdF�Z'dGd�Z(dHd�Z)dId�Z*dJd�Z+dKd�Z,dLd�Z-dMd!�Z.dNd�Z/dOd�Z0dPd�Z1dQd�Z2dRd�Z3dSd�Z4dTd�Z5dUd�Z6dVd�Z7dWd �Z8d$S)Y�PY2�	getPortID�getPortRange�portStr�getServiceName�checkIP�checkIP6�checkIPnMask�
checkIP6nMask�
checkProtocol�checkInterface�checkUINT32�firewalld_is_active�tempFile�readfile�	writefile�enable_ip_forwarding�
check_port�
check_address�check_single_address�	check_mac�uniqify�ppid_of_pid�max_zone_name_len�	checkUser�checkUid�checkCommand�checkContext�joinArgs�	splitArgs�b2u�u2b�
u2b_if_py2�max_policy_name_len�stripNonPrintableCharacters�N)�log)�FIREWALLD_TEMPDIR�FIREWALLD_PIDFILE�3cCs"i|]}|dko|dksd|�qS)��N�)�.0�ir+r+�/usr/lib/python3.6/functions.py�
<dictcomp>.sr/�cCstt|t�r|}nT|r|j�}yt|�}Wn:tk
rbytj|�}Wntjk
r\dSXYnX|dkrpdS|S)z� Check and Get port id from port string or port id using socket.getservbyname

    @param port port string or port id
    @return Port id if valid, -1 if port can not be found and -2 if port is too big
    �i���������)�
isinstance�int�strip�
ValueError�socketZ
getservbyname�error)�portZ_idr+r+r.r7s
cCs�t|t�st|t�r|St|t�s*|j�rDt|�}|dkr@|fS|S|jd�}t|�dkr�|dj�r�|dj�r�t|d�}t|d�}|dkr�|dkr�||kr�||fS||kr�||fS|fSg}x�tt|�dd�D]�}tdj	|d|���}dj	||d��}t|�dk�rnt|�}|dk�r�|dk�r�||k�rF|j
||f�n&||k�r`|j
||f�n|j
|f�q�|dkr�|j
|f�|t|�kr�Pq�Wt|�dk�r�dSt|�dk�r�dS|dS)aI Get port range for port range string or single port id

    @param ports an integer or port string or port range string
    @return Array containing start and end port id for a valid range or -1 if port can not be found and -2 if port is too big for integer input or -1 for invalid ranges or None if the range is ambiguous.
    r$�-r2r1Nr3r3)r5�tuple�listr6�isdigitr�split�len�range�join�append)ZportsZid1�splitsZid2Zmatchedr-Zport2r+r+r.rNsL
$

�:cCsX|dkrdSt|�}t|t�r*|dkr*dSt|�dkr>d|Sd|d||dfSdS)a Create port and port range string

    @param port port or port range int or [int, int]
    @param delimiter of the output string for port ranges, default ':'
    @return Port or port range string, empty string if port isn't specified, None if port or port range is not valid
    �r$Nr1z%sz%s%s%s)rr5r6rA)r;Z	delimiter�_ranger+r+r.r�scCst|�}t|�}t|�dkr�t|�dkr@t|d�t|d�kSt|�dkr�t|d�t|d�kr�t|d�t|d�kr�dSn|t|�dkr�t|�dkr�t|d�t|d�kr�t|d�t|d�kr�t|d�t|d�kr�t|d�t|d�kr�dSdS)Nr1r$r2TF)rrAr)r;rBZ_portrHr+r+r.�portInPortRange�s000rIcCsTt|�}t|�dkr$|d|df}tt|�}ttdd�|�dd�d�}g}x�|D]�}|d|dkr�|d|dkr�|j|�qR|d|dkr�|d|dkr�|d|dkr�|j|�|d|df}qR|d|dko�|d|dko�|d|dkrR|j|�|d|df}qRWttdd�|��}|d|dk�rJ|df}|g|fS)z� Coalesce a port range with existing list of port ranges

        @param new_range tuple/list/string
        @param ranges list of tuple/list/string
        @return tuple of (list of ranges added after coalescing, list of removed original ranges)
    r1r$cSs t|�dkr|d|dfS|S)Nr1r$)rA)�xr+r+r.�<lambda>�sz#coalescePortRange.<locals>.<lambda>cSs|dS)Nr$r+)rJr+r+r.rK�s)�keycSs|d|dkr|dfS|S)Nr$r1r+)rJr+r+r.rK�s)rrA�map�sortedrDr>)Z	new_range�rangesZcoalesced_range�_ranges�removed_rangesrBr+r+r.�coalescePortRange�s*

  
 

rRcCs�t|�}t|�dkr$|d|df}tt|�}ttdd�|�dd�d�}g}g}�xJ|D�]@}|d|dkr�|d|dkr�|j|�qX|d|dkr�|d|dkr�|d|dkr�|j|�|j|dd|df�qX|d|dk�r<|d|dk�r<|d|dk�r<|j|�|j|d|ddf�qX|d|dkrX|d|dkrX|j|�|j|d|ddf�|j|dd|df�qXWttdd�|��}ttdd�|��}||fS)	z� break a port range from existing list of port ranges

        @param remove_range tuple/list/string
        @param ranges list of tuple/list/string
        @return tuple of (list of ranges added after breaking up, list of removed original ranges)
    r1r$cSs t|�dkr|d|dfS|S)Nr1r$)rA)rJr+r+r.rK�sz breakPortRange.<locals>.<lambda>cSs|dS)Nr$r+)rJr+r+r.rK�s)rLcSs|d|dkr|dfS|S)Nr$r1r+)rJr+r+r.rK�scSs|d|dkr|dfS|S)Nr$r1r+)rJr+r+r.rK�s)rrArMrNrDr>)Zremove_rangerOrPrQZadded_rangesrBr+r+r.�breakPortRange�s2
  
$
 
rScCs0ytjt|�|�}Wntjk
r*dSX|S)z� Check and Get service name from port and proto string combination using socket.getservbyport

    @param port string or id
    @param protocol string
    @return Service name if port and protocol are valid, else None
    N)r9Z
getservbyportr6r:)r;�proto�namer+r+r.r�s
cCs.ytjtj|�Wntjk
r(dSXdS)zl Check IPv4 address.
    
    @param ip address string
    @return True if address is valid, else False
    FT)r9�	inet_ptonZAF_INETr:)�ipr+r+r.rs
cCs
|jd�S)z� Normalize the IPv6 address

    This is mostly about converting URL-like IPv6 address to normal ones.
    e.g. [1234::4321] --> 1234:4321
    z[])r7)rWr+r+r.�normalizeIP6srXcCs2ytjtjt|��Wntjk
r,dSXdS)zl Check IPv6 address.
    
    @param ip address string
    @return True if address is valid, else False
    FT)r9rVZAF_INET6rXr:)rWr+r+r.r s
cCs�d|krN|d|jd��}||jd�dd�}t|�dksHt|�dkrVdSn|}d}t|�sbdS|r�d|krvt|�Syt|�}Wntk
r�dSX|dks�|dkr�dSdS)N�/r1F�.r$� T)�indexrArr6r8)rW�addr�maskr-r+r+r.r-s&cCs
|jt�S)N)�	translate�NOPRINT_TRANS_TABLE)Zrule_strr+r+r.r#DscCs�d|krN|d|jd��}||jd�dd�}t|�dksHt|�dkrVdSn|}d}t|�sbdS|r�yt|�}Wntk
r�dSX|dks�|dkr�dSdS)NrYr1Fr$�T)r\rArr6r8)rWr]r^r-r+r+r.r	Gs"cCs`yt|�}Wn:tk
rFytj|�Wntjk
r@dSXYnX|dksX|dkr\dSdS)NFr$�T)r6r8r9Zgetprotobynamer:)Zprotocolr-r+r+r.r
\scCs4|st|�dkrdSxdD]}||krdSqWdS)	z� Check interface string

    @param interface string
    @return True if interface is valid (maximum 16 chars and does not contain ' ', '/', '!', ':', '*'), else False
    �F� rY�!�*T)rdrYrerf)rA)Ziface�chr+r+r.rks
cCs<yt|d�}Wntk
r"dSX|dkr8|dkr8dSdS)Nr$Fl��T)r6r8)�valrJr+r+r.r~scCs�tjjt�sdSy"ttd��}|j�}WdQRXWntk
rFdSXtjjd|�s\dSy&td|d��}|j�}WdQRXWntk
r�dSXd|kr�dSdS)zv Check if firewalld is active

    @return True if there is a firewalld pid file and the pid is used by firewalld
    F�rNz/proc/%sz/proc/%s/cmdlineZ	firewalldT)�os�path�existsr'�open�readline�	Exception)�fd�pidZcmdliner+r+r.r
�s"cCsby*tjjt�stjtd�tjddtdd�Stk
r\}ztj	d|��WYdd}~XnXdS)Ni�Zwtztemp.F)�mode�prefix�dir�deletez#Failed to create temporary file: %s)
rjrkrlr&�mkdir�tempfileZNamedTemporaryFileror%r:)�msgr+r+r.r�s
cCsXyt|d��
}|j�SQRXWn4tk
rR}ztjd||f�WYdd}~XnXdS)NrizFailed to read file "%s": %s)rm�	readlinesror%r:)�filename�f�er+r+r.r�s$cCs\y$t|d��}|j|�WdQRXWn2tk
rV}ztjd||f�dSd}~XnXdS)N�wz Failed to write to file "%s": %sFT)rm�writeror%r:)rz�liner{r|r+r+r.r�scCs(|dkrtdd�S|dkr$tdd�SdS)N�ipv4z/proc/sys/net/ipv4/ip_forwardz1
�ipv6z&/proc/sys/net/ipv6/conf/all/forwardingF)r)�ipvr+r+r.r�s


cCs|jdd�jdd�S)N�_r<z
nf-conntrack-rG)�replace)�moduler+r+r.�get_nf_conntrack_short_name�sr�cCs�t|�}|d
ks<|dks<|dks<t|�dkr�|d|dkr�|dkrTtjd|�nZ|d
krltjd|�nB|dkr�tjd|�n*t|�dkr�|d|dkr�tjd|�dSd	S)Nr2r1r$z'%s': port > 65535z'%s': port is invalidz'%s': port is ambiguousz'%s': range start >= endFTr4r3r4r3)rrAr%Zdebug2)r;rHr+r+r.r�scCs(|dkrt|�S|dkr t|�SdSdS)Nr�r�F)rr	)r��sourcer+r+r.r�s
cCs(|dkrt|�S|dkr t|�SdSdS)Nr�r�F)rr)r�r�r+r+r.r�s
cCsRt|�dkrNxdD]}||dkrdSqWxdD]}||tjkr0dSq0WdSdS)N��r2���rFFr$r1�����	�
�
�rcT�)r2r�r�r�r�)r$r1r�r�r�r�r�r�r�r�r�rc)rA�stringZ	hexdigits)Zmacr-r+r+r.r�s

cCs(g}x|D]}||kr
|j|�q
W|S)N)rD)Z_list�outputrJr+r+r.r�s

cCsHy.tjd|�}t|j�dj��}|j�Wntk
rBdSX|S)z Get parent for pid zps -o ppid -h -p %d 2>/dev/nullr$N)rj�popenr6ryr7�closero)rqr{r+r+r.r�scCsBddlm}ddlm}ttt|j���}d|t|�td�S)z�
    iptables limits length of chain to (currently) 28 chars.
    The longest chain we create is POST_<policy>_allow,
    which leaves 28 - 11 = 17 chars for <policy>.
    r$)�POLICY_CHAIN_PREFIX)�	SHORTCUTS�Z_allow)Zfirewall.core.ipXtablesr��firewall.core.baser��maxrMrA�values)r�r��longest_shortcutr+r+r.r"	scCs.ddlm}ttt|j���}d|td�S)z�
    Netfilter limits length of chain to (currently) 28 chars.
    The longest chain we create is FWDI_<zone>_allow,
    which leaves 28 - 11 = 17 chars for <zone>.
    r$)r�r�Z__allow)r�r�r�rMrAr�)r�r�r+r+r.rscCsTt|�dkst|�tjd�kr"dSx,|D]$}|tjkr(|tjkr(|d	kr(dSq(WdS)
Nr1�SC_LOGIN_NAME_MAXFrZr<r��$T)rZr<r�r�)rArj�sysconfr�Z
ascii_lettersZdigits)�user�cr+r+r.rs


cCsDt|t�r,yt|�}Wntk
r*dSX|dkr@|dkr@dSdS)	NFr$r2r)r1Tli���)r5�strr6r8)Zuidr+r+r.r(s
cCsJt|�dkst|�dkrdSxd
D]}||kr"dSq"W|ddkrFdSd	S)Nr1iF�|�
�r$rYT)r�r�r�)rA)Zcommandrgr+r+r.r2s
cCs�|jd�}t|�dkrdS|ddkr>|ddd�dkr>dS|d	dd�d
krVdS|ddd�dkrndSt|d�d	kr�dSd
S)NrFr�r�Fr$�rootr2Z_ur1Z_rZ_tr�T)r�r�r4r4r4)r@rA)�contextrEr+r+r.r<s
 cCs8dtt�kr djdd�|D��Sdjdd�|D��SdS)N�quoterdcss|]}tj|�VqdS)N)�shlexr�)r,�ar+r+r.�	<genexpr>PszjoinArgs.<locals>.<genexpr>css|]}tj|�VqdS)N)�pipesr�)r,r�r+r+r.r�Rs)rtr�rC)�argsr+r+r.rNscCs8tr*t|t�r*t|�}tj|�}tt|�Stj|�SdS)N)rr5�unicoder r�r@rMr)�_stringrEr+r+r.rTs


cCst|t�r|jdd�S|S)z bytes to unicode zUTF-8r�)r5�bytes�decode)r�r+r+r.r]s
cCst|t�s|jdd�S|S)z unicode to bytes zUTF-8r�)r5r��encode)r�r+r+r.r cs
cCstrt|t�r|jdd�S|S)z" unicode to bytes only if Python 2zUTF-8r�)rr5r�r�)r�r+r+r.r!is)rF)9�__all__r9rjZos.pathr�r�r��sysrwZfirewall.core.loggerr%Zfirewall.configr&r'�versionrrBr`rrrrIrRrSrrrXrrr#r	r
rrr
rrrrr�rrrrrrr"rrrrrrrrr r!r+r+r+r.�<module>sz

:
&+


	




	site-packages/firewall/__pycache__/dbus_utils.cpython-36.pyc000064400000013117147511334700020114 0ustar003

[Compiled CPython 3.6 bytecode for firewall/dbus_utils.py; only the embedded string constants are readable. __all__ exports: command_of_pid, pid_of_sender, uid_of_sender, user_of_uid, context_of_sender, command_of_sender, user_of_sender, dbus_to_python, dbus_signature, dbus_introspection_prepare_properties, dbus_introspection_add_properties. Readable docstrings: "Get command for pid from /proc", "Get pid from sender string using org.freedesktop.DBus.GetConnectionUnixProcessID", "Get user id from sender string using org.freedesktop.DBus.GetConnectionUnixUser", "Get user for uid from pwd", "Get SELinux context from sender string using org.freedesktop.DBus.GetConnectionSELinuxSecurityContext", "Return command of D-Bus sender". Remaining marshalled bytecode omitted.]
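Judging from the docstrings recovered above, the sender helpers resolve a D-Bus sender name to its PID, UID and SELinux context by asking the bus daemon. The following is a rough dbus-python sketch of what pid_of_sender appears to do; the exact error handling in the compiled module is an assumption.

# Sketch only (not the decompiled function): map a D-Bus sender such as ":1.42"
# to its process id via org.freedesktop.DBus.GetConnectionUnixProcessID.
import dbus

def pid_of_sender(bus, sender):
    dbus_obj = bus.get_object("org.freedesktop.DBus", "/org/freedesktop/DBus")
    dbus_iface = dbus.Interface(dbus_obj, "org.freedesktop.DBus")
    try:
        return int(dbus_iface.GetConnectionUnixProcessID(sender))
    except (dbus.exceptions.DBusException, ValueError):
        return None  # unknown or already-closed connection

# Example: pid = pid_of_sender(dbus.SystemBus(), sender_name)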
site-packages/firewall/__pycache__/client.cpython-36.pyc000064400000426232147511334700017223 0ustar00
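The compiled client module summarised below defines firewalld's FirewallClient* wrapper classes. A small usage sketch of FirewallClientZoneSettings follows, built only from method names readable in the bytecode strings; it assumes the firewalld Python bindings (and their dbus/GLib dependencies) are importable, and the commented results are expectations, not verified output.

# Usage sketch for the zone-settings wrapper; method names come from the
# recoverable strings below, exact behaviour of the installed version may differ.
from firewall.client import FirewallClientZoneSettings

settings = FirewallClientZoneSettings()
settings.setShort("demo")            # short description
settings.addService("ssh")           # raises FirewallError(ALREADY_ENABLED) if added twice
settings.addPort("8080", "tcp")

d = settings.getSettingsDict()       # settings keyed by name ('services', 'ports', ...)
print(d["services"], d["ports"])     # expected: ['ssh'] [('8080', 'tcp')]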

[Compiled CPython 3.6 bytecode for firewall/client.py; the listing is truncated here and only embedded string constants are readable. Recoverable identifiers include the handle_exceptions decorator ("Decorator to handle exceptions") and the classes FirewallClientZoneSettings, FirewallClientConfigZone, FirewallClientPolicySettings, FirewallClientConfigPolicy, FirewallClientServiceSettings, FirewallClientIPSetSettings, FirewallClientConfigIPSet, FirewallClientHelperSettings, FirewallClientConfigHelper, FirewallClientConfigService, FirewallClientIcmpTypeSettings, FirewallClientConfigIcmpType, FirewallClientPoliciesLockdownWhitelist, FirewallClientConfigPolicies, FirewallClientDirect, FirewallClientConfigDirect and FirewallClientConfig, each exposing getter/setter/add/remove/query methods over the D-Bus interfaces named by firewall.config's DBUS_INTERFACE_CONFIG_* constants. Remaining marshalled bytecode omitted; the entry continues past this excerpt.]
z FirewallClientConfig.addIcmpTypecCs|jS)N)r�)r=rrr �policies�szFirewallClientConfig.policiescCs|jS)N)r�)r=rrr �directszFirewallClientConfig.directcCst|jj��S)N)r	r��getHelperNames)r=rrr r�sz#FirewallClientConfig.getHelperNamescCst|jj��S)N)r	r��listHelpers)r=rrr r�sz FirewallClientConfig.listHelperscCst|j|�S)N)r,r�)r=r�rrr �	getHelperszFirewallClientConfig.getHelpercCst|jj|��}t|j|�S)N)r	r��getHelperByNamer,r�)r=r�r�rrr r�sz$FirewallClientConfig.getHelperByNamecCs>t|t�r |jj|t|j��}n|jj|t|��}t|j|�S)N)r8r&r�r
r%r5r,r�)r=r�r5r�rrr r
 s
zFirewallClientConfig.addHelperN).r�r�r�r!r@r�rr�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rbr�r�r�r�r�r�r�r�r�r�r�r
rrrr r�,s�

r�c@s�eZdZe�ddd��Zedd��Zedd	��Zed
d��Zedd
��Zedd��Z	edd��Z
edd��Zedd��Zedd��Z
edd��Zejjjedd���Zejjjedd���Zejjjedd���Zejjjed d!���Zejjjed"d#���Zejjjed$d%���Zejjjed&d'���Zejjjed(d)���Zejjjed*d+���Zejjjed,d-���Zejjjed.d/���Zejjjed0d1���Zejjjed2d3���Zejjjed4d5���Z ejjjed6d7���Z!ejjjed8d9���Z"ejjjed:d;���Z#ejjjed<d=���Z$ejjjed>d?���Z%ejjjed@dA���Z&ejjjedBdC���Z'ejjjedDdE���Z(ejjjedFdG���Z)ejjjedHdI���Z*ejjjedJdK���Z+ejjjedLdM���Z,ejjjedNdO���Z-ejjjedPdQ���Z.ejjjedRdS���Z/ejjjedTdU���Z0ejjjedVdW���Z1ejjjedXdY���Z2ejjjedZd[���Z3ejjjed\d]���Z4ejjjed^d_���Z5ejjjed`da���Z6ejjjedbdc���Z7ejjjeddde���Z8ejjjedfdg���Z9ejjjedhdi���Z:ejjjedjdk���Z;ejjjedldm���Z<ejjjedndo���Z=ejjjedpdq���Z>ejjjedrds���Z?ejjjedtdu���Z@ejjjedvdw���ZAejjjedxdy���ZBejjjedzd{���ZCejjjed|d}���ZDejjjed~d���ZEejjjed�d����ZFejjjed�d����ZGejjje�dd�d����ZHejjjed�d����ZIejjjed�d����ZJejjjed�d����ZKejjje�dd�d����ZLejjjed�d����ZMejjjed�d����ZNejjjed�d����ZOejjje�dd�d����ZPejjjed�d����ZQejjjed�d����ZRejjjed�d����ZSejjje�dd�d����ZTejjjed�d����ZUejjjed�d����ZVejjjed�d����ZWejjjed�d����ZXejjjed�d����ZYejjjed�d����ZZejjje�dd�d����Z[ejjjed�d����Z\ejjjed�d����Z]ejjje�d d�d����Z^ejjjed�d����Z_ejjjed�d����Z`ejjjed�d����Zaejjje�d!d�d����Zbejjjed�d����Zcejjjed�d����Zdejjjed�d����Zeejjje�d"d�d����Zfejjjed�dÄ��Zgejjjed�dń��Zhejjjed�dDŽ��Ziejjjed�dɄ��Zjejjjed�d˄��Zkejjjed�d̈́��Zlejjjed�dτ��Zmejjjed�dф��Znejjjed�dӄ��Zoejjjed�dՄ��Zpejjjed�dׄ��Zqejjjed�dل��Zrejjjed�dۄ��Zsejjjed�d݄��Ztejjjed�d߄��Zuejjjed�d���Zvejjjed�d���Zwejjjed�d���Zxejjjed�d���Zyejjjed�d���Zzejjjed�d���Z{ejjjed�d���Z|ejjjed�d���Z}ejjjed�d���Z~ejjjed�d���Zejjjed�d����Z�ejjjed�d����Z�ejjjed�d����Z�ejjjed�d����Z�ejjjed�d����Z�ejjjed�d����Z�ejjje�d�d���Z�ejjje�d�d���Z�ejjje�d�d���Z�ejjje�d�d���Z�ejjje�d�d	���Z�ejjje�d
�d���Z�ejjje�d�d
���Z�ejjje�d�d���Z�ejjje�d�d���Z�ejjje�d�d���Z�ejjje�d�d���Z�ejjje�d�d���Z�ejjje�d�d���Z�dS(#�FirewallClientNrTcdCs|s�tjjjdd�ytjj�|_d|j_Wq�tk
r�ytj�|_Wn6tj	j
k
r�}zttj
|j���WYdd}~Xn
Xtd�Yq�Xn||_|jj|jddtjjd�x�tjjtjjtjjtjjtjjtjjtjjtjjtjjtjjtjjtjjtjjtjj tjj!gD]}|jj|j"|ddd	d
��qWi|_#ddd
ddddddddddddddddddd d!d"d#d$d%d&d'd'd(d)d*d+d,d-d.d/d0d1d2d3d4d5d6d7d8d9d:d;d<d=d>d?d@dAdBdCdDdEdFdGdHdIdJdKdLdMdNdOdPdQdRdSdTdUdVdWdXdY�O|_$|j%�||_&|dZk�rt'j(||j)�n|j)�dS)[NT)Zset_as_defaultzNot using slip.dbusZNameOwnerChangedzorg.freedesktop.DBus)Zhandler_functionZsignal_namer�Zarg0r��memberr�)r�Zinterface_keywordZmember_keywordZpath_keywordzconnection-changedzconnection-establishedzconnection-lostZLogDeniedChangedZDefaultZoneChangedZPanicModeEnabledZPanicModeDisabledZReloadedZServiceAddedZServiceRemovedZ	PortAddedZPortRemovedZSourcePortAddedZSourcePortRemovedZ
ProtocolAddedZProtocolRemovedZMasqueradeAddedZMasqueradeRemovedZForwardPortAddedZForwardPortRemovedZIcmpBlockAddedZIcmpBlockRemovedZIcmpBlockInversionAddedZIcmpBlockInversionRemovedZ
RichRuleAddedZRichRuleRemovedZInterfaceAddedZInterfaceRemovedZZoneOfInterfaceChangedZSourceAddedZ
SourceRemovedZZoneOfSourceChangedZZoneUpdatedZ
PolicyUpdatedZ
EntryAddedZEntryRemovedZ
ChainAddedZChainRemovedZ	RuleAddedZRuleRemovedZPassthroughAddedZPassthroughRemovedzconfig:direct:UpdatedZLockdownEnabledZLockdownDisabledZLockdownWhitelistCommandAddedZLockdownWhitelistCommandRemovedZLockdownWhitelistContextAddedZLockdownWhitelistContextRemovedZLockdownWhitelistUidAddedZLockdownWhitelistUidRemovedZLockdownWhitelistUserAddedZLockdownWhitelistUserRemovedz(config:policies:LockdownWhitelistUpdatedzconfig:IPSetAddedzconfig:IPSetUpdatedzconfig:IPSetRemovedzconfig:IPSetRenamedzconfig:ZoneAddedzconfig:ZoneUpdatedzconfig:ZoneRemovedzconfig:ZoneRenamedzconfig:PolicyAddedzconfig:PolicyUpdatedzconfig:PolicyRemovedzconfig:PolicyRenamedzconfig:ServiceAddedzconfig:ServiceUpdatedzconfig:ServiceRemovedzconfig:ServiceRenamedzconfig:IcmpTypeAddedzconfig:IcmpTypeUpdatedzconfig:IcmpTypeRemovedzconfig:IcmpTypeRenamedzconfig:HelperAddedzconfig:HelperUpdatedzconfig:HelperRemovedzconfig:HelperRenamed)Ozconnection-changedzconnection-establishedzconnection-lostzlog-denied-changedzdefault-zone-changedzpanic-mode-enabledzpanic-mode-disabledZreloadedz
service-addedzservice-removedz
port-addedzport-removedzsource-port-addedzsource-port-removedzprotocol-addedzprotocol-removedzmasquerade-addedzmasquerade-removedzforward-port-addedzforward-port-removedzicmp-block-addedzicmp-block-removedzicmp-block-inversion-addedzicmp-block-inversion-removedzrichrule-addedzrichrule-removedzinterface-addedzinterface-removedzzone-changedzzone-of-interface-changedzsource-addedzsource-removedzzone-of-source-changedzzone-updatedzpolicy-updatedzipset-entry-addedzipset-entry-removedzdirect:chain-addedzdirect:chain-removedzdirect:rule-addedzdirect:rule-removedzdirect:passthrough-addedzdirect:passthrough-removedzconfig:direct:updatedzlockdown-enabledzlockdown-disabledz lockdown-whitelist-command-addedz"lockdown-whitelist-command-removedz lockdown-whitelist-context-addedz"lockdown-whitelist-context-removedzlockdown-whitelist-uid-addedzlockdown-whitelist-uid-removedzlockdown-whitelist-user-addedzlockdown-whitelist-user-removedz*config:policies:lockdown-whitelist-updatedzconfig:ipset-addedzconfig:ipset-updatedzconfig:ipset-removedzconfig:ipset-renamedzconfig:zone-addedzconfig:zone-updatedzconfig:zone-removedzconfig:zone-renamedzconfig:policy-addedzconfig:policy-updatedzconfig:policy-removedzconfig:policy-renamedzconfig:service-addedzconfig:service-updatedzconfig:service-removedzconfig:service-renamedzconfig:icmptype-addedzconfig:icmptype-updatedzconfig:icmptype-removedzconfig:icmptype-renamedzconfig:helper-addedzconfig:helper-updatedzconfig:helper-removedzconfig:helper-renamedr)*rZmainloopZglibZ
DBusGMainLoopr�Z	SystemBusr�Zdefault_timeoutrrrrrZ
DBUS_ERRORr�printZadd_signal_receiver�_dbus_connection_changedrr��DBUS_INTERFACE_IPSET�DBUS_INTERFACE_ZONE�DBUS_INTERFACE_POLICY�DBUS_INTERFACE_DIRECT�DBUS_INTERFACE_POLICIESr�r#r�r�r0r-r�r7rU�_signal_receiver�	_callback�
_callbacks�
_init_vars�quietrZtimeout_add_seconds�_connection_established)r=r��waitr�rr�rrr r@,s�


zFirewallClient.__init__cCs:d|_d|_d|_d|_d|_d|_d|_d|_d|_dS)NF)	�fwr$r�r�r.r�r��_config�	connected)r=rrr r��szFirewallClient._init_varscCstS)N)r)r=rrr �getExceptionHandler�sz"FirewallClient.getExceptionHandlercCs|adS)N)r)r=Zhandlerrrr �setExceptionHandler�sz"FirewallClient.setExceptionHandlercCstS)N)r)r=rrr �getNotAuthorizedLoop�sz#FirewallClient.getNotAuthorizedLoopcCs|adS)N)r)r=�enablerrr �setNotAuthorizedLoop�sz#FirewallClient.setNotAuthorizedLoopcGs0||jkr ||f|j|j|<ntd|��dS)NzUnknown callback name '%s')r�r��
ValueError)r=r��callbackrrrr �connect�s
zFirewallClient.connectcCs*|tjjkrdS|r|j�n|j�dS)N)rrr�r��_connection_lost)r=r�Z	old_ownerZ	new_ownerrrr r��s

z'FirewallClient._dbus_connection_changedcCsXy�|jjtjjtjj�|_tj|jtjjd�|_tj|jtjj	d�|_
tj|jtjjd�|_tj|jtjj
d�|_tj|jtjjd�|_tj|jtjjd�|_tj|jdd�|_Wnjtjjk
r�}z|js�td|j��dSd}~Xn4tk
�r}z|j�std|�dSd}~XnXt|j�|_d|_|jdtjjd�|jdtjjd�dS)	N)r�zorg.freedesktop.DBus.PropertiesrrTzconnection-established)r�r�zconnection-changed)r�r�rrr�Z	DBUS_PATHr�r�r�r�r$r�r�r�r�r�r�r�rVr�rrr�r�rrr�r�r�r�)r=rrrr r��sD
z&FirewallClient._connection_establishedcCs0|j�|jdtjjd�|jdtjjd�dS)Nzconnection-lost)r�r�zconnection-changed)r�r�rrr�)r=rrr r�
s
zFirewallClient._connection_lostc	Os�d|ksd|krdS|d}|d}|jtjj�r:d|}|jtjj�rRd|}n�|jtjj�rjd|}n�|jtjj�r�d|}np|jtjj�r�d|}nX|jtjj�r�d|}n@|tjj	kr�d	|}n*|tjj
kr�d
|}n|tjjkr�d|}d}x<|jD]2}|j||kr�|j||j
kr�|j
|j|}q�W|dk�rBdSdd
�|D�}y(|d�rj|j|d�|d|�Wn,tk
�r�}zt|�WYdd}~XnXdS)Nr�r�zconfig:Zonez
config:Policyzconfig:IPSetzconfig:Servicezconfig:IcmpTypez
config:Helperzconfig:zconfig:policies:zconfig:direct:cSsg|]}t|��qSr)r	)r��argrrr r�C
sz3FirewallClient._signal_receiver.<locals>.<listcomp>rRr)�
startswithrrr�r�r#r0r7r-r�rUr�r�r��extendrr�)	r=rr�signalr��cbr�Zcb_args�msgrrr r�
sH








zFirewallClient._signal_receivercCs|jS)N)r�)r=rrr rM
szFirewallClient.configcCs|jj�dS)N)r��reload)r=rrr r�R
szFirewallClient.reloadcCs|jj�dS)N)r�ZcompleteReload)r=rrr �complete_reloadW
szFirewallClient.complete_reloadcCs|jj�dS)N)r��runtimeToPermanent)r=rrr r�\
sz!FirewallClient.runtimeToPermanentcCs|jj�dS)N)r��checkPermanentConfig)r=rrr r�a
sz#FirewallClient.checkPermanentConfigcCst|jjtjj|��S)N)r	r�r�rrr�)r=r�rrr r�f
szFirewallClient.get_propertycCst|jjtjj��S)N)r	r�r�rrr�)r=rrr r�l
szFirewallClient.get_propertiescCs|jjtjj||�dS)N)r�r�rrr�)r=r�rErrr r�r
szFirewallClient.set_propertycCs|jj�dS)N)r��enablePanicMode)r=rrr r�y
szFirewallClient.enablePanicModecCs|jj�dS)N)r��disablePanicMode)r=rrr r�~
szFirewallClient.disablePanicModecCst|jj��S)N)r	r��queryPanicMode)r=rrr r��
szFirewallClient.queryPanicModecCstt|jj|���S)N)r"r	r��getZoneSettings2)r=�zonerrr �getZoneSettings�
szFirewallClient.getZoneSettingscCst|jj��S)N)r	r$�	getIPSets)r=rrr r��
szFirewallClient.getIPSetscCsttt|jj|����S)N)rr9r	r$�getIPSetSettings)r=�ipsetrrr r��
szFirewallClient.getIPSetSettingscCs|jj||�dS)N)r$r)r=r�rrrr r�
szFirewallClient.addEntrycCs|jj|�S)N)r$r)r=r�rrr r�
szFirewallClient.getEntriescCs|jj||�S)N)r$r)r=r�rrrr r�
szFirewallClient.setEntriescCs|jj||�dS)N)r$r )r=r�rrrr r �
szFirewallClient.removeEntrycCst|jj||��S)N)r	r$r!)r=r�rrrr r!�
szFirewallClient.queryEntrycCst|jj��S)N)r	r�r�)r=rrr r��
szFirewallClient.listServicescCstt|jj|���S)N)r�r	r�ZgetServiceSettings2)r=rarrr �getServiceSettings�
sz!FirewallClient.getServiceSettingscCst|jj��S)N)r	r�r�)r=rrr r��
szFirewallClient.listIcmpTypescCsttt|jj|����S)N)r3r9r	r��getIcmpTypeSettings)r=rrrr r��
sz"FirewallClient.getIcmpTypeSettingscCst|jj��S)N)r	r�r
)r=rrr r
�
szFirewallClient.getHelperscCsttt|jj|����S)N)r&r9r	r��getHelperSettings)r=rrrr r��
sz FirewallClient.getHelperSettingscCst|jj��S)N)r	r��getAutomaticHelpers)r=rrr r��
sz"FirewallClient.getAutomaticHelperscCs|jj|�dS)N)r��setAutomaticHelpers)r=rErrr r��
sz"FirewallClient.setAutomaticHelperscCst|jj��S)N)r	r��getLogDenied)r=rrr r��
szFirewallClient.getLogDeniedcCs|jj|�dS)N)r��setLogDenied)r=rErrr r��
szFirewallClient.setLogDeniedcCst|jj��S)N)r	r��getDefaultZone)r=rrr r��
szFirewallClient.getDefaultZonecCs|jj|�dS)N)r��setDefaultZone)r=r�rrr r��
szFirewallClient.setDefaultZonecCs|jj||j��dS)N)r��setZoneSettings2rO)r=r�r5rrr �setZoneSettings�
szFirewallClient.setZoneSettingscCst|jj��S)N)r	r��getZones)r=rrr r��
szFirewallClient.getZonescCst|jj��S)N)r	r��getActiveZones)r=rrr r�szFirewallClient.getActiveZonescCst|jj|��S)N)r	r�r�)r=r�rrr r�	sz!FirewallClient.getZoneOfInterfacecCst|jj|��S)N)r	r�r�)r=r�rrr r�szFirewallClient.getZoneOfSourcecCst|jj|��S)N)r	r��isImmutable)r=r�rrr r�szFirewallClient.isImmutablecCstt|jj|���S)N)r�r	r��getPolicySettings)r=�policyrrr r�sz FirewallClient.getPolicySettingscCs|jj||j��dS)N)r��setPolicySettingsrO)r=r�r5rrr r�sz FirewallClient.setPolicySettingscCst|jj��S)N)r	r��getPolicies)r=rrr r�$szFirewallClient.getPoliciescCst|jj��S)N)r	r��getActivePolicies)r=rrr r�)sz FirewallClient.getActivePoliciescCst|jj|��S)N)r	r�r�)r=r�rrr �isPolicyImmutable.sz FirewallClient.isPolicyImmutablecCst|jj||��S)N)r	r�r�)r=r�r�rrr r�5szFirewallClient.addInterfacecCst|jj||��S)N)r	r��
changeZone)r=r�r�rrr r�:szFirewallClient.changeZonecCst|jj||��S)N)r	r��changeZoneOfInterface)r=r�r�rrr r�?s
z$FirewallClient.changeZoneOfInterfacecCst|jj|��S)N)r	r�r�)r=r�rrr r�EszFirewallClient.getInterfacescCst|jj||��S)N)r	r�r�)r=r�r�rrr r�JszFirewallClient.queryInterfacecCst|jj||��S)N)r	r�r�)r=r�r�rrr r�OszFirewallClient.removeInterfacecCst|jj||��S)N)r	r�r�)r=r�r�rrr r�VszFirewallClient.addSourcecCst|jj||��S)N)r	r��changeZoneOfSource)r=r�r�rrr r�[sz!FirewallClient.changeZoneOfSourcecCst|jj|��S)N)r	r�r�)r=r�rrr r�`szFirewallClient.getSourcescCst|jj||��S)N)r	r�r�)r=r�r�rrr r�eszFirewallClient.querySourcecCst|jj||��S)N)r	r�r�)r=r�r�rrr r�jszFirewallClient.removeSourcecCst|jj|||��S)N)r	r�r�)r=r�r�rrrr r�qszFirewallClient.addRichRulecCst|jj|��S)N)r	r�r�)r=r�rrr r�vszFirewallClient.getRichRulescCst|jj||��S)N)r	r�r�)r=r�r�rrr r�{szFirewallClient.queryRichRulecCst|jj||��S)N)r	r�r�)r=r�r�rrr r��szFirewallClient.removeRichRulecCst|jj|||��S)N)r	r�rb)r=r�rarrrr rb�szFirewallClient.addServicecCst|jj|��S)N)r	r�r])r=r�rrr r]�szFirewallClient.getServicescCst|jj||��S)N)r	r�rf)r=r�rarrr rf�szFirewallClient.queryServicecCst|jj||��S)N)r	r�re)r=r�rarrr re�szFirewallClient.removeServicecCst|jj||||��S)N)r	r�rl)r=r�rjrkrrrr rl�szFirewallClient.addPortcCst|jj|��S)N)r	r�rh)r=r�rrr rh�szFirewallClient.getPortscCst|jj|||��S)N)r	r�rn)r=r�rjrkrrr rn�szFirewallClient.queryPortcCst|jj|||��S)N)r	r�rm)r=r�rjrkrrr rm�szFirewallClient.removePortcCst|jj|||��S)N)r	r�rr)r=r�rkrrrr rr�szFirewallClient.addProtocolcCst|jj|��S)N)r	r�rp)r=r�rrr rp�szFirewallClient.getProtocolscCst|jj||��S)N)r	r�rt)r=r�rkrrr rt�szFirewallClient.queryProtocolcCst|jj||��S)N)r	r�rs)r=r�rkrrr rs�szFirewallClient.removeProtocolcCs|jj|ddi�dS)Nr2T)r�r�)r=r�rrr r��szFirewallClient.addForwardcCst|jj|��dS)Nr2)r	r�r�)r=r�rrr r��szFirewallClient.queryForwardcCs|jj|ddi�dS)Nr2F)r�r�)r=r�rrr r��szFirewallClient.removeForwardcCst|jj||��S)N)r	r�r�)r=r�rrrr r��szFirewallClient.addMasqueradecCst|jj|��S)N)r	r�r�)r=r�rrr r��szFirewallClient.queryMasqueradecCst|jj|��S)N)r	r�r�)r=r�rrr r��szFirewallClient.removeMasqueradecCs2|dkrd}|dkrd}t|jj||||||��S)Nr#)r	r�r�)r=r�rjrkr�r�rrrr r��szFirewallClient.addForwardPortcCst|jj|��S)N)r	r�r�)r=r�rrr r��szFirewallClient.getForwardPortscCs0|dkrd}|dkrd}t|jj|||||��S)Nr#)r	r�r�)r=r�rjrkr�r�rrr r��s
zFirewallClient.queryForwardPortcCs0|dkrd}|dkrd}t|jj|||||��S)Nr#)r	r�r�)r=r�rjrkr�r�rrr r�s
z FirewallClient.removeForwardPortcCst|jj||||��S)N)r	r�rx)r=r�rjrkrrrr rxszFirewallClient.addSourcePortcCst|jj|��S)N)r	r�rv)r=r�rrr rvszFirewallClient.getSourcePortscCst|jj|||��S)N)r	r�rz)r=r�rjrkrrr rzszFirewallClient.querySourcePortcCst|jj|||��S)N)r	r�ry)r=r�rjrkrrr ry$szFirewallClient.removeSourcePortcCst|jj|||��S)N)r	r�r�)r=r��icmprrrr r�,szFirewallClient.addIcmpBlockcCst|jj|��S)N)r	r�r|)r=r�rrr r|1szFirewallClient.getIcmpBlockscCst|jj||��S)N)r	r�r�)r=r�r�rrr r�6szFirewallClient.queryIcmpBlockcCst|jj||��S)N)r	r�r�)r=r�r�rrr r�;szFirewallClient.removeIcmpBlockcCst|jj|��S)N)r	r�r�)r=r�rrr r�Bsz$FirewallClient.addIcmpBlockInversioncCst|jj|��S)N)r	r�r�)r=r�rrr r�Gsz&FirewallClient.queryIcmpBlockInversioncCst|jj|��S)N)r	r�r�)r=r�rrr r�Lsz'FirewallClient.removeIcmpBlockInversioncCs|jj|||�dS)N)r�rq)r=r(rlrorrr rqSszFirewallClient.addChaincCs|jj|||�dS)N)r�rr)r=r(rlrorrr rrXszFirewallClient.removeChaincCst|jj|||��S)N)r	r�rs)r=r(rlrorrr rs]szFirewallClient.queryChaincCst|jj||��S)N)r	r�rm)r=r(rlrrr rmbszFirewallClient.getChainscCst|jj��S)N)r	r�rk)r=rrr rkgszFirewallClient.getAllChainscCs|jj|||||�dS)N)r�rw)r=r(rlror�rrrr rwnszFirewallClient.addRulecCs|jj|||||�dS)N)r�rx)r=r(rlror�rrrr rxsszFirewallClient.removeRulecCs|jj|||�dS)N)r�ry)r=r(rlrorrr ryxszFirewallClient.removeRulescCst|jj|||||��S)N)r	r�rz)r=r(rlror�rrrr rz}szFirewallClient.queryRulecCst|jj|||��S)N)r	r�ru)r=r(rlrorrr ru�szFirewallClient.getRulescCst|jj��S)N)r	r�rt)r=rrr rt�szFirewallClient.getAllRulescCst|jj||��S)N)r	r��passthrough)r=r(rrrr r��szFirewallClient.passthroughcCst|jj��S)N)r	r�r{)r=rrr r{�sz!FirewallClient.getAllPassthroughscCs|jj�dS)N)r�r})r=rrr r}�sz$FirewallClient.removeAllPassthroughscCst|jj|��S)N)r	r�r~)r=r(rrr r~�szFirewallClient.getPassthroughscCs|jj||�dS)N)r�r)r=r(rrrr r�szFirewallClient.addPassthroughcCs|jj||�dS)N)r�r�)r=r(rrrr r��sz FirewallClient.removePassthroughcCst|jj||��S)N)r	r�r�)r=r(rrrr r��szFirewallClient.queryPassthroughcCs|jj�dS)N)rV�enableLockdown)r=rrr r��szFirewallClient.enableLockdowncCs|jj�dS)N)rV�disableLockdown)r=rrr r��szFirewallClient.disableLockdowncCst|jj��S)N)r	rV�
queryLockdown)r=rrr r��szFirewallClient.queryLockdowncCs|jj|�dS)N)rVrY)r=r<rrr rY�sz*FirewallClient.addLockdownWhitelistCommandcCst|jj��S)N)r	rVr\)r=rrr r\�sz+FirewallClient.getLockdownWhitelistCommandscCst|jj|��S)N)r	rVr[)r=r<rrr r[�sz,FirewallClient.queryLockdownWhitelistCommandcCs|jj|�dS)N)rVrZ)r=r<rrr rZ�sz-FirewallClient.removeLockdownWhitelistCommandcCs|jj|�dS)N)rVr])r=rBrrr r]�sz*FirewallClient.addLockdownWhitelistContextcCst|jj��S)N)r	rVr`)r=rrr r`�sz+FirewallClient.getLockdownWhitelistContextscCst|jj|��S)N)r	rVr_)r=rBrrr r_�sz,FirewallClient.queryLockdownWhitelistContextcCs|jj|�dS)N)rVr^)r=rBrrr r^�sz-FirewallClient.removeLockdownWhitelistContextcCs|jj|�dS)N)rVrg)r=rOrrr rg�sz&FirewallClient.addLockdownWhitelistUidcCst|jj��S)N)r	rVre)r=rrr re�sz'FirewallClient.getLockdownWhitelistUidscCst|jj|��S)N)r	rVri)r=rOrrr ri�sz(FirewallClient.queryLockdownWhitelistUidcCs|jj|�dS)N)rVrh)r=rOrrr rhsz)FirewallClient.removeLockdownWhitelistUidcCs|jj|�dS)N)rVra)r=rHrrr ra
sz'FirewallClient.addLockdownWhitelistUsercCst|jj��S)N)r	rVrd)r=rrr rdsz(FirewallClient.getLockdownWhitelistUserscCst|jj|��S)N)r	rVrc)r=rHrrr rcsz)FirewallClient.queryLockdownWhitelistUsercCs|jj|�dS)N)rVrb)r=rHrrr rbsz*FirewallClient.removeLockdownWhitelistUsercCs|jj�dS)z( Authorize once for all polkit actions. N)r��authorizeAll)r=rrr r�szFirewallClient.authorizeAll)NrT)r)r)r)r)r)r)r)r)�r�r�r�r!r@r�r�r�r�r�r�r�r�r�r�r�rr�r�rr�r�r�r�r�r�r�r�r�r�r�r�r�rrrr r!r�r�r�r�r
r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�r�rbr]rfrerlrhrnrmrrrprtrsr�r�r�r�r�r�r�r�r�r�rxrvrzryr�r|r�r�r�r�r�rqrrrsrmrkrwrxryrzrurtr�r{r}r~rr�r�r�r�r�rYr\r[rZr]r`r_r^rgrerirhrardrcrbr�rrrr r�+s*&0	
r�)4Z
gi.repositoryrr�sysr�Zdbus.mainloop.glibrZ	slip.dbusr�rZfirewallrZfirewall.core.baserrrZfirewall.dbus_utilsr	Zfirewall.functionsr
Zfirewall.core.richrZfirewall.core.ipsetrr
rrZfirewall.errorsrrrrr!�objectr"r�r�r�r�rr"r&r,r/r3r6r9rSrjr�r�r�rrrr �<module>sd
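A minimal, hypothetical usage sketch of the FirewallClient API named above (this code is not contained in the archive; it assumes firewalld, its Python bindings and a running system D-Bus are available):

from firewall.client import FirewallClient

fw = FirewallClient()                      # connect to firewalld over D-Bus
if fw.connected:
    print("default zone:", fw.getDefaultZone())
    print("active zones:", fw.getActiveZones())
    # permanent (on-disk) configuration is reached via the .config property
    print("permanent zones:", fw.config.getZoneNames())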
site-packages/firewall/__pycache__/command.cpython-36.opt-1.pyc000064400000043416147511334700020321 0ustar00

[unreadable binary data -- compiled bytecode of the FirewallCommand helper class used by the command-line clients (add/remove/query sequence helpers, port, forward-port and source parsing, zone/policy/service/ipset/icmptype/helper info printers). Not recoverable as source.]
site-packages/firewall/__pycache__/errors.cpython-36.opt-1.pyc000064400000007340147511334700020213 0ustar00
[unreadable binary data -- compiled bytecode of firewall.errors: numeric error-code constants (ALREADY_ENABLED, NOT_ENABLED, COMMAND_FAILED, INVALID_PORT, NOT_AUTHORIZED, ...) and the FirewallError exception class.]
site-packages/firewall/__pycache__/dbus_utils.cpython-36.opt-1.pyc000064400000013117147511334700021053 0ustar00
[unreadable binary data -- compiled bytecode of firewall.dbus_utils: helpers such as command_of_pid, pid_of_sender, uid_of_sender, user_of_uid, context_of_sender, dbus_to_python, dbus_signature and the D-Bus introspection property helpers.]
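A small, hypothetical sketch (not contained in the archive) of the error convention used by the firewall.errors module summarized above -- numeric error-code constants plus a FirewallError exception carrying a code and an optional message:

from firewall import errors
from firewall.errors import FirewallError

try:
    raise FirewallError(errors.INVALID_PORT, "70000/tcp")
except FirewallError as e:
    # str() renders as "<ERROR-NAME>: <detail>", e.g. "INVALID_PORT: 70000/tcp"
    print(e.code, str(e))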
site-packages/firewall/__pycache__/command.cpython-36.pyc000064400000043416147511334700017362 0ustar003

[unreadable binary data -- compiled bytecode of firewall/command.py (a second build, .pyc alongside the .opt-1.pyc above, of the same FirewallCommand class). Not recoverable as source.]
site-packages/firewall/__pycache__/__init__.cpython-36.pyc000064400000000161147511334700017471 0ustar00
[unreadable binary data -- compiled bytecode of a near-empty firewall/__init__.py.]
site-packages/firewall/__pycache__/fw_types.cpython-36.opt-1.pyc000064400000005613147511334700020540 0ustar00
[unreadable binary data -- compiled bytecode of firewall.fw_types: the LastUpdatedOrderedDict container class.]
site-packages/firewall/__pycache__/errors.cpython-36.pyc000064400000007340147511334700017254 0ustar00

[unreadable binary data -- compiled bytecode of firewall.errors (regular build; same error-code constants and FirewallError class as the .opt-1 copy above).]
site-packages/firewall/__pycache__/functions.cpython-36.opt-1.pyc000064400000036460147511334700020714 0ustar00

[unreadable binary data -- compiled bytecode of firewall.functions. The embedded __all__ list and docstrings identify general helpers such as getPortID, getPortRange, portStr, getServiceName, checkIP, checkIP6, checkIPnMask, checkProtocol, checkInterface, checkUINT32, firewalld_is_active, tempFile, readfile, writefile, enable_ip_forwarding, check_port, check_address, check_mac, uniqify and joinArgs/splitArgs; the dump of this entry continues below.]

    @return True if there is a firewalld pid file and the pid is used by firewalld
    F�rNz/proc/%sz/proc/%s/cmdlineZ	firewalldT)�os�path�existsr'�open�readline�	Exception)�fd�pidZcmdliner+r+r.r
�s"cCsby*tjjt�stjtd�tjddtdd�Stk
r\}ztj	d|��WYdd}~XnXdS)Ni�Zwtztemp.F)�mode�prefix�dir�deletez#Failed to create temporary file: %s)
rjrkrlr&�mkdir�tempfileZNamedTemporaryFileror%r:)�msgr+r+r.r�s
cCsXyt|d��
}|j�SQRXWn4tk
rR}ztjd||f�WYdd}~XnXdS)NrizFailed to read file "%s": %s)rm�	readlinesror%r:)�filename�f�er+r+r.r�s$cCs\y$t|d��}|j|�WdQRXWn2tk
rV}ztjd||f�dSd}~XnXdS)N�wz Failed to write to file "%s": %sFT)rm�writeror%r:)rz�liner{r|r+r+r.r�scCs(|dkrtdd�S|dkr$tdd�SdS)N�ipv4z/proc/sys/net/ipv4/ip_forwardz1
�ipv6z&/proc/sys/net/ipv6/conf/all/forwardingF)r)�ipvr+r+r.r�s


cCs|jdd�jdd�S)N�_r<z
nf-conntrack-rG)�replace)�moduler+r+r.�get_nf_conntrack_short_name�sr�cCs�t|�}|d
ks<|dks<|dks<t|�dkr�|d|dkr�|dkrTtjd|�nZ|d
krltjd|�nB|dkr�tjd|�n*t|�dkr�|d|dkr�tjd|�dSd	S)Nr2r1r$z'%s': port > 65535z'%s': port is invalidz'%s': port is ambiguousz'%s': range start >= endFTr4r3r4r3)rrAr%Zdebug2)r;rHr+r+r.r�scCs(|dkrt|�S|dkr t|�SdSdS)Nr�r�F)rr	)r��sourcer+r+r.r�s
cCs(|dkrt|�S|dkr t|�SdSdS)Nr�r�F)rr)r�r�r+r+r.r�s
cCsRt|�dkrNxdD]}||dkrdSqWxdD]}||tjkr0dSq0WdSdS)N��r2���rFFr$r1�����	�
�
�rcT�)r2r�r�r�r�)r$r1r�r�r�r�r�r�r�r�r�rc)rA�stringZ	hexdigits)Zmacr-r+r+r.r�s

cCs(g}x|D]}||kr
|j|�q
W|S)N)rD)Z_list�outputrJr+r+r.r�s

cCsHy.tjd|�}t|j�dj��}|j�Wntk
rBdSX|S)z Get parent for pid zps -o ppid -h -p %d 2>/dev/nullr$N)rj�popenr6ryr7�closero)rqr{r+r+r.r�scCsBddlm}ddlm}ttt|j���}d|t|�td�S)z�
    iptables limits length of chain to (currently) 28 chars.
    The longest chain we create is POST_<policy>_allow,
    which leaves 28 - 11 = 17 chars for <policy>.
    r$)�POLICY_CHAIN_PREFIX)�	SHORTCUTS�Z_allow)Zfirewall.core.ipXtablesr��firewall.core.baser��maxrMrA�values)r�r��longest_shortcutr+r+r.r"	scCs.ddlm}ttt|j���}d|td�S)z�
    Netfilter limits length of chain to (currently) 28 chars.
    The longest chain we create is FWDI_<zone>_allow,
    which leaves 28 - 11 = 17 chars for <zone>.
    r$)r�r�Z__allow)r�r�r�rMrAr�)r�r�r+r+r.rscCsTt|�dkst|�tjd�kr"dSx,|D]$}|tjkr(|tjkr(|d	kr(dSq(WdS)
Nr1�SC_LOGIN_NAME_MAXFrZr<r��$T)rZr<r�r�)rArj�sysconfr�Z
ascii_lettersZdigits)�user�cr+r+r.rs


cCsDt|t�r,yt|�}Wntk
r*dSX|dkr@|dkr@dSdS)	NFr$r2r)r1Tli���)r5�strr6r8)Zuidr+r+r.r(s
cCsJt|�dkst|�dkrdSxd
D]}||kr"dSq"W|ddkrFdSd	S)Nr1iF�|�
�r$rYT)r�r�r�)rA)Zcommandrgr+r+r.r2s
cCs�|jd�}t|�dkrdS|ddkr>|ddd�dkr>dS|d	dd�d
krVdS|ddd�dkrndSt|d�d	kr�dSd
S)NrFr�r�Fr$�rootr2Z_ur1Z_rZ_tr�T)r�r�r4r4r4)r@rA)�contextrEr+r+r.r<s
 cCs8dtt�kr djdd�|D��Sdjdd�|D��SdS)N�quoterdcss|]}tj|�VqdS)N)�shlexr�)r,�ar+r+r.�	<genexpr>PszjoinArgs.<locals>.<genexpr>css|]}tj|�VqdS)N)�pipesr�)r,r�r+r+r.r�Rs)rtr�rC)�argsr+r+r.rNscCs8tr*t|t�r*t|�}tj|�}tt|�Stj|�SdS)N)rr5�unicoder r�r@rMr)�_stringrEr+r+r.rTs


cCst|t�r|jdd�S|S)z bytes to unicode zUTF-8r�)r5�bytes�decode)r�r+r+r.r]s
cCst|t�s|jdd�S|S)z unicode to bytes zUTF-8r�)r5r��encode)r�r+r+r.r cs
cCstrt|t�r|jdd�S|S)z" unicode to bytes only if Python 2zUTF-8r�)rr5r�r�)r�r+r+r.r!is)rF)9�__all__r9rjZos.pathr�r�r��sysrwZfirewall.core.loggerr%Zfirewall.configr&r'�versionrrBr`rrrrIrRrSrrrXrrr#r	r
rrr
rrrrr�rrrrrrr"rrrrrrrrr r!r+r+r+r.�<module>sz

:
&+


	




	site-packages/firewall/__pycache__/__init__.cpython-36.opt-1.pyc000064400000000161147511334700020430 0ustar003

]ûf�@sdS)N�rrr�/usr/lib/python3.6/__init__.py�<module>ssite-packages/firewall/__pycache__/fw_types.cpython-36.pyc000064400000005613147511334700017601 0ustar003

]ûf��@sdgZGdd�de�ZdS)�LastUpdatedOrderedDictc@sxeZdZddd�Zdd�Zdd�Zdd	�Zd
d�Zdd
�Zdd�Z	dd�Z
dd�Zdd�Zdd�Z
dd�Zddd�ZdS)rNcCsi|_g|_|r|j|�dS)N)�_dict�_list�update)�self�x�r�/usr/lib/python3.6/fw_types.py�__init__szLastUpdatedOrderedDict.__init__cCs|jdd�=|jj�dS)N)rr�clear)rrrrr
szLastUpdatedOrderedDict.clearcCs"x|j�D]\}}|||<q
WdS)N)�items)rr�key�valuerrrr#szLastUpdatedOrderedDict.updatecs�fdd��jD�S)Ncsg|]}|�|f�qSrr)�.0r)rrr�
<listcomp>(sz0LastUpdatedOrderedDict.items.<locals>.<listcomp>)r)rr)rrr'szLastUpdatedOrderedDict.itemscCs"||jkr|jj|�|j|=dS)N)rr�remove)rrrrr�__delitem__*s
z"LastUpdatedOrderedDict.__delitem__cs&d�jjdj�fdd��jD��fS)Nz%s([%s])z, csg|]}d|�|f�qS)z(%r, %r)r)rr)rrrr1sz3LastUpdatedOrderedDict.__repr__.<locals>.<listcomp>)�	__class__�__name__�joinr)rr)rr�__repr__/szLastUpdatedOrderedDict.__repr__cCs$||jkr|jj|�||j|<dS)N)rr�append)rrr
rrr�__setitem__3s
z"LastUpdatedOrderedDict.__setitem__cCs$t|�tkr|j|S|j|SdS)N)�type�intrr)rrrrr�__getitem__8s
z"LastUpdatedOrderedDict.__getitem__cCs
t|j�S)N)�lenr)rrrr�__len__>szLastUpdatedOrderedDict.__len__cCst|�S)N)r)rrrr�copyAszLastUpdatedOrderedDict.copycCs|jdd�S)N)r)rrrr�keysDszLastUpdatedOrderedDict.keyscs�fdd��jD�S)Ncsg|]}�|�qSrr)rr)rrrrHsz1LastUpdatedOrderedDict.values.<locals>.<listcomp>)r)rr)rr�valuesGszLastUpdatedOrderedDict.valuescCs ||kr||S|||<|SdS)Nr)rrr
rrr�
setdefaultJsz!LastUpdatedOrderedDict.setdefault)N)N)r�
__module__�__qualname__r	r
rrrrrrrrrrr rrrrrs
N)�__all__�objectrrrrr�<module>ssite-packages/firewall/config/__pycache__/__init__.cpython-36.opt-1.pyc000064400000007036147511334700021705 0ustar003

]ûf�@sfddlmZddlZyejejd�Wn6ejk
r\ddlZdejd<ejejd�YnXdZddl	Z	e	j
ed�dd	lmZdZ
d
ZdZde
Zed
ZdZdZdddgZe	j	d�ZdZdd�Zed�dd�Zed�dZdZdZdZdZd Zd!d"d#d$d%d&d'd(d)d*�	Zd+d,d-d.d/gZ d0d1d2gZ!d3d4gZ"d5Z#d6Z$d7Z%d7Z&d8Z'd7Z(d8Z)d/Z*d1Z+d3Z,d7Z-d7Z.d7Z/dS)9�)�absolute_importN��C�LC_ALLZ	firewalld)Zdomain�)�dbuszfirewall-configzfirewall-appletz/usr/share/z.gladez(C) 2010-2017 Red Hat, Inc.z0.9.11z$Thomas Woerner <twoerner@redhat.com>z"Jiri Popelka <jpopelka@redhat.com>zEric Garver <e@erig.me>acThis program is free software; you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation; either version 2 of the License, or (at your option) any later version.

This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License for more details.

You should have received a copy of the GNU General Public License along with this program.  If not, see <http://www.gnu.org/licenses/>.zhttp://www.firewalld.orgcCsP|a|da|da|da|da|da|da|da|da|d	a	dS)
Nz/firewalld.confz/zonesz	/servicesz
/icmptypesz/ipsetsz/helpersz	/policiesz/direct.xmlz/lockdown-whitelist.xml)
Z
ETC_FIREWALLDZFIREWALLD_CONFZETC_FIREWALLD_ZONESZETC_FIREWALLD_SERVICESZETC_FIREWALLD_ICMPTYPESZETC_FIREWALLD_IPSETSZETC_FIREWALLD_HELPERSZETC_FIREWALLD_POLICIESZFIREWALLD_DIRECTZLOCKDOWN_WHITELIST)�path�r	�/usr/lib/python3.6/__init__.py�set_system_config_pathsBsrz/etc/firewalldcCs8|a|da|da|da|da|da|dadS)Nz/zonesz	/servicesz
/icmptypesz/ipsetsz/helpersz	/policies)ZUSR_LIB_FIREWALLDZFIREWALLD_ZONESZFIREWALLD_SERVICESZFIREWALLD_ICMPTYPESZFIREWALLD_IPSETSZFIREWALLD_HELPERSZFIREWALLD_POLICIES)rr	r	r
�set_default_config_pathsSsrz/usr/lib/firewalldz/var/log/firewalldz/var/run/firewalld.pidz/run/firewalldz/etc/sysconfigz/etc/sysconfig/network-scriptsz/etc/sysctl.confz/usr/sbin/iptablesz/usr/sbin/iptables-restorez/usr/sbin/ip6tablesz/usr/sbin/ip6tables-restorez/usr/sbin/ebtablesz/usr/sbin/ebtables-restorez/usr/sbin/ipsetz/sbin/modprobez/sbin/rmmod)	Zipv4zipv4-restoreZipv6zipv6-restoreZebz
eb-restoreZipsetZmodprobeZrmmod�allZunicastZ	broadcastZ	multicastZoff�yes�no�systemZnftablesZiptablesZpublic�dTF)0Z
__future__rZlocale�	setlocaler�Error�os�environZDOMAIN�gettextZinstallrrZDAEMON_NAMEZCONFIG_NAMEZAPPLET_NAMEZDATADIRZCONFIG_GLADE_NAMEZ	COPYRIGHT�VERSIONZAUTHORS�LICENSEZWEBSITErrZFIREWALLD_LOGFILEZFIREWALLD_PIDFILEZFIREWALLD_TEMPDIRZSYSCONFIGDIRZIFCFGDIRZ
SYSCTL_CONFIGZCOMMANDSZLOG_DENIED_VALUESZAUTOMATIC_HELPERS_VALUESZFIREWALL_BACKEND_VALUESZ
FALLBACK_ZONEZFALLBACK_MINIMAL_MARKZFALLBACK_CLEANUP_ON_EXITZ FALLBACK_CLEANUP_MODULES_ON_EXITZFALLBACK_LOCKDOWNZFALLBACK_IPV6_RPFILTERZFALLBACK_INDIVIDUAL_CALLSZFALLBACK_LOG_DENIEDZFALLBACK_AUTOMATIC_HELPERSZFALLBACK_FIREWALL_BACKENDZFALLBACK_FLUSH_ALL_ON_RELOADZFALLBACK_RFC3964_IPV4ZFALLBACK_ALLOW_ZONE_DRIFTINGr	r	r	r
�<module>sv

site-packages/firewall/config/__pycache__/__init__.cpython-36.pyc000064400000007036147511334700020746 0ustar003

]ûf�@sfddlmZddlZyejejd�Wn6ejk
r\ddlZdejd<ejejd�YnXdZddl	Z	e	j
ed�dd	lmZdZ
d
ZdZde
Zed
ZdZdZdddgZe	j	d�ZdZdd�Zed�dd�Zed�dZdZdZdZdZd Zd!d"d#d$d%d&d'd(d)d*�	Zd+d,d-d.d/gZ d0d1d2gZ!d3d4gZ"d5Z#d6Z$d7Z%d7Z&d8Z'd7Z(d8Z)d/Z*d1Z+d3Z,d7Z-d7Z.d7Z/dS)9�)�absolute_importN��C�LC_ALLZ	firewalld)Zdomain�)�dbuszfirewall-configzfirewall-appletz/usr/share/z.gladez(C) 2010-2017 Red Hat, Inc.z0.9.11z$Thomas Woerner <twoerner@redhat.com>z"Jiri Popelka <jpopelka@redhat.com>zEric Garver <e@erig.me>acThis program is free software; you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation; either version 2 of the License, or (at your option) any later version.

This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License for more details.

You should have received a copy of the GNU General Public License along with this program.  If not, see <http://www.gnu.org/licenses/>.zhttp://www.firewalld.orgcCsP|a|da|da|da|da|da|da|da|da|d	a	dS)
Nz/firewalld.confz/zonesz	/servicesz
/icmptypesz/ipsetsz/helpersz	/policiesz/direct.xmlz/lockdown-whitelist.xml)
Z
ETC_FIREWALLDZFIREWALLD_CONFZETC_FIREWALLD_ZONESZETC_FIREWALLD_SERVICESZETC_FIREWALLD_ICMPTYPESZETC_FIREWALLD_IPSETSZETC_FIREWALLD_HELPERSZETC_FIREWALLD_POLICIESZFIREWALLD_DIRECTZLOCKDOWN_WHITELIST)�path�r	�/usr/lib/python3.6/__init__.py�set_system_config_pathsBsrz/etc/firewalldcCs8|a|da|da|da|da|da|dadS)Nz/zonesz	/servicesz
/icmptypesz/ipsetsz/helpersz	/policies)ZUSR_LIB_FIREWALLDZFIREWALLD_ZONESZFIREWALLD_SERVICESZFIREWALLD_ICMPTYPESZFIREWALLD_IPSETSZFIREWALLD_HELPERSZFIREWALLD_POLICIES)rr	r	r
�set_default_config_pathsSsrz/usr/lib/firewalldz/var/log/firewalldz/var/run/firewalld.pidz/run/firewalldz/etc/sysconfigz/etc/sysconfig/network-scriptsz/etc/sysctl.confz/usr/sbin/iptablesz/usr/sbin/iptables-restorez/usr/sbin/ip6tablesz/usr/sbin/ip6tables-restorez/usr/sbin/ebtablesz/usr/sbin/ebtables-restorez/usr/sbin/ipsetz/sbin/modprobez/sbin/rmmod)	Zipv4zipv4-restoreZipv6zipv6-restoreZebz
eb-restoreZipsetZmodprobeZrmmod�allZunicastZ	broadcastZ	multicastZoff�yes�no�systemZnftablesZiptablesZpublic�dTF)0Z
__future__rZlocale�	setlocaler�Error�os�environZDOMAIN�gettextZinstallrrZDAEMON_NAMEZCONFIG_NAMEZAPPLET_NAMEZDATADIRZCONFIG_GLADE_NAMEZ	COPYRIGHT�VERSIONZAUTHORS�LICENSEZWEBSITErrZFIREWALLD_LOGFILEZFIREWALLD_PIDFILEZFIREWALLD_TEMPDIRZSYSCONFIGDIRZIFCFGDIRZ
SYSCTL_CONFIGZCOMMANDSZLOG_DENIED_VALUESZAUTOMATIC_HELPERS_VALUESZFIREWALL_BACKEND_VALUESZ
FALLBACK_ZONEZFALLBACK_MINIMAL_MARKZFALLBACK_CLEANUP_ON_EXITZ FALLBACK_CLEANUP_MODULES_ON_EXITZFALLBACK_LOCKDOWNZFALLBACK_IPV6_RPFILTERZFALLBACK_INDIVIDUAL_CALLSZFALLBACK_LOG_DENIEDZFALLBACK_AUTOMATIC_HELPERSZFALLBACK_FIREWALL_BACKENDZFALLBACK_FLUSH_ALL_ON_RELOADZFALLBACK_RFC3964_IPV4ZFALLBACK_ALLOW_ZONE_DRIFTINGr	r	r	r
�<module>sv

site-packages/firewall/config/__pycache__/dbus.cpython-36.pyc000064400000002737147511334700020147 0ustar003

]ûf
�@sdZdZdeZedZedZedZedZedZedZedZ	edZ
ed	Zed
ZedZ
edZedZedZdeZed
ZedZedZedZedZedZedZdeZedZedZedZedZedZedZedZ edZ!dS)��zorg.fedoraproject.FirewallD%dz.zonez.policyz.directz	.policiesz.ipsetz.configz.servicez	.icmptypez.helperz/org/fedoraproject/FirewallD%dz/configz/config/icmptypez/config/servicez/config/zonez/config/policyz
/config/ipsetz/config/helperz.infoz.allN)"ZDBUS_INTERFACE_VERSIONZDBUS_INTERFACE_REVISIONZDBUS_INTERFACEZDBUS_INTERFACE_ZONEZDBUS_INTERFACE_POLICYZDBUS_INTERFACE_DIRECTZDBUS_INTERFACE_POLICIESZDBUS_INTERFACE_IPSETZDBUS_INTERFACE_CONFIGZDBUS_INTERFACE_CONFIG_ZONEZDBUS_INTERFACE_CONFIG_POLICYZDBUS_INTERFACE_CONFIG_SERVICEZDBUS_INTERFACE_CONFIG_ICMPTYPEZDBUS_INTERFACE_CONFIG_POLICIESZDBUS_INTERFACE_CONFIG_DIRECTZDBUS_INTERFACE_CONFIG_IPSETZDBUS_INTERFACE_CONFIG_HELPERZ	DBUS_PATHZDBUS_PATH_CONFIGZDBUS_PATH_CONFIG_ICMPTYPEZDBUS_PATH_CONFIG_SERVICEZDBUS_PATH_CONFIG_ZONEZDBUS_PATH_CONFIG_POLICYZDBUS_PATH_CONFIG_IPSETZDBUS_PATH_CONFIG_HELPERZ
_PK_ACTIONZPK_ACTION_POLICIESZPK_ACTION_POLICIES_INFOZPK_ACTION_CONFIGZPK_ACTION_CONFIG_INFOZPK_ACTION_DIRECTZPK_ACTION_DIRECT_INFOZPK_ACTION_INFOZ
PK_ACTION_ALL�rr�/usr/lib/python3.6/dbus.py�<module>sBsite-packages/firewall/config/__pycache__/dbus.cpython-36.opt-1.pyc000064400000002737147511334700021106 0ustar003

]ûf
�@sdZdZdeZedZedZedZedZedZedZedZ	edZ
ed	Zed
ZedZ
edZedZedZdeZed
ZedZedZedZedZedZedZdeZedZedZedZedZedZedZedZ edZ!dS)��zorg.fedoraproject.FirewallD%dz.zonez.policyz.directz	.policiesz.ipsetz.configz.servicez	.icmptypez.helperz/org/fedoraproject/FirewallD%dz/configz/config/icmptypez/config/servicez/config/zonez/config/policyz
/config/ipsetz/config/helperz.infoz.allN)"ZDBUS_INTERFACE_VERSIONZDBUS_INTERFACE_REVISIONZDBUS_INTERFACEZDBUS_INTERFACE_ZONEZDBUS_INTERFACE_POLICYZDBUS_INTERFACE_DIRECTZDBUS_INTERFACE_POLICIESZDBUS_INTERFACE_IPSETZDBUS_INTERFACE_CONFIGZDBUS_INTERFACE_CONFIG_ZONEZDBUS_INTERFACE_CONFIG_POLICYZDBUS_INTERFACE_CONFIG_SERVICEZDBUS_INTERFACE_CONFIG_ICMPTYPEZDBUS_INTERFACE_CONFIG_POLICIESZDBUS_INTERFACE_CONFIG_DIRECTZDBUS_INTERFACE_CONFIG_IPSETZDBUS_INTERFACE_CONFIG_HELPERZ	DBUS_PATHZDBUS_PATH_CONFIGZDBUS_PATH_CONFIG_ICMPTYPEZDBUS_PATH_CONFIG_SERVICEZDBUS_PATH_CONFIG_ZONEZDBUS_PATH_CONFIG_POLICYZDBUS_PATH_CONFIG_IPSETZDBUS_PATH_CONFIG_HELPERZ
_PK_ACTIONZPK_ACTION_POLICIESZPK_ACTION_POLICIES_INFOZPK_ACTION_CONFIGZPK_ACTION_CONFIG_INFOZPK_ACTION_DIRECTZPK_ACTION_DIRECT_INFOZPK_ACTION_INFOZ
PK_ACTION_ALL�rr�/usr/lib/python3.6/dbus.py�<module>sBsite-packages/firewall/config/dbus.py000064400000005021147511334700013650 0ustar00# -*- coding: utf-8 -*-
#
# Copyright (C) 2011,2016 Red Hat, Inc.
#
# Authors:
# Thomas Woerner <twoerner@redhat.com>
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program.  If not, see <http://www.gnu.org/licenses/>.
#

DBUS_INTERFACE_VERSION = 1
DBUS_INTERFACE_REVISION = 15

DBUS_INTERFACE = "org.fedoraproject.FirewallD%d" % DBUS_INTERFACE_VERSION
DBUS_INTERFACE_ZONE = DBUS_INTERFACE+".zone"
DBUS_INTERFACE_POLICY = DBUS_INTERFACE+".policy"
DBUS_INTERFACE_DIRECT = DBUS_INTERFACE+".direct"
DBUS_INTERFACE_POLICIES = DBUS_INTERFACE+".policies"
DBUS_INTERFACE_IPSET = DBUS_INTERFACE+".ipset"
DBUS_INTERFACE_CONFIG = DBUS_INTERFACE+".config"
DBUS_INTERFACE_CONFIG_ZONE = DBUS_INTERFACE_CONFIG+".zone"
DBUS_INTERFACE_CONFIG_POLICY = DBUS_INTERFACE_CONFIG+".policy"
DBUS_INTERFACE_CONFIG_SERVICE = DBUS_INTERFACE_CONFIG+".service"
DBUS_INTERFACE_CONFIG_ICMPTYPE = DBUS_INTERFACE_CONFIG+".icmptype"
DBUS_INTERFACE_CONFIG_POLICIES = DBUS_INTERFACE_CONFIG+".policies"
DBUS_INTERFACE_CONFIG_DIRECT = DBUS_INTERFACE_CONFIG+".direct"
DBUS_INTERFACE_CONFIG_IPSET = DBUS_INTERFACE_CONFIG+".ipset"
DBUS_INTERFACE_CONFIG_HELPER = DBUS_INTERFACE_CONFIG+".helper"

DBUS_PATH = "/org/fedoraproject/FirewallD%d" % DBUS_INTERFACE_VERSION
DBUS_PATH_CONFIG = DBUS_PATH+"/config"
DBUS_PATH_CONFIG_ICMPTYPE = DBUS_PATH+"/config/icmptype"
DBUS_PATH_CONFIG_SERVICE = DBUS_PATH+"/config/service"
DBUS_PATH_CONFIG_ZONE = DBUS_PATH+"/config/zone"
DBUS_PATH_CONFIG_POLICY = DBUS_PATH+"/config/policy"
DBUS_PATH_CONFIG_IPSET = DBUS_PATH+"/config/ipset"
DBUS_PATH_CONFIG_HELPER = DBUS_PATH+"/config/helper"

# Polkit actions
_PK_ACTION = "org.fedoraproject.FirewallD%d" % DBUS_INTERFACE_VERSION
PK_ACTION_POLICIES = _PK_ACTION+".policies"
PK_ACTION_POLICIES_INFO = PK_ACTION_POLICIES+".info"
PK_ACTION_CONFIG = _PK_ACTION+".config"
PK_ACTION_CONFIG_INFO = PK_ACTION_CONFIG+".info"
PK_ACTION_DIRECT = _PK_ACTION+".direct"
PK_ACTION_DIRECT_INFO = PK_ACTION_DIRECT+".info"
PK_ACTION_INFO = _PK_ACTION+".info"
PK_ACTION_ALL = _PK_ACTION+".all" # implies all other actions
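
A minimal sketch (not part of the archived module) of how the constants above can be used from the dbus-python bindings to talk to a running firewalld; the getDefaultZone method name is assumed from firewalld's D-Bus API::

    import dbus
    from firewall.config.dbus import DBUS_INTERFACE, DBUS_PATH

    bus = dbus.SystemBus()                            # firewalld lives on the system bus
    obj = bus.get_object(DBUS_INTERFACE, DBUS_PATH)   # bus name and object path defined above
    fw = dbus.Interface(obj, dbus_interface=DBUS_INTERFACE)
    print(fw.getDefaultZone())                        # e.g. 'public'
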
site-packages/firewall/config/__init__.py000064400000011404147511334700014454 0ustar00# -*- coding: utf-8 -*-
#
# Copyright (C) 2007-2016 Red Hat, Inc.
# Authors:
# Thomas Woerner <twoerner@redhat.com>
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program.  If not, see <http://www.gnu.org/licenses/>.
#

from __future__ import absolute_import

# translation
import locale
try:
    locale.setlocale(locale.LC_ALL, "")
except locale.Error:
    import os
    os.environ['LC_ALL'] = 'C'
    locale.setlocale(locale.LC_ALL, "")

DOMAIN = 'firewalld'
import gettext
gettext.install(domain=DOMAIN)

from . import dbus # noqa: F401

# configuration
DAEMON_NAME = 'firewalld'
CONFIG_NAME = 'firewall-config'
APPLET_NAME = 'firewall-applet'
DATADIR = '/usr/share/' + DAEMON_NAME
CONFIG_GLADE_NAME = CONFIG_NAME + '.glade'
COPYRIGHT = '(C) 2010-2017 Red Hat, Inc.'
VERSION = '0.9.11'
AUTHORS = [
    "Thomas Woerner <twoerner@redhat.com>",
    "Jiri Popelka <jpopelka@redhat.com>",
    "Eric Garver <e@erig.me>",
    ]
LICENSE = gettext.gettext(
    "This program is free software; you can redistribute it and/or modify "
    "it under the terms of the GNU General Public License as published by "
    "the Free Software Foundation; either version 2 of the License, or "
    "(at your option) any later version.\n"
    "\n"
    "This program is distributed in the hope that it will be useful, "
    "but WITHOUT ANY WARRANTY; without even the implied warranty of "
    "MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the "
    "GNU General Public License for more details.\n"
    "\n"
    "You should have received a copy of the GNU General Public License "
    "along with this program.  If not, see <http://www.gnu.org/licenses/>.")
WEBSITE = 'http://www.firewalld.org'

def set_system_config_paths(path):
    global ETC_FIREWALLD, FIREWALLD_CONF, ETC_FIREWALLD_ZONES, \
           ETC_FIREWALLD_SERVICES, ETC_FIREWALLD_ICMPTYPES, \
           ETC_FIREWALLD_IPSETS, ETC_FIREWALLD_HELPERS, \
           FIREWALLD_DIRECT, LOCKDOWN_WHITELIST, ETC_FIREWALLD_POLICIES
    ETC_FIREWALLD = path
    FIREWALLD_CONF = path + '/firewalld.conf'
    ETC_FIREWALLD_ZONES = path + '/zones'
    ETC_FIREWALLD_SERVICES = path + '/services'
    ETC_FIREWALLD_ICMPTYPES = path + '/icmptypes'
    ETC_FIREWALLD_IPSETS = path + '/ipsets'
    ETC_FIREWALLD_HELPERS = path + '/helpers'
    ETC_FIREWALLD_POLICIES = path + '/policies'
    FIREWALLD_DIRECT = path + '/direct.xml'
    LOCKDOWN_WHITELIST = path + '/lockdown-whitelist.xml'
set_system_config_paths('/etc/firewalld')

def set_default_config_paths(path):
    global USR_LIB_FIREWALLD, FIREWALLD_ZONES, FIREWALLD_SERVICES, \
           FIREWALLD_ICMPTYPES, FIREWALLD_IPSETS, FIREWALLD_HELPERS, \
           FIREWALLD_POLICIES
    USR_LIB_FIREWALLD = path
    FIREWALLD_ZONES = path + '/zones'
    FIREWALLD_SERVICES = path + '/services'
    FIREWALLD_ICMPTYPES = path + '/icmptypes'
    FIREWALLD_IPSETS = path + '/ipsets'
    FIREWALLD_HELPERS = path + '/helpers'
    FIREWALLD_POLICIES = path + '/policies'
set_default_config_paths('/usr/lib/firewalld')

FIREWALLD_LOGFILE = '/var/log/firewalld'

FIREWALLD_PIDFILE = "/var/run/firewalld.pid"

FIREWALLD_TEMPDIR = '/run/firewalld'

SYSCONFIGDIR = '/etc/sysconfig'
IFCFGDIR = "/etc/sysconfig/network-scripts"

SYSCTL_CONFIG = '/etc/sysctl.conf'

# commands used by backends
COMMANDS = {
    "ipv4":         "/usr/sbin/iptables",
    "ipv4-restore": "/usr/sbin/iptables-restore",
    "ipv6":         "/usr/sbin/ip6tables",
    "ipv6-restore": "/usr/sbin/ip6tables-restore",
    "eb":           "/usr/sbin/ebtables",
    "eb-restore":   "/usr/sbin/ebtables-restore",
    "ipset":        "/usr/sbin/ipset",
    "modprobe":     "/sbin/modprobe",
    "rmmod":        "/sbin/rmmod",
}

LOG_DENIED_VALUES = [ "all", "unicast", "broadcast", "multicast", "off" ]
AUTOMATIC_HELPERS_VALUES = [ "yes", "no", "system" ]
FIREWALL_BACKEND_VALUES = [ "nftables", "iptables" ]

# fallbacks: will be overloaded by firewalld.conf
FALLBACK_ZONE = "public"
FALLBACK_MINIMAL_MARK = 100
FALLBACK_CLEANUP_ON_EXIT = True
FALLBACK_CLEANUP_MODULES_ON_EXIT = True
FALLBACK_LOCKDOWN = False
FALLBACK_IPV6_RPFILTER = True
FALLBACK_INDIVIDUAL_CALLS = False
FALLBACK_LOG_DENIED = "off"
FALLBACK_AUTOMATIC_HELPERS = "no"
FALLBACK_FIREWALL_BACKEND = "nftables"
FALLBACK_FLUSH_ALL_ON_RELOAD = True
FALLBACK_RFC3964_IPV4 = True
FALLBACK_ALLOW_ZONE_DRIFTING = True
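
A minimal sketch (not part of the archived module) of redirecting the configuration paths defined above, e.g. when exercising firewalld code against a scratch tree; the /tmp locations are placeholders::

    import firewall.config as fw_config

    # Repoint the system (/etc) side and the packaged defaults (/usr/lib) side.
    fw_config.set_system_config_paths('/tmp/firewalld-test/etc')
    fw_config.set_default_config_paths('/tmp/firewalld-test/usr')

    print(fw_config.FIREWALLD_CONF)   # /tmp/firewalld-test/etc/firewalld.conf
    print(fw_config.FIREWALLD_ZONES)  # /tmp/firewalld-test/usr/zones
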
site-packages/appdirs.py000064400000060147147511334700011315 0ustar00# -*- coding: utf-8 -*-
# Copyright (c) 2005-2010 ActiveState Software Inc.
# Copyright (c) 2013 Eddy Petrișor

"""Utilities for determining application-specific dirs.

See <http://github.com/ActiveState/appdirs> for details and usage.
"""
# Dev Notes:
# - MSDN on where to store app data files:
#   http://support.microsoft.com/default.aspx?scid=kb;en-us;310294#XSLTH3194121123120121120120
# - Mac OS X: http://developer.apple.com/documentation/MacOSX/Conceptual/BPFileSystem/index.html
# - XDG spec for Un*x: http://standards.freedesktop.org/basedir-spec/basedir-spec-latest.html

__version_info__ = (1, 4, 3)
__version__ = '.'.join(map(str, __version_info__))


import sys
import os

PY3 = sys.version_info[0] == 3

if PY3:
    unicode = str

if sys.platform.startswith('java'):
    import platform
    os_name = platform.java_ver()[3][0]
    if os_name.startswith('Windows'): # "Windows XP", "Windows 7", etc.
        system = 'win32'
    elif os_name.startswith('Mac'): # "Mac OS X", etc.
        system = 'darwin'
    else: # "Linux", "SunOS", "FreeBSD", etc.
        # Setting this to "linux2" is not ideal, but only Windows or Mac
        # are actually checked for and the rest of the module expects
        # *sys.platform* style strings.
        system = 'linux2'
else:
    system = sys.platform



def user_data_dir(appname=None, appauthor=None, version=None, roaming=False):
    r"""Return full path to the user-specific data dir for this application.

        "appname" is the name of application.
            If None, just the system directory is returned.
        "appauthor" (only used on Windows) is the name of the
            appauthor or distributing body for this application. Typically
            it is the owning company name. This falls back to appname. You may
            pass False to disable it.
        "version" is an optional version path element to append to the
            path. You might want to use this if you want multiple versions
            of your app to be able to run independently. If used, this
            would typically be "<major>.<minor>".
            Only applied when appname is present.
        "roaming" (boolean, default False) can be set True to use the Windows
            roaming appdata directory. That means that for users on a Windows
            network setup for roaming profiles, this user data will be
            sync'd on login. See
            <http://technet.microsoft.com/en-us/library/cc766489(WS.10).aspx>
            for a discussion of issues.

    Typical user data directories are:
        Mac OS X:               ~/Library/Application Support/<AppName>
        Unix:                   ~/.local/share/<AppName>    # or in $XDG_DATA_HOME, if defined
        Win XP (not roaming):   C:\Documents and Settings\<username>\Application Data\<AppAuthor>\<AppName>
        Win XP (roaming):       C:\Documents and Settings\<username>\Local Settings\Application Data\<AppAuthor>\<AppName>
        Win 7  (not roaming):   C:\Users\<username>\AppData\Local\<AppAuthor>\<AppName>
        Win 7  (roaming):       C:\Users\<username>\AppData\Roaming\<AppAuthor>\<AppName>

    For Unix, we follow the XDG spec and support $XDG_DATA_HOME.
    That means, by default "~/.local/share/<AppName>".
    """
    if system == "win32":
        if appauthor is None:
            appauthor = appname
        const = roaming and "CSIDL_APPDATA" or "CSIDL_LOCAL_APPDATA"
        path = os.path.normpath(_get_win_folder(const))
        if appname:
            if appauthor is not False:
                path = os.path.join(path, appauthor, appname)
            else:
                path = os.path.join(path, appname)
    elif system == 'darwin':
        path = os.path.expanduser('~/Library/Application Support/')
        if appname:
            path = os.path.join(path, appname)
    else:
        path = os.getenv('XDG_DATA_HOME', os.path.expanduser("~/.local/share"))
        if appname:
            path = os.path.join(path, appname)
    if appname and version:
        path = os.path.join(path, version)
    return path


def site_data_dir(appname=None, appauthor=None, version=None, multipath=False):
    r"""Return full path to the user-shared data dir for this application.

        "appname" is the name of application.
            If None, just the system directory is returned.
        "appauthor" (only used on Windows) is the name of the
            appauthor or distributing body for this application. Typically
            it is the owning company name. This falls back to appname. You may
            pass False to disable it.
        "version" is an optional version path element to append to the
            path. You might want to use this if you want multiple versions
            of your app to be able to run independently. If used, this
            would typically be "<major>.<minor>".
            Only applied when appname is present.
        "multipath" is an optional parameter only applicable to *nix
            which indicates that the entire list of data dirs should be
            returned. By default, the first item from XDG_DATA_DIRS is
            returned, or '/usr/local/share/<AppName>',
            if XDG_DATA_DIRS is not set

    Typical site data directories are:
        Mac OS X:   /Library/Application Support/<AppName>
        Unix:       /usr/local/share/<AppName> or /usr/share/<AppName>
        Win XP:     C:\Documents and Settings\All Users\Application Data\<AppAuthor>\<AppName>
        Vista:      (Fail! "C:\ProgramData" is a hidden *system* directory on Vista.)
        Win 7:      C:\ProgramData\<AppAuthor>\<AppName>   # Hidden, but writeable on Win 7.

    For Unix, this is using the $XDG_DATA_DIRS[0] default.

    WARNING: Do not use this on Windows. See the Vista-Fail note above for why.
    """
    if system == "win32":
        if appauthor is None:
            appauthor = appname
        path = os.path.normpath(_get_win_folder("CSIDL_COMMON_APPDATA"))
        if appname:
            if appauthor is not False:
                path = os.path.join(path, appauthor, appname)
            else:
                path = os.path.join(path, appname)
    elif system == 'darwin':
        path = os.path.expanduser('/Library/Application Support')
        if appname:
            path = os.path.join(path, appname)
    else:
        # XDG default for $XDG_DATA_DIRS
        # only first, if multipath is False
        path = os.getenv('XDG_DATA_DIRS',
                         os.pathsep.join(['/usr/local/share', '/usr/share']))
        pathlist = [os.path.expanduser(x.rstrip(os.sep)) for x in path.split(os.pathsep)]
        if appname:
            if version:
                appname = os.path.join(appname, version)
            pathlist = [os.sep.join([x, appname]) for x in pathlist]

        if multipath:
            path = os.pathsep.join(pathlist)
        else:
            path = pathlist[0]
        return path

    if appname and version:
        path = os.path.join(path, version)
    return path


def user_config_dir(appname=None, appauthor=None, version=None, roaming=False):
    r"""Return full path to the user-specific config dir for this application.

        "appname" is the name of application.
            If None, just the system directory is returned.
        "appauthor" (only used on Windows) is the name of the
            appauthor or distributing body for this application. Typically
            it is the owning company name. This falls back to appname. You may
            pass False to disable it.
        "version" is an optional version path element to append to the
            path. You might want to use this if you want multiple versions
            of your app to be able to run independently. If used, this
            would typically be "<major>.<minor>".
            Only applied when appname is present.
        "roaming" (boolean, default False) can be set True to use the Windows
            roaming appdata directory. That means that for users on a Windows
            network setup for roaming profiles, this user data will be
            sync'd on login. See
            <http://technet.microsoft.com/en-us/library/cc766489(WS.10).aspx>
            for a discussion of issues.

    Typical user config directories are:
        Mac OS X:               same as user_data_dir
        Unix:                   ~/.config/<AppName>     # or in $XDG_CONFIG_HOME, if defined
        Win *:                  same as user_data_dir

    For Unix, we follow the XDG spec and support $XDG_CONFIG_HOME.
    That means, by default "~/.config/<AppName>".
    """
    if system in ["win32", "darwin"]:
        path = user_data_dir(appname, appauthor, None, roaming)
    else:
        path = os.getenv('XDG_CONFIG_HOME', os.path.expanduser("~/.config"))
        if appname:
            path = os.path.join(path, appname)
    if appname and version:
        path = os.path.join(path, version)
    return path


def site_config_dir(appname=None, appauthor=None, version=None, multipath=False):
    r"""Return full path to the user-shared data dir for this application.

        "appname" is the name of application.
            If None, just the system directory is returned.
        "appauthor" (only used on Windows) is the name of the
            appauthor or distributing body for this application. Typically
            it is the owning company name. This falls back to appname. You may
            pass False to disable it.
        "version" is an optional version path element to append to the
            path. You might want to use this if you want multiple versions
            of your app to be able to run independently. If used, this
            would typically be "<major>.<minor>".
            Only applied when appname is present.
        "multipath" is an optional parameter only applicable to *nix
            which indicates that the entire list of config dirs should be
            returned. By default, the first item from XDG_CONFIG_DIRS is
            returned, or '/etc/xdg/<AppName>', if XDG_CONFIG_DIRS is not set

    Typical site config directories are:
        Mac OS X:   same as site_data_dir
        Unix:       /etc/xdg/<AppName> or $XDG_CONFIG_DIRS[i]/<AppName> for each value in
                    $XDG_CONFIG_DIRS
        Win *:      same as site_data_dir
        Vista:      (Fail! "C:\ProgramData" is a hidden *system* directory on Vista.)

    For Unix, this is using the $XDG_CONFIG_DIRS[0] default, if multipath=False

    WARNING: Do not use this on Windows. See the Vista-Fail note above for why.
    """
    if system in ["win32", "darwin"]:
        path = site_data_dir(appname, appauthor)
        if appname and version:
            path = os.path.join(path, version)
    else:
        # XDG default for $XDG_CONFIG_DIRS
        # only first, if multipath is False
        path = os.getenv('XDG_CONFIG_DIRS', '/etc/xdg')
        pathlist = [os.path.expanduser(x.rstrip(os.sep)) for x in path.split(os.pathsep)]
        if appname:
            if version:
                appname = os.path.join(appname, version)
            pathlist = [os.sep.join([x, appname]) for x in pathlist]

        if multipath:
            path = os.pathsep.join(pathlist)
        else:
            path = pathlist[0]
    return path


def user_cache_dir(appname=None, appauthor=None, version=None, opinion=True):
    r"""Return full path to the user-specific cache dir for this application.

        "appname" is the name of application.
            If None, just the system directory is returned.
        "appauthor" (only used on Windows) is the name of the
            appauthor or distributing body for this application. Typically
            it is the owning company name. This falls back to appname. You may
            pass False to disable it.
        "version" is an optional version path element to append to the
            path. You might want to use this if you want multiple versions
            of your app to be able to run independently. If used, this
            would typically be "<major>.<minor>".
            Only applied when appname is present.
        "opinion" (boolean) can be False to disable the appending of
            "Cache" to the base app data dir for Windows. See
            discussion below.

    Typical user cache directories are:
        Mac OS X:   ~/Library/Caches/<AppName>
        Unix:       ~/.cache/<AppName> (XDG default)
        Win XP:     C:\Documents and Settings\<username>\Local Settings\Application Data\<AppAuthor>\<AppName>\Cache
        Vista:      C:\Users\<username>\AppData\Local\<AppAuthor>\<AppName>\Cache

    On Windows the only suggestion in the MSDN docs is that local settings go in
    the `CSIDL_LOCAL_APPDATA` directory. This is identical to the non-roaming
    app data dir (the default returned by `user_data_dir` above). Apps typically
    put cache data somewhere *under* the given dir here. Some examples:
        ...\Mozilla\Firefox\Profiles\<ProfileName>\Cache
        ...\Acme\SuperApp\Cache\1.0
    OPINION: This function appends "Cache" to the `CSIDL_LOCAL_APPDATA` value.
    This can be disabled with the `opinion=False` option.
    """
    if system == "win32":
        if appauthor is None:
            appauthor = appname
        path = os.path.normpath(_get_win_folder("CSIDL_LOCAL_APPDATA"))
        if appname:
            if appauthor is not False:
                path = os.path.join(path, appauthor, appname)
            else:
                path = os.path.join(path, appname)
            if opinion:
                path = os.path.join(path, "Cache")
    elif system == 'darwin':
        path = os.path.expanduser('~/Library/Caches')
        if appname:
            path = os.path.join(path, appname)
    else:
        path = os.getenv('XDG_CACHE_HOME', os.path.expanduser('~/.cache'))
        if appname:
            path = os.path.join(path, appname)
    if appname and version:
        path = os.path.join(path, version)
    return path


def user_state_dir(appname=None, appauthor=None, version=None, roaming=False):
    r"""Return full path to the user-specific state dir for this application.

        "appname" is the name of application.
            If None, just the system directory is returned.
        "appauthor" (only used on Windows) is the name of the
            appauthor or distributing body for this application. Typically
            it is the owning company name. This falls back to appname. You may
            pass False to disable it.
        "version" is an optional version path element to append to the
            path. You might want to use this if you want multiple versions
            of your app to be able to run independently. If used, this
            would typically be "<major>.<minor>".
            Only applied when appname is present.
        "roaming" (boolean, default False) can be set True to use the Windows
            roaming appdata directory. That means that for users on a Windows
            network setup for roaming profiles, this user data will be
            sync'd on login. See
            <http://technet.microsoft.com/en-us/library/cc766489(WS.10).aspx>
            for a discussion of issues.

    Typical user state directories are:
        Mac OS X:  same as user_data_dir
        Unix:      ~/.local/state/<AppName>   # or in $XDG_STATE_HOME, if defined
        Win *:     same as user_data_dir

    For Unix, we follow this Debian proposal <https://wiki.debian.org/XDGBaseDirectorySpecification#state>
    to extend the XDG spec and support $XDG_STATE_HOME.

    That means, by default "~/.local/state/<AppName>".
    """
    if system in ["win32", "darwin"]:
        path = user_data_dir(appname, appauthor, None, roaming)
    else:
        path = os.getenv('XDG_STATE_HOME', os.path.expanduser("~/.local/state"))
        if appname:
            path = os.path.join(path, appname)
    if appname and version:
        path = os.path.join(path, version)
    return path


def user_log_dir(appname=None, appauthor=None, version=None, opinion=True):
    r"""Return full path to the user-specific log dir for this application.

        "appname" is the name of application.
            If None, just the system directory is returned.
        "appauthor" (only used on Windows) is the name of the
            appauthor or distributing body for this application. Typically
            it is the owning company name. This falls back to appname. You may
            pass False to disable it.
        "version" is an optional version path element to append to the
            path. You might want to use this if you want multiple versions
            of your app to be able to run independently. If used, this
            would typically be "<major>.<minor>".
            Only applied when appname is present.
        "opinion" (boolean) can be False to disable the appending of
            "Logs" to the base app data dir for Windows, and "log" to the
            base cache dir for Unix. See discussion below.

    Typical user log directories are:
        Mac OS X:   ~/Library/Logs/<AppName>
        Unix:       ~/.cache/<AppName>/log  # or under $XDG_CACHE_HOME if defined
        Win XP:     C:\Documents and Settings\<username>\Local Settings\Application Data\<AppAuthor>\<AppName>\Logs
        Vista:      C:\Users\<username>\AppData\Local\<AppAuthor>\<AppName>\Logs

    On Windows the only suggestion in the MSDN docs is that local settings
    go in the `CSIDL_LOCAL_APPDATA` directory. (Note: I'm interested in
    examples of what some windows apps use for a logs dir.)

    OPINION: This function appends "Logs" to the `CSIDL_LOCAL_APPDATA`
    value for Windows and appends "log" to the user cache dir for Unix.
    This can be disabled with the `opinion=False` option.
    """
    if system == "darwin":
        path = os.path.join(
            os.path.expanduser('~/Library/Logs'),
            appname)
    elif system == "win32":
        path = user_data_dir(appname, appauthor, version)
        version = False
        if opinion:
            path = os.path.join(path, "Logs")
    else:
        path = user_cache_dir(appname, appauthor, version)
        version = False
        if opinion:
            path = os.path.join(path, "log")
    if appname and version:
        path = os.path.join(path, version)
    return path


class AppDirs(object):
    """Convenience wrapper for getting application dirs."""
    def __init__(self, appname=None, appauthor=None, version=None,
            roaming=False, multipath=False):
        self.appname = appname
        self.appauthor = appauthor
        self.version = version
        self.roaming = roaming
        self.multipath = multipath

    @property
    def user_data_dir(self):
        return user_data_dir(self.appname, self.appauthor,
                             version=self.version, roaming=self.roaming)

    @property
    def site_data_dir(self):
        return site_data_dir(self.appname, self.appauthor,
                             version=self.version, multipath=self.multipath)

    @property
    def user_config_dir(self):
        return user_config_dir(self.appname, self.appauthor,
                               version=self.version, roaming=self.roaming)

    @property
    def site_config_dir(self):
        return site_config_dir(self.appname, self.appauthor,
                             version=self.version, multipath=self.multipath)

    @property
    def user_cache_dir(self):
        return user_cache_dir(self.appname, self.appauthor,
                              version=self.version)

    @property
    def user_state_dir(self):
        return user_state_dir(self.appname, self.appauthor,
                              version=self.version)

    @property
    def user_log_dir(self):
        return user_log_dir(self.appname, self.appauthor,
                            version=self.version)


#---- internal support stuff

def _get_win_folder_from_registry(csidl_name):
    """This is a fallback technique at best. I'm not sure if using the
    registry for this guarantees us the correct answer for all CSIDL_*
    names.
    """
    if PY3:
      import winreg as _winreg
    else:
      import _winreg

    shell_folder_name = {
        "CSIDL_APPDATA": "AppData",
        "CSIDL_COMMON_APPDATA": "Common AppData",
        "CSIDL_LOCAL_APPDATA": "Local AppData",
    }[csidl_name]

    key = _winreg.OpenKey(
        _winreg.HKEY_CURRENT_USER,
        r"Software\Microsoft\Windows\CurrentVersion\Explorer\Shell Folders"
    )
    dir, type = _winreg.QueryValueEx(key, shell_folder_name)
    return dir


def _get_win_folder_with_pywin32(csidl_name):
    from win32com.shell import shellcon, shell
    dir = shell.SHGetFolderPath(0, getattr(shellcon, csidl_name), 0, 0)
    # Try to make this a unicode path because SHGetFolderPath does
    # not return unicode strings when there is unicode data in the
    # path.
    try:
        dir = unicode(dir)

        # Downgrade to short path name if have highbit chars. See
        # <http://bugs.activestate.com/show_bug.cgi?id=85099>.
        has_high_char = False
        for c in dir:
            if ord(c) > 255:
                has_high_char = True
                break
        if has_high_char:
            try:
                import win32api
                dir = win32api.GetShortPathName(dir)
            except ImportError:
                pass
    except UnicodeError:
        pass
    return dir


def _get_win_folder_with_ctypes(csidl_name):
    import ctypes

    csidl_const = {
        "CSIDL_APPDATA": 26,
        "CSIDL_COMMON_APPDATA": 35,
        "CSIDL_LOCAL_APPDATA": 28,
    }[csidl_name]

    buf = ctypes.create_unicode_buffer(1024)
    ctypes.windll.shell32.SHGetFolderPathW(None, csidl_const, None, 0, buf)

    # Downgrade to short path name if have highbit chars. See
    # <http://bugs.activestate.com/show_bug.cgi?id=85099>.
    has_high_char = False
    for c in buf:
        if ord(c) > 255:
            has_high_char = True
            break
    if has_high_char:
        buf2 = ctypes.create_unicode_buffer(1024)
        if ctypes.windll.kernel32.GetShortPathNameW(buf.value, buf2, 1024):
            buf = buf2

    return buf.value

def _get_win_folder_with_jna(csidl_name):
    import array
    from com.sun import jna
    from com.sun.jna.platform import win32

    buf_size = win32.WinDef.MAX_PATH * 2
    buf = array.zeros('c', buf_size)
    shell = win32.Shell32.INSTANCE
    shell.SHGetFolderPath(None, getattr(win32.ShlObj, csidl_name), None, win32.ShlObj.SHGFP_TYPE_CURRENT, buf)
    dir = jna.Native.toString(buf.tostring()).rstrip("\0")

    # Downgrade to short path name if have highbit chars. See
    # <http://bugs.activestate.com/show_bug.cgi?id=85099>.
    has_high_char = False
    for c in dir:
        if ord(c) > 255:
            has_high_char = True
            break
    if has_high_char:
        buf = array.zeros('c', buf_size)
        kernel = win32.Kernel32.INSTANCE
        if kernel.GetShortPathName(dir, buf, buf_size):
            dir = jna.Native.toString(buf.tostring()).rstrip("\0")

    return dir

if system == "win32":
    try:
        import win32com.shell
        _get_win_folder = _get_win_folder_with_pywin32
    except ImportError:
        try:
            from ctypes import windll
            _get_win_folder = _get_win_folder_with_ctypes
        except ImportError:
            try:
                import com.sun.jna
                _get_win_folder = _get_win_folder_with_jna
            except ImportError:
                _get_win_folder = _get_win_folder_from_registry


#---- self test code

if __name__ == "__main__":
    appname = "MyApp"
    appauthor = "MyCompany"

    props = ("user_data_dir",
             "user_config_dir",
             "user_cache_dir",
             "user_state_dir",
             "user_log_dir",
             "site_data_dir",
             "site_config_dir")

    print("-- app dirs %s --" % __version__)

    print("-- app dirs (with optional 'version')")
    dirs = AppDirs(appname, appauthor, version="1.0")
    for prop in props:
        print("%s: %s" % (prop, getattr(dirs, prop)))

    print("\n-- app dirs (without optional 'version')")
    dirs = AppDirs(appname, appauthor)
    for prop in props:
        print("%s: %s" % (prop, getattr(dirs, prop)))

    print("\n-- app dirs (without optional 'appauthor')")
    dirs = AppDirs(appname)
    for prop in props:
        print("%s: %s" % (prop, getattr(dirs, prop)))

    print("\n-- app dirs (with disabled 'appauthor')")
    dirs = AppDirs(appname, appauthor=False)
    for prop in props:
        print("%s: %s" % (prop, getattr(dirs, prop)))
site-packages/certifi-2018.10.15-py3.6.egg-info/not-zip-safe000064400000000001147511334700016546 0ustar00
site-packages/certifi-2018.10.15-py3.6.egg-info/PKG-INFO000064400000005670147511334700015425 0ustar00Metadata-Version: 1.1
Name: certifi
Version: 2018.10.15
Summary: Python package for providing Mozilla's CA Bundle.
Home-page: http://certifi.io/
Author: Kenneth Reitz
Author-email: me@kennethreitz.com
License: MPL-2.0
Description: Certifi: Python SSL Certificates
        ================================
        
        `Certifi`_ is a carefully curated collection of Root Certificates for
        validating the trustworthiness of SSL certificates while verifying the identity
        of TLS hosts. It has been extracted from the `Requests`_ project.
        
        Installation
        ------------
        
        ``certifi`` is available on PyPI. Simply install it with ``pip``::
        
            $ pip install certifi
        
        Usage
        -----
        
        To reference the installed certificate authority (CA) bundle, you can use the
        built-in function::
        
            >>> import certifi
        
            >>> certifi.where()
            '/usr/local/lib/python2.7/site-packages/certifi/cacert.pem'
        
        Enjoy!
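
        A common follow-on (a sketch, not from the original README) is to hand
        the bundle path to Python 3's standard-library ``ssl`` module when
        opening a TLS connection::

            >>> import ssl
            >>> import urllib.request
            >>> ctx = ssl.create_default_context(cafile=certifi.where())
            >>> urllib.request.urlopen("https://example.org", context=ctx)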
        
        1024-bit Root Certificates
        ~~~~~~~~~~~~~~~~~~~~~~~~~~
        
        Browsers and certificate authorities have concluded that 1024-bit keys are
        unacceptably weak for certificates, particularly root certificates. For this
        reason, Mozilla has removed any weak (i.e. 1024-bit key) certificate from its
        bundle, replacing it with an equivalent strong (i.e. 2048-bit or greater key)
        certificate from the same CA. Because Mozilla removed these certificates from
        its bundle, ``certifi`` removed them as well.
        
        In previous versions, ``certifi`` provided the ``certifi.old_where()`` function
        to intentionally re-add the 1024-bit roots back into your bundle. This was not
        recommended in production and therefore was removed. To assist in migrating old
        code, the function ``certifi.old_where()`` continues to exist as an alias of
        ``certifi.where()``. Please update your code to use ``certifi.where()``
        instead. ``certifi.old_where()`` will be removed in 2018.
        
        .. _`Certifi`: http://certifi.io/en/latest/
        .. _`Requests`: http://docs.python-requests.org/en/latest/
        
Platform: UNKNOWN
Classifier: Development Status :: 5 - Production/Stable
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: Mozilla Public License 2.0 (MPL 2.0)
Classifier: Natural Language :: English
Classifier: Programming Language :: Python
Classifier: Programming Language :: Python :: 2
Classifier: Programming Language :: Python :: 2.6
Classifier: Programming Language :: Python :: 2.7
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.3
Classifier: Programming Language :: Python :: 3.4
Classifier: Programming Language :: Python :: 3.5
Classifier: Programming Language :: Python :: 3.6
Classifier: Programming Language :: Python :: 3.7
site-packages/certifi-2018.10.15-py3.6.egg-info/dependency_links.txt000064400000000001147511334700020366 0ustar00
site-packages/certifi-2018.10.15-py3.6.egg-info/SOURCES.txt000064400000000403147511334700016201 0ustar00LICENSE
MANIFEST.in
README.rst
setup.cfg
setup.py
certifi/__init__.py
certifi/__main__.py
certifi/core.py
certifi.egg-info/PKG-INFO
certifi.egg-info/SOURCES.txt
certifi.egg-info/dependency_links.txt
certifi.egg-info/not-zip-safe
certifi.egg-info/top_level.txtsite-packages/certifi-2018.10.15-py3.6.egg-info/top_level.txt000064400000000010147511334700017041 0ustar00certifi
site-packages/nftables-0.1-py3.6.egg-info000064400000001157147511334700013661 0ustar00Metadata-Version: 1.1
Name: nftables
Version: 0.1
Summary: Libnftables binding
Home-page: https://netfilter.org/projects/nftables/index.html
Author: Netfilter project
Author-email: coreteam@netfilter.org
License: UNKNOWN
Description: UNKNOWN
Platform: UNKNOWN
Classifier: Development Status :: 4 - Beta
Classifier: Environment :: Console
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: GNU General Public License v2 (GPLv2)
Classifier: Operating System :: POSIX :: Linux
Classifier: Programming Language :: Python
Classifier: Topic :: System :: Networking :: Firewalls
Provides: nftables
site-packages/decorator-4.2.1-py3.6.egg-info/not-zip-safe000064400000000001147511334700016510 0ustar00
site-packages/decorator-4.2.1-py3.6.egg-info/dependency_links.txt000064400000000001147511334700020330 0ustar00
site-packages/decorator-4.2.1-py3.6.egg-info/PKG-INFO000064400000006165147511334700015367 0ustar00Metadata-Version: 1.1
Name: decorator
Version: 4.2.1
Summary: Better living through Python with decorators
Home-page: https://github.com/micheles/decorator
Author: Michele Simionato
Author-email: michele.simionato@gmail.com
License: new BSD License
Description: Decorator module
        =================
        
        :Author: Michele Simionato
        :E-mail: michele.simionato@gmail.com
        :Requires: Python from 2.6 to 3.6
        :Download page: http://pypi.python.org/pypi/decorator
        :Installation: ``pip install decorator``
        :License: BSD license
        
        Installation
        -------------
        
        If you are lazy, just perform
        
         `$ pip install decorator`
        
        which will install just the module on your system.
        
        If you prefer to install the full distribution from source, including
        the documentation, clone the `GitHub repo`_ or download the tarball_, unpack it and run
        
         `$ pip install .`
        
        in the main directory, possibly as superuser.
        
        .. _tarball: http://pypi.python.org/pypi/decorator
        .. _GitHub repo: https://github.com/micheles/decorator
        
        Testing
        --------
        
        If you have the source code installation you can run the tests with
        
         `$ python src/tests/test.py -v`
        
        or (if you have setuptools installed)
        
         `$ python setup.py test`
        
        Notice that you may run into trouble if in your system there
        is an older version of the decorator module; in such a case remove the
        old version. It is safe even to copy the module `decorator.py` over
        an existing one, since we kept backward-compatibility for a long time.
        
        Repository
        ---------------
        
        The project is hosted on GitHub. You can look at the source here:
        
         https://github.com/micheles/decorator
        
        Documentation
        ---------------
        
        The documentation has been moved to http://decorator.readthedocs.io/en/latest/
        You can download a PDF version of it from http://media.readthedocs.org/pdf/decorator/latest/decorator.pdf
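
        As a quick taste (a sketch, not from the original README) of what the
        module provides, ``decorator.decorator`` turns a caller function into a
        signature-preserving decorator::

            from decorator import decorator

            @decorator
            def trace(func, *args, **kwargs):
                print("calling %s with %s %s" % (func.__name__, args, kwargs))
                return func(*args, **kwargs)

            @trace
            def add(x, y):
                return x + y

            add(1, 2)  # prints the call, then returns 3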
        
Keywords: decorators generic utility
Platform: All
Classifier: Development Status :: 5 - Production/Stable
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: BSD License
Classifier: Natural Language :: English
Classifier: Operating System :: OS Independent
Classifier: Programming Language :: Python
Classifier: Programming Language :: Python :: 2
Classifier: Programming Language :: Python :: 2.6
Classifier: Programming Language :: Python :: 2.7
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.2
Classifier: Programming Language :: Python :: 3.3
Classifier: Programming Language :: Python :: 3.4
Classifier: Programming Language :: Python :: 3.5
Classifier: Programming Language :: Python :: 3.6
Classifier: Programming Language :: Python :: Implementation :: CPython
Classifier: Topic :: Software Development :: Libraries
Classifier: Topic :: Utilities
site-packages/decorator-4.2.1-py3.6.egg-info/pbr.json000064400000000057147511334700015742 0ustar00{"is_release": false, "git_version": "8608a46"}site-packages/decorator-4.2.1-py3.6.egg-info/top_level.txt000064400000000012147511334700017005 0ustar00decorator
site-packages/decorator-4.2.1-py3.6.egg-info/SOURCES.txt000064400000000600147511334700016142 0ustar00CHANGES.md
LICENSE.txt
MANIFEST.in
performance.sh
setup.cfg
setup.py
docs/README.rst
src/decorator.py
src/decorator.egg-info/PKG-INFO
src/decorator.egg-info/SOURCES.txt
src/decorator.egg-info/dependency_links.txt
src/decorator.egg-info/not-zip-safe
src/decorator.egg-info/pbr.json
src/decorator.egg-info/top_level.txt
src/tests/__init__.py
src/tests/documentation.py
src/tests/test.pysite-packages/six-1.11.0.dist-info/METADATA000064400000003107147511334700013607 0ustar00Metadata-Version: 2.1
Name: six
Version: 1.11.0
Summary: Python 2 and 3 compatibility utilities
Home-page: http://pypi.python.org/pypi/six/
Author: Benjamin Peterson
Author-email: benjamin@python.org
License: MIT
Platform: UNKNOWN
Classifier: Programming Language :: Python :: 2
Classifier: Programming Language :: Python :: 3
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: MIT License
Classifier: Topic :: Software Development :: Libraries
Classifier: Topic :: Utilities

.. image:: http://img.shields.io/pypi/v/six.svg
   :target: https://pypi.python.org/pypi/six

.. image:: https://travis-ci.org/benjaminp/six.svg?branch=master
    :target: https://travis-ci.org/benjaminp/six

.. image:: http://img.shields.io/badge/license-MIT-green.svg
   :target: https://github.com/benjaminp/six/blob/master/LICENSE

Six is a Python 2 and 3 compatibility library.  It provides utility functions
for smoothing over the differences between the Python versions with the goal of
writing Python code that is compatible on both Python versions.  See the
documentation for more information on what is provided.

Six supports every Python version since 2.6.  It is contained in only one Python
file, so it can be easily copied into your project. (The copyright and license
notice must be retained.)

Online documentation is at http://six.rtfd.org.

Bugs can be reported to https://github.com/benjaminp/six.  The code can also
be found there.

For questions about six or porting in general, email the python-porting mailing
list: https://mail.python.org/mailman/listinfo/python-porting
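
A small sketch (not part of the upstream description) of the kind of shims six
provides::

    import six

    text = six.text_type("hello")               # unicode on Python 2, str on Python 3
    for key, value in six.iteritems({"answer": 42}):
        print(key, value)
    if six.PY2:
        print("running under Python 2")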


site-packages/six-1.11.0.dist-info/INSTALLER000064400000000004147511334700013755 0ustar00pip
site-packages/six-1.11.0.dist-info/RECORD000064400000000732147511334700013406 0ustar00six.py,sha256=A08MPb-Gi9FfInI3IW7HimXFmEH2T2IPzHgDvdhZPRA,30888
six-1.11.0.dist-info/METADATA,sha256=vfvF0GW2vCjz99oMyLbw15XSkmo1IxC-G_339_ED4h8,1607
six-1.11.0.dist-info/RECORD,,
six-1.11.0.dist-info/WHEEL,sha256=gduuPyBvFJQSQ0zdyxF7k0zynDXbIbvg5ZBHoXum5uk,110
six-1.11.0.dist-info/top_level.txt,sha256=_iVH_iYEtEXnD8nYGQYpYFUvkUW9sEO1GYbkeKSAais,4
six-1.11.0.dist-info/INSTALLER,sha256=zuuue4knoyJ-UwPPXg8fezS7VCrXJQrAP7zeNuwvFQg,4
__pycache__/six.cpython-36.pyc,,
site-packages/six-1.11.0.dist-info/WHEEL000064400000000156147511334700013274 0ustar00Wheel-Version: 1.0
Generator: bdist_wheel (0.31.1)
Root-Is-Purelib: true
Tag: py2-none-any
Tag: py3-none-any

site-packages/six-1.11.0.dist-info/top_level.txt000064400000000004147511334700015227 0ustar00six